Classification in medical images using adaptive metric k-NN
Chen, C.; Chernoff, K.; Karemore, G.; Lo, P.; Nielsen, M.; Lauze, F.
2010-03-01
The performance of the k-nearest neighbors (k-NN) classifier is highly dependent on the distance metric used to identify the k nearest neighbors of the query points. The standard Euclidean distance is commonly used in practice. This paper investigates the performance of the k-NN classifier with respect to different adaptive metrics in the context of medical imaging. We propose using adaptive metrics such that the structure of the data is better described, introducing some unsupervised learning knowledge into k-NN. Four different metrics are estimated: a theoretical metric based on the assumption that images are drawn from the Brownian Image Model (BIM), a normalized metric based on the variance of the data, an empirical metric based on the empirical covariance matrix of the unlabeled data, and an optimized metric obtained by minimizing the classification error. The spectral structure of the empirical covariance also allows Principal Component Analysis (PCA) to be performed on it, which results in subspace metrics. The metrics are evaluated on two data sets: lateral X-rays of the lumbar aortic/spine region, where we use k-NN to detect abdominal aortic calcification; and mammograms, where we use k-NN for breast cancer risk assessment. The results show that an appropriate choice of metric can improve classification.
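Of the metrics listed in this abstract, the variance-normalized one is the simplest to sketch. The following is a minimal, hypothetical Python illustration (toy data, not from the paper) of k-NN under a diagonal metric that rescales each feature by its empirical variance, so that high-variance features do not dominate the distance:

```python
from collections import Counter
import math

def variance_normalized_knn(train_X, train_y, query, k=3):
    """Classify `query` with k-NN under a diagonal metric that divides
    each squared feature difference by that feature's empirical variance."""
    d = len(query)
    n = len(train_X)
    means = [sum(x[j] for x in train_X) / n for j in range(d)]
    vars_ = [sum((x[j] - means[j]) ** 2 for x in train_X) / n for j in range(d)]

    def dist(a, b):
        return math.sqrt(sum((a[j] - b[j]) ** 2 / vars_[j] for j in range(d)))

    neighbors = sorted(range(n), key=lambda i: dist(train_X[i], query))[:k]
    votes = Counter(train_y[i] for i in neighbors)
    return votes.most_common(1)[0][0]

# Toy data: feature 0 is informative, feature 1 is high-variance noise.
X = [(0.0, 0.0), (0.1, 9.0), (0.2, -8.0), (1.0, 0.5), (1.1, 7.0), (0.9, -9.5)]
y = ["A", "A", "A", "B", "B", "B"]
print(variance_normalized_knn(X, y, (0.15, 5.0), k=3))
```

Under plain Euclidean distance the noisy second feature would dominate; after variance normalization the informative first feature decides the vote.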
Classification in medical image analysis using adaptive metric k-NN
DEFF Research Database (Denmark)
Chen, Chen; Chernoff, Konstantin; Karemore, Gopal
2010-01-01
The performance of the k-nearest neighborhoods (k-NN) classifier is highly dependent on the distance metric used to identify the k nearest neighbors of the query points. The standard Euclidean distance is commonly used in practice. This paper investigates the performance of k-NN classifier...
Improving GPU-accelerated adaptive IDW interpolation algorithm using fast kNN search.
Mei, Gang; Xu, Nengxiong; Xu, Liangliang
2016-01-01
This paper presents an efficient parallel Adaptive Inverse Distance Weighting (AIDW) interpolation algorithm for the modern Graphics Processing Unit (GPU). The presented algorithm improves our previous GPU-accelerated AIDW algorithm by adopting a fast k-nearest neighbors (kNN) search. AIDW needs to find several nearest neighboring data points for each interpolated point in order to adaptively determine the power parameter; the desired prediction value of the interpolated point is then obtained by weighted interpolation using that power parameter. In this work, we develop a fast kNN search approach based on a space-partitioning data structure, an even grid, to improve the previous GPU-accelerated AIDW algorithm. The improved algorithm is composed of two stages: kNN search and weighted interpolation. To evaluate the performance of the improved algorithm, we perform five groups of experimental tests. The experimental results indicate that: (1) the improved algorithm can achieve a speedup of up to 1017 over the corresponding serial algorithm; (2) the improved algorithm is at least two times faster than our previous GPU-accelerated AIDW algorithm; and (3) the use of fast kNN search can significantly improve the computational efficiency of the entire GPU-accelerated AIDW algorithm.
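The adaptive power selection can be sketched in plain Python. The GPU grid search and the paper's exact density-to-power mapping are replaced here by simplified stand-ins (a brute-force kNN and a linear mapping with assumed bounds `p_min`, `p_max`, `r_ref`):

```python
import math

def aidw_interpolate(points, values, q, k=3, p_min=1.0, p_max=3.0, r_ref=1.0):
    """Estimate the value at query point q by inverse distance weighting,
    where the power parameter adapts to the local sampling density."""
    dists = sorted((math.dist(pt, q), v) for pt, v in zip(points, values))
    knn = dists[:k]
    mean_d = sum(d for d, _ in knn) / k
    # Sparse neighborhoods (large mean distance) get a larger power,
    # which localizes the interpolation; dense ones stay smoother.
    t = min(1.0, mean_d / r_ref)
    power = p_min + t * (p_max - p_min)
    num = den = 0.0
    for d, v in knn:
        if d == 0.0:
            return v  # query coincides with a data point
        w = 1.0 / d ** power
        num += w * v
        den += w
    return num / den

pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
vals = [1.0, 2.0, 3.0, 4.0]
print(round(aidw_interpolate(pts, vals, (0.5, 0.5), k=4), 3))
```

At the center of the unit square all four neighbors are equidistant, so whatever power is chosen, the estimate reduces to the plain average of the four values.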
Multiclass Boosting with Adaptive Group-Based kNN and Its Application in Text Categorization
Directory of Open Access Journals (Sweden)
Lei La
2012-01-01
Full Text Available AdaBoost is an excellent committee-based tool for classification. However, its effectiveness and efficiency in multiclass categorization face challenges from methods based on support vector machines (SVM), neural networks (NN), naïve Bayes, and k-nearest neighbors (kNN). This paper uses a novel multiclass AdaBoost algorithm to avoid reducing the multiclass classification problem to multiple two-class classification problems. This novel method is more effective and retains the accuracy advantage of existing AdaBoost. An adaptive group-based kNN method is proposed to build more accurate weak classifiers and thereby keep the number of base classifiers within an acceptable range. To further enhance performance, the weak classifiers are combined into a strong classifier through a doubly iterative weighting scheme, yielding an adaptive group-based kNN boosting algorithm (AGkNN-AdaBoost). We implemented AGkNN-AdaBoost in a Chinese text categorization system. Experimental results show that the proposed classification algorithm achieves better precision and recall than many other text categorization methods, including traditional AdaBoost. In addition, its processing speed is significantly higher than that of the original AdaBoost and many other classic categorization algorithms.
Adaptive metric kernel regression
DEFF Research Database (Denmark)
Goutte, Cyril; Larsen, Jan
2000-01-01
Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate...... regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms...
Research on cardiovascular disease prediction based on distance metric learning
Ni, Zhuang; Liu, Kui; Kang, Guixia
2018-04-01
Distance metric learning algorithms have been widely applied to medical diagnosis and have exhibited strengths in classification problems. The k-nearest neighbour (KNN) is an efficient method that treats each feature equally. The large margin nearest neighbour classification (LMNN) improves the accuracy of KNN by learning a global distance metric, but it does not consider the locality of data distributions. In this paper, we propose a new distance metric algorithm named COS-SUBLMNN, which adopts a cosine metric together with LMNN and pays more attention to local features of the data, in order to overcome this shortcoming of LMNN and improve classification accuracy. The proposed methodology is verified on CVD patient vectors derived from real-world medical data. The experimental results show that our method provides higher accuracy than KNN and LMNN, which demonstrates the effectiveness of the CVD risk prediction model based on COS-SUBLMNN.
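The cosine-metric ingredient of COS-SUBLMNN can be illustrated on its own; the learned Mahalanobis component of LMNN is omitted here, and the data are invented. Cosine distance compares directions rather than magnitudes, which changes which neighbors a kNN vote selects:

```python
import math
from collections import Counter

def cosine_distance(a, b):
    """1 - cosine similarity; small when a and b point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / (na * nb)

def knn_predict(X, y, q, k, dist):
    idx = sorted(range(len(X)), key=lambda i: dist(X[i], q))[:k]
    return Counter(y[i] for i in idx).most_common(1)[0][0]

# Toy data where class is determined by direction, not magnitude:
X = [(1, 0.1), (2, 0.3), (4, 0.2), (0.1, 1), (0.2, 3), (0.3, 5)]
y = ["x-like", "x-like", "x-like", "y-like", "y-like", "y-like"]
print(knn_predict(X, y, (10, 1), k=3, dist=cosine_distance))
```

A Euclidean kNN would be pulled toward points of similar magnitude; the cosine metric correctly groups the far-away query with the vectors sharing its direction.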
Adaptive Metric Kernel Regression
DEFF Research Database (Denmark)
Goutte, Cyril; Larsen, Jan
1998-01-01
Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression...... by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...
IG-KNN UNTUK PREDIKSI CUSTOMER CHURN TELEKOMUNIKASI (IG-KNN for Telecommunications Customer Churn Prediction)
Directory of Open Access Journals (Sweden)
Muhammad Arifin
2015-04-01
Full Text Available ABSTRACT IG-KNN combines the information gain feature selection algorithm with the KNN classification algorithm; together, the two algorithms are expected to increase accuracy in predicting telecommunications customer churn. Predicting customer churn is a vital need for the survival of a telecommunications company, since the company stands to lose money when many customers leave. Detecting customers who are likely to leave early on yields roughly a tenfold benefit, because retaining an existing customer costs about ten times less than acquiring a new one. Based on the results of this study, predicting telecommunications customer churn with IG-KNN shows better accuracy, even across different values of k, than prediction with KNN without Information Gain feature selection; the accuracy improvement from k=1 to k=11 is 1.7%. Keywords: information gain, KNN, telecommunications customer churn.
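The information-gain ranking step of IG-KNN can be sketched as follows. The binary toy features are invented for illustration (the churn data set itself is not reproduced); after ranking, one would keep only the top-scoring features before running KNN:

```python
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_col, labels):
    """IG of a discrete feature: H(y) - H(y | feature)."""
    base = entropy(labels)
    cond = 0.0
    for v in set(feature_col):
        subset = [lab for f, lab in zip(feature_col, labels) if f == v]
        cond += len(subset) / len(labels) * entropy(subset)
    return base - cond

# Feature 0 predicts churn perfectly; feature 1 is random noise.
X = [(1, 0), (1, 1), (1, 0), (0, 1), (0, 0), (0, 1)]
y = ["churn", "churn", "churn", "stay", "stay", "stay"]
gains = [information_gain([row[j] for row in X], y) for j in range(2)]
print([round(g, 3) for g in gains])
```

The perfectly predictive feature scores the maximum gain of 1 bit for balanced classes, while the noise feature scores close to zero, so it would be dropped before the KNN stage.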
H-Metric: Characterizing Image Datasets via Homogenization Based on KNN-Queries
Directory of Open Access Journals (Sweden)
Welington M da Silva
2012-01-01
Full Text Available Precision-Recall is one of the main metrics for evaluating content-based image retrieval techniques. However, it does not provide an ample perception of the properties of an image dataset immersed in a metric space. In this work, we describe an alternative metric named H-Metric, which is computed along a sequence of controlled modifications of the image dataset. The process is named homogenization and works by altering the homogeneity characteristics of the classes of images. The result is a process that measures how hard it is to deal with a set of images with respect to content-based retrieval, offering support for the task of analyzing configurations of distance functions and feature extractors.
Bakhtiari-Nejad, Maryam; Nguyen, Nhan T.; Krishnakumar, Kalmanje Srinvas
2009-01-01
This paper presents the application of the Bounded Linear Stability Analysis (BLSA) method to metrics-driven adaptive control. The BLSA method is used for analyzing the stability of adaptive control models without linearizing the adaptive laws. Metrics-driven adaptive control introduces the notion that adaptation should be driven by stability metrics to achieve robustness. Through the application of the BLSA method, the adaptive gain is adjusted during adaptation in order to meet certain phase margin requirements. The analysis of metrics-driven adaptive control is evaluated on a linear damaged twin-engine generic transport aircraft model. The analysis shows that the system with the adjusted adaptive gain becomes more robust to unmodeled dynamics and time delay.
Research on quality metrics of wireless adaptive video streaming
Li, Xuefei
2018-04-01
With the development of wireless networks and intelligent terminals, video traffic has increased dramatically, and adaptive video streaming has become one of the most promising video transmission technologies. For this type of service, a good QoS (Quality of Service) of the wireless network does not always guarantee that all customers have a good experience. Thus, new quality metrics have been widely studied recently. Taking this into account, the objective of this paper is to investigate quality metrics for wireless adaptive video streaming. A wireless video streaming simulation platform with a DASH mechanism and a multi-rate video generator is established. Based on this platform, a PSNR model, an SSIM model and a Quality Level model are implemented. The Quality Level model considers QoE (Quality of Experience) factors such as image quality, stalling and switching frequency, while the PSNR and SSIM models mainly consider the quality of the video itself. To evaluate the performance of these QoE models, three performance metrics (SROCC, PLCC and RMSE), which compare subjective and predicted MOS (Mean Opinion Score), are calculated. From these performance metrics, the monotonicity, linearity and accuracy of the quality metrics can be observed.
Applicability of Existing Objective Metrics of Perceptual Quality for Adaptive Video Streaming
DEFF Research Database (Denmark)
Søgaard, Jacob; Krasula, Lukás; Shahid, Muhammad
2016-01-01
Objective video quality metrics are designed to estimate the quality of experience of the end user. However, these objective metrics are usually validated with video streams degraded under common distortion types. In the presented work, we analyze the performance of published and known full-reference and no-reference quality metrics in estimating the perceived quality of adaptive bit-rate video streams, knowingly out of scope. Experimental results indicate, not surprisingly, that state-of-the-art objective quality metrics overlook the perceived degradations in adaptive video streams and perform poorly...
A kNN method that uses a non-natural evolutionary algorithm for ...
African Journals Online (AJOL)
We used this algorithm for component selection of a kNN (k Nearest Neighbor) method for breast cancer prognosis. Results with the UCI prognosis data set show that we can find components that help improve the accuracy of kNN by almost 3%, raising it above 79%. Keywords: kNN; classification; evolutionary algorithm; ...
KNN BASED CLASSIFICATION OF DIGITAL MODULATED SIGNALS
Directory of Open Access Journals (Sweden)
Sajjad Ahmed Ghauri
2016-11-01
Full Text Available The demodulation process without knowledge of the modulation scheme requires Automatic Modulation Classification (AMC). When the receiver has limited information about the received signal, AMC becomes an essential process. AMC plays an important role in many civil and military fields, such as modern electronic warfare, interfering source recognition, frequency management and link adaptation. In this paper, we explore the use of the K-nearest neighbor (KNN) classifier for modulation classification with different distance measurement methods. Five modulation schemes are used for classification: Binary Phase Shift Keying (BPSK), Quadrature Phase Shift Keying (QPSK), Quadrature Amplitude Modulation (QAM), 16-QAM and 64-QAM. Higher-order cumulants (HOC) are used as the input feature set to the classifier. Simulation results show that the proposed classification method provides good results for the considered modulation formats.
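One higher-order cumulant feature such pipelines feed to KNN is C40 = E[x^4] - 3*E[x^2]^2, sketched here for noise-free, unit-energy BPSK and QPSK symbols (the paper's full feature set and classifier are not reproduced). For these constellations the theoretical values are -2 for BPSK and -1 for QPSK, so the feature alone already separates the two schemes:

```python
import cmath
import random

def c40(symbols):
    """Fourth-order cumulant C40 of a zero-mean complex signal."""
    n = len(symbols)
    m2 = sum(s ** 2 for s in symbols) / n
    m4 = sum(s ** 4 for s in symbols) / n
    return m4 - 3 * m2 ** 2

random.seed(0)
# Unit-energy symbol streams: BPSK on the real axis, QPSK on the diagonals.
bpsk = [complex(random.choice([-1, 1]), 0) for _ in range(1000)]
qpsk = [complex(random.choice([-1, 1]), random.choice([-1, 1])) / cmath.sqrt(2)
        for _ in range(1000)]
print(round(c40(bpsk).real), round(c40(qpsk).real))
```

A kNN classifier built on such cumulant features simply measures distances between these statistics computed from the received signal and from labeled training signals.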
Directory of Open Access Journals (Sweden)
Karsten Schulz
2009-11-01
Full Text Available Nearest neighbor techniques are commonly used in remote sensing, pattern recognition and statistics to classify objects into a predefined number of categories based on a given set of predictors. These techniques are especially useful for highly nonlinear relationships between the variables. In most studies the distance measure is adopted a priori. In contrast, we propose a general procedure to find an adaptive metric that combines a local variance-reducing technique and a linear embedding of the observation space into an appropriate Euclidean space. To illustrate the application of this technique, two agricultural land cover classifications using mono-temporal and multi-temporal Landsat scenes are presented. The results of the study, compared with standard approaches used in remote sensing such as maximum likelihood (ML) or k-Nearest Neighbor (k-NN), indicate substantial improvement with regard to the overall accuracy and the cardinality of the calibration data set. Also, using MNN in a soft/fuzzy classification framework proved to be a very useful tool for identifying critical areas that need further attention and investment in terms of additional calibration data.
Growth of KNN thin films for non-linear optical applications
International Nuclear Information System (INIS)
Sharma, Shweta; Gupta, Reema; Gupta, Vinay; Tomar, Monika
2018-01-01
Two-wave mixing is a remarkable area of research in the field of non-linear optics, finding various applications in the development of opto-electronic devices, photorefractive waveguides, real-time holography, etc. The non-linear optical properties of ferroelectric potassium sodium niobate (KNN) thin films have been interrogated using the two-wave mixing phenomenon. To this end, a-axis oriented K0.35Na(1-0.35)NbO3 thin films were successfully grown on an epitaxially matched (100) SrTiO3 substrate using the pulsed laser deposition (PLD) technique. Uniformly distributed Au micro-discs of 200 μm diameter were integrated with the KNN/STO thin film to study the plasmonic enhancement of the optical response. Beam amplification has been observed as a result of the two-wave mixing, due to the alignment of ferroelectric domains in the KNN films and the excitation of plasmons at the metal-dielectric (Au-KNN) interface. (copyright 2017 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
Growth of KNN thin films for non-linear optical applications
Energy Technology Data Exchange (ETDEWEB)
Sharma, Shweta; Gupta, Reema; Gupta, Vinay [Department of Physics and Astrophysics, University of Delhi (India); Tomar, Monika [Department of Physics, Miranda House University of Delhi (India)
2018-02-15
Two-wave mixing is a remarkable area of research in the field of non-linear optics, finding various applications in the development of opto-electronic devices, photorefractive waveguides, real-time holography, etc. The non-linear optical properties of ferroelectric potassium sodium niobate (KNN) thin films have been interrogated using the two-wave mixing phenomenon. To this end, a-axis oriented K{sub 0.35}Na{sub (1-0.35)}NbO{sub 3} thin films were successfully grown on an epitaxially matched (100) SrTiO{sub 3} substrate using the pulsed laser deposition (PLD) technique. Uniformly distributed Au micro-discs of 200 μm diameter were integrated with the KNN/STO thin film to study the plasmonic enhancement of the optical response. Beam amplification has been observed as a result of the two-wave mixing, due to the alignment of ferroelectric domains in the KNN films and the excitation of plasmons at the metal-dielectric (Au-KNN) interface. (copyright 2017 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
Preserved Network Metrics across Translated Texts
Cabatbat, Josephine Jill T.; Monsanto, Jica P.; Tapang, Giovanni A.
2014-09-01
Co-occurrence language networks based on Bible translations and the Universal Declaration of Human Rights (UDHR) translations in different languages were constructed and compared with random text networks. Among the considered network metrics, the network size, N, the normalized betweenness centrality (BC), and the average k-nearest neighbors, knn, were found to be the most preserved across translations. Moreover, similar frequency distributions of co-occurring network motifs were observed for translated texts networks.
Efficient and Flexible KNN Query Processing in Real-Life Road Networks
DEFF Research Database (Denmark)
Lu, Yang; Bui, Bin; Zhao, Jiakui
2008-01-01
Along with the developments of mobile services, effectively modeling road networks and efficiently indexing and querying network constrained objects has become a challenging problem. In this paper, we first introduce a road network model which captures real-life road networks better than previous... ...are included into the RNG index, which enables the index to support both distance-based and time-based KNN queries and continuous KNN queries. Our work extends previous ones by taking into account more practical scenarios, such as complexities in real-life road networks and time-based KNN queries. Extensive...
Secure kNN Computation and Integrity Assurance of Data Outsourcing in the Cloud
Directory of Open Access Journals (Sweden)
Jun Hong
2017-01-01
Full Text Available As cloud computing has been adopted massively and rapidly, individuals and enterprises prefer outsourcing their databases to the cloud service provider (CSP) to save the expense of managing and maintaining the data. The outsourced databases are hosted, and query services are offered to clients, by the CSP, whereas the CSP is not fully trusted; consequently, security may be violated by multiple factors. Data privacy and query integrity are perceived as the two major factors obstructing enterprises from outsourcing their databases. A novel scheme is proposed in this paper to support k-nearest neighbors (kNN) queries and kNN query authentication on an encrypted outsourced spatial database. An asymmetric scalar-product-preserving encryption scheme is employed, in which data points and query points are encrypted with different encryption keys, yet the CSP can determine the distance relation between encrypted data points and query points. Furthermore, the similarity search tree is extended to build a novel verifiable SS-tree that supports efficient kNN queries and kNN query verification. The security analysis and experimental results indicate that our scheme not only maintains the confidentiality of outsourced confidential data and query points but also has lower kNN query processing and verification overhead than the MR-tree.
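The core property of asymmetric scalar-product-preserving encryption (ASPE) can be sketched with a toy key: data points and queries are encrypted with different keys (M^T and M^-1), yet inner products of ciphertexts still reveal which data point is nearer to the query, because E(p)·E(q) = r(p·q - ||p||²/2) and a larger value means a smaller distance. The key matrix and blinding factor below are illustrative only, not the paper's construction in full:

```python
M = [[1, 2, 0], [0, 1, 0], [0, 0, 1]]          # secret invertible key
M_inv = [[1, -2, 0], [0, 1, 0], [0, 0, 1]]     # its inverse

def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(3)) for i in range(3)]

def transpose(A):
    return [[A[j][i] for j in range(3)] for i in range(3)]

def enc_point(p):
    # Augment the 2-D point with -||p||^2 / 2, then encrypt with M^T.
    aug = [p[0], p[1], -0.5 * (p[0] ** 2 + p[1] ** 2)]
    return matvec(transpose(M), aug)

def enc_query(q, r=7.0):
    # r > 0 is a random blinding factor; encrypt with M^-1.
    return matvec(M_inv, [r * q[0], r * q[1], r])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# The server ranks encrypted points by inner product with the encrypted
# query: a larger product means a smaller true distance.
p1, p2, q = (1.0, 1.0), (5.0, 5.0), (1.5, 1.0)
c1, c2, cq = enc_point(p1), enc_point(p2), enc_query(q)
print(dot(c1, cq) > dot(c2, cq))  # p1 is nearer to q, so True
```

The M and M^-1 factors cancel inside the inner product, which is why the server can rank distances without learning the plaintext coordinates.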
Directory of Open Access Journals (Sweden)
SUCI AULIA
2015-01-01
Full Text Available ABSTRACT Research on classifying the severity of diabetic retinopathy using image processing is still actively discussed; the images used to detect this disease are optic disc, microaneurysm, exudate, and hemorrhage images derived from fundus images. This study compares the SVM and KNN algorithms for classification of diabetic retinopathy (mild, moderate, severe) based on exudate and microaneurysm images. Feature extraction uses the wavelet method for both approaches. The study used 160 test images: 40 images each for the normal, mild, moderate, and severe classes. The accuracy obtained with KNN was higher than with SVM, at 65% and 62% respectively. The KNN classification achieved its best results with the parameters K=9 and the cityblock distance, while the SVM classification achieved its best results with the One-Against-All parameter. Keywords: Diabetic Retinopathy, KNN, SVM, Wavelet.
CONTROLLED CONDENSATION IN K-NN AND ITS APPLICATION FOR REAL TIME COLOR IDENTIFICATION
Directory of Open Access Journals (Sweden)
Carmen Villar Patiño
2017-04-01
Full Text Available k-NN algorithms are frequently used in statistical classification. They are accurate and distribution-free. Despite these advantages, k-NN algorithms carry a high computational cost, and finding efficient ways to implement them is an important challenge in pattern recognition. In this article, an improved version of the k-NN Controlled Condensation algorithm is introduced, and its potential for instantaneous color identification in real time is analyzed. The algorithm is based on representing the data in terms of a reduced set of informative prototypes. It includes two parameters to control the balance between speed and precision, which makes it possible to achieve a convenient percentage of condensation without incurring a significant loss of accuracy. We test our proposal in an instantaneous color identification exercise on video images. We achieve real-time identification by executing k-NN Controlled Condensation through multi-threaded programming methods. The results are encouraging.
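The condensation idea itself can be sketched with Hart's classic condensed nearest-neighbor rule, which the paper's algorithm refines; the "controlled" speed/precision parameters are not modeled in this sketch:

```python
import math

def condense(X, y):
    """Greedy 1-NN condensation: add a sample to the prototype set only
    if the current prototypes misclassify it (Hart's CNN rule)."""
    protos = [0]  # seed with the first sample
    changed = True
    while changed:
        changed = False
        for i in range(len(X)):
            nearest = min(protos, key=lambda j: math.dist(X[i], X[j]))
            if y[nearest] != y[i]:
                protos.append(i)
                changed = True
    return protos

# Two well-separated 1-D classes condense to very few prototypes.
X = [(float(v),) for v in [0, 1, 2, 3, 10, 11, 12, 13]]
y = ["lo"] * 4 + ["hi"] * 4
kept = condense(X, y)
print(len(kept), "of", len(X), "samples kept")
```

After condensation, kNN queries only need to search the prototype set, which is where the speedup comes from.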
Directory of Open Access Journals (Sweden)
Akmal Mat Harttat Maziati
2015-06-01
Full Text Available Alkaline niobates, mainly potassium sodium niobate, (KxNa1-x)NbO3 (abbreviated as KNN), have long attracted attention as piezoelectric materials owing to their high Curie temperature (Tc) and piezoelectric properties. The volatility of the alkaline elements (K, Na) is, however, detrimental to the stoichiometry of KNN, contributing to the failure to achieve a high-density structure and leading to the formation of intrinsic defects. By partial doping with several rare-earth elements, these inherent defects can be reduced significantly. Therefore, considerable attempts have been made to develop doped KNN-based ceramic materials with high electrical properties. In this paper, these research activities are reviewed, including dopant types and the doping role in the KNN perovskite structure.
Short Term Prediction of Freeway Exiting Volume Based on SVM and KNN
Directory of Open Access Journals (Sweden)
Xiang Wang
2015-09-01
The model results indicate that the proposed algorithm is feasible and accurate, with a Mean Absolute Percentage Error under 10%. Compared with the results of the single KNN or SVM method, the combination of KNN and SVM improves the reliability of the prediction significantly. The proposed method can be implemented in online exiting-volume prediction and is able to consider different vehicle types.
DeWeber, Jefferson T.; Wagner, Tyler
2018-01-01
Predictions of the projected changes in species distributions and potential adaptation action benefits can help guide conservation actions. There is substantial uncertainty in projecting species distributions into an unknown future, however, which can undermine confidence in predictions or misdirect conservation actions if not properly considered. Recent studies have shown that the selection of alternative climate metrics describing very different climatic aspects (e.g., mean air temperature vs. mean precipitation) can be a substantial source of projection uncertainty. It is unclear, however, how much projection uncertainty might stem from selecting among highly correlated, ecologically similar climate metrics (e.g., maximum temperature in July, maximum 30-day temperature) describing the same climatic aspect (e.g., maximum temperatures) known to limit a species' distribution. It is also unclear how projection uncertainty might propagate into predictions of the potential benefits of adaptation actions that might lessen climate change effects. We provide probabilistic measures of climate change vulnerability, adaptation action benefits, and related uncertainty stemming from the selection of four maximum temperature metrics for brook trout (Salvelinus fontinalis), a cold-water salmonid of conservation concern in the eastern United States. Projected losses in suitable stream length varied by as much as 20% among alternative maximum temperature metrics for mid-century climate projections, which was similar to variation among three climate models. Similarly, the regional average predicted increase in brook trout occurrence probability under an adaptation action scenario of full riparian forest restoration varied by as much as 0.2 among metrics. Our use of Bayesian inference provides probabilistic measures of vulnerability and adaptation action benefits for individual stream reaches that properly address statistical uncertainty and can help guide conservation
DeWeber, Jefferson T; Wagner, Tyler
2018-06-01
Predictions of the projected changes in species distributions and potential adaptation action benefits can help guide conservation actions. There is substantial uncertainty in projecting species distributions into an unknown future, however, which can undermine confidence in predictions or misdirect conservation actions if not properly considered. Recent studies have shown that the selection of alternative climate metrics describing very different climatic aspects (e.g., mean air temperature vs. mean precipitation) can be a substantial source of projection uncertainty. It is unclear, however, how much projection uncertainty might stem from selecting among highly correlated, ecologically similar climate metrics (e.g., maximum temperature in July, maximum 30-day temperature) describing the same climatic aspect (e.g., maximum temperatures) known to limit a species' distribution. It is also unclear how projection uncertainty might propagate into predictions of the potential benefits of adaptation actions that might lessen climate change effects. We provide probabilistic measures of climate change vulnerability, adaptation action benefits, and related uncertainty stemming from the selection of four maximum temperature metrics for brook trout (Salvelinus fontinalis), a cold-water salmonid of conservation concern in the eastern United States. Projected losses in suitable stream length varied by as much as 20% among alternative maximum temperature metrics for mid-century climate projections, which was similar to variation among three climate models. Similarly, the regional average predicted increase in brook trout occurrence probability under an adaptation action scenario of full riparian forest restoration varied by as much as 0.2 among metrics. Our use of Bayesian inference provides probabilistic measures of vulnerability and adaptation action benefits for individual stream reaches that properly address statistical uncertainty and can help guide conservation actions. Our
Self-Organization in Aggregating Robot Swarms: A DW-KNN Topological Approach
Khaldi, Belkacem
2018-02-02
In certain swarm applications, where the inter-agent distance is not the only factor in the collective behaviour of the swarm, additional properties such as density can have a crucial effect. In this paper, we propose applying a Distance-Weighted K-Nearest Neighbouring (DW-KNN) topology to the behaviour of robot swarms performing self-organized aggregation, in combination with a virtual physics approach to keep the robots together. A distance-weighted function based on a Smoothed Particle Hydrodynamics (SPH) interpolation approach, which is used to evaluate the robot density in the swarm, is applied as the key factor for identifying the K nearest neighbours taken into account when aggregating the robots. The virtual physical connectivity among these neighbours is achieved using a virtual viscoelastic-based proximity model. With the ARGoS-based simulator, we model and evaluate the proposed approach, showing various self-organized aggregations performed by a swarm of N foot-bot robots. We also compared the aggregation quality of the DW-KNN approach to that of the conventional KNN approach and found better performance.
Quantum Algorithm for K-Nearest Neighbors Classification Based on the Metric of Hamming Distance
Ruan, Yue; Xue, Xiling; Liu, Heng; Tan, Jianing; Li, Xi
2017-11-01
The k-nearest neighbors (KNN) algorithm is a common algorithm used for classification, and also a subroutine in various complicated machine learning tasks. In this paper, we present a quantum algorithm (QKNN) for implementing this algorithm based on the metric of Hamming distance. We put forward a quantum circuit for computing the Hamming distance between the testing sample and each feature vector in the training set. Taking advantage of this method, we realize a good analog of the classical KNN algorithm by setting a distance threshold value t to select the k nearest neighbors. As a result, QKNN achieves O(n^3) performance, which depends only on the dimension of the feature vectors, with high classification accuracy, outperforming Lloyd's algorithm (Lloyd et al. 2013) and Wiebe's algorithm (Wiebe et al. 2014).
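The classical logic that the quantum circuit parallelizes is straightforward; this sketch mirrors only the Hamming-distance kNN vote on toy bit strings, not the quantum speedup or the threshold-t selection mechanism:

```python
from collections import Counter

def hamming(a, b):
    """Number of positions at which two equal-length bit strings differ."""
    return sum(x != y for x, y in zip(a, b))

def knn_hamming(train, labels, query, k=3):
    nearest = sorted(range(len(train)),
                     key=lambda i: hamming(train[i], query))[:k]
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]

train = ["0000", "0001", "0011", "1110", "1111", "1101"]
labels = ["low", "low", "low", "high", "high", "high"]
print(knn_hamming(train, labels, "0010", k=3))
```

The quantum version evaluates all the Hamming distances in superposition, whereas this classical loop costs O(N·d) comparisons for N training strings of length d.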
Quality Assessment of Adaptive Bitrate Videos using Image Metrics and Machine Learning
DEFF Research Database (Denmark)
Søgaard, Jacob; Forchhammer, Søren; Brunnström, Kjell
2015-01-01
Adaptive bitrate (ABR) streaming is widely used for distribution of videos over the internet. In this work, we investigate how well we can predict the quality of such videos using well-known image metrics, information about the bitrate levels, and a relatively simple machine learning method...
Short-term Power Load Forecasting Based on Balanced KNN
Lv, Xianlong; Cheng, Xingong; YanShuang; Tang, Yan-mei
2018-03-01
To improve the accuracy of load forecasting, a short-term load forecasting model based on the balanced KNN algorithm is proposed. According to the load characteristics, the massive historical power-load data are divided into scenes by the K-means algorithm. In view of unbalanced load scenes, the balanced KNN algorithm is proposed to classify the scenes accurately. The locally weighted linear regression algorithm is used to fit and predict the load. Adopting the Apache Hadoop programming framework for cloud computing, the proposed model is parallelized and improved to enhance its ability to deal with massive, high-dimensional data. Household electricity-consumption data for a residential district were analysed on a 23-node cloud computing cluster, and experimental results show that the load forecasting accuracy and execution time of the proposed model are better than those of traditional forecasting algorithms.
Directory of Open Access Journals (Sweden)
Sharif Uddin
2016-01-01
Full Text Available An enhanced k-nearest neighbor (k-NN) classification algorithm is presented, which uses a density-based similarity measure in addition to a distance-based similarity measure to improve the diagnostic performance in bearing fault diagnosis. Because it uses a distance-based similarity measure alone, the classification accuracy of traditional k-NN deteriorates in the case of overlapping samples and outliers, and is highly susceptible to the neighborhood size, k. This study addresses these limitations by proposing the use of both distance- and density-based measures of similarity between training and test samples. The proposed k-NN classifier is used to enhance the diagnostic performance of a bearing fault diagnosis scheme, which classifies different fault conditions based upon hybrid feature vectors extracted from acoustic emission (AE) signals. Experimental results demonstrate that the proposed scheme, which uses the enhanced k-NN classifier, yields better diagnostic performance and is more robust to variations in the neighborhood size, k.
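A hedged sketch of blending distance- and density-based similarity: the Gaussian kernel, the bandwidth `h` and the mixing weight `alpha` below are illustrative assumptions, not the paper's formulation.

```python
import math
from collections import Counter

def classify(train, x, k, h=1.0, alpha=0.5):
    """Score each training sample by a blend of its distance to x and a
    Gaussian kernel density estimate around it, then vote over the k
    best-scoring (lowest-score) samples."""
    def density(p):
        return sum(math.exp(-math.dist(p, q) ** 2 / (2 * h * h))
                   for q, _ in train)
    scored = sorted((alpha * math.dist(x, p) - (1 - alpha) * density(p), label)
                    for p, label in train)
    return Counter(label for _, label in scored[:k]).most_common(1)[0][0]
```

The density term lets a sample inside a dense, well-supported cluster outvote a slightly nearer outlier, which is how the enhanced measure resists overlapping samples.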
Assessment of various supervised learning algorithms using different performance metrics
Susheel Kumar, S. M.; Laxkar, Deepak; Adhikari, Sourav; Vijayarajan, V.
2017-11-01
Our work presents a comparison of the performance of supervised machine learning algorithms on a binary classification task. The supervised machine learning algorithms taken into consideration are Support Vector Machine (SVM), Decision Tree (DT), K-Nearest Neighbour (KNN), Naïve Bayes (NB) and Random Forest (RF). This paper focuses on comparing the performance of the above-mentioned algorithms on one binary classification task by analysing metrics such as accuracy, F-measure, G-measure, precision, misclassification rate, false positive rate, true positive rate, specificity and prevalence.
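All of the metrics listed above derive from the four confusion-matrix counts; a small helper makes the standard definitions concrete (these are the textbook formulas, not values from the paper).

```python
def binary_metrics(tp, fp, fn, tn):
    """Standard binary-classification metrics from confusion-matrix counts."""
    total = tp + fp + fn + tn
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)          # true positive rate / sensitivity
    return {
        "accuracy": (tp + tn) / total,
        "misclassification_rate": (fp + fn) / total,
        "precision": precision,
        "true_positive_rate": recall,
        "false_positive_rate": fp / (fp + tn),
        "specificity": tn / (tn + fp),
        "prevalence": (tp + fn) / total,
        "f_measure": 2 * precision * recall / (precision + recall),
        "g_measure": (precision * recall) ** 0.5,
    }
```

For instance, 40 true positives, 10 false positives, 20 false negatives and 30 true negatives give accuracy 0.7, precision 0.8 and recall 2/3.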
Lead-free piezoelectric KNN-BZ-BNT films with a vertical morphotropic phase boundary
Directory of Open Access Journals (Sweden)
Wen Chen
2015-07-01
Full Text Available The lead-free piezoelectric 0.915K0.5Na0.5NbO3-0.075BaZrO3-0.01Bi0.5Na0.5TiO3 (0.915KNN-0.075BZ-0.01BNT) films were prepared by a chemical solution deposition method. The films possess a pure rhombohedral perovskite phase and a dense surface without cracks. The temperature-dependent dielectric properties of the specimens show that only the ferroelectric-to-paraelectric phase transition occurs, with a Curie temperature of 217 °C. The temperature stability of the ferroelectric phase is also supported by the stable piezoelectric properties of the films. These results suggest that the slope of the morphotropic phase boundary (MPB) for the solid solution formed by KNN and BZ in the films should be vertical. The voltage-induced polarization switching and a distinct piezo-response suggest that the 0.915KNN-0.075BZ-0.01BNT films have good piezoelectric properties.
Processing and characterizations of BNT-KNN ceramics for actuator applications
Directory of Open Access Journals (Sweden)
Mallam Chandrasekhar
2016-06-01
Full Text Available BNT-KNN powder (with composition 0.93Bi0.5Na0.5TiO3–0.07K0.5Na0.5NbO3) was synthesized as a single perovskite phase by the conventional solid-state reaction route, and dense ceramics were obtained by sintering the powder compacts at 1100 °C for 4 h. A dielectric study confirmed relaxor behaviour, whereas the microstructure study showed sharp-cornered cube-like grains with an average grain size of ∼1.15 µm. The saturated polarization vs. electric field (P-E) hysteresis loops confirmed the ferroelectric (FE) nature, while the butterfly-shaped strain vs. electric field (S-E) loops suggested the piezoelectric nature of the BNT-KNN ceramic samples. A maximum electric-field-induced strain of ∼0.62% suggests the usefulness of this system for actuator applications.
Nanoscale characterization and local piezoelectric properties of lead-free KNN-LT-LS thin films
Energy Technology Data Exchange (ETDEWEB)
Abazari, M; Safari, A [Glenn Howatt Electroceramics Laboratories, Department of Materials Science and Engineering, Rutgers-The state University of New Jersey, Piscataway, NJ 08854 (United States); Choi, T; Cheong, S-W [Rutgers Center for Emergent Materials, Department of Physics and Astronomy, Rutgers-The state University of New Jersey, Piscataway, NJ 08854 (United States)
2010-01-20
We report the observation of the domain structure and piezoelectric properties of pure and Mn-doped (K0.44,Na0.52,Li0.04)(Nb0.84,Ta0.1,Sb0.06)O3 (KNN-LT-LS) thin films on SrTiO3 substrates. Piezoresponse force microscopy revealed that the ferroelectric domain structure in these 500 nm thin films comprises primarily 180° domains. This is in accordance with the tetragonal structure of the films, confirmed by relative permittivity measurements and X-ray diffraction patterns. The effective piezoelectric coefficient (d33) of the films was calculated using piezoelectric displacement curves and shown to be ~53 pm V-1 for pure KNN-LT-LS thin films. This value is among the highest reported for an epitaxial lead-free thin film and shows great potential for KNN-LT-LS to serve as an alternative to PZT thin films in future applications.
Nanoscale characterization and local piezoelectric properties of lead-free KNN-LT-LS thin films
Abazari, M.; Choi, T.; Cheong, S.-W.; Safari, A.
2010-01-01
We report the observation of the domain structure and piezoelectric properties of pure and Mn-doped (K0.44,Na0.52,Li0.04)(Nb0.84,Ta0.1,Sb0.06)O3 (KNN-LT-LS) thin films on SrTiO3 substrates. Piezoresponse force microscopy revealed that the ferroelectric domain structure in these 500 nm thin films comprises primarily 180° domains. This is in accordance with the tetragonal structure of the films, confirmed by relative permittivity measurements and X-ray diffraction patterns. The effective piezoelectric coefficient (d33) of the films was calculated using piezoelectric displacement curves and shown to be ~53 pm V-1 for pure KNN-LT-LS thin films. This value is among the highest reported for an epitaxial lead-free thin film and shows great potential for KNN-LT-LS to serve as an alternative to PZT thin films in future applications.
GPU-FS-kNN: a software tool for fast and scalable kNN computation using GPUs.
Directory of Open Access Journals (Sweden)
Ahmed Shamsul Arefin
Full Text Available BACKGROUND: The analysis of biological networks has become a major challenge due to the recent development of high-throughput techniques that are rapidly producing very large data sets. The exploding volumes of biological data demand extreme computational power and special computing facilities (i.e. supercomputers). An inexpensive solution, such as General-Purpose computation on Graphics Processing Units (GPGPU), can be adopted to tackle this challenge, but the limited internal memory of the device poses a new problem of scalability. Efficient data and computational parallelism with partitioning is required to provide a fast and scalable solution to this problem. RESULTS: We propose an efficient parallel formulation of the k-Nearest Neighbour (kNN) search problem, which is a popular method for classifying objects in several fields of research, such as pattern recognition, machine learning and bioinformatics. Although very simple and straightforward, the performance of kNN search degrades dramatically for large data sets, since the task is computationally intensive. The proposed approach is not only fast but also scalable to large-scale instances. Based on our approach, we implemented a software tool, GPU-FS-kNN (GPU-based Fast and Scalable k-Nearest Neighbour), for CUDA-enabled GPUs. The basic approach is simple and adaptable to other available GPU architectures. We observed speed-ups of 50-60 times compared with a CPU implementation on a well-known breast microarray study and its associated data sets. CONCLUSION: Our GPU-based Fast and Scalable k-Nearest Neighbour search technique (GPU-FS-kNN) provides a significant performance improvement for nearest-neighbour computation in large-scale networks. Source code and the software tool are available under the GNU Public License (GPL) at https://sourceforge.net/p/gpufsknn/.
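The partitioning idea can be illustrated on the CPU: process the candidate set in fixed-size chunks while maintaining a bounded k-element heap per query point, so the working set stays small as on a memory-limited GPU. This is a simplified sketch of the strategy, not the GPU-FS-kNN implementation.

```python
import heapq
import math

def knn_chunked(points, k, chunk=256):
    """All-pairs kNN computed chunk by chunk; only a k-element
    max-heap per query point is kept in memory at any time."""
    n = len(points)
    neighbours = []
    for i in range(n):
        heap = []  # max-heap of (-distance, index), size <= k
        for start in range(0, n, chunk):
            for j in range(start, min(start + chunk, n)):
                if j == i:
                    continue
                d = math.dist(points[i], points[j])
                if len(heap) < k:
                    heapq.heappush(heap, (-d, j))
                elif d < -heap[0][0]:
                    heapq.heapreplace(heap, (-d, j))
        neighbours.append(sorted(j for _, j in heap))
    return neighbours
```

Because each chunk is independent, the inner loop is exactly the part that maps naturally onto GPU thread blocks; the chunk size controls the memory/parallelism trade-off.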
Sharp metric obstructions for quasi-Einstein metrics
Case, Jeffrey S.
2013-02-01
Using the tractor calculus to study smooth metric measure spaces, we adapt results of Gover and Nurowski to give sharp metric obstructions to the existence of quasi-Einstein metrics on suitably generic manifolds. We do this by introducing an analogue of the Weyl tractor W to the setting of smooth metric measure spaces. The obstructions we obtain can be realized as tensorial invariants which are polynomial in the Riemann curvature tensor and its divergence. By taking suitable limits of their tensorial forms, we then find obstructions to the existence of static potentials, generalizing to higher dimensions a result of Bartnik and Tod, and to the existence of potentials for gradient Ricci solitons.
Multi-site Stochastic Simulation of Daily Streamflow with Markov Chain and KNN Algorithm
Mathai, J.; Mujumdar, P.
2017-12-01
A key focus of this study is to develop a method that is physically consistent with the hydrologic processes and can capture short-term characteristics of the daily hydrograph as well as the correlation of streamflow in the temporal and spatial domains. In complex water resource systems, flow fluctuations at small time intervals require that discretisation be done at small time scales, such as daily scales. Also, simultaneous generation of synthetic flows at different sites in the same basin is required. We propose a method to equip water managers with a streamflow generator within a stochastic streamflow simulation framework. The motivation for the proposed method is to generate sequences that extend beyond the variability represented in the historical record of the streamflow time series. The method has two steps. In step 1, daily flow is generated independently at each station by a two-state Markov chain, with rising-limb increments randomly sampled from a Gamma distribution and the falling limb modelled as an exponential recession; in step 2, the streamflow generated in step 1 is input to a nonparametric K-nearest neighbor (KNN) time series bootstrap resampler. The KNN model, being data driven, does not require assumptions on the dependence structure of the time series. A major limitation of KNN-based streamflow generators is that they do not produce new values, but merely reshuffle the historical data to generate realistic streamflow sequences. However, daily flow generated using the Markov chain approach is capable of generating a rich variety of streamflow sequences. Furthermore, the rising and falling limbs of the daily hydrograph represent different physical processes, and hence they need to be modelled individually. Thus, our method combines the strengths of the two approaches. We show the utility of the method and the improvement over the traditional KNN by simulating daily streamflow sequences at 7 locations in the Godavari River basin in India.
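Step 1 of the generator might be sketched as below; the transition probabilities, Gamma parameters and recession constant are placeholders, not values calibrated in the study.

```python
import random

def generate_flow(days, p_rise=0.3, p_fall=0.5, shape=2.0, scale=5.0,
                  recession=0.85, q0=10.0, seed=42):
    """Two-state Markov chain over rising/falling limbs: rises add
    Gamma-distributed increments, falls decay exponentially.
    All parameter values here are illustrative placeholders."""
    rng = random.Random(seed)
    q, state, series = q0, "fall", []
    for _ in range(days):
        if state == "rise":
            q += rng.gammavariate(shape, scale)   # rising-limb increment
            state = "fall" if rng.random() < p_fall else "rise"
        else:
            q *= recession                        # exponential recession
            state = "rise" if rng.random() < p_rise else "fall"
        series.append(q)
    return series
```

A series produced this way contains values not present in any historical record, which is exactly what the subsequent KNN bootstrap resampling step (step 2) cannot do on its own.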
Exact Cross-Validation for kNN and applications to passive and active learning in classification
Célisse, Alain; Mary-Huard, Tristan
2011-01-01
In the binary classification framework, a closed form expression of the cross-validation Leave-p-Out (LpO) risk estimator for the k Nearest Neighbor algorithm (kNN) is derived. It is first used to study the LpO risk minimization strategy for choosing k in the passive learning setting. The impact of p on the choice of k and the LpO estimation of the risk are inferred. In the active learning setting, a procedure is proposed that selects new examples using a LpO committee of kNN classifiers. The...
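For p = 1 the LpO estimator reduces to ordinary leave-one-out, which can be computed naively as below; the paper's contribution is a closed form that avoids this explicit hold-out loop.

```python
import math
from collections import Counter

def loo_knn_error(data, k):
    """Naive leave-one-out (p = 1) risk of the kNN classifier: hold out
    each sample, classify it from the rest, and average the errors."""
    errors = 0
    for i, (x, y) in enumerate(data):
        rest = sorted((d for j, d in enumerate(data) if j != i),
                      key=lambda t: math.dist(t[0], x))
        vote = Counter(lbl for _, lbl in rest[:k]).most_common(1)[0][0]
        errors += vote != y
    return errors / len(data)
```

Minimizing this estimate over candidate values of k is the passive-learning model-selection strategy the abstract refers to.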
Directory of Open Access Journals (Sweden)
Abdullah M. Iliyasu
2017-12-01
Full Text Available A quantum hybrid (QH) intelligent approach that blends the adaptive search capability of the quantum-behaved particle swarm optimisation (QPSO) method with the intuitionistic rationality of the traditional fuzzy k-nearest neighbours (Fuzzy k-NN) algorithm (known simply as the Q-Fuzzy approach) is proposed for efficient feature selection and classification of cells in cervical smeared (CS) images. From an initial multitude of 17 features describing the geometry, colour, and texture of the CS images, the QPSO stage of our proposed technique is used to select the best subset of features (i.e., the global best particles), which represents a pruned-down collection of seven features. Using a dataset of almost 1000 images, performance evaluation of our proposed Q-Fuzzy approach assesses the impact of our feature selection on classification accuracy by way of three experimental scenarios that are compared alongside two other approaches: the All-features approach (i.e., classification without prior feature selection) and another hybrid technique combining the standard PSO algorithm with the Fuzzy k-NN technique (the P-Fuzzy approach). In the first and second scenarios, we further divided the assessment criteria in terms of classification accuracy based on the choice of best features and in terms of the different categories of the cervical cells. In the third scenario, we introduced new QH hybrid techniques, i.e., QPSO combined with other supervised learning methods, and compared the classification accuracy alongside our proposed Q-Fuzzy approach. Furthermore, we employed statistical approaches to establish qualitative agreement with regard to the feature selection in experimental scenarios 1 and 3. The synergy between QPSO and Fuzzy k-NN in the proposed Q-Fuzzy approach improves classification accuracy, as manifest in the reduction in the number of cell features, which is crucial for effective cervical cancer detection and diagnosis.
International Nuclear Information System (INIS)
Rubio-Marcos, F.; Marchet, P.; Merle-Mejean, T.; Fernandez, J.F.
2010-01-01
Lead-free KNN-modified piezoceramics of the system (Li,Na,K)(Nb,Ta,Sb)O3 were prepared by conventional solid-state sintering. The X-ray diffraction patterns revealed a perovskite phase, together with a minor secondary phase, which was assigned to K3LiNb6O17, a tetragonal tungsten-bronze (TTB) phase. A structural evolution toward a pure tetragonal structure with increasing sintering time was observed, associated with the decrease of the TTB phase. A correlation between higher tetragonality and higher piezoelectric response was clearly evidenced. Contrary to the case of LiTaO3-modified KNN, very large abnormal grains with TTB structure were not detected. As a consequence, the simultaneous modification by tantalum and antimony seems to induce during sintering a behaviour different from that of LiTaO3-modified KNN.
Energy Technology Data Exchange (ETDEWEB)
Rubio-Marcos, F., E-mail: frmarcos@icv.csic.es [Electroceramic Department, Instituto de Ceramica y Vidrio, CSIC, Kelsen 5, 28049 Madrid (Spain); Marchet, P.; Merle-Mejean, T. [SPCTS, UMR 6638 CNRS, Universite de Limoges, 123, Av. A. Thomas, 87060 Limoges (France); Fernandez, J.F. [Electroceramic Department, Instituto de Ceramica y Vidrio, CSIC, Kelsen 5, 28049 Madrid (Spain)
2010-09-01
Lead-free KNN-modified piezoceramics of the system (Li,Na,K)(Nb,Ta,Sb)O3 were prepared by conventional solid-state sintering. The X-ray diffraction patterns revealed a perovskite phase, together with some minor secondary phase, which was assigned to K3LiNb6O17, tetragonal tungsten-bronze (TTB). A structural evolution toward a pure tetragonal structure with the increasing sintering time was observed, associated with the decrease of TTB phase. A correlation between higher tetragonality and higher piezoelectric response was clearly evidenced. Contrary to the case of the LiTaO3 modified KNN, very large abnormal grains with TTB structure were not detected. As a consequence, the simultaneous modification by tantalum and antimony seems to induce during sintering a different behaviour from the one of LiTaO3 modified KNN.
Adeniyi, D. A.; Wei, Z.; Yang, Y.
2017-10-01
Recommendation problems have been extensively studied by researchers in the fields of data mining, databases and information retrieval. This study presents the design and realisation of an automated, personalised news recommendation system based on a Chi-square statistics-based K-nearest neighbour (χ2SB-KNN) model. The proposed χ2SB-KNN model has the potential to overcome computational complexity and information-overloading problems, and it reduces runtime and speeds up execution through the use of the critical value of the χ2 distribution. The proposed recommendation engine can alleviate scalability challenges through combined online pattern discovery and pattern matching for real-time recommendations. This work also showcases the development of a novel method of feature selection referred to as the Data Discretisation-Based feature selection method. This is used for selecting the best features for the proposed χ2SB-KNN algorithm at the preprocessing stage of the classification procedures. The implementation of the proposed χ2SB-KNN model is achieved through the use of an in-house Java program on an experimental website called the OUC newsreaders' website. Finally, we compared the performance of our system with two baseline methods, traditional Euclidean-distance K-nearest neighbour and Naive Bayesian techniques. The results show a significant improvement of our method over the baseline methods studied.
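The chi-square filtering idea might look like this in outline; the critical value shown is the standard 5% point for one degree of freedom, and the selection rule is a generic sketch rather than the χ2SB-KNN model itself.

```python
def chi_square(observed, expected):
    """Pearson's chi-square statistic for one feature's contingency counts."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

def select_features(stats, critical_value):
    """Keep indices of features whose statistic exceeds the critical value
    of the chi-square distribution (3.841 is the 5% point for 1 d.f.)."""
    return [i for i, s in enumerate(stats) if s > critical_value]
```

Comparing against a fixed critical value rather than ranking all features is what lets such a filter short-circuit early, which is the runtime saving the abstract alludes to.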
A new approach to very short term wind speed prediction using k-nearest neighbor classification
International Nuclear Information System (INIS)
Yesilbudak, Mehmet; Sagiroglu, Seref; Colak, Ilhami
2013-01-01
Highlights: ► Wind speed was predicted from n-tupled inputs using k-NN classification. ► The effects of input parameters, nearest neighbors and distance metrics were analyzed. ► Many useful and reasonable inferences were uncovered using the developed model. - Abstract: Wind energy is an inexhaustible energy source, and wind power production has been growing rapidly in recent years. However, wind power has a non-schedulable nature due to wind speed variations. Hence, wind speed prediction is an indispensable requirement for power system operators. This paper predicts the wind speed parameter from n-tupled inputs using k-nearest neighbor (k-NN) classification and analyzes the effects of input parameters, nearest neighbors and distance metrics on wind speed prediction. The k-NN classification model was developed using object-oriented programming techniques and includes the Manhattan and Minkowski distance metrics in addition to the Euclidean distance metric, in contrast to the literature. The k-NN classification model using the wind direction, air temperature, atmospheric pressure and relative humidity parameters in a 4-tupled space achieved the best wind speed prediction for k = 5 with the Manhattan distance metric. In contrast, the k-NN classification model using the wind direction, air temperature and atmospheric pressure parameters in a 3-tupled space gave the worst wind speed prediction for k = 1 with the Minkowski distance metric.
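The Minkowski family of metrics mentioned above is easy to state; below is a small sketch with an illustrative k-NN regressor over 4-tuples (the feature layout and all values are invented for the example, not taken from the study).

```python
def minkowski(a, b, p):
    """Minkowski distance: p = 1 gives Manhattan, p = 2 Euclidean."""
    return sum(abs(x - y) ** p for x, y in zip(a, b)) ** (1 / p)

def knn_predict(train, query, k, p):
    """Predict the target as the mean over the k training tuples nearest
    to the query under the chosen Minkowski metric."""
    nearest = sorted(train, key=lambda t: minkowski(t[0], query, p))[:k]
    return sum(value for _, value in nearest) / k
```

Sweeping k and p in such a loop is the kind of grid the paper's analysis of neighbors and metrics amounts to.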
Different Apple Varieties Classification Using kNN and MLP Algorithms
Sabancı, Kadir
2016-01-01
In this study, three different apple varieties grown in Karaman province are classified using kNN and MLP algorithms. 90 apples in total, 30 Golden Delicious, 30 Granny Smith and 30 Starking Delicious, have been used in the study. A DFK 23U445 USB 3.0 industrial camera (with a Fujinon C-mount lens) has been used to capture apple images. 4 size properties (diameter, area, perimeter and fullness) and 3 color properties (red, green, blue) have been determined using image processing techniques through analyzin...
Directory of Open Access Journals (Sweden)
Mutiara Ayu Banjarsari
2016-04-01
Full Text Available The accumulated data in the academic information system database of the Computer Science Program, Faculty of Mathematics and Natural Sciences, Lambung Mangkurat University, is not fully utilized, although it can provide new information that was not known before. Data mining techniques can be used to predict the timely graduation of students. The k-Nearest Neighbor (kNN), a method that classifies objects based on the training data closest to the object, was used in this study. The selection of the value of k in the kNN algorithm is important because it affects the performance of the kNN algorithm; it is therefore necessary to determine the value of k and its level of accuracy. The k-Fold Cross Validation method and an accuracy test were used to determine the k-Optimal value. The result showed that k = 5, with an accuracy of 80.00%, was defined as k-Optimal, and this value was then applied in the kNN algorithm for the prediction of timely graduation of students based on the Grade Point Average up to the 4th semester. Keywords: kNN, k-Optimal, Classification, Data mining, k-Fold Cross Validation method
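Choosing k by k-fold cross-validation, as described, can be sketched like this (a generic procedure; the fold layout and candidate values are illustrative, not the study's configuration):

```python
import math
from collections import Counter

def cv_choose_k(data, candidate_ks, folds=5):
    """Pick k for kNN by k-fold cross-validated accuracy."""
    def predict(train, x, k):
        near = sorted(train, key=lambda t: math.dist(t[0], x))[:k]
        return Counter(lbl for _, lbl in near).most_common(1)[0][0]
    def accuracy(k):
        correct = 0
        for f in range(folds):
            test = data[f::folds]                                  # hold-out fold
            train = [d for i, d in enumerate(data) if i % folds != f]
            correct += sum(predict(train, x, y_k := k) == y for x, y in test)
        return correct / len(data)
    return max(candidate_ks, key=accuracy)
```

The returned value plays the role of the k-Optimal in the abstract; its cross-validated accuracy is the figure reported alongside it.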
Hanson, Curt; Schaefer, Jacob; Burken, John J.; Larson, David; Johnson, Marcus
2014-01-01
Flight research has shown the effectiveness of adaptive flight controls for improving aircraft safety and performance in the presence of uncertainties. The National Aeronautics and Space Administration's (NASA) Integrated Resilient Aircraft Control (IRAC) project designed and conducted a series of flight experiments to study the impact of variations in adaptive controller design complexity on performance and handling qualities. A novel complexity metric was devised to compare the degrees of simplicity achieved in three variations of a model reference adaptive controller (MRAC) for NASA's F-18 (McDonnell Douglas, now The Boeing Company, Chicago, Illinois) Full-Scale Advanced Systems Testbed (Gen-2A) aircraft. The complexity measures of these controllers are also compared to that of an earlier MRAC design for NASA's Intelligent Flight Control System (IFCS) project, flown on a highly modified F-15 aircraft (McDonnell Douglas, now The Boeing Company, Chicago, Illinois). Pilot comments during the IRAC research flights pointed to the importance of workload on handling-qualities ratings for failure and damage scenarios. Modifications to existing pilot aggressiveness and duty-cycle metrics are presented and applied to the IRAC controllers. Finally, while adaptive controllers may alleviate the effects of failures or damage on an aircraft's handling qualities, they also have the potential to introduce annoying changes to the flight dynamics or to the operation of aircraft systems. A nuisance rating scale is presented for the categorization of nuisance side-effects of adaptive controllers.
Optical and Piezoelectric Study of KNN Solid Solutions Co-Doped with La-Mn and Eu-Fe
Directory of Open Access Journals (Sweden)
Jesús-Alejandro Peña-Jiménez
2016-09-01
Full Text Available The solid-state method was used to synthesize single-phase potassium-sodium niobate (KNN) co-doped with the La3+–Mn4+ and Eu3+–Fe3+ ion pairs. Structural determination of all studied solid solutions was accomplished by XRD and the Rietveld refinement method. Electron paramagnetic resonance (EPR) studies were performed to determine the oxidation state of the paramagnetic centers. Optical spectroscopy measurements (excitation, emission and decay lifetime) were carried out for each solid solution. The present study reveals that doping KNN with La3+–Mn4+ and Eu3+–Fe3+ at concentrations of 0.5 mol % and 1 mol %, respectively, improves the ferroelectric and piezoelectric behavior and induces optical properties in the material for potential applications.
Adaptive metric learning with deep neural networks for video-based facial expression recognition
Liu, Xiaofeng; Ge, Yubin; Yang, Chao; Jia, Ping
2018-01-01
Video-based facial expression recognition has become increasingly important for many applications in the real world. Although numerous efforts have been made for the single sequence, how to balance the complex distribution of intra- and inter-class variations between sequences has remained a great difficulty in this area. We propose the adaptive (N+M)-tuplet clusters loss function and optimize it together with the softmax loss in the training phase. The variations introduced by personal attributes are alleviated using similarity measurements of multiple samples in the feature space, with many fewer comparisons than conventional deep metric learning approaches, which enables metric calculations for large data applications (e.g., videos). Both the spatial and temporal relations are well explored by a unified framework that consists of an Inception-ResNet network with long short-term memory and a two-branch fully connected layer structure. Our proposed method has been evaluated on three well-known databases, and the experimental results show that it outperforms many state-of-the-art approaches.
A Regression-based K nearest neighbor algorithm for gene function prediction from heterogeneous data
Directory of Open Access Journals (Sweden)
Ruzzo Walter L
2006-03-01
Full Text Available Abstract Background As a variety of functional genomic and proteomic techniques become available, there is an increasing need for functional analysis methodologies that integrate heterogeneous data sources. Methods In this paper, we address this issue by proposing a general framework for gene function prediction based on the k-nearest-neighbor (KNN) algorithm. The choice of KNN is motivated by its simplicity, flexibility to incorporate different data types and adaptability to irregular feature spaces. A weakness of traditional KNN methods, especially when handling heterogeneous data, is that performance is subject to the often ad hoc choice of similarity metric. To address this weakness, we apply regression methods to infer a similarity metric as a weighted combination of a set of base similarity measures, which helps to locate the neighbors that are most likely to be in the same class as the target gene. We also suggest a novel voting scheme to generate confidence scores that estimate the accuracy of predictions. The method gracefully extends to multi-way classification problems. Results We apply this technique to gene function prediction according to three well-known Escherichia coli classification schemes suggested by biologists, using information derived from microarray and genome sequencing data. We demonstrate that our algorithm dramatically outperforms the naive KNN methods and is competitive with support vector machine (SVM) algorithms for integrating heterogeneous data. We also show that by combining different data sources, prediction accuracy can improve significantly. Conclusion Our extensions of KNN with automatic feature weighting, multi-class prediction, and probabilistic inference enhance prediction accuracy significantly while remaining efficient, intuitive and flexible. This general framework can also be applied to similar classification problems involving heterogeneous datasets.
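The weighted-combination idea can be sketched as follows, assuming the regression step has already produced the weights; the function names and majority-vote rule are illustrative, not the paper's confidence-scoring scheme.

```python
from collections import Counter

def combined_similarity(sims, weights):
    """Overall similarity as a weighted sum of base similarity measures;
    in the paper, the weights come from a regression step."""
    return sum(w * s for w, s in zip(weights, sims))

def knn_vote(base_sims, labels, weights, k):
    """base_sims[i] holds the base similarities between the target gene and
    training gene i; vote over the k most similar training genes."""
    order = sorted(range(len(labels)),
                   key=lambda i: combined_similarity(base_sims[i], weights),
                   reverse=True)
    return Counter(labels[i] for i in order[:k]).most_common(1)[0][0]
```

Each base similarity can come from a different data source (microarray correlation, sequence similarity, and so on), which is how the framework integrates heterogeneous data without a single hand-picked metric.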
STUDY COMPARISON OF SVM-, K-NN- AND BACKPROPAGATION-BASED CLASSIFIER FOR IMAGE RETRIEVAL
Directory of Open Access Journals (Sweden)
Muhammad Athoillah
2015-03-01
Full Text Available Classification is a method for compiling data systematically according to rules that have been set previously. In recent years classification methods have been proven to help in many tasks, such as image classification, medical biology, traffic-light control, text classification, etc. There are many methods to solve classification problems, and this variety makes it difficult for researchers to determine which method is best for a given problem. This framework is aimed at comparing the ability of classification methods, namely Support Vector Machine (SVM), K-Nearest Neighbor (K-NN) and Backpropagation, in a study case of image retrieval with five categories of image datasets. The results show that K-NN has the best average accuracy, with 82%. It is also the fastest in average computation time, with 17.99 seconds during the retrieval session across all category classes. Backpropagation, however, is the slowest among the three: on average it needed 883 seconds for the training session and 41.7 seconds for the retrieval session.
Sustainability Metrics: The San Luis Basin Project
Sustainability is about promoting humanly desirable dynamic regimes of the environment. Metrics: ecological footprint, net regional product, exergy, emergy, and Fisher Information. Adaptive management: (1) metrics assess problem, (2) specific problem identified, and (3) managemen...
Abazari, M.; Safari, A.
2009-05-01
We report the effects of Ba, Ti, and Mn dopants on ferroelectric polarization and leakage current of (K0.44Na0.52Li0.04)(Nb0.84Ta0.1Sb0.06)O3 (KNN-LT-LS) thin films deposited by pulsed laser deposition. It is shown that donor dopants such as Ba2+, which increased the resistivity in bulk KNN-LT-LS, had an opposite effect in the thin film. Ti4+ as an acceptor B-site dopant reduces the leakage current by an order of magnitude, while the polarization values showed a slight degradation. Mn4+, however, was found to effectively suppress the leakage current by over two orders of magnitude while enhancing the polarization, with 15 and 23 μC/cm² remanent and saturated polarization, whose values are ∼70% and 82% of the reported values for the bulk composition. This phenomenon has been associated with the dual effect of Mn4+ in KNN-LT-LS thin films, by substituting both A- and B-site cations. A detailed description of how each dopant affects the concentrations of vacancies in the lattice is presented. Mn-doped KNN-LT-LS thin films are shown to be a promising candidate for lead-free thin films and applications.
Fast Most Similar Neighbor (MSN) classifiers for Mixed Data
Hernández Rodríguez, Selene
2010-01-01
The k-nearest-neighbor (k-NN) classifier has been extensively used in Pattern Recognition because of its simplicity and its good performance. However, in large-dataset applications, the exhaustive k-NN classifier becomes impractical. Therefore, many fast k-NN classifiers have been developed; most of them rely on metric properties (usually the triangle inequality) to reduce the number of prototype comparisons. Hence, the existing fast k-NN classifiers are applicable only when the comparison f...
Metrics, Media and Advertisers: Discussing Relationship
Directory of Open Access Journals (Sweden)
Marco Aurelio de Souza Rodrigues
2014-11-01
This study investigates how Brazilian advertisers are adapting to new media and its attention metrics. In-depth interviews were conducted with advertisers in 2009 and 2011. In 2009, new media and its metrics were celebrated as innovations that would increase advertising campaigns' overall efficiency. By 2011, this perception had changed: new media's profusion of metrics, once seen as an advantage, had started to compromise its ease of use and adoption. Among its findings, this study argues that there is an opportunity for media groups willing to shift from a product-focused strategy towards a customer-centric one, through the creation of new, simple and integrative metrics.
Evaluation of normalization methods for cDNA microarray data by k-NN classification
Energy Technology Data Exchange (ETDEWEB)
Wu, Wei; Xing, Eric P; Myers, Connie; Mian, Saira; Bissell, Mina J
2004-12-17
Non-biological factors give rise to unwanted variations in cDNA microarray data. There are many normalization methods designed to remove such variations. However, to date there have been few published systematic evaluations of these techniques for removing variations arising from dye biases in the context of downstream, higher-order analytical tasks such as classification. Ten location normalization methods that adjust spatial- and/or intensity-dependent dye biases, and three scale methods that adjust scale differences, were applied, individually and in combination, to five distinct, published, cancer biology-related cDNA microarray data sets. Leave-one-out cross-validation (LOOCV) classification error was employed as the quantitative end-point for assessing the effectiveness of a normalization method. In particular, a known classifier, k-nearest neighbor (k-NN), was estimated from data normalized using a given technique, and the LOOCV error rate of the ensuing model was computed. We found that k-NN classifiers are sensitive to dye biases in the data. Using NONRM and GMEDIAN as baseline methods, our results show that single-bias-removal techniques, which remove either spatial-dependent dye bias (referred to below as spatial effect) or intensity-dependent dye bias (referred to below as intensity effect), moderately reduce LOOCV classification errors, whereas double-bias-removal techniques, which remove both spatial and intensity effects, reduce LOOCV classification errors even further. Of the 41 different strategies examined, three two-step processes, IGLOESS-SLFILTERW7, ISTSPLINE-SLLOESS and IGLOESS-SLLOESS, all of which removed the intensity effect globally and the spatial effect locally, appear to reduce LOOCV classification errors most consistently and effectively across all data sets. We also found that the investigated scale normalization methods do not reduce LOOCV classification error. Using LOOCV error of k-NNs as the evaluation criterion, three double...
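The LOOCV end-point described above can be sketched with scikit-learn; the synthetic data here merely stands in for a normalized microarray data set (the paper's normalization methods and cancer data sets are not reproduced):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for a normalized cDNA microarray data set
# (rows: samples, columns: genes).
X, y = make_classification(n_samples=60, n_features=50, n_informative=10,
                           random_state=0)

# LOOCV error rate of a k-NN classifier is the paper's end-point for
# judging a normalization method: lower error => better normalization.
knn = KNeighborsClassifier(n_neighbors=3)
acc = cross_val_score(knn, X, y, cv=LeaveOneOut()).mean()
loocv_error = 1.0 - acc
print(f"LOOCV error rate: {loocv_error:.3f}")
```

Running this once per candidate normalization method and comparing the error rates reproduces the evaluation protocol in miniature.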
Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian
2014-06-27
Pulmonary acoustic parameters extracted from recorded respiratory sounds provide valuable information for the detection of respiratory pathologies. The automated analysis of pulmonary acoustic signals can serve as a differential diagnosis tool for medical professionals, a learning tool for medical students, and a self-management tool for patients. In this context, we evaluate and compare the performance of the support vector machine (SVM) and K-nearest neighbour (K-nn) classifiers in diagnosing respiratory pathologies using respiratory sounds from the R.A.L.E. lung sound database. The pulmonary acoustic signals were manually categorised into three different groups, namely normal, airway obstruction pathology, and parenchymal pathology. Mel-frequency cepstral coefficient (MFCC) features were extracted from the pre-processed pulmonary acoustic signals, analysed by one-way ANOVA, and then fed separately into the SVM and K-nn classifiers. The performance of the classifiers was analysed using the confusion matrix technique. The statistical analysis of the MFCC features using one-way ANOVA showed that the extracted MFCC features are significantly different (p < 0.001). The classification accuracies of the SVM and K-nn classifiers were found to be 92.19% and 98.26%, respectively. Although the data used to train and test the classifiers are limited, the classification accuracies found are satisfactory. The K-nn classifier was better than the SVM classifier at discriminating pulmonary acoustic signals from pathological and normal subjects obtained from the R.A.L.E. database.
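A minimal sketch of the confusion-matrix comparison, assuming scikit-learn and random features standing in for the MFCC vectors (the real features come from the R.A.L.E. recordings):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import confusion_matrix, accuracy_score

# Synthetic stand-in for 13-dimensional MFCC feature vectors with three
# classes (normal, airway obstruction, parenchymal pathology).
X, y = make_classification(n_samples=300, n_features=13, n_informative=8,
                           n_classes=3, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

for name, clf in [("SVM", SVC()), ("K-nn", KNeighborsClassifier(n_neighbors=5))]:
    y_pred = clf.fit(X_tr, y_tr).predict(X_te)
    cm = confusion_matrix(y_te, y_pred)  # rows: true class, cols: predicted
    print(name, f"{accuracy_score(y_te, y_pred):.3f}", cm.tolist())
```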
Directory of Open Access Journals (Sweden)
Fei Wang
2017-12-01
Accurate solar photovoltaic (PV) power forecasting is an essential tool for mitigating the negative effects caused by the uncertainty of PV output power in systems with high penetration levels of solar PV generation. Weather-classification-based modeling is an effective way to increase the accuracy of day-ahead short-term (DAST) solar PV power forecasting, because PV output power is strongly dependent on the specific weather conditions in a given time period. However, the accuracy of daily weather classification relies on both the applied classifiers and the training data. This paper aims to reveal how these two factors impact classification performance and to delineate the relation between classification accuracy and sample dataset scale. Two commonly used classification methods, K-nearest neighbors (KNN) and support vector machines (SVM), are applied to classify the daily local weather types for DAST solar PV power forecasting using the operation data from a grid-connected PV plant in Hohhot, Inner Mongolia, China. We assessed the performance of the SVM and KNN approaches, and then investigated the influences of sample scale, the number of categories, and the data distribution in different categories on the daily weather classification results. The simulation results illustrate that SVM performs well with a small sample scale, while KNN is more sensitive to the length of the training dataset and can achieve higher accuracy than SVM with sufficient samples.
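The influence of sample scale on the two classifiers can be illustrated with a toy learning-curve sketch (scikit-learn, synthetic features in place of the Hohhot weather data):

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for daily weather-feature vectors with labeled
# weather types; the real study used PV-plant operation data.
X, y = make_classification(n_samples=400, n_features=8, n_informative=5,
                           n_classes=3, random_state=0)

# Compare cross-validated accuracy as the training sample scale grows.
for n in (50, 100, 200, 400):
    svm_acc = cross_val_score(SVC(), X[:n], y[:n], cv=5).mean()
    knn_acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                              X[:n], y[:n], cv=5).mean()
    print(f"n={n:3d}  SVM={svm_acc:.3f}  KNN={knn_acc:.3f}")
```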
Jay M. Ver Hoef; Hailemariam Temesgen; Sergio Gómez
2013-01-01
Forest surveys provide critical information for many diverse interests. Data are often collected from samples, and from these samples, maps of resources and estimates of aerial totals or averages are required. In this paper, two approaches for mapping and estimating totals; the spatial linear model (SLM) and k-NN (k-Nearest Neighbor) are compared, theoretically,...
Sigma Routing Metric for RPL Protocol
Directory of Open Access Journals (Sweden)
Paul Sanmartin
2018-04-01
This paper presents the adaptation of a specific metric for the RPL protocol in the objective function MRHOF. Among the functions standardized by the IETF, we find OF0, which is based on the minimum hop count, as well as MRHOF, which is based on the Expected Transmission Count (ETX). However, when the network becomes denser or the number of nodes increases, both OF0 and MRHOF introduce long hops, which can generate a bottleneck that restricts the network. An adaptation is proposed to optimize both OFs through a new routing metric. To solve the above problem, the metrics of the minimum number of hops and the ETX are combined by designing a new routing metric called SIGMA-ETX, in which the best route is calculated using the standard deviation of the ETX values between each node, as opposed to working with the ETX average along the route. This method ensures better routing performance in dense sensor networks. The simulations are done in the Cooja simulator, based on the Contiki operating system. The simulations showed that the proposed optimization outperforms both OF0 and MRHOF by a wide margin in terms of network latency, packet delivery ratio, lifetime, and power consumption.
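The core idea of SIGMA-ETX, scoring a route by the spread of its per-hop ETX values rather than their mean, can be sketched as follows; the function name and example values are illustrative, not from the paper:

```python
import statistics

def sigma_etx(etx_values):
    """Illustrative SIGMA-ETX route cost: the standard deviation of the
    per-hop ETX values along a candidate route, instead of their mean.
    A route of uniformly modest links beats one mixing excellent and
    terrible links, which discourages long bottleneck hops."""
    return statistics.pstdev(etx_values)

# Two candidate routes with the same mean ETX (2.0):
uniform_route = [2.0, 2.0, 2.0, 2.0]  # steady links
spiky_route = [1.0, 1.0, 1.0, 5.0]    # one long, lossy hop
assert sigma_etx(uniform_route) < sigma_etx(spiky_route)
print(sigma_etx(uniform_route), sigma_etx(spiky_route))
```

An average-based metric cannot distinguish these two routes; the deviation-based one prefers the steady route.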
Metric learning for DNA microarray data analysis
International Nuclear Information System (INIS)
Takeuchi, Ichiro; Nakagawa, Masao; Seto, Masao
2009-01-01
In many microarray studies, gene set selection is an important preliminary step for subsequent main tasks such as tumor classification, cancer subtype identification, etc. In this paper, we investigate the possibility of using metric learning as an alternative to gene set selection. We develop a simple metric learning algorithm aiming to use it for microarray data analysis. Exploiting a property of the algorithm, we introduce a novel approach for extending the metric learning to be adaptive. We apply the algorithm to previously studied microarray data on malignant lymphoma subtype identification.
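The paper's specific algorithm is not given here, but the general shape of a learned (Mahalanobis-style) metric it would produce can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))  # stand-in for gene expression profiles

def metric_dist(a, b, M):
    """Distance under a learned metric matrix M (symmetric positive
    definite): d(a, b) = sqrt((a - b)^T M (a - b)). With M = I this is
    Euclidean; metric learning adapts M so that samples of the same
    subtype come closer, replacing explicit gene set selection."""
    d = a - b
    return float(np.sqrt(d @ M @ d))

# Example: a learned diagonal M that upweights one informative gene.
M_identity = np.eye(5)
M_learned = np.diag([10.0, 1.0, 1.0, 1.0, 1.0])
print(metric_dist(X[0], X[1], M_identity), metric_dist(X[0], X[1], M_learned))
```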
Balouchestani, Mohammadreza; Krishnan, Sridhar
2014-01-01
Long-term recording of electrocardiogram (ECG) signals plays an important role in health care systems for the diagnosis and treatment of heart diseases. Clustering and classification of the collected data are essential for detecting concealed information of P-QRS-T waves in long-term ECG recordings. Currently used algorithms have their share of drawbacks: 1) clustering and classification cannot be done in real time; 2) they suffer from huge energy consumption and sampling load. These drawbacks motivated us to develop a novel optimized clustering algorithm that can easily scan large ECG datasets to establish low-power long-term ECG recording. In this paper, we present an advanced K-means clustering algorithm based on Compressed Sensing (CS) theory as a random sampling procedure. Then, two dimensionality reduction methods, Principal Component Analysis (PCA) and Linear Correlation Coefficient (LCC), followed by sorting the data using the K-Nearest Neighbours (K-NN) and Probabilistic Neural Network (PNN) classifiers, are applied to the proposed algorithm. We show that our algorithm based on PCA features in combination with the K-NN classifier performs better than the other methods, outperforming existing algorithms by increasing classification accuracy by 11%. In addition, the proposed algorithm achieves classification accuracies for the K-NN and PNN classifiers, and a Receiver Operating Characteristic (ROC) area, of 99.98%, 99.83%, and 99.75%, respectively.
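The best-performing combination reported above, PCA features fed to a K-NN classifier, can be sketched as a scikit-learn pipeline (synthetic data in place of the compressed ECG recordings):

```python
from sklearn.datasets import make_classification
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for compressed ECG feature vectors.
X, y = make_classification(n_samples=500, n_features=40, n_informative=12,
                           random_state=0)

# Dimensionality reduction by PCA followed by K-NN classification,
# mirroring the best-performing combination reported above.
model = make_pipeline(PCA(n_components=10), KNeighborsClassifier(n_neighbors=5))
acc = cross_val_score(model, X, y, cv=5).mean()
print(f"PCA + K-NN cross-validated accuracy: {acc:.3f}")
```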
Metrics for Polyphonic Sound Event Detection
Directory of Open Access Journals (Sweden)
Annamaria Mesaros
2016-05-01
This paper presents and discusses various metrics proposed for evaluation of polyphonic sound event detection systems used in realistic situations where there are typically multiple sound sources active simultaneously. The system output in this case contains overlapping events, marked as multiple sounds detected as being active at the same time. The polyphonic system output requires a suitable procedure for evaluation against a reference. Metrics from neighboring fields such as speech recognition and speaker diarization can be used, but they need to be partially redefined to deal with the overlapping events. We present a review of the most common metrics in the field and the way they are adapted and interpreted in the polyphonic case. We discuss segment-based and event-based definitions of each metric and explain the consequences of instance-based and class-based averaging using a case study. In parallel, we provide a toolbox containing implementations of presented metrics.
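A minimal sketch of one such metric, a segment-based F-score computed from binary activity matrices, under the assumption of fixed-length segments (see the paper and its toolbox for the full definitions):

```python
import numpy as np

def segment_based_f1(reference, estimated):
    """Segment-based F-score for polyphonic sound event detection.
    Inputs are binary activity matrices of shape (n_segments, n_classes):
    entry [t, c] = 1 if event class c is active in segment t. Overlapping
    events are handled naturally, since several classes may be active in
    the same segment (instance-based averaging over all entries)."""
    ref = np.asarray(reference, dtype=bool)
    est = np.asarray(estimated, dtype=bool)
    tp = np.logical_and(ref, est).sum()
    fp = np.logical_and(~ref, est).sum()
    fn = np.logical_and(ref, ~est).sum()
    return 2 * tp / (2 * tp + fp + fn) if tp + fp + fn else 1.0

# Two segments, three event classes; one missed event, one false alarm.
ref = [[1, 1, 0], [0, 1, 0]]
est = [[1, 0, 0], [0, 1, 1]]
print(segment_based_f1(ref, est))  # 2*TP/(2*TP+FP+FN) = 4/6
```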
Group covariance and metrical theory
International Nuclear Information System (INIS)
Halpern, L.
1983-01-01
The a priori introduction of a Lie group of transformations into a physical theory has often proved to be useful; it usually serves to describe special simplified conditions before a general theory can be worked out. Newton's assumptions of absolute space and time are examples where the Euclidean group and translation group have been introduced. These groups were extended to the Galilei group and modified in the special theory of relativity to the Poincare group to describe physics under the given conditions covariantly in the simplest way. The criticism of the a priori character leads to the formulation of the general theory of relativity. The general metric theory does not really give preference to a particular invariance group - even the principle of equivalence can be adapted to a whole family of groups. The physical laws covariantly inserted into the metric space are however adapted to the Poincare group. 8 references
A New Method to Improve the Electrical Properties of KNN-based Ceramics: Tailoring Phase Fraction
Lv, Xiang; Wu, Jiagang; Zhu, Jianguo; Xiao, Dingquan; Zhang, Xixiang
2017-01-01
Although both the phase type and the fraction of a multi-phase coexistence can affect the electrical properties of (K,Na)NbO3 (KNN)-based ceramics, the effects of phase fraction on their electrical properties have received little attention. In this work, by changing the calcination temperature of the CaZrO3 powders, we successfully developed 0.96K0.5Na0.5Nb0.96Sb0.04O3-0.01CaZrO3-0.03Bi0.5Na0.5HfO3 ceramics containing a wide rhombohedral-tetragonal (R-T) phase coexistence with variations of the T (or R) phase fraction. It was found that a higher T phase fraction warrants a larger piezoelectric constant (d33), and d33 also showed a linear variation with respect to the tetragonality ratio (c/a). More importantly, a number of domain patterns were observed due to the high T phase fraction and large c/a ratio, greatly benefiting the piezoelectricity. In addition, improved ferroelectric fatigue behavior and thermal stability were also shown in the ceramics containing a high T phase fraction. Therefore, this work brings a new viewpoint on the physical mechanism behind the R-T phase coexistence in KNN-based ceramics.
Adaptive Fault Detection for Complex Dynamic Processes Based on JIT Updated Data Set
Directory of Open Access Journals (Sweden)
Jinna Li
2012-01-01
A novel fault detection technique is proposed to explicitly account for the nonlinear, dynamic, and multimodal problems that exist in practical and complex dynamic processes. A just-in-time (JIT) detection method and a k-nearest neighbor (KNN) rule-based statistical process control (SPC) approach are integrated to construct a flexible and adaptive detection scheme for control processes with nonlinear, dynamic, and multimodal cases. Mahalanobis distance, representing the correlation among samples, is used to simplify and update the raw data set, which is the first merit of this paper. Based on it, the control limit is computed in terms of both the KNN rule and the SPC method, such that we can identify online whether the current data are normal or not. Note that the obtained control limit changes as the database is updated, yielding an adaptive fault detection technique that can effectively eliminate the impact of data drift and shift on the performance of the detection process, which is the second merit of this paper. The efficiency of the developed method is demonstrated by numerical examples and an industrial case.
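A rough sketch of the KNN-rule control limit (the JIT database update via Mahalanobis distance is omitted; the quantile-based limit is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
train = rng.normal(size=(200, 4))  # normal-operation reference data set

def knn_distance(x, data, k=5):
    """Average distance from query x to its k nearest neighbors in data;
    large values indicate the sample falls outside normal operation."""
    d = np.sort(np.linalg.norm(data - x, axis=1))
    return d[:k].mean()

# Control limit: an empirical quantile of the leave-one-out k-NN
# distances of the training samples themselves (a common KNN-rule SPC
# choice; the paper additionally updates the data set just-in-time).
scores = np.array([knn_distance(x, np.delete(train, i, axis=0))
                   for i, x in enumerate(train)])
limit = np.quantile(scores, 0.99)

normal_sample = rng.normal(size=4)
faulty_sample = rng.normal(size=4) + 8.0  # shifted operating point
print(limit, knn_distance(normal_sample, train),
      knn_distance(faulty_sample, train))
```

A query whose k-NN distance exceeds the limit is flagged as a fault; refreshing `train` online makes the limit adapt to slow process drift.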
Optimization of internet content filtering-Combined with KNN and OCAT algorithms
Guo, Tianze; Wu, Lingjing; Liu, Jiaming
2018-04-01
Faced with rampant illegal content on the Internet, traditional ways of filtering information, keyword recognition and manual screening, are performing increasingly poorly. Based on this, this paper uses the OCAT algorithm nested with the KNN classification algorithm to construct a corpus training library that can dynamically learn and update, improving the filter corpus for the constantly updated illegal content of the network, including text and pictures, and thus better filtering and tracing illegal content and its sources. Future research will focus on simplifying the updating of the recognition and comparison algorithms and optimizing the corpus learning ability, in order to improve the efficiency of filtering and save time and resources.
Busarakam, Kanungnid; Bull, Alan T; Trujillo, Martha E; Riesco, Raul; Sangal, Vartul; van Wezel, Gilles P; Goodfellow, Michael
2016-06-01
A polyphasic study was designed to determine the taxonomic provenance of three Modestobacter strains isolated from an extreme hyper-arid Atacama Desert soil. The strains, isolates KNN 45-1a, KNN 45-2b(T) and KNN 45-3b, were shown to have chemotaxonomic and morphological properties in line with their classification in the genus Modestobacter. The isolates had identical 16S rRNA gene sequences and formed a branch in the Modestobacter gene tree that was most closely related to the type strain of Modestobacter marinus (99.6% similarity). All three isolates were distinguished readily from Modestobacter type strains by a broad range of phenotypic properties, by qualitative and quantitative differences in fatty acid profiles and by BOX fingerprint patterns. The whole genome sequence of isolate KNN 45-2b(T) showed 89.3% average nucleotide identity, 90.1% (SD: 10.97%) average amino acid identity and a digital DNA-DNA hybridization value of 42.4±3.1 against the genome sequence of M. marinus DSM 45201(T), values consistent with its assignment to a separate species. On the basis of all of these data, it is proposed that the isolates be assigned to the genus Modestobacter as Modestobacter caceresii sp. nov. with isolate KNN 45-2b(T) (CECT 9023(T)=DSM 101691(T)) as the type strain. Analysis of the whole-genome sequence of M. caceresii KNN 45-2b(T), with 4683 open reading frames and a genome size of ~4.96 Mb, revealed the presence of genes and gene-clusters that encode for properties relevant to its adaptability to harsh environmental conditions prevalent in extreme hyper arid Atacama Desert soils. Copyright © 2016. Published by Elsevier GmbH.
Juniati, D.; Khotimah, C.; Wardani, D. E. K.; Budayasa, K.
2018-01-01
Heart abnormalities can be detected from heart sound. A heart sound can be heard directly with a stethoscope or indirectly by a phonocardiograph, a machine for recording heart sound. This paper presents the implementation of fractal dimension theory to classify phonocardiograms into a normal heart sound, a murmur, or an extrasystole. The main algorithm used to calculate the fractal dimension was Higuchi's algorithm. There were two steps to classify the phonocardiograms: feature extraction and classification. For feature extraction, we used the Discrete Wavelet Transform to decompose the heart sound signal into several sub-bands depending on the selected level. After the decomposition process, the signal was processed using the Fast Fourier Transform (FFT) to determine the spectral frequency. The fractal dimension of the FFT output was calculated using Higuchi's algorithm. The classification of the fractal dimensions of all phonocardiograms was done with K-NN and fuzzy c-means clustering methods. Based on the research results, the best accuracy obtained was 86.17%, using feature extraction by DWT at decomposition level 3 with kmax = 50, 5-fold cross-validation, and 5 neighbors in the K-NN algorithm. Meanwhile, for fuzzy c-means clustering, the accuracy was 78.56%.
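Higuchi's algorithm itself is compact enough to sketch; this is a generic implementation of the fractal dimension estimate, not the paper's code:

```python
import numpy as np

def higuchi_fd(signal, kmax=50):
    """Higuchi's algorithm: estimate the fractal dimension of a 1-D
    signal. For each delay k, build k subsampled curves, average their
    normalized lengths L(k), and fit the slope of log L(k) vs log(1/k)."""
    x = np.asarray(signal, dtype=float)
    n = len(x)
    lk = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            # curve length, normalized for the number of samples used
            norm = (n - 1) / (k * (len(idx) - 1))
            lengths.append(np.abs(np.diff(x[idx])).sum() * norm / k)
        lk.append(np.mean(lengths))
    k_vals = np.arange(1, kmax + 1)
    slope, _ = np.polyfit(np.log(1.0 / k_vals), np.log(lk), 1)
    return slope

# A smooth sine stays near dimension 1; white noise approaches 2.
rng = np.random.default_rng(0)
fd_sine = higuchi_fd(np.sin(np.linspace(0, 8 * np.pi, 2000)))
fd_noise = higuchi_fd(rng.normal(size=2000))
print(fd_sine, fd_noise)
```

In the study this estimate is applied to the FFT of each DWT sub-band, and the resulting dimensions form the feature vector for K-NN.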
Directory of Open Access Journals (Sweden)
Yousef Malik
2016-12-01
The performance of many learning and data mining algorithms depends critically on suitable metrics to assess efficiency over the input space. Learning a suitable metric from examples may, therefore, be the key to successful application of these algorithms. We have demonstrated that k-nearest neighbor (kNN) classification can be significantly improved by learning a distance metric from labeled examples. A clustering ensemble is used to define the distance between points with respect to how often they co-cluster. This distance is then used within the framework of the kNN algorithm to define a classifier named the ensemble clustering kNN classifier (EC-kNN). In many instances in our experiments we achieved the highest accuracy while SVM failed to perform as well. In this study, we compare the performance of a two-class classifier using EC-kNN with different one-class and two-class classifiers. The comparison was applied to seven different plant microRNA species considering eight feature selection methods. The averaged results show that EC-kNN outperforms all other methods employed here as well as previously published results for the same data. In conclusion, this study shows that the chosen classifier performs well when the distance metric is carefully chosen.
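The EC-kNN idea, turning a clustering ensemble's co-cluster frequencies into a distance and handing it to kNN, can be sketched as follows; the ensemble composition here is an arbitrary choice, and resubstitution is used only to keep the example short:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.neighbors import KNeighborsClassifier

# Toy two-class data standing in for microRNA feature vectors.
X, y = make_blobs(n_samples=120, centers=2, n_features=6, random_state=0)

# Clustering ensemble: run k-means several times with different k and
# seeds, and count how often each pair of points co-clusters.
n = len(X)
co = np.zeros((n, n))
runs = [KMeans(n_clusters=k, n_init=5, random_state=s).fit_predict(X)
        for k, s in [(2, 0), (3, 1), (4, 2), (5, 3)]]
for labels in runs:
    co += (labels[:, None] == labels[None, :])
dist = 1.0 - co / len(runs)  # rarely co-clustered => far apart

# kNN over the ensemble-induced distance matrix.
knn = KNeighborsClassifier(n_neighbors=5, metric="precomputed")
knn.fit(dist, y)
pred = knn.predict(dist)  # resubstitution, for illustration only
print((pred == y).mean())
```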
Metrics for measuring net-centric data strategy implementation
Kroculick, Joseph B.
2010-04-01
An enterprise data strategy outlines an organization's vision and objectives for improved collection and use of data. We propose generic metrics and quantifiable measures for each of the DoD Net-Centric Data Strategy (NCDS) data goals. Data strategy metrics can be adapted to the business processes of an enterprise and the needs of stakeholders in leveraging the organization's data assets to provide for more effective decision making. Generic metrics are applied to a specific application where logistics supply and transportation data is integrated across multiple functional groups. A dashboard presents a multidimensional view of the current progress toward a state where logistics data is shared in a timely and seamless manner among users, applications, and systems.
Epileptic MEG Spike Detection Using Statistical Features and Genetic Programming with KNN
Directory of Open Access Journals (Sweden)
Turky N. Alotaiby
2017-01-01
Epilepsy is a neurological disorder that affects millions of people worldwide. Monitoring brain activity and identifying the seizure source, which starts with spike detection, are important steps in epilepsy treatment. Magnetoencephalography (MEG) is an emerging epileptic diagnostic tool with high-density sensors; this makes manual analysis a challenging task due to the vast amount of MEG data. This paper explores the use of eight statistical features and genetic programming (GP) with the K-nearest neighbor (KNN) classifier for interictal spike detection. The proposed method is comprised of three stages: preprocessing, genetic-programming-based feature generation, and classification. The effectiveness of the proposed approach has been evaluated using real MEG data obtained from 28 epileptic patients. It achieved a 91.75% average sensitivity and 92.99% average specificity.
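The eight statistical features are not enumerated in the abstract, so the set below is purely an assumed illustration of the feature-extraction stage that would feed the GP feature generator:

```python
import numpy as np
from scipy import stats

def statistical_features(window):
    """Eight illustrative statistical features of an MEG signal window.
    The paper uses eight statistical features as inputs; the exact set
    is not listed in the abstract, so this selection is an assumption."""
    w = np.asarray(window, dtype=float)
    return np.array([
        w.mean(), w.std(), stats.skew(w), stats.kurtosis(w),
        w.min(), w.max(), np.median(w), np.ptp(w),
    ])

rng = np.random.default_rng(0)
feats = statistical_features(rng.normal(size=256))
print(feats.shape)  # (8,)
```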
Multimedia content classification metrics for content adaptation
Fernandes, Rui; Andrade, M.T.
2015-01-01
Multimedia content consumption is very popular nowadays. However, not every content can be consumed in its original format: the combination of content, transport and access networks, consumption device and usage environment characteristics may all pose restrictions to that purpose. One way to provide the best possible quality to the user is to adapt the content according to these restrictions as well as user preferences. This adaptation stage can be best executed if knowledge about the conten...
Highly textured KNN-based piezoelectric ceramics by conventional sintering
International Nuclear Information System (INIS)
Zapata, Angelica Maria Mazuera; Silva Junior, Paulo Sergio da; Zambrano, Michel Venet
2016-01-01
Texturing in ferroelectric ceramics has played an important role in the enhancement of their piezoelectric properties. Common methods for ceramic texturing are hot pressing and templated grain growth; nevertheless, the facilities needed for hot pressing and the processing of single crystals make texturing ceramics expensive and very difficult. In this study, a novel method was investigated to obtain highly textured lead-free ceramics. A (K0.5Na0.5)0.97Li0.03Nb0.8Ta0.2 matrix (KNLNT) with CuO excess was sintered between 1070 and 1110 °C following a solid-state reaction procedure. The CuO excess promotes liquid-phase formation and a partial melting of the material. XRD patterns showed that the intensity of the (100) family of peaks became much stronger with increasing sintering temperature and CuO content. In addition, the Lotgering factor was calculated and exhibited a texture degree between 40% and 70% for sintered samples having 13 and 16 wt.% CuO, respectively. These highly textured ceramics, with an adequate cut, can be used as substitutes for single crystals in texturing KNN-based lead-free ceramics. (author)
METRICS FOR DYNAMIC SCALING OF DATABASE IN CLOUDS
Directory of Open Access Journals (Sweden)
Alexander V. Boichenko
2013-01-01
This article analyzes the main methods of scaling databases (replication, sharding) and their support in popular relational databases and NoSQL solutions with different data models: document-oriented, key-value, column-oriented, and graph. The article provides an assessment of the capabilities of modern cloud-based solutions and gives a model for organizing dynamic scaling in a cloud infrastructure. Different types of metrics are analyzed, including the basic metrics that characterize the operating parameters of the database technology, and the goals of the integral metrics necessary for implementing adaptive algorithms for dynamically scaling databases in a cloud infrastructure are set out. This article was prepared with the support of RFBR grant № 13-07-00749.
A locally adaptive normal distribution
DEFF Research Database (Denmark)
Arvanitidis, Georgios; Hansen, Lars Kai; Hauberg, Søren
2016-01-01
The multivariate normal density is a monotonic function of the distance to the mean, and its ellipsoidal shape is due to the underlying Euclidean metric. We suggest to replace this metric with a locally adaptive, smoothly changing (Riemannian) metric that favors regions of high local density; the resulting locally adaptive normal distribution (LAND) is the maximum entropy distribution under the given metric. The underlying metric is, however, non-parametric. We develop a maximum likelihood algorithm to infer the distribution parameters that relies on a combination of gradient descent and Monte Carlo integration. We further extend the LAND to mixture models...
Enhanced Data Representation by Kernel Metric Learning for Dementia Diagnosis
Directory of Open Access Journals (Sweden)
David Cárdenas-Peña
2017-07-01
Alzheimer's disease (AD) is the kind of dementia that affects the most people around the world. Therefore, an early identification supporting effective treatments is required to increase the life quality of a wide number of patients. Recently, computer-aided diagnosis tools for dementia using Magnetic Resonance Imaging scans have been successfully proposed to discriminate between patients with AD, mild cognitive impairment, and healthy controls. Most of the attention has been given to the clinical data, provided by initiatives such as the ADNI, supporting reliable research on intervention, prevention, and treatments of AD. Therefore, there is a need for improving the performance of classification machines. In this paper, we propose a kernel framework for learning metrics that enhances conventional machines and supports the diagnosis of dementia. Our framework aims at building discriminative spaces through the maximization of the centered kernel alignment function, aiming at improving the discrimination of the three considered neurological classes. The proposed metric learning performance is evaluated on the widely known ADNI database using three supervised classification machines (k-nn, SVM, and NNs) for multi-class and bi-class scenarios from structural MRIs. Specifically, 286 AD patients, 379 MCI patients, and 231 healthy controls from the ADNI collection are used for development and validation of our proposed metric learning framework. For the experimental validation, we split the data into two subsets: 30% of subjects used as a blindfolded assessment and 70% employed for parameter tuning. Then, in the preprocessing stage, a total of 310 morphological measurements are automatically extracted from each structural MRI scan by the FreeSurfer software package and concatenated to build an input feature matrix. Obtained test performance results show that including supervised metric learning improves the compared baseline classifiers in both scenarios. In the multi...
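The centered kernel alignment function being maximized can be written out directly; the toy data here stands in for the MRI-derived features:

```python
import numpy as np

def centered_kernel_alignment(K, L):
    """Centered kernel alignment between two kernel matrices:
    A(K, L) = <Kc, Lc>_F / (||Kc||_F ||Lc||_F), with Kc = H K H and
    centering matrix H = I - (1/n) 11^T. Metric learning maximizes the
    alignment between the data kernel and an ideal label kernel."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    Kc, Lc = H @ K @ H, H @ L @ H
    return np.sum(Kc * Lc) / (np.linalg.norm(Kc) * np.linalg.norm(Lc))

# Ideal label kernel: 1 when two subjects share a diagnosis, else 0.
y = np.array([0, 0, 1, 1, 2, 2])
L = (y[:, None] == y[None, :]).astype(float)

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4)) + y[:, None]  # features mildly class-correlated
K = X @ X.T                               # linear kernel on the features
print(centered_kernel_alignment(K, L))
```

A learned transform of the features that pushes this alignment toward 1 yields a space in which k-nn, SVM, or NN classifiers separate the diagnostic classes more easily.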
Statistical 2D and 3D shape analysis using Non-Euclidean Metrics
DEFF Research Database (Denmark)
Larsen, Rasmus; Hilger, Klaus Baggesen; Wrobel, Mark Christoph
2002-01-01
We address the problem of extracting meaningful, uncorrelated biological modes of variation from tangent space shape coordinates in 2D and 3D using non-Euclidean metrics. We adapt the maximum autocorrelation factor analysis and the minimum noise fraction transform to shape decomposition. Furthermore, we study metrics based on repeated annotations of a training set. We define a way of assessing the correlation between landmarks contrary to landmark coordinates. Finally, we apply the proposed methods to a 2D data set consisting of outlines of lungs and a 3D/(4D) data set consisting of sets...
Quantitative adaptation analytics for assessing dynamic systems of systems: LDRD Final Report
Energy Technology Data Exchange (ETDEWEB)
Gauthier, John H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). System Readiness & Sustainment Technologies (6133, M/S 1188); Miner, Nadine E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Military & Energy Systems Analysis (6114, M/S 1188); Wilson, Michael L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Resilience and Regulatory Effects (6921, M/S 1138); Le, Hai D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). System Readiness & Sustainment Technologies (6133, M/S 1188); Kao, Gio K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Networked System Survivability & Assurance (5629, M/S 0671); Melander, Darryl J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Software Systems R&D (9525, M/S 1188); Longsine, Dennis Earl [Sandia National Laboratories, Unknown, Unknown; Vander Meer, Jr., Robert C. [SAIC, Inc., Albuquerque, NM (United States)
2015-01-01
Our society is increasingly reliant on systems and interoperating collections of systems, known as systems of systems (SoS). These SoS are often subject to changing missions (e.g., nation-building, arms-control treaties), threats (e.g., asymmetric warfare, terrorism), natural environments (e.g., climate, weather, natural disasters) and budgets. How well can SoS adapt to these types of dynamic conditions? This report details the results of a three-year Laboratory Directed Research and Development (LDRD) project aimed at developing metrics and methodologies for quantifying the adaptability of systems and SoS. Work products include: derivation of a set of adaptability metrics, a method for combining the metrics into a system of systems adaptability index (SoSAI) used to compare adaptability of SoS designs, development of a prototype dynamic SoS (proto-dSoS) simulation environment which provides the ability to investigate the validity of the adaptability metric set, and two test cases that evaluate the usefulness of a subset of the adaptability metrics and SoSAI for distinguishing good from poor adaptability in a SoS. Intellectual property results include three patents pending: A Method For Quantifying Relative System Adaptability, Method for Evaluating System Performance, and A Method for Determining Systems Re-Tasking.
Localized Multi-Model Extremes Metrics for the Fourth National Climate Assessment
Thompson, T. R.; Kunkel, K.; Stevens, L. E.; Easterling, D. R.; Biard, J.; Sun, L.
2017-12-01
We have performed localized analysis of scenario-based datasets for the Fourth National Climate Assessment (NCA4). These datasets include CMIP5-based Localized Constructed Analogs (LOCA) downscaled simulations at daily temporal resolution and 1/16th-degree spatial resolution. Over 45 temperature and precipitation extremes metrics have been processed using LOCA data, including threshold, percentile, and degree-days calculations. The localized analysis calculates trends in the temperature and precipitation extremes metrics for relatively small regions such as counties, metropolitan areas, climate zones, administrative areas, or economic zones. For NCA4, we are currently addressing metropolitan areas as defined by U.S. Census Bureau Metropolitan Statistical Areas. Such localized analysis provides essential information for adaptation planning at scales relevant to local planning agencies and businesses. Nearly 30 such regions have been analyzed to date. Each locale is defined by a closed polygon that is used to extract LOCA-based extremes metrics specific to the area. For each metric, single-model data at each LOCA grid location are first averaged over several 30-year historical and future periods. Then, for each metric, the spatial average across the region is calculated using model weights based on both model independence and reproducibility of current climate conditions. The range of single-model results is also captured on the same localized basis, and then combined with the weighted ensemble average for each region and each metric. For example, Boston-area cooling degree days and maximum daily temperature are shown below for RCP8.5 (red) and RCP4.5 (blue) scenarios. We also discuss inter-regional comparison of these metrics, as well as their relevance to risk analysis for adaptation planning.
Jakes, Peter; Kungl, Hans; Schierholz, Roland; Eichel, Rüdiger-A
2014-09-01
The defect structure of copper-doped sodium potassium niobate (KNN) ferroelectrics has been analyzed. In particular, the interplay of the mutually compensating dimeric (Cu(Nb)'''-V(O)··) and trimeric (V(O)··-Cu(Nb)'''-V(O)··)· defect complexes with 180° and non-180° domain walls has been analyzed and compared to the effects of (Cu''-V(O)··)× dipoles in CuO-doped lead zirconate titanate (PZT). Attempts are made to relate the rearrangement of defect complexes to macroscopic electromechanical properties.
Method Points: towards a metric for method complexity
Directory of Open Access Journals (Sweden)
Graham McLeod
1998-11-01
Full Text Available A metric for method complexity is proposed as an aid to choosing between competing methods, as well as to validating the effects of method integration or the products of method engineering work. It is based upon a generic method representation model previously developed by the author and an adaptation of concepts used in the popular Function Point metric for system size. The proposed technique is illustrated by comparing two popular I.E. deliverables with their counterparts in the object-oriented Unified Modeling Language (UML). The paper recommends ways to improve the practical adoption of new methods.
Evaluation metrics for biostatistical and epidemiological collaborations.
Rubio, Doris McGartland; Del Junco, Deborah J; Bhore, Rafia; Lindsell, Christopher J; Oster, Robert A; Wittkowski, Knut M; Welty, Leah J; Li, Yi-Ju; Demets, Dave
2011-10-15
Increasing demands for evidence-based medicine and for the translation of biomedical research into individual and public health benefit have been accompanied by the proliferation of special units that offer expertise in biostatistics, epidemiology, and research design (BERD) within academic health centers. Objective metrics that can be used to evaluate, track, and improve the performance of these BERD units are critical to their successful establishment and sustainable future. To develop a set of reliable but versatile metrics that can be adapted easily to different environments and evolving needs, we consulted with members of BERD units from the consortium of academic health centers funded by the Clinical and Translational Science Award Program of the National Institutes of Health. Through a systematic process of consensus building and document drafting, we formulated metrics that covered the three identified domains of BERD practices: the development and maintenance of collaborations with clinical and translational science investigators, the application of BERD-related methods to clinical and translational research, and the discovery of novel BERD-related methodologies. In this article, we describe the set of metrics and advocate their use for evaluating BERD practices. The routine application, comparison of findings across diverse BERD units, and ongoing refinement of the metrics will identify trends, facilitate meaningful changes, and ultimately enhance the contribution of BERD activities to biomedical research. Copyright © 2011 John Wiley & Sons, Ltd.
Directory of Open Access Journals (Sweden)
mohsen Mirzayi
2016-03-01
Full Text Available Landscape indices can be used as an approach for predicting water quality changes to monitor non-point source pollution. In the present study, data collected over the period from 2012 to 2013 from 81 water quality stations along the rivers flowing in Mazandaran Province were analyzed. Upstream boundaries were drawn and landscape metrics were extracted for each of the sub-watersheds at class and landscape levels. Principal component analysis was used to single out the relevant water quality parameters, and forward linear regression was employed to determine the optimal metrics for the description of each parameter. The first five components were able to describe 96.61% of the variation in water quality in Mazandaran Province. An Adaptive Neuro-fuzzy Inference System (ANFIS) and multiple linear regression were used to model the relationship between landscape metrics and water quality parameters. The results indicate that multiple regression was able to predict SAR, TDS, pH, NO3‒, and PO43‒ in the test step, with R2 values equal to 0.81, 0.56, 0.73, 0.44, and 0.63, respectively. The corresponding R2 values of ANFIS in the test step were 0.82, 0.79, 0.82, 0.31, and 0.36, respectively. Clearly, ANFIS exhibited a better performance in each case than did the linear regression model. This indicates a nonlinear relationship between the water quality parameters and landscape metrics. Since different land cover/uses have considerable impacts on both the outflow water quality and the available and dissolved pollutants in rivers, the method can reasonably be used for regional planning and environmental impact assessment in development projects in the region.
Manganaro, Alberto; Pizzo, Fabiola; Lombardo, Anna; Pogliaghi, Alberto; Benfenati, Emilio
2016-02-01
The ability of a substance to resist degradation and persist in the environment needs to be readily identified in order to protect the environment and human health. Many regulations require the assessment of persistence for substances commonly manufactured and marketed. Besides laboratory-based testing methods, in silico tools may be used to obtain a computational prediction of persistence. We present a new program to develop k-Nearest Neighbor (k-NN) models. The k-NN algorithm is a similarity-based approach that predicts the property of a substance from the experimental data for its most similar compounds. We employed this software to identify persistence in the sediment compartment. Data on half-life (HL) in sediment were obtained from different sources and, after careful data pruning, the final dataset, containing 297 organic compounds, was divided into four experimental classes. We developed several models with satisfactory performance, with both training and test set accuracy ranging between 0.90 and 0.96. We finally selected one model, which will be made available in the near future in the freely available software platform VEGA. This model offers a valuable in silico tool that may be really useful for fast and inexpensive screening. Copyright © 2015 Elsevier Ltd. All rights reserved.
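The similarity-based k-NN idea described in the abstract can be sketched in a few lines; the 2-D descriptor space, data, and class labels below are purely illustrative stand-ins, not the published model:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Predict the class of x by majority vote among its k nearest
    training compounds (Euclidean distance in descriptor space)."""
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(d)[:k]
    votes = y_train[nearest]
    classes, counts = np.unique(votes, return_counts=True)
    return classes[np.argmax(counts)]

# Tiny synthetic example: two half-life classes in a 2-D descriptor space.
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([0.05, 0.1])))  # nearest neighbours are class 0
```

In a real persistence model the descriptors would be molecular properties and the classes half-life ranges; the vote could also be distance-weighted.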
Wang, Wei; Guyet, Thomas; Quiniou, René
2014-01-01
In this work, we propose a novel framework of autonomic intrusion detection that fulfills online and adaptive intrusion detection over unlabeled HTTP traffic streams in computer networks. The framework holds potential for self-managing: self-labeling, self-updating and self-adapting. Our framework employs the Affinity Propagation (AP) algorithm to learn a subject’s behaviors through dynamical clustering of the streaming data. It automatically labels the data and adapts to normal behavior changes while identifying anomalies. Two large real HTTP traffic streams collected in our institute, as well as a set of benchmark KDD’99 data, are used to validate the framework and the method. The test results show that the autonomic model achieves better results in terms of effectiveness and efficiency compared to the adaptive Sequential Karhunen–Loeve method and static AP, as well as three other static anomaly detection methods, namely k-NN, PCA and SVM.
International Nuclear Information System (INIS)
Chen, Liang; Dirmeyer, Paul A
2016-01-01
To assess the biogeophysical impacts of land cover/land use change (LCLUC) on surface temperature, two observation-based metrics and their applicability in climate modeling were explored in this study. Both metrics were developed based on the surface energy balance, and provided insight into the contribution of different aspects of land surface change (such as albedo, surface roughness, net radiation and surface heat fluxes) to changing climate. A revision of the first metric, the intrinsic biophysical mechanism, can be used to distinguish the direct and indirect effects of LCLUC on surface temperature. The other, a decomposed temperature metric, gives a straightforward depiction of separate contributions of all components of the surface energy balance. These two metrics well capture observed and model simulated surface temperature changes in response to LCLUC. Results from paired FLUXNET sites and land surface model sensitivity experiments indicate that surface roughness effects usually dominate the direct biogeophysical feedback of LCLUC, while other effects play a secondary role. However, coupled climate model experiments show that these direct effects can be attenuated by large scale atmospheric changes (indirect feedbacks). When applied to real-time transient LCLUC experiments, the metrics also demonstrate usefulness for assessing the performance of climate models and quantifying land–atmosphere interactions in response to LCLUC. (letter)
NeatSort - A practical adaptive algorithm
La Rocca, Marcello; Cantone, Domenico
2014-01-01
We present a new adaptive sorting algorithm which is optimal for most disorder metrics and, more importantly, has a simple and quick implementation. On input $X$, our algorithm has a theoretical $\Omega(|X|)$ lower bound and a $\mathcal{O}(|X|\log|X|)$ upper bound, exhibiting strong adaptive properties which make it run closer to its lower bound as disorder (computed on different metrics) diminishes. From a practical point of view, NeatSort has proven itself competitive with (and of...
A metric and frameworks for resilience analysis of engineered and infrastructure systems
International Nuclear Information System (INIS)
Francis, Royce; Bekera, Behailu
2014-01-01
In this paper, we have reviewed various approaches to defining resilience and the assessment of resilience. We have seen that while resilience is a useful concept, its diversity in usage complicates its interpretation and measurement. In this paper, we have proposed a resilience analysis framework and a metric for measuring resilience. Our analysis framework consists of system identification, resilience objective setting, vulnerability analysis, and stakeholder engagement. The implementation of this framework is focused on the achievement of three resilience capacities: adaptive capacity, absorptive capacity, and recoverability. These three capacities also form the basis of our proposed resilience factor and uncertainty-weighted resilience metric. We have also identified two important unresolved discussions emerging in the literature: the idea of resilience as an epistemological versus inherent property of the system, and design for ecological versus engineered resilience in socio-technical systems. While we have not resolved this tension, we have shown that our framework and metric promote the development of methodologies for investigating “deep” uncertainties in resilience assessment while retaining the use of probability for expressing uncertainties about highly uncertain, unforeseeable, or unknowable hazards in design and management activities. - Highlights: • While resilience is a useful concept, its diversity in usage complicates its interpretation and measurement. • We proposed a resilience analysis framework whose implementation is encapsulated within a resilience metric incorporating absorptive, adaptive, and restorative capacities. • We have shown that our framework and metric can support the investigation of “deep” uncertainties in resilience assessment or analysis. • We have discussed the role of quantitative metrics in design for ecological versus engineered resilience in socio-technical systems. • Our resilience metric supports
Fast Link Adaptation for MIMO-OFDM
DEFF Research Database (Denmark)
Jensen, Tobias Lindstrøm; Kant, Shashi; Wehinger, Joachim
2010-01-01
We investigate link-quality metrics (LQMs) based on raw bit-error-rate, effective signal-to-interference-plus-noise ratio, and mutual information (MI) for the purpose of fast link adaptation (LA) in communication systems employing orthogonal frequency-division multiplexing and multiple-input–multiple-output transmission...
Parrish, Donna; Butryn, Ryan S.; Rizzo, Donna M.
2012-01-01
We developed a methodology to predict brook trout (Salvelinus fontinalis) distribution using summer temperature metrics as predictor variables. Our analysis used long-term fish and hourly water temperature data from the Dog River, Vermont (USA). Commonly used metrics (e.g., mean, maximum, maximum 7-day maximum) tend to smooth the data so information on temperature variation is lost. Therefore, we developed a new set of metrics (called event metrics) to capture temperature variation by describing the frequency, area, duration, and magnitude of events that exceeded a user-defined temperature threshold. We used 16, 18, 20, and 22°C. We built linear discriminant models and tested and compared the event metrics against the commonly used metrics. Correct classification of the observations was 66% with event metrics and 87% with commonly used metrics. However, combined event and commonly used metrics correctly classified 92%. Of the four individual temperature thresholds, it was difficult to assess which threshold had the “best” accuracy. The 16°C threshold had slightly fewer misclassifications; however, the 20°C threshold had the fewest extreme misclassifications. Our method leveraged the volumes of existing long-term data and provided a simple, systematic, and adaptable framework for monitoring changes in fish distribution, specifically in the case of irregular, extreme temperature events.
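The event metrics described above (frequency, duration, area, and magnitude of threshold exceedances) can be sketched directly from a temperature series; the readings and threshold below are invented for illustration:

```python
import numpy as np

def event_metrics(temps, threshold):
    """Summarize runs of consecutive readings above `threshold`:
    event count, total duration (in samples), area above threshold,
    and peak magnitude, mirroring frequency/duration/area/magnitude."""
    above = temps > threshold
    # Find run boundaries where the exceedance state changes.
    edges = np.diff(np.concatenate(([0], above.view(np.int8), [0])))
    starts, ends = np.where(edges == 1)[0], np.where(edges == -1)[0]
    events = [temps[s:e] for s, e in zip(starts, ends)]
    return {
        "frequency": len(events),
        "duration": int(above.sum()),
        "area": float(sum((ev - threshold).sum() for ev in events)),
        "magnitude": float(max((ev.max() for ev in events), default=0.0)),
    }

temps = np.array([15., 17., 21., 22., 19., 23., 21., 18.])
print(event_metrics(temps, 20.0))
```

Run over hourly data at each threshold (16, 18, 20, 22°C), these summaries retain the variation that means and maxima smooth away.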
Using principal component analysis for selecting network behavioral anomaly metrics
Gregorio-de Souza, Ian; Berk, Vincent; Barsamian, Alex
2010-04-01
This work addresses new approaches to behavioral analysis of networks and hosts for the purposes of security monitoring and anomaly detection. Most commonly used approaches simply implement anomaly detectors for one, or a few, simple metrics, and those metrics can exhibit unacceptable false alarm rates. For instance, the anomaly score of network communication is defined as the reciprocal of the likelihood that a given host uses a particular protocol (or destination); this definition may result in an unrealistically high threshold for alerting to avoid being flooded by false positives. We demonstrate that selecting and adapting the metrics and thresholds, on a host-by-host or protocol-by-protocol basis, can be done by established multivariate analyses such as PCA. We will show how to determine one or more metrics, for each network host, that record the highest available amount of information regarding the baseline behavior, and show relevant deviances reliably. We describe the methodology used to pick from a large selection of available metrics, and illustrate a method for comparing the resulting classifiers. Using our approach we are able to reduce the resources required to properly identify misbehaving hosts, protocols, or networks, by dedicating system resources to only those metrics that actually matter in detecting network deviations.
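A minimal sketch of the PCA-based selection idea, with synthetic per-host metrics standing in for real network measurements: the loadings of the first principal component indicate which raw metric carries most of the baseline variance and is therefore worth monitoring.

```python
import numpy as np

# Rows: observations of one host; columns: candidate behaviour metrics
# (e.g. bytes/s, connections/s, distinct ports). Synthetic illustration.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
X[:, 0] *= 10.0          # this metric carries most of the variance
Xc = X - X.mean(axis=0)

# Principal components of the empirical covariance matrix.
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]

# Loadings of the first component tell us which raw metric dominates
# the baseline behaviour for this host.
first_pc = eigvecs[:, order[0]]
print(int(np.argmax(np.abs(first_pc))))  # index of the dominant metric
```

Repeating this per host or per protocol yields the host-specific metric selection the abstract describes.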
Load Balancing Metric with Diversity for Energy Efficient Routing in Wireless Sensor Networks
DEFF Research Database (Denmark)
Moad, Sofiane; Hansen, Morten Tranberg; Jurdak, Raja
2011-01-01
The expected number of transmissions (ETX) represents a routing metric that considers the highly variable link qualities for a specific radio in Wireless Sensor Networks (WSNs). To adapt to these differences, radio diversity is a recently explored solution for WSNs. In this paper, we propose an energy balancing metric which explores the diversity in link qualities present at different radios. The goal is to effectively use the energy of the network and therefore extend the network lifetime. The proposed metric takes into account the transmission and reception costs for a specific radio in order to choose an energy efficient radio. In addition, the metric uses the remaining energy of nodes in order to regulate the traffic so that critical nodes are avoided. We show by simulations that our metric can improve the network lifetime by up to 20%.
Learning-Based Adaptive Imputation Method with kNN Algorithm for Missing Power Data
Directory of Open Access Journals (Sweden)
Minkyung Kim
2017-10-01
Full Text Available This paper proposes a learning-based adaptive imputation method (LAI) for imputing missing power data in an energy system. This method estimates the missing power data by using the patterns that appear in the collected data. To capture the patterns from past power data, we newly model a feature vector using past data and its variations. The proposed LAI then learns the optimal length of the feature vector and the optimal historical length, which are significant hyperparameters of the proposed method, by utilizing intentionally missing data. Based on a weighted distance between feature vectors representing a missing situation and a past situation, missing power data are estimated by referring to the k most similar past situations within the optimal historical length. We further extend the proposed LAI to alleviate the effect of unexpected variation in power data and refer to this new approach as the extended LAI method (eLAI). The eLAI selects between linear interpolation (LI) and the proposed LAI to improve accuracy under unexpected variations. Finally, from a simulation under various energy consumption profiles, we verify that the proposed eLAI achieves about a 74% reduction in the average imputation error in an energy system, compared to existing imputation methods.
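A much-simplified sketch of the feature-vector kNN imputation idea (fixed window and k rather than the learned hyperparameters of LAI, and unweighted distance); the consumption series is invented:

```python
import numpy as np

def knn_impute(series, t, window=3, k=2):
    """Estimate the missing reading series[t] from the k past positions
    whose preceding `window` values most resemble the window before t
    (a simplified sketch of feature-vector kNN imputation)."""
    target = series[t - window:t]
    candidates = []
    for s in range(window, t):
        feat = series[s - window:s]
        candidates.append((np.linalg.norm(feat - target), series[s]))
    candidates.sort(key=lambda c: c[0])
    return float(np.mean([v for _, v in candidates[:k]]))

# Consumption with a repeating daily-like pattern; impute the gap at index 14.
power = np.array([1., 2., 3., 4., 1., 2., 3., 4.,
                  1., 2., 3., 4., 1., 2., np.nan, 4.])
print(knn_impute(power, 14))  # the pattern implies a value of 3.0
```

The full method would additionally learn `window` and the history length from intentionally removed data, and fall back to linear interpolation under unexpected variation (the eLAI switch).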
Latent Dirichlet Allocation (LDA) Model and kNN Algorithm to Classify Research Project Selection
Safi’ie, M. A.; Utami, E.; Fatta, H. A.
2018-03-01
Universitas Sebelas Maret has a teaching staff of more than 1,500 people, one of whose tasks is to carry out research. On the other hand, funding support for research and community service (P2M) is limited, so proposals must be evaluated to determine which submissions receive support. At the selection stage, research proposal documents are collected as unstructured data, and the volume of stored data is very large. Extracting the information contained in these documents requires text mining technology, which is applied to gain knowledge from the documents by automating information extraction. In this article we use Latent Dirichlet Allocation (LDA) as a model in the feature extraction process, to obtain terms that represent each document. We then use the k-Nearest Neighbour (kNN) algorithm to classify the documents based on these terms.
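The LDA-plus-kNN pipeline can be sketched with scikit-learn; the mini-corpus and funding labels below are invented stand-ins for the proposal documents:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical proposal abstracts; labels 1 = selected, 0 = not selected.
docs = [
    "neural network model for image classification",
    "deep learning network for image recognition",
    "soil irrigation and crop yield in dry regions",
    "crop rotation and soil fertility improvement",
]
labels = [1, 1, 0, 0]

# LDA reduces each document to a low-dimensional topic distribution...
vec = CountVectorizer().fit(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
topics = lda.fit_transform(vec.transform(docs))

# ...and kNN classifies new proposals by their nearest topic profiles.
knn = KNeighborsClassifier(n_neighbors=1).fit(topics, labels)
new = lda.transform(vec.transform(["network model for image recognition"]))
pred = knn.predict(new)[0]
print(topics.shape, pred)
```

On a corpus this small the topic model is not meaningful; the point is only the shape of the pipeline: vectorize, reduce with LDA, then classify with kNN in topic space.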
Gaba, Yaé Ulrich
2017-01-01
In this paper, we discuss recent results about generalized metric spaces and fixed point theory. We introduce the notion of $\eta$-cone metric spaces, give some topological properties and prove some fixed point theorems for contractive type maps on these spaces. In particular, we show that these $\eta$-cone metric spaces are natural generalizations of both cone metric spaces and metric type spaces.
Preparation of BNKT-KNN ceramic powders by the Pechini method
Directory of Open Access Journals (Sweden)
Yasnó, J. P.
2013-10-01
Full Text Available The Pechini method was used to obtain fine, single-phase ceramic powders of the lead-free ferroelectric system 0.97[(Bi1/2Na1/2)1−x(Bi1/2K1/2)xTiO3]–0.03[(Na1/2K1/2)NbO3], or BNKT-KNN (x = 0.00, 0.18, 0.21, 0.24, 0.27). This method allowed obtaining powders with 100 % perovskite phase for this system in all the studied stoichiometries, at a temperature as low as 600 ºC, which was confirmed by X-ray diffraction. The effect of varying the Na-K stoichiometry on the bonds present in the structure was determined using infrared spectroscopy (FT-IR). Irregular nanoparticles were observed by scanning electron microscopy.
Directory of Open Access Journals (Sweden)
Yussouf Nahayo
2016-04-01
Full Text Available This paper proposes some methods of robust text-independent speaker identification based on the Gaussian Mixture Model (GMM). We implemented a combination of the GMM model with a set of classifiers such as Support Vector Machine (SVM), K-Nearest Neighbour (K-NN), and Naive Bayes Classifier (NBC). In order to improve the identification rate, we developed a combination of hybrid systems using a validation technique. The experiments were performed on the DR1 dialect of the TIMIT corpus. The results showed a better performance for the developed technique compared to the individual techniques.
Eye Tracking Metrics for Workload Estimation in Flight Deck Operation
Ellis, Kyle; Schnell, Thomas
2010-01-01
Flight decks of the future are being enhanced through improved avionics that adapt to both aircraft and operator state. Eye tracking allows for non-invasive analysis of pilot eye movements, from which a set of metrics can be derived to effectively and reliably characterize workload. This research identifies eye tracking metrics that correlate to aircraft automation conditions, and identifies the correlation of pilot workload to the same automation conditions. Saccade length was used as an indirect index of pilot workload: pilots in the fully automated condition were observed to have, on average, larger saccadic movements in contrast to the guidance and manual flight conditions. The data set itself also provides a general model of human eye movement behavior, and so, ostensibly, of visual attention distribution, in the cockpit for approach-to-land tasks with various levels of automation, by means of the same metrics used for workload algorithm development.
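As a rough illustration (not the authors' algorithm), a saccade-length metric can be derived from a gaze trace by thresholding inter-sample movement; the trace and threshold here are hypothetical:

```python
import numpy as np

def mean_saccade_length(gaze, movement_threshold=2.0):
    """Mean Euclidean length of saccades, detected (very crudely) as
    inter-sample movements larger than a threshold; fixational
    drift below the threshold is ignored."""
    steps = np.linalg.norm(np.diff(gaze, axis=0), axis=1)
    saccades = steps[steps > movement_threshold]
    return float(saccades.mean()) if saccades.size else 0.0

# Hypothetical gaze trace: small fixation jitter with two large saccades.
gaze = np.array([[0, 0], [0.1, 0], [5, 5], [5.1, 5], [0, 10]], dtype=float)
print(mean_saccade_length(gaze))
```

Production eye-tracking pipelines use velocity- or dispersion-based event detection; this sketch only conveys how a scalar workload index can fall out of raw gaze samples.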
Directory of Open Access Journals (Sweden)
Amir Eslam Bonyad
2015-06-01
Full Text Available In this study, we explored the utility of the k Nearest Neighbor (kNN) algorithm to integrate IRS-P6 LISS III satellite imagery and ground inventory data for estimating and mapping forest attributes (DBH, tree height, volume, basal area, density and forest cover type). The ground inventory data were based on a systematic-random sampling grid of 408 circular plots in a plantation in Guilan province, northern Iran. We concluded that the kNN method was a useful tool for mapping, with accuracies between 80% and 93.94%. Values of k between 5 and 8 seemed appropriate. The best distance metrics were found to be Euclidean, Fuzzy, and Mahalanobis. The results showed that kNN was accurate enough for practical application in mapping forest areas.
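A hand-rolled sketch of kNN attribute estimation under either Euclidean or Mahalanobis distance, with synthetic two-band spectral data standing in for the LISS III imagery and plot volumes invented for illustration:

```python
import numpy as np

def knn_estimate(X, y, x, k=3, VI=None):
    """kNN regression of a forest attribute (e.g. volume) at pixel x from
    inventory plots X; VI is an inverse covariance matrix for Mahalanobis
    distance, and None falls back to Euclidean distance."""
    diff = X - x
    if VI is None:
        d = np.sqrt((diff ** 2).sum(axis=1))
    else:
        # Squared Mahalanobis distance: diff_i @ VI @ diff_i for each row.
        d = np.sqrt(np.einsum("ij,jk,ik->i", diff, VI, diff))
    return float(y[np.argsort(d)[:k]].mean())

# Synthetic spectral bands (2-D) and plot volumes correlated with band 1.
rng = np.random.default_rng(2)
X = rng.normal(size=(50, 2))
y = 10.0 + 3.0 * X[:, 0]
VI = np.linalg.inv(np.cov(X, rowvar=False))
est = knn_estimate(X, y, np.array([0.0, 0.0]), k=5, VI=VI)
print(round(est, 1))
```

Applied per pixel over an image, this produces the attribute maps described in the abstract; fuzzy distance would replace the metric in the same slot.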
Classification of Pulse Waveforms Using Edit Distance with Real Penalty
Directory of Open Access Journals (Sweden)
Zhang Dongyu
2010-01-01
Full Text Available Advances in sensor and signal processing techniques have provided effective tools for quantitative research in traditional Chinese pulse diagnosis (TCPD). Because of the inevitable intraclass variation of pulse patterns, the automatic classification of pulse waveforms has remained a difficult problem. In this paper, by referring to the edit distance with real penalty (ERP) and recent progress in k-nearest neighbors (KNN) classifiers, we propose two novel ERP-based KNN classifiers. Taking advantage of the metric property of ERP, we first develop an ERP-induced inner product and a Gaussian ERP kernel, then embed them into difference-weighted KNN classifiers, and finally develop two novel classifiers for pulse waveform classification. The experimental results show that the proposed classifiers are effective for accurate classification of pulse waveforms.
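The ERP distance at the core of the proposed classifiers can be written as a small dynamic program; this is a sketch of the standard definition (gaps penalized by distance to a constant g), not the paper's full kernelized classifier:

```python
import numpy as np

def erp(a, b, g=0.0):
    """Edit distance with real penalty between numeric sequences a and b;
    gaps cost the distance to the constant g, which makes ERP a true metric."""
    n, m = len(a), len(b)
    D = np.zeros((n + 1, m + 1))
    # Aligning against an empty sequence means gapping every element.
    D[1:, 0] = np.cumsum(np.abs(np.asarray(a) - g))
    D[0, 1:] = np.cumsum(np.abs(np.asarray(b) - g))
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i, j] = min(D[i-1, j-1] + abs(a[i-1] - b[j-1]),  # match
                          D[i-1, j] + abs(a[i-1] - g),         # gap in b
                          D[i, j-1] + abs(b[j-1] - g))         # gap in a
    return float(D[n, m])

print(erp([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # identical sequences → 0.0
print(erp([1.0, 2.0], [1.0, 2.0, 2.0]))
```

Because ERP satisfies the triangle inequality, it can induce the inner product and Gaussian kernel the abstract mentions, which a plain elastic distance like DTW cannot.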
Optimal Detection Range of RFID Tag for RFID-based Positioning System Using the k-NN Algorithm
Directory of Open Access Journals (Sweden)
Joon Heo
2009-06-01
Full Text Available Positioning technology to track a moving object is an important and essential component of ubiquitous computing environments and applications. An RFID-based positioning system using the k-nearest neighbor (k-NN) algorithm can determine the position of a moving reader from observed reference data. In this study, the optimal detection range of an RFID-based positioning system was determined on the principle that tag spacing can be derived from the detection range. It was assumed that reference tags without signal strength information are regularly distributed in 1-, 2- and 3-dimensional spaces. The optimal detection range was determined, through analytical and numerical approaches, to be 125% of the tag-spacing distance in 1-dimensional space. Through numerical approaches, the range was 134% in 2-dimensional space and 143% in 3-dimensional space.
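A minimal sketch of the reference-tag averaging idea, assuming a hypothetical 1 m tag grid and a detection range of 125% of the tag spacing (the paper's 1-D optimum, reused here in 2-D purely for illustration):

```python
import numpy as np

def estimate_position(tag_grid, detected_ids):
    """Estimate the reader position as the centroid of the detected
    reference tags; all tags in range are weighted equally, since the
    tags carry no signal-strength information."""
    return tag_grid[detected_ids].mean(axis=0)

# Reference tags on a 1 m grid covering a 4 x 4 m area.
xs, ys = np.meshgrid(np.arange(5.0), np.arange(5.0))
grid = np.column_stack([xs.ravel(), ys.ravel()])

# A reader near (1.5, 1.5) detects every tag within 1.25 m.
reader = np.array([1.5, 1.5])
near = [i for i, p in enumerate(grid) if np.linalg.norm(p - reader) <= 1.25]
print(estimate_position(grid, near))  # → the four surrounding tags average to [1.5 1.5]
```

The detection range matters because it fixes how many reference tags vote: too small and no tags are seen, too large and distant tags bias the centroid, which is what the optimal-range analysis quantifies.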
An optimization-based framework for anisotropic simplex mesh adaptation
Yano, Masayuki; Darmofal, David L.
2012-09-01
We present a general framework for anisotropic h-adaptation of simplex meshes. Given a discretization and any element-wise, localizable error estimate, our adaptive method iterates toward a mesh that minimizes error for a given degrees of freedom. Utilizing mesh-metric duality, we consider a continuous optimization problem of the Riemannian metric tensor field that provides an anisotropic description of element sizes. First, our method performs a series of local solves to survey the behavior of the local error function. This information is then synthesized using an affine-invariant tensor manipulation framework to reconstruct an approximate gradient of the error function with respect to the metric tensor field. Finally, we perform gradient descent in the metric space to drive the mesh toward optimality. The method is first demonstrated to produce optimal anisotropic meshes minimizing the L2 projection error for a pair of canonical problems containing a singularity and a singular perturbation. The effectiveness of the framework is then demonstrated in the context of output-based adaptation for the advection-diffusion equation using a high-order discontinuous Galerkin discretization and the dual-weighted residual (DWR) error estimate. The method presented provides a unified framework for optimizing both the element size and anisotropy distribution using an a posteriori error estimate and enables efficient adaptation of anisotropic simplex meshes for high-order discretizations.
Metrics required for Power System Resilient Operations and Protection
Energy Technology Data Exchange (ETDEWEB)
Eshghi, K.; Johnson, B. K.; Rieger, C. G.
2016-08-01
Today’s complex grid involves many interdependent systems. Various layers of hierarchical control and communication systems are coordinated, both spatially and temporally, to achieve grid reliability. As new communication-network-based control system technologies are being deployed, the interconnected nature of these systems is becoming more complex. Deployment of smart grid concepts promises effective integration of renewable resources, especially if combined with energy storage. However, without a philosophical focus on resilience, a smart grid will potentially lead to higher magnitude and/or duration of disruptive events. The effectiveness of a resilient infrastructure depends upon its ability to anticipate, absorb, adapt to, and/or rapidly recover from a potentially catastrophic event. Future system operations can be enhanced with a resilient philosophy through architecting the complexity with state awareness metrics that recognize changing system conditions and provide for an agile and adaptive response. The starting point for metrics lies in first understanding the attributes of performance that will be qualified. In this paper, we overview those attributes and describe how they will be characterized by designing a distributed agent that can be applied to the power grid.
Hu, Bo; Kalfoglou, Yannis; Dupplaw, David; Alani, Harith; Lewis, Paul; Shadbolt, Nigel
2006-01-01
In the context of the Semantic Web, many ontology-related operations, e.g. ontology ranking, segmentation, alignment, articulation, reuse, evaluation, can be boiled down to one fundamental operation: computing the similarity and/or dissimilarity among ontological entities, and in some cases among ontologies themselves. In this paper, we review standard metrics for computing distance measures and we propose a series of semantic metrics. We give a formal account of semantic metrics drawn from a...
Chistyakov, Vyacheslav
2015-01-01
Aimed toward researchers and graduate students familiar with elements of functional analysis, linear algebra, and general topology, this book contains a general study of modulars, modular spaces, and metric modular spaces. Modulars may be thought of as generalized velocity fields and serve two important purposes: they generate metric spaces in a unified manner and provide a weaker convergence, the modular convergence, whose topology is non-metrizable in general. Metric modular spaces are extensions of metric spaces, metric linear spaces, and classical modular linear spaces. The topics covered include the classification of modulars, metrizability of modular spaces, modular transforms and duality between modular spaces, metric and modular topologies. Applications illustrated in this book include: the description of superposition operators acting in modular spaces, the existence of regular selections of set-valued mappings, new interpretations of spaces of Lipschitzian and absolutely continuous mappings, the existe...
Next-Generation Metrics: Responsible Metrics & Evaluation for Open Science
Energy Technology Data Exchange (ETDEWEB)
Wilsdon, J.; Bar-Ilan, J.; Peters, I.; Wouters, P.
2016-07-01
Metrics evoke a mixed reaction from the research community. A commitment to using data to inform decisions makes some enthusiastic about the prospect of granular, real-time analysis of research and its wider impacts. Yet we only have to look at the blunt use of metrics such as journal impact factors, h-indices and grant income targets to be reminded of the pitfalls. Some of the most precious qualities of academic culture resist simple quantification, and individual indicators often struggle to do justice to the richness and plurality of research. Too often, poorly designed evaluation criteria are “dominating minds, distorting behaviour and determining careers” (Lawrence, 2007). Metrics hold real power: they are constitutive of values, identities and livelihoods. How to exercise that power to more positive ends has been the focus of several recent and complementary initiatives, including the San Francisco Declaration on Research Assessment (DORA), the Leiden Manifesto and The Metric Tide (a UK government review of the role of metrics in research management and assessment). Building on these initiatives, the European Commission, under its new Open Science Policy Platform, is now looking to develop a framework for responsible metrics for research management and evaluation, which can be incorporated into the successor framework to Horizon 2020. (Author)
Goal based mesh adaptivity for fixed source radiation transport calculations
International Nuclear Information System (INIS)
Baker, C.M.J.; Buchan, A.G.; Pain, C.C.; Tollit, B.S.; Goffin, M.A.; Merton, S.R.; Warner, P.
2013-01-01
Highlights: ► Derives an anisotropic goal based error measure for shielding problems. ► Reduces the error in the detector response by optimizing the finite element mesh. ► Anisotropic adaptivity captures material interfaces using fewer elements than AMR. ► A new residual based on the numerical scheme chosen forms the error measure. ► The error measure also combines the forward and adjoint metrics in a novel way. - Abstract: In this paper, the application of goal based error measures for anisotropic adaptivity applied to shielding problems in which a detector is present is explored. Goal based adaptivity is important when the response of a detector is required to ensure that dose limits are adhered to. To achieve this, a dual (adjoint) problem is solved which solves the neutron transport equation in terms of the response variables, in this case the detector response. The methods presented can be applied to general finite element solvers; however, the derivation of the residuals is dependent on the underlying finite element scheme, which is also discussed in this paper. Once error metrics for the forward and adjoint solutions have been formed, they are combined using a novel approach. The two metrics are combined by forming the minimum ellipsoid that covers both the error metrics rather than taking the maximum ellipsoid that is contained within the metrics. Another novel approach used within this paper is the construction of the residual. The residual, used to form the goal based error metrics, is calculated from the subgrid scale correction which is inherent in the underlying spatial discretisation employed.
Integrated Metrics for Improving the Life Cycle Approach to Assessing Product System Sustainability
Directory of Open Access Journals (Sweden)
Wesley Ingwersen
2014-03-01
Life cycle approaches are critical for identifying and reducing environmental burdens of products. While these methods can indicate potential environmental impacts of a product, current Life Cycle Assessment (LCA) methods fail to integrate the multiple impacts of a system into unified measures of social, economic or environmental performance related to sustainability. Integrated metrics that combine multiple aspects of system performance based on a common scientific or economic principle have proven to be valuable for sustainability evaluation. In this work, we propose methods of adapting four integrated metrics for use with LCAs of product systems: ecological footprint, emergy, green net value added, and Fisher information. These metrics provide information on the full product system in land, energy, and monetary equivalents, and as a unitless information index, each bundled with one or more indicators for reporting. When used together and for relative comparison, integrated metrics provide a broader coverage of sustainability aspects from multiple theoretical perspectives that is more likely to illuminate potential issues than individual impact indicators. These integrated metrics are recommended for use in combination with traditional indicators used in LCA. Future work will test and demonstrate the value of using these integrated metrics and combinations to assess product system sustainability.
Baby universe metric equivalent to an interior black-hole metric
International Nuclear Information System (INIS)
Gonzalez-Diaz, P.F.
1991-01-01
It is shown that the maximally extended metric corresponding to a large wormhole is the unique possible wormhole metric whose baby universe sector is conformally equivalent to the maximal inextendible Kruskal metric corresponding to the interior region of a Schwarzschild black hole whose gravitational radius is half the wormhole neck radius. The physical implications of this result in the black hole evaporation process are discussed. (orig.)
Croitoru, Anca; Apreutesei, Gabriela; Mastorakis, Nikos E.
2017-09-01
The subject of this paper belongs to the theory of approximate metrics [23]. An approximate metric on X is a real application defined on X × X that satisfies only a part of the metric axioms. In a recent paper [23], we introduced a new type of approximate metric, named C-metric, that is an application which satisfies only two metric axioms: symmetry and triangular inequality. The remarkable fact in a C-metric space is that a topological structure induced by the C-metric can be defined. The innovative idea of this paper is that we obtain some convergence properties of a C-metric space in the absence of a metric. In this paper we investigate C-metric spaces. The paper is divided into four sections. Section 1 is for Introduction. In Section 2 we recall some concepts and preliminary results. In Section 3 we present some properties of C-metric spaces, such as convergence properties, a canonical decomposition and a C-fixed point theorem. Finally, in Section 4 some conclusions are highlighted.
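A concrete C-metric is easy to exhibit (this example is ours, not from the paper): d(x, y) = |x − y| + 1 is symmetric and satisfies the triangle inequality, yet d(x, x) = 1 ≠ 0, so it is not a metric. A quick numerical check:

```python
import itertools
import random

def c_metric(x, y):
    """Satisfies only symmetry and the triangle inequality:
    d(x, y) = |x - y| + 1, so the self-distance d(x, x) = 1 != 0."""
    return abs(x - y) + 1.0

random.seed(0)
pts = [random.uniform(-5, 5) for _ in range(20)]
symmetric = all(c_metric(x, y) == c_metric(y, x) for x in pts for y in pts)
triangular = all(c_metric(x, z) <= c_metric(x, y) + c_metric(y, z) + 1e-12
                 for x, y, z in itertools.product(pts, repeat=3))
not_a_metric = all(c_metric(x, x) != 0.0 for x in pts)
```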
Learning Low-Dimensional Metrics
Jain, Lalit; Mason, Blake; Nowak, Robert
2017-01-01
This paper investigates the theoretical foundations of metric learning, focused on three key questions that are not fully addressed in prior work: 1) we consider learning general low-dimensional (low-rank) metrics as well as sparse metrics; 2) we develop upper and lower (minimax)bounds on the generalization error; 3) we quantify the sample complexity of metric learning in terms of the dimension of the feature space and the dimension/rank of the underlying metric;4) we also bound the accuracy ...
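A rank-r metric of the kind studied here can be parameterized as d(x, y)² = (x − y)ᵀLᵀL(x − y) with L an r×d matrix, i.e. Euclidean distance after a linear projection. A minimal sketch (names and data are illustrative):

```python
import random

def low_rank_dist(x, y, L):
    """Distance under the low-rank metric M = L^T L (L is r x d):
    the Euclidean norm of L(x - y), i.e. distance after projection."""
    diff = [a - b for a, b in zip(x, y)]
    proj = [sum(w * v for w, v in zip(row, diff)) for row in L]
    return sum(p * p for p in proj) ** 0.5

random.seed(1)
d, r = 6, 2                                  # ambient dimension, metric rank
L = [[random.gauss(0, 1) for _ in range(d)] for _ in range(r)]
x, y, z = ([random.gauss(0, 1) for _ in range(d)] for _ in range(3))
```

Because L has rank at most r, this is a pseudo-metric: distinct points that differ only in the null space of L are at distance zero, which is exactly the dimensionality reduction a low-rank metric buys.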
Scalar-metric and scalar-metric-torsion gravitational theories
International Nuclear Information System (INIS)
Aldersley, S.J.
1977-01-01
The techniques of dimensional analysis and of the theory of tensorial concomitants are employed to study field equations in gravitational theories which incorporate scalar fields of the Brans-Dicke type. Within the context of scalar-metric gravitational theories, a uniqueness theorem for the geometric (or gravitational) part of the field equations is proven and a Lagrangian is determined which is uniquely specified by dimensional analysis. Within the context of scalar-metric-torsion gravitational theories a uniqueness theorem for field Lagrangians is presented and the corresponding Euler-Lagrange equations are given. Finally, an example of a scalar-metric-torsion theory is presented which is similar in many respects to the Brans-Dicke theory and the Einstein-Cartan theory
International Nuclear Information System (INIS)
Ma Zhihao; Chen Jingling
2011-01-01
In this work we study metrics of quantum states, which are natural generalizations of the usual trace metric and Bures metric. Some useful properties of the metrics are proved, such as the joint convexity and contractivity under quantum operations. Our result has a potential application in studying the geometry of quantum states as well as the entanglement detection.
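For qubits, the usual trace metric mentioned above has a closed form: T(ρ, σ) = ½(|λ₁| + |λ₂|), where λᵢ are the eigenvalues of the Hermitian difference ρ − σ. A sketch under that 2×2 restriction (the representation as nested tuples is our own convention):

```python
import math

def trace_distance(rho, sigma):
    """Trace distance of two 2x2 density matrices (tuples of tuples of
    complex). For the Hermitian difference D = rho - sigma, the metric is
    (|l1| + |l2|) / 2 with l1, l2 the real eigenvalues of D."""
    a = (rho[0][0] - sigma[0][0]).real        # diagonal of D is real
    c = (rho[1][1] - sigma[1][1]).real
    b = rho[0][1] - sigma[0][1]
    tr, det = a + c, a * c - abs(b) ** 2
    disc = math.sqrt(max(tr * tr - 4.0 * det, 0.0))
    return (abs((tr + disc) / 2.0) + abs((tr - disc) / 2.0)) / 2.0

ket0 = ((1 + 0j, 0j), (0j, 0j))               # |0><0|
ket1 = ((0j, 0j), (0j, 1 + 0j))               # |1><1|
mixed = ((0.5 + 0j, 0j), (0j, 0.5 + 0j))      # maximally mixed state I/2
```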
A guide to calculating habitat-quality metrics to inform conservation of highly mobile species
Bieri, Joanna A.; Sample, Christine; Thogmartin, Wayne E.; Diffendorfer, James E.; Earl, Julia E.; Erickson, Richard A.; Federico, Paula; Flockhart, D. T. Tyler; Nicol, Sam; Semmens, Darius J.; Skraber, T.; Wiederholt, Ruscena; Mattsson, Brady J.
2018-01-01
Many metrics exist for quantifying the relative value of habitats and pathways used by highly mobile species. Properly selecting and applying such metrics requires substantial background in mathematics and understanding the relevant management arena. To address this multidimensional challenge, we demonstrate and compare three measurements of habitat quality: graph-, occupancy-, and demographic-based metrics. Each metric provides insights into system dynamics, at the expense of increasing amounts and complexity of data and models. Our descriptions and comparisons of diverse habitat-quality metrics provide means for practitioners to overcome the modeling challenges associated with management or conservation of such highly mobile species. Whereas previous guidance for applying habitat-quality metrics has been scattered in diversified tracks of literature, we have brought this information together into an approachable format including accessible descriptions and a modeling case study for a typical example that conservation professionals can adapt for their own decision contexts and focal populations. Considerations for resource managers: Management objectives, proposed actions, data availability and quality, and model assumptions are all relevant considerations when applying and interpreting habitat-quality metrics. Graph-based metrics answer questions related to habitat centrality and connectivity, are suitable for populations with any movement pattern, quantify basic spatial and temporal patterns of occupancy and movement, and require the least data. Occupancy-based metrics answer questions about likelihood of persistence or colonization, are suitable for populations that undergo localized extinctions, quantify spatial and temporal patterns of occupancy and movement, and require a moderate amount of data. Demographic-based metrics answer questions about relative or absolute population size, are suitable for populations with any movement pattern, and quantify demographic…
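As a toy illustration of the graph-based family (our example, not the paper's code), closeness centrality over an unweighted patch network can be computed with breadth-first search:

```python
from collections import deque

def closeness(graph):
    """Closeness centrality for each patch in an unweighted habitat graph
    {node: [neighbours]}: (reachable nodes) / (sum of BFS distances)."""
    scores = {}
    for src in graph:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in graph[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total = sum(dist.values())
        scores[src] = (len(dist) - 1) / total if total else 0.0
    return scores

# Hypothetical stopover network: patch B bridges A/C to D and E.
patches = {"A": ["B"], "C": ["B"], "B": ["A", "C", "D"],
           "D": ["B", "E"], "E": ["D"]}
cent = closeness(patches)
```

Here the bridging patch B scores highest, the kind of centrality signal a graph-based habitat metric would surface with minimal data.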
Beamspace Adaptive Beamforming for Hydrodynamic Towed Array Self-Noise Cancellation
National Research Council Canada - National Science Library
Premus, Vincent
2001-01-01
... against signal self-nulling associated with steering vector mismatch. Particular attention is paid to the definition of white noise gain as the metric that reflects the level of mainlobe adaptive nulling for an adaptive beamformer...
METRIC context unit architecture
Energy Technology Data Exchange (ETDEWEB)
Simpson, R.O.
1988-01-01
METRIC is an architecture for a simple but powerful Reduced Instruction Set Computer (RISC). Its speed comes from the simultaneous processing of several instruction streams, with instructions from the various streams being dispatched into METRIC's execution pipeline as they become available for execution. The pipeline is thus kept full, with a mix of instructions for several contexts in execution at the same time. True parallel programming is supported within a single execution unit, the METRIC Context Unit. METRIC's architecture provides for expansion through the addition of multiple Context Units and of specialized Functional Units. The architecture thus spans a range of size and performance from a single-chip microcomputer up through large and powerful multiprocessors. This research concentrates on the specification of the METRIC Context Unit at the architectural level. Performance tradeoffs made during METRIC's design are discussed, and projections of METRIC's performance are made based on simulation studies.
Metrics correlation and analysis service (MCAS)
International Nuclear Information System (INIS)
Baranovski, Andrew; Dykstra, Dave; Garzoglio, Gabriele; Hesselroth, Ted; Mhashilkar, Parag; Levshina, Tanya
2009-01-01
The complexity of Grid workflow activities and their associated software stacks inevitably involves multiple organizations, ownership, and deployment domains. In this setting, important and common tasks such as the correlation and display of metrics and debugging information (fundamental ingredients of troubleshooting) are challenged by the informational entropy inherent to independently maintained and operated software components. Because such an information 'pond' is disorganized, it is a difficult environment for business intelligence analysis, i.e. troubleshooting, incident investigation and trend spotting. The mission of the MCAS project is to deliver a software solution to help with adaptation, retrieval, correlation, and display of workflow-driven data and of type-agnostic events, generated by disjoint middleware.
Observable traces of non-metricity: New constraints on metric-affine gravity
Delhom-Latorre, Adrià; Olmo, Gonzalo J.; Ronco, Michele
2018-05-01
Relaxing the Riemannian condition to incorporate geometric quantities such as torsion and non-metricity may allow the exploration of new physics associated with defects in a hypothetical space-time microstructure. Here we show that non-metricity produces observable effects in quantum fields in the form of 4-fermion contact interactions, thereby allowing us to constrain the scale of non-metricity to be greater than 1 TeV by using results on Bhabha scattering. Our analysis is carried out in the framework of a wide class of theories of gravity in the metric-affine approach. The bound obtained represents an improvement of several orders of magnitude over previous experimental constraints.
LENUS (Irish Health Repository)
Creane, Arthur
2012-07-01
Many soft biological tissues contain collagen fibres, which act as major load bearing constituents. The orientation and the dispersion of these fibres influence the macroscopic mechanical properties of the tissue and are therefore of importance in several areas of research including constitutive model development, tissue engineering and mechanobiology. Qualitative comparisons between these fibre architectures can be made using vector plots of mean orientations and contour plots of fibre dispersion, but quantitative comparison cannot be achieved using these methods. We propose a 'remodelling metric' between two angular fibre distributions, which represents the mean rotational effort required to transform one into the other. It is an adaptation of the earth mover's distance, a similarity measure between two histograms/signatures used in image analysis, which represents the minimal cost of transforming one distribution into the other by moving distribution mass around. In this paper, its utility is demonstrated by considering the change in fibre architecture during a period of plaque growth in finite element models of the carotid bifurcation. The fibre architecture is predicted using a strain-based remodelling algorithm. We investigate the remodelling metric's potential as a clinical indicator of plaque vulnerability by comparing results between symptomatic and asymptomatic carotid bifurcations. Fibre remodelling was found to occur at regions of plaque burden. As plaque thickness increased, so did the remodelling metric. A measure of the total predicted fibre remodelling during plaque growth, TRM, was found to be higher in the symptomatic group than in the asymptomatic group. Furthermore, a measure of the total fibre remodelling per plaque size, TRM/TPB, was found to be significantly higher in the symptomatic vessels. The remodelling metric may prove to be a useful tool in other soft tissues and engineered scaffolds where fibre adaptation is also present.
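In one (non-circular) dimension the earth mover's distance the authors adapt reduces to the L1 distance between cumulative histograms. A sketch under that simplification, with hypothetical fibre-orientation histograms:

```python
def emd_1d(p, q, bin_width=1.0):
    """Earth mover's distance between two equal-mass histograms on a
    shared 1-D binning: total work equals the L1 distance between the
    running (cumulative) sums, scaled by the bin width."""
    assert abs(sum(p) - sum(q)) < 1e-9, "histograms must carry equal mass"
    work = cum = 0.0
    for pi, qi in zip(p, q):
        cum += pi - qi                 # surplus mass carried past this bin
        work += abs(cum) * bin_width
    return work

# Hypothetical fibre-orientation histograms over four 10-degree bins.
before = [0.5, 0.3, 0.2, 0.0]
after = [0.2, 0.3, 0.3, 0.2]
effort = emd_1d(before, after, bin_width=10.0)
```

Angular data are periodic, so a faithful version would also consider transport across the wrap-around boundary; the linear form above is the standard building block.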
Wang, Xueyi
2012-02-08
The k-nearest neighbors (k-NN) algorithm is a widely used machine learning method that finds nearest neighbors of a test object in a feature space. We present a new exact k-NN algorithm called kMkNN (k-Means for k-Nearest Neighbors) that uses k-means clustering and the triangle inequality to accelerate the search for nearest neighbors in a high dimensional space. The kMkNN algorithm has two stages. In the buildup stage, instead of using complex tree structures such as metric trees, kd-trees, or ball-trees, kMkNN uses a simple k-means clustering method to preprocess the training dataset. In the searching stage, given a query object, kMkNN finds nearest training objects starting from the nearest cluster to the query object and uses the triangle inequality to reduce the distance calculations. Experiments show that the performance of kMkNN is surprisingly good compared to the traditional k-NN algorithm and tree-based k-NN algorithms such as kd-trees and ball-trees. On a collection of 20 datasets with up to 10^6 records and 10^4 dimensions, kMkNN shows a 2- to 80-fold reduction of distance calculations and a 2- to 60-fold speedup over the traditional k-NN algorithm for 16 datasets. Furthermore, kMkNN performs significantly better than a kd-tree based k-NN algorithm for all datasets and performs better than a ball-tree based k-NN algorithm for most datasets. The results show that kMkNN is effective for searching nearest neighbors in high dimensional spaces.
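The two stages of kMkNN can be sketched as follows (a simplified 1-NN version; names are illustrative):

```python
import math
import random

def dist(a, b):
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

def kmeans(points, k, iters=10):
    """Buildup stage: plain k-means (random init, fixed iteration count)."""
    centroids = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: dist(p, centroids[i]))].append(p)
        centroids = [tuple(sum(vals) / len(c) for vals in zip(*c)) if c
                     else centroids[i] for i, c in enumerate(clusters)]
    return centroids, clusters

def nn_kmknn(query, centroids, clusters):
    """Searching stage (1-NN): visit clusters nearest-centroid first, and
    prune any point x with |d(q, c) - d(c, x)| >= current best, since the
    triangle inequality gives d(q, x) >= |d(q, c) - d(c, x)|."""
    best, best_d = None, float("inf")
    for i in sorted(range(len(centroids)), key=lambda i: dist(query, centroids[i])):
        dqc = dist(query, centroids[i])
        for x in clusters[i]:
            if abs(dqc - dist(centroids[i], x)) >= best_d:
                continue                      # cannot beat the current best
            dqx = dist(query, x)
            if dqx < best_d:
                best, best_d = x, dqx
    return best

random.seed(2)
data = [tuple(random.uniform(0, 10) for _ in range(3)) for _ in range(200)]
centroids, clusters = kmeans(data, k=8)
query = (5.0, 5.0, 5.0)
```

The real algorithm precomputes and sorts the within-cluster distances d(c, x) during buildup, so the pruning test costs no extra distance evaluations at query time; the sketch recomputes them for brevity.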
Metric diffusion along foliations
Walczak, Szymon M
2017-01-01
Up-to-date research in metric diffusion along compact foliations is presented in this book. Beginning with fundamentals from optimal transportation theory and the theory of foliations, this book moves on to cover the Wasserstein distance, the Kantorovich Duality Theorem, and the metrization of the weak topology by the Wasserstein distance. Metric diffusion is defined, the topology of the metric space is studied, and the limits of diffused metrics along compact foliations are discussed. Essentials on foliations, holonomy, heat diffusion, and compact foliations are detailed and vital technical lemmas are proved to aid understanding. Graduate students and researchers in the geometry, topology and dynamics of foliations and laminations will find this supplement useful as it presents facts about metric diffusion along non-compact foliations and provides a full description of the limit for metrics diffused along a foliation with at least one compact leaf in two dimensions.
Johnson, Stephen B.; Ghoshal, Sudipto; Haste, Deepak; Moore, Craig
2017-01-01
This paper describes the theory and considerations in the application of metrics to measure the effectiveness of fault management. Fault management refers here to the operational aspect of system health management, and as such is considered as a meta-control loop that operates to preserve or maximize the system's ability to achieve its goals in the face of current or prospective failure. As a suite of control loops, the metrics to estimate and measure the effectiveness of fault management are similar to those of classical control loops in being divided into two major classes: state estimation, and state control. State estimation metrics can be classified into lower-level subdivisions for detection coverage, detection effectiveness, fault isolation and fault identification (diagnostics), and failure prognosis. State control metrics can be classified into response determination effectiveness and response effectiveness. These metrics are applied to each and every fault management control loop in the system, for each failure to which they apply, and probabilistically summed to determine the effectiveness of these fault management control loops to preserve the relevant system goals that they are intended to protect.
Completion of a Dislocated Metric Space
Directory of Open Access Journals (Sweden)
P. Sumati Kumari
2015-01-01
We provide a construction for the completion of a dislocated metric space (abbreviated d-metric space); we also prove that the completion of the metric associated with a d-metric coincides with the metric associated with the completion of the d-metric.
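A standard illustration of a dislocated metric (ours, not from the paper): d(x, y) = max(|x|, |y|) on the reals satisfies the d-metric axioms, yet the self-distance d(x, x) = |x| need not vanish:

```python
import itertools

def d(x, y):
    """Dislocated metric on the reals: d(x, y) = max(|x|, |y|).
    d(x, y) = 0 forces x = y = 0; symmetry and the triangle inequality
    hold; but d(x, x) = |x| is nonzero away from the origin."""
    return max(abs(x), abs(y))

pts = [-3.0, -1.5, 0.0, 0.5, 2.0, 4.0]
zero_implies_equal = all(d(x, y) != 0.0 or x == y for x in pts for y in pts)
symmetric = all(d(x, y) == d(y, x) for x in pts for y in pts)
triangular = all(d(x, z) <= d(x, y) + d(y, z)
                 for x, y, z in itertools.product(pts, repeat=3))
nonzero_self_distance = d(2.0, 2.0) == 2.0
```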
Contrast-based sensorless adaptive optics for retinal imaging.
Zhou, Xiaolin; Bedggood, Phillip; Bui, Bang; Nguyen, Christine T O; He, Zheng; Metha, Andrew
2015-09-01
Conventional adaptive optics ophthalmoscopes use wavefront sensing methods to characterize ocular aberrations for real-time correction. However, there are important situations in which the wavefront sensing step is susceptible to difficulties that affect the accuracy of the correction. To circumvent these, wavefront sensorless adaptive optics (or non-wavefront sensing AO; NS-AO) imaging has recently been developed and has been applied to point-scanning based retinal imaging modalities. In this study we show, for the first time, contrast-based NS-AO ophthalmoscopy for full-frame in vivo imaging of human and animal eyes. We suggest a robust image quality metric that could be used for any imaging modality, and test its performance against other metrics using (physical) model eyes.
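One simple contrast-style image-quality metric, shown here as an illustrative stand-in rather than the metric the authors propose, is RMS contrast, which grows as structure comes into focus:

```python
import math

def rms_contrast(image):
    """RMS contrast of a grayscale image (rows of floats): the standard
    deviation of pixel intensities. In sensorless AO a scalar like this
    is maximized while the corrector element is perturbed."""
    pixels = [p for row in image for p in row]
    mean = sum(pixels) / len(pixels)
    return math.sqrt(sum((p - mean) ** 2 for p in pixels) / len(pixels))

blurred = [[0.4, 0.5], [0.5, 0.6]]            # low-contrast frame
sharp = [[0.1, 0.9], [0.9, 0.1]]              # same mean, sharper structure
```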
Feature Selection and Predictors of Falls with Foot Force Sensors Using KNN-Based Algorithms
Directory of Open Access Journals (Sweden)
Shengyun Liang
2015-11-01
The aging process may lead to the degradation of lower extremity function in the elderly population, which can restrict their daily quality of life and gradually increase the fall risk. We aimed to determine whether objective measures of physical function could predict subsequent falls. Ground reaction force (GRF) data, quantified by sample entropy, were collected by foot force sensors. Thirty-eight subjects (23 fallers and 15 non-fallers) participated in functional movement tests, including walking and sit-to-stand (STS). A feature selection algorithm was used to select relevant features to classify the elderly into two groups, at risk and not at risk of falling down, using three KNN-based classifiers: local mean-based k-nearest neighbor (LMKNN), pseudo nearest neighbor (PNN), and local mean pseudo nearest neighbor (LMPNN). We compared classification performances, and achieved the best results with LMPNN, with sensitivity, specificity and accuracy all 100%. Moreover, a subset of GRFs was significantly different between the two groups via the Wilcoxon rank sum test, which is compatible with the classification results. This method could potentially be used by non-experts to monitor balance and the risk of falling down in the elderly population.
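The LMKNN variant used above classifies a query by its distance to the local mean of the k nearest neighbours within each class, rather than by majority vote. A minimal sketch with hypothetical gait features:

```python
import math

def lmknn(query, train, k=3):
    """Local mean-based k-NN: per class, average the k training vectors
    nearest the query and assign the class whose local mean is closest."""
    def dist(a, b):
        return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

    best_label, best_d = None, float("inf")
    for label in {l for _, l in train}:
        members = [x for x, l in train if l == label]
        near = sorted(members, key=lambda x: dist(query, x))[:k]
        mean = tuple(sum(c) / len(near) for c in zip(*near))
        if dist(query, mean) < best_d:
            best_label, best_d = label, dist(query, mean)
    return best_label

# Hypothetical features: (sample-entropy score, sit-to-stand time in s).
train = [((0.9, 14.0), "faller"), ((1.0, 15.0), "faller"),
         ((0.8, 13.5), "faller"), ((0.3, 8.0), "non-faller"),
         ((0.4, 7.5), "non-faller"), ((0.2, 8.5), "non-faller")]
```

Averaging before measuring distance makes the decision less sensitive to a single outlying neighbour than plain majority-vote k-NN.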
Metrics with vanishing quantum corrections
International Nuclear Information System (INIS)
Coley, A A; Hervik, S; Gibbons, G W; Pope, C N
2008-01-01
We investigate solutions of the classical Einstein or supergravity equations that solve any set of quantum corrected Einstein equations in which the Einstein tensor plus a multiple of the metric is equated to a symmetric conserved tensor T_μν(g_αβ, ∂_τ g_αβ, ∂_τ ∂_σ g_αβ, ...) constructed from sums of terms involving contractions of the metric and powers of arbitrary covariant derivatives of the curvature tensor. A classical solution, such as an Einstein metric, is called universal if, when evaluated on that Einstein metric, T_μν is a multiple of the metric. A Ricci flat classical solution is called strongly universal if, when evaluated on that Ricci flat metric, T_μν vanishes. It is well known that pp-waves in four spacetime dimensions are strongly universal. We focus attention on a natural generalization: Einstein metrics with holonomy Sim(n - 2) in which all scalar invariants are zero or constant. In four dimensions we demonstrate that the generalized Ghanam-Thompson metric is weakly universal and that the Goldberg-Kerr metric is strongly universal; indeed, we show that universality extends to all four-dimensional Sim(2) Einstein metrics. We also discuss generalizations to higher dimensions.
Directory of Open Access Journals (Sweden)
Bessem Samet
2013-01-01
In 2005, Mustafa and Sims (2006) introduced and studied a new class of generalized metric spaces, called G-metric spaces, as a generalization of metric spaces. We establish some useful propositions to show that many fixed point theorems on (nonsymmetric) G-metric spaces given recently by many authors follow directly from well-known theorems on metric spaces. Our technique can be easily extended to other results, as shown in an application.
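The reduction the authors exploit can be illustrated with the standard perimeter construction: any metric d induces a G-metric G(x, y, z) = d(x, y) + d(y, z) + d(x, z). A quick numerical check of symmetry and the rectangle inequality (axiom G5), using d(a, b) = |a − b| on a small point set:

```python
import itertools

def G(x, y, z):
    """Perimeter G-metric induced by the ordinary metric d(a, b) = |a - b|:
    G(x, y, z) = d(x, y) + d(y, z) + d(x, z)."""
    d = lambda a, b: abs(a - b)
    return d(x, y) + d(y, z) + d(x, z)

pts = [-2.0, -0.5, 0.0, 1.0, 3.0]
symmetric = all(G(x, y, z) == G(y, x, z) == G(x, z, y)
                for x, y, z in itertools.product(pts, repeat=3))
# Axiom (G5), the rectangle inequality: G(x, y, z) <= G(x, a, a) + G(a, y, z).
rectangle = all(G(x, y, z) <= G(x, a, a) + G(a, y, z) + 1e-12
                for x, y, z, a in itertools.product(pts, repeat=4))
```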
Metrics correlation and analysis service (MCAS)
International Nuclear Information System (INIS)
Baranovski, Andrew; Dykstra, Dave; Garzoglio, Gabriele; Hesselroth, Ted; Mhashilkar, Parag; Levshina, Tanya
2010-01-01
The complexity of Grid workflow activities and their associated software stacks inevitably involves multiple organizations, ownership, and deployment domains. In this setting, important and common tasks such as the correlation and display of metrics and debugging information (fundamental ingredients of troubleshooting) are challenged by the informational entropy inherent to independently maintained and operated software components. Because such an information pool is disorganized, it is a difficult environment for business intelligence analysis i.e. troubleshooting, incident investigation, and trend spotting. The mission of the MCAS project is to deliver a software solution to help with adaptation, retrieval, correlation, and display of workflow-driven data and of type-agnostic events, generated by loosely coupled or fully decoupled middleware.
Metric-adjusted skew information
DEFF Research Database (Denmark)
Liang, Cai; Hansen, Frank
2010-01-01
We give a truly elementary proof of the convexity of metric-adjusted skew information following an idea of Effros. We extend earlier results of weak forms of superadditivity to general metric-adjusted skew information. Recently, Luo and Zhang introduced the notion of semi-quantum states on a bipartite system and proved superadditivity of the Wigner-Yanase-Dyson skew informations for such states. We extend this result to the general metric-adjusted skew information. We finally show that a recently introduced extension to parameter values 1 ... of (unbounded) metric-adjusted skew information.
Li, Xiaohui; Yang, Sibo; Fan, Rongwei; Yu, Xin; Chen, Deying
2018-06-01
In this paper, discrimination of soft tissues using laser-induced breakdown spectroscopy (LIBS) in combination with multivariate statistical methods is presented. Fresh pork fat, skin, ham, loin and tenderloin muscle tissues are manually cut into slices and ablated using a 1064 nm pulsed Nd:YAG laser. Discrimination analyses between fat, skin and muscle tissues, and further between highly similar ham, loin and tenderloin muscle tissues, are performed based on the LIBS spectra in combination with multivariate statistical methods, including principal component analysis (PCA), k nearest neighbors (kNN) classification, and support vector machine (SVM) classification. Performances of the discrimination models, including accuracy, sensitivity and specificity, are evaluated using 10-fold cross validation. The classification models are optimized to achieve best discrimination performances. The fat, skin and muscle tissues can be definitely discriminated using both kNN and SVM classifiers, with accuracy of over 99.83%, sensitivity of over 0.995 and specificity of over 0.998. The highly similar ham, loin and tenderloin muscle tissues can also be discriminated with acceptable performances. The best performances are achieved with SVM classifier using Gaussian kernel function, with accuracy of 76.84%, sensitivity of over 0.742 and specificity of over 0.869. The results show that the LIBS technique assisted with multivariate statistical methods could be a powerful tool for online discrimination of soft tissues, even for tissues of high similarity, such as muscles from different parts of the animal body. This technique could be used for discrimination of tissues suffering minor clinical changes, thus may advance the diagnosis of early lesions and abnormalities.
2009-08-01
[Table excerpt] MSF Category: Neighbors and Stakeholders (NS). Conceptual metric NS1: "Walkable" on-base community design (clustering of facilities, presence of sidewalks, reduced need for a car, access to public transit); adapted from LEED for Neighborhood Development (ND), a 0-100 index based on the score of walkable-community indicators.
Software metrics: Software quality metrics for distributed systems. [reliability engineering
Post, J. V.
1981-01-01
Software quality metrics were extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.
González-Brenes, José P.; Huang, Yun
2015-01-01
Classification evaluation metrics are often used to evaluate adaptive tutoring systems -- programs that teach and adapt to humans. Unfortunately, it is not clear how intuitive these metrics are for practitioners with little machine learning background. Moreover, our experiments suggest that existing conventions for evaluating tutoring systems may…
The metric system: An introduction
Lumley, Susan M.
On 13 Jul. 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on 25 Jul. 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first we examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.
The metric system: An introduction
Energy Technology Data Exchange (ETDEWEB)
Lumley, S.M.
1995-05-01
On July 13, 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on July 25, 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first we examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.
Attack-Resistant Trust Metrics
Levien, Raph
The Internet is an amazingly powerful tool for connecting people, unmatched in human history. Yet with that power comes great potential for spam and abuse. Trust metrics attempt to compute which people are trustworthy and which are likely attackers. This chapter presents two specific trust metrics developed and deployed on the Advogato website, a community blog for free software developers. This real-world experience demonstrates that the trust metrics fulfilled their goals, but that for good results it is important to match the assumptions of the abstract trust metric computation to the real-world implementation.
International Nuclear Information System (INIS)
Hill, P; Labby, Z; Bayliss, R A; Geurts, M; Bayouth, J
2015-01-01
Purpose: To develop a plan comparison tool that will ensure robustness and deliverability through analysis of baseline and online-adaptive radiotherapy plans using similarity metrics. Methods: The ViewRay MRIdian treatment planning system allows export of a plan file that contains plan and delivery information. A software tool was developed to read and compare two plans, providing information and metrics to assess their similarity. In addition to performing direct comparisons (e.g. demographics, ROI volumes, number of segments, total beam-on time), the tool computes and presents histograms of derived metrics (e.g. step-and-shoot segment field sizes, segment average leaf gaps). Such metrics were investigated for their ability to predict whether an online-adapted plan is reasonably similar to a baseline plan whose deliverability has already been established. Results: In the realm of online-adaptive planning, comparing ROI volumes offers a sanity check to verify observations found during contouring. Beyond ROI analysis, it has been found that simply editing contours and re-optimizing to adapt treatment can produce a delivery that is substantially different from the baseline plan (e.g. number of segments increased by 31%), with no changes in optimization parameters and only minor changes in anatomy. Currently the tool can quickly identify large omissions or deviations from baseline expectations. As our online-adaptive patient population increases, we will continue to develop and refine quantitative acceptance criteria for adapted plans and relate them to historical delivery QA measurements. Conclusion: The plan comparison tool is in clinical use and reports a wide range of comparison metrics, illustrating key differences between two plans. This independent check is accomplished in seconds and can be performed in parallel to other tasks in the online-adaptive workflow. Current use prevents large planning or delivery errors from occurring, and ongoing refinements will lead to
An Adaptive Handover Prediction Scheme for Seamless Mobility Based Wireless Networks
Directory of Open Access Journals (Sweden)
Ali Safa Sadiq
2014-01-01
We propose an adaptive handover prediction (AHP) scheme for seamless-mobility wireless networks. The AHP scheme incorporates fuzzy logic into the AP prediction process in order to lend cognitive capability to handover decision making. Selection metrics, including received signal strength (RSS), the mobile node's relative direction towards the access points in the vicinity, and access point load, are collected and used as inputs to the fuzzy decision-making system in order to select the most preferable AP among the surrounding WLANs. The handover decision, which is based on the quality cost calculated by the fuzzy inference system, relies on adaptable rather than fixed coefficients: the mean and standard deviation of the normalized network prediction metrics, collected from the available WLANs, are obtained adaptively and applied as statistical information to adjust the coefficients of the membership functions. In addition, we propose an adjustable weight vector for the input metrics in order to cope with the continuous, unpredictable variation in their membership degrees. Furthermore, handover decisions are performed in each mobile node (MN) independently, once the RSS, direction toward APs, and AP load are known. Finally, performance evaluation of the proposed scheme shows its superiority over representative prediction approaches.
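The adaptive-coefficient idea above can be sketched in a much simplified form: instead of a full fuzzy inference system, the snippet below z-scores each selection metric using the mean and standard deviation observed across the currently visible APs (the "adaptable coefficients"), then ranks APs by a weighted quality score. All AP values and weights are invented for illustration:

```python
import numpy as np

# Hypothetical AP candidates: rows = APs, columns = (RSS in dBm,
# relative direction score in [0, 1], load in %). Values are illustrative.
aps = np.array([
    [-60.0, 0.9, 30.0],
    [-75.0, 0.6, 10.0],
    [-55.0, 0.2, 80.0],
])

# Adaptive normalization: z-score each metric using the mean and std
# observed across the currently visible APs, so the coefficients adapt
# to the local radio environment rather than being fixed.
z = (aps - aps.mean(axis=0)) / aps.std(axis=0)

# Higher RSS and direction score are better; higher load is worse.
weights = np.array([0.5, 0.3, -0.2])
quality = z @ weights
best_ap = int(np.argmax(quality))   # index of the most preferable AP
```

In the paper's scheme the weight vector is itself adjustable and the score comes from fuzzy membership functions; this fixed-weight ranking only illustrates the normalization step.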
Symmetries of the dual metrics
International Nuclear Information System (INIS)
Baleanu, D.
1998-01-01
The geometric duality between the metric g_{μν} and a Killing tensor K_{μν} is studied. Conditions are found under which the symmetries of the metric g_{μν} and the dual metric K_{μν} are the same. Dual spinning space is constructed without the introduction of torsion. The general results are applied to the case of the Kerr-Newman metric.
Directory of Open Access Journals (Sweden)
Kihong Kim
2018-02-01
Various kinds of metrics used for the quantitative evaluation of scholarly journals are reviewed. The impact factor and related metrics, including the immediacy index and the aggregate impact factor, which are provided by the Journal Citation Reports, are explained in detail. The Eigenfactor score and the article influence score are also reviewed. In addition, journal metrics such as CiteScore, Source Normalized Impact per Paper, SCImago Journal Rank, h-index, and g-index are discussed, and their limitations and problems are pointed out. We should be cautious not to rely too heavily on these quantitative measures when evaluating journals or researchers.
Holographic Spherically Symmetric Metrics
Petri, Michael
The holographic principle (HP) conjectures, that the maximum number of degrees of freedom of any realistic physical system is proportional to the system's boundary area. The HP has its roots in the study of black holes. It has recently been applied to cosmological solutions. In this article we apply the HP to spherically symmetric static space-times. We find that any regular spherically symmetric object saturating the HP is subject to tight constraints on the (interior) metric, energy-density, temperature and entropy-density. Whenever gravity can be described by a metric theory, gravity is macroscopically scale invariant and the laws of thermodynamics hold locally and globally, the (interior) metric of a regular holographic object is uniquely determined up to a constant factor and the interior matter-state must follow well defined scaling relations. When the metric theory of gravity is general relativity, the interior matter has an overall string equation of state (EOS) and a unique total energy-density. Thus the holographic metric derived in this article can serve as simple interior 4D realization of Mathur's string fuzzball proposal. Some properties of the holographic metric and its possible experimental verification are discussed. The geodesics of the holographic metric describe an isotropically expanding (or contracting) universe with a nearly homogeneous matter-distribution within the local Hubble volume. Due to the overall string EOS the active gravitational mass-density is zero, resulting in a coasting expansion with Ht = 1, which is compatible with the recent GRB-data.
Using Genetic Algorithms for Building Metrics of Collaborative Systems
Directory of Open Access Journals (Sweden)
Cristian CIUREA
2011-01-01
The paper's objective is to reveal the importance of genetic algorithms in building robust metrics of collaborative systems. The main types of collaborative systems in the economy are presented and some characteristics of genetic algorithms are described. A genetic algorithm was implemented in order to determine the local maximum and minimum points of the relative complexity function associated with a collaborative banking system. The intelligent collaborative systems based on genetic algorithms, representing the new generation of collaborative systems, are analyzed, and the implementation of auto-adaptive interfaces in a banking application is described.
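A minimal sketch of the kind of genetic algorithm the abstract describes, applied to finding the maximum and minimum of a one-dimensional function. The `complexity` function below is an invented stand-in (the paper's relative complexity function is not given), and population size, mutation scale, and bounds are illustrative:

```python
import random

random.seed(42)

# Illustrative stand-in for the "relative complexity function" of a
# collaborative banking system; single maximum at x = 2.
def complexity(x):
    return -(x - 2.0) ** 2 + 3.0

POP, GENS, LO, HI = 30, 60, -10.0, 10.0

def evolve(fitness):
    """Maximize `fitness` over [LO, HI] with a simple real-valued GA."""
    pop = [random.uniform(LO, HI) for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: POP // 2]                    # selection: keep top half
        children = []
        while len(elite) + len(children) < POP:
            a, b = random.sample(elite, 2)
            child = (a + b) / 2.0                  # crossover: average parents
            child += random.gauss(0.0, 0.5)        # mutation: gaussian jitter
            children.append(min(HI, max(LO, child)))
        pop = elite + children
    return max(pop, key=fitness)

x_max = evolve(complexity)                  # local maximum of complexity
x_min = evolve(lambda x: -complexity(x))    # minimum (here at the boundary)
```

Maximizing the negated function reuses the same GA to locate minima, which is how a single implementation can report both extrema of the complexity function.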
On prognostic models, artificial intelligence and censored observations.
Anand, S S; Hamilton, P W; Hughes, J G; Bell, D A
2001-03-01
The development of prognostic models for assisting medical practitioners with decision making is not a trivial task. Models need to possess a number of desirable characteristics and few, if any, current modelling approaches based on statistical or artificial intelligence can produce models that display all these characteristics. The inability of modelling techniques to provide truly useful models has led to interest in these models being purely academic in nature. This in turn has resulted in only a very small percentage of models that have been developed being deployed in practice. On the other hand, new modelling paradigms are being proposed continuously within the machine learning and statistical community and claims, often based on inadequate evaluation, being made on their superiority over traditional modelling methods. We believe that for new modelling approaches to deliver true net benefits over traditional techniques, an evaluation centric approach to their development is essential. In this paper we present such an evaluation centric approach to developing extensions to the basic k-nearest neighbour (k-NN) paradigm. We use standard statistical techniques to enhance the distance metric used and a framework based on evidence theory to obtain a prediction for the target example from the outcome of the retrieved exemplars. We refer to this new k-NN algorithm as Censored k-NN (Ck-NN). This reflects the enhancements made to k-NN that are aimed at providing a means for handling censored observations within k-NN.
Metric regularity and subdifferential calculus
International Nuclear Information System (INIS)
Ioffe, A D
2000-01-01
The theory of metric regularity is an extension of two classical results: the Lyusternik tangent space theorem and the Graves surjection theorem. Developments in non-smooth analysis in the 1980s and 1990s paved the way for a number of far-reaching extensions of these results. It was also well understood that the phenomena behind the results are of metric origin, not connected with any linear structure. At the same time it became clear that some basic hypotheses of the subdifferential calculus are closely connected with the metric regularity of certain set-valued maps. The survey is devoted to the metric theory of metric regularity and its connection with subdifferential calculus in Banach spaces
Context-dependent ATC complexity metric
Mercado Velasco, G.A.; Borst, C.
2015-01-01
Several studies have investigated Air Traffic Control (ATC) complexity metrics in a search for a metric that could best capture workload. These studies have shown how daunting the search for a universal workload metric (one that could be applied in different contexts: sectors, traffic patterns,
Sindik, Joško; Miljanović, Maja
2017-03-01
The article deals with the issue of research methodology, illustrating the use of known research methods for new purposes. Questionnaires that do not originally have metric characteristics can be called »handy questionnaires«. In this article, the author considers the possibilities of improving their scientific usability, which can primarily be ensured by improving their metric characteristics and consequently using multivariate instead of univariate statistical methods. In order to establish a basis for the application of multivariate statistical procedures, the main idea is to develop strategies for designing measurement instruments from parts of the handy questionnaires. This can be accomplished in two ways: before the data are collected, by redesigning the handy questionnaire (a priori), or after the data have been collected, without modifying the questionnaire (a posteriori). The basic principles of applying these two strategies of metric adaptation of handy questionnaires are described.
DLA Energy Biofuel Feedstock Metrics Study
2012-12-11
Feedstock metrics are defined across the biofuel life cycle (stage 4: biofuel distribution; stage 5: biofuel use): Metric 1, state invasiveness ranking (e.g. moderately/highly invasive); Metric 2, genetically modified organism (GMO) hazard (yes/no and hazard category); Metric 3, species hybridization. Some pathways may utilize GMO microbial or microalgae species across the applicable biofuel life cycles (stages 1–3). The following consequence Metrics 4–6 then
Symmetries of Taub-NUT dual metrics
International Nuclear Information System (INIS)
Baleanu, D.; Codoban, S.
1998-01-01
Recently, geometric duality was analyzed for a metric which admits Killing tensors. An interesting example arises when the manifold has Killing-Yano tensors. The symmetries of the dual metrics in the case of the Taub-NUT metric are investigated. Generic and non-generic symmetries of the dual Taub-NUT metric are analyzed.
Bellet, Aurelien; Sebban, Marc
2015-01-01
Similarity between objects plays an important role in both human cognitive processes and artificial systems for recognition and categorization. How to appropriately measure such similarities for a given task is crucial to the performance of many machine learning, pattern recognition and data mining methods. This book is devoted to metric learning, a set of techniques to automatically learn similarity and distance functions from data that has attracted a lot of interest in machine learning and related fields in the past ten years. In this book, we provide a thorough review of the metric learning…
Technical Privacy Metrics: a Systematic Survey
Wagner, Isabel; Eckhoff, David
2018-01-01
The goal of privacy metrics is to measure the degree of privacy enjoyed by users in a system and the amount of protection offered by privacy-enhancing technologies. In this way, privacy metrics contribute to improving user privacy in the digital world. The diversity and complexity of privacy metrics in the literature makes an informed choice of metrics challenging. As a result, instead of using existing metrics, n...
On Information Metrics for Spatial Coding.
Souza, Bryan C; Pavão, Rodrigo; Belchior, Hindiael; Tort, Adriano B L
2018-04-01
The hippocampal formation is involved in navigation, and its neuronal activity exhibits a variety of spatial correlates (e.g., place cells, grid cells). The quantification of the information encoded by spikes has been standard procedure to identify which cells have spatial correlates. For place cells, most of the established metrics derive from Shannon's mutual information (Shannon, 1948), and convey information rate in bits/s or bits/spike (Skaggs et al., 1993, 1996). Despite their widespread use, the performance of these metrics in relation to the original mutual information metric has never been investigated. In this work, using simulated and real data, we find that the current information metrics correlate less with the accuracy of spatial decoding than the original mutual information metric. We also find that the top informative cells may differ among metrics, and show a surrogate-based normalization that yields comparable spatial information estimates. Since different information metrics may identify different neuronal populations, we discuss current and alternative definitions of spatially informative cells, which affect the metric choice.
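The bits/spike metric of Skaggs et al. (1993) that the abstract refers to can be written as I = Σ_i p_i (r_i / r̄) log2(r_i / r̄), where p_i is the occupancy probability of spatial bin i, r_i the firing rate there, and r̄ the overall mean rate. A small sketch (the function name and the toy 4-bin maps are invented for illustration):

```python
import numpy as np

def skaggs_information(occupancy, rate_map):
    """Spatial information in bits/spike (Skaggs et al., 1993):
    I = sum_i p_i * (r_i / r) * log2(r_i / r),
    where p_i is the occupancy probability of bin i, r_i the firing
    rate in bin i, and r the overall mean rate."""
    p = occupancy / occupancy.sum()
    r = float(np.sum(p * rate_map))          # overall mean firing rate
    ratio = rate_map / r
    # Bins with zero rate contribute zero (limit of x*log x as x -> 0).
    terms = np.where(ratio > 0,
                     p * ratio * np.log2(np.maximum(ratio, 1e-12)),
                     0.0)
    return float(terms.sum())

occ = np.ones(4)                               # equal time in 4 spatial bins
place_cell = np.array([8.0, 0.0, 0.0, 0.0])    # fires in one bin only
uniform_cell = np.array([2.0, 2.0, 2.0, 2.0])  # no spatial tuning

i_place = skaggs_information(occ, place_cell)      # 2.0 bits/spike
i_uniform = skaggs_information(occ, uniform_cell)  # 0.0 bits/spike
```

A perfectly localized cell over 4 equally occupied bins carries log2(4) = 2 bits/spike, while a spatially untuned cell carries none, matching the metric's intent.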
Energy Technology Data Exchange (ETDEWEB)
Meerschaert, R; Paul, A; Zhuang, L [Department of Oncology, Radiation Oncology Division, Wayne State University School of Medicine, Detroit, MI (United States); Nalichowski, A [Department of Oncology, Radiation Oncology Division, Karmanos Cancer Institute, Detroit, MI (United States); Burmeister, J; Miller, A [Department of Oncology, Radiation Oncology Division, Wayne State University School of Medicine, Detroit, MI (United States); Department of Oncology, Radiation Oncology Division, Karmanos Cancer Institute, Detroit, MI (United States)
2016-06-15
Purpose: To evaluate adaptive daily planning for cervical cancer patients who underwent high-dose-rate intra-cavitary brachytherapy (HDR-ICBT). Methods: This study included 22 cervical cancer patients who underwent 5 fractions of HDR-ICBT. Regions of interest (ROIs), including the high-risk clinical tumor volume (HR-CTV) and organs-at-risk (OARs), were manually contoured on daily CT images. All patients were treated with adaptive daily plans, which involved ROI delineation and dose optimization at each treatment fraction. Single treatment plans were retrospectively generated by applying the first treatment fraction's dwell times (adjusted for decay) and applicator dwell positions to subsequent treatment fractions. Various existing similarity metrics were calculated for the ROIs to quantify inter-fractional organ variations, and a novel similarity score (JRARM) was established, combining both volumetric overlap metrics (DSC, JSC, and RVD) and distance metrics (ASD, MSD, and RMSD). Linear regression was performed to determine the relationship between inter-fractional organ variations in the various similarity metrics and D2cc variations from both plans. Wilcoxon signed-rank tests were used to compare adaptive daily plans and single plans on EQD2 D2cc (α/β=3) for OARs. Results: For inter-fractional organ variations, the sigmoid demonstrated the greatest variations based on the JRARM and DSC similarity metrics. Comparisons between paired ROIs showed differences in JRARM scores and DSCs at each treatment fraction. RVD, MSD, and RMSD were found to be significantly correlated with D2cc variations for the bladder and sigmoid. The comparison between plans found that adaptive daily planning provided lower EQD2 D2cc of OARs than single planning, specifically for the sigmoid (p=0.015). Conclusion: Substantial inter-fractional organ motion can occur during HDR-BT, which may significantly affect the D2cc of OARs. Adaptive daily planning provides improved dose sparing for OARs.
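The volumetric overlap metrics named in this abstract (DSC, JSC, RVD) have standard definitions on binary ROI masks. A small sketch, with a toy 1D "volume" standing in for a real 3D contour mask (the function name and masks are invented for illustration):

```python
import numpy as np

def overlap_metrics(a, b):
    """Volumetric overlap between two binary masks (e.g. an ROI at two
    treatment fractions): Dice similarity coefficient (DSC), Jaccard
    similarity coefficient (JSC), and relative volume difference (RVD)."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    dsc = 2.0 * inter / (a.sum() + b.sum())   # 2|A∩B| / (|A| + |B|)
    jsc = inter / union                        # |A∩B| / |A∪B|
    rvd = (b.sum() - a.sum()) / a.sum()        # (|B| - |A|) / |A|
    return dsc, jsc, rvd

# Toy 1D "volumes": a fraction-1 ROI and a shifted fraction-2 ROI.
roi_fx1 = np.array([0, 1, 1, 1, 1, 0, 0, 0])
roi_fx2 = np.array([0, 0, 1, 1, 1, 1, 0, 0])

dsc, jsc, rvd = overlap_metrics(roi_fx1, roi_fx2)   # 0.75, 0.6, 0.0
```

Note that a pure shift leaves RVD at zero while DSC and JSC drop, which is why the paper's JRARM score also folds in distance metrics (ASD, MSD, RMSD) to capture positional change.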
Generalized Painleve-Gullstrand metrics
Energy Technology Data Exchange (ETDEWEB)
Lin Chunyu [Department of Physics, National Cheng Kung University, Tainan 70101, Taiwan (China)], E-mail: l2891112@mail.ncku.edu.tw; Soo Chopin [Department of Physics, National Cheng Kung University, Tainan 70101, Taiwan (China)], E-mail: cpsoo@mail.ncku.edu.tw
2009-02-02
An obstruction to the implementation of spatially flat Painleve-Gullstrand (PG) slicings is demonstrated, and explicitly discussed for Reissner-Nordstroem and Schwarzschild-anti-deSitter spacetimes. Generalizations of PG slicings which are not spatially flat but which remain regular at the horizons are introduced. These metrics can be obtained from standard spherically symmetric metrics by physical Lorentz boosts. With these generalized PG metrics, problematic contributions to the imaginary part of the action in the Parikh-Wilczek derivation of Hawking radiation due to the obstruction can be avoided.
Kerr metric in the deSitter background
International Nuclear Information System (INIS)
Vaidya, P.C.
1984-01-01
In addition to the Kerr metric with cosmological constant Λ several other metrics are presented giving a Kerr-like solution of Einstein's equations in the background of deSitter universe. A new metric of what may be termed as rotating deSitter space-time devoid of matter but containing null fluid with twisting null rays, has been presented. This metric reduces to the standard deSitter metric when the twist in the rays vanishes. Kerr metric in this background is the immediate generalization of Schwarzschild's exterior metric with cosmological constant. (author)
Kerr metric in cosmological background
Energy Technology Data Exchange (ETDEWEB)
Vaidya, P C [Gujarat Univ., Ahmedabad (India). Dept. of Mathematics
1977-06-01
A metric satisfying Einstein's equation is given which in the vicinity of the source reduces to the well-known Kerr metric and which at large distances reduces to the Robertson-Walker metric of a homogeneous cosmological model. The radius of the event horizon of the Kerr black hole in the cosmological background is found.
Vehicle-to-infrastructure program cooperative adaptive cruise control.
2015-03-01
This report documents the work completed by the Crash Avoidance Metrics Partners LLC (CAMP) Vehicle to Infrastructure (V2I) Consortium during the project titled Cooperative Adaptive Cruise Control (CACC). Participating companies in the V2I Cons...
Just-in-time adaptive classifiers-part II: designing the classifier.
Alippi, Cesare; Roveri, Manuel
2008-12-01
Aging effects, environmental changes, thermal drifts, and soft and hard faults affect physical systems by changing their nature and behavior over time. To cope with process evolution, adaptive solutions must be envisaged to track its dynamics; in this direction, adaptive classifiers are generally designed by assuming the stationarity hypothesis for the process generating the data, with very few results addressing nonstationary environments. This paper proposes a methodology based on k-nearest neighbor (NN) classifiers for designing adaptive classification systems able to react to changing conditions just-in-time (JIT), i.e., exactly when it is needed. k-NN classifiers have been selected for their training phase, which requires no computation, for the possibility of easily estimating the model complexity k, and for keeping the computational complexity of the classifier under control through suitable data reduction mechanisms. A JIT classifier requires temporal detection of a (possible) process deviation (an aspect tackled in a companion paper) followed by adaptive management of the knowledge base (KB) of the classifier to cope with the process change. The novelty of the proposed approach resides in the general framework supporting the real-time update of the KB of the classification system in response to novel information coming from the process, both in stationary conditions (accuracy improvement) and in nonstationary ones (process tracking), and in providing a suitable estimate of k. It is shown that the classification system grants consistency once the change moves the process generating the data to a new stationary state, as is the case in many real applications.
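The KB-management idea above lends itself to a compact sketch: a k-NN classifier whose knowledge base grows in stationary conditions and is replaced when a process change is detected. The change-detection test itself (the companion paper's subject) is outside the sketch, and the class name and toy data are invented:

```python
import numpy as np

class JITKNN:
    """Sketch of a just-in-time k-NN: the knowledge base (KB) is
    extended with fresh supervised samples in stationary conditions,
    and replaced when a change in the data-generating process is
    detected (the detection test itself is not modeled here)."""

    def __init__(self, k=3):
        self.k = k
        self.X = np.empty((0, 1))
        self.y = np.empty(0, dtype=int)

    def update(self, X_new, y_new):
        # Stationary case: accumulate knowledge (accuracy improvement).
        self.X = np.vstack([self.X, X_new])
        self.y = np.concatenate([self.y, y_new])

    def react_to_change(self, X_new, y_new):
        # Nonstationary case: discard the obsolete KB, track the new state.
        self.X = np.asarray(X_new, dtype=float)
        self.y = np.asarray(y_new, dtype=int)

    def predict(self, x):
        d = np.linalg.norm(self.X - x, axis=1)       # no training needed
        votes = self.y[np.argsort(d)[: self.k]]
        return int(np.bincount(votes).argmax())

clf = JITKNN(k=3)
clf.update(np.array([[0.0], [0.2], [1.0], [1.2]]), np.array([0, 0, 1, 1]))
before = clf.predict(np.array([0.1]))     # class 0 under the old process

# Simulated abrupt process change: the class structure flips.
clf.react_to_change(np.array([[0.0], [0.2], [1.0], [1.2]]),
                    np.array([1, 1, 0, 0]))
after = clf.predict(np.array([0.1]))      # class 1 under the new process
```

Because k-NN stores exemplars rather than fitted parameters, swapping or appending to the KB re-trains the classifier at zero cost, which is precisely why the paper selects it for JIT adaptation.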
Directory of Open Access Journals (Sweden)
Isabel Garrido
2016-04-01
The class of metric spaces (X,d known as small-determined spaces, introduced by Garrido and Jaramillo, are properly defined by means of some type of real-valued Lipschitz functions on X. On the other hand, B-simple metric spaces introduced by Hejcman are defined in terms of some kind of bornologies of bounded subsets of X. In this note we present a common framework where both classes of metric spaces can be studied which allows us to see not only the relationships between them but also to obtain new internal characterizations of these metric properties.
Candelas, Philip; de la Ossa, Xenia; McOrist, Jock
2017-12-01
Heterotic vacua of string theory are realised, at large radius, by a compact threefold with vanishing first Chern class together with a choice of stable holomorphic vector bundle. These form a wide class of potentially realistic four-dimensional vacua of string theory. Despite all their phenomenological promise, there is little understanding of the metric on the moduli space of these. What is sought is the analogue of special geometry for these vacua. The metric on the moduli space is important in phenomenology as it normalises D-terms and Yukawa couplings. It is also of interest in mathematics, since it generalises the metric, first found by Kobayashi, on the space of gauge field connections, to a more general context. Here we construct this metric, correct to first order in α′, in two ways: first by postulating a metric that is invariant under background gauge transformations of the gauge field, and also by dimensionally reducing heterotic supergravity. These methods agree and the resulting metric is Kähler, as is required by supersymmetry. Checking the metric is Kähler is intricate and the anomaly cancellation equation for the H field plays an essential role. The Kähler potential nevertheless takes a remarkably simple form: it is the Kähler potential of special geometry with the Kähler form replaced by the α′-corrected hermitian form.
On characterizations of quasi-metric completeness
Energy Technology Data Exchange (ETDEWEB)
Dag, H.; Romaguera, S.; Tirado, P.
2017-07-01
Hu proved in [4] that a metric space (X, d) is complete if and only if, for any closed subspace C of (X, d), every Banach contraction on C has a fixed point. Since then several authors have investigated the problem of characterizing metric completeness by means of fixed point theorems. Recently this problem has been studied in the more general context of quasi-metric spaces for different notions of completeness. Here we present a characterization of a kind of completeness for quasi-metric spaces by means of a quasi-metric version of Hu's theorem. (Author)
Directory of Open Access Journals (Sweden)
D.A. Adeniyi
2016-01-01
A major problem of many on-line web sites is the presentation of many choices to the client at a time; this usually results in a strenuous and time-consuming task for the client in finding the right product or information on the site. In this work, we present a study of automatic web usage data mining and a recommendation system based on the current user's behavior through his/her click-stream data on a newly developed Really Simple Syndication (RSS) reader website, in order to provide relevant information to the individual without explicitly asking for it. The k-nearest-neighbor (KNN) classification method has been trained to be used on-line and in real time to identify clients'/visitors' click-stream data, matching it to a particular user group and recommending a tailored browsing option that meets the need of the specific user at a particular time. To achieve this, web users' RSS address files were extracted, cleansed, formatted, and grouped into meaningful sessions, and a data mart was developed. Our results show that the k-nearest-neighbor classifier is transparent, consistent, straightforward, simple to understand, and easier to implement than most other machine learning techniques, specifically when there is little or no prior knowledge about the data distribution.
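The match-to-group-then-recommend step can be sketched as follows. The session features, group labels, and recommendation strings are all invented stand-ins for the paper's RSS click-stream data mart:

```python
import numpy as np

# Hypothetical session features extracted from RSS click-stream data:
# (clicks on news feeds, clicks on sports feeds), with known user groups.
sessions = np.array([[9, 1], [8, 2], [7, 0],    # group 0: news readers
                     [1, 9], [0, 8], [2, 7]])   # group 1: sports readers
groups = np.array([0, 0, 0, 1, 1, 1])

# Illustrative tailored browsing options per user group.
RECOMMENDATIONS = {0: "news headlines feed", 1: "sports scores feed"}

def recommend(session, k=3):
    """Match a visitor's click-stream to the nearest user group by
    majority vote among the k nearest stored sessions, then return
    that group's tailored browsing option."""
    d = np.linalg.norm(sessions - session, axis=1)
    votes = groups[np.argsort(d)[:k]]
    return RECOMMENDATIONS[int(np.bincount(votes).argmax())]

choice = recommend(np.array([8, 1]))   # a news-heavy session
```

Because KNN is instance-based, newly logged sessions can simply be appended to `sessions`/`groups` to keep the recommender current, consistent with the on-line, real-time use the abstract describes.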
Douglas, M R; Davis, M A; Amarello, M; Smith, J J; Schuett, G W; Herrmann, H-W; Holycross, A T; Douglas, M E
2016-04-01
Ecosystems transition quickly in the Anthropocene, whereas biodiversity adapts more slowly. Here we simulated a shifting woodland ecosystem on the Colorado Plateau of western North America by using as its proxy over space and time the fundamental niche of the Arizona black rattlesnake (Crotalus cerberus). We found an expansive (= end-of-Pleistocene) range that contracted sharply (= present), but is blocked topographically by Grand Canyon/Colorado River as it shifts predictably northwestward under moderate climate change (= 2080). Vulnerability to contemporary wildfire was quantified from available records, with forested area reduced more than 27% over 13 years. Both 'ecosystem metrics' underscore how climate and wildfire are rapidly converting the Plateau ecosystem into novel habitat. To gauge potential effects on C. cerberus, we derived a series of relevant 'conservation metrics' (i.e. genetic variability, dispersal capacity, effective population size) by sequencing 118 individuals across 846 bp of mitochondrial (mt)DNA-ATPase8/6. We identified five significantly different clades (net sequence divergence = 2.2%) isolated by drainage/topography, with low dispersal (F_ST = 0.82) and small sizes (2N_ef = 5.2). Our compiled metrics (i.e. small populations, topographic isolation, low dispersal versus conserved niche, vulnerable ecosystem, dispersal barriers) underscore the susceptibility of this woodland specialist to a climate and wildfire tandem. We offer adaptive management scenarios that may counterbalance these metrics and avoid the extirpation of this and other highly specialized, relictual woodland clades.
Structural Evolution of the R-T Phase Boundary in KNN-Based Ceramics
Lv, Xiang
2017-10-04
Although a rhombohedral-tetragonal (R-T) phase boundary is known to substantially enhance the piezoelectric properties of potassium-sodium niobate ceramics, the structural evolution of the R-T phase boundary itself is still unclear. In this work, the structural evolution of the R-T phase boundary from -150 °C to 200 °C is investigated in (0.99-x)K0.5Na0.5Nb1-ySbyO3-0.01CaSnO3-xBi0.5K0.5HfO3 (where x=0~0.05 with y=0.035, and y=0~0.07 with x=0.03) ceramics. Through temperature-dependent powder X-ray diffraction (XRD) patterns and Raman spectra, the structural evolution was determined to be Rhombohedral (R, <-125 °C) → Rhombohedral+Orthorhombic (R+O, -125 °C to 0 °C) → Rhombohedral+Tetragonal (R+T, 0 °C to 150 °C) → dominating Tetragonal (T, 200 °C to Curie temperature (TC)) → Cubic (C, >TC). In addition, enhanced electrical properties (e.g., a direct piezoelectric coefficient (d33) of ~450±5 pC/N, a conversion piezoelectric coefficient (d33*) of ~580±5 pm/V, an electromechanical coupling factor (kp) of ~0.50±0.02, and TC~250 °C), fatigue-free behavior, and good thermal stability were exhibited by the ceramics possessing the R-T phase boundary. This work improves understanding of the physical mechanism behind the R-T phase boundary in KNN-based ceramics and is an important step towards their adoption in practical applications.
Engineering performance metrics
Delozier, R.; Snyder, N.
1993-03-01
Implementation of a Total Quality Management (TQM) approach to engineering work required the development of a system of metrics which would serve as a meaningful management tool for evaluating effectiveness in accomplishing project objectives and in achieving improved customer satisfaction. A team effort was chartered with the goal of developing a system of engineering performance metrics which would measure customer satisfaction, quality, cost effectiveness, and timeliness. The approach to developing this system involved the normal system design phases: conceptual design, detailed design, implementation, and integration. The lessons learned from this effort are explored in this paper; they may provide a starting point for other large engineering organizations seeking to institute a performance measurement system. To facilitate this effort, a team consisting of customers and Engineering staff members was chartered to assist in the development of the metrics system and to ensure that the needs and views of the customers were considered in the development of performance measurements. The development of a system of metrics is no different from the development of any other type of system. It includes the steps of defining performance measurement requirements, measurement process conceptual design, performance measurement and reporting system detailed design, and system implementation and integration.
Muntinga, D.; Bernritter, S.
2017-01-01
The brand is increasingly central to the organization. It is therefore essential to measure the health, performance and development of the brand. Selecting the right brand metrics, however, is a challenge. An enormous number of metrics competes for brand managers' attention. But which
Privacy Metrics and Boundaries
L-F. Pau (Louis-François)
2005-01-01
This paper aims at defining a set of privacy metrics (quantitative and qualitative) for the relation between a privacy protector and an information gatherer. The aims of such metrics are: to allow assessment and comparison of different user scenarios and their differences; for
2010-06-21
... 2010 Summer Study on Enhancing Adaptability of Our Military Forces AGENCY: Department of Defense (DoD... Enhancing Adaptability of our Military Forces will meet in closed session from August 2-13, 2010, in... establish defining metrics and identifying fundamental attributes of an architecture to enhance adaptability...
Silva, Carlos Alberto; Klauberg, Carine; Hudak, Andrew T; Vierling, Lee A; Liesenberg, Veraldo; Bernett, Luiz G; Scheraiber, Clewerson F; Schoeninger, Emerson R
2018-01-01
Accurate forest inventory is of great economic importance for optimizing the entire supply chain in pulp and paper companies. The aim of this study was to estimate stand dominant and mean heights (HD and HM) and tree density (TD) of Pinus taeda plantations located in southern Brazil using in-situ measurements, airborne Light Detection and Ranging (LiDAR) data and k-nearest neighbor (k-NN) imputation. Forest inventory attributes and LiDAR-derived metrics were calculated at 53 regular sample plots, and we used imputation models to retrieve the forest attributes at plot and landscape levels. The best LiDAR-derived metrics to predict HD, HM and TD were H99TH, HSD, SKE and HMIN. The imputation model using the selected metrics was more effective for retrieving height than tree density. The model coefficients of determination (adj. R2) and root mean squared differences (RMSD) for HD, HM and TD were 0.90, 0.94, 0.38 and 6.99, 5.70, 12.92%, respectively. Our results show that LiDAR and k-NN imputation can be used to predict stand heights with high accuracy in Pinus taeda plantations. However, further studies are needed to improve the prediction accuracy of TD and to evaluate and compare the cost of acquisition and processing of LiDAR data against conventional inventory procedures.
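The plot-level k-NN imputation described above can be sketched in a few lines. This is a minimal stand-in, not the authors' pipeline: the metric values, plot attributes and choice of k below are invented for illustration, and the study's actual workflow involved many more metrics and a formal model-selection step.

```python
import math

def knn_impute(target, reference, k=3):
    """Impute a forest attribute for a target plot as the mean of the
    attribute over its k nearest reference plots, where 'nearest' is
    measured in the space of LiDAR-derived metrics."""
    ranked = sorted((math.dist(target, m), v) for m, v in reference)
    nearest = [v for _, v in ranked[:k]]
    return sum(nearest) / len(nearest)

# Toy reference plots: (metric vector, known dominant height HD in m).
# The two metrics could stand in for, e.g., H99TH and HSD.
refs = [((28.0, 2.1), 27.5), ((30.5, 1.8), 29.9),
        ((25.0, 2.5), 24.8), ((29.0, 2.0), 28.4)]
print(round(knn_impute((29.5, 1.9), refs, k=3), 2))  # 28.6
```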
Verification and Validation Challenges for Adaptive Flight Control of Complex Autonomous Systems
Nguyen, Nhan T.
2018-01-01
Autonomy of aerospace systems requires flight control systems that can adapt to complex, uncertain, dynamic environments. In spite of five decades of research in adaptive control, the fact remains that no adaptive control system has ever been deployed on any safety-critical or human-rated production system such as passenger transport aircraft. The problem lies in the difficulty of certifying adaptive control systems, since existing certification methods cannot readily be applied to nonlinear adaptive control systems. Research addressing the notion of metrics for adaptive control began to appear in recent years. These metrics, if accepted, could pave a path towards certification that would potentially lead to the adoption of adaptive control as a future control technology for safety-critical and human-rated production systems. Development of certifiable adaptive control systems represents a major challenge to overcome. Adaptive control systems with learning algorithms will never become part of the future unless it can be proven that they are highly safe and reliable. Rigorous methods for adaptive control software verification and validation must therefore be developed to ensure that adaptive control system software failures will not occur, to verify that the adaptive control system functions as required, to eliminate unintended functionality, and to demonstrate that certification requirements imposed by regulatory bodies such as the Federal Aviation Administration (FAA) can be satisfied. This presentation will discuss some of the technical issues with adaptive flight control and the related V&V challenges.
Energy Technology Data Exchange (ETDEWEB)
Frye, Jason Neal; Veitch, Cynthia K.; Mateski, Mark Elliot; Michalski, John T.; Harris, James Mark; Trevino, Cassandra M.; Maruoka, Scott
2012-03-01
Threats are generally much easier to list than to describe, and much easier to describe than to measure. As a result, many organizations list threats. Fewer describe them in useful terms, and still fewer measure them in meaningful ways. This is particularly true in the dynamic and nebulous domain of cyber threats - a domain that tends to resist easy measurement and, in some cases, appears to defy any measurement. We believe the problem is tractable. In this report we describe threat metrics and models for characterizing threats consistently and unambiguously. The purpose of this report is to support the Operational Threat Assessment (OTA) phase of risk and vulnerability assessment. To this end, we focus on the task of characterizing cyber threats using consistent threat metrics and models. In particular, we address threat metrics and models for describing malicious cyber threats to US FCEB agencies and systems.
Fixed point theory in metric type spaces
Agarwal, Ravi P; O’Regan, Donal; Roldán-López-de-Hierro, Antonio Francisco
2015-01-01
Written by a team of leading experts in the field, this volume presents a self-contained account of the theory, techniques and results in metric type spaces (in particular in G-metric spaces); that is, the text approaches this important area of fixed point analysis beginning from the basic ideas of metric space topology. The text is structured so that it leads the reader from preliminaries and historical notes on metric spaces (in particular G-metric spaces) and on mappings, to Banach type contraction theorems in metric type spaces, fixed point theory in partially ordered G-metric spaces, fixed point theory for expansive mappings in metric type spaces, generalizations, present results and techniques in a very general abstract setting and framework. Fixed point theory is one of the major research areas in nonlinear analysis. This is partly due to the fact that in many real world problems fixed point theory is the basic mathematical tool used to establish the existence of solutions to problems which arise natur...
Deep Transfer Metric Learning.
Junlin Hu; Jiwen Lu; Yap-Peng Tan; Jie Zhou
2016-12-01
Conventional metric learning methods usually assume that the training and test samples are captured in similar scenarios so that their distributions are assumed to be the same. This assumption does not hold in many real visual recognition applications, especially when samples are captured across different data sets. In this paper, we propose a new deep transfer metric learning (DTML) method to learn a set of hierarchical nonlinear transformations for cross-domain visual recognition by transferring discriminative knowledge from the labeled source domain to the unlabeled target domain. Specifically, our DTML learns a deep metric network by maximizing the inter-class variations and minimizing the intra-class variations, and minimizing the distribution divergence between the source domain and the target domain at the top layer of the network. To better exploit the discriminative information from the source domain, we further develop a deeply supervised transfer metric learning (DSTML) method by including an additional objective on DTML, where the output of both the hidden layers and the top layer are optimized jointly. To preserve the local manifold of input data points in the metric space, we present two new methods, DTML with autoencoder regularization and DSTML with autoencoder regularization. Experimental results on face verification, person re-identification, and handwritten digit recognition validate the effectiveness of the proposed methods.
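The two competing objectives in the abstract above (compact classes with large inter-class separation on the labeled source domain, while keeping the source and target distributions close) can be illustrated with a toy, network-free loss. This is only a sketch of the kind of criterion DTML optimizes, not the authors' deep model; the mean-matching term below is a crude stand-in for the distribution-divergence penalty, and all names and data are invented.

```python
def mean(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def sq_dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def dtml_style_loss(source, target, alpha=1.0, beta=1.0):
    """Toy objective in the spirit of DTML: shrink intra-class scatter,
    grow inter-class scatter on the labeled source domain, and penalize
    source/target mismatch via the distance between domain means.
    `source` maps class label -> list of feature vectors (already pushed
    through whatever embedding is being learned); `target` is a plain
    list of unlabeled target-domain vectors."""
    intra = sum(sq_dist(x, mean(xs)) for xs in source.values() for x in xs)
    centers = [mean(xs) for xs in source.values()]
    grand = mean(centers)
    inter = sum(sq_dist(c, grand) for c in centers)
    all_src = [x for xs in source.values() for x in xs]
    mismatch = sq_dist(mean(all_src), mean(target))
    return intra - alpha * inter + beta * mismatch

# Well-separated source classes whose pooled mean matches the target mean
# give a negative (good) loss value.
src = {"a": [[0.0, 0.0], [0.0, 0.2]], "b": [[2.0, 0.0], [2.0, 0.2]]}
tgt = [[1.0, 0.1]]
print(dtml_style_loss(src, tgt) < 0)  # True
```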
Energy functionals for Calabi-Yau metrics
International Nuclear Information System (INIS)
Headrick, M; Nassar, A
2013-01-01
We identify a set of ''energy'' functionals on the space of metrics in a given Kähler class on a Calabi-Yau manifold, which are bounded below and minimized uniquely on the Ricci-flat metric in that class. Using these functionals, we recast the problem of numerically solving the Einstein equation as an optimization problem. We apply this strategy, using the ''algebraic'' metrics (metrics for which the Kähler potential is given in terms of a polynomial in the projective coordinates), to the Fermat quartic and to a one-parameter family of quintics that includes the Fermat and conifold quintics. We show that this method yields approximations to the Ricci-flat metric that are exponentially accurate in the degree of the polynomial (except at the conifold point, where the convergence is polynomial), and therefore orders of magnitude more accurate than the balanced metrics, previously studied as approximations to the Ricci-flat metric. The method is relatively fast and easy to implement. On the theoretical side, we also show that the functionals can be used to give a heuristic proof of Yau's theorem
Pragmatic quality metrics for evolutionary software development models
Royce, Walker
1990-01-01
Due to the large number of product, project, and people parameters which impact large custom software development efforts, measurement of software product quality is a complex undertaking. Furthermore, the absolute perspective from which quality is measured (customer satisfaction) is intangible. While we probably can't say what the absolute quality of a software product is, we can determine the relative quality, the adequacy of this quality with respect to pragmatic considerations, and identify good and bad trends during development. While no two software engineers will ever agree on an optimum definition of software quality, they will agree that the most important perspective of software quality is its ease of change. We can call this flexibility, adaptability, or some other vague term, but the critical characteristic of software is that it is soft. The easier the product is to modify, the easier it is to achieve any other software quality perspective. This paper presents objective quality metrics derived from consistent lifecycle perspectives of rework which, when used in concert with an evolutionary development approach, can provide useful insight to produce better quality per unit cost/schedule or to achieve adequate quality more efficiently. The usefulness of these metrics is evaluated by applying them to a large, real world, Ada project.
Adaptive and dynamic meshing methods for numerical simulations
Acikgoz, Nazmiye
For the numerical simulation of many problems of engineering interest, it is desirable to have an automated mesh adaption tool capable of producing high quality meshes with an affordably low number of mesh points. This is important especially for problems, which are characterized by anisotropic features of the solution and require mesh clustering in the direction of high gradients. Another significant issue in meshing emerges in the area of unsteady simulations with moving boundaries or interfaces, where the motion of the boundary has to be accommodated by deforming the computational grid. Similarly, there exist problems where current mesh needs to be adapted to get more accurate solutions because either the high gradient regions are initially predicted inaccurately or they change location throughout the simulation. To solve these problems, we propose three novel procedures. For this purpose, in the first part of this work, we present an optimization procedure for three-dimensional anisotropic tetrahedral grids based on metric-driven h-adaptation. The desired anisotropy in the grid is dictated by a metric that defines the size, shape, and orientation of the grid elements throughout the computational domain. Through the use of topological and geometrical operators, the mesh is iteratively adapted until the final mesh minimizes a given objective function. In this work, the objective function measures the distance between the metric of each simplex and a target metric, which can be either user-defined (a-priori) or the result of a-posteriori error analysis. During the adaptation process, one tries to decrease the metric-based objective function until the final mesh is compliant with the target within a given tolerance. However, in regions such as corners and complex face intersections, the compliance condition was found to be very difficult or sometimes impossible to satisfy. In order to address this issue, we propose an optimization process based on an ad
Regge calculus from discontinuous metrics
International Nuclear Information System (INIS)
Khatsymovsky, V.M.
2003-01-01
Regge calculus is considered as a particular case of a more general system in which the link lengths of two neighbouring 4-tetrahedra do not necessarily coincide on their common face. This system is treated as one described by a metric that is discontinuous across the faces. In the superspace of all discontinuous metrics, the Regge calculus metrics form a hypersurface defined by continuity conditions. Quantum theory of the discontinuous-metric system is assumed to be fixed somehow in the form of a quantum measure on (the space of functionals on) the superspace. The problem of reducing this measure to the Regge hypersurface is addressed. The quantum Regge calculus measure is defined from the discontinuous-metric measure by inserting a δ-function-like phase factor. The requirement that the continuity conditions be imposed in a 'face-independent' way fixes this factor uniquely. The term 'face-independent' means that the factor depends only on the (hyper)plane spanned by the face, not on its form and size. This requirement seems natural from the viewpoint of the existence of a well-defined continuum limit maximally free of lattice artefacts.
International Nuclear Information System (INIS)
Douglas, Michael R.; Karp, Robert L.; Lukic, Sergio; Reinbacher, Rene
2008-01-01
We develop numerical methods for approximating Ricci flat metrics on Calabi-Yau hypersurfaces in projective spaces. Our approach is based on finding balanced metrics and builds on recent theoretical work by Donaldson. We illustrate our methods in detail for a one parameter family of quintics. We also suggest several ways to extend our results
Metrics for Evaluation of Student Models
Pelanek, Radek
2015-01-01
Researchers use many different metrics for evaluation of performance of student models. The aim of this paper is to provide an overview of commonly used metrics, to discuss properties, advantages, and disadvantages of different metrics, to summarize current practice in educational data mining, and to provide guidance for evaluation of student…
Impact of artefact removal on ChIP quality metrics in ChIP-seq and ChIP-exo data.
Directory of Open Access Journals (Sweden)
Thomas Samuel Carroll
2014-04-01
With the advent of ChIP-seq multiplexing technologies and the subsequent increase in ChIP-seq throughput, the development of working standards for the quality assessment of ChIP-seq studies has received significant attention. The ENCODE consortium’s large scale analysis of transcription factor binding and epigenetic marks as well as concordant work on ChIP-seq by other laboratories has established a new generation of ChIP-seq quality control measures. The use of these metrics alongside common processing steps has however not been evaluated. In this study, we investigate the effects of blacklisting and removal of duplicated reads on established metrics of ChIP-seq quality and show that the interpretation of these metrics is highly dependent on the ChIP-seq preprocessing steps applied. Further to this we perform the first investigation of the use of these metrics for ChIP-exo data and make recommendations for the adaptation of the NSC statistic to allow for the assessment of ChIP-exo efficiency.
Directory of Open Access Journals (Sweden)
Muhammad Bilal
2016-07-01
Sentiment mining is a field of text mining that determines the attitude of people toward a particular product, topic or politician in newsgroup posts, review sites, comments on Facebook posts, Twitter, etc. There are many issues involved in opinion mining. One important issue is that opinions may be written in different languages (English, Urdu, Arabic, etc.). Handling each language according to its orientation is a challenging task. Most research in sentiment mining has been done on the English language. Currently, limited research is being carried out on sentiment classification of other languages like Arabic, Italian, Urdu and Hindi. In this paper, three classification models are used for text classification using the Waikato Environment for Knowledge Analysis (WEKA). Opinions written in Roman Urdu and English are extracted from a blog. These extracted opinions are documented in text files to prepare a training dataset containing 150 positive and 150 negative opinions as labeled examples. A testing data set is supplied to the three models and the results in each case are analyzed. The results show that Naïve Bayes outperformed Decision Tree and KNN in terms of accuracy, precision, recall and F-measure.
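The Naïve Bayes classifier that performed best in the study can be sketched without WEKA; below is a minimal multinomial Naïve Bayes with add-one smoothing. The toy Roman-Urdu/English opinions are invented for illustration and are not from the paper's 300-opinion dataset.

```python
import math
from collections import Counter

def train_nb(docs):
    """docs: list of (tokens, label). Returns counts for a multinomial
    Naive Bayes model with add-one (Laplace) smoothing."""
    labels, words, totals, vocab = Counter(), Counter(), Counter(), set()
    for toks, lab in docs:
        labels[lab] += 1
        for t in toks:
            words[(lab, t)] += 1
            totals[lab] += 1
            vocab.add(t)
    return labels, words, totals, vocab, len(docs)

def classify_nb(model, toks):
    labels, words, totals, vocab, n = model
    def score(lab):
        s = math.log(labels[lab] / n)
        for t in toks:
            s += math.log((words[(lab, t)] + 1) / (totals[lab] + len(vocab)))
        return s
    return max(labels, key=score)

# Toy mixed Roman-Urdu/English training opinions (invented).
train = [("zabardast movie great".split(), "pos"),
         ("bohat acha loved it".split(), "pos"),
         ("bakwas film boring".split(), "neg"),
         ("bura experience hated it".split(), "neg")]
model = train_nb(train)
print(classify_nb(model, "acha movie".split()))  # pos
```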
Issues in Benchmark Metric Selection
Crolotte, Alain
It is true that a metric can influence a benchmark but will esoteric metrics create more problems than they will solve? We answer this question affirmatively by examining the case of the TPC-D metric which used the much debated geometric mean for the single-stream test. We will show how a simple choice influenced the benchmark and its conduct and, to some extent, DBMS development. After examining other alternatives our conclusion is that the “real” measure for a decision-support benchmark is the arithmetic mean.
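The arithmetic/geometric-mean distinction at issue in the TPC-D debate is easy to demonstrate: the geometric mean damps the influence of a single slow query, so improving already-fast queries pays off disproportionately. The per-query times below are hypothetical, not TPC-D results.

```python
import math

def arithmetic_mean(xs):
    return sum(xs) / len(xs)

def geometric_mean(xs):
    # exp of the mean of logs; equivalent to the n-th root of the product.
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

# Hypothetical per-query times (s): one outlier dominates the arithmetic
# mean, while the geometric mean largely shrugs it off.
times = [1.0, 1.0, 1.0, 100.0]
print(arithmetic_mean(times))  # 25.75
print(geometric_mean(times))   # ~3.16 (fourth root of 100)
```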
Robustness of climate metrics under climate policy ambiguity
International Nuclear Information System (INIS)
Ekholm, Tommi; Lindroos, Tomi J.; Savolainen, Ilkka
2013-01-01
Highlights: • We assess the economic impacts of using different climate metrics. • The setting is cost-efficient scenarios for three interpretations of the 2 °C target. • With each target setting, the optimal metric is different. • Therefore policy ambiguity prevents the selection of an optimal metric. • Robust metric values that perform well with multiple policy targets however exist. -- Abstract: A wide array of alternatives has been proposed as the common metrics with which to compare the climate impacts of different emission types. Different physical and economic metrics and their parameterizations give diverse weights between e.g. CH4 and CO2, and fixing the metric from one perspective makes it sub-optimal from another. As the aims of global climate policy involve some degree of ambiguity, it is not possible to determine a metric that would be optimal and consistent with all policy aims. This paper evaluates the cost implications of using predetermined metrics in cost-efficient mitigation scenarios. Three formulations of the 2 °C target, including both deterministic and stochastic approaches, shared a wide range of metric values for CH4 with which the mitigation costs are only slightly above the cost-optimal levels. Therefore, although ambiguity in current policy might prevent us from selecting an optimal metric, it can be possible to select robust metric values that perform well with multiple policy targets
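How a metric weighs CH4 against CO2 can be sketched as a simple CO2-equivalence calculation. The inventory below is hypothetical, and the equivalence factors (28 for a 100-year horizon, 84 for a 20-year horizon) are only commonly cited illustrative values; the paper's point is precisely that no single choice is optimal under an ambiguous policy target.

```python
def co2_equivalent(emissions, metric):
    """Convert an emissions inventory (Mt of each gas) into Mt CO2-eq
    under a chosen metric (a dict of gas -> equivalence factor)."""
    return sum(mass * metric[gas] for gas, mass in emissions.items())

inventory = {"CO2": 100.0, "CH4": 2.0}  # hypothetical quantities, Mt
gwp100 = {"CO2": 1.0, "CH4": 28.0}      # illustrative 100-yr weighting
gwp20 = {"CO2": 1.0, "CH4": 84.0}       # illustrative 20-yr weighting
print(co2_equivalent(inventory, gwp100))  # 156.0
print(co2_equivalent(inventory, gwp20))   # 268.0
```

The same inventory scores very differently under the two metrics, which is exactly the sensitivity the cost analyses in the paper quantify.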
Web metrics for library and information professionals
Stuart, David
2014-01-01
This is a practical guide to using web metrics to measure impact and demonstrate value. The web provides an opportunity to collect a host of different metrics, from those associated with social media accounts and websites to more traditional research outputs. This book is a clear guide for library and information professionals as to what web metrics are available and how to assess and use them to make informed decisions and demonstrate value. As individuals and organizations increasingly use the web in addition to traditional publishing avenues and formats, this book provides the tools to unlock web metrics and evaluate the impact of this content. The key topics covered include: bibliometrics, webometrics and web metrics; data collection tools; evaluating impact on the web; evaluating social media impact; investigating relationships between actors; exploring traditional publications in a new environment; web metrics and the web of data; the future of web metrics and the library and information professional.Th...
Partial rectangular metric spaces and fixed point theorems.
Shukla, Satish
2014-01-01
The purpose of this paper is to introduce the concept of partial rectangular metric spaces as a generalization of rectangular metric and partial metric spaces. Some properties of partial rectangular metric spaces and some fixed point results for quasitype contraction in partial rectangular metric spaces are proved. Some examples are given to illustrate the observed results.
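For orientation, the axioms of a partial rectangular metric space are commonly stated roughly as follows (this is a paraphrase, not a quotation from the paper; consult it for the exact formulation). Unlike an ordinary metric, the self-distance p(x, x) need not vanish, and the triangle inequality is replaced by a four-point "rectangular" inequality:

```latex
% p : X \times X \to [0,\infty) is a partial rectangular metric if,
% for all x, y \in X and all distinct u, v \in X \setminus \{x, y\}:
\begin{align*}
&\text{(PRM1)}\quad x = y \iff p(x,x) = p(y,y) = p(x,y),\\
&\text{(PRM2)}\quad p(x,x) \le p(x,y),\\
&\text{(PRM3)}\quad p(x,y) = p(y,x),\\
&\text{(PRM4)}\quad p(x,y) \le p(x,u) + p(u,v) + p(v,y) - p(u,u) - p(v,v).
\end{align*}
```

Setting p(x, x) = 0 for all x recovers a rectangular metric, and dropping the two-intermediate-point structure recovers a partial metric, which is the sense in which the notion generalizes both.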
International Nuclear Information System (INIS)
Vaidya, P.C.; Patel, L.K.; Bhatt, P.V.
1976-01-01
Using Galilean time and retarded distance as coordinates, the usual Kerr metric is expressed in a form similar to the Newman-Unti-Tamburino (NUT) metric. The combined Kerr-NUT metric is then investigated. In addition to the Kerr and NUT solutions of Einstein's equations, three other types of solutions are derived. These are (i) the radiating Kerr solution, (ii) the radiating NUT solution satisfying R_ik = σ ξ_i ξ_k, ξ_i ξ^i = 0, and (iii) the associated Kerr solution satisfying R_ik = 0. Solution (i) is distinct from and simpler than the one reported earlier by Vaidya and Patel (Phys. Rev. D 7:3590 (1973)). Solutions (ii) and (iii) give line elements which have the axis of symmetry as a singular line. (author)
Background metric in supergravity theories
International Nuclear Information System (INIS)
Yoneya, T.
1978-01-01
In supergravity theories, we investigate the conformal anomaly of the path-integral determinant and the problem of fermion zero modes in the presence of a nontrivial background metric. Except in SO(3)-invariant supergravity, there are nonvanishing conformal anomalies. As a consequence, amplitudes around the nontrivial background metric contain unpredictable arbitrariness. The fermion zero modes which are explicitly constructed for the Euclidean Schwarzschild metric are interpreted as an indication of the supersymmetric multiplet structure of a black hole. The degree of degeneracy of a black hole is 2^(4n) in SO(n) supergravity
Daylight metrics and energy savings
Energy Technology Data Exchange (ETDEWEB)
Mardaljevic, John; Heschong, Lisa; Lee, Eleanor
2009-12-31
The drive towards sustainable, low-energy buildings has increased the need for simple yet accurate methods to evaluate whether a daylit building meets minimum standards for energy and human comfort performance. Current metrics account neither for the temporal and spatial aspects of daylight, nor for occupants' comfort or interventions. This paper reviews the historical basis of current compliance methods for achieving daylit buildings, proposes a technical basis for development of better metrics, and provides two case study examples to stimulate dialogue on how metrics can be applied in a practical, real-world context.
A framework for quantification of groundwater dynamics - concepts and hydro(geo-)logical metrics
Haaf, Ezra; Heudorfer, Benedikt; Stahl, Kerstin; Barthel, Roland
2017-04-01
Fluctuation patterns in groundwater hydrographs are generally assumed to contain information on aquifer characteristics, climate and environmental controls. However, attempts to disentangle this information and map the dominant controls have been few. This is due to the substantial heterogeneity and complexity of groundwater systems, which is reflected in the abundance of morphologies of groundwater time series. To describe the structure and shape of hydrographs, descriptive terms like "slow"/"fast" or "flashy"/"inert" are frequently used, which are subjective, irreproducible and limited. This lack of objective and refined concepts limits approaches for regionalization of hydrogeological characteristics, as well as our understanding of the dominant processes controlling groundwater dynamics. Therefore, we propose a novel framework for groundwater hydrograph characterization in an attempt to categorize morphologies explicitly and quantitatively based on perceptual concepts of aspects of the dynamics. This quantitative framework is inspired by the existing and operational eco-hydrological classification frameworks for streamflow. The need for a new framework for groundwater systems is justified by the fundamental differences between the state variable groundwater head and the flow variable streamflow. Conceptually, we extracted exemplars of specific dynamic patterns, attributing descriptive terms as a means of systematisation. Metrics, primarily taken from the streamflow literature, were subsequently adapted to groundwater and assigned to the described patterns as a means of quantification. In this study, we focused on the particularities of groundwater as a state variable. Furthermore, we investigated the descriptive skill of individual metrics as well as their usefulness for groundwater hydrographs. The ensemble of categorized metrics results in a framework which can be used to describe and quantify groundwater dynamics. It is a promising tool for the setup of a successful
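One concrete example of a streamflow metric adapted to a groundwater state variable is a flashiness index in the style of the Richards-Baker index. The adaptation below (normalizing summed day-to-day head changes by the head range, so the index is scale-free) is our illustrative guess at the kind of metric such a framework collects, not one taken from the paper, and the head series are invented.

```python
def flashiness(heads):
    """Richards-Baker-style flashiness index sketched for a groundwater-head
    series: summed absolute day-to-day change, normalized by the head range.
    'Flashy' wells score high; 'inert' wells score near 1 (monotone drift)."""
    change = sum(abs(b - a) for a, b in zip(heads, heads[1:]))
    rng = max(heads) - min(heads)
    return change / rng if rng else 0.0

inert = [10.0, 10.1, 10.2, 10.3, 10.4]   # slow monotone rise
flashy = [10.0, 10.4, 10.0, 10.4, 10.0]  # rapid oscillation
print(flashiness(inert) < flashiness(flashy))  # True
```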
International Nuclear Information System (INIS)
Roege, Paul E.; Collier, Zachary A.; Mancillas, James; McDonagh, John A.; Linkov, Igor
2014-01-01
Energy lies at the backbone of any advanced society and constitutes an essential prerequisite for economic growth, social order and national defense. However there is an Achilles heel to today's energy and technology relationship; namely a precarious intimacy between energy and the fiscal, social, and technical systems it supports. Recently, widespread and persistent disruptions in energy systems have highlighted the extent of this dependence and the vulnerability of increasingly optimized systems to changing conditions. Resilience is an emerging concept that offers to reconcile considerations of performance under dynamic environments and across multiple time frames by supplementing traditionally static system performance measures to consider behaviors under changing conditions and complex interactions among physical, information and human domains. This paper identifies metrics useful to implement guidance for energy-related planning, design, investment, and operation. Recommendations are presented using a matrix format to provide a structured and comprehensive framework of metrics relevant to a system's energy resilience. The study synthesizes previously proposed metrics and emergent resilience literature to provide a multi-dimensional model intended for use by leaders and practitioners as they transform our energy posture from one of stasis and reaction to one that is proactive and which fosters sustainable growth. - Highlights: • Resilience is the ability of a system to recover from adversity. • There is a need for methods to quantify and measure system resilience. • We developed a matrix-based approach to generate energy resilience metrics. • These metrics can be used in energy planning, system design, and operations
Balanced metrics for vector bundles and polarised manifolds
DEFF Research Database (Denmark)
Garcia Fernandez, Mario; Ross, Julius
2012-01-01
We consider a notion of balanced metrics for triples (X, L, E) which depend on a parameter α, where X is a smooth complex manifold with an ample line bundle L and E is a holomorphic vector bundle over X. For generic choice of α, we prove that the limit of a convergent sequence of balanced metrics leads to a Hermitian-Einstein metric on E and a constant scalar curvature Kähler metric in c_1(L). For special values of α, limits of balanced metrics are solutions of a system of coupled equations relating a Hermitian-Einstein metric on E and a Kähler metric in c_1(L). For this, we compute the top two...
The metrics of science and technology
Geisler, Eliezer
2000-01-01
Dr. Geisler's far-reaching, unique book provides an encyclopedic compilation of the key metrics to measure and evaluate the impact of science and technology on academia, industry, and government. Focusing on such items as economic measures, patents, peer review, and other criteria, and supported by an extensive review of the literature, Dr. Geisler gives a thorough analysis of the strengths and weaknesses inherent in metric design, and in the use of the specific metrics he cites. His book has already received prepublication attention, and will prove especially valuable for academics in technology management, engineering, and science policy; industrial R&D executives and policymakers; government science and technology policymakers; and scientists and managers in government research and technology institutions. Geisler maintains that the application of metrics to evaluate science and technology at all levels illustrates the variety of tools we currently possess. Each metric has its own unique strengths and...
Extending cosmology: the metric approach
Mendoza, S.
2012-01-01
Comment: 2012, Extending Cosmology: The Metric Approach, Open Questions in Cosmology; Review article for an Intech "Open questions in cosmology" book chapter (19 pages, 3 figures). Available from: http://www.intechopen.com/books/open-questions-in-cosmology/extending-cosmology-the-metric-approach
Measuring Information Security: Guidelines to Build Metrics
von Faber, Eberhard
Measuring information security is a genuine interest of security managers. With metrics they can develop their security organization's visibility and standing within the enterprise or public authority as a whole. Organizations using information technology need to use security metrics. Despite the clear demands and advantages, security metrics are often poorly developed or ineffective parameters are collected and analysed. This paper describes best practices for the development of security metrics. First attention is drawn to motivation showing both requirements and benefits. The main body of this paper lists things which need to be observed (characteristic of metrics), things which can be measured (how measurements can be conducted) and steps for the development and implementation of metrics (procedures and planning). Analysis and communication is also key when using security metrics. Examples are also given in order to develop a better understanding. The author wants to resume, continue and develop the discussion about a topic which is or increasingly will be a critical factor of success for any security managers in larger organizations.
Active Metric Learning for Supervised Classification
Kumaran, Krishnan; Papageorgiou, Dimitri; Chang, Yutong; Li, Minhan; Takáč, Martin
2018-01-01
Clustering and classification critically rely on distance metrics that provide meaningful comparisons between data points. We present mixed-integer optimization approaches to find optimal distance metrics that generalize the Mahalanobis metric extensively studied in the literature. Additionally, we generalize and improve upon leading methods by removing reliance on pre-designated "target neighbors," "triplets," and "similarity pairs." Another salient feature of our method is its ability to en...
Multimetric indices: How many metrics?
Multimetric indices (MMIs) often include 5 to 15 metrics, each representing a different attribute of assemblage condition, such as species diversity, tolerant taxa, and nonnative taxa. Is there an optimal number of metrics for MMIs? To explore this question, I created 1000 9-met...
Robustness Metrics: Consolidating the multiple approaches to quantify Robustness
DEFF Research Database (Denmark)
Göhler, Simon Moritz; Eifler, Tobias; Howard, Thomas J.
2016-01-01
robustness metrics; 3) Functional expectancy and dispersion robustness metrics; and 4) Probability of conformance robustness metrics. The goal was to give a comprehensive overview of robustness metrics and guidance to scholars and practitioners to understand the different types of robustness metrics...
Common Metrics for Human-Robot Interaction
Steinfeld, Aaron; Lewis, Michael; Fong, Terrence; Scholtz, Jean; Schultz, Alan; Kaber, David; Goodrich, Michael
2006-01-01
This paper describes an effort to identify common metrics for task-oriented human-robot interaction (HRI). We begin by discussing the need for a toolkit of HRI metrics. We then describe the framework of our work and identify important biasing factors that must be taken into consideration. Finally, we present suggested common metrics for standardization and a case study. Preparation of a larger, more detailed toolkit is in progress.
Narrowing the Gap Between QoS Metrics and Web QoE Using Above-the-fold Metrics
da Hora, Diego Neves; Asrese, Alemnew; Christophides, Vassilis; Teixeira, Renata; Rossi, Dario
2018-01-01
Page load time (PLT) is still the most common application Quality of Service (QoS) metric to estimate the Quality of Experience (QoE) of Web users. Yet, recent literature abounds with proposals for alternative metrics (e.g., Above The Fold, SpeedIndex and variants) that aim at better estimating user QoE. The main purpose of this work is thus to thoroughly investigate a mapping between established and recently proposed objective metrics and user QoE. We obtain ground tr...
Factor structure of the Tomimatsu-Sato metrics
International Nuclear Information System (INIS)
Perjes, Z.
1989-02-01
Based on an earlier result stating that δ = 3 Tomimatsu-Sato (TS) metrics can be factored over the field of integers, an analogous representation for higher TS metrics was sought. It is shown that the factoring property of TS metrics follows from the structure of special Hankel determinants. A set of linear algebraic equations determining the factors was defined, and the factors of the first five TS metrics were tabulated, together with their primitive factors. (R.P.) 4 refs.; 2 tabs
ST-intuitionistic fuzzy metric space with properties
Arora, Sahil; Kumar, Tanuj
2017-07-01
In this paper, we define ST-intuitionistic fuzzy metric space and study the notions of convergence and completeness of Cauchy sequences in it. Further, we prove some properties of ST-intuitionistic fuzzy metric spaces. Finally, we introduce the concept of symmetric ST-intuitionistic fuzzy metric space.
Directory of Open Access Journals (Sweden)
CARLOS ALBERTO SILVA
Full Text Available ABSTRACT Accurate forest inventory is of great economic importance for optimizing the entire supply chain management in pulp and paper companies. The aim of this study was to estimate stand dominant and mean heights (HD and HM) and tree density (TD) of Pinus taeda plantations located in South Brazil using in-situ measurements, airborne Light Detection and Ranging (LiDAR) data and non-parametric k-nearest neighbor (k-NN) imputation. Forest inventory attributes and LiDAR-derived metrics were calculated at 53 regular sample plots, and we used imputation models to retrieve the forest attributes at plot and landscape levels. The best LiDAR-derived metrics to predict HD, HM and TD were H99TH, HSD, SKE and HMIN. The imputation model using the selected metrics was more effective for retrieving height than tree density. The model coefficients of determination (adj.R2) and root mean squared differences (RMSD) for HD, HM and TD were 0.90, 0.94 and 0.38, and 6.99, 5.70 and 12.92%, respectively. Our results show that LiDAR and k-NN imputation can be used to predict stand heights with high accuracy in Pinus taeda. However, further studies are needed to improve the prediction accuracy of TD and to evaluate and compare the cost of acquiring and processing LiDAR data against conventional inventory procedures.
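The k-NN imputation step described in this record can be sketched as follows (an illustrative reimplementation with made-up plot values, not the study's code or data): for a target plot, the k reference plots with the most similar LiDAR metrics are found, and the forest attributes are imputed as the average of their field-measured values.

```python
import math

def knn_impute(target, references, k=3):
    """Impute forest attributes (e.g. HD, HM) for a target plot from the
    k reference plots whose LiDAR metrics are closest in Euclidean
    distance. `references` is a list of (lidar_metrics, attributes) pairs."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    nearest = sorted(references, key=lambda r: dist(r[0], target))[:k]
    n_attr = len(nearest[0][1])
    return [sum(r[1][i] for r in nearest) / k for i in range(n_attr)]

# Hypothetical plots: LiDAR metrics (H99TH, HSD) -> attributes (HD, HM)
refs = [((30.0, 2.1), (28.0, 25.0)),
        ((25.0, 1.8), (23.0, 21.0)),
        ((35.0, 2.5), (33.0, 30.0)),
        ((10.0, 0.9), (9.0, 8.0))]
print(knn_impute((29.0, 2.0), refs, k=3))
```

In practice the neighbor search runs over many plots and the distance may be computed in a transformed metric space; the averaging step is the essence of k-NN imputation.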
Non-common path aberration correction in an adaptive optics scanning ophthalmoscope.
Sulai, Yusufu N; Dubra, Alfredo
2014-09-01
The correction of non-common path aberrations (NCPAs) between the imaging and wavefront sensing channel in a confocal scanning adaptive optics ophthalmoscope is demonstrated. NCPA correction is achieved by maximizing an image sharpness metric while the confocal detection aperture is temporarily removed, effectively minimizing the monochromatic aberrations in the illumination path of the imaging channel. Comparison of NCPA estimated using zonal and modal orthogonal wavefront corrector bases provided wavefronts that differ by ~λ/20 in root-mean-squared (~λ/30 standard deviation). Sequential insertion of a cylindrical lens in the illumination and light collection paths of the imaging channel was used to compare image resolution after changing the wavefront correction to maximize image sharpness and intensity metrics. Finally, the NCPA correction was incorporated into the closed-loop adaptive optics control by biasing the wavefront sensor signals without reducing its bandwidth.
A new anisotropic mesh adaptation method based upon hierarchical a posteriori error estimates
Huang, Weizhang; Kamenski, Lennard; Lang, Jens
2010-03-01
A new anisotropic mesh adaptation strategy for finite element solution of elliptic differential equations is presented. It generates anisotropic adaptive meshes as quasi-uniform ones in some metric space, with the metric tensor being computed based on hierarchical a posteriori error estimates. A global hierarchical error estimate is employed in this study to obtain reliable directional information of the solution. Instead of solving the global error problem exactly, which is costly in general, we solve it iteratively using the symmetric Gauß-Seidel method. Numerical results show that a few GS iterations are sufficient for obtaining a reasonably good approximation to the error for use in anisotropic mesh adaptation. The new method is compared with several strategies using local error estimators or recovered Hessians. Numerical results are presented for a selection of test examples and a mathematical model for heat conduction in a thermal battery with large orthotropic jumps in the material coefficients.
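The symmetric Gauss-Seidel iteration used for the global error problem can be illustrated generically (a minimal dense-matrix sketch on a hypothetical system, not the authors' finite element implementation): each sweep updates the unknowns in forward order and then again in backward order.

```python
def sym_gauss_seidel(A, b, x, sweeps=10):
    """A few symmetric Gauss-Seidel sweeps on A x = b. A is a dense
    list-of-lists purely for illustration; in a finite element code the
    same update runs over a sparse stiffness matrix."""
    n = len(b)
    for _ in range(sweeps):
        for order in (range(n), reversed(range(n))):
            for i in order:
                s = sum(A[i][j] * x[j] for j in range(n) if j != i)
                x[i] = (b[i] - s) / A[i][i]
    return x

# Hypothetical symmetric positive definite system with solution (1, 1, 1)
A = [[4.0, 1.0, 0.0], [1.0, 4.0, 1.0], [0.0, 1.0, 4.0]]
b = [5.0, 6.0, 5.0]
print(sym_gauss_seidel(A, b, [0.0, 0.0, 0.0]))
```

As the abstract notes, a few such sweeps already give a good enough approximation of the error for driving the mesh adaptation.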
Pragmatic security metrics applying metametrics to information security
Brotby, W Krag
2013-01-01
Other books on information security metrics discuss number theory and statistics in academic terms. Light on mathematics and heavy on utility, PRAGMATIC Security Metrics: Applying Metametrics to Information Security breaks the mold. This is the ultimate how-to-do-it guide for security metrics. Packed with time-saving tips, the book offers easy-to-follow guidance for those struggling with security metrics. Step by step, it clearly explains how to specify, develop, use, and maintain an information security measurement system (a comprehensive suite of metrics) to
Defining a Progress Metric for CERT RMM Improvement
2017-09-14
REV-03.18.2016.0 Defining a Progress Metric for CERT-RMM Improvement. Gregory Crabb, Nader Mehravari, David Tobar. September 2017. TECHNICAL ...fendable resource allocation decisions. Technical metrics measure aspects of controls implemented through technology (systems, software, hardware). ...implementation metric would be the percentage of users who have received anti-phishing training. • Effectiveness/efficiency metrics measure whether
Directory of Open Access Journals (Sweden)
2007-01-01
Full Text Available Many software and IT projects fail to complete their objectives for different causes, among which project management carries a high weight. In order to have successful projects, lessons learned have to be used, historical data collected, and metrics and indicators computed and compared with those of past projects to avoid repeating failures. This paper presents some metrics that can be used for IT project management.
Mass Customization Measurements Metrics
DEFF Research Database (Denmark)
Nielsen, Kjeld; Brunø, Thomas Ditlev; Jørgensen, Kaj Asbjørn
2014-01-01
A recent survey has indicated that 17 % of companies have ceased mass customizing less than 1 year after initiating the effort. This paper presents measurements for a company's mass customization performance, utilizing metrics within the three fundamental capabilities: robust process design, choice navigation, and solution space development. A mass customizer assessing performance with these metrics can identify within which areas improvement would increase competitiveness the most and enable a more efficient transition to mass customization.
Metrical Phonology: German Sound System.
Tice, Bradley S.
Metrical phonology, a linguistic process of phonological stress assessment and diagrammatic simplification of sentence and word stress, is discussed as it is found in the English and German languages. The objective is to promote use of metrical phonology as a tool for enhancing instruction in stress patterns in words and sentences, particularly in…
Construction of Einstein-Sasaki metrics in D≥7
International Nuclear Information System (INIS)
Lue, H.; Pope, C. N.; Vazquez-Poritz, J. F.
2007-01-01
We construct explicit Einstein-Kaehler metrics in all even dimensions D=2n+4≥6, in terms of a 2n-dimensional Einstein-Kaehler base metric. These are cohomogeneity 2 metrics which have the new feature of including a NUT-type parameter, or gravomagnetic charge, in addition to mass and rotation parameters. Using a canonical construction, these metrics all yield Einstein-Sasaki metrics in dimensions D=2n+5≥7. As is commonly the case in this type of construction, for suitable choices of the free parameters the Einstein-Sasaki metrics can extend smoothly onto complete and nonsingular manifolds, even though the underlying Einstein-Kaehler metric has conical singularities. We discuss some explicit examples in the case of seven-dimensional Einstein-Sasaki spaces. These new spaces can provide supersymmetric backgrounds in M theory, which play a role in the AdS4/CFT3 correspondence
National Metrical Types in Nineteenth Century Art Song
Directory of Open Access Journals (Sweden)
Leigh VanHandel
2010-01-01
Full Text Available William Rothstein’s article “National metrical types in music of the eighteenth and early nineteenth centuries” (2008 proposes a distinction between the metrical habits of 18th and early 19th century German music and those of Italian and French music of that period. Based on theoretical treatises and compositional practice, he outlines these national metrical types and discusses the characteristics of each type. This paper presents the results of a study designed to determine whether, and to what degree, Rothstein’s characterizations of national metrical types are present in 19th century French and German art song. Studying metrical habits in this genre may provide a lens into changing metrical conceptions of 19th century theorists and composers, as well as to the metrical habits and compositional style of individual 19th century French and German art song composers.
A Metric on Phylogenetic Tree Shapes.
Colijn, C; Plazzotta, G
2018-01-01
The shapes of evolutionary trees are influenced by the nature of the evolutionary process but comparisons of trees from different processes are hindered by the challenge of completely describing tree shape. We present a full characterization of the shapes of rooted branching trees in a form that lends itself to natural tree comparisons. We use this characterization to define a metric, in the sense of a true distance function, on tree shapes. The metric distinguishes trees from random models known to produce different tree shapes. It separates trees derived from tropical versus USA influenza A sequences, which reflect the differing epidemiology of tropical and seasonal flu. We describe several metrics based on the same core characterization, and illustrate how to extend the metric to incorporate trees' branch lengths or other features such as overall imbalance. Our approach allows us to construct addition and multiplication on trees, and to create a convex metric on tree shapes which formally allows computation of average tree shapes. © The Author(s) 2017. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
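The core characterization can be sketched in code. In our reading of the scheme (treat the exact recursion as an assumption), a leaf is labeled 1 and an internal node whose two children carry labels j ≤ k is labeled k(k−1)/2 + j + 1, so every rooted binary tree shape receives a unique integer:

```python
def shape_label(tree):
    """Integer label for a rooted binary tree shape: a leaf gets 1; an
    internal node whose two children carry labels j <= k gets
    k*(k-1)//2 + j + 1. Trees are nested tuples; a leaf is 'leaf'.
    (Our reading of the characterization; a hedged sketch.)"""
    if tree == 'leaf':
        return 1
    j, k = sorted(shape_label(t) for t in tree)
    return k * (k - 1) // 2 + j + 1

cherry = ('leaf', 'leaf')          # two leaves joined at the root
print(shape_label(cherry))         # j = k = 1, so label 2
print(shape_label((cherry, 'leaf')))  # 3-leaf caterpillar
```

A distance on tree shapes can then be built on top of such labels, e.g. by comparing the label multisets of clades, which is the spirit of the metric described in the abstract.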
Software Quality Assurance Metrics
McRae, Kalindra A.
2004-01-01
Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software Quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software Metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software Metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If Software Metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA Metrics that have been used in other projects but are not currently being used by the SA team, and to report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.
Degraded visual environment image/video quality metrics
Baumgartner, Dustin D.; Brown, Jeremy B.; Jacobs, Eddie L.; Schachter, Bruce J.
2014-06-01
A number of image quality metrics (IQMs) and video quality metrics (VQMs) have been proposed in the literature for evaluating techniques and systems for mitigating degraded visual environments. Some require both pristine and corrupted imagery. Others require patterned target boards in the scene. None of these metrics relates well to the task of landing a helicopter in conditions such as a brownout dust cloud. We have developed and used a variety of IQMs and VQMs related to the pilot's ability to detect hazards in the scene and to maintain situational awareness. Some of these metrics can be made agnostic to sensor type. Not only are the metrics suitable for evaluating algorithm and sensor variation, they are also suitable for choosing the most cost effective solution to improve operating conditions in degraded visual environments.
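As a toy illustration of a no-reference IQM of this general flavor (not one of the metrics developed in the paper), a gradient-energy sharpness score drops as imagery is degraded, e.g. by a brownout dust cloud:

```python
def gradient_energy(img):
    """Minimal no-reference sharpness score: mean squared intensity
    difference between horizontally and vertically adjacent pixels.
    Blur or obscurants lower the score. `img` is a 2D list of
    grayscale values; illustrative only."""
    h, w = len(img), len(img[0])
    total, cnt = 0, 0
    for y in range(h):
        for x in range(w):
            if x + 1 < w:
                total += (img[y][x + 1] - img[y][x]) ** 2
                cnt += 1
            if y + 1 < h:
                total += (img[y + 1][x] - img[y][x]) ** 2
                cnt += 1
    return total / cnt

sharp = [[0, 255, 0], [255, 0, 255], [0, 255, 0]]       # high-contrast scene
blurred = [[96, 128, 96], [128, 96, 128], [96, 128, 96]]  # degraded version
print(gradient_energy(sharp) > gradient_energy(blurred))
```

Task-related metrics such as those in the paper go further, tying the score to hazard detection and situational awareness rather than raw contrast.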
The Jacobi metric for timelike geodesics in static spacetimes
Gibbons, G. W.
2016-01-01
It is shown that the free motion of massive particles moving in static spacetimes is given by the geodesics of an energy-dependent Riemannian metric on the spatial sections, analogous to Jacobi's metric in classical dynamics. In the massless limit Jacobi's metric coincides with the energy-independent Fermat or optical metric. For stationary metrics, it is known that the motion of massless particles is given by the geodesics of an energy-independent Finslerian metric of Randers type. The motion of massive particles is governed by neither a Riemannian nor a Finslerian metric. The properties of the Jacobi metric for massive particles moving outside the horizon of a Schwarzschild black hole are described. By contrast with the massless case, the Gaussian curvature of the equatorial sections is not always negative.
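In the paper's setting, as we read it, for a static spacetime with line element ds² = −V² dt² + g_ij dx^i dx^j the energy-dependent Jacobi metric takes the form

```latex
j_{ij} \;=\; \bigl(E^{2} - m^{2}V^{2}\bigr)\,\frac{g_{ij}}{V^{2}},
```

so that in the massless limit m → 0 it reduces to E² g_ij / V², which is conformal to the energy-independent optical (Fermat) metric g_ij / V², in agreement with the statement in the abstract.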
Relaxed metrics and indistinguishability operators: the relationship
Energy Technology Data Exchange (ETDEWEB)
Martin, J.
2017-07-01
In 1982, the notion of an indistinguishability operator was introduced by E. Trillas in order to fuzzify the crisp notion of equivalence relation (\cite{Trillas}). In the study of such a class of operators, an outstanding property must be pointed out. Concretely, there exists a duality relationship between indistinguishability operators and metrics. The aforesaid relationship was deeply studied by several authors, who introduced a few techniques to generate metrics from indistinguishability operators and vice versa (see, for instance, \cite{BaetsMesiar,BaetsMesiar2}). In recent years a new generalization of the metric notion has been introduced in the literature with the purpose of developing mathematical tools for quantitative models in Computer Science and Artificial Intelligence (\cite{BKMatthews,Ma}). The aforementioned generalized metrics are known as relaxed metrics. The main target of this talk is to present a study of the duality relationship between indistinguishability operators and relaxed metrics in such a way that the aforementioned classical techniques to generate both concepts, one from the other, can be extended to the new framework. (Author)
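A minimal instance of the duality mentioned above (a standard textbook correspondence, not the talk's general construction): for the Łukasiewicz t-norm T(a,b) = max(a + b − 1, 0), an indistinguishability operator E on a set X and a metric d bounded by 1 are related by

```latex
d(x,y) \;=\; 1 - E(x,y), \qquad E(x,y) \;=\; 1 - d(x,y),
```

and the T-transitivity of E, namely T(E(x,y), E(y,z)) ≤ E(x,z), translates term by term into the triangle inequality d(x,z) ≤ d(x,y) + d(y,z).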
Do kinematic metrics of walking balance adapt to perturbed optical flow?
Thompson, Jessica D; Franz, Jason R
2017-08-01
Visual (i.e., optical flow) perturbations can be used to study balance control and balance deficits. However, it remains unclear whether walking balance control adapts to such perturbations over time. Our purpose was to investigate the propensity for visuomotor adaptation in walking balance control using prolonged exposure to optical flow perturbations. Ten subjects (age: 25.4 ± 3.8 years) walked on a treadmill while watching a speed-matched virtual hallway with and without continuous mediolateral optical flow perturbations of three different amplitudes. Each of three perturbation trials consisted of 8 min of prolonged exposure followed by 1 min of unperturbed walking. Using 3D motion capture, we analyzed changes in foot placement kinematics and mediolateral sacrum motion. At their onset, perturbations elicited wider and shorter steps, alluding to a more cautious, general anticipatory balance control strategy. As perturbations continued, foot placement tended toward values seen during unperturbed walking while step width variability and mediolateral sacrum motion concurrently increased. Our findings suggest that subjects progressively shifted from a general anticipatory balance control strategy to a reactive, task-specific strategy using step-to-step adjustments. Prolonged exposure to optical flow perturbations may have clinical utility to reinforce reactive, task-specific balance control through training. Copyright © 2017 Elsevier B.V. All rights reserved.
Measurable Control System Security through Ideal Driven Technical Metrics
Energy Technology Data Exchange (ETDEWEB)
Miles McQueen; Wayne Boyer; Sean McBride; Marie Farrar; Zachary Tudor
2008-01-01
The Department of Homeland Security National Cyber Security Division supported development of a small set of security ideals as a framework to establish measurable control systems security. Based on these ideals, a draft set of proposed technical metrics was developed to allow control systems owner-operators to track improvements or degradations in their individual control systems security posture. The technical metrics development effort included review and evaluation of over thirty metrics-related documents. On the bases of complexity, ambiguity, or misleading and distorting effects the metrics identified during the reviews were determined to be weaker than necessary to aid defense against the myriad threats posed by cyber-terrorism to human safety, as well as to economic prosperity. Using the results of our metrics review and the set of security ideals as a starting point for metrics development, we identified thirteen potential technical metrics - with at least one metric supporting each ideal. Two case study applications of the ideals and thirteen metrics to control systems were then performed to establish potential difficulties in applying both the ideals and the metrics. The case studies resulted in no changes to the ideals, and only a few deletions and refinements to the thirteen potential metrics. This led to a final proposed set of ten core technical metrics. To further validate the security ideals, the modifications made to the original thirteen potential metrics, and the final proposed set of ten core metrics, seven separate control systems security assessments performed over the past three years were reviewed for findings and recommended mitigations. These findings and mitigations were then mapped to the security ideals and metrics to assess gaps in their coverage. The mappings indicated that there are no gaps in the security ideals and that the ten core technical metrics provide significant coverage of standard security issues with 87% coverage. Based
Experiential space is hardly metric
Czech Academy of Sciences Publication Activity Database
Šikl, Radovan; Šimeček, Michal; Lukavský, Jiří
2008-01-01
Roč. 2008, č. 37 (2008), s. 58-58 ISSN 0301-0066. [European Conference on Visual Perception. 24.08-28.08.2008, Utrecht] R&D Projects: GA ČR GA406/07/1676 Institutional research plan: CEZ:AV0Z70250504 Keywords : visual space perception * metric and non-metric perceptual judgments * ecological validity Subject RIV: AN - Psychology
High resolution metric imaging payload
Delclaud, Y.
2017-11-01
Alcatel Space Industries has become Europe's leader in the field of high and very high resolution optical payloads, in the framework of Earth observation systems able to provide military and government users with metric images from space. This leadership allowed ALCATEL to propose for the export market, within a French collaboration framework, a complete space-based system for metric observation.
Smart Grid Status and Metrics Report Appendices
Energy Technology Data Exchange (ETDEWEB)
Balducci, Patrick J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Antonopoulos, Chrissi A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clements, Samuel L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gorrissen, Willy J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kirkham, Harold [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ruiz, Kathleen A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, David L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Weimar, Mark R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gardner, Chris [APQC, Houston, TX (United States); Varney, Jeff [APQC, Houston, TX (United States)
2014-07-01
A smart grid uses digital power control and communication technology to improve the reliability, security, flexibility, and efficiency of the electric system, from large generation through the delivery systems to electricity consumers and a growing number of distributed generation and storage resources. To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. The Smart Grid Status and Metrics Report defines and examines 21 metrics that collectively provide insight into the grid’s capacity to embody these characteristics. This appendix presents papers covering each of the 21 metrics identified in Section 2.1 of the Smart Grid Status and Metrics Report. These metric papers were prepared in advance of the main body of the report and collectively form its informational backbone.
Implications of Metric Choice for Common Applications of Readmission Metrics
Davies, Sheryl; Saynina, Olga; Schultz, Ellen; McDonald, Kathryn M; Baker, Laurence C
2013-01-01
Objective. To quantify the differential impact on hospital performance of three readmission metrics: all-cause readmission (ACR), 3M Potential Preventable Readmission (PPR), and Centers for Medicare and Medicaid 30-day readmission (CMS).
Prognostic Performance Metrics
National Aeronautics and Space Administration — This chapter presents several performance metrics for offline evaluation of prognostics algorithms. A brief overview of different methods employed for performance...
Symmetric Kullback-Leibler Metric Based Tracking Behaviors for Bioinspired Robotic Eyes.
Liu, Hengli; Luo, Jun; Wu, Peng; Xie, Shaorong; Li, Hengyu
2015-01-01
A symmetric Kullback-Leibler metric based tracking system, capable of tracking moving targets, is presented for a bionic spherical parallel mechanism to minimize a tracking error function to simulate smooth pursuit of human eyes. More specifically, we propose a real-time moving target tracking algorithm which utilizes spatial histograms taking into account symmetric Kullback-Leibler metric. In the proposed algorithm, the key spatial histograms are extracted and taken into particle filtering framework. Once the target is identified, an image-based control scheme is implemented to drive bionic spherical parallel mechanism such that the identified target is to be tracked at the center of the captured images. Meanwhile, the robot motion information is fed forward to develop an adaptive smooth tracking controller inspired by the Vestibuloocular Reflex mechanism. The proposed tracking system is designed to make the robot track dynamic objects when the robot travels through transmittable terrains, especially bumpy environment. To perform bumpy-resist capability under the condition of violent attitude variation when the robot works in the bumpy environment mentioned, experimental results demonstrate the effectiveness and robustness of our bioinspired tracking system using bionic spherical parallel mechanism inspired by head-eye coordination.
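The symmetric Kullback-Leibler comparison at the heart of the tracker can be sketched as follows (illustrative histograms; the actual system applies it to spatial histograms inside a particle filtering framework):

```python
import math

def sym_kl(p, q, eps=1e-12):
    """Symmetric Kullback-Leibler divergence between two normalized
    histograms: KL(p||q) + KL(q||p). A small eps guards empty bins."""
    def kl(a, b):
        return sum(x * math.log((x + eps) / (y + eps)) for x, y in zip(a, b))
    return kl(p, q) + kl(q, p)

# Hypothetical histograms of a target patch and two candidate patches
target    = [0.5, 0.3, 0.2]
candidate = [0.45, 0.35, 0.2]   # similar appearance -> small divergence
distract  = [0.1, 0.2, 0.7]     # different appearance -> large divergence
print(sym_kl(target, candidate) < sym_kl(target, distract))
```

Symmetrizing the divergence makes the comparison independent of which histogram is treated as the reference, which is convenient when scoring particle hypotheses against the target model.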
An adaptive phase space method with application to reflection traveltime tomography
International Nuclear Information System (INIS)
Chung, Eric; Qian, Jianliang; Uhlmann, Gunther; Zhao, Hongkai
2011-01-01
In this work, an adaptive strategy for the phase space method for traveltime tomography (Chung et al 2007 Inverse Problems 23 309–29) is developed. The method first uses those geodesics/rays that produce smaller mismatch with the measurements and continues on in the spirit of layer stripping without defining the layers explicitly. The adaptive approach improves stability, efficiency and accuracy. We then extend our method to reflection traveltime tomography by incorporating broken geodesics/rays for which a jump condition has to be imposed at the broken point for the geodesic flow. In particular, we show that our method can distinguish non-broken and broken geodesics in the measurement and utilize them accordingly in reflection traveltime tomography. We demonstrate that our method can recover the convex hull (with respect to the underlying metric) of unknown obstacles as well as the metric outside the convex hull. (paper)
Energy-Based Metrics for Arthroscopic Skills Assessment.
Poursartip, Behnaz; LeBel, Marie-Eve; McCracken, Laura C; Escoto, Abelardo; Patel, Rajni V; Naish, Michael D; Trejos, Ana Luisa
2017-08-05
Minimally invasive skills assessment methods are essential in developing efficient surgical simulators and implementing consistent skills evaluation. Although numerous methods have been investigated in the literature, there is still a need to further improve the accuracy of surgical skills assessment. Energy expenditure can be an indication of motor skills proficiency. The goals of this study are to develop objective metrics based on energy expenditure, normalize these metrics, and investigate classifying trainees using these metrics. To this end, different forms of energy consisting of mechanical energy and work were considered and their values were divided by the related value of an ideal performance to develop normalized metrics. These metrics were used as inputs for various machine learning algorithms including support vector machines (SVM) and neural networks (NNs) for classification. The accuracy of the combination of the normalized energy-based metrics with these classifiers was evaluated through a leave-one-subject-out cross-validation. The proposed method was validated using 26 subjects at two experience levels (novices and experts) in three arthroscopic tasks. The results showed that there are statistically significant differences between novices and experts for almost all of the normalized energy-based metrics. The accuracy of classification using SVM and NN methods was between 70% and 95% for the various tasks. The results show that the normalized energy-based metrics and their combination with SVM and NN classifiers are capable of providing accurate classification of trainees. The assessment method proposed in this study can enhance surgical training by providing appropriate feedback to trainees about their level of expertise and can be used in the evaluation of proficiency.
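The normalization idea can be sketched as follows (a simplified reading of the approach with hypothetical samples; the study considers several forms of mechanical energy and work and feeds the normalized metrics to SVM and NN classifiers):

```python
def normalized_work(forces, velocities, dt, ideal_work):
    """Normalized energy-based metric (illustrative sketch, not the
    authors' exact formulation): mechanical work done by the tool,
    W = sum of F*v*dt over samples, divided by the work of an idealized
    expert performance. Values near 1 suggest expert-like economy of
    motion; larger values suggest wasted effort."""
    work = sum(f * v * dt for f, v in zip(forces, velocities))
    return work / ideal_work

# Hypothetical force (N) and velocity (m/s) samples at dt = 0.01 s
novice = normalized_work([2.0, 3.0, 2.5], [0.2, 0.3, 0.2], 0.01,
                         ideal_work=0.01)
print(novice)
```

Dividing by the ideal performance puts all tasks on a common scale, which is what allows a single classifier to combine metrics from different tasks.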
Principle of space existence and De Sitter metric
International Nuclear Information System (INIS)
Mal'tsev, V.K.
1990-01-01
The selection principle for the solutions of the Einstein equations suggested in a series of papers implies the existence of space (g_ik ≠ 0) only in the presence of matter (T_ik ≠ 0). This selection principle (the principle of space existence, in Markov's terminology) implies, in the general case, the absence of a cosmological solution with the De Sitter metric. On the other hand, the De Sitter metric is necessary for describing both the inflation and deflation periods of the Universe. It is shown that the De Sitter metric is also allowed by the selection principle under discussion if the metric evolves into the Friedmann metric.
What can article-level metrics do for you?
Fenner, Martin
2013-10-01
Article-level metrics (ALMs) provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, usage statistics, discussions in online comments and social media, social bookmarking, and recommendations. In this essay, we describe why article-level metrics are an important extension of traditional citation-based journal metrics and provide a number of examples from ALM data collected for PLOS Biology.
About the possibility of a generalized metric
International Nuclear Information System (INIS)
Lukacs, B.; Ladik, J.
1991-10-01
The metric (the structure of space-time) may depend on the properties of the object measuring it. The case of size dependence of the metric was examined. For this dependence, the simplest possible form of the metric tensor has been constructed which fulfils the following requirements: there be two extremal characteristic scales; the metric be unique and the usual one between them; the change be sudden in the neighbourhood of these scales; the size of the human body appear as a parameter (postulated on the basis of some philosophical arguments). Estimates have been made for the two extremal length scales according to existing observations. (author) 19 refs
Distributed consensus for metamorphic systems using a gossip algorithm for CAT(0) metric spaces
Bellachehab, Anass; Jakubowicz, Jérémie
2015-01-01
We present an application of distributed consensus algorithms to metamorphic systems. A metamorphic system is a set of identical units that can self-assemble to form a rigid structure. For instance, one can think of a robotic arm composed of multiple links connected by joints. The system can change its shape in order to adapt to different environments via reconfiguration of its constituent units. We assume in this work that several metamorphic systems form a network: two systems are connected whenever they are able to communicate with each other. The aim of this paper is to propose a distributed algorithm that synchronizes all the systems in the network. Synchronizing means that all the systems should end up having the same configuration. This aim is achieved in two steps: (i) we cast the problem as a consensus problem on a metric space and (ii) we use a recent distributed consensus algorithm that only makes use of metric notions.
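A pairwise gossip iteration of the kind described above can be sketched in a flat space. In a CAT(0) space the update moves two communicating nodes to the midpoint of the geodesic between their states; the sketch below uses R^2 (a flat CAT(0) space), where the geodesic midpoint is the ordinary average, and all values are illustrative.

```python
import random

# Pairwise gossip consensus: at each round a random pair of nodes moves to
# the midpoint of their two states. The average of all states is preserved
# and the spread between states shrinks toward a common configuration.

def gossip_step(states, i, j):
    mid = tuple((a + b) / 2 for a, b in zip(states[i], states[j]))
    states[i] = states[j] = mid

def gossip_consensus(states, rounds, seed=0):
    rng = random.Random(seed)
    for _ in range(rounds):
        i, j = rng.sample(range(len(states)), 2)
        gossip_step(states, i, j)
    return states

states = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
final = gossip_consensus(states, rounds=200)
```

In a general metamorphic-system configuration space the only change would be replacing the Euclidean midpoint by the geodesic midpoint of the CAT(0) metric.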
Experiences in adapting post-byzantine chant into foreign languages: Research and praxis
Directory of Open Access Journals (Sweden)
Olkinuora Jaakko
2011-01-01
This article presents the current state of research and the practical methodology of adapting Byzantine melodies written in the “New Method” into foreign languages, with Romanian, English and Finnish serving as examples. The adaptation of independent, “fixed” melodies as well as metrical liturgical texts (prosomoia and canons) is examined. The challenges emerging in adapting Byzantine chant into Finnish are also discussed. The author also suggests some future subjects for research, which include the synthesis of examining arrangements in both the “Old” and “New Method”.
Ideal Based Cyber Security Technical Metrics for Control Systems
Energy Technology Data Exchange (ETDEWEB)
W. F. Boyer; M. A. McQueen
2007-10-01
Much of the world's critical infrastructure is at risk from attack through electronic networks connected to control systems. Security metrics are important because they provide the basis for management decisions that affect the protection of the infrastructure. A cyber security technical metric is the security-relevant output from an explicit mathematical model that makes use of objective measurements of a technical object. A specific set of technical security metrics is proposed for use by the operators of control systems. Our proposed metrics are based on seven security ideals associated with seven corresponding abstract dimensions of security. We have defined at least one metric for each of the seven ideals. Each metric is a measure of how nearly the associated ideal has been achieved. These seven ideals provide a useful structure for further metrics development. A case study shows how the proposed metrics can be applied to an operational control system.
THE ROLE OF ARTICLE LEVEL METRICS IN SCIENTIFIC PUBLISHING
Directory of Open Access Journals (Sweden)
Vladimir TRAJKOVSKI
2016-04-01
Emerging article-level metrics do not exclude traditional metrics based on citations to the journal, but complement them. Article-level metrics (ALMs) provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, usage statistics, discussions in online comments and social media, social bookmarking, and recommendations. In this editorial, the role of article-level metrics in publishing scientific papers is described. ALMs are rapidly emerging as important tools to quantify how individual articles are being discussed, shared, and used. Data sources depend on the tool, but they include classic citation-based indicators, academic social networks (Mendeley, CiteULike, Delicious) and social media (Facebook, Twitter, blogs, and YouTube). The most popular tools used to apply these new metrics are: Public Library of Science Article-Level Metrics, Altmetric, Impactstory and Plum Analytics. The Journal Impact Factor (JIF) does not consider impact or influence beyond citation counts, which are reflected only through Thomson Reuters’ Web of Science® database. The JIF provides an indicator related to the journal, but not to an individual published paper. Thus, altmetrics are becoming an alternative for performance assessment of individual scientists and their scholarly publications. Macedonian scholarly publishers have to work on implementing article-level metrics in their e-journals. It is the way to increase their visibility and impact in the world of science.
Characterising risk - aggregated metrics: radiation and noise
International Nuclear Information System (INIS)
Passchier, W.
1998-01-01
The characterisation of risk is an important phase in the risk assessment - risk management process. From the multitude of risk attributes a few have to be selected to obtain a risk characteristic or profile that is useful for risk management decisions and implementation of protective measures. One way to reduce the number of attributes is aggregation. In the field of radiation protection such an aggregated metric is firmly established: effective dose. For protection against environmental noise the Health Council of the Netherlands recently proposed a set of aggregated metrics for noise annoyance and sleep disturbance. The presentation will discuss similarities and differences between these two metrics and practical limitations. The effective dose has proven its usefulness in designing radiation protection measures, which are related to the level of risk associated with the radiation practice in question, given that implicit judgements on radiation-induced health effects are accepted. However, as the metric does not take into account the nature of the radiation practice, it is less useful in policy discussions on the benefits and harm of radiation practices. With respect to the noise exposure metric, only one effect is targeted (annoyance), and the differences between sources are explicitly taken into account. This should make the metric useful in policy discussions with respect to physical planning and siting problems. The metric proposed has significance only at the population level, and cannot be used as a predictor of individual risk. (author)
Supplier selection using different metric functions
Directory of Open Access Journals (Sweden)
Omosigho S.E.
2015-01-01
Supplier selection is an important component of supply chain management in today’s global competitive environment. Hence, the evaluation and selection of suppliers have received considerable attention in the literature. Many attributes of suppliers, other than cost, are considered in the evaluation and selection process. Therefore, the evaluation and selection of suppliers is a multi-criteria decision making process. The methodology adopted to solve the supplier selection problem is intuitionistic fuzzy TOPSIS (Technique for Order Preference by Similarity to the Ideal Solution). Generally, TOPSIS is based on the concept of minimum distance from the positive ideal solution and maximum distance from the negative ideal solution. We examine the deficiencies of using only one metric function in TOPSIS and propose the use of the spherical metric function in addition to the commonly used metric functions. For empirical supplier selection problems, more than one metric function should be used.
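The TOPSIS closeness coefficient with a pluggable distance metric can be sketched as follows, illustrating why the choice of metric function matters. The decision matrix and weights are invented, all criteria are treated as benefit criteria for simplicity, and the spherical metric of the paper is not reproduced here.

```python
import math

# TOPSIS closeness coefficient: vector-normalize each criterion column,
# weight it, then score each alternative by its distance to the negative
# ideal relative to its total distance to both ideals.

def topsis_scores(matrix, weights, dist):
    norms = [math.sqrt(sum(v * v for v in col)) for col in zip(*matrix)]
    V = [[w * v / n for v, w, n in zip(row, weights, norms)] for row in matrix]
    pis = [max(col) for col in zip(*V)]  # positive ideal solution
    nis = [min(col) for col in zip(*V)]  # negative ideal solution
    return [dist(row, nis) / (dist(row, pis) + dist(row, nis)) for row in V]

euclidean = lambda a, b: math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
rectilinear = lambda a, b: sum(abs(x - y) for x, y in zip(a, b))

suppliers = [[7, 9, 6], [8, 7, 8], [9, 6, 7]]  # rows: suppliers, cols: criteria
weights = [0.3, 0.4, 0.3]
ranking_e = topsis_scores(suppliers, weights, euclidean)
ranking_r = topsis_scores(suppliers, weights, rectilinear)
```

Running both distances on the same matrix can reorder the middle alternatives, which is exactly the sensitivity to the metric function that motivates using more than one.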
2012-03-02
... Performance Metrics; Commission Staff Request Comments on Performance Metrics for Regions Outside of RTOs and... performance communicate about the benefits of RTOs and, where appropriate, (2) changes that need to be made to... common set of performance measures for markets both within and outside of ISOs/RTOs. As recommended by...
Regional Sustainability: The San Luis Basin Metrics Project
There are a number of established, scientifically supported metrics of sustainability. Many of the metrics are data intensive and require extensive effort to collect data and compute. Moreover, individual metrics may not capture all aspects of a system that are relevant to sust...
Santosh Kumar S C Sharma Bhupendra Suman
2011-01-01
A mobile ad hoc network is a collection of self-configuring mobile devices connected by wireless links, forming an arbitrary topology with multihop wireless connectivity and no reliance on existing infrastructure. It requires an efficient dynamic routing protocol to determine routes according to a set of rules that enables two or more devices to communicate with each other. This paper classifies and evaluates the mobility metrics into two categories-...
Schweizer, B
2005-01-01
Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.
Metric solution of a spinning mass
International Nuclear Information System (INIS)
Sato, H.
1982-01-01
Studies on a particular class of asymptotically flat and stationary metric solutions, the Kerr-Tomimatsu-Sato class, are reviewed with regard to their derivation and properties. For further study, an almost complete list of papers on the Tomimatsu-Sato metrics is given. (Auth.)
Software architecture analysis tool : software architecture metrics collection
Muskens, J.; Chaudron, M.R.V.; Westgeest, R.
2002-01-01
The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a
Generalized tolerance sensitivity and DEA metric sensitivity
Neralić, Luka; E. Wendell, Richard
2015-01-01
This paper considers the relationship between Tolerance sensitivity analysis in optimization and metric sensitivity analysis in Data Envelopment Analysis (DEA). Herein, we extend the results on the generalized Tolerance framework proposed by Wendell and Chen and show how this framework includes DEA metric sensitivity as a special case. Further, we note how recent results in Tolerance sensitivity suggest some possible extensions of the results in DEA metric sensitivity.
On Nakhleh's metric for reduced phylogenetic networks
Cardona, Gabriel; Llabrés, Mercè; Rosselló, Francesc; Valiente Feruglio, Gabriel Alejandro
2009-01-01
We prove that Nakhleh’s metric for reduced phylogenetic networks is also a metric on the classes of tree-child phylogenetic networks, semibinary tree-sibling time consistent phylogenetic networks, and multilabeled phylogenetic trees. We also prove that it separates distinguishable phylogenetic networks. In this way, it becomes the strongest dissimilarity measure for phylogenetic networks available so far. Furthermore, we propose a generalization of that metric that separates arbitrary phyl...
Social Media Metrics Importance and Usage Frequency in Latvia
Directory of Open Access Journals (Sweden)
Ronalds Skulme
2017-12-01
Purpose of the article: The purpose of this paper was to explore which social media marketing metrics are most often used and most important for marketing experts in Latvia and can be used to evaluate marketing campaign effectiveness. Methodology/methods: In order to achieve the aim of this paper several theoretical and practical research methods were used, such as theoretical literature analysis, surveying and grouping. First, theoretical research about social media metrics was conducted. The authors collected information about social media metric grouping methods and the social media metrics most frequently mentioned in the literature. The collected information was used as the foundation for the expert surveys. The expert surveys were used to collect information from Latvian marketing professionals to determine which social media metrics are used most often and which are most important in Latvia. Scientific aim: The scientific aim of this paper was to identify whether the importance of social media metrics varies depending on the consumer purchase decision stage. Findings: Information about the most important and most often used social media marketing metrics in Latvia was collected. A new social media grouping framework is proposed. Conclusions: The main conclusion is that the importance and usage frequency of social media metrics change depending on the consumer purchase decision stage the metric is used to evaluate.
A comparison theorem of the Kobayashi metric and the Bergman metric on a class of Reinhardt domains
International Nuclear Information System (INIS)
Weiping Yin.
1990-03-01
A comparison theorem for the Kobayashi and Bergman metrics is given on a class of Reinhardt domains in C^n. In the meantime, we obtain a class of complete invariant Kaehler metrics for these domains in the special cases. (author). 5 refs
Using Activity Metrics for DEVS Simulation Profiling
Directory of Open Access Journals (Sweden)
Muzy A.
2014-01-01
Activity metrics can be used to profile DEVS models before and during the simulation, and it is critical to obtain good activity metrics in both phases. Having a means to compute a priori activity of components (analytic activity) may be worthwhile when simulating a model (or parts of it) for the first time. Afterwards, during the simulation, analytic activity can be corrected using the dynamic one. In this paper, we introduce the McCabe cyclomatic complexity metric (MCA) to compute analytic activity. Both static and simulation activity metrics have been implemented through a plug-in of the DEVSimPy (DEVS Simulator in Python) environment and applied to DEVS models.
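As a rough illustration of using McCabe cyclomatic complexity as a static activity proxy, the sketch below counts decision points in Python source with the standard `ast` module (complexity = decision points + 1). The decision-node set is a simplification of the full McCabe definition, and this is not the DEVSimPy plug-in itself.

```python
import ast

# McCabe cyclomatic complexity for Python source: count branching
# constructs in the AST and add one for the single entry path.

DECISION_NODES = (ast.If, ast.For, ast.While, ast.IfExp,
                  ast.ExceptHandler, ast.And, ast.Or)

def cyclomatic_complexity(source):
    tree = ast.parse(source)
    decisions = sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))
    return decisions + 1

atomic_model = """
def transition(phase, x):
    if phase == "idle":
        return "busy"
    while x > 0:
        x -= 1
    return "idle"
"""
print(cyclomatic_complexity(atomic_model))  # 1 if + 1 while + 1 = 3
```

A higher count signals more internal branching, which is the intuition behind treating MCA as a predictor of a component's simulation activity.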
National Research Council Canada - National Science Library
Olson, Teresa; Lee, Harry; Sanders, Johnnie
2002-01-01
.... We have developed the Tracker Performance Metric (TPM) specifically for this purpose. It was designed to measure the output performance, on a frame-by-frame basis, using its output position and quality...
Metrication: An economic wake-up call for US industry
Carver, G. P.
1993-03-01
As the international standard of measurement, the metric system is one key to success in the global marketplace. International standards have become an important factor in international economic competition. Non-metric products are becoming increasingly unacceptable in world markets that favor metric products. Procurement is the primary federal tool for encouraging and helping U.S. industry to convert voluntarily to the metric system. Besides the perceived unwillingness of the customer, certain regulatory language, and certain legal definitions in some states, there are no major impediments to conversion of the remaining non-metric industries to metric usage. Instead, there are good reasons for changing, including an opportunity to rethink many industry standards and to take advantage of size standardization. Also, when the remaining industries adopt the metric system, they will come into conformance with federal agencies engaged in similar activities.
Conformal and related changes of metric on the product of two almost contact metric manifolds.
Blair, D. E.
1990-01-01
This paper studies conformal and related changes of the product metric on the product of two almost contact metric manifolds. It is shown that if one factor is Sasakian, the other is not, but that locally the second factor is of the type studied by Kenmotsu. The results are more general and given in terms of trans-Sasakian, α-Sasakian and β-Kenmotsu structures.
Extremal limits of the C metric: Nariai, Bertotti-Robinson, and anti-Nariai C metrics
International Nuclear Information System (INIS)
Dias, Oscar J.C.; Lemos, Jose P.S.
2003-01-01
In two previous papers we have analyzed the C metric in a background with a cosmological constant Λ, namely, the de Sitter (dS) C metric (Λ>0) and the anti-de Sitter (AdS) C metric (Λ<0). In this paper the extremal limits of the C metric are analyzed for Λ>0, Λ=0, and Λ<0. In the Nariai case (with topology dS_2 × S̃_2), to each point in the deformed two-sphere S̃_2 corresponds a dS_2 spacetime, except for one point which corresponds to a dS_2 spacetime with an infinite straight strut or string. There are other important new features that appear. One expects that the solutions found in this paper are unstable and decay into a slightly nonextreme black hole pair accelerated by a strut or by strings. Moreover, the Euclidean version of these solutions mediates the quantum process of black hole pair creation that accompanies the decay of the dS and AdS spaces
Graev metrics on free products and HNN extensions
DEFF Research Database (Denmark)
Slutsky, Konstantin
2014-01-01
We give a construction of two-sided invariant metrics on free products (possibly with amalgamation) of groups with two-sided invariant metrics and, under certain conditions, on HNN extensions of such groups. Our approach is similar to Graev's construction of metrics on free groups over pointed...
Reference Device-Assisted Adaptive Location Fingerprinting
Directory of Open Access Journals (Sweden)
Dongjin Wu
2016-06-01
Location fingerprinting suffers in dynamic environments and needs recalibration from time to time to maintain system performance. This paper proposes an adaptive approach for location fingerprinting. Based on real-time received signal strength indicator (RSSI) samples measured by a group of reference devices, the approach applies a modified Universal Kriging (UK) interpolant to estimate adaptive temporal and environmental radio maps. The modified UK can take the spatial distribution characteristics of RSSI into account. In addition, the issue of device heterogeneity caused by multiple reference devices is further addressed. To compensate the measuring differences of heterogeneous reference devices, differential RSSI metric is employed. Extensive experiments were conducted in an indoor field and the results demonstrate that the proposed approach not only adapts to dynamic environments and the situation of changing APs’ positions, but it is also robust toward measuring differences of heterogeneous reference devices.
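The differential RSSI idea can be illustrated with a small sketch: subtracting the RSSI of one chosen AP from all others cancels a device-specific additive offset, so heterogeneous devices at the same spot produce the same fingerprint. The AP names and dB values below are invented.

```python
# Differential RSSI: express every AP's RSSI relative to a reference AP.
# A constant per-device offset (hardware gain difference) cancels out.

def differential_rssi(readings, ref_ap):
    ref = readings[ref_ap]
    return {ap: rssi - ref for ap, rssi in readings.items() if ap != ref_ap}

device_a = {"ap1": -40, "ap2": -55, "ap3": -70}
device_b = {ap: r - 6 for ap, r in device_a.items()}  # same spot, 6 dB offset
print(differential_rssi(device_a, "ap1") == differential_rssi(device_b, "ap1"))  # True
```

The cancellation only holds for additive offsets; gain differences that vary with signal level would still require calibration of the radio map.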
Validation of Metrics for Collaborative Systems
Ion IVAN; Cristian CIUREA
2008-01-01
This paper describes new concepts for the validation of collaborative systems metrics. It defines the quality characteristics of collaborative systems, proposes a metric to estimate the quality level of collaborative systems, and reports measurements of collaborative systems quality performed using specially designed software.
g-Weak Contraction in Ordered Cone Rectangular Metric Spaces
Directory of Open Access Journals (Sweden)
S. K. Malhotra
2013-01-01
We prove some common fixed-point theorems for ordered g-weak contractions in cone rectangular metric spaces without assuming the normality of the cone. Our results generalize some recent results from cone metric and cone rectangular metric spaces to ordered cone rectangular metric spaces. Examples are provided which illustrate the results.
The dynamics of metric-affine gravity
International Nuclear Information System (INIS)
Vitagliano, Vincenzo; Sotiriou, Thomas P.; Liberati, Stefano
2011-01-01
Highlights: → The role and the dynamics of the connection in metric-affine theories are explored. → The most general second order action does not lead to a dynamical connection. → Including higher order invariants excites new degrees of freedom in the connection. → f(R) actions are also discussed and shown to be a non-representative class. - Abstract: Metric-affine theories of gravity provide an interesting alternative to general relativity: in such an approach, the metric and the affine (not necessarily symmetric) connection are independent quantities. Furthermore, the action should include covariant derivatives of the matter fields, with the covariant derivative naturally defined using the independent connection. As a result, in metric-affine theories a direct coupling involving matter and connection is also present. The role and the dynamics of the connection in such theories is explored. We employ power counting in order to construct the action and search for the minimal requirements it should satisfy for the connection to be dynamical. We find that for the most general action containing lower order invariants of the curvature and the torsion the independent connection does not carry any dynamics. It actually reduces to the role of an auxiliary field and can be completely eliminated algebraically in favour of the metric and the matter field, introducing extra interactions with respect to general relativity. However, we also show that including higher order terms in the action radically changes this picture and excites new degrees of freedom in the connection, making it (or parts of it) dynamical. Constructing actions that constitute exceptions to this rule requires significant fine tuning and/or extra a priori constraints on the connection. We also consider f(R) actions as a particular example in order to show that they constitute a distinct class of metric-affine theories with special properties, and as such they cannot be used as representative toy
The definitive guide to IT service metrics
McWhirter, Kurt
2012-01-01
Used just as they are, the metrics in this book will bring many benefits to both the IT department and the business as a whole. Details of the attributes of each metric are given, enabling you to make the right choices for your business. You may prefer and are encouraged to design and create your own metrics to bring even more value to your business - this book will show you how to do this, too.
NASA education briefs for the classroom. Metrics in space
The use of metric measurement in space is summarized for classroom use. Advantages of the metric system over the English measurement system are described. Some common metric units are defined, as are special units for astronomical study. International system unit prefixes and a conversion table of metric/English units are presented. Questions and activities for the classroom are recommended.
Enhancing Authentication Models Characteristic Metrics via ...
African Journals Online (AJOL)
In this work, we derive the universal characteristic metrics set for authentication models based on security, usability and design issues. We then compute the probability of the occurrence of each characteristic metrics in some single factor and multifactor authentication models in order to determine the effectiveness of these ...
Understanding Acceptance of Software Metrics--A Developer Perspective
Umarji, Medha
2009-01-01
Software metrics are measures of software products and processes. Metrics are widely used by software organizations to help manage projects, improve product quality and increase efficiency of the software development process. However, metrics programs tend to have a high failure rate in organizations, and developer pushback is one of the sources…
Mapping growing stock volume and forest live biomass: a case study of the Polissya region of Ukraine
Bilous, Andrii; Myroniuk, Viktor; Holiaka, Dmytrii; Bilous, Svitlana; See, Linda; Schepaschenko, Dmitry
2017-10-01
Forest inventory and biomass mapping are important tasks that require inputs from multiple data sources. In this paper we implement two methods for the Ukrainian region of Polissya: random forest (RF) for tree species prediction and k-nearest neighbors (k-NN) for growing stock volume and biomass mapping. We examined the suitability of the five-band RapidEye satellite image to predict the distribution of six tree species. The accuracy of RF is quite high: ~99% for forest/non-forest mask and 89% for tree species prediction. Our results demonstrate that inclusion of elevation as a predictor variable in the RF model improved the performance of tree species classification. We evaluated different distance metrics for the k-NN method, including Euclidean or Mahalanobis distance, most similar neighbor (MSN), gradient nearest neighbor, and independent component analysis. The MSN with the four nearest neighbors (k = 4) is the most precise (according to the root-mean-square deviation) for predicting forest attributes across the study area. The k-NN method allowed us to estimate growing stock volume with an accuracy of 3 m3 ha-1 and for live biomass of about 2 t ha-1 over the study area.
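The k-NN prediction step with a variance-normalized distance (a Mahalanobis distance with diagonal covariance) can be sketched as follows. The spectral predictors and volumes are invented toy data, and the study's other distance variants (MSN, GNN, ICA) are not reproduced here.

```python
import math

# k-NN imputation of a forest attribute (e.g. growing stock volume, m3/ha):
# find the k training plots nearest to the query pixel in feature space and
# average their attribute values. Normalizing each feature by its variance
# is the diagonal special case of the Mahalanobis distance.

def knn_predict(train, query, k):
    """train: list of (features, attribute); returns mean attribute of k nearest."""
    dims = len(query)
    feats = [f for f, _ in train]
    mean = [sum(f[d] for f in feats) / len(feats) for d in range(dims)]
    var = [sum((f[d] - mean[d]) ** 2 for f in feats) / len(feats) for d in range(dims)]
    def dist(f):
        return math.sqrt(sum((f[d] - query[d]) ** 2 / var[d] for d in range(dims)))
    nearest = sorted(train, key=lambda fv: dist(fv[0]))[:k]
    return sum(v for _, v in nearest) / k

plots = [  # ((band1, band2), growing stock volume) - made-up inventory plots
    ((1.0, 10.0), 100), ((1.1, 11.0), 110),
    ((5.0, 50.0), 300), ((5.2, 52.0), 320),
]
volume = knn_predict(plots, (1.05, 10.5), k=2)
```

Without the variance normalization, the large-valued band would dominate the distance, which is one reason the choice of distance metric matters in k-NN mapping.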
Measuring reliability under epistemic uncertainty: Review on non-probabilistic reliability metrics
Directory of Open Access Journals (Sweden)
Kang Rui
2016-06-01
In this paper, a systematic review of non-probabilistic reliability metrics is conducted to assist the selection of appropriate reliability metrics to model the influence of epistemic uncertainty. Five frequently used non-probabilistic reliability metrics are critically reviewed, i.e., evidence-theory-based reliability metrics, interval-analysis-based reliability metrics, fuzzy-interval-analysis-based reliability metrics, possibility-theory-based reliability metrics (posbist reliability) and uncertainty-theory-based reliability metrics (belief reliability). It is pointed out that a qualified reliability metric that is able to consider the effect of epistemic uncertainty needs to (1) compensate for the conservatism in the estimates of the component-level reliability metrics caused by epistemic uncertainty, and (2) satisfy the duality axiom; otherwise it might lead to paradoxical and confusing results in engineering applications. The five commonly used non-probabilistic reliability metrics are compared in terms of these two properties, and the comparison can serve as a basis for the selection of the appropriate reliability metrics.
Construction of self-dual codes in the Rosenbloom-Tsfasman metric
Krisnawati, Vira Hari; Nisa, Anzi Lina Ukhtin
2017-12-01
Linear codes are very basic and useful objects in coding theory. Generally, a linear code is a code over a finite field in the Hamming metric. Among the most interesting families of codes, the family of self-dual codes is a very important one, because it contains some of the best known error-correcting codes. The concept of the Hamming metric has been developed into the Rosenbloom-Tsfasman metric (RT-metric). The inner product in the RT-metric is different from the Euclidean inner product that is used to define duality in the Hamming metric, and most of the codes which are self-dual in the Hamming metric are not so in the RT-metric. Moreover, the generator matrix is very important for constructing a code because it contains a basis of the code. Therefore, in this paper, we give some theorems and methods to construct self-dual codes in the RT-metric by considering properties of the inner product and the generator matrix. We also illustrate examples for each kind of construction.
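For reference, the RT weight and distance in the single-row (vector) case can be computed as follows; this minimal sketch only illustrates the metric itself, not the self-dual code constructions of the paper.

```python
# Rosenbloom-Tsfasman (RT) weight of a vector over F_q (single-row case):
# the largest 1-based index of a nonzero coordinate, or 0 for the zero
# vector. The RT distance is the RT weight of the difference.

def rt_weight(x):
    return max((i + 1 for i, v in enumerate(x) if v != 0), default=0)

def rt_distance(x, y, q):
    return rt_weight([(a - b) % q for a, b in zip(x, y)])

print(rt_weight([1, 0, 1, 0]))                     # 3
print(rt_distance([1, 0, 1, 0], [1, 0, 1, 1], 2))  # 4
```

Unlike the Hamming weight, which counts all nonzero positions, the RT weight depends only on the rightmost nonzero position, which is what makes duality behave so differently in the two metrics.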
Chaotic inflation with metric and matter perturbations
International Nuclear Information System (INIS)
Feldman, H.A.; Brandenberger, R.H.
1989-01-01
A perturbative scheme to analyze the evolution of both metric and scalar field perturbations in an expanding universe is developed. The scheme is applied to study chaotic inflation with initial metric and scalar field perturbations present. It is shown that initial gravitational perturbations with wavelength smaller than the Hubble radius rapidly decay. The metric simultaneously picks up small perturbations determined by the matter inhomogeneities. Both are frozen in once the wavelength exceeds the Hubble radius. (orig.)
Phantom metrics with Killing spinors
Directory of Open Access Journals (Sweden)
W.A. Sabra
2015-11-01
We study metric solutions of Einstein–anti-Maxwell theory admitting Killing spinors. The analogue of the IWP metric which admits a space-like Killing vector is found and is expressed in terms of a complex function satisfying the wave equation in flat (2+1)-dimensional space–time. As examples, electric and magnetic Kasner spaces are constructed by allowing the solution to depend only on the time coordinate. Euclidean solutions are also presented.
Customizing Countermeasure Prescriptions using Predictive Measures of Sensorimotor Adaptability
Bloomberg, J. J.; Peters, B. T.; Mulavara, A. P.; Miller, C. A.; Batson, C. D.; Wood, S. J.; Guined, J. R.; Cohen, H. S.; Buccello-Stout, R.; DeDios, Y. E.;
2014-01-01
Astronauts experience sensorimotor disturbances during the initial exposure to microgravity and during the readaptation phase following a return to a gravitational environment. These alterations may lead to disruption in the ability to perform mission critical functional tasks during and after these gravitational transitions. Astronauts show significant inter-subject variation in adaptive capability following gravitational transitions. The ability to predict the manner and degree to which each individual astronaut will be affected would improve the effectiveness of a countermeasure comprising a training program designed to enhance sensorimotor adaptability. Due to this inherent individual variability we need to develop predictive measures of sensorimotor adaptability that will allow us to predict, before actual space flight, which crewmember will experience challenges in adaptive capacity. Obtaining this information will allow us to design and implement better sensorimotor adaptability training countermeasures that are customized for each crewmember's unique adaptive capabilities. Therefore the goals of this project are to: 1) develop a set of predictive measures capable of identifying individual differences in sensorimotor adaptability, and 2) use this information to design sensorimotor adaptability training countermeasures that are customized for each crewmember's individual sensorimotor adaptive characteristics. To achieve these goals we are currently pursuing the following specific aims: Aim 1: Determine whether behavioral metrics of individual sensory bias predict sensorimotor adaptability. For this aim, subjects perform tests that delineate individual sensory biases in tests of visual, vestibular, and proprioceptive function. Aim 2: Determine if individual capability for strategic and plastic-adaptive responses predicts sensorimotor adaptability. For this aim, each subject's strategic and plastic-adaptive motor learning abilities are assessed using
Adaptive Stereotactic Body Radiation Therapy Planning for Lung Cancer
International Nuclear Information System (INIS)
Qin, Yujiao; Zhang, Fan; Yoo, David S.; Kelsey, Chris R.; Yin, Fang-Fang; Cai, Jing
2013-01-01
Purpose: To investigate the dosimetric effects of adaptive planning on lung stereotactic body radiation therapy (SBRT). Methods and Materials: Forty of 66 consecutive lung SBRT patients were selected for a retrospective adaptive planning study. CBCT images acquired at each fraction were used for treatment planning. Adaptive plans were created using the same planning parameters as the original CT-based plan, with the goal to achieve a comparable conformality index (CI). For each patient, 2 cumulative plans, nonadaptive plan (P_NON) and adaptive plan (P_ADP), were generated and compared for the following organs at risk (OARs): cord, esophagus, chest wall, and the lungs. Dosimetric comparison was performed between P_NON and P_ADP for all 40 patients. Correlations were evaluated between changes in dosimetric metrics induced by adaptive planning and potential impacting factors, including tumor-to-OAR distances (d_T-OAR), initial internal target volume (ITV_1), ITV change (ΔITV), and effective ITV diameter change (Δd_ITV). Results: 34 (85%) patients showed ITV decrease and 6 (15%) patients showed ITV increase throughout the course of lung SBRT. Percentage ITV change ranged from −59.6% to 13.0%, with a mean (±SD) of −21.0% (±21.4%). Averaged over all patients, P_ADP resulted in significantly (P=0 to .045) lower values for all dosimetric metrics. Δd_ITV/d_T-OAR was found to correlate with changes in dose to 5 cc (ΔD5cc) of esophagus (r=0.61) and dose to 30 cc (ΔD30cc) of chest wall (r=0.81). Stronger correlations between Δd_ITV/d_T-OAR and ΔD30cc of chest wall were discovered for peripheral (r=0.81) and central (r=0.84) tumors, respectively. Conclusions: Dosimetric effects of adaptive lung SBRT planning depend upon target volume changes and tumor-to-OAR distances. Adaptive lung SBRT can potentially reduce dose to adjacent OARs if patients present large tumor volume shrinkage during the treatment.
Adaptive Stereotactic Body Radiation Therapy Planning for Lung Cancer
Energy Technology Data Exchange (ETDEWEB)
Qin, Yujiao [Medical Physics Graduate Program, Duke University, Durham, North Carolina (United States); Zhang, Fan [Occupational and Environmental Safety Office, Duke University Medical Center, Durham, North Carolina (United States); Yoo, David S.; Kelsey, Chris R. [Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina (United States); Yin, Fang-Fang [Medical Physics Graduate Program, Duke University, Durham, North Carolina (United States); Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina (United States); Cai, Jing, E-mail: jing.cai@duke.edu [Medical Physics Graduate Program, Duke University, Durham, North Carolina (United States); Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina (United States)
2013-09-01
Invariant metric for nonlinear symplectic maps
Indian Academy of Sciences (India)
In this paper, we construct an invariant metric in the space of homogeneous polynomials of a given degree (≥ 3). The homogeneous polynomials specify a nonlinear symplectic map which in turn represents a Hamiltonian system. By minimizing the norm constructed out of this metric as a function of system parameters, we ...
Two-dimensional manifolds with metrics of revolution
International Nuclear Information System (INIS)
Sabitov, I Kh
2000-01-01
This is a study of the topological and metric structure of two-dimensional manifolds with a metric that is locally a metric of revolution. In the case of compact manifolds this problem can be thoroughly investigated, and in particular it is explained why there are no closed analytic surfaces of revolution in R^3 other than a sphere and a torus (moreover, in the smoothness class C^∞ such surfaces, understood in a certain generalized sense, exist in any topological class).
Gravitational lensing in metric theories of gravity
International Nuclear Information System (INIS)
Sereno, Mauro
2003-01-01
Gravitational lensing in metric theories of gravity is discussed. I introduce a generalized approximate metric element, inclusive of both post-post-Newtonian contributions and a gravitomagnetic field. Following Fermat's principle and standard hypotheses, I derive the time delay function and deflection angle caused by an isolated mass distribution. Several astrophysical systems are considered. In most of the cases, the gravitomagnetic correction offers the best perspectives for an observational detection. Actual measurements distinguish only marginally different metric theories from each other
The uniqueness of the Fisher metric as information metric
Czech Academy of Sciences Publication Activity Database
Le, Hong-Van
2017-01-01
Roč. 69, č. 4 (2017), s. 879-896 ISSN 0020-3157 Institutional support: RVO:67985840 Keywords : Chentsov’s theorem * mixed topology * monotonicity of the Fisher metric Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 1.049, year: 2016 https://link.springer.com/article/10.1007%2Fs10463-016-0562-0
Danilǎ, Bogdan; Harko, Tiberiu; Lobo, Francisco S. N.; Mak, M. K.
2017-02-01
We consider the internal structure and the physical properties of specific classes of neutron, quark and Bose-Einstein condensate stars in the recently proposed hybrid metric-Palatini gravity theory, which is a combination of the metric and Palatini f(R) formalisms. It turns out that the theory is very successful in accounting for the observed phenomenology, since it unifies local constraints at the Solar System level and the late-time cosmic acceleration, even if the scalar field is very light. In this paper, we derive the equilibrium equations for a spherically symmetric configuration (mass continuity and Tolman-Oppenheimer-Volkoff) in the framework of the scalar-tensor representation of the hybrid metric-Palatini theory, and we investigate their solutions numerically for different equations of state of neutron and quark matter, by adopting for the scalar field potential a Higgs-type form. It turns out that the scalar-tensor definition of the potential can be represented as a Clairaut differential equation, and provides an explicit form for f(R) given by f(R) ~ R + Λ_eff, where Λ_eff is an effective cosmological constant. Furthermore, stellar models, described by the stiff fluid, radiation-like, bag model and the Bose-Einstein condensate equations of state are explicitly constructed in both general relativity and hybrid metric-Palatini gravity, thus allowing an in-depth comparison between the predictions of these two gravitational theories. As a general result it turns out that for all the considered equations of state, hybrid gravity stars are more massive than their general relativistic counterparts. Furthermore, two classes of stellar models corresponding to two particular choices of the functional form of the scalar field (constant value, and logarithmic form, respectively) are also investigated. Interestingly enough, in the case of a constant scalar field the equation of state of the matter takes the form of the bag model equation of state describing
The universal connection and metrics on moduli spaces
International Nuclear Information System (INIS)
Massamba, Fortune; Thompson, George
2003-11-01
We introduce a class of metrics on gauge theoretic moduli spaces. These metrics are made out of the universal matrix that appears in the universal connection construction of M. S. Narasimhan and S. Ramanan. As an example we construct metrics on the c_2 = 1 SU(2) moduli space of instantons on R^4 for various universal matrices. (author)
Reproducibility of graph metrics in fMRI networks
Directory of Open Access Journals (Sweden)
Qawi K Telesford
2010-12-01
The reliability of graph metrics calculated in network analysis is essential to the interpretation of complex network organization. These graph metrics are used to deduce the small-world properties in networks. In this study, we investigated the test-retest reliability of graph metrics from functional magnetic resonance imaging (fMRI) data collected for two runs in 45 healthy older adults. Graph metrics were calculated on data for both runs and compared using intraclass correlation coefficient (ICC) statistics and Bland-Altman (BA) plots. ICC scores describe the level of absolute agreement between two measurements and provide a measure of reproducibility. For mean graph metrics, ICC scores were high for clustering coefficient (ICC=0.86), global efficiency (ICC=0.83), path length (ICC=0.79), and local efficiency (ICC=0.75); the ICC score for degree was found to be low (ICC=0.29). ICC scores were also used to generate reproducibility maps in brain space to test voxel-wise reproducibility for unsmoothed and smoothed data. Reproducibility was uniform across the brain for global efficiency and path length, but was only high in network hubs for clustering coefficient, local efficiency and degree. BA plots were used to test the measurement repeatability of all graph metrics. All graph metrics fell within the limits for repeatability. Together, these results suggest that with the exception of degree, mean graph metrics are reproducible and suitable for clinical studies. Further exploration is warranted to better understand reproducibility across the brain on a voxel-wise basis.
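The abstract above reports ICC scores for test-retest agreement but does not name the ICC variant; a common choice for absolute agreement between repeated runs is ICC(2,1) of Shrout and Fleiss. The sketch below is our own illustration of that formula, not the study's code:

```python
import numpy as np

def icc_2_1(x):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.

    x is an (n_subjects, k_measurements) array, e.g. one column per fMRI run.
    """
    x = np.asarray(x, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)   # per-subject means
    col_means = x.mean(axis=0)   # per-run means
    # Mean squares from the two-way ANOVA decomposition.
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between subjects
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between runs
    sse = np.sum((x - row_means[:, None] - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                        # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

Perfectly repeated measurements yield an ICC of 1, and the score falls as run-to-run disagreement grows relative to between-subject spread.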
Complexity Metrics for Workflow Nets
DEFF Research Database (Denmark)
Lassen, Kristian Bisgaard; van der Aalst, Wil M.P.
2009-01-01
analysts have difficulties grasping the dynamics implied by a process model. Recent empirical studies show that people make numerous errors when modeling complex business processes, e.g., about 20 percent of the EPCs in the SAP reference model have design flaws resulting in potential deadlocks, livelocks, etc. It seems obvious that the complexity of the model contributes to design errors and a lack of understanding. It is not easy to measure complexity, however. This paper presents three complexity metrics that have been implemented in the process analysis tool ProM. The metrics are defined for a subclass of Petri nets named Workflow nets, but the results can easily be applied to other languages. To demonstrate the applicability of these metrics, we have applied our approach and tool to 262 relatively complex Protos models made in the context of various student projects. This allows us to validate
Energy Technology Data Exchange (ETDEWEB)
Nawrocki, J; Chino, J; Light, K; Vergalasova, I; Craciunescu, O [Duke University Medical Center, Durham, NC (United States)
2014-06-01
Purpose: To compare PET extracted metrics and investigate the role of a gradient-based PET segmentation tool, PET Edge (MIM Software Inc., Cleveland, OH), in the context of an adaptive PET protocol for node positive gynecologic cancer patients. Methods: An IRB approved protocol enrolled women with gynecological, PET visible malignancies. A PET-CT was obtained for treatment planning prescribed to 45–50.4Gy with a 55–70Gy boost to the PET positive nodes. An intra-treatment PET-CT was obtained between 30–36Gy, and all volumes re-contoured. Standard uptake values (SUVmax, SUVmean, SUVmedian) and GTV volumes were extracted from the clinician contoured GTVs on the pre- and intra-treatment PET-CT for primaries and nodes and compared with a two tailed Wilcoxon signed-rank test. The differences between primary and node GTV volumes contoured in the treatment planning system and those volumes generated using PET Edge were also investigated. Bland-Altman plots were used to describe significant differences between the two contouring methods. Results: Thirteen women were enrolled in this study. The median baseline/intra-treatment primary (SUVmax, mean, median) were (30.5, 9.09, 7.83)/(16.6, 4.35, 3.74), and nodes were (20.1, 4.64, 3.93)/(6.78, 3.13, 3.26). The p values were all < 0.001. The clinical contours were all larger than the PET Edge generated ones, with mean difference of +20.6 ml for primary, and +23.5 ml for nodes. The Bland-Altman revealed changes between clinician/PET Edge contours to be mostly within the margins of the coefficient of variability. However, there was a proportional trend, i.e. the larger the GTV, the larger the clinical contours as compared to PET Edge contours. Conclusion: Primary and node SUV values taken from the intra-treatment PET-CT can be used to assess the disease response and to design an adaptive plan. The PET Edge tool can streamline the contouring process and lead to smaller, less user-dependent contours.
Goedel-type metrics in various dimensions
International Nuclear Information System (INIS)
Guerses, Metin; Karasu, Atalay; Sarioglu, Oezguer
2005-01-01
Goedel-type metrics are introduced and used in producing charged dust solutions in various dimensions. The key ingredient is a (D - 1)-dimensional Riemannian geometry which is then employed in constructing solutions to the Einstein-Maxwell field equations with a dust distribution in D dimensions. The only essential field equation in the procedure turns out to be the source-free Maxwell's equation in the relevant background. Similarly the geodesics of this type of metric are described by the Lorentz force equation for a charged particle in the lower dimensional geometry. It is explicitly shown with several examples that Goedel-type metrics can be used in obtaining exact solutions to various supergravity theories and in constructing spacetimes that contain both closed timelike and closed null curves and that contain neither of these. Among the solutions that can be established using non-flat backgrounds, such as the Tangherlini metrics in (D - 1)-dimensions, there exists a class which can be interpreted as describing black-hole-type objects in a Goedel-like universe
Standardised metrics for global surgical surveillance.
Weiser, Thomas G; Makary, Martin A; Haynes, Alex B; Dziekan, Gerald; Berry, William R; Gawande, Atul A
2009-09-26
Public health surveillance relies on standardised metrics to evaluate disease burden and health system performance. Such metrics have not been developed for surgical services despite increasing volume, substantial cost, and high rates of death and disability associated with surgery. The Safe Surgery Saves Lives initiative of WHO's Patient Safety Programme has developed standardised public health metrics for surgical care that are applicable worldwide. We assembled an international panel of experts to develop and define metrics for measuring the magnitude and effect of surgical care in a population, while taking into account economic feasibility and practicability. This panel recommended six measures for assessing surgical services at a national level: number of operating rooms, number of operations, number of accredited surgeons, number of accredited anaesthesia professionals, day-of-surgery death ratio, and postoperative in-hospital death ratio. We assessed the feasibility of gathering such statistics at eight diverse hospitals in eight countries and incorporated them into the WHO Guidelines for Safe Surgery, in which methods for data collection, analysis, and reporting are outlined.
Developing a Security Metrics Scorecard for Healthcare Organizations.
Elrefaey, Heba; Borycki, Elizabeth; Kushniruk, Andrea
2015-01-01
In healthcare, information security is a key aspect of protecting a patient's privacy and ensuring systems availability to support patient care. Security managers need to measure the performance of security systems and this can be achieved by using evidence-based metrics. In this paper, we describe the development of an evidence-based security metrics scorecard specific to healthcare organizations. Study participants were asked to comment on the usability and usefulness of a prototype of a security metrics scorecard that was developed based on current research in the area of general security metrics. Study findings revealed that scorecards need to be customized for the healthcare setting in order for the security information to be useful and usable in healthcare organizations. The study findings resulted in the development of a security metrics scorecard that matches the healthcare security experts' information requirements.
Landscape pattern metrics and regional assessment
O'Neill, R. V.; Riitters, K.H.; Wickham, J.D.; Jones, K.B.
1999-01-01
The combination of remote imagery data, geographic information systems software, and landscape ecology theory provides a unique basis for monitoring and assessing large-scale ecological systems. The unique feature of the work has been the need to develop and interpret quantitative measures of spatial pattern: the landscape indices. This article reviews what is known about the statistical properties of these pattern metrics and suggests some additional metrics based on island biogeography, percolation theory, hierarchy theory, and economic geography. Assessment applications of this approach have required interpreting the pattern metrics in terms of specific environmental endpoints, such as wildlife and water quality, and research into how to represent synergistic effects of many overlapping sources of stress.
Self-benchmarking Guide for Cleanrooms: Metrics, Benchmarks, Actions
Energy Technology Data Exchange (ETDEWEB)
Mathew, Paul; Sartor, Dale; Tschudi, William
2009-07-13
This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.
Metrics Are Needed for Collaborative Software Development
Directory of Open Access Journals (Sweden)
Mojgan Mohtashami
2011-10-01
There is a need for metrics for inter-organizational collaborative software development projects, encompassing management and technical concerns. In particular, metrics are needed that are aimed at the collaborative aspect itself, such as readiness for collaboration, the quality and/or the costs and benefits of collaboration in a specific ongoing project. We suggest questions and directions for such metrics, spanning the full lifespan of a collaborative project, from considering the suitability of collaboration through evaluating ongoing projects to final evaluation of the collaboration.
Predicting class testability using object-oriented metrics
Bruntink, Magiel; Deursen, Arie
2004-01-01
In this paper we investigate factors of the testability of object-oriented software systems. The starting point is given by a study of the literature to obtain both an initial model of testability and existing OO metrics related to testability. Subsequently, these metrics are evaluated by means of two case studies of large Java systems for which JUnit test cases exist. The goal of this paper is to define and evaluate a set of metrics that can be used to assess the testability of t...
Software metrics a rigorous and practical approach
Fenton, Norman
2014-01-01
A Framework for Managing, Measuring, and Predicting Attributes of Software Development Products and Processes. Reflecting the immense progress in the development and use of software metrics in the past decades, Software Metrics: A Rigorous and Practical Approach, Third Edition provides an up-to-date, accessible, and comprehensive introduction to software metrics. Like its popular predecessors, this third edition discusses important issues, explains essential concepts, and offers new approaches for tackling long-standing problems. New to the Third Edition: This edition contains new material relevant
Hermitian-Einstein metrics on parabolic stable bundles
International Nuclear Information System (INIS)
Li Jiayu; Narasimhan, M.S.
1995-12-01
Let M-bar be a compact complex manifold of complex dimension two with a smooth Kaehler metric and D a smooth divisor on M-bar. If E is a rank 2 holomorphic vector bundle on M-bar with a stable parabolic structure along D, we prove the existence of a metric on E' = E restricted to M-bar\D (compatible with the parabolic structure) which is Hermitian-Einstein with respect to the restriction of the Kaehler metric to M-bar\D. A converse is also proved. (author). 24 refs
The Development and Assessment of Adaptation Pathways for Urban Pluvial Flooding
Babovic, F.; Mijic, A.; Madani, K.
2017-12-01
Around the globe, urban areas are growing in both size and importance. However, due to the prevalence of impermeable surfaces within the urban fabric of cities these areas have a high risk of pluvial flooding. Due to the convergence of population growth and climate change the risk of pluvial flooding is growing. When designing solutions and adaptations to pluvial flood risk, urban planners and engineers encounter a great deal of uncertainty due to model uncertainty, uncertainty within the data utilised, and uncertainty related to future climate and land use conditions. The interaction of these uncertainties leads to conditions of deep uncertainty. However, infrastructure systems must be designed and built in the face of this deep uncertainty. An Adaptation Tipping Points (ATP) methodology was used to develop a strategy to adapt an urban drainage system in the North East of London under conditions of deep uncertainty. The ATP approach was used to assess the current drainage system and potential drainage system adaptations. These adaptations were assessed against potential changes in rainfall depth and peakedness, defined as the ratio of mean to peak rainfall. These solutions encompassed both traditional and blue-green solutions that the Local Authority are known to be considering. This resulted in a set of Adaptation Pathways. However, these pathways do not convey any information regarding the relative merits and demerits of the potential adaptation options presented. To address this, a cost-benefit metric was developed that would reflect the solutions' costs and benefits under uncertainty. The resulting metric combines elements of the Benefits of SuDS Tool (BeST) with real options analysis in order to reflect the potential value of ecosystem services delivered by blue-green solutions under uncertainty. Lastly, it is discussed how a local body can utilise the adaptation pathways; their relative costs and benefits; and a system of local data collection to help guide
Coverage Metrics for Model Checking
Penix, John; Visser, Willem; Norvig, Peter (Technical Monitor)
2001-01-01
When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.
Future of the PCI Readmission Metric.
Wasfy, Jason H; Yeh, Robert W
2016-03-01
Between 2013 and 2014, the Centers for Medicare and Medicaid Services and the National Cardiovascular Data Registry publically reported risk-adjusted 30-day readmission rates after percutaneous coronary intervention (PCI) as a pilot project. A key strength of this public reporting effort included risk adjustment with clinical rather than administrative data. Furthermore, because readmission after PCI is common, expensive, and preventable, this metric has substantial potential to improve quality and value in American cardiology care. Despite this, concerns about the metric exist. For example, few PCI readmissions are caused by procedural complications, limiting the extent to which improved procedural technique can reduce readmissions. Also, similar to other readmission measures, PCI readmission is associated with socioeconomic status and race. Accordingly, the metric may unfairly penalize hospitals that care for underserved patients. Perhaps in the context of these limitations, Centers for Medicare and Medicaid Services has not yet included PCI readmission among metrics that determine Medicare financial penalties. Nevertheless, provider organizations may still wish to focus on this metric to improve value for cardiology patients. PCI readmission is associated with low-risk chest discomfort and patient anxiety. Therefore, patient education, improved triage mechanisms, and improved care coordination offer opportunities to minimize PCI readmissions. Because PCI readmission is common and costly, reducing PCI readmission offers provider organizations a compelling target to improve the quality of care, and also performance in contracts that involve shared financial risk. © 2016 American Heart Association, Inc.
Model assessment using a multi-metric ranking technique
Fitzpatrick, P. J.; Lau, Y.; Alaka, G.; Marks, F.
2017-12-01
Validation comparisons of multiple models present challenges when skill levels are similar, especially in regimes dominated by the climatological mean. Assessing skill separation requires advanced validation metrics and identifying adeptness in extreme events, while maintaining simplicity for management decisions. Flexibility for operations is also an asset. This work postulates a weighted tally and consolidation technique which ranks results by multiple types of metrics. Variables include absolute error, bias, acceptable absolute error percentages, outlier metrics, model efficiency, Pearson correlation, Kendall's Tau, reliability index, multiplicative gross error, and root mean squared differences. Other metrics, such as root mean square difference and rank correlation, were also explored, but removed when the information was discovered to be generally duplicative of other metrics. While equal weights are applied, weights could be altered depending on preferred metrics. Two examples are shown comparing ocean models' currents and tropical cyclone products, including experimental products. The importance of using magnitude and direction for tropical cyclone track forecasts instead of distance, along-track, and cross-track is discussed. Tropical cyclone intensity and structure prediction are also assessed. Vector correlations are not included in the ranking process, but found useful in an independent context, and will be briefly reported.
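The weighted tally described above can be read as rank aggregation: rank the models separately on each metric, weight each rank, and sum. The function and metric names below are illustrative assumptions, not the authors' implementation:

```python
from typing import Dict, List, Optional

def rank_models(scores: Dict[str, Dict[str, float]],
                lower_is_better: Dict[str, bool],
                weights: Optional[Dict[str, float]] = None) -> List[str]:
    """Rank models by a weighted tally of per-metric ranks (best first).

    scores[model][metric] -> value; lower_is_better says which direction
    is good for each metric (True for errors, False for correlations).
    """
    models = list(scores)
    tally = {m: 0.0 for m in models}
    for metric, lower in lower_is_better.items():
        w = 1.0 if weights is None else weights[metric]
        # Sort so the best-performing model on this metric gets rank 1.
        ordered = sorted(models, key=lambda m: scores[m][metric],
                         reverse=not lower)
        for rank, m in enumerate(ordered, start=1):
            tally[m] += w * rank  # smaller tally = better consolidated rank
    return sorted(models, key=lambda m: tally[m])
```

With equal weights this reduces to summing plain ranks, matching the paper's default; preferred metrics can be emphasized by passing larger weights.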
Comparison of luminance based metrics in different lighting conditions
DEFF Research Database (Denmark)
Wienold, J.; Kuhn, T.E.; Christoffersen, J.
In this study, we evaluate established and newly developed metrics for predicting glare using data from three different research studies. The evaluation covers two different targets: 1. How well does the user's perception of glare magnitude correlate with the prediction of the glare metrics? 2. How well do the glare metrics describe the subjects' disturbance by glare? We applied Spearman correlations, logistic regressions and an accuracy evaluation based on an ROC analysis. The results show that five of the twelve investigated metrics fail at least one of the statistical tests. The other seven metrics, CGI, modified DGI, DGP, Ev, average luminance of the image Lavg, UGP and UGR, pass all statistical tests. DGP, CGI, DGI_mod and UGP have the largest AUC and might be slightly more robust. The accuracy of the predictions of the aforementioned seven metrics for the disturbance by glare lies
A bi-metric theory of gravitation
International Nuclear Information System (INIS)
Rosen, N.
1975-01-01
The bi-metric theory of gravitation proposed previously is simplified in that the auxiliary conditions are discarded, the two metric tensors being tied together only by means of the boundary conditions. Some of the properties of the field of a particle are investigated; there is no black hole, and it appears that no gravitational collapse can take place. Although the proposed theory and general relativity are at present observationally indistinguishable, some differences are pointed out which may some day be susceptible of observation. An alternative bi-metric theory is considered which gives for the precession of the perihelion 5/6 of the value given by general relativity; it seems less satisfactory than the present theory from the aesthetic point of view. (author)
Poodat, Fatemeh; Arrowsmith, Colin; Fraser, David; Gordon, Ascelin
2015-09-01
Connectivity among fragmented areas of habitat has long been acknowledged as important for the viability of biological conservation, especially within highly modified landscapes. Identifying important habitat patches in ecological connectivity is a priority for many conservation strategies, and the application of 'graph theory' has been shown to provide useful information on connectivity. Despite the large number of metrics for connectivity derived from graph theory, only a small number have been compared in terms of the importance they assign to nodes in a network. This paper presents a study that aims to define a new set of metrics and compare these with traditional graph-based metrics used in the prioritization of habitat patches for ecological connectivity. The metrics measured consist of "topological" metrics, "ecological" metrics, and "integrated" metrics; integrated metrics are a combination of topological and ecological metrics. Eight metrics were applied to the habitat network for the fat-tailed dunnart within Greater Melbourne, Australia. A non-directional network was developed in which nodes were linked to adjacent nodes. These links were then weighted by the effective distance between patches. By applying each of the eight metrics for the study network, nodes were ranked according to their contribution to the overall network connectivity. The structured comparison revealed the similarity and differences in the way the habitat for the fat-tailed dunnart was ranked based on different classes of metrics. Due to the differences in the way the metrics operate, a suitable metric should be chosen that best meets the objectives established by the decision maker.
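One simple way to rank patches by their contribution to overall network connectivity is node removal: count how many connected patch pairs are lost when a patch is deleted. This is a crude stdlib-only illustration of that idea, not any of the eight published metrics:

```python
from collections import deque

def connected_pairs(nodes, edges):
    """Number of node pairs joined by some path (undirected, unweighted)."""
    adj = {n: set() for n in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, total = set(), 0
    for start in nodes:
        if start in seen:
            continue
        # BFS to collect one connected component.
        comp, queue = {start}, deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in comp:
                    comp.add(v)
                    queue.append(v)
        seen |= comp
        c = len(comp)
        total += c * (c - 1) // 2  # pairs within this component
    return total

def patch_importance(nodes, edges):
    """Drop in connected pairs when each patch (node) is removed."""
    base = connected_pairs(nodes, edges)
    importance = {}
    for n in nodes:
        rest = [m for m in nodes if m != n]
        kept = [(a, b) for a, b in edges if n not in (a, b)]
        importance[n] = base - connected_pairs(rest, kept)
    return importance
```

For a chain of patches A-B-C, the middle patch B scores highest because removing it disconnects the network; edge-weighted variants would use the effective inter-patch distances mentioned above.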
Directory of Open Access Journals (Sweden)
Rebecca SAFRAN, Samuel FLAXMAN, Michael KOPP, Darren E. IRWIN, Derek BRIGGS, Matthew R. EVANS, W. Chris FUNK, David A. GRAY, Eileen A. HEBE
2012-06-01
Whereas a rich literature exists for estimating population genetic divergence, metrics of phenotypic trait divergence are lacking, particularly for comparing multiple traits among three or more populations. Here, we review and analyze via simulation Hedges' g, a widely used parametric estimate of effect size. Our analyses indicate that g is sensitive to a combination of unequal trait variances and unequal sample sizes among populations and to changes in the scale of measurement. We then derive and explain a new, non-parametric distance measure, "Δp", which is calculated from a joint cumulative distribution function (CDF) of all populations under study. More precisely, distances are measured in terms of the percentiles in this CDF at which each population's median lies. Δp combines many desirable features of other distance metrics into a single metric; namely, Δp is relatively insensitive to unequal variances and sample sizes among the populations sampled. Furthermore, a key feature of Δp—and our main motivation for developing it—is that it easily accommodates simultaneous comparisons of any number of traits across any number of populations. To exemplify its utility, we employ Δp to address a question related to the role of sexual selection in speciation: are sexual signals more divergent than ecological traits in closely related taxa? Using traits of known function in closely related populations, we show that traits predictive of reproductive performance are, indeed, more divergent and more sexually dimorphic than traits related to ecological adaptation [Current Zoology 58 (3): 423–436, 2012].
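The percentile construction behind Δp can be sketched for a single trait. This is an illustrative reading of the description above: pool the samples, locate each population's median as a percentile of the joint empirical CDF, and compare those percentiles. The paper's exact multi-trait aggregation and pairwise combination may differ.

```python
import numpy as np

def delta_p(populations):
    """Sketch of a Δp-style distance: the percentile, within the joint
    empirical CDF of all samples, at which each population's median lies;
    the distance is the largest pairwise percentile difference."""
    pooled = np.sort(np.concatenate(populations))
    percentiles = [
        np.searchsorted(pooled, np.median(pop), side="right") / len(pooled)
        for pop in populations
    ]
    n = len(percentiles)
    return max(abs(percentiles[i] - percentiles[j])
               for i in range(n) for j in range(i + 1, n))
```

Because only medians and ranks enter the computation, the value is unaffected by monotone rescaling of the trait and is robust to unequal variances, which is the property the abstract emphasizes.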
Comparative Study of Trace Metrics between Bibliometrics and Patentometrics
Directory of Open Access Journals (Sweden)
Fred Y. Ye
2016-06-01
Purpose: To comprehensively evaluate the overall performance of a group or an individual in both bibliometrics and patentometrics. Design/methodology/approach: Trace metrics were applied to the top 30 universities in the 2014 Academic Ranking of World Universities (ARWU) in computer sciences, the top 30 ESI highly cited papers in the computer sciences field in 2014, as well as the top 30 assignees and the top 30 most cited patents in the National Bureau of Economic Research (NBER) computer hardware and software category. Findings: We found that, by applying trace metrics, the research or marketing impact efficiency, at both group and individual levels, was clearly observed. Furthermore, trace metrics were more sensitive to the different publication-citation distributions than the average citation and h-index were. Research limitations: Trace metrics consider publications with zero citations as negative contributions. One should clarify how he/she evaluates a zero-citation paper or patent before applying trace metrics. Practical implications: Decision makers could regularly examine the performance of their university/company by applying trace metrics and adjust their policies accordingly. Originality/value: Trace metrics can be applied in both bibliometrics and patentometrics and provide a comprehensive view. Moreover, the high sensitivity and unique impact-efficiency view provided by trace metrics can help decision makers examine and adjust their policies.
Evaluating and Estimating the WCET Criticality Metric
DEFF Research Database (Denmark)
Jordan, Alexander
2014-01-01
a programmer (or compiler) from targeting optimizations the right way. A possible resort is to use a metric that targets WCET and which can be efficiently computed for all code parts of a program. Similar to dynamic profiling techniques, which execute code with input that is typically expected...... for the application, based on WCET analysis we can indicate how critical a code fragment is, in relation to the worst-case bound. Computing such a metric on top of static analysis, incurs a certain overhead though, which increases with the complexity of the underlying WCET analysis. We present our approach...... to estimate the Criticality metric, by relaxing the precision of WCET analysis. Through this, we can reduce analysis time by orders of magnitude, while only introducing minor error. To evaluate our estimation approach and share our garnered experience using the metric, we evaluate real-time programs, which...
Adaptive distributed outlier detection for WSNs.
De Paola, Alessandra; Gaglio, Salvatore; Lo Re, Giuseppe; Milazzo, Fabrizio; Ortolani, Marco
2015-05-01
The paradigm of pervasive computing is gaining more and more attention nowadays, thanks to the possibility of obtaining precise and continuous monitoring. Ease of deployment and adaptivity are typically implemented by adopting autonomous and cooperative sensory devices; however, for such systems to be of any practical use, reliability and fault tolerance must be guaranteed, for instance by detecting corrupted readings amidst the huge amount of gathered sensory data. This paper proposes an adaptive distributed Bayesian approach for detecting outliers in data collected by a wireless sensor network; our algorithm aims at optimizing classification accuracy, time complexity and communication complexity, and also considering externally imposed constraints on such conflicting goals. The performed experimental evaluation showed that our approach is able to improve the considered metrics for latency and energy consumption, with limited impact on classification accuracy.
A convergence theory for probabilistic metric spaces | Jäger ...
African Journals Online (AJOL)
We develop a theory of probabilistic convergence spaces based on Tardiff's neighbourhood systems for probabilistic metric spaces. We show that the resulting category is a topological universe and we characterize a subcategory that is isomorphic to the category of probabilistic metric spaces. Keywords: Probabilistic metric ...
Forecasting Sensorimotor Adaptability from Baseline Inter-Trial Correlations
Beaton, K. H.; Bloomberg, J. J.
2016-01-01
One of the greatest challenges for sensorimotor adaptation to the spaceflight environment is the large variability in symptoms, and corresponding functional impairments, from one crewmember to the next. This renders preflight training and countermeasure development difficult, as a "one-size-fits-all" approach is inappropriate. Therefore, it would be highly advantageous to know ahead of time which crewmembers might have more difficulty adjusting to the novel g-levels inherent to spaceflight. This information could guide individually customized countermeasures, which would enable more efficient use of crew time and provide better outcomes. The principal aim of this work is to look for baseline performance metrics that relate to locomotor adaptability. We propose a novel hypothesis that considers baseline inter-trial correlations, the trial-to-trial fluctuations ("noise") in motor performance, as a predictor of individual adaptive capabilities.
Indefinite metric fields and the renormalization group
International Nuclear Information System (INIS)
Sherry, T.N.
1976-11-01
The renormalization group equations are derived for the Green functions of an indefinite metric field theory. In these equations one retains the mass dependence of the coefficient functions, since in the indefinite metric theories the masses cannot be neglected. The behavior of the effective coupling constant in the asymptotic and infrared limits is analyzed. The analysis is illustrated by means of a simple model incorporating indefinite metric fields. The model scales at first order, and at this order also the effective coupling constant has both ultraviolet and infrared fixed points, the former being the bare coupling constant.
Kerr-Newman metric in deSitter background
International Nuclear Information System (INIS)
Patel, L.K.; Koppar, S.S.; Bhatt, P.V.
1987-01-01
In addition to the Kerr-Newman metric with cosmological constant several other metrics are presented giving Kerr-Newman type solutions of Einstein-Maxwell field equations in the background of deSitter universe. The electromagnetic field in all the solutions is assumed to be source-free. A new metric of what may be termed as an electrovac rotating deSitter space-time- a space-time devoid of matter but containing source-free electromagnetic field and a null fluid with twisting rays-has been presented. In the absence of the electromagnetic field, these solutions reduce to those discussed by Vaidya (1984). 8 refs. (author)
The independence of software metrics taken at different life-cycle stages
Kafura, D.; Canning, J.; Reddy, G.
1984-01-01
Over the past few years a large number of software metrics have been proposed and, in varying degrees, a number of these metrics have been subjected to empirical validation which demonstrated the utility of the metrics in the software development process. This study attempts to classify these metrics and to determine whether the metrics in different classes measure distinct attributes of the software product. Statistical analysis is used to determine the degree of relationship among the metrics.
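The kind of relationship analysis described above can be illustrated with a correlation matrix over metric values. The numbers below are invented toy data, not the study's measurements:

```python
import numpy as np

# Hypothetical measurements: rows are programs, columns are three
# software metrics (e.g. lines of code, cyclomatic complexity, fan-out).
metrics = np.array([
    [120.0, 14.0, 33.0],
    [450.0, 40.0, 91.0],
    [ 80.0, 10.0, 25.0],
    [300.0, 22.0, 60.0],
])

# Pairwise correlations: strongly correlated metrics likely measure
# overlapping attributes of the software product rather than distinct ones.
corr = np.corrcoef(metrics, rowvar=False)
```

A near-diagonal correlation matrix would suggest the metric classes capture distinct attributes; uniformly high off-diagonal values would suggest redundancy.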
Thermodynamic metrics and optimal paths.
Sivak, David A; Crooks, Gavin E
2012-05-11
A fundamental problem in modern thermodynamics is how a molecular-scale machine performs useful work, while operating away from thermal equilibrium without excessive dissipation. To this end, we derive a friction tensor that induces a Riemannian manifold on the space of thermodynamic states. Within the linear-response regime, this metric structure controls the dissipation of finite-time transformations, and bestows optimal protocols with many useful properties. We discuss the connection to the existing thermodynamic length formalism, and demonstrate the utility of this metric by solving for optimal control parameter protocols in a simple nonequilibrium model.
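A commonly cited form of this construction, stated here from general linear-response theory rather than verified against the paper's notation, defines the friction tensor from force autocorrelations and a metric length for a control protocol λ(t):

```latex
\zeta_{ij}(\boldsymbol{\lambda})
  = \beta \int_0^\infty
    \langle \delta X_i(t)\, \delta X_j(0) \rangle_{\boldsymbol{\lambda}}\, dt,
\qquad
\mathcal{L}
  = \int_0^\tau \sqrt{\dot{\lambda}^i\, \zeta_{ij}\, \dot{\lambda}^j}\; dt,
```

where the X_i are the forces conjugate to the control parameters λ^i and δX denotes the fluctuation about the equilibrium mean. Near equilibrium, the mean excess work of a finite-time protocol is controlled by this length, so geodesics of the metric ζ give the optimal protocols the abstract refers to.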
Invariant metrics for Hamiltonian systems
International Nuclear Information System (INIS)
Rangarajan, G.; Dragt, A.J.; Neri, F.
1991-05-01
In this paper, invariant metrics are constructed for Hamiltonian systems. These metrics give rise to norms on the space of homogeneous polynomials of phase-space variables. For an accelerator lattice described by a Hamiltonian, these norms characterize the nonlinear content of the lattice. Therefore, the performance of the lattice can be improved by minimizing the norm as a function of parameters describing the beam-line elements in the lattice. A four-fold increase in the dynamic aperture of a model FODO cell is obtained using this procedure. 7 refs
Steiner trees for fixed orientation metrics
DEFF Research Database (Denmark)
Brazil, Marcus; Zachariasen, Martin
2009-01-01
We consider the problem of constructing Steiner minimum trees for a metric defined by a polygonal unit circle (corresponding to s = 2 weighted legal orientations in the plane). A linear-time algorithm to enumerate all angle configurations for degree three Steiner points is given. We provide...... a simple proof that the angle configuration for a Steiner point extends to all Steiner points in a full Steiner minimum tree, such that at most six orientations suffice for edges in a full Steiner minimum tree. We show that the concept of canonical forms originally introduced for the uniform orientation...... metric generalises to the fixed orientation metric. Finally, we give an O(s n) time algorithm to compute a Steiner minimum tree for a given full Steiner topology with n terminal leaves....
Metric Learning for Hyperspectral Image Segmentation
Bue, Brian D.; Thompson, David R.; Gilmore, Martha S.; Castano, Rebecca
2011-01-01
We present a metric learning approach to improve the performance of unsupervised hyperspectral image segmentation. Unsupervised spatial segmentation can assist both user visualization and automatic recognition of surface features. Analysts can use spatially-continuous segments to decrease noise levels and/or localize feature boundaries. However, existing segmentation methods use task-agnostic measures of similarity. Here we learn task-specific similarity measures from training data, improving segment fidelity to classes of interest. Multiclass Linear Discriminant Analysis produces a linear transform that optimally separates a labeled set of training classes. This transform defines a distance metric that generalizes to new scenes, enabling graph-based segmentation that emphasizes key spectral features. We describe tests based on data from the Compact Reconnaissance Imaging Spectrometer (CRISM) in which learned metrics improve segment homogeneity with respect to mineralogical classes.
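A two-class Fisher discriminant is a minimal stand-in for the multiclass LDA step described above. The two-band "spectra" here are synthetic, and the paper's full pipeline (multiclass transform plus graph-based segmentation) is not reproduced:

```python
import numpy as np

def fisher_direction(X0, X1):
    """Fisher discriminant direction for two labeled pixel classes:
    w ∝ Sw^{-1} (m1 - m0), with Sw the pooled within-class scatter."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
    w = np.linalg.solve(Sw, m1 - m0)
    return w / np.linalg.norm(w)

def learned_distance(x, y, w):
    """Task-specific distance: compare spectra after projecting onto w."""
    return abs(float(w @ (x - y)))

# Synthetic two-band spectra: class separation lies along band 0,
# while band 1 carries large task-irrelevant variance.
X0 = np.array([[0.0, 0.0], [0.1, 1.0], [0.2, 2.0], [0.1, 3.0]])
X1 = np.array([[5.0, 0.0], [5.1, 1.0], [5.2, 2.0], [5.1, 3.0]])
w = fisher_direction(X0, X1)
```

The learned direction down-weights the noisy band, so the induced distance emphasizes the spectral feature that actually separates the classes, which is the point of replacing a task-agnostic (Euclidean) measure.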
Validation of Metrics as Error Predictors
Mendling, Jan
In this chapter, we test the validity of metrics that were defined in the previous chapter for predicting errors in EPC business process models. In Section 5.1, we provide an overview of how the analysis data is generated. Section 5.2 describes the sample of EPCs from practice that we use for the analysis. Here we discuss a disaggregation by the EPC model group and by error as well as a correlation analysis between metrics and error. Based on this sample, we calculate a logistic regression model for predicting error probability with the metrics as input variables in Section 5.3. In Section 5.4, we then test the regression function for an independent sample of EPC models from textbooks as a cross-validation. Section 5.5 summarizes the findings.
Predicting class testability using object-oriented metrics
M. Bruntink (Magiel); A. van Deursen (Arie)
2004-01-01
textabstractIn this paper we investigate factors of the testability of object-oriented software systems. The starting point is given by a study of the literature to obtain both an initial model of testability and existing OO metrics related to testability. Subsequently, these metrics are evaluated
Software Power Metric Model: An Implementation | Akwukwuma ...
African Journals Online (AJOL)
... and the execution time (TIME) in each case was recorded. We then obtain the application functions point count. Our result shows that the proposed metric is computable, consistent in its use of unit, and is programming language independent. Keywords: Software attributes, Software power, measurement, Software metric, ...
Meter Detection in Symbolic Music Using Inner Metric Analysis
de Haas, W.B.; Volk, A.
2016-01-01
In this paper we present PRIMA: a new model tailored to symbolic music that detects the meter and the first downbeat position of a piece. Given onset data, the metrical structure of a piece is interpreted using the Inner Metric Analysis (IMA) model. IMA identifies the strong and weak metrical
Performance metrics for the evaluation of hyperspectral chemical identification systems
Truslow, Eric; Golowich, Steven; Manolakis, Dimitris; Ingle, Vinay
2016-02-01
Remote sensing of chemical vapor plumes is a difficult but important task for many military and civilian applications. Hyperspectral sensors operating in the long-wave infrared regime have well-demonstrated detection capabilities. However, the identification of a plume's chemical constituents, based on a chemical library, is a multiple hypothesis testing problem which standard detection metrics do not fully describe. We propose using an additional performance metric for identification based on the so-called Dice index. Our approach partitions and weights a confusion matrix to develop both the standard detection metrics and identification metric. Using the proposed metrics, we demonstrate that the intuitive system design of a detector bank followed by an identifier is indeed justified when incorporating performance information beyond the standard detection metrics.
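One standard form of the Dice index, applied here to the set of true plume constituents versus the set reported by the identifier (the paper's exact confusion-matrix partitioning and weighting may differ, and the chemical names are illustrative):

```python
def dice_index(truth, reported):
    """Dice index 2|A ∩ B| / (|A| + |B|) for the true and reported
    sets of chemical constituents; 1.0 means perfect identification."""
    truth, reported = set(truth), set(reported)
    if not truth and not reported:
        return 1.0
    return 2 * len(truth & reported) / (len(truth) + len(reported))
```

For a plume containing two chemicals of which only one is reported, the index is 2/3: unlike a plain detection rate, it penalizes missed constituents and false alarms symmetrically, which is why it suits the multiple-hypothesis identification setting.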
Curvature properties of four-dimensional Walker metrics
International Nuclear Information System (INIS)
Chaichi, M; Garcia-Rio, E; Matsushita, Y
2005-01-01
A Walker n-manifold is a semi-Riemannian manifold, which admits a field of parallel null r-planes, r ≤ n/2. In the present paper we study curvature properties of a Walker 4-manifold (M, g) which admits a field of parallel null 2-planes. The metric g is necessarily of neutral signature (+ + - -). Such a Walker 4-manifold is the lowest dimensional example not of Lorentz type. There are three functions of coordinates which define a Walker metric. Some recent work shows that a Walker 4-manifold of restricted type whose metric is characterized by two functions exhibits a large variety of symplectic structures, Hermitian structures, Kaehler structures, etc. For such a restricted Walker 4-manifold, we shall study mainly curvature properties, e.g., conditions for a Walker metric to be Einstein, Osserman, or locally conformally flat, etc. One of our main results is the exact solutions to the Einstein equations for a restricted Walker 4-manifold
Decision Analysis for Metric Selection on a Clinical Quality Scorecard.
Guth, Rebecca M; Storey, Patricia E; Vitale, Michael; Markan-Aurora, Sumita; Gordon, Randolph; Prevost, Traci Q; Dunagan, Wm Claiborne; Woeltje, Keith F
2016-09-01
Clinical quality scorecards are used by health care institutions to monitor clinical performance and drive quality improvement. Because of the rapid proliferation of quality metrics in health care, BJC HealthCare found it increasingly difficult to select the most impactful scorecard metrics while still monitoring metrics for regulatory purposes. A 7-step measure selection process was implemented incorporating Kepner-Tregoe Decision Analysis, which is a systematic process that considers key criteria that must be satisfied in order to make the best decision. The decision analysis process evaluates what metrics will most appropriately fulfill these criteria, as well as identifies potential risks associated with a particular metric in order to identify threats to its implementation. Using this process, a list of 750 potential metrics was narrowed to 25 that were selected for scorecard inclusion. This decision analysis process created a more transparent, reproducible approach for selecting quality metrics for clinical quality scorecards. © The Author(s) 2015.
International Nuclear Information System (INIS)
Jesic, Sinisa N.; Babacev, Natasa A.
2008-01-01
The purpose of this paper is to prove some common fixed point theorems for a pair of R-weakly commuting mappings defined on intuitionistic fuzzy metric spaces [Park JH. Intuitionistic fuzzy metric spaces. Chaos, Solitons and Fractals 2004;22:1039-46] and L-fuzzy metric spaces [Saadati R, Razani A, Adibi H. A common fixed point theorem in L-fuzzy metric spaces. Chaos, Solitons and Fractals, doi:10.1016/j.chaos.2006.01.023], with nonlinear contractive condition, defined with function, first observed by Boyd and Wong [Boyd DW, Wong JSW. On nonlinear contractions. Proc Am Math Soc 1969;20:458-64]. Following Pant [Pant RP. Common fixed points of noncommuting mappings. J Math Anal Appl 1994;188:436-40] we define R-weak commutativity for a pair of mappings and then prove the main results. These results generalize some known results due to Saadati et al., and Jungck [Jungck G. Commuting maps and fixed points. Am Math Mon 1976;83:261-3]. Some examples and comments according to the preceding results are given
43 CFR 12.915 - Metric system of measurement.
2010-10-01
... procurements, grants, and other business-related activities. Metric implementation may take longer where the... recipient, such as when foreign competitors are producing competing products in non-metric units. (End of...
Cophenetic metrics for phylogenetic trees, after Sokal and Rohlf.
Cardona, Gabriel; Mir, Arnau; Rosselló, Francesc; Rotger, Lucía; Sánchez, David
2013-01-16
Phylogenetic tree comparison metrics are an important tool in the study of evolution, and hence the definition of such metrics is an interesting problem in phylogenetics. In a paper in Taxon fifty years ago, Sokal and Rohlf proposed to measure quantitatively the difference between a pair of phylogenetic trees by first encoding them by means of their half-matrices of cophenetic values, and then comparing these matrices. This idea has been used several times since then to define dissimilarity measures between phylogenetic trees but, to our knowledge, no proper metric on weighted phylogenetic trees with nested taxa based on this idea has been formally defined and studied yet. Actually, the cophenetic values of pairs of different taxa alone are not enough to single out phylogenetic trees with weighted arcs or nested taxa. For every (rooted) phylogenetic tree T, let its cophenetic vector φ(T) consist of all pairs of cophenetic values between pairs of taxa in T and all depths of taxa in T. It turns out that these cophenetic vectors single out weighted phylogenetic trees with nested taxa. We then define a family of cophenetic metrics dφ,p by comparing these cophenetic vectors by means of Lp norms, and we study, either analytically or numerically, some of their basic properties: neighbors, diameter, distribution, and their rank correlation with each other and with other metrics. The cophenetic metrics can be safely used on weighted phylogenetic trees with nested taxa and no restriction on degrees, and they can be computed in O(n²) time, where n stands for the number of taxa. The metrics dφ,1 and dφ,2 have positive skewed distributions, and they show a low rank correlation with the Robinson-Foulds metric and the nodal metrics, and a very high correlation with each other and with the splitted nodal metrics. The diameter of dφ,p, for p ≥ 1, is in O(n^((p+2)/p)), and thus for low p they are more discriminative, having a wider range of values.
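The construction can be sketched directly from the description above: build the cophenetic vector (depths of lowest common ancestors for all taxon pairs, plus the taxon depths) and compare two trees with an Lp norm. The tree encoding below (root-to-leaf node paths with a node-depth map) is an illustrative choice, not the paper's data structure:

```python
from itertools import combinations

def cophenetic_vector(paths, depth):
    """paths: taxon -> root-to-leaf node list; depth: node -> depth.
    Returns cophenetic values for all taxon pairs followed by taxon depths."""
    taxa = sorted(paths)
    vec = []
    for a, b in combinations(taxa, 2):
        # deepest node shared by both root-to-leaf paths = lowest common ancestor
        common = [n for n, m in zip(paths[a], paths[b]) if n == m]
        vec.append(depth[common[-1]])
    vec.extend(depth[paths[t][-1]] for t in taxa)
    return vec

def d_phi_p(v1, v2, p=1):
    """Lp distance between two cophenetic vectors (taxa must match)."""
    return sum(abs(x - y) ** p for x, y in zip(v1, v2)) ** (1 / p)
```

Building the vector touches each of the O(n²) taxon pairs once, consistent with the O(n²) running time stated above.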
Finite Metric Spaces of Strictly negative Type
DEFF Research Database (Denmark)
Hjorth, Poul G.
If a finite metric space is of strictly negative type then its transfinite diameter is uniquely realized by an infinite extent (“load vector''). Finite metric spaces that have this property include all trees, and all finite subspaces of Euclidean and Hyperbolic spaces. We prove that if the distance...
Gravitational Metric Tensor Exterior to Rotating Homogeneous ...
African Journals Online (AJOL)
The covariant and contravariant metric tensors exterior to a homogeneous spherical body rotating uniformly about a common φ axis with constant angular velocity ω is constructed. The constructed metric tensors in this gravitational field have seven non-zero distinct components. The Lagrangian for this gravitational field is ...
Exact solutions of strong gravity in generalized metrics
International Nuclear Information System (INIS)
Hojman, R.; Smailagic, A.
1981-05-01
We consider classical solutions for the strong gravity theory of Salam and Strathdee in a wider class of metrics with positive, zero and negative curvature. It turns out that such solutions exist and their relevance for quark confinement is explored. Only metrics with positive curvature (spherical symmetry) give a confining potential in a simple picture of the scalar hadron. This supports the idea of describing the hadron as a closed microuniverse of the strong metric. (author)
Development of quality metrics for ambulatory pediatric cardiology: Infection prevention.
Johnson, Jonathan N; Barrett, Cindy S; Franklin, Wayne H; Graham, Eric M; Halnon, Nancy J; Hattendorf, Brandy A; Krawczeski, Catherine D; McGovern, James J; O'Connor, Matthew J; Schultz, Amy H; Vinocur, Jeffrey M; Chowdhury, Devyani; Anderson, Jeffrey B
2017-12-01
In 2012, the American College of Cardiology's (ACC) Adult Congenital and Pediatric Cardiology Council established a program to develop quality metrics to guide ambulatory practices for pediatric cardiology. The council chose five areas on which to focus their efforts: chest pain, Kawasaki Disease, tetralogy of Fallot, transposition of the great arteries after arterial switch, and infection prevention. Here, we sought to describe the process, evaluation, and results of the Infection Prevention Committee's metric design process. The infection prevention metrics team consisted of 12 members from 11 institutions in North America. The group agreed to work on specific infection prevention topics including antibiotic prophylaxis for endocarditis, rheumatic fever, and asplenia/hyposplenism; influenza vaccination and respiratory syncytial virus prophylaxis (palivizumab); preoperative methods to reduce intraoperative infections; vaccinations after cardiopulmonary bypass; hand hygiene; and testing to identify splenic function in patients with heterotaxy. An extensive literature review was performed. When available, previously published guidelines were used fully in determining metrics. The committee chose eight metrics to submit to the ACC Quality Metric Expert Panel for review. Ultimately, metrics regarding hand hygiene and influenza vaccination recommendation for patients did not pass the RAND analysis. Both endocarditis prophylaxis metrics and the RSV/palivizumab metric passed the RAND analysis but fell out during the open comment period. Three metrics passed all analyses, including those for antibiotic prophylaxis in patients with heterotaxy/asplenia, for influenza vaccination compliance in healthcare personnel, and for adherence to recommended regimens of secondary prevention of rheumatic fever. The lack of convincing data to guide quality improvement initiatives in pediatric cardiology is widespread, particularly in infection prevention. Despite this, three metrics were
A new form of the rotating C-metric
International Nuclear Information System (INIS)
Hong, Kenneth; Teo, Edward
2005-01-01
In a previous paper, we showed that the traditional form of the charged C-metric can be transformed, by a change of coordinates, into one with an explicitly factorizable structure function. This new form of the C-metric has the advantage that its properties become much simpler to analyse. In this paper, we propose an analogous new form for the rotating charged C-metric, with structure function G(ξ) = (1 - ξ²)(1 + r₊Aξ)(1 + r₋Aξ), where r± are the usual locations of the horizons in the Kerr-Newman black hole. Unlike the non-rotating case, this new form is not related to the traditional one by a coordinate transformation. We show that the physical distinction between these two forms of the rotating C-metric lies in the nature of the conical singularities causing the black holes to accelerate apart: the new form is free of torsion singularities and therefore does not contain any closed timelike curves. We claim that this new form should be considered the natural generalization of the C-metric with rotation
[Clinical trial data management and quality metrics system].
Chen, Zhao-hua; Huang, Qin; Deng, Ya-zhong; Zhang, Yue; Xu, Yu; Yu, Hao; Liu, Zong-fan
2015-11-01
Data quality management systems are essential to ensure accurate, complete, consistent, and reliable data collection in clinical research. This paper is devoted to various choices of data quality metrics. They are categorized by study status, e.g. study start-up, conduct, and close-out. In each category, metrics for different purposes are listed according to ALCOA+ principles such as completeness, accuracy, timeliness, traceability, etc. Some frequently used general quality metrics are also introduced. This paper provides as much detail as possible for each metric, including definition, purpose, evaluation, referenced benchmark, and recommended targets, in support of real practice. It is important that sponsors and data management service providers establish a robust integrated clinical trial data quality management system to ensure sustainably high quality of clinical trial deliverables. It will also support enterprise-level data evaluation and benchmarking of data quality across projects, sponsors, and data management service providers by using objective metrics from real clinical trials. We hope this will be a significant input to accelerate the improvement of clinical trial data quality in the industry.
Socio-Technical Security Metrics (Dagstuhl Seminar 14491)
Gollmann, Dieter; Herley, Cormac; Koenig, Vincent; Pieters, Wolter; Sasse, Martina Angela
2015-01-01
This report documents the program and the outcomes of Dagstuhl Seminar 14491 "Socio-Technical Security Metrics". In the domain of safety, metrics inform many decisions, from the height of new dikes to the design of nuclear plants. We can state, for example, that the dikes should be high enough to
Landscape metrics for three-dimension urban pattern recognition
Liu, M.; Hu, Y.; Zhang, W.; Li, C.
2017-12-01
Understanding how landscape pattern determines population or ecosystem dynamics is crucial for managing our landscapes. Urban areas are becoming increasingly dominant social-ecological systems, so it is important to understand patterns of urbanization. Most studies of urban landscape pattern examine land-use maps in two dimensions because the acquisition of 3-dimensional information is difficult. We used Brista software based on Quickbird images and aerial photos to interpret the height of buildings, thus incorporating a 3-dimensional approach. We estimated the feasibility and accuracy of this approach. A total of 164,345 buildings in the Liaoning central urban agglomeration of China, which included seven cities, were measured. Twelve landscape metrics were proposed or chosen to describe the urban landscape patterns in 2- and 3-dimensional scales. The ecological and social meaning of landscape metrics were analyzed with multiple correlation analysis. The results showed that classification accuracy compared with field surveys was 87.6%, which means this method for interpreting building height was acceptable. The metrics effectively reflected the urban architecture in relation to number of buildings, area, height, 3-D shape and diversity aspects. We were able to describe the urban characteristics of each city with these metrics. The metrics also captured ecological and social meanings. The proposed landscape metrics provided a new method for urban landscape analysis in three dimensions.
SOCIAL METRICS APPLIED TO SMART TOURISM
Directory of Open Access Journals (Sweden)
O. Cervantes
2016-09-01
We present a strategy to make productive use of semantically-related social data, from a user-centered semantic network, in order to help users (tourists and citizens in general) to discover cultural heritage, points of interest and available services in a smart city. This data can be used to personalize recommendations in a smart tourism application. Our approach is based on flow centrality metrics typically used in social network analysis: flow betweenness, flow closeness and eccentricity. These metrics are useful to discover relevant nodes within the network, yielding nodes that can be interpreted as suggestions (venues or services) to users. We describe the semantic network built on a graph model, as well as the social metrics algorithms used to produce recommendations. We also present challenges and results from a prototypical implementation applied to the case study of the City of Puebla, Mexico.
Social Metrics Applied to Smart Tourism
Cervantes, O.; Gutiérrez, E.; Gutiérrez, F.; Sánchez, J. A.
2016-09-01
We present a strategy to make productive use of semantically-related social data, from a user-centered semantic network, in order to help users (tourists and citizens in general) to discover cultural heritage, points of interest and available services in a smart city. This data can be used to personalize recommendations in a smart tourism application. Our approach is based on flow centrality metrics typically used in social network analysis: flow betweenness, flow closeness and eccentricity. These metrics are useful to discover relevant nodes within the network yielding nodes that can be interpreted as suggestions (venues or services) to users. We describe the semantic network built on a graph model, as well as social metrics algorithms used to produce recommendations. We also present challenges and results from a prototypical implementation applied to the case study of the City of Puebla, Mexico.
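Of the three metrics named, eccentricity is the simplest to sketch: the greatest shortest-path distance from a node to any other. The venue graph below is hypothetical, not the Puebla network, and this unweighted version ignores the flow-based weighting the paper uses:

```python
from collections import deque

def eccentricity(adj, source):
    """Greatest BFS distance from `source` to any reachable node
    (unweighted sketch; a weighted network would use Dijkstra instead)."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return max(dist.values())

# Hypothetical venue adjacency list: low-eccentricity nodes are central
# to the network and make natural recommendation candidates.
venues = {
    "plaza": ["museum", "cathedral"],
    "museum": ["plaza", "cafe"],
    "cathedral": ["plaza"],
    "cafe": ["museum", "market"],
    "market": ["cafe"],
}
```

Ranking venues by ascending eccentricity (or by descending flow betweenness/closeness) yields the "relevant nodes" that the recommendation step interprets as suggestions.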
Reproducibility of graph metrics of human brain functional networks.
Deuker, Lorena; Bullmore, Edward T; Smith, Marie; Christensen, Soren; Nathan, Pradeep J; Rockstroh, Brigitte; Bassett, Danielle S
2009-10-01
Graph theory provides many metrics of complex network organization that can be applied to analysis of brain networks derived from neuroimaging data. Here we investigated the test-retest reliability of graph metrics of functional networks derived from magnetoencephalography (MEG) data recorded in two sessions from 16 healthy volunteers who were studied at rest and during performance of the n-back working memory task in each session. For each subject's data at each session, we used a wavelet filter to estimate the mutual information (MI) between each pair of MEG sensors in each of the classical frequency intervals from gamma to low delta in the overall range 1-60 Hz. Undirected binary graphs were generated by thresholding the MI matrix and 8 global network metrics were estimated: the clustering coefficient, path length, small-worldness, efficiency, cost-efficiency, assortativity, hierarchy, and synchronizability. Reliability of each graph metric was assessed using the intraclass correlation (ICC). Good reliability was demonstrated for most metrics applied to the n-back data (mean ICC=0.62). Reliability was greater for metrics in lower frequency networks. Higher frequency gamma- and beta-band networks were less reliable at a global level but demonstrated high reliability of nodal metrics in frontal and parietal regions. Performance of the n-back task was associated with greater reliability than measurements on resting state data. Task practice was also associated with greater reliability. Collectively these results suggest that graph metrics are sufficiently reliable to be considered for future longitudinal studies of functional brain network changes.
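As an illustration of the reliability measure used in this abstract, a minimal intraclass correlation computation might look as follows. The ICC(3,1) form (two-way mixed model, single measures) is an assumption on my part, since the abstract does not name a specific ICC variant; rows are subjects and columns are sessions:

```python
import numpy as np

def icc_3_1(ratings):
    """ICC(3,1): two-way mixed, single-measure intraclass correlation,
    a common test-retest reliability index. `ratings` is an
    (n_subjects, k_sessions) array holding one graph metric."""
    n, k = ratings.shape
    grand = ratings.mean()
    subj_means = ratings.mean(axis=1)
    sess_means = ratings.mean(axis=0)
    ss_total = ((ratings - grand) ** 2).sum()
    ss_subj = k * ((subj_means - grand) ** 2).sum()   # between-subjects
    ss_sess = n * ((sess_means - grand) ** 2).sum()   # between-sessions
    ss_err = ss_total - ss_subj - ss_sess             # residual
    ms_subj = ss_subj / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)
```

For a metric measured consistently across sessions, the ICC approaches 1; values around 0.6, as reported above, indicate good but imperfect reliability.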
Generalization of Vaidya's radiation metric
Energy Technology Data Exchange (ETDEWEB)
Gleiser, R J; Kozameh, C N [Universidad Nacional de Cordoba (Argentina). Instituto de Matematica, Astronomia y Fisica
1981-11-01
In this paper it is shown that if Vaidya's radiation metric is considered from the point of view of kinetic theory in general relativity, the corresponding phase space distribution function can be generalized in a particular way. The new family of spherically symmetric radiation metrics obtained contains Vaidya's as a limiting situation. The Einstein field equations are solved in a ''comoving'' coordinate system. Two arbitrary functions of a single variable are introduced in the process of solving these equations. Particular examples considered are a stationary solution, a nonvacuum solution depending on a single parameter, and several limiting situations.
Marketing communication metrics for social media
Töllinen, Aarne; Karjaluoto, Heikki
2011-01-01
The objective of this paper is to develop a conceptual framework for measuring the effectiveness of social media marketing communications. Specifically, we study whether the existing marketing communications performance metrics are still valid in the changing digitalised communications landscape, or whether it is time to rethink them, or even to devise entirely new metrics. Recent advances in information technology and marketing bring a need to re-examine measurement models. We combine two im...
Accuracy and precision in the calculation of phenology metrics
DEFF Research Database (Denmark)
Ferreira, Ana Sofia; Visser, Andre; MacKenzie, Brian
2014-01-01
Phytoplankton phenology (the timing of seasonal events) is a commonly used indicator for evaluating responses of marine ecosystems to climate change. However, phenological metrics are vulnerable to observation-related (bloom amplitude, missing data, and observational noise) and analysis-related (temporal…) factors. … a phenology metric is first determined from a noise- and gap-free time series, and again once it has been modified. We show that precision is a greater concern than accuracy for many of these metrics, an important point that has hitherto been overlooked in the literature. The variability in precision between phenology metrics is substantial, but it can be improved by the use of preprocessing techniques (e.g., gap-filling or smoothing). Furthermore, there are important differences in the inherent variability of the metrics that may be crucial in the interpretation of studies based upon them. Of the considered…
Quantum anomalies for generalized Euclidean Taub-NUT metrics
International Nuclear Information System (INIS)
Cotaescu, Ion I; Moroianu, Sergiu; Visinescu, Mihai
2005-01-01
The generalized Taub-NUT metrics exhibit gravitational anomalies in general. This is in contrast with the original Taub-NUT metric, which does not exhibit gravitational anomalies, as a consequence of the fact that it admits Killing-Yano tensors forming Staeckel-Killing tensors as products. We have found that for axial anomalies, interpreted as the index of the Dirac operator, the presence of Killing-Yano tensors is irrelevant. In order to evaluate the axial anomalies, we compute the index of the Dirac operator with the APS boundary condition on balls and on annular domains. The result is an explicit number-theoretic quantity depending on the radii of the domain. This quantity is 0 for metrics close to the original Taub-NUT metric but it does not vanish in general.
How much do genetic covariances alter the rate of adaptation?
Agrawal, Aneil F; Stinchcombe, John R
2009-03-22
Genetically correlated traits do not evolve independently, and the covariances between traits affect the rate at which a population adapts to a specified selection regime. To measure the impact of genetic covariances on the rate of adaptation, we compare the rate at which fitness increases given the observed G matrix to the expected rate if all the covariances in the G matrix are set to zero. Using data from the literature, we estimate the effect of genetic covariances in real populations. We find no net tendency for covariances to constrain the rate of adaptation, though the quality and heterogeneity of the data limit the certainty of this result. There are some examples in which covariances strongly constrain the rate of adaptation, but these are balanced by counterexamples in which covariances facilitate the rate of adaptation; in many cases, covariances have little or no effect. We also discuss how our metric can be used to identify traits or suites of traits whose genetic covariances to other traits have a particularly large impact on the rate of adaptation.
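The comparison described, the observed G matrix versus the same matrix with covariances zeroed, can be sketched as below. Using the quadratic form beta' G beta as the rate of fitness increase under a selection gradient beta follows standard quantitative-genetics usage, though the paper's exact computation may differ:

```python
import numpy as np

def covariance_constraint_ratio(G, beta):
    """Ratio of the rate of adaptation (fitness increase, ~ beta' G beta)
    under the observed G matrix to the rate when all covariances are zeroed.
    Values < 1 mean covariances constrain adaptation; > 1 mean they facilitate it."""
    G = np.asarray(G, dtype=float)
    beta = np.asarray(beta, dtype=float)
    rate_observed = beta @ G @ beta
    G_no_cov = np.diag(np.diag(G))  # keep variances, zero the covariances
    rate_independent = beta @ G_no_cov @ beta
    return rate_observed / rate_independent

# Negative covariance between two traits selected in the same direction
G = np.array([[1.0, -0.5],
              [-0.5, 1.0]])
beta = np.array([1.0, 1.0])
print(covariance_constraint_ratio(G, beta))  # → 0.5: covariance halves the rate
```

Flipping the sign of the covariance gives a ratio of 1.5, i.e. facilitation, which mirrors the balance of cases reported in the abstract.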
A multi-attribute approach to choosing adaptation strategies: Application to sea-level rise
International Nuclear Information System (INIS)
Smith, A.E.; Chu, H.Q.
1994-01-01
Selecting good adaptation strategies in anticipation of climate change is gaining increasing attention as it becomes increasingly clear that much of the likely change is already committed, and could not be avoided even with aggressive and immediate emissions reductions. Adaptation decision making will place special requirements on regional and local planners in the US and other countries, especially developing countries. Approaches, tools, and guidance will be useful to assist in an effective response to the challenge. This paper describes the value of using a multi-attribute approach for evaluating adaptation strategies and its implementation as a decision-support software tool to help planners understand and execute this approach. The multi-attribute approach described here explicitly addresses the fact that many aspects of the decision cannot be easily quantified, that future conditions are highly uncertain, and that there are issues of equity, flexibility, and coordination that may be as important to the decision as costs and benefits. The approach suggested also avoids trying to collapse information on all of the attributes to a single metric. Such metrics can obliterate insights about the nature of the trade-offs that must be made in choosing among very dissimilar types of responses to the anticipated threat of climate change. Implementation of such an approach requires management of much information, and an ability to easily manipulate its presentation while seeking acceptable trade-offs. The Adaptation Strategy Evaluator (ASE) was developed under funding from the US Environmental Protection Agency to provide user-friendly, PC-based guidance through the major steps of a multi-attribute evaluation. The initial application of ASE, and the focus of this paper, is adaptation to sea level rise. However, the approach can be easily adapted to any multi-attribute choice problem, including the range of other adaptation planning needs
Metrics for Performance Evaluation of Patient Exercises during Physical Therapy.
Vakanski, Aleksandar; Ferguson, Jake M; Lee, Stephen
2017-06-01
The article proposes a set of metrics for evaluation of patient performance in physical therapy exercises. A taxonomy is employed that classifies the metrics into quantitative and qualitative categories, based on the level of abstraction of the captured motion sequences. Further, the quantitative metrics are classified into model-less and model-based metrics, in reference to whether the evaluation employs the raw measurements of patient-performed motions, or whether it is based on a mathematical model of the motions. The reviewed metrics include root-mean-square distance, Kullback-Leibler divergence, log-likelihood, heuristic consistency, Fugl-Meyer Assessment, and similar. The metrics are evaluated for a set of five human motions captured with a Kinect sensor. The metrics can potentially be integrated into a system that employs machine learning for modelling and assessment of the consistency of patient performance in a home-based therapy setting. Automated performance evaluation can overcome the inherent subjectivity in human-performed therapy assessment, increase adherence to prescribed therapy plans, and reduce healthcare costs.
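The simplest model-less metric in the list, the root-mean-square distance between a recorded motion and a reference motion, could be sketched as follows; the assumption that both sequences are equal-length arrays of frames by joint coordinates is mine, not necessarily the article's exact setup:

```python
import numpy as np

def rms_distance(patient, reference):
    """Root-mean-square distance between two equal-length motion sequences
    (frames x joint-coordinates); lower values mean the patient's motion
    is closer to the reference performance."""
    patient = np.asarray(patient, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return float(np.sqrt(np.mean((patient - reference) ** 2)))

# Two-frame, two-coordinate toy example: one joint drifts off the reference
print(rms_distance([[0, 0], [1, 1]], [[0, 0], [1, 3]]))  # → 1.0
```

Model-based metrics in the taxonomy would instead score the sequence against a learned model (e.g. via log-likelihood) rather than a single reference recording.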
Self-benchmarking Guide for Laboratory Buildings: Metrics, Benchmarks, Actions
Energy Technology Data Exchange (ETDEWEB)
Mathew, Paul; Greenberg, Steve; Sartor, Dale
2009-07-13
This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.
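A typical whole-building metric of the kind such benchmarking guides describe is energy use intensity (EUI), annual energy use normalized by floor area. The formula below is the standard definition; the guide's exact metric names, units, and benchmarks may differ:

```python
def energy_use_intensity(annual_energy_kwh, gross_area_m2):
    """Whole-building energy use intensity (EUI) in kWh/m2/yr: a common
    benchmarking metric for comparing a building against its peers."""
    return annual_energy_kwh / gross_area_m2

# A hypothetical 5,000 m2 laboratory using 1.5 GWh/yr
print(energy_use_intensity(1_500_000, 5_000))  # → 300.0 kWh/m2/yr
```

Laboratories typically show much higher EUIs than offices because of ventilation and plug loads, which is why lab-specific benchmarks such as those from Labs21 are needed.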
Traffic Adaptive MAC Protocols in Wireless Body Area Networks
Directory of Open Access Journals (Sweden)
Farhan Masud
2017-01-01
In Wireless Body Area Networks (WBANs), every healthcare application that is based on physical sensors is responsible for monitoring the vital signs data of the patient. WBAN applications carry heterogeneous and dynamic traffic loads. Routine patient observation is described as low-load traffic, while an alarming situation, which is unpredictable by nature, is referred to as high-load traffic. This paper offers a thematic review of traffic-adaptive Medium Access Control (MAC) protocols in WBANs. First, we categorize them based on their goals, methods, and metrics of evaluation. The IEEE 802.15.4 (Zigbee) standard and the baseline MAC of IEEE 802.15.6 are also reviewed in terms of traffic-adaptive approaches. Furthermore, a comparative analysis of the protocols is made and their performance is analyzed in terms of delay, packet delivery ratio (PDR), and energy consumption. The literature shows that no review work has been done on traffic-adaptive MAC protocols in WBANs; this review work could therefore add enhancement to traffic-adaptive MAC protocols and stimulate better ways of solving the traffic-adaptivity problem.
Massless and massive quanta resulting from a mediumlike metric tensor
International Nuclear Information System (INIS)
Soln, J.
1985-01-01
A simple model of the ''primordial'' scalar field theory is presented in which the metric tensor is a generalization of the metric tensor from electrodynamics in a medium. The radiation signal corresponding to the scalar field propagates with a velocity that is generally less than c. This signal can be associated simultaneously with imaginary and real effective (momentum-dependent) masses. The requirement that the imaginary effective mass vanishes, which we take to be the prerequisite for the vacuumlike signal propagation, leads to the ''spontaneous'' splitting of the metric tensor into two distinct metric tensors: one metric tensor gives rise to masslesslike radiation and the other to a massive particle. (author)
A guide to phylogenetic metrics for conservation, community ecology and macroecology
Cadotte, Marc W.; Carvalho, Silvia B.; Davies, T. Jonathan; Ferrier, Simon; Fritz, Susanne A.; Grenyer, Rich; Helmus, Matthew R.; Jin, Lanna S.; Mooers, Arne O.; Pavoine, Sandrine; Purschke, Oliver; Redding, David W.; Rosauer, Dan F.; Winter, Marten; Mazel, Florent
2016-01-01
The use of phylogenies in ecology is increasingly common and has broadened our understanding of biological diversity. Ecological sub‐disciplines, particularly conservation, community ecology and macroecology, all recognize the value of evolutionary relationships but the resulting development of phylogenetic approaches has led to a proliferation of phylogenetic diversity metrics. The use of many metrics across the sub‐disciplines hampers potential meta‐analyses, syntheses, and generalizations of existing results. Further, there is no guide for selecting the appropriate metric for a given question, and different metrics are frequently used to address similar questions. To improve the choice, application, and interpretation of phylo‐diversity metrics, we organize existing metrics by expanding on a unifying framework for phylogenetic information. Generally, questions about phylogenetic relationships within or between assemblages tend to ask three types of question: how much; how different; or how regular? We show that these questions reflect three dimensions of a phylogenetic tree: richness, divergence, and regularity. We classify 70 existing phylo‐diversity metrics based on their mathematical form within these three dimensions and identify ‘anchor’ representatives: for α‐diversity metrics these are PD (Faith's phylogenetic diversity), MPD (mean pairwise distance), and VPD (variation of pairwise distances). By analysing mathematical formulae and using simulations, we use this framework to identify metrics that mix dimensions, and we provide a guide to choosing and using the most appropriate metrics. We show that metric choice requires connecting the research question with the correct dimension of the framework and that there are logical approaches to selecting and interpreting metrics. The guide outlined herein will help researchers navigate the current jungle of indices. PMID:26785932
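One of the 'anchor' metrics named above, MPD (mean pairwise distance), can be computed directly from a patristic-distance matrix. The matrix values and presence mask below are illustrative, not from the paper:

```python
import numpy as np

def mean_pairwise_distance(dist, present):
    """MPD: mean phylogenetic distance over all unordered pairs of species
    present in an assemblage. `dist` is a symmetric patristic-distance
    matrix over the species pool; `present` is a boolean membership mask."""
    idx = np.flatnonzero(present)
    sub = dist[np.ix_(idx, idx)]
    iu = np.triu_indices(len(idx), k=1)  # upper triangle: each pair once
    return float(sub[iu].mean())

# Three-species pool with pairwise patristic distances 2, 4 and 6
dist = np.array([[0.0, 2.0, 4.0],
                 [2.0, 0.0, 6.0],
                 [4.0, 6.0, 0.0]])
print(mean_pairwise_distance(dist, np.array([True, True, True])))  # → 4.0
```

MPD captures the 'how different' (divergence) dimension of the framework, whereas PD, the summed branch length, captures 'how much' (richness).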
Assessing Software Quality Through Visualised Cohesion Metrics
Directory of Open Access Journals (Sweden)
Timothy Shih
2001-05-01
Cohesion is one of the most important factors for software quality, as well as maintainability, reliability and reusability. Module cohesion is defined as a quality attribute that measures the singleness of purpose of a module. A module of poor quality can be a serious obstacle to system quality. In order to achieve good software quality, software managers and engineers need to introduce cohesion metrics to measure and produce desirable software. Highly cohesive software is considered desirable. In this paper, we propose a function-oriented cohesion metric based on the analysis of live variables, live span and the visualization of the processing-element dependency graph. We give six typical cohesion examples to be measured as our experiments and justification. A well-defined, well-normalized, well-visualized and well-tested cohesion metric is thus proposed to indicate, and thereby enhance, software cohesion strength. Furthermore, this cohesion metric can easily be incorporated into a software CASE tool to help software engineers improve software quality.
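To make the live-variable idea concrete, here is a crude, hypothetical span-based indicator: a variable's live span is the range of statements between its first and last use, and a module where variables stay live across most statements is arguably doing one thing. This simplified formulation is mine for illustration; the paper's actual metric differs in its details:

```python
def live_variable_cohesion(stmt_vars):
    """Crude cohesion indicator for one function. `stmt_vars` is a list,
    one set of variable names per statement. Each variable's live span is
    the fraction of statements between its first and last use; the result
    is the mean span (1.0 = every variable is live across the whole body)."""
    n = len(stmt_vars)
    first, last = {}, {}
    for i, used in enumerate(stmt_vars):
        for v in used:
            first.setdefault(v, i)
            last[v] = i
    spans = [(last[v] - first[v] + 1) / n for v in first]
    return sum(spans) / len(spans)

# Three statements; 'a' and 'b' each span two of the three statements
print(live_variable_cohesion([{"a"}, {"a", "b"}, {"b"}]))
```

A real tool would extract `stmt_vars` from live-variable dataflow analysis rather than from literal variable mentions.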
MESUR: USAGE-BASED METRICS OF SCHOLARLY IMPACT
Energy Technology Data Exchange (ETDEWEB)
BOLLEN, JOHAN [Los Alamos National Laboratory; RODRIGUEZ, MARKO A. [Los Alamos National Laboratory; VAN DE SOMPEL, HERBERT [Los Alamos National Laboratory
2007-01-30
The evaluation of scholarly communication items is now largely a matter of expert opinion or metrics derived from citation data. Both approaches can fail to take into account the myriad of factors that shape scholarly impact. Usage data has emerged as a promising complement to existing methods of assessment, but the formal groundwork to reliably and validly apply usage-based metrics of scholarly impact is lacking. The Andrew W. Mellon Foundation-funded MESUR project constitutes a systematic effort to define, validate and cross-validate a range of usage-based metrics of scholarly impact by creating a semantic model of the scholarly communication process. The constructed model will serve as the basis for creating a large-scale semantic network that seamlessly relates citation, bibliographic and usage data from a variety of sources. A subsequent program that uses the established semantic network as a reference data set will determine the characteristics and semantics of a variety of usage-based metrics of scholarly impact. This paper outlines the architecture and methodology adopted by the MESUR project and its future direction.
Classroom reconstruction of the Schwarzschild metric
Kassner, Klaus
2015-01-01
A promising way to introduce general relativity in the classroom is to study the physical implications of certain given metrics, such as the Schwarzschild one. This involves lower mathematical expenditure than an approach focusing on differential geometry in its full glory and permits one to emphasize physical aspects before attacking the field equations. Even so, in terms of motivation, a lack of justification for the metric employed may pose an obstacle. The paper discusses how to establish the we...
Area Regge calculus and discontinuous metrics
International Nuclear Information System (INIS)
Wainwright, Chris; Williams, Ruth M
2004-01-01
Taking the triangle areas as independent variables in the theory of Regge calculus can lead to ambiguities in the edge lengths, which can be interpreted as discontinuities in the metric. We construct solutions to area Regge calculus using a triangulated lattice and find that on a spacelike or timelike hypersurface no such discontinuity can arise. On a null hypersurface however, we can have such a situation and the resulting metric can be interpreted as a so-called refractive wave
Use of metrics in an effective ALARA program
International Nuclear Information System (INIS)
Bates, B.B. Jr.
1996-01-01
ALARA radiological protection programs require metrics to meet their objectives. Sources of metrics include external dosimetry; internal dosimetry; radiological occurrences from the occurrence reporting and processing system (ORPS); and radiological incident reports (RIR). The sources themselves contain an abundance of specific "indicators". Choosing the site-specific indicators that will be tracked and trended requires careful review. Justification is needed to defend the indicators selected, and perhaps even stronger justification is needed for those indicators that are available but not chosen as a metric. Historically, the many different sources of information resided in a plethora of locations. Even the same type of metric had data located in different areas that could not be easily totaled for the entire site. This required the end user to expend valuable time and effort to locate the data they needed. To address this problem, a central metrics database has been developed so that a customer can have all their questions addressed quickly and correctly. The database was developed in the beginning to answer some of the customers' most frequently asked questions. It is now also a tool to communicate the status of the radiation protection program to facility managers. Finally, it also addresses requirements contained in the Rad Con manual and the 10CFR835 implementation guides. The database uses currently available, "user friendly" software and contains information from RIRs, ORPS, and external dosimetry records specific to ALARA performance indicators. The database is expandable to allow new metrics input. Specific reports have been developed to assist customers in their tracking and trending of ALARA metrics. These include quarterly performance indicator reports, monthly radiological incident reports, monthly external dose history and goals tracking reports, and the future use of performance indexing.
Assessment of six dissimilarity metrics for climate analogues
Grenier, Patrick; Parent, Annie-Claude; Huard, David; Anctil, François; Chaumont, Diane
2013-04-01
Spatial analogue techniques consist in identifying locations whose recent-past climate is similar in some aspects to the future climate anticipated at a reference location. When identifying analogues, one key step is the quantification of the dissimilarity between two climates separated in time and space, which involves the choice of a metric. In this communication, spatial analogues and their usefulness are briefly discussed. Next, six metrics are presented (the standardized Euclidean distance, the Kolmogorov-Smirnov statistic, the nearest-neighbor distance, the Zech-Aslan energy statistic, the Friedman-Rafsky runs statistic and the Kullback-Leibler divergence), along with a set of criteria used for their assessment. The related case study involves the use of numerical simulations performed with the Canadian Regional Climate Model (CRCM-v4.2.3), from which three annual indicators (total precipitation, heating degree-days and cooling degree-days) are calculated over 30-year periods (1971-2000 and 2041-2070). Results indicate that the six metrics identify comparable analogue regions at a relatively large scale, but best analogues may differ substantially. For best analogues, it is also shown that the uncertainty stemming from the metric choice generally does not exceed that stemming from the simulation or model choice. A synthesis of the advantages and drawbacks of each metric is finally presented, in which the Zech-Aslan energy statistic stands out as the most recommended metric for analogue studies, whereas the Friedman-Rafsky runs statistic is the least recommended, based on this case study.
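The first metric in the list, the standardized Euclidean distance, has a straightforward form: each indicator difference is divided by a per-indicator scale (e.g. its standard deviation) so that indicators with different units contribute comparably. The indicator values below (total precipitation, heating and cooling degree-days) are made-up numbers for illustration:

```python
import numpy as np

def standardized_euclidean(x, y, scale):
    """Standardized Euclidean distance between the climate-indicator vectors
    of two locations; `scale` holds per-indicator standard deviations so
    indicators with different units are weighted comparably."""
    x, y, scale = (np.asarray(a, dtype=float) for a in (x, y, scale))
    return float(np.sqrt(np.sum(((x - y) / scale) ** 2)))

# Hypothetical indicators: [precipitation mm, heating DD, cooling DD]
d = standardized_euclidean([1000, 4000, 100], [1100, 4200, 150],
                           scale=[100, 200, 50])
print(d)  # each indicator differs by exactly one scale unit: sqrt(3)
```

Distribution-based metrics in the same list (e.g. Kolmogorov-Smirnov or the Zech-Aslan energy statistic) compare whole samples rather than summary vectors, which is why they can rank best analogues differently.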
Borelli, Michael L.
This document details the administrative issues associated with guiding a school district through its metrication efforts. Issues regarding staff development, curriculum development, and the acquisition of instructional resources are considered. Alternative solutions are offered. Finally, an overall implementation strategy is discussed with…
Effective use of metrics in an ALARA program
International Nuclear Information System (INIS)
Bates, B.B. Jr.
1996-01-01
ALARA radiological protection programs require metrics to meet their objectives. Sources of metrics include external dosimetry; internal dosimetry; radiological occurrences from the occurrence reporting and processing system (ORPS); and radiological incident reports (RIR). The sources themselves contain an abundance of specific "indicators". Choosing the site-specific indicators that will be tracked and trended requires careful review. Historically, the many different sources of information resided in different locations, which required the end users to expend valuable time and effort to locate the data they needed. To address this problem, a central metrics database has been developed so that customers can have all their questions addressed quickly and correctly. The database was developed in the beginning to answer some of the customers' most frequently asked questions. It is now also a tool to communicate the status of the radiation protection program to facility managers. Finally, it also addresses requirements contained in the Rad Con manual and the 10CFR835 implementation guides. The database uses currently available, "user friendly" software and contains information from RIRs, ORPS, and external dosimetry records specific to ALARA performance indicators. The database is expandable to allow new metrics input. Specific reports have been developed to assist customers in their tracking and trending of ALARA metrics.
Shuler, Robert
2018-04-01
The goal of this paper is to take a completely fresh approach to metric gravity, in which the metric principle is strictly adhered to but its properties in local space-time are derived from conservation principles, not inferred from a global field equation. The global field strength variation then gains some flexibility, but only in the regime of very strong fields (2nd-order terms) whose measurement is now being contemplated. So doing provides a family of similar gravities, differing only in strong fields, which could be developed into meaningful verification targets for strong fields after the manner in which far-field variations were used in the 20th century. General Relativity (GR) is shown to be a member of the family and this is demonstrated by deriving the Schwarzschild metric exactly from a suitable field strength assumption. The method of doing so is interesting in itself because it involves only one differential equation rather than the usual four. Exact static symmetric field solutions are also given for one pedagogical alternative based on potential, and one theoretical alternative based on inertia, and the prospects of experimentally differentiating these are analyzed. Whether the method overturns the conventional wisdom that GR is the only metric theory of gravity and that alternatives must introduce additional interactions and fields is somewhat semantical, depending on whether one views the field strength assumption as a field and whether the assumption that produces GR is considered unique in some way. It is of course possible to have other fields, and the local space-time principle can be applied to field gravities which usually are weak-field approximations having only time dilation, giving them the spatial factor and promoting them to full metric theories. Though usually pedagogical, some of them are interesting from a quantum gravity perspective. Cases are noted where mass measurement errors, or distributions of dark matter, can cause one
International Nuclear Information System (INIS)
Harper, A.F.A.; Digby, R.B.; Thong, S.P.; Lacey, F.
1978-04-01
In April 1978 a meeting of senior metrication officers convened by the Commonwealth Science Council of the Commonwealth Secretariat, was held in London. The participants were drawn from Australia, Bangladesh, Britain, Canada, Ghana, Guyana, India, Jamaica, Papua New Guinea, Solomon Islands and Trinidad and Tobago. Among other things, the meeting resolved to develop a set of guidelines to assist countries to change to SI and to compile such guidelines in the form of a working manual
An accurate metric for the spacetime around rotating neutron stars
Pappas, George
2017-04-01
The problem of having an accurate description of the spacetime around rotating neutron stars is of great astrophysical interest. For astrophysical applications, one needs a metric that captures all the properties of the spacetime around a rotating neutron star. Furthermore, an accurate, appropriately parametrized metric, i.e. a metric given in terms of parameters that are directly related to the physical structure of the neutron star, could be used to solve the inverse problem, which is to infer the properties of the structure of a neutron star from astrophysical observations. In this work, we present such an approximate stationary and axisymmetric metric for the exterior of rotating neutron stars, which is constructed using the Ernst formalism and is parametrized by the relativistic multipole moments of the central object. This metric is given in terms of an expansion in the Weyl-Papapetrou coordinates with the multipole moments as free parameters and is shown to be extremely accurate in capturing the physical properties of a neutron star spacetime as they are calculated numerically in general relativity. Because the metric is given in terms of an expansion, the expressions are much simpler and easier to implement, in contrast to previous approaches. For the parametrization of the metric in general relativity, the recently discovered universal 3-hair relations are used to produce a three-parameter metric. Finally, a straightforward extension of this metric is given for scalar-tensor theories with a massless scalar field, which also admit a formulation in terms of an Ernst potential.
Metric reconstruction from Weyl scalars
Energy Technology Data Exchange (ETDEWEB)
Whiting, Bernard F; Price, Larry R [Department of Physics, PO Box 118440, University of Florida, Gainesville, FL 32611 (United States)
2005-08-07
The Kerr geometry has remained an elusive world in which to explore physics and delve into the more esoteric implications of general relativity. Following the discovery, by Kerr in 1963, of the metric for a rotating black hole, the most major advance has been an understanding of its Weyl curvature perturbations based on Teukolsky's discovery of separable wave equations some ten years later. In the current research climate, where experiments across the globe are preparing for the first detection of gravitational waves, a more complete understanding than concerns just the Weyl curvature is now called for. To understand precisely how comparatively small masses move in response to the gravitational waves they emit, a formalism has been developed based on a description of the whole spacetime metric perturbation in the neighbourhood of the emission region. Presently, such a description is not available for the Kerr geometry. While there does exist a prescription for obtaining metric perturbations once curvature perturbations are known, it has become apparent that there are gaps in that formalism which are still waiting to be filled. The most serious gaps include gauge inflexibility, the inability to include sources-which are essential when the emitting masses are considered-and the failure to describe the l = 0 and 1 perturbation properties. Among these latter properties of the perturbed spacetime, arising from a point mass in orbit, are the perturbed mass and axial component of angular momentum, as well as the very elusive Carter constant for non-axial angular momentum. A status report is given on recent work which begins to repair these deficiencies in our current incomplete description of Kerr metric perturbations.
Metric reconstruction from Weyl scalars
International Nuclear Information System (INIS)
Whiting, Bernard F; Price, Larry R
2005-01-01
The Kerr geometry has remained an elusive world in which to explore physics and delve into the more esoteric implications of general relativity. Following the discovery, by Kerr in 1963, of the metric for a rotating black hole, the most important advance has been an understanding of its Weyl curvature perturbations based on Teukolsky's discovery of separable wave equations some ten years later. In the current research climate, where experiments across the globe are preparing for the first detection of gravitational waves, a more complete understanding, concerning more than just the Weyl curvature, is now called for. To understand precisely how comparatively small masses move in response to the gravitational waves they emit, a formalism has been developed based on a description of the whole spacetime metric perturbation in the neighbourhood of the emission region. Presently, such a description is not available for the Kerr geometry. While there does exist a prescription for obtaining metric perturbations once curvature perturbations are known, it has become apparent that there are gaps in that formalism which are still waiting to be filled. The most serious gaps include gauge inflexibility, the inability to include sources (which are essential when the emitting masses are considered) and the failure to describe the l = 0 and 1 perturbation properties. Among these latter properties of the perturbed spacetime, arising from a point mass in orbit, are the perturbed mass and axial component of angular momentum, as well as the very elusive Carter constant for non-axial angular momentum. A status report is given on recent work which begins to repair these deficiencies in our current incomplete description of Kerr metric perturbations.
Using Publication Metrics to Highlight Academic Productivity and Research Impact
Carpenter, Christopher R.; Cone, David C.; Sarli, Cathy C.
2016-01-01
This article provides a broad overview of widely available measures of academic productivity and impact using publication data and highlights uses of these metrics for various purposes. Metrics based on publication data include measures such as number of publications, number of citations, the journal impact factor score, and the h-index, as well as emerging document-level metrics. Publication metrics can be used for a variety of purposes, including tenure and promotion, grant applications and renewal reports, benchmarking, recruiting efforts, and administrative reporting of departmental or university performance. The authors also highlight practical applications of measuring and reporting academic productivity and impact to emphasize and promote individual investigators, grant applications, or department output. PMID:25308141
Comparison of routing metrics for wireless mesh networks
CSIR Research Space (South Africa)
Nxumalo, SL
2011-09-01
Full Text Available in each and every relay node so as to find the next hop for the packet. A routing metric is simply a measure used by a routing protocol to select the best path. Figure 2 shows the relationship between a routing protocol and the routing... on its QoS-awareness level. The routing metrics that considered QoS the most were selected from each group. This section discusses the four routing metrics that were compared in this paper, which are: hop count (HOP), expected transmission count (ETX...
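The two metric families named in this snippet can be contrasted in a few lines. A minimal sketch, using the standard ETX definition (one over the product of forward and reverse delivery ratios); the link delivery ratios below are invented for illustration:

```python
def link_etx(df, dr):
    """Expected transmission count for one link: 1 / (forward delivery
    ratio x reverse delivery ratio). Ratios here are made-up examples."""
    return 1.0 / (df * dr)

def path_metrics(links):
    """Score one path under two routing metrics: hop count and summed ETX."""
    return len(links), sum(link_etx(df, dr) for df, dr in links)

# A short path over lossy links wins on hop count but loses on ETX
# to a longer path over clean links.
lossy = [(0.5, 0.5), (0.5, 1.0)]
clean = [(1.0, 1.0), (1.0, 1.0), (1.0, 1.0)]
print(path_metrics(lossy), path_metrics(clean))
```

This is the core reason hop count and ETX can disagree on the "best" path in a lossy mesh.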
The metrics and correlates of physician migration from Africa
Directory of Open Access Journals (Sweden)
Arah Onyebuchi A
2007-05-01
Full Text Available Abstract Background Physician migration from poor to rich countries is considered an important contributor to the growing health workforce crisis in the developing world. This is particularly true for Africa. The perceived magnitude of such migration for each source country might, however, depend on the choice of metrics used in the analysis. This study examined the influence of choice of migration metrics on the rankings of African countries that suffered the most physician migration, and investigated the correlates of physician migration. Methods Ranking and correlational analyses were conducted on African physician migration data adjusted for bilateral net flows, and supplemented with developmental, economic and health system data. The setting was the 53 African birth countries of African-born physicians working in nine wealthier destination countries. Three metrics of physician migration were used: total number of physician émigrés; emigration fraction defined as the proportion of the potential physician pool working in destination countries; and physician migration density defined as the number of physician émigrés per 1000 population of the African source country. Results Rankings based on any of the migration metrics differed substantially from those based on the other two metrics. Although the emigration fraction and physician migration density metrics gave proportionality to the migration crisis, only the latter was consistently associated with source countries' workforce capacity, health, health spending, economic and development characteristics. As such, higher physician migration density was seen among African countries with relatively higher health workforce capacity (0.401 ≤ r ≤ 0.694, p ≤ 0.011, health status, health spending, and development. Conclusion The perceived magnitude of physician migration is sensitive to the choice of metrics. Complementing the emigration fraction, the physician migration density is a metric
Description of the Sandia National Laboratories science, technology & engineering metrics process.
Energy Technology Data Exchange (ETDEWEB)
Jordan, Gretchen B.; Watkins, Randall D.; Trucano, Timothy Guy; Burns, Alan Richard; Oelschlaeger, Peter
2010-04-01
There has been a concerted effort since 2007 to establish a dashboard of metrics for the Science, Technology, and Engineering (ST&E) work at Sandia National Laboratories. These metrics are to provide a self-assessment mechanism for the ST&E Strategic Management Unit (SMU) to complement external expert review and advice and various internal self-assessment processes. The data and analysis will help ST&E Managers plan, implement, and track strategies and work in order to support the critical success factors of nurturing core science and enabling laboratory missions. The purpose of this SAND report is to provide a guide for those who want to understand the ST&E SMU metrics process. This report provides an overview of why the ST&E SMU wants a dashboard of metrics, some background on metrics for ST&E programs from existing literature and past Sandia metrics efforts, a summary of work completed to date, specifics on the portfolio of metrics that have been chosen and the implementation process that has been followed, and plans for the coming year to improve the ST&E SMU metrics process.
Relevance of motion-related assessment metrics in laparoscopic surgery.
Oropesa, Ignacio; Chmarra, Magdalena K; Sánchez-González, Patricia; Lamata, Pablo; Rodrigues, Sharon P; Enciso, Silvia; Sánchez-Margallo, Francisco M; Jansen, Frank-Willem; Dankelman, Jenny; Gómez, Enrique J
2013-06-01
Motion metrics have become an important source of information when addressing the assessment of surgical expertise. However, their direct relationship with the different surgical skills has not been fully explored. The purpose of this study is to investigate the relevance of motion-related metrics in the evaluation processes of basic psychomotor laparoscopic skills and their correlation with the different abilities sought to measure. A framework for task definition and metric analysis is proposed. An explorative survey was first conducted with a board of experts to identify metrics to assess basic psychomotor skills. Based on the output of that survey, 3 novel tasks for surgical assessment were designed. Face and construct validation was performed, with focus on motion-related metrics. Tasks were performed by 42 participants (16 novices, 22 residents, and 4 experts). Movements of the laparoscopic instruments were registered with the TrEndo tracking system and analyzed. Time, path length, and depth showed construct validity for all 3 tasks. Motion smoothness and idle time also showed validity for tasks involving bimanual coordination and tasks requiring a more tactical approach, respectively. Additionally, motion smoothness and average speed showed a high internal consistency, proving them to be the most task-independent of all the metrics analyzed. Motion metrics are complementary and valid for assessing basic psychomotor skills, and their relevance depends on the skill being evaluated. A larger clinical implementation, combined with quality performance information, will give more insight on the relevance of the results shown in this study.
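Several of the validated metrics (time, path length, depth) can be computed directly from tracked instrument-tip positions. A minimal sketch; the function name, sampling interval, and coordinates are invented for illustration and do not reproduce the TrEndo pipeline:

```python
import math

def motion_metrics(samples, dt):
    """Simple motion-analysis metrics from tracked 3D tip positions.

    samples: list of (x, y, z) tip positions, sampled every dt seconds.
    Returns task time, path length (summed point-to-point distance), and
    depth range along the z (insertion) axis.
    """
    time_s = (len(samples) - 1) * dt
    path = sum(math.dist(a, b) for a, b in zip(samples, samples[1:]))
    zs = [p[2] for p in samples]
    depth = max(zs) - min(zs)
    return {"time_s": time_s, "path_length": path, "depth": depth}

track = [(0, 0, 0), (3, 4, 0), (3, 4, 12)]
print(motion_metrics(track, dt=0.5))
```

Smoothness metrics would additionally require differentiating this trajectory (e.g. jerk), which needs denser, real sampling than this toy track.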
Left-invariant Einstein metrics on S3 × S3
Belgun, Florin; Cortés, Vicente; Haupt, Alexander S.; Lindemann, David
2018-06-01
The classification of homogeneous compact Einstein manifolds in dimension six is an open problem. We consider the remaining open case, namely left-invariant Einstein metrics g on G = SU(2) × SU(2) = S3 × S3. Einstein metrics are critical points of the total scalar curvature functional for fixed volume. The scalar curvature S of a left-invariant metric g is constant and can be expressed as a rational function in the parameters determining the metric. The critical points of S, subject to the volume constraint, are given by the zero locus of a system of polynomials in the parameters. In general, however, the determination of the zero locus is apparently out of reach. Instead, we consider the case where the isotropy group K of g in the group of motions is non-trivial. When K ≇ Z2 we prove that the Einstein metrics on G are given by (up to homothety) either the standard metric or the nearly Kähler metric, based on representation-theoretic arguments and computer algebra. For the remaining case K ≅ Z2 we present partial results.
International Nuclear Information System (INIS)
Yesilbudak, Mehmet; Sagiroglu, Seref; Colak, Ilhami
2017-01-01
Highlights: • An accurate wind power prediction model is proposed for the very short-term horizon. • The k-nearest neighbor classifier is implemented based on the multi-tupled inputs. • The variation of wind power prediction errors is evaluated in various aspects. • Our approach shows superior prediction performance over the persistence method. - Abstract: With the growing share of wind power production in the electric power grids, many critical challenges for the grid operators have emerged in terms of the power balance, power quality, voltage support, frequency stability, load scheduling, unit commitment and spinning reserve calculations. To overcome such problems, numerous studies have been conducted to predict the wind power production, but a small number of them have attempted to improve the prediction accuracy by employing the multidimensional meteorological input data. The novelties of this study lie in the proposal of an efficient and easy-to-implement very short-term wind power prediction model based on the k-nearest neighbor classifier (kNN), in the usage of wind speed, wind direction, barometric pressure and air temperature parameters as the multi-tupled meteorological inputs and in the comparison of wind power prediction results with respect to the persistence reference model. From the patterns obtained, we characterize the variation of wind power prediction errors according to the input tuples, distance measures and neighbor numbers, and uncover the most influential and the most ineffective meteorological parameters on the optimization of wind power prediction results.
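The multi-tupled kNN scheme the abstract describes can be sketched in a few lines. All feature values and powers below are invented toy numbers, features are left unscaled for brevity (a real model would normalize each input), and the averaging rule is just one plausible choice:

```python
import math

def knn_predict(train, query, k=3):
    """Very short-term prediction via k-nearest neighbors.

    train: list of (features, power_kW) pairs, where features is a tuple
    such as (wind speed, wind direction, pressure, temperature).
    Returns the mean power of the k training points closest to the query
    under the Euclidean distance.
    """
    ranked = sorted(train, key=lambda fp: math.dist(fp[0], query))
    return sum(power for _, power in ranked[:k]) / k

history = [
    ((6.0, 180.0, 1010.0, 12.0), 350.0),
    ((6.5, 175.0, 1009.0, 12.5), 400.0),
    ((12.0, 90.0, 1000.0, 8.0), 900.0),
    ((5.8, 185.0, 1011.0, 11.0), 330.0),
]
print(knn_predict(history, (6.2, 178.0, 1010.0, 12.0), k=3))
```

Swapping the distance measure (e.g. Manhattan) and the neighbor number k is exactly the axis of variation the study evaluates.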
Active Metric Learning from Relative Comparisons
Xiong, Sicheng; Rosales, Rómer; Pei, Yuanli; Fern, Xiaoli Z.
2014-01-01
This work focuses on active learning of distance metrics from relative comparison information. A relative comparison specifies, for a data point triplet $(x_i,x_j,x_k)$, that instance $x_i$ is more similar to $x_j$ than to $x_k$. Such constraints, when available, have been shown to be useful toward defining appropriate distance metrics. In real-world applications, acquiring constraints often requires considerable human effort. This motivates us to study how to select and query the most useful ...
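How a relative comparison constrains a metric can be shown concretely. A minimal sketch with a diagonal Mahalanobis metric and hand-picked toy data (not the paper's method for learning or querying):

```python
def weighted_dist2(w, a, b):
    # Squared distance under a diagonal Mahalanobis metric with weights w.
    return sum(wi * (ai - bi) ** 2 for wi, ai, bi in zip(w, a, b))

def satisfied(w, triplets):
    """Count relative comparisons (x_i closer to x_j than to x_k) that the
    metric parameterized by w satisfies."""
    return sum(
        weighted_dist2(w, i, j) < weighted_dist2(w, i, k)
        for i, j, k in triplets
    )

# One triplet: under uniform weights the comparison is a tie, but
# up-weighting the first (informative) feature satisfies it.
t = [((0.0, 0.0), (1.0, 3.0), (3.0, 1.0))]
print(satisfied((1.0, 1.0), t), satisfied((5.0, 1.0), t))
```

An active learner in this setting would query the triplet whose answer most reduces uncertainty over w; this toy only checks constraint satisfaction.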
A practical approach to determine dose metrics for nanomaterials.
Delmaar, Christiaan J E; Peijnenburg, Willie J G M; Oomen, Agnes G; Chen, Jingwen; de Jong, Wim H; Sips, Adriënne J A M; Wang, Zhuang; Park, Margriet V D Z
2015-05-01
Traditionally, administered mass is used to describe doses of conventional chemical substances in toxicity studies. For deriving toxic doses of nanomaterials, mass and chemical composition alone may not adequately describe the dose, because particles with the same chemical composition can have completely different toxic mass doses depending on properties such as particle size. Other dose metrics such as particle number, volume, or surface area have been suggested, but consensus is lacking. The debate regarding the most adequate dose metric for nanomaterials clearly needs a systematic, unbiased approach. In the present study, the authors propose such an approach and apply it to results from in vitro and in vivo experiments with silver and silica nanomaterials. The proposed approach is shown to provide a convenient tool to systematically investigate and interpret dose metrics of nanomaterials. Recommendations for study designs aimed at investigating dose metrics are provided. © 2015 SETAC.
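Why the same mass can mean very different doses is simple geometry. A minimal sketch that re-expresses a mass dose as particle number and surface area, assuming monodisperse spherical particles (real nanomaterials need size distributions and agglomeration state; the values are illustrative):

```python
import math

def dose_metrics(mass_ug, diameter_nm, density_g_cm3):
    """Re-express a mass dose as particle number and total surface area,
    assuming monodisperse spheres. Illustrative only."""
    d_cm = diameter_nm * 1e-7
    volume_cm3 = math.pi / 6 * d_cm ** 3          # single-particle volume
    mass_g = mass_ug * 1e-6
    n = mass_g / (density_g_cm3 * volume_cm3)     # particle number
    area_cm2 = n * math.pi * d_cm ** 2            # total surface area
    return n, area_cm2

# The same 1 ug of silver (density 10.49 g/cm^3) in 20 nm vs 100 nm
# particles: far more particles and surface area at the smaller size.
n20, a20 = dose_metrics(1.0, 20, 10.49)
n100, a100 = dose_metrics(1.0, 100, 10.49)
print(round(n20 / n100), round(a20 / a100))
```

Because number scales as 1/d³ and surface area as 1/d, ranking toxic doses by mass, number, or area can give three different answers, which is the crux of the dose-metric debate.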
Information-theoretic semi-supervised metric learning via entropy regularization.
Niu, Gang; Dai, Bo; Yamada, Makoto; Sugiyama, Masashi
2014-08-01
We propose a general information-theoretic approach to semi-supervised metric learning called SERAPH (SEmi-supervised metRic leArning Paradigm with Hypersparsity) that does not rely on the manifold assumption. Given the probability parameterized by a Mahalanobis distance, we maximize its entropy on labeled data and minimize its entropy on unlabeled data following entropy regularization. For metric learning, entropy regularization improves manifold regularization by considering the dissimilarity information of unlabeled data in the unsupervised part, and hence it allows the supervised and unsupervised parts to be integrated in a natural and meaningful way. Moreover, we regularize SERAPH by trace-norm regularization to encourage low-dimensional projections associated with the distance metric. The nonconvex optimization problem of SERAPH could be solved efficiently and stably by either a gradient projection algorithm or an EM-like iterative algorithm whose M-step is convex. Experiments demonstrate that SERAPH compares favorably with many well-known metric learning methods, and the learned Mahalanobis distance possesses high discriminability even under noisy environments.
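A Mahalanobis distance parameterized as M = LᵀL is the Euclidean distance after the linear projection L, and the trace-norm regularization mentioned above encourages exactly such low-rank projections. A minimal sketch with a hand-picked rank-1 projection, not a learned SERAPH metric:

```python
def mahalanobis(L, x, y):
    """Distance induced by M = L^T L, i.e. Euclidean distance after
    projecting the difference vector by L. A low-rank L (here 1x2) is the
    kind of low-dimensional projection a trace-norm regularizer favors.
    The matrix is a toy example, not a learned one."""
    diff = [xi - yi for xi, yi in zip(x, y)]
    proj = [sum(l * d for l, d in zip(row, diff)) for row in L]
    return sum(p * p for p in proj) ** 0.5

L = [[1.0, 0.0]]  # rank-1 metric: only the first coordinate matters
print(mahalanobis(L, (3.0, 100.0), (0.0, -100.0)))
```

With the identity matrix for L this reduces to the ordinary Euclidean distance, which makes the role of the learned projection easy to see.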
Fisher information metrics for binary classifier evaluation and training
CERN. Geneva
2018-01-01
Different evaluation metrics for binary classifiers are appropriate to different scientific domains and even to different problems within the same domain. This presentation focuses on the optimisation of event selection to minimise statistical errors in HEP parameter estimation, a problem that is best analysed in terms of the maximisation of Fisher information about the measured parameters. After describing a general formalism to derive evaluation metrics based on Fisher information, three more specific metrics are introduced for the measurements of signal cross sections in counting experiments (FIP1) or distribution fits (FIP2) and for the measurements of other parameters from distribution fits (FIP3). The FIP2 metric is particularly interesting because it can be derived from any ROC curve, provided that prevalence is also known. In addition to its relation to measurement errors when used as an evaluation criterion (which makes it more interesting than the ROC AUC), a further advantage of the FIP2 metric is ...
Metrical and dynamical aspects in complex analysis
2017-01-01
The central theme of this reference book is the metric geometry of complex analysis in several variables. Bridging a gap in the current literature, the text focuses on the fine behavior of the Kobayashi metric of complex manifolds and its relationships to dynamical systems, hyperbolicity in the sense of Gromov and operator theory, all very active areas of research. The modern points of view expressed in these notes, collected here for the first time, will be of interest to academics working in the fields of several complex variables and metric geometry. The different topics are treated coherently and include expository presentations of the relevant tools, techniques and objects, which will be particularly useful for graduate and PhD students specializing in the area.
Important LiDAR metrics for discriminating forest tree species in Central Europe
Shi, Yifang; Wang, Tiejun; Skidmore, Andrew K.; Heurich, Marco
2018-03-01
Numerous airborne LiDAR-derived metrics have been proposed for classifying tree species. Yet an in-depth ecological and biological understanding of the significance of these metrics for tree species mapping remains largely unexplored. In this paper, we evaluated the performance of 37 frequently used LiDAR metrics derived under leaf-on and leaf-off conditions, respectively, for discriminating six different tree species in a natural forest in Germany. We firstly assessed the correlation between these metrics. Then we applied a Random Forest algorithm to classify the tree species and evaluated the importance of the LiDAR metrics. Finally, we identified the most important LiDAR metrics and tested their robustness and transferability. Our results indicated that about 60% of LiDAR metrics were highly correlated to each other (|r| > 0.7). There was no statistically significant difference in tree species mapping accuracy between the use of leaf-on and leaf-off LiDAR metrics. However, combining leaf-on and leaf-off LiDAR metrics significantly increased the overall accuracy from 58.2% (leaf-on) and 62.0% (leaf-off) to 66.5% as well as the kappa coefficient from 0.47 (leaf-on) and 0.51 (leaf-off) to 0.58. Radiometric features, especially intensity related metrics, provided more consistent and significant contributions than geometric features for tree species discrimination. Specifically, the mean intensity of first-or-single returns as well as the mean value of echo width were identified as the most robust LiDAR metrics for tree species discrimination. These results indicate that metrics derived from airborne LiDAR data, especially radiometric metrics, can aid in discriminating tree species in a mixed temperate forest, and represent candidate metrics for tree species classification and monitoring in Central Europe.
Nindl, Bradley C; Jaffin, Dianna P; Dretsch, Michael N; Cheuvront, Samuel N; Wesensten, Nancy J; Kent, Michael L; Grunberg, Neil E; Pierce, Joseph R; Barry, Erin S; Scott, Jonathan M; Young, Andrew J; OʼConnor, Francis G; Deuster, Patricia A
2015-11-01
Human performance optimization (HPO) is defined as "the process of applying knowledge, skills and emerging technologies to improve and preserve the capabilities of military members, and organizations to execute essential tasks." The lack of consensus for operationally relevant and standardized metrics that meet joint military requirements has been identified as the single most important gap for research and application of HPO. In 2013, the Consortium for Health and Military Performance hosted a meeting to develop a toolkit of standardized HPO metrics for use in military and civilian research, and potentially for field applications by commanders, units, and organizations. Performance was considered from a holistic perspective as being influenced by various behaviors and barriers. To accomplish the goal of developing a standardized toolkit, key metrics were identified and evaluated across a spectrum of domains that contribute to HPO: physical performance, nutritional status, psychological status, cognitive performance, environmental challenges, sleep, and pain. These domains were chosen based on relevant data with regard to performance enhancers and degraders. The specific objectives at this meeting were to (a) identify and evaluate current metrics for assessing human performance within selected domains; (b) prioritize metrics within each domain to establish a human performance assessment toolkit; and (c) identify scientific gaps and the needed research to more effectively assess human performance across domains. This article provides a summary of 150 total HPO metrics across multiple domains that can be used as a starting point, the beginning of an HPO toolkit: physical fitness (29 metrics), nutrition (24 metrics), psychological status (36 metrics), cognitive performance (35 metrics), environment (12 metrics), sleep (9 metrics), and pain (5 metrics). These metrics can be particularly valuable as the military emphasizes a renewed interest in Human Dimension efforts.
2011-01-01
Background Citations in peer-reviewed articles and the impact factor are generally accepted measures of scientific impact. Web 2.0 tools such as Twitter, blogs or social bookmarking tools provide the possibility to construct innovative article-level or journal-level metrics to gauge impact and influence. However, the relationship of these new metrics to traditional metrics such as citations is not known. Objective (1) To explore the feasibility of measuring social impact of and public attention to scholarly articles by analyzing buzz in social media, (2) to explore the dynamics, content, and timing of tweets relative to the publication of a scholarly article, and (3) to explore whether these metrics are sensitive and specific enough to predict highly cited articles. Methods Between July 2008 and November 2011, all tweets containing links to articles in the Journal of Medical Internet Research (JMIR) were mined. For a subset of 1573 tweets about 55 articles published between issues 3/2009 and 2/2010, different metrics of social media impact were calculated and compared against subsequent citation data from Scopus and Google Scholar 17 to 29 months later. A heuristic to predict the top-cited articles in each issue through tweet metrics was validated. Results A total of 4208 tweets cited 286 distinct JMIR articles. The distribution of tweets over the first 30 days after article publication followed a power law (Zipf, Bradford, or Pareto distribution), with most tweets sent on the day when an article was published (1458/3318, 43.94% of all tweets in a 60-day period) or on the following day (528/3318, 15.9%), followed by a rapid decay. The Pearson correlations between tweetations and citations were moderate and statistically significant, with correlation coefficients ranging from .42 to .72 for the log-transformed Google Scholar citations, but were less clear for Scopus citations and rank correlations. A linear multivariate model with time and tweets as significant
Rainbows without unicorns: metric structures in theories with modified dispersion relations
International Nuclear Information System (INIS)
Lobo, Iarley P.; Loret, Niccolo; Nettel, Francisco
2017-01-01
Rainbow metrics are a widely used approach to the metric formalism for theories with modified dispersion relations. They have had a huge success in the quantum gravity phenomenology literature, since they allow one to introduce momentum-dependent space-time metrics into the description of systems with a modified dispersion relation. In this paper, we introduce the reader to some realizations of this general idea: the original rainbow metrics proposal, the momentum-space-inspired metric and a Finsler geometry approach. As the main result of this work we also present an alternative definition of a four-velocity dependent metric which allows one to handle the massless limit. This paper aims to highlight some of their properties and how to properly describe their relativistic realizations. (orig.)
Rainbows without unicorns: metric structures in theories with modified dispersion relations
Lobo, Iarley P.; Loret, Niccoló; Nettel, Francisco
2017-07-01
Rainbow metrics are a widely used approach to the metric formalism for theories with modified dispersion relations. They have had a huge success in the quantum gravity phenomenology literature, since they allow one to introduce momentum-dependent space-time metrics into the description of systems with a modified dispersion relation. In this paper, we introduce the reader to some realizations of this general idea: the original rainbow metrics proposal, the momentum-space-inspired metric and a Finsler geometry approach. As the main result of this work we also present an alternative definition of a four-velocity dependent metric which allows one to handle the massless limit. This paper aims to highlight some of their properties and how to properly describe their relativistic realizations.
Rainbows without unicorns: metric structures in theories with modified dispersion relations
Energy Technology Data Exchange (ETDEWEB)
Lobo, Iarley P. [Universita ' ' La Sapienza' ' , Dipartimento di Fisica, Rome (Italy); ICRANet, Pescara (Italy); CAPES Foundation, Ministry of Education of Brazil, Brasilia (Brazil); Universidade Federal da Paraiba, Departamento de Fisica, Joao Pessoa, PB (Brazil); INFN Sezione Roma 1 (Italy); Loret, Niccolo [Ruder Boskovic Institute, Division of Theoretical Physics, Zagreb (Croatia); Nettel, Francisco [Universita ' ' La Sapienza' ' , Dipartimento di Fisica, Rome (Italy); Universidad Nacional Autonoma de Mexico, Instituto de Ciencias Nucleares, Mexico (Mexico); INFN Sezione Roma 1 (Italy)
2017-07-15
Rainbow metrics are a widely used approach to the metric formalism for theories with modified dispersion relations. They have had a huge success in the quantum gravity phenomenology literature, since they allow one to introduce momentum-dependent space-time metrics into the description of systems with a modified dispersion relation. In this paper, we introduce the reader to some realizations of this general idea: the original rainbow metrics proposal, the momentum-space-inspired metric and a Finsler geometry approach. As the main result of this work we also present an alternative definition of a four-velocity dependent metric which allows one to handle the massless limit. This paper aims to highlight some of their properties and how to properly describe their relativistic realizations. (orig.)
Temporal variability of daily personal magnetic field exposure metrics in pregnant women.
Lewis, Ryan C; Evenson, Kelly R; Savitz, David A; Meeker, John D
2015-01-01
Recent epidemiology studies of power-frequency magnetic fields and reproductive health have characterized exposures using data collected from personal exposure monitors over a single day, possibly resulting in exposure misclassification due to temporal variability in daily personal magnetic field exposure metrics, but relevant data in adults are limited. We assessed the temporal variability of daily central tendency (time-weighted average, median) and peak (upper percentiles, maximum) personal magnetic field exposure metrics over 7 consecutive days in 100 pregnant women. When exposure was modeled as a continuous variable, central tendency metrics had substantial reliability, whereas peak metrics had fair (maximum) to moderate (upper percentiles) reliability. The predictive ability of a single-day metric to accurately classify participants into exposure categories based on a weeklong metric depended on the selected exposure threshold, with sensitivity decreasing with increasing exposure threshold. Consistent with the continuous measures analysis, sensitivity was higher for central tendency metrics than for peak metrics. If there is interest in peak metrics, more than 1 day of measurement is needed over the window of disease susceptibility to minimize measurement error, but 1 day may be sufficient for central tendency metrics.
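The central-tendency and peak metrics compared in the study can be computed from one day of monitor readings. A minimal sketch; the milligauss values and the nearest-rank percentile rule are illustrative choices, not the study's protocol:

```python
import math
import statistics

def daily_metrics(readings_mG):
    """Central-tendency (time-weighted average, median) and peak
    (95th percentile, maximum) summaries of one day of equally spaced
    personal magnetic field readings."""
    s = sorted(readings_mG)
    p95_idx = min(len(s) - 1, math.ceil(0.95 * len(s)) - 1)  # nearest rank
    return {
        "twa": statistics.fmean(readings_mG),  # equal spacing -> plain mean
        "median": statistics.median(readings_mG),
        "p95": s[p95_idx],
        "max": s[-1],
    }

day = [0.5, 0.6, 0.7, 0.5, 3.2, 0.6, 0.5, 0.8, 0.6, 9.0]
m = daily_metrics(day)
print(m["median"], m["max"])
```

The study's finding follows the shape of these summaries: a couple of transient spikes move the maximum and 95th percentile a lot while barely touching the median, which is why peak metrics are less reliable day to day.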
Development of soil quality metrics using mycorrhizal fungi
Energy Technology Data Exchange (ETDEWEB)
Baar, J.
2010-07-01
Based on the Treaty on Biological Diversity of Rio de Janeiro in 1992 for maintaining and increasing biodiversity, several countries have started programmes monitoring soil quality and the above- and below ground biodiversity. Within the European Union, policy makers are working on legislation for soil protection and management. Therefore, indicators are needed to monitor the status of the soils and these indicators reflecting the soil quality, can be integrated in working standards or soil quality metrics. Soil micro-organisms, particularly arbuscular mycorrhizal fungi (AMF), are indicative of soil changes. These soil fungi live in symbiosis with the great majority of plants and are sensitive to changes in the physico-chemical conditions of the soil. The aim of this study was to investigate whether AMF are reliable and sensitive indicators for disturbances in the soils and can be used for the development of soil quality metrics. Also, it was studied whether soil quality metrics based on AMF meet requirements for applicability by users and policy makers. Ecological criteria were set for the development of soil quality metrics for different soils. Multiple root samples containing AMF from various locations in The Netherlands were analyzed. The results of the analyses were related to the defined criteria. This resulted in two soil quality metrics, one for sandy soils and a second one for clay soils, with six different categories ranging from very bad to very good. These soil quality metrics meet the majority of requirements for applicability and are potentially useful for the development of legislation for the protection of soil quality. (Author) 23 refs.
Measures of agreement between computation and experiment:validation metrics.
Energy Technology Data Exchange (ETDEWEB)
Barone, Matthew Franklin; Oberkampf, William Louis
2005-08-01
With the increasing role of computational modeling in engineering design, performance estimation, and safety assessment, improved methods are needed for comparing computational results and experimental measurements. Traditional methods of graphically comparing computational and experimental results, though valuable, are essentially qualitative. Computable measures are needed that can quantitatively compare computational and experimental results over a range of input, or control, variables and sharpen assessment of computational accuracy. This type of measure has been recently referred to as a validation metric. We discuss various features that we believe should be incorporated in a validation metric and also features that should be excluded. We develop a new validation metric that is based on the statistical concept of confidence intervals. Using this fundamental concept, we construct two specific metrics: one that requires interpolation of experimental data and one that requires regression (curve fitting) of experimental data. We apply the metrics to three example problems: thermal decomposition of a polyurethane foam, a turbulent buoyant plume of helium, and compressibility effects on the growth rate of a turbulent free-shear layer. We discuss how the present metrics are easily interpretable for assessing computational model accuracy, as well as the impact of experimental measurement uncertainty on the accuracy assessment.
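The confidence-interval idea behind such a validation metric can be sketched in a few lines. This toy uses a normal-approximation interval on the signed mean model-minus-experiment error; the report's actual metrics (interpolation- and regression-based, with experimental uncertainty) are more elaborate, and all values below are invented:

```python
import statistics

def validation_metric(model, experiment, z=1.96):
    """Signed mean model-minus-experiment error with an approximate 95%
    confidence interval on that mean (normal critical value for brevity).
    If the interval contains zero, the data do not resolve a systematic
    model bias at this confidence level."""
    errs = [m - e for m, e in zip(model, experiment)]
    mean = statistics.fmean(errs)
    half = z * statistics.stdev(errs) / len(errs) ** 0.5
    return mean, (mean - half, mean + half)

model_pred = [10.1, 12.3, 14.2, 16.4, 18.1]
measured   = [10.0, 12.0, 14.5, 16.0, 18.5]
mean_err, ci = validation_metric(model_pred, measured)
print(round(mean_err, 3), ci[0] < 0 < ci[1])
```

Unlike a side-by-side plot, this yields a single quantitative statement of agreement over the range of the control variable, which is the point of a validation metric.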
Tide or Tsunami? The Impact of Metrics on Scholarly Research
Bonnell, Andrew G.
2016-01-01
Australian universities are increasingly resorting to the use of journal metrics such as impact factors and ranking lists in appraisal and promotion processes, and are starting to set quantitative "performance expectations" which make use of such journal-based metrics. The widespread use and misuse of research metrics is leading to…
Term Based Comparison Metrics for Controlled and Uncontrolled Indexing Languages
Good, B. M.; Tennis, J. T.
2009-01-01
Introduction: We define a collection of metrics for describing and comparing sets of terms in controlled and uncontrolled indexing languages and then show how these metrics can be used to characterize a set of languages spanning folksonomies, ontologies and thesauri. Method: Metrics for term set characterization and comparison were identified and…
Software metrics: The key to quality software on the NCC project
Burns, Patricia J.
1993-01-01
Network Control Center (NCC) Project metrics are captured during the implementation and testing phases of the NCCDS software development lifecycle. The metrics data collection and reporting function has interfaces with all elements of the NCC project. Close collaboration with all project elements has resulted in the development of a defined and repeatable set of metrics processes. The resulting data are used to plan and monitor release activities on a weekly basis. The use of graphical outputs facilitates the interpretation of progress and status. The successful application of metrics throughout the NCC project has been instrumental in the delivery of quality software. The use of metrics on the NCC Project supports the needs of the technical and managerial staff. This paper describes the project, the functions supported by metrics, the data that are collected and reported, how the data are used, and the improvements in the quality of deliverable software since the metrics processes and products have been in use.
48 CFR 611.002-70 - Metric system implementation.
2010-10-01
... with security, operations, economic, technical, logistical, training and safety requirements. (3) The... total cost of the retrofit, including redesign costs, exceeds $50,000; (ii) Metric is not the accepted... office with an explanation for the disapproval. (7) The in-house operating metric costs shall be...
Data characteristics that determine classifier performance
CSIR Research Space (South Africa)
Van der Walt, Christiaan M
2006-11-01
Full Text Available available at [11]. The kNN uses a LinearNN nearest neighbour search algorithm with a Euclidean distance metric [8]. The optimal k value is determined by performing 10-fold cross-validation. An optimal k value between 1 and 10 is used for Experiments 1... classifiers. 10-fold cross-validation is used to evaluate and compare the performance of the classifiers on the different data sets. 3.1. Artificial data generation Multivariate Gaussian distributions are used to generate artificial data sets. We use d...
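The procedure described in this record can be sketched as follows: a plain Euclidean-metric kNN classifier whose k is chosen by cross-validation (leave-one-out here for brevity, in place of the 10-fold scheme in the text). The data points below are made up for illustration.

```python
import math
from collections import Counter

def euclidean(a, b):
    return math.dist(a, b)  # Python 3.8+

def knn_predict(train, labels, query, k):
    # Take the k nearest training points and vote by majority.
    nearest = sorted(range(len(train)), key=lambda i: euclidean(train[i], query))[:k]
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]

def best_k_loocv(train, labels, k_range=range(1, 11)):
    # Leave-one-out accuracy for each candidate k; keep the best.
    def loo_accuracy(k):
        hits = sum(
            knn_predict(train[:i] + train[i + 1:], labels[:i] + labels[i + 1:],
                        train[i], k) == labels[i]
            for i in range(len(train)))
        return hits / len(train)
    return max(k_range, key=loo_accuracy)

X = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.3), (1.0, 1.1), (1.2, 0.9), (0.9, 1.0)]
y = ["a", "a", "a", "b", "b", "b"]
k = best_k_loocv(X, y)
print(k, knn_predict(X, y, (0.15, 0.15), k))
```

Swapping `euclidean` for a learned or adaptive distance function is all it takes to turn this into the adaptive-metric kNN discussed elsewhere in this collection.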
Metrics for assessing retailers based on consumer perception
Directory of Open Access Journals (Sweden)
Klimin Anastasii
2017-01-01
Full Text Available The article suggests a new look at trading platforms, which is called “metrics.” Metrics are a way to look at the point of sale largely from the buyer’s side. The buyer enters the store and makes buying decisions based on factors that the seller often does not consider, or considers only in part, because he “does not see” them, since he is not a buyer. The article proposes a classification of retailers, metrics and a methodology for their determination, and presents the results of an audit of retailers in St. Petersburg based on the proposed methodology.
Chaos of discrete dynamical systems in complete metric spaces
International Nuclear Information System (INIS)
Shi Yuming; Chen Guanrong
2004-01-01
This paper is concerned with chaos of discrete dynamical systems in complete metric spaces. Discrete dynamical systems governed by continuous maps in general complete metric spaces are first discussed, and two criteria of chaos are then established. As a special case, two corresponding criteria of chaos for discrete dynamical systems in compact subsets of metric spaces are obtained. These results have extended and improved the existing relevant results of chaos in finite-dimensional Euclidean spaces.
The correlation of metrics in complex networks with applications in functional brain networks
International Nuclear Information System (INIS)
Li, C; Wang, H; Van Mieghem, P; De Haan, W; Stam, C J
2011-01-01
An increasing number of network metrics have been applied in network analysis. If metric relations were known better, we could more effectively characterize networks by a small set of metrics to discover the association between network properties/metrics and network functioning. In this paper, we investigate the linear correlation coefficients between widely studied network metrics in three network models (Bárabasi–Albert graphs, Erdös–Rényi random graphs and Watts–Strogatz small-world graphs) as well as in functional brain networks of healthy subjects. The metric correlations, which we have observed and theoretically explained, motivate us to propose a small representative set of metrics by including only one metric from each subset of mutually strongly dependent metrics. The following contributions are considered important. (a) A network with a given degree distribution can indeed be characterized by a small representative set of metrics. (b) Unweighted networks, which are obtained from weighted functional brain networks with a fixed threshold, and Erdös–Rényi random graphs follow a similar degree distribution. Moreover, their metric correlations and the resultant representative metrics are similar as well. This verifies the influence of degree distribution on metric correlations. (c) Most metric correlations can be explained analytically. (d) Interestingly, the most studied metrics so far, the average shortest path length and the clustering coefficient, are strongly correlated and, thus, redundant, whereas spectral metrics, though only studied recently in the context of complex networks, seem to be essential in network characterizations. This representative set of metrics tends to both sufficiently and effectively characterize networks with a given degree distribution. In the study of a specific network, however, we have to at least consider the representative set so that important network properties will not be neglected.
Otherwise Engaged : Social Media from Vanity Metrics to Critical Analytics
Rogers, R.
2018-01-01
Vanity metrics is a term that captures the measurement and display of how well one is doing in the “success theater” of social media. The notion of vanity metrics implies a critique of metrics concerning both the object of measurement as well as their capacity to measure unobtrusively or only to
IDENTIFYING MARKETING EFFECTIVENESS METRICS (Case study: East Azerbaijan`s industrial units)
Faridyahyaie, Reza; Faryabi, Mohammad; Bodaghi Khajeh Noubar, Hossein
2012-01-01
The paper attempts to identify marketing effectiveness metrics in industrial units. The metrics investigated in this study are completely applicable and comprehensive, and consequently they can evaluate marketing effectiveness in various industries. The metrics studied include: Market Share, Profitability, Sales Growth, Customer Numbers, Customer Satisfaction and Customer Loyalty. The findings indicate that these six metrics are impressive when measuring marketing effectiveness. Data was ge...
Some observations on a fuzzy metric space
Energy Technology Data Exchange (ETDEWEB)
Gregori, V.
2017-07-01
Let $(X,d)$ be a metric space. In this paper we provide some observations about the fuzzy metric space in the sense of Kramosil and Michalek $(Y,N,\wedge)$, where $Y$ is the set of non-negative real numbers $[0,\infty[$ and $N(x,y,t)=1$ if $d(x,y)\leq t$ and $N(x,y,t)=0$ if $d(x,y)>t$. (Author)
22 CFR 226.15 - Metric system of measurement.
2010-04-01
... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Metric system of measurement. 226.15 Section 226.15 Foreign Relations AGENCY FOR INTERNATIONAL DEVELOPMENT ADMINISTRATION OF ASSISTANCE AWARDS TO U.S. NON-GOVERNMENTAL ORGANIZATIONS Pre-award Requirements § 226.15 Metric system of measurement. (a...
20 CFR 435.15 - Metric system of measurement.
2010-04-01
... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Metric system of measurement. 435.15 Section 435.15 Employees' Benefits SOCIAL SECURITY ADMINISTRATION UNIFORM ADMINISTRATIVE REQUIREMENTS FOR... metric system is the preferred measurement system for U.S. trade and commerce. The Act requires each...
Conceptual Soundness, Metric Development, Benchmarking, and Targeting for PATH Subprogram Evaluation
Energy Technology Data Exchange (ETDEWEB)
Mosey, G.; Doris, E.; Coggeshall, C.; Antes, M.; Ruch, J.; Mortensen, J.
2009-01-01
The objective of this study is to evaluate the conceptual soundness of the U.S. Department of Housing and Urban Development (HUD) Partnership for Advancing Technology in Housing (PATH) program's revised goals and establish and apply a framework to identify and recommend metrics that are the most useful for measuring PATH's progress. This report provides an evaluative review of PATH's revised goals, outlines a structured method for identifying and selecting metrics, proposes metrics and benchmarks for a sampling of individual PATH programs, and discusses other metrics that potentially could be developed that may add value to the evaluation process. The framework and individual program metrics can be used for ongoing management improvement efforts and to inform broader program-level metrics for government reporting requirements.
Assessment of the Log-Euclidean Metric Performance in Diffusion Tensor Image Segmentation
Directory of Open Access Journals (Sweden)
Mostafa Charmi
2010-06-01
Full Text Available Introduction: Appropriate definition of the distance measure between diffusion tensors has a deep impact on Diffusion Tensor Image (DTI) segmentation results. The geodesic metric is the best distance measure since it yields high-quality segmentation results. However, the important problem with the geodesic metric is the high computational cost of the algorithms based on it. The main goal of this paper is to assess the possible substitution of the geodesic metric with the Log-Euclidean one to reduce the computational cost of a statistical surface evolution algorithm. Materials and Methods: We incorporated the Log-Euclidean metric in the statistical surface evolution algorithm framework. To achieve this goal, the statistics and gradients of diffusion tensor images were defined using the Log-Euclidean metric. Numerical implementation of the segmentation algorithm was performed in the MATLAB software using finite difference techniques. Results: In the statistical surface evolution framework, the Log-Euclidean metric was able to discriminate the torus and helix patterns in synthetic datasets and rat spinal cords in biological phantom datasets from the background better than the Euclidean and J-divergence metrics. In addition, similar results were obtained with the geodesic metric. However, the main advantage of the Log-Euclidean metric over the geodesic metric was the dramatic reduction of the computational cost of the segmentation algorithm, by a factor of at least 70. Discussion and Conclusion: The qualitative and quantitative results have shown that the Log-Euclidean metric is a good substitute for the geodesic metric when using a statistical surface evolution algorithm in DTI segmentation.
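The Log-Euclidean distance used above has a compact definition: map each symmetric positive-definite (SPD) tensor through the matrix logarithm and take the Frobenius norm of the difference. A minimal sketch, with the matrix logarithm computed via eigendecomposition and made-up example tensors:

```python
import numpy as np

def spd_log(T):
    # Matrix logarithm of a symmetric positive-definite matrix:
    # log the eigenvalues, keep the eigenvectors.
    w, V = np.linalg.eigh(T)
    return V @ np.diag(np.log(w)) @ V.T

def log_euclidean_distance(T1, T2):
    return np.linalg.norm(spd_log(T1) - spd_log(T2), "fro")

A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.array([[1.5, 0.0], [0.0, 1.5]])
print(log_euclidean_distance(A, B))  # symmetric: d(A,B) == d(B,A)
```

Unlike the geodesic (affine-invariant) metric, the logarithms here can be precomputed once per tensor, which is the source of the speed-up the abstract reports.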
Absolutely minimal extensions of functions on metric spaces
International Nuclear Information System (INIS)
Milman, V A
1999-01-01
Extensions of a real-valued function from the boundary ∂X₀ of an open subset X₀ of a metric space (X,d) to X₀ are discussed. For the broad class of initial data coming under discussion (linearly bounded functions) locally Lipschitz extensions to X₀ that preserve localized moduli of continuity are constructed. In the set of these extensions an absolutely minimal extension is selected, which was considered before by Aronsson for Lipschitz initial functions in the case X₀ ⊂ ℝⁿ. An absolutely minimal extension can be regarded as an ∞-harmonic function, that is, a limit of p-harmonic functions as p→+∞. The proof of the existence of absolutely minimal extensions in a metric space with intrinsic metric is carried out by the Perron method. To this end, ∞-subharmonic, ∞-superharmonic, and ∞-harmonic functions on a metric space are defined and their properties are established.
PQSM-based RR and NR video quality metrics
Lu, Zhongkang; Lin, Weisi; Ong, Eeping; Yang, Xiaokang; Yao, Susu
2003-06-01
This paper presents a new and general concept, PQSM (Perceptual Quality Significance Map), to be used in measuring visual distortion. It makes use of the selectivity characteristic of the HVS (Human Visual System), which pays more attention to certain areas/regions of a visual signal due to one or more of the following factors: salient features in the image/video, cues from domain knowledge, and association of other media (e.g., speech or audio). PQSM is an array whose elements represent the relative perceptual-quality significance levels of the corresponding areas/regions of an image or video. Due to its generality, PQSM can be incorporated into any visual distortion metric: to improve the effectiveness and/or efficiency of perceptual metrics, or even to enhance a PSNR-based metric. A three-stage PQSM estimation method is also proposed in this paper, with an implementation of motion, texture, luminance, skin-color and face mapping. Experimental results show that the scheme can improve the performance of current image/video distortion metrics.
Models for predicting objective function weights in prostate cancer IMRT
International Nuclear Information System (INIS)
Boutilier, Justin J.; Lee, Taewoo; Craig, Tim; Sharpe, Michael B.; Chan, Timothy C. Y.
2015-01-01
Purpose: To develop and evaluate the clinical applicability of advanced machine learning models that simultaneously predict multiple optimization objective function weights from patient geometry for intensity-modulated radiation therapy of prostate cancer. Methods: A previously developed inverse optimization method was applied retrospectively to determine optimal objective function weights for 315 treated patients. The authors used an overlap volume ratio (OV) of bladder and rectum for different PTV expansions and overlap volume histogram slopes (OVSR and OVSB for the rectum and bladder, respectively) as explanatory variables that quantify patient geometry. Using the optimal weights as ground truth, the authors trained and applied three prediction models: logistic regression (LR), multinomial logistic regression (MLR), and weighted K-nearest neighbor (KNN). The population average of the optimal objective function weights was also calculated. Results: The OV at 0.4 cm and OVSR at 0.1 cm features were found to be the most predictive of the weights. The authors observed comparable performance (i.e., no statistically significant difference) between LR, MLR, and KNN methodologies, with LR appearing to perform the best. All three machine learning models outperformed the population average by a statistically significant amount over a range of clinical metrics including bladder/rectum V53Gy, bladder/rectum V70Gy, and dose to the bladder, rectum, CTV, and PTV. When comparing the weights directly, the LR model predicted bladder and rectum weights that had, on average, a 73% and 74% relative improvement over the population average weights, respectively. The treatment plans resulting from the LR weights had, on average, a rectum V70Gy that was 35% closer to the clinical plan and a bladder V70Gy that was 29% closer, compared to the population average weights. Similar results were observed for all other clinical metrics. Conclusions: The authors demonstrated that the KNN and MLR
Value-based metrics and Internet-based enterprises
Gupta, Krishan M.
2001-10-01
Within the last few years, a host of value-based metrics like EVA, MVA, TBR, CFROI, and TSR have evolved. This paper attempts to analyze the validity and applicability of EVA and the Balanced Scorecard for Internet-based organizations. Despite the collapse of the dot-com model, the firms engaged in e-commerce continue to struggle to find new ways to account for customer base, technology, employees, knowledge, etc., as part of the value of the firm. While some metrics, like the Balanced Scorecard, are geared towards internal use, others like EVA are for external use. Value-based metrics are used for performing internal audits as well as comparing firms against one another, and can also be effectively utilized by individuals outside the firm looking to determine if the firm is creating value for its stakeholders.
NASA Aviation Safety Program Systems Analysis/Program Assessment Metrics Review
Louis, Garrick E.; Anderson, Katherine; Ahmad, Tisan; Bouabid, Ali; Siriwardana, Maya; Guilbaud, Patrick
2003-01-01
The goal of this project is to evaluate the metrics and processes used by NASA's Aviation Safety Program in assessing technologies that contribute to NASA's aviation safety goals. There were three objectives for reaching this goal. First, NASA's main objectives for aviation safety were documented and their consistency was checked against the main objectives of the Aviation Safety Program. Next, the metrics used for technology investment by the Program Assessment function of AvSP were evaluated. Finally, other metrics that could be used by the Program Assessment Team (PAT) were identified and evaluated. This investigation revealed that the objectives are in fact consistent across organizational levels at NASA and with the FAA. Some of the major issues discussed in this study, which should be further investigated, are the removal of the Cost and Return-on-Investment metrics, the lack of metrics to measure the balance of investment and technology, the interdependencies between some of the metric risk driver categories, and the conflict between 'fatal accident rate' and 'accident rate' in the language of the Aviation Safety goal as stated in different sources.
Scalar metric fluctuations in space-time matter inflation
International Nuclear Information System (INIS)
Anabitarte, Mariano; Bellini, Mauricio
2006-01-01
Using the Ponce de Leon background metric, which describes a 5D universe in an apparent vacuum ($\bar{G}_{AB}=0$), we study the effective 4D evolution of both the inflaton and gauge-invariant scalar metric fluctuations in the recently introduced model of space-time matter inflation.
10 CFR 600.306 - Metric system of measurement.
2010-01-01
... cause significant inefficiencies or loss of markets to United States firms. (b) Recipients are... Requirements for Grants and Cooperative Agreements With For-Profit Organizations General § 600.306 Metric... Competitiveness Act of 1988 (15 U.S.C. 205) and implemented by Executive Order 12770, states that: (1) The metric...
A heuristic way of obtaining the Kerr metric
International Nuclear Information System (INIS)
Enderlein, J.
1997-01-01
An intuitive, straightforward way of finding the metric of a rotating black hole is presented, based on the algebra of differential forms. The representation obtained for the metric displays a simplicity which is not obvious in the usual Boyer-Lindquist coordinates. copyright 1997 American Association of Physics Teachers
Using metrics in stability of stochastic programming problems
Czech Academy of Sciences Publication Activity Database
Houda, Michal
2005-01-01
Vol. 13, No. 1 (2005), pp. 128-134. ISSN 0572-3043. R&D Projects: GA ČR(CZ) GA402/04/1294. Institutional research plan: CEZ:AV0Z10750506. Keywords: stochastic programming * quantitative stability * Wasserstein metrics * Kolmogorov metrics * simulation study. Subject RIV: BB - Applied Statistics, Operational Research
Synchronization of multi-agent systems with metric-topological interactions.
Wang, Lin; Chen, Guanrong
2016-09-01
A hybrid multi-agent systems model integrating the advantages of both metric interaction and topological interaction rules, called the metric-topological model, is developed. This model describes planar motions of mobile agents, where each agent can interact with all the agents within a circle of a constant radius, and can furthermore interact with some distant agents to reach a pre-assigned number of neighbors, if needed. Some sufficient conditions imposed only on system parameters and agent initial states are presented, which ensure achieving synchronization of the whole group of agents. This reveals the intrinsic relationships among the interaction range, the speed, the initial heading, and the density of the group. Moreover, robustness against variations of interaction range, density, and speed is investigated by comparing the motion patterns and performances of the hybrid metric-topological interaction model with the conventional metric-only and topological-only interaction models. In practically all cases, the hybrid metric-topological interaction model has the best performance in the sense of achieving the highest frequency of synchronization, the fastest convergence rate, and the smallest heading difference.
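The hybrid neighbor-selection rule described in this record can be sketched directly. This is a hypothetical illustration of that rule, not the paper's simulation: an agent interacts with every agent inside a fixed radius and, if that yields fewer than a pre-assigned number of neighbors, additionally with the nearest distant agents until the quota is met. Positions and parameters are made up.

```python
import math

def hybrid_neighbors(positions, i, radius, min_neighbors):
    others = [j for j in range(len(positions)) if j != i]
    by_dist = sorted(others, key=lambda j: math.dist(positions[i], positions[j]))
    # Metric rule: everyone within the interaction radius.
    metric = [j for j in by_dist if math.dist(positions[i], positions[j]) <= radius]
    if len(metric) >= min_neighbors:
        return metric                      # metric rule suffices
    # Topological top-up: nearest agents until the quota is met.
    return by_dist[:min_neighbors]

pts = [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5)]
print(hybrid_neighbors(pts, 0, radius=1.5, min_neighbors=2))  # within radius
print(hybrid_neighbors(pts, 3, radius=0.5, min_neighbors=2))  # topped up
```

The top-up branch is what keeps a sparse or isolated agent connected to the group, which is how the hybrid model avoids the fragmentation that a pure metric rule can suffer at low density.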
Modified intuitionistic fuzzy metric spaces and some fixed point theorems
International Nuclear Information System (INIS)
Saadati, R.; Sedghi, S.; Shobe, N.
2008-01-01
The intuitionistic fuzzy metric space has extra conditions (see [Gregori V, Romaguera S, Veereamani P. A note on intuitionistic fuzzy metric spaces. Chaos, Solitons and Fractals 2006;28:902-5]). In this paper, we therefore consider modified intuitionistic fuzzy metric spaces and prove some fixed point theorems in these spaces. All the results presented in this paper are new.
Effect of thematic map misclassification on landscape multi-metric assessment.
Kleindl, William J; Powell, Scott L; Hauer, F Richard
2015-06-01
Advancements in remote sensing and computational tools have increased our awareness of large-scale environmental problems, thereby creating a need for monitoring, assessment, and management at these scales. Over the last decade, several watershed and regional multi-metric indices have been developed to assist decision-makers with planning actions of these scales. However, these tools use remote-sensing products that are subject to land-cover misclassification, and these errors are rarely incorporated in the assessment results. Here, we examined the sensitivity of a landscape-scale multi-metric index (MMI) to error from thematic land-cover misclassification and the implications of this uncertainty for resource management decisions. Through a case study, we used a simplified floodplain MMI assessment tool, whose metrics were derived from Landsat thematic maps, to initially provide results that were naive to thematic misclassification error. Using a Monte Carlo simulation model, we then incorporated map misclassification error into our MMI, resulting in four important conclusions: (1) each metric had a different sensitivity to error; (2) within each metric, the bias between the error-naive metric scores and simulated scores that incorporate potential error varied in magnitude and direction depending on the underlying land cover at each assessment site; (3) collectively, when the metrics were combined into a multi-metric index, the effects were attenuated; and (4) the index bias indicated that our naive assessment model may overestimate floodplain condition of sites with limited human impacts and, to a lesser extent, either over- or underestimated floodplain condition of sites with mixed land use.
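The Monte Carlo idea in this record can be illustrated with a toy sketch (not the paper's model): a simple "natural cover" metric is the fraction of pixels mapped as natural, and each pixel label is perturbed according to an assumed per-class misclassification probability before the metric is recomputed. All class names, error rates and counts below are made up.

```python
import random

P_FLIP = {"natural": 0.10, "developed": 0.05}   # assumed thematic error rates

def perturb(labels, rng):
    # Flip each pixel's class with its class-specific error probability.
    flip = {"natural": "developed", "developed": "natural"}
    return [flip[c] if rng.random() < P_FLIP[c] else c for c in labels]

def natural_fraction(labels):
    return sum(c == "natural" for c in labels) / len(labels)

rng = random.Random(42)
site = ["natural"] * 800 + ["developed"] * 200
naive = natural_fraction(site)                   # error-naive metric score
simulated = [natural_fraction(perturb(site, rng)) for _ in range(2000)]
bias = sum(simulated) / len(simulated) - naive
print(f"naive score {naive:.3f}, mean simulated bias {bias:+.4f}")
```

Because the two classes have different error rates and unequal abundance, the simulated scores are systematically biased relative to the naive score, mirroring conclusion (2) of the abstract: the direction and magnitude of the bias depend on the underlying land cover at each site.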
Crowdsourcing metrics of digital collections
Directory of Open Access Journals (Sweden)
Tuula Pääkkönen
2015-12-01
Full Text Available In the National Library of Finland (NLF) there are millions of digitized newspaper and journal pages, which are openly available via the public website http://digi.kansalliskirjasto.fi. To serve users better, last year the front end was completely overhauled, with a main aim of adding crowdsourcing features, e.g., giving end-users the opportunity to create digital clippings and a personal scrapbook from the digital collections. But how can you know whether crowdsourcing has had an impact? How much have the crowdsourcing functionalities been used so far? Did crowdsourcing work? In this paper the statistics and metrics of a recent crowdsourcing effort are analysed across the different digitized material types (newspapers, journals, ephemera). The subjects, categories and keywords given by the users are analysed to see which topics are the most appealing. Some notable public uses of the crowdsourced article clippings are highlighted. These metrics give us indications of how the end-users, based on their own interests, are investigating and using the digital collections. Therefore, the suggested metrics illustrate the versatility of the information needs of the users, varying from citizen science to research purposes. By analysing the user patterns, we can respond to the new needs of the users by making minor changes to accommodate the most active participants, while still making the service more approachable for those who are trying out the functionalities for the first time. Participation in the clippings and annotations can enrich the materials in unexpected ways and can possibly pave the way for opportunities of using crowdsourcing more also in research contexts. This creates more opportunities for the goals of open science since source data becomes available, making it possible for researchers to reach out to the general public for help. In the long term, utilizing, for example, text mining methods can allow these different end-user segments to
First results from a combined analysis of CERN computing infrastructure metrics
Duellmann, Dirk; Nieke, Christian
2017-10-01
The IT Analysis Working Group (AWG) has been formed at CERN across individual computing units and the experiments to attempt a cross-cutting analysis of computing infrastructure and application metrics. In this presentation we will describe the first results obtained using medium/long-term data (1 month to 1 year) correlating box-level metrics, job-level metrics from LSF and HTCondor, IO metrics from the physics analysis disk pools (EOS), and networking and application-level metrics from the experiment dashboards. We will cover in particular the measurement of hardware performance and the prediction of job duration, the latency sensitivity of different job types, and a search for bottlenecks with the production job mix in the current infrastructure. The presentation will conclude with the proposal of a small set of metrics to simplify drawing conclusions also in the more constrained environment of public cloud deployments.
Nonlinear Semi-Supervised Metric Learning Via Multiple Kernels and Local Topology.
Li, Xin; Bai, Yanqin; Peng, Yaxin; Du, Shaoyi; Ying, Shihui
2018-03-01
Changing the metric on the data may change the data distribution, hence a good distance metric can promote the performance of the learning algorithm. In this paper, we address the semi-supervised distance metric learning (ML) problem to obtain the best nonlinear metric for the data. First, we describe the nonlinear metric by a multiple kernel representation. By this approach, we project the data into a high-dimensional space, where the data can be well represented by linear ML. Then, we reformulate linear ML as a minimization problem on the positive definite matrix group. Finally, we develop a two-step algorithm for solving this model and design an intrinsic steepest descent algorithm to learn the positive definite metric matrix. Experimental results validate that our proposed method is effective and outperforms several state-of-the-art ML methods.
Anisotropic mesh adaptation for marine ice-sheet modelling
Gillet-Chaulet, Fabien; Tavard, Laure; Merino, Nacho; Peyaud, Vincent; Brondex, Julien; Durand, Gael; Gagliardini, Olivier
2017-04-01
Improving forecasts of ice-sheets contribution to sea-level rise requires, amongst others, to correctly model the dynamics of the grounding line (GL), i.e. the line where the ice detaches from its underlying bed and goes afloat on the ocean. Many numerical studies, including the intercomparison exercises MISMIP and MISMIP3D, have shown that grid refinement in the GL vicinity is a key component to obtain reliable results. Improving model accuracy while maintaining the computational cost affordable has then been an important target for the development of marine icesheet models. Adaptive mesh refinement (AMR) is a method where the accuracy of the solution is controlled by spatially adapting the mesh size. It has become popular in models using the finite element method as they naturally deal with unstructured meshes, but block-structured AMR has also been successfully applied to model GL dynamics. The main difficulty with AMR is to find efficient and reliable estimators of the numerical error to control the mesh size. Here, we use the estimator proposed by Frey and Alauzet (2015). Based on the interpolation error, it has been found effective in practice to control the numerical error, and has some flexibility, such as its ability to combine metrics for different variables, that makes it attractive. Routines to compute the anisotropic metric defining the mesh size have been implemented in the finite element ice flow model Elmer/Ice (Gagliardini et al., 2013). The mesh adaptation is performed using the freely available library MMG (Dapogny et al., 2014) called from Elmer/Ice. Using a setup based on the inter-comparison exercise MISMIP+ (Asay-Davis et al., 2016), we study the accuracy of the solution when the mesh is adapted using various variables (ice thickness, velocity, basal drag, …). We show that combining these variables allows to reduce the number of mesh nodes by more than one order of magnitude, for the same numerical accuracy, when compared to uniform mesh
Information Quality Aware Data Collection for Adaptive Monitoring of Distribution Grids
DEFF Research Database (Denmark)
Kemal, Mohammed Seifu; Olsen, Rasmus Løvenstein; Schwefel, Hans-Peter
2017-01-01
Information from existing smart metering infrastructure, mainly used for billing purposes, can also be utilised to monitor and control the state of the grid. To add functionalities such as fault detection and real-time state estimation, data from smart meters should be accessed with increased frequency during run time. The data collection system should adapt to changing dynamics of the communication network and electrical grid. This paper first introduces adaptation functionalities for the data collection mechanism. To study and analyse the influence of configuration parameters that can be utilised for adaptation, a two-layer smart meter data access infrastructure is presented. An information quality metric, Mismatch Probability (mmPr), is introduced for the quantitative analysis of the two-layer data access system implemented in a MATLAB-based discrete event simulation study.
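The intuition behind a mismatch probability can be illustrated with a small Monte Carlo sketch: if the grid state changes as a Poisson process and the monitoring system reads a copy refreshed every `interval` seconds, the reader sees a stale value whenever a change occurred since the last refresh. This is only an illustrative model, not the paper's mmPr formulation:

```python
import random

def mmpr_periodic(rate, interval, n=100_000, seed=1):
    """Monte Carlo estimate of a mismatch probability for a periodically
    refreshed cache: the source changes as a Poisson process with `rate`
    changes/s; a read at a random instant is a mismatch if at least one
    change happened since the last refresh. Illustrative sketch only."""
    rng = random.Random(seed)
    mismatches = 0
    for _ in range(n):
        age = rng.uniform(0, interval)     # time since last refresh
        # time to first change after the refresh is Exp(rate) distributed;
        # a mismatch occurs if that change falls within `age`
        if rng.expovariate(rate) < age:
            mismatches += 1
    return mismatches / n

# Faster refresh (smaller interval) lowers the mismatch probability,
# at the cost of more communication traffic.
print(mmpr_periodic(rate=1.0, interval=5.0))
print(mmpr_periodic(rate=1.0, interval=0.5))
```

The adaptation problem in the paper is then a trade-off: tune the access interval (and other configuration parameters) so the information quality stays acceptable under changing network and grid dynamics.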
On the topology defined by Thurston's asymmetric metric
DEFF Research Database (Denmark)
Papadopoulos, Athanase; Theret, Guillaume
2007-01-01
We show that the topology that the asymmetric metric L induces on Teichmüller space is the same as the usual topology. Furthermore, we show that L satisfies the axioms of a (not necessarily symmetric) metric in the sense of Busemann and conclude that L is complete in the sense of Busemann.
An accurate metric for the spacetime around neutron stars
Pappas, George
2016-01-01
The problem of having an accurate description of the spacetime around neutron stars is of great astrophysical interest. For astrophysical applications, one needs to have a metric that captures all the properties of the spacetime around a neutron star. Furthermore, an accurate appropriately parameterised metric, i.e., a metric that is given in terms of parameters that are directly related to the physical structure of the neutron star, could be used to solve the inverse problem, which is to inf...
Analytic convergence of harmonic metrics for parabolic Higgs bundles
Kim, Semin; Wilkin, Graeme
2018-04-01
In this paper we investigate the moduli space of parabolic Higgs bundles over a punctured Riemann surface with varying weights at the punctures. We show that the harmonic metric depends analytically on the weights and the stable Higgs bundle. This gives a Higgs bundle generalisation of a theorem of McOwen on the existence of hyperbolic cone metrics on a punctured surface within a given conformal class, and a generalisation of a theorem of Judge on the analytic parametrisation of these metrics.
A software quality model and metrics for risk assessment
Hyatt, L.; Rosenberg, L.
1996-01-01
A software quality model and its associated attributes are defined and used as the basis for a discussion of risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined from the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.
Contrasting Various Metrics for Measuring Tropical Cyclone Activity
Directory of Open Access Journals (Sweden)
Jia-Yuh Yu; Ping-Gin Chiu
2012-01-01
Full Text Available Popular metrics used for measuring tropical cyclone (TC) activity, including NTC (number of tropical cyclones), TCD (tropical cyclone days), ACE (accumulated cyclone energy), and PDI (power dissipation index), along with two newly proposed indices, RACE (revised accumulated cyclone energy) and RPDI (revised power dissipation index), are compared using the JTWC (Joint Typhoon Warning Center) best-track data of TCs over the western North Pacific basin. Our study shows that, while the above metrics exhibit various degrees of discrepancy, in practical terms they are all able to produce meaningful temporal and spatial changes in response to climate variability. Compared with the conventional ACE and PDI, RACE and RPDI seem to provide a more precise estimate of total TC activity, especially in projecting the upswing trend of TC activity over the past few decades, simply because of a better approach to estimating TC wind energy. However, we would argue that there is still no need to find a "universal" or "best" metric for TC activity, because different metrics are designed to stratify different aspects of TC activity, and whether the selected metric is appropriate should be determined solely by the purpose of the study. Except for magnitude differences, the analysis results seem insensitive to the choice of best-track dataset.
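Two of these metrics have simple conventional definitions: ACE sums the squared 6-hourly maximum sustained winds (in knots, scaled by 10⁻⁴) for fixes at or above tropical-storm strength, and PDI cubes the wind instead. A minimal sketch using those conventional definitions (the 34 kt cutoff is the usual convention; the paper's revised RACE/RPDI variants refine the wind-energy estimate and are not reproduced here):

```python
def ace(vmax_knots):
    """Accumulated cyclone energy: sum of squared 6-hourly maximum
    sustained winds (knots) for fixes >= 34 kt, in units of 1e4 kt^2."""
    return sum(v ** 2 for v in vmax_knots if v >= 34) * 1e-4

def pdi(vmax_knots):
    """Power dissipation index: the cubed-wind analogue of ACE."""
    return sum(v ** 3 for v in vmax_knots if v >= 34) * 1e-4

# Hypothetical 6-hourly best-track winds for one storm's lifetime.
track = [30, 40, 55, 70, 90, 75, 50, 30]
print(ace(track), pdi(track))
```

Because PDI weights strong winds more heavily than ACE, the two indices can rank seasons differently even on the same best-track data, which is part of why the abstract argues against a single "best" metric.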
Tracking occupational hearing loss across global industries: a comparative analysis of metrics.
Rabinowitz, Peter M; Galusha, Deron; McTague, Michael F; Slade, Martin D; Wesdock, James C; Dixon-Ernst, Christine
2012-01-01
Occupational hearing loss is one of the most prevalent occupational conditions; yet, there is no acknowledged international metric to allow comparisons of risk between different industries and regions. In order to make recommendations for an international standard of occupational hearing loss, members of an international industry group (the International Aluminium Association) submitted details of different hearing loss metrics currently in use by members. We compared the performance of these metrics using an audiometric data set for over 6000 individuals working in 10 locations of one member company. We calculated rates for each metric at each location from 2002 to 2006. For comparison, we calculated the difference of observed-expected (for age) binaural high-frequency hearing loss (in dB/year) for each location over the same time period. We performed linear regression to determine the correlation between each metric and the observed-expected rate of hearing loss. The different metrics produced discrepant results, with annual rates ranging from 0.0% for a less-sensitive metric to more than 10% for a highly sensitive metric. At least two metrics, a 10dB age-corrected threshold shift from baseline and a 15dB nonage-corrected shift metric, correlated well with the difference of observed-expected high-frequency hearing loss. This study suggests that it is feasible to develop an international standard for tracking occupational hearing loss in industrial working populations.
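The two best-performing metrics from the abstract can be sketched directly: a 10 dB age-corrected average shift (OSHA-style, averaged over 2, 3 and 4 kHz) and a 15 dB non-age-corrected shift at any test frequency. The audiogram values and the age-correction figure below are hypothetical:

```python
def avg_shift(baseline, current, freqs=(2000, 3000, 4000)):
    """Average threshold shift (dB) across the given frequencies."""
    return sum(current[f] - baseline[f] for f in freqs) / len(freqs)

def flagged_10db_age_corrected(baseline, current, age_correction):
    """Flag if the age-corrected average shift at 2/3/4 kHz is >= 10 dB
    (OSHA-style standard threshold shift; correction value assumed)."""
    return avg_shift(baseline, current) - age_correction >= 10

def flagged_15db(baseline, current,
                 freqs=(500, 1000, 2000, 3000, 4000, 6000)):
    """Flag if any single frequency shifted by >= 15 dB, no age correction."""
    return any(current[f] - baseline[f] >= 15 for f in freqs)

# Hypothetical audiograms: thresholds in dB HL per frequency.
base = {f: 10 for f in (500, 1000, 2000, 3000, 4000, 6000)}
now = {**base, 3000: 28, 4000: 32}      # notch developing at 3-4 kHz
print(flagged_10db_age_corrected(base, now, age_correction=3))
print(flagged_15db(base, now))
```

Running such rules over a location's audiometric records year by year gives the annual rates that the study compared against the observed-minus-expected high-frequency loss.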
Correlation between centrality metrics and their application to the opinion model
Li, C.; Li, Q.; Van Mieghem, P.F.A.; Stanley, H.E.; Wang, H.
2015-01-01
In recent decades, a number of centrality metrics describing network properties of nodes have been proposed to rank the importance of nodes. In order to understand the correlations between centrality metrics and to approximate a high-complexity centrality metric by a strongly correlated
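The kind of correlation studied here can be illustrated on a toy graph by computing two centrality metrics per node and their Pearson correlation; the graph and the choice of degree versus closeness centrality are arbitrary examples, not the paper's setup:

```python
from collections import deque

def closeness(adj, v):
    """Closeness centrality of v: (n-1) / sum of BFS distances from v
    (assumes a connected graph)."""
    dist = {v: 0}
    q = deque([v])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    return (len(adj) - 1) / sum(dist[u] for u in adj if u != v)

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (1, 3)]   # toy 5-node graph
adj = {i: set() for i in range(5)}
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

deg = [len(adj[v]) for v in sorted(adj)]
clo = [closeness(adj, v) for v in sorted(adj)]
print(pearson(deg, clo))   # degree and closeness agree closely here
```

When two metrics correlate this strongly, the cheap one (degree) can stand in for the expensive one, which is the approximation strategy the abstract alludes to.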
INFORMATIVE ENERGY METRIC FOR SIMILARITY MEASURE IN REPRODUCING KERNEL HILBERT SPACES
Directory of Open Access Journals (Sweden)
Songhua Liu
2012-02-01
Full Text Available In this paper, an information energy metric (IEM) is obtained by similarity computing for high-dimensional samples in a reproducing kernel Hilbert space (RKHS). Firstly, similar/dissimilar subsets and their corresponding informative energy functions are defined. Secondly, IEM is proposed for similarity measure of those subsets, which converts the non-metric distances into metric ones. Finally, applications of this metric are introduced, such as classification problems. Experimental results validate the effectiveness of the proposed method.
A Validation of Object-Oriented Design Metrics as Quality Indicators
Basili, Victor R.; Briand, Lionel C.; Melo, Walcelio
1997-01-01
This paper presents the results of a study in which we empirically investigated the suite of object-oriented (OO) design metrics introduced in another work. More specifically, our goal is to assess these metrics as predictors of fault-prone classes and, therefore, determine whether they can be used as early quality indicators. This study is complementary to the work in which the same suite of metrics was used to assess frequencies of maintenance changes to classes. To perform our validation accurately, we collected data on the development of eight medium-sized information management systems based on identical requirements. All eight projects were developed using a sequential life cycle model, a well-known OO analysis/design method, and the C++ programming language. Based on empirical and quantitative analysis, the advantages and drawbacks of these OO metrics are discussed. Several of Chidamber and Kemerer's OO metrics appear to be useful for predicting class fault-proneness during the early phases of the life-cycle. Also, on our data set, they are better predictors than 'traditional' code metrics, which can only be collected at a later phase of the software development process.
Individual differences in multitasking ability and adaptability.
Morgan, Brent; D'Mello, Sidney; Abbott, Robert; Radvansky, Gabriel; Haass, Michael; Tamplin, Andrea
2013-08-01
The aim of this study was to identify the cognitive factors that predict ability and adaptability during multitasking with a flight simulator. Multitasking has become increasingly prevalent as most professions require individuals to perform multiple tasks simultaneously. Considerable research has been undertaken to identify the characteristics of people (i.e., individual differences) that predict multitasking ability. Although working memory is a reliable predictor of general multitasking ability (i.e., performance in normal conditions), there is the question of whether different cognitive faculties are needed to rapidly respond to changing task demands (adaptability). Participants first completed a battery of cognitive individual differences tests, followed by multitasking sessions with a flight simulator. After a baseline condition, the difficulty of the flight simulator was incrementally increased via four experimental manipulations, and performance metrics were collected to assess multitasking ability and adaptability. Scholastic aptitude and working memory predicted general multitasking ability (i.e., performance at baseline difficulty), but spatial manipulation (in conjunction with working memory) was a major predictor of adaptability (performance in difficult conditions after accounting for baseline performance). Multitasking ability and adaptability may be overlapping but separate constructs that draw on overlapping (but not identical) sets of cognitive abilities. The results of this study are applicable to practitioners and researchers in human factors to assess multitasking performance in real-world contexts and with realistic task constraints. We also present a framework for conceptualizing multitasking adaptability on the basis of five adaptability profiles derived from performance on tasks with consistent versus increased difficulty.
A Numerical Framework for Sobolev Metrics on the Space of Curves
DEFF Research Database (Denmark)
Bauer, Martin; Bruveris, Martins; Harms, Philipp
2017-01-01
Statistical shape analysis can be done in a Riemannian framework by endowing the set of shapes with a Riemannian metric. Sobolev metrics of order two and higher on shape spaces of parametrized or unparametrized curves have several desirable properties not present in lower order metrics...
Self-dual metrics with self-dual Killing vectors
International Nuclear Information System (INIS)
Tod, K.P.; Ward, R.S.
1979-01-01
Twistor methods are used to derive a class of solutions to Einstein's vacuum equations, with anti-self dual Weyl tensor. In particular, all metrics with a Killing vector whose derivative is anti-self-dual and which admit a real positive-definite section are exhibited and shown to coincide with the metrics of Hawking. (author)
27 CFR 4.72 - Metric standards of fill.
2010-04-01
... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Metric standards of fill. 4.72 Section 4.72 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND TRADE BUREAU, DEPARTMENT OF THE TREASURY LIQUORS LABELING AND ADVERTISING OF WINE Standards of Fill for Wine § 4.72 Metric...
Directory of Open Access Journals (Sweden)
Regina Lopes Schimitt
2011-01-01
Full Text Available INTRODUCTION: Social rhythm is a concept that integrates the relationship between Zeitgebers (social synchronizers) and endogenous time markers, and can be assessed with the Social Rhythm Metric-17 (SRM-17). The aim of this study was to adapt the Brazilian version of the SRM-17 to Angolan Portuguese, comparing the two scales in populations that share the same language but present cultural differences. METHODS: The Brazilian version of the SRM-17 was evaluated by 10 Angolan university students, who rated the clarity of each of the instrument's 15 items on a 10 cm visual analogue scale and proposed modifications to the text. The results were reviewed to produce the final version, followed by proofreading and a final report. RESULTS: The final Angolan version maintained item equivalence with the Brazilian Portuguese version and showed a satisfactory degree of clarity and semantic equivalence in most items. However, some items scored below the arithmetic mean of overall comprehension of the instrument (8.38±1.0). CONCLUSION: Although Portuguese is the official language of both countries, there are significant cultural differences between the two populations. This study presents a version of a specific instrument for measuring social rhythm adapted to the Angolan context. The cross-cultural adaptation process should be completed with validation studies of the final instrument in a larger population sample, in which operational, measurement and functional equivalence can also be assessed.
Metrical presentation boosts implicit learning of artificial grammar.
Selchenkova, Tatiana; François, Clément; Schön, Daniele; Corneyllie, Alexandra; Perrin, Fabien; Tillmann, Barbara
2014-01-01
The present study investigated whether a temporal hierarchical structure favors implicit learning. An artificial pitch grammar implemented with a set of tones was presented in two different temporal contexts, notably with either a strongly metrical structure or an isochronous structure. According to the Dynamic Attending Theory, external temporal regularities can entrain internal oscillators that guide attention over time, allowing for temporal expectations that influence perception of future events. Based on this framework, it was hypothesized that the metrical structure provides a benefit for artificial grammar learning in comparison to an isochronous presentation. Our study combined behavioral and event-related potential measurements. Behavioral results demonstrated similar learning in both participant groups. By contrast, analyses of event-related potentials showed a larger P300 component and an earlier N2 component for the strongly metrical group during the exposure phase and the test phase, respectively. These findings suggest that the temporal expectations in the strongly metrical condition helped listeners to better process the pitch dimension, leading to improved learning of the artificial grammar.
Metrics Evolution in an Energy Research and Development Program
International Nuclear Information System (INIS)
Dixon, Brent
2011-01-01
All technology programs progress through three phases: Discovery, Definition, and Deployment. The form and application of program metrics needs to evolve with each phase. During the discovery phase, the program determines what is achievable. A set of tools is needed to define program goals, to analyze credible technical options, and to ensure that the options are compatible and meet the program objectives. A metrics system that scores the potential performance of technical options is part of this system of tools, supporting screening of concepts and aiding in the overall definition of objectives. During the definition phase, the program defines what specifically is wanted. What is achievable is translated into specific systems and specific technical options are selected and optimized. A metrics system can help with the identification of options for optimization and the selection of the option for deployment. During the deployment phase, the program shows that the selected system works. Demonstration projects are established and classical systems engineering is employed. During this phase, the metrics communicate system performance. This paper discusses an approach to metrics evolution within the Department of Energy's Nuclear Fuel Cycle R and D Program, which is working to improve the sustainability of nuclear energy.
Socio-technical security metrics
Gollmann, D.; Herley, C.; Koenig, V.; Pieters, W.; Sasse, M.A.
2015-01-01
Report from Dagstuhl seminar 14491. This report documents the program and the outcomes of Dagstuhl Seminar 14491 “Socio-Technical Security Metrics”. In the domain of safety, metrics inform many decisions, from the height of new dikes to the design of nuclear plants. We can state, for example, that
Metrics for Probabilistic Geometries
DEFF Research Database (Denmark)
Tosi, Alessandra; Hauberg, Søren; Vellido, Alfredo
2014-01-01
the distribution over mappings is given by a Gaussian process. We treat the corresponding latent variable model as a Riemannian manifold and we use the expectation of the metric under the Gaussian process prior to define interpolating paths and measure distance between latent points. We show how distances...
Analyses Of Two End-User Software Vulnerability Exposure Metrics
Energy Technology Data Exchange (ETDEWEB)
Jason L. Wright; Miles McQueen; Lawrence Wellman
2012-08-01
The risk due to software vulnerabilities will not be completely resolved in the near future. Instead, putting reliable vulnerability measures into the hands of end-users, so that informed decisions can be made regarding the relative security exposure incurred by choosing one software package over another, is of importance. To that end, we propose two new security metrics, average active vulnerabilities (AAV) and vulnerability free days (VFD). These metrics capture both the speed with which new vulnerabilities are reported to vendors and the rate at which software vendors fix them. We then examine how the metrics are computed using currently available datasets and demonstrate their estimation in a simulation experiment using four different browsers as a case study. Finally, we discuss how the metrics may be used by the various stakeholders of software to inform software usage decisions.
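Taking the abstract's descriptions at face value, both metrics can be sketched over a window of days: a vulnerability is "active" from its disclosure until its fix ships, AAV averages the daily count of active vulnerabilities, and VFD counts days with none. The dates below are hypothetical and the exact paper formulas may differ:

```python
from datetime import date, timedelta

def exposure_metrics(vulns, start, end):
    """Sketch of average active vulnerabilities (AAV) and vulnerability
    free days (VFD) over [start, end). `vulns` is a list of
    (disclosed, patched) date pairs; a vulnerability is active from
    disclosure until its fix ships."""
    total_days = (end - start).days
    active_sum = vuln_free = 0
    day = start
    while day < end:
        active = sum(1 for d, p in vulns if d <= day < p)
        active_sum += active
        vuln_free += (active == 0)
        day += timedelta(days=1)
    return active_sum / total_days, vuln_free

# Hypothetical disclosure/patch dates for one browser over January.
vulns = [(date(2024, 1, 5), date(2024, 1, 15)),
         (date(2024, 1, 10), date(2024, 1, 20))]
aav, vfd = exposure_metrics(vulns, date(2024, 1, 1), date(2024, 1, 31))
print(aav, vfd)
```

A vendor that patches quickly shrinks each active interval, lowering AAV and raising VFD, which is how the metrics jointly capture disclosure speed and fix rate.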
On the (1,1)-tensor bundle with Cheeger–Gromoll type metric
Indian Academy of Sciences (India)
The main purpose of the present paper is to construct Riemannian almost product structures on the (1, 1)-tensor bundle equipped with Cheeger–Gromoll type metric over a Riemannian manifold and present some results concerning these structures. Keywords. Almost product structure; Cheeger–Gromoll type metric; metric ...
MetricForensics: A Multi-Level Approach for Mining Volatile Graphs
Energy Technology Data Exchange (ETDEWEB)
Henderson, Keith [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Eliassi-Rad, Tina [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Faloutsos, Christos [Carnegie Mellon Univ., Pittsburgh, PA (United States); Akoglu, Leman [Carnegie Mellon Univ., Pittsburgh, PA (United States); Li, Lei [Carnegie Mellon Univ., Pittsburgh, PA (United States); Maruhashi, Koji [Fujitsu Laboratories Ltd., Kanagawa (Japan); Prakash, B. Aditya [Carnegie Mellon Univ., Pittsburgh, PA (United States); Tong, H [Carnegie Mellon Univ., Pittsburgh, PA (United States)
2010-02-08
Advances in data collection and storage capacity have made it increasingly possible to collect highly volatile graph data for analysis. Existing graph analysis techniques are not appropriate for such data, especially in cases where streaming or near-real-time results are required. An example that has drawn significant research interest is the cyber-security domain, where internet communication traces are collected and real-time discovery of events, behaviors, patterns and anomalies is desired. We propose MetricForensics, a scalable framework for analysis of volatile graphs. MetricForensics combines a multi-level "drill down" approach, a collection of user-selected graph metrics and a collection of analysis techniques. At each successive level, more sophisticated metrics are computed and the graph is viewed at a finer temporal resolution. In this way, MetricForensics scales to highly volatile graphs by only allocating resources for computationally expensive analysis when an interesting event is discovered at a coarser resolution first. We test MetricForensics on three real-world graphs: an enterprise IP trace, a trace of legitimate and malicious network traffic from a research institution, and the MIT Reality Mining proximity sensor data. Our largest graph has ≈3M vertices and ≈32M edges, spanning 4.5 days. The results demonstrate the scalability and capability of MetricForensics in analyzing volatile graphs; and highlight four novel phenomena in such graphs: elbows, broken correlations, prolonged spikes, and strange stars.
Metric space construction for the boundary of space-time
International Nuclear Information System (INIS)
Meyer, D.A.
1986-01-01
A distance function between points in space-time is defined and used to consider the manifold as a topological metric space. The properties of the distance function are investigated: conditions under which the metric and manifold topologies agree, the relationship with the causal structure of the space-time and with the maximum lifetime function of Wald and Yip, and in terms of the space of causal curves. The space-time is then completed as a topological metric space; the resultant boundary is compared with the causal boundary and is also calculated for some pertinent examples
Metrics for Diagnosing Undersampling in Monte Carlo Tally Estimates
International Nuclear Information System (INIS)
Perfetti, Christopher M.; Rearden, Bradley T.
2015-01-01
This study explored the potential of using Markov chain convergence diagnostics to predict the prevalence and magnitude of biases due to undersampling in Monte Carlo eigenvalue and flux tally estimates. Five metrics were applied to two models of pressurized water reactor fuel assemblies and their potential for identifying undersampling biases was evaluated by comparing the calculated test metrics with known biases in the tallies. Three of the five undersampling metrics showed the potential to accurately predict the behavior of undersampling biases in the responses examined in this study.
Evaluation of Vehicle-Based Crash Severity Metrics.
Tsoi, Ada H; Gabler, Hampton C
2015-01-01
Vehicle change in velocity (delta-v) is a widely used crash severity metric used to estimate occupant injury risk. Despite its widespread use, delta-v has several limitations. Of most concern, delta-v is a vehicle-based metric which does not consider the crash pulse or the performance of occupant restraints, e.g. seatbelts and airbags. Such criticisms have prompted the search for alternative impact severity metrics based upon vehicle kinematics. The purpose of this study was to assess the ability of the occupant impact velocity (OIV), acceleration severity index (ASI), vehicle pulse index (VPI), and maximum delta-v (delta-v) to predict serious injury in real world crashes. The study was based on the analysis of event data recorders (EDRs) downloaded from the National Automotive Sampling System / Crashworthiness Data System (NASS-CDS) 2000-2013 cases. All vehicles in the sample were GM passenger cars and light trucks involved in a frontal collision. Rollover crashes were excluded. Vehicles were restricted to single-event crashes that caused an airbag deployment. All EDR data were checked for a successful, completed recording of the event and that the crash pulse was complete. The maximum abbreviated injury scale (MAIS) was used to describe occupant injury outcome. Drivers were categorized into either non-seriously injured group (MAIS2-) or seriously injured group (MAIS3+), based on the severity of any injuries to the thorax, abdomen, and spine. ASI and OIV were calculated according to the Manual for Assessing Safety Hardware. VPI was calculated according to ISO/TR 12353-3, with vehicle-specific parameters determined from U.S. New Car Assessment Program crash tests. Using binary logistic regression, the cumulative probability of injury risk was determined for each metric and assessed for statistical significance, goodness-of-fit, and prediction accuracy. The dataset included 102,744 vehicles. A Wald chi-square test showed each vehicle-based crash severity metric
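Two of the metrics compared above have simple kinematic definitions: delta-v is the integral of the crash acceleration pulse, and ASI (per the EN 1317 roadside-hardware convention) is the peak 50 ms moving-average acceleration normalized by directional limits of 12/9/10 g. A longitudinal-only sketch with a hypothetical pulse (this is the standard ASI form, not necessarily the exact implementation used in the study):

```python
def delta_v(accel_g, dt):
    """Vehicle change in velocity (m/s): integral of the crash pulse,
    given accelerations in g sampled every dt seconds."""
    g = 9.81
    return sum(a * g * dt for a in accel_g)

def asi(ax_g, dt, window=0.050, limit=12.0):
    """Acceleration severity index, longitudinal-only sketch: peak 50 ms
    moving-average acceleration (in g) over the 12 g limit. The full
    EN 1317 ASI combines x/y/z components with limits 12/9/10 g."""
    n = max(1, round(window / dt))
    peak = 0.0
    for i in range(len(ax_g) - n + 1):
        avg = sum(ax_g[i:i + n]) / n
        peak = max(peak, abs(avg) / limit)
    return peak

pulse = [0, -5, -15, -25, -20, -10, -3, 0]   # hypothetical 10 ms samples, g
print(delta_v(pulse, 0.010), asi(pulse, 0.010))
```

Because ASI depends on the shape of the pulse rather than only its integral, two crashes with identical delta-v can produce different ASI values, which is the motivation for comparing such pulse-sensitive metrics against delta-v.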
Irwin, Brian J.; Conroy, Michael J.
2013-01-01
The success of natural resource management depends on monitoring, assessment and enforcement. In support of these efforts, reference points (RPs) are often viewed as critical values of management-relevant indicators. This paper considers RPs from the standpoint of objective-driven decision making in dynamic resource systems, guided by principles of structured decision making (SDM) and adaptive resource management (AM). During the development of natural resource policy, RPs have been variously treated as either ‘targets’ or ‘triggers’. Under a SDM/AM paradigm, target RPs correspond approximately to value-based objectives, which may in turn be either of fundamental interest to stakeholders or intermediaries to other central objectives. By contrast, trigger RPs correspond to decision rules that are presumed to lead to desirable outcomes (such as the programme targets). Casting RPs as triggers or targets within a SDM framework is helpful towards clarifying why (or whether) a particular metric is appropriate. Further, the benefits of a SDM/AM process include elucidation of underlying untested assumptions that may reveal alternative metrics for use as RPs. Likewise, a structured decision-analytic framework may also reveal that failure to achieve management goals is not because the metrics are wrong, but because the decision-making process in which they are embedded is insufficiently robust to uncertainty, is not efficiently directed at producing a resource objective, or is incapable of adaptation to new knowledge.
Resilience Metrics for the Electric Power System: A Performance-Based Approach.
Energy Technology Data Exchange (ETDEWEB)
Vugrin, Eric D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Castillo, Andrea R [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva-Monroy, Cesar Augusto [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2017-02-01
Grid resilience is a concept related to a power system's ability to continue operating and delivering power even in the event that low probability, high-consequence disruptions such as hurricanes, earthquakes, and cyber-attacks occur. Grid resilience objectives focus on managing and, ideally, minimizing potential consequences that occur as a result of these disruptions. Currently, no formal grid resilience definitions, metrics, or analysis methods have been universally accepted. This document describes an effort to develop and describe grid resilience metrics and analysis methods. The metrics and methods described herein extend upon the Resilience Analysis Process (RAP) developed by Watson et al. for the 2015 Quadrennial Energy Review. The extension allows for both outputs from system models and for historical data to serve as the basis for creating grid resilience metrics and informing grid resilience planning and response decision-making. This document describes the grid resilience metrics and analysis methods. Demonstration of the metrics and methods is shown through a set of illustrative use cases.
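A performance-based resilience metric of the kind described here can be sketched as the gap between targeted and delivered performance accumulated over a disruption, e.g. unserved energy during a hurricane. The time series below is hypothetical and the aggregation is a simplified illustration of the RAP-style approach, not the document's exact formulation:

```python
def cumulative_consequence(target, actual, dt=1.0):
    """Performance-based resilience sketch: total consequence is the
    area between targeted and delivered performance (e.g. MW served)
    over the disruption, in target-units times dt (here MWh)."""
    return sum(max(t - a, 0.0) * dt for t, a in zip(target, actual))

target = [100.0] * 8                              # MW the grid should serve
actual = [100, 100, 40, 20, 50, 80, 100, 100]     # hypothetical hourly response
print(cumulative_consequence(target, actual))     # MWh of unserved energy
```

Either model outputs or historical outage data can populate `actual`, matching the document's point that both can serve as the basis for the metrics; faster restoration shrinks the area and hence the consequence.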
Quality Metrics in Neonatal and Pediatric Critical Care Transport: A National Delphi Project.
Schwartz, Hamilton P; Bigham, Michael T; Schoettker, Pamela J; Meyer, Keith; Trautman, Michael S; Insoft, Robert M
2015-10-01
The transport of neonatal and pediatric patients to tertiary care facilities for specialized care demands monitoring the quality of care delivered during transport and its impact on patient outcomes. In 2011, pediatric transport teams in Ohio met to identify quality indicators permitting comparisons among programs. However, no set of national consensus quality metrics exists for benchmarking transport teams. The aim of this project was to achieve national consensus on appropriate neonatal and pediatric transport quality metrics. Modified Delphi technique. The first round of consensus determination was via electronic mail survey, followed by rounds of consensus determination in-person at the American Academy of Pediatrics Section on Transport Medicine's 2012 Quality Metrics Summit. All attendees of the American Academy of Pediatrics Section on Transport Medicine Quality Metrics Summit, conducted on October 21-23, 2012, in New Orleans, LA, were eligible to participate. Candidate quality metrics were identified through literature review and those metrics currently tracked by participating programs. Participants were asked in a series of rounds to identify "very important" quality metrics for transport. It was determined a priori that consensus on a metric's importance was achieved when at least 70% of respondents were in agreement. This is consistent with other Delphi studies. Eighty-two candidate metrics were considered initially. Ultimately, 12 metrics achieved consensus as "very important" to transport. These include metrics related to airway management, team mobilization time, patient and crew injuries, and adverse patient care events. Definitions were assigned to the 12 metrics to facilitate uniform data tracking among programs. The authors succeeded in achieving consensus among a diverse group of national transport experts on 12 core neonatal and pediatric transport quality metrics. We propose that transport teams across the country use these metrics to
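The consensus rule described above (a metric is retained when at least 70% of respondents rate it "very important") reduces to a simple vote-share screen. The metric names and votes below are hypothetical stand-ins, not the study's actual ballot:

```python
def consensus_metrics(ratings, threshold=0.70):
    """Delphi-style screen: keep metrics rated 'very important' (1) by
    at least `threshold` of respondents."""
    return [metric for metric, votes in ratings.items()
            if sum(votes) / len(votes) >= threshold]

# Hypothetical votes from 10 panelists (1 = 'very important').
ratings = {
    "unplanned airway events": [1, 1, 1, 1, 0, 1, 1, 1, 1, 1],  # 90%
    "team mobilization time":  [1, 1, 0, 1, 1, 1, 1, 0, 1, 1],  # 80%
    "debrief offered to crew": [1, 0, 0, 1, 0, 1, 0, 0, 1, 0],  # 40%
}
print(consensus_metrics(ratings))
```

Repeating the vote over successive rounds, with discussion between rounds, is what narrows a candidate pool (82 metrics here) down to a consensus core set (12).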
Enterprise Sustainment Metrics
2015-06-19
are negatively impacting KPIs” (Parmenter, 2010: 31). In the current state, the Air Force’s AA and PBL metrics are once again split. AA does...must have the authority to “take immediate action to rectify situations that are negatively impacting KPIs” (Parmenter, 2010: 31). 3. Measuring...highest profitability and shareholder value for each company” (2014: 273). By systematically diagramming a process, either through a swim lane flowchart
Hardware Acceleration of Adaptive Neural Algorithms.
Energy Technology Data Exchange (ETDEWEB)
James, Conrad D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2017-11-01
As traditional numerical computing has faced challenges, researchers have turned towards alternative computing approaches to reduce power-per-computation metrics and improve algorithm performance. Here, we describe an approach towards non-conventional computing that strengthens the connection between machine learning and neuroscience concepts. The Hardware Acceleration of Adaptive Neural Algorithms (HAANA) project has developed neural machine learning algorithms and hardware for applications in image processing and cybersecurity. While machine learning methods are effective at extracting relevant features from many types of data, the effectiveness of these algorithms degrades when subjected to real-world conditions. Our team has generated novel neural-inspired approaches to improve the resiliency and adaptability of machine learning algorithms. In addition, we have also designed and fabricated hardware architectures and microelectronic devices specifically tuned towards the training and inference operations of neural-inspired algorithms. Finally, our multi-scale simulation framework allows us to assess the impact of microelectronic device properties on algorithm performance.
Advanced spatial metrics analysis in cellular automata land use and cover change modeling
International Nuclear Information System (INIS)
Zamyatin, Alexander; Cabral, Pedro
2011-01-01
This paper proposes an approach for a more effective definition of cellular automata transition rules for landscape change modeling using an advanced spatial metrics analysis. This approach considers a four-stage methodology based on: (i) the search for the appropriate spatial metrics with minimal correlations; (ii) the selection of the appropriate neighborhood size; (iii) the selection of the appropriate technique for spatial metrics application; and (iv) the analysis of the contribution level of each spatial metric for joint use. The case study uses an initial set of 7 spatial metrics of which 4 are selected for modeling. Results show a better model performance when compared to modeling without any spatial metrics or with the initial set of 7 metrics.
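Stage (i) of the methodology above, searching for spatial metrics with minimal correlations, can be sketched as a simple greedy filter over a metric-value matrix. This is an illustrative reconstruction, not the authors' procedure: the threshold value and the metric names (`PLAND`, `ED`, `CONTAG` are common landscape-metric abbreviations) are assumptions.

```python
import numpy as np

def select_low_correlation_metrics(X, names, threshold=0.7):
    """Greedy sketch of stage (i): keep a metric only if its absolute
    Pearson correlation with every already-kept metric stays below
    `threshold`.  X is an (n_samples, n_metrics) array of metric values;
    the 0.7 cutoff is an illustrative choice, not taken from the paper."""
    corr = np.abs(np.corrcoef(X, rowvar=False))
    kept = []
    for j in range(X.shape[1]):
        if all(corr[j, k] < threshold for k in kept):
            kept.append(j)
    return [names[j] for j in kept]
```

With synthetic data in which one metric nearly duplicates another, the duplicate is filtered out while an independent metric survives, mirroring the reduction from 7 candidate metrics to 4.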
Evans, Garrett Nolan
In this work, I present two projects that both contribute to the aim of discovering how intelligence manifests in the brain. The first project is a method for analyzing recorded neural signals, which takes the form of a convolution-based metric on neural membrane potential recordings. Relying only on integral and algebraic operations, the metric compares the timing and number of spikes within recordings as well as the recordings' subthreshold features: summarizing differences in these with a single "distance" between the recordings. Like van Rossum's (2001) metric for spike trains, the metric is based on a convolution operation that it performs on the input data. The kernel used for the convolution is carefully chosen such that it produces a desirable frequency space response and, unlike van Rossum's kernel, causes the metric to be first order both in differences between nearby spike times and in differences between same-time membrane potential values: an important trait. The second project is a combinatorial syntax method for connectionist semantic network encoding. Combinatorial syntax has been a point on which those who support a symbol-processing view of intelligent processing and those who favor a connectionist view have had difficulty seeing eye-to-eye. Symbol-processing theorists have persuasively argued that combinatorial syntax is necessary for certain intelligent mental operations, such as reasoning by analogy. Connectionists have focused on the versatility and adaptability offered by self-organizing networks of simple processing units. With this project, I show that there is a way to reconcile the two perspectives and to ascribe a combinatorial syntax to a connectionist network. The critical principle is to interpret nodes, or units, in the connectionist network as bound integrations of the interpretations for nodes that they share links with. Nodes need not correspond exactly to neurons and may correspond instead to distributed sets, or assemblies, of
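The convolution-based recording metric above builds on van Rossum's (2001) spike-train distance, which can be sketched as follows. Note this implements the classic van Rossum construction with a causal exponential kernel; Evans' metric uses a different, carefully chosen kernel and also compares subthreshold membrane-potential features, so treat this only as the baseline idea.

```python
import numpy as np

def van_rossum_distance(spikes_a, spikes_b, tau=0.01, dt=1e-4, t_max=1.0):
    """van Rossum (2001) spike-train distance: convolve each spike train
    with a causal exponential kernel exp(-t/tau), then take the L2
    distance between the filtered waveforms, normalized by tau.
    A single unmatched spike contributes a distance of 1/sqrt(2)."""
    t = np.arange(0.0, t_max, dt)

    def filtered(spikes):
        f = np.zeros_like(t)
        for s in spikes:
            mask = t >= s
            f[mask] += np.exp(-(t[mask] - s) / tau)
        return f

    diff = filtered(spikes_a) - filtered(spikes_b)
    return float(np.sqrt(np.sum(diff ** 2) * dt / tau))
```

Identical trains give distance zero, and the distance grows smoothly (first order) with small shifts in spike timing, which is the property the text emphasizes.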
Discriminatory Data Mapping by Matrix-Based Supervised Learning Metrics
Strickert, M.; Schneider, P.; Keilwagen, J.; Villmann, T.; Biehl, M.; Hammer, B.
2008-01-01
Supervised attribute relevance detection using cross-comparisons (SARDUX), a recently proposed method for data-driven metric learning, is extended from dimension-weighted Minkowski distances to metrics induced by a data transformation matrix Ω for modeling mutual attribute dependence. Given class
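The metric induced by a transformation matrix Ω, as described above, is a Mahalanobis-type distance d(x, y) = ||Ω(x − y)||₂ with Λ = ΩᵀΩ capturing attribute dependence. A minimal sketch (the function name is illustrative, not from SARDUX):

```python
import numpy as np

def omega_distance(x, y, omega):
    """Distance induced by a transformation matrix Omega:
    d(x, y) = ||Omega @ (x - y)||_2.  A diagonal Omega recovers the
    dimension-weighted Euclidean (Minkowski p=2) special case the
    method generalizes."""
    d = omega @ (np.asarray(x, float) - np.asarray(y, float))
    return float(np.sqrt(d @ d))
```

With Ω = I this is plain Euclidean distance; learning a full Ω lets the metric rotate and rescale the feature space to reflect mutual attribute dependence.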
Directory of Open Access Journals (Sweden)
Farshad Firuzi
2017-06-01
Full Text Available We consider the unit tangent sphere bundle of a Riemannian manifold $(M,g)$ as a $(2n+1)$-dimensional manifold and equip it with a pseudo-Riemannian $g$-natural almost contact B-metric structure. Then, by computing the coefficients of the structure tensor $F$, we completely characterize the unit tangent sphere bundle equipped with this structure with respect to the relevant classification of almost contact B-metric structures, and determine a class such that the unit tangent sphere bundle with the mentioned structure belongs to it. Also, we find some curvature conditions such that the mentioned structure satisfies each of the eleven basic classes.
Sensory Metrics of Neuromechanical Trust.
Softky, William; Benford, Criscillia
2017-09-01
Today digital sources supply a historically unprecedented component of human sensorimotor data, the consumption of which is correlated with poorly understood maladies such as Internet addiction disorder and Internet gaming disorder. Because both natural and digital sensorimotor data share common mathematical descriptions, one can quantify our informational sensorimotor needs using the signal processing metrics of entropy, noise, dimensionality, continuity, latency, and bandwidth. Such metrics describe in neutral terms the informational diet human brains require to self-calibrate, allowing individuals to maintain trusting relationships. With these metrics, we define the trust humans experience using the mathematical language of computational models, that is, as a primitive statistical algorithm processing finely grained sensorimotor data from neuromechanical interaction. This definition of neuromechanical trust implies that artificial sensorimotor inputs and interactions that attract low-level attention through frequent discontinuities and enhanced coherence will decalibrate a brain's representation of its world over the long term by violating the implicit statistical contract for which self-calibration evolved. Our hypersimplified mathematical understanding of human sensorimotor processing as multiscale, continuous-time vibratory interaction allows equally broad-brush descriptions of failure modes and solutions. For example, we model addiction in general as the result of homeostatic regulation gone awry in novel environments (sign reversal) and digital dependency as a sub-case in which the decalibration caused by digital sensorimotor data spurs yet more consumption of them. We predict that institutions can use these sensorimotor metrics to quantify media richness to improve employee well-being; that dyads and family-size groups will bond and heal best through low-latency, high-resolution multisensory interaction such as shared meals and reciprocated touch; and
Zimmerman, Marianna
1975-01-01
Describes a classroom activity which involved sixth grade students in a learning situation including making ice cream, safety procedures in a science laboratory, calibrating a thermometer, using metric units of volume and mass. (EB)
Ensuring Success of Adaptive Control Research Through Project Lifecycle Risk Mitigation
Pavlock, Kate M.
2011-01-01
Lessons Learned: 1. Design-out unnecessary risk to prevent excessive mitigation management during flight. 2. Consider iterative checkouts to confirm or improve human factor characteristics. 3. Consider the total flight test profile to uncover unanticipated human-algorithm interactions. 4. Consider test card cadence as a metric to assess test readiness. 5. Full-scale flight test is critical to development, maturation, and acceptance of adaptive control laws for operational use.
Untangling Risk in Water Supply Systems: What Factors Drive Long-term Adaptation?
Zeff, H. B.; Lin, L.; Band, L. E.; Reed, P. M.; Characklis, G. W.
2016-12-01
Deeply uncertain factors like climate change, the hydrologic impacts of urbanization, forest evolution, and long-term demand forecasts make water supply planning a 'wicked' problem. The traditional technique of assessing risk based on historical observations can be inadequate in the face of environmental non-stationarity. However, competing models and limited observational data make it difficult for decision makers and experts to agree on how much uncertainty should be built into analyses of risk, particularly at the timescales relevant to long-term investments in water infrastructure. Further, the physical connectivity of these deeply uncertain processes creates inter-related systems, amplifying the challenges of a 'worst case scenario'. The development of adaptive systems and planning processes provides solutions that have been shown to meet technical, environmental, and social objectives at lower costs. Instead of developing plans with fixed targets for the timing of actions, adaptive plans develop risk metrics and thresholds that are able to integrate new information to determine when conditions reach a 'tipping point' which necessitates action. It is an open question as to how new information can be best integrated into the decision-making process (i.e., how much weight do we give new observations relative to the historical record), but a better understanding of the way the relevant systems are expected to evolve and change over time could inform these decisions. In this study, we use linked, dynamic models of temperature and precipitation changes, forest evolution, urbanization, hydrology, and water demand to develop scenarios for an adaptive water management framework that uses risk-based metrics to make short- and long-term decisions. The impact of individual environmental processes on the adaptive capability of this management framework is evaluated through problem formulations that successively increase the complexity of the uncertainty scenarios. Although
Don't Trust a Management Metric, Especially in Life Support
Jones, Harry W.
2014-01-01
Goodhart's law states that metrics do not work. Metrics become distorted when used and they deflect effort away from more important goals. These well-known and unavoidable problems occurred when the closure and system mass metrics were used to manage life support research. The intent of life support research should be to develop flyable, operable, reliable systems, not merely to increase life support system closure or to reduce its total mass. It would be better to design life support systems to meet the anticipated mission requirements and user needs. Substituting the metrics of closure and total mass for these goals seems to have led life support research to solve the wrong problems.
Resilient Control Systems Practical Metrics Basis for Defining Mission Impact
Energy Technology Data Exchange (ETDEWEB)
Craig G. Rieger
2014-08-01
“Resilience” describes how systems operate at an acceptable level of normalcy despite disturbances or threats. In this paper we first consider the cognitive, cyber-physical interdependencies inherent in critical infrastructure systems and how resilience differs from reliability to mitigate these risks. Terminology and a metrics basis are provided to integrate the cognitive, cyber-physical aspects that should be considered when defining solutions for resilience. A practical approach is taken to roll this metrics basis up to system integrity and business case metrics that establish “proper operation” and “impact.” A notional chemical processing plant is the use case for demonstrating how the system integrity metrics can be applied to establish performance, and
Analysis of Subjects' Vulnerability in a Touch Screen Game Using Behavioral Metrics.
Parsinejad, Payam; Sipahi, Rifat
2017-12-01
In this article, we report results on an experimental study conducted with volunteer subjects playing a touch-screen game with two unique difficulty levels. Subjects have knowledge about the rules of both game levels, but only sufficient playing experience with the easy level of the game, making them vulnerable with the difficult level. Several behavioral metrics associated with subjects' playing the game are studied in order to assess subjects' mental-workload changes induced by their vulnerability. Specifically, these metrics are calculated based on subjects' finger kinematics and decision making times, which are then compared with baseline metrics, namely, performance metrics pertaining to how well the game is played and a physiological metric called pnn50 extracted from heart rate measurements. In balanced experiments and supported by comparisons with baseline metrics, it is found that some of the studied behavioral metrics have the potential to be used to infer subjects' mental workload changes through different levels of the game. These metrics, which are decoupled from task specifics, relate to subjects' ability to develop strategies to play the game, and hence have the advantage of offering insight into subjects' task-load and vulnerability assessment across various experimental settings.
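The physiological baseline metric above, pnn50 (commonly written pNN50), is a standard heart-rate-variability statistic: the fraction of successive normal-to-normal interval differences exceeding 50 ms. A minimal sketch of its computation from RR intervals (the interval values are illustrative):

```python
import numpy as np

def pnn50(rr_intervals_ms):
    """pNN50: proportion of successive RR-interval differences whose
    absolute value exceeds 50 ms, computed from a sequence of
    beat-to-beat intervals in milliseconds."""
    rr = np.asarray(rr_intervals_ms, dtype=float)
    diffs = np.abs(np.diff(rr))
    return float(np.mean(diffs > 50.0))
```

Lower pNN50 is commonly read as reduced parasympathetic activity, i.e. higher stress or mental workload, which is why it serves as a baseline against the behavioral metrics.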
Metric-Aware Secure Service Orchestration
Directory of Open Access Journals (Sweden)
Gabriele Costa
2012-12-01
Secure orchestration is an important concern in the Internet of Services. Besides providing the required functionality, composite services must also provide a reasonable level of security in order to protect sensitive data. Thus, the orchestrator needs to check whether the complex service is able to satisfy certain properties. Some properties are expressed with metrics for a precise definition of requirements. Thus, the problem is to analyse the values of metrics for a complex business process. In this paper we extend our previous work on the analysis of secure orchestration with quantifiable properties. We show how to define, verify and enforce quantitative security requirements in one framework together with other security properties. The proposed approach should help to select the most suitable service architecture and guarantee fulfilment of the declared security requirements.
A composite efficiency metrics for evaluation of resource and energy utilization
International Nuclear Information System (INIS)
Yang, Siyu; Yang, Qingchun; Qian, Yu
2013-01-01
Polygeneration systems are commonly found in the chemical and energy industries. These systems often involve chemical conversions and energy conversions. Studies of these systems are interdisciplinary, mainly involving the fields of chemical engineering, energy engineering, environmental science, and economics. Each of these fields has developed an isolated index system different from the others. Analyses of polygeneration systems are therefore very likely to give biased results when only the indexes from one field are used. Motivated by this problem, this paper develops a new composite efficiency metric for polygeneration systems. The new metric is based on the second law of thermodynamics and exergy theory. We introduce the exergy cost of waste treatment as an energy penalty into the conventional exergy efficiency. Using this new metric avoids spending too much energy to increase production, or sacrificing production capacity to save energy consumption. The composite metric is studied on a simplified co-production process, syngas to methanol and electricity. The advantage of the new efficiency metric is manifested by comparison with carbon element efficiency, energy efficiency, and exergy efficiency. Results show that the new metric gives a more rational analysis than the other indexes. - Highlights: • The composite efficiency metric gives a balanced evaluation of resource utilization and energy utilization. • This efficiency uses the exergy for waste treatment as an energy penalty. • This efficiency is applied to a simplified co-production process. • Results show that the composite metric performs better than energy efficiencies and resource efficiencies
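The core idea above, adding the exergy cost of waste treatment as a penalty in the denominator of a conventional exergy efficiency, can be written as a one-line formula. This is a schematic reading of the abstract, not the paper's exact formulation; variable names and numbers are illustrative.

```python
def composite_exergy_efficiency(exergy_products, exergy_input, exergy_waste_treatment):
    """Composite efficiency sketch: useful exergy in the products divided
    by the input exergy plus the exergy required to treat the wastes.
    Plain exergy efficiency is the special case exergy_waste_treatment = 0."""
    return exergy_products / (exergy_input + exergy_waste_treatment)

# illustrative numbers (MW of exergy): the waste-treatment penalty
# lowers the efficiency relative to the conventional figure
conventional = composite_exergy_efficiency(50.0, 100.0, 0.0)   # 0.50
composite = composite_exergy_efficiency(50.0, 100.0, 25.0)     # 0.40
```

The penalty term is what prevents the pathological cases the abstract mentions: boosting production at any energy cost, or saving energy by discarding production capacity.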
Environmental cost of using poor decision metrics to prioritize environmental projects.
Pannell, David J; Gibson, Fiona L
2016-04-01
Conservation decision makers commonly use project-scoring metrics that are inconsistent with theory on optimal ranking of projects. As a result, there may often be a loss of environmental benefits. We estimated the magnitudes of these losses for various metrics that deviate from theory in ways that are common in practice. These metrics included cases where relevant variables were omitted from the benefits metric, project costs were omitted, and benefits were calculated using a faulty functional form. We estimated distributions of parameters from 129 environmental projects from Australia, New Zealand, and Italy for which detailed analyses had been completed previously. The cost of using poor prioritization metrics (in terms of lost environmental values) was often high--up to 80% in the scenarios we examined. The cost in percentage terms was greater when the budget was smaller. The most costly errors were omitting information about environmental values (up to 31% loss of environmental values), omitting project costs (up to 35% loss), omitting the effectiveness of management actions (up to 9% loss), and using a weighted-additive decision metric for variables that should be multiplied (up to 23% loss). The latter 3 are errors that occur commonly in real-world decision metrics, in combination often reducing potential benefits from conservation investments by 30-50%. Uncertainty about parameter values also reduced the benefits from investments in conservation projects but often not by as much as faulty prioritization metrics. © 2016 Society for Conservation Biology.
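The cost of a faulty prioritization metric can be demonstrated with a tiny greedy-funding example: ranking by benefit-cost ratio (consistent with theory) versus ranking on benefits alone with costs omitted, one of the common errors measured above. The three projects and their numbers are invented for illustration.

```python
import numpy as np

def portfolio_value(scores, values, costs, budget):
    """Fund projects greedily in descending score order until the budget
    is exhausted; return the total environmental value delivered."""
    total, spent = 0.0, 0.0
    for i in np.argsort(scores)[::-1]:
        if spent + costs[i] <= budget:
            total += values[i]
            spent += costs[i]
    return total

# three hypothetical projects
values = np.array([10.0, 6.0, 5.0])  # environmental value delivered
costs = np.array([10.0, 2.0, 2.0])

# theory-consistent metric: value per unit cost
good = portfolio_value(values / costs, values, costs, budget=10.0)
# faulty metric: costs omitted, rank on value alone
bad = portfolio_value(values, values, costs, budget=10.0)
loss_pct = 100.0 * (good - bad) / good  # environmental value lost
```

Here the faulty metric spends the whole budget on the single biggest project and forgoes two cheap, high-ratio projects; with realistic parameter distributions the paper finds such losses can reach tens of percent.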
Empirical analysis of change metrics for software fault prediction
Choudhary, Garvit Rajesh; Kumar, Sandeep; Kumar, Kuldeep; Mishra, Alok; Catal, Cagatay
2018-01-01
A quality assurance activity, known as software fault prediction, can reduce development costs and improve software quality. The objective of this study is to investigate change metrics in conjunction with code metrics to improve the performance of fault prediction models. Experimental studies are
76 FR 53885 - Patent and Trademark Resource Centers Metrics
2011-08-30
... DEPARTMENT OF COMMERCE United States Patent and Trademark Office Patent and Trademark Resource Centers Metrics ACTION: Proposed collection; comment request. SUMMARY: The United States Patent and... ``Patent and Trademark Resource Centers Metrics comment'' in the subject line of the message. Mail: Susan K...
Resilience-based performance metrics for water resources management under uncertainty
Roach, Tom; Kapelan, Zoran; Ledbetter, Ralph
2018-06-01
This paper aims to develop new, resilience type metrics for long-term water resources management under uncertain climate change and population growth. Resilience is defined here as the ability of a water resources management system to 'bounce back', i.e. absorb and then recover from a water deficit event, restoring the normal system operation. Ten alternative metrics are proposed and analysed addressing a range of different resilience aspects including duration, magnitude, frequency and volume of related water deficit events. The metrics were analysed on a real-world case study of the Bristol Water supply system in the UK and compared with current practice. The analyses included an examination of metrics' sensitivity and correlation, as well as a detailed examination into the behaviour of metrics during water deficit periods. The results obtained suggest that multiple metrics which cover different aspects of resilience should be used simultaneously when assessing the resilience of a water resources management system, leading to a more complete understanding of resilience compared with current practice approaches. It was also observed that calculating the total duration of a water deficit period provided a clearer and more consistent indication of system performance compared to splitting the deficit periods into the time to reach and time to recover from the worst deficit events.
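Several of the resilience aspects listed above (volume, frequency, and duration of water deficit events) can be computed directly from supply and demand time series. A minimal sketch, not the paper's ten metrics; the event definition (any time step with demand above supply) is an assumption.

```python
import numpy as np

def deficit_metrics(supply, demand):
    """Simple resilience-style metrics from supply/demand series:
    total deficit volume, number of distinct deficit events, and the
    longest event duration in time steps."""
    deficit = np.maximum(np.asarray(demand, float) - np.asarray(supply, float), 0.0)
    events, longest, run = 0, 0, 0
    for in_deficit in deficit > 0:
        if in_deficit:
            run += 1
            if run == 1:
                events += 1          # a new deficit event begins
            longest = max(longest, run)
        else:
            run = 0
    return {"volume": float(deficit.sum()),
            "events": events,
            "max_duration": longest}
```

Reporting several of these together, rather than a single figure, reflects the paper's conclusion that multiple metrics covering different resilience aspects should be used simultaneously.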
The canonical partial metric and the uniform convexity on normed spaces
Directory of Open Access Journals (Sweden)
S. Oltra
2005-10-01
In this paper we introduce the notion of canonical partial metric associated to a norm to study geometric properties of normed spaces. In particular, we characterize strict convexity and uniform convexity of normed spaces in terms of the canonical partial metric defined by its norm. We prove that these geometric properties can be considered, in this sense, as topological properties that appear when we compare the natural metric topology of the space with the non translation invariant topology induced by the canonical partial metric in the normed space.
Metrics in Keplerian orbits quotient spaces
Milanov, Danila V.
2018-03-01
Quotient spaces of Keplerian orbits are important instruments for the modelling of orbit samples of celestial bodies on a large time span. We suppose that variations of the orbital eccentricities, inclinations and semi-major axes remain sufficiently small, while arbitrary perturbations are allowed for the arguments of pericentres or longitudes of the nodes, or both. The distance between orbits or their images in quotient spaces serves as a numerical criterion for such problems of Celestial Mechanics as the search for a common origin of meteoroid streams, comets, and asteroids, the identification of asteroid families, and others. In this paper, we consider quotient sets of the space H of non-rectilinear Keplerian orbits. Their elements are identified irrespective of the values of the pericentre arguments or node longitudes. We prove that distance functions on the quotient sets, introduced in Kholshevnikov et al. (Mon Not R Astron Soc 462:2275-2283, 2016), satisfy the metric space axioms, and we discuss the theoretical and practical importance of this result. Isometric embeddings of the quotient spaces into R^n, and a space of compact subsets of H with the Hausdorff metric, are constructed. The Euclidean representations of the orbit spaces find their applications in a problem of orbit averaging and in computational algorithms specific to Euclidean space. We also explore completions of H and its quotient spaces with respect to the corresponding metrics and establish a relation between elements of the extended spaces and rectilinear trajectories. The distance between an orbit and subsets of elliptic and hyperbolic orbits is calculated. This quantity provides an upper bound for the metric value in a problem of close orbit identification. Finally, the invariance of the equivalence relations in H under coordinate changes is discussed.
Juneja, Prabhjot; Evans, Philp M; Harris, Emma J
2013-08-01
Validation is required to ensure automated segmentation algorithms are suitable for radiotherapy target definition. In the absence of a true segmentation, algorithmic segmentation is validated against expert outlining of the region of interest. Multiple experts are used to overcome inter-expert variability. Several approaches have been studied in the literature, but the most appropriate approach to combine the information from multiple expert outlines into a single metric for validation is unclear. None consider a metric that can be tailored to case-specific requirements in radiotherapy planning. The validation index (VI), a new validation metric which uses the experts' level of agreement, was developed. A control parameter was introduced for the validation of segmentations required for different radiotherapy scenarios: for targets close to organs-at-risk and for difficult-to-discern targets, where large variation between experts is expected. VI was evaluated using two simulated idealized cases and data from two clinical studies. VI was compared with the commonly used pair-wise Dice similarity coefficient (DSC) and found to be more sensitive than the pair-wise DSC to changes in agreement between experts. VI was shown to be adaptable to specific radiotherapy planning scenarios.
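The pair-wise Dice similarity coefficient baseline used above is straightforward to compute from binary segmentation masks: DSC = 2|A ∩ B| / (|A| + |B|), averaged over all expert pairs. A minimal sketch (the averaging scheme is the obvious one, assumed rather than taken from the paper):

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks:
    2 * |A intersect B| / (|A| + |B|); 1 = perfect overlap, 0 = disjoint."""
    a, b = np.asarray(mask_a, bool), np.asarray(mask_b, bool)
    denom = a.sum() + b.sum()
    return 1.0 if denom == 0 else float(2.0 * np.logical_and(a, b).sum() / denom)

def mean_pairwise_dice(masks):
    """Average DSC over all unordered pairs of expert masks."""
    n = len(masks)
    return float(np.mean([dice(masks[i], masks[j])
                          for i in range(n) for j in range(i + 1, n)]))
```

A single averaged DSC discards how agreement is distributed among experts, which is the information the VI metric is designed to exploit.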
The correlation of metrics in complex networks with applications in functional brain networks
Li, C.; Wang, H.; De Haan, W.; Stam, C.J.; Van Mieghem, P.F.A.
2011-01-01
An increasing number of network metrics have been applied in network analysis. If metric relations were known better, we could more effectively characterize networks by a small set of metrics to discover the association between network properties/metrics and network functioning. In this paper, we
The positive action conjecture and asymptotically euclidean metrics in quantum gravity
International Nuclear Information System (INIS)
Gibbons, G.W.; Pope, C.N.
1979-01-01
The positive action conjecture requires that the action of any asymptotically Euclidean 4-dimensional Riemannian metric be positive, vanishing if and only if the space is flat. Because any Ricci-flat, asymptotically Euclidean metric has zero action and is a local extremum of the action, which is a local minimum at flat space, the conjecture requires that there are no Ricci-flat asymptotically Euclidean metrics other than flat space, which would establish that flat space is the only local minimum. We prove this for metrics on R^4 and a large class of more complicated topologies and for self-dual metrics. We show that if R^μ_μ ≥ 0 there are no bound states of the Dirac equation and discuss the relevance to possible baryon non-conserving processes mediated by gravitational instantons. We conclude that these are forbidden in the lowest stationary phase approximation. We give a detailed discussion of instantons invariant under an SU(2) or SO(3) isometry group. We find all regular solutions, none of which is asymptotically Euclidean and all of which possess a further Killing vector. In an appendix we construct an approximate self-dual metric on K3 - the only simply connected compact manifold which admits a self-dual metric. (orig.)
Nonlinear metric perturbation enhancement of primordial gravitational waves.
Bastero-Gil, M; Macias-Pérez, J; Santos, D
2010-08-20
We present the evolution of the full set of Einstein equations during preheating after inflation. We study a generic supersymmetric model of hybrid inflation, integrating fields and metric fluctuations in a 3-dimensional lattice. We take initial conditions consistent with Einstein's constraint equations. The induced preheating of the metric fluctuations is not large enough to backreact onto the fields, but preheating of the scalar modes does affect the evolution of vector and tensor modes. In particular, they do enhance the induced stochastic background of gravitational waves during preheating, giving an energy density in general an order of magnitude larger than that obtained by evolving the tensor fluctuations in a homogeneous background metric. This enhancement can improve the expectations for detection by planned gravitational wave observatories.
Empirical Information Metrics for Prediction Power and Experiment Planning
Directory of Open Access Journals (Sweden)
Christopher Lee
2011-01-01
In principle, information theory could provide useful metrics for statistical inference. In practice this is impeded by divergent assumptions: information theory assumes the joint distribution of the variables of interest is known, whereas in statistical inference it is hidden and is the goal of inference. To integrate these approaches we note a common theme they share, namely the measurement of prediction power. We generalize this concept as an information metric, subject to several requirements: calculation of the metric must be objective or model-free, unbiased, convergent, probabilistically bounded, and low in computational complexity. Unfortunately, widely used model selection metrics such as Maximum Likelihood, the Akaike Information Criterion and the Bayesian Information Criterion do not necessarily meet all these requirements. We define four distinct empirical information metrics measured via sampling, with explicit Law of Large Numbers convergence guarantees, which meet these requirements: Ie, the empirical information, a measure of average prediction power; Ib, the overfitting bias information, which measures selection bias in the modeling procedure; Ip, the potential information, which measures the total remaining information in the observations not yet discovered by the model; and Im, the model information, which measures the model's extrapolation prediction power. Finally, we show that Ip + Ie, Ip + Im, and Ie − Im are fixed constants for a given observed dataset (i.e., the prediction target), independent of the model, and thus represent a fundamental subdivision of the total information contained in the observations. We discuss the application of these metrics to modeling and experiment planning.
Quality Evaluation in Wireless Imaging Using Feature-Based Objective Metrics
Engelke, Ulrich; Zepernick, Hans-Jürgen
2007-01-01
This paper addresses the evaluation of image quality in the context of wireless systems using feature-based objective metrics. The considered metrics comprise a weighted combination of feature values that are used to quantify the extent to which the related artifacts are present in a processed image. In view of imaging applications in mobile radio and wireless communication systems, reduced-reference objective quality metrics are investigated for quantifying user-perceived quality. The exa...
Tice, Bradley S.
Metrical phonology, a linguistic process of phonological stress assessment and diagrammatic simplification of sentence and word stress, is discussed as it is found in the English language with the intention that it may be used in second language instruction. Stress is defined by its physical and acoustical correlates, and the principles of…
Heuristic extension of the Schwarzschild metric
International Nuclear Information System (INIS)
Espinosa, J.M.
1982-01-01
The Schwarzschild solution of Einstein's equations of gravitation has several singularities. It is known that the singularity at r = 2Gm/c^2 is only apparent, a result of the coordinates in which the solution was found. Paradoxical results occurring near the singularity show that the system of coordinates is incomplete. We introduce a simple, two-dimensional metric with an apparent singularity that makes it incomplete. By a straightforward, heuristic procedure we extend and complete this simple metric. We then use the same procedure to give a heuristic derivation of the Kruskal system of coordinates, which is known to extend the Schwarzschild manifold past its apparent singularity and produce a complete manifold
Jacobi-Maupertuis metric and Kepler equation
Chanda, Sumanto; Gibbons, Gary William; Guha, Partha
This paper studies the application of the Jacobi-Eisenhart lift, Jacobi metric and Maupertuis transformation to the Kepler system. We start by reviewing fundamentals and the Jacobi metric. Then we study various ways to apply the lift to Kepler-related systems: first as conformal description and Bohlin transformation of Hooke’s oscillator, second in contact geometry and third in Houri’s transformation [T. Houri, Liouville integrability of Hamiltonian systems and spacetime symmetry (2016), www.geocities.jp/football_physician/publication.html], coupled with Milnor’s construction [J. Milnor, On the geometry of the Kepler problem, Am. Math. Mon. 90 (1983) 353-365] with eccentric anomaly.
Path integral measure for first-order and metric gravities
International Nuclear Information System (INIS)
Aros, Rodrigo; Contreras, Mauricio; Zanelli, Jorge
2003-01-01
The equivalence between the path integrals for first-order gravity and the standard torsion-free, metric gravity in 3 + 1 dimensions is analysed. Starting with the path integral for first-order gravity, the correct measure for the path integral of the metric theory is obtained.
Evaluating hydrological model performance using information theory-based metrics
The accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m...
Local adjacency metric dimension of sun graph and stacked book graph
Yulisda Badri, Alifiah; Darmaji
2018-03-01
A graph is a mathematical system consisting of a non-empty set of vertices and a (possibly empty) set of edges. One of the topics studied in graph theory is the metric dimension, which has applications in robot navigation on a path. A robot moves from one vertex to another in the field while minimizing the errors that occur in translating the instructions (codes) obtained from the vertices at its location. To move, the robot must be given distinct instructions (codes), and to move efficiently it must translate the codes of the vertices it passes quickly, so each location vertex should be at minimum distance. However, if the robot must move over a very large field, it may fail to detect a location vertex whose distance is too great [6]. In this case, the robot can determine its position by utilizing location vertices based on adjacency. The problem is to find the minimum cardinality of the required location vertices, and where to place them, so that the robot can determine its location. The solution to this problem is the adjacency metric dimension and adjacency metric bases. Rodríguez-Velázquez and Fernau combined the adjacency metric dimension with the local metric dimension, obtaining the local adjacency metric dimension, in which each vertex in the graph may share its adjacency representation only with adjacent vertices. To obtain the local adjacency metric dimension of the sun graph and the stacked book graph, we use a construction method that considers the representation of each adjacent vertex of the graph.
Analysis of Network Clustering Algorithms and Cluster Quality Metrics at Scale.
Emmons, Scott; Kobourov, Stephen; Gallant, Mike; Börner, Katy
2016-01-01
Notions of community quality underlie the clustering of networks. While studies surrounding network clustering are increasingly common, a precise understanding of the relationship between different cluster quality metrics is lacking. In this paper, we examine the relationship between stand-alone cluster quality metrics and information recovery metrics through a rigorous analysis of four widely used network clustering algorithms: Louvain, Infomap, label propagation, and smart local moving. We consider the stand-alone quality metrics of modularity, conductance, and coverage, and we consider the information recovery metrics of adjusted Rand score, normalized mutual information, and a variant of normalized mutual information used in previous work. Our study includes both synthetic graphs and empirical data sets of sizes varying from 1,000 to 1,000,000 nodes. We find significant differences among the results of the different cluster quality metrics. For example, clustering algorithms can return a value of 0.4 out of 1 on modularity but score 0 out of 1 on information recovery. We find conductance, though imperfect, to be the stand-alone quality metric that best indicates performance on the information recovery metrics. Additionally, our study shows that the variant of normalized mutual information used in previous work cannot be assumed to differ only slightly from traditional normalized mutual information. Smart local moving is the overall best performing algorithm in our study, but discrepancies between cluster evaluation metrics prevent us from declaring it an absolutely superior algorithm. Interestingly, Louvain performed better than Infomap in nearly all the tests in our study, contradicting the results of previous work in which Infomap was superior to Louvain. We find that although label propagation performs poorly when clusters are less clearly defined, it scales efficiently and accurately to large graphs with well-defined clusters.
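Several of the stand-alone metrics named above have short closed forms. As a toy illustration (not the paper's implementation), Newman modularity for an undirected graph can be sketched in pure Python; the graph and partition below are invented examples:

```python
def modularity(edges, communities):
    """Newman modularity Q for an undirected graph given as an edge list
    and a partition given as a list of node sets."""
    m = len(edges)
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    comm = {node: i for i, nodes in enumerate(communities) for node in nodes}
    # fraction of edges that fall inside communities
    inside = sum(1 for u, v in edges if comm[u] == comm[v]) / m
    # expected such fraction under the configuration (degree-preserving) model
    expected = sum(
        (sum(deg[n] for n in nodes) / (2 * m)) ** 2 for nodes in communities
    )
    return inside - expected

# Two triangles joined by a single bridge edge, split into the two triangles:
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
parts = [{0, 1, 2}, {3, 4, 5}]
print(round(modularity(edges, parts), 3))  # 0.357
```

A modularity of roughly 0.36 reflects that 6 of the 7 edges lie inside communities, minus the 0.5 expected by chance for this degree sequence.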
Sensorless adaptive optics for isoSTED nanoscopy
Antonello, Jacopo; Hao, Xiang; Allgeyer, Edward S.; Bewersdorf, Joerg; Rittscher, Jens; Booth, Martin J.
2018-02-01
The presence of aberrations is a major concern when using fluorescence microscopy to image deep inside tissue. Aberrations due to refractive index mismatch and heterogeneity of the specimen under investigation cause severe reduction in the amount of fluorescence emission that is collected by the microscope. Furthermore, aberrations adversely affect the resolution, leading to loss of fine detail in the acquired images. These phenomena are particularly troublesome for super-resolution microscopy techniques such as isotropic stimulated-emission-depletion microscopy (isoSTED), which relies on accurate control of the shape and co-alignment of multiple excitation and depletion foci to operate as expected and to achieve the super-resolution effect. Aberrations can be suppressed by implementing sensorless adaptive optics techniques, whereby aberration correction is achieved by maximising a certain image quality metric. In confocal microscopy for example, one can employ the total image brightness as an image quality metric. Aberration correction is subsequently achieved by iteratively changing the settings of a wavefront corrector device until the metric is maximised. This simplistic approach has limited applicability to isoSTED microscopy where, due to the complex interplay between the excitation and depletion foci, maximising the total image brightness can lead to introducing aberrations in the depletion foci. In this work we first consider the effects that different aberration modes have on isoSTED microscopes. We then propose an iterative, wavelet-based aberration correction algorithm and evaluate its benefits.
Converging from Branching to Linear Metrics on Markov Chains
DEFF Research Database (Denmark)
Bacci, Giorgio; Bacci, Giovanni; Larsen, Kim Guldstrand
2015-01-01
time in the size of the MC. The upper-approximants are Kantorovich-like pseudometrics, i.e. branching-time distances, that converge point-wise to the linear-time metrics. This convergence is interesting in itself, since it reveals a nontrivial relation between branching and linear-time metric...
Probabilistic G-Metric space and some fixed point results
Directory of Open Access Journals (Sweden)
A. R. Janfada
2013-01-01
Full Text Available In this note we introduce the notions of generalized probabilistic metric spaces and generalized Menger probabilistic metric spaces. After making our elementary observations and proving some basic properties of these spaces, we prove some fixed point results in these spaces.
Performance evaluation of routing metrics for wireless mesh networks
CSIR Research Space (South Africa)
Nxumalo, SL
2009-08-01
Full Text Available for WMN. The routing metrics have not been compared with QoS parameters. This paper presents work in progress on a project in which researchers compare the performance of different routing metrics in WMN using a wireless test bed. Researchers...
Validation of network communicability metrics for the analysis of brain structural networks.
Directory of Open Access Journals (Sweden)
Jennifer Andreotti
Full Text Available Computational network analysis provides new methods to analyze the brain's structural organization based on diffusion imaging tractography data. Networks are characterized by global and local metrics that have recently given promising insights into diagnosis and the further understanding of psychiatric and neurologic disorders. Most of these metrics are based on the idea that information in a network flows along the shortest paths. In contrast to this notion, communicability is a broader measure of connectivity which assumes that information could flow along all possible paths between two nodes. In our work, the features of network metrics related to communicability were explored for the first time in the healthy structural brain network. In addition, the sensitivity of such metrics was analysed using simulated lesions to specific nodes and network connections. Results showed advantages of communicability over conventional metrics in detecting densely connected nodes as well as subsets of nodes vulnerable to lesions. In addition, communicability centrality was shown to be widely affected by the lesions, and the changes were negatively correlated with the distance from the lesion site. In summary, our analysis suggests that communicability metrics may provide insight into the integrative properties of the structural brain network and that these metrics may be useful for the analysis of brain networks in the presence of lesions. Nevertheless, the interpretation of communicability is not straightforward; hence these metrics should be used as a supplement to the more standard connectivity network metrics.
Reliability of TMS metrics in patients with chronic incomplete spinal cord injury.
Potter-Baker, K A; Janini, D P; Frost, F S; Chabra, P; Varnerin, N; Cunningham, D A; Sankarasubramanian, V; Plow, E B
2016-11-01
Test-retest reliability analysis in individuals with chronic incomplete spinal cord injury (iSCI). The purpose of this study was to examine the reliability of neurophysiological metrics acquired with transcranial magnetic stimulation (TMS) in individuals with chronic incomplete tetraplegia. Cleveland Clinic Foundation, Cleveland, Ohio, USA. TMS metrics of corticospinal excitability, output, inhibition and motor map distribution were collected in muscles with a higher MRC grade and muscles with a lower MRC grade on the more affected side of the body. Metrics denoting upper limb function were also collected. All metrics were collected at two sessions separated by a minimum of two weeks. Reliability between sessions was determined using Spearman's correlation coefficients and concordance correlation coefficients (CCCs). We found that TMS metrics that were acquired in higher MRC grade muscles were approximately two times more reliable than those collected in lower MRC grade muscles. TMS metrics of motor map output, however, demonstrated poor reliability regardless of muscle choice (P=0.34; CCC=0.51). Correlation analysis indicated that patients with more baseline impairment and/or those in a more chronic phase of iSCI demonstrated greater variability of metrics. In iSCI, reliability of TMS metrics varies depending on the muscle grade of the tested muscle. Variability is also influenced by factors such as baseline motor function and time post SCI. Future studies that use TMS metrics in longitudinal study designs to understand functional recovery should be cautious as choice of muscle and clinical characteristics can influence reliability.
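The concordance correlation coefficients (CCCs) used above for between-session agreement follow Lin's formula, which penalizes both poor correlation and systematic shifts between sessions. A minimal sketch (the study's actual computation may differ, and the sample values are invented):

```python
import statistics


def concordance_ccc(x, y):
    """Lin's concordance correlation coefficient between two measurement
    sessions: 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)."""
    mx, my = statistics.mean(x), statistics.mean(y)
    n = len(x)
    # population (biased) variances and covariance, as in Lin (1989)
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)


# Identical session-to-session measurements give perfect concordance:
print(concordance_ccc([1.0, 2.0, 3.0, 4.0], [1.0, 2.0, 3.0, 4.0]))  # 1.0
```

Unlike Pearson's r, CCC drops below 1 when one session is consistently offset from the other, which is why it is a common complement to correlation in test-retest designs.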
Connection Setup Signaling Scheme with Flooding-Based Path Searching for Diverse-Metric Network
Kikuta, Ko; Ishii, Daisuke; Okamoto, Satoru; Oki, Eiji; Yamanaka, Naoaki
Connection setup on various computer networks is now achieved by GMPLS. This technology is based on the source-routing approach, which requires the source node to store metric information of the entire network prior to computing a route. Thus all metric information must be distributed to all network nodes and kept up-to-date. However, as metric information becomes more diverse and generalized, it is hard to update all information due to the huge update overhead. Emerging network services and applications require the network to support diverse metrics for achieving various communication qualities. Increasing the number of metrics supported by the network causes excessive processing of metric update messages. To reduce the number of metric update messages, another scheme is required. This paper proposes a connection setup scheme that uses flooding-based signaling rather than the distribution of metric information. The proposed scheme requires only flooding of signaling messages with the requested metric information; no routing protocol is required. Evaluations confirm that the proposed scheme achieves connection establishment without excessive overhead. Our analysis shows that the proposed scheme greatly reduces the number of control messages compared to the conventional scheme, while their blocking probabilities are comparable.
Problems in Systematic Application of Software Metrics and Possible Solution
Rakic, Gordana; Budimac, Zoran
2013-01-01
Systematic application of software metric techniques can lead to significant improvements in the quality of a final software product. However, there is still an evident lack of wider utilization of software metrics techniques and tools for many reasons. In this paper we investigate some limitations of contemporary software metrics tools and then propose construction of a new tool that would solve some of the problems. We describe the promising prototype, its internal structure, and then f...
Some Extensions of Banach's Contraction Principle in Complete Cone Metric Spaces
Directory of Open Access Journals (Sweden)
Raja P
2008-01-01
Full Text Available Abstract In this paper we consider complete cone metric spaces. We generalize some definitions such as -nonexpansive and -uniformly locally contractive functions -closure, -isometric in cone metric spaces, and certain fixed point theorems will be proved in those spaces. Among other results, we prove some interesting applications for the fixed point theorems in cone metric spaces.
[Applicability of traditional landscape metrics in evaluating urban heat island effect].
Chen, Ai-Lian; Sun, Ran-Hao; Chen, Li-Ding
2012-08-01
By using 24 landscape metrics, this paper evaluated the urban heat island effect in parts of Beijing downtown area. QuickBird (QB) images were used to extract the landscape type information, and the thermal bands from Landsat Enhanced Thematic Mapper Plus (ETM+) images were used to extract the land surface temperature (LST) in four seasons of the same year. The 24 landscape pattern metrics were calculated at landscape and class levels in a fixed window of 120 m × 120 m, and the applicability of these traditional landscape metrics in evaluating the urban heat island effect was examined. Among the 24 landscape metrics, only the percentage composition of landscape (PLAND), patch density (PD), largest patch index (LPI), coefficient of Euclidean nearest-neighbor distance variance (ENN_CV), and landscape division index (DIVISION) at landscape level were significantly correlated with the LST in March, May, and November, while the PLAND, LPI, DIVISION, percentage of like adjacencies, and interspersion and juxtaposition index at class level showed significant correlations with the LST in March, May, July, and December, especially in July. Some metrics such as PD, edge density, clumpiness index, patch cohesion index, effective mesh size, splitting index, aggregation index, and normalized landscape shape index showed varying correlations with the LST at different class levels. The traditional landscape metrics were not appropriate for evaluating the effects of rivers on LST; some of the metrics can be useful in characterizing urban LST and analyzing the urban heat island effect, but they should be screened and examined before use.
Experiences with Software Quality Metrics in the EMI Middleware
CERN. Geneva
2012-01-01
The EMI Quality Model has been created to define, and later review, the EMI (European Middleware Initiative) software product and process quality. A quality model is based on a set of software quality metrics and helps to set clear and measurable quality goals for software products and processes. The EMI Quality Model follows the ISO/IEC 9126 Software Engineering – Product Quality to identify a set of characteristics that need to be present in the EMI software. For each software characteristic, such as portability, maintainability, compliance, etc, a set of associated metrics and KPIs (Key Performance Indicators) are identified. This article presents how the EMI Quality Model and the EMI Metrics have been defined in the context of the software quality assurance activities carried out in EMI. It also describes the measurement plan and presents some of the metrics reports that have been produced for the EMI releases and updates. It also covers which tools and techniques can be used by any software project t...
Neutron Damage Metrics and the Quantification of the Associated Uncertainty
International Nuclear Information System (INIS)
Griffin, P.J.
2012-01-01
The motivation for this work is the determination of a methodology for deriving and validating a reference metric that can be used to correlate radiation damage from neutrons of various energies and from charged particles with observed damage modes. Exposure functions for some damage modes are being used by the radiation effects community, e.g. 1-MeV-Equivalent damage in Si and in GaAs semiconductors as well as displacements per atom (dpa) and subsequent material embrittlement in iron. The limitations with the current treatment of these energy-dependent metrics include a lack of an associated covariance matrix and incomplete validation. In addition, the analytical approaches used to derive the current metrics fail to properly treat damage in compound/poly-atomic materials, the evolution and recombination of defects as a function of time since exposure, as well as the influence of dopant materials and impurities in the material of interest. The current metrics only provide a crude correlation with the damage modes of interest. They do not, typically, even distinguish between the damage effectiveness of different types of neutron-induced lattice defects, e.g. they fail to distinguish between a vacancy-oxygen defect and a divacancy with respect to the minority carrier lifetime and the decrease in gain in a Si bipolar transistor. The goal of this work is to facilitate the generation of more advanced radiation metrics that will provide an easier intercomparison of radiation damage as delivered from various types of test facilities and with various real-world nuclear applications. One first needs to properly define the scope of the radiation damage application that is a concern before an appropriate damage metric is selected. The fidelity of the metric selected and the range of environmental parameters under which the metric can be correlated with the damage should match the intended application. It should address the scope of real-world conditions where the metric will
Relevance of metric-free interactions in flocking phenomena.
Ginelli, Francesco; Chaté, Hugues
2010-10-15
We show that the collective properties of self-propelled particles aligning with their topological (Voronoi) neighbors are qualitatively different from those of usual models where metric interaction ranges are used. This relevance of metric-free interactions, shown in a minimal setting, indicates that realistic models for the cohesive motion of cells, bird flocks, and fish schools may have to incorporate them, as suggested by recent observations.
Gamut Volume Index: a color preference metric based on meta-analysis and optimized colour samples.
Liu, Qiang; Huang, Zheng; Xiao, Kaida; Pointer, Michael R; Westland, Stephen; Luo, M Ronnier
2017-07-10
A novel metric named Gamut Volume Index (GVI) is proposed for evaluating the colour preference of lighting. This metric is based on the absolute gamut volume of optimized colour samples. The optimal colour set of the proposed metric was obtained by optimizing the weighted average correlation between the metric predictions and the subjective ratings for 8 psychophysical studies. The performance of 20 typical colour metrics was also investigated, which included colour difference based metrics, gamut based metrics, memory based metrics as well as combined metrics. It was found that the proposed GVI outperformed the existing counterparts, especially for the conditions where correlated colour temperatures differed.
Metric freeness and projectivity for classical and quantum normed modules
Energy Technology Data Exchange (ETDEWEB)
Helemskii, A Ya [M. V. Lomonosov Moscow State University, Moscow (Russian Federation)
2013-07-31
In functional analysis, there are several diverse approaches to the notion of projective module. We show that a certain general categorical scheme contains all basic versions as special cases. In this scheme, the notion of free object comes to the foreground, and, in the best categories, projective objects are precisely retracts of free ones. We are especially interested in the so-called metric version of projectivity and characterize the metrically free classical and quantum (= operator) normed modules. Informally speaking, so-called extremal projectivity, which was known earlier, is interpreted as a kind of 'asymptotical metric projectivity'. In addition, we answer the following specific question in the geometry of normed spaces: what is the structure of metrically projective modules in the simplest case of normed spaces? We prove that metrically projective normed spaces are precisely the subspaces of l₁(M) (where M is a set) that are denoted by l₁⁰(M) and consist of finitely supported functions. Thus, in this case, projectivity coincides with freeness. Bibliography: 28 titles.
Some Metric Properties of Planar Gaussian Free Field
Goswami, Subhajit
In this thesis we study the properties of some metrics arising from the two-dimensional Gaussian free field (GFF), namely the Liouville first-passage percolation (Liouville FPP), the Liouville graph distance and an effective resistance metric. In Chapter 1, we define these metrics as well as discuss the motivations for studying them. Roughly speaking, Liouville FPP is the shortest-path metric in a planar domain D where the length of a path P is given by ∫_P e^{γh(z)} |dz|, where h is the GFF on D and γ > 0. In Chapter 2, we present an upper bound on the expected Liouville FPP distance between two typical points for small values of γ (the near-Euclidean regime). A similar upper bound is derived in Chapter 3 for the Liouville graph distance, which is, roughly, the minimal number of Euclidean balls with comparable Liouville quantum gravity (LQG) measure whose union contains a continuous path between two endpoints. Our bounds seem to be in disagreement with Watabiki's prediction (1993) on the random metric of Liouville quantum gravity in this regime. The contents of these two chapters are based on joint work with Jian Ding. In Chapter 4, we derive some asymptotic estimates for effective resistances on a random network which is defined as follows. Given any γ > 0 and for η = {η_v}, v ∈ Z², denoting a sample of the two-dimensional discrete Gaussian free field on Z² pinned at the origin, we equip the edge (u, v) with conductance e^{γ(η_u + η_v)}. The metric structure of effective resistance plays a crucial role in our proof of the main result in Chapter 4. The primary motivation behind this metric is to understand the random walk on Z² where the edge (u, v) has weight e^{γ(η_u + η_v)}. Using the estimates from Chapter 4 we show in Chapter 5 that for almost every η, this random walk is recurrent and that, with probability tending to 1 as T → ∞, the return probability at time 2T decays as T^{-1+o(1)}. In addition, we prove a version of subdiffusive
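On any finite network, the effective resistance metric mentioned above can be computed by grounding one node, injecting unit current at the other, and solving the reduced Laplacian system. The following sketch assumes unit conductances and a tiny hand-built graph, purely to illustrate the definition (the thesis works with random conductances on Z²):

```python
def effective_resistance(n, edges, u, v):
    """Effective resistance between nodes u and v in a unit-conductance
    network: ground v, inject unit current at u, and solve the reduced
    Laplacian system by Gaussian elimination."""
    idx = [i for i in range(n) if i != v]          # drop row/column of v
    pos = {node: k for k, node in enumerate(idx)}
    m = len(idx)
    L = [[0.0] * m for _ in range(m)]
    for a, b in edges:
        if a != v and b != v:
            L[pos[a]][pos[b]] -= 1.0
            L[pos[b]][pos[a]] -= 1.0
        for node in (a, b):                        # degree terms on diagonal
            if node != v:
                L[pos[node]][pos[node]] += 1.0
    rhs = [0.0] * m
    rhs[pos[u]] = 1.0                              # unit current injected at u
    # Gaussian elimination with partial pivoting
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(L[r][col]))
        L[col], L[piv] = L[piv], L[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, m):
            f = L[r][col] / L[col][col]
            rhs[r] -= f * rhs[col]
            for c in range(col, m):
                L[r][c] -= f * L[col][c]
    x = [0.0] * m                                  # back substitution
    for r in range(m - 1, -1, -1):
        s = sum(L[r][c] * x[c] for c in range(r + 1, m))
        x[r] = (rhs[r] - s) / L[r][r]
    return x[pos[u]]                               # potential at u (v grounded)

# 4-cycle: a unit resistor in parallel with three in series, 1*3/(1+3):
cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(round(effective_resistance(4, cycle, 0, 1), 3))  # 0.75
```

The series/parallel check on the 4-cycle confirms the linear-algebra route agrees with elementary circuit reduction.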
Learning Global-Local Distance Metrics for Signature-Based Biometric Cryptosystems
Directory of Open Access Journals (Sweden)
George S. Eskander Ekladious
2017-11-01
Full Text Available Biometric traits, such as fingerprints, faces and signatures have been employed in bio-cryptosystems to secure cryptographic keys within digital security schemes. Reliable implementations of these systems employ error correction codes formulated as simple distance thresholds, although they may not effectively model the complex variability of behavioral biometrics like signatures. In this paper, a Global-Local Distance Metric (GLDM) framework is proposed to learn cost-effective distance metrics, which reduce within-class variability and augment between-class variability, so that simple error correction thresholds of bio-cryptosystems provide high classification accuracy. First, a large number of samples from a development dataset are used to train a global distance metric that differentiates within-class from between-class samples of the population. Then, once user-specific samples are available for enrollment, the global metric is tuned to a local user-specific one. Proof-of-concept experiments on two reference offline signature databases confirm the viability of the proposed approach. Distance metrics are produced based on concise signature representations consisting of about 20 features and a single prototype. A signature-based bio-cryptosystem is designed using the produced metrics and has shown average classification error rates of about 7% and 17% for the PUCPR and the GPDS-300 databases, respectively. This level of performance is comparable to that obtained with complex state-of-the-art classifiers.
Projection Operator: A Step Towards Certification of Adaptive Controllers
Larchev, Gregory V.; Campbell, Stefan F.; Kaneshige, John T.
2010-01-01
One of the major barriers to wider use of adaptive controllers in commercial aviation is the lack of appropriate certification procedures. In order to be certified by the Federal Aviation Administration (FAA), an aircraft controller is expected to meet a set of guidelines on functionality and reliability while not negatively impacting other systems or safety of aircraft operations. Due to their inherent time-variant and non-linear behavior, adaptive controllers cannot be certified via the metrics used for linear conventional controllers, such as gain and phase margin. Projection Operator is a robustness augmentation technique that bounds the output of a non-linear adaptive controller while conforming to the Lyapunov stability rules. It can also be used to limit the control authority of the adaptive component so that the said control authority can be arbitrarily close to that of a linear controller. In this paper we will present the results of applying the Projection Operator to a Model-Reference Adaptive Controller (MRAC), varying the amount of control authority, and comparing the controller's performance and stability characteristics with those of a linear controller. We will also show how adjusting Projection Operator parameters can make it easier for the controller to satisfy the certification guidelines by enabling a tradeoff between the controller's performance and robustness.
A case study of evolutionary computation of biochemical adaptation
International Nuclear Information System (INIS)
François, Paul; Siggia, Eric D
2008-01-01
Simulations of evolution have a long history, but their relation to biology is questioned because of the perceived contingency of evolution. Here we provide an example of a biological process, adaptation, where simulations are argued to approach closer to biology. Adaptation is a common feature of sensory systems, and a plausible component of other biochemical networks because it rescales upstream signals to facilitate downstream processing. We create random gene networks numerically, by linking genes with interactions that model transcription, phosphorylation and protein–protein association. We define a fitness function for adaptation in terms of two functional metrics, and show that any reasonable combination of them will yield the same adaptive networks after repeated rounds of mutation and selection. Convergence to these networks is driven by positive selection and thus fast. There is always a path in parameter space of continuously improving fitness that leads to perfect adaptation, implying that the actual mutation rates we use in the simulation do not bias the results. Our results imply a kinetic view of evolution, i.e., it favors gene networks that can be learned quickly from the random examples supplied by mutation. This formulation allows for deductive predictions of the networks realized in nature
On the Metric-based Approximate Minimization of Markov Chains
DEFF Research Database (Denmark)
Bacci, Giovanni; Bacci, Giorgio; Larsen, Kim Guldstrand
2018-01-01
In this paper we address the approximate minimization problem of Markov Chains (MCs) from a behavioral metric-based perspective. Specifically, given a finite MC and a positive integer k, we are looking for an MC with at most k states having minimal distance to the original. The metric considered...
lakemorpho: Calculating lake morphometry metrics in R.
Hollister, Jeffrey; Stachelek, Joseph
2017-01-01
Metrics describing the shape and size of lakes, known as lake morphometry metrics, are important for any limnological study. In cases where a lake has long been the subject of study these data are often already collected and are openly available. Many other lakes have these data collected, but access is challenging as it is often stored on individual computers (or worse, in filing cabinets) and is available only to the primary investigators. The vast majority of lakes fall into a third category in which the data are not available. This makes broad scale modelling of lake ecology a challenge as some of the key information about in-lake processes is unavailable. While this valuable in situ information may be difficult to obtain, several national datasets exist that may be used to model and estimate lake morphometry. In particular, digital elevation models and hydrography have been shown to be predictive of several lake morphometry metrics. The R package lakemorpho has been developed to utilize these data and estimate the following morphometry metrics: surface area, shoreline length, major axis length, minor axis length, major and minor axis length ratio, shoreline development, maximum depth, mean depth, volume, maximum lake length, mean lake width, maximum lake width, and fetch. In this software tool article we describe the motivation behind developing lakemorpho, discuss the implementation in R, and describe the use of lakemorpho with an example of a typical use case.
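Some of the metrics listed have simple closed forms once surface area and shoreline length are known. For instance, shoreline development is D_L = L / (2√(πA)), which equals 1 for a perfect circle and grows with shoreline convolution. A minimal sketch (in Python rather than R, and independent of the lakemorpho package):

```python
import math


def shoreline_development(shoreline_length, surface_area):
    """Shoreline development index D_L = L / (2 * sqrt(pi * A)).
    D_L = 1.0 for a circular lake; larger values indicate a more
    convoluted (longer-than-necessary) shoreline."""
    return shoreline_length / (2.0 * math.sqrt(math.pi * surface_area))


# A circular lake of radius 1000 m has the minimum possible value:
r = 1000.0
print(round(shoreline_development(2 * math.pi * r, math.pi * r ** 2), 3))  # 1.0
```

Units must be consistent (e.g. shoreline in metres, area in square metres); the index itself is dimensionless.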
Ranking metrics in gene set enrichment analysis: do they matter?
Zyla, Joanna; Marczyk, Michal; Weiner, January; Polanska, Joanna
2017-05-12
There exist many methods for describing the complex relation between changes of gene expression in molecular pathways or gene ontologies under different experimental conditions. Among them, Gene Set Enrichment Analysis seems to be one of the most commonly used (over 10,000 citations). An important parameter, which could affect the final result, is the choice of a metric for the ranking of genes. Applying a default ranking metric may lead to poor results. In this work 28 benchmark data sets were used to evaluate the sensitivity and false positive rate of gene set analysis for 16 different ranking metrics including new proposals. Furthermore, the robustness of the chosen methods to sample size was tested. Using the k-means clustering algorithm, a group of four metrics with the highest performance in terms of overall sensitivity, overall false positive rate and computational load was established, i.e. the absolute value of the Moderated Welch Test statistic, Minimum Significant Difference, the absolute value of the Signal-To-Noise ratio and the Baumgartner-Weiss-Schindler test statistic. In the case of false positive rate estimation, all selected ranking metrics were robust with respect to sample size. In the case of sensitivity, the absolute value of the Moderated Welch Test statistic and the absolute value of the Signal-To-Noise ratio gave stable results, while Baumgartner-Weiss-Schindler and Minimum Significant Difference showed better results for larger sample size. Finally, the Gene Set Enrichment Analysis method with all tested ranking metrics was parallelised and implemented in MATLAB, and is available at https://github.com/ZAEDPolSl/MrGSEA . Choosing a ranking metric in Gene Set Enrichment Analysis has critical impact on results of pathway enrichment analysis. The absolute value of the Moderated Welch Test has the best overall sensitivity and Minimum Significant Difference has the best overall specificity of gene set analysis. When the number of non-normally distributed genes is high, using Baumgartner
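One of the ranking metrics evaluated above, the signal-to-noise ratio, is simply the difference of class means divided by the sum of class standard deviations, (μ_A − μ_B)/(σ_A + σ_B). The sketch below (Python rather than the authors' MATLAB, with invented toy expression values, and ignoring the variance floors some GSEA implementations apply) shows how genes would be ranked by its absolute value:

```python
import statistics


def signal_to_noise(group_a, group_b):
    """Signal-to-noise ratio ranking metric: difference of class means
    divided by the sum of class (sample) standard deviations."""
    return (statistics.mean(group_a) - statistics.mean(group_b)) / (
        statistics.stdev(group_a) + statistics.stdev(group_b)
    )


# Toy expression values per gene: (condition A replicates, condition B replicates)
genes = {
    "geneA": ([5.0, 5.2, 4.8], [1.0, 1.1, 0.9]),   # strongly differential
    "geneB": ([2.0, 2.1, 1.9], [2.0, 2.2, 1.8]),   # essentially unchanged
}
ranked = sorted(genes, key=lambda g: abs(signal_to_noise(*genes[g])), reverse=True)
print(ranked)  # ['geneA', 'geneB']
```

Swapping in a different ranking function here (e.g. a moderated t-statistic) is exactly the choice the paper shows can change enrichment results.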
Liu, Jianfei; Jung, HaeWon; Dubra, Alfredo; Tam, Johnny
2017-09-01
Adaptive optics scanning light ophthalmoscopy (AOSLO) has enabled quantification of the photoreceptor mosaic in the living human eye using metrics such as cell density and average spacing. These rely on the identification of individual cells. Here, we demonstrate a novel approach for computer-aided identification of cone photoreceptors on nonconfocal split detection AOSLO images. Algorithms for identification of cone photoreceptors were developed, based on multiscale circular voting (MSCV) in combination with a priori knowledge that split detection images resemble Nomarski differential interference contrast images, in which dark and bright regions are present on the two sides of each cell. The proposed algorithm locates dark and bright region pairs, iteratively refining the identification across multiple scales. Identification accuracy was assessed in data from 10 subjects by comparing automated identifications with manual labeling, followed by computation of density and spacing metrics for comparison to histology and published data. There was good agreement between manual and automated cone identifications with overall recall, precision, and F1 score of 92.9%, 90.8%, and 91.8%, respectively. On average, computed density and spacing values using automated identification were within 10.7% and 11.2% of the expected histology values across eccentricities ranging from 0.5 to 6.2 mm. There was no statistically significant difference between MSCV-based and histology-based density measurements (P = 0.96, Kolmogorov-Smirnov 2-sample test). MSCV can accurately detect cone photoreceptors on split detection images across a range of eccentricities, enabling quick, objective estimation of photoreceptor mosaic metrics, which will be important for future clinical trials utilizing adaptive optics.
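The agreement figures reported above (recall, precision, F1) follow from counting automated identifications matched to manual labels. A minimal sketch with hypothetical match counts (the counts below are chosen for illustration only and are not taken from the study):

```python
def detection_scores(tp, fp, fn):
    """Precision, recall, and F1 score from matched detections.

    tp: automated cones matched to a manual label (true positives)
    fp: automated cones with no matching manual label (false positives)
    fn: manual labels missed by the algorithm (false negatives)
    """
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical counts for illustration
p, r, f = detection_scores(tp=908, fp=92, fn=70)
print(f"precision={p:.3f} recall={r:.3f} F1={f:.3f}")
```

F1 is the harmonic mean of precision and recall, so a single score of 91.8% implies the algorithm balances missed cones against spurious detections rather than favoring one error type.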
Inflation with non-minimal coupling. Metric vs. Palatini formulations
International Nuclear Information System (INIS)
Bauer, F.; Demir, D.A.; Izmir Institute of Technology
2008-03-01
We analyze non-minimally coupled scalar field theories in the metric (second-order) and Palatini (first-order) formalisms in a comparative fashion. After contrasting them in a general setup, we specialize to inflation and find that the two formalisms differ in their predictions for various cosmological parameters. The main reason is that the dependencies on the non-minimal coupling parameter differ between the two formalisms. For successful inflation, the Palatini approach prefers a much larger value of the non-minimal coupling parameter than the metric approach. Unlike in the metric formalism, in the Palatini formalism the inflaton stays well below the Planck scale, thereby providing a natural inflationary epoch. (orig.)
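For context, the non-minimally coupled scalar action underlying such analyses conventionally takes the form (a standard sketch; the symbols ξ, φ, V and the mass scale M are assumed notation, not taken from the abstract):

```latex
S = \int d^4x \,\sqrt{-g}\,\left[ \frac{M^2 + \xi\phi^2}{2}\, R[g,\Gamma]
    - \frac{1}{2}\, g^{\mu\nu}\partial_\mu\phi\,\partial_\nu\phi - V(\phi) \right]
```

The two formalisms differ in how the Ricci scalar is built: in the metric formalism the connection Γ is fixed to be the Levi-Civita connection of g, while in the Palatini formalism Γ is varied as an independent field, which changes how the non-minimal coupling ξ feeds into the inflationary predictions.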
On the Metric-Based Approximate Minimization of Markov Chains
DEFF Research Database (Denmark)
Bacci, Giovanni; Bacci, Giorgio; Larsen, Kim Guldstrand
2017-01-01
We address the behavioral metric-based approximate minimization problem of Markov Chains (MCs), i.e., given a finite MC and a positive integer k, we are interested in finding a k-state MC of minimal distance to the original. By considering as metric the bisimilarity distance of Desharnais et al...
Metrical results on systems of small linear forms
DEFF Research Database (Denmark)
Hussain, M.; Kristensen, Simon
In this paper the metric theory of Diophantine approximation associated with small linear forms is investigated. Khintchine-Groshev theorems are established along with a Hausdorff measure generalization without the monotonicity assumption on the approximating function.