WorldWideScience

Sample records for metric distance based

  1. Research on cardiovascular disease prediction based on distance metric learning

    Science.gov (United States)

    Ni, Zhuang; Liu, Kui; Kang, Guixia

    2018-04-01

    Distance metric learning algorithms have been widely applied to medical diagnosis and have exhibited their strengths in classification problems. The k-nearest neighbour (KNN) classifier is an efficient method that treats each feature equally. Large margin nearest neighbour classification (LMNN) improves the accuracy of KNN by learning a global distance metric, but it does not consider the locality of data distributions. In this paper, we propose a new distance metric algorithm, named COS-SUBLMNN, that combines a cosine metric with LMNN and pays more attention to local features of the data, overcoming this shortcoming of LMNN and improving classification accuracy. The proposed methodology is verified on CVD patient vectors derived from real-world medical data. The experimental results show that our method provides higher accuracy than KNN and LMNN, which demonstrates the effectiveness of the risk-prediction model for CVDs based on COS-SUBLMNN.
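The core mechanism described above, KNN classification under a learned global metric, can be sketched in a few lines. This is a generic illustration with a hand-picked Mahalanobis-style matrix; the toy data and the matrix `M_learned` are assumptions for illustration only, not the authors' COS-SUBLMNN algorithm.

```python
import numpy as np

def mahalanobis_knn(X_train, y_train, x_query, M, k=3):
    """Classify x_query by majority vote among the k training points
    closest under the distance induced by the PSD matrix M."""
    diffs = X_train - x_query                                  # (n, d)
    dists = np.sqrt(np.einsum('ij,jk,ik->i', diffs, M, diffs)) # (n,)
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy data: two classes separated along the first feature only;
# the second feature is noise.
X = np.array([[0.0, 0.0], [0.1, 5.0], [1.0, 0.0], [0.9, -5.0]])
y = np.array([0, 0, 1, 1])

# Identity matrix = plain Euclidean KNN; a metric that down-weights
# the noisy second feature mimics what metric learning discovers.
M_learned = np.diag([1.0, 0.01])
print(mahalanobis_knn(X, y, np.array([0.2, -4.0]), M_learned, k=1))  # → 0
```

With the identity matrix in place of `M_learned`, the same query is dominated by the noisy feature and lands in class 1, which is exactly the kind of error a learned metric is meant to prevent.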

  2. Learning Global-Local Distance Metrics for Signature-Based Biometric Cryptosystems

    Directory of Open Access Journals (Sweden)

    George S. Eskander Ekladious

    2017-11-01

    Full Text Available Biometric traits, such as fingerprints, faces and signatures have been employed in bio-cryptosystems to secure cryptographic keys within digital security schemes. Reliable implementations of these systems employ error correction codes formulated as simple distance thresholds, although they may not effectively model the complex variability of behavioral biometrics like signatures. In this paper, a Global-Local Distance Metric (GLDM) framework is proposed to learn cost-effective distance metrics, which reduce within-class variability and augment between-class variability, so that simple error correction thresholds of bio-cryptosystems provide high classification accuracy. First, a large number of samples from a development dataset are used to train a global distance metric that differentiates within-class from between-class samples of the population. Then, once user-specific samples are available for enrollment, the global metric is tuned to a local user-specific one. Proof-of-concept experiments on two reference offline signature databases confirm the viability of the proposed approach. Distance metrics are produced based on concise signature representations consisting of about 20 features and a single prototype. A signature-based bio-cryptosystem is designed using the produced metrics and has shown average classification error rates of about 7% and 17% for the PUCPR and the GPDS-300 databases, respectively. This level of performance is comparable to that obtained with complex state-of-the-art classifiers.

  3. Quantum Algorithm for K-Nearest Neighbors Classification Based on the Metric of Hamming Distance

    Science.gov (United States)

    Ruan, Yue; Xue, Xiling; Liu, Heng; Tan, Jianing; Li, Xi

    2017-11-01

    The k-nearest neighbors (KNN) algorithm is a common classification algorithm, and also a subroutine in various complicated machine learning tasks. In this paper, we present a quantum algorithm (QKNN) for implementing this algorithm based on the metric of Hamming distance. We put forward a quantum circuit for computing the Hamming distance between a testing sample and each feature vector in the training set. Taking advantage of this method, we realize a good analog of the classical KNN algorithm by setting a distance threshold value t to select the k nearest neighbors. As a result, QKNN achieves O(n³) performance, which depends only on the dimension of the feature vectors, together with high classification accuracy, outperforming Lloyd's algorithm (Lloyd et al. 2013) and Wiebe's algorithm (Wiebe et al. 2014).
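The classical counterpart of the threshold-based neighbor selection described in this abstract can be sketched directly; the training set and threshold below are illustrative assumptions, not data from the paper.

```python
def hamming(a, b):
    """Number of positions at which two equal-length bit strings differ."""
    return sum(x != y for x, y in zip(a, b))

def knn_hamming(train, query, t):
    """Return training items whose Hamming distance to the query is <= t,
    mirroring the distance-threshold neighbor selection described above."""
    return [(x, label) for x, label in train if hamming(x, query) <= t]

train = [("0000", "A"), ("0011", "A"), ("1110", "B"), ("1111", "B")]
print(knn_hamming(train, "0001", t=1))  # → [('0000', 'A'), ('0011', 'A')]
```

The quantum version parallelizes the per-vector Hamming-distance computation that this loop performs one item at a time.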

  4. Distance Metric Tracking

    Science.gov (United States)

    2016-03-02

    where B_ψ is any Bregman divergence and η_t is the learning rate parameter. From (Hall & Willett, 2015) we have: Theorem 1. G_ℓ = max_{θ∈Θ, ℓ∈L} ‖∇f(θ)‖, φ_max = 1...Kullback-Leibler divergence between an initial guess of the matrix that parameterizes the Mahalanobis distance and a solution that satisfies a set of...Bregman divergence and η_t is the learning rate parameter. M̂_0, μ̂_0 are initialized to some initial value. In [18] a closed-form algorithm for solving

  5. Improved nonlinear fault detection strategy based on the Hellinger distance metric: Plug flow reactor monitoring

    KAUST Repository

    Harrou, Fouzi

    2017-03-18

    Fault detection has a vital role in the process industry to enhance productivity, efficiency, and safety, and to avoid expensive maintenance. This paper proposes an innovative multivariate fault detection method that can be used for monitoring nonlinear processes. The proposed method merges the advantages of nonlinear projection to latent structures (NLPLS) modeling and those of the Hellinger distance (HD) metric to identify abnormal changes in highly correlated multivariate data. Specifically, the HD is used to quantify the dissimilarity between the current NLPLS-based residual distribution and a reference probability distribution obtained using fault-free data. Furthermore, to further enhance the robustness of the method to measurement noise and reduce false alarms due to modeling errors, wavelet-based multiscale filtering of the residuals is applied before the HD-based monitoring scheme. The performance of the developed NLPLS-HD fault detection technique is illustrated using simulated plug flow reactor data. The results show that the proposed method provides favorable fault detection performance compared to the conventional NLPLS method.
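The Hellinger distance at the heart of the monitoring scheme is straightforward to compute for discrete distributions; the residual histograms below are hypothetical, chosen only to illustrate the computation.

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete probability distributions.
    Bounded in [0, 1]; equals 0 iff the distributions coincide."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

# A reference (fault-free) residual histogram vs. a shifted one:
ref   = [0.1, 0.4, 0.4, 0.1]
drift = [0.4, 0.4, 0.1, 0.1]
print(round(hellinger(ref, ref), 3))    # → 0.0 (identical distributions)
print(round(hellinger(ref, drift), 3))  # → 0.316
```

A fault alarm would then be raised whenever the distance between the current residual distribution and the fault-free reference exceeds a decision threshold.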

  6. Measuring distance “as the horse runs”: Cross-scale comparison of terrain-based metrics

    Science.gov (United States)

    Buttenfield, Barbara P.; Ghandehari, M; Leyk, S; Stanislawski, Larry V.; Brantley, M E; Qiang, Yi

    2016-01-01

    Distance metrics play significant roles in spatial modeling tasks, such as flood inundation (Tucker and Hancock 2010), stream extraction (Stanislawski et al. 2015), power line routing (Kiessling et al. 2003) and analysis of surface pollutants such as nitrogen (Harms et al. 2009). Avalanche risk is based on slope, aspect, and curvature, all directly computed from distance metrics (Gutiérrez 2012). Distance metrics anchor variogram analysis, kernel estimation, and spatial interpolation (Cressie 1993). Several approaches are employed to measure distance. Planar metrics measure straight-line distance between two points (“as the crow flies”) and are simple and intuitive, but suffer from uncertainties. Planar metrics assume that Digital Elevation Model (DEM) pixels are rigid and flat, like tiny facets of ceramic tile approximating a continuous terrain surface. In truth, terrain can bend, twist and undulate within each pixel. Working with Light Detection and Ranging (lidar) data or High Resolution Topography to achieve precise measurements presents challenges, as filtering can eliminate or distort significant features (Passalacqua et al. 2015). The current availability of lidar data is far from comprehensive even in developed nations, and non-existent in many rural and undeveloped regions. Notwithstanding computational advances, distance estimation on DEMs has never been systematically assessed, due to assumptions that improvements are so small that surface adjustment is unwarranted. For individual pixels inaccuracies may be small, but additive effects can propagate dramatically, especially in regional models (e.g., disaster evacuation) or global models (e.g., sea level rise) where pixels span dozens to hundreds of kilometers (Usery et al. 2003). Such models are increasingly common, lending compelling reasons to understand shortcomings in the use of planar distance metrics. Researchers have studied curvature-based terrain modeling. Jenny et al. (2011) use curvature to generate

  7. Metrics for measuring distances in configuration spaces

    International Nuclear Information System (INIS)

    Sadeghi, Ali; Ghasemi, S. Alireza; Schaefer, Bastian; Mohr, Stephan; Goedecker, Stefan; Lill, Markus A.

    2013-01-01

    In order to characterize molecular structures we introduce configurational fingerprint vectors which are counterparts of quantities used experimentally to identify structures. The Euclidean distance between the configurational fingerprint vectors satisfies the properties of a metric and can therefore safely be used to measure dissimilarities between configurations in the high-dimensional configuration space. In particular we show that these metrics are a perfect and computationally cheap replacement for the root-mean-square distance (RMSD) when one has to decide whether two noise-contaminated configurations are identical or not. We introduce a Monte Carlo approach to obtain the global minimum of the RMSD between configurations, which is obtained from a global minimization over all translations, rotations, and permutations of atomic indices.

  8. Content-based retrieval of brain tumor in contrast-enhanced MRI images using tumor margin information and learned distance metric.

    Science.gov (United States)

    Yang, Wei; Feng, Qianjin; Yu, Mei; Lu, Zhentai; Gao, Yang; Xu, Yikai; Chen, Wufan

    2012-11-01

    A content-based image retrieval (CBIR) method for T1-weighted contrast-enhanced MRI (CE-MRI) images of brain tumors is presented for diagnosis aid. The method is thoroughly evaluated on a large image dataset. Using the tumor region as a query, the authors' CBIR system attempts to retrieve tumors of the same pathological category. Aside from commonly used features such as intensity, texture, and shape features, the authors use a margin information descriptor (MID), which is capable of describing the characteristics of the tissue surrounding a tumor, for representing image contents. In addition, the authors designed a distance metric learning algorithm called Maximum mean average Precision Projection (MPP) to maximize the smooth approximated mean average precision (mAP) and thereby optimize retrieval performance. The effectiveness of the MID and MPP algorithms was evaluated using a brain CE-MRI dataset consisting of 3108 2D scans acquired from 235 patients with three categories of brain tumors (meningioma, glioma, and pituitary tumor). By combining MID and other features, the mAP of retrieval increased by more than 6% with the learned distance metrics. The distance metric learned by MPP significantly outperformed the two other existing distance metric learning methods in terms of mAP. The CBIR system using the proposed strategies achieved a mAP of 87.3% and a precision of 89.3% when the top 10 images were returned by the system. Compared with the scale-invariant feature transform, the MID, which uses the intensity profile as descriptor, achieves better retrieval performance. Incorporating tumor margin information represented by MID with the distance metric learned by the MPP algorithm can substantially improve the retrieval performance for brain tumors in CE-MRI.

  9. Distance walked and run as improved metrics over time-based energy estimation in epidemiological studies and prevention; evidence from medication use.

    Directory of Open Access Journals (Sweden)

    Paul T Williams

    The guideline physical activity levels are prescribed in terms of time, frequency, and intensity (e.g., 30 minutes of brisk walking, five days a week) or its energy equivalence, and assume that different activities may be combined to meet targeted goals (the exchangeability premise). Habitual runners and walkers may quantify exercise in terms of distance (km/day), and for them the relationship between activity dose and health benefits may be better assessed in terms of distance rather than time. Analyses were therefore performed to test: (1) whether time-based or distance-based estimates of energy expenditure provide the better metric for relating running and walking to hypertension, high cholesterol, and diabetes medication use (conditions known to be diminished by exercise), and (2) the exchangeability premise. Logistic regression analyses of medication use (dependent variable) vs. metabolic equivalent hours per day (METhr/d) of running, walking and other exercise (independent variables) used cross-sectional data from the National Runners' (17,201 male, 16,173 female) and Walkers' (3,434 male, 12,384 female) Health Studies. Estimated METhr/d of running and walking activity were 38% and 31% greater, respectively, when calculated from self-reported time than from distance in men, and 43% and 37% greater in women, respectively. Percent reductions in the odds for hypertension and high cholesterol medication use per METhr/d run or walked were ≥ 2-fold greater when estimated from reported distance (km/wk) than from time (hr/wk). The per-METhr/d odds reduction was significantly greater for the distance-based than the time-based estimate for hypertension (runners: P<10⁻⁵ for males and P=0.003 for females; walkers: P=0.03 for males and P<10⁻⁴ for females), for high cholesterol medication use in runners (P<10⁻⁴ for males and P=0.02 for females) and walkers (P=0.01 for males and P=0.08 for females), and for diabetes medication use in male runners (P<10⁻³). Although causality

  10. On the Metric-Based Approximate Minimization of Markov Chains

    DEFF Research Database (Denmark)

    Bacci, Giovanni; Bacci, Giorgio; Larsen, Kim Guldstrand

    2017-01-01

    We address the behavioral metric-based approximate minimization problem of Markov Chains (MCs), i.e., given a finite MC and a positive integer k, we are interested in finding a k-state MC of minimal distance to the original. By considering as metric the bisimilarity distance of Desharnais et al...

  11. On the Metric-based Approximate Minimization of Markov Chains

    DEFF Research Database (Denmark)

    Bacci, Giovanni; Bacci, Giorgio; Larsen, Kim Guldstrand

    2018-01-01

    In this paper we address the approximate minimization problem of Markov Chains (MCs) from a behavioral metric-based perspective. Specifically, given a finite MC and a positive integer k, we are looking for an MC with at most k states having minimal distance to the original. The metric considered...

  12. Alignment-free genome tree inference by learning group-specific distance metrics.

    Science.gov (United States)

    Patil, Kaustubh R; McHardy, Alice C

    2013-01-01

    Understanding the evolutionary relationships between organisms is vital for their in-depth study. Gene-based methods are often used to infer such relationships, but these are not without drawbacks. One can now attempt to use genome-scale information instead, because of the ever-increasing number of genomes available. This opportunity also presents a challenge in terms of computational efficiency. Two fundamentally different methods are often employed for sequence comparisons, namely alignment-based and alignment-free methods. Alignment-free methods rely on the genome signature concept and provide a computationally efficient approach that is also applicable to nonhomologous sequences. The genome signature contains evolutionary signal, as it is more similar for closely related organisms than for distantly related ones. We used genome-scale sequence information to infer taxonomic distances between organisms without additional information such as gene annotations. We propose a method to improve genome tree inference by learning specific distance metrics over the genome signature for groups of organisms with similar phylogenetic, genomic, or ecological properties. Specifically, our method learns a Mahalanobis metric for a set of genomes, using a reference taxonomy to guide the learning process. By applying this method to more than a thousand prokaryotic genomes, we showed that, indeed, better distance metrics could be learned for most of the 18 groups of organisms tested here. Once a group-specific metric is available, it can be used to estimate the taxonomic distances for other sequenced organisms from the group. This study also presents a large-scale comparison between 10 methods (9 alignment-free and 1 alignment-based).

  13. Characterization of Diffusion Metric Map Similarity in Data From a Clinical Data Repository Using Histogram Distances

    Science.gov (United States)

    Warner, Graham C.; Helmer, Karl G.

    2018-01-01

    As the sharing of data is mandated by funding agencies and journals, reuse of data has become more prevalent. It becomes imperative, therefore, to develop methods to characterize the similarity of data. While users can group data based on the acquisition parameters stored in the file headers, this gives no indication of whether a file can be combined with other data without increasing the variance in the data set. Methods have been implemented that characterize the signal-to-noise ratio or identify signal drop-outs in the raw image files, but potential users of data often have access only to calculated metric maps, which are more difficult to characterize and compare. Here we describe a histogram-distance-based method applied to diffusion metric maps of fractional anisotropy and mean diffusivity that were generated using data extracted from a repository of clinically-acquired MRI data. We describe the generation of the data set, the pitfalls specific to diffusion MRI data, and the results of the histogram distance analysis. We find that, in general, data from GE scanners are less similar than are data from Siemens scanners. We also find that the distribution of distance metric values is not Gaussian at any selection of the acquisition parameters considered here (field strength, number of gradient directions, b-value, and vendor). PMID:29568257
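Many histogram distances could serve in an analysis like this. As one concrete choice (an assumption for illustration, not necessarily the metric used by the authors), the 1-D earth mover's distance between two histograms on the same bins reduces to the L1 distance between their cumulative sums:

```python
import numpy as np

def emd_1d(h1, h2):
    """Earth mover's (1-D Wasserstein) distance between two histograms
    defined on the same bins, via the L1 distance of their CDFs."""
    p = np.asarray(h1, float); p /= p.sum()
    q = np.asarray(h2, float); q /= q.sum()
    return np.abs(np.cumsum(p) - np.cumsum(q)).sum()

# Hypothetical FA-value histograms from two scans (same bin edges assumed):
scan_a = [5, 20, 50, 20, 5]
scan_b = [5, 25, 45, 20, 5]
print(emd_1d(scan_a, scan_a) == 0.0)         # → True
print(round(emd_1d(scan_a, scan_b), 3))      # → 0.05
```

Pairwise distances computed this way over a collection of metric maps yield the kind of similarity distribution the abstract analyzes across scanners and acquisition parameters.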

  14. A study of metrics of distance and correlation between ranked lists for compositionality detection

    DEFF Research Database (Denmark)

    Lioma, Christina; Hansen, Niels Dalum

    2017-01-01

    affects the measurement of semantic similarity. We propose a new compositionality detection method that represents phrases as ranked lists of term weights. Our method approximates the semantic similarity between two ranked list representations using a range of well-known distance and correlation metrics...... of compositionality using any of the distance and correlation metrics considered....

  15. Metric distances derived from cosine similarity and Pearson and Spearman correlations

    OpenAIRE

    van Dongen, Stijn; Enright, Anton J.

    2012-01-01

    We investigate two classes of transformations of cosine similarity and Pearson and Spearman correlations into metric distances, utilising the simple tool of metric-preserving functions. The first class puts anti-correlated objects maximally far apart. Previously known transforms fall within this class. The second class collates correlated and anti-correlated objects. An example of such a transformation that yields a metric distance is the sine function when applied to centered data.
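The second-class transform mentioned in this abstract, the sine applied to centered data, can be checked numerically: since the Pearson correlation of centered data is the cosine of the angle between the vectors, sin θ = √(1 − r²) sends both perfectly correlated and perfectly anti-correlated pairs to distance 0. This is a sketch of that property, not the authors' code.

```python
import math

def pearson(x, y):
    """Pearson correlation; for centered data this equals cosine similarity."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    xc = [v - mx for v in x]
    yc = [v - my for v in y]
    num = sum(a * b for a, b in zip(xc, yc))
    den = math.sqrt(sum(a * a for a in xc) * sum(b * b for b in yc))
    return num / den

def sine_distance(x, y):
    """Second-class transform: sin(angle) = sqrt(1 - r^2), which collates
    correlated and anti-correlated objects (both map near 0)."""
    r = pearson(x, y)
    return math.sqrt(max(0.0, 1.0 - r * r))

a = [1.0, 2.0, 3.0]
b = [2.0, 4.0, 6.0]   # perfectly correlated with a
c = [3.0, 2.0, 1.0]   # perfectly anti-correlated with a
print(sine_distance(a, b), sine_distance(a, c))  # → 0.0 0.0
```

A first-class transform such as √(2(1 − r)) would instead place `a` and `c` maximally far apart.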

  16. 80537 based distance relay

    DEFF Research Database (Denmark)

    Pedersen, Knud Ole Helgesen

    1999-01-01

    A method for implementing a digital distance relay in the power system is described. Instructions are given on how to program this relay on an 80537-based microcomputer system. The problem is used as a practical case study in the course 53113: Microcomputer applications in the power system. The relay...

  17. A robust new metric of phenotypic distance to estimate and compare multiple trait differences among populations

    Directory of Open Access Journals (Sweden)

    Rebecca SAFRAN, Samuel FLAXMAN, Michael KOPP, Darren E. IRWIN, Derek BRIGGS, Matthew R. EVANS, W. Chris FUNK, David A. GRAY, Eileen A. HEBE

    2012-06-01

    Whereas a rich literature exists for estimating population genetic divergence, metrics of phenotypic trait divergence are lacking, particularly for comparing multiple traits among three or more populations. Here, we review and analyze via simulation Hedges' g, a widely used parametric estimate of effect size. Our analyses indicate that g is sensitive to a combination of unequal trait variances and unequal sample sizes among populations and to changes in the scale of measurement. We then derive and explain a new, non-parametric distance measure, “Δp”, which is calculated based upon a joint cumulative distribution function (CDF) from all populations under study. More precisely, distances are measured in terms of the percentiles in this CDF at which each population's median lies. Δp combines many desirable features of other distance metrics into a single metric; namely, compared to other metrics, Δp is relatively insensitive to unequal variances and sample sizes among the populations sampled. Furthermore, a key feature of Δp, and our main motivation for developing it, is that it easily accommodates simultaneous comparisons of any number of traits across any number of populations. To exemplify its utility, we employ Δp to address a question related to the role of sexual selection in speciation: are sexual signals more divergent than ecological traits in closely related taxa? Using traits of known function in closely related populations, we show that traits predictive of reproductive performance are, indeed, more divergent and more sexually dimorphic than traits related to ecological adaptation [Current Zoology 58 (3): 423–436, 2012].
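For the simplest case of one trait and two populations, the percentile construction behind Δp can be sketched as follows. The pooled-empirical-CDF reading below is an illustrative interpretation of the abstract, not the authors' exact formulation.

```python
import numpy as np

def delta_p(pop_a, pop_b):
    """Sketch of the Δp idea for one trait and two populations: locate each
    population's median in the pooled empirical CDF and return the absolute
    difference of those percentiles (a value in [0, 1])."""
    pooled = np.sort(np.concatenate([pop_a, pop_b]))
    cdf = lambda v: np.searchsorted(pooled, v, side='right') / len(pooled)
    return abs(cdf(np.median(pop_a)) - cdf(np.median(pop_b)))

a = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
b = np.array([6.0, 7.0, 8.0, 9.0, 10.0])   # fully separated from a
c = np.array([1.5, 2.5, 3.5, 4.5, 5.5])    # heavy overlap with a
print(round(delta_p(a, b), 3), round(delta_p(a, c), 3))  # → 0.5 0.1
```

Because the comparison happens on the percentile scale of the pooled CDF, rescaling the trait or inflating one population's variance changes the result far less than it would change a variance-based effect size such as g.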

  18. Sequence of maximal distance codes in graphs or other metric spaces

    Directory of Open Access Journals (Sweden)

    Charles Delorme

    2013-11-01

    Given a subset C in a metric space E, its successor is the subset s(C) of points at maximum distance from C in E. We study some properties of the sequence obtained by iterating this operation. Graphs with their usual distance already provide typical examples.
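On a finite graph with the shortest-path distance, the successor operation s(C) and its iteration can be computed directly. The path graph below is a toy example chosen for illustration, not one taken from the paper.

```python
from collections import deque

def distances_from(adj, sources):
    """Multi-source BFS: dist[v] = min over c in sources of d(v, c)."""
    dist = {v: None for v in adj}
    q = deque()
    for s in sources:
        dist[s] = 0
        q.append(s)
    while q:
        u = q.popleft()
        for v in adj[u]:
            if dist[v] is None:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def successor(adj, C):
    """s(C): the set of vertices at maximum distance from C."""
    dist = distances_from(adj, C)
    m = max(dist.values())
    return frozenset(v for v, d in dist.items() if d == m)

# Path graph 0-1-2-3: iterating from {0} oscillates between the endpoints.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
C = frozenset({0})
for _ in range(3):
    C = successor(path, C)
    print(sorted(C))   # prints [3], then [0], then [3]
```

Since there are only finitely many subsets of a finite graph, the iterated sequence must eventually become periodic, as in the oscillation above.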

  19. Drift correction for single-molecule imaging by molecular constraint field, a distance minimum metric

    International Nuclear Information System (INIS)

    Han, Renmin; Wang, Liansan; Xu, Fan; Zhang, Yongdeng; Zhang, Mingshu; Liu, Zhiyong; Ren, Fei; Zhang, Fa

    2015-01-01

    The recent developments of far-field optical microscopy (single-molecule imaging techniques) have overcome the diffraction barrier of light and improved image resolution by a factor of ten compared with conventional light microscopy. These techniques utilize the stochastic switching of probe molecules to overcome the diffraction limit and determine the precise localizations of molecules, which often requires a long image acquisition time. However, long acquisition times increase the risk of sample drift, and in high-resolution microscopy sample drift decreases the image resolution. In this paper, we propose a novel metric based on the distance between molecules to perform drift correction. The proposed metric directly uses the position information of molecules to estimate the frame drift. We also designed an algorithm to implement the metric for general drift-correction applications. There are two advantages of our method: first, because it does not require spatial binning of the positions of molecules but operates directly on the positions, it is more natural for single-molecule imaging techniques. Second, it can estimate drift with a small number of positions in each temporal bin, which may extend its potential applications. The effectiveness of our method has been demonstrated on both simulated data and experiments on single-molecule images.

  20. A boosting framework for visuality-preserving distance metric learning and its application to medical image retrieval.

    Science.gov (United States)

    Yang, Liu; Jin, Rong; Mummert, Lily; Sukthankar, Rahul; Goode, Adam; Zheng, Bin; Hoi, Steven C H; Satyanarayanan, Mahadev

    2010-01-01

    Similarity measurement is a critical component in content-based image retrieval systems, and learning a good distance metric can significantly improve retrieval performance. However, despite extensive study, there are several major shortcomings with the existing approaches for distance metric learning that can significantly affect their application to medical image retrieval. In particular, "similarity" can mean very different things in image retrieval: resemblance in visual appearance (e.g., two images that look like one another) or similarity in semantic annotation (e.g., two images of tumors that look quite different yet are both malignant). Current approaches for distance metric learning typically address only one goal without consideration of the other. This is problematic for medical image retrieval where the goal is to assist doctors in decision making. In these applications, given a query image, the goal is to retrieve similar images from a reference library whose semantic annotations could provide the medical professional with greater insight into the possible interpretations of the query image. If the system were to retrieve images that did not look like the query, then users would be less likely to trust the system; on the other hand, retrieving images that appear superficially similar to the query but are semantically unrelated is undesirable because that could lead users toward an incorrect diagnosis. Hence, learning a distance metric that preserves both visual resemblance and semantic similarity is important. We emphasize that, although our study is focused on medical image retrieval, the problem addressed in this work is critical to many image retrieval systems. We present a boosting framework for distance metric learning that aims to preserve both visual and semantic similarities. The boosting framework first learns a binary representation using side information, in the form of labeled pairs, and then computes the distance as a weighted Hamming

  1. Series distance – an intuitive metric to quantify hydrograph similarity in terms of occurrence, amplitude and timing of hydrological events

    Directory of Open Access Journals (Sweden)

    U. Ehret

    2011-03-01

    Applying metrics to quantify the similarity or dissimilarity of hydrographs is a central task in hydrological modelling, used both in model calibration and in the evaluation of simulations or forecasts. Motivated by the shortcomings of standard objective metrics such as the Root Mean Square Error (RMSE) or the Mean Absolute Peak Time Error (MAPTE), and by the advantages of visual inspection as a powerful tool for simultaneous, case-specific and multi-criteria (yet subjective) evaluation, we propose a new objective metric termed Series Distance, which is in close accordance with visual evaluation. The Series Distance quantifies the similarity of two hydrographs neither in a time-aggregated nor in a point-by-point manner, but on the scale of hydrological events. It consists of three parts, namely a Threat Score, which evaluates overall agreement of event occurrence, and the overall distances of matching observed and simulated events with respect to amplitude and timing. The novelty of the latter two is the way in which matching point pairs on the observed and simulated hydrographs are identified: not by equality in time (as is the case with the RMSE), but by the same relative position in matching segments (rise or recession) of the event, indicating the same underlying hydrological process. Thus, amplitude and timing errors are calculated simultaneously but separately, from point pairs that also match visually, considering complete events rather than only individual points (as is the case with MAPTE). Relative weights can freely be assigned to each component of the Series Distance, which allows (subjective) customization of the metric to various fields of application, but in a traceable way. Each of the three components of the Series Distance can be used in an aggregated or non-aggregated way, which makes the Series Distance a suitable tool for differentiated, process-based model diagnostics.

    After discussing the applicability of established time series

  2. Two fixed point theorems on quasi-metric spaces via mw-distances

    Energy Technology Data Exchange (ETDEWEB)

    Alegre, C.

    2017-07-01

    In this paper we prove a Banach-type fixed point theorem and a Kannan-type theorem in the setting of quasi-metric spaces using the notion of mw-distance. These theorems generalize some results that have recently appeared in the literature. (Author)

  3. Fixed point results for contractions involving generalized altering distances in ordered metric spaces

    Directory of Open Access Journals (Sweden)

    Samet Bessem

    2011-01-01

    In this article, we establish coincidence point and common fixed point theorems for mappings satisfying a contractive inequality which involves two generalized altering distance functions in ordered complete metric spaces. As an application, we study the existence of a common solution to a system of integral equations. 2000 Mathematics Subject Classification: Primary 47H10, Secondary 54H25.

  4. WE-E-213CD-11: A New Automatically Generated Metric for Evaluating the Spatial Precision of Deformable Image Registrations: The Distance Discordance Metric.

    Science.gov (United States)

    Saleh, Z; Apte, A; Sharp, G; Deasy, J

    2012-06-01

    We propose a new metric called Distance Discordance (DD), defined as the distance between two anatomic points from two moving images, co-located on some reference image, when deformed onto another reference image. To demonstrate the concept of DD, we created a reference software phantom containing two objects. The first object (1) consists of a hollow box with a fixed-size core and variable wall thickness. The second object (2) consists of a solid box of fixed size and arbitrary location. Seven different variations of the phantom were created. Each phantom was deformed onto every other phantom using two B-Spline DIR algorithms available in Elastix and Plastimatch. Voxels were sampled from the reference phantom [1] and deformed from the moving phantoms [2…6], and we found the differences in their corresponding locations on phantom [7]. Each voxel results in a distribution of DD values, which we call the distance discordance histogram (DDH). We also demonstrate this concept in 8 Head & Neck patients. The two image registration algorithms produced two different DD results for the same phantom image set. The mean values of the DDH were slightly lower for Elastix (0-1.28 cm) than those produced by Plastimatch (0-1.43 cm). The combined DDH for the H&N patients followed a lognormal distribution with a mean of 0.45 cm and a standard deviation of 0.42 cm. The proposed distance discordance (DD) metric is an easily interpretable, quantitative tool that can be used to evaluate the effect of inter-patient variability on the goodness of the registration in different parts of the patient anatomy. Therefore, it can be utilized to exclude certain images based on their DDH characteristics. In addition, this metric does not rely on 'ground truth' or the presence of contoured structures. Partially supported by NIH grant R01 CA85181. © 2012 American Association of Physicists in Medicine.

  5. Deep Multimodal Distance Metric Learning Using Click Constraints for Image Ranking.

    Science.gov (United States)

    Yu, Jun; Yang, Xiaokang; Gao, Fei; Tao, Dacheng

    2017-12-01

    How do we retrieve images accurately? And how do we rank a group of images precisely and efficiently for specific queries? These problems are critical for researchers and engineers developing novel image search engines. First, it is important to obtain an appropriate description that effectively represents the images. In this paper, multimodal features are considered for describing images. The images' unique properties are reflected by visual features, which are correlated with each other. However, semantic gaps always exist between images' visual features and their semantics. Therefore, we utilize click features to reduce the semantic gap. The second key issue is learning an appropriate distance metric to combine these multimodal features. This paper develops a novel deep multimodal distance metric learning (Deep-MDML) method. A structured ranking model is adopted to utilize both visual and click features in distance metric learning (DML). Specifically, images and their related ranking results are first collected to form the training set, and multimodal features, including click and visual features, are collected with these images. Next, a group of autoencoders is applied to obtain an initial distance metric in different visual spaces, and an MDML method is used to assign optimal weights to the different modalities. Then, we conduct alternating optimization to train the ranking model, which is used for ranking new queries with click features. Compared with existing image ranking methods, the proposed method adopts a new ranking model to use multimodal features, including click features and visual features, in DML. We conducted experiments to analyze the proposed Deep-MDML on two benchmark data sets, and the results validate the effectiveness of the method.

  6. An uncertainty importance measure using a distance metric for the change in a cumulative distribution function

    International Nuclear Information System (INIS)

    Chun, Moon-Hyun; Han, Seok-Jung; Tak, Nam-IL

    2000-01-01

    A simple measure of uncertainty importance using the entire change of cumulative distribution functions (CDFs) has been developed for use in probabilistic safety assessments (PSAs). The entire change of CDFs is quantified in terms of the metric distance between two CDFs. The metric distance measure developed in this study reflects the relative impact of distributional changes of inputs on the change of an output distribution, whereas most existing uncertainty importance measures reflect the magnitude of the relative contribution of input uncertainties to the output uncertainty. The present measure has been evaluated analytically for various analytical distributions to examine its characteristics. To illustrate the applicability and strength of the present measure, two examples are provided. The first is an application of the present measure to a typical problem of system fault tree analysis, and the second is a hypothetical non-linear model. Comparisons of the present results with those obtained by existing uncertainty importance measures show that the metric distance measure is a useful tool for expressing uncertainty importance in terms of the relative impact of distributional changes of inputs on the change of an output distribution.
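
The "metric distance between two CDFs" admits several formalizations; as a hedged sketch (the paper's exact definition may differ), the L2 distance over a common grid behaves as the abstract describes, growing as the input distribution changes more:

```python
import numpy as np
from math import erf, sqrt

def cdf_metric_distance(f1, f2, x):
    """L2 metric distance between two CDFs sampled on a common grid x,
    integrated with the trapezoid rule."""
    y = (f1 - f2) ** 2
    return float(np.sqrt(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x))))

def normal_cdf(x, mu, sigma):
    return np.array([0.5 * (1.0 + erf((v - mu) / (sigma * sqrt(2.0)))) for v in x])

# Shifting an input distribution and measuring the entire change of its CDF:
x = np.linspace(-6.0, 6.0, 1201)
base = normal_cdf(x, 0.0, 1.0)
small_shift = normal_cdf(x, 0.5, 1.0)
large_shift = normal_cdf(x, 1.0, 1.0)
d_small = cdf_metric_distance(base, small_shift, x)
d_large = cdf_metric_distance(base, large_shift, x)
```

A larger distributional change of the input (the bigger shift) yields a larger metric distance, which is the behavior an uncertainty importance measure should capture.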

  7. Product Differentiation and Brand Competition in the Italian Breakfast Cereal Market: a Distance Metric Approach

    Directory of Open Access Journals (Sweden)

    Paolo Sckokai

    2013-03-01

    Full Text Available This article employs a nation-wide sample of supermarket scanner data to study product and brand competition in the Italian breakfast cereal market. A modified Almost Ideal Demand System (AIDS), which includes Distance Metrics (DMs) as proposed by Pinkse, Slade and Brett (2002), is estimated to study demand responses, substitution patterns, and own-price and cross-price elasticities. Estimation results provide evidence of some degree of brand loyalty, while consumers do not seem loyal to the product type. Elasticity estimates point to the presence of patterns of substitution within products sharing the same brand and similar nutritional characteristics.

  8. Discriminatory Data Mapping by Matrix-Based Supervised Learning Metrics

    NARCIS (Netherlands)

    Strickert, M.; Schneider, P.; Keilwagen, J.; Villmann, T.; Biehl, M.; Hammer, B.

    2008-01-01

    Supervised attribute relevance detection using cross-comparisons (SARDUX), a recently proposed method for data-driven metric learning, is extended from dimension-weighted Minkowski distances to metrics induced by a data transformation matrix Ω for modeling mutual attribute dependence. Given class

  9. Distance metric learning for complex networks: Towards size-independent comparison of network structures

    Science.gov (United States)

    Aliakbary, Sadegh; Motallebi, Sadegh; Rashidian, Sina; Habibi, Jafar; Movaghar, Ali

    2015-02-01

    Real networks show nontrivial topological properties such as community structure and long-tail degree distribution. Moreover, many network analysis applications are based on topological comparison of complex networks. Classification and clustering of networks, model selection, and anomaly detection are just some applications of network comparison. In these applications, an effective similarity metric is needed which, given two complex networks of possibly different sizes, evaluates the amount of similarity between the structural features of the two networks. Traditional graph comparison approaches, such as isomorphism-based methods, are not only too time-consuming but also inappropriate for comparing networks of different sizes. In this paper, we propose an intelligent method based on genetic algorithms for integrating, selecting, and weighting network features in order to develop an effective similarity measure for complex networks. The proposed similarity metric outperforms state-of-the-art methods with respect to different evaluation criteria.

  10. SU-E-J-159: Intra-Patient Deformable Image Registration Uncertainties Quantified Using the Distance Discordance Metric

    International Nuclear Information System (INIS)

    Saleh, Z; Thor, M; Apte, A; Deasy, J; Sharp, G; Muren, L

    2014-01-01

    Purpose: The quantitative evaluation of deformable image registration (DIR) is currently challenging due to the lack of a ground truth. In this study we test a new method proposed for quantifying multiple-image based DIR-related uncertainties, for DIR of pelvic images. Methods: 19 patients who previously had radiotherapy for prostate cancer were analyzed, each with 6 CT scans. Rectum and bladder structures, which served as ground truth, were manually delineated on the planning CT and each subsequent scan. For each patient, voxel-by-voxel DIR-related uncertainties were evaluated, following B-spline based DIR, by applying a previously developed metric, the distance discordance metric (DDM; Saleh et al., PMB (2014) 59:733). The DDM map was superimposed on the first acquired CT scan and DDM statistics were assessed, also relative to two metrics estimating the agreement between the propagated and the manually delineated structures. Results: The highest DDM values, which correspond to the greatest spatial uncertainties, were observed near the body surface and in the bowel due to the presence of gas. The mean rectal and bladder DDM values ranged from 1.1–11.1 mm and 1.5–12.7 mm, respectively. There was a strong correlation in the DDMs between the rectum and bladder (Pearson R = 0.68 for the max DDM). For both structures, DDM was correlated with the ratio between the DIR-propagated and manually delineated volumes (R = 0.74 for the max rectal DDM). The maximum rectal DDM was negatively correlated with the Dice Similarity Coefficient between the propagated and the manually delineated volumes (R = −0.52). Conclusion: The multiple-image based DDM map quantified considerable DIR variability across different structures and among patients. Besides quantifying DIR-related uncertainties, the DDM could potentially be used to adjust for uncertainties in DIR-based accumulated dose distributions.

  11. Ideal Based Cyber Security Technical Metrics for Control Systems

    Energy Technology Data Exchange (ETDEWEB)

    W. F. Boyer; M. A. McQueen

    2007-10-01

    Much of the world's critical infrastructure is at risk from attack through electronic networks connected to control systems. Security metrics are important because they provide the basis for management decisions that affect the protection of the infrastructure. A cyber security technical metric is the security-relevant output from an explicit mathematical model that makes use of objective measurements of a technical object. A specific set of technical security metrics is proposed for use by the operators of control systems. Our proposed metrics are based on seven security ideals associated with seven corresponding abstract dimensions of security. We have defined at least one metric for each of the seven ideals. Each metric is a measure of how nearly the associated ideal has been achieved. These seven ideals provide a useful structure for further metrics development. A case study shows how the proposed metrics can be applied to an operational control system.

  12. Value-based metrics and Internet-based enterprises

    Science.gov (United States)

    Gupta, Krishan M.

    2001-10-01

    Within the last few years, a host of value-based metrics like EVA, MVA, TBR, CFROI, and TSR have evolved. This paper attempts to analyze the validity and applicability of EVA and the Balanced Scorecard for Internet-based organizations. Despite the collapse of the dot-com model, firms engaged in e-commerce continue to struggle to find new ways to account for their customer base, technology, employees, knowledge, etc., as part of the value of the firm. While some metrics, like the Balanced Scorecard, are geared towards internal use, others, like EVA, are for external use. Value-based metrics are used for performing internal audits as well as for comparing firms against one another, and can also be effectively utilized by individuals outside the firm looking to determine whether the firm is creating value for its stakeholders.

  13. Multi-site Study of Diffusion Metric Variability: Characterizing the Effects of Site, Vendor, Field Strength, and Echo Time using the Histogram Distance

    Science.gov (United States)

    Helmer, K. G.; Chou, M-C.; Preciado, R. I.; Gimi, B.; Rollins, N. K.; Song, A.; Turner, J.; Mori, S.

    2016-01-01

    MRI-based multi-site trials now routinely include some form of diffusion-weighted imaging (DWI) in their protocol. These studies can include data originating from scanners built by different vendors, each with their own set of unique protocol restrictions, including restrictions on the number of available gradient directions, on whether an externally-generated list of gradient directions can be used, and on the echo time (TE). One challenge of multi-site studies is to create a common imaging protocol that will result in a reliable and accurate set of diffusion metrics. The present study describes the effect of site, scanner vendor, field strength, and TE on two common metrics: the first moment of the diffusion tensor field (mean diffusivity, MD) and the fractional anisotropy (FA). We have shown in earlier work that ROI metrics and the means of MD and FA histograms are not sufficiently sensitive for use in site characterization. Here we use the distance between whole-brain histograms of FA and MD to investigate within- and between-site effects. We conclude that the variability of DTI metrics due to site, vendor, field strength, and echo time could influence the results in multi-center trials, and that the histogram distance is a sensitive metric for each of these variables. PMID:27350723
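
The histogram-distance idea can be sketched as follows; the Euclidean distance between normalized histograms is one simple choice (the study's exact distance may differ), and the beta-distributed "FA values" are synthetic stand-ins for two sites:

```python
import numpy as np

def histogram_distance(a, b, bins=64, value_range=(0.0, 1.0)):
    """Distance between two samples' normalized whole-brain histograms
    (Euclidean here, as one simple choice)."""
    ha, _ = np.histogram(a, bins=bins, range=value_range)
    hb, _ = np.histogram(b, bins=bins, range=value_range)
    ha = ha / ha.sum()
    hb = hb / hb.sum()
    return float(np.linalg.norm(ha - hb))

# Synthetic FA values for two "sites", with a small vendor/site shift:
rng = np.random.default_rng(1)
fa_site1 = rng.beta(2.0, 3.0, 20000)
fa_site2 = rng.beta(2.2, 3.0, 20000)
d_same = histogram_distance(fa_site1, fa_site1)
d_between = histogram_distance(fa_site1, fa_site2)
```

Unlike the histogram mean, which can stay unchanged under a shape change, the full-histogram distance responds to any distributional difference between sites.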

  15. Evaluating hydrological model performance using information theory-based metrics

    Science.gov (United States)

    Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m...

  16. Phylo_dCor: distance correlation as a novel metric for phylogenetic profiling.

    Science.gov (United States)

    Sferra, Gabriella; Fratini, Federica; Ponzi, Marta; Pizzi, Elisabetta

    2017-09-05

    Elaboration of powerful methods to predict functional and/or physical protein-protein interactions from genome sequences is one of the main tasks of the post-genomic era. Phylogenetic profiling allows the prediction of protein-protein interactions at a whole-genome level in both Prokaryotes and Eukaryotes, and for this reason it is considered one of the most promising methods. Here, we propose an improvement of phylogenetic profiling that enables the handling of large genomic datasets and the inference of global protein-protein interactions. This method uses the distance correlation as a new measure of phylogenetic profile similarity. We constructed and assessed robust reference sets and developed Phylo-dCor, a parallelized version of the algorithm for calculating the distance correlation that makes it applicable to large genomic data. Using Saccharomyces cerevisiae and Escherichia coli genome datasets, we showed that Phylo-dCor outperforms previously described phylogenetic profiling methods based on mutual information and Pearson's correlation as measures of profile similarity. Two R scripts that can be run on a wide range of machines are available upon request.
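
The distance correlation underlying Phylo-dCor can be computed directly from double-centred pairwise distance matrices (Székely's sample dCor); a minimal NumPy sketch, with invented profile values for illustration:

```python
import numpy as np

def distance_correlation(x, y):
    """Sample distance correlation (dCor) between two 1-D profiles."""
    x = np.asarray(x, float)[:, None]
    y = np.asarray(y, float)[:, None]
    a = np.abs(x - x.T)                                # pairwise distance matrices
    b = np.abs(y - y.T)
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()  # double centering
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2 = (A * B).mean()
    return np.sqrt(dcov2 / np.sqrt((A * A).mean() * (B * B).mean()))

# Hypothetical phylogenetic profiles (per-genome scores) for two proteins:
p1 = np.array([0.9, 0.1, 0.8, 0.2, 0.7, 0.1, 0.9, 0.3])
p2 = np.array([0.8, 0.2, 0.9, 0.1, 0.6, 0.2, 0.8, 0.2])  # similar profile to p1
dc12 = distance_correlation(p1, p2)
```

dCor lies in [0, 1], equals 1 for a profile against itself, and, unlike Pearson's correlation, is zero only when the profiles are independent.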

  17. US Rocket Propulsion Industrial Base Health Metrics

    Science.gov (United States)

    Doreswamy, Rajiv

    2013-01-01

    The number of active liquid rocket engine and solid rocket motor development programs has severely declined since the "space race" of the 1950s and 1960s. This downward trend has been exacerbated by the retirement of the Space Shuttle, the transition from the Constellation Program to the Space Launch System (SLS), and similar activity in DoD programs. In addition, with consolidation in the industry, the rocket propulsion industrial base (RPIB) is under stress. To improve the "health" of the RPIB, we need to understand the current condition of the RPIB, how this compares to past history, and the trend of RPIB health. This drives the need for a concise set of "metrics": analogous to the basic data a physician uses to determine the state of health of his patients, they should be easy to measure and collect, the trend is often more useful than the actual data point, and they can be used to focus on problem areas and develop preventative measures. The RPIB is the nation's capability to conceive, design, develop, manufacture, test, and support missions using liquid rocket engines and solid rocket motors that are critical to its national security, economic health and growth, and future scientific needs. The RPIB encompasses US government, academic, and commercial (including industry primes and their supplier base) research, development, test, evaluation, and manufacturing capabilities and facilities. It includes the skilled workforce, related intellectual property, engineering and support services, and supply chain operations and management. This definition touches the five main segments of the U.S. RPIB as categorized by the USG: defense, intelligence community, civil government, academia, and commercial sector.

  18. Comparison of luminance based metrics in different lighting conditions

    DEFF Research Database (Denmark)

    Wienold, J.; Kuhn, T.E.; Christoffersen, J.

    In this study, we evaluate established and newly developed metrics for predicting glare using data from three different research studies. The evaluation covers two different targets: 1. How well does the user's perception of glare magnitude correlate with the prediction of the glare metrics? 2. How well do the glare metrics describe the subjects' disturbance by glare? We applied Spearman correlations, logistic regressions and an accuracy evaluation based on an ROC analysis. The results show that five of the twelve investigated metrics fail at least one of the statistical tests. The other seven metrics, CGI, modified DGI, DGP, Ev, average luminance of the image Lavg, UGP and UGR, pass all statistical tests. DGP, CGI, DGI_mod and UGP have the largest AUC and might be slightly more robust. The accuracy of the predictions of the aforementioned seven metrics for the disturbance by glare lies...

  19. H-Metric: Characterizing Image Datasets via Homogenization Based on KNN-Queries

    Directory of Open Access Journals (Sweden)

    Welington M da Silva

    2012-01-01

    Full Text Available Precision-Recall is one of the main metrics for evaluating content-based image retrieval techniques. However, it does not provide an ample perception of the properties of an image dataset immersed in a metric space. In this work, we describe an alternative metric named H-Metric, which is determined along a sequence of controlled modifications of the image dataset. The process is named homogenization and works by altering the homogeneity characteristics of the classes of images. The result is a process that measures how hard it is to deal with a set of images with respect to content-based retrieval, offering support in the task of analyzing configurations of distance functions and of feature extractors.

  20. Distance-Based Phylogenetic Methods Around a Polytomy.

    Science.gov (United States)

    Davidson, Ruth; Sullivant, Seth

    2014-01-01

    Distance-based phylogenetic algorithms attempt to solve the NP-hard least-squares phylogeny problem by mapping an arbitrary dissimilarity map representing biological data to a tree metric. The set of all dissimilarity maps is a Euclidean space properly containing the space of all tree metrics as a polyhedral fan. Outputs of distance-based tree reconstruction algorithms such as UPGMA and neighbor-joining are points in the maximal cones in the fan. Tree metrics with polytomies lie at the intersections of maximal cones. A phylogenetic algorithm divides the space of all dissimilarity maps into regions based upon which combinatorial tree is reconstructed by the algorithm. Comparison of phylogenetic methods can be done by comparing the geometry of these regions. We use polyhedral geometry to compare the local nature of the subdivisions induced by least-squares phylogeny, UPGMA, and neighbor-joining when the true tree has a single polytomy with exactly four neighbors. Our results suggest that in some circumstances, UPGMA and neighbor-joining poorly match least-squares phylogeny.
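
To make the geometry concrete, here is a minimal UPGMA sketch, one of the distance-based algorithms compared above, mapping a dissimilarity matrix to a combinatorial tree; the matrix is a toy example and the nested-string output is just a convenient topology encoding:

```python
def upgma(dist, labels):
    """Minimal UPGMA: repeatedly merge the closest pair of clusters and
    average distances weighted by cluster size (illustrative sketch)."""
    active = {i: 1 for i in range(len(labels))}        # cluster id -> size
    tree = {i: labels[i] for i in range(len(labels))}  # cluster id -> nested string
    d = {(i, j): dist[i][j] for i in active for j in active if i < j}
    nxt = len(labels)
    while len(active) > 1:
        i, j = min(d, key=d.get)                       # closest pair of clusters
        tree[nxt] = "(%s,%s)" % (tree[i], tree[j])
        for k in list(active):
            if k not in (i, j):
                dik = d[(min(i, k), max(i, k))]
                djk = d[(min(j, k), max(j, k))]
                d[(k, nxt)] = (active[i] * dik + active[j] * djk) / (active[i] + active[j])
        for key in [key for key in d if i in key or j in key]:
            del d[key]
        active[nxt] = active.pop(i) + active.pop(j)
        nxt += 1
    return tree[nxt - 1]

# A dissimilarity matrix whose reconstructed topology is ((A,B),(C,D)):
dist = [[0, 2, 6, 6],
        [2, 0, 6, 6],
        [6, 6, 0, 4],
        [6, 6, 4, 0]]
topology = upgma(dist, ["A", "B", "C", "D"])
```

Perturbing this matrix slightly moves the input around inside (or across the boundary of) the region of dissimilarity maps that UPGMA sends to this topology, which is exactly the kind of subdivision the paper studies.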

  1. GPS Device Testing Based on User Performance Metrics

    Science.gov (United States)

    2015-10-02

    1. Rationale for a Test Program Based on User Performance Metrics ; 2. Roberson and Associates Test Program ; 3. Status of, and Revisions to, the Roberson and Associates Test Program ; 4. Comparison of Roberson and DOT/Volpe Programs

  2. Randomized Approaches for Nearest Neighbor Search in Metric Space When Computing the Pairwise Distance Is Extremely Expensive

    Science.gov (United States)

    Wang, Lusheng; Yang, Yong; Lin, Guohui

    Finding the closest object to a query in a database is a classical problem in computer science. For some modern biological applications, computing the similarity between two objects can be very time consuming; for example, it takes a long time to compute the edit distance between two whole chromosomes or the alignment cost of two 3D protein structures. In this paper, we study the nearest neighbor search problem in metric space, where the pairwise distances between objects in the database are known and we want to minimize the number of distances computed on-line between the query and objects in the database in order to find the closest object. We have designed two randomized approaches for indexing metric space databases, where objects are described purely by their distances to each other. Analysis and experiments show that our approaches only need to compute the distances to O(log n) objects in order to find the closest object, where n is the total number of objects in the database.
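
The setting can be illustrated with a simple triangle-inequality pruning scheme (this is not the paper's randomized algorithm, only a sketch of how known pairwise distances reduce on-line distance computations): once d(q, i) is known, every object j with |d(q, i) − d(i, j)| greater than the current best can be discarded without computing d(q, j).

```python
def nearest_neighbor(query_dist, pairwise, n):
    """Return a closest object while computing as few on-line query distances
    as possible, pruning with the lower bound d(q, j) >= |d(q, i) - d(i, j)|."""
    candidates = set(range(n))
    best, best_d, n_computed = None, float("inf"), 0
    while candidates:
        i = candidates.pop()
        di = query_dist(i)           # the expensive on-line computation
        n_computed += 1
        if di < best_d:
            best, best_d = i, di
        candidates = {j for j in candidates if abs(di - pairwise[i][j]) <= best_d}
    return best, best_d, n_computed

# Toy metric space: points on a line under absolute difference.
points = [0.0, 1.0, 2.5, 7.0, 7.1, 9.0]
pairwise = [[abs(a - b) for b in points] for a in points]
query = 7.08
best, best_d, n_computed = nearest_neighbor(lambda i: abs(points[i] - query),
                                            pairwise, len(points))
```

The returned object is always a true nearest neighbor, since the pruning bound never exceeds the true distance; only the number of on-line computations varies.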

  3. Metrics for assessing retailers based on consumer perception

    Directory of Open Access Journals (Sweden)

    Klimin Anastasii

    2017-01-01

    Full Text Available The article suggests a new way of looking at trading platforms, which we call “metrics.” Metrics are a way to look at the point of sale largely from the buyer’s side. The buyer enters the store and makes buying decisions based on factors that the seller often does not consider, or considers only in part, because he “does not see” them, since he is not a buyer. The article proposes a classification of retailers, metrics, and a methodology for their determination, and presents the results of an audit of retailers in St. Petersburg using the proposed methodology.

  4. Energy-Based Metrics for Arthroscopic Skills Assessment.

    Science.gov (United States)

    Poursartip, Behnaz; LeBel, Marie-Eve; McCracken, Laura C; Escoto, Abelardo; Patel, Rajni V; Naish, Michael D; Trejos, Ana Luisa

    2017-08-05

    Minimally invasive skills assessment methods are essential in developing efficient surgical simulators and implementing consistent skills evaluation. Although numerous methods have been investigated in the literature, there is still a need to further improve the accuracy of surgical skills assessment. Energy expenditure can be an indication of motor skills proficiency. The goals of this study are to develop objective metrics based on energy expenditure, normalize these metrics, and investigate classifying trainees using these metrics. To this end, different forms of energy consisting of mechanical energy and work were considered and their values were divided by the related value of an ideal performance to develop normalized metrics. These metrics were used as inputs for various machine learning algorithms including support vector machines (SVM) and neural networks (NNs) for classification. The accuracy of the combination of the normalized energy-based metrics with these classifiers was evaluated through a leave-one-subject-out cross-validation. The proposed method was validated using 26 subjects at two experience levels (novices and experts) in three arthroscopic tasks. The results showed that there are statistically significant differences between novices and experts for almost all of the normalized energy-based metrics. The accuracy of classification using SVM and NN methods was between 70% and 95% for the various tasks. The results show that the normalized energy-based metrics and their combination with SVM and NN classifiers are capable of providing accurate classification of trainees. The assessment method proposed in this study can enhance surgical training by providing appropriate feedback to trainees about their level of expertise and can be used in the evaluation of proficiency.

  5. Gamut Volume Index: a color preference metric based on meta-analysis and optimized colour samples.

    Science.gov (United States)

    Liu, Qiang; Huang, Zheng; Xiao, Kaida; Pointer, Michael R; Westland, Stephen; Luo, M Ronnier

    2017-07-10

    A novel metric named the Gamut Volume Index (GVI) is proposed for evaluating the colour preference of lighting. This metric is based on the absolute gamut volume of optimized colour samples. The optimal colour set of the proposed metric was obtained by optimizing the weighted average correlation between the metric predictions and the subjective ratings from 8 psychophysical studies. The performance of 20 typical colour metrics was also investigated, including colour-difference-based, gamut-based, memory-based and combined metrics. It was found that the proposed GVI outperformed the existing counterparts, especially under conditions where the correlated colour temperatures differed.
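
GVI rests on the gamut volume spanned by rendered colour samples. As a self-contained illustration, here is the 2-D analogue, the gamut area of hypothetical chromaticity coordinates, computed from a convex hull (the real metric uses a 3-D colour-space volume and the study's optimized sample set):

```python
def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def gamut_area(chroma_points):
    """Area (shoelace formula) of the 2-D gamut spanned by colour samples,
    a planar stand-in for the 3-D gamut volume behind GVI."""
    h = convex_hull(chroma_points)
    return 0.5 * abs(sum(h[i][0] * h[(i + 1) % len(h)][1] -
                         h[(i + 1) % len(h)][0] * h[i][1] for i in range(len(h))))

# Hypothetical chromaticity coordinates of rendered colour samples:
samples = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0), (0.5, 0.5)]
area = gamut_area(samples)
```

Interior samples do not change the hull, so the measure depends only on the extremes of the rendered gamut, which is the property that makes gamut size a plausible preference predictor.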

  6. Clustering by Partitioning around Medoids using Distance-Based ...

    African Journals Online (AJOL)

    OLUWASOGO

    outperforms both the Euclidean and Manhattan distance metrics in certain situations. KEYWORDS: PAM ... version of a dataset, compare the quality of clusters obtained from the Euclidean ....

  7. A PEG Construction of LDPC Codes Based on the Betweenness Centrality Metric

    Directory of Open Access Journals (Sweden)

    BHURTAH-SEEWOOSUNGKUR, I.

    2016-05-01

    Full Text Available Progressive Edge Growth (PEG) constructions are usually based on optimizing the distance metric using various methods. In this work, however, the distance metric is replaced by a different one, namely the betweenness centrality metric, which was shown to enhance routing performance in wireless mesh networks. A new type of PEG construction for Low-Density Parity-Check (LDPC) codes is introduced based on the betweenness centrality metric borrowed from social networks terminology, given that the bipartite graph describing the LDPC code is analogous to a network of nodes. The algorithm is very efficient in filling edges on the bipartite graph by adding its connections in an edge-by-edge manner. The smallest graph size the new code could construct surpasses that obtained from a modified PEG algorithm, the RandPEG algorithm. To the best of the authors' knowledge, this paper produces the best regular column-weight-two LDPC graphs. In addition, the technique proves to be competitive in terms of error-correcting performance. When compared to MacKay, PEG and other recent modified-PEG codes, the algorithm gives better performance at high SNR due to its particular edge and local graph properties.

  8. Operator-based metric for nuclear operations automation assessment

    Energy Technology Data Exchange (ETDEWEB)

    Zacharias, G.L.; Miao, A.X.; Kalkan, A. [Charles River Analytics Inc., Cambridge, MA (United States)] [and others]

    1995-04-01

    Continuing advances in real-time computational capabilities will support enhanced levels of smart automation and AI-based decision-aiding systems in the nuclear power plant (NPP) control room of the future. To support development of these aids, we describe in this paper a research tool, and more specifically, a quantitative metric, to assess the impact of proposed automation/aiding concepts in a manner that can account for a number of interlinked factors in the control room environment. In particular, we describe a cognitive operator/plant model that serves as a framework for integrating the operator's information-processing capabilities with his procedural knowledge, to provide insight as to how situations are assessed by the operator, decisions made, procedures executed, and communications conducted. Our focus is on the situation assessment (SA) behavior of the operator, the development of a quantitative metric reflecting overall operator awareness, and the use of this metric in evaluating automation/aiding options. We describe the results of a model-based simulation of a selected emergency scenario, and metric-based evaluation of a range of contemplated NPP control room automation/aiding options. The results demonstrate the feasibility of model-based analysis of contemplated control room enhancements, and highlight the need for empirical validation.

  9. Distance-Based Opportunistic Mobile Data Offloading.

    Science.gov (United States)

    Lu, Xiaofeng; Lio, Pietro; Hui, Pan

    2016-06-15

    Cellular network data traffic can be offloaded onto opportunistic networks. This paper proposes a Distance-based Opportunistic Publish/Subscribe (DOPS) content dissemination model, which is composed of three layers: an application layer, a decision-making layer and a network layer. When a user wants new content, he/she subscribes on a subscription server. Users having the content decide whether to deliver it to the subscriber based on distance information: if the content owner has traveled farther in the immediate past than the current distance between the owner and the subscriber, the owner sends the content to the subscriber through opportunistic routing. Simulations provide an evaluation of the data traffic offloading efficiency of DOPS.
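
The delivery rule paraphrased above can be sketched directly; positions, traces and units are invented for illustration, and the real model's layering and routing are omitted:

```python
def path_length(trace):
    """Total distance traveled over a recent (x, y) position trace."""
    return sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
               for (x1, y1), (x2, y2) in zip(trace, trace[1:]))

def should_deliver(owner_trace, owner_pos, subscriber_pos):
    """DOPS-style decision: deliver via opportunistic routing only if the owner
    has recently traveled farther than its current distance to the subscriber."""
    dist = ((owner_pos[0] - subscriber_pos[0]) ** 2 +
            (owner_pos[1] - subscriber_pos[1]) ** 2) ** 0.5
    return path_length(owner_trace) > dist

mobile_owner = [(0, 0), (300, 400), (600, 800)]   # traveled 1000 m recently
static_owner = [(0, 0), (30, 40)]                 # traveled 50 m recently
subscriber = (900, 800)
d1 = should_deliver(mobile_owner, mobile_owner[-1], subscriber)
d2 = should_deliver(static_owner, static_owner[-1], subscriber)
```

The intuition behind the rule: a highly mobile owner is likely to come within contact range of the subscriber soon, so handing it the content is a good bet, while a static distant owner is not.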

  11. MESUR: USAGE-BASED METRICS OF SCHOLARLY IMPACT

    Energy Technology Data Exchange (ETDEWEB)

    BOLLEN, JOHAN [Los Alamos National Laboratory; RODRIGUEZ, MARKO A. [Los Alamos National Laboratory; VAN DE SOMPEL, HERBERT [Los Alamos National Laboratory

    2007-01-30

    The evaluation of scholarly communication items is now largely a matter of expert opinion or of metrics derived from citation data. Both approaches can fail to take into account the myriad of factors that shape scholarly impact. Usage data has emerged as a promising complement to existing methods of assessment, but the formal groundwork to reliably and validly apply usage-based metrics of scholarly impact is lacking. The Andrew W. Mellon Foundation-funded MESUR project constitutes a systematic effort to define, validate and cross-validate a range of usage-based metrics of scholarly impact by creating a semantic model of the scholarly communication process. The constructed model will serve as the basis for creating a large-scale semantic network that seamlessly relates citation, bibliographic and usage data from a variety of sources. A subsequent program that uses the established semantic network as a reference data set will determine the characteristics and semantics of a variety of usage-based metrics of scholarly impact. This paper outlines the architecture and methodology adopted by the MESUR project and its future direction.

  12. Auto-associative Kernel Regression Model with Weighted Distance Metric for Instrument Drift Monitoring

    International Nuclear Information System (INIS)

    Shin, Ho Cheol; Park, Moon Ghu; You, Skin

    2006-01-01

    Recently, many on-line approaches to instrument channel surveillance (drift monitoring and fault detection) have been reported worldwide. On-line monitoring (OLM) methods evaluate instrument channel performance by assessing its consistency with other plant indications through parametric or non-parametric models. The heart of an OLM system is the model giving an estimate of the true process parameter value against individual measurements. This model gives a process parameter estimate calculated as a function of other plant measurements, which can be used to identify small sensor drifts that would require the sensor to be manually calibrated or replaced. This paper describes an improvement of auto-associative kernel regression (AAKR) obtained by introducing a correlation coefficient weighting on kernel distances. The prediction performance of the developed method is compared with that of conventional auto-associative kernel regression.
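
The AAKR idea can be sketched as follows: the estimate of the true process state is a kernel-weighted average of historical observations, and a per-channel weight vector (for example, correlation coefficients, as in the modification described above) scales each channel's contribution to the distance. All data and parameter values here are synthetic:

```python
import numpy as np

def aakr_predict(X_train, x_query, bandwidth=1.0, weights=None):
    """Auto-associative kernel regression sketch: Gaussian kernel on a
    (optionally channel-weighted) distance to historical observations."""
    if weights is None:
        weights = np.ones(X_train.shape[1])
    d2 = np.sum(((X_train - x_query) * weights) ** 2, axis=1)
    k = np.exp(-d2 / (2.0 * bandwidth ** 2))
    return k @ X_train / k.sum()

# Three redundant channels reading the same process (channel i reads i * t):
rng = np.random.default_rng(3)
t = rng.uniform(0.0, 1.0, size=(200, 1))
X = np.hstack([t, 2 * t, 3 * t]) + rng.normal(0.0, 0.01, (200, 3))

query = np.array([0.5, 1.0, 1.7])   # channel 3 has drifted (consistent value ~1.5)
est = aakr_predict(X, query, bandwidth=0.5)
```

The drifted third channel is pulled back toward the value consistent with the other two channels; the residual between the measurement and this estimate is what flags sensor drift.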

  13. Managing distance and covariate information with point-based clustering

    Directory of Open Access Journals (Sweden)

    Peter A. Whigham

    2016-09-01

    Full Text Available Abstract Background Geographic perspectives of disease and the human condition often involve point-based observations and questions of clustering or dispersion within a spatial context. These problems involve a finite set of point observations and are constrained by a larger, but finite, set of locations where the observations could occur. Developing a rigorous method for pattern analysis in this context requires handling spatial covariates, a method for constrained finite spatial clustering, and addressing bias in geographic distance measures. An approach, based on Ripley’s K and applied to the problem of clustering with deliberate self-harm (DSH), is presented. Methods Point-based Monte-Carlo simulation of Ripley’s K, accounting for socio-economic deprivation and sources of distance measurement bias, was developed to estimate clustering of DSH at a range of spatial scales. A rotated Minkowski L1 distance metric allowed variation in physical distance and clustering to be assessed. Self-harm data was derived from an audit of 2 years’ emergency hospital presentations (n = 136) in a New Zealand town (population ~50,000). Study area was defined by residential (housing) land parcels representing a finite set of possible point addresses. Results Area-based deprivation was spatially correlated. Accounting for deprivation and distance bias showed evidence for clustering of DSH for spatial scales up to 500 m with a one-sided 95 % CI, suggesting that social contagion may be present for this urban cohort. Conclusions Many problems involve finite locations in geographic space that require estimates of distance-based clustering at many scales. A Monte-Carlo approach to Ripley’s K, incorporating covariates and models for distance bias, is crucial when assessing health-related clustering. The case study showed that social network structure defined at the neighbourhood level may account for aspects of neighbourhood clustering of DSH. Accounting for
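    The constrained Monte-Carlo idea (resampling events from a finite set of candidate addresses to build a null distribution for Ripley's K) can be sketched as follows. The naive K estimator, the parcel-sampling null, and all names here are illustrative assumptions; the study's edge corrections, distance-bias model and covariate adjustment are not reproduced.

```python
import numpy as np

def ripley_k(points, r, area):
    """Naive Ripley's K at radius r (no edge correction): the average number
    of other points within distance r, scaled by the point intensity."""
    n = len(points)
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    pairs = (d < r).sum() - n            # exclude self-pairs on the diagonal
    lam = n / area
    return pairs / (n * lam)

def mc_envelope(candidates, n_events, r, area, n_sim=99, rng=None):
    """Monte-Carlo null: repeatedly sample n_events addresses from the finite
    set of candidate locations (the 'residential parcels' idea above) and
    collect K(r) under randomness constrained to those locations."""
    rng = np.random.default_rng(rng)
    sims = []
    for _ in range(n_sim):
        idx = rng.choice(len(candidates), size=n_events, replace=False)
        sims.append(ripley_k(candidates[idx], r, area))
    return np.percentile(sims, [2.5, 97.5])
```

An observed K(r) above the simulated envelope at some radius r is then evidence of clustering at that spatial scale.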

  14. Turbulence Hazard Metric Based on Peak Accelerations for Jetliner Passengers

    Science.gov (United States)

    Stewart, Eric C.

    2005-01-01

    Calculations are made of the approximate hazard due to peak normal accelerations of an airplane flying through a simulated vertical wind field associated with a convective frontal system. The calculations are based on a hazard metric developed from a systematic application of a generic math model to 1-cosine discrete gusts of various amplitudes and gust lengths. The math model simulates the three-degree-of-freedom longitudinal rigid body motion to vertical gusts and includes (1) fuselage flexibility, (2) the lag in the downwash from the wing to the tail, (3) gradual lift effects, (4) a simplified autopilot, and (5) motion of an unrestrained passenger in the rear cabin. Airplane and passenger response contours are calculated for a matrix of gust amplitudes and gust lengths. The airplane response contours are used to develop an approximate hazard metric of peak normal accelerations as a function of gust amplitude and gust length. The hazard metric is then applied to a two-dimensional simulated vertical wind field of a convective frontal system. The variations of the hazard metric with gust length and airplane heading are demonstrated.
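    A standard 1-cosine discrete gust, the input family the hazard metric is built from, ramps smoothly from zero up to the gust amplitude and back over one gust length: w(x) = (A/2)(1 - cos(2πx/L)) for 0 ≤ x ≤ L. A minimal sketch (parameter names are illustrative):

```python
import numpy as np

def one_minus_cosine_gust(x, amplitude, gust_length):
    """Vertical gust velocity of a '1-cosine' discrete gust: zero at the
    edges, peaking at `amplitude` at the gust midpoint (standard
    discrete-gust shape; names are illustrative)."""
    w = 0.5 * amplitude * (1.0 - np.cos(2.0 * np.pi * x / gust_length))
    return np.where((x >= 0) & (x <= gust_length), w, 0.0)
```

Sweeping `amplitude` and `gust_length` over a grid is what produces the response contours described in the abstract.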

  15. PQSM-based RR and NR video quality metrics

    Science.gov (United States)

    Lu, Zhongkang; Lin, Weisi; Ong, Eeping; Yang, Xiaokang; Yao, Susu

    2003-06-01

    This paper presents a new and general concept, PQSM (Perceptual Quality Significance Map), to be used in measuring visual distortion. It makes use of the selectivity characteristic of the HVS (Human Visual System), which pays more attention to certain areas/regions of a visual signal due to one or more of the following factors: salient features in the image/video, cues from domain knowledge, and association of other media (e.g., speech or audio). A PQSM is an array whose elements represent the relative perceptual-quality significance levels of the corresponding areas/regions of an image or video. Due to its generality, the PQSM can be incorporated into any visual distortion metric: to improve the effectiveness and/or efficiency of perceptual metrics, or even to enhance a PSNR-based metric. A three-stage PQSM estimation method is also proposed in this paper, with an implementation of motion, texture, luminance, skin-color and face mapping. Experimental results show the scheme can improve the performance of current image/video distortion metrics.

  16. Stochastic Learning of Multi-Instance Dictionary for Earth Mover's Distance based Histogram Comparison

    OpenAIRE

    Fan, Jihong; Liang, Ru-Ze

    2016-01-01

    Dictionary plays an important role in multi-instance data representation. It maps bags of instances to histograms. Earth mover's distance (EMD) is the most effective histogram distance metric for the application of multi-instance retrieval. However, up to now, there are no existing multi-instance dictionary learning methods designed for EMD-based histogram comparison. To fill this gap, we develop the first EMD-optimal dictionary learning method using a stochastic optimization method. In the stoc...

  17. Evaluation of Vehicle-Based Crash Severity Metrics.

    Science.gov (United States)

    Tsoi, Ada H; Gabler, Hampton C

    2015-01-01

    Vehicle change in velocity (delta-v) is a widely used crash severity metric for estimating occupant injury risk. Despite its widespread use, delta-v has several limitations. Of most concern, delta-v is a vehicle-based metric that does not consider the crash pulse or the performance of occupant restraints, e.g. seatbelts and airbags. Such criticisms have prompted the search for alternative impact severity metrics based upon vehicle kinematics. The purpose of this study was to assess the ability of the occupant impact velocity (OIV), acceleration severity index (ASI), vehicle pulse index (VPI), and maximum delta-v to predict serious injury in real-world crashes. The study was based on the analysis of event data recorders (EDRs) downloaded from the National Automotive Sampling System / Crashworthiness Data System (NASS-CDS) 2000-2013 cases. All vehicles in the sample were GM passenger cars and light trucks involved in a frontal collision. Rollover crashes were excluded. Vehicles were restricted to single-event crashes that caused an airbag deployment. All EDR data were checked for a successful, completed recording of the event and that the crash pulse was complete. The maximum abbreviated injury scale (MAIS) was used to describe occupant injury outcome. Drivers were categorized into either a non-seriously injured group (MAIS2-) or a seriously injured group (MAIS3+), based on the severity of any injuries to the thorax, abdomen, and spine. ASI and OIV were calculated according to the Manual for Assessing Safety Hardware. VPI was calculated according to ISO/TR 12353-3, with vehicle-specific parameters determined from U.S. New Car Assessment Program crash tests. Using binary logistic regression, the cumulative probability of injury risk was determined for each metric and assessed for statistical significance, goodness-of-fit, and prediction accuracy. The dataset included 102,744 vehicles. A Wald chi-square test showed each vehicle-based crash severity metric

  18. Moving from gamma passing rates to patient DVH-based QA metrics in pretreatment dose QA

    Energy Technology Data Exchange (ETDEWEB)

    Zhen, Heming; Nelms, Benjamin E.; Tome, Wolfgang A. [Department of Medical Physics, University of Wisconsin, Madison, Wisconsin 53705 (United States); Department of Human Oncology, University of Wisconsin, Madison, Wisconsin 53792 and Canis Lupus LLC, Merrimac, Wisconsin 53561 (United States); Department of Medical Physics, University of Wisconsin, Madison, Wisconsin 53705 and Department of Human Oncology, University of Wisconsin, Madison, Wisconsin 53792 (United States)

    2011-10-15

    Purpose: The purpose of this work is to explore the usefulness of the gamma passing rate metric for per-patient, pretreatment dose QA and to validate a novel patient-dose/DVH-based method and its accuracy and correlation. Specifically, correlations between: (1) gamma passing rates for three 3D dosimeter detector geometries vs clinically relevant patient DVH-based metrics; (2) gamma passing rates of whole patient dose grids vs DVH-based metrics; (3) gamma passing rates filtered by region of interest (ROI) vs DVH-based metrics; and (4) the capability of a novel software algorithm that estimates corrected patient dose-DVH based on conventional phantom QA data are analyzed. Methods: Ninety-six unique "imperfect" step-and-shoot IMRT plans were generated by applying four different types of errors to 24 clinical head/neck patients. The 3D patient doses as well as the dose to a cylindrical QA phantom were then recalculated using an error-free beam model to serve as a simulated measurement for comparison. Resulting deviations to the planned vs simulated measured DVH-based metrics were generated, as were gamma passing rates for a variety of difference/distance criteria covering: dose-in-phantom comparisons and dose-in-patient comparisons, with the in-patient results calculated both over the whole grid and per-ROI volume. Finally, patient dose and DVH were predicted using the conventional per-beam planar data as input into a commercial "planned dose perturbation" (PDP) algorithm, and the results of these predicted DVH-based metrics were compared to the known values. Results: A range of weak to moderate correlations were found between clinically relevant patient DVH metrics (CTV-D95, parotid D_mean, spinal cord D1cc, and larynx D_mean) and both 3D detector and 3D patient gamma passing rate (3%/3 mm, 2%/2 mm) for dose-in-phantom along with dose-in-patient for both whole patient volume and filtered per-ROI. There was

  19. Towards Video Quality Metrics Based on Colour Fractal Geometry

    Directory of Open Access Journals (Sweden)

    Richard Noël

    2010-01-01

    Full Text Available Vision is a complex process that integrates multiple aspects of an image: spatial frequencies, topology and colour. Unfortunately, so far, all these elements have been independently taken into consideration in the development of image and video quality metrics; therefore we propose an approach that blends all of them together. Our approach allows for the analysis of the complexity of colour images in the RGB colour space, based on the probabilistic algorithm for calculating the fractal dimension and lacunarity. Given that all the existing fractal approaches are defined only for gray-scale images, we extend them to the colour domain. We show how these two colour fractal features capture the multiple aspects that characterize the degradation of the video signal, based on the hypothesis that the quality degradation perceived by the user is directly proportional to the modification of the fractal complexity. We claim that the two colour fractal measures can objectively assess the quality of the video signal and can be used as metrics for user-perceived video quality degradation; we validated them through experimental results obtained for an MPEG-4 video streaming application. Finally, the results are compared against those given by unanimously accepted metrics and subjective tests.

  20. Distance Ranging Based on Quantum Entanglement

    International Nuclear Information System (INIS)

    Xiao Jun-Jun; Han Xiao-Chun; Zeng Gui-Hua; Fang Chen; Zhao Jian-Kang

    2013-01-01

    In quantum metrology, applications of quantum techniques based on entanglement bring better performance than conventional approaches. We experimentally investigate an application of entanglement to accurate ranging based on second-order coherence in the time domain. By a fitting algorithm in the data processing, the optimization results show a precision of ±200 μm at a distance of 1043.3 m. In addition, the influence of jamming noise on the ranging scheme is studied. With some different fitting parameters, the result shows that the proposed scheme has a powerful anti-jamming capability for white noise.

  1. On using multiple routing metrics with destination sequenced distance vector protocol for MultiHop wireless ad hoc networks

    Science.gov (United States)

    Mehic, M.; Fazio, P.; Voznak, M.; Partila, P.; Komosny, D.; Tovarek, J.; Chmelikova, Z.

    2016-05-01

    A mobile ad hoc network is a collection of mobile nodes which communicate without a fixed backbone or centralized infrastructure. Due to the frequent mobility of nodes, routes connecting two distant nodes may change. Therefore, it is not possible to establish a priori fixed paths for message delivery through the network. Because of its importance, routing is the most studied problem in mobile ad hoc networks. In addition, if Quality of Service (QoS) is demanded, one must guarantee the QoS not only over a single hop but over an entire wireless multi-hop path, which may not be a trivial task. In turn, this requires the propagation of QoS information within the network. The key to the support of QoS reporting is QoS routing, which provides path QoS information at each source. To support QoS for real-time traffic one needs to know not only the minimum delay on the path to the destination but also the bandwidth available on it. Therefore, throughput, end-to-end delay, and routing overhead are the traditional performance metrics used to evaluate the performance of a routing protocol. To obtain additional information about a link, most quality-link metrics are based on calculating link loss probabilities by broadcasting probe packets. In this paper, we address the problem of including multiple routing metrics in existing routing packets that are broadcast through the network. We evaluate the efficiency of such an approach with a modified version of the DSDV routing protocol in the ns-3 simulator.

  2. Semantic metrics

    OpenAIRE

    Hu, Bo; Kalfoglou, Yannis; Dupplaw, David; Alani, Harith; Lewis, Paul; Shadbolt, Nigel

    2006-01-01

    In the context of the Semantic Web, many ontology-related operations, e.g. ontology ranking, segmentation, alignment, articulation, reuse, evaluation, can be boiled down to one fundamental operation: computing the similarity and/or dissimilarity among ontological entities, and in some cases among ontologies themselves. In this paper, we review standard metrics for computing distance measures and we propose a series of semantic metrics. We give a formal account of semantic metrics drawn from a...

  3. Term Based Comparison Metrics for Controlled and Uncontrolled Indexing Languages

    Science.gov (United States)

    Good, B. M.; Tennis, J. T.

    2009-01-01

    Introduction: We define a collection of metrics for describing and comparing sets of terms in controlled and uncontrolled indexing languages and then show how these metrics can be used to characterize a set of languages spanning folksonomies, ontologies and thesauri. Method: Metrics for term set characterization and comparison were identified and…

  4. Distinguishability notion based on Wootters statistical distance: Application to discrete maps

    Science.gov (United States)

    Gomez, Ignacio S.; Portesi, M.; Lamberti, P. W.

    2017-08-01

    We study the distinguishability notion given by Wootters for states represented by probability density functions. This presents the particularity that it can also be used for defining a statistical distance in chaotic unidimensional maps. Based on that definition, we provide a metric d̄ for an arbitrary discrete map. Moreover, from d̄, we associate a metric space with each invariant density of a given map, which turns out to be the set of all distinguished points when the number of iterations of the map tends to infinity. Also, we give a characterization of the wandering set of a map in terms of the metric d̄, which allows us to identify the dissipative regions in the phase space. We illustrate the results in the case of the logistic and the circle maps numerically and analytically, and we obtain d̄ and the wandering set for some characteristic values of their parameters. Finally, an extension of the associated metric space to arbitrary probability distributions (not necessarily invariant densities) is given along with some consequences. The statistical properties of distributions given by histograms are characterized in terms of the cardinality of the associated metric space. For two conjugate variables, the uncertainty principle is expressed in terms of the diameters of the metric spaces associated with those variables.
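    For discrete probability distributions, Wootters' statistical distance is the angle arccos(Σ_i √(p_i q_i)), i.e. the arccosine of the Bhattacharyya coefficient. A minimal sketch of this base quantity (the paper's map-specific metric d̄ is built on top of it):

```python
import numpy as np

def wootters_distance(p, q):
    """Wootters statistical distance between two discrete probability
    distributions: zero iff p == q, and maximal (pi/2) for distributions
    with disjoint support."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    bc = np.sqrt(p * q).sum()                  # Bhattacharyya coefficient
    return np.arccos(np.clip(bc, -1.0, 1.0))   # clip guards rounding noise
```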

  5. Metrics-based assessments of research: incentives for 'institutional plagiarism'?

    Science.gov (United States)

    Berry, Colin

    2013-06-01

    The issue of plagiarism--claiming credit for work that is not one's own--rightly continues to cause concern in the academic community. An analysis is presented that shows the effects that may arise from metrics-based assessments of research, when credit for an author's outputs (chiefly publications) is given to an institution that did not support the research but which subsequently employs the author. The incentives for what is termed here "institutional plagiarism" are demonstrated with reference to the UK Research Assessment Exercise, in which submitting units of assessment are shown in some instances to derive around twice the credit for papers produced elsewhere by new recruits, compared to papers produced 'in-house'.

  6. Analysis and Comparison of Information Theory-based Distances for Genomic Strings

    Science.gov (United States)

    Balzano, Walter; Cicalese, Ferdinando; Del Sorbo, Maria Rosaria; Vaccaro, Ugo

    2008-07-01

    Genomic string comparison via alignment is widely applied for mining and retrieval of information in biological databases. In some situations, the effectiveness of such alignment-based comparison is still unclear, e.g., for sequences with non-uniform length and with significant shuffling of identical substrings. An alternative approach is one based on information-theoretic distances. Biological information content is stored in very long strings of only four characters. In the last ten years, several entropic measures have been proposed for genomic string analysis. Notwithstanding their individual merit and experimental validation, to the best of our knowledge, there is no direct comparison of these different metrics. We present four of the most representative alignment-free distance measures, based on mutual information. Each one has a different origin and expression. Our comparison reduces the different concepts to a unique formalism, so that a phylogenetic tree could be constructed for each of them. The trees produced via these metrics are compared to the ones widely accepted as biologically validated. In general the results provided more evidence of the reliability of the alignment-free distance models. Also, we observe that one of the metrics appeared to be more robust than the other three. We believe that this result can be the object of further research and observations. Many of the results of the experimentation, the graphics and the tables are available at the following URL: http://people.na.infn.it/~wbalzano/BIO
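    As a concrete example of an alignment-free, information-theoretic string distance, one can compare k-mer frequency profiles with the Jensen-Shannon distance. This is illustrative only, and not necessarily one of the four mutual-information measures compared in the paper:

```python
import math
from collections import Counter

def kmer_profile(s, k=3):
    """Empirical frequency distribution of the k-mers of a string."""
    counts = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def js_distance(s1, s2, k=3):
    """Jensen-Shannon distance (in bits, so bounded by 1) between the k-mer
    profiles of two strings: a simple alignment-free distance."""
    p, q = kmer_profile(s1, k), kmer_profile(s2, k)
    words = set(p) | set(q)
    m = {w: (p.get(w, 0) + q.get(w, 0)) / 2 for w in words}

    def kl(a, b):  # Kullback-Leibler divergence restricted to a's support
        return sum(a.get(w, 0) * math.log2(a.get(w, 0) / b[w])
                   for w in words if a.get(w, 0) > 0)

    return math.sqrt((kl(p, m) + kl(q, m)) / 2)
```

Because it only looks at k-mer statistics, the distance is insensitive to the substring shuffling that defeats alignment-based comparison.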

  7. Censoring distances based on labeled cortical distance maps in cortical morphometry.

    Science.gov (United States)

    Ceyhan, Elvan; Nishino, Tomoyuki; Alexopolous, Dimitrios; Todd, Richard D; Botteron, Kelly N; Miller, Michael I; Ratnanather, J Tilak

    2013-01-01

    It has been demonstrated that shape differences in cortical structures may be manifested in neuropsychiatric disorders. Such morphometric differences can be measured by labeled cortical distance mapping (LCDM) which characterizes the morphometry of the laminar cortical mantle of cortical structures. LCDM data consist of signed/labeled distances of gray matter (GM) voxels with respect to GM/white matter (WM) surface. Volumes and other summary measures for each subject and the pooled distances can help determine the morphometric differences between diagnostic groups, however they do not reveal all the morphometric information contained in LCDM distances. To extract more information from LCDM data, censoring of the pooled distances is introduced for each diagnostic group where the range of LCDM distances is partitioned at a fixed increment size; and at each censoring step, the distances not exceeding the censoring distance are kept. Censored LCDM distances inherit the advantages of the pooled distances but also provide information about the location of morphometric differences which cannot be obtained from the pooled distances. However, at each step, the censored distances aggregate, which might confound the results. The influence of data aggregation is investigated with an extensive Monte Carlo simulation analysis and it is demonstrated that this influence is negligible. As an illustrative example, GM of ventral medial prefrontal cortices (VMPFCs) of subjects with major depressive disorder (MDD), subjects at high risk (HR) of MDD, and healthy control (Ctrl) subjects are used. A significant reduction in laminar thickness of the VMPFC in MDD and HR subjects is observed compared to Ctrl subjects. Moreover, the GM LCDM distances (i.e., locations with respect to the GM/WM surface) for which these differences start to occur are determined. The methodology is also applicable to LCDM-based morphometric measures of other cortical structures affected by disease.

  8. Censoring Distances Based on Labeled Cortical Distance Maps in Cortical Morphometry

    Directory of Open Access Journals (Sweden)

    Elvan eCeyhan

    2013-10-01

    Full Text Available It has been demonstrated that shape differences are manifested in cortical structures due to neuropsychiatric disorders. Such morphometric differences can be measured by labeled cortical distance mapping (LCDM), which characterizes the morphometry of the laminar cortical mantle of cortical structures. LCDM data consist of signed/labeled distances of gray matter (GM) voxels with respect to the GM/white matter (WM) surface. Volumes and other summary measures for each subject and the pooled distances can help determine the morphometric differences between diagnostic groups, however they do not reveal all the morphometric information contained in LCDM distances. To extract more information from LCDM data, censoring of the pooled distances is introduced for each diagnostic group where the range of LCDM distances is partitioned at a fixed increment size; and at each censoring step, the distances not exceeding the censoring distance are kept. Censored LCDM distances inherit the advantages of the pooled distances but also provide information about the location of morphometric differences which cannot be obtained from the pooled distances. However, at each step, the censored distances aggregate, which might confound the results. The influence of data aggregation is investigated with an extensive Monte Carlo simulation analysis and it is demonstrated that this influence is negligible. As an illustrative example, GM of ventral medial prefrontal cortices (VMPFCs) of subjects with major depressive disorder (MDD), subjects at high risk (HR) of MDD, and healthy control (Ctrl) subjects are used. A significant reduction in laminar thickness of the VMPFC in MDD and HR subjects is observed compared to Ctrl subjects. Moreover, the GM LCDM distances (i.e., locations with respect to the GM/WM surface) for which these differences start to occur are determined. The methodology is also applicable to LCDM-based morphometric measures of other cortical structures affected by disease.

  9. Image retrieval by information fusion based on scalable vocabulary tree and robust Hausdorff distance

    Science.gov (United States)

    Che, Chang; Yu, Xiaoyang; Sun, Xiaoming; Yu, Boyang

    2017-12-01

    In recent years, Scalable Vocabulary Tree (SVT) has been shown to be effective in image retrieval. However, for general images where the foreground is the object to be recognized while the background is cluttered, the performance of the current SVT framework is restricted. In this paper, a new image retrieval framework that incorporates a robust distance metric and information fusion is proposed, which improves the retrieval performance relative to the baseline SVT approach. First, the visual words that represent the background are diminished by using a robust Hausdorff distance between different images. Second, image matching results based on three image signature representations are fused, which enhances the retrieval precision. We conducted intensive experiments on small-scale to large-scale image datasets: Corel-9, Corel-48, and PKU-198, where the proposed Hausdorff metric and information fusion outperforms the state-of-the-art methods by about 13, 15, and 15%, respectively.
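    A common way to make the Hausdorff distance robust to clutter is to replace the maximum of nearest-neighbour distances with a quantile (the "partial Hausdorff distance"). The sketch below uses that quantile form; the paper's exact robust formulation may differ:

```python
import numpy as np

def partial_hausdorff(A, B, frac=0.9):
    """Directed partial Hausdorff distance from point set A to point set B:
    the `frac`-quantile (instead of the maximum) of each A-point's distance
    to its nearest B-point, which makes the measure robust to outliers and
    background clutter."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
    nn = d.min(axis=1)               # nearest B-point for each A-point
    return np.quantile(nn, frac)

def robust_hausdorff(A, B, frac=0.9):
    """Symmetric robust Hausdorff distance between two point sets."""
    return max(partial_hausdorff(A, B, frac), partial_hausdorff(B, A, frac))
```

In the retrieval setting above, such a distance between local-feature sets can down-weight visual words generated by cluttered backgrounds.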

  10. Mahalanobis Distance Based Iterative Closest Point

    DEFF Research Database (Denmark)

    Hansen, Mads Fogtmann; Blas, Morten Rufus; Larsen, Rasmus

    2007-01-01

    the notion of a Mahalanobis distance map upon a point set with associated covariance matrices which in addition to providing correlation-weighted distance implicitly provides a method for assigning correspondence during alignment. This distance map provides an easy formulation of the ICP problem that permits...... a fast optimization. Initially, the covariance matrices are set to the identity matrix, and all shapes are aligned to a randomly selected shape (equivalent to standard ICP). From this point the algorithm iterates between the steps: (a) obtain mean shape and new estimates of the covariance matrices from...... the aligned shapes, (b) align shapes to the mean shape. Three different methods for estimating the mean shape with associated covariance matrices are explored in the paper. The proposed methods are validated experimentally on two separate datasets (IMM face dataset and femur-bones). The superiority of ICP
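    The per-point weighting that distinguishes this ICP variant from the standard one is the Mahalanobis distance to a correspondence with an associated covariance; with identity covariances it reduces to squared Euclidean distance, matching the initialisation described above. A minimal sketch (the function name is illustrative):

```python
import numpy as np

def mahalanobis_sq(x, mu, cov):
    """Squared Mahalanobis distance of point x to a correspondence point mu
    with covariance cov: (x - mu)^T cov^{-1} (x - mu). Directions in which
    the correspondence is uncertain (large variance) are penalised less."""
    diff = x - mu
    return float(diff @ np.linalg.solve(cov, diff))
```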

  11. Deep Correlated Holistic Metric Learning for Sketch-Based 3D Shape Retrieval.

    Science.gov (United States)

    Dai, Guoxian; Xie, Jin; Fang, Yi

    2018-07-01

    How to effectively retrieve desired 3D models with simple queries is a long-standing problem in the computer vision community. The model-based approach is quite straightforward but nontrivial, since people do not always have the desired 3D query model at hand. Recently, large numbers of wide-screen electronic devices have become prevalent in our daily lives, which makes sketch-based 3D shape retrieval a promising candidate due to its simplicity and efficiency. The main challenge of the sketch-based approach is the huge modality gap between sketch and 3D shape. In this paper, we propose a novel deep correlated holistic metric learning (DCHML) method to mitigate the discrepancy between the sketch and 3D shape domains. The proposed DCHML trains two distinct deep neural networks (one for each domain) jointly, which learn two deep nonlinear transformations to map features from both domains into a new feature space. The proposed loss, including a discriminative loss and a correlation loss, aims to increase the discrimination of features within each domain as well as the correlation between different domains. In the new feature space, the discriminative loss minimizes the intra-class distance of the deep transformed features and maximizes the inter-class distance of the deep transformed features to a large margin within each domain, while the correlation loss focuses on mitigating the distribution discrepancy across the different domains. Different from existing deep metric learning methods with a loss only at the output layer, our proposed DCHML is trained with losses at both the hidden layer and the output layer to further improve performance by encouraging features in the hidden layer to also have the desired properties. Our proposed method is evaluated on three benchmarks, including the 3D Shape Retrieval Contest 2013, 2014, and 2016 benchmarks, and the experimental results demonstrate the superiority of our proposed method over the state-of-the-art methods.

  12. Can Tweets Predict Citations? Metrics of Social Impact Based on Twitter and Correlation with Traditional Metrics of Scientific Impact

    Science.gov (United States)

    2011-01-01

    predictors (P < .001) could explain 27% of the variation of citations. Highly tweeted articles were 11 times more likely to be highly cited than less-tweeted articles (9/12 or 75% of highly tweeted article were highly cited, while only 3/43 or 7% of less-tweeted articles were highly cited; rate ratio 0.75/0.07 = 10.75, 95% confidence interval, 3.4–33.6). Top-cited articles can be predicted from top-tweeted articles with 93% specificity and 75% sensitivity. Conclusions Tweets can predict highly cited articles within the first 3 days of article publication. Social media activity either increases citations or reflects the underlying qualities of the article that also predict citations, but the true use of these metrics is to measure the distinct concept of social impact. Social impact measures based on tweets are proposed to complement traditional citation metrics. The proposed twimpact factor may be a useful and timely metric to measure uptake of research findings and to filter research findings resonating with the public in real time. PMID:22173204

  13. Antirandom Testing: A Distance-Based Approach

    Directory of Open Access Journals (Sweden)

    Shen Hui Wu

    2008-01-01

    Full Text Available Random testing requires each test to be selected randomly regardless of the tests previously applied. This paper introduces the concept of antirandom testing, where each test applied is chosen such that its total distance from all previous tests is maximal. This spans the test vector space to the maximum extent possible for a given number of vectors. An algorithm for generating antirandom tests is presented. Compared with traditional pseudorandom testing, antirandom testing is found to be very effective when high fault coverage needs to be achieved with a limited number of test vectors. The superiority of the new approach is even more significant for testing bridging faults.
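    The core of antirandom generation (pick the next test so that its total distance to all previously applied tests is maximal) can be sketched with Hamming distance over small vector widths using exhaustive search; this greedy sketch is illustrative only, not the paper's exact algorithm:

```python
def antirandom_sequence(n_bits, n_tests):
    """Greedy antirandom test generation: start from the all-zero vector and
    repeatedly pick the candidate whose total Hamming distance to all
    previously chosen vectors is maximal. Exhaustive over 2**n_bits
    candidates, so only practical for small n_bits."""
    chosen = [0]
    for _ in range(n_tests - 1):
        best, best_score = None, -1
        for cand in range(1 << n_bits):
            if cand in chosen:
                continue
            score = sum(bin(cand ^ prev).count("1") for prev in chosen)
            if score > best_score:
                best, best_score = cand, score
        chosen.append(best)
    return chosen
```

For example, after the all-zero vector the greedy choice is its bitwise complement, the single vector at maximal Hamming distance.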

  14. Quality Evaluation in Wireless Imaging Using Feature-Based Objective Metrics

    OpenAIRE

    Engelke, Ulrich; Zepernick, Hans-Jürgen

    2007-01-01

    This paper addresses the evaluation of image quality in the context of wireless systems using feature-based objective metrics. The considered metrics comprise a weighted combination of feature values that are used to quantify the extent to which the related artifacts are present in a processed image. In view of imaging applications in mobile radio and wireless communication systems, reduced-reference objective quality metrics are investigated for quantifying user-perceived quality. The exa...

  15. On the importance of the distance measures used to train and test knowledge-based potentials for proteins

    DEFF Research Database (Denmark)

    Carlsen, Martin; Koehl, Patrice; Røgen, Peter

    2014-01-01

    (PPD), while the other had the set of all distances filtered to reflect consistency in an ensemble of decoys (PPE). We tested four types of metric to characterize the distance between the decoy and the native structure, two based on extrinsic geometry (RMSD and GTD-TS*), and two based on intrinsic...... geometry (Q* and MT). The corresponding eight potentials were tested on a large collection of decoy sets. We found that it is usually better to train a potential using an intrinsic distance measure. We also found that PPE outperforms PPD, emphasizing the benefits of capturing consistent information...

  16. Connection Setup Signaling Scheme with Flooding-Based Path Searching for Diverse-Metric Network

    Science.gov (United States)

    Kikuta, Ko; Ishii, Daisuke; Okamoto, Satoru; Oki, Eiji; Yamanaka, Naoaki

    Connection setup on various computer networks is now achieved by GMPLS. This technology is based on the source-routing approach, which requires the source node to store metric information for the entire network prior to computing a route. Thus, all metric information must be distributed to all network nodes and kept up-to-date. However, as metric information becomes more diverse and generalized, keeping all of it current is difficult due to the huge update overhead. Emerging network services and applications require the network to support diverse metrics for achieving various communication qualities, and increasing the number of metrics supported by the network causes excessive processing of metric update messages. To reduce the number of metric update messages, another scheme is required. This paper proposes a connection setup scheme that uses flooding-based signaling rather than the distribution of metric information. The proposed scheme requires only the flooding of signaling messages carrying the requested metric information; no routing protocol is required. Evaluations confirm that the proposed scheme achieves connection establishment without excessive overhead. Our analysis shows that the proposed scheme greatly reduces the number of control messages compared to the conventional scheme, while their blocking probabilities are comparable.

  17. Grading the Metrics: Performance-Based Funding in the Florida State University System

    Science.gov (United States)

    Cornelius, Luke M.; Cavanaugh, Terence W.

    2016-01-01

    A policy analysis of Florida's 10-factor Performance-Based Funding system for state universities. The focus of the article is on the system of performance metrics developed by the state Board of Governors and their impact on institutions and their missions. The paper also discusses problems and issues with the metrics, their ongoing evolution, and…

  18. Evaluative Usage-Based Metrics for the Selection of E-Journals.

    Science.gov (United States)

    Hahn, Karla L.; Faulkner, Lila A.

    2002-01-01

    Explores electronic journal usage statistics and develops three metrics and three benchmarks based on those metrics. Topics include earlier work that assessed the value of print journals and was modified for the electronic format; the evaluation of potential purchases; and implications for standards development, including the need for content…

  19. Microservice scaling optimization based on metric collection in Kubernetes

    OpenAIRE

    Blažej, Aljaž

    2017-01-01

    As web applications become more complex and the number of internet users rises, so does the need to optimize the use of hardware supporting these applications. Optimization can be achieved with microservices, as they offer several advantages compared to the monolithic approach, such as better utilization of resources, scalability and isolation of different parts of an application. Another important part is collecting metrics, since they can be used for analysis and debugging as well as the ba...

  20. Metric-based method of software requirements correctness improvement

    Directory of Open Access Journals (Sweden)

    Yaremchuk Svitlana

    2017-01-01

    Full Text Available The work highlights the most important principles of software reliability management (SRM). The SRM concept forms the basis for developing a method of requirements correctness improvement. The method assumes that complicated requirements contain more actual and potential design faults/defects. It applies a new metric to evaluate requirements complexity, together with a double-sorting technique that ranks the priority and complexity of each requirement. The method improves requirements correctness by identifying a higher number of defects with restricted resources. Practical application of the proposed method in the course of requirements review yielded a tangible technical and economic effect.

  1. Analysis of Molecular Variance Inferred from Metric Distances among DNA Haplotypes: Application to Human Mitochondrial DNA Restriction Data

    OpenAIRE

    Excoffier, L.; Smouse, P. E.; Quattro, J. M.

    1992-01-01

    We present here a framework for the study of molecular variation within a single species. Information on DNA haplotype divergence is incorporated into an analysis of variance format, derived from a matrix of squared-distances among all pairs of haplotypes. This analysis of molecular variance (AMOVA) produces estimates of variance components and F-statistic analogs, designated here as φ-statistics, reflecting the correlation of haplotypic diversity at different levels of hierarchical subdivisi...

  2. Equivalence Testing of Complex Particle Size Distribution Profiles Based on Earth Mover's Distance.

    Science.gov (United States)

    Hu, Meng; Jiang, Xiaohui; Absar, Mohammad; Choi, Stephanie; Kozak, Darby; Shen, Meiyu; Weng, Yu-Ting; Zhao, Liang; Lionberger, Robert

    2018-04-12

    Particle size distribution (PSD) is an important property of particulates in drug products. In the evaluation of generic drug products formulated as suspensions, emulsions, and liposomes, PSD comparisons between a test product and the branded product can provide useful information regarding in vitro and in vivo performance. Historically, the FDA has recommended the population bioequivalence (PBE) statistical approach to compare the PSD descriptors D50 and SPAN of test and reference products to support product equivalence. In this study, the earth mover's distance (EMD) is proposed as a new metric for comparing PSDs, particularly when the PSD profile exhibits a complex distribution (e.g., multiple peaks) that is not accurately described by the D50 and SPAN descriptors. EMD is a statistical metric that measures the discrepancy (distance) between size distribution profiles without a prior assumption about the distribution. PBE is then adopted to perform a statistical test to establish equivalence based on the calculated EMD distances. Simulations show that the proposed EMD-based approach is effective in comparing test and reference profiles for equivalence testing and is superior to commonly used distance measures, e.g., the Euclidean and Kolmogorov-Smirnov distances. The proposed approach was demonstrated by evaluating the equivalence of cyclosporine ophthalmic emulsion PSDs that were manufactured under different conditions. Our results show that the proposed approach can effectively pass an equivalent product (e.g., the reference product against itself) and reject an inequivalent product (e.g., the reference product against a negative control), suggesting its usefulness in supporting the bioequivalence determination of a test product against a reference product when both possess multimodal PSDs.
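    For 1-D size distributions, the EMD reduces to the first Wasserstein distance, which SciPy computes directly. A minimal sketch with hypothetical profile data (the size grid, profile shapes, and the subsequent PBE test on replicate EMD values are not from the paper):

    ```python
    import numpy as np
    from scipy.stats import wasserstein_distance

    # Hypothetical size grid (µm) and normalized PSD profiles: a unimodal
    # reference vs. a bimodal test profile that D50/SPAN may fail to separate.
    sizes = np.linspace(0.1, 10.0, 200)
    ref = np.exp(-0.5 * ((sizes - 3.0) / 0.5) ** 2)
    test = 0.5 * np.exp(-0.5 * ((sizes - 2.5) / 0.4) ** 2) \
         + 0.5 * np.exp(-0.5 * ((sizes - 4.0) / 0.4) ** 2)
    ref, test = ref / ref.sum(), test / test.sum()

    # EMD between the two weighted distributions over the size grid;
    # it is zero only when the profiles coincide.
    emd = wasserstein_distance(sizes, sizes, u_weights=ref, v_weights=test)
    ```

    No distributional assumption is needed: the metric works directly on the measured histogram weights, which is what makes it suitable for multimodal profiles.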

  3. Calibration Base Lines for Electronic Distance Measuring Instruments (EDMI)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A calibration base line (CBL) is a precisely measured, straight-line course of approximately 1,400 m used to calibrate Electronic Distance Measuring Instruments...

  4. Algorithms for Speeding up Distance-Based Outlier Detection

    Data.gov (United States)

    National Aeronautics and Space Administration — The problem of distance-based outlier detection is difficult to solve efficiently in very large datasets because of potential quadratic time complexity. We address...

  5. Alternatives to accuracy and bias metrics based on percentage errors for radiation belt modeling applications

    Energy Technology Data Exchange (ETDEWEB)

    Morley, Steven Karl [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-01

    This report reviews existing literature describing forecast accuracy metrics, concentrating on those based on relative errors and percentage errors. We then review how the most common of these metrics, the mean absolute percentage error (MAPE), has been applied in recent radiation belt modeling literature. Finally, we describe metrics based on the ratios of predicted to observed values (the accuracy ratio) that address the drawbacks inherent in using MAPE. Specifically, we define and recommend the median log accuracy ratio as a measure of bias and the median symmetric accuracy as a measure of accuracy.
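    Both recommended measures derive from the log accuracy ratio ln Q, with Q the ratio of predicted to observed values. A brief sketch, using the definitions as commonly stated in this literature (the example numbers are illustrative):

    ```python
    import numpy as np

    def median_log_accuracy_ratio(pred, obs):
        """Bias measure: median of ln(Q), Q = predicted/observed.
        Zero means unbiased; the sign gives the direction of bias."""
        q = np.asarray(pred, float) / np.asarray(obs, float)
        return np.median(np.log(q))

    def median_symmetric_accuracy(pred, obs):
        """Accuracy as a percentage error that treats over- and
        under-prediction symmetrically, unlike MAPE."""
        q = np.asarray(pred, float) / np.asarray(obs, float)
        return 100.0 * (np.exp(np.median(np.abs(np.log(q)))) - 1.0)

    obs = [1.0, 2.0, 4.0]
    pred = [2.0, 1.0, 4.0]  # one 2x over-, one 2x under-prediction
    # median_log_accuracy_ratio  -> 0.0   (errors cancel: no net bias)
    # median_symmetric_accuracy  -> 100.0 (median factor-of-2 error)
    ```

    The symmetry is the point: MAPE penalizes a factor-of-2 over-prediction (100%) differently from a factor-of-2 under-prediction (50%), whereas the log ratio treats both as ln 2.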

  6. Evidence-based Metrics Toolkit for Measuring Safety and Efficiency in Human-Automation Systems

    Data.gov (United States)

    National Aeronautics and Space Administration — APRIL 2016 NOTE: Principal Investigator moved to Rice University in mid-2015. Project continues at Rice with the same title (Evidence-based Metrics Toolkit for...

  7. Metric learning

    CERN Document Server

    Bellet, Aurelien; Sebban, Marc

    2015-01-01

    Similarity between objects plays an important role in both human cognitive processes and artificial systems for recognition and categorization. How to appropriately measure such similarities for a given task is crucial to the performance of many machine learning, pattern recognition and data mining methods. This book is devoted to metric learning, a set of techniques to automatically learn similarity and distance functions from data that has attracted a lot of interest in machine learning and related fields in the past ten years. In this book, we provide a thorough review of the metric learnin

  8. Multi-metric model-based structural health monitoring

    Science.gov (United States)

    Jo, Hongki; Spencer, B. F.

    2014-04-01

    The inspection and maintenance of bridges of all types is critical to public safety and often critical to the economy of a region. Recent advanced sensor technologies provide accurate and easy-to-deploy means for structural health monitoring and, if the critical locations are known a priori, can be monitored by direct measurements. However, for today's complex civil infrastructure, the critical locations are numerous and often difficult to identify. This paper presents an innovative framework for structural monitoring at arbitrary locations on the structure combining computational models and limited physical sensor information. The use of multi-metric measurements is advocated to improve the accuracy of the approach. A numerical example is provided to illustrate the proposed hybrid monitoring framework, particularly focusing on fatigue life assessment of steel structures.

  9. An Improved EMD-Based Dissimilarity Metric for Unsupervised Linear Subspace Learning

    Directory of Open Access Journals (Sweden)

    Xiangchun Yu

    2018-01-01

    Full Text Available We investigate a novel way of robust face image feature extraction by adopting methods based on Unsupervised Linear Subspace Learning to extract a small number of good features. Firstly, the face image is divided into blocks of a specified size, and we propose and extract a pooled Histogram of Oriented Gradients (pHOG) over each block. Secondly, an improved Earth Mover's Distance (EMD) metric is adopted to measure the dissimilarity between blocks of one face image and the corresponding blocks of the remaining face images. Thirdly, considering the limitations of the original Locality Preserving Projections (LPP), we propose Block Structure LPP (BSLPP), which effectively preserves the structural information of face images. Finally, an adjacency graph is constructed and a small number of good features of a face image are obtained by methods based on Unsupervised Linear Subspace Learning. A series of experiments has been conducted on several well-known face databases to evaluate the effectiveness of the proposed algorithm. In addition, we construct versions of the AR and Extended Yale B face databases with noise, geometric distortion, slight translation, and slight rotation, and verify the robustness of the proposed algorithm under a certain degree of these disturbances.

  10. Image Inpainting Based on Coherence Transport with Adapted Distance Functions

    KAUST Repository

    März, Thomas

    2011-01-01

    We discuss an extension of our method image inpainting based on coherence transport. For the latter method the pixels of the inpainting domain have to be serialized into an ordered list. Until now, to induce the serialization we have used the distance to boundary map. But there are inpainting problems where the distance to boundary serialization causes unsatisfactory inpainting results. In the present work we demonstrate cases where we can resolve the difficulties by employing other distance functions which better suit the problem at hand. © 2011 Society for Industrial and Applied Mathematics.

  11. Long-distance configuration of FPGA based on serial communication

    International Nuclear Information System (INIS)

    Liu Xiang; Song Kezhu; Zhang Sifeng

    2010-01-01

    To solve FPGA configuration in some nuclear electronics, which works in radioactivity environment, the article introduces a way of long-distance configuration with PC and CPLD, based on serial communication. Taking CYCLONE series FPGA and EPCS configuration chip from ALTERA for example, and using the AS configuration mode, we described our design from the aspects of basic theory, hardware connection, software function and communication protocol. With this design, we could configure several FPGAs in the distance of 100 meters, or we could configure on FPGA in the distance of 150 meters. (authors)

  12. The difference between presence-based education and distance learning

    OpenAIRE

    Fernández Rodríguez, Mònica

    2002-01-01

    Attempts to define distance learning always involve comparisons with presence-based education, as the latter is the most direct reference that the former has. It is on this basis that the convergent points, similarities and differences of the two types of approach are established. This article opens with such a comparison, before going on to focus mainly on distance learning and to examine methodological strategies that should be borne in mind when implementing an e-learning system.

  13. Application of Technology in Project-Based Distance Learning

    Directory of Open Access Journals (Sweden)

    Ali Mehrabian

    2008-06-01

    Full Text Available Present technology and the accessibility of internet have made distance learning easier, more efficient, and more convenient for students. This technology allows instructors and students to communicate asynchronously, at times and locations of their own choosing, by exchanging printed or electronic information. The use of project-based approach is being recognized in the literature as a potential component of courses in the faculties of engineering, science, and technology. Instructors may have to restructure their course differently to accommodate and facilitate the effectiveness of distance learning. A project-based engineering course, traditionally taught in a classroom settings using live mode at the College of Engineering and Computer Sciences at the University of Central Florida (UCF has been transformed to a distance course taught using distance modes. In this case, pedagogical transitions and adjustments are required, in particular for obtaining an optimal balance between the course material and the project work. Project collaboration in groups requires communication, which is possible with extensive utilization of new information and communication technology, such as virtual meetings. This paper discusses the course transition from live to distance modes and touches on some issues as they relate to the effectiveness of this methodology and the lessons learned from its application within different context. More specifically, this discussion includes the benefit of implementing project-based work in the domain of the distance learning courses.

  14. Green Net Value Added as a Sustainability Metric Based on ...

    Science.gov (United States)

    Sustainability measurement in economics involves evaluation of environmental and economic impact in an integrated manner. In this study, system level economic data are combined with environmental impact from a life cycle assessment (LCA) of a common product. We are exploring a costing approach that captures traditional costs but also incorporates externality costs to provide a convenient, easily interpretable metric. Green Net Value Added (GNVA) is a type of full cost accounting that incorporates total revenue, the cost of materials and services, depreciation, and environmental externalities. Two, but not all, of the potential environmental impacts calculated by the standard LCIA method (TRACI) could be converted to externality cost values. We compute externality costs disaggregated by upstream sectors, full cost, and GNVA to evaluate the relative sustainability of Bounty® paper towels manufactured at two production facilities. We found that the longer running, more established line had a higher GNVA than the newer line. The dominant factors contributing to externality costs are calculated to come from the stationary sources in the supply chain: electricity generation (27-35%), refineries (20-21%), pulp and paper making (15-23%). Health related externalities from Particulate Matter (PM2.5) and Carbon Dioxide equivalent (CO2e) emissions appear largely driven by electricity usage and emissions by the facilities, followed by pulp processing and transport. Supply

  15. Countermeasure development using a formalised metric-based process

    Science.gov (United States)

    Barker, Laurence

    2008-10-01

    Guided weapons, are a potent threat to both air and surface platforms; to protect the platform, Countermeasures are often used to disrupt the operation of the tracking system. Development of effective techniques to defeat the guidance sensors is a complex activity. The countermeasure often responds to the behaviour of a responsive sensor system, creating a "closed loop" interaction. Performance assessment is difficult, and determining that enough knowledge exists to make a case that a platform is adequately protected is challenging. A set of metrics known as Countermeasure Confidence Levels (CCL) is described. These set out a measure of confidence in prediction of Countermeasure performance. The CCL scale provides, for the first time, a method to determine whether enough evidence exists to support development activity and introduction to operational service. Application of the CCL scale to development of a hypothetical countermeasure is described. This tracks how the countermeasure is matured from initial concept to in-service application. The purpose of each stage is described, together with a description of what work is likely to be needed. This will involve timely use of analysis, simulation, laboratory work and field testing. The use of the CCL scale at key decision points is described. These include procurement decision points, and entry-to-service decisions. Each stage requires collection of evidence of effectiveness. Completeness of the available evidence can be assessed, and duplication can be avoided. Read-across between concepts, weapon systems and platforms can be addressed and the impact of technology insertion can be assessed.

  16. Computations of Wall Distances Based on Differential Equations

    Science.gov (United States)

    Tucker, Paul G.; Rumsey, Chris L.; Spalart, Philippe R.; Bartels, Robert E.; Biedron, Robert T.

    2004-01-01

    The use of differential equations such as Eikonal, Hamilton-Jacobi and Poisson for the economical calculation of the nearest wall distance d, which is needed by some turbulence models, is explored. Modifications that could palliate some turbulence-modeling anomalies are also discussed. Economy is of especial value for deforming/adaptive grid problems. For these, ideally, d is repeatedly computed. It is shown that the Eikonal and Hamilton-Jacobi equations can be easy to implement when written in implicit (or iterated) advection and advection-diffusion equation analogous forms, respectively. These, like the Poisson Laplacian term, are commonly occurring in CFD solvers, allowing the re-use of efficient algorithms and code components. The use of the NASA CFL3D CFD program to solve the implicit Eikonal and Hamilton-Jacobi equations is explored. The re-formulated d equations are easy to implement, and are found to have robust convergence. For accurate Eikonal solutions, upwind metric differences are required. The Poisson approach is also found effective, and easiest to implement. Modified distances are not found to affect global outputs such as lift and drag significantly, at least in common situations such as airfoil flows.
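    The Eikonal equation |∇d| = 1 with d = 0 on walls can be solved on a structured grid by Gauss-Seidel sweeps with upwind differences (the fast-sweeping approach). A minimal 2-D sketch of that idea, not the CFL3D implementation described above:

    ```python
    import numpy as np

    def wall_distance(wall_mask, h=1.0, sweeps=4):
        """Nearest-wall distance d from the Eikonal equation |grad d| = 1,
        solved by alternating-direction Gauss-Seidel sweeps with upwind
        differences. wall_mask: boolean array, True where d = 0."""
        BIG = 1e10
        d = np.where(wall_mask, 0.0, BIG)
        ny, nx = d.shape
        for _ in range(sweeps):
            for si, sj in [(1, 1), (1, -1), (-1, 1), (-1, -1)]:
                for i in range(ny)[::si]:
                    for j in range(nx)[::sj]:
                        if wall_mask[i, j]:
                            continue
                        a = min(d[i - 1, j] if i > 0 else BIG,
                                d[i + 1, j] if i < ny - 1 else BIG)
                        b = min(d[i, j - 1] if j > 0 else BIG,
                                d[i, j + 1] if j < nx - 1 else BIG)
                        # upwind update for |grad d| = 1 on a uniform grid
                        if abs(a - b) >= h:
                            cand = min(a, b) + h
                        else:
                            cand = 0.5 * (a + b
                                          + np.sqrt(2 * h * h - (a - b) ** 2))
                        d[i, j] = min(d[i, j], cand)
        return d

    # Wall along the bottom row: distance grows linearly with row index.
    mask = np.zeros((5, 5), dtype=bool)
    mask[0, :] = True
    d = wall_distance(mask)   # d[i, j] == i for this configuration
    ```

    As the abstract notes, the appeal of this formulation is that the update stencil reuses machinery (advection-like upwinding) already present in CFD solvers, rather than requiring a geometric nearest-point search.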

  17. High Precision Infrared Temperature Measurement System Based on Distance Compensation

    Directory of Open Access Journals (Sweden)

    Chen Jing

    2017-01-01

    Full Text Available To meet the need for real-time remote monitoring of human body surface temperature in optical rehabilitation therapy, a non-contact, high-precision real-time temperature measurement method based on distance compensation was proposed, and the system design was carried out. The microcontroller controls the infrared temperature measurement module and the laser ranging module to collect temperature and distance data. The compensation formula for temperature as a function of distance was fitted using the least squares method. Testing was performed on different individuals to verify the accuracy of the system. The results indicate that the designed non-contact infrared temperature measurement system has a residual error of less than 0.2°C and a response time of less than 0.1 s in the range of 0 to 60 cm. This provides a reference for developing long-distance temperature measurement equipment in optical rehabilitation therapy.
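    The least-squares compensation step can be sketched as follows. The calibration readings and the polynomial degree here are hypothetical, not values from the paper:

    ```python
    import numpy as np

    # Hypothetical calibration data: the IR module reads progressively low
    # as distance grows; a contact thermometer gives the true temperature.
    distance_cm = np.array([0, 10, 20, 30, 40, 50, 60], dtype=float)
    measured_C = np.array([36.5, 36.3, 36.1, 35.8, 35.6, 35.3, 35.1])
    true_C = 36.5  # reference reading, held constant across the test

    # Least-squares fit of the correction term as a polynomial in distance
    coeffs = np.polyfit(distance_cm, true_C - measured_C, deg=2)

    def compensated(temp_C, dist_cm):
        """Add the fitted distance-dependent correction to a raw reading."""
        return temp_C + np.polyval(coeffs, dist_cm)
    ```

    At runtime, the microcontroller would pair each IR reading with the laser range measurement and apply `compensated` before reporting the temperature.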

  18. Optimizing distance-based methods for large data sets

    Science.gov (United States)

    Scholl, Tobias; Brenner, Thomas

    2015-10-01

    Distance-based methods for measuring spatial concentration of industries have received an increasing popularity in the spatial econometrics community. However, a limiting factor for using these methods is their computational complexity since both their memory requirements and running times are in {{O}}(n^2). In this paper, we present an algorithm with constant memory requirements and shorter running time, enabling distance-based methods to deal with large data sets. We discuss three recent distance-based methods in spatial econometrics: the D&O-Index by Duranton and Overman (Rev Econ Stud 72(4):1077-1106, 2005), the M-function by Marcon and Puech (J Econ Geogr 10(5):745-762, 2010) and the Cluster-Index by Scholl and Brenner (Reg Stud (ahead-of-print):1-15, 2014). Finally, we present an alternative calculation for the latter index that allows the use of data sets with millions of firms.

  19. Distance Based Method for Outlier Detection of Body Sensor Networks

    Directory of Open Access Journals (Sweden)

    Haibin Zhang

    2016-01-01

    Full Text Available We propose a distance based method for the outlier detection of body sensor networks. Firstly, we use a Kernel Density Estimation (KDE to calculate the probability of the distance to k nearest neighbors for diagnosed data. If the probability is less than a threshold, and the distance of this data to its left and right neighbors is greater than a pre-defined value, the diagnosed data is decided as an outlier. Further, we formalize a sliding window based method to improve the outlier detection performance. Finally, to estimate the KDE by training sensor readings with errors, we introduce a Hidden Markov Model (HMM based method to estimate the most probable ground truth values which have the maximum probability to produce the training data. Simulation results show that the proposed method possesses a good detection accuracy with a low false alarm rate.
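    A simplified sketch of the first stage: fit a KDE over each reading's mean distance to its k nearest neighbours and flag readings that fall in a low-density region. The left/right-neighbour check, sliding window, and HMM steps are omitted, and the relative-density threshold is an assumption rather than the paper's criterion:

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    def knn_distance_outliers(readings, k=3, rel_threshold=0.2):
        """Flag readings whose mean distance to their k nearest neighbours
        lies in a low-density region of the KDE over those distances
        (density below rel_threshold times the peak density)."""
        x = np.asarray(readings, dtype=float)
        # mean distance from each reading to its k nearest other readings
        pairwise = np.abs(x[:, None] - x[None, :])
        knn_dist = np.sort(pairwise, axis=1)[:, 1:k + 1].mean(axis=1)
        density = gaussian_kde(knn_dist)(knn_dist)
        return density < rel_threshold * density.max()

    # body-temperature stream with one spurious spike at index 5
    stream = [36.6, 36.7, 36.6, 36.8, 36.7, 39.9, 36.6, 36.7]
    flags = knn_distance_outliers(stream)
    ```

    Normal readings cluster at small neighbour distances, so the KDE is peaked there; the spike's large neighbour distance lands far in the tail and is flagged.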

  20. Stochastic learning of multi-instance dictionary for earth mover’s distance-based histogram comparison

    KAUST Repository

    Fan, Jihong; Liang, Ru-Ze

    2016-01-01

    Dictionary plays an important role in multi-instance data representation. It maps bags of instances to histograms. Earth mover’s distance (EMD) is the most effective histogram distance metric for the application of multi-instance retrieval. However

  1. Overlapping community detection based on link graph using distance dynamics

    Science.gov (United States)

    Chen, Lei; Zhang, Jing; Cai, Li-Jun

    2018-01-01

    The distance dynamics model was recently proposed to detect the disjoint community of a complex network. To identify the overlapping structure of a network using the distance dynamics model, an overlapping community detection algorithm, called L-Attractor, is proposed in this paper. The process of L-Attractor mainly consists of three phases. In the first phase, L-Attractor transforms the original graph to a link graph (a new edge graph) to assure that one node has multiple distances. In the second phase, using the improved distance dynamics model, a dynamic interaction process is introduced to simulate the distance dynamics (shrink or stretch). Through the dynamic interaction process, all distances converge, and the disjoint community structure of the link graph naturally manifests itself. In the third phase, a recovery method is designed to convert the disjoint community structure of the link graph to the overlapping community structure of the original graph. Extensive experiments are conducted on the LFR benchmark networks as well as real-world networks. Based on the results, our algorithm demonstrates higher accuracy and quality than other state-of-the-art algorithms.

  2. Standardized reporting of functioning information on ICF-based common metrics.

    Science.gov (United States)

    Prodinger, Birgit; Tennant, Alan; Stucki, Gerold

    2018-02-01

    In clinical practice and research a variety of clinical data collection tools are used to collect information on people's functioning for clinical practice and research and national health information systems. Reporting on ICF-based common metrics enables standardized documentation of functioning information in national health information systems. The objective of this methodological note on applying the ICF in rehabilitation is to demonstrate how to report functioning information collected with a data collection tool on ICF-based common metrics. We first specify the requirements for the standardized reporting of functioning information. Secondly, we introduce the methods needed for transforming functioning data to ICF-based common metrics. Finally, we provide an example. The requirements for standardized reporting are as follows: 1) having a common conceptual framework to enable content comparability between any health information; and 2) a measurement framework so that scores between two or more clinical data collection tools can be directly compared. The methods needed to achieve these requirements are the ICF Linking Rules and the Rasch measurement model. Using data collected incorporating the 36-item Short Form Health Survey (SF-36), the World Health Organization Disability Assessment Schedule 2.0 (WHODAS 2.0), and the Stroke Impact Scale 3.0 (SIS 3.0), the application of the standardized reporting based on common metrics is demonstrated. A subset of items from the three tools linked to common chapters of the ICF (d4 Mobility, d5 Self-care and d6 Domestic life), were entered as "super items" into the Rasch model. Good fit was achieved with no residual local dependency and a unidimensional metric. A transformation table allows for comparison between scales, and between a scale and the reporting common metric. Being able to report functioning information collected with commonly used clinical data collection tools with ICF-based common metrics enables clinicians

  3. Assessing Measurement Distances for OTA Testing of Massive MIMO Base Station at 28 GHz

    DEFF Research Database (Denmark)

    Kyösti, Pekka; Fan, Wei; Kyrolainen, Jukka

    2017-01-01

    metrics. The first metric is a new power metric based on assumptions of a code book of fixed beams and planar waves. The second one is the multi-user (MU) MIMO sum rate capacity. The intention is to evaluate physical dimensions in metres with respect to different BS array sizes. Simulation results...

  4. Spatial generalised linear mixed models based on distances.

    Science.gov (United States)

    Melo, Oscar O; Mateu, Jorge; Melo, Carlos E

    2016-10-01

    Risk models derived from environmental data have been widely shown to be effective in delineating geographical areas of risk because they are intuitively easy to understand. We present a new method based on distances, which allows the modelling of continuous and non-continuous random variables through distance-based spatial generalised linear mixed models. The parameters are estimated using Markov chain Monte Carlo maximum likelihood, which is a feasible and a useful technique. The proposed method depends on a detrending step built from continuous or categorical explanatory variables, or a mixture among them, by using an appropriate Euclidean distance. The method is illustrated through the analysis of the variation in the prevalence of Loa loa among a sample of village residents in Cameroon, where the explanatory variables included elevation, together with maximum normalised-difference vegetation index and the standard deviation of normalised-difference vegetation index calculated from repeated satellite scans over time. © The Author(s) 2013.

  5. Swarm-based wayfinding support in open and distance learning

    NARCIS (Netherlands)

    Tattersall, Colin; Manderveld, Jocelyn; Van den Berg, Bert; Van Es, René; Janssen, José; Koper, Rob

    2005-01-01

    Please refer to the original source: Tattersall, C. Manderveld, J., Van den Berg, B., Van Es, R., Janssen, J., & Koper, R. (2005). Swarm-based wayfinding support in open and distance learning. In Alkhalifa, E.M. (Ed). Cognitively Informed Systems: Utilizing Practical Approaches to Enrich Information

  6. An Artificial Intelligence-Based Distance Education System: Artimat

    Science.gov (United States)

    Nabiyev, Vasif; Karal, Hasan; Arslan, Selahattin; Erumit, Ali Kursat; Cebi, Ayca

    2013-01-01

    The purpose of this study is to evaluate the artificial intelligence-based distance education system called ARTIMAT, which has been prepared in order to improve mathematical problem solving skills of the students, in terms of conceptual proficiency and ease of use with the opinions of teachers and students. The implementation has been performed…

  7. Project organized Problem-based learning in Distance Education

    DEFF Research Database (Denmark)

    Jensen, Lars Peter; Helbo, Jan; Knudsen, Morten

    2002-01-01

    Project organized problem based learning is a successful concept for on-campus engineering education at Aalborg University. Recently this "Aalborg concept" has been used in networked distance education as well. This paper describes the experiences from two years of Internet-mediated project work in a new Master of Information Technology education. The main conclusions are, that the project work is a strong learning motivator, enhancing peer collaboration, for off-campus students as well. However, the concept cannot be directly transferred to off-campus learning. In this paper, the main problems experienced with group organized project work in distance education are described, and some possible solutions are listed.

  8. Project-Organized Problem-Based Learning in Distance Education

    DEFF Research Database (Denmark)

    Jensen, Lars Peter; Helbo, Jan; Knudsen, Morten

    2003-01-01

    Project organized problem based learning is a successful concept for on-campus engineering education at Aalborg University. Recently this "Aalborg concept" has been used in networked distance education as well. This paper describes the experiences from two years of Internet-mediated project work in a new Master of Information Technology education. The main conclusions are that the project work is a strong learning motivator, enhancing peer collaboration, for off-campus students as well. However, the concept cannot be directly transferred to off-campus learning. The main problems experienced with group organized project work in distance education are described, and some possible solutions are listed.

  9. Project-based Collaborative learning in distance education

    DEFF Research Database (Denmark)

    Knudsen, Morten; Bajard, Christine; Helbo, Jan

    2004-01-01

    This article describes the experiences drawn from an experiment in transferring positive experience with a project-organised on-campus engineering programme to a technology supported distance education programme. Three years of experience with the Master of Industrial Information Technology (MII) programme indicates, however, that adjustments are required in transforming the on-campus model to distance education. The main problem is that while project work is an excellent regulator of the learning process for on-campus students, this does not seem to be the case for off-campus students. Consequently, didactic adjustments have been made based on feedback, in particular from evaluation questionnaires. This process has been very constructive in approaching the goal: a successful model for project organized learning in distance education.

  10. Monitor-Based Statistical Model Checking for Weighted Metric Temporal Logic

    DEFF Research Database (Denmark)

    Bulychev, Petr; David, Alexandre; Larsen, Kim Guldstrand

    2012-01-01

    We present a novel approach and implementation for analysing weighted timed automata (WTA) with respect to the weighted metric temporal logic (WMTL≤). Based on a stochastic semantics of WTAs, we apply statistical model checking (SMC) to estimate and test probabilities of satisfaction with desi...

  11. Distance matrix-based approach to protein structure prediction.

    Science.gov (United States)

    Kloczkowski, Andrzej; Jernigan, Robert L; Wu, Zhijun; Song, Guang; Yang, Lei; Kolinski, Andrzej; Pokarowski, Piotr

    2009-03-01

    Much structural information is encoded in the internal distances; a distance matrix-based approach can be used to predict protein structure and dynamics, and for structural refinement. Our approach is based on the square distance matrix D = [r_ij^2] containing all square distances between residues in proteins. This distance matrix contains more information than the contact matrix C, which has elements of either 0 or 1 depending on whether the distance r_ij is greater or less than a cutoff value r_cutoff. We have performed spectral decomposition of the distance matrices, D = Σ_k λ_k v_k v_k^T, in terms of eigenvalues λ_k and the corresponding eigenvectors v_k, and found that it contains at most five nonzero terms. A dominant eigenvector is proportional to r^2, the square distance of points from the center of mass, with the next three being the principal components of the system of points. By predicting r^2 from the sequence we can approximate a distance matrix of a protein with an expected RMSD value of about 7.3 Å, and by combining it with the prediction of the first principal component we can improve this approximation to 4.0 Å. We can also explain the role of hydrophobic interactions for the protein structure, because r is highly correlated with the hydrophobic profile of the sequence. Moreover, r is highly correlated with several sequence profiles which are useful in protein structure prediction, such as contact number, the residue-wise contact order (RWCO) or mean square fluctuations (i.e. crystallographic temperature factors). We have also shown that the next three components are related to spatial directionality of the secondary structure elements, and they may also be predicted from the sequence, improving overall structure prediction. We have also shown that the large number of available HIV-1 protease structures provides a remarkable sampling of conformations, which can be viewed as direct structural information about the...
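
    The rank-five structure of the squared-distance matrix described above is easy to check numerically. The sketch below (an illustration of the paper's observation, not the authors' code) builds D for random 3-D points and counts its nonzero eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))              # 50 points in 3-D space
X -= X.mean(axis=0)                           # center at the center of mass
sq = np.sum(X**2, axis=1)                     # r^2 for each point

# Squared-distance matrix: D_ij = |x_i|^2 + |x_j|^2 - 2 x_i . x_j
D = sq[:, None] + sq[None, :] - 2.0 * X @ X.T

# Spectral decomposition D = sum_k lambda_k v_k v_k^T has at most
# 5 nonzero terms: rank(D) <= 1 + 1 + rank(X X^T) = 5
eigvals = np.linalg.eigvalsh(D)
tol = 1e-9 * np.abs(eigvals).max()
nonzero = int(np.sum(np.abs(eigvals) > tol))
print(nonzero)                                # at most 5
```

    The bound holds for any point set in 3-D, since D decomposes into two rank-one terms plus a rank-three Gram term.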

  12. New method for distance-based close following safety indicator.

    Science.gov (United States)

    Sharizli, A A; Rahizar, R; Karim, M R; Saifizul, A A

    2015-01-01

    The increase in the number of fatalities caused by road accidents involving heavy vehicles every year has raised the level of concern and awareness on road safety in developing countries like Malaysia. Changes in the vehicle dynamic characteristics such as gross vehicle weight, travel speed, and vehicle classification will affect a heavy vehicle's braking performance and its ability to stop safely in emergency situations. As such, the aim of this study is to establish a more realistic new distance-based safety indicator called the minimum safe distance gap (MSDG), which incorporates vehicle classification (VC), speed, and gross vehicle weight (GVW). Commercial multibody dynamics simulation software was used to generate braking distance data for various heavy vehicle classes under various loads and speeds. By applying nonlinear regression analysis to the simulation results, a mathematical expression of MSDG has been established. The results show that MSDG is dynamically changed according to GVW, VC, and speed. It is envisaged that this new distance-based safety indicator would provide a more realistic depiction of the real traffic situation for safety analysis.
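
    The regression step can be illustrated with a toy least-squares sketch. The model form and coefficients below are hypothetical stand-ins (the study fits its MSDG expression to multibody simulation data, which is not reproduced here): assume stopping distance d = a·v + b·v² (a reaction term plus a braking term) and recover a and b from noisy samples:

```python
import numpy as np

rng = np.random.default_rng(1)
v = np.linspace(10.0, 30.0, 40)              # travel speed samples, m/s
a_true, b_true = 1.5, 0.08                   # hypothetical coefficients
d = a_true * v + b_true * v**2 + rng.normal(0.0, 0.5, v.size)

# Least-squares fit of d = a*v + b*v**2
A = np.column_stack([v, v**2])
(a_fit, b_fit), *_ = np.linalg.lstsq(A, d, rcond=None)

def msdg(speed, a=a_fit, b=b_fit):
    """Toy minimum-safe-distance-gap estimate (m) at a given speed (m/s)."""
    return a * speed + b * speed**2

print(round(a_fit, 2), round(b_fit, 3))
```

    In the paper the coefficients additionally depend on gross vehicle weight and vehicle class; here they are scalar for brevity.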

  13. Metrics for comparing neuronal tree shapes based on persistent homology.

    Directory of Open Access Journals (Sweden)

    Yanjie Li

    Full Text Available As more and more neuroanatomical data are made available through efforts such as NeuroMorpho.Org and FlyCircuit.org, the need to develop computational tools to facilitate automatic knowledge discovery from such large datasets becomes more urgent. One fundamental question is how best to compare neuron structures, for instance to organize and classify large collections of neurons. We aim to develop a flexible yet powerful framework to support comparison and classification of large collections of neuron structures efficiently. Specifically, we propose to use a topological persistence-based feature vectorization framework. Existing methods to vectorize a neuron (i.e., convert a neuron to a feature vector so as to support efficient comparison and/or searching) typically rely on statistics or summaries of morphometric information, such as the average or maximum local torque angle or partition asymmetry. These simple summaries have limited power in encoding global tree structures. Based on the concept of topological persistence recently developed in the field of computational topology, we vectorize each neuron structure into a simple yet informative summary. In particular, each type of information of interest can be represented as a descriptor function defined on the neuron tree, which is then mapped to a simple persistence-signature. Our framework can encode both local and global tree structure, as well as other information of interest (electrophysiological or dynamical measures), by considering multiple descriptor functions on the neuron. The resulting persistence-based signature is potentially more informative than simple statistical summaries (such as average/mean/max of morphometric quantities). Indeed, we show that using a certain descriptor function will give a persistence-based signature containing strictly more information than the classical Sholl analysis. At the same time, our framework retains the efficiency associated with treating neurons as...
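
    The core computation behind such signatures can be sketched with a small union-find routine: sweep the vertices of the tree in increasing order of the descriptor function and pair each component's birth value with the value at which it merges into an older component (the elder rule). This is a generic 0-dimensional sublevel-set persistence sketch, not the authors' implementation:

```python
def sublevel_persistence(values, edges):
    """0-dimensional sublevel-set persistence pairs (elder rule) of a
    descriptor function given by `values` on a graph with `edges`."""
    n = len(values)
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    parent = [-1] * n                 # -1 marks vertices not yet in the filtration
    birth = [0.0] * n

    def find(v):
        root = v
        while parent[root] != root:
            root = parent[root]
        while parent[v] != root:      # path compression
            parent[v], v = root, parent[v]
        return root

    pairs = []
    for v in sorted(range(n), key=lambda i: values[i]):
        parent[v] = v
        birth[v] = values[v]
        for u in adj[v]:
            if parent[u] == -1:       # neighbour not yet born
                continue
            ru, rv = find(u), find(v)
            if ru == rv:
                continue
            # Elder rule: the younger component dies at the merge value
            elder, younger = (ru, rv) if birth[ru] <= birth[rv] else (rv, ru)
            if birth[younger] < values[v]:
                pairs.append((birth[younger], values[v]))
            parent[younger] = elder
    return sorted(pairs)

# Path v0-v1-v2-v3 with f = [0, 2, 1, 3]: the component born at f=1 dies at f=2
print(sublevel_persistence([0.0, 2.0, 1.0, 3.0], [(0, 1), (1, 2), (2, 3)]))
# [(1.0, 2.0)]
```

    Any descriptor of interest (path distance to the soma, torque angle, an electrophysiological measure) can be plugged in as `values` to produce a different signature for the same tree.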

  14. VCSEL-based sensors for distance and velocity

    Science.gov (United States)

    Moench, Holger; Carpaij, Mark; Gerlach, Philipp; Gronenborn, Stephan; Gudde, Ralph; Hellmig, Jochen; Kolb, Johanna; van der Lee, Alexander

    2016-03-01

    VCSEL based sensors can measure distance and velocity in three dimensional space and are already produced in high quantities for professional and consumer applications. Several physical principles are used: VCSELs are applied as infrared illumination for surveillance cameras. High power arrays combined with imaging optics provide a uniform illumination of scenes up to a distance of several hundred meters. Time-of-flight methods use a pulsed VCSEL as light source, either with strong single pulses at low duty cycle or with pulse trains. Because of the sensitivity to background light and the strong decrease of the signal with distance several Watts of laser power are needed at a distance of up to 100m. VCSEL arrays enable power scaling and can provide very short pulses at higher power density. Applications range from extended functions in a smartphone over industrial sensors up to automotive LIDAR for driver assistance and autonomous driving. Self-mixing interference works with coherent laser photons scattered back into the cavity. It is therefore insensitive to environmental light. The method is used to measure target velocity and distance with very high accuracy at distances up to one meter. Single-mode VCSELs with integrated photodiode and grating stabilized polarization enable very compact and cost effective products. Besides the well know application as computer input device new applications with even higher accuracy or for speed over ground measurement in automobiles and up to 250km/h are investigated. All measurement methods exploit the known VCSEL properties like robustness, stability over temperature and the potential for packages with integrated optics and electronics. This makes VCSEL sensors ideally suited for new mass applications in consumer and automotive markets.
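
    The pulsed time-of-flight principle mentioned above reduces to a one-line relation: the pulse travels to the target and back, so distance is c·t/2. A minimal sketch (illustrative, not tied to any particular VCSEL product):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Target distance from a pulsed time-of-flight round-trip measurement."""
    return C * round_trip_s / 2.0

# A pulse returning after ~667 ns corresponds to a target ~100 m away
print(tof_distance_m(2 * 100.0 / C))
```

    The same relation shows why timing resolution dominates the error budget: at c/2 ≈ 0.15 m/ns, centimetre-level ranging needs sub-100 ps timing.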

  15. Encyclopedia of distances

    CERN Document Server

    Deza, Michel Marie

    2016-01-01

    This 4th edition of the leading reference volume on distance metrics is characterized by updated and rewritten sections on some items suggested by experts and readers, as well as a general streamlining of content and the addition of essential new topics. Though the structure remains unchanged, the new edition also explores recent advances in the use of distances and metrics for e.g. generalized distances, probability theory, graph theory, coding theory, data analysis. New topics in the purely mathematical sections include e.g. the Vitanyi multiset-metric, algebraic point-conic distance, triangular ratio metric, Rossi-Hamming metric, Taneja distance, spectral semimetric between graphs, channel metrization, and Maryland bridge distance. The multidisciplinary sections have also been supplemented with new topics, including: dynamic time warping distance, memory distance, allometry, atmospheric depth, elliptic orbit distance, VLBI distance measurements, the astronomical system of units, and walkability distance. Lea...

  16. Metric for Calculation of System Complexity based on its Connections

    Directory of Open Access Journals (Sweden)

    João Ricardo Braga de Paiva

    2017-02-01

    Full Text Available This paper proposes a methodology based on system connections to calculate its complexity. Two study cases are proposed: the dining Chinese philosophers' problem and the distribution center. Both studies are modeled using the theory of Discrete Event Systems, and simulations in different contexts were performed in order to measure their complexities. The obtained results present (i) the static complexity as a limiting factor for the dynamic complexity, (ii) the lowest cost in terms of complexity for each unit of measure of the system performance, and (iii) the output sensitivity to the input parameters. The associated complexity and performance measures aggregate knowledge about the system.

  17. A metric for the Radial Basis Function Network - Application on Real Radar Data

    NARCIS (Netherlands)

    Heiden, R. van der; Groen, F.C.A.

    1996-01-01

    A Radial Basis Functions (RBF) network for pattern recognition is considered. Classification with such a network is based on distances between patterns, so a metric is always present. Using real radar data, the Euclidean metric is shown to perform poorly; a metric based on the so-called Box-Cox...
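
    The record is truncated, but the general idea of a Box-Cox-style metric can be sketched: apply the Box-Cox power transform to positive feature values before taking a Euclidean distance, with λ controlling the departure from the plain Euclidean case. The functions below are a generic illustration under that assumption, not the authors' formulation:

```python
import numpy as np

def box_cox(x, lam):
    """Box-Cox power transform for positive x."""
    x = np.asarray(x, dtype=float)
    if abs(lam) < 1e-12:
        return np.log(x)              # the lam -> 0 limit
    return (x**lam - 1.0) / lam

def box_cox_distance(p, q, lam=0.5):
    """Euclidean distance between positive feature vectors after a Box-Cox
    transform. lam=1 recovers the ordinary Euclidean distance (the transform
    is then just a shift by 1, which cancels in the difference)."""
    return float(np.linalg.norm(box_cox(p, lam) - box_cox(q, lam)))

print(box_cox_distance([1.0, 2.0], [2.0, 4.0], lam=1.0))  # sqrt(5)
```

    Smaller λ compresses large feature values, which can help when, as with radar returns, the raw features span several orders of magnitude.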

  18. Assessment and improvement of radiation oncology trainee contouring ability utilizing consensus-based penalty metrics

    International Nuclear Information System (INIS)

    Hallock, Abhirami; Read, Nancy; D'Souza, David

    2012-01-01

    The objective of this study was to develop and assess the feasibility of utilizing consensus-based penalty metrics for the purpose of critical structure and organ at risk (OAR) contouring quality assurance and improvement. A Delphi study was conducted to obtain consensus on contouring penalty metrics to assess trainee-generated OAR contours. Voxel-based penalty metric equations were used to score regions of discordance between trainee and expert contour sets. The utility of these penalty metric scores for objective feedback on contouring quality was assessed by using cases prepared for weekly radiation oncology trainee treatment planning rounds. In two Delphi rounds, six radiation oncology specialists reached agreement on clinical importance/impact and organ radiosensitivity as the two primary criteria for the creation of the Critical Structure Inter-comparison of Segmentation (CriSIS) penalty functions. Linear/quadratic penalty scoring functions (for over- and under-contouring) with one of four levels of severity (none, low, moderate and high) were assigned for each of 20 OARs in order to generate a CriSIS score when new OAR contours are compared with reference/expert standards. Six cases (central nervous system, head and neck, gastrointestinal, genitourinary, gynaecological and thoracic) were then used to validate 18 OAR metrics through comparison of trainee and expert contour sets using the consensus-derived CriSIS functions. For 14 OARs, there was an improvement in CriSIS score post-educational intervention. The use of consensus-based contouring penalty metrics to provide quantitative information for contouring improvement is feasible.

  19. A GPS Phase-Locked Loop Performance Metric Based on the Phase Discriminator Output.

    Science.gov (United States)

    Stevanovic, Stefan; Pervan, Boris

    2018-01-19

    We propose a novel GPS phase-locked loop (PLL) performance metric based on the standard deviation of tracking error (defined as the discriminator's estimate of the true phase error), and explain its advantages over the popular phase jitter metric using theory, numerical simulation, and experimental results. We derive an augmented GPS PLL linear model, which includes the effect of coherent averaging, to be used in conjunction with this proposed metric. The augmented linear model allows more accurate calculation of tracking error standard deviation in the presence of additive white Gaussian noise (AWGN) than traditional linear models. The standard deviation of tracking error, with a threshold corresponding to half of the arctangent discriminator pull-in region, is shown to be a more reliable and robust measure of PLL performance under interference conditions than the phase jitter metric. In addition, the augmented linear model is shown to be valid up until this threshold, which facilitates efficient performance prediction, so that time-consuming direct simulations and costly experimental testing can be reserved for PLL designs that are much more likely to be successful. The effect of varying receiver reference oscillator quality on the tracking error metric is also considered.
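
    The proposed metric, the standard deviation of the discriminator's phase-error estimate checked against half the arctangent pull-in region, can be sketched as follows. The discriminator samples here are synthetic AWGN-driven values; only the π/4 threshold follows the abstract, the rest is illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic discriminator output: estimated phase tracking error in radians
tracking_error = rng.normal(0.0, 0.1, size=20_000)

sigma = tracking_error.std()
threshold = np.pi / 4      # half of the arctangent discriminator pull-in region
pll_ok = bool(sigma < threshold)
print(round(sigma, 3), pll_ok)
```

    Under interference the sample standard deviation grows toward the threshold well before the loop loses lock, which is what makes it a more graded indicator than phase jitter alone.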

  20. Continuity Properties of Distances for Markov Processes

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Mao, Hua; Larsen, Kim Guldstrand

    2014-01-01

    In this paper we investigate distance functions on finite state Markov processes that measure the behavioural similarity of non-bisimilar processes. We consider both probabilistic bisimilarity metrics, and trace-based distances derived from standard Lp and Kullback-Leibler distances. Two desirable...
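
    The trace-based distances mentioned above compare the next-step behaviour of two chains. As a minimal illustration (not the paper's exact definitions), the sketch below computes L1 and Kullback-Leibler distances between corresponding rows of two transition matrices and takes the maximum over states:

```python
import numpy as np

def l1_rowwise(P, Q):
    """Max over states of the L1 distance between next-step distributions."""
    return float(np.abs(np.asarray(P) - np.asarray(Q)).sum(axis=1).max())

def kl_rowwise(P, Q, eps=1e-12):
    """Max over states of the KL divergence between next-step distributions.
    A small eps guards against log(0) for zero-probability transitions."""
    P = np.asarray(P, dtype=float) + eps
    Q = np.asarray(Q, dtype=float) + eps
    return float((P * np.log(P / Q)).sum(axis=1).max())

P = np.array([[0.9, 0.1], [0.2, 0.8]])
Q = np.array([[0.8, 0.2], [0.2, 0.8]])
print(l1_rowwise(P, Q))   # 0.2
print(l1_rowwise(P, P))   # 0.0
```

    Note the asymmetry of the KL variant (kl_rowwise(P, Q) ≠ kl_rowwise(Q, P) in general), one of the continuity issues such distances raise compared with bisimilarity metrics.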

  1. The International Safeguards Technology Base: How is the Patient Doing? An Exploration of Effective Metrics

    International Nuclear Information System (INIS)

    Schanfein, Mark J.; Gouveia, Fernando S.

    2010-01-01

    The term 'Technology Base' is commonly used but what does it mean? Is there a common understanding of the components that comprise a technology base? Does a formal process exist to assess the health of a given technology base? These are important questions the relevance of which is even more pressing given the USDOE/NNSA initiatives to strengthen the safeguards technology base through investments in research and development and human capital development. Accordingly, the authors will establish a high-level framework to define and understand what comprises a technology base. Potential goal-driven metrics to assess the health of a technology base will also be explored, such as linear demographics and resource availability, in the hope that they can be used to better understand and improve the health of the U.S. safeguards technology base. Finally, through the identification of such metrics, the authors will offer suggestions and highlight choices for addressing potential shortfalls.

  2. Project-Based Collaborative Learning in Distance Education

    DEFF Research Database (Denmark)

    Knudsen, Morten; Bajard, C.; Helbo, Jan

    2003-01-01

    This article describes the experiences drawn from an experiment in transferring positive experience with a project-organised on-campus engineering programme to a technology supported distance education programme. Three years of experience with the Master of Industrial Information Technology (MII) programme indicates, however, that adjustments are required in transforming the on-campus model to distance education. The main problem is that while project work is an excellent regulator of the learning process for on-campus students, this does not seem to be the case for off-campus students. Consequently, didactic adjustments have been made based on feedback, in particular from evaluation questionnaires. This process has been very constructive in approaching the goal: a successful model for project organized learning in distance education.

  3. Adaptive density trajectory cluster based on time and space distance

    Science.gov (United States)

    Liu, Fagui; Zhang, Zhijie

    2017-10-01

    There are several open problems in trajectory clustering for discovering regularities in mobile behavior, such as computing the distance between sub-trajectories, setting the parameter values of the clustering algorithm, and handling the uncertainty/boundary problem of the data set. Accordingly, this paper defines a method for calculating the distance between sub-trajectories based on time and space. The significance of this distance calculation is that it clearly reveals the differences between moving trajectories and promotes the accuracy of the clustering algorithm. In addition, a novel adaptive density trajectory clustering algorithm is proposed, in which the cluster radius is computed from the density of the data distribution. Cluster centers and their number are selected automatically by a certain strategy, and the uncertainty/boundary problem of the data set is solved by a designed weighted rough c-means. Experimental results demonstrate that the proposed algorithm performs fuzzy trajectory clustering effectively on the basis of the time and space distance, and adaptively obtains optimal cluster centers and rich clustering information for mining the features of mobile behavior in mobile and social networks.
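
    One simple way to realize a combined time-and-space distance between equal-length sub-trajectories (a generic sketch; the paper's exact formula is not given in the abstract) is a weighted sum of the average spatial separation and the average temporal offset of corresponding samples:

```python
import math

def st_distance(traj_a, traj_b, w_space=0.7, w_time=0.3):
    """Weighted time-and-space distance between two equal-length
    sub-trajectories given as lists of (x, y, t) samples.
    The weights w_space and w_time are illustrative parameters."""
    assert len(traj_a) == len(traj_b) and traj_a
    d_space = sum(math.hypot(ax - bx, ay - by)
                  for (ax, ay, _), (bx, by, _) in zip(traj_a, traj_b)) / len(traj_a)
    d_time = sum(abs(at - bt)
                 for (_, _, at), (_, _, bt) in zip(traj_a, traj_b)) / len(traj_a)
    return w_space * d_space + w_time * d_time

a = [(0, 0, 0), (1, 0, 1), (2, 0, 2)]
b = [(0, 1, 0), (1, 1, 1), (2, 1, 2)]
print(st_distance(a, b))   # 0.7: unit spatial offset, no temporal offset
```

    In practice the two terms need normalization to comparable scales before weighting, since metres and seconds are not directly commensurable.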

  4. Consumer Neuroscience-Based Metrics Predict Recall, Liking and Viewing Rates in Online Advertising.

    Science.gov (United States)

    Guixeres, Jaime; Bigné, Enrique; Ausín Azofra, Jose M; Alcañiz Raya, Mariano; Colomer Granero, Adrián; Fuentes Hurtado, Félix; Naranjo Ornedo, Valery

    2017-01-01

    The purpose of the present study is to investigate whether the effectiveness of a new ad on digital channels (YouTube) can be predicted by using neural networks and neuroscience-based metrics (brain response, heart rate variability and eye tracking). Neurophysiological records were collected from 35 participants exposed to 8 relevant TV Super Bowl commercials. Correlations between neurophysiological-based metrics, ad recall, ad liking, the ACE metrix score and the number of views on YouTube during a year were investigated. Our findings suggest a significant correlation between neuroscience metrics and self-reported measures of ad effectiveness and the direct number of views on the YouTube channel. In addition, using an artificial neural network based on neuroscience metrics, the model classifies ads (82.9% average accuracy) and estimates the number of online views (mean error of 0.199). The results highlight the validity of neuromarketing-based techniques for predicting the success of advertising responses. Practitioners can consider the proposed methodology at the design stages of advertising content, thus enhancing advertising effectiveness. The study pioneers the use of neurophysiological methods in predicting advertising success in a digital context. This is the first article that has examined whether these measures could actually be used for predicting views for advertising on YouTube.

  5. Consumer Neuroscience-Based Metrics Predict Recall, Liking and Viewing Rates in Online Advertising

    Directory of Open Access Journals (Sweden)

    Jaime Guixeres

    2017-10-01

    Full Text Available The purpose of the present study is to investigate whether the effectiveness of a new ad on digital channels (YouTube) can be predicted by using neural networks and neuroscience-based metrics (brain response, heart rate variability and eye tracking). Neurophysiological records were collected from 35 participants exposed to 8 relevant TV Super Bowl commercials. Correlations between neurophysiological-based metrics, ad recall, ad liking, the ACE metrix score and the number of views on YouTube during a year were investigated. Our findings suggest a significant correlation between neuroscience metrics and self-reported measures of ad effectiveness and the direct number of views on the YouTube channel. In addition, using an artificial neural network based on neuroscience metrics, the model classifies ads (82.9% average accuracy) and estimates the number of online views (mean error of 0.199). The results highlight the validity of neuromarketing-based techniques for predicting the success of advertising responses. Practitioners can consider the proposed methodology at the design stages of advertising content, thus enhancing advertising effectiveness. The study pioneers the use of neurophysiological methods in predicting advertising success in a digital context. This is the first article that has examined whether these measures could actually be used for predicting views for advertising on YouTube.

  6. Proposed Performance-Based Metrics for the Future Funding of Graduate Medical Education: Starting the Conversation.

    Science.gov (United States)

    Caverzagie, Kelly J; Lane, Susan W; Sharma, Niraj; Donnelly, John; Jaeger, Jeffrey R; Laird-Fick, Heather; Moriarty, John P; Moyer, Darilyn V; Wallach, Sara L; Wardrop, Richard M; Steinmann, Alwin F

    2017-12-12

    Graduate medical education (GME) in the United States is financed by contributions from both federal and state entities that total over $15 billion annually. Within institutions, these funds are distributed with limited transparency to achieve ill-defined outcomes. To address this, the Institute of Medicine convened a committee on the governance and financing of GME to recommend finance reform that would promote a physician training system that meets society's current and future needs. The resulting report provided several recommendations regarding the oversight and mechanisms of GME funding, including implementation of performance-based GME payments, but did not provide specific details about the content and development of metrics for these payments. To initiate a national conversation about performance-based GME funding, the authors asked: What should GME be held accountable for in exchange for public funding? In answer to this question, the authors propose 17 potential performance-based metrics for GME funding that could inform future funding decisions. Eight of the metrics are described as exemplars to add context and to help readers obtain a deeper understanding of the inherent complexities of performance-based GME funding. The authors also describe considerations and precautions for metric implementation.

  7. Consumer Neuroscience-Based Metrics Predict Recall, Liking and Viewing Rates in Online Advertising

    Science.gov (United States)

    Guixeres, Jaime; Bigné, Enrique; Ausín Azofra, Jose M.; Alcañiz Raya, Mariano; Colomer Granero, Adrián; Fuentes Hurtado, Félix; Naranjo Ornedo, Valery

    2017-01-01

    The purpose of the present study is to investigate whether the effectiveness of a new ad on digital channels (YouTube) can be predicted by using neural networks and neuroscience-based metrics (brain response, heart rate variability and eye tracking). Neurophysiological records were collected from 35 participants exposed to 8 relevant TV Super Bowl commercials. Correlations between neurophysiological-based metrics, ad recall, ad liking, the ACE metrix score and the number of views on YouTube during a year were investigated. Our findings suggest a significant correlation between neuroscience metrics and self-reported measures of ad effectiveness and the direct number of views on the YouTube channel. In addition, using an artificial neural network based on neuroscience metrics, the model classifies ads (82.9% average accuracy) and estimates the number of online views (mean error of 0.199). The results highlight the validity of neuromarketing-based techniques for predicting the success of advertising responses. Practitioners can consider the proposed methodology at the design stages of advertising content, thus enhancing advertising effectiveness. The study pioneers the use of neurophysiological methods in predicting advertising success in a digital context. This is the first article that has examined whether these measures could actually be used for predicting views for advertising on YouTube. PMID:29163251

  8. Objectively Quantifying Radiation Esophagitis With Novel Computed Tomography–Based Metrics

    Energy Technology Data Exchange (ETDEWEB)

    Niedzielski, Joshua S., E-mail: jsniedzielski@mdanderson.org [Department of Radiation Physics, The University of Texas M. D. Anderson Cancer Center, Houston, Texas (United States); University of Texas Houston Graduate School of Biomedical Science, Houston, Texas (United States); Yang, Jinzhong [Department of Radiation Physics, The University of Texas M. D. Anderson Cancer Center, Houston, Texas (United States); University of Texas Houston Graduate School of Biomedical Science, Houston, Texas (United States); Stingo, Francesco [Department of Biostatistics, The University of Texas M. D. Anderson Cancer Center, Houston, Texas (United States); Martel, Mary K.; Mohan, Radhe [Department of Radiation Physics, The University of Texas M. D. Anderson Cancer Center, Houston, Texas (United States); University of Texas Houston Graduate School of Biomedical Science, Houston, Texas (United States); Gomez, Daniel R. [Department of Radiation Oncology, The University of Texas M. D. Anderson Cancer Center, Houston, Texas (United States); Briere, Tina M. [Department of Radiation Physics, The University of Texas M. D. Anderson Cancer Center, Houston, Texas (United States); University of Texas Houston Graduate School of Biomedical Science, Houston, Texas (United States); Liao, Zhongxing [Department of Radiation Oncology, The University of Texas M. D. Anderson Cancer Center, Houston, Texas (United States); Court, Laurence E. [Department of Radiation Physics, The University of Texas M. D. Anderson Cancer Center, Houston, Texas (United States); University of Texas Houston Graduate School of Biomedical Science, Houston, Texas (United States)

    2016-02-01

    Purpose: To study radiation-induced esophageal expansion as an objective measure of radiation esophagitis in patients with non-small cell lung cancer (NSCLC) treated with intensity modulated radiation therapy. Methods and Materials: Eighty-five patients had weekly intra-treatment CT imaging and esophagitis scoring according to Common Terminology Criteria for Adverse Events 4.0 (24 Grade 0, 45 Grade 2, and 16 Grade 3). Nineteen esophageal expansion metrics based on mean, maximum, spatial length, and volume of expansion were calculated as voxel-based relative volume change, using the Jacobian determinant from deformable image registration between the planning and weekly CTs. An anatomic variability correction method was validated and applied to these metrics to reduce uncertainty. An analysis of expansion metrics and radiation esophagitis grade was conducted using normal tissue complication probability from univariate logistic regression and Spearman rank for grade 2 and grade 3 esophagitis endpoints, as well as the timing of expansion and esophagitis grade. The metrics' performance in classifying esophagitis was tested with receiver operating characteristic analysis. Results: Expansion increased with esophagitis grade. Thirteen of 19 expansion metrics had receiver operating characteristic area under the curve values >0.80 for both grade 2 and grade 3 esophagitis endpoints, with the highest performance from maximum axial expansion (MaxExp1) and esophageal length with axial expansion ≥30% (LenExp30%), with area under the curve values of 0.93 and 0.91 for grade 2 and 0.90 and 0.90 for grade 3 esophagitis, respectively. Conclusions: Esophageal expansion may be a suitable objective measure of esophagitis, particularly maximum axial esophageal expansion and esophageal length with axial expansion ≥30%, with a 2.1 Jacobian value and 98.6 mm as the metric values for a 50% probability of grade 3 esophagitis. The uncertainty in esophageal Jacobian calculations can be reduced...
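
    The voxel-wise relative volume change used above comes from the Jacobian determinant of the deformation map x ↦ x + u(x). A minimal sketch (illustrative, not the study's registration pipeline) computes det(I + ∇u) on a grid of displacement components:

```python
import numpy as np

def jacobian_determinant(ux, uy, uz, spacing=(1.0, 1.0, 1.0)):
    """Voxel-wise det(I + grad(u)) for a displacement field u = (ux, uy, uz).
    Values > 1 indicate local expansion, < 1 local shrinkage."""
    J = np.zeros(ux.shape + (3, 3))
    for i, u in enumerate((ux, uy, uz)):
        grads = np.gradient(u, *spacing)        # du_i/dx, du_i/dy, du_i/dz
        for j in range(3):
            J[..., i, j] = grads[j]
    J[..., 0, 0] += 1.0
    J[..., 1, 1] += 1.0
    J[..., 2, 2] += 1.0
    return np.linalg.det(J)

# A uniform 10% stretch along x gives det = 1.1 at every voxel
x, y, z = np.meshgrid(np.arange(8.0), np.arange(8.0), np.arange(8.0),
                      indexing="ij")
jd = jacobian_determinant(0.1 * x, np.zeros_like(y), np.zeros_like(z))
print(float(jd.mean()))
```

    The expansion metrics in the study are then summaries (mean, maximum, lengths and volumes above a threshold) of such a Jacobian map restricted to the esophagus.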

  9. Objectively Quantifying Radiation Esophagitis With Novel Computed Tomography–Based Metrics

    International Nuclear Information System (INIS)

    Niedzielski, Joshua S.; Yang, Jinzhong; Stingo, Francesco; Martel, Mary K.; Mohan, Radhe; Gomez, Daniel R.; Briere, Tina M.; Liao, Zhongxing; Court, Laurence E.

    2016-01-01

    Purpose: To study radiation-induced esophageal expansion as an objective measure of radiation esophagitis in patients with non-small cell lung cancer (NSCLC) treated with intensity modulated radiation therapy. Methods and Materials: Eighty-five patients had weekly intra-treatment CT imaging and esophagitis scoring according to Common Terminology Criteria for Adverse Events 4.0 (24 Grade 0, 45 Grade 2, and 16 Grade 3). Nineteen esophageal expansion metrics based on mean, maximum, spatial length, and volume of expansion were calculated as voxel-based relative volume change, using the Jacobian determinant from deformable image registration between the planning and weekly CTs. An anatomic variability correction method was validated and applied to these metrics to reduce uncertainty. An analysis of expansion metrics and radiation esophagitis grade was conducted using normal tissue complication probability from univariate logistic regression and Spearman rank for grade 2 and grade 3 esophagitis endpoints, as well as the timing of expansion and esophagitis grade. The metrics' performance in classifying esophagitis was tested with receiver operating characteristic analysis. Results: Expansion increased with esophagitis grade. Thirteen of 19 expansion metrics had receiver operating characteristic area under the curve values >0.80 for both grade 2 and grade 3 esophagitis endpoints, with the highest performance from maximum axial expansion (MaxExp1) and esophageal length with axial expansion ≥30% (LenExp30%), with area under the curve values of 0.93 and 0.91 for grade 2 and 0.90 and 0.90 for grade 3 esophagitis, respectively. Conclusions: Esophageal expansion may be a suitable objective measure of esophagitis, particularly maximum axial esophageal expansion and esophageal length with axial expansion ≥30%, with a 2.1 Jacobian value and 98.6 mm as the metric values for a 50% probability of grade 3 esophagitis. The uncertainty in esophageal Jacobian calculations can be reduced...

  10. Distance learning, problem based learning and dynamic knowledge networks.

    Science.gov (United States)

    Giani, U; Martone, P

    1998-06-01

    This paper is an attempt to develop a distance learning model grounded upon a strict integration of problem based learning (PBL), dynamic knowledge networks (DKN) and web tools, such as hypermedia documents, synchronous and asynchronous communication facilities, etc. The main objective is to develop a theory of distance learning based upon the idea that learning is a highly dynamic cognitive process aimed at connecting different concepts in a network of mutually supporting concepts. Moreover, this process is supposed to be the result of a social interaction that has to be facilitated by the web. The model was tested by creating a virtual classroom of medical and nursing students and activating a learning session on the concept of knowledge representation in health sciences.

  11. Didactics, Technology, and Organisation of Project Based Distance Education

    DEFF Research Database (Denmark)

    Knudsen, Morten Haack; Borch, Ole M.; Helbo, Jan

    2005-01-01

    The didactics, technology, and organization of an ICT supported distance engineering Master education are described. A systematic monitoring and evaluation of the basis year has given useful experience, subsequently used for adjustments and improvements. A successful on-campus project organized...... as asynchronous, which is possible with extensive utilization of new information and communication technology. Virtual meetings are conducted with text, sound and video based communication. Also the organization requires technology. A new learning management system, specifically designed to the didactic form...

  12. The International Safeguards Technology Base: How is the Patient Doing? An Exploration of Effective Metrics

    Energy Technology Data Exchange (ETDEWEB)

    Schanfein, Mark J; Gouveia, Fernando S

    2010-07-01

    The term “Technology Base” is commonly used but what does it mean? Is there a common understanding of the components that comprise a technology base? Does a formal process exist to assess the health of a given technology base? These are important questions the relevance of which is even more pressing given the USDOE/NNSA initiatives to strengthen the safeguards technology base through investments in research & development and human capital development. Accordingly, the authors will establish a high-level framework to define and understand what comprises a technology base. Potential goal-driven metrics to assess the health of a technology base will also be explored, such as linear demographics and resource availability, in the hope that they can be used to better understand and improve the health of the U.S. safeguards technology base. Finally, through the identification of such metrics, the authors will offer suggestions and highlight choices for addressing potential shortfalls.

  13. Distance Based Root Cause Analysis and Change Impact Analysis of Performance Regressions

    Directory of Open Access Journals (Sweden)

    Junzan Zhou

    2015-01-01

    Full Text Available Performance regression testing is applied to uncover both performance and functional problems of software releases. A performance problem revealed by performance testing can be high response time, low throughput, or even being out of service. A mature performance testing process helps systematically detect software performance problems. However, it is difficult to identify the root cause and evaluate the potential change impact. In this paper, we present an approach leveraging server side logs for identifying root causes of performance problems. Firstly, server side logs are used to recover the call tree of each business transaction. We define a novel distance-based metric computed from call trees for root cause analysis, and apply an inverted index from methods to business transactions for change impact analysis. Empirical studies show that our approach can effectively and efficiently help developers diagnose the root cause of performance problems.
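
    The change-impact step above maps onto a small data structure. A minimal sketch (transaction names and call-tree contents are illustrative, not from the paper):

```python
def build_inverted_index(call_trees):
    # call_trees: mapping of business transaction -> set of method names
    # appearing in its recovered call tree.
    index = {}
    for txn, methods in call_trees.items():
        for method in methods:
            index.setdefault(method, set()).add(txn)
    return index

def impacted_transactions(index, changed_methods):
    # Change impact: every transaction whose call tree touches a changed method.
    impacted = set()
    for method in changed_methods:
        impacted |= index.get(method, set())
    return impacted
```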

  14. Fusion set selection with surrogate metric in multi-atlas based image segmentation

    International Nuclear Information System (INIS)

    Zhao, Tingting; Ruan, Dan

    2016-01-01

    Multi-atlas based image segmentation sees unprecedented opportunities but also demanding challenges in the big data era. Relevant atlas selection before label fusion plays a crucial role in reducing potential performance loss from heterogeneous data quality and high computation cost from extensive data. This paper starts with investigating the image similarity metric (termed ‘surrogate’), an alternative to the inaccessible geometric agreement metric (termed ‘oracle’) in atlas relevance assessment, and probes into the problem of how to select the ‘most-relevant’ atlases and how many such atlases to incorporate. We propose an inference model to relate the surrogates and the oracle geometric agreement metrics. Based on this model, we quantify the behavior of the surrogates in mimicking oracle metrics for atlas relevance ordering. Finally, analytical insights on the choice of fusion set size are presented from a probabilistic perspective, with the integrated goal of including the most relevant atlases and excluding the irrelevant ones. Empirical evidence and performance assessment are provided based on prostate and corpus callosum segmentation. (paper)

  15. IMPACT OF COMPUTER BASED ONLINE ENTREPRENEURSHIP DISTANCE EDUCATION IN INDIA

    Directory of Open Access Journals (Sweden)

    Bhagwan SHREE RAM

    2012-07-01

    Full Text Available The success of Indian enterprises and professionals in the computer and information technology (CIT) domain during the past twenty years has been spectacular. Entrepreneurs, bureaucrats and technocrats are now advancing views about how India can ride the CIT bandwagon and leapfrog into a knowledge-based economy in the area of on-line entrepreneurship distance education. Isolated instances of remotely located villagers sending and receiving email messages, effective application of mobile communications and surfing the Internet are being promoted as examples of how the nation can achieve this transformation, while vanquishing socio-economic challenges such as illiteracy, high population growth, poverty, and the digital divide along the way. Likewise, even while only a small fraction of the urban population in India has access to computers and the Internet, e-governance is being projected as the way of the future. There is no dearth of fascinating stories about CIT-enabled changes, yet there is little discussion about whether such changes are effective and sustainable in the absence of the basic infrastructure that is accessible to the citizens of more advanced economies. When used appropriately, different CITs are said to help expand access to entrepreneurship distance education, strengthen the relevance of education to the increasingly digital workplace, and raise technical and managerial educational quality by, among others, helping make teaching and learning an engaging, active process connected to real life. This research paper investigates the impact of computer-based online entrepreneurship distance education in India.

  16. Metric diffusion along foliations

    CERN Document Server

    Walczak, Szymon M

    2017-01-01

    Up-to-date research in metric diffusion along compact foliations is presented in this book. Beginning with fundamentals from optimal transportation theory and the theory of foliations, the book moves on to cover the Wasserstein distance, the Kantorovich Duality Theorem, and the metrization of the weak topology by the Wasserstein distance. Metric diffusion is defined, the topology of the metric space is studied, and the limits of diffused metrics along compact foliations are discussed. Essentials on foliations, holonomy, heat diffusion, and compact foliations are detailed, and vital technical lemmas are proved to aid understanding. Graduate students and researchers in the geometry, topology and dynamics of foliations and laminations will find this supplement useful as it presents facts about metric diffusion along non-compact foliations and provides a full description of the limit for metrics diffused along a foliation with at least one compact leaf in two dimensions.
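
    The Wasserstein distance mentioned above has a particularly simple form in one dimension: for two equal-size empirical samples, the Wasserstein-1 distance reduces to the mean absolute difference between sorted samples. A minimal sketch:

```python
def wasserstein_1d(xs, ys):
    # For equal-size 1-D samples, the optimal transport plan matches
    # the i-th smallest point of xs to the i-th smallest point of ys,
    # so W1 is the mean absolute difference between sorted samples.
    assert len(xs) == len(ys)
    xs, ys = sorted(xs), sorted(ys)
    return sum(abs(a - b) for a, b in zip(xs, ys)) / len(xs)
```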

  17. Analysis of transitions at two-fold redundant sites in mammalian genomes. Transition redundant approach-to-equilibrium (TREx) distance metrics

    Directory of Open Access Journals (Sweden)

    Liberles David A

    2006-03-01

    sites within two-fold redundant coding systems were examined in the mouse, rat, and human genomes. The key metric (f2), the fraction of those sites that hold the same nucleotide, was measured for putative ortholog pairs. A transition redundant exchange (TREx) distance was calculated from f2 for these pairs. Pyrimidine-pyrimidine transitions at these sites occur approximately 14% faster than purine-purine transitions in various lineages. Transition rate constants were similar in different genes within the same lineages; within a set of orthologs, the f2 distribution is only modestly overdispersed. No correlation between disparity and overdispersion is observed. In rodents, evidence was found for greater conservation of TREx sites in genes on the X chromosome, accounting, however, for only a small part of the overdispersion. Conclusion: The TREx metric is useful for analyzing the history of transition rate constants within these mammals over the past 100 million years. The TREx metric estimates the extent to which silent nucleotide substitutions accumulate in different genes, on different chromosomes, with different compositions, in different lineages, and at different times.
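
    A minimal sketch of the f2 computation, together with one standard approach-to-equilibrium conversion for a symmetric two-state transition process (the conversion formula is an assumption for illustration; the paper's exact parameterization may differ):

```python
import math

def trex_f2(seq_a, seq_b, twofold_sites):
    # f2: fraction of the two-fold redundant silent sites that hold the
    # same nucleotide in the two orthologous sequences.
    same = sum(seq_a[i] == seq_b[i] for i in twofold_sites)
    return same / len(twofold_sites)

def trex_distance(f2):
    # Assumed form: for a symmetric two-state transition process, f2 decays
    # from 1 toward the 0.5 equilibrium as f2(t) = 0.5 * (1 + exp(-2kt)),
    # so the elapsed "distance" kt = -0.5 * ln(2*f2 - 1).
    return -0.5 * math.log(2.0 * f2 - 1.0)
```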

  18. Individuality evaluation for paper based artifact-metrics using transmitted light image

    Science.gov (United States)

    Yamakoshi, Manabu; Tanaka, Junichi; Furuie, Makoto; Hirabayashi, Masashi; Matsumoto, Tsutomu

    2008-02-01

    Artifact-metrics is an automated method of authenticating artifacts based on a measurable intrinsic characteristic. Intrinsic characteristics, such as microscopic random patterns made during the manufacturing process, are very difficult to copy. A transmitted light image of the fiber distribution can be used for artifact-metrics, since the fiber distribution of paper is random. Little is known about the individuality of the transmitted light image, although it is an important requirement for intrinsic-characteristic artifact-metrics. Measuring individuality requires that the intrinsic characteristic of each artifact differs significantly, so having sufficient individuality can make an artifact-metric system highly resistant to brute force attack. Here we investigate the influence of paper category, matching size of sample, and image resolution on the individuality of a transmitted light image of paper through a matching test using those images. More concretely, we evaluate FMR/FNMR curves by calculating similarity scores using correlation coefficients between pairs of scanner input images, and estimate the individuality of paper via EER with a probabilistic measure, through a matching method based on line segments, which can localize the influence of rotation gaps of a sample in the case of a large matching size. As a result, we found that the transmitted light image of paper has sufficient individuality.
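
    The similarity score described above (a correlation coefficient between paired scanner images) can be sketched as follows, with short flattened grayscale patches standing in for the transmitted light images:

```python
def correlation_coefficient(u, v):
    # Pearson correlation between two flattened grayscale image patches,
    # used as the similarity score between scanner captures of the same
    # (or different) sheets of paper.
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = sum((a - mu) ** 2 for a in u) ** 0.5
    sv = sum((b - mv) ** 2 for b in v) ** 0.5
    return cov / (su * sv)
```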

  19. Web page sorting algorithm based on query keyword distance relation

    Science.gov (United States)

    Yang, Han; Cui, Hong Gang; Tang, Hao

    2017-08-01

    In order to optimize web page sorting, we propose a query-keyword clustering idea based on the distance relationships among the search keywords appearing in a web page, which is converted into a degree of aggregation of the search keywords in the page. Building on the PageRank algorithm, the clustering degree factor of the query keywords is added so that it can participate in the quantitative calculation. This paper thus proposes an improved PageRank algorithm based on the distance relation between search keywords. The experimental results show the feasibility and effectiveness of the method.
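
    A minimal sketch of the idea: compute plain PageRank, then blend in a per-page keyword clustering degree. The blending rule and the weight alpha are assumptions for illustration, not the paper's exact formulation:

```python
def pagerank(links, d=0.85, iters=100):
    # Standard PageRank by power iteration.
    # links: dict page -> list of outbound pages (all pages appear as keys).
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = rank[p] / len(outs)
                for q in outs:
                    new[q] += d * share
            else:
                # Dangling page: spread its rank uniformly.
                for q in pages:
                    new[q] += d * rank[p] / n
        rank = new
    return rank

def keyword_adjusted_rank(rank, clustering, alpha=0.7):
    # Hypothetical combination: blend the link-based rank with a per-page
    # keyword clustering degree (both assumed normalized to [0, 1]).
    return {p: alpha * rank[p] + (1 - alpha) * clustering.get(p, 0.0)
            for p in rank}
```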

  20. Handwritten Digit Recognition using Edit Distance-Based KNN

    OpenAIRE

    Bernard , Marc; Fromont , Elisa; Habrard , Amaury; Sebban , Marc

    2012-01-01

    We discuss the student project given for the last 5 years to the 1st year Master Students which follow the Machine Learning lecture at the University Jean Monnet in Saint Etienne, France. The goal of this project is to develop a GUI that can recognize digits and/or letters drawn manually. The system is based on a string representation of the dig- its using Freeman codes and on the use of an edit-distance-based K-Nearest Neighbors classifier. In addition to the machine learning knowledge about...
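
    The classifier described can be sketched with a textbook Levenshtein edit distance over Freeman-code strings and a majority vote among the k nearest neighbors (the toy training strings below are illustrative):

```python
def levenshtein(a, b):
    # Classic dynamic-programming edit distance between two strings.
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]

def knn_classify(query, training, k=3):
    # training: list of (freeman_code_string, label) pairs;
    # majority vote among the k nearest codes under edit distance.
    nearest = sorted(training, key=lambda s: levenshtein(query, s[0]))[:k]
    labels = [lab for _, lab in nearest]
    return max(set(labels), key=labels.count)
```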

  1. Improved nonlinear fault detection strategy based on the Hellinger distance metric: Plug flow reactor monitoring

    KAUST Repository

    Harrou, Fouzi; Madakyaru, Muddu; Sun, Ying

    2017-01-01

    Fault detection has a vital role in the process industry to enhance productivity, efficiency, and safety, and to avoid expensive maintenance. This paper proposes an innovative multivariate fault detection method that can be used for monitoring
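
    The abstract is truncated here, but the Hellinger distance itself is standard: for discrete distributions p and q, H(p, q) = (1/sqrt(2)) * ||sqrt(p) - sqrt(q)||_2. A minimal sketch:

```python
import math

def hellinger(p, q):
    # Hellinger distance between two discrete probability distributions,
    # bounded in [0, 1]; 0 for identical and 1 for disjoint supports.
    return math.sqrt(0.5 * sum((math.sqrt(a) - math.sqrt(b)) ** 2
                               for a, b in zip(p, q)))
```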

  2. Large Scale Metric Learning for Distance-Based Image Classification on Open Ended Data Sets

    NARCIS (Netherlands)

    Mensink, T.; Verbeek, J.; Perronnin, F.; Csurka, G.; Farinella, G.M.; Battiato, S.; Cipolla, R.

    2013-01-01

    Many real-life large-scale datasets are open-ended and dynamic: new images are continuously added to existing classes, new classes appear over time, and the semantics of existing classes might evolve too. Therefore, we study large-scale image classification methods that can incorporate new classes

  3. Classification in medical images using adaptive metric k-NN

    Science.gov (United States)

    Chen, C.; Chernoff, K.; Karemore, G.; Lo, P.; Nielsen, M.; Lauze, F.

    2010-03-01

    The performance of the k-nearest neighbor (k-NN) classifier is highly dependent on the distance metric used to identify the k nearest neighbors of the query points. The standard Euclidean distance is commonly used in practice. This paper investigates the performance of the k-NN classifier with respect to different adaptive metrics in the context of medical imaging. We propose using adaptive metrics such that the structure of the data is better described, introducing some unsupervised learning knowledge into k-NN. Four different metrics are estimated: a theoretical metric based on the assumption that images are drawn from the Brownian Image Model (BIM), a normalized metric based on the variance of the data, an empirical metric based on the empirical covariance matrix of the unlabeled data, and an optimized metric obtained by minimizing the classification error. Principal Component Analysis (PCA) performed on the spectral structure of the empirical covariance further yields the subspace metrics. The metrics are evaluated on two data sets: lateral X-rays of the lumbar aortic/spine region, where we use k-NN for abdominal aorta calcification detection; and mammograms, where we use k-NN for breast cancer risk assessment. The results show that an appropriate choice of metric can improve classification.
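
    The metrics above all fit the same quadratic form d_M(x, y)^2 = (x - y)^T M (x - y), with M = identity recovering the squared Euclidean distance and M = diag(1/variance) the normalized metric. A minimal k-NN sketch (the toy data is illustrative):

```python
def metric_distance(x, y, M):
    # Squared distance under a symmetric positive-definite metric matrix M:
    # d_M(x, y)^2 = (x - y)^T M (x - y).
    diff = [a - b for a, b in zip(x, y)]
    n = len(diff)
    return sum(diff[i] * M[i][j] * diff[j]
               for i in range(n) for j in range(n))

def knn_predict(query, data, M, k=3):
    # data: list of (feature_vector, label) pairs; majority vote among
    # the k nearest neighbors under the metric induced by M.
    nearest = sorted(data, key=lambda s: metric_distance(query, s[0], M))[:k]
    labels = [lab for _, lab in nearest]
    return max(set(labels), key=labels.count)
```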

  4. AN ARTIFICIAL INTELLIGENCE-BASED DISTANCE EDUCATION SYSTEM: Artimat

    Directory of Open Access Journals (Sweden)

    Vasif NABIYEV

    2013-04-01

    Full Text Available The purpose of this study is to evaluate the artificial intelligence-based distance education system called ARTIMAT, which has been prepared in order to improve the mathematical problem solving skills of students, in terms of conceptual proficiency and ease of use, drawing on the opinions of teachers and students. The implementation was performed with 4 teachers and 59 students in 10th grade in an Anatolian High School in Trabzon. Many institutions and organizations around the world take distance education seriously alongside traditional education. It is inevitable that distance education will be used in teaching problem solving skills in this different dimension of education. In studies in Turkey and abroad in the field of mathematics teaching, problem solving skills are generally reported not to be at the desired level and are often said to be difficult to teach. For this reason, the difficulties of the students in problem solving were initially evaluated, and the system was prepared utilizing artificial intelligence algorithms according to the obtained results. In the evaluation of the findings obtained from the application, it was concluded that the system is responsive to the needs of the students and is successful in general, but that conceptual changes should be made so that students can adapt to the system quickly.

  5. APF-Based Car Following Behavior Considering Lateral Distance

    Directory of Open Access Journals (Sweden)

    Zhao-Sheng Yang

    2013-01-01

    Full Text Available Considering the influence of lateral distance on consecutive vehicles, this paper proposes a new car following model based on artificial potential field theory (APF). Traditional car following models all assume that the vehicles are driving along the middle of a lane. Different from the traditional car following principles, this incorporation of APF offers a potential breakthrough in the field of car following theory. An individual vehicle can be represented as a unit point charge in an electric field, and the interaction of the attractive potential energy and the repulsive potential energy between vehicles simplifies the various factors influencing the target vehicle in actual following behavior. Consequently, it allows a better analysis of the following behavior under lateral separation. The proposed model has been demonstrated in a simulation environment, through which the space-time trajectories and the potential energy change regulation are obtained. Simulations verify that the following vehicle's behavior is readily affected by lateral distance, where the attractive potential energy tends to become repulsive potential energy as the longitudinal distance decreases. The simulation results show that the proposed model quantifies the relations between headway and potential energy and better reflects the following process in real-world situations.

  6. Encyclopedia of distances

    CERN Document Server

    Deza, Michel Marie

    2014-01-01

    This updated and revised third edition of the leading reference volume on distance metrics includes new items from very active research areas in the use of distances and metrics such as geometry, graph theory, probability theory and analysis. Among the new topics included are, for example, polyhedral metric space, nearness matrix problems, distances between belief assignments, distance-related animal settings, diamond-cutting distances, natural units of length, Heidegger’s de-severance distance, and brain distances. The publication of this volume coincides with intensifying research efforts into metric spaces and especially distance design for applications. Accurate metrics have become a crucial goal in computational biology, image analysis, speech recognition and information retrieval. Leaving aside the practical questions that arise during the selection of a ‘good’ distance function, this work focuses on providing the research community with an invaluable comprehensive listing of the main available di...

  7. Information Entropy-Based Metrics for Measuring Emergences in Artificial Societies

    Directory of Open Access Journals (Sweden)

    Mingsheng Tang

    2014-08-01

    Full Text Available Emergence is a common phenomenon, and it is also a general and important concept in complex dynamic systems like artificial societies. Usually, artificial societies are used for assisting in resolving complex social issues (e.g., emergency management, intelligent transportation systems) with the aid of computer science. The level of an emergence may have an effect on decision making, and the occurrence and degree of an emergence are generally perceived by human observers. However, due to the ambiguity and inaccuracy of human observers, proposing a quantitative method to measure emergences in artificial societies is a meaningful and challenging task. This article mainly concentrates upon three kinds of emergences in artificial societies: emergence of attribution, emergence of behavior, and emergence of structure. Based on information entropy, three metrics have been proposed to measure emergences in a quantitative way. Meanwhile, the correctness of these metrics has been verified through three case studies (the spread of an infectious influenza, a dynamic microblog network, and a flock of birds) with several experimental simulations on the NetLogo platform. These experimental results confirm that these metrics increase with the rising degree of emergences. In addition, this article also discusses the limitations and extended applications of these metrics.
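
    Information-entropy metrics of this kind typically start from Shannon entropy over the empirical distribution of agent states: a uniform spread of states gives maximal entropy, while a single shared state gives zero. A minimal sketch (how the paper aggregates per-type entropies into its three metrics is not reproduced here):

```python
import math
from collections import Counter

def shannon_entropy(states):
    # H = -sum(p_i * log2(p_i)) over the empirical distribution of the
    # observed agent states (attributes, behaviors, or structural roles).
    counts = Counter(states)
    n = len(states)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```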

  8. Contaminant classification using cosine distances based on multiple conventional sensors.

    Science.gov (United States)

    Liu, Shuming; Che, Han; Smith, Kate; Chang, Tian

    2015-02-01

    Emergent contamination events have a significant impact on water systems. After contamination detection, it is important to classify the type of contaminant quickly to provide support for remediation attempts. Conventional methods generally either rely on laboratory-based analysis, which requires a long analysis time, or on multivariable-based geometry analysis and sequence analysis, which are prone to being affected by the contaminant concentration. This paper proposes a new contaminant classification method, which discriminates contaminants in real time, independent of the contaminant concentration. The proposed method quantifies the similarities or dissimilarities between sensors' responses to different types of contaminants. The performance of the proposed method was evaluated using data from contaminant injection experiments in a laboratory and compared with a Euclidean distance-based method. The robustness of the proposed method was evaluated using an uncertainty analysis. The results show that the proposed method performed better in identifying the type of contaminant than the Euclidean distance-based method and that it could classify the type of contaminant in minutes without significantly compromising the correct classification rate (CCR).
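
    The concentration independence comes from cosine distance being invariant to uniform scaling of a sensor-response vector. A minimal sketch (contaminant names and reference vectors are illustrative, not from the paper):

```python
import math

def cosine_distance(u, v):
    # 1 - cosine similarity; invariant to scaling either vector, which is
    # why the classification is insensitive to contaminant concentration.
    dot = sum(x * y for x, y in zip(u, v))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(y * y for y in v))
    return 1.0 - dot / (nu * nv)

def classify(response, library):
    # library: dict mapping contaminant name -> reference sensor-response
    # vector; return the contaminant with the smallest cosine distance.
    return min(library, key=lambda name: cosine_distance(response, library[name]))
```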

  9. Normalized compression distance of multisets with applications

    NARCIS (Netherlands)

    Cohen, A.R.; Vitányi, P.M.B.

    Pairwise normalized compression distance (NCD) is a parameter-free, feature-free, alignment-free, similarity metric based on compression. We propose an NCD of multisets that is also metric. Previously, attempts to obtain such an NCD failed. For classification purposes it is superior to the pairwise
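
    For reference, the pairwise NCD is NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)), where C(s) is the compressed length of s. A minimal sketch using zlib as the compressor:

```python
import zlib

def ncd(x, y):
    # Normalized compression distance between two byte strings, with the
    # compressed length under zlib approximating Kolmogorov complexity.
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)
```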

  10. A Game Map Complexity Measure Based on Hamming Distance

    Science.gov (United States)

    Li, Yan; Su, Pan; Li, Wenliang

    With the booming PC game market, game AI has attracted more and more research. The interest and difficulty of a game are related to the map used in its scenarios. Besides, the path-finding efficiency in a game is also affected by the complexity of the map used. In this paper, a novel complexity measure based on Hamming distance, called the Hamming complexity, is introduced. This measure is able to estimate the complexity of a binary tileworld. We experimentally demonstrate that Hamming complexity is highly correlated with the efficiency of the A* algorithm, and therefore it is a useful reference for designers when developing a game map.
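
    A minimal sketch of a Hamming-distance building block for binary tile maps; the aggregation into a single complexity score (mean distance between adjacent rows) is an assumed reading for illustration, not necessarily the paper's exact definition:

```python
def hamming_distance(row_a, row_b):
    # Number of tile positions where two equal-length binary rows differ.
    assert len(row_a) == len(row_b)
    return sum(a != b for a, b in zip(row_a, row_b))

def hamming_complexity(grid):
    # Assumed aggregation: mean Hamming distance between adjacent rows of
    # the binary tile grid; uniform grids score 0.
    if len(grid) < 2:
        return 0.0
    total = sum(hamming_distance(grid[i], grid[i + 1])
                for i in range(len(grid) - 1))
    return total / (len(grid) - 1)
```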

  11. Concordance-based Kendall's Correlation for Computationally-Light vs. Computationally-Heavy Centrality Metrics: Lower Bound for Correlation

    Directory of Open Access Journals (Sweden)

    Natarajan Meghanathan

    2017-01-01

    Full Text Available We identify three different levels of correlation (pair-wise relative ordering, network-wide ranking and linear regression) that could be assessed between a computationally-light centrality metric and a computationally-heavy centrality metric for real-world networks. The Kendall's concordance-based correlation measure could be used to quantitatively assess how well we could consider the relative ordering of two vertices vi and vj with respect to a computationally-light centrality metric as the relative ordering of the same two vertices with respect to a computationally-heavy centrality metric. We hypothesize that the pair-wise relative ordering (concordance-based) assessment of the correlation between centrality metrics is the strictest of all three levels of correlation and claim that the Kendall's concordance-based correlation coefficient will be lower than the correlation coefficients observed with the more relaxed levels of correlation measures (the linear regression-based Pearson's product-moment correlation coefficient and the network-wide ranking-based Spearman's correlation coefficient). We validate our hypothesis by evaluating the three correlation coefficients between two sets of centrality metrics: the computationally-light degree and local clustering coefficient complement-based degree centrality metrics, and the computationally-heavy eigenvector centrality, betweenness centrality and closeness centrality metrics, for a diverse collection of 50 real-world networks.
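
    The concordance-based measure can be sketched directly from its definition: count concordant and discordant pairs over all n(n-1)/2 vertex pairs (ties are ignored for clarity; the tie-corrected tau-b variant differs):

```python
from itertools import combinations

def kendall_tau(x, y):
    # Concordance-based correlation: (concordant - discordant) pairs over
    # all n*(n-1)/2 pairs; x and y are the two centrality score lists
    # for the same vertices.
    concordant = discordant = 0
    for (xi, yi), (xj, yj) in combinations(zip(x, y), 2):
        s = (xi - xj) * (yi - yj)
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    n = len(x)
    return (concordant - discordant) / (n * (n - 1) / 2)
```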

  12. Supplier selection using different metric functions

    Directory of Open Access Journals (Sweden)

    Omosigho S.E.

    2015-01-01

    Full Text Available Supplier selection is an important component of supply chain management in today’s global competitive environment. Hence, the evaluation and selection of suppliers have received considerable attention in the literature. Many attributes of suppliers, other than cost, are considered in the evaluation and selection process. Therefore, the process of evaluation and selection of suppliers is a multi-criteria decision making process. The methodology adopted to solve the supplier selection problem is intuitionistic fuzzy TOPSIS (Technique for Order Preference by Similarity to the Ideal Solution). Generally, TOPSIS is based on the concept of minimum distance from the positive ideal solution and maximum distance from the negative ideal solution. We examine the deficiencies of using only one metric function in TOPSIS and propose the use of a spherical metric function in addition to the commonly used metric functions. For empirical supplier selection problems, more than one metric function should be used.
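
    The core TOPSIS step is the closeness coefficient built from separation measures to the positive and negative ideal solutions; the choice of metric function enters through those separations. A minimal sketch with the common Euclidean choice (the paper's spherical metric is not reproduced here):

```python
import math

def topsis_closeness(row, ideal_pos, ideal_neg):
    # Closeness coefficient: d- / (d+ + d-), where d+ and d- are the
    # separations of the weighted-normalized alternative `row` from the
    # positive and negative ideal solutions. 1 is best, 0 is worst.
    d_pos = math.dist(row, ideal_pos)
    d_neg = math.dist(row, ideal_neg)
    return d_neg / (d_pos + d_neg)
```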

  13. Metric-based approach and tool for modeling the I and C system using Markov chains

    International Nuclear Information System (INIS)

    Butenko, Valentyna; Kharchenko, Vyacheslav; Odarushchenko, Elena; Butenko, Dmitriy

    2015-01-01

    Markov chains (MC) are well-known and widely applied in dependability and performability analysis of safety-critical systems because of their flexible representation of system component dependencies and synchronization. There are a few roadblocks to greater application of the MC: accounting for additional system components increases the model state space and complicates analysis; and the non-numerically sophisticated user may find it difficult to decide among the variety of numerical methods to determine the most suitable and accurate for their application. Thus obtaining highly accurate and trusted modeling results becomes a nontrivial task. In this paper, we present a metric-based approach for selecting the applicable solution approach, based on the analysis of MC stiffness, decomposability, sparsity and fragmentedness. Using this selection procedure the modeler can verify earlier obtained results. The presented approach was implemented in the utility MSMC, which supports MC construction, metric-based analysis, recommendation shaping and model solution. The model can be exported to well-known off-the-shelf mathematical packages for verification. The paper presents a case study of an industrial NPP I and C system, manufactured by RPC Radiy. The paper shows an application of the metric-based approach and the MSMC tool for dependability and safety analysis of RTS, and the procedure of results verification. (author)

  14. Comparison of SOAP and REST Based Web Services Using Software Evaluation Metrics

    Directory of Open Access Journals (Sweden)

    Tihomirovs Juris

    2016-12-01

    Full Text Available The usage of Web services has recently increased. Therefore, it is important to select the right type of Web services at the project design stage. The most common implementations are based on the SOAP (Simple Object Access Protocol) and REST (Representational State Transfer) styles. Maintainability of REST and SOAP Web services has become an important issue as the popularity of Web services increases. Choice of the right approach is not an easy decision, since it is influenced by development requirements and maintenance considerations. In the present research, we present a comparison of SOAP and REST based Web services using software evaluation metrics. To achieve this aim, a systematic literature review is made to compare REST and SOAP Web services in terms of software evaluation metrics.

  15. Model-Based Referenceless Quality Metric of 3D Synthesized Images Using Local Image Description.

    Science.gov (United States)

    Gu, Ke; Jakhetiya, Vinit; Qiao, Jun-Fei; Li, Xiaoli; Lin, Weisi; Thalmann, Daniel

    2017-07-28

    New challenges have been brought out along with the emergence of 3D-related technologies such as virtual reality (VR), augmented reality (AR), and mixed reality (MR). Free viewpoint video (FVV), due to its applications in remote surveillance, remote education, etc., based on the flexible selection of direction and viewpoint, has been perceived as the development direction of next-generation video technologies and has drawn a wide range of researchers' attention. Since FVV images are synthesized via a depth image-based rendering (DIBR) procedure in the "blind" environment (without reference images), a reliable real-time blind quality evaluation and monitoring system is urgently required. But existing assessment metrics do not render human judgments faithfully, mainly because geometric distortions are generated by DIBR. To this end, this paper proposes a novel referenceless quality metric of DIBR-synthesized images using autoregression (AR)-based local image description. It was found that, after AR prediction, the reconstruction error between a DIBR-synthesized image and its AR-predicted image can accurately capture the geometric distortion. Visual saliency is then leveraged to improve the proposed blind quality metric by a sizable margin. Experiments validate the superiority of our no-reference quality method as compared with prevailing full-, reduced- and no-reference models.

  16. New exposure-based metric approach for evaluating O3 risk to North American aspen forests

    International Nuclear Information System (INIS)

    Percy, K.E.; Nosal, M.; Heilman, W.; Dann, T.; Sober, J.; Legge, A.H.; Karnosky, D.F.

    2007-01-01

    The United States and Canada currently use exposure-based metrics to protect vegetation from O3. Using 5 years (1999-2003) of co-measured O3, meteorology and growth response, we have developed exposure-based regression models that predict Populus tremuloides growth change within the North American ambient air quality context. The models comprised growing season fourth-highest daily maximum 8-h average O3 concentration, growing degree days, and wind speed. They had high statistical significance, high goodness of fit, include 95% confidence intervals for tree growth change, and are simple to use. Averaged across a wide range of clonal sensitivity, historical 2001-2003 growth change over most of the 26 Mha P. tremuloides distribution was estimated to have ranged from no impact (0%) to strong negative impacts (-31%). With four aspen clones responding negatively (one responded positively) to O3, the growing season fourth-highest daily maximum 8-h average O3 concentration performed much better than growing season SUM06, AOT40 or maximum 1-h average O3 concentration metrics as a single indicator of aspen stem cross-sectional area growth. - A new exposure-based metric approach to predict O3 risk to North American aspen forests has been developed

  17. On the importance of the distance measures used to train and test knowledge-based potentials for proteins.

    Directory of Open Access Journals (Sweden)

    Martin Carlsen

    Full Text Available Knowledge-based potentials are energy functions derived from the analysis of databases of protein structures and sequences. They can be divided into two classes. Potentials from the first class are based on a direct conversion of the distributions of some geometric properties observed in native protein structures into energy values, while potentials from the second class are trained to mimic quantitatively the geometric differences between incorrectly folded models and native structures. In this paper, we focus on the relationship between energy and geometry when training the second class of knowledge-based potentials. We assume that the difference in energy between a decoy structure and the corresponding native structure is linearly related to the distance between the two structures. We trained two distance-based knowledge-based potentials accordingly, one based on all inter-residue distances (PPD), while the other had the set of all distances filtered to reflect consistency in an ensemble of decoys (PPE). We tested four types of metric to characterize the distance between the decoy and the native structure, two based on extrinsic geometry (RMSD and GDT-TS*), and two based on intrinsic geometry (Q* and MT). The corresponding eight potentials were tested on a large collection of decoy sets. We found that it is usually better to train a potential using an intrinsic distance measure. We also found that PPE outperforms PPD, emphasizing the benefits of capturing consistent information in an ensemble. The relevance of these results for the design of knowledge-based potentials is discussed.

  18. Cloud-based Computing and Applications of New Snow Metrics for Societal Benefit

    Science.gov (United States)

    Nolin, A. W.; Sproles, E. A.; Crumley, R. L.; Wilson, A.; Mar, E.; van de Kerk, M.; Prugh, L.

    2017-12-01

    Seasonal and interannual variability in snow cover affects socio-environmental systems including water resources, forest ecology, freshwater and terrestrial habitat, and winter recreation. We have developed two new seasonal snow metrics: snow cover frequency (SCF) and snow disappearance date (SDD). These metrics are calculated at 500-m resolution using NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) snow cover data (MOD10A1). SCF is the number of times snow is observed in a pixel over the user-defined observation period. SDD is the last date of observed snow in a water year. These pixel-level metrics are calculated rapidly and globally in the Google Earth Engine cloud-based environment. SCF and SDD can be interactively visualized in a map-based interface, allowing users to explore spatial and temporal snow cover patterns from 2000-present. These metrics are especially valuable in regions where snow data are sparse or non-existent. We have used these metrics in several ongoing projects. When SCF was linked with a simple hydrologic model in the La Laguna watershed in northern Chile, it successfully predicted summer low flows with a Nash-Sutcliffe value of 0.86. SCF has also been used to help explain changes in Dall sheep populations in Alaska where sheep populations are negatively impacted by late snow cover and low snowline elevation during the spring lambing season. In forest management, SCF and SDD appear to be valuable predictors of post-wildfire vegetation growth. We see a positive relationship between winter SCF and subsequent summer greening for several years post-fire. For western US winter recreation, we are exploring trends in SDD and SCF for regions where snow sports are economically important. In a world with declining snowpacks and increasing uncertainty, these metrics extend across elevations and fill data gaps to provide valuable information for decision-making.
SCF and SDD are being produced so that anyone with Internet access and a Google
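
The two metrics as defined above are straightforward to state; a minimal per-pixel sketch, assuming a binary time series of snow observations (names are illustrative, not from the MOD10A1 toolchain):

```python
def snow_metrics(dates, snow_flags):
    """SCF: number of times snow is observed over the observation period.
    SDD: last date on which snow is observed (None if never snow-covered)."""
    scf = sum(1 for s in snow_flags if s)
    sdd = None
    for d, s in zip(dates, snow_flags):
        if s:
            sdd = d  # keep overwriting; the final value is the last snow date
    return scf, sdd
```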

  19. Utility of ck metrics in predicting size of board-based software games

    International Nuclear Information System (INIS)

    Sabhat, N.; Azam, F.; Malik, A.A.

    2017-01-01

    Software size is one of the most important inputs of many software cost and effort estimation models. Early estimation of software plays an important role at the time of project inception. An accurate estimate of software size is, therefore, crucial for planning, managing, and controlling software development projects dealing with the development of software games. However, software size is unavailable during the early phase of software development. This research determines the utility of CK (Chidamber and Kemerer) metrics, a well-known suite of object-oriented metrics, in estimating the size of software applications using the information from its UML (Unified Modeling Language) class diagram. This work focuses on a small subset dealing with board-based software games. Almost sixty games written using an object-oriented programming language are downloaded from open source repositories, analyzed and used to calibrate a regression-based size estimation model. Forward stepwise MLR (Multiple Linear Regression) is used for model fitting. The model thus obtained is assessed using a variety of accuracy measures such as MMRE (Mean Magnitude of Relative Error), Prediction at level x (PRED(x)), MdMRE (Median Magnitude of Relative Error) and validated using K-fold cross validation. The accuracy of this model is also compared with an existing model tailored for size estimation of board games. Based on a small subset of desktop games developed in various object-oriented languages, we obtained a model using CK metrics and forward stepwise multiple linear regression with reasonable estimation accuracy as indicated by the value of the coefficient of determination (R2 = 0.756). Comparison results indicate that the existing size estimation model outperforms the model derived using CK metrics in terms of accuracy of prediction. (author)
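
The accuracy measures named above are standard and compact; a minimal sketch, with MRE defined as |actual - estimated| / actual:

```python
import statistics

def mre(actual, estimated):
    # Magnitude of relative error for a single observation
    return abs(actual - estimated) / actual

def mmre(actuals, estimates):
    # Mean magnitude of relative error
    return sum(map(mre, actuals, estimates)) / len(actuals)

def mdmre(actuals, estimates):
    # Median magnitude of relative error (robust to a few large misses)
    return statistics.median(map(mre, actuals, estimates))

def pred(actuals, estimates, x=0.25):
    # PRED(x): fraction of estimates whose MRE is at most x
    return sum(mre(a, e) <= x for a, e in zip(actuals, estimates)) / len(actuals)
```

PRED(0.25) is the conventional threshold in the estimation literature: the share of projects estimated within 25% of their actual size.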

  20. Application of Metric-based Software Reliability Analysis to Example Software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Smidts, Carol

    2008-07-01

    The software reliability of the TELLERFAST ATM software is analyzed by using two metric-based software reliability analysis methods: a state transition diagram-based method and a test coverage-based method. The procedures for software reliability analysis using the two methods and the analysis results are provided in this report. It is found that the two methods are complementary; further research on combining the two methods, so that software reliability analysis benefits from this complementary effect, is therefore recommended

  1. Compression-based classification of biological sequences and structures via the Universal Similarity Metric: experimental assessment

    Directory of Open Access Journals (Sweden)

    Manzini Giovanni

    2007-07-01

    Full Text Available Abstract Background Similarity of sequences is a key mathematical notion for Classification and Phylogenetic studies in Biology. It is currently primarily handled using alignments. However, the alignment methods seem inadequate for post-genomic studies since they do not scale well with data set size and they seem to be confined only to genomic and proteomic sequences. Therefore, alignment-free similarity measures are actively pursued. Among those, USM (Universal Similarity Metric) has gained prominence. It is based on the deep theory of Kolmogorov Complexity and universality is its most novel striking feature. Since it can only be approximated via data compression, USM is a methodology rather than a formula quantifying the similarity of two strings. Three approximations of USM are available, namely UCD (Universal Compression Dissimilarity), NCD (Normalized Compression Dissimilarity) and CD (Compression Dissimilarity). Their applicability and robustness are tested on various data sets yielding a first massive quantitative estimate that the USM methodology and its approximations are of value. Despite the rich theory developed around USM, its experimental assessment has limitations: only a few data compressors have been tested in conjunction with USM and mostly at a qualitative level, no comparison among UCD, NCD and CD is available and no comparison of USM with existing methods, both alignment-based and not, seems to be available. Results We experimentally test the USM methodology by using 25 compressors, all three of its known approximations and six data sets of relevance to Molecular Biology. This offers the first systematic and quantitative experimental assessment of this methodology, that naturally complements the many theoretical and the preliminary experimental results available. Moreover, we compare the USM methodology both with alignment-based methods and with alignment-free ones. We may group our experiments into two sets.
The first one, performed via ROC

  2. Compression-based classification of biological sequences and structures via the Universal Similarity Metric: experimental assessment.

    Science.gov (United States)

    Ferragina, Paolo; Giancarlo, Raffaele; Greco, Valentina; Manzini, Giovanni; Valiente, Gabriel

    2007-07-13

    Similarity of sequences is a key mathematical notion for Classification and Phylogenetic studies in Biology. It is currently primarily handled using alignments. However, the alignment methods seem inadequate for post-genomic studies since they do not scale well with data set size and they seem to be confined only to genomic and proteomic sequences. Therefore, alignment-free similarity measures are actively pursued. Among those, USM (Universal Similarity Metric) has gained prominence. It is based on the deep theory of Kolmogorov Complexity and universality is its most novel striking feature. Since it can only be approximated via data compression, USM is a methodology rather than a formula quantifying the similarity of two strings. Three approximations of USM are available, namely UCD (Universal Compression Dissimilarity), NCD (Normalized Compression Dissimilarity) and CD (Compression Dissimilarity). Their applicability and robustness are tested on various data sets yielding a first massive quantitative estimate that the USM methodology and its approximations are of value. Despite the rich theory developed around USM, its experimental assessment has limitations: only a few data compressors have been tested in conjunction with USM and mostly at a qualitative level, no comparison among UCD, NCD and CD is available and no comparison of USM with existing methods, both alignment-based and not, seems to be available. We experimentally test the USM methodology by using 25 compressors, all three of its known approximations and six data sets of relevance to Molecular Biology. This offers the first systematic and quantitative experimental assessment of this methodology, that naturally complements the many theoretical and the preliminary experimental results available. Moreover, we compare the USM methodology both with alignment-based methods and with alignment-free ones. We may group our experiments into two sets.
The first one, performed via ROC (Receiver Operating Curve) analysis, aims at
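
Of the three USM approximations named above, NCD is the most widely used; a minimal sketch with zlib standing in for the compressor (a real study would use the stronger compressors evaluated in the paper):

```python
import zlib

def clen(s: str) -> int:
    # Compressed length as a computable proxy for Kolmogorov complexity
    return len(zlib.compress(s.encode()))

def ncd(x: str, y: str) -> float:
    # Normalized Compression Dissimilarity:
    # NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))
    cx, cy, cxy = clen(x), clen(y), clen(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)
```

Similar strings compress well when concatenated, driving NCD toward 0; unrelated strings approach 1 (real compressors can slightly exceed 1 because of framing overhead).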

  3. Metrics for Probabilistic Geometries

    DEFF Research Database (Denmark)

    Tosi, Alessandra; Hauberg, Søren; Vellido, Alfredo

    2014-01-01

    the distribution over mappings is given by a Gaussian process. We treat the corresponding latent variable model as a Riemannian manifold and we use the expectation of the metric under the Gaussian process prior to define interpolating paths and measure distance between latent points. We show how distances...

  4. Kerr metric in cosmological background

    Energy Technology Data Exchange (ETDEWEB)

    Vaidya, P C [Gujarat Univ., Ahmedabad (India). Dept. of Mathematics

    1977-06-01

    A metric satisfying Einstein's equation is given which in the vicinity of the source reduces to the well-known Kerr metric and which at large distances reduces to the Robertson-Walker metric of a homogeneous cosmological model. The radius of the event horizon of the Kerr black hole in the cosmological background is found.
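
For reference, the Kerr line element that the solution approaches near the source, written in Boyer-Lindquist coordinates with geometrized units (G = c = 1), is:

```latex
ds^2 = -\Big(1-\frac{2Mr}{\Sigma}\Big)\,dt^2
 - \frac{4Mar\sin^2\theta}{\Sigma}\,dt\,d\phi
 + \frac{\Sigma}{\Delta}\,dr^2 + \Sigma\,d\theta^2
 + \Big(r^2+a^2+\frac{2Ma^2 r\sin^2\theta}{\Sigma}\Big)\sin^2\theta\,d\phi^2,
\qquad \Sigma \equiv r^2 + a^2\cos^2\theta, \quad \Delta \equiv r^2 - 2Mr + a^2.
```

In the isolated (non-cosmological) case the event horizon sits at the larger root of Delta = 0, namely r_+ = M + sqrt(M^2 - a^2); the paper's contribution is locating the corresponding horizon once the cosmological background is attached.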

  5. EPR-based distance measurements at ambient temperature.

    Science.gov (United States)

    Krumkacheva, Olesya; Bagryanskaya, Elena

    2017-07-01

    Pulsed dipolar (PD) EPR spectroscopy is a powerful technique allowing for distance measurements between spin labels in the range of 2.5-10.0 nm. It was proposed more than 30 years ago, and nowadays is widely used in biophysics and materials science. Until recently, PD EPR experiments were limited to cryogenic temperatures; recent work has extended distance measurements to ambient temperature using PD EPR as well as other approaches based on EPR (e.g., relaxation enhancement; RE). In this paper, we review the features of PD EPR and RE at ambient temperatures, in particular, requirements on electron spin phase memory time, ways of immobilization of biomolecules, the influence of a linker between the spin probe and biomolecule, and future opportunities.

  6. A Novel Riemannian Metric Based on Riemannian Structure and Scaling Information for Fixed Low-Rank Matrix Completion.

    Science.gov (United States)

    Mao, Shasha; Xiong, Lin; Jiao, Licheng; Feng, Tian; Yeung, Sai-Kit

    2017-05-01

    Riemannian optimization has been widely used to deal with the fixed low-rank matrix completion problem, and the Riemannian metric is a crucial factor in obtaining the search direction in Riemannian optimization. This paper proposes a new Riemannian metric via simultaneously considering the Riemannian geometry structure and the scaling information, which is smoothly varying and invariant along the equivalence class. The proposed metric can make a tradeoff between the Riemannian geometry structure and the scaling information effectively. Essentially, it can be viewed as a generalization of some existing metrics. Based on the proposed Riemannian metric, we also design a Riemannian nonlinear conjugate gradient algorithm, which can efficiently solve the fixed low-rank matrix completion problem. Experiments on fixed low-rank matrix completion, collaborative filtering, and image and video recovery illustrate that the proposed method is superior to the state-of-the-art methods in convergence efficiency and numerical performance.

  7. Dialect distances based on orthographic and phonetic transcriptions

    CSIR Research Space (South Africa)

    Zulu, N

    2006-11-01

    Full Text Available ... transcription segments were compared using the algorithm. In 2003, Gooskens and Heeringa [5] calculated Levenshtein distances between 15 Norwegian dialects and compared them to the distances as perceived by Norwegian listeners ... by a clustering algorithm. Figure 2 illustrates the dendrogram derived from the clustering of perceptual distances as perceived by Norwegian listeners for the 15 Norwegian dialects investigated in this research [6].
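
Levenshtein distance, the workhorse of such dialectometric comparisons, counts the minimum number of insertions, deletions, and substitutions needed to turn one transcription into another; a compact dynamic-programming sketch:

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum edit distance between strings a and b
    (unit cost for insertion, deletion, and substitution)."""
    prev = list(range(len(b) + 1))  # distances from "" to prefixes of b
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution/match
        prev = cur
    return prev[-1]
```

Dialectometric studies typically normalize this raw count by alignment length before averaging over word pairs, so that long words do not dominate the dialect distance.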

  8. Two projects in theoretical neuroscience: A convolution-based metric for neural membrane potentials and a combinatorial connectionist semantic network method

    Science.gov (United States)

    Evans, Garrett Nolan

    In this work, I present two projects that both contribute to the aim of discovering how intelligence manifests in the brain. The first project is a method for analyzing recorded neural signals, which takes the form of a convolution-based metric on neural membrane potential recordings. Relying only on integral and algebraic operations, the metric compares the timing and number of spikes within recordings as well as the recordings' subthreshold features: summarizing differences in these with a single "distance" between the recordings. Like van Rossum's (2001) metric for spike trains, the metric is based on a convolution operation that it performs on the input data. The kernel used for the convolution is carefully chosen such that it produces a desirable frequency space response and, unlike van Rossum's kernel, causes the metric to be first order both in differences between nearby spike times and in differences between same-time membrane potential values: an important trait. The second project is a combinatorial syntax method for connectionist semantic network encoding. Combinatorial syntax has been a point on which those who support a symbol-processing view of intelligent processing and those who favor a connectionist view have had difficulty seeing eye-to-eye. Symbol-processing theorists have persuasively argued that combinatorial syntax is necessary for certain intelligent mental operations, such as reasoning by analogy. Connectionists have focused on the versatility and adaptability offered by self-organizing networks of simple processing units. With this project, I show that there is a way to reconcile the two perspectives and to ascribe a combinatorial syntax to a connectionist network. The critical principle is to interpret nodes, or units, in the connectionist network as bound integrations of the interpretations for nodes that they share links with. 
Nodes need not correspond exactly to neurons and may correspond instead to distributed sets, or assemblies, of
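
For context, the classic van Rossum (2001) spike-train distance that the proposed membrane-potential metric generalizes can be sketched as follows (a discretized version; the kernel and normalization choices here are illustrative, not the paper's carefully chosen kernel):

```python
import numpy as np

def van_rossum_distance(train_a, train_b, tau=1.0, dt=0.01, t_max=10.0):
    """Convolve each spike train with a causal exponential kernel and
    take the L2 distance between the resulting waveforms."""
    t = np.arange(0.0, t_max, dt)

    def filtered(train):
        f = np.zeros_like(t)
        for s in train:
            # each spike contributes exp(-(t - s)/tau) for t >= s
            f += np.where(t >= s, np.exp(-(t - s) / tau), 0.0)
        return f

    diff = filtered(train_a) - filtered(train_b)
    return float(np.sqrt(np.sum(diff ** 2) * dt / tau))
```

The metric in the text extends this idea from spike times alone to full membrane potential recordings, so that subthreshold differences also contribute to the distance.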

  9. Vehicular Networking Enhancement And Multi-Channel Routing Optimization, Based on Multi-Objective Metric and Minimum Spanning Tree

    Directory of Open Access Journals (Sweden)

    Peppino Fazio

    2013-01-01

    Full Text Available Vehicular Ad hoc NETworks (VANETs) represent a particular mobile technology that permits communication among vehicles, offering security and comfort. Nowadays, distributed mobile wireless computing is becoming a very important communications paradigm, due to its flexibility to adapt to different mobile applications. VANETs are a practical example of data exchange among real mobile nodes. To enable communications within an ad-hoc network, characterized by continuous node movements, routing protocols are needed to react to frequent changes in network topology. In this paper, the attention is focused mainly on the network layer of VANETs, proposing a novel approach to reduce the interference level during mobile transmission, based on the multi-channel nature of the IEEE 802.11p (1609.4) standard. In this work a new routing protocol based on the Distance Vector algorithm is presented to reduce the end-to-end delay and to increase the packet delivery ratio (PDR) and throughput in VANETs. A new metric is also proposed, based on the maximization of the average Signal-to-Interference Ratio (SIR) level and the link duration probability between two VANET nodes. In order to relieve the effects of the co-channel interference perceived by mobile nodes, transmission channels are switched on the basis of a periodical SIR evaluation. A network simulator has been used for implementing and testing the proposed idea.

  10. Inferring feature relevances from metric learning

    DEFF Research Database (Denmark)

    Schulz, Alexander; Mokbel, Bassam; Biehl, Michael

    2015-01-01

    Powerful metric learning algorithms have been proposed in the last years which do not only greatly enhance the accuracy of distance-based classifiers and nearest neighbor database retrieval, but which also enable the interpretability of these operations by assigning explicit relevance weights...

  11. Femtosecond frequency comb based distance measurement in air

    NARCIS (Netherlands)

    Balling, P.; Kren, P.; Masika, P.; van den Berg, S.A.

    2009-01-01

    Interferometric measurement of distance using a femtosecond frequency comb is demonstrated and compared with a counting interferometer displacement measurement. A numerical model of pulse propagation in air is developed and the results are compared with experimental data for short distances. The

  12. The Hidden Flow Structure and Metric Space of Network Embedding Algorithms Based on Random Walks.

    Science.gov (United States)

    Gu, Weiwei; Gong, Li; Lou, Xiaodan; Zhang, Jiang

    2017-10-13

    Network embedding, which encodes all vertices in a network as a set of numerical vectors in accordance with its local and global structures, has drawn widespread attention. Network embedding not only learns significant features of a network, such as clustering and link prediction, but also learns the latent vector representation of the nodes, which provides theoretical support for a variety of applications, such as visualization, link prediction, node classification, and recommendation. As the latest progress of the research, several algorithms based on random walks have been devised. Although those algorithms have drawn much attention for their high scores in learning efficiency and accuracy, there is still a lack of theoretical explanation, and the transparency of those algorithms has been doubted. Here, we propose an approach based on the open-flow network model to reveal the underlying flow structure and its hidden metric space for different random walk strategies on networks. We show that the essence of embedding based on random walks is the latent metric structure defined on the open-flow network. This not only deepens our understanding of random-walk-based embedding algorithms but also helps in finding new potential applications in network embedding.
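
The random-walk corpus that such embedding algorithms feed into a skip-gram model can be sketched as follows (uniform first-order walks as in DeepWalk; node2vec would bias the transition choice, and the paper's open-flow analysis studies the statistics of exactly these walks):

```python
import random

def random_walks(adj, walk_len=5, walks_per_node=2, seed=0):
    """Generate uniform random walks over a graph given as an
    adjacency dict {node: [neighbours]}."""
    rng = random.Random(seed)
    walks = []
    for start in adj:
        for _ in range(walks_per_node):
            walk = [start]
            while len(walk) < walk_len:
                nbrs = adj[walk[-1]]
                if not nbrs:  # dead end: stop the walk early
                    break
                walk.append(rng.choice(nbrs))
            walks.append(walk)
    return walks
```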

  13. Femtosecond frequency comb based distance measurement in air.

    Science.gov (United States)

    Balling, Petr; Kren, Petr; Masika, Pavel; van den Berg, S A

    2009-05-25

    Interferometric measurement of distance using a femtosecond frequency comb is demonstrated and compared with a counting interferometer displacement measurement. A numerical model of pulse propagation in air is developed and the results are compared with experimental data for short distances. The relative agreement for distance measurement in known laboratory conditions is better than 10^-7. According to the model, similar precision seems feasible even for long-distance measurement in air if conditions are sufficiently known. It is demonstrated that the relative width of the interferogram envelope even decreases with the measured length, and a fringe contrast higher than 90% could be obtained for kilometer distances in air, if optimal spectral width for that length and wavelength is used. The possibility of comb radiation delivery to the interferometer by an optical fiber is shown by model and experiment, which is important from a practical point of view.

  14. Analysis on the Metrics used in Optimizing Electronic Business based on Learning Techniques

    Directory of Open Access Journals (Sweden)

    Irina-Steliana STAN

    2014-09-01

    Full Text Available The present paper proposes a methodology for analyzing metrics related to electronic business. The draft optimization models include KPIs that can highlight business specifics, provided they are integrated using learning-based techniques. Having identified the most important and high-impact elements of the business, the models should ultimately capture the links between them by automating business flows. Human resources will increasingly collaborate with the optimization models, which will translate into high-quality decisions followed by increased profitability.

  15. Comparison of continuous versus categorical tumor measurement-based metrics to predict overall survival in cancer treatment trials

    Science.gov (United States)

    An, Ming-Wen; Mandrekar, Sumithra J.; Branda, Megan E.; Hillman, Shauna L.; Adjei, Alex A.; Pitot, Henry; Goldberg, Richard M.; Sargent, Daniel J.

    2011-01-01

    Purpose The categorical definition of response assessed via the Response Evaluation Criteria in Solid Tumors has documented limitations. We sought to identify alternative metrics for tumor response that improve prediction of overall survival. Experimental Design Individual patient data from three North Central Cancer Treatment Group trials (N0026, n=117; N9741, n=1109; N9841, n=332) were used. Continuous metrics of tumor size based on longitudinal tumor measurements were considered in addition to a trichotomized response (TriTR: Response vs. Stable vs. Progression). Cox proportional hazards models, adjusted for treatment arm and baseline tumor burden, were used to assess the impact of the metrics on subsequent overall survival, using a landmark analysis approach at 12, 16, and 24 weeks post-baseline. Model discrimination was evaluated using the concordance (c) index. Results The overall best response rates for the three trials were 26%, 45%, and 25%, respectively. While nearly all metrics were statistically significantly associated with overall survival at the different landmark time points, the c-indices for the traditional response metrics ranged from 0.59 to 0.65; for the continuous metrics, from 0.60 to 0.66; and for the TriTR metrics, from 0.64 to 0.69. The c-indices for TriTR at 12 weeks were comparable to those at 16 and 24 weeks. Conclusions Continuous tumor-measurement-based metrics provided no predictive improvement over traditional response-based metrics or TriTR; TriTR had better predictive ability than best TriTR or confirmed response. If confirmed, TriTR represents a promising endpoint for future Phase II trials. PMID:21880789
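
The concordance (c) index used above to compare model discrimination can be sketched with a simple all-pairs implementation (assuming a higher risk score should predict shorter survival; ties in score count as half-concordant, and a pair is usable only when the earlier time is an observed event):

```python
def c_index(times, events, scores):
    """Concordance index for right-censored survival data."""
    concordant = ties = usable = 0
    n = len(times)
    for i in range(n):
        for j in range(i + 1, n):
            if times[i] == times[j]:
                continue  # tied times: skipped in this simple version
            a, b = (i, j) if times[i] < times[j] else (j, i)
            if not events[a]:
                continue  # earlier time censored: ordering is unknowable
            usable += 1
            if scores[a] > scores[b]:
                concordant += 1
            elif scores[a] == scores[b]:
                ties += 1
    return (concordant + 0.5 * ties) / usable
```

A c-index of 0.5 is chance-level discrimination and 1.0 is perfect ordering, which puts the reported 0.59-0.69 range in perspective.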

  16. Statistical rice yield modeling using blended MODIS-Landsat based crop phenology metrics in Taiwan

    Science.gov (United States)

    Chen, C. R.; Chen, C. F.; Nguyen, S. T.; Lau, K. V.

    2015-12-01

    Taiwan is a populated island with a majority of residents settled in the western plains where soils are suitable for rice cultivation. Rice is not only the most important commodity, but also plays a critical role for agricultural and food marketing. Information on rice production is thus important for policymakers to devise timely plans for ensuring sustainable socioeconomic development. Because rice fields in Taiwan are generally small and yet crop monitoring requires information on crop phenology matching the spatiotemporal resolution of satellite data, this study used Landsat-MODIS fusion data for rice yield modeling in Taiwan. We processed the data for the first crop (Feb-Mar to Jun-Jul) and the second (Aug-Sep to Nov-Dec) in 2014 through five main steps: (1) data pre-processing to account for geometric and radiometric errors of Landsat data, (2) Landsat-MODIS data fusion using the spatial-temporal adaptive reflectance fusion model, (3) construction of the smooth time-series enhanced vegetation index 2 (EVI2), (4) rice yield modeling using EVI2-based crop phenology metrics, and (5) error verification. The fusion results, based on a comparison between EVI2 derived from the fusion image and that from the reference Landsat image, indicated close agreement between the two datasets (R2 > 0.8). We analysed smooth EVI2 curves to extract phenology metrics or phenological variables for establishment of rice yield models. The results indicated that the established yield models significantly explained more than 70% of the variability in the data (p-value < 0.001, R2 > 0.8), in both cases. The root mean square error (RMSE) and mean absolute error (MAE) used to measure the model accuracy revealed the consistency between the estimated yields and the government's yield statistics. This study demonstrates advantages of using EVI2-based phenology metrics (derived from Landsat-MODIS fusion data) for rice yield estimation in Taiwan prior to the harvest period.

  17. GPU-based Branchless Distance-Driven Projection and Backprojection.

    Science.gov (United States)

    Liu, Rui; Fu, Lin; De Man, Bruno; Yu, Hengyong

    2017-12-01

    Projection and backprojection operations are essential in a variety of image reconstruction and physical correction algorithms in CT. The distance-driven (DD) projection and backprojection are widely used for their highly sequential memory access pattern and low arithmetic cost. However, a typical DD implementation has an inner loop that adjusts the calculation depending on the relative position between voxel and detector cell boundaries. The irregularity of the branch behavior makes it inefficient to implement on massively parallel computing devices such as graphics processing units (GPUs). Such irregular branch behaviors can be eliminated by factorizing the DD operation into three branchless steps: integration, linear interpolation, and differentiation, all of which are highly amenable to massive vectorization. In this paper, we implement and evaluate a highly parallel branchless DD algorithm for 3D cone beam CT. The algorithm utilizes the texture memory and hardware interpolation on GPUs to achieve fast computational speed. The developed branchless DD algorithm achieved 137-fold speedup for forward projection and 188-fold speedup for backprojection relative to a single-thread CPU implementation. Compared with a state-of-the-art 32-thread CPU implementation, the proposed branchless DD achieved 8-fold acceleration for forward projection and 10-fold acceleration for backprojection. The GPU-based branchless DD method was evaluated with iterative reconstruction algorithms on both simulated and real datasets, and produced images visually identical to those of the CPU reference algorithm.
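
The three branchless steps can be illustrated in one dimension (a toy sketch, not the GPU implementation): integrate the voxel profile once, linearly interpolate the cumulative integral at the detector cell boundaries, then difference to obtain per-cell line integrals, with no per-boundary branching at all.

```python
import numpy as np

def dd_project_1d(voxel_edges, voxel_vals, det_edges):
    """Branchless distance-driven projection in 1D.
    voxel_edges: boundaries of n voxels (length n+1, increasing)
    voxel_vals:  constant value inside each voxel (length n)
    det_edges:   detector cell boundaries (increasing)
    Returns the integral of the voxel profile over each detector cell."""
    widths = np.diff(voxel_edges)
    # 1) integration: cumulative integral at every voxel boundary
    cum = np.concatenate(([0.0], np.cumsum(voxel_vals * widths)))
    # 2) linear interpolation of the (piecewise-linear) cumulative integral
    at_det = np.interp(det_edges, voxel_edges, cum)
    # 3) differentiation: per-cell integrals
    return np.diff(at_det)
```

Because the cumulative integral of a piecewise-constant profile is piecewise linear, the interpolation in step 2 is exact, and the per-cell results match a boundary-by-boundary overlap computation.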

  18. Robust linear discriminant analysis with distance based estimators

    Science.gov (United States)

    Lim, Yai-Fung; Yahaya, Sharipah Soaad Syed; Ali, Hazlina

    2017-11-01

    Linear discriminant analysis (LDA) is one of the supervised classification techniques concerning the relationship between a categorical variable and a set of continuous variables. The main objective of LDA is to create a function to distinguish between populations and to allocate future observations to previously defined populations. Under the assumptions of normality and homoscedasticity, LDA yields the optimal linear discriminant rule (LDR) between two or more groups. However, the optimality of LDA relies heavily on the sample mean and pooled sample covariance matrix, which are known to be sensitive to outliers. To alleviate these problems, a new robust LDA using distance-based estimators known as the minimum variance vector (MVV) has been proposed in this study. The MVV estimators were used to substitute the classical sample mean and classical sample covariance to form a robust linear discriminant rule (RLDR). Simulation and real data studies were conducted to examine the performance of the proposed RLDR, measured in terms of misclassification error rates. The computational results showed that the proposed RLDR performs better than the classical LDR and is comparable with the existing robust LDR.
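
The classical LDR that the robust version modifies can be sketched as follows; the MVV-based robust variant would simply substitute robust location and scatter estimates for the sample means and pooled covariance used here.

```python
import numpy as np

def fisher_ldr(X1, X2):
    """Classical two-class linear discriminant rule:
    w = S_pooled^{-1} (m1 - m2); assign class 1 when w . (x - midpoint) > 0."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    n1, n2 = len(X1), len(X2)
    S1 = np.cov(X1, rowvar=False)
    S2 = np.cov(X2, rowvar=False)
    Sp = ((n1 - 1) * S1 + (n2 - 1) * S2) / (n1 + n2 - 2)  # pooled covariance
    w = np.linalg.solve(Sp, m1 - m2)
    mid = (m1 + m2) / 2
    return lambda x: 1 if w @ (np.asarray(x) - mid) > 0 else 2
```

Because the rule is built entirely from the means and the pooled covariance, a few outliers can tilt both quantities and degrade the boundary, which is exactly the weakness the MVV estimators target.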

  19. The Knowledge Base as an Extension of Distance Learning Reference Service

    Science.gov (United States)

    Casey, Anne Marie

    2012-01-01

    This study explores knowledge bases as an extension of reference services for distance learners. Through a survey and follow-up interviews with distance learning librarians, this paper discusses their interest in creating and maintaining a knowledge base as a resource for reference services to distance learners. It also investigates their perceptions…

  20. Low-complexity atlas-based prostate segmentation by combining global, regional, and local metrics

    Energy Technology Data Exchange (ETDEWEB)

    Xie, Qiuliang; Ruan, Dan, E-mail: druan@mednet.ucla.edu [The Department of Radiation Oncology, University of California Los Angeles, California 90095 (United States)

    2014-04-15

    Purpose: To improve the efficiency of atlas-based segmentation without compromising accuracy, and to demonstrate the validity of the proposed method on MRI-based prostate segmentation application. Methods: Accurate and efficient automatic structure segmentation is an important task in medical image processing. Atlas-based methods, as the state-of-the-art, provide good segmentation at the cost of a large number of computationally intensive nonrigid registrations, for anatomical sites/structures that are subject to deformation. In this study, the authors propose to utilize a combination of global, regional, and local metrics to improve the accuracy yet significantly reduce the number of required nonrigid registrations. The authors first perform an affine registration to minimize the global mean squared error (gMSE) to coarsely align each atlas image to the target. Subsequently, a target-specific regional MSE (rMSE), demonstrated to be a good surrogate for the dice similarity coefficient (DSC), is used to select a relevant subset from the training atlas. Only within this subset are nonrigid registrations performed between the training images and the target image, to minimize a weighted combination of gMSE and rMSE. Finally, structure labels are propagated from the selected training samples to the target via the estimated deformation fields, and label fusion is performed based on a weighted combination of rMSE and local MSE (lMSE) discrepancy, with proper total-variation-based spatial regularization. Results: The proposed method was applied to a public database of 30 prostate MR images with expert-segmented structures. The authors' method, utilizing only eight nonrigid registrations, achieved a performance with a median/mean DSC of over 0.87/0.86, outperforming the state-of-the-art full-fledged atlas-based segmentation approach, whose median/mean DSC was 0.84/0.82 when applied to the same data set. Conclusions: The proposed method requires a fixed number of nonrigid

  1. Symmetric Kullback-Leibler Metric Based Tracking Behaviors for Bioinspired Robotic Eyes.

    Science.gov (United States)

    Liu, Hengli; Luo, Jun; Wu, Peng; Xie, Shaorong; Li, Hengyu

    2015-01-01

    A symmetric Kullback-Leibler metric based tracking system, capable of tracking moving targets, is presented for a bionic spherical parallel mechanism; it minimizes a tracking error function to simulate the smooth pursuit of human eyes. More specifically, we propose a real-time moving target tracking algorithm that compares spatial histograms under the symmetric Kullback-Leibler metric. In the proposed algorithm, key spatial histograms are extracted and incorporated into a particle filtering framework. Once the target is identified, an image-based control scheme drives the bionic spherical parallel mechanism so that the identified target is kept at the center of the captured images. Meanwhile, the robot motion information is fed forward to an adaptive smooth tracking controller inspired by the vestibulo-ocular reflex mechanism. The proposed tracking system is designed to let the robot track dynamic objects while traveling through traversable, especially bumpy, terrain. Experimental results under the violent attitude variations of such bumpy environments demonstrate the effectiveness and robustness of our bioinspired tracking system, built on a bionic spherical parallel mechanism and head-eye coordination.
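As a toy illustration of the core similarity measure, the sketch below (a minimal Python example, not the authors' implementation; the 4-bin histograms and the exponential weighting are illustrative assumptions) compares normalized spatial histograms under the symmetric Kullback-Leibler metric, the quantity a particle filter could convert into particle weights:

```python
import numpy as np

def symmetric_kl(p, q, eps=1e-12):
    """Symmetric Kullback-Leibler divergence between two normalized histograms."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    kl_pq = np.sum(p * np.log(p / q))
    kl_qp = np.sum(q * np.log(q / p))
    return 0.5 * (kl_pq + kl_qp)

# Score candidate particles against a reference target histogram; a particle
# filter would then normalize these scores into particle weights.
reference = np.array([0.1, 0.4, 0.3, 0.2])
candidates = [np.array([0.1, 0.4, 0.3, 0.2]),   # identical to the reference
              np.array([0.4, 0.1, 0.2, 0.3])]   # shuffled, dissimilar
weights = [np.exp(-symmetric_kl(reference, c)) for c in candidates]
```

A candidate whose histogram matches the reference gets divergence near zero and thus the highest weight.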

  2. Parameter Search Algorithms for Microwave Radar-Based Breast Imaging: Focal Quality Metrics as Fitness Functions.

    Science.gov (United States)

    O'Loughlin, Declan; Oliveira, Bárbara L; Elahi, Muhammad Adnan; Glavin, Martin; Jones, Edward; Popović, Milica; O'Halloran, Martin

    2017-12-06

    Inaccurate estimation of average dielectric properties can have a tangible impact on microwave radar-based breast images. Despite this, recent patient imaging studies have used a fixed estimate, although the average properties are known to vary from patient to patient. Parameter search algorithms are a promising technique for estimating the average dielectric properties from the reconstructed microwave images themselves, without additional hardware. In this work, qualities of accurately reconstructed images are identified from point spread functions. As these qualities are similar to those of focused microscopic and photographic images, this work proposes the use of focal quality metrics for average dielectric property estimation. The robustness of the parameter search is evaluated on three-dimensional volumetric images of experimental, dielectrically heterogeneous phantoms. Starting from a very broad initial estimate of the average dielectric properties, this paper shows how these metrics can serve as fitness functions in parameter search algorithms to reconstruct clear and focused microwave radar images.
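To illustrate the idea of a focal quality metric as a fitness function, here is a hedged Python sketch: normalized variance serves as the focus measure, and repeated box-filtering stands in for the defocusing effect of a wrong permittivity estimate (the blur model and all parameter values are illustrative assumptions, not the paper's imaging pipeline):

```python
import numpy as np

def normalized_variance(img):
    """Focal quality metric: intensity variance normalized by mean intensity.
    Sharper (better-focused) images score higher."""
    mu = img.mean()
    return img.var() / mu if mu > 0 else 0.0

def blur(img, strength):
    """Toy stand-in for reconstruction with a wrong permittivity estimate:
    repeated 3x3 box filtering progressively defocuses the image."""
    out = img.copy()
    for _ in range(strength):
        padded = np.pad(out, 1, mode='edge')
        out = sum(padded[i:i + out.shape[0], j:j + out.shape[1]]
                  for i in range(3) for j in range(3)) / 9.0
    return out

rng = np.random.default_rng(0)
sharp = rng.random((32, 32))
true_eps = 9.0                        # hypothetical true average permittivity
candidates = [7.0, 8.0, 9.0, 10.0, 11.0]
# Defocus grows with the distance of the estimate from the true value.
reconstructions = {e: blur(sharp, int(abs(e - true_eps))) for e in candidates}
# The parameter search simply keeps the candidate with the best focal quality.
best = max(candidates, key=lambda e: normalized_variance(reconstructions[e]))
```

The grid search recovers the true parameter because only the correct estimate leaves the image unblurred.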

  3. Symmetric Kullback-Leibler Metric Based Tracking Behaviors for Bioinspired Robotic Eyes

    Directory of Open Access Journals (Sweden)

    Hengli Liu

    2015-01-01

    Full Text Available A symmetric Kullback-Leibler metric based tracking system, capable of tracking moving targets, is presented for a bionic spherical parallel mechanism; it minimizes a tracking error function to simulate the smooth pursuit of human eyes. More specifically, we propose a real-time moving target tracking algorithm that compares spatial histograms under the symmetric Kullback-Leibler metric. In the proposed algorithm, key spatial histograms are extracted and incorporated into a particle filtering framework. Once the target is identified, an image-based control scheme drives the bionic spherical parallel mechanism so that the identified target is kept at the center of the captured images. Meanwhile, the robot motion information is fed forward to an adaptive smooth tracking controller inspired by the vestibulo-ocular reflex mechanism. The proposed tracking system is designed to let the robot track dynamic objects while traveling through traversable, especially bumpy, terrain. Experimental results under the violent attitude variations of such bumpy environments demonstrate the effectiveness and robustness of our bioinspired tracking system, built on a bionic spherical parallel mechanism and head-eye coordination.

  4. Application of Entropy-Based Metrics to Identify Emotional Distress from Electroencephalographic Recordings

    Directory of Open Access Journals (Sweden)

    Beatriz García-Martínez

    2016-06-01

    Full Text Available Recognition of emotions is still an unresolved challenge, which could be helpful to improve current human-machine interfaces. Recently, nonlinear analysis of some physiological signals has been shown to play a more relevant role in this context than their traditional linear exploration. Thus, the present work introduces for the first time the application of three recent entropy-based metrics: sample entropy (SE), quadratic SE (QSE) and distribution entropy (DE), to discern between emotional states of calm and negative stress (also called distress). In the last few years, distress has received growing attention because it is a common negative factor in the modern lifestyle of people from developed countries and, moreover, it may lead to serious mental and physical health problems. Precisely, 279 segments of 32-channel electroencephalographic (EEG) recordings from 32 subjects elicited to be calm or negatively stressed have been analyzed. Results show that QSE is the first single metric presented to date with the ability to identify negative stress. Indeed, this metric reported a discriminant ability of around 70%, which is only slightly lower than that obtained in some previous works. Nonetheless, previous works have obtained discriminant models from dozens or even hundreds of features by using advanced classifiers to yield diagnostic accuracies of about 80%. Moreover, in agreement with previous neuroanatomy findings, QSE also revealed notable differences across all brain regions in the neural activation triggered by the two considered emotions. Consequently, given these results, as well as the easy interpretation of QSE, this work opens a new standpoint in the detection of emotional distress, which may yield new insights into the brain’s behavior under this negative emotion.
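As a rough illustration of sample entropy (not the study's exact parameterization; m = 2 and r = 0.2·SD are common defaults taken as assumptions here), the sketch below shows that an irregular signal scores higher than a regular one. QSE is commonly defined as SampEn plus ln(2r), which makes values comparable across tolerance choices:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy of a series x with embedding dimension m and
    tolerance r expressed as a fraction of the standard deviation."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def count_matches(dim):
        # All overlapping templates of length `dim`.
        templates = np.array([x[i:i + dim] for i in range(len(x) - dim + 1)])
        matches = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance from template i to all later templates
            # (self-matches are excluded by only counting i < j pairs).
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            matches += np.sum(d <= tol)
        return matches

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(1)
regular = np.sin(np.linspace(0, 8 * np.pi, 400))   # predictable signal
noisy = rng.standard_normal(400)                   # irregular signal
```

A highly predictable signal yields a much lower sample entropy than white noise, which is the kind of separation the distress-detection study exploits.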

  5. Healthcare4VideoStorm: Making Smart Decisions Based on Storm Metrics.

    Science.gov (United States)

    Zhang, Weishan; Duan, Pengcheng; Chen, Xiufeng; Lu, Qinghua

    2016-04-23

    Storm-based stream processing is widely used for real-time large-scale distributed processing. Knowing the run-time status and ensuring performance is critical to providing the expected dependability for some applications, e.g., continuous video processing for security surveillance. Existing scheduling strategies operate at too coarse a granularity to achieve good performance, and consider only network resources while ignoring computing resources during scheduling. In this paper, we propose Healthcare4Storm, a framework that derives insights from Storm metrics to gain knowledge of the health status of an application, ultimately arriving at smart scheduling decisions. It takes into account both network and computing resources and conducts scheduling at a fine-grained level using tuples instead of topologies. A comprehensive evaluation shows that the proposed framework performs well and can improve the dependability of Storm-based applications.

  6. Local Deep Hashing Matching of Aerial Images Based on Relative Distance and Absolute Distance Constraints

    Directory of Open Access Journals (Sweden)

    Suting Chen

    2017-12-01

    Full Text Available Aerial images feature high resolution and complex backgrounds, and usually require large amounts of computation; however, most aerial image matching algorithms adopt shallow hand-crafted features expressed as floating-point descriptors (e.g., SIFT (Scale-Invariant Feature Transform), SURF (Speeded Up Robust Features)), which may suffer from poor matching speed and are not compactly represented. Here, we propose a novel Local Deep Hashing Matching (LDHM) method for matching large aerial images with lower complexity and faster matching speed. The basic idea of the proposed algorithm is to apply a deep network model to local areas of the aerial images and learn both the local features and a hash function for the images. Firstly, according to the coarse overlap rate of the aerial images, the algorithm extracts local areas for matching to avoid processing redundant information. Secondly, a triplet network structure is proposed to mine deep features of the local image patches, and the learned features are fed to a hash layer to obtain a binary hash code representation. Thirdly, a constraint on the absolute distance of positive samples is added to the triplet loss, and a new objective function is constructed to optimize the network parameters and enhance the discriminative power of image patch features. Finally, the deep hash code of each image patch is used for similarity comparison of image patches in Hamming space to complete the matching of the aerial images. The proposed LDHM algorithm was evaluated on the UltraCam-D dataset and a set of actual aerial images; simulation results demonstrate that it significantly outperforms state-of-the-art algorithms in both efficiency and performance.
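The final Hamming-space comparison step can be sketched as follows (a minimal Python illustration; the 64-bit codes, the synthetic patches, and the acceptance threshold are hypothetical, and the deep network that would produce the codes is not shown):

```python
import numpy as np

def hamming_distance(a, b):
    """Hamming distance between two binary hash codes given as 0/1 arrays."""
    return int(np.count_nonzero(a != b))

def match_patches(query_codes, reference_codes, max_dist=8):
    """For each query patch code, find the reference code with the smallest
    Hamming distance, accepting the match only under a threshold."""
    matches = []
    for qi, q in enumerate(query_codes):
        dists = [hamming_distance(q, r) for r in reference_codes]
        ri = int(np.argmin(dists))
        if dists[ri] <= max_dist:
            matches.append((qi, ri, dists[ri]))
    return matches

rng = np.random.default_rng(2)
reference = rng.integers(0, 2, size=(5, 64))   # 5 patches, 64-bit hash codes
query = reference.copy()
query[0, :3] ^= 1                              # corrupt 3 bits of patch 0
matches = match_patches(query, reference)
```

Binary codes make this comparison cheap: on real hardware the elementwise test reduces to an XOR and a popcount per code pair.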

  7. TEACHER EDUCATION FOR DISTANCE LEARNING BASED SPECIAL EDUCATION IN PAKISTAN

    Directory of Open Access Journals (Sweden)

    Tanzila NABEEL

    2009-01-01

    Full Text Available Special education is a mode of education in which specially designed instructional materials and environments are required to meet the diverse requirements of children with special needs. In Pakistan, Allama Iqbal Open University (AIOU) exclusively initiated a program for teacher preparation for special children through distance learning. This was a unique program of its kind, with no precedent in defined services for special teachers’ preparation. The Department of Special Education at AIOU offers study and training at graduate, master’s and Ph.D. levels through its distance learning system. Teachers are prepared in six specialized areas: Visual Impairment, Physical Disabilities, Hearing Impairment, Intellectual Disability, Learning Disability and Inclusive Education. The Open University has a well-established regional network and outreach system providing educational counseling and guidance services to its students. The University has 32 regional campuses, with 86 part-time regional coordinating officers throughout the country providing assistance to the regional campuses. Over 900 study centers are established each semester and are managed through the university’s regional campuses. Each student is assigned to a tutor who is a subject specialist. To maintain consistency of on- and off-campus observations, University faculty conduct reliability observations with adjunct supervisors. Their professional growth impacts the quality of the teaching cadre. It was the first time in the history of teacher training institutes in Pakistan that a teacher training program at the master’s level in special education was offered through distance education. This paper presents the experiences, methodology and successes of the distance-learning special-educator program in Pakistan. Also highlighted is the special teacher preparation model through the distance education system. Increased program completion rates support the fact that Open University faculty have become better

  8. Adaptive metric learning with deep neural networks for video-based facial expression recognition

    Science.gov (United States)

    Liu, Xiaofeng; Ge, Yubin; Yang, Chao; Jia, Ping

    2018-01-01

    Video-based facial expression recognition has become increasingly important for many real-world applications. Although numerous efforts have been made for single sequences, balancing the complex distribution of intra- and interclass variations between sequences remains a major difficulty in this area. We propose the adaptive (N+M)-tuplet clusters loss function and optimize it together with the softmax loss in the training phase. The variations introduced by personal attributes are alleviated using similarity measurements of multiple samples in the feature space, with far fewer comparisons than conventional deep metric learning approaches, which enables metric calculation for large data applications (e.g., videos). Both the spatial and temporal relations are explored by a unified framework consisting of an Inception-ResNet network with long short-term memory and two fully connected layer branches. Our proposed method has been evaluated on three well-known databases, and the experimental results show that it outperforms many state-of-the-art approaches.

  9. Heterogeneity Measurement Based on Distance Measure for Polarimetric SAR Data

    Science.gov (United States)

    Xing, Xiaoli; Chen, Qihao; Liu, Xiuguo

    2018-04-01

    To effectively test scene heterogeneity in polarimetric synthetic aperture radar (PolSAR) data, this paper introduces a distance measure that exploits the similarity between a sample and its pixels. Moreover, to account for texture in the data distribution, a K distance measure is derived from the Wishart distance measure. Specifically, the average of the pixels in a local window replaces the class-center coherency or covariance matrix, and the Wishart and K distance measures are calculated between this average matrix and the pixels. Then, the ratio of the standard deviation to the mean is computed for the Wishart and K distance measures, and the two resulting features are defined and applied to reflect the complexity of the scene. The proposed heterogeneity measure is obtained by integrating the two features in the Pauli basis. Experiments conducted on single-look and multilook PolSAR data demonstrate the effectiveness of the proposed method for detecting scene heterogeneity.
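A hedged sketch of the Wishart part of this computation (illustrative synthetic 3×3 SPD matrices stand in for real coherency matrices; the distance form ln|Σ| + Tr(Σ⁻¹T) is the standard Wishart distance, but the window size and the simulation are assumptions):

```python
import numpy as np

def wishart_distance(sigma, t):
    """Wishart distance between a local mean coherency matrix `sigma` and a
    pixel coherency matrix `t` (both Hermitian positive definite)."""
    sign, logdet = np.linalg.slogdet(sigma)
    return float(logdet + np.trace(np.linalg.inv(sigma) @ t).real)

def heterogeneity(window):
    """Ratio of the standard deviation to the mean of Wishart distances inside
    a local window; larger values flag a more heterogeneous scene."""
    center = np.mean(window, axis=0)   # local average replaces the class center
    d = np.array([wishart_distance(center, t) for t in window])
    return d.std() / d.mean()

rng = np.random.default_rng(3)
def random_spd(scale):
    """Random symmetric positive definite matrix (toy coherency matrix)."""
    a = rng.standard_normal((3, 3)) * scale
    return a @ a.T + np.eye(3)

homogeneous = [random_spd(0.1) for _ in range(25)]                 # one population
mixed = [random_spd(0.1) for _ in range(13)] + \
        [random_spd(3.0) for _ in range(12)]                       # two populations
```

A window drawn from a single population yields a small ratio, while a window mixing two populations yields a noticeably larger one.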

  10. METRIC context unit architecture

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, R.O.

    1988-01-01

    METRIC is an architecture for a simple but powerful Reduced Instruction Set Computer (RISC). Its speed comes from the simultaneous processing of several instruction streams, with instructions from the various streams being dispatched into METRIC's execution pipeline as they become available for execution. The pipeline is thus kept full, with a mix of instructions for several contexts in execution at the same time. True parallel programming is supported within a single execution unit, the METRIC Context Unit. METRIC's architecture provides for expansion through the addition of multiple Context Units and of specialized Functional Units. The architecture thus spans a range of size and performance from a single-chip microcomputer up through large and powerful multiprocessors. This research concentrates on the specification of the METRIC Context Unit at the architectural level. Performance tradeoffs made during METRIC's design are discussed, and projections of METRIC's performance are made based on simulation studies.

  11. Critical thinking skills in nursing students: comparison of simulation-based performance with metrics

    Science.gov (United States)

    Fero, Laura J.; O’Donnell, John M.; Zullo, Thomas G.; Dabbs, Annette DeVito; Kitutu, Julius; Samosky, Joseph T.; Hoffman, Leslie A.

    2018-01-01

    Aim This paper is a report of an examination of the relationship between metrics of critical thinking skills and performance in simulated clinical scenarios. Background Paper and pencil assessments are commonly used to assess critical thinking but may not reflect simulated performance. Methods In 2007, a convenience sample of 36 nursing students participated in measurement of critical thinking skills and simulation-based performance using videotaped vignettes, high-fidelity human simulation, the California Critical Thinking Disposition Inventory and California Critical Thinking Skills Test. Simulation-based performance was rated as ‘meeting’ or ‘not meeting’ overall expectations. Test scores were categorized as strong, average, or weak. Results Most (75·0%) students did not meet overall performance expectations using videotaped vignettes or high-fidelity human simulation; most difficulty related to problem recognition and reporting findings to the physician. There was no difference between overall performance based on method of assessment (P = 0·277). More students met subcategory expectations for initiating nursing interventions (P ≤ 0·001) using high-fidelity human simulation. The relationship between videotaped vignette performance and critical thinking disposition or skills scores was not statistically significant, except for problem recognition and overall critical thinking skills scores (Cramer’s V = 0·444, P = 0·029). There was a statistically significant relationship between overall high-fidelity human simulation performance and overall critical thinking disposition scores (Cramer’s V = 0·413, P = 0·047). Conclusion Students’ performance reflected difficulty meeting expectations in simulated clinical scenarios. High-fidelity human simulation performance appeared to approximate scores on metrics of critical thinking best. Further research is needed to determine if simulation-based performance correlates with critical thinking skills

  12. Critical thinking skills in nursing students: comparison of simulation-based performance with metrics.

    Science.gov (United States)

    Fero, Laura J; O'Donnell, John M; Zullo, Thomas G; Dabbs, Annette DeVito; Kitutu, Julius; Samosky, Joseph T; Hoffman, Leslie A

    2010-10-01

    This paper is a report of an examination of the relationship between metrics of critical thinking skills and performance in simulated clinical scenarios. Paper and pencil assessments are commonly used to assess critical thinking but may not reflect simulated performance. In 2007, a convenience sample of 36 nursing students participated in measurement of critical thinking skills and simulation-based performance using videotaped vignettes, high-fidelity human simulation, the California Critical Thinking Disposition Inventory and California Critical Thinking Skills Test. Simulation-based performance was rated as 'meeting' or 'not meeting' overall expectations. Test scores were categorized as strong, average, or weak. Most (75.0%) students did not meet overall performance expectations using videotaped vignettes or high-fidelity human simulation; most difficulty related to problem recognition and reporting findings to the physician. There was no difference between overall performance based on method of assessment (P = 0.277). More students met subcategory expectations for initiating nursing interventions (P ≤ 0.001) using high-fidelity human simulation. The relationship between videotaped vignette performance and critical thinking disposition or skills scores was not statistically significant, except for problem recognition and overall critical thinking skills scores (Cramer's V = 0.444, P = 0.029). There was a statistically significant relationship between overall high-fidelity human simulation performance and overall critical thinking disposition scores (Cramer's V = 0.413, P = 0.047). Students' performance reflected difficulty meeting expectations in simulated clinical scenarios. High-fidelity human simulation performance appeared to approximate scores on metrics of critical thinking best. Further research is needed to determine if simulation-based performance correlates with critical thinking skills in the clinical setting. © 2010 The Authors. Journal of Advanced

  13. Adapting observationally based metrics of biogeophysical feedbacks from land cover/land use change to climate modeling

    International Nuclear Information System (INIS)

    Chen, Liang; Dirmeyer, Paul A

    2016-01-01

    To assess the biogeophysical impacts of land cover/land use change (LCLUC) on surface temperature, two observation-based metrics and their applicability in climate modeling were explored in this study. Both metrics were developed based on the surface energy balance, and provided insight into the contribution of different aspects of land surface change (such as albedo, surface roughness, net radiation and surface heat fluxes) to changing climate. A revision of the first metric, the intrinsic biophysical mechanism, can be used to distinguish the direct and indirect effects of LCLUC on surface temperature. The other, a decomposed temperature metric, gives a straightforward depiction of separate contributions of all components of the surface energy balance. These two metrics well capture observed and model simulated surface temperature changes in response to LCLUC. Results from paired FLUXNET sites and land surface model sensitivity experiments indicate that surface roughness effects usually dominate the direct biogeophysical feedback of LCLUC, while other effects play a secondary role. However, coupled climate model experiments show that these direct effects can be attenuated by large scale atmospheric changes (indirect feedbacks). When applied to real-time transient LCLUC experiments, the metrics also demonstrate usefulness for assessing the performance of climate models and quantifying land–atmosphere interactions in response to LCLUC. (letter)

  14. Portfolio Selection Based on Distance between Fuzzy Variables

    Directory of Open Access Journals (Sweden)

    Weiyi Qian

    2014-01-01

    Full Text Available This paper studies the portfolio selection problem in a fuzzy environment. We introduce a new, simple method in which the distance between fuzzy variables is used to measure the divergence of a fuzzy investment return from a prior one. Firstly, two new mathematical models are proposed by expressing divergence as distance, investment return as expected value, and risk as variance and semivariance, respectively. Secondly, the crisp forms of the new models are provided for different types of fuzzy variables. Finally, several numerical examples are given to illustrate the effectiveness of the proposed approach.
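One simple way to realize such a distance is sketched below (a hedged illustration under assumptions: triangular fuzzy numbers and an alpha-cut endpoint average, which is only one of several distance definitions in the literature, not necessarily the paper's):

```python
import numpy as np

def fuzzy_distance(a, b, n_cuts=100):
    """Distance between two triangular fuzzy numbers a = (a1, a2, a3) and
    b = (b1, b2, b3), averaged over alpha-cut interval endpoints."""
    alphas = np.linspace(0.0, 1.0, n_cuts)
    a1, a2, a3 = a
    b1, b2, b3 = b
    # Alpha-cut of a triangular number: [a1 + alpha*(a2-a1), a3 - alpha*(a3-a2)]
    a_lo, a_hi = a1 + alphas * (a2 - a1), a3 - alphas * (a3 - a2)
    b_lo, b_hi = b1 + alphas * (b2 - b1), b3 - alphas * (b3 - b2)
    return float(np.mean(0.5 * (np.abs(a_lo - b_lo) + np.abs(a_hi - b_hi))))

# Divergence of a candidate fuzzy investment return from a prior one
# (pessimistic / most likely / optimistic values, hypothetical numbers):
prior = (0.02, 0.05, 0.08)
candidate = (0.01, 0.04, 0.07)
d = fuzzy_distance(prior, candidate)
```

Shifting every endpoint of the candidate by 0.01 yields a distance of exactly 0.01, matching intuition about divergence from the prior return.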

  15. Distributions of Cognates in Europe as Based on Levenshtein Distance

    Science.gov (United States)

    Schepens, Job; Dijkstra, Ton; Grootjen, Franc

    2012-01-01

    Researchers on bilingual processing can benefit from computational tools developed in artificial intelligence. We show that a normalized Levenshtein distance function can efficiently and reliably simulate bilingual orthographic similarity ratings. Orthographic similarity distributions of cognates and non-cognates were identified across pairs of…
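The normalized Levenshtein measure behind these ratings can be sketched as follows (normalizing by the longer string is one common convention, and the word pairs are illustrative examples, not items from the study):

```python
def levenshtein(s, t):
    """Classic dynamic-programming edit distance between strings s and t."""
    m, n = len(s), len(t)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if s[i - 1] == t[j - 1] else 1
            curr[j] = min(prev[j] + 1,         # deletion
                          curr[j - 1] + 1,     # insertion
                          prev[j - 1] + cost)  # substitution
        prev = curr
    return prev[n]

def normalized_levenshtein(s, t):
    """Normalize by the longer word: 0 = identical, 1 = entirely different."""
    if not s and not t:
        return 0.0
    return levenshtein(s, t) / max(len(s), len(t))

# Cognates yield a high orthographic similarity; unrelated translation
# equivalents yield a low one.
sim_cognate = 1 - normalized_levenshtein("tomato", "tomaat")   # EN-NL cognate
sim_control = 1 - normalized_levenshtein("bird", "vogel")      # non-cognate
```

The normalized score gives a cheap, reproducible proxy for the human orthographic similarity ratings discussed in the abstract.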

  16. Novel Clustering Method Based on K-Medoids and Mobility Metric

    Directory of Open Access Journals (Sweden)

    Y. Hamzaoui

    2018-06-01

    Full Text Available The structure and constraints of MANETs negatively influence QoS performance; moreover, the main routing protocols proposed generally operate with flat routing. Hence, this structure yields poor QoS when the network becomes larger and denser. To solve this problem we use one of the most popular methods, namely clustering. The present paper is part of research efforts to improve QoS in MANETs. We propose a new clustering algorithm based on a new mobility metric and K-Medoids to distribute the nodes into several clusters. Intuitively, our algorithm can give good results in terms of cluster stability, and can also extend the lifetime of the cluster head.

  17. Detect-and-forward in two-hop relay channels: a metrics-based analysis

    KAUST Repository

    Benjillali, Mustapha

    2010-06-01

    In this paper, we analyze the coded performance of a cooperative system with multiple parallel relays using a "Detect-and-Forward" (DetF) strategy, where each relay demodulates the overheard signal and forwards the detected binary words. The proposed method is based on the probabilistic characterization of the reliability metrics given in the form of L-values. First, we derive analytical expressions for the probability density functions (PDFs) of the L-values in the elementary two-hop DetF relay channel under different source-relay channel state information assumptions. Then, we apply the obtained expressions to calculate the theoretically achievable rates and compare them with the practical throughput of a simulated turbo-coded transmission. Next, we derive tight approximations for the end-to-end coded bit error rate (BER) of a general cooperative scheme with multiple parallel relays. Simulation results demonstrate the accuracy of our derivations for different cooperation configurations and conditions. © 2010 IEEE.
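The L-values at the heart of this analysis can be illustrated for a single BPSK hop (a textbook sketch, not the paper's multi-relay derivation; the noise variance is an illustrative assumption). Conditioned on the transmitted symbol, the L-value of an AWGN observation is Gaussian with mean ±2/σ² and variance 4/σ², which is the kind of PDF characterization the analysis builds on:

```python
import numpy as np

def llr_bpsk(y, noise_var):
    """L-value (log-likelihood ratio) of a BPSK symbol x in {+1, -1}
    observed as y = x + n over an AWGN channel with noise variance noise_var."""
    return 2.0 * y / noise_var

rng = np.random.default_rng(4)
bits = rng.integers(0, 2, size=10000)
symbols = 1 - 2 * bits                           # bit 0 -> +1, bit 1 -> -1
noise_var = 0.5
y = symbols + rng.normal(0.0, np.sqrt(noise_var), size=bits.size)

L = llr_bpsk(y, noise_var)                       # reliability metrics
detected = (L < 0).astype(int)                   # relay's hard decision
ber = np.mean(detected != bits)                  # empirical bit error rate
```

The sign of the L-value gives the relay's hard decision, while its magnitude carries the reliability information exploited in the rate and BER derivations.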

  18. Questionable validity of the catheter-associated urinary tract infection metric used for value-based purchasing.

    Science.gov (United States)

    Calderon, Lindsay E; Kavanagh, Kevin T; Rice, Mara K

    2015-10-01

    Catheter-associated urinary tract infections (CAUTIs) occur in 290,000 US hospital patients annually, with an estimated cost of $290 million. Two different measurement systems are being used to track the US health care system's performance in lowering the rate of CAUTIs. Since 2010, the Agency for Healthcare Research and Quality (AHRQ) metric has shown a 28.2% decrease in CAUTI, whereas the Centers for Disease Control and Prevention metric has shown a 3%-6% increase in CAUTI since 2009. Differences in data acquisition and the definition of the denominator may explain this discrepancy. The AHRQ metric analyzes chart-audited data and reflects both catheter use and care. The Centers for Disease Control and Prevention metric analyzes self-reported data and primarily reflects catheter care. Because analysis of the AHRQ metric showed a progressive change in performance over time and the scientific literature supports the importance of catheter use in the prevention of CAUTI, it is suggested that risk-adjusted catheter-use data be incorporated into metrics that are used for determining facility performance and for value-based purchasing initiatives. Copyright © 2015 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  19. Selective Distance-Based K+ Quantification on Paper-Based Microfluidics.

    Science.gov (United States)

    Gerold, Chase T; Bakker, Eric; Henry, Charles S

    2018-04-03

    In this study, paper-based microfluidic devices (μPADs) capable of K+ quantification in aqueous samples, as well as in human serum, using both colorimetric and distance-based methods are described. A lipophilic phase containing potassium ionophore I (valinomycin) was utilized to achieve highly selective quantification of K+ in the presence of Na+, Li+, and Mg2+ ions. Successful addition of a suspended lipophilic phase to a wax-printed paper-based device is described and offers a solution to current approaches that rely on organic solvents, which damage wax barriers. The approach provides an avenue for future alkali/alkaline quantification utilizing μPADs. Colorimetric spot tests allowed for K+ quantification from 0.1-5.0 mM using only 3.00 μL of sample solution. Selective distance-based quantification required small sample volumes (6.00 μL) and gave responses sensitive enough to distinguish between 1.0 and 2.5 mM sample K+. μPADs using distance-based methods were also capable of differentiating between 4.3 and 6.9 mM K+ in human serum samples. Distance-based methods required no digital analysis, electronic hardware, or pumps; any steps required for quantification could be carried out with the naked eye.
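A distance-based readout reduces quantification to a simple calibration, which can be sketched as follows (all numbers below are hypothetical illustration values, not the paper's calibration data):

```python
import numpy as np

# Hypothetical calibration: color-development distance (mm) read along the
# channel of a distance-based uPAD for known K+ standards (mM).
standards_mM = np.array([1.0, 2.5, 5.0, 7.5, 10.0])
distances_mm = np.array([4.0, 9.5, 18.0, 26.0, 33.5])

# Fit a linear calibration (distance vs. concentration), then invert it
# to quantify an unknown sample from its measured distance.
slope, intercept = np.polyfit(standards_mM, distances_mm, 1)

def quantify(distance_mm):
    """Map a measured distance back to an estimated K+ concentration (mM)."""
    return (distance_mm - intercept) / slope

sample_conc = quantify(21.0)   # e.g., a 21 mm color front on the device
```

This is the appeal of the method: the "instrument" is a ruler, and the inverse of a one-line linear fit turns a distance into a concentration.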

  20. Camera-based micro interferometer for distance sensing

    Science.gov (United States)

    Will, Matthias; Schädel, Martin; Ortlepp, Thomas

    2017-12-01

    Interference of light provides a high-precision, non-contact and fast method for distance measurement; this technology therefore dominates in high-precision systems. In the field of compact sensors, however, capacitive, resistive or inductive methods dominate. The reason is that an interferometric system must be precisely adjusted and requires high mechanical stability, which usually results in high-priced, complex systems unsuitable for compact sensing. To overcome this, we developed a new concept for a very small interferometric sensing setup. We combine a miniaturized laser unit, a low-cost pixel detector and machine vision routines to realize a demonstrator of a Michelson-type micro interferometer. The complete demonstrator, including all electronics, is smaller than 1 cm³, and we demonstrate distance sensing up to 30 cm with resolution in the nm range.

  1. EPR-based distance measurements at ambient temperature

    Science.gov (United States)

    Krumkacheva, Olesya; Bagryanskaya, Elena

    2017-07-01

    Pulsed dipolar (PD) EPR spectroscopy is a powerful technique allowing for distance measurements between spin labels in the range of 2.5-10.0 nm. It was proposed more than 30 years ago, and nowadays is widely used in biophysics and materials science. Until recently, PD EPR experiments were limited to cryogenic temperatures (T biomolecules, the influence of a linker between the spin probe and biomolecule, and future opportunities.

  2. Comparison of Highly Resolved Model-Based Exposure Metrics for Traffic-Related Air Pollutants to Support Environmental Health Studies

    Directory of Open Access Journals (Sweden)

    Shih Ying Chang

    2015-12-01

    Full Text Available Human exposure to air pollution in many studies is represented by ambient concentrations from space-time kriging of observed values. Space-time kriging techniques based on a limited number of ambient monitors may fail to capture the concentration from local sources. Further, because people spend most of their time indoors, using ambient concentration to represent exposure may cause error. To quantify the associated exposure error, we computed a series of six different hourly-based exposure metrics at 16,095 Census blocks of three counties in North Carolina for CO, NOx, PM2.5, and elemental carbon (EC) during 2012. These metrics include ambient background concentration from space-time ordinary kriging (STOK), ambient on-road concentration from the Research LINE source dispersion model (R-LINE), a hybrid concentration combining STOK and R-LINE, and their associated indoor concentrations from an indoor infiltration mass balance model. Using the hybrid-based indoor concentration as the standard, the comparison showed that outdoor STOK metrics yielded large error at both the population (67% to 93%) and individual level (average bias between −10% and 95%). For pollutants with significant contribution from on-road emission (EC and NOx), the on-road based indoor metric performs best at the population level (error less than 52%). At the individual level, however, the STOK-based indoor concentration performs best (average bias below 30%). For PM2.5, due to the relatively low contribution from on-road emission (7%), the STOK-based indoor metric performs best at both the population (error below 40%) and individual level (error below 25%). The results of the study will help future epidemiology studies select appropriate exposure metrics and reduce potential bias in exposure characterization.
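The indoor infiltration mass balance behind the indoor metrics can be sketched at steady state as C_in = P·a/(a + k)·C_out, a standard single-compartment form (the parameter values below are illustrative assumptions, not those of the study):

```python
def infiltration_factor(penetration, air_exchange, deposition):
    """Steady-state fraction of the outdoor concentration found indoors:
    F_inf = P * a / (a + k), with penetration efficiency P, air-exchange
    rate a (1/h), and deposition rate k (1/h)."""
    return penetration * air_exchange / (air_exchange + deposition)

def indoor_concentration(c_out, penetration=0.8, air_exchange=0.5, deposition=0.2):
    """Indoor concentration from the outdoor level via a single-compartment
    steady-state mass balance (illustrative default parameter values)."""
    return infiltration_factor(penetration, air_exchange, deposition) * c_out

# Example: 10 ug/m3 of a pollutant outdoors maps to roughly 5.7 ug/m3 indoors
# under these assumed parameters.
c_in = indoor_concentration(10.0)
```

Because the infiltration factor is strictly below one, indoor metrics systematically attenuate outdoor concentrations, which is exactly why outdoor-only metrics overstate exposure.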

  3. Clustering by Partitioning around Medoids using Distance-Based Similarity Measures on Interval-Scaled Variables

    Directory of Open Access Journals (Sweden)

    D. L. Nkweteyim

    2018-03-01

    Full Text Available This paper reports the results of a study of the partitioning around medoids (PAM) clustering algorithm applied to four datasets, both standardized and not, of varying sizes and numbers of clusters. The angular distance proximity measure, in addition to the two more traditional proximity measures, the Euclidean distance and the Manhattan distance, was used to compute object-object similarity. The data used in the study comprise three widely available datasets, and one that was constructed from publicly available climate data. Results replicate some well-known facts about the PAM algorithm: the quality of the clusters generated tends to be much better for small datasets; the silhouette value is a good, even if not perfect, guide to the optimal number of clusters to generate; and human intervention is required to interpret the generated clusters. Additionally, the results indicate that the angular distance measure, which traditionally has not been widely used in clustering, outperforms both the Euclidean and Manhattan distance metrics in certain situations.
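
    The three proximity measures compared in the study can be sketched as follows. The normalization of the angular distance to [0, 1] by dividing by π is an assumption (implementations vary); the Euclidean and Manhattan forms are standard.

    ```python
    import math

    def euclidean(x, y):
        """Euclidean (L2) distance between two equal-length vectors."""
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

    def manhattan(x, y):
        """Manhattan (L1) distance between two equal-length vectors."""
        return sum(abs(a - b) for a, b in zip(x, y))

    def angular(x, y):
        """Angular distance: arccos of the cosine similarity, scaled to [0, 1]."""
        dot = sum(a * b for a, b in zip(x, y))
        nx = math.sqrt(sum(a * a for a in x))
        ny = math.sqrt(sum(b * b for b in y))
        cos = max(-1.0, min(1.0, dot / (nx * ny)))  # clamp for float safety
        return math.acos(cos) / math.pi

    # Orthogonal vectors are maximally separated under the angular measure
    print(euclidean((0, 0), (3, 4)), angular((1, 0), (0, 1)))
    ```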

  4. Directional output distance functions: endogenous directions based on exogenous normalization constraints

    Science.gov (United States)

    In this paper we develop a model for computing directional output distance functions with endogenously determined direction vectors. We show how this model is related to the slacks-based directional distance function introduced by Fare and Grosskopf and show how to use the slacks-based function to e...

  5. Rigorous force field optimization principles based on statistical distance minimization

    Energy Technology Data Exchange (ETDEWEB)

    Vlcek, Lukas, E-mail: vlcekl1@ornl.gov [Chemical Sciences Division, Geochemistry & Interfacial Sciences Group, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6110 (United States); Joint Institute for Computational Sciences, University of Tennessee, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6173 (United States); Chialvo, Ariel A. [Chemical Sciences Division, Geochemistry & Interfacial Sciences Group, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6110 (United States)

    2015-10-14

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. We exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.

  6. Long distance measurement with a femtosecond laser based frequency comb

    Science.gov (United States)

    Bhattacharya, N.; Cui, M.; Zeitouny, M. G.; Urbach, H. P.; van den Berg, S. A.

    2017-11-01

    Recent advances in the field of ultra-short pulse lasers have led to the development of reliable sources of carrier envelope phase stabilized femtosecond pulses. The pulse train generated by such a source has a frequency spectrum that consists of discrete, regularly spaced lines known as a frequency comb. In this case both the repetition frequency and the carrier-envelope-offset frequency are referenced to a frequency standard, like an atomic clock. As a result the accuracy of the frequency standard is transferred to the optical domain, with the frequency comb as transfer oscillator. These unique properties allow the frequency comb to be applied as a versatile tool, not only for time and frequency metrology, but also in fundamental physics, high-precision spectroscopy, and laser noise characterization. The pulse-to-pulse phase relationship of the light emitted by the frequency comb has opened up new directions for long-range, highly accurate distance measurement.

  7. A no-reference image and video visual quality metric based on machine learning

    Science.gov (United States)

    Frantc, Vladimir; Voronin, Viacheslav; Semenishchev, Evgenii; Minkin, Maxim; Delov, Aliy

    2018-04-01

    The paper presents a novel visual quality metric for lossy compressed video quality assessment. A high degree of correlation with subjective quality estimates is achieved by using a convolutional neural network trained on a large number of pairs of video sequences and subjective quality scores. We demonstrate how our predicted no-reference quality metric correlates with qualitative opinion in a human observer study. Results are shown on the EVVQ dataset, with comparison to existing approaches.

  8. Resource-level QoS metric for CPU-based guarantees in cloud providers

    OpenAIRE

    Goiri Presa, Íñigo; Julià Massó, Ferran; Fitó, Josep Oriol; Macías Lloret, Mario; Guitart Fernández, Jordi

    2010-01-01

    Success of Cloud computing requires that both customers and providers can be confident that signed Service Level Agreements (SLA) are supporting their respective business activities to their best extent. Currently used SLAs fail in providing such confidence, especially when providers outsource resources to other providers. These resource providers typically support very simple metrics, or metrics that hinder an efficient exploitation of their resources. In this paper, we propose a re...

  9. FACTORS AND METRICS THAT INFLUENCE FRANCHISEE PERFORMANCE: AN APPROACH BASED ON BRAZILIAN FRANCHISES

    OpenAIRE

    Aguiar, Helder de Souza; Consoni, Flavia

    2017-01-01

    The article seeks to map managers' decisions in order to understand which characteristics the franchisor system considers when choosing franchisees, and which metrics have been adopted to measure performance. Through 15 interviews with Brazilian franchises, there was confirmation that revenue is the main metric used by national franchises to measure performance, although other indicators are also used in a complementary way. In addition, two other factors were cited by the interviewees a...

  10. Metric Learning for Hyperspectral Image Segmentation

    Science.gov (United States)

    Bue, Brian D.; Thompson, David R.; Gilmore, Martha S.; Castano, Rebecca

    2011-01-01

    We present a metric learning approach to improve the performance of unsupervised hyperspectral image segmentation. Unsupervised spatial segmentation can assist both user visualization and automatic recognition of surface features. Analysts can use spatially-continuous segments to decrease noise levels and/or localize feature boundaries. However, existing segmentation methods use task-agnostic measures of similarity. Here we learn task-specific similarity measures from training data, improving segment fidelity to classes of interest. Multiclass Linear Discriminant Analysis produces a linear transform that optimally separates a labeled set of training classes. This defines a distance metric that generalizes to new scenes, enabling graph-based segmentation that emphasizes key spectral features. We describe tests based on data from the Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) in which learned metrics improve segment homogeneity with respect to mineralogical classes.

  11. Resilience Metrics for the Electric Power System: A Performance-Based Approach.

    Energy Technology Data Exchange (ETDEWEB)

    Vugrin, Eric D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Castillo, Andrea R [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva-Monroy, Cesar Augusto [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    Grid resilience is a concept related to a power system's ability to continue operating and delivering power even in the event that low probability, high-consequence disruptions such as hurricanes, earthquakes, and cyber-attacks occur. Grid resilience objectives focus on managing and, ideally, minimizing potential consequences that occur as a result of these disruptions. Currently, no formal grid resilience definitions, metrics, or analysis methods have been universally accepted. This document describes an effort to develop and describe grid resilience metrics and analysis methods. The metrics and methods described herein extend upon the Resilience Analysis Process (RAP) developed by Watson et al. for the 2015 Quadrennial Energy Review. The extension allows for both outputs from system models and for historical data to serve as the basis for creating grid resilience metrics and informing grid resilience planning and response decision-making. This document describes the grid resilience metrics and analysis methods. Demonstration of the metrics and methods is shown through a set of illustrative use cases.

  12. Resilience-based performance metrics for water resources management under uncertainty

    Science.gov (United States)

    Roach, Tom; Kapelan, Zoran; Ledbetter, Ralph

    2018-06-01

    This paper aims to develop new, resilience type metrics for long-term water resources management under uncertain climate change and population growth. Resilience is defined here as the ability of a water resources management system to 'bounce back', i.e. absorb and then recover from a water deficit event, restoring the normal system operation. Ten alternative metrics are proposed and analysed addressing a range of different resilience aspects including duration, magnitude, frequency and volume of related water deficit events. The metrics were analysed on a real-world case study of the Bristol Water supply system in the UK and compared with current practice. The analyses included an examination of metrics' sensitivity and correlation, as well as a detailed examination into the behaviour of metrics during water deficit periods. The results obtained suggest that multiple metrics which cover different aspects of resilience should be used simultaneously when assessing the resilience of a water resources management system, leading to a more complete understanding of resilience compared with current practice approaches. It was also observed that calculating the total duration of a water deficit period provided a clearer and more consistent indication of system performance compared to splitting the deficit periods into the time to reach and time to recover from the worst deficit events.
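
    The water deficit event quantities that the proposed metrics build on (duration, magnitude, frequency, and volume) can be extracted from a supply/demand time series along the following lines. This is a minimal sketch of the underlying bookkeeping, not the paper's exact metric definitions.

    ```python
    def deficit_events(supply, demand):
        """Group time steps where supply < demand into contiguous deficit events.
        Returns one dict per event with its duration (steps), peak magnitude
        and total volume; event count gives the deficit frequency."""
        events, current = [], None
        for s, d in zip(supply, demand):
            deficit = max(0, d - s)
            if deficit > 0:
                if current is None:
                    current = {"duration": 0, "magnitude": 0, "volume": 0}
                current["duration"] += 1
                current["magnitude"] = max(current["magnitude"], deficit)
                current["volume"] += deficit
            elif current is not None:
                events.append(current)
                current = None
        if current is not None:  # series ends mid-deficit
            events.append(current)
        return events

    supply = [10, 8, 7, 10, 9, 10]
    demand = [9, 9, 9, 9, 10, 9]
    print(deficit_events(supply, demand))
    ```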

  13. Bin Ratio-Based Histogram Distances and Their Application to Image Classification.

    Science.gov (United States)

    Hu, Weiming; Xie, Nianhua; Hu, Ruiguang; Ling, Haibin; Chen, Qiang; Yan, Shuicheng; Maybank, Stephen

    2014-12-01

    Large variations in image background may cause partial matching and normalization problems for histogram-based representations, i.e., the histograms of the same category may have bins which are significantly different, and normalization may produce large changes in the differences between corresponding bins. In this paper, we deal with this problem by using the ratios between bin values of histograms, rather than the differences between bin values used in traditional histogram distances. We propose a bin ratio-based histogram distance (BRD), which is an intra-cross-bin distance, in contrast with previous bin-to-bin distances and cross-bin distances. The BRD is robust to partial matching and histogram normalization, and captures correlations between bins with only a linear computational complexity. We combine the BRD with the ℓ1 histogram distance and the χ² histogram distance to generate the ℓ1 BRD and the χ² BRD, respectively. These combinations exploit and benefit from the robustness of the BRD under partial matching and the robustness of the ℓ1 and χ² distances to small noise. We propose a method for assessing the robustness of histogram distances to partial matching. The BRDs and logistic regression-based histogram fusion are applied to image classification. The experimental results on synthetic data sets show the robustness of the BRDs to partial matching, and the experiments on seven benchmark data sets demonstrate promising results of the BRDs for image classification.
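
    The ℓ1 and χ² histogram distances named in the abstract are standard and sketched below; the record does not give the exact BRD formula, so the ratio-based function is only an illustrative sketch of the cross-bin-ratio idea (the all-pairs form and the eps smoothing are assumptions, not the paper's definition).

    ```python
    def l1_distance(h, g):
        """Bin-to-bin L1 distance between two histograms."""
        return sum(abs(a - b) for a, b in zip(h, g))

    def chi2_distance(h, g, eps=1e-12):
        """Bin-to-bin chi-squared distance between two histograms."""
        return sum((a - b) ** 2 / (a + b + eps) for a, b in zip(h, g))

    def ratio_distance(h, g, eps=1e-12):
        """Illustrative cross-bin ratio comparison (NOT the paper's exact BRD):
        compares every pairwise bin ratio h_i/h_j against g_i/g_j, so a global
        rescaling of one histogram leaves the measure unchanged."""
        n, total = len(h), 0.0
        for i in range(n):
            for j in range(n):
                total += abs(h[i] / (h[j] + eps) - g[i] / (g[j] + eps))
        return total

    h, g = [0.5, 0.5], [0.25, 0.75]
    print(l1_distance(h, g), chi2_distance(h, g))
    ```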

  14. Multi-image acquisition-based distance sensor using agile laser spot beam.

    Science.gov (United States)

    Riza, Nabeel A; Amin, M Junaid

    2014-09-01

    We present a novel laser-based distance measurement technique that uses multiple-image-based spatial processing to enable distance measurements. Compared with the first-generation distance sensor using spatial processing, the modified sensor is no longer hindered by the classic Rayleigh axial resolution limit for the propagating laser beam at its minimum beam waist location. The proposed high-resolution distance sensor design uses an electronically controlled variable focus lens (ECVFL) in combination with an optical imaging device, such as a charge-coupled device (CCD), to produce and capture different laser spot size images on a target, with these beam spot sizes different from the minimal spot size possible at this target distance. By exploiting the unique relationship of the target located spot sizes with the varying ECVFL focal length for each target distance, the proposed distance sensor can compute the target distance with a distance measurement resolution better than the axial resolution via the Rayleigh resolution criterion. Using a 30 mW 633 nm He-Ne laser coupled with an electromagnetically actuated liquid ECVFL, along with a 20 cm focal length bias lens, and using five spot images captured per target position by a CCD-based Nikon camera, a proof-of-concept proposed distance sensor is successfully implemented in the laboratory over target ranges from 10 to 100 cm with a demonstrated sub-cm axial resolution, which is better than the axial Rayleigh resolution limit at these target distances. Applications for the proposed potentially cost-effective distance sensor are diverse and include industrial inspection and measurement and 3D object shape mapping and imaging.

  15. Stochastic learning of multi-instance dictionary for earth mover’s distance-based histogram comparison

    KAUST Repository

    Fan, Jihong

    2016-09-17

    Dictionary plays an important role in multi-instance data representation. It maps bags of instances to histograms. Earth mover's distance (EMD) is the most effective histogram distance metric for the application of multi-instance retrieval. However, up to now, there are no existing multi-instance dictionary learning methods designed for EMD-based histogram comparison. To fill this gap, we develop the first EMD-optimal dictionary learning method, using a stochastic optimization method. In the stochastic learning framework, we have one triplet of bags, including one basic bag, one positive bag, and one negative bag. These bags are mapped to histograms using a multi-instance dictionary. We argue that the EMD between the basic histogram and the positive histogram should be smaller than that between the basic histogram and the negative histogram. Based on this condition, we design a hinge loss. By minimizing this hinge loss and some regularization terms of the dictionary, we update the dictionary instances. The experiments over multi-instance retrieval applications show its effectiveness when compared to other dictionary learning methods on the problems of medical image retrieval and natural language relation classification. © 2016 The Natural Computing Applications Forum
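
    The triplet condition described above can be sketched as a hinge loss over EMDs. The sketch below assumes 1-D histograms with unit ground distance, where EMD has a closed form as the sum of absolute cumulative differences; the margin value is an assumption, and the paper itself works with general multi-instance histograms rather than this simplified case.

    ```python
    def emd_1d(h, g):
        """Earth mover's distance between two normalized 1-D histograms with
        unit ground distance: sum of absolute CDF differences."""
        diff, total = 0.0, 0.0
        for a, b in zip(h, g):
            diff += a - b          # running difference of the two CDFs
            total += abs(diff)
        return total

    def triplet_hinge_loss(basic, positive, negative, margin=1.0):
        """Hinge loss that is zero exactly when
        EMD(basic, positive) + margin <= EMD(basic, negative)."""
        return max(0.0, margin + emd_1d(basic, positive) - emd_1d(basic, negative))

    basic, pos, neg = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]
    print(triplet_hinge_loss(basic, pos, neg))
    ```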

  16. Prognostics and Condition-Based Maintenance: A New Approach to Precursive Metrics

    International Nuclear Information System (INIS)

    Jarrell, Donald B.; Sisk, Daniel R.; Bond, Leonard J.

    2004-01-01

    The assumptions used in the design basis of process equipment have always been as much art as science. The usually imprecise boundaries of the equipment's operational envelope provide opportunities for two major improvements in the operations and maintenance (O and M) of process machinery: (a) the actual versus intended machine environment can be understood and brought into much better alignment and (b) the end goal can define O and M strategies in terms of life cycle and economic management of plant assets. Scientists at the Pacific Northwest National Laboratory (PNNL) have performed experiments aimed at understanding and controlling aging of both safety-specific nuclear plant components and the infrastructure that supports essential plant processes. In this paper we examine the development of aging precursor metrics and their correlation with degradation rate and projected machinery failure. Degradation-specific correlations have been developed at PNNL that will allow accurate physics-based diagnostic and prognostic determinations to be derived from a new view of condition-based maintenance. This view, founded in root cause analysis, is focused on quantifying the primary stressor(s) responsible for degradation in the component of interest and formulating a deterministic relationship between the stressor intensity and the resulting degradation rate. This precursive relationship between the performance, degradation, and underlying stressor set is used to gain a first-principles approach to prognostic determinations. A holistic infrastructure approach, as applied through a condition-based maintenance framework, will allow intelligent, automated diagnostic and prognostic programming to provide O and M practitioners with an understanding of the condition of their machinery today and an assurance of its operational state tomorrow.

  17. Distance-Based Functional Diversity Measures and Their Decomposition: A Framework Based on Hill Numbers

    Science.gov (United States)

    Chiu, Chun-Huo; Chao, Anne

    2014-01-01

    Hill numbers (or the “effective number of species”) are increasingly used to characterize species diversity of an assemblage. This work extends Hill numbers to incorporate species pairwise functional distances calculated from species traits. We derive a parametric class of functional Hill numbers, which quantify “the effective number of equally abundant and (functionally) equally distinct species” in an assemblage. We also propose a class of mean functional diversity (per species), which quantifies the effective sum of functional distances between a fixed species to all other species. The product of the functional Hill number and the mean functional diversity thus quantifies the (total) functional diversity, i.e., the effective total distance between species of the assemblage. The three measures (functional Hill numbers, mean functional diversity and total functional diversity) quantify different aspects of species trait space, and all are based on species abundance and species pairwise functional distances. When all species are equally distinct, our functional Hill numbers reduce to ordinary Hill numbers. When species abundances are not considered or species are equally abundant, our total functional diversity reduces to the sum of all pairwise distances between species of an assemblage. The functional Hill numbers and the mean functional diversity both satisfy a replication principle, implying the total functional diversity satisfies a quadratic replication principle. When there are multiple assemblages defined by the investigator, each of the three measures of the pooled assemblage (gamma) can be multiplicatively decomposed into alpha and beta components, and the two components are independent. The resulting beta component measures pure functional differentiation among assemblages and can be further transformed to obtain several classes of normalized functional similarity (or differentiation) measures, including N-assemblage functional generalizations of

  18. Distance-based functional diversity measures and their decomposition: a framework based on Hill numbers.

    Directory of Open Access Journals (Sweden)

    Chun-Huo Chiu

    Full Text Available Hill numbers (or the "effective number of species") are increasingly used to characterize species diversity of an assemblage. This work extends Hill numbers to incorporate species pairwise functional distances calculated from species traits. We derive a parametric class of functional Hill numbers, which quantify "the effective number of equally abundant and (functionally) equally distinct species" in an assemblage. We also propose a class of mean functional diversity (per species), which quantifies the effective sum of functional distances between a fixed species to all other species. The product of the functional Hill number and the mean functional diversity thus quantifies the (total) functional diversity, i.e., the effective total distance between species of the assemblage. The three measures (functional Hill numbers, mean functional diversity and total functional diversity) quantify different aspects of species trait space, and all are based on species abundance and species pairwise functional distances. When all species are equally distinct, our functional Hill numbers reduce to ordinary Hill numbers. When species abundances are not considered or species are equally abundant, our total functional diversity reduces to the sum of all pairwise distances between species of an assemblage. The functional Hill numbers and the mean functional diversity both satisfy a replication principle, implying the total functional diversity satisfies a quadratic replication principle. When there are multiple assemblages defined by the investigator, each of the three measures of the pooled assemblage (gamma) can be multiplicatively decomposed into alpha and beta components, and the two components are independent. The resulting beta component measures pure functional differentiation among assemblages and can be further transformed to obtain several classes of normalized functional similarity (or differentiation) measures, including N-assemblage functional
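
    The building block behind these abundance-weighted distance measures is Rao's quadratic entropy, Q = Σᵢⱼ dᵢⱼ pᵢ pⱼ. A minimal sketch follows; with S equally abundant species (pᵢ = 1/S), Q × S² recovers the ordered-pair distance sum, illustrating the reduction to the sum of all pairwise distances noted in the abstract. The exact normalization of the paper's functional Hill numbers is not reproduced here.

    ```python
    def rao_q(distances, abundances):
        """Rao's quadratic entropy Q = sum_ij d_ij * p_i * p_j, the
        abundance-weighted mean pairwise functional distance.
        `distances` is a symmetric matrix with zero diagonal."""
        n = len(abundances)
        total = sum(abundances)
        p = [a / total for a in abundances]  # relative abundances
        return sum(distances[i][j] * p[i] * p[j]
                   for i in range(n) for j in range(n))

    # Three species on a line in trait space, equally abundant
    d = [[0, 1, 2],
         [1, 0, 1],
         [2, 1, 0]]
    print(rao_q(d, [1, 1, 1]))  # ordered-pair distance sum is 8, so Q = 8/9
    ```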

  19. Identifying multiple influential spreaders in term of the distance-based coloring

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Lei; Lin, Jian-Hong; Guo, Qiang [Research Center of Complex Systems Science, University of Shanghai for Science and Technology, Shanghai 200093 (China); Liu, Jian-Guo, E-mail: liujg004@ustc.edu.cn [Research Center of Complex Systems Science, University of Shanghai for Science and Technology, Shanghai 200093 (China); Data Science and Cloud Service Research Centre, Shanghai University of Finance and Economics, Shanghai 200433 (China)

    2016-02-22

    Identifying influential nodes is of significance for understanding the dynamics of the information diffusion process in complex networks. In this paper, we present an improved distance-based coloring method to identify multiple influential spreaders. In our method, each node is assigned a color under the rule that the distance between initial nodes is close to the average distance of the network. When all nodes are colored, nodes with the same color are sorted into an independent set. Then we choose the nodes at the top positions of the ranking list according to their centralities. The experimental results for an artificial network and three empirical networks show that, compared with the performance of the traditional coloring method, the improvement ratio of our distance-based coloring method could reach 12.82%, 8.16%, 4.45%, and 2.93% for the ER, Erdős, Polblogs and Routers networks, respectively. - Highlights: • We present an improved distance-based coloring method to identify multiple influential spreaders. • Each node is assigned a color such that the distance between initial nodes is close to the average distance. • For three empirical networks, the improvement ratio of our distance-based coloring method could reach 8.16% for the Erdős network.
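
    The procedure described (color nodes so that same-colored nodes lie roughly the average distance apart, then rank one color class by centrality) can be sketched in pure Python as below. The greedy coloring order, the use of degree centrality, and picking the largest color class are assumptions of this simplified sketch, not the authors' exact algorithm; a connected network is assumed.

    ```python
    from collections import deque

    def bfs_distances(adj, src):
        """Hop distances from src in an adjacency-list graph."""
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        return dist

    def distance_based_spreaders(adj, k=2):
        """Simplified sketch: color nodes so same-colored nodes are at least
        the network's average distance apart, then return the top-degree
        nodes of the largest color class as candidate spreaders."""
        nodes = list(adj)
        dist = {u: bfs_distances(adj, u) for u in nodes}
        pairs = [(u, v) for u in nodes for v in nodes if u != v and v in dist[u]]
        avg = round(sum(dist[u][v] for u, v in pairs) / len(pairs))
        colors = {}
        for u in sorted(nodes, key=lambda n: -len(adj[n])):  # high degree first
            c = 0
            while any(colors.get(v) == c and dist[u].get(v, avg) < avg
                      for v in colors):
                c += 1
            colors[u] = c
        classes = {}
        for u, c in colors.items():
            classes.setdefault(c, []).append(u)
        best = max(classes.values(), key=len)  # largest independent set
        return sorted(best, key=lambda n: -len(adj[n]))[:k]

    # Example: a 5-node path graph 0-1-2-3-4
    path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
    print(distance_based_spreaders(path, k=2))
    ```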

  20. Identifying multiple influential spreaders in term of the distance-based coloring

    International Nuclear Information System (INIS)

    Guo, Lei; Lin, Jian-Hong; Guo, Qiang; Liu, Jian-Guo

    2016-01-01

    Identifying influential nodes is of significance for understanding the dynamics of the information diffusion process in complex networks. In this paper, we present an improved distance-based coloring method to identify multiple influential spreaders. In our method, each node is assigned a color under the rule that the distance between initial nodes is close to the average distance of the network. When all nodes are colored, nodes with the same color are sorted into an independent set. Then we choose the nodes at the top positions of the ranking list according to their centralities. The experimental results for an artificial network and three empirical networks show that, compared with the performance of the traditional coloring method, the improvement ratio of our distance-based coloring method could reach 12.82%, 8.16%, 4.45%, and 2.93% for the ER, Erdős, Polblogs and Routers networks, respectively. - Highlights: • We present an improved distance-based coloring method to identify multiple influential spreaders. • Each node is assigned a color such that the distance between initial nodes is close to the average distance. • For three empirical networks, the improvement ratio of our distance-based coloring method could reach 8.16% for the Erdős network.

  1. Challenges and Opportunities for Learning Biology in Distance-Based Settings

    Science.gov (United States)

    Hallyburton, Chad L.; Lunsford, Eddie

    2013-01-01

    The history of learning biology through distance education is documented. A review of terminology and unique problems associated with biology instruction is presented. Using published research and their own teaching experience, the authors present recommendations and best practices for managing biology in distance-based formats. They offer ideas…

  2. Species-Level Differences in Hyperspectral Metrics among Tropical Rainforest Trees as Determined by a Tree-Based Classifier

    Directory of Open Access Journals (Sweden)

    Dar A. Roberts

    2012-06-01

    Full Text Available This study explores a method to classify seven tropical rainforest tree species from full-range (400–2,500 nm) hyperspectral data acquired at tissue (leaf and bark), pixel and crown scales using laboratory and airborne sensors. Metrics that respond to vegetation chemistry and structure were derived using narrowband indices, derivative- and absorption-based techniques, and spectral mixture analysis. We then used the Random Forests tree-based classifier to discriminate species with minimally-correlated, importance-ranked metrics. At all scales, best overall accuracies were achieved with metrics derived from all four techniques that targeted chemical and structural properties across the visible to shortwave infrared spectrum (400–2,500 nm). For tissue spectra, overall accuracies were 86.8% for leaves, 74.2% for bark, and 84.9% for leaves plus bark. Variation in tissue metrics was best explained by an axis of red absorption related to photosynthetic leaves and an axis distinguishing bark water and other chemical absorption features. Overall accuracies for individual tree crowns were 71.5% for pixel spectra, 70.6% for crown-mean spectra, and 87.4% for a pixel-majority technique. At pixel and crown scales, tree structure and phenology at the time of image acquisition were important factors that determined species spectral separability.

  3. Dynamic Eye Tracking Based Metrics for Infant Gaze Patterns in the Face-Distractor Competition Paradigm

    Science.gov (United States)

    Ahtola, Eero; Stjerna, Susanna; Yrttiaho, Santeri; Nelson, Charles A.; Leppänen, Jukka M.; Vanhatalo, Sampsa

    2014-01-01

    Objective To develop new standardized eye tracking based measures and metrics for infants' gaze dynamics in the face-distractor competition paradigm. Method Eye tracking data were collected from two samples of healthy 7-month-old (total n = 45), as well as one sample of 5-month-old infants (n = 22), in a paradigm with a picture of a face or a non-face pattern as a central stimulus, and a geometric shape as a lateral stimulus. The data were analyzed by using conventional measures of infants' initial disengagement from the central to the lateral stimulus (i.e., saccadic reaction time and probability) and, additionally, novel measures reflecting infants' gaze dynamics after the initial disengagement (i.e., cumulative allocation of attention to the central vs. peripheral stimulus). Results The results showed that the initial saccade away from the centrally presented stimulus is followed by a rapid re-engagement of attention with the central stimulus, leading to cumulative preference for the central stimulus over the lateral stimulus over time. This pattern tended to be stronger for salient facial expressions as compared to non-face patterns, was replicable across two independent samples of 7-month-old infants, and differentiated between 7- and 5-month-old infants. Conclusion The results suggest that eye tracking based assessments of infants' cumulative preference for faces over time can be readily parameterized and standardized, and may provide valuable techniques for future studies examining normative developmental changes in preference for social signals. Significance Standardized measures of early developing face preferences may have potential to become surrogate biomarkers of neurocognitive and social development. PMID:24845102

  4. Prognostics and Condition Based Maintenance: A New Approach to Precursive Metrics

    International Nuclear Information System (INIS)

    Jarrell, Donald B.; Sisk, Daniel R.; Bond, Leonard J.

    2002-01-01

    Scientists at the Pacific Northwest National Laboratory (PNNL) have examined the necessity for understanding and controlling the aging process of both safety-specific plant components and the infrastructure that supports these processes. In this paper we examine the preliminary development of aging precursor metrics and their correlation with degradation rate and projected machine failure. Degradation-specific correlations are currently being developed at PNNL that will allow accurate physics-based diagnostic and prognostic determinations to be derived from a new view of condition based maintenance. This view, founded in root cause analysis, is focused on quantifying the primary stressor(s) responsible for degradation in the component of interest. The derivative relationship between the performance, degradation and the underlying stressor set is used to gain a first-principles approach to prognostic determinations. The assumptions used for the design basis of process equipment have always been as much art as science and for this reason have been misused or relegated into obscurity in all but the nuclear industry. The ability to successfully link degradation and expected equipment life to stressor intensity level is valuable in that it quantifies the degree of machine stress for a given production level. This allows two major improvements in the O and M of process machinery: (1) the actual versus intended machine environment can be understood and brought into much better alignment, and (2) the end goal can define operations and maintenance strategies in terms of life cycle and economic management of plant assets. A holistic infrastructure approach, as applied through a CBM framework, will allow intelligent, automated diagnostic and prognostic programs to provide O and M practitioners with an understanding of the condition of their machinery today and an assurance of its operational state tomorrow.

  5. Dynamic eye tracking based metrics for infant gaze patterns in the face-distractor competition paradigm.

    Directory of Open Access Journals (Sweden)

    Eero Ahtola

    Full Text Available To develop new standardized eye tracking based measures and metrics for infants' gaze dynamics in the face-distractor competition paradigm. Eye tracking data were collected from two samples of healthy 7-month-old infants (total n = 45), as well as one sample of 5-month-old infants (n = 22), in a paradigm with a picture of a face or a non-face pattern as a central stimulus, and a geometric shape as a lateral stimulus. The data were analyzed using conventional measures of infants' initial disengagement from the central to the lateral stimulus (i.e., saccadic reaction time and probability) and, additionally, novel measures reflecting infants' gaze dynamics after the initial disengagement (i.e., cumulative allocation of attention to the central vs. peripheral stimulus). The results showed that the initial saccade away from the centrally presented stimulus is followed by a rapid re-engagement of attention with the central stimulus, leading to a cumulative preference for the central stimulus over the lateral stimulus over time. This pattern tended to be stronger for salient facial expressions than for non-face patterns, was replicable across two independent samples of 7-month-old infants, and differentiated between 7- and 5-month-old infants. The results suggest that eye tracking based assessments of infants' cumulative preference for faces over time can be readily parameterized and standardized, and may provide valuable techniques for future studies examining normative developmental changes in preference for social signals. Standardized measures of early developing face preferences may have the potential to become surrogate biomarkers of neurocognitive and social development.

  6. Attenuation-based size metric for estimating organ dose to patients undergoing tube current modulated CT exams

    Energy Technology Data Exchange (ETDEWEB)

    Bostani, Maryam, E-mail: mbostani@mednet.ucla.edu; McMillan, Kyle; Lu, Peiyun; Kim, Hyun J.; Cagnon, Chris H.; McNitt-Gray, Michael F. [Departments of Biomedical Physics and Radiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, California 90024 (United States); DeMarco, John J. [Department of Radiation Oncology, University of California, Los Angeles, Los Angeles, California 90095 (United States)

    2015-02-15

    Purpose: Task Group 204 introduced effective diameter (ED) as the patient size metric used to correlate size-specific dose estimates (SSDE). However, this size metric fails to account for patient attenuation properties, and it has been suggested that it be replaced by an attenuation-based size metric, water equivalent diameter (D{sub W}). The purpose of this study is to investigate different size metrics, effective diameter and water equivalent diameter, in combination with regional descriptions of scanner output, to establish the most appropriate size metric to be used as a predictor for organ dose in tube current modulated CT exams. Methods: 101 thoracic and 82 abdomen/pelvis scans from clinically indicated CT exams were collected retrospectively from a multidetector row CT (Sensation 64, Siemens Healthcare) with Institutional Review Board approval to generate voxelized patient models. Fully irradiated organs (lung and breasts in thoracic scans; liver, kidneys, and spleen in abdominal scans) were segmented and used as tally regions in Monte Carlo simulations for reporting organ dose. Along with image data, raw projection data were collected to obtain tube current information for simulating tube current modulation scans using Monte Carlo methods. Additionally, previously described patient size metrics [ED, D{sub W}, and approximated water equivalent diameter (D{sub Wa})] were calculated for each patient and reported in three different ways: a single value averaged over the entire scan, a single value averaged over the region of interest, and a single value from a location in the middle of the scan volume. Organ doses were normalized by an appropriate mAs-weighted CTDI{sub vol} to reflect regional variation of tube current. Linear regression analysis was used to evaluate the correlations between normalized organ doses and each size metric. Results: For the abdominal organs, the correlations between normalized organ dose and size metric were overall slightly higher for all three
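The two size metrics contrasted in this abstract can be sketched numerically. The sketch below uses the commonly published relations (effective diameter as the geometric mean of the AP and lateral dimensions; water equivalent diameter from the mean CT number and area of an ROI); the function names and example values are illustrative, not taken from this study.

```python
import math

def water_equivalent_diameter(mean_hu, area_cm2):
    """Attenuation-based size metric: D_w = 2 * sqrt((HU/1000 + 1) * A / pi),
    where HU is the mean CT number and A the area of the ROI."""
    return 2.0 * math.sqrt((mean_hu / 1000.0 + 1.0) * area_cm2 / math.pi)

def effective_diameter(ap_cm, lat_cm):
    """Geometric size metric: sqrt(AP * LAT)."""
    return math.sqrt(ap_cm * lat_cm)

# For a water-equivalent ROI (mean HU = 0), D_w reduces to the area-equivalent
# diameter: a 15 cm circle of water gives D_w = 15.0
print(round(water_equivalent_diameter(0.0, math.pi * 15.0 ** 2 / 4), 1))
```

A lung-dominated ROI (strongly negative mean HU) yields a smaller D{sub W} than a soft-tissue ROI of the same area, which is exactly the attenuation information that effective diameter cannot capture.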

  7. Image Inpainting Based on Coherence Transport with Adapted Distance Functions

    KAUST Repository

    März, Thomas

    2011-01-01

    We discuss an extension of our image inpainting method based on coherence transport. For the latter method, the pixels of the inpainting domain have to be serialized into an ordered list. Until now, to induce the serialization we have used

  8. Running distance education in studio-based art institutions

    African Journals Online (AJOL)

    PUBLICATIONS1

    tion to increase student enrolment without compounding the problem of overcrowding. This is ... tance education effectively in studio-based art institutions at a lesser cost than through video- ... example, have adopted this system for this pur-.

  9. RSSI-Based Distance Estimation Framework Using a Kalman Filter for Sustainable Indoor Computing Environments

    Directory of Open Access Journals (Sweden)

    Yunsick Sung

    2016-11-01

    Full Text Available Given that location information is the key to providing a variety of services in sustainable indoor computing environments, accurate locations need to be obtained. A location can be estimated from the distances to three fixed points. Therefore, if the distance between two points can be measured or estimated accurately, locations in indoor environments can be estimated. To increase the accuracy of the measured distance, noise filtering, signal revision, and distance estimation processes are generally performed. This paper proposes a novel framework for estimating the distance between a beacon and an access point (AP) in a sustainable indoor computing environment. Diverse types of received signal strength indications (RSSIs) are used for WiFi, Bluetooth, and radio signals, and the proposed distance estimation framework is unique in that it is independent of the specific wireless signal involved, being based on the Bluetooth signal of the beacon. Generally, RSSI measurement, noise filtering, and revision are required for distance estimation using RSSIs. The employed RSSIs are first measured from an AP, with multiple APs sometimes used to increase the accuracy of the distance estimation. Owing to the inevitable presence of noise in the measured RSSIs, the application of noise filtering is essential, and further revision is used to address the inaccuracy and instability that characterize RSSIs measured in an indoor environment. The revised RSSIs are then used to estimate the distance. The proposed distance estimation framework uses one AP to measure the RSSIs, a Kalman filter to eliminate noise, and a log-distance path loss model to revise the measured RSSIs. In the experimental implementation of the framework, both an RSSI filter and a Kalman filter were used for noise elimination to comparatively evaluate the performance of the latter for this specific application. The Kalman filter was found to reduce the accumulated errors by 8
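As a concrete illustration of the pipeline described (noise filtering with a Kalman filter, then conversion with a log-distance path loss model), a minimal sketch follows. The filter parameters, the reference RSSI at 1 m, and the path loss exponent are assumed values for illustration, not those of the proposed framework.

```python
import random

def kalman_1d(measurements, q=0.005, r=1.0):
    """Minimal scalar Kalman filter for smoothing noisy RSSI readings."""
    x, p = measurements[0], 1.0          # initial state estimate and covariance
    smoothed = []
    for z in measurements:
        p += q                           # predict: the RSSI is modeled as constant
        k = p / (p + r)                  # Kalman gain
        x += k * (z - x)                 # correct with the new measurement
        p *= 1 - k
        smoothed.append(x)
    return smoothed

def rssi_to_distance(rssi, rssi_at_1m=-59.0, path_loss_exponent=2.0):
    """Log-distance path loss model: d = 10 ** ((RSSI_1m - RSSI) / (10 * n))."""
    return 10 ** ((rssi_at_1m - rssi) / (10 * path_loss_exponent))

# Simulated noisy readings around -75 dBm (about 6.3 m under the assumed parameters)
random.seed(0)
readings = [-75 + random.gauss(0, 2) for _ in range(50)]
filtered = kalman_1d(readings)
print(round(rssi_to_distance(filtered[-1]), 1))
```

With `rssi_at_1m` calibrated per beacon and the path loss exponent fitted to the environment, the same two steps produce the distance estimates that trilateration from three fixed points would consume.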

  10. Estimating Impacts of Agricultural Subsurface Drainage on Evapotranspiration Using the Landsat Imagery-Based METRIC Model

    Directory of Open Access Journals (Sweden)

    Kul Khand

    2017-11-01

    Full Text Available Agricultural subsurface drainage changes the field hydrology, and potentially the amount of water available to the crop, by altering the flow path and the rate and timing of water removal. Evapotranspiration (ET) is normally among the largest components of the field water budget, and the changes in ET from the introduction of subsurface drainage are likely to have a greater influence on the overall water yield (surface runoff plus subsurface drainage) from subsurface drained (TD) fields compared to fields without subsurface drainage (UD). To test this hypothesis, we examined the impact of subsurface drainage on ET at two sites located in the Upper Midwest (North Dakota, Site 1; South Dakota, Site 2) using the Landsat imagery-based METRIC (Mapping Evapotranspiration at high Resolution with Internalized Calibration) model. Site 1 was planted with corn (Zea mays L.) and soybean (Glycine max L.) during the 2009 and 2010 growing seasons, respectively. Site 2 was planted with corn for the 2013 growing season. During the corn growing seasons (2009 and 2013), differences between the total ET from TD and UD fields were less than 5 mm. For the soybean year (2010), ET from the UD field was 10% (53 mm) greater than that from the TD field. During the peak ET period from June to September for all study years, ET differences from TD and UD fields were within 15 mm (<3%). Overall, differences between daily ET from TD and UD fields were not statistically significant (p > 0.05) and showed no consistent relationship.

  11. Development of a clinician reputation metric to identify appropriate problem-medication pairs in a crowdsourced knowledge base.

    Science.gov (United States)

    McCoy, Allison B; Wright, Adam; Rogith, Deevakar; Fathiamini, Safa; Ottenbacher, Allison J; Sittig, Dean F

    2014-04-01

    Correlation of data within electronic health records is necessary for implementation of various clinical decision support functions, including patient summarization. A key type of correlation is linking medications to clinical problems; while some databases of problem-medication links are available, they are not robust and depend on problems and medications being encoded in particular terminologies. Crowdsourcing represents one approach to generating robust knowledge bases across a variety of terminologies, but more sophisticated approaches are necessary to improve accuracy and reduce manual data review requirements. We sought to develop and evaluate a clinician reputation metric to facilitate the identification of appropriate problem-medication pairs through crowdsourcing without requiring extensive manual review. We retrieved medications from our clinical data warehouse that had been prescribed and manually linked to one or more problems by clinicians during e-prescribing between June 1, 2010 and May 31, 2011. We identified measures likely to be associated with the percentage of accurate problem-medication links made by clinicians. Using logistic regression, we created a metric for identifying clinicians who had made greater than or equal to 95% appropriate links. We evaluated the accuracy of the approach by comparing links made by those physicians identified as having appropriate links to a previously manually validated subset of problem-medication pairs. Of 867 clinicians who asserted a total of 237,748 problem-medication links during the study period, 125 had a reputation metric that predicted the percentage of appropriate links greater than or equal to 95%. These clinicians asserted a total of 2464 linked problem-medication pairs (983 distinct pairs). Compared to a previously validated set of problem-medication pairs, the reputation metric achieved a specificity of 99.5% and marginally improved the sensitivity of previously described knowledge bases. A

  12. Distance tracking scheme for seamless handover in IMS-based ...

    African Journals Online (AJOL)

    This paper proposes a fast and seamless handover scheme for systems based on IP Multimedia Subsystem (IMS) architectural framework with Universal Mobile Telecommunications System (UMTS) access network. In the scheme the location, direction and movement pattern of a Mobile Node (MN) in a network cell are ...

  13. Generalized Distance Transforms and Skeletons in Graphics Hardware

    NARCIS (Netherlands)

    Strzodka, R.; Telea, A.

    2004-01-01

    We present a framework for computing generalized distance transforms and skeletons of two-dimensional objects using graphics hardware. Our method is based on the concept of footprint splatting. Combining different splats produces weighted distance transforms for different metrics, as well as the

  14. Noisy EEG signals classification based on entropy metrics. Performance assessment using first and second generation statistics.

    Science.gov (United States)

    Cuesta-Frau, David; Miró-Martínez, Pau; Jordán Núñez, Jorge; Oltra-Crespo, Sandra; Molina Picó, Antonio

    2017-08-01

    This paper evaluates the performance of first generation entropy metrics, featured by the well known and widely used Approximate Entropy (ApEn) and Sample Entropy (SampEn) metrics, and what can be considered an evolution of these, Fuzzy Entropy (FuzzyEn), in the Electroencephalogram (EEG) signal classification context. The study uses the commonest artifacts found in real EEGs, such as white noise, and muscular, cardiac, and ocular artifacts. Using two different sets of publicly available EEG records and a realistic range of amplitudes for interfering artifacts, this work optimises and assesses the robustness of these metrics against artifacts in terms of class segmentation probability. The results show that the qualitative behaviour of the two datasets is similar, with SampEn and FuzzyEn performing best, and that noise and muscular artifacts are the most confounding factors. In contrast, there is wide variability with regard to initialization parameters. The poor performance achieved by ApEn suggests that this metric should not be used in these contexts. Copyright © 2017 Elsevier Ltd. All rights reserved.
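Of the metrics evaluated, SampEn has the most compact definition; a brute-force sketch is given below. The embedding dimension m = 2 and tolerance r = 0.2·SD are conventional defaults, not the optimised values from this study.

```python
import math
import random

def sample_entropy(x, m=2, r=0.2):
    """Brute-force Sample Entropy: -ln(A/B), where B counts pairs of matching
    templates of length m and A of length m + 1, within tolerance r * SD."""
    n = len(x)
    mean = sum(x) / n
    tol = r * (sum((v - mean) ** 2 for v in x) / n) ** 0.5

    def matches(length):
        t = [x[i:i + length] for i in range(n - length + 1)]
        return sum(1 for i in range(len(t)) for j in range(i + 1, len(t))
                   if max(abs(a - b) for a, b in zip(t[i], t[j])) <= tol)

    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

# A regular signal is more predictable, so it should score lower than noise
random.seed(1)
regular = [math.sin(0.5 * i) for i in range(200)]
noisy = [random.gauss(0, 1) for _ in range(200)]
print(sample_entropy(regular) < sample_entropy(noisy))
```

In an EEG setting, a superimposed muscular artifact raises the entropy of an otherwise regular segment, which is why the metric's sensitivity to such artifacts matters for classification.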

  15. Linking customer and financial metrics to shareholder value : The leverage effect in customer-based valuation

    NARCIS (Netherlands)

    Schulze, C.; Skiera, B.; Wiesel, T.

    Customers are the most important assets of most companies, such that customer equity has been used as a proxy for shareholder value. However, linking customer metrics to shareholder value without considering debt and non-operating assets ignores their effects on relative changes in customer equity

  16. Editorial ~ Does "Lean Thinking" Relate to Network-based Distance Education

    Directory of Open Access Journals (Sweden)

    Peter S. Cookson

    2003-10-01

    Full Text Available Pointing to the “objectivised, rationalized, technologically-based interaction,” Peters (1973) referred to the then prevailing correspondence forms of distance education as “the most industrialized form of education” (p. 313). With such features as assembly line methods; division of labor; centralized processes of teaching materials development, production, and dispatching; student admissions and enrollment systems; automated registration, course allocation, and student support; and personnel management systems, distance education institutions demonstrated management structures and practices utilized in industrial and business organizations. Large numbers of courses and students were thus “processed” in correspondence, radio, and television-based distance education systems.

  17. Design and development of a Personality Prediction System based on Mobile-Phone based Metrics

    OpenAIRE

    Alonso Aguilar, Carlos

    2017-01-01

    The need for communication between people is part of our nature as human beings, but the way people communicate has completely changed since the smartphone and the Internet appeared. At the same time, knowing someone's personality is difficult, something we gauge only after working on communication skills with others. These two points motivated the choice of this TFG, whose aim is to predict human personality by collecting information from smartphones, using Big Data and Machi...

  18. Teachers’ Invisible Presence in Net-based Distance Education

    Directory of Open Access Journals (Sweden)

    Agneta Hult

    2005-11-01

    Full Text Available Conferencing – or dialogue – has always been a core activity in liberal adult education. More recently, attempts have been made to transfer such conversations online in the form of computer-mediated conferencing. This transfer has raised a range of pedagogical questions, most notably: “Can established practices be continued? Or must new forms of participation and group management be established?” This paper addresses these questions. It is based on two sources: (1) 3,700 online postings from a variety of Net-based adult education courses in Sweden; and (2) interviews with participants and course leaders. It comprises a discussion of online conversational activity and, in particular, the absent presence and pedagogic orientation of teachers who steer learners towards explicit and implicit course goals. In other words, it is a reminder that adult education is not a free-floating form of self-instruction but, rather, operates within boundaries created and managed by other human beings.

  19. The effect of uncertainties in distance-based ranking methods for multi-criteria decision making

    Science.gov (United States)

    Jaini, Nor I.; Utyuzhnikov, Sergei V.

    2017-08-01

    Data in multi-criteria decision making are often imprecise and changeable, so it is important to carry out a sensitivity analysis for the multi-criteria decision making problem. The paper aims to present a sensitivity analysis for some ranking techniques based on distance measures in multi-criteria decision making. Two types of uncertainties are considered for the sensitivity analysis. The first uncertainty is related to the input data, while the second concerns the Decision Maker's preferences (weights). The ranking techniques considered in this study are TOPSIS, the relative distance, and trade-off ranking methods. TOPSIS and the relative distance method measure a distance from an alternative to the ideal and anti-ideal solutions. In turn, the trade-off ranking method calculates a distance of an alternative to the extreme solutions and other alternatives. Several test cases are considered to study the performance of each ranking technique under both types of uncertainties.
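Of the three ranking techniques compared, TOPSIS is the most widely documented; a minimal sketch follows. The decision matrix, weights, and benefit/cost labels are assumed example data, not a test case from the paper.

```python
def topsis(matrix, weights, benefit):
    """Score alternatives by relative closeness to the ideal solution.
    matrix: rows = alternatives, cols = criteria; benefit[j] is True if
    criterion j is larger-is-better (False for cost criteria)."""
    m = len(weights)
    # vector-normalize each column, then apply the criterion weights
    norms = [sum(row[j] ** 2 for row in matrix) ** 0.5 for j in range(m)]
    v = [[weights[j] * row[j] / norms[j] for j in range(m)] for row in matrix]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = sum((a - b) ** 2 for a, b in zip(row, ideal)) ** 0.5
        d_neg = sum((a - b) ** 2 for a, b in zip(row, anti)) ** 0.5
        scores.append(d_neg / (d_pos + d_neg))  # closeness coefficient in [0, 1]
    return scores

# Three alternatives scored on two benefit criteria and one cost criterion
decision = [[7, 9, 9], [8, 7, 8], [9, 6, 7]]
scores = topsis(decision, weights=[0.5, 0.3, 0.2], benefit=[True, True, False])
print(max(range(len(scores)), key=scores.__getitem__))
```

A sensitivity analysis of the kind the paper describes amounts to perturbing `decision` or `weights` and observing whether `scores` still produce the same ranking.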

  20. Multiscale Distance Coherence Vector Algorithm for Content-Based Image Retrieval

    Science.gov (United States)

    Jiexian, Zeng; Xiupeng, Liu

    2014-01-01

    A multiscale distance coherence vector algorithm for content-based image retrieval (CBIR) is proposed to address two shortcomings of the distance coherence vector algorithm: different shapes can yield the same descriptor, and its noise robustness is poor. In this algorithm, the image contour curve is first evolved by a Gaussian function, and then the distance coherence vector is extracted from the contours of the original image and of the evolved images. The multiscale distance coherence vector is obtained by a reasonable weight distribution over the distance coherence vectors of the evolved image contours. This algorithm is not only invariant to translation, rotation, and scaling transformations but also has good noise robustness. The experimental results show that the algorithm achieves a higher recall rate and precision rate for the retrieval of images polluted by noise. PMID:24883416

  1. Thickness and clearance visualization based on distance field of 3D objects

    Directory of Open Access Journals (Sweden)

    Masatomo Inui

    2015-07-01

    Full Text Available This paper proposes a novel method for visualizing the thickness and clearance of 3D objects in a polyhedral representation. The proposed method uses the distance field of the objects in the visualization. A parallel algorithm is developed for constructing the distance field of polyhedral objects using the GPU. The distance between a voxel and the surface polygons of the model is computed many times in the distance field construction. Similar sets of polygons are usually selected as close polygons for close voxels. By using this spatial coherence, a parallel algorithm is designed to compute the distances between a cluster of close voxels and the polygons selected by the culling operation so that the fast shared memory mechanism of the GPU can be fully utilized. The thickness/clearance of the objects is visualized by distributing points on the visible surfaces of the objects and painting them with a unique color corresponding to the thickness/clearance values at those points. A modified ray casting method is developed for computing the thickness/clearance using the distance field of the objects. A system based on these algorithms can compute the distance field of complex objects within a few minutes for most cases. After the distance field construction, thickness/clearance visualization at a near interactive rate is achieved.

  2. SU-E-I-71: Quality Assessment of Surrogate Metrics in Multi-Atlas-Based Image Segmentation

    International Nuclear Information System (INIS)

    Zhao, T; Ruan, D

    2015-01-01

    Purpose: With the ever-growing data of heterogeneous quality, relevance assessment of atlases becomes increasingly critical for multi-atlas-based image segmentation. However, there is no universally recognized best relevance metric, and even a standard to compare amongst candidates remains elusive. This study, for the first time, designs a quantification to assess relevance metrics’ quality, based on a novel perspective of the metric as a surrogate for inferring the inaccessible oracle geometric agreement. Methods: We first develop an inference model to relate surrogate metrics in image space to the underlying oracle relevance metric in segmentation label space, with a monotonically non-decreasing function subject to random perturbations. Subsequently, we investigate model parameters to reveal key contributing factors to surrogates’ ability in prognosticating the oracle relevance value, for the specific task of atlas selection. Finally, we design an effective contrast-to-noise ratio (eCNR) to quantify surrogates’ quality based on insights from these analyses and empirical observations. Results: The inference model was specialized to a linear function with normally distributed perturbations, with the surrogate metric exemplified by several widely used image similarity metrics, i.e., MSD/NCC/(N)MI. Surrogates’ behaviors in selecting the most relevant atlases were assessed under varying eCNR, showing that surrogates with high eCNR dominated those with low eCNR in retaining the most relevant atlases. In an end-to-end validation, NCC/(N)MI with eCNR of 0.12 resulted in statistically better segmentation than MSD with eCNR of 0.10, with a mean DSC of about 0.85 and first and third quartiles of (0.83, 0.89), compared to a mean DSC of 0.84 and quartiles of (0.81, 0.89) for MSD. Conclusion: The designed eCNR is capable of characterizing surrogate metrics’ quality in prognosticating the oracle relevance value. It has been demonstrated to be

  3. Active Metric Learning for Supervised Classification

    OpenAIRE

    Kumaran, Krishnan; Papageorgiou, Dimitri; Chang, Yutong; Li, Minhan; Takáč, Martin

    2018-01-01

    Clustering and classification critically rely on distance metrics that provide meaningful comparisons between data points. We present mixed-integer optimization approaches to find optimal distance metrics that generalize the Mahalanobis metric extensively studied in the literature. Additionally, we generalize and improve upon leading methods by removing reliance on pre-designated "target neighbors," "triplets," and "similarity pairs." Another salient feature of our method is its ability to en...

  4. A method for assigning species into groups based on generalized Mahalanobis distance between habitat model coefficients

    Science.gov (United States)

    Williams, C.J.; Heglund, P.J.

    2009-01-01

    Habitat association models are commonly developed for individual animal species using generalized linear modeling methods such as logistic regression. We considered the issue of grouping species based on their habitat use so that management decisions can be based on sets of species rather than individual species. This research was motivated by a study of western landbirds in northern Idaho forests. The method we examined was to separately fit models to each species and to use a generalized Mahalanobis distance between coefficient vectors to create a distance matrix among species. Clustering methods were used to group species from the distance matrix, and multidimensional scaling methods were used to visualize the relations among species groups. Methods were also discussed for evaluating the sensitivity of the conclusions to outliers or influential data points. We illustrate these methods with data from the landbird study conducted in northern Idaho. Simulation results are presented to compare the success of this method to alternative methods using Euclidean distance between coefficient vectors and to methods that do not use habitat association models. These simulations demonstrate that our Mahalanobis-distance-based method was nearly always better than Euclidean-distance-based methods or methods not based on habitat association models. The methods used to develop candidate species groups are easily explained to other scientists and resource managers since they mainly rely on classical multivariate statistical methods. © 2008 Springer Science+Business Media, LLC.
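The core computation (a Mahalanobis distance between fitted coefficient vectors, then grouping species whose distances are small) can be sketched as below. The coefficient vectors, the covariance matrix, and the grouping threshold are hypothetical; the paper uses a generalized Mahalanobis distance and formal clustering rather than this simple thresholding.

```python
# Hypothetical example: each "species" has a fitted coefficient vector
# (intercept, slope) from a habitat model; s is an assumed shared covariance.
def mahalanobis_sq(b1, b2, s_inv):
    """Squared Mahalanobis distance d^T * S^{-1} * d for the 2x2 case."""
    d = [a - b for a, b in zip(b1, b2)]
    return (d[0] * (s_inv[0][0] * d[0] + s_inv[0][1] * d[1])
            + d[1] * (s_inv[1][0] * d[0] + s_inv[1][1] * d[1]))

coeffs = {"sp_a": [0.2, 1.1], "sp_b": [0.25, 1.0], "sp_c": [2.0, -0.8]}
s = [[0.04, 0.01], [0.01, 0.09]]                      # assumed covariance matrix
det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
s_inv = [[s[1][1] / det, -s[0][1] / det], [-s[1][0] / det, s[0][0] / det]]

names = list(coeffs)
dist = {(i, j): mahalanobis_sq(coeffs[i], coeffs[j], s_inv)
        for i in names for j in names if i < j}
# Species whose habitat responses are close form a candidate management group
groups = [{i, j} for (i, j), d2 in dist.items() if d2 < 4.0]
print(groups)
```

The distance matrix `dist` is exactly what a hierarchical clustering routine would consume; sp_a and sp_b respond similarly to habitat and end up grouped, while sp_c, with an opposite slope, stays separate.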

  5. Contact- and distance-based principal component analysis of protein dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Ernst, Matthias; Sittel, Florian; Stock, Gerhard, E-mail: stock@physik.uni-freiburg.de [Biomolecular Dynamics, Institute of Physics, Albert Ludwigs University, 79104 Freiburg (Germany)

    2015-12-28

    To interpret molecular dynamics simulations of complex systems, systematic dimensionality reduction methods such as principal component analysis (PCA) represent a well-established and popular approach. Apart from Cartesian coordinates, internal coordinates, e.g., backbone dihedral angles or various kinds of distances, may be used as input data in a PCA. Adopting two well-known model problems, folding of villin headpiece and the functional dynamics of BPTI, a systematic study of PCA using distance-based measures is presented which employs distances between C{sub α}-atoms as well as distances between inter-residue contacts including side chains. While this approach seems prohibitive for larger systems due to the quadratic scaling of the number of distances with the size of the molecule, it is shown that it is sufficient (and sometimes even better) to include only relatively few selected distances in the analysis. The quality of the PCA is assessed by considering the resolution of the resulting free energy landscape (to identify metastable conformational states and barriers) and the decay behavior of the corresponding autocorrelation functions (to test the time scale separation of the PCA). By comparing results obtained with distance-based, dihedral angle, and Cartesian coordinates, the study shows that the choice of input variables may drastically influence the outcome of a PCA.
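A minimal distance-based PCA in the spirit of this approach can be sketched as follows: build a feature vector of pairwise distances per frame, then diagonalize its covariance. The toy trajectory, with one atom jumping between two positions, stands in for an MD simulation with two metastable states; it is not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy trajectory: 500 frames, 5 "atoms" in 3D, fluctuating around two conformations
frames = rng.normal(size=(500, 5, 3)) * 0.1
frames[250:, 0, 0] += 2.0                      # second metastable state: atom 0 shifts

# Feature vector per frame: all pairwise inter-atom distances (10 for 5 atoms)
i, j = np.triu_indices(5, k=1)
dists = np.linalg.norm(frames[:, i, :] - frames[:, j, :], axis=2)   # shape (500, 10)

# PCA: diagonalize the covariance of the mean-free distance features
x = dists - dists.mean(axis=0)
cov = x.T @ x / len(x)
evals, evecs = np.linalg.eigh(cov)
order = np.argsort(evals)[::-1]
pc1 = x @ evecs[:, order[0]]                   # projection onto the first PC

# The first PC should separate the two conformational states
print(abs(pc1[:250].mean() - pc1[250:].mean()) > 1.0)
```

Because distances are internal coordinates, this projection is unaffected by overall rotation or translation of the molecule, which is one motivation the abstract gives for distance-based input over Cartesian coordinates.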

  6. Retrospective group fusion similarity search based on eROCE evaluation metric.

    Science.gov (United States)

    Avram, Sorin I; Crisan, Luminita; Bora, Alina; Pacureanu, Liliana M; Avram, Stefana; Kurunczi, Ludovic

    2013-03-01

    In this study, a simple evaluation metric, denoted as eROCE was proposed to measure the early enrichment of predictive methods. We demonstrated the superior robustness of eROCE compared to other known metrics throughout several active to inactive ratios ranging from 1:10 to 1:1000. Group fusion similarity search was investigated by varying 16 similarity coefficients, five molecular representations (binary and non-binary) and two group fusion rules using two reference structure set sizes. We used a dataset of 3478 actives and 43,938 inactive molecules and the enrichment was analyzed by means of eROCE. This retrospective study provides optimal similarity search parameters in the case of ALDH1A1 inhibitors. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. A Metric on Phylogenetic Tree Shapes.

    Science.gov (United States)

    Colijn, C; Plazzotta, G

    2018-01-01

    The shapes of evolutionary trees are influenced by the nature of the evolutionary process but comparisons of trees from different processes are hindered by the challenge of completely describing tree shape. We present a full characterization of the shapes of rooted branching trees in a form that lends itself to natural tree comparisons. We use this characterization to define a metric, in the sense of a true distance function, on tree shapes. The metric distinguishes trees from random models known to produce different tree shapes. It separates trees derived from tropical versus USA influenza A sequences, which reflect the differing epidemiology of tropical and seasonal flu. We describe several metrics based on the same core characterization, and illustrate how to extend the metric to incorporate trees' branch lengths or other features such as overall imbalance. Our approach allows us to construct addition and multiplication on trees, and to create a convex metric on tree shapes which formally allows computation of average tree shapes. © The Author(s) 2017. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.

  8. Landscape Classifications for Landscape Metrics-based Assessment of Urban Heat Island: A Comparative Study

    International Nuclear Information System (INIS)

    Zhao, X F; Deng, L; Wang, H N; Chen, F; Hua, L Z

    2014-01-01

    In recent years, some studies have been carried out on the landscape analysis of urban thermal patterns. With the prevalence of thermal landscape analysis, a key problem has come forth: how to classify the thermal landscape into thermal patches. Current research uses different methods of thermal landscape classification, such as the standard deviation method (SD) and the R method. To find out the differences, a comparative study was carried out in Xiamen using a 20-year winter time-serial Landsat image set. After the retrieval of land surface temperature (LST), the thermal landscape was classified using the two methods separately. Then landscape metrics, 6 at the class level and 14 at the landscape level, were calculated and analyzed using Fragstats 3.3. We found that: (1) at the class level, all the metrics with the SD method were evened out and did not show an obvious trend with the process of urbanization, while those with the R method did; (2) at the landscape level, 6 of the 14 metrics retained similar trends, 5 differed at local turning points of the curve, and 3 differed completely in curve shape; and (3) when examined with visual interpretation, the SD method tended to exaggerate urban heat island effects relative to the R method

  9. Energy dependent track structure parametrizations for protons and carbon ions based on nano-metric simulations

    International Nuclear Information System (INIS)

    Frauke, A.; Wilkens, J.J.; Villagrasa, C.; Rabus, H.

    2015-01-01

    The BioQuaRT project within the European Metrology Research Programme aims at correlating ion track structure characteristics with the biological effects of radiation, and develops measurement and simulation techniques for determining ion track structure on different length scales from about 2 nm to about 10 μm. Within this framework, we investigate methods to translate track-structure quantities derived on a nanometer scale to macroscopic dimensions. Input data sets were generated by simulations of ion tracks of protons and carbon ions in liquid water using the Geant4 Monte Carlo toolkit with the Geant4-DNA processes. Based on the energy transfer points - recorded with nanometer resolution - we investigated parametrizations of overall properties of ion track structure. Three different track structure parametrizations have been developed, using the distances to the 10 next-neighbouring ionizations, the radial energy distribution, and ionisation cluster size distributions. These parametrizations of nanometer-scale track structure build a basis for deriving biologically relevant mean values, which are essential in the clinical situation where each voxel is exposed to a mixed radiation field. (authors)

  10. Metrics for Performance Evaluation of Patient Exercises during Physical Therapy.

    Science.gov (United States)

    Vakanski, Aleksandar; Ferguson, Jake M; Lee, Stephen

    2017-06-01

    The article proposes a set of metrics for evaluation of patient performance in physical therapy exercises. A taxonomy is employed that classifies the metrics into quantitative and qualitative categories, based on the level of abstraction of the captured motion sequences. Further, the quantitative metrics are classified into model-less and model-based metrics, according to whether the evaluation employs the raw measurements of patient-performed motions or is based on a mathematical model of the motions. The reviewed metrics include root-mean-square distance, Kullback-Leibler divergence, log-likelihood, heuristic consistency, Fugl-Meyer Assessment, and similar measures. The metrics are evaluated on a set of five human motions captured with a Kinect sensor. The metrics can potentially be integrated into a system that employs machine learning for modelling and assessment of the consistency of patient performance in a home-based therapy setting. Automated performance evaluation can overcome the inherent subjectivity of human-performed therapy assessment, increase adherence to prescribed therapy plans, and reduce healthcare costs.

  11. An improved initialization center k-means clustering algorithm based on distance and density

    Science.gov (United States)

    Duan, Yanling; Liu, Qun; Xia, Shuyin

    2018-04-01

    To address the problem that the random initial cluster centers of the k-means algorithm make the clustering results sensitive to outlier data samples and unstable across repeated runs, an initialization method that selects centers with larger distance and higher density is proposed. The reciprocal of the weighted average distance is used to represent the sample density, and the data samples with larger distance and higher density are selected as the initial cluster centers to optimize the clustering results. A clustering evaluation method based on distance and density is then designed to verify the feasibility and practicality of the algorithm. Experimental results on UCI data sets show that the algorithm has a certain stability and practicality.
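    A minimal sketch of such a distance-and-density initialization in Python. The product score combining the two criteria is an assumption for illustration; the paper's exact weighting scheme may differ:

```python
import numpy as np

def density_distance_init(X, k):
    """Choose k initial centers combining high density (reciprocal of the
    average distance to all other samples) with large distance to the
    centers already chosen. The product score is a hypothetical choice."""
    n = len(X)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
    density = 1.0 / (d.sum(axis=1) / (n - 1) + 1e-12)
    centers = [int(np.argmax(density))]          # start from the densest sample
    while len(centers) < k:
        nearest = d[:, centers].min(axis=1)      # distance to closest chosen center
        score = nearest * density                # far from chosen centers AND dense
        score[centers] = -np.inf                 # never re-pick a center
        centers.append(int(np.argmax(score)))
    return X[centers]

# Two well-separated Gaussian blobs: the chosen centers should land in
# different blobs, unlike a purely random initialization.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.5, (20, 2)), rng.normal(5.0, 0.5, (20, 2))])
C = density_distance_init(X, 2)
```

    Because the densest samples sit inside clusters rather than among outliers, this kind of seeding tends to stabilize repeated k-means runs.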

  12. Implementation of a microcomputer based distance relay for parallel transmission lines

    International Nuclear Information System (INIS)

    Phadke, A.G.; Jihuang, L.

    1986-01-01

    Distance relaying for parallel transmission lines is a difficult application problem for conventional phase and ground distance relays. It is known that for cross-country faults involving dissimilar phases and ground, three-phase tripping may result. This paper summarizes a newly developed microcomputer-based relay which is capable of classifying cross-country faults correctly. The paper describes the principle of operation and the results of laboratory tests of this relay.

  13. A Web-Based Graphical Food Frequency Assessment System: Design, Development and Usability Metrics.

    Science.gov (United States)

    Franco, Rodrigo Zenun; Alawadhi, Balqees; Fallaize, Rosalind; Lovegrove, Julie A; Hwang, Faustina

    2017-05-08

    Food frequency questionnaires (FFQs) are well established in the nutrition field, but there remain important questions around how to develop online tools in a way that can facilitate wider uptake. Also, FFQ user acceptance and evaluation have not been investigated extensively. This paper presents a Web-based graphical food frequency assessment system that addresses challenges of reproducibility, scalability, mobile friendliness, security, and usability and also presents the utilization metrics and user feedback from a deployment study. The application design employs a single-page application Web architecture with back-end services (database, authentication, and authorization) provided by Google Firebase's free plan. Its design and responsiveness take advantage of the Bootstrap framework. The FFQ was deployed in Kuwait as part of the EatWellQ8 study during 2016. The EatWellQ8 FFQ contains 146 food items (including drinks). Participants were recruited in Kuwait without financial incentive. Completion time was based on browser timestamps and usability was measured using the System Usability Scale (SUS), scoring between 0 and 100. Products with a SUS higher than 70 are considered to be good. A total of 235 participants created accounts in the system, and 163 completed the FFQ. Of those 163 participants, 142 reported their gender (93 female, 49 male) and 144 reported their date of birth (mean age of 35 years, range from 18-65 years). The mean completion time for all FFQs (n=163), excluding periods of interruption, was 14.2 minutes (95% CI 13.3-15.1 minutes). Female participants (n=93) completed in 14.1 minutes (95% CI 12.9-15.3 minutes) and male participants (n=49) completed in 14.3 minutes (95% CI 12.6-15.9 minutes). Participants using laptops or desktops (n=69) completed the FFQ in an average of 13.9 minutes (95% CI 12.6-15.1 minutes) and participants using smartphones or tablets (n=91) completed in an average of 14.5 minutes (95% CI 13.2-15.8 minutes). 
The median SUS
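    For context, the SUS referenced in this record is computed from ten 1-5 Likert responses using the standard published scoring rule (this is the generic SUS formula, not code from the study):

```python
def sus_score(responses):
    """Standard SUS scoring: ten 1-5 Likert items. Odd-numbered items are
    positively worded and contribute (r - 1); even-numbered items are
    negatively worded and contribute (5 - r). The sum is scaled by 2.5,
    giving a score between 0 and 100."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5
```

    With this rule, the best possible answers yield 100 and all-neutral answers yield 50, which is why products scoring above 70 are considered good.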

  14. Encyclopedia of distances

    CERN Document Server

    Deza, Michel Marie

    2009-01-01

    Distance metrics and distances have become an essential tool in many areas of pure and applied Mathematics. This title offers both independent introductions and definitions, while at the same time making cross-referencing easy through hyperlink-like boldfaced references to original definitions.

  15. Knowledge-based prediction of plan quality metrics in intracranial stereotactic radiosurgery

    International Nuclear Information System (INIS)

    Shiraishi, Satomi; Moore, Kevin L.; Tan, Jun; Olsen, Lindsey A.

    2015-01-01

    Purpose: The objective of this work was to develop a comprehensive knowledge-based methodology for predicting achievable dose–volume histograms (DVHs) and highly precise DVH-based quality metrics (QMs) in stereotactic radiosurgery/radiotherapy (SRS/SRT) plans. Accurate QM estimation can identify suboptimal treatment plans and provide target optimization objectives to standardize and improve treatment planning. Methods: Correlating observed dose as it relates to the geometric relationship of organs-at-risk (OARs) to planning target volumes (PTVs) yields mathematical models to predict achievable DVHs. In SRS, DVH-based QMs such as brain V 10Gy (volume receiving 10 Gy or more), gradient measure (GM), and conformity index (CI) are used to evaluate plan quality. This study encompasses 223 linear accelerator-based SRS/SRT treatment plans (SRS plans) using volumetric-modulated arc therapy (VMAT), representing 95% of the institution’s VMAT radiosurgery load from the past four and a half years. Unfiltered models that use all available plans for the model training were built for each category with a stratification scheme based on target and OAR characteristics determined emergently through initial modeling process. Model predictive accuracy is measured by the mean and standard deviation of the difference between clinical and predicted QMs, δQM = QM clin − QM pred , and a coefficient of determination, R 2 . For categories with a large number of plans, refined models are constructed by automatic elimination of suspected suboptimal plans from the training set. Using the refined model as a presumed achievable standard, potentially suboptimal plans are identified. Predictions of QM improvement are validated via standardized replanning of 20 suspected suboptimal plans based on dosimetric predictions. The significance of the QM improvement is evaluated using the Wilcoxon signed rank test. Results: The most accurate predictions are obtained when plans are stratified based on

  16. Knowledge-based prediction of plan quality metrics in intracranial stereotactic radiosurgery

    Energy Technology Data Exchange (ETDEWEB)

    Shiraishi, Satomi; Moore, Kevin L., E-mail: kevinmoore@ucsd.edu [Department of Radiation Medicine and Applied Sciences, University of California, San Diego, La Jolla, California 92093 (United States); Tan, Jun [Department of Radiation Oncology, UT Southwestern Medical Center, Dallas, Texas 75490 (United States); Olsen, Lindsey A. [Department of Radiation Oncology, Washington University School of Medicine, St. Louis, Missouri 63110 (United States)

    2015-02-15

    Purpose: The objective of this work was to develop a comprehensive knowledge-based methodology for predicting achievable dose–volume histograms (DVHs) and highly precise DVH-based quality metrics (QMs) in stereotactic radiosurgery/radiotherapy (SRS/SRT) plans. Accurate QM estimation can identify suboptimal treatment plans and provide target optimization objectives to standardize and improve treatment planning. Methods: Correlating observed dose as it relates to the geometric relationship of organs-at-risk (OARs) to planning target volumes (PTVs) yields mathematical models to predict achievable DVHs. In SRS, DVH-based QMs such as brain V{sub 10Gy} (volume receiving 10 Gy or more), gradient measure (GM), and conformity index (CI) are used to evaluate plan quality. This study encompasses 223 linear accelerator-based SRS/SRT treatment plans (SRS plans) using volumetric-modulated arc therapy (VMAT), representing 95% of the institution’s VMAT radiosurgery load from the past four and a half years. Unfiltered models that use all available plans for the model training were built for each category with a stratification scheme based on target and OAR characteristics determined emergently through initial modeling process. Model predictive accuracy is measured by the mean and standard deviation of the difference between clinical and predicted QMs, δQM = QM{sub clin} − QM{sub pred}, and a coefficient of determination, R{sup 2}. For categories with a large number of plans, refined models are constructed by automatic elimination of suspected suboptimal plans from the training set. Using the refined model as a presumed achievable standard, potentially suboptimal plans are identified. Predictions of QM improvement are validated via standardized replanning of 20 suspected suboptimal plans based on dosimetric predictions. The significance of the QM improvement is evaluated using the Wilcoxon signed rank test. Results: The most accurate predictions are obtained when plans are

  17. Identification of robust statistical downscaling methods based on a comprehensive suite of performance metrics for South Korea

    Science.gov (United States)

    Eum, H. I.; Cannon, A. J.

    2015-12-01

    Climate models are a key tool for investigating the impacts of projected future climate conditions on regional hydrologic systems. However, there is a considerable mismatch in spatial resolution between GCMs and regional applications, in particular for regions characterized by complex terrain such as the Korean peninsula. A downscaling procedure is therefore essential for assessing regional impacts of climate change. Numerous statistical downscaling methods have been used, mainly because of their computational efficiency and simplicity. In this study, four statistical downscaling methods [Bias-Correction/Spatial Disaggregation (BCSD), Bias-Correction/Constructed Analogue (BCCA), Multivariate Adaptive Constructed Analogs (MACA), and Bias-Correction/Climate Imprint (BCCI)] are applied to downscale the latest Climate Forecast System Reanalysis data to stations for precipitation, maximum temperature, and minimum temperature over South Korea. Using a split-sampling scheme, all methods are calibrated with observational station data for the 19 years from 1973 to 1991 and tested on the recent 19 years from 1992 to 2010. To assess the skill of the downscaling methods, we construct a comprehensive suite of performance metrics that measure the ability to reproduce temporal correlation, distributions, spatial correlation, and extreme events. In addition, we employ the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) to identify robust statistical downscaling methods based on the performance metrics for each season. The results show that downscaling skill is considerably affected by the skill of CFSR, and all methods lead to large improvements in representing all performance metrics. According to the seasonal performance metrics evaluated, when TOPSIS is applied, MACA is identified as the most reliable and robust method for all variables and seasons. Note that this result is derived from CFSR output, which is regarded as near-perfect climate data in climate studies. Therefore, the

  18. The correction of vibration in frequency scanning interferometry based absolute distance measurement system for dynamic measurements

    Science.gov (United States)

    Lu, Cheng; Liu, Guodong; Liu, Bingguo; Chen, Fengdong; Zhuang, Zhitao; Xu, Xinke; Gan, Yu

    2015-10-01

    Absolute distance measurement systems are of significant interest in the field of metrology; they could improve the manufacturing efficiency and accuracy of large assemblies in fields such as aircraft construction, automotive engineering, and the production of modern windmill blades. Frequency scanning interferometry demonstrates noticeable advantages as an absolute distance measurement system: it has a high precision and does not depend on a cooperative target. In this paper, the influence of inevitable vibration in the frequency scanning interferometry based absolute distance measurement system is analyzed. The distance spectrum is broadened by the Doppler effect caused by vibration, which introduces a measurement error more than 10^3 times larger than the change in optical path difference. In order to decrease the influence of vibration, the changes of the optical path difference are monitored by a frequency-stabilized laser running parallel to the frequency scanning interferometry. Experiments have verified the effectiveness of this method.

  19. Weighted Evidence Combination Rule Based on Evidence Distance and Uncertainty Measure: An Application in Fault Diagnosis

    Directory of Open Access Journals (Sweden)

    Lei Chen

    2018-01-01

    Full Text Available Conflict management in Dempster-Shafer theory (D-S theory) is a hot topic in information fusion. In this paper, a novel weighted evidence combination rule based on evidence distance and uncertainty measure is proposed. The proposed approach consists of two steps. First, the weights are determined based on the evidence distance. Then, the weight values obtained in the first step are modified by taking the uncertainty measure into account. The proposed method can efficiently handle highly conflicting evidence with better convergence performance. A numerical example and an application based on sensor fusion in fault diagnosis are given to demonstrate the efficiency of the proposed method.
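    A minimal sketch of the first step, distance-based credibility weighting followed by Dempster combination, on a two-element frame. The uncertainty-measure modification of the second step is omitted, and the Jousselme-distance weighting shown here is the common scheme, not necessarily the paper's exact formulation:

```python
import numpy as np

# Frame of discernment {a, b}; a mass vector is [m({a}), m({b}), m({a,b})].
D = np.array([[1.0, 0.0, 0.5],
              [0.0, 1.0, 0.5],
              [0.5, 0.5, 1.0]])   # Jousselme similarity matrix for this frame

def jousselme(m1, m2):
    """Jousselme distance between two bodies of evidence."""
    diff = m1 - m2
    return np.sqrt(0.5 * diff @ D @ diff)

def dempster(m1, m2):
    """Dempster's combination rule on the {a, b} frame."""
    k = m1[0] * m2[1] + m1[1] * m2[0]            # conflicting mass
    a = m1[0] * m2[0] + m1[0] * m2[2] + m1[2] * m2[0]
    b = m1[1] * m2[1] + m1[1] * m2[2] + m1[2] * m2[1]
    theta = m1[2] * m2[2]
    return np.array([a, b, theta]) / (1.0 - k)

def weighted_combine(masses):
    """Weight each body of evidence by its similarity to the others,
    average the masses, then combine the average n-1 times."""
    n = len(masses)
    d = np.array([[jousselme(mi, mj) for mj in masses] for mi in masses])
    support = (1.0 - d).sum(axis=1) - 1.0        # total similarity to the others
    w = support / support.sum()                  # credibility weights
    avg = sum(wi * mi for wi, mi in zip(w, masses))
    fused = avg
    for _ in range(n - 1):
        fused = dempster(fused, avg)
    return fused

# One highly conflicting sensor (the second) among four.
masses = [np.array([0.9, 0.0, 0.1]), np.array([0.0, 0.9, 0.1]),
          np.array([0.8, 0.1, 0.1]), np.array([0.8, 0.1, 0.1])]
fused = weighted_combine(masses)
```

    The conflicting sensor receives a small credibility weight, so the fused result still strongly supports hypothesis a, which is the behaviour a naive Dempster combination of all four masses fails to deliver.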

  20. Metrics for Electronic-Nursing-Record-Based Narratives: Cross-sectional Analysis

    Science.gov (United States)

    Kim, Kidong; Jeong, Suyeon; Lee, Kyogu; Park, Hyeoun-Ae; Min, Yul Ha; Lee, Joo Yun; Kim, Yekyung; Yoo, Sooyoung; Doh, Gippeum

    2016-01-01

    Summary Objectives We aimed to determine the characteristics of quantitative metrics for nursing narratives documented in electronic nursing records and their association with hospital admission traits and diagnoses in a large data set not limited to specific patient events or hypotheses. Methods We collected 135,406,873 electronic, structured coded nursing narratives from 231,494 hospital admissions of patients discharged between 2008 and 2012 at a tertiary teaching institution that routinely uses an electronic health records system. The standardized number of nursing narratives (i.e., the total number of nursing narratives divided by the length of the hospital stay) was suggested to integrate the frequency and quantity of nursing documentation. Results The standardized number of nursing narratives was higher for patients aged 70 years (median = 30.2 narratives/day, interquartile range [IQR] = 24.0–39.4 narratives/day), long (8 days) hospital stays (median = 34.6 narratives/day, IQR = 27.2–43.5 narratives/day), and hospital deaths (median = 59.1 narratives/day, IQR = 47.0–74.8 narratives/day). The standardized number of narratives was higher in “pregnancy, childbirth, and puerperium” (median = 46.5, IQR = 39.0–54.7) and “diseases of the circulatory system” admissions (median = 35.7, IQR = 29.0–43.4). Conclusions Diverse hospital admissions can be consistently described with nursing-document-derived metrics for similar hospital admissions and diagnoses. Some areas of hospital admissions may have consistently increasing volumes of nursing documentation across years. Usability of electronic nursing document metrics for evaluating healthcare requires multiple aspects of hospital admissions to be considered. PMID:27901174

  1. Visibility-Based Goal Oriented Metrics and Application to Navigation and Path Planning Problems

    Science.gov (United States)

    2017-12-14

    Principal Investigator: Yen-Hsi Tsai (ytsai@math.utexas.edu). Report Date: 06-Dec-2017. The report develops visibility-based goal-oriented metrics with application to navigation and path planning problems and presents the error bounds that have been obtained.

  2. Feedback for reinforcement learning based brain-machine interfaces using confidence metrics

    Science.gov (United States)

    Prins, Noeline W.; Sanchez, Justin C.; Prasad, Abhishek

    2017-06-01

    Objective. For brain-machine interfaces (BMI) to be used in activities of daily living by paralyzed individuals, the BMI should be as autonomous as possible. One of the challenges is how the feedback is extracted and utilized in the BMI. Our long-term goal is to create autonomous BMIs that can utilize evaluative feedback from the brain to update the decoding algorithm and use it intelligently in order to adapt the decoder. In this study, we show how to extract the necessary evaluative feedback from a biologically realistic (synthetic) source, use both the quantity and the quality of the feedback, and how that feedback information can be incorporated into a reinforcement learning (RL) controller architecture to maximize its performance. Approach. Motivated by the perception-action-reward cycle (PARC) in the brain, which links reward to cognitive decision making and goal-directed behavior, we used a reward-based RL architecture named Actor-Critic RL as the model. Instead of using an error signal towards building an autonomous BMI, we envision using a reward signal from the nucleus accumbens (NAcc), which plays a key role in the linking of reward to motor behaviors. To deal with the complexity and non-stationarity of biological reward signals, we used a confidence metric to indicate the degree of feedback accuracy. This confidence was added to the Actor’s weight update equation in the RL controller architecture. If the confidence was high (>0.2), the BMI decoder used this feedback to update its parameters. However, when the confidence was low, the BMI decoder ignored the feedback and did not update its parameters. The range between high confidence and low confidence was termed the ‘ambiguous’ region. When the feedback was within this region, the BMI decoder updated its weights at a lower rate than when fully confident, with the rate determined by the confidence value. We used two biologically realistic models to generate synthetic data for MI (Izhikevich
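    The confidence-gated update described above can be sketched as follows. The thresholds, the scaling by confidence in the ambiguous region, and the reward-modulated update form are all assumptions for illustration, not the paper's exact equations:

```python
import numpy as np

def gated_update(w, grad, reward, confidence, lr=0.1, low=0.2, high=0.8):
    """Confidence-gated actor weight update (illustrative sketch).

    confidence <= low  : feedback ignored, weights unchanged
    confidence >= high : full reward-modulated update
    in between         : 'ambiguous' region, update scaled by confidence
    """
    if confidence <= low:
        return w
    scale = 1.0 if confidence >= high else confidence
    return w + lr * scale * reward * grad

w = np.zeros(3)     # actor weights
g = np.ones(3)      # hypothetical gradient of the actor's objective
```

    The gate keeps unreliable feedback from corrupting the decoder while still letting partially trusted feedback contribute at a reduced rate.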

  3. Curriculum Guidelines for a Distance Education Course in Urban Agriculture Based on an Eclectic Model.

    Science.gov (United States)

    Gaum, Wilma G.; van Rooyen, Hugo G.

    1997-01-01

    Describes research to develop curriculum guidelines for a distance education course in urban agriculture. The course, designed to train the teacher, is based on an eclectic curriculum design model. The course is aimed at the socioeconomic empowerment of urban farmers and is based on sustainable ecological-agricultural principles, an…

  4. Bounded distance decoding of linear error-correcting codes with Gröbner bases

    NARCIS (Netherlands)

    Bulygin, S.; Pellikaan, G.R.

    2009-01-01

    The problem of bounded distance decoding of arbitrary linear codes using Gröbner bases is addressed. A new method is proposed, which is based on reducing an initial decoding problem to solving a certain system of polynomial equations over a finite field. The peculiarity of this system is that, when

  5. Metrication manual

    International Nuclear Information System (INIS)

    Harper, A.F.A.; Digby, R.B.; Thong, S.P.; Lacey, F.

    1978-04-01

    In April 1978 a meeting of senior metrication officers convened by the Commonwealth Science Council of the Commonwealth Secretariat, was held in London. The participants were drawn from Australia, Bangladesh, Britain, Canada, Ghana, Guyana, India, Jamaica, Papua New Guinea, Solomon Islands and Trinidad and Tobago. Among other things, the meeting resolved to develop a set of guidelines to assist countries to change to SI and to compile such guidelines in the form of a working manual

  6. Teaching Reading Comprehension in English in a Distance Web-Based Course: New Roles for Teachers

    Directory of Open Access Journals (Sweden)

    Jorge Hugo Muñoz Marín

    2010-10-01

    Full Text Available Distance web-based learning is a popular strategy in ELT in Colombia. Despite the growth of such experiences, there are very few studies regarding teachers' participation in these courses. This paper reports preliminary findings of an on-going study aimed at exploring the roles that a teacher plays in an EFL reading comprehension distance web-based course. Data analysis suggests that teachers play new roles: solving technical problems, providing immediate feedback, interacting with students in a non-traditional way, providing time management advice, and acting as a constant motivator. The authors conclude that EFL teachers require training for these new teaching roles and for the analysis of web-based distance learning environments as an option under permanent construction that requires their active participation.

  7. Moment-based metrics for global sensitivity analysis of hydrological systems

    Directory of Open Access Journals (Sweden)

    A. Dell'Oca

    2017-12-01

    Full Text Available We propose new metrics to assist global sensitivity analysis, GSA, of hydrological and Earth systems. Our approach allows assessing the impact of uncertain parameters on the main features of the probability density function, pdf, of a target model output, y. These include the expected value of y, the spread around the mean, and the degree of symmetry and tailedness of the pdf of y. Since reliable assessment of higher-order statistical moments can be computationally demanding, we couple our GSA approach with a surrogate model, approximating the full model response at a reduced computational cost. Here, we consider the generalized polynomial chaos expansion (gPCE), other model reduction techniques being fully compatible with our theoretical framework. We demonstrate our approach through three test cases, including an analytical benchmark, a simplified scenario mimicking pumping in a coastal aquifer, and a laboratory-scale conservative transport experiment. Our results allow ascertaining which parameters can impact some moments of the model output pdf while being uninfluential to others. We also investigate the error associated with the evaluation of our sensitivity metrics when replacing the original system model with a gPCE. Our results indicate that the construction of a surrogate model with an increasing level of accuracy might be required depending on the statistical moment considered in the GSA. The approach is fully compatible with (and can assist the development of) analysis techniques employed in the context of reduction of model complexity, model calibration, design of experiments, uncertainty quantification, and risk assessment.
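    The core idea, measuring how much conditioning on each parameter shifts individual moments of the output pdf, can be illustrated with a crude Monte Carlo binning estimator. The paper couples its metrics with a gPCE surrogate; the estimator below is a simplified stand-in, not the authors' formulation:

```python
import numpy as np

def moment_sensitivities(f, sampler, n=2000, bins=20, seed=0):
    """For each input parameter, estimate how strongly fixing that parameter
    shifts the mean and the variance of y = f(X) away from their
    unconditional values (a crude binning approximation of moment-based
    sensitivity indices)."""
    rng = np.random.default_rng(seed)
    X = sampler(rng, n)                          # (n, d) parameter samples
    y = f(X)
    mu, var = y.mean(), y.var()
    indices = []
    for j in range(X.shape[1]):
        order = np.argsort(X[:, j])              # sort samples by parameter j
        groups = np.array_split(y[order], bins)  # ~equal-count conditional bins
        cond_mu = np.array([g.mean() for g in groups])
        cond_var = np.array([g.var() for g in groups])
        indices.append((np.mean(np.abs(cond_mu - mu)) / (abs(mu) + 1e-12),
                        np.mean(np.abs(cond_var - var)) / (var + 1e-12)))
    return indices

# Toy model: y depends strongly on x0 and only weakly on x1, so the
# mean-based index for x0 should dominate.
f = lambda X: X[:, 0] + 0.1 * X[:, 1]
sampler = lambda rng, n: rng.uniform(0.0, 1.0, (n, 2))
s = moment_sensitivities(f, sampler)
```

    A parameter can score high on one moment and low on another, which is exactly the distinction these moment-wise metrics are designed to expose.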

  8. Benchmarking Distance Control and Virtual Drilling for Lateral Skull Base Surgery.

    Science.gov (United States)

    Voormolen, Eduard H J; Diederen, Sander; van Stralen, Marijn; Woerdeman, Peter A; Noordmans, Herke Jan; Viergever, Max A; Regli, Luca; Robe, Pierre A; Berkelbach van der Sprenkel, Jan Willem

    2018-01-01

    Novel audiovisual feedback methods were developed to improve image guidance during skull base surgery by providing audiovisual warnings when the drill tip enters a protective perimeter set at a distance around anatomic structures ("distance control") and visualizing bone drilling ("virtual drilling"). To benchmark the drill damage risk reduction provided by distance control, to quantify the accuracy of virtual drilling, and to investigate whether the proposed feedback methods are clinically feasible. In a simulated surgical scenario using human cadavers, 12 unexperienced users (medical students) drilled 12 mastoidectomies. Users were divided into a control group using standard image guidance and 3 groups using distance control with protective perimeters of 1, 2, or 3 mm. Damage to critical structures (sigmoid sinus, semicircular canals, facial nerve) was assessed. Neurosurgeons performed another 6 mastoidectomy/trans-labyrinthine and retro-labyrinthine approaches. Virtual errors as compared with real postoperative drill cavities were calculated. In a clinical setting, 3 patients received lateral skull base surgery with the proposed feedback methods. Users drilling with distance control protective perimeters of 3 mm did not damage structures, whereas the groups using smaller protective perimeters and the control group injured structures. Virtual drilling maximum cavity underestimations and overestimations were 2.8 ± 0.1 and 3.3 ± 0.4 mm, respectively. Feedback methods functioned properly in the clinical setting. Distance control reduced the risks of drill damage proportional to the protective perimeter distance. Errors in virtual drilling reflect spatial errors of the image guidance system. These feedback methods are clinically feasible. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Techniques and Methods to Improve the Audit Process of the Distributed Informatics Systems Based on Metric System

    Directory of Open Access Journals (Sweden)

    Marius POPA

    2011-01-01

    Full Text Available The paper presents how an assessment system is implemented to evaluate the IT&C audit process quality. Theoretical and practical issues are presented together with a brief presentation of the metrics and indicators developed in previous research. The implementation process of an indicator system is highlighted and linked to specifications stated in international standards regarding the measurement process. Also, the effects of an assessment system on the IT&C audit process quality are emphasized to demonstrate the importance of such an assessment system. Improving the audit process quality is an iterative process consisting of repetitive improvements based on objective measures established on analytical models of the indicators.

  10. Automated segmentation of ultrasonic breast lesions using statistical texture classification and active contour based on probability distance.

    Science.gov (United States)

    Liu, Bo; Cheng, H D; Huang, Jianhua; Tian, Jiawei; Liu, Jiafeng; Tang, Xianglong

    2009-08-01

    Because of its complicated structure, low signal/noise ratio, low contrast and blurry boundaries, fully automated segmentation of a breast ultrasound (BUS) image is a difficult task. In this paper, a novel segmentation method for BUS images without human intervention is proposed. Unlike most published approaches, the proposed method handles the segmentation problem by using a two-step strategy: ROI generation and ROI segmentation. First, a well-trained texture classifier categorizes the tissues into different classes, and the background knowledge rules are used for selecting the regions of interest (ROIs) from them. Second, a novel probability distance-based active contour model is applied for segmenting the ROIs and finding the accurate positions of the breast tumors. The active contour model combines both global statistical information and local edge information, using a level set approach. The proposed segmentation method was performed on 103 BUS images (48 benign and 55 malignant). To validate the performance, the results were compared with the corresponding tumor regions marked by an experienced radiologist. Three error metrics, true-positive ratio (TP), false-negative ratio (FN) and false-positive ratio (FP) were used for measuring the performance of the proposed method. The final results (TP = 91.31%, FN = 8.69% and FP = 7.26%) demonstrate that the proposed method can segment BUS images efficiently, quickly and automatically.
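    The three error metrics used above can be computed directly from binary masks. A minimal sketch, assuming each ratio is normalized by the area of the radiologist-marked reference region (the paper's exact normalization for FP may differ):

```python
import numpy as np

def region_error_metrics(pred, truth):
    """True-positive, false-negative and false-positive ratios for a binary
    segmentation, each relative to the reference (truth) region area."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    area = truth.sum()
    tp = (pred & truth).sum() / area     # correctly covered tumor area
    fn = (~pred & truth).sum() / area    # missed tumor area
    fp = (pred & ~truth).sum() / area    # spurious area outside the tumor
    return tp, fn, fp

# Toy masks: the predicted square is the true square shifted by one pixel.
truth = np.zeros((10, 10), dtype=int); truth[2:8, 2:8] = 1
pred = np.zeros((10, 10), dtype=int);  pred[3:9, 3:9] = 1
tp, fn, fp = region_error_metrics(pred, truth)
```

    Note that with this normalization TP and FN always sum to one, so the pair (TP, FP) already summarizes the segmentation quality.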

  11. Gap-metric-based robustness analysis of nonlinear systems with full and partial feedback linearisation

    Science.gov (United States)

    Al-Gburi, A.; Freeman, C. T.; French, M. C.

    2018-06-01

    This paper uses gap metric analysis to derive robustness and performance margins for feedback linearising controllers. Distinct from previous robustness analysis, it incorporates the case of output unstructured uncertainties, and is shown to yield general stability conditions which can be applied to both stable and unstable plants. It then expands on existing feedback linearising control schemes by introducing a more general robust feedback linearising control design which classifies the system nonlinearity into stable and unstable components and cancels only the unstable plant nonlinearities. This is done in order to preserve the stabilising action of the inherently stabilising nonlinearities. Robustness and performance margins are derived for this control scheme, and are expressed in terms of bounds on the plant nonlinearities and the accuracy of the cancellation of the unstable plant nonlinearity by the controller. Case studies then confirm reduced conservatism compared with standard methods.

  12. Multi-linear model set design based on the nonlinearity measure and H-gap metric.

    Science.gov (United States)

    Shaghaghi, Davood; Fatehi, Alireza; Khaki-Sedigh, Ali

    2017-05-01

    This paper proposes a model bank selection method for a large class of nonlinear systems with wide operating ranges. In particular, nonlinearity measure and H-gap metric are used to provide an effective algorithm to design a model bank for the system. Then, the proposed model bank is accompanied with model predictive controllers to design a high performance advanced process controller. The advantage of this method is the reduction of excessive switch between models and also decrement of the computational complexity in the controller bank that can lead to performance improvement of the control system. The effectiveness of the method is verified by simulations as well as experimental studies on a pH neutralization laboratory apparatus which confirms the efficiency of the proposed algorithm. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  13. Representing distance, consuming distance

    DEFF Research Database (Denmark)

    Larsen, Gunvor Riber

    Title: Representing Distance, Consuming Distance Abstract: Distance is a condition for corporeal and virtual mobilities, for desired and actual travel, but yet it has received relatively little attention as a theoretical entity in its own right. Understandings of and assumptions about distance...... are being consumed in the contemporary society, in the same way as places, media, cultures and status are being consumed (Urry 1995, Featherstone 2007). An exploration of distance and its representations through contemporary consumption theory could expose what role distance plays in forming...

  14. KM-FCM: A fuzzy clustering optimization algorithm based on Mahalanobis distance

    Directory of Open Access Journals (Sweden)

    Zhiwen ZU

    2018-04-01

    Full Text Available The traditional fuzzy clustering algorithm uses the Euclidean distance as its similarity criterion, which is disadvantageous for multidimensional data processing. To address this, the Mahalanobis distance is used instead of the traditional Euclidean distance, and the optimization of fuzzy clustering based on the Mahalanobis distance is studied to enhance the clustering effect and ability. By initializing the cluster means with a heuristic search combined with the k-means algorithm, and by employing a validity function that automatically adjusts the optimal number of clusters, an optimization algorithm, KM-FCM, is proposed. The new algorithm is compared with the FCM, FCM-M and M-FCM algorithms on three standard data sets. The experimental results show that the KM-FCM algorithm is effective: it has higher clustering accuracy than FCM, FCM-M and M-FCM, clusters high-dimensional data well, has a global optimization effect, and does not require the number of clusters to be set in advance. The new algorithm provides a reference for the optimization of fuzzy clustering algorithms based on the Mahalanobis distance.
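
As a minimal illustration of why the abstract prefers the Mahalanobis distance over the Euclidean one for multidimensional, correlated data, the sketch below compares the two measures on hypothetical 2-D data (not the paper's data sets):

```python
import numpy as np

def mahalanobis(x, y, cov_inv):
    """Mahalanobis distance between vectors x and y given an inverse covariance."""
    d = x - y
    return float(np.sqrt(d @ cov_inv @ d))

# Toy 2-D data with strongly correlated features.
rng = np.random.default_rng(0)
data = rng.multivariate_normal([0, 0], [[4.0, 3.0], [3.0, 4.0]], size=500)
cov_inv = np.linalg.inv(np.cov(data, rowvar=False))

x, y = np.array([0.0, 0.0]), np.array([2.0, 2.0])
d_euc = float(np.linalg.norm(x - y))
d_mah = mahalanobis(x, y, cov_inv)
# Along the correlated (high-variance) direction the Mahalanobis distance is
# much smaller than the Euclidean one, since it accounts for feature scales.
print(d_euc, d_mah)
```

Mahalanobis-based FCM variants such as those compared in the abstract replace the Euclidean norm in the clustering objective with exactly this kind of quadratic form.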

  15. Research on Single Base-Station Distance Estimation Algorithm in Quasi-GPS Ultrasonic Location System

    International Nuclear Information System (INIS)

    Cheng, X C; Su, S J; Wang, Y K; Du, J B

    2006-01-01

    In order to identify each base-station in a quasi-GPS ultrasonic location system, a unique pseudo-random code is assigned to each base-station. This article primarily studies the distance estimation problem between an Autonomous Guide Vehicle (AGV) and a single base-station, and then establishes the ultrasonic spread-spectrum distance measurement Time Delay Estimation (TDE) model. Based on this model, an envelope correlation fast TDE algorithm based on the FFT is presented and analyzed. Experiments show that when the m-sequence used in the received signal is the same as that of the reference signal, a sharp peak appears in their envelope correlation function after processing by the above algorithm; otherwise, there is no prominent correlation value. Thus, the AGV can identify each base-station easily.
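
The FFT-based correlation step can be sketched generically as follows. This is a plain cross-correlation delay estimator with a hypothetical pseudo-random code, not the paper's exact envelope-correlation algorithm:

```python
import numpy as np

def estimate_delay(received, reference):
    """Estimate the delay (in samples) of `reference` inside `received`
    via FFT-based cross-correlation (a generic sketch)."""
    n = len(received) + len(reference) - 1
    nfft = 1 << (n - 1).bit_length()           # zero-pad to a power of two
    R = np.fft.rfft(received, nfft)
    S = np.fft.rfft(reference, nfft)
    corr = np.fft.irfft(R * np.conj(S), nfft)  # cross-correlation via FFT
    return int(np.argmax(corr[:len(received)]))

# Hypothetical PN-code-like reference and a delayed, noisy copy of it.
rng = np.random.default_rng(1)
ref = rng.choice([-1.0, 1.0], size=127)        # stand-in for an m-sequence
delay = 40
rx = np.zeros(512)
rx[delay:delay + len(ref)] = ref
rx += 0.2 * rng.standard_normal(rx.size)

print(estimate_delay(rx, ref))  # → 40
```

When the wrong code is correlated against the received signal, the peak collapses toward the noise floor, which is what lets the AGV discriminate between base-stations.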

  16. Research on Single Base-Station Distance Estimation Algorithm in Quasi-GPS Ultrasonic Location System

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, X C; Su, S J; Wang, Y K; Du, J B [Instrument Department, College of Mechatronics Engineering and Automation, National University of Defense Technology, ChangSha, Hunan, 410073 (China)

    2006-10-15

    In order to identify each base-station in a quasi-GPS ultrasonic location system, a unique pseudo-random code is assigned to each base-station. This article primarily studies the distance estimation problem between an Autonomous Guide Vehicle (AGV) and a single base-station, and then establishes the ultrasonic spread-spectrum distance measurement Time Delay Estimation (TDE) model. Based on this model, an envelope correlation fast TDE algorithm based on the FFT is presented and analyzed. Experiments show that when the m-sequence used in the received signal is the same as that of the reference signal, a sharp peak appears in their envelope correlation function after processing by the above algorithm; otherwise, there is no prominent correlation value. Thus, the AGV can identify each base-station easily.

  17. Simulation of Distance Relay for Load Encroachment Alleviation with Agent Based Supervision of Zone3

    Directory of Open Access Journals (Sweden)

    Mohamed Badr

    2017-03-01

    Full Text Available Cascaded tripping of power lines due to mal-operation of zone-3 distance relays has been one of the main causes of many previous blackouts worldwide. Encroachment of load into the zone-3 characteristic during stressed system operating conditions is a basic factor in such mal-operation of the relays. By improving the operation of zone-3, it is possible to prevent mal-operations so that cascaded line tripping can be avoided. To properly study the behavior of a distance relay during faults and the load encroachment phenomenon, a model of the distance relay must be built, so in this paper a modeling study of a distance relay is implemented using the MATLAB/Simulink program. This model is distinguished from previous models in that it examines the third zone of the distance relay in detail. Many cases are simulated with varying line loading and fault location to verify the capability of the relay to detect faults, and thus the maximum loadability limit of the distance relay is obtained. In order to prevent cascading events caused by hidden failures in zone-3 relays, agent-based relay architectures have been suggested in the recent past. In such architectures, each zone-3 relay contains agents that communicate with agents at various relevant relays in order to distinguish a real zone-3 event from a temporary overload. In this paper, a local master agent is consulted by all zone-3 agents before a tripping decision is made. The master agent maintains a rule base which is updated based on the local topology of the network and real-time monitoring of the status of other relays and circuit breakers. The Cisco Packet Tracer program is used to run communication network simulations. The simulation results indicate that the time needed to send and receive a packet data unit (PDU) message from one relay to another satisfies the communication requirements of the proposed scheme over fiber media.

  18. Knowledge Based Artificial Augmentation Intelligence Technology: Next Step in Academic Instructional Tools for Distance Learning

    Science.gov (United States)

    Crowe, Dale; LaPierre, Martin; Kebritchi, Mansureh

    2017-01-01

    With augmented intelligence/knowledge based system (KBS) it is now possible to develop distance learning applications to support both curriculum and administrative tasks. Instructional designers and information technology (IT) professionals are now moving from the programmable systems era that started in the 1950s to the cognitive computing era.…

  19. Decoding and finding the minimum distance with Gröbner bases : history and new insights

    NARCIS (Netherlands)

    Bulygin, S.; Pellikaan, G.R.; Woungang, I.; Misra, S.; Misra, S.C.

    2010-01-01

    In this chapter, we discuss decoding techniques and finding the minimum distance of linear codes with the use of Gröbner bases. First, we give a historical overview of decoding cyclic codes via solving systems of polynomial equations over finite fields. In particular, we mention papers of Cooper, …

  20. MOND rotation curves for spiral galaxies with Cepheid-based distances

    NARCIS (Netherlands)

    Bottema, R; Pestana, JLG; Rothberg, B; Sanders, RH

    2002-01-01

    Rotation curves for four spiral galaxies with recently determined Cepheid-based distances are reconsidered in terms of modified Newtonian dynamics (MOND). For two of the objects, NGC 2403 and NGC 7331, the rotation curves predicted by MOND are compatible with the observed curves when these galaxies

  1. Distance-Based Image Classification: Generalizing to New Classes at Near Zero Cost

    NARCIS (Netherlands)

    Mensink, T.; Verbeek, J.; Perronnin, F.; Csurka, G.

    2013-01-01

    We study large-scale image classification methods that can incorporate new classes and training images continuously over time at negligible cost. To this end, we consider two distance-based classifiers, the k-nearest neighbor (k-NN) and nearest class mean (NCM) classifiers, and introduce a new
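
The nearest class mean (NCM) classifier named in the record is simple enough to sketch; the toy data below are hypothetical. Its appeal for adding classes at near zero cost is visible directly in the code: enrolling a new class only requires computing one new mean.

```python
import numpy as np

def ncm_fit(X, y):
    """Nearest class mean (NCM): store one mean vector per class."""
    classes = np.unique(y)
    means = np.stack([X[y == c].mean(axis=0) for c in classes])
    return classes, means

def ncm_predict(X, classes, means):
    # Assign each sample to the class with the closest mean (Euclidean here;
    # the cited work learns a metric, which this sketch omits).
    d = np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2)
    return classes[d.argmin(axis=1)]

# Toy data: two well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
classes, means = ncm_fit(X, y)
preds = ncm_predict(X, classes, means)
print((preds == y).mean())  # near-perfect on separable blobs
```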

  2. A distance weighted-based approach for self-organized aggregation in robot swarms

    KAUST Repository

    Khaldi, Belkacem; Harrou, Fouzi; Cherif, Foudil; Sun, Ying

    2017-01-01

    topology to keep the robots together. A distance-weighted function based on a Smoothed Particle Hydrodynamic (SPH) interpolation approach is used as a key factor to identify the K-Nearest neighbors taken into account when aggregating the robots. The intra

  3. Preparedness of NGO Health Service Providers in Bangladesh about Distance Based Learning

    Directory of Open Access Journals (Sweden)

    AKM ALAMGIR

    2006-07-01

    Full Text Available This cross-sectional survey was conducted countrywide from 15 January to 01 March 2004 to explore the potential of health care service providers (physicians, nurses, paramedics, etc.) to use distance-based learning materials. Face-to-face in-depth interviews were conducted with 99 randomly selected direct service providers, 45 mid-level clinic managers/physicians and 06 administrators or policy planners. A quasi-open questionnaire was developed for the three different levels. A pre-trained interviewer team assisted data collection at the field level. The whole procedure was stringently monitored for completeness and consistency to ensure data quality. SPSS software was used to process and analyze both univariate and multivariate multiple responses. Identified training needs were STD/HIV, tuberculosis updates, family planning, treatment of locally endemic diseases, behavioral change communication & marketing, and quality management systems for managers. About 76.7% of clinic managers and 89.1% of service providers had only primary information about distance-based learning, yet showed interest. About 51.5% desired monthly, 20.6% biweekly and 26.8% bimonthly circulation of the distance-based study materials. About 35.1% expected print materials with regular facilitators, while 58.8% demanded stand-by facilitators. The study suggested wide acceptance of distance-based learning methods as a supplement to continuing medical education among the countrywide health service providers.

  4. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    Science.gov (United States)

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-07

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.

  5. INTERACTION BETWEEN CIVIL AVIATION ENTERPRISES AND UNIVERSITIES BASED ON DISTANCE EDUCATION TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    K. S. Ermakov

    2015-01-01

    Full Text Available Distance education based on modern information technology serves as a tool for interaction between universities and civil aviation enterprises. Introducing the real needs of civil aviation into the learning process enables airlines to use the scientific potential of educational institutions for the successful implementation of research aimed at solving urgent problems.

  6. Prioritizing Urban Habitats for Connectivity Conservation: Integrating Centrality and Ecological Metrics.

    Science.gov (United States)

    Poodat, Fatemeh; Arrowsmith, Colin; Fraser, David; Gordon, Ascelin

    2015-09-01

    Connectivity among fragmented areas of habitat has long been acknowledged as important for the viability of biological conservation, especially within highly modified landscapes. Identifying important habitat patches in ecological connectivity is a priority for many conservation strategies, and the application of 'graph theory' has been shown to provide useful information on connectivity. Despite the large number of metrics for connectivity derived from graph theory, only a small number have been compared in terms of the importance they assign to nodes in a network. This paper presents a study that aims to define a new set of metrics and compares these with traditional graph-based metrics used in the prioritization of habitat patches for ecological connectivity. The metrics measured consist of "topological" metrics, "ecological" metrics, and "integrated" metrics; integrated metrics are a combination of topological and ecological metrics. Eight metrics were applied to the habitat network for the fat-tailed dunnart within Greater Melbourne, Australia. A non-directional network was developed in which nodes were linked to adjacent nodes. These links were then weighted by the effective distance between patches. By applying each of the eight metrics for the study network, nodes were ranked according to their contribution to the overall network connectivity. The structured comparison revealed the similarity and differences in the way the habitat for the fat-tailed dunnart was ranked based on different classes of metrics. Due to the differences in the way the metrics operate, a suitable metric should be chosen that best meets the objectives established by the decision maker.
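
A minimal sketch of ranking nodes with one topological metric on a distance-weighted habitat graph. The patch network and effective distances below are hypothetical, and weighted degree is only one of the many graph-theoretic metrics such studies compare:

```python
# Hypothetical habitat network: nodes are patches, edge weights are
# effective distances between adjacent patches (smaller = better connected).
edges = {
    ("A", "B"): 2.0, ("B", "C"): 1.0, ("C", "D"): 4.0,
    ("B", "D"): 3.0, ("D", "E"): 1.5,
}

def weighted_degree_ranking(edges):
    """Score each patch by the summed inverse effective distance of its
    links, then rank patches from most to least connected."""
    score = {}
    for (u, v), d in edges.items():
        score[u] = score.get(u, 0.0) + 1.0 / d
        score[v] = score.get(v, 0.0) + 1.0 / d
    return sorted(score, key=score.get, reverse=True)

print(weighted_degree_ranking(edges))  # hub patch "B" ranks first
```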

  7. Impact of distance on the network management capability of the home base firm

    DEFF Research Database (Denmark)

    Mykhaylenko, Alona; Wæhrens, Brian Vejrum; Slepniov, Dmitrij

    For many globally dispersed organizations the home base (HB) is historically the locus of integrative, coordinating and innovating efforts, important for the overall performance. The growing concerns about the offshoring strategies posing threats to the capabilities of the HB draw attention to how...... a HB can continuously sustain its centrality. The well-known challenges of distance in the distributed working arrangements may be regarded as a major threat to the network management capabilities (NMCs) of the HB. Therefore, this paper investigates what role does distance between the HB and its...

  8. Exploring Education Major Focused Adult Learners' Perspectives and Practices of Web-Based Distance Education in Sixteen Universities

    Science.gov (United States)

    Zhang, Jing

    2009-01-01

    Distance education is not a new concept for all kinds of learners in the modern societies. Many researchers have studied traditional distance education programs for adult learners in the past, but little research has been done on Web-based distance education (WBDE) for adult learners. There are also many popular online universities in the U.S. or…

  9. Accurate and robust phylogeny estimation based on profile distances: a study of the Chlorophyceae (Chlorophyta)

    Directory of Open Access Journals (Sweden)

    Rahmann Sven

    2004-06-01

    Full Text Available Abstract Background In phylogenetic analysis we face the problem that several subclade topologies are known or easily inferred and well supported by bootstrap analysis, but basal branching patterns cannot be unambiguously estimated by the usual methods (maximum parsimony (MP), neighbor-joining (NJ), or maximum likelihood (ML)), nor are they well supported. We represent each subclade by a sequence profile and estimate evolutionary distances between profiles to obtain a matrix of distances between subclades. Results Our estimator of profile distances generalizes the maximum likelihood estimator of sequence distances. The basal branching pattern can be estimated by any distance-based method, such as neighbor-joining. Our method (profile neighbor-joining, PNJ) then inherits the accuracy and robustness of profiles and the time efficiency of neighbor-joining. Conclusions Phylogenetic analysis of Chlorophyceae with traditional methods (MP, NJ, ML and MrBayes) reveals seven well supported subclades, but the methods disagree on the basal branching pattern. The tree reconstructed by our method is better supported and can be confirmed by known morphological characters. Moreover the accuracy is significantly improved as shown by parametric bootstrap.
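
The "maximum likelihood estimator of sequence distances" that profile distances generalize can be illustrated with its simplest instance, the Jukes-Cantor model (a sketch only; the paper's profile estimator is more involved):

```python
import math

def jukes_cantor_distance(seq1, seq2):
    """ML estimate of the evolutionary distance between two aligned
    sequences under the Jukes-Cantor model: d = -(3/4) ln(1 - 4p/3),
    where p is the observed mismatch fraction."""
    assert len(seq1) == len(seq2)
    p = sum(a != b for a, b in zip(seq1, seq2)) / len(seq1)
    if p >= 0.75:               # saturation: distance undefined
        return math.inf
    return -0.75 * math.log(1.0 - 4.0 * p / 3.0)

# One mismatch in ten aligned sites (hypothetical sequences).
print(jukes_cantor_distance("ACGTACGTAC", "ACGTACGAAC"))
```

Note how the corrected distance (about 0.107) exceeds the raw mismatch fraction (0.1): the model accounts for unobserved multiple substitutions at the same site.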

  10. Horses for courses: a DNA-based test for race distance aptitude in thoroughbred racehorses.

    Science.gov (United States)

    Hill, Emmeline W; Ryan, Donal P; MacHugh, David E

    2012-12-01

    Variation at the myostatin (MSTN) gene locus has been shown to influence racing phenotypes in Thoroughbred horses, and in particular, early skeletal muscle development and the aptitude for racing at short distances. Specifically, a single nucleotide polymorphism (SNP) in the first intron of MSTN (g.66493737C/T) is highly predictive of best race distance among Flat racing Thoroughbreds: homozygous C/C horses are best suited to short distance races, heterozygous C/T horses are best suited to middle distance races, and homozygous T/T horses are best suited to longer distance races. Patent applications for this gene marker association, and other linked markers, have been filed. The information contained within the patent applications is exclusively licensed to the commercial biotechnology company Equinome Ltd, which provides a DNA-based test to the international Thoroughbred horse racing and breeding industry. The application of this information in the industry enables informed decision making in breeding and racing and can be used to assist selection to accelerate the rate of change of genetic types among distinct populations (Case Study 1) and within individual breeding operations (Case Study 2).
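
The reported genotype-to-aptitude association reduces to a three-way lookup (purely illustrative of the classes stated in the abstract; not Equinome's proprietary test):

```python
# Best-race-distance classes reported for the MSTN g.66493737C/T SNP.
BEST_DISTANCE = {
    "C/C": "short distances",
    "C/T": "middle distances",
    "T/T": "longer distances",
}

def race_distance_aptitude(genotype):
    """Map an MSTN SNP genotype to its reported distance aptitude."""
    return BEST_DISTANCE.get(genotype, "unknown genotype")

print(race_distance_aptitude("C/T"))  # → middle distances
```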

  11. A Case Study Based Analysis of Performance Metrics for Green Infrastructure

    Science.gov (United States)

    Gordon, B. L.; Ajami, N.; Quesnel, K.

    2017-12-01

    Aging infrastructure, population growth, and urbanization are demanding new approaches to management of all components of the urban water cycle, including stormwater. Traditionally, urban stormwater infrastructure was designed to capture and convey rainfall-induced runoff out of a city through a network of curbs, gutters, drains, and pipes, also known as grey infrastructure. These systems were planned with a single purpose and designed under the assumption of hydrologic stationarity, a notion that no longer holds true in the face of a changing climate. One solution gaining momentum around the world is green infrastructure (GI). Beyond stormwater quality improvement and quantity reduction (or technical benefits), GI solutions offer many environmental, economic, and social benefits. Yet many practical barriers have prevented the widespread adoption of these systems worldwide. At the center of these challenges is the inability of stakeholders to know how to monitor, measure, and assess the multi-sector performance of GI systems. Traditional grey infrastructure projects require different monitoring strategies than natural systems; there are no overarching policies on how to best design GI monitoring and evaluation systems and measure performance. Previous studies have attempted to quantify the performance of GI, mostly using one evaluation method on a specific case study. We use a case study approach to address these knowledge gaps and develop a conceptual model of how to evaluate the performance of GI through the lens of financing. First, we examined many different case studies of successfully implemented GI around the world. Then we narrowed in on 10 exemplary case studies. For each case study, we determined which performance method the project developer used, such as LCA, TBL, Low Impact Design Assessment (LIDA), and others. Then, we determined which performance metrics were used to determine success and what data was needed to calculate those metrics. Finally, we

  12. Unsupervised classification of lidar-based vegetation structure metrics at Jean Lafitte National Historical Park and Preserve

    Science.gov (United States)

    Kranenburg, Christine J.; Palaseanu-Lovejoy, Monica; Nayegandhi, Amar; Brock, John; Woodman, Robert

    2012-01-01

    Traditional vegetation maps capture the horizontal distribution of various vegetation properties, for example, type, species and age/senescence, across a landscape. Ecologists have long known, however, that many important forest properties, for example, interior microclimate, carbon capacity, biomass and habitat suitability, are also dependent on the vertical arrangement of branches and leaves within tree canopies. The objective of this study was to use a digital elevation model (DEM) along with tree canopy-structure metrics derived from a lidar survey conducted using the Experimental Advanced Airborne Research Lidar (EAARL) to capture a three-dimensional view of vegetation communities in the Barataria Preserve unit of Jean Lafitte National Historical Park and Preserve, Louisiana. The EAARL instrument is a raster-scanning, full waveform-resolving, small-footprint, green-wavelength (532-nanometer) lidar system designed to map coastal bathymetry, topography and vegetation structure simultaneously. An unsupervised clustering procedure was then applied to the three-dimensional structure metrics and the DEM to produce a vegetation map based on the vertical structure of the park's vegetation, which includes a flotant marsh, scrub-shrub wetland, bottomland hardwood forest, and baldcypress-tupelo swamp forest. This study was completed in collaboration with the National Park Service Inventory and Monitoring Program's Gulf Coast Network. The methods presented herein are intended to be used as part of a cost-effective monitoring tool to capture change in park resources.

  13. Performance evaluation of a distance learning program.

    OpenAIRE

    Dailey, D. J.; Eno, K. R.; Brinkley, J. F.

    1994-01-01

    This paper presents a performance metric which uses a single number to characterize the response time for a non-deterministic client-server application operating over the Internet. When applied to a Macintosh-based distance learning application called the Digital Anatomist Browser, the metric allowed us to observe that "A typical student doing a typical mix of Browser commands on a typical data set will experience the same delay if they use a slow Macintosh on a local network or a fast Macint...

  14. The effects of alignment quality, distance calculation method, sequence filtering, and region on the analysis of 16S rRNA gene-based studies.

    Directory of Open Access Journals (Sweden)

    Patrick D Schloss

    Full Text Available Pyrosequencing of PCR-amplified fragments that target variable regions within the 16S rRNA gene has quickly become a powerful method for analyzing the membership and structure of microbial communities. This approach has revealed and introduced questions that were not fully appreciated by those carrying out traditional Sanger sequencing-based methods. These include the effects of alignment quality, the best method of calculating pairwise genetic distances for 16S rRNA genes, whether it is appropriate to filter variable regions, and how the choice of variable region relates to the genetic diversity observed in full-length sequences. I used a diverse collection of 13,501 high-quality full-length sequences to assess each of these questions. First, alignment quality had a significant impact on distance values and downstream analyses. Specifically, the greengenes alignment, which does a poor job of aligning variable regions, predicted higher genetic diversity, richness, and phylogenetic diversity than the SILVA and RDP-based alignments. Second, the effect of different gap treatments in determining pairwise genetic distances was strongly affected by the variation in sequence length for a region; however, the effect of different calculation methods was subtle when determining the sample's richness or phylogenetic diversity for a region. Third, applying a sequence mask to remove variable positions had a profound impact on genetic distances by muting the observed richness and phylogenetic diversity. Finally, the genetic distances calculated for each of the variable regions did a poor job of correlating with the full-length gene. Thus, while it is tempting to apply traditional cutoff levels derived for full-length sequences to these shorter sequences, it is not advisable. Analysis of beta-diversity metrics showed that each of these factors can have a significant impact on the comparison of community membership and structure. Taken together, these results…
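
The effect of gap treatment on pairwise distances can be sketched with a toy pair of aligned sequences ('-' marks an alignment gap; the sequences are hypothetical and real analyses use full distance pipelines):

```python
def pairwise_distance(a, b, gap_mode="ignore"):
    """Uncorrected pairwise distance between two aligned sequences, with two
    common gap treatments: skip gapped columns, or count gaps as differences."""
    diffs = valid = 0
    for x, y in zip(a, b):
        if x == "-" or y == "-":
            if gap_mode == "count":   # a gap counts as one difference
                diffs += 1
                valid += 1
            continue                  # "ignore": drop gapped columns
        valid += 1
        diffs += x != y
    return diffs / valid

a = "ACGT-ACGTT"
b = "ACTTTAC-TT"
# Same alignment, two distances, depending only on the gap treatment.
print(pairwise_distance(a, b, "ignore"), pairwise_distance(a, b, "count"))
```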

  15. A guide to phylogenetic metrics for conservation, community ecology and macroecology

    Science.gov (United States)

    Cadotte, Marc W.; Carvalho, Silvia B.; Davies, T. Jonathan; Ferrier, Simon; Fritz, Susanne A.; Grenyer, Rich; Helmus, Matthew R.; Jin, Lanna S.; Mooers, Arne O.; Pavoine, Sandrine; Purschke, Oliver; Redding, David W.; Rosauer, Dan F.; Winter, Marten; Mazel, Florent

    2016-01-01

    ABSTRACT The use of phylogenies in ecology is increasingly common and has broadened our understanding of biological diversity. Ecological sub‐disciplines, particularly conservation, community ecology and macroecology, all recognize the value of evolutionary relationships but the resulting development of phylogenetic approaches has led to a proliferation of phylogenetic diversity metrics. The use of many metrics across the sub‐disciplines hampers potential meta‐analyses, syntheses, and generalizations of existing results. Further, there is no guide for selecting the appropriate metric for a given question, and different metrics are frequently used to address similar questions. To improve the choice, application, and interpretation of phylo‐diversity metrics, we organize existing metrics by expanding on a unifying framework for phylogenetic information. Generally, questions about phylogenetic relationships within or between assemblages tend to ask three types of question: how much; how different; or how regular? We show that these questions reflect three dimensions of a phylogenetic tree: richness, divergence, and regularity. We classify 70 existing phylo‐diversity metrics based on their mathematical form within these three dimensions and identify ‘anchor’ representatives: for α‐diversity metrics these are PD (Faith's phylogenetic diversity), MPD (mean pairwise distance), and VPD (variation of pairwise distances). By analysing mathematical formulae and using simulations, we use this framework to identify metrics that mix dimensions, and we provide a guide to choosing and using the most appropriate metrics. We show that metric choice requires connecting the research question with the correct dimension of the framework and that there are logical approaches to selecting and interpreting metrics. The guide outlined herein will help researchers navigate the current jungle of indices. PMID:26785932
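
Two of the 'anchor' alpha-diversity metrics, MPD and VPD, reduce to simple statistics over an assemblage's pairwise phylogenetic distance matrix. A sketch with a hypothetical four-species matrix (definitions follow the usual mean and variance of pairwise distances):

```python
import numpy as np

def mpd_vpd(D):
    """Mean pairwise distance (MPD) and variation of pairwise distances (VPD)
    from a symmetric phylogenetic distance matrix."""
    iu = np.triu_indices_from(D, k=1)   # each unordered species pair once
    d = D[iu]
    return d.mean(), d.var()

# Hypothetical distances among four co-occurring species.
D = np.array([
    [0.0, 2.0, 4.0, 6.0],
    [2.0, 0.0, 4.0, 6.0],
    [4.0, 4.0, 0.0, 6.0],
    [6.0, 6.0, 6.0, 0.0],
])
mpd, vpd = mpd_vpd(D)
print(mpd, vpd)  # divergence and regularity dimensions, respectively
```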

  16. Assessment of the Log-Euclidean Metric Performance in Diffusion Tensor Image Segmentation

    Directory of Open Access Journals (Sweden)

    Mostafa Charmi

    2010-06-01

    Full Text Available Introduction: Appropriate definition of the distance measure between diffusion tensors has a deep impact on Diffusion Tensor Image (DTI segmentation results. The geodesic metric is the best distance measure since it yields high-quality segmentation results. However, the important problem with the geodesic metric is the high computational cost of the algorithms based on it. The main goal of this paper is to assess the possible substitution of the geodesic metric with the Log-Euclidean one to reduce the computational cost of a statistical surface evolution algorithm. Materials and Methods: We incorporated the Log-Euclidean metric in the statistical surface evolution algorithm framework. To achieve this goal, the statistics and gradients of diffusion tensor images were defined using the Log-Euclidean metric. Numerical implementation of the segmentation algorithm was performed in the MATLAB software using finite difference techniques. Results: In the statistical surface evolution framework, the Log-Euclidean metric was able to discriminate the torus and helix patterns in synthetic datasets and rat spinal cords in biological phantom datasets from the background better than the Euclidean and J-divergence metrics. In addition, similar results were obtained with the geodesic metric. However, the main advantage of the Log-Euclidean metric over the geodesic metric was the dramatic reduction of the computational cost of the segmentation algorithm, by at least a factor of 70. Discussion and Conclusion: The qualitative and quantitative results have shown that the Log-Euclidean metric is a good substitute for the geodesic metric when using a statistical surface evolution algorithm in DTI segmentation.
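
The Log-Euclidean distance itself is cheap to compute, which is the source of the speedup: each tensor's matrix logarithm can be taken once and reused, whereas the affine-invariant geodesic metric requires matrix computations per tensor pair. A minimal sketch with hypothetical diagonal tensors:

```python
import numpy as np

def logm_spd(A):
    """Matrix logarithm of a symmetric positive-definite tensor via
    eigendecomposition: log(A) = V diag(log w) V^T."""
    w, V = np.linalg.eigh(A)
    return (V * np.log(w)) @ V.T

def log_euclidean_distance(A, B):
    """Log-Euclidean distance: Frobenius norm of log(A) - log(B)."""
    return float(np.linalg.norm(logm_spd(A) - logm_spd(B), "fro"))

# Two hypothetical 3x3 diffusion tensors (SPD).
A = np.diag([3.0, 1.0, 0.5])
B = np.diag([1.0, 1.0, 1.0])
print(log_euclidean_distance(A, B))
```

For B = I the distance is sqrt(ln(3)^2 + ln(0.5)^2), about 1.299, since log(I) is the zero matrix.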

  17. Distance-based relative orbital elements determination for formation flying system

    Science.gov (United States)

    He, Yanchao; Xu, Ming; Chen, Xi

    2016-01-01

    The present paper deals with determination of relative orbital elements based only on the distance between satellites in a formation flying system, which has potential engineering applications and is especially suited to missions requiring rapid orbit determination. A geometric simplification is performed to reduce the formation configuration in three-dimensional space to a plane. Then the equivalent actual configuration deviating from its nominal design is introduced to derive a group of autonomous linear equations mapping relative orbital element differences to distance errors. A primary algorithm based on these linear equations is first proposed for rapid and precise determination of the relative orbital elements without complex computation; it is further improved by a least-squares method that takes more distance measurements into consideration. Numerical simulations and comparisons with traditional approaches are presented to validate the effectiveness of the proposed methods. To assess the performance of the two proposed algorithms, accuracy validation and Monte Carlo simulations are implemented in the presence of noise in the distance measurements and in the leader's absolute orbital elements. It is demonstrated that the determination accuracy of both approaches exceeds 90%, with the least-squares-improved one coming even closer to the actual values. The proposed approaches can serve as alternatives for relative orbit determination without the assistance of additional facilities, owing to their efficiency, accuracy and autonomy.
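
The least-squares improvement can be sketched in the abstract's spirit: with more distance measurements than unknown element differences, an overdetermined linear system is solved in the least-squares sense. The matrix and values below are hypothetical toys, not the paper's actual mapping:

```python
import numpy as np

# Toy stand-in for the linear mapping: distance-measurement errors modeled
# as J @ delta, where delta collects the six relative orbital element
# differences. J, delta, and the noise level here are all hypothetical.
rng = np.random.default_rng(2)
n_meas, n_elems = 50, 6
J = rng.standard_normal((n_meas, n_elems))
delta_true = np.array([1e-3, -2e-3, 5e-4, 0.0, 1e-3, -5e-4])
d_err = J @ delta_true + 1e-6 * rng.standard_normal(n_meas)

# More measurements than unknowns: least squares averages out the noise.
delta_est, *_ = np.linalg.lstsq(J, d_err, rcond=None)
print(np.abs(delta_est - delta_true).max())  # small residual error
```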

  18. A Process-Based Transport-Distance Model of Aeolian Transport

    Science.gov (United States)

    Naylor, A. K.; Okin, G.; Wainwright, J.; Parsons, A. J.

    2017-12-01

    We present a new approach to modeling aeolian transport based on transport distance. Particle fluxes are based on statistical probabilities of particle detachment and distributions of transport lengths, which are functions of particle size classes. A computational saltation model is used to simulate transport distances over a variety of sizes. These are fit to an exponential distribution, which has the advantages of computational economy, concordance with current field measurements, and a meaningful relationship to theoretical assumptions about mean and median particle transport distance. This novel approach includes particle-particle interactions, which are important for sustaining aeolian transport and dust emission. Results from this model are compared with results from both bulk and particle-size-specific transport equations as well as empirical wind tunnel studies. The transport-distance approach has been successfully used for hydraulic processes, and extending this methodology from hydraulic to aeolian transport opens up the possibility of modeling joint transport by wind and water using consistent physics. Particularly in nutrient-limited environments, modeling the joint action of aeolian and hydraulic transport is essential for understanding the spatial distribution of biomass across landscapes and how it responds to climatic variability and change.
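
Sampling hop lengths from an exponential transport-distance distribution is a one-liner per particle; the per-size-class mean lengths below are hypothetical placeholders, not the paper's fitted values:

```python
import random

# Hypothetical mean transport lengths (m) per particle size class.
MEAN_LENGTH = {"fine_sand": 1.2, "medium_sand": 0.5, "coarse_sand": 0.15}

def hop_lengths(size_class, n, seed=0):
    """Draw n transport distances from the exponential distribution
    for the given size class (expovariate takes the rate 1/mean)."""
    rng = random.Random(seed)
    mean = MEAN_LENGTH[size_class]
    return [rng.expovariate(1.0 / mean) for _ in range(n)]

hops = hop_lengths("fine_sand", 100_000)
sample_mean = sum(hops) / len(hops)
# For an exponential, the median is ln(2) times the mean, matching the
# "mean and median transport distance" relationship noted in the abstract.
print(sample_mean)
```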

  19. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-01-01

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs), together with a capacitive sensor-based indexed metrology platform (IMP), based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform’s mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position in the instrument’s working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and variety of reference distances can be created without the need for a physical gauge, thereby optimizing the testing time, the number of gauge positions and the space needed for the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained prove the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform. PMID:27869722

  20. Decision-making in honeybee swarms based on quality and distance information of candidate nest sites.

    Science.gov (United States)

    Laomettachit, Teeraphan; Termsaithong, Teerasit; Sae-Tang, Anuwat; Duangphakdee, Orawan

    2015-01-07

    In the nest-site selection process of honeybee swarms, an individual bee performs a waggle dance to communicate the direction, quality, and distance of a discovered site to other bees at the swarm. Initially, different groups of bees dance for different potential sites, but eventually the swarm usually reaches agreement on a single site. Here, we model the nest-site selection process in honeybee swarms of Apis mellifera and show how swarms make adaptive decisions based on a trade-off between the quality of and distance to candidate nest sites. We use bifurcation analysis and stochastic simulations to reveal that the swarm's site-distance preference is moderate > near > far when the swarms choose between low-quality sites. However, the distance preference becomes near > moderate > far when the swarms choose between high-quality sites. Our simulations also indicate that swarms with a large population size prefer nearer sites and, in addition, are more adaptive at making decisions based on available information than swarms with a smaller population size. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform.

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-11-18

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs), together with a capacitive sensor-based indexed metrology platform (IMP), based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform's mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position in the instrument's working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and variety of reference distances can be created without the need for a physical gauge, thereby optimizing the testing time, the number of gauge positions and the space needed for the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained prove the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform.

  2. A Health-Based Metric for Evaluating the Effectiveness of Noise Barrier Mitigation Associated With Transport Infrastructure Noise

    Directory of Open Access Journals (Sweden)

    Geoffrey P Prendergast

    2017-01-01

    Introduction: This study examines the use of the number of night-time sleep disturbances as a health-based metric to assess the cost effectiveness of rail noise mitigation strategies for situations wherein high-intensity noise events dominate, such as freight train pass-bys and wheel squeal. Materials and Methods: Twenty residential properties adjacent to the existing and proposed rail tracks in a noise catchment area of the Epping to Thornleigh Third Track project were used as a case study. Awakening probabilities were calculated for individuals awakening 1, 3 and 5 times a night when subjected to 10 independent freight train pass-by noise events, using internal maximum sound pressure levels (LAFmax). Results: Awakenings were predicted using a random-intercept multivariate logistic regression model. With source mitigation in place, the majority of residents were still predicted to be awoken at least once per night (median 88.0%), although substantial reductions were predicted in the median probabilities of awakening three times per night (from 50.9% to 29.4%) and five times per night (from 9.2% to 2.7%). This yielded a cost-effectiveness estimate of 7.6–8.8 fewer people awoken at least three times per night per A$1 million spent on noise barriers. Conclusion: The study demonstrates that an easily understood metric can readily be used to assist decision-making related to noise mitigation for large-scale transport projects.
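    Under the abstract's independence assumption for the 10 nightly pass-by events, the probability of awakening at least k times follows a binomial tail sum. A minimal sketch (the per-event probability p below is an illustrative value, not the paper's fitted regression estimate):

```python
from math import comb

def p_awake_at_least(k, p, n_events=10):
    """P(awakening at least k times in n_events independent noise events),
    each with per-event awakening probability p."""
    return sum(comb(n_events, i) * p**i * (1 - p)**(n_events - i)
               for i in range(k, n_events + 1))

# Example: with p = 0.19 per pass-by, awakening at least once per night
# comes out near 88%, in line with the median figure quoted above.
p_once = p_awake_at_least(1, 0.19)
print(round(p_once, 3))  # about 0.878
```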

  3. PG-Metrics: A chemometric-based approach for classifying bacterial peptidoglycan data sets and uncovering their subjacent chemical variability.

    Directory of Open Access Journals (Sweden)

    Keshav Kumar

    Bacterial cells are protected from osmotic and environmental stresses by an exoskeleton-like polymeric structure called peptidoglycan (PG) or murein sacculus. This structure is fundamental to bacterial viability, and thus the mechanisms underlying cell wall assembly, and how they are modulated, serve as targets for many of our most successful antibiotics. It is therefore now more important than ever to understand the genetics and structural chemistry of bacterial cell walls in order to find new and effective ways of blocking cell wall assembly for the treatment of disease. In recent decades, liquid chromatography and mass spectrometry have been shown to provide the resolution and sensitivity required to characterize the fine chemical structure of PG. However, the large data sets these instruments can produce today are difficult to handle without a proper data analysis workflow. Here, we present PG-metrics, a chemometric-based pipeline that allows fast and easy classification of bacteria according to their muropeptide chromatographic profiles and identification of the subjacent PG chemical variability between e.g. bacterial species, growth conditions and mutant libraries. The pipeline is successfully validated here using PG samples from different bacterial species and mutants in cell wall proteins. The results clearly demonstrate that the PG-metrics pipeline is a valuable bioanalytical tool that can lead us to cell wall classification and biomarker discovery.

  4. Content-Based High-Resolution Remote Sensing Image Retrieval via Unsupervised Feature Learning and Collaborative Affinity Metric Fusion

    Directory of Open Access Journals (Sweden)

    Yansheng Li

    2016-08-01

    With the urgent demand for automatic management of large numbers of high-resolution remote sensing images, content-based high-resolution remote sensing image retrieval (CB-HRRS-IR) has attracted much research interest. Accordingly, this paper proposes a novel high-resolution remote sensing image retrieval approach via multiple feature representation and collaborative affinity metric fusion (IRMFRCAMF). In IRMFRCAMF, we design four unsupervised convolutional neural networks with different numbers of layers to generate four types of unsupervised features, from the fine level to the coarse level. In addition to these four types of unsupervised features, we also implement four traditional feature descriptors: local binary patterns (LBP), the gray-level co-occurrence matrix (GLCM), maximal response 8 (MR8), and the scale-invariant feature transform (SIFT). In order to fully exploit the complementary information among the multiple features of one image and the mutual information across auxiliary images in the image dataset, this paper advocates collaborative affinity metric fusion to measure the similarity between images. The performance evaluation of high-resolution remote sensing image retrieval is carried out on two public datasets, the UC Merced (UCM) dataset and the Wuhan University (WH) dataset. Extensive experiments show that the proposed IRMFRCAMF significantly outperforms state-of-the-art approaches.

  5. Anchoring quartet-based phylogenetic distances and applications to species tree reconstruction.

    Science.gov (United States)

    Sayyari, Erfan; Mirarab, Siavash

    2016-11-11

    Inferring species trees from gene trees using coalescent-based summary methods has been the subject of much attention, yet new scalable and accurate methods are needed. We introduce DISTIQUE, a new statistically consistent summary method for inferring species trees from gene trees under the coalescent model. We generalize our results to arbitrary phylogenetic inference problems: we show that two arbitrarily chosen leaves, called anchors, can be used to estimate the relative distances between all other pairs of leaves by inferring relevant quartet trees. This results in a family of distance-based tree inference methods, with running times ranging from quadratic to quartic in the number of leaves. We show in simulation studies that DISTIQUE has accuracy comparable to leading coalescent-based summary methods and reduced running times.

  6. A distance weighted-based approach for self-organized aggregation in robot swarms

    KAUST Repository

    Khaldi, Belkacem

    2017-12-14

    In this paper, a Distance-Weighted K-Nearest Neighbor (DW-KNN) topology is proposed to study self-organized aggregation as an emergent swarming behavior within robot swarms. A virtual physics approach is applied within the proposed neighborhood topology to keep the robots together. A distance-weighted function based on a Smoothed Particle Hydrodynamics (SPH) interpolation approach is used as the key factor in identifying the K nearest neighbors taken into account when aggregating the robots. The virtual physical connectivity among these neighbors is achieved using a virtual viscoelastic-based proximity model. Using the ARGoS-based simulator, we model and evaluate the proposed approach, showing various self-organized aggregations performed by a swarm of N foot-bot robots.
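    A minimal sketch of the distance-weighted neighbor selection idea (the kernel form and constants here are generic SPH-style assumptions, not the paper's exact formulation): each candidate neighbor is scored by a smoothing kernel that decays with distance and vanishes beyond an interaction radius h, and the K highest-weighted robots are kept.

```python
import numpy as np

def sph_weight(d, h):
    """Unnormalised poly6-style smoothing kernel: decays with distance,
    exactly zero at and beyond the interaction radius h."""
    d = np.asarray(d, dtype=float)
    return np.where(d < h, (1.0 - (d / h) ** 2) ** 3, 0.0)

def dw_knn(positions, i, k, h):
    """Indices of up to k neighbours of robot i with highest kernel weight."""
    d = np.linalg.norm(positions - positions[i], axis=1)
    d[i] = np.inf                        # exclude the robot itself
    w = sph_weight(d, h)
    order = np.argsort(-w)               # sort by descending weight
    return [int(j) for j in order[:k] if w[j] > 0]

# Four robots on a line; robot 3 is out of interaction range of robot 0.
pos = np.array([[0.0, 0.0], [0.5, 0.0], [1.0, 0.0], [3.0, 0.0]])
print(dw_knn(pos, 0, 2, h=2.0))  # the two nearest in-range robots
```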

  7. CT-based compartmental quantification of adipose tissue versus body metrics in colorectal cancer patients

    Energy Technology Data Exchange (ETDEWEB)

    Nattenmueller, Johanna; Hoegenauer, Hanna; Grenacher, Lars; Kauczor, Hans-Ulrich [University Hospital, Department of Diagnostic and Interventional Radiology, Heidelberg (Germany); Boehm, Juergen; Ulrich, Cornelia [Huntsman Cancer Institute, Department of Population Health Sciences, Salt Lake City, UT (United States); Scherer, Dominique; Paskow, Michael; Gigic, Biljana; Schrotz-King, Petra [National Center for Tumor Diseases and German Cancer Research Center, Division of Preventive Oncology, Heidelberg (Germany)

    2016-11-15

    While obesity is considered a prognostic factor in colorectal cancer (CRC), there is increasing evidence that not simply body mass index (BMI) alone but specifically abdominal fat distribution is what matters. As part of the ColoCare study, this study measured the distribution of adipose tissue compartments in CRC patients and aimed to identify the body metric that best correlates with these measurements as a useful proxy for adipose tissue distribution. In 120 newly diagnosed CRC patients who underwent multidetector computed tomography (CT), densitometric quantification of the total (TFA), visceral (VFA), intraperitoneal (IFA), retroperitoneal (RFA) and subcutaneous fat areas (SFA), as well as the M. erector spinae and psoas muscles, was performed to test the association with gender, age, tumor stage, metabolic equivalents, BMI, waist-to-height ratio (WHtR) and waist-to-hip ratio (WHR). VFA was 28.8% higher in men (p_VFA<0.0001) and 30.5% higher in patients older than 61 years (p_VFA<0.0001). WHtR correlated best with all adipose tissue compartments (r_VFA=0.69, r_TFA=0.84, p<0.0001) and with the visceral-to-subcutaneous fat ratio (VFR, r_VFR=0.22, p<0.05). Patients with tumor stages III/IV showed significantly less overall adipose tissue than those with stages I/II. Increased M. erector spinae mass was inversely correlated with all compartments. Densitometric quantification on CT is a highly reproducible and reliable method to show fat distribution across adipose tissue compartments. This distribution might be best reflected by WHtR, rather than by BMI or WHR. (orig.)

  8. CT-based compartmental quantification of adipose tissue versus body metrics in colorectal cancer patients

    International Nuclear Information System (INIS)

    Nattenmueller, Johanna; Hoegenauer, Hanna; Grenacher, Lars; Kauczor, Hans-Ulrich; Boehm, Juergen; Ulrich, Cornelia; Scherer, Dominique; Paskow, Michael; Gigic, Biljana; Schrotz-King, Petra

    2016-01-01

    While obesity is considered a prognostic factor in colorectal cancer (CRC), there is increasing evidence that not simply body mass index (BMI) alone but specifically abdominal fat distribution is what matters. As part of the ColoCare study, this study measured the distribution of adipose tissue compartments in CRC patients and aimed to identify the body metric that best correlates with these measurements as a useful proxy for adipose tissue distribution. In 120 newly diagnosed CRC patients who underwent multidetector computed tomography (CT), densitometric quantification of the total (TFA), visceral (VFA), intraperitoneal (IFA), retroperitoneal (RFA) and subcutaneous fat areas (SFA), as well as the M. erector spinae and psoas muscles, was performed to test the association with gender, age, tumor stage, metabolic equivalents, BMI, waist-to-height ratio (WHtR) and waist-to-hip ratio (WHR). VFA was 28.8% higher in men (p_VFA<0.0001) and 30.5% higher in patients older than 61 years (p_VFA<0.0001). WHtR correlated best with all adipose tissue compartments (r_VFA=0.69, r_TFA=0.84, p<0.0001) and with the visceral-to-subcutaneous fat ratio (VFR, r_VFR=0.22, p<0.05). Patients with tumor stages III/IV showed significantly less overall adipose tissue than those with stages I/II. Increased M. erector spinae mass was inversely correlated with all compartments. Densitometric quantification on CT is a highly reproducible and reliable method to show fat distribution across adipose tissue compartments. This distribution might be best reflected by WHtR, rather than by BMI or WHR. (orig.)

  9. Stability of Switched Feedback Time-Varying Dynamic Systems Based on the Properties of the Gap Metric for Operators

    Directory of Open Access Journals (Sweden)

    M. De la Sen

    2012-01-01

    The stabilization of switched dynamic control systems is addressed, based on an operator-based formulation. It is assumed that the controlled object and the controller are described by sequences of closed operator pairs (L,C) on a Hilbert space H of the input and output spaces, and stabilization is related to the existence of a bounded, admissible inverse of the resulting input-output operator. The technical mechanism used to obtain the results is the fact that closed operators that are sufficiently close to bounded operators, in terms of the gap metric, are themselves bounded. This principle is applied to the operators describing the input-output relations in switched feedback control systems so as to guarantee closed-loop stabilization.

  10. Networks and centroid metrics for understanding football

    African Journals Online (AJOL)

    Gonçalo Dias

    games. However, it seems that the centroid metric, supported only by the position of players in the field ...... the strategy adopted by the coach (Gama et al., 2014). ... centroid distance as measures of team's tactical performance in youth football.

  11. Finite Metric Spaces of Strictly negative Type

    DEFF Research Database (Denmark)

    Hjorth, Poul G.

    If a finite metric space is of strictly negative type then its transfinite diameter is uniquely realized by an infinite extent (“load vector”). Finite metric spaces that have this property include all trees, and all finite subspaces of Euclidean and Hyperbolic spaces. We prove that if the distance

  12. LIVE AUTHORITY IN THE CLASSROOM IN VIDEO CONFERENCE-BASED SYNCHRONOUS DISTANCE EDUCATION: The Teaching Assistant

    Directory of Open Access Journals (Sweden)

    Hasan KARAL

    2010-07-01

    The aim of this study was to define the role of the teaching assistant in a classroom where students are taught via video conference-based synchronous distance education. A qualitative research approach was adopted, and among purposeful sampling methods, criterion sampling was preferred for the scope of the study. The study was carried out during the spring semester of the 2008-2009 academic year. A teaching assistant and a total of 9 sophomore or senior students from the Department of City and Regional Development, Faculty of Architecture, Karadeniz Technical University, participated as subjects. The students included in the study sample were taking lessons from the Middle East Technical University via synchronous distance education. Among qualitative research methods, the case study method was used, and the study data were obtained from semi-structured interviews and observations. The data were analyzed with descriptive analysis methods. The data obtained support the suggestion that there should be an authority present in video conference-based synchronous distance education. Findings from the interviews with the students revealed that some of the teacher’s classroom-management responsibilities are transferred to the assistant present in the classroom during synchronous distance education. It was concluded from the interviews that a teaching assistant’s presence should be obligatory in the undergraduate synchronous distance classroom. However, at the postgraduate level there may be no need for an authority in the classroom, due to the profile and expectations of the students, which differ from those of students at lower educational levels.

  13. Assessing Gait Impairments Based on Auto-Encoded Patterns of Mahalanobis Distances from Consecutive Steps.

    Science.gov (United States)

    Muñoz-Organero, Mario; Davies, Richard; Mawson, Sue

    2017-01-01

    Insole pressure sensors capture the force distribution patterns during the stance phase of walking. By comparing patterns obtained from healthy individuals to those from patients with different medical conditions, based on a given similarity measure, automatic impairment indexes can be computed to help in applications such as rehabilitation. This paper uses the data sensed from insole pressure sensors for a group of healthy controls to train an autoencoder on patterns of stochastic distances in series of consecutive steps taken while walking at normal speeds. Two experimental groups are compared to the healthy control group: a group of patients suffering knee pain and a group of post-stroke survivors. The Mahalanobis distance is computed for every single step by each participant, relative to the entire dataset sensed from healthy controls. The computed distances for consecutive steps are fed into the previously trained autoencoder, and the average error is used to assess how close the walking segment is to the model generated from healthy controls. The results show that automatic distortion indexes can be used to assess each participant against normal patterns computed from healthy controls. The stochastic distances observed for the group of stroke survivors are larger than those for the people with knee pain.
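    The per-step distance computation can be sketched as follows (feature dimensions and data are invented for illustration): each step is a feature vector, and its Mahalanobis distance is taken to the distribution of steps collected from healthy controls.

```python
import numpy as np

def mahalanobis(x, mean, cov_inv):
    """Mahalanobis distance of one step's feature vector x from a
    reference distribution with the given mean and inverse covariance."""
    diff = x - mean
    return float(np.sqrt(diff @ cov_inv @ diff))

# Reference distribution: simulated steps from healthy controls,
# each described by 4 insole-derived features.
rng = np.random.default_rng(1)
healthy_steps = rng.normal(0.0, 1.0, size=(500, 4))
mean = healthy_steps.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(healthy_steps, rowvar=False))

typical = mahalanobis(healthy_steps[0], mean, cov_inv)
outlier = mahalanobis(mean + 5.0, mean, cov_inv)  # strongly deviant step
print(outlier > typical)
```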

  14. The Learning Tutor: A Web based Authoring System to Support Distance Tutoring

    Directory of Open Access Journals (Sweden)

    Omar Abou Khaled

    2000-01-01

    In distance learning contexts, such as those being widely promoted and developed with the extensive use of ICT (Information and Communication Technology), some important issues have to be carefully addressed in order to make education more effective and available. Distant students face substantial organizational problems concerning working-time management and the regulation of the whole learning process. These problems are far more complex at a distance because of the difficulty of understanding and objectively evaluating how the study is progressing in terms of knowledge and competence acquisition, both for the students themselves and for the teacher, who is supposed to adjust the teaching process when needed. Moreover, the absence of a clear indication to the student of the relative importance of each piece of available information is another key issue in distance education. This paper describes a Web-based authoring system, the Learning Tutor, conceived to address these issues. The environment is composed of several interconnected authoring systems: “The Course Description, the Guiding Thread and the Agenda”, “The Work Plan and Themes Reviewer”, and “The Quizzes self-evaluation facility”. This model of combined tools aims at providing suitable support for organization, work and time management in distance learning processes, using well-documented mastery learning principles.

  15. Assessing Woody Vegetation Trends in Sahelian Drylands Using MODIS Based Seasonal Metrics

    Science.gov (United States)

    Brandt, Martin; Hiernaux, Pierre; Rasmussen, Kjeld; Mbow, Cheikh; Kergoat, Laurent; Tagesson, Torbern; Ibrahim, Yahaya Z.; Wele, Abdoulaye; Tucker, Compton J.; Fensholt, Rasmus

    2016-01-01

    Woody plants play a major role in the resilience of drylands and in people's livelihoods. However, due to their scattered distribution, quantifying and monitoring woody cover over space and time is challenging. We develop a phenology-driven model and train/validate MODIS (MCD43A4, 500 m) derived metrics with 178 ground observations from Niger, Senegal and Mali to estimate woody cover trends from 2000 to 2014 over the entire Sahel. The annual woody cover estimation at 500 m scale is fairly accurate, with an RMSE of 4.3 (woody cover %) and r² = 0.74. Over the 15-year period we observed an average increase of 1.7 (±5.0) woody cover (%) with large spatial differences: no clear change can be observed in densely populated areas (0.2 ± 4.2), whereas a positive change is seen in sparsely populated areas (2.1 ± 5.2). Woody cover is generally stable in cropland areas (0.9 ± 4.6), reflecting the protective management of parkland trees by farmers. Positive changes are observed in savannas (2.5 ± 5.4) and woodland areas (3.9 ± 7.3). The major pattern of woody cover change reveals strong increases in the sparsely populated Sahel zones of eastern Senegal, western Mali and central Chad, but a decreasing trend is observed in the densely populated western parts of Senegal, northern Nigeria, Sudan and southwestern Niger. This decrease is often local and limited to woodlands, an indication of ongoing expansion of cultivated areas and selective logging. We show that an overall positive trend is found in areas of low anthropogenic pressure, demonstrating the potential of these ecosystems to provide services such as carbon storage if not over-utilized. Taken together, our results provide an unprecedented synthesis of woody cover dynamics in the Sahel, and point to land use and human population density as important drivers, which however only partially and locally offset a general post-drought increase.

  16. Use and Cost of Electronic Resources in Central Library of Ferdowsi University Based on E-metrics

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Davarpanah

    2012-07-01

    The purpose of this study was to investigate the usage of electronic journals at Ferdowsi University, Iran, based on e-metrics. The paper also aimed to emphasize cost-benefit analysis and the correlation between journal impact factors and usage data. The Ferdowsi University library's experience with licensing and usage of electronic resources was evaluated through a cost-benefit analysis based on the cost and usage statistics of electronic resources. Vendor-provided data were also compared with local usage data. The usage data were collected by tracking web-based access locally and by collecting vendor-provided usage data. The data sources were one year of vendor-supplied e-resource usage data from providers such as Ebsco, Elsevier, ProQuest, Emerald, Oxford and Springer, and local usage data collected from the Ferdowsi University web server. The study found that actual usage values differ between vendor-provided data and local usage data. Elsevier had the highest usage in searches, sessions and downloads. Statistics also showed that a small number of journals account for a significant amount of use, while the majority of journals were used less frequently and some were never used at all. Users preferred the PDF format to HTML. The subject profile suggested that the provided e-resources were best suited to certain subjects. There was no correlation between impact factor and electronic journal use. Monitoring the usage of e-resources has gained increasing importance for acquisition policy and budget decisions. The article provides information about local metrics for the six surveyed vendors/publishers, e.g. usage trends, requests per package, and cost per use as related to the scientific specialties of the university.
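    The cost-per-use figure at the heart of such analyses is simple arithmetic; a sketch with invented figures (not the library's actual licence costs or download counts):

```python
def cost_per_use(annual_cost, downloads):
    """Licence cost divided by recorded uses; infinite if never used."""
    return annual_cost / downloads if downloads else float("inf")

# Hypothetical packages: (annual licence cost, downloads in that year).
packages = {"VendorA": (12000.0, 4800), "VendorB": (9000.0, 600)}
for name, (cost, uses) in packages.items():
    print(name, round(cost_per_use(cost, uses), 2))
```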

  17. A novel three-stage distance-based consensus ranking method

    Science.gov (United States)

    Aghayi, Nazila; Tavana, Madjid

    2018-05-01

    In this study, we propose a three-stage weighted-sum method for identifying the group ranks of alternatives. In the first stage, a rank matrix, similar to the cross-efficiency matrix, is obtained by computing the individual rank position of each alternative based on importance weights. In the second stage, a secondary goal is defined to constrain the vector of weights, since the vector of weights obtained in the first stage is not unique. Finally, in the third stage, the group rank position of each alternative is obtained based on the distances between individual rank positions. The third stage determines a consensus solution for the group, so that the ranks obtained have a minimum distance from the ranks acquired by each alternative in the previous stage. A numerical example is presented to demonstrate the applicability and exhibit the efficacy of the proposed method and algorithms.
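    The third, consensus stage can be illustrated with a brute-force toy (the first two stages are assumed here to have already produced the individual rank vectors; this is a sketch, not the authors' exact optimization model):

```python
from itertools import permutations

def consensus_rank(rank_vectors):
    """Group ranking minimising the total L1 distance to the
    individual rank vectors (one vector per evaluation)."""
    n = len(rank_vectors[0])
    best, best_cost = None, float("inf")
    for perm in permutations(range(1, n + 1)):   # candidate group rankings
        cost = sum(abs(perm[i] - rv[i])
                   for rv in rank_vectors for i in range(n))
        if cost < best_cost:
            best, best_cost = perm, cost
    return list(best)

# Three individual rankings of three alternatives (1 = best).
ranks = [[1, 2, 3], [1, 3, 2], [1, 2, 3]]
print(consensus_rank(ranks))  # [1, 2, 3]
```

Brute force over permutations is only feasible for a handful of alternatives; the paper's formulation scales this step with an optimization model rather than enumeration.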

  18. Extended VIKOR Method for Intuitionistic Fuzzy Multiattribute Decision-Making Based on a New Distance Measure

    Directory of Open Access Journals (Sweden)

    Xiao Luo

    2017-01-01

    An intuitionistic fuzzy VIKOR (IF-VIKOR) method is proposed based on a new distance measure that considers the waver of intuitionistic fuzzy information. The method aggregates all individual decision-makers’ assessment information using the intuitionistic fuzzy weighted averaging operator (IFWA), determines the weights of decision-makers and attributes objectively using intuitionistic fuzzy entropy, calculates the group utility and individual regret using the new distance measure, and then reaches a compromise solution. It can be effectively applied to multiattribute decision-making (MADM) problems where the weights of decision-makers and attributes are completely unknown and the attribute values are intuitionistic fuzzy numbers (IFNs). The validity and stability of the method are verified by example analysis and sensitivity analysis, and its superiority is illustrated by comparison with an existing method.
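    The paper's distance measure is its own contribution; as a baseline for comparison, the standard normalized Hamming distance between intuitionistic fuzzy numbers (IFNs) looks like this, where an IFN is a pair (membership mu, non-membership nu) with hesitancy pi = 1 - mu - nu:

```python
def ifn_hamming(a, b):
    """Normalised Hamming distance between two IFNs, each given as
    (mu, nu); the hesitancy term pi is derived from them."""
    mu_a, nu_a = a
    mu_b, nu_b = b
    pi_a = 1.0 - mu_a - nu_a
    pi_b = 1.0 - mu_b - nu_b
    return 0.5 * (abs(mu_a - mu_b) + abs(nu_a - nu_b) + abs(pi_a - pi_b))

print(round(ifn_hamming((0.6, 0.3), (0.4, 0.5)), 6))  # 0.2
```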

  19. Application of energies of optimal frequency bands for fault diagnosis based on modified distance function

    Energy Technology Data Exchange (ETDEWEB)

    Zamanian, Amir Hosein [Southern Methodist University, Dallas (United States); Ohadi, Abdolreza [Amirkabir University of Technology (Tehran Polytechnic), Tehran (Iran, Islamic Republic of)

    2017-06-15

    Low-dimensional, relevant feature sets are ideal for avoiding extra data mining in classification. The current work investigates the feasibility of utilizing the energies of vibration signals in optimal frequency bands as features for machine fault diagnosis. Energies in different frequency bands were derived based on Parseval's theorem. The optimal feature sets were extracted by optimizing the related frequency bands using a genetic algorithm and a modified distance function (MDF). The frequency bands and the number of bands were optimized based on the MDF, which is designed to (a) maximize the distance between class centers, (b) minimize the dispersion of features within each class, and (c) minimize the dimension of the extracted feature sets. Experimental signals from two different gearboxes were used to demonstrate the efficiency of the presented technique. The results show its effectiveness in gear fault diagnosis.
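    The band-energy features can be sketched via the FFT (band edges and the test signal below are illustrative, not the paper's gearbox data): by Parseval's theorem the signal energy partitions across frequency bins, so summing the per-bin spectral energy over a band gives that band's share.

```python
import numpy as np

def band_energies(signal, fs, bands):
    """Per-band spectral energy from the one-sided FFT spectrum.
    `bands` is a list of (lo_hz, hi_hz) edges; values are relative."""
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2 / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return [float(spectrum[(freqs >= lo) & (freqs < hi)].sum())
            for lo, hi in bands]

# Test signal: a strong 50 Hz tone plus a weaker 200 Hz tone.
fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 200 * t)
e_low, e_high = band_energies(x, fs, [(0, 100), (100, 300)])
print(e_low > e_high)  # the 50 Hz component carries more energy
```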

  20. Auditory distance perception in humans: a review of cues, development, neuronal bases, and effects of sensory loss.

    Science.gov (United States)

    Kolarik, Andrew J; Moore, Brian C J; Zahorik, Pavel; Cirstea, Silvia; Pardhan, Shahina

    2016-02-01

    Auditory distance perception plays a major role in spatial awareness, enabling location of objects and avoidance of obstacles in the environment. However, it remains under-researched relative to studies of the directional aspect of sound localization. This review focuses on the following four aspects of auditory distance perception: cue processing, development, consequences of visual and auditory loss, and neurological bases. The several auditory distance cues vary in their effective ranges in peripersonal and extrapersonal space. The primary cues are sound level, reverberation, and frequency. Nonperceptual factors, including the importance of the auditory event to the listener, also can affect perceived distance. Basic internal representations of auditory distance emerge at approximately 6 months of age in humans. Although visual information plays an important role in calibrating auditory space, sensorimotor contingencies can be used for calibration when vision is unavailable. Blind individuals often manifest supranormal abilities to judge relative distance but show a deficit in absolute distance judgments. Following hearing loss, the use of auditory level as a distance cue remains robust, while the reverberation cue becomes less effective. Previous studies have not found evidence that hearing-aid processing affects perceived auditory distance. Studies investigating the brain areas involved in processing different acoustic distance cues are described. Finally, suggestions are given for further research on auditory distance perception, including broader investigation of how background noise and multiple sound sources affect perceived auditory distance for those with sensory loss.

  1. Computation of Thermal Development in Injection Mould Filling, based on the Distance Model

    OpenAIRE

    Andersson, Per-Åke

    2002-01-01

    The heat transfer in the filling phase of injection moulding is studied, based on Gunnar Aronsson’s distance model for flow expansion ([Aronsson], 1996). The choice of a thermoplastic materials model is motivated by general physical properties, admitting temperature and pressure dependence. Two-phase, per-phase-incompressible, power-law fluids are considered. The shear rate expression takes into account pseudo-radial flow from a point inlet. Instead of using a finite element (FEM) solver for ...

  2. Algorithm for personal identification in distance learning system based on registration of keyboard rhythm

    Science.gov (United States)

    Nikitin, P. V.; Savinov, A. N.; Bazhenov, R. I.; Sivandaev, S. V.

    2018-05-01

The article describes a method of identifying a person in distance learning systems based on keyboard rhythm. An algorithm for the organization of access control is proposed that implements authentication, identification and verification of a person using the keyboard rhythm. Because biometric characteristics cannot exist apart from a particular person, authentication methods based on biometric parameters, including the keyboard rhythm, can provide higher accuracy, non-repudiation of authorship, and convenience for operators of automated systems in comparison with other methods of conformity checking. Methods of permanent hidden keyboard monitoring make it possible to detect the substitution of a student and to block the system.
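A minimal sketch of the kind of keyboard-rhythm check involved: dwell times (how long each key is held) and flight times (gaps between keys) form a feature vector that is compared against an enrolled profile. The feature layout, z-score test and threshold here are illustrative assumptions, not the article's algorithm:

```python
import numpy as np

def rhythm_features(press, release):
    """Dwell times (hold duration per key) and flight times (gap
    between releasing one key and pressing the next)."""
    press, release = np.asarray(press, float), np.asarray(release, float)
    dwell = release - press
    flight = press[1:] - release[:-1]
    return np.concatenate([dwell, flight])

def verify(sample, profile_mean, profile_std, threshold=3.0):
    """Accept if the RMS z-score distance to the enrolled profile is
    below the threshold (an illustrative decision rule)."""
    z = (np.asarray(sample) - profile_mean) / profile_std
    return float(np.linalg.norm(z) / np.sqrt(len(z))) < threshold
```

Enrollment would estimate `profile_mean` and `profile_std` from several typing samples of the same passphrase; hidden monitoring repeats the check continuously.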

  3. Inter-regional metric disadvantages when comparing countries’ happiness on a global scale. A Rasch based consequential validity analysis

    Directory of Open Access Journals (Sweden)

    Diego Fernando Rojas-Gualdrón

    2017-07-01

    Full Text Available Measurement confounding due to socioeconomic differences between world regions may bias the estimations of countries’ happiness and global inequality. Potential implications of this bias have not been researched. In this study, the consequential validity of the Happy Planet Index, 2012 as an indicator of global inequality is evaluated from the Rasch measurement perspective. Differential Item Functioning by world region and bias in the estimated magnitude of inequalities were analyzed. The recalculated measure showed a good fit to Rasch model assumptions. The original index underestimated relative inequalities between world regions by 20%. DIF had no effect on relative measures but affected absolute measures by overestimating world average happiness and underestimating its variance. These findings suggest measurement confounding by unmeasured characteristics. Metric disadvantages must be adjusted to make fair comparisons. Public policy decisions based on biased estimations could have relevant negative consequences on people’s health and well-being by not focusing efforts on real vulnerable populations.

  4. Comparison of frequency-distance relationship and Gaussian-diffusion-based methods of compensation for distance-dependent spatial resolution in SPECT imaging

    International Nuclear Information System (INIS)

Kohli, Vandana; King, Michael A.; Glick, Stephen J.; Pan, Tin-Su

    1998-01-01

    The goal of this investigation was to compare resolution recovery versus noise level of two methods for compensation of distance-dependent resolution (DDR) in SPECT imaging. The two methods of compensation were restoration filtering based on the frequency-distance relationship (FDR) prior to iterative reconstruction, and modelling DDR in the projector/backprojector pair employed in iterative reconstruction. FDR restoration filtering was computationally faster than modelling the detector response in iterative reconstruction. Using Gaussian diffusion to model the detector response in iterative reconstruction sped up the process by a factor of 2.5 over frequency domain filtering in the projector/backprojector pair. Gaussian diffusion modelling resulted in a better resolution versus noise tradeoff than either FDR restoration filtering or solely modelling attenuation in the projector/backprojector pair of iterative reconstruction. For the pixel size investigated herein (0.317 cm), accounting for DDR in the projector/backprojector pair by Gaussian diffusion, or by applying a blurring function based on the distance from the face of the collimator at each distance, resulted in very similar resolution recovery and slice noise level. (author)

  5. Prediction of speech intelligibility based on a correlation metric in the envelope power spectrum domain

    DEFF Research Database (Denmark)

    Relano-Iborra, Helia; May, Tobias; Zaar, Johannes

A powerful tool to investigate speech perception is the use of speech intelligibility prediction models. Recently, a model was presented, termed the correlation-based speech-based envelope power spectrum model (sEPSMcorr) [1], based on the auditory processing of the multi-resolution speech-based Envel...

  6. Numerical Calabi-Yau metrics

    International Nuclear Information System (INIS)

    Douglas, Michael R.; Karp, Robert L.; Lukic, Sergio; Reinbacher, Rene

    2008-01-01

    We develop numerical methods for approximating Ricci flat metrics on Calabi-Yau hypersurfaces in projective spaces. Our approach is based on finding balanced metrics and builds on recent theoretical work by Donaldson. We illustrate our methods in detail for a one parameter family of quintics. We also suggest several ways to extend our results

  7. High resolution metric imaging payload

    Science.gov (United States)

    Delclaud, Y.

    2017-11-01

Alcatel Space Industries has become Europe's leader in the field of high and very high resolution optical payloads, in the framework of earth observation systems able to provide military governments with metric images from space. This leadership allowed ALCATEL to propose for the export market, within a French collaboration framework, a complete space-based system for metric observation.

  8. Validation of network communicability metrics for the analysis of brain structural networks.

    Directory of Open Access Journals (Sweden)

    Jennifer Andreotti

Full Text Available Computational network analysis provides new methods to analyze the brain's structural organization based on diffusion imaging tractography data. Networks are characterized by global and local metrics that have recently given promising insights into diagnosis and the further understanding of psychiatric and neurologic disorders. Most of these metrics are based on the idea that information in a network flows along the shortest paths. In contrast to this notion, communicability is a broader measure of connectivity which assumes that information could flow along all possible paths between two nodes. In our work, the features of network metrics related to communicability were explored for the first time in the healthy structural brain network. In addition, the sensitivity of such metrics was analysed using simulated lesions to specific nodes and network connections. Results showed advantages of communicability over conventional metrics in detecting densely connected nodes as well as subsets of nodes vulnerable to lesions. In addition, communicability centrality was shown to be widely affected by the lesions and the changes were negatively correlated with the distance from the lesion site. In summary, our analysis suggests that communicability metrics may provide insight into the integrative properties of the structural brain network and that these metrics may be useful for the analysis of brain networks in the presence of lesions. Nevertheless, the interpretation of communicability is not straightforward; hence these metrics should be used as a supplement to the more standard connectivity network metrics.
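Communicability between two nodes is commonly defined as the (p, q) entry of the matrix exponential of the adjacency matrix, which weights walks of length k by 1/k! and so lets information flow along all paths. A numpy-only sketch for undirected graphs (the toy graph is illustrative, not brain data):

```python
import numpy as np

def communicability(adj):
    """exp(A) for a symmetric adjacency matrix A: entry (p, q) counts
    walks of every length k between p and q, weighted by 1/k!, so
    information may flow along all paths, not only the shortest one."""
    A = np.asarray(adj, dtype=float)
    w, V = np.linalg.eigh(A)          # spectral decomposition of A
    return (V * np.exp(w)) @ V.T      # exp(A) = V diag(e^w) V^T

# Path graph 0-1-2: nodes 0 and 2 still communicate, via walks through 1.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
C = communicability(A)
```

The diagonal of `C` is the subgraph centrality of each node; off-diagonal entries quantify how well two regions communicate through the whole network.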

  9. SU-E-T-789: Validation of 3DVH Accuracy On Quantifying Delivery Errors Based On Clinical Relevant DVH Metrics

    International Nuclear Information System (INIS)

    Ma, T; Kumaraswamy, L

    2015-01-01

Purpose: Detection of treatment delivery errors is important in radiation therapy. However, accurate quantification of delivery errors is also of great importance. This study aims to evaluate the 3DVH software's ability to accurately quantify delivery errors. Methods: Three VMAT plans (prostate, H&N and brain) were randomly chosen for this study. First, we evaluated whether delivery errors could be detected by gamma evaluation. Conventional per-beam IMRT QA was performed with the ArcCHECK diode detector for the original plans and for the following modified plans: (1) induced dose difference errors of up to ±4.0%, (2) control point (CP) deletion (3 to 10 CPs were deleted) and (3) gantry angle shift errors (3 degree uniform shift). 2D and 3D gamma evaluations were performed for all plans through SNC Patient and 3DVH, respectively. Subsequently, we investigated the accuracy of 3DVH analysis for all cases. This part evaluated, using the Eclipse TPS plans as the standard, whether 3DVH can accurately model the changes in clinically relevant metrics caused by the delivery errors. Results: 2D evaluation seemed to be more sensitive to delivery errors. The average differences between Eclipse-predicted and 3DVH results for each pair of specific DVH constraints were within 2% for all three types of error-induced treatment plans, illustrating that 3DVH is fairly accurate in quantifying the delivery errors. Another interesting observation was that even though the gamma pass rates for the error plans were high, the DVHs showed significant differences between the original plan and the error-induced plans in both Eclipse and 3DVH analysis. Conclusion: The 3DVH software is shown to accurately quantify the error in delivered dose based on clinically relevant DVH metrics, which a conventional gamma-based pre-treatment QA might not necessarily detect.
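For reference, the gamma criterion the abstract relies on combines a dose-difference tolerance and a distance-to-agreement tolerance. A toy 1-D global-gamma sketch (clinical tools such as SNC Patient operate on 2-D/3-D dose grids; the profile below only illustrates the pass/fail logic):

```python
import numpy as np

def gamma_index(ref, meas, positions, dose_tol=0.03, dist_tol=3.0):
    """1-D global gamma: for each measured point, the minimum combined
    dose-difference / distance-to-agreement score over all reference
    points. gamma <= 1 counts as a pass (e.g. the 3%/3 mm criterion)."""
    ref = np.asarray(ref, float)
    meas = np.asarray(meas, float)
    positions = np.asarray(positions, float)
    norm = dose_tol * ref.max()            # global dose normalization
    gammas = []
    for i, d_m in enumerate(meas):
        dd = (d_m - ref) / norm                      # dose differences
        dx = (positions[i] - positions) / dist_tol   # spatial distances
        gammas.append(np.sqrt(dd ** 2 + dx ** 2).min())
    return np.array(gammas)
```

With a 3%/3 mm criterion, a uniform 2% dose error still passes everywhere while a 5% error fails at the peak, which mirrors the abstract's point that high pass rates can coexist with clinically meaningful DVH changes.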

  10. Protein structure modelling and evaluation based on a 4-distance description of side-chain interactions

    Directory of Open Access Journals (Sweden)

    Inbar Yuval

    2010-07-01

    Full Text Available Abstract Background Accurate evaluation and modelling of residue-residue interactions within and between proteins is a key aspect of computational structure prediction including homology modelling, protein-protein docking, refinement of low-resolution structures, and computational protein design. Results Here we introduce a method for accurate protein structure modelling and evaluation based on a novel 4-distance description of residue-residue interaction geometry. Statistical 4-distance preferences were extracted from high-resolution protein structures and were used as a basis for a knowledge-based potential, called Hunter. We demonstrate that 4-distance description of side chain interactions can be used reliably to discriminate the native structure from a set of decoys. Hunter ranked the native structure as the top one in 217 out of 220 high-resolution decoy sets, in 25 out of 28 "Decoys 'R' Us" decoy sets and in 24 out of 27 high-resolution CASP7/8 decoy sets. The same concept was applied to side chain modelling in protein structures. On a set of very high-resolution protein structures the average RMSD was 1.47 Å for all residues and 0.73 Å for buried residues, which is in the range of attainable accuracy for a model. Finally, we show that Hunter performs as good or better than other top methods in homology modelling based on results from the CASP7 experiment. The supporting web site http://bioinfo.weizmann.ac.il/hunter/ was developed to enable the use of Hunter and for visualization and interactive exploration of 4-distance distributions. Conclusions Our results suggest that Hunter can be used as a tool for evaluation and for accurate modelling of residue-residue interactions in protein structures. The same methodology is applicable to other areas involving high-resolution modelling of biomolecules.

  11. Strategies Used by Pet Dogs for Solving Olfaction-Based Problems at Various Distances.

    Directory of Open Access Journals (Sweden)

    Zita Polgár

Full Text Available The olfactory acuity of domestic dogs has been well established through numerous studies on trained canines; however, whether untrained dogs spontaneously utilize this ability for problem solving is less clear. In the present paper we report two studies that examine what strategies family dogs use in two types of olfaction-based problems, as well as their success at various distances. In Study 1, thirty dogs were tasked with distinguishing a target, either their covered owner (Exp 1) or baited food (Exp 2), from three visually identical choices at distances of 0 m (touching distance), 1 m, and 3 m. There were nine consecutive trials for each target. We found that in Exp 1 the dogs successfully chose their owners over strangers at 0 m and 1 m, but not at 3 m, where they used a win-stay strategy instead. In Exp 2 the dogs were only successful in choosing the baited pot at 0 m. They used the win-stay strategy at 1 m, but chose randomly at 3 m. In Study 2, a different group of dogs was tested with their owners (Exp 1) and baited food (Exp 2) at just the 3 m distance, with two possible targets in 10 trials each. In Exp 1 the dogs' overall performance was at chance level; however, when analyzed by trial, we noticed that despite tending to find their owners on the first trial, they generally switched to a win-stay strategy in subsequent trials, only to return to correctly choosing their owners based on olfaction in the later trials. In Exp 2, the dogs chose randomly throughout. We also found that dogs who relied on visual information in the warm-up trials were less successful in the olfaction-based test. Our results suggest that despite their ability to successfully collect information through olfaction, family dogs often prioritize other strategies to solve basic choice tasks.

  12. Studying the Post-Fire Response of Vegetation in California Protected Areas with NDVI-based Pheno-Metrics

    Science.gov (United States)

    Jia, S.; Gillespie, T. W.

    2016-12-01

Post-fire response from vegetation is determined by the intensity and timing of fires as well as the nature of local biomes. Though field-based studies focusing on selected study sites have helped to understand the mechanisms of post-fire response, there is a need to extend the analysis to a broader spatial extent with the assistance of remotely sensed imagery of fires and vegetation. Pheno-metrics, a series of variables on the growing cycle extracted from basic satellite measurements of vegetation coverage, translate the basic remote sensing measurements such as NDVI to the language of phenology and fire ecology in a quantitative form. In this study, we analyzed the rate of biomass removal after ignition and the speed of post-fire recovery in California protected areas from 2000 to 2014 with USGS MTBS fire data and USGS eMODIS pheno-metrics. The NDVI drop caused by fire showed that the aboveground biomass of evergreen forest was removed much more slowly than that of shrubland because of its higher moisture level and greater fuel density. In addition, the above two major land cover types experienced a greatly weakened immediate post-fire growing season, featuring a later start and peak of season, a shorter length of season, and a lower start and peak of NDVI. Such weakening was highly correlated with burn severity, and also influenced by the season of fire and the land cover type, according to our modeling of the relationship between pheno-metric anomalies and the differenced normalized burn ratio (dNBR). The influence generally decayed over time, but can remain high within the first 5 years after fire, mostly because of the introduction of exotic species when the native species were missing. Location-specific variables are necessary to better address the variance within the same fire and improve the outcomes of models. This study can help ecologists in validating the theories of post-fire vegetation response mechanisms and assist local fire managers in post-fire vegetation recovery.

  13. A Study of Faculty Attitudes toward Internet-Based Distance Education: A Survey of Two Jordanian Public Universities

    Science.gov (United States)

    Gasaymeh, Al-Mothana M.

    2009-01-01

    The purpose of this study was to examine the attitudes toward internet-based distance education by the faculty members of two Jordanian public universities, Al-Hussein Bin Talal University and Yarmouk University, as well as to explore the relationship between their attitudes toward internet-based distance education and their perceptions of their…

  14. An Initialization Method Based on Hybrid Distance for k-Means Algorithm.

    Science.gov (United States)

    Yang, Jie; Ma, Yan; Zhang, Xiangfen; Li, Shunbao; Zhang, Yuping

    2017-11-01

The traditional k-means algorithm has been widely used as a simple and efficient clustering method. However, the performance of this algorithm is highly dependent on the selection of initial cluster centers. Therefore, the method adopted for choosing initial cluster centers is extremely important. In this letter, we redefine the density of points according to the number of its neighbors, as well as the distance between points and their neighbors. In addition, we define a new distance measure that considers both Euclidean distance and density. Based on that, we propose an algorithm for selecting initial cluster centers that can dynamically adjust the weighting parameter. Furthermore, we propose a new internal clustering validation measure, the clustering validation index based on the neighbors (CVN), which can be exploited to select the optimal result among multiple clustering results. Experimental results show that the proposed algorithm outperforms existing initialization methods on real-world data sets and demonstrates the adaptability of the proposed algorithm to data sets with various characteristics.
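The flavor of such density-aware initialization can be sketched in a few lines: pick the densest point first, then repeatedly pick points that are both dense and far from the centers already chosen. This is an illustrative hybrid of Euclidean distance and local density, not the letter's exact formulation:

```python
import numpy as np

def density_init(X, k, n_neighbors=5):
    """Choose k initial k-means centers, favouring points that are
    dense (close to their nearest neighbors) and far from the centers
    already selected. Illustrative sketch, not the published method."""
    # Full pairwise Euclidean distance matrix.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Density: inverse of the mean distance to the n nearest neighbors
    # (column 0 after sorting is the zero self-distance, so skip it).
    knn = np.sort(D, axis=1)[:, 1:n_neighbors + 1]
    density = 1.0 / (knn.mean(axis=1) + 1e-12)
    centers = [int(np.argmax(density))]
    for _ in range(k - 1):
        d_near = D[:, centers].min(axis=1)   # distance to nearest center
        centers.append(int(np.argmax(d_near * density)))
    return X[centers]
```

On data with two well-separated clusters, the second center lands in the opposite cluster because the `d_near * density` score vanishes near already-chosen centers.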

  15. Water quality assessment with hierarchical cluster analysis based on Mahalanobis distance.

    Science.gov (United States)

    Du, Xiangjun; Shao, Fengjing; Wu, Shunyao; Zhang, Hanlin; Xu, Si

    2017-07-01

Water quality assessment is crucial for assessment of marine eutrophication, prediction of harmful algal blooms, and environment protection. Previous studies have developed many numeric modeling methods and data driven approaches for water quality assessment. The cluster analysis, an approach widely used for grouping data, has also been employed. However, there are complex correlations between water quality variables, which play important roles in water quality assessment but have always been overlooked. In this paper, we analyze correlations between water quality variables and propose an alternative method for water quality assessment with hierarchical cluster analysis based on Mahalanobis distance. Further, we cluster water quality data collected from coastal waters of the Bohai Sea and North Yellow Sea of China, and apply clustering results to evaluate its water quality. To evaluate the validity, we also cluster the water quality data with cluster analysis based on Euclidean distance, which is widely adopted in previous studies. The results show that our method is more suitable for water quality assessment with many correlated water quality variables. To our knowledge, it is the first attempt to apply Mahalanobis distance for coastal water quality assessment.
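The distinction the authors exploit is easy to show in a few lines: with correlated variables, Mahalanobis distance discounts displacements that follow the correlation structure, while Euclidean distance treats them the same. The data below are synthetic stand-ins for correlated water quality indicators:

```python
import numpy as np

def mahalanobis(x, y, cov_inv):
    """Mahalanobis distance: Euclidean distance after whitening by the
    inverse covariance, so correlated variables are not double-counted."""
    d = np.asarray(x, float) - np.asarray(y, float)
    return float(np.sqrt(d @ cov_inv @ d))

# Two strongly correlated variables (synthetic illustration).
rng = np.random.default_rng(1)
z = rng.normal(size=500)
X = np.column_stack([z, 0.9 * z + 0.1 * rng.normal(size=500)])
cov_inv = np.linalg.inv(np.cov(X, rowvar=False))

# Equal Euclidean length, very different Mahalanobis length:
along = mahalanobis([0, 0], [1.0, 0.9], cov_inv)    # with the correlation
across = mahalanobis([0, 0], [1.0, -0.9], cov_inv)  # against it
```

A matrix of such pairwise distances can be fed to a hierarchical clustering routine in place of the usual Euclidean distance matrix, which is the substitution the paper proposes.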

  16. Prediction Model of Collapse Risk Based on Information Entropy and Distance Discriminant Analysis Method

    Directory of Open Access Journals (Sweden)

    Hujun He

    2017-01-01

    Full Text Available The prediction and risk classification of collapse is an important issue in the process of highway construction in mountainous regions. Based on the principles of information entropy and Mahalanobis distance discriminant analysis, we have produced a collapse hazard prediction model. We used the entropy measure method to reduce the influence indexes of the collapse activity and extracted the nine main indexes affecting collapse activity as the discriminant factors of the distance discriminant analysis model (i.e., slope shape, aspect, gradient, and height, along with exposure of the structural face, stratum lithology, relationship between weakness face and free face, vegetation cover rate, and degree of rock weathering. We employ postearthquake collapse data in relation to construction of the Yingxiu-Wolong highway, Hanchuan County, China, as training samples for analysis. The results were analyzed using the back substitution estimation method, showing high accuracy and no errors, and were the same as the prediction result of uncertainty measure. Results show that the classification model based on information entropy and distance discriminant analysis achieves the purpose of index optimization and has excellent performance, high prediction accuracy, and a zero false-positive rate. The model can be used as a tool for future evaluation of collapse risk.
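The entropy-based index reduction mentioned above follows the standard entropy weight idea: indexes whose values vary more across samples carry more information and receive larger weights, so near-constant indexes can be dropped. An illustrative version (the paper's exact measure may differ):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: columns (indexes) with more variation
    across samples get lower entropy and therefore higher weight."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)                        # each sample's share
    n = len(X)
    logs = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
    E = -(P * logs).sum(axis=0) / np.log(n)      # entropy per index, in [0, 1]
    return (1 - E) / (1 - E).sum()               # normalized weights

# A constant index carries no discriminating information:
X = np.array([[1.0, 1.0],
              [1.0, 5.0],
              [1.0, 9.0]])
w = entropy_weights(X)
```

Ranking indexes by these weights and keeping the top ones is one way to arrive at the nine discriminant factors used in the distance discriminant model.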

  17. Combining energy and power based safety metrics in controller design for domestic robots

    NARCIS (Netherlands)

    Tadele, T.S.; de Vries, Theodorus J.A.; Stramigioli, Stefano

This paper presents a general passivity-based interaction controller design approach that utilizes combined energy- and power-based safety norms to assert the safety of domestic robots. Since these robots are expected to co-habit the same environment with a human user, analysing and ensuring their

  18. Evaluating visual discomfort in stereoscopic projection-based CAVE system with a close viewing distance

    Science.gov (United States)

    Song, Weitao; Weng, Dongdong; Feng, Dan; Li, Yuqian; Liu, Yue; Wang, Yongtian

    2015-05-01

As one of the popular immersive Virtual Reality (VR) systems, the stereoscopic cave automatic virtual environment (CAVE) typically consists of 4 to 6 3 m-by-3 m sides of a room made of rear-projection screens. While many endeavors have been made to reduce the size of the projection-based CAVE system, the issue of asthenopia caused by lengthy exposure to stereoscopic images in such a CAVE with a close viewing distance has seldom been tackled. In this paper, we propose a lightweight approach which utilizes a convex eyepiece to reduce visual discomfort induced by stereoscopic vision. An empirical experiment was conducted to examine the feasibility of the convex eyepiece in a large depth of field (DOF) at close viewing distance, both objectively and subjectively. The result shows the positive effects of the convex eyepiece on the relief of eyestrain.

  19. Wide-Spectrum Microscope with a Long Working Distance Aspherical Objective Based on Obscuration Constraint

    Directory of Open Access Journals (Sweden)

    Weibo Wang

    2016-11-01

Full Text Available We present an approach for an initial configuration design based on an obscuration constraint and on-axis Taylor series expansion, to realize the design of a long working distance microscope (numerical aperture (NA) = 0.13, working distance (WD) = 525 mm) with a low-obscuration aspherical Schwarzschild objective for wide-spectrum imaging (λ = 400–900 nm). Experiments testing resolution on a United States Air Force (USAF) resolution chart, using a line charge-coupled device (CCD) (pixel size of 14 μm × 56 μm) and light sources of different wavelengths (λ = 480 nm, 550 nm, 660 nm, 850 nm), were implemented to verify the validity of the proposed method.

  20. Distance Constrained Based Adaptive Flocking Control for Multiagent Networks with Time Delay

    Directory of Open Access Journals (Sweden)

    Qing Zhang

    2015-01-01

Full Text Available The flocking control of multiagent systems is a new type of decentralized control method that has attracted great attention. This paper presents a detailed study of distance-constrained adaptive flocking control for multiagent systems with time delay. First, a scheme for adaptive flocking of multiagent systems with time delay is proposed. Second, a class of adaptive controllers and updating laws is presented. Using the Lyapunov stability theory, it is proved that the distance between agents remains larger than a constant during the motion evolution. Moreover, the velocities of all agents asymptotically converge to the same value. Finally, the analytical results are verified by a numerical example.
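The asymptotic velocity matching claimed above can be illustrated with a toy linear consensus iteration. This is a deliberate simplification with no delays, distance constraints or adaptive gains, showing only the velocity-agreement mechanism:

```python
import numpy as np

def simulate_consensus(v0, adj, steps=200, gain=0.1):
    """Iterate the discrete-time consensus update
    v_i <- v_i + gain * sum_j a_ij * (v_j - v_i),
    a toy stand-in for the flocking velocity dynamics."""
    v = np.asarray(v0, dtype=float)
    A = np.asarray(adj, dtype=float)
    deg = A.sum(axis=1)
    for _ in range(steps):
        v = v + gain * (A @ v - deg * v)
    return v
```

For a connected graph with a small enough gain, all velocities converge to the average of the initial velocities, which is the discrete analogue of the asymptotic matching proved via Lyapunov arguments.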

  1. Developing an outcome-based biodiversity metric in support of the field to market project: Final report

    Science.gov (United States)

    Drew, C. Ashton; Alexander-Vaughn, Louise B.; Collazo, Jaime A.; McKerrow, Alexa; Anderson, John

    2013-01-01

depends on that animal’s resource specialization, mobility, and life history strategies (Jeanneret et al. 2003a, b; Jennings & Pocock 2009). The knowledge necessary to define the biodiversity contribution of agricultural lands is specialized, dispersed, and nuanced, and thus not readily accessible. Given access to clearly defined biodiversity tradeoffs between alternative agricultural practices, landowners, land managers and farm operators could collectively enhance the conservation and economic value of agricultural landscapes. Therefore, Field to Market: The Keystone Alliance for Sustainable Agriculture and The Nature Conservancy jointly funded a pilot project to develop a biodiversity metric to integrate into Field to Market’s existing sustainability calculator, The Fieldprint Calculator (http://www.fieldtomarket.org/). Field to Market: The Keystone Alliance for Sustainable Agriculture is an alliance among producers, agribusinesses, food companies, and conservation organizations seeking to create sustainable outcomes for agriculture. The Fieldprint Calculator supports the Keystone Alliance’s vision to achieve safe, accessible, and nutritious food, fiber and fuel in thriving ecosystems to meet the needs of 9 billion people in 2050. In support of this same vision, our project provides proof-of-concept for an outcome-based biodiversity metric for Field to Market to quantify biodiversity impacts of commercial row crop production on terrestrial vertebrate richness. Little research exists examining the impacts of alternative commercial agricultural practices on overall terrestrial biodiversity (McLaughlin & Mineau 1995). Instead, most studies compare organic versus conventional practices (e.g. Freemark & Kirk 2001; Wickramasinghe et al. 2004), and most studies focus on flora, avian, or invertebrate communities (Jeanneret et al. 2003a; Maes et al. 2008; Pollard & Relton 1970). Therefore, we used an expert-knowledge-based approach to develop a metric that predicts

  2. Studying the added value of visual attention in objective image quality metrics based on eye movement data

    NARCIS (Netherlands)

    Liu, H.; Heynderickx, I.E.J.

    2009-01-01

    Current research on image quality assessment tends to include visual attention in objective metrics to further enhance their performance. A variety of computational models of visual attention are implemented in different metrics, but their accuracy in representing human visual attention is not fully

  3. Timing Metrics of Joint Timing and Carrier-Frequency Offset Estimation Algorithms for TDD-based OFDM systems

    NARCIS (Netherlands)

    Hoeksema, F.W.; Srinivasan, R.; Schiphorst, Roelof; Slump, Cornelis H.

    2004-01-01

    In joint timing and carrier offset estimation algorithms for Time Division Duplexing (TDD) OFDM systems, different timing metrics are proposed to determine the beginning of a burst or symbol. In this contribution we investigated the different timing metrics in order to establish their impact on the

  4. MO-FG-202-07: Real-Time EPID-Based Detection Metric For VMAT Delivery Errors

    International Nuclear Information System (INIS)

    Passarge, M; Fix, M K; Manser, P; Stampanoni, M F M; Siebers, J V

    2016-01-01

    Purpose: To create and test an accurate EPID-frame-based VMAT QA metric to detect gross dose errors in real-time and to provide information about the source of error. Methods: A Swiss cheese model was created for an EPID-based real-time QA process. The system compares a treatmentplan- based reference set of EPID images with images acquired over each 2° gantry angle interval. The metric utilizes a sequence of independent consecutively executed error detection Methods: a masking technique that verifies infield radiation delivery and ensures no out-of-field radiation; output normalization checks at two different stages; global image alignment to quantify rotation, scaling and translation; standard gamma evaluation (3%, 3 mm) and pixel intensity deviation checks including and excluding high dose gradient regions. Tolerances for each test were determined. For algorithm testing, twelve different types of errors were selected to modify the original plan. Corresponding predictions for each test case were generated, which included measurement-based noise. Each test case was run multiple times (with different noise per run) to assess the ability to detect introduced errors. Results: Averaged over five test runs, 99.1% of all plan variations that resulted in patient dose errors were detected within 2° and 100% within 4° (∼1% of patient dose delivery). Including cases that led to slightly modified but clinically equivalent plans, 91.5% were detected by the system within 2°. Based on the type of method that detected the error, determination of error sources was achieved. Conclusion: An EPID-based during-treatment error detection system for VMAT deliveries was successfully designed and tested. The system utilizes a sequence of methods to identify and prevent gross treatment delivery errors. The system was inspected for robustness with realistic noise variations, demonstrating that it has the potential to detect a large majority of errors in real-time and indicate the error

  5. MO-FG-202-07: Real-Time EPID-Based Detection Metric For VMAT Delivery Errors

    Energy Technology Data Exchange (ETDEWEB)

    Passarge, M; Fix, M K; Manser, P [Division of Medical Radiation Physics and Department of Radiation Oncology, Inselspital, Bern University Hospital, and University of Bern, Bern (Switzerland); Stampanoni, M F M [Institute for Biomedical Engineering, ETH Zurich, and PSI, Villigen (Switzerland); Siebers, J V [Department of Radiation Oncology, University of Virginia, Charlottesville, VA (United States)

    2016-06-15

    Purpose: To create and test an accurate EPID-frame-based VMAT QA metric to detect gross dose errors in real-time and to provide information about the source of error. Methods: A Swiss cheese model was created for an EPID-based real-time QA process. The system compares a treatmentplan- based reference set of EPID images with images acquired over each 2° gantry angle interval. The metric utilizes a sequence of independent consecutively executed error detection Methods: a masking technique that verifies infield radiation delivery and ensures no out-of-field radiation; output normalization checks at two different stages; global image alignment to quantify rotation, scaling and translation; standard gamma evaluation (3%, 3 mm) and pixel intensity deviation checks including and excluding high dose gradient regions. Tolerances for each test were determined. For algorithm testing, twelve different types of errors were selected to modify the original plan. Corresponding predictions for each test case were generated, which included measurement-based noise. Each test case was run multiple times (with different noise per run) to assess the ability to detect introduced errors. Results: Averaged over five test runs, 99.1% of all plan variations that resulted in patient dose errors were detected within 2° and 100% within 4° (∼1% of patient dose delivery). Including cases that led to slightly modified but clinically equivalent plans, 91.5% were detected by the system within 2°. Based on the type of method that detected the error, determination of error sources was achieved. Conclusion: An EPID-based during-treatment error detection system for VMAT deliveries was successfully designed and tested. The system utilizes a sequence of methods to identify and prevent gross treatment delivery errors. The system was inspected for robustness with realistic noise variations, demonstrating that it has the potential to detect a large majority of errors in real-time and indicate the error

  6. Detecting Malicious Nodes in Medical Smartphone Networks Through Euclidean Distance-Based Behavioral Profiling

    DEFF Research Database (Denmark)

    Meng, Weizhi; Li, Wenjuan; Wang, Yu

    2017-01-01

    and healthcare personnel. The underlying network architecture to support such devices is also referred to as medical smartphone networks (MSNs). Similar to other networks, MSNs also suffer from various attacks like insider attacks (e.g., leakage of sensitive patient information by a malicious insider......). In this work, we focus on MSNs and design a trust-based intrusion detection approach through Euclidean distance-based behavioral profiling to detect malicious devices (also called nodes). In the evaluation, we collaborate with healthcare organizations and implement our approach in a real simulated MSN...
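
The Euclidean distance-based behavioral profiling described above can be reduced to a minimal sketch: each node's current behavior vector is compared against its learned profile, and a large deviation marks the node as potentially malicious. The feature layout and threshold below are assumptions for illustration, not the paper's values:

```python
import math

def euclidean(a, b):
    """Euclidean distance between two behavior vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_suspicious(profile, observed, threshold=1.5):
    """Flag a node whose observed behavior deviates too far from its profile.

    profile/observed: normalized feature vectors, e.g.
    (camera use, app-visit rate, mail rate, ...) -- illustrative features.
    """
    return euclidean(profile, observed) > threshold
```

A node whose behavior matches its profile closely stays below the threshold; a large jump in several features at once trips the detector.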

  7. Distance selection based on relevance feedback in the context of CBIR using the SFS meta-heuristic with one round

    Directory of Open Access Journals (Sweden)

    Mawloud Mosbah

    2017-03-01

    Full Text Available In this paper, we address distance selection in the context of Content-Based Image Retrieval (CBIR). Instead of addressing the feature-selection issue, we deal here with distance selection as a novel paradigm poorly addressed within the CBIR field. Whereas the distance concept is a very precise and sharp mathematical tool, we extend the study to weaker measures: similarity, quasi-distance, and divergence. Therefore, as many as eighteen (18) such measures are considered: distances: {Euclidean, …}, similarities: {Ruzika, …}, quasi-distances: {Neyman-X2, …} and divergences: {Jeffrey, …}. We specifically propose a hybrid system based on the Sequential Forward Selector (SFS) meta-heuristic with one round and relevance feedback. The experiments conducted on the Wang database (Corel-1K) using color moments as a signature show that our system yields promising results in terms of effectiveness.
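
A one-round (single-pass) Sequential Forward Selector over candidate distance measures might look like the sketch below; `evaluate` stands in for a retrieval-effectiveness score computed from relevance feedback and is an assumed placeholder, not the paper's code:

```python
def sfs_one_round(candidates, evaluate):
    """Greedy single-pass forward selection of distance measures.

    candidates: list of distance-measure names (or callables)
    evaluate:   caller-supplied function scoring a subset of measures,
                e.g. by retrieval precision on relevance-feedback data
    """
    selected, best_score = [], evaluate([])
    for measure in candidates:          # a single pass = "one round"
        trial = selected + [measure]
        score = evaluate(trial)
        if score > best_score:          # keep the measure only if it helps
            selected, best_score = trial, score
    return selected, best_score
```

Each candidate is tried exactly once, so the cost is one evaluation per measure instead of the quadratic cost of full SFS.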

  8. Toward an ozone standard to protect vegetation based on effective dose: a review of deposition resistances and a possible metric

    Science.gov (United States)

    Massman, W. J.

    Present air quality standards to protect vegetation from ozone are based on measured concentrations (i.e., exposure) rather than on plant uptake rates (or dose). Some familiar cumulative exposure-based indices include SUM06, AOT40, and W126. However, plant injury is more closely related to dose, or more appropriately to effective dose, than to exposure. This study develops and applies a simple model for estimating effective ozone dose that combines the plant canopy's rate of stomatal ozone uptake with the plant's defense to ozone uptake. Here the plant defense is explicitly parameterized as a function of gross photosynthesis and the model is applied using eddy covariance (ozone and CO2) flux data obtained at a vineyard site in the San Joaquin Valley during the California Ozone Deposition Experiment (CODE91). With the ultimate intention of applying these concepts using prognostic models and remotely sensed data, the pathways for ozone deposition are parameterized (as much as possible) in terms of canopy LAI and the surface friction velocity. Results indicate that (1) the daily maximum potential for plant injury (based on effective dose) tends to coincide with the daily peak in ozone mixing ratio (ppbV), (2) potentially there are some significant differences between ozone metrics based on dose (no plant defense) and effective dose, and (3) nocturnal conductance can contribute significantly to the potential for plant ozone injury.

  9. The challenge of defining risk-based metrics to improve food safety: inputs from the BASELINE project.

    Science.gov (United States)

    Manfreda, Gerardo; De Cesare, Alessandra

    2014-08-01

    In 2002, Regulation (EC) No 178 of the European Parliament and of the Council stated that, in order to achieve the general objective of a high level of protection of human health and life, food law shall be based on risk analysis. However, Commission Regulation No 2073/2005 on microbiological criteria for foodstuffs requires that food business operators ensure that foodstuffs comply with the relevant microbiological criteria. Such criteria define the acceptability of a product, a batch of foodstuffs or a process, based on the absence, presence or number of micro-organisms, and/or on the quantity of their toxins/metabolites, per unit(s) of mass, volume, area or batch. The same Regulation describes a food safety criterion as a means of defining the acceptability of a product or a batch of foodstuff applicable to products placed on the market; moreover, it defines a process hygiene criterion as a means of indicating the acceptable functioning of the production process. Neither food safety criteria nor process hygiene criteria are based on risk analysis. On the contrary, the metrics formulated by the Codex Alimentarius Commission in 2004, named Food Safety Objective (FSO) and Performance Objective (PO), are risk-based and fit the indications of Regulation 178/2002. The main aims of this review are to illustrate the key differences between microbiological criteria and the risk-based metrics defined by the Codex Alimentarius Commission and to explore the opportunity, and also the possibility, of implementing future European Regulations that include PO and FSO as supporting parameters to microbiological criteria. This review also clarifies the implications of defining an appropriate level of human protection, how to establish FSO and PO, and how to implement them in practice, linked to each other through quantitative risk assessment models. The contents of this review should clarify the context for application of the results collected during the EU funded project named BASELINE (www

  10. USING COMPUTER-BASED TESTING AS ALTERNATIVE ASSESSMENT METHOD OF STUDENT LEARNING IN DISTANCE EDUCATION

    Directory of Open Access Journals (Sweden)

    Amalia SAPRIATI

    2010-04-01

    Full Text Available This paper addresses the use of computer-based testing in distance education, based on the experience of Universitas Terbuka (UT), Indonesia. Computer-based testing has been developed at UT to meet the specific needs of distance students, namely: students' inability to sit for the scheduled test, conflicting test schedules, and students' desire for the flexibility to retake examinations to improve their grades. In 2004, UT initiated a pilot project to develop a system and program for the computer-based testing method. In 2005 and 2006, tryouts of computer-based testing methods were conducted in 7 Regional Offices that were considered to have sufficient supporting resources. The results of the tryouts revealed that students were enthusiastic about taking computer-based tests and expected that the test method would be provided by UT as an alternative to the traditional paper-and-pencil test method. UT then implemented the computer-based testing method in 6 and 12 Regional Offices in 2007 and 2008, respectively. The computer-based testing was administered in the city of the designated Regional Office and was supervised by Regional Office staff. The development of computer-based testing began with conducting tests using computers in a networked configuration. The system has been continually improved, and it currently uses devices linked to the internet or the World Wide Web. The construction of a test involves the generation and selection of test items from the item bank collection of the UT Examination Center; the combination of the selected items comprises the test specification. Currently UT offers 250 courses involving the use of computer-based testing. Students expect that more courses will be offered with computer-based testing in Regional Offices within easy access by students.

  11. 2008 GEM Modeling Challenge: Metrics Study of the Dst Index in Physics-Based Magnetosphere and Ring Current Models and in Statistical and Analytic Specifications

    Science.gov (United States)

    Rastaetter, L.; Kuznetsova, M.; Hesse, M.; Pulkkinen, A.; Glocer, A.; Yu, Y.; Meng, X.; Raeder, J.; Wiltberger, M.; Welling, D.; et al.

    2011-01-01

    In this paper the metrics-based results of the Dst part of the 2008-2009 GEM Metrics Challenge are reported. The Metrics Challenge asked modelers to submit results for 4 geomagnetic storm events and 5 different types of observations that can be modeled by statistical, climatological or physics-based (e.g. MHD) models of the magnetosphere-ionosphere system. We present the results of over 25 model settings that were run at the Community Coordinated Modeling Center (CCMC) and at the institutions of various modelers for these events. To measure the performance of each of the models against the observations we use comparisons of one-hour averaged model data with the Dst index issued by the World Data Center for Geomagnetism, Kyoto, Japan, and direct comparison of one-minute model data with the one-minute Dst index calculated by the United States Geological Survey (USGS).

  12. Electronic white cane with GPS radar-based concept as blind mobility enhancement without distance limitation

    Science.gov (United States)

    Halim, Suharsono; Handafiah, Finna; Aprilliyani, Ria; Udhiarto, Arief

    2018-02-01

    In July 2012, the Indonesian Ministry of Social Affairs reported that the blind were the largest group among Indonesians with disabilities. The most common tool used to help the blind is the conventional cane, which has limited features and is therefore difficult to use as a mobility aid. Moreover, the conventional cane cannot assist the blind or their families when a blind person gets lost. In this research, we designed and implemented an electronic white cane based on the concepts of radar and the global positioning system (GPS). The purpose of this research was to design and develop an electronic white cane that can enhance the mobility of the blind without distance coverage limitation. Using ultrasonic sensors for distance measurement and a servo motor as an actuator, the resulting radar system is able to map an area with a maximum distance of 5 meters and a coverage angle of 180°. The blind sense obstacles around them from the vibration generated by five vibration motors; the vibration becomes more intense when an obstacle is detected closer. In addition, we implemented GPS to monitor the blind person's position and allow their family to find them easily when they need help. Based on the tests performed, we have successfully developed an electronic white cane that can be a solution to improve the mobility of the blind.

  13. Protograph based LDPC codes with minimum distance linearly growing with block size

    Science.gov (United States)

    Divsalar, Dariush; Jones, Christopher; Dolinar, Sam; Thorpe, Jeremy

    2005-01-01

    We propose several LDPC code constructions that simultaneously achieve good threshold and error floor performance. Minimum distance is shown to grow linearly with block size (similar to regular codes of variable degree at least 3) by considering ensemble average weight enumerators. Our constructions are based on projected graph, or protograph, structures that support high-speed decoder implementations. As with irregular ensembles, our constructions are sensitive to the proportion of degree-2 variable nodes. A code with too few such nodes tends to have an iterative decoding threshold that is far from the capacity threshold. A code with too many such nodes tends to not exhibit a minimum distance that grows linearly in block length. In this paper we also show that precoding can be used to lower the threshold of regular LDPC codes. The decoding thresholds of the proposed codes, which have linearly increasing minimum distance in block size, outperform those of regular LDPC codes. Furthermore, a family of low to high rate codes, with thresholds that adhere closely to their respective channel capacity thresholds, is presented. Simulation results for a few example codes show that the proposed codes have low error floors as well as good threshold SNR performance.

  14. Facial expression recognition and model-based regeneration for distance teaching

    Science.gov (United States)

    De Silva, Liyanage C.; Vinod, V. V.; Sengupta, Kuntal

    1998-12-01

    This paper presents a novel idea for a visual communication system which can support distance teaching using a network of computers. The authors' main focus is to enhance the quality of distance teaching by reducing the barrier between the teacher and the student that is formed by the remote connection of the networked participants. The paper presents an effective way of improving the teacher-student communication link in an IT (Information Technology) based distance teaching scenario, using facial expression recognition results and global and local face motion detection results of both the teacher and the student. It presents a way of regenerating the facial images for the teacher-student down-link, which can enhance the teacher's facial expressions and also reduce network traffic compared to usual video broadcasting scenarios. At the same time, it presents a way of representing the large volume of facial expression data of the whole student population in the student-teacher up-link. This up-link representation helps the teacher to receive instant feedback on his talk, as if he were delivering a face-to-face lecture. In conventional video tele-conferencing applications this task is nearly impossible, due to the huge volume of upward network traffic. The authors draw on several of their previous publications for most of the image processing components needed to complete such a system.

  15. Predicting speech intelligibility based on a correlation metric in the envelope power spectrum domain

    DEFF Research Database (Denmark)

    Relaño-Iborra, Helia; May, Tobias; Zaar, Johannes

    2016-01-01

    A speech intelligibility prediction model is proposed that combines the auditory processing front end of the multi-resolution speech-based envelope power spectrum model [mr-sEPSM; Jørgensen, Ewert, and Dau (2013). J. Acoust. Soc. Am. 134(1), 436–446] with a correlation back end inspired by the sh...

  16. A Single Conjunction Risk Assessment Metric: the F-Value

    Science.gov (United States)

    Frigm, Ryan Clayton; Newman, Lauri K.

    2009-01-01

    The Conjunction Assessment Team at NASA Goddard Space Flight Center provides conjunction risk assessment for many NASA robotic missions. These risk assessments are based on several figures of merit, such as miss distance, probability of collision, and orbit determination solution quality. However, these individual metrics do not singly capture the overall risk associated with a conjunction, making it difficult for someone without this complete understanding to take action, such as an avoidance maneuver. The goal of this analysis is to introduce a single risk index metric that can easily convey the level of risk without all of the technical details. The proposed index is called the conjunction "F-value." This paper presents the concept of the F-value and the tuning of the metric for use in routine Conjunction Assessment operations.

  17. Contribution to a quantitative assessment model for reliability-based metrics of electronic and programmable safety-related functions

    International Nuclear Information System (INIS)

    Hamidi, K.

    2005-10-01

    The use of fault-tolerant EP architectures has induced growing constraints whose influence on reliability-based performance metrics is no longer negligible. To address the growing influence of simultaneous failures, this thesis proposes, for safety-related functions, a new assessment method for reliability based on a more thorough accounting of time aspects. This report introduces the concept of information and uses it to interpret the failure modes of a safety-related function as the direct result of the initiation and propagation of erroneous information down to the actuator level. The main idea is to distinguish the appearance and disappearance of erroneous states, which can be defined as intrinsically dependent on hardware characteristics and maintenance policies, from their possible activation, constrained through architectural choices, leading to the failure of the safety-related function. This approach is based at a low level on deterministic SED models of the architecture and uses non-homogeneous Markov chains to depict the time evolution of error probabilities. (author)

  18. Assessment of multi-version NPP I and C systems safety. Metric-based approach, technique and tool

    International Nuclear Information System (INIS)

    Kharchenko, Vyacheslav; Volkovoy, Andrey; Bakhmach, Eugenii; Siora, Alexander; Duzhyi, Vyacheslav

    2011-01-01

    The challenges related to the problem of assessing the actual diversity level and evaluating the safety of diversity-oriented NPP I and C systems are analyzed. There are risks of inaccurate assessment and problems of insufficiently decreasing the probability of common cause failures (CCFs). The CCF probability of safety-critical systems may be essentially decreased through the application of several different types of diversity (multi-diversity). Different diversity types of FPGA-based NPP I and C systems, the general approach, and the stages of diversity and safety assessment as a whole are described. The objectives of the report are: (a) analysis of the challenges caused by use of the diversity approach in NPP I and C systems in the context of FPGA and other modern technologies; (b) development of a multi-version NPP I and C systems assessment technique and tool based on a check-list and metric-oriented approach; (c) a case study of the technique: assessment of a multi-version FPGA-based NPP I and C system developed using the Radiy TM Platform. (author)

  19. Integration of advanced technologies to enhance problem-based learning over distance: Project TOUCH.

    Science.gov (United States)

    Jacobs, Joshua; Caudell, Thomas; Wilks, David; Keep, Marcus F; Mitchell, Steven; Buchanan, Holly; Saland, Linda; Rosenheimer, Julie; Lozanoff, Beth K; Lozanoff, Scott; Saiki, Stanley; Alverson, Dale

    2003-01-01

    Distance education delivery has increased dramatically in recent years as a result of the rapid advancement of communication technology. The National Computational Science Alliance's Access Grid represents a significant advancement in communication technology with potential for distance medical education. The purpose of this study is to provide an overview of the TOUCH project (Telehealth Outreach for Unified Community Health; http://hsc.unm.edu/touch) with special emphasis on the process of problem-based learning case development for distribution over the Access Grid. The objective of the TOUCH project is to use emerging Internet-based technology to overcome geographic barriers for delivery of tutorial sessions to medical students pursuing rotations at remote sites. The TOUCH project also is aimed at developing a patient simulation engine and an immersive virtual reality environment to achieve a realistic health care scenario enhancing the learning experience. A traumatic head injury case is developed and distributed over the Access Grid as a demonstration of the TOUCH system. Project TOUCH serves as an example of a computer-based learning system for developing and implementing problem-based learning cases within the medical curriculum, but this system should be easily applied to other educational environments and disciplines involving functional and clinical anatomy. Future phases will explore PC versions of the TOUCH cases for increased distribution. Copyright 2003 Wiley-Liss, Inc.

  20. A Metrics-Based Approach to Intrusion Detection System Evaluation for Distributed Real-Time Systems

    Science.gov (United States)

    2002-04-01

    Based Approach to Intrusion Detection System Evaluation for Distributed Real-Time Systems. Authors: G. A. Fink, B. L. Chappell, T. G. Turner, and … Keywords: Distributed, Security. 1 Introduction: Processing and cost requirements are driving future naval combat platforms to use distributed, real-time systems. As these systems grow more complex, the timing requirements do not diminish; indeed, they may become more constrained.

  1. Characterizing the round sphere by mean distance

    DEFF Research Database (Denmark)

    Kokkendorff, Simon Lyngby

    2008-01-01

    We discuss the measure theoretic metric invariants extent, rendezvous number and mean distance of a general compact metric space X and relate these to classical metric invariants such as diameter and radius. In the final section we focus attention to the category of Riemannian manifolds. The main...

  2. Metrics for energy resilience

    International Nuclear Information System (INIS)

    Roege, Paul E.; Collier, Zachary A.; Mancillas, James; McDonagh, John A.; Linkov, Igor

    2014-01-01

    Energy lies at the backbone of any advanced society and constitutes an essential prerequisite for economic growth, social order and national defense. However there is an Achilles heel to today's energy and technology relationship; namely a precarious intimacy between energy and the fiscal, social, and technical systems it supports. Recently, widespread and persistent disruptions in energy systems have highlighted the extent of this dependence and the vulnerability of increasingly optimized systems to changing conditions. Resilience is an emerging concept that offers to reconcile considerations of performance under dynamic environments and across multiple time frames by supplementing traditionally static system performance measures to consider behaviors under changing conditions and complex interactions among physical, information and human domains. This paper identifies metrics useful to implement guidance for energy-related planning, design, investment, and operation. Recommendations are presented using a matrix format to provide a structured and comprehensive framework of metrics relevant to a system's energy resilience. The study synthesizes previously proposed metrics and emergent resilience literature to provide a multi-dimensional model intended for use by leaders and practitioners as they transform our energy posture from one of stasis and reaction to one that is proactive and which fosters sustainable growth. - Highlights: • Resilience is the ability of a system to recover from adversity. • There is a need for methods to quantify and measure system resilience. • We developed a matrix-based approach to generate energy resilience metrics. • These metrics can be used in energy planning, system design, and operations

  3. Phone-based metric as a predictor for basic personality traits

    DEFF Research Database (Denmark)

    Mønsted, Bjarke; Mollgaard, Anders; Mathiesen, Joachim

    2018-01-01

    Basic personality traits are believed to be expressed in, and predictable from, smart phone data. We investigate the extent of this predictability using data (n = 636) from the Copenhagen Network Study, which to our knowledge is the most extensive study concerning smartphone usage and personality...... traits. Based on phone usage patterns, earlier studies have reported surprisingly high predictability of all Big Five personality traits. We predict personality trait tertiles (low, medium, high) from a set of behavioral variables extracted from the data, and find that only extraversion can be predicted...

  5. Ant-Based Phylogenetic Reconstruction (ABPR): A new distance algorithm for phylogenetic estimation based on ant colony optimization

    Directory of Open Access Journals (Sweden)

    Karla Vittori

    2008-12-01

    Full Text Available We propose a new distance algorithm for phylogenetic estimation based on Ant Colony Optimization (ACO), named Ant-Based Phylogenetic Reconstruction (ABPR). ABPR joins two taxa iteratively based on evolutionary distance among sequences, while also accounting for the quality of the phylogenetic tree built according to the total length of the tree. Similar to optimization algorithms for phylogenetic estimation, the algorithm allows exploration of a larger set of nearly optimal solutions. We applied the algorithm to four empirical data sets of mitochondrial DNA ranging from 12 to 186 sequences, and from 898 to 16,608 base pairs, and covering taxonomic levels from populations to orders. We show that ABPR performs better than the commonly used Neighbor-Joining algorithm, except when sequences are too closely related (e.g., population-level sequences). The phylogenetic relationships recovered at and above species level by ABPR agree with conventional views. However, like other algorithms of phylogenetic estimation, the proposed algorithm failed to recover expected relationships when distances are too similar or when rates of evolution are very variable, leading to the problem of long-branch attraction. ABPR, as well as other ACO-based algorithms, is emerging as a fast and accurate alternative method of phylogenetic estimation for large data sets.

  6. Research on Signature Verification Method Based on Discrete Fréchet Distance

    Science.gov (United States)

    Fang, J. L.; Wu, W.

    2018-05-01

    This paper proposes a multi-feature signature template based on the discrete Fréchet distance, which overcomes the limitation of traditional signature authentication that relies on a single signature feature. It reduces the computational workload of extracting global feature templates in online handwritten signature authentication and addresses unreasonable signature feature selection. In the experiments, the false recognition rate (FAR) and false rejection rate (FRR) are calculated statistically, along with the average equal error rate (AEER). The feasibility of the combined template scheme is verified by comparing the average equal error rate of the combined template with that of the original template.
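
The discrete Fréchet distance underlying the template can be computed with the classical dynamic program; the sketch below shows only this distance for 2-D point sequences, not the paper's multi-feature template construction:

```python
def discrete_frechet(p, q):
    """Discrete Fréchet distance between two sequences of 2-D points."""
    n, m = len(p), len(q)
    d = lambda a, b: ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    # ca[i][j] = discrete Fréchet distance of the prefixes p[:i+1], q[:j+1]
    ca = [[0.0] * m for _ in range(n)]
    ca[0][0] = d(p[0], q[0])
    for i in range(1, n):
        ca[i][0] = max(ca[i - 1][0], d(p[i], q[0]))
    for j in range(1, m):
        ca[0][j] = max(ca[0][j - 1], d(p[0], q[j]))
    for i in range(1, n):
        for j in range(1, m):
            ca[i][j] = max(min(ca[i - 1][j], ca[i - 1][j - 1], ca[i][j - 1]),
                           d(p[i], q[j]))
    return ca[n - 1][m - 1]
```

Two identical point sequences are at distance zero; two parallel horizontal traces one unit apart are at distance one.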

  7. A Multiple Criteria Decision Making Method Based on Relative Value Distances

    Directory of Open Access Journals (Sweden)

    Shyur Huan-jyh

    2015-12-01

    Full Text Available This paper proposes a new multiple criteria decision-making method called ERVD (election based on relative value distances). The s-shaped value function is adopted to replace the expected utility function to describe the risk-averse and risk-seeking behavior of decision makers. Comparisons and experiments contrasting it with the TOPSIS (Technique for Order Preference by Similarity to the Ideal Solution) method are carried out to verify the feasibility of using the proposed method to represent decision makers' preferences in the decision-making process. Our experimental results show that the proposed approach is an appropriate and effective MCDM method.
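
An s-shaped value function of the prospect-theory form adopted by ERVD can be sketched as below; the exponents and loss-aversion factor (0.88, 0.88, 2.25) are commonly cited estimates used here as assumed defaults, not necessarily the paper's settings:

```python
def value(x, reference, alpha=0.88, beta=0.88, lam=2.25):
    """S-shaped value of outcome x relative to a reference point.

    Gains are valued concavely (risk-averse), losses convexly and more
    steeply (risk-seeking with loss aversion lam > 1).
    """
    if x >= reference:
        return (x - reference) ** alpha        # gain branch
    return -lam * (reference - x) ** beta      # loss branch
```

The asymmetry is the point: a loss of a given size is weighted more heavily than a gain of the same size, e.g. abs(value(0, 1)) > value(1, 0).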

  8. Finding differentially expressed genes in high dimensional data: Rank based test statistic via a distance measure.

    Science.gov (United States)

    Mathur, Sunil; Sadana, Ajit

    2015-12-01

    We present a rank-based test statistic for the identification of differentially expressed genes using a distance measure. The proposed test statistic is highly robust against extreme values and does not assume the distribution of the parent population. Simulation studies show that the proposed test is more powerful than some of the commonly used methods, such as the paired t-test, the Wilcoxon signed rank test, and significance analysis of microarrays (SAM), under certain non-normal distributions. The asymptotic distribution of the test statistic and the p-value function are discussed. The application of the proposed method is shown using a real-life data set. © The Author(s) 2011.

  9. A Kerr-NUT metric

    International Nuclear Information System (INIS)

    Vaidya, P.C.; Patel, L.K.; Bhatt, P.V.

    1976-01-01

    Using Galilean time and retarded distance as coordinates, the usual Kerr metric is expressed in a form similar to the Newman-Unti-Tamburino (NUT) metric. The combined Kerr-NUT metric is then investigated. In addition to the Kerr and NUT solutions of Einstein's equations, three other types of solutions are derived. These are (i) the radiating Kerr solution, (ii) the radiating NUT solution satisfying R_ik = σ ξ_i ξ_k, ξ_i ξ^i = 0, and (iii) the associated Kerr solution satisfying R_ik = 0. Solution (i) is distinct from and simpler than the one reported earlier by Vaidya and Patel (Phys. Rev. D 7, 3590 (1973)). Solutions (ii) and (iii) give line elements which have the axis of symmetry as a singular line. (author)

  10. Performance Evaluation and Optimal Management of Distance-Based Registration Using a Semi-Markov Process

    Directory of Open Access Journals (Sweden)

    Jae Joon Suh

    2017-01-01

    Full Text Available We consider distance-based registration (DBR), a kind of dynamic location registration scheme in a mobile communication network. In DBR, the location of a mobile station (MS) is updated when it enters a base station at a distance greater than or equal to a specified threshold from the base station where the last location registration for the MS was performed. In this study, we first investigate the existing performance-evaluation methods for DBR with implicit registration (DBIR), presented to improve the performance of DBR, and point out some problems of these evaluation methods. We propose a new performance-evaluation method for the DBIR scheme using a semi-Markov process (SMP) which can resolve the controversial issues of the existing methods. The numerical results obtained with the proposed SMP model are compared with those from previous models. It is shown that the SMP model should be considered to obtain an accurate performance evaluation of the DBIR scheme.
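
The DBR update rule itself is simple enough to sketch: a registration occurs only when the entered base station lies at least the threshold distance from the last-registered one. Coordinates and the threshold below are illustrative assumptions:

```python
import math

class MobileStation:
    """Minimal sketch of the distance-based registration (DBR) rule."""

    def __init__(self, start, threshold):
        self.last_registered = start   # (x, y) of the last registration
        self.threshold = threshold     # distance threshold D

    def enter_cell(self, base_station):
        """Return True if entering this base station triggers registration."""
        dist = math.dist(self.last_registered, base_station)
        if dist >= self.threshold:
            self.last_registered = base_station
            return True
        return False
```

Movement among nearby cells generates no signaling; only crossing the distance threshold does, which is the trade-off the SMP model evaluates.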

  11. Color Texture Image Retrieval Based on Local Extrema Features and Riemannian Distance

    Directory of Open Access Journals (Sweden)

    Minh-Tan Pham

    2017-10-01

    Full Text Available A novel efficient method for content-based image retrieval (CBIR) is developed in this paper using both texture and color features. Our motivation is to represent and characterize an input image by a set of local descriptors extracted from characteristic points (i.e., keypoints) within the image. Then, the dissimilarity measure between images is calculated based on the geometric distance between the topological feature spaces (i.e., manifolds) formed by the sets of local descriptors generated from each image of the database. In this work, we propose to extract and use the local extrema pixels as our feature points. Then, the so-called local extrema-based descriptor (LED) is generated for each keypoint by integrating all color, spatial as well as gradient information captured by its nearest local extrema. Hence, each image is encoded by an LED feature point cloud, and Riemannian distances between these point clouds enable us to tackle CBIR. Experiments performed on several color texture databases including Vistex, STex, color Brodatz, USPtex and Outex TC-00013 using the proposed approach provide very efficient and competitive results compared to the state-of-the-art methods.
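
The local extrema used as feature points can be sketched as a simple 3x3 neighborhood test; the full LED descriptor (pooling color, spatial and gradient information over nearest extrema) is beyond this illustration:

```python
def local_extrema(img):
    """Return keypoints (row, col) that are 3x3-neighborhood extrema.

    img: 2-D list of intensities; border pixels are skipped for brevity.
    """
    keypoints = []
    rows, cols = len(img), len(img[0])
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            neigh = [img[r + i][c + j]
                     for i in (-1, 0, 1) for j in (-1, 0, 1)
                     if (i, j) != (0, 0)]
            v = img[r][c]
            # keep strict maxima and strict minima of the neighborhood
            if v > max(neigh) or v < min(neigh):
                keypoints.append((r, c))
    return keypoints
```

A single bright pixel on a dark background is detected as a keypoint, while a flat region yields none.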

  12. A metric-based assessment of flood risk and vulnerability of rural communities in the Lower Shire Valley, Malawi

    Science.gov (United States)

    Adeloye, A. J.; Mwale, F. D.; Dulanya, Z.

    2015-06-01

    In response to the increasing frequency and economic damage of natural disasters globally, disaster risk management has evolved to incorporate risk assessments that are multi-dimensional, integrated and metric-based. This is to support knowledge-based decision making and hence sustainable risk reduction. In Malawi and most of Sub-Saharan Africa (SSA), however, flood risk studies remain focussed on understanding causation, impacts, perceptions, and coping and adaptation measures. Using the IPCC Framework, this study has quantified and profiled the flood risk of rural, subsistence communities in the Lower Shire Valley, Malawi. Flood risk was obtained by integrating hazard and vulnerability. The flood hazard was characterised in terms of flood depth and inundation area, obtained through hydraulic modelling of the valley with Lisflood-FP, while vulnerability was indexed through analysis of exposure, susceptibility and capacity, linked to social, economic, environmental and physical perspectives. Data for these were collected through structured interviews of the communities. Implementing the entire analysis within GIS enabled visualisation of the spatial variability of flood risk in the valley. The results show predominantly medium levels of hazard, vulnerability and risk. The vulnerability is dominated by high to very high susceptibility. Economic and physical capacities tend to be predominantly low, but social capacity is significantly high, resulting in overall medium levels of capacity-induced vulnerability. Exposure is likewise predominantly medium. Vulnerability and risk showed marginal spatial variability. The paper concludes with recommendations on how these outcomes could inform policy interventions in the Valley.
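The risk composition described above, with risk obtained by integrating hazard and a vulnerability index built from exposure, susceptibility and capacity, can be sketched as follows. The equal weighting and the multiplicative combination are illustrative assumptions, not the paper's calibrated scheme.

```python
def vulnerability_index(exposure, susceptibility, capacity):
    """Composite vulnerability on a 0-1 scale: higher exposure and
    susceptibility raise it, higher capacity lowers it.
    Equal weights are assumed here purely for illustration."""
    return (exposure + susceptibility + (1.0 - capacity)) / 3.0

def flood_risk(hazard, vulnerability):
    """Risk as the product of hazard and vulnerability, both on 0-1 scales
    (one common way to integrate the two; the study's exact rule may differ)."""
    return hazard * vulnerability
```

Applied per community polygon within a GIS, such indices produce the kind of spatially varying low/medium/high risk classes the study reports.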

  13. Observationally-based Metrics of Ocean Carbon and Biogeochemical Variables are Essential for Evaluating Earth System Model Projections

    Science.gov (United States)

    Russell, J. L.; Sarmiento, J. L.

    2017-12-01

    The Southern Ocean is central to the climate's response to increasing levels of atmospheric greenhouse gases, as it ventilates a large fraction of the global ocean volume. Global coupled climate models and earth system models, however, vary widely in their simulations of the Southern Ocean and its role in, and response to, the ongoing anthropogenic forcing. Due to its complex water-mass structure and dynamics, Southern Ocean carbon and heat uptake depend on a combination of winds, eddies, mixing, buoyancy fluxes and topography. Understanding how the ocean carries heat and carbon into its interior, and how the observed wind changes are affecting this uptake, is essential to accurately projecting transient climate sensitivity. Observationally-based metrics are critical for discerning processes and mechanisms, and for validating and comparing climate models. As the community shifts toward Earth system models with explicit carbon simulations, more direct observations of important biogeochemical parameters, like those obtained from the biogeochemical-sensor-equipped floats that are part of the Southern Ocean Carbon and Climate Observations and Modeling project, are essential. One goal of future observing systems should be to create observationally-based benchmarks that will lead to reduced uncertainties in climate projections, especially uncertainties related to oceanic heat and carbon uptake.

  14. A No Reference Image Quality Assessment Metric Based on Visual Perception

    Directory of Open Access Journals (Sweden)

    Yan Fu

    2016-12-01

    Full Text Available Nowadays, how to evaluate image quality reasonably is a basic and challenging problem. Existing no-reference evaluation methods cannot accurately reflect human visual perception of image quality. In this paper, we propose an efficient general-purpose no-reference image quality assessment (NRIQA) method based on visual perception, which effectively integrates human visual characteristics into the NRIQA field. First, a novel algorithm for salient-region extraction is presented: two characteristic maps capturing the texture and edges of the original image are added to the Itti model. Because the normalized luminance coefficients of natural images obey a generalized Gaussian probability distribution, we exploit this property to extract statistical features in the regions of interest (ROI) and regions of non-interest, respectively. The extracted features are then fused and used as input to a support vector regression (SVR) model. Finally, the trained IQA model is used to predict image quality. Experimental results show that the method has good predictive ability and outperforms existing classical algorithms. Moreover, its predictions are more consistent with human subjective perception, accurately reflecting the human visual perception of image quality.
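The generalized-Gaussian feature extraction mentioned above can be sketched with a standard moment-matching estimator of the GGD shape parameter (the same style of estimator used in BRISQUE-like methods; the paper's exact feature set is not specified in the abstract, so treat this as an illustrative assumption). Region-wise features such as the shape and variance, computed separately for ROI and non-ROI pixels, would then be fed to an SVR.

```python
import numpy as np
from scipy.special import gamma

def ggd_shape(coeffs):
    """Moment-matching estimate of the generalized Gaussian distribution's
    shape parameter from a sample of zero-mean normalized luminance
    coefficients. Gaussian data gives ~2, Laplacian data ~1."""
    sigma_sq = np.mean(coeffs ** 2)
    mean_abs = np.mean(np.abs(coeffs))
    rho = sigma_sq / (mean_abs ** 2 + 1e-12)
    # Invert the ratio r(b) = Gamma(1/b) * Gamma(3/b) / Gamma(2/b)^2
    # over a grid of candidate shape values.
    b = np.arange(0.2, 10.0, 0.001)
    r = gamma(1.0 / b) * gamma(3.0 / b) / gamma(2.0 / b) ** 2
    return b[np.argmin((r - rho) ** 2)]
```

A per-region feature vector could then be, e.g., `(ggd_shape(roi), roi.var(), ggd_shape(non_roi), non_roi.var())`, regressed against subjective quality scores.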

  15. Microcomputer-based tests for repeated-measures: Metric properties and predictive validities

    Science.gov (United States)

    Kennedy, Robert S.; Baltzley, Dennis R.; Dunlap, William P.; Wilkes, Robert L.; Kuntz, Lois-Ann

    1989-01-01

    A menu of psychomotor and mental acuity tests was refined. Field applications of such a battery include, for example, studying the effects of toxic agents or exotic environments on performance readiness, or determining fitness for duty. The key requirement of these tasks is that they be suitable for repeated-measures applications, so questions of stability and reliability are a continuing, central focus of this work. After the initial (practice) session, seven replications of 14 microcomputer-based performance tests (32 measures) were completed by 37 subjects. Each test in the battery had previously been shown to stabilize in fewer than five 90-second administrations and to possess retest reliabilities greater than r = 0.707 for three minutes of testing. However, the tests had never been administered together as a battery, nor had they been self-administered. To provide predictive validity for intelligence measurement, the Wechsler Adult Intelligence Scale-Revised and the Wonderlic Personnel Test were administered to the same subjects.

  16. Key Performance Indicators in Irish Hospital Libraries: Developing Outcome-Based Metrics

    Directory of Open Access Journals (Sweden)

    Michelle Dalton

    2012-12-01

    Full Text Available Objective – To develop a set of generic outcome-based performance measures for Irish hospital libraries. Methods – Various models and frameworks of performance measurement were used as a theoretical paradigm to link the impact of library services directly with measurable healthcare objectives and outcomes. Strategic objectives were identified, mapped to performance indicators, and finally translated into response choices to a single-question online survey for distribution via email. Results – The set of performance indicators represents an impact assessment tool which is easy to administer across a variety of healthcare settings. In using a model directly aligned with the mission and goals of the organization, and linked to core activities and operations in an accountable way, the indicators can also be used as a channel through which to implement action, change, and improvement. Conclusion – The indicators can be adopted at a local and potentially a national level, as both a tool for advocacy and to assess and improve service delivery at a macro level. To overcome the constraints posed by necessary simplifications, substantial further research is needed by hospital libraries to develop more sophisticated and meaningful measures of impact to further aid decision making at a micro level.

  17. Effect of gap distance on tensile strength of preceramic base metal solder joints.

    Science.gov (United States)

    Fattahi, Farnaz; Motamedi, Milad

    2011-01-01

    In order to fabricate prostheses with high accuracy and durability, soldering techniques have been introduced to clinical dentistry. However, these prostheses always fail at their solder joints. The purpose of this study was to evaluate the effect of gap distance on the tensile strength of base metal solder joints. Based on ADA/ISO 9693 specifications for the tensile test, 40 specimens were fabricated from a Ni-Cr alloy, cut at the midpoint of a 3-mm diameter bar, and placed at the desired positions by a specially designed device. The specimens were divided into four groups of 10 samples according to the desired solder gap distance: Group 1: 0.1 mm; Group 2: 0.25 mm; Group 3: 0.5 mm; and Group 4: 0.75 mm. After soldering, specimens were tested for tensile strength by a universal testing machine at a cross-head speed of 0.5 mm/min with a preload of 10 N. The mean tensile strength values of the groups were 162, 307.8, 206.1 and 336.7 MPa, respectively. The group with the 0.75-mm gap had the highest and the group with the 0.1-mm gap had the lowest tensile strength. The Bonferroni test showed that Group 1 and Group 4 had statistically different values (P=0.023), but the differences between the other groups were not significant at a significance level of 0.05. There was no direct relationship between increasing soldering gap distance and the tensile strength of the solder joints.

  18. Novel Distance Measure in Fuzzy TOPSIS for Supply Chain Strategy Based Supplier Selection

    Directory of Open Access Journals (Sweden)

    B. Pardha Saradhi

    2016-01-01

    Full Text Available In today’s highly competitive environment, organizations need to evaluate and select suppliers based on their manufacturing strategy. Identifying the supply chain strategy of the organization, determining the decision criteria, and choosing a method of supplier selection are among the most important research topics in the field of supply chain management. In this paper, suppliers are evaluated within the balanced scorecard framework using a new distance measure in fuzzy TOPSIS, taking the supply chain strategy of the manufacturing organization into account. To handle vagueness in decision making, trapezoidal fuzzy numbers are used in the pairwise comparisons that determine the relative weights of the perspectives and criteria of supplier selection. Linguistic variables specified in terms of trapezoidal fuzzy numbers are also used for the payoff values of the suppliers' criteria. These fuzzy numbers satisfy the Jensen-based inequality. A detailed application of the proposed methodology is illustrated.
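The distance computation at the heart of fuzzy TOPSIS can be sketched with the widely used vertex distance between trapezoidal fuzzy numbers, shown here as an illustrative baseline rather than the paper's novel measure, together with the standard TOPSIS closeness coefficient.

```python
import math

def vertex_distance(a, b):
    """Vertex distance between trapezoidal fuzzy numbers a=(a1,a2,a3,a4)
    and b=(b1,b2,b3,b4): root of the mean squared endpoint difference.
    This is the classical measure, not the novel one proposed in the paper."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / 4.0)

def closeness(d_pos, d_neg):
    """TOPSIS closeness coefficient: distance to the negative-ideal solution
    relative to total distance; higher means a better supplier."""
    return d_neg / (d_pos + d_neg)
```

In a supplier-selection run, each supplier's weighted fuzzy ratings are compared against the fuzzy positive- and negative-ideal solutions via `vertex_distance`, and suppliers are ranked by `closeness`.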

  19. In the pursuit of a semantic similarity metric based on UMLS annotations for articles in PubMed Central Open Access.

    Science.gov (United States)

    Garcia Castro, Leyla Jael; Berlanga, Rafael; Garcia, Alexander

    2015-10-01

    Although full-text articles are provided by publishers in electronic formats, it remains a challenge to find related work beyond the title and abstract context. Identifying related articles based on their abstracts is indeed a good starting point; this process is straightforward and does not consume as many resources as full-text based similarity would require. However, further analyses may require an in-depth understanding of the full content. Two articles with highly related abstracts can be substantially different in their full content. How similarity differs when considering title-and-abstract versus full text, and which semantic similarity metric provides better results when dealing with full-text articles, are the main issues addressed in this manuscript. We have benchmarked three similarity metrics - BM25, PMRA, and Cosine - in order to determine which one performs best when using concept-based annotations on full-text documents. We also evaluated variations in similarity values based on title-and-abstract against those relying on full text. Our test dataset comprises the Genomics track article collection from the 2005 Text Retrieval Conference. Initially, we used entity recognition software to semantically annotate titles and abstracts, as well as full text, with concepts defined in the Unified Medical Language System (UMLS®). For each article, we created a document profile, i.e., a set of identified concepts, term frequency, and inverse document frequency; we then applied the various similarity metrics to those document profiles. We considered correlation, precision, recall, and F1 in order to determine which similarity metric performs best with concept-based annotations. For those full-text articles available in PubMed Central Open Access (PMC-OA), we also performed dispersion analyses in order to understand how similarity varies when considering full-text articles. 
We have found that the PubMed Related Articles similarity metric is the most suitable for
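The document-profile construction described above, concept frequencies weighted by inverse document frequency over the concept annotations, together with the Cosine metric from the benchmark, can be sketched as follows (a minimal illustration; the study also evaluated BM25 and PMRA, which are not shown).

```python
import math
from collections import Counter

def tfidf_profiles(docs):
    """docs: list of concept-ID lists (e.g., UMLS concept annotations per
    article). Returns one tf-idf weighted profile (dict) per document."""
    n = len(docs)
    df = Counter()
    for d in docs:
        df.update(set(d))           # document frequency per concept
    profiles = []
    for d in docs:
        tf = Counter(d)             # term (concept) frequency
        profiles.append({c: tf[c] * math.log(n / df[c]) for c in tf})
    return profiles

def cosine(p, q):
    """Cosine similarity between two sparse tf-idf profiles."""
    num = sum(w * q.get(c, 0.0) for c, w in p.items())
    norm_p = math.sqrt(sum(w * w for w in p.values()))
    norm_q = math.sqrt(sum(w * w for w in q.values()))
    return num / (norm_p * norm_q) if norm_p and norm_q else 0.0
```

Comparing the cosine similarity of title-and-abstract profiles against full-text profiles for the same article pairs is precisely the kind of dispersion analysis the study performs.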

  20. Improving Mobility Performance in Low Vision With a Distance-Based Representation of the Visual Scene.

    Science.gov (United States)

    van Rheede, Joram J; Wilson, Iain R; Qian, Rose I; Downes, Susan M; Kennard, Christopher; Hicks, Stephen L

    2015-07-01

    Severe visual impairment can have a profound impact on personal independence through its effect on mobility. We investigated whether the mobility of people with vision low enough to be registered as blind could be improved by presenting the visual environment in a distance-based manner for easier detection of obstacles. We accomplished this by developing a pair of "residual vision glasses" (RVGs) that use a head-mounted depth camera and displays to present information about the distance of obstacles to the wearer as brightness, such that obstacles closer to the wearer are represented more brightly. We assessed the impact of the RVGs on the mobility performance of visually impaired participants during the completion of a set of obstacle courses. Participant position was monitored continuously, which enabled us to capture the temporal dynamics of mobility performance. This allowed us to find correlates of obstacle detection and hesitations in walking behavior, in addition to the more commonly used measures of trial completion time and number of collisions. All participants were able to use the smart glasses to navigate the course, and mobility performance improved for those visually impaired participants with the worst prior mobility performance. However, walking speed was slower and hesitations increased with the altered visual representation. A depth-based representation of the visual environment may offer low vision patients improvements in independent mobility. It is important for further work to explore whether practice can overcome the reductions in speed and increased hesitation that were observed in our trial.

  1. Comprehensive long distance and real-time pipeline monitoring system based on fiber optic sensing

    Energy Technology Data Exchange (ETDEWEB)

    Nikles, Marc; Ravet, Fabien; Briffod, Fabien [Omnisens S.A., Morges (Switzerland)

    2009-07-01

    An increasing number of pipelines are constructed in remote regions affected by harsh environmental conditions. These pipeline routes often cross mountain areas characterized by unstable ground, where soil texture changes between winter and summer increase the probability of hazards. Owing to the long distances to be monitored and the linear nature of pipelines, distributed fiber optic sensing techniques offer significant advantages and the capability to detect and localize pipeline disturbances with great precision. Furthermore, pipeline owners/operators lay fiber optic cable parallel to transmission pipelines for telecommunication purposes, and monitoring capabilities can be added to the communication system at minimal additional cost. The Brillouin-based Omnisens DITEST monitoring system has been used in several long-distance pipeline projects. The technique is capable of measuring strain and temperature over distances of more than 100 km with metre-scale spatial resolution. Dedicated fiber optic cables have been developed for continuous strain and temperature monitoring, and their deployment along the pipeline has enabled permanent and continuous detection of pipeline ground movement, intrusion and leaks. This paper presents a description of the fiber optic Brillouin-based DITEST sensing technique, its measurement performance and limits, while addressing future perspectives for pipeline monitoring. (author)

  2. Improved GPS-based Satellite Relative Navigation Using Femtosecond Laser Relative Distance Measurements

    Directory of Open Access Journals (Sweden)

    Hyungjik Oh

    2016-03-01

    Full Text Available This study developed an approach for improving Carrier-phase Differential Global Positioning System (CDGPS)-based real-time satellite relative navigation by applying laser baseline measurement data. Robustness against the space operational environment was considered, and a Synthetic Wavelength Interferometer (SWI) algorithm based on a femtosecond laser measurement model was developed. The phase differences between two laser wavelengths were combined to measure distance precisely. The generated laser data were used to improve the estimation accuracy of the float ambiguity of the CDGPS data. Real-time relative navigation simulations were performed using an extended Kalman filter algorithm. The accuracy of the combined GPS-laser relative navigation was compared with GPS-only relative navigation solutions to determine the impact of the laser data. In numerical simulations, the success rate of integer ambiguity resolution increased when laser data were added to GPS data. The relative navigation errors also improved five-fold and two-fold relative to the GPS-only error, for 250 m and 5 km initial relative distances, respectively. The methodology developed in this study is suitable for application to future satellite formation-flying missions.
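The two-wavelength principle behind the SWI can be sketched numerically: combining phase measurements at two nearby wavelengths yields a much longer synthetic wavelength, which extends the non-ambiguity range of the interferometric distance. The one-way displacement convention and the example wavelengths below are assumptions for illustration, not values from the paper.

```python
import math

def synthetic_wavelength(lam1, lam2):
    """Equivalent (synthetic) wavelength of a two-wavelength interferometer:
    Lambda = lam1 * lam2 / |lam1 - lam2|."""
    return lam1 * lam2 / abs(lam1 - lam2)

def distance_from_phase(delta_phi, lam1, lam2):
    """Distance within one synthetic-wavelength ambiguity range from the
    measured phase difference delta_phi (one-way convention assumed)."""
    return (delta_phi / (2.0 * math.pi)) * synthetic_wavelength(lam1, lam2)
```

For example, two lines near 1550 nm separated by 1 nm give a synthetic wavelength of about 2.4 mm, so a phase difference resolves distance within a millimetre-scale ambiguity range rather than a sub-micron one.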

  3. Analytic processing of distance.

    Science.gov (United States)

    Dopkins, Stephen; Galyer, Darin

    2018-01-01

    How does a human observer extract from the distance between two frontal points the component corresponding to an axis of a rectangular reference frame? To find out, we had participants classify pairs of small circles, varying on the horizontal and vertical axes of a computer screen, in terms of the horizontal distance between them. A response signal controlled response time. The error rate depended on the irrelevant vertical as well as the relevant horizontal distance between the test circles, with the relevant distance effect being larger than the irrelevant distance effect. The results implied that the horizontal distance between the test circles was imperfectly extracted from the overall distance between them. The results supported an account, derived from the Exemplar Based Random Walk model (Nosofsky & Palmeri, 1997), under which distance classification is based on the overall distance between the test circles, with relevant distance being extracted from overall distance to the extent that the relevant and irrelevant axes are differentially weighted so as to reduce the contribution of irrelevant distance to overall distance. The results did not support an account, derived from the General Recognition Theory (Ashby & Maddox, 1994), under which distance classification is based on the relevant distance between the test circles, with the irrelevant distance effect arising because a test circle's perceived location on the relevant axis depends on its location on the irrelevant axis, and with relevant distance being extracted from overall distance to the extent that this dependency is absent. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. The Metric of Colour Space

    DEFF Research Database (Denmark)

    Gravesen, Jens

    2015-01-01

    The space of colours is a fascinating space. It is a real vector space, but no matter what inner product you put on the space, the resulting Euclidean distance does not correspond to human perception of difference between colours. In 1942 MacAdam performed the first experiments on colour matching and found the MacAdam ellipses, which are often interpreted as defining the metric tensor at their centres. An important question is whether it is possible to define colour coordinates such that the Euclidean distance in these coordinates corresponds to human perception. Using cubic splines to represent …

  5. Investigating Cardiac MRI Based Right Ventricular Contractility As A Novel Non-Invasive Metric of Pulmonary Arterial Pressure

    Science.gov (United States)

    Menon, Prahlad G; Adhypak, Srilakshmi M; Williams, Ronald B; Doyle, Mark; Biederman, Robert WW

    2014-01-01

    BACKGROUND We test the hypothesis that cardiac magnetic resonance (CMR) imaging-based indices of four-dimensional (4D) (three dimensions (3D) + time) right ventricle (RV) function have predictive values in ascertaining invasive pulmonary arterial systolic pressure (PASP) measurements from right heart catheterization (RHC) in patients with pulmonary arterial hypertension (PAH). METHODS We studied five patients with idiopathic PAH and two age and sex-matched controls for RV function using a novel contractility index (CI) for amplitude and phase to peak contraction established from analysis of regional shape variation in the RV endocardium over 20 cardiac phases, segmented from CMR images in multiple orientations. RESULTS The amplitude of RV contractility correlated inversely with RV ejection fraction (RVEF; R2 = 0.64, P = 0.03) and PASP (R2 = 0.71, P = 0.02). Phase of peak RV contractility also correlated inversely to RVEF (R2 = 0.499, P = 0.12) and PASP (R2 = 0.66, P = 0.04). CONCLUSIONS RV contractility analyzed from CMR offers promising non-invasive metrics for classification of PAH, which are congruent with invasive pressure measurements. PMID:25624777

  6. Considerations of the Software Metric-based Methodology for Software Reliability Assessment in Digital I and C Systems

    International Nuclear Information System (INIS)

    Ha, J. H.; Kim, M. K.; Chung, B. S.; Oh, H. C.; Seo, M. R.

    2007-01-01

    Analog I and C systems have been replaced by digital I and C systems because digital systems offer many potential benefits to nuclear power plants in terms of operational and safety performance. For example, digital systems are essentially free of drift, have higher data handling and storage capabilities, and provide improved performance through greater accuracy and computational capability. In addition, analog replacement parts have become more difficult to obtain since they are obsolete and discontinued. There are, however, challenges to the introduction of digital technology into nuclear power plants, because digital systems are more complex than analog systems and their operation and failure modes are different. In particular, software, which forms the core of functionality in digital systems, does not wear out physically like hardware, and its failure modes are not yet clearly defined. Thus, research to develop methodologies for software reliability assessment is still proceeding in safety-critical areas such as nuclear systems, aerospace and medical devices. Among these, a software metric-based methodology has been considered for the digital I and C systems of Korean nuclear power plants. Advantages and limitations of that methodology are identified, and requirements for its application to the digital I and C systems are considered in this study

  7. Assessment of six dissimilarity metrics for climate analogues

    Science.gov (United States)

    Grenier, Patrick; Parent, Annie-Claude; Huard, David; Anctil, François; Chaumont, Diane

    2013-04-01

    Spatial analogue techniques consist in identifying locations whose recent-past climate is similar in some respects to the future climate anticipated at a reference location. When identifying analogues, one key step is the quantification of the dissimilarity between two climates separated in time and space, which involves the choice of a metric. In this communication, spatial analogues and their usefulness are briefly discussed. Next, six metrics are presented (the standardized Euclidean distance, the Kolmogorov-Smirnov statistic, the nearest-neighbor distance, the Zech-Aslan energy statistic, the Friedman-Rafsky runs statistic and the Kullback-Leibler divergence), along with a set of criteria used for their assessment. The related case study involves the use of numerical simulations performed with the Canadian Regional Climate Model (CRCM-v4.2.3), from which three annual indicators (total precipitation, heating degree-days and cooling degree-days) are calculated over 30-year periods (1971-2000 and 2041-2070). Results indicate that the six metrics identify comparable analogue regions at a relatively large scale, but the best analogues may differ substantially. For the best analogues, it is also shown that the uncertainty stemming from the choice of metric generally does not exceed that stemming from the choice of simulation or model. A synthesis of the advantages and drawbacks of each metric is finally presented, in which the Zech-Aslan energy statistic stands out as the most recommended metric for analogue studies, whereas the Friedman-Rafsky runs statistic is the least recommended, based on this case study.
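Three of the six metrics can be sketched for a single 1-D annual indicator (the study applies them to multivariate climate indicators; also note that SciPy's `energy_distance` implements the Székely energy distance, a close relative of, but not identical to, the Zech-Aslan statistic).

```python
import numpy as np
from scipy.stats import ks_2samp, energy_distance

def standardized_euclidean(x, y):
    """Standardized Euclidean distance between the mean values of an
    indicator in two 30-year samples, scaled by the pooled standard
    deviation (one common 1-D formulation, assumed here)."""
    s = np.std(np.concatenate([x, y]), ddof=1)
    return abs(np.mean(x) - np.mean(y)) / s

# Distribution-based alternatives for the same 1-D samples:
#   ks_2samp(x, y).statistic  -> Kolmogorov-Smirnov statistic
#   energy_distance(x, y)     -> energy distance (Zech-Aslan family)
```

Ranking candidate locations by any of these metrics against the reference location's projected climate yields the analogue maps compared in the study.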

  8. Metrics of quantum states

    International Nuclear Information System (INIS)

    Ma Zhihao; Chen Jingling

    2011-01-01

    In this work we study metrics of quantum states, which are natural generalizations of the usual trace metric and Bures metric. Some useful properties of the metrics are proved, such as the joint convexity and contractivity under quantum operations. Our result has a potential application in studying the geometry of quantum states as well as the entanglement detection.

  9. A distance-dependent metal-enhanced fluorescence sensing platform based on molecular beacon design.

    Science.gov (United States)

    Zhou, Zhenpeng; Huang, Hongduan; Chen, Yang; Liu, Feng; Huang, Cheng Zhi; Li, Na

    2014-02-15

    A new metal-enhanced fluorescence (MEF)-based platform was developed on the basis of a distance-dependent fluorescence quenching-enhancement effect, which combined the ease of Ag-thiol chemistry with the MEF property of noble-metal structures and a molecular beacon design. For AgNPs of a given size, the fluorescence enhancement factor was found to increase with a d^6 dependency, in agreement with a fluorescence resonance energy transfer mechanism, at shorter distances, and to decrease with a d^-3 dependency, in agreement with a plasmonic enhancement mechanism, at longer distances between the fluorophore and the AgNP surface. As a proof of concept, the platform was demonstrated by sensitive detection of mercuric ions, using a thymine-containing molecular beacon to tune silver nanoparticle (AgNP)-enhanced fluorescence. Mercuric ions were detected via formation of a thymine-mercuric-thymine structure that opens the hairpin, facilitating fluorescence recovery and AgNP enhancement, to yield a limit of detection of 1 nM, well below the U.S. Environmental Protection Agency regulation of the Maximum Contaminant Level Goal (10 nM) in drinking water. Since the AgNP functions not only as a quencher that reduces the reagent blank signal but also as an enhancement substrate that increases the fluorescence of the open hairpin when target mercuric ions are present, the quenching-enhancement strategy can greatly improve detection sensitivity and can in principle be a universal approach for various targets when combined with a molecular beacon design. © 2013 Elsevier B.V. All rights reserved.
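The reported distance dependence, a d^6 rise at short range and a d^-3 decay at long range, can be captured by an illustrative piecewise power law with continuity at a crossover distance. The crossover value and peak factor below are hypothetical parameters, not fitted values from the paper.

```python
def enhancement(d, d0=8.0, c=1.0):
    """Illustrative fluorophore-AgNP enhancement factor versus surface
    distance d (nm): ~d^6 rise below the (hypothetical) crossover d0,
    ~d^-3 decay above it; c sets the peak value at d0. The two branches
    are scaled to meet at d0 so the model is continuous."""
    if d <= d0:
        return c * (d / d0) ** 6
    return c * (d0 / d) ** 3
```

Such a single-peaked profile is what lets the hairpin design work: the closed beacon sits in the quenched (short-distance) regime, while the opened beacon places the fluorophore near the enhancement maximum.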

  10. A Population-Based Analysis of Time to Surgery and Travel Distances for Brachial Plexus Surgery.

    Science.gov (United States)

    Dy, Christopher J; Baty, Jack; Saeed, Mohammed J; Olsen, Margaret A; Osei, Daniel A

    2016-09-01

    Despite the importance of timely evaluation for patients with brachial plexus injuries (BPIs), in clinical practice we have noted delays in referral. Because the published BPI experience is largely from individual centers, we used a population-based approach to evaluate the delivery of care for patients with BPI. We used statewide administrative databases from Florida (2007-2013), New York (2008-2012), and North Carolina (2009-2010) to create a cohort of patients who underwent surgery for BPI (exploration, repair, neurolysis, grafting, or nerve transfer). Emergency department and inpatient records were used to determine the time interval between the injury and surgical treatment. Distances between treating hospitals and between the patient's home ZIP code and the surgical hospital were recorded. A multivariable logistic regression model was used to determine predictors of time from injury to surgery exceeding 365 days. Among the 222 patients in our cohort, the median time from injury to surgery was 7.6 months and exceeded 365 days in 29% (64 of 222) of cases. Treatment at a smaller hospital for the initial injury was significantly associated with surgery beyond 365 days after injury. Patient insurance type, travel distance for surgery, distance between the 2 treating hospitals, and changing hospitals between injury and surgery did not significantly influence time to surgery. Nearly one third of patients in Florida, New York, and North Carolina underwent BPI surgery more than 1 year after the injury. Patients initially treated at smaller hospitals are at risk of undergoing delayed BPI surgery. These findings can inform administrative and policy efforts to expedite timely referral of patients with BPI to experienced centers. Copyright © 2016 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  11. A class-based link prediction using Distance Dependent Chinese Restaurant Process

    Science.gov (United States)

    Andalib, Azam; Babamir, Seyed Morteza

    2016-08-01

    One of the important tasks in relational data analysis is link prediction, which has been successfully applied in many areas such as bioinformatics, information retrieval, etc. Link prediction is defined as predicting the existence or absence of edges between nodes of a network. In this paper, we propose a novel method for link prediction based on the Distance Dependent Chinese Restaurant Process (DDCRP) model, which enables us to utilize information about the topological structure of the network such as shortest paths and the connectivity of the nodes. We also propose a new Gibbs sampling algorithm for computing the posterior distribution of the hidden variables based on the training data. Experimental results on three real-world datasets show the superiority of the proposed method over other probabilistic models for the link prediction problem.
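The DDCRP prior underlying the method can be sketched directly: each node links to another node with probability proportional to a decaying function of their distance (e.g., shortest-path length), or to itself with probability proportional to a concentration parameter. The exponential decay below is one standard choice, assumed here for illustration.

```python
import numpy as np

def ddcrp_link_probs(D, alpha=1.0):
    """Customer-link probabilities of a distance-dependent CRP.
    D: (n, n) matrix of pairwise node distances (e.g., shortest-path
    lengths). Node i links to node j with probability proportional to
    exp(-D[i, j]), and to itself with probability proportional to alpha."""
    W = np.exp(-D)                     # decay f(d) = exp(-d), an assumption
    np.fill_diagonal(W, alpha)         # self-link weight = concentration
    return W / W.sum(axis=1, keepdims=True)
```

A Gibbs sampler would repeatedly resample each node's link from these probabilities (reweighted by the data likelihood); the connected components of the link graph then induce the latent clusters used for link prediction.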

  12. Using Web-Based, Group Communication Systems to Support Case Study Learning at a Distance

    Directory of Open Access Journals (Sweden)

    Liam Rourke

    2002-10-01

    Full Text Available This study explored the capacity of Web-based, group communication systems to support case-based teaching and learning. Eleven graduate students studying at a distance were divided into three groups to collaborate on a case study using either a synchronous voice, an asynchronous voice, or a synchronous text communication system. Participants kept a detailed log of the time they spent on various activities, wrote a 1,500-word reflection on their experience, and participated in a group interview. Analysis of these data reveals that each group supplemented the system that had been assigned to them with additional communication systems in order to complete the project. Each of these systems was used strategically: email was used to share files and arrange meetings, and synchronous voice systems were used to brainstorm and make decisions. Learning achievement was high across groups and students enjoyed collaborating with others on a concrete task.

  13. METHODOLOGY-TECHNOLOGICAL BASES AND COMMUNICATION FACILITIES OF COMMUNICATION IN DISTANCE LEARNING

    OpenAIRE

    O. Gnedkova; V. Lyakutin

    2010-01-01

    In this article, the methodological and technological principles of the communication process in distance learning, which are important constituents of an effective distance course, are analyzed. Communication facilities for conducting the communication process in distance learning are examined. Recommendations are given to teachers (tutors) on organizing a successful and effective learning process using modern communication technologies.

  14. $\\eta$-metric structures

    OpenAIRE

    Gaba, Yaé Ulrich

    2017-01-01

    In this paper, we discuss recent results about generalized metric spaces and fixed point theory. We introduce the notion of $\\eta$-cone metric spaces, give some topological properties and prove some fixed point theorems for contractive-type maps on these spaces. In particular, we show that these $\\eta$-cone metric spaces are natural generalizations of both cone metric spaces and metric type spaces.

  15. Improving UWB-Based Localization in IoT Scenarios with Statistical Models of Distance Error.

    Science.gov (United States)

    Monica, Stefania; Ferrari, Gianluigi

    2018-05-17

    Interest in the Internet of Things (IoT) is rapidly increasing, as the number of connected devices is exponentially growing. One of the application scenarios envisaged for IoT technologies involves indoor localization and context awareness. In this paper, we focus on a localization approach that relies on a particular type of communication technology, namely Ultra Wide Band (UWB). UWB technology is an attractive choice for indoor localization, owing to its high accuracy. Since localization algorithms typically rely on estimated inter-node distances, the goal of this paper is to evaluate the improvement brought by a simple (linear) statistical model of the distance error. On the basis of an extensive experimental measurement campaign, we propose a general analytical framework, based on a Least Square (LS) method, to derive a novel statistical model for the range estimation error between a pair of UWB nodes. The proposed statistical model is then applied to improve the performance of a few illustrative localization algorithms in various realistic scenarios. The obtained experimental results show that the use of the proposed statistical model improves the accuracy of the considered localization algorithms with a reduction of the localization error up to 66%.
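A minimal sketch of the kind of linear, least-squares error model described above: fit a scale-and-offset model to calibration pairs of true and estimated distances, then invert it to de-bias new range estimates. The calibration numbers below are made up for illustration; the paper's actual model and coefficients come from its experimental measurement campaign:

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit ys ~ a*xs + b (closed form)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Hypothetical calibration data: true distances (m) vs. UWB range
# estimates with a systematic scale-and-offset error (noiseless here
# for clarity; real data would be noisy).
true_d = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
meas_d = [1.08 * d + 0.15 for d in true_d]

a, b = fit_linear(true_d, meas_d)

def correct(d_est, a, b):
    """Invert the fitted model to de-bias a raw range estimate."""
    return (d_est - b) / a
```

Feeding the corrected ranges to a localization algorithm is what yields the error reductions reported in the paper.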

  16. A long distance voice transmission system based on the white light LED

    Science.gov (United States)

    Tian, Chunyu; Wei, Chang; Wang, Yulian; Wang, Dachi; Yu, Benli; Xu, Feng

    2017-10-01

    A long distance voice transmission system based on visible light communication technology (VLCT) is proposed in this paper. The proposed system comprises a transmitter, a receiver and single-chip-microcomputer voice signal processing. In the compact-sized LED transmitter, we use on-off keying with non-return-to-zero coding (OOK-NRZ) to easily realize high-speed modulation, thereby reducing system complexity. A voice transmission system with low noise and a wide modulation band is achieved by designing a high-efficiency receiving optical path and using filters to reduce noise from ambient light. To improve the speed of signal processing, we use a single chip microcomputer to code and decode the voice signal. Furthermore, a serial peripheral interface (SPI) is adopted to accurately transmit the voice signal data. Test results show that the transmission distance of the system is more than 100 meters at a maximum data rate of 1.5 Mbit/s and an SNR of 30 dB. The system has many advantages, such as simple construction, low cost and strong practicality. Therefore, it has extensive application prospects in fields such as emergency communication and indoor wireless communication.
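The OOK-NRZ line coding used by the transmitter can be illustrated with a toy encoder: each bit maps directly to an on/off LED level that is held for the full bit period (no return to zero mid-bit). This is only a sketch of the line code, not the system's firmware:

```python
def ook_nrz_encode(data: bytes):
    """Map each bit to an on/off level: 1 -> LED on, 0 -> LED off.
    NRZ: the level is held for the whole bit period."""
    levels = []
    for byte in data:
        for k in range(7, -1, -1):      # MSB first
            levels.append((byte >> k) & 1)
    return levels

def ook_nrz_decode(levels):
    """Regroup sampled on/off levels into bytes (ideal channel)."""
    out = bytearray()
    for i in range(0, len(levels), 8):
        byte = 0
        for bit in levels[i:i + 8]:
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)
```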

  17. International Trade Modelling Using Open Flow Networks: A Flow-Distance Based Analysis.

    Science.gov (United States)

    Shen, Bin; Zhang, Jiang; Li, Yixiao; Zheng, Qiuhua; Li, Xingsen

    2015-01-01

    This paper models and analyzes international trade flows using open flow networks (OFNs) with flow-distance approaches, which provide a novel perspective and effective tools for the study of international trade. We discuss the establishment of OFNs of international trade from two coupled viewpoints: that of trading commodity flow and that of money flow. Based on the novel model with flow-distance approaches, meaningful insights are gained. First, by introducing the concepts of trade trophic levels and niches, countries' roles and positions in the global supply chains (or value-added chains) can be evaluated quantitatively. We find that the distributions of trading "trophic levels" have similar clustering patterns for different types of commodities, and we summarize some regularities between the money flow and commodity flow viewpoints. Second, we find that active and competitive countries trade a wide spectrum of products, while inactive and underdeveloped countries trade a limited variety of products. In addition, some atypical countries import many types of goods that the vast majority of countries do not need to import. Third, harmonic node centrality is proposed and we find the phenomenon of centrality stratification. All the results illustrate the usefulness of the OFN model and its network approaches for investigating international trade flows.
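The notion of a trade "trophic level" borrowed from ecology can be sketched as a fixed-point computation: a node with no inflow sits at level 1, and every other node sits one level above the flow-weighted average of its suppliers' levels. The data structure and simple iteration below are illustrative assumptions, not the paper's exact flow-distance formulation:

```python
def trophic_levels(inflow, iters=200):
    """Fixed-point iteration for trophic levels in a flow network.

    inflow[n] maps each supplier j of node n to the flow j -> n.
    TL_n = 1 + sum_j (share of n's inflow from j) * TL_j.
    Converges for acyclic flow structures; illustrative only.
    """
    nodes = list(inflow)
    tl = {n: 1.0 for n in nodes}
    for _ in range(iters):
        new = {}
        for n in nodes:
            total = sum(inflow[n].values())
            if total == 0:
                new[n] = 1.0   # pure source: trophic level 1
            else:
                new[n] = 1.0 + sum(f / total * tl[j]
                                   for j, f in inflow[n].items())
        tl = new
    return tl
```

In a simple chain a -> b -> c the levels come out as 1, 2, 3, mirroring producer, intermediary and final consumer.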

  18. Technique for long and absolute distance measurement based on laser pulse repetition frequency sweeping

    Science.gov (United States)

    Castro Alves, D.; Abreu, Manuel; Cabral, A.; Jost, Michael; Rebordão, J. M.

    2017-11-01

    In this work we present a technique to perform long and absolute distance measurements based on mode-locked diode lasers. Using a Michelson interferometer, it is possible to produce an optical cross-correlation between laser pulses of the reference arm and pulses from the measurement arm, adjusting their degree of overlap externally either by changing the pulse repetition frequency (PRF) or by changing the position of the reference-arm mirror for two (or more) fixed frequencies. The correlation of the travelling pulses for precision distance measurement relies on ultra-short pulse durations, as the uncertainty associated with the method depends on the laser pulse width as well as on a highly stable PRF. Mode-locked diode lasers are a very appealing technology for their inherent compactness, size and efficiency, constituting a positive trade-off with regard to other mode-locked laser sources. Nevertheless, the main current drawback is the non-availability of frequency-stable laser diodes. The laser used is a monolithic mode-locked semiconductor quantum-dot (QD) laser, with its PRF locked to an external stabilized RF reference. We present some preliminary results and discuss the importance of the requirements on laser PRF stability for the accuracy of the final metrology system.
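The core geometric idea of a PRF sweep can be sketched numerically: pulse overlap occurs whenever the round-trip path difference 2D equals an integer number of pulse periods, 2D = m·c/f, so two consecutive overlap PRFs f1 < f2 (orders m and m+1) give D = c / (2·(f2 − f1)). A toy calculation under these assumptions, ignoring the refractive index of air and all instrumental effects:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def overlap_prfs(distance_m, f_min, f_max):
    """PRFs in [f_min, f_max] at which pulse overlap occurs: 2D = m*c/f."""
    prfs = []
    m = 1
    while True:
        f = m * C / (2.0 * distance_m)   # PRF giving overlap at order m
        if f > f_max:
            break
        if f >= f_min:
            prfs.append(f)
        m += 1
    return prfs

def distance_from_prf_sweep(f1, f2):
    """Recover D from two consecutive overlap PRFs (orders m and m+1)."""
    return C / (2.0 * abs(f2 - f1))
```

Sweeping a ~1 GHz PRF across a few MHz is enough to bracket two overlap frequencies for a path of tens of metres.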

  19. Air temperature measurements based on the speed of sound to compensate long distance interferometric measurements

    Directory of Open Access Journals (Sweden)

    Astrua Milena

    2014-01-01

    Full Text Available A method to measure the real-time temperature distribution along an interferometer path based on the propagation of acoustic waves is presented. It exploits the high sensitivity of the speed of sound in air to the air temperature. In particular, it takes advantage of a special set-up where the generation of the acoustic waves is synchronous with the amplitude modulation of a laser source. A photodetector converts the laser light to an electronic signal taken as reference, while the incoming acoustic waves are focused on a microphone and generate a second signal. Under these conditions, the phase difference between the two signals depends essentially on the temperature of the air volume interposed between the sources and the receivers. Comparison with traditional temperature sensors highlighted the limits of the latter in the case of fast temperature variations, and the advantage of a measurement integrated along the optical path over a sampling measurement. The capability of the acoustic method to compensate interferometric distance measurements for air temperature variations has been demonstrated for distances up to 27 m.
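The physics the method relies on can be made concrete: in dry air the speed of sound is approximately c(T) ≈ 331.3·√(1 + T/273.15) m/s with T in °C, so a measured acoustic time of flight over a known path length yields the path-averaged temperature. The simple model below neglects humidity and pressure effects and is only illustrative:

```python
import math

C0 = 331.3   # approx. speed of sound in dry air at 0 deg C, m/s
T0 = 273.15  # 0 deg C in kelvin

def speed_of_sound(temp_c):
    """Ideal-gas approximation of the speed of sound in dry air."""
    return C0 * math.sqrt(1.0 + temp_c / T0)

def air_temperature(path_m, flight_s):
    """Mean air temperature along the path from an acoustic time of
    flight: invert c = path/flight in the model above."""
    c = path_m / flight_s
    return T0 * ((c / C0) ** 2 - 1.0)
```

Because the time of flight integrates the sound speed over the whole path, the recovered temperature is a path average, which is exactly what an interferometric compensation needs.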

  20. A ROBUST METHOD FOR STEREO VISUAL ODOMETRY BASED ON MULTIPLE EUCLIDEAN DISTANCE CONSTRAINT AND RANSAC ALGORITHM

    Directory of Open Access Journals (Sweden)

    Q. Zhou

    2017-07-01

    Full Text Available Visual Odometry (VO) is a critical component for planetary robot navigation and safety. It estimates the ego-motion using stereo images frame by frame. Feature point extraction and matching is one of the key steps of robotic motion estimation, and it largely influences precision and robustness. In this work, we choose Oriented FAST and Rotated BRIEF (ORB) features by considering both accuracy and speed. For more robustness in challenging environments, e.g., rough terrain or planetary surfaces, this paper presents a robust outlier elimination method based on a Euclidean Distance Constraint (EDC) and the Random Sample Consensus (RANSAC) algorithm. In the matching process, a set of ORB feature points is extracted from the current left and right synchronous images, and the Brute Force (BF) matcher is used to find the correspondences between the two images for space intersection. Then the EDC and RANSAC algorithms are applied to eliminate mismatches whose distances are beyond a predefined threshold. Similarly, when the left image at the next time step is matched against the current left image, EDC and RANSAC are performed iteratively. Since exceptional mismatched points occasionally remain after these steps, RANSAC is applied a third time to eliminate the effect of those outliers on the estimation of the ego-motion parameters (interior and exterior orientation). The proposed approach has been tested on a real-world vehicle dataset, and the results demonstrate its high robustness.
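The RANSAC consensus step with a Euclidean distance constraint can be illustrated with a stripped-down, pure-translation variant: a randomly sampled correspondence proposes a motion hypothesis, and matches whose Euclidean residual exceeds a threshold are rejected as outliers. The real pipeline estimates full ego-motion from ORB matches; this sketch only shows the consensus logic:

```python
import math
import random

def ransac_translation(src, dst, threshold=1.0, iters=100, rng=None):
    """RANSAC for a pure-translation model between matched 2-D points.

    A randomly chosen correspondence proposes a translation; matches
    whose Euclidean residual exceeds `threshold` are treated as
    outliers, and the hypothesis with the largest consensus wins."""
    rng = rng or random.Random(0)
    best_inliers, best_t = [], (0.0, 0.0)
    for _ in range(iters):
        i = rng.randrange(len(src))
        tx = dst[i][0] - src[i][0]
        ty = dst[i][1] - src[i][1]
        inliers = [
            k for k, (s, d) in enumerate(zip(src, dst))
            if math.hypot(d[0] - s[0] - tx, d[1] - s[1] - ty) < threshold
        ]
        if len(inliers) > len(best_inliers):
            best_inliers, best_t = inliers, (tx, ty)
    return best_t, best_inliers
```

A gross mismatch proposes a translation that almost no other match agrees with, so it never wins the consensus vote and is excluded from the inlier set.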

  1. Quasi-metrics, midpoints and applications

    Energy Technology Data Exchange (ETDEWEB)

    Valero, O.

    2017-07-01

    In applied sciences, the scientific community simultaneously uses different kinds of information coming from several sources in order to infer a conclusion or working decision. In the literature there are many techniques for merging the information and thus providing meaningful fused data. In most practical cases such fusion methods are based on aggregation operators on some numerical values, i.e. the aim of the fusion process is to obtain a representative number from a finite sequence of numerical data. In these cases, the input data presents some kind of imprecision and is for this reason represented as fuzzy sets. Moreover, in such problems comparisons between the numerical values that represent the information described by the fuzzy sets become necessary. These comparisons are made by means of a distance defined on fuzzy sets. Thus, numerical operators that aggregate distances between fuzzy sets as incoming data play a central role in applied problems. Recently, J.J. Nieto and A. Torres gave some applications of the aggregation of distances on fuzzy sets to the study of real medical data in [Nieto]. These applications are based on the notion of a segment joining two given fuzzy sets and on the notion of the set of midpoints between fuzzy sets. A few results obtained by Nieto and Torres have in turn been generalized by Casasnovas and Rosselló in [Casas, Casas2]. Nowadays, quasi-metrics provide efficient tools in some fields of computer science and in bioinformatics. Motivated by these facts, a study of segments joining two fuzzy sets and of midpoints between fuzzy sets, when the measure used for comparisons is a quasi-metric, has been made in [Casas3, SebVal2013, TiradoValero]. (Author)

  2. Finite Element Based Pelvic Injury Metric Creation and Validation in Lateral Impact for a Human Body Model.

    Science.gov (United States)

    Weaver, Caitlin; Baker, Alexander; Davis, Matthew; Miller, Anna; Stitzel, Joel D

    2018-02-20

    Pelvic fractures are serious injuries resulting in high mortality and morbidity. The objective of this study is to develop and validate local pelvic anatomical, cross-section-based injury risk metrics for a finite element (FE) model of the human body. Cross-sectional instrumentation was implemented in the pelvic region of the Global Human Body Models Consortium (GHBMC M50-O) 50th percentile detailed male FE model (v4.3). In total, 25 lateral impact FE simulations were performed using input data from cadaveric lateral impact tests performed by Bouquet et al. The experimental force-time data were scaled using five normalization techniques, which were evaluated using log rank, Wilcoxon rank sum, and correlation and analysis (CORA) testing. Survival analyses with the Weibull distribution were performed on the experimental peak forces (scaled and unscaled) and the simulation test data to generate injury risk curves (IRCs) for total pelvic injury. Additionally, IRCs were developed for regional injury using cross-sectional forces from the simulation results and injuries documented in the experimental autopsies. These regional IRCs were also evaluated using receiver operating characteristic (ROC) curve analysis. Based on the results of all the evaluation methods, the Equal Stress Equal Velocity (ESEV) and ESEV using effective mass (ESEV-EM) scaling techniques performed best. The simulation IRC shows slight underprediction of injury in comparison to these scaled experimental data curves; however, this difference was determined not to be statistically significant. Additionally, the ROC curve analysis showed moderate predictive power for all regional IRCs.
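An injury risk curve from a Weibull survival analysis is, once fitted, simply the Weibull CDF evaluated at a loading metric such as peak force. A minimal sketch of evaluating such a curve (the scale and shape values used in the example are arbitrary placeholders, not the paper's fitted parameters):

```python
import math

def weibull_injury_risk(force, scale, shape):
    """Weibull CDF as an injury risk curve: P(injury | peak force).

    `scale` (lambda) and `shape` (k) come from the survival analysis;
    the values used by callers here are illustrative only."""
    if force <= 0:
        return 0.0
    return 1.0 - math.exp(-((force / scale) ** shape))
```

At the scale parameter the predicted risk is 1 − e⁻¹ ≈ 63.2%, and risk increases monotonically with force, as an IRC must.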

  3. Large-scale Reconstructions and Independent, Unbiased Clustering Based on Morphological Metrics to Classify Neurons in Selective Populations.

    Science.gov (United States)

    Bragg, Elise M; Briggs, Farran

    2017-02-15

    This protocol outlines large-scale reconstructions of neurons combined with the use of independent and unbiased clustering analyses to create a comprehensive survey of the morphological characteristics observed among a selective neuronal population. Combination of these techniques constitutes a novel approach for the collection and analysis of neuroanatomical data. Together, these techniques enable large-scale, and therefore more comprehensive, sampling of selective neuronal populations and establish unbiased quantitative methods for describing morphologically unique neuronal classes within a population. The protocol outlines the use of modified rabies virus to selectively label neurons. G-deleted rabies virus acts like a retrograde tracer following stereotaxic injection into a target brain structure of interest and serves as a vehicle for the delivery and expression of EGFP in neurons. Large numbers of neurons are infected using this technique and express GFP throughout their dendrites, producing "Golgi-like" complete fills of individual neurons. Accordingly, the virus-mediated retrograde tracing method improves upon traditional dye-based retrograde tracing techniques by producing complete intracellular fills. Individual well-isolated neurons spanning all regions of the brain area under study are selected for reconstruction in order to obtain a representative sample of neurons. The protocol outlines procedures to reconstruct cell bodies and complete dendritic arborization patterns of labeled neurons spanning multiple tissue sections. Morphological data, including positions of each neuron within the brain structure, are extracted for further analysis. Standard programming functions were utilized to perform independent cluster analyses and cluster evaluations based on morphological metrics. To verify the utility of these analyses, statistical evaluation of a cluster analysis performed on 160 neurons reconstructed in the thalamic reticular nucleus of the thalamus

  4. Emergence of the scale-invariant proportion in a flock from the metric-topological interaction.

    Science.gov (United States)

    Niizato, Takayuki; Murakami, Hisashi; Gunji, Yukio-Pegio

    2014-05-01

    Recently, it has become possible to analyze flocking behavior more precisely. Such research has prompted a reconsideration of the notion of neighborhoods in theoretical models. Flocking based on topological distance is one such result: in a topological flocking model, a bird does not interact with its neighbors on the basis of a fixed-size neighborhood (i.e., on the basis of metric distance), but instead interacts with its nearest seven neighbors. Cavagna et al., moreover, found a new phenomenon in flocks that can be explained by neither metric nor topological distance: correlated domains in a flock were larger than both the metric and the topological interaction range, and these domains were proportional to the total flock size. However, the role of scale-free correlation is still unclear. In a previous study, we constructed a metric-topological interaction model in three-dimensional space and showed that this model exhibits scale-free correlation. In this study, we found that scale-free correlation in a two-dimensional flock is more robust than in a three-dimensional flock with respect to the threshold parameter. Furthermore, we found a qualitative difference in behavior, as measured by the fluctuation coherence, from what we observed in three-dimensional flocking. Our study suggests that two-dimensional flocks try to maintain a balance between flock size and flock mobility by breaking into several smaller flocks. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
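The difference between metric and topological interaction rules is easy to state in code: a metric neighborhood contains every flockmate within a fixed radius, while a topological neighborhood contains the k nearest flockmates (k = 7 in the model discussed) regardless of how far away they are. A 2-D sketch:

```python
import math

def metric_neighbors(birds, i, radius):
    """All flockmates within a fixed metric radius of bird i."""
    xi, yi = birds[i]
    return [j for j, (x, y) in enumerate(birds)
            if j != i and math.hypot(x - xi, y - yi) <= radius]

def topological_neighbors(birds, i, k=7):
    """The k nearest flockmates of bird i, regardless of distance."""
    xi, yi = birds[i]
    others = [(math.hypot(x - xi, y - yi), j)
              for j, (x, y) in enumerate(birds) if j != i]
    others.sort()
    return [j for _, j in others[:k]]
```

In a sparse flock the metric rule can leave a bird with no neighbors at all, while the topological rule always yields exactly k; the paper's metric-topological model interpolates between these two regimes.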

  5. Local adjacency metric dimension of sun graph and stacked book graph

    Science.gov (United States)

    Yulisda Badri, Alifiah; Darmaji

    2018-03-01

    A graph is a mathematical structure consisting of a non-empty set of vertices and a (possibly empty) set of edges. One of the topics studied in graph theory is the metric dimension. An application of the metric dimension is robot navigation: a robot moves from vertex to vertex in a field and minimizes the errors that occur in translating the instructions (codes) obtained from the vertices of its location. For the robot to move efficiently, it must quickly translate the codes of the location vertices it passes, so the location vertices should be at minimum distance. However, if the robot moves over a very large field, it may not be able to detect the location vertices because the distances are too great [6]. In this case, the robot can determine its position by utilizing location vertices based on adjacency. The problem is to find the minimum cardinality of the required location vertices, and where to place them, so that the robot can determine its location. The solution to this problem is given by the adjacency metric dimension and adjacency metric bases. Rodríguez-Velázquez and Fernau combined the adjacency metric dimension with the local metric dimension, obtaining the local adjacency metric dimension, in which only adjacent vertices are required to have distinct adjacency representations. To obtain the local adjacency metric dimension of the sun graph and the stacked book graph, a construction method is used that considers the representation of each adjacent vertex of the graph.
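For small graphs the local adjacency metric dimension can be computed by brute force: using the adjacency distance d(u, w) = 0, 1, or 2 (equal, adjacent, non-adjacent), find the smallest witness set under which every pair of adjacent vertices receives distinct codes. This exhaustive search is illustrative only and is infeasible for large graphs:

```python
from itertools import combinations

def adjacency_distance(adj, u, w):
    """Adjacency distance: 0 if equal, 1 if adjacent, 2 otherwise."""
    if u == w:
        return 0
    return 1 if w in adj[u] else 2

def is_local_adjacency_resolving(adj, witness):
    """Adjacent vertices must get distinct codes w.r.t. the witness set."""
    code = {v: tuple(adjacency_distance(adj, v, w) for w in witness)
            for v in adj}
    return all(code[u] != code[v] for u in adj for v in adj[u] if u < v)

def local_adjacency_metric_dimension(adj):
    """Smallest witness-set size over all subsets (brute force)."""
    verts = sorted(adj)
    for size in range(1, len(verts) + 1):
        for witness in combinations(verts, size):
            if is_local_adjacency_resolving(adj, witness):
                return size
    return len(verts)
```

For the 4-cycle a single witness vertex already separates all adjacent pairs, while the triangle needs two, matching the definition above.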

  6. Unifying distance-based goodness-of-fit indicators for hydrologic model assessment

    Science.gov (United States)

    Cheng, Qinbo; Reinhardt-Imjela, Christian; Chen, Xi; Schulte, Achim

    2014-05-01

    The goodness-of-fit indicator, i.e. the efficiency criterion, is very important for model calibration. However, knowledge about goodness-of-fit indicators has so far been largely empirical and lacks theoretical support. Based on likelihood theory, a unified distance-based goodness-of-fit indicator termed the BC-GED model is proposed, which uses the Box-Cox (BC) transformation to remove the heteroscedasticity of model errors and the generalized error distribution (GED) with zero mean to fit the distribution of the model errors after the BC transformation. The BC-GED model unifies all recent distance-based goodness-of-fit indicators, and reveals that the widely used mean square error (MSE) and mean absolute error (MAE) imply the statistical assumptions that the model errors follow the Gaussian distribution and the Laplace distribution with zero mean, respectively. Empirical knowledge about goodness-of-fit indicators can also be easily interpreted by the BC-GED model; e.g. the sensitivity to high flow of goodness-of-fit indicators with a large power of model errors results from the low probability of large model errors in the assumed distribution of these indicators. In order to assess the effect of the parameters of the BC-GED model (i.e. the BC transformation parameter λ and the GED kurtosis coefficient β, also termed the power of model errors) on hydrologic model calibration, six cases of the BC-GED model were applied in the Baocun watershed (East China) with the SWAT-WB-VSA model. Comparison of the inferred model parameters and model simulation results among the six indicators demonstrates that these indicators can be clearly separated into two classes by the GED kurtosis β: β > 1 and β ≤ 1. SWAT-WB-VSA calibrated by the class β > 1 of distance-based goodness-of-fit indicators captures high flow very well but mimics the baseflow badly, whereas calibrated by the class β ≤ 1 it mimics the baseflow very well, because the larger the value of β, the greater the emphasis put on
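The role of the error power β can be made concrete: a zero-mean GED likelihood leads (up to constants) to minimizing Σ|e|^β, so β = 2 recovers the MSE criterion (Gaussian errors) and β = 1 the MAE criterion (Laplace errors). A sketch of this special-case relationship, with an illustrative Box-Cox transform alongside:

```python
import math

def box_cox(y, lam):
    """Box-Cox transform used to remove heteroscedasticity (y > 0)."""
    return math.log(y) if lam == 0 else (y ** lam - 1.0) / lam

def ged_loss(errors, beta):
    """Power-of-errors loss implied by a zero-mean GED likelihood:
    beta = 2 gives n times the MSE, beta = 1 gives n times the MAE."""
    return sum(abs(e) ** beta for e in errors)

# Illustrative residuals from some hypothetical model run.
errors = [0.5, -1.5, 2.0, -0.25]
n = len(errors)
mse = sum(e * e for e in errors) / n
mae = sum(abs(e) for e in errors) / n
```

Large β inflates the contribution of big residuals, which is exactly the high-flow sensitivity the abstract describes.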

  7. Agent-Based Modeling of Consumer Decision making Process Based on Power Distance and Personality

    NARCIS (Netherlands)

    Roozmand, O.; Ghasem-Aghaee, N.; Hofstede, G.J.; Nematbakhsh, M.A.; Baraani, A.; Verwaart, T.

    2011-01-01

    Simulating consumer decision-making processes involves different disciplines such as sociology, social psychology, marketing, and computer science. In this paper, we propose an agent-based conceptual and computational model of consumer decision-making based on culture, personality and human needs.

  8. Long-distance transmission of light in a scintillator-based radiation detector

    Science.gov (United States)

    Dowell, Jonathan L.; Talbott, Dale V.; Hehlen, Markus P.

    2017-07-11

    Scintillator-based radiation detectors capable of transmitting light indicating the presence of radiation for long distances are disclosed herein. A radiation detector can include a scintillator layer and a light-guide layer. The scintillator layer is configured to produce light upon receiving incident radiation. The light-guide layer is configured to receive light produced by the scintillator layer and either propagate the received light through the radiation detector or absorb the received light and emit light, through fluorescence, that is propagated through the radiation detector. A radiation detector can also include an outer layer partially surrounding the scintillator layer and light-guide layer. The index of refraction of the light-guide layer can be greater than the index of refraction of adjacent layers.

  9. Role of Distance-Based Routing in Traffic Dynamics on Mobile Networks

    Science.gov (United States)

    Yang, Han-Xin; Wang, Wen-Xu

    2013-06-01

    Despite intensive investigation of transportation dynamics on complex networks with fixed structures, a deep understanding of networks consisting of mobile nodes remains challenging, particularly because insight into the effects of routing strategies on transmission efficiency is lacking. We introduce a distance-based routing strategy for networks of mobile agents aimed at enhancing network throughput and transmission efficiency. We study the transportation capacity and delivery time of data packets as functions of mobility and communication ability. Interestingly, we find that the transportation capacity is optimized at a moderate moving speed, which is quite different from the random routing strategy. In addition, both continuous and discontinuous transitions from free flow to congestion are observed. Degree distributions are explored in order to explain the enhancement of network throughput and the other observations. Our work is valuable for understanding complex transportation dynamics and for designing effective routing protocols.
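Greedy distance-based routing of the kind studied here can be sketched in a few lines: each node forwards a packet to the in-range neighbor that is closest to the destination, giving up if no neighbor improves on its own distance. The exact forwarding rule in the paper may differ; this shows only the core idea:

```python
import math

def next_hop(positions, current, dest, comm_range):
    """Greedy distance-based routing: forward the packet to the in-range
    neighbor closest to the destination (None if no neighbor helps)."""
    cx, cy = positions[current]
    dx, dy = positions[dest]
    best, best_d = None, math.hypot(cx - dx, cy - dy)
    for node, (x, y) in positions.items():
        if node == current:
            continue
        if math.hypot(x - cx, y - cy) <= comm_range:
            d = math.hypot(x - dx, y - dy)
            if d < best_d:
                best, best_d = node, d
    return best
```

Node mobility matters because a packet stalled with no useful neighbor (the `None` case) may find one after the nodes have moved, which is one intuition for the moderate-speed optimum.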

  10. Ratbot automatic navigation by electrical reward stimulation based on distance measurement in unknown environments.

    Science.gov (United States)

    Gao, Liqiang; Sun, Chao; Zhang, Chen; Zheng, Nenggan; Chen, Weidong; Zheng, Xiaoxiang

    2013-01-01

    Traditional automatic navigation methods for bio-robots are constrained to preconfigured environments and thus cannot be applied to tasks in unknown environments. By treating bio-robots in the same way as mechanical robots, with no consideration of the animal's own innate abilities, those methods neglect the intelligent behavior of animals. This paper proposes a novel automatic navigation method for ratbots in unknown environments using only reward stimulation and distance measurement. By exploiting the rat's habit of thigmotaxis and its reward-seeking behavior, this method incorporates the rat's intrinsic intelligence for obstacle avoidance and path searching into navigation. Experimental results show that the method works robustly and can successfully navigate the ratbot to a target in an unknown environment. This work may lay a solid foundation for the application of ratbots and also has significant implications for the automatic navigation of other bio-robots.

  11. Tactile-Sight: A Sensory Substitution Device Based on Distance-Related Vibrotactile Flow

    Directory of Open Access Journals (Sweden)

    Leandro Cancar

    2013-06-01

    Full Text Available Sensory substitution is a research field of increasing interest with regard to technical, applied and theoretical issues. Among the latter, it is of central interest to understand the form in which humans perceive the environment. Ecological psychology, among other approaches, proposes that we can detect higher-order informational variables (in the sense that they are defined over substantial spatial and temporal intervals) that specify our interaction with the environment. When using a vibrotactile sensory substitution device, it is reasonable to ask whether stimulation on the skin can be exploited to detect higher-order variables. Motivated by this question, a portable vibrotactile sensory substitution device was built, using distance-based information as a source and driving a large number of vibrotactile actuators (72 in the reported version, 120 maximum). The portable device was designed to explore real environments, allowing natural unrestricted movement for the user while providing contingent real-time vibrotactile information. Two preliminary experiments were performed. In the first, participants were asked to detect the time to contact of an approaching ball in a simulated (desktop) environment. Reasonable performance was observed in all experimental conditions, including the one with only tactile stimulation. In the second experiment, a portable version of the device was used in a real environment, where participants were asked to hit an approaching ball. Participants were able to coordinate their arm movements with the vibrotactile stimulation with appropriate timing. We conclude that vibrotactile flow can be generated by distance-based activation of the actuators and that this stimulation on the skin allows users to perceive time-to-contact-related environmental properties.

  12. Some Metric Properties of Planar Gaussian Free Field

    Science.gov (United States)

    Goswami, Subhajit

    In this thesis we study the properties of some metrics arising from the two-dimensional Gaussian free field (GFF), namely the Liouville first-passage percolation (Liouville FPP), the Liouville graph distance and an effective resistance metric. In Chapter 1, we define these metrics and discuss the motivations for studying them. Roughly speaking, Liouville FPP is the shortest-path metric in a planar domain D where the length of a path P is given by ∫_P e^{γh(z)} |dz|, where h is the GFF on D and γ > 0. In Chapter 2, we present an upper bound on the expected Liouville FPP distance between two typical points for small values of γ (the near-Euclidean regime). A similar upper bound is derived in Chapter 3 for the Liouville graph distance, which is, roughly, the minimal number of Euclidean balls with comparable Liouville quantum gravity (LQG) measure whose union contains a continuous path between two endpoints. Our bounds seem to be in disagreement with Watabiki's prediction (1993) on the random metric of Liouville quantum gravity in this regime. The contents of these two chapters are based on joint work with Jian Ding. In Chapter 4, we derive some asymptotic estimates for effective resistances on a random network defined as follows: given any γ > 0, and with η = {η_v}_{v∈ℤ²} denoting a sample of the two-dimensional discrete Gaussian free field on ℤ² pinned at the origin, we equip the edge (u, v) with conductance e^{γ(η_u + η_v)}. The metric structure of effective resistance plays a crucial role in our proof of the main result in Chapter 4. The primary motivation behind this metric is to understand the random walk on ℤ² in which the edge (u, v) has weight e^{γ(η_u + η_v)}. Using the estimates from Chapter 4 we show in Chapter 5 that for almost every η this random walk is recurrent and that, with probability tending to 1 as T → ∞, the return probability at time 2T decays as T^{−1+o(1)}. In addition, we prove a version of subdiffusive

  13. Kullback-Leibler distance-based enhanced detection of incipient anomalies

    KAUST Repository

    Harrou, Fouzi

    2016-09-09

    Accurate and effective anomaly detection and diagnosis of modern engineering systems through process monitoring ensure the reliability and safety of a product while maintaining the desired quality. In this paper, an innovative method based on the Kullback-Leibler divergence for detecting incipient anomalies in highly correlated multivariate data is presented. We use a partial least squares (PLS) method as the modeling framework and a symmetrized Kullback-Leibler distance (KLD) as the anomaly indicator, which quantifies the dissimilarity between the current PLS-based residual distribution and a reference probability distribution obtained using fault-free data. Furthermore, this paper reports the development of two monitoring charts based on the KLD. The first approach is a KLD-Shewhart chart, where the Shewhart monitoring chart with a three-sigma rule is used to monitor the KLD of the response-variable residuals from the PLS model. The second approach integrates the KLD statistic into the exponentially weighted moving average monitoring chart. The performance of the PLS-based KLD anomaly-detection methods is illustrated and compared to that of conventional PLS-based anomaly-detection methods. Using synthetic data and simulated distillation column data, we demonstrate the greater sensitivity and effectiveness of the developed method over the conventional PLS-based methods, especially when data are highly correlated and small anomalies are of interest. Results indicate that the proposed chart is a very promising KLD-based method because KLD-based charts are, in practice, designed to detect small shifts in process parameters. © 2016 Elsevier Ltd
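For two univariate Gaussians the symmetrized KLD has a closed form, which makes the anomaly indicator easy to illustrate: it is zero when the monitored residual distribution matches the fault-free reference and grows with any mean shift or variance change. A sketch (the paper applies this idea to PLS residuals in the multivariate setting):

```python
import math

def kl_gauss(mu_p, sd_p, mu_q, sd_q):
    """KL(p || q) between two univariate Gaussians (closed form)."""
    return (math.log(sd_q / sd_p)
            + (sd_p ** 2 + (mu_p - mu_q) ** 2) / (2.0 * sd_q ** 2)
            - 0.5)

def symmetric_kld(mu_p, sd_p, mu_q, sd_q):
    """Symmetrized KLD used as an anomaly indicator: zero for identical
    distributions, growing as the monitored residual distribution drifts
    away from the fault-free reference."""
    return kl_gauss(mu_p, sd_p, mu_q, sd_q) + kl_gauss(mu_q, sd_q, mu_p, sd_p)
```

Even a small mean shift yields a strictly positive indicator, which is why KLD-based charts are suited to incipient anomalies.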

  14. Defining functional distances over Gene Ontology

    Directory of Open Access Journals (Sweden)

    del Pozo Angela

    2008-01-01

    Full Text Available Abstract Background A fundamental problem when trying to define the functional relationships between proteins is the difficulty in quantifying functional similarities, even when well-structured ontologies regarding the activity of proteins exist (e.g. the Gene Ontology, GO). However, functional metrics can overcome the problems of comparing and evaluating functional assignments and predictions. As a reference of proximity, previous approaches to comparing GO terms considered linkage within the ontology weighted by a probability distribution that balances the non-uniform 'richness' of different parts of the Directed Acyclic Graph. Here, we have followed a different approach to quantify functional similarities between GO terms. Results We propose a new method to derive 'functional distances' between GO terms that is based on the simultaneous occurrence of terms in the same set of InterPro entries, instead of relying on the structure of the GO. The co-occurrence of GO terms reveals natural biological links between GO functions and defines a distance model D_f which fulfils the properties of a metric space. The distances obtained in this way can be represented as a hierarchical 'Functional Tree'. Conclusion The proposed method provides a new definition of distance that enables the similarity between GO terms to be quantified. Additionally, the 'Functional Tree' defines groups with biological meaning, enhancing its utility for protein function comparison and prediction. Finally, this approach could be used for function-based protein searches in databases and for analysing the gene clusters produced by DNA array experiments.
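    A co-occurrence-based functional distance of this flavor can be sketched as follows. The Jaccard distance used here is an illustrative stand-in for the paper's D_f (both are genuine metrics built from term co-occurrence over shared entries), and the GO/InterPro identifiers and annotation sets are toy assumptions:

    ```python
    def cooccurrence_distance(entries_a, entries_b):
        """Jaccard distance between the sets of entries annotated with two GO terms.
        Illustrative stand-in for the paper's D_f; Jaccard distance is a true metric."""
        a, b = set(entries_a), set(entries_b)
        if not a and not b:
            return 0.0
        return 1.0 - len(a & b) / len(a | b)

    # Hypothetical InterPro-entry annotations for three GO terms
    annotations = {
        "GO:0016301": {"IPR000719", "IPR011009", "IPR017441"},  # kinase activity
        "GO:0004672": {"IPR000719", "IPR011009"},               # protein kinase activity
        "GO:0005524": {"IPR027417"},                            # ATP binding
    }
    d_close = cooccurrence_distance(annotations["GO:0016301"], annotations["GO:0004672"])
    d_far = cooccurrence_distance(annotations["GO:0016301"], annotations["GO:0005524"])
    print(d_close, d_far)   # functionally related terms come out closer
    ```

    Terms that annotate overlapping sets of entries come out close; terms with no shared entries are at the maximal distance of 1, which is what makes a hierarchical 'Functional Tree' built on such distances meaningful.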

  15. Ranking and selection of commercial off-the-shelf using fuzzy distance based approach

    Directory of Open Access Journals (Sweden)

    Rakesh Garg

    2015-06-01

    Full Text Available There has been tremendous growth in the use of the component-based software engineering (CBSE) approach for the development of software systems. The selection of the best-suited COTS components that fulfil the necessary requirements for the development of software has become a major challenge for software developers. The complexity of the optimal selection problem increases with an increase in the number of alternative potential COTS components and the corresponding selection criteria. In this research paper, the problem of ranking and selection of Data Base Management System (DBMS) components is modeled as a multi-criteria decision making problem. A 'Fuzzy Distance Based Approach' (FDBA) is proposed for the optimal ranking and selection of DBMS COTS components of an e-payment system based on 14 selection criteria grouped under three major categories, i.e. 'Vendor Capabilities', 'Business Issues' and 'Cost'. The results of this method are compared with those of the Analytical Hierarchy Process (AHP), a typical multi-criteria decision making approach. The proposed methodology is illustrated with an example.

  16. Survival As a Quality Metric of Cancer Care: Use of the National Cancer Data Base to Assess Hospital Performance.

    Science.gov (United States)

    Shulman, Lawrence N; Palis, Bryan E; McCabe, Ryan; Mallin, Kathy; Loomis, Ashley; Winchester, David; McKellar, Daniel

    2018-01-01

    Survival is considered an important indicator of the quality of cancer care, but the validity of different methodologies to measure comparative survival rates is less well understood. We explored whether the National Cancer Data Base (NCDB) could serve as a source of unadjusted and risk-adjusted cancer survival data and whether these data could be used as quality indicators for individual hospitals or in the aggregate by hospital type. The NCDB, an aggregate of > 1,500 hospital cancer registries, was queried to analyze unadjusted and risk-adjusted hazards of death for patients with stage III breast cancer (n = 116,787) and stage IIIB or IV non-small-cell lung cancer (n = 252,392). Data were analyzed at the individual hospital level and by hospital type. At the hospital level, after risk adjustment, few hospitals had comparative risk-adjusted survival rates that were statistically better or worse. By hospital type, National Cancer Institute-designated comprehensive cancer centers had risk-adjusted survival ratios that were statistically significantly better than those of academic cancer centers and community hospitals. Using the NCDB as the data source, survival rates for patients with stage III breast cancer and stage IIIB or IV non-small-cell lung cancer were statistically better at National Cancer Institute-designated comprehensive cancer centers when compared with other hospital types. Compared with academic hospitals, risk-adjusted survival was lower in community hospitals. At the individual hospital level, after risk adjustment, few hospitals were shown to have statistically better or worse survival, suggesting that, using NCDB data, survival may not be a good metric to determine relative quality of cancer care at this level.

  17. A DISTANCE EDUCATION MODEL FOR JORDANIAN STUDENTS BASED ON AN EMPIRICAL STUDY

    Directory of Open Access Journals (Sweden)

    Ahmad SHAHER MASHHOUR

    2007-04-01

    Full Text Available Distance education is expanding worldwide. Numbers of students enrolled in distance education are increasing at very high rates. Distance education is said to be the future of education because it addresses the educational needs of the new millennium. This paper presents the findings of an empirical study of a sample of Jordanian distance education students, distilled into a requirements model that addresses the need for such education at the national level. The responses of the sample show that distance education offers a viable and satisfactory alternative to those who cannot enroll in regular residential education. The study also shows that the shortcomings of the regular and the current form of distance education in Jordan can be overcome by the use of modern information technology.

  18. Comparison of efficiency of distance measurement methodologies in mango (Mangifera indica) progenies based on physicochemical descriptors.

    Science.gov (United States)

    Alves, E O S; Cerqueira-Silva, C B M; Souza, A M; Santos, C A F; Lima Neto, F P; Corrêa, R X

    2012-03-14

    We investigated seven distance measures in a set of observations of physicochemical variables of mango (Mangifera indica) submitted to multivariate analyses (distance, projection and grouping). To estimate the distance measures, five mango progenies (25 genotypes in total) were analyzed using six fruit physicochemical descriptors (fruit weight, equatorial diameter, longitudinal diameter, total soluble solids in °Brix, total titratable acidity, and pH). The distance measures were compared by the Spearman correlation test, projection in two-dimensional space and grouping efficiency. The Spearman correlation coefficients between the seven distance measures were, except for Mahalanobis' generalized distance (0.41 ≤ rs ≤ 0.63), high and significant (rs ≥ 0.91; P < 0.001). Regardless of the origin of the distance matrix, the unweighted pair group method with arithmetic mean (UPGMA) proved to be the most adequate grouping method. The various distance measures and grouping methods gave different values for distortion (-116.5 ≤ D ≤ 74.5), cophenetic correlation (0.26 ≤ rc ≤ 0.76) and stress (-1.9 ≤ S ≤ 58.9). Choice of distance measure and analysis methods influence the.
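    The comparison pipeline described above (pairwise distances, Spearman agreement between measures, UPGMA grouping and its cophenetic correlation) can be sketched with SciPy. The descriptor matrix below is random stand-in data with the same shape as the study (25 genotypes × 6 descriptors), not the mango dataset:

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist
    from scipy.cluster.hierarchy import linkage, cophenet
    from scipy.stats import spearmanr

    rng = np.random.default_rng(1)
    X = rng.normal(size=(25, 6))   # 25 genotypes x 6 physicochemical descriptors

    d_eucl = pdist(X, metric="euclidean")
    d_manh = pdist(X, metric="cityblock")

    # Agreement between two distance measures (Spearman rank correlation)
    rs = spearmanr(d_eucl, d_manh).correlation

    # UPGMA grouping (average linkage) and its cophenetic correlation per measure
    rc_eucl = cophenet(linkage(d_eucl, method="average"), d_eucl)[0]
    rc_manh = cophenet(linkage(d_manh, method="average"), d_manh)[0]
    print(round(rs, 2), round(rc_eucl, 2), round(rc_manh, 2))
    ```

    A cophenetic correlation near 1 means the dendrogram distorts the original distance matrix little, which is the sense in which UPGMA was judged "most adequate" in the study.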

  19. Comparing Single Case Design Overlap-Based Effect Size Metrics from Studies Examining Speech Generating Device Interventions

    Science.gov (United States)

    Chen, Mo; Hyppa-Martin, Jolene K.; Reichle, Joe E.; Symons, Frank J.

    2016-01-01

    Meaningfully synthesizing single case experimental data from intervention studies comprised of individuals with low incidence conditions and generating effect size estimates remains challenging. Seven effect size metrics were compared for single case design (SCD) data focused on teaching speech generating device use to individuals with…

  20. Distance Learning

    National Research Council Canada - National Science Library

    Braddock, Joseph

    1997-01-01

    A study reviewing the existing Army Distance Learning Plan (ADLP) and current Distance Learning practices, with a focus on the Army's training and educational challenges and the benefits of applying Distance Learning techniques...

  1. Distance based control system for machine vision-based selective spraying

    NARCIS (Netherlands)

    Steward, B.L.; Tian, L.F.; Tang, L.

    2002-01-01

    For effective operation of a selective sprayer with real-time local weed sensing, herbicides must be delivered, accurately to weed targets in the field. With a machine vision-based selective spraying system, acquiring sequential images and switching nozzles on and off at the correct locations are

  2. Inferring nonlinear gene regulatory networks from gene expression data based on distance correlation.

    Directory of Open Access Journals (Sweden)

    Xiaobo Guo

    Full Text Available Nonlinear dependence is common in the regulation mechanisms of gene regulatory networks (GRNs). It is vital to properly measure or test nonlinear dependence in real data for reconstructing GRNs and understanding the complex regulatory mechanisms within the cellular system. A recently developed measure called the distance correlation (DC) has been shown to be powerful and computationally efficient for detecting nonlinear dependence in many situations. In this work, we incorporate the DC into inferring GRNs from gene expression data without any underlying distribution assumptions. We propose three DC-based GRN inference algorithms: CLR-DC, MRNET-DC and REL-DC, and then compare them with the mutual information (MI)-based algorithms by analyzing two simulated datasets: benchmark GRNs from the DREAM challenge and GRNs generated by the SynTReN network generator, and an experimentally determined SOS DNA repair network in Escherichia coli. According to both the receiver operating characteristic (ROC) curve and the precision-recall (PR) curve, our proposed algorithms significantly outperform the MI-based algorithms in GRN inference.
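    The empirical distance correlation itself is straightforward to compute from doubly centered pairwise-distance matrices (Székely's estimator). The sketch below, on synthetic data, shows it detecting a nonlinear dependence that the Pearson correlation misses entirely:

    ```python
    import numpy as np

    def distance_correlation(x, y):
        """Empirical distance correlation (Székely et al.) for two 1-D samples."""
        x = np.asarray(x, dtype=float)[:, None]
        y = np.asarray(y, dtype=float)[:, None]
        a = np.abs(x - x.T)                      # pairwise distance matrices
        b = np.abs(y - y.T)
        A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()   # double centering
        B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
        dcov2 = (A * B).mean()
        dvar_x, dvar_y = (A * A).mean(), (B * B).mean()
        if dvar_x * dvar_y == 0:
            return 0.0
        return np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y))

    rng = np.random.default_rng(2)
    x = rng.uniform(-1, 1, 500)
    y = x ** 2                                   # purely nonlinear dependence
    print(round(distance_correlation(x, y), 2),              # DC detects it
          round(float(abs(np.corrcoef(x, y)[0, 1])), 2))     # Pearson is near zero
    ```

    Because y = x² is symmetric in x, the linear correlation vanishes while the distance correlation stays well away from zero; this is exactly the property that motivates DC-based network inference.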

  3. Regularization based on steering parameterized Gaussian filters and a Bhattacharyya distance functional

    Science.gov (United States)

    Lopes, Emerson P.

    2001-08-01

    Template regularization embeds the problem of class separability. From a machine vision perspective, this problem is critical when a textural classification procedure is applied to non-stationary pattern mosaic images. Such applications often show poor accuracy due to disturbances of the classifiers produced by exogenous or endogenous perturbations of signal regularity. Natural scene imaging, where images present a certain degree of homogeneity in texture element size or shape (primitives), shows a variety of behaviors, especially varying preferential spatial directionality. The space-time image pattern characterization can only be solved if classification procedures are designed with the most robust tools from a parallel and hardware perspective. The results compared in this paper are obtained using a framework based on a multi-resolution, frame and hypothesis approach. Two strategies for applying the bank of Gabor filters are considered: an adaptive strategy using the KL transform and a fixed-configuration strategy. The regularization under discussion is accomplished in the pyramid-building stage. The filters are steered Gaussians controlled by free parameters, which are adjusted through a feedback process driven by hints obtained from interaction functionals between sequences of frames, post-processed during training, which includes classification of training-set samples as examples. Beyond these adjustments there is continuous, input-data-sensitive adaptation. The experimental assessments focus on two basic issues: the Bhattacharyya distance as a pattern characterization feature, and the combination of the KL transform for feature selection with regularization of the pattern Bhattacharyya distance functional (BDF) behavior, using BDF state separability and symmetry as the main indicators of an optimal framework parameter configuration.
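    For reference, the Bhattacharyya distance between two Gaussian class models, a standard measure of class separability, has a closed form. The sketch below is generic and not tied to the paper's Gabor-filter framework:

    ```python
    import numpy as np

    def bhattacharyya_gauss(mu1, cov1, mu2, cov2):
        """Bhattacharyya distance between two Gaussian class models:
        D_B = (1/8)(mu1-mu2)^T S^-1 (mu1-mu2) + (1/2) ln(det S / sqrt(det S1 det S2)),
        with S = (S1 + S2)/2. Larger D_B means better class separability."""
        mu1, mu2 = np.asarray(mu1, float), np.asarray(mu2, float)
        cov1 = np.atleast_2d(cov1).astype(float)
        cov2 = np.atleast_2d(cov2).astype(float)
        cov = 0.5 * (cov1 + cov2)
        diff = mu1 - mu2
        term1 = 0.125 * diff @ np.linalg.solve(cov, diff)
        term2 = 0.5 * np.log(np.linalg.det(cov) /
                             np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2)))
        return term1 + term2

    d_same = bhattacharyya_gauss([0, 0], np.eye(2), [0, 0], np.eye(2))
    d_sep = bhattacharyya_gauss([0, 0], np.eye(2), [3, 0], np.eye(2))
    print(d_same, d_sep)   # identical classes -> 0; well-separated classes -> large
    ```

    The first term captures mean separation, the second the mismatch of covariances; using D_B between texture-class feature distributions as a separability indicator is what the framework above regularizes.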

  4. METHODS OF DISTANCE MEASUREMENT’S ACCURACY INCREASING BASED ON THE CORRELATION ANALYSIS OF STEREO IMAGES

    Directory of Open Access Journals (Sweden)

    V. L. Kozlov

    2018-01-01

    Full Text Available To increase the accuracy of restoring a three-dimensional picture of a scene from two-dimensional digital images, new effective techniques and algorithms for the processing and correlation analysis of digital images are required. Tools are being actively developed that reduce the time cost of processing stereo images, improve the quality of depth-map construction and automate it. The aim of this work is to investigate the possibilities of using various digital image processing techniques to improve the measurement accuracy of a rangefinder based on correlation analysis of a stereo image. Results are presented on the influence of color-channel mixing techniques on distance measurement accuracy for various functions realizing correlation processing of images. The possibility of using an integral representation of images to reduce the time cost of constructing a depth map is analyzed, as is the use of image pre-filtering before correlation processing when measuring distance by stereo imaging. It is found that uniform mixing of channels minimizes the total number of measurement errors, whereas extracting brightness according to the sRGB standard increases the number of errors for all of the correlation processing techniques considered. An integral representation of the image accelerates the correlation processing, but this method is only useful for depth-map calculation on images of no more than 0.5 megapixels. Filtering images before correlation processing can provide, depending on the filter parameters, either an increase in the correlation function value, which is useful for analyzing noisy images, or compression of the correlation function.
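    The integral (summed-area) representation mentioned above is what makes window sums, and hence block-based correlation matching, constant-time per window. A minimal sketch:

    ```python
    import numpy as np

    def integral_image(img):
        """Summed-area table with a zero top row/left column for easy indexing."""
        ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
        ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
        return ii

    def window_sum(ii, r, c, h, w):
        """Sum of img[r:r+h, c:c+w] in O(1) from the integral image."""
        return ii[r + h, c + w] - ii[r, c + w] - ii[r + h, c] + ii[r, c]

    rng = np.random.default_rng(3)
    img = rng.integers(0, 256, size=(64, 64))
    ii = integral_image(img)
    assert window_sum(ii, 10, 20, 8, 8) == img[10:18, 20:28].sum()
    ```

    Sliding-window terms such as local means and local energies, needed by normalized cross-correlation, reduce to four lookups each, independent of the window size; the memory overhead of the table is what limits the benefit on large images, consistent with the 0.5-megapixel observation above.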

  5. Implementing Practical Based Courses under Open and Distance Learning System: A Study of the Perception of Learners and Counsellors

    Science.gov (United States)

    Basantia, Tapan Kumar

    2018-01-01

    Implementing practical based courses under Open and Distance Learning (ODL) system is a very difficult and challenging task as the teaching of practical based courses involves intensive practical work. For removing the difficulties and challenges in implementing the practical based courses under ODL system, there is a need to study the existing…

  6. Network-level accident-mapping: Distance based pattern matching using artificial neural network.

    Science.gov (United States)

    Deka, Lipika; Quddus, Mohammed

    2014-04-01

    The objective of an accident-mapping algorithm is to snap traffic accidents onto the correct road segments. Assigning accidents to the correct segments makes it possible to robustly carry out key analyses in accident research, including the identification of accident hot-spots, network-level risk mapping and segment-level accident risk modelling. Existing accident-mapping algorithms have some severe limitations: (i) they are not easily 'transferable', as the algorithms are specific to given accident datasets; (ii) they do not perform well in all road-network environments, such as areas of dense road network; and (iii) the methods used do not perform well in addressing the inaccuracies inherent in each type of road environment. The purpose of this paper is to develop a new accident-mapping algorithm based on the common variables observed in most accident databases (e.g. road name and type, direction of vehicle movement before the accident and recorded accident location). The challenges here are to: (i) develop a method that takes into account uncertainties inherent in the recorded traffic accident data and the underlying digital road network data, (ii) accurately determine the type and proportion of inaccuracies, and (iii) develop a robust algorithm that can be adapted for any accident set and road network of varying complexity. In order to overcome these challenges, a distance-based pattern-matching approach is used to identify the correct road segment. This is based on vectors containing feature values that are common to the accident data and the network data. Since each feature does not contribute equally towards the identification of the correct road segment, an ANN approach using the single-layer perceptron is used to assist in "learning" the relative importance of each feature in the distance calculation and hence the correct link identification.
The performance of the developed algorithm was evaluated based on a reference accident dataset from the UK confirming that
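    The idea of a single-layer perceptron learning per-feature weights for the distance calculation can be sketched on synthetic feature-difference vectors. Everything below is hypothetical — the features, the labelling rule and its coefficients are illustrative assumptions, not the paper's UK dataset or trained model:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Hypothetical feature-difference vectors between an accident record and candidate
    # road segments: (road-name mismatch, road-type mismatch, heading difference,
    # positional offset), all scaled to [0, 1]; label 1 = correct segment.
    raw = rng.uniform(0, 1, size=(1000, 4))
    score = 0.8 * raw[:, 0] + 0.1 * raw[:, 1] + 0.7 * raw[:, 2] + 0.1 * raw[:, 3]
    keep = (score < 0.5) | (score > 0.7)          # leave a margin between the classes
    X, y = raw[keep], (score[keep] < 0.5).astype(int)

    # A single-layer perceptron learns the relative importance of each feature:
    # strongly informative mismatch features acquire large negative weights.
    w, b = np.zeros(4), 0.0
    for _ in range(50):
        for xi, ti in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            w += 0.1 * (ti - pred) * xi
            b += 0.1 * (ti - pred)

    acc = float(np.mean((X @ w + b > 0).astype(int) == y))
    print(acc)
    ```

    After training, the magnitude of each weight reflects how much that feature's mismatch should count in the weighted distance used for link identification, which is the role the ANN plays in the algorithm above.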

  7. LAND-MAN: a new curriculum based on open distance learning for Asian

    Science.gov (United States)

    Guadagno, F. M.; Dhital, M. R.; Petley, D.

    2003-04-01

    Land-Man is a one-year Asian-European partnership project (Asia-Link EU programme), aiming to implement both a new curriculum and a new distance learning model in the field of landslides management which deals with situations that occur prior to, during, and after the landslide. The emphasis in Land-Man is placed on establishing methodologies, guidelines, and tools to develop Open and Distance Learning (ODL) for the future improvement and harmonisation of education in Landslides Management. Decision-makers, postgraduate students in environmental, earth and engineering disciplines, as well as professionals may benefit from the project. During the implementation of activities, the clear intention is to use internet-based tools in order to strengthen the co-operation between partners and thus lay a stable, cross-cultural, internet-oriented foundation for the future ODL-based educational model. At the end of the project, an ODL-based model for Asian-European Landslides Management Education will be designed and based on specially assembled, multimedia products. In particular, the project aims to provide tutors/professors with training by supplying them with appropriate materials and support to enable them to change to the new teaching model and by focusing on assessment of training, self-esteem, comfort level, commitment, and enthusiasm for tutors. The project also aims to nurture positive attitudes towards distance learning by changing the techniques whereby students learn landslides management, using the latest educational strategies and technology. Although the management of territory is the responsibility of national and local authorities, personnel in these departments can have limited training and experience in natural hazard and, particularly, in landslides management plans. This project will not only hypothesise, through a new curriculum, how management planning can be undertaken, but will also consider how to bring together practitioners and decision makers

  8. Distance and Density Similarity Based Enhanced k-NN Classifier for Improving Fault Diagnosis Performance of Bearings

    Directory of Open Access Journals (Sweden)

    Sharif Uddin

    2016-01-01

    Full Text Available An enhanced k-nearest neighbor (k-NN) classification algorithm is presented, which uses a density-based similarity measure in addition to a distance-based similarity measure to improve diagnostic performance in bearing fault diagnosis. Due to its use of a distance-based similarity measure alone, the classification accuracy of traditional k-NN deteriorates in the case of overlapping samples and outliers, and is highly susceptible to the neighborhood size, k. This study addresses these limitations by proposing the use of both distance- and density-based measures of similarity between training and test samples. The proposed k-NN classifier is used to enhance the diagnostic performance of a bearing fault diagnosis scheme, which classifies different fault conditions based upon hybrid feature vectors extracted from acoustic emission (AE) signals. Experimental results demonstrate that the proposed scheme, which uses the enhanced k-NN classifier, yields better diagnostic performance and is more robust to variations in the neighborhood size, k.
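    One way to combine the two notions of similarity is to weight each neighbor's vote by inverse distance times a local-density term, so that isolated outlier neighbors carry less weight. This is an illustrative formulation, not necessarily the paper's exact measure:

    ```python
    import numpy as np

    def knn_density_distance(X_train, y_train, x, k=5):
        """k-NN vote weighted by inverse distance and by each neighbor's local density.
        The density term (inverse mean distance to a neighbor's own k nearest points)
        down-weights outliers; illustrative, not the paper's exact formulation."""
        d = np.linalg.norm(X_train - x, axis=1)
        votes = {}
        for i in np.argsort(d)[:k]:
            di = np.linalg.norm(X_train - X_train[i], axis=1)
            local = np.sort(di)[1:k + 1].mean()      # skip distance to itself
            weight = 1.0 / (d[i] + 1e-12) / (local + 1e-12)
            votes[int(y_train[i])] = votes.get(int(y_train[i]), 0.0) + weight
        return max(votes, key=votes.get)

    rng = np.random.default_rng(5)
    class0 = rng.normal([0, 0], 0.5, size=(30, 2))   # e.g. normal condition
    class1 = rng.normal([3, 3], 0.5, size=(30, 2))   # e.g. outer-race fault
    X = np.vstack([class0, class1])
    y = np.array([0] * 30 + [1] * 30)
    print(knn_density_distance(X, y, np.array([0.2, 0.1])),
          knn_density_distance(X, y, np.array([2.8, 3.1])))
    ```

    Because the density term is a property of each training sample, it can be precomputed once, so the classification cost stays that of ordinary k-NN.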

  9. Assessment of the Effectiveness of Internet-Based Distance Learning through the VClass e-Education Platform

    Directory of Open Access Journals (Sweden)

    Chadchadaporn Pukkaew

    2013-09-01

    Full Text Available This study assesses the effectiveness of internet-based distance learning (IBDL) through the VClass live e-education platform. The research examines (1) the effectiveness of IBDL for regular and distance students and (2) the distance students' experience of VClass in the IBDL course entitled Computer Programming 1. The study employed the common definitions of evaluation to attain useful statistical results. The measurement instruments used were test scores and questionnaires. The sample consisted of 59 first-year undergraduate students, most of whom were studying computer information systems at Rajamangala University of Technology Lanna Chiang Mai in Thailand. The results revealed that distance students engaged in learning behavior only occasionally but that the effectiveness of learning was the same for distance and regular students. Moreover, the provided computer-mediated communication (CMC) channels (e.g., live chat, email, and discussion board) were sparingly used, primarily by male distance students. Distance students, regular students, the instructor, and the tutor agreed to use a social networking site, Facebook, rather than the provided CMC during the course. The evaluation results produce useful information that is applicable for developing and improving IBDL practices.

  10. High Girth Column-Weight-Two LDPC Codes Based on Distance Graphs

    Directory of Open Access Journals (Sweden)

    Gabofetswe Malema

    2007-01-01

    Full Text Available LDPC codes with column weight two are constructed from minimal distance graphs, or cages. Distance graphs are used to represent LDPC code matrices such that graph vertices represent rows and edges represent columns. The conversion of a distance graph into matrix form produces a matrix with column weight two and girth double that of the graph. The number of 1's in each row (the row weight) is equal to the degree of the corresponding vertex. By constructing graphs with different vertex degrees, we can vary the rate of the corresponding LDPC code matrices. Cage graphs are used as examples of distance graphs to design codes with different girths and rates. The performance of the obtained codes depends on the girth and structure of the corresponding distance graphs.
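    The graph-to-matrix conversion can be sketched directly: taking rows as vertices and columns as edges yields a parity-check matrix whose columns have weight exactly two (strictly speaking, the vertex-edge incidence matrix of the graph). The Petersen graph, the (3,5)-cage, serves as a concrete example:

    ```python
    import numpy as np

    def incidence_parity_check(n_vertices, edges):
        """Parity-check matrix from a distance graph: rows are vertices, columns are
        edges, so every column has weight exactly two, and the Tanner-graph girth is
        double the girth of the source graph."""
        H = np.zeros((n_vertices, len(edges)), dtype=int)
        for j, (u, v) in enumerate(edges):
            H[u, j] = H[v, j] = 1
        return H

    # Petersen graph: 3-regular with girth 5, the minimal such graph (the (3,5)-cage)
    outer = [(i, (i + 1) % 5) for i in range(5)]        # outer 5-cycle
    spokes = [(i, i + 5) for i in range(5)]             # spokes to the inner vertices
    inner = [(5 + i, 5 + (i + 2) % 5) for i in range(5)]  # inner pentagram
    H = incidence_parity_check(10, outer + spokes + inner)

    print(H.shape,                                  # 10 checks x 15 variable nodes
          H.sum(axis=0).tolist() == [2] * 15,       # column weight two everywhere
          H.sum(axis=1).tolist() == [3] * 10)       # row weight = vertex degree = 3
    ```

    Girth 5 in the graph gives girth 10 in the Tanner graph of H, and swapping in cages of other degrees changes the row weight and hence the code rate, exactly as described above.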

  11. New Cepheid distances to nearby galaxies based on BVRI CCD photometry. III - NGC 300

    Energy Technology Data Exchange (ETDEWEB)

    Freedman, W.L.; Madore, B.F.; Hawley, S.L.; Horowitz, I.K.; Mould, J.; Navarrete, M.; Sallmen, S. (Carnegie Institution of Washington, Observatories, Pasadena, CA (United States) JPL, Pasadena, CA (United States) Lawrence Livermore National Laboratory, Livermore, CA (United States) California Institute of Technology, Pasadena (United States) Cerro Tololo Inter-American Observatory, La Serena (Chile) California, University, Berkeley (United States))

    1992-09-01

    A true distance modulus of (m − M)_0 = 26.66 ± 0.10 mag (corresponding to 2.1 ± 0.1 Mpc) has been determined for the Sculptor Group spiral galaxy NGC 300. New CCD data have been obtained for a sample of known Cepheids in this galaxy, from which apparent distance moduli at B, V, R, and I wavelengths are determined. Combining the data available at different wavelengths, and assuming a true distance modulus to the LMC of 18.5 mag, a true distance modulus is obtained for NGC 300, corrected for the effects of interstellar reddening. The availability of a new distance to NGC 300 brings to five the total number of galaxies with new CCD photometry of Cepheids useful for calibration of the Hubble constant. 26 refs.
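    The quoted distance follows from the standard distance-modulus relation d[pc] = 10^((m − M)_0 / 5 + 1), which can be checked directly:

    ```python
    # Distance modulus to linear distance: d[pc] = 10 ** (mu / 5 + 1)
    mu = 26.66                       # true distance modulus of NGC 300 (mag)
    d_mpc = 10 ** (mu / 5 + 1) / 1e6   # convert parsecs to megaparsecs
    print(round(d_mpc, 1))           # -> 2.1 Mpc, as quoted in the abstract
    ```

    The ±0.10 mag uncertainty maps to roughly ±5% in distance (a factor 10^(0.10/5) ≈ 1.047), consistent with the quoted ±0.1 Mpc.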

  12. Hyperplane distance neighbor clustering based on local discriminant analysis for complex chemical processes monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Chunhong; Xiao, Shaoqing; Gu, Xiaofeng [Jiangnan University, Wuxi (China)

    2014-11-15

    The collected training data often include both normal and faulty samples for complex chemical processes. However, some monitoring methods, such as partial least squares (PLS), principal component analysis (PCA), independent component analysis (ICA) and Fisher discriminant analysis (FDA), require fault-free data to build the normal operation model. These techniques are applicable after the preliminary step of data clustering is applied. We here propose a novel hyperplane distance neighbor clustering (HDNC) based on the local discriminant analysis (LDA) for chemical process monitoring. First, faulty samples are separated from normal ones using the HDNC method. Then, the optimal subspace for fault detection and classification can be obtained using the LDA approach. The proposed method takes the multimodality within the faulty data into account, and thus improves the capability of process monitoring significantly. The HDNC-LDA monitoring approach is applied to two simulation processes and then compared with the conventional FDA based on the K-nearest neighbor (KNN-FDA) method. The results obtained in two different scenarios demonstrate the superiority of the HDNC-LDA approach in terms of fault detection and classification accuracy.

  13. A Simple Density with Distance Based Initial Seed Selection Technique for K Means Algorithm

    Directory of Open Access Journals (Sweden)

    Sajidha Syed Azimuddin

    2017-01-01

    Full Text Available Open issues with respect to the K-means algorithm include identifying the number of clusters, initial seed selection, clustering tendency, handling empty clusters, and identifying outliers. In this paper we propose a novel and simple technique that considers both the density and the distance of the concepts in a dataset to identify initial seed concepts for clustering. Many authors have proposed different techniques to identify initial seed concepts, but our method ensures that the initial seed concepts are chosen from the different clusters that are to be generated by the clustering solution. The hallmark of our algorithm is that it is a single-pass algorithm that does not require any extra parameters to be estimated. Further, our seed concepts are among the actual concepts and not the means of representative concepts, as is the case in many other algorithms. We have implemented our proposed algorithm and compared the results with the interval-based technique of Fouad Khan. We see that our method outperforms the interval-based method. We have also compared our method with the original random K-means and K-means++ algorithms.
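    A density-and-distance seed rule of this flavor can be sketched as follows. The specific rule (pick the densest point first, then repeatedly maximize density × separation from the chosen seeds) is an illustrative stand-in, not the authors' exact single-pass algorithm, but it shares the key properties: seeds are actual data points and land in different prospective clusters:

    ```python
    import numpy as np

    def density_distance_seeds(X, k):
        """Pick k initial seeds that are both locally dense and mutually far apart.
        Illustrative density-and-distance rule; seeds are actual data points."""
        D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
        radius = np.percentile(D, 10)                 # local neighborhood scale
        density = (D < radius).sum(axis=1).astype(float)
        seeds = [int(np.argmax(density))]             # densest point first
        for _ in range(k - 1):
            sep = D[:, seeds].min(axis=1)             # distance to nearest chosen seed
            seeds.append(int(np.argmax(density * sep)))  # dense AND far from seeds
        return X[seeds]

    rng = np.random.default_rng(6)
    blobs = [rng.normal(c, 0.3, size=(40, 2)) for c in ([0, 0], [5, 0], [0, 5])]
    X = np.vstack(blobs)
    seeds = density_distance_seeds(X, 3)
    print(seeds.round(1))   # one seed near the core of each blob
    ```

    The density factor keeps outliers from being chosen (they are far from everything but not dense), which is the failure mode of purely distance-based rules such as farthest-point seeding.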

  14. Reachable Distance Space: Efficient Sampling-Based Planning for Spatially Constrained Systems

    KAUST Repository

    Xinyu Tang,

    2010-01-25

    Motion planning for spatially constrained robots is difficult due to additional constraints placed on the robot, such as closure constraints for closed chains or requirements on end-effector placement for articulated linkages. It is usually computationally too expensive to apply sampling-based planners to these problems since it is difficult to generate valid configurations. We overcome this challenge by redefining the robot's degrees of freedom and constraints into a new set of parameters, called reachable distance space (RD-space), in which all configurations lie in the set of constraint-satisfying subspaces. This enables us to directly sample the constrained subspaces with complexity linear in the number of the robot's degrees of freedom. In addition to supporting efficient sampling of configurations, we show that the RD-space formulation naturally supports planning and, in particular, we design a local planner suitable for use by sampling-based planners. We demonstrate the effectiveness and efficiency of our approach for several systems including closed chain planning with multiple loops, restricted end-effector sampling, and on-line planning for drawing/sculpting. We can sample single-loop closed chain systems with 1,000 links in time comparable to open chain sampling, and we can generate samples for 1,000-link multi-loop systems of varying topologies in less than a second. © 2010 The Author(s).

  15. Estimation of cylinder orientation in three-dimensional point cloud using angular distance-based optimization

    Science.gov (United States)

    Su, Yun-Ting; Hu, Shuowen; Bethel, James S.

    2017-05-01

    Light detection and ranging (LIDAR) has become a widely used tool in remote sensing for mapping, surveying, modeling, and a host of other applications. The motivation behind this work is the modeling of piping systems in industrial sites, where cylinders are the most common primitive or shape. We focus on cylinder parameter estimation in three-dimensional point clouds, proposing a mathematical formulation based on angular distance to determine the cylinder orientation. We demonstrate the accuracy and robustness of the technique on synthetically generated cylinder point clouds (where the true axis orientation is known) as well as on real LIDAR data of piping systems. The proposed algorithm is compared with a discrete space Hough transform-based approach as well as a continuous space inlier approach, which iteratively discards outlier points to refine the cylinder parameter estimates. Results show that the proposed method is more computationally efficient than the Hough transform approach and is more accurate than both the Hough transform approach and the inlier method.
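    A common baseline route to the cylinder axis, when per-point surface normals are available, exploits the fact that the normals of an ideal cylinder are all perpendicular to its axis, so the axis is the least-significant eigenvector of the normals' scatter matrix. This eigen-analysis sketch is a standard alternative for comparison, not the paper's angular-distance optimization:

    ```python
    import numpy as np

    def cylinder_axis_from_normals(normals):
        """Estimate the cylinder axis direction from surface normals: ideal cylinder
        normals are perpendicular to the axis, so the axis is the eigenvector of the
        normals' scatter matrix with the smallest eigenvalue."""
        M = normals.T @ normals
        vals, vecs = np.linalg.eigh(M)   # ascending eigenvalues
        return vecs[:, 0]                # eigenvector of the smallest eigenvalue

    # Synthetic noisy cylinder normals with a known axis
    rng = np.random.default_rng(7)
    axis_true = np.array([1.0, 2.0, 2.0]) / 3.0        # unit axis direction
    u = np.cross(axis_true, [0.0, 0.0, 1.0])
    u /= np.linalg.norm(u)
    v = np.cross(axis_true, u)                          # (u, v) span the normal plane
    theta = rng.uniform(0, 2 * np.pi, 2000)
    normals = (np.outer(np.cos(theta), u) + np.outer(np.sin(theta), v)
               + 0.05 * rng.normal(size=(2000, 3)))     # noisy normals

    axis_est = cylinder_axis_from_normals(normals)
    angle = np.degrees(np.arccos(abs(axis_est @ axis_true)))
    print(angle < 1.0)   # axis recovered to within a degree
    ```

    The absolute value handles the sign ambiguity of the axis direction; in practice the normals themselves must first be estimated from local neighborhoods of the point cloud, which adds noise of exactly the kind modeled here.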

  16. Application of affinity propagation algorithm based on manifold distance for transformer PD pattern recognition

    Science.gov (United States)

    Wei, B. G.; Huo, K. X.; Yao, Z. F.; Lou, J.; Li, X. Y.

    2018-03-01

    Recognizing partial discharge (PD) patterns is one of the difficult problems in research on condition-based maintenance of transformers. According to the main physical characteristics of PD, three models of oil-paper insulation defects were set up in the laboratory to study the PD of transformers, and phase resolved partial discharge (PRPD) patterns were constructed. Using the least-squares method, grey-scale images of the PRPD patterns were constructed, and 28 box dimensions and 28 information dimensions were extracted as features of each grey-scale image. An affinity propagation algorithm based on manifold distance (AP-MD) for transformer PD pattern recognition was established, and the box dimension and information dimension data were clustered based on AP-MD. The study shows that the clustering result of AP-MD is better than those of affinity propagation (AP), k-means and fuzzy c-means (FCM). By choosing different k values of the k-nearest neighbor, we find that the clustering accuracy of AP-MD falls when k is too large or too small, and that the optimal k depends on the sample size.

  17. Symmetrical compression distance for arrhythmia discrimination in cloud-based big-data services.

    Science.gov (United States)

    Lillo-Castellano, J M; Mora-Jiménez, I; Santiago-Mozos, R; Chavarría-Asso, F; Cano-González, A; García-Alberola, A; Rojo-Álvarez, J L

    2015-07-01

    The current development of cloud computing is completely changing the paradigm of knowledge extraction from huge databases. An example of this technology in the cardiac arrhythmia field is the SCOOP platform, a national-level scientific cloud-based big-data service for implantable cardioverter defibrillators. In this scenario, we propose a new methodology for automatic classification of intracardiac electrograms (EGMs) in a cloud computing system, designed for minimal signal preprocessing. A new compression-based similarity measure (CSM), the so-called weighted fast compression distance, is created for low computational burden and provides better performance than other CSMs in the literature. Using simple machine learning techniques, a set of 6848 EGMs extracted from the SCOOP platform was classified into seven cardiac arrhythmia classes and one noise class, reaching nearly 90% accuracy when prior patient arrhythmia information was available and 63% otherwise, in all cases exceeding the classification provided by the majority class. Results show that this methodology can be deployed as a high-quality cloud computing service, providing support to physicians for improving knowledge of patient diagnosis.
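    The weighted fast compression distance itself is not spelled out in the abstract, but the classic compression-based similarity measure it builds on, the normalized compression distance (NCD), can be sketched with a general-purpose compressor (a hedged illustration, not the paper's exact measure):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance: near 0 for similar inputs, near 1 for unrelated ones."""
    cx, cy, cxy = (len(zlib.compress(s)) for s in (x, y, x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Similar signals compress well together, so their NCD is smaller.
a = b"0101" * 200                      # hypothetical quasi-periodic signal
b_sig = b"0101" * 199 + b"0110"        # almost identical signal
c = bytes(range(256)) * 3              # structurally unrelated byte stream
assert ncd(a, b_sig) < ncd(a, c)
```

The appeal for a cloud service is exactly what the abstract emphasizes: the measure needs no feature engineering or signal-specific preprocessing, only a compressor.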

  18. Hyperplane distance neighbor clustering based on local discriminant analysis for complex chemical processes monitoring

    International Nuclear Information System (INIS)

    Lu, Chunhong; Xiao, Shaoqing; Gu, Xiaofeng

    2014-01-01

    For complex chemical processes, the collected training data often include both normal and faulty samples. However, some monitoring methods, such as partial least squares (PLS), principal component analysis (PCA), independent component analysis (ICA) and Fisher discriminant analysis (FDA), require fault-free data to build the normal-operation model, so they become applicable only after a preliminary data-clustering step. We propose a novel hyperplane distance neighbor clustering (HDNC) method based on local discriminant analysis (LDA) for chemical process monitoring. First, faulty samples are separated from normal ones using the HDNC method. Then, the optimal subspace for fault detection and classification is obtained using the LDA approach. The proposed method takes the multimodality of the faulty data into account and thus significantly improves process monitoring. The HDNC-LDA monitoring approach is applied to two simulated processes and compared with conventional FDA based on the K-nearest neighbor rule (KNN-FDA). The results obtained in two different scenarios demonstrate the superiority of the HDNC-LDA approach in terms of fault detection and classification accuracy.

  19. Active Metric Learning from Relative Comparisons

    OpenAIRE

    Xiong, Sicheng; Rosales, Rómer; Pei, Yuanli; Fern, Xiaoli Z.

    2014-01-01

    This work focuses on active learning of distance metrics from relative comparison information. A relative comparison specifies, for a data point triplet $(x_i,x_j,x_k)$, that instance $x_i$ is more similar to $x_j$ than to $x_k$. Such constraints, when available, have been shown to be useful toward defining appropriate distance metrics. In real-world applications, acquiring constraints often requires considerable human effort. This motivates us to study how to select and query the most useful ...

  20. Efficient sequential and parallel algorithms for finding edit distance based motifs.

    Science.gov (United States)

    Pal, Soumitra; Xiao, Peng; Rajasekaran, Sanguthevar

    2016-08-18

    Motif search is an important step in extracting meaningful patterns from biological data. The general problem of motif search is intractable, and there is a pressing need to develop efficient exact and approximation algorithms to solve it. In this paper, we present several novel exact sequential and parallel algorithms for solving the (l,d) Edit-distance-based Motif Search (EMS) problem: given two integers l,d and n biological strings, find all strings of length l that appear in each input string with at most d errors of the types substitution, insertion and deletion. One popular technique is to explore, for each input string, the set of all possible l-mers that belong to the d-neighborhood of any substring of that string, and to output those common to all input strings. We introduce a novel and provably efficient neighborhood exploration technique, showing that it is enough to consider the candidates in the neighborhood that are at distance exactly d. We compactly represent these candidate motifs using wildcard characters and efficiently explore them with very few repetitions. Our sequential algorithm uses a trie-based data structure to efficiently store and sort the candidate motifs. Our parallel algorithm, in a multi-core shared memory setting, uses arrays for storage and a novel modification of radix sort for sorting the candidate motifs. Algorithms for EMS are customarily evaluated on challenging instances such as (8,1), (12,2), (16,3), (20,4), and so on. The best previously known algorithm, EMS1, is sequential and solves instances up to (16,3) in an estimated 3 days. Our sequential algorithms are more than 20 times faster on (16,3), and much faster on other hard instances such as (9,2), (11,3) and (13,4). Our parallel algorithm achieves more than 600% scaling performance while using 16 threads.
Our algorithms have advanced the state of the art of EMS solvers, and we believe that the techniques introduced in
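    The core subroutine of any EMS solver is testing whether a candidate l-mer occurs in a string within edit distance d. A minimal sketch of that check using the standard approximate-matching dynamic program (not the paper's compact wildcard representation):

```python
def min_edit_occurrence(pattern: str, text: str) -> int:
    """Minimum edit distance between `pattern` and any substring of `text`
    (substitutions, insertions, deletions), via the classic approximate-matching DP."""
    prev = [0] * (len(text) + 1)  # row 0 is all zeros: a match may start anywhere
    for i in range(1, len(pattern) + 1):
        curr = [i] + [0] * len(text)
        for j in range(1, len(text) + 1):
            cost = 0 if pattern[i - 1] == text[j - 1] else 1
            curr[j] = min(prev[j - 1] + cost,  # match / substitution
                          prev[j] + 1,         # skip a pattern character
                          curr[j - 1] + 1)     # skip a text character
        prev = curr
    return min(prev)  # best match may end at any position

def is_ems_motif(candidate: str, strings: list[str], d: int) -> bool:
    """True if `candidate` occurs in every input string with at most d errors."""
    return all(min_edit_occurrence(candidate, s) <= d for s in strings)

assert is_ems_motif("ACGT", ["TTACGTTT", "AGGT", "ACGTA"], 1)
```

The paper's contribution is avoiding this naive enumeration: only candidates at distance exactly d need to be generated, compactly represented with wildcards.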

  1. Mapping Rice Cropping Systems in Vietnam Using an NDVI-Based Time-Series Similarity Measurement Based on DTW Distance

    Directory of Open Access Journals (Sweden)

    Xudong Guan

    2016-01-01

    Full Text Available Normalized Difference Vegetation Index (NDVI) data derived from Moderate Resolution Imaging Spectroradiometer (MODIS) time series have been widely used for crop and rice classification. The cloudy and rainy weather of the monsoon season greatly reduces the likelihood of obtaining high-quality optical remote sensing images. In addition, the diverse crop-planting systems in Vietnam hinder the comparison of NDVI among different crop stages. To address these problems, we apply a Dynamic Time Warping (DTW) distance-based similarity measure and use the entire yearly NDVI time series to reduce the inaccuracy of classification from a single image. We first de-noise the NDVI time series using Savitzky-Golay (S-G) filtering in the TIMESAT software. A standard NDVI time series for rice growth is then established from field survey data and Google Earth sample data. The NDVI time series of each pixel is constructed, its DTW distance to the standard rice-growth series is calculated, and thresholds are applied to extract rice-growing areas. A qualitative assessment using statistical data and a spatial assessment using sampled data from the rice-cropping map reveal high mapping accuracy at the national scale, with R2 against the statistical data as high as 0.809; however, mapping accuracy decreased at the provincial scale owing to the smaller rice-planted area per province. An analysis of the results indicates that the 500-m resolution MODIS data are limited for mapping scattered rice parcels. The results demonstrate that the DTW-based similarity measure of the NDVI time series can effectively map large-area rice cropping systems with diverse cultivation processes.
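    The DTW distance underlying the similarity measure can be sketched with the textbook dynamic program; the NDVI values below are hypothetical, standing in for a standard rice-growth curve and a pixel whose season is delayed:

```python
def dtw_distance(a, b):
    """Dynamic Time Warping distance between two sequences (e.g. NDVI time series),
    tolerating non-linear shifts in timing between crop stages."""
    inf = float("inf")
    n, m = len(a), len(b)
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # stretch b
                                 D[i][j - 1],      # stretch a
                                 D[i - 1][j - 1])  # step both
    return D[n][m]

reference = [0.2, 0.4, 0.8, 0.6, 0.3]       # hypothetical standard rice NDVI curve
shifted   = [0.2, 0.2, 0.4, 0.8, 0.6, 0.3]  # same curve, delayed by one step
assert dtw_distance(reference, shifted) < dtw_distance(reference, [0.5] * 6)
```

Because the warping path absorbs the one-step delay, the shifted pixel still matches the reference closely, which is exactly why DTW suits regions with diverse planting calendars.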

  2. Experimental Study on Damage Detection in Timber Specimens Based on an Electromechanical Impedance Technique and RMSD-Based Mahalanobis Distance

    Directory of Open Access Journals (Sweden)

    Dansheng Wang

    2016-10-01

    Full Text Available In the electromechanical impedance (EMI) method, the PZT patch performs the functions of both sensor and exciter. Owing to its high-frequency actuation and non-model-based character, the EMI method can detect incipient structural damage. In recent years, EMI techniques have been widely applied to monitor the health status of concrete and steel structures; however, studies on timber are limited. This paper explores the feasibility of using the EMI technique for damage detection in timber specimens. The conventional damage index, the root mean square deviation (RMSD), is employed to evaluate the level of damage. On that basis, a new damage index, the Mahalanobis distance based on RMSD, is proposed to evaluate the damage severity of timber specimens. Experimental studies are implemented to detect notch and hole damage in the timber specimens. The experimental results verify the effectiveness and robustness of the proposed damage index and its superiority over the RMSD index.
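    A minimal sketch of the two damage indexes (with illustrative synthetic signatures, not the paper's measurements): the conventional RMSD between baseline and current impedance signatures, and a Mahalanobis distance computed over a feature vector of RMSD values:

```python
import numpy as np

def rmsd(baseline, current):
    """Root mean square deviation between baseline and current EMI signatures."""
    return float(np.sqrt(np.sum((current - baseline) ** 2) / np.sum(baseline ** 2)))

def mahalanobis(x, mean, cov):
    """Mahalanobis distance of a feature vector x from the healthy-state distribution."""
    d = x - mean
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

rng = np.random.default_rng(0)
healthy = rng.normal(1.0, 0.01, 500)                       # hypothetical baseline signature
damaged = healthy + 0.2 * np.sin(np.linspace(0, 10, 500))  # signature shifted by damage
assert rmsd(healthy, damaged) > rmsd(healthy, healthy + rng.normal(0, 0.005, 500))
```

Wrapping RMSD values in a Mahalanobis distance weights each frequency band by its normal variability, which is what makes the combined index more robust than a raw RMSD threshold.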

  3. Computer-Based Learning in Open and Distance Learning Institutions in Nigeria: Cautions on Use of Internet for Counseling

    Science.gov (United States)

    Okopi, Fidel Onjefu; Odeyemi, Olajumoke Janet; Adesina, Adewale

    2015-01-01

    The study has identified areas of strength and weakness in the current use of Computer-Based Learning (CBL) tools in Open and Distance Learning (ODL) institutions in Nigeria. To achieve these objectives, the following research questions were proposed: (i) What are the computer-based learning tools (software and hardware) that are actually in…

  4. A novel genome signature based on inter-nucleotide distances profiles for visualization of metagenomic data

    Science.gov (United States)

    Xie, Xian-Hua; Yu, Zu-Guo; Ma, Yuan-Lin; Han, Guo-Sheng; Anh, Vo

    2017-09-01

    There has been growing interest in the visualization of metagenomic data. The present study focuses on visualizing metagenomic data using inter-nucleotide distances profiles. We first convert the fragment sequences into inter-nucleotide distances profiles and then analyze these profiles by principal component analysis. Finally, the principal components are used to obtain a 2-D scatter plot in which fragments are grouped by their source species. We name our method the inter-nucleotide distances profiles (INP) method and evaluate it on three benchmark data sets used in previously published papers. Our results demonstrate that the INP method is an effective and efficient alternative for visualization of metagenomic data.
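    One plausible reading of the profile construction can be sketched as follows: for each nucleotide, histogram the gaps between its consecutive occurrences, then concatenate the per-nucleotide histograms into a feature vector (the exact feature definition is the paper's; this is an illustrative assumption):

```python
import numpy as np

def inp_profile(seq: str, max_dist: int = 10) -> np.ndarray:
    """Sketch of an inter-nucleotide distances profile: for each base, the
    normalised histogram of gaps between its consecutive occurrences."""
    profile = []
    for base in "ACGT":
        pos = [i for i, ch in enumerate(seq) if ch == base]
        gaps = np.diff(pos) if len(pos) > 1 else np.array([], dtype=int)
        # clip long gaps into the last bin, then drop the unused zero bin
        hist = np.bincount(np.clip(gaps, 1, max_dist), minlength=max_dist + 1)[1:]
        profile.append(hist / max(len(gaps), 1))
    return np.concatenate(profile)

v = inp_profile("ACGTACGTACGT")
assert v.shape == (40,)
```

Profiles of many fragments stacked into a matrix would then be fed to PCA, with the first two components giving the 2-D scatter plot described in the abstract.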

  5. Fast Algorithms for Earth Mover Distance Based on Optimal Transport and L1 Regularization II

    Science.gov (United States)

    2016-09-01

    The method alternates a gradient ascent in the dual variable Φ with a gradient descent in the primal variable m; in the updates, simple exact formulae are used.

  6. Study of variations of radiofrequency power density from mobile phone base stations with distance

    International Nuclear Information System (INIS)

    Ayinmode, B. O.; Farai, I. P.

    2013-01-01

    The variations of radiofrequency (RF) radiation power density with distance around mobile phone base transceiver stations (BTSs) were studied at ten randomly selected locations in Ibadan, western Nigeria. Measurements were made with a calibrated hand-held spectrum analyser. The maximum Global System for Mobile communications (GSM) 1800 signal power density was 323.91 μW m⁻² at a 250 m radius from one BTS, and that of GSM 900 was 1119.00 μW m⁻² at a 200 m radius from another. The estimated total maximum power density was 2972.00 μW m⁻² at a 50 m radius from a third BTS. This study shows that the maximum carrier-signal power density and the total maximum power density from a BTS may be observed, on average, at radii of 200 m and 50 m, respectively. The results demonstrate that exposure of people to RF radiation from phone BTSs in Ibadan is far below the limits recommended by international scientific bodies. (authors)

  7. Identifying the true oysters (Bivalvia: Ostreidae) with mitochondrial phylogeny and distance-based DNA barcoding.

    Science.gov (United States)

    Liu, Jun; Li, Qi; Kong, Lingfeng; Yu, Hong; Zheng, Xiaodong

    2011-09-01

    Oysters (family Ostreidae), with high levels of phenotypic plasticity and wide geographic distribution, are a challenging group for taxonomists and phylogeneticists. As a useful tool for molecular species identification, DNA barcoding offers significant potential for oyster identification and taxonomy. This study used two mitochondrial fragments, cytochrome c oxidase I (COI) and the large ribosomal subunit (16S rDNA), to assess whether oyster species could be identified by phylogeny and distance-based DNA barcoding techniques. Relationships among species were estimated by phylogenetic analyses of both genes, and pairwise inter- and intraspecific genetic divergences were then assessed. Species forming well-differentiated clades in the molecular phylogenies were recovered identically by both genes, even when closely related species were included. Intraspecific variability of 16S rDNA overlapped with interspecific divergence. However, average intra- and interspecific genetic divergences for COI were 0-1.4% (maximum 2.2%) and 2.6-32.2% (minimum 2.2%), respectively, indicating the existence of a barcoding gap. These results confirm the efficacy of species identification in oysters via DNA barcodes and phylogenetic analysis. © 2011 Blackwell Publishing Ltd.

  8. A Hybrid Distance-Based Ideal-Seeking Consensus Ranking Model

    Directory of Open Access Journals (Sweden)

    Madjid Tavana

    2007-01-01

    Full Text Available Ordinal consensus ranking problems have received much attention in the management science literature. A problem arises in situations where a group of k decision makers (DMs) is asked to rank order n alternatives. The question is how to combine the DM rankings into one consensus ranking. Several different approaches have been suggested to aggregate DM responses into a compromise or consensus ranking; however, the similarity of consensus rankings generated by the different algorithms is largely unknown. In this paper, we propose a new hybrid distance-based ideal-seeking consensus ranking model (DCM). The proposed hybrid model combines parts of the two commonly used consensus ranking techniques of Beck and Lin (1983) and Cook and Kress (1985) into an intuitive and computationally simple model. We illustrate our method and then run a Monte Carlo simulation across a range of k and n to compare the similarity of the consensus rankings generated by our method with the best-known method of Borda and Kendall (Kendall 1962) and the two methods proposed by Beck and Lin (1983) and Cook and Kress (1985). DCM and Beck and Lin's method yielded the most similar consensus rankings, whereas the Cook-Kress method and the Borda-Kendall method yielded the least similar consensus rankings.
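    For reference, the Borda-Kendall baseline used in the comparison reduces to summing each alternative's ranks across decision makers and ordering by the totals (a minimal sketch with hypothetical rankings):

```python
def borda_kendall(rankings):
    """Borda-Kendall consensus: order alternatives by their total rank across
    decision makers (rank 1 = best). `rankings` is a list of {alternative: rank} dicts."""
    totals = {}
    for r in rankings:
        for alt, rank in r.items():
            totals[alt] = totals.get(alt, 0) + rank
    return sorted(totals, key=lambda alt: totals[alt])

# Three hypothetical decision makers ranking alternatives A, B, C:
dms = [{"A": 1, "B": 2, "C": 3},
       {"A": 2, "B": 1, "C": 3},
       {"A": 1, "B": 3, "C": 2}]
assert borda_kendall(dms) == ["A", "B", "C"]
```

Distance-based models such as DCM instead search for the ranking minimizing a distance to all DM rankings, which is why their outputs can differ from this simple rank-sum baseline.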

  9. How do e-book readers enhance learning opportunities for distance work-based learners?

    Directory of Open Access Journals (Sweden)

    Gabi Witthaus

    2011-12-01

    Full Text Available We report on the incorporation of e-book readers into the delivery of two distance-taught master's programmes, in Occupational Psychology (OP) and in Education, at the University of Leicester, UK. The programmes attract work-based practitioners in OP and in Teaching English to Speakers of Other Languages, respectively. Challenges in curriculum delivery included the need for more flexibility in the curricula, better access to essential readings and maximising the benefit of learners' limited study time. As part of a suite of pilot changes to curriculum design and delivery, 28 Sony PRS-505™ e-book readers were pre-loaded with course materials and sent out to students. The evidence suggests that the students' learning experiences improved as a result of four key benefits afforded by the e-book readers: enhanced flexibility in curriculum delivery to accommodate the mobile lifestyle of our learners, improved efficiency in the use of study time (especially short breaks during the working day), new strategies for reading course materials, and reduced cost. We discuss the opportunities and limitations associated with the e-book readers used and the challenges encountered in the study.

  10. Distance-Based Behaviors for Low-Complexity Control in Multiagent Robotics

    Science.gov (United States)

    Pierpaoli, Pietro

    Several biological examples show that living organisms cooperate to collectively accomplish tasks impossible for single individuals; more importantly, this coordination is often achieved with a very limited set of information. Inspired by these observations, research on autonomous systems has focused on distributed techniques for the control and guidance of groups of autonomous mobile agents, or robots. From an engineering perspective, when coordination and cooperation are sought in large ensembles of robotic vehicles, reducing hardware and algorithmic complexity becomes mandatory from the earliest stages of design; solutions that lower power consumption and cost while increasing reliability are thus worth investigating. In this work, we studied low-complexity techniques for achieving cohesion and control in swarms of autonomous robots. Starting from an inspiring two-agent example, we introduced the effect of neighbors' relative positions on the control of an autonomous agent. Extending this intuition to large ensembles of autonomous vehicles, we applied it in the form of a herding-like technique built on a low-complexity distance-based aggregation protocol. We first showed that our protocol produces cohesive aggregation among the agents while avoiding inter-agent collisions. Then, a feedback leader-follower architecture was introduced for control of the swarm. We also described how proximity measures and the probability of collisions with neighbors can be used as sources of information in highly populated environments.

  11. Reachable Distance Space: Efficient Sampling-Based Planning for Spatially Constrained Systems

    KAUST Repository

    Xinyu Tang,; Thomas, S.; Coleman, P.; Amato, N. M.

    2010-01-01

    reachable distance space (RD-space), in which all configurations lie in the set of constraint-satisfying subspaces. This enables us to directly sample the constrained subspaces with complexity linear in the number of the robot's degrees of freedom.

  12. Establishing the existence of a distance-based upper bound for a fuzzy DEA model using duality

    International Nuclear Information System (INIS)

    Soleimani-damaneh, M.

    2009-01-01

    In a recent paper [Soleimani-damaneh M. Fuzzy upper bounds and their applications. Chaos, Solitons and Fractals 2008;36:217-25.], I established the existence of a distance-based fuzzy upper bound for the objective function of a fuzzy DEA model, using the properties of a discussed signed distance, and provided an effective approach to solve that model. In this paper a new dual-based proof for the existence of the above-mentioned upper bound is provided which gives a useful insight into the theory of fuzzy DEA.

  13. A New Model of Stopping Sight Distance of Curve Braking Based on Vehicle Dynamics

    Directory of Open Access Journals (Sweden)

    Rong-xia Xia

    2016-01-01

    Full Text Available Compared with straight-line braking, braking on a curve involves a longer braking distance and poorer stability, so drivers are more prone to making mistakes. The braking process and the dynamics of vehicles in emergency situations on curves were analyzed, and a two-axle, four-wheel vehicle was simplified to a single model. Considering the braking process, dynamics, force distribution, and stability, a calculation model for the stopping sight distance under curve braking was built. A driver-vehicle-road simulation platform was then built using multibody dynamics software, and a brake-in-turn vehicle test was realized on this platform. The comparison of experimental and calculated values verified the reliability of the computational model. Finally, the experimental and calculated values were compared with the stopping sight distance recommended by the Highway Route Design Specification (JTG D20-2006); the current specification of stopping sight distance does not cover the sight distance requirements of curve braking. In this paper, the general values and limits of the curve stopping sight distance are presented.
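    For context, the straight-line stopping sight distance that the curve-braking model extends is the textbook sum of reaction and braking distances (the parameter values below are illustrative defaults, not those of the specification):

```python
def stopping_sight_distance(v_kmh, t_reaction=2.5, f=0.35, grade=0.0):
    """Textbook straight-line stopping sight distance: reaction plus braking distance.
    v_kmh: design speed in km/h; f: usable longitudinal friction; grade: +uphill."""
    v = v_kmh / 3.6                              # speed in m/s
    reaction = v * t_reaction                    # distance covered before braking starts
    braking = v ** 2 / (2 * 9.81 * (f + grade))  # kinetic energy dissipated by friction
    return reaction + braking

# On a curve, part of the available friction is consumed laterally, so the usable
# longitudinal f drops and the required sight distance grows:
assert stopping_sight_distance(80, f=0.25) > stopping_sight_distance(80, f=0.35)
```

This friction split between lateral and longitudinal demand is the physical reason the paper finds straight-line sight distance values insufficient for curve braking.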

  14. Performance evaluation of a distance learning program.

    Science.gov (United States)

    Dailey, D J; Eno, K R; Brinkley, J F

    1994-01-01

    This paper presents a performance metric which uses a single number to characterize the response time for a non-deterministic client-server application operating over the Internet. When applied to a Macintosh-based distance learning application called the Digital Anatomist Browser, the metric allowed us to observe that "A typical student doing a typical mix of Browser commands on a typical data set will experience the same delay if they use a slow Macintosh on a local network or a fast Macintosh on the other side of the country accessing the data over the Internet." The methodology presented is applicable to other client-server applications that are rapidly appearing on the Internet.

  15. Impact localization in dispersive waveguides based on energy-attenuation of waves with the traveled distance

    Science.gov (United States)

    Alajlouni, Sa'ed; Albakri, Mohammad; Tarazaga, Pablo

    2018-05-01

    An algorithm is introduced to solve the general multilateration (source localization) problem in a dispersive waveguide. The algorithm is designed to localize impact forces on a dispersive floor and can potentially be used to localize and track occupants in a building using vibration sensors attached to the underside of the walking surface. The lower the wave frequencies generated by the impact force, the more accurate the localization is expected to be. An impact force acting on a floor generates a seismic wave that is distorted as it travels away from the source. This distortion is noticeable even over relatively short distances and is caused mainly by dispersion; as a consequence, conventional localization/multilateration methods produce error values that are highly variable and occasionally large. The proposed approach is based on the fact that the wave's energy, calculated over some time window, decays exponentially as the wave travels away from the source. Although localization methods that assume exponential decay exist in the wireless communications literature, they have only been considered for wave propagation in non-dispersive media and carry the limiting assumption that the source must not coincide with a sensor location; as a result, they cannot be applied to the indoor localization problem in their current form. We show how our proposed method differs from these methods and how it overcomes the source-sensor coincidence limitation. Theoretical analysis and experimental data are used to motivate and justify the proposed approach for localization in a dispersive medium. Additionally, hammer impacts on an instrumented floor section inside an operational building, as well as finite element model simulations, are used to evaluate the performance of
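    The energy-decay idea can be sketched as follows: if window energy falls off as E = E0·exp(−αd), log-ratios between sensor pairs cancel the unknown source energy E0, and the source can be found by minimizing the pairwise residual over candidate locations. This toy grid search uses synthetic, noise-free energies and a known decay constant; the paper's estimator is more elaborate:

```python
import math

def localize(sensors, energies, alpha, grid):
    """Grid-search localization assuming window energy decays as E = E0*exp(-alpha*d).
    Log-ratios between sensor pairs eliminate the unknown source energy E0."""
    def residual(p):
        d = [math.dist(p, s) for s in sensors]
        r = 0.0
        for i in range(len(sensors)):
            for j in range(i + 1, len(sensors)):
                r += (math.log(energies[i] / energies[j]) - alpha * (d[j] - d[i])) ** 2
        return r
    return min(grid, key=residual)

sensors = [(0, 0), (4, 0), (0, 4), (4, 4)]           # hypothetical floor sensor layout
true_src, alpha = (1.0, 2.0), 0.8
energies = [math.exp(-alpha * math.dist(true_src, s)) for s in sensors]
grid = [(x / 2, y / 2) for x in range(9) for y in range(9)]
assert localize(sensors, energies, alpha, grid) == true_src
```

Note that, unlike time-of-arrival multilateration, nothing breaks if the source coincides with a sensor location, mirroring the limitation the paper sets out to remove.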

  16. Genetic distances and phylogenetic trees of different Awassi sheep populations based on DNA sequencing.

    Science.gov (United States)

    Al-Atiyat, R M; Aljumaah, R S

    2014-08-27

    This study aimed to estimate evolutionary distances and to reconstruct phylogenetic trees for different Awassi sheep populations. Thirty-two sheep from three different geographical areas of Jordan and from the Kingdom of Saudi Arabia (KSA) were randomly sampled. DNA was extracted from the tissue samples and sequenced using the T7 promoter universal primer. Phylogenetic trees were reconstructed from 0.64-kb DNA sequences using the MEGA software with the best-fitting general time reversible distance model. Three methods of distance estimation were then used, with the maximum composite likelihood model applied for reconstructing maximum likelihood, neighbor-joining and UPGMA trees. The maximum likelihood tree indicated three major clusters separated by cytosine (C) and thymine (T). The greatest distance was found between the South and North sheep populations. On the other hand, the KSA sheep, used as an outgroup, showed a shorter evolutionary distance to the North sheep population than to the others. The neighbor-joining and UPGMA trees showed reliable clusters differentiating the Jordanian sheep populations from the Saudi population. The overall results are consistent with the geographical information and ecological types of the sheep populations studied. In summary, the resulting phylogenetic trees may add to the limited information about the genetic relatedness and phylogeny of Awassi sheep in the neighbouring Arab countries.

  17. Fault Diagnosis of Supervision and Homogenization Distance Based on Local Linear Embedding Algorithm

    Directory of Open Access Journals (Sweden)

    Guangbin Wang

    2015-01-01

    Full Text Available In view of the uneven distribution of real fault samples and the sensitivity of the locally linear embedding (LLE) algorithm's dimension reduction to the choice of neighboring points, an improved locally linear embedding algorithm based on homogenization distance (HLLE) is developed. The method makes the overall distribution of sample points tend toward homogenization and reduces the influence of neighboring points by using the homogenization distance instead of the traditional Euclidean distance, which helps to choose effective neighboring points for constructing the weight matrix used in dimension reduction. Because the fault-recognition improvement of HLLE is limited and unstable, the paper further proposes a locally linear embedding algorithm based on supervision and homogenization distance (SHLLE) by adding a supervised learning mechanism. On the basis of the homogenization distance, supervised learning adds category information so that sample points of the same category are gathered and those of different categories are scattered, which effectively improves fault-diagnosis performance while maintaining stability. The methods were compared in a simulation experiment on rotor-system fault diagnosis, and the results show that the SHLLE algorithm has superior fault-recognition performance.

  18. Selection of metrics based on the seagrass Cymodocea nodosa and development of a biotic index (CYMOX) for assessing ecological status of coastal and transitional waters

    Science.gov (United States)

    Oliva, Silvia; Mascaró, Oriol; Llagostera, Izaskun; Pérez, Marta; Romero, Javier

    2012-12-01

    Bioindicators, based on a large variety of organisms, have been increasingly used in the assessment of the status of aquatic systems. In marine coastal waters, seagrasses have shown great potential as bioindicator organisms, probably owing to both their environmental sensitivity and the large amount of knowledge available. However, as far as we are aware, only little attention has been paid to euryhaline species suitable for biomonitoring both transitional and marine waters. With the aim of contributing to this expanding field and providing new and useful tools for managers, we develop here a multi-bioindicator index based on the seagrass Cymodocea nodosa. We first compiled from the literature a suite of 54 candidate metrics, i.e., measurable attributes of the organism or community concerned that adequately reflect properties of the environment, obtained from C. nodosa and its associated ecosystem and putatively responding to environmental deterioration. We then evaluated them empirically, obtaining a complete dataset on these metrics along a gradient of anthropogenic disturbance. Using this dataset, we selected the metrics to construct the index, applying successively: (i) ANOVA, to assess their capacity to discriminate among sites of different environmental conditions; (ii) PCA, to check the existence of a common pattern among the metrics reflecting the environmental gradient; and (iii) feasibility and cost-effectiveness criteria. Finally, 10 metrics (out of the 54 tested), encompassing the physiological (δ15N, δ34S, % N, % P content of rhizomes), individual (shoot size), population (root weight ratio) and community (epiphyte load) organisation levels, plus some metallic pollution descriptors (Cd, Cu and Zn content of rhizomes), were retained and integrated into a single index (CYMOX) using the scores of the sites on the first axis of a PCA. These scores were reduced to a 0-1 (Ecological Quality Ratio) scale by referring the values to the

  19. GNSS Precise Kinematic Positioning for Multiple Kinematic Stations Based on A Priori Distance Constraints

    Science.gov (United States)

    He, Kaifei; Xu, Tianhe; Förste, Christoph; Petrovic, Svetozar; Barthelmes, Franz; Jiang, Nan; Flechtner, Frank

    2016-01-01

    When applying the Global Navigation Satellite System (GNSS) for precise kinematic positioning in airborne and shipborne gravimetry, multiple GNSS receivers are often rigidly mounted on the kinematic platform carrying the gravimetry instrumentation. The distances among these GNSS antennas are therefore known and invariant, and this information can be used to improve the accuracy and reliability of the state estimates. For this purpose, the known distances between the antennas are applied as a priori constraints within the state parameter adjustment, introduced in such a way that their accuracy is taken into account. To test this approach, GNSS data from a Baltic Sea shipborne gravimetry campaign were used. The results of our study show that applying distance constraints improves the accuracy of GNSS kinematic positioning, for example, by about 4 mm for the radial component. PMID:27043580

  20. Investigation of shear distance in Michelson interferometer-based shearography for mechanical characterization

    International Nuclear Information System (INIS)

    Lee, Jung-Ryul; Yoon, Dong-Jin; Kim, Jung-Seok; Vautrin, Alain

    2008-01-01

    Shearography is a growing industrial field in both quantitative mechanical characterization and relatively qualitative non-destructive testing. In shearography, the shear distance is the most important parameter controlling measurement performance. In this paper, the role of the shear distance is systematically investigated, focusing on full-field mechanical characterization. A modified Michelson interferometer is considered as the shearing device, as it is most commonly adopted for mechanical characterization applications because it enables easy and precise shearing and phase shifting. The paper also includes theoretical and experimental investigations of the relationship between shear distance and performance issues such as the immeasurable zone in targets with discontinuities, signal-to-noise ratio, sensitivity and shear distortion. In addition, the study is verified with actual shearographic results and a phase-shifting grid method capable of full-field displacement evaluation in the submicrometer regime.

  1. Genetic distance of Malaysian mousedeer based on mitochondrial DNA cytochrome oxidase I (COI) and D-loop region sequences

    Science.gov (United States)

    Bakar, Mohamad-Azam Akmal Abu; Rovie-Ryan, Jeffrine Japning; Ampeng, Ahmad; Yaakop, Salmah; Nor, Shukor Md; Md-Zain, Badrul Munir

    2018-04-01

    Mousedeer are among the most primitive mammals and are found mainly in the Southeast Asian region. There are two species of mousedeer in Malaysia, Tragulus kanchil and Tragulus napu. The two species can be distinguished by size, coat coloration, and throat pattern, but a clear diagnosis is still lacking. The objective of this study is to characterize the genetic distance between T. kanchil and T. napu and among their populations based on mitochondrial DNA (mtDNA) cytochrome oxidase I (COI) and D-loop regions. A total of 42 mousedeer samples, collected by PERHILITAN from different localities, were used in this study. Another 29 D-loop sequences were retrieved from GenBank for comparative analysis. All samples were amplified by PCR using universal and species-specific primers for the COI and D-loop genes. The amplified sequences were analyzed to determine the genetic distance between T. kanchil and T. napu. From the analysis, the average genetic distances between T. kanchil and T. napu based on the COI and D-loop loci were 0.145 and 0.128, respectively. The genetic distance among populations of T. kanchil based on the COI locus ranged from 0.003 to 0.013. For the D-loop locus, the genetic distance analysis separated the west-coast populations from the east-coast populations of T. kanchil. The COI and D-loop mtDNA regions provided a clear picture of the relationships within the mousedeer species. Finally, conservation efforts to protect these species can be supported by studying their molecular genetics, helping to prevent their extinction.
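
As a toy illustration of the distance values reported above (the study itself presumably used model-corrected distances from standard phylogenetics software), the uncorrected p-distance between two aligned sequences is simply the proportion of differing sites:

```python
def p_distance(seq1, seq2):
    """Proportion of differing sites between two aligned sequences."""
    assert len(seq1) == len(seq2), "sequences must be aligned"
    diffs = sum(a != b for a, b in zip(seq1, seq2))
    return diffs / len(seq1)

# Hypothetical 10-bp aligned fragments differing at two sites
a = "ATGCGATCCA"
b = "ATGCGGTCTA"
print(p_distance(a, b))  # -> 0.2
```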

  2. A Voice-Based E-Examination Framework for Visually Impaired Students in Open and Distance Learning

    Science.gov (United States)

    Azeta, Ambrose A.; Inam, Itorobong A.; Daramola, Olawande

    2018-01-01

    Voice-based systems allow users access to information on the internet over a voice interface. Prior studies on Open and Distance Learning (ODL) e-examination systems that make use of voice interface do not sufficiently exhibit intelligent form of assessment, which diminishes the rigor of examination. The objective of this paper is to improve on…

  3. Proximity and Distance in Knowledge Relationships : From Micro to Structural Considerations based on Territorial Knowledge Dynamics (TKDs)

    NARCIS (Netherlands)

    Crespo, Joan; Vicente, Jérôme

    2016-01-01

    Crespo J. and Vicente J. Proximity and distance in knowledge relationships: from micro to structural considerations based on territorial knowledge dynamics (TKDs), Regional Studies. Among the key parameters identified in territorial knowledge dynamics (TKDs), this paper focuses on the balance and

  4. Vision-Based Detection and Distance Estimation of Micro Unmanned Aerial Vehicles

    Directory of Open Access Journals (Sweden)

    Fatih Gökçe

    2015-09-01

    Full Text Available Detection and distance estimation of micro unmanned aerial vehicles (mUAVs) is crucial for (i) the detection of intruder mUAVs in protected environments; (ii) sense-and-avoid purposes on mUAVs or on other aerial vehicles; and (iii) multi-mUAV control scenarios, such as environmental monitoring, surveillance and exploration. In this article, we evaluate vision algorithms as alternatives for detection and distance estimation of mUAVs, since other sensing modalities entail certain limitations on the environment or on the distance. For this purpose, we test Haar-like features, histogram of gradients (HOG) and local binary patterns (LBP) using cascades of boosted classifiers. Cascaded boosted classifiers allow fast processing by performing detection tests at multiple stages, where only candidates passing earlier, simpler stages are processed at the subsequent, more complex stages. We also integrate a distance estimation method with our system, utilizing geometric cues with support vector regressors. We evaluated each method on indoor and outdoor videos that were collected in a systematic way, and also on videos having motion blur. Our experiments show that, using boosted cascaded classifiers with LBP, near real-time detection and distance estimation of mUAVs are possible, in about 60 ms indoors (1032 × 778 resolution) and 150 ms outdoors (1280 × 720 resolution) per frame, with an F-score of 0.96. However, the cascaded classifiers using Haar-like features lead to better distance estimation, since they can position the bounding boxes on mUAVs more accurately. On the other hand, our time analysis shows that the cascaded classifiers using HOG train and run faster than the other algorithms.
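
The geometric cue underlying such distance estimation can be sketched with the pinhole-camera relation (illustrative numbers only; the paper itself feeds such cues into support vector regressors rather than using the raw formula):

```python
def estimate_distance(focal_px, real_size_m, bbox_size_px):
    """Pinhole-camera cue: distance ~ focal length * real size / apparent size."""
    return focal_px * real_size_m / bbox_size_px

# Hypothetical numbers: 800 px focal length, 0.5 m rotor span, 40 px box
print(estimate_distance(800, 0.5, 40))  # -> 10.0 (metres)
```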

  5. Metric modular spaces

    CERN Document Server

    Chistyakov, Vyacheslav

    2015-01-01

    Aimed toward researchers and graduate students familiar with elements of functional analysis, linear algebra, and general topology; this book contains a general study of modulars, modular spaces, and metric modular spaces. Modulars may be thought of as generalized velocity fields and serve two important purposes: generate metric spaces in a unified manner and provide a weaker convergence, the modular convergence, whose topology is non-metrizable in general. Metric modular spaces are extensions of metric spaces, metric linear spaces, and classical modular linear spaces. The topics covered include the classification of modulars, metrizability of modular spaces, modular transforms and duality between modular spaces, metric and modular topologies. Applications illustrated in this book include: the description of superposition operators acting in modular spaces, the existence of regular selections of set-valued mappings, new interpretations of spaces of Lipschitzian and absolutely continuous mappings, the existe...

  6. Local-metrics error-based Shepard interpolation as surrogate for highly non-linear material models in high dimensions

    Science.gov (United States)

    Lorenzi, Juan M.; Stecher, Thomas; Reuter, Karsten; Matera, Sebastian

    2017-10-01

    Many problems in computational materials science and chemistry require the evaluation of expensive functions with locally rapid changes, such as the turn-over frequency of first principles kinetic Monte Carlo models for heterogeneous catalysis. Because of the high computational cost, it is often desirable to replace the original with a surrogate model, e.g., for use in coupled multiscale simulations. The construction of surrogates becomes particularly challenging in high dimensions. Here, we present a novel version of the modified Shepard interpolation method which can overcome the curse of dimensionality for such functions to give faithful reconstructions even from very modest numbers of function evaluations. The introduction of local metrics allows us to take advantage of the fact that, on a local scale, rapid variation often occurs only across a small number of directions. Furthermore, we use local error estimates to weigh different local approximations, which helps avoid artificial oscillations. Finally, we test our approach on a number of challenging analytic functions as well as a realistic kinetic Monte Carlo model. Our method outperforms not only existing isotropic metric Shepard methods but also state-of-the-art Gaussian process regression.
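
For orientation, a minimal sketch of the classic (isotropic) Shepard interpolant that the paper's local-metrics, error-weighted variant builds on:

```python
import numpy as np

def shepard(x_query, X, y, p=2, eps=1e-12):
    """Classic Shepard (inverse-distance-weighted) interpolation."""
    d = np.linalg.norm(X - x_query, axis=1)
    if d.min() < eps:               # query coincides with a data point
        return y[d.argmin()]
    w = 1.0 / d**p                  # inverse-distance weights
    return float(w @ y / w.sum())

# Samples of f(x) = x1 + x2 at the corners of the unit square
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([0.0, 1.0, 1.0, 2.0])
print(shepard(np.array([0.5, 0.5]), X, y))  # -> 1.0 (by symmetry)
```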

  7. A comparison of community and trophic structure in five marine ecosystems based on energy budgets and system metrics

    Science.gov (United States)

    Gaichas, Sarah; Skaret, Georg; Falk-Petersen, Jannike; Link, Jason S.; Overholtz, William; Megrey, Bernard A.; Gjøsæter, Harald; Stockhausen, William T.; Dommasnes, Are; Friedland, Kevin D.; Aydin, Kerim

    2009-04-01

    Energy budget models for five marine ecosystems were compared to identify differences and similarities in trophic and community structure. We examined the Gulf of Maine and Georges Bank in the northwest Atlantic Ocean, the combined Norwegian/Barents Seas in the northeast Atlantic Ocean, and the eastern Bering Sea and the Gulf of Alaska in the northeast Pacific Ocean. Comparable energy budgets were constructed for each ecosystem by aggregating information for similar species groups into consistent functional groups. Several ecosystem indices (e.g., functional group production, consumption and biomass ratios, cumulative biomass, food web macrodescriptors, and network metrics) were compared for each ecosystem. The comparative approach clearly identified data gaps for each ecosystem, an important outcome of this work. Commonalities across the ecosystems included overall high primary production and energy flow at low trophic levels, high production and consumption by carnivorous zooplankton, and similar proportions of apex predator to lower trophic level biomass. Major differences included distinct biomass ratios of pelagic to demersal fish, ranging from highest in the combined Norwegian/Barents ecosystem to lowest in the Alaskan systems, and notable differences in primary production per unit area, highest in the Alaskan and Georges Bank/Gulf of Maine ecosystems, and lowest in the Norwegian ecosystems. While comparing a disparate group of organisms across a wide range of marine ecosystems is challenging, this work demonstrates that standardized metrics both elucidate properties common to marine ecosystems and identify key distinctions useful for fisheries management.

  8. Smart-system of distance learning of visually impaired people based on approaches of artificial intelligence

    Science.gov (United States)

    Samigulina, Galina A.; Shayakhmetova, Assem S.

    2016-11-01

    The research objective is the creation of an intellectual, innovative technology and an information Smart-system of distance learning for visually impaired people. The organization of an accessible environment in which visually impaired people can receive a quality education, and their social adaptation in society, are important and topical issues in modern education. The proposed Smart-system of distance learning for visually impaired people can significantly improve the efficiency and quality of education for this category of people. The scientific novelty of the proposed Smart-system lies in its use of intelligent and statistical methods for processing multi-dimensional data, taking into account the psycho-physiological characteristics of how visually impaired people perceive and assimilate learning information.

  9. A spectral method to detect community structure based on distance modularity matrix

    Science.gov (United States)

    Yang, Jin-Xuan; Zhang, Xiao-Dong

    2017-08-01

    Community structures are common in social and biological networks, and identifying them in complex networks has become a prominent research problem. In this paper, an algorithm to detect the community structure of networks is proposed using the spectrum of a distance modularity matrix. The proposed algorithm focuses on the distances between vertices within communities, rather than on the most weakly connected vertex pairs or the number of edges between communities. The experimental results show that our method identifies community structure more effectively for a variety of real-world and computer-generated networks, at only a modest increase in computation time.
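
The standard modularity-spectrum idea that a distance modularity matrix generalizes can be sketched as follows (this uses the ordinary adjacency-based modularity matrix, not the paper's distance variant): the sign pattern of the leading eigenvector of B = A - kk^T/2m gives a two-way split.

```python
import numpy as np

def leading_eigenvector_split(A):
    """Two-way community split from the leading eigenvector of the
    standard modularity matrix B = A - k k^T / 2m."""
    k = A.sum(axis=1)
    two_m = k.sum()
    B = A - np.outer(k, k) / two_m
    vals, vecs = np.linalg.eigh(B)      # eigenvalues in ascending order
    v = vecs[:, -1]                     # eigenvector of largest eigenvalue
    return (v >= 0).astype(int)

# Two triangles joined by a single bridge edge
A = np.zeros((6, 6))
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
for i, j in edges:
    A[i, j] = A[j, i] = 1
labels = leading_eigenvector_split(A)
print(labels)  # nodes 0-2 in one community, nodes 3-5 in the other
```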

  10. Constructing a no-reference H.264/AVC bitstream-based video quality metric using genetic programming-based symbolic regression

    OpenAIRE

    Staelens, Nicolas; Deschrijver, Dirk; Vladislavleva, E; Vermeulen, Brecht; Dhaene, Tom; Demeester, Piet

    2013-01-01

    In order to ensure optimal quality of experience toward end users during video streaming, automatic video quality assessment becomes an important field-of-interest to video service providers. Objective video quality metrics try to estimate perceived quality with high accuracy and in an automated manner. In traditional approaches, these metrics model the complex properties of the human visual system. More recently, however, it has been shown that machine learning approaches can also yield comp...

  11. Fault diagnosis of rolling bearings based on multifractal detrended fluctuation analysis and Mahalanobis distance criterion

    Science.gov (United States)

    Lin, Jinshan; Chen, Qian

    2013-07-01

    Vibration data of faulty rolling bearings are usually nonstationary and nonlinear, and contain fairly weak fault features. As a result, feature extraction of rolling bearing fault data is always an intractable problem and has attracted considerable attention for a long time. This paper introduces multifractal detrended fluctuation analysis (MF-DFA) to analyze bearing vibration data and proposes a novel method for fault diagnosis of rolling bearings based on MF-DFA and Mahalanobis distance criterion (MDC). MF-DFA, an extension of monofractal DFA, is a powerful tool for uncovering the nonlinear dynamical characteristics buried in nonstationary time series and can capture minor changes of complex system conditions. To begin with, by MF-DFA, multifractality of bearing fault data was quantified with the generalized Hurst exponent, the scaling exponent and the multifractal spectrum. Consequently, controlled by essentially different dynamical mechanisms, the multifractality of four heterogeneous bearing fault data is significantly different; by contrast, controlled by slightly different dynamical mechanisms, the multifractality of homogeneous bearing fault data with different fault diameters is significantly or slightly different depending on different types of bearing faults. Therefore, the multifractal spectrum, as a set of parameters describing multifractality of time series, can be employed to characterize different types and severity of bearing faults. Subsequently, five characteristic parameters sensitive to changes of bearing fault conditions were extracted from the multifractal spectrum and utilized to construct fault features of bearing fault data. Moreover, Hilbert transform based envelope analysis, empirical mode decomposition (EMD) and wavelet transform (WT) were utilized to study the same bearing fault data. 
Also, the kurtosis and the peak levels of the EMD or WT component corresponding to the bearing tones in the frequency domain were carefully checked.
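
A minimal sketch of the Mahalanobis distance criterion used for classification, with synthetic Gaussian features standing in for the multifractal-spectrum parameters extracted in the paper:

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Mahalanobis distance of x from a reference distribution."""
    d = x - mean
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

rng = np.random.default_rng(0)
# Reference (baseline-condition) feature set and two test feature vectors
ref = rng.normal(0.0, 1.0, size=(200, 5))
mean, cov = ref.mean(axis=0), np.cov(ref, rowvar=False)
baseline_like = rng.normal(0.0, 1.0, size=5)   # drawn from the same regime
shifted = rng.normal(4.0, 1.0, size=5)         # drawn from a shifted regime
print(mahalanobis(baseline_like, mean, cov) < mahalanobis(shifted, mean, cov))  # -> True
```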

  12. Two Phase Non-Rigid Multi-Modal Image Registration Using Weber Local Descriptor-Based Similarity Metrics and Normalized Mutual Information

    Directory of Open Access Journals (Sweden)

    Feng Yang

    2013-06-01

    Full Text Available Non-rigid multi-modal image registration plays an important role in medical image processing and analysis. Existing image registration methods based on similarity metrics such as mutual information (MI) and the sum of squared differences (SSD) cannot achieve either high registration accuracy or high registration efficiency. To address this problem, we propose a novel two-phase non-rigid multi-modal image registration method by combining Weber local descriptor (WLD) based similarity metrics with the normalized mutual information (NMI) using the diffeomorphic free-form deformation (FFD) model. The first phase aims at recovering the large deformation component using the WLD-based non-local SSD (wldNSSD) or weighted structural similarity (wldWSSIM). Based on the output of the former phase, the second phase is focused on obtaining accurate transformation parameters related to the small deformation using the NMI. Extensive experiments on T1, T2 and PD weighted MR images demonstrate that the proposed wldNSSD-NMI or wldWSSIM-NMI method outperforms registration methods based on the NMI, the conditional mutual information (CMI), the SSD on entropy images (ESSD) and the ESSD-NMI in terms of registration accuracy and computation efficiency.
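
A minimal sketch of the NMI similarity metric used in the second phase, estimated from a joint intensity histogram as NMI = (H(A) + H(B)) / H(A, B); bin count and test images are illustrative:

```python
import numpy as np

def nmi(a, b, bins=32):
    """Normalized mutual information, NMI = (H(A) + H(B)) / H(A, B),
    estimated from a joint intensity histogram."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = hist / hist.sum()
    pa, pb = p.sum(axis=1), p.sum(axis=0)   # marginal distributions
    H = lambda q: -(q[q > 0] * np.log(q[q > 0])).sum()
    return (H(pa) + H(pb)) / H(p.ravel())

rng = np.random.default_rng(1)
img = rng.random((64, 64))
other = rng.random((64, 64))
self_nmi, cross_nmi = nmi(img, img), nmi(img, other)
print(self_nmi > cross_nmi)  # -> True (aligned images score higher)
```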

  13. The Effects of Land-Use Patterns on Home-Based Tour Complexity and Total Distances Traveled: A Path Analysis

    Directory of Open Access Journals (Sweden)

    João de Abreu e Silva

    2018-03-01

    Full Text Available This work studies the relationships between the number of complex tours (with one or more intermediate stops) and simple home-based tours, total distances traveled by mode, and land-use patterns both at the residence and at the workplace, using path analysis. The model includes commuting distance, car ownership and motorcycle ownership, which are intermediate variables in the relationship between land use, tour complexity and distances traveled by mode. The dataset used here was collected in a region comprising four municipalities located in the north of Portugal, made up of urban areas, their sprawling suburbs, and the surrounding rural hinterland. The results confirm the association between complex tours and higher levels of car use. Land-use patterns significantly affect traveled distances by mode, both directly and indirectly, via the influence of longer-term decisions like vehicle ownership and commuting distance. The results obtained highlight the role of socioeconomic variables in influencing tour complexity; in particular, households with children, higher household income, and workers with a college degree tend to make more complex tours. Land-use patterns mediate the effects of tour complexity on the kilometers traveled by different modes. Increasing densities in central areas, and particularly the concentration of jobs, have relevant benefits by reducing car kilometers driven.

  14. Multiwavelength Raman-fiber-laser-based long-distance remote sensor for simultaneous measurement of strain and temperature.

    Science.gov (United States)

    Han, Young-Geun; Tran, T V A; Kim, Sang-Hyuck; Lee, Sang Bae

    2005-06-01

    We propose a simple and flexible multiwavelength Raman-fiber-laser-based long-distance remote-sensing scheme for simultaneous measurement of strain and temperature by use of fiber Bragg gratings. By combining two uniform fiber Bragg gratings with a tunable chirped fiber grating, we readily achieve simultaneous two-channel sensing probes with a high extinction ratio of more than approximately 50 dB over a 50-km distance. When strain and temperature are applied, lasing wavelength separation and shift occur, respectively, since the two uniform fiber Bragg gratings have identical material composition and different cladding diameters. This allows simultaneous measurement of strain and temperature for long-distance sensing applications of more than 50 km.
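
The simultaneous recovery of strain and temperature from two wavelength shifts amounts to inverting a 2 × 2 sensitivity matrix; a sketch with illustrative (assumed) sensitivity values, not the paper's calibrated ones:

```python
import numpy as np

# Assumed sensitivity matrix for two gratings with different cladding
# diameters: rows = gratings, columns = (strain, temperature) response
# in pm per microstrain and pm per Kelvin (illustrative values)
K = np.array([[1.2, 10.0],
              [0.8, 10.5]])

dlam = np.array([15.0, 12.5])          # measured wavelength shifts (pm)
strain, dT = np.linalg.solve(K, dlam)  # simultaneous recovery
print(round(strain, 2), round(dT, 2))  # -> 7.07 0.65
```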

  15. Prognostic Performance Metrics

    Data.gov (United States)

    National Aeronautics and Space Administration — This chapter presents several performance metrics for offline evaluation of prognostics algorithms. A brief overview of different methods employed for performance...

  16. Overview of journal metrics

    Directory of Open Access Journals (Sweden)

    Kihong Kim

    2018-02-01

    Full Text Available Various kinds of metrics used for the quantitative evaluation of scholarly journals are reviewed. The impact factor and related metrics, including the immediacy index and the aggregate impact factor, which are provided by the Journal Citation Reports, are explained in detail. The Eigenfactor score and the article influence score are also reviewed. In addition, journal metrics such as CiteScore, Source Normalized Impact per Paper, SCImago Journal Rank, h-index, and g-index are discussed. Limitations and problems of these metrics are pointed out. We should be cautious not to rely too heavily on such quantitative measures when evaluating journals or researchers.
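
For reference, the two-year impact factor reduces to a simple ratio (the numbers below are hypothetical):

```python
def impact_factor(citations_to_prev2, items_prev2):
    """JCR-style impact factor: citations in year Y to items published
    in years Y-1 and Y-2, divided by citable items from those years."""
    return citations_to_prev2 / items_prev2

# Hypothetical journal: 300 citations in year Y to 150 citable items
# published in the two preceding years
print(impact_factor(300, 150))  # -> 2.0
```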

  17. Distance between Behaviors and Rational Representations

    NARCIS (Netherlands)

    Trentelman, H.L.; Gottimukkala, S.V.

    2013-01-01

    In this paper we study notions of distance between behaviors of linear differential systems. We introduce four metrics on the space of all controllable behaviors which generalize existing metrics on the space of input-output systems represented by transfer matrices. Three of these are defined in

  18. A Physically—Based Geometry Model for Transport Distance Estimation of Rainfall-Eroded Soil Sediment

    Directory of Open Access Journals (Sweden)

    Qian-Gui Zhang

    2016-01-01

    Full Text Available Estimations of rainfall-induced soil erosion are mostly derived from the weight of sediment measured in natural runoff. The transport distance of eroded soil is important for evaluating landscape evolution but is difficult to estimate, mainly because it cannot be linked directly to the eroded sediment weight. The volume of eroded soil is easier to calculate visually using popular imaging tools, which can aid in estimating the transport distance of eroded soil through geometry relationships. In this study, we present a straightforward geometry model to predict the maximum sediment transport distance incurred by rainfall events of various intensity and duration. In order to verify our geometry prediction model, a series of experiments are reported in the form of a sediment volume. The results show that cumulative rainfall has a linear relationship with the total volume of eroded soil. The geometry model can accurately estimate the maximum transport distance of eroded soil by cumulative rainfall, with a low root-mean-square error (4.7–4.8) and a strong linear correlation (0.74–0.86).
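
The reported linear relationship and its quality measures can be illustrated with a simple least-squares fit (synthetic data; the study's own measurements are not reproduced here):

```python
import numpy as np

# Hypothetical cumulative rainfall (mm) vs eroded soil volume (cm^3)
rain = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
vol = np.array([21.0, 39.0, 62.0, 78.0, 103.0])

slope, intercept = np.polyfit(rain, vol, 1)   # linear fit vol ~ rain
pred = slope * rain + intercept
rmse = np.sqrt(np.mean((vol - pred) ** 2))    # root-mean-square error
r = np.corrcoef(rain, vol)[0, 1]              # linear correlation
print(round(rmse, 2), round(r, 3))
```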

  19. Communication Barriers in Distance Education: "Text-Based Internet-Enabled Online Courses"

    Science.gov (United States)

    Dabaj, Fahme; Isman, Aytekin

    2004-01-01

    With the rapid technological changes and the diverse people demands and conditions, traditional educational systems and institutions are forced to provide additional educational opportunities. A number of educational establishments are contributing to these conditions and demands by developing and offering distance education programs. Distance…

  20. A Chroma-based Tempo-insensitive Distance Measure for Cover Song Identification

    DEFF Research Database (Denmark)

    Jensen, Jesper Højvang; Ellis, Dan P. W.; Christensen, Mads Græsbøll

    In the context of music, a cover version is a remake of a song, often with significant stylistic variation. In this paper we describe a distance measure between sampled audio files that is designed to be insensitive to instrumentation, time shift, temporal scaling and transpositions. The algorithm...

  1. Optimal coordination of distance and over-current relays in series compensated systems based on MAPSO

    International Nuclear Information System (INIS)

    Moravej, Zahra; Jazaeri, Mostafa; Gholamzadeh, Mehdi

    2012-01-01

    Highlight: ► Optimal coordination problem between distance relays and Directional Over-Current Relays (DOCRs) is studied. ► A new problem formulation for both uncompensated and series compensated systems is proposed. ► In order to solve the coordination problem a Modified Adaptive Particle Swarm Optimization (MAPSO) is employed. ► The optimum results are found in both uncompensated and series compensated systems. - Abstract: In this paper, a novel problem formulation for optimal coordination between distance relays and Directional Over-Current Relays (DOCRs) in series compensated systems is proposed. The integration of the series capacitor (SC) into the transmission line makes the coordination problem more complex. The main contribution of this paper is a new systematic method for computing the optimal second zone timing of distance relays and optimal settings of DOCRs, in series compensated and uncompensated transmission systems, which have a combined protection scheme with DOCRs and distance relays. In order to solve this coordination problem, which is nonlinear and non-convex, a Modified Adaptive Particle Swarm Optimization (MAPSO) algorithm is employed. The proposed method is validated by results obtained from a typical test case and a real power system network.
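
A minimal, generic particle swarm optimizer for orientation (illustrative only; the paper's modified adaptive variant adds adaptation mechanisms and a protection-coordination objective not shown here):

```python
import random

def pso(f, dim, bounds, n=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer minimizing f over a box."""
    lo, hi = bounds
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    P = [x[:] for x in X]                    # personal best positions
    pbest = [f(x) for x in X]
    g = P[pbest.index(min(pbest))][:]        # global best position
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * random.random() * (P[i][d] - X[i][d])
                           + c2 * random.random() * (g[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            fx = f(X[i])
            if fx < pbest[i]:
                pbest[i], P[i] = fx, X[i][:]
                if fx < f(g):
                    g = X[i][:]
    return g

random.seed(0)
best = pso(lambda x: sum(v * v for v in x), dim=2, bounds=(-5.0, 5.0))
print(best)  # near [0, 0] for the sphere function
```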

  2. Implementing Electronic Conferencing within a Distance-Based University: University of South Africa Case Study

    Science.gov (United States)

    Kritzinger, E.; Padayachee, K.; Tolmay, M.

    2010-01-01

    The outcome of this paper is primarily to survey and analyse student interactions with electronic conferencing systems and to reflect on the impact of such a system on the students' learning within an open distance learning context. This pilot study is articulated within action research methodology to generate critical reflection on collaborative,…

  3. Converging from Branching to Linear Metrics on Markov Chains

    DEFF Research Database (Denmark)

    Bacci, Giorgio; Bacci, Giovanni; Larsen, Kim Guldstrand

    2015-01-01

    time in the size of the MC. The upper-approximants are Kantorovich-like pseudometrics, i.e. branching-time distances, that converge point-wise to the linear-time metrics. This convergence is interesting in itself, since it reveals a nontrivial relation between branching and linear-time metric...

  4. Beyond leaf color: Comparing camera-based phenological metrics with leaf biochemical, biophysical, and spectral properties throughout the growing season of a temperate deciduous forest

    Science.gov (United States)

    Yang, Xi; Tang, Jianwu; Mustard, John F.

    2014-03-01

    Plant phenology, a sensitive indicator of climate change, influences vegetation-atmosphere interactions by changing the carbon and water cycles from local to global scales. Camera-based phenological observations of the color changes of the vegetation canopy throughout the growing season have become popular in recent years. However, the linkages between camera phenological metrics and leaf biochemical, biophysical, and spectral properties are elusive. We measured key leaf properties including chlorophyll concentration and leaf reflectance on a weekly basis from June to November 2011 in a white oak forest on the island of Martha's Vineyard, Massachusetts, USA. Concurrently, we used a digital camera to automatically acquire daily pictures of the tree canopies. We found that there was a mismatch between the camera-based phenological metric for the canopy greenness (green chromatic coordinate, gcc) and the total chlorophyll and carotenoids concentration and leaf mass per area during late spring/early summer. The seasonal peak of gcc is approximately 20 days earlier than the peak of the total chlorophyll concentration. During the fall, both canopy and leaf redness were significantly correlated with the vegetation index for anthocyanin concentration, opening a new window to quantify vegetation senescence remotely. Satellite- and camera-based vegetation indices agreed well, suggesting that camera-based observations can be used as the ground validation for satellites. Using the high-temporal resolution dataset of leaf biochemical, biophysical, and spectral properties, our results show the strengths and potential uncertainties to use canopy color as the proxy of ecosystem functioning.
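
The camera greenness metric discussed above, the green chromatic coordinate, is computed from the red, green, and blue digital numbers of each image; the channel values below are illustrative:

```python
def green_chromatic_coordinate(r, g, b):
    """gcc = G / (R + G + B), the canopy greenness index."""
    return g / (r + g + b)

# Hypothetical canopy-average digital numbers from one daily image
print(round(green_chromatic_coordinate(92.0, 118.0, 70.0), 3))  # -> 0.421
```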

  5. Metrics for Polyphonic Sound Event Detection

    Directory of Open Access Journals (Sweden)

    Annamaria Mesaros

    2016-05-01

    Full Text Available This paper presents and discusses various metrics proposed for evaluation of polyphonic sound event detection systems used in realistic situations where there are typically multiple sound sources active simultaneously. The system output in this case contains overlapping events, marked as multiple sounds detected as being active at the same time. The polyphonic system output requires a suitable procedure for evaluation against a reference. Metrics from neighboring fields such as speech recognition and speaker diarization can be used, but they need to be partially redefined to deal with the overlapping events. We present a review of the most common metrics in the field and the way they are adapted and interpreted in the polyphonic case. We discuss segment-based and event-based definitions of each metric and explain the consequences of instance-based and class-based averaging using a case study. In parallel, we provide a toolbox containing implementations of presented metrics.
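
A minimal sketch of a segment-based F-score of the kind reviewed in the paper, comparing reference and estimated sets of active event labels per segment (simplified; the accompanying toolbox's definitions include more detail):

```python
def segment_f1(ref, est):
    """Segment-based F-score for polyphonic sound event detection.
    ref/est map segment index -> set of active event labels."""
    tp = fp = fn = 0
    for seg in ref:
        r, e = ref[seg], est.get(seg, set())
        tp += len(r & e)        # events correctly detected in segment
        fp += len(e - r)        # spurious detections
        fn += len(r - e)        # missed events
    return 2 * tp / (2 * tp + fp + fn)

# Three segments; segment 1 has two overlapping reference events
ref = {0: {"speech"}, 1: {"speech", "car"}, 2: {"car"}}
est = {0: {"speech"}, 1: {"speech"}, 2: {"car", "bird"}}
print(round(segment_f1(ref, est), 3))  # -> 0.75
```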

  6. Functionality based detection of airborne engineered nanoparticles in quasi real time: a new type of detector and a new metric.

    Science.gov (United States)

    Neubauer, Nicole; Seipenbusch, Martin; Kasper, Gerhard

    2013-08-01

    A new type of detector, which we call the Catalytic Activity Aerosol Monitor (CAAM), was investigated for its capability to detect traces of commonly used industrial catalysts in ambient air in quasi real time. Its metric is defined as the catalytic activity concentration (CAC) expressed per volume of sampled workplace air. We thus propose a new metric which expresses the presence of nanoparticles in terms of their functionality (in this case, a functionality of potential relevance for damaging effects) rather than their number, surface, or mass concentration in workplace air. The CAAM first samples a few micrograms of known or anticipated airborne catalyst material onto a filter and then initiates a chemical reaction which is specific to that catalyst. The concentration of specific gases is recorded using an IR sensor, thereby giving the desired catalytic activity. Thanks to a miniaturization effort, the laboratory prototype is compact and portable. The sensitivity and linearity of the CAAM response were investigated with catalytically active palladium and nickel nano-aerosols of known mass concentration and precisely adjustable primary particle size in the range of 3-30 nm. With the miniature IR sensor, the smallest detectable particle mass was found to be in the range of a few micrograms, giving estimated sampling times on the order of minutes for workplace aerosol concentrations typically reported in the literature. Tests were also performed in the presence of inert background aerosols of SiO2, TiO2, and Al2O3. It was found that the active material is detectable via its catalytic activity even when the particles are attached to a non-active background aerosol.

  7. Change in intraindividual variability over time as a key metric for defining performance-based cognitive fatigability.

    Science.gov (United States)

    Wang, Chao; Ding, Mingzhou; Kluger, Benzi M

    2014-03-01

    Cognitive fatigability is conventionally quantified as the increase over time in either mean reaction time (RT) or error rate from two or more time periods during sustained performance of a prolonged cognitive task. There is evidence indicating that these mean performance measures may not sufficiently reflect the response characteristics of cognitive fatigue. We hypothesized that changes in intraindividual variability over time would be a more sensitive and ecologically meaningful metric for investigations of fatigability of cognitive performance. To test this hypothesis, fifteen young adults were recruited. Trait fatigue perceptions in various domains were assessed with the Multidimensional Fatigue Index (MFI). Behavioral data were then recorded during performance of a three-hour continuous cued Stroop task. Results showed that intraindividual variability, as quantified by the coefficient of variation of RT, increased linearly over the course of three hours and demonstrated a significantly greater effect size than mean RT or accuracy. Change in intraindividual RT variability over time was significantly correlated with relevant subscores of the MFI, including reduced activity, reduced motivation and mental fatigue. While change in mean RT over time was also correlated with reduced motivation and mental fatigue, these correlations were significantly smaller than those associated with intraindividual RT variability. RT distribution analysis using an ex-Gaussian model further revealed that change in intraindividual variability over time reflects an increase in the exponential component of variance and may reflect attentional lapses or other breakdowns in cognitive control. These results suggest that intraindividual variability and its change over time provide important metrics for measuring cognitive fatigability and may prove useful for inferring the underlying neuronal mechanisms of both perceptions of fatigue and objective changes in performance. Copyright © 2014
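
The key metric, the coefficient of variation of RT computed per time block, can be sketched as follows (the reaction times are hypothetical):

```python
import statistics

def cv(xs):
    """Coefficient of variation: sample standard deviation / mean."""
    return statistics.stdev(xs) / statistics.mean(xs)

# Hypothetical RTs (ms) from an early and a late block of a long task;
# the late block includes occasional very slow responses (lapses)
early = [412, 398, 405, 420, 401, 415]
late = [430, 395, 510, 405, 585, 410]
print(cv(early) < cv(late))  # -> True (variability grows with fatigue)
```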

  8. CORSEN, a new software dedicated to microscope-based 3D distance measurements: mRNA-mitochondria distance, from single-cell to population analyses.

    Science.gov (United States)

    Jourdren, Laurent; Delaveau, Thierry; Marquenet, Emelie; Jacq, Claude; Garcia, Mathilde

    2010-07-01

    Recent improvements in microscopy technology allow detection of single molecules of RNA, but tools for large-scale automatic analyses of particle distributions are lacking. An increasing number of imaging studies emphasize the importance of mRNA localization in the definition of cell territory or the biogenesis of cell compartments. CORSEN is a new tool dedicated to three-dimensional (3D) distance measurements from imaging experiments especially developed to access the minimal distance between RNA molecules and cellular compartment markers. CORSEN includes a 3D segmentation algorithm allowing the extraction and the characterization of the cellular objects to be processed--surface determination, aggregate decomposition--for minimal distance calculations. CORSEN's main contribution lies in exploratory statistical analysis, cell population characterization, and high-throughput assays that are made possible by the implementation of a batch process analysis. We highlighted CORSEN's utility for the study of relative positions of mRNA molecules and mitochondria: CORSEN clearly discriminates mRNA localized to the vicinity of mitochondria from those that are translated on free cytoplasmic polysomes. Moreover, it quantifies the cell-to-cell variations of mRNA localization and emphasizes the necessity for statistical approaches. This method can be extended to assess the evolution of the distance between specific mRNAs and other cellular structures in different cellular contexts. CORSEN was designed for the biologist community with the concern to provide an easy-to-use and highly flexible tool that can be applied for diverse distance quantification issues.
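    The core measurement CORSEN automates can be sketched in a few lines, assuming hypothetical coordinates: the minimal 3D Euclidean distance between one RNA spot and a set of compartment-marker points. CORSEN itself additionally performs segmentation, surface extraction, aggregate decomposition and batch statistics.

```python
# Minimal-distance sketch: smallest 3D Euclidean distance from an mRNA
# spot to any point of a segmented compartment surface. Coordinates are
# hypothetical illustrations, not data from the paper.
import math

def min_distance(point, marker_points):
    """Smallest Euclidean distance from `point` to any marker point."""
    return min(math.dist(point, m) for m in marker_points)

rna_spot = (2.0, 3.0, 1.0)  # hypothetical mRNA coordinate (um)
mito_surface = [(5.0, 3.0, 1.0), (2.0, 7.0, 1.0), (2.0, 3.0, 4.0)]

print(min_distance(rna_spot, mito_surface))  # 3.0 for these points
```

    Repeating this per cell over many spots yields the per-cell distance distributions on which the population-level statistics are built.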

  9. Brand metrics that matter

    NARCIS (Netherlands)

    Muntinga, D.; Bernritter, S.

    2017-01-01

    The brand is becoming increasingly central to the organisation. It is therefore essential to measure the brand's health, performance and development. Selecting the right brand metrics is a challenge, however: an enormous number of metrics compete for brand managers' attention. But which…

  10. Privacy Metrics and Boundaries

    NARCIS (Netherlands)

    L-F. Pau (Louis-François)

    2005-01-01

    This paper aims at defining a set of privacy metrics (quantitative and qualitative) for the relation between a privacy protector and an information gatherer. The aims of such metrics are: - to allow assessment and comparison of different user scenarios and their differences; for…

  11. DESIGNING INSTRUCTION FOR THE TRADITIONAL, ADULT, AND DISTANCE LEARNER: A New Engine for Technology-Based Teaching

    Directory of Open Access Journals (Sweden)

    Lawrence A. Tomei

    2011-10-01

    Full Text Available Adult students demand a wider variety of instructional strategies that encompass real-world, interactive, cooperative, and discovery learning experiences. Designing Instruction for the Traditional, Adult, and Distance Learner: A New Engine for Technology-Based Teaching explores how technology impacts the process of devising instructional plans as well as learning itself in adult students. Containing research from leading international experts, this publication proposes realistic and accurate archetypes to assist educators in incorporating state-of-the-art technologies into online instruction. This text proposes a new paradigm for designing, developing, implementing, and assessing technology-based instruction. It addresses three target populations of today's learner: traditional, adult, and distance education. The text proposes a new model of instructional system design (ISD) for developing effective technology-based education that involves a five-step process focusing on the learner, learning theories, resources, delivery modalities, and outcomes.

  12. [Problem based learning by distance education and analysis of a training system].

    Science.gov (United States)

    Dury, Cécile

    2004-12-01

    This article presents and analyses a training system aimed at the acquisition of nursing care skills. The goals pursued are the development of: an active pedagogical method, learning through problems (LTP); an interdisciplinary and intercultural approach, with the same problems being solved by students from different disciplines and cultures; and the use of the new technologies of information and communication (NTIC) to enable maximal cooperation at a distance between the various partners of the project. The analysis of the system shows that the pedagogical goals pursued through LTP are reached. To be optimal, the multidisciplinary and multicultural approach requires close coordination between the partners, balance between the groups of students from different countries and disciplines, and training and support from the tutors in the use of the distance-teaching platform.

  13. Enhancing Authentication Models Characteristic Metrics via ...

    African Journals Online (AJOL)

    In this work, we derive the universal characteristic metrics set for authentication models based on security, usability and design issues. We then compute the probability of the occurrence of each characteristic metrics in some single factor and multifactor authentication models in order to determine the effectiveness of these ...

  14. Are contemporary tourists consuming distance?

    DEFF Research Database (Denmark)

    Larsen, Gunvor Riber

    2012

    Background The background for this research, which explores how tourists represent distance and whether or not distance can be said to be consumed by contemporary tourists, is the increasing leisure mobility of people. Travelling for the purpose of visiting friends and relatives is increasing… of understanding mobility at a conceptual level, and distance matters to people's manifest mobility: how they travel and how far they travel are central elements of their movements. Therefore leisure mobility (indeed all mobility) is the activity of relating across distance, either through actual corporeal… metric representation. These representations are the focus for this research. Research Aim and Questions The aim of this research is thus to explore how distance is being represented within the context of leisure mobility. Further the aim is to explore how or whether distance is being consumed…

  15. Anatomy-Based navigation for ventriculostomy: Nasion-coronal suture distance measurement

    Directory of Open Access Journals (Sweden)

    Mevci Özdemir

    2014-09-01

    Full Text Available Objective: In this study we aimed to define a landmark, measurable through the skin, from the nasal midline point (nasion) to the coronal suture, and to calculate an average value for this distance. To our knowledge, we report the nasion-coronal suture distance for the first time in a Turkish population. Methods: The study included 30 crania and 30 frontal bones. On each skull, the distance from the midline nasal suture to the coronal suture was measured along the curvature with a tape measure. Results: Mean values were determined. The mean distance between the nasal suture and the coronal suture was 12.2 cm (range 10.3-13.5 cm). Conclusion: The nasal suture is easily palpable through the skin. For a ventricular drainage operation, a small incision is carried down through skin to bone at a spot 12 cm posterior to the nasion and 3 cm lateral to the midline. These data provide practical information for the neurosurgeon and are available everywhere. J Clin Exp Invest 2014; 5(3): 368-370

  16. Stereo camera based virtual cane system with identifiable distance tactile feedback for the blind.

    Science.gov (United States)

    Kim, Donghun; Kim, Kwangtaek; Lee, Sangyoun

    2014-06-13

    In this paper, we propose a new haptic-assisted virtual cane system operated by a simple finger pointing gesture. The system is developed by two stages: development of visual information delivery assistant (VIDA) with a stereo camera and adding a tactile feedback interface with dual actuators for guidance and distance feedbacks. In the first stage, user's pointing finger is automatically detected using color and disparity data from stereo images and then a 3D pointing direction of the finger is estimated with its geometric and textural features. Finally, any object within the estimated pointing trajectory in 3D space is detected and the distance is then estimated in real time. For the second stage, identifiable tactile signals are designed through a series of identification experiments, and an identifiable tactile feedback interface is developed and integrated into the VIDA system. Our approach differs in that navigation guidance is provided by a simple finger pointing gesture and tactile distance feedbacks are perfectly identifiable to the blind.
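    The distance estimation in the VIDA stage ultimately rests on the standard rectified-stereo relation between disparity and depth; the sketch below illustrates it with hypothetical focal-length and baseline values, not the system's actual calibration.

```python
# Pinhole stereo depth sketch: Z = f * B / d for a rectified stereo pair,
# where f is the focal length in pixels, B the camera baseline in metres,
# and d the disparity in pixels. The numbers below are hypothetical.
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth (m) of a point seen with the given disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example: 700 px focal length, 6 cm baseline, 30 px disparity -> 1.4 m.
print(depth_from_disparity(30, 700, 0.06))
```

    Objects along the estimated 3D pointing direction can then be ranged in real time by evaluating this relation at their matched disparities.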

  17. Stereo Camera Based Virtual Cane System with Identifiable Distance Tactile Feedback for the Blind

    Directory of Open Access Journals (Sweden)

    Donghun Kim

    2014-06-01

    Full Text Available In this paper, we propose a new haptic-assisted virtual cane system operated by a simple finger pointing gesture. The system is developed by two stages: development of visual information delivery assistant (VIDA with a stereo camera and adding a tactile feedback interface with dual actuators for guidance and distance feedbacks. In the first stage, user’s pointing finger is automatically detected using color and disparity data from stereo images and then a 3D pointing direction of the finger is estimated with its geometric and textural features. Finally, any object within the estimated pointing trajectory in 3D space is detected and the distance is then estimated in real time. For the second stage, identifiable tactile signals are designed through a series of identification experiments, and an identifiable tactile feedback interface is developed and integrated into the VIDA system. Our approach differs in that navigation guidance is provided by a simple finger pointing gesture and tactile distance feedbacks are perfectly identifiable to the blind.

  18. Method paper--distance and travel time to casualty clinics in Norway based on crowdsourced postcode coordinates: a comparison with other methods.

    Directory of Open Access Journals (Sweden)

    Guttorm Raknes

    Full Text Available We describe a method that uses crowdsourced postcode coordinates and Google maps to estimate average distance and travel time for inhabitants of a municipality to a casualty clinic in Norway. The new method was compared with methods based on population centroids, median distance and town hall location, and we used it to examine how distance affects the utilisation of out-of-hours primary care services. At short distances our method showed good correlation with mean travel time and distance. The utilisation of out-of-hours services correlated with postcode based distances similar to previous research. The results show that our method is a reliable and useful tool for estimating average travel distances and travel times.

  19. Method paper--distance and travel time to casualty clinics in Norway based on crowdsourced postcode coordinates: a comparison with other methods.

    Science.gov (United States)

    Raknes, Guttorm; Hunskaar, Steinar

    2014-01-01

    We describe a method that uses crowdsourced postcode coordinates and Google maps to estimate average distance and travel time for inhabitants of a municipality to a casualty clinic in Norway. The new method was compared with methods based on population centroids, median distance and town hall location, and we used it to examine how distance affects the utilisation of out-of-hours primary care services. At short distances our method showed good correlation with mean travel time and distance. The utilisation of out-of-hours services correlated with postcode based distances similar to previous research. The results show that our method is a reliable and useful tool for estimating average travel distances and travel times.
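    As a point of comparison for the road distances the crowdsourced-postcode method obtains from Google Maps, a straight-line (great-circle) distance between two coordinates can be sketched with the haversine formula; the coordinates below (roughly Bergen and Oslo) are illustrative only.

```python
# Haversine sketch: great-circle distance between two (lat, lon) points,
# a simple baseline against road-network travel distances. Coordinates
# are illustrative, not taken from the paper's postcode data.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two coordinates."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

print(round(haversine_km(60.39, 5.32, 59.91, 10.75)))  # ~305 km
```

    Averaging such distances over the inhabitants of each postcode, weighted by population, gives the kind of municipality-level estimate the method refines with actual travel routes and times.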

  20. Distance-Based Tear Lactoferrin Assay on Microfluidic Paper Device Using Interfacial Interactions on Surface-Modified Cellulose.

    Science.gov (United States)

    Yamada, Kentaro; Henares, Terence G; Suzuki, Koji; Citterio, Daniel

    2015-11-11

    "Distance-based" detection motifs on microfluidic paper-based analytical devices (μPADs) allow quantitative analysis without signal readout instruments, in a manner similar to classical analogue thermometers. To realize a cost-effective and calibration-free distance-based assay of lactoferrin in human tear fluid on a μPAD not relying on antibodies or enzymes, we investigated the fluidic mobilities of the target protein and of Tb³⁺ cations, used as the fluorescent detection reagent, on surface-modified cellulosic filter papers. Chromatographic elution experiments in a tear-like sample matrix containing electrolytes and proteins revealed a collapse of the attractive electrostatic interactions between lactoferrin or Tb³⁺ and the cellulosic substrate, which was overcome by modifying the paper surface with the sulfated polysaccharide ι-carrageenan. The resulting μPAD, based on the fluorescence emission distance, successfully analyzed 0-4 mg mL⁻¹ of lactoferrin in a complex human tear matrix with a lower limit of detection of 0.1 mg mL⁻¹ by simple visual inspection. Assay results for 18 human tear samples, including ocular disease patients and healthy volunteers, showed good correlation with the reference ELISA method, with a slope of 0.997 and a regression coefficient of 0.948. The distance-based quantitative signal and the good batch-to-batch fabrication reproducibility, relying on printing methods, enable quantitative analysis by simply reading out "concentration scale marks" printed on the μPAD, without performing any calibration or using any signal readout instrument.
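    The thermometer-style readout principle can be sketched as follows, assuming invented scale-mark positions: a measured fluorescence-front distance is converted to a concentration by interpolating between pre-printed marks, which is exactly what a user does by eye on the device.

```python
# Sketch of a distance-based readout: interpolate a concentration from
# the migration distance using printed scale marks. The (distance,
# concentration) pairs below are hypothetical, not the device's values.
from bisect import bisect_left

# Hypothetical (distance_mm, concentration_mg_per_mL) scale marks.
SCALE = [(0.0, 0.0), (8.0, 1.0), (14.0, 2.0), (18.0, 3.0), (21.0, 4.0)]

def read_concentration(distance_mm):
    """Linear interpolation between the two nearest printed scale marks."""
    dists = [d for d, _ in SCALE]
    if distance_mm <= dists[0]:
        return SCALE[0][1]
    if distance_mm >= dists[-1]:
        return SCALE[-1][1]
    i = bisect_left(dists, distance_mm)
    (d0, c0), (d1, c1) = SCALE[i - 1], SCALE[i]
    return c0 + (c1 - c0) * (distance_mm - d0) / (d1 - d0)

print(read_concentration(11.0))  # midway between the 1 and 2 mg/mL marks
```

    Because the marks are printed during fabrication, the readout needs no per-device calibration, which is the property the paper's reproducibility results support.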