WorldWideScience

Sample records for multiple resolution metrics

  1. High resolution metric imaging payload

    Delclaud, Y.

    2017-11-01

    Alcatel Space Industries has become Europe's leader in the field of high and very high resolution optical payloads, within the framework of Earth observation systems able to provide military and government users with metric-resolution images from space. This leadership has allowed Alcatel to propose for the export market, within a French collaborative framework, a complete space-based system for metric observation.

  2. A Simple Metric for Determining Resolution in Optical, Ion, and Electron Microscope Images.

    Curtin, Alexandra E; Skinner, Ryan; Sanders, Aric W

    2015-06-01

    A resolution metric intended for resolution analysis of arbitrary spatially calibrated images is presented. By fitting a simple sigmoidal function to pixel intensities across slices of an image taken perpendicular to light-dark edges, the mean distance over which the light-dark transition occurs can be determined. A fixed multiple of this characteristic distance is then reported as the image resolution. The prefactor is determined by analysis of scanning transmission electron microscope high-angle annular dark field images of Si. This metric has been applied to optical, scanning electron microscope, and helium ion microscope images. This method provides quantitative feedback about image resolution, independent of the tool on which the data were collected. In addition, our analysis provides a nonarbitrary and self-consistent framework that any end user can utilize to evaluate the resolution of multiple microscopes from any vendor using the same metric.
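
    The edge-fit procedure above reduces to a few lines of code. The sketch below is illustrative only: the logistic edge model, the helper names, and the default prefactor are our assumptions; the paper calibrates its prefactor against STEM HAADF images of Si rather than using a fixed value.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def sigmoid(x, x0, w, lo, hi):
        """Logistic edge model: light-dark transition of width w centered at x0."""
        return lo + (hi - lo) / (1.0 + np.exp(-(x - x0) / w))

    def edge_resolution(profile, pixel_size_nm, prefactor=2.0):
        """Fit a sigmoid to one intensity slice taken perpendicular to a
        light-dark edge and report prefactor * characteristic width.
        `prefactor` here is a placeholder value, not the paper's calibrated one."""
        x = np.arange(len(profile)) * pixel_size_nm
        p0 = [x.mean(), pixel_size_nm, profile.min(), profile.max()]
        (x0, w, lo, hi), _ = curve_fit(sigmoid, x, profile, p0=p0)
        return prefactor * abs(w)

    # The image-level estimate is the mean over many such slices, e.g.:
    # np.mean([edge_resolution(p, 0.5) for p in slices])
    ```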

  3. Tripled Fixed Point in Ordered Multiplicative Metric Spaces

    Laishram Shanjit

    2017-06-01

    In this paper, we present some tripled fixed point theorems in partially ordered multiplicative metric spaces, dependent on another function. Our results generalise the results of [6] and [5].
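
    For readers new to the setting, a multiplicative metric replaces the additive triangle inequality with a multiplicative one. The axioms below are the standard ones from the general literature, not a statement of this paper's results:

    ```latex
    % A multiplicative metric d : X \times X \to \mathbb{R} satisfies, for all x, y, z in X:
    \begin{align*}
    &d(x,y) \ge 1, \qquad d(x,y) = 1 \iff x = y,\\
    &d(x,y) = d(y,x),\\
    &d(x,z) \le d(x,y)\, d(y,z) \quad \text{(multiplicative triangle inequality)}.
    \end{align*}
    % Canonical example on the positive reals: d(x,y) = \max(x/y,\; y/x).
    ```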

  4. Evaluating Multiple Object Tracking Performance: The CLEAR MOT Metrics

    Bernardin Keni

    2008-01-01

    Simultaneous tracking of multiple persons in real-world environments is an active research field and several approaches have been proposed, based on a variety of features and algorithms. Recently, there has been a growing interest in organizing systematic evaluations to compare the various techniques. Unfortunately, the lack of common metrics for measuring the performance of multiple object trackers still makes it hard to compare their results. In this work, we introduce two intuitive and general metrics to allow for objective comparison of tracker characteristics, focusing on their precision in estimating object locations, their accuracy in recognizing object configurations and their ability to consistently label objects over time. These metrics have been extensively used in two large-scale international evaluations, the 2006 and 2007 CLEAR evaluations, to measure and compare the performance of multiple object trackers for a wide variety of tracking tasks. Selected performance results are presented and the advantages and drawbacks of the presented metrics are discussed based on the experience gained during the evaluations.
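
    The two metrics referred to above are the multiple object tracking precision (MOTP) and accuracy (MOTA), defined in the CLEAR MOT framework as follows (notation ours):

    ```latex
    % d_{i,t}: distance between matched object-hypothesis pair i at time t
    % c_t: number of matches at t;  g_t: number of ground-truth objects at t
    % m_t: misses;  fp_t: false positives;  mme_t: mismatches (identity switches)
    \mathrm{MOTP} = \frac{\sum_{i,t} d_{i,t}}{\sum_t c_t},
    \qquad
    \mathrm{MOTA} = 1 - \frac{\sum_t \bigl(m_t + fp_t + mme_t\bigr)}{\sum_t g_t}
    ```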

  5. Robustness Metrics: Consolidating the multiple approaches to quantify Robustness

    Göhler, Simon Moritz; Eifler, Tobias; Howard, Thomas J.

    2016-01-01

    robustness metrics; 3) Functional expectancy and dispersion robustness metrics; and 4) Probability of conformance robustness metrics. The goal was to give a comprehensive overview of robustness metrics and guidance to scholars and practitioners to understand the different types of robustness metrics...

  6. Project, building and utilization of a tomograph of micro metric resolution to application in soil science

    Macedo, Alvaro; Torre Neto, Andre; Cruvinel, Paulo Estevao; Crestana, Silvio

    1996-08-01

    This paper describes the design, construction, and use of a tomograph with micrometric resolution for applications in soil science. It discusses the measurement problems that arise in the study of soils and describes the system and the methodology employed.

  7. Whole brain white matter changes revealed by multiple diffusion metrics in multiple sclerosis: A TBSS study

    Liu, Yaou, E-mail: asiaeurope80@gmail.com [Department of Radiology, Xuanwu Hospital, Capital Medical University, Beijing 100053 (China); Duan, Yunyun, E-mail: xiaoyun81.love@163.com [Department of Radiology, Xuanwu Hospital, Capital Medical University, Beijing 100053 (China); He, Yong, E-mail: yong.h.he@gmail.com [State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing 100875 (China); Yu, Chunshui, E-mail: csyuster@gmail.com [Department of Radiology, Xuanwu Hospital, Capital Medical University, Beijing 100053 (China); Wang, Jun, E-mail: jun_wang@bnu.edu.cn [State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing 100875 (China); Huang, Jing, E-mail: sainthj@126.com [Department of Radiology, Xuanwu Hospital, Capital Medical University, Beijing 100053 (China); Ye, Jing, E-mail: jingye.2007@yahoo.com.cn [Department of Neurology, Xuanwu Hospital, Capital Medical University, Beijing 100053 (China); Parizel, Paul M., E-mail: paul.parizel@ua.ac.be [Department of Radiology, Antwerp University Hospital and University of Antwerp, Wilrijkstraat 10, 2650 Edegem, 8 Belgium (Belgium); Li, Kuncheng, E-mail: kunchengli55@gmail.com [Department of Radiology, Xuanwu Hospital, Capital Medical University, Beijing 100053 (China); Shu, Ni, E-mail: nshu55@gmail.com [State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing 100875 (China)

    2012-10-15

    Objective: To investigate whole brain white matter changes in multiple sclerosis (MS) by multiple diffusion indices, we examined patients with diffusion tensor imaging and utilized tract-based spatial statistics (TBSS) method to analyze the data. Methods: Forty-one relapsing-remitting multiple sclerosis (RRMS) patients and 41 age- and gender-matched normal controls were included in this study. Diffusion weighted images were acquired by employing a single-shot echo planar imaging sequence on a 1.5 T MR scanner. Voxel-wise analyses of multiple diffusion metrics, including fractional anisotropy (FA), mean diffusivity (MD), axial diffusivity (AD) and radial diffusivity (RD) were performed with TBSS. Results: The MS patients had significantly decreased FA (9.11%), increased MD (8.26%), AD (3.48%) and RD (13.17%) in their white matter skeletons compared with the controls. Through TBSS analyses, we found abnormal diffusion changes in widespread white matter regions in MS patients. Specifically, decreased FA, increased MD and increased RD were involved in whole-brain white matter, while several regions exhibited increased AD. Furthermore, white matter regions with significant correlations between the diffusion metrics and the clinical variables (the EDSS scores, disease durations and white matter lesion loads) in MS patients were identified. Conclusion: Widespread white matter abnormalities were observed in MS patients revealed by multiple diffusion metrics. The diffusion changes and correlations with clinical variables were mainly attributed to increased RD, implying the predominant role of RD in reflecting the subtle pathological changes in MS.

  8. Whole brain white matter changes revealed by multiple diffusion metrics in multiple sclerosis: A TBSS study

    Liu, Yaou; Duan, Yunyun; He, Yong; Yu, Chunshui; Wang, Jun; Huang, Jing; Ye, Jing; Parizel, Paul M.; Li, Kuncheng; Shu, Ni

    2012-01-01

    Objective: To investigate whole brain white matter changes in multiple sclerosis (MS) by multiple diffusion indices, we examined patients with diffusion tensor imaging and utilized tract-based spatial statistics (TBSS) method to analyze the data. Methods: Forty-one relapsing-remitting multiple sclerosis (RRMS) patients and 41 age- and gender-matched normal controls were included in this study. Diffusion weighted images were acquired by employing a single-shot echo planar imaging sequence on a 1.5 T MR scanner. Voxel-wise analyses of multiple diffusion metrics, including fractional anisotropy (FA), mean diffusivity (MD), axial diffusivity (AD) and radial diffusivity (RD) were performed with TBSS. Results: The MS patients had significantly decreased FA (9.11%), increased MD (8.26%), AD (3.48%) and RD (13.17%) in their white matter skeletons compared with the controls. Through TBSS analyses, we found abnormal diffusion changes in widespread white matter regions in MS patients. Specifically, decreased FA, increased MD and increased RD were involved in whole-brain white matter, while several regions exhibited increased AD. Furthermore, white matter regions with significant correlations between the diffusion metrics and the clinical variables (the EDSS scores, disease durations and white matter lesion loads) in MS patients were identified. Conclusion: Widespread white matter abnormalities were observed in MS patients revealed by multiple diffusion metrics. The diffusion changes and correlations with clinical variables were mainly attributed to increased RD, implying the predominant role of RD in reflecting the subtle pathological changes in MS

  9. Content-Based High-Resolution Remote Sensing Image Retrieval via Unsupervised Feature Learning and Collaborative Affinity Metric Fusion

    Yansheng Li

    2016-08-01

    With the urgent demand for automatic management of large numbers of high-resolution remote sensing images, content-based high-resolution remote sensing image retrieval (CB-HRRS-IR) has attracted much research interest. Accordingly, this paper proposes a novel high-resolution remote sensing image retrieval approach via multiple feature representation and collaborative affinity metric fusion (IRMFRCAMF). In IRMFRCAMF, we design four unsupervised convolutional neural networks with different layers to generate four types of unsupervised features from the fine level to the coarse level. In addition to these four types of unsupervised features, we also implement four traditional feature descriptors: local binary pattern (LBP), gray-level co-occurrence matrix (GLCM), maximal response 8 (MR8), and scale-invariant feature transform (SIFT). In order to fully incorporate the complementary information among multiple features of one image and the mutual information across auxiliary images in the image dataset, this paper advocates collaborative affinity metric fusion to measure the similarity between images. The performance evaluation of high-resolution remote sensing image retrieval is implemented on two public datasets, the UC Merced (UCM) dataset and the Wuhan University (WH) dataset. Extensive experiments show that our proposed IRMFRCAMF can significantly outperform the state-of-the-art approaches.

  10. Optimized multiple linear mappings for single image super-resolution

    Zhang, Kaibing; Li, Jie; Xiong, Zenggang; Liu, Xiuping; Gao, Xinbo

    2017-12-01

    Learning piecewise linear regression has been recognized in the literature as an effective approach to example learning-based single image super-resolution (SR). In this paper, we employ an expectation-maximization (EM) algorithm to further improve the SR performance of our previous multiple linear mappings (MLM) based SR method. In the training stage, the proposed method starts with a set of linear regressors obtained by the MLM-based method, and then jointly optimizes the clustering results and the low- and high-resolution subdictionary pairs for the regression functions by using the metric of reconstruction error. In the test stage, we select the optimal regressor for SR reconstruction by accumulating the reconstruction errors of the m nearest neighbors in the training set. Thorough experiments carried out on six publicly available datasets demonstrate that the proposed SR method can yield high-quality images with finer details and sharper edges in terms of both quantitative and perceptual image quality assessments.
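
    The test-stage selection rule can be sketched compactly. All names and array shapes below are hypothetical; in particular, `train_err` is assumed to hold precomputed reconstruction errors of each cluster's regressor on each training patch:

    ```python
    import numpy as np

    def select_regressor(x_lr, train_lr, train_err, regressors, m=5):
        """Sketch of the test stage: find the m nearest training patches to the
        input LR patch, then pick the linear mapping whose accumulated
        reconstruction error over those neighbors is smallest.
        train_lr:   (n, d) training LR patches
        train_err:  (K, n) error of regressor k on training patch j
        regressors: list of (W, b) pairs, one per cluster"""
        d = np.linalg.norm(train_lr - x_lr, axis=1)
        nn = np.argsort(d)[:m]
        k = int(np.argmin(train_err[:, nn].sum(axis=1)))
        W, b = regressors[k]
        return W @ x_lr + b        # predicted HR patch
    ```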

  11. Multiple reflection Michelson interferometer with picometer resolution.

    Pisani, Marco

    2008-12-22

    A Michelson interferometer based on an optical set-up allowing multiple reflection between two plane mirrors performs the multiplication of the optical path by a factor N, proportionally increasing the resolution of the measurement. A multiplication factor of almost two orders of magnitude has been demonstrated with a simple set-up. The technique can be applied to any interferometric measurement where the classical interferometer limits due to fringe nonlinearities and quantum noise are an issue. Applications in precision engineering, vibration analysis, nanometrology, and spectroscopy are foreseen.
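
    A back-of-the-envelope relation shows where the factor-of-N gain comes from, assuming normal incidence and mirror separation d: after N passes, the accumulated interference phase, and hence the phase change per unit displacement, scales with N,

    ```latex
    \varphi = \frac{2\pi \cdot 2Nd}{\lambda}
    \quad\Longrightarrow\quad
    \delta\varphi = \frac{4\pi N}{\lambda}\,\delta d ,
    ```

    so a given fringe-measurement precision translates into an N-times smaller displacement uncertainty.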

  12. Nonlinear Semi-Supervised Metric Learning Via Multiple Kernels and Local Topology.

    Li, Xin; Bai, Yanqin; Peng, Yaxin; Du, Shaoyi; Ying, Shihui

    2018-03-01

    Changing the metric on the data may change the data distribution, hence a good distance metric can promote the performance of a learning algorithm. In this paper, we address the semi-supervised distance metric learning (ML) problem to obtain the best nonlinear metric for the data. First, we describe the nonlinear metric by a multiple kernel representation. By this approach, we project the data into a high dimensional space, where the data can be well represented by linear ML. Then, we reformulate the linear ML as a minimization problem on the positive definite matrix group. Finally, we develop a two-step algorithm for solving this model and design an intrinsic steepest descent algorithm to learn the positive definite metric matrix. Experimental results validate that our proposed method is effective and outperforms several state-of-the-art ML methods.
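
    In generic form (our notation, not necessarily the paper's exact formulation), the learned object is a Mahalanobis-type metric made nonlinear by moving to a kernel-induced feature space:

    ```latex
    % Mahalanobis-type metric in a kernel-induced feature space:
    d_M(x,y) = \sqrt{\bigl(\phi(x)-\phi(y)\bigr)^{\top} M \bigl(\phi(x)-\phi(y)\bigr)},
    \qquad M \succ 0,
    % with the feature map \phi induced by a weighted combination of base kernels:
    k(x,y) = \sum_{m} \beta_m\, k_m(x,y), \qquad \beta_m \ge 0 .
    ```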

  13. Examination of High Resolution Channel Topography to Determine Suitable Metrics to Characterize Morphological Complexity

    Stewart, R. L.; Gaeuman, D.

    2015-12-01

    Complex bed morphology is deemed necessary to restore salmonid habitats, yet quantifiable metrics that capture channel complexity have remained elusive. This work utilizes high resolution topographic data from 40 miles of the Trinity River of northern California to determine a suitable metric for characterizing morphological complexity at the reach scale. The study area is segregated into reaches defined by individual riffle-pool units or aggregates of several consecutive units. Potential measures of complexity include rugosity and depth statistics such as the standard deviation and interquartile range, yet previous research has shown these metrics are scale dependent and subject to sampling density-based bias. The effect of sampling density on the present analysis has been reduced by underrepresenting the high resolution topographic data as a 3 ft × 3 ft raster so that all areas are equally sampled. Standard rugosity, defined as the three-dimensional surface area divided by the projected area, has been shown to be dependent on average depth. We therefore define R*, an empirically depth-corrected rugosity metric, in which rugosity is corrected using an empirical relationship based on linear regression between the standard rugosity metric and average depth. By removing the dependence on depth using a regression fit to the study reaches, R* provides a measure of reach-scale complexity relative to the entire study area. The interquartile range of depths is also depth-dependent, so we define a non-dimensional metric (IQR*) as the interquartile range divided by the median depth. These metrics are used to develop rankings of channel complexity, which are found to closely agree with the perceived channel complexity observed in the field. Current efforts combine these measures of morphological complexity with salmonid habitat suitability to evaluate the effects of channel complexity on the various life stages of salmonids. Future work will investigate the downstream sequencing of channel
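
    A minimal sketch of the two metrics, assuming a simple linear fit for the depth correction (the study's exact correction form may differ); names are ours:

    ```python
    import numpy as np

    def depth_corrected_rugosity(rugosity, mean_depth):
        """R* sketch: regress standard rugosity against reach-average depth
        across all reaches, then report each reach's value relative to the
        regression line (positive => rougher than expected for its depth)."""
        slope, intercept = np.polyfit(mean_depth, rugosity, 1)
        return rugosity - (slope * mean_depth + intercept)

    def iqr_star(depths):
        """Non-dimensional depth spread: interquartile range / median depth."""
        q1, q3 = np.percentile(depths, [25, 75])
        return (q3 - q1) / np.median(depths)
    ```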

  14. Assessing the influence of multiple stressors on stream diatom metrics in the upper Midwest, USA

    Munn, Mark D.; Waite, Ian R.; Konrad, Christopher P.

    2018-01-01

    Water resource managers face increasing challenges in identifying which physical and chemical stressors are responsible for the alteration of biological conditions in streams. The objective of this study was to assess the comparative influence of multiple stressors on benthic diatoms at 98 sites that spanned a range of stressors in an agriculturally dominated region in the upper Midwest, USA. The primary stressors of interest were nutrients, herbicides and fungicides, sediment, and streamflow, although the influence of physical habitat was also incorporated in the assessment. Boosted Regression Trees were used to examine both the sensitivity of various diatom metrics and the relative importance of the primary stressors. Percent Sensitive Taxa, percent Highly Motile Taxa, and percent High Phosphorus Taxa had the strongest responses to stressors. Habitat and total phosphorus were the most common discriminators of diatom metrics, with herbicides as secondary factors. A Classification and Regression Tree (CART) model was used to examine conditional relations among stressors and indicated that fine-grain streams had a lower percentage of Sensitive Taxa than coarse-grain streams, with Sensitive Taxa decreasing further with increased water temperature (>30 °C) and triazine concentrations (>1500 ng/L). In contrast, streams dominated by coarse-grain substrate contained a higher percentage of Sensitive Taxa, with relative abundance increasing at lower water temperatures and shallower water depths. Water temperature appears to be a major limiting factor in Midwest streams, whereas both total phosphorus and percent fines showed a slight subsidy-stress response. While using benthic algae for assessing stream quality can be challenging, field-based studies can elucidate stressor effects and interactions when the response variables are appropriate, sufficient stressor resolution is achieved, and the number and type of sites represent a gradient of stressor conditions and at least a quasi

  15. Metric-Resolution 2D River Modeling at the Macroscale: Computational Methods and Applications in a Braided River

    Jochen Schubert

    2015-11-01

    Metric resolution digital terrain models (DTMs) of rivers now make it possible for multi-dimensional fluid mechanics models to be applied to characterize flow at fine scales that are relevant to studies of river morphology and ecological habitat, or microscales. These developments are important for managing rivers because of the potential to better understand system dynamics, anthropogenic impacts, and the consequences of proposed interventions. However, the data volumes and computational demands of microscale river modeling have largely constrained applications to small multiples of the channel width, or the mesoscale. This report presents computational methods to extend a microscale river model beyond the mesoscale to the macroscale, defined as large multiples of the channel width. A method of automated unstructured grid generation is presented that automatically clusters fine resolution cells in areas of curvature (e.g., channel banks), and places relatively coarse cells in areas lacking topographic variability. This overcomes the need to manually generate breaklines to constrain the grid, which is painstaking at the mesoscale and virtually impossible at the macroscale. The method is applied to a braided river with an extremely complex channel network configuration and shown to yield an efficient fine resolution model. The sensitivity of model output to grid design and resistance parameters is also examined as it relates to analysis of hydrology, hydraulic geometry and river habitats, and the findings reiterate the importance of model calibration and validation.

  16. Common Fixed Points of Generalized Rational Type Cocyclic Mappings in Multiplicative Metric Spaces

    Mujahid Abbas

    2015-01-01

    The aim of this paper is to present fixed point results for mappings satisfying a generalized rational contractive condition in the setup of multiplicative metric spaces. As an application, we obtain a common fixed point of a pair of weakly compatible mappings. Some common fixed point results for pairs of rational contractive type mappings involved in cocyclic representations of a nonempty subset of a multiplicative metric space are also obtained. Some examples are presented to support the results proved herein. Our results generalize and extend various results in the existing literature.

  17. Measuring floodplain spatial patterns using continuous surface metrics at multiple scales

    Scown, Murray W.; Thoms, Martin C.; DeJager, Nathan R.

    2015-01-01

    Interactions between fluvial processes and floodplain ecosystems occur upon a floodplain surface that is often physically complex. Spatial patterns in floodplain topography have only recently been quantified over multiple scales, and discrepancies exist in how floodplain surfaces are perceived to be spatially organised. We measured spatial patterns in floodplain topography for Pool 9 of the Upper Mississippi River, USA, using moving window analyses of eight surface metrics applied to a 1 × 1 m² DEM over multiple scales. The metrics used were Range, SD, Skewness, Kurtosis, CV, SDCURV, Rugosity, and Vol:Area, and window sizes ranged from 10 to 1000 m in radius. Surface metric values were highly variable across the floodplain and revealed a high degree of spatial organisation in floodplain topography. Moran's I correlograms fit to the landscape of each metric at each window size revealed that patchiness existed at nearly all window sizes, but the strength and scale of patchiness changed with window size, suggesting that multiple scales of patchiness and patch structure exist in the topography of this floodplain. Scale thresholds in the spatial patterns were observed, particularly between the 50 and 100 m window sizes for all surface metrics and between the 500 and 750 m window sizes for most metrics. These threshold scales are ~15–20% and 150% of the main channel width (1–2% and 10–15% of the floodplain width), respectively. These thresholds may be related to structuring processes operating across distinct scale ranges. By coupling surface metrics, multi-scale analyses, and correlograms, quantifying floodplain topographic complexity is possible in ways that should assist in clarifying how floodplain ecosystems are structured.
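
    The moving-window computation is straightforward to sketch. The snippet below uses a square window as a simplification (the study's windows are circular, with radii of 10-1000 m) and accepts any NumPy reduction as the metric:

    ```python
    import numpy as np
    from scipy.ndimage import generic_filter

    def windowed_metric(dem, radius_cells, func=np.std):
        """Apply `func` (e.g. np.std for SD, np.ptp for Range) to a square
        window around every DEM cell; the result is a landscape of the metric
        at that window size. Square footprint is our simplification."""
        size = 2 * radius_cells + 1
        return generic_filter(dem, func, size=size, mode="nearest")

    # e.g. sd_50m = windowed_metric(dem, radius_cells=50)
    ```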

  18. Far-field super-resolution imaging of resonant multiples

    Guo, Bowen

    2016-05-20

    We demonstrate for the first time that seismic resonant multiples, usually considered as noise, can be used for super-resolution imaging in the far-field region of sources and receivers. Tests with both synthetic data and field data show that resonant multiples can image reflector boundaries with resolutions more than twice the classical resolution limit. Resolution increases with the order of the resonant multiples. This procedure has important applications in earthquake and exploration seismology, radar, sonar, LIDAR (light detection and ranging), and ultrasound imaging, where the multiples can be used to make high-resolution images.

  19. Upper esophageal sphincter (UES) metrics on high-resolution manometry (HRM) differentiate achalasia subtypes.

    Blais, P; Patel, A; Sayuk, G S; Gyawali, C P

    2017-12-01

    The upper esophageal sphincter (UES) reflexively responds to bolus presence within the esophageal lumen, therefore UES metrics can vary in achalasia. Within consecutive patients undergoing esophageal high-resolution manometry (HRM), 302 patients (58.2±1.0 year, 57% F) with esophageal outflow obstruction were identified, and compared to 16 asymptomatic controls (27.7±0.7 year, 56% F). Esophageal outflow obstruction was segregated into achalasia subtypes 1, 2, and 3, and esophagogastric junction outflow obstruction (EGJOO with intact peristalsis) using Chicago Classification v3.0. UES and lower esophageal sphincter (LES) metrics were compared between esophageal outflow obstruction and normal controls using univariate and multivariate analysis. Linear regression excluded multicollinearity of pressure metrics that demonstrated significant differences across individual subtype comparisons. LES integrated relaxation pressure (IRP) had utility in differentiating achalasia from controls (P<.0001), but no utility in segregating between subtypes (P=.27). In comparison to controls, patients collectively demonstrated univariate differences in UES mean basal pressure, relaxation time to nadir, recovery time, and residual pressure (UES-RP) (P≤.049). UES-RP was highest in type 2 achalasia (P<.0001 compared to other subtypes and controls). In multivariate analysis, only UES-RP retained significance in comparison between each of the subgroups (P≤.02 for each comparison). Intrabolus pressure was highest in type 3 achalasia; this demonstrated significant differences across some but not all subtype comparisons. Nadir UES-RP can differentiate achalasia subtypes within the esophageal outflow obstruction spectrum, with highest values in type 2 achalasia. This metric likely represents a surrogate marker for esophageal pressurization. © 2017 John Wiley & Sons Ltd.

  20. Fast generation of multiple resolution instances of raster data sets

    Arge, L.; Haverkort, H.J.; Tsirogiannis, C.P.

    2012-01-01

    In many GIS applications it is important to study the characteristics of a raster data set at multiple resolutions. Often this is done by generating several coarser resolution rasters from a fine resolution raster. In this paper we describe efficient algorithms for different variants of this

  21. Using basic metrics to analyze high-resolution temperature data in the subsurface

    Shanafield, Margaret; McCallum, James L.; Cook, Peter G.; Noorduijn, Saskia

    2017-08-01

    Time-series temperature data can be summarized to provide valuable information on spatial variation in subsurface flow, using simple metrics. Such computationally light analysis is often discounted in favor of more complex models. However, this study demonstrates the merits of summarizing high-resolution temperature data, obtained from a fiber optic cable installation at several depths within a water delivery channel, into daily amplitudes and mean temperatures. These results are compared to fluid flux estimates from a one-dimensional (1D) advection-conduction model and to the results of a previous study that used a full three-dimensional (3D) model. At a depth of 0.1 m below the channel, plots of amplitude suggested areas of advective water movement (as confirmed by the 1D and 3D models). Due to lack of diurnal signal at depths below 0.1 m, mean temperature was better able to identify probable areas of water movement at depths of 0.25-0.5 m below the channel. The high density of measurements provided a 3D picture of temperature change over time within the study reach, and would be suitable for long-term monitoring in man-made environments such as constructed wetlands, recharge basins, and water-delivery channels, where a firm understanding of spatial and temporal variation in infiltration is imperative for optimal functioning.
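
    The two summary metrics are simple to compute from a datetime-indexed series; the sketch below assumes pandas, and the column names are our own:

    ```python
    import pandas as pd

    def daily_metrics(temps: pd.Series) -> pd.DataFrame:
        """Collapse a high-resolution temperature time series (datetime index)
        into the two metrics used above: daily mean and daily amplitude."""
        daily = temps.resample("1D")
        return pd.DataFrame({
            "mean_temp": daily.mean(),
            "amplitude": daily.max() - daily.min(),
        })
    ```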

  22. A robust new metric of phenotypic distance to estimate and compare multiple trait differences among populations

    Rebecca SAFRAN, Samuel FLAXMAN, Michael KOPP, Darren E. IRWIN, Derek BRIGGS, Matthew R. EVANS, W. Chris FUNK, David A. GRAY, Eileen A. HEBE

    2012-06-01

    Whereas a rich literature exists for estimating population genetic divergence, metrics of phenotypic trait divergence are lacking, particularly for comparing multiple traits among three or more populations. Here, we review and analyze via simulation Hedges' g, a widely used parametric estimate of effect size. Our analyses indicate that g is sensitive to a combination of unequal trait variances and unequal sample sizes among populations and to changes in the scale of measurement. We then derive and explain a new, non-parametric distance measure, "Δp", which is calculated from a joint cumulative distribution function (CDF) built from all populations under study. More precisely, distances are measured in terms of the percentiles in this CDF at which each population's median lies. Δp combines many desirable features of other distance metrics into a single metric; namely, compared to other metrics, Δp is relatively insensitive to unequal variances and sample sizes among the populations sampled. Furthermore, a key feature of Δp, and our main motivation for developing it, is that it easily accommodates simultaneous comparisons of any number of traits across any number of populations. To exemplify its utility, we employ Δp to address a question related to the role of sexual selection in speciation: are sexual signals more divergent than ecological traits in closely related taxa? Using traits of known function in closely related populations, we show that traits predictive of reproductive performance are, indeed, more divergent and more sexually dimorphic than traits related to ecological adaptation [Current Zoology 58 (3): 423–436, 2012].
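
    The percentile idea behind Δp can be sketched for a single trait; the published metric aggregates such distances across traits and populations, a step omitted here. Function and variable names are ours:

    ```python
    import numpy as np

    def delta_p(pop_a, pop_b, *other_pops):
        """Single-trait sketch: pool all populations into one joint empirical
        CDF, locate each population's median within it, and take the
        percentile difference between the two focal populations."""
        pooled = np.sort(np.concatenate((pop_a, pop_b) + other_pops))

        def pctl(pop):  # percentile of pop's median in the joint CDF
            return np.searchsorted(pooled, np.median(pop)) / len(pooled)

        return abs(pctl(pop_a) - pctl(pop_b))
    ```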

  23. Microstructural abnormalities in the trigeminal nerves of patients with trigeminal neuralgia revealed by multiple diffusion metrics

    Liu, Yaou [Department of Radiology, Xuanwu Hospital, Capital Medical University, Beijing 100053 (China); Beijing Key laboratory of MRI and Brain Informatics, Beijing (China); Li, Jiping [Department of Functional Neurosurgery, Xuanwu Hospital, Capital Medical University, Beijing 100053 (China); Butzkueven, Helmut [Department of Medicine, University of Melbourne, Parkville 3010 (Australia); Duan, Yunyun; Zhang, Mo [Department of Radiology, Xuanwu Hospital, Capital Medical University, Beijing 100053 (China); Shu, Ni [State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing 100875 (China); Li, Yongjie [Department of Functional Neurosurgery, Xuanwu Hospital, Capital Medical University, Beijing 100053 (China); Zhang, Yuqing, E-mail: yuqzhang@sohu.com [Department of Functional Neurosurgery, Xuanwu Hospital, Capital Medical University, Beijing 100053 (China); Li, Kuncheng, E-mail: kunchengli55@gmail.com [Department of Radiology, Xuanwu Hospital, Capital Medical University, Beijing 100053 (China)

    2013-05-15

    Objective: To investigate microstructural tissue changes of the trigeminal nerve (TGN) in patients with unilateral trigeminal neuralgia (TN) using multiple diffusion metrics, and to correlate the diffusion indexes with clinical variables. Methods: 16 patients with TN and 6 healthy controls (HC) were recruited into our study. All participants were imaged on a 3.0 T system with a three-dimensional time-of-flight (TOF) magnetic resonance angiography and fluid attenuated inversion recovery (FLAIR) DTI sequence. We placed regions of interest over the root entry zone of the TGN and measured fractional anisotropy (FA), mean diffusivity (MD), axial diffusivity (AD) and radial diffusivity (RD). The mean values of FA, MD, AD and RD were compared between the affected and unaffected sides in the same patient, and to HC values. The correlation between the side-to-side diffusion metric difference and clinical variables (disease duration and visual analogue scale, VAS) was further explored. Results: Compared with the unaffected side and HC, the affected side showed significantly decreased FA and increased RD; however, no significant changes in AD were found. A trend toward significantly increased MD was identified on the affected side compared with the unaffected side. We also found a significant correlation between the FA reduction and the VAS of pain (r = −0.55, p = 0.03). Conclusion: DTI can quantitatively assess the microstructural abnormalities of the affected TGN in patients with TN. Our results from multiple diffusion metrics suggest that demyelination without significant axonal injury is the essential pathological basis in the affected TGN. The correlation between FA reduction and VAS suggests FA as a potential objective MRI biomarker of clinical severity.

  24. Microstructural abnormalities in the trigeminal nerves of patients with trigeminal neuralgia revealed by multiple diffusion metrics

    Liu, Yaou; Li, Jiping; Butzkueven, Helmut; Duan, Yunyun; Zhang, Mo; Shu, Ni; Li, Yongjie; Zhang, Yuqing; Li, Kuncheng

    2013-01-01

    Objective: To investigate microstructural tissue changes of the trigeminal nerve (TGN) in patients with unilateral trigeminal neuralgia (TN) using multiple diffusion metrics, and to correlate the diffusion indexes with clinical variables. Methods: 16 patients with TN and 6 healthy controls (HC) were recruited into our study. All participants were imaged on a 3.0 T system with a three-dimensional time-of-flight (TOF) magnetic resonance angiography and fluid attenuated inversion recovery (FLAIR) DTI sequence. We placed regions of interest over the root entry zone of the TGN and measured fractional anisotropy (FA), mean diffusivity (MD), axial diffusivity (AD) and radial diffusivity (RD). The mean values of FA, MD, AD and RD were compared between the affected and unaffected sides in the same patient, and to HC values. The correlation between the side-to-side diffusion metric difference and clinical variables (disease duration and visual analogue scale, VAS) was further explored. Results: Compared with the unaffected side and HC, the affected side showed significantly decreased FA and increased RD; however, no significant changes in AD were found. A trend toward significantly increased MD was identified on the affected side compared with the unaffected side. We also found a significant correlation between the FA reduction and the VAS of pain (r = −0.55, p = 0.03). Conclusion: DTI can quantitatively assess the microstructural abnormalities of the affected TGN in patients with TN. Our results from multiple diffusion metrics suggest that demyelination without significant axonal injury is the essential pathological basis in the affected TGN. The correlation between FA reduction and VAS suggests FA as a potential objective MRI biomarker of clinical severity.

  25. Multiple scattering effects in depth resolution of elastic recoil detection

    Wielunski, L.S.; Harding, G.L.

    1998-01-01

    Elastic Recoil Detection (ERD) is used to profile hydrogen and other low mass elements in thin films, at surfaces and interfaces, in a similar way that Rutherford Backscattering Spectroscopy (RBS) is used to detect and profile heavy elements. It is often assumed that the depth resolutions of these two techniques are similar. However, in contrast to typical RBS, the depth resolution of ERD is limited substantially by multiple scattering. In experimental data analysis and/or spectral simulations of a typical RBS measurement, multiple scattering effects are often ignored. Computer programs used in IBA, such as RUMP, HYPRA or RBX, do not include multiple scattering effects at all. In this paper, using practical thin metal structures with films containing intentionally introduced hydrogen, we demonstrate experimental ERD depth resolution and sensitivity limitations. The effects of sample material and scattering angle are also discussed. (authors)

  26. Multiple scattering effects in depth resolution of elastic recoil detection

    Wielunski, L.S.; Harding, G.L. [Commonwealth Scientific and Industrial Research Organisation (CSIRO), Lindfield, NSW (Australia). Telecommunications and Industrial Physics; Szilagyi, E. [KFKI Research Institute for Particle and Nuclear Physics, Budapest, (Hungary)

    1998-06-01

    Elastic Recoil Detection (ERD) is used to profile hydrogen and other low mass elements in thin films, at surfaces and interfaces, in a similar way that Rutherford Backscattering Spectroscopy (RBS) is used to detect and profile heavy elements. It is often assumed that the depth resolutions of these two techniques are similar. However, in contrast to typical RBS, the depth resolution of ERD is limited substantially by multiple scattering. In experimental data analysis and/or spectral simulations of a typical RBS measurement, multiple scattering effects are often ignored. Computer programs used in IBA, such as RUMP, HYPRA or RBX, do not include multiple scattering effects at all. In this paper, using practical thin metal structures with films containing intentionally introduced hydrogen, we demonstrate experimental ERD depth resolution and sensitivity limitations. The effects of sample material and scattering angle are also discussed. (authors). 19 refs., 4 figs.

  27. Fast generation of multiple resolution instances of raster data sets

    Arge, Lars; Haverkort, Herman; Tsirogiannis, Constantinos

    2012-01-01

    In many GIS applications it is important to study the characteristics of a raster data set at multiple resolutions. Often this is done by generating several coarser resolution rasters from a fine resolution raster. In this paper we describe efficient algorithms for different variants of this problem in the main memory of the computer. We also provide two algorithms that solve this problem in external memory, that is, when the input raster is larger than the main memory. The first external algorithm is very easy to implement and requires O(sort(N)) data block transfers from/to the external memory. … For this variant we describe an algorithm that runs in O(U log N) time in internal memory, where U is the size of the output. We show how this algorithm can be adapted to perform efficiently in the external memory using O(sort(U)) data transfers from the disk. We have also implemented two of the presented algorithms …
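
    For orientation, the task being solved can be stated as a naive in-memory computation; the paper's contribution is doing this efficiently, including when the raster does not fit in main memory. The sketch below is only the naive baseline, with names of our choosing:

    ```python
    import numpy as np

    def coarsen(raster, factor, agg=np.mean):
        """Produce a coarser raster by aggregating factor x factor blocks
        of the fine raster (ragged edges are trimmed for simplicity)."""
        h, w = raster.shape
        h2, w2 = h - h % factor, w - w % factor
        blocks = raster[:h2, :w2].reshape(h2 // factor, factor,
                                          w2 // factor, factor)
        return agg(blocks, axis=(1, 3))

    # e.g. a small pyramid: [coarsen(dem, 2**k) for k in range(1, 5)]
    ```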

  28. Solutions on high-resolution multiple configuration system sensors

    Liu, Hua; Ding, Quanxin; Guo, Chunjie; Zhou, Liwei

    2014-11-01

    To achieve improved resolution in the modern imaging domain, a continuous-zoom, multiple-configuration method built around a core optic is modeled, based on a novel principle of energy transfer and high-accuracy localization, by which the system resolution can be improved to the nanometer level. A comparative study of traditional versus modern methods demonstrates that the dialectical relationship, and the balance, among the merit function, the optimization algorithms, and the model parameterization is important. System evaluation criteria such as MTF, REA, and RMS support these arguments qualitatively.

  29. Reconstructed Image Spatial Resolution of Multiple Coincidences Compton Imager

    Andreyev, Andriy; Sitek, Arkadiusz; Celler, Anna

    2010-02-01

    We study the multiple coincidences Compton imager (MCCI) which is based on a simultaneous acquisition of several photons emitted in cascade from a single nuclear decay. Theoretically, this technique should provide a major improvement in localization of a single radioactive source as compared to a standard Compton camera. In this work, we investigated the performance and limitations of MCCI using Monte Carlo computer simulations. Spatial resolutions of the reconstructed point source have been studied as a function of the MCCI parameters, including geometrical dimensions and detector characteristics such as materials, energy and spatial resolutions.

  30. The SPAtial EFficiency metric (SPAEF): multiple-component evaluation of spatial patterns for optimization of hydrological models

    Koch, Julian; Cüneyd Demirel, Mehmet; Stisen, Simon

    2018-05-01

    The process of model evaluation is not only an integral part of model development and calibration but also of paramount importance when communicating modelling results to the scientific community and stakeholders. The modelling community has a large and well-tested toolbox of metrics to evaluate temporal model performance. In contrast, spatial performance evaluation has not kept pace with the wide availability of spatial observations or with the sophisticated model codes simulating the spatial variability of complex hydrological processes. This study makes a contribution towards advancing spatial-pattern-oriented model calibration by rigorously testing a multiple-component performance metric. The promoted SPAtial EFficiency (SPAEF) metric reflects three equally weighted components: correlation, coefficient of variation and histogram overlap. This multiple-component approach is found to be advantageous for the complex task of comparing spatial patterns. SPAEF, its three components individually, and two alternative spatial performance metrics, i.e. connectivity analysis and fractions skill score, are applied in a spatial-pattern-oriented model calibration of a catchment model in Denmark. Results underline the importance of multiple-component metrics, because stand-alone metrics tend to fail to provide holistic pattern information. The three SPAEF components are found to be independent, which allows them to complement each other in a meaningful way. In order to optimally exploit spatial observations made available by remote sensing platforms, this study suggests applying bias-insensitive metrics, which further allow for a comparison of variables that are related but may differ in unit. This study applies SPAEF in the hydrological context using the mesoscale Hydrologic Model (mHM; version 5.8), but we see great potential across disciplines related to spatially distributed earth system modelling.
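
    A compact sketch of the metric built from its three named components; details such as the bin count and bin range of the z-score histograms are our assumptions:

    ```python
    import numpy as np

    def spaef(sim, obs, bins=100):
        """SPAEF sketch: alpha = Pearson correlation, beta = ratio of
        coefficients of variation, gamma = overlap of z-score histograms;
        a perfect spatial match gives SPAEF = 1."""
        sim, obs = sim.ravel(), obs.ravel()
        alpha = np.corrcoef(sim, obs)[0, 1]
        beta = (np.std(sim) / np.mean(sim)) / (np.std(obs) / np.mean(obs))
        zs = (sim - sim.mean()) / sim.std()
        zo = (obs - obs.mean()) / obs.std()
        lo, hi = min(zs.min(), zo.min()), max(zs.max(), zo.max())
        hs, _ = np.histogram(zs, bins=bins, range=(lo, hi))
        ho, _ = np.histogram(zo, bins=bins, range=(lo, hi))
        gamma = np.minimum(hs, ho).sum() / ho.sum()
        return 1.0 - np.sqrt((alpha - 1)**2 + (beta - 1)**2 + (gamma - 1)**2)
    ```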

  31. Facile and high spatial resolution ratio-metric luminescence thermal mapping in microfluidics by near infrared excited upconversion nanoparticles

    Wang, Yu; Li, Shunbo; Wen, Weijia; Cao, Wenbin

    2016-01-01

    A local area temperature monitor is important for precise control of chemical and biological processes in microfluidics. In this work, we developed a facile method to realize temperature mapping with micron spatial resolution in a microfluidic channel quickly and cost effectively. Based on the temperature dependent fluorescence emission of NaYF₄:Yb³⁺,Er³⁺ upconversion nanoparticles (UCNPs) under near-infrared irradiation, ratio-metric imaging of UCNP-doped polydimethylsiloxane can map the detailed temperature distribution in the channel. Unlike some reported strategies that utilize a temperature sensitive organic dye (such as Rhodamine) to achieve thermal sensing, our method is highly chemically inert and physically stable without any performance degradation in long term operation. Moreover, this method can be easily scaled up or down, since the spatial and temperature resolution is determined by an optical imaging system. Our method supplies a simple and efficient solution for temperature mapping on a heterogeneous surface where usage of an infrared thermal camera is limited.
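
    The ratio-metric readout for Er³⁺ rests on the standard fluorescence-intensity-ratio model for thermally coupled levels; this is a textbook relation rather than one stated in this record, and the band positions are approximate:

    ```latex
    % Intensity ratio of the two green Er^{3+} emission bands
    % (from the thermally coupled ^2H_{11/2} and ^4S_{3/2} levels):
    R(T) = \frac{I_{\sim 525\,\mathrm{nm}}}{I_{\sim 545\,\mathrm{nm}}}
         = C \exp\!\left(-\frac{\Delta E}{k_B T}\right)
    % A calibrated R(T) curve converts each pixel's band ratio to temperature.
    ```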

  32. Multiplicative surrogate standard deviation: a group metric for the glycemic variability of individual hospitalized patients.

    Braithwaite, Susan S; Umpierrez, Guillermo E; Chase, J Geoffrey

    2013-09-01

    Group metrics are described to quantify blood glucose (BG) variability of hospitalized patients. The "multiplicative surrogate standard deviation" (MSSD) is the reverse-transformed group mean of the standard deviations (SDs) of the logarithmically transformed BG data set of each patient. The "geometric group mean" (GGM) is the reverse-transformed group mean of the means of the logarithmically transformed BG data set of each patient. Before reverse transformation is performed, the mean of means and mean of SDs each has its own SD, which becomes a multiplicative standard deviation (MSD) after reverse transformation. Statistical predictions and comparisons of parametric or nonparametric tests remain valid after reverse transformation. A subset of a previously published BG data set of 20 critically ill patients from the first 72 h of treatment under the SPRINT protocol was transformed logarithmically. After rank ordering according to the SD of the logarithmically transformed BG data of each patient, the cohort was divided into two equal groups, those having lower or higher variability. For the entire cohort, the GGM was 106 (÷/× 1.07) mg/dl, and MSSD was 1.24 (÷/× 1.07). For the subgroups having lower and higher variability, respectively, the GGM did not differ, 104 (÷/× 1.07) versus 109 (÷/× 1.07) mg/dl, but the MSSD differed, 1.17 (÷/× 1.03) versus 1.31 (÷/× 1.05), p = .00004. By using the MSSD with its MSD, groups can be characterized and compared according to glycemic variability of individual patient members. © 2013 Diabetes Technology Society.
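
    The construction of the two group metrics follows directly from the description above; a minimal sketch, with names of our choosing:

    ```python
    import numpy as np

    def ggm_mssd(bg_per_patient):
        """Log-transform each patient's blood glucose series, take per-patient
        mean and SD, average across patients, then reverse-transform.
        Returns (geometric group mean, multiplicative surrogate SD)."""
        logs = [np.log(np.asarray(bg, dtype=float)) for bg in bg_per_patient]
        ggm = np.exp(np.mean([x.mean() for x in logs]))
        mssd = np.exp(np.mean([x.std(ddof=1) for x in logs]))
        return ggm, mssd
    ```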

  33. Seasonal climate signals from multiple tree ring metrics: A case study of Pinus ponderosa in the upper Columbia River Basin

    Dannenberg, Matthew P.; Wise, Erika K.

    2016-04-01

    Projected changes in the seasonality of hydroclimatic regimes are likely to have important implications for water resources and terrestrial ecosystems in the U.S. Pacific Northwest. The tree ring record, which has frequently been used to position recent changes in a longer-term context, typically relies on signals embedded in the total ring width of tree rings. Additional climatic inferences at a subannual temporal scale can be made using alternative tree ring metrics such as earlywood and latewood widths and the density of tree ring latewood. Here we examine seasonal precipitation and temperature signals embedded in total ring width, earlywood width, adjusted latewood width, and blue intensity chronologies from a network of six Pinus ponderosa sites in and surrounding the upper Columbia River Basin of the U.S. Pacific Northwest. We also evaluate the potential for combining multiple tree ring metrics together in reconstructions of past cool- and warm-season precipitation. The common signal among all metrics and sites is related to warm-season precipitation. Earlywood and latewood widths differ primarily in their sensitivity to conditions in the year prior to growth. Total and earlywood widths from the lowest elevation sites also reflect cool-season moisture. Effective correlation analyses and composite-plus-scale tests suggest that combining multiple tree ring metrics together may improve reconstructions of warm-season precipitation. For cool-season precipitation, total ring width alone explains more variance than any other individual metric or combination of metrics. The composite-plus-scale tests show that variance-scaled precipitation reconstructions in the upper Columbia River Basin may be asymmetric in their ability to capture extreme events.

  34. Regional emission metrics for short-lived climate forcers from multiple models

    B. Aamaas

    2016-06-01

    For short-lived climate forcers (SLCFs), the impact of emissions depends on where and when the emissions take place. Comprehensive new calculations of various emission metrics for SLCFs are presented, based on radiative forcing (RF) values calculated in four different (chemical-transport or coupled chemistry–climate) models. We distinguish between emissions during summer (May–October) and winter (November–April) for emissions in Europe and East Asia, as well as from the global shipping sector and global emissions. The species included in this study are aerosols and aerosol precursors (BC, OC, SO2, NH3), as well as ozone precursors (NOx, CO, VOCs), which also influence aerosols to a lesser degree. Emission metrics for the global climate responses of these emissions, as well as for CH4, have been calculated using the global warming potential (GWP) and the global temperature change potential (GTP), based on dedicated RF simulations by four global models. The emission metrics include indirect cloud effects of aerosols and the semi-direct forcing for BC. In addition to the standard emission metrics for pulse and sustained emissions, we have also calculated a new emission metric designed for an emission profile consisting of a ramping period of 15 years followed by sustained emissions, which is more appropriate for a gradual implementation of mitigation policies. For the aerosols, the emission metric values are larger in magnitude for emissions in Europe than East Asia and for summer than winter. A variation is also observed for the ozone precursors, with the largest values for emissions in East Asia and winter for CO and in Europe and summer for VOCs. In general, the variations between the emission metrics derived from different models are larger than the variations between regions and seasons, but the regional and seasonal variations for the best estimate also hold for most of the models individually. Further, the estimated climate impact of an illustrative mitigation
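
    For reference, the standard pulse-emission definitions of the two metrics used above, for 1 kg of species x at time horizon H (the paper additionally computes sustained- and ramp-profile variants):

    ```latex
    \mathrm{GWP}_x(H) = \frac{\int_0^H \mathrm{RF}_x(t)\,dt}{\int_0^H \mathrm{RF}_{\mathrm{CO_2}}(t)\,dt},
    \qquad
    \mathrm{GTP}_x(H) = \frac{\Delta T_x(H)}{\Delta T_{\mathrm{CO_2}}(H)}
    ```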

  35. Association of High-Resolution Manometry Metrics with the Symptoms of Achalasia and the Symptomatic Outcomes of Peroral Esophageal Myotomy.

    Tang, Yurong; Xie, Chen; Wang, Meifeng; Jiang, Liuqin; Shi, Ruihua; Lin, Lin

    2015-01-01

    High-resolution manometry (HRM) has improved the accuracy of manometry in detecting achalasia and has helped distinguish between clinically relevant subtypes. This study investigated whether HRM metrics correlate with achalasia symptoms and the symptomatic outcomes of peroral esophageal myotomy (POEM). Of the 30 patients who were enrolled, 25 were treated with POEM, 12 of whom underwent HRM after 3 months. All the patients completed the Eckardt score questionnaires, and those who underwent POEM were followed up for about 6 months. Pearson correlation was used to assess the relationship between the HRM metrics and symptoms and outcomes. The integrated relaxation pressure (IRP) score positively correlated with the total Eckardt score, regurgitation score and weight loss score in all the patients, and with the weight loss score in type I achalasia. In the 25 patients (10 patients, type I; 15 patients, type II) who underwent POEM, the total Eckardt scores and individual symptom scores significantly decreased after surgery. Changes in the Eckardt scores were similar between type I and type II. Further, the changes in the Eckardt scores and weight loss score were positively correlated with baseline IRP. Twelve patients (4 patients, type I; 8 patients, type II) underwent HRM again after POEM. IRP changed significantly after POEM, as did the DEP in type II. The IRP changes after POEM were positively correlated with the Eckardt score changes. IRP is correlated with the symptoms and outcomes of achalasia patients. Thus, HRM is effective for assessing the severity of achalasia and can predict the efficacy of POEM.

  36. Multiple speckle illumination for optical-resolution photoacoustic imaging

    Poisson, Florian; Stasio, Nicolino; Moser, Christophe; Psaltis, Demetri; Bossy, Emmanuel

    2017-03-01

    Optical-resolution photoacoustic microscopy offers exquisite and specific contrast to optical absorption. Conventional approaches generally involve raster scanning a focused spot over the sample. Here, we demonstrate that a full-field illumination approach with multiple speckle illumination can also provide diffraction-limited optical-resolution photoacoustic images. Two different proofs of concept are demonstrated with micro-structured test samples. The first approach follows the principle of correlation/ghost imaging [1, 2], and is based on cross-correlating photoacoustic signals under multiple speckle illumination with known speckle patterns measured during a calibration step. The second approach is a speckle scanning microscopy technique, which adapts the technique proposed in fluorescence microscopy by Bertolotti et al. [3]: in our work, spatially unresolved photoacoustic measurements are performed for various translations of unknown speckle patterns. A phase-retrieval algorithm is used to reconstruct the object from the knowledge of the modulus of its Fourier transform yielded by the measurements. Because speckle patterns naturally appear in many situations, including propagation through biological tissue or multi-mode fibers (for which focusing light is very demanding, if not impossible), speckle-illumination-based photoacoustic microscopy provides a powerful framework for the development of novel reconstruction approaches, well suited to compressed sensing [2].

  37. Global High Resolution Sea Surface Flux Parameters From Multiple Satellites

    Zhang, H.; Reynolds, R. W.; Shi, L.; Bates, J. J.

    2007-05-01

    Advances in understanding the coupled air-sea system and modeling of the ocean and atmosphere demand increasingly higher resolution data, such as air-sea fluxes of up to 3 hourly and every 50 km. These observational requirements can only be met by utilizing multiple satellite observations. Generation of such high resolution products from multiple-satellite and in-situ observations on an operational basis has been started at the U.S. National Oceanic and Atmospheric Administration (NOAA) National Climatic Data Center. Here we describe a few products that are directly related to the computation of turbulent air-sea fluxes. Sea surface wind speed has been observed from in-situ instruments and multiple satellites, with long-term observations ranging from one satellite in mid-1987 to six or more satellites since mid-2002. A blended product with a global 0.25° grid and four snapshots per day has been produced for July 1987 to present, using a near-Gaussian 3-D (x, y, t) interpolation to minimize aliases. Wind direction has been observed from fewer satellites; thus, for the blended high resolution vector winds and wind stresses, the directions are taken from the NCEP Reanalysis 2 (run operationally in near real time) for climate consistency. The widely used Reynolds Optimum Interpolation SST analysis has been improved with higher resolutions (daily and 0.25°). The improvements use both infrared and microwave satellite data that are bias-corrected by in-situ observations for the period 1985 to present. The new versions provide very significant improvements in terms of resolving ocean features such as the meandering of the Gulf Stream, the Agulhas Current, the equatorial jets and other fronts. The Ta and Qa retrievals are based on measurements from the AMSU sounder onboard the NOAA satellites. The Ta retrieval uses AMSU-A data, while the Qa retrieval uses both AMSU-A and AMSU-B observations. The retrieval algorithms are developed using the neural network approach. Training

  38. Space volcano observatory (SVO): a metric resolution system on-board a micro/mini-satellite

    Briole, P.; Cerutti-Maori, G.; Kasser, M.

    2017-11-01

    1500 volcanoes on the Earth are potentially active; one third of them have been active during this century, and about 70 are presently erupting. At the beginning of the third millennium, 10% of the world population will be living in areas directly threatened by volcanoes, without considering the effects of eruptions on climate or air traffic, for example. The understanding of volcanic eruptions, a major challenge in geoscience, demands continuous monitoring of active volcanoes. The only way to provide global, continuous, real-time and all-weather information on volcanoes is to set up a Space Volcano Observatory closely connected to the ground observatories. Spaceborne observations are mandatory and complement the ground-based ones, as well as airborne ones, which can be implemented on only a limited set of volcanoes. The SVO goal is to monitor both the deformations and the changes in thermal radiance at optical wavelengths from high temperature surfaces of active volcanic zones. For that, we propose to map at high resolution (1 to 1.5 m pixel size) the topography (stereoscopic observation) and the thermal anomalies (pixel-integrated temperatures above 450°C) of active volcanic areas over scenes of 6 x 6 km to 12 x 12 km, large enough for monitoring most of the target features. A return time of 1 to 3 days will allow monitoring useful for hazard mitigation. The paper will present the concept of the optical payload, compatible with a micro/mini-satellite (mass in the range 100-400 kg); budgets will be given for the use of the Proteus platform in the minisatellite approach, and also for the CNES microsatellite platform family. This kind of design could be used for other applications requiring high resolution imagery of a limited zone, such as military purposes, GIS, or an evolution cadaster…

  19. Facile and high spatial resolution ratio-metric luminescence thermal mapping in microfluidics by near infrared excited upconversion nanoparticles

    Wang, Yu; Li, Shunbo; Wen, Weijia; Cao, Wenbin

    2016-02-01

    A local area temperature monitor is important for precise control of chemical and biological processes in microfluidics. In this work, we developed a facile method to realize micron-scale spatial resolution temperature mapping in a microfluidic channel quickly and cost-effectively. Based on the temperature-dependent fluorescence emission of NaYF₄:Yb³⁺, Er³⁺ upconversion nanoparticles (UCNPs) under near-infrared irradiation, ratio-metric imaging of UCNP-doped polydimethylsiloxane can map the detailed temperature distribution in the channel. Unlike some reported strategies that utilize a temperature-sensitive organic dye (such as Rhodamine) to achieve thermal sensing, our method is chemically inert and physically stable, without performance degradation in long-term operation. Moreover, this method can be easily scaled up or down, since the spatial and temperature resolution is determined by the optical imaging system. Our method supplies a simple and efficient solution for temperature mapping on heterogeneous surfaces where the use of an infrared thermal camera is limited.
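
    As an illustration of the ratio-metric principle, the sketch below inverts a Boltzmann-type calibration for two thermally coupled Er³⁺ emission bands; the calibration constants are assumed for the example, not taken from the paper:

```python
import numpy as np

KB = 8.617e-5          # Boltzmann constant, eV/K
A, DE = 8.0, 0.09      # assumed calibration amplitude and energy gap (eV)

def temperature_map(I_525, I_545):
    """Convert two emission-band intensity images into a temperature map (K),
    inverting the Boltzmann relation R = A * exp(-DE / (KB * T))."""
    R = I_525 / I_545                   # band ratio, pixel by pixel
    return DE / (KB * np.log(A / R))

# Toy 4x4 intensity images standing in for the two upconversion bands
rng = np.random.default_rng(1)
I545 = rng.uniform(900, 1100, (4, 4))
T_true = 300 + 20 * rng.random((4, 4))
I525 = I545 * A * np.exp(-DE / (KB * T_true))
print(np.round(temperature_map(I525, I545) - T_true, 6))  # ~0 residuals
```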

  20. A multiple decision support metrics method for effective risk-informed asset management

    Liming, J.K.; Salter, J.E.

    2004-01-01

    The objective of this paper is to provide electric utilities with a concept for developing and applying effective decision support metrics via integrated risk-informed asset management (RIAM) programs for power stations and generating companies. RIAM is a process by which analysts review historical performance and develop predictive logic models and data analyses to predict critical decision support figures-of-merit (or metrics) for generating station managers and electric utility company executives. These metrics include, but are not limited to, the following: profitability, net benefit, benefit-to-cost ratio, projected return on investment, projected revenue, projected costs, asset value, safety (catastrophic facility damage frequency and consequences, etc.), power production availability (capacity factor, etc.), efficiency (heat rate), and others. RIAM applies probabilistic safety assessment (PSA) techniques and generates predictions in a probabilistic way so that metrics information can be supplied to managers in terms of probability distributions as well as point estimates. This enables the managers to apply the concept of 'confidence levels' in their critical decision-making processes. (authors)
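
    The probabilistic flavor of such metrics can be illustrated with a minimal Monte Carlo sketch; the benefit and cost distributions are hypothetical stand-ins, not utility data:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo trials

# Hypothetical uncertain inputs for one asset-management option (M$):
benefit = rng.lognormal(mean=np.log(12.0), sigma=0.25, size=N)
cost = rng.normal(loc=8.0, scale=1.0, size=N)

net_benefit = benefit - cost
bcr = benefit / cost  # benefit-to-cost ratio

# Report metrics as distributions, enabling "confidence level" statements
for name, x in [("net benefit (M$)", net_benefit), ("B/C ratio", bcr)]:
    p5, p50, p95 = np.percentile(x, [5, 50, 95])
    print(f"{name}: median={p50:.2f}, 90% interval=({p5:.2f}, {p95:.2f})")
print(f"P(net benefit > 0) = {np.mean(net_benefit > 0):.3f}")
```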

  1. Comparison of Deep Learning With Multiple Machine Learning Methods and Metrics Using Diverse Drug Discovery Data Sets.

    Korotcov, Alexandru; Tkachenko, Valery; Russo, Daniel P; Ekins, Sean

    2017-12-04

    Machine learning methods have been applied to many data sets in pharmaceutical research for several decades. The relative ease and availability of fingerprint-type molecular descriptors paired with Bayesian methods resulted in the widespread use of this approach for a diverse array of end points relevant to drug discovery. Deep learning is the latest machine learning algorithm attracting attention for many pharmaceutical applications, from docking to virtual screening. Deep learning is based on an artificial neural network with multiple hidden layers and has found considerable traction for many artificial intelligence applications. We have previously suggested the need for a comparison of different machine learning methods with deep learning across an array of varying data sets that is applicable to pharmaceutical research. End points relevant to pharmaceutical research include absorption, distribution, metabolism, excretion, and toxicity (ADME/Tox) properties, as well as activity against pathogens and drug discovery data sets. In this study, we have used data sets for solubility, probe-likeness, hERG, KCNQ1, bubonic plague, Chagas, tuberculosis, and malaria to compare different machine learning methods using FCFP6 fingerprints. These data sets represent whole cell screens, individual proteins, physicochemical properties, as well as a data set with a complex end point. Our aim was to assess whether deep learning offered any improvement in testing when assessed using an array of metrics including AUC, F1 score, Cohen's kappa, Matthews correlation coefficient, and others. Based on ranked normalized scores for the metrics or data sets, Deep Neural Networks (DNN) ranked higher than SVM, which in turn was ranked higher than all the other machine learning methods. Visualizing these properties for training and test sets using radar-type plots indicates when models are inferior or perhaps overtrained. These results also suggest the need for assessing deep learning further.
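
    The metric panel used in comparisons like this can be reproduced with scikit-learn; the sketch below uses synthetic data and an SVM as stand-ins for the fingerprint data sets and model zoo of the study:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import (roc_auc_score, f1_score,
                             cohen_kappa_score, matthews_corrcoef)

# Stand-in binary data; the study used FCFP6 fingerprints instead.
X, y = make_classification(n_samples=600, n_features=128, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = SVC(probability=True, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)
prob = clf.predict_proba(X_te)[:, 1]

scores = {
    "AUC": roc_auc_score(y_te, prob),
    "F1": f1_score(y_te, pred),
    "Cohen's kappa": cohen_kappa_score(y_te, pred),
    "MCC": matthews_corrcoef(y_te, pred),
}
print({k: round(v, 3) for k, v in scores.items()})
```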

  2. The Combined Quantification and Interpretation of Multiple Quantitative Magnetic Resonance Imaging Metrics Enlightens Longitudinal Changes Compatible with Brain Repair in Relapsing-Remitting Multiple Sclerosis Patients.

    Bonnier, Guillaume; Maréchal, Benedicte; Fartaria, Mário João; Falkowskiy, Pavel; Marques, José P; Simioni, Samanta; Schluep, Myriam; Du Pasquier, Renaud; Thiran, Jean-Philippe; Krueger, Gunnar; Granziera, Cristina

    2017-01-01

    Quantitative and semi-quantitative MRI (qMRI) metrics provide complementary specificity and differential sensitivity to pathological brain changes compatible with brain inflammation, degeneration, and repair. Moreover, advanced magnetic resonance imaging (MRI) metrics with overlapping elements amplify the true tissue-related information and limit measurement noise. In this work, we combined multiple advanced MRI parameters to assess focal and diffuse brain changes over 2 years in a group of early-stage relapsing-remitting MS patients. Thirty relapsing-remitting MS patients with less than 5 years disease duration and nine healthy subjects underwent 3T MRI at baseline and after 2 years, including T1, T2, T2* relaxometry, and magnetization transfer imaging. To assess longitudinal changes in normal-appearing (NA) tissue and lesions, we used analyses of variance and Bonferroni correction for multiple comparisons. Multivariate linear regression was used to assess the correlation between clinical outcome and multiparametric MRI changes in lesions and NA tissue. In patients, we measured a significant longitudinal decrease of mean T2 relaxation times in NA white matter (p = 0.005) and a decrease of T1 relaxation times in the pallidum (p < 0.05), compatible with edema reabsorption and/or iron deposition. No longitudinal changes in qMRI metrics were observed in controls. In MS lesions, we measured a decrease in T1 relaxation time (p-value < 2.2e−16) and a significant increase in MTR (p-value < 1e−6), suggesting repair mechanisms such as remyelination, increased axonal density, and/or gliosis. Last, the evolution of advanced MRI metrics, and not changes in lesions or brain volume, was correlated with the evolution of motor and cognitive test scores (Adj-R2 > 0.4, p < 0.05). In summary, the combination of multiple advanced MRI metrics provided evidence of changes compatible with focal and diffuse brain repair at early MS stages, as suggested by histopathological studies.

  3. On using multiple routing metrics with destination sequenced distance vector protocol for MultiHop wireless ad hoc networks

    Mehic, M.; Fazio, P.; Voznak, M.; Partila, P.; Komosny, D.; Tovarek, J.; Chmelikova, Z.

    2016-05-01

    A mobile ad hoc network is a collection of mobile nodes which communicate without a fixed backbone or centralized infrastructure. Due to the frequent mobility of nodes, routes connecting two distant nodes may change. Therefore, it is not possible to establish a priori fixed paths for message delivery through the network. Because of its importance, routing is the most studied problem in mobile ad hoc networks. In addition, if Quality of Service (QoS) is demanded, one must guarantee the QoS not only over a single hop but over an entire wireless multi-hop path, which may not be a trivial task. In turn, this requires the propagation of QoS information within the network. The key to the support of QoS reporting is QoS routing, which provides path QoS information at each source. To support QoS for real-time traffic, one needs to know not only the minimum delay on the path to the destination but also the bandwidth available on it. Throughput, end-to-end delay, and routing overhead are the traditional performance metrics used to evaluate the performance of a routing protocol. To obtain additional information about a link, most link-quality metrics are based on calculating link loss probabilities by broadcasting probe packets. In this paper, we address the problem of including multiple routing metrics in existing routing packets that are broadcast through the network. We evaluate the efficiency of this approach with a modified version of the DSDV routing protocol in the ns-3 simulator.
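
    A common example of a probe-based link-quality metric of the kind described is the expected transmission count (ETX); the sketch below computes it from hypothetical probe statistics (a generic illustration, not the metric set used in the paper):

```python
def etx(probes_sent, acks_received, probes_heard, probes_expected):
    """Expected Transmission Count for one link from broadcast probes.

    df = forward delivery ratio (our probes acknowledged by the neighbour)
    dr = reverse delivery ratio (neighbour probes we actually heard)
    ETX = 1 / (df * dr): expected transmissions per delivered packet.
    """
    df = acks_received / probes_sent
    dr = probes_heard / probes_expected
    return 1.0 / (df * dr)

# A 10-probe window on two links: the second is lossier, so a DSDV variant
# carrying this metric would prefer the first even at equal hop counts.
print(round(etx(10, 9, 9, 10), 2))   # ~1.23
print(round(etx(10, 6, 7, 10), 2))   # ~2.38
```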

  4. The Combined Quantification and Interpretation of Multiple Quantitative Magnetic Resonance Imaging Metrics Enlightens Longitudinal Changes Compatible with Brain Repair in Relapsing-Remitting Multiple Sclerosis Patients

    Guillaume Bonnier

    2017-09-01

    Full Text Available Objective: Quantitative and semi-quantitative MRI (qMRI) metrics provide complementary specificity and differential sensitivity to pathological brain changes compatible with brain inflammation, degeneration, and repair. Moreover, advanced magnetic resonance imaging (MRI) metrics with overlapping elements amplify the true tissue-related information and limit measurement noise. In this work, we combined multiple advanced MRI parameters to assess focal and diffuse brain changes over 2 years in a group of early-stage relapsing-remitting MS patients. Methods: Thirty relapsing-remitting MS patients with less than 5 years disease duration and nine healthy subjects underwent 3T MRI at baseline and after 2 years, including T1, T2, T2* relaxometry, and magnetization transfer imaging. To assess longitudinal changes in normal-appearing (NA) tissue and lesions, we used analyses of variance and Bonferroni correction for multiple comparisons. Multivariate linear regression was used to assess the correlation between clinical outcome and multiparametric MRI changes in lesions and NA tissue. Results: In patients, we measured a significant longitudinal decrease of mean T2 relaxation times in NA white matter (p = 0.005) and a decrease of T1 relaxation times in the pallidum (p < 0.05), which are compatible with edema reabsorption and/or iron deposition. No longitudinal changes in qMRI metrics were observed in controls. In MS lesions, we measured a decrease in T1 relaxation time (p-value < 2.2e−16) and a significant increase in MTR (p-value < 1e−6), suggesting repair mechanisms, such as remyelination, increased axonal density, and/or gliosis. Last, the evolution of advanced MRI metrics, and not changes in lesions or brain volume, was correlated with the evolution of motor and cognitive test scores (Adj-R2 > 0.4, p < 0.05). In summary, the combination of multiple advanced MRI metrics provided evidence of changes compatible with focal and diffuse brain repair at early MS stages, as suggested by histopathological studies.

  5. Sub-metric Resolution FWI of Ultra-High-Frequency Marine Reflection Seismograms. A Remote Sensing Tool for the Characterisation of Shallow Marine Geohazard

    Provenzano, G.; Vardy, M. E.; Henstock, T.; Zervos, A.

    2017-12-01

    A quantitative high-resolution physical model of the top 100 meters of the sub-seabed is of key importance for a wide range of shallow geohazard scenarios: identification of potential shallow landsliding, monitoring of gas storage sites, and assessment of offshore structure stability. Currently, engineering-scale sediment characterisation relies heavily on direct sampling of the seabed and in-situ measurements. Such an approach is expensive and time-consuming, as well as liable to alter the sediment properties during the coring process. As opposed to reservoir-scale seismic exploration, ultra-high-frequency (UHF, 0.2-4.0 kHz) multi-channel marine reflection seismic data are most often limited to a semi-quantitative interpretation of the reflection amplitudes and facies geometries, leaving largely unexploited their intrinsic value as a remote characterisation tool. In this work, we develop a seismic inversion methodology to obtain a robust sub-metric resolution elastic model from limited-offset, limited-bandwidth UHF seismic reflection data, with minimal pre-processing and limited a priori information. The Full Waveform Inversion is implemented as a stochastic optimiser based upon a Genetic Algorithm, modified in order to improve robustness against inaccurate starting model populations. Multiple independent runs are used to create a robust posterior model distribution and quantify the uncertainties on the solution. The methodology has been applied to complex synthetic examples and to real datasets acquired in areas prone to shallow landsliding. The inverted elastic models show a satisfactory match with the ground truths and a good sensitivity to relevant variations in sediment texture and saturation state. We apply the methodology to a range of synthetic consolidating slopes under different loading conditions and sediment property distributions. Our work demonstrates that the seismic inversion of UHF data has the potential to become an important

  6. The Combined Quantification and Interpretation of Multiple Quantitative Magnetic Resonance Imaging Metrics Enlightens Longitudinal Changes Compatible with Brain Repair in Relapsing-Remitting Multiple Sclerosis Patients

    Bonnier, G.; Marechal, B.; Fartaria, M.J.; Marques, J.P.; Simioni, S.; Schluep, M.; Du Pasquier, R.; Thiran, J.-P.; Krueger, G.; Granziera, C.

    2017-01-01

    Objective: Quantitative and semi-quantitative MRI (qMRI) metrics provide complementary specificity and differential sensitivity to pathological brain changes compatible with brain inflammation, degeneration and repair. Moreover, advanced MRI metrics with overlapping elements amplify the true

  7. Global spatially explicit CO2 emission metrics at 0.25° horizontal resolution for forest bioenergy

    Cherubini, F.

    2015-12-01

    Bioenergy is the most important renewable energy option in studies designed to align with future RCP projections, reaching approximately 250 EJ/yr in RCP2.6, 145 EJ/yr in RCP4.5 and 180 EJ/yr in RCP8.5 by the end of the 21st century. However, many questions surrounding the direct carbon cycle and climate response to bioenergy remain partially unexplored. Bioenergy systems are largely assessed under the default climate-neutrality assumption, and the time lag between CO2 emissions from biomass combustion and CO2 uptake by vegetation is usually ignored. Emission metrics of CO2 from forest bioenergy are only available on a case-specific basis, and their quantification requires processing of a wide spectrum of modelled or observed local climate and forest conditions. On the other hand, emission metrics are widely used to aggregate climate impacts of greenhouse gases to common units such as CO2-equivalents (CO2-eq.), but a spatially explicit analysis of emission metrics with global forest coverage is today lacking. Examples of emission metrics include the global warming potential (GWP), the global temperature change potential (GTP) and the absolute sustained emission temperature (aSET). Here, we couple a global forest model, a heterotrophic respiration model, and a global climate model to produce global spatially explicit emission metrics for CO2 emissions from forest bioenergy. We show their applications to global emissions in 2015 and until 2100 under the different RCP scenarios. We obtain global average values of 0.49 ± 0.03 kgCO2-eq. kgCO2⁻¹ (mean ± standard deviation) for GWP, 0.05 ± 0.05 kgCO2-eq. kgCO2⁻¹ for GTP, and 2.14·10⁻¹⁴ ± 0.11·10⁻¹⁴ °C (kg yr⁻¹)⁻¹ for aSET. We also present results aggregated at grid, national and continental levels. The metrics are found to correlate with the site-specific turnover times and local climate variables like annual mean temperature and precipitation. Simplified

  8. Disaster metrics: quantitative benchmarking of hospital surge capacity in trauma-related multiple casualty events.

    Bayram, Jamil D; Zuabi, Shawki; Subbarao, Italo

    2011-06-01

    Hospital surge capacity in multiple casualty events (MCE) is the core of hospital medical response, and an integral part of the total medical capacity of the community affected. To date, however, there has been no consensus regarding the definition or quantification of hospital surge capacity. The first objective of this study was to quantitatively benchmark the various components of hospital surge capacity pertaining to the care of critically and moderately injured patients in trauma-related MCE. The second objective was to illustrate the applications of those quantitative parameters in local, regional, national, and international disaster planning; in the distribution of patients to various hospitals by prehospital medical services; and in the decision-making process for ambulance diversion. A 2-step approach was adopted in the methodology of this study. First, an extensive literature search was performed, followed by mathematical modeling. Quantitative studies on hospital surge capacity for trauma injuries were used as the framework for our model. The North Atlantic Treaty Organization triage categories (T1-T4) were used in the modeling process for simplicity. Hospital Acute Care Surge Capacity (HACSC) was defined as the maximum number of critical (T1) and moderate (T2) casualties a hospital can adequately care for per hour, after recruiting all possible additional medical assets. HACSC was modeled as the number of emergency department beds (#EDB) divided by the emergency department time (EDT): HACSC = #EDB/EDT. In trauma-related MCE, the EDT was quantitatively benchmarked at 2.5 hours. Because most of the critical and moderate casualties arrive at hospitals within a 6-hour period and require admission (by definition), the hospital bed surge capacity must match the HACSC at 6 hours to ensure coordinated care, and it was mathematically benchmarked at 18% of the staffed hospital bed capacity. Defining and quantitatively benchmarking the
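
    The HACSC and bed-surge benchmarks reduce to one-line formulas; the sketch below applies them to a hypothetical hospital:

```python
def hacsc(num_ed_beds, ed_time_hours=2.5):
    """Hospital Acute Care Surge Capacity: the maximum number of critical (T1)
    and moderate (T2) casualties a hospital can care for per hour.
    EDT defaults to the benchmarked 2.5 hours."""
    return num_ed_beds / ed_time_hours

def bed_surge_benchmark(staffed_beds, fraction=0.18):
    """Benchmarked hospital bed surge capacity: 18% of staffed bed capacity."""
    return fraction * staffed_beds

# Hypothetical hospital: 20 ED beds, 400 staffed beds
print(hacsc(20))                 # 8.0 T1/T2 casualties per hour
print(hacsc(20) * 6)             # 48.0 casualties over the 6-h arrival window
print(bed_surge_benchmark(400))  # 72.0 surge beds
```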

  9. Radio and X-ray observations of a multiple impulsive solar burst with high time resolution

    Kosugi, T.

    1981-01-01

    A well-developed multiple impulsive microwave burst occurred on February 17, 1979, simultaneously with a hard X-ray burst and a large group of type III bursts at metric wavelengths. The whole event is composed of several subgroups of elementary spike bursts. Detailed comparisons between these three classes of emissions at a high time resolution of ≈ 0.5 s reveal that individual type III bursts coincide in time with corresponding elementary X-ray and microwave spike bursts. This suggests that the non-thermal electron pulse generating a type III spike burst is produced simultaneously with those responsible for the corresponding hard X-ray and microwave spike bursts. The characteristic rise and decay time scales of the elementary spike burst are << 1 s, ≈ 1 s, and ≈ 3 s for type III, hard X-ray, and microwave emissions, respectively. Radio interferometric observations made at 17 GHz reveal that the spatial structure varies from one subgroup to another while remaining unchanged within a subgroup. The spectral evolution of the microwave burst seems to be closely related to the spatial evolution. The spatial evolution, together with the spectral evolution, suggests that the electron-accelerating region shifts to a different location after staying at one location for several tens of seconds, the duration of a subgroup of elementary spike bursts. We discuss several requirements for a model of the impulsive burst which follow from these observational results, and propose a migrating double-source model. (orig.)

  10. Reproducibility of R-fMRI metrics on the impact of different strategies for multiple comparison correction and sample sizes.

    Chen, Xiao; Lu, Bin; Yan, Chao-Gan

    2018-01-01

    Concerns regarding the reproducibility of resting-state functional magnetic resonance imaging (R-fMRI) findings have been raised. Little is known about how to operationally define R-fMRI reproducibility and to what extent it is affected by multiple comparison correction strategies and sample size. We comprehensively assessed two aspects of reproducibility, test-retest reliability and replicability, on widely used R-fMRI metrics in both between-subject contrasts of sex differences and within-subject comparisons of eyes-open and eyes-closed (EOEC) conditions. We noted that a permutation test with Threshold-Free Cluster Enhancement (TFCE), a strict multiple comparison correction strategy, reached the best balance between the family-wise error rate (under 5%) and test-retest reliability/replicability (e.g., 0.68 for test-retest reliability and 0.25 for replicability of the amplitude of low-frequency fluctuations (ALFF) for between-subject sex differences, and 0.49 for replicability of ALFF for within-subject EOEC differences). Although R-fMRI indices attained moderate reliabilities, they replicated poorly in distinct datasets (replicability < 0.3 for between-subject sex differences, < 0.5 for within-subject EOEC differences). By randomly drawing different sample sizes from a single site, we found that reliability, sensitivity and positive predictive value (PPV) rose as sample size increased. Small sample sizes (e.g., < 80 [40 per group]) not only minimized power (sensitivity < 2%), but also decreased the likelihood that significant results reflect "true" effects (PPV < 0.26) in sex differences. Our findings have implications for how to select multiple comparison correction strategies and highlight the importance of sufficiently large sample sizes in R-fMRI studies to enhance reproducibility. Hum Brain Mapp 39:300-318, 2018. © 2017 Wiley Periodicals, Inc.
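
    The idea of permutation-based family-wise error control can be sketched as follows; this uses a simple max-statistic permutation test as a stand-in (TFCE additionally applies cluster enhancement to the statistic maps), with toy data:

```python
import numpy as np

def max_stat_permutation(a, b, n_perm=5000, seed=0):
    """Two-sample permutation test across many voxels with max-statistic
    family-wise error control.

    a, b: (n_subjects, n_voxels) arrays for the two groups.
    Returns FWE-corrected p-values per voxel.
    """
    rng = np.random.default_rng(seed)
    data = np.vstack([a, b])
    n_a = len(a)
    obs = a.mean(0) - b.mean(0)
    max_null = np.empty(n_perm)
    for i in range(n_perm):
        perm = rng.permutation(len(data))
        d = data[perm[:n_a]].mean(0) - data[perm[n_a:]].mean(0)
        max_null[i] = np.abs(d).max()       # max over voxels controls FWE
    return (max_null[None, :] >= np.abs(obs)[:, None]).mean(1)

# Toy example: a group difference injected into the first 5 of 50 "voxels"
rng = np.random.default_rng(1)
g1 = rng.normal(0, 1, (20, 50)); g1[:, :5] += 1.5
g2 = rng.normal(0, 1, (20, 50))
p = max_stat_permutation(g1, g2, n_perm=2000)
print((p < 0.05).nonzero()[0])  # expected: mostly indices 0..4
```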

  11. Improving the spatial resolution of the multiple multiwire proportional chamber gamma camera

    Bateman, J.E.; Connolly, J.F.

    1978-03-01

    Results are presented showing how the spatial resolution of the multiple multiwire proportional chamber (MMPC) gamma camera may be improved. Under the best conditions 1.6 mm bars can be resolved. (author)

  12. In-depth analysis of protein inference algorithms using multiple search engines and well-defined metrics.

    Audain, Enrique; Uszkoreit, Julian; Sachsenberg, Timo; Pfeuffer, Julianus; Liang, Xiao; Hermjakob, Henning; Sanchez, Aniel; Eisenacher, Martin; Reinert, Knut; Tabb, David L; Kohlbacher, Oliver; Perez-Riverol, Yasset

    2017-01-06

    In mass spectrometry-based shotgun proteomics, protein identifications are usually the desired result. However, most of the analytical methods are based on the identification of reliable peptides and not the direct identification of intact proteins. Thus, assembling peptides identified from tandem mass spectra into a list of proteins, referred to as protein inference, is a critical step in proteomics research. Currently, different protein inference algorithms and tools are available to the proteomics community. Here, we evaluated five software tools for protein inference (PIA, ProteinProphet, Fido, ProteinLP, MSBayesPro) using three popular database search engines: Mascot, X!Tandem, and MS-GF+. All the algorithms were evaluated using a highly customizable KNIME workflow on four different public datasets with varying complexities (different sample preparations, species and analytical instruments). We defined a set of quality control metrics to evaluate the performance of each combination of search engine, protein inference algorithm, and parameters on each dataset. We show that the results for complex samples vary not only regarding the actual numbers of reported protein groups but also concerning the actual composition of groups. Furthermore, the robustness of reported proteins when using databases of differing complexities is strongly dependent on the applied inference algorithm. Finally, merging the identifications of multiple search engines does not necessarily increase the number of reported proteins, but does increase the number of peptides per protein and thus can generally be recommended. Protein inference is one of the major challenges in MS-based proteomics nowadays. Currently, there is a vast number of protein inference algorithms and implementations available to the proteomics community. Protein assembly impacts the final results of the research, the quantitation values, and the final claims of the research manuscript. Even though protein

  13. Analyzing Snowpack Metrics Over Large Spatial Extents Using Calibrated, Enhanced-Resolution Brightness Temperature Data and Long Short Term Memory Artificial Neural Networks

    Norris, W.; J Q Farmer, C.

    2017-12-01

    Snow water equivalence (SWE) is a difficult metric to measure accurately over large spatial extents: SNOTEL sites are too localized, and traditional remotely sensed brightness temperature data are at too coarse a resolution to capture variation. The new Calibrated Enhanced-Resolution Brightness Temperature (CETB) data from the National Snow and Ice Data Center (NSIDC) offer remotely sensed brightness temperature data at an enhanced resolution of 3.125 km versus the original 25 km, which allows large spatial extents to be analyzed with reduced uncertainty compared to the 25 km product. The 25 km brightness temperature data proved useful in past research: one group found decreasing trends in SWE outweighed increasing trends three to one in North America; other researchers used the data to incorporate winter conditions, like snow cover, into ecological zoning criteria. With the new 3.125 km data, it is possible to derive more accurate metrics for SWE, since we have far more spatial variability in measurements. Even with higher resolution data, using the 37 - 19 GHz frequencies to estimate SWE distorts the data during times of melt onset and accumulation onset. Past researchers employed statistical splines, while other successful attempts utilized non-parametric curve fitting to smooth out spikes distorting metrics. In this work, rather than using legacy curve fitting techniques, a Long Short Term Memory (LSTM) Artificial Neural Network (ANN) was trained to perform curve fitting on the data. LSTM ANNs have shown great promise in modeling time series data, and with almost 40 years of data available (14,235 days) there is plenty of training data for the ANN. LSTMs are ideal for this type of time series analysis because they allow important trends to persist for long periods of time but ignore short-term fluctuations; since LSTMs have poor mid- to short-term memory, they are ideal for smoothing out the large spikes generated in the melt
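
    A minimal version of LSTM-based curve smoothing might look like the following PyTorch sketch, trained on synthetic noisy curves; the architecture and data are illustrative assumptions, not the authors' network:

```python
import torch
import torch.nn as nn

# Minimal LSTM smoother: trained on (noisy -> clean) synthetic SWE-like curves.
class LSTMSmoother(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):              # x: (batch, time, 1)
        out, _ = self.lstm(x)
        return self.head(out)          # (batch, time, 1)

torch.manual_seed(0)
t = torch.linspace(0, 1, 200)
clean = torch.clamp(torch.sin(3.14 * t), min=0).repeat(64, 1).unsqueeze(-1)
spikes = (torch.rand_like(clean) < 0.05).float()       # sparse spike mask
noisy = clean + 0.3 * spikes * torch.randn_like(clean)

model = LSTMSmoother()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for epoch in range(200):               # short illustrative training loop
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(noisy), clean)
    loss.backward()
    opt.step()
print(f"final MSE: {loss.item():.5f}")
```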

  14. METRIC context unit architecture

    Simpson, R.O.

    1988-01-01

    METRIC is an architecture for a simple but powerful Reduced Instruction Set Computer (RISC). Its speed comes from the simultaneous processing of several instruction streams, with instructions from the various streams being dispatched into METRIC's execution pipeline as they become available for execution. The pipeline is thus kept full, with a mix of instructions for several contexts in execution at the same time. True parallel programming is supported within a single execution unit, the METRIC Context Unit. METRIC's architecture provides for expansion through the addition of multiple Context Units and of specialized Functional Units. The architecture thus spans a range of size and performance from a single-chip microcomputer up through large and powerful multiprocessors. This research concentrates on the specification of the METRIC Context Unit at the architectural level. Performance tradeoffs made during METRIC's design are discussed, and projections of METRIC's performance are made based on simulation studies.

  15. Improving the Reliability of Optimised Link State Routing in a Smart Grid Neighbour Area Network based Wireless Mesh Network Using Multiple Metrics

    Yakubu Tsado

    2017-02-01

    Full Text Available Reliable communication is the backbone of advanced metering infrastructure (AMI). Within the AMI, the neighbourhood area network (NAN) transports a multitude of traffic, each with unique requirements. In order to deliver an acceptable level of reliability and latency, the underlying network, such as the wireless mesh network (WMN), must provide or guarantee the quality-of-service (QoS) level required by the respective application traffic. Existing WMN routing protocols, such as optimised link state routing (OLSR), typically utilise a single metric and do not consider the requirements of individual traffic; hence, packets are delivered on a best-effort basis. This paper presents a QoS-aware WMN routing technique that employs multiple metrics in OLSR optimal path selection for AMI applications. The problems arising from this approach are non-deterministic polynomial time (NP) complete in nature, and were solved through the combined use of the analytical hierarchy process (AHP) algorithm and pruning techniques. For smart meters transmitting Internet Protocol (IP) packets of varying sizes at different intervals, the proposed technique considers the constraints of the NAN and the applications' traffic characteristics. The technique was developed by combining multiple OLSR path selection metrics with the AHP algorithm in ns-2. Compared with the conventional link metric in OLSR, the results show improvements of about 23% and 45% in latency and Packet Delivery Ratio (PDR), respectively, in a 25-node grid NAN.
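
    The AHP step referred to above derives metric weights from a pairwise-comparison matrix via its principal eigenvector; the judgement values in this sketch are illustrative assumptions:

```python
import numpy as np

# AHP weight derivation: pairwise-comparison matrix over three OLSR metrics
# (e.g., delay, loss, throughput). The judgements below are illustrative:
# delay vs loss = 3 (delay moderately more important), delay vs throughput = 5.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                      # principal eigenvector -> priority weights

# Consistency ratio check (RI = 0.58 for random 3x3 matrices)
n = len(A)
CI = (eigvals[k].real - n) / (n - 1)
CR = CI / 0.58
print("weights:", np.round(w, 3), " CR:", round(CR, 3))  # CR < 0.1 acceptable
```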

  16. Quantitative comparison using Generalized Relative Object Detectability (G-ROD) metrics of an amorphous selenium detector with high resolution Microangiographic Fluoroscopes (MAF) and standard flat panel detectors (FPD).

    Russ, M; Shankar, A; Jain, A; Setlur Nagesh, S V; Ionita, C N; Scott, C; Karim, K S; Bednarek, D R; Rudin, S

    2016-02-27

    A novel amorphous selenium (a-Se) direct detector with CMOS readout has been designed, and relative detector performance investigated. The detector features include a 25 μm pixel pitch and a 1000 μm thick a-Se layer operating at a 10 V/μm bias field. A simulated detector DQE was determined and used in comparative calculations of the Relative Object Detectability (ROD) family of prewhitening matched-filter (PWMF) observer and non-prewhitening matched-filter (NPWMF) observer model metrics, to gauge a-Se detector performance against existing high resolution micro-angiographic fluoroscopic (MAF) detectors and a standard flat panel detector (FPD). The PWMF-ROD or ROD metric compares two x-ray imaging detectors in their relative abilities to image a given object by taking the integral over spatial frequencies of the detector DQE weighted by the Fourier transform of an object function, divided by the comparable integral for a different detector. The generalized ROD (G-ROD) metric incorporates clinically relevant parameters (focal-spot size, magnification, and scatter) to show the degradation in imaging performance for detectors that are part of an imaging chain. Preliminary ROD calculations using simulated spheres as the object predicted superior imaging performance by the a-Se detector as compared to existing detectors. New PWMF-G-ROD and NPWMF-G-ROD results still indicate better performance by the a-Se detector in an imaging chain over all sphere sizes for various focal spot sizes and magnifications, although the a-Se performance advantages were degraded by focal spot blurring. Nevertheless, the a-Se technology has great potential to provide breakthrough abilities such as visualization of fine details, including neuro-vascular perforator vessels and small vascular devices.
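
    The ROD computation reduces to a ratio of object-weighted DQE integrals; the DQE curves and object weighting below are invented shapes for illustration, not measured detector data:

```python
import numpy as np

def rod(dqe_a, dqe_b, obj_weight, freqs):
    """Relative Object Detectability of detector A versus detector B: the
    object-weighted integral of DQE_A over spatial frequency divided by the
    same integral for DQE_B (values > 1 favour detector A)."""
    df = freqs[1] - freqs[0]
    num = np.sum(dqe_a(freqs) * obj_weight(freqs)) * df
    den = np.sum(dqe_b(freqs) * obj_weight(freqs)) * df
    return num / den

f = np.linspace(0.01, 20.0, 2000)                # spatial frequency, cycles/mm

dqe_se = lambda x: 0.6 * np.exp(-x / 15.0)       # invented a-Se-like DQE
dqe_fpd = lambda x: 0.7 * np.exp(-x / 3.0)       # invented FPD-like DQE
obj = lambda x: np.exp(-((x - 8.0) ** 2) / 8.0)  # small-object weighting

print(round(rod(dqe_se, dqe_fpd, obj, f), 2))    # > 1 favours the a-Se detector
```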

  17. Hadron multiplicity as the limit of jet multiplicity at high resolution

    Lupia, S.; Ochs, W. [Max-Planck-Institut fuer Physik, Muenchen (Germany). Werner-Heisenberg-Institut

    1998-05-01

    Recently, exact numerical results from the evolution equation for parton multiplicities in QCD jets have been obtained. A comparison with various approximate results is presented. A good description is obtained not only of the jet multiplicities measured at LEP-1 but also of the hadron multiplicities for c.m.s. energies above 1.6 GeV in e⁺e⁻ annihilation. The solution suggests that a final state hadron can be represented by a jet in the limit of a small (nonperturbative) k⊥ cut-off Q₀. In this description, using as adjustable parameters only the QCD scale Λ and the cut-off Q₀, the coupling αₛ can be seen to rise towards large values above unity at low energies. (orig.). 8 refs.

  18. Hadron multiplicity as the limit of jet multiplicity at high resolution

    Lupia, S.; Ochs, W.

    1998-01-01

    Recently, exact numerical results from the evolution equation for parton multiplicities in QCD jets have been obtained. A comparison with various approximate results is presented. A good description is obtained not only of the jet multiplicities measured at LEP-1 but also of the hadron multiplicities for c.m.s. energies above 1.6 GeV in e⁺e⁻ annihilation. The solution suggests that a final state hadron can be represented by a jet in the limit of a small (nonperturbative) k⊥ cut-off Q₀. In this description, using as adjustable parameters only the QCD scale Λ and the cut-off Q₀, the coupling αₛ can be seen to rise towards large values above unity at low energies. (orig.)

  19. Novel method of simultaneous multiple immunogold localization on resin sections in high resolution scanning electron microscopy

    Nebesářová, Jana; Wandrol, P.; Vancová, Marie

    2016-01-01

    Vol. 12, No. 1 (2016), pp. 105-517. ISSN 1549-9634 R&D Projects: GA TA ČR(CZ) TE01020118 Institutional support: RVO:60077344 Keywords: multiple immunolabeling * gold nanoparticles * high resolution SEM * STEM imaging * BSE imaging Subject RIV: EA - Cell Biology Impact factor: 5.720, year: 2016

  20. On a possible use of multiple Bragg reflections for high-resolution monochromatization of neutrons

    Mikula, Pavol; Vrána, Miroslav; Wagner, V.

    2004-01-01

    Vol. 350 (2004), pp. e667-e670. ISSN 0921-4526 R&D Projects: GA ČR GA202/03/0891 Keywords: neutron diffraction * multiple reflections * high-resolution monochromator Subject RIV: BM - Solid Matter Physics; Magnetism Impact factor: 0.679, year: 2004

  1. Super-Resolution Enhancement From Multiple Overlapping Images: A Fractional Area Technique

    Michaels, Joshua A.

    With the availability of large quantities of relatively low-resolution data from several decades of spaceborne imaging, methods of creating an accurate, higher-resolution image from multiple lower-resolution images (i.e. super-resolution) have been developed almost since such imagery has been around. The fractional-area super-resolution technique developed in this thesis has never before been documented. Satellite orbits, like Landsat's, have a quantifiable variation, which means each image is not centered on the exact same spot more than once, and the overlapping information from these multiple images may be used for super-resolution enhancement. By splitting a single initial pixel into many smaller, desired pixels, a relationship can be created between them using the ratio of the area within the initial pixel. The ideal goal for this technique is to obtain smaller pixels with exact values and no error, yielding a better potential result than those methods that yield interpolated pixel values with a consequential loss of spatial resolution. A Fortran 95 program was developed to perform all calculations associated with the fractional-area super-resolution technique. The fractional areas are calculated using traditional trigonometry and coordinate geometry, and the Linear Algebra Package (LAPACK; Anderson et al., 1999) is used to solve for the higher-resolution pixel values. In order to demonstrate proof of concept, a synthetic dataset was created using the intrinsic Fortran random number generator and Adobe Illustrator CS4 (for geometry). To test the real-life application, digital pictures were taken of a large US geological map with a Sony DSC-S600 digital point-and-shoot camera on a tripod under fluorescent lighting. While the fractional-area super-resolution technique works in perfect synthetic conditions, it did not successfully produce a reasonable or consistent solution in the digital photograph enhancement test. The prohibitive amount of processing time (up to
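
    The core linear-algebra step can be illustrated with a 1-D analogue, where "fractional areas" become fractional overlap lengths and numpy's lstsq stands in for LAPACK; the grid sizes and shifts are arbitrary choices for the demonstration:

```python
import numpy as np

# 1-D analogue of the fractional-area technique: each low-res pixel value is
# the fractional-length weighted sum of the high-res cells it overlaps, and
# multiple sub-pixel-shifted images supply the equations.
def overlap(lo, hi, cell):
    """Overlap length between LR pixel [lo, hi] and HR cell [cell, cell+1]."""
    return max(0.0, min(hi, cell + 1.0) - max(lo, cell))

n_hr = 6                 # unknown high-res cells (unit width)
lr_width = 1.5           # LR pixel width: a non-integer multiple of HR cells
shifts = [0.0, 0.25, 0.5, 0.75, 1.0, 1.25]

rng = np.random.default_rng(3)
x_true = rng.uniform(0, 255, n_hr)

rows, b = [], []
for s in shifts:
    start = s
    while start + lr_width <= n_hr:
        areas = [overlap(start, start + lr_width, c) for c in range(n_hr)]
        rows.append(areas)
        b.append(np.dot(areas, x_true))   # simulated LR measurement
        start += lr_width

A = np.array(rows)                        # overdetermined, full column rank
x_hat, *_ = np.linalg.lstsq(A, np.array(b), rcond=None)
print(np.abs(x_hat - x_true).max())       # ~1e-13: exact in noise-free case
```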

  2. Use of a line-pair resolution phantom for comprehensive quality assurance of electronic portal imaging devices based on fundamental imaging metrics

    Gopal, Arun; Samant, Sanjiv S.

    2009-01-01

    Image guided radiation therapy solutions based on megavoltage computed tomography (MVCT) involve the extension of electronic portal imaging devices (EPIDs) from their traditional role of weekly localization imaging and planar dose mapping to volumetric imaging for 3D setup and dose verification. To sustain the potential advantages of MVCT, EPIDs are required to provide improved levels of portal image quality. Therefore, it is vital that the performance of EPIDs in clinical use is maintained at an optimal level through regular and rigorous quality assurance (QA). Traditionally, portal imaging QA has been carried out by imaging calibrated line-pair and contrast resolution phantoms and obtaining arbitrarily defined QA indices that are usually dependent on imaging conditions and merely indicate relative trends in imaging performance. They are not adequately sensitive to all aspects of image quality, unlike fundamental imaging metrics such as the modulation transfer function (MTF), noise power spectrum (NPS), and detective quantum efficiency (DQE), which are widely used to characterize detector performance in radiographic imaging and would be ideal for QA purposes. However, due to the difficulty of performing conventional MTF measurements, they have not been used for routine clinical QA. The authors present a simple and quick QA methodology based on obtaining the MTF, NPS, and DQE of a megavoltage imager by imaging standard open fields and a bar-pattern QA phantom containing 2 mm thick tungsten line-pair bar resolution targets. Our bar-pattern based MTF measurement features a novel zero-frequency normalization scheme that eliminates normalization errors typically associated with traditional bar-pattern measurements at megavoltage x-ray energies. The bar-pattern QA phantom and open-field images are used in conjunction with an automated image analysis algorithm that quickly computes the MTF, NPS, and DQE of an EPID system. Our approach combines the fundamental advantages of
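
    The three metrics are linked by a standard relation, DQE(f) = S̄² · MTF²(f) / (q̄ · NPS(f)), where S̄ is the large-area mean signal and q̄ the incident photon fluence; a minimal sketch with assumed curves (not the authors' measured data) is:

```python
import numpy as np

def dqe(mtf, nps, mean_signal, fluence):
    """Spatial-frequency-dependent detective quantum efficiency:
    DQE(f) = mean_signal**2 * MTF(f)**2 / (fluence * NPS(f))."""
    return (mean_signal ** 2) * mtf ** 2 / (fluence * nps)

f = np.linspace(0.02, 1.0, 50)    # spatial frequency, cycles/mm
mtf = np.sinc(f)                  # assumed MTF shape for the example
nps = np.full_like(f, 2.5e-5)     # assumed flat (white) noise power spectrum
print(np.round(dqe(mtf, nps, mean_signal=0.01, fluence=250.0)[:5], 4))
```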

  3. Single image super-resolution using locally adaptive multiple linear regression.

    Yu, Soohwan; Kang, Wonseok; Ko, Seungyong; Paik, Joonki

    2015-12-01

    This paper presents a regularized super-resolution (SR) reconstruction method using locally adaptive multiple linear regression to overcome the limited spatial resolution of digital images. In order to make the SR problem better posed, the proposed method incorporates the locally adaptive multiple linear regression into the regularization process as a local prior. The local regularization prior assumes that the target high-resolution (HR) pixel is generated by a linear combination of similar pixels in differently scaled patches and optimum weight parameters. In addition, we adapt a modified version of the nonlocal means filter as a smoothness prior to utilize the patch redundancy. Experimental results show that the proposed algorithm restores HR images better than existing state-of-the-art methods in terms of the most commonly used objective measures in the literature.

  4. Multiple-image hiding using super resolution reconstruction in high-frequency domains

    Li, Xiao-Wei; Zhao, Wu-Xiang; Wang, Jun; Wang, Qiong-Hua

    2017-12-01

    In this paper, a robust multiple-image hiding method using computer-generated integral imaging and a modified super-resolution reconstruction algorithm is proposed. In our work, the host image is first transformed into frequency domains by cellular automata (CA); to assure the quality of the stego-image, the secret images are embedded into the CA high-frequency domains. The proposed method has the following advantages: (1) robustness to geometric attacks because of the memory-distributed property of elemental images, and (2) increased quality of the reconstructed secret images, as the scheme utilizes the modified super-resolution reconstruction algorithm. The simulation results show that the proposed multiple-image hiding method outperforms other similar hiding methods and is robust to attacks such as Gaussian noise and JPEG compression.

  5. MULTI-SCALE SEGMENTATION OF HIGH RESOLUTION REMOTE SENSING IMAGES BY INTEGRATING MULTIPLE FEATURES

    Y. Di

    2017-05-01

    Full Text Available Most multi-scale segmentation algorithms are not aimed at high resolution remote sensing images and have difficulty communicating and using information across layers. In view of this, we propose a method for multi-scale segmentation of high resolution remote sensing images by integrating multiple features. First, the Canny operator is used to extract edge information, and then a band-weighted distance function is built to obtain the edge weight. According to the criterion, the initial segmentation objects of color images can be obtained by the Kruskal minimum spanning tree algorithm. Finally, segmented images are obtained by an adaptive Mumford–Shah region merging rule combined with spectral and texture information. The proposed method is evaluated precisely using analog images and ZY-3 satellite images through quantitative and qualitative analysis. The experimental results show that the proposed multi-scale segmentation of high resolution remote sensing images by integrating multiple features outperforms the eCognition fractal net evolution approach (FNEA) in accuracy and is slightly inferior to FNEA in efficiency.

  6. Semantic metrics

    Hu, Bo; Kalfoglou, Yannis; Dupplaw, David; Alani, Harith; Lewis, Paul; Shadbolt, Nigel

    2006-01-01

    In the context of the Semantic Web, many ontology-related operations, e.g. ontology ranking, segmentation, alignment, articulation, reuse, evaluation, can be boiled down to one fundamental operation: computing the similarity and/or dissimilarity among ontological entities, and in some cases among ontologies themselves. In this paper, we review standard metrics for computing distance measures and we propose a series of semantic metrics. We give a formal account of semantic metrics drawn from a...

  7. Study on the Spatial Resolution of Single and Multiple Coincidences Compton Camera

    Andreyev, Andriy; Sitek, Arkadiusz; Celler, Anna

    2012-10-01

    In this paper we study the image resolution that can be obtained from the Multiple Coincidences Compton Camera (MCCC). The principle of MCCC is based on the simultaneous acquisition of several gamma-rays emitted in cascade from a single nucleus. Contrary to a standard Compton camera, MCCC can theoretically provide the exact location of a radioactive source (based only on the identification of the intersection point of three cones created by a single decay), without complicated tomographic reconstruction. However, practical implementation of the MCCC approach encounters several problems, such as low detection sensitivities, which result in a very low probability of the coincident triple gamma-ray detection necessary for source localization. It is also important to evaluate how the detection uncertainties (finite energy and spatial resolution) influence the identification of the intersection of three cones, and thus the resulting image quality. In this study we investigate how the spatial resolution of images reconstructed using the triple-cone reconstruction (TCR) approach compares to that of images reconstructed from the same data using a standard iterative method based on single cones. Results show that the FWHM for a point source reconstructed with TCR was 20-30% higher than that obtained from standard iterative reconstruction based on the expectation maximization (EM) algorithm and conventional single-cone Compton imaging. Finite energy and spatial resolutions of the MCCC detectors lead to errors in conical surface definitions ("thick" conical surfaces) which are only amplified in image reconstruction when the intersection of three cones is being sought. Our investigations show that, in spite of being conceptually appealing, the identification of the triple-cone intersection constitutes yet another restriction of the multiple coincidence approach which limits the image resolution that can be obtained with MCCC and the TCR algorithm.

  8. Experimental demonstration of producing high resolution zone plates by spatial-frequency multiplication

    Yun, W.B.; Howells, M.R.

    1987-01-01

    In an earlier publication, the possibility of producing high resolution zone plates for x-ray applications by spatial-frequency multiplication was analyzed theoretically. The theory predicted that for a daughter zone plate generated from the interference of the mth and nth diffraction orders of a parent zone plate, the primary focal spot size and focal length are each one (m + n)th of their counterparts for the parent zone plate. It was also shown that a zone plate with an outermost zone width as small as 13.8 nm might be produced by this technique. In this paper, we report an experiment which we carried out with laser light (λ = 4166 Å) to demonstrate this technique. In addition, an outlook for producing high resolution zone plates for x-ray applications is briefly discussed.

  9. Canonical resolution of the multiplicity problem for U(3): an explicit and complete constructive solution

    Biedenharn, L.C.; Lohe, M.A.; Louck, J.D.

    1975-01-01

    The multiplicity problem for tensor operators in U(3) has a unique (canonical) resolution which is utilized to effect the explicit construction of all U(3) Wigner and Racah coefficients. Methods are employed which elucidate the structure of the results; in particular, the significance of the denominator functions entering the structure of these coefficients, and the relation of these denominator functions to the null space of the canonical tensor operators. An interesting feature of the denominator functions is the appearance of new, group theoretical, polynomials exhibiting several remarkable and quite unexpected properties. (U.S.)

  10. High-Resolution Printing of 3D Structures Using an Electrohydrodynamic Inkjet with Multiple Functional Inks.

    An, Byeong Wan; Kim, Kukjoo; Lee, Heejoo; Kim, So-Yun; Shim, Yulhui; Lee, Dae-Young; Song, Jun Yeob; Park, Jang-Ung

    2015-08-05

    Electrohydrodynamic-inkjet-printed high-resolution complex 3D structures with multiple functional inks are demonstrated. Printed 3D structures can have a variety of fine patterns, such as vertical or helix-shaped pillars and straight or rounded walls, with high aspect ratios (greater than ≈50) and narrow diameters (≈0.7 μm). Furthermore, the formation of freestanding, bridge-like Ag wire structures on plastic substrates suggests substantial potential for high-precision, flexible 3D interconnects. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. ePRISM: A case study in multiple proxy and mixed temporal resolution integration

    Robinson, Marci M.; Dowsett, Harry J.

    2010-01-01

    As part of the Pliocene Research, Interpretation and Synoptic Mapping (PRISM) Project, we present the ePRISM experiment, designed 1) to provide climate modelers with a reconstruction of an early Pliocene warm period that was warmer than the PRISM interval (∼3.3 to 3.0 Ma), yet still similar in many ways to modern conditions, and 2) to provide an example of how best to integrate multiple-proxy sea surface temperature (SST) data from time series with varying degrees of temporal resolution and age control as we begin to build the next generation of PRISM, the PRISM4 reconstruction, spanning a constricted time interval. While it is possible to tie individual SST estimates to a single light (warm) oxygen isotope event, we find that the warm peak average of SST estimates over a narrowed time interval is preferable for paleoclimate reconstruction, as it allows for the inclusion of more records from multiple paleotemperature proxies.

  12. A Methodology for Multiple Rule System Integration and Resolution Within a Singular Knowledge Base

    Kautzmann, Frank N., III

    1988-01-01

    Expert systems which support knowledge representation by qualitative modeling techniques experience problems when called upon to support integrated views embodying description and explanation, especially when other factors such as multiple causality, competing rule model resolution, and multiple uses of knowledge representation are included. A series of prototypes is being developed to demonstrate the feasibility of automating the processes of systems engineering, design and configuration, and diagnosis and fault management. The study involves not only a generic knowledge representation; it must also support multiple views at varying levels of description and interaction between physical elements, systems, and subsystems. Moreover, it involves models of description and explanation for each level. This multiple-model feature requires the development of control methods between rule systems and heuristics on a meta-level for each expert system involved in an integrated and larger class of expert systems. The broadest possible category of interacting expert systems is described, along with a general methodology for the knowledge representation and control of mutually exclusive rule systems.

  13. Corrected multiple upsets and bit reversals for improved 1-s resolution measurements

    Brucker, G.J.; Stassinopoulos, E.G.; Stauffer, C.A.

    1994-01-01

    Previous work studied the generation of single and multiple errors in control and irradiated static RAM samples (Harris 6504RH), which were exposed to heavy ions for relatively long intervals of time (minutes) and read out only after the beam was shut off. The present investigation involved storing 4k x 1 bit maps every second during 1-min ion exposures at low flux rates of 10³ ions/cm²·s, in order to reduce the chance of two sequential ions upsetting adjacent bits. The data were analyzed for the presence of adjacent upset bit locations in the physical memory plane, which were previously defined to constitute multiple upsets. The improvement in the time resolution of these measurements has provided more accurate estimates of multiple upsets. The results indicate that the percentage of multiples decreased from a high of 17% in the previous experiment to less than 1% with this new experimental technique. Consecutive double and triple upsets (reversals of bits) were detected. These were caused by sequential ions hitting the same bit, with one or two reversals of state occurring in a 1-min run. In addition to these results, a status review of these same parts covering 3.5 years of imprint damage recovery is also presented
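
    The adjacency criterion for multiple upsets can be checked programmatically; the sketch below labels 4-connected clusters of upset bits in a toy memory plane (scipy's connected-component labelling stands in for whatever analysis the authors used):

```python
import numpy as np
from scipy.ndimage import label

def count_multiple_upsets(upset_map):
    """Group upset bits into events by physical adjacency (4-connectivity):
    flagged bits touching other flagged bits form one multiple-upset event;
    isolated bits are single upsets."""
    labels, n_events = label(upset_map)
    sizes = np.bincount(labels.ravel())[1:]   # bits per event (skip background)
    singles = int(np.sum(sizes == 1))
    multiples = int(np.sum(sizes > 1))
    return singles, multiples

# Toy 64x64 memory plane with sparse random upsets, as when 1-s read-outs
# keep the chance of two ions striking adjacent bits low
rng = np.random.default_rng(7)
plane = rng.random((64, 64)) < 0.01
s, m = count_multiple_upsets(plane)
print(f"single upsets: {s}, multiple-upset events: {m}")
```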

  14. Prediction of individual probabilities of livebirth and multiple birth events following in vitro fertilization (IVF): a new outcomes counselling tool for IVF providers and patients using HFEA metrics.

    Jones, Christopher A; Christensen, Anna L; Salihu, Hamisu; Carpenter, William; Petrozzino, Jeffrey; Abrams, Elizabeth; Sills, Eric Scott; Keith, Louis G

    2011-01-01

    In vitro fertilization (IVF) has become a standard treatment for subfertility after it was demonstrated to be of value to humans in 1978. However, the introduction of IVF into mainstream clinical practice has been accompanied by concerns regarding the number of multiple gestations that it can produce, as multiple births present significant medical consequences to mothers and offspring. When considering IVF as a treatment modality, a balance must be struck between the chance of having a live birth and the risk of having a multiple birth. As IVF is often a costly decision for patients (financially, medically, and emotionally), there is benefit in estimating a patient's specific chance that IVF could result in a birth as fertility treatment options are contemplated. Historically, a patient's "chance of success" with IVF has been approximated from institution-based statistics, rather than on the basis of any particular clinical parameter (except age). Furthermore, the likelihood of IVF resulting in a twin or triplet outcome must be acknowledged for each patient, given the known increased complications of multiple gestation and the consequent increased risk of poor birth outcomes. In this research, we describe a multivariate risk assessment model that incorporates metrics adapted from a national 7.5-year sampling of the Human Fertilisation & Embryology Authority (HFEA) dataset (1991-1998) to predict reproductive outcome (including estimation of multiple birth) after IVF. To our knowledge, http://www.formyodds.com is the first Software-as-a-Service (SaaS) application to predict IVF outcome. The approach also includes a confirmation functionality, where clinicians can agree or disagree with the computer-generated outcome predictions. It is anticipated that the emergence of predictive tools will augment the reproductive endocrinology consultation, improve the medical informed consent process by tailoring the outcome assessment to each patient, and reduce the potential for adverse

  15. Use of two population metrics clarifies biodiversity dynamics in large-scale monitoring: the case of trees in Japanese old-growth forests: the need for multiple population metrics in large-scale monitoring.

    Ogawa, Mifuyu; Yamaura, Yuichi; Abe, Shin; Hoshino, Daisuke; Hoshizaki, Kazuhiko; Iida, Shigeo; Katsuki, Toshio; Masaki, Takashi; Niiyama, Kaoru; Saito, Satoshi; Sakai, Takeshi; Sugita, Hisashi; Tanouchi, Hiroyuki; Amano, Tatsuya; Taki, Hisatomo; Okabe, Kimiko

    2011-07-01

    Many indicators/indices provide information on whether the 2010 biodiversity target of reducing declines in biodiversity has been achieved. The strengths and limitations of the various measures used to assess progress toward such targets are now being discussed. Biodiversity dynamics are often evaluated by a single biological population metric, such as the abundance of each species. Here we examined tree population dynamics of 52 families (192 species) at 11 research sites (three vegetation zones) in Japanese old-growth forests using two population metrics: number of stems and basal area. We calculated indices that track the rate of change across all tree species by taking the geometric mean of changes in population metrics between the 1990s and the 2000s at the national level and at the levels of the vegetation zone and family. We specifically focused on whether indices based on these two metrics behaved similarly. The indices showed that (1) the number of stems declined, whereas basal area did not change at the national level, and (2) the degree of change in the indices varied by vegetation zone and family. These results suggest that Japanese old-growth forests have not degraded and may even be developing in some vegetation zones, and indicate that the use of a single population metric (or indicator/index) may be insufficient to precisely understand the state of biodiversity. It is therefore important to incorporate more metrics into monitoring schemes to overcome the risk of misunderstanding or misrepresenting biodiversity dynamics.
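
    The geometric-mean index used here is straightforward to compute; the species values below are hypothetical, chosen only to mirror the reported pattern (declining stems, stable basal area):

```python
import numpy as np

def geometric_mean_index(metric_t0, metric_t1):
    """Geometric mean, across species, of the ratio of a population metric
    between two periods; 1.0 means no net change."""
    ratios = np.asarray(metric_t1, float) / np.asarray(metric_t0, float)
    return float(np.exp(np.log(ratios).mean()))

# Hypothetical per-species values for the two census periods
stems_1990s = [120, 80, 46, 210, 15]
stems_2000s = [105, 78, 40, 198, 14]
basal_1990s = [3.1, 2.2, 0.9, 5.6, 0.4]   # m^2
basal_2000s = [3.2, 2.3, 0.9, 5.9, 0.4]

print(round(geometric_mean_index(stems_1990s, stems_2000s), 3))  # < 1: decline
print(round(geometric_mean_index(basal_1990s, basal_2000s), 3))  # ~1: stable
```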

  16. Metrics with vanishing quantum corrections

    Coley, A A; Hervik, S; Gibbons, G W; Pope, C N

    2008-01-01

    We investigate solutions of the classical Einstein or supergravity equations that solve any set of quantum-corrected Einstein equations in which the Einstein tensor plus a multiple of the metric is equated to a symmetric conserved tensor T_μν(g_αβ, ∂_τ g_αβ, ∂_τ ∂_σ g_αβ, ...) constructed from sums of terms involving contractions of the metric and powers of arbitrary covariant derivatives of the curvature tensor. A classical solution, such as an Einstein metric, is called universal if, when evaluated on that Einstein metric, T_μν is a multiple of the metric. A Ricci-flat classical solution is called strongly universal if, when evaluated on that Ricci-flat metric, T_μν vanishes. It is well known that pp-waves in four spacetime dimensions are strongly universal. We focus attention on a natural generalization: Einstein metrics with holonomy Sim(n - 2) in which all scalar invariants are zero or constant. In four dimensions we demonstrate that the generalized Ghanam-Thompson metric is weakly universal and that the Goldberg-Kerr metric is strongly universal; indeed, we show that universality extends to all four-dimensional Sim(2) Einstein metrics. We also discuss generalizations to higher dimensions

  17. Metric learning

    Bellet, Aurelien; Sebban, Marc

    2015-01-01

    Similarity between objects plays an important role in both human cognitive processes and artificial systems for recognition and categorization. How to appropriately measure such similarities for a given task is crucial to the performance of many machine learning, pattern recognition and data mining methods. This book is devoted to metric learning, a set of techniques for automatically learning similarity and distance functions from data, which has attracted a lot of interest in machine learning and related fields in the past ten years. In this book, we provide a thorough review of the metric learning literature.
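
    As a concrete (if deliberately crude) illustration of what "learning a distance function from data" means, the sketch below fits a diagonal Mahalanobis metric by down-weighting features with large within-class variance. This baseline is for illustration only and is not a method from the book.

```python
import numpy as np

def fit_diagonal_metric(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Learn a diagonal Mahalanobis metric that down-weights features
    with large within-class variance (a crude metric-learning baseline)."""
    within_var = np.zeros(X.shape[1])
    for label in np.unique(y):
        within_var += X[y == label].var(axis=0)
    return 1.0 / (within_var + 1e-9)   # weights w_k; d(x,z)^2 = sum w_k (x_k - z_k)^2

def distance(w: np.ndarray, x: np.ndarray, z: np.ndarray) -> float:
    return float(np.sqrt(np.sum(w * (x - z) ** 2)))

# Feature 2 varies wildly within each class, so the learned metric ignores it.
X = np.array([[0.0, 0.0], [0.1, 5.0], [1.0, 0.2], [1.1, 4.8]])
y = np.array([0, 0, 1, 1])
w = fit_diagonal_metric(X, y)
print(distance(w, X[0], X[1]), distance(w, X[0], X[2]))  # same class ends up closer
```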

  18. Single Image Super-Resolution Using Global Regression Based on Multiple Local Linear Mappings.

    Choi, Jae-Seok; Kim, Munchurl

    2017-03-01

    Super-resolution (SR) has become more vital because of its capability to generate high-quality ultra-high-definition (UHD) high-resolution (HR) images from low-resolution (LR) input images. Conventional SR methods entail high computational complexity, which makes them difficult to implement for up-scaling of full-high-definition input images into UHD-resolution images. Nevertheless, our previous super-interpolation (SI) method showed a good compromise between peak signal-to-noise ratio (PSNR) performance and computational complexity. However, since SI utilizes only simple linear mappings, it may fail to precisely reconstruct HR patches with complex texture. In this paper, we present a novel SR method, which inherits the large-to-small patch conversion scheme from SI but uses global regression based on local linear mappings (GLM). Thus, our new SR method is called GLM-SI. In GLM-SI, each LR input patch is divided into 25 overlapped subpatches. Next, based on the local properties of these subpatches, 25 different local linear mappings are applied to the current LR input patch to generate 25 HR patch candidates, which are then regressed into one final HR patch using a global regressor. The local linear mappings are learned cluster-wise in our off-line training phase. The main contribution of this paper is as follows: previously, linear-mapping-based conventional SR methods, including SI, applied only one simple yet coarse linear mapping to each patch to reconstruct its HR version. On the contrary, for each LR input patch, our GLM-SI is the first to apply a combination of multiple local linear mappings, where each local linear mapping is found according to the local properties of the current LR patch. Therefore, it can better approximate nonlinear LR-to-HR mappings for HR patches with complex texture. Experimental results show that the proposed GLM-SI method outperforms most of the state-of-the-art methods and shows comparable PSNR performance with much lower computational complexity.
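
    The core reconstruction step is easy to state in code: apply several pre-trained local linear mappings to the same LR patch and blend the candidates with a global regressor. The sketch below uses random matrices and a uniform blend purely as placeholders for the learned GLM-SI parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shapes: 25 pre-trained local linear mappings, each taking a
# flattened LR patch (d_lr) to an HR patch candidate (d_hr), plus a global
# regressor that blends the 25 candidates into one final HR patch.
d_lr, d_hr, n_local = 16, 64, 25
local_maps = [rng.normal(size=(d_hr, d_lr)) * 0.1 for _ in range(n_local)]
global_weights = np.full(n_local, 1.0 / n_local)  # stand-in for the learned regressor

def reconstruct_hr_patch(lr_patch: np.ndarray) -> np.ndarray:
    candidates = np.stack([M @ lr_patch for M in local_maps])  # (25, d_hr)
    return global_weights @ candidates  # global regression over the candidates

hr = reconstruct_hr_patch(rng.normal(size=d_lr))
print(hr.shape)  # (64,)
```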

  19. Old but Still Relevant: High Resolution Electrophoresis and Immunofixation in Multiple Myeloma.

    Misra, Aroonima; Mishra, Jyoti; Chandramohan, Jagan; Sharma, Atul; Raina, Vinod; Kumar, Rajive; Soni, Sushant; Chopra, Anita

    2016-03-01

    High-resolution electrophoresis (HRE) and immunofixation (IFX) of serum and urine are integral to the diagnostic work-up of multiple myeloma. Unusual electrophoresis patterns are common and may be misinterpreted. Though interpretation is primarily the responsibility of the hematopathologist, clinicians who are responsible for managing myelomas may benefit from knowledge of these patterns. In this review article we discuss the patterns and the importance of electrophoresis in the present-day scenario. Patterns of HRE and IFX seen in our laboratory over the past 15 years were studied. Monoclonal proteins are seen on HRE as sharply defined bands, sometimes two, lying anywhere from the γ- to the α-globulin regions on a background of normal, increased or decreased polyclonal γ-globulins, showing HRE to be a rapid and dependable method of detecting M-protein in serum or urine. Immunofixation complements HRE and, due to its greater sensitivity, is able to pick up small or light-chain bands not apparent on electrophoresis, including biclonal disease even when electrophoresis shows only one M-band. Special features liable to misinterpretation are discussed. Familiarity with the interpretation of the varied patterns seen in health and disease is essential for providing dependable laboratory support in the management of multiple myeloma.

  20. MrTADFinder: A network modularity based approach to identify topologically associating domains in multiple resolutions.

    Koon-Kiu Yan

    2017-07-01

    Genome-wide proximity-ligation-based assays such as Hi-C have revealed that eukaryotic genomes are organized into structural units called topologically associating domains (TADs). From a visual examination of the chromosomal contact map, however, it is clear that the organization of the domains is not simple or obvious. Instead, TADs exhibit various length scales and, in many cases, a nested arrangement. Here, by exploiting the resemblance between TADs in a chromosomal contact map and densely connected modules in a network, we formulate TAD identification as a network optimization problem and propose an algorithm, MrTADFinder, to identify TADs from intra-chromosomal contact maps. MrTADFinder is based on the network-science concept of modularity. A key component is the derivation of an appropriate background model for contacts in a random chain, obtained by numerically solving a set of matrix equations. The background model preserves the observed coverage of each genomic bin as well as the distance dependence of the contact frequency for any pair of bins exhibited by the empirical map. Also, by introducing a tunable resolution parameter, MrTADFinder provides a self-consistent approach for identifying TADs at different length scales, hence the "Mr" in the name, standing for Multiple Resolutions. We then apply MrTADFinder to various Hi-C datasets. The identified domain boundaries are marked by characteristic signatures in chromatin marks and transcription factors (TFs) that are consistent with earlier work. Moreover, by calling TADs at different length scales, we observe that boundary signatures change with resolution, with different chromatin features having different characteristic length scales. Furthermore, we report an enrichment of HOT (high-occupancy target) regions near TAD boundaries and investigate the role of different TFs in determining boundaries at various resolutions. To further explore the interplay between TADs and epigenetic marks, as tumor mutational ...
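
    The modularity-with-resolution idea can be sketched compactly. Note that the null model below is the standard degree-product one, not the random-chain background that MrTADFinder actually derives; it is included only to show how the tunable parameter gamma shifts the preferred domain size.

```python
import numpy as np

def modularity(A: np.ndarray, labels: np.ndarray, gamma: float = 1.0) -> float:
    """Newman-style modularity of a domain assignment on a contact map A,
    with a resolution parameter gamma; larger gamma favors smaller domains.
    The degree-product null model here is a simplification of the
    distance-dependent background MrTADFinder derives."""
    k = A.sum(axis=1)
    two_m = A.sum()
    same = labels[:, None] == labels[None, :]          # same-domain mask
    return float(((A - gamma * np.outer(k, k) / two_m) * same).sum() / two_m)

# Toy 4-bin contact map with two dense blocks.
A = np.array([[0, 5, 1, 0], [5, 0, 1, 0], [1, 1, 0, 4], [0, 0, 4, 0]], float)
print(modularity(A, np.array([0, 0, 1, 1]), gamma=1.0))
```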

  1. Metrication manual

    Harper, A.F.A.; Digby, R.B.; Thong, S.P.; Lacey, F.

    1978-04-01

    In April 1978 a meeting of senior metrication officers convened by the Commonwealth Science Council of the Commonwealth Secretariat, was held in London. The participants were drawn from Australia, Bangladesh, Britain, Canada, Ghana, Guyana, India, Jamaica, Papua New Guinea, Solomon Islands and Trinidad and Tobago. Among other things, the meeting resolved to develop a set of guidelines to assist countries to change to SI and to compile such guidelines in the form of a working manual

  2. IGRINS NEAR-IR HIGH-RESOLUTION SPECTROSCOPY OF MULTIPLE JETS AROUND LkHα 234

    Oh, Heeyoung; Yuk, In-Soo; Park, Byeong-Gon; Park, Chan; Chun, Moo-Young; Kim, Kang-Min; Oh, Jae Sok; Jeong, Ueejeong; Yu, Young Sam; Lee, Jae-Joon; Kim, Hwihyun; Hwang, Narae; Lee, Sungho [Korea Astronomy and Space Science Institute, 776 Daedeok-daero, Yuseong-gu, Daejeon 305-348 (Korea, Republic of); Pyo, Tae-Soo [Subaru Telescope, National Astronomical Observatory of Japan, 650 North A’ohoku Place, Hilo, HI 96720 (United States); Pak, Soojong; Lee, Hye-In; Le, Huynh Anh Nguyen [School of Space Research and Institute of Natural Sciences, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of); Kaplan, Kyle; Pavel, Michael; Mace, Gregory, E-mail: hyoh@kasi.re.kr [Department of Astronomy, University of Texas at Austin, Austin, TX (United States); and others

    2016-02-01

    We present the results of high-resolution near-IR spectroscopy toward the multiple outflows around the Herbig Be star LkHα 234 using the Immersion Grating Infrared Spectrograph. Previous studies indicate that the region around LkHα 234 is complex, with several embedded young stellar objects and the outflows associated with them. In simultaneous H- and K-band spectra from HH 167, we detected 5 [Fe II] and 14 H₂ emission lines. We revealed a new [Fe II] jet driven by the radio continuum source VLA 3B. Position-velocity diagrams of the H₂ 1−0 S(1) λ2.122 μm line show multiple velocity peaks. The kinematics may be explained by a geometrical bow shock model. We detected a component of H₂ emission at the systemic velocity (V_LSR = −10.2 km s⁻¹) along the whole slit in all slit positions, which may arise from the ambient photodissociation region. Low-velocity gas close to the systemic velocity dominates the molecular hydrogen emission from knots A and B in HH 167; [Fe II] emission lines are detected farther from the systemic velocity, at V_LSR = −100 to −130 km s⁻¹. We infer that the H₂ emission arises from shocked gas entrained by a high-velocity outflow. Population diagrams of H₂ lines imply that the gas is thermalized at a temperature of 2500-3000 K and that the emission results from shock excitation.

  3. A New Conflict Resolution Method for Multiple Mobile Robots in Cluttered Environments With Motion-Liveness.

    Shahriari, Mohammadali; Biglarbegian, Mohammad

    2018-01-01

    This paper presents a new conflict resolution methodology for multiple mobile robots that ensures their motion-liveness, especially in cluttered and dynamic environments. Our method constructs a mathematical formulation in the form of an optimization problem, minimizing the overall travel times of the robots subject to resolving all the conflicts in their motion. This optimization problem can be solved easily by coordinating only the robots' speeds. To overcome the computational cost of executing the algorithm in very cluttered environments, we develop an innovative method that clusters the environment into independent subproblems that can be solved using parallel programming techniques. We demonstrate the scalability of our approach through extensive simulations. Simulation results showed that our proposed method is capable of resolving the conflicts of 100 robots in less than 1.23 s in a cluttered environment that has 4357 intersections in the paths of the robots. We also developed an experimental testbed and demonstrated that our approach can be implemented in real time. We finally compared our approach with other existing methods in the literature, both quantitatively and qualitatively. This comparison shows that, while our approach is mathematically sound, it is also more computationally efficient, scales to very large numbers of robots, and guarantees the live and smooth motion of robots.
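
    The speed-only coordination idea can be illustrated with a toy two-robot case: if a robot would reach a shared junction while a higher-priority robot still occupies it, scale its speed down just enough to arrive as the junction clears. This is an illustrative reduction, not the paper's full optimization formulation.

```python
def adjusted_speed(d_to_junction: float, v_nominal: float,
                   other_exit_time: float) -> float:
    """Coordinate only the speed: if this robot would reach a shared
    junction before a higher-priority robot has left it, slow down just
    enough to arrive as the junction clears (toy two-robot version of
    speed-only conflict resolution)."""
    arrival = d_to_junction / v_nominal
    if arrival < other_exit_time:                # occupancy windows overlap
        return d_to_junction / other_exit_time   # arrive exactly at clearance
    return v_nominal

# Robot B is 4 m from the junction at 2 m/s; robot A clears it at t = 3 s.
print(adjusted_speed(4.0, 2.0, 3.0))  # 1.333... m/s instead of 2 m/s
```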

  4. Association of the interferon signature metric with serological disease manifestations but not global activity scores in multiple cohorts of patients with SLE

    Kennedy, William P; Maciuca, Romeo; Wolslegel, Kristen; Tew, Wei; Abbas, Alexander R; Chaivorapol, Christina; Morimoto, Alyssa; McBride, Jacqueline M; Brunetta, Paul; Richardson, Bruce C; Davis, John C; Behrens, Timothy W; Townsend, Michael J

    2015-01-01

    Objectives: The interferon (IFN) signature (IS) in patients with systemic lupus erythematosus (SLE) includes over 100 genes induced by type I IFN pathway activation. We developed a method to quantify the IS using three genes—the IS metric (ISM)—and characterised the clinical characteristics of patients with SLE with different ISM status from multiple clinical trials. Methods: Blood microarray expression data from a training cohort of patients with SLE confirmed the presence of the IS and identified surrogate genes. We assayed these genes in a quantitative PCR (qPCR) assay, yielding an ISM from the IS. The association of ISM status with clinical disease characteristics was assessed in patients with extrarenal lupus and lupus nephritis from four clinical trials. Results: Three genes, HERC5, EPSTI and CMPK2, correlated well with the IS (r > 0.96) and composed the ISM qPCR assay. Using the 95th centile for healthy control data, patients with SLE from different studies were classified into two ISM subsets—ISM-Low and ISM-High—that are longitudinally stable over 36 weeks. Significant associations were identified between ISM-High status and higher titres of anti-dsDNA antibodies, the presence of anti-extractable nuclear antigen autoantibodies, elevated serum B cell activating factor of the tumour necrosis factor family (BAFF) levels, and hypocomplementaemia. However, measures of overall clinical disease activity were similar for the ISM-High and ISM-Low groups. Conclusions: The ISM is an IS biomarker that divides patients with SLE into two subpopulations—ISM-High and ISM-Low—with differing serological manifestations. The ISM does not distinguish between high and low disease activity, but may have utility in identifying patients more likely to respond to treatment(s) targeting IFN-α. Clinicaltrials.gov registration number NCT00962832. PMID:25861459
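
    Operationally, the classification rule described here is a score plus a healthy-control centile cutoff. The sketch below assumes the three normalized qPCR values are simply averaged, which is a placeholder for the paper's actual combination rule.

```python
import numpy as np

def ism_status(herc5: float, epsti: float, cmpk2: float,
               healthy_scores: np.ndarray) -> str:
    """Toy version of the three-gene interferon signature metric: average
    the (already normalized) qPCR expression of HERC5, EPSTI and CMPK2 and
    call ISM-High above the 95th centile of healthy controls. The actual
    normalization and combination rule may differ from this placeholder."""
    score = np.mean([herc5, epsti, cmpk2])
    cutoff = np.percentile(healthy_scores, 95)
    return "ISM-High" if score > cutoff else "ISM-Low"

healthy = np.random.default_rng(1).normal(1.0, 0.2, size=100)  # synthetic controls
print(ism_status(1.8, 1.6, 1.7, healthy))
```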

  5. Multiple Rapid Swallow Responses During Esophageal High-Resolution Manometry Reflect Esophageal Body Peristaltic Reserve

    Shaker, Anisa; Stoikes, Nathaniel; Drapekin, Jesse; Kushnir, Vladimir; Brunt, L. Michael; Gyawali, C. Prakash

    2014-01-01

    OBJECTIVES Dysphagia may develop following antireflux surgery as a consequence of poor esophageal peristaltic reserve. We hypothesized that suboptimal contraction response following multiple rapid swallows (MRS) could be associated with chronic transit symptoms following antireflux surgery. METHODS Wet swallow and MRS responses on esophageal high-resolution manometry (HRM) were characterized collectively in the esophageal body (distal contractile integral (DCI)), and individually in each smooth muscle contraction segment (S2 and S3 amplitudes) in 63 patients undergoing antireflux surgery and in 18 healthy controls. Dysphagia was assessed using symptom questionnaires. The MRS/wet swallow ratios were calculated for S2 and S3 peak amplitudes and DCI. MRS responses were compared in patients with and without late postoperative dysphagia following antireflux surgery. RESULTS Augmentation of smooth muscle contraction (MRS/wet swallow ratios > 1.0) as measured collectively by DCI was seen in only 11.1% with late postoperative dysphagia, compared with 63.6% in those with no dysphagia and 78.1% in controls (P≤0.02 for each comparison). Similar results were seen with S3 but not S2 peak amplitude ratios. Receiver operating characteristics identified a DCI MRS/wet swallow ratio threshold of 0.85 in segregating patients with late postoperative dysphagia from those with no postoperative dysphagia with a sensitivity of 0.67 and specificity of 0.64. CONCLUSIONS Lack of augmentation of smooth muscle contraction following MRS is associated with late postoperative dysphagia following antireflux surgery, suggesting that MRS responses could assess esophageal smooth muscle peristaltic reserve. Further research is warranted to determine if antireflux surgery needs to be tailored to the MRS response. PMID:24019081
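
    The quantity driving this result is a simple ratio with a published cutoff: the MRS DCI divided by the mean wet-swallow DCI, with values above 1.0 read as augmentation and 0.85 as the ROC-derived threshold for late postoperative dysphagia. A minimal sketch with illustrative input values:

```python
def mrs_augmentation(dci_mrs: float, dci_wet_mean: float,
                     threshold: float = 0.85) -> dict:
    """DCI-based MRS/wet-swallow ratio; >1.0 indicates augmentation of the
    contraction after multiple rapid swallows, and 0.85 is the ROC-derived
    cutoff reported for late postoperative dysphagia."""
    ratio = dci_mrs / dci_wet_mean
    return {"ratio": round(ratio, 2),
            "augmented": ratio > 1.0,
            "below_dysphagia_threshold": ratio < threshold}

# Illustrative DCI values in mmHg*s*cm; not taken from the study data.
print(mrs_augmentation(dci_mrs=1450.0, dci_wet_mean=2100.0))
```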

  6. Metrics for energy resilience

    Roege, Paul E.; Collier, Zachary A.; Mancillas, James; McDonagh, John A.; Linkov, Igor

    2014-01-01

    Energy lies at the backbone of any advanced society and constitutes an essential prerequisite for economic growth, social order and national defense. However, there is an Achilles' heel in today's energy-technology relationship, namely a precarious intimacy between energy and the fiscal, social, and technical systems it supports. Recently, widespread and persistent disruptions in energy systems have highlighted the extent of this dependence and the vulnerability of increasingly optimized systems to changing conditions. Resilience is an emerging concept that offers to reconcile considerations of performance under dynamic environments and across multiple time frames by supplementing traditionally static system performance measures with behaviors under changing conditions and complex interactions among physical, information and human domains. This paper identifies metrics useful for implementing guidance on energy-related planning, design, investment, and operation. Recommendations are presented in a matrix format to provide a structured and comprehensive framework of metrics relevant to a system's energy resilience. The study synthesizes previously proposed metrics and emergent resilience literature to provide a multi-dimensional model intended for use by leaders and practitioners as they transform our energy posture from one of stasis and reaction to one that is proactive and which fosters sustainable growth. - Highlights: • Resilience is the ability of a system to recover from adversity. • There is a need for methods to quantify and measure system resilience. • We developed a matrix-based approach to generate energy resilience metrics. • These metrics can be used in energy planning, system design, and operations.

  7. Multiple resolution chirp reflectometry for fault localization and diagnosis in a high voltage cable in automotive electronics

    Chang, Seung Jin; Lee, Chun Ku; Shin, Yong-June; Park, Jin Bae

    2016-12-01

    A multiple-chirp reflectometry system with a fault estimation process is proposed to obtain multiple resolutions and to measure the degree of a fault in a target cable. The multiple resolution algorithm has the ability to localize faults regardless of fault location. The time delay information, which is derived from the normalized cross-correlation between the incident signal and the bandpass-filtered reflected signals, is converted to a fault location and cable length. The in-phase and quadrature components are obtained by lowpass filtering of the mixed signal of the incident signal and the reflected signal. Based on the in-phase and quadrature components, the reflection coefficient is estimated by the proposed fault estimation process, including the mixing and filtering procedure. Also, the measurement uncertainty for this experiment is analyzed according to the Guide to the Expression of Uncertainty in Measurement. To verify the performance of the proposed method, we conduct comparative experiments to detect and measure faults under different conditions. The target cable length and fault position are designed considering the installation environment of the high-voltage cable used in an actual vehicle. To simulate the degree of fault, a variety of termination impedances (10 Ω, 30 Ω, 50 Ω, and 1 kΩ) are used and estimated by the proposed method in this experiment. The proposed method demonstrates advantages in that it has multiple resolutions to overcome the blind-spot problem and can assess the state of the fault.
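
    The localization step itself (delay estimation by cross-correlation, then conversion to distance via the propagation velocity) is compact enough to sketch. The correlation below is left unnormalized for brevity, and the sampling rate, velocity, and synthetic chirp are made-up values:

```python
import numpy as np

def locate_fault(incident: np.ndarray, reflected: np.ndarray,
                 fs: float, v_prop: float) -> float:
    """Estimate fault distance from the lag that maximizes the
    cross-correlation of the incident chirp with the reflected signal.
    A round-trip delay t corresponds to a distance v_prop * t / 2."""
    xcorr = np.correlate(reflected, incident, mode="full")
    lag = np.argmax(xcorr) - (len(incident) - 1)   # delay in samples
    return v_prop * (lag / fs) / 2.0

fs, v = 1e9, 2e8                        # 1 GS/s sampling; ~2/3 c in the cable
t = np.arange(0, 2e-6, 1 / fs)
chirp = np.sin(2 * np.pi * (1e6 + 5e11 * t) * t)   # synthetic linear chirp
reflected = np.concatenate([np.zeros(100), 0.3 * chirp])[:len(chirp)]
print(locate_fault(chirp, reflected, fs, v))        # ~10 m for a 100-sample delay
```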

  8. Latest Holocene Climate Variability revealed by a high-resolution multiple Proxy Record off Lisbon (Portugal)

    Abrantes, F.; Lebreiro, S.; Ferreira, A.; Gil, I.; Jonsdottir, H.; Rodrigues, T.; Kissel, C.; Grimalt, J.

    2003-04-01

    The North Atlantic Oscillation (NAO) is known to have a major influence on the wintertime climate of the Atlantic basin and surrounding countries, determining precipitation and wind conditions at mid-latitudes. A comparison of Hurrell's NAO index to the mean winter (January-March) discharge of the Iberian Tagus River reveals a good negative correlation (high discharge corresponding to negative NAO), while the years of largest upwelling anomalies, as reported in the literature, appear to be in good agreement with positive NAO. On this basis, a better understanding of the long-term variability of the NAO and of Atlantic climate variability can be gained from high-resolution climate records from the Lisbon area. Climate variability of the last 2,000 years is assessed through a multiple-proxy study of sedimentary sequences recovered from the Tagus prodelta deposition center, off Lisbon (Western Iberia). Physical properties, XRF and magnetic properties from core logging, grain size, δ18O, TOC, CaCO3, total alkenones, n-alkanes, alkenone SST, diatoms, and benthic and planktonic foraminiferal assemblage compositions and fluxes are the proxies employed. The age model for site D13902 is based on AMS 14C dates from mollusc and planktonic foraminifera shells, the reservoir correction for which was obtained by dating 3 pre-bomb mollusc shells from the study area. Preliminary results indicate Little Ice Age (LIA, 1300-1600 AD) alkenone-derived SSTs around 15 °C, followed by a sharp and rapid increase towards 19 °C. In spite of the strong variability observed in most records, this low-temperature interval is marked by a general increase in organic carbon, total alkenone concentration, and diatom and foraminiferal abundances, as well as an increase in the sediment fine fraction and XRF-determined Fe content, pointing to important river input and higher productivity. The Medieval Warm Period (1080-1300 AD) is characterized by 17-18 °C SSTs, increased mean grain size, but lower magnetic susceptibility and Fe ...

  9. Multiple and double scattering contributions to depth resolution and low energy background in hydrogen elastic recoil detection

    Wielunski, L S [Commonwealth Scientific and Industrial Research Organisation (CSIRO), Lindfield, NSW (Australia). Div. of Applied Physics

    1997-12-31

    The sensitivity of hydrogen elastic recoil detection (ERD) is usually limited by the low-energy background in the ERD spectrum. A number of 4.5 MeV He²⁺ hydrogen ERD spectra from different hydrogen-implanted samples are compared. The samples are chosen with different atomic numbers, from low Z (carbon) to high Z (tungsten carbide), to observe the effects of multiple scattering and double scattering within the sample material. The experimental depth resolution and levels of the low-energy background in ERD spectra are compared with theoretical predictions from multiple and double scattering. 10 refs., 2 tabs., 5 figs.

  11. Gaussian Multiple Instance Learning Approach for Mapping the Slums of the World Using Very High Resolution Imagery

    Vatsavai, Raju [ORNL

    2013-01-01

    In this paper, we present a computationally efficient algorithm based on multiple instance learning for mapping informal settlements (slums) using very high-resolution remote sensing imagery. From a remote sensing perspective, informal settlements share unique spatial characteristics that distinguish them from other urban structures like industrial, commercial, and formal residential settlements. However, regular pattern recognition and machine learning methods, which are predominantly single-instance or per-pixel classifiers, often fail to accurately map the informal settlements as they do not capture the complex spatial patterns. To overcome these limitations we employed a multiple-instance-based machine learning approach, where groups of contiguous pixels (image patches) are modeled as generated by a Gaussian distribution. We have conducted several experiments on very high-resolution satellite imagery representing four unique geographic regions across the world. Our method showed consistent improvement in accurately identifying informal settlements.
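
    A bare-bones version of the idea (fit one Gaussian per land-use class to the pixel features of training patches, then label a new patch by maximum likelihood) can be sketched as follows. The features, class means, and patch summarization below are toy stand-ins for the paper's actual formulation:

```python
import numpy as np

def fit_gaussian(patches: np.ndarray):
    """Model the pixel features of a class's training patches (the 'bags'
    of contiguous pixels) as draws from one multivariate Gaussian."""
    mu = patches.mean(axis=0)
    cov = np.cov(patches, rowvar=False) + 1e-6 * np.eye(patches.shape[1])
    return mu, cov

def log_likelihood(x: np.ndarray, mu: np.ndarray, cov: np.ndarray) -> float:
    d = x - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ np.linalg.solve(cov, d) + logdet
                   + len(x) * np.log(2 * np.pi))

rng = np.random.default_rng(0)
informal = rng.normal(0.2, 0.05, size=(200, 3))   # toy per-pixel features
formal = rng.normal(0.7, 0.05, size=(200, 3))
models = {name: fit_gaussian(X)
          for name, X in {"informal": informal, "formal": formal}.items()}

patch = rng.normal(0.22, 0.05, size=3)            # new patch, summarized as one vector
print(max(models, key=lambda k: log_likelihood(patch, *models[k])))
```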

  12. Detection of time-varying structures by large deformation diffeomorphic metric mapping to aid reading of high-resolution CT images of the lung.

    Ryo Sakamoto

    Objectives: To evaluate the accuracy of advanced non-linear registration of serial lung computed tomography (CT) images using large deformation diffeomorphic metric mapping (LDDMM). Methods: Fifteen cases of lung cancer with serial lung CT images (interval: 62.2±26.9 days) were used. After affine transformation, three-dimensional, non-linear volume registration was conducted using LDDMM with or without cascading elasticity control. Registration accuracy was evaluated by measuring the displacement of landmarks placed on vessel bifurcations for each lung segment. Subtraction images and Jacobian color maps, calculated from the transformation matrix derived from image warping, were generated and used to evaluate time-course changes of the tumors. Results: The average displacement of landmarks was 0.02±0.16 mm and 0.12±0.60 mm for proximal and distal landmarks after LDDMM transformation with cascading elasticity control, significantly smaller than the 3.11±2.47 mm and 3.99±3.05 mm, respectively, after affine transformation. Emerged or vanished nodules were visualized on subtraction images, and enlarging or shrinking nodules were displayed on Jacobian maps, enabled by highly accurate registration of the nodules using LDDMM. However, some residual misalignments were observed even with non-linear transformation when substantial changes existed between the image pairs. Conclusions: LDDMM provides accurate registration of serial lung CT images, and temporal subtraction images with Jacobian maps help radiologists to find changes in pulmonary nodules.
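
    The Jacobian map mentioned here has a direct numerical reading: the determinant of the deformation's spatial derivatives, with values above 1 marking local expansion (a growing nodule) and below 1 local shrinkage. A 2-D toy sketch on a synthetic radial expansion rather than a real LDDMM field:

```python
import numpy as np

def jacobian_determinant_2d(phi_x: np.ndarray, phi_y: np.ndarray) -> np.ndarray:
    """Jacobian determinant of a 2-D mapping (phi_x, phi_y) on a grid;
    >1 means local expansion, <1 local shrinkage."""
    dphix_dy, dphix_dx = np.gradient(phi_x)   # np.gradient: axis 0 = rows (y)
    dphiy_dy, dphiy_dx = np.gradient(phi_y)
    return dphix_dx * dphiy_dy - dphix_dy * dphiy_dx

n = 64
y, x = np.mgrid[0:n, 0:n].astype(float)
# Identity mapping plus a small radial expansion around the grid center.
r2 = (x - n / 2) ** 2 + (y - n / 2) ** 2
bump = 0.05 * np.exp(-r2 / 50.0)
jac = jacobian_determinant_2d(x * (1 + bump), y * (1 + bump))
print(jac.max(), jac.min())   # >1 near the center, ~1 far away
```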

  13. A novel approach for multiple mobile objects path planning: Parametrization method and conflict resolution strategy

    Ma, Yong; Wang, Hongwei; Zamirian, M.

    2012-01-01

    We present a new approach comprising two steps to determine conflict-free paths for mobile objects in two and three dimensions with moving obstacles. First, the shortest path of each object is set as the goal function, subject to a collision-avoidance criterion, path smoothness, and velocity and acceleration constraints. This problem is formulated as a calculus of variations problem (CVP). Using a parametrization method, the CVP is converted to time-varying nonlinear programming problems (TNLPP) and then solved. Second, the move sequence of each object is assigned by a priority scheme, and conflicts are resolved by a multilevel conflict resolution strategy. The efficiency of the approach is confirmed by numerical examples. -- Highlights: ► An approach with a parametrization method and a conflict resolution strategy is proposed. ► The approach fits multi-object path planning in two and three dimensions. ► Single-object path planning and multi-object conflict resolution are applied in order. ► The path of each object is obtained with the parametrization method in the first phase. ► Conflict-free paths are gained by multi-object conflict resolution in the second phase.

  14. A 2.9 ps equivalent resolution interpolating time counter based on multiple independent coding lines

    Szplet, R; Jachna, Z; Kwiatkowski, P; Rozyc, K

    2013-01-01

    We present the design, operation and test results of a time counter that has an equivalent resolution of 2.9 ps, a measurement uncertainty at the level of 6 ps, and a measurement range of 10 s. The time counter has been implemented in a general-purpose reprogrammable device, the Spartan-6 (Xilinx). To obtain both high precision and a wide measurement range, the counting of periods of a reference clock is combined with a two-stage interpolation within a single period of the clock signal. The interpolation involves a four-phase clock in the first interpolation stage (FIS) and an equivalent coding line (ECL) in the second interpolation stage (SIS). The ECL is created as a compound of independent discrete time coding lines (TCLs). The number of TCLs used to create the virtual ECL has an effect on its resolution. We tested ECLs made from up to 16 TCLs, but the idea may be extended to a larger number of lines. In the presented time counter, the coarse resolution of the counting method, equal to 2 ns (the period of the 500 MHz reference clock), is first improved fourfold in the FIS and then by more than a further 400 times in the SIS. The proposed solution allows us to overcome the technological limitation on achievable resolution and improve the precision of conversion of integrated interpolators based on tapped delay lines.

  15. Mixed-time parallel evolution in multiple quantum NMR experiments: sensitivity and resolution enhancement in heteronuclear NMR

    Ying Jinfa; Chill, Jordan H.; Louis, John M.; Bax, Ad

    2007-01-01

    A new strategy is demonstrated that simultaneously enhances sensitivity and resolution in three- or higher-dimensional heteronuclear multiple quantum NMR experiments. The approach, referred to as mixed-time parallel evolution (MT-PARE), utilizes evolution of the chemical shifts of the spins participating in the multiple quantum coherence in parallel, thereby reducing signal losses relative to sequential evolution. The signal in a given PARE dimension, t1, is of a non-decaying constant-time nature for a duration that depends on the length of t2, and vice versa, prior to the onset of conventional exponential decay. Line shape simulations for the ¹H-¹⁵N PARE indicate that this strategy significantly enhances both sensitivity and resolution in the indirect ¹H dimension, and that the unusual signal decay profile results in acceptable line shapes. Incorporation of the MT-PARE approach into a 3D HMQC-NOESY experiment for measurement of HN-HN NOEs in KcsA in SDS micelles at 50 °C was found to increase the experimental sensitivity by a factor of 1.7±0.3, with a concomitant resolution increase in the indirectly detected ¹H dimension. The method is also demonstrated for a situation in which homonuclear ¹³C-¹³C decoupling is required while measuring weak H3'-2'OH NOEs in an RNA oligomer.

  16. Metrics of quantum states

    Ma Zhihao; Chen Jingling

    2011-01-01

    In this work we study metrics of quantum states, which are natural generalizations of the usual trace metric and Bures metric. Some useful properties of the metrics are proved, such as joint convexity and contractivity under quantum operations. Our results have potential applications in studying the geometry of quantum states as well as in entanglement detection.

  17. A time resolving data acquisition system for multiple high-resolution position sensitive detectors

    Dimmler, D.G.

    1988-01-01

    An advanced time-resolving data collection system for use in neutron and x-ray spectrometry has been implemented and put into routine operation. The system collects data from high-resolution position-sensitive area detectors with a maximum cumulative rate of 10⁶ events per second. The events are sorted, in real time, into many time-slice arrays. A programmable timing control unit allows for a wide choice of time sequences and time-slice array sizes. The shortest dwell time on a slice may be below 1 ms, and the delay to switch between slices is zero.

  18. $\eta$-metric structures

    Gaba, Yaé Ulrich

    2017-01-01

    In this paper, we discuss recent results about generalized metric spaces and fixed point theory. We introduce the notion of $\eta$-cone metric spaces, give some topological properties and prove some fixed point theorems for contractive-type maps on these spaces. In particular, we show that these $\eta$-cone metric spaces are natural generalizations of both cone metric spaces and metric type spaces.

  19. High Resolution of Quantitative Traits Into Multiple Loci via Interval Mapping

    Jansen, Ritsert C.; Stam, Piet

    1994-01-01

    A very general method is described for multiple linear regression of a quantitative phenotype on genotype [putative quantitative trait loci (QTLs) and markers] in segregating generations obtained from line crosses. The method exploits two features: (a) the use of additional parental and F1 data, which fixes the joint QTL effects and the environmental error, and (b) the use of markers as cofactors, which reduces the genetic background noise. As a result, a significant increase of QTL detection...

  20. Object-Based Change Detection in Urban Areas from High Spatial Resolution Images Based on Multiple Features and Ensemble Learning

    Xin Wang

    2018-02-01

    To improve the accuracy of change detection in urban areas using bi-temporal high-resolution remote sensing images, a novel object-based change detection scheme combining multiple features and ensemble learning is proposed in this paper. Image segmentation is conducted to determine the objects in the bi-temporal images separately. Subsequently, three kinds of object features, i.e., spectral, shape and texture, are extracted. Using the image differencing process, a difference image is generated and used as the input for nonlinear supervised classifiers, including k-nearest neighbor, support vector machine, extreme learning machine and random forest. Finally, the results of the multiple classifiers are integrated using an ensemble rule called weighted voting to generate the final change detection result. Experimental results on two pairs of real high-resolution remote sensing datasets demonstrate that the proposed approach outperforms the traditional methods in terms of overall accuracy and generates change detection maps with a higher number of homogeneous regions in urban areas. Moreover, the influences of the segmentation scale and the feature selection strategy on the change detection performance are also analyzed and discussed.
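
    The final fusion step, weighted voting over per-classifier outputs, is straightforward to sketch. The per-classifier probability maps and weights below are invented; in practice the weights would reflect each classifier's validation accuracy:

```python
import numpy as np

def weighted_vote(probas: list[np.ndarray], weights: list[float]) -> np.ndarray:
    """Combine per-classifier class-probability arrays (e.g. from kNN, SVM,
    ELM and RF) by weighted voting and return the winning class per object."""
    combined = np.tensordot(np.asarray(weights), np.stack(probas), axes=1)
    return combined.argmax(axis=-1)   # 0 = no change, 1 = change

# Two objects, two classes (no-change / change), two toy classifiers.
p_knn = np.array([[0.7, 0.3], [0.4, 0.6]])
p_svm = np.array([[0.6, 0.4], [0.2, 0.8]])
print(weighted_vote([p_knn, p_svm], [0.4, 0.6]))   # per-object labels: [0 1]
```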

  1. The multiplicity of massive stars: A high angular resolution survey with the HST fine guidance sensor

    Aldoretta, E. J.; Gies, D. R.; Henry, T. J.; Jao, W.-C.; Norris, R. P.

    2015-01-01

    We present the results of an all-sky survey made with the Fine Guidance Sensor on the Hubble Space Telescope to search for angularly resolved binary systems among massive stars. The sample of 224 stars comprises mainly Galactic O- and B-type stars and luminous blue variables, plus a few luminous stars in the Large Magellanic Cloud. The FGS TRANS mode observations are sensitive to the detection of companions with an angular separation between 0.″01 and 1.″0 and brighter than Δm = 5. The FGS observations resolved 52 binary and 6 triple star systems and detected partially resolved binaries in 7 additional targets (43 of these are new detections). These numbers yield a companion detection frequency of 29% for the FGS survey. We also gathered literature results on the numbers of close spectroscopic binaries and wider astrometric binaries among the sample, and we present estimates of the frequency of multiple systems and the companion frequency for subsets of stars residing in clusters and associations, field stars, and runaway stars. These results confirm the high multiplicity fraction, especially among massive stars in clusters and associations. We show that the period distribution is approximately flat in increments of log P. We identify a number of systems of potential interest for long-term orbital determinations, and we note the importance of some of these companions for the interpretation of the radial velocities and light curves of close binaries that have third companions.

  2. High resolution multiple sampling ionization chamber (MUSIC) sensitive to position coordinates

    Petrascu, H.; Kumagai, H.; Tanihata, I.; Petrascu, M.

    1999-01-01

    A new type of MUSIC sensitive to position coordinates is reported. The development of the first version of this type of chamber is based on the principles presented by Badhwar in 1973. The present detector will be used in experiments on fusion using radioactive beams. Due to its high resolution, this chamber is suitable for the identification and tracking of low-Z particles. One of our goals when we started this work was to reduce as much as possible the Z value of particles that can be 'seen' by an ionization chamber. The resolution of the chamber was significantly improved by connecting the preamplifiers directly to the MUSIC's pads. These preamplifiers are able to work in vacuum and at very low gas pressure. In this way the signal-to-noise ratio was increased by a factor of ∼10. The detector is of the Frisch grid type, with the anode split into 10 active pads. It is the first model of a MUSIC with the field shared between the position grid and the anode pads. The Frisch grid was necessary because the detector was originally designed for very accurate energy measurements and particle identification. A drawing of this detector is shown. The detector itself consists of four main parts. The first is the constant field-gradient cage, sandwiched between the cathode and the Frisch grid. The second is the Frisch grid. The third is the position grid, located under the Frisch grid. The last is the plate with the anode pads. The cage is made of 100 μm Cu-Be wires. Every wire was tensioned with a weight representing half of its breaking limit. The Frisch grid was made on an aluminium frame, on which 20 μm W wires spaced 0.3 mm apart were wound. For the position grid, 10 groups of 20 μm gold-plated W wires were used. Each group consisted of 5 wires spaced 0.9 mm apart and connected in parallel. The anode pads, 7.8 × 60 mm², were perpendicular to the beam direction. Each pad and each of the position wire groups were connected to a preamplifier. The energy resolution ...

  3. SU-E-J-27: Shifting Multiple EPID Imager Layers to Improve Image Quality and Resolution in MV CBCT

    Chen, H; Rottmann, J; Yip, S; Berbeco, R [Brigham and Women’s Hospital, Boston, Massachusetts (United States); Morf, D; Fueglistaller, R; Star-Lack, J; Zentai, G [Varian Medical Systems, Palo Alto, CA (United States)

    2015-06-15

    Purpose: Vertical stacking of four conventional EPID layers can improve DQE for MV-CBCT applications. We hypothesize that shifting each layer laterally by half a pixel relative to the layer above will improve the contrast-to-noise ratio (CNR) and image resolution. Methods: For CNR assessment, a 20 cm diameter digital phantom with 8 inserts is created. The attenuation coefficient of the phantom is similar to that of lung at the average energy of a 6 MV photon beam. The inserts have attenuations of 1, 2, …, 8 times that of lung. One of the inserts is close to soft tissue, resembling the case of a tumor in lung. For resolution assessment, a digital phantom featuring a bar pattern is created. The phantom has an attenuation coefficient similar to soft tissue, and the bars have the attenuation coefficient of calcium sulfate. A 2 MeV photon beam is attenuated through these phantoms and hits each of the four stacked detector layers. Each successive layer is shifted by half a pixel in the x only, y only, and x and y (combined) directions, respectively. Blurring and statistical noise are added to the projections. Projections from one, two, three and four layers are used for reconstruction. CNR and image resolution are evaluated and compared. Results: When projections from multiple layers are combined for reconstruction, CNR increases with the number of layers involved. The CNR in reconstructions from two, three and four layers is 1.4, 1.7 and 1.99 times that from one layer. The resolution from the shifted four-layer detector is also improved over a single layer. In a comparison of one layer versus four layers in this preliminary study, the resolution from four shifted layers is at least 20% better. Conclusion: Layer-shifting in a stacked EPID imager design enhances resolution as well as CNR for half-scan MV-CBCT. The project described was supported, in part, by a grant from Varian Medical Systems, Inc., and Award No. R01CA188446-01 from the National Cancer Institute. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
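
    As a side note, the reported CNR gains (1.4, 1.7 and 1.99 for two, three and four layers) track the √N improvement expected from averaging layers with independent noise. A minimal CNR sketch on synthetic layers, with all numbers made up:

```python
import numpy as np

def cnr(image: np.ndarray, signal_mask: np.ndarray, bg_mask: np.ndarray) -> float:
    """Contrast-to-noise ratio: |mean(signal) - mean(background)| in units
    of the background standard deviation."""
    return abs(image[signal_mask].mean() - image[bg_mask].mean()) / image[bg_mask].std()

rng = np.random.default_rng(0)
base = np.zeros((64, 64)); base[24:40, 24:40] = 1.0        # insert on background
sig = np.zeros_like(base, bool); sig[24:40, 24:40] = True
layers = [base + rng.normal(0, 0.5, base.shape) for _ in range(4)]
for n in (1, 2, 4):
    print(n, round(cnr(np.mean(layers[:n], axis=0), sig, ~sig), 2))  # ~sqrt(n) growth
```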

  4. High-resolution 40Ar/39Ar chronology of multiple intrusion igneous complexes

    Foland, K. A.; Chen, J.-F.; Linder, J. S.; Henderson, C. M. B.; Whillans, I. M.

    1989-06-01

    The Mount Brome complex of the Monteregian province of southern Quebec, Canada, consists of several major intrusions ranging compositionally from gabbro to syenite. The relative ages of these intrusives have been investigated with high-resolution 40Ar/39Ar analyses, including a specially designed irradiation configuration to cancel the effects of fluence gradients. Small yet distinct apparent age differences are observed. While a number of analytical and geological factors could be proposed to explain the small variations, evaluation of these suggests that the age differences reflect differences in emplacement times. The gabbro and nepheline diorite were emplaced within a short span 123.1 Ma ago. The generally more evolved lithologies (biotite monzodiorite, pulaskite, nordmarkite) appear to have been emplaced within a restricted interval 1.4±0.3 Ma later. Whole-rock Rb-Sr systematics do not give acceptable isochrons because of significant scatter, interpreted to reflect initial 87Sr/86Sr heterogeneities resulting from crustal contamination. Considering the variations in initial ratio, the Rb-Sr data are consistent with the 40Ar/39Ar age.

  5. Detection of somatic mutations by high-resolution DNA melting (HRM) analysis in multiple cancers.

    Gonzalez-Bosquet, Jesus; Calcei, Jacob; Wei, Jun S; Garcia-Closas, Montserrat; Sherman, Mark E; Hewitt, Stephen; Vockley, Joseph; Lissowska, Jolanta; Yang, Hannah P; Khan, Javed; Chanock, Stephen

    2011-01-17

    Identification of somatic mutations in cancer is a major goal for understanding and monitoring the events related to cancer initiation and progression. High-resolution melting (HRM) curve analysis represents a fast, post-PCR, high-throughput method for scanning somatic sequence alterations in target genes. The aim of this study was to assess the sensitivity and specificity of HRM analysis for tumor mutation screening in a range of tumor samples, which included 216 frozen pediatric small round blue-cell tumors as well as 180 paraffin-embedded tumors from breast, endometrial and ovarian cancers (60 of each). HRM analysis was performed on exons of the following candidate genes known to harbor established, commonly observed mutations: PIK3CA, ERBB2, KRAS, TP53, EGFR, BRAF, GATA3, and FGFR3. Bi-directional sequencing analysis was used to determine the accuracy of the HRM analysis. For the 39 mutations observed in frozen samples, the sensitivity and specificity of HRM analysis were 97% and 87%, respectively. There were 67 mutations/variants in the paraffin-embedded samples, and the sensitivity and specificity of the HRM analysis were 88% and 80%, respectively. Paraffin-embedded samples require a higher quantity of purified DNA for high performance. In summary, HRM analysis is a promising moderate-throughput screening test for mutations among known candidate genomic regions. Although the overall accuracy appears to be better in frozen specimens, somatic alterations were detected in DNA extracted from paraffin-embedded samples.

  7. Metrics for Polyphonic Sound Event Detection

    Annamaria Mesaros

    2016-05-01

    This paper presents and discusses various metrics proposed for the evaluation of polyphonic sound event detection systems used in realistic situations, where there are typically multiple sound sources active simultaneously. The system output in this case contains overlapping events, marked as multiple sounds detected as being active at the same time. The polyphonic system output requires a suitable procedure for evaluation against a reference. Metrics from neighboring fields such as speech recognition and speaker diarization can be used, but they need to be partially redefined to deal with the overlapping events. We present a review of the most common metrics in the field and the way they are adapted and interpreted in the polyphonic case. We discuss segment-based and event-based definitions of each metric and explain the consequences of instance-based and class-based averaging using a case study. In parallel, we provide a toolbox containing implementations of the presented metrics.
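
    For a flavor of the segment-based variant, the sketch below scores a polyphonic output against a reference on a (segments × classes) activity grid, so simultaneous events are counted per class within each segment. This is a generic segment-based F-score, not the toolbox's exact implementation:

```python
import numpy as np

def segment_f1(ref: np.ndarray, est: np.ndarray) -> float:
    """Segment-based F-score for polyphonic sound event detection:
    ref and est are binary (segments x classes) activity matrices, so
    overlapping events are handled naturally per segment and per class."""
    tp = np.logical_and(ref, est).sum()
    fp = np.logical_and(~ref, est).sum()
    fn = np.logical_and(ref, ~est).sum()
    return 2 * tp / (2 * tp + fp + fn)

ref = np.array([[1, 1, 0], [1, 0, 0], [0, 1, 1]], bool)  # 3 segments, 3 classes
est = np.array([[1, 0, 0], [1, 0, 1], [0, 1, 1]], bool)
print(round(segment_f1(ref, est), 3))  # 0.8
```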

  8. Palm Swamp Wetland Ecosystems of the Upper Amazon: Characterizing their Distribution and Inundation State Using Multiple Resolution Microwave Remote Sensing

    Podest, E.; McDonald, K. C.; Schröder, R.; Pinto, N.; Zimmermann, R.; Horna, V.

    2011-12-01

    Palm swamp wetlands are prevalent in the Amazon basin, including extensive regions in northern Peru. These ecosystems are characterized by constant surface inundation and moderate seasonal water level variation. The combination of constantly saturated soils, giving rise to low oxygen conditions, and warm temperatures year-round can lead to considerable methane release to the atmosphere. Because of the widespread occurrence and expected sensitivity of these ecosystems to climate change, knowledge of their spatial extent and inundation state is crucial for assessing the associated land-atmosphere carbon exchange. Precise spatio-temporal information on palm swamps is difficult to gather because of their remoteness and difficult accessibility. Spaceborne microwave remote sensing is an effective tool for characterizing these ecosystems since it is sensitive to surface water and vegetation structure and allows monitoring large inaccessible areas on a temporal basis regardless of atmospheric conditions or solar illumination. We are developing a remote sensing methodology using multiple resolution microwave remote sensing data to determine palm swamp distribution and inundation state over focus regions in the Amazon basin in northern Peru. For this purpose, two types of multi-temporal microwave data are used: 1) high-resolution (100 m) data from the Advanced Land Observing Satellite (ALOS) Phased Array L-Band Synthetic Aperture Radar (PALSAR) to derive maps of palm swamp extent and inundation from dual-polarization fine-beam and multi-temporal HH-polarized ScanSAR, and 2) coarse resolution (25 km) combined active and passive microwave data from QuikSCAT and AMSR-E to derive inundated area fraction on a weekly basis. We compare information content and accuracy of the coarse resolution products to the PALSAR-based datasets to ensure information harmonization. The synergistic combination of high and low resolution datasets will allow for characterization of palm swamps and

  9. SU-D-204-05: Quantitative Comparison of a High Resolution Micro-Angiographic Fluoroscopic (MAF) Detector with a Standard Flat Panel Detector (FPD) Using the New Metric of Generalized Measured Relative Object Detectability (GM-ROD)

    Russ, M; Ionita, C; Bednarek, D; Rudin, S [Toshiba Stroke and Vascular Research Center, University at Buffalo (SUNY), Buffalo, NY (United States)

    2015-06-15

    Purpose: In endovascular image-guided neuro-interventions, visualization of fine detail is paramount. For example, the ability of the interventionist to visualize stent struts depends heavily on the performance of the x-ray imaging detector. Methods: A study examining the relative performance of the high-resolution MAF-CMOS (pixel size 75 µm, Nyquist frequency 6.6 cycles/mm) and a standard flat panel detector (pixel size 194 µm, Nyquist frequency 2.5 cycles/mm) in imaging a neuro stent was carried out using the Generalized Measured Relative Object Detectability (GM-ROD) metric. Low-quantum-noise images of a deployed stent were obtained by averaging 95 frames acquired with each detector without changing other exposure or geometric parameters. The square of the Fourier transform of each image is taken and divided by the generalized normalized noise power spectrum to give an effective measured task-specific signal-to-noise ratio. This expression is then integrated from 0 to each detector's Nyquist frequency, and the GM-ROD value is determined by taking the ratio of the integral for the MAF-CMOS to that of the FPD. The lower bound of integration can be varied to emphasize high frequencies in the detector comparisons. Results: The MAF-CMOS detector exhibits vastly superior performance over the FPD when integrating over all frequencies, yielding a GM-ROD value of 63.1. The lower bound of integration was stepped up in increments of 0.5 cycles/mm for higher-frequency comparisons. As the lower bound increased, the GM-ROD value increased, reflecting the superior performance of the MAF-CMOS in the high-frequency regime. Conclusion: GM-ROD is a versatile metric that can provide quantitative detector- and task-dependent comparisons that can be used as a basis for detector selection. Supported by NIH Grant 2R01EB002873 and an equipment grant from Toshiba Medical Systems Corporation.
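
    Read literally, the figure of merit described above can be transcribed as the following ratio (a transcription of the abstract's verbal definition, not a formula vetted against the underlying paper):

```latex
\[
\mathrm{GM\text{-}ROD}(f_0) =
\frac{\displaystyle \int_{f_0}^{f_{\mathrm{Ny}}^{\mathrm{MAF}}}
      \frac{\lvert \mathcal{F}\{I_{\mathrm{MAF}}\}(f) \rvert^{2}}
           {\mathrm{GNNPS}_{\mathrm{MAF}}(f)} \, df}
     {\displaystyle \int_{f_0}^{f_{\mathrm{Ny}}^{\mathrm{FPD}}}
      \frac{\lvert \mathcal{F}\{I_{\mathrm{FPD}}\}(f) \rvert^{2}}
           {\mathrm{GNNPS}_{\mathrm{FPD}}(f)} \, df},
\qquad f_0 \ge 0,
\]
```

    with f₀ = 0 for the overall comparison and f₀ stepped up in 0.5 cycles/mm increments to weight the high-frequency regime.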

  10. Cerebrospinal fluid metabolic profiles in multiple sclerosis and degenerative dementias obtained by high resolution proton magnetic resonance spectroscopy

    Vion-Dury, J.; Confort-Gouny, S.; Maillet, S.; Cozzone, P.J.; Nicoli, F.; Gastaut, J.L.

    1996-01-01

    We have analyzed the cerebrospinal fluid (CSF) of 19 patients with multiple sclerosis (MS), 12 patients with degenerative dementia and 17 control patients using in vitro high-resolution proton magnetic resonance spectroscopy (MRS) at 400 MHz. The CSF metabolic profile is slightly modified in MS patients (increased lactate and fructose concentrations, decreased creatinine and phenylalanine concentrations) and is not correlated with the intensity of the intrathecal inflammation. Proton MRS of CSF does not differentiate relapsing-remitting MS from primary progressive MS. We have not detected any specific abnormal resonance in native or lyophilized CSF. The CSF metabolic profile of demented patients is much more altered (increased concentrations of lactate, pyruvate, alanine, lysine, valine, leucine-isoleucine, tyrosine and glutamine) and is in agreement with an impairment of brain oxidative metabolism, as already described in Alzheimer's disease. Unassigned abnormal, but non-specific or non-constant, resonances have been detected on MR spectra of demented patients. The CSF inositol concentration is also increased in patients with Alzheimer's disease. In vitro high-resolution proton MRS of the CSF constitutes a new and original way to explore CSF for the differential and/or early diagnosis of dementias, as a complement to in vivo proton cerebral MRS. (authors). 22 refs., 4 figs., 2 tabs

  12. On the estimate of deviations of partial sums of a multiple Fourier-Walsh series of the form S_{2^j,…,2^j} f(x) of a function in the metric L_1(Q_k)

    Igenberlina, Alua; Matin, Dauren; Turgumbayev, Mendybay

    2017-09-01

    In this paper, deviations of the partial sums of a multiple Fourier-Walsh series of a function in the metric L_1(Q_k) on a dyadic group are investigated. This estimate plays an important role in the study of equivalent normalizations in this space by means of a difference, oscillation, and best approximation by polynomials in the Walsh system. The classical Besov space and its equivalent normalizations are set forth in the well-known monographs of Nikolsky S.M., Besov O.V., Ilyin V.P., and Triebel H., and in the works of Kazakh scientists such as Amanov T.I., Mynbaev K.T., Otelbaev M.O., and Smailov E.S. The Besov spaces on the dyadic group and the Vilenkin groups in the one-dimensional case are considered in works by Ombe H., Bloom Walter R., Fournier J., Onneweer C.W., Weyi S., and Jun Tateoka.

  13. Sensory Metrics of Neuromechanical Trust.

    Softky, William; Benford, Criscillia

    2017-09-01

    Today digital sources supply a historically unprecedented component of human sensorimotor data, the consumption of which is correlated with poorly understood maladies such as Internet addiction disorder and Internet gaming disorder. Because both natural and digital sensorimotor data share common mathematical descriptions, one can quantify our informational sensorimotor needs using the signal processing metrics of entropy, noise, dimensionality, continuity, latency, and bandwidth. Such metrics describe in neutral terms the informational diet human brains require to self-calibrate, allowing individuals to maintain trusting relationships. With these metrics, we define the trust humans experience using the mathematical language of computational models, that is, as a primitive statistical algorithm processing finely grained sensorimotor data from neuromechanical interaction. This definition of neuromechanical trust implies that artificial sensorimotor inputs and interactions that attract low-level attention through frequent discontinuities and enhanced coherence will decalibrate a brain's representation of its world over the long term by violating the implicit statistical contract for which self-calibration evolved. Our hypersimplified mathematical understanding of human sensorimotor processing as multiscale, continuous-time vibratory interaction allows equally broad-brush descriptions of failure modes and solutions. For example, we model addiction in general as the result of homeostatic regulation gone awry in novel environments (sign reversal) and digital dependency as a sub-case in which the decalibration caused by digital sensorimotor data spurs yet more consumption of them. We predict that institutions can use these sensorimotor metrics to quantify media richness to improve employee well-being; that dyads and family-size groups will bond and heal best through low-latency, high-resolution multisensory interaction such as shared meals and reciprocated touch; and

  14. Classification of Small-Scale Eucalyptus Plantations Based on NDVI Time Series Obtained from Multiple High-Resolution Datasets

    Hailang Qiao

    2016-02-01

    Eucalyptus, a short-rotation plantation, has been expanding rapidly in southeast China in recent years owing to its short growth cycle and high yield of wood. Effective identification of eucalyptus, therefore, is important for monitoring land use changes and investigating environmental quality. For this article, we used remote sensing images over 15 years (one per year) with a 30-m spatial resolution, including Landsat 5 Thematic Mapper images, Landsat 7 Enhanced Thematic Mapper images, and HJ 1A/1B images. These data were used to construct a 15-year Normalized Difference Vegetation Index (NDVI) time series for several cities in Guangdong Province, China. Eucalyptus reference NDVI time series sub-sequences were acquired, including one-year-long and two-year-long growing periods, using investigated eucalyptus samples in the study region. In order to compensate for the discontinuity of the NDVI time series that is a consequence of the relatively coarse temporal resolution, we developed an inverted triangle area methodology. Using this methodology, the images were classified on the basis of the matching degree of the NDVI time series and two reference NDVI time series sub-sequences during the growing period of the eucalyptus rotations. Three additional methodologies (Bounding Envelope, City Block, and Standardized Euclidian Distance) were also tested and used as a comparison group. Threshold coefficients for the algorithms were adjusted using commission–omission error criteria. The results show that the triangle area methodology out-performed the other methodologies in classifying eucalyptus plantations. Threshold coefficients and an optimal discriminant function were determined using a mosaic photograph that had been taken by an unmanned aerial vehicle platform. Good stability was found as we performed further validation using multiple-year data from the high-resolution Gaofen Satellite 1 (GF-1) observations of larger regions. Eucalyptus planting dates
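
    The record does not spell out the inverted-triangle-area score, so the following Python sketch is only illustrative: it builds a hypothetical triangle-area signature from consecutive (time, NDVI) triples via the shoelace formula and scores a candidate pixel against a reference sub-sequence; `threshold` stands in for the paper's tuned threshold coefficient.

```python
import numpy as np

def triangle_areas(ndvi, t=None):
    # Shoelace area of the triangle spanned by each run of three
    # consecutive (t, NDVI) points; a hypothetical stand-in for the
    # paper's inverted-triangle-area signature.
    ndvi = np.asarray(ndvi, dtype=float)
    t = np.arange(len(ndvi), dtype=float) if t is None else np.asarray(t, dtype=float)
    x0, y0 = t[:-2], ndvi[:-2]
    x1, y1 = t[1:-1], ndvi[1:-1]
    x2, y2 = t[2:], ndvi[2:]
    return 0.5 * np.abs((x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0))

def match_degree(candidate, reference):
    # Lower score = better agreement between the pixel's NDVI
    # sub-sequence and the eucalyptus reference sub-sequence.
    return np.abs(triangle_areas(candidate) - triangle_areas(reference)).sum()

def is_eucalyptus(candidate, reference, threshold):
    return match_degree(candidate, reference) <= threshold
```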

  15. Metric diffusion along foliations

    Walczak, Szymon M

    2017-01-01

    Up-to-date research in metric diffusion along compact foliations is presented in this book. Beginning with fundamentals from optimal transportation theory and the theory of foliations, the book moves on to cover the Wasserstein distance, the Kantorovich Duality Theorem, and the metrization of the weak topology by the Wasserstein distance. Metric diffusion is defined, the topology of the metric space is studied, and the limits of diffused metrics along compact foliations are discussed. Essentials on foliations, holonomy, heat diffusion, and compact foliations are detailed and vital technical lemmas are proved to aid understanding. Graduate students and researchers in the geometry, topology and dynamics of foliations and laminations will find this supplement useful, as it presents facts about metric diffusion along non-compact foliations and provides a full description of the limit of metrics diffused along a foliation with at least one compact leaf in dimension two.

  16. Metric modular spaces

    Chistyakov, Vyacheslav

    2015-01-01

    Aimed toward researchers and graduate students familiar with elements of functional analysis, linear algebra, and general topology, this book contains a general study of modulars, modular spaces, and metric modular spaces. Modulars may be thought of as generalized velocity fields and serve two important purposes: they generate metric spaces in a unified manner, and they provide a weaker convergence, the modular convergence, whose topology is non-metrizable in general. Metric modular spaces are extensions of metric spaces, metric linear spaces, and classical modular linear spaces. The topics covered include the classification of modulars, metrizability of modular spaces, modular transforms and duality between modular spaces, and metric and modular topologies. Applications illustrated in this book include: the description of superposition operators acting in modular spaces, the existence of regular selections of set-valued mappings, new interpretations of spaces of Lipschitzian and absolutely continuous mappings, the existe...

  17. Prognostic Performance Metrics

    National Aeronautics and Space Administration — This chapter presents several performance metrics for offline evaluation of prognostics algorithms. A brief overview of different methods employed for performance...

  18. Overview of journal metrics

    Kihong Kim

    2018-02-01

    Various kinds of metrics used for the quantitative evaluation of scholarly journals are reviewed. The impact factor and related metrics, including the immediacy index and the aggregate impact factor, which are provided by the Journal Citation Reports, are explained in detail. The Eigenfactor score and the article influence score are also reviewed. In addition, journal metrics such as CiteScore, Source Normalized Impact per Paper, SCImago Journal Rank, h-index, and g-index are discussed. Limitations and problems of these metrics are pointed out. We should be cautious about relying too heavily on these quantitative measures when evaluating journals or researchers.

  19. Monitoring the Lavina di Roncovetro (RE, Italy) landslide by integrating traditional monitoring systems and multiple high-resolution topographic datasets

    Fornaciai, Alessandro; Favalli, Massimiliano; Gigli, Giovanni; Nannipieri, Luca; Mucchi, Lorenzo; Intieri, Emanuele; Agostini, Andrea; Pizziolo, Marco; Bertolini, Giovanni; Trippi, Federico; Casagli, Nicola; Schina, Rosa; Carnevale, Ennio

    2016-04-01

    Roncovetro Landslide were generated at different times. The 3D models were then georeferenced and digital elevation models (DEMs) created. By comparing the resulting DEMs, changes in the investigated area were detected, and the sediment volumes, as well as the 3D displacements of the most active parts of the landslide, were quantified. In this work, we test the performance of SfM techniques applied to an active landslide by comparing them with traditional monitoring systems, highlighting the strengths and weaknesses of both methods. In addition, we show the preliminary results obtained by integrating the traditional monitoring systems and the multiple high-resolution topographic datasets, over a period of more than one year, to investigate the spatial and temporal evolution of the upper sector of the Roncovetro landslide.

  20. Brand metrics that matter

    Muntinga, D.; Bernritter, S.

    2017-01-01

    The brand is increasingly central to the organization. It is therefore essential to measure the health, performance and development of the brand. Selecting the right brand metrics, however, is a challenge. An enormous number of metrics compete for the attention of brand managers. But which

  1. Privacy Metrics and Boundaries

    L-F. Pau (Louis-François)

    2005-01-01

    This paper aims at defining a set of privacy metrics (quantitative and qualitative) for the relation between a privacy protector and an information gatherer. The aims of such metrics are: to allow the assessment and comparison of different user scenarios and their differences; for

  2. Robustness of climate metrics under climate policy ambiguity

    Ekholm, Tommi; Lindroos, Tomi J.; Savolainen, Ilkka

    2013-01-01

    Highlights: • We assess the economic impacts of using different climate metrics. • The setting is cost-efficient scenarios for three interpretations of the 2 °C target. • With each target setting, the optimal metric is different. • Therefore policy ambiguity prevents the selection of an optimal metric. • Robust metric values that perform well with multiple policy targets however exist. -- Abstract: A wide array of alternatives has been proposed as the common metrics with which to compare the climate impacts of different emission types. Different physical and economic metrics and their parameterizations give diverse weights between e.g. CH4 and CO2, and fixing the metric from one perspective makes it sub-optimal from another. As the aims of global climate policy involve some degree of ambiguity, it is not possible to determine a metric that would be optimal and consistent with all policy aims. This paper evaluates the cost implications of using predetermined metrics in cost-efficient mitigation scenarios. Three formulations of the 2 °C target, including both deterministic and stochastic approaches, shared a wide range of metric values for CH4 with which the mitigation costs are only slightly above the cost-optimal levels. Therefore, although ambiguity in current policy might prevent us from selecting an optimal metric, it is possible to select robust metric values that perform well with multiple policy targets.

  3. Holographic Spherically Symmetric Metrics

    Petri, Michael

    The holographic principle (HP) conjectures, that the maximum number of degrees of freedom of any realistic physical system is proportional to the system's boundary area. The HP has its roots in the study of black holes. It has recently been applied to cosmological solutions. In this article we apply the HP to spherically symmetric static space-times. We find that any regular spherically symmetric object saturating the HP is subject to tight constraints on the (interior) metric, energy-density, temperature and entropy-density. Whenever gravity can be described by a metric theory, gravity is macroscopically scale invariant and the laws of thermodynamics hold locally and globally, the (interior) metric of a regular holographic object is uniquely determined up to a constant factor and the interior matter-state must follow well defined scaling relations. When the metric theory of gravity is general relativity, the interior matter has an overall string equation of state (EOS) and a unique total energy-density. Thus the holographic metric derived in this article can serve as simple interior 4D realization of Mathur's string fuzzball proposal. Some properties of the holographic metric and its possible experimental verification are discussed. The geodesics of the holographic metric describe an isotropically expanding (or contracting) universe with a nearly homogeneous matter-distribution within the local Hubble volume. Due to the overall string EOS the active gravitational mass-density is zero, resulting in a coasting expansion with Ht = 1, which is compatible with the recent GRB-data.

  4. Probabilistic metric spaces

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  5. Tracker Performance Metric

    Olson, Teresa; Lee, Harry; Sanders, Johnnie

    2002-01-01

    .... We have developed the Tracker Performance Metric (TPM) specifically for this purpose. It was designed to measure the output performance, on a frame-by-frame basis, using its output position and quality...

  6. IT Project Management Metrics

    2007-01-01

    Many software and IT projects fail to complete their objectives for a variety of causes, among which project management carries a high weight. In order to have successful projects, lessons learned have to be used, historical data collected, and metrics and indicators computed and compared with past projects so that failure can be avoided. This paper presents some metrics that can be used for IT project management.

  7. Mass Customization Measurements Metrics

    Nielsen, Kjeld; Brunø, Thomas Ditlev; Jørgensen, Kaj Asbjørn

    2014-01-01

    A recent survey has indicated that 17 % of companies have ceased mass customizing less than 1 year after initiating the effort. This paper presents measurements of a company's mass customization performance, utilizing metrics within the three fundamental capabilities: robust process design, choice navigation, and solution space development. A mass customizer assessing performance with these metrics can identify within which areas improvement would increase competitiveness the most and enable a more efficient transition to mass customization.

  8. Fault Management Metrics

    Johnson, Stephen B.; Ghoshal, Sudipto; Haste, Deepak; Moore, Craig

    2017-01-01

    This paper describes the theory and considerations in the application of metrics to measure the effectiveness of fault management. Fault management refers here to the operational aspect of system health management, and as such is considered as a meta-control loop that operates to preserve or maximize the system's ability to achieve its goals in the face of current or prospective failure. As a suite of control loops, the metrics to estimate and measure the effectiveness of fault management are similar to those of classical control loops in being divided into two major classes: state estimation, and state control. State estimation metrics can be classified into lower-level subdivisions for detection coverage, detection effectiveness, fault isolation and fault identification (diagnostics), and failure prognosis. State control metrics can be classified into response determination effectiveness and response effectiveness. These metrics are applied to each and every fault management control loop in the system, for each failure to which they apply, and probabilistically summed to determine the effectiveness of these fault management control loops to preserve the relevant system goals that they are intended to protect.

  9. Deep Transfer Metric Learning.

    Junlin Hu; Jiwen Lu; Yap-Peng Tan; Jie Zhou

    2016-12-01

    Conventional metric learning methods usually assume that the training and test samples are captured in similar scenarios so that their distributions are assumed to be the same. This assumption does not hold in many real visual recognition applications, especially when samples are captured across different data sets. In this paper, we propose a new deep transfer metric learning (DTML) method to learn a set of hierarchical nonlinear transformations for cross-domain visual recognition by transferring discriminative knowledge from the labeled source domain to the unlabeled target domain. Specifically, our DTML learns a deep metric network by maximizing the inter-class variations and minimizing the intra-class variations, and minimizing the distribution divergence between the source domain and the target domain at the top layer of the network. To better exploit the discriminative information from the source domain, we further develop a deeply supervised transfer metric learning (DSTML) method by including an additional objective on DTML, where the output of both the hidden layers and the top layer are optimized jointly. To preserve the local manifold of input data points in the metric space, we present two new methods, DTML with autoencoder regularization and DSTML with autoencoder regularization. Experimental results on face verification, person re-identification, and handwritten digit recognition validate the effectiveness of the proposed methods.
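
    As a rough sketch of the kind of objective the abstract describes, the following PyTorch-style loss (our own simplification, not the authors' exact DTML formulation) combines intra-class compactness, a margin-based inter-class term, and a simple mean-embedding divergence between source and target features.

```python
import torch

def dtml_like_loss(f_src, y_src, f_tgt, alpha=1.0, beta=1.0, margin=1.0):
    # f_src: (n, d) source features from the top layer of the metric network
    # y_src: (n,)  source labels; f_tgt: (m, d) unlabeled target features
    D = torch.cdist(f_src, f_src) ** 2                  # squared pair distances
    same = (y_src[:, None] == y_src[None, :]).float()
    intra = (D * same).sum() / same.sum()               # pull same-class pairs together
    inter = (torch.relu(margin - D) * (1 - same)).sum() / (1 - same).sum().clamp(min=1)
    mmd = ((f_src.mean(0) - f_tgt.mean(0)) ** 2).sum()  # crude domain-divergence term
    return intra + alpha * inter + beta * mmd
```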

  10. Cyber threat metrics.

    Frye, Jason Neal; Veitch, Cynthia K.; Mateski, Mark Elliot; Michalski, John T.; Harris, James Mark; Trevino, Cassandra M.; Maruoka, Scott

    2012-03-01

    Threats are generally much easier to list than to describe, and much easier to describe than to measure. As a result, many organizations list threats. Fewer describe them in useful terms, and still fewer measure them in meaningful ways. This is particularly true in the dynamic and nebulous domain of cyber threats - a domain that tends to resist easy measurement and, in some cases, appears to defy any measurement. We believe the problem is tractable. In this report we describe threat metrics and models for characterizing threats consistently and unambiguously. The purpose of this report is to support the Operational Threat Assessment (OTA) phase of risk and vulnerability assessment. To this end, we focus on the task of characterizing cyber threats using consistent threat metrics and models. In particular, we address threat metrics and models for describing malicious cyber threats to US FCEB agencies and systems.

  11. Adaptive metric kernel regression

    Goutte, Cyril; Larsen, Jan

    2000-01-01

    Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms...

  12. Adaptive Metric Kernel Regression

    Goutte, Cyril; Larsen, Jan

    1998-01-01

    Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression...... by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...
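
    The two records above describe the same idea: adapt a per-dimension input metric for kernel smoothing by minimising a cross-validation estimate of the generalisation error. Below is a minimal NumPy sketch of that idea (our own, using a Gaussian kernel, a diagonal metric, and a simple coordinate-wise search; the papers' exact parameterisation and optimiser may differ).

```python
import numpy as np

def loo_cv_error(X, y, scales):
    # Leave-one-out error of a Nadaraya-Watson estimator whose input
    # metric weights dimension d by scales[d]; a scale near 0 means the
    # dimension is effectively ignored.
    d2 = (((X[:, None, :] - X[None, :, :]) * scales) ** 2).sum(-1)
    W = np.exp(-0.5 * d2)
    np.fill_diagonal(W, 0.0)                 # exclude each point from its own fit
    pred = (W @ y) / (W.sum(axis=1) + 1e-12)
    return np.mean((pred - y) ** 2)

def adapt_scales(X, y, grid=(0.0, 0.25, 0.5, 1.0, 2.0, 4.0), n_pass=3):
    # Adapt the metric by minimising the LOO estimate, one dimension at a time.
    scales = np.ones(X.shape[1])
    for _ in range(n_pass):
        for d in range(X.shape[1]):
            errs = []
            for s in grid:
                trial = scales.copy()
                trial[d] = s
                errs.append(loo_cv_error(X, y, trial))
            scales[d] = grid[int(np.argmin(errs))]
    return scales
```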

  13. Spatial Statistical and Modeling Strategy for Inventorying and Monitoring Ecosystem Resources at Multiple Scales and Resolution Levels

    Robin M. Reich; C. Aguirre-Bravo; M.S. Williams

    2006-01-01

    A statistical strategy for spatial estimation and modeling of natural and environmental resource variables and indicators is presented. This strategy is part of an inventory and monitoring pilot study that is being carried out in the Mexican states of Jalisco and Colima. Fine spatial resolution estimates of key variables and indicators are outputs that will allow the...

  14. Students' Consideration of Source Information during the Reading of Multiple Texts and Its Effect on Intertextual Conflict Resolution

    Kobayashi, Keiichi

    2014-01-01

    This study investigated students' spontaneous use of source information for the resolution of conflicts between texts. One-hundred fifty-four undergraduate students read two conflicting explanations concerning the relationship between blood type and personality under two conditions: either one explanation with a higher credibility source and…

  15. Metrical Phonology and SLA.

    Tice, Bradley S.

    Metrical phonology, a linguistic process of phonological stress assessment and diagrammatic simplification of sentence and word stress, is discussed as it is found in the English language with the intention that it may be used in second language instruction. Stress is defined by its physical and acoustical correlates, and the principles of…

  16. Engineering performance metrics

    Delozier, R.; Snyder, N.

    1993-03-01

    Implementation of a Total Quality Management (TQM) approach to engineering work required the development of a system of metrics which would serve as a meaningful management tool for evaluating effectiveness in accomplishing project objectives and in achieving improved customer satisfaction. A team effort was chartered with the goal of developing a system of engineering performance metrics which would measure customer satisfaction, quality, cost effectiveness, and timeliness. The approach to developing this system involved the normal systems design phases, including conceptual design, detailed design, implementation, and integration. The lessons learned from this effort will be explored in this paper. They may provide a starting point for other large engineering organizations seeking to institute a performance measurement system for accomplishing project objectives and achieving improved customer satisfaction. To facilitate this effort, a team was chartered to assist in the development of the metrics system. This team, consisting of customers and Engineering staff members, was utilized to ensure that the needs and views of the customers were considered in the development of performance measurements. The development of a system of metrics is no different than the development of any other type of system. It includes the steps of defining performance measurement requirements, measurement process conceptual design, performance measurement and reporting system detailed design, and system implementation and integration.

  17. Metrics for Probabilistic Geometries

    Tosi, Alessandra; Hauberg, Søren; Vellido, Alfredo

    2014-01-01

    the distribution over mappings is given by a Gaussian process. We treat the corresponding latent variable model as a Riemannian manifold and we use the expectation of the metric under the Gaussian process prior to define interpolating paths and measure distance between latent points. We show how distances...

  18. Software Quality Assurance Metrics

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If software metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that had been used in other projects but were not being used by the SA team, and to report them to the Software Assurance team to see if any could be implemented in their software assurance life cycle process.

  19. Enhanced resolution imaging of ultrathin ZnO layers on Ag(111) by multiple hydrogen molecules in a scanning tunneling microscope junction

    Liu, Shuyi; Shiotari, Akitoshi; Baugh, Delroy; Wolf, Martin; Kumagai, Takashi

    2018-05-01

    Molecular hydrogen in a scanning tunneling microscope (STM) junction has been found to enhance the lateral spatial resolution of the STM imaging, referred to as scanning tunneling hydrogen microscopy (STHM). Here we report atomic resolution imaging of 2- and 3-monolayer (ML) thick ZnO layers epitaxially grown on Ag(111) using STHM. The enhanced resolution can be obtained at a relatively large tip to surface distance and resolves a more defective structure exhibiting dislocation defects for 3-ML-thick ZnO than for 2 ML. In order to elucidate the enhanced imaging mechanism, the electric and mechanical properties of the hydrogen molecular junction (HMJ) are investigated by a combination of STM and atomic force microscopy. It is found that the HMJ shows multiple kinklike features in the tip to surface distance dependence of the conductance and frequency shift curves, which are absent in a hydrogen-free junction. Based on a simple modeling, we propose that the junction contains several hydrogen molecules and sequential squeezing of the molecules out of the junction results in the kinklike features in the conductance and frequency shift curves. The model also qualitatively reproduces the enhanced resolution image of the ZnO films.

  20. Performance metrics for the evaluation of hyperspectral chemical identification systems

    Truslow, Eric; Golowich, Steven; Manolakis, Dimitris; Ingle, Vinay

    2016-02-01

    Remote sensing of chemical vapor plumes is a difficult but important task for many military and civilian applications. Hyperspectral sensors operating in the long-wave infrared regime have well-demonstrated detection capabilities. However, the identification of a plume's chemical constituents, based on a chemical library, is a multiple hypothesis testing problem which standard detection metrics do not fully describe. We propose using an additional performance metric for identification based on the so-called Dice index. Our approach partitions and weights a confusion matrix to develop both the standard detection metrics and identification metric. Using the proposed metrics, we demonstrate that the intuitive system design of a detector bank followed by an identifier is indeed justified when incorporating performance information beyond the standard detection metrics.
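
    The Dice index itself is standard; the short Python sketch below shows only the core index applied to set-valued identification output (the paper's weighted confusion-matrix partitioning is more involved, and the chemical names are illustrative only).

```python
def dice_index(identified, truth):
    # Dice index: 2|A ∩ B| / (|A| + |B|) for the identified set A
    # and the true set B of plume constituents.
    A, B = set(identified), set(truth)
    if not A and not B:
        return 1.0   # convention: two empty sets agree perfectly
    return 2 * len(A & B) / (len(A) + len(B))

# Example: identifying {"NH3", "SF6"} when only NH3 is present scores 2/3.
```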

  1. A Metric on Phylogenetic Tree Shapes.

    Colijn, C; Plazzotta, G

    2018-01-01

    The shapes of evolutionary trees are influenced by the nature of the evolutionary process but comparisons of trees from different processes are hindered by the challenge of completely describing tree shape. We present a full characterization of the shapes of rooted branching trees in a form that lends itself to natural tree comparisons. We use this characterization to define a metric, in the sense of a true distance function, on tree shapes. The metric distinguishes trees from random models known to produce different tree shapes. It separates trees derived from tropical versus USA influenza A sequences, which reflect the differing epidemiology of tropical and seasonal flu. We describe several metrics based on the same core characterization, and illustrate how to extend the metric to incorporate trees' branch lengths or other features such as overall imbalance. Our approach allows us to construct addition and multiplication on trees, and to create a convex metric on tree shapes which formally allows computation of average tree shapes. © The Author(s) 2017. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.

  2. Enterprise Sustainment Metrics

    2015-06-19

    are negatively impacting KPIs” (Parmenter, 2010: 31). In the current state, the Air Force’s AA and PBL metrics are once again split. AA does... must have the authority to “take immediate action to rectify situations that are negatively impacting KPIs” (Parmenter, 2010: 31). 3. Measuring... highest profitability and shareholder value for each company” (2014: 273). By systematically diagraming a process, either through a swim lane flowchart

  3. Combining multiple approaches and optimized data resolution for an improved understanding of stream temperature dynamics of a forested headwater basin in the Southern Appalachians

    Belica, L.; Mitasova, H.; Caldwell, P.; McCarter, J. B.; Nelson, S. A. C.

    2017-12-01

    Thermal regimes of forested headwater streams continue to be an area of active research, as climatic, hydrologic, and land cover changes can influence water temperature, a key aspect of aquatic ecosystems. Widespread monitoring of stream temperatures has provided an important data source, yielding insights on the temporal and spatial patterns and the underlying processes that influence stream temperature. However, small forested streams remain challenging to model due to the high spatial and temporal variability of stream temperatures and of the climatic and hydrologic conditions that drive them. Technological advances and increased computational power continue to provide new tools and measurement methods and have allowed spatially explicit analyses of dynamic natural systems at greater temporal resolutions than previously possible. With the goal of understanding how current stream temperature patterns and processes may respond to changing land cover and hydroclimatological conditions, we combined high-resolution, spatially explicit geospatial modeling with deterministic heat flux modeling approaches, using data sources that ranged from traditional hydrological and climatological measurements to emerging remote sensing techniques. Initial analyses of stream temperature monitoring data revealed that high temporal resolution (5 minutes) and measurement resolutions (guide field data collection for further heat flux modeling. By integrating multiple approaches and optimizing data resolution for the processes being investigated, small but ecologically significant differences in stream thermal regimes were revealed. In this case, the multi-approach research contributed to the identification of the dominant mechanisms driving stream temperature in the study area and advanced our understanding of the current thermal fluxes and how they may change as environmental conditions change in the future.

  4. Spatial downscaling algorithm of TRMM precipitation based on multiple high-resolution satellite data for Inner Mongolia, China

    Duan, Limin; Fan, Keke; Li, Wei; Liu, Tingxi

    2017-12-01

    Daily precipitation data from 42 stations in Inner Mongolia, China for the 10-year period from 1 January 2001 to 31 December 2010 were utilized along with downscaled data from the Tropical Rainfall Measuring Mission (TRMM) with a spatial resolution of 0.25° × 0.25° for the same period, based on the statistical relationships between the normalized difference vegetation index (NDVI), meteorological variables, and digital elevation models (DEM), using the leave-one-out (LOO) cross validation method and multivariate stepwise regression. The results indicate that (1) TRMM data can indeed be used to estimate annual precipitation in Inner Mongolia, and there is a linear relationship between annual TRMM and observed precipitation; (2) there is a significant relationship between TRMM-based precipitation and predicted precipitation, with a spatial resolution of 0.50° × 0.50°; (3) NDVI and temperature are important factors influencing the downscaling of TRMM precipitation data, while the DEM slope is not the most significant factor affecting the downscaled TRMM data; and (4) the downscaled TRMM data reflect spatial patterns in annual precipitation reasonably well, showing less precipitation falling in west Inner Mongolia and more in the south and southeast. The new approach proposed here provides a useful alternative for evaluating spatial patterns in precipitation and can thus be applied to generate a more accurate precipitation dataset to support both irrigation management and the conservation of this fragile grassland ecosystem.
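
    A minimal sketch of the statistical core described above: fit a multivariate regression of coarse-grid precipitation on NDVI, temperature and elevation, and assess it with leave-one-out cross-validation. The variable names and the plain linear model are our assumptions; the study used stepwise variable selection.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut

def loo_predictions(X, y):
    # X: (n_cells, 3) predictors per coarse TRMM cell (NDVI, temperature, DEM);
    # y: (n_cells,) TRMM annual precipitation. Each cell is predicted from a
    # model fitted on all the other cells.
    preds = np.empty_like(y, dtype=float)
    for train, test in LeaveOneOut().split(X):
        model = LinearRegression().fit(X[train], y[train])
        preds[test] = model.predict(X[test])
    return preds

# Downscaling step: apply the coarse-grid model to the same predictors on the
# fine grid, then add back the coarse residuals (observed minus predicted)
# interpolated to the fine resolution.
```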

  5. Global estimates of CO sources with high resolution by adjoint inversion of multiple satellite datasets (MOPITT, AIRS, SCIAMACHY, TES

    M. Kopacz

    2010-02-01

    We combine CO column measurements from the MOPITT, AIRS, SCIAMACHY, and TES satellite instruments in a full-year (May 2004–April 2005) global inversion of CO sources at 4°×5° spatial resolution and monthly temporal resolution. The inversion uses the GEOS-Chem chemical transport model (CTM) and its adjoint applied to MOPITT, AIRS, and SCIAMACHY. Observations from TES, surface sites (NOAA/GMD), and aircraft (MOZAIC) are used for evaluation of the a posteriori solution. Using GEOS-Chem as a common intercomparison platform shows global consistency between the different satellite datasets and with the in situ data. Differences can be largely explained by different averaging kernels and a priori information. The global CO emission from combustion as constrained in the inversion is 1350 Tg a−1. This is much higher than current bottom-up emission inventories. A large fraction of the correction results from a seasonal underestimate of CO sources at northern mid-latitudes in winter and suggests a larger-than-expected CO source from vehicle cold starts and residential heating. Implementing this seasonal variation of emissions solves the long-standing problem of models underestimating CO in the northern extratropics in winter-spring. A posteriori emissions also indicate a general underestimation of biomass burning in the GFED2 inventory. However, the tropical biomass burning constraints are not quantitatively consistent across the different datasets.

  6. Symmetries of the dual metrics

    Baleanu, D.

    1998-01-01

    The geometric duality between the metric g_μν and a Killing tensor K_μν is studied. The conditions were found under which the symmetries of the metric g_μν and the dual metric K_μν are the same. Dual spinning space was constructed without the introduction of torsion. The general results are applied to the case of the Kerr-Newman metric.

  7. Compensation Methods for Non-uniform and Incomplete Data Sampling in High Resolution PET with Multiple Scintillation Crystal Layers

    Lee, Jae Sung; Kim, Soo Mee; Lee, Dong Soo; Hong, Jong Hong; Sim, Kwang Souk; Rhee, June Tak

    2008-01-01

    Our aim was to establish methods for sinogram formation and correction in order to appropriately apply the filtered backprojection (FBP) reconstruction algorithm to data acquired using a PET scanner with multiple scintillation crystal layers. Formats for raw PET data storage and methods for converting listmode data into histograms and sinograms were optimized. To solve the various problems that occurred while the raw histogram was converted into a sinogram, an optimal sampling strategy and a sampling efficiency correction method were investigated. Gap compensation methods unique to this system were also investigated. All the sinogram data were reconstructed using a 2D filtered backprojection algorithm and compared to estimate the improvements from the correction algorithms. The optimal radial sampling interval and number of angular samples, in terms of the sampling theorem and the sampling efficiency correction algorithm, were pitch/2 and 120, respectively. By applying the sampling efficiency correction and gap compensation, artifacts and background noise on the reconstructed image could be reduced. A conversion method from histogram to sinogram was investigated for the FBP reconstruction of data acquired using multiple scintillation crystal layers. This method will be useful for fast 2D reconstruction of multiple crystal layer PET data.
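
    A minimal sketch of the listmode-to-sinogram step, under our own geometry conventions (the scanner-specific gap and efficiency corrections are only indicated in comments): each coincidence line of response (LOR) is reduced to an angle and a signed radial offset, then histogrammed.

```python
import numpy as np

def listmode_to_sinogram(events, n_radial, n_angles, r_max):
    # events: (N, 4) array of LOR endpoints (x1, y1, x2, y2).
    x1, y1, x2, y2 = events.T
    phi = np.arctan2(y2 - y1, x2 - x1) % np.pi      # LOR angle in [0, pi)
    s = x1 * np.sin(phi) - y1 * np.cos(phi)         # signed radial offset of the LOR
    sino, _, _ = np.histogram2d(
        phi, s,
        bins=[n_angles, n_radial],
        range=[[0, np.pi], [-r_max, r_max]],
    )
    return sino

# Sampling-efficiency correction: divide each bin by the bin counts of a
# uniform (flood) acquisition; gaps (bins no crystal pair can populate) are
# then filled by interpolation from neighbouring bins.
```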

  8. Kerr metric in cosmological background

    Vaidya, P C [Gujarat Univ., Ahmedabad (India). Dept. of Mathematics

    1977-06-01

    A metric satisfying Einstein's equation is given which in the vicinity of the source reduces to the well-known Kerr metric and which at large distances reduces to the Robertson-Walker metric of a homogeneous cosmological model. The radius of the event horizon of the Kerr black hole in the cosmological background is found.

  9. In house validation of a high resolution mass spectrometry Orbitrap-based method for multiple allergen detection in a processed model food.

    Pilolli, Rosa; De Angelis, Elisabetta; Monaci, Linda

    2018-02-13

    In recent years, mass spectrometry (MS) has been establishing its role in the development of analytical methods for multiple allergen detection, but most analyses are being carried out on low-resolution mass spectrometers such as triple quadrupoles or ion traps. In this investigation, the performance provided by a high resolution (HR) hybrid quadrupole-Orbitrap™ MS platform for multiple allergen detection in a processed food matrix is presented. In particular, three different acquisition modes were compared: full-MS, targeted-selected ion monitoring with data-dependent fragmentation (t-SIM/dd2), and parallel reaction monitoring. In order to challenge the HR-MS platform, the sample preparation was kept as simple as possible, limited to a 30-min ultrasound-aided protein extraction followed by clean-up with disposable size exclusion cartridges. Selected peptide markers tracing five allergenic ingredients, namely skim milk, whole egg, soy flour, ground hazelnut, and ground peanut, were monitored in home-made cookies chosen as a model processed matrix. Timed t-SIM/dd2 was found to be the best choice as a good compromise between sensitivity and accuracy, accomplishing the detection of 17 peptides originating from the five allergens in the same run. The optimized method was validated in-house through the evaluation of matrix and processing effects, recoveries, and precision. The selected quantitative markers for each allergenic ingredient provided quantification of 60–100 μg of allergenic ingredient per g of matrix in incurred cookies.

  10. IONIZED GAS KINEMATICS AT HIGH RESOLUTION. V. [Ne ii], MULTIPLE CLUSTERS, HIGH EFFICIENCY STAR FORMATION, AND BLUE FLOWS IN HE 2–10

    Beck, Sara; Turner, Jean; Lacy, John; Greathouse, Thomas

    2015-01-01

    We measured the 12.8 μm [Ne ii] line in the dwarf starburst galaxy He 2–10 with the high-resolution spectrometer TEXES on the NASA IRTF. The data cube has a diffraction-limited spatial resolution of ∼1″ and a total velocity resolution, including thermal broadening, of ∼5 km s−1. This makes it possible to compare the kinematics of individual star-forming clumps and molecular clouds in the three dimensions of space and velocity, and allows us to determine star formation efficiencies. The kinematics of the ionized gas confirm that the starburst contains multiple dense clusters. From the M/R of the clusters and the ≃30%–40% star formation efficiencies, the clusters are likely to be bound and long lived, like globulars. Non-gravitational features in the line profiles show how the ionized gas flows through the ambient molecular material, as well as a narrow velocity feature, which we identify with the interface of the H ii region and a cold dense clump. These data offer an unprecedented view of the interaction of embedded H ii regions with their environment

  11. Learning Low-Dimensional Metrics

    Jain, Lalit; Mason, Blake; Nowak, Robert

    2017-01-01

    This paper investigates the theoretical foundations of metric learning, focused on four key questions that are not fully addressed in prior work: 1) we consider learning general low-dimensional (low-rank) metrics as well as sparse metrics; 2) we develop upper and lower (minimax) bounds on the generalization error; 3) we quantify the sample complexity of metric learning in terms of the dimension of the feature space and the dimension/rank of the underlying metric; 4) we also bound the accuracy ...

  12. Quantitative analysis of multiple high-resolution mass spectrometry images using chemometric methods: quantitation of chlordecone in mouse liver.

    Mohammadi, Saeedeh; Parastar, Hadi

    2018-05-15

    In this work, a chemometrics-based strategy is developed for quantitative mass spectrometry imaging (MSI). In this regard, quantification of chlordecone, a carcinogenic organochlorine pesticide (C10Cl10O), in mouse liver using the matrix-assisted laser desorption ionization MSI (MALDI-MSI) method is used as a case study. The MSI datasets corresponded to 1, 5 and 10 days of mouse exposure to standard chlordecone in the quantity range of 0 to 450 μg g-1. The binning approach in the m/z direction is used to group high resolution m/z values and to reduce the big data size. To consider the effect of bin size on the quality of the results, three different bin sizes of 0.25, 0.5 and 1.0 were chosen. Afterwards, three-way MSI data arrays (two spatial dimensions and one m/z dimension) for seven standards and four unknown samples were column-wise augmented with m/z values as the common mode. Then, these datasets were analyzed using multivariate curve resolution-alternating least squares (MCR-ALS) with proper constraints. The resolved mass spectra were used for identification of chlordecone in the presence of a complex background and interference. Additionally, the augmented spatial profiles were post-processed and 2D images for each component were obtained in calibration and unknown samples. The sum of these profiles was utilized to set the calibration curve and to obtain the analytical figures of merit (AFOMs). Inspection of the results showed that the lower bin size (i.e., 0.25) provides more accurate results. Finally, the results obtained by MCR for the three datasets were compared with those of gas chromatography-mass spectrometry (GC-MS) and MALDI-MSI. The results showed that the MCR-assisted method gives a higher amount of chlordecone than MALDI-MSI and a lower amount than GC-MS. It is concluded that a combination of chemometric methods with MSI can be considered as an alternative way for MSI quantification.
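
    A minimal sketch of the m/z binning step described above (function and variable names are ours): high-resolution m/z values are grouped into fixed-width bins by summing intensities, which shrinks each pixel's spectrum before the pixel × m/z matrices of all samples are stacked along the common m/z mode for MCR-ALS.

```python
import numpy as np

def bin_spectrum(mz, intensity, bin_size=0.25, mz_min=100.0, mz_max=1000.0):
    # Sum the intensities of all peaks falling into each fixed-width m/z bin.
    edges = np.arange(mz_min, mz_max + bin_size, bin_size)
    binned, _ = np.histogram(mz, bins=edges, weights=intensity)
    return binned                      # one row of the pixel x m/z matrix

# Column-wise augmentation: for each sample, stack the binned spectra of all
# pixels into an (n_pixels, n_bins) matrix, then concatenate the matrices of
# all calibration and unknown samples along the pixel axis.
```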

  13. Image characterization metrics for muon tomography

    Luo, Weidong; Lehovich, Andre; Anashkin, Edward; Bai, Chuanyong; Kindem, Joel; Sossong, Michael; Steiger, Matt

    2014-05-01

    Muon tomography uses naturally occurring cosmic rays to detect nuclear threats in containers. Currently there are no systematic image characterization metrics for muon tomography. We propose a set of image characterization methods to quantify the imaging performance of muon tomography. These methods include tests of spatial resolution, uniformity, contrast, signal-to-noise ratio (SNR) and vertical smearing. Simulated phantom data and analysis methods were developed to evaluate metric applicability. Spatial resolution was determined as the FWHM of the point spread functions along the X, Y and Z axes for 2.5 cm tungsten cubes. Uniformity was measured by drawing a volume of interest (VOI) within a large water phantom and defined as the standard deviation of voxel values divided by the mean voxel value. Contrast was defined as the peak signals of a set of tungsten cubes divided by the mean voxel value of the water background. SNR was defined as the peak signals of cubes divided by the standard deviation (noise) of the water background. Vertical smearing, i.e. vertical thickness blurring along the zenith axis for a set of 2 cm thick tungsten plates, was defined as the FWHM of the vertical spread function for the plate. These image metrics provide a useful tool to quantify the basic imaging properties for muon tomography.
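
    The uniformity, contrast and SNR definitions above translate directly into code; a short NumPy sketch under exactly those definitions (array names are ours):

```python
import numpy as np

def uniformity(voi):
    # Standard deviation of voxel values divided by the mean voxel value
    # inside a VOI drawn in the water phantom.
    voi = np.asarray(voi, dtype=float)
    return voi.std() / voi.mean()

def contrast(peak_signals, water_voi):
    # Peak tungsten-cube signals over the mean background voxel value.
    return np.asarray(peak_signals, dtype=float) / np.mean(water_voi)

def snr(peak_signals, water_voi):
    # Peak tungsten-cube signals over the background standard deviation (noise).
    return np.asarray(peak_signals, dtype=float) / np.std(water_voi)
```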

  14. The ALICE TPC, a high resolution device for ultra-high particle multiplicities. Past, present and future

    Ivanov, Marian [GSI Helmholtzzentrum fuer Schwerionenforschung GmbH (Germany); Collaboration: ALICE-Collaboration

    2015-07-01

    The Time Projection Chamber (TPC) of the ALICE apparatus is a large 3-dimensional tracking and particle identification device for ultra-high multiplicity collision events. It has been operated successfully at the Large Hadron Collider (LHC) at CERN, recording data from pp, p-Pb, and Pb-Pb collisions. Presently, the LHC is in its first long shutdown (LS1); the next round of data taking will start in summer 2015 at or close to the LHC design energy and luminosity. During the second long shutdown (LS2), the LHC will undergo a further increase in the Pb-Pb luminosity together with a major upgrade of ALICE. After the upgrade, the ALICE TPC will operate with Pb-Pb collisions at an interaction rate of 50 kHz. We present the performance in operation, calibration and reconstruction of the ALICE TPC, together with ongoing work and plans for the near future and the coming 10 years.

  15. Multiple-resolution Modeling of flood processes in urban catchments using WRF-Hydro: A Case Study in south Louisiana.

    Saad, H.; Habib, E. H.

    2017-12-01

    In August 2016, the city of Lafayette and many other urban centers in south Louisiana experienced catastrophic flooding resulting from prolonged rainfall. Statewide, this historic storm displaced more than 30,000 people from their homes, resulted in damages up to $8.7 billion, put rescue workers at risk, interrupted institutions of education and business, and, worst of all, resulted in the loss of life of at least 13 Louisiana residents. With a growing population and increasing signs of climate change, the frequency of major floods and severe storms is expected to increase, as will the impacts of these events on our communities. Local communities need improved capabilities for forecasting flood events, monitoring flood impacts on roads and key infrastructure, and effectively communicating real-time flood dangers at scales that are useful to the public. The current study presents the application of the WRF-Hydro modeling system to represent the integrated hydrologic, hydraulic and hydrometeorological processes that drive flooding in urban basins at temporal and spatial scales that can be useful to local communities. The study site is the 25-square-mile Coulee Mine catchment in Lafayette, south Louisiana. The catchment includes two tributaries with natural streams located within mostly agricultural lands. The catchment crosses the I-10 highway and runs through the metropolitan area of the City of Lafayette into a man-made channel, which eventually drains into the Vermilion River and the Gulf of Mexico. Due to its hydrogeomorphic setting, local and rapid diversification of land uses, low elevation, and interdependent infrastructure, the integrated modeling of this coulee is considered a challenge. A nested multi-scale model is being built using WRF-Hydro, with 500-m and 10-m resolutions for the NOAH land-surface model and the diffusive wave terrain routing grids, respectively.

  16. Technical Note: A new global database of trace gases and aerosols from multiple sources of high vertical resolution measurements

    G. E. Bodeker

    2008-09-01

    A new database of trace gases and aerosols with global coverage, derived from high vertical resolution profile measurements, has been assembled as a collection of binary data files, hereafter referred to as the "Binary DataBase of Profiles" (BDBP). Version 1.0 of the BDBP, described here, includes measurements from different satellite- (HALOE, POAM II and III, SAGE I and II) and ground-based measurement systems (ozonesondes). In addition to the primary product of ozone, secondary measurements of other trace gases, aerosol extinction, and temperature are included. All data are subjected to very strict quality control and for every measurement a percentage error on the measurement is included. To facilitate analyses, each measurement is added to 3 different instances (3 different grids) of the database where measurements are indexed by: (1) geographic latitude, longitude, altitude (in 1 km steps) and time, (2) geographic latitude, longitude, pressure (at levels ~1 km apart) and time, (3) equivalent latitude, potential temperature (8 levels from 300 K to 650 K) and time.

    In contrast to existing zonal mean databases, by including a wider range of measurement sources (both satellite and ozonesondes), the BDBP is sufficiently dense to permit calculation of changes in ozone by latitude, longitude and altitude. In addition, by including other trace gases such as water vapour, this database can be used for comprehensive radiative transfer calculations. By providing the original measurements rather than derived monthly means, the BDBP is applicable to a wider range of applications than databases containing only monthly mean data. Monthly mean zonal mean ozone concentrations calculated from the BDBP are compared with the database of Randel and Wu, which has been used in many earlier analyses. As opposed to that database, which is generated from regression model fits, the BDBP uses the original (quality controlled) measurements with no smoothing applied in any

  17. Sharp metric obstructions for quasi-Einstein metrics

    Case, Jeffrey S.

    2013-02-01

    Using the tractor calculus to study smooth metric measure spaces, we adapt results of Gover and Nurowski to give sharp metric obstructions to the existence of quasi-Einstein metrics on suitably generic manifolds. We do this by introducing an analogue of the Weyl tractor W to the setting of smooth metric measure spaces. The obstructions we obtain can be realized as tensorial invariants which are polynomial in the Riemann curvature tensor and its divergence. By taking suitable limits of their tensorial forms, we then find obstructions to the existence of static potentials, generalizing to higher dimensions a result of Bartnik and Tod, and to the existence of potentials for gradient Ricci solitons.

  18. Rapid Classification and Identification of Multiple Microorganisms with Accurate Statistical Significance via High-Resolution Tandem Mass Spectrometry.

    Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y; Drake, Steven K; Gucek, Marjan; Sacks, David B; Yu, Yi-Kuo

    2018-06-05

    Rapid and accurate identification and classification of microorganisms is of paramount importance to public health and safety. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is complicating correct microbial identification even in a simple sample due to the large number of candidates present. To properly untwine candidate microbes in samples containing one or more microbes, one needs to go beyond apparent morphology or simple "fingerprinting"; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptide-centric representations of microbes to better separate them and by augmenting our earlier analysis method that yields accurate statistical significance. Here, we present an updated analysis workflow that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using 226 MS/MS publicly available data files (each containing from 2500 to nearly 100,000 MS/MS spectra) and 4000 additional MS/MS data files, that the updated workflow can correctly identify multiple microbes at the genus and often the species level for samples containing more than one microbe. We have also shown that the proposed workflow computes accurate statistical significances, i.e., E values for identified peptides and unified E values for identified microbes. Our updated analysis workflow MiCId, a freely available software for Microorganism Classification and Identification, is available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html .

  19. Completion of a Dislocated Metric Space

    P. Sumati Kumari

    2015-01-01

    We provide a construction for the completion of a dislocated metric space (abbreviated d-metric space); we also prove that the completion of the metric associated with a d-metric coincides with the metric associated with the completion of the d-metric.
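
    For background (not stated in the record itself): a dislocated metric is usually taken to satisfy the ordinary metric axioms except that self-distance need not vanish, i.e. for a map d : X × X → [0, ∞),

```latex
\[
\begin{aligned}
& d(x,y) = 0 \implies x = y, \\
& d(x,y) = d(y,x), \\
& d(x,z) \le d(x,y) + d(y,z),
\end{aligned}
\]
% unlike an ordinary metric, $d(x,x) = 0$ is not required.
```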

  20. Metric adjusted skew information

    Hansen, Frank

    2008-01-01

    We extend the concept of Wigner-Yanase-Dyson skew information to something we call "metric adjusted skew information" (of a state with respect to a conserved observable). This "skew information" is intended to be a non-negative quantity bounded by the variance (of an observable in a state) that vanishes for observables commuting with the state. We show that the skew information is a convex function on the manifold of states. It also satisfies other requirements, proposed by Wigner and Yanase, for an effective measure-of-information content of a state relative to a conserved observable. We establish a connection between the geometrical formulation of quantum statistics as proposed by Chentsov and Morozova and measures of quantum information as introduced by Wigner and Yanase and extended in this article. We show that the set of normalized Morozova-Chentsov functions describing the possible...
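
    For orientation (a standard fact, not quoted from the record): the Wigner-Yanase skew information that the metric-adjusted notion generalizes is

```latex
\[
I_\rho(A) \;=\; -\tfrac{1}{2}\,\operatorname{Tr}\!\bigl([\sqrt{\rho},\,A]^2\bigr),
\]
% where $\rho$ is a density matrix (state) and $A$ a conserved observable;
% it is non-negative, bounded by the variance of $A$ in $\rho$, and it
% vanishes when $A$ commutes with $\rho$.
```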

  1. High-resolution 1H NMR spectroscopy of fish muscle, eggs and small whole fish via Hadamard-encoded intermolecular multiple-quantum coherence.

    Honghao Cai

    BACKGROUND AND PURPOSE: Nuclear magnetic resonance (NMR) spectroscopy has become an important technique for tissue studies. Since tissues are in a semisolid state, their high-resolution (HR) spectra cannot be obtained by conventional NMR spectroscopy. Because of this restriction, extraction and high-resolution magic angle spinning (HR MAS) are widely applied to obtain HR NMR spectra of tissues. However, both of these methods are subject to limitations. In this study, the feasibility of HR 1H NMR spectroscopy based on the intermolecular multiple-quantum coherence (iMQC) technique is explored using fish muscle, fish eggs, and a whole fish as examples. MATERIALS AND METHODS: Intact salmon muscle tissues, intact eggs from shishamo smelt and a whole fish (Siamese algae eater) were studied using a conventional 1D one-pulse sequence, a Hadamard-encoded iMQC sequence, and HR MAS. RESULTS: When the conventional 1D one-pulse sequence is used, hardly any useful spectral information can be obtained due to the severe field inhomogeneity. By contrast, HR NMR spectra can be obtained in a short period of time by using the Hadamard-encoded iMQC method without shimming. Most signals from fatty acids and small metabolites can be observed. Compared to HR MAS, the iMQC method is non-invasive, but the resolution and the sensitivity of the resulting spectra are not as high as those of HR MAS spectra. CONCLUSION: Due to its immunity to field inhomogeneity, the iMQC technique can be a proper supplement to HR MAS, and it provides an alternative for investigations in cases with field distortions and with samples unsuitable for spinning. The acquisition time of the proposed method is greatly reduced by the introduction of the Hadamard-encoded technique, in comparison with that of the conventional iMQC method.

  2. The metric system: An introduction

    Lumley, S.M.

    1995-05-01

    On July 13, 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on July 25, 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first we examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

  3. Attack-Resistant Trust Metrics

    Levien, Raph

    The Internet is an amazingly powerful tool for connecting people together, unmatched in human history. Yet, with that power comes great potential for spam and abuse. Trust metrics are an attempt to compute the set of which people are trustworthy and which are likely attackers. This chapter presents two specific trust metrics developed and deployed on the Advogato Website, which is a community blog for free software developers. This real-world experience demonstrates that the trust metrics fulfilled their goals, but that for good results, it is important to match the assumptions of the abstract trust metric computation to the real-world implementation.

  4. The metric system: An introduction

    Lumley, Susan M.

    On 13 Jul. 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on 25 Jul. 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first we examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

  5. Metric-adjusted skew information

    Liang, Cai; Hansen, Frank

    2010-01-01

    We give a truly elementary proof of the convexity of metric-adjusted skew information following an idea of Effros. We extend earlier results of weak forms of superadditivity to general metric-adjusted skew information. Recently, Luo and Zhang introduced the notion of semi-quantum states on a bipartite system and proved superadditivity of the Wigner-Yanase-Dyson skew informations for such states. We extend this result to the general metric-adjusted skew information. We finally show that a recently introduced extension to parameter values 1 < p ≤ 2 is a special case of (unbounded) metric-adjusted skew information.

  6. Two classes of metric spaces

    Isabel Garrido

    2016-04-01

    Full Text Available The class of metric spaces (X,d) known as small-determined spaces, introduced by Garrido and Jaramillo, is defined by means of some type of real-valued Lipschitz functions on X. On the other hand, B-simple metric spaces introduced by Hejcman are defined in terms of some kind of bornologies of bounded subsets of X. In this note we present a common framework where both classes of metric spaces can be studied, which allows us not only to see the relationships between them but also to obtain new internal characterizations of these metric properties.

  7. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    Post, J. V.

    1981-01-01

    Software quality metrics were extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

  8. Alternative kinetic energy metrics for Lagrangian systems

    Sarlet, W.; Prince, G.

    2010-11-01

    We examine Lagrangian systems on R^n with standard kinetic energy terms for the possibility of additional, alternative Lagrangians with kinetic energy metrics different to the Euclidean one. Using the techniques of the inverse problem in the calculus of variations, we find necessary and sufficient conditions for the existence of such Lagrangians. We illustrate the problem in two and three dimensions with quadratic and cubic potentials. As an aside we show that the well-known anomalous Lagrangians for the Coulomb problem can be removed by switching on a magnetic field, providing an appealing resolution of the ambiguous quantizations of the hydrogen atom.

  9. Metrics for border management systems.

    Duggan, Ruth Ann

    2009-07-01

    There are as many unique and disparate manifestations of border systems as there are borders to protect. Border Security is a highly complex system analysis problem with global, regional, national, sector, and border element dimensions for land, water, and air domains. The complexity increases with the multiple, and sometimes conflicting, missions for regulating the flow of people and goods across borders, while securing them for national security. These systems include frontier border surveillance, immigration management and customs functions that must operate in a variety of weather, terrain, operational conditions, cultural constraints, and geopolitical contexts. As part of a Laboratory Directed Research and Development Project 08-684 (Year 1), the team developed a reference framework to decompose this complex system into international/regional, national, and border elements levels covering customs, immigration, and border policing functions. This generalized architecture is relevant to both domestic and international borders. As part of year two of this project (09-1204), the team determined relevant relative measures to better understand border management performance. This paper describes those relative metrics and how they can be used to improve border management systems.

  10. Model assessment using a multi-metric ranking technique

    Fitzpatrick, P. J.; Lau, Y.; Alaka, G.; Marks, F.

    2017-12-01

    Validation comparisons of multiple models present challenges when skill levels are similar, especially in regimes dominated by the climatological mean. Assessing skill separation requires advanced validation metrics and the identification of adeptness in extreme events, while maintaining simplicity for management decisions. Flexibility for operations is also an asset. This work postulates a weighted tally and consolidation technique which ranks results by multiple types of metrics. Variables include absolute error, bias, acceptable absolute error percentages, outlier metrics, model efficiency, Pearson correlation, Kendall's Tau, reliability index, multiplicative gross error, and root mean squared differences. Other metrics, such as rank correlation, were also explored but removed when their information was found to be largely duplicative of other metrics. While equal weights are applied, weights could be altered depending on preferred metrics. Two examples are shown comparing ocean models' currents and tropical cyclone products, including experimental products. The importance of using magnitude and direction for tropical cyclone track forecasts, instead of distance, along-track, and cross-track errors, is discussed. Tropical cyclone intensity and structure prediction are also assessed. Vector correlations are not included in the ranking process but were found useful in an independent context and will be briefly reported.
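
    A minimal sketch of the weighted tally-and-consolidation idea: models are ranked separately under each metric, the (equal) weights multiply the ranks, and the lowest total wins. All numbers and metric choices below are invented for illustration; this is not the authors' code.

        import numpy as np

        # Hypothetical scores: rows = models, columns = metrics
        # (correlation, RMSE, absolute bias), with per-metric orientation.
        scores = np.array([
            [0.82, 1.4, 0.61],   # model A
            [0.79, 1.1, 0.22],   # model B
            [0.90, 1.6, 0.95],   # model C
        ])
        higher_is_better = [True, False, False]
        weights = np.ones(scores.shape[1])       # equal weights, as in the abstract

        # Rank models per metric (rank 1 = best), then consolidate.
        ranks = np.empty_like(scores)
        for j in range(scores.shape[1]):
            order = np.argsort(-scores[:, j] if higher_is_better[j] else scores[:, j])
            ranks[order, j] = np.arange(1, scores.shape[0] + 1)

        tally = (ranks * weights).sum(axis=1)
        print("weighted rank tally (lower is better):", tally)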

  11. Multimetric indices: How many metrics?

    Multimetric indices (MMIs) often include 5 to 15 metrics, each representing a different attribute of assemblage condition, such as species diversity, tolerant taxa, and nonnative taxa. Is there an optimal number of metrics for MMIs? To explore this question, I created 1000 9-met...

  12. Metrical Phonology: German Sound System.

    Tice, Bradley S.

    Metrical phonology, a linguistic process of phonological stress assessment and diagrammatic simplification of sentence and word stress, is discussed as it is found in the English and German languages. The objective is to promote use of metrical phonology as a tool for enhancing instruction in stress patterns in words and sentences, particularly in…

  13. Extending cosmology: the metric approach

    Mendoza, S.

    2012-01-01

    Comment: 2012, Extending Cosmology: The Metric Approach, Open Questions in Cosmology; Review article for an Intech "Open questions in cosmology" book chapter (19 pages, 3 figures). Available from: http://www.intechopen.com/books/open-questions-in-cosmology/extending-cosmology-the-metric-approach

  14. Numerical Calabi-Yau metrics

    Douglas, Michael R.; Karp, Robert L.; Lukic, Sergio; Reinbacher, Rene

    2008-01-01

    We develop numerical methods for approximating Ricci flat metrics on Calabi-Yau hypersurfaces in projective spaces. Our approach is based on finding balanced metrics and builds on recent theoretical work by Donaldson. We illustrate our methods in detail for a one parameter family of quintics. We also suggest several ways to extend our results.

  15. Weyl metrics and wormholes

    Gibbons, Gary W. [DAMTP, University of Cambridge, Wilberforce Road, Cambridge, CB3 0WA U.K. (United Kingdom); Volkov, Mikhail S., E-mail: gwg1@cam.ac.uk, E-mail: volkov@lmpt.univ-tours.fr [Laboratoire de Mathématiques et Physique Théorique, LMPT CNRS—UMR 7350, Université de Tours, Parc de Grandmont, Tours, 37200 France (France)

    2017-05-01

    We study solutions obtained via applying dualities and complexifications to the vacuum Weyl metrics generated by massive rods and by point masses. Rescaling them and extending to complex parameter values yields axially symmetric vacuum solutions containing singularities along circles that can be viewed as singular matter sources. These solutions have wormhole topology with several asymptotic regions interconnected by throats and their sources can be viewed as thin rings of negative tension encircling the throats. For a particular value of the ring tension the geometry becomes exactly flat although the topology remains non-trivial, so that the rings literally produce holes in flat space. To create a single ring wormhole of one metre radius one needs a negative energy equivalent to the mass of Jupiter. Further duality transformations dress the rings with the scalar field, either conventional or phantom. This gives rise to large classes of static, axially symmetric solutions, presumably including all previously known solutions for a gravity-coupled massless scalar field, as for example the spherically symmetric Bronnikov-Ellis wormholes with phantom scalar. The multi-wormholes contain infinite struts everywhere at the symmetry axes, apart from solutions with locally flat geometry.

  16. Metrics for image segmentation

    Rees, Gareth; Greenway, Phil; Morray, Denise

    1998-07-01

    An important challenge in mapping image-processing techniques onto applications is the lack of quantitative performance measures. From a systems engineering perspective these are essential if system level requirements are to be decomposed into sub-system requirements which can be understood in terms of algorithm selection and performance optimization. Nowhere in computer vision is this more evident than in the area of image segmentation. This is a vigorous and innovative research activity, but even after nearly two decades of progress, it remains almost impossible to answer the question 'what would the performance of this segmentation algorithm be under these new conditions?' To begin to address this shortcoming, we have devised a well-principled metric for assessing the relative performance of two segmentation algorithms. This allows meaningful objective comparisons to be made between their outputs. It also estimates the absolute performance of an algorithm given ground truth. Our approach is an information theoretic one. In this paper, we describe the theory and motivation of our method, and present practical results obtained from a range of state of the art segmentation methods. We demonstrate that it is possible to measure the objective performance of these algorithms, and to use the information so gained to provide clues about how their performance might be improved.
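
    The abstract does not reproduce its formulas, but the flavour of an information-theoretic comparison can be conveyed by computing the mutual information between the label maps of two segmentations; the estimator below is a generic illustration, not the authors' metric.

        import numpy as np

        def mutual_information(seg_a, seg_b):
            """Mutual information (bits) between two integer label maps."""
            a, b = np.asarray(seg_a).ravel(), np.asarray(seg_b).ravel()
            joint, _, _ = np.histogram2d(a, b, bins=(a.max() + 1, b.max() + 1))
            pxy = joint / joint.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0                          # avoid log(0) terms
            return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

        # Two toy 4x4 segmentations that agree except at one pixel.
        s1 = np.array([[0, 0, 1, 1]] * 4)
        s2 = s1.copy(); s2[0, 0] = 1
        print(mutual_information(s1, s2))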

  17. Metric regularity and subdifferential calculus

    Ioffe, A D

    2000-01-01

    The theory of metric regularity is an extension of two classical results: the Lyusternik tangent space theorem and the Graves surjection theorem. Developments in non-smooth analysis in the 1980s and 1990s paved the way for a number of far-reaching extensions of these results. It was also well understood that the phenomena behind the results are of metric origin, not connected with any linear structure. At the same time it became clear that some basic hypotheses of the subdifferential calculus are closely connected with the metric regularity of certain set-valued maps. The survey is devoted to the metric theory of metric regularity and its connection with subdifferential calculus in Banach spaces

  18. METRICS DEVELOPMENT FOR PATENTS.

    Veiga, Daniela Francescato; Ferreira, Lydia Masako

    2015-01-01

    To develop a proposal for metrics for patents to be applied in assessing the postgraduate programs of Medicine III - Capes. From the reading and analysis of the 2013 area documents of all 48 areas of Capes, a proposal for metrics for patents was developed to be applied in Medicine III programs. Except for the areas Biotechnology, Food Science, Biological Sciences III, Physical Education, Engineering I, III and IV, and Interdisciplinary, most areas do not adopt a scoring system for patents. The proposal developed was based on the criteria of Biotechnology, with adaptations. In general, deposit, granting, and licensing/production are valued, in ascending order. Higher scores are also assigned to patents registered abroad and whenever students participate. This proposal can be applied to the item Intellectual Production of the evaluation form, in the subsection Technical Production/Patents. The percentages of 10% for academic programs and 40% for professional Masters should be maintained. A program will be scored as Very Good when it reaches 400 points or more; Good, between 200 and 399 points; Regular, between 71 and 199 points; Weak, up to 70 points; Insufficient, no points.

  19. A Metric for Heterotic Moduli

    Candelas, Philip; de la Ossa, Xenia; McOrist, Jock

    2017-12-01

    Heterotic vacua of string theory are realised, at large radius, by a compact threefold with vanishing first Chern class together with a choice of stable holomorphic vector bundle. These form a wide class of potentially realistic four-dimensional vacua of string theory. Despite all their phenomenological promise, there is little understanding of the metric on the moduli space of these. What is sought is the analogue of special geometry for these vacua. The metric on the moduli space is important in phenomenology as it normalises D-terms and Yukawa couplings. It is also of interest in mathematics, since it generalises the metric, first found by Kobayashi, on the space of gauge field connections, to a more general context. Here we construct this metric, correct to first order in α′, in two ways: first by postulating a metric that is invariant under background gauge transformations of the gauge field, and also by dimensionally reducing heterotic supergravity. These methods agree and the resulting metric is Kähler, as is required by supersymmetry. Checking the metric is Kähler is intricate and the anomaly cancellation equation for the H field plays an essential role. The Kähler potential nevertheless takes a remarkably simple form: it is the Kähler potential of special geometry with the Kähler form replaced by the α′-corrected hermitian form.

  20. Quantitative Isotope-Dilution High-Resolution-Mass-Spectrometry Analysis of Multiple Intracellular Metabolites in Clostridium autoethanogenum with Uniformly 13C-Labeled Standards Derived from Spirulina.

    Schatschneider, Sarah; Abdelrazig, Salah; Safo, Laudina; Henstra, Anne M; Millat, Thomas; Kim, Dong-Hyun; Winzer, Klaus; Minton, Nigel P; Barrett, David A

    2018-04-03

    We have investigated the applicability of commercially available lyophilized spirulina (Arthrospira platensis), a microorganism uniformly labeled with ¹³C, as a readily accessible source of multiple ¹³C-labeled metabolites suitable as internal standards for the quantitative determination of intracellular bacterial metabolites. Metabolites of interest were analyzed by hydrophilic-interaction liquid chromatography coupled with high-resolution mass spectrometry. Multiple internal standards obtained from uniformly (U)-¹³C-labeled extracts from spirulina were used to enable isotope-dilution mass spectrometry (IDMS) in the identification and quantification of intracellular metabolites. Extraction of the intracellular metabolites of Clostridium autoethanogenum using 2:1:1 chloroform/methanol/water was found to be the optimal method in comparison with freeze-thaw, homogenization, and sonication methods. The limits of quantification were ≤1 μM, with excellent linearity for all of the calibration curves (R² ≥ 0.99) for 74 metabolites. The precision and accuracy were found to be within relative standard deviations (RSDs) of 15% for 49 of the metabolites and within RSDs of 20% for all of the metabolites. The method was applied to study the effects of feeding different levels of carbon monoxide (as a carbon source) on the central metabolism and Wood-Ljungdahl pathway of C. autoethanogenum grown in continuous culture over 35 days. Using LC-IDMS with U-¹³C spirulina allowed the successful quantification of 52 metabolites in the samples, including amino acids, carboxylic acids, sugar phosphates, purines, and pyrimidines. The method provided absolute quantitative data on intracellular metabolites that were suitable for computational modeling to understand and optimize the C. autoethanogenum metabolic pathways active in gas fermentation.
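
    The arithmetic at the heart of isotope-dilution quantification is a ratio calibration: the analyte peak area is divided by the area of its co-eluting U-¹³C analogue, and that ratio is mapped to concentration through standards. The sketch below is a generic illustration of this idea under invented numbers; it is not the published method's calibration code.

        import numpy as np

        # Calibration standards: known analyte concentrations (uM), each spiked
        # with a fixed amount of the U-13C internal standard from spirulina.
        conc_std = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
        ratio_std = np.array([0.021, 0.103, 0.198, 1.01, 2.05])  # area(12C)/area(13C)

        # Fit ratio = m * conc + c, check linearity, then invert for unknowns.
        m, c = np.polyfit(conc_std, ratio_std, 1)
        r2 = np.corrcoef(conc_std, ratio_std)[0, 1] ** 2          # want R^2 >= 0.99

        sample_ratio = 0.57
        print("concentration (uM):", (sample_ratio - c) / m, "R2:", r2)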

  1. Implications of Metric Choice for Common Applications of Readmission Metrics

    Davies, Sheryl; Saynina, Olga; Schultz, Ellen; McDonald, Kathryn M; Baker, Laurence C

    2013-01-01

    Objective. To quantify the differential impact on hospital performance of three readmission metrics: all-cause readmission (ACR), 3M Potential Preventable Readmission (PPR), and Centers for Medicare and Medicaid 30-day readmission (CMS).

  2. 16-channel analyser with high time-resolution for multiple coincidence experiments

    Brandt, B; Cappeller, U [University of Marburg, Marburg, Federal Republic of Germany (Germany)

    1962-04-15

    In usual slow-fast-coincidence experiments, events are counted which are selected in pulse-height by means of single-channel analysers. It is desirable to extend these techniques to a simultaneous energy-analysis of the pulses, selecting pulse-heights by means of multi-channel analysers. Multi-channel analysers used to perform such multiple coincidence experiments should give information about the amplitudes of arriving pulses at a well-defined time interval after the rising of the analysed pulse. A 16-channel pulse-height analyser is described which permits the generation of a pulse-height signal 2.5 × 10^-6 s after pulse rising, with an uncertainty of less than ±2 × 10^-8 s. The resolving time is less than 5 × 10^-6 s, controlled by an inspector circuit to avoid distortions arising from the pile-up of pulses. The channel width is controlled by a window amplifier, adjustable from 1 V to 7 V. The channel height may be selected from 7 V to 103 V in steps of 0.5 V. (author)

  3. Issues in Benchmark Metric Selection

    Crolotte, Alain

    It is true that a metric can influence a benchmark but will esoteric metrics create more problems than they will solve? We answer this question affirmatively by examining the case of the TPC-D metric which used the much debated geometric mean for the single-stream test. We will show how a simple choice influenced the benchmark and its conduct and, to some extent, DBMS development. After examining other alternatives our conclusion is that the “real” measure for a decision-support benchmark is the arithmetic mean.
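
    The effect debated for TPC-D is easy to reproduce: a geometric mean rewards a large relative improvement on a short query as much as one on a long query, while an arithmetic mean is dominated by the longest-running queries. A toy illustration with invented per-query times:

        import math

        # Hypothetical per-query times (seconds) for two systems over 5 queries.
        sys_a = [100.0, 90.0, 80.0, 5.0, 4.0]
        sys_b = [110.0, 95.0, 85.0, 0.5, 0.4]   # tuned to crush the short queries

        def arithmetic(xs):
            return sum(xs) / len(xs)

        def geometric(xs):
            return math.exp(sum(math.log(x) for x in xs) / len(xs))

        for name, xs in [("A", sys_a), ("B", sys_b)]:
            print(name, round(arithmetic(xs), 2), round(geometric(xs), 2))
        # B wins on the geometric mean despite being slower on every long query.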

  4. Background metric in supergravity theories

    Yoneya, T.

    1978-01-01

    In supergravity theories, we investigate the conformal anomaly of the path-integral determinant and the problem of fermion zero modes in the presence of a nontrivial background metric. Except in SO(3)-invariant supergravity, there are nonvanishing conformal anomalies. As a consequence, amplitudes around the nontrivial background metric contain unpredictable arbitrariness. The fermion zero modes which are explicitly constructed for the Euclidean Schwarzschild metric are interpreted as an indication of the supersymmetric multiplet structure of a black hole. The degree of degeneracy of a black hole is 2^{4n} in SO(n) supergravity.

  5. Generalized Painlevé-Gullstrand metrics

    Lin Chunyu [Department of Physics, National Cheng Kung University, Tainan 70101, Taiwan (China)], E-mail: l2891112@mail.ncku.edu.tw; Soo Chopin [Department of Physics, National Cheng Kung University, Tainan 70101, Taiwan (China)], E-mail: cpsoo@mail.ncku.edu.tw

    2009-02-02

    An obstruction to the implementation of spatially flat Painlevé-Gullstrand (PG) slicings is demonstrated, and explicitly discussed for Reissner-Nordström and Schwarzschild-anti-de Sitter spacetimes. Generalizations of PG slicings which are not spatially flat but which remain regular at the horizons are introduced. These metrics can be obtained from standard spherically symmetric metrics by physical Lorentz boosts. With these generalized PG metrics, problematic contributions to the imaginary part of the action in the Parikh-Wilczek derivation of Hawking radiation due to the obstruction can be avoided.

  6. Daylight metrics and energy savings

    Mardaljevic, John; Heschong, Lisa; Lee, Eleanor

    2009-12-31

    The drive towards sustainable, low-energy buildings has increased the need for simple, yet accurate methods to evaluate whether a daylit building meets minimum standards for energy and human comfort performance. Current metrics account neither for the temporal and spatial aspects of daylight nor for occupants' comfort or interventions. This paper reviews the historical basis of current compliance methods for achieving daylit buildings, proposes a technical basis for the development of better metrics, and provides two case-study examples to stimulate dialogue on how metrics can be applied in a practical, real-world context.

  7. Next-Generation Metrics: Responsible Metrics & Evaluation for Open Science

    Wilsdon, J.; Bar-Ilan, J.; Peters, I.; Wouters, P.

    2016-07-01

    Metrics evoke a mixed reaction from the research community. A commitment to using data to inform decisions makes some enthusiastic about the prospect of granular, real-time analysis of research and its wider impacts. Yet we only have to look at the blunt use of metrics such as journal impact factors, h-indices and grant income targets to be reminded of the pitfalls. Some of the most precious qualities of academic culture resist simple quantification, and individual indicators often struggle to do justice to the richness and plurality of research. Too often, poorly designed evaluation criteria are “dominating minds, distorting behaviour and determining careers” (Lawrence, 2007). Metrics hold real power: they are constitutive of values, identities and livelihoods. How to exercise that power to more positive ends has been the focus of several recent and complementary initiatives, including the San Francisco Declaration on Research Assessment (DORA), the Leiden Manifesto and The Metric Tide (a UK government review of the role of metrics in research management and assessment). Building on these initiatives, the European Commission, under its new Open Science Policy Platform, is now looking to develop a framework for responsible metrics for research management and evaluation, which can be incorporated into the successor framework to Horizon 2020. (Author)

  8. Accurate Molecular Orientation Analysis Using Infrared p-Polarized Multiple-Angle Incidence Resolution Spectrometry (pMAIRS) Considering the Refractive Index of the Thin Film Sample.

    Shioya, Nobutaka; Shimoaka, Takafumi; Murdey, Richard; Hasegawa, Takeshi

    2017-06-01

    Infrared (IR) p-polarized multiple-angle incidence resolution spectrometry (pMAIRS) is a powerful tool for analyzing the molecular orientation in an organic thin film. In particular, pMAIRS works well for a thin film with a highly rough surface, irrespective of the degree of crystallinity. Recently, the optimal experimental conditions have been comprehensively established, which has largely improved the accuracy of the analytical results. Nevertheless, some unresolved matters remain. A structurally isotropic sample, for example, yields different peak intensities in the in-plane and out-of-plane spectra. In the present study, this effect is shown to be due to the refractive index of the sample film, and a correction factor has been developed using rigorous theoretical methods. As a result, with the use of the correction factor, organic materials having atypical refractive indices, such as perfluoroalkyl compounds (n = 1.35) and fullerene (n = 1.83), can be analyzed with an accuracy comparable to that for a compound having a normal refractive index of approximately 1.55. With this improved technique, we are also ready to discriminate an isotropic structure from an oriented sample having the magic angle of 54.7°.

  9. The role of chemometrics in single and sequential extraction assays: a review. Part II. Cluster analysis, multiple linear regression, mixture resolution, experimental design and other techniques.

    Giacomino, Agnese; Abollino, Ornella; Malandrino, Mery; Mentasti, Edoardo

    2011-03-04

    Single and sequential extraction procedures are used for studying element mobility and availability in solid matrices, like soils, sediments, sludge, and airborne particulate matter. In the first part of this review we gave an overview of these procedures and described the applications of chemometric uni- and bivariate techniques and of multivariate pattern recognition techniques based on variable reduction to the experimental results obtained. The second part of the review deals with the use of chemometrics not only for the visualization and interpretation of data, but also for the investigation of the effects of experimental conditions on the response, the optimization of their values and the calculation of element fractionation. We describe the principles of the multivariate chemometric techniques considered, the aims for which they were applied and the key findings obtained. The following topics are critically addressed: pattern recognition by cluster analysis (CA), linear discriminant analysis (LDA) and other less common techniques; modelling by multiple linear regression (MLR); investigation of the spatial distribution of variables by geostatistics; calculation of fractionation patterns by a mixture resolution method (Chemometric Identification of Substrates and Element Distributions, CISED); optimization and characterization of extraction procedures by experimental design; and other multivariate techniques less commonly applied. Copyright © 2010 Elsevier B.V. All rights reserved.

  10. Reconcile: A Coreference Resolution Research Platform

    Stoyanov, V; Cardie, C; Gilbert, N; Riloff, E; Buttler, D; Hysom, D

    2009-10-29

    Despite the availability of standard data sets and metrics, approaches to the problem of noun phrase coreference resolution are hard to compare empirically due to differing evaluation settings stemming, in part, from the lack of comprehensive coreference resolution research platforms. In this tech report we present Reconcile, a coreference resolution research platform that aims to facilitate the implementation of new approaches to coreference resolution as well as the comparison of existing approaches. We discuss Reconcile's architecture and give results of running Reconcile on six data sets using four evaluation metrics, showing that Reconcile's performance is comparable to state-of-the-art systems in coreference resolution.

  11. Multi-Robot Assembly Strategies and Metrics

    MARVEL, JEREMY A.; BOSTELMAN, ROGER; FALCO, JOE

    2018-01-01

    We present a survey of multi-robot assembly applications and methods and describe trends and general insights into the multi-robot assembly problem for industrial applications. We focus on fixtureless assembly strategies featuring two or more robotic systems. Such robotic systems include industrial robot arms, dexterous robotic hands, and autonomous mobile platforms, such as automated guided vehicles. In this survey, we identify the types of assemblies that are enabled by utilizing multiple robots, the algorithms that synchronize the motions of the robots to complete the assembly operations, and the metrics used to assess the quality and performance of the assemblies. PMID:29497234

  12. Multi-Robot Assembly Strategies and Metrics.

    Marvel, Jeremy A; Bostelman, Roger; Falco, Joe

    2018-02-01

    We present a survey of multi-robot assembly applications and methods and describe trends and general insights into the multi-robot assembly problem for industrial applications. We focus on fixtureless assembly strategies featuring two or more robotic systems. Such robotic systems include industrial robot arms, dexterous robotic hands, and autonomous mobile platforms, such as automated guided vehicles. In this survey, we identify the types of assemblies that are enabled by utilizing multiple robots, the algorithms that synchronize the motions of the robots to complete the assembly operations, and the metrics used to assess the quality and performance of the assemblies.

  13. Let's Make Metric Ice Cream

    Zimmerman, Marianna

    1975-01-01

    Describes a classroom activity which involved sixth grade students in a learning situation including making ice cream, safety procedures in a science laboratory, calibrating a thermometer, using metric units of volume and mass. (EB)

  14. Experiential space is hardly metric

    Šikl, Radovan; Šimeček, Michal; Lukavský, Jiří

    2008-01-01

    Roč. 2008, č. 37 (2008), s. 58-58 ISSN 0301-0066. [European Conference on Visual Perception. 24.08-28.08.2008, Utrecht] R&D Projects: GA ČR GA406/07/1676 Institutional research plan: CEZ:AV0Z70250504 Keywords : visual space perception * metric and non-metric perceptual judgments * ecological validity Subject RIV: AN - Psychology

  15. Coverage Metrics for Model Checking

    Penix, John; Visser, Willem; Norvig, Peter (Technical Monitor)

    2001-01-01

    When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.

  16. Phantom metrics with Killing spinors

    W.A. Sabra

    2015-11-01

    Full Text Available We study metric solutions of Einstein–anti-Maxwell theory admitting Killing spinors. The analogue of the IWP metric which admits a space-like Killing vector is found and is expressed in terms of a complex function satisfying the wave equation in flat (2+1)-dimensional space–time. As examples, electric and magnetic Kasner spaces are constructed by allowing the solution to depend only on the time coordinate. Euclidean solutions are also presented.

  17. Quantitative rainfall metrics for comparing volumetric rainfall retrievals to fine scale models

    Collis, Scott; Tao, Wei-Kuo; Giangrande, Scott; Fridlind, Ann; Theisen, Adam; Jensen, Michael

    2013-04-01

    Precipitation processes play a significant role in the energy balance of convective systems, for example through latent heating and evaporative cooling. Heavy precipitation "cores" can also be a proxy for vigorous convection and vertical motions. However, comparisons between rainfall rate retrievals from volumetric remote sensors and forecast rain fields from high-resolution numerical weather prediction simulations are complicated by differences in the location and timing of storm morphological features. This presentation will outline a series of metrics for diagnosing the spatial variability and statistical properties of precipitation maps produced both by models and by retrievals. We include existing metrics such as Contoured Frequency by Altitude Diagrams (Yuter and Houze 1995) and Statistical Coverage Products (May and Lane 2009) and propose new metrics based on morphology, cell and feature based statistics. Work presented focuses on observations from the ARM Southern Great Plains radar network, consisting of three agile X-band radar systems with a very dense coverage pattern and a C-band system providing site-wide coverage. By combining multiple sensors, resolutions of 250 m² can be achieved, allowing improved characterization of fine-scale features. Analyses compare data collected during the Midlatitude Continental Convective Clouds Experiment (MC3E) with simulations of observed systems using the NASA Unified Weather Research and Forecasting model. May, P. T., and T. P. Lane, 2009: A method for using weather radar data to test cloud resolving models. Meteorological Applications, 16, 425-425, doi:10.1002/met.150. Yuter, S. E., and R. A. Houze, 1995: Three-Dimensional Kinematic and Microphysical Evolution of Florida Cumulonimbus. Part II: Frequency Distributions of Vertical Velocity, Reflectivity, and Differential Reflectivity. Mon. Wea. Rev., 123, 1941-1963, doi:10.1175/1520-0493(1995)1232.0.CO;2.
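
    Of the metrics named above, the Contoured Frequency by Altitude Diagram is the most mechanical to compute: at each altitude bin a histogram of the field (here reflectivity) is taken and normalised, giving a frequency distribution as a function of height that can be compared between radar retrievals and model output. A minimal sketch with synthetic data (all values invented):

        import numpy as np

        def cfad(field, heights, value_bins, height_bins):
            """Normalised frequency of `field` values per altitude bin."""
            hist, _, _ = np.histogram2d(heights.ravel(), field.ravel(),
                                        bins=(height_bins, value_bins))
            row_sums = hist.sum(axis=1, keepdims=True)
            return np.divide(hist, row_sums,
                             out=np.zeros_like(hist), where=row_sums > 0)

        # Toy radar volume: reflectivity (dBZ) decreasing with height, plus noise.
        rng = np.random.default_rng(0)
        z = np.linspace(0, 15, 50)[:, None] * np.ones((50, 200))   # heights (km)
        refl = 40 - 2 * z + rng.normal(0, 5, z.shape)
        freq = cfad(refl, z, value_bins=np.arange(-10, 61, 2),
                    height_bins=np.arange(0, 16, 1))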

  18. Investigation of Dysphagia After Antireflux Surgery by High-resolution Manometry: Impact of Multiple Water Swallows and a Solid Test Meal on Diagnosis, Management, and Clinical Outcome.

    Wang, Yu Tien; Tai, Ling Fung; Yazaki, Etsuro; Jafari, Jafar; Sweis, Rami; Tucker, Emily; Knowles, Kevin; Wright, Jeff; Ahmad, Saqib; Kasi, Madhavi; Hamlett, Katharine; Fox, Mark R; Sifrim, Daniel

    2015-09-01

    Management of patients with dysphagia, regurgitation, and related symptoms after antireflux surgery is challenging. This prospective, case-control study tested the hypothesis that, compared with standard high-resolution manometry (HRM) with single water swallows (SWS), adding multiple water swallows (MWS) and a solid test meal increases the diagnostic yield and clinical impact of physiological investigations. Fifty-seven symptomatic and 12 asymptomatic patients underwent HRM with SWS, MWS, and a solid test meal. Dysphagia and reflux were assessed by validated questionnaires. The diagnostic yield of standard and full HRM studies with 24-hour pH-impedance monitoring was compared. Pneumatic dilatation was performed for outlet obstruction on HRM studies. Clinical outcome was assessed by questionnaires and an analogue scale, with "satisfactory" defined as at least 40% symptom improvement requiring no further treatment. Postoperative esophagogastric junction pressure was similar in all groups. Abnormal esophagogastric junction morphology (double high pressure band) was more common in symptomatic than in control patients (13 of 57 vs 0 of 12, P = .004). The diagnostic yield of HRM was 11 (19%), 11 (19%), and 33 of 57 (58%) with SWS, MWS, and solids, respectively, and was highest in patients with dysphagia (19 of 27, 70%). Outlet obstruction was present in 4 (7%), 11 (19%), and 15 of 57 patients (26%) with SWS, MWS, and solids, respectively (P < .009). No asymptomatic control had clinically relevant dysfunction on solid swallows. Dilatation was performed in 12 of 15 patients with outlet obstruction during the test meal. Symptom response was satisfactory, good, or excellent in 7 of 12 (58%), with no serious complications. The addition of MWS and a solid test meal increases the diagnostic yield of HRM studies in patients with symptoms after fundoplication and identifies additional patients with outlet obstruction who benefit from endoscopic dilatation. Copyright © 2015 AGA Institute. Published by Elsevier Inc. All rights reserved.

  19. Heavy metals in soils of Hechuan County in the upper Yangtze (SW China): Comparative pollution assessment using multiple indices with high-spatial-resolution sampling.

    Ni, Maofei; Mao, Rong; Jia, Zhongmin; Dong, Ruozhu; Li, Siyue

    2018-02-01

    In order to assess heavy metals (HMs) in soils of the upper Yangtze Basin, very high-spatial-resolution sampling (582 soil samples) was conducted in Hechuan County, an important agricultural area in Southwest China. Multiple indices, including the geoaccumulation index (Igeo), enrichment factor (EF), sediment pollution index (SPI) and risk index (RI), as well as multivariate statistics, were employed for pollution assessment and source identification of HMs in soils. Our results demonstrated that the averages of eight HMs decreased in the following order: Zn (82.8 ± 15.9) > Cr (71.6 ± 12.2) > Ni (32.1 ± 9.89) > Pb (27.6 ± 13.8) > Cu (25.9 ± 11.8) > As (5.48 ± 3.42) > Cd (0.30 ± 0.077) > Hg (0.082 ± 0.092). Averages of HMs except Cd were lower than the threshold values of the Environmental Quality Standard for Soils, while 43% of samples had Cd concentrations exceeding the national standard, as did 1% of samples for Hg and 5% for Ni; moreover, the Cd and Hg averages were much higher than their background levels. Igeo and EF indicated that enrichment decreased as follows: Cd > Hg > Zn > Pb > Ni > Cu > Cr > As, with moderate enrichment of Cd and Hg. RI indicated that 61.7% of all samples showed moderate risk, while the 6.5% of samples with greater than considerable risk due to human activities should receive more attention. Multivariate analysis showed a lithogenic source for Cu, Cr, Ni and Zn, while Cd and Hg were largely contributed by anthropogenic activities such as agricultural practices. Our study should be helpful for improving soil environmental quality in Southwest China, as well as supplying modern approaches for other areas with soil HM pollution. Copyright © 2017 Elsevier Inc. All rights reserved.
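
    Two of the indices used in the study have compact standard forms: the geoaccumulation index Igeo = log2(Cn / (1.5 Bn)), where Cn is the measured concentration and Bn the geochemical background, and the enrichment factor EF, the metal/reference-element ratio in the sample divided by the same ratio in background material. A hedged sketch (the background values and the choice of reference element below are illustrative, not the study's):

        import math

        def igeo(c_sample, c_background):
            # Mueller's geoaccumulation index; the 1.5 buffers background variation.
            return math.log2(c_sample / (1.5 * c_background))

        def enrichment_factor(c_metal, c_ref, bg_metal, bg_ref):
            # Metal/reference ratio in the sample over the same ratio in background.
            return (c_metal / c_ref) / (bg_metal / bg_ref)

        # Illustrative numbers only (mg/kg), with Fe as the reference element.
        print(igeo(0.30, 0.10))                            # Cd vs assumed background
        print(enrichment_factor(0.30, 30000.0, 0.10, 35000.0))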

  20. THE MASSIVE PROTOSTELLAR CLUSTER NGC 6334I AT 220 au RESOLUTION: DISCOVERY OF FURTHER MULTIPLICITY, DIVERSITY, AND A HOT MULTI-CORE

    Brogan, C. L.; Hunter, T. R.; Indebetouw, R. [NRAO, 520 Edgemont Rd, Charlottesville, VA 22903 (United States); Cyganowski, C. J. [SUPA, School of Physics and Astronomy, University of St. Andrews, North Haugh, St. Andrews KY16 9SS (United Kingdom); Chandler, C. J. [NRAO, P.O. Box 0, Socorro, NM 87801 (United States); Friesen, R., E-mail: cbrogan@nrao.edu [Dunlap Institute for Astronomy and Astrophysics, University of Toronto, Toronto, Ontario, M5S 3H4 (Canada)

    2016-12-01

    We present Very Large Array and Atacama Large Millimeter/submillimeter Array imaging of the deeply embedded protostellar cluster NGC 6334I from 5 cm to 1.3 mm at angular resolutions as fine as 0.″17 (220 au). The dominant hot core MM1 is resolved into seven components at 1.3 mm, clustered within a radius of 1000 au. Four of the components have brightness temperatures >200 K, radii ∼300 au, minimum luminosities ∼10^4 L⊙, and must be centrally heated. We term this new phenomenon a “hot multi-core.” Two of these objects also exhibit compact free–free emission at longer wavelengths, consistent with a hypercompact H II region (MM1B) and a jet (MM1D). The spatial kinematics of the water maser emission centered on MM1D are consistent with it being the origin of the high-velocity bipolar molecular outflow seen in CO. The close proximity of MM1B and MM1D (440 au) suggests a proto-binary or a transient bound system. Several components of MM1 exhibit steep millimeter spectral energy distributions indicative of either unusual dust spectral properties or time variability. In addition to resolving MM1 and the other hot core (MM2) into multiple components, we detect five new millimeter and two new centimeter sources. Water masers are detected for the first time toward MM4A, confirming its membership in the protocluster. With a 1.3 mm brightness temperature of 97 K coupled with a lack of thermal molecular line emission, MM4A appears to be a highly optically thick 240 L⊙ dust core, possibly tracing a transient stage of massive protostellar evolution. The nature of the strongest water maser source CM2 remains unclear due to its combination of non-thermal radio continuum and lack of dust emission.

  1. Scalar-metric and scalar-metric-torsion gravitational theories

    Aldersley, S.J.

    1977-01-01

    The techniques of dimensional analysis and of the theory of tensorial concomitants are employed to study field equations in gravitational theories which incorporate scalar fields of the Brans-Dicke type. Within the context of scalar-metric gravitational theories, a uniqueness theorem for the geometric (or gravitational) part of the field equations is proven and a Lagrangian is determined which is uniquely specified by dimensional analysis. Within the context of scalar-metric-torsion gravitational theories a uniqueness theorem for field Lagrangians is presented and the corresponding Euler-Lagrange equations are given. Finally, an example of a scalar-metric-torsion theory is presented which is similar in many respects to the Brans-Dicke theory and the Einstein-Cartan theory

  2. Advanced reactors: the case for metric design

    Ruby, L.

    1986-01-01

    The author argues that DOE should insist that all design specifications for advanced reactors be in the International System of Units (SI) in accordance with the Metric Conversion Act of 1975. Despite a lack of leadership from the federal government, industry has had to move toward conversion in order to compete on world markets. The US is the only major country without a scheduled conversion program. SI avoids the disadvantages of ambiguous names, non-coherent units, multiple units for the same quantity, multiple definitions, as well as barriers to international exchange and marketing and problems in comparing safety and code parameters. With a first step by DOE, the Nuclear Regulatory Commission should add the same requirements to reactor licensing guidelines. 4 references

  3. Regge calculus from discontinuous metrics

    Khatsymovsky, V.M.

    2003-01-01

    Regge calculus is considered as a particular case of the more general system where the link lengths of any two neighbouring 4-tetrahedra do not necessarily coincide on their common face. This system is treated as one described by a metric that is discontinuous on the faces. In the superspace of all discontinuous metrics, the Regge calculus metrics form a hypersurface defined by continuity conditions. The quantum theory of the discontinuous-metric system is assumed to be fixed somehow in the form of a quantum measure on (the space of functionals on) the superspace. The problem of reducing this measure to the Regge hypersurface is addressed. The quantum Regge calculus measure is defined from a discontinuous metric measure by inserting a δ-function-like phase factor. The requirement that continuity conditions be imposed in a 'face-independent' way fixes this factor uniquely. The term 'face-independent' means that this factor depends only on the (hyper)plane spanned by the face, not on its form and size. This requirement seems natural from the viewpoint of the existence of a well-defined continuum limit maximally free of lattice artefacts.

  4. Symmetries of Taub-NUT dual metrics

    Baleanu, D.; Codoban, S.

    1998-01-01

    Recently geometric duality was analyzed for a metric which admits Killing tensors. An interesting example arises when the manifold has Killing-Yano tensors. The symmetries of the dual metrics in the case of Taub-NUT metric are investigated. Generic and non-generic symmetries of dual Taub-NUT metric are analyzed

  5. Landscape metrics for three-dimension urban pattern recognition

    Liu, M.; Hu, Y.; Zhang, W.; Li, C.

    2017-12-01

    Understanding how landscape pattern determines population or ecosystem dynamics is crucial for managing our landscapes. Urban areas are becoming increasingly dominant social-ecological systems, so it is important to understand patterns of urbanization. Most studies of urban landscape pattern examine land-use maps in two dimensions because the acquisition of 3-dimensional information is difficult. We used Brista software based on Quickbird images and aerial photos to interpret the height of buildings, thus incorporating a 3-dimensional approach. We assessed the feasibility and accuracy of this approach. A total of 164,345 buildings in the Liaoning central urban agglomeration of China, which includes seven cities, were measured. Twelve landscape metrics were proposed or chosen to describe the urban landscape patterns at 2- and 3-dimensional scales. The ecological and social meanings of the landscape metrics were analyzed with multiple correlation analysis. The results showed that classification accuracy compared with field surveys was 87.6%, which means this method for interpreting building height was acceptable. The metrics effectively reflected the urban architecture with respect to number of buildings, area, height, 3-D shape and diversity. We were able to describe the urban characteristics of each city with these metrics. The metrics also captured ecological and social meanings. The proposed landscape metrics provide a new method for urban landscape analysis in three dimensions.
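
    Once each footprint carries an interpreted height, simple 3-dimensional pattern metrics follow directly. The sketch below computes three obvious ones (mean height, total built volume, and a Shannon diversity over height classes) for hypothetical footprint records; the paper's twelve metrics are richer, and these names are illustrative only.

        import math

        # Hypothetical interpreted buildings: (footprint area m^2, height m).
        buildings = [(120, 6), (300, 21), (90, 6), (450, 33), (200, 12)]
        heights = [h for _, h in buildings]

        mean_height = sum(heights) / len(heights)
        total_volume = sum(a * h for a, h in buildings)     # simple prism volume

        # Shannon diversity over 10 m height bands.
        counts = {}
        for h in heights:
            counts[h // 10] = counts.get(h // 10, 0) + 1
        p = [n / len(heights) for n in counts.values()]
        height_diversity = -sum(pi * math.log(pi) for pi in p)

        print(mean_height, total_volume, height_diversity)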

  6. A Kerr-NUT metric

    Vaidya, P.C.; Patel, L.K.; Bhatt, P.V.

    1976-01-01

    Using Galilean time and retarded distance as coordinates, the usual Kerr metric is expressed in a form similar to the Newman-Unti-Tamburino (NUT) metric. The combined Kerr-NUT metric is then investigated. In addition to the Kerr and NUT solutions of Einstein's equations, three other types of solutions are derived. These are (i) the radiating Kerr solution, (ii) the radiating NUT solution satisfying R_{ik} = σ ξ_i ξ_k, ξ_i ξ^i = 0, and (iii) the associated Kerr solution satisfying R_{ik} = 0. Solution (i) is distinct from and simpler than the one reported earlier by Vaidya and Patel (Phys. Rev. D7:3590 (1973)). Solutions (ii) and (iii) give line elements which have the axis of symmetry as a singular line. (author)

  7. Complexity Metrics for Workflow Nets

    Lassen, Kristian Bisgaard; van der Aalst, Wil M.P.

    2009-01-01

    analysts have difficulties grasping the dynamics implied by a process model. Recent empirical studies show that people make numerous errors when modeling complex business processes, e.g., about 20 percent of the EPCs in the SAP reference model have design flaws resulting in potential deadlocks, livelocks, etc. It seems obvious that the complexity of the model contributes to design errors and a lack of understanding. It is not easy to measure complexity, however. This paper presents three complexity metrics that have been implemented in the process analysis tool ProM. The metrics are defined for a subclass of Petri nets named Workflow nets, but the results can easily be applied to other languages. To demonstrate the applicability of these metrics, we have applied our approach and tool to 262 relatively complex Protos models made in the context of various student projects. This allows us to validate

  8. The uniqueness of the Fisher metric as information metric

    Le, Hong-Van

    2017-01-01

    Roč. 69, č. 4 (2017), s. 879-896 ISSN 0020-3157 Institutional support: RVO:67985840 Keywords : Chentsov’s theorem * mixed topology * monotonicity of the Fisher metric Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 1.049, year: 2016 https://link.springer.com/article/10.1007%2Fs10463-016-0562-0

  9. Thermodynamic metrics and optimal paths.

    Sivak, David A; Crooks, Gavin E

    2012-05-11

    A fundamental problem in modern thermodynamics is how a molecular-scale machine performs useful work, while operating away from thermal equilibrium without excessive dissipation. To this end, we derive a friction tensor that induces a Riemannian manifold on the space of thermodynamic states. Within the linear-response regime, this metric structure controls the dissipation of finite-time transformations, and bestows optimal protocols with many useful properties. We discuss the connection to the existing thermodynamic length formalism, and demonstrate the utility of this metric by solving for optimal control parameter protocols in a simple nonequilibrium model.
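
    In the linear-response picture sketched above, the dissipation of a finite-time protocol λ(t) is governed by the friction tensor, and the induced Riemannian structure defines a thermodynamic length. Schematically (our notation, following the general form of such results rather than the paper's exact equations):

        \[
        \mathcal{L} = \int_0^{\tau} \sqrt{\dot{\lambda}^i \,\zeta_{ij}(\lambda)\, \dot{\lambda}^j}\, dt ,
        \qquad
        \langle W_{\mathrm{ex}} \rangle \approx \int_0^{\tau} \dot{\lambda}^i \,\zeta_{ij}(\lambda)\, \dot{\lambda}^j \, dt \;\geq\; \frac{\mathcal{L}^2}{\tau},
        \]

    so minimum-dissipation protocols traverse geodesics of the metric ζ at constant speed, which is the sense in which the metric "bestows optimal protocols with many useful properties".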

  10. Invariant metrics for Hamiltonian systems

    Rangarajan, G.; Dragt, A.J.; Neri, F.

    1991-05-01

    In this paper, invariant metrics are constructed for Hamiltonian systems. These metrics give rise to norms on the space of homogeneous polynomials of phase-space variables. For an accelerator lattice described by a Hamiltonian, these norms characterize the nonlinear content of the lattice. Therefore, the performance of the lattice can be improved by minimizing the norm as a function of parameters describing the beam-line elements in the lattice. A four-fold increase in the dynamic aperture of a model FODO cell is obtained using this procedure. 7 refs

  11. Generalization of Vaidya's radiation metric

    Gleiser, R J; Kozameh, C N [Universidad Nacional de Cordoba (Argentina). Instituto de Matematica, Astronomia y Fisica

    1981-11-01

    In this paper it is shown that if Vaidya's radiation metric is considered from the point of view of kinetic theory in general relativity, the corresponding phase-space distribution function can be generalized in a particular way. The new family of spherically symmetric radiation metrics obtained contains Vaidya's as a limiting situation. The Einstein field equations are solved in a "comoving" coordinate system. Two arbitrary functions of a single variable are introduced in the process of solving these equations. Particular examples considered are a stationary solution, a nonvacuum solution depending on a single parameter, and several limiting situations.

  12. Technical Privacy Metrics: a Systematic Survey

    Wagner, Isabel; Eckhoff, David

    2018-01-01

    The goal of privacy metrics is to measure the degree of privacy enjoyed by users in a system and the amount of protection offered by privacy-enhancing technologies. In this way, privacy metrics contribute to improving user privacy in the digital world. The diversity and complexity of privacy metrics in the literature makes an informed choice of metrics challenging. As a result, instead of using existing metrics, n...

  13. Remarks on G-Metric Spaces

    Bessem Samet

    2013-01-01

    Full Text Available In 2005, Mustafa and Sims (2006) introduced and studied a new class of generalized metric spaces, which are called G-metric spaces, as a generalization of metric spaces. We establish some useful propositions to show that many fixed point theorems on (nonsymmetric) G-metric spaces given recently by many authors follow directly from well-known theorems on metric spaces. Our technique can be easily extended to other results as shown in application.
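
    For orientation, the G-metric axioms of Mustafa and Sims can be stated as follows (our transcription of the standard definition; consult the original paper for the authoritative form): a G-metric on a set X is a map G : X × X × X → [0, ∞) such that

        \begin{align*}
        &\text{(G1)}\ G(x,y,z) = 0 \ \text{iff}\ x = y = z,\\
        &\text{(G2)}\ G(x,x,y) > 0 \ \text{for}\ x \neq y,\\
        &\text{(G3)}\ G(x,x,y) \leq G(x,y,z) \ \text{for}\ z \neq y,\\
        &\text{(G4)}\ G(x,y,z)\ \text{is symmetric in its three variables},\\
        &\text{(G5)}\ G(x,y,z) \leq G(x,a,a) + G(a,y,z) \ \text{for all}\ a \in X.
        \end{align*}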

  14. DLA Energy Biofuel Feedstock Metrics Study

    2012-12-11

    Metric 1: State invasiveness ranking (moderately/highly invasive). Metric 2: Genetically modified organism (GMO) hazard (Yes/No and hazard category). Metric 3: Species hybridization. These metrics are scored across the biofuel life-cycle stages (stage 4: biofuel distribution; stage 5: biofuel use), with stage-by-stage entries such as Metric 1, State invasiveness ranking: Yes, Minimal, Minimal, No, No; and Metric 2, GMO hazard: Yes. ... may utilize GMO microbial or microalgae species across the applicable biofuel life cycles (stages 1–3). The following consequence Metrics 4–6 then

  15. High Resolution Mapping of Soil Properties Using Remote Sensing Variables in South-Western Burkina Faso: A Comparison of Machine Learning and Multiple Linear Regression Models.

    Forkuor, Gerald; Hounkpatin, Ozias K L; Welp, Gerhard; Thiel, Michael

    2017-01-01

    Accurate and detailed spatial soil information is essential for environmental modelling, risk assessment and decision making. The use of Remote Sensing data as a secondary source of information in digital soil mapping has been found to be cost effective and less time consuming compared to traditional soil mapping approaches. But the potential of Remote Sensing data to improve knowledge of local-scale soil information in West Africa has not been fully explored. This study investigated the use of high spatial resolution satellite data (RapidEye and Landsat), terrain/climatic data and laboratory-analysed soil samples to map the spatial distribution of six soil properties (sand, silt, clay, cation exchange capacity (CEC), soil organic carbon (SOC) and nitrogen) in a 580 km² agricultural watershed in south-western Burkina Faso. Four statistical prediction models (multiple linear regression (MLR), random forest regression (RFR), support vector machine (SVM), and stochastic gradient boosting (SGB)) were tested and compared. Internal validation was conducted by cross validation, while the predictions were validated against an independent set of soil samples considering the modelling area and an extrapolation area. Model performance statistics revealed that the machine learning techniques performed marginally better than the MLR, with the RFR providing in most cases the highest accuracy. The inability of MLR to handle non-linear relationships between dependent and independent variables was found to be a limitation in accurately predicting soil properties at unsampled locations. Satellite data acquired during ploughing or early crop development stages (e.g. May, June) were found to be the most important spectral predictors, while elevation, temperature and precipitation came up as prominent terrain/climatic variables in predicting soil properties. The results further showed that shortwave infrared and near infrared channels of Landsat8 as well as soil specific indices of redness
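
    The model comparison described above is straightforward to reproduce in outline with scikit-learn. This sketch compares MLR and random forest regression by cross-validated R² on synthetic data standing in for the spectral/terrain covariates and a measured soil property; every name and setting here is an illustrative assumption, not the study's configuration.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 12))        # stand-in spectral/terrain covariates
        y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.2, 200)

        for name, model in [("MLR", LinearRegression()),
                            ("RFR", RandomForestRegressor(n_estimators=300,
                                                          random_state=0))]:
            r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
            print(name, round(float(r2.mean()), 3))
        # The non-linear target favours the forest, mirroring the reported finding.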

  16. High Resolution Mapping of Soil Properties Using Remote Sensing Variables in South-Western Burkina Faso: A Comparison of Machine Learning and Multiple Linear Regression Models.

    Gerald Forkuor

    Full Text Available Accurate and detailed spatial soil information is essential for environmental modelling, risk assessment and decision making. The use of Remote Sensing data as a secondary source of information in digital soil mapping has been found to be cost effective and less time consuming compared to traditional soil mapping approaches. But the potential of Remote Sensing data to improve knowledge of local-scale soil information in West Africa has not been fully explored. This study investigated the use of high spatial resolution satellite data (RapidEye and Landsat), terrain/climatic data and laboratory-analysed soil samples to map the spatial distribution of six soil properties (sand, silt, clay, cation exchange capacity (CEC), soil organic carbon (SOC) and nitrogen) in a 580 km² agricultural watershed in south-western Burkina Faso. Four statistical prediction models (multiple linear regression (MLR), random forest regression (RFR), support vector machine (SVM), and stochastic gradient boosting (SGB)) were tested and compared. Internal validation was conducted by cross validation, while the predictions were validated against an independent set of soil samples considering the modelling area and an extrapolation area. Model performance statistics revealed that the machine learning techniques performed marginally better than the MLR, with the RFR providing in most cases the highest accuracy. The inability of MLR to handle non-linear relationships between dependent and independent variables was found to be a limitation in accurately predicting soil properties at unsampled locations. Satellite data acquired during ploughing or early crop development stages (e.g. May, June) were found to be the most important spectral predictors, while elevation, temperature and precipitation came up as prominent terrain/climatic variables in predicting soil properties. The results further showed that shortwave infrared and near infrared channels of Landsat8 as well as soil specific indices

  17. Monitoring Cloud-prone Complex Landscapes At Multiple Spatial Scales Using Medium And High Resolution Optical Data: A Case Study In Central Africa

    Basnet, Bikash

    Tracking land surface dynamics over cloud-prone areas with complex mountainous terrain and a landscape that is heterogeneous at a scale of approximately 10 m is an important challenge in the remote sensing of tropical regions in developing nations, due to the small plot sizes. Persistent monitoring of natural resources in these regions at multiple spatial scales requires development of tools to identify emerging land cover transformation due to anthropogenic causes, such as agricultural expansion and climate change. Along with the cloud cover and obstructions by topographic distortions due to steep terrain, there are limitations to the accuracy of monitoring change using available historical satellite imagery, largely due to sparse data access and the lack of high quality ground truth for classifier training. One such complex region is the Lake Kivu region in Central Africa: a biodiversity hotspot with a complex and heterogeneous landscape and intensive agricultural development, where individual plot sizes are often at the scale of 10 m. This work addressed these problems to create an effective process for monitoring the region. Procedures were developed that use optical data from satellite and aerial observations at multiple scales to tackle the monitoring challenges. First, a novel processing chain was developed to systematically monitor the spatio-temporal land cover dynamics of this region over the years 1988, 2001, and 2011 using Landsat data, complemented by ancillary data. Topographic compensation was performed on Landsat reflectances to avoid strong illumination angle impacts, and image compositing was used to compensate for frequent cloud cover and thus incomplete annual data availability in the archive. A systematic supervised classification, using the state-of-the-art machine learning classifier Random Forest, was applied to the composite Landsat imagery to obtain land cover thematic maps
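
    The compositing-then-classification chain described above can be sketched as follows; this is an illustrative outline with synthetic data, not the study's processing chain, and it assumes the per-date reflectance stacks have already been topographically corrected.

```python
# Sketch of median compositing over cloud-masked acquisitions followed by
# Random Forest classification; all data and class labels are synthetic
# placeholders for corrected Landsat reflectances and ground truth.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
# Three cloud-affected acquisitions of a 100x100 scene with 6 bands; NaN = cloud.
dates = rng.uniform(0, 1, size=(3, 100, 100, 6))
dates[rng.uniform(size=dates.shape) < 0.2] = np.nan
composite = np.nanmedian(dates, axis=0)          # per-pixel median composite

pixels = composite.reshape(-1, 6)
pixels = np.nan_to_num(pixels, nan=0.0)          # fill pixels cloudy on every date
labels = rng.integers(0, 4, size=pixels.shape[0])  # 4 dummy land cover classes

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(pixels, labels)                          # in practice: train on labelled pixels only
land_cover = clf.predict(pixels).reshape(100, 100)
```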

  18. METRIC EVALUATION PIPELINE FOR 3D MODELING OF URBAN SCENES

    M. Bosch

    2017-05-01

    Full Text Available Publicly available benchmark data and metric evaluation approaches have been instrumental in enabling research to advance state-of-the-art methods for remote sensing applications in urban 3D modeling. Most publicly available benchmark datasets have consisted of high resolution airborne imagery and lidar suitable for 3D modeling on a relatively modest scale. To enable research in larger scale 3D mapping, we have recently released a public benchmark dataset with multi-view commercial satellite imagery and metrics to compare 3D point clouds with lidar ground truth. We now define a more complete metric evaluation pipeline developed as publicly available open source software to assess semantically labeled 3D models of complex urban scenes derived from multi-view commercial satellite imagery. Evaluation metrics in our pipeline include horizontal and vertical accuracy and completeness, volumetric completeness and correctness, perceptual quality, and model simplicity. Sources of ground truth include airborne lidar and overhead imagery, and we demonstrate a semi-automated process for producing accurate ground truth shape files to characterize building footprints. We validate our current metric evaluation pipeline using 3D models produced using open source multi-view stereo methods. Data and software are made publicly available to enable further research and planned benchmarking activities.
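
    The completeness/correctness style of metric mentioned above can be illustrated with a distance-threshold comparison between a reconstructed point cloud and lidar ground truth. This is a generic sketch under assumed data and an assumed threshold, not the pipeline's actual implementation.

```python
# Illustrative completeness/correctness for a 3D reconstruction against
# lidar ground truth; points and the 1 m threshold are assumptions.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)
ground_truth = rng.uniform(0, 100, size=(5000, 3))   # lidar points (synthetic)
reconstruction = ground_truth[:4000] + rng.normal(scale=0.3, size=(4000, 3))

threshold = 1.0  # metres; the acceptance distance is a modelling choice
# Completeness: fraction of ground-truth points with a reconstructed point nearby.
d_gt, _ = cKDTree(reconstruction).query(ground_truth)
completeness = np.mean(d_gt < threshold)
# Correctness: fraction of reconstructed points near some ground-truth point.
d_rec, _ = cKDTree(ground_truth).query(reconstruction)
correctness = np.mean(d_rec < threshold)
print(f"completeness={completeness:.2f}, correctness={correctness:.2f}")
```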

  19. Metric Evaluation Pipeline for 3d Modeling of Urban Scenes

    Bosch, M.; Leichtman, A.; Chilcott, D.; Goldberg, H.; Brown, M.

    2017-05-01

    Publicly available benchmark data and metric evaluation approaches have been instrumental in enabling research to advance state-of-the-art methods for remote sensing applications in urban 3D modeling. Most publicly available benchmark datasets have consisted of high resolution airborne imagery and lidar suitable for 3D modeling on a relatively modest scale. To enable research in larger scale 3D mapping, we have recently released a public benchmark dataset with multi-view commercial satellite imagery and metrics to compare 3D point clouds with lidar ground truth. We now define a more complete metric evaluation pipeline developed as publicly available open source software to assess semantically labeled 3D models of complex urban scenes derived from multi-view commercial satellite imagery. Evaluation metrics in our pipeline include horizontal and vertical accuracy and completeness, volumetric completeness and correctness, perceptual quality, and model simplicity. Sources of ground truth include airborne lidar and overhead imagery, and we demonstrate a semi-automated process for producing accurate ground truth shape files to characterize building footprints. We validate our current metric evaluation pipeline using 3D models produced using open source multi-view stereo methods. Data and software are made publicly available to enable further research and planned benchmarking activities.

  20. Separable metrics and radiating stars

    We study the junction condition relating the pressure to the heat flux at the boundary of an accelerating and expanding spherically symmetric radiating star. We transform the junction condition to an ordinary differential equation by making a separability assumption on the metric functions in the space–time variables.
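
    Schematically, the separability step can be written as below. The shear-free line element and the product ansatz are illustrative assumptions chosen for exposition; the paper's exact metric functions may differ.

```latex
% Illustrative separable ansatz for a spherically symmetric radiating star.
% The boundary condition p_\Sigma = (qB)_\Sigma is the standard junction
% condition for radiating stars; the factorization below is an assumed
% example of "separability in the space-time variables".
ds^2 = -A^2(r,t)\,dt^2 + B^2(r,t)\left[dr^2 + r^2\left(d\theta^2 + \sin^2\theta\, d\phi^2\right)\right],
\qquad A(r,t) = a(r)\,\alpha(t), \quad B(r,t) = b(r)\,\beta(t).
% With such an ansatz the radial dependence factors out of the junction
% condition, leaving an ordinary differential equation in t alone.
```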

  1. Socio-technical security metrics

    Gollmann, D.; Herley, C.; Koenig, V.; Pieters, W.; Sasse, M.A.

    2015-01-01

    Report from Dagstuhl seminar 14491. This report documents the program and the outcomes of Dagstuhl Seminar 14491 “Socio-Technical Security Metrics”. In the domain of safety, metrics inform many decisions, from the height of new dikes to the design of nuclear plants. We can state, for example, that

  2. Leading Gainful Employment Metric Reporting

    Powers, Kristina; MacPherson, Derek

    2016-01-01

    This chapter will address the importance of intercampus involvement in reporting of gainful employment student-level data that will be used in the calculation of gainful employment metrics by the U.S. Department of Education. The authors will discuss why building relationships within the institution is critical for effective gainful employment…

  3. Human Performance Optimization Metrics: Consensus Findings, Gaps, and Recommendations for Future Research.

    Nindl, Bradley C; Jaffin, Dianna P; Dretsch, Michael N; Cheuvront, Samuel N; Wesensten, Nancy J; Kent, Michael L; Grunberg, Neil E; Pierce, Joseph R; Barry, Erin S; Scott, Jonathan M; Young, Andrew J; O'Connor, Francis G; Deuster, Patricia A

    2015-11-01

    Human performance optimization (HPO) is defined as "the process of applying knowledge, skills and emerging technologies to improve and preserve the capabilities of military members, and organizations to execute essential tasks." The lack of consensus for operationally relevant and standardized metrics that meet joint military requirements has been identified as the single most important gap for research and application of HPO. In 2013, the Consortium for Health and Military Performance hosted a meeting to develop a toolkit of standardized HPO metrics for use in military and civilian research, and potentially for field applications by commanders, units, and organizations. Performance was considered from a holistic perspective as being influenced by various behaviors and barriers. To accomplish the goal of developing a standardized toolkit, key metrics were identified and evaluated across a spectrum of domains that contribute to HPO: physical performance, nutritional status, psychological status, cognitive performance, environmental challenges, sleep, and pain. These domains were chosen based on relevant data with regard to performance enhancers and degraders. The specific objectives at this meeting were to (a) identify and evaluate current metrics for assessing human performance within selected domains; (b) prioritize metrics within each domain to establish a human performance assessment toolkit; and (c) identify scientific gaps and the needed research to more effectively assess human performance across domains. This article provides a summary of 150 total HPO metrics across multiple domains that can be used as a starting point-the beginning of an HPO toolkit: physical fitness (29 metrics), nutrition (24 metrics), psychological status (36 metrics), cognitive performance (35 metrics), environment (12 metrics), sleep (9 metrics), and pain (5 metrics). These metrics can be particularly valuable as the military emphasizes a renewed interest in Human Dimension efforts

  4. Metrics for measuring net-centric data strategy implementation

    Kroculick, Joseph B.

    2010-04-01

    An enterprise data strategy outlines an organization's vision and objectives for improved collection and use of data. We propose generic metrics and quantifiable measures for each of the DoD Net-Centric Data Strategy (NCDS) data goals. Data strategy metrics can be adapted to the business processes of an enterprise and the needs of stakeholders in leveraging the organization's data assets to provide for more effective decision making. Generic metrics are applied to a specific application where logistics supply and transportation data is integrated across multiple functional groups. A dashboard presents a multidimensional view of the current progress toward a state where logistics data is shared in a timely and seamless manner among users, applications, and systems.

  5. Metrics correlation and analysis service (MCAS)

    Baranovski, Andrew; Dykstra, Dave; Garzoglio, Gabriele; Hesselroth, Ted; Mhashilkar, Parag; Levshina, Tanya

    2010-01-01

    The complexity of Grid workflow activities and their associated software stacks inevitably involves multiple organizations, ownership, and deployment domains. In this setting, important and common tasks such as the correlation and display of metrics and debugging information (fundamental ingredients of troubleshooting) are challenged by the informational entropy inherent to independently maintained and operated software components. Because such an information pool is disorganized, it is a difficult environment for business intelligence analysis, i.e. troubleshooting, incident investigation, and trend spotting. The mission of the MCAS project is to deliver a software solution to help with adaptation, retrieval, correlation, and display of workflow-driven data and of type-agnostic events, generated by loosely coupled or fully decoupled middleware.

  6. Metrics correlation and analysis service (MCAS)

    Baranovski, Andrew; Dykstra, Dave; Garzoglio, Gabriele; Hesselroth, Ted; Mhashilkar, Parag; Levshina, Tanya

    2009-01-01

    The complexity of Grid workflow activities and their associated software stacks inevitably involves multiple organizations, ownership, and deployment domains. In this setting, important and common tasks such as the correlation and display of metrics and debugging information (fundamental ingredients of troubleshooting) are challenged by the informational entropy inherent to independently maintained and operated software components. Because such an information 'pond' is disorganized, it is a difficult environment for business intelligence analysis, i.e. troubleshooting, incident investigation and trend spotting. The mission of the MCAS project is to deliver a software solution to help with adaptation, retrieval, correlation, and display of workflow-driven data and of type-agnostic events, generated by disjoint middleware.

  7. Group covariance and metrical theory

    Halpern, L.

    1983-01-01

    The a priori introduction of a Lie group of transformations into a physical theory has often proved to be useful; it usually serves to describe special simplified conditions before a general theory can be worked out. Newton's assumptions of absolute space and time are examples where the Euclidean group and translation group have been introduced. These groups were extended to the Galilei group and modified in the special theory of relativity to the Poincare group to describe physics under the given conditions covariantly in the simplest way. The criticism of the a priori character leads to the formulation of the general theory of relativity. The general metric theory does not really give preference to a particular invariance group - even the principle of equivalence can be adapted to a whole family of groups. The physical laws covariantly inserted into the metric space are however adapted to the Poincare group. 8 references

  8. General relativity: An erfc metric

    Plamondon, Réjean

    2018-06-01

    This paper proposes an erfc potential to incorporate in a symmetric metric. One key feature of this model is that it relies on the existence of an intrinsic physical constant σ, a star-specific proper length that scales all its surroundings. Based thereon, the new metric is used to study the space-time geometry of a static symmetric massive object, as seen from its interior. The analytical solutions to the Einstein equation are presented, highlighting the absence of singularities and discontinuities in such a model. The geodesics are derived in their second- and first-order differential formats. Recalling the slight impact of the new model on the classical general relativity tests in the solar system, a number of facts and open problems are briefly revisited on the basis of a heuristic definition of σ. A special attention is given to gravitational collapses and non-singular black holes.

  9. hdm: High-dimensional metrics

    Chernozhukov, Victor; Hansen, Christian; Spindler, Martin

    2016-01-01

    In this article the package High-dimensional Metrics (hdm) is introduced. It is a collection of statistical methods for estimation and quantification of uncertainty in high-dimensional approximately sparse models. It focuses on providing confidence intervals and significance testing for (possibly many) low-dimensional subcomponents of the high-dimensional parameter vector. Efficient estimators and uniformly valid confidence intervals for regression coefficients on target variables (e...

  10. Multi-Metric Sustainability Analysis

    Cowlin, Shannon [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heimiller, Donna [National Renewable Energy Lab. (NREL), Golden, CO (United States); Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Pless, Jacquelyn [National Renewable Energy Lab. (NREL), Golden, CO (United States); Munoz, David [Colorado School of Mines, Golden, CO (United States)

    2014-12-01

    A readily accessible framework that allows for evaluating impacts and comparing tradeoffs among factors in energy policy, expansion planning, and investment decision making is lacking. Recognizing this, the Joint Institute for Strategic Energy Analysis (JISEA) funded an exploration of multi-metric sustainability analysis (MMSA) to provide energy decision makers with a means to make more comprehensive comparisons of energy technologies. The resulting MMSA tool lets decision makers simultaneously compare technologies and potential deployment locations.

  11. Metric reconstruction from Weyl scalars

    Whiting, Bernard F; Price, Larry R [Department of Physics, PO Box 118440, University of Florida, Gainesville, FL 32611 (United States)

    2005-08-07

    The Kerr geometry has remained an elusive world in which to explore physics and delve into the more esoteric implications of general relativity. Following the discovery, by Kerr in 1963, of the metric for a rotating black hole, the most significant advance has been an understanding of its Weyl curvature perturbations based on Teukolsky's discovery of separable wave equations some ten years later. In the current research climate, where experiments across the globe are preparing for the first detection of gravitational waves, a more complete understanding, concerning more than just the Weyl curvature, is now called for. To understand precisely how comparatively small masses move in response to the gravitational waves they emit, a formalism has been developed based on a description of the whole spacetime metric perturbation in the neighbourhood of the emission region. Presently, such a description is not available for the Kerr geometry. While there does exist a prescription for obtaining metric perturbations once curvature perturbations are known, it has become apparent that there are gaps in that formalism which are still waiting to be filled. The most serious gaps include gauge inflexibility, the inability to include sources-which are essential when the emitting masses are considered-and the failure to describe the l = 0 and 1 perturbation properties. Among these latter properties of the perturbed spacetime, arising from a point mass in orbit, are the perturbed mass and axial component of angular momentum, as well as the very elusive Carter constant for non-axial angular momentum. A status report is given on recent work which begins to repair these deficiencies in our current incomplete description of Kerr metric perturbations.

  12. Metric reconstruction from Weyl scalars

    Whiting, Bernard F; Price, Larry R

    2005-01-01

    The Kerr geometry has remained an elusive world in which to explore physics and delve into the more esoteric implications of general relativity. Following the discovery, by Kerr in 1963, of the metric for a rotating black hole, the most significant advance has been an understanding of its Weyl curvature perturbations based on Teukolsky's discovery of separable wave equations some ten years later. In the current research climate, where experiments across the globe are preparing for the first detection of gravitational waves, a more complete understanding, concerning more than just the Weyl curvature, is now called for. To understand precisely how comparatively small masses move in response to the gravitational waves they emit, a formalism has been developed based on a description of the whole spacetime metric perturbation in the neighbourhood of the emission region. Presently, such a description is not available for the Kerr geometry. While there does exist a prescription for obtaining metric perturbations once curvature perturbations are known, it has become apparent that there are gaps in that formalism which are still waiting to be filled. The most serious gaps include gauge inflexibility, the inability to include sources-which are essential when the emitting masses are considered-and the failure to describe the l = 0 and 1 perturbation properties. Among these latter properties of the perturbed spacetime, arising from a point mass in orbit, are the perturbed mass and axial component of angular momentum, as well as the very elusive Carter constant for non-axial angular momentum. A status report is given on recent work which begins to repair these deficiencies in our current incomplete description of Kerr metric perturbations.

  13. Sustainability Metrics: The San Luis Basin Project

    Sustainability is about promoting humanly desirable dynamic regimes of the environment. Metrics: ecological footprint, net regional product, exergy, emergy, and Fisher Information. Adaptive management: (1) metrics assess problem, (2) specific problem identified, and (3) managemen...

  14. An Innovative Metric to Evaluate Satellite Precipitation's Spatial Distribution

    Liu, H.; Chu, W.; Gao, X.; Sorooshian, S.

    2011-12-01

    Thanks to their capability to cover mountains, where ground measurement instruments cannot reach, satellites provide a good means of estimating precipitation over mountainous regions. In regions with complex terrain, accurate information on the high-resolution spatial distribution of precipitation is critical for many important issues, such as flood/landslide warning, reservoir operation, water system planning, etc. Therefore, in order to be useful in many practical applications, satellite precipitation products should possess high quality in characterizing spatial distribution. However, most existing validation metrics, which are based on point/grid comparison using simple statistics, cannot effectively measure a satellite's skill in capturing the spatial patterns of precipitation fields. This deficiency results from the fact that point/grid-wise comparison does not take into account the spatial coherence of precipitation fields. Furthermore, another weakness of many metrics is that they can barely provide information on why satellite products perform well or poorly. Motivated by our recent findings of the consistent spatial patterns of the precipitation field over the western U.S., we developed a new metric utilizing EOF analysis and Shannon entropy. The metric can be derived in two steps: 1) capture the dominant spatial patterns of precipitation fields from both satellite products and reference data through EOF analysis, and 2) compute the similarities between the corresponding dominant patterns using a mutual information measurement defined with Shannon entropy. Instead of individual points/grids, the new metric treats the entire precipitation field simultaneously, naturally taking advantage of spatial dependence. Since the dominant spatial patterns are shaped by physical processes, the new metric can shed light on why a satellite product can or cannot capture the spatial patterns. For demonstration, an experiment was carried out to evaluate a satellite
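
    The two-step construction described above can be sketched as follows; PCA is used as a stand-in for EOF analysis, the precipitation fields are synthetic, and the binning used to estimate mutual information is an assumed implementation detail.

```python
# Sketch of the two-step metric: (1) dominant spatial patterns via EOF
# (here: PCA over a time x space matrix), (2) mutual information between
# corresponding patterns. Data and bin count are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(3)
# time x space matrices (e.g. daily precipitation over a flattened grid)
reference = rng.gamma(2.0, 1.0, size=(365, 400))
satellite = reference + rng.normal(scale=0.5, size=reference.shape)

# Step 1: leading EOFs (spatial patterns) of each field.
eof_ref = PCA(n_components=3).fit(reference).components_
eof_sat = PCA(n_components=3).fit(satellite).components_

# Step 2: mutual information between corresponding patterns,
# estimated by discretizing pattern values into bins.
def pattern_mi(a, b, bins=16):
    a_binned = np.digitize(a, np.histogram_bin_edges(a, bins))
    b_binned = np.digitize(b, np.histogram_bin_edges(b, bins))
    return mutual_info_score(a_binned, b_binned)

scores = [pattern_mi(eof_ref[k], eof_sat[k]) for k in range(3)]
print("per-pattern mutual information:", np.round(scores, 3))
```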

  15. Crowdsourcing metrics of digital collections

    Tuula Pääkkönen

    2015-12-01

    Full Text Available In the National Library of Finland (NLF) there are millions of digitized newspaper and journal pages, which are openly available via the public website http://digi.kansalliskirjasto.fi. To serve users better, last year the front end was completely overhauled, with its main aim being crowdsourcing features, e.g., giving end-users the opportunity to create digital clippings and a personal scrapbook from the digital collections. But how can you know whether crowdsourcing has had an impact? How much have the crowdsourcing functionalities been used so far? Did crowdsourcing work? In this paper the statistics and metrics of a recent crowdsourcing effort are analysed across the different digitized material types (newspapers, journals, ephemera). The subjects, categories and keywords given by the users are analysed to see which topics are the most appealing. Some notable public uses of the crowdsourced article clippings are highlighted. These metrics give us indications of how the end-users, based on their own interests, are investigating and using the digital collections. Therefore, the suggested metrics illustrate the versatility of the information needs of the users, varying from citizen science to research purposes. By analysing the user patterns, we can respond to the new needs of the users by making minor changes to accommodate the most active participants, while still making the service more approachable for those who are trying out the functionalities for the first time. Participation in the clippings and annotations can enrich the materials in unexpected ways and can possibly pave the way for using crowdsourcing more in research contexts. This creates more opportunities for the goals of open science since source data becomes available, making it possible for researchers to reach out to the general public for help. In the long term, utilizing, for example, text mining methods can allow these different end-user segments to

  16. A family of metric gravities

    Shuler, Robert

    2018-04-01

    The goal of this paper is to take a completely fresh approach to metric gravity, in which the metric principle is strictly adhered to but its properties in local space-time are derived from conservation principles, not inferred from a global field equation. The global field strength variation then gains some flexibility, but only in the regime of very strong fields (2nd-order terms) whose measurement is now being contemplated. So doing provides a family of similar gravities, differing only in strong fields, which could be developed into meaningful verification targets for strong fields after the manner in which far-field variations were used in the 20th century. General Relativity (GR) is shown to be a member of the family and this is demonstrated by deriving the Schwarzschild metric exactly from a suitable field strength assumption. The method of doing so is interesting in itself because it involves only one differential equation rather than the usual four. Exact static symmetric field solutions are also given for one pedagogical alternative based on potential, and one theoretical alternative based on inertia, and the prospects of experimentally differentiating these are analyzed. Whether the method overturns the conventional wisdom that GR is the only metric theory of gravity and that alternatives must introduce additional interactions and fields is somewhat semantical, depending on whether one views the field strength assumption as a field and whether the assumption that produces GR is considered unique in some way. It is of course possible to have other fields, and the local space-time principle can be applied to field gravities which usually are weak-field approximations having only time dilation, giving them the spatial factor and promoting them to full metric theories. Though usually pedagogical, some of them are interesting from a quantum gravity perspective. Cases are noted where mass measurement errors, or distributions of dark matter, can cause one

  17. Hybrid metric-Palatini stars

    Danilǎ, Bogdan; Harko, Tiberiu; Lobo, Francisco S. N.; Mak, M. K.

    2017-02-01

    We consider the internal structure and the physical properties of specific classes of neutron, quark and Bose-Einstein condensate stars in the recently proposed hybrid metric-Palatini gravity theory, which is a combination of the metric and Palatini f(R) formalisms. It turns out that the theory is very successful in accounting for the observed phenomenology, since it unifies local constraints at the Solar System level and the late-time cosmic acceleration, even if the scalar field is very light. In this paper, we derive the equilibrium equations for a spherically symmetric configuration (mass continuity and Tolman-Oppenheimer-Volkoff) in the framework of the scalar-tensor representation of the hybrid metric-Palatini theory, and we investigate their solutions numerically for different equations of state of neutron and quark matter, by adopting for the scalar field potential a Higgs-type form. It turns out that the scalar-tensor definition of the potential can be represented as a Clairaut differential equation, and provides an explicit form for f(R) given by f(R) ∼ R + Λeff, where Λeff is an effective cosmological constant. Furthermore, stellar models, described by the stiff fluid, radiation-like, bag model and the Bose-Einstein condensate equations of state are explicitly constructed in both general relativity and hybrid metric-Palatini gravity, thus allowing an in-depth comparison between the predictions of these two gravitational theories. As a general result it turns out that for all the considered equations of state, hybrid gravity stars are more massive than their general relativistic counterparts. Furthermore, two classes of stellar models corresponding to two particular choices of the functional form of the scalar field (constant value, and logarithmic form, respectively) are also investigated. Interestingly enough, in the case of a constant scalar field the equation of state of the matter takes the form of the bag model equation of state describing
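
    For reference, the general-relativistic baseline of the equilibrium equations named above (mass continuity and Tolman-Oppenheimer-Volkoff) reads as follows, in units G = c = 1; the hybrid metric-Palatini versions derived in the paper add scalar-field contributions to these right-hand sides.

```latex
% Mass continuity and TOV equations in general relativity (G = c = 1).
\frac{dm}{dr} = 4\pi r^{2}\rho,
\qquad
\frac{dp}{dr} = -\,\frac{(\rho + p)\bigl(m + 4\pi r^{3} p\bigr)}{r\,(r - 2m)}.
```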

  18. Unveiling multiple solid-state transitions in pharmaceutical solid dosage forms using multi-series hyperspectral imaging and different curve resolution approaches

    Alexandrino, Guilherme L; Amigo Rubio, Jose Manuel; Khorasani, Milad Rouhi

    2017-01-01

    Solid-state transitions at the surface of pharmaceutical solid dosage forms (SDF) were monitored using multi-series hyperspectral imaging (HSI) along with Multivariate Curve Resolution – Alternating Least Squares (MCR-ALS) and Parallel Factor Analysis (PARAFAC and PARAFAC2). First, the solid-stat...
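
    The bilinear factorization behind MCR-ALS can be illustrated with a toy alternating-least-squares loop: a data matrix D (pixels x wavelengths) is factored into concentrations C and spectra S with non-negativity. This is a deliberately crude sketch on synthetic data (real MCR-ALS solves constrained least-squares subproblems rather than clipping), not the study's implementation.

```python
# Toy MCR-ALS-style factorization D ~ C @ S.T with non-negativity enforced
# by clipping; synthetic two-component spectral data.
import numpy as np

rng = np.random.default_rng(4)
true_S = np.abs(rng.normal(size=(50, 2)))        # 2 pure-component spectra
true_C = np.abs(rng.normal(size=(200, 2)))       # per-pixel concentrations
D = true_C @ true_S.T + 0.01 * rng.normal(size=(200, 50))

S = np.abs(rng.normal(size=(50, 2)))             # initial spectral estimates
for _ in range(100):
    C = np.clip(D @ np.linalg.pinv(S.T), 0, None)    # solve for C, enforce C >= 0
    S = np.clip(D.T @ np.linalg.pinv(C.T), 0, None)  # solve for S, enforce S >= 0

residual = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
print(f"relative residual: {residual:.3f}")
```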

  19. Metrics for Evaluation of Student Models

    Pelanek, Radek

    2015-01-01

    Researchers use many different metrics for evaluation of performance of student models. The aim of this paper is to provide an overview of commonly used metrics, to discuss properties, advantages, and disadvantages of different metrics, to summarize current practice in educational data mining, and to provide guidance for evaluation of student…

  20. Context-dependent ATC complexity metric

    Mercado Velasco, G.A.; Borst, C.

    2015-01-01

    Several studies have investigated Air Traffic Control (ATC) complexity metrics in a search for a metric that could best capture workload. These studies have shown how daunting the search for a universal workload metric (one that could be applied in different contexts: sectors, traffic patterns,

  1. Properties of C-metric spaces

    Croitoru, Anca; Apreutesei, Gabriela; Mastorakis, Nikos E.

    2017-09-01

    The subject of this paper belongs to the theory of approximate metrics [23]. An approximate metric on X is a real-valued map defined on X × X that satisfies only part of the metric axioms. In a recent paper [23], we introduced a new type of approximate metric, named C-metric, which is a map that satisfies only two metric axioms: symmetry and the triangular inequality. The remarkable fact in a C-metric space is that a topological structure induced by the C-metric can be defined. The innovative idea of this paper is that we obtain some convergence properties of a C-metric space in the absence of a metric. In this paper we investigate C-metric spaces. The paper is divided into four sections. Section 1 is the Introduction. In Section 2 we recall some concepts and preliminary results. In Section 3 we present some properties of C-metric spaces, such as convergence properties, a canonical decomposition and a C-fixed point theorem. Finally, in Section 4 some conclusions are highlighted.
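
    Stated formally, and directly from the description above, a C-metric on X is a map d : X × X → R satisfying only:

```latex
% The two metric axioms a C-metric retains, per the abstract above.
d(x,y) = d(y,x) \quad\text{(symmetry)},
\qquad
d(x,z) \le d(x,y) + d(y,z) \quad\text{(triangle inequality)},
\quad \forall\, x,y,z \in X.
```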

  2. Optimization of the alpha image reconstruction. An iterative CT-image reconstruction with well-defined image quality metrics

    Lebedev, Sergej; Sawall, Stefan; Knaup, Michael; Kachelriess, Marc

    2017-01-01

    Optimization of the AIR algorithm for improved convergence and performance. The AIR method is an iterative algorithm for CT image reconstruction. As a result of its linearity with respect to the basis images, the AIR algorithm possesses well-defined, regular image quality metrics, e.g. point spread function (PSF) or modulation transfer function (MTF), unlike other iterative reconstruction algorithms. The AIR algorithm computes weighting images α to blend between a set of basis images that preferably have mutually exclusive properties, e.g. high spatial resolution or low noise. The optimized algorithm uses an approach that alternates between the optimization of raw-data fidelity using an OSSART-like update and regularization using gradient descent, as opposed to the initially proposed AIR using a straightforward gradient descent implementation. A regularization strength for a given task is chosen by formulating a requirement for the noise reduction and checking whether it is fulfilled for different regularization strengths, while monitoring the spatial resolution using the voxel-wise defined modulation transfer function for the AIR image. The optimized algorithm computes similar images in a shorter time compared to the initial gradient descent implementation of AIR. The result can be influenced by multiple parameters that can be narrowed down to a relatively simple framework to compute high-quality images. The AIR images, for instance, can have at least a 50% lower noise level compared to the sharpest basis image, while the spatial resolution is mostly maintained. The optimization improves performance by a factor of 6, while maintaining image quality. Furthermore, it was demonstrated that the spatial resolution for AIR can be determined using regular image quality metrics, given smooth weighting images. This is not possible for other iterative reconstructions as a result of their nonlinearity. A simple set of parameters for the algorithm is discussed that provides
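
    The voxel-wise blend at the heart of the AIR idea can be sketched as below. The basis images and weights are synthetic stand-ins; in the real algorithm the weighting images are optimized against raw-data fidelity with regularization, which this sketch omits.

```python
# Illustrative alpha-blend of two basis images with complementary
# properties (sharp/noisy vs smooth/low-noise). Because the result is
# linear in the basis images, metrics such as PSF/MTF stay well defined.
import numpy as np

rng = np.random.default_rng(5)
sharp = rng.normal(loc=1.0, scale=0.2, size=(128, 128))   # high resolution, noisy
smooth = np.full((128, 128), 1.0)                          # low noise, low resolution
alpha = np.clip(rng.uniform(size=(128, 128)), 0, 1)        # per-voxel weighting image

blended = alpha * sharp + (1.0 - alpha) * smooth           # AIR-style linear blend
print(f"noise (std) sharp={sharp.std():.3f}, blended={blended.std():.3f}")
```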

  3. Optimization of the alpha image reconstruction. An iterative CT-image reconstruction with well-defined image quality metrics

    Lebedev, Sergej; Sawall, Stefan; Knaup, Michael; Kachelriess, Marc [German Cancer Research Center, Heidelberg (Germany)]

    2017-10-01

    Optimization of the AIR algorithm for improved convergence and performance. The AIR method is an iterative algorithm for CT image reconstruction. As a result of its linearity with respect to the basis images, the AIR algorithm possesses well-defined, regular image quality metrics, e.g. point spread function (PSF) or modulation transfer function (MTF), unlike other iterative reconstruction algorithms. The AIR algorithm computes weighting images α to blend between a set of basis images that preferably have mutually exclusive properties, e.g. high spatial resolution or low noise. The optimized algorithm uses an approach that alternates between the optimization of raw-data fidelity using an OSSART-like update and regularization using gradient descent, as opposed to the initially proposed AIR using a straightforward gradient descent implementation. A regularization strength for a given task is chosen by formulating a requirement for the noise reduction and checking whether it is fulfilled for different regularization strengths, while monitoring the spatial resolution using the voxel-wise defined modulation transfer function for the AIR image. The optimized algorithm computes similar images in a shorter time compared to the initial gradient descent implementation of AIR. The result can be influenced by multiple parameters that can be narrowed down to a relatively simple framework to compute high-quality images. The AIR images, for instance, can have at least a 50% lower noise level compared to the sharpest basis image, while the spatial resolution is mostly maintained. The optimization improves performance by a factor of 6, while maintaining image quality. Furthermore, it was demonstrated that the spatial resolution for AIR can be determined using regular image quality metrics, given smooth weighting images. This is not possible for other iterative reconstructions as a result of their nonlinearity. A simple set of parameters for the algorithm is discussed that provides

  4. LINKING IN SITU TIME SERIES FOREST CANOPY LAI AND PHENOLOGY METRICS WITH MODIS AND LANDSAT NDVI AND LAI PRODUCTS

    The subject of this presentation is forest vegetation dynamics as observed by the TERRA spacecraft's Moderate-Resolution Imaging Spectroradiometer (MODIS) and Landsat Thematic Mapper, and complementary in situ time series measurements of forest canopy metrics related to Leaf Area...

  5. Development and Implementation of a Design Metric for Systems Containing Long-Term Fluid Loops

    Steele, John W.

    2016-01-01

    John Steele, a chemist and technical fellow from United Technologies Corporation, provided a water quality module to assist engineers and scientists with a metric tool to evaluate risks associated with the design of space systems with fluid loops. This design metric is a methodical, quantitative, lessons-learned-based means to evaluate the robustness of a long-term fluid loop system design. The tool was developed by engineers from a cross-section of disciplines with decades of experience in problem resolution.

  6. On characterizations of quasi-metric completeness

    Dag, H.; Romaguera, S.; Tirado, P.

    2017-07-01

    Hu proved in [4] that a metric space (X, d) is complete if and only if, for any closed subspace C of (X, d), every Banach contraction on C has a fixed point. Since then several authors have investigated the problem of characterizing metric completeness by means of fixed point theorems. Recently this problem has been studied in the more general context of quasi-metric spaces for different notions of completeness. Here we present a characterization of a kind of completeness for quasi-metric spaces by means of a quasi-metric version of Hu's theorem. (Author)

  7. The Metric of Colour Space

    Gravesen, Jens

    2015-01-01

    The space of colours is a fascinating space. It is a real vector space, but no matter what inner product you put on the space the resulting Euclidean distance does not correspond to human perception of difference between colours. In 1942 MacAdam performed the first experiments on colour matching and found the MacAdam ellipses which are often interpreted as defining the metric tensor at their centres. An important question is whether it is possible to define colour coordinates such that the Euclidean distance in these coordinates corresponds to human perception. Using cubic splines to represent...

  8. Product Operations Status Summary Metrics

    Takagi, Atsuya; Toole, Nicholas

    2010-01-01

    The Product Operations Status Summary Metrics (POSSUM) computer program provides a readable view into the state of the Phoenix Operations Product Generation Subsystem (OPGS) data pipeline. POSSUM provides a user interface that can search the data store, collect product metadata, and display the results in an easily-readable layout. It was designed with flexibility in mind for support in future missions. Flexibility over various data store hierarchies is provided through the disk-searching facilities of Marsviewer. This is a proven program that has been in operational use since the first day of the Phoenix mission.

  9. MetricForensics: A Multi-Level Approach for Mining Volatile Graphs

    Henderson, Keith [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Eliassi-Rad, Tina [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Faloutsos, Christos [Carnegie Mellon Univ., Pittsburgh, PA (United States); Akoglu, Leman [Carnegie Mellon Univ., Pittsburgh, PA (United States); Li, Lei [Carnegie Mellon Univ., Pittsburgh, PA (United States); Maruhashi, Koji [Fujitsu Laboratories Ltd., Kanagawa (Japan); Prakash, B. Aditya [Carnegie Mellon Univ., Pittsburgh, PA (United States); Tong, H [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    2010-02-08

    Advances in data collection and storage capacity have made it increasingly possible to collect highly volatile graph data for analysis. Existing graph analysis techniques are not appropriate for such data, especially in cases where streaming or near-real-time results are required. An example that has drawn significant research interest is the cyber-security domain, where internet communication traces are collected and real-time discovery of events, behaviors, patterns and anomalies is desired. We propose MetricForensics, a scalable framework for analysis of volatile graphs. MetricForensics combines a multi-level "drill down" approach, a collection of user-selected graph metrics and a collection of analysis techniques. At each successive level, more sophisticated metrics are computed and the graph is viewed at a finer temporal resolution. In this way, MetricForensics scales to highly volatile graphs by only allocating resources for computationally expensive analysis when an interesting event is discovered at a coarser resolution first. We test MetricForensics on three real-world graphs: an enterprise IP trace, a trace of legitimate and malicious network traffic from a research institution, and the MIT Reality Mining proximity sensor data. Our largest graph has ≈3M vertices and ≈32M edges, spanning 4.5 days. The results demonstrate the scalability and capability of MetricForensics in analyzing volatile graphs; and highlight four novel phenomena in such graphs: elbows, broken correlations, prolonged spikes, and strange stars.
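
    The coarse-to-fine idea can be sketched as below: cheap metrics are tracked per coarse time window, and a finer temporal resolution is examined only where a coarse-level anomaly appears. The edge stream, window sizes and anomaly rule are all illustrative assumptions (networkx is assumed available); this is not the MetricForensics code.

```python
# Drill-down sketch: per-window graph metrics, finer resolution on anomaly.
import networkx as nx
import numpy as np

rng = np.random.default_rng(6)
edges = [(int(rng.integers(0, 200)), int(rng.integers(0, 200)), int(rng.integers(0, 1000)))
         for _ in range(5000)]  # synthetic (u, v, timestamp) stream

def edge_count(t0, t1):
    G = nx.Graph((u, v) for u, v, t in edges if t0 <= t < t1)
    return G.number_of_edges()

coarse = np.array([edge_count(t, t + 100) for t in range(0, 1000, 100)])
print("coarse edge counts:", coarse)
flagged = np.where(np.abs(coarse - coarse.mean()) > 2 * coarse.std())[0]
for w in flagged:  # drill down: recompute the flagged window at 10x finer resolution
    fine = [edge_count(t, t + 10) for t in range(w * 100, (w + 1) * 100, 10)]
    print(f"window {w}: fine-grained edge counts {fine}")
```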

  10. Web metrics for library and information professionals

    Stuart, David

    2014-01-01

    This is a practical guide to using web metrics to measure impact and demonstrate value. The web provides an opportunity to collect a host of different metrics, from those associated with social media accounts and websites to more traditional research outputs. This book is a clear guide for library and information professionals as to what web metrics are available and how to assess and use them to make informed decisions and demonstrate value. As individuals and organizations increasingly use the web in addition to traditional publishing avenues and formats, this book provides the tools to unlock web metrics and evaluate the impact of this content. The key topics covered include: bibliometrics, webometrics and web metrics; data collection tools; evaluating impact on the web; evaluating social media impact; investigating relationships between actors; exploring traditional publications in a new environment; web metrics and the web of data; the future of web metrics and the library and information professional. Th...

  11. Development of soil quality metrics using mycorrhizal fungi

    Baar, J.

    2010-07-01

    Based on the Treaty on Biological Diversity of Rio de Janeiro in 1992 for maintaining and increasing biodiversity, several countries have started programmes monitoring soil quality and above- and below-ground biodiversity. Within the European Union, policy makers are working on legislation for soil protection and management. Therefore, indicators are needed to monitor the status of soils, and these indicators, which reflect soil quality, can be integrated in working standards or soil quality metrics. Soil micro-organisms, particularly arbuscular mycorrhizal fungi (AMF), are indicative of soil changes. These soil fungi live in symbiosis with the great majority of plants and are sensitive to changes in the physico-chemical conditions of the soil. The aim of this study was to investigate whether AMF are reliable and sensitive indicators of disturbances in soils and can be used for the development of soil quality metrics. It was also studied whether soil quality metrics based on AMF meet requirements for applicability by users and policy makers. Ecological criteria were set for the development of soil quality metrics for different soils. Multiple root samples containing AMF from various locations in The Netherlands were analyzed. The results of the analyses were related to the defined criteria. This resulted in two soil quality metrics, one for sandy soils and a second one for clay soils, with six different categories ranging from very bad to very good. These soil quality metrics meet the majority of requirements for applicability and are potentially useful for the development of legislation for the protection of soil quality. (Author) 23 refs.

  12. Metrics for building performance assurance

    Koles, G.; Hitchcock, R.; Sherman, M.

    1996-07-01

    This report documents part of the work performed in phase I of a Laboratory Directed Research and Development (LDRD) funded project entitled Building Performance Assurances (BPA). The focus of the BPA effort is to transform the way buildings are built and operated in order to improve building performance by facilitating or providing tools, infrastructure, and information. The efforts described herein focus on the development of metrics with which to evaluate building performance and for which information and optimization tools need to be developed. The classes of building performance metrics reviewed are (1) Building Services, (2) First Costs, (3) Operating Costs, (4) Maintenance Costs, and (5) Energy and Environmental Factors. The first category defines the direct benefits associated with buildings; the next three are different kinds of costs associated with providing those benefits; the last category includes concerns that are broader than direct costs and benefits to the building owner and building occupants. The level of detail of the various issues reflects the current state of knowledge in those scientific areas and the ability of the authors to determine that state of knowledge, rather than directly reflecting the importance of these issues; it intentionally does not specifically focus on energy issues. The report describes work in progress, is intended as a resource, and can be used to identify the areas needing more investigation. Other reports on BPA activities are also available.

  13. A faster, high resolution, mtPA-GFP-based mitochondrial fusion assay acquiring kinetic data of multiple cells in parallel using confocal microscopy.

    Lovy, Alenka; Molina, Anthony J A; Cerqueira, Fernanda M; Trudeau, Kyle; Shirihai, Orian S

    2012-07-20

    exposing loaded cells (3-15 nM TMRE) to the imaging parameters that will be used in the assay (perhaps 7 stacks of 6 optical sections in a row), and assessing cell health after 2 hours. If the mitochondria appear too fragmented and cells are dying, other mitochondrial markers, such as dsRED or Mitotracker red, could be used instead of TMRE. The mtPAGFP method has revealed details about mitochondrial network behavior that could not be visualized using other methods. For example, we now know that mitochondrial fusion can be full or transient, where matrix content can mix without changing the overall network morphology. Additionally, we know that the probability of fusion is independent of contact duration and organelle dimension, and is influenced by organelle motility, membrane potential and history of previous fusion activity. In this manuscript, we describe a methodology for scaling up the previously published protocol using mtPAGFP and 15 nM TMRE in order to examine multiple cells at a time and improve the time efficiency of data collection without sacrificing the subcellular resolution. This has been made possible by the use of an automated microscope stage, and programmable image acquisition software. Zen software from Zeiss allows the user to mark and track several designated cells expressing mtPAGFP. Each of these cells can be photoactivated in a particular region of interest, and stacks of confocal slices can be monitored for mtPAGFP signal as well as TMRE at specified intervals. Other confocal systems could be used to perform this protocol provided there is an automated stage that is programmable, an incubator with CO2, and a means by which to photoactivate the PAGFP; either a multiphoton laser, or a 405 nm diode laser.

  14. HIGH-RESOLUTION 8 mm AND 1 cm POLARIZATION OF IRAS 4A FROM THE VLA NASCENT DISK AND MULTIPLICITY (VANDAM) SURVEY

    Cox, Erin G.; Harris, Robert J.; Looney, Leslie W.; Segura-Cox, Dominique M. [Department of Astronomy, University of Illinois at Urbana-Champaign, Urbana, IL 61801 (United States); Tobin, John [Leiden Observatory, Leiden University, P.O. Box 9513, 2000-RA Leiden (Netherlands); Li, Zhi-Yun [Department of Astronomy, University of Virginia, Charlottesville, VA 22903 (United States); Tychoniec, Łukasz [Astronomical Observatory Institute, Faculty of Physics, A. Mickiewicz University, Słoneczna 36, PL-60-268 Poznań (Poland); Chandler, Claire J.; Perez, Laura M. [National Radio Astronomy Observatory, Socorro, NM 87801 (United States); Dunham, Michael M. [Harvard-Smithsonian Center for Astrophysics, Cambridge, MA 02138 (United States); Kratter, Kaitlin [Steward Observatory, University of Arizona, Tucson, AZ 85721 (United States); Melis, Carl [Center for Astrophysics and Space Sciences, University of California, San Diego, CA 92093 (United States); Sadavoy, Sarah I., E-mail: egcox2@illinois.edu [Max-Planck-Institut für Astronomie, D-69117 Heidelberg (Germany)

    2015-12-01

    Magnetic fields can regulate disk formation, accretion, and jet launching. Until recently, it has been difficult to obtain high-resolution observations of the magnetic fields of the youngest protostars in the critical region near the protostar. The VANDAM survey is observing all known protostars in the Perseus Molecular Cloud. Here we present the polarization data of IRAS 4A. We find that with ∼0.″2 (50 AU) resolution at λ = 8.1 and 10.3 mm, the inferred magnetic field is consistent with a circular morphology, in marked contrast with the hourglass morphology seen on larger scales. This morphology is consistent with frozen-in field lines that were dragged in by rotating material entering the infall region. The field morphology is reminiscent of rotating circumstellar material near the protostar. This is the first polarization detection of a protostar at these wavelengths. We conclude from our observations that the dust emission is optically thin with β ∼ 1.3, suggesting that millimeter-/centimeter-sized grains have grown and survived in the short lifetime of the protostar.

  15. Metric approach to quantum constraints

    Brody, Dorje C; Hughston, Lane P; Gustavsson, Anna C T

    2009-01-01

    A framework for deriving equations of motion for constrained quantum systems is introduced and a procedure for its implementation is outlined. In special cases, the proposed new method, which takes advantage of the fact that the space of pure states in quantum mechanics has both a symplectic structure and a metric structure, reduces to a quantum analogue of the Dirac theory of constraints in classical mechanics. Explicit examples involving spin-1/2 particles are worked out in detail: in the first example, our approach coincides with a quantum version of the Dirac formalism, while the second example illustrates how a situation that cannot be treated by Dirac's approach can nevertheless be dealt with in the present scheme.

  16. Metrics for Business Process Models

    Mendling, Jan

    Up until now, there has been little research on why people introduce errors in real-world business process models. In a more general context, Simon [404] points to the limitations of cognitive capabilities and concludes that humans act rationally only to a certain extent. Concerning modeling errors, this argument would imply that human modelers lose track of the interrelations of large and complex models due to their limited cognitive capabilities and introduce errors that they would not insert in a small model. A recent study by Mendling et al. [275] explores to what extent certain complexity metrics of business process models have the potential to serve as error determinants. The authors conclude that complexity indeed appears to have an impact on error probability. Before we can test such a hypothesis in a more general setting, we have to establish an understanding of how we can define determinants that drive error probability and how we can measure them.

  17. Quantifying Forest Spatial Pattern Trends at Multiple Extents: An Approach to Detect Significant Changes at Different Scales

    Ludovico Frate

    2014-09-01

    Full Text Available We propose a procedure to detect significant changes in forest spatial patterns and relevant scales. Our approach consists of four sequential steps. First, based on a series of multi-temporal forest maps, a set of geographic windows of increasing extents are extracted. Second, for each extent and date, specific stochastic simulations that replicate real-world spatial pattern characteristics are run. Third, by computing pattern metrics on both simulated and real maps, their empirical distributions and confidence intervals are derived. Finally, multi-temporal scalograms are built for each metric. Based on cover maps (1954, 2011) with a resolution of 10 m, we analyze forest pattern changes in a central Apennines (Italy) reserve at multiple spatial extents (128, 256 and 512 pixels). We identify three types of multi-temporal scalograms, depending on pattern metric behaviors, describing different dynamics of the natural reforestation process. The statistical distribution and variability of pattern metrics at multiple extents offers a new and powerful tool to detect forest variations over time. Similar procedures can (i) help to identify significant changes in spatial patterns and provide the bases to relate them to landscape processes; (ii) minimize the bias when comparing pattern metrics at a single extent; and (iii) be extended to other landscapes and scales.
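
    The simulation-envelope logic of the four steps above can be sketched as follows. The forest map is synthetic, the "pattern metric" (a crude edge density) is a simple stand-in for the landscape metrics used in the study, and the null model (random maps with matched forest proportion) is an assumed choice.

```python
# Sketch: observed pattern metric vs a null distribution from stochastic
# simulations, at several window extents. All data and choices are toy.
import numpy as np

rng = np.random.default_rng(7)
forest_map = rng.uniform(size=(512, 512)) < 0.4   # boolean forest/non-forest map

def edge_density(patch):
    # fraction of 4-neighbour pixel pairs with differing class
    horiz = patch[:, 1:] != patch[:, :-1]
    vert = patch[1:, :] != patch[:-1, :]
    return (horiz.sum() + vert.sum()) / (horiz.size + vert.size)

for extent in (128, 256, 512):
    window = forest_map[:extent, :extent]
    observed = edge_density(window)
    p = window.mean()  # null model: random maps with the same forest proportion
    null = [edge_density(rng.uniform(size=(extent, extent)) < p) for _ in range(99)]
    lo_ci, hi_ci = np.percentile(null, [2.5, 97.5])
    print(f"extent {extent}: observed={observed:.3f}, null 95% CI=({lo_ci:.3f}, {hi_ci:.3f})")
```

    An observed value falling outside the simulated envelope at a given extent marks a pattern, and a scale, at which the real landscape departs significantly from the null model.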

  18. Security camera resolution measurements: Horizontal TV lines versus modulation transfer function measurements.

    Birch, Gabriel Carisle [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Griffin, John Clark [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    The horizontal television lines (HTVL) metric has been the primary quantity used by division 6000 related to camera resolution for high-consequence security systems. This document shows that HTVL measurements are fundamentally insufficient as a metric to determine camera resolution, and proposes a quantitative, standards-based methodology: measuring the camera system modulation transfer function (MTF), the most common and accepted metric of resolution in the optical science community. Because HTVL calculations are easily misinterpreted or poorly defined, we present several scenarios in which HTVL is frequently reported, and discuss their problems. The MTF metric is discussed, and scenarios are presented with calculations showing the application of such a metric.
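
    A minimal MTF estimate from a light-dark edge, in the spirit of the standards-based approach advocated above, can be sketched as follows: differentiate the edge spread function (ESF) to get the line spread function (LSF), then take its Fourier transform. A synthetic blurred edge stands in for a real camera image; the tanh edge width and the MTF50 summary are illustrative choices.

```python
# Edge-based MTF sketch: MTF = |FFT of the LSF|, LSF = d(ESF)/dx.
import numpy as np

x = np.linspace(-10, 10, 512)                        # pixels across the edge
esf = 0.5 * (1 + np.tanh(x / 1.5))                   # synthetic edge profile
lsf = np.gradient(esf, x)                            # line spread function
lsf /= lsf.sum()

mtf = np.abs(np.fft.rfft(lsf))                       # modulation transfer function
mtf /= mtf[0]                                        # normalize to 1 at DC
freqs = np.fft.rfftfreq(lsf.size, d=x[1] - x[0])     # cycles per pixel

# MTF50: frequency where the MTF falls to 50%, a common single-number summary.
mtf50 = freqs[np.argmax(mtf < 0.5)]
print(f"MTF50 = {mtf50:.3f} cycles/pixel")
```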

  19. Response of algal metrics to nutrients and physical factors and identification of nutrient thresholds in agricultural streams

    Black, R.W.; Moran, P.W.; Frankforter, J.D.

    2011-01-01

    Many streams within the United States are impaired due to nutrient enrichment, particularly in agricultural settings. The present study examines the response of benthic algal communities in agricultural and minimally disturbed sites from across the western United States to a suite of environmental factors, including nutrients, collected at multiple scales. The first objective was to identify the relative importance of nutrients, habitat and watershed features, and macroinvertebrate trophic structure to explain algal metrics derived from deposition and erosion habitats. The second objective was to determine if thresholds in total nitrogen (TN) and total phosphorus (TP) related to algal metrics could be identified and how these thresholds varied across metrics and habitats. Nutrient concentrations within the agricultural areas were elevated and greater than published threshold values. All algal metrics examined responded to nutrients as hypothesized. Although nutrients typically were the most important variables in explaining the variation in each of the algal metrics, environmental factors operating at multiple scales also were important. Calculated thresholds for TN or TP based on the algal metrics generated from samples collected from erosion and deposition habitats were not significantly different. Little variability in threshold values for each metric for TN and TP was observed. The consistency of the threshold values measured across multiple metrics and habitats suggest that the thresholds identified in this study are ecologically relevant. Additional work to characterize the relationship between algal metrics, physical and chemical features, and nuisance algal growth would be of benefit to the development of nutrient thresholds and criteria. © 2010 The Author(s).

  20. Active Metric Learning for Supervised Classification

    Kumaran, Krishnan; Papageorgiou, Dimitri; Chang, Yutong; Li, Minhan; Takáč, Martin

    2018-01-01

    Clustering and classification critically rely on distance metrics that provide meaningful comparisons between data points. We present mixed-integer optimization approaches to find optimal distance metrics that generalize the Mahalanobis metric extensively studied in the literature. Additionally, we generalize and improve upon leading methods by removing reliance on pre-designated "target neighbors," "triplets," and "similarity pairs." Another salient feature of our method is its ability to en...
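
    The Mahalanobis family that the paper generalizes is parameterized by a positive semidefinite matrix M; taking M = I recovers the ordinary Euclidean distance.

```latex
% Mahalanobis distance with a learned positive semidefinite matrix M.
d_{M}(x,y) = \sqrt{(x-y)^{\top} M \,(x-y)}, \qquad M \succeq 0.
```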

  1. On Nakhleh's metric for reduced phylogenetic networks

    Cardona, Gabriel; Llabrés, Mercè; Rosselló, Francesc; Valiente Feruglio, Gabriel Alejandro

    2009-01-01

    We prove that Nakhleh’s metric for reduced phylogenetic networks is also a metric on the classes of tree-child phylogenetic networks, semibinary tree-sibling time consistent phylogenetic networks, and multilabeled phylogenetic trees. We also prove that it separates distinguishable phylogenetic networks. In this way, it becomes the strongest dissimilarity measure for phylogenetic networks available so far. Furthermore, we propose a generalization of that metric that separates arbitrary phyl...

  2. Generalized tolerance sensitivity and DEA metric sensitivity

    Neralić, Luka; E. Wendell, Richard

    2015-01-01

    This paper considers the relationship between Tolerance sensitivity analysis in optimization and metric sensitivity analysis in Data Envelopment Analysis (DEA). Herein, we extend the results on the generalized Tolerance framework proposed by Wendell and Chen and show how this framework includes DEA metric sensitivity as a special case. Further, we note how recent results in Tolerance sensitivity suggest some possible extensions of the results in DEA metric sensitivity.

  3. The definitive guide to IT service metrics

    McWhirter, Kurt

    2012-01-01

    Used just as they are, the metrics in this book will bring many benefits to both the IT department and the business as a whole. Details of the attributes of each metric are given, enabling you to make the right choices for your business. You may prefer, and are encouraged, to design and create your own metrics to bring even more value to your business; this book will show you how to do this, too.

  4. Generalized tolerance sensitivity and DEA metric sensitivity

    Luka Neralić

    2015-03-01

    This paper considers the relationship between Tolerance sensitivity analysis in optimization and metric sensitivity analysis in Data Envelopment Analysis (DEA). Herein, we extend the results on the generalized Tolerance framework proposed by Wendell and Chen and show how this framework includes DEA metric sensitivity as a special case. Further, we note how recent results in Tolerance sensitivity suggest some possible extensions of the results in DEA metric sensitivity.

  5. Common Metrics for Human-Robot Interaction

    Steinfeld, Aaron; Lewis, Michael; Fong, Terrence; Scholtz, Jean; Schultz, Alan; Kaber, David; Goodrich, Michael

    2006-01-01

    This paper describes an effort to identify common metrics for task-oriented human-robot interaction (HRI). We begin by discussing the need for a toolkit of HRI metrics. We then describe the framework of our work and identify important biasing factors that must be taken into consideration. Finally, we present suggested common metrics for standardization and a case study. Preparation of a larger, more detailed toolkit is in progress.

  6. Chaotic inflation with metric and matter perturbations

    Feldman, H.A.; Brandenberger, R.H.

    1989-01-01

    A perturbative scheme to analyze the evolution of both metric and scalar field perturbations in an expanding universe is developed. The scheme is applied to study chaotic inflation with initial metric and scalar field perturbations present. It is shown that initial gravitational perturbations with wavelength smaller than the Hubble radius rapidly decay. The metric simultaneously picks up small perturbations determined by the matter inhomogeneities. Both are frozen in once the wavelength exceeds the Hubble radius. (orig.)

  7. Gravitational lensing in metric theories of gravity

    Sereno, Mauro

    2003-01-01

    Gravitational lensing in metric theories of gravity is discussed. I introduce a generalized approximate metric element, inclusive of both post-post-Newtonian contributions and a gravitomagnetic field. Following Fermat's principle and standard hypotheses, I derive the time delay function and deflection angle caused by an isolated mass distribution. Several astrophysical systems are considered. In most of the cases, the gravitomagnetic correction offers the best perspectives for an observational detection. Actual measurements distinguish only marginally different metric theories from each other.
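
    For orientation, the leading-order deflection such analyses build on can be written, in standard parameterized post-Newtonian notation (a textbook form stated from general knowledge, not reproduced from this paper), as

        \hat{\alpha} = \frac{1 + \gamma}{2} \, \frac{4GM}{c^2 b},

    where b is the impact parameter and γ = 1 recovers the general-relativistic value 4GM/(c²b); the gravitomagnetic and post-post-Newtonian terms the author analyzes enter at higher order.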

  8. Performance of a fast and high-resolution multi-echo spin-echo sequence for prostate T2 mapping across multiple systems.

    van Houdt, Petra J; Agarwal, Harsh K; van Buuren, Laurens D; Heijmink, Stijn W T P J; Haack, Søren; van der Poel, Henk G; Ghobadi, Ghazaleh; Pos, Floris J; Peeters, Johannes M; Choyke, Peter L; van der Heide, Uulke A

    2018-03-01

    To evaluate the performance of a multi-echo spin-echo sequence with k-t undersampling scheme (k-t T2) in prostate cancer. Phantom experiments were performed at five systems to estimate the bias, short-term repeatability, and reproducibility across all systems expressed with the within-subject coefficient of variation (wCV). Monthly measurements were performed on two systems for long-term repeatability estimation. To evaluate clinical repeatability, two T2 maps (voxel size 0.8 × 0.8 × 3 mm³; 5 min) were acquired at separate visits on one system for 13 prostate cancer patients. Repeatability was assessed per patient in relation to spatial resolution. T2 values were compared for tumor, peripheral zone, and transition zone. Phantom measurements showed a small bias (median = -0.9 ms) and good short-term repeatability (median wCV = 0.5%). Long-term repeatability was 0.9 and 1.1% and reproducibility between systems was 1.7%. The median bias observed in patients was -1.1 ms. At voxel level, the median wCV was 15%, dropping to 4% for structures of 0.5 cm³. The median tumor T2 values (79 ms) were significantly lower (P < 0.001) than in the peripheral zone (149 ms), but overlapped with the transition zone (91 ms). Reproducible T2 mapping of the prostate is feasible with good spatial resolution in a clinically reasonable scan time, allowing reliable measurement of T2 in structures as small as 0.5 cm³. Magn Reson Med 79:1586-1594, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
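
    The within-subject coefficient of variation used throughout this abstract can be computed directly from test-retest pairs. A minimal sketch under the common root-mean-square definition (the paper's exact estimator is not specified here, and the T2 values below are made up):

        import numpy as np

        def wcv(visit1, visit2):
            """Within-subject coefficient of variation for test-retest pairs."""
            v1, v2 = np.asarray(visit1, float), np.asarray(visit2, float)
            m = (v1 + v2) / 2.0
            s2 = (v1 - v2) ** 2 / 2.0           # within-subject variance per subject
            return np.sqrt(np.mean(s2 / m**2))  # RMS of per-subject CVs

        # e.g. two T2 measurements (ms) in the same ROI at separate visits
        print(wcv([79, 149, 91], [83, 141, 95]))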

  9. About the possibility of a generalized metric

    Lukacs, B.; Ladik, J.

    1991-10-01

    The metric (the structure of the space-time) may be dependent on the properties of the object measuring it. The case of size dependence of the metric was examined. For this dependence the simplest possible form of the metric tensor has been constructed which fulfils the following requirements: there be two extremal characteristic scales; the metric be unique and the usual one between them; the change be sudden in the neighbourhood of these scales; the size of the human body appear as a parameter (postulated on the basis of some philosophical arguments). Estimates have been made for the two extremal length scales according to existing observations. (author) 19 refs

  10. A jackknife approach to quantifying single-trial correlation between covariance-based metrics undefined on a single-trial basis.

    Richter, Craig G; Thompson, William H; Bosman, Conrado A; Fries, Pascal

    2015-07-01

    The quantification of covariance between neuronal activities (functional connectivity) requires the observation of correlated changes and therefore multiple observations. The strength of such neuronal correlations may itself undergo moment-by-moment fluctuations, which might, for example, lead to fluctuations in single-trial metrics such as reaction time (RT), or may co-fluctuate with the correlation between activity in other brain areas. Yet, quantifying the relation between moment-by-moment co-fluctuations in neuronal correlations is precluded by the fact that neuronal correlations are not defined per single observation. The proposed solution quantifies this relation by first calculating neuronal correlations for all leave-one-out subsamples (i.e. the jackknife replications of all observations) and then correlating these values. Because the correlation is calculated between jackknife replications, we refer to this approach as jackknife correlation (JC). First, we demonstrate the equivalence of JC to conventional correlation for simulated paired data that are defined per observation and therefore allow the calculation of conventional correlation. While the JC recovers the conventional correlation precisely, alternative approaches, like sorting-and-binning, introduce detrimental dependence on the analysis parameters. We then explore the case of relating two spectral correlation metrics, like coherence, that require multiple observation epochs, where the only viable alternative analysis approaches are based on some form of epoch subdivision, which results in reduced spectral resolution and poor spectral estimators. We show that JC outperforms these approaches, particularly for short epoch lengths, without sacrificing any spectral resolution. Finally, we note that the JC can be applied to relate fluctuations in any smooth metric that is not defined on single observations. Copyright © 2015. Published by Elsevier Inc.
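
    A minimal sketch of the JC recipe, with Pearson correlation standing in for the covariance-based metrics (e.g. coherence) discussed in the paper; the data and metric choices are toy stand-ins, not the authors' implementation:

        import numpy as np

        def jackknife_metric(data, metric):
            """Evaluate `metric` on every leave-one-out subsample (trials on axis 0)."""
            n = data.shape[0]
            idx = np.arange(n)
            return np.array([metric(data[idx != i]) for i in range(n)])

        # Two correlation metrics, each defined only across trials, not per trial
        corr_ab = lambda d: np.corrcoef(d[:, 0], d[:, 1])[0, 1]
        corr_cd = lambda d: np.corrcoef(d[:, 2], d[:, 3])[0, 1]

        trials = np.random.default_rng(1).normal(size=(100, 4))
        jc = np.corrcoef(jackknife_metric(trials, corr_ab),
                         jackknife_metric(trials, corr_cd))[0, 1]
        print(jc)   # estimates the coupling between the two metrics' fluctuations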

  11. Towards Video Quality Metrics Based on Colour Fractal Geometry

    Richard Noël

    2010-01-01

    Vision is a complex process that integrates multiple aspects of an image: spatial frequencies, topology and colour. Unfortunately, so far, all these elements were taken into consideration independently for the development of image and video quality metrics; we therefore propose an approach that blends all of them together. Our approach allows for the analysis of the complexity of colour images in the RGB colour space, based on the probabilistic algorithm for calculating the fractal dimension and lacunarity. Given that all the existing fractal approaches are defined only for gray-scale images, we extend them to the colour domain. We show how these two colour fractal features capture the multiple aspects that characterize the degradation of the video signal, based on the hypothesis that the quality degradation perceived by the user is directly proportional to the modification of the fractal complexity. We claim that the two colour fractal measures can objectively assess the quality of the video signal and that they can be used as metrics for the user-perceived video quality degradation, and we validated them through experimental results obtained for an MPEG-4 video streaming application; finally, the results are compared against the ones given by unanimously-accepted metrics and subjective tests.
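
    The colour fractal features build on classical fractal estimation. As a hedged, gray-scale stand-in for the probabilistic colour algorithm (which is not reproduced here), a box-counting dimension estimate for a binary image looks like:

        import numpy as np

        def box_counting_dimension(img, sizes=(1, 2, 4, 8, 16, 32)):
            """Estimate the fractal dimension of a binary image by box counting."""
            counts = []
            for s in sizes:
                h = img.shape[0] // s * s
                w = img.shape[1] // s * s
                blocks = img[:h, :w].reshape(h // s, s, w // s, s)
                counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
            # Slope of log(count) vs log(1/size) estimates the dimension
            slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
            return slope

        img = np.random.default_rng(0).random((64, 64)) > 0.5  # toy input
        print(box_counting_dimension(img))   # near 2 for a dense random pattern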

  12. Metrics for comparing dynamic earthquake rupture simulations

    Barall, Michael; Harris, Ruth A.

    2014-01-01

    Earthquakes are complex events that involve a myriad of interactions among multiple geologic features and processes. One of the tools that is available to assist with their study is computer simulation, particularly dynamic rupture simulation. A dynamic rupture simulation is a numerical model of the physical processes that occur during an earthquake. Starting with the fault geometry, friction constitutive law, initial stress conditions, and assumptions about the condition and response of the near‐fault rocks, a dynamic earthquake rupture simulation calculates the evolution of fault slip and stress over time as part of the elastodynamic numerical solution (Ⓔ see the simulation description in the electronic supplement to this article). The complexity of the computations in a dynamic rupture simulation makes it challenging to verify that the computer code is operating as intended, because there are no exact analytic solutions against which these codes’ results can be directly compared. One approach for checking if dynamic rupture computer codes are working satisfactorily is to compare each code’s results with the results of other dynamic rupture codes running the same earthquake simulation benchmark. To perform such a comparison consistently, it is necessary to have quantitative metrics. In this paper, we present a new method for quantitatively comparing the results of dynamic earthquake rupture computer simulation codes.

  13. High-resolution continuum source atomic absorption spectrometry for the simultaneous or sequential monitoring of multiple lines. A critical review of current possibilities

    Resano, M.; Flórez, M.R.; García-Ruiz, E.

    2013-01-01

    This work examines the capabilities and limitations of commercially available high-resolution continuum source atomic absorption spectrometry instrumentation for multi-line monitoring, discussing in detail the possible strategies to develop multi-element methodologies that are truly simultaneous, or else sequential, but from the same sample aliquot. Moreover, the simultaneous monitoring of various atomic or molecular lines may bring other important analytical advantages, such as: i) expansion of the linear range by monitoring multiplets; ii) improvements in the limit of detection and in precision by summing the signals from different lines of the same element or molecule; iii) simple correction for matrix-effects by selecting a suitable internal standard; or iv) accurate mathematical correction of spectral overlaps by simultaneous monitoring of free lines of the interfering molecule or element. This work discusses how authors have made use of these strategies to develop analytical methodologies that permit the straightforward analysis of complex samples. - Highlights: • HR CS AAS potential for simultaneous multi-line monitoring is critically examined. • Strategies to develop simultaneous multi-element methods are discussed. • Other benefits of multi-line monitoring (e.g., use of an IS or LSBC) are highlighted. • Selected examples from the literature are discussed in detail

  14. High spatial resolution decade-time scale land cover change at multiple locations in the Beringian Arctic (1948–2000s)

    Lin, D H; Johnson, D R; Tweedie, C E; Andresen, C

    2012-01-01

    Analysis of time series imagery from satellite and aircraft platforms is useful for detecting land cover change at plot to regional scales. In this study, we created multi-temporal high spatial resolution land cover maps for seven locations in the Beringian Arctic and assessed the change in land cover over time. Land cover classifications were site specific and mostly aligned with a soil moisture gradient. Time series varied between 60 and 21 years. Four of the five landscapes studied in Alaska underwent an expansion of drier land cover classes while the two landscapes studied in Chukotka, Russia showed an expansion of wetter land cover types. While a range of land cover types was present across the landscapes studied, the extent of shrubs (in Chukotka) and open water (in Alaska) increased in all landscapes where these land cover types were present. The results support trends documented for regional change in NDVI (a measure of vegetation greenness and productivity) as well as a host of other long term, experimental and modeling studies. Using historic change trends for each land cover type at each landscape, we use a simple probabilistic vegetation model to establish hypotheses of future change trajectories for different land cover types at each of the landscapes investigated. This study is a contribution to the International Polar Year Back to the Future project (IPY-BTF). (letter)

  15. Singularity resolution in quantum gravity

    Husain, Viqar; Winkler, Oliver

    2004-01-01

    We examine the singularity resolution issue in quantum gravity by studying a new quantization of standard Friedmann-Robertson-Walker geometrodynamics. The quantization procedure is inspired by the loop quantum gravity program, and is based on an alternative to the Schroedinger representation normally used in metric variable quantum cosmology. We show that in this representation for quantum geometrodynamics there exists a densely defined inverse scale factor operator, and that the Hamiltonian constraint acts as a difference operator on the basis states. We find that the cosmological singularity is avoided in the quantum dynamics. We discuss these results with a view to identifying the criteria that constitute 'singularity resolution' in quantum gravity

  16. Integrated Metrics for Improving the Life Cycle Approach to Assessing Product System Sustainability

    Wesley Ingwersen

    2014-03-01

    Life cycle approaches are critical for identifying and reducing environmental burdens of products. While these methods can indicate potential environmental impacts of a product, current Life Cycle Assessment (LCA) methods fail to integrate the multiple impacts of a system into unified measures of social, economic or environmental performance related to sustainability. Integrated metrics that combine multiple aspects of system performance based on a common scientific or economic principle have proven to be valuable for sustainability evaluation. In this work, we propose methods of adapting four integrated metrics for use with LCAs of product systems: ecological footprint, emergy, green net value added, and Fisher information. These metrics provide information on the full product system in land, energy, monetary equivalents, and as a unitless information index; each bundled with one or more indicators for reporting. When used together and for relative comparison, integrated metrics provide a broader coverage of sustainability aspects from multiple theoretical perspectives that is more likely to illuminate potential issues than individual impact indicators. These integrated metrics are recommended for use in combination with traditional indicators used in LCA. Future work will test and demonstrate the value of using these integrated metrics and combinations to assess product system sustainability.

  17. Enhancing Authentication Models Characteristic Metrics via ...

    In this work, we derive the universal characteristic metrics set for authentication models based on security, usability and design issues. We then compute the probability of the occurrence of each characteristic metrics in some single factor and multifactor authentication models in order to determine the effectiveness of these ...

  18. Gravitational Metric Tensor Exterior to Rotating Homogeneous ...

    The covariant and contravariant metric tensors exterior to a homogeneous spherical body rotating uniformly about a common φ axis with constant angular velocity ω is constructed. The constructed metric tensors in this gravitational field have seven non-zero distinct components. The Lagrangian for this gravitational field is ...

  19. Invariant metric for nonlinear symplectic maps

    In this paper, we construct an invariant metric in the space of homogeneous polynomials of a given degree (≥ 3). The homogeneous polynomials specify a nonlinear symplectic map which in turn represents a Hamiltonian system. By minimizing the norm constructed out of this metric as a function of system parameters, we ...

  20. Finite Metric Spaces of Strictly negative Type

    Hjorth, Poul G.

    If a finite metric space is of strictly negative type then its transfinite diameter is uniquely realized by an infinite extent ("load vector"). Finite metric spaces that have this property include all trees, and all finite subspaces of Euclidean and Hyperbolic spaces. We prove that if the distance...

  1. Fixed point theory in metric type spaces

    Agarwal, Ravi P; O’Regan, Donal; Roldán-López-de-Hierro, Antonio Francisco

    2015-01-01

    Written by a team of leading experts in the field, this volume presents a self-contained account of the theory, techniques and results in metric type spaces (in particular in G-metric spaces); that is, the text approaches this important area of fixed point analysis beginning from the basic ideas of metric space topology. The text is structured so that it leads the reader from preliminaries and historical notes on metric spaces (in particular G-metric spaces) and on mappings, to Banach type contraction theorems in metric type spaces, fixed point theory in partially ordered G-metric spaces, fixed point theory for expansive mappings in metric type spaces, generalizations, present results and techniques in a very general abstract setting and framework. Fixed point theory is one of the major research areas in nonlinear analysis. This is partly due to the fact that in many real world problems fixed point theory is the basic mathematical tool used to establish the existence of solutions to problems which arise natur...

  2. Metric solution of a spinning mass

    Sato, H.

    1982-01-01

    Studies on a particular class of asymptotically flat and stationary metric solutions called the Kerr-Tomimatsu-Sato class are reviewed about its derivation and properties. For a further study, an almost complete list of the papers worked on the Tomimatsu-Sato metrics is given. (Auth.)

  3. On Information Metrics for Spatial Coding.

    Souza, Bryan C; Pavão, Rodrigo; Belchior, Hindiael; Tort, Adriano B L

    2018-04-01

    The hippocampal formation is involved in navigation, and its neuronal activity exhibits a variety of spatial correlates (e.g., place cells, grid cells). The quantification of the information encoded by spikes has been standard procedure to identify which cells have spatial correlates. For place cells, most of the established metrics derive from Shannon's mutual information (Shannon, 1948), and convey information rate in bits/s or bits/spike (Skaggs et al., 1993, 1996). Despite their widespread use, the performance of these metrics in relation to the original mutual information metric has never been investigated. In this work, using simulated and real data, we find that the current information metrics correlate less with the accuracy of spatial decoding than the original mutual information metric. We also find that the top informative cells may differ among metrics, and show a surrogate-based normalization that yields comparable spatial information estimates. Since different information metrics may identify different neuronal populations, we discuss current and alternative definitions of spatially informative cells, which affect the metric choice. Copyright © 2018 IBRO. Published by Elsevier Ltd. All rights reserved.
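
    For concreteness, the bits-per-spike quantity this entry refers to is commonly written I = Σᵢ pᵢ (λᵢ/λ̄) log₂(λᵢ/λ̄). A minimal sketch of this standard Skaggs-style formula with toy numbers (not the paper's code):

        import numpy as np

        def skaggs_information(occupancy, rates):
            """Spatial information in bits/spike (Skaggs et al. form).

            occupancy : time spent in each spatial bin (normalized internally)
            rates     : mean firing rate in each bin (Hz)
            """
            p = np.asarray(occupancy, float)
            p /= p.sum()
            lam = np.asarray(rates, float)
            mean_rate = np.sum(p * lam)
            nz = lam > 0                         # skip empty bins (0 * log 0 -> 0)
            return np.sum(p[nz] * (lam[nz] / mean_rate)
                          * np.log2(lam[nz] / mean_rate))

        # A cell firing in only 1 of 4 equally occupied bins carries 2 bits/spike
        print(skaggs_information([1, 1, 1, 1], [8.0, 0, 0, 0]))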

  4. Validation of Metrics for Collaborative Systems

    Ion IVAN

    2008-01-01

    This paper describes new concepts for the validation of collaborative systems metrics. It defines the quality characteristics of collaborative systems, proposes a metric for estimating their quality level, and reports measurements of collaborative systems quality made with specially designed software.

  5. Validation of Metrics for Collaborative Systems

    Ion IVAN; Cristian CIUREA

    2008-01-01

    This paper describes new concepts for the validation of collaborative systems metrics. It defines the quality characteristics of collaborative systems, proposes a metric for estimating their quality level, and reports measurements of collaborative systems quality made with specially designed software.

  6. Software Power Metric Model: An Implementation | Akwukwuma ...

    ... and the execution time (TIME) in each case was recorded. We then obtained the application's function point count. Our result shows that the proposed metric is computable, consistent in its use of units, and is programming language independent. Keywords: Software attributes, Software power, measurement, Software metric, ...

  7. On the possibilities of high-resolution continuum source graphite furnace atomic absorption spectrometry for the simultaneous or sequential monitoring of multiple atomic lines

    Resano, M.; Rello, L.; Florez, M.; Belarra, M.A.

    2011-01-01

    This paper explores the potential of commercially available high-resolution continuum source graphite furnace atomic absorption spectrometry instrumentation for the simultaneous or sequential monitoring of various atomic lines, in an attempt to highlight the analytical advantages that can be derived from this strategy. In particular, it is demonstrated how i) the monitoring of multiplets may allow for the simple expansion of the linear range, as shown for the measurement of Ni using the triplet located in the vicinity of 234.6 nm; ii) the use of a suitable internal standard may permit improving the precision and help in correcting for matrix-effects, as proved for the monitoring of Ni in different biological samples; iii) direct and multi-element analysis of solid samples may be feasible on some occasions, either by monitoring various atomic lines that are sufficiently close (truly simultaneous monitoring, as demonstrated in the determination of Co, Fe and Ni in NIST 1566a Oyster tissue) or, alternatively, by opting for a selective and sequential atomization of the elements of interest during every single replicate. Determination of Cd and Ni in BCR 679 White cabbage is attempted using both approaches, which permits confirming that both methods can offer very similar and satisfactory results. However, it is important to stress that the second approach provides more flexibility, since analysis is no longer limited to those elements that show very close atomic lines (closer than 0.3 nm in the ultraviolet region) with a sensitivity ratio similar to the concentration ratio of the analytes in the samples investigated.

  8. Urban Landscape Metrics for Climate and Sustainability Assessments

    Cochran, F. V.; Brunsell, N. A.

    2014-12-01

    To test metrics for rapid identification of urban classes and sustainable urban forms, we examine the configuration of urban landscapes using satellite remote sensing data. We adopt principles from landscape ecology and urban planning to evaluate urban heterogeneity and design themes that may constitute more sustainable urban forms, including compactness (connectivity), density, mixed land uses, diversity, and greening. Using 2-D wavelet and multi-resolution analysis, landscape metrics, and satellite-derived indices of vegetation fraction and impervious surface, the spatial variability of Landsat and MODIS data from metropolitan areas of Manaus and São Paulo, Brazil are investigated. Landscape metrics for density, connectivity, and diversity, like the Shannon Diversity Index, are used to assess the diversity of urban buildings, geographic extent, and connectedness. Rapid detection of urban classes for low density, medium density, high density, and tall building district at the 1-km scale is needed for use in climate models. If the complexity of finer-scale urban characteristics can be related to the neighborhood scale, both climate and sustainability assessments may be more attainable across urban areas.
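
    The Shannon Diversity Index used above has a one-line definition, H = -Σ pᵢ ln pᵢ over class proportions. A minimal sketch with made-up pixel counts per urban class:

        import numpy as np

        def shannon_diversity(class_counts):
            """Shannon Diversity Index H = -sum(p_i * ln p_i) over cover classes."""
            p = np.asarray(class_counts, float)
            p = p[p > 0] / p.sum()          # proportions; empty classes contribute 0
            return -np.sum(p * np.log(p))

        # Illustrative pixel counts: low/medium/high density and green space
        print(shannon_diversity([500, 300, 150, 50]))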

  9. The metrics of science and technology

    Geisler, Eliezer

    2000-01-01

    Dr. Geisler's far-reaching, unique book provides an encyclopedic compilation of the key metrics to measure and evaluate the impact of science and technology on academia, industry, and government. Focusing on such items as economic measures, patents, peer review, and other criteria, and supported by an extensive review of the literature, Dr. Geisler gives a thorough analysis of the strengths and weaknesses inherent in metric design, and in the use of the specific metrics he cites. His book has already received prepublication attention, and will prove especially valuable for academics in technology management, engineering, and science policy; industrial R&D executives and policymakers; government science and technology policymakers; and scientists and managers in government research and technology institutions. Geisler maintains that the application of metrics to evaluate science and technology at all levels illustrates the variety of tools we currently possess. Each metric has its own unique strengths and...

  10. Smart Grid Status and Metrics Report Appendices

    Balducci, Patrick J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Antonopoulos, Chrissi A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clements, Samuel L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gorrissen, Willy J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kirkham, Harold [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ruiz, Kathleen A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, David L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Weimar, Mark R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gardner, Chris [APQC, Houston, TX (United States); Varney, Jeff [APQC, Houston, TX (United States)

    2014-07-01

    A smart grid uses digital power control and communication technology to improve the reliability, security, flexibility, and efficiency of the electric system, from large generation through the delivery systems to electricity consumers and a growing number of distributed generation and storage resources. To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. The Smart Grid Status and Metrics Report defines and examines 21 metrics that collectively provide insight into the grid’s capacity to embody these characteristics. This appendix presents papers covering each of the 21 metrics identified in Section 2.1 of the Smart Grid Status and Metrics Report. These metric papers were prepared in advance of the main body of the report and collectively form its informational backbone.

  11. Evaluating Modeled Impact Metrics for Human Health, Agriculture Growth, and Near-Term Climate

    Seltzer, K. M.; Shindell, D. T.; Faluvegi, G.; Murray, L. T.

    2017-12-01

    Simulated metrics that assess impacts on human health, agriculture growth, and near-term climate were evaluated using ground-based and satellite observations. The NASA GISS ModelE2 and GEOS-Chem models were used to simulate the near-present chemistry of the atmosphere. A suite of simulations that varied by model, meteorology, horizontal resolution, emissions inventory, and emissions year was performed, enabling an analysis of metric sensitivities to various model components. All simulations utilized consistent anthropogenic global emissions inventories (ECLIPSE V5a or CEDS), and an evaluation of simulated results was carried out for 2004-2006 and 2009-2011 over the United States and 2014-2015 over China. Results for O3- and PM2.5-based metrics featured minor differences due to the model resolutions considered here (2.0° × 2.5° and 0.5° × 0.666°), and model, meteorology, and emissions inventory each played larger roles in variances. Surface metrics related to O3 were consistently biased high, though to varying degrees, demonstrating the need to evaluate particular modeling frameworks before O3 impacts are quantified. Surface metrics related to PM2.5 were diverse, indicating that a multimodel mean with robust results is a valuable tool in predicting PM2.5-related impacts. Oftentimes, the configuration that captured the change of a metric best over time differed from the configuration that captured the magnitude of the same metric best, demonstrating the challenge in skillfully simulating impacts. These results highlight the strengths and weaknesses of these models in simulating impact metrics related to air quality and near-term climate. With such information, the reliability of historical and future simulations can be better understood.

  12. Low-Resolution Tactile Image Recognition for Automated Robotic Assembly Using Kernel PCA-Based Feature Fusion and Multiple Kernel Learning-Based Support Vector Machine

    Yi-Hung Liu

    2014-01-01

    In this paper, we propose a robust tactile sensing image recognition scheme for automatic robotic assembly. First, an image preprocessing procedure is designed to enhance the contrast of the tactile image. In the second layer, geometric features and Fourier descriptors are extracted from the image. Then, kernel principal component analysis (kernel PCA) is applied to transform the features into ones with better discriminating ability, which is the kernel PCA-based feature fusion. The transformed features are fed into the third layer for classification. In this paper, we design a classifier by combining the multiple kernel learning (MKL) algorithm and support vector machine (SVM). We also design and implement a tactile sensing array consisting of 10-by-10 sensing elements. Experimental results, carried out on real tactile images acquired by the designed tactile sensing array, show that the kernel PCA-based feature fusion can significantly improve the discriminating performance of the geometric features and Fourier descriptors. Also, the designed MKL-SVM outperforms the regular SVM in terms of recognition accuracy. The proposed recognition scheme is able to achieve a high recognition rate of over 85% for the classification of 12 commonly used metal parts in industrial applications.
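
    The middle layers of this pipeline can be approximated with off-the-shelf tools. A hedged scikit-learn sketch with random stand-in features, and a plain RBF SVM in place of the authors' custom MKL-SVM:

        import numpy as np
        from sklearn.decomposition import KernelPCA
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        # Stand-ins for concatenated geometric features + Fourier descriptors
        X = rng.normal(size=(120, 20))
        y = rng.integers(0, 3, size=120)     # 3 toy part classes, for brevity

        # Kernel PCA fuses/transforms the raw features; an SVM classifies them
        clf = make_pipeline(KernelPCA(n_components=10, kernel="rbf", gamma=0.1),
                            SVC(kernel="rbf"))
        clf.fit(X[:100], y[:100])
        print(clf.score(X[100:], y[100:]))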

  13. High-resolution stratigraphy and multiple luminescence dating techniques to reveal the paleoseismic history of the central Dead Sea fault (Yammouneh fault, Lebanon)

    Le Béon, Maryline; Tseng, Ya-Chu; Klinger, Yann; Elias, Ata; Kunz, Alexander; Sursock, Alexandre; Daëron, Mathieu; Tapponnier, Paul; Jomaa, Rachid

    2018-07-01

    Continuous sedimentation and detailed stratigraphy are key parameters for a complete paleo-earthquake record. Here, we present a new paleoseismological study across the main strike-slip fault branch of the Dead Sea fault in Lebanon. We aim to expand the current knowledge on local paleoseismicity and seismic behavior of strike-slip plate boundary faults and to explore the limitations of paleoseismology and dating methods. The trench, dug in the Jbab el-Homr basin, reveals a succession of remarkable, very thin (0.1 to 5 cm) palustrine and lacustrine layers, ruptured by at least 17 earthquakes. Absolute ages of 4 samples are obtained from three luminescence-dating techniques targeting fine-grain minerals. Blue-green stimulated luminescence (BGSL) on quartz and post-infrared infrared-stimulated luminescence at 225 °C on polymineral aliquots led to consistent ages, while ages from infrared-stimulated luminescence at 50 °C on polymineral aliquots appeared underestimated. The quartz BGSL ages are 26.9 ± 2.3 ka at 0.50 m depth and 30.8 ± 2.9 ka at 3.65 m depth. During this time period of 3.9 ka ([0; 9.1 ka]), 14 surface-rupturing events occurred with a mean return time of 280 years ([0; 650 years]) and probable clustering. This return time is much shorter than the 1127 ± 135 years return time previously determined at the Yammouneh site, located 30 km south. Although fault segmentation and temporal variations in the earthquake cycle remain possible causes for such different records, we argue that the high-resolution stratigraphy in Jbab is the main factor, enabling us to record small deformations related to smaller-magnitude events that may have been missed in the rougher strata of Yammouneh. Indeed, focusing only on larger events of Jbab, we obtain a mean return time of 720 years ([0; 1670 years]) that is compatible with the Yammouneh record.

  14. Neurosurgical virtual reality simulation metrics to assess psychomotor skills during brain tumor resection.

    Azarnoush, Hamed; Alzhrani, Gmaan; Winkler-Schwartz, Alexander; Alotaibi, Fahad; Gelinas-Phaneuf, Nicholas; Pazos, Valérie; Choudhury, Nusrat; Fares, Jawad; DiRaddo, Robert; Del Maestro, Rolando F

    2015-05-01

    Virtual reality simulator technology together with novel metrics could advance our understanding of expert neurosurgical performance and modify and improve resident training and assessment. This pilot study introduces innovative metrics that can be measured by the state-of-the-art simulator to assess performance. Such metrics cannot be measured in an operating room and have not been used previously to assess performance. Three sets of performance metrics were assessed utilizing the NeuroTouch platform in six scenarios with simulated brain tumors having different visual and tactile characteristics. Tier 1 metrics included percentage of brain tumor resected and volume of simulated "normal" brain tissue removed. Tier 2 metrics included instrument tip path length, time taken to resect the brain tumor, pedal activation frequency, and sum of applied forces. Tier 3 metrics included sum of forces applied to different tumor regions and the force bandwidth derived from the force histogram. The results outlined are from a novice resident in the second year of training and an expert neurosurgeon. The three tiers of metrics obtained from the NeuroTouch simulator do encompass the wide variability of technical performance observed during novice/expert resections of simulated brain tumors and can be employed to quantify the safety, quality, and efficiency of technical performance during simulated brain tumor resection. Tier 3 metrics derived from force pyramids and force histograms may be particularly useful in assessing simulated brain tumor resections. Our pilot study demonstrates that the safety, quality, and efficiency of novice and expert operators can be measured using metrics derived from the NeuroTouch platform, helping to understand how specific operator performance is dependent on both psychomotor ability and cognitive input during multiple virtual reality brain tumor resections.

  15. The effect of growth interruptions at the interfaces in epitaxially grown GaInAsSb/AlGaAsSb multiple-quantum-wells studied with high-resolution x-ray diffraction and photoluminescence

    Selvig, E; Myrvaagnes, G; Bugge, R; Haakenaasen, R; Fimland, B O

    2006-01-01

    Molecular beam epitaxy has been used to grow GaInAsSb/AlGaAsSb multiple-quantum-well (MQW) structures. Growth has been interrupted at the interfaces between the wells and the barriers. During the growth interruptions, the interfaces have been exposed to Sbₓ (x = 1, 2) and As₂ fluxes. The structures have been studied using high-resolution x-ray diffraction (HRXRD) and photoluminescence (PL). The As content in the interface layers has been found to have a large impact on the HRXRD curves. The As content in the interface layers has been determined by simulation of HRXRD rocking curves. We also show how highly strained interfaces cause more satellite peaks to appear in HRXRD rocking curves. PL spectra show that interrupting growth at the interfaces between wells and barriers and exposing the interfaces to an Sb soak result in flatter interfaces.

  16. Partial rectangular metric spaces and fixed point theorems.

    Shukla, Satish

    2014-01-01

    The purpose of this paper is to introduce the concept of partial rectangular metric spaces as a generalization of rectangular metric and partial metric spaces. Some properties of partial rectangular metric spaces and some fixed point results for quasitype contraction in partial rectangular metric spaces are proved. Some examples are given to illustrate the observed results.
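
    For readers new to the generalization: a rectangular (Branciari) metric replaces the triangle inequality with a quadrilateral inequality, and the partial variant further allows d(x, x) > 0. A standard statement of the quadrilateral inequality (paraphrased from the general literature, not from this paper):

        d(x, y) \le d(x, u) + d(u, v) + d(v, y)
        \qquad \text{for all } x, y \text{ and all distinct } u, v \notin \{x, y\}.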

  17. High-resolution slab gel isoelectric focusing: methods for quantitative electrophoretic transfer and immunodetection of proteins as applied to the study of the multiple isoelectric forms of ornithine decarboxylase.

    Reddy, S G; Cochran, B J; Worth, L L; Knutson, V P; Haddox, M K

    1994-04-01

    A high-resolution isoelectric focusing vertical slab gel method which can resolve proteins which differ by a single charge was developed and this method was applied to the study of the multiple isoelectric forms of ornithine decarboxylase. Separation of proteins at this high level of resolution was achieved by increasing the ampholyte concentration in the gels to 6%. Various lots of ampholytes, from the same or different commercial sources, differed significantly in their protein binding capacity. Ampholytes bound to proteins interfered both with the electrophoretic transfer of proteins from the gel to immunoblotting membranes and with the ability of antibodies to interact with proteins on the immunoblotting membranes. Increasing the amount of protein loaded into a gel lane also decreased the efficiency of the electrophoretic transfer and immunodetection. To overcome these problems, both gel washing and gel electrophoretic transfer protocols for disrupting the ampholyte-protein binding and enabling a quantitative electrophoretic transfer of proteins were developed. Two gel washing procedures, with either thiocyanate or borate buffers, and a two-step electrophoretic transfer method are described. The choice of which method to use to optimally disrupt the ampholyte-protein binding was found to vary with each lot of ampholytes employed.

  18. An Arabidopsis introgression zone studied at high spatio-temporal resolution: interglacial and multiple genetic contact exemplified using whole nuclear and plastid genomes.

    Hohmann, Nora; Koch, Marcus A

    2017-10-23

    Northeastern Forealps are older than expected and predate the Last Glacial Maximum. This correlates well with high genetic diversity found within areas that served as refuge area multiple times. Our data also provide some first hints that early introgressed and presumably preadapted populations account for successful and rapid postglacial re-colonization and range expansion.

  19. Disaster damage detection through synergistic use of deep learning and 3D point cloud features derived from very high resolution oblique aerial images, and multiple-kernel-learning

    Vetrivel, Anand; Gerke, Markus; Kerle, Norman; Nex, Francesco; Vosselman, George

    2018-06-01

    Oblique aerial images offer views of both building roofs and façades, and thus have been recognized as a potential source to detect severe building damages caused by destructive disaster events such as earthquakes. Therefore, they represent an important source of information for first responders or other stakeholders involved in the post-disaster response process. Several automated methods based on supervised learning have already been demonstrated for damage detection using oblique airborne images. However, they often do not generalize well when data from new unseen sites need to be processed, hampering their practical use. Reasons for this limitation include image and scene characteristics, though the most prominent one relates to the image features being used for training the classifier. Recently features based on deep learning approaches, such as convolutional neural networks (CNNs), have been shown to be more effective than conventional hand-crafted features, and have become the state-of-the-art in many domains, including remote sensing. Moreover, often oblique images are captured with high block overlap, facilitating the generation of dense 3D point clouds - an ideal source to derive geometric characteristics. We hypothesized that the use of CNN features, either independently or in combination with 3D point cloud features, would yield improved performance in damage detection. To this end we used CNN and 3D features, both independently and in combination, using images from manned and unmanned aerial platforms over several geographic locations that vary significantly in terms of image and scene characteristics. A multiple-kernel-learning framework, an effective way for integrating features from different modalities, was used for combining the two sets of features for classification. The results are encouraging: while CNN features produced an average classification accuracy of about 91%, the integration of 3D point cloud features led to an additional
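
    To make the multiple-kernel-learning step concrete: each modality (CNN features, 3D point cloud features) gets its own kernel, and a classifier operates on a combination of the two. The sketch below uses a fixed weight and random stand-in features; a full MKL solver, like the framework the authors use, would learn the weights.

        import numpy as np
        from sklearn.metrics.pairwise import rbf_kernel
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X_cnn = rng.normal(size=(200, 64))   # stand-in CNN features per patch
        X_3d = rng.normal(size=(200, 12))    # stand-in 3D point cloud features
        y = rng.integers(0, 2, size=200)     # damaged / undamaged (toy labels)

        def combined_kernel(a_cnn, b_cnn, a_3d, b_3d, w=0.7):
            """Convex combination of per-modality RBF kernels."""
            return w * rbf_kernel(a_cnn, b_cnn) + (1 - w) * rbf_kernel(a_3d, b_3d)

        K_train = combined_kernel(X_cnn[:150], X_cnn[:150], X_3d[:150], X_3d[:150])
        K_test = combined_kernel(X_cnn[150:], X_cnn[:150], X_3d[150:], X_3d[:150])
        clf = SVC(kernel="precomputed").fit(K_train, y[:150])
        print(clf.score(K_test, y[150:]))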

  20. Improving resolution of public health surveillance for human Salmonella enterica serovar Typhimurium infection: 3 years of prospective multiple-locus variable-number tandem-repeat analysis (MLVA)

    Sintchenko Vitali

    2012-03-01

    Background: Prospective typing of Salmonella enterica serovar Typhimurium (STM) by multiple-locus variable-number tandem-repeat analysis (MLVA) can assist in identifying clusters of STM cases that might otherwise have gone unrecognised, as well as sources of sporadic and outbreak cases. This paper describes the dynamics of human STM infection in a prospective study of STM MLVA typing for public health surveillance. Methods: During a three-year period between August 2007 and September 2010, all confirmed STM isolates were fingerprinted using MLVA as part of the New South Wales (NSW) state public health surveillance program. Results: A total of 4,920 STM isolates were typed and a subset of 4,377 human isolates was included in the analysis. The STM spectrum was dominated by a small number of phage types, including DT170 (44.6% of all isolates), DT135 (13.9%), DT9 (10.8%), DT44 (4.5%) and DT126 (4.5%). There was a difference in the discriminatory power of MLVA types within endemic phage types: Simpson's index of diversity ranged from 0.109 and 0.113 for DTs 9 and 135 to 0.172 and 0.269 for DTs 170 and 44, respectively. 66 distinct STM clusters were observed, ranging in size from 5 to 180 cases and in duration from 4 weeks to 25 weeks. 43 clusters had novel MLVA types and 23 represented recurrences of previously recorded MLVA types. The diversity of the STM population remained relatively constant over time. The gradual increase in the number of STM cases during the study was not related to significant changes in the number of clusters or their size. 667 different MLVA types or patterns were observed. Conclusions: Prospective MLVA typing of STM allows the detection of community outbreaks and demonstrates the sustained level of STM diversity that accompanies the increasing incidence of human STM infections. The monitoring of novel and persistent MLVA types offers a new benchmark for STM surveillance. A part of this study was presented at the MEEGID ...
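
    The Simpson's index of diversity values quoted here are commonly computed with the estimator D = 1 - Σ nᵢ(nᵢ - 1) / (N(N - 1)). A minimal sketch (the MLVA profile strings are toy labels, not study data):

        from collections import Counter

        def simpsons_diversity(type_labels):
            """Simpson's index of diversity: the probability that two isolates
            drawn without replacement have different types."""
            counts = Counter(type_labels).values()
            n = sum(counts)
            return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

        # Toy MLVA profiles for isolates of a single phage type
        print(simpsons_diversity(["3-9-7-12-523", "3-9-7-12-523", "3-10-7-12-523"]))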

  1. Measuring Information Security: Guidelines to Build Metrics

    von Faber, Eberhard

    Measuring information security is a genuine interest of security managers. With metrics they can develop their security organization's visibility and standing within the enterprise or public authority as a whole. Organizations using information technology need to use security metrics. Despite the clear demands and advantages, security metrics are often poorly developed or ineffective parameters are collected and analysed. This paper describes best practices for the development of security metrics. First attention is drawn to motivation showing both requirements and benefits. The main body of this paper lists things which need to be observed (characteristics of metrics), things which can be measured (how measurements can be conducted) and steps for the development and implementation of metrics (procedures and planning). Analysis and communication are also key when using security metrics. Examples are also given in order to develop a better understanding. The author aims to resume, continue and develop the discussion of a topic which is, or increasingly will be, a critical success factor for security managers in larger organizations.

  2. Characterising risk - aggregated metrics: radiation and noise

    Passchier, W.

    1998-01-01

    The characterisation of risk is an important phase in the risk assessment - risk management process. From the multitude of risk attributes a few have to be selected to obtain a risk characteristic or profile that is useful for risk management decisions and implementation of protective measures. One way to reduce the number of attributes is aggregation. In the field of radiation protection such an aggregated metric is firmly established: effective dose. For protection against environmental noise the Health Council of the Netherlands recently proposed a set of aggregated metrics for noise annoyance and sleep disturbance. The presentation will discuss similarities and differences between these two metrics and practical limitations. The effective dose has proven its usefulness in designing radiation protection measures, which are related to the level of risk associated with the radiation practice in question, given that implicit judgements on radiation-induced health effects are accepted. However, as the metric does not take into account the nature of the radiation practice, it is less useful in policy discussions on the benefits and harm of radiation practices. With respect to the noise exposure metric, only one effect is targeted (annoyance), and the differences between sources are explicitly taken into account. This should make the metric useful in policy discussions with respect to physical planning and siting problems. The metric proposed has significance only on a population level, and cannot be used as a predictor for individual risk. (author)
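
    For reference, the aggregation that effective dose performs can be written in the standard ICRP form (stated from general knowledge of the scheme, not taken from this abstract):

        E = \sum_T w_T H_T = \sum_T w_T \sum_R w_R D_{T,R},

    where D_{T,R} is the mean absorbed dose to tissue T from radiation type R, w_R is the radiation weighting factor, and the tissue weighting factors w_T sum to 1.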

  3. Energy functionals for Calabi-Yau metrics

    Headrick, M; Nassar, A

    2013-01-01

    We identify a set of "energy" functionals on the space of metrics in a given Kähler class on a Calabi-Yau manifold, which are bounded below and minimized uniquely on the Ricci-flat metric in that class. Using these functionals, we recast the problem of numerically solving the Einstein equation as an optimization problem. We apply this strategy, using the "algebraic" metrics (metrics for which the Kähler potential is given in terms of a polynomial in the projective coordinates), to the Fermat quartic and to a one-parameter family of quintics that includes the Fermat and conifold quintics. We show that this method yields approximations to the Ricci-flat metric that are exponentially accurate in the degree of the polynomial (except at the conifold point, where the convergence is polynomial), and therefore orders of magnitude more accurate than the balanced metrics, previously studied as approximations to the Ricci-flat metric. The method is relatively fast and easy to implement. On the theoretical side, we also show that the functionals can be used to give a heuristic proof of Yau's theorem

  4. Interactive Mapping of Inundation Metrics Using Cloud Computing for Improved Floodplain Conservation and Management

    Bulliner, E. A., IV; Lindner, G. A.; Bouska, K.; Paukert, C.; Jacobson, R. B.

    2017-12-01

    Within large-river ecosystems, floodplains serve a variety of important ecological functions. A recent survey of 80 managers of floodplain conservation lands along the Upper and Middle Mississippi and Lower Missouri Rivers in the central United States found that the most critical information needed to improve floodplain management centered on metrics for characterizing depth, extent, frequency, duration, and timing of inundation. These metrics can be delivered to managers efficiently through cloud-based interactive maps. To calculate these metrics, we interpolated an existing one-dimensional hydraulic model for the Lower Missouri River, which simulated water surface elevations at cross sections spaced (...) apart. To translate these water surface elevations to inundation depths, we subtracted a merged terrain model consisting of floodplain LIDAR and bathymetric surveys of the river channel. This approach resulted in a 29000+ day time series of inundation depths across the floodplain using grid cells with 30 m spatial resolution. Initially, we used these data on a local workstation to calculate a suite of nine spatially distributed inundation metrics for the entire model domain. These metrics are calculated on a per pixel basis and encompass a variety of temporal criteria generally relevant to flora and fauna of interest to floodplain managers, including, for example, the average number of days inundated per year within a growing season. Using a local workstation, calculating these metrics for the entire model domain requires several hours. However, for the needs of individual floodplain managers working at site scales, these metrics may be too general and inflexible. Instead of creating a priori a suite of inundation metrics able to satisfy all user needs, we present the usage of Google's cloud-based Earth Engine API to allow users to define and query their own inundation metrics from our dataset and produce maps nearly instantaneously. This approach allows users to
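
    One of the per-pixel metrics described (average days inundated per growing season) is simple to state in numpy. A minimal sketch in which the season window, depth threshold, and data are all illustrative, not the project's actual parameters:

        import numpy as np

        def mean_days_inundated(depth_stack, day_of_year, season=(91, 273),
                                min_depth=0.01):
            """Per-pixel average number of inundated days per growing season.

            depth_stack : (n_days, ny, nx) daily inundation depths in meters
            day_of_year : (n_days,) day-of-year for each time step
            season      : inclusive day-of-year window (roughly Apr-Sep here)
            """
            in_season = (day_of_year >= season[0]) & (day_of_year <= season[1])
            wet = depth_stack[in_season] > min_depth
            n_years = in_season.sum() / (season[1] - season[0] + 1)
            return wet.sum(axis=0) / n_years

        depths = np.random.default_rng(0).random((730, 4, 4)) * 0.05  # 2 toy years
        doy = np.tile(np.arange(1, 366), 2)[:730]
        print(mean_days_inundated(depths, doy).round(1))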

  5. Metrics Are Needed for Collaborative Software Development

    Mojgan Mohtashami

    2011-10-01

    There is a need for metrics for inter-organizational collaborative software development projects, encompassing management and technical concerns. In particular, metrics are needed that are aimed at the collaborative aspect itself, such as readiness for collaboration, the quality and/or the costs and benefits of collaboration in a specific ongoing project. We suggest questions and directions for such metrics, spanning the full lifespan of a collaborative project, from considering the suitability of collaboration through evaluating ongoing projects to final evaluation of the collaboration.

  6. Indefinite metric fields and the renormalization group

    Sherry, T.N.

    1976-11-01

    The renormalization group equations are derived for the Green functions of an indefinite metric field theory. In these equations one retains the mass dependence of the coefficient functions, since in the indefinite metric theories the masses cannot be neglected. The behavior of the effective coupling constant in the asymptotic and infrared limits is analyzed. The analysis is illustrated by means of a simple model incorporating indefinite metric fields. The model scales at first order, and at this order also the effective coupling constant has both ultra-violet and infra-red fixed points, the former being the bare coupling constant

  7. Metric learning for DNA microarray data analysis

    Takeuchi, Ichiro; Nakagawa, Masao; Seto, Masao

    2009-01-01

    In many microarray studies, gene set selection is an important preliminary step for subsequent main task such as tumor classification, cancer subtype identification, etc. In this paper, we investigate the possibility of using metric learning as an alternative to gene set selection. We develop a simple metric learning algorithm aiming to use it for microarray data analysis. Exploiting a property of the algorithm, we introduce a novel approach for extending the metric learning to be adaptive. We apply the algorithm to previously studied microarray data on malignant lymphoma subtype identification.

  8. Software metrics a rigorous and practical approach

    Fenton, Norman

    2014-01-01

    A Framework for Managing, Measuring, and Predicting Attributes of Software Development Products and ProcessesReflecting the immense progress in the development and use of software metrics in the past decades, Software Metrics: A Rigorous and Practical Approach, Third Edition provides an up-to-date, accessible, and comprehensive introduction to software metrics. Like its popular predecessors, this third edition discusses important issues, explains essential concepts, and offers new approaches for tackling long-standing problems.New to the Third EditionThis edition contains new material relevant

  9. Computing the Gromov hyperbolicity constant of a discrete metric space

    Ismail, Anas

    2012-07-01

    Although it was invented by Mikhail Gromov, in 1987, to describe a family of groups [1], the notion of Gromov hyperbolicity has many applications and interpretations in different fields. It has applications in biology, networking, graph theory, and many other areas of research. The Gromov hyperbolicity constant of several families of graphs and geometric spaces has been determined. However, so far, the only known algorithm for calculating the Gromov hyperbolicity constant δ of a discrete metric space is the brute force algorithm with running time O(n^4) using the four-point condition. In this thesis, we first introduce an approximation algorithm which calculates an O(log n)-approximation of the hyperbolicity constant δ, based on a layering approach, in time O(n^2), where n is the number of points in the metric space. We also calculate the fixed base point hyperbolicity constant δ_r for a fixed point r using a (max, min)-matrix multiplication algorithm by Duan in time O(n^2.688) [2]. We use this result to present a 2-approximation algorithm for calculating the hyperbolicity constant in time O(n^2.688). We also provide an exact algorithm to compute the hyperbolicity constant δ in time O(n^3.688) for a discrete metric space. We then present some partial results we obtained for designing some approximation algorithms to compute the hyperbolicity constant δ.
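
    The brute-force baseline mentioned in the abstract is short to state. A minimal sketch of the O(n^4) four-point computation (the distance matrix and test points are illustrative):

        import itertools
        import numpy as np

        def gromov_delta(D):
            """Brute-force O(n^4) hyperbolicity constant via the four-point condition."""
            delta = 0.0
            for x, y, z, w in itertools.combinations(range(len(D)), 4):
                # Of the three pairings, the two largest sums differ by <= 2*delta
                sums = sorted((D[x][y] + D[z][w],
                               D[x][z] + D[y][w],
                               D[x][w] + D[y][z]))
                delta = max(delta, (sums[2] - sums[1]) / 2.0)
            return delta

        # The path metric on collinear points is a tree metric, hence 0-hyperbolic
        pts = np.array([0.0, 1.0, 3.0, 6.0])
        D = np.abs(pts[:, None] - pts[None, :])
        print(gromov_delta(D))   # -> 0.0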

  10. Simulation and assessment of urbanization impacts on runoff metrics

    Zhang, Yongyong; Xia, Jun; Yu, Jingjie

    2018-01-01

    Urbanization-induced landuse changes alter runoff regimes in complex ways. In this study, a detailed investigation of the urbanization impacts on runoff regimes is provided by using multiple runoff metrics and with consideration of landuse dynamics. A catchment hydrological model is modified ... changes. The Qing River catchment, a peri-urban catchment in the Beijing metropolitan area, is selected as our study region. Results show that: (1) the dryland agriculture is decreased from 13.9% to 1.5% of the total catchment area in the years 2000–2015, while the percentages of impervious surface ... information for urban planning such as Sponge City design.

  11. WE-FG-207B-09: Experimental Assessment of Noise and Spatial Resolution in Virtual Non-Contrast Dual-Energy CT Images Across Multiple Patient Sizes and CT Systems

    Montoya, J; Ferrero, A; Yu, L; Leng, S; McCollough, C [Mayo Clinic, Rochester, MN (United States)]

    2016-01-01

    Purpose: To investigate the noise and spatial resolution properties of virtual non-contrast (VNC) dual-energy CT images compared to true non-contrast (TNC) images across multiple patient sizes and CT systems. Methods: Torso-shaped water phantoms with lateral widths of 25, 30, 35, 40 and 45 cm and a high resolution bar pattern phantom (Catphan CTP528) were scanned using 2nd and 3rd generation dual-source CT systems (Scanner A: Somatom Definition Flash, Scanner B: Somatom Force, Siemens Healthcare) in dual-energy scan mode with the same radiation dose for a given phantom size. Tube potentials of 80/Sn140 and 100/Sn140 on Scanner A and 80/Sn150, 90/Sn150 and 100/Sn150 on Scanner B were evaluated to examine the impact of spectral separation. Images were reconstructed using a medium sharp quantitative kernel (Qr40), 1.0-mm thickness, 1.0-mm interval and 20 cm field of view. Mixed images served as TNC images. VNC images were created using commercial software (Virtual Unenhanced, Syngo VIA Version VA30, Siemens Healthcare). The noise power spectrum (NPS), area under the NPS, peak frequency of the NPS and image noise were measured for every phantom size and tube potential combination in TNC and VNC images. Results were compared within and between CT systems. Results: Minimal shift in NPS peak frequencies was observed in VNC images compared to TNC for NPS having pronounced peaks. Image noise and area under the NPS were higher in VNC images compared to TNC images across all tube potentials and for scanner A compared to scanner B. Limiting spatial resolution was deemed to be identical between VNC and TNC images. Conclusion: Quantitative assessment of image quality in VNC images demonstrated higher noise but equivalent spatial resolution compared to TNC images. Decreased noise was observed in the 3rd generation dual-source CT system for tube potential pairs having greater spectral separation. Dr. McCollough receives research support from Siemens Healthcare
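
    As a hedged aside, the noise power spectrum measurement described here follows a standard recipe that can be sketched as below (a generic 2D NPS estimator, not the exact workflow of this abstract):

      import numpy as np

      def nps_2d(rois, px):
          """rois: list of equally sized 2D arrays from a uniform phantom scan;
          px: pixel size (mm). Returns the 2D NPS and its integral."""
          ny, nx = rois[0].shape
          acc = np.zeros((ny, nx))
          for roi in rois:
              detrended = roi - roi.mean()                 # remove the mean (DC) term
              f = np.fft.fftshift(np.fft.fft2(detrended))
              acc += (np.abs(f) ** 2) * (px * px) / (nx * ny)
          nps = acc / len(rois)
          # integrating the NPS over spatial frequency recovers the pixel noise variance
          area = nps.sum() / (nx * px * ny * px)
          return nps, area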

  13. Metrics, Media and Advertisers: Discussing Relationship

    Marco Aurelio de Souza Rodrigues

    2014-11-01

    This study investigates how Brazilian advertisers are adapting to new media and its attention metrics. In-depth interviews were conducted with advertisers in 2009 and 2011. In 2009, new media and its metrics were celebrated as innovations that would increase advertising campaigns' overall efficiency. By 2011, this perception had changed: new media's profusion of metrics, once seen as an advantage, had started to compromise ease of use and adoption. Among its findings, this study argues that there is an opportunity for media groups willing to shift from a product-focused strategy towards a customer-centric one, through the creation of new, simple and integrative metrics.

  14. Networks and centroid metrics for understanding football

    Gonçalo Dias

    ... games. However, it seems that the centroid metric, supported only by the position of players in the field, ... the strategy adopted by the coach (Gama et al., 2014). ... centroid distance as measures of a team's tactical performance in youth football.

  15. Clean Cities Annual Metrics Report 2009 (Revised)

    Johnson, C.

    2011-08-01

    Document provides Clean Cities coalition metrics about the use of alternative fuels; the deployment of alternative fuel vehicles, hybrid electric vehicles (HEVs), and idle reduction initiatives; fuel economy activities; and programs to reduce vehicle miles driven.

  16. Metric Guidelines Inservice and/or Preservice

    Granito, Dolores

    1978-01-01

    Guidelines are given for designing teacher training for going metric. The guidelines were developed from existing guidelines, journal articles, a survey of colleges, and the detailed reactions of a panel. (MN)

  17. Science and Technology Metrics and Other Thoughts

    Harman, Wayne; Staton, Robin

    2006-01-01

    This report explores the subject of science and technology metrics and other topics to begin to provide Navy managers, as well as scientists and engineers, additional tools and concepts with which to...

  18. Using Activity Metrics for DEVS Simulation Profiling

    Muzy A.

    2014-01-01

    Activity metrics can be used to profile DEVS models before and during the simulation, and it is critical to get good activity metrics at both stages. Having a means to compute the a priori activity of components (analytic activity) may be worthwhile when simulating a model (or parts of it) for the first time. Afterwards, during the simulation, the analytic activity can be corrected using the dynamic one. In this paper, we introduce the McCabe cyclomatic complexity metric (MCA) to compute analytic activity. Both static and simulation activity metrics have been implemented through a plug-in of the DEVSimPy (DEVS Simulator in Python) environment and applied to DEVS models.
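
    For readers unfamiliar with McCabe's measure, it is M = E - N + 2P on a control-flow graph with E edges, N nodes, and P connected components; a minimal sketch (our toy example, not DEVSimPy code):

      def cyclomatic_complexity(edges, nodes, components=1):
          """McCabe cyclomatic complexity M = E - N + 2P."""
          return len(edges) - len(nodes) + 2 * components

      # an if/else inside a loop: 6 nodes, 7 edges -> M = 3
      nodes = ["entry", "loop", "if", "then", "else", "exit"]
      edges = [("entry", "loop"), ("loop", "if"), ("if", "then"), ("if", "else"),
               ("then", "loop"), ("else", "loop"), ("loop", "exit")]
      print(cyclomatic_complexity(edges, nodes))  # 3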

  19. Evaluating and Estimating the WCET Criticality Metric

    Jordan, Alexander

    2014-01-01

    ... a programmer (or compiler) from targeting optimizations the right way. A possible resort is to use a metric that targets WCET and which can be efficiently computed for all code parts of a program. Similar to dynamic profiling techniques, which execute code with input that is typically expected for the application, based on WCET analysis we can indicate how critical a code fragment is in relation to the worst-case bound. Computing such a metric on top of static analysis incurs a certain overhead though, which increases with the complexity of the underlying WCET analysis. We present our approach to estimate the Criticality metric by relaxing the precision of WCET analysis. Through this, we can reduce analysis time by orders of magnitude, while only introducing minor error. To evaluate our estimation approach and share our garnered experience using the metric, we evaluate real-time programs, which ...

  20. 16 CFR 1511.8 - Metric references.

    2010-01-01

    16 CFR 1511.8 (2010), Commercial Practices: Consumer Product Safety Commission, Federal Hazardous Substances Act regulations ... parentheses for convenience and information only. ...

  1. Flight Crew State Monitoring Metrics, Phase I

    National Aeronautics and Space Administration — eSky will develop specific crew state metrics based on the timeliness, tempo and accuracy of pilot inputs required by the H-mode Flight Control System (HFCS)....

  2. Supplier selection using different metric functions

    Omosigho S.E.

    2015-01-01

    Supplier selection is an important component of supply chain management in today's global competitive environment. Hence, the evaluation and selection of suppliers have received considerable attention in the literature. Many attributes of suppliers, other than cost, are considered in the evaluation and selection process. Therefore, the process of evaluation and selection of suppliers is a multi-criteria decision making process. The methodology adopted to solve the supplier selection problem is intuitionistic fuzzy TOPSIS (Technique for Order Preference by Similarity to the Ideal Solution). Generally, TOPSIS is based on the concept of minimum distance from the positive ideal solution and maximum distance from the negative ideal solution. We examine the deficiencies of using only one metric function in TOPSIS and propose the use of the spherical metric function in addition to the commonly used metric functions. For empirical supplier selection problems, more than one metric function should be used.
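
    A minimal sketch of crisp TOPSIS with a swappable metric function may help; the paper itself uses an intuitionistic fuzzy variant and a spherical metric, neither of which is reproduced here. Changing `p` changes the metric: p=2 is the usual Euclidean choice, p=1 rectilinear, large p approximates Chebyshev.

      import numpy as np

      def topsis(scores, weights, p=2):
          # scores: suppliers x criteria, higher is better; weights sum to 1
          norm = scores / np.linalg.norm(scores, axis=0)
          v = norm * weights
          ideal, anti = v.max(axis=0), v.min(axis=0)
          d_pos = (np.abs(v - ideal) ** p).sum(axis=1) ** (1 / p)
          d_neg = (np.abs(v - anti) ** p).sum(axis=1) ** (1 / p)
          return d_neg / (d_pos + d_neg)   # closeness: rank suppliers by this

      scores = np.array([[7, 9, 9], [8, 7, 8], [9, 6, 8]], dtype=float)
      print(topsis(scores, np.array([0.5, 0.3, 0.2])))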

  3. Evaluation Metrics for Simulations of Tropical South America

    Gallup, S.; Baker, I. T.; Denning, A. S.; Cheeseman, M.; Haynes, K. D.; Phillips, M.

    2017-12-01

    The evergreen broadleaf forest of the Amazon Basin is the largest rainforest on earth, and has teleconnections to global climate and carbon cycle characteristics. This region defies simple characterization, spanning large gradients in total rainfall and seasonal variability. Broadly, the region can be thought of as trending from light-limited in its wettest areas to water-limited near the ecotone, with individual landscapes possibly exhibiting the characteristics of either (or both) limitations during an annual cycle. A basin-scale classification of mean behavior has been elusive, and ecosystem response to seasonal cycles and anomalous drought events has resulted in some disagreement in the literature, to say the least. However, new observational platforms and instruments make characterization of the heterogeneity and variability more feasible. To evaluate simulations of ecophysiological function, we develop metrics that correlate various observational products with meteorological variables such as precipitation and radiation. Observations include eddy covariance fluxes, Solar Induced Fluorescence (SIF, from GOME2 and OCO2), biomass and vegetation indices. We find that the modest correlation between SIF and precipitation decreases with increasing annual precipitation, although the relationship is not consistent between products. Biomass increases with increasing precipitation. Although vegetation indices are generally correlated with biomass and precipitation, they can saturate or experience retrieval issues during cloudy periods. Using these observational products and relationships, we develop a set of model evaluation metrics. These metrics are designed to call attention to models that get "the right answer only if it's for the right reason," and provide an opportunity for more critical evaluation of model physics. These metrics represent a testbed that can be applied to multiple models as a means to evaluate their performance in tropical South America.

  4. Classroom reconstruction of the Schwarzschild metric

    Kassner, Klaus

    2015-01-01

    A promising way to introduce general relativity in the classroom is to study the physical implications of certain given metrics, such as the Schwarzschild one. This involves lower mathematical expenditure than an approach focusing on differential geometry in its full glory and permits emphasis of physical aspects before attacking the field equations. Even so, in terms of motivation, a lack of justification for the metric employed may pose an obstacle. The paper discusses how to establish the we...

  5. Marketing communication metrics for social media

    Töllinen, Aarne; Karjaluoto, Heikki

    2011-01-01

    The objective of this paper is to develop a conceptual framework for measuring the effectiveness of social media marketing communications. Specifically, we study whether the existing marketing communications performance metrics are still valid in the changing digitalised communications landscape, or whether it is time to rethink them, or even to devise entirely new metrics. Recent advances in information technology and marketing bring a need to re-examine measurement models. We combine two im...

  6. Some observations on a fuzzy metric space

    Gregori, V.

    2017-07-01

    Let $(X,d)$ be a metric space. In this paper we provide some observations about the fuzzy metric space in the sense of Kramosil and Michalek $(Y,N,\wedge)$, where $Y$ is the set of non-negative real numbers $[0,\infty[$ and $N(x,y,t)=1$ if $d(x,y)\leq t$ and $N(x,y,t)=0$ if $d(x,y)>t$. (Author)

  7. Area Regge calculus and discontinuous metrics

    Wainwright, Chris; Williams, Ruth M

    2004-01-01

    Taking the triangle areas as independent variables in the theory of Regge calculus can lead to ambiguities in the edge lengths, which can be interpreted as discontinuities in the metric. We construct solutions to area Regge calculus using a triangulated lattice and find that on a spacelike or timelike hypersurface no such discontinuity can arise. On a null hypersurface however, we can have such a situation and the resulting metric can be interpreted as a so-called refractive wave

  8. Relaxed metrics and indistinguishability operators: the relationship

    Martin, J.

    2017-07-01

    In 1982, the notion of indistinguishability operator was introduced by E. Trillas in order to fuzzify the crisp notion of equivalence relation (\cite{Trillas}). In the study of such a class of operators, an outstanding property must be pointed out. Concretely, there exists a duality relationship between indistinguishability operators and metrics. The aforesaid relationship was deeply studied by several authors who introduced a few techniques to generate metrics from indistinguishability operators and vice-versa (see, for instance, \cite{BaetsMesiar,BaetsMesiar2}). In the last years a new generalization of the metric notion has been introduced in the literature with the purpose of developing mathematical tools for quantitative models in Computer Science and Artificial Intelligence (\cite{BKMatthews,Ma}). The aforementioned generalized metrics are known as relaxed metrics. The main target of this talk is to present a study of the duality relationship between indistinguishability operators and relaxed metrics in such a way that the aforementioned classical techniques to generate both concepts, one from the other, can be extended to the new framework. (Author)

  9. Localized Multi-Model Extremes Metrics for the Fourth National Climate Assessment

    Thompson, T. R.; Kunkel, K.; Stevens, L. E.; Easterling, D. R.; Biard, J.; Sun, L.

    2017-12-01

    We have performed localized analysis of scenario-based datasets for the Fourth National Climate Assessment (NCA4). These datasets include CMIP5-based Localized Constructed Analogs (LOCA) downscaled simulations at daily temporal resolution and 1/16th-degree spatial resolution. Over 45 temperature and precipitation extremes metrics have been processed using LOCA data, including threshold, percentile, and degree-days calculations. The localized analysis calculates trends in the temperature and precipitation extremes metrics for relatively small regions such as counties, metropolitan areas, climate zones, administrative areas, or economic zones. For NCA4, we are currently addressing metropolitan areas as defined by U.S. Census Bureau Metropolitan Statistical Areas. Such localized analysis provides essential information for adaptation planning at scales relevant to local planning agencies and businesses. Nearly 30 such regions have been analyzed to date. Each locale is defined by a closed polygon that is used to extract LOCA-based extremes metrics specific to the area. For each metric, single-model data at each LOCA grid location are first averaged over several 30-year historical and future periods. Then, for each metric, the spatial average across the region is calculated using model weights based on both model independence and reproducibility of current climate conditions. The range of single-model results is also captured on the same localized basis, and then combined with the weighted ensemble average for each region and each metric. For example, Boston-area cooling degree days and maximum daily temperature were analyzed for the RCP8.5 and RCP4.5 scenarios. We also discuss inter-regional comparison of these metrics, as well as their relevance to risk analysis for adaptation planning.
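
    The aggregation pipeline described (time average per model, spatial average over the region polygon's grid cells, then a weighted multi-model combination) can be sketched as follows; the array layout and names are our assumptions:

      import numpy as np

      def regional_metric(metric, weights):
          """metric: models x years x cells (cells already subset to the polygon);
          weights: per-model skill/independence weights."""
          per_model = metric.mean(axis=(1, 2))          # 30-yr, region mean per model
          ensemble = np.average(per_model, weights=weights)
          spread = (per_model.min(), per_model.max())   # single-model range
          return ensemble, spread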

  10. Effects of metric hierarchy and rhyme predictability on word duration in The Cat in the Hat.

    Breen, Mara

    2018-05-01

    Word durations convey many types of linguistic information, including intrinsic lexical features like length and frequency and contextual features like syntactic and semantic structure. The current study was designed to investigate whether hierarchical metric structure and rhyme predictability account for durational variation over and above other features in productions of a rhyming, metrically-regular children's book: The Cat in the Hat (Dr. Seuss, 1957). One-syllable word durations and inter-onset intervals were modeled as functions of segment number, lexical frequency, word class, syntactic structure, repetition, and font emphasis. Consistent with prior work, factors predicting longer word durations and inter-onset intervals included more phonemes, lower frequency, first mention, alignment with a syntactic boundary, and capitalization. A model parameter corresponding to metric grid height improved model fit of word durations and inter-onset intervals. Specifically, speakers realized five levels of metric hierarchy with inter-onset intervals such that interval duration increased linearly with increased height in the metric hierarchy. Conversely, speakers realized only three levels of metric hierarchy with word duration, demonstrating that they shortened the highly predictable rhyme resolutions. These results further understanding of the factors that affect spoken word duration, and demonstrate the myriad cues that children receive about linguistic structure from nursery rhymes. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Baby universe metric equivalent to an interior black-hole metric

    Gonzalez-Diaz, P.F.

    1991-01-01

    It is shown that the maximally extended metric corresponding to a large wormhole is the unique possible wormhole metric whose baby universe sector is conformally equivalent to the maximal inextendible Kruskal metric corresponding to the interior region of a Schwarzschild black hole whose gravitational radius is half the wormhole neck radius. The physical implications of this result in the black hole evaporation process are discussed. (orig.)

  12. Resolution propositions

    2003-05-01

    The purpose of this text is to put before the meeting a resolution on the use of weapons made of depleted uranium. The use of depleted uranium by France during the Gulf war and other recent conflicts will be reviewed. The resolution gives the strictest recommendations with respect to the potential health and environmental risks of using this kind of weapon. (N.C.)

  13. High-resolution 1H magnetic resonance spectroscopy imaging at 1.5 and 3 Tesla of the human brain: development of techniques and applications for patients with primary brain tumors and multiple sclerosis

    Stadlbauer, A.

    2004-05-01

    The aim of this work was to develop several strategies and software packages for the evaluation of in vivo data of the human brain, acquired with high-resolution 1H-MRSI at 1.5 and 3 T. Several studies involving phantoms, volunteers and patients were performed. Quality assurance studies were conducted in order to evaluate the reproducibility of the applied MR techniques at both field strengths. A qualitative comparison study between MRSI data from a 1.5 T clinical MR scanner and a 3 T research MR scanner showed the advantages of the more advanced MRSI sequences and the higher field strength (3 T). A study involving patients with primary brain tumours (gliomas) was performed in cooperation with the Department of Neurosurgery (University of Erlangen-Nuremberg). The methods developed in the course of this study, such as the integration of MRS data into a stereotactic system, the segmentation of metabolic maps and the correlation with histopathological findings, represent a package of vital information for diagnostics and therapy of primary brain tumors, neurodegenerative disorders or epilepsy. In the course of two pilot studies in cooperation with the MR Centre of Excellence (Medical University of Vienna), the advantages of high-resolution 3D in vivo 1H-MRSI at 3 T were qualitatively evaluated via measurements on patients with brain tumors and multiple sclerosis (MS). It was demonstrated that 1H-MRSI may be valuable for the diagnosis, follow-up and prediction of 'seizures' in MS patients. In conclusion, this work contains an overview of the potential and advantages of in vivo 1H-MRS methods at 1.5 and 3 T for the clinical diagnosis and treatment of patients with gliomas and MS. (author)

  14. The dynamics of metric-affine gravity

    Vitagliano, Vincenzo; Sotiriou, Thomas P.; Liberati, Stefano

    2011-01-01

    Highlights: the role and the dynamics of the connection in metric-affine theories are explored; the most general second order action does not lead to a dynamical connection; including higher order invariants excites new degrees of freedom in the connection; f(R) actions are also discussed and shown to be a non-representative class. Abstract: Metric-affine theories of gravity provide an interesting alternative to general relativity: in such an approach, the metric and the affine (not necessarily symmetric) connection are independent quantities. Furthermore, the action should include covariant derivatives of the matter fields, with the covariant derivative naturally defined using the independent connection. As a result, in metric-affine theories a direct coupling involving matter and connection is also present. The role and the dynamics of the connection in such theories are explored. We employ power counting in order to construct the action and search for the minimal requirements it should satisfy for the connection to be dynamical. We find that for the most general action containing lower order invariants of the curvature and the torsion the independent connection does not carry any dynamics. It actually reduces to the role of an auxiliary field and can be completely eliminated algebraically in favour of the metric and the matter field, introducing extra interactions with respect to general relativity. However, we also show that including higher order terms in the action radically changes this picture and excites new degrees of freedom in the connection, making it (or parts of it) dynamical. Constructing actions that constitute exceptions to this rule requires significant fine tuning and/or extra a priori constraints on the connection. We also consider f(R) actions as a particular example in order to show that they constitute a distinct class of metric-affine theories with special properties, and as such they cannot be used as representative toy ...

  15. Evaluation metrics for biostatistical and epidemiological collaborations.

    Rubio, Doris McGartland; Del Junco, Deborah J; Bhore, Rafia; Lindsell, Christopher J; Oster, Robert A; Wittkowski, Knut M; Welty, Leah J; Li, Yi-Ju; Demets, Dave

    2011-10-15

    Increasing demands for evidence-based medicine and for the translation of biomedical research into individual and public health benefit have been accompanied by the proliferation of special units that offer expertise in biostatistics, epidemiology, and research design (BERD) within academic health centers. Objective metrics that can be used to evaluate, track, and improve the performance of these BERD units are critical to their successful establishment and sustainable future. To develop a set of reliable but versatile metrics that can be adapted easily to different environments and evolving needs, we consulted with members of BERD units from the consortium of academic health centers funded by the Clinical and Translational Science Award Program of the National Institutes of Health. Through a systematic process of consensus building and document drafting, we formulated metrics that covered the three identified domains of BERD practices: the development and maintenance of collaborations with clinical and translational science investigators, the application of BERD-related methods to clinical and translational research, and the discovery of novel BERD-related methodologies. In this article, we describe the set of metrics and advocate their use for evaluating BERD practices. The routine application, comparison of findings across diverse BERD units, and ongoing refinement of the metrics will identify trends, facilitate meaningful changes, and ultimately enhance the contribution of BERD activities to biomedical research. Copyright © 2011 John Wiley & Sons, Ltd.

  16. Future of the PCI Readmission Metric.

    Wasfy, Jason H; Yeh, Robert W

    2016-03-01

    Between 2013 and 2014, the Centers for Medicare and Medicaid Services and the National Cardiovascular Data Registry publicly reported risk-adjusted 30-day readmission rates after percutaneous coronary intervention (PCI) as a pilot project. A key strength of this public reporting effort included risk adjustment with clinical rather than administrative data. Furthermore, because readmission after PCI is common, expensive, and preventable, this metric has substantial potential to improve quality and value in American cardiology care. Despite this, concerns about the metric exist. For example, few PCI readmissions are caused by procedural complications, limiting the extent to which improved procedural technique can reduce readmissions. Also, similar to other readmission measures, PCI readmission is associated with socioeconomic status and race. Accordingly, the metric may unfairly penalize hospitals that care for underserved patients. Perhaps in the context of these limitations, Centers for Medicare and Medicaid Services has not yet included PCI readmission among metrics that determine Medicare financial penalties. Nevertheless, provider organizations may still wish to focus on this metric to improve value for cardiology patients. PCI readmission is associated with low-risk chest discomfort and patient anxiety. Therefore, patient education, improved triage mechanisms, and improved care coordination offer opportunities to minimize PCI readmissions. Because PCI readmission is common and costly, reducing PCI readmission offers provider organizations a compelling target to improve the quality of care, and also performance in contracts that involve shared financial risk. © 2016 American Heart Association, Inc.

  17. Automation of Endmember Pixel Selection in SEBAL/METRIC Model

    Bhattarai, N.; Quackenbush, L. J.; Im, J.; Shaw, S. B.

    2015-12-01

    The commonly applied surface energy balance for land (SEBAL) and its variant, mapping evapotranspiration (ET) at high resolution with internalized calibration (METRIC) models require manual selection of endmember (i.e. hot and cold) pixels to calibrate sensible heat flux. Current approaches for automating this process are based on statistical methods and do not appear to be robust under varying climate conditions and seasons. In this paper, we introduce a new approach based on simple machine learning tools and search algorithms that provides an automatic and time efficient way of identifying endmember pixels for use in these models. The fully automated models were applied on over 100 cloud-free Landsat images with each image covering several eddy covariance flux sites in Florida and Oklahoma. Observed land surface temperatures at automatically identified hot and cold pixels were within 0.5% of those from pixels manually identified by an experienced operator (coefficient of determination, R2, ≥ 0.92, Nash-Sutcliffe efficiency, NSE, ≥ 0.92, and root mean squared error, RMSE, ≤ 1.67 K). Daily ET estimates derived from the automated SEBAL and METRIC models were in good agreement with their manual counterparts (e.g., NSE ≥ 0.91 and RMSE ≤ 0.35 mm day-1). Automated and manual pixel selection resulted in similar estimates of observed ET across all sites. The proposed approach should reduce time demands for applying SEBAL/METRIC models and allow for their more widespread and frequent use. This automation can also reduce potential bias that could be introduced by an inexperienced operator and extend the domain of the models to new users.
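
    A hedged sketch of percentile-based endmember candidate filtering is given below; the paper's actual machine-learning and search approach is not reproduced, and the thresholds are illustrative only. Typical METRIC guidance is that cold pixels are well-vegetated and cool, hot pixels bare and hot.

      import numpy as np

      def endmember_candidates(ndvi, lst):
          """ndvi, lst: 2D arrays over agricultural land; returns candidate
          (row, col) indices for cold and hot calibration pixels."""
          cold = (ndvi > np.nanpercentile(ndvi, 95)) & \
                 (lst < np.nanpercentile(lst, 5))
          hot = (ndvi < np.nanpercentile(ndvi, 5)) & \
                (lst > np.nanpercentile(lst, 95))
          return np.argwhere(cold), np.argwhere(hot)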

  18. g-Weak Contraction in Ordered Cone Rectangular Metric Spaces

    S. K. Malhotra

    2013-01-01

    We prove some common fixed-point theorems for ordered g-weak contractions in cone rectangular metric spaces without assuming the normality of the cone. Our results generalize some recent results from cone metric and cone rectangular metric spaces to ordered cone rectangular metric spaces. Examples are provided which illustrate the results.

  19. Defining a Progress Metric for CERT RMM Improvement

    2017-09-14

    Gregory Crabb, Nader Mehravari, and David Tobar, technical report, September 2017. ... defendable resource allocation decisions. Technical metrics measure aspects of controls implemented through technology (systems, software, hardware) ... An implementation metric would be the percentage of users who have received anti-phishing training. Effectiveness/efficiency metrics measure whether ...

  20. NASA education briefs for the classroom. Metrics in space

    The use of metric measurement in space is summarized for classroom use. Advantages of the metric system over the English measurement system are described. Some common metric units are defined, as are special units for astronomical study. International system unit prefixes and a conversion table of metric/English units are presented. Questions and activities for the classroom are recommended.

  1. Resilience-based performance metrics for water resources management under uncertainty

    Roach, Tom; Kapelan, Zoran; Ledbetter, Ralph

    2018-06-01

    This paper aims to develop new, resilience type metrics for long-term water resources management under uncertain climate change and population growth. Resilience is defined here as the ability of a water resources management system to 'bounce back', i.e. absorb and then recover from a water deficit event, restoring the normal system operation. Ten alternative metrics are proposed and analysed addressing a range of different resilience aspects including duration, magnitude, frequency and volume of related water deficit events. The metrics were analysed on a real-world case study of the Bristol Water supply system in the UK and compared with current practice. The analyses included an examination of metrics' sensitivity and correlation, as well as a detailed examination into the behaviour of metrics during water deficit periods. The results obtained suggest that multiple metrics which cover different aspects of resilience should be used simultaneously when assessing the resilience of a water resources management system, leading to a more complete understanding of resilience compared with current practice approaches. It was also observed that calculating the total duration of a water deficit period provided a clearer and more consistent indication of system performance compared to splitting the deficit periods into the time to reach and time to recover from the worst deficit events.
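
    A sketch of the deficit-event bookkeeping that duration, magnitude, frequency and volume metrics build on (our simplification, not the paper's full set of ten metrics):

      import numpy as np

      def deficit_events(supply, demand):
          """supply, demand: daily series (e.g. Ml/day); returns event metrics."""
          deficit = np.maximum(demand - supply, 0.0)
          events, cur = [], None
          for t, d in enumerate(deficit):
              if d > 0:
                  cur = cur or {"start": t, "volume": 0.0, "peak": 0.0}
                  cur["volume"] += d
                  cur["peak"] = max(cur["peak"], d)
                  cur["end"] = t
              elif cur:
                  events.append(cur)
                  cur = None
          if cur:
              events.append(cur)
          # candidate resilience metrics: frequency, total duration, magnitude, volume
          return {"frequency": len(events),
                  "total_duration": sum(e["end"] - e["start"] + 1 for e in events),
                  "max_magnitude": max((e["peak"] for e in events), default=0.0),
                  "total_volume": sum(e["volume"] for e in events)}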

  3. Landscape pattern metrics and regional assessment

    O'Neill, R. V.; Riitters, K.H.; Wickham, J.D.; Jones, K.B.

    1999-01-01

    The combination of remote imagery data, geographic information systems software, and landscape ecology theory provides a unique basis for monitoring and assessing large-scale ecological systems. The unique feature of the work has been the need to develop and interpret quantitative measures of spatial pattern: the landscape indices. This article reviews what is known about the statistical properties of these pattern metrics and suggests some additional metrics based on island biogeography, percolation theory, hierarchy theory, and economic geography. Assessment applications of this approach have required interpreting the pattern metrics in terms of specific environmental endpoints, such as wildlife and water quality, and research into how to represent synergistic effects of many overlapping sources of stress.

  4. A bi-metric theory of gravitation

    Rosen, N.

    1975-01-01

    The bi-metric theory of gravitation proposed previously is simplified in that the auxiliary conditions are discarded, the two metric tensors being tied together only by means of the boundary conditions. Some of the properties of the field of a particle are investigated; there is no black hole, and it appears that no gravitational collapse can take place. Although the proposed theory and general relativity are at present observationally indistinguishable, some differences are pointed out which may some day be susceptible of observation. An alternative bi-metric theory is considered which gives for the precession of the perihelion 5/6 of the value given by general relativity; it seems less satisfactory than the present theory from the aesthetic point of view. (author)

  5. Steiner trees for fixed orientation metrics

    Brazil, Marcus; Zachariasen, Martin

    2009-01-01

    We consider the problem of constructing Steiner minimum trees for a metric defined by a polygonal unit circle (corresponding to s ≥ 2 weighted legal orientations in the plane). A linear-time algorithm to enumerate all angle configurations for degree three Steiner points is given. We provide a simple proof that the angle configuration for a Steiner point extends to all Steiner points in a full Steiner minimum tree, such that at most six orientations suffice for edges in a full Steiner minimum tree. We show that the concept of canonical forms originally introduced for the uniform orientation metric generalises to the fixed orientation metric. Finally, we give an O(s n) time algorithm to compute a Steiner minimum tree for a given full Steiner topology with n terminal leaves.

  6. Metrical and dynamical aspects in complex analysis

    2017-01-01

    The central theme of this reference book is the metric geometry of complex analysis in several variables. Bridging a gap in the current literature, the text focuses on the fine behavior of the Kobayashi metric of complex manifolds and its relationships to dynamical systems, hyperbolicity in the sense of Gromov and operator theory, all very active areas of research. The modern points of view expressed in these notes, collected here for the first time, will be of interest to academics working in the fields of several complex variables and metric geometry. The different topics are treated coherently and include expository presentations of the relevant tools, techniques and objects, which will be particularly useful for graduate and PhD students specializing in the area.

  7. Social Metrics Applied to Smart Tourism

    Cervantes, O.; Gutiérrez, E.; Gutiérrez, F.; Sánchez, J. A.

    2016-09-01

    We present a strategy to make productive use of semantically-related social data, from a user-centered semantic network, in order to help users (tourists and citizens in general) to discover cultural heritage, points of interest and available services in a smart city. This data can be used to personalize recommendations in a smart tourism application. Our approach is based on flow centrality metrics typically used in social network analysis: flow betweenness, flow closeness and eccentricity. These metrics are useful to discover relevant nodes within the network, yielding nodes that can be interpreted as suggestions (venues or services) to users. We describe the semantic network built on a graph model, as well as the social metrics algorithms used to produce recommendations. We also present challenges and results from a prototypical implementation applied to the case study of the City of Puebla, Mexico.
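
    Using networkx, the three centralities can be sketched on a toy venue graph as below (an illustration under our own assumptions, not the authors' implementation):

      import networkx as nx

      G = nx.Graph()
      G.add_weighted_edges_from([
          ("cathedral", "museum", 1.0), ("museum", "cafe", 2.0),
          ("cafe", "hotel", 1.0), ("cathedral", "hotel", 3.0),
          ("museum", "hotel", 1.5),
      ])

      flow_betweenness = nx.current_flow_betweenness_centrality(G, weight="weight")
      flow_closeness = nx.current_flow_closeness_centrality(G, weight="weight")
      eccentricity = nx.eccentricity(G)  # hop-based eccentricity

      # nodes with high flow centrality become candidate recommendations
      print(max(flow_betweenness, key=flow_betweenness.get))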

  8. Validation of Metrics as Error Predictors

    Mendling, Jan

    In this chapter, we test the validity of metrics that were defined in the previous chapter for predicting errors in EPC business process models. In Section 5.1, we provide an overview of how the analysis data is generated. Section 5.2 describes the sample of EPCs from practice that we use for the analysis. Here we discuss a disaggregation by the EPC model group and by error as well as a correlation analysis between metrics and error. Based on this sample, we calculate a logistic regression model for predicting error probability with the metrics as input variables in Section 5.3. In Section 5.4, we then test the regression function for an independent sample of EPC models from textbooks as a cross-validation. Section 5.5 summarizes the findings.

  9. Metric Learning for Hyperspectral Image Segmentation

    Bue, Brian D.; Thompson, David R.; Gilmore, Martha S.; Castano, Rebecca

    2011-01-01

    We present a metric learning approach to improve the performance of unsupervised hyperspectral image segmentation. Unsupervised spatial segmentation can assist both user visualization and automatic recognition of surface features. Analysts can use spatially-continuous segments to decrease noise levels and/or localize feature boundaries. However, existing segmentation methods use task-agnostic measures of similarity. Here we learn task-specific similarity measures from training data, improving segment fidelity to classes of interest. Multiclass Linear Discriminant Analysis produces a linear transform that optimally separates a labeled set of training classes. This defines a distance metric that generalizes to new scenes, enabling graph-based segmentation that emphasizes key spectral features. We describe tests based on data from the Compact Reconnaissance Imaging Spectrometer (CRISM) in which learned metrics improve segment homogeneity with respect to mineralogical classes.
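
    A minimal stand-in using scikit-learn shows the idea: fit multiclass LDA on labeled spectra, then use Euclidean distance in the transformed space as the learned metric (synthetic data; not the authors' code):

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(0)
      X = np.vstack([rng.normal(0, 1, (50, 20)), rng.normal(1, 1, (50, 20))])
      y = np.array([0] * 50 + [1] * 50)   # two "mineral classes" of spectra

      lda = LinearDiscriminantAnalysis().fit(X, y)
      T = lda.transform                    # the learned linear map

      def learned_distance(a, b):
          # Euclidean distance in LDA space emphasizes class-separating features
          return float(np.linalg.norm(T(a[None, :]) - T(b[None, :])))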

  10. Kerr metric in the deSitter background

    Vaidya, P.C.

    1984-01-01

    In addition to the Kerr metric with cosmological constant Λ, several other metrics are presented giving a Kerr-like solution of Einstein's equations in the background of the deSitter universe. A new metric of what may be termed rotating deSitter space-time, devoid of matter but containing null fluid with twisting null rays, has been presented. This metric reduces to the standard deSitter metric when the twist in the rays vanishes. The Kerr metric in this background is the immediate generalization of Schwarzschild's exterior metric with cosmological constant. (author)
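
    For orientation, the non-rotating limit of the Kerr metric with cosmological constant is the standard Schwarzschild-deSitter (Kottler) line element (textbook form in units G = c = 1, not quoted from this paper):

      ds^2 = -\left(1 - \frac{2m}{r} - \frac{\Lambda r^2}{3}\right) dt^2
             + \left(1 - \frac{2m}{r} - \frac{\Lambda r^2}{3}\right)^{-1} dr^2
             + r^2 \left( d\theta^2 + \sin^2\theta \, d\varphi^2 \right)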

  11. Local-order metric for condensed-phase environments

    Martelli, Fausto; Ko, Hsin-Yu; Oǧuz, Erdal C.; Car, Roberto

    2018-02-01

    We introduce a local order metric (LOM) that measures the degree of order in the neighborhood of an atomic or molecular site in a condensed medium. The LOM maximizes the overlap between the spatial distribution of sites belonging to that neighborhood and the corresponding distribution in a suitable reference system. The LOM takes a value tending to zero for completely disordered environments and tending to one for environments that perfectly match the reference. The site-averaged LOM and its standard deviation define two scalar order parameters, S and δ S , that characterize with excellent resolution crystals, liquids, and amorphous materials. We show with molecular dynamics simulations that S , δ S , and the LOM provide very insightful information in the study of structural transformations, such as those occurring when ice spontaneously nucleates from supercooled water or when a supercooled water sample becomes amorphous upon progressive cooling.

  12. A New Metric for Land-Atmosphere Coupling Strength: Applications on Observations and Modeling

    Tang, Q.; Xie, S.; Zhang, Y.; Phillips, T. J.; Santanello, J. A., Jr.; Cook, D. R.; Riihimaki, L.; Gaustad, K.

    2017-12-01

    A new metric is proposed to quantify the land-atmosphere (LA) coupling strength and is elaborated by correlating the surface evaporative fraction and impacting land and atmosphere variables (e.g., soil moisture, vegetation, and radiation). Based upon multiple linear regression, this approach simultaneously considers multiple factors and thus represents complex LA coupling mechanisms better than existing single variable metrics. The standardized regression coefficients quantify the relative contributions from individual drivers in a consistent manner, avoiding the potential inconsistency in relative influence of conventional metrics. Moreover, the unique expendable feature of the new method allows us to verify and explore potentially important coupling mechanisms. Our observation-based application of the new metric shows moderate coupling with large spatial variations at the U.S. Southern Great Plains. The relative importance of soil moisture vs. vegetation varies by location. We also show that LA coupling strength is generally underestimated by single variable methods due to their incompleteness. We also apply this new metric to evaluate the representation of LA coupling in the Accelerated Climate Modeling for Energy (ACME) V1 Contiguous United States (CONUS) regionally refined model (RRM). This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. LLNL-ABS-734201
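
    Our reading of the metric's core computation, sketched with standardized regression coefficients (the variable names and data layout are assumptions):

      import numpy as np

      def coupling_strengths(ef, drivers):
          """ef: (n,) evaporative fraction series; drivers: dict name -> (n,) series.
          Returns standardized coefficients, one relative contribution per driver."""
          names = list(drivers)
          Z = np.column_stack([(drivers[k] - drivers[k].mean()) / drivers[k].std()
                               for k in names])
          ef_z = (ef - ef.mean()) / ef.std()
          beta, *_ = np.linalg.lstsq(Z, ef_z, rcond=None)
          return dict(zip(names, beta))   # e.g. soil moisture vs vegetation terms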

  13. Active Metric Learning from Relative Comparisons

    Xiong, Sicheng; Rosales, Rómer; Pei, Yuanli; Fern, Xiaoli Z.

    2014-01-01

    This work focuses on active learning of distance metrics from relative comparison information. A relative comparison specifies, for a data point triplet $(x_i,x_j,x_k)$, that instance $x_i$ is more similar to $x_j$ than to $x_k$. Such constraints, when available, have been shown to be useful toward defining appropriate distance metrics. In real-world applications, acquiring constraints often require considerable human effort. This motivates us to study how to select and query the most useful ...
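
    A hedged sketch of the underlying learning step: a diagonal Mahalanobis metric updated by hinge-loss steps on violated triplets (the paper's active selection strategy is not shown):

      import numpy as np

      def learn_diag_metric(X, triplets, lr=0.01, margin=1.0, epochs=50):
          """triplets: (i, j, k) meaning x_i is more similar to x_j than to x_k."""
          w = np.ones(X.shape[1])              # diagonal metric weights, kept >= 0
          for _ in range(epochs):
              for i, j, k in triplets:
                  d_ij = (X[i] - X[j]) ** 2
                  d_ik = (X[i] - X[k]) ** 2
                  # violated constraint: d(i,j) + margin > d(i,k)
                  if w @ d_ij + margin > w @ d_ik:
                      w -= lr * (d_ij - d_ik)  # push d(i,j) down, d(i,k) up
                      w = np.maximum(w, 0.0)
          return w                              # d(a,b) = sum_f w_f (a_f - b_f)^2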

  14. Heuristic extension of the Schwarzschild metric

    Espinosa, J.M.

    1982-01-01

    The Schwarzschild solution of Einstein's equations of gravitation has several singularities. It is known that the singularity at r = 2Gm/c^2 is only apparent, a result of the coordinates in which the solution was found. Paradoxical results occurring near the singularity show the system of coordinates is incomplete. We introduce a simple, two-dimensional metric with an apparent singularity that makes it incomplete. By a straightforward, heuristic procedure we extend and complete this simple metric. We then use the same procedure to give a heuristic derivation of the Kruskal system of coordinates, which is known to extend the Schwarzschild manifold past its apparent singularity and produce a complete manifold.

  15. Metric inhomogeneous Diophantine approximation in positive characteristic

    Kristensen, Simon

    2011-01-01

    We obtain asymptotic formulae for the number of solutions to systems of inhomogeneous linear Diophantine inequalities over the field of formal Laurent series with coefficients from a finite field, which are valid for almost every such system. Here 'almost every' is with respect to the Haar measure of the coefficients of the homogeneous part when the number of variables is at least two (singly metric case), and with respect to the Haar measure of all coefficients for any number of variables (doubly metric case). As consequences, we derive zero-one laws in the spirit of the Khintchine-Groshev Theorem and zero ...

  17. Jacobi-Maupertuis metric and Kepler equation

    Chanda, Sumanto; Gibbons, Gary William; Guha, Partha

    This paper studies the application of the Jacobi-Eisenhart lift, Jacobi metric and Maupertuis transformation to the Kepler system. We start by reviewing fundamentals and the Jacobi metric. Then we study various ways to apply the lift to Kepler-related systems: first as conformal description and Bohlin transformation of Hooke’s oscillator, second in contact geometry and third in Houri’s transformation [T. Houri, Liouville integrability of Hamiltonian systems and spacetime symmetry (2016), www.geocities.jp/football_physician/publication.html], coupled with Milnor’s construction [J. Milnor, On the geometry of the Kepler problem, Am. Math. Mon. 90 (1983) 353-365] with eccentric anomaly.

  18. Correlations between contouring similarity metrics and simulated treatment outcome for prostate radiotherapy

    Roach, D.; Jameson, M. G.; Dowling, J. A.; Ebert, M. A.; Greer, P. B.; Kennedy, A. M.; Watt, S.; Holloway, L. C.

    2018-02-01

    Many similarity metrics exist for inter-observer contouring variation studies; however, no correlation between metric choice and prostate cancer radiotherapy dosimetry has been explored. These correlations were investigated in this study. Two separate trials were undertaken, the first a thirty-five patient cohort with three observers, the second a five patient dataset with ten observers. Clinical and planning target volumes (CTV and PTV), rectum, and bladder were independently contoured by all observers in each trial. Structures were contoured on T2-weighted MRI and transferred onto CT following rigid registration for treatment planning in the first trial. Structures were contoured directly on CT in the second trial. STAPLE and majority voting volumes were generated as reference gold standard volumes for each structure for the two trials respectively. VMAT treatment plans (78 Gy to PTV) were simulated for observer and gold standard volumes, and dosimetry assessed using multiple radiobiological metrics. Correlations between contouring similarity metrics and dosimetry were calculated using Spearman's rank correlation coefficient. No correlations were observed between contouring similarity metrics and dosimetry for CTV within either trial. Volume similarity correlated most strongly with radiobiological metrics for PTV in both trials, including TCPPoisson (ρ = 0.57, 0.65), TCPLogit (ρ = 0.39, 0.62), and EUD (ρ = 0.43, 0.61) for each respective trial. Rectum and bladder metric correlations displayed no consistency for the two trials. PTV volume similarity was found to significantly correlate with rectum normal tissue complication probability (ρ = 0.33, 0.48). Minimal to no correlations with dosimetry were observed for overlap or boundary contouring metrics. Future inter-observer contouring variation studies for prostate cancer should incorporate volume similarity to provide additional insights into dosimetry during analysis.
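
    The two ingredients being correlated can be sketched as below: a similarity score per observer contour and a Spearman rank correlation against a dosimetric outcome (toy definitions; the study's actual metric set is larger):

      import numpy as np
      from scipy.stats import spearmanr

      def dice(a, b):
          """Overlap metric; a, b are boolean voxel masks."""
          return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

      def volume_similarity(a, b):
          """Signed volume agreement (one common convention); 0 when volumes match."""
          return 2.0 * (a.sum() - b.sum()) / (a.sum() + b.sum())

      # similarity[i]: observer i's PTV vs the gold standard;
      # tcp[i]: tumor control probability from that observer's simulated plan
      # rho, p = spearmanr(similarity, tcp)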

  19. Multiple-hit parameter estimation in monolithic detectors.

    Hunter, William C J; Barrett, Harrison H; Lewellen, Tom K; Miyaoka, Robert S

    2013-02-01

    We examine a maximum-a-posteriori method for estimating the primary interaction position of gamma rays with multiple interaction sites (hits) in a monolithic detector. In assessing the performance of a multiple-hit estimator over that of a conventional one-hit estimator, we consider a few different detector and readout configurations of a 50-mm-wide square cerium-doped lutetium oxyorthosilicate block. For this study, we use simulated data from SCOUT, a Monte-Carlo tool for photon tracking and modeling scintillation-camera output. With this tool, we determine estimate bias and variance for a multiple-hit estimator and compare these with similar metrics for a one-hit maximum-likelihood estimator, which assumes full energy deposition in one hit. We also examine the effect of event filtering on these metrics; for this purpose, we use a likelihood threshold to reject signals that are not likely to have been produced under the assumed likelihood model. Depending on detector design, we observe a 1%-12% improvement of intrinsic resolution for a 1-or-2-hit estimator as compared with a 1-hit estimator. We also observe improved differentiation of photopeak events using a 1-or-2-hit estimator as compared with the 1-hit estimator; more than 6% of photopeak events that were rejected by likelihood filtering for the 1-hit estimator were accurately identified as photopeak events and positioned without loss of resolution by a 1-or-2-hit estimator; for PET, this equates to at least a 12% improvement in coincidence-detection efficiency with likelihood filtering applied.

  20. Parameter-space metric of semicoherent searches for continuous gravitational waves

    Pletsch, Holger J.

    2010-01-01

    Continuous gravitational-wave (CW) signals such as emitted by spinning neutron stars are an important target class for current detectors. However, the enormous computational demand prohibits fully coherent broadband all-sky searches for prior unknown CW sources over wide ranges of parameter space and for yearlong observation times. More efficient hierarchical 'semicoherent' search strategies divide the data into segments much shorter than one year, which are analyzed coherently; then detection statistics from different segments are combined incoherently. To optimally perform the incoherent combination, understanding of the underlying parameter-space structure is requisite. This problem is addressed here by using new coordinates on the parameter space, which yield the first analytical parameter-space metric for the incoherent combination step. This semicoherent metric applies to broadband all-sky surveys (also embedding directed searches at fixed sky position) for isolated CW sources. Furthermore, the additional metric resolution attained through the combination of segments is studied. From the search parameters (sky position, frequency, and frequency derivatives), solely the metric resolution in the frequency derivatives is found to significantly increase with the number of segments.

  1. Beyond metrics? Utilizing 'soft intelligence' for healthcare quality and safety.

    Martin, Graham P; McKee, Lorna; Dixon-Woods, Mary

    2015-10-01

    Formal metrics for monitoring the quality and safety of healthcare have a valuable role, but may not, by themselves, yield full insight into the range of fallibilities in organizations. 'Soft intelligence' is usefully understood as the processes and behaviours associated with seeking and interpreting soft data (of the kind that evades easy capture, straightforward classification and simple quantification) to produce forms of knowledge that can provide the basis for intervention. With the aim of examining current and potential practice in relation to soft intelligence, we conducted and analysed 107 in-depth qualitative interviews with senior leaders, including managers and clinicians, involved in healthcare quality and safety in the English National Health Service. We found that participants were in little doubt about the value of softer forms of data, especially for their role in revealing troubling issues that might be obscured by conventional metrics. Their struggles lay in how to access softer data and turn them into a useful form of knowing. Some of the dominant approaches they used risked replicating the limitations of hard, quantitative data. They relied on processes of aggregation and triangulation that prioritised reliability, or on instrumental use of soft data to animate the metrics. The unpredictable, untameable, spontaneous quality of soft data could be lost in efforts to systematize their collection and interpretation to render them more tractable. A more challenging but potentially rewarding approach involved processes and behaviours aimed at disrupting taken-for-granted assumptions about quality, safety, and organizational performance. This approach, which explicitly values the seeking out and the hearing of multiple voices, is consistent with conceptual frameworks of organizational sensemaking and dialogical understandings of knowledge. Using soft intelligence this way can be challenging and discomfiting, but may offer a critical defence against the

  2. Quantitative properties of the Schwarzschild metric

    Křížek, Michal; Křížek, Filip

    2018-01-01

    Vol. 2018, No. 1 (2018), pp. 1-10. Institutional support: RVO:67985840. Keywords: exterior and interior Schwarzschild metric; proper radius; coordinate radius. Subject: General Mathematics; OECD field: Applied mathematics. http://astro.shu-bg.net/pasb/index_files/Papers/2018/SCHWARZ8.pdf

  3. Strong Ideal Convergence in Probabilistic Metric Spaces

    In the present paper we introduce the concepts of strongly ideal convergent sequence and strong ideal Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong ideal limit points and the strong ideal cluster points of a sequence in this ...

  4. lakemorpho: Calculating lake morphometry metrics in R.

    Hollister, Jeffrey; Stachelek, Joseph

    2017-01-01

    Metrics describing the shape and size of lakes, known as lake morphometry metrics, are important for any limnological study. In cases where a lake has long been the subject of study these data are often already collected and are openly available. Many other lakes have these data collected, but access is challenging as it is often stored on individual computers (or worse, in filing cabinets) and is available only to the primary investigators. The vast majority of lakes fall into a third category in which the data are not available. This makes broad scale modelling of lake ecology a challenge as some of the key information about in-lake processes is unavailable. While this valuable in situ information may be difficult to obtain, several national datasets exist that may be used to model and estimate lake morphometry. In particular, digital elevation models and hydrography have been shown to be predictive of several lake morphometry metrics. The R package lakemorpho has been developed to utilize these data and estimate the following morphometry metrics: surface area, shoreline length, major axis length, minor axis length, major and minor axis length ratio, shoreline development, maximum depth, mean depth, volume, maximum lake length, mean lake width, maximum lake width, and fetch. In this software tool article we describe the motivation behind developing lakemorpho, discuss the implementation in R, and describe the use of lakemorpho with an example of a typical use case.
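
    Two of the listed metrics follow standard limnological formulas, sketched here in Python for illustration (lakemorpho itself is an R package and its API is not reproduced):

      import math

      def shoreline_development(shoreline_length, surface_area):
          # D_L = L / (2 * sqrt(pi * A)); equals 1.0 for a perfect circle
          return shoreline_length / (2.0 * math.sqrt(math.pi * surface_area))

      def mean_depth(volume, surface_area):
          return volume / surface_area

      print(shoreline_development(12_500.0, 3_000_000.0))  # ~2.04, a dendritic shape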

  5. Contraction theorems in fuzzy metric space

    Farnoosh, R.; Aghajani, A.; Azhdari, P.

    2009-01-01

    In this paper, the results on fuzzy contractive mapping proposed by Dorel Mihet will be proved for B-contraction and C-contraction in the case of George and Veeramani fuzzy metric space. The existence of fixed point with weaker conditions will be proved; that is, instead of the convergence of subsequence, p-convergence of subsequence is used.

  6. Inferring feature relevances from metric learning

    Schulz, Alexander; Mokbel, Bassam; Biehl, Michael

    2015-01-01

    Powerful metric learning algorithms have been proposed in recent years which not only greatly enhance the accuracy of distance-based classifiers and nearest neighbor database retrieval, but also enable the interpretability of these operations by assigning explicit relevance weights...

  7. DIGITAL MARKETING: SUCCESS METRICS, FUTURE TRENDS

    Preeti Kaushik

    2017-01-01

    Abstract – Business marketing is one of the areas that has been tremendously affected by the digital world in the last few years. Digital marketing refers to advertising through digital channels. This paper provides a detailed study of metrics to measure the success of digital marketing platforms and a glimpse of the future of these technologies by 2020.

  8. Assessing Software Quality Through Visualised Cohesion Metrics

    Timothy Shih

    2001-05-01

    Full Text Available Cohesion is one of the most important factors for software quality as well as maintainability, reliability and reusability. Module cohesion is defined as a quality attribute that seeks to measure the singleness of purpose of a module. A module of poor quality can be a serious obstacle to system quality. In order to achieve good software quality, software managers and engineers need to introduce cohesion metrics to measure and produce desirable software. Highly cohesive software is thought to be a desirable construction. In this paper, we propose a function-oriented cohesion metric based on the analysis of live variables, live span, and the visualization of the processing-element dependency graph. We give six typical cohesion examples to be measured as our experiments and justification. The result is a well-defined, well-normalized, well-visualized and well-tested cohesion metric that can indicate and thus enhance software cohesion strength. Furthermore, this cohesion metric can easily be incorporated into a software CASE tool to help software engineers improve software quality.

  9. Metric propositional neighborhood logics on natural numbers

    Bresolin, Davide; Della Monica, Dario; Goranko, Valentin

    2013-01-01

    Metric Propositional Neighborhood Logic (MPNL) over natural numbers. MPNL features two modalities referring, respectively, to an interval that is “met by” the current one and to an interval that “meets” the current one, plus an infinite set of length constraints, regarded as atomic propositions...

  10. Calabi–Yau metrics and string compactification

    Michael R. Douglas

    2015-09-01

    Full Text Available Yau proved an existence theorem for Ricci-flat Kähler metrics in the 1970s, but we still have no closed form expressions for them. Nevertheless there are several ways to get approximate expressions, both numerical and analytical. We survey some of this work and explain how it can be used to obtain physical predictions from superstring theory.

  11. Goedel-type metrics in various dimensions

    Guerses, Metin; Karasu, Atalay; Sarioglu, Oezguer

    2005-01-01

    Goedel-type metrics are introduced and used in producing charged dust solutions in various dimensions. The key ingredient is a (D - 1)-dimensional Riemannian geometry which is then employed in constructing solutions to the Einstein-Maxwell field equations with a dust distribution in D dimensions. The only essential field equation in the procedure turns out to be the source-free Maxwell's equation in the relevant background. Similarly the geodesics of this type of metric are described by the Lorentz force equation for a charged particle in the lower dimensional geometry. It is explicitly shown with several examples that Goedel-type metrics can be used in obtaining exact solutions to various supergravity theories and in constructing spacetimes that contain both closed timelike and closed null curves and that contain neither of these. Among the solutions that can be established using non-flat backgrounds, such as the Tangherlini metrics in (D - 1)-dimensions, there exists a class which can be interpreted as describing black-hole-type objects in a Goedel-like universe

  12. Strong Statistical Convergence in Probabilistic Metric Spaces

    Şençimen, Celaleddin; Pehlivan, Serpil

    2008-01-01

    In this article, we introduce the concepts of strongly statistically convergent sequence and strong statistically Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong statistical limit points and the strong statistical cluster points of a sequence in this space and investigate the relations between these concepts.

  13. Language Games: University Responses to Ranking Metrics

    Heffernan, Troy A.; Heffernan, Amanda

    2018-01-01

    League tables of universities that measure performance in various ways are now commonplace, with numerous bodies providing their own rankings of how institutions throughout the world are seen to be performing on a range of metrics. This paper uses Lyotard's notion of language games to theorise that universities are regaining some power over being…

  14. A new universal colour image fidelity metric

    Toet, A.; Lucassen, M.P.

    2003-01-01

    We extend a recently introduced universal grayscale image quality index to a newly developed perceptually decorrelated colour space. The resulting colour image fidelity metric quantifies the distortion of a processed colour image relative to its original version. We evaluated the new colour image

  15. Standardised metrics for global surgical surveillance.

    Weiser, Thomas G; Makary, Martin A; Haynes, Alex B; Dziekan, Gerald; Berry, William R; Gawande, Atul A

    2009-09-26

    Public health surveillance relies on standardised metrics to evaluate disease burden and health system performance. Such metrics have not been developed for surgical services despite increasing volume, substantial cost, and high rates of death and disability associated with surgery. The Safe Surgery Saves Lives initiative of WHO's Patient Safety Programme has developed standardised public health metrics for surgical care that are applicable worldwide. We assembled an international panel of experts to develop and define metrics for measuring the magnitude and effect of surgical care in a population, while taking into account economic feasibility and practicability. This panel recommended six measures for assessing surgical services at a national level: number of operating rooms, number of operations, number of accredited surgeons, number of accredited anaesthesia professionals, day-of-surgery death ratio, and postoperative in-hospital death ratio. We assessed the feasibility of gathering such statistics at eight diverse hospitals in eight countries and incorporated them into the WHO Guidelines for Safe Surgery, in which methods for data collection, analysis, and reporting are outlined.

  16. A Lagrangian-dependent metric space

    El-Tahir, A.

    1989-08-01

    A generalized Lagrangian-dependent metric of the static isotropic spacetime is derived. Its behaviour is to be governed by imposing physical constraints that avert the pathological features of gravity in the strong-field domain. This restricts the choice of the Lagrangian form. (author). 10 refs

  17. Clean Cities 2011 Annual Metrics Report

    Johnson, C.

    2012-12-01

    This report details the petroleum savings and vehicle emissions reductions achieved by the U.S. Department of Energy's Clean Cities program in 2011. The report also details other performance metrics, including the number of stakeholders in Clean Cities coalitions, outreach activities by coalitions and national laboratories, and alternative fuel vehicles deployed.

  18. Clean Cities 2010 Annual Metrics Report

    Johnson, C.

    2012-10-01

    This report details the petroleum savings and vehicle emissions reductions achieved by the U.S. Department of Energy's Clean Cities program in 2010. The report also details other performance metrics, including the number of stakeholders in Clean Cities coalitions, outreach activities by coalitions and national laboratories, and alternative fuel vehicles deployed.

  19. Genetic basis of a cognitive complexity metric

    Hansell, Narelle K; Halford, Graeme S; Andrews, Glenda; Shum, David H K; Harris, Sarah E; Davies, Gail; Franic, Sanja; Christoforou, Andrea; Zietsch, Brendan; Painter, Jodie; Medland, Sarah E; Ehli, Erik A; Davies, Gareth E; Steen, Vidar M; Lundervold, Astri J; Reinvang, Ivar; Montgomery, Grant W; Espeseth, Thomas; Hulshoff Pol, Hilleke E; Starr, John M; Martin, Nicholas G; Le Hellard, Stephanie; Boomsma, Dorret I; Deary, Ian J; Wright, Margaret J

    2015-01-01

    Relational complexity (RC) is a metric reflecting capacity limitation in relational processing. It plays a crucial role in higher cognitive processes and is an endophenotype for several disorders. However, the genetic underpinnings of complex relational processing have not been investigated. Using

  20. Genetic Basis of a Cognitive Complexity Metric

    Hansell, N.K.; Halford, G.S.; Andrews, G.; Shum, D.H.K.; Harris, S.E.; Davies, G.; Franic, S.; Christoforou, A.; Zietsch, B.; Painter, J.; Medland, S.E.; Ehli, E.A.; Davies, G.E.; Steen, V.M.; Lundervold, A.J.; Reinvang, I.; Montgomery, G.W.; Espeseth, T.; Hulshoff Pol, H.E.; Starr, J.M.; Martin, N.G.; Le Hellard, S.; Boomsma, D.I.; Deary, I.J.; Wright, M.J.

    2015-01-01

    Relational complexity (RC) is a metric reflecting capacity limitation in relational processing. It plays a crucial role in higher cognitive processes and is an endophenotype for several disorders. However, the genetic underpinnings of complex relational processing have not been investigated. Using

  1. Business model metrics : An open repository

    Heikkila, M.; Bouwman, W.A.G.A.; Heikkila, J.; Solaimani, S.; Janssen, W.

    2015-01-01

    Development of successful business models has become a necessity in turbulent business environments, but compared to research on business modeling tools, attention to the role of metrics in designing business models in literature is limited. Building on existing approaches to business models and

  2. Software quality metrics aggregation in industry

    Mordal, K.; Anquetil, N.; Laval, J.; Serebrenik, A.; Vasilescu, B.N.; Ducasse, S.

    2013-01-01

    With the growing need for quality assessment of entire software systems in the industry, new issues are emerging. First, because most software quality metrics are defined at the level of individual software components, there is a need for aggregation methods to summarize the results at the system

  3. Invariance group of the Finsler metric function

    Asanov, G.S.

    1985-01-01

    An invariance group of the Finsler metric function is introduced and studied that directly generalizes the corresponding concept (the group of Euclidean rotations) of Riemannian geometry. A consistent description of the isotopic invariance of physical fields on the basis of Finsler geometry is possible in terms of this group

  4. Sigma Routing Metric for RPL Protocol

    Paul Sanmartin

    2018-04-01

    Full Text Available This paper presents the adaptation of a specific metric for the RPL protocol in the objective function MRHOF. Among the objective functions standardized by the IETF, we find OF0, which is based on the minimum hop count, and MRHOF, which is based on the Expected Transmission Count (ETX). However, when the network becomes denser or the number of nodes increases, both OF0 and MRHOF introduce long hops, which can generate a bottleneck that restricts the network. An adaptation is proposed to optimize both OFs through a new routing metric. To solve the above problem, the minimum-hop-count and ETX metrics are combined into a new routing metric called SIGMA-ETX, in which the best route is selected using the standard deviation of the ETX values between each node, as opposed to the ETX average along the route. This method ensures better routing performance in dense sensor networks. The simulations are done in the Cooja simulator, based on the Contiki operating system. The simulations showed that the proposed optimization outperforms both OF0 and MRHOF by a wide margin in terms of network latency, packet delivery ratio, lifetime, and power consumption.
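
    The scoring rule described above can be sketched in a few lines. The exact SIGMA-ETX formula is not given in this abstract, so the code below assumes the simplest reading: score each candidate route by the standard deviation of its per-hop ETX values and prefer the smallest:

```python
import statistics

def sigma_etx_score(etx_per_hop):
    """Score a candidate route by the spread of its per-hop ETX values.
    Routes with uniformly good links beat routes whose average hides one bad hop."""
    if len(etx_per_hop) < 2:
        return 0.0
    return statistics.pstdev(etx_per_hop)

routes = {
    "A": [1.1, 1.2, 1.1, 1.3],   # uniform links
    "B": [1.0, 1.0, 1.0, 3.5],   # similar mean, one bad hop
}
best = min(routes, key=lambda r: sigma_etx_score(routes[r]))
print({r: round(sigma_etx_score(e), 3) for r, e in routes.items()}, "->", best)
```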

  5. In Search of Helpful Group Awareness Metrics in Closed-Type Formative Assessment Tools

    Papadopoulos, Pantelis M.; Natsis, Antonios; Obwegeser, Nikolaus

    2017-01-01

    For 4 weeks, a total of 91 sophomore students started their classes with a short multiple-choice quiz. The students had to answer the quiz individually, view feedback on class activity, revise their initial answers, and discuss the correct answers with the teacher. The percentage of students that selected each question choice and their self-reported confidence and preparation were the three metrics included in the feedback. Results showed that students were relying mainly on the percentage metric. However, statistical analysis also revealed a significant main effect for confidence and preparation...

  6. Mapping Rubber Plantations and Natural Forests in Xishuangbanna (Southwest China Using Multi-Spectral Phenological Metrics from MODIS Time Series

    Sebastian van der Linden

    2013-05-01

    Full Text Available We developed and evaluated a new approach for mapping rubber plantations and natural forests in one of Southeast Asia's biodiversity hot spots, Xishuangbanna in China. We used a one-year time series of Moderate Resolution Imaging Spectroradiometer (MODIS) Enhanced Vegetation Index (EVI) and short-wave infrared (SWIR) reflectance data to develop phenological metrics. These phenological metrics were used to classify rubber plantations and forests with the Random Forest classification algorithm. We evaluated which key phenological characteristics were important for discriminating rubber plantations from natural forests by estimating the influence of each metric on the classification accuracy. As a benchmark, we compared the best classification with a classification based on the full, fitted time series data. Overall classification accuracies derived from the EVI and SWIR time series alone were 64.4% and 67.9%, respectively. Combining the phenological metrics from the EVI and SWIR time series improved the accuracy to 73.5%. Using the full, smoothed time series data instead of the metrics derived from it improved the overall accuracy only slightly (1.3%), indicating that the phenological metrics were sufficient to explain the seasonal changes captured by the MODIS time series. The results demonstrate a promising utility of phenological metrics for mapping and monitoring rubber expansion with MODIS.
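
    A hedged sketch of the overall workflow, using synthetic stand-ins for the MODIS EVI trajectories and a reduced metric set (the paper's actual phenological metrics are richer): derive per-pixel metrics, train a Random Forest, and read the feature importances to see which metrics drive the discrimination:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def phenology_metrics(series):
    """A few simple phenological metrics from one pixel's annual VI time series."""
    t = np.arange(len(series))
    return [
        series.max(),                      # peak greenness
        series.min(),                      # dry-season minimum
        series.max() - series.min(),       # seasonal amplitude
        int(t[series.argmax()]),           # timing of the peak
    ]

def fake_pixel(amp):
    """Synthetic stand-in for a 23-composite annual EVI trajectory."""
    t = np.linspace(0, 2 * np.pi, 23)
    return 0.4 + amp * np.sin(t) + rng.normal(0, 0.02, 23)

X = np.array([phenology_metrics(fake_pixel(a))
              for a in np.r_[rng.uniform(0.05, 0.1, 50),    # "forest": damped seasonality
                             rng.uniform(0.2, 0.3, 50)]])   # "rubber": strong defoliation signal
y = np.array([0] * 50 + [1] * 50)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(dict(zip(["max", "min", "amplitude", "peak_week"],
               clf.feature_importances_.round(2))))
```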

  7. Computing the Gromov hyperbolicity of a discrete metric space

    Fournier, Hervé

    2015-02-12

    We give exact and approximation algorithms for computing the Gromov hyperbolicity of an n-point discrete metric space. We observe that computing the Gromov hyperbolicity from a fixed base-point reduces to a (max,min) matrix product. Hence, using the (max,min) matrix product algorithm by Duan and Pettie, the fixed base-point hyperbolicity can be determined in O(n^2.69) time. It follows that the Gromov hyperbolicity can be computed in O(n^3.69) time, and a 2-approximation can be found in O(n^2.69) time. We also give a (2 log_2 n)-approximation algorithm that runs in O(n^2) time, based on a tree-metric embedding by Gromov. We also show that hyperbolicity at a fixed base-point cannot be computed in O(n^2.05) time, unless there exists a faster algorithm for (max,min) matrix multiplication than currently known.
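
    The fixed base-point computation reduces to a (max,min) matrix product of Gromov products, which is easy to sketch naively in O(n^3) time (the abstract's faster bounds come from the Duan-Pettie algorithm, not from this sketch):

```python
import numpy as np

def basepoint_hyperbolicity(d, w=0):
    """Fixed base-point hyperbolicity via a naive (max,min) matrix product.
    G[x, y] is the Gromov product (x|y)_w = (d(x,w) + d(y,w) - d(x,y)) / 2;
    delta_w = max_{x,y} ((G * G)[x, y] - G[x, y]) with * the (max,min) product.
    The true hyperbolicity delta satisfies delta_w <= delta <= 2 * delta_w."""
    g = 0.5 * (d[:, [w]] + d[[w], :] - d)
    n = d.shape[0]
    prod = np.empty_like(g)
    for i in range(n):
        # prod[i, j] = max_k min(g[i, k], g[k, j])
        prod[i] = np.maximum.reduce(np.minimum(g[i][:, None], g), axis=0)
    return float((prod - g).max())

# A path metric is a tree metric, so its hyperbolicity is 0.
pts = np.array([0.0, 1.0, 2.0, 3.0])
path = np.abs(np.subtract.outer(pts, pts))
print(basepoint_hyperbolicity(path))  # 0.0
```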

  8. Observable traces of non-metricity: New constraints on metric-affine gravity

    Delhom-Latorre, Adrià; Olmo, Gonzalo J.; Ronco, Michele

    2018-05-01

    Relaxing the Riemannian condition to incorporate geometric quantities such as torsion and non-metricity may allow us to explore new physics associated with defects in a hypothetical space-time microstructure. Here we show that non-metricity produces observable effects in quantum fields in the form of 4-fermion contact interactions, thereby allowing us to constrain the scale of non-metricity to be greater than 1 TeV by using results on Bhabha scattering. Our analysis is carried out in the framework of a wide class of theories of gravity in the metric-affine approach. The bound obtained represents an improvement of several orders of magnitude over previous experimental constraints.

  9. Conformal and related changes of metric on the product of two almost contact metric manifolds.

    Blair, D. E.

    1990-01-01

    This paper studies conformal and related changes of the product metric on the product of two almost contact metric manifolds. It is shown that if one factor is Sasakian, the other is not, but that locally the second factor is of the type studied by Kenmotsu. The results are more general and given in terms of trans-Sasakian, α-Sasakian and β-Kenmotsu structures.

  10. Drift correction for single-molecule imaging by molecular constraint field, a distance minimum metric

    Han, Renmin; Wang, Liansan; Xu, Fan; Zhang, Yongdeng; Zhang, Mingshu; Liu, Zhiyong; Ren, Fei; Zhang, Fa

    2015-01-01

    The recent developments of far-field optical microscopy (single-molecule imaging techniques) have overcome the diffraction barrier of light and improved image resolution by a factor of ten compared with conventional light microscopy. These techniques utilize the stochastic switching of probe molecules to overcome the diffraction limit and determine the precise localizations of molecules, which often requires a long image acquisition time. However, long acquisition times increase the risk of sample drift. In the case of high-resolution microscopy, sample drift decreases the image resolution. In this paper, we propose a novel metric based on the distance between molecules to solve the drift-correction problem. The proposed metric directly uses the position information of molecules to estimate the frame drift. We also designed an algorithm to implement the metric for the general application of drift correction. There are two advantages of our method: First, because our method does not require spatial binning of the positions of molecules but operates directly on the positions, it is more natural for single-molecule imaging techniques. Second, our method can estimate drift with a small number of positions in each temporal bin, which may extend its potential applications. The effectiveness of our method has been demonstrated on both simulated data and experiments on single-molecule images
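
    The paper's algorithm operates directly on localization positions; as a rough illustration of a distance-minimum drift metric (a brute-force grid search, not the authors' method), one can shift one temporal bin against another and keep the shift that minimizes the mean nearest-neighbour distance:

```python
import numpy as np
from scipy.spatial import cKDTree

def estimate_drift(ref_pts, drifted_pts, search=50.0, step=2.0):
    """Brute-force the (dx, dy) shift that minimizes the mean nearest-neighbour
    distance between shifted localizations and a reference temporal bin."""
    tree = cKDTree(ref_pts)
    best = (np.inf, (0.0, 0.0))
    for dx in np.arange(-search, search + step, step):
        for dy in np.arange(-search, search + step, step):
            dists, _ = tree.query(drifted_pts - [dx, dy])
            best = min(best, (dists.mean(), (dx, dy)))
    return best[1]

rng = np.random.default_rng(1)
mol = rng.uniform(0, 1000, (300, 2))                     # "true" molecule positions (nm)
frame1 = mol + rng.normal(0, 5, mol.shape)               # localization noise
frame2 = mol + [20, -12] + rng.normal(0, 5, mol.shape)   # same structure, drifted
print(estimate_drift(frame1, frame2))                    # ~ (20.0, -12.0)
```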

  11. Macroscale hydrologic modeling of ecologically relevant flow metrics

    Wenger, Seth J.; Luce, Charles H.; Hamlet, Alan F.; Isaak, Daniel J.; Neville, Helen M.

    2010-09-01

    Stream hydrology strongly affects the structure of aquatic communities. Changes to air temperature and precipitation driven by increased greenhouse gas concentrations are shifting timing and volume of streamflows potentially affecting these communities. The variable infiltration capacity (VIC) macroscale hydrologic model has been employed at regional scales to describe and forecast hydrologic changes but has been calibrated and applied mainly to large rivers. An important question is how well VIC runoff simulations serve to answer questions about hydrologic changes in smaller streams, which are important habitat for many fish species. To answer this question, we aggregated gridded VIC outputs within the drainage basins of 55 streamflow gages in the Pacific Northwest United States and compared modeled hydrographs and summary metrics to observations. For most streams, several ecologically relevant aspects of the hydrologic regime were accurately modeled, including center of flow timing, mean annual and summer flows and frequency of winter floods. Frequencies of high and low flows in the summer were not well predicted, however. Predictions were worse for sites with strong groundwater influence, and some sites showed errors that may result from limitations in the forcing climate data. Higher resolution (1/16th degree) modeling provided small improvements over lower resolution (1/8th degree). Despite some limitations, the VIC model appears capable of representing several ecologically relevant hydrologic characteristics in streams, making it a useful tool for understanding the effects of hydrology in delimiting species distributions and predicting the potential effects of climate shifts on aquatic organisms.
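
    Two of the ecologically relevant metrics named above have simple, commonly used definitions; the sketch below assumes those common definitions (center of flow timing as the half-volume date, winter floods as exceedances of a multiple of the annual mean), which may differ in detail from the paper's:

```python
import numpy as np

def center_of_flow_timing(daily_flow):
    """Day of water year on which cumulative flow first reaches half the annual total."""
    cum = np.cumsum(daily_flow)
    return int(np.searchsorted(cum, 0.5 * cum[-1])) + 1

def winter_flood_frequency(daily_flow, winter_days=slice(0, 90), factor=2.0):
    """Count winter days whose flow exceeds `factor` times the annual mean."""
    flow = np.asarray(daily_flow)
    return int((flow[winter_days] > factor * flow.mean()).sum())

# Synthetic snowmelt-dominated hydrograph: low winter flows, spring freshet.
days = np.arange(365)
flow = 1.0 + 9.0 * np.exp(-0.5 * ((days - 200) / 25.0) ** 2)
print(center_of_flow_timing(flow), winter_flood_frequency(flow))
```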

  12. Relationship between bifenthrin sediment toxic units and benthic community metrics in urban California streams.

    Hall, Lenwood W; Anderson, Ronald D

    2013-08-01

    The objective of this study was to use ecologically relevant field measurements for determining the relationship between bifenthrin sediment toxic units (TUs) (environmental concentrations/Hyalella acute LC50 value) and 15 benthic metrics in four urban California streams sampled from 2006 to 2011. Data from the following four California streams were used in the analysis: Kirker Creek (2006, 2007), Pleasant Grove Creek (2006, 2007, and 2008), Arcade Creek (2009, 2010, and 2011), and Salinas streams (2009, 2010, and 2011). The results from univariate analysis of benthic metrics versus bifenthrin TU calculations for the four California streams with multiple-year datasets combined by stream showed that there were either nonsignificant relationships or lack of metric data for 93 % of cases. For 7 % of the data (4 cases) where significant relationships were reported between benthic metrics and bifenthrin TUs, these relationships were ecologically meaningful. Three of these significant direct relationships were an expression of tolerant benthic taxa (either % tolerant taxa or tolerance values, which are similar metrics), which would be expected to increase in a stressed environment. These direct significant tolerance relationships were reported for Kirker Creek, Pleasant Grove Creek, and Arcade Creek. The fourth significant relationship was an inverse relationship between taxa richness and bifenthrin TUs for the 3-year Pleasant Grove Creek dataset. In summary, only a small percent of the benthic metric × bifenthrin TU relationships were significant for the four California streams. Therefore, the general summary conclusion from this analysis is that there is no strong case for showing consistent meaningful relationships between various benthic metrics used to characterize the status of benthic communities and bifenthrin TUs for these four California streams.
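
    The toxic-unit calculation itself is a one-line ratio, and the univariate analysis pairs it with a benthic metric; a small sketch with hypothetical numbers (the LC50 constant here is a placeholder, not a value from the study):

```python
from scipy import stats

HYALELLA_LC50 = 0.52   # assumed acute LC50 for bifenthrin; placeholder value

def toxic_units(sediment_conc, lc50=HYALELLA_LC50):
    """TU = measured sediment concentration / Hyalella acute LC50."""
    return [c / lc50 for c in sediment_conc]

# Hypothetical paired observations: bifenthrin TUs vs. % tolerant taxa at sites.
tu = toxic_units([0.1, 0.3, 0.6, 1.2, 2.0, 3.1])
pct_tolerant = [12, 15, 22, 30, 41, 47]
slope, intercept, r, p, se = stats.linregress(tu, pct_tolerant)
print(f"r^2={r**2:.2f}, p={p:.4f}")  # direct relationship, as reported for tolerant taxa
```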

  13. Metrics for measuring distances in configuration spaces

    Sadeghi, Ali; Ghasemi, S. Alireza; Schaefer, Bastian; Mohr, Stephan; Goedecker, Stefan; Lill, Markus A.

    2013-01-01

    In order to characterize molecular structures we introduce configurational fingerprint vectors which are counterparts of quantities used experimentally to identify structures. The Euclidean distance between the configurational fingerprint vectors satisfies the properties of a metric and can therefore safely be used to measure dissimilarities between configurations in the high dimensional configuration space. In particular we show that these metrics are a perfect and computationally cheap replacement for the root-mean-square distance (RMSD) when one has to decide whether two noise contaminated configurations are identical or not. We introduce a Monte Carlo approach to obtain the global minimum of the RMSD between configurations, which is obtained from a global minimization over all translations, rotations, and permutations of atomic indices
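
    A simplified variant of the idea (sorted eigenvalues of the pairwise-distance matrix as the fingerprint, rather than the paper's own construction) already shows the key property: the Euclidean distance between fingerprints ignores translations, rotations, and atom permutations:

```python
import numpy as np

def fingerprint(positions):
    """Configurational fingerprint: sorted eigenvalues of the pairwise-distance
    matrix. Invariant under translation, rotation, and permutation of atoms."""
    d = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    return np.sort(np.linalg.eigvalsh(d))

def fingerprint_distance(pos_a, pos_b):
    """Euclidean distance between fingerprint vectors; a cheap dissimilarity
    measure (rare cospectral configurations aside)."""
    return float(np.linalg.norm(fingerprint(pos_a) - fingerprint(pos_b)))

rng = np.random.default_rng(2)
conf = rng.normal(size=(8, 3))               # a toy 8-atom configuration
perm = conf[rng.permutation(8)]              # same structure, atoms relabeled
theta = 0.7
rot = np.array([[np.cos(theta), -np.sin(theta), 0],
                [np.sin(theta),  np.cos(theta), 0],
                [0, 0, 1]])
print(fingerprint_distance(conf, perm @ rot.T))   # ~0: identical up to symmetry
print(fingerprint_distance(conf, conf + rng.normal(0, 0.3, conf.shape)))  # > 0
```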

  14. A perceptual metric for photo retouching.

    Kee, Eric; Farid, Hany

    2011-12-13

    In recent years, advertisers and magazine editors have been widely criticized for taking digital photo retouching to an extreme. Impossibly thin, tall, and wrinkle- and blemish-free models are routinely splashed onto billboards, advertisements, and magazine covers. The ubiquity of these unrealistic and highly idealized images has been linked to eating disorders and body image dissatisfaction in men, women, and children. In response, several countries have considered legislating the labeling of retouched photos. We describe a quantitative and perceptually meaningful metric of photo retouching. Photographs are rated on the degree to which they have been digitally altered by explicitly modeling and estimating geometric and photometric changes. This metric correlates well with perceptual judgments of photo retouching and can be used to objectively judge by how much a retouched photo has strayed from reality.

  15. Metric-Aware Secure Service Orchestration

    Gabriele Costa

    2012-12-01

    Full Text Available Secure orchestration is an important concern in the Internet of Services. Next to providing the required functionality, composite services must also provide a reasonable level of security in order to protect sensitive data. Thus, the orchestrator needs to check whether the complex service is able to satisfy certain properties. Some properties are expressed with metrics for a precise definition of requirements, so the problem is to analyse the values of metrics for a complex business process. In this paper we extend our previous work on the analysis of secure orchestration to quantifiable properties. We show how to define, verify and enforce quantitative security requirements in one framework together with other security properties. The proposed approach should help to select the most suitable service architecture and guarantee fulfilment of the declared security requirements.

  16. Machine Learning for ATLAS DDM Network Metrics

    Lassnig, Mario; The ATLAS collaboration; Vamosi, Ralf

    2016-01-01

    The increasing volume of physics data is posing a critical challenge to the ATLAS experiment. In anticipation of high-luminosity physics, automation of everyday data management tasks has become necessary. Previously, many of these tasks required human decision-making and operation. Recent advances in hardware and software have made it possible to entrust more complicated duties to automated systems using models trained by machine learning algorithms. In this contribution we show results from our ongoing automation efforts. First, we describe our framework for distributed data management and network metrics, which automatically extracts and aggregates data, trains models with various machine learning algorithms, and eventually scores the resulting models and parameters. Second, we use these models to forecast metrics relevant for network-aware job scheduling and data brokering. We show the characteristics of the data and evaluate the forecasting accuracy of our models.

  17. Beyond Lovelock gravity: Higher derivative metric theories

    Crisostomi, M.; Noui, K.; Charmousis, C.; Langlois, D.

    2018-02-01

    We consider theories describing the dynamics of a four-dimensional metric, whose Lagrangian is diffeomorphism invariant and depends at most on second derivatives of the metric. Imposing degeneracy conditions, we find a set of Lagrangians that, apart from the Einstein-Hilbert one, are either trivial or contain more than 2 degrees of freedom. Among the partially degenerate theories, we recover Chern-Simons gravity, endowed with constraints whose structure suggests the presence of instabilities. Then, we enlarge the class of parity-violating theories of gravity by introducing new "chiral scalar-tensor theories." Although they all raise the same concern as Chern-Simons gravity, they can nevertheless make sense as low-energy effective field theories or, by restricting them to the unitary gauge (where the scalar field is uniform), as Lorentz-breaking theories with a parity-violating sector.

  18. High-Dimensional Metrics in R

    Chernozhukov, Victor; Hansen, Chris; Spindler, Martin

    2016-01-01

    The package High-dimensional Metrics (\\Rpackage{hdm}) is an evolving collection of statistical methods for estimation and quantification of uncertainty in high-dimensional approximately sparse models. It focuses on providing confidence intervals and significance testing for (possibly many) low-dimensional subcomponents of the high-dimensional parameter vector. Efficient estimators and uniformly valid confidence intervals for regression coefficients on target variables (e.g., treatment or poli...

  19. Interiors of Vaidya's radiating metric: Gravitational collapse

    Fayos, F.; Jaen, X.; Llanta, E.; Senovilla, J.M.M.

    1992-01-01

    Using the Darmois junction conditions, we give the necessary and sufficient conditions for the matching of a general spherically symmetric metric to a Vaidya radiating solution. We present also these conditions in terms of the physical quantities of the corresponding energy-momentum tensors. The physical interpretation of the results and their possible applications are studied, and we also perform a detailed analysis of previous work on the subject by other authors

  20. Anisotropic rectangular metric for polygonal surface remeshing

    Pellenard, Bertrand

    2013-06-18

    We propose a new method for anisotropic polygonal surface remeshing. Our algorithm takes as input a surface triangle mesh. An anisotropic rectangular metric, defined at each triangle facet of the input mesh, is derived from both a user-specified normal-based tolerance error and the requirement to favor rectangle-shaped polygons. Our algorithm uses a greedy optimization procedure that adds, deletes and relocates generators so as to match two criteria related to partitioning and conformity.

  1. A Metrics Approach for Collaborative Systems

    Cristian CIUREA

    2009-01-01

    Full Text Available This article presents different types of collaborative systems, their structure and classification. It defines the concept of a virtual campus as a collaborative system and builds an architecture for a virtual campus oriented toward collaborative training processes. It analyses the quality characteristics of collaborative systems and proposes techniques for metric construction and validation in order to evaluate them. The article also analyzes different ways to increase efficiency and performance in collaborative banking systems.

  2. Preserved Network Metrics across Translated Texts

    Cabatbat, Josephine Jill T.; Monsanto, Jica P.; Tapang, Giovanni A.

    2014-09-01

    Co-occurrence language networks based on Bible translations and the Universal Declaration of Human Rights (UDHR) translations in different languages were constructed and compared with random text networks. Among the considered network metrics, the network size, N, the normalized betweenness centrality (BC), and the average k-nearest neighbors, knn, were found to be the most preserved across translations. Moreover, similar frequency distributions of co-occurring network motifs were observed for translated texts networks.

  3. Anisotropic rectangular metric for polygonal surface remeshing

    Pellenard, Bertrand; Morvan, Jean-Marie; Alliez, Pierre

    2013-01-01

    We propose a new method for anisotropic polygonal surface remeshing. Our algorithm takes as input a surface triangle mesh. An anisotropic rectangular metric, defined at each triangle facet of the input mesh, is derived from both a user-specified normal-based tolerance error and the requirement to favor rectangle-shaped polygons. Our algorithm uses a greedy optimization procedure that adds, deletes and relocates generators so as to match two criteria related to partitioning and conformity.

  4. Smart Grid Status and Metrics Report

    Balducci, Patrick J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Weimar, Mark R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kirkham, Harold [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-07-01

    To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. It measures 21 metrics to provide insight into the grid’s capacity to embody these characteristics. This report looks across a spectrum of smart grid concerns to measure the status of smart grid deployment and impacts.

  5. Metrics in Keplerian orbits quotient spaces

    Milanov, Danila V.

    2018-03-01

    Quotient spaces of Keplerian orbits are important instruments for the modelling of orbit samples of celestial bodies over a large time span. We suppose that variations of the orbital eccentricities, inclinations and semi-major axes remain sufficiently small, while arbitrary perturbations are allowed for the arguments of pericentres or longitudes of the nodes, or both. The distance between orbits or their images in quotient spaces serves as a numerical criterion for such problems of Celestial Mechanics as the search for a common origin of meteoroid streams, comets, and asteroids, the identification of asteroid families, and others. In this paper, we consider quotient sets of the non-rectilinear Keplerian orbit space H. Their elements are identified irrespective of the values of the pericentre arguments or node longitudes. We prove that distance functions on the quotient sets, introduced in Kholshevnikov et al. (Mon Not R Astron Soc 462:2275-2283, 2016), satisfy the metric space axioms, and we discuss the theoretical and practical importance of this result. Isometric embeddings of the quotient spaces into R^n, and into a space of compact subsets of H with the Hausdorff metric, are constructed. The Euclidean representations of the orbit spaces find their applications in a problem of orbit averaging and in computational algorithms specific to Euclidean space. We also explore completions of H and its quotient spaces with respect to the corresponding metrics and establish a relation between elements of the extended spaces and rectilinear trajectories. The distance between an orbit and the subsets of elliptic and hyperbolic orbits is calculated. This quantity provides an upper bound for the metric value in a problem of close-orbit identification. Finally, the invariance of the equivalence relations in H under coordinate changes is discussed.

  6. The Planck Vacuum and the Schwarzschild Metrics

    Daywitt W. C.

    2009-07-01

    Full Text Available The Planck vacuum (PV) is assumed to be the source of the visible universe. So under conditions of sufficient stress, there must exist a pathway through which energy from the PV can travel into this universe. Conversely, the passage of energy from the visible universe to the PV must also exist under the same stressful conditions. The following examines two versions of the Schwarzschild metric equation for compatibility with this open-pathway idea.

  7. Metrics and Its Function in Poetry

    XIAO Zhong-qiong; CHEN Min-jie

    2013-01-01

    Poetry is a special combination of musical and linguistic qualities: of sounds regarded both as pure sound and as meaningful speech. Part of the pleasure of poetry lies in its relationship with music. Metrics, including rhythm and meter, is an important method for poetry to express poetic sentiment. Through the introduction of poetic language and typical examples, the writer of this paper tries to discuss the relationship between sound and meaning.

  8. An Enhanced TIMESAT Algorithm for Estimating Vegetation Phenology Metrics from MODIS Data

    Tan, Bin; Morisette, Jeffrey T.; Wolfe, Robert E.; Gao, Feng; Ederer, Gregory A.; Nightingale, Joanne; Pedelty, Jeffrey A.

    2012-01-01

    An enhanced TIMESAT algorithm was developed for retrieving vegetation phenology metrics from 250 m and 500 m spatial resolution Moderate Resolution Imaging Spectroradiometer (MODIS) vegetation indexes (VI) over North America. MODIS VI data were pre-processed using snow-cover and land surface temperature data, and temporally smoothed with the enhanced TIMESAT algorithm. An objective third derivative test was applied to define key phenology dates and retrieve a set of phenology metrics. This algorithm has been applied to two MODIS VIs: Normalized Difference Vegetation Index (NDVI) and Enhanced Vegetation Index (EVI). In this paper, we describe the algorithm and use EVI as an example to compare three sets of TIMESAT algorithm/MODIS VI combinations: a) original TIMESAT algorithm with original MODIS VI, b) original TIMESAT algorithm with pre-processed MODIS VI, and c) enhanced TIMESAT and pre-processed MODIS VI. All retrievals were compared with ground phenology observations, some made available through the National Phenology Network. Our results show that for MODIS data in middle to high latitude regions, snow and land surface temperature information is critical in retrieving phenology metrics from satellite observations. The results also show that the enhanced TIMESAT algorithm can better accommodate growing season start and end dates that vary significantly from year to year. The TIMESAT algorithm improvements contribute to more spatial coverage and more accurate retrievals of the phenology metrics. Among three sets of TIMESAT/MODIS VI combinations, the start of the growing season metric predicted by the enhanced TIMESAT algorithm using pre-processed MODIS VIs has the best associations with ground observed vegetation greenup dates.
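
    A toy rendering of the third-derivative test (crude moving-average smoothing instead of the enhanced TIMESAT fitting, so only illustrative): zero crossings of the third derivative of the smoothed VI curve mark where the curvature changes fastest, i.e. candidate season transition dates:

```python
import numpy as np

def phenology_dates(vi, smooth_window=5):
    """Locate candidate phenology dates with a third-derivative test: after
    smoothing, zero crossings of the third derivative mark where the curvature
    of the growing-season curve changes fastest (its shoulders)."""
    kernel = np.ones(smooth_window) / smooth_window
    s = np.convolve(vi, kernel, mode="same")   # crude smoother, boundary effects included
    d3 = np.gradient(np.gradient(np.gradient(s)))
    return np.where(np.diff(np.sign(d3)) != 0)[0]

t = np.linspace(0, 1, 46)                                  # ~8-day composites, one year
vi = 0.2 + 0.5 * np.exp(-0.5 * ((t - 0.55) / 0.12) ** 2)   # idealized EVI curve
print(phenology_dates(vi))
```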

  9. A Fundamental Metric for Metal Recycling Applied to Coated Magnesium

    Meskers, C.E.M.; Reuter, M.A.; Boin, U.; Kvithyld, A.

    2008-01-01

    A fundamental metric for the assessment of the recyclability and, hence, the sustainability of coated magnesium scrap is presented; this metric combines kinetics and thermodynamics. The recycling process, consisting of thermal decoating and remelting, was studied by thermogravimetry and differential

  10. Ideal Based Cyber Security Technical Metrics for Control Systems

    W. F. Boyer; M. A. McQueen

    2007-10-01

    Much of the world's critical infrastructure is at risk from attack through electronic networks connected to control systems. Security metrics are important because they provide the basis for management decisions that affect the protection of the infrastructure. A cyber security technical metric is the security relevant output from an explicit mathematical model that makes use of objective measurements of a technical object. A specific set of technical security metrics are proposed for use by the operators of control systems. Our proposed metrics are based on seven security ideals associated with seven corresponding abstract dimensions of security. We have defined at least one metric for each of the seven ideals. Each metric is a measure of how nearly the associated ideal has been achieved. These seven ideals provide a useful structure for further metrics development. A case study shows how the proposed metrics can be applied to an operational control system.

  11. 43 CFR 12.915 - Metric system of measurement.

    2010-10-01

    ... procurements, grants, and other business-related activities. Metric implementation may take longer where the... recipient, such as when foreign competitors are producing competing products in non-metric units. (End of...

  12. The Jacobi metric for timelike geodesics in static spacetimes

    Gibbons, G. W.

    2016-01-01

    It is shown that the free motion of massive particles moving in static spacetimes is given by the geodesics of an energy-dependent Riemannian metric on the spatial sections, analogous to Jacobi's metric in classical dynamics. In the massless limit Jacobi's metric coincides with the energy-independent Fermat or optical metric. For stationary metrics, it is known that the motion of massless particles is given by the geodesics of an energy-independent Finslerian metric of Randers type. The motion of massive particles is governed by neither a Riemannian nor a Finslerian metric. The properties of the Jacobi metric for massive particles moving outside the horizon of a Schwarzschild black hole are described. By contrast with the massless case, the Gaussian curvature of the equatorial sections is not always negative.
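
    For reference, the form of the energy-dependent Jacobi metric described above, written for a static spacetime ds^2 = -V^2 dt^2 + g_ij dx^i dx^j with conserved energy E and particle mass m (reconstructed here from the abstract's description; consult the paper for exact conventions):

```latex
% Jacobi metric on the spatial sections of a static spacetime
% ds^2 = -V^2 dt^2 + g_{ij} dx^i dx^j, for energy E and mass m.
\[
  ds_J^2 \;=\; \frac{E^2 - m^2 V^2}{V^2}\, g_{ij}\, dx^i dx^j .
\]
% Massless limit m -> 0: the energy dependence becomes an overall constant
% factor E^2, leaving the energy-independent Fermat (optical) metric
\[
  ds_{\mathrm{opt}}^2 \;=\; \frac{g_{ij}\, dx^i dx^j}{V^2} .
\]
```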

  13. Spatially-Explicit Bayesian Information Entropy Metrics for Calibrating Landscape Transformation Models

    Kostas Alexandridis

    2013-06-01

    Full Text Available Assessing spatial model performance often presents challenges related to the choice and suitability of traditional statistical methods in capturing the true validity and dynamics of the predicted outcomes. The stochastic nature of many of our contemporary spatial models of land use change necessitates the testing and development of new and innovative methodologies in statistical spatial assessment. In many cases, spatial model performance depends critically on the spatially-explicit prior distributions, characteristics, availability and prevalence of the variables and factors under study. This study explores the statistical spatial characteristics of statistical model assessment in modeling land use change dynamics in a seven-county study area in South-Eastern Wisconsin during the historical period of 1963–1990. The artificial neural network-based Land Transformation Model (LTM) predictions are used to compare simulated with historical land use transformations in urban/suburban landscapes. We introduce a range of Bayesian information entropy statistical spatial metrics for assessing the model performance across multiple simulation testing runs. Bayesian entropic estimates of model performance are compared against information-theoretic stochastic entropy estimates and theoretically-derived accuracy assessments. We argue for the critical role of informational uncertainty across different scales of spatial resolution in informing spatial landscape model assessment. Our analysis reveals how incorporation of spatial and landscape information asymmetry estimates can improve our stochastic assessments of spatial model predictions. Finally, our study shows how spatially-explicit entropic classification accuracy estimates can work closely with dynamic modeling methodologies in improving our scientific understanding of landscape change as a complex adaptive system and process.

  14. Gap Resolution

    2017-04-25

    Gap Resolution is a software package that was developed to improve Newbler genome assemblies by automating the closure of sequence gaps caused by repetitive regions in the DNA. This is done by performing the following steps: 1) Identify and distribute the data for each gap into sub-projects. 2) Assemble the data associated with each sub-project using a secondary assembler, such as Newbler or PGA. 3) Determine whether any gaps are closed after reassembly, and either design fakes (the consensus of a closed gap) for those that closed or design lab experiments for those that require additional data. The software requires as input a genome assembly produced by the Newbler assembler provided by Roche, and 454 data containing paired-end reads.
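
    Step 3, the triage of reassembled sub-projects, is the decision point of the pipeline; the toy sketch below uses a deliberately simplified, hypothetical representation of gaps and contigs (string flanks rather than real alignments, nothing from the actual Gap Resolution code) to show the closed-gap/lab-experiment split:

```python
def spans_gap(contig: str, left: str, right: str) -> bool:
    """A reassembled contig closes the gap if it bridges both flanking sequences."""
    return left in contig and right in contig

def triage_gaps(gaps):
    """Emit a 'fake' (gap consensus) when a contig bridges both flanks,
    otherwise queue the gap for a lab experiment (e.g. PCR)."""
    fakes, lab = [], []
    for gap in gaps:
        closed = [c for c in gap["contigs"] if spans_gap(c, gap["left"], gap["right"])]
        if closed:
            fakes.append({"id": gap["id"], "consensus": max(closed, key=len)})
        else:
            lab.append(gap["id"])
    return fakes, lab

# Toy sub-project results: gap 1 was bridged on reassembly, gap 2 was not.
gaps = [
    {"id": 1, "left": "ACGTAC", "right": "TTGCAA",
     "contigs": ["ACGTACGGGTTGCAA", "ACGTACGG"]},
    {"id": 2, "left": "GGGCCC", "right": "AAATTT", "contigs": ["GGGCCCAT"]},
]
print(triage_gaps(gaps))
```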

  15. Factor structure of the Tomimatsu-Sato metrics

    Perjes, Z.

    1989-02-01

    Based on an earlier result stating that δ = 3 Tomimatsu-Sato (TS) metrics can be factored over the field of integers, an analogous representation for higher TS metrics was sought. It is shown that the factoring property of TS metrics follows from the structure of special Hankel determinants. A set of linear algebraic equations determining the factors was defined, and the factors of the first five TS metrics were tabulated, together with their primitive factors. (R.P.) 4 refs.; 2 tabs

  16. What can article-level metrics do for you?

    Fenner, Martin

    2013-10-01

    Article-level metrics (ALMs) provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, usage statistics, discussions in online comments and social media, social bookmarking, and recommendations. In this essay, we describe why article-level metrics are an important extension of traditional citation-based journal metrics and provide a number of examples from ALM data collected for PLOS Biology.

  17. A convergence theory for probabilistic metric spaces | Jäger ...

    We develop a theory of probabilistic convergence spaces based on Tardiff's neighbourhood systems for probabilistic metric spaces. We show that the resulting category is a topological universe and we characterize a subcategory that is isomorphic to the category of probabilistic metric spaces. Keywords: Probabilistic metric ...

  18. Understanding Acceptance of Software Metrics--A Developer Perspective

    Umarji, Medha

    2009-01-01

    Software metrics are measures of software products and processes. Metrics are widely used by software organizations to help manage projects, improve product quality and increase efficiency of the software development process. However, metrics programs tend to have a high failure rate in organizations, and developer pushback is one of the sources…

  19. Modified intuitionistic fuzzy metric spaces and some fixed point theorems

    Saadati, R.; Sedghi, S.; Shobe, N.

    2008-01-01

    The intuitionistic fuzzy metric space has extra conditions (see [Gregori V, Romaguera S, Veereamani P. A note on intuitionistic fuzzy metric spaces. Chaos, Solitons and Fractals 2006;28:902-5]). In this paper, we therefore consider modified intuitionistic fuzzy metric spaces and prove some fixed point theorems in these spaces. All the results presented in this paper are new

  20. Tide or Tsunami? The Impact of Metrics on Scholarly Research

    Bonnell, Andrew G.

    2016-01-01

    Australian universities are increasingly resorting to the use of journal metrics such as impact factors and ranking lists in appraisal and promotion processes, and are starting to set quantitative "performance expectations" which make use of such journal-based metrics. The widespread use and misuse of research metrics is leading to…

  1. Graev metrics on free products and HNN extensions

    Slutsky, Konstantin

    2014-01-01

    We give a construction of two-sided invariant metrics on free products (possibly with amalgamation) of groups with two-sided invariant metrics and, under certain conditions, on HNN extensions of such groups. Our approach is similar to Graev's construction of metrics on free groups over pointed...

  2. The universal connection and metrics on moduli spaces

    Massamba, Fortune; Thompson, George

    2003-11-01

    We introduce a class of metrics on gauge theoretic moduli spaces. These metrics are made out of the universal matrix that appears in the universal connection construction of M. S. Narasimhan and S. Ramanan. As an example we construct metrics on the c_2 = 1 SU(2) moduli space of instantons on R^4 for various universal matrices. (author)

  3. ST-intuitionistic fuzzy metric space with properties

    Arora, Sahil; Kumar, Tanuj

    2017-07-01

    In this paper, we define the ST-intuitionistic fuzzy metric space, and the notions of convergence and completeness of Cauchy sequences are studied. Further, we prove some properties of ST-intuitionistic fuzzy metric spaces. Finally, we introduce the concept of a symmetric ST-intuitionistic fuzzy metric space.

  4. Term Based Comparison Metrics for Controlled and Uncontrolled Indexing Languages

    Good, B. M.; Tennis, J. T.

    2009-01-01

    Introduction: We define a collection of metrics for describing and comparing sets of terms in controlled and uncontrolled indexing languages and then show how these metrics can be used to characterize a set of languages spanning folksonomies, ontologies and thesauri. Method: Metrics for term set characterization and comparison were identified and…

  5. Software architecture analysis tool : software architecture metrics collection

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  6. Otherwise Engaged : Social Media from Vanity Metrics to Critical Analytics

    Rogers, R.

    2018-01-01

    Vanity metrics is a term that captures the measurement and display of how well one is doing in the “success theater” of social media. The notion of vanity metrics implies a critique of metrics concerning both the object of measurement and their capacity to measure unobtrusively or only to

  7. Meter Detection in Symbolic Music Using Inner Metric Analysis

    de Haas, W.B.; Volk, A.

    2016-01-01

    In this paper we present PRIMA: a new model tailored to symbolic music that detects the meter and the first downbeat position of a piece. Given onset data, the metrical structure of a piece is interpreted using the Inner Metric Analysis (IMA) model. IMA identifies the strong and weak metrical

  8. Regional Sustainability: The San Luis Basin Metrics Project

    There are a number of established, scientifically supported metrics of sustainability. Many of the metrics are data intensive and require extensive effort to collect data and compute. Moreover, individual metrics may not capture all aspects of a system that are relevant to sust...

  9. Predicting spatial variations of tree species richness in tropical forests from high-resolution remote sensing.

    Fricker, Geoffrey A; Wolf, Jeffrey A; Saatchi, Sassan S; Gillespie, Thomas W

    2015-10-01

    There is an increasing interest in identifying theories, empirical data sets, and remote-sensing metrics that can quantify tropical forest alpha diversity at a landscape scale. Quantifying patterns of tree species richness in the field is time consuming, especially in regions with over 100 tree species/ha. We examine species richness in a 50-ha plot in Barro Colorado Island in Panama and test if biophysical measurements of canopy reflectance from high-resolution satellite imagery and detailed vertical forest structure and topography from light detection and ranging (lidar) are associated with species richness across four tree size classes (>1, 1-10, >10, and >20 cm dbh) and three spatial scales (1, 0.25, and 0.04 ha). We use the 2010 tree inventory, including 204,757 individuals belonging to 301 species of freestanding woody plants or 166 ± 1.5 species/ha (mean ± SE), to compare with remote-sensing data. All remote-sensing metrics became less correlated with species richness as spatial resolution decreased from 1.0 ha to 0.04 ha and tree size increased from 1 cm to 20 cm dbh. When all stems with dbh > 1 cm in 1-ha plots were compared to remote-sensing metrics, standard deviation in canopy reflectance explained 13% of the variance in species richness. The standard deviations of canopy height and the topographic wetness index (TWI) derived from lidar were the best metrics to explain the spatial variance in species richness (15% and 24%, respectively). Using multiple regression models, we made predictions of species richness across Barro Colorado Island (BCI) at the 1-ha spatial scale for different tree size classes. We predicted variation in tree species richness among all plants (adjusted r² = 0.35) and trees with dbh > 10 cm (adjusted r² = 0.25). However, the best model results were for understory trees and shrubs (dbh 1-10 cm) (adjusted r² = 0.52) that comprise the majority of species richness in tropical forests. Our results indicate that high-resolution

  10. Virginia Base Mapping Program (VBMP) 2002; Multiple Resolutions (1"=100',1"= 200',1"=400' scale) Digital Orthophotography for the South Zone of the Virginia State Plane Grid

    Federal Emergency Management Agency, Department of Homeland Security — These files contain 2-foot GSD high-resolution orthorectified aerial image map products in GeoTIFF version 6.0 file format. GeoTIFF files are uncompressed raster...

  11. Simulation and assessment of urbanization impacts on runoff metrics: insights from landuse changes

    Zhang, Yongyong; Xia, Jun; Yu, Jingjie; Randall, Mark; Zhang, Yichi; Zhao, Tongtiegang; Pan, Xingyao; Zhai, Xiaoyan; Shao, Quanxi

    2018-05-01

    Urbanization-induced landuse changes alter runoff regimes in complex ways. In this study, a detailed investigation of the urbanization impacts on runoff regimes is provided by using multiple runoff metrics and with consideration of landuse dynamics. A catchment hydrological model is modified by coupling a simplified flow routing module of the urban drainage system and landuse dynamics to improve long-term urban runoff simulations. Moreover, multivariate statistical approach is adopted to mine the spatial variations of runoff metrics so as to further identify critical impact factors of landuse changes. The Qing River catchment as a peri-urban catchment in the Beijing metropolitan area is selected as our study region. Results show that: (1) the dryland agriculture is decreased from 13.9% to 1.5% of the total catchment area in the years 2000-2015, while the percentages of impervious surface, forest and grass are increased from 63.5% to 72.4%, 13.5% to 16.6% and 5.1% to 6.5%, respectively. The most dramatic landuse changes occur in the middle and downstream regions; (2) The combined landuse changes do not alter the average flow metrics obviously at the catchment outlet, but slightly increase the high flow metrics, particularly the extreme high flows; (3) The impacts on runoff metrics in the sub-catchments are more obvious than those at the catchment outlet. For the average flow metrics, the most impacted metric is the runoff depth in the dry season (October ∼ May) with a relative change from -10.9% to 11.6%, and the critical impact factors are the impervious surface and grass. For the high flow metrics, the extreme high flow depth is increased most significantly with a relative change from -0.6% to 10.5%, and the critical impact factors are the impervious surface and dryland agriculture; (4) The runoff depth metrics in the sub-catchments are increased because of the landuse changes from dryland agriculture to impervious surface, but are decreased because of the

  12. Estimation of Actual Evapotranspiration along the Middle Rio Grande of New Mexico Using MODIS and Landsat Imagery with the METRIC Model

    Trezza, Ricardo; Allen, Richard; Tasumi, Masahiro

    2013-01-01

    Estimation of actual evapotranspiration (ET) for the Middle Rio Grande valley in central New Mexico via the METRIC surface energy balance model using MODIS and Landsat imagery is described. MODIS images are a useful resource for estimating ET at large scales when high spatial resolution is not required. One advantage of MODIS satellites is that images having a view angle < ~15° are potentially available about every four to five days. The main challenge of applying METRIC using MODIS is the se...

  13. Extremal limits of the C metric: Nariai, Bertotti-Robinson, and anti-Nariai C metrics

    Dias, Oscar J.C.; Lemos, Jose P.S.

    2003-01-01

    In two previous papers we have analyzed the C metric in a background with a cosmological constant Λ, namely, the de Sitter (dS) C metric (Λ>0) and the anti-de Sitter (AdS) C metric (Λ<0). In this paper we analyze the extremal limits of the C metric for Λ>0, Λ=0, and Λ<0. In the Nariai C metric (with topology dS_2 × S̃^2), to each point in the deformed two-sphere S̃^2 corresponds a dS_2 spacetime, except for one point which corresponds to a dS_2 spacetime with an infinite straight strut or string. There are other important new features that appear. One expects that the solutions found in this paper are unstable and decay into a slightly nonextreme black hole pair accelerated by a strut or by strings. Moreover, the Euclidean version of these solutions mediates the quantum process of black hole pair creation that accompanies the decay of the dS and AdS spaces

  14. Massless and massive quanta resulting from a mediumlike metric tensor

    Soln, J.

    1985-01-01

    A simple model of the ''primordial'' scalar field theory is presented in which the metric tensor is a generalization of the metric tensor from electrodynamics in a medium. The radiation signal corresponding to the scalar field propagates with a velocity that is generally less than c. This signal can be associated simultaneously with imaginary and real effective (momentum-dependent) masses. The requirement that the imaginary effective mass vanishes, which we take to be the prerequisite for the vacuumlike signal propagation, leads to the ''spontaneous'' splitting of the metric tensor into two distinct metric tensors: one metric tensor gives rise to masslesslike radiation and the other to a massive particle. (author)

  15. Principle of space existence and De Sitter metric

    Mal'tsev, V.K.

    1990-01-01

    The selection principle for the solutions of the Einstein equations suggested in a series of papers implies the existence of space (g ik ≠ 0) only in the presence of matter (T ik ≠0). This selection principle (principle of space existence, in the Markov terminology) implies, in the general case, the absence of the cosmological solution with the De Sitter metric. On the other hand, the De Sitter metric is necessary for describing both inflation and deflation periods of the Universe. It is shown that the De Sitter metric is also allowed by the selection principle under discussion if the metric experiences the evolution into the Friedmann metric

  16. Pragmatic security metrics applying metametrics to information security

    Brotby, W Krag

    2013-01-01

    Other books on information security metrics discuss number theory and statistics in academic terms. Light on mathematics and heavy on utility, PRAGMATIC Security Metrics: Applying Metametrics to Information Security breaks the mold. This is the ultimate how-to-do-it guide for security metrics. Packed with time-saving tips, the book offers easy-to-follow guidance for those struggling with security metrics. Step by step, it clearly explains how to specify, develop, use, and maintain an information security measurement system (a comprehensive suite of metrics) to

  17. Fractional charge resolution in music

    Romero, J.L.; Brady, F.P.; Christie, B.

    1984-01-01

    Recent results obtained with MUSIC (MUltiple Sampling Ionization Chamber) for La and Ar beams at the Bevalac show resolutions better than ΔZ(FWHM) = 0.3 e. These results suggest the use of MUSIC in future ultrarelativistic heavy ion collisions

  18. Individuality evaluation for paper based artifact-metrics using transmitted light image

    Yamakoshi, Manabu; Tanaka, Junichi; Furuie, Makoto; Hirabayashi, Masashi; Matsumoto, Tsutomu

    2008-02-01

    Artifact-metrics is an automated method of authenticating artifacts based on a measurable intrinsic characteristic. Intrinsic characteristics, such as microscopic random patterns made during the manufacturing process, are very difficult to copy. A transmitted light image of the distribution can be used for artifact-metrics, since the fiber distribution of paper is random. Little is known about the individuality of the transmitted light image although it is an important requirement for intrinsic characteristic artifact-metrics. Measuring individuality requires that the intrinsic characteristic of each artifact significantly differs, so having sufficient individuality can make an artifact-metric system highly resistant to brute force attack. Here we investigate the influence of paper category, matching size of sample, and image resolution on the individuality of a transmitted light image of paper through a matching test using those images. More concretely, we evaluate FMR/FNMR curves by calculating similarity scores with matches using correlation coefficients between pairs of scanner input images, and the individuality of paper by way of estimated EER with probabilistic measure through a matching method based on line segments, which can localize the influence of rotation gaps of a sample in the case of large matching size. As a result, we found that the transmitted light image of paper has sufficient individuality.
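
    A minimal sketch of the evaluation loop described above, assuming synthetic data: correlation coefficients between image pairs serve as similarity scores, and the equal error rate (EER) is read off where the false match and false non-match rates cross. The score distributions are invented, and the record's line-segment matching method is not reproduced.

      import numpy as np

      def similarity(img_a, img_b):
          """Correlation-coefficient similarity between two grayscale images."""
          return np.corrcoef(img_a.ravel(), img_b.ravel())[0, 1]

      def eer(genuine, impostor):
          """Equal error rate, where FNMR (genuine misses) meets FMR (impostor hits)."""
          thresholds = np.sort(np.concatenate([genuine, impostor]))
          fnmr = np.array([(genuine < t).mean() for t in thresholds])
          fmr = np.array([(impostor >= t).mean() for t in thresholds])
          i = int(np.argmin(np.abs(fnmr - fmr)))
          return (fnmr[i] + fmr[i]) / 2.0

      rng = np.random.default_rng(1)
      genuine = np.clip(rng.normal(0.90, 0.05, 500), -1, 1)   # same sheet, rescanned
      impostor = np.clip(rng.normal(0.10, 0.10, 500), -1, 1)  # different sheets
      print(eer(genuine, impostor))   # near 0 => high individuality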

  19. Lack of correlation between HRM metrics and symptoms during the manometric protocol.

    Xiao, Yinglian; Kahrilas, Peter J; Nicodème, Frédéric; Lin, Zhiyue; Roman, Sabine; Pandolfino, John E

    2014-04-01

    Although esophageal motor disorders are associated with chest pain and dysphagia, minimal data support a direct relationship between abnormal motor function and symptoms. This study investigated whether high-resolution manometry (HRM) metrics correlate with symptoms. Consecutive HRM patients without previous surgery were enrolled. HRM studies included 10 supine liquid, 5 upright liquid, 2 upright viscous, and 2 upright solid swallows. All patients evaluated their esophageal symptoms for each upright swallow. Symptoms were graded on a 4-point Likert score (0, none; 1, mild; 2, moderate; 3, severe). The individual liquid, viscous or solid upright swallow with the maximal symptom score was selected for analysis in each patient. HRM metrics were compared between groups with and without symptoms during the upright liquid protocol and the provocative protocols separately. A total of 269 patients recorded symptoms during the upright liquid swallows and 72 patients had a swallow symptom score of 1 or greater. Of the 269 patients, 116 recorded symptoms during viscous or solid swallows. HRM metrics were similar between swallows with and without associated symptoms in the upright, viscous, and solid swallows. No correlation was noted between HRM metrics and symptom scores among swallow types. Esophageal symptoms are not related to abnormal motor function defined by HRM during liquid, viscous or solid bolus swallows in the upright position. Other factors beyond circular muscle contraction patterns should be explored as possible causes of symptom generation.

  20. SU-F-T-312: Identifying Distinct Radiation Therapy Plan Classes Through Multi-Dimensional Analysis of Plan Complexity Metrics

    Desai, V; Labby, Z; Culberson, W [University of Wisc Madison, Madison, WI (United States)

    2016-06-15

    Purpose: To determine whether body site-specific treatment plans form unique “plan class” clusters in a multi-dimensional analysis of plan complexity metrics such that a single beam quality correction determined for a representative plan could be universally applied within the “plan class”, thereby increasing the dosimetric accuracy of a detector’s response within a subset of similarly modulated nonstandard deliveries. Methods: We collected 95 clinical volumetric modulated arc therapy (VMAT) plans from four body sites (brain, lung, prostate, and spine). The lung data was further subdivided into SBRT and non-SBRT data for a total of five plan classes. For each control point in each plan, a variety of aperture-based complexity metrics were calculated and stored as unique characteristics of each patient plan. A multiple comparison of means analysis was performed such that every plan class was compared to every other plan class for every complexity metric in order to determine which groups could be considered different from one another. Statistical significance was assessed after correcting for multiple hypothesis testing. Results: Six out of a possible 10 pairwise plan class comparisons were uniquely distinguished based on at least nine out of 14 of the proposed metrics (Brain/Lung, Brain/SBRT lung, Lung/Prostate, Lung/SBRT Lung, Lung/Spine, Prostate/SBRT Lung). Eight out of 14 of the complexity metrics could distinguish at least six out of the possible 10 pairwise plan class comparisons. Conclusion: Aperture-based complexity metrics could prove to be useful tools to quantitatively describe a distinct class of treatment plans. Certain plan-averaged complexity metrics could be considered unique characteristics of a particular plan. A new approach to generating plan-class specific reference (pcsr) fields could be established through a targeted preservation of select complexity metrics or a clustering algorithm that identifies plans exhibiting similar
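
    The multiple-comparison step can be sketched as follows, assuming hypothetical plan-averaged complexity values per class; Welch t-tests with a simplified Holm correction stand in for whatever exact procedure the authors used.

      from itertools import combinations
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      classes = {  # hypothetical plan-averaged aperture-complexity values
          "brain":     rng.normal(0.40, 0.05, 20),
          "lung":      rng.normal(0.55, 0.05, 20),
          "sbrt_lung": rng.normal(0.70, 0.05, 20),
          "prostate":  rng.normal(0.45, 0.05, 20),
          "spine":     rng.normal(0.60, 0.05, 20),
      }
      pairs = list(combinations(classes, 2))
      pvals = [stats.ttest_ind(classes[a], classes[b], equal_var=False).pvalue
               for a, b in pairs]
      # Holm step-down correction (simplified: no monotonicity enforcement)
      m = len(pvals)
      for rank, idx in enumerate(np.argsort(pvals)):
          a, b = pairs[idx]
          print(f"{a} vs {b}: adjusted p = {min(1.0, (m - rank) * pvals[idx]):.2g}")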

  1. Classification in medical images using adaptive metric k-NN

    Chen, C.; Chernoff, K.; Karemore, G.; Lo, P.; Nielsen, M.; Lauze, F.

    2010-03-01

    The performance of the k-nearest neighbors (k-NN) classifier is highly dependent on the distance metric used to identify the k nearest neighbors of the query points. The standard Euclidean distance is commonly used in practice. This paper investigates the performance of the k-NN classifier with respect to different adaptive metrics in the context of medical imaging. We propose using adaptive metrics such that the structure of the data is better described, introducing some unsupervised learning knowledge into k-NN. Four different metrics are investigated: a theoretical metric based on the assumption that images are drawn from the Brownian Image Model (BIM), a normalized metric based on the variance of the data, an empirical metric based on the empirical covariance matrix of the unlabeled data, and an optimized metric obtained by minimizing the classification error. The spectral structure of the empirical covariance also lends itself to Principal Component Analysis (PCA), which yields the subspace metrics. The metrics are evaluated on two data sets: lateral X-rays of the lumbar aortic/spine region, where we use k-NN for performing abdominal aorta calcification detection; and mammograms, where we use k-NN for breast cancer risk assessment. The results show that an appropriate choice of metric can improve classification.
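
    For instance, the empirical metric can be realized as a Mahalanobis distance whose inverse covariance is estimated from the (unlabeled) data; the sketch below is a generic stand-in for that idea, not the paper's implementation.

      import numpy as np

      def knn_predict(X_train, y_train, x, k, VI):
          """Majority vote among the k neighbors nearest in the metric VI."""
          D = X_train - x
          d2 = np.einsum('ij,jk,ik->i', D, VI, D)   # squared Mahalanobis distances
          nearest = np.argsort(d2)[:k]
          return np.bincount(y_train[nearest]).argmax()

      rng = np.random.default_rng(3)
      X = rng.normal(size=(200, 5))
      y = (X[:, 0] + 0.1 * rng.normal(size=200) > 0).astype(int)
      # inverse empirical covariance, regularized for numerical stability
      VI = np.linalg.inv(np.cov(X, rowvar=False) + 1e-6 * np.eye(5))
      print(knn_predict(X, y, X[0], k=5, VI=VI))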

  2. THE ROLE OF ARTICLE LEVEL METRICS IN SCIENTIFIC PUBLISHING

    Vladimir TRAJKOVSKI

    2016-04-01

    Emerging article-level metrics do not exclude traditional citation-based journal metrics, but complement them. Article-level metrics (ALMs) provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, usage statistics, discussions in online comments and social media, social bookmarking, and recommendations. In this editorial, the role of article-level metrics in publishing scientific papers is described. ALMs are rapidly emerging as important tools to quantify how individual articles are being discussed, shared, and used. Data sources depend on the tool, but they include classic citation-based indicators, academic social networks (Mendeley, CiteULike, Delicious) and social media (Facebook, Twitter, blogs, and YouTube). The most popular tools used to apply these new metrics are: Public Library of Science - Article-Level Metrics, Altmetric, Impactstory and Plum Analytics. The Journal Impact Factor (JIF) does not consider impact or influence beyond citation counts, as these counts are reflected only through Thomson Reuters’ Web of Science® database. The JIF provides an indicator related to the journal, but not to a published paper. Thus, altmetrics now become an alternative metrics for performance assessment of individual scientists and their contributed scholarly publications. Macedonian scholarly publishers have to work on implementing article-level metrics in their e-journals. It is the way to increase their visibility and impact in the world of science.

  3. Outsourced Similarity Search on Metric Data Assets

    Yiu, Man Lung; Assent, Ira; Jensen, Christian S.

    2012-01-01

    This paper considers a cloud computing setting in which similarity querying of metric data is outsourced to a service provider. The data is to be revealed only to trusted users, not to the service provider or anyone else. Users query the server for the most similar data objects to a query example. Outsourcing offers the data owner scalability and a low initial investment. The need for privacy may be due to the data being sensitive (e.g., in medicine), valuable (e.g., in astronomy), or otherwise confidential. Given this setting, the paper presents techniques that transform the data prior to supplying it to the service provider for similarity queries on the transformed data.

  4. New Metrics from a Fractional Gravitational Field

    El-Nabulsi, Rami Ahmad

    2017-01-01

    Agop et al. proved in Commun. Theor. Phys. (2008) that a Reissner–Nordstrom type metric is obtained if a gauge gravitational field in a fractal spacetime is constructed by means of concepts of scale relativity. We prove in this short communication that a similar result is obtained if gravity in D spacetime dimensions is fractionalized by means of the Glaeske–Kilbas–Saigo fractional calculus. Besides, non-singular gravitational fields are obtained without using extra dimensions. We present a few examples to show that these gravitational fields hold a number of motivating features in spacetime physics. (paper)

  5. Energy Metrics for State Government Buildings

    Michael, Trevor

    Measuring true progress towards energy conservation goals requires the accurate reporting and accounting of energy consumption. An accurate energy metrics framework is also a critical element for verifiable Greenhouse Gas Inventories. Energy conservation in government can reduce expenditures on energy costs, leaving more funds available for public services. In addition to monetary savings, conserving energy can help to promote energy security, air quality, and a reduction of carbon footprint. With energy consumption/GHG inventories recently produced at the Federal level, state and local governments are beginning to also produce their own energy metrics systems. In recent years, many states have passed laws and executive orders which require their agencies to reduce energy consumption. In June 2008, SC state government established a law to achieve a 20% energy usage reduction in state buildings by 2020. This study examines case studies from other states that have established similar goals to uncover the methods used to establish an energy metrics system. Direct energy consumption in state government primarily comes from buildings and mobile sources. This study will focus exclusively on measuring energy consumption in state buildings. The case studies reveal that many states including SC are having issues gathering the data needed to accurately measure energy consumption across all state buildings. Common problems found include a lack of enforcement and incentives that encourage state agencies to participate in any reporting system. The case studies are aimed at finding the leverage used to gather the needed data. The various approaches at coercing participation will hopefully reveal methods that SC can use to establish the accurate metrics system needed to measure progress towards its 20% by 2020 energy reduction goal. Among the strongest incentives found in the case studies is the potential for monetary savings through energy efficiency. Framing energy conservation

  6. Metric preheating and limitations of linearized gravity

    Bassett, Bruce A.; Tamburini, Fabrizio; Kaiser, David I.; Maartens, Roy

    1999-01-01

    During the preheating era after inflation, resonant amplification of quantum field fluctuations takes place. Recently it has become clear that this must be accompanied by resonant amplification of scalar metric fluctuations, since the two are united by Einstein's equations. Furthermore, this 'metric preheating' enhances particle production, and leads to gravitational rescattering effects even at linear order. In multi-field models with strong preheating (q>>1), metric perturbations are driven non-linear, with the strongest amplification typically on super-Hubble scales (k→0). This amplification is causal, being due to the super-Hubble coherence of the inflaton condensate, and is accompanied by resonant growth of entropy perturbations. The amplification invalidates the use of the linearized Einstein field equations, irrespective of the amount of fine-tuning of the initial conditions. This has serious implications on all scales - from large-angle cosmic microwave background (CMB) anisotropies to primordial black holes. We investigate the (q,k) parameter space in a two-field model, and introduce the time to non-linearity, t nl , as the timescale for the breakdown of the linearized Einstein equations. t nl is a robust indicator of resonance behavior, showing the fine structure in q and k that one expects from a quasi-Floquet system, and we argue that t nl is a suitable generalization of the static Floquet index in an expanding universe. Backreaction effects are expected to shut down the linear resonances, but cannot remove the existing amplification, which threatens the viability of strong preheating when confronted with the CMB. Mode-mode coupling and turbulence tend to re-establish scale invariance, but this process is limited by causality and for small k the primordial scale invariance of the spectrum may be destroyed. We discuss ways to escape the above conclusions, including secondary phases of inflation and preheating solely to fermions. The exclusion principle

  7. Differential geometry bundles, connections, metrics and curvature

    Taubes, Clifford Henry

    2011-01-01

    Bundles, connections, metrics and curvature are the 'lingua franca' of modern differential geometry and theoretical physics. This book will supply a graduate student in mathematics or theoretical physics with the fundamentals of these objects. Many of the tools used in differential topology are introduced and the basic results about differentiable manifolds, smooth maps, differential forms, vector fields, Lie groups, and Grassmanians are all presented here. Other material covered includes the basic theorems about geodesics and Jacobi fields, the classification theorem for flat connections, the

  8. Indefinite metric and regularization of electrodynamics

    Gaudin, M.

    1984-06-01

    The invariant regularization of Pauli and Villars in quantum electrodynamics can be considered as deriving from a local and causal lagrangian theory for spin 1/2 bosons, by introducing an indefinite metric and a condition on the allowed states similar to the Lorentz condition. The consequence is the asymptotic freedom of the photon's propagator. We present a calculation of the effective charge to the fourth order in the coupling as a function of the auxiliary masses, the theory avoiding all mass divergencies to this order [fr]

  9. Metrics for comparing plasma mass filters

    Fetterman, Abraham J.; Fisch, Nathaniel J. [Department of Astrophysical Sciences, Princeton University, Princeton, New Jersey 08540 (United States)

    2011-10-15

    High-throughput mass separation of nuclear waste may be useful for optimal storage, disposal, or environmental remediation. The most dangerous part of nuclear waste is the fission product, which produces most of the heat and medium-term radiation. Plasmas are well-suited to separating nuclear waste because they can separate many different species in a single step. A number of plasma devices have been designed for such mass separation, but there has been no standardized comparison between these devices. We define a standard metric, the separative power per unit volume, and derive it for three different plasma mass filters: the plasma centrifuge, Ohkawa filter, and the magnetic centrifugal mass filter.

  11. Decision Analysis for Metric Selection on a Clinical Quality Scorecard.

    Guth, Rebecca M; Storey, Patricia E; Vitale, Michael; Markan-Aurora, Sumita; Gordon, Randolph; Prevost, Traci Q; Dunagan, Wm Claiborne; Woeltje, Keith F

    2016-09-01

    Clinical quality scorecards are used by health care institutions to monitor clinical performance and drive quality improvement. Because of the rapid proliferation of quality metrics in health care, BJC HealthCare found it increasingly difficult to select the most impactful scorecard metrics while still monitoring metrics for regulatory purposes. A 7-step measure selection process was implemented incorporating Kepner-Tregoe Decision Analysis, which is a systematic process that considers key criteria that must be satisfied in order to make the best decision. The decision analysis process evaluates what metrics will most appropriately fulfill these criteria, as well as identifies potential risks associated with a particular metric in order to identify threats to its implementation. Using this process, a list of 750 potential metrics was narrowed to 25 that were selected for scorecard inclusion. This decision analysis process created a more transparent, reproducible approach for selecting quality metrics for clinical quality scorecards. © The Author(s) 2015.
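
    A Kepner-Tregoe-style selection step reduces, at its core, to weighted scoring of candidates against agreed criteria; the toy sketch below uses invented criteria, weights and scores purely for illustration of that idea, not BJC HealthCare's actual process.

      # Rank candidate quality metrics by weighted criterion scores (all values invented).
      candidates = {
          "CLABSI rate":        {"impact": 9, "actionable": 8, "data_quality": 7},
          "Readmission rate":   {"impact": 8, "actionable": 6, "data_quality": 9},
          "Hand-hygiene audit": {"impact": 5, "actionable": 9, "data_quality": 5},
      }
      weights = {"impact": 0.5, "actionable": 0.3, "data_quality": 0.2}

      def total(scores):
          return sum(weights[c] * scores[c] for c in weights)

      for name, scores in sorted(candidates.items(), key=lambda kv: total(kv[1]), reverse=True):
          print(f"{total(scores):.2f}  {name}")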

  12. Balanced metrics for vector bundles and polarised manifolds

    Garcia Fernandez, Mario; Ross, Julius

    2012-01-01

    We consider a notion of balanced metrics for triples (X, L, E) which depend on a parameter α, where X is a smooth complex manifold with an ample line bundle L and E is a holomorphic vector bundle over X. For generic choice of α, we prove that the limit of a convergent sequence of balanced metrics leads to a Hermitian-Einstein metric on E and a constant scalar curvature Kähler metric in c_1(L). For special values of α, limits of balanced metrics are solutions of a system of coupled equations relating a Hermitian-Einstein metric on E and a Kähler metric in c_1(L). For this, we compute the top two...

  13. Construction of Einstein-Sasaki metrics in D≥7

    Lue, H.; Pope, C. N.; Vazquez-Poritz, J. F.

    2007-01-01

    We construct explicit Einstein-Kaehler metrics in all even dimensions D=2n+4≥6, in terms of a 2n-dimensional Einstein-Kaehler base metric. These are cohomogeneity 2 metrics which have the new feature of including a NUT-type parameter, or gravomagnetic charge, in addition to mass and rotation parameters. Using a canonical construction, these metrics all yield Einstein-Sasaki metrics in dimensions D=2n+5≥7. As is commonly the case in this type of construction, for suitable choices of the free parameters the Einstein-Sasaki metrics can extend smoothly onto complete and nonsingular manifolds, even though the underlying Einstein-Kaehler metric has conical singularities. We discuss some explicit examples in the case of seven-dimensional Einstein-Sasaki spaces. These new spaces can provide supersymmetric backgrounds in M theory, which play a role in the AdS 4 /CFT 3 correspondence

  14. National Metrical Types in Nineteenth Century Art Song

    Leigh VanHandel

    2010-01-01

    William Rothstein’s article “National metrical types in music of the eighteenth and early nineteenth centuries” (2008) proposes a distinction between the metrical habits of 18th and early 19th century German music and those of Italian and French music of that period. Based on theoretical treatises and compositional practice, he outlines these national metrical types and discusses the characteristics of each type. This paper presents the results of a study designed to determine whether, and to what degree, Rothstein’s characterizations of national metrical types are present in 19th century French and German art song. Studying metrical habits in this genre may provide a lens into changing metrical conceptions of 19th century theorists and composers, as well as into the metrical habits and compositional style of individual 19th century French and German art song composers.

  15. Metrication: An economic wake-up call for US industry

    Carver, G. P.

    1993-03-01

    As the international standard of measurement, the metric system is one key to success in the global marketplace. International standards have become an important factor in international economic competition. Non-metric products are becoming increasingly unacceptable in world markets that favor metric products. Procurement is the primary federal tool for encouraging and helping U.S. industry to convert voluntarily to the metric system. Besides the perceived unwillingness of the customer, certain regulatory language, and certain legal definitions in some states, there are no major impediments to conversion of the remaining non-metric industries to metric usage. Instead, there are good reasons for changing, including an opportunity to rethink many industry standards and to take advantage of size standardization. Also, when the remaining industries adopt the metric system, they will come into conformance with federal agencies engaged in similar activities.

  16. Prediction and Migration of Surface-related Resonant Multiples

    Guo, Bowen; Schuster, Gerard T.; Huang, Yunsong

    2015-01-01

    Surface-related resonant multiples can be migrated to achieve better resolution than migrating primary reflections. We now derive the formula for migrating surface-related resonant multiples, and show its super-resolution characteristics. Moreover

  17. Fanpage metrics analysis. "Study on content engagement"

    Rahman, Zoha; Suberamanian, Kumaran; Zanuddin, Hasmah Binti; Moghavvemi, Sedigheh; Nasir, Mohd Hairul Nizam Bin Md

    2016-08-01

    Social media is now recognized as an excellent communicative tool to connect directly with consumers. One of the most significant ways to connect with consumers through these social networking sites (SNS) is to create a Facebook fanpage with brand contents and to place different posts periodically on these fanpages. In measuring social networking sites' effectiveness, corporate houses now analyze metrics such as engagement rate, number of comments/shares and likes on fanpages. It is therefore important for marketers to know the effectiveness of different contents or posts on fanpages in order to increase fan responsiveness and the engagement rate. In this study the authors analyzed a total of 1834 brand posts from 17 international brands of electronics companies. Nine months of data (December 2014 to August 2015), available online on the brands' fan pages, were collected for analysis. An econometric analysis was conducted using EViews 9 to determine the impact of different contents on fanpage engagement. The study picked the four most frequently posted content types to determine their impact on the PTA (People Talking About) metric and fanpage engagement activities.
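
    The kind of model estimated here (engagement regressed on content-type dummies) can be sketched in a few lines; the data below are synthetic, and ordinary least squares via NumPy stands in for the EViews estimation.

      import numpy as np

      rng = np.random.default_rng(4)
      types = rng.integers(0, 4, 500)        # 0=photo, 1=video, 2=link, 3=status
      X = np.column_stack([np.ones(500)] + [(types == t).astype(float) for t in (1, 2, 3)])
      beta_true = np.array([100.0, 40.0, -20.0, -35.0])
      engagement = X @ beta_true + rng.normal(0, 10, 500)

      beta_hat, *_ = np.linalg.lstsq(X, engagement, rcond=None)
      print(beta_hat)   # intercept = photo baseline; others = differences from photo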

  18. Network Community Detection on Metric Space

    Suman Saha

    2015-08-01

    Community detection in a complex network is an important problem of much interest in recent years. In general, a community detection algorithm chooses an objective function and captures the communities of the network by optimizing the objective function, and then one uses various heuristics to solve the optimization problem to extract the interesting communities for the user. In this article, we demonstrate the procedure to transform a graph into points of a metric space and develop methods of community detection with the help of a metric defined for a pair of points. We have also studied and analyzed the community structure of the network therein. The results obtained with our approach are very competitive with most of the well-known algorithms in the literature, and this is justified over a large collection of datasets. On the other hand, it can be observed that the time taken by our algorithm is considerably less than that of other methods, which justifies the theoretical findings.

  19. Value of the Company and Marketing Metrics

    André Luiz Ramos

    2013-12-01

    Thinking marketing strategies from a resource-based perspective (Barney, 1991), proposing assets as either tangible, organizational and human, and from Constantin and Lusch's vision (1994), where strategic resources can be tangible or intangible, internal or external to the firm, raises a research approach on Marketing and Finance. According to Srivastava, Shervani and Fahey (1998) there are 3 types of market assets, which generate firm value. Firm value can be measured by discounted cash flow, committing marketing activities to value generation forecasts (Anderson, 1982; Day, Fahey, 1988; Doyle, 2000; Rust et al., 2004a). The economic value of marketing strategies and marketing metrics is calling strategy researchers' and marketing managers' attention, making clear the need for building a bridge able to articulate marketing and finance from a strategic perspective. This article proposes an analytical framework based on different scientific approaches involving risk and return promoted by marketing strategies, and points out advances concerning both methodological approaches and marketing strategies and their impact on firm metrics and value, using Srinivasan and Hanssens (2009) as a starting point.

  20. Defining a standard metric for electricity savings

    Koomey, Jonathan; Akbari, Hashem; Blumstein, Carl; Brown, Marilyn; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B; Greenberg, Steve

    2010-01-01

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T and D losses. Displacing such a plant for one year would save 3 billion kWh/year at the meter and reduce emissions by 3 million metric tons of CO 2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question-Dr Arthur H Rosenfeld.
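
    The arithmetic behind the proposed unit is easy to verify; the short check below reproduces the roughly 3 billion kWh/year figure from the stated plant parameters.

      # Back-of-the-envelope check of the Rosenfeld unit as defined above.
      capacity_mw, capacity_factor, td_losses = 500, 0.70, 0.07
      hours_per_year = 8766                      # average year, incl. leap days

      generated_kwh = capacity_mw * 1_000 * capacity_factor * hours_per_year
      at_meter_kwh = generated_kwh * (1 - td_losses)
      print(f"{generated_kwh / 1e9:.2f} billion kWh generated")    # ~3.07
      print(f"{at_meter_kwh / 1e9:.2f} billion kWh at the meter")  # ~2.85, i.e. ~3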

  2. Covariant electrodynamics in linear media: Optical metric

    Thompson, Robert T.

    2018-03-01

    While the postulate of covariance of Maxwell's equations for all inertial observers led Einstein to special relativity, it was the further demand of general covariance—form invariance under general coordinate transformations, including between accelerating frames—that led to general relativity. Several lines of inquiry over the past two decades, notably the development of metamaterial-based transformation optics, have spurred a greater interest in the role of geometry and space-time covariance for electrodynamics in ponderable media. I develop a generally covariant, coordinate-free framework for electrodynamics in general dielectric media residing in curved background space-times. In particular, I derive a relation for the spatial medium parameters measured by an arbitrary timelike observer. In terms of those medium parameters I derive an explicit expression for the pseudo-Finslerian optical metric of birefringent media and show how it reduces to a pseudo-Riemannian optical metric for nonbirefringent media. This formulation provides a basis for a unified approach to ray and congruence tracing through media in curved space-times that may smoothly vary among positively refracting, negatively refracting, and vacuum.

  3. Axisymmetric plasma equilibria in a Kerr metric

    Elsässer, Klaus

    2001-10-01

    Plasma equilibria near a rotating black hole are considered within the multifluid description. An isothermal two-component plasma with electrons and positrons or ions is determined by four structure functions and the boundary conditions. These structure functions are the Bernoulli function and the toroidal canonical momentum per mass for each species. The quasi-neutrality assumption (no charge density, no toroidal current) allows one to solve Maxwell's equations analytically for any axisymmetric stationary metric, and to reduce the fluid equations to one single scalar equation for the stream function χ of the positrons or ions, respectively. The basic smallness parameter is the ratio of the skin depth of the electrons to the scale length of the metric and fluid quantities, and, in the case of an electron-ion plasma, the mass ratio m_e/m_i. The χ-equation can be solved by standard methods, and simple solutions for a Kerr geometry are available; they show characteristic flow patterns, depending on the structure functions and the boundary conditions.

  4. Defining a Standard Metric for Electricity Savings

    Brown, Marilyn; Akbari, Hashem; Blumstein, Carl; Koomey, Jonathan; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H.; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B.; Greenberg, Steve; Hafemeister, David; Harris, Jeff; Harvey, Hal; Heitz, Eric; Hirst, Eric; Hummel, Holmes; Kammen, Dan; Kelly, Henry; Laitner, Skip; Levine, Mark; Lovins, Amory; Masters, Gil; McMahon, James E.; Meier, Alan; Messenger, Michael; Millhone, John; Mills, Evan; Nadel, Steve; Nordman, Bruce; Price, Lynn; Romm, Joe; Ross, Marc; Rufo, Michael; Sathaye, Jayant; Schipper, Lee; Schneider, Stephen H; Sweeney, James L; Verdict, Malcolm; Vorsatz, Diana; Wang, Devra; Weinberg, Carl; Wilk, Richard; Wilson, John; Worrell, Ernst

    2009-03-01

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T&D losses. Displacing such a plant for one year would save 3 billion kWh per year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question--Dr. Arthur H Rosenfeld.

  5. Use of different exposure metrics for understanding multi-modal travel injury risk

    S. Ilgin Guler

    2016-08-01

    The objective of this work is to identify characteristics of different metrics of exposure for quantifying multi-modal travel injury risk. First, a discussion on the use of time-based and trip-based metrics for road user exposure to injury risk, considering multiple travel modes, is presented. The main difference between a time-based and a trip-based metric is argued to be that a time-based metric reflects the actual duration of time spent on the road exposed to the travel risks. This proves to be important when considering multiple modes, since different modes typically have different speeds and average travel distances. Next, the use of total number of trips, total time traveled, and mode share (time-based or trip-based) is considered to compare the injury risk of a given mode at different locations. It is argued that using mode share, the safety concept which focuses on absolute numbers can be generalized. Quantitative results are also obtained by combining travel survey data with police collision reports for ten counties in California. The data are aggregated for five modes: (i) cars, (ii) SUVs, (iii) transit riders, (iv) bicyclists, and (v) pedestrians. These aggregated data are used to compare the travel risk of different modes with time-based or trip-based exposure metrics. These quantitative results confirm the initial qualitative discussions. As the penetration of mobile probes for transportation data collection increases, the insights of this study can provide guidance on how to best utilize the added value of such data to better quantify travel injury risk and improve safety.
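
    The difference between trip-based and time-based exposure is easy to see numerically; in the invented example below, one mode looks five times riskier per trip but only about twice as risky per hour, because average trip durations differ.

      # Injury risk per trip vs per hour of travel (all counts are made up).
      modes = {
          "car":     {"injuries": 120, "trips": 2_000_000, "hours": 400_000},
          "bicycle": {"injuries": 45,  "trips": 150_000,   "hours": 75_000},
      }
      for name, m in modes.items():
          per_trip = m["injuries"] / m["trips"] * 1e6   # injuries per million trips
          per_hour = m["injuries"] / m["hours"] * 1e6   # injuries per million hours
          print(f"{name}: {per_trip:.0f}/M trips, {per_hour:.0f}/M hours")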

  6. Popular dispute resolution mechanisms in Ethiopia: Trends ...

    institutions to such initiatives are considered to be good opportunities for ..... where the wrongdoer is cleansed and the victim compensated; a school where ..... coexistence of multiple systems of conflict resolution and the ways of resolving.

  7. Comprehensive Metric Education Project: Implementing Metrics at a District Level Administrative Guide.

    Borelli, Michael L.

    This document details the administrative issues associated with guiding a school district through its metrication efforts. Issues regarding staff development, curriculum development, and the acquisition of instructional resources are considered. Alternative solutions are offered. Finally, an overall implementation strategy is discussed with…

  8. High-resolution seismic imaging of the Sohagpur Gondwana basin ...

    The quality of the high-resolution seismic data depends mainly on the data ..... metric rift geometry. Based on the .... Biswas S K 2003 Regional tectonic framework of the .... Sheth H C, Ray J S, Ray R, Vanderkluysen L, Mahoney J. J, Kumar A ...

  9. Social Media Metrics Importance and Usage Frequency in Latvia

    Ronalds Skulme

    2017-12-01

    Purpose of the article: The purpose of this paper was to explore which social media marketing metrics are most often used and are most important for marketing experts in Latvia and can be used to evaluate marketing campaign effectiveness. Methodology/methods: In order to achieve the aim of this paper several theoretical and practical research methods were used, such as theoretical literature analysis, surveying and grouping. First of all, theoretical research about social media metrics was conducted. The authors collected information about social media metric grouping methods and the most frequently mentioned social media metrics in the literature. The collected information was used as the foundation for the expert surveys. The expert surveys were used to collect information from Latvian marketing professionals to determine which social media metrics are used most often and which are most important in Latvia. Scientific aim: The scientific aim of this paper was to identify whether the importance of social media metrics varies depending on the consumer purchase decision stage. Findings: Information about the most important and most often used social media marketing metrics in Latvia was collected. A new social media grouping framework is proposed. Conclusions: The main conclusion is that the importance and the usage frequency of social media metrics change depending on the consumer purchase decision stage the metric is used to evaluate.

  10. Topological and metric properties of Henon-type strange attractors

    Cvitanovic, P.; Gunaratne, G.H.; Procaccia, I.

    1988-01-01

    We use the set of all periodic points of Henon-type mappings to develop a theory of the topological and metric properties of their attractors. The topology of a Henon-type attractor is conveniently represented by a two-dimensional symbol plane, with the allowed and disallowed orbits cleanly separated by the ''pruning front.'' The pruning front is a function discontinuous on every binary rational number, but for maps with finite dissipation |b|<1, it is well approximated by a few steps, or, in the symbolic dynamics language, by a finite grammar. Thus equipped with the complete list of allowed periodic points, we reconstruct (to resolution of order b^n) the physical attractor by piecing together the linearized neighborhoods of all periodic points of cycle length n. We use this representation to compute the singularity spectrum f(α). The description in terms of periodic points works very well in the ''hyperbolic phase,'' for α larger than some α_c, where α_c is the value of α corresponding to the (conjectured) phase transition
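
    A toy version of the underlying construction: iterate the Henon map, discard the transient, and flag points that nearly return after n steps as candidates for period-n orbits. This is only a numerical illustration, not the symbolic-dynamics machinery of the paper.

      import numpy as np

      def henon(x, y, a=1.4, b=0.3):
          return 1 - a * x * x + y, b * x

      # build the attractor, discarding the transient
      pts = []
      x, y = 0.0, 0.0
      for i in range(12000):
          x, y = henon(x, y)
          if i > 2000:
              pts.append((x, y))
      pts = np.array(pts)

      # crude search for near-period-n returns, |f^n(p) - p| < eps
      n, eps = 6, 1e-3
      close = np.linalg.norm(pts[n:] - pts[:-n], axis=1) < eps
      print(close.sum(), "candidate points near period-6 orbits")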

  11. Metric Accuracy Evaluation of Dense Matching Algorithms in Archeological Applications

    C. Re

    2011-12-01

    In the cultural heritage field, the recording and documentation of small and medium size objects with very detailed Digital Surface Models (DSM) is readily possible through the use of high resolution and high precision triangulation laser scanners. 3D surface recording of archaeological objects can be easily achieved in museums; however, this type of record can be quite expensive. In many cases photogrammetry can provide a viable alternative for the generation of DSMs. The photogrammetric procedure has some benefits with respect to laser survey. The research described in this paper sets out to verify the reconstruction accuracy of DSMs of some archaeological artifacts obtained by photogrammetric survey. The experimentation has been carried out on some objects preserved in the Petrie Museum of Egyptian Archaeology at University College London (UCL). DSMs produced by two photogrammetric software packages are compared with the digital 3D model obtained by a state of the art triangulation color laser scanner. Intercomparison between the generated DSMs has allowed an evaluation of the metric accuracy of the photogrammetric approach applied to archaeological documentation and of the precision performances of the two software packages.

  12. Cloud-based Computing and Applications of New Snow Metrics for Societal Benefit

    Nolin, A. W.; Sproles, E. A.; Crumley, R. L.; Wilson, A.; Mar, E.; van de Kerk, M.; Prugh, L.

    2017-12-01

    Seasonal and interannual variability in snow cover affects socio-environmental systems including water resources, forest ecology, freshwater and terrestrial habitat, and winter recreation. We have developed two new seasonal snow metrics: snow cover frequency (SCF) and snow disappearance date (SDD). These metrics are calculated at 500-m resolution using NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) snow cover data (MOD10A1). SCF is the number of times snow is observed in a pixel over the user-defined observation period. SDD is the last date of observed snow in a water year. These pixel-level metrics are calculated rapidly and globally in the Google Earth Engine cloud-based environment. SCF and SDD can be interactively visualized in a map-based interface, allowing users to explore spatial and temporal snowcover patterns from 2000-present. These metrics are especially valuable in regions where snow data are sparse or non-existent. We have used these metrics in several ongoing projects. When SCF was linked with a simple hydrologic model in the La Laguna watershed in northern Chile, it successfully predicted summer low flows with a Nash-Sutcliffe value of 0.86. SCF has also been used to help explain changes in Dall sheep populations in Alaska where sheep populations are negatively impacted by late snow cover and low snowline elevation during the spring lambing season. In forest management, SCF and SDD appear to be valuable predictors of post-wildfire vegetation growth. We see a positive relationship between winter SCF and subsequent summer greening for several years post-fire. For western US winter recreation, we are exploring trends in SDD and SCF for regions where snow sports are economically important. In a world with declining snowpacks and increasing uncertainty, these metrics extend across elevations and fill data gaps to provide valuable information for decision-making. SCF and SDD are being produced so that anyone with Internet access and a Google
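
    The two metrics have direct one-pixel analogues; the NumPy sketch below stands in for the Google Earth Engine implementation, with a fabricated boolean series of cloud-free snow observations over one water year.

      import numpy as np

      rng = np.random.default_rng(5)
      days = np.arange(1, 366)                        # day of water year
      snow = (days < 180) & (rng.random(365) > 0.2)   # toy snow observations

      scf = snow.mean() * 100                         # % of observations with snow
      sdd = days[snow][-1] if snow.any() else None    # last observed snow date
      print(f"SCF = {scf:.0f}%, SDD = day {sdd}")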

  13. Measurable Control System Security through Ideal Driven Technical Metrics

    Miles McQueen; Wayne Boyer; Sean McBride; Marie Farrar; Zachary Tudor

    2008-01-01

    The Department of Homeland Security National Cyber Security Division supported development of a small set of security ideals as a framework to establish measurable control systems security. Based on these ideals, a draft set of proposed technical metrics was developed to allow control systems owner-operators to track improvements or degradations in their individual control systems security posture. The technical metrics development effort included review and evaluation of over thirty metrics-related documents. On the basis of complexity, ambiguity, or misleading and distorting effects, the metrics identified during the reviews were determined to be weaker than necessary to aid defense against the myriad threats posed by cyber-terrorism to human safety, as well as to economic prosperity. Using the results of our metrics review and the set of security ideals as a starting point for metrics development, we identified thirteen potential technical metrics - with at least one metric supporting each ideal. Two case study applications of the ideals and thirteen metrics to control systems were then performed to establish potential difficulties in applying both the ideals and the metrics. The case studies resulted in no changes to the ideals, and only a few deletions and refinements to the thirteen potential metrics. This led to a final proposed set of ten core technical metrics. To further validate the security ideals, the modifications made to the original thirteen potential metrics, and the final proposed set of ten core metrics, seven separate control systems security assessments performed over the past three years were reviewed for findings and recommended mitigations. These findings and mitigations were then mapped to the security ideals and metrics to assess gaps in their coverage. The mappings indicated that there are no gaps in the security ideals and that the ten core technical metrics provide significant coverage of standard security issues with 87% coverage. Based

  14. Vanishing-Overhead Linear-Scaling Random Phase Approximation by Cholesky Decomposition and an Attenuated Coulomb-Metric.

    Luenser, Arne; Schurkus, Henry F; Ochsenfeld, Christian

    2017-04-11

    A reformulation of the random phase approximation within the resolution-of-the-identity (RI) scheme is presented, that is competitive to canonical molecular orbital RI-RPA already for small- to medium-sized molecules. For electronically sparse systems drastic speedups due to the reduced scaling behavior compared to the molecular orbital formulation are demonstrated. Our reformulation is based on two ideas, which are independently useful: First, a Cholesky decomposition of density matrices that reduces the scaling with basis set size for a fixed-size molecule by one order, leading to massive performance improvements. Second, replacement of the overlap RI metric used in the original AO-RPA by an attenuated Coulomb metric. Accuracy is significantly improved compared to the overlap metric, while locality and sparsity of the integrals are retained, as is the effective linear scaling behavior.
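
    The first ingredient, a rank-revealing Cholesky decomposition of a (near-)low-rank symmetric positive semidefinite matrix, can be sketched generically as below; the toy matrix is only a stand-in for an AO density matrix, and the attenuated-metric part of the method is not shown.

      import numpy as np

      def pivoted_cholesky(A, tol=1e-10):
          """Pivoted Cholesky of a PSD matrix: returns L with A ~= L @ L.T."""
          A = A.copy()
          n = A.shape[0]
          L = np.zeros((n, n))
          piv = list(range(n))
          d = np.diag(A).copy()                # residual diagonal
          k = 0
          while k < n and d[piv[k:]].max() > tol:
              j = k + int(np.argmax(d[piv[k:]]))
              piv[k], piv[j] = piv[j], piv[k]  # bring largest pivot forward
              p = piv[k]
              L[p, k] = np.sqrt(d[p])
              rest = piv[k + 1:]
              L[rest, k] = (A[rest, p] - L[rest, :k] @ L[p, :k]) / L[p, k]
              d[rest] = d[rest] - L[rest, k] ** 2
              k += 1
          return L[:, :k]                      # rank revealed by early stop

      rng = np.random.default_rng(6)
      C = rng.normal(size=(8, 3))
      P = C @ C.T                              # rank-3 PSD "density-like" matrix
      L = pivoted_cholesky(P)
      print(L.shape, np.allclose(P, L @ L.T))  # (8, 3) True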

  15. Comparison of luminance based metrics in different lighting conditions

    Wienold, J.; Kuhn, T.E.; Christoffersen, J.

    In this study, we evaluate established and newly developed metrics for predicting glare using data from three different research studies. The evaluation covers two different targets: 1. How well does the user’s perception of glare magnitude correlate to the prediction of the glare metrics? 2. How well do the glare metrics describe the subjects’ disturbance by glare? We applied Spearman correlations, logistic regressions and an accuracy evaluation, based on an ROC-analysis. The results show that five of the twelve investigated metrics fail at least one of the statistical tests. The other seven metrics CGI, modified DGI, DGP, Ev, average luminance of the image Lavg, UGP and UGR pass all statistical tests. DGP, CGI, DGI_mod and UGP have the largest AUC and might be slightly more robust. The accuracy of the predictions of the aforementioned seven metrics for the disturbance by glare lies
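
    The statistical pipeline named above (Spearman correlation, logistic regression, ROC/AUC) can be sketched with standard SciPy/scikit-learn calls; the DGP values and subject votes below are simulated, not the studies' data.

      import numpy as np
      from scipy.stats import spearmanr
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(7)
      dgp = rng.uniform(0.15, 0.55, 300)               # hypothetical DGP values
      p_disturbed = 1 / (1 + np.exp(-(dgp - 0.35) * 20))
      disturbed = rng.random(300) < p_disturbed        # simulated subject votes

      rho, _ = spearmanr(dgp, disturbed)
      auc = roc_auc_score(disturbed, dgp)
      model = LogisticRegression().fit(dgp.reshape(-1, 1), disturbed)
      print(f"Spearman rho = {rho:.2f}, AUC = {auc:.2f}, "
            f"logit slope = {model.coef_[0, 0]:.1f}")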

  16. Development of Technology Transfer Economic Growth Metrics

    Mastrangelo, Christina M.

    1998-01-01

    The primary objective of this project is to determine the feasibility of producing technology transfer metrics that answer the question: Do NASA/MSFC technical assistance activities impact economic growth? The data for this project resides in a 7800-record database maintained by Tec-Masters, Incorporated. The technology assistance data results from survey responses from companies and individuals who have interacted with NASA via a Technology Transfer Agreement, or TTA. The goal of this project was to determine if the existing data could provide indications of increased wealth. This work demonstrates that there is evidence that companies that used NASA technology transfer have a higher job growth rate than the rest of the economy. It also shows that the jobs being supported are jobs in higher wage SIC codes, and this indicates improvements in personal wealth. Finally, this work suggests that with correct data, the wealth issue may be addressed.

  17. MESUR metrics from scholarly usage of resources

    CERN. Geneva; Van de Sompel, Herbert

    2007-01-01

    Usage data is increasingly regarded as a valuable resource in the assessment of scholarly communication items. However, the development of quantitative, usage-based indicators of scholarly impact is still in its infancy. The Digital Library Research & Prototyping Team at the Los Alamos National Laboratory's Research Library has therefore started a program to expand the set of usage-based tools for the assessment of scholarly communication items. The two-year MESUR project, funded by the Andrew W. Mellon Foundation, aims to define and validate a range of usage-based impact metrics, and issue guidelines with regard to their characteristics and proper application. The MESUR project is constructing a large-scale semantic model of the scholarly community that seamlessly integrates a wide range of bibliographic, citation and usage data. Functioning as a reference data set, this model is analyzed to characterize the intricate networks of typed relationships that exist in the scholarly community. The resulting c...

  18. Einstein metrics and Brans-Dicke superfields

    Marques, S.

    1988-01-01

    A space conformal to the Einstein space-time is obtained here, making the transition from an internal bosonic space, constructed with the Majorana constant spinors in the Majorana representation, to a bosonic ''superspace,'' through the use of Einstein vierbeins. These spaces are related to a Grassmann space constructed with the Majorana spinors referred to above, where the ''metric'' is a function of internal bosonic coordinates. The conformal function is a scale factor in the zone of gravitational radiation. A conformal function dependent on space-time coordinates can be constructed in that region when we introduce Majorana spinors which are functions of those coordinates. With this we obtain a scalar field of Brans-Dicke type. 11 refs

  19. Analytical Cost Metrics : Days of Future Past

    Prajapati, Nirmal [Colorado State Univ., Fort Collins, CO (United States); Rajopadhye, Sanjay [Colorado State Univ., Fort Collins, CO (United States); Djidjev, Hristo Nikolov [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-20

    As we move towards the exascale era, the new architectures must be capable of running the massive computational problems efficiently. Scientists and researchers are continuously investing in tuning the performance of extreme-scale computational problems. These problems arise in almost all areas of computing, ranging from big data analytics, artificial intelligence, search, machine learning, virtual/augmented reality, computer vision, image/signal processing to computational science and bioinformatics. With Moore’s law driving the evolution of hardware platforms towards exascale, the dominant performance metric (time efficiency) has now expanded to also incorporate power/energy efficiency. Therefore the major challenge that we face in computing systems research is: “how to solve massive-scale computational problems in the most time/power/energy efficient manner?”

  20. Clean Cities 2013 Annual Metrics Report

    Johnson, C.; Singer, M.

    2014-10-01

    Each year, the U.S. Department of Energy asks its Clean Cities program coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterize the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction (IR) initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this 2013 Annual Metrics Report.

  1. Clean Cities 2014 Annual Metrics Report

    Johnson, Caley [National Renewable Energy Lab. (NREL), Golden, CO (United States); Singer, Mark [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-12-22

    Each year, the U.S. Department of Energy asks its Clean Cities program coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterize the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction (IR) initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this 2014 Annual Metrics Report.

  2. Outsourced similarity search on metric data assets

    Yiu, Man Lung

    2012-02-01

    This paper considers a cloud computing setting in which similarity querying of metric data is outsourced to a service provider. The data is to be revealed only to trusted users, not to the service provider or anyone else. Users query the server for the data objects most similar to a query example. Outsourcing offers the data owner scalability and a low initial investment. The need for privacy may be due to the data being sensitive (e.g., in medicine), valuable (e.g., in astronomy), or otherwise confidential. Given this setting, the paper presents techniques that transform the data prior to supplying it to the service provider, for similarity queries on the transformed data. Our techniques provide interesting trade-offs between query cost and accuracy. They are then further extended to offer an intuitive privacy guarantee. Empirical studies with real data demonstrate that the techniques are capable of offering privacy while enabling efficient and accurate processing of similarity queries.
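
    As a rough illustration of the setting (not the authors' actual scheme), the sketch below hides metric data behind a secret isometry: a random rotation plus translation preserves all Euclidean distances, so an untrusted server could rank the transformed objects by similarity without ever seeing the originals.

        # Hypothetical distance-preserving transformation for outsourcing.
        import numpy as np

        rng = np.random.default_rng(7)
        d = 4
        # Secret key: a random orthogonal matrix (rotation) and a translation.
        Q, _ = np.linalg.qr(rng.normal(size=(d, d)))
        t = rng.normal(size=d)

        def encode(points: np.ndarray) -> np.ndarray:
            """Apply the secret isometry before shipping data to the server."""
            return points @ Q.T + t

        data = rng.normal(size=(100, d))
        query = rng.normal(size=d)

        # Distances are identical before and after encoding (up to float error),
        # so the server's similarity ranking matches the ranking on raw data.
        orig = np.linalg.norm(data - query, axis=1)
        enc = np.linalg.norm(encode(data) - encode(query[None, :]), axis=1)
        assert np.allclose(orig, enc)
        print("nearest neighbor:", int(np.argmin(enc)))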

  3. Special metrics and group actions in geometry

    Fino, Anna; Musso, Emilio; Podestà, Fabio; Vezzoni, Luigi

    2017-01-01

    The volume is a follow-up to the INdAM meeting “Special metrics and quaternionic geometry” held in Rome in November 2015. It offers a panoramic view of a selection of cutting-edge topics in differential geometry, including 4-manifolds, quaternionic and octonionic geometry, twistor spaces, harmonic maps, spinors, complex and conformal geometry, homogeneous spaces and nilmanifolds, special geometries in dimensions 5–8, gauge theory, symplectic and toric manifolds, exceptional holonomy and integrable systems. The workshop was held in honor of Simon Salamon, a leading international scholar at the forefront of academic research who has made significant contributions to all these subjects. The articles published here represent a compelling testimony to Salamon’s profound and longstanding impact on the mathematical community. Target readership includes graduate students and researchers working in Riemannian and complex geometry, Lie theory and mathematical physics.

  4. Quasi-metrics, midpoints and applications

    Valero, O.

    2017-07-01

    In applied sciences, the scientific community simultaneously uses different kinds of information coming from several sources in order to infer a conclusion or working decision. In the literature there are many techniques for merging the information and providing, hence, meaningful fused data. In most practical cases such fusion methods are based on aggregation operators on some numerical values, i.e. the aim of the fusion process is to obtain a representative number from a finite sequence of numerical data. In the aforementioned cases, the input data presents some kind of imprecision and for this reason it is represented as fuzzy sets. Moreover, in such problems the comparisons between the numerical values that represent the information described by the fuzzy sets become necessary. The aforementioned comparisons are made by means of a distance defined on fuzzy sets. Thus, the numerical operators aggregating distances between fuzzy sets as incoming data play a central role in applied problems. Recently, J.J. Nieto and A. Torres gave some applications of the aggregation of distances on fuzzy sets to the study of real medical data in [Nieto]. These applications are based on the notion of the segment joining two given fuzzy sets and on the notion of the set of midpoints between fuzzy sets. A few results obtained by Nieto and Torres have been generalized in turn by Casasnovas and Rosselló in [Casas, Casas2]. Nowadays, quasi-metrics provide efficient tools in some fields of computer science and in bioinformatics. Motivated by these facts, a study of segments joining two fuzzy sets and of midpoints between fuzzy sets, when the measure used for comparisons is a quasi-metric, has been made in [Casas3, SebVal2013, TiradoValero]. (Author)
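
    A quasi-metric is a distance that drops the symmetry axiom. The sketch below is a toy example on real numbers rather than fuzzy sets, showing the asymmetry and a midpoint test in the spirit of the works cited above; the names q and is_midpoint are purely illustrative.

        # A simple quasi-metric on the reals and a midpoint test.

        def q(a: float, b: float) -> float:
            """q(a, b) = max(b - a, 0): the cost of moving 'up' from a to b.
            It satisfies the quasi-metric axioms, but q(a, b) != q(b, a) in general."""
            return max(b - a, 0.0)

        def is_midpoint(a: float, m: float, b: float, tol: float = 1e-12) -> bool:
            """m is a midpoint of a and b when q(a, m) = q(m, b) = q(a, b) / 2."""
            half = q(a, b) / 2
            return abs(q(a, m) - half) <= tol and abs(q(m, b) - half) <= tol

        print(q(1.0, 4.0), q(4.0, 1.0))    # 3.0 vs 0.0: the asymmetry
        print(is_midpoint(1.0, 2.5, 4.0))  # True: 2.5 lies halfway 'upward'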

  5. Analytic convergence of harmonic metrics for parabolic Higgs bundles

    Kim, Semin; Wilkin, Graeme

    2018-04-01

    In this paper we investigate the moduli space of parabolic Higgs bundles over a punctured Riemann surface with varying weights at the punctures. We show that the harmonic metric depends analytically on the weights and the stable Higgs bundle. This gives a Higgs bundle generalisation of a theorem of McOwen on the existence of hyperbolic cone metrics on a punctured surface within a given conformal class, and a generalisation of a theorem of Judge on the analytic parametrisation of these metrics.

  6. Exact solutions of strong gravity in generalized metrics

    Hojman, R.; Smailagic, A.

    1981-05-01

    We consider classical solutions for the strong gravity theory of Salam and Strathdee in a wider class of metrics with positive, zero and negative curvature. It turns out that such solutions exist and their relevance for quark confinement is explored. Only metrics with positive curvature (spherical symmetry) give a confining potential in a simple picture of the scalar hadron. This supports the idea of describing the hadron as a closed microuniverse of the strong metric. (author)

  7. An accurate metric for the spacetime around neutron stars

    Pappas, George

    2016-01-01

    The problem of having an accurate description of the spacetime around neutron stars is of great astrophysical interest. For astrophysical applications, one needs a metric that captures all the properties of the spacetime around a neutron star. Furthermore, an accurate, appropriately parameterised metric, i.e., a metric given in terms of parameters that are directly related to the physical structure of the neutron star, could be used to solve the inverse problem, which is to inf...

  8. Problems in Systematic Application of Software Metrics and Possible Solution

    Rakic, Gordana; Budimac, Zoran

    2013-01-01

    Systematic application of software metric techniques can lead to significant improvements in the quality of a final software product. However, there is still an evident lack of wider utilization of software metrics techniques and tools, for many reasons. In this paper we investigate some limitations of contemporary software metrics tools and then propose the construction of a new tool that would solve some of these problems. We describe the promising prototype, its internal structure, and then f...

  9. Two-dimensional manifolds with metrics of revolution

    Sabitov, I Kh

    2000-01-01

    This is a study of the topological and metric structure of two-dimensional manifolds with a metric that is locally a metric of revolution. In the case of compact manifolds this problem can be thoroughly investigated, and in particular it is explained why there are no closed analytic surfaces of revolution in R^3 other than a sphere and a torus (moreover, in the smoothness class C^∞ such surfaces, understood in a certain generalized sense, exist in any topological class)

  10. A software quality model and metrics for risk assessment

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the basis for a discussion of risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric, along with their usability and applicability, are discussed.

  11. Chaos of discrete dynamical systems in complete metric spaces

    Shi Yuming; Chen Guanrong

    2004-01-01

    This paper is concerned with chaos of discrete dynamical systems in complete metric spaces. Discrete dynamical systems governed by continuous maps in general complete metric spaces are first discussed, and two criteria of chaos are then established. As a special case, two corresponding criteria of chaos for discrete dynamical systems in compact subsets of metric spaces are obtained. These results extend and improve the existing results on chaos in finite-dimensional Euclidean spaces

  12. Singular value decomposition metrics show limitations of detector design in diffuse fluorescence tomography.

    Leblond, Frederic; Tichauer, Kenneth M; Pogue, Brian W

    2010-11-29

    The spatial resolution and recovered contrast of images reconstructed from diffuse fluorescence tomography data are limited by the high scattering properties of light propagation in biological tissue. As a result, the image reconstruction process can be exceedingly vulnerable to inaccurate prior knowledge of tissue optical properties and stochastic noise. In light of these limitations, the optimal source-detector geometry for a fluorescence tomography system is non-trivial, requiring analytical methods to guide design. Analysis of the singular value decomposition of the matrix to be inverted for image reconstruction is one potential approach, providing key quantitative metrics, such as singular image mode spatial resolution and singular data mode frequency as a function of singular mode. In the present study, these metrics are used to analyze the effects of different sources of noise and model errors as related to image quality in the form of spatial resolution and contrast recovery. The image quality is demonstrated to be inherently noise-limited even when detection geometries were increased in complexity to allow maximal tissue sampling, suggesting that detection noise characteristics outweigh detection geometry for achieving optimal reconstructions.
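
    The singular-value analysis described above can be sketched in a few lines: decompose a sensitivity (Jacobian) matrix and count how many singular modes survive an assumed detection-noise floor. The matrix below is randomly generated for illustration; in a real system it would come from the forward model of light propagation in tissue.

        # Singular value decomposition of a (hypothetical) sensitivity matrix.
        import numpy as np

        rng = np.random.default_rng(0)
        n_measurements, n_voxels = 64, 256
        # Columns decay in magnitude to mimic depth-dependent sensitivity loss.
        J = rng.normal(size=(n_measurements, n_voxels)) * np.logspace(0, -3, n_voxels)

        U, s, Vt = np.linalg.svd(J, full_matrices=False)

        noise_floor = 1e-2 * s[0]  # assumed relative detection-noise level
        usable = int(np.sum(s > noise_floor))
        print(f"{usable} of {len(s)} singular modes rise above the noise floor")
        # Rows of Vt for the usable modes are the 'singular image modes'; their
        # spatial frequency content bounds the achievable image resolution.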

  13. Super-resolution reconstruction in frequency, image, and wavelet domains to reduce through-plane partial voluming in MRI

    Gholipour, Ali, E-mail: ali.gholipour@childrens.harvard.edu; Afacan, Onur; Scherrer, Benoit; Prabhu, Sanjay P.; Warfield, Simon K. [Department of Radiology, Boston Children’s Hospital, Boston, Massachusetts 02115 and Harvard Medical School, Boston, Massachusetts 02115 (United States); Aganj, Iman [Radiology Department, Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Boston, Massachusetts 02129 and Harvard Medical School, Boston, Massachusetts 02115 (United States); Sahin, Mustafa [Department of Neurology, Boston Children’s Hospital, Boston, Massachusetts 02115 and Harvard Medical School, Boston, Massachusetts 02115 (United States)

    2015-12-15

    Purpose: To compare and evaluate the use of super-resolution reconstruction (SRR), in the frequency, image, and wavelet domains, to reduce through-plane partial voluming effects in magnetic resonance imaging. Methods: The reconstruction of an isotropic high-resolution image from multiple thick-slice scans was investigated through techniques in the frequency, image, and wavelet domains. Experiments were carried out with a thick-slice T2-weighted fast spin echo sequence on the American College of Radiology (ACR) MRI phantom, where the reconstructed images were compared to a reference high-resolution scan using the peak signal-to-noise ratio (PSNR), the structural similarity image metric (SSIM), mutual information (MI), and the mean absolute error (MAE) of image intensity profiles. The application of super-resolution reconstruction was then examined in retrospective processing of clinical neuroimages of ten pediatric patients with tuberous sclerosis complex (TSC) to reduce through-plane partial voluming for improved 3D delineation and visualization of thin radial bands of white matter abnormalities. Results: Quantitative evaluation shows improvements in all evaluation metrics through super-resolution reconstruction in the frequency, image, and wavelet domains, with the highest values obtained from SRR in the image domain. The metric values for image-domain SRR versus the original axial, coronal, and sagittal images were PSNR = 32.26 vs 32.22, 32.16, 30.65; SSIM = 0.931 vs 0.922, 0.924, 0.918; MI = 0.871 vs 0.842, 0.844, 0.831; and MAE = 5.38 vs 7.34, 7.06, 6.19. All similarity metrics showed high correlations with expert ranking of image resolution, with MI showing the highest correlation at 0.943. Qualitative assessment of the neuroimages of ten TSC patients through in-plane and out-of-plane visualization of structures showed the extent of the partial voluming effect in a real clinical scenario and its reduction using SRR. Blinded expert evaluation of image resolution in
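
    Two of the evaluation metrics named above, PSNR and SSIM, can be computed as in the sketch below, assuming scikit-image is installed; the image arrays are synthetic stand-ins for a reconstructed slice and a reference scan.

        # Compute PSNR and SSIM between a reference and a reconstructed image.
        import numpy as np
        from skimage.metrics import peak_signal_noise_ratio, structural_similarity

        rng = np.random.default_rng(1)
        reference = rng.random((128, 128))
        # Stand-in 'reconstruction': the reference plus a little noise.
        reconstructed = np.clip(reference + rng.normal(0, 0.02, reference.shape), 0, 1)

        psnr = peak_signal_noise_ratio(reference, reconstructed, data_range=1.0)
        ssim = structural_similarity(reference, reconstructed, data_range=1.0)
        print(f"PSNR = {psnr:.2f} dB, SSIM = {ssim:.3f}")
        # The study above also used mutual information and the MAE of intensity
        # profiles, comparing each SRR result against a high-resolution scan.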

  15. Benefits and Impact of Joint Metric of AOA/RSS/TOF on Indoor Localization Error

    Qing Jiang

    2016-10-01

    The emerging techniques in the Fifth Generation (5G) communication system, such as millimeter-wave (mmWave) and massive Multiple Input Multiple Output (MIMO), make it possible to measure the Angle-Of-Arrival (AOA), Received Signal Strength (RSS) and Time-Of-Flight (TOF) using various types of mobile devices. At the same time, there is significant interest in high-precision localization techniques based on the joint metric of AOA/RSS/TOF, which make it possible to overcome the drawbacks of single-metric localization. Motivated by this concern, we rely on the Cramer–Rao Lower Bound (CRLB) to analyze the localization errors of RSS/AOA, RSS/TOF, AOA/TOF and the Joint Metric of AOA/RSS/TOF (JMART)-based localization. The error bounds derived in this paper can serve as benchmarking results to evaluate indoor localization performance. Finally, extensive simulations are conducted to support our claims.
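
    The CRLB analysis can be illustrated for the RSS-only case: under the standard log-distance path-loss model, the Fisher information at a target position follows from the gradients of the mean RSS at each anchor. The anchor layout, path-loss exponent, and shadowing level below are assumed values for illustration, not figures from the paper.

        # CRLB for RSS-only localization under a log-distance path-loss model.
        import numpy as np

        anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
        target = np.array([3.0, 4.0])
        n_pl = 3.0      # path-loss exponent (assumed)
        sigma_db = 4.0  # log-normal shadowing std dev in dB (assumed)

        # Gradient of the mean RSS w.r.t. the target position p for anchor a_i:
        # d(mu_i)/dp = -(10 n / ln 10) * (p - a_i) / ||p - a_i||^2
        diff = target - anchors
        d2 = np.sum(diff**2, axis=1, keepdims=True)
        G = -(10.0 * n_pl / np.log(10.0)) * diff / d2

        fim = G.T @ G / sigma_db**2          # Fisher information matrix
        crlb = np.trace(np.linalg.inv(fim))  # lower bound on position MSE (m^2)
        print(f"RSS-only RMSE bound: {np.sqrt(crlb):.2f} m")
        # Adding AOA/TOF measurements contributes further Fisher information,
        # shrinking the bound -- the benefit of the joint metric studied above.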

  16. Application of Super-Resolution Convolutional Neural Network for Enhancing Image Resolution in Chest CT.

    Umehara, Kensuke; Ota, Junko; Ishida, Takayuki

    2017-10-18

    In this study, the super-resolution convolutional neural network (SRCNN) scheme, an emerging deep-learning-based method for enhancing image resolution, was applied to chest CT images and evaluated as a post-processing approach. For evaluation, 89 chest CT cases were sampled from The Cancer Imaging Archive. The 89 CT cases were divided randomly into 45 training cases and 44 external test cases. The SRCNN was trained using the training dataset. With the trained SRCNN, a high-resolution image was reconstructed from a low-resolution image, which was down-sampled from an original test image. For quantitative evaluation, two image quality metrics were measured and compared to those of conventional linear interpolation methods. The image restoration quality of the SRCNN scheme was significantly higher than that of the linear interpolation methods (p < 0.001 or p < 0.05). The high-resolution image reconstructed by the SRCNN scheme was highly restored and comparable to the original reference image, in particular for a ×2 magnification. These results indicate that the SRCNN scheme significantly outperforms the linear interpolation methods for enhancing image resolution in chest CT images. The results also suggest that SRCNN may become a potential solution for generating high-resolution CT images from standard CT images.
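
    For readers unfamiliar with SRCNN, the sketch below reproduces its generic three-layer structure (patch extraction, nonlinear mapping, reconstruction) in PyTorch, using the 9-1-5 kernel sizes and 64/32 channel widths of the original SRCNN paper; the CT-specific training setup of this study is not reproduced here.

        # Generic three-layer SRCNN, assuming PyTorch is available.
        import torch
        import torch.nn as nn

        class SRCNN(nn.Module):
            def __init__(self) -> None:
                super().__init__()
                self.body = nn.Sequential(
                    nn.Conv2d(1, 64, kernel_size=9, padding=4),  # patch extraction
                    nn.ReLU(inplace=True),
                    nn.Conv2d(64, 32, kernel_size=1),            # nonlinear mapping
                    nn.ReLU(inplace=True),
                    nn.Conv2d(32, 1, kernel_size=5, padding=2),  # reconstruction
                )

            def forward(self, x: torch.Tensor) -> torch.Tensor:
                # Input: a low-resolution slice already upscaled (e.g., bicubically)
                # to the target grid, as in the post-processing approach above.
                return self.body(x)

        model = SRCNN()
        lr_slice = torch.randn(1, 1, 128, 128)  # dummy upscaled CT slice
        print(model(lr_slice).shape)            # torch.Size([1, 1, 128, 128])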

  17. Development of quality metrics for ambulatory pediatric cardiology: Infection prevention.

    Johnson, Jonathan N; Barrett, Cindy S; Franklin, Wayne H; Graham, Eric M; Halnon, Nancy J; Hattendorf, Brandy A; Krawczeski, Catherine D; McGovern, James J; O'Connor, Matthew J; Schultz, Amy H; Vinocur, Jeffrey M; Chowdhury, Devyani; Anderson, Jeffrey B

    2017-12-01

    In 2012, the American College of Cardiology's (ACC) Adult Congenital and Pediatric Cardiology Council established a program to develop quality metrics to guide ambulatory practices for pediatric cardiology. The council chose five areas on which to focus their efforts: chest pain, Kawasaki disease, tetralogy of Fallot, transposition of the great arteries after arterial switch, and infection prevention. Here, we describe the process, evaluation, and results of the Infection Prevention Committee's metric design effort. The infection prevention metrics team consisted of 12 members from 11 institutions in North America. The group agreed to work on specific infection prevention topics including antibiotic prophylaxis for endocarditis, rheumatic fever, and asplenia/hyposplenism; influenza vaccination and respiratory syncytial virus prophylaxis (palivizumab); preoperative methods to reduce intraoperative infections; vaccinations after cardiopulmonary bypass; hand hygiene; and testing to identify splenic function in patients with heterotaxy. An extensive literature review was performed. When available, previously published guidelines were used fully in determining metrics. The committee chose eight metrics to submit to the ACC Quality Metric Expert Panel for review. Ultimately, metrics regarding hand hygiene and influenza vaccination recommendations for patients did not pass the RAND analysis. Both endocarditis prophylaxis metrics and the RSV/palivizumab metric passed the RAND analysis but fell out during the open comment period. Three metrics passed all analyses: those for antibiotic prophylaxis in patients with heterotaxy/asplenia, for influenza vaccination compliance in healthcare personnel, and for adherence to recommended regimens of secondary prevention of rheumatic fever. The lack of convincing data to guide quality improvement initiatives in pediatric cardiology is widespread, particularly in infection prevention. Despite this, three metrics were successfully developed.

  18. Safety culture relationships with hospital nursing sensitive metrics.

    Brown, Diane Storer; Wolosin, Robert

    2013-01-01

    Public demand for safer care has catapulted the healthcare industry's efforts to understand relationships between patient safety and hospital performance. This study explored linkages between staff perceptions of safety culture (SC) and ongoing measures of hospital nursing unit-based structures, care processes, and adverse patient outcomes. Relationships between nursing-sensitive measures of hospital performance and SC were explored at the unit level, using data from 9 California hospitals and 37 nursing units. SC perceptions were measured 6 months prior to collection of the nursing metrics, and relationships between the two sets of data were explored using correlational and regression analyses. Significant relationships were found with reported falls and process measures for fall prevention. Multiple associations were identified between SC and the structure of care delivery: skill mix, staff turnover, and workload intensity demonstrated significant relationships with SC, explaining 22-45% of the variance. SC was an important factor to understand in the quest to advance safe patient care. These findings have affordability and care-quality implications for hospital leadership. When senior leaders prioritized a safety culture, patient outcomes may have improved with less staff turnover and more productivity. A business case could be made for investing in patient safety systems to provide reliably safe care. © 2013 National Association for Healthcare Quality.

  19. Principles in selecting human capital measurements and metrics

    Pharny D. Chrysler-Fox

    2014-09-01

    Research purpose: The study explored principles in selecting human capital measurements, drawing on the views and recommendations of human resource management professionals, all experts in human capital measurement. Motivation for the study: The motivation was to advance the understanding of selecting appropriate and strategically valid measurements, in order for human resource practitioners to contribute to creating value and driving strategic change. Research design, approach and method: A qualitative approach, with purposively selected cases from a selected panel of human capital measurement experts, generated a dataset through unstructured interviews, which were analysed thematically. Main findings: Nineteen themes were found. They represent a process that considers the centrality of the business strategy and a systemic integration across multiple value chains in the organisation through business partnering, in order to select measurements and generate management level-appropriate information. Practical/managerial implications: Measurement practitioners, in partnership with management from other functions, should integrate the business strategy across multiple value chains in order to select measurements. Analytics becomes critical in discovering relationships and formulating hypotheses to understand value creation. Higher education institutions should produce graduates able to deal with systems thinking and to operate within complexity. Contribution: This study identified principles to select measurements and metrics. Noticeable is the move away from the interrelated scorecard perspectives to a systemic view of the organisation in order to understand value creation. In addition, the findings may help to position the human resource management function as a strategic asset.

  20. Long-term energy planning with uncertain environmental performance metrics

    Parkinson, Simon C.; Djilali, Ned

    2015-01-01

    Highlights:
    • Environmental performance uncertainty considered in a long-term energy planning model.
    • Application to electricity generation planning in British Columbia.
    • Interactions with climate change mitigation and adaptation strategy are assessed.
    • Performance risk-hedging impacts the technology investment strategy.
    • Sensitivity of results to model formulation is discussed.
    Abstract: Environmental performance (EP) uncertainties span a number of energy technology options, and pose planning risk when the energy system is subject to environmental constraints. This paper presents two approaches to integrating EP uncertainty into the long-term energy planning framework. The methodologies consider stochastic EP metrics across multiple energy technology options, and produce a development strategy that hedges against the risk of exceeding environmental targets. Both methods are compared within a case study of emission-constrained electricity generation planning in British Columbia, Canada. The analysis provides important insight into model formulation and the interactions with concurrent environmental policy uncertainties. EP risk is found to be particularly important in situations where environmental constraints become increasingly stringent. Model results indicate that allocation of a modest risk premium in these situations can provide valuable hedging against EP risk.
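
    A minimal sketch of the kind of risk-hedging check described above: given stochastic emission factors for two hypothetical technologies, Monte Carlo sampling estimates the probability that a given capacity mix exceeds an emissions target. All numbers are illustrative assumptions, not values from the study.

        # Monte Carlo estimate of the probability of exceeding an emissions target.
        import numpy as np

        rng = np.random.default_rng(42)
        n_draws = 100_000

        # Assumed lifecycle emission factors in tCO2/GWh (mean, std) -- illustrative.
        gas_ef = rng.normal(450.0, 40.0, n_draws)
        hydro_ef = rng.normal(20.0, 10.0, n_draws)

        target = 200.0  # assumed portfolio-average cap, tCO2/GWh

        def exceedance_prob(gas_share: float) -> float:
            """Share-weighted emissions vs. the cap, across the sampled factors."""
            mix = gas_share * gas_ef + (1.0 - gas_share) * hydro_ef
            return float(np.mean(mix > target))

        for share in (0.45, 0.40, 0.35):
            print(f"gas share {share:.0%}: P(exceed target) = {exceedance_prob(share):.3f}")
        # A hedging strategy picks the mix whose exceedance probability stays below
        # an acceptable level, paying a modest 'risk premium' in investment cost.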