WorldWideScience

Sample records for metric target strategy

  1. Metrics for measuring net-centric data strategy implementation

    Science.gov (United States)

    Kroculick, Joseph B.

    2010-04-01

    An enterprise data strategy outlines an organization's vision and objectives for the improved collection and use of data. We propose generic metrics and quantifiable measures for each of the DoD Net-Centric Data Strategy (NCDS) data goals. Data strategy metrics can be adapted to the business processes of an enterprise and the needs of stakeholders in leveraging the organization's data assets to enable more effective decision making. The generic metrics are applied to a specific application where logistics supply and transportation data are integrated across multiple functional groups. A dashboard presents a multidimensional view of the current progress toward a state where logistics data are shared in a timely and seamless manner among users, applications, and systems.

  2. Down-side Risk Metrics as Portfolio Diversification Strategies across the GFC

    NARCIS (Netherlands)

    D.E. Allen (David); M.J. McAleer (Michael); R.J. Powell (Robert); A.K. Singh (Abhay)

    2015-01-01

    This paper features an analysis of the effectiveness of a range of portfolio diversification strategies, with a focus on down-side risk metrics, as a portfolio diversification strategy in a European market context. We apply these measures to a set of daily arithmetically compounded

  3. Improved targeted immunization strategies based on two rounds of selection

    Science.gov (United States)

    Xia, Ling-Ling; Song, Yu-Rong; Li, Chan-Chan; Jiang, Guo-Ping

    2018-04-01

    In high-degree targeted immunization with a limited number of vaccines, when more nodes share the same high degree than the budget allows, how do we choose among them so that the number of immunized nodes does not exceed the limit? In this paper, we introduce a new idea, derived from the selection process of a two-round exam, to solve this problem, and propose three improved targeted immunization strategies. In these strategies, the immunized nodes are selected through two rounds of selection: we enlarge the quota of the first round according to the degree-centrality criterion, and then use another characteristic of the node, such as its clustering coefficient, betweenness, or closeness, to choose the targeted nodes in the second round. To validate the effectiveness of the proposed strategies, we compare them with degree-based immunizations, including high-degree targeted and high-degree adaptive immunization, using two metrics: the size of the largest connected component of the immunized network and the number of infected nodes. Simulation results demonstrate that the proposed two-round strategies are effective for heterogeneous networks and that their immunization effects are better than those of the degree-based immunizations.
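    The two-round selection idea can be sketched in a few lines. This is one reading of the abstract, not the authors' code: the function names, the shortlist factor, and the toy graph are invented for illustration, and the second-round criterion used here is the local clustering coefficient (the paper also considers betweenness and closeness).

```python
def clustering(adj, v):
    """Local clustering coefficient of node v in an adjacency-set graph."""
    nbrs = adj[v]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for a in nbrs for b in nbrs if a < b and b in adj[a])
    return 2.0 * links / (k * (k - 1))

def two_round_immunize(adj, budget, shortlist_factor=2):
    # Round 1: shortlist more nodes than the budget allows, by degree.
    by_degree = sorted(adj, key=lambda v: len(adj[v]), reverse=True)
    shortlist = by_degree[: shortlist_factor * budget]
    # Round 2: among the shortlist, prefer low clustering (a hub whose
    # neighbours are poorly interconnected fragments the network more).
    shortlist.sort(key=lambda v: clustering(adj, v))
    return shortlist[:budget]

# Toy graph: hub 0 sits on a triangle, hub 5 centres a star; both have
# degree 3, so degree alone cannot decide which one to immunize first.
adj = {
    0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1}, 3: {0},
    5: {6, 7, 8}, 6: {5}, 7: {5}, 8: {5},
}
print(two_round_immunize(adj, budget=1))  # → [5] (the star hub wins)
```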

  4. Multi-Robot Assembly Strategies and Metrics

    Science.gov (United States)

    MARVEL, JEREMY A.; BOSTELMAN, ROGER; FALCO, JOE

    2018-01-01

    We present a survey of multi-robot assembly applications and methods and describe trends and general insights into the multi-robot assembly problem for industrial applications. We focus on fixtureless assembly strategies featuring two or more robotic systems. Such robotic systems include industrial robot arms, dexterous robotic hands, and autonomous mobile platforms, such as automated guided vehicles. In this survey, we identify the types of assemblies that are enabled by utilizing multiple robots, the algorithms that synchronize the motions of the robots to complete the assembly operations, and the metrics used to assess the quality and performance of the assemblies. PMID:29497234

  5. Multi-Robot Assembly Strategies and Metrics.

    Science.gov (United States)

    Marvel, Jeremy A; Bostelman, Roger; Falco, Joe

    2018-02-01

    We present a survey of multi-robot assembly applications and methods and describe trends and general insights into the multi-robot assembly problem for industrial applications. We focus on fixtureless assembly strategies featuring two or more robotic systems. Such robotic systems include industrial robot arms, dexterous robotic hands, and autonomous mobile platforms, such as automated guided vehicles. In this survey, we identify the types of assemblies that are enabled by utilizing multiple robots, the algorithms that synchronize the motions of the robots to complete the assembly operations, and the metrics used to assess the quality and performance of the assemblies.

  6. The impact of applying different metrics in target definitions : lessons for policy design

    NARCIS (Netherlands)

    Harmsen, Robert

    2016-01-01

    The objective of this paper is to analyse the impact of the use of different metrics in the EU renewable energy target definition. The analysis, using a case study of the Dutch renewable energy support for illustration, reveals that a target based on primary energy would have led to a ranking in

  7. Impact of greenhouse gas metrics on the quantification of agricultural emissions and farm-scale mitigation strategies: a New Zealand case study

    Science.gov (United States)

    Reisinger, Andy; Ledgard, Stewart

    2013-06-01

    Agriculture emits a range of greenhouse gases. Greenhouse gas metrics allow emissions of different gases to be reported in a common unit called CO2-equivalent. This enables comparisons of the efficiency of different farms and production systems and of alternative mitigation strategies across all gases. The standard metric is the 100 year global warming potential (GWP), but alternative metrics have been proposed and could result in very different CO2-equivalent emissions, particularly for CH4. While significant effort has been made to reduce uncertainties in emissions estimates of individual gases, little effort has been spent on evaluating the implications of alternative metrics on overall agricultural emissions profiles and mitigation strategies. Here we assess, for a selection of New Zealand dairy farms, the effect of two alternative metrics (100 yr GWP and global temperature change potentials, GTP) on farm-scale emissions and apparent efficiency and cost effectiveness of alternative mitigation strategies. We find that alternative metrics significantly change the balance between CH4 and N2O; in some cases, alternative metrics even determine whether a specific management option would reduce or increase net farm-level emissions or emissions intensity. However, the relative ranking of different farms by profitability or emissions intensity, and the ranking of the most cost-effective mitigation options for each farm, are relatively unaffected by the metric. We conclude that alternative metrics would change the perceived significance of individual gases from agriculture and the overall cost to farmers if a price were applied to agricultural emissions, but the economically most effective response strategies are unaffected by the choice of metric.
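    The metric arithmetic at stake is simple enough to show directly. In the sketch below the GWP100 values are familiar AR4-era figures, the GTP100 value for CH4 is an assumed round number, and the farm's emissions are made up; the paper's exact parameterizations differ.

```python
# How the choice of metric reshapes a farm's CO2-equivalent total.
GWP100 = {"CO2": 1.0, "CH4": 25.0, "N2O": 298.0}   # AR4-era GWP100
GTP100 = {"CO2": 1.0, "CH4": 4.0, "N2O": 280.0}    # assumed round values

def co2_equivalent(emissions_t, metric):
    """Per-gas emissions (tonnes) -> one CO2-eq total (tonnes)."""
    return sum(t * metric[gas] for gas, t in emissions_t.items())

farm = {"CO2": 50.0, "CH4": 10.0, "N2O": 1.0}      # hypothetical farm
print(co2_equivalent(farm, GWP100))  # → 598.0 (CH4 dominates: 250 of it)
print(co2_equivalent(farm, GTP100))  # → 370.0 (CH4 shrinks to 40)
```

The same management option can therefore look very different under the two metrics, which is exactly the sensitivity the paper quantifies.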

  8. Impact of greenhouse gas metrics on the quantification of agricultural emissions and farm-scale mitigation strategies: a New Zealand case study

    International Nuclear Information System (INIS)

    Reisinger, Andy; Ledgard, Stewart

    2013-01-01

    Agriculture emits a range of greenhouse gases. Greenhouse gas metrics allow emissions of different gases to be reported in a common unit called CO2-equivalent. This enables comparisons of the efficiency of different farms and production systems and of alternative mitigation strategies across all gases. The standard metric is the 100 year global warming potential (GWP), but alternative metrics have been proposed and could result in very different CO2-equivalent emissions, particularly for CH4. While significant effort has been made to reduce uncertainties in emissions estimates of individual gases, little effort has been spent on evaluating the implications of alternative metrics on overall agricultural emissions profiles and mitigation strategies. Here we assess, for a selection of New Zealand dairy farms, the effect of two alternative metrics (100 yr GWP and global temperature change potentials, GTP) on farm-scale emissions and apparent efficiency and cost effectiveness of alternative mitigation strategies. We find that alternative metrics significantly change the balance between CH4 and N2O; in some cases, alternative metrics even determine whether a specific management option would reduce or increase net farm-level emissions or emissions intensity. However, the relative ranking of different farms by profitability or emissions intensity, and the ranking of the most cost-effective mitigation options for each farm, are relatively unaffected by the metric. We conclude that alternative metrics would change the perceived significance of individual gases from agriculture and the overall cost to farmers if a price were applied to agricultural emissions, but the economically most effective response strategies are unaffected by the choice of metric. (letter)

  9. Identifying Drug-Target Interactions with Decision Templates.

    Science.gov (United States)

    Yan, Xiao-Ying; Zhang, Shao-Wu

    2018-01-01

    In the development of new drugs, identifying drug-target interactions (DTIs) is a primary concern. However, chemical and biological experiments are limited in coverage and carry huge costs in both time and money. Based on drug similarity and target similarity, chemogenomic methods can predict potential DTIs on a large scale without requiring target structures or ligand entries. Yet existing similarity metrics do not reflect cases in which drugs with variant structures interact with common targets, or targets with dissimilar sequences interact with the same drugs. In addition, although several similarity metrics have been developed for DTI prediction, the usual combination of multiple similarity metrics (especially heterogeneous similarities) is too naive to exploit them sufficiently. In this paper, based on Gene Ontology and pathway annotation, we introduce two novel target similarity metrics to address these issues. More importantly, we propose a more effective strategy, based on decision templates, to integrate multiple classifiers designed with multiple similarity metrics. In the scenarios of predicting existing targets for new drugs and predicting approved drugs for new protein targets, results on the DTI benchmark datasets show that our target similarity metrics enhance predictive accuracy in both scenarios, and that the elaborate fusion of multiple classifiers has better predictive power than the naive combination of multiple similarity metrics. Compared with two other state-of-the-art approaches on the four popular benchmark datasets of binary drug-target interactions, our method achieves the best results in terms of AUC and AUPR both for predicting available targets for new drugs (S2) and for predicting approved drugs for new protein targets (S3). These results demonstrate that our method can effectively predict the drug-target interactions. The software package can
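    Decision-template fusion itself (Kuncheva's scheme, which this kind of strategy builds on) is compact enough to sketch. Everything below is illustrative: two hypothetical base classifiers scoring two classes, with invented numbers. A class's template is the mean of the score matrices of its training samples, and a new sample is assigned to the nearest template.

```python
def template(profiles):
    """Elementwise mean of a list of equally-shaped score matrices."""
    n = len(profiles)
    return [[sum(p[i][j] for p in profiles) / n
             for j in range(len(profiles[0][0]))]
            for i in range(len(profiles[0]))]

def distance(dp, tpl):
    """Squared Euclidean distance between two score matrices."""
    return sum((dp[i][j] - tpl[i][j]) ** 2
               for i in range(len(dp)) for j in range(len(dp[0])))

def classify(dp, templates):
    """Label of the template nearest to decision profile dp."""
    return min(templates, key=lambda c: distance(dp, templates[c]))

# Rows = two base classifiers, columns = scores for the two classes
# ("interact" / "none"); two training profiles per class, toy numbers.
templates = {
    "interact": template([[[0.9, 0.1], [0.8, 0.2]],
                          [[0.7, 0.3], [0.9, 0.1]]]),
    "none":     template([[[0.2, 0.8], [0.1, 0.9]],
                          [[0.3, 0.7], [0.2, 0.8]]]),
}
print(classify([[0.85, 0.15], [0.8, 0.2]], templates))  # → interact
```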

  10. Robustness of climate metrics under climate policy ambiguity

    International Nuclear Information System (INIS)

    Ekholm, Tommi; Lindroos, Tomi J.; Savolainen, Ilkka

    2013-01-01

    Highlights: • We assess the economic impacts of using different climate metrics. • The setting is cost-efficient scenarios for three interpretations of the 2 °C target. • With each target setting, the optimal metric is different. • Therefore policy ambiguity prevents the selection of an optimal metric. • Robust metric values that perform well with multiple policy targets however exist. -- Abstract: A wide array of alternatives has been proposed as the common metrics with which to compare the climate impacts of different emission types. Different physical and economic metrics and their parameterizations give diverse weights between e.g. CH4 and CO2, and fixing the metric from one perspective makes it sub-optimal from another. As the aims of global climate policy involve some degree of ambiguity, it is not possible to determine a metric that would be optimal and consistent with all policy aims. This paper evaluates the cost implications of using predetermined metrics in cost-efficient mitigation scenarios. Three formulations of the 2 °C target, including both deterministic and stochastic approaches, shared a wide range of metric values for CH4 with which the mitigation costs are only slightly above the cost-optimal levels. Therefore, although ambiguity in current policy might prevent us from selecting an optimal metric, it can be possible to select robust metric values that perform well with multiple policy targets.

  11. Voyager 2 Neptune targeting strategy

    Science.gov (United States)

    Potts, C. L.; Francis, K.; Matousek, S. E.; Cesarone, R. J.; Gray, D. L.

    1989-01-01

    The success of the Voyager 2 flybys of Neptune and Triton depends upon the ability to correct the spacecraft's trajectory. Accurate spacecraft delivery to the desired encounter conditions will promote the maximum science return. However, Neptune's great distance causes large a priori uncertainties in Neptune and Triton ephemerides and planetary system parameters. Consequently, the 'ideal' trajectory is unknown beforehand. The targeting challenge is to utilize the gradually improving knowledge as the spacecraft approaches Neptune to meet the science objectives, but with an overriding concern for spacecraft safety and a desire to limit propellant expenditure. A unique targeting strategy has been developed in response to this challenge. Through the use of a Monte Carlo simulation, candidate strategies are evaluated by the degree to which they meet these objectives and are compared against each other in determining the targeting strategy to be adopted.

  12. Voyager 2 Uranus targeting strategy

    Science.gov (United States)

    Cesarone, R. J.; Gray, D. L.; Potts, C. L.; Francis, K.

    1986-01-01

    One of the major challenges of the Voyager 2 Uranus flyby is to deliver the spacecraft to an appropriate aimpoint at the optimum time, so as to maximize the science return of the mission while keeping propellant expenditure low. An unusual targeting strategy has been devised to satisfy these requirements. Its complexity arises from the great distance of the planet Uranus and the limited performance capabilities of Voyager. The selected strategy is developed in relation to a set of candidate strategies, mission requirements, and shifting science objectives. The analysis of these candidates is conducted via a Monte Carlo simulation, the results of which yield data for the comparative evaluation and eventual selection of the actual targeting strategy to be employed.

  13. Conceptual Soundness, Metric Development, Benchmarking, and Targeting for PATH Subprogram Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Mosey, G.; Doris, E.; Coggeshall, C.; Antes, M.; Ruch, J.; Mortensen, J.

    2009-01-01

    The objective of this study is to evaluate the conceptual soundness of the U.S. Department of Housing and Urban Development (HUD) Partnership for Advancing Technology in Housing (PATH) program's revised goals and establish and apply a framework to identify and recommend metrics that are the most useful for measuring PATH's progress. This report provides an evaluative review of PATH's revised goals, outlines a structured method for identifying and selecting metrics, proposes metrics and benchmarks for a sampling of individual PATH programs, and discusses other metrics that potentially could be developed that may add value to the evaluation process. The framework and individual program metrics can be used for ongoing management improvement efforts and to inform broader program-level metrics for government reporting requirements.

  14. Next-Generation Metrics: Responsible Metrics & Evaluation for Open Science

    Energy Technology Data Exchange (ETDEWEB)

    Wilsdon, J.; Bar-Ilan, J.; Peters, I.; Wouters, P.

    2016-07-01

    Metrics evoke a mixed reaction from the research community. A commitment to using data to inform decisions makes some enthusiastic about the prospect of granular, real-time analysis of research and its wider impacts. Yet we only have to look at the blunt use of metrics such as journal impact factors, h-indices and grant income targets to be reminded of the pitfalls. Some of the most precious qualities of academic culture resist simple quantification, and individual indicators often struggle to do justice to the richness and plurality of research. Too often, poorly designed evaluation criteria are “dominating minds, distorting behaviour and determining careers” (Lawrence, 2007). Metrics hold real power: they are constitutive of values, identities and livelihoods. How to exercise that power to more positive ends has been the focus of several recent and complementary initiatives, including the San Francisco Declaration on Research Assessment (DORA), the Leiden Manifesto and The Metric Tide (a UK government review of the role of metrics in research management and assessment). Building on these initiatives, the European Commission, under its new Open Science Policy Platform, is now looking to develop a framework for responsible metrics for research management and evaluation, which can be incorporated into the successor framework to Horizon 2020. (Author)

  15. Parameter-space metric of semicoherent searches for continuous gravitational waves

    International Nuclear Information System (INIS)

    Pletsch, Holger J.

    2010-01-01

    Continuous gravitational-wave (CW) signals, such as those emitted by spinning neutron stars, are an important target class for current detectors. However, the enormous computational demand prohibits fully coherent broadband all-sky searches for prior unknown CW sources over wide ranges of parameter space and for yearlong observation times. More efficient hierarchical “semicoherent” search strategies divide the data into segments much shorter than one year, which are analyzed coherently; detection statistics from different segments are then combined incoherently. To perform the incoherent combination optimally, an understanding of the underlying parameter-space structure is required. This problem is addressed here by using new coordinates on the parameter space, which yield the first analytical parameter-space metric for the incoherent combination step. This semicoherent metric applies to broadband all-sky surveys (also embedding directed searches at fixed sky position) for isolated CW sources. Furthermore, the additional metric resolution attained through the combination of segments is studied. Of the search parameters (sky position, frequency, and frequency derivatives), only the metric resolution in the frequency derivatives is found to increase significantly with the number of segments.

  16. Brain tumor-targeted drug delivery strategies

    Directory of Open Access Journals (Sweden)

    Xiaoli Wei

    2014-06-01

    Despite the application of aggressive surgery, radiotherapy and chemotherapy in clinics, brain tumors are still a difficult health challenge due to their fast development and poor prognosis. Brain tumor-targeted drug delivery systems, which increase drug accumulation in the tumor region and reduce toxicity in normal brain and peripheral tissue, are a promising new approach to brain tumor treatments. Since brain tumors exhibit many distinctive characteristics relative to tumors growing in peripheral tissues, potential targets based on continuously changing vascular characteristics and the microenvironment can be utilized to facilitate effective brain tumor-targeted drug delivery. In this review, we briefly describe the physiological characteristics of brain tumors, including blood–brain/brain tumor barriers, the tumor microenvironment, and tumor stem cells. We also review targeted delivery strategies and introduce a systematic targeted drug delivery strategy to overcome the challenges.

  17. Deep Transfer Metric Learning.

    Science.gov (United States)

    Junlin Hu; Jiwen Lu; Yap-Peng Tan; Jie Zhou

    2016-12-01

    Conventional metric learning methods usually assume that the training and test samples are captured in similar scenarios so that their distributions are assumed to be the same. This assumption does not hold in many real visual recognition applications, especially when samples are captured across different data sets. In this paper, we propose a new deep transfer metric learning (DTML) method to learn a set of hierarchical nonlinear transformations for cross-domain visual recognition by transferring discriminative knowledge from the labeled source domain to the unlabeled target domain. Specifically, our DTML learns a deep metric network by maximizing the inter-class variations and minimizing the intra-class variations, and minimizing the distribution divergence between the source domain and the target domain at the top layer of the network. To better exploit the discriminative information from the source domain, we further develop a deeply supervised transfer metric learning (DSTML) method by including an additional objective on DTML, where the output of both the hidden layers and the top layer are optimized jointly. To preserve the local manifold of input data points in the metric space, we present two new methods, DTML with autoencoder regularization and DSTML with autoencoder regularization. Experimental results on face verification, person re-identification, and handwritten digit recognition validate the effectiveness of the proposed methods.
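    As a back-of-envelope illustration of the three competing terms in such an objective, consider a one-dimensional embedding. The function below is an invented toy, not the paper's deep-network formulation: it uses a linear-kernel mean discrepancy in place of the full MMD, and all numbers are made up.

```python
def mean(xs):
    return sum(xs) / len(xs)

def scatter(xs):
    """Mean squared deviation from the mean (variance, population form)."""
    m = mean(xs)
    return mean([(x - m) ** 2 for x in xs])

def dtml_objective(classes, source, target, alpha=1.0, beta=1.0):
    # classes: {label: [embedded values]} from the labeled source domain.
    intra = mean([scatter(v) for v in classes.values()])   # minimize
    inter = scatter([mean(v) for v in classes.values()])   # maximize
    mmd = (mean(source) - mean(target)) ** 2               # domain gap
    # Small intra-class scatter and domain gap, large class separation.
    return intra - alpha * inter + beta * mmd

classes = {"a": [0.0, 0.2], "b": [1.0, 1.2]}   # tight, well-separated
src = [0.0, 0.2, 1.0, 1.2]
tgt = [0.1, 0.3, 1.1, 1.3]                     # slightly shifted domain
print(round(dtml_objective(classes, src, tgt), 3))  # → -0.23
```

A good embedding drives this value down; DTML trains network weights to do exactly that, layer by layer.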

  18. Segmenting, Targeting, and Positioning Strategy and Pricing Strategy at the Kecap Blekok Company in Cilacap

    OpenAIRE

    Wijaya, Hari; Sirine, Hani

    2017-01-01

    To win the market competition, companies must have a segmenting, targeting, and positioning strategy as well as a pricing strategy. This study aims to determine the segmenting, targeting, and positioning strategy, as well as the pricing strategy, of the Kecap Blekok Company in Cilacap. Data were collected using interviews and documentation, and analysed with descriptive analysis techniques. The results showed that the market segment of the Kecap Blekok Company is the lower middle class, t...

  19. Active Metric Learning for Supervised Classification

    OpenAIRE

    Kumaran, Krishnan; Papageorgiou, Dimitri; Chang, Yutong; Li, Minhan; Takáč, Martin

    2018-01-01

    Clustering and classification critically rely on distance metrics that provide meaningful comparisons between data points. We present mixed-integer optimization approaches to find optimal distance metrics that generalize the Mahalanobis metric extensively studied in the literature. Additionally, we generalize and improve upon leading methods by removing reliance on pre-designated "target neighbors," "triplets," and "similarity pairs." Another salient feature of our method is its ability to en...
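    For reference, the Mahalanobis family these methods generalize is d_M(x, y)² = (x − y)ᵀ M (x − y) with M positive semidefinite. A plain sketch, with an invented diagonal M standing in for the learned matrix:

```python
def mahalanobis_sq(x, y, M):
    """Squared Mahalanobis distance (x - y)^T M (x - y)."""
    d = [xi - yi for xi, yi in zip(x, y)]
    return sum(d[i] * M[i][j] * d[j]
               for i in range(len(d)) for j in range(len(d)))

# A learned M would come from the optimization; this one just weights
# the first coordinate four times as heavily as the second.
M = [[4.0, 0.0], [0.0, 1.0]]
print(mahalanobis_sq([0, 0], [1, 1], M))  # → 5.0 (4*1 + 1*1)
# With M = identity this reduces to the squared Euclidean distance.
```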

  20. Down-Side Risk Metrics as Portfolio Diversification Strategies across the Global Financial Crisis

    Directory of Open Access Journals (Sweden)

    David E. Allen

    2016-06-01

    This paper features an analysis of the effectiveness of a range of portfolio diversification strategies, with a focus on down-side risk metrics, as a portfolio diversification strategy in a European market context. We apply these measures to a set of daily arithmetically-compounded returns, in U.S. dollar terms, on a set of ten market indices representing the major European markets for a nine-year period from the beginning of 2005 to the end of 2013. The sample period, which incorporates the periods of both the Global Financial Crisis (GFC) and the subsequent European Debt Crisis (EDC), is a challenging one for the application of portfolio investment strategies. The analysis is undertaken via the examination of multiple investment strategies and a variety of hold-out periods and backtests. We commence by using four two-year estimation periods and a subsequent one-year investment hold-out period to analyse a naive 1/N diversification strategy and to contrast its effectiveness with Markowitz mean-variance analysis with positive weights. Markowitz optimisation is then compared to various down-side investment optimisation strategies. We begin by comparing Markowitz with CVaR, and then proceed to evaluate the relative effectiveness of Markowitz with various draw-down strategies, utilising a series of backtests. Our results suggest that none of the more sophisticated optimisation strategies appear to dominate naive diversification.
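    The contrast between naive 1/N weighting and a variance-aware alternative can be shown numerically. The sketch below uses inverse-variance weights as a crude stand-in for the paper's mean-variance and CVaR optimisers, with an invented two-asset covariance matrix:

```python
def portfolio_variance(weights, cov):
    """w^T Sigma w for a weight vector and covariance matrix."""
    n = len(weights)
    return sum(weights[i] * cov[i][j] * weights[j]
               for i in range(n) for j in range(n))

cov = [[0.04, 0.01],     # asset 0: 20% vol
       [0.01, 0.09]]     # asset 1: 30% vol, mildly correlated

naive = [0.5, 0.5]                         # the 1/N rule
raw = [1.0 / cov[i][i] for i in range(2)]  # inverse-variance weights,
iv = [w / sum(raw) for w in raw]           # normalised to sum to one

print(round(portfolio_variance(naive, cov), 4))  # 1/N variance
print(round(portfolio_variance(iv, cov), 4))     # lower in this example
```

In-sample the tilted weights look better, which is precisely why the paper's out-of-sample backtests (where estimation error bites) are the interesting test.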

  1. The PM&R Journal Implements a Social Media Strategy to Disseminate Research and Track Alternative Metrics in Physical Medicine and Rehabilitation.

    Science.gov (United States)

    Niehaus, William N; Silver, Julie K; Katz, Matthew S

    2017-12-16

    Implementation science is an evolving part of translating evidence into clinical practice and public health policy. This report describes how a social media strategy for the journal PM&R using metrics, including alternative metrics, contributes to the dissemination of research and other information in the field of physical medicine and rehabilitation. The primary goal of the strategy was to disseminate information about rehabilitation medicine, including but not limited to new research published in the journal, to health care professionals. Several different types of metrics were studied, including alternative metrics that are increasingly being used to demonstrate impact in academic medicine. A secondary goal was to encourage diversity and inclusion of the physiatric workforce-enhancing the reputations of all physiatrists by highlighting their research, lectures, awards, and other accomplishments with attention to those who may be underrepresented. A third goal was to educate the public so that they are more aware of the field and how to access care. This report describes the early results following initiation of PM&R's coordinated social media strategy. Through a network of social media efforts that are strategically integrated, physiatrists and their associated institutions have an opportunity to advance their research and clinical agendas, support the diverse physiatric workforce, and educate the public about the field to enhance patient awareness and access to care.

  2. The wasted energy: A metric to set up appropriate targets in our path towards fully renewable energy systems

    International Nuclear Information System (INIS)

    Vinagre Díaz, Juan José; Wilby, Mark Richard; Rodríguez González, Ana Belén

    2015-01-01

    By 2020, Europe has to increase its energy efficiency and its share of renewables by 20%. However, even after accomplishing these challenging objectives Europe will still be wasting energy, as we demonstrate in this paper. On the way towards a fully renewable scenario, we need at the very least to stop wasting energy in order to guarantee the energy supply needed for growth and comfort. We waste energy when we employ more primary energy than the final energy we ultimately use and this excess cannot be reutilized. In this paper we propose the WE (wasted energy) as a novel metric to measure the performance of energy systems and to set appropriate targets. The WE incorporates information about energy efficiency and renewable sources. Unlike European legislation, the WE considers them in an integrated way. This approach will help Member States to exploit their intrinsic capabilities and design their optimum strategy for reaching their objectives. Using the information in Eurostat, we calculate the WE of the Member States of the EU-28 and its evolution. We also analyze illustrative examples to highlight strategies to reduce the WE, study the connection between economic development and WE, and provide a tool to diagnose the potential for improvement of an energy system. - Highlights: • Even achieving the 2020 objectives, Europe will still be wasting energy. • We need to reduce wasted energy on our way towards 100% renewable energy systems. • The WE (wasted energy) integrates efficiency and renewables in a single target. • We provide the empirical WE of the Member States of the EU-28 and its evolution. • Finally, we highlight best practices of real energy systems.
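    Under a simplified reading of the abstract, the core of the WE metric is the primary energy consumed beyond the final energy actually delivered to users. The figures below are invented, and the paper's actual definition additionally accounts for renewable sources:

```python
def wasted_energy(primary_mtoe, final_mtoe):
    """Simplified WE: primary energy input minus final energy delivered,
    i.e. conversion and distribution losses that cannot be recovered."""
    return primary_mtoe - final_mtoe

# Hypothetical member state: 160 Mtoe primary input, 110 Mtoe final use.
we = wasted_energy(160.0, 110.0)
print(we)                    # → 50.0 Mtoe never reaches an end use
print(round(we / 160.0, 3))  # → 0.312, the share of primary energy wasted
```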

  3. Space based lidar shot pattern targeting strategies for small targets such as streams

    Science.gov (United States)

    Spiers, Gary D.

    2001-01-01

    An analysis of the effectiveness of four different types of lidar shot distribution is conducted to determine which is best for concentrating shots in a given location. A simple preemptive targeting strategy is found to work as well as a more involved dynamic strategy for most target sizes considered.

  4. Evaluating and Estimating the WCET Criticality Metric

    DEFF Research Database (Denmark)

    Jordan, Alexander

    2014-01-01

    a programmer (or compiler) from targeting optimizations the right way. A possible resort is to use a metric that targets WCET and which can be efficiently computed for all code parts of a program. Similar to dynamic profiling techniques, which execute code with input that is typically expected ... for the application, based on WCET analysis we can indicate how critical a code fragment is, in relation to the worst-case bound. Computing such a metric on top of static analysis incurs a certain overhead though, which increases with the complexity of the underlying WCET analysis. We present our approach ... to estimate the Criticality metric, by relaxing the precision of WCET analysis. Through this, we can reduce analysis time by orders of magnitude, while only introducing minor error. To evaluate our estimation approach and share our garnered experience using the metric, we evaluate real-time programs, which

  5. Metrics, Media and Advertisers: Discussing Relationship

    Directory of Open Access Journals (Sweden)

    Marco Aurelio de Souza Rodrigues

    2014-11-01

    This study investigates how Brazilian advertisers are adapting to new media and its attention metrics. In-depth interviews were conducted with advertisers in 2009 and 2011. In 2009, new media and its metrics were celebrated as innovations that would increase advertising campaigns' overall efficiency. By 2011, this perception had changed: new media's profusion of metrics, once seen as an advantage, had started to compromise its ease of use and adoption. Among its findings, this study argues that there is an opportunity for media groups willing to shift from a product-focused strategy towards a customer-centric one, through the creation of new, simple and integrative metrics.

  6. Mass Media Strategies Targeting High Sensation Seekers: What Works and Why

    Science.gov (United States)

    Stephenson, Michael T.

    2003-01-01

    Objectives: To examine strategies for using the mass media effectively in drug prevention campaigns targeting high sensation seekers. Methods: Both experimental lab and field studies were used to develop a comprehensive audience segmentation strategy targeting high sensation seekers. Results: A 4-pronged targeting strategy employed in an…

  7. Target marketing strategies for occupational therapy entrepreneurs.

    Science.gov (United States)

    Kautzmann, L N; Kautzmann, F N; Navarro, F H

    1989-01-01

    Understanding marketing techniques is one of the skills needed by successful entrepreneurs. Target marketing is an effective method for occupational therapy entrepreneurs to use in determining when and where to enter the marketplace. The two components of target marketing, market segmentation and the development of marketing mix strategies for each identified market segment, are described. The Profile of Attitudes Toward Health Care (PATH) method of psychographic market segmentation of health care consumers is presented. Occupational therapy marketing mix strategies for each PATH consumer group are delineated, and compatible groupings of market segments are suggested.

  8. Comprehensive Metric Education Project: Implementing Metrics at a District Level Administrative Guide.

    Science.gov (United States)

    Borelli, Michael L.

    This document details the administrative issues associated with guiding a school district through its metrication efforts. Issues regarding staff development, curriculum development, and the acquisition of instructional resources are considered. Alternative solutions are offered. Finally, an overall implementation strategy is discussed with…

  9. Proteolysis targeting peptide (PROTAP) strategy for protein ubiquitination and degradation.

    Science.gov (United States)

    Zheng, Jing; Tan, Chunyan; Xue, Pengcheng; Cao, Jiakun; Liu, Feng; Tan, Ying; Jiang, Yuyang

    2016-02-19

    Ubiquitination proteasome pathway (UPP) is the most important and selective way to degrade proteins in vivo. Here, a novel proteolysis targeting peptide (PROTAP) strategy, composed of a target protein binding peptide, a linker and a ubiquitin E3 ligase recognition peptide, was designed to recruit both target protein and E3 ligase and then induce polyubiquitination and degradation of the target protein through UPP. In our study, the PROTAP strategy was proved to be a general method with high specificity using Bcl-xL protein as model target in vitro and in cells, which indicates that the strategy has great potential for in vivo application. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Prodrug strategy for cancer cell-specific targeting: A recent overview.

    Science.gov (United States)

    Zhang, Xian; Li, Xiang; You, Qidong; Zhang, Xiaojin

    2017-10-20

    The increasing development of targeted cancer therapy provides extensive possibilities in clinical trials, and numerous strategies have been explored. The prodrug is one of the most promising strategies in targeted cancer therapy for improving the selectivity and efficacy of cytotoxic compounds. Compared with normal tissues, cancer cells are characterized by unique aberrant markers, so inactive prodrugs targeting these markers are excellent therapeutics that release active drugs, killing cancer cells without damaging normal tissues. In this review, we present an integrated view of potential prodrugs applied in targeted cancer therapy based on aberrant cancer-specific markers, and some examples are provided to inspire new ideas for prodrug strategies for cancer cell-specific targeting. Copyright © 2017. Published by Elsevier Masson SAS.

  11. Voyager 1 Saturn targeting strategy

    Science.gov (United States)

    Cesarone, R. J.

    1980-01-01

    A trajectory targeting strategy for the Voyager 1 Saturn encounter has been designed to accommodate predicted uncertainties in Titan's ephemeris while maximizing spacecraft safety and science return. The encounter is characterized by a close Titan flyby 18 hours prior to Saturn periapse. Retargeting of the nominal trajectory to account for late updates in Titan's estimated position can disperse the ascending node location, which is nominally situated at a radius of low expected particle density in Saturn's ring plane. The strategy utilizes a floating Titan impact vector magnitude to minimize this dispersion. Encounter trajectory characteristics and optimal tradeoffs are presented.

  12. [The development of novel tumor targeting delivery strategy].

    Science.gov (United States)

    Gao, Hui-le; Jiang, Xin-guo

    2016-02-01

    Tumor is one of the most serious threats to human health. Although many anti-tumor drugs are approved for clinical use, treatment outcomes remain modest because of poor tumor-targeting efficiency and low accumulation in the tumor. Therefore, it is important to deliver anti-tumor drugs into tumors efficiently, elevate drug concentrations in tumor tissues, and reduce drug distribution in normal tissues; this has become one of the most attractive directions for pharmaceutical academia and industry. Many kinds of strategies, especially various nanoparticulate drug delivery systems, have been developed to address the critical features of the complex tumor microenvironment, and these are partially or mostly satisfactory for tumor treatment. In this paper, we carefully review the novel targeting delivery strategies developed in recent years. The most powerful method is passive targeting based on the enhanced permeability and retention (EPR) effect, and most commercial nanomedicines rely on it. However, high permeability and high retention call for different particle sizes, so several kinds of size-changeable nanoparticles have been developed, such as size-reducible particles and assembling particles, to resolve the conflicting requirements for particle size and to enhance both tumor retention and penetration. Surface-charge-reversible nanoparticles also show high efficiency, because an anionic charge in the blood circulation and normal organs decreases unintended internalization, while the charge can switch to positive in the tumor microenvironment, facilitating drug uptake by tumor cells. Additionally, tumor-microenvironment-responsive drug release is important for decreasing drug side effects, and many strategies have been developed, such as pH-sensitive and enzyme-sensitive release. Beyond responsive nanoparticles, reshaping the tumor microenvironment can attenuate barriers to drug delivery, for example by decreasing tumor collagen density and normalizing tumor

  13. Energy functionals for Calabi-Yau metrics

    International Nuclear Information System (INIS)

    Headrick, M; Nassar, A

    2013-01-01

    We identify a set of "energy" functionals on the space of metrics in a given Kähler class on a Calabi-Yau manifold, which are bounded below and minimized uniquely on the Ricci-flat metric in that class. Using these functionals, we recast the problem of numerically solving the Einstein equation as an optimization problem. We apply this strategy, using the "algebraic" metrics (metrics for which the Kähler potential is given in terms of a polynomial in the projective coordinates), to the Fermat quartic and to a one-parameter family of quintics that includes the Fermat and conifold quintics. We show that this method yields approximations to the Ricci-flat metric that are exponentially accurate in the degree of the polynomial (except at the conifold point, where the convergence is polynomial), and therefore orders of magnitude more accurate than the balanced metrics, previously studied as approximations to the Ricci-flat metric. The method is relatively fast and easy to implement. On the theoretical side, we also show that the functionals can be used to give a heuristic proof of Yau's theorem.

  14. Degraded visual environment image/video quality metrics

    Science.gov (United States)

    Baumgartner, Dustin D.; Brown, Jeremy B.; Jacobs, Eddie L.; Schachter, Bruce J.

    2014-06-01

    A number of image quality metrics (IQMs) and video quality metrics (VQMs) have been proposed in the literature for evaluating techniques and systems for mitigating degraded visual environments. Some require both pristine and corrupted imagery. Others require patterned target boards in the scene. None of these metrics relates well to the task of landing a helicopter in conditions such as a brownout dust cloud. We have developed and used a variety of IQMs and VQMs related to the pilot's ability to detect hazards in the scene and to maintain situational awareness. Some of these metrics can be made agnostic to sensor type. Not only are the metrics suitable for evaluating algorithm and sensor variation, they are also suitable for choosing the most cost effective solution to improve operating conditions in degraded visual environments.

  15. Effects of subsampling of passive acoustic recordings on acoustic metrics.

    Science.gov (United States)

    Thomisch, Karolin; Boebel, Olaf; Zitterbart, Daniel P; Samaran, Flore; Van Parijs, Sofie; Van Opzeeland, Ilse

    2015-07-01

    Passive acoustic monitoring is an important tool in marine mammal studies. However, logistics and finances frequently constrain the number and servicing schedules of acoustic recorders, requiring a trade-off between deployment periods and sampling continuity, i.e., the implementation of a subsampling scheme. Optimizing such schemes to each project's specific research questions is desirable. This study investigates the impact of subsampling on the accuracy of two common metrics, acoustic presence and call rate, for different vocalization patterns (regimes) of baleen whales: (1) variable vocal activity, (2) vocalizations organized in song bouts, and (3) vocal activity with diel patterns. To this end, above metrics are compared for continuous and subsampled data subject to different sampling strategies, covering duty cycles between 50% and 2%. The results show that a reduction of the duty cycle impacts negatively on the accuracy of both acoustic presence and call rate estimates. For a given duty cycle, frequent short listening periods improve accuracy of daily acoustic presence estimates over few long listening periods. Overall, subsampling effects are most pronounced for low and/or temporally clustered vocal activity. These findings illustrate the importance of informed decisions when applying subsampling strategies to passive acoustic recordings or analyses for a given target species.
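The effect of subsampling on daily acoustic presence can be illustrated with a toy simulation. The day length, bout placement, and duty-cycle schedules below are hypothetical; the sketch only shows why, at the same 5% duty cycle, frequent short listening periods can detect a clustered song bout that a single long block misses:

```python
def daily_presence(activity, listen, cycle):
    """True if at least one call falls within a listening period.
    `activity` is a per-minute call indicator for one day; the recorder
    listens during the first `listen` minutes of every `cycle` minutes."""
    return any(call and (t % cycle) < listen for t, call in enumerate(activity))

# One clustered 60-minute song bout starting at minute 600 of a 1440-minute day.
day = [600 <= t < 660 for t in range(1440)]

continuous = daily_presence(day, 1, 1)       # 100% duty cycle
short_often = daily_presence(day, 5, 100)    # 5% duty: 5 min every 100 min
long_rare = daily_presence(day, 72, 1440)    # 5% duty: one 72-min block
# Both subsampled schemes record 72 minutes per day, but only the
# frequent short listening periods intersect the clustered bout.
```

Averaging such trials over many random bout placements would give the accuracy comparison the study reports; temporally clustered activity is exactly the regime where the schedule choice matters most.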

  16. Target Scattering Metrics: Model-Model and Model-Data Comparisons

    Science.gov (United States)

    2017-12-13

    Email: kargl@apl.washington.edu. Award Number: N00014-16-1-3209. ABSTRACT: The development of metrics for the comparison of data obtained from... satisfies ∫_{−∞}^{∞} |·|² = ∫_{−∞}^{∞} |·|².  (1) To exploit Eq. (1), it is convenient to write...

  18. Smart Grid Status and Metrics Report Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Balducci, Patrick J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Antonopoulos, Chrissi A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clements, Samuel L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gorrissen, Willy J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kirkham, Harold [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ruiz, Kathleen A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, David L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Weimar, Mark R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gardner, Chris [APQC, Houston, TX (United States); Varney, Jeff [APQC, Houston, TX (United States)

    2014-07-01

    A smart grid uses digital power control and communication technology to improve the reliability, security, flexibility, and efficiency of the electric system, from large generation through the delivery systems to electricity consumers and a growing number of distributed generation and storage resources. To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. The Smart Grid Status and Metrics Report defines and examines 21 metrics that collectively provide insight into the grid’s capacity to embody these characteristics. This appendix presents papers covering each of the 21 metrics identified in Section 2.1 of the Smart Grid Status and Metrics Report. These metric papers were prepared in advance of the main body of the report and collectively form its informational backbone.

  19. Targeted Assessment for Prevention of Healthcare-Associated Infections: A New Prioritization Metric.

    Science.gov (United States)

    Soe, Minn M; Gould, Carolyn V; Pollock, Daniel; Edwards, Jonathan

    2015-12-01

    OBJECTIVE: To develop a method for calculating the number of healthcare-associated infections (HAIs) that must be prevented to reach a HAI reduction goal and for identifying and prioritizing healthcare facilities where the largest reductions can be achieved. SETTING: Acute care hospitals that report HAI data to the Centers for Disease Control and Prevention's National Healthcare Safety Network. METHODS: The cumulative attributable difference (CAD) is calculated by subtracting a numerical prevention target from an observed number of HAIs. The prevention target is the product of the predicted number of HAIs and a standardized infection ratio goal, which represents the HAI reduction goal. The CAD, when positive, is the number of infections that must be prevented to reach the HAI reduction goal. We calculated the CAD for catheter-associated urinary tract infections for each of the 3,639 hospitals that reported such data to the National Healthcare Safety Network in 2013 and ranked the hospitals by their CAD values in descending order. RESULTS: Of the 1,578 hospitals with positive CAD values, preventing 10,040 catheter-associated urinary tract infections at the 293 hospitals (19%) with the highest CAD would enable achievement of the national 25% catheter-associated urinary tract infection reduction goal. CONCLUSION: The CAD is a new metric that facilitates ranking of facilities, and locations within facilities, to prioritize HAI prevention efforts where the greatest impact can be achieved toward a HAI reduction goal.
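A minimal sketch of the CAD calculation as described above (observed HAIs minus the prevention target, i.e., predicted HAIs scaled by the SIR goal); the facility names and counts are made up for illustration:

```python
def cumulative_attributable_difference(observed, predicted, sir_goal):
    """CAD = observed HAIs minus the prevention target, where the target
    is the predicted number of HAIs scaled by the SIR goal. A positive
    CAD is the number of infections that must be prevented."""
    return observed - predicted * sir_goal

# Hypothetical facilities: (observed CAUTIs, risk-adjusted predicted CAUTIs).
facilities = {"A": (120, 100.0), "B": (45, 60.0), "C": (80, 50.0)}
sir_goal = 0.75  # corresponds to a 25% reduction goal

ranked = sorted(
    ((name, cumulative_attributable_difference(obs, pred, sir_goal))
     for name, (obs, pred) in facilities.items()),
    key=lambda pair: pair[1], reverse=True)
# Prevention effort is prioritized at facilities with the largest CAD.
```

Summing the top-ranked CADs until the national reduction target is covered reproduces the prioritization logic the abstract describes.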

  20. Therapeutic targeting strategies using endogenous cells and proteins.

    Science.gov (United States)

    Parayath, Neha N; Amiji, Mansoor M

    2017-07-28

    Targeted drug delivery has become extremely important in enhancing efficacy and reducing the toxicity of therapeutics in the treatment of various disease conditions. Current approaches include passive targeting, which relies on naturally occurring differences between healthy and diseased tissues, and active targeting, which utilizes various ligands that can recognize targets expressed preferentially at the diseased site. Clinical translation of these mechanisms faces many challenges including the immunogenic and toxic effects of these non-natural systems. Thus, use of endogenous targeting systems is increasingly gaining momentum. This review is focused on strategies for employing endogenous moieties, which could serve as safe and efficient carriers for targeted drug delivery. The first part of the review involves cells and cellular components as endogenous carriers for therapeutics in multiple disease states, while the second part discusses the use of endogenous plasma components as endogenous carriers. Further understanding of the biological tropism with cells and proteins and the newer generation of delivery strategies that exploits these endogenous approaches promises to provide better solutions for site-specific delivery and could further facilitate clinical translations. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Artificial Chemical Reporter Targeting Strategy Using Bioorthogonal Click Reaction for Improving Active-Targeting Efficiency of Tumor.

    Science.gov (United States)

    Yoon, Hong Yeol; Shin, Min Lee; Shim, Man Kyu; Lee, Sangmin; Na, Jin Hee; Koo, Heebeom; Lee, Hyukjin; Kim, Jong-Ho; Lee, Kuen Yong; Kim, Kwangmeyung; Kwon, Ick Chan

    2017-05-01

    Biological ligands such as aptamer, antibody, glucose, and peptide have been widely used to bind specific surface molecules or receptors in tumor cells or subcellular structures to improve tumor-targeting efficiency of nanoparticles. However, this active-targeting strategy has limitations for tumor targeting due to inter- and intraheterogeneity of tumors. In this study, we demonstrated an alternative active-targeting strategy using metabolic engineering and bioorthogonal click reaction to improve tumor-targeting efficiency of nanoparticles. We observed that azide-containing chemical reporters were successfully generated onto surface glycans of various tumor cells such as lung cancer (A549), brain cancer (U87), and breast cancer (BT-474, MDA-MB231, MCF-7) via metabolic engineering in vitro. In addition, we compared tumor targeting of artificial azide reporter with bicyclononyne (BCN)-conjugated glycol chitosan nanoparticles (BCN-CNPs) and integrin αvβ3 with cyclic RGD-conjugated CNPs (cRGD-CNPs) in vitro and in vivo. Fluorescence intensity of azide-reporter-targeted BCN-CNPs in tumor tissues was 1.6-fold higher and with a more uniform distribution compared to that of cRGD-CNPs. Moreover, even in the isolated heterogeneous U87 cells, BCN-CNPs could bind artificial azide reporters on tumor cells more uniformly (∼92.9%) compared to cRGD-CNPs. Therefore, the artificial azide-reporter-targeting strategy can be utilized for targeting heterogeneous tumor cells via bioorthogonal click reaction and may provide an alternative method of tumor targeting for further investigation in cancer therapy.

  2. Value of the Company and Marketing Metrics

    Directory of Open Access Journals (Sweden)

    André Luiz Ramos

    2013-12-01

    Full Text Available Thinking about marketing strategies from a resource-based perspective (Barney, 1991), which classifies assets as physical, organizational, and human, and from Constantin and Lusch's vision (1994), where strategic resources can be tangible or intangible, internal or external to the firm, suggests a research approach spanning Marketing and Finance. According to Srivastava, Shervani and Fahey (1998) there are three types of market assets, which generate firm value. Firm value can be measured by discounted cash flow, committing marketing activities to value-generation forecasts (Anderson, 1982; Day, Fahey, 1988; Doyle, 2000; Rust et al., 2004a). The economic value of marketing strategies and marketing metrics is drawing strategy researchers' and marketing managers' attention, making clear the need to build a bridge able to articulate marketing and finance from a strategic perspective. This article proposes an analytical framework based on different scientific approaches involving the risk and return promoted by marketing strategies, and points out advances concerning both methodological approaches and marketing strategies and their impact on firm metrics and value, using Srinivasan and Hanssens (2009) as a starting point.

  3. Relaxed metrics and indistinguishability operators: the relationship

    Energy Technology Data Exchange (ETDEWEB)

    Martin, J.

    2017-07-01

    In 1982, the notion of an indistinguishability operator was introduced by E. Trillas in order to fuzzify the crisp notion of an equivalence relation (\cite{Trillas}). In the study of such a class of operators, an outstanding property must be pointed out: there exists a duality relationship between indistinguishability operators and metrics. The aforesaid relationship was deeply studied by several authors, who introduced a few techniques to generate metrics from indistinguishability operators and vice versa (see, for instance, \cite{BaetsMesiar,BaetsMesiar2}). In recent years a new generalization of the metric notion has been introduced in the literature with the purpose of developing mathematical tools for quantitative models in Computer Science and Artificial Intelligence (\cite{BKMatthews,Ma}). The aforementioned generalized metrics are known as relaxed metrics. The main target of this talk is to present a study of the duality relationship between indistinguishability operators and relaxed metrics in such a way that the aforementioned classical techniques to generate both concepts, one from the other, can be extended to the new framework. (Author)
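A toy sketch of the classical duality the talk builds on: subtracting an indistinguishability operator from 1 yields a candidate metric. The operator E below is an illustrative fuzzy similarity on real numbers, not one from the cited literature, and the triangle inequality of the result rests on the usual Łukasiewicz-transitivity assumption:

```python
def metric_from_operator(E):
    """Classical duality: d(x, y) = 1 - E(x, y). When E is an
    indistinguishability operator transitive with respect to the
    Lukasiewicz t-norm, d satisfies the triangle inequality."""
    return lambda x, y: 1.0 - E(x, y)

# Illustrative fuzzy similarity on [0, 1]: closer numbers are more
# indistinguishable (an assumption of this sketch, not a cited operator).
E = lambda x, y: max(0.0, 1.0 - abs(x - y))
d = metric_from_operator(E)
# d(x, x) == 0 and d is symmetric; here d(x, y) == |x - y| for nearby points.
```

Relaxed metrics weaken one of the metric axioms (e.g., d(x, x) = 0), which is why the classical transformation has to be re-examined in the new framework.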

  4. Essays in Marketing Strategy: The Role of Customer Integration, Marketing Metrics, and Advertising Effectiveness

    OpenAIRE

    Ptok, Annette

    2017-01-01

    The dissertation, coauthored by Annette Ptok, addresses the overall topic of marketing strategy in three essays. Marketing strategy is a complex bundle of decisions about which markets and customer segments to target, as well as the communication and delivery of value to the customer, always within the available budget. Nowadays, there are several challenges managers need to tackle with regard to marketing strategy (Bhasin 2016). The most important cha...

  5. Comparison of luminance based metrics in different lighting conditions

    DEFF Research Database (Denmark)

    Wienold, J.; Kuhn, T.E.; Christoffersen, J.

    In this study, we evaluate established and newly developed metrics for predicting glare using data from three different research studies. The evaluation covers two different targets: 1. How well does the user's perception of glare magnitude correlate with the predictions of the glare metrics? 2. How well do the glare metrics describe the subjects' disturbance by glare? We applied Spearman correlations, logistic regressions and an accuracy evaluation based on an ROC analysis. The results show that five of the twelve investigated metrics fail at least one of the statistical tests. The other seven metrics, CGI, modified DGI, DGP, Ev, average luminance of the image Lavg, UGP and UGR, pass all statistical tests. DGP, CGI, modified DGI and UGP have the largest AUC and might be slightly more robust. The accuracy of the predictions of the aforementioned seven metrics for the disturbance by glare lies...

  6. Student Borrowing in America: Metrics, Demographics, Default Aversion Strategies

    Science.gov (United States)

    Kesterman, Frank

    2006-01-01

    The use of Cohort Default Rate (CDR) as the primary measure of student loan defaults among undergraduates was investigated. The study used data extracted from the National Student Loan Data System (NSLDS), quantitative analysis of Likert-scale survey responses from 153 student financial aid professionals on proposed changes to present metrics and…

  7. Advances in targeting strategies for nanoparticles in cancer imaging and therapy.

    Science.gov (United States)

    Yhee, Ji Young; Lee, Sangmin; Kim, Kwangmeyung

    2014-11-21

    In the last decade, nanoparticles have offered great advances in diagnostic imaging and targeted drug delivery. In particular, nanoparticles have enabled remarkable progress in cancer imaging and therapy based on materials science and biochemical engineering technology. Researchers have constantly attempted to develop nanoparticles that can deliver drugs more specifically to cancer cells, and these efforts have brought advances in nanoparticle targeting strategies. This minireview discusses the progress in targeting strategies for nanoparticles, focusing on recent innovative work in nanomedicine.

  8. SOCIAL METRICS APPLIED TO SMART TOURISM

    Directory of Open Access Journals (Sweden)

    O. Cervantes

    2016-09-01

    Full Text Available We present a strategy to make productive use of semantically-related social data, from a user-centered semantic network, in order to help users (tourists and citizens in general) to discover cultural heritage, points of interest and available services in a smart city. This data can be used to personalize recommendations in a smart tourism application. Our approach is based on flow centrality metrics typically used in social network analysis: flow betweenness, flow closeness and eccentricity. These metrics are useful to discover relevant nodes within the network, yielding nodes that can be interpreted as suggestions (venues or services) to users. We describe the semantic network built on a graph model, as well as the social metrics algorithms used to produce recommendations. We also present challenges and results from a prototypical implementation applied to the case study of the City of Puebla, Mexico.

  9. Social Metrics Applied to Smart Tourism

    Science.gov (United States)

    Cervantes, O.; Gutiérrez, E.; Gutiérrez, F.; Sánchez, J. A.

    2016-09-01

    We present a strategy to make productive use of semantically-related social data, from a user-centered semantic network, in order to help users (tourists and citizens in general) to discover cultural heritage, points of interest and available services in a smart city. This data can be used to personalize recommendations in a smart tourism application. Our approach is based on flow centrality metrics typically used in social network analysis: flow betweenness, flow closeness and eccentricity. These metrics are useful to discover relevant nodes within the network, yielding nodes that can be interpreted as suggestions (venues or services) to users. We describe the semantic network built on a graph model, as well as the social metrics algorithms used to produce recommendations. We also present challenges and results from a prototypical implementation applied to the case study of the City of Puebla, Mexico.
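Of the three centrality notions mentioned, eccentricity is the simplest to sketch. The toy venue network below is hypothetical; a production system would compute flow betweenness and flow closeness on the full semantic network:

```python
from collections import deque

def eccentricity(graph, node):
    """Greatest shortest-path (hop) distance from `node` to any reachable
    node, computed via breadth-first search."""
    dist = {node: 0}
    queue = deque([node])
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return max(dist.values())

# Hypothetical venue network as an undirected adjacency list; a venue
# with low eccentricity sits centrally and suits broad recommendations.
venues = {
    "museum":  ["plaza", "cafe"],
    "plaza":   ["museum", "cafe", "theatre"],
    "cafe":    ["museum", "plaza"],
    "theatre": ["plaza", "hotel"],
    "hotel":   ["theatre"],
}
central = min(venues, key=lambda v: eccentricity(venues, v))
```

The flow-based variants weight paths by how much "traffic" they can carry rather than by hop count alone, which better reflects relevance in a dense semantic network.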

  10. Long-term energy planning with uncertain environmental performance metrics

    International Nuclear Information System (INIS)

    Parkinson, Simon C.; Djilali, Ned

    2015-01-01

    Highlights:
    • Environmental performance uncertainty considered in a long-term energy planning model.
    • Application to electricity generation planning in British Columbia.
    • Interactions with climate change mitigation and adaptation strategy are assessed.
    • Performance risk-hedging impacts the technology investment strategy.
    • Sensitivity of results to model formulation is discussed.
    Abstract: Environmental performance (EP) uncertainties span a number of energy technology options and pose planning risk when the energy system is subject to environmental constraints. This paper presents two approaches to integrating EP uncertainty into the long-term energy planning framework. The methodologies consider stochastic EP metrics across multiple energy technology options and produce a development strategy that hedges against the risk of exceeding environmental targets. Both methods are compared within a case study of emission-constrained electricity generation planning in British Columbia, Canada. The analysis provides important insight into model formulation and the interactions with concurrent environmental policy uncertainties. EP risk is found to be particularly important in situations where environmental constraints become increasingly stringent. Model results indicate that allocating a modest risk premium in these situations can provide valuable hedging against EP risk.

  11. hdm: High-dimensional metrics

    OpenAIRE

    Chernozhukov, Victor; Hansen, Christian; Spindler, Martin

    2016-01-01

    In this article the package High-dimensional Metrics (hdm) is introduced. It is a collection of statistical methods for estimation and quantification of uncertainty in high-dimensional approximately sparse models. It focuses on providing confidence intervals and significance testing for (possibly many) low-dimensional subcomponents of the high-dimensional parameter vector. Efficient estimators and uniformly valid confidence intervals for regression coefficients on target variables (e...

  12. Anti-Authoritarian Metrics: Recursivity as a strategy for post-capitalism

    Directory of Open Access Journals (Sweden)

    David Adam Banks

    2016-12-01

    Full Text Available This essay proposes that those seeking to build counter-power institutions and communities learn to think in terms of what I call "recursivity." Recursivity is an anti-authoritarian metric that helps bring about a sensitivity to feedback loops at multiple levels of organization. I begin by describing how technological systems and the socio-economic order co-constitute one another around efficiency metrics. I then go on to define recursivity as social conditions that contain within them all of the parts and practices needed for their maturation and expansion, and show how organizations that demonstrate recursivity, like the historical English commons, have been marginalized or destroyed altogether. Finally, I show how the ownership of property is inherently antithetical to the closed loops of recursivity. All of this is bookended by a study of urban planning's recursive beginnings.

  13. WE-AB-BRA-01: 3D-2D Image Registration for Target Localization in Spine Surgery: Comparison of Similarity Metrics Against Robustness to Content Mismatch

    International Nuclear Information System (INIS)

    De Silva, T; Ketcha, M; Siewerdsen, J H; Uneri, A; Reaungamornrat, S; Vogt, S; Kleinszig, G; Lo, S F; Wolinsky, J P; Gokaslan, Z L; Aygun, N

    2015-01-01

    Purpose: In image-guided spine surgery, mapping 3D preoperative images to 2D intraoperative images via 3D-2D registration can provide valuable assistance in target localization. However, the presence of surgical instrumentation, hardware implants, and soft-tissue resection/displacement causes mismatches in image content, confounding existing registration methods. Manual and semi-automatic methods to mask such extraneous content are time-consuming, user-dependent, error-prone, and disruptive to clinical workflow. We developed and evaluated two novel similarity metrics within a robust registration framework to overcome such challenges in target localization. Methods: An IRB-approved retrospective study in 19 spine surgery patients included 19 preoperative 3D CT images and 50 intraoperative mobile radiographs in cervical, thoracic, and lumbar spine regions. A neuroradiologist provided truth definition of vertebral positions in CT and radiography. 3D-2D registration was performed using the CMA-ES optimizer with 4 gradient-based image similarity metrics: (1) gradient information (GI); (2) gradient correlation (GC); (3) a novel variant referred to as gradient orientation (GO); and (4) a second variant referred to as truncated gradient correlation (TGC). Registration accuracy was evaluated in terms of the projection distance error (PDE) of the vertebral levels. Results: Conventional similarity metrics were susceptible to gross registration error and failure modes associated with the presence of surgical instrumentation: for GI, the median PDE and interquartile range were 33.0±43.6 mm; similarly, for GC, PDE = 23.0±92.6 mm. The robust metrics GO and TGC, on the other hand, demonstrated major improvement in PDE (7.6±9.4 mm and 8.1±18.1 mm, respectively) and elimination of gross failure modes. Conclusion: The proposed GO and TGC similarity measures improve registration accuracy and robustness to gross failure in the presence of strong image content mismatch. Such
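Projection distance error itself is straightforward to compute once a projection matrix is available; the sketch below uses a deliberately trivial camera matrix and made-up points, not the study's geometry:

```python
def project(P, X):
    """Project 3D point X through a 3x4 camera matrix P (homogeneous divide)."""
    Xh = list(X) + [1.0]
    u, v, w = (sum(P[i][j] * Xh[j] for j in range(4)) for i in range(3))
    return (u / w, v / w)

def projection_distance_error(P, X, x_true):
    """Euclidean distance between the projection of 3D point X and the
    annotated 2D truth position x_true (e.g., a vertebral centroid)."""
    px, py = project(P, X)
    return ((px - x_true[0]) ** 2 + (py - x_true[1]) ** 2) ** 0.5

# Trivial camera for illustration: unit focal length, no principal-point offset.
P = [[1.0, 0.0, 0.0, 0.0],
     [0.0, 1.0, 0.0, 0.0],
     [0.0, 0.0, 1.0, 0.0]]
err = projection_distance_error(P, (10.0, 5.0, 2.0), (5.2, 2.4))
```

In the study's evaluation, P comes from the registered pose, X from the preoperative CT annotation, and x_true from the radiograph annotation; the reported medians aggregate this distance over vertebral levels.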

  14. WE-AB-BRA-01: 3D-2D Image Registration for Target Localization in Spine Surgery: Comparison of Similarity Metrics Against Robustness to Content Mismatch

    Energy Technology Data Exchange (ETDEWEB)

    De Silva, T; Ketcha, M; Siewerdsen, J H [Department of Biomedical Engineering, Johns Hopkins University, Baltimore, MD (United States); Uneri, A; Reaungamornrat, S [Department of Computer Science, Johns Hopkins University, Baltimore, MD (United States); Vogt, S; Kleinszig, G [Siemens Healthcare XP Division, Erlangen, DE (Germany); Lo, S F; Wolinsky, J P; Gokaslan, Z L [Department of Neurosurgery, The Johns Hopkins Hospital, Baltimore, MD (United States); Aygun, N [Department of Raiology and Radiological Sciences, The Johns Hopkins Hospital, Baltimore, MD (United States)

    2015-06-15

    Purpose: In image-guided spine surgery, mapping 3D preoperative images to 2D intraoperative images via 3D-2D registration can provide valuable assistance in target localization. However, the presence of surgical instrumentation, hardware implants, and soft-tissue resection/displacement causes mismatches in image content, confounding existing registration methods. Manual/semi-automatic methods to mask such extraneous content are time-consuming, user-dependent, error-prone, and disruptive to clinical workflow. We developed and evaluated two novel similarity metrics within a robust registration framework to overcome such challenges in target localization. Methods: An IRB-approved retrospective study in 19 spine surgery patients included 19 preoperative 3D CT images and 50 intraoperative mobile radiographs in cervical, thoracic, and lumbar spine regions. A neuroradiologist provided truth definition of vertebral positions in CT and radiography. 3D-2D registration was performed using the CMA-ES optimizer with four gradient-based image similarity metrics: (1) gradient information (GI); (2) gradient correlation (GC); (3) a novel variant referred to as gradient orientation (GO); and (4) a second variant referred to as truncated gradient correlation (TGC). Registration accuracy was evaluated in terms of the projection distance error (PDE) of the vertebral levels. Results: Conventional similarity metrics were susceptible to gross registration error and failure modes associated with the presence of surgical instrumentation: the median PDE (± interquartile range) was 33.0 ± 43.6 mm for GI and 23.0 ± 92.6 mm for GC. The robust metrics GO and TGC, on the other hand, demonstrated major improvement in PDE (7.6 ± 9.4 mm and 8.1 ± 18.1 mm, respectively) and elimination of gross failure modes. Conclusion: The proposed GO and TGC similarity measures improve registration accuracy and robustness to gross failure in the presence of strong image content mismatch. Such
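    As a rough illustration of the gradient-based similarity family compared in this abstract, the sketch below computes a plain gradient correlation (GC) between two 2D images in Python/NumPy. The function name and exact normalization are illustrative assumptions, not the authors' implementation; the GO and TGC variants differ in how they weight or truncate mismatched gradients.

```python
import numpy as np

def gradient_correlation(fixed, moving, eps=1e-8):
    """Normalized cross-correlation of image gradients (one common form of GC).

    `fixed` and `moving` are 2D arrays of equal shape, e.g. an intraoperative
    radiograph and a digitally reconstructed radiograph from the 3D CT.
    Returns a value in [-1, 1]; 1 means perfectly correlated gradients.
    """
    gc = 0.0
    for axis in (0, 1):  # correlate row- and column-gradients separately
        gf = np.gradient(fixed, axis=axis)
        gm = np.gradient(moving, axis=axis)
        gf = gf - gf.mean()
        gm = gm - gm.mean()
        denom = np.sqrt((gf ** 2).sum() * (gm ** 2).sum()) + eps
        gc += (gf * gm).sum() / denom
    return gc / 2.0  # average over the two gradient directions
```

    Identical images score near 1; content mismatch (e.g. instrumentation present in the radiograph but absent from the CT projection) pulls the score down, which is what motivates the robust GO/TGC variants described above.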

  15. Targeting Strategies for Multifunctional Nanoparticles in Cancer Imaging and Therapy

    Science.gov (United States)

    Yu, Mi Kyung; Park, Jinho; Jon, Sangyong

    2012-01-01

    Nanomaterials offer new opportunities for cancer diagnosis and treatment. Multifunctional nanoparticles harboring various functions, including targeting, imaging, and therapy, have been intensively studied with the aim of overcoming limitations associated with conventional cancer diagnosis and therapy. Among various nanoparticles, magnetic iron oxide nanoparticles with superparamagnetic properties have shown potential as multifunctional nanoparticles for clinical translation because they have been used as magnetic resonance imaging (MRI) contrast agents in the clinic and their features can be easily tailored by including targeting moieties, fluorescence dyes, or therapeutic agents. This review summarizes targeting strategies for the construction of multifunctional nanoparticles, including magnetic nanoparticle-based theranostic systems, and the various surface engineering strategies of nanoparticles for in vivo applications. PMID:22272217

  16. Assessing the metrics of climate change. Current methods and future possibilities

    International Nuclear Information System (INIS)

    Fuglestveit, Jan S.; Berntsen, Terje K.; Godal, Odd; Sausen, Robert; Shine, Keith P.; Skodvin, Tora

    2001-01-01

    With the principle of comprehensiveness embedded in the UN Framework Convention on Climate Change (Art. 3), a multi-gas abatement strategy with emphasis also on non-CO2 greenhouse gases as targets for reduction and control measures has been adopted in the international climate regime. In the Kyoto Protocol, the comprehensive approach is made operative as the aggregate anthropogenic carbon dioxide equivalent emissions of six specified greenhouse gases or groups of gases (Art. 3). With this operationalisation, the emissions of a set of greenhouse gases with very different atmospheric lifetimes and radiative properties are transformed into one common unit - CO2 equivalents. This transformation is based on the Global Warming Potential (GWP) index, which in turn is based on the concept of radiative forcing. The GWP metric and its application in policy making have been debated, and several other alternative concepts have been suggested. In this paper, we review existing and alternative metrics of climate change, with particular emphasis on radiative forcing and GWPs, in terms of their scientific performance. This assessment focuses on questions such as the climate impact (end point) against which gases are weighted; the extent to which and how temporality is included, both with regard to emission control and with regard to climate impact; how cost issues are dealt with; and the sensitivity of the metrics to various assumptions. It is concluded that the radiative forcing concept is a robust and useful metric of the potential climatic impact of various agents and that there are prospects for improvement by weighing different forcings according to their effectiveness. We also find that although the GWP concept is associated with serious shortcomings, it retains advantages over any of the proposed alternatives in terms of political feasibility.
Alternative metrics, however, make a significant contribution to addressing important issues, and this contribution should be taken
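    The CO2-equivalent aggregation described in this abstract reduces to multiplying each gas's emissions by its GWP over a chosen time horizon and summing. A minimal sketch follows; the GWP100 values below are illustrative placeholders, not authoritative IPCC figures, and they change between assessment cycles.

```python
# Illustrative 100-year GWP values; real assessments take these from
# IPCC reports, and the numbers are revised between assessment cycles.
GWP100 = {"CO2": 1.0, "CH4": 25.0, "N2O": 298.0}

def co2_equivalents(emissions_tonnes):
    """Aggregate a multi-gas emission inventory (tonnes per gas) into a
    single figure in tonnes of CO2-equivalent, as under the Kyoto Protocol."""
    return sum(GWP100[gas] * tonnes for gas, tonnes in emissions_tonnes.items())
```

    This single-number aggregation is exactly what the paper interrogates: gases with very different atmospheric lifetimes are collapsed into one unit, and the result depends strongly on the chosen time horizon.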

  17. Assessing the metrics of climate change. Current methods and future possibilities

    Energy Technology Data Exchange (ETDEWEB)

    Fuglestveit, Jan S.; Berntsen, Terje K.; Godal, Odd; Sausen, Robert; Shine, Keith P.; Skodvin, Tora

    2001-07-01

    With the principle of comprehensiveness embedded in the UN Framework Convention on Climate Change (Art. 3), a multi-gas abatement strategy with emphasis also on non-CO2 greenhouse gases as targets for reduction and control measures has been adopted in the international climate regime. In the Kyoto Protocol, the comprehensive approach is made operative as the aggregate anthropogenic carbon dioxide equivalent emissions of six specified greenhouse gases or groups of gases (Art. 3). With this operationalisation, the emissions of a set of greenhouse gases with very different atmospheric lifetimes and radiative properties are transformed into one common unit - CO2 equivalents. This transformation is based on the Global Warming Potential (GWP) index, which in turn is based on the concept of radiative forcing. The GWP metric and its application in policy making have been debated, and several other alternative concepts have been suggested. In this paper, we review existing and alternative metrics of climate change, with particular emphasis on radiative forcing and GWPs, in terms of their scientific performance. This assessment focuses on questions such as the climate impact (end point) against which gases are weighted; the extent to which and how temporality is included, both with regard to emission control and with regard to climate impact; how cost issues are dealt with; and the sensitivity of the metrics to various assumptions. It is concluded that the radiative forcing concept is a robust and useful metric of the potential climatic impact of various agents and that there are prospects for improvement by weighing different forcings according to their effectiveness. We also find that although the GWP concept is associated with serious shortcomings, it retains advantages over any of the proposed alternatives in terms of political feasibility.
Alternative metrics, however, make a significant contribution to addressing important issues, and this contribution should be taken

  18. Putting health metrics into practice: using the disability-adjusted life year for strategic decision making.

    Science.gov (United States)

    Longfield, Kim; Smith, Brian; Gray, Rob; Ngamkitpaiboon, Lek; Vielot, Nadja

    2013-01-01

    Implementing organizations are pressured to be accountable for performance. Many health impact metrics present limitations for priority setting; they do not permit comparisons across different interventions or health areas. In response, Population Services International (PSI) adopted the disability-adjusted life year (DALY) averted as its bottom-line performance metric. While international standards exist for calculating DALYs to determine burden of disease (BOD), PSI's use of DALYs averted is novel. It uses DALYs averted to assess and compare the health impact of its country programs, and to understand the effectiveness of a portfolio of interventions. This paper describes how the adoption of DALYs averted influenced organizational strategy and presents the advantages and constraints of using the metric. Health impact data from 2001 to 2011 were analyzed by program area and geographic region to measure PSI's performance against its goal of doubling health impact between 2007 and 2011. Analyzing 10 years of data permitted comparison with previous years' performance. A case study of PSI's Asia and Eastern European (A/EE) region, and PSI/Laos, is presented to illustrate how the adoption of DALYs averted affected strategic decision making. Between 2007 and 2011, PSI's programs averted twice as many DALYs as during 2002-2006. Most DALYs averted were within malaria, followed by HIV/AIDS and family planning (FP). The performance of PSI's A/EE region relative to other regions declined with the switch to DALYs averted. As a result, the region made a strategic shift to align its work with countries' BOD. In PSI/Laos, this redirection led to better-targeted programs and an approximate 50% gain in DALYs averted from 2009 to 2011. PSI's adoption of DALYs averted shifted the organization's strategic direction away from product sales and toward BOD. Now, many strategic decisions are based on "BOD-relevance," the share of the BOD that interventions can potentially address. 
This switch
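    The DALY metric behind this abstract has a standard undiscounted form: years of life lost to premature mortality (YLL) plus years lived with disability (YLD). The sketch below shows that textbook formula only; it is not PSI's calculation, which additionally depends on program attribution and coverage assumptions.

```python
def dalys(deaths, remaining_life_expectancy, cases, disability_weight, duration_years):
    """Undiscounted DALYs = YLL + YLD.

    YLL = deaths x standard remaining life expectancy at age of death (years).
    YLD = incident cases x disability weight (0..1) x average duration (years).
    """
    yll = deaths * remaining_life_expectancy
    yld = cases * disability_weight * duration_years
    return yll + yld
```

    For example, 2 deaths at 30 remaining life-years each plus 100 cases of a condition with disability weight 0.2 lasting 5 years gives 60 + 100 = 160 DALYs; a program preventing those outcomes would claim 160 DALYs averted.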

  19. Characterising risk - aggregated metrics: radiation and noise

    International Nuclear Information System (INIS)

    Passchier, W.

    1998-01-01

    The characterisation of risk is an important phase in the risk assessment - risk management process. From the multitude of risk attributes, a few have to be selected to obtain a risk characteristic or profile that is useful for risk management decisions and implementation of protective measures. One way to reduce the number of attributes is aggregation. In the field of radiation protection such an aggregated metric is firmly established: effective dose. For protection against environmental noise, the Health Council of the Netherlands recently proposed a set of aggregated metrics for noise annoyance and sleep disturbance. The presentation will discuss similarities and differences between these two metrics and practical limitations. The effective dose has proven its usefulness in designing radiation protection measures, which are related to the level of risk associated with the radiation practice in question, given that implicit judgements on radiation induced health effects are accepted. However, as the metric does not take into account the nature of the radiation practice, it is less useful in policy discussions on the benefits and harm of radiation practices. With respect to the noise exposure metric, only one effect is targeted (annoyance), and the differences between sources are explicitly taken into account. This should make the metric useful in policy discussions with respect to physical planning and siting problems. The proposed metric has significance only at the population level and cannot be used as a predictor of individual risk. (author)

  20. Drug Target Interference in Immunogenicity Assays: Recommendations and Mitigation Strategies.

    Science.gov (United States)

    Zhong, Zhandong Don; Clements-Egan, Adrienne; Gorovits, Boris; Maia, Mauricio; Sumner, Giane; Theobald, Valerie; Wu, Yuling; Rajadhyaksha, Manoj

    2017-11-01

    Sensitive and specific methodology is required for the detection and characterization of anti-drug antibodies (ADAs). High-quality ADA data enables the evaluation of potential impact of ADAs on the drug pharmacokinetic profile, patient safety, and efficacious response to the drug. Immunogenicity assessments are typically initiated at early stages in preclinical studies and continue throughout the drug development program. One of the potential bioanalytical challenges encountered with ADA testing is the need to identify and mitigate the interference mediated by the presence of soluble drug target. A drug target, when present at sufficiently high circulating concentrations, can potentially interfere with the performance of ADA and neutralizing antibody (NAb) assays, leading to either false-positive or, in some cases, false-negative ADA and NAb assay results. This publication describes various mechanisms of assay interference by soluble drug target, as well as strategies to recognize and mitigate such target interference. Pertinent examples are presented to illustrate the impact of target interference on ADA and NAb assays as well as several mitigation strategies, including the use of anti-target antibodies, soluble versions of the receptors, target-binding proteins, lectins, and solid-phase removal of targets. Furthermore, recommendations for detection and mitigation of such interference in different formats of ADA and NAb assays are provided.

  1. Networks and centroid metrics for understanding football

    African Journals Online (AJOL)

    Gonçalo Dias

    games. However, it seems that the centroid metric, supported only by the position of players in the field ...... the strategy adopted by the coach (Gama et al., 2014). ... centroid distance as measures of team's tactical performance in youth football.

  2. [Clinical trial data management and quality metrics system].

    Science.gov (United States)

    Chen, Zhao-hua; Huang, Qin; Deng, Ya-zhong; Zhang, Yue; Xu, Yu; Yu, Hao; Liu, Zong-fan

    2015-11-01

    A data quality management system is essential to ensure accurate, complete, consistent, and reliable data collection in clinical research. This paper is devoted to various choices of data quality metrics. They are categorized by study status, e.g. study start-up, conduct, and close-out. In each category, metrics for different purposes are listed according to ALCOA+ principles such as completeness, accuracy, timeliness, and traceability. Some frequently used general quality metrics are also introduced. This paper provides as much detail as possible for each metric, including its definition, purpose, evaluation, referenced benchmark, and recommended targets, in support of real practice. It is important that sponsors and data management service providers establish a robust, integrated clinical trial data quality management system to ensure sustainable high quality of clinical trial deliverables. It will also support enterprise-level data evaluation and benchmarking of data quality across projects, sponsors, and data management service providers by using objective metrics from real clinical trials. We hope this will be a significant input to accelerate the improvement of clinical trial data quality in the industry.

  3. Evaluating which plan quality metrics are appropriate for use in lung SBRT.

    Science.gov (United States)

    Yaparpalvi, Ravindra; Garg, Madhur K; Shen, Jin; Bodner, William R; Mynampati, Dinesh K; Gafar, Aleiya; Kuo, Hsiang-Chi; Basavatia, Amar K; Ohri, Nitin; Hong, Linda X; Kalnicki, Shalom; Tome, Wolfgang A

    2018-02-01

    Several dose metrics in the categories homogeneity, coverage, conformity, and gradient have been proposed in the literature for evaluating treatment plan quality. In this study, we applied these metrics to characterize and identify the plan quality metrics that would merit plan quality assessment in lung stereotactic body radiation therapy (SBRT) dose distributions. Treatment plans of 90 lung SBRT patients, comprising 91 targets, treated in our institution were retrospectively reviewed. Dose calculations were performed using the anisotropic analytical algorithm (AAA) with heterogeneity correction. A literature review on published plan quality metrics in the categories coverage, homogeneity, conformity, and gradient was performed. For each patient, using dose-volume histogram data, plan quality metric values were quantified and analysed. For the study, the Radiation Therapy Oncology Group (RTOG)-defined plan quality metrics were: coverage (0.90 ± 0.08); homogeneity (1.27 ± 0.07); conformity (1.03 ± 0.07) and gradient (4.40 ± 0.80). Geometric conformity strongly correlated with conformity index (p plan quality guidelines-coverage % (ICRU 62), conformity (CN or CI Paddick) and gradient (R 50%). Furthermore, we strongly recommend that RTOG lung SBRT protocols adopt either CN or CI Paddick in place of the prescription isodose to target volume ratio for conformity index evaluation. Advances in knowledge: Our study metrics are valuable tools for establishing lung SBRT plan quality guidelines.

  4. Description of the Sandia National Laboratories science, technology & engineering metrics process.

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, Gretchen B.; Watkins, Randall D.; Trucano, Timothy Guy; Burns, Alan Richard; Oelschlaeger, Peter

    2010-04-01

    There has been a concerted effort since 2007 to establish a dashboard of metrics for the Science, Technology, and Engineering (ST&E) work at Sandia National Laboratories. These metrics are to provide a self assessment mechanism for the ST&E Strategic Management Unit (SMU) to complement external expert review and advice and various internal self assessment processes. The data and analysis will help ST&E Managers plan, implement, and track strategies and work in order to support the critical success factors of nurturing core science and enabling laboratory missions. The purpose of this SAND report is to provide a guide for those who want to understand the ST&E SMU metrics process. This report provides an overview of why the ST&E SMU wants a dashboard of metrics, some background on metrics for ST&E programs from existing literature and past Sandia metrics efforts, a summary of work completed to date, specifics on the portfolio of metrics that have been chosen and the implementation process that has been followed, and plans for the coming year to improve the ST&E SMU metrics process.

  5. A comparison of prostate tumor targeting strategies using magnetic resonance imaging-targeted, transrectal ultrasound-guided fusion biopsy.

    Science.gov (United States)

    Martin, Peter R; Cool, Derek W; Fenster, Aaron; Ward, Aaron D

    2018-03-01

    Magnetic resonance imaging (MRI)-targeted, three-dimensional (3D) transrectal ultrasound (TRUS)-guided prostate biopsy aims to reduce the 21-47% false-negative rate of clinical two-dimensional (2D) TRUS-guided systematic biopsy, but continues to yield false-negative results. This may be improved via needle target optimization, accounting for guidance system errors and image registration errors. As an initial step toward the goal of optimized prostate biopsy targeting, we investigated how needle delivery error impacts tumor sampling probability for two targeting strategies. We obtained MRI and 3D TRUS images from 49 patients. A radiologist and radiology resident assessed these MR images and contoured 81 suspicious regions, yielding tumor surfaces that were registered to 3D TRUS. The biopsy system's root-mean-squared needle delivery error (RMSE) and systematic error were modeled using an isotropic 3D Gaussian distribution. We investigated two different prostate tumor-targeting strategies using (a) the tumor's centroid and (b) a ring in the lateral-elevational plane. For each simulation, targets were spaced at equal arc lengths on a ring with radius equal to the systematic error magnitude. A total of 1000 biopsy simulations were conducted for each tumor, with RMSE and systematic error magnitudes ranging from 1 to 6 mm. The difference in median tumor sampling probability and probability of obtaining a 50% core involvement was determined for ring vs centroid targeting. Our simulation results indicate that ring targeting outperformed centroid targeting in situations where systematic error exceeds RMSE. In these instances, we observed statistically significant differences showing 1-32% improvement in sampling probability due to ring targeting. Likewise, we observed statistically significant differences showing 1-39% improvement in 50% core involvement probability due to ring targeting. Our results suggest that the optimal targeting scheme for prostate biopsy depends on
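    The Monte Carlo comparison described in this abstract can be sketched as follows. The function below is an illustrative reconstruction under the abstract's stated error model (isotropic 3D Gaussian random error plus a fixed systematic offset, here with a spherical tumor); it is not the authors' code, and the function and variable names are assumptions.

```python
import numpy as np

def sampling_probability(targets, tumor_center, tumor_radius,
                         rmse, systematic, n_trials=1000, seed=0):
    """Fraction of simulated biopsies whose needle tip lands inside a
    spherical tumor. Each trial picks one planned target, then perturbs it
    by a fixed systematic offset plus isotropic Gaussian random error whose
    3D RMSE is `rmse` (per-axis sigma = rmse / sqrt(3))."""
    rng = np.random.default_rng(seed)
    sigma = rmse / np.sqrt(3.0)
    hits = 0
    for _ in range(n_trials):
        target = targets[rng.integers(len(targets))]
        tip = target + systematic + rng.normal(0.0, sigma, size=3)
        hits += np.linalg.norm(tip - tumor_center) <= tumor_radius
    return hits / n_trials

# Centroid vs ring targeting: ring radius equals the systematic error
# magnitude, matching the paper's target-placement scheme.
center = np.zeros(3)
offset = np.array([3.0, 0.0, 0.0])  # 3 mm systematic guidance error
ring = [3.0 * np.array([np.cos(a), np.sin(a), 0.0])
        for a in np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)]
p_centroid = sampling_probability([center], center, 5.0, rmse=1.0, systematic=offset)
p_ring = sampling_probability(ring, center, 5.0, rmse=1.0, systematic=offset)
```

    Varying `rmse` and the systematic magnitude over a grid (1-6 mm in the study) reproduces the kind of comparison reported above, where ring targeting gains over centroid targeting once systematic error exceeds RMSE.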

  6. High-Dimensional Metrics in R

    OpenAIRE

    Chernozhukov, Victor; Hansen, Chris; Spindler, Martin

    2016-01-01

    The package High-dimensional Metrics (hdm) is an evolving collection of statistical methods for estimation and quantification of uncertainty in high-dimensional approximately sparse models. It focuses on providing confidence intervals and significance testing for (possibly many) low-dimensional subcomponents of the high-dimensional parameter vector. Efficient estimators and uniformly valid confidence intervals for regression coefficients on target variables (e.g., treatment or poli...

  7. 78 FR 14121 - Notice of Availability of Funds and Solicitation for Grant Applications for Strategies Targeting...

    Science.gov (United States)

    2013-03-04

    ... Solicitation for Grant Applications for Strategies Targeting Characteristics Common to Female Ex-Offenders... will be targeted to females, but must also be open to eligible male ex-offenders. Strategies Targeting... period of performance. These grants will include an integrated strategy of recruitment and assessment...

  8. 3D-2D image registration for target localization in spine surgery: investigation of similarity metrics providing robustness to content mismatch

    Science.gov (United States)

    De Silva, T.; Uneri, A.; Ketcha, M. D.; Reaungamornrat, S.; Kleinszig, G.; Vogt, S.; Aygun, N.; Lo, S.-F.; Wolinsky, J.-P.; Siewerdsen, J. H.

    2016-04-01

    In image-guided spine surgery, robust three-dimensional to two-dimensional (3D-2D) registration of preoperative computed tomography (CT) and intraoperative radiographs can be challenged by the image content mismatch associated with the presence of surgical instrumentation and implants as well as soft-tissue resection or deformation. This work investigates image similarity metrics in 3D-2D registration offering improved robustness against mismatch, thereby improving performance and reducing or eliminating the need for manual masking. The performance of four gradient-based image similarity metrics (gradient information (GI), gradient correlation (GC), gradient information with linear scaling (GS), and gradient orientation (GO)) with a multi-start optimization strategy was evaluated in an institutional review board-approved retrospective clinical study using 51 preoperative CT images and 115 intraoperative mobile radiographs. Registrations were tested with and without polygonal masks as a function of the number of multistarts employed during optimization. Registration accuracy was evaluated in terms of the projection distance error (PDE) and assessment of failure modes (PDE > 30 mm) that could impede reliable vertebral level localization. With manual polygonal masking and 200 multistarts, the GC and GO metrics exhibited robust performance with 0% gross failures and median PDE (interquartile range (IQR)) and a median runtime of 84 s (plus upwards of 1-2 min for manual masking). Excluding manual polygonal masks and decreasing the number of multistarts to 50 caused the GC-based registration to fail at a rate of >14%; however, GO maintained robustness with a 0% gross failure rate. Overall, the GI, GC, and GS metrics were susceptible to registration errors associated with content mismatch, but GO provided robust registration (median PDE = 5.5 mm, 2.6 mm IQR) without manual masking and with an improved runtime (29.3 s). The GO metric

  9. Future of the PCI Readmission Metric.

    Science.gov (United States)

    Wasfy, Jason H; Yeh, Robert W

    2016-03-01

    Between 2013 and 2014, the Centers for Medicare and Medicaid Services and the National Cardiovascular Data Registry publicly reported risk-adjusted 30-day readmission rates after percutaneous coronary intervention (PCI) as a pilot project. A key strength of this public reporting effort included risk adjustment with clinical rather than administrative data. Furthermore, because readmission after PCI is common, expensive, and preventable, this metric has substantial potential to improve quality and value in American cardiology care. Despite this, concerns about the metric exist. For example, few PCI readmissions are caused by procedural complications, limiting the extent to which improved procedural technique can reduce readmissions. Also, similar to other readmission measures, PCI readmission is associated with socioeconomic status and race. Accordingly, the metric may unfairly penalize hospitals that care for underserved patients. Perhaps in the context of these limitations, the Centers for Medicare and Medicaid Services has not yet included PCI readmission among metrics that determine Medicare financial penalties. Nevertheless, provider organizations may still wish to focus on this metric to improve value for cardiology patients. PCI readmission is associated with low-risk chest discomfort and patient anxiety. Therefore, patient education, improved triage mechanisms, and improved care coordination offer opportunities to minimize PCI readmissions. Because PCI readmission is common and costly, reducing PCI readmission offers provider organizations a compelling target to improve the quality of care, and also performance in contracts involving shared financial risk. © 2016 American Heart Association, Inc.

  10. Sharp metric obstructions for quasi-Einstein metrics

    Science.gov (United States)

    Case, Jeffrey S.

    2013-02-01

    Using the tractor calculus to study smooth metric measure spaces, we adapt results of Gover and Nurowski to give sharp metric obstructions to the existence of quasi-Einstein metrics on suitably generic manifolds. We do this by introducing an analogue of the Weyl tractor W to the setting of smooth metric measure spaces. The obstructions we obtain can be realized as tensorial invariants which are polynomial in the Riemann curvature tensor and its divergence. By taking suitable limits of their tensorial forms, we then find obstructions to the existence of static potentials, generalizing to higher dimensions a result of Bartnik and Tod, and to the existence of potentials for gradient Ricci solitons.

  11. Stakeholder analysis and mapping as targeted communication strategy.

    Science.gov (United States)

    Shirey, Maria R

    2012-09-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches helpful to nurse leaders advancing organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the author highlights the importance of stakeholder theory and discusses how to apply the theory to conduct a stakeholder analysis. This article also provides an explanation of how to use related stakeholder mapping techniques with targeted communication strategies.

  12. Design strategies for self-assembly of discrete targets

    International Nuclear Information System (INIS)

    Madge, Jim; Miller, Mark A.

    2015-01-01

    Both biological and artificial self-assembly processes can take place by a range of different schemes, from the successive addition of identical building blocks to hierarchical sequences of intermediates, all the way to the fully addressable limit in which each component is unique. In this paper, we introduce an idealized model of cubic particles with patterned faces that allows self-assembly strategies to be compared and tested. We consider a simple octameric target, starting with the minimal requirements for successful self-assembly and comparing the benefits and limitations of more sophisticated hierarchical and addressable schemes. Simulations are performed using a hybrid dynamical Monte Carlo protocol that allows self-assembling clusters to rearrange internally while still providing Stokes-Einstein-like diffusion of aggregates of different sizes. Our simulations explicitly capture the thermodynamic, dynamic, and steric challenges typically faced by self-assembly processes, including competition between multiple partially completed structures. Self-assembly pathways are extracted from the simulation trajectories by a fully extendable scheme for identifying structural fragments, which are then assembled into history diagrams for successfully completed target structures. For the simple target, a one-component assembly scheme is most efficient and robust overall, but hierarchical and addressable strategies can have an advantage under some conditions if high yield is a priority

  13. Targeting poverty : lessons from monitoring Ireland's National Anti-Poverty Strategy

    OpenAIRE

    Layte, Richard; Nolan, Brian; Whelan, Christopher T.

    2000-01-01

    In 1997 the Irish government adopted the National Anti-Poverty Strategy (NAPS), a global target for the reduction of poverty which illuminates a range of issues relating to official poverty targets. The Irish target is framed in terms of a relative poverty measure incorporating both relative income and direct measures of deprivation based on data on the extent of poverty from 1994. Since 1994 Ireland has experienced an unprecedented period of economic growth that makes it particularly importa...

  14. Prioritizing Urban Habitats for Connectivity Conservation: Integrating Centrality and Ecological Metrics.

    Science.gov (United States)

    Poodat, Fatemeh; Arrowsmith, Colin; Fraser, David; Gordon, Ascelin

    2015-09-01

    Connectivity among fragmented areas of habitat has long been acknowledged as important for the viability of biological conservation, especially within highly modified landscapes. Identifying important habitat patches in ecological connectivity is a priority for many conservation strategies, and the application of 'graph theory' has been shown to provide useful information on connectivity. Despite the large number of metrics for connectivity derived from graph theory, only a small number have been compared in terms of the importance they assign to nodes in a network. This paper presents a study that aims to define a new set of metrics and compares these with traditional graph-based metrics used in the prioritization of habitat patches for ecological connectivity. The metrics measured consist of "topological" metrics, "ecological" metrics, and "integrated" metrics. Integrated metrics are a combination of topological and ecological metrics. Eight metrics were applied to the habitat network for the fat-tailed dunnart within Greater Melbourne, Australia. A non-directional network was developed in which nodes were linked to adjacent nodes. These links were then weighted by the effective distance between patches. By applying each of the eight metrics to the study network, nodes were ranked according to their contribution to the overall network connectivity. The structured comparison revealed the similarities and differences in the way the habitat for the fat-tailed dunnart was ranked based on different classes of metrics. Due to the differences in the way the metrics operate, a suitable metric should be chosen that best meets the objectives established by the decision maker.
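    The "integrated" idea of combining a topological centrality with an ecological patch score can be sketched in a few lines. The node names, the closeness formula, and the multiplicative quality weighting below are illustrative choices, not the paper's actual metrics; edge weights play the role of effective distances between patches.

```python
def floyd_warshall(nodes, edges):
    """All-pairs shortest effective distances on an undirected weighted
    graph; `edges` maps (u, v) pairs to distance weights."""
    INF = float("inf")
    d = {(u, v): (0.0 if u == v else INF) for u in nodes for v in nodes}
    for (u, v), w in edges.items():
        d[u, v] = d[v, u] = min(d[u, v], w)
    for k in nodes:
        for i in nodes:
            for j in nodes:
                if d[i, k] + d[k, j] < d[i, j]:
                    d[i, j] = d[i, k] + d[k, j]
    return d

def integrated_rank(nodes, edges, quality):
    """Rank patches by closeness centrality weighted by an ecological
    quality score (0 closeness for nodes disconnected from any other)."""
    d = floyd_warshall(nodes, edges)
    scores = {}
    for u in nodes:
        total = sum(d[u, v] for v in nodes if v != u)
        closeness = (len(nodes) - 1) / total if 0 < total < float("inf") else 0.0
        scores[u] = closeness * quality[u]
    return sorted(nodes, key=scores.get, reverse=True)
```

    On a three-patch chain A-B-C with equal quality, the middle patch B ranks first, matching the intuition that stepping-stone patches carry the network's connectivity.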

  15. $\\eta$-metric structures

    OpenAIRE

    Gaba, Yaé Ulrich

    2017-01-01

    In this paper, we discuss recent results about generalized metric spaces and fixed point theory. We introduce the notion of $\\eta$-cone metric spaces, give some topological properties and prove some fixed point theorems for contractive type maps on these spaces. In particular we show that these $\\eta$-cone metric spaces are natural generalizations of both cone metric spaces and metric type spaces.

  16. Comparison of provider and plan-based targeting strategies for disease management.

    Science.gov (United States)

    Annis, Ann M; Holtrop, Jodi Summers; Tao, Min; Chang, Hsiu-Ching; Luo, Zhehui

    2015-05-01

    We aimed to describe and contrast the targeting methods and engagement outcomes of health plan-delivered disease management with those of a provider-delivered care management program. Health plan epidemiologists partnered with university health services researchers to conduct a quasi-experimental, mixed-methods study of a 2-year pilot. We used semi-structured interviews to assess the characteristics of program-targeting strategies, and calculated target and engagement rates from clinical encounter data. Five physician organizations (POs) with 51 participating practices implemented care management. Health plan member lists were sent monthly to the practices to accept patients, and the practices then sent back data reports regarding targeting and engagement in care management. Among patients accepted by the POs, we compared those who were targeted and engaged by POs with those who met health plan targeting criteria. The health plan's targeting process combined claims algorithms and employer group preferences to identify candidates for disease management; in contrast, several different factors influenced the PO practices' targeting approaches, including clinical and personal knowledge of the patients, health assessment information, and availability of disease-relevant programs. Practices targeted a higher percentage of patients for care management than the health plan (38% vs 16%); only 7% of these patients met the targeting criteria of both. Practices engaged a higher percentage of their targeted patients than the health plan (50% vs 13%). The health plan's claims-driven targeting approach and the clinically based strategies of practices both provide advantages; an optimal model may combine the strengths of each approach to maximize benefits in care management.

  17. Design of a multi-purpose fragment screening library using molecular complexity and orthogonal diversity metrics

    Science.gov (United States)

    Lau, Wan F.; Withka, Jane M.; Hepworth, David; Magee, Thomas V.; Du, Yuhua J.; Bakken, Gregory A.; Miller, Michael D.; Hendsch, Zachary S.; Thanabal, Venkataraman; Kolodziej, Steve A.; Xing, Li; Hu, Qiyue; Narasimhan, Lakshmi S.; Love, Robert; Charlton, Maura E.; Hughes, Samantha; van Hoorn, Willem P.; Mills, James E.

    2011-07-01

    Fragment Based Drug Discovery (FBDD) continues to advance as an efficient and alternative screening paradigm for the identification and optimization of novel chemical matter. To enable FBDD across a wide range of pharmaceutical targets, a fragment screening library is required to be chemically diverse and synthetically expandable to enable critical decision making for chemical follow-up and assessing new target druggability. In this manuscript, the Pfizer fragment library design strategy, which utilized multiple and orthogonal metrics to incorporate structure, pharmacophore and pharmacological space diversity, is described. Appropriate measures of molecular complexity were also employed to maximize the probability of detection of fragment hits using a variety of biophysical and biochemical screening methods. In addition, structural integrity, purity, solubility, fragment and analog availability as well as cost were important considerations in the selection process. Preliminary analysis of primary screening results for 13 targets using NMR Saturation Transfer Difference (STD) indicates the identification of μM–mM hits and the uniqueness of hits at weak binding affinities for these targets.

  18. Resources available for applying metrics in security and safety programming.

    Science.gov (United States)

    Luizzo, Anthony

    2016-01-01

    Incorporating metrics into security surveys has been championed as a better way of substantiating program-related effectiveness and expenditures. Although security surveys have been around for well over 40 years, rarely, if ever, have metric-related strategies been part of the equation, the author says. In this article, he cites several published articles and research findings available to security professionals and their surveyors that may give them the expertise and confidence they need to make use of this valuable tool.

  19. 3D–2D image registration for target localization in spine surgery: investigation of similarity metrics providing robustness to content mismatch

    International Nuclear Information System (INIS)

    De Silva, T; Ketcha, M D; Siewerdsen, J H; Uneri, A; Reaungamornrat, S; Kleinszig, G; Vogt, S; Aygun, N; Lo, S-F; Wolinsky, J-P

    2016-01-01

    In image-guided spine surgery, robust three-dimensional to two-dimensional (3D–2D) registration of preoperative computed tomography (CT) and intraoperative radiographs can be challenged by the image content mismatch associated with the presence of surgical instrumentation and implants as well as soft-tissue resection or deformation. This work investigates image similarity metrics in 3D–2D registration offering improved robustness against mismatch, thereby improving performance and reducing or eliminating the need for manual masking. The performance of four gradient-based image similarity metrics (gradient information (GI), gradient correlation (GC), gradient information with linear scaling (GS), and gradient orientation (GO)) with a multi-start optimization strategy was evaluated in an institutional review board-approved retrospective clinical study using 51 preoperative CT images and 115 intraoperative mobile radiographs. Registrations were tested with and without polygonal masks as a function of the number of multistarts employed during optimization. Registration accuracy was evaluated in terms of the projection distance error (PDE) and assessment of failure modes (PDE > 30 mm) that could impede reliable vertebral level localization. With manual polygonal masking and 200 multistarts, the GC and GO metrics exhibited robust performance with 0% gross failures and median PDE < 6.4 mm (±4.4 mm interquartile range (IQR)) and a median runtime of 84 s (plus upwards of 1–2 min for manual masking). Excluding manual polygonal masks and decreasing the number of multistarts to 50 caused the GC-based registration to fail at a rate of >14%; however, GO maintained robustness with a 0% gross failure rate. Overall, the GI, GC, and GS metrics were susceptible to registration errors associated with content mismatch, but GO provided robust registration (median PDE = 5.5 mm, 2.6 mm IQR) without manual masking and with an improved
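
    As an illustration of the gradient-based family, below is a minimal sketch of a gradient correlation (GC) style metric: the mean normalized cross-correlation of the two images' row- and column-gradients. This is a generic textbook formulation, not the exact implementation evaluated in the study, and the image arrays are synthetic stand-ins.

```python
import numpy as np

def gradient_correlation(a, b):
    """Mean normalized cross-correlation of image gradients (GC-style)."""
    def ncc(x, y):
        x = x - x.mean()
        y = y - y.mean()
        denom = np.sqrt((x * x).sum() * (y * y).sum())
        return float((x * y).sum() / denom) if denom > 0 else 0.0
    ga0, ga1 = np.gradient(a.astype(float))   # gradients along rows, columns
    gb0, gb1 = np.gradient(b.astype(float))
    return 0.5 * (ncc(ga0, gb0) + ncc(ga1, gb1))

rng = np.random.default_rng(0)
fixed = rng.random((32, 32))     # stand-in for a projection of the CT
moving = rng.random((32, 32))    # stand-in for an intraoperative radiograph
print(round(gradient_correlation(fixed, fixed), 6))   # identical images -> 1.0
```

    In a registration loop this score would be maximized over the 3D pose parameters, e.g. with the multi-start optimization described above.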

  20. Targeting HIV latency: pharmacologic strategies toward eradication

    Science.gov (United States)

    Xing, Sifei; Siliciano, Robert F.

    2013-01-01

    The latent reservoir for HIV-1 in resting CD4+ T cells remains a major barrier to HIV-1 eradication, even though highly active antiretroviral therapy (HAART) can successfully reduce plasma HIV-1 levels to below the detection limit of clinical assays and reverse disease progression. Proposed eradication strategies involve reactivation of this latent reservoir. Multiple mechanisms are believed to be involved in maintaining HIV-1 latency, mostly through suppression of transcription. These include cytoplasmic sequestration of host transcription factors and epigenetic modifications such as histone deacetylation, histone methylation and DNA methylation. Therefore, strategies targeting these mechanisms have been explored for reactivation of the latent reservoir. In this review, we discuss current pharmacological approaches toward eradication, focusing on small molecule latency-reversing agents, their mechanisms, advantages and limitations. PMID:23270785

  1. Targeting human breast cancer cells by an oncolytic adenovirus using microRNA-targeting strategy.

    Science.gov (United States)

    Shayestehpour, Mohammad; Moghim, Sharareh; Salimi, Vahid; Jalilvand, Somayeh; Yavarian, Jila; Romani, Bizhan; Mokhtari-Azad, Talat

    2017-08-15

    MicroRNA-targeting strategy is a promising approach that enables oncolytic viruses to replicate in tumor cells but not in normal cells. In this study, we targeted adenoviral replication toward breast cancer cells by inserting ten complementary binding sites for miR-145-5p downstream of the E1A gene. In addition, we evaluated the effect of increasing the number of miR-145 binding sites on inhibition of virus replication. Ad5-control and adenoviruses carrying five or ten copies of miR145-5p target sites (Ad5-5miR145T, Ad5-10miR145T) were generated and inoculated into MDA-MB-453, BT-20 and MCF-7 breast cancer cell lines and human mammary epithelial cells (HMEpC). The titer of Ad5-10miR145T in HMEpC was significantly lower than the Ad5-control titer. The difference between the titers of these two viruses at 12, 24, 36, and 48 h after infection was 1.25, 2.96, 3.06, and 3.77 log TCID50, respectively. No significant difference was observed between the titers of the two adenoviruses in MDA-MB-453, BT-20 and MCF-7 cells. The infectious titer of the adenovirus containing 10 miR-145 binding sites in HMEpC cells at 24, 36, and 48 h post-infection was 1.7-, 2.08-, and 4-fold lower, respectively, than the titer of the adenovirus carrying 5 miR-145 targets. Our results suggest that the miR-145-targeting strategy provides selectivity for adenovirus replication in breast cancer cells. Increasing the number of miRNA binding sites within the adenoviral genome confers more selectivity for viral replication in cancer cells. Copyright © 2017. Published by Elsevier B.V.

  2. Smart Grid Status and Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Balducci, Patrick J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Weimar, Mark R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kirkham, Harold [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-07-01

    To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. It measures 21 metrics to provide insight into the grid’s capacity to embody these characteristics. This report looks across a spectrum of smart grid concerns to measure the status of smart grid deployment and impacts.

  3. Inverse targeting —An effective immunization strategy

    Science.gov (United States)

    Schneider, C. M.; Mihaljev, T.; Herrmann, H. J.

    2012-05-01

    We propose a new method to immunize populations or computer networks against epidemics which is more efficient than any continuous immunization method considered before. The novelty of our method resides in the way the immunization targets are determined. First we identify those individuals or computers that contribute the least to the disease spreading, measured through their contribution to the size of the largest connected cluster in the social or computer network. The immunization process follows the list of identified individuals or computers in inverse order, immunizing first those which are most relevant for the epidemic spreading. We have applied our immunization strategy to several model networks and two real networks, the Internet and the collaboration network of high-energy physicists. We find that our new immunization strategy is up to 14% more efficient for model networks, and up to 33% for real networks, than dynamically immunizing the most connected nodes in a network. Our strategy is also numerically efficient and can therefore be applied to large systems.
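
    A minimal sketch of the scoring idea, under the simplifying assumption of a one-shot (rather than adaptive) computation: each node is scored by how much its removal shrinks the largest connected cluster, and nodes are then immunized most-damaging first. The toy network is illustrative.

```python
from collections import deque

def largest_cluster(adj, removed=frozenset()):
    """Size of the largest connected component, ignoring removed nodes."""
    seen, best = set(removed), 0
    for start in adj:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:
            node = queue.popleft()
            size += 1
            for nbr in adj[node]:
                if nbr not in seen:
                    seen.add(nbr)
                    queue.append(nbr)
        best = max(best, size)
    return best

def immunization_order(adj):
    """Nodes ranked by how much their removal shrinks the largest cluster."""
    base = largest_cluster(adj)
    impact = {n: base - largest_cluster(adj, {n}) for n in adj}
    return sorted(adj, key=impact.get, reverse=True)

# Toy network: "b" bridges two triangles and is the first node to immunize.
adj = {"a": {"b", "c"}, "c": {"a", "b"}, "b": {"a", "c", "d", "e"},
       "d": {"b", "e"}, "e": {"b", "d"}}
print(immunization_order(adj)[0])   # -> b
```

    The strategy described in the abstract recomputes contributions as nodes are removed; this one-shot version is cheaper but less accurate on networks with many redundant paths.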

  4. Analysis of Subjects' Vulnerability in a Touch Screen Game Using Behavioral Metrics.

    Science.gov (United States)

    Parsinejad, Payam; Sipahi, Rifat

    2017-12-01

    In this article, we report results on an experimental study conducted with volunteer subjects playing a touch-screen game with two unique difficulty levels. Subjects have knowledge about the rules of both game levels, but only sufficient playing experience with the easy level of the game, making them vulnerable with the difficult level. Several behavioral metrics associated with subjects' playing the game are studied in order to assess subjects' mental-workload changes induced by their vulnerability. Specifically, these metrics are calculated based on subjects' finger kinematics and decision making times, which are then compared with baseline metrics, namely, performance metrics pertaining to how well the game is played and a physiological metric called pNN50 extracted from heart rate measurements. In balanced experiments and supported by comparisons with baseline metrics, it is found that some of the studied behavioral metrics have the potential to be used to infer subjects' mental workload changes through different levels of the game. These metrics, which are decoupled from task specifics, relate to subjects' ability to develop strategies to play the game, and hence have the advantage of offering insight into subjects' task-load and vulnerability assessment across various experimental settings.
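
    For reference, pNN50 has a standard definition in heart-rate-variability analysis: the percentage of successive inter-beat (RR) interval differences greater than 50 ms. A small sketch with hypothetical RR data:

```python
def pnn50(rr_ms):
    """Percentage of successive RR-interval differences exceeding 50 ms."""
    diffs = [abs(b - a) for a, b in zip(rr_ms, rr_ms[1:])]
    return 100.0 * sum(d > 50 for d in diffs) / len(diffs)

# Hypothetical beat-to-beat intervals in milliseconds
rr = [800, 810, 900, 905, 850]   # successive diffs: 10, 90, 5, 55
print(pnn50(rr))                 # -> 50.0 (2 of 4 diffs exceed 50 ms)
```

    Higher pNN50 generally reflects greater parasympathetic activity; decreases are commonly read as a sign of increased mental workload, which is the direction of use in the study above.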

  5. Improving the Reliability of Network Metrics in Structural Brain Networks by Integrating Different Network Weighting Strategies into a Single Graph

    Directory of Open Access Journals (Sweden)

    Stavros I. Dimitriadis

    2017-12-01

    Full Text Available Structural brain networks estimated from diffusion MRI (dMRI) via tractography have been widely studied in healthy controls and patients with neurological and psychiatric diseases. However, few studies have addressed the reliability of derived network metrics, both node-specific and network-wide. Different network weighting strategies (NWS) can be adopted to weight the strength of connection between two nodes, yielding structural brain networks that are almost fully-weighted. Here, we scanned five healthy participants five times each, using a diffusion-weighted MRI protocol, and computed edges between 90 regions of interest (ROI) from the Automated Anatomical Labeling (AAL) template. The edges were weighted according to nine different methods. We propose a linear combination of these nine NWS into a single graph using an appropriate diffusion distance metric. We refer to the resulting weighted graph as an Integrated Weighted Structural Brain Network (ISWBN). Additionally, we consider a topological filtering scheme that maximizes the information flow in the brain network under the constraint of the overall cost of the surviving connections. We compared each of the nine NWS and the ISWBN based on the improvement of: (a) the intra-class correlation coefficient (ICC) of well-known network metrics, both node-wise and at the network level; and (b) the recognition accuracy of each subject compared to the remainder of the cohort, as an attempt to assess the uniqueness of the structural brain network for each subject, after first applying our proposed topological filtering scheme. Based on a threshold where the network-level ICC should be >0.90, our findings revealed that six out of nine NWS lead to unreliable results at the network level, while all nine NWS were unreliable at the node level. In comparison, our proposed ISWBN performed as well as the best-performing individual NWS at the network level, and its ICC was higher than that of all individual NWS at the node level.

  6. An Energy-Efficient Sleep Strategy for Target Tracking Sensor Networks

    Directory of Open Access Journals (Sweden)

    Juan FENG

    2014-02-01

    Full Text Available Energy efficiency is very important for sensor networks, since sensor nodes have a limited energy supply from batteries. So far, much research has focused on this issue, while less emphasis has been placed on the optimal sleep time of each node. This paper proposes an adaptive energy conservation strategy for target tracking based on a grid network structure, where each node autonomously determines when and whether to sleep. It allows sensor nodes far away from targets to sleep to save energy while guaranteeing tracking accuracy. The proposed approach extends network lifetime by adopting an adaptive sleep scheduling scheme that combines the local power management (PM) and adaptive coordinate PM strategies to schedule the activities of sensor nodes. Each node can choose an optimal sleep time so as to make the system adaptive and energy-efficient. We show the performance of our approach in terms of energy drop, comparing it to a naive approach, dynamic PM with fixed sleep time, and the coordinate PM strategy. From the experimental results, the efficiency of the proposed approach is readily seen.

  7. [Improvement in zinc nutrition due to zinc transporter-targeting strategy].

    Science.gov (United States)

    Kambe, Taiho

    2016-07-01

    Adequate intake of zinc from the daily diet is indispensable for maintaining health. However, dietary zinc content often fails to fulfill the recommended daily intake, leading to zinc deficiency and increasing the risk of developing chronic diseases, particularly in elderly individuals. Therefore, increased attention is required to overcome zinc deficiency, and it is important to improve zinc nutrition in daily life. In the small intestine, the zinc transporter ZIP4 functions as a component that is essential for zinc absorption. In this manuscript, we present a brief overview of zinc deficiency. Moreover, we review a novel strategy, called "ZIP4-targeting", which has the potential to enable efficient zinc absorption from the diet. The ZIP4-targeting strategy is possibly a major step toward preventing zinc deficiency and improving human health.

  8. Semantic metrics

    OpenAIRE

    Hu, Bo; Kalfoglou, Yannis; Dupplaw, David; Alani, Harith; Lewis, Paul; Shadbolt, Nigel

    2006-01-01

    In the context of the Semantic Web, many ontology-related operations, e.g. ontology ranking, segmentation, alignment, articulation, reuse, evaluation, can be boiled down to one fundamental operation: computing the similarity and/or dissimilarity among ontological entities, and in some cases among ontologies themselves. In this paper, we review standard metrics for computing distance measures and we propose a series of semantic metrics. We give a formal account of semantic metrics drawn from a...

  9. Performance and strategy comparisons of human listeners and logistic regression in discriminating underwater targets.

    Science.gov (United States)

    Yang, Lixue; Chen, Kean

    2015-11-01

    To improve the design of underwater target recognition systems based on auditory perception, this study compared human listeners with automatic classifiers. Performance measures and strategies in three discrimination experiments, including discriminations between man-made and natural targets, between ships and submarines, and among three types of ships, were used. In the experiments, the subjects were asked to assign a score to each sound based on how confident they were about the category to which it belonged; logistic regression, representing linear discriminative models, also completed three similar tasks by utilizing many auditory features. The results indicated that the performance of logistic regression improved as the ratio between inter- and intra-class differences became larger, whereas the performance of the human subjects was limited by their unfamiliarity with the targets. Logistic regression performed better than the human subjects in all tasks but the discrimination between man-made and natural targets, and the strategies employed by excellent human subjects were similar to that of logistic regression. Logistic regression and several human subjects demonstrated similar performances when discriminating man-made and natural targets, but in this case their strategies were not similar. An appropriate fusion of their strategies led to further improvement in recognition accuracy.

  10. A strategy for actualization of active targeting nanomedicine practically functioning in a living body.

    Science.gov (United States)

    Lee, Kyoung Jin; Shin, Seol Hwa; Lee, Jae Hee; Ju, Eun Jin; Park, Yun-Yong; Hwang, Jung Jin; Suh, Young-Ah; Hong, Seung-Mo; Jang, Se Jin; Lee, Jung Shin; Song, Si Yeol; Jeong, Seong-Yun; Choi, Eun Kyung

    2017-10-01

    Designing nanocarriers with active targeting has been increasingly emphasized as an ideal delivery mechanism for anti-cancer therapeutic agents, but actualization has been constrained by the lack of a reliable, ultimately applicable strategy. Here, we designed and verified a strategy to achieve active targeting nanomedicine that works in a living body, utilizing animal models bearing a patient's tumor tissue and subjected to the same treatments that would be used in the clinic. The concept of this strategy was that a novel peptide probe and its counterpart protein, which responded to a therapy, were identified, and then the inherent ability of the peptide to target the designated tumor protein was used for active targeting in vivo. An initial dose of ionizing radiation was locally delivered to the gastric cancer (GC) tumor of a patient-derived xenograft mouse model, and a phage-displayed peptide library was intravenously injected. The peptides tightly bound to the tumor were recovered, and the counterpart protein was subsequently identified. The peptide-conjugated liposomal drug showed dramatically improved therapeutic efficacy and the possibility of diagnostic imaging with radiation. These results strongly suggest the potential of our strategy to achieve in vivo functional active targeting and to be applied clinically for human cancer treatment. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Surface Functionalization and Targeting Strategies of Liposomes in Solid Tumor Therapy: A Review

    Science.gov (United States)

    Riaz, Muhammad Kashif; Riaz, Muhammad Adil; Zhang, Xue; Lin, Congcong; Wong, Ka Hong; Chen, Xiaoyu; Lu, Aiping

    2018-01-01

    Surface functionalization of liposomes can play a key role in overcoming the current limitations of nanocarriers to treat solid tumors, i.e., biological barriers and physiological factors. The phospholipid vesicles (liposomes) containing anticancer agents produce fewer side effects than non-liposomal anticancer formulations, and can effectively target the solid tumors. This article reviews information about the strategies for targeting of liposomes to solid tumors along with the possible targets in cancer cells, i.e., extracellular and intracellular targets and targets in tumor microenvironment or vasculature. Targeting ligands for functionalization of liposomes with relevant surface engineering techniques have been described. Stimuli strategies for enhanced delivery of anticancer agents at requisite location using stimuli-responsive functionalized liposomes have been discussed. Recent approaches for enhanced delivery of anticancer agents at tumor site with relevant surface functionalization techniques have been reviewed. Finally, current challenges of functionalized liposomes and future perspective of smart functionalized liposomes have been discussed. PMID:29315231

  12. Surface Functionalization and Targeting Strategies of Liposomes in Solid Tumor Therapy: A Review

    Directory of Open Access Journals (Sweden)

    Muhammad Kashif Riaz

    2018-01-01

    Full Text Available Surface functionalization of liposomes can play a key role in overcoming the current limitations of nanocarriers to treat solid tumors, i.e., biological barriers and physiological factors. The phospholipid vesicles (liposomes) containing anticancer agents produce fewer side effects than non-liposomal anticancer formulations, and can effectively target the solid tumors. This article reviews information about the strategies for targeting of liposomes to solid tumors along with the possible targets in cancer cells, i.e., extracellular and intracellular targets and targets in tumor microenvironment or vasculature. Targeting ligands for functionalization of liposomes with relevant surface engineering techniques have been described. Stimuli strategies for enhanced delivery of anticancer agents at requisite location using stimuli-responsive functionalized liposomes have been discussed. Recent approaches for enhanced delivery of anticancer agents at tumor site with relevant surface functionalization techniques have been reviewed. Finally, current challenges of functionalized liposomes and future perspective of smart functionalized liposomes have been discussed.

  13. Metric modular spaces

    CERN Document Server

    Chistyakov, Vyacheslav

    2015-01-01

    Aimed toward researchers and graduate students familiar with elements of functional analysis, linear algebra, and general topology; this book contains a general study of modulars, modular spaces, and metric modular spaces. Modulars may be thought of as generalized velocity fields and serve two important purposes: generate metric spaces in a unified manner and provide a weaker convergence, the modular convergence, whose topology is non-metrizable in general. Metric modular spaces are extensions of metric spaces, metric linear spaces, and classical modular linear spaces. The topics covered include the classification of modulars, metrizability of modular spaces, modular transforms and duality between modular spaces, metric  and modular topologies. Applications illustrated in this book include: the description of superposition operators acting in modular spaces, the existence of regular selections of set-valued mappings, new interpretations of spaces of Lipschitzian and absolutely continuous mappings, the existe...

  14. Shaping of arm configuration space by prescription of non-Euclidean metrics with applications to human motor control

    Science.gov (United States)

    Biess, Armin

    2013-01-01

    The study of the kinematic and dynamic features of human arm movements provides insights into the computational strategies underlying human motor control. In this paper a differential geometric approach to movement control is taken by endowing arm configuration space with different non-Euclidean metric structures to study the predictions of the generalized minimum-jerk (MJ) model in the resulting Riemannian manifold for different types of human arm movements. For each metric space the solution of the generalized MJ model is given by reparametrized geodesic paths. This geodesic model is applied to a variety of motor tasks, ranging from three-dimensional unconstrained movements of a four degree of freedom arm between pointlike targets to constrained movements where the hand location is confined to a surface (e.g., a sphere) or a curve (e.g., an ellipse). For the latter, speed-curvature relations are derived depending on the boundary conditions imposed (periodic or nonperiodic), and compatibility with the empirical one-third power law is shown. Based on these theoretical studies and recent experimental findings, I argue that geodesics may be an emergent property of the motor system and that the sensorimotor system may shape arm configuration space by learning metric structures through sensorimotor feedback.
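
    For reference, the empirical one-third power law mentioned above is usually written as a relation between hand speed and path curvature (the symbols below are the conventional ones, not taken from the paper):

```latex
% One-third power law: hand speed varies with the inverse cube root of
% path curvature (equivalently, v is proportional to r^{1/3} for radius
% of curvature r = 1/\kappa).
\[
  v(t) = K \, \kappa(t)^{-1/3},
\]
% where v is the tangential hand speed, \kappa the curvature of the
% traced path, and K a piecewise-constant velocity gain factor.
```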

  15. Targeting tumor highly-expressed LAT1 transporter with amino acid-modified nanoparticles: Toward a novel active targeting strategy in breast cancer therapy.

    Science.gov (United States)

    Li, Lin; Di, Xingsheng; Wu, Mingrui; Sun, Zhisu; Zhong, Lu; Wang, Yongjun; Fu, Qiang; Kan, Qiming; Sun, Jin; He, Zhonggui

    2017-04-01

    Designing active targeting nanocarriers with increased cellular accumulation of chemotherapeutic agents is a promising strategy in cancer therapy. Herein, we report a novel active targeting strategy based on the large amino acid transporter 1 (LAT1) overexpressed in a variety of cancers. Glutamate was conjugated to polyoxyethylene stearate as a targeting ligand to achieve LAT1-targeting PLGA nanoparticles. The targeting efficiency of nanoparticles was investigated in HeLa and MCF-7 cells. Significant increase in cellular uptake and cytotoxicity was observed in LAT1-targeting nanoparticles compared to the unmodified ones. More interestingly, the internalized LAT1 together with targeting nanoparticles could recycle back to the cell membrane within 3 h, guaranteeing sufficient transporters on cell membrane for continuous cellular uptake. The LAT1 targeting nanoparticles exhibited better tumor accumulation and antitumor effects. These results suggested that the overexpressed LAT1 on cancer cells holds a great potential to be a high-efficiency target for the rational design of active-targeting nanosystems. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Development and Analysis of Psychomotor Skills Metrics for Procedural Skills Decay.

    Science.gov (United States)

    Parthiban, Chembian; Ray, Rebecca; Rutherford, Drew; Zinn, Mike; Pugh, Carla

    2016-01-01

    In this paper we develop and analyze the metrics associated with a force production task involving a stationary target, with the help of an advanced VR setup and a Force Dimension Omega 6 haptic device. We study the effects of force magnitude and direction on several metrics, namely path length, movement smoothness, velocity and acceleration patterns, reaction time, and overall error in achieving the target. Data were collected from 47 participants, all of whom were residents. Results show a positive correlation between the maximum force applied and both the deflection error and velocity; forces of higher magnitude also reduced path length and increased smoothness, showing their stabilizing characteristics. This approach paves the way to assess and model procedural skills decay.
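
    Two of the kinematic metrics named above, path length and velocity, have straightforward definitions over a sampled trajectory. A minimal sketch with hypothetical samples (the function names and data are illustrative, not the paper's):

```python
import math

def path_length(points):
    """Total Euclidean length of a sampled 2-D trajectory."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

def mean_speed(points, dt):
    """Average speed for uniformly sampled points taken every dt seconds."""
    return path_length(points) / (dt * (len(points) - 1))

# Hypothetical fingertip samples (metres) recorded at 100 Hz (dt = 0.01 s)
traj = [(0.0, 0.0), (0.03, 0.04), (0.06, 0.08)]
print(round(path_length(traj), 6))       # -> 0.1
print(round(mean_speed(traj, 0.01), 6))  # -> 5.0
```

    Smoothness is often quantified from the same samples via dimensionless jerk or the number of velocity peaks, computed by finite differencing.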

  17. Using network properties to evaluate targeted immunization algorithms

    Directory of Open Access Journals (Sweden)

    Bita Shams

    2014-09-01

    Full Text Available Immunization of a complex network with a minimal or limited budget is a challenging issue for the research community. In spite of a large literature on network immunization, no comprehensive research has been conducted on the evaluation and comparison of immunization algorithms. In this paper, we propose an evaluation framework for immunization algorithms with respect to the available amount of vaccination resources, the goal of the immunization program, and time complexity. The evaluation framework is designed based on network topological metrics and is extensible to any epidemic spreading model. Applying the evaluation framework to well-known targeted immunization algorithms shows that, in general, immunization based on PageRank centrality outperforms other targeting strategies in various types of networks, whereas closeness and eigenvector centrality exhibit the worst performance.
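
    As a sketch of the top-performing strategy, the snippet below ranks the nodes of an undirected graph by PageRank computed with plain power iteration, then immunizes highest-ranked first. The toy star network is illustrative.

```python
def pagerank(adj, d=0.85, iters=100):
    """Power-iteration PageRank on an undirected adjacency dict."""
    n = len(adj)
    rank = {u: 1.0 / n for u in adj}
    for _ in range(iters):
        nxt = {u: (1.0 - d) / n for u in adj}
        for u, nbrs in adj.items():
            share = d * rank[u] / len(nbrs)   # distribute rank over neighbours
            for v in nbrs:
                nxt[v] += share
        rank = nxt
    return rank

# Star network: the hub is the highest-value immunization target.
adj = {"hub": {"a", "b", "c"}, "a": {"hub"}, "b": {"hub"}, "c": {"hub"}}
order = sorted(adj, key=pagerank(adj).get, reverse=True)
print(order[0])   # -> hub
```

    An evaluation framework like the one proposed above would compare such orderings by epidemic outcomes (e.g. final outbreak size) at a fixed vaccination budget.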

  18. Tropism-Modification Strategies for Targeted Gene Delivery Using Adenoviral Vectors

    Directory of Open Access Journals (Sweden)

    Andrew H. Baker

    2010-10-01

    Full Text Available Achieving high efficiency, targeted gene delivery with adenoviral vectors is a long-standing goal in the field of clinical gene therapy. To achieve this, platform vectors must combine efficient retargeting strategies with detargeting modifications to ablate native receptor binding (i.e. CAR/integrins/heparan sulfate proteoglycans) and “bridging” interactions. “Bridging” interactions refer to coagulation factor binding, namely coagulation factor X (FX), which bridges hepatocyte transduction in vivo through engagement with surface expressed heparan sulfate proteoglycans (HSPGs). These interactions can contribute to the off-target sequestration of Ad5 in the liver and its characteristic dose-limiting hepatotoxicity, thereby significantly limiting the in vivo targeting efficiency and clinical potential of Ad5-based therapeutics. To date, various approaches to retargeting adenoviruses (Ad) have been described. These include genetic modification strategies to incorporate peptide ligands (within the fiber knob domain, fiber shaft, penton base, pIX or hexon), pseudotyping of capsid proteins to include whole fiber substitutions or fiber knob chimeras, pseudotyping with non-human Ad species or with capsid proteins derived from other viral families, hexon hypervariable region (HVR) substitutions and adapter-based conjugation/crosslinking of scFv, growth factors or monoclonal antibodies directed against surface-expressed target antigens. In order to maximize retargeting, strategies which permit detargeting from undesirable interactions between the Ad capsid and components of the circulatory system (e.g. coagulation factors, erythrocytes, pre-existing neutralizing antibodies) can be employed simultaneously. Detargeting can be achieved by genetic ablation of native receptor-binding determinants, ablation of “bridging interactions” such as those which occur between the hexon of Ad5 and coagulation factor X (FX), or alternatively, through the use of polymer

  19. Translation Strategies from Target Culture Perspective: An Analysis of English and Chinese Brands Names

    Directory of Open Access Journals (Sweden)

    Hong Shi

    2017-03-01

Full Text Available As a crucial communication material, the brand name exhibits growing importance in worldwide communication. It is a special text with a strong function and a clear persuasive purpose. This paper aims to explore the translation strategy and methods for English brand names from the perspective of culture. According to Skopostheorie, the prime principle determining any translation process is the purpose of the overall translational action. The translation methods should be based on the text’s function and the target culture. This paper is a tentative study of the guiding strategy and possible methods used in English brand name translation, analyzing Chinese and English brand names and how they fulfill the function of promoting products and enhancing cultural exchange, in the hope of offering a new perspective on brand name translation practice. The study used Skopostheorie as the guiding theory and strategy to analyze English brand names, which were selected from the brand name database “brandirectory”. It is found that translation should follow a target-culture oriented strategy to conform to the habitual use of the target language, and to the social culture and aesthetics of the target market.

  20. Construction and applications of exon-trapping gene-targeting vectors with a novel strategy for negative selection.

    Science.gov (United States)

    Saito, Shinta; Ura, Kiyoe; Kodama, Miho; Adachi, Noritaka

    2015-06-30

    Targeted gene modification by homologous recombination provides a powerful tool for studying gene function in cells and animals. In higher eukaryotes, non-homologous integration of targeting vectors occurs several orders of magnitude more frequently than does targeted integration, making the gene-targeting technology highly inefficient. For this reason, negative-selection strategies have been employed to reduce the number of drug-resistant clones associated with non-homologous vector integration, particularly when artificial nucleases to introduce a DNA break at the target site are unavailable or undesirable. As such, an exon-trap strategy using a promoterless drug-resistance marker gene provides an effective way to counterselect non-homologous integrants. However, constructing exon-trapping targeting vectors has been a time-consuming and complicated process. By virtue of highly efficient att-mediated recombination, we successfully developed a simple and rapid method to construct plasmid-based vectors that allow for exon-trapping gene targeting. These exon-trap vectors were useful in obtaining correctly targeted clones in mouse embryonic stem cells and human HT1080 cells. Most importantly, with the use of a conditionally cytotoxic gene, we further developed a novel strategy for negative selection, thereby enhancing the efficiency of counterselection for non-homologous integration of exon-trap vectors. Our methods will greatly facilitate exon-trapping gene-targeting technologies in mammalian cells, particularly when combined with the novel negative selection strategy.

  1. Implications of structural genomics target selection strategies: Pfam5000, whole genome, and random approaches

    Energy Technology Data Exchange (ETDEWEB)

    Chandonia, John-Marc; Brenner, Steven E.

    2004-07-14

The structural genomics project is an international effort to determine the three-dimensional shapes of all important biological macromolecules, with a primary focus on proteins. Target proteins should be selected according to a strategy which is medically and biologically relevant, of good value, and tractable. As an option to consider, we present the Pfam5000 strategy, which involves selecting the 5000 most important families from the Pfam database as sources for targets. We compare the Pfam5000 strategy to several other proposed strategies that would require similar numbers of targets. These include complete solution of several small to moderately sized bacterial proteomes, partial coverage of the human proteome, and random selection of approximately 5000 targets from sequenced genomes. We measure the impact that successful implementation of these strategies would have upon structural interpretation of the proteins in Swiss-Prot, TrEMBL, and 131 complete proteomes (including 10 of eukaryotes) from the Proteome Analysis database at EBI. Solving the structures of proteins from the 5000 largest Pfam families would allow accurate fold assignment for approximately 68 percent of all prokaryotic proteins (covering 59 percent of residues) and 61 percent of eukaryotic proteins (40 percent of residues). More fine-grained coverage which would allow accurate modeling of these proteins would require an order of magnitude more targets. The Pfam5000 strategy may be modified in several ways, for example to focus on larger families, bacterial sequences, or eukaryotic sequences; as long as secondary consideration is given to large families within Pfam, coverage results vary only slightly. In contrast, focusing structural genomics on a single tractable genome would have only a limited impact in structural knowledge of other proteomes: a significant fraction (about 30-40 percent of the proteins, and 40-60 percent of the residues) of each proteome is classified in small

  2. Nonspecific Organelle-Targeting Strategy with Core-Shell Nanoparticles of Varied Lipid Components/Ratios.

    Science.gov (United States)

    Zhang, Lu; Sun, Jiashu; Wang, Yilian; Wang, Jiancheng; Shi, Xinghua; Hu, Guoqing

    2016-07-19

    We report a nonspecific organelle-targeting strategy through one-step microfluidic fabrication and screening of a library of surface charge- and lipid components/ratios-varied lipid shell-polymer core nanoparticles. Different from the common strategy relying on the use of organelle-targeted moieties conjugated onto the surface of nanoparticles, here, we program the distribution of hybrid nanoparticles in lysosomes or mitochondria by tuning the lipid components/ratios in shell. Hybrid nanoparticles with 60% 1,2-dioleoyl-3-trimethylammonium-propane (DOTAP) and 20% 1,2-dioleoyl-sn-glycero-3-phosphoethanolamine (DOPE) can intracellularly target mitochondria in both in vitro and in vivo models. While replacing DOPE with the same amount of 1,2-dipalmitoyl-sn-glycero-3-phosphocholine (DPPC), the nanoparticles do not show mitochondrial targeting, indicating an incremental effect of cationic and fusogenic lipids on lysosomal escape which is further studied by molecular dynamics simulations. This work unveils the lipid-regulated subcellular distribution of hybrid nanoparticles in which target moieties and complex synthetic steps are avoided.

  3. A multidimensional strategy to detect polypharmacological targets in the absence of structural and sequence homology.

    Science.gov (United States)

    Durrant, Jacob D; Amaro, Rommie E; Xie, Lei; Urbaniak, Michael D; Ferguson, Michael A J; Haapalainen, Antti; Chen, Zhijun; Di Guilmi, Anne Marie; Wunder, Frank; Bourne, Philip E; McCammon, J Andrew

    2010-01-22

    Conventional drug design embraces the "one gene, one drug, one disease" philosophy. Polypharmacology, which focuses on multi-target drugs, has emerged as a new paradigm in drug discovery. The rational design of drugs that act via polypharmacological mechanisms can produce compounds that exhibit increased therapeutic potency and against which resistance is less likely to develop. Additionally, identifying multiple protein targets is also critical for side-effect prediction. One third of potential therapeutic compounds fail in clinical trials or are later removed from the market due to unacceptable side effects often caused by off-target binding. In the current work, we introduce a multidimensional strategy for the identification of secondary targets of known small-molecule inhibitors in the absence of global structural and sequence homology with the primary target protein. To demonstrate the utility of the strategy, we identify several targets of 4,5-dihydroxy-3-(1-naphthyldiazenyl)-2,7-naphthalenedisulfonic acid, a known micromolar inhibitor of Trypanosoma brucei RNA editing ligase 1. As it is capable of identifying potential secondary targets, the strategy described here may play a useful role in future efforts to reduce drug side effects and/or to increase polypharmacology.

  4. Baby universe metric equivalent to an interior black-hole metric

    International Nuclear Information System (INIS)

    Gonzalez-Diaz, P.F.

    1991-01-01

It is shown that the maximally extended metric corresponding to a large wormhole is the unique possible wormhole metric whose baby universe sector is conformally equivalent to the maximally inextendible Kruskal metric corresponding to the interior region of a Schwarzschild black hole whose gravitational radius is half the wormhole neck radius. The physical implications of this result in the black hole evaporation process are discussed. (orig.)

  5. Dual peptide conjugation strategy for improved cellular uptake and mitochondria targeting.

    Science.gov (United States)

    Lin, Ran; Zhang, Pengcheng; Cheetham, Andrew G; Walston, Jeremy; Abadir, Peter; Cui, Honggang

    2015-01-21

    Mitochondria are critical regulators of cellular function and survival. Delivery of therapeutic and diagnostic agents into mitochondria is a challenging task in modern pharmacology because the molecule to be delivered needs to first overcome the cell membrane barrier and then be able to actively target the intracellular organelle. Current strategy of conjugating either a cell penetrating peptide (CPP) or a subcellular targeting sequence to the molecule of interest only has limited success. We report here a dual peptide conjugation strategy to achieve effective delivery of a non-membrane-penetrating dye 5-carboxyfluorescein (5-FAM) into mitochondria through the incorporation of both a mitochondrial targeting sequence (MTS) and a CPP into one conjugated molecule. Notably, circular dichroism studies reveal that the combined use of α-helix and PPII-like secondary structures has an unexpected, synergistic contribution to the internalization of the conjugate. Our results suggest that although the use of positively charged MTS peptide allows for improved targeting of mitochondria, with MTS alone it showed poor cellular uptake. With further covalent linkage of the MTS-5-FAM conjugate to a CPP sequence (R8), the dually conjugated molecule was found to show both improved cellular uptake and effective mitochondria targeting. We believe these results offer important insight into the rational design of peptide conjugates for intracellular delivery.

  6. Knowledge metrics of Brand Equity; critical measure of Brand Attachment

    OpenAIRE

    Arslan Rafi (Corresponding Author); Arslan Ali; Sidra Waris; Dr. Kashif-ur-Rehman

    2011-01-01

Brand creation through an effective marketing strategy is necessary for the creation of unique associations in the customers' memory. Customers' attitude, awareness and association towards the brand are primarily focused on while evaluating the performance of a brand, before designing the marketing strategies and the subsequent evaluation of progress. In this research, the literature establishes a direct and significant effect of the Knowledge metrics of Brand equity, i.e. Brand Awareness and Brand Associatio...

  7. Properties of C-metric spaces

    Science.gov (United States)

    Croitoru, Anca; Apreutesei, Gabriela; Mastorakis, Nikos E.

    2017-09-01

    The subject of this paper belongs to the theory of approximate metrics [23]. An approximate metric on X is a real application defined on X × X that satisfies only a part of the metric axioms. In a recent paper [23], we introduced a new type of approximate metric, named C-metric, that is an application which satisfies only two metric axioms: symmetry and triangular inequality. The remarkable fact in a C-metric space is that a topological structure induced by the C-metric can be defined. The innovative idea of this paper is that we obtain some convergence properties of a C-metric space in the absence of a metric. In this paper we investigate C-metric spaces. The paper is divided into four sections. Section 1 is for Introduction. In Section 2 we recall some concepts and preliminary results. In Section 3 we present some properties of C-metric spaces, such as convergence properties, a canonical decomposition and a C-fixed point theorem. Finally, in Section 4 some conclusions are highlighted.
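As an illustrative sketch (not from the paper), the two axioms a C-metric retains can be checked numerically over a finite point set; the helper `is_c_metric` and the sample function below are hypothetical:

```python
from itertools import product

def is_c_metric(d, points, tol=1e-12):
    """Check the two C-metric axioms from the abstract on a finite set:
    symmetry and the triangle inequality. Note that d(x, x) = 0 is NOT
    required, which is what distinguishes a C-metric from a metric."""
    for x, y in product(points, repeat=2):
        if abs(d(x, y) - d(y, x)) > tol:          # symmetry
            return False
    for x, y, z in product(points, repeat=3):
        if d(x, z) > d(x, y) + d(y, z) + tol:     # triangle inequality
            return False
    return True

# Example: d(x, y) = |x - y| + 1 is symmetric and satisfies the triangle
# inequality, but d(x, x) = 1 != 0, so it is a C-metric without being a metric.
d = lambda x, y: abs(x - y) + 1.0
print(is_c_metric(d, [0.0, 1.0, 2.5, -3.0]))  # True
```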

  8. Learning Low-Dimensional Metrics

    OpenAIRE

    Jain, Lalit; Mason, Blake; Nowak, Robert

    2017-01-01

This paper investigates the theoretical foundations of metric learning, focused on four key questions that are not fully addressed in prior work: 1) we consider learning general low-dimensional (low-rank) metrics as well as sparse metrics; 2) we develop upper and lower (minimax) bounds on the generalization error; 3) we quantify the sample complexity of metric learning in terms of the dimension of the feature space and the dimension/rank of the underlying metric; 4) we also bound the accuracy ...
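A minimal numpy sketch of the low-rank setting the abstract studies, assuming the common parametrization d_L(x, y) = ||L(x − y)||₂ with a factor L of rank r, so that the induced matrix M = LᵀL has rank at most r. The names, dimensions, and data here are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
p, r = 5, 2                      # feature dimension, metric rank (illustrative)
L = rng.standard_normal((r, p))  # low-rank factor; M = L.T @ L has rank <= r

def d_L(x, y):
    """Low-rank (pseudo-)metric induced by L: ||L (x - y)||_2."""
    return np.linalg.norm(L @ (x - y))

x, y = rng.standard_normal(p), rng.standard_normal(p)
M = L.T @ L
# The same value via the quadratic form sqrt((x - y)^T M (x - y)):
quad = np.sqrt((x - y) @ M @ (x - y))
print(np.isclose(d_L(x, y), quad))   # True
print(np.linalg.matrix_rank(M))      # 2
```

Learning such a metric then amounts to fitting the entries of L (or M) to comparison data, which is where the sample-complexity bounds of the paper apply.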

  9. Pharmacological and physical vessel modulation strategies to improve EPR-mediated drug targeting to tumors.

    Science.gov (United States)

    Ojha, Tarun; Pathak, Vertika; Shi, Yang; Hennink, Wim E; Moonen, Chrit T W; Storm, Gert; Kiessling, Fabian; Lammers, Twan

    2017-09-15

    The performance of nanomedicine formulations depends on the Enhanced Permeability and Retention (EPR) effect. Prototypic nanomedicine-based drug delivery systems, such as liposomes, polymers and micelles, aim to exploit the EPR effect to accumulate at pathological sites, to thereby improve the balance between drug efficacy and toxicity. Thus far, however, tumor-targeted nanomedicines have not yet managed to achieve convincing therapeutic results, at least not in large cohorts of patients. This is likely mostly due to high inter- and intra-patient heterogeneity in EPR. Besides developing (imaging) biomarkers to monitor and predict EPR, another strategy to address this heterogeneity is the establishment of vessel modulation strategies to homogenize and improve EPR. Over the years, several pharmacological and physical co-treatments have been evaluated to improve EPR-mediated tumor targeting. These include pharmacological strategies, such as vessel permeabilization, normalization, disruption and promotion, as well as physical EPR enhancement via hyperthermia, radiotherapy, sonoporation and phototherapy. In the present manuscript, we summarize exemplary studies showing that pharmacological and physical vessel modulation strategies can be used to improve tumor-targeted drug delivery, and we discuss how these advanced combination regimens can be optimally employed to enhance the (pre-) clinical performance of tumor-targeted nanomedicines. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Scalar-metric and scalar-metric-torsion gravitational theories

    International Nuclear Information System (INIS)

    Aldersley, S.J.

    1977-01-01

    The techniques of dimensional analysis and of the theory of tensorial concomitants are employed to study field equations in gravitational theories which incorporate scalar fields of the Brans-Dicke type. Within the context of scalar-metric gravitational theories, a uniqueness theorem for the geometric (or gravitational) part of the field equations is proven and a Lagrangian is determined which is uniquely specified by dimensional analysis. Within the context of scalar-metric-torsion gravitational theories a uniqueness theorem for field Lagrangians is presented and the corresponding Euler-Lagrange equations are given. Finally, an example of a scalar-metric-torsion theory is presented which is similar in many respects to the Brans-Dicke theory and the Einstein-Cartan theory

  11. Metrics of quantum states

    International Nuclear Information System (INIS)

    Ma Zhihao; Chen Jingling

    2011-01-01

    In this work we study metrics of quantum states, which are natural generalizations of the usual trace metric and Bures metric. Some useful properties of the metrics are proved, such as the joint convexity and contractivity under quantum operations. Our result has a potential application in studying the geometry of quantum states as well as the entanglement detection.
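As a hedged illustration of the usual trace metric that the abstract generalizes, D(ρ, σ) = ½ Tr|ρ − σ| can be computed from the eigenvalues of the Hermitian difference of two density matrices:

```python
import numpy as np

def trace_distance(rho, sigma):
    """Trace metric D(rho, sigma) = (1/2) Tr |rho - sigma|.
    Since rho - sigma is Hermitian, Tr|rho - sigma| is the sum of the
    absolute values of its eigenvalues."""
    eigs = np.linalg.eigvalsh(rho - sigma)
    return 0.5 * np.abs(eigs).sum()

# Two orthogonal pure qubit states |0><0| and |1><1| are perfectly
# distinguishable, so their trace distance is maximal (1.0):
rho = np.array([[1, 0], [0, 0]], dtype=complex)
sigma = np.array([[0, 0], [0, 1]], dtype=complex)
print(trace_distance(rho, sigma))   # 1.0
print(trace_distance(rho, rho))     # 0.0
```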

  12. Empirical Information Metrics for Prediction Power and Experiment Planning

    Directory of Open Access Journals (Sweden)

    Christopher Lee

    2011-01-01

Full Text Available In principle, information theory could provide useful metrics for statistical inference. In practice this is impeded by divergent assumptions: Information theory assumes the joint distribution of variables of interest is known, whereas in statistical inference it is hidden and is the goal of inference. To integrate these approaches we note a common theme they share, namely the measurement of prediction power. We generalize this concept as an information metric, subject to several requirements: Calculation of the metric must be objective or model-free; unbiased; convergent; probabilistically bounded; and low in computational complexity. Unfortunately, widely used model selection metrics such as Maximum Likelihood, the Akaike Information Criterion and Bayesian Information Criterion do not necessarily meet all these requirements. We define four distinct empirical information metrics measured via sampling, with explicit Law of Large Numbers convergence guarantees, which meet these requirements: Ie, the empirical information, a measure of average prediction power; Ib, the overfitting bias information, which measures selection bias in the modeling procedure; Ip, the potential information, which measures the total remaining information in the observations not yet discovered by the model; and Im, the model information, which measures the model’s extrapolation prediction power. Finally, we show that Ip + Ie, Ip + Im, and Ie − Im are fixed constants for a given observed dataset (i.e. prediction target), independent of the model, and thus represent a fundamental subdivision of the total information contained in the observations. We discuss the application of these metrics to modeling and experiment planning.
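A toy sketch of one of these quantities, assuming Ie is read as the sampling-based average log-probability (prediction power) a model assigns to observations; the paper's exact definitions may differ, and all names here are illustrative:

```python
import math
import random

def empirical_information(model_p, samples, base=2):
    """Ie-style estimate (illustrative, not the paper's exact definition):
    average log-probability (in bits) the model assigns to observed
    samples. By the Law of Large Numbers this converges as the number
    of samples grows."""
    return sum(math.log(model_p(s), base) for s in samples) / len(samples)

random.seed(1)
# Observations from a coin with true bias 0.8:
obs = [1 if random.random() < 0.8 else 0 for _ in range(10_000)]
good = lambda s: 0.8 if s == 1 else 0.2   # model matched to the data
fair = lambda s: 0.5                      # model that ignores the bias

# The matched model has higher average prediction power than the fair-coin model:
print(empirical_information(good, obs) > empirical_information(fair, obs))  # True
```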

  13. Generic metrics and quantitative approaches for system resilience as a function of time

    International Nuclear Information System (INIS)

    Henry, Devanandham; Emmanuel Ramirez-Marquez, Jose

    2012-01-01

    Resilience is generally understood as the ability of an entity to recover from an external disruptive event. In the system domain, a formal definition and quantification of the concept of resilience has been elusive. This paper proposes generic metrics and formulae for quantifying system resilience. The discussions and graphical examples illustrate that the quantitative model is aligned with the fundamental concept of resilience. Based on the approach presented it is possible to analyze resilience as a time dependent function in the context of systems. The paper describes the metrics of network and system resilience, time for resilience and total cost of resilience. Also the paper describes the key parameters necessary to analyze system resilience such as the following: disruptive events, component restoration and overall resilience strategy. A road network example is used to demonstrate the applicability of the proposed resilience metrics and how these analyses form the basis for developing effective resilience design strategies. The metrics described are generic enough to be implemented in a variety of applications as long as appropriate figures-of-merit and the necessary system parameters, system decomposition and component parameters are defined. - Highlights: ► Propose a graphical model for the understanding of the resilience process. ► Mathematical description of resilience as a function of time. ► Identification of necessary concepts to define and evaluate network resilience. ► Development of cost and time to recovery metrics based on resilience formulation.
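One common quotient formulation consistent with the abstract's time-dependent view, sketched here under illustrative assumptions (the paper's exact notation and figure-of-merit may differ): resilience at time t as the recovered fraction of the performance lost at the disruption.

```python
def resilience(F, t, t0, td):
    """Time-dependent resilience as the recovered fraction of lost
    performance: R(t) = (F(t) - F(td)) / (F(t0) - F(td)).
    F is a system figure-of-merit, t0 a pre-disruption time, td the time
    of maximum degradation. This is a common quotient form, sketched for
    illustration rather than taken verbatim from the paper."""
    lost = F(t0) - F(td)
    if lost == 0:
        return 1.0                     # no degradation occurred
    return (F(t) - F(td)) / lost

# Toy figure-of-merit: full service 100, drops to 40 at t=2,
# then recovers linearly (15 units per time step) until t=6.
def F(t):
    if t < 2:
        return 100
    if t < 6:
        return 40 + 15 * (t - 2)
    return 100

print(resilience(F, 2, t0=1, td=2))  # 0.0  (fully degraded)
print(resilience(F, 4, t0=1, td=2))  # 0.5  (half of lost performance recovered)
print(resilience(F, 6, t0=1, td=2))  # 1.0  (fully restored)
```

Plotting R(t) over the recovery horizon gives exactly the kind of time-dependent resilience curve the abstract describes, from which time-to-recovery and cost metrics can be derived.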

  14. Symmetric Kullback-Leibler Metric Based Tracking Behaviors for Bioinspired Robotic Eyes.

    Science.gov (United States)

    Liu, Hengli; Luo, Jun; Wu, Peng; Xie, Shaorong; Li, Hengyu

    2015-01-01

A symmetric Kullback-Leibler metric based tracking system, capable of tracking moving targets, is presented for a bionic spherical parallel mechanism; it minimizes a tracking error function to simulate the smooth pursuit of human eyes. More specifically, we propose a real-time moving target tracking algorithm which utilizes spatial histograms compared under the symmetric Kullback-Leibler metric. In the proposed algorithm, the key spatial histograms are extracted and incorporated into a particle filtering framework. Once the target is identified, an image-based control scheme is implemented to drive the bionic spherical parallel mechanism such that the identified target is tracked at the center of the captured images. Meanwhile, the robot motion information is fed forward to develop an adaptive smooth tracking controller inspired by the vestibulo-ocular reflex mechanism. The proposed tracking system is designed to make the robot track dynamic objects when it travels through traversable terrains, especially bumpy environments. To assess its bump-resistant capability under the violent attitude variation that occurs in such bumpy environments, experiments were conducted; the results demonstrate the effectiveness and robustness of our bioinspired tracking system using a bionic spherical parallel mechanism inspired by head-eye coordination.
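For reference, a minimal sketch of the symmetric Kullback-Leibler metric between two normalized (e.g. spatial) histograms, as it might be used to score candidate regions against a target model; the smoothing constant and histograms are illustrative, not the paper's implementation:

```python
import numpy as np

def symmetric_kl(p, q, eps=1e-12):
    """Symmetric Kullback-Leibler metric between two histograms:
    D(p, q) = KL(p || q) + KL(q || p).
    A small eps avoids log(0) and division by zero in empty bins;
    inputs are (re)normalized to sum to 1."""
    p = np.asarray(p, float) + eps
    q = np.asarray(q, float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

target = [0.1, 0.6, 0.3]        # reference histogram of the tracked target
cand_a = [0.12, 0.58, 0.30]     # candidate region close to the target
cand_b = [0.70, 0.10, 0.20]     # candidate region far from the target
# The closer candidate scores a smaller divergence:
print(symmetric_kl(target, cand_a) < symmetric_kl(target, cand_b))  # True
```

In a particle-filter tracker, each particle's histogram would be scored this way and the score converted into a particle weight.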

  15. Symmetric Kullback-Leibler Metric Based Tracking Behaviors for Bioinspired Robotic Eyes

    Directory of Open Access Journals (Sweden)

    Hengli Liu

    2015-01-01

Full Text Available A symmetric Kullback-Leibler metric based tracking system, capable of tracking moving targets, is presented for a bionic spherical parallel mechanism; it minimizes a tracking error function to simulate the smooth pursuit of human eyes. More specifically, we propose a real-time moving target tracking algorithm which utilizes spatial histograms compared under the symmetric Kullback-Leibler metric. In the proposed algorithm, the key spatial histograms are extracted and incorporated into a particle filtering framework. Once the target is identified, an image-based control scheme is implemented to drive the bionic spherical parallel mechanism such that the identified target is tracked at the center of the captured images. Meanwhile, the robot motion information is fed forward to develop an adaptive smooth tracking controller inspired by the vestibulo-ocular reflex mechanism. The proposed tracking system is designed to make the robot track dynamic objects when it travels through traversable terrains, especially bumpy environments. To assess its bump-resistant capability under the violent attitude variation that occurs in such bumpy environments, experiments were conducted; the results demonstrate the effectiveness and robustness of our bioinspired tracking system using a bionic spherical parallel mechanism inspired by head-eye coordination.

  16. A family of metric gravities

    Science.gov (United States)

    Shuler, Robert

    2018-04-01

    The goal of this paper is to take a completely fresh approach to metric gravity, in which the metric principle is strictly adhered to but its properties in local space-time are derived from conservation principles, not inferred from a global field equation. The global field strength variation then gains some flexibility, but only in the regime of very strong fields (2nd-order terms) whose measurement is now being contemplated. So doing provides a family of similar gravities, differing only in strong fields, which could be developed into meaningful verification targets for strong fields after the manner in which far-field variations were used in the 20th century. General Relativity (GR) is shown to be a member of the family and this is demonstrated by deriving the Schwarzschild metric exactly from a suitable field strength assumption. The method of doing so is interesting in itself because it involves only one differential equation rather than the usual four. Exact static symmetric field solutions are also given for one pedagogical alternative based on potential, and one theoretical alternative based on inertia, and the prospects of experimentally differentiating these are analyzed. Whether the method overturns the conventional wisdom that GR is the only metric theory of gravity and that alternatives must introduce additional interactions and fields is somewhat semantical, depending on whether one views the field strength assumption as a field and whether the assumption that produces GR is considered unique in some way. It is of course possible to have other fields, and the local space-time principle can be applied to field gravities which usually are weak-field approximations having only time dilation, giving them the spatial factor and promoting them to full metric theories. Though usually pedagogical, some of them are interesting from a quantum gravity perspective. Cases are noted where mass measurement errors, or distributions of dark matter, can cause one
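For reference, the Schwarzschild metric that the paper reports deriving exactly from its field strength assumption is the standard static, spherically symmetric line element (geometrized units G = c = 1):

```latex
ds^2 = -\left(1 - \frac{2M}{r}\right) dt^2
       + \left(1 - \frac{2M}{r}\right)^{-1} dr^2
       + r^2 \left( d\theta^2 + \sin^2\theta \, d\varphi^2 \right)
```

Members of the proposed family share this far-field form and differ only in the second-order (strong-field) corrections to the two metric factors.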

  17. Perspectives of 99mTc chemistry and radiopharmacy: strategies, building blocks and targets

    International Nuclear Information System (INIS)

    Alberto, R.

    2007-01-01

Technetium chemistry, both fundamental and applied, is required to a larger extent in order to keep the essential role of this element in radiopharmacy alive. After an introduction highlighting the situation in general from research and market aspects, new strategies will be proposed in which technetium and rhenium play an essential role which cannot be taken over by other radionuclides such as 11C or 18F. Furthermore, currently available and potential future building blocks in technetium chemistry and their relationship to the new strategies, as well as characteristics of new precursors, will be discussed and compared to each other. Targets and targeting molecules, again in the context of strategies unique for technetium (and rhenium), are the focus of the last part. With respect to retaining a unique role, it is obvious that any future technetium- or rhenium-labelled biomolecule should have potential for therapy or be applied in the immediate context of therapy, e.g. for the early assessment of success in chemotherapy. All these aspects emphasize a role of inorganic technetium chemistry which goes far beyond simple labelling strategies. To underline the importance of fundamental chemistry, we will present and discuss some examples with nuclear targeting agents, amino acids and vitamin B12. (author)

  18. METRIC context unit architecture

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, R.O.

    1988-01-01

    METRIC is an architecture for a simple but powerful Reduced Instruction Set Computer (RISC). Its speed comes from the simultaneous processing of several instruction streams, with instructions from the various streams being dispatched into METRIC's execution pipeline as they become available for execution. The pipeline is thus kept full, with a mix of instructions for several contexts in execution at the same time. True parallel programming is supported within a single execution unit, the METRIC Context Unit. METRIC's architecture provides for expansion through the addition of multiple Context Units and of specialized Functional Units. The architecture thus spans a range of size and performance from a single-chip microcomputer up through large and powerful multiprocessors. This research concentrates on the specification of the METRIC Context Unit at the architectural level. Performance tradeoffs made during METRIC's design are discussed, and projections of METRIC's performance are made based on simulation studies.
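A toy sketch of the interleaving idea described above: instructions from several context streams are dispatched round-robin into a single pipeline, skipping empty streams so the pipeline stays full. This illustrates barrel-style multi-stream dispatch in general, not the actual METRIC dispatch logic:

```python
from collections import deque

def dispatch(streams):
    """Round-robin dispatch of instructions from several contexts
    (streams) into one pipeline, skipping exhausted streams so that
    ready instructions keep flowing -- an illustrative model of
    interleaved multi-context execution."""
    queues = [deque(s) for s in streams]
    pipeline = []
    while any(queues):
        for q in queues:
            if q:                       # this context has a ready instruction
                pipeline.append(q.popleft())
    return pipeline

# Three contexts, instructions tagged by context id:
ctx = [["A1", "A2", "A3"], ["B1", "B2"], ["C1", "C2", "C3", "C4"]]
print(dispatch(ctx))
# ['A1', 'B1', 'C1', 'A2', 'B2', 'C2', 'A3', 'C3', 'C4']
```

Note how after context B runs dry, the remaining contexts continue to fill the issue slots without bubbles, which is the property the architecture relies on for pipeline utilization.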

  19. Molecular-Targeted Immunotherapeutic Strategy for Melanoma via Dual-Targeting Nanoparticles Delivering Small Interfering RNA to Tumor-Associated Macrophages.

    Science.gov (United States)

    Qian, Yuan; Qiao, Sha; Dai, Yanfeng; Xu, Guoqiang; Dai, Bolei; Lu, Lisen; Yu, Xiang; Luo, Qingming; Zhang, Zhihong

    2017-09-26

Tumor-associated macrophages (TAMs) are a promising therapeutic target for cancer immunotherapy. Targeted delivery of therapeutic drugs to the tumor-promoting M2-like TAMs is challenging. Here, we developed M2-like TAM dual-targeting nanoparticles (M2NPs), whose structure and function were controlled by α-peptide (a scavenger receptor B type 1 (SR-B1) targeting peptide) linked with M2pep (an M2 macrophage binding peptide). By loading anti-colony stimulating factor-1 receptor (anti-CSF-1R) small interfering RNA (siRNA) on the M2NPs, we developed a molecular-targeted immunotherapeutic approach to specifically block the survival signal of M2-like TAMs and deplete them from melanoma tumors. We confirmed the validity of SR-B1 for M2-like TAM targeting and demonstrated the synergistic effect of the two targeting units (α-peptide and M2pep) in the fusion peptide (α-M2pep). After being administered to tumor-bearing mice, M2NPs had higher affinity to M2-like TAMs than to tissue-resident macrophages in liver, spleen, and lung. Compared with control treatment groups, M2NP-based siRNA delivery resulted in a dramatic elimination of M2-like TAMs (52%), decreased tumor size (87%), and prolonged survival. Additionally, this molecular-targeted strategy inhibited immunosuppressive IL-10 and TGF-β production and increased immunostimulatory cytokines (IL-12 and IFN-γ) expression and CD8+ T cell infiltration (2.9-fold) in the tumor microenvironment. Moreover, the siRNA-carrying M2NPs down-regulated expression of the exhaustion markers (PD-1 and Tim-3) on the infiltrating CD8+ T cells and stimulated their IFN-γ secretion (6.2-fold), indicating the restoration of T cell immune function. Thus, the dual-targeting property of M2NPs combined with RNA interference provides a potential strategy of molecular-targeted cancer immunotherapy for clinical application.

  20. Novel strategies for ultrahigh specific activity targeted nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Dong

    2012-12-13

We have developed novel strategies optimized for preparing high specific activity radiolabeled nanoparticles, targeting nuclear imaging of low abundance biomarkers. Several compounds have been labeled with F-18 and Cu-64 for radiolabeling of SCK-nanoparticles via copper(I)-catalyzed or copper-free alkyne-azide cycloaddition. Novel strategies have been developed to achieve ultrahigh specific activity with an administrable amount of dose for human study using copper-free chemistry. Ligands for carbonic anhydrase 12 (CA12), a low abundance extracellular biomarker for the responsiveness of breast cancer to endocrine therapies, have been labeled with F-18 and Cu-64, and one of them has been evaluated in animal models. The results of this project will lead to major improvements in the use of nanoparticles in nuclear imaging and will significantly advance their potential for detecting low abundance biomarkers of medical importance.

  1. Observable traces of non-metricity: New constraints on metric-affine gravity

    Science.gov (United States)

    Delhom-Latorre, Adrià; Olmo, Gonzalo J.; Ronco, Michele

    2018-05-01

Relaxing the Riemannian condition to incorporate geometric quantities such as torsion and non-metricity may allow one to explore new physics associated with defects in a hypothetical space-time microstructure. Here we show that non-metricity produces observable effects in quantum fields in the form of 4-fermion contact interactions, thereby allowing us to constrain the scale of non-metricity to be greater than 1 TeV by using results on Bhabha scattering. Our analysis is carried out in the framework of a wide class of theories of gravity in the metric-affine approach. The bound obtained represents an improvement of several orders of magnitude over previous experimental constraints.

  2. Innovations in food products: first-mover advantages and entropy metrics

    NARCIS (Netherlands)

    Sporleder, T.L.; Hooker, N.H.; Shannahan, C.J.; Bröring, S.

    2008-01-01

    The objective of this research is to investigate food product innovation in the context of the first-mover strategy among food manufacturers within a supply chain. The emphasis of the analysis is on developing a useful metric for tracking new product development in the context of first-mover

  3. Metrically adjusted questionnaires can provide more information for scientists - an example from tourism.

    Science.gov (United States)

    Sindik, Joško; Miljanović, Maja

    2017-03-01

    The article deals with the issue of research methodology, illustrating the use of known research methods for new purposes. Questionnaires that originally lack metric characteristics can be called »handy questionnaires«. The authors consider how the scientific usability of such questionnaires can be improved, primarily by improving their metric characteristics and consequently using multivariate instead of univariate statistical methods. In order to establish the basis for applying multivariate statistical procedures, the main idea is to develop strategies for designing measurement instruments from parts of the handy questionnaires. This can be accomplished in two ways: by redesigning the handy questionnaire before the data are collected (a priori), or after the data have been collected, without modifying the questionnaire (a posteriori). The basic principles of applying these two strategies of metrical adaptation of handy questionnaires are described.

  4. A Convenient Cas9-based Conditional Knockout Strategy for Simultaneously Targeting Multiple Genes in Mouse.

    Science.gov (United States)

    Chen, Jiang; Du, Yinan; He, Xueyan; Huang, Xingxu; Shi, Yun S

    2017-03-31

    The most powerful way to probe protein function is to characterize the consequence of its deletion. Compared to conventional gene knockout (KO), conditional knockout (cKO) provides an advanced gene targeting strategy with which gene deletion can be performed in a spatially and temporally restricted manner. However, for most diploid species, the widely used Cre-flox cKO system requires the targeting loci in both alleles to be loxP-flanked, which in practice demands time- and labor-consuming breeding. This becomes especially significant when one is dealing with multiple genes. The CRISPR/Cas9 genome editing system has the advantage of targeting multiple sites simultaneously. Here we propose a strategy that achieves conditional KO of multiple genes in mouse with Cre recombinase-dependent Cas9 expression. By transgenic construction of loxP-stop-loxP (LSL) controlled Cas9 (LSL-Cas9) together with sgRNAs targeting EGFP, we showed that the fluorescent molecule could be eliminated in a Cre-dependent manner. We further verified the efficacy of this novel strategy for targeting multiple sites by deleting c-Maf and MafB simultaneously and specifically in macrophages. Compared to the traditional Cre-flox cKO strategy, this sgRNAs-LSL-Cas9 cKO system is simpler and faster, and makes conditional manipulation of multiple genes feasible.

  5. Maneuver Analysis and Targeting Strategy for the Stardust Re-Entry Capsule

    Science.gov (United States)

    Helfrich, Cliff; Bhat, Ramachand S.; Kangas, Julie A.; Wilson, Roby S.; Wong, Mau C.; Potts, Christopher L.; Williams, Kenneth E.

    2006-01-01

    The Stardust Sample Return Capsule (SRC) returned to Earth on January 15, 2006 after seven years of collecting interstellar and comet particles over three heliocentric revolutions, as shown in Figure 1. The SRC was carried on board the Stardust spacecraft, as shown in Figure 2. Because the spacecraft was built with unbalanced thrusters, turns and attitude control maintenance resulted in undesirable delta-v being imparted to the trajectory. As a result, a carefully planned maneuver strategy was devised to accurately target the Stardust capsule to the Utah Test and Training Range (UTTR). This paper provides an overview of the Stardust spacecraft and mission and describes the maneuver strategy that was employed to achieve the stringent targeting requirements for landing in Utah. In addition, an overview of Stardust maneuver analysis tools and techniques will also be presented.

  6. A novel rotational invariants target recognition method for rotating motion blurred images

    Science.gov (United States)

    Lan, Jinhui; Gong, Meiling; Dong, Mingwei; Zeng, Yiliang; Zhang, Yuzhen

    2017-11-01

    Imaging sensors produce blurred images under rotational motion of the carrier, which greatly reduces the target recognition rate. The traditional approach of first restoring the image and then identifying the target can improve the recognition rate, but recognition takes a long time. To solve this problem, a model that extracts rotational-blur invariants and recognizes the target directly was constructed. The model comprises three metric layers whose object description capability ranges from low to high: a gray-value statistical algorithm, an improved round projection transformation algorithm, and rotation-convolution moment invariants. The metric layer with the lowest description ability serves as the input stage, gradually eliminating non-target pixels of the candidate region from the degraded image. Experimental results show that the proposed model improves the correct target recognition rate for blurred images and strikes a good balance between computational complexity and recognition performance.
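
    The round projection transformation named in the abstract can be illustrated with a short sketch: averaging intensities over concentric rings about the image centre yields a feature vector that is invariant to rotation. This is a generic illustration, not the paper's implementation; the function name and parameters are hypothetical.

```python
import numpy as np

def ring_projection(img, n_rings=16):
    """Rotation-invariant feature: mean intensity on concentric rings
    around the image centre."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - cy, xx - cx)       # distance of each pixel from centre
    r_max = r.max()
    features = np.zeros(n_rings)
    for k in range(n_rings):
        lo, hi = k * r_max / n_rings, (k + 1) * r_max / n_rings
        mask = (r >= lo) & (r <= hi) if k == n_rings - 1 else (r >= lo) & (r < hi)
        features[k] = img[mask].mean() if mask.any() else 0.0
    return features

# A 90-degree rotation leaves the ring features unchanged.
img = np.arange(64, dtype=float).reshape(8, 8)
f0 = ring_projection(img)
f90 = ring_projection(np.rot90(img))
```

    Because pixel distances from the centre are unchanged by rotation, each ring collects the same set of values before and after rotating the image.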

  7. Metric diffusion along foliations

    CERN Document Server

    Walczak, Szymon M

    2017-01-01

    Up-to-date research in metric diffusion along compact foliations is presented in this book. Beginning with fundamentals from optimal transportation theory and the theory of foliations, the book moves on to cover the Wasserstein distance, the Kantorovich duality theorem, and the metrization of the weak topology by the Wasserstein distance. Metric diffusion is defined, the topology of the metric space is studied, and the limits of diffused metrics along compact foliations are discussed. Essentials on foliations, holonomy, heat diffusion, and compact foliations are detailed, and vital technical lemmas are proved to aid understanding. Graduate students and researchers in the geometry, topology, and dynamics of foliations and laminations will find this book useful, as it also presents facts about metric diffusion along non-compact foliations and provides a full description of the limit for metrics diffused along a foliation with at least one compact leaf in two dimensions.
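
    For reference, the two central objects named above are standardly defined as follows (textbook formulations from optimal transport, not quotations from the book): the Wasserstein distance between probability measures μ, ν on a metric space (X, d), and its Kantorovich dual,

```latex
W_1(\mu,\nu)
  \;=\; \inf_{\pi \in \Pi(\mu,\nu)} \int_{X\times X} d(x,y)\,\mathrm{d}\pi(x,y)
  \;=\; \sup_{\operatorname{Lip}(f)\le 1} \left( \int_X f\,\mathrm{d}\mu - \int_X f\,\mathrm{d}\nu \right),
```

    where Π(μ, ν) denotes the set of couplings of μ and ν.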

  8. Preventive strike vs. false targets and protection in defense strategy

    International Nuclear Information System (INIS)

    Levitin, Gregory; Hausken, Kjell

    2011-01-01

    A defender allocates its resource between defending an object passively and striking preventively against an attacker seeking to destroy the object. With no preventive strike the defender distributes its entire resource between deploying false targets, which the attacker cannot distinguish from the genuine object, and protecting the object. If the defender strikes preventively, the attacker's vulnerability depends on its protection and on the defender's resource allocated to the strike. If the attacker survives, the object's vulnerability depends on the attacker's revenge attack resource allocated to the attacked object. The optimal defense resource distribution between striking preventively, deploying the false targets and protecting the object is analyzed. Two cases of the attacker strategy are considered: when the attacker attacks all of the targets and when it chooses a number of targets to attack. An optimization model is presented for making a decision about the efficiency of the preventive strike based on the estimated attack probability, dependent on a variety of model parameters.

  9. Fault Management Metrics

    Science.gov (United States)

    Johnson, Stephen B.; Ghoshal, Sudipto; Haste, Deepak; Moore, Craig

    2017-01-01

    This paper describes the theory and considerations in the application of metrics to measure the effectiveness of fault management. Fault management refers here to the operational aspect of system health management, and as such is considered as a meta-control loop that operates to preserve or maximize the system's ability to achieve its goals in the face of current or prospective failure. As a suite of control loops, the metrics to estimate and measure the effectiveness of fault management are similar to those of classical control loops in being divided into two major classes: state estimation, and state control. State estimation metrics can be classified into lower-level subdivisions for detection coverage, detection effectiveness, fault isolation and fault identification (diagnostics), and failure prognosis. State control metrics can be classified into response determination effectiveness and response effectiveness. These metrics are applied to each and every fault management control loop in the system, for each failure to which they apply, and probabilistically summed to determine the effectiveness of these fault management control loops to preserve the relevant system goals that they are intended to protect.
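
    The probabilistic summation described above can be sketched as a probability-weighted combination of per-stage effectiveness terms. The sketch below is a minimal illustration under an assumed independence of stages, not the paper's actual metric suite; all names are hypothetical.

```python
def fm_effectiveness(failures):
    """Probability-weighted fault-management effectiveness.

    Each failure is a tuple (probability, p_detect, p_isolate, p_respond):
    the chance the failure occurs, and the per-stage success probabilities
    of its fault-management control loop. Stages are assumed independent.
    """
    total_p = sum(f[0] for f in failures)
    covered = sum(p * pd * pi * pr for p, pd, pi, pr in failures)
    return covered / total_p

eff = fm_effectiveness([
    (0.6, 0.90, 0.80, 0.95),   # common failure, well covered
    (0.3, 0.99, 0.90, 0.90),   # less common, very well covered
    (0.1, 0.50, 0.50, 0.50),   # rare failure, poorly covered
])
```

    The weighting makes effectiveness against common failures dominate the aggregate score, which matches the intent of summing over each loop "for each failure to which they apply."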

  10. Completion of a Dislocated Metric Space

    Directory of Open Access Journals (Sweden)

    P. Sumati Kumari

    2015-01-01

    We provide a construction for the completion of a dislocated metric space (abbreviated d-metric space); we also prove that the completion of the metric associated with a d-metric coincides with the metric associated with the completion of the d-metric.
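
    For context, a dislocated metric is usually defined by dropping the requirement that self-distance vanish; a standard formulation (not quoted from the paper) is:

```latex
d : X \times X \to [0,\infty), \qquad
d(x,y) = 0 \;\Rightarrow\; x = y, \qquad
d(x,y) = d(y,x), \qquad
d(x,y) \le d(x,z) + d(z,y),
```

    so that d(x, x) > 0 is permitted, unlike in an ordinary metric.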

  11. Molecular Strategies for Targeting Antioxidants to Mitochondria: Therapeutic Implications

    Science.gov (United States)

    2015-01-01

    Abstract Mitochondrial function and specifically its implication in cellular redox/oxidative balance is fundamental in controlling the life and death of cells, and has been implicated in a wide range of human pathologies. In this context, mitochondrial therapeutics, particularly those involving mitochondria-targeted antioxidants, have attracted increasing interest as potentially effective therapies for several human diseases. For the past 10 years, great progress has been made in the development and functional testing of molecules that specifically target mitochondria, and there has been special focus on compounds with antioxidant properties. In this review, we will discuss several such strategies, including molecules conjugated with lipophilic cations (e.g., triphenylphosphonium) or rhodamine, conjugates of plant alkaloids, amino-acid- and peptide-based compounds, and liposomes. This area has several major challenges that need to be confronted. Apart from antioxidants and other redox active molecules, current research aims at developing compounds that are capable of modulating other mitochondria-controlled processes, such as apoptosis and autophagy. Multiple chemically different molecular strategies have been developed as delivery tools that offer broad opportunities for mitochondrial manipulation. Additional studies, and particularly in vivo approaches under physiologically relevant conditions, are necessary to confirm the clinical usefulness of these molecules. Antioxid. Redox Signal. 22, 686–729. PMID:25546574

  12. Metrics with vanishing quantum corrections

    International Nuclear Information System (INIS)

    Coley, A A; Hervik, S; Gibbons, G W; Pope, C N

    2008-01-01

    We investigate solutions of the classical Einstein or supergravity equations that solve any set of quantum-corrected Einstein equations in which the Einstein tensor plus a multiple of the metric is equated to a symmetric conserved tensor T_μν(g_αβ, ∂_τ g_αβ, ∂_τ ∂_σ g_αβ, …) constructed from sums of terms involving contractions of the metric and powers of arbitrary covariant derivatives of the curvature tensor. A classical solution, such as an Einstein metric, is called universal if, when evaluated on that Einstein metric, T_μν is a multiple of the metric. A Ricci-flat classical solution is called strongly universal if, when evaluated on that Ricci-flat metric, T_μν vanishes. It is well known that pp-waves in four spacetime dimensions are strongly universal. We focus attention on a natural generalization: Einstein metrics with holonomy Sim(n − 2) in which all scalar invariants are zero or constant. In four dimensions we demonstrate that the generalized Ghanam-Thompson metric is weakly universal and that the Goldberg-Kerr metric is strongly universal; indeed, we show that universality extends to all four-dimensional Sim(2) Einstein metrics. We also discuss generalizations to higher dimensions
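
    In this notation, the two universality conditions stated above amount to (a restatement of the abstract's definitions, not a quotation):

```latex
\text{universal:}\quad T_{\mu\nu}\big|_{g=\bar g} = \lambda\,\bar g_{\mu\nu}
\quad\text{for an Einstein metric } \bar g;
\qquad
\text{strongly universal:}\quad T_{\mu\nu}\big|_{g=\bar g} = 0
\quad\text{for a Ricci-flat } \bar g.
```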

  13. Remarks on G-Metric Spaces

    Directory of Open Access Journals (Sweden)

    Bessem Samet

    2013-01-01

    In 2005, Mustafa and Sims (2006) introduced and studied a new class of generalized metric spaces, called G-metric spaces, as a generalization of metric spaces. We establish some useful propositions to show that many fixed point theorems on (nonsymmetric) G-metric spaces given recently by many authors follow directly from well-known theorems on metric spaces. Our technique can be easily extended to other results, as shown in an application.

  14. Traveler oriented traffic performance metrics using real time traffic data from the Midtown-in-Motion (MIM) project in Manhattan, NY.

    Science.gov (United States)

    2013-10-01

    In a congested urban street network the average traffic speed is an inadequate metric for measuring speed changes that drivers can perceive from changes in traffic control strategies. A driver-oriented metric is needed. Stop frequency distrib...

  15. Optimal strategies for controlling riverine tsetse flies using targets: a modelling study.

    Directory of Open Access Journals (Sweden)

    Glyn A Vale

    2015-03-01

    Tsetse flies occur in much of sub-Saharan Africa, where they transmit the trypanosomes that cause sleeping sickness in humans and nagana in livestock. One of the most economical and effective methods of tsetse control is the use of insecticide-treated screens, called targets, that simulate hosts. Targets have typically been ~1 m², but recently it was shown that the tsetse that occupy riverine situations, and which are the main vectors of sleeping sickness, respond well to targets of only ~0.06 m². The cheapness of these tiny targets suggests the need to reconsider what intensity and duration of target deployments comprise the most cost-effective strategy in various riverine habitats. A deterministic model, written in Excel spreadsheets and managed by Visual Basic for Applications, simulated the births, deaths and movement of tsetse confined to a strip of riverine vegetation composed of segments of habitat in which the tsetse population was either self-sustaining, or not sustainable unless supplemented by immigrants. Results suggested that in many situations the use of tiny targets at high density for just a few months per year would be the most cost-effective strategy for rapidly reducing tsetse densities by the ~90% expected to have a great impact on the incidence of sleeping sickness. Local elimination of tsetse becomes feasible when targets are deployed in isolated situations, or where the only invasion comes from populations that are not self-sustaining. Seasonal use of tiny targets deserves field trials. The ability to recognise habitat containing tsetse populations that are not self-sustaining could improve the planning of all methods of tsetse control, against any species, in riverine, savannah or forest situations. Criteria to assist such recognition are suggested.
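
    The ingredients of such a model (births and deaths, movement between habitat segments, extra mortality from seasonally deployed targets) can be sketched in a few lines. This is a toy illustration, not the authors' spreadsheet model: all rates, segment counts and the seasonal schedule are invented for demonstration.

```python
import numpy as np

def simulate_tsetse(n_seg=20, days=365, r=0.01, move=0.1,
                    target_kill=0.05, target_months=(0, 1, 2)):
    """Toy deterministic model: logistic growth, nearest-neighbour
    movement along a strip of habitat segments, and extra daily
    mortality from targets during a few months of the year."""
    pop = np.ones(n_seg)              # relative density per segment
    K = 1.0                           # carrying capacity
    for day in range(days):
        # births minus natural deaths (logistic growth)
        pop += r * pop * (1.0 - pop / K)
        # movement: fraction `move` spreads equally to the two neighbours
        flux = move * pop
        pop -= flux
        pop[:-1] += 0.5 * flux[1:]
        pop[1:] += 0.5 * flux[:-1]
        pop[0] += 0.5 * flux[0]       # reflecting boundaries
        pop[-1] += 0.5 * flux[-1]
        # seasonal target deployment adds daily mortality
        month = (day // 30) % 12
        if month in target_months:
            pop *= (1.0 - target_kill)
    return pop

final = simulate_tsetse()
```

    With these illustrative rates, three months of target-induced mortality drives the population far below carrying capacity, and slow logistic growth cannot restore it within the year, echoing the abstract's point that a short seasonal deployment can achieve a large reduction.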

  16. Systematic Assessment of Strategies for Lung-targeted Delivery of MicroRNA Mimics

    Science.gov (United States)

    Schlosser, Kenny; Taha, Mohamad; Stewart, Duncan J.

    2018-01-01

    There is considerable interest in the use of synthetic miRNA mimics (or inhibitors) as potential therapeutic agents in pulmonary vascular disease; however, the optimal delivery method to achieve high-efficiency, lung-selective targeting has not been determined. Here, we sought to investigate the relative merits of different lung-targeted strategies for delivering miRNA mimics in rats. Methods: Tissue levels of a synthetic miRNA mimic, cel-miR-39-3p (0.5 nmol in 50 µL invivofectamine/PBS vehicle), were compared in male rats (n=3 rats/method) after delivery by commonly used lung-targeting strategies, including intratracheal liquid instillation (IT-L), intratracheal aerosolization with (IT-AV) or without (IT-A) ventilator assistance, intranasal liquid instillation (IN-L) and intranasal aerosolization (IN-A). Intravenous (IV; via jugular vein), intraperitoneal (IP) and subcutaneous (SC) delivery served as controls. Relative levels of cel-miR-39 were quantified by RT-qPCR. Results: At 2 h post delivery, IT-L showed the highest lung mimic level, which was significantly higher than the levels achieved by all other methods (from ~10- to 10,000-fold, p < …). Mimic levels remained detectable in the lung 24 h after delivery, but were 10- to 100-fold lower. The intrapulmonary distribution of cel-miR-39 was comparable when delivered as either a liquid or an aerosol, with evidence of mimic distribution to both the left and right lung lobes and penetration to distal regions. All lung-targeted strategies showed lung-selective mimic uptake, with mimic levels 10- to 100-fold lower in heart and 100- to 10,000-fold lower in liver, kidney and spleen. In contrast, the IV, SC and IP routes showed comparable or higher mimic levels in non-pulmonary tissues.
    Conclusions: miRNA uptake in the lungs differed markedly, by up to 4 orders of magnitude, demonstrating that the choice of delivery strategy could have a significant impact on potential therapeutic outcomes in preclinical investigations of miRNA-based drug…

  17. Metric-adjusted skew information

    DEFF Research Database (Denmark)

    Liang, Cai; Hansen, Frank

    2010-01-01

    We give a truly elementary proof of the convexity of metric-adjusted skew information following an idea of Effros. We extend earlier results on weak forms of superadditivity to general metric-adjusted skew information. Recently, Luo and Zhang introduced the notion of semi-quantum states on a bipartite system and proved superadditivity of the Wigner-Yanase-Dyson skew informations for such states. We extend this result to the general metric-adjusted skew information. We finally show that a recently introduced extension to parameter values 1 < … is a special case of (unbounded) metric-adjusted skew information.

  18. Constructed wetlands targeting nitrogen removal in agricultural drainage discharge – a subcatchment scale mitigation strategy

    DEFF Research Database (Denmark)

    Kjærgaard, Charlotte; Hoffmann, Carl Christian; Bruun, Jacob Druedahl

    … of recipients, drainage water nutrient loads have a major impact on water quality, and end-of-pipe drainage filter solutions may offer the benefits of a targeted measure. This calls for a paradigm shift towards the development of new, cost-efficient technologies to mitigate site-specific nutrient losses. Analysis of variable mitigation strategies and cost-efficiency reveals that even at low to moderate yearly N removal efficiencies (20-25%), CWs targeting drainage water are highly efficient and cost-efficient measures. Thus, although challenges remain regarding site-specific documentation, CWs targeting drainage discharge have been included as a new mitigation strategy in the Danish environmental regulation.

  19. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    Science.gov (United States)

    Post, J. V.

    1981-01-01

    Software quality metrics was extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

  20. An innovative pre-targeting strategy for tumor cell specific imaging and therapy.

    Science.gov (United States)

    Qin, Si-Yong; Peng, Meng-Yun; Rong, Lei; Jia, Hui-Zhen; Chen, Si; Cheng, Si-Xue; Feng, Jun; Zhang, Xian-Zheng

    2015-09-21

    A programmed pre-targeting system for tumor cell imaging and targeting therapy was established based on the "biotin-avidin" interaction. In this programmed functional system, transferrin-biotin can be actively captured by tumor cells with the overexpression of transferrin receptors, thus achieving the pre-targeting modality. Depending upon avidin-biotin recognition, the attachment of multivalent FITC-avidin to biotinylated tumor cells not only offered the rapid fluorescence labelling, but also endowed the pre-targeted cells with targeting sites for the specifically designed biotinylated peptide nano-drug. Owing to the successful pre-targeting, tumorous HepG2 and HeLa cells were effectively distinguished from the normal 3T3 cells via fluorescence imaging. In addition, the self-assembled peptide nano-drug resulted in enhanced cell apoptosis in the observed HepG2 cells. The tumor cell specific pre-targeting strategy is applicable for a variety of different imaging and therapeutic agents for tumor treatments.

  1. The metric system: An introduction

    Science.gov (United States)

    Lumley, Susan M.

    On 13 Jul. 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on 25 Jul. 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first they examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

  2. The metric system: An introduction

    Energy Technology Data Exchange (ETDEWEB)

    Lumley, S.M.

    1995-05-01

    On July 13, 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on July 25, 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first they examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

  3. THE STRATEGY OF DIRECT INFLATION TARGETING – EXPERIENCES OF THE COUNTRIES OF MIDDLE-EAST EUROPE

    OpenAIRE

    Dorota Zbierzchowska

    2009-01-01

    This paper presents the theoretical assumptions of the strategy of direct inflation targeting, as well as the benefits and potential threats stemming from the adoption of that strategy. An empirical analysis compares the results of implementing the direct inflation targeting (BCI) strategy in Central and Eastern European countries (Poland, the Czech Republic, Romania, Slovakia, and Hungary).

  4. Attack-Resistant Trust Metrics

    Science.gov (United States)

    Levien, Raph

    The Internet is an amazingly powerful tool for connecting people together, unmatched in human history. Yet with that power comes great potential for spam and abuse. Trust metrics are an attempt to compute which people are trustworthy and which are likely attackers. This chapter presents two specific trust metrics developed and deployed on the Advogato website, a community blog for free software developers. This real-world experience demonstrates that the trust metrics fulfilled their goals, but that for good results it is important to match the assumptions of the abstract trust metric computation to the real-world implementation.
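
    Advogato's actual metrics are based on network flow over the certification graph. The toy sketch below conveys only the general idea behind such metrics, that trust propagates from a seed with shrinking capacity so that a few bad certifications cannot admit unboundedly many attackers; it is not the Advogato algorithm, and all names and capacity rules are invented.

```python
from collections import deque

def accepted(seed, edges, capacities):
    """Toy capacity-limited trust propagation: each node at BFS depth k
    may certify at most capacities[k] new members. Returns the accepted
    set. Loosely inspired by flow-based trust metrics; not Advogato."""
    level = {seed: 0}
    out = {seed}
    q = deque([seed])
    while q:
        u = q.popleft()
        depth = level[u]
        if depth + 1 >= len(capacities):
            continue                       # trust horizon reached
        budget = capacities[depth + 1]
        for v in edges.get(u, []):
            if budget <= 0:
                break                      # certification budget spent
            if v not in level:
                level[v] = depth + 1
                out.add(v)
                q.append(v)
                budget -= 1
    return out

certs = {"seed": ["alice", "bob", "mallory"],
         "alice": ["carol"],
         "mallory": ["spam1", "spam2", "spam3"]}
trusted = accepted("seed", certs, capacities=[1, 2, 1])
```

    Here the depth-1 budget of 2 admits alice and bob but not mallory, so mallory's spam certifications never enter the accepted set, illustrating the attack-resistance property the chapter discusses.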

  5. Hierarchical Targeting Strategy for Enhanced Tumor Tissue Accumulation/Retention and Cellular Internalization.

    Science.gov (United States)

    Wang, Sheng; Huang, Peng; Chen, Xiaoyuan

    2016-09-01

    Targeted delivery of therapeutic agents is an important way to improve the therapeutic index and reduce side effects. To design nanoparticles for targeted delivery, both enhanced tumor tissue accumulation/retention and enhanced cellular internalization should be considered simultaneously. So far, there have been very few nanoparticles with immutable structures that can achieve this goal efficiently. Hierarchical targeting, a novel targeting strategy based on stimuli responsiveness, shows good potential to enhance both tumor tissue accumulation/retention and cellular internalization. Here, the recent design and development of hierarchical targeting nanoplatforms, based on changeable particle sizes, switchable surface charges and activatable surface ligands, will be introduced. In general, the targeting moieties in these nanoplatforms are not activated during blood circulation for efficient tumor tissue accumulation, but re-activated by certain internal or external stimuli in the tumor microenvironment for enhanced cellular internalization. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Safety, codes and standards for hydrogen installations. Metrics development and benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    Harris, Aaron P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dedrick, Daniel E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); LaFleur, Angela Christine [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); San Marchi, Christopher W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-04-01

    Automakers and fuel providers have made public commitments to commercialize light duty fuel cell electric vehicles and fueling infrastructure in select US regions beginning in 2014. The development, implementation, and advancement of meaningful codes and standards is critical to enable the effective deployment of clean and efficient fuel cell and hydrogen solutions in the energy technology marketplace. Metrics pertaining to the development and implementation of safety knowledge, codes, and standards are important to communicate progress and inform future R&D investments. This document describes the development and benchmarking of metrics specific to the development of hydrogen specific codes relevant for hydrogen refueling stations. These metrics will be most useful as the hydrogen fuel market transitions from pre-commercial to early-commercial phases. The target regions in California will serve as benchmarking case studies to quantify the success of past investments in research and development supporting safety codes and standards R&D.

  7. Project management metrics, KPIs, and dashboards a guide to measuring and monitoring project performance

    CERN Document Server

    Kerzner, Harold

    2013-01-01

    Today, with the growth of complex projects, stakeholder involvement in projects, and advances in computer technology for dashboard designs, metrics and key performance indicators for project management have become an important focus. This second edition of the bestselling book walks readers through everything from the basics of project management metrics and key performance indicators to establishing targets and using dashboards to monitor performance. The content is aligned with PMI's PMBOK Guide and stresses "value" as the main focal point.

  8. Symmetries of the dual metrics

    International Nuclear Information System (INIS)

    Baleanu, D.

    1998-01-01

    The geometric duality between the metric g_μν and a Killing tensor K_μν is studied. Conditions are found under which the symmetries of the metric g_μν and of the dual metric K_μν coincide. A dual spinning space is constructed without the introduction of torsion. The general results are applied to the case of the Kerr-Newman metric

  9. Analysis of correlation between full-waveform metrics, scan geometry and land-cover: an application over forests

    Directory of Open Access Journals (Sweden)

    F. Pirotti

    2013-10-01

    For a correct use of metrics derived from processing the full-waveform return signal of airborne laser scanner sensors, any correlation that is not related to properties of the reflecting target must be known and, if possible, removed. In this article we report an analysis of correlation between several metrics extracted from the full-waveform return signal, scan characteristics (mainly range) and type of land cover (urban, grassland, forest). The metrics considered are the amplitude, normalized amplitude, width (full width at half maximum), asymmetry indicators, left and right energy content, and the cross-section calculated from width and normalized amplitude taking the range effect into account. The results show that scan geometry in this case does not have a significant impact on scans over forest cover, except for range affecting the amplitude and width distributions. Over complex targets such as vegetation canopy, other factors such as incidence angle have little meaning; therefore corrections for the range effect are the most meaningful. A strong correlation with the type of land cover is also shown by the distribution of the metric values in the different areas considered.
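
    A common way to compute a range-corrected cross-section from full-waveform echo parameters multiplies echo amplitude and width by the fourth power of range, compensating the radar-equation falloff. The sketch below assumes that general form with a hypothetical calibration constant; real calibration constants are instrument-specific and are not given in the abstract.

```python
def cross_section(amplitude, width, rng, c_cal=1.0e-9):
    """Range-corrected backscatter cross-section from echo amplitude
    and echo width: sigma = C_cal * R^4 * amplitude * width.
    C_cal is a hypothetical calibration constant."""
    return c_cal * rng**4 * amplitude * width

# Two echoes from identical targets at different ranges: the raw
# amplitude falls off as 1/R^4, but the range-corrected cross-sections
# agree, which is the point of removing the range correlation.
near = cross_section(amplitude=100.0, width=4.0, rng=500.0)
far = cross_section(amplitude=100.0 * (500.0 / 1000.0) ** 4, width=4.0,
                    rng=1000.0)
```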

  10. Nonrandom Intrafraction Target Motions and General Strategy for Correction of Spine Stereotactic Body Radiotherapy

    International Nuclear Information System (INIS)

    Ma Lijun; Sahgal, Arjun; Hossain, Sabbir; Chuang, Cynthia; Descovich, Martina; Huang, Kim; Gottschalk, Alex; Larson, David A.

    2009-01-01

    Purpose: To characterize nonrandom intrafraction target motions for spine stereotactic body radiotherapy and to develop a method of correction via image guidance. The dependence of target motions, as well as the effectiveness of the correction strategy for lesions of different locations within the spine, was analyzed. Methods and Materials: Intrafraction target motions for 64 targets in 64 patients treated with a total of 233 fractions were analyzed. Based on the target location, the cases were divided into three groups, i.e., cervical (n = 20 patients), thoracic (n = 20 patients), or lumbar-sacrum (n = 24 patients) lesions. For each case, time-lag autocorrelation analysis was performed for each degree of freedom of motion that included both translations (x, y, and z shifts) and rotations (roll, yaw, and pitch). A general correction strategy based on periodic interventions was derived to determine the time interval required between two adjacent interventions, to overcome the patient-specific target motions. Results: Nonrandom target motions were detected for 100% of cases regardless of target locations. Cervical spine targets were found to possess the highest incidence of nonrandom target motion compared with thoracic and lumbar-sacral lesions (p < 0.001). The average time needed to maintain the target motion to within 1 mm of translation or 1 deg. of rotational deviation was 5.5 min, 5.9 min, and 7.1 min for cervical, thoracic, and lumbar-sacrum locations, respectively (at 95% confidence level). Conclusions: A high incidence of nonrandom intrafraction target motions was found for spine stereotactic body radiotherapy treatments. Periodic interventions at approximately every 5 minutes or less were needed to overcome such motions.
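
    The time-lag autocorrelation analysis described above can be illustrated with synthetic data: a trace dominated by a slow systematic drift shows lag-1 autocorrelation near 1 (nonrandom motion), while white noise shows values near 0. This is a minimal sketch on invented data, not the study's patient traces.

```python
import numpy as np

def lag_autocorr(x, lag=1):
    """Sample autocorrelation of a motion trace at a given lag.
    Persistently large values indicate nonrandom (drifting) motion."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

t = np.arange(200)
drift = 0.01 * t                         # slow systematic drift (e.g., mm)
rng = np.random.default_rng(0)
noise = rng.normal(0.0, 0.3, size=200)   # random jitter

r_drift = lag_autocorr(drift + 0.05 * noise, lag=1)  # nonrandom motion
r_noise = lag_autocorr(noise, lag=1)                 # random motion
```

    Applying this per degree of freedom (x, y, z, roll, yaw, pitch) and per fraction mirrors the analysis the abstract describes.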

  11. An Energy-Efficient Target-Tracking Strategy for Mobile Sensor Networks.

    Science.gov (United States)

    Mahboubi, Hamid; Masoudimansour, Walid; Aghdam, Amir G; Sayrafian-Pour, Kamran

    2017-02-01

In this paper, an energy-efficient strategy is proposed for tracking a moving target in an environment with obstacles, using a network of mobile sensors. Typically, the most dominant sources of energy consumption in a mobile sensor network are sensing, communication, and movement. The proposed algorithm first divides the field into a grid of sufficiently small cells. The grid is then represented by a graph whose edges are weighted to reflect the energy consumption of the sensors. The proposed technique searches for near-optimal sensor locations at different time instants to route information from the target to the destination, using a shortest-path algorithm. Simulations confirm the efficacy of the proposed algorithm.
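A minimal sketch of the routing step, assuming a toy four-cell field and edge weights that stand in for per-hop energy cost. Dijkstra's algorithm here is a generic shortest-path routine, not necessarily the authors' exact formulation.

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm on a weighted graph given as
    {node: [(neighbor, weight), ...]}; returns (cost, path)."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    visited = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in visited:
            continue
        visited.add(u)
        if u == goal:
            path = [u]
            while path[-1] != start:
                path.append(prev[path[-1]])
            return d, path[::-1]
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    return float("inf"), []

# Hypothetical 2x2 grid of cells; the heavier A->C edge models a hop
# near an obstacle.
grid = {
    "A": [("B", 1.0), ("C", 4.0)],
    "B": [("D", 1.0)],
    "C": [("D", 1.0)],
    "D": [],
}
```

Routing information from "A" to "D" then picks the low-energy path through "B" rather than the obstacle-adjacent path through "C".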

  12. Accounting for no net loss: A critical assessment of biodiversity offsetting metrics and methods.

    Science.gov (United States)

    Carreras Gamarra, Maria Jose; Lassoie, James Philip; Milder, Jeffrey

    2018-08-15

    Biodiversity offset strategies are based on the explicit calculation of both losses and gains necessary to establish ecological equivalence between impact and offset areas. Given the importance of quantifying biodiversity values, various accounting methods and metrics are continuously being developed and tested for this purpose. Considering the wide array of alternatives, selecting an appropriate one for a specific project can be not only challenging, but also crucial; accounting methods can strongly influence the biodiversity outcomes of an offsetting strategy, and if not well-suited to the context and values being offset, a no net loss outcome might not be delivered. To date there has been no systematic review or comparative classification of the available biodiversity accounting alternatives that aim at facilitating metric selection, and no tools that guide decision-makers throughout such a complex process. We fill this gap by developing a set of analyses to support (i) identifying the spectrum of available alternatives, (ii) understanding the characteristics of each and, ultimately (iii) making the most sensible and sound decision about which one to implement. The metric menu, scoring matrix, and decision tree developed can be used by biodiversity offsetting practitioners to help select an existing metric, and thus achieve successful outcomes that advance the goal of no net loss of biodiversity. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. Evaluation of targeted influenza vaccination strategies via population modeling.

    Directory of Open Access Journals (Sweden)

    John Glasser

BACKGROUND: Because they can generate comparable predictions, mathematical models are ideal tools for evaluating alternative drug or vaccine allocation strategies. To remain credible, however, results must be consistent. Authors of a recent assessment of possible influenza vaccination strategies conclude that older children, adolescents, and young adults are the optimal targets, no matter the objective, and argue for vaccinating them. Authors of two earlier studies concluded, respectively, that optimal targets depend on objectives and cautioned against changing policy. Which should we believe? METHODS AND FINDINGS: In matrices whose elements are contacts between persons by age, the main diagonal always predominates, reflecting contacts between contemporaries. Indirect effects (e.g., impacts of vaccinating one group on morbidity or mortality in others) result from off-diagonal elements. Mixing matrices based on periods in proximity with others have greater sub- and super-diagonals, reflecting contacts between parents and children, and other off-diagonal elements (reflecting, e.g., age-independent contacts among co-workers) than those based on face-to-face conversations. To assess the impact of targeted vaccination, we used a time-usage study's mixing matrix and allowed vaccine efficacy to vary with age. And we derived mortality rates either by dividing observed deaths attributed to pneumonia and influenza by average annual cases from a demographically realistic SEIRS model, or by multiplying those rates by ratios of (versus adding to them differences between) pandemic and pre-pandemic mortalities. CONCLUSIONS: In our simulations, vaccinating older children, adolescents, and young adults averts the most cases, but vaccinating either younger children and older adults or young adults averts the most deaths, depending on the age distribution of mortality. These results are consistent with those of the earlier studies.

  14. Overview of journal metrics

    Directory of Open Access Journals (Sweden)

    Kihong Kim

    2018-02-01

Various kinds of metrics used for the quantitative evaluation of scholarly journals are reviewed. The impact factor and related metrics, including the immediacy index and the aggregate impact factor, which are provided by the Journal Citation Reports, are explained in detail. The Eigenfactor score and the article influence score are also reviewed. In addition, journal metrics such as CiteScore, Source Normalized Impact per Paper, SCImago Journal Rank, h-index, and g-index are discussed. The limitations and problems of these metrics are pointed out. We should be cautious not to rely too heavily on these quantitative measures when evaluating journals or researchers.
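For concreteness, the two-year impact factor mentioned above is a simple ratio; the journal figures below are invented for illustration.

```python
def impact_factor(citations_in_year, citable_items_prev_two_years):
    """Two-year journal impact factor for year Y: citations received in
    Y to items published in Y-1 and Y-2, divided by the number of
    citable items published in Y-1 and Y-2."""
    return citations_in_year / citable_items_prev_two_years

# Invented figures: 480 citations in 2018 to papers from 2016-2017,
# which together published 120 + 80 = 200 citable items.
jif = impact_factor(480, 120 + 80)
```

The review's caution applies directly here: the single number hides the citation distribution across articles.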

  15. Targeted intervention strategies to optimise diversion of BMW in the Dublin, Ireland region

    International Nuclear Information System (INIS)

    Purcell, M.; Magette, W.L.

    2011-01-01

Highlights: → Previous research indicates that targeted strategies designed for specific areas should lead to improved diversion. → Survey responses and GIS model predictions from previous research were the basis for goal setting. → Logic modelling and behavioural research were then employed to develop site-specific management intervention strategies. → Waste management initiatives can be tailored to the specific needs of areas rather than the one-size-fits-all approach currently used. - Abstract: Urgent transformation is required in Ireland to divert biodegradable municipal waste (BMW) from landfill and prevent increases in overall waste generation. When BMW is optimally managed, it becomes a resource with value instead of an unwanted by-product requiring disposal. An analysis of survey responses from the commercial and residential sectors for the Dublin region in previous research by the authors showed that attitudes towards and behaviour regarding municipal solid waste are spatially variable. This finding indicates that targeted intervention strategies designed for specific geographic areas should lead to improved diversion rates of BMW from landfill, a requirement of the Landfill Directive 1999/31/EC. In the research described in this paper, survey responses and GIS model predictions from previous research were the basis for goal setting, after which logic modelling and behavioural research were employed to develop site-specific waste management intervention strategies. The main strategies devised include (a) roll-out of the Brown Bin (Organics) Collection and Community Workshops in Dun Laoghaire Rathdown, (b) initiation of a Community Composting Project in Dublin City, (c) implementation of a Waste Promotion and Motivation Scheme in South Dublin, (d) development and distribution of a Waste Booklet to promote waste reduction activities in Fingal, (e) region-wide distribution of a Waste Booklet to the commercial sector, and (f) a Greening Irish Pubs Initiative. Each of these

  16. Holographic Spherically Symmetric Metrics

    Science.gov (United States)

    Petri, Michael

    The holographic principle (HP) conjectures, that the maximum number of degrees of freedom of any realistic physical system is proportional to the system's boundary area. The HP has its roots in the study of black holes. It has recently been applied to cosmological solutions. In this article we apply the HP to spherically symmetric static space-times. We find that any regular spherically symmetric object saturating the HP is subject to tight constraints on the (interior) metric, energy-density, temperature and entropy-density. Whenever gravity can be described by a metric theory, gravity is macroscopically scale invariant and the laws of thermodynamics hold locally and globally, the (interior) metric of a regular holographic object is uniquely determined up to a constant factor and the interior matter-state must follow well defined scaling relations. When the metric theory of gravity is general relativity, the interior matter has an overall string equation of state (EOS) and a unique total energy-density. Thus the holographic metric derived in this article can serve as simple interior 4D realization of Mathur's string fuzzball proposal. Some properties of the holographic metric and its possible experimental verification are discussed. The geodesics of the holographic metric describe an isotropically expanding (or contracting) universe with a nearly homogeneous matter-distribution within the local Hubble volume. Due to the overall string EOS the active gravitational mass-density is zero, resulting in a coasting expansion with Ht = 1, which is compatible with the recent GRB-data.

  17. Tumor-targeted inhibition by a novel strategy - mimoretrovirus expressing siRNA targeting the Pokemon gene.

    Science.gov (United States)

    Tian, Zhiqiang; Wang, Huaizhi; Jia, Zhengcai; Shi, Jinglei; Tang, Jun; Mao, Liwei; Liu, Hongli; Deng, Yijing; He, Yangdong; Ruan, Zhihua; Li, Jintao; Wu, Yuzhang; Ni, Bing

    2010-12-01

The Pokemon gene has crucial but versatile functions in cell differentiation, proliferation and tumorigenesis. It is a master regulator of the ARF-HDM2-p53 and Rb-E2F pathways. The facts that the expression of Pokemon is essential for tumor formation and that many kinds of tumors over-express the Pokemon gene make it an attractive target for therapeutic intervention in cancer treatment. In this study, we used an RNAi strategy to silence the Pokemon gene in a cervical cancer model. To address the issues of tumor-specific delivery and durable expression of siRNA, we applied the Arg-Gly-Asp (RGD) peptide ligand and polylysine (K18) fusion peptide to encapsulate a recombinant retrovirus plasmid expressing a siRNA targeting the Pokemon gene, producing a 'mimoretrovirus'. At a fusion peptide/plasmid charge ratio of 2.0, the mimoretrovirus formed stable and homogenous nanoparticles, and provided complete DNase I protection and complete gel retardation. This nanoparticle inhibited SiHa cell proliferation and invasion, while it promoted SiHa cell apoptosis. The binding of the nanoparticle to SiHa cells was mediated via the RGD-integrin αvβ3 interaction, as evidenced by the finding that unconjugated RGD peptide inhibited this binding significantly. This tumor-targeting mimoretrovirus exhibited excellent anti-tumor capacity in vivo in a nude mouse model. Moreover, the mimoretrovirus inhibited tumor growth with much higher efficiency than a recombinant retrovirus expressing siRNA or the K18/P4 nanoparticle lacking the RGD peptide. The results suggest that the RNAi/RGD-based mimoretrovirus developed in this study represents a novel anti-tumor strategy that may be applicable to much research involving cancer therapy and, thus, has promising potential as a cervical cancer treatment.

  18. Tumor trailing strategy for intensity-modulated radiation therapy of moving targets

    International Nuclear Information System (INIS)

    Trofimov, Alexei; Vrancic, Christian; Chan, Timothy C. Y.; Sharp, Gregory C.; Bortfeld, Thomas

    2008-01-01

Internal organ motion during the course of radiation therapy of cancer affects the distribution of the delivered dose and, generally, reduces its conformality to the targeted volume. Previously proposed approaches aimed at mitigating the effect of internal motion in intensity-modulated radiation therapy (IMRT) included expansion of the target margins, motion-correlated delivery (e.g., respiratory gating, tumor tracking), and adaptive treatment plan optimization employing a probabilistic description of motion. We describe and test the tumor trailing strategy, which utilizes the synergy of motion-adaptive treatment planning and delivery methods. We regard the (rigid) target motion as a superposition of a relatively fast cyclic component (e.g., respiratory) and slow aperiodic trends (e.g., the drift of the exhalation baseline). In the trailing approach, these two components of motion are decoupled and dealt with separately. Real-time motion monitoring is employed to identify the 'slow' shifts, which are then corrected by applying setup adjustments. The delivery does not track the target position exactly, but trails the systematic trend due to the delay between the time a shift occurs, is reliably detected, and is subsequently corrected. The 'fast' cyclic motion is accounted for with robust motion-adaptive treatment planning, which allows for variability in motion parameters (e.g., mean and extrema of the tidal volume, variable period of respiration, and expiratory duration). Motion-surrogate data from gated IMRT treatments were used to provide probability distribution data for motion-adaptive planning and to test algorithms that identified systematic trends in the character of motion. Sample IMRT fields were delivered on a clinical linear accelerator to a programmable moving phantom. Dose measurements were performed with a commercial two-dimensional ion-chamber array. The results indicate that by reducing intrafractional motion variability, the trailing strategy
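A minimal sketch of the trailing idea of separating slow drift from fast cyclic motion, assuming a moving-average baseline as the 'slow' trend estimator and a fixed correction threshold. Both are illustrative choices, not the paper's algorithm, and the traces are synthetic.

```python
import math

def detect_drift(positions, window, threshold):
    """Indices at which the moving-average baseline of a position trace
    has drifted more than `threshold` from its value at the previous
    correction -- a toy trigger for setup adjustments."""
    corrections = []
    baseline_ref = None
    for i in range(window - 1, len(positions)):
        baseline = sum(positions[i - window + 1 : i + 1]) / window
        if baseline_ref is None:
            baseline_ref = baseline          # reference at first estimate
        elif abs(baseline - baseline_ref) > threshold:
            corrections.append(i)            # correction applied here
            baseline_ref = baseline
    return corrections

# Fast +/-1 mm cycle plus a slow 0.02 mm/sample baseline drift.
drifting = [math.sin(0.8 * t) + 0.02 * t for t in range(200)]
cyclic_only = [math.sin(0.8 * t) for t in range(200)]
```

Because the window spans roughly two respiratory cycles, the fast component averages out and only the systematic trend triggers corrections, mirroring the decoupling described above.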

  19. Emerging Therapeutic Strategies for Targeting Chronic Myeloid Leukemia Stem Cells

    Directory of Open Access Journals (Sweden)

    Ahmad Hamad

    2013-01-01

Chronic myeloid leukemia (CML) is a clonal myeloproliferative disorder. Current targeted therapies designed to inhibit the tyrosine kinase activity of the BCR-ABL oncoprotein have made a significant breakthrough in the treatment of CML patients. However, CML remains a chronic disease that a patient must manage for life. Although tyrosine kinase inhibitor (TKI) therapy has completely transformed the prognosis of CML, it has made therapeutic management more complex. The interruption of TKI treatment results in early disease progression because it does not eliminate quiescent CML stem cells, which remain a potential reservoir for disease relapse. This highlights the need to develop new therapeutic strategies for CML to achieve a permanent cure and to allow TKI interruption. This review summarizes recent research on alternative targeted therapies, with a particular focus on some important signaling pathways (such as Alox5, Hedgehog, Wnt/β-catenin, autophagy, and PML) that have the potential to target CML stem cells and potentially provide a cure for CML.

  20. Applying graphs and complex networks to football metric interpretation.

    Science.gov (United States)

    Arriaza-Ardiles, E; Martín-González, J M; Zuniga, M D; Sánchez-Flores, J; de Saa, Y; García-Manso, J M

    2018-02-01

This work presents a methodology for analysing the interactions between players in a football team, from the point of view of graph theory and complex networks. We model the complex network of passing interactions between players of the same team in 32 official matches of the Liga de Fútbol Profesional (Spain), using a passing/reception graph. This methodology allows us to understand the play structure of the team by analysing the offensive phases of game-play. We utilise two different strategies for characterising the contribution of the players to the team: the clustering coefficient, and centrality metrics (closeness and betweenness). We show the application of this methodology by analysing the performance of a professional Spanish team according to these metrics and the distribution of passing/reception in the field. Keeping in mind the dynamic nature of collective sports, in the future we will incorporate metrics which allow us to analyse the performance of the team according to the circumstances of game-play and to different contextual variables such as the utilisation of field space, time, and the ball in specific tactical situations. Copyright © 2017 Elsevier B.V. All rights reserved.
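As a toy illustration of the centrality metrics named above, the sketch below computes closeness centrality on a hypothetical five-player passing graph (the paper's analysis is of real match data; player names and edges here are invented).

```python
from collections import deque

def closeness(adj, node):
    """Closeness centrality: (n - 1) divided by the sum of unweighted
    shortest-path distances (BFS) from `node` to every other node."""
    dist = {node: 0}
    q = deque([node])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    total = sum(d for other, d in dist.items() if other != node)
    return (len(adj) - 1) / total if total else 0.0

# Toy passing network: midfielder M links defence (D1, D2) with
# attack (A1, A2) and should score highest.
passes = {
    "D1": ["D2", "M"],
    "D2": ["D1", "M"],
    "M": ["D1", "D2", "A1", "A2"],
    "A1": ["M", "A2"],
    "A2": ["M", "A1"],
}
```

In this reading, a high-closeness player is one who can reach (or be reached by) teammates in few passes, matching the paper's use of centrality to characterise a player's contribution.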

  1. A strategy to correct for intrafraction target translation in conformal prostate radiotherapy: Simulation results

    International Nuclear Information System (INIS)

    Keall, P. J.; Lauve, A. D.; Hagan, M. P.; Siebers, J. V.

    2007-01-01

A strategy is proposed in which intrafraction internal target translation is corrected for by repositioning the multileaf collimator aperture to conform to the new target pose in the beam projection, while the beam monitor units are adjusted to account for the change in the geometric relationship between the target and the beam. The purpose of this study was to investigate the dosimetric stability of the prostate and critical structures in the presence of internal target translation using the dynamic compensation strategy. Twenty-five previously treated prostate cancer patients were replanned using a four-field conformal technique to deliver 72 Gy to 95% of the planning target volume (PTV). Internal translation was introduced by displacing the prostate PTV (no rotation or deformation was considered). Thirty-six randomly selected isotropic displacements of magnitude 0.5, 1.0, 1.5 and 2.0 cm were sampled for each patient, for a total of 3600 errors. Due to their anatomic relation to the prostate, the rectum and bladder contours were also moved with the same magnitude and direction as the prostate. The dynamic compensation strategy was used to correct each of these errors by conforming the beam apertures to the new target pose and adjusting the monitor units using inverse-square and off-axis factor corrections. The dynamic compensation strategy plans were then compared to the original treatment plans via dose-volume histogram (DVH) analysis. Changes of more than 5% of the prescription dose (3.6 Gy) were deemed clinically significant. Compared to the original treatment plans, the dynamic compensation strategy produced small discrepancies in isodose distributions and DVH analyses for all structures considered apart from the femoral heads. These differences increased with the magnitude of the internal motion. Coverage of the PTV was excellent: D5, D95, and Dmean were not increased or decreased by more than 5% of the prescription dose for any of the 3600 errors.
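A minimal sketch of the monitor-unit adjustment mentioned above, assuming only the inverse-square part of the correction (the off-axis factor is omitted) and hypothetical distances. Dose at the target scales as MU divided by distance squared, so keeping the dose constant means scaling MU by the squared ratio of the new to the planned source-to-target distance.

```python
def corrected_mu(mu_planned, dist_planned, dist_new):
    """Inverse-square monitor-unit correction: scale MU so the dose at
    the target stays constant when the source-to-target distance
    changes after the aperture is repositioned."""
    return mu_planned * (dist_new / dist_planned) ** 2

# Hypothetical example: target drifts 1 cm farther from the source
# (100 cm -> 101 cm), so slightly more MU are required.
```

A real implementation would multiply in the off-axis factor as well, per the correction described in the abstract.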

  2. The Publications Tracking and Metrics Program at NOAO: Challenges and Opportunities

    Science.gov (United States)

    Hunt, Sharon

    2015-08-01

The National Optical Astronomy Observatory (NOAO) is the U.S. national research and development center for ground-based nighttime astronomy. The NOAO librarian manages the organization’s publications tracking and metrics program, which consists of three components: identifying publications, organizing citation data, and disseminating publications information. We are developing methods to streamline these tasks, better organize our data, provide greater accessibility to publications data, and add value to our services. Our publications tracking process is complex, as we track refereed publications citing data from several sources: NOAO telescopes at two observatory sites, telescopes of consortia in which NOAO participates, the NOAO Science Archive, and NOAO-granted community-access time on non-NOAO telescopes. We also identify and document our scientific staff publications. In addition, several individuals contribute publications data. In the past year, we made several changes in our publications tracking and metrics program. To better organize our data and streamline the creation of reports and metrics, we created a MySQL publications database. When designing this relational database, we considered ease of use, the ability to incorporate data from various sources, efficiency in data inputting and sorting, and potential for growth. We also considered the types of metrics we wished to generate from our publications data based on our target audiences and the messages we wanted to convey. To increase accessibility and dissemination of publications information, we developed a publications section on the library’s website, with citation lists, acknowledgements guidelines, and metrics. We are now developing a searchable online database for our website using PHP. The publications tracking and metrics program has provided many opportunities for the library to market its services and contribute to the organization’s mission. As we make decisions on collecting, organizing

  3. Mung bean nuclease treatment increases capture specificity of microdroplet-PCR based targeted DNA enrichment.

    Directory of Open Access Journals (Sweden)

    Zhenming Yu

Targeted DNA enrichment coupled with next generation sequencing has been increasingly used for interrogation of select sub-genomic regions at high depth of coverage in a cost effective manner. Specificity measured by on-target efficiency is a key performance metric for target enrichment. Non-specific capture leads to off-target reads, resulting in waste of sequencing throughput on irrelevant regions. Microdroplet-PCR allows simultaneous amplification of up to thousands of regions in the genome and is among the most commonly used strategies for target enrichment. Here we show that carryover of single-stranded template genomic DNA from microdroplet-PCR constitutes a major contributing factor for off-target reads in the resultant libraries. Moreover, treatment of microdroplet-PCR enrichment products with a nuclease specific to single-stranded DNA alleviates off-target load and improves enrichment specificity. We propose that nuclease treatment of enrichment products should be incorporated in the workflow of targeted sequencing using microdroplet-PCR for target capture. These findings may have a broad impact on other PCR based applications for which removal of template DNA is beneficial.

  4. Correlations between contouring similarity metrics and simulated treatment outcome for prostate radiotherapy

    Science.gov (United States)

    Roach, D.; Jameson, M. G.; Dowling, J. A.; Ebert, M. A.; Greer, P. B.; Kennedy, A. M.; Watt, S.; Holloway, L. C.

    2018-02-01

Many similarity metrics exist for inter-observer contouring variation studies; however, no correlation between metric choice and prostate cancer radiotherapy dosimetry has been explored. These correlations were investigated in this study. Two separate trials were undertaken, the first a thirty-five patient cohort with three observers, the second a five patient dataset with ten observers. Clinical and planning target volumes (CTV and PTV), rectum, and bladder were independently contoured by all observers in each trial. In the first trial, structures were contoured on T2-weighted MRI and transferred onto CT following rigid registration for treatment planning. In the second trial, structures were contoured directly on CT. STAPLE and majority-voting volumes were generated as reference gold-standard volumes for each structure for the two trials, respectively. VMAT treatment plans (78 Gy to the PTV) were simulated for observer and gold-standard volumes, and dosimetry was assessed using multiple radiobiological metrics. Correlations between contouring similarity metrics and dosimetry were calculated using Spearman’s rank correlation coefficient. No correlations were observed between contouring similarity metrics and dosimetry for the CTV within either trial. Volume similarity correlated most strongly with radiobiological metrics for the PTV in both trials, including TCP (Poisson) (ρ = 0.57, 0.65), TCP (Logit) (ρ = 0.39, 0.62), and EUD (ρ = 0.43, 0.61) for each respective trial. Rectum and bladder metric correlations displayed no consistency across the two trials. PTV volume similarity was found to significantly correlate with rectum normal tissue complication probability (ρ = 0.33, 0.48). Minimal to no correlations with dosimetry were observed for overlap or boundary contouring metrics. Future inter-observer contouring variation studies for prostate cancer should incorporate volume similarity to provide additional insights into dosimetry during analysis.
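Spearman's rank correlation, as used above, is the Pearson correlation of the rank vectors; a self-contained sketch with average-rank tie handling (the data in the example are illustrative, not the trial data):

```python
def rank(values):
    """1-based ranks with ties assigned their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of tied positions, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Because only ranks matter, any monotone relation between a similarity metric and a dose metric yields rho near ±1, which is why it suits the metric-versus-dosimetry comparison described above.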

  5. Increasing Army Supply Chain Performance: Using an Integrated End to End Metrics System

    Science.gov (United States)

    2017-01-01

Current metrics examined include delivery schedules, delinquent contracts, PQDRs/SDRs, forecasting accuracy, reliability, demand management, asset management strategies, and the pipeline; these are identified and characterized by statistical analysis. The study proposed a framework and tool for inventory management based on factors such as

  6. Metric regularity and subdifferential calculus

    International Nuclear Information System (INIS)

    Ioffe, A D

    2000-01-01

    The theory of metric regularity is an extension of two classical results: the Lyusternik tangent space theorem and the Graves surjection theorem. Developments in non-smooth analysis in the 1980s and 1990s paved the way for a number of far-reaching extensions of these results. It was also well understood that the phenomena behind the results are of metric origin, not connected with any linear structure. At the same time it became clear that some basic hypotheses of the subdifferential calculus are closely connected with the metric regularity of certain set-valued maps. The survey is devoted to the metric theory of metric regularity and its connection with subdifferential calculus in Banach spaces

  7. Feature Extraction and Selection Strategies for Automated Target Recognition

    Science.gov (United States)

    Greene, W. Nicholas; Zhang, Yuhan; Lu, Thomas T.; Chao, Tien-Hsin

    2010-01-01

Several feature extraction and selection methods for an existing automatic target recognition (ATR) system using JPL's Grayscale Optical Correlator (GOC) and Optimal Trade-Off Maximum Average Correlation Height (OT-MACH) filter were tested using MATLAB. The ATR system is composed of three stages: a cursory region-of-interest (ROI) search using the GOC and OT-MACH filter, a feature extraction and selection stage, and a final classification stage. Feature extraction and selection concerns transforming potential target data into more useful forms, as well as selecting important subsets of that data which may aid in detection and classification. The strategies tested were built around two popular extraction methods: Principal Component Analysis (PCA) and Independent Component Analysis (ICA). Performance was measured based on the classification accuracy and free-response receiver operating characteristic (FROC) output of a support vector machine (SVM) and a neural net (NN) classifier.
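A pure-Python sketch of the PCA step: the leading principal component found by power iteration on the sample covariance matrix. The points below are invented 2-D stand-ins for feature vectors (the paper's pipeline used MATLAB).

```python
def first_pc(data, iters=200):
    """Leading principal component of `data` (rows = samples) via
    power iteration on the sample covariance matrix."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    X = [[row[j] - means[j] for j in range(d)] for row in data]
    # sample covariance matrix
    C = [[sum(X[i][a] * X[i][b] for i in range(n)) / (n - 1)
          for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(C[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Points scattered along y = x: the leading component should align
# with the (1, 1) direction.
pts = [[0, 0.1], [1, 0.9], [2, 2.1], [3, 2.9], [4, 4.1]]
```

Projecting ROI feature vectors onto the top few such components is the dimensionality-reduction step that precedes the SVM or NN classifier.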

  8. Context-dependent ATC complexity metric

    NARCIS (Netherlands)

    Mercado Velasco, G.A.; Borst, C.

    2015-01-01

    Several studies have investigated Air Traffic Control (ATC) complexity metrics in a search for a metric that could best capture workload. These studies have shown how daunting the search for a universal workload metric (one that could be applied in different contexts: sectors, traffic patterns,

  9. Development strategy and targets of CGNPG

    International Nuclear Information System (INIS)

    Zan Yunlong

    2002-01-01

The development of the nuclear power industry in Guangdong results from the steady implementation of a catch-up strategy aimed at the advanced world level in the nuclear power industry. China Guangdong Nuclear Power (Holding) Co., Ltd. (CGNPC) started from the Daya Bay Nuclear Power Station (GNPS). In the form of a joint venture, GNPS has obtained sophisticated technology, management expertise and human resources both at home and abroad, and has successfully completed the learning curve from importing, digesting and absorbing to innovating and self-improving. Under the principle of maintaining continuous nuclear power development by reinvesting the returns on the operating nuclear power stations, the second nuclear power project, Ling Ao Nuclear Power Station (LNPS), is progressing well and preparation for the third nuclear power project is now in full swing. With a rolling-on development mechanism established, Daya Bay has become the cradle of nuclear power development in Guangdong. In the 21st century, CGNPC is facing new challenges and opportunities. CGNPC will uphold the principle of maintaining continuous nuclear power development by reinvesting the returns on the operating nuclear power stations, brace itself for market competition and explore sustained development of nuclear power in China by pursuing constant innovation in technology, management, system and concept. The strategy framework for the future development of CGNPC is defined as follows: - to establish three-dimensional strategic targets; - to pursue two-step development with the year 2015 as the dividing point; - to promote concerted development of nuclear power, associated industries and supporting services

  10. Fiscal sustainability and fiscal policy targets

    DEFF Research Database (Denmark)

    Andersen, Torben M.

Analyses of fiscal sustainability have become integral parts of fiscal policy planning due to high debt levels and projected demographic changes. A popular metric by which to evaluate sustainability gaps is the so-called S2 metric, given as the permanent change in the primary budget balance (relative to GDP) needed to meet the intertemporal budget constraint. While a very useful metric, it also suffers from some problems, and the paper discusses some of the problems with this metric as a way to assess fiscal sustainability. A particularly important issue is the extent to which the S2 indicator can be given a normative interpretation, and this issue is extensively discussed. The paper ends by discussing the formulation of fiscal policy targets to ensure fiscal sustainability.
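A toy numeric reading of the S2 metric described above, assuming a constant growth-adjusted discount rate and that the last projected primary balance holds forever. This is a deliberate simplification of the actual intertemporal calculation; all figures are illustrative.

```python
def s2_gap(debt, primary_balances, rho):
    """S2 sustainability gap: the permanent addition to the primary
    balance (share of GDP) needed so that the present value of future
    balances covers the initial debt ratio. `primary_balances` are
    projected ratios for years 1..T, with the year-T value assumed to
    hold forever after; `rho` > 0 is the growth-adjusted discount rate."""
    T = len(primary_balances)
    pv = sum(pb / (1 + rho) ** t
             for t, pb in enumerate(primary_balances, start=1))
    # perpetuity tail: constant balance from year T+1 onward
    pv += primary_balances[-1] / rho / (1 + rho) ** T
    annuity = 1 / rho  # present value of a permanent 1-unit adjustment
    return (debt - pv) / annuity

# Debt at 60% of GDP, zero projected primary balances, rho = 2%:
# the permanent adjustment needed is rho * debt = 1.2% of GDP.
```

With balanced projections the gap reduces to rho times the debt ratio, which is the usual debt-stabilizing primary-balance condition.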

  11. DLA Energy Biofuel Feedstock Metrics Study

    Science.gov (United States)

    2012-12-11

Metric 1: state invasiveness ranking (moderately/highly invasive). Metric 2: genetically modified organism (GMO) hazard (Yes/No and hazard category). Metric 3: species hybridization. These metrics are scored across the biofuel life-cycle stages (through Stage 4, biofuel distribution, and Stage 5, biofuel use). Biofuel production may utilize GMO microbial or microalgae species across the applicable biofuel life cycles (stages 1–3). The following consequence Metrics 4–6 then

  12. Analisis Strategi Segmenting, Targeting Dan Positioning Pada Perushaan Asuransi Pt.(persero) Jiwasraya, Pekanbaru

    OpenAIRE

    Sihotang, Jon Predianto; Karneli, Okta

    2017-01-01

This research aims to identify and analyze the strategy of segmenting, targeting and positioning at the insurance company PT. (Persero) Asuransi Jiwasraya, Pekanbaru. In the last 5 (five) years, the company experienced unstable marketing performance, and the author believes that the problem lies in marketing strategies that are not running well. The data were gained directly from key informants through interviews in order to obtain accurate information. The method used in this research was descript...

  13. Symmetries of Taub-NUT dual metrics

    International Nuclear Information System (INIS)

    Baleanu, D.; Codoban, S.

    1998-01-01

    Recently geometric duality was analyzed for a metric which admits Killing tensors. An interesting example arises when the manifold has Killing-Yano tensors. The symmetries of the dual metrics in the case of Taub-NUT metric are investigated. Generic and non-generic symmetries of dual Taub-NUT metric are analyzed

  14. Multistage Targeting Strategy Using Magnetic Composite Nanoparticles for Synergism of Photothermal Therapy and Chemotherapy.

    Science.gov (United States)

    Wang, Yi; Wei, Guoqing; Zhang, Xiaobin; Huang, Xuehui; Zhao, Jingya; Guo, Xing; Zhou, Shaobing

    2018-03-01

    Mitochondrial-targeting therapy is an emerging strategy for enhanced cancer treatment. In the present study, a multistage targeting strategy using doxorubicin-loaded magnetic composite nanoparticles is developed for enhanced efficacy of photothermal and chemical therapy. The nanoparticles with a core-shell-SS-shell architecture are composed of a core of Fe3O4 colloidal nanocrystal clusters, an inner shell of polydopamine (PDA) functionalized with triphenylphosphonium (TPP), and an outer shell of methoxy poly(ethylene glycol) (mPEG) linked to the PDA by disulfide bonds. The magnetic core can increase the accumulation of nanoparticles at the tumor site for the first stage of tumor tissue targeting. After the nanoparticles enter the tumor cells, the second stage of mitochondrial targeting is realized as the mPEG shell is detached from the nanoparticles by redox responsiveness to expose the TPP. Using near-infrared light irradiation at the tumor site, a photothermal effect is generated from the PDA photosensitizer, leading to a dramatic decrease in mitochondrial membrane potential. Simultaneously, the loaded doxorubicin can rapidly enter the mitochondria and subsequently damage the mitochondrial DNA, resulting in cell apoptosis. Thus, the synergism of photothermal therapy and chemotherapy targeting the mitochondria significantly enhances the cancer treatment. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Metric learning

    CERN Document Server

    Bellet, Aurelien; Sebban, Marc

    2015-01-01

    Similarity between objects plays an important role in both human cognitive processes and artificial systems for recognition and categorization. How to appropriately measure such similarities for a given task is crucial to the performance of many machine learning, pattern recognition and data mining methods. This book is devoted to metric learning, a set of techniques to automatically learn similarity and distance functions from data that has attracted a lot of interest in machine learning and related fields in the past ten years. In this book, we provide a thorough review of the metric learning...

  16. Technical Privacy Metrics: a Systematic Survey

    OpenAIRE

    Wagner, Isabel; Eckhoff, David

    2018-01-01

    The goal of privacy metrics is to measure the degree of privacy enjoyed by users in a system and the amount of protection offered by privacy-enhancing technologies. In this way, privacy metrics contribute to improving user privacy in the digital world. The diversity and complexity of privacy metrics in the literature make an informed choice of metrics challenging. As a result, instead of using existing metrics, n...

  17. On Information Metrics for Spatial Coding.

    Science.gov (United States)

    Souza, Bryan C; Pavão, Rodrigo; Belchior, Hindiael; Tort, Adriano B L

    2018-04-01

    The hippocampal formation is involved in navigation, and its neuronal activity exhibits a variety of spatial correlates (e.g., place cells, grid cells). The quantification of the information encoded by spikes has been standard procedure to identify which cells have spatial correlates. For place cells, most of the established metrics derive from Shannon's mutual information (Shannon, 1948), and convey information rate in bits/s or bits/spike (Skaggs et al., 1993, 1996). Despite their widespread use, the performance of these metrics in relation to the original mutual information metric has never been investigated. In this work, using simulated and real data, we find that the current information metrics correlate less with the accuracy of spatial decoding than the original mutual information metric. We also find that the top informative cells may differ among metrics, and show a surrogate-based normalization that yields comparable spatial information estimates. Since different information metrics may identify different neuronal populations, we discuss current and alternative definitions of spatially informative cells, which affect the metric choice. Copyright © 2018 IBRO. Published by Elsevier Ltd. All rights reserved.
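    The Skaggs et al. (1993) information measures referenced above have a standard closed form: the information rate is I = Σᵢ pᵢ λᵢ log₂(λᵢ/λ̄) in bits/s, and dividing by the mean rate λ̄ gives bits/spike. A minimal sketch under those conventional definitions (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def skaggs_information(occupancy, rates):
    """Spatial information in the sense of Skaggs et al. (1993).

    occupancy : time spent in each spatial bin (arbitrary units)
    rates     : mean firing rate (Hz) of the cell in each bin
    Returns (bits_per_second, bits_per_spike).
    """
    p = np.asarray(occupancy, dtype=float)
    p = p / p.sum()                      # occupancy probability per bin
    lam = np.asarray(rates, dtype=float)
    mean_rate = np.sum(p * lam)          # overall mean firing rate (lambda-bar)
    nz = lam > 0                         # zero-rate bins contribute no information
    ratio = lam[nz] / mean_rate
    bits_per_sec = np.sum(p[nz] * lam[nz] * np.log2(ratio))
    return bits_per_sec, bits_per_sec / mean_rate
```

    A cell firing uniformly everywhere carries zero spatial information under this measure, while a cell confined to a subset of bins scores positively; the abstract's point is that rankings under these derived measures need not match rankings under Shannon's original mutual information.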

  18. The n-by-T Target Discharge Strategy for Inpatient Units.

    Science.gov (United States)

    Parikh, Pratik J; Ballester, Nicholas; Ramsey, Kylie; Kong, Nan; Pook, Nancy

    2017-07-01

    Ineffective inpatient discharge planning often causes discharge delays and upstream boarding. While an optimal discharge strategy that works across all units at a hospital is likely difficult to identify and implement, a strategy that provides a reasonable target to the discharge team appears feasible. We used observational and retrospective data from an inpatient trauma unit at a Level 2 trauma center in the Midwest US. Our proposed novel n-by-T strategy (discharge n patients by the Tth hour) was evaluated using a validated simulation model. Outcome measures were of two types: time-based (mean discharge completion and upstream boarding times) and capacity-based (increase in annual inpatient and upstream bed hours). Data from the pilot implementation of a 2-by-12 strategy at the unit were obtained and analyzed. The model suggested that the 1-by-T and 2-by-T strategies could advance the mean completion times by over 1.38 and 2.72 h, respectively (for 10 AM ≤ T ≤ noon, occupancy rate = 85%); the corresponding mean boarding time reductions were nearly 11% and 15%. These strategies could increase the availability of annual inpatient and upstream bed hours by at least 2,469 and 500, respectively. At 100% occupancy rate, the hospital-favored 2-by-12 strategy reduced the mean boarding time by 26.1%. A pilot implementation of the 2-by-12 strategy at the unit corroborated the model findings, with a 1.98-h advancement in completion times. Target discharge strategies, such as the n-by-T, can help substantially reduce discharge lateness and upstream boarding, especially during high unit occupancy. To sustain implementation, commitment from the unit staff and physicians is vital, and may require some training.

  19. Generalized Painleve-Gullstrand metrics

    Energy Technology Data Exchange (ETDEWEB)

    Lin Chunyu [Department of Physics, National Cheng Kung University, Tainan 70101, Taiwan (China)], E-mail: l2891112@mail.ncku.edu.tw; Soo Chopin [Department of Physics, National Cheng Kung University, Tainan 70101, Taiwan (China)], E-mail: cpsoo@mail.ncku.edu.tw

    2009-02-02

    An obstruction to the implementation of spatially flat Painleve-Gullstrand (PG) slicings is demonstrated, and explicitly discussed for Reissner-Nordstroem and Schwarzschild-anti-deSitter spacetimes. Generalizations of PG slicings which are not spatially flat but which remain regular at the horizons are introduced. These metrics can be obtained from standard spherically symmetric metrics by physical Lorentz boosts. With these generalized PG metrics, problematic contributions to the imaginary part of the action in the Parikh-Wilczek derivation of Hawking radiation due to the obstruction can be avoided.

  20. Kerr metric in the deSitter background

    International Nuclear Information System (INIS)

    Vaidya, P.C.

    1984-01-01

    In addition to the Kerr metric with cosmological constant Λ, several other metrics are presented giving a Kerr-like solution of Einstein's equations in the background of the deSitter universe. A new metric of what may be termed a rotating deSitter space-time, devoid of matter but containing null fluid with twisting null rays, is presented. This metric reduces to the standard deSitter metric when the twist in the rays vanishes. The Kerr metric in this background is the immediate generalization of Schwarzschild's exterior metric with cosmological constant. (author)

  1. Four-Week Strategy-Based Training to Enhance Prospective Memory in Older Adults: Targeting Intention Retention Is More Beneficial than Targeting Intention Formation.

    Science.gov (United States)

    Ihle, Andreas; Albiński, Rafal; Gurynowicz, Kamila; Kliegel, Matthias

    2018-01-01

    So far, training of prospective memory (PM) focused on very short instances (single sessions) and targeted the intention-formation phase only. We aimed to compare the effectiveness of 2 different 4-week strategy-based PM training types, namely imagery training (targeting the encoding of the PM intention in the intention-formation phase) versus rehearsal training (targeting the maintenance of the PM intention in the intention-retention phase) in older adults. We used a 4-week training protocol (8 sessions in total, 2 sessions per week). From the 44 participants, 21 were randomly assigned to the imagery training (vividly imagining a mental picture to memorize the connection between the PM cue words and related actions during intention formation) and 23 to the rehearsal training (rehearsing the PM cue words during intention retention). The criterion PM task was assessed before and after the training. Comparing the effectiveness of both training types, we found a significant time by training type interaction on PM accuracy in terms of PM cue detection, F(1, 42) = 6.07, p = 0.018, η2p = 0.13. Subsequent analyses revealed that the rehearsal training was more effective in enhancing PM accuracy in terms of PM cue detection than the imagery training. Strategy-based PM training in older adults targeting the maintenance of the PM intention in the intention-retention phase may be more effective in enhancing PM accuracy in terms of PM cue detection than the strategy targeting the encoding of the PM intention in the intention-formation phase. This suggests that for successful prospective remembering, older adults may need more support to keep the PM cues active in memory while working on the ongoing task than to initially encode the PM intention. © 2018 S. Karger AG, Basel.

  2. Kerr metric in cosmological background

    Energy Technology Data Exchange (ETDEWEB)

    Vaidya, P C [Gujarat Univ., Ahmedabad (India). Dept. of Mathematics

    1977-06-01

    A metric satisfying Einstein's equation is given which in the vicinity of the source reduces to the well-known Kerr metric and which at large distances reduces to the Robertson-Walker metric of a homogeneous cosmological model. The radius of the event horizon of the Kerr black hole in the cosmological background is found.

  3. A strategy to objectively evaluate the necessity of correcting detected target deviations in image guided radiotherapy

    International Nuclear Information System (INIS)

    Yue, Ning J.; Kim, Sung; Jabbour, Salma; Narra, Venkat; Haffty, Bruce G.

    2007-01-01

    Image guided radiotherapy technologies are being increasingly utilized in the treatment of various cancers. These technologies have enhanced the ability to detect temporal and spatial deviations of the target volume relative to planned radiation beams. Correcting these detected deviations may, in principle, improve the accuracy of dose delivery to the target. However, in many situations, a clinical decision has to be made as to whether it is necessary to correct some of the deviations since the relevant dosimetric impact may or may not be significant, and the corresponding corrective action may be either impractical or time consuming. Ideally this decision should be based on objective and reproducible criteria rather than subjective judgment. In this study, a strategy is proposed for the objective evaluation of the necessity of deviation correction during the treatment verification process. At the treatment stage, without any alteration from the planned beams, the treatment beams should provide the desired dose coverage to the geometric volume identical to the planning target volume (PTV). Given this fact, the planned dose distribution and PTV geometry were used to compute the dose coverage and PTV enclosure of the clinical target volume (CTV) that was detected from imaging during the treatment setup verification. The spatial differences between the detected CTV and the planning CTV are essentially the target deviations. The extent of the PTV enclosure of the detected CTV as well as its dose coverage were used as criteria to evaluate the necessity of correcting any of the target deviations. This strategy, in principle, should be applicable to any type of target deviations, including both target deformable and positional changes and should be independent of how the deviations are detected. The proposed strategy was used on two clinical prostate cancer cases. In both cases, gold markers were implanted inside the prostate for the purpose of treatment setup

  4. Increasing the Structural Coverage of Tuberculosis Drug Targets

    Science.gov (United States)

    Baugh, Loren; Phan, Isabelle; Begley, Darren W.; Clifton, Matthew C.; Armour, Brianna; Dranow, David M.; Taylor, Brandy M.; Muruthi, Marvin M.; Abendroth, Jan; Fairman, James W.; Fox, David; Dieterich, Shellie H.; Staker, Bart L.; Gardberg, Anna S.; Choi, Ryan; Hewitt, Stephen N.; Napuli, Alberto J.; Myers, Janette; Barrett, Lynn K.; Zhang, Yang; Ferrell, Micah; Mundt, Elizabeth; Thompkins, Katie; Tran, Ngoc; Lyons-Abbott, Sally; Abramov, Ariel; Sekar, Aarthi; Serbzhinskiy, Dmitri; Lorimer, Don; Buchko, Garry W.; Stacy, Robin; Stewart, Lance J.; Edwards, Thomas E.; Van Voorhis, Wesley C.; Myler, Peter J.

    2015-01-01

    High-resolution three-dimensional structures of essential Mycobacterium tuberculosis (Mtb) proteins provide templates for TB drug design, but are available for only a small fraction of the Mtb proteome. Here we evaluate an intra-genus “homolog-rescue” strategy to increase the structural information available for TB drug discovery by using mycobacterial homologs with conserved active sites. Of 179 potential TB drug targets selected for x-ray structure determination, only 16 yielded a crystal structure. By adding 1675 homologs from nine other mycobacterial species to the pipeline, structures representing an additional 52 otherwise intractable targets were solved. To determine whether these homolog structures would be useful surrogates in TB drug design, we compared the active sites of 106 pairs of Mtb and non-TB mycobacterial (NTM) enzyme homologs with experimentally determined structures, using three metrics of active site similarity, including superposition of continuous pharmacophoric property distributions. Pair-wise structural comparisons revealed that 19/22 pairs with >55% overall sequence identity had active site Cα RMSD 85% side chain identity, and ≥80% PSAPF (similarity based on pharmacophoric properties) indicating highly conserved active site shape and chemistry. Applying these results to the 52 NTM structures described above, 41 shared >55% sequence identity with the Mtb target, thus increasing the effective structural coverage of the 179 Mtb targets over three-fold (from 9% to 32%). The utility of these structures in TB drug design can be tested by designing inhibitors using the homolog structure and assaying the cognate Mtb enzyme; a promising test case, Mtb cytidylate kinase, is described. The homolog-rescue strategy evaluated here for TB is also generalizable to drug targets for other diseases. PMID:25613812

  5. Two classes of metric spaces

    Directory of Open Access Journals (Sweden)

    Isabel Garrido

    2016-04-01

    The class of metric spaces (X, d) known as small-determined spaces, introduced by Garrido and Jaramillo, is properly defined by means of some type of real-valued Lipschitz functions on X. On the other hand, B-simple metric spaces, introduced by Hejcman, are defined in terms of some kind of bornologies of bounded subsets of X. In this note we present a common framework where both classes of metric spaces can be studied, which allows us to see not only the relationships between them but also to obtain new internal characterizations of these metric properties.

  6. Evaluating system reliability and targeted hardening strategies of power distribution systems subjected to hurricanes

    International Nuclear Information System (INIS)

    Salman, Abdullahi M.; Li, Yue; Stewart, Mark G.

    2015-01-01

    Over the years, power distribution systems have been vulnerable to extensive damage from hurricanes, which can cause power outages resulting in millions of dollars of economic losses and restoration costs. Most of the outage is a result of failure of distribution support structures. Various methods of strengthening distribution systems have been proposed and studied. Some of these methods, such as undergrounding of the system, have been shown to be unjustified from an economic point of view. A potential cost-effective strategy is targeted hardening of the system. This, however, requires a method of determining critical parts of a system that, when strengthened, will have greater impact on reliability. This paper presents a framework for studying the effectiveness of targeted hardening strategies on power distribution systems subjected to hurricanes. The framework includes a methodology for evaluating system reliability that relates failure of poles and power delivery, determination of critical parts of a system, hurricane hazard analysis, and consideration of decay of distribution poles. The framework also incorporates cost analysis that considers economic losses due to power outage. A notional power distribution system is used to demonstrate the framework by evaluating and comparing the effectiveness of three hardening measures. Highlights: • Risk assessment of power distribution systems subjected to hurricanes is carried out. • A framework for studying the effectiveness of targeted hardening strategies is presented. • A system reliability method is proposed. • Targeted hardening is cost effective for existing systems. • Economic losses due to power outage should be considered in cost analysis.

  7. A Metric for Heterotic Moduli

    Science.gov (United States)

    Candelas, Philip; de la Ossa, Xenia; McOrist, Jock

    2017-12-01

    Heterotic vacua of string theory are realised, at large radius, by a compact threefold with vanishing first Chern class together with a choice of stable holomorphic vector bundle. These form a wide class of potentially realistic four-dimensional vacua of string theory. Despite all their phenomenological promise, there is little understanding of the metric on the moduli space of these vacua. What is sought is the analogue of special geometry for these vacua. The metric on the moduli space is important in phenomenology as it normalises D-terms and Yukawa couplings. It is also of interest in mathematics, since it generalises the metric, first found by Kobayashi, on the space of gauge field connections, to a more general context. Here we construct this metric, correct to first order in α′, in two ways: first by postulating a metric that is invariant under background gauge transformations of the gauge field, and also by dimensionally reducing heterotic supergravity. These methods agree and the resulting metric is Kähler, as is required by supersymmetry. Checking the metric is Kähler is intricate and the anomaly cancellation equation for the H field plays an essential role. The Kähler potential nevertheless takes a remarkably simple form: it is the Kähler potential of special geometry with the Kähler form replaced by the α′-corrected hermitian form.

  8. Targeting reactive nitrogen species: a promising therapeutic strategy for cerebral ischemia-reperfusion injury.

    Science.gov (United States)

    Chen, Xing-miao; Chen, Han-sen; Xu, Ming-jing; Shen, Jian-gang

    2013-01-01

    Ischemic stroke accounts for nearly 80% of stroke cases. Recanalization with thrombolysis is currently a crucial therapeutic strategy for re-establishing blood supply, but thrombolytic therapy is often accompanied by cerebral ischemia-reperfusion injury, which is mediated by free radicals. As an important component of free radicals, reactive nitrogen species (RNS), including nitric oxide (NO) and peroxynitrite (ONOO(-)), play important roles in the process of cerebral ischemia-reperfusion injury. Ischemia-reperfusion results in the production of NO and ONOO(-) in the ischemic brain, which triggers numerous molecular cascades, leads to disruption of the blood-brain barrier, and exacerbates brain damage. Few therapeutic strategies are available for saving ischemic brains and preventing the subsequent brain damage. Recent evidence suggests that RNS could be a therapeutic target for the treatment of cerebral ischemia-reperfusion injury. Herein, we review the recent progress regarding the roles of RNS in the process of cerebral ischemia-reperfusion injury and discuss the potential of drug development that targets NO and ONOO(-) to treat ischemic stroke. We conclude that modulation of RNS levels could be an important therapeutic strategy for preventing cerebral ischemia-reperfusion injury.

  9. On characterizations of quasi-metric completeness

    Energy Technology Data Exchange (ETDEWEB)

    Dag, H.; Romaguera, S.; Tirado, P.

    2017-07-01

    Hu proved in [4] that a metric space (X, d) is complete if and only if for any closed subspace C of (X, d), every Banach contraction on C has a fixed point. Since then, several authors have investigated the problem of characterizing metric completeness by means of fixed point theorems. Recently this problem has been studied in the more general context of quasi-metric spaces for different notions of completeness. Here we present a characterization of a kind of completeness for quasi-metric spaces by means of a quasi-metric version of Hu's theorem. (Author)

  10. GEO Optical Data Association with Concurrent Metric and Photometric Information

    Science.gov (United States)

    Dao, P.; Monet, D.

    Data association in a congested area of the GEO belt with occasional visits by non-resident objects can be treated as a Multi-Target-Tracking (MTT) problem. For a stationary sensor surveilling the GEO belt, geosynchronous and near-GEO objects are not completely motionless in the earth-fixed frame and can be observed as moving targets. In some clusters, metric or positional information is insufficiently accurate or up-to-date to associate the measurements. In the presence of measurements with uncertain origin, star tracks (residuals) and other sensor artifacts, heuristic techniques based on hard-decision assignment do not perform adequately. In the MTT community, Bar-Shalom [2009 Bar-Shalom] was the first to introduce the use of measurements to update the state of the target of interest in the tracking filter, e.g. the Kalman filter. Following Bar-Shalom's idea, we use the Probabilistic Data Association Filter (PDAF), but to make use of all information obtainable in the measurement of three-axis-stabilized GEO satellites, we combine photometric with metric measurements to update the filter. Therefore, our technique, Concurrent Spatio-Temporal and Brightness (COSTB), has the stand-alone ability of associating a track with its identity (for resident objects). That is possible because the light curve of a stabilized GEO satellite changes minimally from night to night. We exercised COSTB on camera cadence data to associate measurements, correct mistags and detect non-residents in a simulated near-real-time cadence. Data on GEO clusters were used.

  11. Targeting vacuolar H+-ATPases as a new strategy against cancer.

    Science.gov (United States)

    Fais, Stefano; De Milito, Angelo; You, Haiyan; Qin, Wenxin

    2007-11-15

    Growing evidence suggests a key role of tumor acidic microenvironment in cancer development, progression, and metastasis. As a consequence, the need for compounds that specifically target the mechanism(s) responsible for the low pH of tumors is increasing. Among the key regulators of the tumor acidic microenvironment, vacuolar H(+)-ATPases (V-ATPases) play an important role. These proteins cover a number of functions in a variety of normal as well as tumor cells, in which they pump ions across the membranes. We discuss here some recent results showing that a molecular inhibition of V-ATPases by small interfering RNA in vivo as well as a pharmacologic inhibition through proton pump inhibitors led to tumor cytotoxicity and marked inhibition of human tumor growth in xenograft models. These results propose V-ATPases as a key target for new strategies in cancer treatment.

  12. A strategy for evaluating pathway analysis methods.

    Science.gov (United States)

    Yu, Chenggang; Woo, Hyung Jun; Yu, Xueping; Oyama, Tatsuya; Wallqvist, Anders; Reifman, Jaques

    2017-10-13

    Researchers have previously developed a multitude of methods designed to identify biological pathways associated with specific clinical or experimental conditions of interest, with the aim of facilitating biological interpretation of high-throughput data. Before practically applying such pathway analysis (PA) methods, we must first evaluate their performance and reliability, using datasets where the pathways perturbed by the conditions of interest have been well characterized in advance. However, such 'ground truths' (or gold standards) are often unavailable. Furthermore, previous evaluation strategies that have focused on defining 'true answers' are unable to systematically and objectively assess PA methods under a wide range of conditions. In this work, we propose a novel strategy for evaluating PA methods independently of any gold standard, either established or assumed. The strategy involves the use of two mutually complementary metrics, recall and discrimination. Recall measures the consistency between the perturbed pathways identified by applying a particular analysis method to an original large dataset and those identified by the same method applied to a sub-dataset of the original dataset. In contrast, discrimination measures specificity: the degree to which the perturbed pathways identified by a particular method applied to a dataset from one experiment differ from those identified by the same method applied to a dataset from a different experiment. We used these metrics and 24 datasets to evaluate six widely used PA methods. The results highlighted the common challenge in reliably identifying significant pathways from small datasets. Importantly, we confirmed the effectiveness of our proposed dual-metric strategy by showing that previous comparative studies corroborate the performance evaluations of the six methods obtained by our strategy.
Unlike any previously proposed strategy for evaluating the performance of PA methods, our dual-metric strategy does not rely on any ground truth
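    The recall and discrimination metrics described above operate on the sets of pathways a method reports as perturbed. A minimal set-based sketch of that idea (the overlap formulas below are illustrative simplifications, not necessarily the paper's exact definitions):

```python
def recall(full_pathways, sub_pathways):
    """Recall: fraction of pathways found significant on the full
    dataset that are re-identified on a sub-sample of it.
    High recall = the method is consistent under sub-sampling."""
    full, sub = set(full_pathways), set(sub_pathways)
    return len(full & sub) / len(full) if full else 0.0

def discrimination(pathways_a, pathways_b):
    """Discrimination: how different the pathway sets reported for
    two unrelated experiments are (1.0 = disjoint, 0.0 = identical).
    High discrimination = the method is specific to the condition."""
    a, b = set(pathways_a), set(pathways_b)
    union = a | b
    return 1.0 - len(a & b) / len(union) if union else 0.0
```

    A method scoring high on recall alone could simply report the same pathways for every dataset, which is why the two metrics are used together.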

  13. Engineering performance metrics

    Science.gov (United States)

    Delozier, R.; Snyder, N.

    1993-03-01

    Implementation of a Total Quality Management (TQM) approach to engineering work required the development of a system of metrics which would serve as a meaningful management tool for evaluating effectiveness in accomplishing project objectives and in achieving improved customer satisfaction. A team effort was chartered with the goal of developing a system of engineering performance metrics which would measure customer satisfaction, quality, cost effectiveness, and timeliness. The approach to developing this system involved the normal systems design phases, including conceptual design, detailed design, implementation, and integration. The lessons learned from this effort are explored in this paper. They may provide a starting point for other large engineering organizations seeking to institute a performance measurement system. To facilitate this effort, a team consisting of customers and Engineering staff members was chartered to assist in the development of the metrics system, ensuring that the needs and views of the customers were considered in the development of performance measurements. The development of a system of metrics is no different than the development of any other type of system. It includes the steps of defining performance measurement requirements, measurement process conceptual design, performance measurement and reporting system detailed design, and system implementation and integration.

  14. Nanomedicine strategies for sustained, controlled, and targeted treatment of cancer stem cells of the digestive system.

    Science.gov (United States)

    Xie, Fang-Yuan; Xu, Wei-Heng; Yin, Chuan; Zhang, Guo-Qing; Zhong, Yan-Qiang; Gao, Jie

    2016-10-15

    Cancer stem cells (CSCs) constitute a small proportion of the cancer cells that have self-renewal capacity and tumor-initiating ability. They have been identified in a variety of tumors, including tumors of the digestive system. CSCs exhibit some unique characteristics, which are responsible for cancer metastasis and recurrence. Consequently, the development of effective therapeutic strategies against CSCs plays a key role in increasing the efficacy of cancer therapy. Several potential approaches to target CSCs of the digestive system have been explored, including targeting CSC surface markers and signaling pathways, inducing the differentiation of CSCs, altering the tumor microenvironment or niche, and inhibiting ATP-driven efflux transporters. However, conventional therapies may not successfully eradicate CSCs owing to various problems, including poor solubility, stability, rapid clearance, poor cellular uptake, and unacceptable cytotoxicity. Nanomedicine strategies, which include drug, gene, targeted, and combinational delivery, could solve these problems and significantly improve the therapeutic index. This review briefly summarizes the ongoing development of strategies and nanomedicine-based therapies against CSCs of the digestive system.

  15. Inhibition of mesothelin as a novel strategy for targeting cancer cells.

    Directory of Open Access Journals (Sweden)

    Kun Wang

    Mesothelin, a differentiation antigen present in a series of malignancies such as mesothelioma and ovarian, lung and pancreatic cancer, has been studied as a marker for diagnosis and a target for immunotherapy. We, however, were interested in evaluating the effects of direct targeting of Mesothelin on the viability of cancer cells as the first step towards developing a novel therapeutic strategy. We report here that gene-specific silencing of Mesothelin by distinct methods (siRNA and microRNA) decreased the viability of cancer cells from different origins such as mesothelioma (H2373), ovarian cancer (Skov3 and Ovcar-5) and pancreatic cancer (Miapaca2 and Panc-1). Additionally, the invasiveness of cancer cells was also significantly decreased upon such treatment. We then investigated pro-oncogenic signaling characteristics of cells upon Mesothelin silencing, which revealed a significant decrease in phospho-ERK1 and PI3K/AKT activity. The molecular mechanism of reduced invasiveness was connected to the reduced expression of β-Catenin, an important marker of EMT (epithelial-mesenchymal transition). Ero1, a protein involved in clearing unfolded proteins and a member of the ER-stress (endoplasmic reticulum stress) pathway, was also markedly reduced. Furthermore, Mesothelin silencing caused a significant increase in the fraction of cancer cells in S-phase. In a next step, treatment of ovarian cancer cells (OVca429) with a lentivirus expressing anti-Mesothelin microRNA resulted in significant loss of viability, invasiveness, and morphological alterations. Therefore, we propose the inhibition of Mesothelin as a potential novel strategy for targeting human malignancies.

  16. Brand metrics that matter

    NARCIS (Netherlands)

    Muntinga, D.; Bernritter, S.

    2017-01-01

    The brand is increasingly central to the organization. It is therefore essential to measure the brand's health, performance and development. Selecting the right brand metrics is, however, a challenge. An enormous number of metrics competes for brand managers' attention. But which

  17. Privacy Metrics and Boundaries

    NARCIS (Netherlands)

    L-F. Pau (Louis-François)

    2005-01-01

    This paper aims at defining a set of privacy metrics (quantitative and qualitative) for the relation between a privacy protector and an information gatherer. The aims with such metrics are: to allow assessment and comparison of different user scenarios and their differences; for

  18. An Opportunistic Routing Mechanism Combined with Long-Term and Short-Term Metrics for WMN

    Directory of Open Access Journals (Sweden)

    Weifeng Sun

    2014-01-01

    Full Text Available WMN (wireless mesh network) is a useful wireless multihop network with tremendous research value. The routing strategy decides the performance of the network and the quality of transmission. A good routing algorithm will use the whole bandwidth of the network and assure the quality of service of traffic. Since the routing metric ETX (expected transmission count) does not assure good quality of wireless links, to improve the routing performance, an opportunistic routing mechanism combined with long-term and short-term metrics for WMN based on OLSR (optimized link state routing) and ETX is proposed in this paper. This mechanism always chooses the highest-throughput links to improve the performance of routing over WMN and then reduces the energy consumption of mesh routers. The simulations and analyses show that the opportunistic routing mechanism is better than the mechanism with the metric of ETX.

  19. An opportunistic routing mechanism combined with long-term and short-term metrics for WMN.

    Science.gov (United States)

    Sun, Weifeng; Wang, Haotian; Piao, Xianglan; Qiu, Tie

    2014-01-01

    WMN (wireless mesh network) is a useful wireless multihop network with tremendous research value. The routing strategy decides the performance of network and the quality of transmission. A good routing algorithm will use the whole bandwidth of network and assure the quality of service of traffic. Since the routing metric ETX (expected transmission count) does not assure good quality of wireless links, to improve the routing performance, an opportunistic routing mechanism combined with long-term and short-term metrics for WMN based on OLSR (optimized link state routing) and ETX is proposed in this paper. This mechanism always chooses the highest throughput links to improve the performance of routing over WMN and then reduces the energy consumption of mesh routers. The simulations and analyses show that the opportunistic routing mechanism is better than the mechanism with the metric of ETX.
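The ETX metric that the two records above build on is simple to state in code. The sketch below is a minimal, illustrative implementation (function names are my own, not from the paper): a link's ETX is the expected number of transmissions needed for a successful delivery and acknowledgment, and a path's cost is the sum over its hops.

```python
# Hedged sketch of ETX (expected transmission count), the link metric used
# by OLSR-style mesh routing. ETX = 1 / (df * dr), where df and dr are the
# measured forward and reverse packet delivery ratios. Path ETX is additive
# over hops; lower is better.

def link_etx(df: float, dr: float) -> float:
    """ETX of one link from forward/reverse delivery ratios in (0, 1]."""
    if df <= 0 or dr <= 0:
        return float("inf")  # link considered unusable
    return 1.0 / (df * dr)

def path_etx(links):
    """Additive ETX of a path given (df, dr) pairs per hop."""
    return sum(link_etx(df, dr) for df, dr in links)

# A perfect two-hop path costs 2 expected transmissions; lossy links cost more.
assert path_etx([(1.0, 1.0), (1.0, 1.0)]) == 2.0
assert link_etx(0.5, 1.0) == 2.0
```

A route selector would then pick the candidate path minimizing `path_etx`, which is exactly the behavior the proposed mechanism augments with short-term throughput information.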

  20. Common metrics. Comparing the warming effect of climate forcers in climate policy; Common metrics. Laempenemiseen vaikuttavien paeaestoejen yhteismitallistaminen ilmastopolitiikassa

    Energy Technology Data Exchange (ETDEWEB)

    Lindroos, T. J.; Ekholm, T.; Savolainen, I.

    2012-11-15

    Climate policy needs a relatively simple method to compare the warming effect of different greenhouse gases (GHGs); otherwise it would be necessary to negotiate a separate reduction target for each gas. At the moment, the Global Warming Potential (GWP) concept is used to compare different GHGs. Numerical values of GWP factors have been updated alongside scientific understanding, and the majority seems content with the GWP. From 2005 onwards there have been many proposals for alternative metrics. The best known is the Global Temperature change Potential (GTP) concept, which measures the change in temperature, as global climate policies do. The choice between metrics is a multicriteria decision which should include at least coherence with climate policy and cost efficiency. The GWP concept may be a little more difficult to understand than the GTP, but it is more cost efficient. Alongside the new metrics, scientists and politicians have started to discuss new emissions which have an effect on warming. These Short-Lived Climate Forcers (SLCFs) have either a warming or a cooling effect. Their effect can be presented with GWP and GTP, but the uncertainties in the emission factors are large. In total, SLCFs reduced the overall emissions of the EU by approximately 1% in the year 2000. NO{sub x} and SO{sub x} (cooling) and black carbon (warming) emissions were the biggest factors. The EU is planning to reduce SLCF emissions to achieve health and environmental benefits, but at the same time this reduces the effect of the EU's climate policies by approximately 10%. Uncertainties in the estimates are large. (orig.)
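The GWP comparison described in this record reduces to scaling each gas by a fixed factor to obtain CO2-equivalents. A minimal sketch, using the commonly cited IPCC AR4 100-year GWP values (CH4 = 25, N2O = 298) purely as example inputs:

```python
# Illustrative sketch of how GWP factors make different greenhouse gases
# commensurable: CO2e = mass * GWP_100 per gas, summed over the inventory.
# The factor values below are example inputs (AR4 100-year GWPs), not a
# recommendation of any particular metric vintage.

GWP100 = {"CO2": 1.0, "CH4": 25.0, "N2O": 298.0}

def co2_equivalent(emissions_t):
    """Total CO2-equivalent (tonnes) from a {gas: tonnes} inventory."""
    return sum(mass * GWP100[gas] for gas, mass in emissions_t.items())

total = co2_equivalent({"CO2": 1000.0, "CH4": 10.0, "N2O": 1.0})
assert total == 1000.0 + 250.0 + 298.0  # 1548 t CO2e
```

A GTP-based comparison would keep the same structure but swap in GTP factors for a chosen time horizon, which is why the two metrics can coexist in the same policy machinery.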

  1. Metric Structure of the Space of Two-Qubit Gates, Perfect Entanglers and Quantum Control

    Directory of Open Access Journals (Sweden)

    Paul Watts

    2013-05-01

    Full Text Available We derive expressions for the invariant length element and measure for the simple compact Lie group SU(4) in a coordinate system particularly suitable for treating entanglement in quantum information processing. Using this metric, we compute the invariant volume of the space of two-qubit perfect entanglers. We find that this volume corresponds to more than 84% of the total invariant volume of the space of two-qubit gates. This same metric is also used to determine the effective target sizes that selected gates will present in any quantum-control procedure designed to implement them.

  2. A compound chimeric antigen receptor strategy for targeting multiple myeloma.

    Science.gov (United States)

    Chen, K H; Wada, M; Pinz, K G; Liu, H; Shuai, X; Chen, X; Yan, L E; Petrov, J C; Salman, H; Senzel, L; Leung, E L H; Jiang, X; Ma, Y

    2018-02-01

    Current clinical outcomes using chimeric antigen receptors (CARs) against multiple myeloma show promise in the eradication of bulk disease. However, relapse is commonly observed after treatment with these anti-BCMA (CD269) CARs, due to the reemergence of either antigen-positive or -negative cells. Hence, improvements in CAR design that target antigen loss and increase effector cell persistency represent a critical need. Here, we report on the anti-tumor activity of a CAR T-cell possessing two complete and independent CAR receptors against the multiple myeloma antigens BCMA and CS1. We determined that the resulting compound CAR (cCAR) T-cell possesses consistent, potent and directed cytotoxicity against each target antigen population. Using multiple mouse models of myeloma and mixed cell populations, we further show superior in vivo survival through directed cytotoxicity against multiple populations, compared to a single-expressing CAR T-cell. These findings indicate that compound targeting of BCMA and CS1 on myeloma cells can potentially be an effective strategy for augmenting the response against myeloma bulk disease and for initiation of broader-coverage CAR therapy.

  3. Cyber threat metrics.

    Energy Technology Data Exchange (ETDEWEB)

    Frye, Jason Neal; Veitch, Cynthia K.; Mateski, Mark Elliot; Michalski, John T.; Harris, James Mark; Trevino, Cassandra M.; Maruoka, Scott

    2012-03-01

    Threats are generally much easier to list than to describe, and much easier to describe than to measure. As a result, many organizations list threats. Fewer describe them in useful terms, and still fewer measure them in meaningful ways. This is particularly true in the dynamic and nebulous domain of cyber threats - a domain that tends to resist easy measurement and, in some cases, appears to defy any measurement. We believe the problem is tractable. In this report we describe threat metrics and models for characterizing threats consistently and unambiguously. The purpose of this report is to support the Operational Threat Assessment (OTA) phase of risk and vulnerability assessment. To this end, we focus on the task of characterizing cyber threats using consistent threat metrics and models. In particular, we address threat metrics and models for describing malicious cyber threats to US FCEB agencies and systems.

  4. Visible Contrast Energy Metrics for Detection and Discrimination

    Science.gov (United States)

    Ahumada, Albert; Watson, Andrew

    2013-01-01

    Contrast energy was proposed by Watson, Robson, & Barlow as a useful metric for representing luminance contrast target stimuli because it represents the detectability of the stimulus in photon noise for an ideal observer. Like the eye, the ear is a complex transducer system, but relatively simple sound level meters are used to characterize sounds. These meters provide a range of frequency sensitivity functions and integration times depending on the intended use. We propose here the use of a range of contrast energy measures with different spatial-frequency contrast sensitivity weightings, eccentricity sensitivity weightings, and temporal integration times. When detection thresholds are plotted using such measures, the results show what the eye sees best when these variables are taken into account in a standard way. The suggested weighting functions revise the Standard Spatial Observer for luminance contrast detection and extend it into the near periphery. Under the assumption that detection is limited only by internal noise, discrimination performance can be predicted by metrics based on the visible energy of the difference images.
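As a rough illustration of the kind of metric this record discusses, contrast energy integrates squared contrast over space (and, for time-varying stimuli, over duration), optionally with a sensitivity weighting per pixel. The sketch below is a deliberately simplified assumption (flat pixel grid, user-supplied weights), not the authors' implementation:

```python
# Hedged sketch of a contrast-energy measure in the spirit of Watson,
# Robson & Barlow: squared contrast summed over pixels, scaled by pixel
# area (deg^2) and stimulus duration (s). The optional per-pixel weights
# stand in for a contrast-sensitivity or eccentricity weighting; names
# here are illustrative, not from the paper.

def contrast_energy(contrast, pixel_area_deg2, duration_s=1.0, weights=None):
    """Weighted sum of squared contrast values times pixel area and time."""
    if weights is None:
        weights = [1.0] * len(contrast)
    return duration_s * pixel_area_deg2 * sum(
        w * c * c for c, w in zip(contrast, weights)
    )

# A 4-pixel patch of uniform 10% contrast shown for 0.5 s:
e = contrast_energy([0.1, 0.1, 0.1, 0.1], pixel_area_deg2=0.01, duration_s=0.5)
assert abs(e - 0.0002) < 1e-12
```

Discrimination metrics of the sort mentioned at the end of the abstract would apply the same computation to the pixel-wise difference of two images.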

  5. Fixed point theory in metric type spaces

    CERN Document Server

    Agarwal, Ravi P; O’Regan, Donal; Roldán-López-de-Hierro, Antonio Francisco

    2015-01-01

    Written by a team of leading experts in the field, this volume presents a self-contained account of the theory, techniques and results in metric type spaces (in particular in G-metric spaces); that is, the text approaches this important area of fixed point analysis beginning from the basic ideas of metric space topology. The text is structured so that it leads the reader from preliminaries and historical notes on metric spaces (in particular G-metric spaces) and on mappings, to Banach type contraction theorems in metric type spaces, fixed point theory in partially ordered G-metric spaces, fixed point theory for expansive mappings in metric type spaces, generalizations, present results and techniques in a very general abstract setting and framework. Fixed point theory is one of the major research areas in nonlinear analysis. This is partly due to the fact that in many real world problems fixed point theory is the basic mathematical tool used to establish the existence of solutions to problems which arise natur...

  6. Molecular strategies targeting the host component of cancer to enhance tumor response to radiation therapy

    International Nuclear Information System (INIS)

    Kim, Dong Wook; Huamani, Jessica; Fu, Allie; Hallahan, Dennis E.

    2006-01-01

    The tumor microenvironment, in particular, the tumor vasculature, as an important target for the cytotoxic effects of radiation therapy is an established paradigm for cancer therapy. We review the evidence that the phosphoinositide 3-kinase (PI3K)/Akt pathway is activated in endothelial cells exposed to ionizing radiation (IR) and is a molecular target for the development of novel radiation sensitizing agents. On the basis of this premise, several promising preclinical studies that targeted the inhibition of the PI3K/Akt activation as a potential method of sensitizing the tumor vasculature to the cytotoxic effects of IR have been conducted. An innovative strategy to guide cytotoxic therapy in tumors treated with radiation and PI3K/Akt inhibitors is presented. The evidence supports a need for further investigation of combined-modality therapy that involves radiation therapy and inhibitors of PI3K/Akt pathway as a promising strategy for improving the treatment of patients with cancer

  7. Use of two population metrics clarifies biodiversity dynamics in large-scale monitoring: the case of trees in Japanese old-growth forests: the need for multiple population metrics in large-scale monitoring.

    Science.gov (United States)

    Ogawa, Mifuyu; Yamaura, Yuichi; Abe, Shin; Hoshino, Daisuke; Hoshizaki, Kazuhiko; Iida, Shigeo; Katsuki, Toshio; Masaki, Takashi; Niiyama, Kaoru; Saito, Satoshi; Sakai, Takeshi; Sugita, Hisashi; Tanouchi, Hiroyuki; Amano, Tatsuya; Taki, Hisatomo; Okabe, Kimiko

    2011-07-01

    Many indicators/indices provide information on whether the 2010 biodiversity target of reducing declines in biodiversity has been achieved. The strengths and limitations of the various measures used to assess progress toward this target are now being discussed. Biodiversity dynamics are often evaluated by a single biological population metric, such as the abundance of each species. Here we examined tree population dynamics of 52 families (192 species) at 11 research sites (three vegetation zones) in Japanese old-growth forests using two population metrics: number of stems and basal area. We calculated indices that track the rate of change across all tree species by taking the geometric mean of changes in population metrics between the 1990s and the 2000s at the national level and at the levels of the vegetation zone and family. We specifically focused on whether indices based on these two metrics behaved similarly. The indices showed that (1) the number of stems declined, whereas basal area did not change at the national level, and (2) the degree of change in the indices varied by vegetation zone and family. These results suggest that Japanese old-growth forests have not degraded and may even be developing in some vegetation zones, and indicate that the use of a single population metric (or indicator/index) may be insufficient to precisely understand the state of biodiversity. It is therefore important to incorporate more metrics into monitoring schemes to overcome the risk of misunderstanding or misrepresenting biodiversity dynamics.
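The aggregation the record describes, a geometric mean of per-species change ratios between two survey periods, can be sketched in a few lines. Names are illustrative, and skipping species with zero counts is a simplifying assumption of this sketch, not the authors' procedure:

```python
# Hedged sketch of a geometric-mean change index over species: for each
# species, take the ratio of a population metric (stem count or basal area)
# between two periods, then take the geometric mean of those ratios.
# An index of 1.0 means no net change across species.
import math

def geometric_mean_index(before, after):
    """Geometric mean of after/before ratios across species."""
    ratios = [a / b for b, a in zip(before, after) if b > 0 and a > 0]
    return math.exp(sum(math.log(r) for r in ratios) / len(ratios))

# One species doubles, another halves: the index is exactly 1 (no net change).
assert abs(geometric_mean_index([100, 100], [200, 50]) - 1.0) < 1e-12
```

Running this separately on stem counts and on basal areas is what lets the two indices diverge, which is precisely the signal the study uses to argue for multiple metrics.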

  8. SU-G-BRB-16: Vulnerabilities in the Gamma Metric

    International Nuclear Information System (INIS)

    Neal, B; Siebers, J

    2016-01-01

    Purpose: To explore vulnerabilities in the gamma index metric that undermine its wide use as a radiation therapy quality assurance tool. Methods: 2D test field pairs (images) are created specifically to achieve high gamma passing rates, but to also include gross errors by exploiting the distance-to-agreement and percent-passing components of the metric. The first set has no requirement of clinical practicality, but is intended to expose vulnerabilities. The second set exposes clinically realistic vulnerabilities. To circumvent limitations inherent to user-specific tuning of prediction algorithms to match measurements, digital test cases are manually constructed, thereby mimicking high-quality image prediction. Results: With a 3 mm distance-to-agreement metric, changing field size by ±6 mm results in a gamma passing rate over 99%. For a uniform field, a lattice of passing points spaced 5 mm apart results in a passing rate of 100%. Exploiting the percent-passing component, a 10×10 cm² field can have a 95% passing rate when an 8 cm² = 2.8×2.8 cm² highly out-of-tolerance (e.g. zero dose) square is missing from the comparison image. For clinically realistic vulnerabilities, an arc plan for which a 2D image is created can have a >95% passing rate solely due to agreement in the lateral spillage, with the failing 5% in the critical target region. A field with an integrated boost (e.g. whole brain plus small metastases) could neglect the metastases entirely, yet still pass with a 95% threshold. All the failure modes described would be visually apparent on a gamma-map image. Conclusion: The %gamma<1 metric has significant vulnerabilities. High passing rates can obscure critical faults in hypothetical and delivered radiation doses. Great caution should be used with gamma as a QA metric; users should inspect the gamma-map. Visual analysis of gamma-maps may be impractical for cine acquisition.

  9. SU-G-BRB-16: Vulnerabilities in the Gamma Metric

    Energy Technology Data Exchange (ETDEWEB)

    Neal, B; Siebers, J [University of Virginia Health System, Charlottesville, VA (United States)

    2016-06-15

    Purpose: To explore vulnerabilities in the gamma index metric that undermine its wide use as a radiation therapy quality assurance tool. Methods: 2D test field pairs (images) are created specifically to achieve high gamma passing rates, but to also include gross errors by exploiting the distance-to-agreement and percent-passing components of the metric. The first set has no requirement of clinical practicality, but is intended to expose vulnerabilities. The second set exposes clinically realistic vulnerabilities. To circumvent limitations inherent to user-specific tuning of prediction algorithms to match measurements, digital test cases are manually constructed, thereby mimicking high-quality image prediction. Results: With a 3 mm distance-to-agreement metric, changing field size by ±6 mm results in a gamma passing rate over 99%. For a uniform field, a lattice of passing points spaced 5 mm apart results in a passing rate of 100%. Exploiting the percent-passing component, a 10×10 cm{sup 2} field can have a 95% passing rate when an 8 cm{sup 2}=2.8×2.8 cm{sup 2} highly out-of-tolerance (e.g. zero dose) square is missing from the comparison image. For clinically realistic vulnerabilities, an arc plan for which a 2D image is created can have a >95% passing rate solely due to agreement in the lateral spillage, with the failing 5% in the critical target region. A field with an integrated boost (e.g. whole brain plus small metastases) could neglect the metastases entirely, yet still pass with a 95% threshold. All the failure modes described would be visually apparent on a gamma-map image. Conclusion: The %gamma<1 metric has significant vulnerabilities. High passing rates can obscure critical faults in hypothetical and delivered radiation doses. Great caution should be used with gamma as a QA metric; users should inspect the gamma-map. Visual analysis of gamma-maps may be impractical for cine acquisition.
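For readers unfamiliar with the gamma index these two records critique, a one-dimensional sketch shows how the distance-to-agreement and dose-difference components combine into a single pass/fail score. Real QA tools operate on 2-D or 3-D dose grids, so this is illustrative only, with hypothetical parameter names:

```python
# Hedged 1-D sketch of the gamma index: for each evaluated point, gamma is
# the minimum over reference points of
#   sqrt((dx / DTA)^2 + (dDose / dose_tol)^2),
# and a point "passes" when gamma <= 1. A high percent-passing rate is the
# quantity the records above show can mask gross, localized errors.
import math

def gamma_1d(ref_dose, ref_x, eval_dose, eval_x, dta_mm=3.0, dose_tol=0.03):
    """Per-point gamma values for an evaluated 1-D dose profile."""
    gammas = []
    for xe, de in zip(eval_x, eval_dose):
        g = min(
            math.sqrt(((xe - xr) / dta_mm) ** 2 + ((de - dr) / dose_tol) ** 2)
            for xr, dr in zip(ref_x, ref_dose)
        )
        gammas.append(g)
    return gammas

def passing_rate(gammas):
    """Fraction of points with gamma <= 1."""
    return sum(g <= 1.0 for g in gammas) / len(gammas)

# Identical profiles pass everywhere.
g = gamma_1d([1.0, 0.5, 0.0], [0, 1, 2], [1.0, 0.5, 0.0], [0, 1, 2])
assert passing_rate(g) == 1.0
```

The minimum over reference points is what makes the metric forgiving of small spatial shifts, and, as the records demonstrate, of some large errors too.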

  10. An effective tumor-targeting strategy utilizing hypoxia-sensitive siRNA delivery system for improved anti-tumor outcome.

    Science.gov (United States)

    Kang, Lin; Fan, Bo; Sun, Ping; Huang, Wei; Jin, Mingji; Wang, Qiming; Gao, Zhonggao

    2016-10-15

    Hypoxia is a feature of most solid tumors, and targeting hypoxia is considered one of the best validated yet not extensively exploited strategies in cancer therapy. Here, we report a novel tumor-targeting strategy using a hypoxia-sensitive siRNA delivery system. In the study, 2-nitroimidazole (NI), a hydrophobic component that can be converted to hydrophilic 2-aminoimidazole (AI) through bioreduction under hypoxic conditions, was conjugated to alkylated polyethyleneimine (bPEI1.8k-C6) to form amphiphilic bPEI1.8k-C6-NI polycations. bPEI1.8k-C6-NI could self-assemble into micelle-like aggregates in aqueous solution, which contributed to the improved stability of the bPEI1.8k-C6-NI/siRNA polyplexes and resulted in increased cellular uptake. After transport into hypoxic tumor cells, the selective nitro-to-amino reduction causes a structural change and elicits a relatively loose structure that facilitates siRNA dissociation in the cytoplasm, ultimately enhancing gene silencing efficiency. The conflict between the extracellular stability and the intracellular siRNA release ability of the polyplexes was thus solved by introducing the hypoxia-responsive unit. Consequently, the survivin-targeted siRNA-loaded polyplexes showed a remarkable anti-tumor effect not only in hypoxic cells, but also in tumor spheroids and tumor-bearing mice, indicating that the hypoxia-sensitive siRNA delivery system has great potential for tumor-targeted therapy. Hypoxia is one of the most remarkable features of most solid tumors, and targeting hypoxia is considered the best validated strategy in cancer therapy. However, in the past decades, there were few reports of this strategy being used in drug delivery systems, especially siRNA delivery systems. Therefore, we constructed a hypoxia-sensitive siRNA delivery system utilizing a hypoxia-responsive unit, 2-nitroimidazole, by which the unavoidable conflict between improved extracellular stability and promoted intracellular si

  11. Regge calculus from discontinuous metrics

    International Nuclear Information System (INIS)

    Khatsymovsky, V.M.

    2003-01-01

    Regge calculus is considered as a particular case of a more general system in which the link lengths of any two neighbouring 4-tetrahedra do not necessarily coincide on their common face. This system is treated as one described by a metric that is discontinuous on the faces. In the superspace of all discontinuous metrics, the Regge calculus metrics form a hypersurface defined by continuity conditions. Quantum theory of the discontinuous-metric system is assumed to be fixed somehow in the form of a quantum measure on (the space of functionals on) the superspace. The problem of reducing this measure to the Regge hypersurface is addressed. The quantum Regge calculus measure is defined from a discontinuous-metric measure by inserting a δ-function-like phase factor. The requirement that continuity conditions be imposed in a 'face-independent' way fixes this factor uniquely. The term 'face-independent' means that this factor depends only on the (hyper)plane spanned by the face, not on its form and size. This requirement seems natural from the viewpoint of the existence of a well-defined continuum limit maximally free of lattice artefacts

  12. Numerical Calabi-Yau metrics

    International Nuclear Information System (INIS)

    Douglas, Michael R.; Karp, Robert L.; Lukic, Sergio; Reinbacher, Rene

    2008-01-01

    We develop numerical methods for approximating Ricci flat metrics on Calabi-Yau hypersurfaces in projective spaces. Our approach is based on finding balanced metrics and builds on recent theoretical work by Donaldson. We illustrate our methods in detail for a one parameter family of quintics. We also suggest several ways to extend our results

  13. Reproducibility of R-fMRI metrics on the impact of different strategies for multiple comparison correction and sample sizes.

    Science.gov (United States)

    Chen, Xiao; Lu, Bin; Yan, Chao-Gan

    2018-01-01

    Concerns regarding reproducibility of resting-state functional magnetic resonance imaging (R-fMRI) findings have been raised. Little is known about how to operationally define R-fMRI reproducibility and to what extent it is affected by multiple comparison correction strategies and sample size. We comprehensively assessed two aspects of reproducibility, test-retest reliability and replicability, on widely used R-fMRI metrics in both between-subject contrasts of sex differences and within-subject comparisons of eyes-open and eyes-closed (EOEC) conditions. We noted that the permutation test with Threshold-Free Cluster Enhancement (TFCE), a strict multiple comparison correction strategy, reached the best balance between family-wise error rate (under 5%) and test-retest reliability/replicability (e.g., 0.68 for test-retest reliability and 0.25 for replicability of the amplitude of low-frequency fluctuations (ALFF) for between-subject sex differences, 0.49 for replicability of ALFF for within-subject EOEC differences). Although R-fMRI indices attained moderate reliabilities, they replicated poorly in distinct datasets (replicability < 0.3 for between-subject sex differences, < 0.5 for within-subject EOEC differences). By randomly drawing different sample sizes from a single site, we found that reliability, sensitivity and positive predictive value (PPV) rose as sample size increased. Small sample sizes (e.g., < 80 [40 per group]) not only minimized power (sensitivity < 2%), but also decreased the likelihood that significant results reflect "true" effects (PPV < 0.26) in sex differences. Our findings have implications for how to select multiple comparison correction strategies and highlight the importance of sufficiently large sample sizes in R-fMRI studies to enhance reproducibility. Hum Brain Mapp 39:300-318, 2018. © 2017 Wiley Periodicals, Inc.

  14. NeuronMetrics: software for semi-automated processing of cultured neuron images.

    Science.gov (United States)

    Narro, Martha L; Yang, Fan; Kraft, Robert; Wenk, Carola; Efrat, Alon; Restifo, Linda L

    2007-03-23

    Using primary cell culture to screen for changes in neuronal morphology requires specialized analysis software. We developed NeuronMetrics for semi-automated, quantitative analysis of two-dimensional (2D) images of fluorescently labeled cultured neurons. It skeletonizes the neuron image using two complementary image-processing techniques, capturing fine terminal neurites with high fidelity. An algorithm was devised to span wide gaps in the skeleton. NeuronMetrics uses a novel strategy based on geometric features called faces to extract a branch number estimate from complex arbors with numerous neurite-to-neurite contacts, without creating a precise, contact-free representation of the neurite arbor. It estimates total neurite length, branch number, primary neurite number, territory (the area of the convex polygon bounding the skeleton and cell body), and Polarity Index (a measure of neuronal polarity). These parameters provide fundamental information about the size and shape of neurite arbors, which are critical factors for neuronal function. NeuronMetrics streamlines optional manual tasks such as removing noise, isolating the largest primary neurite, and correcting length for self-fasciculating neurites. Numeric data are output in a single text file, readily imported into other applications for further analysis. Written as modules for ImageJ, NeuronMetrics provides practical analysis tools that are easy to use and support batch processing. Depending on the need for manual intervention, processing time for a batch of approximately 60 2D images is 1.0-2.5 h, from a folder of images to a table of numeric data. NeuronMetrics' output accelerates the quantitative detection of mutations and chemical compounds that alter neurite morphology in vitro, and will contribute to the use of cultured neurons for drug discovery.

  15. Increasing the structural coverage of tuberculosis drug targets.

    Science.gov (United States)

    Baugh, Loren; Phan, Isabelle; Begley, Darren W; Clifton, Matthew C; Armour, Brianna; Dranow, David M; Taylor, Brandy M; Muruthi, Marvin M; Abendroth, Jan; Fairman, James W; Fox, David; Dieterich, Shellie H; Staker, Bart L; Gardberg, Anna S; Choi, Ryan; Hewitt, Stephen N; Napuli, Alberto J; Myers, Janette; Barrett, Lynn K; Zhang, Yang; Ferrell, Micah; Mundt, Elizabeth; Thompkins, Katie; Tran, Ngoc; Lyons-Abbott, Sally; Abramov, Ariel; Sekar, Aarthi; Serbzhinskiy, Dmitri; Lorimer, Don; Buchko, Garry W; Stacy, Robin; Stewart, Lance J; Edwards, Thomas E; Van Voorhis, Wesley C; Myler, Peter J

    2015-03-01

    High-resolution three-dimensional structures of essential Mycobacterium tuberculosis (Mtb) proteins provide templates for TB drug design, but are available for only a small fraction of the Mtb proteome. Here we evaluate an intra-genus "homolog-rescue" strategy to increase the structural information available for TB drug discovery by using mycobacterial homologs with conserved active sites. Of 179 potential TB drug targets selected for x-ray structure determination, only 16 yielded a crystal structure. By adding 1675 homologs from nine other mycobacterial species to the pipeline, structures representing an additional 52 otherwise intractable targets were solved. To determine whether these homolog structures would be useful surrogates in TB drug design, we compared the active sites of 106 pairs of Mtb and non-TB mycobacterial (NTM) enzyme homologs with experimentally determined structures, using three metrics of active site similarity, including superposition of continuous pharmacophoric property distributions. Pair-wise structural comparisons revealed that 19/22 pairs with >55% overall sequence identity had active site Cα RMSD 85% side chain identity, and ≥80% PSAPF (similarity based on pharmacophoric properties) indicating highly conserved active site shape and chemistry. Applying these results to the 52 NTM structures described above, 41 shared >55% sequence identity with the Mtb target, thus increasing the effective structural coverage of the 179 Mtb targets over three-fold (from 9% to 32%). The utility of these structures in TB drug design can be tested by designing inhibitors using the homolog structure and assaying the cognate Mtb enzyme; a promising test case, Mtb cytidylate kinase, is described. The homolog-rescue strategy evaluated here for TB is also generalizable to drug targets for other diseases. Copyright © 2014 Elsevier Ltd. All rights reserved.
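The Cα RMSD used above as one of the three active-site similarity metrics is, at its core, a root-mean-square deviation over paired atom coordinates. A minimal sketch, assuming the two structures are already optimally superposed (the Kabsch alignment step a real comparison would perform first is omitted here):

```python
# Hedged sketch of Calpha RMSD between two active sites: the square root of
# the mean squared distance between paired 3-D coordinates. Assumes the
# coordinate sets are pre-superposed and paired residue-by-residue.
import math

def rmsd(coords_a, coords_b):
    """RMSD between paired 3-D points, assumed pre-superposed."""
    n = len(coords_a)
    sq = sum(
        (ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
        for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b)
    )
    return math.sqrt(sq / n)

# Two Calpha atoms uniformly translated by 1 Angstrom give RMSD = 1.0.
a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
b = [(0.0, 0.0, 1.0), (1.0, 0.0, 1.0)]
assert abs(rmsd(a, b) - 1.0) < 1e-12
```

The other two metrics in the study (side-chain identity and the pharmacophoric-property similarity, PSAPF) layer chemical information on top of this purely geometric measure.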

  16. Universal, colorimetric microRNA detection strategy based on target-catalyzed toehold-mediated strand displacement reaction

    Science.gov (United States)

    Park, Yeonkyung; Lee, Chang Yeol; Kang, Shinyoung; Kim, Hansol; Park, Ki Soo; Park, Hyun Gyu

    2018-02-01

    In this work, we developed a novel, label-free, and enzyme-free strategy for the colorimetric detection of microRNA (miRNA), which relies on a target-catalyzed toehold-mediated strand displacement (TMSD) reaction. The system employs a detection probe that specifically binds to the target miRNA and sequentially releases a catalyst strand (CS) intended to trigger the subsequent TMSD reaction. Thus, the presence of target miRNA releases the CS that mediates the formation of an active G-quadruplex DNAzyme which is initially caged and inactivated by a blocker strand. In addition, a fuel strand that is supplemented for the recycling of the CS promotes another TMSD reaction, consequently generating a large number of active G-quadruplex DNAzymes. As a result, a distinct colorimetric signal is produced by the ABTS oxidation promoted by the peroxidase mimicking activity of the released G-quadruplex DNAzymes. Based on this novel strategy, we successfully detected miR-141, a promising biomarker for human prostate cancer, with high selectivity. The diagnostic capability of this system was also demonstrated by reliably determining target miR-141 in human serum, showing its great potential towards real clinical applications. Importantly, the proposed approach is composed of separate target recognition and signal transduction modules. Thus, it could be extended to analyze different target miRNAs by simply redesigning the detection probe while keeping the same signal transduction module as a universal signal amplification unit, which was successfully demonstrated by analyzing another target miRNA, let-7d.

  17. Metrics for Evaluation of Student Models

    Science.gov (United States)

    Pelanek, Radek

    2015-01-01

    Researchers use many different metrics for evaluation of performance of student models. The aim of this paper is to provide an overview of commonly used metrics, to discuss properties, advantages, and disadvantages of different metrics, to summarize current practice in educational data mining, and to provide guidance for evaluation of student…

  18. Modulation of actin dynamics as potential macrophage subtype-targeting anti-tumour strategy.

    Science.gov (United States)

    Pergola, Carlo; Schubert, Katrin; Pace, Simona; Ziereisen, Jana; Nikels, Felix; Scherer, Olga; Hüttel, Stephan; Zahler, Stefan; Vollmar, Angelika M; Weinigel, Christina; Rummler, Silke; Müller, Rolf; Raasch, Martin; Mosig, Alexander; Koeberle, Andreas; Werz, Oliver

    2017-01-30

    Tumour-associated macrophages mainly comprise immunosuppressive M2 phenotypes that promote tumour progression besides anti-tumoural M1 subsets. Selective depletion or reprogramming of M2 may represent an innovative anti-cancer strategy. The actin cytoskeleton is central for cellular homeostasis and is targeted for anti-cancer chemotherapy. Here, we show that targeting G-actin nucleation using chondramide A (ChA) predominantly depletes human M2 while promoting the tumour-suppressive M1 phenotype. ChA reduced the viability of M2, with minor effects on M1, but increased tumour necrosis factor (TNF)α release from M1. Interestingly, ChA caused rapid disruption of dynamic F-actin filaments and polymerization of G-actin, followed by reduction of cell size, binucleation and cell division, without cellular collapse. In M1, but not in M2, ChA caused marked activation of SAPK/JNK and NFκB, with slight or no effects on Akt, STAT-1/-3, ERK-1/2, and p38 MAPK, seemingly accounting for the better survival of M1 and TNFα secretion. In a microfluidically-supported human tumour biochip model, circulating ChA-treated M1 markedly reduced tumour cell viability through enhanced release of TNFα. Together, ChA may cause an anti-tumoural microenvironment by depletion of M2 and activation of M1, suggesting induction of G-actin nucleation as potential strategy to target tumour-associated macrophages in addition to neoplastic cells.

  19. Targeting Strategies for the Combination Treatment of Cancer Using Drug Delivery Systems

    Science.gov (United States)

    Kydd, Janel; Jadia, Rahul; Velpurisiva, Praveena; Gad, Aniket; Paliwal, Shailee; Rai, Prakash

    2017-01-01

    Cancer cells have characteristics of acquired and intrinsic resistances to chemotherapy treatment—due to the hostile tumor microenvironment—that create a significant challenge for effective therapeutic regimens. Multidrug resistance, collateral toxicity to normal cells, and detrimental systemic side effects present significant obstacles, necessitating alternative and safer treatment strategies. Traditional administration of chemotherapeutics has demonstrated minimal success due to the non-specificity of action, uptake and rapid clearance by the immune system, and subsequent metabolic alteration and poor tumor penetration. Nanomedicine can provide a more effective approach to targeting cancer by focusing on the vascular, tissue, and cellular characteristics that are unique to solid tumors. Targeted methods of treatment using nanoparticles can decrease the likelihood of resistant clonal populations of cancerous cells. Dual encapsulation of chemotherapeutic drug allows simultaneous targeting of more than one characteristic of the tumor. Several first-generation, non-targeted nanomedicines have received clinical approval starting with Doxil® in 1995. However, more than two decades later, second-generation or targeted nanomedicines have yet to be approved for treatment despite promising results in pre-clinical studies. This review highlights recent studies using targeted nanoparticles for cancer treatment focusing on approaches that target either the tumor vasculature (referred to as ‘vascular targeting’), the tumor microenvironment (‘tissue targeting’) or the individual cancer cells (‘cellular targeting’). Recent studies combining these different targeting methods are also discussed in this review. Finally, this review summarizes some of the reasons for the lack of clinical success in the field of targeted nanomedicines. PMID:29036899

  20. Targeting Strategies for the Combination Treatment of Cancer Using Drug Delivery Systems

    Directory of Open Access Journals (Sweden)

    Janel Kydd

    2017-10-01

    Cancer cells have characteristics of acquired and intrinsic resistances to chemotherapy treatment—due to the hostile tumor microenvironment—that create a significant challenge for effective therapeutic regimens. Multidrug resistance, collateral toxicity to normal cells, and detrimental systemic side effects present significant obstacles, necessitating alternative and safer treatment strategies. Traditional administration of chemotherapeutics has demonstrated minimal success due to the non-specificity of action, uptake and rapid clearance by the immune system, and subsequent metabolic alteration and poor tumor penetration. Nanomedicine can provide a more effective approach to targeting cancer by focusing on the vascular, tissue, and cellular characteristics that are unique to solid tumors. Targeted methods of treatment using nanoparticles can decrease the likelihood of resistant clonal populations of cancerous cells. Dual encapsulation of chemotherapeutic drugs allows simultaneous targeting of more than one characteristic of the tumor. Several first-generation, non-targeted nanomedicines have received clinical approval starting with Doxil® in 1995. However, more than two decades later, second-generation or targeted nanomedicines have yet to be approved for treatment despite promising results in pre-clinical studies. This review highlights recent studies using targeted nanoparticles for cancer treatment focusing on approaches that target either the tumor vasculature (referred to as ‘vascular targeting’), the tumor microenvironment (‘tissue targeting’) or the individual cancer cells (‘cellular targeting’). Recent studies combining these different targeting methods are also discussed in this review. Finally, this review summarizes some of the reasons for the lack of clinical success in the field of targeted nanomedicines.

  1. Mitochondria-targeting nanomedicine: An effective and potent strategy against aminoglycosides-induced ototoxicity.

    Science.gov (United States)

    Zhou, Shuang; Sun, Yanhui; Kuang, Xiao; Hou, Shanshan; Yang, YinXian; Wang, Zhenjie; Liu, Hongzhuo

    2018-04-21

    We report a proof-of-concept for the development of mitochondria-targeting nanoparticles (NPs) loaded with geranylgeranylacetone (GGA) to protect against a wide range of gentamicin-induced ototoxicity symptoms in a zebrafish model. The polymeric NPs were functionalized with a mitochondrial-homing peptide (d‑Arg‑Dmt‑Orn‑Phe‑NH2) and exhibited greater mitochondrial uptake and lower gentamicin uptake in hair cells via mechanotransduction (MET) channels and tuned machinery in the hair bundle than the ordinary NPs did. Blockade of MET channels rapidly reversed this effect, indicating that the reversible responses of hair cells to the targeting NPs were mediated by MET channels. Pretreatment of hair cells with mitochondria-targeting GGA-loaded NPs exhibited a superior acute or chronic protective efficacy against subsequent exposure to gentamicin compared with unmodified formulations. Mitochondrial delivery regulating the death pathway of hair cells appeared to cause the therapeutic failure of untargeted NPs. Thus, peptide-directed mitochondria-targeting NPs may represent a novel therapeutic strategy for mitochondrial dysfunction-linked diseases. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Issues in Benchmark Metric Selection

    Science.gov (United States)

    Crolotte, Alain

    It is true that a metric can influence a benchmark but will esoteric metrics create more problems than they will solve? We answer this question affirmatively by examining the case of the TPC-D metric which used the much debated geometric mean for the single-stream test. We will show how a simple choice influenced the benchmark and its conduct and, to some extent, DBMS development. After examining other alternatives our conclusion is that the “real” measure for a decision-support benchmark is the arithmetic mean.
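
    The trade-off discussed above can be made concrete with a small sketch (hypothetical single-stream timings, not TPC-D data): the geometric mean damps a single slow query, while the arithmetic mean is dominated by it.

```python
import math

def arithmetic_mean(times):
    """Plain average: dominated by the slowest queries."""
    return sum(times) / len(times)

def geometric_mean(times):
    """n-th root of the product of the times, computed via logs
    for numerical stability; damps the influence of outliers."""
    return math.exp(sum(math.log(t) for t in times) / len(times))

# Hypothetical timings (seconds) for five queries, one a slow outlier.
times = [1.0, 2.0, 2.0, 4.0, 100.0]

print(round(arithmetic_mean(times), 2))  # 21.8
print(round(geometric_mean(times), 2))   # 4.37
```

    A benchmark scored by the geometric mean thus rewards shaving the fast queries as much as fixing the slow one, which is the distortion the abstract argues against.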

  3. Selection of appropriate E-learning personalization strategies from ontological perspectives

    Directory of Open Access Journals (Sweden)

    Fathi Essalmi

    2010-10-01

    When there are several personalization strategies for E-learning, authors of courses need support in deciding which strategy to apply for personalizing each course. In fact, the time, the effort, and the learning objects needed for preparing personalized learning scenarios depend on the personalization strategy to be applied. This paper presents an approach for selecting personalization strategies according to the feasibility of generating personalized learning scenarios with minimal intervention by the author. Several metrics are proposed for ranking and selecting useful personalization strategies. The calculation of these metrics is automated based on analysis of the LOM (Learning Object Metadata) standard according to the semantic relations between data elements and learners’ characteristics represented in the Ontology for Selection of Personalization Strategies (OSPS).

  4. Least loaded and route fragmentation aware RSA strategies for elastic optical networks

    Science.gov (United States)

    Batham, Deepak; Yadav, Dharmendra Singh; Prakash, Shashi

    2017-12-01

    Elastic optical networks (EONs) provide the flexibility to assign a wide range of spectral resources to connection requests. In this manuscript, we address two issues related to spectrum assignment in EONs: non-uniform spectrum assignment along the different links of a route, and spectrum fragmentation in the network. To address these issues, two routing and spectrum assignment (RSA) strategies have been proposed: Least Loaded RSA (LLRSA) and Route Fragmentation Aware RSA (RFARSA). LLRSA allocates spectrum homogeneously across the links in the network, whereas RFARSA accords priority to routes that are less fragmented. To highlight the salient features of the two strategies, two new metrics, the route fragmentation index (RFI) and the standard deviation (SD), are introduced. RFI is defined as the ratio of non-contiguous FSs to the total available free FSs on the route, and SD is a measure of the non-uniformity in the allocation of resources on the links in the network. A simulation program has been developed to evaluate the performance of the proposed strategies (LLRSA and RFARSA) and the existing shortest path RSA (SPRSA) and spectrum compactness based defragmentation (SCD) strategies on the metrics of RFI, bandwidth blocking probability (BBP), network capacity utilized, and SD. The variation in these metrics with the number of requests and the bandwidth (number of FSs) requested has been studied. It has been conclusively established that the proposed strategies (LLRSA and RFARSA) outperform the existing strategies in terms of all the metrics.
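
    The RFI defined above (non-contiguous free frequency slots divided by all free slots on the route) can be sketched in a few lines. This is a minimal reading of the definition, assuming that free slots outside the largest contiguous free block count as non-contiguous; the slot bitmaps below are made up for illustration.

```python
def route_fragmentation_index(spectrum):
    """spectrum: one flag per frequency slot (FS) along the route,
    True = free, False = occupied by an existing connection."""
    free = sum(spectrum)
    if free == 0:
        return 0.0  # no free slots, nothing to fragment
    # collect the lengths of maximal runs of contiguous free slots
    runs, run = [], 0
    for slot in spectrum:
        if slot:
            run += 1
        else:
            if run:
                runs.append(run)
            run = 0
    if run:
        runs.append(run)
    # free slots outside the largest contiguous block are non-contiguous
    return (free - max(runs)) / free

print(route_fragmentation_index([True, True, True, True]))                # 0.0
print(route_fragmentation_index([True, False, True, True, False, True])) # 0.5
```

    An RFI of 0 means all free spectrum on the route sits in one block; RFARSA, as described, would prefer such routes over equally loaded but fragmented ones.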

  5. Web metrics for library and information professionals

    CERN Document Server

    Stuart, David

    2014-01-01

    This is a practical guide to using web metrics to measure impact and demonstrate value. The web provides an opportunity to collect a host of different metrics, from those associated with social media accounts and websites to more traditional research outputs. This book is a clear guide for library and information professionals as to what web metrics are available and how to assess and use them to make informed decisions and demonstrate value. As individuals and organizations increasingly use the web in addition to traditional publishing avenues and formats, this book provides the tools to unlock web metrics and evaluate the impact of this content. The key topics covered include: bibliometrics, webometrics and web metrics; data collection tools; evaluating impact on the web; evaluating social media impact; investigating relationships between actors; exploring traditional publications in a new environment; web metrics and the web of data; the future of web metrics and the library and information professional.

  6. Partial rectangular metric spaces and fixed point theorems.

    Science.gov (United States)

    Shukla, Satish

    2014-01-01

    The purpose of this paper is to introduce the concept of partial rectangular metric spaces as a generalization of rectangular metric and partial metric spaces. Some properties of partial rectangular metric spaces and some fixed point results for quasi-type contractions in partial rectangular metric spaces are proved. Some examples are given to illustrate the observed results.

  7. In vivo tumor targeting of gold nanoparticles: effect of particle type and dosing strategy.

    Science.gov (United States)

    Puvanakrishnan, Priyaveena; Park, Jaesook; Chatterjee, Deyali; Krishnan, Sunil; Tunnell, James W

    2012-01-01

    Gold nanoparticles (GNPs) have gained significant interest as nanovectors for combined imaging and photothermal therapy of tumors. Delivered systemically, GNPs preferentially accumulate at the tumor site via the enhanced permeability and retention effect, and when irradiated with near infrared light, produce sufficient heat to treat tumor tissue. The efficacy of this process strongly depends on the targeting ability of the GNPs, which is a function of the particle's geometric properties (e.g., size) and dosing strategy (e.g., number and amount of injections). The purpose of this study was to investigate the effect of GNP type and dosing strategy on in vivo tumor targeting. Specifically, we investigated the in vivo tumor-targeting efficiency of pegylated gold nanoshells (GNSs) and gold nanorods (GNRs) for single and multiple dosing. We used Swiss nu/nu mice with a subcutaneous tumor xenograft model that received single or multiple intravenous doses of GNSs and GNRs. We performed neutron activation analysis to quantify the gold present in the tumor and liver. We performed histology to determine if there was acute toxicity as a result of multiple dosing. Neutron activation analysis results showed that the smaller GNRs accumulated in higher concentrations in the tumor compared to the larger GNSs. We observed a significant increase in GNS and GNR accumulation in the liver for higher doses. However, multiple doses increased targeting efficiency, with minimal effect beyond three doses of GNPs. These results suggest a significant effect of particle type and multiple dosing on increasing particle accumulation and tumor-targeting ability.

  8. A Kerr-NUT metric

    International Nuclear Information System (INIS)

    Vaidya, P.C.; Patel, L.K.; Bhatt, P.V.

    1976-01-01

    Using Galilean time and retarded distance as coordinates, the usual Kerr metric is expressed in a form similar to the Newman-Unti-Tamburino (NUT) metric. The combined Kerr-NUT metric is then investigated. In addition to the Kerr and NUT solutions of Einstein's equations, three other types of solutions are derived. These are (i) the radiating Kerr solution, (ii) the radiating NUT solution satisfying R_ik = σ ξ_i ξ_k, ξ_i ξ^i = 0, and (iii) the associated Kerr solution satisfying R_ik = 0. Solution (i) is distinct from and simpler than the one reported earlier by Vaidya and Patel (Phys. Rev. D7:3590 (1973)). Solutions (ii) and (iii) give line elements which have the axis of symmetry as a singular line. (author)

  9. Evaluation of performance metrics of leagile supply chain through fuzzy MCDM

    Directory of Open Access Journals (Sweden)

    D. Venkata Ramana

    2013-07-01

    Leagile supply chain management has emerged as a proactive approach for improving the business value of companies. Companies that face volatile and unpredictable market demand for their products must pioneer a leagile supply chain strategy to remain competitive and meet the varied demands of customers. There are many approaches to performance metrics of supply chains in general, yet little investigation has established the reliability and validity of such approaches, particularly in leagile supply chains. This study examines the consistency of approaches by confirmatory factor analysis, which determines the adoption of performance dimensions. The prioritization of performance enablers under these dimensions of the leagile supply chain in small and medium enterprises is determined through the fuzzy logarithmic least square method (LLSM). The study developed a generic hierarchy model for decision-makers who can prioritize the supply chain metrics under performance dimensions of the leagile supply chain.

  10. Magnetic targeting as a strategy to enhance therapeutic effects of mesenchymal stromal cells.

    Science.gov (United States)

    Silva, Luisa H A; Cruz, Fernanda F; Morales, Marcelo M; Weiss, Daniel J; Rocco, Patricia R M

    2017-03-09

    Mesenchymal stromal cells (MSCs) have been extensively investigated in the field of regenerative medicine. It is known that the success of MSC-based therapies depends primarily on effective cell delivery to the target site where they will secrete vesicles and soluble factors with immunomodulatory and potentially reparative properties. However, some lesions are located in sites that are difficult to access, such as the heart, spinal cord, and joints. Additionally, low MSC retention at target sites makes cell therapy short-lasting and, therefore, less effective. In this context, the magnetic targeting technique has emerged as a new strategy to aid delivery, increase retention, and enhance the effects of MSCs. This approach uses magnetic nanoparticles to magnetize MSCs and static magnetic fields to guide them in vivo, thus promoting more focused, effective, and lasting retention of MSCs at the target site. In the present review, we discuss the magnetic targeting technique, its principles, and the materials most commonly used; we also discuss its potential for MSC enhancement, and safety concerns that should be addressed before it can be applied in clinical practice.

  11. Productivity in Pediatric Palliative Care: Measuring and Monitoring an Elusive Metric.

    Science.gov (United States)

    Kaye, Erica C; Abramson, Zachary R; Snaman, Jennifer M; Friebert, Sarah E; Baker, Justin N

    2017-05-01

    Workforce productivity is poorly defined in health care. Particularly in the field of pediatric palliative care (PPC), the absence of consensus metrics impedes aggregation and analysis of data to track workforce efficiency and effectiveness. Lack of uniformly measured data also compromises the development of innovative strategies to improve productivity and hinders investigation of the link between productivity and quality of care, which are interrelated but not interchangeable. To review the literature regarding the definition and measurement of productivity in PPC; to identify barriers to productivity within traditional PPC models; and to recommend novel metrics to study productivity as a component of quality care in PPC. PubMed® and Cochrane Database of Systematic Reviews searches for scholarly literature were performed using key words (pediatric palliative care, palliative care, team, workforce, workflow, productivity, algorithm, quality care, quality improvement, quality metric, inpatient, hospital, consultation, model) for articles published between 2000 and 2016. Organizational searches of Center to Advance Palliative Care, National Hospice and Palliative Care Organization, National Association for Home Care & Hospice, American Academy of Hospice and Palliative Medicine, Hospice and Palliative Nurses Association, National Quality Forum, and National Consensus Project for Quality Palliative Care were also performed. Additional semistructured interviews were conducted with directors from seven prominent PPC programs across the U.S. to review standard operating procedures for PPC team workflow and productivity. Little consensus exists in the PPC field regarding optimal ways to define, measure, and analyze provider and program productivity. Barriers to accurate monitoring of productivity include difficulties with identification, measurement, and interpretation of metrics applicable to an interdisciplinary care paradigm. In the context of inefficiencies…

  12. Background metric in supergravity theories

    International Nuclear Information System (INIS)

    Yoneya, T.

    1978-01-01

    In supergravity theories, we investigate the conformal anomaly of the path-integral determinant and the problem of fermion zero modes in the presence of a nontrivial background metric. Except in SO(3)-invariant supergravity, there are nonvanishing conformal anomalies. As a consequence, amplitudes around the nontrivial background metric contain unpredictable arbitrariness. The fermion zero modes which are explicitly constructed for the Euclidean Schwarzschild metric are interpreted as an indication of the supersymmetric multiplet structure of a black hole. The degree of degeneracy of a black hole is 2^(4n) in SO(n) supergravity.

  13. Daylight metrics and energy savings

    Energy Technology Data Exchange (ETDEWEB)

    Mardaljevic, John; Heschong, Lisa; Lee, Eleanor

    2009-12-31

    The drive towards sustainable, low-energy buildings has increased the need for simple, yet accurate methods to evaluate whether a daylit building meets minimum standards for energy and human comfort performance. Current metrics account neither for the temporal and spatial aspects of daylight, nor for occupants' comfort or interventions. This paper reviews the historical basis of current compliance methods for achieving daylit buildings, proposes a technical basis for the development of better metrics, and provides two case-study examples to stimulate dialogue on how metrics can be applied in a practical, real-world context.

  14. Application of Sigma Metrics Analysis for the Assessment and Modification of Quality Control Program in the Clinical Chemistry Laboratory of a Tertiary Care Hospital.

    Science.gov (United States)

    Iqbal, Sahar; Mustansar, Tazeen

    2017-03-01

    Sigma is a metric that quantifies the performance of a process as a rate of defects per million opportunities. In clinical laboratories, sigma metric analysis is used to assess the performance of the laboratory process system. The sigma metric is also used as a quality management strategy to improve the quality of a laboratory process by addressing errors after identification. The aim of this study is to evaluate the errors in quality control of the analytical phase of the laboratory system by sigma metric. For this purpose, sigma metric analysis was done for analytes using internal and external quality control as quality indicators. Results of the sigma metric analysis were used to identify the gaps and the need for modification in the strategy of the laboratory quality control procedure. The sigma metric was calculated for the quality control program of ten clinical chemistry analytes, including glucose, chloride, cholesterol, triglyceride, HDL, albumin, direct bilirubin, total bilirubin, protein and creatinine, at two control levels. To calculate the sigma metric, imprecision and bias were calculated from internal and external quality control data, respectively. The minimum acceptable performance was considered as 3 sigma. Westgard sigma rules were applied to customize the quality control procedure. The sigma level was found acceptable (≥3) for glucose (L2), cholesterol, triglyceride, HDL, direct bilirubin and creatinine at both levels of control, with values as high as 8.8 and 8.0 (at L2 and L3, respectively). For the rest of the analytes, the sigma metric fell below the acceptable level at one or both control levels. We conclude that analytes with a low sigma value require a modified quality control procedure. In this study, the application of sigma rules provided a practical solution for an improved and focused design of the QC procedure.
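
    The sigma calculation described above (imprecision from internal QC, bias from external QC) follows the standard Westgard form, sigma = (TEa - |bias|) / CV, with all quantities expressed in percent. A minimal sketch with hypothetical values, not the study's data:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma = (TEa - |bias|) / CV, all in percent.
    TEa: total allowable error for the analyte,
    bias: estimated from external quality control (EQA),
    CV: imprecision estimated from internal quality control."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical analyte: TEa 10%, bias 1.5%, CV 2.0%
sigma = sigma_metric(10.0, 1.5, 2.0)
print(sigma)  # 4.25
print("acceptable" if sigma >= 3 else "needs a stricter QC design")
```

    At 3 sigma or above the study's cutoff is met; lower values would trigger the modified, more stringent Westgard rule sets described in the abstract.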

  15. Metrics for energy resilience

    International Nuclear Information System (INIS)

    Roege, Paul E.; Collier, Zachary A.; Mancillas, James; McDonagh, John A.; Linkov, Igor

    2014-01-01

    Energy lies at the backbone of any advanced society and constitutes an essential prerequisite for economic growth, social order and national defense. However there is an Achilles heel to today's energy and technology relationship; namely a precarious intimacy between energy and the fiscal, social, and technical systems it supports. Recently, widespread and persistent disruptions in energy systems have highlighted the extent of this dependence and the vulnerability of increasingly optimized systems to changing conditions. Resilience is an emerging concept that offers to reconcile considerations of performance under dynamic environments and across multiple time frames by supplementing traditionally static system performance measures to consider behaviors under changing conditions and complex interactions among physical, information and human domains. This paper identifies metrics useful to implement guidance for energy-related planning, design, investment, and operation. Recommendations are presented using a matrix format to provide a structured and comprehensive framework of metrics relevant to a system's energy resilience. The study synthesizes previously proposed metrics and emergent resilience literature to provide a multi-dimensional model intended for use by leaders and practitioners as they transform our energy posture from one of stasis and reaction to one that is proactive and which fosters sustainable growth. - Highlights: • Resilience is the ability of a system to recover from adversity. • There is a need for methods to quantify and measure system resilience. • We developed a matrix-based approach to generate energy resilience metrics. • These metrics can be used in energy planning, system design, and operations

  16. Balanced metrics for vector bundles and polarised manifolds

    DEFF Research Database (Denmark)

    Garcia Fernandez, Mario; Ross, Julius

    2012-01-01

    We consider a notion of balanced metrics for triples (X, L, E) which depend on a parameter α, where X is a smooth complex manifold with an ample line bundle L and E is a holomorphic vector bundle over X. For generic choice of α, we prove that the limit of a convergent sequence of balanced metrics leads to a Hermitian-Einstein metric on E and a constant scalar curvature Kähler metric in c_1(L). For special values of α, limits of balanced metrics are solutions of a system of coupled equations relating a Hermitian-Einstein metric on E and a Kähler metric in c_1(L). For this, we compute the top two…

  17. Activation loop targeting strategy for design of receptor-interacting protein kinase 2 (RIPK2) inhibitors.

    Science.gov (United States)

    Suebsuwong, Chalada; Pinkas, Daniel M; Ray, Soumya S; Bufton, Joshua C; Dai, Bing; Bullock, Alex N; Degterev, Alexei; Cuny, Gregory D

    2018-02-15

    Development of selective kinase inhibitors remains a challenge due to considerable amino acid sequence similarity among family members, particularly in the ATP binding site. Targeting the activation loop might offer improved inhibitor selectivity since this region of kinases is less conserved. However, the strategy presents difficulties due to activation loop flexibility. Herein, we report the design of receptor-interacting protein kinase 2 (RIPK2) inhibitors based on the pan-kinase inhibitor regorafenib that aim to engage basic activation loop residues Lys169 or Arg171. We report the development of CSR35, which displayed >10-fold selective inhibition of RIPK2 versus VEGFR2, the target of regorafenib. A co-crystal structure of CSR35 with RIPK2 revealed a resolved activation loop with an ionic interaction between the carboxylic acid installed in the inhibitor and the side-chain of Lys169. Our data provide proof of principle that activation loop-targeting type II inhibitors can be developed as a complementary strategy for achieving improved selectivity. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.

  18. PRICE-LEVEL TARGETING – A VIABLE ALTERNATIVE TO INFLATION TARGETING?

    Directory of Open Access Journals (Sweden)

    Iulian Vasile Popescu

    2012-12-01

    The recent financial crisis, which has led some central banks that reached the zero lower bound of their interest rates to use unconventional monetary policy instruments, has brought to the forefront the academic discussion on the shift from inflation targeting (IT) to price-level targeting. This paper provides a comparative analysis of the IT strategy and price-level targeting, assesses the implications, and highlights the challenges of an eventual transition to a new monetary policy strategy. Balancing the advantages (mainly better-anchored inflation expectations) and disadvantages (communication difficulties) generated by following a potential price-level targeting strategy, and the necessary prerequisites for its functionality (predictive agents, fully familiar with the implications of such a strategy and with complete confidence in the monetary authority), has led us to the conclusion that there is no common acceptance that a price-level targeting strategy might replace the present IT framework.

  19. The metrics of science and technology

    CERN Document Server

    Geisler, Eliezer

    2000-01-01

    Dr. Geisler's far-reaching, unique book provides an encyclopedic compilation of the key metrics to measure and evaluate the impact of science and technology on academia, industry, and government. Focusing on such items as economic measures, patents, peer review, and other criteria, and supported by an extensive review of the literature, Dr. Geisler gives a thorough analysis of the strengths and weaknesses inherent in metric design, and in the use of the specific metrics he cites. His book has already received prepublication attention, and will prove especially valuable for academics in technology management, engineering, and science policy; industrial R&D executives and policymakers; government science and technology policymakers; and scientists and managers in government research and technology institutions. Geisler maintains that the application of metrics to evaluate science and technology at all levels illustrates the variety of tools we currently possess. Each metric has its own unique strengths and...

  20. Extending cosmology: the metric approach

    OpenAIRE

    Mendoza, S.

    2012-01-01

    Comment: 2012, Extending Cosmology: The Metric Approach, Open Questions in Cosmology; Review article for an Intech "Open questions in cosmology" book chapter (19 pages, 3 figures). Available from: http://www.intechopen.com/books/open-questions-in-cosmology/extending-cosmology-the-metric-approach

  1. An analytical modeling framework to evaluate converged networks through business-oriented metrics

    International Nuclear Information System (INIS)

    Guimarães, Almir P.; Maciel, Paulo R.M.; Matias, Rivalino

    2013-01-01

    Society increasingly relies on converged networks as an essential medium for individuals, businesses, and governments. Strategies, methods, models and techniques for preventing and handling hardware or software failures, as well as for avoiding performance degradation, are thus fundamental for prevailing in business. Issues such as operational costs, revenues and their relationship to key performance and dependability metrics are central for defining the required system infrastructure. Our work aims to provide system performance and dependability models to support the optimization of infrastructure design with respect to business-oriented metrics. In addition, a methodology is adopted to support both the modeling and the evaluation process. The results showed that the proposed methodology can significantly reduce the complexity of infrastructure design as well as improve the relationship between business and infrastructure aspects.

  2. Measuring Information Security: Guidelines to Build Metrics

    Science.gov (United States)

    von Faber, Eberhard

    Measuring information security is a genuine interest of security managers. With metrics they can develop their security organization's visibility and standing within the enterprise or public authority as a whole. Organizations using information technology need to use security metrics. Despite the clear demands and advantages, security metrics are often poorly developed, or ineffective parameters are collected and analysed. This paper describes best practices for the development of security metrics. Attention is first drawn to motivation, showing both requirements and benefits. The main body of this paper lists things which need to be observed (characteristics of metrics), things which can be measured (how measurements can be conducted) and steps for the development and implementation of metrics (procedures and planning). Analysis and communication are also key when using security metrics. Examples are given in order to develop a better understanding. The author aims to continue and develop the discussion of a topic which is, or increasingly will be, a critical success factor for security managers in larger organizations.

  3. Multimetric indices: How many metrics?

    Science.gov (United States)

    Multimetric indices (MMI’s) often include 5 to 15 metrics, each representing a different attribute of assemblage condition, such as species diversity, tolerant taxa, and nonnative taxa. Is there an optimal number of metrics for MMIs? To explore this question, I created 1000 9-met...
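    Although the record is truncated, the basic MMI construction it refers to — rescaling each component metric between a floor and a ceiling and summing the scores — can be sketched as follows (illustrative Python; the metric names and floor/ceiling values are hypothetical, not from the study):

```python
def score_metric(value, floor, ceiling, decreaser=False):
    """Scale a raw metric to a 0-10 score between its floor and ceiling.

    decreaser=True flags metrics that indicate worse condition as they
    increase (e.g. percentage of tolerant or nonnative taxa).
    """
    x = (value - floor) / (ceiling - floor)
    if decreaser:
        x = 1 - x
    return 10 * min(max(x, 0.0), 1.0)

def mmi(values, specs):
    """Multimetric index: sum of scored metrics.

    specs is a list of (floor, ceiling, decreaser) tuples, one per metric.
    """
    return sum(score_metric(v, (f, c, d)[0], c, d)
               if False else score_metric(v, f, c, d)
               for v, (f, c, d) in zip(values, specs))

# hypothetical two-metric index
specs = [(0, 30, False),   # taxa richness: higher is better
         (0, 100, True)]   # % tolerant individuals: lower is better
```

    With `values = [15, 50]` both metrics score 5, giving an index of 10; adding or dropping metrics changes the index's range, which is one reason the optimal number of metrics is a real design question.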

  4. Tumor initiating cells and chemoresistance: which is the best strategy to target colon cancer stem cells?

    Science.gov (United States)

    Paldino, Emanuela; Tesori, Valentina; Casalbore, Patrizia; Gasbarrini, Antonio; Puglisi, Maria Ausiliatrice

    2014-01-01

    There is an emerging body of evidence that chemoresistance and minimal residual disease result from selective resistance of a cell subpopulation from the original tumor that is molecularly and phenotypically distinct. These cells are called "cancer stem cells" (CSCs). In this review, we analyze the potential targeting strategies for eradicating CSCs specifically in order to develop more effective therapeutic strategies for metastatic colon cancer. These include induction of terminal epithelial differentiation of CSCs or targeting some genes expressed only in CSCs and involved in self-renewal and chemoresistance. Ideal targets could be cell regulators that simultaneously control the stemness and the resistance of CSCs. Another important aspect of cancer biology, which can also be harnessed to create novel broad-spectrum anticancer agents, is the Warburg effect, also known as aerobic glycolysis. As yet, little is known about the metabolism of the CSC population, leaving an exciting unstudied avenue at the dawn of the emerging field of metabolomics.

  5. Tumor Initiating Cells and Chemoresistance: Which Is the Best Strategy to Target Colon Cancer Stem Cells?

    Directory of Open Access Journals (Sweden)

    Emanuela Paldino

    2014-01-01

    Full Text Available There is an emerging body of evidence that chemoresistance and minimal residual disease result from selective resistance of a cell subpopulation from the original tumor that is molecularly and phenotypically distinct. These cells are called “cancer stem cells” (CSCs). In this review, we analyze the potential targeting strategies for eradicating CSCs specifically in order to develop more effective therapeutic strategies for metastatic colon cancer. These include induction of terminal epithelial differentiation of CSCs or targeting some genes expressed only in CSCs and involved in self-renewal and chemoresistance. Ideal targets could be cell regulators that simultaneously control the stemness and the resistance of CSCs. Another important aspect of cancer biology, which can also be harnessed to create novel broad-spectrum anticancer agents, is the Warburg effect, also known as aerobic glycolysis. As yet, little is known about the metabolism of the CSC population, leaving an exciting unstudied avenue at the dawn of the emerging field of metabolomics.

  6. Solving the productivity and impact puzzle: Do men outperform women, or are metrics biased?

    Science.gov (United States)

    Elissa Z. Cameron; Angela M. White; Meeghan E. Gray

    2016-01-01

    The attrition of women from science with increasing career stage continues, suggesting that current strategies are unsuccessful. Research evaluation using unbiased metrics could be important for the retention of women, because other factors such as implicit bias are unlikely to quickly change. We compare the publishing patterns of men and women within the...

  7. A comparison of information functions and search strategies for sensor planning in target classification.

    Science.gov (United States)

    Zhang, Guoxian; Ferrari, Silvia; Cai, Chenghui

    2012-02-01

    This paper investigates the comparative performance of several information-driven search strategies and decision rules using a canonical target classification problem. Five sensor models are considered: one obtained from classical estimation theory and four obtained from Bernoulli, Poisson, binomial, and mixture-of-binomial distributions. A systematic approach is presented for deriving information functions that represent the expected utility of future sensor measurements from mutual information, Rényi divergence, Kullback-Leibler divergence, information potential, quadratic entropy, and the Cauchy-Schwarz distance. The resulting information-driven strategies are compared to direct-search, alert-confirm, task-driven (TS), and log-likelihood-ratio (LLR) search strategies. Extensive numerical simulations show that quadratic entropy typically leads to the most effective search strategy with respect to correct-classification rates. In the presence of prior information, the quadratic-entropy-driven strategy also displays the lowest rate of false alarms. However, when prior information is absent or very noisy, TS and LLR strategies achieve the lowest false-alarm rates for the Bernoulli, mixture-of-binomial, and classical sensor models.
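    Two of the information functions named above are straightforward to evaluate for discrete distributions. As a minimal sketch (plain Python, illustrative only, not the authors' implementation), the Kullback-Leibler and Rényi divergences between two probability vectors can be computed as:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def renyi_divergence(p, q, alpha):
    """Renyi divergence of order alpha (alpha > 0, alpha != 1)."""
    s = sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

# hypothetical prior and predicted posterior over three target classes
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
```

    As alpha approaches 1, the Rényi divergence converges to the Kullback-Leibler divergence, which is one reason the two rank candidate measurements similarly in many scenarios.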

  8. Metrics for Polyphonic Sound Event Detection

    Directory of Open Access Journals (Sweden)

    Annamaria Mesaros

    2016-05-01

    Full Text Available This paper presents and discusses various metrics proposed for evaluation of polyphonic sound event detection systems used in realistic situations where there are typically multiple sound sources active simultaneously. The system output in this case contains overlapping events, marked as multiple sounds detected as being active at the same time. The polyphonic system output requires a suitable procedure for evaluation against a reference. Metrics from neighboring fields such as speech recognition and speaker diarization can be used, but they need to be partially redefined to deal with the overlapping events. We present a review of the most common metrics in the field and the way they are adapted and interpreted in the polyphonic case. We discuss segment-based and event-based definitions of each metric and explain the consequences of instance-based and class-based averaging using a case study. In parallel, we provide a toolbox containing implementations of presented metrics.
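    The segment-based definitions discussed above can be made concrete with a small sketch (illustrative Python, assuming reference and system output have been discretized into fixed-length segments; this is not the toolbox implementation):

```python
def segment_based_f1(ref, est):
    """Segment-based precision/recall/F1 for polyphonic (multi-label) output.

    ref, est: lists of sets; element k holds the event classes active in
    segment k of the reference annotation and the system output.
    """
    tp = sum(len(r & e) for r, e in zip(ref, est))  # correctly detected
    fp = sum(len(e - r) for r, e in zip(ref, est))  # spurious detections
    fn = sum(len(r - e) for r, e in zip(ref, est))  # missed events
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# hypothetical 4-segment example with overlapping events
ref = [{"speech"}, {"speech", "car"}, set(), {"car"}]
est = [{"speech"}, {"speech"}, {"car"}, {"car"}]
```

    Instance-based averaging, as here, pools counts over all segments; class-based averaging would instead compute the score per event class and average the results, which weights rare classes more heavily.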

  9. Robustness Metrics: Consolidating the multiple approaches to quantify Robustness

    DEFF Research Database (Denmark)

    Göhler, Simon Moritz; Eifler, Tobias; Howard, Thomas J.

    2016-01-01

    robustness metrics; 3) Functional expectancy and dispersion robustness metrics; and 4) Probability of conformance robustness metrics. The goal was to give a comprehensive overview of robustness metrics and guidance to scholars and practitioners to understand the different types of robustness metrics...

  10. Common Metrics for Human-Robot Interaction

    Science.gov (United States)

    Steinfeld, Aaron; Lewis, Michael; Fong, Terrence; Scholtz, Jean; Schultz, Alan; Kaber, David; Goodrich, Michael

    2006-01-01

    This paper describes an effort to identify common metrics for task-oriented human-robot interaction (HRI). We begin by discussing the need for a toolkit of HRI metrics. We then describe the framework of our work and identify important biasing factors that must be taken into consideration. Finally, we present suggested common metrics for standardization and a case study. Preparation of a larger, more detailed toolkit is in progress.

  11. Narrowing the Gap Between QoS Metrics and Web QoE Using Above-the-fold Metrics

    OpenAIRE

    da Hora, Diego Neves; Asrese, Alemnew; Christophides, Vassilis; Teixeira, Renata; Rossi, Dario

    2018-01-01

    International audience; Page load time (PLT) is still the most common application Quality of Service (QoS) metric to estimate the Quality of Experience (QoE) of Web users. Yet, recent literature abounds with proposals for alternative metrics (e.g., Above The Fold, SpeedIndex and variants) that aim at better estimating user QoE. The main purpose of this work is thus to thoroughly investigate a mapping between established and recently proposed objective metrics and user QoE. We obtain ground tr...
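    Among the above-the-fold metrics mentioned, SpeedIndex has a simple operational definition: the integral over time of the page's visual incompleteness. A sketch (illustrative Python, using a step-function approximation over sampled visual-completeness values; not the paper's code):

```python
def speed_index(samples):
    """SpeedIndex from (time_ms, visual_completeness) samples.

    visual_completeness is in [0, 1]; the metric integrates 1 - VC(t)
    using a step-function approximation between samples.
    """
    si, (t_prev, vc_prev) = 0.0, samples[0]
    for t, vc in samples[1:]:
        si += (t - t_prev) * (1 - vc_prev)
        t_prev, vc_prev = t, vc
    return si

# hypothetical filmstrip: 60% painted at 500 ms, 90% at 1000 ms, done at 1500 ms
samples = [(0, 0.0), (500, 0.6), (1000, 0.9), (1500, 1.0)]
```

    Lower values are better; a page that paints most of its above-the-fold content early gets a low SpeedIndex even if its total load time (PLT) is long, which is exactly the gap between QoS metrics and perceived QoE that such metrics try to close.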

  12. Photochemical internalisation, a minimally invasive strategy for light-controlled endosomal escape of cancer stem cell-targeting therapeutics.

    Science.gov (United States)

    Selbo, Pål Kristian; Bostad, Monica; Olsen, Cathrine Elisabeth; Edwards, Victoria Tudor; Høgset, Anders; Weyergang, Anette; Berg, Kristian

    2015-08-01

    Despite progress in radio-, chemo- and photodynamic-therapy (PDT) of cancer, treatment resistance still remains a major problem for patients with aggressive tumours. Cancer stem cells (CSCs) or tumour-initiating cells are intrinsically and notoriously resistant to conventional cancer therapies and are proposed to be responsible for the recurrence of tumours after therapy. According to the CSC hypothesis, it is imperative to develop novel anticancer agents or therapeutic strategies that take into account the biology and role of CSCs. The present review outlines our recent study on photochemical internalisation (PCI) using the clinically relevant photosensitiser TPCS2a/Amphinex® as a rational, non-invasive strategy for the light-controlled endosomal escape of CSC-targeting drugs. PCI is an intracellular drug delivery method based on light-induced ROS-generation and a subsequent membrane-disruption of endocytic vesicles, leading to cytosolic release of the entrapped drugs of interest. In different proof-of-concept studies we have demonstrated that PCI of CSC-directed immunotoxins targeting CD133, CD44, CSPG4 and EpCAM is a highly specific and effective strategy for killing cancer cells and CSCs. CSCs overexpressing CD133 are PDT-resistant; however, this is circumvented by PCI of CD133-targeting immunotoxins. In view of the fact that TPCS2a is not a substrate of the efflux pumps ABCG2 and P-glycoprotein (ABCB1), the PCI-method is a promising anti-CSC therapeutic strategy. Due to a laser-controlled exposure, PCI of CSC-targeting drugs will be confined exclusively to the tumour tissue, suggesting that this drug delivery method has the potential to spare distant normal stem cells.

  13. Metrics to describe the effects of landscape pattern on hydrology in a lotic peatland

    Science.gov (United States)

    Yuan, J.; Cohen, M. J.; Kaplan, D. A.; Acharya, S.; Larsen, L.; Nungesser, M.

    2013-12-01

    Strong reciprocal interactions exist between landscape patterns and ecological processes. Hydrology is the dominant abiotic driver of ecological processes in wetlands, particularly flowing wetlands, but it both controls and is controlled by the geometry of vegetation patterning. Landscape metrics are widely used to quantitatively link pattern and process. Our goal here was to use several candidate spatial pattern metrics to predict the effects of wetland vegetation pattern on hydrologic regime, specifically hydroperiod, in the ridge-slough patterned landscape of the Everglades. The metrics focus on the capacity for longitudinally connected flow, and thus the ability of this low-gradient patterned landscape to route water from upstream. We first explored flow friction cost (FFC), a weighted spatial distance procedure wherein ridges have a higher flow cost than sloughs by virtue of their elevation and vegetation structure, to evaluate water movement through different landscape configurations. We also investigated existing published flow metrics, specifically the Directional Connectivity Index (DCI) and Landscape Discharge Competence (LDC), that seek to quantify connectivity, one of the sentinel targets of ecological restoration. Hydroperiod was estimated using a numerical hydrologic model (SWIFT2D) in real and synthetic landscapes with varying vegetation properties (patch anisotropy, ridge density). Synthetic landscapes were constrained by the geostatistical properties of the best-conserved pattern, and contained five anisotropy levels and seven ridge density levels. These were used to construct the relationship between landscape metrics and hydroperiod. Then, using historical images from 1940 to 2004, we applied the metrics to back-cast hydroperiod. Current vegetation maps were used to test scale dependency for each metric. Our results suggest that both FFC and DCI are good predictors of hydroperiod under free flowing conditions, and that they can be used
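    The flow friction cost idea — an accumulated-cost distance in which ridges are more expensive to traverse than sloughs — can be sketched with a standard least-cost-path computation (illustrative Python; the grid and weights are hypothetical, not the study's parameterization):

```python
import heapq

def flow_friction_cost(cost, start, goal):
    """Least cumulative friction cost from start to goal on a grid.

    cost[r][c] is the friction of entering cell (r, c); computed with
    Dijkstra's algorithm over 4-connected neighbors.
    """
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")

# hypothetical weights: sloughs cost 1, ridges cost 10 to cross
grid = [[1, 10, 1],
        [1, 10, 1],
        [1,  1, 1]]
```

    On this toy grid the cheapest route from the upper-left to the upper-right detours around the ridge column (total cost 7) rather than crossing it directly (cost 12), which is the sense in which FFC captures how pattern degradation reroutes or impedes flow.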

  14. Factor structure of the Tomimatsu-Sato metrics

    International Nuclear Information System (INIS)

    Perjes, Z.

    1989-02-01

    Based on an earlier result stating that δ = 3 Tomimatsu-Sato (TS) metrics can be factored over the field of integers, an analogous representation for higher TS metrics was sought. It is shown that the factoring property of TS metrics follows from the structure of special Hankel determinants. A set of linear algebraic equations determining the factors was defined, and the factors of the first five TS metrics were tabulated, together with their primitive factors. (R.P.) 4 refs.; 2 tabs

  15. Strategy for 90% autoverification of clinical chemistry and immunoassay test results using six sigma process improvement.

    Science.gov (United States)

    Randell, Edward W; Short, Garry; Lee, Natasha; Beresford, Allison; Spencer, Margaret; Kennell, Marina; Moores, Zoë; Parry, David

    2018-06-01

    Six Sigma is a structured process improvement strategy that places processes on a pathway to continued improvement. The data presented here summarize a project that took three clinical laboratories from autoverification processes that auto-verified roughly 40% to 60% of tests to processes that auto-verify more than 90% of tests and samples. The project schedule, metrics and targets, a description of the previous system, and detailed information on the changes made to achieve greater than 90% auto-verification are presented for this Six Sigma DMAIC (Define, Measure, Analyze, Improve, Control) process improvement project.
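    The kind of rule logic behind autoverification can be sketched as follows (a hypothetical simplification in Python; real middleware rule sets also cover instrument flags, critical values, and specimen-quality checks):

```python
def autoverify(result, prev_result, limits, delta_limit):
    """Return True if a numeric result can be released without manual review.

    limits: (low, high) autoverification range for the analyte.
    delta_limit: maximum allowed change from the patient's previous
    result, applied only when a previous result exists (delta check).
    """
    low, high = limits
    if not (low <= result <= high):
        return False  # outside the autoverification range: hold for review
    if prev_result is not None and abs(result - prev_result) > delta_limit:
        return False  # fails the delta check against patient history
    return True
```

    Widening well-chosen limits and tuning delta checks is largely how such projects raise the auto-verified fraction without releasing questionable results.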

  16. A Framework to Integrate Public, Dynamic Metrics Into an OER Platform

    Directory of Open Access Journals (Sweden)

    Jaclyn Zetta Cohen

    2014-04-01

    Full Text Available The usage metrics for open educational resources (OER are often either hidden behind an authentication system or shared intermittently in static, aggregated format at the repository level. This paper discusses the first year of University of Michigan’s project to share its OER usage data dynamically, publicly, to synthesize it across different levels within the repository hierarchies, and to aggregate in a method inclusive of content hosted on third-party platforms. The authors analyze their user research with a target audience of faculty authors, multimedia specialists, librarians, and communications specialists. Next, they explore a stratified technical design that allows the dynamic sharing of metrics down to the level of individual resources. The authors conclude that this framework enables sustainable feedback to OER creators, helps to build positive relationships with creators of OER, and allows the institution to move toward sharing OER on a larger scale.

  17. Korea's nuclear public information experiences-target groups and communication strategies

    International Nuclear Information System (INIS)

    Chung, J.K.

    1996-01-01

    Why public information activities are needed in Korea is first explained. There are three basic reasons: 1) to secure the sites necessary for the construction of large nuclear facilities, such as nuclear power plants, radwaste management facilities, and nuclear fuel-cycle related facilities; 2) to maintain a friendly relationship between the local communities and the nuclear industries; and 3) to promote better understanding of the nation's peaceful nuclear programs among the various target groups. The categorization of target groups and messages is reviewed. By whom the public information programs are implemented is also explained. An orchestrated effort together with third-party communicators is stressed. The basic philosophy of nuclear public information programs is introduced. High-profile and low-profile information campaigns are explained. Particular information strategies suitable to the Korean situation are examined. In addition, the Korean general public's perception of nuclear energy is briefly introduced. Also, some real insights into the anti-nuclear movement in Korea, together with its arguments, are reviewed. In conclusion, the paper stresses that nuclear arguments are no longer merely technical matters but almost socio-political issues. (author)

  18. Photothermal Effect Enhanced Cascade-Targeting Strategy for Improved Pancreatic Cancer Therapy by Gold Nanoshell@Mesoporous Silica Nanorod.

    Science.gov (United States)

    Zhao, Ruifang; Han, Xuexiang; Li, Yiye; Wang, Hai; Ji, Tianjiao; Zhao, Yuliang; Nie, Guangjun

    2017-08-22

    Pancreatic cancer, one of the leading causes of cancer-related mortality, is characterized by desmoplasia and hypovascular cancerous tissue, with a 5-year survival rate of … targeting (mediated by the photothermal effect and molecular receptor binding) and photothermal treatment-enhanced gemcitabine chemotherapy, under mild near-infrared laser irradiation conditions. GNRS significantly improved gemcitabine penetration and accumulation in tumor tissues, thus destroying the dense stroma barrier of pancreatic cancer and reinforcing chemosensitivity in mice. Our current findings strongly support the notion that further development of this integrated plasmonic photothermal strategy may represent a promising translational nanoformulation for effective treatment of pancreatic cancer, with an integral cascade tumor targeting strategy and enhanced drug delivery efficacy.

  19. ST-intuitionistic fuzzy metric space with properties

    Science.gov (United States)

    Arora, Sahil; Kumar, Tanuj

    2017-07-01

    In this paper, we define the ST-intuitionistic fuzzy metric space and study the notions of convergence and the completeness property of Cauchy sequences. Further, we prove some properties of ST-intuitionistic fuzzy metric spaces. Finally, we introduce the concept of a symmetric ST-intuitionistic fuzzy metric space.

  20. A new metric of the low-mode asymmetry for ignition target designs

    International Nuclear Information System (INIS)

    Gu, Jianfa; Dai, Zhensheng; Fan, Zhengfeng; Zou, Shiyang; Ye, Wenhua; Pei, Wenbing; Zhu, Shaoping

    2014-01-01

    In the deuterium-tritium inertial confinement fusion implosion experiments on the National Ignition Facility, the measured neutron yield and hot spot pressure are significantly lower than simulations predict. Understanding the underlying physics of the deficit is essential to achieving ignition. This paper investigates the low-mode areal density asymmetry in the main fuel of the ignition capsule. It is shown that the areal density asymmetry breaks up the compressed shell and significantly reduces the conversion of implosion kinetic energy to hot spot internal energy, bringing the calculated hot spot pressure and neutron yield quite close to the experimental data. This indicates that the low-mode shell areal density asymmetry can explain part of the large discrepancy between simulations and experiments. Since the hot spot shape term alone cannot adequately characterize the effects of the shell areal density asymmetry on implosion performance, a new metric of the low-mode asymmetry is developed to accurately measure the probability of ignition.

  1. MO-A-16A-01: QA Procedures and Metrics: In Search of QA Usability

    International Nuclear Information System (INIS)

    Sathiaseelan, V; Thomadsen, B

    2014-01-01

    Radiation therapy has undergone considerable changes in the past two decades with a surge of new technology and treatment delivery methods. The complexity of radiation therapy treatments has increased and there has been increased awareness and publicity about the associated risks. In response, there has been a proliferation of guidelines for medical physicists to adopt to ensure that treatments are delivered safely. Task Group recommendations are copious, and clinical physicists' hours are longer, stretched to various degrees between site planning and management, IT support, physics QA, and treatment planning responsibilities. Radiation oncology has many quality control practices in place to ensure the delivery of high-quality, safe treatments. Incident reporting systems have been developed to collect statistics about near miss events at many radiation oncology centers. However, tools are lacking to assess the impact of these various control measures. A recent effort to address this shortcoming is the work of Ford et al (2012), who published a methodology enumerating quality control quantification for measuring the effectiveness of safety barriers. Over 4000 near-miss incidents reported from 2 academic radiation oncology clinics were analyzed using quality control quantification, and a profile of the most effective quality control measures (metrics) was identified. There is a critical need to identify a QA metric to help busy clinical physicists focus their limited time and resources most effectively in order to minimize or eliminate errors in the radiation treatment delivery processes. In this symposium the usefulness of workflows and QA metrics to assure safe and high quality patient care will be explored. Two presentations will be given: Quality Metrics and Risk Management with High Risk Radiation Oncology Procedures; and Strategies and Metrics for Quality Management in the TG-100 Era. Learning Objectives: Provide an overview of and the need for QA usability

  2. MO-A-16A-01: QA Procedures and Metrics: In Search of QA Usability

    Energy Technology Data Exchange (ETDEWEB)

    Sathiaseelan, V [Northwestern Memorial Hospital, Chicago, IL (United States); Thomadsen, B [University of Wisconsin, Madison, WI (United States)

    2014-06-15

    Radiation therapy has undergone considerable changes in the past two decades with a surge of new technology and treatment delivery methods. The complexity of radiation therapy treatments has increased and there has been increased awareness and publicity about the associated risks. In response, there has been a proliferation of guidelines for medical physicists to adopt to ensure that treatments are delivered safely. Task Group recommendations are copious, and clinical physicists' hours are longer, stretched to various degrees between site planning and management, IT support, physics QA, and treatment planning responsibilities. Radiation oncology has many quality control practices in place to ensure the delivery of high-quality, safe treatments. Incident reporting systems have been developed to collect statistics about near miss events at many radiation oncology centers. However, tools are lacking to assess the impact of these various control measures. A recent effort to address this shortcoming is the work of Ford et al (2012), who published a methodology enumerating quality control quantification for measuring the effectiveness of safety barriers. Over 4000 near-miss incidents reported from 2 academic radiation oncology clinics were analyzed using quality control quantification, and a profile of the most effective quality control measures (metrics) was identified. There is a critical need to identify a QA metric to help busy clinical physicists focus their limited time and resources most effectively in order to minimize or eliminate errors in the radiation treatment delivery processes. In this symposium the usefulness of workflows and QA metrics to assure safe and high quality patient care will be explored. Two presentations will be given: Quality Metrics and Risk Management with High Risk Radiation Oncology Procedures; and Strategies and Metrics for Quality Management in the TG-100 Era. Learning Objectives: Provide an overview of and the need for QA usability

  3. Targeting the renin-angiotensin system as novel therapeutic strategy for pulmonary diseases.

    Science.gov (United States)

    Tan, Wan Shun Daniel; Liao, Wupeng; Zhou, Shuo; Mei, Dan; Wong, Wai-Shiu Fred

    2017-12-27

    The renin-angiotensin system (RAS) plays a major role in regulating electrolyte balance and blood pressure. RAS has also been implicated in the regulation of inflammation, proliferation and fibrosis in pulmonary diseases such as asthma, acute lung injury (ALI), chronic obstructive pulmonary disease (COPD), idiopathic pulmonary fibrosis (IPF) and pulmonary arterial hypertension (PAH). Current therapeutics suffer from drawbacks such as steroid resistance, limited efficacies and side effects. Novel interventions are definitely needed to offer an optimal therapeutic strategy and clinical outcome. This review compiles and analyses recent investigations targeting RAS for the treatment of inflammatory lung diseases. Inhibition of the upstream angiotensin (Ang) I/Ang II/angiotensin receptor type 1 (AT1R) pathway and activation of the downstream angiotensin-converting enzyme 2 (ACE2)/Ang (1-7)/Mas receptor pathway are two feasible strategies demonstrating efficacies in various pulmonary disease models. More recent studies favor targeting the downstream ACE2/Ang (1-7)/Mas receptor pathway, in which diminazene aceturate, an ACE2 activator, GSK2586881, a recombinant ACE2, and AV0991, a Mas receptor agonist, showed much potential for further development. As the pathogenesis of pulmonary diseases is complex, RAS modulation may be used alone or in combination with existing drugs like corticosteroids, pirfenidone/nintedanib or endothelin receptor antagonists for different pulmonary diseases. Personalized medicine through genetic screening and phenotyping for angiotensinogen or ACE would aid treatment, especially for non-responsive patients. This review serves to provide an update on the latest developments in the field of RAS targeting for pulmonary diseases, and offers some insights into future directions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Pragmatic security metrics applying metametrics to information security

    CERN Document Server

    Brotby, W Krag

    2013-01-01

    Other books on information security metrics discuss number theory and statistics in academic terms. Light on mathematics and heavy on utility, PRAGMATIC Security Metrics: Applying Metametrics to Information Security breaks the mold. This is the ultimate how-to-do-it guide for security metrics.Packed with time-saving tips, the book offers easy-to-follow guidance for those struggling with security metrics. Step by step, it clearly explains how to specify, develop, use, and maintain an information security measurement system (a comprehensive suite of metrics) to

  5. Defining a Progress Metric for CERT RMM Improvement

    Science.gov (United States)

    2017-09-14

    Defining a Progress Metric for CERT-RMM Improvement. Gregory Crabb, Nader Mehravari, David Tobar. September 2017. Technical ...fendable resource allocation decisions. Technical metrics measure aspects of controls implemented through technology (systems, software, hardware...implementation metric would be the percentage of users who have received anti-phishing training. • Effectiveness/efficiency metrics measure whether

  6. Targeting Beta-Amyloid at the CSF: A New Therapeutic Strategy in Alzheimer's Disease.

    Science.gov (United States)

    Menendez-Gonzalez, Manuel; Padilla-Zambrano, Huber S; Alvarez, Gabriel; Capetillo-Zarate, Estibaliz; Tomas-Zapico, Cristina; Costa, Agustin

    2018-01-01

    Although the immunotherapies against the amyloid-β (Aβ) peptide tried to date have failed to show sufficient clinical benefit, Aβ still remains the main target in Alzheimer's disease (AD). This article aims to show the rationale of a new therapeutic strategy: clearing Aβ from the CSF continuously (the "CSF-sink" therapeutic strategy). First, we describe the physiologic mechanisms of Aβ clearance and the AD pathology that results when these mechanisms are altered. Then, we review the experiences with peripheral Aβ-immunotherapy and discuss the related hypothesis of the mechanism of action of the "peripheral sink." We also present Aβ-immunotherapies acting on the CNS directly. Finally, we introduce alternative methods of removing Aβ, including the "CSF-sink" therapeutic strategy. As soluble peptides are in constant equilibrium between the ISF and the CSF, altering the levels of Aβ oligomers in the CSF would also alter the levels of such proteins in the brain parenchyma. We conclude that interventions based on a "CSF-sink" of Aβ will probably produce a steady clearance of Aβ in the ISF, and therefore may represent a new therapeutic strategy in AD.

  7. IT Project Management Metrics

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Many software and IT projects fail to meet their objectives for a variety of causes, among which project management carries significant weight. For projects to succeed, lessons learned have to be applied, historical data collected, and metrics and indicators computed and compared with those of past projects so that failure can be avoided. This paper presents some metrics that can be used for IT project management.

  8. Mass Customization Measurements Metrics

    DEFF Research Database (Denmark)

    Nielsen, Kjeld; Brunø, Thomas Ditlev; Jørgensen, Kaj Asbjørn

    2014-01-01

    A recent survey has indicated that 17% of companies have ceased mass customizing less than 1 year after initiating the effort. This paper presents measurements of a company's mass customization performance, utilizing metrics within the three fundamental capabilities: robust process design, choice navigation, and solution space development. A mass customizer assessing performance with these metrics can identify the areas in which improvement would increase competitiveness the most, enabling a more efficient transition to mass customization.

  9. The validation index: a new metric for validation of segmentation algorithms using two or more expert outlines with application to radiotherapy planning.

    Science.gov (United States)

    Juneja, Prabhjot; Evans, Philip M; Harris, Emma J

    2013-08-01

    Validation is required to ensure automated segmentation algorithms are suitable for radiotherapy target definition. In the absence of true segmentation, algorithmic segmentation is validated against expert outlining of the region of interest. Multiple experts are used to overcome inter-expert variability. Several approaches have been studied in the literature, but the most appropriate approach to combine the information from multiple expert outlines into a single metric for validation is unclear. None considers a metric that can be tailored to case-specific requirements in radiotherapy planning. The validation index (VI), a new validation metric which uses experts' level of agreement, was developed. A control parameter was introduced for the validation of segmentations required for different radiotherapy scenarios: for targets close to organs-at-risk and for difficult-to-discern targets, where large variation between experts is expected. VI was evaluated using two simulated idealized cases and data from two clinical studies. VI was compared with the commonly used Dice similarity coefficient (DSCpair-wise) and found to be more sensitive than DSCpair-wise to changes in agreement between experts. VI was shown to be adaptable to specific radiotherapy planning scenarios.
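    The pair-wise Dice baseline against which VI is compared can be sketched as follows (illustrative Python on flattened binary masks; the function names and toy data are hypothetical, not the study's code):

```python
from itertools import combinations

def dice(a, b):
    """Dice similarity coefficient between two binary masks (flat 0/1 lists):
    2|A ∩ B| / (|A| + |B|)."""
    inter = sum(x and y for x, y in zip(a, b))
    size = sum(a) + sum(b)
    return 2 * inter / size if size else 1.0

def mean_pairwise_dice(masks):
    """DSC averaged over all expert pairs (the pair-wise comparison)."""
    pairs = list(combinations(masks, 2))
    return sum(dice(a, b) for a, b in pairs) / len(pairs)

# three hypothetical expert outlines of the same 5-voxel region
expert_masks = [
    [1, 1, 1, 0, 0],
    [1, 1, 0, 0, 0],
    [1, 1, 1, 1, 0],
]
```

    A single averaged DSC of this kind discards how the disagreement is distributed among experts, which is the information the VI's agreement-level weighting is designed to retain.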

  10. Multiple polysaccharide-drug complex-loaded liposomes: A unique strategy in drug loading and cancer targeting.

    Science.gov (United States)

    Ruttala, Hima Bindu; Ramasamy, Thiruganesh; Gupta, Biki; Choi, Han-Gon; Yong, Chul Soon; Kim, Jong Oh

    2017-10-01

    In the present study, a unique strategy was devised to develop nanocarriers containing multiple therapeutics with controlled release characteristics. We demonstrated the synthesis of dextran sulfate-doxorubicin (DS-DOX) and alginate-cisplatin (AL-CIS) polymer-drug complexes to produce a transferrin ligand-conjugated liposome. The targeted nanoparticles (TL-DDAC) were nano-sized and spherical. The targeted liposome exhibited specific receptor-mediated endocytic uptake in cancer cells. The enhanced cellular uptake of TL-DDAC resulted in a significantly better anticancer effect in resistant and sensitive breast cancer cells compared to that of the free drugs. Specifically, DOX and CIS at a molar ratio of 1:1 exhibited better therapeutic performance than the other combinations. The combination of an anthracycline-based topoisomerase II inhibitor (DOX) and a platinum compound (CIS) resulted in significantly higher cell apoptosis (early and late) in both types of cancer cells. In conclusion, treatment with DS-DOX and AL-CIS based combination liposomes modified with transferrin (TL-DDAC) was an effective cancer treatment strategy. Further investigation in clinically relevant animal models is warranted to prove the therapeutic efficacy of this unique strategy. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Metrical Phonology: German Sound System.

    Science.gov (United States)

    Tice, Bradley S.

    Metrical phonology, a linguistic process of phonological stress assessment and diagrammatic simplification of sentence and word stress, is discussed as it is found in the English and German languages. The objective is to promote use of metrical phonology as a tool for enhancing instruction in stress patterns in words and sentences, particularly in…

  12. Construction of Einstein-Sasaki metrics in D≥7

    International Nuclear Information System (INIS)

    Lue, H.; Pope, C. N.; Vazquez-Poritz, J. F.

    2007-01-01

    We construct explicit Einstein-Kaehler metrics in all even dimensions D=2n+4≥6, in terms of a 2n-dimensional Einstein-Kaehler base metric. These are cohomogeneity-2 metrics which have the new feature of including a NUT-type parameter, or gravomagnetic charge, in addition to mass and rotation parameters. Using a canonical construction, these metrics all yield Einstein-Sasaki metrics in dimensions D=2n+5≥7. As is commonly the case in this type of construction, for suitable choices of the free parameters the Einstein-Sasaki metrics can extend smoothly onto complete and nonsingular manifolds, even though the underlying Einstein-Kaehler metric has conical singularities. We discuss some explicit examples in the case of seven-dimensional Einstein-Sasaki spaces. These new spaces can provide supersymmetric backgrounds in M theory, which play a role in the AdS4/CFT3 correspondence.

  13. National Metrical Types in Nineteenth Century Art Song

    Directory of Open Access Journals (Sweden)

    Leigh VanHandel

    2010-01-01

    Full Text Available William Rothstein’s article “National metrical types in music of the eighteenth and early nineteenth centuries” (2008 proposes a distinction between the metrical habits of 18th and early 19th century German music and those of Italian and French music of that period. Based on theoretical treatises and compositional practice, he outlines these national metrical types and discusses the characteristics of each type. This paper presents the results of a study designed to determine whether, and to what degree, Rothstein’s characterizations of national metrical types are present in 19th century French and German art song. Studying metrical habits in this genre may provide a lens into the changing metrical conceptions of 19th century theorists and composers, as well as into the metrical habits and compositional style of individual 19th century French and German art song composers.

  14. A Metric on Phylogenetic Tree Shapes.

    Science.gov (United States)

    Colijn, C; Plazzotta, G

    2018-01-01

    The shapes of evolutionary trees are influenced by the nature of the evolutionary process but comparisons of trees from different processes are hindered by the challenge of completely describing tree shape. We present a full characterization of the shapes of rooted branching trees in a form that lends itself to natural tree comparisons. We use this characterization to define a metric, in the sense of a true distance function, on tree shapes. The metric distinguishes trees from random models known to produce different tree shapes. It separates trees derived from tropical versus USA influenza A sequences, which reflect the differing epidemiology of tropical and seasonal flu. We describe several metrics based on the same core characterization, and illustrate how to extend the metric to incorporate trees' branch lengths or other features such as overall imbalance. Our approach allows us to construct addition and multiplication on trees, and to create a convex metric on tree shapes which formally allows computation of average tree shapes. © The Author(s) 2017. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
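
    The abstract does not spell out the characterization, but a recursive labelling of this kind (the paper's exact scheme may differ in detail) illustrates how a complete description of rooted binary tree shapes can support a true distance function:

```python
def shape_label(tree):
    """Recursively label a rooted binary tree shape.

    A tree is either a leaf (None) or a pair (left, right).
    Leaves get label 1; an internal node whose children have
    labels a >= b gets a*(a-1)//2 + b + 1.  Two trees share a
    label iff they have the same shape, so the labels give a
    complete characterization on which metrics can be built.
    """
    if tree is None:          # leaf
        return 1
    a, b = sorted((shape_label(tree[0]), shape_label(tree[1])), reverse=True)
    return a * (a - 1) // 2 + b + 1

leaf = None
cherry = (leaf, leaf)                # 2 leaves
pitchfork = (cherry, leaf)           # caterpillar on 3 leaves
balanced4 = (cherry, cherry)         # balanced 4-leaf tree
caterpillar4 = (pitchfork, leaf)     # caterpillar on 4 leaves
print([shape_label(t) for t in (cherry, pitchfork, balanced4, caterpillar4)])
```

    A simple shape distance can then be built from such labels; the published metric is more refined and, as the abstract notes, can also incorporate branch lengths and imbalance.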

  15. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the software life cycle. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce it, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of software development can be measured. If software metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of defects that have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and to report them to the Software Assurance team to see if any can be implemented in their software assurance life cycle process.

  16. Coordination of Project and Current Activities on the Basis of the Strategy Alignment Metamodel in the Oil and Gas Company

    Directory of Open Access Journals (Sweden)

    R. Yu. Dashkov

    2017-01-01

    Full Text Available Purpose: The purpose of this article is to describe the Strategy Alignment Metamodel of project and current activities, which allows us to connect the Goals and Strategies for Phases of the project with the Goals and Strategies of the company at all levels of the organization through targeted measurement and the application of Interpretive Models. By building Networks of Goals and Strategies and adopting organizational solutions, one coordinates the interaction of the Project office and the departments of the company. This methodology is based on a logical rationale of the Contexts and Assumptions for establishing Goals and Strategies, both for the project and for the company, and on the preparation of Contexts and Assumptions and Goals and Strategies Alignment Matrices, which provides flexible adaptation to the internal and external environment in the process of selecting the most successful Strategies to achieve the Goals. Methods: This article is based on the concept of Goals-Questions-Metrics+Strategies, which is adapted as another concept of a strategic monitoring and control system for projects: Goals-Phases-Metrics+Strategies. These concepts form the basis of the Strategy Alignment Metamodel, where Phases Earned Value Management is used as the measurement system for project activity and the Balanced Scorecard is applied to current operations. Results: A Strategy Alignment Metamodel of the project and current activities of the company is proposed. It uses modern strategic monitoring and control systems for projects (Goals-Phases-Metrics+Strategies) and for the company (Goals-Questions-Metrics+Strategies). The interaction between these systems is based on Contexts and Assumptions and Goals and Strategies Alignment Matrices. The existence of such matrices greatly simplifies management decisions and prevents the risk of delays in the execution of project Phases based on rational participation and coordination of the company

  17. The Jacobi metric for timelike geodesics in static spacetimes

    Science.gov (United States)

    Gibbons, G. W.

    2016-01-01

    It is shown that the free motion of massive particles moving in static spacetimes is given by the geodesics of an energy-dependent Riemannian metric on the spatial sections, analogous to Jacobi's metric in classical dynamics. In the massless limit, Jacobi's metric coincides with the energy-independent Fermat or optical metric. For stationary metrics, it is known that the motion of massless particles is given by the geodesics of an energy-independent Finslerian metric of Randers type. The motion of massive particles is governed by neither a Riemannian nor a Finslerian metric. The properties of the Jacobi metric for massive particles moving outside the horizon of a Schwarzschild black hole are described. By contrast with the massless case, the Gaussian curvature of the equatorial sections is not always negative.
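
    The construction described here can be made concrete with the standard form of the Jacobi metric for a static spacetime; the conventions below are the usual ones and may differ in detail from the paper.

```latex
% Static spacetime: ds^2 = -V(x)\,dt^2 + g_{ij}(x)\,dx^i\,dx^j.
% Timelike geodesics of energy E and rest mass m are geodesics of the
% energy-dependent Jacobi metric on the spatial sections:
j_{ij} \;=\; \frac{E^{2} - m^{2}\,V}{V}\; g_{ij}\,,
% which in the massless limit m -> 0 reduces, up to the constant
% factor E^2, to the energy-independent optical (Fermat) metric:
j_{ij}\big|_{m=0} \;=\; \frac{E^{2}}{V}\, g_{ij}\,.
```

    For Schwarzschild, V = 1 - 2M/r and g_ij = diag(1/V, r^2, r^2 sin^2 θ), so the factor (E^2 - m^2 V) vanishes at the turning radius, which is what makes the massive-particle geometry energy dependent.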

  18. Measures and Metrics for Feasibility of Proof-of-Concept Studies With Human Immunodeficiency Virus Rapid Point-of-Care Technologies

    Science.gov (United States)

    Pant Pai, Nitika; Chiavegatti, Tiago; Vijh, Rohit; Karatzas, Nicolaos; Daher, Jana; Smallwood, Megan; Wong, Tom; Engel, Nora

    2017-01-01

    Objective: Pilot (feasibility) studies form a vast majority of diagnostic studies with point-of-care technologies but often lack use of clear measures/metrics and a consistent framework for reporting and evaluation. To fill this gap, we systematically reviewed data to (a) catalog feasibility measures/metrics and (b) propose a framework. Methods: For the period January 2000 to March 2014, 2 reviewers searched 4 databases (MEDLINE, EMBASE, CINAHL, Scopus), retrieved 1441 citations, and abstracted data from 81 studies. We observed 2 major categories of measures, that is, implementation centered and patient centered, and 4 subcategories of measures, that is, feasibility, acceptability, preference, and patient experience. We defined and delineated metrics and measures for a feasibility framework. We documented impact measures for a comparison. Findings: We observed heterogeneity in reporting of metrics as well as misclassification and misuse of metrics within measures. Although we observed poorly defined measures and metrics for feasibility, preference, and patient experience, in contrast, acceptability measure was the best defined. For example, within feasibility, metrics such as consent, completion, new infection, linkage rates, and turnaround times were misclassified and reported. Similarly, patient experience was variously reported as test convenience, comfort, pain, and/or satisfaction. In contrast, within impact measures, all the metrics were well documented, thus serving as a good baseline comparator. With our framework, we classified, delineated, and defined quantitative measures and metrics for feasibility. Conclusions: Our framework, with its defined measures/metrics, could reduce misclassification and improve the overall quality of reporting for monitoring and evaluation of rapid point-of-care technology strategies and their context-driven optimization. PMID:29333105

  19. Strategies for systemic radiotherapy of micrometastases using antibody-targeted 131I.

    Science.gov (United States)

    Wheldon, T E; O'Donoghue, J A; Hilditch, T E; Barrett, A

    1988-02-01

    A simple analysis is developed to evaluate the likely effectiveness of treatment of micrometastases by antibody-targeted 131I. Account is taken of the low levels of tumour uptake of antibody-conjugated 131I presently achievable and of the "energy wastage" in targeting microscopic tumours with a radionuclide whose disintegration energy is widely dissipated. The analysis shows that only modest doses can be delivered to micrometastases when total body dose is restricted to levels which allow recovery of bone marrow. Much higher doses could be delivered to micrometastases when bone marrow rescue is used. A rationale is presented for targeted systemic radiotherapy used in combination with external beam total body irradiation (TBI) and bone marrow rescue. This has some practical advantages. The effect of the targeted component is to impose a biological non-uniformity on the total body dose distribution with regions of high tumour cell density receiving higher doses. Where targeting results in high doses to particular normal organs (e.g. liver, kidney) the total dose to these organs could be kept within tolerable limits by appropriate shielding of the external beam radiation component of the treatment. Greater levels of tumour cell kill should be achievable by the combination regime without any increase in normal tissue damage over that inflicted by conventional TBI. The predicted superiority of the combination regime is especially marked for tumours just below the threshold for detectability (e.g. approximately 1 mm-1 cm diameter). This approach has the advantage that targeted radiotherapy provides only a proportion of the total body dose, most of which is given by a familiar technique. The proportion of dose given by the targeted component could be increased as experience is gained. The predicted superiority of the combination strategy should be experimentally testable using laboratory animals. Clinical applications should be cautiously approached, with due regard to

  20. Total value of the customer and targeted marketing strategies

    OpenAIRE

    Ryals , L.

    2002-01-01

    The literature shows some recent calls for an end to 'unaccountable' marketing (Rust et al., 2001; Sheth and Sharma, 2001) and the use of customer lifetime value as an appropriate marketing metric (Rust et al., 2001). Some commentators recommend the application of shareholder value measures to the valuation of customer relationships (Uyemara, 1997; Mariotti, 1996). The thesis evaluates the application of shareholder value measures to the valuation of customers. Shareholder valu...

  1. Classification in medical images using adaptive metric k-NN

    Science.gov (United States)

    Chen, C.; Chernoff, K.; Karemore, G.; Lo, P.; Nielsen, M.; Lauze, F.

    2010-03-01

    The performance of the k-nearest neighbors (k-NN) classifier is highly dependent on the distance metric used to identify the k nearest neighbors of the query points. The standard Euclidean distance is commonly used in practice. This paper investigates the performance of the k-NN classifier with respect to different adaptive metrics in the context of medical imaging. We propose using adaptive metrics such that the structure of the data is better described, introducing some unsupervised learning knowledge into k-NN. Four different metrics are estimated: a theoretical metric based on the assumption that images are drawn from the Brownian Image Model (BIM), a normalized metric based on the variance of the data, an empirical metric based on the empirical covariance matrix of the unlabeled data, and an optimized metric obtained by minimizing the classification error. Principal Component Analysis (PCA) of the spectral structure of the empirical covariance additionally yields subspace metrics. The metrics are evaluated on two data sets: lateral X-rays of the lumbar aortic/spine region, where we use k-NN for abdominal aortic calcification detection; and mammograms, where we use k-NN for breast cancer risk assessment. The results show that an appropriate choice of metric can improve classification.
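
    As a sketch of the "empirical metric" variant, the inverse of the empirical covariance matrix can serve as the quadratic form in a Mahalanobis-type k-NN distance. The 2-D toy data below are illustrative, not the paper's imaging features:

```python
from collections import Counter

def inverse_2x2(m):
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def empirical_covariance(points):
    """2x2 covariance estimated from (unlabeled) 2-D data."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    return [[sxx, sxy], [sxy, syy]]

def mahalanobis2(p, q, inv_cov):
    """Squared distance (p-q)^T C^{-1} (p-q)."""
    dx, dy = p[0] - q[0], p[1] - q[1]
    (a, b), (c, d) = inv_cov
    return dx * (a * dx + b * dy) + dy * (c * dx + d * dy)

def knn_predict(query, data, labels, k, inv_cov):
    order = sorted(range(len(data)),
                   key=lambda i: mahalanobis2(query, data[i], inv_cov))
    votes = Counter(labels[i] for i in order[:k])
    return votes.most_common(1)[0][0]

# Toy data: classes separated in y, widely spread in x, so the
# empirical covariance de-emphasises x differences in the metric.
data = [(0, 0), (2, 0), (4, 0), (0, 1), (2, 1), (4, 1)]
labels = [0, 0, 0, 1, 1, 1]
inv_cov = inverse_2x2(empirical_covariance(data))
print(knn_predict((3, 0.1), data, labels, 3, inv_cov))
```

    Replacing `inv_cov` with the identity recovers plain Euclidean k-NN, which is the baseline the paper compares against.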

  2. Overview on the target fabrication facilities at ELI-NP and ongoing strategies

    Science.gov (United States)

    Gheorghiu, C. C.; Leca, V.; Popa, D.; Cernaianu, M. O.; Stutman, D.

    2016-10-01

    Along with the development of petawatt-class laser systems, the interaction between high-power lasers and matter has spurred extensive research, with high-interest applications such as laser nuclear physics, proton radiography, and cancer therapy. The new ELI-NP (Extreme Light Infrastructure - Nuclear Physics) petawatt laser facility, with 10 PW beams and intensities of ~10^23 W/cm^2, is one of the innovative projects that will enable novel research into fundamental processes during light-matter interaction. As part of the ELI-NP facility, the Targets Laboratory will provide the means for in-house manufacturing and characterization of the required targets (mainly solid ones) for the experiments, in addition to research activity aimed at developing novel target designs with improved performance. A description of the Targets Laboratory, with the main pieces of equipment and their specifications, is presented. Moreover, in view of the latest progress in target design, one of the strategies proposed for the forthcoming experiments at ELI-NP is also described: an ultra-thin patterned foil of diamond-like carbon (DLC) coated with a carbon-based ultra-low-density layer. The carbon foam, which behaves as a near-critical-density plasma, will allow controlled shaping of the laser pulse before the main interaction with the solid foil. Particular emphasis will be directed towards optimization of the target design, by simulation tests and by tuning the key properties (thickness/length, spacing, foam density, depth, periodicity, etc.) that are expected to have a crucial effect on the laser-matter interaction process.

  3. Measurable Control System Security through Ideal Driven Technical Metrics

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Sean McBride; Marie Farrar; Zachary Tudor

    2008-01-01

    The Department of Homeland Security National Cyber Security Division supported development of a small set of security ideals as a framework to establish measurable control systems security. Based on these ideals, a draft set of proposed technical metrics was developed to allow control systems owner-operators to track improvements or degradations in their individual control systems security posture. The technical metrics development effort included review and evaluation of over thirty metrics-related documents. On the basis of complexity, ambiguity, or misleading and distorting effects, the metrics identified during the reviews were determined to be weaker than necessary to aid defense against the myriad threats posed by cyber-terrorism to human safety, as well as to economic prosperity. Using the results of our metrics review and the set of security ideals as a starting point for metrics development, we identified thirteen potential technical metrics, with at least one metric supporting each ideal. Two case study applications of the ideals and thirteen metrics to control systems were then performed to establish potential difficulties in applying both the ideals and the metrics. The case studies resulted in no changes to the ideals and only a few deletions and refinements to the thirteen potential metrics. This led to a final proposed set of ten core technical metrics. To further validate the security ideals, the modifications made to the original thirteen potential metrics, and the final proposed set of ten core metrics, seven separate control systems security assessments performed over the past three years were reviewed for findings and recommended mitigations. These findings and mitigations were then mapped to the security ideals and metrics to assess gaps in their coverage. The mappings indicated that there are no gaps in the security ideals and that the ten core technical metrics provide significant coverage of standard security issues, with 87% coverage.

  4. Experiential space is hardly metric

    Czech Academy of Sciences Publication Activity Database

    Šikl, Radovan; Šimeček, Michal; Lukavský, Jiří

    2008-01-01

    Roč. 2008, č. 37 (2008), s. 58-58 ISSN 0301-0066. [European Conference on Visual Perception. 24.08-28.08.2008, Utrecht] R&D Projects: GA ČR GA406/07/1676 Institutional research plan: CEZ:AV0Z70250504 Keywords : visual space perception * metric and non-metric perceptual judgments * ecological validity Subject RIV: AN - Psychology

  5. High resolution metric imaging payload

    Science.gov (United States)

    Delclaud, Y.

    2017-11-01

    Alcatel Space Industries has become Europe's leader in the field of high and very high resolution optical payloads, within the framework of Earth observation systems able to provide military and government users with metric images from space. This leadership has allowed Alcatel to propose to the export market, within a French collaboration framework, a complete space-based system for metric observation.

  6. Small Molecule Sequential Dual-Targeting Theragnostic Strategy (SMSDTTS): from Preclinical Experiments towards Possible Clinical Anticancer Applications.

    Science.gov (United States)

    Li, Junjie; Oyen, Raymond; Verbruggen, Alfons; Ni, Yicheng

    2013-01-01

    Hitting the evasive tumor cells proves challenging in targeted cancer therapies. A general and unconventional anticancer approach, namely the small molecule sequential dual-targeting theragnostic strategy (SMSDTTS), has recently been introduced with the aims of targeting and debulking the tumor mass, wiping out the residual tumor cells, and meanwhile enabling cancer detectability. This dual-targeting approach works in two steps for systemic delivery of two naturally derived drugs. First, an anti-tubulin vascular disrupting agent, e.g., combretastatin A4 phosphate (CA4P), is injected to selectively cut off tumor blood supply and to cause massive necrosis, which nevertheless always leaves peripheral tumor residues. Secondly, a necrosis-avid radiopharmaceutical, namely (131)I-hypericin ((131)I-Hyp), is administered the next day, which accumulates in intratumoral necrosis and irradiates the residual cancer cells with beta particles. Theoretically, this complementary targeted approach may biologically and radioactively ablate solid tumors and reduce the risk of local recurrence, remote metastases, and thus cancer mortality. Meanwhile, the emitted gamma rays facilitate radio-scintigraphy to detect tumors and follow up the therapy, hence a simultaneous theragnostic approach. SMSDTTS has now shown promise in multicenter animal experiments and may demonstrate unique anticancer efficacy in upcoming preliminary clinical trials. In this short review article, information about the two involved agents, the rationale of SMSDTTS, its preclinical antitumor efficacy, multifocal targetability, simultaneous theragnostic property, and the toxicities of the dose regimens are summarized. Meanwhile, possible drawbacks, practical challenges, and future improvements of SMSDTTS are discussed, which hopefully may help to push this strategy forward from preclinical experiments towards possible clinical applications.

  7. Modeling of Body Weight Metrics for Effective and Cost-Efficient Conventional Factor VIII Dosing in Hemophilia A Prophylaxis

    Directory of Open Access Journals (Sweden)

    Alanna McEneny-King

    2017-10-01

    Full Text Available The total body weight-based dosing strategy currently used in the prophylactic treatment of hemophilia A may not be appropriate for all populations. The assumptions that guide weight-based dosing are not valid in overweight and obese populations, resulting in overdosing and ineffective resource utilization. We explored different weight metrics including lean body weight, ideal body weight, and adjusted body weight to determine an alternative dosing strategy that is both safe and resource-efficient in normal and overweight/obese adult patients. Using a validated population pharmacokinetic model, we simulated a variety of dosing regimens using different doses, weight metrics, and frequencies; we also investigated the implications of assuming various levels of endogenous factor production. Ideal body weight performed the best across all of the regimens explored, maintaining safety while moderating resource consumption for overweight and obese patients.
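
    The weight metrics compared in the study can be sketched with commonly used clinical formulas. Note these are assumptions: the Devine formula and the 0.4 adjustment factor below are conventional choices, not necessarily the ones used by the authors, and the 30 IU/kg dose is purely illustrative:

```python
def ideal_body_weight(height_cm, male=True):
    """Devine formula (one common choice; the study's exact
    formula is not specified in the abstract)."""
    inches_over_5ft = max(height_cm / 2.54 - 60.0, 0.0)
    base = 50.0 if male else 45.5
    return base + 2.3 * inches_over_5ft

def adjusted_body_weight(tbw_kg, ibw_kg, factor=0.4):
    """ABW = IBW + factor*(TBW - IBW); 0.4 is a customary factor."""
    return ibw_kg + factor * (tbw_kg - ibw_kg)

def fviii_dose_iu(weight_kg, iu_per_kg=30.0):
    """Conventional prophylactic dose = units/kg * dosing weight.
    30 IU/kg is illustrative, not a clinical recommendation."""
    return iu_per_kg * weight_kg

tbw = 110.0                      # obese patient, kg
ibw = ideal_body_weight(180.0)   # about 75 kg for a 180 cm male
abw = adjusted_body_weight(tbw, ibw)
print(fviii_dose_iu(tbw), fviii_dose_iu(ibw), fviii_dose_iu(abw))
```

    The gap between the three printed doses is the resource saving at stake: dosing this patient on ideal rather than total body weight cuts the per-dose factor consumption by roughly a third.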

  8. Implications of Metric Choice for Common Applications of Readmission Metrics

    OpenAIRE

    Davies, Sheryl; Saynina, Olga; Schultz, Ellen; McDonald, Kathryn M; Baker, Laurence C

    2013-01-01

    Objective. To quantify the differential impact on hospital performance of three readmission metrics: all-cause readmission (ACR), 3M Potential Preventable Readmission (PPR), and Centers for Medicare and Medicaid 30-day readmission (CMS).

  9. Prognostic Performance Metrics

    Data.gov (United States)

    National Aeronautics and Space Administration — This chapter presents several performance metrics for offline evaluation of prognostics algorithms. A brief overview of different methods employed for performance...

  10. Adaptive metric kernel regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    2000-01-01

    Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows the importance of different dimensions to be adjusted automatically. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms…
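
    The idea of adapting the input metric by minimising a cross-validation estimate can be sketched with a Nadaraya-Watson smoother whose per-dimension metric weights are grid-searched against leave-one-out error. This is a crude stand-in for the authors' algorithm (which optimises the metric rather than grid-searching it); the data are synthetic:

```python
import math

def nw_predict(x, X, y, weights_diag, exclude=None):
    """Nadaraya-Watson estimate with a diagonal input metric:
    d^2 = sum_j w_j (x_j - X_ij)^2, so a larger w_j makes
    dimension j more important in the kernel."""
    num = den = 0.0
    for i, (xi, yi) in enumerate(zip(X, y)):
        if i == exclude:
            continue
        d2 = sum(w * (a - b) ** 2 for w, a, b in zip(weights_diag, x, xi))
        k = math.exp(-0.5 * d2)
        num += k * yi
        den += k
    return num / den

def loo_error(X, y, weights_diag):
    """Leave-one-out cross-validation estimate of generalisation error."""
    return sum((nw_predict(X[i], X, y, weights_diag, exclude=i) - y[i]) ** 2
               for i in range(len(X))) / len(X)

# Target depends only on the first input dimension; the second is noise.
X = [(0.1 * i, ((7 * i) % 5) - 2.0) for i in range(20)]
y = [math.sin(x1) for x1, _ in X]

# Crude adaptation: grid-search the per-dimension weights by LOO error.
grid = [0.0, 0.1, 1.0, 10.0]
best = min(((w1, w2) for w1 in grid for w2 in grid),
           key=lambda w: loo_error(X, y, w))
print(best)
```

    The selected weights put far more emphasis on the informative first dimension than on the noise dimension, which is the variable-selection behaviour the abstract describes.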

  11. CRISPR-Cas Targeting of Host Genes as an Antiviral Strategy.

    Science.gov (United States)

    Chen, Shuliang; Yu, Xiao; Guo, Deyin

    2018-01-16

    Currently, a new gene editing tool, the Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR)-associated (Cas) system, is becoming a promising approach for genetic manipulation at the genomic level. This simple method, originating from the adaptive immune defense system of prokaryotes, has been developed and applied to antiviral research in humans. Based on the characteristics of virus-host interactions and the basic rules of nucleic acid cleavage or gene activation of the CRISPR-Cas system, it can be used to target both the viral genome and host factors to clear viral reservoirs and prevent viral infection or replication. Here, we summarize recent progress of CRISPR-Cas technology in editing host genes as an antiviral strategy.

  12. SU-D-BRE-06: Modeling the Dosimetric Effects of Volumetric and Layer-Based Repainting Strategies in Spot Scanning Proton Treatment Plans

    International Nuclear Information System (INIS)

    Johnson, J E; Beltran, C; Herman, M G; Kruse, J J

    2014-01-01

    Purpose: To compare multiple repainting techniques as strategies for mitigating the interplay effect in free-breathing, spot scanning proton plans. Methods: An analytic routine modeled three-dimensional dose distributions of pencil-beam proton plans delivered to a moving target. The interplay effect was studied in subsequent calculations by modeling proton delivery from a clinical synchrotron based spot scanning system and respiratory target motion, patterned from surrogate breathing traces from clinical 4DCT scans and normalized to nominal 0.5 and 1 cm amplitudes. Two distinct repainting strategies were modeled. In idealized volumetric repainting, the plan is divided up and delivered multiple times successively, with each instance only delivering a fraction of the total MU. Maximum-MU repainting involves delivering a fixed number of MU per spot and repeating a given energy layer until the prescribed MU are reached. For each of 13 patient breathing traces, the dose was computed for up to four volumetric repaints and an array of maximum-MU values. Delivery strategies were inter-compared based on target coverage, dose homogeneity, and delivery time. Results: Increasing levels of repainting generally improved plan quality and reduced dosimetric variability at the expense of longer delivery time. Motion orthogonal to the scan direction yielded substantially greater dose deviations than motion parallel to the scan direction. For a fixed delivery time, maximum-MU repainting was most effective relative to idealized volumetric repainting at small maximum-MU values. For 1 cm amplitude motion orthogonal to the scan direction, the average homogeneity metric (D5 – D95)[%] of 23.4% was reduced to 7.6% with a 168 s delivery using volumetric repainting compared with 8.7% in 157.2 s for maximum-MU repainting. The associated static target homogeneity metric was 2.5%. Conclusion: Maximum-MU repainting can provide a reasonably effective alternative to volumetric repainting for
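
    The interplay effect and the benefit of repainting lend themselves to a toy 1-D model: Gaussian pencil-beam spots delivered while the target oscillates, with volumetric repainting splitting each spot's MU across the motion cycle. All parameters below (spot spacing, sigma, amplitude, period) are illustrative, not the clinical values from the study:

```python
import math

def delivered_dose(schedule, xs, amplitude, period, sigma=1.5):
    """1-D dose in the target frame from Gaussian pencil-beam spots.
    schedule: list of (spot_position_mm, time_s, weight); the target
    oscillates as amplitude*sin(2*pi*t/period), so a spot fired at
    time t lands at spot_position - displacement(t) in the target frame."""
    dose = [0.0] * len(xs)
    for s, t, w in schedule:
        shift = amplitude * math.sin(2.0 * math.pi * t / period)
        for i, x in enumerate(xs):
            dose[i] += w * math.exp(-((x - (s - shift)) ** 2) / (2 * sigma ** 2))
    return dose

def heterogeneity(dose, xs, roi=(-5.0, 5.0)):
    """(max - min)/mean over the ROI: a crude analogue of D5 - D95."""
    vals = [d for d, x in zip(dose, xs) if roi[0] <= x <= roi[1]]
    return (max(vals) - min(vals)) / (sum(vals) / len(vals))

spots = [-20.0 + 2.5 * j for j in range(17)]
xs = [-60.0 + 0.25 * i for i in range(481)]
period = 4.0                       # s, one breathing cycle

# Single painting: spots delivered sequentially across one cycle.
single = [(s, j * period / 17, 1.0) for j, s in enumerate(spots)]
# Volumetric repainting: the whole field repeated N times at 1/N weight.
N = 8
repaint = [(s, (j / 17 + k / N) * period, 1.0 / N)
           for k in range(N) for j, s in enumerate(spots)]

static = delivered_dose(single, xs, amplitude=0.0, period=period)
moving = delivered_dose(single, xs, amplitude=10.0, period=period)
painted = delivered_dose(repaint, xs, amplitude=10.0, period=period)
for name, d in [("static", static), ("1 paint", moving), ("8 paints", painted)]:
    print(name, round(heterogeneity(d, xs), 3))
```

    With these parameters the single-painted delivery shows large hot and cold spots inside the ROI while the static delivery is nearly flat; repainting averages each spot over the motion cycle and typically recovers much of the homogeneity, though the residual benefit depends on the phase relationship between delivery and breathing, as the abstract's patient-to-patient variability suggests.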

  13. Global-cognitive health metrics: A novel approach for assessing cognition impairment in adult population.

    Directory of Open Access Journals (Sweden)

    Chia-Kuang Tsai

    Full Text Available Dementia is the foremost worldwide burden on welfare and the health care system in the 21st century. The early identification and control of the modifiable risk factors of dementia are important. Global-cognitive health (GCH) metrics, encompassing controllable cardiovascular health (CVH) and non-CVH risk factors of dementia, is a newly developed approach to assess the risk of cognitive impairment. The components of ideal GCH metrics include better education, non-obesity, normal blood pressure, no smoking, no depression, ideal physical activity, good social integration, normal glycated hemoglobin (HbA1c), and normal hearing. This study focuses on the association between ideal GCH metrics and cognitive function in young adults by investigating the Third National Health and Nutrition Examination Survey (NHANES III) database, which has not been reported previously. A total of 1243 participants aged 17 to 39 years were recruited in this study. Cognitive functioning was evaluated by the simple reaction time test (SRTT), symbol-digit substitution test (SDST), and serial digit learning test (SDLT). Participants with significantly higher GCH metrics scores had better cognitive performance (p for trend <0.01) in all three cognitive tests. Moreover, better education, ideal physical activity, good social integration, and normal glycated hemoglobin were the components of ideal GCH metrics associated with better cognitive performance after adjusting for covariates (p < 0.05) in all three cognitive tests. These findings emphasize the importance of a preventive strategy for modifiable dementia risk factors to enhance cognitive functioning during adulthood.
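
    The nine-component score can be sketched as a simple count of ideal components. The thresholds behind each boolean (e.g., what counts as "normal HbA1c") are deliberately left to the caller, since the abstract lists the components but not the cut-offs:

```python
GCH_COMPONENTS = [
    "better_education", "non_obesity", "normal_blood_pressure",
    "no_smoking", "no_depression", "ideal_physical_activity",
    "good_social_integration", "normal_hba1c", "normal_hearing",
]

def gch_score(status):
    """Count of ideal GCH components (0-9) from a dict of booleans.
    Missing components are conservatively treated as not ideal."""
    return sum(bool(status.get(c, False)) for c in GCH_COMPONENTS)

example = {c: True for c in GCH_COMPONENTS}
example["no_smoking"] = False      # current smoker
example["non_obesity"] = False     # obese
print(gch_score(example))
```

    Trend analyses like the one reported (p for trend across score groups) would then compare cognitive test performance across strata of this count.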

  14. Halobacterium salinarum NRC-1 PeptideAtlas: strategies for targeted proteomics

    Science.gov (United States)

    Van, Phu T.; Schmid, Amy K.; King, Nichole L.; Kaur, Amardeep; Pan, Min; Whitehead, Kenia; Koide, Tie; Facciotti, Marc T.; Goo, Young-Ah; Deutsch, Eric W.; Reiss, David J.; Mallick, Parag; Baliga, Nitin S.

    2009-01-01

    The relatively small number of proteins and the fewer possible posttranslational modifications in microbes provide a unique opportunity to comprehensively characterize their dynamic proteomes. We have constructed a Peptide Atlas (PA) covering 62.7% of the predicted proteome of the extremely halophilic archaeon Halobacterium salinarum NRC-1 by compiling approximately 636,000 tandem mass spectra from 497 mass spectrometry runs in 88 experiments. Analysis of the PA with respect to the biophysical properties of constituent peptides, the functional properties of the parent proteins of detected peptides, and the performance of different mass spectrometry approaches has helped highlight plausible strategies for improving proteome coverage and selecting signature peptides for targeted proteomics. Notably, discovery of a significant correlation between absolute abundances of mRNAs and proteins has helped identify low protein abundance as the major limitation in peptide detection. Furthermore, we have discovered that iTRAQ labeling for quantitative proteomic analysis introduces a significant bias in peptide detection by mass spectrometry. Therefore, despite identifying at least one proteotypic peptide for almost all proteins in the PA, a context-dependent selection of proteotypic peptides appears to be the most effective approach for targeted proteomics. PMID:18652504

  15. Energy-Based Metrics for Arthroscopic Skills Assessment.

    Science.gov (United States)

    Poursartip, Behnaz; LeBel, Marie-Eve; McCracken, Laura C; Escoto, Abelardo; Patel, Rajni V; Naish, Michael D; Trejos, Ana Luisa

    2017-08-05

    Minimally invasive skills assessment methods are essential in developing efficient surgical simulators and implementing consistent skills evaluation. Although numerous methods have been investigated in the literature, there is still a need to further improve the accuracy of surgical skills assessment. Energy expenditure can be an indication of motor skills proficiency. The goals of this study are to develop objective metrics based on energy expenditure, normalize these metrics, and investigate classifying trainees using these metrics. To this end, different forms of energy consisting of mechanical energy and work were considered and their values were divided by the related value of an ideal performance to develop normalized metrics. These metrics were used as inputs for various machine learning algorithms including support vector machines (SVM) and neural networks (NNs) for classification. The accuracy of the combination of the normalized energy-based metrics with these classifiers was evaluated through a leave-one-subject-out cross-validation. The proposed method was validated using 26 subjects at two experience levels (novices and experts) in three arthroscopic tasks. The results showed that there are statistically significant differences between novices and experts for almost all of the normalized energy-based metrics. The accuracy of classification using SVM and NN methods was between 70% and 95% for the various tasks. The results show that the normalized energy-based metrics and their combination with SVM and NN classifiers are capable of providing accurate classification of trainees. The assessment method proposed in this study can enhance surgical training by providing appropriate feedback to trainees about their level of expertise and can be used in the evaluation of proficiency.
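    The normalization step described above divides each energy value by the corresponding value of an ideal reference performance. As an illustrative sketch only (the function names and the simple kinetic-plus-potential energy model are assumptions, not the authors' implementation), the idea can be expressed as:

```python
def mechanical_energy(masses, velocities, heights, g=9.81):
    """Total mechanical energy over motion samples: kinetic (0.5*m*v^2)
    plus potential (m*g*h), a simplified stand-in for the paper's
    energy-expenditure measures."""
    return sum(0.5 * m * v ** 2 + m * g * h
               for m, v, h in zip(masses, velocities, heights))

def normalized_energy_metric(trainee_energy, ideal_energy):
    """Normalize a trainee's expended energy by the energy of an ideal
    (reference) performance, so values near 1 indicate expert-like economy
    of motion."""
    if ideal_energy <= 0:
        raise ValueError("ideal energy must be positive")
    return trainee_energy / ideal_energy
```

    Vectors of such normalized metrics, one per task, would then be fed to an SVM or neural-network classifier and evaluated with leave-one-subject-out cross-validation, as the abstract describes.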

  16. Principle of space existence and De Sitter metric

    International Nuclear Information System (INIS)

    Mal'tsev, V.K.

    1990-01-01

    The selection principle for the solutions of the Einstein equations suggested in a series of papers implies the existence of space (g_ik ≠ 0) only in the presence of matter (T_ik ≠ 0). This selection principle (the principle of space existence, in Markov's terminology) implies, in the general case, the absence of the cosmological solution with the de Sitter metric. On the other hand, the de Sitter metric is necessary for describing both the inflation and deflation periods of the Universe. It is shown that the de Sitter metric is also allowed by the selection principle under discussion if the metric evolves into the Friedmann metric.

  17. What can article-level metrics do for you?

    Science.gov (United States)

    Fenner, Martin

    2013-10-01

    Article-level metrics (ALMs) provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, usage statistics, discussions in online comments and social media, social bookmarking, and recommendations. In this essay, we describe why article-level metrics are an important extension of traditional citation-based journal metrics and provide a number of examples from ALM data collected for PLOS Biology.

  18. About the possibility of a generalized metric

    International Nuclear Information System (INIS)

    Lukacs, B.; Ladik, J.

    1991-10-01

    The metric (the structure of space-time) may depend on the properties of the object measuring it. The case of size dependence of the metric was examined. For this dependence, the simplest possible form of the metric tensor has been constructed which fulfils the following requirements: there be two extremal characteristic scales; the metric be unique and take the usual form between them; the change be sudden in the neighbourhood of these scales; and the size of the human body appear as a parameter (postulated on the basis of some philosophical arguments). Estimates have been made for the two extremal length scales according to existing observations. (author). 19 refs

  19. Consumption metrics of chardonnay wine consumers in Australia

    Directory of Open Access Journals (Sweden)

    Saliba AJ

    2015-02-01

    Full Text Available Anthony J Saliba,¹ Johan Bruwer,² Jasmine B MacDonald¹ ¹School of Psychology, Charles Sturt University, Bathurst, NSW, ²School of Marketing, University of South Australia, Adelaide, SA, Australia Abstract: There is a dearth of information in the knowledge base about who the chardonnay consumer is, what their wine-consumption metrics are, what sensory characteristics they associate chardonnay with, and who influenced their perceptions. This study examines consumer engagement with chardonnay, and contributes evidence-based research to inform future wine-business strategy. A population sample was recruited to be representative of Australian consumers. An online survey of 2,024 Australian wine consumers was conducted, 1,533 (76%) of whom actually consumed chardonnay. This paper focuses only on those who consumed chardonnay. Males purchased and consumed larger quantities of chardonnay, although marginally more females consumed it. Chardonnay is considered to be characterized by full, lingering, and fruity flavors, as well as yellow color. Chardonnay is associated with dinner parties and at-home consumption. The vast majority of participants liked and had a positive perception of chardonnay. The target market for chardonnay is not only females; in fact, males appear to be the main consumers of this varietal by volume. Marketing and promotion campaigns should leverage the findings to retain current and win back other consumers. This is the first research to provide empirical explanations of consumer engagement with chardonnay, and to contribute evidence-based research in this regard. Keywords: chardonnay, consumer behavior, wine style, wine consumption, Australia

  20. Adaptive Metric Kernel Regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    1998-01-01

    Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression...... by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...
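    A minimal sketch of the idea described above, assuming a Gaussian kernel and a diagonal (per-dimension) scaling of the input metric; the function names are illustrative and not taken from the paper:

```python
import math

def kernel_regress(x_query, X, y, scales):
    """Nadaraya-Watson estimate with a per-dimension scaled (adaptive)
    Euclidean metric: d(a, b)^2 = sum_k (s_k * (a_k - b_k))^2."""
    weights = []
    for xi in X:
        d2 = sum((s * (a - b)) ** 2 for s, a, b in zip(scales, x_query, xi))
        weights.append(math.exp(-0.5 * d2))
    total = sum(weights)
    return sum(w * yi for w, yi in zip(weights, y)) / total

def loo_cv_error(X, y, scales):
    """Leave-one-out cross-validation estimate of squared error: the
    quantity an adaptive-metric algorithm would minimise over `scales`."""
    err = 0.0
    for i in range(len(X)):
        Xtr = X[:i] + X[i + 1:]
        ytr = y[:i] + y[i + 1:]
        err += (kernel_regress(X[i], Xtr, ytr, scales) - y[i]) ** 2
    return err / len(X)
```

    Minimising `loo_cv_error` over `scales` (for example by gradient descent) shrinks the weight of irrelevant input dimensions, which is the variable-selection effect reported in the abstract.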

  1. Ideal Based Cyber Security Technical Metrics for Control Systems

    Energy Technology Data Exchange (ETDEWEB)

    W. F. Boyer; M. A. McQueen

    2007-10-01

    Much of the world's critical infrastructure is at risk from attack through electronic networks connected to control systems. Security metrics are important because they provide the basis for management decisions that affect the protection of the infrastructure. A cyber security technical metric is the security-relevant output from an explicit mathematical model that makes use of objective measurements of a technical object. A specific set of technical security metrics is proposed for use by the operators of control systems. Our proposed metrics are based on seven security ideals associated with seven corresponding abstract dimensions of security. We have defined at least one metric for each of the seven ideals. Each metric is a measure of how nearly the associated ideal has been achieved. These seven ideals provide a useful structure for further metrics development. A case study shows how the proposed metrics can be applied to an operational control system.

  2. THE ROLE OF ARTICLE LEVEL METRICS IN SCIENTIFIC PUBLISHING

    Directory of Open Access Journals (Sweden)

    Vladimir TRAJKOVSKI

    2016-04-01

    Full Text Available Emerging metrics based on the article level do not exclude traditional metrics based on citations to the journal, but complement them. Article-level metrics (ALMs) provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, usage statistics, discussions in online comments and social media, social bookmarking, and recommendations. In this editorial, the role of article-level metrics in publishing scientific papers is described. ALMs are rapidly emerging as important tools to quantify how individual articles are being discussed, shared, and used. Data sources depend on the tool, but they include classic citation-based indicators, academic social networks (Mendeley, CiteULike, Delicious), and social media (Facebook, Twitter, blogs, and YouTube). The most popular tools used to apply these new metrics are: Public Library of Science - Article-Level Metrics, Altmetric, Impactstory, and Plum Analytics. The Journal Impact Factor (JIF) does not consider impact or influence beyond citation counts, as reflected only through Thomson Reuters' Web of Science® database. The JIF provides an indicator related to the journal, but not to a published paper. Thus, altmetrics now become an alternative metric for performance assessment of individual scientists and their contributed scholarly publications. Macedonian scholarly publishers have to work on implementing article-level metrics in their e-journals. It is the way to increase their visibility and impact in the world of science.

  3. Supplier selection using different metric functions

    Directory of Open Access Journals (Sweden)

    Omosigho S.E.

    2015-01-01

    Full Text Available Supplier selection is an important component of supply chain management in today's global competitive environment. Hence, the evaluation and selection of suppliers have received considerable attention in the literature. Many attributes of suppliers, other than cost, are considered in the evaluation and selection process. Therefore, the process of evaluation and selection of suppliers is a multi-criteria decision-making process. The methodology adopted to solve the supplier selection problem is intuitionistic fuzzy TOPSIS (Technique for Order Preference by Similarity to the Ideal Solution). Generally, TOPSIS is based on the concept of minimum distance from the positive ideal solution and maximum distance from the negative ideal solution. We examine the deficiencies of using only one metric function in TOPSIS and propose the use of the spherical metric function in addition to the commonly used metric functions. For empirical supplier selection problems, more than one metric function should be used.
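    The closeness-to-ideal computation that underlies TOPSIS can be sketched as follows, here with the common Euclidean metric and benefit-type criteria only (a simplification for illustration; the paper's intuitionistic fuzzy version and spherical metric are not reproduced):

```python
import math

def topsis_scores(matrix, weights):
    """Score alternatives (rows) by relative closeness to the ideal
    solution using the Euclidean metric; all columns are treated as
    benefit criteria (higher is better)."""
    ncols = len(matrix[0])
    # vector normalisation of each column
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    V = [[w * row[j] / norms[j] for j, w in enumerate(weights)] for row in matrix]
    ideal = [max(col) for col in zip(*V)]  # positive ideal solution
    anti = [min(col) for col in zip(*V)]   # negative ideal solution
    scores = []
    for v in V:
        d_pos = math.dist(v, ideal)  # distance to positive ideal
        d_neg = math.dist(v, anti)   # distance to negative ideal
        scores.append(d_neg / (d_pos + d_neg))
    return scores
```

    Swapping `math.dist` for another distance function is exactly the choice of metric function the paper examines.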

  4. 77 FR 12832 - Non-RTO/ISO Performance Metrics; Commission Staff Request Comments on Performance Metrics for...

    Science.gov (United States)

    2012-03-02

    ... Performance Metrics; Commission Staff Request Comments on Performance Metrics for Regions Outside of RTOs and... performance communicate about the benefits of RTOs and, where appropriate, (2) changes that need to be made to... common set of performance measures for markets both within and outside of ISOs/RTOs. As recommended by...

  5. Regional Sustainability: The San Luis Basin Metrics Project

    Science.gov (United States)

    There are a number of established, scientifically supported metrics of sustainability. Many of the metrics are data intensive and require extensive effort to collect data and compute. Moreover, individual metrics may not capture all aspects of a system that are relevant to sust...

  6. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  7. Metric solution of a spinning mass

    International Nuclear Information System (INIS)

    Sato, H.

    1982-01-01

    Studies on a particular class of asymptotically flat and stationary metric solutions called the Kerr-Tomimatsu-Sato class are reviewed about its derivation and properties. For a further study, an almost complete list of the papers worked on the Tomimatsu-Sato metrics is given. (Auth.)

  8. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  9. Generalized tolerance sensitivity and DEA metric sensitivity

    OpenAIRE

    Neralić, Luka; E. Wendell, Richard

    2015-01-01

    This paper considers the relationship between Tolerance sensitivity analysis in optimization and metric sensitivity analysis in Data Envelopment Analysis (DEA). Herein, we extend the results on the generalized Tolerance framework proposed by Wendell and Chen and show how this framework includes DEA metric sensitivity as a special case. Further, we note how recent results in Tolerance sensitivity suggest some possible extensions of the results in DEA metric sensitivity.

  10. On Nakhleh's metric for reduced phylogenetic networks

    OpenAIRE

    Cardona, Gabriel; Llabrés, Mercè; Rosselló, Francesc; Valiente Feruglio, Gabriel Alejandro

    2009-01-01

    We prove that Nakhleh’s metric for reduced phylogenetic networks is also a metric on the classes of tree-child phylogenetic networks, semibinary tree-sibling time consistent phylogenetic networks, and multilabeled phylogenetic trees. We also prove that it separates distinguishable phylogenetic networks. In this way, it becomes the strongest dissimilarity measure for phylogenetic networks available so far. Furthermore, we propose a generalization of that metric that separates arbitrary phyl...

  11. Generalized tolerance sensitivity and DEA metric sensitivity

    Directory of Open Access Journals (Sweden)

    Luka Neralić

    2015-03-01

    Full Text Available This paper considers the relationship between Tolerance sensitivity analysis in optimization and metric sensitivity analysis in Data Envelopment Analysis (DEA). Herein, we extend the results on the generalized Tolerance framework proposed by Wendell and Chen and show how this framework includes DEA metric sensitivity as a special case. Further, we note how recent results in Tolerance sensitivity suggest some possible extensions of the results in DEA metric sensitivity.

  12. Social Media Metrics Importance and Usage Frequency in Latvia

    Directory of Open Access Journals (Sweden)

    Ronalds Skulme

    2017-12-01

    Full Text Available Purpose of the article: The purpose of this paper was to explore which social media marketing metrics are most often used and most important for marketing experts in Latvia and can be used to evaluate marketing campaign effectiveness. Methodology/methods: In order to achieve the aim of this paper, several theoretical and practical research methods were used, such as theoretical literature analysis, surveying, and grouping. First, theoretical research about social media metrics was conducted. The authors collected information about social media metric grouping methods and the social media metrics most frequently mentioned in the literature. The collected information was used as the foundation for the expert surveys. The expert surveys were used to collect information from Latvian marketing professionals to determine which social media metrics are used most often and which are most important in Latvia. Scientific aim: The scientific aim of this paper was to identify whether the importance of social media metrics varies depending on the consumer purchase decision stage. Findings: Information about the most important and most often used social media marketing metrics in Latvia was collected. A new social media grouping framework is proposed. Conclusions: The main conclusion is that the importance and the usage frequency of social media metrics change depending on the consumer purchase decision stage the metric is used to evaluate.

  13. Post-targeting strategy for ready-to-use targeted nanodelivery post cargo loading.

    Science.gov (United States)

    Zhu, J Y; Hu, J J; Zhang, M K; Yu, W Y; Zheng, D W; Wang, X Q; Feng, J; Zhang, X Z

    2017-12-14

    Based on boronate formation, this study reports a post-targeting methodology capable of readily installing versatile targeting modules onto a cargo-loaded nanoplatform in aqueous mediums. This permits the targeted nanodelivery of broad-spectrum therapeutics (drug/gene) in a ready-to-use manner while overcoming the PEGylation-dilemma that frequently occurs in conventional targeting approaches.

  14. A comparison theorem of the Kobayashi metric and the Bergman metric on a class of Reinhardt domains

    International Nuclear Information System (INIS)

    Weiping Yin.

    1990-03-01

    A comparison theorem for the Kobayashi and Bergman metrics is given on a class of Reinhardt domains in C^n. In the meantime, we obtain a class of complete invariant Kaehler metrics for special cases of these domains. (author). 5 refs

  15. Using Activity Metrics for DEVS Simulation Profiling

    Directory of Open Access Journals (Sweden)

    Muzy A.

    2014-01-01

    Full Text Available Activity metrics can be used to profile DEVS models before and during the simulation. It is critical to obtain good activity metrics for models before and during their simulation. Having a means to compute the a priori activity of components (analytic activity) may be worthwhile when simulating a model (or parts of it) for the first time. Afterwards, during the simulation, analytic activity can be corrected using the dynamic one. In this paper, we introduce the McCabe cyclomatic complexity metric (MCA) to compute analytic activity. Both static and simulation activity metrics have been implemented through a plug-in of the DEVSimPy (DEVS Simulator in Python) environment and applied to DEVS models.
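    McCabe's cyclomatic complexity counts linearly independent paths through code: 1 plus the number of decision points. A rough sketch of such a static count using Python's standard `ast` module (an illustration only; DEVSimPy's own implementation may differ):

```python
import ast

def cyclomatic_complexity(source):
    """Approximate McCabe cyclomatic complexity of Python source:
    1 + number of decision points (if/for/while statements, conditional
    expressions, exception handlers, and boolean operator groups)."""
    tree = ast.parse(source)
    decision_nodes = (ast.If, ast.For, ast.While, ast.IfExp,
                      ast.ExceptHandler, ast.BoolOp)
    return 1 + sum(isinstance(node, decision_nodes) for node in ast.walk(tree))
```

    Applied to a model component's source, such a static count could serve as the a priori (analytic) activity estimate, later corrected by the dynamic activity observed during simulation.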

  16. Tracker Performance Metric

    National Research Council Canada - National Science Library

    Olson, Teresa; Lee, Harry; Sanders, Johnnie

    2002-01-01

    .... We have developed the Tracker Performance Metric (TPM) specifically for this purpose. It was designed to measure the output performance, on a frame-by-frame basis, using its output position and quality...

  17. Metrication: An economic wake-up call for US industry

    Science.gov (United States)

    Carver, G. P.

    1993-03-01

    As the international standard of measurement, the metric system is one key to success in the global marketplace. International standards have become an important factor in international economic competition. Non-metric products are becoming increasingly unacceptable in world markets that favor metric products. Procurement is the primary federal tool for encouraging and helping U.S. industry to convert voluntarily to the metric system. Besides the perceived unwillingness of the customer, certain regulatory language, and certain legal definitions in some states, there are no major impediments to conversion of the remaining non-metric industries to metric usage. Instead, there are good reasons for changing, including an opportunity to rethink many industry standards and to take advantage of size standardization. Also, when the remaining industries adopt the metric system, they will come into conformance with federal agencies engaged in similar activities.

  18. Conformal and related changes of metric on the product of two almost contact metric manifolds.

    OpenAIRE

    Blair, D. E.

    1990-01-01

    This paper studies conformal and related changes of the product metric on the product of two almost contact metric manifolds. It is shown that if one factor is Sasakian, the other is not, but that locally the second factor is of the type studied by Kenmotsu. The results are more general and given in terms of trans-Sasakian, α-Sasakian and β-Kenmotsu structures.

  19. Extremal limits of the C metric: Nariai, Bertotti-Robinson, and anti-Nariai C metrics

    International Nuclear Information System (INIS)

    Dias, Oscar J.C.; Lemos, Jose P.S.

    2003-01-01

    In two previous papers we have analyzed the C metric in a background with a cosmological constant Λ, namely, the de Sitter (dS) C metric (Λ>0) and the anti-de Sitter (AdS) C metric (Λ<0). In this paper we analyze the extremal limits of the C metric for Λ>0, Λ=0, and Λ<0. In the extremal limit (dS_2 x S-tilde_2), to each point in the deformed two-sphere S-tilde_2 there corresponds a dS_2 spacetime, except for one point, which corresponds to a dS_2 spacetime with an infinite straight strut or string. There are other important new features that appear. One expects that the solutions found in this paper are unstable and decay into a slightly nonextreme black hole pair accelerated by a strut or by strings. Moreover, the Euclidean version of these solutions mediates the quantum process of black hole pair creation that accompanies the decay of the dS and AdS spaces

  20. Graev metrics on free products and HNN extensions

    DEFF Research Database (Denmark)

    Slutsky, Konstantin

    2014-01-01

    We give a construction of two-sided invariant metrics on free products (possibly with amalgamation) of groups with two-sided invariant metrics and, under certain conditions, on HNN extensions of such groups. Our approach is similar to the Graev's construction of metrics on free groups over pointed...

  1. Validation of Metrics for Collaborative Systems

    OpenAIRE

    Ion IVAN; Cristian CIUREA

    2008-01-01

    This paper describes the new concept of collaborative systems metrics validation. The paper defines the quality characteristics of collaborative systems. A metric is proposed to estimate the quality level of collaborative systems. Measurements of collaborative systems quality are performed using specially designed software.

  2. g-Weak Contraction in Ordered Cone Rectangular Metric Spaces

    Directory of Open Access Journals (Sweden)

    S. K. Malhotra

    2013-01-01

    Full Text Available We prove some common fixed-point theorems for the ordered g-weak contractions in cone rectangular metric spaces without assuming the normality of the cone. Our results generalize some recent results from cone metric and cone rectangular metric spaces to ordered cone rectangular metric spaces. Examples are provided which illustrate the results.

  3. The dynamics of metric-affine gravity

    International Nuclear Information System (INIS)

    Vitagliano, Vincenzo; Sotiriou, Thomas P.; Liberati, Stefano

    2011-01-01

    Highlights: → The role and the dynamics of the connection in metric-affine theories is explored. → The most general second order action does not lead to a dynamical connection. → Including higher order invariants excites new degrees of freedom in the connection. → f(R) actions are also discussed and shown to be a non-representative class. - Abstract: Metric-affine theories of gravity provide an interesting alternative to general relativity: in such an approach, the metric and the affine (not necessarily symmetric) connection are independent quantities. Furthermore, the action should include covariant derivatives of the matter fields, with the covariant derivative naturally defined using the independent connection. As a result, in metric-affine theories a direct coupling involving matter and connection is also present. The role and the dynamics of the connection in such theories are explored. We employ power counting in order to construct the action and search for the minimal requirements it should satisfy for the connection to be dynamical. We find that for the most general action containing lower order invariants of the curvature and the torsion the independent connection does not carry any dynamics. It actually reduces to the role of an auxiliary field and can be completely eliminated algebraically in favour of the metric and the matter field, introducing extra interactions with respect to general relativity. However, we also show that including higher order terms in the action radically changes this picture and excites new degrees of freedom in the connection, making it (or parts of it) dynamical. Constructing actions that constitute exceptions to this rule requires significant fine tuning and/or extra a priori constraints on the connection. We also consider f(R) actions as a particular example in order to show that they constitute a distinct class of metric-affine theories with special properties, and as such they cannot be used as representative toy

  4. Multigas reduction strategy under climate stabilization target

    Energy Technology Data Exchange (ETDEWEB)

    Kurosawa, A. [Inst. of Applied Energy, Tokyo (Japan)

    2005-07-01

    Global warming can be mitigated through the abatement of carbon dioxide (CO{sub 2}), methane (CH{sub 4}), nitrous oxide (N{sub 2}O), hydrofluorocarbons (HFCs), perfluorocarbons (PFCs) and sulfur hexafluoride (SF{sub 6}). This study argued that multiple gas reduction flexibility should be assessed when considering effective greenhouse gas (GHG) mitigation strategies. Emissions of non-CO{sub 2} GHGs were calculated endogenously using an integrated assessment model. Multigas reduction potential was measured in relation to long-term atmospheric temperature targets, and the effects of gas lifetime as well as abatement timing uncertainty were considered in terms of cost and technological availability. The model consisted of 5 modules which considered issues related to energy, climate, land use, macroeconomics, and environmental impacts. The time horizon of the model was 2000 to 2100. An economic utility maximization technology was used to consider global trade balances. Emissions of non-CO{sub 2} gases from specific sources were calculated by multiplying the emission factor and the endogenous parameters within the model. Results were presented for GHG emissions and concentrations in 2 simulation cases: (1) a no climate policy case (NCP); and (2) a transient temperature stabilization (TTS) case. Actions to reduce non-CO{sub 2} GHGs included activity level changes in production and consumption, and additional reductions in abatement costs without sector activity changes. Results of the study showed that reducing global dependency on fossil fuels was an effective way to reduce GHG effects from CO{sub 2}, CH{sub 4} and N{sub 2}O. Additional abatements to reduce N{sub 2}O emissions are required in the agricultural sector. Economic incentives and public outreach programs are needed to offset the high transaction costs of GHG mitigation strategies. It was concluded that both short-term and long-term policies are required to reduce GHG in all sectors. Multigas mitigation is needed to

  5. The definitive guide to IT service metrics

    CERN Document Server

    McWhirter, Kurt

    2012-01-01

    Used just as they are, the metrics in this book will bring many benefits to both the IT department and the business as a whole. Details of the attributes of each metric are given, enabling you to make the right choices for your business. You may prefer, and are encouraged, to design and create your own metrics to bring even more value to your business - this book will show you how to do this, too.

  6. NASA education briefs for the classroom. Metrics in space

    Science.gov (United States)

    The use of metric measurement in space is summarized for classroom use. Advantages of the metric system over the English measurement system are described. Some common metric units are defined, as are special units for astronomical study. International system unit prefixes and a conversion table of metric/English units are presented. Questions and activities for the classroom are recommended.
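    A metric/English conversion table like the one recommended above reduces, in code, to a set of multiplicative factors. The factors below are the exact definitions; the dictionary layout and function name are just an illustration:

```python
# Conversion factors from common English units to SI (metric) units.
# All factors are exact by definition.
TO_METRIC = {
    "inch_to_cm": 2.54,
    "foot_to_m": 0.3048,
    "mile_to_km": 1.609344,
    "pound_to_kg": 0.45359237,
    "gallon_to_L": 3.785411784,  # US liquid gallon
}

def convert(value, factor_name):
    """Multiply an English-unit value by its metric conversion factor."""
    return value * TO_METRIC[factor_name]
```

    Because the factors are exact by definition, round-trip conversions introduce only floating-point error.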

  7. LPI Optimization Framework for Target Tracking in Radar Network Architectures Using Information-Theoretic Criteria

    Directory of Open Access Journals (Sweden)

    Chenguang Shi

    2014-01-01

    Full Text Available Widely distributed radar network architectures can provide significant performance improvement for target detection and localization. For a fixed radar network, the achievable target detection performance may go beyond a predetermined threshold with full transmitted power allocation, which is extremely vulnerable in modern electronic warfare. In this paper, we study the problem of low probability of intercept (LPI) design for a radar network and propose two novel LPI optimization schemes based on information-theoretic criteria. For a predefined threshold of target detection, the Schleher intercept factor is minimized by optimizing transmission power allocation among the netted radars in the network. Due to the lack of an analytical closed-form expression for the receiver operating characteristic (ROC), we employ two information-theoretic criteria, namely, the Bhattacharyya distance and J-divergence, as the metrics for target detection performance. The resulting nonconvex and nonlinear LPI optimization problems associated with different information-theoretic criteria are cast under a unified framework, and a nonlinear programming based genetic algorithm (NPGA) is used to tackle the optimization problems in the framework. Numerical simulations demonstrate that our proposed LPI strategies are effective in enhancing the LPI performance for the radar network.
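    For intuition, the Bhattacharyya distance used as a detection-performance surrogate above has a closed form when the detection statistics are modeled as univariate Gaussians (that modeling assumption is ours, for illustration; the paper's radar-network statistics are more involved):

```python
import math

def bhattacharyya_gaussian(mu0, var0, mu1, var1):
    """Bhattacharyya distance between two univariate Gaussian
    distributions N(mu0, var0) and N(mu1, var1):
    D_B = (mu0-mu1)^2 / (4*(var0+var1)) + 0.5*ln((var0+var1)/(2*sigma0*sigma1))."""
    term_mean = 0.25 * (mu0 - mu1) ** 2 / (var0 + var1)
    term_var = 0.5 * math.log((var0 + var1) / (2.0 * math.sqrt(var0 * var1)))
    return term_mean + term_var
```

    A larger distance between the target-present and target-absent statistics implies better detectability, so an optimizer can trade transmit power against this metric instead of the intractable ROC.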

  8. Enhancing Authentication Models Characteristic Metrics via ...

    African Journals Online (AJOL)

    In this work, we derive the universal characteristic metrics set for authentication models based on security, usability and design issues. We then compute the probability of the occurrence of each characteristic metrics in some single factor and multifactor authentication models in order to determine the effectiveness of these ...

  9. Validation of Metrics for Collaborative Systems

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2008-01-01

    Full Text Available This paper describes the new concept of collaborative systems metrics validation. The paper defines the quality characteristics of collaborative systems. A metric is proposed to estimate the quality level of collaborative systems. Measurements of collaborative systems quality are performed using specially designed software.

  10. Understanding Acceptance of Software Metrics--A Developer Perspective

    Science.gov (United States)

    Umarji, Medha

    2009-01-01

    Software metrics are measures of software products and processes. Metrics are widely used by software organizations to help manage projects, improve product quality and increase efficiency of the software development process. However, metrics programs tend to have a high failure rate in organizations, and developer pushback is one of the sources…

  11. Targeting Tumor-Associated Macrophages as a Potential Strategy to Enhance the Response to Immune Checkpoint Inhibitors.

    Science.gov (United States)

    Cassetta, Luca; Kitamura, Takanori

    2018-01-01

    Inhibition of immune checkpoint pathways in CD8+ T cells is a promising therapeutic strategy for the treatment of solid tumors that has shown significant anti-tumor effects and is now approved by the FDA to treat patients with melanoma and lung cancer. However, the response to this therapy is limited to a certain fraction of patients and tumor types, for reasons still unknown. To ensure the success of this treatment, CD8+ T cells, the main target of the checkpoint inhibitors, should exert full cytotoxicity against tumor cells. However, recent studies show that tumor-associated macrophages (TAMs) can impede this process by different mechanisms. In this mini-review we summarize recent studies showing the effect of TAM targeting on the efficacy of immune checkpoint inhibitors. We also discuss the limitations of the current strategies as well as the future scientific challenges for the progress of the tumor immunology field.

  12. Measuring reliability under epistemic uncertainty: Review on non-probabilistic reliability metrics

    Directory of Open Access Journals (Sweden)

    Kang Rui

    2016-06-01

    Full Text Available In this paper, a systematic review of non-probabilistic reliability metrics is conducted to assist the selection of appropriate reliability metrics to model the influence of epistemic uncertainty. Five frequently used non-probabilistic reliability metrics are critically reviewed, i.e., evidence-theory-based reliability metrics, interval-analysis-based reliability metrics, fuzzy-interval-analysis-based reliability metrics, possibility-theory-based reliability metrics (posbist reliability) and uncertainty-theory-based reliability metrics (belief reliability). It is pointed out that a qualified reliability metric that is able to consider the effect of epistemic uncertainty needs to (1) compensate for the conservatism in the estimates of component-level reliability metrics caused by epistemic uncertainty, and (2) satisfy the duality axiom; otherwise it may lead to paradoxical and confusing results in engineering applications. The five commonly used non-probabilistic reliability metrics are compared in terms of these two properties, and the comparison can serve as a basis for the selection of the appropriate reliability metric.

  13. Dual targeting of MDM2 and BCL2 as a therapeutic strategy in neuroblastoma.

    Science.gov (United States)

    Van Goethem, Alan; Yigit, Nurten; Moreno-Smith, Myrthala; Vasudevan, Sanjeev A; Barbieri, Eveline; Speleman, Frank; Shohet, Jason; Vandesompele, Jo; Van Maerken, Tom

    2017-08-22

    Wild-type p53 tumor suppressor activity in neuroblastoma tumors is hampered by increased MDM2 activity, making selective MDM2 antagonists an attractive therapeutic strategy for this childhood malignancy. Since monotherapy in cancer generally does not provide long-lasting clinical responses, we here aimed to identify small molecule drugs that synergize with idasanutlin (RG7388). To this purpose we evaluated 15 targeted drugs in combination with idasanutlin in three p53 wild-type neuroblastoma cell lines and identified the BCL2 inhibitor venetoclax (ABT-199) as a promising interaction partner. The venetoclax/idasanutlin combination was consistently found to be highly synergistic in a diverse panel of neuroblastoma cell lines, including cells with high MCL1 expression levels. A more pronounced induction of apoptosis was found to underlie the synergistic interaction, as evidenced by caspase-3/7 and cleaved PARP measurements. Mice carrying orthotopic xenografts of neuroblastoma cells treated with both idasanutlin and venetoclax had drastically lower tumor weights than mice treated with either agent alone. In conclusion, these data strongly support the further evaluation of dual BCL2/MDM2 targeting as a therapeutic strategy in neuroblastoma.

  14. Assessing the Effects of Data Compression in Simulations Using Physically Motivated Metrics

    Directory of Open Access Journals (Sweden)

    Daniel Laney

    2014-01-01

    Full Text Available This paper examines whether lossy compression can be used effectively in physics simulations as a possible strategy to combat the expected data-movement bottleneck in future high performance computing architectures. We show that, for the codes and simulations we tested, compression levels of 3–5X can be applied without causing significant changes to important physical quantities. Rather than applying signal processing error metrics, we utilize physics-based metrics appropriate for each code to assess the impact of compression. We evaluate three different simulation codes: a Lagrangian shock-hydrodynamics code, an Eulerian higher-order hydrodynamics turbulence modeling code, and an Eulerian coupled laser-plasma interaction code. We compress relevant quantities after each time-step to approximate the effects of tightly coupled compression and study the compression rates to estimate memory and disk-bandwidth reduction. We find that the error characteristics of compression algorithms must be carefully considered in the context of the underlying physics being modeled.
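
    The distinction drawn above between signal-processing error norms and physics-based metrics can be illustrated with a toy sketch (not the authors' codes or compressors): uniform quantization stands in for a lossy compressor, and the quantity checked is the relative change in a conserved total rather than a pointwise error norm. All names and parameter values here are hypothetical.

```python
import numpy as np

def lossy_compress(field, bits=8):
    """Uniform quantization as a stand-in lossy compressor (illustrative)."""
    lo, hi = field.min(), field.max()
    levels = 2 ** bits - 1
    q = np.round((field - lo) / (hi - lo) * levels)   # quantize to 2^bits levels
    return q * (hi - lo) / levels + lo                # reconstruct

rng = np.random.default_rng(0)
field = rng.normal(1.0, 0.1, size=100_000)  # e.g. an energy-density field

recon = lossy_compress(field, bits=8)

# Physics-motivated check: relative change in the conserved total,
# not a pointwise signal-processing norm such as RMSE.
rel_err = abs(recon.sum() - field.sum()) / field.sum()
print(f"relative error in conserved total: {rel_err:.2e}")
```

Because quantization errors largely cancel in the sum, the physically meaningful total is preserved far better than a pointwise norm would suggest, which is the spirit of the paper's evaluation.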

  15. Construction of self-dual codes in the Rosenbloom-Tsfasman metric

    Science.gov (United States)

    Krisnawati, Vira Hari; Nisa, Anzi Lina Ukhtin

    2017-12-01

    Linear codes are basic and very useful objects in coding theory. Generally, a linear code is a code over a finite field in the Hamming metric. Among the most interesting families of codes, the family of self-dual codes is a very important one, because these are among the best known error-correcting codes. The Hamming metric has been generalized to the Rosenbloom-Tsfasman metric (RT-metric). The inner product in the RT-metric differs from the Euclidean inner product used to define duality in the Hamming metric, and most codes that are self-dual in the Hamming metric are not self-dual in the RT-metric. Moreover, the generator matrix is central to constructing a code because it contains a basis of the code. In this paper, we therefore give some theorems and methods for constructing self-dual codes in the RT-metric by considering properties of the inner product and of generator matrices. We also illustrate each kind of construction with examples.
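
    The divergence between Hamming and RT self-duality can be seen with a small brute-force sketch (illustrative only, not from the paper). It adopts the "reversed" pairing ⟨u,v⟩ = Σ u_i v_{n+1-i}, a convention commonly used to define duality in the RT metric; the code names and examples are hypothetical.

```python
import itertools

def rt_inner(u, v, q=2):
    """RT-style inner product <u,v> = sum_i u_i * v_{n-1-i} (0-indexed), mod q."""
    n = len(u)
    return sum(u[i] * v[n - 1 - i] for i in range(n)) % q

def codewords(gen, q=2):
    """Enumerate all F_q-linear combinations of the generator rows."""
    k, n = len(gen), len(gen[0])
    for coeffs in itertools.product(range(q), repeat=k):
        yield tuple(sum(c * g[j] for c, g in zip(coeffs, gen)) % q
                    for j in range(n))

def is_rt_self_dual(gen, q=2):
    """Check C = C^perp under rt_inner: |C| = q^(n/2) and C self-orthogonal."""
    cws = list(codewords(gen, q))
    n = len(gen[0])
    if n % 2 or len(set(cws)) != q ** (n // 2):
        return False
    return all(rt_inner(u, v, q) == 0 for u in cws for v in cws)

# G = [1 1] generates {00, 11}: <11,11> = 1*1 + 1*1 = 0 mod 2 -> RT-self-dual.
print(is_rt_self_dual([[1, 1]]))  # True
# G = [1 0] generates {00, 10}: RT-self-dual, yet its Hamming self-inner
# product is 1, so it is NOT self-dual in the Hamming metric.
print(is_rt_self_dual([[1, 0]]))  # True
```

The second example shows concretely that self-duality in one metric does not transfer to the other.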

  16. Chaotic inflation with metric and matter perturbations

    International Nuclear Information System (INIS)

    Feldman, H.A.; Brandenberger, R.H.

    1989-01-01

    A perturbative scheme to analyze the evolution of both metric and scalar field perturbations in an expanding universe is developed. The scheme is applied to study chaotic inflation with initial metric and scalar field perturbations present. It is shown that initial gravitational perturbations with wavelength smaller than the Hubble radius rapidly decay. The metric simultaneously picks up small perturbations determined by the matter inhomogeneities. Both are frozen in once the wavelength exceeds the Hubble radius. (orig.)

  17. Obstacles to the implementation of the treat-to-target strategy for rheumatoid arthritis in clinical practice in Japan.

    Science.gov (United States)

    Kaneko, Yuko; Koike, Takao; Oda, Hiromi; Yamamoto, Kazuhiko; Miyasaka, Nobuyuki; Harigai, Masayoshi; Yamanaka, Hisashi; Ishiguro, Naoki; Tanaka, Yoshiya; Takeuchi, Tsutomu

    2015-01-01

    To clarify the obstacles preventing the implementation of the treat-to-target (T2T) strategy for rheumatoid arthritis (RA) in clinical practice. A total of 301 rheumatologists in Japan completed a questionnaire. In the first section, participants were indirectly questioned on the implementation of basic components of T2T, and in the second section, participants were directly questioned on their level of agreement and application. Although nearly all participants set treatment targets for the majority of RA patients with moderate to high disease activity, the proportion who set clinical remission as their target was 59%, with only 45% of these using composite measures. The proportion of participants who monitored X-rays and Health Assessment Questionnaires for all their patients was 44% and 14%, respectively. The proportion of participants who did not discuss treatment strategies was 44%, with approximately half of these reasoning that this was due to a proportion of patients having a lack of understanding of the treatment strategy or inability to make decisions. When participants were directly questioned, there was a high level of agreement with the T2T recommendations. Although there was a high level of agreement with the T2T recommendations, major obstacles preventing its full implementation still remain.

  18. Phantom metrics with Killing spinors

    Directory of Open Access Journals (Sweden)

    W.A. Sabra

    2015-11-01

    Full Text Available We study metric solutions of Einstein–anti-Maxwell theory admitting Killing spinors. The analogue of the IWP metric which admits a space-like Killing vector is found and is expressed in terms of a complex function satisfying the wave equation in flat (2+1)-dimensional space–time. As examples, electric and magnetic Kasner spaces are constructed by allowing the solution to depend only on the time coordinate. Euclidean solutions are also presented.

  19. A novel Trojan-horse targeting strategy to reduce the non-specific uptake of nanocarriers by non-cancerous cells.

    Science.gov (United States)

    Shen, Zheyu; Wu, Hao; Yang, Sugeun; Ma, Xuehua; Li, Zihou; Tan, Mingqian; Wu, Aiguo

    2015-11-01

    One big challenge with active targeting of nanocarriers is non-specific binding between targeting molecules and non-target moieties expressed on non-cancerous cells, which leads to non-specific uptake of nanocarriers by non-cancerous cells. Here, we propose a novel Trojan-horse targeting strategy to hide or expose the targeting molecules of nanocarriers on-demand. The non-specific uptake by non-cancerous cells can be reduced because the targeting molecules are hidden in hydrophilic polymers. The nanocarriers are still actively targetable to cancer cells because the targeting molecules can be exposed on-demand at tumor regions. Typically, Fe3O4 nanocrystals (FN) as magnetic resonance imaging (MRI) contrast agents were encapsulated into albumin nanoparticles (AN), and then folic acid (FA) and pH-sensitive polymers (PP) were grafted onto the surface of AN-FN to construct PP-FA-AN-FN nanoparticles. Fourier transform infrared spectroscopy (FT-IR), dynamic light scattering (DLS), transmission electron microscope (TEM) and gel permeation chromatography (GPC) results confirm successful construction of PP-FA-AN-FN. According to difference of nanoparticle-cellular uptake between pH 7.4 and 5.5, the weight ratio of conjugated PP to nanoparticle FA-AN-FN (i.e. graft density) and the molecular weight of PP (i.e. graft length) are optimized to be 1.32 and 5.7 kDa, respectively. In vitro studies confirm that the PP can hide ligand FA to prevent it from binding to cells with FRα at pH 7.4 and shrink to expose FA at pH 5.5. In vivo studies demonstrate that our Trojan-horse targeting strategy can reduce the non-specific uptake of the PP-FA-AN-FN by non-cancerous cells. Therefore, our PP-FA-AN-FN might be used as an accurately targeted MRI contrast agent. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Construction of the Questionnaire on Foreign Language Learning Strategies in Specific Croatian Context.

    Science.gov (United States)

    Božinović, Nikolina; Sindik, Joško

    2017-03-01

    Learning strategies are special thoughts or behaviours that individuals use to understand, learn or retain new information, in the view of O’Malley & Chamot. The other view, promoted by Oxford, holds that learning strategies are specific actions taken by the learner to make learning easier, faster, more enjoyable, and more transferrable to new situations of language learning and use. The use of appropriate strategies ensures greater success in language learning. The aim of the research was to establish the metric characteristics of the Questionnaire on learning strategies created by the author, modelled on the original SILL questionnaire (Strategy Inventory for Language Learning). The research was conducted at the Rochester Institute of Technology Croatia on a sample of 201 participants who were learning German, Spanish, French or Italian as a foreign language. The results showed that the one-component latent dimensions describing the space of foreign language learning strategies according to Oxford’s classification have low but still satisfactory metric characteristics (reliability and validity). Not all dimensions of learning strategies were adequately defined; we therefore excluded the compensation strategies and merged the social and affective strategies into a single social-affective dimension. Overall, this version of Oxford’s original questionnaire, based on Oxford’s theoretical construct and applied to Croatian students, clearly shows that the current version of the questionnaire has poor metric characteristics. One explanation of the results obtained may lie in the multicultural context and intercultural dialogue: the particular social, political and economic context in Croatia may shape even foreign language learning strategies.

  1. Mature Epitope Density - A strategy for target selection based on immunoinformatics and exported prokaryotic proteins

    DEFF Research Database (Denmark)

    Santos, Anderson R; Pereira, Vanessa Bastos; Barbosa, Eudes

    2013-01-01

    . However, currently available tools do not account for the concentration of epitope products in the mature protein product and its relation to the reliability of target selection. RESULTS: We developed a computational strategy based on measuring the epitope's concentration in the mature protein, called...... Mature Epitope Density (MED). Our method, though simple, is capable of identifying promising vaccine targets. Our online software implementation provides a computationally light and reliable analysis of bacterial exoproteins and their potential for vaccines or diagnosis projects against pathogenic...... proteins were confirmed as related. There was no experimental evidence of antigenic or pathogenic contributions for three of the highest MED-scored Mtb proteins. Hence, these three proteins could represent novel putative vaccine and drug targets for Mtb. A web version of MED is publicly available online...

  2. Invariant metric for nonlinear symplectic maps

    Indian Academy of Sciences (India)

    In this paper, we construct an invariant metric in the space of homogeneous polynomials of a given degree (≥ 3). The homogeneous polynomials specify a nonlinear symplectic map which in turn represents a Hamiltonian system. By minimizing the norm constructed out of this metric as a function of system parameters, we ...

  3. Learning a Novel Detection Metric for the Detection of O’Connell Effect Eclipsing Binaries

    Science.gov (United States)

    Johnston, Kyle; Haber, Rana; Knote, Matthew; Caballero-Nieves, Saida Maria; Peter, Adrian; Petit, Véronique

    2018-01-01

    With the advent of digital astronomy, new benefits and new challenges have been presented to the modern-day astronomer. No longer can the astronomer rely on manual processing; instead, the profession as a whole has begun to adopt more advanced computational means. Here we focus on the construction and application of a novel time-domain signature extraction methodology and the development of a supporting supervised pattern detection algorithm for the targeted identification of eclipsing binaries which demonstrate a feature known as the O’Connell effect. A methodology for the reduction of stellar variable observations (time-domain data) into Distribution Fields (DF) is presented. Push-Pull metric learning, a variant of LMNN learning, is used to generate a learned distance metric for the specific detection problem proposed. The metric will be trained on a set of labelled Kepler eclipsing binary data, in particular systems showing the O’Connell effect. Performance estimates will be presented, as well as the results of the detector applied to an unlabelled Kepler EB data set; this work is a crucial step in the upcoming era of big data from the next generation of large telescopes, such as LSST.
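
    A toy sketch of the learned-distance idea (not the Push-Pull/LMNN algorithm itself): distances are computed under a Mahalanobis metric d_M(x,y) = sqrt((x-y)^T M (x-y)), and a hypothetical learned M that up-weights the discriminative feature changes which point is the nearest neighbour. The feature values and the matrix M below are invented for illustration.

```python
import numpy as np

def mahalanobis(x, y, M):
    """Distance under a symmetric PSD metric M: sqrt((x-y)^T M (x-y))."""
    d = np.asarray(x) - np.asarray(y)
    return float(np.sqrt(d @ M @ d))

# Toy 2-D feature space: only the first dimension separates the classes.
M_euclid = np.eye(2)
M_learned = np.diag([10.0, 0.1])   # hypothetical output of metric learning

a = np.array([0.0, 0.0])   # query, class 1
b = np.array([1.0, 0.0])   # class 2
c = np.array([0.0, 3.0])   # class 1, but offset in the noisy dimension

# Euclidean: b (wrong class) is the nearer neighbour of a.
print(mahalanobis(a, b, M_euclid), mahalanobis(a, c, M_euclid))
# Learned metric: c (correct class) becomes the nearer neighbour.
print(mahalanobis(a, b, M_learned), mahalanobis(a, c, M_learned))
```

This is the mechanism a learned metric exploits: it reshapes the neighbourhood structure so that same-class signatures are pulled together and different-class signatures are pushed apart.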

  4. A unifying framework for metrics for aggregating the climate effect of different emissions

    International Nuclear Information System (INIS)

    Tol, Richard S J; Berntsen, Terje K; Fuglestvedt, Jan S; O’Neill, Brian C; Shine, Keith P

    2012-01-01

    Multi-gas approaches to climate change policies require a metric establishing ‘equivalences’ among emissions of various species. Climate scientists and economists have proposed four kinds of such metrics and debated their relative merits. We present a unifying framework that clarifies the relationships among them. We show, as have previous authors, that the global warming potential (GWP), used in international law to compare emissions of greenhouse gases, is a special case of the global damage potential (GDP), assuming (1) a finite time horizon, (2) a zero discount rate, (3) constant atmospheric concentrations, and (4) impacts that are proportional to radiative forcing. Both the GWP and GDP follow naturally from a cost–benefit framing of the climate change issue. We show that the global temperature change potential (GTP) is a special case of the global cost potential (GCP), assuming a (slight) fall in the global temperature after the target is reached. We show how the four metrics should be generalized if there are intertemporal spillovers in abatement costs, distinguishing between private (e.g., capital stock turnover) and public (e.g., induced technological change) spillovers. Both the GTP and GCP follow naturally from a cost-effectiveness framing of the climate change issue. We also argue that if (1) damages are zero below a threshold and (2) infinitely large above a threshold, then cost-effectiveness analysis and cost–benefit analysis lead to identical results. Therefore, the GCP is a special case of the GDP. The UN Framework Convention on Climate Change uses the GWP, a simplified cost–benefit concept. The UNFCCC is framed around the ultimate goal of stabilizing greenhouse gas concentrations. Once a stabilization target has been agreed under the convention, implementation is clearly a cost-effectiveness problem. It would therefore be more consistent to use the GCP or its simplification, the GTP. (letter)
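
    The GWP construction described above can be sketched numerically, assuming single-exponential decay for both gases (a simplification: the real reference gas, CO2, follows a multi-term impulse response) and purely illustrative, non-IPCC parameter values.

```python
import math

def agwp(radiative_efficiency, lifetime, horizon):
    """Absolute GWP for a gas with single-exponential decay:
    integral_0^H A * exp(-t/tau) dt = A * tau * (1 - exp(-H/tau))."""
    return radiative_efficiency * lifetime * (1 - math.exp(-horizon / lifetime))

def gwp(a_gas, tau_gas, a_ref, tau_ref, horizon=100):
    """GWP = AGWP of the gas over the horizon, per unit AGWP of the reference."""
    return agwp(a_gas, tau_gas, horizon) / agwp(a_ref, tau_ref, horizon)

# Illustrative numbers: a short-lived gas 120x more radiatively efficient
# than the reference, lifetime 12 yr vs 150 yr.
print(f"GWP-20:  {gwp(120, 12, 1, 150, horizon=20):.0f}")
print(f"GWP-100: {gwp(120, 12, 1, 150, horizon=100):.0f}")
```

The short-lived gas scores far higher over a 20-year horizon than over 100 years, which is exactly the time-horizon sensitivity that makes the choice of metric a policy question rather than a purely scientific one.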

  5. Two-dimensional manifolds with metrics of revolution

    International Nuclear Information System (INIS)

    Sabitov, I Kh

    2000-01-01

    This is a study of the topological and metric structure of two-dimensional manifolds with a metric that is locally a metric of revolution. In the case of compact manifolds this problem can be thoroughly investigated, and in particular it is explained why there are no closed analytic surfaces of revolution in R^3 other than a sphere and a torus (moreover, in the smoothness class C^∞ such surfaces, understood in a certain generalized sense, exist in any topological class).

  6. Gravitational lensing in metric theories of gravity

    International Nuclear Information System (INIS)

    Sereno, Mauro

    2003-01-01

    Gravitational lensing in metric theories of gravity is discussed. I introduce a generalized approximate metric element, inclusive of both post-post-Newtonian contributions and a gravitomagnetic field. Following Fermat's principle and standard hypotheses, I derive the time delay function and deflection angle caused by an isolated mass distribution. Several astrophysical systems are considered. In most of the cases, the gravitomagnetic correction offers the best perspectives for an observational detection. Actual measurements distinguish only marginally different metric theories from each other

  7. The uniqueness of the Fisher metric as information metric

    Czech Academy of Sciences Publication Activity Database

    Le, Hong-Van

    2017-01-01

    Vol. 69, No. 4 (2017), pp. 879-896 ISSN 0020-3157 Institutional support: RVO:67985840 Keywords: Chentsov’s theorem * mixed topology * monotonicity of the Fisher metric Subject RIV: BA - General Mathematics OECD field: Pure mathematics Impact factor: 1.049, year: 2016 https://link.springer.com/article/10.1007%2Fs10463-016-0562-0

  8. Hybrid metric-Palatini stars

    Science.gov (United States)

    Danilǎ, Bogdan; Harko, Tiberiu; Lobo, Francisco S. N.; Mak, M. K.

    2017-02-01

    We consider the internal structure and the physical properties of specific classes of neutron, quark and Bose-Einstein condensate stars in the recently proposed hybrid metric-Palatini gravity theory, which is a combination of the metric and Palatini f(R) formalisms. It turns out that the theory is very successful in accounting for the observed phenomenology, since it unifies local constraints at the Solar System level and the late-time cosmic acceleration, even if the scalar field is very light. In this paper, we derive the equilibrium equations for a spherically symmetric configuration (mass continuity and Tolman-Oppenheimer-Volkoff) in the framework of the scalar-tensor representation of the hybrid metric-Palatini theory, and we investigate their solutions numerically for different equations of state of neutron and quark matter, by adopting for the scalar field potential a Higgs-type form. It turns out that the scalar-tensor definition of the potential can be represented as a Clairaut differential equation, and provides an explicit form for f(R) given by f(R) ~ R + Λeff, where Λeff is an effective cosmological constant. Furthermore, stellar models, described by the stiff fluid, radiation-like, bag model and Bose-Einstein condensate equations of state, are explicitly constructed in both general relativity and hybrid metric-Palatini gravity, thus allowing an in-depth comparison between the predictions of these two gravitational theories. As a general result it turns out that for all the considered equations of state, hybrid gravity stars are more massive than their general relativistic counterparts. Furthermore, two classes of stellar models corresponding to two particular choices of the functional form of the scalar field (constant value, and logarithmic form, respectively) are also investigated. Interestingly enough, in the case of a constant scalar field the equation of state of the matter takes the form of the bag model equation of state describing

  9. Exports of company: SWOT-analysis, product strategy and sales targets

    International Nuclear Information System (INIS)

    Hammer, Hele

    1998-01-01

    Despite its smallness, Estonia has a good chance of success in the international peat market, owing to its favourable geographical location and well-developed peat industry. Estonia has numerous harbours, low wages and salaries, and a good educational background, and the Estonian economy is aiming at a competitive market economy. Peat exports represent a great opportunity to improve the balance of payments, create jobs, support the State through the taxes paid, meet the needs of foreign customers, earn a profit for Estonian peat companies and raise the Estonian standard of living. In preparing this paper, marketing textbooks and professional articles of interest were used; the working experience of one of the Estonian peat companies and the practical knowledge acquired there have also been of help throughout the thesis. In general, it may be expected that Estonian peat exports will increase in the next few years. The Netherlands and Germany will remain the main target countries; France, Belgium and the United Kingdom are also important. Exports to Italy will certainly increase, and exports to the Middle East are quite likely. The Far East is also a potential market, especially Korea and Japan. Peat marketing is based on the following premises: the demand for peat is a derived demand, being dependent on that for the end-products. The number of customers is small and their decisions are rational. Estonian peat producers also have to face the fact that the production needs to be marketed mostly abroad. In considering the product strategy, the conclusion was that with peat the least-cost strategy is easily applicable. Possibilities for differentiation are almost nonexistent (except in the case of packaging or transportation services). Possibilities will widen when the production of potting soils is launched. Most Estonian peat firms sell peat and products thereof through foreign wholesalers, some of them render also transportation services and this is well

  10. Targeting lipid metabolism of cancer cells: A promising therapeutic strategy for cancer.

    Science.gov (United States)

    Liu, Qiuping; Luo, Qing; Halim, Alexander; Song, Guanbin

    2017-08-10

    One of the most important metabolic hallmarks of cancer cells is deregulation of lipid metabolism. In addition, enhanced de novo fatty acid (FA) synthesis, increased lipid uptake and lipolysis have also been considered as means of FA acquisition in cancer cells. FAs are involved in various aspects of tumourigenesis and tumour progression; therefore, targeting lipid metabolism is a promising therapeutic strategy for human cancer. Recent studies have shown that reprogramming of lipid metabolism plays important roles in providing energy, macromolecules for membrane synthesis, and lipid signals during cancer progression. Moreover, accumulation of lipid droplets in cancer cells acts as a pivotal adaptive response to harmful conditions. Here, we provide a brief review of the crucial roles of FA metabolism in cancer development, placing emphasis on FA origin, utilization and storage in cancer cells. Understanding the regulation of lipid metabolism in cancer cells has important implications for exploring new therapeutic strategies for the management and treatment of cancer. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. The universal connection and metrics on moduli spaces

    International Nuclear Information System (INIS)

    Massamba, Fortune; Thompson, George

    2003-11-01

    We introduce a class of metrics on gauge theoretic moduli spaces. These metrics are made out of the universal matrix that appears in the universal connection construction of M. S. Narasimhan and S. Ramanan. As an example we construct metrics on the c_2 = 1 SU(2) moduli space of instantons on R^4 for various universal matrices. (author)

  12. Reproducibility of graph metrics in fMRI networks

    Directory of Open Access Journals (Sweden)

    Qawi K Telesford

    2010-12-01

    Full Text Available The reliability of graph metrics calculated in network analysis is essential to the interpretation of complex network organization. These graph metrics are used to deduce the small-world properties of networks. In this study, we investigated the test-retest reliability of graph metrics from functional magnetic resonance imaging (fMRI) data collected for two runs in 45 healthy older adults. Graph metrics were calculated on data for both runs and compared using intraclass correlation coefficient (ICC) statistics and Bland-Altman (BA) plots. ICC scores describe the level of absolute agreement between two measurements and provide a measure of reproducibility. For mean graph metrics, ICC scores were high for clustering coefficient (ICC=0.86), global efficiency (ICC=0.83), path length (ICC=0.79), and local efficiency (ICC=0.75); the ICC score for degree was found to be low (ICC=0.29). ICC scores were also used to generate reproducibility maps in brain space to test voxel-wise reproducibility for unsmoothed and smoothed data. Reproducibility was uniform across the brain for global efficiency and path length, but was only high in network hubs for clustering coefficient, local efficiency and degree. BA plots were used to test the measurement repeatability of all graph metrics. All graph metrics fell within the limits for repeatability. Together, these results suggest that, with the exception of degree, mean graph metrics are reproducible and suitable for clinical studies. Further exploration is warranted to better understand reproducibility across the brain on a voxel-wise basis.
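
    The Bland-Altman repeatability check used in the study can be sketched on synthetic test-retest data (the values below are illustrative, not the study's measurements): the bias is the mean run-to-run difference, and the limits of agreement are bias ± 1.96 standard deviations of the differences.

```python
import numpy as np

def bland_altman(run1, run2):
    """Bland-Altman limits of agreement for test-retest data:
    bias = mean difference, limits = bias +/- 1.96 * SD(differences)."""
    diff = np.asarray(run1) - np.asarray(run2)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Synthetic per-subject values of one graph metric (e.g. clustering
# coefficient) for two runs; 45 subjects, as in the study.
rng = np.random.default_rng(1)
true = rng.uniform(0.3, 0.6, size=45)
run1 = true + rng.normal(0, 0.01, size=45)   # run-to-run noise
run2 = true + rng.normal(0, 0.01, size=45)

bias, (lo, hi) = bland_altman(run1, run2)
inside = np.mean((run1 - run2 >= lo) & (run1 - run2 <= hi))
print(f"bias={bias:.4f}, limits=({lo:.4f}, {hi:.4f}), within limits: {inside:.0%}")
```

A metric whose differences sit within narrow limits around a near-zero bias is considered repeatable, which is the criterion the abstract reports all graph metrics as meeting.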

  13. Metrics to assess ecological condition, change, and impacts in sandy beach ecosystems.

    Science.gov (United States)

    Schlacher, Thomas A; Schoeman, David S; Jones, Alan R; Dugan, Jenifer E; Hubbard, David M; Defeo, Omar; Peterson, Charles H; Weston, Michael A; Maslo, Brooke; Olds, Andrew D; Scapini, Felicita; Nel, Ronel; Harris, Linda R; Lucrezi, Serena; Lastra, Mariano; Huijbers, Chantal M; Connolly, Rod M

    2014-11-01

    Complexity is increasingly the hallmark in environmental management practices of sandy shorelines. This arises primarily from meeting growing public demands (e.g., real estate, recreation) whilst reconciling economic demands with expectations of coastal users who have modern conservation ethics. Ideally, shoreline management is underpinned by empirical data, but selecting ecologically-meaningful metrics to accurately measure the condition of systems, and the ecological effects of human activities, is a complex task. Here we construct a framework for metric selection, considering six categories of issues that authorities commonly address: erosion; habitat loss; recreation; fishing; pollution (litter and chemical contaminants); and wildlife conservation. Possible metrics were scored in terms of their ability to reflect environmental change, and against criteria that are widely used for judging the performance of ecological indicators (i.e., sensitivity, practicability, costs, and public appeal). From this analysis, four types of broadly applicable metrics that also performed very well against the indicator criteria emerged: 1.) traits of bird populations and assemblages (e.g., abundance, diversity, distributions, habitat use); 2.) breeding/reproductive performance sensu lato (especially relevant for birds and turtles nesting on beaches and in dunes, but equally applicable to invertebrates and plants); 3.) population parameters and distributions of vertebrates associated primarily with dunes and the supralittoral beach zone (traditionally focused on birds and turtles, but expandable to mammals); 4.) compound measurements of the abundance/cover/biomass of biota (plants, invertebrates, vertebrates) at both the population and assemblage level. Local constraints (i.e., the absence of birds in highly degraded urban settings or lack of dunes on bluff-backed beaches) and particular issues may require alternatives. Metrics - if selected and applied correctly - provide

  14. Complexity Metrics for Workflow Nets

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M.P.

    2009-01-01

    analysts have difficulties grasping the dynamics implied by a process model. Recent empirical studies show that people make numerous errors when modeling complex business processes, e.g., about 20 percent of the EPCs in the SAP reference model have design flaws resulting in potential deadlocks, livelocks......, etc. It seems obvious that the complexity of the model contributes to design errors and a lack of understanding. It is not easy to measure complexity, however. This paper presents three complexity metrics that have been implemented in the process analysis tool ProM. The metrics are defined...... for a subclass of Petri nets named Workflow nets, but the results can easily be applied to other languages. To demonstrate the applicability of these metrics, we have applied our approach and tool to 262 relatively complex Protos models made in the context of various student projects. This allows us to validate...

  15. Sustainability Metrics: The San Luis Basin Project

    Science.gov (United States)

    Sustainability is about promoting humanly desirable dynamic regimes of the environment. Metrics: ecological footprint, net regional product, exergy, emergy, and Fisher Information. Adaptive management: (1) metrics assess problem, (2) specific problem identified, and (3) managemen...

  16. Goedel-type metrics in various dimensions

    International Nuclear Information System (INIS)

    Guerses, Metin; Karasu, Atalay; Sarioglu, Oezguer

    2005-01-01

    Goedel-type metrics are introduced and used in producing charged dust solutions in various dimensions. The key ingredient is a (D - 1)-dimensional Riemannian geometry which is then employed in constructing solutions to the Einstein-Maxwell field equations with a dust distribution in D dimensions. The only essential field equation in the procedure turns out to be the source-free Maxwell's equation in the relevant background. Similarly the geodesics of this type of metric are described by the Lorentz force equation for a charged particle in the lower dimensional geometry. It is explicitly shown with several examples that Goedel-type metrics can be used in obtaining exact solutions to various supergravity theories and in constructing spacetimes that contain both closed timelike and closed null curves and that contain neither of these. Among the solutions that can be established using non-flat backgrounds, such as the Tangherlini metrics in (D - 1)-dimensions, there exists a class which can be interpreted as describing black-hole-type objects in a Goedel-like universe

  17. Standardised metrics for global surgical surveillance.

    Science.gov (United States)

    Weiser, Thomas G; Makary, Martin A; Haynes, Alex B; Dziekan, Gerald; Berry, William R; Gawande, Atul A

    2009-09-26

    Public health surveillance relies on standardised metrics to evaluate disease burden and health system performance. Such metrics have not been developed for surgical services despite increasing volume, substantial cost, and high rates of death and disability associated with surgery. The Safe Surgery Saves Lives initiative of WHO's Patient Safety Programme has developed standardised public health metrics for surgical care that are applicable worldwide. We assembled an international panel of experts to develop and define metrics for measuring the magnitude and effect of surgical care in a population, while taking into account economic feasibility and practicability. This panel recommended six measures for assessing surgical services at a national level: number of operating rooms, number of operations, number of accredited surgeons, number of accredited anaesthesia professionals, day-of-surgery death ratio, and postoperative in-hospital death ratio. We assessed the feasibility of gathering such statistics at eight diverse hospitals in eight countries and incorporated them into the WHO Guidelines for Safe Surgery, in which methods for data collection, analysis, and reporting are outlined.
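The two outcome measures among the six lend themselves to direct computation from routinely collected counts. The sketch below is an illustration only: the scaling conventions (per 100,000 and percent) and all counts are assumptions, not the denominators defined in the WHO guidelines.

```python
# Hypothetical sketch of the two death-ratio measures, computed from annual
# counts. Scaling factors are illustrative assumptions, not WHO definitions.

def day_of_surgery_death_ratio(deaths_day_of_surgery, operations, per=100_000):
    """Deaths on the day of surgery per `per` operations."""
    return per * deaths_day_of_surgery / operations

def postoperative_inhospital_death_ratio(postop_deaths, operations, per=100):
    """In-hospital deaths after surgery, here expressed as a percentage."""
    return per * postop_deaths / operations

operations = 250_000  # hypothetical annual national volume
print(day_of_surgery_death_ratio(50, operations))               # 20.0 per 100,000
print(postoperative_inhospital_death_ratio(1_250, operations))  # 0.5 percent
```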

  18. Developing a Security Metrics Scorecard for Healthcare Organizations.

    Science.gov (United States)

    Elrefaey, Heba; Borycki, Elizabeth; Kushniruk, Andrea

    2015-01-01

    In healthcare, information security is a key aspect of protecting a patient's privacy and ensuring systems availability to support patient care. Security managers need to measure the performance of security systems and this can be achieved by using evidence-based metrics. In this paper, we describe the development of an evidence-based security metrics scorecard specific to healthcare organizations. Study participants were asked to comment on the usability and usefulness of a prototype of a security metrics scorecard that was developed based on current research in the area of general security metrics. Study findings revealed that scorecards need to be customized for the healthcare setting in order for the security information to be useful and usable in healthcare organizations. The study findings resulted in the development of a security metrics scorecard that matches the healthcare security experts' information requirements.

  19. Landscape pattern metrics and regional assessment

    Science.gov (United States)

    O'Neill, R. V.; Riitters, K.H.; Wickham, J.D.; Jones, K.B.

    1999-01-01

    The combination of remote imagery data, geographic information systems software, and landscape ecology theory provides a unique basis for monitoring and assessing large-scale ecological systems. The unique feature of the work has been the need to develop and interpret quantitative measures of spatial pattern: the landscape indices. This article reviews what is known about the statistical properties of these pattern metrics and suggests some additional metrics based on island biogeography, percolation theory, hierarchy theory, and economic geography. Assessment applications of this approach have required interpreting the pattern metrics in terms of specific environmental endpoints, such as wildlife and water quality, and research into how to represent synergistic effects of many overlapping sources of stress.

  20. Systems resilience for multihazard environments: definition, metrics, and valuation for decision making.

    Science.gov (United States)

    Ayyub, Bilal M

    2014-02-01

    The United Nations Office for Disaster Risk Reduction reported that the 2011 natural disasters, including the earthquake and tsunami that struck Japan, resulted in $366 billion in direct damages and 29,782 fatalities worldwide. Storms and floods accounted for up to 70% of the 302 natural disasters worldwide in 2011, with earthquakes producing the greatest number of fatalities. Average annual losses in the United States amount to about $55 billion. Enhancing community and system resilience could lead to massive savings through risk reduction and expeditious recovery. The rational management of such reduction and recovery is facilitated by an appropriate definition of resilience and associated metrics. In this article, a resilience definition is provided that meets a set of requirements with clear relationships to the metrics of the relevant abstract notions of reliability and risk. Those metrics also meet logically consistent requirements drawn from measure theory, and provide a sound basis for the development of effective decision-making tools for multihazard environments. Improving the resiliency of a system to meet target levels requires the examination of system enhancement alternatives in economic terms, within a decision-making framework. Relevant decision analysis methods would typically require the examination of resilience based on its valuation by society at large. The article provides methods for valuation and benefit-cost analysis based on concepts from risk analysis and management. © 2013 Society for Risk Analysis.
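The benefit-cost screening the abstract describes can be illustrated with a minimal sketch: compare enhancement alternatives by the ratio of expected risk reduction to enhancement cost. All alternative names and dollar figures below are hypothetical, and this simplification omits the valuation and discounting machinery the article develops.

```python
# Hedged sketch: benefit-cost screening of resilience-enhancement
# alternatives. Risks are expected losses; all numbers are hypothetical.

def benefit_cost_ratio(risk_before, risk_after, cost):
    """Expected-loss reduction per unit cost; B/C > 1 favors the alternative."""
    return (risk_before - risk_after) / cost

alternatives = {
    "harden_levee":  benefit_cost_ratio(risk_before=55e9, risk_after=40e9, cost=5e9),
    "early_warning": benefit_cost_ratio(risk_before=55e9, risk_after=50e9, cost=1e9),
}
best = max(alternatives, key=alternatives.get)
print(best, alternatives[best])  # early_warning 5.0
```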

  1. Self-benchmarking Guide for Cleanrooms: Metrics, Benchmarks, Actions

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul; Sartor, Dale; Tschudi, William

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.
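A whole-building metric of the kind the guide benchmarks can be sketched as an energy use intensity (EUI) compared against peer benchmarks to suggest an action. The benchmark values, thresholds, and suggested action below are hypothetical placeholders, not figures from the Labs21 database.

```python
# Illustrative sketch (not from the guide itself): a whole-building
# energy-use-intensity metric with hypothetical peer benchmarks.

def energy_use_intensity(annual_kwh, floor_area_sqft):
    """Site EUI in kWh per square foot per year."""
    return annual_kwh / floor_area_sqft

BENCHMARKS = {"good": 30.0, "typical": 60.0}  # hypothetical kWh/sqft-yr values

eui = energy_use_intensity(annual_kwh=2_400_000, floor_area_sqft=50_000)  # 48.0
action = "review HVAC and plug loads" if eui > BENCHMARKS["good"] else "maintain"
print(eui, action)
```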

  2. Strategies for enhancing the implementation of school-based policies or practices targeting risk factors for chronic disease.

    Science.gov (United States)

    Wolfenden, Luke; Nathan, Nicole K; Sutherland, Rachel; Yoong, Sze Lin; Hodder, Rebecca K; Wyse, Rebecca J; Delaney, Tessa; Grady, Alice; Fielding, Alison; Tzelepis, Flora; Clinton-McHarg, Tara; Parmenter, Benjamin; Butler, Peter; Wiggers, John; Bauman, Adrian; Milat, Andrew; Booth, Debbie; Williams, Christopher M

    2017-11-29

    consulted with experts in the field to identify other relevant research. 'Implementation' was defined as the use of strategies to adopt and integrate evidence-based health interventions and to change practice patterns within specific settings. We included any trial (randomised or non-randomised) conducted at any scale, with a parallel control group that compared a strategy to implement policies or practices to address diet, physical activity, overweight or obesity, tobacco or alcohol use by school staff to 'no intervention', 'usual' practice or a different implementation strategy. Citation screening, data extraction and assessment of risk of bias were performed by review authors in pairs. Disagreements between review authors were resolved via consensus, or if required, by a third author. Considerable trial heterogeneity precluded meta-analysis. We narratively synthesised trial findings by describing the effect size of the primary outcome measure for policy or practice implementation (or the median of such measures where a single primary outcome was not stated). We included 27 trials, 18 of which were conducted in the USA. Nineteen studies employed randomised controlled trial (RCT) designs. Fifteen trials tested strategies to implement healthy eating policies, practice or programs; six trials tested strategies targeting physical activity policies or practices; and three trials targeted tobacco policies or practices. Three trials targeted a combination of risk factors. None of the included trials sought to increase the implementation of interventions to delay initiation or reduce the consumption of alcohol. All trials examined multi-strategic implementation strategies and no two trials examined the same combinations of implementation strategies. The most common implementation strategies included educational materials, educational outreach and educational meetings. 
For all outcomes, the overall quality of evidence was very low and the risk of bias was high for the majority of

  3. Clinical and radiographic outcome of a treat-to-target strategy using methotrexate and intra-articular glucocorticoids with or without adalimumab induction

    DEFF Research Database (Denmark)

    Hørslev-Petersen, K; Hetland, M L; Ørnbjerg, L M

    2015-01-01

    OBJECTIVES: To study clinical and radiographic outcomes after withdrawing 1 year's adalimumab induction therapy for early rheumatoid arthritis (eRA) added to a methotrexate and intra-articular triamcinolone hexacetonide treat-to-target strategy (NCT00660647). METHODS: Disease-modifying antirheumatic... Erosive progression (Δerosion score (ES)/year) was year 1: 0.57/0.06 (p=0.02); year 2: 0.38/0.05 (p=0.005). Proportion of patients without erosive progression (ΔES≤0) was year 1: 59%/76% (p=0.03); year 2: 64%/79% (p=0.04). CONCLUSIONS: An aggressive triamcinolone and synthetic DMARD treat-to-target strategy... ...was (re)initiated in 12/12 patients and cumulative triamcinolone dose was 160/120 mg (p=0.15). The treatment target (disease activity score, 4 variables, C-reactive protein (DAS28CRP) ≤3.2 or DAS28>3.2 without swollen joints) was achieved at all visits in ≥85% of patients in year 2; remission rates were...

  4. Metrics Are Needed for Collaborative Software Development

    Directory of Open Access Journals (Sweden)

    Mojgan Mohtashami

    2011-10-01

    There is a need for metrics for inter-organizational collaborative software development projects, encompassing management and technical concerns. In particular, metrics are needed that are aimed at the collaborative aspect itself, such as readiness for collaboration, the quality and/or the costs and benefits of collaboration in a specific ongoing project. We suggest questions and directions for such metrics, spanning the full lifespan of a collaborative project, from considering the suitability of collaboration through evaluating ongoing projects to final evaluation of the collaboration.

  5. Predicting class testability using object-oriented metrics

    OpenAIRE

    Bruntink, Magiel; Deursen, Arie

    2004-01-01

    In this paper we investigate factors of the testability of object-oriented software systems. The starting point is given by a study of the literature to obtain both an initial model of testability and existing OO metrics related to testability. Subsequently, these metrics are evaluated by means of two case studies of large Java systems for which JUnit test cases exist. The goal of this paper is to define and evaluate a set of metrics that can be used to assess the testability of t...

  6. Software metrics a rigorous and practical approach

    CERN Document Server

    Fenton, Norman

    2014-01-01

    A Framework for Managing, Measuring, and Predicting Attributes of Software Development Products and Processes. Reflecting the immense progress in the development and use of software metrics in the past decades, Software Metrics: A Rigorous and Practical Approach, Third Edition provides an up-to-date, accessible, and comprehensive introduction to software metrics. Like its popular predecessors, this third edition discusses important issues, explains essential concepts, and offers new approaches for tackling long-standing problems. New to the Third Edition: This edition contains new material relevant

  7. Hermitian-Einstein metrics on parabolic stable bundles

    International Nuclear Information System (INIS)

    Li Jiayu; Narasimhan, M.S.

    1995-12-01

    Let M-bar be a compact complex manifold of complex dimension two with a smooth Kaehler metric and D a smooth divisor on M-bar. If E is a rank 2 holomorphic vector bundle on M-bar with a stable parabolic structure along D, we prove the existence of a metric on E' = E restricted to M-bar\D (compatible with the parabolic structure) which is Hermitian-Einstein with respect to the restriction of the Kaehler metric to M-bar\D. A converse is also proved. (author). 24 refs

  8. Recovery and Resource Allocation Strategies to Maximize Mobile Network Survivability by Using Game Theories and Optimization Techniques

    Directory of Open Access Journals (Sweden)

    Pei-Yu Chen

    2013-01-01

    With more and more mobile device users, an increasingly important and critical issue is how to efficiently evaluate mobile network survivability. In this paper, a novel metric called the Average Degree of Disconnectivity (Average DOD) is proposed, in which probabilities are calculated using the contest success function. The Average DOD metric is used to evaluate the damage degree of the network: the larger its value, the greater the damage to the network. A multiround network attack-defense scenario, formulated as a mathematical model, is used to support network operators in predicting the strategies that both the cyber attacker and the network defender would likely take. In each round, the attacker could use its attack resources to launch attacks on the nodes of the target network. Meanwhile, the network defender could reallocate its existing resources to recover compromised nodes and allocate defense resources to protect the surviving nodes of the network. To solve this problem, the gradient method and game theory are adopted to find the optimal resource allocation strategies for both the cyber attacker and the mobile network defender.

  9. ARM Data-Oriented Metrics and Diagnostics Package for Climate Model Evaluation Value-Added Product

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Chengzhu [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Xie, Shaocheng [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-10-15

    A Python-based metrics and diagnostics package is currently being developed by the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Infrastructure Team at Lawrence Livermore National Laboratory (LLNL) to facilitate the use of long-term, high-frequency measurements from the ARM Facility in evaluating the regional climate simulation of clouds, radiation, and precipitation. This metrics and diagnostics package computes climatological means of targeted climate model simulations and generates tables and plots for comparing the model simulation with ARM observational data. The Coupled Model Intercomparison Project (CMIP) model data sets are also included in the package to enable model intercomparison as demonstrated in Zhang et al. (2017). The mean of the CMIP models can serve as a reference for individual models. Basic performance metrics are computed to measure the accuracy of the mean state and variability of climate models. The evaluated physical quantities include cloud fraction, temperature, relative humidity, cloud liquid water path, total column water vapor, precipitation, sensible and latent heat fluxes, and radiative fluxes, with plans to extend to more fields, such as aerosol and microphysics properties. Process-oriented diagnostics focusing on individual cloud- and precipitation-related phenomena are also being developed for the evaluation and development of specific model physical parameterizations. The version 1.0 package is designed based on data collected at ARM's Southern Great Plains (SGP) Research Facility, with the plan to extend to other ARM sites. The metrics and diagnostics package is currently built upon standard Python libraries and additional Python packages developed by DOE (such as CDMS and CDAT). The ARM metrics and diagnostics package is available publicly with the hope that it can serve as an easy entry point for climate modelers to compare their models with ARM data. In this report, we first present the input data, which

  10. Coverage Metrics for Model Checking

    Science.gov (United States)

    Penix, John; Visser, Willem; Norvig, Peter (Technical Monitor)

    2001-01-01

    When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.

  11. A pre-protective strategy for precise tumor targeting and efficient photodynamic therapy with a switchable DNA/upconversion nanocomposite.

    Science.gov (United States)

    Yu, Zhengze; Ge, Yegang; Sun, Qiaoqiao; Pan, Wei; Wan, Xiuyan; Li, Na; Tang, Bo

    2018-04-14

    Tumor-specific targeting based on folic acid (FA) is one of the most common and significant approaches in cancer therapy. However, the expression of folate receptors (FRs) in normal tissues will lead to unexpected targeting and unsatisfactory therapeutic effect. To address this issue, we develop a pre-protective strategy for precise tumor targeting and efficient photodynamic therapy (PDT) using a switchable DNA/upconversion nanocomposite, which can be triggered in the acidic tumor microenvironment. The DNA/upconversion nanocomposite is composed of polyacrylic acid (PAA) coated upconversion nanoparticles (UCNPs), the surface of which is modified using FA and chlorin e6 (Ce6) functionalized DNA sequences of different lengths. Initially, FA on the shorter DNA was protected by a longer DNA to prevent it from binding to FRs on normal cells. Once reaching the acidic tumor microenvironment, the C base-rich longer DNA forms a C-quadruplex, resulting in the exposure of the FA groups and the binding of FA to FRs on cancer cell membranes to achieve precise targeting. Simultaneously, the photosensitizer chlorin e6 (Ce6) gets close to the surface of the UCNPs, enabling the excitation of Ce6 to generate singlet oxygen (¹O₂) under near infrared light via Förster resonance energy transfer (FRET). In vivo experiments indicated that higher tumor targeting efficiency was achieved and tumor growth was greatly inhibited through the pre-protective strategy.

  12. PRICE-LEVEL TARGETING – A VIABLE ALTERNATIVE TO INFLATION TARGETING?

    OpenAIRE

    Iulian Vasile Popescu

    2012-01-01

    The recent financial crisis, which led some central banks that had reached the zero lower bound of their interest rates to use unconventional monetary policy instruments, has brought to the forefront academic discussion of the shift from inflation targeting (IT) to price-level targeting. This paper provides a comparative analysis of the IT strategy and price-level targeting, assesses the implications and highlights the challenges of an eventual transition to a new monetary policy strategy. Bala...

  13. Model assessment using a multi-metric ranking technique

    Science.gov (United States)

    Fitzpatrick, P. J.; Lau, Y.; Alaka, G.; Marks, F.

    2017-12-01

    Validation comparisons of multiple models present challenges when skill levels are similar, especially in regimes dominated by the climatological mean. Assessing skill separation requires advanced validation metrics and identifying adeptness in extreme events, while maintaining simplicity for management decisions. Flexibility for operations is also an asset. This work postulates a weighted tally and consolidation technique which ranks results by multiple types of metrics. Variables include absolute error, bias, acceptable absolute error percentages, outlier metrics, model efficiency, Pearson correlation, Kendall's tau, reliability index, multiplicative gross error, and root mean squared differences. Other metrics, such as root mean square difference and rank correlation, were also explored but removed when their information was found to be largely duplicative of other metrics. While equal weights are applied, the weights could be altered depending on preferred metrics. Two examples are shown comparing ocean models' currents and tropical cyclone products, including experimental products. The importance of using magnitude and direction for tropical cyclone track forecasts, instead of distance, along-track, and cross-track errors, is discussed. Tropical cyclone intensity and structure prediction are also assessed. Vector correlations are not included in the ranking process, but were found useful in an independent context, and will be briefly reported.
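The weighted tally-and-consolidation idea can be sketched in a few lines: rank the models on each metric separately, then sum the (optionally weighted) ranks. The metric values, model names, and equal weighting below are hypothetical; the paper's actual metric set and weighting scheme are richer.

```python
# Minimal sketch of a weighted rank tally across several validation metrics.
# Model names and metric values are hypothetical.

def consolidated_ranks(scores, weights=None, lower_is_better=()):
    """scores: {metric: {model: value}}. Returns total weighted rank per
    model (lower total = better). Metrics in lower_is_better rank ascending."""
    totals = {}
    for metric, vals in scores.items():
        reverse = metric not in lower_is_better  # high value = good unless noted
        ordered = sorted(vals, key=vals.get, reverse=reverse)
        w = (weights or {}).get(metric, 1.0)    # equal weights by default
        for rank, model in enumerate(ordered, start=1):
            totals[model] = totals.get(model, 0.0) + w * rank
    return totals

scores = {
    "abs_error":   {"A": 1.2, "B": 0.9, "C": 1.5},   # lower is better
    "correlation": {"A": 0.80, "B": 0.85, "C": 0.70},
}
totals = consolidated_ranks(scores, lower_is_better={"abs_error"})
best = min(totals, key=totals.get)
print(best, totals)  # B {'B': 2.0, 'A': 4.0, 'C': 6.0}
```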

  14. Special metrics and group actions in geometry

    CERN Document Server

    Fino, Anna; Musso, Emilio; Podestà, Fabio; Vezzoni, Luigi

    2017-01-01

    The volume is a follow-up to the INdAM meeting “Special metrics and quaternionic geometry” held in Rome in November 2015. It offers a panoramic view of a selection of cutting-edge topics in differential geometry, including 4-manifolds, quaternionic and octonionic geometry, twistor spaces, harmonic maps, spinors, complex and conformal geometry, homogeneous spaces and nilmanifolds, special geometries in dimensions 5–8, gauge theory, symplectic and toric manifolds, exceptional holonomy and integrable systems. The workshop was held in honor of Simon Salamon, a leading international scholar at the forefront of academic research who has made significant contributions to all these subjects. The articles published here represent a compelling testimony to Salamon’s profound and longstanding impact on the mathematical community. Target readership includes graduate students and researchers working in Riemannian and complex geometry, Lie theory and mathematical physics.

  15. Metric learning for DNA microarray data analysis

    International Nuclear Information System (INIS)

    Takeuchi, Ichiro; Nakagawa, Masao; Seto, Masao

    2009-01-01

    In many microarray studies, gene set selection is an important preliminary step for subsequent main tasks such as tumor classification, cancer subtype identification, etc. In this paper, we investigate the possibility of using metric learning as an alternative to gene set selection. We develop a simple metric learning algorithm with the aim of using it for microarray data analysis. Exploiting a property of the algorithm, we introduce a novel approach for extending the metric learning to be adaptive. We apply the algorithm to previously studied microarray data on malignant lymphoma subtype identification.

  16. Metrical expectations from preceding prosody influence perception of lexical stress.

    Science.gov (United States)

    Brown, Meredith; Salverda, Anne Pier; Dilley, Laura C; Tanenhaus, Michael K

    2015-04-01

    Two visual-world experiments tested the hypothesis that expectations based on preceding prosody influence the perception of suprasegmental cues to lexical stress. The results demonstrate that listeners' consideration of competing alternatives with different stress patterns (e.g., 'jury/gi'raffe) can be influenced by the fundamental frequency and syllable timing patterns across material preceding a target word. When preceding stressed syllables distal to the target word shared pitch and timing characteristics with the first syllable of the target word, pictures of alternatives with primary lexical stress on the first syllable (e.g., jury) initially attracted more looks than alternatives with unstressed initial syllables (e.g., giraffe). This effect was modulated when preceding unstressed syllables had pitch and timing characteristics similar to the initial syllable of the target word, with more looks to alternatives with unstressed initial syllables (e.g., giraffe) than to those with stressed initial syllables (e.g., jury). These findings suggest that expectations about the acoustic realization of upcoming speech include information about metrical organization and lexical stress and that these expectations constrain the initial interpretation of suprasegmental stress cues. These distal prosody effects implicate online probabilistic inferences about the sources of acoustic-phonetic variation during spoken-word recognition. (c) 2015 APA, all rights reserved.

  17. Development of a perceptually calibrated objective metric of noise

    Science.gov (United States)

    Keelan, Brian W.; Jin, Elaine W.; Prokushkin, Sergey

    2011-01-01

    A system simulation model was used to create scene-dependent noise masks that reflect current performance of mobile phone cameras. Stimuli with different overall magnitudes of noise and with varying mixtures of red, green, blue, and luminance noises were included in the study. Eleven treatments in each of ten pictorial scenes were evaluated by twenty observers using the softcopy ruler method. In addition to determining the quality loss function in just noticeable differences (JNDs) for the average observer and scene, transformations for different combinations of observer sensitivity and scene susceptibility were derived. The psychophysical results were used to optimize an objective metric of isotropic noise based on system noise power spectra (NPS), which were integrated over a visual frequency weighting function to yield perceptually relevant variances and covariances in CIE L*a*b* space. Because the frequency weighting function is expressed in terms of cycles per degree at the retina, it accounts for display pixel size and viewing distance effects, so application-specific predictions can be made. Excellent results were obtained using only L* and a* variances and L*a* covariance, with relative weights of 100, 5, and 12, respectively. The positive a* weight suggests that the luminance (photopic) weighting is slightly narrow on the long wavelength side for predicting perceived noisiness. The L*a* covariance term, which is normally negative, reflects masking between L* and a* noise, as confirmed in informal evaluations. Test targets in linear sRGB and rendered L*a*b* spaces for each treatment are available at http://www.aptina.com/ImArch/ to enable other researchers to test metrics of their own design and calibrate them to JNDs of quality loss without performing additional observer experiments. Such JND-calibrated noise metrics are particularly valuable for comparing the impact of noise and other attributes, and for computing overall image quality.
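The final combination step reported in the abstract can be sketched directly: a scalar noise metric from visually weighted CIE L*a*b* noise statistics, using the reported relative weights (100 for the L* variance, 5 for the a* variance, 12 for the L*a* covariance). The spectral weighting and NPS integration that produce those (co)variances are omitted here, and the toy samples are hypothetical.

```python
# Sketch of the abstract's reported metric combination:
# 100 * var(L*) + 5 * var(a*) + 12 * cov(L*, a*).
# Inputs are assumed to be noise samples AFTER visual frequency weighting.
import statistics

def objective_noise_metric(L_noise, a_noise):
    """Scalar noise metric from paired, visually weighted L* and a* samples."""
    var_L = statistics.pvariance(L_noise)
    var_a = statistics.pvariance(a_noise)
    mean_L = statistics.fmean(L_noise)
    mean_a = statistics.fmean(a_noise)
    cov_La = statistics.fmean(
        (l - mean_L) * (a - mean_a) for l, a in zip(L_noise, a_noise)
    )
    return 100 * var_L + 5 * var_a + 12 * cov_La

print(objective_noise_metric([0.0, 2.0], [0.0, 2.0]))  # toy samples: 117.0
```

Note that with perfectly correlated L* and a* noise the covariance term adds to the metric, consistent with the abstract's observation that a negative L*a* covariance (masking) reduces perceived noisiness.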

  18. Incorporation of aptamers in the terminal loop of shRNAs yields an effective and novel combinatorial targeting strategy.

    Science.gov (United States)

    Pang, Ka Ming; Castanotto, Daniela; Li, Haitang; Scherer, Lisa; Rossi, John J

    2018-01-09

    Gene therapy by engineering patient's own blood cells to confer HIV resistance can potentially lead to a functional cure for AIDS. Toward this goal, we have previously developed an anti-HIV lentivirus vector that deploys a combination of shRNA, ribozyme and RNA decoy. To further improve this therapeutic vector against viral escape, we sought an additional reagent to target HIV integrase. Here, we report the development of a new strategy for selection and expression of aptamer for gene therapy. We developed a SELEX protocol (multi-tag SELEX) for selecting RNA aptamers against proteins with low solubility or stability, such as integrase. More importantly, we expressed these aptamers in vivo by incorporating them in the terminal loop of shRNAs. This novel strategy allowed efficient expression of the shRNA-aptamer fusions that targeted RNAs and proteins simultaneously. Expressed shRNA-aptamer fusions targeting HIV integrase or reverse transcriptase inhibited HIV replication in cell cultures. Viral inhibition was further enhanced by combining an anti-integrase aptamer with an anti-HIV Tat-Rev shRNA. This construct exhibited efficacy comparable to that of integrase inhibitor Raltegravir. Our strategy for the selection and expression of RNA aptamers can potentially extend to other gene therapy applications. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  19. A bi-metric theory of gravitation

    International Nuclear Information System (INIS)

    Rosen, N.

    1975-01-01

    The bi-metric theory of gravitation proposed previously is simplified in that the auxiliary conditions are discarded, the two metric tensors being tied together only by means of the boundary conditions. Some of the properties of the field of a particle are investigated; there is no black hole, and it appears that no gravitational collapse can take place. Although the proposed theory and general relativity are at present observationally indistinguishable, some differences are pointed out which may some day be susceptible of observation. An alternative bi-metric theory is considered which gives for the precession of the perihelion 5/6 of the value given by general relativity; it seems less satisfactory than the present theory from the aesthetic point of view. (author)

  20. Comparative Study of Trace Metrics between Bibliometrics and Patentometrics

    Directory of Open Access Journals (Sweden)

    Fred Y. Ye

    2016-06-01

    Purpose: To comprehensively evaluate the overall performance of a group or an individual in both bibliometrics and patentometrics. Design/methodology/approach: Trace metrics were applied to the top 30 universities in the 2014 Academic Ranking of World Universities (ARWU) computer sciences ranking, the top 30 ESI highly cited papers in the computer sciences field in 2014, as well as the top 30 assignees and the top 30 most cited patents in the National Bureau of Economic Research (NBER) computer hardware and software category. Findings: We found that, by applying trace metrics, the research or marketing impact efficiency, at both group and individual levels, was clearly observed. Furthermore, trace metrics were more sensitive to the different publication-citation distributions than the average citation count and h-index were. Research limitations: Trace metrics considered publications with zero citations as negative contributions. One should clarify how one evaluates a zero-citation paper or patent before applying trace metrics. Practical implications: Decision makers could regularly examine the performance of their university/company by applying trace metrics and adjust their policies accordingly. Originality/value: Trace metrics can be applied in both bibliometrics and patentometrics and provide a comprehensive view. Moreover, the high sensitivity and unique impact-efficiency view provided by trace metrics can facilitate decision makers in examining and adjusting their policies.

  1. Metrics to assess injury prevention programs for young workers in high-risk occupations: a scoping review of the literature

    Directory of Open Access Journals (Sweden)

    Jennifer Smith

    2018-05-01

    Introduction: Despite legal protections for young workers in Canada, youth aged 15–24 are at high risk of traumatic occupational injury. While many injury prevention initiatives targeting young workers exist, the challenge faced by youth advocates and employers is deciding which aspect(s) of prevention will be the most effective focus for their efforts. A review of the academic and grey literatures was undertaken to compile the metrics—both the indicators being evaluated and the methods of measurement—commonly used to assess injury prevention programs for young workers. Metrics are standards of measurement through which efficiency, performance, progress, or quality of a plan, process, or product can be assessed. Methods: A PICO framework was used to develop search terms. Medline, PubMed, OVID, EMBASE, CCOHS, PsychINFO, CINAHL, NIOSHTIC, Google Scholar and the grey literature were searched for articles in English, published between 1975 and 2015. Two independent reviewers screened the resulting list and categorized the metrics into three domains of injury prevention: Education, Environment and Enforcement. Results: Of 174 acquired articles meeting the inclusion criteria, 21 both described and assessed an intervention. Half were educational in nature (N=11). Commonly assessed metrics included: knowledge, perceptions, self-reported behaviours or intentions, hazardous exposures, injury claims, and injury counts. One study outlined a method for developing metrics to predict injury rates. Conclusion: Metrics specific to the evaluation of young worker injury prevention programs are needed, as current metrics are insufficient to predict reduced injuries following program implementation. One study, which the review brought to light, could be an appropriate model for future research to develop valid leading metrics specific to young workers, and then apply these metrics to injury prevention programs for youth.

  2. A convergence theory for probabilistic metric spaces | Jäger ...

    African Journals Online (AJOL)

    We develop a theory of probabilistic convergence spaces based on Tardiff's neighbourhood systems for probabilistic metric spaces. We show that the resulting category is a topological universe and we characterize a subcategory that is isomorphic to the category of probabilistic metric spaces. Keywords: Probabilistic metric ...

  3. Species-Level Differences in Hyperspectral Metrics among Tropical Rainforest Trees as Determined by a Tree-Based Classifier

    Directory of Open Access Journals (Sweden)

    Dar A. Roberts

    2012-06-01

    This study explores a method to classify seven tropical rainforest tree species from full-range (400–2,500 nm) hyperspectral data acquired at tissue (leaf and bark), pixel and crown scales using laboratory and airborne sensors. Metrics that respond to vegetation chemistry and structure were derived using narrowband indices, derivative- and absorption-based techniques, and spectral mixture analysis. We then used the Random Forests tree-based classifier to discriminate species with minimally-correlated, importance-ranked metrics. At all scales, best overall accuracies were achieved with metrics derived from all four techniques that targeted chemical and structural properties across the visible to shortwave infrared spectrum (400–2,500 nm). For tissue spectra, overall accuracies were 86.8% for leaves, 74.2% for bark, and 84.9% for leaves plus bark. Variation in tissue metrics was best explained by an axis of red absorption related to photosynthetic leaves and an axis distinguishing bark water and other chemical absorption features. Overall accuracies for individual tree crowns were 71.5% for pixel spectra, 70.6% for crown-mean spectra, and 87.4% for a pixel-majority technique. At pixel and crown scales, tree structure and phenology at the time of image acquisition were important factors that determined species spectral separability.
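    The pixel-majority technique that achieved the best crown-scale accuracy (classify each pixel, then label the crown by its most common pixel class) can be sketched in a few lines. This is only an illustration of the aggregation step, not the study's Random Forests classifier; the species labels below are hypothetical.

```python
from collections import Counter

def crown_label_pixel_majority(pixel_labels):
    """Label a tree crown with the most common per-pixel species prediction
    (ties broken by the first-seen label)."""
    if not pixel_labels:
        raise ValueError("crown has no classified pixels")
    return Counter(pixel_labels).most_common(1)[0][0]

# hypothetical per-pixel predictions for one crown
print(crown_label_pixel_majority(["ceiba", "ceiba", "dipteryx", "ceiba", "dipteryx"]))  # prints ceiba
```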

  4. A new disaster victim identification management strategy targeting "near identification-threshold" cases: Experiences from the Boxing Day tsunami.

    Science.gov (United States)

    Wright, Kirsty; Mundorff, Amy; Chaseling, Janet; Forrest, Alexander; Maguire, Christopher; Crane, Denis I

    2015-05-01

    The international disaster victim identification (DVI) response to the Boxing Day tsunami, led by the Royal Thai Police in Phuket, Thailand, was one of the largest and most complex in DVI history. Referred to as the Thai Tsunami Victim Identification operation, the group comprised a multi-national, multi-agency, and multi-disciplinary team. The traditional DVI approach proved successful in identifying a large number of victims quickly. However, the team struggled to identify certain victims due to incomplete or poor quality ante-mortem and post-mortem data. In response to these challenges, a new 'near-threshold' DVI management strategy was implemented to target presumptive identifications and improve operational efficiency. The strategy was implemented by the DNA Team, so DNA kinship matches that just failed to reach the 99.9% reporting threshold were prioritized; the same approach could, however, be taken by targeting, for example, cases with partial fingerprint matches. The presumptive DNA identifications were progressively filtered through the Investigation, Dental and Fingerprint Teams to add the additional information necessary to either strengthen or conclusively exclude each identification. Over a five-month period 111 victims from ten countries were identified using this targeted approach. The new identifications comprised 87 adults and 24 children, and included 97 Thai locals. New data from the Fingerprint Team established nearly 60% of the total near-threshold identifications, and the combined DNA/physical method was responsible for over 30%. Implementing the new strategy, targeting near-threshold cases, had positive management implications: the process initiated additional ante-mortem information collections and established a much-needed, distinct "end-point" for unresolved cases. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  5. Indefinite metric fields and the renormalization group

    International Nuclear Information System (INIS)

    Sherry, T.N.

    1976-11-01

    The renormalization group equations are derived for the Green functions of an indefinite metric field theory. In these equations the mass dependence of the coefficient functions is retained, since in indefinite metric theories the masses cannot be neglected. The behavior of the effective coupling constant in the asymptotic and infrared limits is analyzed. The analysis is illustrated by means of a simple model incorporating indefinite metric fields. The model scales at first order, and at this order the effective coupling constant also has both ultraviolet and infrared fixed points, the former being the bare coupling constant.

  6. Kerr-Newman metric in deSitter background

    International Nuclear Information System (INIS)

    Patel, L.K.; Koppar, S.S.; Bhatt, P.V.

    1987-01-01

    In addition to the Kerr-Newman metric with cosmological constant, several other metrics are presented giving Kerr-Newman type solutions of the Einstein-Maxwell field equations in the background of the deSitter universe. The electromagnetic field in all the solutions is assumed to be source-free. A new metric of what may be termed an electrovac rotating deSitter space-time, a space-time devoid of matter but containing a source-free electromagnetic field and a null fluid with twisting rays, has been presented. In the absence of the electromagnetic field, these solutions reduce to those discussed by Vaidya (1984). 8 refs. (author)

  7. The independence of software metrics taken at different life-cycle stages

    Science.gov (United States)

    Kafura, D.; Canning, J.; Reddy, G.

    1984-01-01

    Over the past few years a large number of software metrics have been proposed and, in varying degrees, a number of these metrics have been subjected to empirical validation demonstrating their utility in the software development process. This work attempts to classify these metrics and to determine whether the metrics in different classes measure distinct attributes of the software product. Statistical analysis is used to determine the degree of relationship among the metrics.

  8. Thermodynamic metrics and optimal paths.

    Science.gov (United States)

    Sivak, David A; Crooks, Gavin E

    2012-05-11

    A fundamental problem in modern thermodynamics is how a molecular-scale machine performs useful work, while operating away from thermal equilibrium without excessive dissipation. To this end, we derive a friction tensor that induces a Riemannian manifold on the space of thermodynamic states. Within the linear-response regime, this metric structure controls the dissipation of finite-time transformations, and bestows optimal protocols with many useful properties. We discuss the connection to the existing thermodynamic length formalism, and demonstrate the utility of this metric by solving for optimal control parameter protocols in a simple nonequilibrium model.

  9. Invariant metrics for Hamiltonian systems

    International Nuclear Information System (INIS)

    Rangarajan, G.; Dragt, A.J.; Neri, F.

    1991-05-01

    In this paper, invariant metrics are constructed for Hamiltonian systems. These metrics give rise to norms on the space of homogeneous polynomials of phase-space variables. For an accelerator lattice described by a Hamiltonian, these norms characterize the nonlinear content of the lattice. Therefore, the performance of the lattice can be improved by minimizing the norm as a function of parameters describing the beam-line elements in the lattice. A four-fold increase in the dynamic aperture of a model FODO cell is obtained using this procedure. 7 refs

  10. Final Technical Report: Targeting DOE-Relevant Ions with Supramolecular Strategies, DE-SC0010555

    Energy Technology Data Exchange (ETDEWEB)

    Bowman-James, Kristin [Univ. of Kansas, Lawrence, KS (United States). Dept. of Chemistry

    2017-04-13

    The effectiveness of three popular supramolecular strategies to selectively target negatively charged ions (anions) was evaluated. Ions of interest included oxo anions, particularly sulfate, that hamper nuclear waste remediation. Three objectives were pursued using simple building-block strategies and by strategically placing anion-binding sites at appropriate positions on organic host molecules. The goal of the first objective was to assess the influence of secondary, tertiary and quaternized amines on binding tetrahedral anions, using mixed amide/amine macrocyclic and urea/amine hosts containing aromatic or heteroaromatic spacers. Objective 2 focused on the design of ion-pair hosts, using mixed macrocyclic anion hosts joined through polyether linkages. Objective 3 was to explore the synthesis of new metal-linked extended macrocyclic frameworks to leverage anion binding. Key findings were that smaller 24-membered macrocycles provided the most complementary binding for the sulfate ion, and that mixed urea/amine chelates showed enhanced binding over their amide analogues in addition to being highly selective for SO4^2- in the presence of small quantities of water. In addition to obtaining prototype metal-linked macrocyclic anion hosts, a new dipincer ligand was designed that can be used to link macrocyclic or other supramolecular hosts in extended frameworks. When the tetraamide-based pincers are bound to two metal ions, an interesting phenomenon occurs: upon deprotonation of the amides, two new protons appear between adjacent carbonyl pairs on the ligand, which may modify the chemistry and metal-metal interactions in the complexes. Gel formation occurred for some of these extended hosts, and the physical properties are currently under investigation. The new tetracarboxamide-based pincers can also provide basic frameworks for double macrocycles capable of binding ion pairs as well as for binding metal ions and exploring intermetallic interactions through

  11. Targeting iodothyronine deiodinases locally in the retina is a therapeutic strategy for retinal degeneration.

    Science.gov (United States)

    Yang, Fan; Ma, Hongwei; Belcher, Joshua; Butler, Michael R; Redmond, T Michael; Boye, Sanford L; Hauswirth, William W; Ding, Xi-Qin

    2016-12-01

    Recent studies have implicated thyroid hormone (TH) signaling in cone photoreceptor viability. Using mouse models of retinal degeneration, we found that antithyroid treatment preserves cones. This work investigates the significance of targeting intracellular TH components locally in the retina. The cellular TH level is mainly regulated by iodothyronine deiodinases (DIO)-2 and -3. DIO2 converts thyroxine (T4) to triiodothyronine (T3), which binds to the TH receptor, whereas DIO3 degrades T3 and T4. We examined cone survival after overexpression of DIO3 and inhibition of DIO2 and demonstrated the benefits of these manipulations. Subretinal delivery of AAV5-IRBP/GNAT2-DIO3, which directs expression of human DIO3 specifically in cones, increased cone density by 30-40% in the Rpe65 -/- mouse model of Leber congenital amaurosis (LCA) and in the Cpfl1 (Pde6c-defect) mouse model of achromatopsia, compared with their respective untreated controls. Intravitreal and topical delivery of the DIO2 inhibitor iopanoic acid also significantly improved cone survival in the LCA model mice. Moreover, the expression levels of DIO2 and Slc16a2 were significantly higher in the diseased retinas, suggesting locally elevated TH signaling. We show that targeting DIOs protects cones, and that intracellular inhibition of TH components locally in the retina may represent a novel strategy for the management of retinal degeneration.-Yang, F., Ma, H., Belcher, J., Butler, M. R., Redmond, T. M., Boye, S. L., Hauswirth, W. W., Ding, X.-Q. Targeting iodothyronine deiodinases locally in the retina is a therapeutic strategy for retinal degeneration. © FASEB.

  12. Kepler Planet Detection Metrics: Per-Target Detection Contours for Data Release 25

    Science.gov (United States)

    Burke, Christopher J.; Catanzarite, Joseph

    2017-01-01

    A necessary input to planet occurrence calculations is an accurate model for the pipeline completeness (Burke et al., 2015). This document describes the use of the Kepler planet occurrence rate products to calculate a per-target detection contour for the measured Data Release 25 (DR25) pipeline performance. A per-target detection contour measures, for a given combination of orbital period, Porb, and planet radius, Rp, what fraction of transit signals are recoverable by the Kepler pipeline (Twicken et al., 2016; Jenkins et al., 2017). The steps for calculating a detection contour follow the procedure outlined in Burke et al. (2015), but have been updated to provide improved accuracy enabled by the substantially larger database of transit injection and recovery tests that were performed on the final version (i.e., SOC 9.3) of the Kepler pipeline (Christiansen, 2017; Burke & Catanzarite, 2017a). In the following sections, we describe the main inputs to the per-target detection contour and provide a worked example of the python software released with this document (Kepler Planet Occurrence Rate Tools, KeplerPORTs) that illustrates the generation of a detection contour in practice. As background material for this document and its nomenclature, we recommend the reader be familiar with the previous method of calculating a detection contour (Section 2 of Burke et al., 2015), input parameters relevant for describing the data quantity and quality of Kepler targets (Burke & Catanzarite, 2017b), and the extensive new transit injection and recovery tests of the Kepler pipeline (Christiansen et al., 2016; Burke & Catanzarite, 2017a; Christiansen, 2017).
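    The core idea of a detection contour — the fraction of injected transit signals recovered by the pipeline, as a function of (Porb, Rp) — can be illustrated with a toy nearest-bin lookup. KeplerPORTs itself computes the contour analytically from the pipeline's window function and detection-efficiency model; the function name and inputs below are assumptions of this sketch, not the actual API.

```python
import numpy as np

def detection_contour(period_bins, radius_bins, injected, recovered,
                      grid_periods, grid_radii):
    """Toy detection contour: fraction of injected transit signals recovered,
    binned in (Porb, Rp), then looked up on an output grid (nearest bin)."""
    injected = np.asarray(injected, float)
    recovered = np.asarray(recovered, float)
    frac = np.where(injected > 0, recovered / np.maximum(injected, 1.0), 0.0)
    # map each requested grid point to the bin that contains it
    ip = np.clip(np.digitize(grid_periods, period_bins) - 1, 0, frac.shape[0] - 1)
    ir = np.clip(np.digitize(grid_radii, radius_bins) - 1, 0, frac.shape[1] - 1)
    return frac[np.ix_(ip, ir)]
```

A real implementation would interpolate smoothly and fold in per-target data quality; this only shows how injection/recovery counts turn into a completeness surface.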

  13. Steiner trees for fixed orientation metrics

    DEFF Research Database (Denmark)

    Brazil, Marcus; Zachariasen, Martin

    2009-01-01

    We consider the problem of constructing Steiner minimum trees for a metric defined by a polygonal unit circle (corresponding to s = 2 weighted legal orientations in the plane). A linear-time algorithm to enumerate all angle configurations for degree three Steiner points is given. We provide a simple proof that the angle configuration for a Steiner point extends to all Steiner points in a full Steiner minimum tree, such that at most six orientations suffice for edges in a full Steiner minimum tree. We show that the concept of canonical forms originally introduced for the uniform orientation metric generalises to the fixed orientation metric. Finally, we give an O(s n) time algorithm to compute a Steiner minimum tree for a given full Steiner topology with n terminal leaves.

  14. Metric Learning for Hyperspectral Image Segmentation

    Science.gov (United States)

    Bue, Brian D.; Thompson, David R.; Gilmore, Martha S.; Castano, Rebecca

    2011-01-01

    We present a metric learning approach to improve the performance of unsupervised hyperspectral image segmentation. Unsupervised spatial segmentation can assist both user visualization and automatic recognition of surface features. Analysts can use spatially-continuous segments to decrease noise levels and/or localize feature boundaries. However, existing segmentation methods use task-agnostic measures of similarity. Here we learn task-specific similarity measures from training data, improving segment fidelity to classes of interest. Multiclass Linear Discriminant Analysis produces a linear transform that optimally separates a labeled set of training classes. This transform defines a distance metric that generalizes to new scenes, enabling graph-based segmentation that emphasizes key spectral features. We describe tests based on data from the Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) in which learned metrics improve segment homogeneity with respect to mineralogical classes.
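    A minimal numpy sketch of the idea: learn a Fisher/LDA transform from labeled training spectra, then use Euclidean distance in the transformed space as the learned metric. The ridge regularizer and the toy data in the test are assumptions of this sketch, not details from the paper.

```python
import numpy as np

def lda_transform(X, y, n_components=1):
    """Learn a Fisher/LDA transform: top eigenvectors of Sw^-1 Sb, where Sw is
    the within-class scatter and Sb the between-class scatter of training data."""
    X, y = np.asarray(X, float), np.asarray(y)
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sw, Sb = np.zeros((d, d)), np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    # small ridge keeps Sw invertible when training data are limited (assumption)
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(d), Sb))
    order = np.argsort(-evals.real)
    return evecs.real[:, order[:n_components]]

def learned_distance(a, b, W):
    """Euclidean distance after projecting the difference through W."""
    z = (np.asarray(a, float) - np.asarray(b, float)) @ W
    return float(np.sqrt((z * z).sum()))
```

With two classes separated along one band and pure noise in another, the learned metric stretches distances along the class-separating band and shrinks them along the noise band.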

  15. Validation of Metrics as Error Predictors

    Science.gov (United States)

    Mendling, Jan

    In this chapter, we test the validity of metrics that were defined in the previous chapter for predicting errors in EPC business process models. In Section 5.1, we provide an overview of how the analysis data is generated. Section 5.2 describes the sample of EPCs from practice that we use for the analysis. Here we discuss a disaggregation by the EPC model group and by error as well as a correlation analysis between metrics and error. Based on this sample, we calculate a logistic regression model for predicting error probability with the metrics as input variables in Section 5.3. In Section 5.4, we then test the regression function for an independent sample of EPC models from textbooks as a cross-validation. Section 5.5 summarizes the findings.
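    The regression step described above can be illustrated with a small, dependency-free logistic regression. The chapter fits its model on real EPC metrics; the single "metric" feature, the toy data, and the learning-rate settings here are invented purely for illustration.

```python
import math

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Gradient-descent logistic regression: learn weights mapping metric
    values to the probability that a process model contains an error."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - yi  # gradient of the log-loss w.r.t. z
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def error_probability(x, w, b):
    """Predicted probability of an error given a model's metric values."""
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))
```

Fitting on models where larger metric values co-occur with errors yields high predicted error probability for large-metric models and low probability for small ones.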

  16. Strategic management system in a healthcare setting--moving from strategy to results.

    Science.gov (United States)

    Devitt, Rob; Klassen, Wolf; Martalog, Julian

    2005-01-01

    One of the historical challenges in the healthcare system has been the identification and collection of meaningful data to measure an organization's progress towards the achievement of its strategic goals and the concurrent alignment of internal operating practices with this strategy. Over the last 18 months the Toronto East General Hospital (TEGH) has adopted a strategic management system and organizing framework that has led to a metric-based strategic plan. It has allowed for formal and measurable linkages across a full range of internal business processes, from the annual operating plan to resource allocation decisions, to the balanced scorecard and individual performance evaluations. The Strategic Management System (SMS) aligns organizational planning and performance measurement, facilitates an appropriate balance between organizational priorities and resolving "local" problems, and encourages behaviours that are consistent with the values upon which the organization is built. The TEGH Accountability Framework serves as the foundation for the entire system. A key tool of the system is the rolling three-year strategic plan for the organization that sets out specific annual improvement targets on a number of key strategic measures. Individual program/department plans with corresponding measures ensure that the entire organization is moving forward strategically. Each year, all plans are reviewed, with course adjustments made to reflect changes in the hospital's environment and with re-calibration of performance targets for the next three years to ensure continued improvement and organizational progress. This system has been used through one annual business cycle. Results from the past year show measurable success. The hospital has improved on 12 of the 15 strategic plan metrics, including achieving the targeted 1% operating surplus while operating in an environment of tremendous change and uncertainty. 
This article describes the strategic management system used

  17. Predicting class testability using object-oriented metrics

    NARCIS (Netherlands)

    M. Bruntink (Magiel); A. van Deursen (Arie)

    2004-01-01

    textabstractIn this paper we investigate factors of the testability of object-oriented software systems. The starting point is given by a study of the literature to obtain both an initial model of testability and existing OO metrics related to testability. Subsequently, these metrics are evaluated

  18. Software Power Metric Model: An Implementation | Akwukwuma ...

    African Journals Online (AJOL)

    ... and the execution time (TIME) in each case was recorded. We then obtain the application functions point count. Our result shows that the proposed metric is computable, consistent in its use of unit, and is programming language independent. Keywords: Software attributes, Software power, measurement, Software metric, ...

  19. Meter Detection in Symbolic Music Using Inner Metric Analysis

    NARCIS (Netherlands)

    de Haas, W.B.; Volk, A.

    2016-01-01

    In this paper we present PRIMA: a new model tailored to symbolic music that detects the meter and the first downbeat position of a piece. Given onset data, the metrical structure of a piece is interpreted using the Inner Metric Analysis (IMA) model. IMA identifies the strong and weak metrical

  20. Performance metrics for the evaluation of hyperspectral chemical identification systems

    Science.gov (United States)

    Truslow, Eric; Golowich, Steven; Manolakis, Dimitris; Ingle, Vinay

    2016-02-01

    Remote sensing of chemical vapor plumes is a difficult but important task for many military and civilian applications. Hyperspectral sensors operating in the long-wave infrared regime have well-demonstrated detection capabilities. However, the identification of a plume's chemical constituents, based on a chemical library, is a multiple hypothesis testing problem which standard detection metrics do not fully describe. We propose using an additional performance metric for identification based on the so-called Dice index. Our approach partitions and weights a confusion matrix to develop both the standard detection metrics and identification metric. Using the proposed metrics, we demonstrate that the intuitive system design of a detector bank followed by an identifier is indeed justified when incorporating performance information beyond the standard detection metrics.
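    The Dice index itself is simple: twice the overlap between the set of chemicals the identifier reports and the set actually present, divided by the total size of both sets. A sketch (the chemical names are hypothetical; the paper's metric additionally partitions and weights a confusion matrix):

```python
def dice_index(reported, truth):
    """Dice index between reported and true chemical sets:
    2 * |A & B| / (|A| + |B|), in [0, 1]."""
    reported, truth = set(reported), set(truth)
    if not reported and not truth:
        return 1.0  # nothing present, nothing reported: perfect agreement
    return 2 * len(reported & truth) / (len(reported) + len(truth))

# e.g. reporting SF6 and NH3 when only SF6 is present scores 2/3
```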

  1. Curvature properties of four-dimensional Walker metrics

    International Nuclear Information System (INIS)

    Chaichi, M; Garcia-Rio, E; Matsushita, Y

    2005-01-01

    A Walker n-manifold is a semi-Riemannian manifold which admits a field of parallel null r-planes, r ≤ n/2. In the present paper we study curvature properties of a Walker 4-manifold (M, g) which admits a field of parallel null 2-planes. The metric g is necessarily of neutral signature (+ + - -). Such a Walker 4-manifold is the lowest dimensional example not of Lorentz type. There are three functions of coordinates which define a Walker metric. Some recent work shows that a Walker 4-manifold of restricted type, whose metric is characterized by two functions, exhibits a large variety of symplectic structures, Hermitian structures, Kaehler structures, etc. For such a restricted Walker 4-manifold, we study mainly curvature properties, e.g., conditions for a Walker metric to be Einstein, Osserman, or locally conformally flat. One of our main results gives the exact solutions to the Einstein equations for a restricted Walker 4-manifold.

  2. Measures and Metrics for Feasibility of Proof-of-Concept Studies With Human Immunodeficiency Virus Rapid Point-of-Care Technologies: The Evidence and the Framework.

    Science.gov (United States)

    Pant Pai, Nitika; Chiavegatti, Tiago; Vijh, Rohit; Karatzas, Nicolaos; Daher, Jana; Smallwood, Megan; Wong, Tom; Engel, Nora

    2017-12-01

    Pilot (feasibility) studies form a vast majority of diagnostic studies with point-of-care technologies but often lack use of clear measures/metrics and a consistent framework for reporting and evaluation. To fill this gap, we systematically reviewed data to (a) catalog feasibility measures/metrics and (b) propose a framework. For the period January 2000 to March 2014, 2 reviewers searched 4 databases (MEDLINE, EMBASE, CINAHL, Scopus), retrieved 1441 citations, and abstracted data from 81 studies. We observed 2 major categories of measures, that is, implementation centered and patient centered, and 4 subcategories of measures, that is, feasibility, acceptability, preference, and patient experience. We defined and delineated metrics and measures for a feasibility framework, and documented impact measures for comparison. We observed heterogeneity in the reporting of metrics as well as misclassification and misuse of metrics within measures. Although we observed poorly defined measures and metrics for feasibility, preference, and patient experience, the acceptability measure was, in contrast, the best defined. For example, within feasibility, metrics such as consent, completion, new infection, linkage rates, and turnaround times were misclassified and reported. Similarly, patient experience was variously reported as test convenience, comfort, pain, and/or satisfaction. In contrast, within impact measures, all the metrics were well documented, thus serving as a good baseline comparator. With our framework, we classified, delineated, and defined quantitative measures and metrics for feasibility. Our framework, with its defined measures/metrics, could reduce misclassification and improve the overall quality of reporting for monitoring and evaluation of rapid point-of-care technology strategies and their context-driven optimization.

  3. Targeting Glutathione-S Transferase Enzymes in Musculoskeletal Sarcomas: A Promising Therapeutic Strategy

    Directory of Open Access Journals (Sweden)

    Michela Pasello

    2011-01-01

    Recent studies have indicated that targeting glutathione-S-transferase (GST) isoenzymes may be a promising novel strategy to improve the efficacy of conventional chemotherapy in the three most common musculoskeletal tumours: osteosarcoma, Ewing's sarcoma, and rhabdomyosarcoma. Using a panel of 15 drug-sensitive and drug-resistant human osteosarcoma, Ewing's sarcoma, and rhabdomyosarcoma cell lines, the efficacy of the GST-targeting agent 6-(7-nitro-2,1,3-benzoxadiazol-4-ylthio)hexanol (NBDHEX) has been assessed and related to GST isoenzyme expression (namely GSTP1, GSTA1, GSTM1, and MGST). NBDHEX showed a relevant in vitro activity on all cell lines, including the drug-resistant ones and those with higher GST levels. The in vitro activity of NBDHEX was mostly related to cytostatic effects, with a less evident apoptotic induction. NBDHEX interacted positively with doxorubicin, vincristine, and cisplatin, but showed antagonistic effects with methotrexate. In vivo studies confirmed the cytostatic efficacy of NBDHEX and its positive interaction with vincristine in Ewing's sarcoma cells, and also indicated a positive effect against the metastatic spread of osteosarcoma cells. The whole body of evidence found in this study indicated that targeting GSTs in osteosarcoma, Ewing's sarcoma and rhabdomyosarcoma may be an interesting new therapeutic option, which can be considered for patients who are scarcely responsive to conventional regimens.

  4. Decision Analysis for Metric Selection on a Clinical Quality Scorecard.

    Science.gov (United States)

    Guth, Rebecca M; Storey, Patricia E; Vitale, Michael; Markan-Aurora, Sumita; Gordon, Randolph; Prevost, Traci Q; Dunagan, Wm Claiborne; Woeltje, Keith F

    2016-09-01

    Clinical quality scorecards are used by health care institutions to monitor clinical performance and drive quality improvement. Because of the rapid proliferation of quality metrics in health care, BJC HealthCare found it increasingly difficult to select the most impactful scorecard metrics while still monitoring metrics for regulatory purposes. A 7-step measure selection process was implemented incorporating Kepner-Tregoe Decision Analysis, which is a systematic process that considers key criteria that must be satisfied in order to make the best decision. The decision analysis process evaluates what metrics will most appropriately fulfill these criteria, as well as identifies potential risks associated with a particular metric in order to identify threats to its implementation. Using this process, a list of 750 potential metrics was narrowed to 25 that were selected for scorecard inclusion. This decision analysis process created a more transparent, reproducible approach for selecting quality metrics for clinical quality scorecards. © The Author(s) 2015.
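    The weighted-criteria core of a Kepner-Tregoe "WANT" analysis — score each candidate against weighted criteria and rank by total — can be sketched as follows. The criteria, weights, and candidate metrics below are invented examples, not BJC HealthCare's actual selection criteria.

```python
def rank_metrics(candidates, weights):
    """Rank candidate scorecard metrics by weighted criterion scores, best
    first (the scoring step of a Kepner-Tregoe decision analysis)."""
    def score(name):
        return sum(weights[c] * s for c, s in candidates[name].items())
    return sorted(candidates, key=score, reverse=True)

# hypothetical criteria (weight 1-5) and candidate scores (1-10)
weights = {"impact": 5, "data availability": 3, "regulatory need": 4}
candidates = {
    "readmission rate": {"impact": 9, "data availability": 8, "regulatory need": 7},
    "door-to-doctor time": {"impact": 6, "data availability": 9, "regulatory need": 3},
}
print(rank_metrics(candidates, weights))  # best-scoring metric first
```

The full process also screens candidates against mandatory "MUST" criteria and assesses adverse risks before the weighted ranking is applied.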

  5. Insights on ornithine decarboxylase silencing as a potential strategy for targeting retinoblastoma.

    Science.gov (United States)

    Muthukumaran, Sivashanmugam; Bhuvanasundar, Renganathan; Umashankar, Vetrivel; Sulochana, K N

    2018-02-01

    Ornithine decarboxylase (ODC) is a key enzyme involved in polyamine synthesis and is reported to be up-regulated in several cancers. However, the effect of ODC gene silencing in retinoblastoma remains to be understood for use in therapeutic applications. Hence, in this study, a novel siRNA (small interfering RNA) targeting ODC was designed and validated in human Y79 retinoblastoma cells for its effects on intracellular polyamine levels, matrix metalloproteinase (MMP) 2 and 9 activity, and the cell cycle. The designed siRNA showed efficient silencing of ODC mRNA expression and protein levels in Y79 cells. It also showed significant reduction of intracellular polyamine levels and altered levels of oncogenic LIN28b expression. Based on this study, a regulatory loop is proposed wherein ODC silencing in Y79 cells results in decreased polyamine levels, leading to altered protein levels of Lin28b, MMP-2 and MMP-9, which falls in line with earlier studies in neuroblastoma. Thus, we propose ODC silencing as a prospective strategy for targeting retinoblastoma. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  6. Common fixed point theorems in intuitionistic fuzzy metric spaces and L-fuzzy metric spaces with nonlinear contractive condition

    International Nuclear Information System (INIS)

    Jesic, Sinisa N.; Babacev, Natasa A.

    2008-01-01

    The purpose of this paper is to prove some common fixed point theorems for a pair of R-weakly commuting mappings defined on intuitionistic fuzzy metric spaces [Park JH. Intuitionistic fuzzy metric spaces. Chaos, Solitons and Fractals 2004;22:1039-46] and L-fuzzy metric spaces [Saadati R, Razani A, Adibi H. A common fixed point theorem in L-fuzzy metric spaces. Chaos, Solitons and Fractals, doi:10.1016/j.chaos.2006.01.023], with a nonlinear contractive condition defined by a function first introduced by Boyd and Wong [Boyd DW, Wong JSW. On nonlinear contractions. Proc Am Math Soc 1969;20:458-64]. Following Pant [Pant RP. Common fixed points of noncommuting mappings. J Math Anal Appl 1994;188:436-40], we define R-weak commutativity for a pair of mappings and then prove the main results. These results generalize some known results due to Saadati et al. and Jungck [Jungck G. Commuting maps and fixed points. Am Math Mon 1976;83:261-3]. Some examples and comments related to the preceding results are given.

  7. 43 CFR 12.915 - Metric system of measurement.

    Science.gov (United States)

    2010-10-01

    ... procurements, grants, and other business-related activities. Metric implementation may take longer where the... recipient, such as when foreign competitors are producing competing products in non-metric units. (End of...

  8. Cophenetic metrics for phylogenetic trees, after Sokal and Rohlf.

    Science.gov (United States)

    Cardona, Gabriel; Mir, Arnau; Rosselló, Francesc; Rotger, Lucía; Sánchez, David

    2013-01-16

    Phylogenetic tree comparison metrics are an important tool in the study of evolution, and hence the definition of such metrics is an interesting problem in phylogenetics. In a paper in Taxon fifty years ago, Sokal and Rohlf proposed to measure quantitatively the difference between a pair of phylogenetic trees by first encoding them by means of their half-matrices of cophenetic values, and then comparing these matrices. This idea has been used several times since then to define dissimilarity measures between phylogenetic trees but, to our knowledge, no proper metric on weighted phylogenetic trees with nested taxa based on this idea has been formally defined and studied yet. Actually, the cophenetic values of pairs of different taxa alone are not enough to single out phylogenetic trees with weighted arcs or nested taxa. For every (rooted) phylogenetic tree T, let its cophenetic vector φ(T) consist of all pairs of cophenetic values between pairs of taxa in T and all depths of taxa in T. It turns out that these cophenetic vectors single out weighted phylogenetic trees with nested taxa. We then define a family of cophenetic metrics dφ,p by comparing these cophenetic vectors by means of Lp norms, and we study, either analytically or numerically, some of their basic properties: neighbors, diameter, distribution, and their rank correlation with each other and with other metrics. The cophenetic metrics can be safely used on weighted phylogenetic trees with nested taxa and no restriction on degrees, and they can be computed in O(n²) time, where n stands for the number of taxa. The metrics dφ,1 and dφ,2 have positive skewed distributions, and they show a low rank correlation with the Robinson-Foulds metric and the nodal metrics, and a very high correlation with each other and with the splitted nodal metrics. The diameter of dφ,p, for p ⩾ 1, is in O(n^((p+2)/p)), and thus for low p they are more discriminative, having a wider range of values.
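As a rough illustration of the construction described above, the following sketch computes cophenetic vectors for small rooted weighted trees and compares them with an Lp norm. The parent/weight dictionary encoding and the naive pairwise LCA search (cubic rather than the O(n²) time the authors report) are our own assumptions for illustration, not the paper's implementation.

```python
from itertools import combinations

def root_path(parent, node):
    # Path from a node up to the root (node first, root last).
    path = []
    while node is not None:
        path.append(node)
        node = parent[node]
    return path

def cophenetic_vector(parent, weight, taxa):
    # parent: child -> parent (root maps to None)
    # weight: child -> weight of the arc from its parent
    depth = {v: sum(weight[u] for u in root_path(parent, v)
                    if parent[u] is not None)
             for v in parent}
    vec = []
    for a, b in combinations(sorted(taxa), 2):
        ancestors_b = set(root_path(parent, b))
        lca = next(u for u in root_path(parent, a) if u in ancestors_b)
        vec.append(depth[lca])                  # cophenetic value of the pair
    vec.extend(depth[t] for t in sorted(taxa))  # depths of the taxa themselves
    return vec

def d_phi_p(v1, v2, p=1):
    # Lp distance between two cophenetic vectors (dφ,p in the abstract)
    return sum(abs(x - y) ** p for x, y in zip(v1, v2)) ** (1.0 / p)
```

For a tree with root r, internal node x (arc weight 1) carrying leaves a, b (weights 1 each), and leaf c hanging off r (weight 2), the vector lists the pairwise cophenetic values φ(a,b), φ(a,c), φ(b,c) followed by the three leaf depths.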

  9. Finite Metric Spaces of Strictly negative Type

    DEFF Research Database (Denmark)

    Hjorth, Poul G.

    If a finite metric space is of strictly negative type then its transfinite diameter is uniquely realized by an infinite extent (“load vector”). Finite metric spaces that have this property include all trees, and all finite subspaces of Euclidean and Hyperbolic spaces. We prove that if the distance...

  10. Gravitational Metric Tensor Exterior to Rotating Homogeneous ...

    African Journals Online (AJOL)

    The covariant and contravariant metric tensors exterior to a homogeneous spherical body rotating uniformly about a common φ axis with constant angular velocity ω is constructed. The constructed metric tensors in this gravitational field have seven non-zero distinct components. The Lagrangian for this gravitational field is ...

  11. Test of the FLRW Metric and Curvature with Strong Lens Time Delays

    International Nuclear Information System (INIS)

    Liao, Kai; Li, Zhengxiang; Wang, Guo-Jian; Fan, Xi-Long

    2017-01-01

    We present a new model-independent strategy for testing the Friedmann–Lemaître–Robertson–Walker (FLRW) metric and constraining cosmic curvature, based on future time-delay measurements of strongly lensed quasar-elliptical galaxy systems from the Large Synoptic Survey Telescope and supernova observations from the Dark Energy Survey. The test only relies on geometric optics. It is independent of the energy contents of the universe and the validity of the Einstein equation on cosmological scales. The study comprises two levels: testing the FLRW metric through the distance sum rule (DSR) and determining/constraining cosmic curvature. We propose an effective and efficient (redshift) evolution model for performing the former test, which allows us to concretely specify the violation criterion for the FLRW DSR. If the FLRW metric is consistent with the observations, then on the second level the cosmic curvature parameter will be constrained to ∼0.057 or ∼0.041 (1 σ ), depending on the availability of high-redshift supernovae, which is much more stringent than current model-independent techniques. We also show that the bias in the time-delay method might be well controlled, leading to robust results. The proposed method is a new independent tool for both testing the fundamental assumptions of homogeneity and isotropy in cosmology and for determining cosmic curvature. It is complementary to cosmic microwave background plus baryon acoustic oscillation analyses, which normally assume a cosmological model with dark energy domination in the late-time universe.
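The distance sum rule (DSR) invoked above has a compact closed form. Writing d_l, d_s, and d_ls for the dimensionless comoving angular diameter distances to the lens, to the source, and between them, the FLRW metric implies the following relation; this is the standard form of the rule, stated here for context since the abstract does not spell it out:

```latex
d_{ls} \;=\; \sqrt{1+\Omega_k\, d_l^{2}}\; d_s \;-\; \sqrt{1+\Omega_k\, d_s^{2}}\; d_l
```

A violation of this relation for any lens–source pair falsifies the FLRW assumption independently of the energy content of the universe; if it holds, each pair yields a constraint on Ω_k.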

  12. Test of the FLRW Metric and Curvature with Strong Lens Time Delays

    Energy Technology Data Exchange (ETDEWEB)

    Liao, Kai [School of Science, Wuhan University of Technology, Wuhan 430070 (China); Li, Zhengxiang; Wang, Guo-Jian [Department of Astronomy, Beijing Normal University, Beijing 100875 (China); Fan, Xi-Long, E-mail: liaokai@whut.edu.cn, E-mail: xilong.fan@glasgow.ac.uk [Department of Physics and Mechanical and Electrical Engineering, Hubei University of Education, Wuhan 430205 (China)

    2017-04-20

    We present a new model-independent strategy for testing the Friedmann–Lemaître–Robertson–Walker (FLRW) metric and constraining cosmic curvature, based on future time-delay measurements of strongly lensed quasar-elliptical galaxy systems from the Large Synoptic Survey Telescope and supernova observations from the Dark Energy Survey. The test only relies on geometric optics. It is independent of the energy contents of the universe and the validity of the Einstein equation on cosmological scales. The study comprises two levels: testing the FLRW metric through the distance sum rule (DSR) and determining/constraining cosmic curvature. We propose an effective and efficient (redshift) evolution model for performing the former test, which allows us to concretely specify the violation criterion for the FLRW DSR. If the FLRW metric is consistent with the observations, then on the second level the cosmic curvature parameter will be constrained to ∼0.057 or ∼0.041 (1 σ ), depending on the availability of high-redshift supernovae, which is much more stringent than current model-independent techniques. We also show that the bias in the time-delay method might be well controlled, leading to robust results. The proposed method is a new independent tool for both testing the fundamental assumptions of homogeneity and isotropy in cosmology and for determining cosmic curvature. It is complementary to cosmic microwave background plus baryon acoustic oscillation analyses, which normally assume a cosmological model with dark energy domination in the late-time universe.

  13. Exact solutions of strong gravity in generalized metrics

    International Nuclear Information System (INIS)

    Hojman, R.; Smailagic, A.

    1981-05-01

    We consider classical solutions for the strong gravity theory of Salam and Strathdee in a wider class of metrics with positive, zero and negative curvature. It turns out that such solutions exist and their relevance for quark confinement is explored. Only metrics with positive curvature (spherical symmetry) give a confining potential in a simple picture of the scalar hadron. This supports the idea of describing the hadron as a closed microuniverse of the strong metric. (author)

  14. Development of quality metrics for ambulatory pediatric cardiology: Infection prevention.

    Science.gov (United States)

    Johnson, Jonathan N; Barrett, Cindy S; Franklin, Wayne H; Graham, Eric M; Halnon, Nancy J; Hattendorf, Brandy A; Krawczeski, Catherine D; McGovern, James J; O'Connor, Matthew J; Schultz, Amy H; Vinocur, Jeffrey M; Chowdhury, Devyani; Anderson, Jeffrey B

    2017-12-01

    In 2012, the American College of Cardiology's (ACC) Adult Congenital and Pediatric Cardiology Council established a program to develop quality metrics to guide ambulatory practices for pediatric cardiology. The council chose five areas on which to focus their efforts; chest pain, Kawasaki Disease, tetralogy of Fallot, transposition of the great arteries after arterial switch, and infection prevention. Here, we sought to describe the process, evaluation, and results of the Infection Prevention Committee's metric design process. The infection prevention metrics team consisted of 12 members from 11 institutions in North America. The group agreed to work on specific infection prevention topics including antibiotic prophylaxis for endocarditis, rheumatic fever, and asplenia/hyposplenism; influenza vaccination and respiratory syncytial virus prophylaxis (palivizumab); preoperative methods to reduce intraoperative infections; vaccinations after cardiopulmonary bypass; hand hygiene; and testing to identify splenic function in patients with heterotaxy. An extensive literature review was performed. When available, previously published guidelines were used fully in determining metrics. The committee chose eight metrics to submit to the ACC Quality Metric Expert Panel for review. Ultimately, metrics regarding hand hygiene and influenza vaccination recommendation for patients did not pass the RAND analysis. Both endocarditis prophylaxis metrics and the RSV/palivizumab metric passed the RAND analysis but fell out during the open comment period. Three metrics passed all analyses, including those for antibiotic prophylaxis in patients with heterotaxy/asplenia, for influenza vaccination compliance in healthcare personnel, and for adherence to recommended regimens of secondary prevention of rheumatic fever. The lack of convincing data to guide quality improvement initiatives in pediatric cardiology is widespread, particularly in infection prevention. Despite this, three metrics were

  15. A method for evaluating cognitively informed micro-targeted campaign strategies: An agent-based model proof of principle.

    Science.gov (United States)

    Madsen, Jens Koed; Pilditch, Toby D

    2018-01-01

    In political campaigns, perceived candidate credibility influences the persuasiveness of messages. In campaigns aiming to influence people's beliefs, micro-targeted campaigns (MTCs) that target specific voters using their psychological profile have become increasingly prevalent. It remains open how effective MTCs are, notably in comparison to population-targeted campaign strategies. Using an agent-based model, the paper applies recent insights from cognitive models of persuasion, extending them to the societal level in a novel framework for exploring political campaigning. The paper provides an initial treatment of the complex dynamics of population level political campaigning in a psychologically informed manner. Model simulations show that MTCs can take advantage of the psychology of the electorate by targeting voters favourably disposed towards the candidate. Relative to broad campaigning, MTCs allow for efficient and adaptive management of complex campaigns. Findings show that disliked MTC candidates can beat liked population-targeting candidates, pointing to societal questions concerning campaign regulations.

  16. A new form of the rotating C-metric

    International Nuclear Information System (INIS)

    Hong, Kenneth; Teo, Edward

    2005-01-01

    In a previous paper, we showed that the traditional form of the charged C-metric can be transformed, by a change of coordinates, into one with an explicitly factorizable structure function. This new form of the C-metric has the advantage that its properties become much simpler to analyse. In this paper, we propose an analogous new form for the rotating charged C-metric, with structure function G(ξ) = (1 − ξ²)(1 + r₊Aξ)(1 + r₋Aξ), where r₊ and r₋ are the usual locations of the horizons in the Kerr-Newman black hole. Unlike the non-rotating case, this new form is not related to the traditional one by a coordinate transformation. We show that the physical distinction between these two forms of the rotating C-metric lies in the nature of the conical singularities causing the black holes to accelerate apart: the new form is free of torsion singularities and therefore does not contain any closed timelike curves. We claim that this new form should be considered the natural generalization of the C-metric with rotation.

  17. SU-F-T-312: Identifying Distinct Radiation Therapy Plan Classes Through Multi-Dimensional Analysis of Plan Complexity Metrics

    Energy Technology Data Exchange (ETDEWEB)

    Desai, V; Labby, Z; Culberson, W [University of Wisc Madison, Madison, WI (United States)

    2016-06-15

    Purpose: To determine whether body site-specific treatment plans form unique “plan class” clusters in a multi-dimensional analysis of plan complexity metrics such that a single beam quality correction determined for a representative plan could be universally applied within the “plan class”, thereby increasing the dosimetric accuracy of a detector’s response within a subset of similarly modulated nonstandard deliveries. Methods: We collected 95 clinical volumetric modulated arc therapy (VMAT) plans from four body sites (brain, lung, prostate, and spine). The lung data was further subdivided into SBRT and non-SBRT data for a total of five plan classes. For each control point in each plan, a variety of aperture-based complexity metrics were calculated and stored as unique characteristics of each patient plan. A multiple comparison of means analysis was performed such that every plan class was compared to every other plan class for every complexity metric in order to determine which groups could be considered different from one another. Statistical significance was assessed after correcting for multiple hypothesis testing. Results: Six out of a possible 10 pairwise plan class comparisons were uniquely distinguished based on at least nine out of 14 of the proposed metrics (Brain/Lung, Brain/SBRT lung, Lung/Prostate, Lung/SBRT Lung, Lung/Spine, Prostate/SBRT Lung). Eight out of 14 of the complexity metrics could distinguish at least six out of the possible 10 pairwise plan class comparisons. Conclusion: Aperture-based complexity metrics could prove to be useful tools to quantitatively describe a distinct class of treatment plans. Certain plan-averaged complexity metrics could be considered unique characteristics of a particular plan. A new approach to generating plan-class specific reference (pcsr) fields could be established through a targeted preservation of select complexity metrics or a clustering algorithm that identifies plans exhibiting similar
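The statistical screen described above, every plan class compared with every other class for every complexity metric, with a correction for multiple hypothesis testing, can be sketched as follows. The Welch t statistic with a normal approximation for the p-value and a Bonferroni correction are simplifying assumptions for illustration; the abstract does not state which test or correction the authors used.

```python
import math
from itertools import combinations
from statistics import mean, variance

def welch_t(x, y):
    # Welch's t statistic for two samples with unequal variances
    se = math.sqrt(variance(x) / len(x) + variance(y) / len(y))
    return (mean(x) - mean(y)) / se

def p_two_sided(t):
    # Two-sided p-value under a normal approximation to the t distribution
    return math.erfc(abs(t) / math.sqrt(2.0))

def distinguishable_pairs(groups, alpha=0.05):
    """Pairwise comparison of plan-class means for one complexity metric.
    groups: class name -> list of per-plan metric values.
    Returns, for each class pair, whether the difference survives a
    Bonferroni correction for the number of pairwise tests."""
    pairs = list(combinations(sorted(groups), 2))
    cutoff = alpha / len(pairs)  # Bonferroni-corrected significance level
    return {(a, b): p_two_sided(welch_t(groups[a], groups[b])) < cutoff
            for a, b in pairs}
```

Run once per complexity metric, this yields the kind of pairwise "distinguishable / not distinguishable" table the abstract summarizes.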

  18. Energy metrics for driving competitiveness of countries: Energy weakness magnitude, GDP per barrel and barrels per capita

    International Nuclear Information System (INIS)

    Coccia, Mario

    2010-01-01

    Energy metrics is the development of a whole new theoretical framework for the conception and measurement of energy and economic system performances, energy efficiency and productivity improvements with important political economy implications consistent with the best use of all natural and economic resources. The purpose of this research is to present some vital energy indicators based on magnitude and scale of energy weakness, GDP per barrel of oil that is an indicator of energy productivity and barrels (of oil) per capita that is an indicator of energy efficiency. Energy metrics can support the monitoring of energy and economic system performances in order to design effective energy strategy and political economy interventions focused on the 'competitive advantage' increase of countries in modern economies.
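The two ratio indicators named above are straightforward to compute. The sketch below assumes GDP in US dollars and annual oil consumption in barrels; the paper's exact units, and the construction of the energy-weakness magnitude, are not given in the abstract and are not modelled here.

```python
def energy_indicators(gdp_usd, oil_barrels, population):
    """GDP per barrel (energy productivity) and barrels per capita
    (energy efficiency), the two ratio indicators described above."""
    return {
        "gdp_per_barrel": gdp_usd / oil_barrels,
        "barrels_per_capita": oil_barrels / population,
    }
```

For a hypothetical economy with a $2 trillion GDP, 1 billion barrels consumed per year, and 50 million people, this yields $2000 of GDP per barrel and 20 barrels per capita.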

  19. Sigma Routing Metric for RPL Protocol

    Directory of Open Access Journals (Sweden)

    Paul Sanmartin

    2018-04-01

    This paper presents the adaptation of a specific metric for the RPL protocol in the objective function MRHOF. Among the functions standardized by the IETF, we find OF0, which is based on the minimum hop count, and MRHOF, which is based on the Expected Transmission Count (ETX). However, when the network becomes denser or the number of nodes increases, both OF0 and MRHOF introduce long hops, which can generate a bottleneck that restricts the network. The adaptation is proposed to optimize both OFs through a new routing metric. To solve the above problem, the metrics of the minimum number of hops and the ETX are combined by designing a new routing metric called SIGMA-ETX, in which the best route is calculated using the standard deviation of ETX values between each node, as opposed to working with the ETX average along the route. This method ensures better routing performance in dense sensor networks. The simulations are done in the Cooja simulator, based on the Contiki operating system. The simulations showed that the proposed optimization outperforms both OF0 and MRHOF by a wide margin in terms of network latency, packet delivery ratio, lifetime, and power consumption.
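The core idea above, scoring a candidate route by the dispersion of its per-link ETX values rather than their sum, can be sketched as follows. The route representation and the side-by-side comparison with a plain additive MRHOF-style cost are illustrative assumptions, not the authors' implementation.

```python
from statistics import pstdev

def mrhof_metric(route_etx):
    # Conventional MRHOF-style path cost: the sum of per-link ETX values
    return sum(route_etx)

def sigma_etx_metric(route_etx):
    # SIGMA-ETX idea: the standard deviation of per-link ETX values,
    # favouring routes with uniformly good links over routes that are
    # cheap on average but contain one very lossy hop
    return pstdev(route_etx)

def best_route(candidates, metric):
    # Pick the candidate route with the smallest metric value
    return min(candidates, key=metric)
```

On a route with links of ETX [2.0, 2.0, 2.0] versus one with [1.0, 1.0, 3.5], the additive cost prefers the second (5.5 < 6.0) while the dispersion-based score prefers the first (zero spread), which is exactly the behaviour the abstract motivates.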

  20. Socio-Technical Security Metrics (Dagstuhl Seminar 14491)

    NARCIS (Netherlands)

    Gollmann, Dieter; Herley, Cormac; Koenig, Vincent; Pieters, Wolter; Sasse, Martina Angela

    2015-01-01

    This report documents the program and the outcomes of Dagstuhl Seminar 14491 "Socio-Technical Security Metrics". In the domain of safety, metrics inform many decisions, from the height of new dikes to the design of nuclear plants. We can state, for example, that the dikes should be high enough to

  1. Landscape metrics for three-dimension urban pattern recognition

    Science.gov (United States)

    Liu, M.; Hu, Y.; Zhang, W.; Li, C.

    2017-12-01

    Understanding how landscape pattern determines population or ecosystem dynamics is crucial for managing our landscapes. Urban areas are becoming increasingly dominant social-ecological systems, so it is important to understand patterns of urbanization. Most studies of urban landscape pattern examine land-use maps in two dimensions because the acquisition of 3-dimensional information is difficult. We used Brista software based on Quickbird images and aerial photos to interpret the height of buildings, thus incorporating a 3-dimensional approach. We estimated the feasibility and accuracy of this approach. A total of 164,345 buildings in the Liaoning central urban agglomeration of China, which included seven cities, were measured. Twelve landscape metrics were proposed or chosen to describe the urban landscape patterns in 2- and 3-dimensional scales. The ecological and social meaning of landscape metrics were analyzed with multiple correlation analysis. The results showed that classification accuracy compared with field surveys was 87.6%, which means this method for interpreting building height was acceptable. The metrics effectively reflected the urban architecture in relation to number of buildings, area, height, 3-D shape and diversity aspects. We were able to describe the urban characteristics of each city with these metrics. The metrics also captured ecological and social meanings. The proposed landscape metrics provided a new method for urban landscape analysis in three dimensions.

  2. Reproducibility of graph metrics of human brain functional networks.

    Science.gov (United States)

    Deuker, Lorena; Bullmore, Edward T; Smith, Marie; Christensen, Soren; Nathan, Pradeep J; Rockstroh, Brigitte; Bassett, Danielle S

    2009-10-01

    Graph theory provides many metrics of complex network organization that can be applied to analysis of brain networks derived from neuroimaging data. Here we investigated the test-retest reliability of graph metrics of functional networks derived from magnetoencephalography (MEG) data recorded in two sessions from 16 healthy volunteers who were studied at rest and during performance of the n-back working memory task in each session. For each subject's data at each session, we used a wavelet filter to estimate the mutual information (MI) between each pair of MEG sensors in each of the classical frequency intervals from gamma to low delta in the overall range 1-60 Hz. Undirected binary graphs were generated by thresholding the MI matrix and 8 global network metrics were estimated: the clustering coefficient, path length, small-worldness, efficiency, cost-efficiency, assortativity, hierarchy, and synchronizability. Reliability of each graph metric was assessed using the intraclass correlation (ICC). Good reliability was demonstrated for most metrics applied to the n-back data (mean ICC=0.62). Reliability was greater for metrics in lower frequency networks. Higher frequency gamma- and beta-band networks were less reliable at a global level but demonstrated high reliability of nodal metrics in frontal and parietal regions. Performance of the n-back task was associated with greater reliability than measurements on resting state data. Task practice was also associated with greater reliability. Collectively these results suggest that graph metrics are sufficiently reliable to be considered for future longitudinal studies of functional brain network changes.
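The reliability statistic used above, the intraclass correlation, can be illustrated with the one-way random-effects ICC(1,1) computed from a one-way ANOVA decomposition. The abstract does not state which ICC variant was used, so this particular formula is an assumption for illustration.

```python
from statistics import mean

def icc_oneway(scores):
    """One-way random-effects ICC(1,1) for test-retest reliability.
    scores[i] = values of one graph metric for subject i across
    sessions, e.g. [session1_value, session2_value]."""
    n, k = len(scores), len(scores[0])
    grand = mean(v for row in scores for v in row)
    subject_means = [mean(row) for row in scores]
    # Between-subject and within-subject mean squares (one-way ANOVA)
    msb = k * sum((m - grand) ** 2 for m in subject_means) / (n - 1)
    msw = sum((v - m) ** 2
              for row, m in zip(scores, subject_means)
              for v in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)
```

Perfectly reproducible metrics give an ICC of 1; metrics whose session-to-session noise swamps the between-subject differences give values near or below 0.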

  3. Evaluation metrics for biostatistical and epidemiological collaborations.

    Science.gov (United States)

    Rubio, Doris McGartland; Del Junco, Deborah J; Bhore, Rafia; Lindsell, Christopher J; Oster, Robert A; Wittkowski, Knut M; Welty, Leah J; Li, Yi-Ju; Demets, Dave

    2011-10-15

    Increasing demands for evidence-based medicine and for the translation of biomedical research into individual and public health benefit have been accompanied by the proliferation of special units that offer expertise in biostatistics, epidemiology, and research design (BERD) within academic health centers. Objective metrics that can be used to evaluate, track, and improve the performance of these BERD units are critical to their successful establishment and sustainable future. To develop a set of reliable but versatile metrics that can be adapted easily to different environments and evolving needs, we consulted with members of BERD units from the consortium of academic health centers funded by the Clinical and Translational Science Award Program of the National Institutes of Health. Through a systematic process of consensus building and document drafting, we formulated metrics that covered the three identified domains of BERD practices: the development and maintenance of collaborations with clinical and translational science investigators, the application of BERD-related methods to clinical and translational research, and the discovery of novel BERD-related methodologies. In this article, we describe the set of metrics and advocate their use for evaluating BERD practices. The routine application, comparison of findings across diverse BERD units, and ongoing refinement of the metrics will identify trends, facilitate meaningful changes, and ultimately enhance the contribution of BERD activities to biomedical research. Copyright © 2011 John Wiley & Sons, Ltd.

  4. The grain of spatially referenced economic cost and biodiversity benefit data and the effectiveness of a cost targeting strategy.

    Science.gov (United States)

    Sutton, N J; Armsworth, P R

    2014-12-01

    Facing tight resource constraints, conservation organizations must allocate funds available for habitat protection as effectively as possible. Often, they combine spatially referenced economic and biodiversity data to prioritize land for protection. We tested how sensitive these prioritizations could be to differences in the spatial grain of these data by demonstrating how the conclusion of a classic debate in conservation planning between cost and benefit targeting was altered based on the available information. As a case study, we determined parcel-level acquisition costs and biodiversity benefits of land transactions recently undertaken by a nonprofit conservation organization that seeks to protect forests in the eastern United States. Then, we used hypothetical conservation plans to simulate the types of ex ante priorities that an organization could use to prioritize areas for protection. We found the apparent effectiveness of cost and benefit targeting depended on the spatial grain of the data used when prioritizing parcels based on local species richness. However, when accounting for complementarity, benefit targeting consistently was more efficient than a cost targeting strategy regardless of the spatial grain of the data involved. More pertinently for other studies, we found that combining data collected over different spatial grains inflated the apparent effectiveness of a cost targeting strategy and led to overestimation of the efficiency gain offered by adopting a more integrative return-on-investment approach. © 2014 Society for Conservation Biology.
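The contrast between the two ex ante strategies can be sketched as a toy budget-constrained greedy prioritization. The parcel names, costs, and richness scores below are invented for illustration, and the study's complementarity-based benefit measure is not modelled here.

```python
def select(parcels, budget, key):
    """Greedy acquisition under a budget: rank candidate parcels by the
    given priority key and buy while funds remain."""
    chosen, spent = [], 0.0
    for p in sorted(parcels, key=key):
        if spent + p['cost'] <= budget:
            chosen.append(p['name'])
            spent += p['cost']
    return chosen

# Hypothetical parcels: cost in $1000s, benefit = local species richness
parcels = [
    {'name': 'A', 'cost': 10, 'benefit': 3},
    {'name': 'B', 'cost': 40, 'benefit': 9},
    {'name': 'C', 'cost': 25, 'benefit': 8},
]
```

With a budget of 50, cost targeting (cheapest first) buys A and C, while benefit targeting (richest first) buys B and then A; comparing the total benefit of the two portfolios is the essence of the debate the abstract revisits.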

  5. Hantavirus Gc induces long-term immune protection via LAMP-targeting DNA vaccine strategy.

    Science.gov (United States)

    Jiang, Dong-Bo; Zhang, Jin-Peng; Cheng, Lin-Feng; Zhang, Guan-Wen; Li, Yun; Li, Zi-Chao; Lu, Zhen-Hua; Zhang, Zi-Xin; Lu, Yu-Chen; Zheng, Lian-He; Zhang, Fang-Lin; Yang, Kun

    2018-02-01

    Hemorrhagic fever with renal syndrome (HFRS) occurs widely throughout Eurasia. Unfortunately, there is no effective treatment, and prophylaxis remains the best option against the major pathogenic agent, hantaan virus (HTNV), which is an Old World hantavirus. However, the absence of cellular immune responses and immunological memory hampers acceptance of the current inactivated HFRS vaccine. Previous studies revealed that a lysosome-associated membrane protein 1 (LAMP1)-targeting strategy involving a DNA vaccine based on the HTNV glycoprotein Gn successfully conferred long-term immunity, and indicated that further research on Gc, another HTNV antigen, was warranted. Plasmids encoding Gc and lysosome-targeted Gc, designated pVAX-Gc and pVAX-LAMP/Gc, respectively, were constructed. Proteins of interest were identified by fluorescence microscopy following cell line transfection. Five groups of 20 female BALB/c mice were subjected to the following inoculations: inactivated HTNV vaccine, pVAX-LAMP/Gc, pVAX-Gc, and, as the negative controls, pVAX-LAMP or the blank vector pVAX1. Humoral and cellular immunity were assessed by enzyme-linked immunosorbent assays (ELISAs) and 15-mer peptide enzyme-linked immunospot (ELISpot) epitope mapping assays. Repeated immunization with pVAX-LAMP/Gc enhanced adaptive immune responses, as demonstrated by the specific and neutralizing antibody titers and increased IFN-γ production. The inactivated vaccine induced a comparable humoral reaction, but the negative controls only elicited insignificant responses. Using a mouse model of HTNV challenge, the in vivo protection conferred by the inactivated vaccine and Gc-based constructs (with/without LAMP recombination) was confirmed. Evidence of pan-epitope reactions highlighted the long-term cellular response to the LAMP-targeting strategy, and histological observations indicated the safety of the LAMP-targeting vaccines. The long-term protective immune responses induced by pVAX-LAMP/Gc may be

  6. Generalization of Vaidya's radiation metric

    Energy Technology Data Exchange (ETDEWEB)

    Gleiser, R J; Kozameh, C N [Universidad Nacional de Cordoba (Argentina). Instituto de Matematica, Astronomia y Fisica

    1981-11-01

    In this paper it is shown that if Vaidya's radiation metric is considered from the point of view of kinetic theory in general relativity, the corresponding phase space distribution function can be generalized in a particular way. The new family of spherically symmetric radiation metrics obtained contains Vaidya's as a limiting situation. The Einstein field equations are solved in a "comoving" coordinate system. Two arbitrary functions of a single variable are introduced in the process of solving these equations. Particular examples considered are a stationary solution, a nonvacuum solution depending on a single parameter, and several limiting situations.

  7. Marketing communication metrics for social media

    OpenAIRE

    Töllinen, Aarne; Karjaluoto, Heikki

    2011-01-01

    The objective of this paper is to develop a conceptual framework for measuring the effectiveness of social media marketing communications. Specifically, we study whether the existing marketing communications performance metrics are still valid in the changing digitalised communications landscape, or whether it is time to rethink them, or even to devise entirely new metrics. Recent advances in information technology and marketing bring a need to re-examine measurement models. We combine two im...

  8. Evaluation of different set-up error corrections on dose-volume metrics in prostate IMRT using CBCT images

    International Nuclear Information System (INIS)

    Hirose, Yoshinori; Tomita, Tsuneyuki; Kitsuda, Kenji; Notogawa, Takuya; Miki, Katsuhito; Nakamura, Mitsuhiro; Nakamura, Kiyonao; Ishigaki, Takashi

    2014-01-01

    We investigated the effect of different set-up error corrections on dose-volume metrics in intensity-modulated radiotherapy (IMRT) for prostate cancer under different planning target volume (PTV) margin settings using cone-beam computed tomography (CBCT) images. A total of 30 consecutive patients who underwent IMRT for prostate cancer were retrospectively analysed, and 7-14 CBCT datasets were acquired per patient. Interfractional variations in dose-volume metrics were evaluated under six different set-up error corrections, including tattoo, bony anatomy, and four different target matching groups. Set-up errors were incorporated into planning the isocenter position, and dose distributions were recalculated on CBCT images. These processes were repeated under two different PTV margin settings. In the on-line bony anatomy matching groups, systematic error (Σ) was 0.3 mm, 1.4 mm, and 0.3 mm in the left-right, anterior-posterior (AP), and superior-inferior directions, respectively. Σ in three successive off-line target matchings was finally comparable with that in the on-line bony anatomy matching in the AP direction. Although doses to the rectum and bladder wall were reduced for a small PTV margin, averaged reductions in the volume receiving 100% of the prescription dose from planning were within 2.5% under all PTV margin settings for all correction groups, with the exception of the tattoo set-up error correction only (≥ 5.0%). Analysis of variance showed no significant difference between on-line bony anatomy matching and target matching. While variations between the planned and delivered doses were smallest when target matching was applied, the use of bony anatomy matching still ensured the planned doses. (author)
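The population statistics quoted above (systematic error Σ) follow the usual convention for reporting interfractional set-up errors: Σ is the standard deviation of the per-patient mean errors, and the random component σ is the root mean square of the per-patient standard deviations. The abstract does not spell out these definitions, so the sketch below states that convention rather than the authors' exact computation.

```python
import math
from statistics import mean, stdev

def population_setup_errors(per_patient_errors):
    """Σ = SD of the per-patient mean errors (systematic component);
    σ = root mean square of the per-patient SDs (random component).
    per_patient_errors: one list of per-fraction errors (mm) per patient."""
    patient_means = [mean(e) for e in per_patient_errors]
    patient_sds = [stdev(e) for e in per_patient_errors]
    big_sigma = stdev(patient_means)
    small_sigma = math.sqrt(mean(sd ** 2 for sd in patient_sds))
    return big_sigma, small_sigma
```

Each of the six set-up error correction schemes would feed its own per-fraction residual errors into this calculation, one axis at a time.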

  9. A dual-targeting strategy for enhanced drug delivery and synergistic therapy based on thermosensitive nanoparticles.

    Science.gov (United States)

    Wang, Mingxin; You, Chaoqun; Gao, Zhiguo; Wu, Hongshuai; Sun, Baiwang; Zhu, Xiaoli; Chen, Renjie

    2018-08-01

    Functionalized nanoparticles have recently been widely studied and reported as drug delivery carriers. Furthermore, many groups have focused on developing novel and efficient treatment modalities, such as photodynamic therapy and photothermal therapy, since both have shown inspiring potential in antitumor applications. These treatments act cooperatively and can compensate for the adverse effects of conventional monotherapy. In view of the above, we formulated a thermosensitive drug delivery system that achieved enhanced delivery of cisplatin and two photosensitizers (ICG and Ce6) through dual-targeting traction. Using the thin-film hydration method, cisplatin and the photosensitizers were encapsulated inside nanoparticles, while the targeting peptide cRGD and the targeting molecule folate were modified onto the nanoparticle surface to enable active recognition of tumor cells. Dynamic light scattering measurements showed that the prepared nanoparticles had good dispersibility and a uniform particle size of 102.6 nm. Confocal laser scanning microscopy showed that the modified nanoparticles were more efficiently endocytosed by MCF-7 cells than by SGC-7901 cells. Photothermal conversion-triggered drug release and photo-therapies produced a significant apoptosis rate of 85.9% in MCF-7 cells. These results suggest that the formulated delivery system provides a promising strategy for the treatment of breast cancer.

  10. Accuracy and precision in the calculation of phenology metrics

    DEFF Research Database (Denmark)

    Ferreira, Ana Sofia; Visser, Andre; MacKenzie, Brian

    2014-01-01

    Phytoplankton phenology (the timing of seasonal events) is a commonly used indicator for evaluating responses of marine ecosystems to climate change. However, phenological metrics are vulnerable to observation-related (bloom amplitude, missing data, and observational noise) and analysis-related (temporal ...) factors. ... a phenology metric is first determined from a noise- and gap-free time series, and again once it has been modified. We show that precision is a greater concern than accuracy for many of these metrics, an important point that has hitherto been overlooked in the literature. The variability in precision between phenology metrics is substantial, but it can be improved by the use of preprocessing techniques (e.g., gap-filling or smoothing). Furthermore, there are important differences in the inherent variability of the metrics that may be crucial in the interpretation of studies based upon them. Of the considered ...

  11. Quantum anomalies for generalized Euclidean Taub-NUT metrics

    International Nuclear Information System (INIS)

    Cotaescu, Ion I; Moroianu, Sergiu; Visinescu, Mihai

    2005-01-01

    The generalized Taub-NUT metrics exhibit in general gravitational anomalies. This is in contrast with the fact that the original Taub-NUT metric does not exhibit gravitational anomalies, which is a consequence of the fact that it admits Killing-Yano tensors forming Staeckel-Killing tensors as products. We have found that for axial anomalies, interpreted as the index of the Dirac operator, the presence of Killing-Yano tensors is irrelevant. In order to evaluate the axial anomalies, we compute the index of the Dirac operator with the APS boundary condition on balls and on annular domains. The result is an explicit number-theoretic quantity depending on the radii of the domain. This quantity is 0 for metrics close to the original Taub-NUT metric but it does not vanish in general

  12. Metrics for Performance Evaluation of Patient Exercises during Physical Therapy.

    Science.gov (United States)

    Vakanski, Aleksandar; Ferguson, Jake M; Lee, Stephen

    2017-06-01

    The article proposes a set of metrics for evaluation of patient performance in physical therapy exercises. Taxonomy is employed that classifies the metrics into quantitative and qualitative categories, based on the level of abstraction of the captured motion sequences. Further, the quantitative metrics are classified into model-less and model-based metrics, in reference to whether the evaluation employs the raw measurements of patient performed motions, or whether the evaluation is based on a mathematical model of the motions. The reviewed metrics include root-mean square distance, Kullback Leibler divergence, log-likelihood, heuristic consistency, Fugl-Meyer Assessment, and similar. The metrics are evaluated for a set of five human motions captured with a Kinect sensor. The metrics can potentially be integrated into a system that employs machine learning for modelling and assessment of the consistency of patient performance in home-based therapy setting. Automated performance evaluation can overcome the inherent subjectivity in human performed therapy assessment, and it can increase the adherence to prescribed therapy plans, and reduce healthcare costs.
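    As a rough illustration of the model-less quantitative metrics reviewed here, the root-mean-square distance between a reference motion and a patient-performed motion can be sketched as follows (the function name and array layout are assumptions for illustration, not the article's implementation):

```python
import numpy as np

def rms_distance(reference, performed):
    """Root-mean-square distance between two motion sequences,
    each an (n_frames, n_dims) array of joint positions or angles
    captured with, e.g., a Kinect sensor."""
    reference = np.asarray(reference, dtype=float)
    performed = np.asarray(performed, dtype=float)
    return float(np.sqrt(np.mean((reference - performed) ** 2)))

# Toy example: a performed motion offset from the reference by 0.5
ref = np.zeros((100, 3))
per = np.full((100, 3), 0.5)
print(rms_distance(ref, per))  # → 0.5
```

A lower value indicates closer adherence to the prescribed exercise; model-based metrics such as log-likelihood would instead score the motion against a learned statistical model.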

  13. Technical Note: Using k-means clustering to determine the number and position of isocenters in MLC-based multiple target intracranial radiosurgery.

    Science.gov (United States)

    Yock, Adam D; Kim, Gwe-Ya

    2017-09-01

    To present the k-means clustering algorithm as a tool to address treatment planning considerations characteristic of stereotactic radiosurgery using a single isocenter for multiple targets. For 30 patients treated with stereotactic radiosurgery for multiple brain metastases, the geometric centroids and radii of each metastasis were determined from the treatment planning system. In-house software used this information, together with weighted and unweighted versions of the k-means clustering algorithm, to group the targets to be treated with a single isocenter and to position each isocenter. The algorithm results were evaluated using the within-cluster sum of squares as well as a minimum target coverage metric that considered the effect of target size. Both versions of the algorithm were applied to an example patient to demonstrate the prospective determination of the appropriate number and location of isocenters. Both weighted and unweighted versions of the k-means algorithm were applied successfully to determine the number and position of isocenters. Comparing the two, both the within-cluster sum of squares metric and the minimum target coverage metric resulting from the unweighted version were less than those from the weighted version. The average magnitudes of the differences were small (-0.2 cm² and 0.1% for the within-cluster sum of squares and minimum target coverage, respectively) but statistically significant (Wilcoxon signed-rank test). These results represented an advantage of the unweighted version for the within-cluster sum of squares metric, and an advantage of the weighted version for the minimum target coverage metric. While additional treatment planning considerations have a large influence on the final treatment plan quality, both versions of the k-means algorithm provide automatic, consistent, quantitative, and objective solutions to the tasks associated with SRS treatment planning using a single isocenter for multiple targets. © 2017 The Authors.
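    A minimal sketch of the grouping step described above, using a Lloyd's-style k-means over target centroids with optional per-target weights (all names, the fixed iteration count, and the toy coordinates are assumptions; the authors' in-house software is not public):

```python
import numpy as np

def weighted_kmeans(points, weights, k, iters=50, seed=0):
    """Group target centroids into k isocenter clusters.
    points: (n, 3) target centroids; weights: (n,) e.g. target volumes
    (use equal weights for the unweighted variant). Returns per-target
    cluster labels and the isocenter position for each cluster."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)].copy()
    for _ in range(iters):
        # Assign each target to its nearest isocenter candidate
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each isocenter to the (weighted) centroid of its targets
        for j in range(k):
            m = labels == j
            if m.any():
                centers[j] = np.average(points[m], axis=0, weights=weights[m])
    return labels, centers

# Two well-separated pairs of "metastases" (coordinates in cm)
targets = np.array([[0., 0., 0.], [1., 0., 0.],
                    [10., 10., 10.], [11., 10., 10.]])
labels, centers = weighted_kmeans(targets, np.ones(4), k=2)
```

With well-separated groups, the two nearby targets share one isocenter and the two distant targets share the other; the within-cluster sum of squares can then be read off from the distances between each target and its assigned center.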

  14. SU-G-201-14: Is Maximum Skin Dose a Reliable Metric for Accelerated Partial Breast Irradiation with Brachytherapy?

    International Nuclear Information System (INIS)

    Park, S; Ragab, O; Patel, S; Demanes, J; Kamrava, M; Kim, Y

    2016-01-01

    Purpose: To evaluate the reliability of the maximum point dose (Dmax) to the skin surface as a dosimetric constraint, we investigated the correlation between Dmax at the skin surface and dose metrics at various definitions of skin thickness. Methods: 42 patients treated with APBI using a Strut Adjusted Volume Implant (SAVI) applicator between 2010 and 2014 were retrospectively reviewed. Target (PTV-EVAL) and organs at risk (OARs: skin, lung, and ribs) were delineated on a CT following NSABP B-39 guidelines. Six skin structures were contoured: a rind 3cm external to the body surface and 1, 2, 3, 4, and 5mm thick rinds deep to the body surface. Inverse planning simulated annealing optimization was used to deliver 32–34Gy in 8-10 fractions to the target while minimizing OAR doses. Dmax, D0.1cc, D1.0cc, and D2.0cc to the various skin structures were calculated. Linear regressions between the metrics were evaluated using the coefficient of determination (R²). Results: The average±SD PTV-EVAL volume and cavity-to-skin distances were 71.1±28.5cc and 6.9±5.0mm. The target V90 and V95 were 97.3±2.3% and 95.1±3.2%. The Dmax to the skin structures were 78.7±10.2% (skin surface), 82.2±10.7% (skin-1mm), 89.4±12.6% (skin-2mm), 97.9±15.4% (skin-3mm), 114.1±32.5% (skin-4mm), and 157.0±85.3% (skin-5mm). Linear regression analysis showed that D1.0cc and D2.0cc to the skin-1mm and Dmax to the skin-4mm and skin-5mm were poorly correlated with other metrics (R²=0.413±0.204). Dmax to the skin surface was well correlated (R²=0.910±0.047) and D1.0cc to the skin-3mm was strongly correlated with all subsurface skin layers (R²=0.935±0.050). Conclusion: Dmax to the skin surface is a relevant metric for breast skin dose. Contouring discontinuities in the skin with a 1mm subsurface rind and the active dwells in the skin-4mm and skin-5mm rinds introduced significant variations in skin DVH. D0.1cc, D1.0cc, and D2.0cc to a 3mm skin rind are more robust metrics in breast brachytherapy.
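    The R² values above come from ordinary linear regressions between pairs of dose metrics across the cohort; a minimal sketch of that computation (function name and toy numbers are illustrative, not the study's data):

```python
import numpy as np

def r_squared(x, y):
    """Coefficient of determination for a simple linear regression
    of dose metric y on dose metric x across patients."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    slope, intercept = np.polyfit(x, y, 1)
    ss_res = np.sum((y - (slope * x + intercept)) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)

# Two perfectly linearly related metrics give R² = 1
dmax_surface = np.array([70.0, 80.0, 90.0, 100.0])   # % of prescription
d01cc_skin3mm = 1.2 * dmax_surface + 5.0
print(r_squared(dmax_surface, d01cc_skin3mm))
```

Values near 1 indicate that one metric reliably predicts the other, which is the sense in which Dmax to the skin surface is called "well correlated" above.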

  15. Self-benchmarking Guide for Laboratory Buildings: Metrics, Benchmarks, Actions

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul; Greenberg, Steve; Sartor, Dale

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.
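    A whole-building metric of the kind the guide describes is simple to compute once the data are collected; a sketch (the metric name, units, and numbers are illustrative, and the Labs21 materials give the exact definitions):

```python
def energy_use_intensity(annual_site_energy_kwh, gross_floor_area_m2):
    """Site energy use intensity: annual energy use normalized by
    floor area, a common whole-building benchmarking metric."""
    return annual_site_energy_kwh / gross_floor_area_m2

# A hypothetical 12,000 m2 laboratory using 3.6 GWh/year
print(energy_use_intensity(3_600_000, 12_000))  # → 300.0
```

Comparing such a value against peer benchmarks is what flags a building for the potential actions the guide lists for each metric.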

  16. Massless and massive quanta resulting from a mediumlike metric tensor

    International Nuclear Information System (INIS)

    Soln, J.

    1985-01-01

    A simple model of the ''primordial'' scalar field theory is presented in which the metric tensor is a generalization of the metric tensor from electrodynamics in a medium. The radiation signal corresponding to the scalar field propagates with a velocity that is generally less than c. This signal can be associated simultaneously with imaginary and real effective (momentum-dependent) masses. The requirement that the imaginary effective mass vanishes, which we take to be the prerequisite for the vacuumlike signal propagation, leads to the ''spontaneous'' splitting of the metric tensor into two distinct metric tensors: one metric tensor gives rise to masslesslike radiation and the other to a massive particle. (author)

  17. A guide to phylogenetic metrics for conservation, community ecology and macroecology

    Science.gov (United States)

    Cadotte, Marc W.; Carvalho, Silvia B.; Davies, T. Jonathan; Ferrier, Simon; Fritz, Susanne A.; Grenyer, Rich; Helmus, Matthew R.; Jin, Lanna S.; Mooers, Arne O.; Pavoine, Sandrine; Purschke, Oliver; Redding, David W.; Rosauer, Dan F.; Winter, Marten; Mazel, Florent

    2016-01-01

    The use of phylogenies in ecology is increasingly common and has broadened our understanding of biological diversity. Ecological sub‐disciplines, particularly conservation, community ecology and macroecology, all recognize the value of evolutionary relationships, but the resulting development of phylogenetic approaches has led to a proliferation of phylogenetic diversity metrics. The use of many metrics across the sub‐disciplines hampers potential meta‐analyses, syntheses, and generalizations of existing results. Further, there is no guide for selecting the appropriate metric for a given question, and different metrics are frequently used to address similar questions. To improve the choice, application, and interpretation of phylo‐diversity metrics, we organize existing metrics by expanding on a unifying framework for phylogenetic information. Generally, questions about phylogenetic relationships within or between assemblages tend to ask three types of question: how much; how different; or how regular? We show that these questions reflect three dimensions of a phylogenetic tree: richness, divergence, and regularity. We classify 70 existing phylo‐diversity metrics based on their mathematical form within these three dimensions and identify ‘anchor’ representatives: for α‐diversity metrics these are PD (Faith's phylogenetic diversity), MPD (mean pairwise distance), and VPD (variation of pairwise distances). By analysing mathematical formulae and using simulations, we use this framework to identify metrics that mix dimensions, and we provide a guide to choosing and using the most appropriate metrics. We show that metric choice requires connecting the research question with the correct dimension of the framework and that there are logical approaches to selecting and interpreting metrics. The guide outlined herein will help researchers navigate the current jungle of indices. PMID:26785932
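    Of the anchor metrics named in this framework, MPD is straightforward to compute from a matrix of pairwise phylogenetic distances; a minimal sketch (the function name, input layout, and toy distances are assumptions):

```python
import numpy as np

def mean_pairwise_distance(dist, present):
    """MPD: mean pairwise phylogenetic distance among the species
    present in an assemblage. dist is a symmetric (n, n) matrix of
    tip-to-tip distances; present is a boolean mask of length n."""
    idx = np.flatnonzero(present)
    sub = dist[np.ix_(idx, idx)]
    pairs = sub[np.triu_indices(len(idx), k=1)]
    return float(pairs.mean())

# Three species: A and B are sisters, C is the outgroup
d = np.array([[0., 2., 4.],
              [2., 0., 4.],
              [4., 4., 0.]])
print(mean_pairwise_distance(d, [True, True, True]))  # → 10/3 ≈ 3.33
```

MPD captures the divergence dimension of the framework; PD (summed branch lengths) and VPD (variance of the same pairwise distances) address the richness and regularity dimensions respectively.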

  18. Circumferential or sectored beam arrangements for stereotactic body radiation therapy (SBRT) of primary lung tumors: Effect on target and normal-structure dose-volume metrics

    Energy Technology Data Exchange (ETDEWEB)

    Rosenberg, Mara W. [Broad Institute of MIT and Harvard, Cambridge, MA (United States); Department of Physics, Brandeis University, Waltham, MA (United States); Kato, Catherine M. [Macalester College, St. Paul, MN (United States); Carson, Kelly M.P. [The University of North Carolina, Chapel Hill, NC (United States); Matsunaga, Nathan M. [Santa Clara University, Santa Clara, CA (United States); Arao, Robert F. [Department of Public Health and Preventive Medicine, Oregon Health and Science University, Portland, OR (United States); Doss, Emily J. [Department of Internal Medicine, Providence St. Vincent Medical Center, Portland, OR (United States); McCracken, Charles L. [Department of Radiation Medicine, Oregon Health and Science University, Portland, OR (United States); Meng, Lu Z. [Department of Radiation Oncology, University of California Davis Comprehensive Cancer Center, Sacramento, CA (United States); Chen, Yiyi [Department of Public Health and Preventive Medicine, Oregon Health and Science University, Portland, OR (United States); Laub, Wolfram U.; Fuss, Martin [Department of Radiation Medicine, Oregon Health and Science University, Portland, OR (United States); Department of Nuclear Engineering and Radiation Health Physics, Oregon State University, Corvallis, OR (United States); Tanyi, James A., E-mail: tanyij@ohsu.edu [Department of Radiation Medicine, Oregon Health and Science University, Portland, OR (United States); Department of Nuclear Engineering and Radiation Health Physics, Oregon State University, Corvallis, OR (United States)

    2013-01-01

    To compare 2 beam arrangements, sectored (beam entry over ipsilateral hemithorax) vs circumferential (beam entry over both ipsilateral and contralateral lungs), for static-gantry intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT) delivery techniques with respect to target and organs-at-risk (OAR) dose-volume metrics, as well as treatment delivery efficiency. Data from 60 consecutive patients treated using stereotactic body radiation therapy (SBRT) for primary non–small-cell lung cancer (NSCLC) formed the basis of this study. Four treatment plans were generated per data set: IMRT/VMAT plans using sectored (-s) and circumferential (-c) configurations. The prescribed dose (PD) was 60 Gy in 5 fractions to 95% of the planning target volume (PTV) (maximum PTV dose ∼150% PD) for a 6-MV photon beam. Plan conformality, R50 (ratio of the volume circumscribed by the 50% isodose line to the PTV), and D2cm (Dmax at a distance ≥2 cm beyond the PTV) were evaluated. For the lungs, mean lung dose (MLD) and percent V30/V20/V10/V5 Gy were assessed. Spinal cord and esophagus Dmax and D5/D50 were computed. Chest wall (CW) Dmax and absolute V30/V20/V10/V5 Gy were reported. Sectored SBRT planning resulted in a significant decrease in contralateral MLD and V10/V5 Gy, as well as contralateral CW Dmax and V10/V5 Gy (all p < 0.001). Nominal reductions of Dmax and D5/D50 for the spinal cord with sectored planning did not reach statistical significance for static-gantry IMRT, although the VMAT metrics did show a statistically significant decrease (all p < 0.001). The respective measures for esophageal doses were significantly lower with sectored planning (p < 0.001). Despite comparable dose conformality, irrespective of planning configuration, R50 significantly improved with IMRT

  19. Assessing Software Quality Through Visualised Cohesion Metrics

    Directory of Open Access Journals (Sweden)

    Timothy Shih

    2001-05-01

    Cohesion is one of the most important factors for software quality, as well as maintainability, reliability and reusability. Module cohesion is defined as a quality attribute that measures the singleness of purpose of a module. A module of poor quality can be a serious obstacle to system quality. To design software of good quality, software managers and engineers need cohesion metrics to measure and produce desirable software. Highly cohesive software is considered a desirable construction. In this paper, we propose a function-oriented cohesion metric based on the analysis of live variables, live span and the visualization of the processing-element dependency graph. We measure six typical cohesion examples as our experiments and justification. The result is a well-defined, well-normalized, well-visualized and well-experimented cohesion metric that indicates, and thus helps enhance, software cohesion strength. Furthermore, this cohesion metric can easily be incorporated into a software CASE tool to help software engineers improve software quality.

  20. MESUR: USAGE-BASED METRICS OF SCHOLARLY IMPACT

    Energy Technology Data Exchange (ETDEWEB)

    BOLLEN, JOHAN [Los Alamos National Laboratory; RODRIGUEZ, MARKO A. [Los Alamos National Laboratory; VAN DE SOMPEL, HERBERT [Los Alamos National Laboratory

    2007-01-30

    The evaluation of scholarly communication items is now largely a matter of expert opinion or metrics derived from citation data. Both approaches can fail to take into account the myriad factors that shape scholarly impact. Usage data has emerged as a promising complement to existing methods of assessment, but the formal groundwork to reliably and validly apply usage-based metrics of scholarly impact is lacking. The Andrew W. Mellon Foundation-funded MESUR project constitutes a systematic effort to define, validate and cross-validate a range of usage-based metrics of scholarly impact by creating a semantic model of the scholarly communication process. The constructed model will serve as the basis for creating a large-scale semantic network that seamlessly relates citation, bibliographic and usage data from a variety of sources. A subsequent program that uses the established semantic network as a reference data set will determine the characteristics and semantics of a variety of usage-based metrics of scholarly impact. This paper outlines the architecture and methodology adopted by the MESUR project and its future direction.

  1. Synergistic retention strategy of RGD active targeting and radiofrequency-enhanced permeability for intensified RF & chemotherapy synergistic tumor treatment.

    Science.gov (United States)

    Zhang, Kun; Li, Pei; He, Yaping; Bo, Xiaowan; Li, Xiaolong; Li, Dandan; Chen, Hangrong; Xu, Huixiong

    2016-08-01

    Despite gaining increasing attention, chelation of multiple active-targeting ligands greatly increases the probability of protein corona formation, disabling active targeting. To overcome this, a synergistic retention strategy combining RGD-mediated active targeting and radiofrequency (RF) electromagnetic field-enhanced permeability is proposed here. It is validated that this synergistic retention strategy can promote more poly(lactic-co-glycolic acid) (PLGA)-based capsules encapsulating camptothecin (CPT) and solid DL-menthol (DLM) to enter and be retained in tumors in vitro and in vivo upon exposure to RF irradiation, yielding an over 8-fold enhancement in HeLa retention. Moreover, the PLGA-based capsules respond to the RF field by triggering the entrapped DLM to undergo a solid-liquid-gas (SLG) tri-phase transformation, enhancing RF ablation and CPT release. Therefore, owing to the enhanced RF ablation, the released CPT and the validated synergistic retention effect, the inhibition of tumor growth improved more than 10-fold, realizing RF ablation and chemotherapy synergistic treatment of HeLa solid tumors, which indicates significant promise for clinical RF ablation. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Classroom reconstruction of the Schwarzschild metric

    OpenAIRE

    Kassner, Klaus

    2015-01-01

    A promising way to introduce general relativity in the classroom is to study the physical implications of certain given metrics, such as the Schwarzschild one. This involves lower mathematical expenditure than an approach focusing on differential geometry in its full glory, and it permits physical aspects to be emphasized before attacking the field equations. Even so, in terms of motivation, a lack of justification for the metric employed may pose an obstacle. The paper discusses how to establish the we...

  3. Area Regge calculus and discontinuous metrics

    International Nuclear Information System (INIS)

    Wainwright, Chris; Williams, Ruth M

    2004-01-01

    Taking the triangle areas as independent variables in the theory of Regge calculus can lead to ambiguities in the edge lengths, which can be interpreted as discontinuities in the metric. We construct solutions to area Regge calculus using a triangulated lattice and find that on a spacelike or timelike hypersurface no such discontinuity can arise. On a null hypersurface however, we can have such a situation and the resulting metric can be interpreted as a so-called refractive wave

  4. Use of metrics in an effective ALARA program

    International Nuclear Information System (INIS)

    Bates, B.B. Jr.

    1996-01-01

    ALARA radiological protection programs require metrics to meet their objectives. Sources of metrics include external dosimetry; internal dosimetry; radiological occurrences from the occurrence reporting and processing system (ORPS); and radiological incident reports (RIR). The sources themselves contain an abundance of specific "indicators". Choosing the site-specific indicators that will be tracked and trended requires careful review. Justification is needed to defend the indicators selected, and perhaps even stronger justification is needed for those indicators that are available but not chosen as a metric. Historically, the many different sources of information resided in a plethora of locations. Even the same type of metric had data located in different areas that could not be easily totaled for the entire site. This required the end user to expend valuable time and effort to locate the data they needed. To address this problem, a central metrics database has been developed so that a customer can have all their questions addressed quickly and correctly. The database was developed initially to answer some of the customers' most frequently asked questions. It is now also a tool to communicate the status of the radiation protection program to facility managers. Finally, it also addresses requirements contained in the Rad Con manual and the 10CFR835 implementation guides. The database uses currently available, "user friendly" software and contains information from RIRs, ORPS, and external dosimetry records specific to ALARA performance indicators. The database is expandable to allow new metrics input. Specific reports have been developed to assist customers in their tracking and trending of ALARA metrics. These include quarterly performance indicator reports, monthly radiological incident reports, monthly external dose history and goals tracking reports, and the future use of performance indexing.

  5. Assessment of six dissimilarity metrics for climate analogues

    Science.gov (United States)

    Grenier, Patrick; Parent, Annie-Claude; Huard, David; Anctil, François; Chaumont, Diane

    2013-04-01

    Spatial analogue techniques consist in identifying locations whose recent-past climate is similar in some respects to the future climate anticipated at a reference location. When identifying analogues, one key step is the quantification of the dissimilarity between two climates separated in time and space, which involves the choice of a metric. In this communication, spatial analogues and their usefulness are briefly discussed. Next, six metrics are presented (the standardized Euclidean distance, the Kolmogorov-Smirnov statistic, the nearest-neighbor distance, the Zech-Aslan energy statistic, the Friedman-Rafsky runs statistic and the Kullback-Leibler divergence), along with a set of criteria used for their assessment. The related case study involves numerical simulations performed with the Canadian Regional Climate Model (CRCM-v4.2.3), from which three annual indicators (total precipitation, heating degree-days and cooling degree-days) are calculated over 30-year periods (1971-2000 and 2041-2070). Results indicate that the six metrics identify comparable analogue regions at a relatively large scale, but the best analogues may differ substantially. For best analogues, it is also shown that the uncertainty stemming from the choice of metric generally does not exceed that stemming from the choice of simulation or model. A synthesis of the advantages and drawbacks of each metric is finally presented, in which the Zech-Aslan energy statistic stands out as the most recommended metric for analogue studies, whereas the Friedman-Rafsky runs statistic is the least recommended, based on this case study.
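    As an illustration, the simplest of the six metrics, the standardized Euclidean distance, can be sketched as follows (the indicator values and scales are invented for the example; the study computes the indicators from CRCM output):

```python
import numpy as np

def standardized_euclidean(x, y, scale):
    """Standardized Euclidean distance between two vectors of climate
    indicators (e.g. total precipitation, heating and cooling
    degree-days), each component divided by a scale such as the
    interannual standard deviation."""
    x, y, scale = (np.asarray(v, dtype=float) for v in (x, y, scale))
    return float(np.sqrt(np.sum(((x - y) / scale) ** 2)))

future_ref  = np.array([900.0, 4200.0, 250.0])   # reference site, 2041-2070
candidate   = np.array([870.0, 4500.0, 200.0])   # candidate analogue, 1971-2000
interann_sd = np.array([60.0, 300.0, 50.0])
print(standardized_euclidean(future_ref, candidate, interann_sd))  # → 1.5
```

Candidate locations with the smallest such distance to the reference site's projected climate are retained as analogues; the other five metrics instead compare the full distributions of the indicators rather than single standardized values.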

  6. Effective use of metrics in an ALARA program

    International Nuclear Information System (INIS)

    Bates, B.B. Jr.

    1996-01-01

    ALARA radiological protection programs require metrics to meet their objectives. Sources of metrics include external dosimetry; internal dosimetry; radiological occurrences from the occurrence reporting and processing system (ORPS); and radiological incident reports (RIR). The sources themselves contain an abundance of specific "indicators". Choosing the site-specific indicators that will be tracked and trended requires careful review. Historically, this required end users to expend valuable time and effort to locate the data they needed. To address this problem, a central metrics database has been developed so that customers can have all their questions addressed quickly and correctly. The database was developed initially to answer some of the customers' most frequently asked questions. It is now also a tool to communicate the status of the radiation protection program to facility managers. Finally, it also addresses requirements contained in the Rad Con manual and the 10CFR835 implementation guides. The database uses currently available, "user friendly" software and contains information from RIRs, ORPS, and external dosimetry records specific to ALARA performance indicators. The database is expandable to allow new metrics input. Specific reports have been developed to assist customers in their tracking and trending of ALARA metrics.

  7. Low-complexity atlas-based prostate segmentation by combining global, regional, and local metrics

    Energy Technology Data Exchange (ETDEWEB)

    Xie, Qiuliang; Ruan, Dan, E-mail: druan@mednet.ucla.edu [The Department of Radiation Oncology, University of California Los Angeles, California 90095 (United States)

    2014-04-15

    Purpose: To improve the efficiency of atlas-based segmentation without compromising accuracy, and to demonstrate the validity of the proposed method on an MRI-based prostate segmentation application. Methods: Accurate and efficient automatic structure segmentation is an important task in medical image processing. Atlas-based methods, as the state of the art, provide good segmentation at the cost of a large number of computationally intensive nonrigid registrations for anatomical sites/structures that are subject to deformation. In this study, the authors propose to utilize a combination of global, regional, and local metrics to improve accuracy yet significantly reduce the number of required nonrigid registrations. The authors first perform an affine registration to minimize the global mean squared error (gMSE) to coarsely align each atlas image to the target. Subsequently, a target-specific regional MSE (rMSE), demonstrated to be a good surrogate for the dice similarity coefficient (DSC), is used to select a relevant subset from the training atlas. Only within this subset are nonrigid registrations performed between the training images and the target image, to minimize a weighted combination of gMSE and rMSE. Finally, structure labels are propagated from the selected training samples to the target via the estimated deformation fields, and label fusion is performed based on a weighted combination of rMSE and local MSE (lMSE) discrepancy, with proper total-variation-based spatial regularization. Results: The proposed method was applied to a public database of 30 prostate MR images with expert-segmented structures. The authors' method, utilizing only eight nonrigid registrations, achieved a median/mean DSC of over 0.87/0.86, outperforming the state-of-the-art full-fledged atlas-based segmentation approach, for which the median/mean DSC was 0.84/0.82 when applied to the same data set. Conclusions: The proposed method requires a fixed number of nonrigid
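    The atlas-subset selection step described above amounts to ranking affinely aligned atlases by a cheap intensity criterion and keeping only the best few for nonrigid registration; a minimal sketch (the function names, ROI-mask formulation, and toy images are assumptions, not the authors' code):

```python
import numpy as np

def regional_mse(atlas, target, mask):
    """Mean squared intensity error restricted to a region of interest,
    a cheap surrogate for the dice similarity coefficient."""
    diff = (atlas - target)[mask]
    return float(np.mean(diff ** 2))

def select_atlases(atlases, target, mask, n_keep):
    """Indices of the n_keep atlases with lowest regional MSE; only
    these proceed to the expensive nonrigid registrations."""
    scores = [regional_mse(a, target, mask) for a in atlases]
    return sorted(np.argsort(scores)[:n_keep].tolist())

# Toy images: atlases 0 and 2 are closest to the target in the ROI
target = np.zeros((4, 4))
mask = np.ones((4, 4), dtype=bool)
atlases = [target + 0.1, target + 1.0, target + 0.5]
print(select_atlases(atlases, target, mask, n_keep=2))  # → [0, 2]
```

Because the surrogate is computed on already-aligned images, the cost of ranking the whole atlas is negligible compared with even one nonrigid registration, which is where the efficiency gain comes from.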

  8. Metrication manual

    International Nuclear Information System (INIS)

    Harper, A.F.A.; Digby, R.B.; Thong, S.P.; Lacey, F.

    1978-04-01

    In April 1978 a meeting of senior metrication officers convened by the Commonwealth Science Council of the Commonwealth Secretariat, was held in London. The participants were drawn from Australia, Bangladesh, Britain, Canada, Ghana, Guyana, India, Jamaica, Papua New Guinea, Solomon Islands and Trinidad and Tobago. Among other things, the meeting resolved to develop a set of guidelines to assist countries to change to SI and to compile such guidelines in the form of a working manual

  9. An accurate metric for the spacetime around rotating neutron stars

    Science.gov (United States)

    Pappas, George

    2017-04-01

    The problem of having an accurate description of the spacetime around rotating neutron stars is of great astrophysical interest. For astrophysical applications, one needs a metric that captures all the properties of the spacetime around a rotating neutron star. Furthermore, an accurate, appropriately parametrized metric, i.e. a metric given in terms of parameters that are directly related to the physical structure of the neutron star, could be used to solve the inverse problem, which is to infer the properties of the structure of a neutron star from astrophysical observations. In this work, we present such an approximate stationary and axisymmetric metric for the exterior of rotating neutron stars, which is constructed using the Ernst formalism and is parametrized by the relativistic multipole moments of the central object. This metric is given in terms of an expansion in the Weyl-Papapetrou coordinates with the multipole moments as free parameters and is shown to be extremely accurate in capturing the physical properties of a neutron star spacetime as they are calculated numerically in general relativity. Because the metric is given in terms of an expansion, the expressions are much simpler and easier to implement than in previous approaches. For the parametrization of the metric in general relativity, the recently discovered universal 3-hair relations are used to produce a three-parameter metric. Finally, a straightforward extension of this metric is given for scalar-tensor theories with a massless scalar field, which also admit a formulation in terms of an Ernst potential.

  10. Metric reconstruction from Weyl scalars

    Energy Technology Data Exchange (ETDEWEB)

    Whiting, Bernard F; Price, Larry R [Department of Physics, PO Box 118440, University of Florida, Gainesville, FL 32611 (United States)

    2005-08-07

    The Kerr geometry has remained an elusive world in which to explore physics and delve into the more esoteric implications of general relativity. Following the discovery, by Kerr in 1963, of the metric for a rotating black hole, the most significant advance has been an understanding of its Weyl curvature perturbations based on Teukolsky's discovery of separable wave equations some ten years later. In the current research climate, where experiments across the globe are preparing for the first detection of gravitational waves, an understanding more complete than of the Weyl curvature alone is now called for. To understand precisely how comparatively small masses move in response to the gravitational waves they emit, a formalism has been developed based on a description of the whole spacetime metric perturbation in the neighbourhood of the emission region. Presently, such a description is not available for the Kerr geometry. While there does exist a prescription for obtaining metric perturbations once curvature perturbations are known, it has become apparent that there are gaps in that formalism which are still waiting to be filled. The most serious gaps include gauge inflexibility, the inability to include sources (which are essential when the emitting masses are considered), and the failure to describe the l = 0 and 1 perturbation properties. Among these latter properties of the perturbed spacetime, arising from a point mass in orbit, are the perturbed mass and axial component of angular momentum, as well as the very elusive Carter constant for non-axial angular momentum. A status report is given on recent work which begins to repair these deficiencies in our current incomplete description of Kerr metric perturbations.

  11. Metric reconstruction from Weyl scalars

    International Nuclear Information System (INIS)

    Whiting, Bernard F; Price, Larry R

    2005-01-01

    The Kerr geometry has remained an elusive world in which to explore physics and delve into the more esoteric implications of general relativity. Following the discovery, by Kerr in 1963, of the metric for a rotating black hole, the most significant advance has been an understanding of its Weyl curvature perturbations based on Teukolsky's discovery of separable wave equations some ten years later. In the current research climate, where experiments across the globe are preparing for the first detection of gravitational waves, an understanding more complete than of the Weyl curvature alone is now called for. To understand precisely how comparatively small masses move in response to the gravitational waves they emit, a formalism has been developed based on a description of the whole spacetime metric perturbation in the neighbourhood of the emission region. Presently, such a description is not available for the Kerr geometry. While there does exist a prescription for obtaining metric perturbations once curvature perturbations are known, it has become apparent that there are gaps in that formalism which are still waiting to be filled. The most serious gaps include gauge inflexibility, the inability to include sources (which are essential when the emitting masses are considered), and the failure to describe the l = 0 and 1 perturbation properties. Among these latter properties of the perturbed spacetime, arising from a point mass in orbit, are the perturbed mass and axial component of angular momentum, as well as the very elusive Carter constant for non-axial angular momentum. A status report is given on recent work which begins to repair these deficiencies in our current incomplete description of Kerr metric perturbations.

  12. Targeting the endocannabinoid system : future therapeutic strategies

    NARCIS (Netherlands)

    Aizpurua-Olaizola, Oier; Elezgarai, Izaskun; Rico-Barrio, Irantzu; Zarandona, Iratxe; Etxebarria, Nestor; Usobiaga, Aresatz

    2017-01-01

    The endocannabinoid system (ECS) is involved in many physiological regulation pathways in the human body, which makes this system the target of many drugs and therapies. In this review, we highlight the latest studies regarding the role of the ECS and the drugs that target it, with a particular

  13. A Targeted Enrichment Strategy for Massively Parallel Sequencing of Angiosperm Plastid Genomes

    Directory of Open Access Journals (Sweden)

    Gregory W. Stull

    2013-02-01

    Premise of the study: We explored a targeted enrichment strategy to facilitate rapid and low-cost next-generation sequencing (NGS) of numerous complete plastid genomes from across the phylogenetic breadth of angiosperms. Methods and Results: A custom RNA probe set including the complete sequences of 22 previously sequenced eudicot plastomes was designed to facilitate hybridization-based targeted enrichment of eudicot plastid genomes. Using this probe set and an Agilent SureSelect targeted enrichment kit, we conducted an enrichment experiment including 24 angiosperms (22 eudicots, two monocots), which were subsequently sequenced on a single lane of the Illumina GAIIx with single-end, 100-bp reads. This approach yielded nearly complete to complete plastid genomes with exceptionally high coverage (mean coverage: 717×), even for the two monocots. Conclusions: Our enrichment experiment was highly successful even though many aspects of the capture process employed were suboptimal. Hence, significant improvements to this methodology are feasible. With this general approach and probe set, it should be possible to sequence more than 300 essentially complete plastid genomes in a single Illumina GAIIx lane (achieving 50× mean coverage). However, given the complications of pooling numerous samples for multiplex sequencing and the limited number of barcodes (e.g., 96) available in commercial kits, we recommend 96 samples as a current practical maximum for multiplex plastome sequencing. This high-throughput approach should facilitate large-scale plastid genome sequencing at any level of phylogenetic diversity in angiosperms.
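The 300-genome-per-lane estimate follows from simple proportional scaling: per-lane read throughput is roughly fixed, so the achievable sample count varies inversely with the target coverage. A minimal sketch (the function name is mine; the 24-sample, 717× and 50× figures are the ones reported above):

```python
def max_samples_per_lane(samples_run, mean_coverage, target_coverage):
    """Per-lane throughput is roughly fixed, so sample count scales
    inversely with the mean coverage each sample receives."""
    return int(samples_run * mean_coverage / target_coverage)

# 24 samples at 717x mean coverage -> capacity at 50x mean coverage
capacity = max_samples_per_lane(24, 717, 50)
```

With the reported numbers this gives roughly 344 samples, consistent with the abstract's "more than 300" claim.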

  14. Using Publication Metrics to Highlight Academic Productivity and Research Impact

    Science.gov (United States)

    Carpenter, Christopher R.; Cone, David C.; Sarli, Cathy C.

    2016-01-01

    This article provides a broad overview of widely available measures of academic productivity and impact using publication data and highlights uses of these metrics for various purposes. Metrics based on publication data include measures such as number of publications, number of citations, the journal impact factor score, and the h-index, as well as emerging document-level metrics. Publication metrics can be used for a variety of purposes, including tenure and promotion, grant applications and renewal reports, benchmarking, recruiting efforts, and administrative purposes such as departmental or university performance reports. The authors also highlight practical applications of measuring and reporting academic productivity and impact to emphasize and promote individual investigators, grant applications, or department output. PMID:25308141
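Of the metrics listed, the h-index has a particularly compact definition: the largest h such that the author has at least h papers with at least h citations each. A minimal sketch (hypothetical helper, not from the article):

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank  # the top `rank` papers all have >= rank citations
        else:
            break
    return h
```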

  15. Comparison of routing metrics for wireless mesh networks

    CSIR Research Space (South Africa)

    Nxumalo, SL

    2011-09-01

    ... in each and every relay node so as to find the next hop for the packet. A routing metric is simply a measure, used by a routing protocol, for selecting the best path. Figure 2 shows the relationship between a routing protocol and the routing... on its QoS-awareness level. The routing metrics that considered QoS the most were selected from each group. This section discusses the four routing metrics that were compared in this paper, which are: hop count (HOP), expected transmission count (ETX...
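Of the metrics named, hop count simply counts links, while ETX estimates the expected number of transmissions per link as 1/(df·dr), with df and dr the forward and reverse delivery ratios; a path's ETX is the sum over its links. A minimal sketch of why the two can disagree (illustrative numbers, not from the paper):

```python
def link_etx(df, dr):
    """Expected transmission count for one link: 1 / (df * dr)."""
    return 1.0 / (df * dr)

def path_cost(links, metric):
    """links: list of (df, dr) delivery-ratio pairs, one per hop."""
    if metric == "hop":
        return len(links)
    return sum(link_etx(df, dr) for df, dr in links)

# Two clean links vs one lossy link: hop count prefers the lossy
# single-hop path, while ETX prefers the reliable two-hop path.
clean = [(1.0, 1.0), (1.0, 1.0)]
lossy = [(0.5, 0.5)]
```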

  16. The metrics and correlates of physician migration from Africa

    Directory of Open Access Journals (Sweden)

    Arah Onyebuchi A

    2007-05-01

    Background Physician migration from poor to rich countries is considered an important contributor to the growing health workforce crisis in the developing world. This is particularly true for Africa. The perceived magnitude of such migration for each source country might, however, depend on the choice of metrics used in the analysis. This study examined the influence of the choice of migration metrics on the rankings of African countries that suffered the most physician migration, and investigated the correlates of physician migration. Methods Ranking and correlational analyses were conducted on African physician migration data adjusted for bilateral net flows, and supplemented with developmental, economic and health system data. The setting was the 53 African birth countries of African-born physicians working in nine wealthier destination countries. Three metrics of physician migration were used: total number of physician émigrés; emigration fraction, defined as the proportion of the potential physician pool working in destination countries; and physician migration density, defined as the number of physician émigrés per 1000 population of the African source country. Results Rankings based on any of the migration metrics differed substantially from those based on the other two metrics. Although the emigration fraction and physician migration density metrics gave proportionality to the migration crisis, only the latter was consistently associated with source countries' workforce capacity, health, health spending, economic and development characteristics. As such, higher physician migration density was seen among African countries with relatively higher health workforce capacity (0.401 ≤ r ≤ 0.694, p ≤ 0.011), health status, health spending, and development. Conclusion The perceived magnitude of physician migration is sensitive to the choice of metrics. Complementing the emigration fraction, the physician migration density is a metric
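The two proportional metrics defined in the abstract can be written down directly: the emigration fraction divides émigrés by the potential physician pool (émigrés plus physicians remaining at home), and the migration density divides émigrés by source-country population, per 1000. A minimal sketch (illustrative numbers, not data from the study):

```python
def emigration_fraction(emigres, physicians_at_home):
    """Share of the potential physician pool working abroad."""
    return emigres / (emigres + physicians_at_home)

def migration_density(emigres, population):
    """Physician emigres per 1000 population of the source country."""
    return 1000.0 * emigres / population
```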

  17. Diagnostic imaging strategy for MDCT- or MRI-detected breast lesions: use of targeted sonography

    International Nuclear Information System (INIS)

    Nakano, Satoko; Ohtsuka, Masahiko; Mibu, Akemi; Karikomi, Masato; Sakata, Hitomi; Yamamoto, Masahiro

    2012-01-01

    Leading-edge technology such as magnetic resonance imaging (MRI) or computed tomography (CT) often reveals mammographically and ultrasonographically occult lesions. MRI is a well-documented, effective tool to evaluate these lesions; however, the detection rate of targeted sonography varies for MRI-detected lesions, and its significance is not well established in the diagnostic strategy for MRI-detected lesions. We assessed the utility of targeted sonography for multidetector-row CT (MDCT)- or MRI-detected lesions in practice. We retrospectively reviewed 695 patients with newly diagnosed breast cancer who were candidates for breast-conserving surgery and underwent MDCT or MRI in our hospital between January 2004 and March 2011. Targeted sonography was performed for all MDCT- or MRI-detected lesions, followed by imaging-guided biopsy. Patient background, histopathology features and the sizes of the lesions were compared among the benign, malignant and follow-up groups. Of the 695 patients, 61 lesions in 56 patients were detected by MDCT or MRI. The MDCT- or MRI-detected lesions were identified by targeted sonography in 58 out of 61 lesions (95.1%). Patients with pathological diagnoses were significantly older and more likely to be postmenopausal than the follow-up patients. The pathological diagnosis proved to be benign in 20 cases and malignant in 25. The remaining 16 lesions have been followed up. Lesion size and shape were not significantly different among the benign, malignant and follow-up groups. Approximately 95% of MDCT- or MRI-detected lesions were identified by targeted sonography, and nearly half of these lesions were pathologically proven malignancies in this study. Targeted sonography is a useful modality for MDCT- or MRI-detected breast lesions.

  18. Use of social media in health promotion: purposes, key performance indicators, and evaluation metrics.

    Science.gov (United States)

    Neiger, Brad L; Thackeray, Rosemary; Van Wagenen, Sarah A; Hanson, Carl L; West, Joshua H; Barnes, Michael D; Fagen, Michael C

    2012-03-01

    Despite the expanding use of social media, little has been published about its appropriate role in health promotion, and even less has been written about evaluation. The purpose of this article is threefold: (a) outline purposes for social media in health promotion, (b) identify potential key performance indicators associated with these purposes, and (c) propose evaluation metrics for social media related to the key performance indicators. Process evaluation is presented in this article as an overarching evaluation strategy for social media.

  19. Relevance of motion-related assessment metrics in laparoscopic surgery.

    Science.gov (United States)

    Oropesa, Ignacio; Chmarra, Magdalena K; Sánchez-González, Patricia; Lamata, Pablo; Rodrigues, Sharon P; Enciso, Silvia; Sánchez-Margallo, Francisco M; Jansen, Frank-Willem; Dankelman, Jenny; Gómez, Enrique J

    2013-06-01

    Motion metrics have become an important source of information when addressing the assessment of surgical expertise. However, their direct relationship with the different surgical skills has not been fully explored. The purpose of this study is to investigate the relevance of motion-related metrics in the evaluation of basic psychomotor laparoscopic skills and their correlation with the different abilities they seek to measure. A framework for task definition and metric analysis is proposed. An explorative survey was first conducted with a board of experts to identify metrics to assess basic psychomotor skills. Based on the output of that survey, 3 novel tasks for surgical assessment were designed. Face and construct validation was performed, with focus on motion-related metrics. Tasks were performed by 42 participants (16 novices, 22 residents, and 4 experts). Movements of the laparoscopic instruments were registered with the TrEndo tracking system and analyzed. Time, path length, and depth showed construct validity for all 3 tasks. Motion smoothness and idle time also showed validity for tasks involving bimanual coordination and tasks requiring a more tactical approach, respectively. Additionally, motion smoothness and average speed showed a high internal consistency, proving them to be the most task-independent of all the metrics analyzed. Motion metrics are complementary and valid for assessing basic psychomotor skills, and their relevance depends on the skill being evaluated. A larger clinical implementation, combined with quality performance information, will give more insight into the relevance of the results shown in this study.

  20. Alternative Strategies to Achieve Cardiovascular Mortality Goals in China and India: A Microsimulation of Target- Versus Risk-Based Blood Pressure Treatment.

    Science.gov (United States)

    Basu, Sanjay; Yudkin, John S; Sussman, Jeremy B; Millett, Christopher; Hayward, Rodney A

    2016-03-01

    The World Health Organization aims to reduce mortality from chronic diseases including cardiovascular disease (CVD) by 25% by 2025. High blood pressure is a leading CVD risk factor. We sought to compare 3 strategies for treating blood pressure in China and India: a treat-to-target (TTT) strategy emphasizing lowering blood pressure to a target, a benefit-based tailored treatment (BTT) strategy emphasizing lowering CVD risk, or a hybrid strategy currently recommended by the World Health Organization. We developed a microsimulation model of adults aged 30 to 70 years in China and in India to compare the 2 treatment approaches across a 10-year policy-planning horizon. In the model, a BTT strategy treating adults with a 10-year CVD event risk of ≥ 10% used similar financial resources but averted ≈ 5 million more disability-adjusted life-years in both China and India than a TTT approach based on current US guidelines. The hybrid strategy in the current World Health Organization guidelines produced no substantial benefits over TTT. BTT was more cost-effective at $205 to $272/disability-adjusted life-year averted, which was $142 to $182 less per disability-adjusted life-year than TTT or hybrid strategies. The comparative effectiveness of BTT was robust to uncertainties in CVD risk estimation and to variations in the age range analyzed, the BTT treatment threshold, or rates of treatment access, adherence, or concurrent statin therapy. In model-based analyses, a simple BTT strategy was more effective and cost-effective than TTT or hybrid strategies in reducing mortality. © 2016 American Heart Association, Inc.
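The core difference between the compared strategies reduces to the rule for initiating blood-pressure treatment: TTT keys on the blood-pressure value itself, BTT on estimated absolute CVD risk. A schematic sketch (the ≥10% risk cutoff is the one modeled in the abstract; the 140 mmHg systolic target is a common guideline value assumed here for illustration, not taken from the paper):

```python
def initiate_treatment(systolic_bp, cvd_risk_10yr, strategy):
    """Decide whether to start antihypertensive treatment."""
    if strategy == "TTT":
        # treat-to-target: treat whenever BP exceeds the target value
        return systolic_bp >= 140
    if strategy == "BTT":
        # benefit-based tailored treatment: treat by absolute CVD risk
        return cvd_risk_10yr >= 0.10
    raise ValueError(f"unknown strategy: {strategy}")
```

The rules diverge for low-risk patients with elevated BP (treated under TTT only) and high-risk patients with moderate BP (treated under BTT only), which is where the modeled efficiency gains arise.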

  1. Intermittent search strategies

    Science.gov (United States)

    Bénichou, O.; Loverdo, C.; Moreau, M.; Voituriez, R.

    2011-01-01

    This review examines intermittent target search strategies, which combine phases of slow motion, allowing the searcher to detect the target, and phases of fast motion during which targets cannot be detected. It is first shown that intermittent search strategies are actually widely observed at various scales. At the macroscopic scale, this is, for example, the case of animals looking for food; at the microscopic scale, intermittent transport patterns are involved in a reaction pathway of DNA-binding proteins as well as in intracellular transport. Second, generic stochastic models are introduced, which show that intermittent strategies are efficient strategies that enable the minimization of search time. This suggests that the intrinsic efficiency of intermittent search strategies could justify their frequent observation in nature. Last, beyond these modeling aspects, it is proposed that intermittent strategies could also be used in a broader context to design and accelerate search processes.

  2. Left-invariant Einstein metrics on S3 × S3

    Science.gov (United States)

    Belgun, Florin; Cortés, Vicente; Haupt, Alexander S.; Lindemann, David

    2018-06-01

    The classification of homogeneous compact Einstein manifolds in dimension six is an open problem. We consider the remaining open case, namely left-invariant Einstein metrics g on G = SU(2) × SU(2) = S3 × S3. Einstein metrics are critical points of the total scalar curvature functional for fixed volume. The scalar curvature S of a left-invariant metric g is constant and can be expressed as a rational function in the parameters determining the metric. The critical points of S, subject to the volume constraint, are given by the zero locus of a system of polynomials in the parameters. In general, however, the determination of the zero locus is apparently out of reach. Instead, we consider the case where the isotropy group K of g in the group of motions is non-trivial. When K ≇ Z2 we prove that the Einstein metrics on G are given by (up to homothety) either the standard metric or the nearly Kähler metric, based on representation-theoretic arguments and computer algebra. For the remaining case K ≅ Z2 we present partial results.

  3. Targeting AMPK Signaling as a Neuroprotective Strategy in Parkinson's Disease.

    Science.gov (United States)

    Curry, Daniel W; Stutz, Bernardo; Andrews, Zane B; Elsworth, John D

    2018-03-26

    Parkinson's disease (PD) is the second most common neurodegenerative disorder. It is characterized by the accumulation of intracellular α-synuclein aggregates and the degeneration of nigrostriatal dopaminergic neurons. While no treatment strategy has been proven to slow or halt the progression of the disease, there is mounting evidence from preclinical PD models that activation of 5'-AMP-activated protein kinase (AMPK) may have broad neuroprotective effects. Numerous dietary supplements and pharmaceuticals (e.g., metformin) that increase AMPK activity are available for use in humans, but clinical studies of their effects in PD patients are limited. AMPK is an evolutionarily conserved serine/threonine kinase that is activated by falling energy levels and functions to restore cellular energy balance. However, in response to certain cellular stressors, AMPK activation may exacerbate neuronal atrophy and cell death. This review describes the regulation and functions of AMPK, evaluates the controversies in the field, and assesses the potential of targeting AMPK signaling as a neuroprotective treatment for PD.

  4. Active Metric Learning from Relative Comparisons

    OpenAIRE

    Xiong, Sicheng; Rosales, Rómer; Pei, Yuanli; Fern, Xiaoli Z.

    2014-01-01

    This work focuses on active learning of distance metrics from relative comparison information. A relative comparison specifies, for a data point triplet $(x_i,x_j,x_k)$, that instance $x_i$ is more similar to $x_j$ than to $x_k$. Such constraints, when available, have been shown to be useful toward defining appropriate distance metrics. In real-world applications, acquiring constraints often requires considerable human effort. This motivates us to study how to select and query the most useful ...
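A relative comparison constrains a learned Mahalanobis metric $d_M(x,y)^2 = (x-y)^T M (x-y)$: the triplet $(x_i,x_j,x_k)$ is satisfied when $d_M(x_i,x_j) < d_M(x_i,x_k)$. A minimal sketch of checking such constraints under a candidate metric (not the paper's active-selection algorithm):

```python
import numpy as np

def mahalanobis_sq(x, y, M):
    """Squared Mahalanobis distance (x - y)^T M (x - y)."""
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(d @ M @ d)

def satisfies(triplet, M):
    """Triplet (xi, xj, xk): xi must be closer to xj than to xk."""
    xi, xj, xk = triplet
    return mahalanobis_sq(xi, xj, M) < mahalanobis_sq(xi, xk, M)
```

Note that the same triplet can be satisfied under one positive semidefinite M and violated under another, which is exactly what makes such constraints informative for metric learning.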

  5. A practical approach to determine dose metrics for nanomaterials.

    Science.gov (United States)

    Delmaar, Christiaan J E; Peijnenburg, Willie J G M; Oomen, Agnes G; Chen, Jingwen; de Jong, Wim H; Sips, Adriënne J A M; Wang, Zhuang; Park, Margriet V D Z

    2015-05-01

    Traditionally, administered mass is used to describe doses of conventional chemical substances in toxicity studies. For deriving toxic doses of nanomaterials, mass and chemical composition alone may not adequately describe the dose, because particles with the same chemical composition can have completely different toxic mass doses depending on properties such as particle size. Other dose metrics such as particle number, volume, or surface area have been suggested, but consensus is lacking. The discussion regarding the most adequate dose metric for nanomaterials clearly needs a systematic, unbiased approach to determine the most appropriate dose metric for nanomaterials. In the present study, the authors propose such an approach and apply it to results from in vitro and in vivo experiments with silver and silica nanomaterials. The proposed approach is shown to provide a convenient tool to systematically investigate and interpret dose metrics of nanomaterials. Recommendations for study designs aimed at investigating dose metrics are provided. © 2015 SETAC.
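For spherical, monodisperse particles the candidate dose metrics are related analytically: at fixed mass, particle number scales as 1/d³ and total surface area as 1/d. A minimal sketch of converting a mass dose into the other metrics (function and unit choices are mine, not from the study):

```python
import math

def dose_metrics(mass_ug, diameter_nm, density_g_cm3):
    """Convert a mass dose to particle number and total surface area,
    assuming spherical, monodisperse particles."""
    r_cm = (diameter_nm * 1e-7) / 2.0                        # nm -> cm
    volume_cm3 = (4.0 / 3.0) * math.pi * r_cm ** 3
    mass_per_particle_ug = density_g_cm3 * volume_cm3 * 1e6  # g -> ug
    number = mass_ug / mass_per_particle_ug
    surface_cm2 = number * 4.0 * math.pi * r_cm ** 2
    return number, surface_cm2
```

Halving the diameter at constant mass yields 8× as many particles and 2× the surface area, which illustrates why equal mass doses of the same material can correspond to very different doses under other metrics.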

  6. Information-theoretic semi-supervised metric learning via entropy regularization.

    Science.gov (United States)

    Niu, Gang; Dai, Bo; Yamada, Makoto; Sugiyama, Masashi

    2014-08-01

    We propose a general information-theoretic approach to semi-supervised metric learning called SERAPH (SEmi-supervised metRic leArning Paradigm with Hypersparsity) that does not rely on the manifold assumption. Given the probability parameterized by a Mahalanobis distance, we maximize its entropy on labeled data and minimize its entropy on unlabeled data following entropy regularization. For metric learning, entropy regularization improves manifold regularization by considering the dissimilarity information of unlabeled data in the unsupervised part, and hence it allows the supervised and unsupervised parts to be integrated in a natural and meaningful way. Moreover, we regularize SERAPH by trace-norm regularization to encourage low-dimensional projections associated with the distance metric. The nonconvex optimization problem of SERAPH could be solved efficiently and stably by either a gradient projection algorithm or an EM-like iterative algorithm whose M-step is convex. Experiments demonstrate that SERAPH compares favorably with many well-known metric learning methods, and the learned Mahalanobis distance possesses high discriminability even under noisy environments.

  7. Fisher information metrics for binary classifier evaluation and training

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Different evaluation metrics for binary classifiers are appropriate to different scientific domains and even to different problems within the same domain. This presentation focuses on the optimisation of event selection to minimise statistical errors in HEP parameter estimation, a problem that is best analysed in terms of the maximisation of Fisher information about the measured parameters. After describing a general formalism to derive evaluation metrics based on Fisher information, three more specific metrics are introduced for the measurements of signal cross sections in counting experiments (FIP1) or distribution fits (FIP2) and for the measurements of other parameters from distribution fits (FIP3). The FIP2 metric is particularly interesting because it can be derived from any ROC curve, provided that prevalence is also known. In addition to its relation to measurement errors when used as an evaluation criterion (which makes it more interesting than the ROC AUC), a further advantage of the FIP2 metric is ...

  8. SU-E-T-222: How to Define and Manage Quality Metrics in Radiation Oncology.

    Science.gov (United States)

    Harrison, A; Cooper, K; DeGregorio, N; Doyle, L; Yu, Y

    2012-06-01

    Since the 2001 IOM Report Crossing the Quality Chasm: A New Health System for the 21st Century, the need to provide quality metrics in health care has increased. Quality metrics have yet to be defined for the field of radiation oncology. This study represents one institute's initial efforts at defining and measuring quality metrics, using our electronic medical record and verify system (EMR) as a primary data collection tool. This effort began by selecting meaningful quality metrics rooted in the IOM definition of quality (safe, timely, efficient, effective, equitable and patient-centered care) that were also measurable targets based on current data input and workflow. Elekta MOSAIQ 2.30.04D1 was used to generate reports on the number of Special Physics Consults (SPC) charged, as a surrogate for treatment complexity; daily patient time in department (DTP), as a measure of efficiency and timeliness; and time from CT-simulation to first LINAC appointment (STL). The number of IMRT QAs delivered in the department was also analyzed to assess complexity. Although initial MOSAIQ reports were easily generated, the data needed to be assessed and adjusted for outliers. Patients with delays outside of radiation oncology, such as chemotherapy or surgery, were excluded from STL data. We found an average STL of six days for all CT-simulated patients and an average DTP of 52 minutes total time, with 23 minutes in the LINAC vault. Annually, 7.3% of all patients require additional physics support, as indicated by SPC. Utilizing our EMR, an entire year's worth of useful data characterizing our clinical experience was analyzed in less than one day. Having baseline quality metrics is necessary to improve patient care. Future plans include dissecting this data into more specific categories such as IMRT DTP, workflow timing following CT-simulation, beam-on hours, chart review outcomes, and dosimetric quality indicators. © 2012 American Association of Physicists in Medicine.
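Reports like those described reduce to simple aggregation over EMR records; the one subtlety the abstract notes is excluding patients whose simulation-to-treatment interval was delayed outside radiation oncology. A schematic sketch (record field names are hypothetical, not MOSAIQ's schema):

```python
def department_metrics(records, external_delay_ids=frozenset()):
    """records: dicts with 'id', 'stl_days' (CT-simulation to first
    LINAC appointment) and 'spc' (True if a Special Physics Consult
    was charged).  Patients delayed outside radiation oncology are
    excluded from the STL average but still count toward SPC rate."""
    stl = [r["stl_days"] for r in records
           if r["id"] not in external_delay_ids]
    mean_stl = sum(stl) / len(stl)
    spc_rate = sum(1 for r in records if r["spc"]) / len(records)
    return mean_stl, spc_rate
```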

  9. Molecular Mechanisms of Diabetic Retinopathy, General Preventive Strategies, and Novel Therapeutic Targets

    Science.gov (United States)

    Safi, Sher Zaman; Kumar, Selva; Ismail, Ikram Shah Bin

    2014-01-01

    The growing number of people with diabetes worldwide suggests that diabetic retinopathy (DR) and diabetic macular edema (DME) will continue to be sight-threatening factors. Diabetic retinopathy is a widespread cause of visual impairment in the world, and a range of hyperglycemia-linked pathways have been implicated in the initiation and progression of this condition. Despite understanding of the polyol pathway flux, activation of protein kinase C (PKC) isoforms, increased hexosamine pathway flux, and increased advanced glycation end-product (AGE) formation, the pathogenic mechanisms underlying diabetes-induced vision loss are not fully understood. The purpose of this paper is to review the molecular mechanisms that regulate cell survival and apoptosis of retinal cells and to discuss new and exciting therapeutic targets in comparison with the old and inefficient preventive strategies. This review highlights the recent advancements in understanding hyperglycemia-induced biochemical and molecular alterations, systemic metabolic factors, and aberrant activation of signaling cascades that ultimately lead to activation of a number of transcription factors causing functional and structural damage to retinal cells. It also reviews the established interventions and emerging molecular targets to avert diabetic retinopathy and its associated risk factors. PMID:25105142

  10. Metrical and dynamical aspects in complex analysis

    CERN Document Server

    2017-01-01

    The central theme of this reference book is the metric geometry of complex analysis in several variables. Bridging a gap in the current literature, the text focuses on the fine behavior of the Kobayashi metric of complex manifolds and its relationships to dynamical systems, hyperbolicity in the sense of Gromov and operator theory, all very active areas of research. The modern points of view expressed in these notes, collected here for the first time, will be of interest to academics working in the fields of several complex variables and metric geometry. The different topics are treated coherently and include expository presentations of the relevant tools, techniques and objects, which will be particularly useful for graduate and PhD students specializing in the area.

  11. PSA-selective activation of cytotoxic human serine proteases within the tumor microenvironment as a therapeutic strategy to target prostate cancer.

    Science.gov (United States)

    Rogers, Oliver C; Anthony, Lizamma; Rosen, D Marc; Brennen, W Nathaniel; Denmeade, Samuel R

    2018-04-27

    Prostate cancer is the most diagnosed malignancy and the second leading cause of cancer-related death in American men. While localized therapy is highly curative, treatments for metastatic prostate cancer are largely palliative. Thus, new innovative therapies are needed to target metastatic tumors. Prostate-Specific Antigen (PSA) is a chymotrypsin-like protease with a unique substrate specificity that is secreted by both normal and malignant prostate epithelial cells. Previous studies demonstrated that high levels (μM-mM) of enzymatically active PSA are present in the extracellular fluid of the prostate cancer microenvironment. Because of this, PSA is an attractive target for a protease-activated pro-toxin therapeutic strategy. Because prostate cancers typically grow very slowly, a strategy employing a proliferation-independent cytotoxic payload is preferred. Recently, it was shown that the human protease Granzyme B (GZMB), at low micromolar concentrations in the extracellular space, can cleave an array of extracellular matrix (ECM) proteins, thus perturbing cell growth, signaling, motility, and integrity. It is also well established that other human proteases such as trypsin can induce similar effects. Because both enzymes require N-terminal proteolytic activation, we propose to convert these proteins into PSA-activated cytotoxins. In this study, we examine the enzymatic and cell-targeting parameters of these PSA-activated cytotoxic serine proteases. These pro-enzymes were activated robustly by PSA and induced ECM damage that led to the death of prostate cancer cells in vitro, thus supporting the potential use of this strategy as a means to target metastatic prostate cancers.

  12. Important LiDAR metrics for discriminating forest tree species in Central Europe

    Science.gov (United States)

    Shi, Yifang; Wang, Tiejun; Skidmore, Andrew K.; Heurich, Marco

    2018-03-01

Numerous airborne LiDAR-derived metrics have been proposed for classifying tree species. Yet an in-depth ecological and biological understanding of the significance of these metrics for tree species mapping remains largely unexplored. In this paper, we evaluated the performance of 37 frequently used LiDAR metrics derived under leaf-on and leaf-off conditions, respectively, for discriminating six different tree species in a natural forest in Germany. We first assessed the correlation between these metrics. Then we applied a Random Forest algorithm to classify the tree species and evaluated the importance of the LiDAR metrics. Finally, we identified the most important LiDAR metrics and tested their robustness and transferability. Our results indicated that about 60% of LiDAR metrics were highly correlated with each other (|r| > 0.7). There was no statistically significant difference in tree species mapping accuracy between the use of leaf-on and leaf-off LiDAR metrics. However, combining leaf-on and leaf-off LiDAR metrics significantly increased the overall accuracy from 58.2% (leaf-on) and 62.0% (leaf-off) to 66.5%, as well as the kappa coefficient from 0.47 (leaf-on) and 0.51 (leaf-off) to 0.58. Radiometric features, especially intensity-related metrics, provided more consistent and significant contributions than geometric features for tree species discrimination. Specifically, the mean intensity of first-or-single returns as well as the mean value of echo width were identified as the most robust LiDAR metrics for tree species discrimination. These results indicate that metrics derived from airborne LiDAR data, especially radiometric metrics, can aid in discriminating tree species in a mixed temperate forest, and represent candidate metrics for tree species classification and monitoring in Central Europe.
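The correlation screening step this abstract describes, flagging metric pairs with |r| > 0.7 before classification, can be sketched on synthetic data. The feature layout, sample size, and greedy pruning rule below are illustrative assumptions of this sketch, not the study's actual LiDAR metrics or method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for LiDAR metrics: 200 trees x 6 metrics.
# Columns 1 and 2 are deliberately near-duplicates of column 0,
# mimicking the reported redundancy among LiDAR metrics.
base = rng.normal(size=(200, 1))
X = np.hstack([
    base,
    base + 0.1 * rng.normal(size=(200, 1)),   # highly correlated with base
    base + 0.1 * rng.normal(size=(200, 1)),   # highly correlated with base
    rng.normal(size=(200, 3)),                # independent metrics
])

r = np.corrcoef(X, rowvar=False)              # pairwise Pearson correlations

# Greedy pruning: keep a metric only if |r| <= 0.7 with every kept metric.
kept = []
for j in range(X.shape[1]):
    if all(abs(r[j, k]) <= 0.7 for k in kept):
        kept.append(j)

print(kept)  # the two near-duplicate columns are dropped
```

A pruned set like this is what would then be passed to the Random Forest classifier.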

  13. Human Performance Optimization Metrics: Consensus Findings, Gaps, and Recommendations for Future Research.

    Science.gov (United States)

Nindl, Bradley C; Jaffin, Dianna P; Dretsch, Michael N; Cheuvront, Samuel N; Wesensten, Nancy J; Kent, Michael L; Grunberg, Neil E; Pierce, Joseph R; Barry, Erin S; Scott, Jonathan M; Young, Andrew J; O'Connor, Francis G; Deuster, Patricia A

    2015-11-01

Human performance optimization (HPO) is defined as "the process of applying knowledge, skills and emerging technologies to improve and preserve the capabilities of military members, and organizations to execute essential tasks." The lack of consensus on operationally relevant and standardized metrics that meet joint military requirements has been identified as the single most important gap for research and application of HPO. In 2013, the Consortium for Health and Military Performance hosted a meeting to develop a toolkit of standardized HPO metrics for use in military and civilian research, and potentially for field applications by commanders, units, and organizations. Performance was considered from a holistic perspective as being influenced by various behaviors and barriers. To accomplish the goal of developing a standardized toolkit, key metrics were identified and evaluated across a spectrum of domains that contribute to HPO: physical performance, nutritional status, psychological status, cognitive performance, environmental challenges, sleep, and pain. These domains were chosen based on relevant data with regard to performance enhancers and degraders. The specific objectives at this meeting were to (a) identify and evaluate current metrics for assessing human performance within selected domains; (b) prioritize metrics within each domain to establish a human performance assessment toolkit; and (c) identify scientific gaps and the research needed to more effectively assess human performance across domains. This article provides a summary of 150 total HPO metrics across multiple domains that can be used as a starting point, the beginning of an HPO toolkit: physical fitness (29 metrics), nutrition (24 metrics), psychological status (36 metrics), cognitive performance (35 metrics), environment (12 metrics), sleep (9 metrics), and pain (5 metrics). These metrics can be particularly valuable as the military emphasizes a renewed interest in Human Dimension efforts.

  14. Can Tweets Predict Citations? Metrics of Social Impact Based on Twitter and Correlation with Traditional Metrics of Scientific Impact

    Science.gov (United States)

    2011-01-01

Background Citations in peer-reviewed articles and the impact factor are generally accepted measures of scientific impact. Web 2.0 tools such as Twitter, blogs or social bookmarking tools provide the possibility to construct innovative article-level or journal-level metrics to gauge impact and influence. However, the relationship of these new metrics to traditional metrics such as citations is not known. Objective (1) To explore the feasibility of measuring social impact of and public attention to scholarly articles by analyzing buzz in social media, (2) to explore the dynamics, content, and timing of tweets relative to the publication of a scholarly article, and (3) to explore whether these metrics are sensitive and specific enough to predict highly cited articles. Methods Between July 2008 and November 2011, all tweets containing links to articles in the Journal of Medical Internet Research (JMIR) were mined. For a subset of 1573 tweets about 55 articles published between issues 3/2009 and 2/2010, different metrics of social media impact were calculated and compared against subsequent citation data from Scopus and Google Scholar 17 to 29 months later. A heuristic to predict the top-cited articles in each issue through tweet metrics was validated. Results A total of 4208 tweets cited 286 distinct JMIR articles. The distribution of tweets over the first 30 days after article publication followed a power law (Zipf, Bradford, or Pareto distribution), with most tweets sent on the day when an article was published (1458/3318, 43.94% of all tweets in a 60-day period) or on the following day (528/3318, 15.9%), followed by a rapid decay. The Pearson correlations between tweetations and citations were moderate and statistically significant, with correlation coefficients ranging from .42 to .72 for the log-transformed Google Scholar citations, but were less clear for Scopus citations and rank correlations. A linear multivariate model with time and tweets as significant
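The core computation, a Pearson correlation between early tweet counts and log-transformed later citations, can be sketched with stdlib Python. The per-article counts below are invented for illustration and are not the JMIR data:

```python
import math

# Hypothetical per-article counts (not the study's data): tweets shortly
# after publication, and citations roughly two years later.
tweets    = [2, 5, 8, 12, 20, 33, 50, 75]
citations = [1, 3, 4, 9, 11, 25, 40, 80]

# Log-transform citations, as in the study, to tame the heavy tail.
log_cites = [math.log(c + 1) for c in citations]

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(tweets, log_cites)
print(round(r, 2))  # strongly positive for this invented sample
```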

  15. Organelle targeting: third level of drug targeting

    Directory of Open Access Journals (Sweden)

    Sakhrani NM

    2013-07-01

Full Text Available Niraj M Sakhrani, Harish Padh, Department of Cell and Molecular Biology, BV Patel Pharmaceutical Education and Research Development (PERD) Centre, Gujarat, India. Abstract: Drug discovery and drug delivery are two main aspects of the treatment of a variety of disorders. However, the real bottleneck associated with systemic drug administration is the lack of target-specific affinity toward a pathological site, resulting in systemic toxicity and innumerable other side effects, as well as a higher dosage requirement for efficacy. An attractive strategy to increase the therapeutic index of a drug is to specifically deliver the therapeutic molecule in its active form, not only into the target tissue, or even to the target cells, but more importantly into the targeted organelle, i.e., to its intracellular therapeutic active site. This would ensure improved efficacy and minimize toxicity. Cancer chemotherapy today faces the major challenge of delivering chemotherapeutic drugs exclusively to tumor cells, while sparing normal proliferating cells. Nanoparticles play a crucial role by acting as a vehicle for delivery of drugs to target sites inside tumor cells. In this review, we spotlight active and passive targeting, followed by discussion of the importance of targeting specific cell organelles and the potential role of cell-penetrating peptides. Finally, the discussion will address strategies for drug/DNA targeting to lysosomes, mitochondria, nuclei and the Golgi/endoplasmic reticulum. Keywords: intracellular drug delivery, cancer chemotherapy, therapeutic index, cell-penetrating peptides

  16. Rainbows without unicorns: metric structures in theories with modified dispersion relations

    International Nuclear Information System (INIS)

    Lobo, Iarley P.; Loret, Niccolo; Nettel, Francisco

    2017-01-01

    Rainbow metrics are a widely used approach to the metric formalism for theories with modified dispersion relations. They have had a huge success in the quantum gravity phenomenology literature, since they allow one to introduce momentum-dependent space-time metrics into the description of systems with a modified dispersion relation. In this paper, we introduce the reader to some realizations of this general idea: the original rainbow metrics proposal, the momentum-space-inspired metric and a Finsler geometry approach. As the main result of this work we also present an alternative definition of a four-velocity dependent metric which allows one to handle the massless limit. This paper aims to highlight some of their properties and how to properly describe their relativistic realizations. (orig.)

  17. Rainbows without unicorns: metric structures in theories with modified dispersion relations

    Science.gov (United States)

Lobo, Iarley P.; Loret, Niccolò; Nettel, Francisco

    2017-07-01

    Rainbow metrics are a widely used approach to the metric formalism for theories with modified dispersion relations. They have had a huge success in the quantum gravity phenomenology literature, since they allow one to introduce momentum-dependent space-time metrics into the description of systems with a modified dispersion relation. In this paper, we introduce the reader to some realizations of this general idea: the original rainbow metrics proposal, the momentum-space-inspired metric and a Finsler geometry approach. As the main result of this work we also present an alternative definition of a four-velocity dependent metric which allows one to handle the massless limit. This paper aims to highlight some of their properties and how to properly describe their relativistic realizations.

  18. Rainbows without unicorns: metric structures in theories with modified dispersion relations

    Energy Technology Data Exchange (ETDEWEB)

    Lobo, Iarley P. [Universita ' ' La Sapienza' ' , Dipartimento di Fisica, Rome (Italy); ICRANet, Pescara (Italy); CAPES Foundation, Ministry of Education of Brazil, Brasilia (Brazil); Universidade Federal da Paraiba, Departamento de Fisica, Joao Pessoa, PB (Brazil); INFN Sezione Roma 1 (Italy); Loret, Niccolo [Ruder Boskovic Institute, Division of Theoretical Physics, Zagreb (Croatia); Nettel, Francisco [Universita ' ' La Sapienza' ' , Dipartimento di Fisica, Rome (Italy); Universidad Nacional Autonoma de Mexico, Instituto de Ciencias Nucleares, Mexico (Mexico); INFN Sezione Roma 1 (Italy)

    2017-07-15

    Rainbow metrics are a widely used approach to the metric formalism for theories with modified dispersion relations. They have had a huge success in the quantum gravity phenomenology literature, since they allow one to introduce momentum-dependent space-time metrics into the description of systems with a modified dispersion relation. In this paper, we introduce the reader to some realizations of this general idea: the original rainbow metrics proposal, the momentum-space-inspired metric and a Finsler geometry approach. As the main result of this work we also present an alternative definition of a four-velocity dependent metric which allows one to handle the massless limit. This paper aims to highlight some of their properties and how to properly describe their relativistic realizations. (orig.)

  19. Temporal variability of daily personal magnetic field exposure metrics in pregnant women.

    Science.gov (United States)

    Lewis, Ryan C; Evenson, Kelly R; Savitz, David A; Meeker, John D

    2015-01-01

    Recent epidemiology studies of power-frequency magnetic fields and reproductive health have characterized exposures using data collected from personal exposure monitors over a single day, possibly resulting in exposure misclassification due to temporal variability in daily personal magnetic field exposure metrics, but relevant data in adults are limited. We assessed the temporal variability of daily central tendency (time-weighted average, median) and peak (upper percentiles, maximum) personal magnetic field exposure metrics over 7 consecutive days in 100 pregnant women. When exposure was modeled as a continuous variable, central tendency metrics had substantial reliability, whereas peak metrics had fair (maximum) to moderate (upper percentiles) reliability. The predictive ability of a single-day metric to accurately classify participants into exposure categories based on a weeklong metric depended on the selected exposure threshold, with sensitivity decreasing with increasing exposure threshold. Consistent with the continuous measures analysis, sensitivity was higher for central tendency metrics than for peak metrics. If there is interest in peak metrics, more than 1 day of measurement is needed over the window of disease susceptibility to minimize measurement error, but 1 day may be sufficient for central tendency metrics.
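Reliability of a repeated daily metric of this kind is commonly summarized with an intraclass correlation coefficient (ICC). Below is a minimal one-way random-effects ICC(1,1) on synthetic seven-day data; the subject count, variance components, and units are assumptions of this sketch, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: 100 subjects x 7 daily time-weighted averages (mG).
# Each subject has a stable personal level plus day-to-day noise.
n_subj, n_days = 100, 7
person_level = rng.normal(1.0, 0.4, size=(n_subj, 1))
daily = person_level + rng.normal(0.0, 0.2, size=(n_subj, n_days))

# One-way random-effects ICC(1,1):
#   ICC = (MSB - MSW) / (MSB + (k - 1) * MSW)
grand = daily.mean()
subj_means = daily.mean(axis=1)
msb = n_days * ((subj_means - grand) ** 2).sum() / (n_subj - 1)
msw = ((daily - subj_means[:, None]) ** 2).sum() / (n_subj * (n_days - 1))
icc = (msb - msw) / (msb + (n_days - 1) * msw)

print(round(icc, 2))  # close to the true value 0.4**2 / (0.4**2 + 0.2**2) = 0.8
```

A high ICC, as simulated here, is the situation in which a single day of measurement classifies weeklong exposure well; shrinking the between-subject variance mimics the lower reliability reported for peak metrics.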

  20. Development of soil quality metrics using mycorrhizal fungi

    Energy Technology Data Exchange (ETDEWEB)

    Baar, J.

    2010-07-01

    Based on the Treaty on Biological Diversity of Rio de Janeiro in 1992 for maintaining and increasing biodiversity, several countries have started programmes monitoring soil quality and the above- and below ground biodiversity. Within the European Union, policy makers are working on legislation for soil protection and management. Therefore, indicators are needed to monitor the status of the soils and these indicators reflecting the soil quality, can be integrated in working standards or soil quality metrics. Soil micro-organisms, particularly arbuscular mycorrhizal fungi (AMF), are indicative of soil changes. These soil fungi live in symbiosis with the great majority of plants and are sensitive to changes in the physico-chemical conditions of the soil. The aim of this study was to investigate whether AMF are reliable and sensitive indicators for disturbances in the soils and can be used for the development of soil quality metrics. Also, it was studied whether soil quality metrics based on AMF meet requirements to applicability by users and policy makers. Ecological criterions were set for the development of soil quality metrics for different soils. Multiple root samples containing AMF from various locations in The Netherlands were analyzed. The results of the analyses were related to the defined criterions. This resulted in two soil quality metrics, one for sandy soils and a second one for clay soils, with six different categories ranging from very bad to very good. These soil quality metrics meet the majority of requirements for applicability and are potentially useful for the development of legislations for the protection of soil quality. (Author) 23 refs.

1. Measures of agreement between computation and experiment: validation metrics.

    Energy Technology Data Exchange (ETDEWEB)

    Barone, Matthew Franklin; Oberkampf, William Louis

    2005-08-01

    With the increasing role of computational modeling in engineering design, performance estimation, and safety assessment, improved methods are needed for comparing computational results and experimental measurements. Traditional methods of graphically comparing computational and experimental results, though valuable, are essentially qualitative. Computable measures are needed that can quantitatively compare computational and experimental results over a range of input, or control, variables and sharpen assessment of computational accuracy. This type of measure has been recently referred to as a validation metric. We discuss various features that we believe should be incorporated in a validation metric and also features that should be excluded. We develop a new validation metric that is based on the statistical concept of confidence intervals. Using this fundamental concept, we construct two specific metrics: one that requires interpolation of experimental data and one that requires regression (curve fitting) of experimental data. We apply the metrics to three example problems: thermal decomposition of a polyurethane foam, a turbulent buoyant plume of helium, and compressibility effects on the growth rate of a turbulent free-shear layer. We discuss how the present metrics are easily interpretable for assessing computational model accuracy, as well as the impact of experimental measurement uncertainty on the accuracy assessment.
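The confidence-interval idea behind such a metric can be illustrated in a few lines: estimate the model-minus-experiment error at one input setting and attach a t-based interval from replicate measurements. All numerical values, including the hard-coded t quantile for 3 degrees of freedom, are assumptions of this sketch rather than the paper's worked examples:

```python
import math

# Four replicate experimental measurements at one input setting,
# and the model prediction there (all values are illustrative).
measurements = [10.2, 9.8, 10.5, 10.1]
model_prediction = 9.6

n = len(measurements)
ybar = sum(measurements) / n
s = math.sqrt(sum((y - ybar) ** 2 for y in measurements) / (n - 1))

# Estimated error and its 90% confidence interval.
# t_{0.95, n-1} for n - 1 = 3 degrees of freedom, taken from a t-table.
t95_3 = 2.353
err = model_prediction - ybar
half_width = t95_3 * s / math.sqrt(n)

print(f"error = {err:.2f} +/- {half_width:.2f}")
```

Repeating this over a range of the input variable, via interpolation or regression of the experimental data, yields the kind of validation metric the abstract describes.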

  2. Tide or Tsunami? The Impact of Metrics on Scholarly Research

    Science.gov (United States)

    Bonnell, Andrew G.

    2016-01-01

    Australian universities are increasingly resorting to the use of journal metrics such as impact factors and ranking lists in appraisal and promotion processes, and are starting to set quantitative "performance expectations" which make use of such journal-based metrics. The widespread use and misuse of research metrics is leading to…

  3. Term Based Comparison Metrics for Controlled and Uncontrolled Indexing Languages

    Science.gov (United States)

    Good, B. M.; Tennis, J. T.

    2009-01-01

    Introduction: We define a collection of metrics for describing and comparing sets of terms in controlled and uncontrolled indexing languages and then show how these metrics can be used to characterize a set of languages spanning folksonomies, ontologies and thesauri. Method: Metrics for term set characterization and comparison were identified and…

  4. Software metrics: The key to quality software on the NCC project

    Science.gov (United States)

    Burns, Patricia J.

    1993-01-01

    Network Control Center (NCC) Project metrics are captured during the implementation and testing phases of the NCCDS software development lifecycle. The metrics data collection and reporting function has interfaces with all elements of the NCC project. Close collaboration with all project elements has resulted in the development of a defined and repeatable set of metrics processes. The resulting data are used to plan and monitor release activities on a weekly basis. The use of graphical outputs facilitates the interpretation of progress and status. The successful application of metrics throughout the NCC project has been instrumental in the delivery of quality software. The use of metrics on the NCC Project supports the needs of the technical and managerial staff. This paper describes the project, the functions supported by metrics, the data that are collected and reported, how the data are used, and the improvements in the quality of deliverable software since the metrics processes and products have been in use.

  5. Search strategy in a complex and dynamic environment (the Indian Ocean case)

    Science.gov (United States)

    Loire, Sophie; Arbabi, Hassan; Clary, Patrick; Ivic, Stefan; Crnjaric-Zic, Nelida; Macesic, Senka; Crnkovic, Bojan; Mezic, Igor; UCSB Team; Rijeka Team

    2014-11-01

The disappearance of Malaysia Airlines Flight 370 (MH370) in the early morning hours of 8 March 2014 has exposed the disconcerting lack of efficient methods for identifying where to look, and how to look, for missing objects in a complex and dynamic environment. The search area for plane debris is a remote part of the Indian Ocean. Searches of the lawnmower type have been unsuccessful so far. Lagrangian kinematics of mesoscale features are visible in hypergraph maps of Indian Ocean surface currents. Without precise knowledge of the crash site, these maps give an estimate of the time evolution of any initial distribution of plane debris and permit the design of a search strategy. The Dynamic Spectral Multiscale Coverage (DSMC) search algorithm is modified to search a spatial distribution of targets that evolves with time following the dynamics of ocean surface currents. Trajectories are generated for multiple search agents such that their spatial coverage converges to the target distribution. Central to the DSMC algorithm is a metric of ergodicity.
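An ergodicity metric of this spectral kind compares basis coefficients of the target distribution with time-averaged coefficients along the agents' trajectories. Here is a one-dimensional numpy sketch under simplifying assumptions (cosine basis, Sobolev-style decay weights, a Gaussian target); none of these specifics are taken from the paper:

```python
import numpy as np

# 1D sketch of a spectral ergodicity metric on [0, 1]: compare cosine-basis
# coefficients of a target density with time-averaged coefficients along a
# trajectory. Basis, weights, and target are illustrative assumptions.
K = 20
x = np.linspace(0.0, 1.0, 1000)
dx = x[1] - x[0]
target = np.exp(-((x - 0.7) ** 2) / (2 * 0.05 ** 2))
target /= target.sum() * dx                        # normalize to a density

# Target coefficients: phi_k = integral of target(x) * cos(pi k x) dx
phi = np.array([(target * np.cos(np.pi * k * x)).sum() * dx for k in range(K)])

def ergodicity(traj):
    # Empirical coefficients: time average of cos(pi k x_t) over the trajectory.
    c = np.array([np.cos(np.pi * k * traj).mean() for k in range(K)])
    w = 1.0 / (1.0 + np.arange(K) ** 2)            # Sobolev-type decay weights
    return float(np.sum(w * (c - phi) ** 2))

rng = np.random.default_rng(2)
e_good = ergodicity(rng.normal(0.7, 0.05, 5000).clip(0, 1))  # covers the target
e_bad = ergodicity(rng.uniform(0.0, 1.0, 5000))              # uniform sweep

print(round(e_good, 4), round(e_bad, 4))  # target-matched coverage scores lower
```

A DSMC-style planner would steer agents in the direction that most rapidly decreases this metric as the target density is advected by the currents.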

  6. 48 CFR 611.002-70 - Metric system implementation.

    Science.gov (United States)

    2010-10-01

    ... with security, operations, economic, technical, logistical, training and safety requirements. (3) The... total cost of the retrofit, including redesign costs, exceeds $50,000; (ii) Metric is not the accepted... office with an explanation for the disapproval. (7) The in-house operating metric costs shall be...

  7. Metrics for assessing retailers based on consumer perception

    Directory of Open Access Journals (Sweden)

    Klimin Anastasii

    2017-01-01

Full Text Available The article suggests a new way of looking at trading platforms, which it calls “metrics.” Metrics are a way of looking at the point of sale largely from the buyer’s side. The buyer enters the store and makes a buying decision based on factors that the seller often does not consider, or considers only in part, because he “does not see” them, since he is not a buyer. The article proposes a classification of retailers, metrics, and a methodology for their determination, and presents the results of an audit of retailers in St. Petersburg using the proposed methodology.

  8. Chaos of discrete dynamical systems in complete metric spaces

    International Nuclear Information System (INIS)

    Shi Yuming; Chen Guanrong

    2004-01-01

This paper is concerned with chaos of discrete dynamical systems in complete metric spaces. Discrete dynamical systems governed by continuous maps in general complete metric spaces are first discussed, and two criteria of chaos are then established. As a special case, two corresponding criteria of chaos for discrete dynamical systems in compact subsets of metric spaces are obtained. These results extend and improve the existing results on chaos in finite-dimensional Euclidean spaces.

  9. The correlation of metrics in complex networks with applications in functional brain networks

    International Nuclear Information System (INIS)

    Li, C; Wang, H; Van Mieghem, P; De Haan, W; Stam, C J

    2011-01-01

An increasing number of network metrics have been applied in network analysis. If metric relations were known better, we could more effectively characterize networks by a small set of metrics and discover the association between network properties/metrics and network functioning. In this paper, we investigate the linear correlation coefficients between widely studied network metrics in three network models (Barabási–Albert graphs, Erdős–Rényi random graphs and Watts–Strogatz small-world graphs) as well as in functional brain networks of healthy subjects. The metric correlations, which we have observed and theoretically explained, motivate us to propose a small representative set of metrics by including only one metric from each subset of mutually strongly dependent metrics. The following contributions are considered important. (a) A network with a given degree distribution can indeed be characterized by a small representative set of metrics. (b) Unweighted networks, which are obtained from weighted functional brain networks with a fixed threshold, and Erdős–Rényi random graphs follow a similar degree distribution. Moreover, their metric correlations and the resultant representative metrics are similar as well. This verifies the influence of the degree distribution on metric correlations. (c) Most metric correlations can be explained analytically. (d) Interestingly, the most studied metrics so far, the average shortest path length and the clustering coefficient, are strongly correlated and thus redundant, whereas spectral metrics, though only studied recently in the context of complex networks, seem to be essential in network characterization. This representative set of metrics tends to both sufficiently and effectively characterize networks with a given degree distribution. In the study of a specific network, however, we have to at least consider the representative set so that important network properties will not be neglected.
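The reported redundancy between the average shortest path length and the clustering coefficient can be reproduced in miniature with pure Python: compute both metrics over an ensemble of Erdős–Rényi graphs of varying density and correlate them. The graph sizes and densities are arbitrary choices of this sketch:

```python
import random
from collections import deque
from itertools import combinations

def er_graph(n, p, rng):
    """Erdős–Rényi G(n, p) as an adjacency dict."""
    adj = {v: set() for v in range(n)}
    for u, v in combinations(range(n), 2):
        if rng.random() < p:
            adj[u].add(v)
            adj[v].add(u)
    return adj

def avg_clustering(adj):
    """Average local clustering coefficient over nodes of degree >= 2."""
    coeffs = []
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue
        links = sum(1 for a, b in combinations(sorted(nbrs), 2) if b in adj[a])
        coeffs.append(2.0 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs)

def avg_path_length(adj):
    """Mean shortest-path length over reachable pairs (BFS from each node)."""
    total = pairs = 0
    for s in adj:
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    queue.append(w)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs)
    vy = sum((b - my) ** 2 for b in ys)
    return cov / (vx * vy) ** 0.5

rng = random.Random(0)
cc, apl = [], []
for p in (0.10, 0.15, 0.20, 0.25, 0.30, 0.35, 0.40):
    g = er_graph(60, p, rng)
    cc.append(avg_clustering(g))
    apl.append(avg_path_length(g))

r = pearson(cc, apl)
print(round(r, 2))  # strongly negative: the two metrics are largely redundant here
```

In a representative-set analysis, only one member of such a strongly correlated pair would be retained.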

  10. Targeting miRNAs by polyphenols: Novel therapeutic strategy for cancer.

    Science.gov (United States)

    Pandima Devi, Kasi; Rajavel, Tamilselvam; Daglia, Maria; Nabavi, Seyed Fazel; Bishayee, Anupam; Nabavi, Seyed Mohammad

    2017-10-01

In recent years, polyphenols have gained significant attention in the scientific community owing to their potential anticancer effects against a wide range of human malignancies. Epidemiological, clinical and preclinical studies have supported that daily intake of polyphenol-rich fruits is strongly associated with the prevention of different types of cancer. In addition to direct antioxidant mechanisms, polyphenols also regulate several therapeutically important oncogenic signaling pathways and transcription factors. However, after the discovery of microRNA (miRNA), numerous studies have identified that polyphenols, including epigallocatechin-3-gallate, genistein, resveratrol and curcumin, exert their anticancer effects by regulating different miRNAs which are implicated in all the stages of cancer. MiRNAs are short, non-coding endogenous RNAs, which silence gene function by targeting messenger RNA (mRNA) through degradation or translational repression. However, cancer-associated miRNAs have emerged only in recent years to support their application in cancer therapy. Preclinical experiments have suggested that deregulation of a single miRNA is sufficient for neoplastic transformation of cells. Indeed, the widespread deregulation of miRNA profiles between tumor and healthy tissue samples has revealed the involvement of many types of miRNA in the development of numerous cancers. Hence, targeting miRNAs using polyphenols will be a novel and promising strategy in anticancer chemotherapy. Herein, we critically review the potential effects of polyphenols on various human miRNAs, especially those involved in oncogenic and tumor-suppressor pathways. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Final Report for Bio-Inspired Approaches to Moving-Target Defense Strategies

    Energy Technology Data Exchange (ETDEWEB)

    Fink, Glenn A.; Oehmen, Christopher S.

    2012-09-01

This report records the work and contributions of the NITRD-funded Bio-Inspired Approaches to Moving-Target Defense Strategies project performed by Pacific Northwest National Laboratory under the technical guidance of the National Security Agency’s R6 division. The project has incorporated a number of bio-inspired cyber defensive technologies within an elastic framework provided by the Digital Ants. This project has created the first scalable, real-world prototype of the Digital Ants Framework (DAF)[11] and integrated five technologies into this flexible, decentralized framework: (1) Ant-Based Cyber Defense (ABCD), (2) Behavioral Indicators, (3) Bioinformatic Classification, (4) Moving-Target Reconfiguration, and (5) Ambient Collaboration. The DAF can be used operationally to decentralize many such data-intensive applications that normally rely on collection of large amounts of data in a central repository. In this work, we have shown how these component applications may be decentralized and may perform analysis at the edge. Operationally, this will enable analytics to scale far beyond current limitations while not suffering from the bandwidth or computational limitations of centralized analysis. This effort has advanced the R6 Cyber Security research program to secure digital infrastructures by developing a dynamic means to adaptively defend complex cyber systems. We hope that this work will benefit both our client’s efforts in system behavior modeling and cyber security to the overall benefit of the nation.

  12. Otherwise Engaged : Social Media from Vanity Metrics to Critical Analytics

    NARCIS (Netherlands)

    Rogers, R.

    2018-01-01

    Vanity metrics is a term that captures the measurement and display of how well one is doing in the “success theater” of social media. The notion of vanity metrics implies a critique of metrics concerning both the object of measurement as well as their capacity to measure unobtrusively or only to

  13. Tumor Specific Detection of an Optically Targeted Antibody Combined with a Quencher-conjugated Neutravidin “Quencher-Chaser”: A Dual “Quench and Chase” Strategy to Improve Target to Non-target Ratios for Molecular Imaging of Cancer

    Science.gov (United States)

    Ogawa, Mikako; Kosaka, Nobuyuki; Choyke, Peter L; Kobayashi, Hisataka

    2009-01-01

In vivo molecular cancer imaging with monoclonal antibodies has great potential not only for cancer detection but also for cancer characterization. However, the prolonged retention of intravenously injected antibody in the blood causes a low target tumor-to-background ratio (TBR). Avidin has been used as a “chase” to clear the unbound, circulating biotinylated antibody and decrease the background signal. Here, we utilize a combined approach of a Fluorescence Resonance Energy Transfer (FRET) quenched antibody with an “avidin chase” to increase TBR. Trastuzumab, a humanized monoclonal antibody against human epidermal growth factor receptor type 2 (HER2), was biotinylated and conjugated with the near-infrared (NIR) fluorophore Alexa680 to synthesize Tra-Alexa680-biotin. Next, the FRET quencher, QSY-21, was conjugated to avidin, neutravidin (nAv) or streptavidin (sAv), thus creating Av-QSY21, nAv-QSY21 or sAv-QSY21 as “chasers”. The fluorescence was quenched in vitro by binding Tra-Alexa680-biotin to Av-QSY21, nAv-QSY21 or sAv-QSY21. To evaluate whether the injection of quencher-conjugated avidin derivatives can improve target TBR through a dual “quench and chase” strategy, both target (3T3/HER2+) and non-target (Balb3T3/ZsGreen) tumor-bearing mice were employed. The “FRET quench” effect induced by all the QSY21 avidin-based conjugates reduced but did not totally eliminate background signal from the blood pool. The addition of nAv-QSY21 administration increased target TBR mainly due to the “chase” effect, whereby unbound conjugated antibody was preferentially cleared to the liver. The relatively slow clearance of unbound nAv-QSY21 leads to further reductions in background signal by leaking out of the vascular space and binding to unbound antibodies in the extravascular space of tumors, resulting in decreased non-target tumor-to-background ratios but increased target TBR due to the “FRET quench” effect, because target-bound antibodies were internalized.

  14. IDENTIFYING MARKETING EFFECTIVENESS METRICS (Case study: East Azerbaijan`s industrial units)

    OpenAIRE

    Faridyahyaie, Reza; Faryabi, Mohammad; Bodaghi Khajeh Noubar, Hossein

    2012-01-01

The paper attempts to identify marketing effectiveness metrics in industrial units. The metrics investigated in this study are completely applicable and comprehensive, and consequently they can evaluate marketing effectiveness in various industries. The metrics studied include: Market Share, Profitability, Sales Growth, Customer Numbers, Customer Satisfaction and Customer Loyalty. The findings indicate that these six metrics are impressive when measuring marketing effectiveness. Data was ge...

  15. Some observations on a fuzzy metric space

    Energy Technology Data Exchange (ETDEWEB)

    Gregori, V.

    2017-07-01

Let $(X,d)$ be a metric space. In this paper we provide some observations about the fuzzy metric space in the sense of Kramosil and Michalek $(Y,N,\wedge)$, where $Y$ is the set of non-negative real numbers $[0,\infty[$ and $N(x,y,t)=1$ if $d(x,y)\leq t$ and $N(x,y,t)=0$ if $d(x,y)>t$. (Author)
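The two-valued fuzzy metric in this abstract can be spot-checked numerically. The sketch below uses the real line with d(x, y) = |x − y| as the underlying metric (an arbitrary illustrative choice) and verifies symmetry and the triangle-type inequality under the minimum t-norm on a few sample points:

```python
# Sketch of the fuzzy metric from the abstract, with d(x, y) = |x - y|
# on the real line as an illustrative underlying metric.
def d(x, y):
    return abs(x - y)

def N(x, y, t):
    # N(x, y, t) = 1 if d(x, y) <= t, else 0
    return 1 if d(x, y) <= t else 0

# Spot-check two Kramosil-Michalek-style axioms on sample points: symmetry,
# and the triangle-type inequality N(x,z,t+s) >= min(N(x,y,t), N(y,z,s)).
pts = [0.0, 0.3, 1.0, 2.5]
ts = [0.1, 0.5, 1.0, 2.0]
for x in pts:
    for y in pts:
        for t in ts:
            assert N(x, y, t) == N(y, x, t)          # symmetry
            for z in pts:
                for s in ts:
                    assert N(x, z, t + s) >= min(N(x, y, t), N(y, z, s))

print("axioms hold on the sample")
```

The triangle-type inequality holds here because d(x,y) ≤ t and d(y,z) ≤ s together imply d(x,z) ≤ t + s by the ordinary triangle inequality.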

  16. 22 CFR 226.15 - Metric system of measurement.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Metric system of measurement. 226.15 Section 226.15 Foreign Relations AGENCY FOR INTERNATIONAL DEVELOPMENT ADMINISTRATION OF ASSISTANCE AWARDS TO U.S. NON-GOVERNMENTAL ORGANIZATIONS Pre-award Requirements § 226.15 Metric system of measurement. (a...

  17. 20 CFR 435.15 - Metric system of measurement.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Metric system of measurement. 435.15 Section 435.15 Employees' Benefits SOCIAL SECURITY ADMINISTRATION UNIFORM ADMINISTRATIVE REQUIREMENTS FOR... metric system is the preferred measurement system for U.S. trade and commerce. The Act requires each...

  18. Assessment of the Log-Euclidean Metric Performance in Diffusion Tensor Image Segmentation

    Directory of Open Access Journals (Sweden)

    Mostafa Charmi

    2010-06-01

    Introduction: Appropriate definition of the distance measure between diffusion tensors has a deep impact on Diffusion Tensor Image (DTI) segmentation results. The geodesic metric is the best distance measure since it yields high-quality segmentation results. However, the important problem with the geodesic metric is the high computational cost of the algorithms based on it. The main goal of this paper is to assess the possible substitution of the geodesic metric with the Log-Euclidean one to reduce the computational cost of a statistical surface evolution algorithm. Materials and Methods: We incorporated the Log-Euclidean metric into the statistical surface evolution framework. To achieve this goal, the statistics and gradients of diffusion tensor images were defined using the Log-Euclidean metric. Numerical implementation of the segmentation algorithm was performed in MATLAB using finite difference techniques. Results: In the statistical surface evolution framework, the Log-Euclidean metric discriminated the torus and helix patterns in synthetic datasets, and rat spinal cords in biological phantom datasets, from the background better than the Euclidean and J-divergence metrics. In addition, results similar to those of the geodesic metric were obtained. However, the main advantage of the Log-Euclidean metric over the geodesic metric was the dramatic reduction of the computational cost of the segmentation algorithm, by at least a factor of 70. Discussion and Conclusion: The qualitative and quantitative results show that the Log-Euclidean metric is a good substitute for the geodesic metric when using a statistical surface evolution algorithm for DTI segmentation.
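The computational advantage the abstract describes comes from the closed form of the Log-Euclidean distance between symmetric positive-definite (SPD) tensors: one matrix logarithm per tensor, which can be precomputed, versus the generalized eigen-decompositions the affine-invariant geodesic metric requires per pair. A minimal sketch (not the authors' implementation):

```python
import numpy as np
from scipy.linalg import logm

def log_euclidean_distance(S1, S2):
    """Log-Euclidean distance between two SPD (diffusion) tensors:
    d(S1, S2) = || logm(S1) - logm(S2) ||_F  (Frobenius norm).
    logm(S) can be cached per voxel, so pairwise distances are cheap."""
    return np.linalg.norm(logm(S1) - logm(S2), ord="fro")

# Two synthetic 3x3 diffusion tensors (diagonal, hence trivially SPD):
S1 = np.diag([1.0, 0.5, 0.5])
S2 = np.diag([2.0, 0.5, 0.5])
print(log_euclidean_distance(S1, S2))  # |log 2 - log 1| = log 2 ~ 0.693
```

For diagonal tensors the matrix logarithm reduces to the elementwise log of the eigenvalues, which makes the expected value easy to verify by hand.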

  19. Absolutely minimal extensions of functions on metric spaces

    International Nuclear Information System (INIS)

    Milman, V A

    1999-01-01

    Extensions of a real-valued function from the boundary ∂X_0 of an open subset X_0 of a metric space (X,d) to X_0 are discussed. For the broad class of initial data coming under discussion (linearly bounded functions) locally Lipschitz extensions to X_0 that preserve localized moduli of continuity are constructed. In the set of these extensions an absolutely minimal extension is selected, which was considered before by Aronsson for Lipschitz initial functions in the case X_0 ⊂ R^n. An absolutely minimal extension can be regarded as an ∞-harmonic function, that is, a limit of p-harmonic functions as p→+∞. The proof of the existence of absolutely minimal extensions in a metric space with intrinsic metric is carried out by the Perron method. To this end, ∞-subharmonic, ∞-superharmonic, and ∞-harmonic functions on a metric space are defined and their properties are established.
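As background to the extensions discussed in this record (stated here from the classical theory, not taken from the paper itself): for Lipschitz boundary data $f$ on $\partial X_0$ with constant $L$, every extension $u$ with the same Lipschitz constant is bracketed by the McShane–Whitney envelopes,

```latex
\sup_{y \in \partial X_0} \bigl( f(y) - L\, d(x,y) \bigr)
  \;\le\; u(x) \;\le\;
\inf_{y \in \partial X_0} \bigl( f(y) + L\, d(x,y) \bigr),
\qquad x \in X_0,
```

and the absolutely minimal (Aronsson) extension is the one satisfying the comparison property $\operatorname{Lip}(u,V)=\operatorname{Lip}(u,\partial V)$ for every open $V$ compactly contained in $X_0$.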

  20. PQSM-based RR and NR video quality metrics

    Science.gov (United States)

    Lu, Zhongkang; Lin, Weisi; Ong, Eeping; Yang, Xiaokang; Yao, Susu

    2003-06-01

    This paper presents a new and general concept, PQSM (Perceptual Quality Significance Map), to be used in measuring visual distortion. It makes use of the selectivity characteristic of the HVS (Human Visual System), which pays more attention to certain areas/regions of a visual signal due to one or more of the following factors: salient features in the image/video, cues from domain knowledge, and association with other media (e.g., speech or audio). The PQSM is an array whose elements represent the relative perceptual-quality significance levels of the corresponding areas/regions of an image or video. Due to its generality, the PQSM can be incorporated into any visual distortion metric: to improve the effectiveness and/or efficiency of perceptual metrics, or even to enhance a PSNR-based metric. A three-stage PQSM estimation method is also proposed in this paper, with an implementation based on motion, texture, luminance, skin-color and face mapping. Experimental results show that the scheme can improve the performance of current image/video distortion metrics.
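The simplest use of such a significance map, enhancing a PSNR-based metric as the abstract suggests, is to weight the per-pixel squared error by the map before averaging. A hypothetical sketch (function names and the uniform toy map are my own assumptions, not the paper's three-stage estimator):

```python
import numpy as np

def pqsm_weighted_mse(ref, dist, pqsm):
    """Weight per-pixel squared error by a perceptual-significance map,
    so errors in salient regions (e.g. faces, motion) count more."""
    err = (ref.astype(float) - dist.astype(float)) ** 2
    return float(np.sum(pqsm * err) / np.sum(pqsm))

def pqsm_weighted_psnr(ref, dist, pqsm, peak=255.0):
    """PSNR computed from the significance-weighted MSE."""
    return 10.0 * np.log10(peak ** 2 / pqsm_weighted_mse(ref, dist, pqsm))

# Toy example: with a uniform map the result reduces to ordinary MSE.
ref = np.zeros((4, 4))
dist = np.full((4, 4), 2.0)
pqsm = np.ones((4, 4))
print(pqsm_weighted_mse(ref, dist, pqsm))  # 4.0
```

With a non-uniform map, pixels where the PQSM is large dominate the score, which is the mechanism by which such a map steers a distortion metric toward perceptually important regions.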