WorldWideScience

Sample records for metric optimization evaluation

  1. Evaluating Application-Layer Traffic Optimization Cost Metrics for P2P Multimedia Streaming

    DEFF Research Database (Denmark)

    Poderys, Justas; Soler, José

    2017-01-01

    To help users of P2P communication systems perform better-than-random selection of communication peers, the Internet Engineering Task Force standardized the Application Layer Traffic Optimization (ALTO) protocol. The ALTO-provided data-routing cost metric can be used to rank peers in P2P communicati...

  2. Thermodynamic metrics and optimal paths.

    Science.gov (United States)

    Sivak, David A; Crooks, Gavin E

    2012-05-11

    A fundamental problem in modern thermodynamics is how a molecular-scale machine performs useful work, while operating away from thermal equilibrium without excessive dissipation. To this end, we derive a friction tensor that induces a Riemannian manifold on the space of thermodynamic states. Within the linear-response regime, this metric structure controls the dissipation of finite-time transformations, and bestows optimal protocols with many useful properties. We discuss the connection to the existing thermodynamic length formalism, and demonstrate the utility of this metric by solving for optimal control parameter protocols in a simple nonequilibrium model.
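
    For orientation, the linear-response structure summarized above is commonly written along the following lines, where λ denotes the control parameters, δX the fluctuations of their conjugate forces, ζ the friction tensor, and 𝓛 the thermodynamic length; this rendering and its notation are our paraphrase rather than a quotation from the paper, with the bound following from the Cauchy-Schwarz inequality and saturated by protocols of constant excess power:

        \zeta_{ij}(\lambda) = \beta \int_0^{\infty} \langle \delta X_i(0)\, \delta X_j(t) \rangle_{\lambda}\, dt, \qquad
        \langle W_{\mathrm{ex}} \rangle \approx \int_0^{\tau} \dot{\lambda}^{\top} \zeta(\lambda)\, \dot{\lambda}\, dt \;\geq\; \frac{\mathcal{L}^2}{\tau}, \qquad
        \mathcal{L} = \int_0^{\tau} \sqrt{\dot{\lambda}^{\top} \zeta(\lambda)\, \dot{\lambda}}\; dt .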

  3. The SPAtial EFficiency metric (SPAEF): multiple-component evaluation of spatial patterns for optimization of hydrological models

    Science.gov (United States)

    Koch, Julian; Cüneyd Demirel, Mehmet; Stisen, Simon

    2018-05-01

    The process of model evaluation is not only an integral part of model development and calibration but also of paramount importance when communicating modelling results to the scientific community and stakeholders. The modelling community has a large and well-tested toolbox of metrics to evaluate temporal model performance. In contrast, spatial performance evaluation has not kept pace with the wide availability of spatial observations or with the sophisticated model codes simulating the spatial variability of complex hydrological processes. This study makes a contribution towards advancing spatial-pattern-oriented model calibration by rigorously testing a multiple-component performance metric. The promoted SPAtial EFficiency (SPAEF) metric reflects three equally weighted components: correlation, coefficient of variation, and histogram overlap. This multiple-component approach is found to be advantageous for the complex task of comparing spatial patterns. SPAEF, its three components individually, and two alternative spatial performance metrics, i.e. connectivity analysis and the fractions skill score, are applied in a spatial-pattern-oriented model calibration of a catchment model in Denmark. The results underline the importance of multiple-component metrics, because stand-alone metrics tend to fail to provide holistic pattern information. The three SPAEF components are found to be independent, which allows them to complement each other in a meaningful way. In order to optimally exploit spatial observations made available by remote sensing platforms, this study suggests applying bias-insensitive metrics, which further allow a comparison of variables that are related but may differ in unit. This study applies SPAEF in the hydrological context using the mesoscale Hydrologic Model (mHM; version 5.8), but we see great potential across disciplines related to spatially distributed earth system modelling.
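
    As an informal illustration of the three-component structure described above, a SPAEF-style score can be sketched in a few lines of numpy (function and variable names are ours; the normalization choices reflect the published formulation as we understand it):

        import numpy as np

        def spaef(obs, sim, bins=100):
            """SPAtial EFficiency-style score between two spatial pattern arrays."""
            obs, sim = obs.ravel(), sim.ravel()
            # Component 1: Pearson correlation between the two patterns
            alpha = np.corrcoef(obs, sim)[0, 1]
            # Component 2: ratio of coefficients of variation (bias-insensitive spread)
            beta = (np.std(sim) / np.mean(sim)) / (np.std(obs) / np.mean(obs))
            # Component 3: overlap of the histograms of z-score standardized values
            z_obs = (obs - obs.mean()) / obs.std()
            z_sim = (sim - sim.mean()) / sim.std()
            lo, hi = min(z_obs.min(), z_sim.min()), max(z_obs.max(), z_sim.max())
            h_obs, _ = np.histogram(z_obs, bins=bins, range=(lo, hi))
            h_sim, _ = np.histogram(z_sim, bins=bins, range=(lo, hi))
            gamma = np.minimum(h_obs, h_sim).sum() / h_obs.sum()
            # Euclidean aggregation of the three components; 1 indicates a perfect match
            return 1.0 - np.sqrt((alpha - 1) ** 2 + (beta - 1) ** 2 + (gamma - 1) ** 2)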

  4. Load Index Metrics for an Optimized Management of Web Services: A Systematic Evaluation

    Science.gov (United States)

    Souza, Paulo S. L.; Santana, Regina H. C.; Santana, Marcos J.; Zaluska, Ed; Faical, Bruno S.; Estrella, Julio C.

    2013-01-01

    The lack of precision to predict service performance through load indices may lead to wrong decisions regarding the use of web services, compromising service performance and raising platform cost unnecessarily. This paper presents experimental studies to qualify the behaviour of load indices in the web service context. The experiments consider three services that generate controlled and significant server demands, four levels of workload for each service and six distinct execution scenarios. The evaluation considers three relevant perspectives: the capability for representing recent workloads, the capability for predicting near-future performance and finally stability. Eight different load indices were analysed, including the JMX Average Time index (proposed in this paper) specifically designed to address the limitations of the other indices. A systematic approach is applied to evaluate the different load indices, considering a multiple linear regression model based on the stepwise-AIC method. The results show that the load indices studied represent the workload to some extent; however, in contrast to expectations, most of them do not exhibit a coherent correlation with service performance and this can result in stability problems. The JMX Average Time index is an exception, showing a stable behaviour which is tightly-coupled to the service runtime for all executions. Load indices are used to predict the service runtime and therefore their inappropriate use can lead to decisions that will impact negatively on both service performance and execution cost. PMID:23874776
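
    To make the regression-based evaluation concrete, the sketch below shows one way to compare candidate load indices as predictors of service runtime using ordinary least squares and AIC-driven forward selection; the column names are hypothetical and statsmodels/pandas are assumed, so this illustrates the stepwise-AIC idea rather than reproducing the authors' code:

        import statsmodels.api as sm

        def forward_select_by_aic(df, response, candidates):
            """Greedy forward selection of load-index predictors by AIC."""
            candidates = list(candidates)
            selected, best_aic, improved = [], float("inf"), True
            while improved and candidates:
                improved = False
                scores = []
                for c in candidates:
                    X = sm.add_constant(df[selected + [c]])
                    scores.append((sm.OLS(df[response], X).fit().aic, c))
                aic, best = min(scores)
                if aic < best_aic:
                    best_aic, improved = aic, True
                    selected.append(best)
                    candidates.remove(best)
            return selected, best_aic

        # Hypothetical usage with a data frame of measured indices and runtimes:
        # selected, aic = forward_select_by_aic(df, "service_runtime",
        #                                       ["cpu_utilization", "jmx_average_time", "load_average"])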

  5. Gamut Volume Index: a color preference metric based on meta-analysis and optimized colour samples.

    Science.gov (United States)

    Liu, Qiang; Huang, Zheng; Xiao, Kaida; Pointer, Michael R; Westland, Stephen; Luo, M Ronnier

    2017-07-10

    A novel metric named the Gamut Volume Index (GVI) is proposed for evaluating the colour preference of lighting. This metric is based on the absolute gamut volume of optimized colour samples. The optimal colour set of the proposed metric was obtained by optimizing the weighted average correlation between the metric predictions and the subjective ratings for eight psychophysical studies. The performance of 20 typical colour metrics was also investigated, including colour-difference-based metrics, gamut-based metrics, memory-based metrics, and combined metrics. It was found that the proposed GVI outperformed the existing counterparts, especially for conditions where correlated colour temperatures differed.

  6. Evaluating and Estimating the WCET Criticality Metric

    DEFF Research Database (Denmark)

    Jordan, Alexander

    2014-01-01

    a programmer (or compiler) from targeting optimizations the right way. A possible resort is to use a metric that targets WCET and can be efficiently computed for all code parts of a program. Similar to dynamic profiling techniques, which execute code with input that is typically expected...... for the application, based on WCET analysis we can indicate how critical a code fragment is in relation to the worst-case bound. Computing such a metric on top of static analysis incurs a certain overhead, though, which increases with the complexity of the underlying WCET analysis. We present our approach...... to estimate the Criticality metric by relaxing the precision of WCET analysis. Through this, we can reduce analysis time by orders of magnitude while only introducing minor error. To evaluate our estimation approach and share our garnered experience using the metric, we evaluate real-time programs, which...

  7. Metrics for Evaluation of Student Models

    Science.gov (United States)

    Pelanek, Radek

    2015-01-01

    Researchers use many different metrics for evaluation of performance of student models. The aim of this paper is to provide an overview of commonly used metrics, to discuss properties, advantages, and disadvantages of different metrics, to summarize current practice in educational data mining, and to provide guidance for evaluation of student…

  8. Next-Generation Metrics: Responsible Metrics & Evaluation for Open Science

    Energy Technology Data Exchange (ETDEWEB)

    Wilsdon, J.; Bar-Ilan, J.; Peters, I.; Wouters, P.

    2016-07-01

    Metrics evoke a mixed reaction from the research community. A commitment to using data to inform decisions makes some enthusiastic about the prospect of granular, real-time analysis of research and its wider impacts. Yet we only have to look at the blunt use of metrics such as journal impact factors, h-indices and grant income targets to be reminded of the pitfalls. Some of the most precious qualities of academic culture resist simple quantification, and individual indicators often struggle to do justice to the richness and plurality of research. Too often, poorly designed evaluation criteria are “dominating minds, distorting behaviour and determining careers” (Lawrence, 2007). Metrics hold real power: they are constitutive of values, identities and livelihoods. How to exercise that power to more positive ends has been the focus of several recent and complementary initiatives, including the San Francisco Declaration on Research Assessment (DORA), the Leiden Manifesto and The Metric Tide (a UK government review of the role of metrics in research management and assessment). Building on these initiatives, the European Commission, under its new Open Science Policy Platform, is now looking to develop a framework for responsible metrics for research management and evaluation, which can be incorporated into the successor framework to Horizon 2020. (Author)

  9. Use of plan quality degradation to evaluate tradeoffs in delivery efficiency and clinical plan metrics arising from IMRT optimizer and sequencer compromises

    Science.gov (United States)

    Wilkie, Joel R.; Matuszak, Martha M.; Feng, Mary; Moran, Jean M.; Fraass, Benedick A.

    2013-01-01

    Purpose: Plan degradation resulting from compromises made to enhance delivery efficiency is an important consideration for intensity modulated radiation therapy (IMRT) treatment plans. IMRT optimization and/or multileaf collimator (MLC) sequencing schemes can be modified to generate more efficient treatment delivery, but the effect those modifications have on plan quality is often difficult to quantify. In this work, the authors present a method for quantitative assessment of overall plan quality degradation due to tradeoffs between delivery efficiency and treatment plan quality, illustrated using comparisons between plans developed allowing different numbers of intensity levels in IMRT optimization and/or MLC sequencing for static segmental MLC IMRT plans. Methods: A plan quality degradation method to evaluate delivery efficiency and plan quality tradeoffs was developed and used to assess planning for 14 prostate and 12 head and neck patients treated with static IMRT. Plan quality was evaluated using a physician's predetermined “quality degradation” factors for relevant clinical plan metrics associated with the plan optimization strategy. Delivery efficiency and plan quality were assessed for a range of optimization and sequencing limitations. The “optimal” (baseline) plan for each case was derived using a clinical cost function with an unlimited number of intensity levels. These plans were sequenced with a clinical MLC leaf sequencer which uses >100 segments, assuring delivered intensities to be within 1% of the optimized intensity pattern. Each patient's optimal plan was also sequenced limiting the number of intensity levels (20, 10, and 5), and then separately optimized with these same numbers of intensity levels. Delivery time was measured for all plans, and direct evaluation of the tradeoffs between delivery time and plan degradation was performed. Results: When considering tradeoffs, the optimal number of intensity levels depends on the treatment

  10. Human Performance Optimization Metrics: Consensus Findings, Gaps, and Recommendations for Future Research.

    Science.gov (United States)

    Nindl, Bradley C; Jaffin, Dianna P; Dretsch, Michael N; Cheuvront, Samuel N; Wesensten, Nancy J; Kent, Michael L; Grunberg, Neil E; Pierce, Joseph R; Barry, Erin S; Scott, Jonathan M; Young, Andrew J; OʼConnor, Francis G; Deuster, Patricia A

    2015-11-01

    Human performance optimization (HPO) is defined as "the process of applying knowledge, skills and emerging technologies to improve and preserve the capabilities of military members, and organizations to execute essential tasks." The lack of consensus for operationally relevant and standardized metrics that meet joint military requirements has been identified as the single most important gap for research and application of HPO. In 2013, the Consortium for Health and Military Performance hosted a meeting to develop a toolkit of standardized HPO metrics for use in military and civilian research, and potentially for field applications by commanders, units, and organizations. Performance was considered from a holistic perspective as being influenced by various behaviors and barriers. To accomplish the goal of developing a standardized toolkit, key metrics were identified and evaluated across a spectrum of domains that contribute to HPO: physical performance, nutritional status, psychological status, cognitive performance, environmental challenges, sleep, and pain. These domains were chosen based on relevant data with regard to performance enhancers and degraders. The specific objectives at this meeting were to (a) identify and evaluate current metrics for assessing human performance within selected domains; (b) prioritize metrics within each domain to establish a human performance assessment toolkit; and (c) identify scientific gaps and the needed research to more effectively assess human performance across domains. This article provides a summary of 150 total HPO metrics across multiple domains that can be used as a starting point (the beginning of an HPO toolkit): physical fitness (29 metrics), nutrition (24 metrics), psychological status (36 metrics), cognitive performance (35 metrics), environment (12 metrics), sleep (9 metrics), and pain (5 metrics). These metrics can be particularly valuable as the military emphasizes a renewed interest in Human Dimension efforts

  11. Evaluation metrics for biostatistical and epidemiological collaborations.

    Science.gov (United States)

    Rubio, Doris McGartland; Del Junco, Deborah J; Bhore, Rafia; Lindsell, Christopher J; Oster, Robert A; Wittkowski, Knut M; Welty, Leah J; Li, Yi-Ju; Demets, Dave

    2011-10-15

    Increasing demands for evidence-based medicine and for the translation of biomedical research into individual and public health benefit have been accompanied by the proliferation of special units that offer expertise in biostatistics, epidemiology, and research design (BERD) within academic health centers. Objective metrics that can be used to evaluate, track, and improve the performance of these BERD units are critical to their successful establishment and sustainable future. To develop a set of reliable but versatile metrics that can be adapted easily to different environments and evolving needs, we consulted with members of BERD units from the consortium of academic health centers funded by the Clinical and Translational Science Award Program of the National Institutes of Health. Through a systematic process of consensus building and document drafting, we formulated metrics that covered the three identified domains of BERD practices: the development and maintenance of collaborations with clinical and translational science investigators, the application of BERD-related methods to clinical and translational research, and the discovery of novel BERD-related methodologies. In this article, we describe the set of metrics and advocate their use for evaluating BERD practices. The routine application, comparison of findings across diverse BERD units, and ongoing refinement of the metrics will identify trends, facilitate meaningful changes, and ultimately enhance the contribution of BERD activities to biomedical research. Copyright © 2011 John Wiley & Sons, Ltd.

  12. Microservice scaling optimization based on metric collection in Kubernetes

    OpenAIRE

    Blažej, Aljaž

    2017-01-01

    As web applications become more complex and the number of internet users rises, so does the need to optimize the use of hardware supporting these applications. Optimization can be achieved with microservices, as they offer several advantages compared to the monolithic approach, such as better utilization of resources, scalability and isolation of different parts of an application. Another important part is collecting metrics, since they can be used for analysis and debugging as well as the ba...

  13. Optimal recovery of linear operators in non-Euclidean metrics

    Energy Technology Data Exchange (ETDEWEB)

    Osipenko, K Yu [Moscow State Aviation Technological University, Moscow (Russian Federation)

    2014-10-31

    The paper looks at problems concerning the recovery of operators from noisy information in non-Euclidean metrics. A number of general theorems are proved and applied to recovery problems for functions and their derivatives from the noisy Fourier transform. In some cases, a family of optimal methods is found, from which the methods requiring the least amount of original information are singled out. Bibliography: 25 titles.

  14. A novel spatial performance metric for robust pattern optimization of distributed hydrological models

    Science.gov (United States)

    Stisen, S.; Demirel, C.; Koch, J.

    2017-12-01

    Evaluation of performance is an integral part of model development and calibration, and it is of paramount importance when communicating modelling results to stakeholders and the scientific community. The hydrological modelling community has a comprehensive and well-tested toolbox of metrics to assess temporal model performance. In contrast, experience in evaluating spatial performance has not kept pace with the wide availability of spatial observations or with the sophisticated model codes simulating the spatial variability of complex hydrological processes. This study aims to make a contribution towards advancing spatial-pattern-oriented model evaluation for distributed hydrological models. This is achieved by introducing a novel spatial performance metric which provides robust pattern performance during model calibration. The promoted SPAtial EFficiency (SPAEF) metric reflects three equally weighted components: correlation, coefficient of variation, and histogram overlap. This multi-component approach is necessary in order to adequately compare spatial patterns. SPAEF, its three components individually, and two alternative spatial performance metrics, i.e. connectivity analysis and the fractions skill score, are tested in a spatial-pattern-oriented model calibration of a catchment model in Denmark. The calibration is constrained by a remote-sensing-based spatial pattern of evapotranspiration and discharge time series at two stations. Our results stress that stand-alone metrics tend to fail to provide holistic pattern information to the optimizer, which underlines the importance of multi-component metrics. The three SPAEF components are independent, which allows them to complement each other in a meaningful way. This study promotes the use of bias-insensitive metrics, which allow comparing variables that are related but may differ in unit, in order to optimally exploit spatial observations made available by remote sensing

  15. Fisher information metrics for binary classifier evaluation and training

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Different evaluation metrics for binary classifiers are appropriate to different scientific domains and even to different problems within the same domain. This presentation focuses on the optimisation of event selection to minimise statistical errors in HEP parameter estimation, a problem that is best analysed in terms of the maximisation of Fisher information about the measured parameters. After describing a general formalism to derive evaluation metrics based on Fisher information, three more specific metrics are introduced for the measurements of signal cross sections in counting experiments (FIP1) or distribution fits (FIP2) and for the measurements of other parameters from distribution fits (FIP3). The FIP2 metric is particularly interesting because it can be derived from any ROC curve, provided that prevalence is also known. In addition to its relation to measurement errors when used as an evaluation criterion (which makes it more interesting than the ROC AUC), a further advantage of the FIP2 metric is ...

  16. Metrics for Performance Evaluation of Patient Exercises during Physical Therapy.

    Science.gov (United States)

    Vakanski, Aleksandar; Ferguson, Jake M; Lee, Stephen

    2017-06-01

    The article proposes a set of metrics for evaluation of patient performance in physical therapy exercises. A taxonomy is employed that classifies the metrics into quantitative and qualitative categories, based on the level of abstraction of the captured motion sequences. Further, the quantitative metrics are classified into model-less and model-based metrics, in reference to whether the evaluation employs the raw measurements of patient-performed motions, or whether the evaluation is based on a mathematical model of the motions. The reviewed metrics include root-mean-square distance, Kullback-Leibler divergence, log-likelihood, heuristic consistency, Fugl-Meyer Assessment, and similar. The metrics are evaluated for a set of five human motions captured with a Kinect sensor. The metrics can potentially be integrated into a system that employs machine learning for modelling and assessment of the consistency of patient performance in a home-based therapy setting. Automated performance evaluation can overcome the inherent subjectivity of human-performed therapy assessment, increase adherence to prescribed therapy plans, and reduce healthcare costs.
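
    As a small illustration of two of the model-less metrics listed above, the sketch below computes the root-mean-square distance and the Kullback-Leibler divergence between a reference motion and a patient-performed motion, assuming the sequences are already temporally aligned (alignment itself, e.g. by dynamic time warping, is outside this sketch):

        import numpy as np
        from scipy.stats import entropy

        def rms_distance(reference, performed):
            """Root-mean-square distance between two aligned motion arrays (frames x joints)."""
            return np.sqrt(np.mean((reference - performed) ** 2))

        def kl_divergence(reference, performed, bins=50):
            """KL divergence between the value distributions of two motion sequences."""
            lo = min(reference.min(), performed.min())
            hi = max(reference.max(), performed.max())
            p, _ = np.histogram(reference, bins=bins, range=(lo, hi), density=True)
            q, _ = np.histogram(performed, bins=bins, range=(lo, hi), density=True)
            eps = 1e-12  # guard against empty bins
            return entropy(p + eps, q + eps)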

  17. A condition metric for Eucalyptus woodland derived from expert evaluations.

    Science.gov (United States)

    Sinclair, Steve J; Bruce, Matthew J; Griffioen, Peter; Dodd, Amanda; White, Matthew D

    2018-02-01

    The evaluation of ecosystem quality is important for land-management and land-use planning. Evaluation is unavoidably subjective, and robust metrics must be based on consensus and the structured use of observations. We devised a transparent and repeatable process for building and testing ecosystem metrics based on expert data. We gathered quantitative evaluation data on the quality of hypothetical grassy woodland sites from experts. We used these data to train a model (an ensemble of 30 bagged regression trees) capable of predicting the perceived quality of similar hypothetical woodlands based on a set of 13 site variables as inputs (e.g., cover of shrubs, richness of native forbs). These variables can be measured at any site and the model implemented in a spreadsheet as a metric of woodland quality. We also investigated the number of experts required to produce an opinion data set sufficient for the construction of a metric. The model produced evaluations similar to those provided by experts, as shown by assessing the model's quality scores of expert-evaluated test sites not used to train the model. We applied the metric to 13 woodland conservation reserves and asked managers of these sites to independently evaluate their quality. To assess metric performance, we compared the model's evaluation of site quality with the managers' evaluations through multidimensional scaling. The metric performed relatively well, plotting close to the center of the space defined by the evaluators. Given the method provides data-driven consensus and repeatability, which no single human evaluator can provide, we suggest it is a valuable tool for evaluating ecosystem quality in real-world contexts. We believe our approach is applicable to any ecosystem. © 2017 State of Victoria.

  18. Video Analytics Evaluation: Survey of Datasets, Performance Metrics and Approaches

    Science.gov (United States)

    2014-09-01

    people with different ethnicity and gender. Currently we have four subjects, but more can be added in the future. • Lighting Variations. We consider... is however not a proper distance as the triangular inequality condition is not met. For this reason, the next metric should be preferred. • the... and Alan F. Smeaton and Georges Quenot, An Overview of the Goals, Tasks, Data, Evaluation Mechanisms and Metrics, Proceedings of TRECVID 2011, NIST, USA

  19. Clinical Outcome Metrics for Optimization of Robust Training

    Science.gov (United States)

    Ebert, D.; Byrne, V. E.; McGuire, K. M.; Hurst, V. W., IV; Kerstman, E. L.; Cole, R. W.; Sargsyan, A. E.; Garcia, K. M.; Reyes, D.; Young, M.

    2016-01-01

    (pre-IMM analysis) and overall mitigation of the mission medical impact (IMM analysis); 2) refine the procedure outcome and clinical outcome metrics themselves; 3) refine or develop innovative medical training products and solutions to maximize CMO performance; and 4) validate the methods and products of this experiment for operational use in the planning, execution, and quality assurance of the CMO training process. The team has finalized training protocols and developed a software training/testing tool in collaboration with Butler Graphics (Detroit, MI). In addition to the "hands on" medical procedure modules, the software includes a differential diagnosis exercise (limited clinical decision support tool) to evaluate the diagnostic skills of participants. Human subject testing will occur over the next year.

  20. METRIC EVALUATION PIPELINE FOR 3D MODELING OF URBAN SCENES

    Directory of Open Access Journals (Sweden)

    M. Bosch

    2017-05-01

    Publicly available benchmark data and metric evaluation approaches have been instrumental in enabling research to advance state of the art methods for remote sensing applications in urban 3D modeling. Most publicly available benchmark datasets have consisted of high resolution airborne imagery and lidar suitable for 3D modeling on a relatively modest scale. To enable research in larger scale 3D mapping, we have recently released a public benchmark dataset with multi-view commercial satellite imagery and metrics to compare 3D point clouds with lidar ground truth. We now define a more complete metric evaluation pipeline developed as publicly available open source software to assess semantically labeled 3D models of complex urban scenes derived from multi-view commercial satellite imagery. Evaluation metrics in our pipeline include horizontal and vertical accuracy and completeness, volumetric completeness and correctness, perceptual quality, and model simplicity. Sources of ground truth include airborne lidar and overhead imagery, and we demonstrate a semi-automated process for producing accurate ground truth shape files to characterize building footprints. We validate our current metric evaluation pipeline using 3D models produced using open source multi-view stereo methods. Data and software is made publicly available to enable further research and planned benchmarking activities.

  1. Metric Evaluation Pipeline for 3d Modeling of Urban Scenes

    Science.gov (United States)

    Bosch, M.; Leichtman, A.; Chilcott, D.; Goldberg, H.; Brown, M.

    2017-05-01

    Publicly available benchmark data and metric evaluation approaches have been instrumental in enabling research to advance state of the art methods for remote sensing applications in urban 3D modeling. Most publicly available benchmark datasets have consisted of high resolution airborne imagery and lidar suitable for 3D modeling on a relatively modest scale. To enable research in larger scale 3D mapping, we have recently released a public benchmark dataset with multi-view commercial satellite imagery and metrics to compare 3D point clouds with lidar ground truth. We now define a more complete metric evaluation pipeline developed as publicly available open source software to assess semantically labeled 3D models of complex urban scenes derived from multi-view commercial satellite imagery. Evaluation metrics in our pipeline include horizontal and vertical accuracy and completeness, volumetric completeness and correctness, perceptual quality, and model simplicity. Sources of ground truth include airborne lidar and overhead imagery, and we demonstrate a semi-automated process for producing accurate ground truth shape files to characterize building footprints. We validate our current metric evaluation pipeline using 3D models produced using open source multi-view stereo methods. Data and software is made publicly available to enable further research and planned benchmarking activities.
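
    One plausible reading of the volumetric completeness and correctness scores mentioned in both records above, computed over boolean voxel occupancy grids, is sketched below; the function and naming conventions are our own and are not taken from the released pipeline:

        import numpy as np

        def volumetric_scores(model_voxels, truth_voxels):
            """Completeness and correctness from equally shaped boolean occupancy grids."""
            tp = np.logical_and(model_voxels, truth_voxels).sum()
            completeness = tp / truth_voxels.sum()   # fraction of ground truth recovered
            correctness = tp / model_voxels.sum()    # fraction of the model supported by truth
            return completeness, correctness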

  2. Public relations metrics: research and evaluation

    NARCIS (Netherlands)

    van Ruler, B.; Tkalac Verčič, A.; Verčič, D.

    2008-01-01

    Responding to the increasing need in academia and the public relations profession, this volume presents the current state of knowledge in public relations measurement and evaluation. The book brings together ideas and methods that can be used throughout the world, and scholars and practitioners from

  3. Metrics and Evaluation Models for Accessible Television

    DEFF Research Database (Denmark)

    Li, Dongxiao; Looms, Peter Olaf

    2014-01-01

    The adoption of the UN Convention on the Rights of Persons with Disabilities (UN CRPD) in 2006 has provided a global framework for work on accessibility, including information and communication technologies and audiovisual content. One of the challenges facing the application of the UN CRPD...... number of platforms on which audiovisual content needs to be distributed, requiring very clear multiplatform architectures to facilitate interworking and assure interoperability. As a consequence, the regular evaluations of progress being made by signatories to the UN CRPD protocol are difficult...

  4. Methods and Metrics for Evaluating Environmental Dredging ...

    Science.gov (United States)

    This report documents the objectives, approach, methodologies, results, and interpretation of a collaborative research study conducted by the National Risk Management Research Laboratory (NRMRL) and the National Exposure Research laboratory (NERL) of the U.S. Environmental Protection Agency’s (U.S. EPA’s) Office of Research and Development (ORD) and the U.S. EPA’s Great Lakes National Program Office (GLNPO). The objectives of the research study were to: 1) evaluate remedy effectiveness of environmental dredging as applied to contaminated sediments in the Ashtabula River in northeastern Ohio, and 2) monitor the recovery of the surrounding ecosystem. The project was carried out over 6 years from 2006 through 2011 and consisted of the development and evaluation of methods and approaches to assess river and ecosystem conditions prior to dredging (2006), during dredging (2006 and 2007), and following dredging, both short term (2008) and long term (2009-2011). This project report summarizes and interprets the results of this 6-year study to develop and assess methods for monitoring pollutant fate and transport and ecosystem recovery through the use of biological, chemical, and physical lines of evidence (LOEs) such as: 1) comprehensive sampling of and chemical analysis of contaminants in surface, suspended, and historic sediments; 2) extensive grab and multi-level real time water sampling and analysis of contaminants in the water column; 3) sampling, chemi

  5. Primal-dual convex optimization in large deformation diffeomorphic metric mapping: LDDMM meets robust regularizers

    Science.gov (United States)

    Hernandez, Monica

    2017-12-01

    This paper proposes a method for primal-dual convex optimization in variational large deformation diffeomorphic metric mapping problems formulated with robust regularizers and robust image similarity metrics. The method is based on the Chambolle and Pock primal-dual algorithm for solving general convex optimization problems. Diagonal preconditioning is used to ensure the convergence of the algorithm to the global minimum. We consider three robust regularizers likely to provide acceptable results in diffeomorphic registration: Huber, V-Huber and total generalized variation. The Huber norm is used in the image similarity term. The primal-dual equations are derived for the stationary and the non-stationary parameterizations of diffeomorphisms. The resulting algorithms have been implemented to run on the GPU using CUDA. For the most memory-consuming methods, we have developed a multi-GPU implementation. The GPU implementations allowed us to perform an exhaustive evaluation study on the NIREP and LPBA40 databases. The experiments showed that, for all the considered regularizers, the proposed method converges to diffeomorphic solutions while better preserving discontinuities at the boundaries of the objects compared to baseline diffeomorphic registration methods. In most cases, the evaluation showed a competitive performance for the robust regularizers, close to the performance of the baseline diffeomorphic registration methods.

  6. Evaluating Multiple Object Tracking Performance: The CLEAR MOT Metrics

    Directory of Open Access Journals (Sweden)

    Bernardin Keni

    2008-01-01

    Simultaneous tracking of multiple persons in real-world environments is an active research field and several approaches have been proposed, based on a variety of features and algorithms. Recently, there has been a growing interest in organizing systematic evaluations to compare the various techniques. Unfortunately, the lack of common metrics for measuring the performance of multiple object trackers still makes it hard to compare their results. In this work, we introduce two intuitive and general metrics to allow for objective comparison of tracker characteristics, focusing on their precision in estimating object locations, their accuracy in recognizing object configurations and their ability to consistently label objects over time. These metrics have been extensively used in two large-scale international evaluations, the 2006 and 2007 CLEAR evaluations, to measure and compare the performance of multiple object trackers for a wide variety of tracking tasks. Selected performance results are presented and the advantages and drawbacks of the presented metrics are discussed based on the experience gained during the evaluations.
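
    For reference, the two CLEAR MOT scores are typically aggregated from per-frame matching results along the following lines; this minimal sketch assumes the frame-by-frame association step that produces the counts has already been performed:

        def clear_mot(frames):
            """frames: iterable of dicts with per-frame counts: 'dist_sum' (summed distances of
            matched object-hypothesis pairs), 'matches', 'misses', 'false_positives',
            'id_switches', and 'num_objects' (ground-truth objects present)."""
            dist_sum = sum(f["dist_sum"] for f in frames)
            matches = sum(f["matches"] for f in frames)
            errors = sum(f["misses"] + f["false_positives"] + f["id_switches"] for f in frames)
            num_objects = sum(f["num_objects"] for f in frames)
            motp = dist_sum / matches          # localization precision of matched objects
            mota = 1.0 - errors / num_objects  # accuracy over all error types
            return motp, mota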

  7. An Innovative Metric to Evaluate Satellite Precipitation's Spatial Distribution

    Science.gov (United States)

    Liu, H.; Chu, W.; Gao, X.; Sorooshian, S.

    2011-12-01

    Thanks to their capability to cover mountains, where ground measurement instruments cannot reach, satellites provide a good means of estimating precipitation over mountainous regions. In regions with complex terrain, accurate information on the high-resolution spatial distribution of precipitation is critical for many important issues, such as flood/landslide warning, reservoir operation, water system planning, etc. Therefore, in order to be useful in many practical applications, satellite precipitation products should possess high quality in characterizing spatial distribution. However, most existing validation metrics, which are based on point/grid comparison using simple statistics, cannot effectively measure a satellite product's skill at capturing the spatial patterns of precipitation fields. This deficiency results from the fact that point/grid-wise comparison does not take into account the spatial coherence of precipitation fields. Furthermore, another weakness of many metrics is that they can barely provide information on why satellite products perform well or poorly. Motivated by our recent findings of consistent spatial patterns in the precipitation field over the western U.S., we developed a new metric utilizing EOF analysis and Shannon entropy. The metric can be derived through two steps: 1) capture the dominant spatial patterns of precipitation fields from both satellite products and reference data through EOF analysis, and 2) compute the similarities between the corresponding dominant patterns using a mutual information measure defined with Shannon entropy. Instead of individual points/grids, the new metric treats the entire precipitation field simultaneously, naturally taking advantage of spatial dependence. Since the dominant spatial patterns are shaped by physical processes, the new metric can shed light on why a satellite product can or cannot capture the spatial patterns. For demonstration, an experiment was carried out to evaluate a satellite
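
    A rough sketch of the two steps described above, with the dominant patterns obtained from a singular value decomposition of the space-time anomaly matrix and pattern similarity measured by mutual information, is given below; the binning and variable names are our assumptions, not the authors' implementation:

        import numpy as np

        def leading_eofs(fields, k=3):
            """fields: (time, space) matrix; returns the first k spatial EOF patterns."""
            anomalies = fields - fields.mean(axis=0)
            _, _, vt = np.linalg.svd(anomalies, full_matrices=False)
            return vt[:k]  # each row is one dominant spatial pattern

        def mutual_information(x, y, bins=20):
            """Shannon mutual information between two spatial patterns."""
            joint, _, _ = np.histogram2d(x, y, bins=bins)
            pxy = joint / joint.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

        # Compare corresponding dominant patterns of satellite and reference precipitation:
        # score = mutual_information(leading_eofs(satellite)[0], leading_eofs(reference)[0])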

  8. Evaluation Metrics for Simulations of Tropical South America

    Science.gov (United States)

    Gallup, S.; Baker, I. T.; Denning, A. S.; Cheeseman, M.; Haynes, K. D.; Phillips, M.

    2017-12-01

    The evergreen broadleaf forest of the Amazon Basin is the largest rainforest on earth, and has teleconnections to global climate and carbon cycle characteristics. This region defies simple characterization, spanning large gradients in total rainfall and seasonal variability. Broadly, the region can be thought of as trending from light-limited in its wettest areas to water-limited near the ecotone, with individual landscapes possibly exhibiting the characteristics of either (or both) limitations during an annual cycle. A basin-scale classification of mean behavior has been elusive, and ecosystem response to seasonal cycles and anomalous drought events has resulted in some disagreement in the literature, to say the least. However, new observational platforms and instruments make characterization of the heterogeneity and variability more feasible. To evaluate simulations of ecophysiological function, we develop metrics that correlate various observational products with meteorological variables such as precipitation and radiation. Observations include eddy covariance fluxes, Solar Induced Fluorescence (SIF, from GOME2 and OCO2), biomass and vegetation indices. We find that the modest correlation between SIF and precipitation decreases with increasing annual precipitation, although the relationship is not consistent between products. Biomass increases with increasing precipitation. Although vegetation indices are generally correlated with biomass and precipitation, they can saturate or experience retrieval issues during cloudy periods. Using these observational products and relationships, we develop a set of model evaluation metrics. These metrics are designed to call attention to models that get "the right answer only if it's for the right reason," and provide an opportunity for more critical evaluation of model physics. These metrics represent a testbed that can be applied to multiple models as a means to evaluate their performance in tropical South America.

  9. Utility of different glycemic control metrics for optimizing management of diabetes.

    Science.gov (United States)

    Kohnert, Klaus-Dieter; Heinke, Peter; Vogt, Lutz; Salzsieder, Eckhard

    2015-02-15

    The benchmark for assessing quality of long-term glycemic control and adjustment of therapy is currently glycated hemoglobin (HbA1c). Despite its importance as an indicator for the development of diabetic complications, recent studies have revealed that this metric has some limitations; it conveys a rather complex message, which has to be taken into consideration for diabetes screening and treatment. On the basis of recent clinical trials, the relationship between HbA1c and cardiovascular outcomes in long-standing diabetes has been called into question. It becomes obvious that other surrogate markers and biomarkers are needed to better predict cardiovascular diabetes complications and assess efficiency of therapy. Glycated albumin, fructosamine, and 1,5-anhydroglucitol have received growing interest as alternative markers of glycemic control. In addition to measures of hyperglycemia, advanced glucose monitoring methods became available. An indispensable adjunct to HbA1c in routine diabetes care is self-monitoring of blood glucose. This monitoring method is now widely used, as it provides immediate feedback to patients on short-term changes, involving fasting, preprandial, and postprandial glucose levels. Beyond the traditional metrics, glycemic variability has been identified as a predictor of hypoglycemia, and it might also be implicated in the pathogenesis of vascular diabetes complications. Assessment of glycemic variability is thus important, but exact quantification requires frequently sampled glucose measurements. In order to optimize diabetes treatment, there is a need for both key metrics of glycemic control on a day-to-day basis and for more advanced, user-friendly monitoring methods. In addition to traditional discontinuous glucose testing, continuous glucose sensing has become a useful tool to reveal insufficient glycemic management. This new technology is particularly effective in patients with complicated diabetes and provides the opportunity to characterize
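
    As a minimal illustration of how glycemic variability can be quantified from frequently sampled (e.g., continuous glucose monitoring) data, the sketch below computes a few of the simplest indices; it is not the full set of measures discussed in the record, and the 70-180 mg/dL target range is the commonly used consensus range rather than a value taken from the abstract:

        import numpy as np

        def glycemic_variability(glucose_mg_dl):
            """Simple variability indices from a series of glucose readings (mg/dL)."""
            g = np.asarray(glucose_mg_dl, dtype=float)
            sd = g.std(ddof=1)                             # standard deviation
            cv = 100.0 * sd / g.mean()                     # coefficient of variation, %
            tir = 100.0 * np.mean((g >= 70) & (g <= 180))  # time in range, % of readings
            return {"mean": g.mean(), "sd": sd, "cv_percent": cv, "tir_percent": tir}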

  10. A lighting metric for quantitative evaluation of accent lighting systems

    Science.gov (United States)

    Acholo, Cyril O.; Connor, Kenneth A.; Radke, Richard J.

    2014-09-01

    Accent lighting is critical for artwork and sculpture lighting in museums, and for subject lighting for stage, film, and television. The research problem of designing effective lighting in such settings has been revived recently with the rise of light-emitting-diode-based solid-state lighting. In this work, we propose an easy-to-apply quantitative measure of the scene's visual quality as perceived by human viewers. We consider a well-accent-lit scene as one which maximizes the information about the scene (in an information-theoretic sense) available to the user. We propose a metric based on the entropy of the distribution of colors, which are extracted from an image of the scene from the viewer's perspective. We demonstrate that optimizing the metric as a function of illumination configuration (i.e., position, orientation, and spectral composition) results in natural, pleasing accent lighting. We use a photorealistic simulation tool to validate the functionality of our proposed approach, showing its successful application to two- and three-dimensional scenes.
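
    A minimal sketch of the entropy-of-colors idea follows; the uniform RGB quantization used here is our simplification of the approach described in the abstract:

        import numpy as np

        def color_entropy(image_rgb, levels=8):
            """Shannon entropy of the quantized color distribution of a viewer-perspective image.
            image_rgb: uint8 array of shape (H, W, 3)."""
            q = (image_rgb.astype(int) * levels) // 256  # map 0..255 to 0..levels-1 per channel
            codes = (q[..., 0] * levels + q[..., 1]) * levels + q[..., 2]
            counts = np.bincount(codes.ravel(), minlength=levels ** 3)
            p = counts / counts.sum()
            p = p[p > 0]
            return float(-np.sum(p * np.log2(p)))  # higher entropy ~ more scene information visible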

  11. Prototypic Development and Evaluation of a Medium Format Metric Camera

    Science.gov (United States)

    Hastedt, H.; Rofallski, R.; Luhmann, T.; Rosenbauer, R.; Ochsner, D.; Rieke-Zapp, D.

    2018-05-01

    Engineering applications require high-precision 3D measurement techniques for object sizes that vary between small volumes (2-3 m in each direction) and large volumes (around 20 x 20 x 1-10 m). The requested precision in object space (1σ RMS) is defined to be within 0.1-0.2 mm for large volumes and less than 0.01 mm for small volumes. In particular, for large-volume applications the availability of a metric camera would offer advantages for several reasons: 1) high-quality optical components and stabilisations allow for a stable interior geometry of the camera itself, 2) a stable geometry leads to a stable interior orientation that enables an a priori camera calibration, and 3) a higher resulting precision can be expected. This article presents the development and accuracy evaluation of a new metric camera, the ALPA 12 FPS add|metric. Its general accuracy potential is tested against calibrated lengths in a small-volume test environment based on the German Guideline VDI/VDE 2634.1 (2002). Maximum length measurement errors of less than 0.025 mm are achieved across the different scenarios tested. The accuracy potential for large volumes is estimated within a feasibility study on the application of photogrammetric measurements for deformation estimation on a large wooden shipwreck in the German Maritime Museum. An accuracy of 0.2 mm-0.4 mm is reached for a length of 28 m (given by a distance from a lasertracker network measurement). All analyses have demonstrated high stability of the interior orientation of the camera and indicate the applicability of a priori camera calibration for subsequent 3D measurements.

  12. PROTOTYPIC DEVELOPMENT AND EVALUATION OF A MEDIUM FORMAT METRIC CAMERA

    Directory of Open Access Journals (Sweden)

    H. Hastedt

    2018-05-01

    Engineering applications require high-precision 3D measurement techniques for object sizes that vary between small volumes (2–3 m in each direction) and large volumes (around 20 x 20 x 1–10 m). The requested precision in object space (1σ RMS) is defined to be within 0.1–0.2 mm for large volumes and less than 0.01 mm for small volumes. In particular, for large-volume applications the availability of a metric camera would offer advantages for several reasons: 1) high-quality optical components and stabilisations allow for a stable interior geometry of the camera itself, 2) a stable geometry leads to a stable interior orientation that enables an a priori camera calibration, and 3) a higher resulting precision can be expected. This article presents the development and accuracy evaluation of a new metric camera, the ALPA 12 FPS add|metric. Its general accuracy potential is tested against calibrated lengths in a small-volume test environment based on the German Guideline VDI/VDE 2634.1 (2002). Maximum length measurement errors of less than 0.025 mm are achieved across the different scenarios tested. The accuracy potential for large volumes is estimated within a feasibility study on the application of photogrammetric measurements for deformation estimation on a large wooden shipwreck in the German Maritime Museum. An accuracy of 0.2 mm–0.4 mm is reached for a length of 28 m (given by a distance from a lasertracker network measurement). All analyses have demonstrated high stability of the interior orientation of the camera and indicate the applicability of a priori camera calibration for subsequent 3D measurements.

  13. Retrospective group fusion similarity search based on eROCE evaluation metric.

    Science.gov (United States)

    Avram, Sorin I; Crisan, Luminita; Bora, Alina; Pacureanu, Liliana M; Avram, Stefana; Kurunczi, Ludovic

    2013-03-01

    In this study, a simple evaluation metric, denoted as eROCE was proposed to measure the early enrichment of predictive methods. We demonstrated the superior robustness of eROCE compared to other known metrics throughout several active to inactive ratios ranging from 1:10 to 1:1000. Group fusion similarity search was investigated by varying 16 similarity coefficients, five molecular representations (binary and non-binary) and two group fusion rules using two reference structure set sizes. We used a dataset of 3478 actives and 43,938 inactive molecules and the enrichment was analyzed by means of eROCE. This retrospective study provides optimal similarity search parameters in the case of ALDH1A1 inhibitors. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. An analytical modeling framework to evaluate converged networks through business-oriented metrics

    International Nuclear Information System (INIS)

    Guimarães, Almir P.; Maciel, Paulo R.M.; Matias, Rivalino

    2013-01-01

    Nowadays, society increasingly relies on converged networks as an essential medium for individuals, businesses, and governments. Strategies, methods, models and techniques for preventing and handling hardware or software failures, as well as for avoiding performance degradation, are thus fundamental for prevailing in business. Issues such as operational costs, revenues and their relationship to key performance and dependability metrics are central for defining the required system infrastructure. Our work aims to provide system performance and dependability models to support the optimization of infrastructure design with respect to business-oriented metrics. In addition, a methodology is adopted to support both the modeling and the evaluation process. The results showed that the proposed methodology can significantly reduce the complexity of infrastructure design as well as improve the relationship between business and infrastructure aspects

  15. Performance metrics for the evaluation of hyperspectral chemical identification systems

    Science.gov (United States)

    Truslow, Eric; Golowich, Steven; Manolakis, Dimitris; Ingle, Vinay

    2016-02-01

    Remote sensing of chemical vapor plumes is a difficult but important task for many military and civilian applications. Hyperspectral sensors operating in the long-wave infrared regime have well-demonstrated detection capabilities. However, the identification of a plume's chemical constituents, based on a chemical library, is a multiple hypothesis testing problem which standard detection metrics do not fully describe. We propose using an additional performance metric for identification based on the so-called Dice index. Our approach partitions and weights a confusion matrix to develop both the standard detection metrics and identification metric. Using the proposed metrics, we demonstrate that the intuitive system design of a detector bank followed by an identifier is indeed justified when incorporating performance information beyond the standard detection metrics.
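
    For orientation, the Dice index referred to above has, in its basic set form, the following definition; applying it directly to the identified versus true chemical sets is a simplification of the paper's weighted confusion-matrix construction:

        def dice_index(identified, truth):
            """Dice similarity between the set of identified chemicals and the true set."""
            identified, truth = set(identified), set(truth)
            if not identified and not truth:
                return 1.0
            return 2.0 * len(identified & truth) / (len(identified) + len(truth))

        # Example: dice_index({"SF6", "NH3"}, {"NH3"}) == 2/3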

  16. RNA-SeQC: RNA-seq metrics for quality control and process optimization.

    Science.gov (United States)

    DeLuca, David S; Levin, Joshua Z; Sivachenko, Andrey; Fennell, Timothy; Nazaire, Marc-Danie; Williams, Chris; Reich, Michael; Winckler, Wendy; Getz, Gad

    2012-06-01

    RNA-seq, the application of next-generation sequencing to RNA, provides transcriptome-wide characterization of cellular activity. Assessment of sequencing performance and library quality is critical to the interpretation of RNA-seq data, yet few tools exist to address this issue. We introduce RNA-SeQC, a program which provides key measures of data quality. These metrics include yield, alignment and duplication rates; GC bias, rRNA content, regions of alignment (exon, intron and intragenic), continuity of coverage, 3'/5' bias and count of detectable transcripts, among others. The software provides multi-sample evaluation of library construction protocols, input materials and other experimental parameters. The modularity of the software enables pipeline integration and the routine monitoring of key measures of data quality such as the number of alignable reads, duplication rates and rRNA contamination. RNA-SeQC allows investigators to make informed decisions about sample inclusion in downstream analysis. In summary, RNA-SeQC provides quality control measures critical to experiment design, process optimization and downstream computational analysis. See www.genepattern.org to run online, or www.broadinstitute.org/rna-seqc/ for a command line tool.

  17. Disaster Metrics: A Comprehensive Framework for Disaster Evaluation Typologies.

    Science.gov (United States)

    Wong, Diana F; Spencer, Caroline; Boyd, Lee; Burkle, Frederick M; Archer, Frank

    2017-10-01

    framework. This unique, unifying framework has relevance at an international level and is expected to benefit the disaster, humanitarian, and development sectors. The next step is to undertake a validation process that will include international leaders with experience in evaluation, in general, and disasters specifically. This work promotes an environment for constructive dialogue on evaluations in the disaster setting to strengthen the evidence base for interventions across the disaster spectrum. It remains a work in progress. Wong DF, Spencer C, Boyd L, Burkle FM Jr., Archer F. Disaster metrics: a comprehensive framework for disaster evaluation typologies. Prehosp Disaster Med. 2017;32(5):501-514.

  18. Analysis on the Metrics used in Optimizing Electronic Business based on Learning Techniques

    Directory of Open Access Journals (Sweden)

    Irina-Steliana STAN

    2014-09-01

    The present paper proposes a methodology for analyzing the metrics related to electronic business. The drafts of the optimizing models include KPIs that can highlight the specifics of the business, provided they are integrated using learning-based techniques. Having identified the most important and highest-impact elements of the business, the models should ultimately capture the links between them by automating business flows. Human resources will increasingly collaborate with the optimizing models, which will translate into higher-quality decisions followed by increased profitability.

  19. Evaluating hydrological model performance using information theory-based metrics

    Science.gov (United States)

    The accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m...

  20. Performance evaluation of routing metrics for wireless mesh networks

    CSIR Research Space (South Africa)

    Nxumalo, SL

    2009-08-01

    for WMN. The routing metrics have not been compared with QoS parameters. This paper is work in progress on a project in which the researchers aim to compare the performance of different routing metrics in WMNs using a wireless test bed. Researchers...

  1. Metric Accuracy Evaluation of Dense Matching Algorithms in Archeological Applications

    Directory of Open Access Journals (Sweden)

    C. Re

    2011-12-01

    In the cultural heritage field, the recording and documentation of small and medium-size objects with very detailed Digital Surface Models (DSMs) is readily possible through the use of high-resolution and high-precision triangulation laser scanners. 3D surface recording of archaeological objects can be easily achieved in museums; however, this type of record can be quite expensive. In many cases photogrammetry can provide a viable alternative for the generation of DSMs. The photogrammetric procedure has some benefits with respect to laser survey. The research described in this paper sets out to verify the reconstruction accuracy of DSMs of some archaeological artifacts obtained by photogrammetric survey. The experimentation has been carried out on some objects preserved in the Petrie Museum of Egyptian Archaeology at University College London (UCL). DSMs produced by two photogrammetric software packages are compared with the digital 3D model obtained by a state-of-the-art triangulation color laser scanner. Intercomparison between the generated DSMs has allowed an evaluation of the metric accuracy of the photogrammetric approach applied to archaeological documentation and of the precision performance of the two software packages.

  2. A neurophysiological training evaluation metric for air traffic management.

    Science.gov (United States)

    Borghini, G; Aricò, P; Ferri, F; Graziani, I; Pozzi, S; Napoletano, L; Imbert, J P; Granger, G; Benhacene, R; Babiloni, F

    2014-01-01

    The aim of this work was to analyze the possibility of applying neuroelectrical cognitive metrics to the evaluation of the training level of subjects during the learning of a task employed by Air Traffic Controllers (ATCos). In particular, the Electroencephalogram (EEG), the Electrocardiogram (ECG) and the Electrooculogram (EOG) signals were gathered from a group of students during the execution of an Air Traffic Management (ATM) task, proposed at three different levels of difficulty. The neuroelectrical results were compared with the subjective perception of the task difficulty obtained by the NASA-TLX questionnaires. From these analyses, we suggest that the integration of information derived from the power spectral density (PSD) of the EEG signals, the heart rate (HR) and the eye-blink rate (EBR) returns important quantitative information about the training level of the subjects. In particular, by focusing the analysis on the direct and inverse correlation of the frontal PSD theta (4-7 Hz) and HR, and of the parietal PSD alpha (10-12 Hz) and EBR, respectively, with the degree of mental and emotive engagement, it is possible to obtain useful information about the training improvement across the training sessions.

  3. Conceptual Soundness, Metric Development, Benchmarking, and Targeting for PATH Subprogram Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Mosey, G.; Doris, E.; Coggeshall, C.; Antes, M.; Ruch, J.; Mortensen, J.

    2009-01-01

    The objective of this study is to evaluate the conceptual soundness of the U.S. Department of Housing and Urban Development (HUD) Partnership for Advancing Technology in Housing (PATH) program's revised goals and establish and apply a framework to identify and recommend metrics that are the most useful for measuring PATH's progress. This report provides an evaluative review of PATH's revised goals, outlines a structured method for identifying and selecting metrics, proposes metrics and benchmarks for a sampling of individual PATH programs, and discusses other metrics that potentially could be developed that may add value to the evaluation process. The framework and individual program metrics can be used for ongoing management improvement efforts and to inform broader program-level metrics for government reporting requirements.

  4. Quality Evaluation in Wireless Imaging Using Feature-Based Objective Metrics

    OpenAIRE

    Engelke, Ulrich; Zepernick, Hans-Jürgen

    2007-01-01

    This paper addresses the evaluation of image quality in the context of wireless systems using feature-based objective metrics. The considered metrics comprise a weighted combination of feature values that are used to quantify the extent to which the related artifacts are present in a processed image. In view of imaging applications in mobile radio and wireless communication systems, reduced-reference objective quality metrics are investigated for quantifying user-perceived quality. The exa...

  5. Relevance as a metric for evaluating machine learning algorithms

    NARCIS (Netherlands)

    Kota Gopalakrishna, A.; Ozcelebi, T.; Liotta, A.; Lukkien, J.J.

    2013-01-01

    In machine learning, the choice of a learning algorithm that is suitable for the application domain is critical. The performance metric used to compare different algorithms must also reflect the concerns of users in the application domain under consideration. In this work, we propose a novel

  6. Performance evaluation of objective quality metrics for HDR image compression

    Science.gov (United States)

    Valenzise, Giuseppe; De Simone, Francesca; Lauga, Paul; Dufaux, Frederic

    2014-09-01

    Due to the much larger luminance and contrast characteristics of high dynamic range (HDR) images, well-known objective quality metrics, widely used for the assessment of low dynamic range (LDR) content, cannot be directly applied to HDR images in order to predict their perceptual fidelity. To overcome this limitation, advanced fidelity metrics, such as the HDR-VDP, have been proposed to accurately predict visually significant differences. However, their complex calibration may make them difficult to use in practice. A simpler approach consists in computing arithmetic or structural fidelity metrics, such as PSNR and SSIM, on perceptually encoded luminance values but the performance of quality prediction in this case has not been clearly studied. In this paper, we aim at providing a better comprehension of the limits and the potentialities of this approach, by means of a subjective study. We compare the performance of HDR-VDP to that of PSNR and SSIM computed on perceptually encoded luminance values, when considering compressed HDR images. Our results show that these simpler metrics can be effectively employed to assess image fidelity for applications such as HDR image compression.
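
    A minimal sketch of the simpler approach discussed above: compute PSNR on perceptually encoded luminance. Here a plain log encoding stands in for the perceptual (e.g., PU) transfer function actually used in such studies, so the log_encode function and its constants are assumptions.

```python
import numpy as np

def log_encode(luminance, l_min=0.005, l_max=10000.0):
    """Map absolute luminance (cd/m^2) to [0, 1] on a log scale.

    A simple stand-in for the perceptual encodings (e.g., PU) discussed
    in the abstract; the exact transfer function used there differs.
    """
    l = np.clip(luminance, l_min, l_max)
    return (np.log10(l) - np.log10(l_min)) / (np.log10(l_max) - np.log10(l_min))

def psnr_perceptual(ref_luminance, test_luminance):
    """PSNR computed on perceptually (here: log) encoded luminance."""
    ref, test = log_encode(ref_luminance), log_encode(test_luminance)
    mse = np.mean((ref - test) ** 2)
    return 10.0 * np.log10(1.0 / mse) if mse > 0 else np.inf

# Toy example: a synthetic HDR frame and a distorted copy
rng = np.random.default_rng(0)
reference = rng.uniform(0.01, 4000.0, size=(64, 64))
distorted = reference * rng.normal(1.0, 0.05, size=reference.shape)
print(f"perceptual PSNR: {psnr_perceptual(reference, distorted):.2f} dB")
```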

  7. An Evaluation of the IntelliMetric[SM] Essay Scoring System

    Science.gov (United States)

    Rudner, Lawrence M.; Garcia, Veronica; Welch, Catherine

    2006-01-01

    This report provides a two-part evaluation of the IntelliMetric[SM] automated essay scoring system based on its performance scoring essays from the Analytic Writing Assessment of the Graduate Management Admission Test[TM] (GMAT[TM]). The IntelliMetric system performance is first compared to that of individual human raters, a Bayesian system…

  8. Evaluative Usage-Based Metrics for the Selection of E-Journals.

    Science.gov (United States)

    Hahn, Karla L.; Faulkner, Lila A.

    2002-01-01

    Explores electronic journal usage statistics and develops three metrics and three benchmarks based on those metrics. Topics include earlier work that assessed the value of print journals and was modified for the electronic format; the evaluation of potential purchases; and implications for standards development, including the need for content…

  9. Evaluation of Vehicle-Based Crash Severity Metrics.

    Science.gov (United States)

    Tsoi, Ada H; Gabler, Hampton C

    2015-01-01

    Vehicle change in velocity (delta-v) is a widely used crash severity metric used to estimate occupant injury risk. Despite its widespread use, delta-v has several limitations. Of most concern, delta-v is a vehicle-based metric which does not consider the crash pulse or the performance of occupant restraints, e.g. seatbelts and airbags. Such criticisms have prompted the search for alternative impact severity metrics based upon vehicle kinematics. The purpose of this study was to assess the ability of the occupant impact velocity (OIV), acceleration severity index (ASI), vehicle pulse index (VPI), and maximum delta-v (delta-v) to predict serious injury in real world crashes. The study was based on the analysis of event data recorders (EDRs) downloaded from the National Automotive Sampling System / Crashworthiness Data System (NASS-CDS) 2000-2013 cases. All vehicles in the sample were GM passenger cars and light trucks involved in a frontal collision. Rollover crashes were excluded. Vehicles were restricted to single-event crashes that caused an airbag deployment. All EDR data were checked for a successful, completed recording of the event and that the crash pulse was complete. The maximum abbreviated injury scale (MAIS) was used to describe occupant injury outcome. Drivers were categorized into either non-seriously injured group (MAIS2-) or seriously injured group (MAIS3+), based on the severity of any injuries to the thorax, abdomen, and spine. ASI and OIV were calculated according to the Manual for Assessing Safety Hardware. VPI was calculated according to ISO/TR 12353-3, with vehicle-specific parameters determined from U.S. New Car Assessment Program crash tests. Using binary logistic regression, the cumulative probability of injury risk was determined for each metric and assessed for statistical significance, goodness-of-fit, and prediction accuracy. The dataset included 102,744 vehicles. A Wald chi-square test showed each vehicle-based crash severity metric
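
    A hedged sketch of the statistical step described above: fitting a binary logistic model of serious injury (MAIS3+) against a single vehicle-based severity metric and reading off the cumulative probability of injury. The data below are synthetic, and the assumed dose-response shape is purely illustrative, not taken from the NASS-CDS analysis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for an EDR-derived dataset: one severity metric
# (e.g., delta-v in km/h) and a binary MAIS3+ outcome.
rng = np.random.default_rng(1)
delta_v = rng.uniform(5, 80, size=2000)
p_true = 1.0 / (1.0 + np.exp(-(0.08 * delta_v - 4.0)))   # assumed dose-response
mais3plus = rng.binomial(1, p_true)

model = LogisticRegression()
model.fit(delta_v.reshape(-1, 1), mais3plus)

# Cumulative probability of serious injury at selected severities
for dv in (20, 40, 60):
    risk = model.predict_proba([[dv]])[0, 1]
    print(f"delta-v = {dv:2d} km/h -> P(MAIS3+) = {risk:.2f}")
```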

  10. Optimizing the fMRI data-processing pipeline using prediction and reproducibility performance metrics: I. A preliminary group analysis

    DEFF Research Database (Denmark)

    Strother, Stephen C.; Conte, Stephen La; Hansen, Lars Kai

    2004-01-01

    We argue that published results demonstrate that new insights into human brain function may be obscured by poor and/or limited choices in the data-processing pipeline, and review the work on performance metrics for optimizing pipelines: prediction, reproducibility, and related empirical Receiver......, temporal detrending, and between-subject alignment) in a group analysis of BOLD-fMRI scans from 16 subjects performing a block-design, parametric-static-force task. Large-scale brain networks were detected using a multivariate linear discriminant analysis (canonical variates analysis, CVA) that was tuned...... of baseline scans have constant, equal means, and this assumption was assessed with prediction metrics. Higher-order polynomial warps compared to affine alignment had only a minor impact on the performance metrics. We found that both prediction and reproducibility metrics were required for optimizing...

  11. Evaluation of mobile phone camera benchmarking using objective camera speed and image quality metrics

    Science.gov (United States)

    Peltoketo, Veli-Tapani

    2014-11-01

    When a mobile phone camera is tested and benchmarked, the significance of image quality metrics is widely acknowledged. There are also existing methods to evaluate the camera speed. However, the speed or rapidity metrics of the mobile phone's camera system have not been used together with the quality metrics, even though camera speed has become an increasingly important camera performance feature. There are several tasks in this work. First, the most important image quality and speed-related metrics of a mobile phone's camera system are collected from standards and papers, and novel speed metrics are also identified. Second, combinations of the quality and speed metrics are validated using mobile phones on the market. The measurements are made against the application programming interfaces of the different operating systems. Finally, the results are evaluated and conclusions are made. The paper defines a solution to combine different image quality and speed metrics into a single benchmarking score. A proposal of the combined benchmarking metric is evaluated using measurements of 25 mobile phone cameras on the market. The paper is a continuation of previous benchmarking work, expanded with visual noise measurement and updated to the latest mobile phone versions.
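
    A sketch of one plausible way to fold heterogeneous quality and speed measurements into a single benchmarking score, as the abstract describes; the metric names, ranges, and weights below are illustrative assumptions, not the paper's definitions.

```python
def normalize(value, worst, best):
    """Map a raw metric onto [0, 1], where 1 is best.

    Works for both 'higher is better' (best > worst) and
    'lower is better' (best < worst) metrics.
    """
    score = (value - worst) / (best - worst)
    return min(max(score, 0.0), 1.0)

def benchmark_score(measurements, spec, weights):
    """Weighted combination of normalized metrics into one score (0-100)."""
    total = sum(weights.values())
    score = sum(
        weights[name] * normalize(measurements[name], *spec[name])
        for name in weights
    )
    return 100.0 * score / total

# Illustrative metric set; ranges and weights are assumptions, not standard values.
spec = {                                # (worst, best)
    "visual_noise": (10.0, 0.0),        # lower is better
    "resolution_lp_mm": (50.0, 200.0),  # higher is better
    "shot_to_shot_s": (3.0, 0.2),       # lower is better
    "autofocus_s": (2.0, 0.1),          # lower is better
}
weights = {"visual_noise": 3, "resolution_lp_mm": 3, "shot_to_shot_s": 2, "autofocus_s": 2}
measured = {"visual_noise": 4.2, "resolution_lp_mm": 140.0, "shot_to_shot_s": 0.8, "autofocus_s": 0.5}
print(f"combined benchmark score: {benchmark_score(measured, spec, weights):.1f}")
```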

  12. Riemannian metric optimization on surfaces (RMOS) for intrinsic brain mapping in the Laplace-Beltrami embedding space.

    Science.gov (United States)

    Gahm, Jin Kyu; Shi, Yonggang

    2018-05-01

    Surface mapping methods play an important role in various brain imaging studies from tracking the maturation of adolescent brains to mapping gray matter atrophy patterns in Alzheimer's disease. Popular surface mapping approaches based on spherical registration, however, have inherent numerical limitations when severe metric distortions are present during the spherical parameterization step. In this paper, we propose a novel computational framework for intrinsic surface mapping in the Laplace-Beltrami (LB) embedding space based on Riemannian metric optimization on surfaces (RMOS). Given a diffeomorphism between two surfaces, an isometry can be defined using the pullback metric, which in turn results in identical LB embeddings from the two surfaces. The proposed RMOS approach builds upon this mathematical foundation and achieves general feature-driven surface mapping in the LB embedding space by iteratively optimizing the Riemannian metric defined on the edges of triangular meshes. At the core of our framework is an optimization engine that converts an energy function for surface mapping into a distance measure in the LB embedding space, which can be effectively optimized using gradients of the LB eigen-system with respect to the Riemannian metrics. In the experimental results, we compare the RMOS algorithm with spherical registration using large-scale brain imaging data, and show that RMOS achieves superior performance in the prediction of hippocampal subfields and cortical gyral labels, and the holistic mapping of striatal surfaces for the construction of a striatal connectivity atlas from substantia nigra. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. MOL-Eye: A New Metric for the Performance Evaluation of a Molecular Signal

    OpenAIRE

    Turan, Meric; Kuran, Mehmet Sukru; Yilmaz, H. Birkan; Chae, Chan-Byoung; Tugcu, Tuna

    2017-01-01

    Inspired by the eye diagram in classical radio frequency (RF) based communications, the MOL-Eye diagram is proposed for the performance evaluation of a molecular signal within the context of molecular communication. Utilizing various features of this diagram, three new metrics for the performance evaluation of a molecular signal, namely the maximum eye height, standard deviation of received molecules, and counting SNR (CSNR) are introduced. The applicability of these performance metrics in th...

  14. A composite efficiency metrics for evaluation of resource and energy utilization

    International Nuclear Information System (INIS)

    Yang, Siyu; Yang, Qingchun; Qian, Yu

    2013-01-01

    Polygeneration systems are commonly found in the chemical and energy industries. These systems often involve chemical conversions and energy conversions. Studies of these systems are interdisciplinary, mainly involving the fields of chemical engineering, energy engineering, environmental science, and economics. Each of these fields has developed an isolated index system different from the others. Analyses of polygeneration systems are therefore very likely to provide biased results when only the indexes from one field are used. This paper is motivated by this problem to develop a new composite efficiency metric for polygeneration systems. The new metric is based on the second law of thermodynamics (exergy theory). We introduce the exergy cost of waste treatment as an energy penalty into the conventional exergy efficiency. Using this new metric avoids spending too much energy to increase production, or sacrificing production capacity to save energy consumption. The composite metric is studied on a simplified co-production process, syngas to methanol and electricity. The advantage of the new efficiency metric is demonstrated by comparison with carbon element efficiency, energy efficiency, and exergy efficiency. Results show that the new metric gives a more rational analysis than the other indexes. - Highlights: • The composite efficiency metric gives a balanced evaluation of resource utilization and energy utilization. • This efficiency uses the exergy for waste treatment as the energy penalty. • This efficiency is applied on a simplified co-production process. • Results show that the composite metric is better than energy efficiencies and resource efficiencies

  15. Optimization of a simplified automobile finite element model using time varying injury metrics.

    Science.gov (United States)

    Gaewsky, James P; Danelson, Kerry A; Weaver, Caitlin M; Stitzel, Joel D

    2014-01-01

    In 2011, frontal crashes resulted in 55% of passenger car injuries with 10,277 fatalities and 866,000 injuries in the United States. To better understand frontal crash injury mechanisms, human body finite element models (FEMs) can be used to reconstruct Crash Injury Research and Engineering Network (CIREN) cases. A limitation of this method is the paucity of vehicle FEMs; therefore, we developed a functionally equivalent simplified vehicle model. The New Car Assessment Program (NCAP) data for our selected vehicle was from a frontal collision with Hybrid III (H3) Anthropomorphic Test Device (ATD) occupant. From NCAP test reports, the vehicle geometry was created and the H3 ATD was positioned. The material and component properties optimized using a variation study process were: steering column shear bolt fracture force and stroke resistance, seatbelt pretensioner force, frontal and knee bolster airbag stiffness, and belt friction through the D-ring. These parameters were varied using three successive Latin Hypercube Designs of Experiments with 130-200 simulations each. The H3 injury response was compared to the reported NCAP frontal test results for the head, chest and pelvis accelerations, and seat belt and femur forces. The phase, magnitude, and comprehensive error factors, from a Sprague and Geers analysis were calculated for each injury metric and then combined to determine the simulations with the best match to the crash test. The Sprague and Geers analyses typically yield error factors ranging from 0 to 1 with lower scores being more optimized. The total body injury response error factor for the most optimized simulation from each round of the variation study decreased from 0.466 to 0.395 to 0.360. This procedure to optimize vehicle FEMs is a valuable tool to conduct future CIREN case reconstructions in a variety of vehicles.
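
    The Sprague and Geers comparison used above can be sketched with the commonly published magnitude, phase, and comprehensive error-factor formulas; the following is a generic implementation of that standard formulation, not code from the study, applied to a toy crash-pulse pair.

```python
import numpy as np

def sprague_geers(measured, computed, t):
    """Magnitude (M), phase (P) and comprehensive (C) error factors.

    Implements the commonly published Sprague & Geers formulas; lower
    values indicate better agreement between the two time histories.
    """
    psi_mm = np.trapz(measured * measured, t)
    psi_cc = np.trapz(computed * computed, t)
    psi_cm = np.trapz(computed * measured, t)
    M = np.sqrt(psi_cc / psi_mm) - 1.0
    P = np.arccos(np.clip(psi_cm / np.sqrt(psi_cc * psi_mm), -1.0, 1.0)) / np.pi
    C = np.sqrt(M * M + P * P)
    return M, P, C

# Toy example: a crash-pulse-like test signal and a slightly shifted simulation
t = np.linspace(0.0, 0.12, 600)
test = 40.0 * np.sin(np.pi * t / 0.12)            # "measured" pulse
sim = 36.0 * np.sin(np.pi * (t - 0.005) / 0.12)   # "computed" pulse
M, P, C = sprague_geers(test, sim, t)
print(f"M = {M:.3f}, P = {P:.3f}, C = {C:.3f}")
```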

  16. Evaluating which plan quality metrics are appropriate for use in lung SBRT.

    Science.gov (United States)

    Yaparpalvi, Ravindra; Garg, Madhur K; Shen, Jin; Bodner, William R; Mynampati, Dinesh K; Gafar, Aleiya; Kuo, Hsiang-Chi; Basavatia, Amar K; Ohri, Nitin; Hong, Linda X; Kalnicki, Shalom; Tome, Wolfgang A

    2018-02-01

    Several dose metrics in the categories of homogeneity, coverage, conformity and gradient have been proposed in the literature for evaluating treatment plan quality. In this study, we applied these metrics to characterize and identify the plan quality metrics that would merit plan quality assessment in lung stereotactic body radiation therapy (SBRT) dose distributions. Treatment plans of 90 lung SBRT patients, comprising 91 targets, treated in our institution were retrospectively reviewed. Dose calculations were performed using the anisotropic analytical algorithm (AAA) with heterogeneity correction. A literature review of published plan quality metrics in the categories of coverage, homogeneity, conformity and gradient was performed. For each patient, using dose-volume histogram data, plan quality metric values were quantified and analysed. For the study cohort, the Radiation Therapy Oncology Group (RTOG)-defined plan quality metric values were: coverage (0.90 ± 0.08); homogeneity (1.27 ± 0.07); conformity (1.03 ± 0.07) and gradient (4.40 ± 0.80). Geometric conformity strongly correlated with conformity index (p ...); ... plan quality guidelines: coverage % (ICRU 62), conformity (CN or CI Paddick) and gradient (R 50%). Furthermore, we strongly recommend that RTOG lung SBRT protocols adopt either CN or CI Paddick in place of the prescription isodose to target volume ratio for conformity index evaluation. Advances in knowledge: Our study metrics are valuable tools for establishing lung SBRT plan quality guidelines.
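
    Two of the recommended conformity and gradient metrics can be computed directly from voxelized dose and target masks, as in this sketch: the Paddick conformity number is (TV_PIV)^2 / (TV × PIV) and R50% is the 50% isodose volume over the target volume. The dose grid below is synthetic, not clinical data.

```python
import numpy as np

def paddick_cn(dose, target_mask, prescription):
    """Paddick conformity number: (TV_PIV)^2 / (TV * PIV)."""
    piv_mask = dose >= prescription                  # prescription isodose volume
    tv = target_mask.sum()
    piv = piv_mask.sum()
    tv_piv = (target_mask & piv_mask).sum()
    return (tv_piv ** 2) / (tv * piv)

def r50(dose, target_mask, prescription):
    """Gradient index R50%: volume of the 50% isodose over the target volume."""
    v50 = (dose >= 0.5 * prescription).sum()
    return v50 / target_mask.sum()

# Synthetic spherical target and radially falling dose on a 1 mm grid
grid = np.indices((80, 80, 80)).astype(float)
r = np.sqrt(((grid - 40.0) ** 2).sum(axis=0))
target = r <= 10.0                                       # 10 mm radius PTV
dose = 60.0 * np.exp(-np.maximum(r - 10.0, 0.0) / 6.0)   # 60 Gy inside, falling outside
print(f"CN = {paddick_cn(dose, target, 60.0):.2f}, R50% = {r50(dose, target, 60.0):.2f}")
```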

  17. A Cross-Domain Survey of Metrics for Modelling and Evaluating Collisions

    Directory of Open Access Journals (Sweden)

    Jeremy A. Marvel

    2014-09-01

    Full Text Available This paper provides a brief survey of the metrics for measuring probability, degree, and severity of collisions as applied to autonomous and intelligent systems. Though not exhaustive, this survey evaluates the state-of-the-art of collision metrics, and assesses which are likely to aid in the establishment and support of autonomous system collision modelling. The survey includes metrics for (1) robot arms; (2) mobile robot platforms; (3) nonholonomic physical systems such as ground vehicles, aircraft, and naval vessels; and (4) virtual and mathematical models.

  18. Incorporating big data into treatment plan evaluation: Development of statistical DVH metrics and visualization dashboards

    Directory of Open Access Journals (Sweden)

    Charles S. Mayo, PhD

    2017-07-01

    Conclusions: Statistical DVH offers an easy-to-read, detailed, and comprehensive way to visualize the quantitative comparison with historical experiences and among institutions. WES and GEM metrics offer a flexible means of incorporating discrete threshold-prioritizations and historic context into a set of standardized scoring metrics. Together, they provide a practical approach for incorporating big data into clinical practice for treatment plan evaluations.

  19. [Applicability of traditional landscape metrics in evaluating urban heat island effect].

    Science.gov (United States)

    Chen, Ai-Lian; Sun, Ran-Hao; Chen, Li-Ding

    2012-08-01

    By using 24 landscape metrics, this paper evaluated the urban heat island effect in parts of Beijing's downtown area. QuickBird (QB) images were used to extract the landscape type information, and the thermal bands from Landsat Enhanced Thematic Mapper Plus (ETM+) images were used to extract the land surface temperature (LST) in four seasons of the same year. The 24 landscape pattern metrics were calculated at landscape and class levels in a fixed window of 120 m × 120 m in size, and the applicability of these traditional landscape metrics in evaluating the urban heat island effect was examined. Among the 24 landscape metrics, only the percentage composition of landscape (PLAND), patch density (PD), largest patch index (LPI), coefficient of Euclidean nearest-neighbor distance variance (ENN_CV), and landscape division index (DIVISION) at landscape level were significantly correlated with the LST in March, May, and November, and the PLAND, LPI, DIVISION, percentage of like adjacencies, and interspersion and juxtaposition index at class level showed significant correlations with the LST in March, May, July, and December, especially in July. Some metrics such as PD, edge density, clumpiness index, patch cohesion index, effective mesh size, splitting index, aggregation index, and normalized landscape shape index showed varying correlations with the LST at different class levels. The traditional landscape metrics were not appropriate for evaluating the effects of the river on LST, while some of the metrics could be useful in characterizing urban LST and analyzing the urban heat island effect, provided the metrics are first screened and examined.

  20. Evaluating Modeled Impact Metrics for Human Health, Agriculture Growth, and Near-Term Climate

    Science.gov (United States)

    Seltzer, K. M.; Shindell, D. T.; Faluvegi, G.; Murray, L. T.

    2017-12-01

    Simulated metrics that assess impacts on human health, agriculture growth, and near-term climate were evaluated using ground-based and satellite observations. The NASA GISS ModelE2 and GEOS-Chem models were used to simulate the near-present chemistry of the atmosphere. A suite of simulations that varied by model, meteorology, horizontal resolution, emissions inventory, and emissions year was performed, enabling an analysis of metric sensitivities to various model components. All simulations utilized consistent anthropogenic global emissions inventories (ECLIPSE V5a or CEDS), and an evaluation of simulated results was carried out for 2004-2006 and 2009-2011 over the United States and 2014-2015 over China. Results for O3- and PM2.5-based metrics featured minor differences due to the model resolutions considered here (2.0° × 2.5° and 0.5° × 0.666°), whereas model, meteorology, and emissions inventory each played larger roles in the variance. Surface metrics related to O3 were consistently high-biased, though to varying degrees, demonstrating the need to evaluate particular modeling frameworks before O3 impacts are quantified. Surface metrics related to PM2.5 were diverse, indicating that a multimodel mean with robust results is a valuable tool in predicting PM2.5-related impacts. Oftentimes, the configuration that captured the change of a metric best over time differed from the configuration that captured the magnitude of the same metric best, demonstrating the challenge in skillfully simulating impacts. These results highlight the strengths and weaknesses of these models in simulating impact metrics related to air quality and near-term climate. With such information, the reliability of historical and future simulations can be better understood.

  1. Semantic metrics

    OpenAIRE

    Hu, Bo; Kalfoglou, Yannis; Dupplaw, David; Alani, Harith; Lewis, Paul; Shadbolt, Nigel

    2006-01-01

    In the context of the Semantic Web, many ontology-related operations, e.g. ontology ranking, segmentation, alignment, articulation, reuse, evaluation, can be boiled down to one fundamental operation: computing the similarity and/or dissimilarity among ontological entities, and in some cases among ontologies themselves. In this paper, we review standard metrics for computing distance measures and we propose a series of semantic metrics. We give a formal account of semantic metrics drawn from a...

  2. EVALUATING METRICS FOR GREEN CHEMISTRIES: INFORMATION AND CALCULATION NEEDS

    Science.gov (United States)

    Research within the U.S. EPA's National Risk Management Research Laboratory is developing a methodology for the evaluation of green chemistries. This methodology called GREENSCOPE (Gauging Reaction Effectiveness for the ENvironmental Sustainability of Chemistries with a multi-Ob...

  3. Evaluation of Subjective and Objective Performance Metrics for Haptically Controlled Robotic Systems

    Directory of Open Access Journals (Sweden)

    Cong Dung Pham

    2014-07-01

    Full Text Available This paper studies in detail how different evaluation methods perform when it comes to describing the performance of haptically controlled mobile manipulators. Particularly, we investigate how well subjective metrics perform compared to objective metrics. To find the best metrics to describe the performance of a control scheme is challenging when human operators are involved; how the user perceives the performance of the controller does not necessarily correspond to the directly measurable metrics normally used in controller evaluation. It is therefore important to study whether there is any correspondence between how the user perceives the performance of a controller, and how it performs in terms of directly measurable metrics such as the time used to perform a task, number of errors, accuracy, and so on. To perform these tests we choose a system that consists of a mobile manipulator that is controlled by an operator through a haptic device. This is a good system for studying different performance metrics as the performance can be determined by subjective metrics based on feedback from the users, and also as objective and directly measurable metrics. The system consists of a robotic arm which provides for interaction and manipulation, which is mounted on a mobile base which extends the workspace of the arm. The operator thus needs to perform both interaction and locomotion using a single haptic device. While the position of the on-board camera is determined by the base motion, the principal control objective is the motion of the manipulator arm. This calls for intelligent control allocation between the base and the manipulator arm in order to obtain intuitive control of both the camera and the arm. We implement three different approaches to the control allocation problem, i.e., whether the vehicle or manipulator arm actuation is applied to generate the desired motion. The performance of the different control schemes is evaluated, and our

  4. Use of social media in health promotion: purposes, key performance indicators, and evaluation metrics.

    Science.gov (United States)

    Neiger, Brad L; Thackeray, Rosemary; Van Wagenen, Sarah A; Hanson, Carl L; West, Joshua H; Barnes, Michael D; Fagen, Michael C

    2012-03-01

    Despite the expanding use of social media, little has been published about its appropriate role in health promotion, and even less has been written about evaluation. The purpose of this article is threefold: (a) outline purposes for social media in health promotion, (b) identify potential key performance indicators associated with these purposes, and (c) propose evaluation metrics for social media related to the key performance indicators. Process evaluation is presented in this article as an overarching evaluation strategy for social media.

  5. A comparison of metrics to evaluate the effects of hydro-facility passage stressors on fish

    Energy Technology Data Exchange (ETDEWEB)

    Colotelo, Alison H.; Goldman, Amy E.; Wagner, Katie A.; Brown, Richard S.; Deng, Z. Daniel; Richmond, Marshall C.

    2017-03-01

    Hydropower is the most common form of renewable energy, and countries worldwide are considering expanding hydropower to new areas. One of the challenges of hydropower deployment is mitigation of the environmental impacts including water quality, habitat alterations, and ecosystem connectivity. For fish species that inhabit river systems with hydropower facilities, passage through the facility to access spawning and rearing habitats can be particularly challenging. Fish moving downstream through a hydro-facility can be exposed to a number of stressors (e.g., rapid decompression, shear forces, blade strike and collision, and turbulence), which can all affect fish survival in direct and indirect ways. Many studies have investigated the effects of hydro-turbine passage on fish; however, the comparability among studies is limited by variation in the metrics and biological endpoints used. Future studies investigating the effects of hydro-turbine passage should focus on using metrics and endpoints that are easily comparable. This review summarizes four categories of metrics that are used in fisheries research and have application to hydro-turbine passage (i.e., mortality, injury, molecular metrics, behavior) and evaluates them based on several criteria (i.e., resources needed, invasiveness, comparability among stressors and species, and diagnostic properties). Additionally, these comparisons are put into context of study setting (i.e., laboratory vs. field). Overall, injury and molecular metrics are ideal for studies in which there is a need to understand the mechanisms of effect, whereas behavior and mortality metrics provide information on the whole body response of the fish. The study setting strongly influences the comparability among studies. In laboratory-based studies, stressors can be controlled by both type and magnitude, allowing for easy comparisons among studies. In contrast, field studies expose fish to realistic passage environments but the comparability is

  6. Towards a consensus on datasets and evaluation metrics for developing B-cell epitope prediction tools

    DEFF Research Database (Denmark)

    Greenbaum, Jason A.; Andersen, Pernille; Blythe, Martin

    2007-01-01

    and immunology communities. Improving the accuracy of B-cell epitope prediction methods depends on a community consensus on the data and metrics utilized to develop and evaluate such tools. A workshop, sponsored by the National Institute of Allergy and Infectious Disease (NIAID), was recently held in Washington...

  7. Demand Forecasting: An Evaluation of DODs Accuracy Metric and Navys Procedures

    Science.gov (United States)

    2016-06-01

    Notation used in the report: ci = unit cost for item i; fi = demand forecast for item i; ai = actual demand for item i. A close look at the fCIMIP metric reveals a... (Naval Postgraduate School, Monterey, California, MBA professional report, June 2016.)

  8. Optimized Evaluation System to Athletic Food Safety

    OpenAIRE

    Shanshan Li

    2015-01-01

    This study presented a new method of optimizing the evaluation function in athletic food safety information programming by particle swarm optimization. The parameters of the evaluation function are adjusted automatically by a self-optimizing method accomplished through competition, in which the food information system plays against itself with different evaluation functions. The results show that the particle swarm optimization is successfully app...
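
    A generic particle swarm optimization loop of the kind referenced above might look like the following sketch; the fitness function here is a stand-in (distance to an assumed ideal weight vector), whereas in the study the fitness would come from self-play between evaluation functions.

```python
import numpy as np

def pso(fitness, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer minimizing `fitness` over [-1, 1]^dim."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.0, 1.0, (n_particles, dim))       # positions (weight vectors)
    v = np.zeros_like(x)                                  # velocities
    pbest, pbest_val = x.copy(), np.array([fitness(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, -1.0, 1.0)
        vals = np.array([fitness(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Placeholder fitness: distance of candidate evaluation-function weights to an
# assumed "ideal" weight vector; in the paper this would come from self-play.
ideal = np.array([0.5, -0.2, 0.8, 0.1])
best_w, best_val = pso(lambda p: float(np.sum((p - ideal) ** 2)), dim=4)
print(best_w.round(3), round(best_val, 6))
```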

  9. Individuality evaluation for paper based artifact-metrics using transmitted light image

    Science.gov (United States)

    Yamakoshi, Manabu; Tanaka, Junichi; Furuie, Makoto; Hirabayashi, Masashi; Matsumoto, Tsutomu

    2008-02-01

    Artifact-metrics is an automated method of authenticating artifacts based on a measurable intrinsic characteristic. Intrinsic characteristics, such as microscopic random patterns created during the manufacturing process, are very difficult to copy. A transmitted light image of the distribution can be used for artifact-metrics, since the fiber distribution of paper is random. Little is known about the individuality of the transmitted light image, although it is an important requirement for intrinsic characteristic artifact-metrics. Measuring individuality requires that the intrinsic characteristic of each artifact significantly differs, so having sufficient individuality can make an artifact-metric system highly resistant to brute force attack. Here we investigate the influence of paper category, matching size of sample, and image resolution on the individuality of a transmitted light image of paper through a matching test using those images. More concretely, we evaluate FMR/FNMR curves by calculating similarity scores using correlation coefficients between pairs of scanner input images, and we evaluate the individuality of paper by way of an estimated EER with a probabilistic measure, through a matching method based on line segments that can localize the influence of rotation gaps of a sample when the matching size is large. As a result, we found that the transmitted light image of paper has sufficient individuality.
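
    The matching procedure described above can be sketched as correlation-coefficient similarity scores between scanned patches plus an equal-error-rate estimate from genuine/impostor score distributions; the "fiber patterns" below are synthetic noise images, and the paper's line-segment matching refinement is omitted.

```python
import numpy as np

def similarity(img_a, img_b):
    """Pearson correlation coefficient between two image patches."""
    a, b = img_a.ravel().astype(float), img_b.ravel().astype(float)
    return np.corrcoef(a, b)[0, 1]

def estimate_eer(genuine_scores, impostor_scores):
    """Equal error rate: threshold where FNMR (genuine rejected) equals FMR."""
    thresholds = np.linspace(-1.0, 1.0, 2001)
    fnmr = np.array([(genuine_scores < t).mean() for t in thresholds])
    fmr = np.array([(impostor_scores >= t).mean() for t in thresholds])
    idx = np.argmin(np.abs(fnmr - fmr))
    return (fnmr[idx] + fmr[idx]) / 2.0

# Synthetic "fiber patterns": re-scans of the same sheet share structure,
# different sheets do not.
rng = np.random.default_rng(2)
sheets = [rng.normal(size=(64, 64)) for _ in range(50)]
genuine = np.array([similarity(s, s + 0.3 * rng.normal(size=s.shape)) for s in sheets])
impostor = np.array([similarity(sheets[i], sheets[i + 1]) for i in range(len(sheets) - 1)])
print(f"estimated EER: {estimate_eer(genuine, impostor):.3f}")
```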

  10. Comparison of SOAP and REST Based Web Services Using Software Evaluation Metrics

    Directory of Open Access Journals (Sweden)

    Tihomirovs Juris

    2016-12-01

    Full Text Available The usage of Web services has recently increased. Therefore, it is important to select the right type of Web services at the project design stage. The most common implementations are based on the SOAP (Simple Object Access Protocol) and REST (Representational State Transfer) protocol styles. Maintainability of REST and SOAP Web services has become an important issue as the popularity of Web services is increasing. The choice of the right approach is not an easy decision, since it is influenced by development requirements and maintenance considerations. In the present research, we present the comparison of SOAP- and REST-based Web services using software evaluation metrics. To achieve this aim, a systematic literature review will be made to compare REST and SOAP Web services in terms of the software evaluation metrics.

  11. New Metrics for Economic Evaluation in the Presence of Heterogeneity: Focusing on Evaluating Policy Alternatives Rather than Treatment Alternatives.

    Science.gov (United States)

    Kim, David D; Basu, Anirban

    2017-11-01

    Cost-effectiveness analysis (CEA) methods fail to acknowledge that where cost-effectiveness differs across subgroups, there may be differential adoption of technology. Also, current CEA methods are not amenable to incorporating the impact of policy alternatives that potentially influence the adoption behavior. Unless CEA methods are extended to allow for a comparison of policies rather than simply treatments, their usefulness to decision makers may be limited. We conceptualize new metrics, which estimate the realized value of technology from policy alternatives, through introducing subgroup-specific adoption parameters into existing metrics, incremental cost-effectiveness ratios (ICERs) and Incremental Net Monetary Benefits (NMBs). We also provide the Loss with respect to Efficient Diffusion (LED) metrics, which link with existing value of information metrics but take a policy evaluation perspective. We illustrate these metrics using policies on treatment with combination therapy with a statin plus a fibrate v. statin monotherapy for patients with diabetes and mixed dyslipidemia. Under the traditional approach, the population-level ICER of combination v. monotherapy was $46,000/QALY. However, after accounting for differential rates of adoption of the combination therapy (7.2% among males and 4.3% among females), the modified ICER was $41,733/QALY, due to the higher rate of adoption in the more cost-effective subgroup (male). The LED metrics showed that an education program to increase the uptake of combination therapy among males would provide the largest economic returns due to the significant underutilization of the combination therapy among males under the current policy. This framework may have the potential to improve the decision-making process by producing metrics that are better aligned with the specific policy decisions under consideration for a specific technology.
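
    A toy calculation of the adoption-weighted idea: each subgroup's incremental costs and QALYs are weighted by its uptake rate before forming the ICER. Only the adoption rates (7.2% for males, 4.3% for females) come from the abstract; the subgroup cost and QALY inputs are invented for illustration, so the outputs do not reproduce the paper's figures.

```python
# Subgroup-level incremental costs and QALYs are illustrative placeholders;
# only the adoption rates (7.2% males, 4.3% females) come from the abstract.
subgroups = {
    "male":   dict(share=0.5, adoption=0.072, d_cost=9200.0, d_qaly=0.22),
    "female": dict(share=0.5, adoption=0.043, d_cost=9200.0, d_qaly=0.18),
}

def icer(groups, use_adoption):
    """Population ICER; optionally weight each subgroup by its adoption rate."""
    cost = qaly = 0.0
    for g in groups.values():
        w = g["share"] * (g["adoption"] if use_adoption else 1.0)
        cost += w * g["d_cost"]
        qaly += w * g["d_qaly"]
    return cost / qaly

print(f"conventional ICER:      ${icer(subgroups, use_adoption=False):,.0f}/QALY")
print(f"adoption-weighted ICER: ${icer(subgroups, use_adoption=True):,.0f}/QALY")
```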

  12. Robustness of climate metrics under climate policy ambiguity

    International Nuclear Information System (INIS)

    Ekholm, Tommi; Lindroos, Tomi J.; Savolainen, Ilkka

    2013-01-01

    Highlights: • We assess the economic impacts of using different climate metrics. • The setting is cost-efficient scenarios for three interpretations of the 2 °C target. • With each target setting, the optimal metric is different. • Therefore policy ambiguity prevents the selection of an optimal metric. • Robust metric values that perform well with multiple policy targets however exist. -- Abstract: A wide array of alternatives has been proposed as the common metrics with which to compare the climate impacts of different emission types. Different physical and economic metrics and their parameterizations give diverse weights between, e.g., CH4 and CO2, and fixing the metric from one perspective makes it sub-optimal from another. As the aims of global climate policy involve some degree of ambiguity, it is not possible to determine a metric that would be optimal and consistent with all policy aims. This paper evaluates the cost implications of using predetermined metrics in cost-efficient mitigation scenarios. Three formulations of the 2 °C target, including both deterministic and stochastic approaches, shared a wide range of metric values for CH4 with which the mitigation costs are only slightly above the cost-optimal levels. Therefore, although ambiguity in current policy might prevent us from selecting an optimal metric, it can be possible to select robust metric values that perform well with multiple policy targets

  13. Scientist impact factor (SIF): a new metric for improving scientists' evaluation?

    Science.gov (United States)

    Lippi, Giuseppe; Mattiuzzi, Camilla

    2017-08-01

    The publication of scientific research is the mainstay for knowledge dissemination, but is also an essential criterion of scientists' evaluation for recruiting funds and career progression. Although the most widespread approach for evaluating scientists is currently based on the H-index, the total impact factor (IF) and the overall number of citations, these metrics are plagued by some well-known drawbacks. Therefore, with the aim of improving the process of scientists' evaluation, we developed a new and potentially useful indicator of recent scientific output. The new metric, the scientist impact factor (SIF), was calculated as all citations of articles in the two years following the publication year of the articles, divided by the overall number of articles published in that year. The metric was then tested by analyzing data of the 40 top scientists of the local university. No correlation was found between SIF and H-index (r=0.15; P=0.367) or 2 years H-index (r=-0.01; P=0.933), whereas the H-index and 2 years H-index values were found to be highly correlated (r=0.57; P ...), as were the number of articles published in one year and the total number of citations to these articles in the two following years (r=0.62; P ...). The SIF may therefore be a useful complement for evaluating scientists, wherein the SIF reflects the scientific output over the past two years, thus increasing their chances to apply for and obtain competitive funding.
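
    Read literally, the SIF can be computed as in the following sketch: citations accrued in the two calendar years after the publication year, divided by the number of articles published in that year. The data structure and publication record below are hypothetical.

```python
def scientist_impact_factor(articles, year):
    """SIF for `year`: citations received in the two following years to
    articles published in `year`, divided by the number of such articles.

    `articles` is a list of dicts such as
    {"year": 2015, "citations_by_year": {2016: 3, 2017: 5, 2018: 1}}.
    """
    published = [a for a in articles if a["year"] == year]
    if not published:
        return 0.0
    citations = sum(
        a["citations_by_year"].get(y, 0)
        for a in published
        for y in (year + 1, year + 2)
    )
    return citations / len(published)

# Hypothetical publication record
record = [
    {"year": 2015, "citations_by_year": {2016: 4, 2017: 6, 2018: 3}},
    {"year": 2015, "citations_by_year": {2016: 1, 2017: 2}},
    {"year": 2016, "citations_by_year": {2017: 5, 2018: 7}},
]
print(f"SIF(2015) = {scientist_impact_factor(record, 2015):.1f}")
```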

  14. ARM Data-Oriented Metrics and Diagnostics Package for Climate Model Evaluation Value-Added Product

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Chengzhu [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Xie, Shaocheng [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-10-15

    A Python-based metrics and diagnostics package is currently being developed by the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Infrastructure Team at Lawrence Livermore National Laboratory (LLNL) to facilitate the use of long-term, high-frequency measurements from the ARM Facility in evaluating the regional climate simulation of clouds, radiation, and precipitation. This metrics and diagnostics package computes climatological means of the targeted climate model simulation and generates tables and plots for comparing the model simulation with ARM observational data. The Coupled Model Intercomparison Project (CMIP) model data sets are also included in the package to enable model intercomparison, as demonstrated in Zhang et al. (2017). The mean of the CMIP models can serve as a reference for individual models. Basic performance metrics are computed to measure the accuracy of the mean state and variability of climate models. The evaluated physical quantities include cloud fraction, temperature, relative humidity, cloud liquid water path, total column water vapor, precipitation, sensible and latent heat fluxes, and radiative fluxes, with plans to extend to more fields, such as aerosol and microphysics properties. Process-oriented diagnostics focusing on individual cloud- and precipitation-related phenomena are also being developed for the evaluation and development of specific model physical parameterizations. The version 1.0 package is designed based on data collected at ARM's Southern Great Plains (SGP) Research Facility, with the plan to extend to other ARM sites. The metrics and diagnostics package is currently built upon standard Python libraries and additional Python packages developed by DOE (such as CDMS and CDAT). The ARM metrics and diagnostics package is available publicly with the hope that it can serve as an easy entry point for climate modelers to compare their models with ARM data. In this report, we first present the input data, which

  15. Evaluation of Deposited Sediment and Macroinvertebrate Metrics Used to Quantify Biological Response to Excessive Sedimentation in Agricultural Streams

    Science.gov (United States)

    Sutherland, Andrew B.; Culp, Joseph M.; Benoy, Glenn A.

    2012-07-01

    The objective of this study was to evaluate which macroinvertebrate and deposited sediment metrics are best for determining effects of excessive sedimentation on stream integrity. Fifteen instream sediment metrics, with the strongest relationship to land cover, were compared to riffle macroinvertebrate metrics in streams ranging across a gradient of land disturbance. Six deposited sediment metrics were strongly related to the relative abundance of Ephemeroptera, Plecoptera and Trichoptera and six were strongly related to the modified family biotic index (MFBI). Few functional feeding groups and habit groups were significantly related to deposited sediment, and this may be related to the focus on riffle, rather than reach-wide macroinvertebrates, as reach-wide sediment metrics were more closely related to human land use. Our results suggest that the coarse-level deposited sediment metric, visual estimate of fines, and the coarse-level biological index, MFBI, may be useful in biomonitoring efforts aimed at determining the impact of anthropogenic sedimentation on stream biotic integrity.

  16. Incorporating big data into treatment plan evaluation: Development of statistical DVH metrics and visualization dashboards.

    Science.gov (United States)

    Mayo, Charles S; Yao, John; Eisbruch, Avraham; Balter, James M; Litzenberg, Dale W; Matuszak, Martha M; Kessler, Marc L; Weyburn, Grant; Anderson, Carlos J; Owen, Dawn; Jackson, William C; Haken, Randall Ten

    2017-01-01

    To develop statistical dose-volume histogram (DVH)-based metrics and a visualization method to quantify the comparison of treatment plans with historical experience and among different institutions. The descriptive statistical summary (ie, median, first and third quartiles, and 95% confidence intervals) of volume-normalized DVH curve sets of past experiences was visualized through the creation of statistical DVH plots. Detailed distribution parameters were calculated and stored in JavaScript Object Notation files to facilitate management, including transfer and potential multi-institutional comparisons. In the treatment plan evaluation, structure DVH curves were scored against computed statistical DVHs and weighted experience scores (WESs). Individual, clinically used, DVH-based metrics were integrated into a generalized evaluation metric (GEM) as a priority-weighted sum of normalized incomplete gamma functions. Historical treatment plans for 351 patients with head and neck cancer, 104 with prostate cancer who were treated with conventional fractionation, and 94 with liver cancer who were treated with stereotactic body radiation therapy were analyzed to demonstrate the usage of statistical DVH, WES, and GEM in a plan evaluation. A shareable dashboard plugin was created to display statistical DVHs and integrate GEM and WES scores into a clinical plan evaluation within the treatment planning system. Benchmarking with normal tissue complication probability scores was carried out to compare the behavior of GEM and WES scores. DVH curves from historical treatment plans were characterized and presented, with difficult-to-spare structures (ie, frequently compromised organs at risk) identified. Quantitative evaluations by GEM and/or WES compared favorably with the normal tissue complication probability Lyman-Kutcher-Burman model, transforming a set of discrete threshold-priority limits into a continuous model reflecting physician objectives and historical experience

  17. Optimization of the alpha image reconstruction. An iterative CT-image reconstruction with well-defined image quality metrics

    International Nuclear Information System (INIS)

    Lebedev, Sergej; Sawall, Stefan; Knaup, Michael; Kachelriess, Marc

    2017-01-01

    Optimization of the AIR algorithm for improved convergence and performance. The AIR method is an iterative algorithm for CT image reconstruction. As a result of its linearity with respect to the basis images, the AIR algorithm possesses well-defined, regular image quality metrics, e.g. point spread function (PSF) or modulation transfer function (MTF), unlike other iterative reconstruction algorithms. The AIR algorithm computes weighting images α to blend between a set of basis images that preferably have mutually exclusive properties, e.g. high spatial resolution or low noise. The optimized algorithm uses an approach that alternates between the optimization of raw-data fidelity using an OSSART-like update and regularization using gradient descent, as opposed to the initially proposed AIR using a straightforward gradient descent implementation. A regularization strength for a given task is chosen by formulating a requirement for the noise reduction and checking whether it is fulfilled for different regularization strengths, while monitoring the spatial resolution using the voxel-wise defined modulation transfer function for the AIR image. The optimized algorithm computes similar images in a shorter time compared to the initial gradient descent implementation of AIR. The result can be influenced by multiple parameters that can be narrowed down to a relatively simple framework to compute high-quality images. The AIR images, for instance, can have at least a 50% lower noise level compared to the sharpest basis image, while the spatial resolution is mostly maintained. The optimization improves performance by a factor of 6, while maintaining image quality. Furthermore, it was demonstrated that the spatial resolution for AIR can be determined using regular image quality metrics, given smooth weighting images. This is not possible for other iterative reconstructions as a result of their non-linearity. A simple set of parameters for the algorithm is discussed that provides

  18. Optimization of the alpha image reconstruction. An iterative CT-image reconstruction with well-defined image quality metrics

    Energy Technology Data Exchange (ETDEWEB)

    Lebedev, Sergej; Sawall, Stefan; Knaup, Michael; Kachelriess, Marc [German Cancer Research Center, Heidelberg (Germany).

    2017-10-01

    Optimization of the AIR algorithm for improved convergence and performance. The AIR method is an iterative algorithm for CT image reconstruction. As a result of its linearity with respect to the basis images, the AIR algorithm possesses well-defined, regular image quality metrics, e.g. point spread function (PSF) or modulation transfer function (MTF), unlike other iterative reconstruction algorithms. The AIR algorithm computes weighting images α to blend between a set of basis images that preferably have mutually exclusive properties, e.g. high spatial resolution or low noise. The optimized algorithm uses an approach that alternates between the optimization of raw-data fidelity using an OSSART-like update and regularization using gradient descent, as opposed to the initially proposed AIR using a straightforward gradient descent implementation. A regularization strength for a given task is chosen by formulating a requirement for the noise reduction and checking whether it is fulfilled for different regularization strengths, while monitoring the spatial resolution using the voxel-wise defined modulation transfer function for the AIR image. The optimized algorithm computes similar images in a shorter time compared to the initial gradient descent implementation of AIR. The result can be influenced by multiple parameters that can be narrowed down to a relatively simple framework to compute high-quality images. The AIR images, for instance, can have at least a 50% lower noise level compared to the sharpest basis image, while the spatial resolution is mostly maintained. The optimization improves performance by a factor of 6, while maintaining image quality. Furthermore, it was demonstrated that the spatial resolution for AIR can be determined using regular image quality metrics, given smooth weighting images. This is not possible for other iterative reconstructions as a result of their non-linearity. A simple set of parameters for the algorithm is discussed that provides

  19. Performance evaluation of no-reference image quality metrics for face biometric images

    Science.gov (United States)

    Liu, Xinwei; Pedersen, Marius; Charrier, Christophe; Bours, Patrick

    2018-03-01

    The accuracy of face recognition systems is significantly affected by the quality of face sample images. The recently established standardization proposed several important aspects for the assessment of face sample quality. There are many existing no-reference image quality metrics (IQMs) that are able to assess natural image quality by taking into account similar image-based quality attributes as introduced in the standardization. However, whether such metrics can assess face sample quality is rarely considered. We evaluate the performance of 13 selected no-reference IQMs on face biometrics. The experimental results show that several of them can assess face sample quality according to the system performance. We also analyze the strengths and weaknesses of different IQMs, as well as why some of them failed to assess face sample quality. Retraining an original IQM using a face database can improve the performance of such a metric. In addition, the contribution of this paper can be used for the evaluation of IQMs on other biometric modalities; furthermore, it can be used for the development of multimodality biometric IQMs.

  20. A Metric for Secrecy-Energy Efficiency Tradeoff Evaluation in 3GPP Cellular Networks

    Directory of Open Access Journals (Sweden)

    Fabio Ciabini

    2016-10-01

    Full Text Available Physical-layer security is now being considered for information protection in future wireless communications. However, a better understanding of the inherent secrecy of wireless systems under more realistic conditions, with specific attention to the relative energy consumption costs, has to be pursued. This paper aims at proposing new analysis tools and investigating the relation between secrecy capacity and energy consumption in a 3rd Generation Partnership Project (3GPP) cellular network, by focusing on secure and energy-efficient communications. New metrics that bind together the secure area in the Base Station (BS) sectors, the afforded data rate and the power spent by the BS to obtain it are proposed, permitting evaluation of the tradeoff between these aspects. The results show that these metrics are useful in identifying the optimum transmit power level for the BS, so that the maximum secure area can be obtained while minimizing the energy consumption.

  1. Reference-free ground truth metric for metal artifact evaluation in CT images

    International Nuclear Information System (INIS)

    Kratz, Baerbel; Ens, Svitlana; Mueller, Jan; Buzug, Thorsten M.

    2011-01-01

    Purpose: In computed tomography (CT), metal objects in the region of interest introduce data inconsistencies during acquisition. Reconstructing these data results in an image with star-shaped artifacts induced by the metal inconsistencies. To enhance image quality, the influence of the metal objects can be reduced by different metal artifact reduction (MAR) strategies. For an adequate evaluation of new MAR approaches, a ground truth reference data set is needed. In technical evaluations, where phantoms can be measured with and without metal inserts, ground truth data can easily be obtained by a second reference data acquisition. Obviously, this is not possible for clinical data. Here, an alternative evaluation method is presented without the need for an additionally acquired reference data set. Methods: The proposed metric is based on an inherent ground truth for metal artifacts as well as for MAR method comparison, where no reference information in terms of a second acquisition is needed. The method is based on the forward projection of a reconstructed image, which is compared to the actually measured projection data. Results: The new evaluation technique is performed on phantom and on clinical CT data with and without MAR. The metric results are then compared with methods using a reference data set as well as with an expert-based classification. It is shown that the new approach is an adequate quantification technique for artifact strength in reconstructed metal or MAR CT images. Conclusions: The presented method works solely on the original projection data itself, which yields some advantages compared to distance measures in the image domain using two data sets. Besides this, no parameters have to be manually chosen. The new metric is a useful evaluation alternative when no reference data are available.
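
    The core idea, forward-projecting the reconstructed image and comparing it with the measured projection data, can be sketched with a Radon transform as below; the Shepp-Logan phantom with an artificial bright insert stands in for a clinical acquisition, and the simple RMS difference is only one possible consistency measure, not necessarily the metric defined in the paper.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

def projection_consistency(measured_sinogram, reconstruction, theta):
    """Root-mean-square difference between the measured projections and the
    forward projection of a reconstructed image (lower is better)."""
    # Zero out values outside the reconstruction circle before reprojecting,
    # as skimage's radon (circle=True) expects.
    ny, nx = reconstruction.shape
    yy, xx = np.ogrid[:ny, :nx]
    outside = (yy - (ny - 1) / 2) ** 2 + (xx - (nx - 1) / 2) ** 2 > ((min(ny, nx) - 1) / 2) ** 2
    recon = reconstruction.copy()
    recon[outside] = 0.0
    reprojected = radon(recon, theta=theta)
    return float(np.sqrt(np.mean((reprojected - measured_sinogram) ** 2)))

# Stand-in acquisition: a phantom with a bright "metal-like" insert
theta = np.linspace(0.0, 180.0, 180, endpoint=False)
phantom = shepp_logan_phantom()
phantom[180:190, 180:190] = 4.0                # artificial high-density object
sinogram = radon(phantom, theta=theta)         # "measured" projections

recon = iradon(sinogram, theta=theta)          # uncorrected reconstruction
print(f"consistency metric: {projection_consistency(sinogram, recon, theta):.3f}")
```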

  2. INVESTIGATION AND EVALUATION OF SPATIAL PATTERNS IN TABRIZ PARKS USING LANDSCAPE METRICS

    Directory of Open Access Journals (Sweden)

    Ali Majnouni Toutakhane

    2016-01-01

    Full Text Available Nowadays, the green spaces in cities, and especially in metropolises, have adopted a variety of functions. In addition to improving environmental conditions, they are suitable places for spending free time and mitigating the nervous pressures of machine-driven life, depending on their distribution and dispersion in the cities. In this research, in order to study the spatial distribution and composition of the parks and green spaces in the Tabriz metropolis, a map of parks was prepared using the digital atlas of Tabriz parks and the Arc Map and IDRISI software packages. Then, quantitative information on the spatial patterns of Tabriz parks was derived using the Fragstats software and a selection of landscape metrics, including: class area, patch density, percentage of landscape, average patch size, average patch area, largest patch index, landscape shape index, average Euclidean nearest-neighbor distance and average patch shape index. The spatial distribution, composition, extent and continuity of the parks were then evaluated. Overall, only 8.5 percent of the landscape is assigned to parks, which are studied in three classes: neighborhood, district and regional parks. Neighborhood parks and green spaces have a better spatial distribution pattern than the other classes, and the studied metrics showed better results for this class. In contrast, the quantitative results of the metrics calculated for regional parks showed the most unfavorable spatial status among the three classes studied in Tabriz city.

  3. Portfolio optimization and performance evaluation

    DEFF Research Database (Denmark)

    Juhl, Hans Jørn; Christensen, Michael

    2013-01-01

    Based on an exclusive business-to-business database comprising nearly 1,000 customers, the applicability of portfolio analysis is documented, and it is examined how such an optimization analysis can be used to explore the growth potential of a company. As opposed to any previous analyses, optimal...... customer portfolios are determined, and it is shown how marketing decision-makers can use this information in their marketing strategies to optimize the revenue growth of the company. Finally, our analysis is the first analysis which applies portfolio based methods to measure customer performance......, and it is shown how these performance measures complement the optimization analysis....

  4. SciSpark: Highly Interactive and Scalable Model Evaluation and Climate Metrics for Scientific Data and Analysis

    Data.gov (United States)

    National Aeronautics and Space Administration — We will construct SciSpark, a scalable system for interactive model evaluation and for the rapid development of climate metrics and analyses. SciSpark directly...

  5. Field installation versus local integration of photovoltaic systems and their effect on energy evaluation metrics

    International Nuclear Information System (INIS)

    Halasah, Suleiman A.; Pearlmutter, David; Feuermann, Daniel

    2013-01-01

    In this study we employ Life-Cycle Assessment to evaluate the energy-related impacts of photovoltaic systems at different scales of integration, in an arid region with especially high solar irradiation. Based on the electrical output and embodied energy of a selection of fixed and tracking systems and including concentrator photovoltaic (CPV) and varying cell technology, we calculate a number of energy evaluation metrics, including the energy payback time (EPBT), energy return factor (ERF), and life-cycle CO2 emissions offset per unit aperture and land area. Studying these metrics in the context of a regionally limited setting, it was found that utilizing existing infrastructure such as existing building roofs and shade structures does significantly reduce the embodied energy requirements (by 20–40%) and in turn the EPBT of flat-plate PV systems due to the avoidance of energy-intensive balance of systems (BOS) components like foundations. Still, high-efficiency CPV field installations were found to yield the shortest EPBT, the highest ERF and the largest life-cycle CO2 offsets—under the condition that land availability is not a limitation. A greater life-cycle energy return and carbon offset per unit land area is yielded by locally-integrated non-concentrating systems, despite their lower efficiency per unit module area. - Highlights: ► We evaluate life-cycle energy impacts of PV systems at different scales. ► We calculate the energy payback time, return factor and CO2 emissions offset. ► Utilizing existing structures significantly improves metrics of flat-plate PV. ► High-efficiency CPV installations yield best return and offset per aperture area. ► Locally-integrated flat-plate systems yield best return and offset per land area.

  6. Modelling the B2C Marketplace: Evaluation of a Reputation Metric for e-Commerce

    Science.gov (United States)

    Gutowska, Anna; Sloane, Andrew

    This paper evaluates a recently developed, novel and comprehensive reputation metric designed for a distributed multi-agent reputation system for Business-to-Consumer (B2C) e-commerce applications. To do that, an agent-based simulation framework was implemented which models different types of behaviours in the marketplace. The trustworthiness of different types of providers is investigated to establish whether the simulation models the behaviour of B2C e-commerce systems as they are expected to behave in real life.

  7. Light Water Reactor Sustainability Program Operator Performance Metrics for Control Room Modernization: A Practical Guide for Early Design Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Boring; Roger Lew; Thomas Ulrich; Jeffrey Joe

    2014-03-01

    As control rooms are modernized with new digital systems at nuclear power plants, it is necessary to evaluate the operator performance using these systems as part of a verification and validation process. There are no standard, predefined metrics available for assessing what is satisfactory operator interaction with new systems, especially during the early design stages of a new system. This report identifies the process and metrics for evaluating human system interfaces as part of control room modernization. The report includes background information on design and evaluation, a thorough discussion of human performance measures, and a practical example of how the process and metrics have been used as part of a turbine control system upgrade during the formative stages of design. The process and metrics are geared toward generalizability to other applications and serve as a template for utilities undertaking their own control room modernization activities.

  8. A review of the trunk surface metrics used as Scoliosis and other deformities evaluation indices

    Directory of Open Access Journals (Sweden)

    Aggouris Costas

    2010-06-01

    Full Text Available Abstract Background Although scoliosis is characterized by lateral deviation of the spine, a 3D deformation is actually responsible for geometric and morphologic changes in the trunk and rib cage. In a vast related medical literature, one can find quite a few scoliosis evaluation indices, which are based on back surface data and are generally measured along three planes. Regardless of the large number of such indices, the literature lacks a coherent presentation of the underlying metrics, the involved anatomic surface landmarks, the definition of planes and the definition of the related body axes. In addition, the long list of proposed scoliotic indices is rarely presented in cross-reference to each other. This creates a possibility of misunderstandings and sometimes irrational or even wrong use of these indices by the medical society. Materials and methods It is hoped that the current work contributes to clearing up the issue and gives rise to innovative ideas on how to assess the surface metrics in scoliosis. In particular, this paper presents a thorough study of the scoliosis evaluation indices proposed by the medical society. Results More specifically, the referred indices are classified according to the type of asymmetry they measure, according to the plane they refer to, and according to the importance, relevance or level of scientific consensus they enjoy. Conclusions Surface metrics have very little correlation to Cobb angle measurements. Indices measured on different planes do not correlate to each other. Different indices exhibit quite diverging characteristics in terms of observer-induced errors, accuracy, sensitivity and specificity. Complicated positioning of the patient and ambiguous anatomical landmarks are the major error sources, which cause observer variations. Principles that should be followed when an index is proposed are presented.

  9. Guiding automated NMR structure determination using a global optimization metric, the NMR DP score

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Yuanpeng Janet, E-mail: yphuang@cabm.rutgers.edu; Mao, Binchen; Xu, Fei; Montelione, Gaetano T., E-mail: gtm@rutgers.edu [Rutgers, The State University of New Jersey, Department of Molecular Biology and Biochemistry, Center for Advanced Biotechnology and Medicine, and Northeast Structural Genomics Consortium (United States)

    2015-08-15

    ASDP is an automated NMR NOE assignment program. It uses a distinct bottom-up topology-constrained network anchoring approach for NOE interpretation, with 2D, 3D and/or 4D NOESY peak lists and resonance assignments as input, and generates unambiguous NOE constraints for iterative structure calculations. ASDP is designed to function interactively with various structure determination programs that use distance restraints to generate molecular models. In the CASD–NMR project, ASDP was tested and further developed using blinded NMR data, including resonance assignments, either raw or manually-curated (refined) NOESY peak list data, and in some cases 15N–1H residual dipolar coupling data. In these blinded tests, in which the reference structure was not available until after structures were generated, the fully-automated ASDP program performed very well on all targets using both the raw and refined NOESY peak list data. Improvements of ASDP relative to its predecessor program for automated NOESY peak assignments, AutoStructure, were driven by challenges provided by these CASD–NMR data. These algorithmic improvements include (1) using a global metric of structural accuracy, the discriminating power score, for guiding model selection during the iterative NOE interpretation process, and (2) identifying incorrect NOESY cross peak assignments caused by errors in the NMR resonance assignment list. These improvements provide a more robust automated NOESY analysis program, ASDP, with the unique capability of being utilized with alternative structure generation and refinement programs including CYANA, CNS, and/or Rosetta.

  10. Optimization of VPSC Model Parameters for Two-Phase Titanium Alloys: Flow Stress Vs Orientation Distribution Function Metrics

    Science.gov (United States)

    Miller, V. M.; Semiatin, S. L.; Szczepanski, C.; Pilchak, A. L.

    2018-06-01

    The ability to predict the evolution of crystallographic texture during hot work of titanium alloys in the α + β temperature regime is of great significance to numerous engineering disciplines; however, research efforts are complicated by the rapid changes in phase volume fractions and flow stresses with temperature, in addition to topological considerations. The viscoplastic self-consistent (VPSC) polycrystal plasticity model is employed to simulate deformation in the two-phase field. Newly developed parameter selection schemes utilizing automated optimization based on two different error metrics are considered. In the first optimization scheme, which is commonly used in the literature, the VPSC parameters are selected based on the quality of fit between experimental and simulated flow curves at six hot-working temperatures. Under the second, newly developed scheme, parameters are selected to minimize the difference between the simulated and experimentally measured α textures after accounting for the β → α transformation upon cooling. It is demonstrated that both methods result in good qualitative matches with the experimental α-phase texture, but texture-based optimization results in a substantially better quantitative orientation distribution function match.

  11. Process-level model evaluation: a snow and heat transfer metric

    Science.gov (United States)

    Slater, Andrew G.; Lawrence, David M.; Koven, Charles D.

    2017-04-01

    Land models require evaluation in order to understand results and guide future development. Examining functional relationships between model variables can provide insight into the ability of models to capture fundamental processes and aid in minimizing uncertainties or deficiencies in model forcing. This study quantifies the proficiency of land models to appropriately transfer heat from the soil through a snowpack to the atmosphere during the cooling season (Northern Hemisphere: October-March). Using the basic physics of heat diffusion, we investigate the relationship between seasonal amplitudes of soil versus air temperatures due to insulation from seasonal snow. Observations demonstrate the anticipated exponential relationship of attenuated soil temperature amplitude with increasing snow depth and indicate that the marginal influence of snow insulation diminishes beyond an effective snow depth of about 50 cm. A snow and heat transfer metric (SHTM) is developed to quantify model skill compared to observations. Land models within the CMIP5 experiment vary widely in SHTM scores, and deficiencies can often be traced to model structural weaknesses. The SHTM value for individual models is stable over 150 years of climate, 1850-2005, indicating that the metric is insensitive to climate forcing and can be used to evaluate each model's representation of the insulation process.
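
    A minimal sketch of the insulation relationship underlying the SHTM is given below: the soil-to-air seasonal amplitude ratio is fit with a single-parameter exponential decay in snow depth. The data points and the decay form are illustrative assumptions, not the published CMIP5 scoring procedure.

```python
# Sketch of the insulation relationship behind the SHTM: the ratio of soil to air
# seasonal temperature amplitude decays roughly exponentially with snow depth, and
# the marginal effect flattens out beyond roughly 50 cm. Data values are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def amplitude_ratio(snow_depth_cm, decay_cm):
    return np.exp(-snow_depth_cm / decay_cm)

depth = np.array([5, 10, 20, 30, 40, 50, 60, 80])          # cm of seasonal snow
ratio = np.array([0.85, 0.72, 0.51, 0.37, 0.26, 0.19, 0.14, 0.07])  # soil/air amplitude ratio

(decay_fit,), _ = curve_fit(amplitude_ratio, depth, ratio, p0=[30.0])
print(f"fitted attenuation depth: {decay_fit:.1f} cm")
# Comparing such fitted curves from a model against the observed curve is the kind of
# functional-relationship check the SHTM formalizes into a skill score.
```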

  12. Evaluation of GCC optimization parameters

    Directory of Open Access Journals (Sweden)

    Rodrigo D. Escobar

    2012-12-01

    Full Text Available Compile-time optimization of code can result in significant performance gains. The amount of these gains varies widely depending upon the code being optimized, the hardware being compiled for, the specific performance increase attempted (e.g. speed, throughput, memory utilization, etc.) and the compiler used. We used the latest version of the SPEC CPU 2006 benchmark suite to help gain an understanding of possible performance improvements using GCC (GNU Compiler Collection) options, focusing mainly on speed gains made possible by tuning the compiler with the standard compiler optimization levels as well as a specific compiler option for the hardware processor. We compared the best standardized tuning options obtained for a Core i7 processor to the same relative options used on a Pentium 4 to determine whether the GNU project has improved its performance tuning capabilities for specific hardware over time.

  13. Evaluation of Daily Evapotranspiration Over Orchards Using METRIC Approach and Landsat Satellite Observations

    Science.gov (United States)

    He, R.; Jin, Y.; Daniele, Z.; Kandelous, M. M.; Kent, E. R.

    2016-12-01

    The pistachio and almond acreage in California has been rapidly growing in the past 10 years, raising concerns about competition for limited water resources in California. A robust and cost-effective mapping of crop water use, mostly evapotranspiration (ET), by orchards is needed for improved farm-level irrigation management and regional water planning. METRIC™, a satellite-based surface energy balance approach, has been widely used to map field-scale crop ET, mostly over row crops. We here aim to apply METRIC with Landsat satellite observations over California's orchards and evaluate the ET estimates by comparing with field measurements in the South San Joaquin Valley, California. Reference ET of grass (ETo) from California Irrigation Management Information System (CIMIS) stations was used to estimate daily ET of commercial almond and pistachio orchards. Our comparisons showed that METRIC-Landsat daily ET estimates agreed well with ET measured by the eddy covariance and surface renewal stations, with an RMSE of 1.25 and a correlation coefficient of 0.84 for the pistachio orchard. A slight positive bias of the satellite-based ET estimates was found for both pistachio and almond orchards. We also found that the time series of NDVI was highly correlated with ET temporal dynamics within each field, but the correlation was reduced to 0.56 when all fields were pooled together. Net radiation, however, remained highly correlated with ET across all the fields. The METRIC ET was able to distinguish the differences in ET among salt- and non-salt-affected pistachio orchards, e.g., mean daily ET during the growing season in salt-affected orchards was lower than that of the non-salt-affected one by 0.87 mm/day. The remote sensing based ET estimates will support a variety of state and local interests in water use and management, for both planning and regulatory/compliance purposes, and provide farmers observation-based guidance for site-specific and time-sensitive irrigation management.
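
    The agreement statistics quoted above (RMSE, correlation, bias) are simple to reproduce for any pair of daily ET series; the sketch below uses made-up values in mm/day rather than the study's measurements.

```python
# Sketch of the satellite-vs-tower comparison: daily ET from the energy-balance model
# versus eddy-covariance / surface-renewal measurements, summarized by RMSE, Pearson
# correlation, and mean bias. Arrays are hypothetical placeholders.
import numpy as np

def rmse(a, b):
    return float(np.sqrt(np.mean((np.asarray(a) - np.asarray(b)) ** 2)))

def pearson_r(a, b):
    return float(np.corrcoef(a, b)[0, 1])

et_satellite = np.array([4.8, 5.6, 6.1, 5.2, 4.4, 6.6])  # mm/day, METRIC-Landsat estimates
et_tower     = np.array([4.2, 5.1, 5.9, 4.6, 4.0, 6.0])  # mm/day, field measurements

print("RMSE:", rmse(et_satellite, et_tower), "mm/day")
print("r:   ", pearson_r(et_satellite, et_tower))
print("bias:", float(np.mean(et_satellite - et_tower)), "mm/day")  # positive = satellite high bias
```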

  14. WE-AB-209-07: Explicit and Convex Optimization of Plan Quality Metrics in Intensity-Modulated Radiation Therapy Treatment Planning

    International Nuclear Information System (INIS)

    Engberg, L; Eriksson, K; Hardemark, B; Forsgren, A

    2016-01-01

    Purpose: To formulate objective functions of a multicriteria fluence map optimization model that correlate well with plan quality metrics, and to solve this multicriteria model by convex approximation. Methods: In this study, objectives of a multicriteria model are formulated to explicitly either minimize or maximize a dose-at-volume measure. Given the widespread agreement that dose-at-volume levels play important roles in plan quality assessment, these objectives correlate well with plan quality metrics. This is in contrast to the conventional objectives, which are to maximize clinical goal achievement by relating to deviations from given dose-at-volume thresholds: while balancing the new objectives means explicitly balancing dose-at-volume levels, balancing the conventional objectives effectively means balancing deviations. Constituted by the inherently non-convex dose-at-volume measure, the new objectives are approximated by the convex mean-tail-dose measure (CVaR measure), yielding a convex approximation of the multicriteria model. Results: Advantages of using the convex approximation are investigated through juxtaposition with the conventional objectives in a computational study of two patient cases. Clinical goals of each case respectively point out three ROI dose-at-volume measures to be considered for plan quality assessment. This is translated in the convex approximation into minimizing three mean-tail-dose measures. Evaluations of the three ROI dose-at-volume measures on Pareto optimal plans are used to represent plan quality of the Pareto sets. Besides providing increased accuracy in terms of feasibility of solutions, the convex approximation generates Pareto sets with overall improved plan quality. In one case, the Pareto set generated by the convex approximation entirely dominates that generated with the conventional objectives. Conclusion: The initial computational study indicates that the convex approximation outperforms the conventional objectives
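
    The convex surrogate at the heart of the approach, the mean-tail-dose (a CVaR-type measure), can be illustrated in isolation: for an upper tail it is the mean dose of the hottest v-fraction of ROI voxels and always bounds the corresponding dose-at-volume from above. The sketch below is a stand-alone illustration with synthetic doses, not the multicriteria fluence-map optimization itself.

```python
# Illustration of why mean-tail-dose is a convex surrogate for dose-at-volume: the mean
# of the hottest v-fraction of voxel doses upper-bounds D_v, and (unlike D_v) it is a
# convex function of the dose distribution. Doses below are synthetic.
import numpy as np

def upper_mean_tail_dose(voxel_doses, volume_fraction):
    """Mean dose of the hottest `volume_fraction` of voxels (0 < volume_fraction <= 1)."""
    d = np.sort(np.asarray(voxel_doses, dtype=float))[::-1]
    k = max(1, int(np.ceil(volume_fraction * d.size)))
    return float(d[:k].mean())

def dose_at_volume(voxel_doses, volume_fraction):
    """Non-convex D_v: minimum dose received by the hottest `volume_fraction` of voxels."""
    d = np.sort(np.asarray(voxel_doses, dtype=float))[::-1]
    k = max(1, int(np.ceil(volume_fraction * d.size)))
    return float(d[k - 1])

doses = np.random.default_rng(0).normal(60.0, 3.0, size=10_000)  # hypothetical ROI doses (Gy)
print(dose_at_volume(doses, 0.02), "<=", upper_mean_tail_dose(doses, 0.02))
```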

  15. Using Green Star Metrics to Optimize the Greenness of Literature Protocols for Syntheses

    Science.gov (United States)

    Duarte, Rita C. C.; Ribeiro, M. Gabriela T. C.; Machado, Adélio A. S. C.

    2015-01-01

    A procedure to improve the greenness of a synthesis, without performing laboratory work, using alternative protocols available in the literature is presented. The greenness evaluation involves the separate assessment of the different steps described in the available protocols--reaction, isolation, and purification--as well as the global process,…

  16. Evaluation of Risk Metrics for KHNP Reference Plants Using the Latest Plant Specific Data

    International Nuclear Information System (INIS)

    Jeon, Ho Jun; Hwang, Seok Won; Ghi, Moon Goo

    2010-01-01

    As Risk-Informed Applications (RIAs) are actively implemented in the nuclear industry, an issue associated with the technical adequacy of the Probabilistic Safety Assessment (PSA) arises in its data sources. The American Society of Mechanical Engineers (ASME) PRA standard suggests the use of component failure data that represent the as-built and as-operated plant conditions. Furthermore, the peer reviews for the KHNP reference plants stated that the component failure data should be updated to reflect the latest plant specific data available. For ensuring the technical adequacy in PSA data elements, we try to update component failure data to reflect the as-operated plant conditions, and a trend analysis of the failure data is implemented. In addition, by applying the updated failure data to the PSA models of the KHNP reference plants, the risk metrics of Core Damage Frequency (CDF) and Large Early Release Frequency (LERF) are evaluated

  17. WE-E-213CD-11: A New Automatically Generated Metric for Evaluating the Spatial Precision of Deformable Image Registrations: The Distance Discordance Metric.

    Science.gov (United States)

    Saleh, Z; Apte, A; Sharp, G; Deasy, J

    2012-06-01

    We propose a new metric called Distance Discordance (DD), which is defined as the distance between two anatomic points from two moving images, which are co-located on some reference image, when deformed onto another reference image. To demonstrate the concept of DD, we created a reference software phantom which contains two objects. The first object (1) consists of a hollow box with a fixed size core and variable wall thickness. The second object (2) consists of a solid box of fixed size and arbitrary location. 7 different variations of the fixed phantom were created. Each phantom was deformed onto every other phantom using two B-Spline DIR algorithms available in Elastix and Plastimatch. Voxels were sampled from the reference phantom [1], which were also deformed from moving phantoms [2…6], and we find the differences in their corresponding location on phantom [7]. Each voxel results in a distribution of DD values, which we call distance discordance histogram (DDH). We also demonstrate this concept in 8 Head & Neck patients. The two image registration algorithms produced two different DD results for the same phantom image set. The mean values of the DDH were slightly lower for Elastix (0-1.28 cm) as compared to the values produced by Plastimatch (0-1.43 cm). The combined DDH for the H&N patients followed a lognormal distribution with a mean of 0.45 cm and std. deviation of 0.42 cm. The proposed distance discordance (DD) metric is an easily interpretable, quantitative tool that can be used to evaluate the effect of inter-patient variability on the goodness of the registration in different parts of the patient anatomy. Therefore, it can be utilized to exclude certain images based on their DDH characteristics. In addition, this metric does not rely on 'ground truth' or the presence of contoured structures. Partially supported by NIH grant R01 CA85181. © 2012 American Association of Physicists in Medicine.
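
    A rough sketch of how DD values could be accumulated for one sampled point is shown below; the deformation mappings are stand-ins (random perturbations) rather than actual B-spline registrations, and the pooling into a histogram is only indicated in a comment.

```python
# Sketch of the Distance Discordance idea: a voxel co-located on one reference image is
# mapped onto a second reference image via several different moving images; the spread
# of the resulting positions gives the DD values for that voxel. The `mappings` here are
# hypothetical stand-ins for composed deformable registrations.
import itertools
import numpy as np

def distance_discordance(point, mappings):
    """All pairwise distances between the point's images under the given mappings."""
    mapped = [np.asarray(m(point), dtype=float) for m in mappings]
    return [float(np.linalg.norm(a - b)) for a, b in itertools.combinations(mapped, 2)]

# Toy example: each mapping plays the role of reference1 -> moving_k -> reference7,
# faked as a small fixed random offset instead of a composed B-spline deformation.
rng = np.random.default_rng(1)
mappings = [lambda p, d=rng.normal(0, 0.3, 3): np.asarray(p) + d for _ in range(5)]

dd_values = distance_discordance(np.array([12.0, 40.0, 25.0]), mappings)
# Pooling dd_values over many sampled voxels yields the distance discordance histogram (DDH).
```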

  18. Survey of source code metrics for evaluating testability of object oriented systems

    OpenAIRE

    Shaheen , Muhammad Rabee; Du Bousquet , Lydie

    2010-01-01

    Software testing is costly in terms of time and funds. Testability is a software characteristic that aims at producing systems easy to test. Several metrics have been proposed to identify the testability weaknesses. But it is sometimes difficult to be convinced that those metrics are really related with testability. This article is a critical survey of the source-code based metrics proposed in the literature for object-oriented software testability. It underlines the necessity to provide test...

  19. An Evaluation of iMetric Studies through the Scholarly Influence Model

    Directory of Open Access Journals (Sweden)

    Faramarz Soheili

    2016-12-01

    Full Text Available Among the topics studied in the context of scientometrics, the issue of scholarly influence is of special interest. This study tries to test the components of the scholarly influence model based on iMetrics studies, and also to find potential relations among these components. The study uses a bibliometric methodology. Since the researchers aim to determine the relationship between variables, this research is of the correlational type. The initial data of this study, which comprise 5944 records in the field of iMetrics during 1978-2014, were retrieved from the Web of Science. To calculate most of the measures involved in each kind of influence, the researchers used the UCINet and BibExcel software; moreover, some indices were calculated manually using Excel. After calculating all measures included in the three types of influence, the researchers used Smart PLS to test both the model and the research hypotheses. The results of the data analysis using Smart PLS confirmed the scholarly influence model and indicated significant correlations between the variables in the model. To be more precise, the findings revealed that social influence is associated with both ideational and venue influence. Moreover, venue influence is associated with ideational influence. If researchers test the scholarly influence model in other areas and it leads to positive outcomes, it is hoped that policy-makers will use a combination of the variables involved in the model as a measure to evaluate the scholarly influence of researchers and to support decision-making related to purposes such as promotion, recruitment, and so on.

  20. New metrics for evaluating channel networks extracted in grid digital elevation models

    Science.gov (United States)

    Orlandini, S.; Moretti, G.

    2017-12-01

    Channel networks are critical components of drainage basins and delta regions. Despite the important role played by these systems in hydrology and geomorphology, there are at present no well-defined methods to evaluate numerically how two complex channel networks are geometrically far apart. The present study introduces new metrics for evaluating numerically channel networks extracted in grid digital elevation models with respect to a reference channel network (see the figure below). Streams of the evaluated network (EN) are delineated as in the Horton ordering system and examined through a priority climbing algorithm based on the triple index (ID1,ID2,ID3), where ID1 is a stream identifier that increases as the elevation of lower end of the stream increases, ID2 indicates the ID1 of the draining stream, and ID3 is the ID1 of the corresponding stream in the reference network (RN). Streams of the RN are identified by the double index (ID1,ID2). Streams of the EN are processed in the order of increasing ID1 (plots a-l in the figure below). For each processed stream of the EN, the closest stream of the RN is sought by considering all the streams of the RN sharing the same ID2. This ID2 in the RN is equal in the EN to the ID3 of the stream draining the processed stream, the one having ID1 equal to the ID2 of the processed stream. The mean stream planar distance (MSPD) and the mean stream elevation drop (MSED) are computed as the mean distance and drop, respectively, between corresponding streams. The MSPD is shown to be useful for evaluating slope direction methods and thresholds for channel initiation, whereas the MSED is shown to indicate the ability of grid coarsening strategies to retain the profiles of observed channels. The developed metrics fill a gap in the existing literature by allowing hydrologists and geomorphologists to compare descriptions of a fixed physical system obtained by using different terrain analysis methods, or different physical systems
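
    The two summary quantities can be sketched once corresponding streams have been paired; the pairing step via (ID1, ID2, ID3) and the exact distance definition are simplified below, so treat this as an illustration of the idea rather than the authors' algorithm.

```python
# Sketch of the two metrics: mean stream planar distance (MSPD) and mean stream
# elevation drop (MSED), assuming each stream is a polyline of (x, y, z) vertices and
# that evaluated/reference streams have already been paired. The closest-vertex planar
# distance and the mean-elevation difference used here are simplifying assumptions.
import numpy as np

def stream_planar_distance(stream_a, stream_b):
    a, b = np.asarray(stream_a)[:, :2], np.asarray(stream_b)[:, :2]
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)   # all vertex-to-vertex distances
    return float(d.min(axis=1).mean())                          # mean closest-vertex distance

def mspd_msed(paired_streams):
    """paired_streams: list of (evaluated_stream, reference_stream) vertex arrays."""
    planar = [stream_planar_distance(e, r) for e, r in paired_streams]
    drops = [abs(float(np.asarray(e)[:, 2].mean() - np.asarray(r)[:, 2].mean()))
             for e, r in paired_streams]
    return float(np.mean(planar)), float(np.mean(drops))
```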

  1. Metric qualities of the cognitive behavioral assessment for outcome evaluation to estimate psychological treatment effects.

    Science.gov (United States)

    Bertolotti, Giorgio; Michielin, Paolo; Vidotto, Giulio; Sanavio, Ezio; Bottesi, Gioia; Bettinardi, Ornella; Zotti, Anna Maria

    2015-01-01

    Cognitive behavioral assessment for outcome evaluation was developed to evaluate psychological treatment interventions, especially for counseling and psychotherapy. It is made up of 80 items and five scales: anxiety, well-being, perception of positive change, depression, and psychological distress. The aim of the study was to present the metric qualities and to show validity and reliability of the five constructs of the questionnaire both in nonclinical and clinical subjects. Four steps were completed to assess reliability and factor structure: criterion-related and concurrent validity, responsiveness, and convergent-divergent validity. A nonclinical group of 269 subjects was enrolled, as was a clinical group comprising 168 adults undergoing psychotherapy and psychological counseling provided by the Italian public health service. Cronbach's alphas were between 0.80 and 0.91 for the clinical sample and between 0.74 and 0.91 in the nonclinical one. We observed an excellent structural validity for the five interrelated dimensions. The clinical group showed higher scores in the anxiety, depression, and psychological distress scales, as well as lower scores in well-being and perception of positive change scales than those observed in the nonclinical group. Responsiveness was large for the anxiety, well-being, and depression scales; the psychological distress and perception of positive change scales showed a moderate effect. The questionnaire showed excellent psychometric properties, thus demonstrating that the questionnaire is a good evaluative instrument, with which to assess pre- and post-treatment outcomes.
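
    The reliability figures quoted above are Cronbach's alpha values; a minimal sketch of the standard alpha formula on a hypothetical item-score matrix is given below (the scales, item counts, and data are not those of the questionnaire).

```python
# Cronbach's alpha for one scale: alpha = k/(k-1) * (1 - sum(item variances)/variance of total).
# The 8-item data set below is synthetic and only illustrates the computation.
import numpy as np

def cronbach_alpha(scores):
    """scores: 2-D array, rows = respondents, columns = items of one scale."""
    x = np.asarray(scores, dtype=float)
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
items = latent + 0.6 * rng.normal(size=(200, 8))   # 8 correlated items from one latent trait
print(round(cronbach_alpha(items), 2))
```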

  2. SU-F-J-38: Dose Rates and Preliminary Evaluation of Contouring Similarity Metrics Using 4D Cone Beam CT

    Energy Technology Data Exchange (ETDEWEB)

    Santoso, A [Wayne State University School of Medicine, Detroit, Michigan (United States); Song, K; Qin, Y; Gardner, S; Liu, C; Cattaneo, R; Chetty, I; Movsas, B; Aljouni, M; Wen, N [Henry Ford Health System, Detroit, MI (United States)

    2016-06-15

    Purpose: 4D imaging modalities require detailed characterization for clinical optimization. The On-Board Imager mounted on the linear accelerator was used to investigate dose rates in a tissue-mimicking phantom using 4D-CBCT and assess variability of contouring similarity metrics between 4D-CT and 4D-CBCT retrospective reconstructions. Methods: A 125 kVp thoracic protocol was used. A phantom placed on a motion platform simulated a patient’s breathing cycle. An ion chamber was affixed inside the phantom’s tissue-mimicking cavities (i.e. bone, lung, and soft tissue). A sinusoidal motion waveform was executed with a five-second period and superior-inferior motion. Dose rates were measured at six ion chamber positions. A preliminary workflow for contouring similarity between 4D-CT and 4D-CBCT was established using a single lung SBRT patient’s historical data. Average intensity projection (Ave-IP) and maximum intensity projection (MIP) reconstructions generated offline were compared between the 4D modalities. Similarity metrics included Dice similarity coefficient (DSC), Hausdorff distance, and center of mass (COM) deviation. Two isolated lesions were evaluated in the patient’s scans: one located in the right lower lobe (ITVRLL) and one located in the left lower lobe (ITVLLL). Results: Dose rates ranged from 2.30 × 10⁻³ cGy/mAs (lung) to 5.18 × 10⁻³ cGy/mAs (bone). For fixed acquisition parameters, cumulative dose is inversely proportional to gantry speed. For ITVRLL, DSC were 0.70 and 0.68, Hausdorff distances were 6.11 and 5.69 mm, and COM deviations were 1.24 and 4.77 mm, for Ave-IP and MIP respectively. For ITVLLL, DSC were 0.64 and 0.75, Hausdorff distances were 10.74 and 8.00 mm, and COM deviations were 7.55 and 4.3 mm, for Ave-IP and MIP respectively. Conclusion: While the dosimetric output of 4D-CBCT is low, characterization is necessary to assure clinical optimization. A basic workflow for comparison of simulation and treatment 4D image-based contours was established.
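
    For reference, the three similarity metrics reported above can be sketched on binary masks defined on a common voxel grid; the isotropic-spacing handling and the mask-based (rather than mesh-based) formulation are simplifying assumptions.

```python
# Sketch of the three contouring-similarity metrics (DSC, Hausdorff distance, COM
# deviation) on binary masks sharing one grid. Clinical tools usually operate on
# structure meshes and anisotropic spacing; this is a simplified voxel version.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice(a, b):
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def hausdorff_mm(a, b, spacing_mm=1.0):
    pa, pb = np.argwhere(a), np.argwhere(b)
    return spacing_mm * max(directed_hausdorff(pa, pb)[0], directed_hausdorff(pb, pa)[0])

def com_deviation_mm(a, b, spacing_mm=1.0):
    ca, cb = np.argwhere(a).mean(axis=0), np.argwhere(b).mean(axis=0)
    return spacing_mm * float(np.linalg.norm(ca - cb))

# Usage idea: compare an ITV contoured on 4D-CT with the same ITV on a 4D-CBCT reconstruction.
# dsc = dice(itv_4dct_mask, itv_4dcbct_mask)
```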

  3. Using research metrics to evaluate the International Atomic Energy Agency guidelines on quality assurance for R&D

    Energy Technology Data Exchange (ETDEWEB)

    Bodnarczuk, M.

    1994-06-01

    The objective of the International Atomic Energy Agency (IAEA) Guidelines on Quality Assurance for R&D is to provide guidance for developing quality assurance (QA) programs for R&D work on items, services, and processes important to safety, and to support the siting, design, construction, commissioning, operation, and decommissioning of nuclear facilities. The standard approach to writing papers describing new quality guidelines documents is to present a descriptive overview of the contents of the document. I will depart from this approach. Instead, I will first discuss a conceptual framework of metrics for evaluating and improving basic and applied experimental science as well as the associated role that quality management should play in understanding and implementing these metrics. I will conclude by evaluating how well the IAEA document addresses the metrics from this conceptual framework and the broader principles of quality management.

  4. METRIC CHARACTERISTICS OF SOME TESTS FOR EVALUATION OF AEROBIC AND ANAEROBIC CAPACITIES

    Directory of Open Access Journals (Sweden)

    Slobodan Stojiljković

    2006-06-01

    Full Text Available This research was aimed at checking the metric characteristics of some specific functional tests often used in practice for the evaluation of aerobic and anaerobic capacities and muscular capabilities. The changes and behavior of the functional abilities were tracked on the basis of several repeated measurements of the same test on a sample of 110 examinees, students of the nursing school "Dr Milenko Hadzic" in Niš, 17 years of age (± 6 months), regularly attending the classes of physical education. Two measuring instruments were tested: the MARGARIA TEST and the HARVARD STEP TEST. The reliability of these tests was evaluated on the basis of five successive measurements using the Spearman-Brown method, based on determining the values of the coefficients of determination of all measurements and of the first main component H1. The outcome revealed high reliability of the results of most of the measurements and of the first main component H1, with acquired results of 91.2% for the MARGARIA TEST (anaerobic capacity) and 93.4% for the HARVARD STEP TEST (aerobic capacity).

  5. Evaluation of alternate categorical tumor metrics and cut points for response categorization using the RECIST 1.1 data warehouse.

    Science.gov (United States)

    Mandrekar, Sumithra J; An, Ming-Wen; Meyers, Jeffrey; Grothey, Axel; Bogaerts, Jan; Sargent, Daniel J

    2014-03-10

    We sought to test and validate the predictive utility of trichotomous tumor response (TriTR; complete response [CR] or partial response [PR] v stable disease [SD] v progressive disease [PD]), disease control rate (DCR; CR/PR/SD v PD), and dichotomous tumor response (DiTR; CR/PR v others) metrics using alternate cut points for PR and PD. The data warehouse assembled to guide the Response Evaluation Criteria in Solid Tumors (RECIST) version 1.1 was used. Data from 13 trials (5,480 patients with metastatic breast cancer, non-small-cell lung cancer, or colorectal cancer) were randomly split (60:40) into training and validation data sets. In all, 27 pairs of cut points for PR and PD were considered: PR (10% to 50% decrease by 5% increments) and PD (10% to 20% increase by 5% increments), for which 30% and 20% correspond to the RECIST categorization. Cox proportional hazards models with landmark analyses at 12 and 24 weeks stratified by study and number of lesions (fewer than three v three or more) and adjusted for average baseline tumor size were used to assess the impact of each metric on overall survival (OS). Model discrimination was assessed by using the concordance index (c-index). Standard RECIST cut points demonstrated predictive ability similar to the alternate PR and PD cut points. Regardless of tumor type, the TriTR, DiTR, and DCR metrics had similar predictive performance. The 24-week metrics (albeit with higher c-index point estimate) were not meaningfully better than the 12-week metrics. None of the metrics did particularly well for breast cancer. Alternative cut points to RECIST standards provided no meaningful improvement in OS prediction. Metrics assessed at 12 weeks have good predictive performance.
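
    A compact way to see how the alternate cut points act is to code the response categorizations directly; the sketch below uses percent change from baseline for simplicity (RECIST defines progression relative to the nadir), and the function names and example values are illustrative.

```python
# Sketch of the response categorizations compared above, parameterized by the PR and PD
# cut points (RECIST 1.1 uses -30% / +20% change in the sum of lesion diameters; the
# alternate cut points in the study vary these values). CR handling is simplified.
def percent_change(baseline_sum, current_sum):
    return 100.0 * (current_sum - baseline_sum) / baseline_sum

def tritr(baseline_sum, current_sum, pr_cut=-30.0, pd_cut=20.0):
    """Trichotomous tumor response: CR/PR vs SD vs PD."""
    if current_sum == 0:
        return "CR/PR"
    change = percent_change(baseline_sum, current_sum)
    if change <= pr_cut:
        return "CR/PR"
    if change >= pd_cut:
        return "PD"
    return "SD"

def dcr(baseline_sum, current_sum, **cuts):
    """Disease control grouping: CR/PR/SD vs PD."""
    return "PD" if tritr(baseline_sum, current_sum, **cuts) == "PD" else "CR/PR/SD"

print(tritr(50.0, 33.0), tritr(50.0, 33.0, pr_cut=-40.0))  # 'CR/PR' vs 'SD' under a stricter PR cut
```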

  6. Prognostic Performance Metrics

    Data.gov (United States)

    National Aeronautics and Space Administration — This chapter presents several performance metrics for offline evaluation of prognostics algorithms. A brief overview of different methods employed for performance...

  7. Survey of consumer attitudes and awareness of the metric conversion of distilled spirits containers: A policy and planning evaluation

    Science.gov (United States)

    Simpson, J. A.; Barsby, S. L.

    1981-12-01

    The survey was conducted as part of a policy and planning evaluation study. The overall study was an examination of a completed private sector conversion to the metric system, in the light of the US Metric Board's planning guidelines and procedures. The conversion of distilled spirits containers took place prior to the establishment of the USMB. The study's objective was to use the completed conversion to determine whether the guidelines and related procedures were adequate to help the conversion process. If they were not, the study was designed to provide suggestions for improvement.

  8. Scientific foundation of regulating ionizing radiation: application of metrics for evaluation of regulatory science information.

    Science.gov (United States)

    Moghissi, A Alan; Gerraa, Vikrham Kumar; McBride, Dennis K; Swetnam, Michael

    2014-11-01

    This paper starts by describing the historical evolution of assessment of biologic effects of ionizing radiation leading to the linear non-threshold (LNT) system currently used to regulate exposure to ionizing radiation. The paper describes briefly the concept of Best Available Science (BAS) and Metrics for Evaluation of Scientific Claims (MESC) derived for BAS. It identifies three phases of regulatory science consisting of the initial phase, when the regulators had to develop regulations without having the needed scientific information; the exploratory phase, when relevant tools were developed; and the standard operating phase, when the tools were applied to regulations. Subsequently, an attempt is made to apply the BAS/MESC system to various stages of LNT. This paper then compares the exposure limits imposed by regulatory agencies and also compares them with naturally occurring radiation at several cities. Controversies about LNT are addressed, including judgments of the U.S. National Academies and their French counterpart. The paper concludes that, based on the BAS/MESC system, there is no disagreement between the two academies on the scientific foundation of LNT; instead, the disagreement is based on their judgment or speculation.

  9. AN EVALUATION OF OZONE EXPOSURE METRICS FOR A SEASONALLY DROUGHT STRESSED PONDEROSA PINE ECOSYSTEM. (R826601)

    Science.gov (United States)

    Ozone stress has become an increasingly significant factor in cases of forest decline reported throughout the world. Current metrics to estimate ozone exposure for forest trees are derived from atmospheric concentrations and assume that the forest is physiologically active at ...

  10. San Luis Basin Sustainability Metrics Project: A Methodology for Evaluating Regional Sustainability

    Science.gov (United States)

    Although there are several scientifically-based sustainability metrics, many are data intensive, difficult to calculate, and fail to capture all aspects of a system. To address these issues, we produced a scientifically-defensible, but straightforward and inexpensive, methodolog...

  11. Vehicular Networking Enhancement And Multi-Channel Routing Optimization, Based on Multi-Objective Metric and Minimum Spanning Tree

    Directory of Open Access Journals (Sweden)

    Peppino Fazio

    2013-01-01

    Full Text Available Vehicular Ad hoc NETworks (VANETs) represent a particular mobile technology that permits the communication among vehicles, offering security and comfort. Nowadays, distributed mobile wireless computing is becoming a very important communications paradigm, due to its flexibility to adapt to different mobile applications. VANETs are a practical example of data exchange among real mobile nodes. To enable communications within an ad-hoc network, characterized by continuous node movements, routing protocols are needed to react to frequent changes in network topology. In this paper, the attention is focused mainly on the network layer of VANETs, proposing a novel approach to reduce the interference level during mobile transmission, based on the multi-channel nature of the IEEE 802.11p (1609.4) standard. In this work a new routing protocol based on the Distance Vector algorithm is presented to reduce the end-to-end delay and to increase the packet delivery ratio (PDR) and throughput in VANETs. A new metric is also proposed, based on the maximization of the average Signal-to-Interference Ratio (SIR) level and the link duration probability between two VANET nodes. In order to relieve the effects of the co-channel interference perceived by mobile nodes, transmission channels are switched on the basis of a periodic SIR evaluation. A network simulator has been used for implementing and testing the proposed idea.

  12. Metric diffusion along foliations

    CERN Document Server

    Walczak, Szymon M

    2017-01-01

    Up-to-date research in metric diffusion along compact foliations is presented in this book. Beginning with fundamentals from optimal transportation theory and the theory of foliations, the book moves on to cover the Wasserstein distance, the Kantorovich duality theorem, and the metrization of the weak topology by the Wasserstein distance. Metric diffusion is defined, the topology of the metric space is studied, and the limits of diffused metrics along compact foliations are discussed. Essentials on foliations, holonomy, heat diffusion, and compact foliations are detailed, and vital technical lemmas are proved to aid understanding. Graduate students and researchers in the geometry, topology and dynamics of foliations and laminations will find this supplement useful, as it presents facts about metric diffusion along non-compact foliations and provides a full description of the limit for metrics diffused along a foliation with at least one compact leaf in two dimensions.

  13. Evaluation of the performance of a micromethod for measuring urinary iodine by using six sigma quality metrics.

    Science.gov (United States)

    Hussain, Husniza; Khalid, Norhayati Mustafa; Selamat, Rusidah; Wan Nazaimoon, Wan Mohamud

    2013-09-01

    The urinary iodine micromethod (UIMM) is a modification of the conventional method and its performance needs evaluation. UIMM performance was evaluated using the method validation and 2008 Iodine Deficiency Disorders survey data obtained from four urinary iodine (UI) laboratories. Method acceptability tests and Sigma quality metrics were determined using total allowable errors (TEas) set by two external quality assurance (EQA) providers. UIMM met the various method acceptability test criteria, with some discrepancies at low concentrations. Method validation data calculated against the UI Quality Program (TUIQP) TEas showed that the Sigma metrics were 2.75, 1.80, and 3.80 at 51±15.50 µg/L, 108±32.40 µg/L, and 149±38.60 µg/L UI, respectively. External quality control (EQC) data showed that the performance of the laboratories was within Sigma metrics of 0.85-1.12, 1.57-4.36, and 1.46-4.98 at 46.91±7.05 µg/L, 135.14±13.53 µg/L, and 238.58±17.90 µg/L, respectively. No laboratory achieved a calculated total error (TEcalc) below the TEa at all concentrations, and only one laboratory had TEcalc...
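
    The Sigma quality metric referenced above is conventionally computed as (TEa − |bias|)/CV with all terms in percent; the sketch below illustrates this standard formula with made-up numbers, not the laboratories' actual data.

```python
# Illustrative computation of the Sigma quality metric: sigma = (TEa - |bias|) / CV,
# with the allowable total error (TEa), bias, and imprecision (CV) all in percent.
# The example values are hypothetical, not results from the surveyed laboratories.
def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    return (tea_pct - abs(bias_pct)) / cv_pct

print(round(sigma_metric(25.0, 4.0, 7.0), 2))    # -> 3.0: meets a "3 sigma" quality level
print(round(sigma_metric(25.0, 10.0, 12.0), 2))  # -> 1.25: poor performance at this TEa
```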

  14. Evaluation of Frameworks for HSCT Design Optimization

    Science.gov (United States)

    Krishnan, Ramki

    1998-01-01

    This report is an evaluation of engineering frameworks that could be used to augment, supplement, or replace the existing FIDO 3.5 (Framework for Interdisciplinary Design and Optimization Version 3.5) framework. The report begins with the motivation for this effort, followed by a description of an "ideal" multidisciplinary design and optimization (MDO) framework. The discussion then turns to how each candidate framework stacks up against this ideal. This report ends with recommendations as to the "best" frameworks that should be down-selected for detailed review.

  15. Development of a standardized transfusion ratio as a metric for evaluating dialysis facility anemia management practices.

    Science.gov (United States)

    Liu, Jiannong; Li, Suying; Gilbertson, David T; Monda, Keri L; Bradbury, Brian D; Collins, Allan J

    2014-10-01

    Because transfusion avoidance has been the cornerstone of anemia treatment for patients with kidney disease, direct measurement of red blood cell transfusion use to assess dialysis facility anemia management performance is reasonable. We aimed to explore methods for estimating facility-level standardized transfusion ratios (STfRs) to assess provider anemia treatment practices. Retrospective cohort study. Point prevalent US hemodialysis patients on January 1, 2009, with Medicare as primary payer and dialysis duration of 90 days or longer were included (n = 223,901). All dialysis facilities with eligible patients were included (n = 5,345). Dialysis facility assignment. Receiving a red blood cell transfusion in the inpatient or outpatient setting. We evaluated 3 approaches for estimating STfR: ratio of observed to expected numbers of transfusions (STfR(obs)), a Bayesian approach (STfR(Bayes)), and a modified version of the Bayesian approach (STfR(modBayes)). The overall national transfusion rate in 2009 was 23.2 per 100 patient-years. Our model for predicting the expected number of transfusions performed well. For large facilities, all 3 STfRs worked well. However, for small facilities, while the STfR(modBayes) worked well, STfR(obs) values demonstrated instability and the STfR(Bayes) may produce more bias. Administration of transfusions to dialysis patients reflects medical practice both within and outside the dialysis unit. Some transfusions may be deemed unavoidable and transfusion practices are subject to considerable regional variation. Development of an STfR metric is feasible and reasonable for assessing anemia treatment at dialysis facilities. The STfR(obs) is simple to calculate and works well for larger dialysis facilities. The STfR(modBayes) is more analytically complex, but facilitates comparisons across all dialysis facilities, including small facilities. Copyright © 2014 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.
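
    The observed-to-expected construction of STfRobs is straightforward to illustrate; the sketch below also adds a toy shrinkage step only to hint at why a Bayesian-style estimator stabilizes small facilities, and should not be read as the modified Bayesian method developed in the study.

```python
# Sketch of a standardized transfusion ratio: observed transfusions divided by the
# expected count from a patient-level risk model (expected counts are taken as given
# here). The shrinkage variant is a toy illustration of small-facility stabilization.
import numpy as np

def stfr_obs(observed, expected):
    return np.asarray(observed, dtype=float) / np.asarray(expected, dtype=float)

def stfr_shrunk(observed, expected, prior_strength=5.0):
    """Pulls each facility's ratio toward 1.0, more strongly when expected counts are small."""
    o, e = np.asarray(observed, float), np.asarray(expected, float)
    return (o + prior_strength) / (e + prior_strength)

observed = [3, 40, 0]          # transfusions at three facilities over the period
expected = [2.1, 35.7, 1.4]    # model-based expected counts
print(stfr_obs(observed, expected))    # unstable for the small facility with 0 events
print(stfr_shrunk(observed, expected)) # shrunken ratios are more comparable across sizes
```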

  16. Overview of journal metrics

    Directory of Open Access Journals (Sweden)

    Kihong Kim

    2018-02-01

    Full Text Available Various kinds of metrics used for the quantitative evaluation of scholarly journals are reviewed. The impact factor and related metrics, including the immediacy index and the aggregate impact factor, which are provided by the Journal Citation Reports, are explained in detail. The Eigenfactor score and the article influence score are also reviewed. In addition, journal metrics such as CiteScore, Source Normalized Impact per Paper, SCImago Journal Rank, h-index, and g-index are discussed. Limitations and problems of these metrics are pointed out. We should be cautious not to rely too heavily on these quantitative measures when evaluating journals or researchers.

  17. Evaluating Consumer Product Life Cycle Sustainability with Integrated Metrics: A Paper Towel Case Study

    Science.gov (United States)

    Integrated sustainability metrics provide an enriched set of information to inform decision-making. However, such approaches are rarely used to assess product supply chains. In this work, four integrated metrics—presented in terms of land, resources, value added, and stability—ar...

  18. National evaluation of multidisciplinary quality metrics for head and neck cancer.

    Science.gov (United States)

    Cramer, John D; Speedy, Sedona E; Ferris, Robert L; Rademaker, Alfred W; Patel, Urjeet A; Samant, Sandeep

    2017-11-15

    The National Quality Forum has endorsed quality-improvement measures for multiple cancer types that are being developed into actionable tools to improve cancer care. No nationally endorsed quality metrics currently exist for head and neck cancer. The authors identified patients with surgically treated, invasive, head and neck squamous cell carcinoma in the National Cancer Data Base from 2004 to 2014 and compared the rate of adherence to 5 different quality metrics and whether compliance with these quality metrics impacted overall survival. The metrics examined included negative surgical margins, neck dissection lymph node (LN) yield ≥ 18, appropriate adjuvant radiation, appropriate adjuvant chemoradiation, adjuvant therapy within 6 weeks, as well as overall quality. In total, 76,853 eligible patients were identified. There was substantial variability in patient-level adherence, which was 80% for negative surgical margins, 73.1% for neck dissection LN yield, 69% for adjuvant radiation, 42.6% for adjuvant chemoradiation, and 44.5% for adjuvant therapy within 6 weeks. Risk-adjusted Cox proportional-hazard models indicated that all metrics were associated with a reduced risk of death: negative margins (hazard ratio [HR] 0.73; 95% confidence interval [CI], 0.71-0.76), LN yield ≥ 18 (HR, 0.93; 95% CI, 0.89-0.96), adjuvant radiation (HR, 0.67; 95% CI, 0.64-0.70), adjuvant chemoradiation (HR, 0.84; 95% CI, 0.79-0.88), and adjuvant therapy ≤6 weeks (HR, 0.92; 95% CI, 0.89-0.96). Patients who received high-quality care had a 19% reduced adjusted hazard of mortality (HR, 0.81; 95% CI, 0.79-0.83). Five head and neck cancer quality metrics were identified that have substantial variability in adherence and meaningfully impact overall survival. These metrics are appropriate candidates for national adoption. Cancer 2017;123:4372-81. © 2017 American Cancer Society.

  19. Classification and Evaluation of Mobility Metrics for Mobility Model Movement Patterns in Mobile Ad-Hoc Networks

    OpenAIRE

    Santosh Kumar; S C Sharma; Bhupendra Suman

    2011-01-01

    A mobile ad hoc network is a collection of self-configuring communicating devices (mobile devices) that adapt their wireless links to form an arbitrary topology and multihop wireless connectivity without the use of existing infrastructure. It requires an efficient dynamic routing protocol to determine routes according to a set of rules that enables two or more devices to communicate with each other. This paper classifies and evaluates the mobility metrics into two categories-...

  20. Evaluation of metrics and baselines for tracking greenhouse gas emissions trends: Recommendations for the California climate action registry

    Energy Technology Data Exchange (ETDEWEB)

    Price, Lynn; Murtishaw, Scott; Worrell, Ernst

    2003-06-01

    Lawrence Berkeley National Laboratory (Berkeley Lab) was asked to provide technical assistance to the California Energy Commission (Energy Commission) related to the Registry in three areas: (1) assessing the availability and usefulness of industry-specific metrics, (2) evaluating various methods for establishing baselines for calculating GHG emissions reductions related to specific actions taken by Registry participants, and (3) establishing methods for calculating electricity CO2 emission factors. The third area of research was completed in 2002 and is documented in Estimating Carbon Dioxide Emissions Factors for the California Electric Power Sector (Marnay et al., 2002). This report documents our findings related to the first areas of research. For the first area of research, the overall objective was to evaluate the metrics, such as emissions per economic unit or emissions per unit of production, that can be used to report GHG emissions trends for potential Registry participants. This research began with an effort to identify methodologies, benchmarking programs, inventories, protocols, and registries that use industry-specific metrics to track trends in energy use or GHG emissions in order to determine what types of metrics have already been developed. The next step in developing industry-specific metrics was to assess the availability of data needed to determine metric development priorities. Berkeley Lab also determined the relative importance of different potential Registry participant categories in order to assess the availability of sectoral or industry-specific metrics, and then identified industry-specific metrics in use around the world. While a plethora of metrics was identified, no single metric that adequately tracks trends in GHG emissions while maintaining confidentiality of data was identified. As a result of this review, Berkeley Lab recommends the development of a GHG intensity index as a new metric for reporting and tracking GHG emissions trends. Such an index could provide an

  1. Disaster Metrics: Evaluation of de Boer's Disaster Severity Scale (DSS) Applied to Earthquakes.

    Science.gov (United States)

    Bayram, Jamil D; Zuabi, Shawki; McCord, Caitlin M; Sherak, Raphael A G; Hsu, Edberdt B; Kelen, Gabor D

    2015-02-01

    Quantitative measurement of the medical severity following multiple-casualty events (MCEs) is an important goal in disaster medicine. In 1990, de Boer proposed a 13-point, 7-parameter scale called the Disaster Severity Scale (DSS). Parameters include cause, duration, radius, number of casualties, nature of injuries, rescue time, and effect on surrounding community. Hypothesis This study aimed to examine the reliability and dimensionality (number of salient themes) of de Boer's DSS scale through its application to 144 discrete earthquake events. A search for earthquake events was conducted via National Oceanic and Atmospheric Administration (NOAA) and US Geological Survey (USGS) databases. Two experts in the field of disaster medicine independently reviewed and assigned scores for parameters that had no data readily available (nature of injuries, rescue time, and effect on surrounding community), and differences were reconciled via consensus. Principal Component Analysis was performed using SPSS Statistics for Windows Version 22.0 (IBM Corp; Armonk, New York USA) to evaluate the reliability and dimensionality of the DSS. A total of 144 individual earthquakes from 2003 through 2013 were identified and scored. Of 13 points possible, the mean score was 6.04, the mode = 5, minimum = 4, maximum = 11, and standard deviation = 2.23. Three parameters in the DSS had zero variance (ie, the parameter received the same score in all 144 earthquakes). Because of the zero contribution to variance, these three parameters (cause, duration, and radius) were removed to run the statistical analysis. Cronbach's alpha score, a coefficient of internal consistency, for the remaining four parameters was found to be robust at 0.89. Principal Component Analysis showed uni-dimensional characteristics with only one component having an eigenvalue greater than one at 3.17. The 4-parameter DSS, however, suffered from restriction of scoring range on both parameter and scale levels. Jan de Boer

  2. Classification in medical images using adaptive metric k-NN

    Science.gov (United States)

    Chen, C.; Chernoff, K.; Karemore, G.; Lo, P.; Nielsen, M.; Lauze, F.

    2010-03-01

    The performance of the k-nearest neighbors (k-NN) classifier is highly dependent on the distance metric used to identify the k nearest neighbors of the query points. The standard Euclidean distance is commonly used in practice. This paper investigates the performance of the k-NN classifier with respect to different adaptive metrics in the context of medical imaging. We propose using adaptive metrics such that the structure of the data is better described, introducing some unsupervised learning knowledge into k-NN. Four different metrics are estimated: a theoretical metric based on the assumption that images are drawn from the Brownian Image Model (BIM), a normalized metric based on the variance of the data, an empirical metric based on the empirical covariance matrix of the unlabeled data, and an optimized metric obtained by minimizing the classification error. The spectral structure of the empirical covariance also leads to Principal Component Analysis (PCA) being performed on it, which results in the subspace metrics. The metrics are evaluated on two data sets: lateral X-rays of the lumbar aortic/spine region, where we use k-NN for performing abdominal aorta calcification detection; and mammograms, where we use k-NN for breast cancer risk assessment. The results show that an appropriate choice of metric can improve classification.
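
    One of the four metric choices, the empirical (covariance-based) metric, is easy to sketch as a Mahalanobis-type distance inside a k-NN classifier; the BIM-derived and error-optimized metrics are not reproduced here, and the data below are synthetic.

```python
# Sketch of k-NN with an adaptive metric: squared distances are computed under a
# positive-definite matrix M, here taken as the inverse empirical covariance of
# unlabeled data (a Mahalanobis metric). Data and labels are synthetic.
import numpy as np

def knn_predict(x_query, X_train, y_train, k=5, metric_matrix=None):
    M = np.eye(X_train.shape[1]) if metric_matrix is None else metric_matrix
    diffs = X_train - x_query
    dists = np.einsum("ij,jk,ik->i", diffs, M, diffs)   # squared distances under M
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]                    # majority vote among neighbors

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 10))
y = (X[:, 0] + 0.1 * rng.normal(size=300) > 0).astype(int)
M_empirical = np.linalg.inv(np.cov(X, rowvar=False))    # empirical (Mahalanobis) metric
print(knn_predict(X[0], X[1:], y[1:], k=7, metric_matrix=M_empirical))
```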

  3. Metrics for evaluation of the author's writing styles: who is the best?

    Science.gov (United States)

    Darooneh, Amir H; Shariati, Ashrafosadat

    2014-09-01

    Studying the complexity of language has recently attracted the attention of physicists. Methods borrowed from statistical mechanics, namely complex network theory, can be used to explore regularities as a characteristic of the complexity of language. In this paper, we focus on authorship identification using the complex network approach. We introduce three metrics which enable us to compare authors' writing styles. This approach was previously used by us for finding the author of an unknown book among a collection of thirty-six books written by five Persian poets. Here, we select a collection of one hundred and one books by nine English writers and quantify their writing styles according to our metrics. In our experiment, Shakespeare appears as the best author who follows a unique writing style in all of his works.

  4. Evaluation of performance metrics of leagile supply chain through fuzzy MCDM

    Directory of Open Access Journals (Sweden)

    D. Venkata Ramana

    2013-07-01

    Full Text Available Leagile supply chain management has emerged as a proactive approach for improving the business value of companies. Companies that face volatile and unpredictable market demand for their products must pioneer a leagile supply chain strategy to remain competitive and meet the varied demands of customers. There are many approaches for measuring supply chain performance in general, yet little investigation has examined the reliability and validity of such approaches, particularly in leagile supply chains. This study examines the consistency of these approaches by confirmatory factor analysis, which determines the adoption of performance dimensions. The prioritization of performance enablers under these dimensions of the leagile supply chain in small and medium enterprises is determined through the fuzzy logarithmic least squares method (LLSM). The study developed a generic hierarchy model for decision-makers who can prioritize the supply chain metrics under the performance dimensions of a leagile supply chain.

  5. Metrics for evaluation of the author's writing styles: Who is the best?

    Science.gov (United States)

    Darooneh, Amir H.; Shariati, Ashrafosadat

    2014-09-01

    Studying the complexity of language has recently attracted the attention of physicists. Methods borrowed from statistical mechanics, namely complex network theory, can be used to explore regularities as a characteristic of the complexity of language. In this paper, we focus on authorship identification using the complex network approach. We introduce three metrics that enable us to compare authors' writing styles. We previously used this approach to find the author of an unknown book among a collection of thirty-six books written by five Persian poets. Here, we select a collection of one hundred and one books by nine English writers and quantify their writing styles according to our metrics. In our experiment, Shakespeare emerges as the best author, following a unique writing style in all of his works.

  6. A contest of sensors in close range 3D imaging: performance evaluation with a new metric test object

    Directory of Open Access Journals (Sweden)

    M. Hess

    2014-06-01

    Full Text Available An independent means of 3D image quality assessment is introduced, addressing non-professional users of sensors and freeware, a domain largely characterized by closed-source software and by the absence of quality metrics for processing steps such as alignment. A performance evaluation of commercially available, state-of-the-art close-range 3D imaging technologies is demonstrated with the help of a newly developed Portable Metric Test Artefact. The use of this test object provides quality control through a quantitative assessment of 3D imaging sensors. It enables users to specify precisely which spatial resolution and geometry recording they expect as the outcome of their 3D digitizing process. This will lead to the creation of high-quality 3D digital surrogates and 3D digital assets. The paper is presented in the form of a competition of teams, from which a possible winner emerges.

  7. Test and Evaluation Metrics of Crew Decision-Making And Aircraft Attitude and Energy State Awareness

    Science.gov (United States)

    Bailey, Randall E.; Ellis, Kyle K. E.; Stephens, Chad L.

    2013-01-01

    NASA has established a technical challenge, under the Aviation Safety Program, Vehicle Systems Safety Technologies project, to improve crew decision-making and response in complex situations. The specific objective of this challenge is to develop data and technologies which may increase a pilot's (crew's) ability to avoid, detect, and recover from adverse events that could otherwise result in accidents/incidents. Within this technical challenge, a cooperative industry-government research program has been established to develop innovative flight deck-based countermeasures that can improve the crew's ability to avoid, detect, mitigate, and recover from unsafe loss of aircraft state awareness - specifically, the loss of attitude awareness (i.e., Spatial Disorientation, SD) or the loss of energy state awareness (LESA). A critical component of this research is to develop specific and quantifiable metrics which identify decision-making and the influences on decision-making during simulation and flight testing. This paper reviews existing metrics and methods for SD testing and criteria for establishing visual dominance. The development of Crew State Monitoring technologies - eye tracking and other psychophysiological measures - is also discussed, as well as emerging new metrics for identifying channelized attention and excessive pilot workload, both of which have been shown to contribute to SD/LESA accidents or incidents.

  8. New exposure-based metric approach for evaluating O3 risk to North American aspen forests

    International Nuclear Information System (INIS)

    Percy, K.E.; Nosal, M.; Heilman, W.; Dann, T.; Sober, J.; Legge, A.H.; Karnosky, D.F.

    2007-01-01

    The United States and Canada currently use exposure-based metrics to protect vegetation from O₃. Using 5 years (1999-2003) of co-measured O₃, meteorology and growth response, we have developed exposure-based regression models that predict Populus tremuloides growth change within the North American ambient air quality context. The models comprise the growing season fourth-highest daily maximum 8-h average O₃ concentration, growing degree days, and wind speed. They have high statistical significance and goodness of fit, include 95% confidence intervals for tree growth change, and are simple to use. Averaged across a wide range of clonal sensitivity, historical 2001-2003 growth change over most of the 26 Mha P. tremuloides distribution was estimated to have ranged from no impact (0%) to strong negative impacts (-31%). With four aspen clones responding negatively (one responded positively) to O₃, the growing season fourth-highest daily maximum 8-h average O₃ concentration performed much better than the growing season SUM06, AOT40 or maximum 1-h average O₃ concentration metrics as a single indicator of aspen stem cross-sectional area growth. - A new exposure-based metric approach to predict O₃ risk to North American aspen forests has been developed.
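    A hedged sketch of fitting an exposure-based growth-change regression from the three predictors named above (fourth-highest daily maximum 8-h O₃, growing degree days, wind speed); the data and coefficients are synthetic and purely illustrative of the model form, not the published model.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

# Synthetic stand-level records (purely illustrative, not the paper's data):
# fourth-highest daily max 8-h O3 (ppb), growing degree days, mean wind speed (m/s).
n = 120
o3_4th = rng.uniform(50, 100, n)
gdd = rng.uniform(1200, 2000, n)
wind = rng.uniform(1.0, 5.0, n)
# Assume growth change declines with O3 exposure; coefficients are made up.
growth_change = 5.0 - 0.35 * o3_4th + 0.004 * gdd - 0.8 * wind + rng.normal(0, 2, n)

X = np.column_stack([o3_4th, gdd, wind])
model = LinearRegression().fit(X, growth_change)

print("coefficients:", model.coef_, "intercept:", model.intercept_)
print("predicted growth change at 80 ppb, 1600 GDD, 3 m/s:",
      model.predict([[80, 1600, 3.0]])[0])
```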

  9. Evaluation and optimization of LWR fuel cycles

    International Nuclear Information System (INIS)

    Akbas, T.; Zabunoglu, O.; Tombakoglu, M.

    2001-01-01

    There are several options at the back end of the nuclear fuel cycle. Discharge burn-up, the length of the interim storage period, the choice of direct disposal or recycling, and the method of reprocessing in the case of recycling affect the options and define the fuel cycle scenarios. These options have been evaluated from the viewpoint of tangible factors (fuel cycle cost, natural uranium requirement, decay heat of high-level waste, radiological ingestion and inhalation hazards) and intangible factors (technological feasibility, nonproliferation aspects, etc.). Neutronic parameters are calculated using the versatile fuel depletion code ORIGEN2.1. A program is developed for the calculation of cost-related parameters. The analytic hierarchy process is used to transform the intangible factors into tangible ones. All these factors are then incorporated into a form suitable for goal programming, a linear optimization technique used to determine the optimal option among alternatives. According to the specified objective function and constraints, the optimal fuel cycle scenario is determined using GPSYS (linear programming software) as a goal programming tool. In addition, a sensitivity analysis is performed for selected important parameters.
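    The goal programming step can be sketched with deviation variables and an ordinary LP solver; the sketch below uses scipy.optimize.linprog with two made-up goals (cost and uranium requirement) and illustrative weights, standing in for the GPSYS formulation rather than reproducing it.

```python
import numpy as np
from scipy.optimize import linprog

# Two hypothetical fuel-cycle decision variables x1, x2 (e.g., fractions of fuel
# routed to two back-end options) and two goals with made-up targets:
#   cost goal:     3*x1 + 5*x2 + d1m - d1p = 40   (target cost 40)
#   uranium goal:  8*x1 + 2*x2 + d2m - d2p = 60   (target requirement 60)
# Variables: [x1, x2, d1m, d1p, d2m, d2p]; minimize the weighted over-achievement
# deviations d1p and d2p (weights are illustrative).
c = np.array([0, 0, 0, 2.0, 0, 1.0])

A_eq = np.array([
    [3, 5, 1, -1, 0, 0],
    [8, 2, 0, 0, 1, -1],
])
b_eq = np.array([40, 60])

# A hard constraint: the two quantities must sum to at most 10 units of fuel.
A_ub = np.array([[1, 1, 0, 0, 0, 0]])
b_ub = np.array([10])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 6, method="highs")
print("decision variables:", res.x[:2])
print("deviations (d1-, d1+, d2-, d2+):", res.x[2:])
```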

  10. An evaluation of non-metric cranial traits used to estimate ancestry in a South African sample.

    Science.gov (United States)

    L'Abbé, E N; Van Rooyen, C; Nawrocki, S P; Becker, P J

    2011-06-15

    Establishing ancestry from a skeleton for forensic purposes has been shown to be difficult. The purpose of this paper is to address the application of thirteen non-metric traits to estimate ancestry in three South African groups, namely White, Black and "Coloured". In doing so, the frequency distributions of thirteen non-metric traits among South Africans are presented; the relationships of these non-metric traits with ancestry, sex and age at death are evaluated; and Kappa statistics are utilized to assess inter- and intra-rater reliability. Crania of 520 known individuals were obtained from four skeletal samples in South Africa: the Pretoria Bone Collection, the Raymond A. Dart Collection, the Kirsten Collection and the Student Bone Collection from the University of the Free State. Average age at death was 51, with an age range between 18 and 90. Thirteen commonly used non-metric traits from the face and jaw were scored; definitions and illustrations were taken from Hefner, Bass, and Hauser and De Stephano. Frequency distributions, ordinal regression and Cohen's Kappa statistics were performed as a means to assess population variation and repeatability. Frequency distributions were highly variable among South Africans. Twelve of the 13 variables had a statistically significant relationship with ancestry. Sex significantly affected only one variable, inter-orbital breadth, and age at death affected two (anterior nasal spine and alveolar prognathism). The interaction of ancestry and sex independently affected three variables (nasal bone contour, nasal breadth, and interorbital breadth). Seven traits had moderate to excellent repeatability, while poor scoring consistency was noted for six variables. Difficulties in repeating several of the trait scores may indicate either a need for refinement of the definitions or that these character states do not adequately describe the observable morphology in the population. The application of the traditional experience-based approach

  11. Mental workload and cognitive task automaticity: an evaluation of subjective and time estimation metrics.

    Science.gov (United States)

    Liu, Y; Wickens, C D

    1994-11-01

    The evaluation of mental workload is becoming increasingly important in system design and analysis. The present study examined the structure and assessment of mental workload in performing decision and monitoring tasks by focusing on two mental workload measurements: subjective assessment and time estimation. The task required the assignment of a series of incoming customers to the shortest of three parallel service lines displayed on a computer monitor. The subject was either in charge of the customer assignment (manual mode) or monitored an automated system performing the same task (automatic mode). In both cases, the subjects were required to detect the non-optimal assignments that they or the computer had made. Time pressure was manipulated by the experimenter to create fast and slow conditions. The results revealed a multi-dimensional structure of mental workload and a multi-step process of subjective workload assessment. The results also indicated that subjective workload was more influenced by the subject's participatory mode than by the factor of task speed. The time estimation intervals produced while performing the decision and monitoring tasks had significantly greater length and larger variability than those produced while either performing no other task or performing a well-practised customer assignment task. This result seemed to indicate that time estimation was sensitive to the presence of perceptual/cognitive demands, but not to response-related activities for which behavioural automaticity has developed.

  12. Wheeling rates evaluation using optimal power flows

    International Nuclear Information System (INIS)

    Muchayi, M.; El-Hawary, M. E.

    1998-01-01

    Wheeling is the transmission of electrical (real and reactive) power from a seller to a buyer through a transmission network owned by a third party. The wheeling rates are then the prices charged by the third party for the use of its network. This paper proposes and evaluates a strategy for pricing wheeled power using an algorithm that, in addition to the fuel cost of generation, incorporates the optimal allocation of the transmission system operating cost based on time-of-use pricing. The algorithm is implemented for the IEEE standard 14- and 30-bus systems and involves solving a modified optimal power flow problem iteratively. The basis of the proposed algorithm is the hourly spot price. The analysis spans a total time period of 24 hours. Unlike other algorithms that use DC models, the proposed model captures wheeling rates for both real and reactive power. Based on the evaluation, it was concluded that the model has the potential for wide application in calculating wheeling rates in a deregulated, competitive power transmission environment. 9 refs., 3 tabs
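    A greatly simplified, assumption-laden sketch of a spot-price-based wheeling charge: the nodal prices and wheeled power below are synthetic and no power flow is solved, so this only illustrates how hourly spot-price differences accumulate into a daily wheeling rate.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative 24-hour nodal spot prices ($/MWh) at the seller's injection bus
# and the buyer's withdrawal bus, plus the hourly wheeled power (MW).
price_in = 30 + 10 * np.sin(np.linspace(0, 2 * np.pi, 24)) + rng.normal(0, 1, 24)
price_out = price_in + rng.uniform(1, 4, 24)   # withdrawal bus is costlier
wheeled_mw = np.full(24, 50.0)

# A simplified spot-price-based wheeling charge: the buyer-seller nodal price
# difference times the wheeled energy, summed over the 24-hour horizon.
hourly_charge = (price_out - price_in) * wheeled_mw
print("total daily wheeling charge ($):", hourly_charge.sum().round(2))
```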

  13. Introduction to the Special Collection of Papers on the San Luis Basin Sustainability Metrics Project: A Methodology for Evaluating Regional Sustainability

    Science.gov (United States)

    This paper introduces a collection of four articles describing the San Luis Basin Sustainability Metrics Project. The Project developed a methodology for evaluating regional sustainability. This introduction provides the necessary background information for the project, descripti...

  14. Volume-based quantitative FDG PET/CT metrics and their association with optimal debulking and progression-free survival in patients with recurrent ovarian cancer undergoing secondary cytoreductive surgery

    Energy Technology Data Exchange (ETDEWEB)

    Vargas, H.A.; Burger, I.A.; Micco, M.; Sosa, R.E.; Weber, W.; Hricak, H.; Sala, E. [Memorial Sloan Kettering Cancer Center, Department of Radiology, New York, NY (United States); Goldman, D.A. [Memorial Sloan Kettering Cancer Center, Department of Epidemiology and Biostatistics, New York, NY (United States); Chi, D.S. [Memorial Sloan Kettering Cancer Center, Department of Surgery, New York, NY (United States)

    2015-11-15

    Our aim was to evaluate the associations between quantitative 18F-fluorodeoxyglucose positron-emission tomography (FDG-PET) uptake metrics, optimal debulking (OD) and progression-free survival (PFS) in patients with recurrent ovarian cancer undergoing secondary cytoreductive surgery. Fifty-five patients with recurrent ovarian cancer underwent FDG-PET/CT within 90 days prior to surgery. Standardized uptake values (SUVmax), metabolically active tumour volumes (MTV), and total lesion glycolysis (TLG) were measured on PET. Exact logistic regression, Kaplan-Meier curves and the log-rank test were used to assess associations between imaging metrics, OD and PFS. MTV (p = 0.0025) and TLG (p = 0.0043) were associated with OD; however, there was no significant association between SUVmax and debulking status (p = 0.83). Patients with an MTV above 7.52 mL and/or a TLG above 35.94 g had significantly shorter PFS (p = 0.0191 for MTV and p = 0.0069 for TLG). SUVmax was not significantly related to PFS (p = 0.10). PFS estimates at 3.5 years after surgery were 0.42 for patients with an MTV ≤ 7.52 mL and 0.19 for patients with an MTV > 7.52 mL; 0.46 for patients with a TLG ≤ 35.94 g and 0.15 for patients with a TLG > 35.94 g. FDG-PET metrics that reflect metabolic tumour burden are associated with optimal secondary cytoreductive surgery and progression-free survival in patients with recurrent ovarian cancer. (orig.)
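    A sketch of the survival-analysis step using the lifelines package on a hypothetical cohort, splitting patients at the MTV threshold reported above; the data are random placeholders, so only the workflow (Kaplan-Meier fit plus log-rank test), not the results, is meaningful.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(3)

# Hypothetical cohort: progression-free survival time (months), event indicator,
# and pre-surgical metabolically active tumour volume (MTV, mL).
n = 55
df = pd.DataFrame({
    "pfs_months": rng.exponential(24, n).round(1),
    "progressed": rng.integers(0, 2, n),
    "mtv_ml": rng.lognormal(2.0, 0.6, n),
})

high = df["mtv_ml"] > 7.52   # threshold taken from the abstract
kmf = KaplanMeierFitter()
kmf.fit(df.loc[~high, "pfs_months"], df.loc[~high, "progressed"], label="MTV <= 7.52 mL")
print(kmf.median_survival_time_)

result = logrank_test(
    df.loc[high, "pfs_months"], df.loc[~high, "pfs_months"],
    event_observed_A=df.loc[high, "progressed"],
    event_observed_B=df.loc[~high, "progressed"])
print("log-rank p-value:", result.p_value)
```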

  15. Volume-based quantitative FDG PET/CT metrics and their association with optimal debulking and progression-free survival in patients with recurrent ovarian cancer undergoing secondary cytoreductive surgery

    International Nuclear Information System (INIS)

    Vargas, H.A.; Burger, I.A.; Micco, M.; Sosa, R.E.; Weber, W.; Hricak, H.; Sala, E.; Goldman, D.A.; Chi, D.S.

    2015-01-01

    Our aim was to evaluate the associations between quantitative 18F-fluorodeoxyglucose positron-emission tomography (FDG-PET) uptake metrics, optimal debulking (OD) and progression-free survival (PFS) in patients with recurrent ovarian cancer undergoing secondary cytoreductive surgery. Fifty-five patients with recurrent ovarian cancer underwent FDG-PET/CT within 90 days prior to surgery. Standardized uptake values (SUVmax), metabolically active tumour volumes (MTV), and total lesion glycolysis (TLG) were measured on PET. Exact logistic regression, Kaplan-Meier curves and the log-rank test were used to assess associations between imaging metrics, OD and PFS. MTV (p = 0.0025) and TLG (p = 0.0043) were associated with OD; however, there was no significant association between SUVmax and debulking status (p = 0.83). Patients with an MTV above 7.52 mL and/or a TLG above 35.94 g had significantly shorter PFS (p = 0.0191 for MTV and p = 0.0069 for TLG). SUVmax was not significantly related to PFS (p = 0.10). PFS estimates at 3.5 years after surgery were 0.42 for patients with an MTV ≤ 7.52 mL and 0.19 for patients with an MTV > 7.52 mL; 0.46 for patients with a TLG ≤ 35.94 g and 0.15 for patients with a TLG > 35.94 g. FDG-PET metrics that reflect metabolic tumour burden are associated with optimal secondary cytoreductive surgery and progression-free survival in patients with recurrent ovarian cancer. (orig.)

  16. Evaluating the consequences of salmon nutrients for riparian organisms: Linking condition metrics to stable isotopes.

    Science.gov (United States)

    Vizza, Carmella; Sanderson, Beth L; Coe, Holly J; Chaloner, Dominic T

    2017-03-01

    Stable isotope ratios (δ13C and δ15N) have been used extensively to trace nutrients from Pacific salmon, but salmon transfer more than carbon and nitrogen to stream ecosystems, such as phosphorus, minerals, proteins, and lipids. To examine the importance of these nutrients, metrics other than isotopes need to be considered, particularly when so few studies have made direct links between these nutrients and how they affect riparian organisms. Our study specifically examined δ13C and δ15N of riparian organisms from salmon and non-salmon streams in Idaho, USA, at different distances from the streams, and examined whether the quality of riparian plants and the body condition of invertebrates varied with access to these nutrients. Overall, quality and condition metrics did not mirror stable isotope patterns. Most notably, all riparian organisms exhibited elevated δ15N in salmon streams, but also with proximity to both stream types suggesting that both salmon and landscape factors may affect δ15N. The amount of nitrogen incorporated from Pacific salmon was low for all organisms (1950s. In addition, our results support those of other studies that have cautioned that inferences from natural abundance isotope data, particularly in conjunction with mixing models for salmon-derived nutrient percentage estimates, may be confounded by biogeochemical transformations of nitrogen, physiological processes, and even historical legacies of nitrogen sources. Critically, studies should move beyond simply describing isotopic patterns to focusing on the consequences of salmon-derived nutrients by quantifying the condition and fitness of organisms putatively using those resources.

  17. Translating glucose variability metrics into the clinic via Continuous Glucose Monitoring: a Graphical User Interface for Diabetes Evaluation (CGM-GUIDE©).

    Science.gov (United States)

    Rawlings, Renata A; Shi, Hang; Yuan, Lo-Hua; Brehm, William; Pop-Busui, Rodica; Nelson, Patrick W

    2011-12-01

    Several metrics of glucose variability have been proposed to date, but an integrated approach that provides a complete and consistent assessment of glycemic variation is missing. As a consequence, and because of the tedious coding necessary during quantification, most investigators and clinicians have not yet adopted the use of multiple glucose variability metrics to evaluate glycemic variation. We compiled the most extensively used statistical techniques and glucose variability metrics, with adjustable hyper- and hypoglycemic limits and metric parameters, to create a user-friendly Continuous Glucose Monitoring Graphical User Interface for Diabetes Evaluation (CGM-GUIDE©). In addition, we introduce and demonstrate a novel transition density profile that emphasizes the dynamics of transitions between defined glucose states. Our combined dashboard of numerical statistics and graphical plots support the task of providing an integrated approach to describing glycemic variability. We integrated existing metrics, such as SD, area under the curve, and mean amplitude of glycemic excursion, with novel metrics such as the slopes across critical transitions and the transition density profile to assess the severity and frequency of glucose transitions per day as they move between critical glycemic zones. By presenting the above-mentioned metrics and graphics in a concise aggregate format, CGM-GUIDE provides an easy to use tool to compare quantitative measures of glucose variability. This tool can be used by researchers and clinicians to develop new algorithms of insulin delivery for patients with diabetes and to better explore the link between glucose variability and chronic diabetes complications.
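    For illustration, the snippet below computes a few of the named variability summaries (SD, area under the curve, and a simplified MAGE) from a synthetic CGM trace; the MAGE shown is a crude successive-difference approximation rather than the standard peak-to-nadir definition, and none of this reflects CGM-GUIDE's actual implementation.

```python
import numpy as np


def glucose_variability_metrics(t_minutes, glucose_mgdl):
    """Illustrative glucose variability summaries from a CGM trace
    (simplified versions of the metrics named in the abstract)."""
    g = np.asarray(glucose_mgdl, dtype=float)
    t = np.asarray(t_minutes, dtype=float)
    sd = g.std(ddof=1)
    auc = np.trapz(g, t)                      # area under the curve (mg/dL * min)
    # Simplified MAGE: mean absolute excursion among successive swings > 1 SD.
    diffs = np.diff(g)
    excursions = np.abs(diffs[np.abs(diffs) > sd])
    mage = excursions.mean() if excursions.size else 0.0
    return {"SD": sd, "AUC": auc, "MAGE_simplified": mage}


# 5-minute CGM samples over two hours (synthetic values).
t = np.arange(0, 120, 5)
g = 110 + 40 * np.sin(t / 30.0) + np.random.default_rng(4).normal(0, 5, t.size)
print(glucose_variability_metrics(t, g))
```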

  18. A GOAL QUESTION METRIC (GQM) APPROACH FOR EVALUATING INTERACTION DESIGN PATTERNS IN DRAWING GAMES FOR PRESCHOOL CHILDREN

    Directory of Open Access Journals (Sweden)

    Dana Sulistiyo Kusumo

    2017-06-01

    Full Text Available In recent years, there has been increasing interest in using smart devices' drawing games for educational benefit. In Indonesia, our government classifies children aged four to six years as preschool children. Not all preschool children can use drawing games easily. Further, drawing games may not fulfill all of Indonesia's preschool drawing competencies. This research proposes using the Goal Question Metric (GQM) approach to investigate and evaluate interaction design patterns for preschool children, in order to achieve the drawing competencies for preschool children in two Android-based drawing games: Belajar Menggambar (in English: Learn to Draw) and Coret: Belajar Menggambar (in English: Scratch: Learn to Draw). We collected data from nine students of a preschool education institution in a user study. The results show that GQM can assist in evaluating interaction design patterns for achieving the drawing competencies. Our approach can also yield interaction design patterns by comparing the interaction design patterns in the two drawing games used.

  19. Evaluating social media's capacity to develop engaged audiences in health promotion settings: use of Twitter metrics as a case study.

    Science.gov (United States)

    Neiger, Brad L; Thackeray, Rosemary; Burton, Scott H; Giraud-Carrier, Christophe G; Fagen, Michael C

    2013-03-01

    Use of social media in health promotion and public health continues to grow in popularity, though most of what is reported in the literature represents one-way messaging devoid of the attributes associated with engagement, a core attribute, if not the central purpose, of social media. This article defines engagement, describes its value in maximizing the potential of social media in health promotion, proposes an evaluation hierarchy for social media engagement, and uses Twitter as a case study to illustrate how the hierarchy might function in practice. Partnership and participation are proposed as the culminating outcomes of social media use in health promotion. As use of social media in health promotion moves toward this end, evaluation metrics that verify progress and inform subsequent strategies will become increasingly important.

  20. A comparison of evaluation metrics for biomedical journals, articles, and websites in terms of sensitivity to topic.

    Science.gov (United States)

    Fu, Lawrence D; Aphinyanaphongs, Yindalon; Wang, Lily; Aliferis, Constantin F

    2011-08-01

    Evaluating the biomedical literature and health-related websites for quality poses challenging information retrieval tasks. Currently, commonly used methods include the impact factor for journals, PubMed's clinical query filters and machine-learning-based filter models for articles, and PageRank for websites. Previous work has focused on the average performance of these methods without considering the topic, and it is unknown how performance varies for specific topics or focused searches. Clinicians, researchers, and users should be aware when expected performance is not achieved for specific topics. The present work analyzes the behavior of these methods for a variety of topics. Impact factor, clinical query filters, and PageRank vary widely across different topics, while a topic-specific impact factor and machine-learning-based filter models are more stable. The results demonstrate that a method may perform excellently on average but struggle when used on a number of narrower topics. Topic-adjusted metrics and other topic-robust methods have an advantage in such situations. Users of traditional topic-sensitive metrics should be aware of their limitations. Copyright © 2011 Elsevier Inc. All rights reserved.
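    To make the idea of a topic-specific impact factor concrete, here is a toy computation restricted to articles sharing a topic tag; the records, fields, and citation-window handling are hypothetical simplifications, not the metric as operationalized in the paper.

```python
from collections import defaultdict

# Hypothetical article records: journal, topic tag, and citations received in
# the census year to items published in the two preceding years.
articles = [
    {"journal": "J Cardio", "topic": "heart failure", "citations": 12},
    {"journal": "J Cardio", "topic": "arrhythmia", "citations": 3},
    {"journal": "J Cardio", "topic": "heart failure", "citations": 8},
    {"journal": "Gen Med", "topic": "heart failure", "citations": 2},
    {"journal": "Gen Med", "topic": "diabetes", "citations": 9},
]


def topic_impact_factor(records, topic):
    """Impact-factor-style ratio restricted to articles tagged with one topic."""
    cites = defaultdict(int)
    counts = defaultdict(int)
    for r in records:
        if r["topic"] == topic:
            cites[r["journal"]] += r["citations"]
            counts[r["journal"]] += 1
    return {j: cites[j] / counts[j] for j in counts}


print(topic_impact_factor(articles, "heart failure"))
```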

  1. A Comparison of Evaluation Metrics for Biomedical Journals, Articles, and Websites in Terms of Sensitivity to Topic

    Science.gov (United States)

    Fu, Lawrence D.; Aphinyanaphongs, Yindalon; Wang, Lily; Aliferis, Constantin F.

    2011-01-01

    Evaluating the biomedical literature and health-related websites for quality poses challenging information retrieval tasks. Currently, commonly used methods include the impact factor for journals, PubMed's clinical query filters and machine-learning-based filter models for articles, and PageRank for websites. Previous work has focused on the average performance of these methods without considering the topic, and it is unknown how performance varies for specific topics or focused searches. Clinicians, researchers, and users should be aware when expected performance is not achieved for specific topics. The present work analyzes the behavior of these methods for a variety of topics. Impact factor, clinical query filters, and PageRank vary widely across different topics, while a topic-specific impact factor and machine-learning-based filter models are more stable. The results demonstrate that a method may perform excellently on average but struggle when used on a number of narrower topics. Topic-adjusted metrics and other topic-robust methods have an advantage in such situations. Users of traditional topic-sensitive metrics should be aware of their limitations. PMID:21419864

  2. Evaluation metrics for the practical application of URREF ontology: An illustration on data criteria

    CSIR Research Space (South Africa)

    De Villiers, Johan P

    2017-08-01

    Full Text Available The International Society of Information Fusion (ISIF) Evaluation Techniques for Uncertainty Representation Working Group (ETURWG) investigates the quantification and evaluation of all types of uncertainty regarding the inputs, reasoning and outputs...

  3. Synoptic evaluation of scale-dependent metrics for hydrographic line feature geometry

    Science.gov (United States)

    Stanislawski, Larry V.; Buttenfield, Barbara P.; Raposo, Paulo; Cameron, Madeline; Falgout, Jeff T.

    2015-01-01

    conterminous United States and compared to topographic metrics. A concurrent processing workflow is implemented using a Linux high-performance computing cluster to simultaneously process multiple subbasins, and thereby complete the work in a fraction of the time required for a single-process environment. In addition, similar metrics are generated for several levels of simplification of the hydrographic features to quantify the effects of simplification over the various landscape conditions. Objectives of this exploratory investigation are to quantify geometric characteristics of linear hydrographic features over the various terrain conditions within the conterminous United States and thereby illuminate relations between stream geomorphological conditions and cartographic representation. The synoptic view of these characteristics over regional watersheds that is afforded through concurrent processing, in conjunction with terrain conditions, may reveal patterns for classifying cartographic stream features into stream geomorphological classes. Furthermore, the synoptic measurement of the amount of change in geometric characteristics caused by the several levels of simplification can enable estimation of tolerance values that appropriately control simplification-induced geometric change of the cartographic features within the various geomorphological classes in the country. Hence, these empirically derived rules or relations could help generate multiscale-representations of features through automated generalization that adequately maintain surface drainage variations and patterns reflective of the natural stream geomorphological conditions across the country.

  4. Multimetric indices: How many metrics?

    Science.gov (United States)

    Multimetric indices (MMI’s) often include 5 to 15 metrics, each representing a different attribute of assemblage condition, such as species diversity, tolerant taxa, and nonnative taxa. Is there an optimal number of metrics for MMIs? To explore this question, I created 1000 9-met...

  5. Soil evaluation for land use optimizing

    Science.gov (United States)

    Marinina, O. A.

    2018-01-01

    The article presents a soil classification method proposed in the course of the study, in which the list of indicators suggested by existing recommendations is optimized. Zoning of the territory was carried out using the example of one of the river basins within the boundaries of the Belgorod region. With this approach, the boundaries of the territorial zones are drawn along the natural boundaries of natural objects, and soil productivity is taken as the main zoning criterion. To assess the territory by soil properties, the features of the soil cover of the river basin were studied and the boundaries of soil varieties were vectorized. The land evaluation included the macro- and micronutrient elements essential and useful for crop growth. To compare the soils, each indicator was converted into relative units. The final soil quality score is calculated as the geometric mean of scores from 0 to 100 points for the selected diagnostic features. By overlaying the soil classification results with the land management activities proposed under the basin nature management concept, five zones were identified according to their degree of suitability for agricultural use.
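    The final scoring step described above reduces to a geometric mean of per-feature scores, which can be computed directly; the feature list and values below are hypothetical.

```python
import numpy as np
from scipy.stats import gmean

# Hypothetical per-feature scores (0-100) for one soil unit: e.g., humus content,
# pH, available phosphorus, available potassium, texture suitability.
scores = np.array([82.0, 64.0, 55.0, 71.0, 90.0])

# Final soil quality score as the geometric mean of the diagnostic-feature scores.
final_score = gmean(scores)
print(round(final_score, 1))   # a single 0-100 quality score for the unit
```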

  6. Metric learning

    CERN Document Server

    Bellet, Aurelien; Sebban, Marc

    2015-01-01

    Similarity between objects plays an important role in both human cognitive processes and artificial systems for recognition and categorization. How to appropriately measure such similarities for a given task is crucial to the performance of many machine learning, pattern recognition and data mining methods. This book is devoted to metric learning, a set of techniques to automatically learn similarity and distance functions from data that has attracted a lot of interest in machine learning and related fields in the past ten years. In this book, we provide a thorough review of the metric learnin

  7. Text Summarization Evaluation: Correlating Human Performance on an Extrinsic Task with Automatic Intrinsic Metrics

    National Research Council Canada - National Science Library

    President, Stacy F; Dorr, Bonnie J

    2006-01-01

    This research describes two types of summarization evaluation methods, intrinsic and extrinsic, and concentrates on determining the level of correlation between automatic intrinsic methods and human...

  8. A suite of standard post-tagging evaluation metrics can help assess tag retention for field-based fish telemetry research

    Science.gov (United States)

    Gerber, Kayla M.; Mather, Martha E.; Smith, Joseph M.

    2017-01-01

    Telemetry can inform many scientific and research questions if a context exists for integrating individual studies into the larger body of literature. Creating cumulative distributions of post-tagging evaluation metrics would allow individual researchers to relate their telemetry data to other studies. Widespread reporting of standard metrics is a precursor to the calculation of benchmarks for these distributions (e.g., mean, SD, 95% CI). Here we illustrate five types of standard post-tagging evaluation metrics using acoustically tagged Blue Catfish (Ictalurus furcatus) released into a Kansas reservoir. These metrics included: (1) percent of tagged fish detected overall, (2) percent of tagged fish detected daily using abacus plot data, (3) average number of (and percent of available) receiver sites visited, (4) date of last movement between receiver sites (and percent of tagged fish moving during that time period), and (5) number (and percent) of fish that egressed through exit gates. These metrics were calculated for one to three time periods: early (of the study (5 months). Over three-quarters of our tagged fish were detected early (85%) and at the end (85%) of the study. Using abacus plot data, all tagged fish (100%) were detected at least one day and 96% were detected for > 5 days early in the study. On average, tagged Blue Catfish visited 9 (50%) and 13 (72%) of 18 within-reservoir receivers early and at the end of the study, respectively. At the end of the study, 73% of all tagged fish were detected moving between receivers. Creating statistical benchmarks for individual metrics can provide useful reference points. In addition, combining multiple metrics can inform ecology and research design. Consequently, individual researchers and the field of telemetry research can benefit from widespread, detailed, and standard reporting of post-tagging detection metrics.
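    A small pandas sketch of how the first three standard metrics could be computed from a detection log; the log, tag counts, and receiver count are invented, and the time-period splits used in the study are omitted.

```python
import pandas as pd

# Hypothetical detection log: one row per receiver detection of a tagged fish.
detections = pd.DataFrame({
    "fish_id": [1, 1, 2, 2, 2, 3, 4, 4],
    "receiver": ["R01", "R05", "R01", "R02", "R09", "R05", "R01", "R01"],
    "date": pd.to_datetime([
        "2016-05-01", "2016-05-03", "2016-05-01", "2016-05-02",
        "2016-05-20", "2016-05-02", "2016-05-01", "2016-05-15"]),
})
n_tagged = 5          # fish released with transmitters
n_receivers = 18      # receivers available in the reservoir

# (1) Percent of tagged fish detected overall.
pct_detected = 100 * detections["fish_id"].nunique() / n_tagged

# (2) Days detected per fish (abacus-plot style summary).
days_detected = detections.groupby("fish_id")["date"].nunique()

# (3) Average number (and percent of available) receiver sites visited per fish.
sites_visited = detections.groupby("fish_id")["receiver"].nunique()
avg_sites = sites_visited.mean()

print(f"{pct_detected:.0f}% of tagged fish detected")
print(days_detected)
print(f"average sites visited: {avg_sites:.1f} ({100 * avg_sites / n_receivers:.0f}%)")
```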

  9. A suite of standard post-tagging evaluation metrics can help assess tag retention for field-based fish telemetry research

    Science.gov (United States)

    Gerber, Kayla M.; Mather, Martha E.; Smith, Joseph M.

    2017-01-01

    Telemetry can inform many scientific and research questions if a context exists for integrating individual studies into the larger body of literature. Creating cumulative distributions of post-tagging evaluation metrics would allow individual researchers to relate their telemetry data to other studies. Widespread reporting of standard metrics is a precursor to the calculation of benchmarks for these distributions (e.g., mean, SD, 95% CI). Here we illustrate five types of standard post-tagging evaluation metrics using acoustically tagged Blue Catfish (Ictalurus furcatus) released into a Kansas reservoir. These metrics included: (1) percent of tagged fish detected overall, (2) percent of tagged fish detected daily using abacus plot data, (3) average number of (and percent of available) receiver sites visited, (4) date of last movement between receiver sites (and percent of tagged fish moving during that time period), and (5) number (and percent) of fish that egressed through exit gates. These metrics were calculated for one to three time periods: early ( 5 days early in the study. On average, tagged Blue Catfish visited 9 (50%) and 13 (72%) of 18 within-reservoir receivers early and at the end of the study, respectively. At the end of the study, 73% of all tagged fish were detected moving between receivers. Creating statistical benchmarks for individual metrics can provide useful reference points. In addition, combining multiple metrics can inform ecology and research design. Consequently, individual researchers and the field of telemetry research can benefit from widespread, detailed, and standard reporting of post-tagging detection metrics.

  10. A farm platform approach to optimizing temperate grazing-livestock systems: metrics for trade-off assessments and future innovations

    Science.gov (United States)

    Harris, Paul; Takahashi, Taro; Blackwell, Martin; Cardenas, Laura; Collins, Adrian; Dungait, Jennifer; Eisler, Mark; Hawkins, Jane; Misselbrook, Tom; Mcauliffe, Graham; Mcfadzean, Jamie; Murray, Phil; Orr, Robert; Jordana Rivero, M.; Wu, Lianhai; Lee, Michael

    2017-04-01

    data on hydrology, emissions, nutrient cycling, biodiversity, productivity and livestock welfare/health for 2 years (April 2011 to March 2013). Since April 2013, the platform has been progressively modified across three distinct ca. 22 ha farmlets with the underlying principle being to improve the sustainability (economic, social and environmental) by comparing contrasting pasture-based systems (permanent pasture, grass and clover swards, and reseeding of high quality germplasm on a regular cycle). This modification or transitional period ended in July 2015, when the platform assumed full post-baseline status. In this paper, we summarise the sustainability trade-off metrics developed to compare the three systems, together with the farm platform data collections used to create them; collections that can be viewed as 'big data' when considered in their entirety. We concentrate on the baseline and transitional periods and discuss the potential innovations to optimise grazing livestock systems utilising an experimental farm platform approach.

  11. An Abstract Process and Metrics Model for Evaluating Unified Command and Control: A Scenario and Technology Agnostic Approach

    Science.gov (United States)

    2004-06-01

    Table-of-contents fragments only: EBO Cognitive or Memetic input type; Unanticipated EBO generated...; Memetic Effects Based COA; Policy; Belief systems or Memetic Content Metrics.

  12. Introduction to the special collection of papers on the San Luis Basin Sustainability Metrics Project: a methodology for evaluating regional sustainability.

    Science.gov (United States)

    Heberling, Matthew T; Hopton, Matthew E

    2012-11-30

    This paper introduces a collection of four articles describing the San Luis Basin Sustainability Metrics Project. The Project developed a methodology for evaluating regional sustainability. This introduction provides the necessary background information for the project, description of the region, overview of the methods, and summary of the results. Although there are a multitude of scientifically based sustainability metrics, many are data intensive, difficult to calculate, and fail to capture all aspects of a system. We wanted to see if we could develop an approach that decision-makers could use to understand if their system was moving toward or away from sustainability. The goal was to produce a scientifically defensible, but straightforward and inexpensive methodology to measure and monitor environmental quality within a regional system. We initiated an interdisciplinary pilot project in the San Luis Basin, south-central Colorado, to test the methodology. The objectives were: 1) determine the applicability of using existing datasets to estimate metrics of sustainability at a regional scale; 2) calculate metrics through time from 1980 to 2005; and 3) compare and contrast the results to determine if the system was moving toward or away from sustainability. The sustainability metrics, chosen to represent major components of the system, were: 1) Ecological Footprint to capture the impact and human burden on the system; 2) Green Net Regional Product to represent economic welfare; 3) Emergy to capture the quality-normalized flow of energy through the system; and 4) Fisher information to capture the overall dynamic order and to look for possible regime changes. The methodology, data, and results of each metric are presented in the remaining four papers of the special collection. Based on the results of each metric and our criteria for understanding the sustainability trends, we find that the San Luis Basin is moving away from sustainability. Although we understand

  13. 77 FR 72435 - Pipeline Safety: Using Meaningful Metrics in Conducting Integrity Management Program Evaluations

    Science.gov (United States)

    2012-12-05

    ... effectiveness of their integrity management programs. Program evaluation is one of the key required program... activities that are in place to control risk. These measures indicate how well an operator is implementing... outcome is being achieved or not, despite the risk control activities in place. Failure Measures that...

  14. Developing and evaluating a target-background similarity metric for camouflage detection.

    Directory of Open Access Journals (Sweden)

    Chiuhsiang Joe Lin

    Full Text Available BACKGROUND: Measurement of camouflage performance is of fundamental importance for military stealth applications. The goal of camouflage assessment algorithms is to automatically assess the effect of camouflage in agreement with human detection responses. In a previous study, we found that the Universal Image Quality Index (UIQI) correlated well with the psychophysical measures and could potentially serve as a camouflage assessment tool. METHODOLOGY: In this study, we want to quantify the relationship between the camouflage similarity index and the psychophysical results. We compare several image quality indexes for computational evaluation of camouflage effectiveness, and present the results of an extensive human visual experiment conducted to evaluate the performance of several camouflage assessment algorithms and analyze the strengths and weaknesses of these algorithms. SIGNIFICANCE: The experimental data demonstrate the effectiveness of the approach, and the correlation coefficient of the UIQI was higher than those of the other methods. The approach was highly correlated with the human target-searching results. It also showed that this is an objective and effective camouflage performance evaluation method because it considers the human visual system and image structure, which makes it consistent with the subjective evaluation results.
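    For reference, a single-window implementation of the Universal Image Quality Index mentioned above; in practice UIQI is usually computed over sliding windows and averaged, and the patches here are synthetic.

```python
import numpy as np


def uiqi(x, y):
    """Universal Image Quality Index (Wang & Bovik) between two equally sized
    grayscale patches, e.g. a camouflaged target and its background."""
    x = np.asarray(x, dtype=float).ravel()
    y = np.asarray(y, dtype=float).ravel()
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(ddof=1), y.var(ddof=1)
    cov = np.cov(x, y, ddof=1)[0, 1]
    return 4 * cov * mx * my / ((vx + vy) * (mx**2 + my**2))


rng = np.random.default_rng(5)
background = rng.uniform(0, 255, (32, 32))
target = background + rng.normal(0, 10, (32, 32))   # well-camouflaged target
print(round(uiqi(target, background), 3))            # closer to 1 = more similar
```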

  15. Developing and evaluating a target-background similarity metric for camouflage detection.

    Science.gov (United States)

    Lin, Chiuhsiang Joe; Chang, Chi-Chan; Liu, Bor-Shong

    2014-01-01

    Measurement of camouflage performance is of fundamental importance for military stealth applications. The goal of camouflage assessment algorithms is to automatically assess the effect of camouflage in agreement with human detection responses. In a previous study, we found that the Universal Image Quality Index (UIQI) correlated well with the psychophysical measures and could potentially serve as a camouflage assessment tool. In this study, we want to quantify the relationship between the camouflage similarity index and the psychophysical results. We compare several image quality indexes for computational evaluation of camouflage effectiveness, and present the results of an extensive human visual experiment conducted to evaluate the performance of several camouflage assessment algorithms and analyze the strengths and weaknesses of these algorithms. The experimental data demonstrate the effectiveness of the approach, and the correlation coefficient of the UIQI was higher than those of the other methods. This approach was highly correlated with the human target-searching results. It also showed that this is an objective and effective camouflage performance evaluation method because it considers the human visual system and image structure, which makes it consistent with the subjective evaluation results.

  16. Deep Transfer Metric Learning.

    Science.gov (United States)

    Junlin Hu; Jiwen Lu; Yap-Peng Tan; Jie Zhou

    2016-12-01

    Conventional metric learning methods usually assume that the training and test samples are captured in similar scenarios so that their distributions are assumed to be the same. This assumption does not hold in many real visual recognition applications, especially when samples are captured across different data sets. In this paper, we propose a new deep transfer metric learning (DTML) method to learn a set of hierarchical nonlinear transformations for cross-domain visual recognition by transferring discriminative knowledge from the labeled source domain to the unlabeled target domain. Specifically, our DTML learns a deep metric network by maximizing the inter-class variations and minimizing the intra-class variations, and minimizing the distribution divergence between the source domain and the target domain at the top layer of the network. To better exploit the discriminative information from the source domain, we further develop a deeply supervised transfer metric learning (DSTML) method by including an additional objective on DTML, where the output of both the hidden layers and the top layer are optimized jointly. To preserve the local manifold of input data points in the metric space, we present two new methods, DTML with autoencoder regularization and DSTML with autoencoder regularization. Experimental results on face verification, person re-identification, and handwritten digit recognition validate the effectiveness of the proposed methods.
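    An illustrative, non-deep sketch of the ingredients named above (intra-class compactness, inter-class separation, and a source/target distribution term, here an RBF-kernel MMD) combined into a single transfer-metric-learning-style loss on fixed embeddings; this is not the DTML network or training procedure itself.

```python
import numpy as np


def pairwise_sq_dists(a, b):
    return ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)


def mmd_rbf(source, target, gamma=1.0):
    """Squared maximum mean discrepancy with an RBF kernel (distribution divergence)."""
    k_ss = np.exp(-gamma * pairwise_sq_dists(source, source)).mean()
    k_tt = np.exp(-gamma * pairwise_sq_dists(target, target)).mean()
    k_st = np.exp(-gamma * pairwise_sq_dists(source, target)).mean()
    return k_ss + k_tt - 2 * k_st


def transfer_metric_loss(emb_src, labels_src, emb_tgt, alpha=1.0, beta=0.1):
    """Pull same-class source embeddings together, push different-class ones apart,
    and align the source/target embedding distributions via MMD."""
    d = pairwise_sq_dists(emb_src, emb_src)
    same = labels_src[:, None] == labels_src[None, :]
    intra = d[same].mean()
    inter = d[~same].mean()
    return intra - alpha * inter + beta * mmd_rbf(emb_src, emb_tgt)


rng = np.random.default_rng(6)
emb_src = rng.normal(size=(40, 16))
labels_src = rng.integers(0, 4, 40)
emb_tgt = rng.normal(loc=0.5, size=(30, 16))
print(transfer_metric_loss(emb_src, labels_src, emb_tgt))
```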

  17. Evaluation of different set-up error corrections on dose-volume metrics in prostate IMRT using CBCT images

    International Nuclear Information System (INIS)

    Hirose, Yoshinori; Tomita, Tsuneyuki; Kitsuda, Kenji; Notogawa, Takuya; Miki, Katsuhito; Nakamura, Mitsuhiro; Nakamura, Kiyonao; Ishigaki, Takashi

    2014-01-01

    We investigated the effect of different set-up error corrections on dose-volume metrics in intensity-modulated radiotherapy (IMRT) for prostate cancer under different planning target volume (PTV) margin settings using cone-beam computed tomography (CBCT) images. A total of 30 consecutive patients who underwent IMRT for prostate cancer were retrospectively analysed, and 7-14 CBCT datasets were acquired per patient. Interfractional variations in dose-volume metrics were evaluated under six different set-up error corrections, including tattoo, bony anatomy, and four different target matching groups. Set-up errors were incorporated into planning the isocenter position, and dose distributions were recalculated on CBCT images. These processes were repeated under two different PTV margin settings. In the on-line bony anatomy matching groups, systematic error (Σ) was 0.3 mm, 1.4 mm, and 0.3 mm in the left-right, anterior-posterior (AP), and superior-inferior directions, respectively. Σ in three successive off-line target matchings was finally comparable with that in the on-line bony anatomy matching in the AP direction. Although doses to the rectum and bladder wall were reduced for a small PTV margin, averaged reductions in the volume receiving 100% of the prescription dose from planning were within 2.5% under all PTV margin settings for all correction groups, with the exception of the tattoo set-up error correction only (≥ 5.0%). Analysis of variance showed no significant difference between on-line bony anatomy matching and target matching. While variations between the planned and delivered doses were smallest when target matching was applied, the use of bony anatomy matching still ensured the planned doses. (author)

  18. [Evaluation of the Health Observatory of Asturias (Spain): web and social network metrics and health professionals' opinions].

    Science.gov (United States)

    Casajuana Kögel, Cristina; Cofiño, Rafael; López, María José

    2014-01-01

    To evaluate the Health Observatory of Asturias (Observatorio de Salud de Asturias [OBSA]), which collects and disseminates health data from Asturias through a website and social networks. A cross-sectional study was conducted between 2012 and 2013. The study included a process evaluation that analyzed the reach of the OBSA's website, Facebook and Twitter accounts through web metrics and the use made by health professionals in Asturias of these media. Satisfaction was assessed through an online questionnaire. To estimate the potential effects of the OBSA, the study also included an evaluation of the results with a non-experimental design. The total number of visits to the website increased in 2012, with more than 37,000 visits. The questionnaire (n=43) showed that 72.1% of the health professionals knew of the OBSA and that 81.5% of them had used it. Most health professionals reported they were satisfied with the OBSA and believed that it encouraged cooperation among professionals (51.6%). The OBSA is known and consulted by most health professionals and is achieving some of its main objectives: to inform health staff and stimulate discussion. According to the results, information and communication technologies could play an important role in the presentation of health data in a more interactive and accessible way. Copyright © 2013 SESPAS. Published by Elsevier Espana. All rights reserved.

  19. New Metrics for Evaluating Technical Benefits and Risks of DGs Increasing Penetration

    DEFF Research Database (Denmark)

    Akbari, Mohammad Amin; Aghaei, Jamshid; Barani, Mostafa

    2017-01-01

    and risks should be qualified and quantified. This paper introduces several probabilistic indices to evaluate the potential operational effects of increasing penetration of renewable DG units, such as wind power and photovoltaics, on a rural distribution network, with the aid of evaluating technical benefits and risks trade-offs. A probabilistic generation-load model is suggested to calculate these indices, which combine a large number of possible operating conditions of renewable DG units with their probabilities. Temporal and annual indices of voltage profile and line flow related attributes, such as Interest Voltage Rise (IVR), Risky Voltage Rise (RVR), Risky Voltage Down (RVD), Line Loss Reduction (LLR), Line Loss Increment (LLI) and Line Overload Flow (LOF), are introduced using the probability and expected values of their occurrence. Also, to measure the overall interests and risks of installing DG, composite...

  20. Application of Fuzzy TOPSIS for evaluating machining techniques using sustainability metrics

    Science.gov (United States)

    Digalwar, Abhijeet K.

    2018-04-01

    Sustainable processes and techniques have received increasing attention over the last few decades due to rising concerns over the environment, an improved focus on productivity, and increasingly stringent environmental and occupational health and safety norms. The present work analyzes the research on sustainable machining techniques and identifies the techniques and parameters on which the sustainability of a process is evaluated. Based on the analysis, these parameters are then adopted as criteria to evaluate different sustainable machining techniques, such as Cryogenic Machining, Dry Machining, Minimum Quantity Lubrication (MQL) and High Pressure Jet Assisted Machining (HPJAM), using a fuzzy TOPSIS framework. In order to facilitate easy arithmetic, the linguistic variables represented by fuzzy numbers are transformed into crisp numbers based on graded mean representation. Cryogenic machining was found to be the best alternative sustainable technique as per the fuzzy TOPSIS framework adopted. The paper provides a method to deal with multi-criteria decision-making problems in a complex and linguistic environment.
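    A compact sketch of the defuzzify-then-rank pipeline described above: triangular fuzzy ratings are collapsed with the graded mean representation (l + 4m + u)/6 and then ranked with ordinary TOPSIS; the alternatives match the abstract, but the criteria, ratings, and weights are invented.

```python
import numpy as np

# Hypothetical linguistic ratings as triangular fuzzy numbers (l, m, u) for four
# machining techniques against three criteria (e.g., energy use, coolant impact,
# tool life); all values are illustrative.
ratings = np.array([
    [[7, 9, 10], [5, 7, 9],  [7, 9, 10]],   # cryogenic machining
    [[3, 5, 7],  [7, 9, 10], [3, 5, 7]],    # dry machining
    [[5, 7, 9],  [5, 7, 9],  [5, 7, 9]],    # MQL
    [[5, 7, 9],  [3, 5, 7],  [7, 9, 10]],   # HPJAM
], dtype=float)
weights = np.array([0.4, 0.3, 0.3])
benefit = np.array([True, True, True])      # all criteria treated as benefit-type here

# Graded mean representation (l + 4m + u) / 6 turns fuzzy ratings into crisp scores.
crisp = (ratings[..., 0] + 4 * ratings[..., 1] + ratings[..., 2]) / 6

# Standard (crisp) TOPSIS on the defuzzified matrix.
norm = crisp / np.sqrt((crisp ** 2).sum(axis=0))
v = norm * weights
ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
d_plus = np.sqrt(((v - ideal) ** 2).sum(axis=1))
d_minus = np.sqrt(((v - anti) ** 2).sum(axis=1))
closeness = d_minus / (d_plus + d_minus)
for name, c in zip(["Cryogenic", "Dry", "MQL", "HPJAM"], closeness):
    print(f"{name}: {c:.3f}")
```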

  1. Novel evaluation metrics for sparse spatio-temporal point process hotspot predictions - a crime case study

    OpenAIRE

    Adepeju, M.; Rosser, G.; Cheng, T.

    2016-01-01

    Many physical and sociological processes are represented as discrete events in time and space. These spatio-temporal point processes are often sparse, meaning that they cannot be aggregated and treated with conventional regression models. Models based on the point process framework may be employed instead for prediction purposes. Evaluating the predictive performance of these models poses a unique challenge, as the same sparseness prevents the use of popular measures such as the root mean squ...

  2. A Metrics-Based Approach to Intrusion Detection System Evaluation for Distributed Real-Time Systems

    Science.gov (United States)

    2002-04-01

    Based Approach to Intrusion Detection System Evaluation for Distributed Real-Time Systems. Authors: G. A. Fink, B. L. Chappell, T. G. Turner, and...Distributed, Security. 1 Introduction. Processing and cost requirements are driving future naval combat platforms to use distributed, real-time systems of...distributed, real-time systems. As these systems grow more complex, the timing requirements do not diminish; indeed, they may become more constrained

  3. Metrication manual

    International Nuclear Information System (INIS)

    Harper, A.F.A.; Digby, R.B.; Thong, S.P.; Lacey, F.

    1978-04-01

    In April 1978 a meeting of senior metrication officers convened by the Commonwealth Science Council of the Commonwealth Secretariat, was held in London. The participants were drawn from Australia, Bangladesh, Britain, Canada, Ghana, Guyana, India, Jamaica, Papua New Guinea, Solomon Islands and Trinidad and Tobago. Among other things, the meeting resolved to develop a set of guidelines to assist countries to change to SI and to compile such guidelines in the form of a working manual

  4. Translating Glucose Variability Metrics into the Clinic via Continuous Glucose Monitoring: A Graphical User Interface for Diabetes Evaluation (CGM-GUIDE©)

    Science.gov (United States)

    Rawlings, Renata A.; Shi, Hang; Yuan, Lo-Hua; Brehm, William; Pop-Busui, Rodica

    2011-01-01

    Background: Several metrics of glucose variability have been proposed to date, but an integrated approach that provides a complete and consistent assessment of glycemic variation is missing. As a consequence, and because of the tedious coding necessary during quantification, most investigators and clinicians have not yet adopted the use of multiple glucose variability metrics to evaluate glycemic variation. Methods: We compiled the most extensively used statistical techniques and glucose variability metrics, with adjustable hyper- and hypoglycemic limits and metric parameters, to create a user-friendly Continuous Glucose Monitoring Graphical User Interface for Diabetes Evaluation (CGM-GUIDE©). In addition, we introduce and demonstrate a novel transition density profile that emphasizes the dynamics of transitions between defined glucose states. Results: Our combined dashboard of numerical statistics and graphical plots support the task of providing an integrated approach to describing glycemic variability. We integrated existing metrics, such as SD, area under the curve, and mean amplitude of glycemic excursion, with novel metrics such as the slopes across critical transitions and the transition density profile to assess the severity and frequency of glucose transitions per day as they move between critical glycemic zones. Conclusions: By presenting the above-mentioned metrics and graphics in a concise aggregate format, CGM-GUIDE provides an easy to use tool to compare quantitative measures of glucose variability. This tool can be used by researchers and clinicians to develop new algorithms of insulin delivery for patients with diabetes and to better explore the link between glucose variability and chronic diabetes complications. PMID:21932986

  5. Quantitative metrics for evaluating the phased roll-out of clinical information systems.

    Science.gov (United States)

    Wong, David; Wu, Nicolas; Watkinson, Peter

    2017-09-01

    We introduce a novel quantitative approach for evaluating the order of roll-out during the phased introduction of clinical information systems. Such roll-outs are associated with unavoidable risk due to patients transferring between clinical areas using both the old and new systems. We propose a simple graphical model of patient flow through a hospital. Using a simple instance of the model, we show how a roll-out order can be generated by minimising the flow of patients from the new system to the old system. The model was applied to admission and discharge data acquired from 37,080 patient journeys at the Churchill Hospital, Oxford between April 2013 and April 2014. The resulting orders were evaluated empirically and found to be acceptable. The development of data-driven approaches to clinical information system roll-out provides insights that may not necessarily be ascertained through clinical judgment alone. Such methods could make a significant contribution to the smooth running of an organisation during the roll-out of a potentially disruptive technology. Unlike previous approaches, which are based on clinical opinion, the approach described here quantitatively assesses the appropriateness of competing roll-out strategies. The data-driven approach was shown to produce strategies that matched clinical intuition and provides a flexible framework that may be used to plan and monitor clinical information system roll-out. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
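    One way to realize the idea of minimising new-to-old patient flow is a greedy ordering over a ward-to-ward transfer matrix, sketched below; the wards, counts, and the greedy heuristic itself are assumptions for illustration, not the authors' graphical model.

```python
import numpy as np

# Hypothetical daily transfer counts between four clinical areas
# (rows = from, cols = to); values are illustrative only.
wards = ["ED", "AcuteMed", "Surgery", "ICU"]
transfers = np.array([
    [0, 30, 10, 5],
    [4, 0, 6, 3],
    [2, 5, 0, 4],
    [1, 6, 3, 0],
])


def greedy_rollout_order(flow):
    """Greedily pick, at each step, the not-yet-migrated area that sends the
    fewest patients to the remaining (old-system) areas, so flow from the
    new system back to the old system is kept small."""
    remaining = list(range(len(flow)))
    order = []
    while remaining:
        # patients each candidate would send to areas still on the old system
        cost = {w: sum(flow[w][r] for r in remaining if r != w) for w in remaining}
        nxt = min(cost, key=cost.get)
        order.append(nxt)
        remaining.remove(nxt)
    return order


order = greedy_rollout_order(transfers)
print([wards[i] for i in order])
```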

  6. Development, nutritional evaluation and optimization of instant ...

    African Journals Online (AJOL)

    In this study, different instant porridges were formulated from broken fractions of rice blended with bambara groundnut flour through extrusion cooking. Response Surface Methodology (RSM) and Central Composite Rotatable Design (CCRD) were used to optimize the production variables. The objective was to locate the ...

  7. Evaluation of a pilot workload metric for simulated VTOL landing tasks

    Science.gov (United States)

    North, R. A.; Graffunder, K.

    1979-01-01

    A methodological approach to measuring workload was investigated for evaluation of new concepts in VTOL aircraft displays. Multivariate discriminant functions were formed from conventional flight performance and/or visual response variables to maximize detection of experimental differences. The flight performance variable discriminant showed maximum differentiation between crosswind conditions. The visual response measure discriminant maximized differences between fixed vs. motion base conditions and experimental displays. Physiological variables were used to attempt to predict the discriminant function values for each subject/condition/trial. The weights of the physiological variables in these equations showed agreement with previous studies. High muscle tension, light but irregular breathing patterns, and higher heart rate with low amplitude all produced higher scores on this scale and thus, represented higher workload levels.

  8. Final Report: Evaluation of Tools and Metrics to Support Employer Selection of Health Plans.

    Science.gov (United States)

    Mattke, Soeren; Van Busum, Kristin R; Martsolf, Grant R

    2014-01-01

    The Patient Protection and Affordable Care Act (ACA) places strong emphasis on quality of care as a means to improve outcomes for Americans and promote the financial sustainability of our health care system. Included in the ACA are new disclosure requirements that require health plans to provide a summary of benefits and coverage that accurately describes the benefits under the plan or coverage. These requirements are intended to support employers' procurement of high-value health coverage for their employees. This study attempts to help employers understand the structural differences between health plans and the performance dimensions along which plans can differ, as well as to educate employers about available tools that can be used to evaluate plan options. The study also discusses the extent to which these and other tools or resources are used by employers to inform choices between health plans.

  9. Understanding the compaction behaviour of low-substituted HPC: macro, micro, and nano-metric evaluations.

    Science.gov (United States)

    ElShaer, Amr; Al-Khattawi, Ali; Mohammed, Afzal R; Warzecha, Monika; Lamprou, Dimitrios A; Hassanin, Hany

    2018-06-01

    The fast development in materials science has resulted in the emergence of new pharmaceutical materials with superior physical and mechanical properties. Low-substituted hydroxypropyl cellulose (L-HPC) is an ether derivative of cellulose and is praised for its multi-functionality as a binder, disintegrant, film coating agent and as a suitable material for medical dressings. Nevertheless, very little is known about the compaction behaviour of this polymer. The aim of the current study was to evaluate the compaction and disintegration behaviour of four grades of L-HPC, namely LH32, LH21, LH11, and LHB1. The macrometric properties of the four powders were studied and the compaction behaviour was evaluated using the out-of-die method. LH11 and LH21 showed poor flow properties as the powders were dominated by fibrous particles with high aspect ratios, which reduced the powder flow. LH32 showed a weak compressibility profile and demonstrated a large elastic region, making it harder for this polymer to deform plastically. These findings are supported by AFM, which revealed the high roughness of the LH32 powder (100.09 ± 18.84 nm), resulting in a small area of contact but promoting mechanical interlocking. On the contrary, LH21 and LH11 powders had smooth surfaces which enabled a larger contact area and higher adhesion forces of 21.01 ± 11.35 nN and 9.50 ± 5.78 nN, respectively. This promoted bond formation during compression, as the LH21 and LH11 powders had a low yield strength.

  10. Contribution to the evaluation and to the improvement of multi-objective optimization methods: application to the optimization of nuclear fuel reloading pattern

    International Nuclear Information System (INIS)

    Collette, Y.

    2002-01-01

    In this thesis, we study the general problem of selecting a multi-objective optimization method, and then of improving it so as to solve a problem efficiently. The pertinent selection of a method presumes the existence of a methodology: we have built tools for evaluating performance and we propose an original method for classifying known optimization methods. Our approach has been applied to the elaboration of new methods for solving a very difficult problem: the optimization of nuclear core reload patterns. First, we looked for an unconventional approach to performance measurement: we 'measured' the behavior of a method. To reach this goal, we introduced several metrics. We propose to evaluate the 'aesthetics' of a distribution of solutions by defining two new metrics: a 'spacing metric' and a metric that measures the size of the biggest hole in the distribution of solutions. Then, we studied the convergence of multi-objective optimization methods by using metrics defined in the scientific literature and by proposing additional metrics, such as the 'Pareto ratio', which computes a rate of solution production. Lastly, we defined new metrics intended to better apprehend the behavior of optimization methods: the 'speed metric', which computes the speed profile, and a 'distribution metric', which computes the statistical distribution of solutions along the Pareto frontier. Next, we studied transformations of a multi-objective problem and defined new methods: the modified Tchebychev method and the penalized weighted sum of objective functions. We elaborated new techniques for choosing the initial point. These techniques produce initial points closer and closer to the Pareto frontier and, thanks to the 'proximal optimality concept', allow dramatic improvements in the convergence of a multi-objective optimization method. Lastly, we have defined new vectorial multi-objective optimization
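
    For concreteness, the sketch below computes two such 'aesthetic' quantities for a set of non-dominated solutions, using common textbook definitions (Schott's spacing and the largest nearest-neighbour gap); the thesis' exact formulations may differ.

```python
# Illustrative sketch of two "aesthetic" metrics for a Pareto front, under the
# common textbook definitions (Schott's spacing and largest nearest-neighbour
# gap); the thesis' exact definitions may differ.
import numpy as np

def nearest_neighbour_distances(front):
    """front: (n, m) array of objective vectors on the Pareto front."""
    front = np.asarray(front, dtype=float)
    dists = np.abs(front[:, None, :] - front[None, :, :]).sum(axis=2)  # L1 distances
    np.fill_diagonal(dists, np.inf)
    return dists.min(axis=1)

def spacing(front):
    """Schott's spacing: spread of nearest-neighbour distances (lower = more even)."""
    d = nearest_neighbour_distances(front)
    return np.sqrt(((d - d.mean()) ** 2).sum() / (len(d) - 1))

def biggest_hole(front):
    """Largest gap between a solution and its nearest neighbour."""
    return nearest_neighbour_distances(front).max()

front = [[1.0, 9.0], [2.0, 7.0], [4.0, 4.0], [8.0, 1.0]]  # example 2-objective front
print("spacing:", spacing(front), "biggest hole:", biggest_hole(front))
```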

  11. Metrics with vanishing quantum corrections

    International Nuclear Information System (INIS)

    Coley, A A; Hervik, S; Gibbons, G W; Pope, C N

    2008-01-01

    We investigate solutions of the classical Einstein or supergravity equations that solve any set of quantum corrected Einstein equations in which the Einstein tensor plus a multiple of the metric is equated to a symmetric conserved tensor T_{μν}(g_{αβ}, ∂_τ g_{αβ}, ∂_τ ∂_σ g_{αβ}, ...) constructed from sums of terms involving contractions of the metric and powers of arbitrary covariant derivatives of the curvature tensor. A classical solution, such as an Einstein metric, is called universal if, when evaluated on that Einstein metric, T_{μν} is a multiple of the metric. A Ricci flat classical solution is called strongly universal if, when evaluated on that Ricci flat metric, T_{μν} vanishes. It is well known that pp-waves in four spacetime dimensions are strongly universal. We focus attention on a natural generalization: Einstein metrics with holonomy Sim(n - 2) in which all scalar invariants are zero or constant. In four dimensions we demonstrate that the generalized Ghanam-Thompson metric is weakly universal and that the Goldberg-Kerr metric is strongly universal; indeed, we show that universality extends to all four-dimensional Sim(2) Einstein metrics. We also discuss generalizations to higher dimensions

  12. Multidimensional poverty in rural Mozambique: a new metric for evaluating public health interventions.

    Science.gov (United States)

    Victor, Bart; Blevins, Meridith; Green, Ann F; Ndatimana, Elisée; González-Calvo, Lázaro; Fischer, Edward F; Vergara, Alfredo E; Vermund, Sten H; Olupona, Omo; Moon, Troy D

    2014-01-01

    Poverty is a multidimensional phenomenon and unidimensional measurements have proven inadequate to the challenge of assessing its dynamics. The dynamic between poverty and public health intervention is among the most difficult yet important problems faced in development. We sought to demonstrate how multidimensional poverty measures can be utilized in the evaluation of public health interventions, and to create geospatial maps of poverty deprivation to aid implementers in prioritizing program planning. Survey teams interviewed a representative sample of 3,749 female heads of household in 259 enumeration areas across Zambézia in August-September 2010. We estimated a multidimensional poverty index (MPI), which can be disaggregated into context-specific indicators. We produced an MPI comprised of 3 dimensions and 11 weighted indicators selected from the survey. Households were identified as "poor" if they were deprived in >33% of indicators. Our MPI is an adjusted headcount, calculated by multiplying the proportion identified as poor (headcount) and the poverty gap (average deprivation). Geospatial visualizations of poverty deprivation were created as a contextual baseline for future evaluation. In our rural (96%) and urban (4%) interviewees, the 33% deprivation cut-off suggested 58.2% of households were poor (29.3% of urban vs. 59.5% of rural). Among the poor, households experienced an average deprivation of 46%; thus the MPI/adjusted headcount is 0.27 ( = 0.58×0.46). Of households where a local language was the primary language, 58.6% were considered poor, versus Portuguese-speaking households, where 73.5% were considered non-poor. Living standard is the dominant deprivation, followed by health, and then education. Multidimensional poverty measurement can be integrated into program design for public health interventions, and geospatial visualization helps examine the impact of intervention deployment within the context of distinct poverty conditions. Both permit program
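
    A minimal sketch of the adjusted headcount described above (MPI = headcount × average deprivation among the poor), assuming per-household weighted deprivation shares are already computed; the cut-off and example data are illustrative, not the survey's 11-indicator weighting.

```python
# Minimal sketch of the adjusted headcount (MPI = headcount x average deprivation).
# The cut-off and household data are made-up examples; the study used 11 weighted
# indicators across 3 dimensions to compute each household's deprivation share.
import numpy as np

def mpi(deprivation_shares, cutoff=1/3):
    """deprivation_shares: per-household weighted share of indicators deprived (0..1)."""
    shares = np.asarray(deprivation_shares, dtype=float)
    poor = shares > cutoff
    if not poor.any():
        return 0.0, 0.0, 0.0
    headcount = poor.mean()                  # H: proportion identified as poor
    avg_deprivation = shares[poor].mean()    # A: average deprivation among the poor
    return headcount, avg_deprivation, headcount * avg_deprivation

H, A, M = mpi([0.10, 0.40, 0.55, 0.70, 0.25, 0.60])
print(f"H={H:.2f}, A={A:.2f}, MPI={M:.2f}")  # the paper reports 0.58 x 0.46 = 0.27
```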

  13. Multidimensional poverty in rural Mozambique: a new metric for evaluating public health interventions.

    Directory of Open Access Journals (Sweden)

    Bart Victor

    Full Text Available BACKGROUND: Poverty is a multidimensional phenomenon and unidimensional measurements have proven inadequate to the challenge of assessing its dynamics. The dynamic between poverty and public health intervention is among the most difficult yet important problems faced in development. We sought to demonstrate how multidimensional poverty measures can be utilized in the evaluation of public health interventions; and to create geospatial maps of poverty deprivation to aid implementers in prioritizing program planning. METHODS: Survey teams interviewed a representative sample of 3,749 female heads of household in 259 enumeration areas across Zambézia in August-September 2010. We estimated a multidimensional poverty index (MPI), which can be disaggregated into context-specific indicators. We produced an MPI comprised of 3 dimensions and 11 weighted indicators selected from the survey. Households were identified as "poor" if they were deprived in >33% of indicators. Our MPI is an adjusted headcount, calculated by multiplying the proportion identified as poor (headcount) and the poverty gap (average deprivation). Geospatial visualizations of poverty deprivation were created as a contextual baseline for future evaluation. RESULTS: In our rural (96%) and urban (4%) interviewees, the 33% deprivation cut-off suggested 58.2% of households were poor (29.3% of urban vs. 59.5% of rural). Among the poor, households experienced an average deprivation of 46%; thus the MPI/adjusted headcount is 0.27 ( = 0.58×0.46). Of households where a local language was the primary language, 58.6% were considered poor, versus Portuguese-speaking households, where 73.5% were considered non-poor. Living standard is the dominant deprivation, followed by health, and then education. CONCLUSIONS: Multidimensional poverty measurement can be integrated into program design for public health interventions, and geospatial visualization helps examine the impact of intervention deployment within the context of distinct poverty conditions.

  14. [Evaluation of the factorial and metric equivalence of the Sexual Assertiveness Scale (SAS) by sex].

    Science.gov (United States)

    Sierra, Juan Carlos; Santos-Iglesias, Pablo; Vallejo-Medina, Pablo

    2012-05-01

    Sexual assertiveness refers to the ability to initiate sexual activity, refuse unwanted sexual activity, and use contraceptive methods to avoid sexually transmitted diseases, developing healthy sexual behaviors. The Sexual Assertiveness Scale (SAS) assesses these three dimensions. The purpose of this study is to evaluate, using structural equation modeling and differential item functioning, the equivalence of the scale between men and women. Standard scores are also provided. A total of 4,034 participants from 21 Spanish provinces took part in the study. Quota sampling method was used. Results indicate a strict equivalent dimensionality of the Sexual Assertiveness Scale across sexes. One item was flagged by differential item functioning, although it does not affect the scale. Therefore, there is no significant bias in the scale when comparing across sexes. Standard scores show similar Initiation assertiveness scores for men and women, and higher scores on Refusal and Sexually Transmitted Disease Prevention for women. This scale can be used on men and women with sufficient psychometric guarantees.

  15. Metric Indices for Performance Evaluation of a Mixed Measurement based State Estimator

    Directory of Open Access Journals (Sweden)

    Paula Sofia Vide

    2013-01-01

    Full Text Available With the development of synchronized phasor measurement technology in recent years, the use of PMU measurements to improve state estimation performance has gained great interest due to their synchronized characteristics and high data transmission speed. The ability of Phasor Measurement Units (PMUs) to directly measure the system state is a key advantage over the SCADA measurement system. PMU measurements are superior to conventional SCADA measurements in terms of resolution and accuracy. Since the majority of measurements in existing estimators come from the conventional SCADA measurement system, which is unlikely to be fully replaced by PMUs in the near future, state estimators including both phasor and conventional SCADA measurements are being considered. In this paper, a mixed-measurement (SCADA and PMU) state estimator is proposed. Several useful measures for evaluating various aspects of the performance of the mixed-measurement state estimator are proposed and explained. State estimator validity, performance and characteristics of the results on the IEEE 14-bus and IEEE 30-bus test systems are presented.
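
    A toy example of why mixed SCADA/PMU estimation is usually posed as a weighting problem: in a weighted least squares estimate, PMU rows receive much smaller variances than SCADA rows. The 3-state linear model and the variances below are invented for illustration; the estimator in the paper is more elaborate.

```python
# Toy weighted-least-squares estimator mixing "PMU" and "SCADA" measurements.
# The 3-state linear model and the noise variances are invented for illustration;
# real estimators are nonlinear and far larger.
import numpy as np

H = np.array([[1.0, 0.0, 0.0],     # PMU: direct state measurement
              [0.0, 1.0, 0.0],     # PMU: direct state measurement
              [1.0, -1.0, 0.0],    # SCADA: flow-like difference measurement
              [0.0, 1.0, -1.0]])   # SCADA: flow-like difference measurement
sigma = np.array([0.002, 0.002, 0.02, 0.02])      # PMU rows ~10x more accurate
W = np.diag(1.0 / sigma**2)

x_true = np.array([1.0, 0.95, 0.90])
z = H @ x_true + np.random.default_rng(1).normal(0.0, sigma)

x_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)  # WLS estimate
print("estimate:", np.round(x_hat, 4), "error:", np.round(x_hat - x_true, 4))
```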

  16. The Use of the Kurtosis-Adjusted Cumulative Noise Exposure Metric in Evaluating the Hearing Loss Risk for Complex Noise.

    Science.gov (United States)

    Xie, Hong-Wei; Qiu, Wei; Heyer, Nicholas J; Zhang, Mei-Bian; Zhang, Peng; Zhao, Yi-Ming; Hamernik, Roger P

    2016-01-01

    To test a kurtosis-adjusted cumulative noise exposure (CNE) metric for use in evaluating the risk of hearing loss among workers exposed to industrial noises. Specifically, to evaluate whether the kurtosis-adjusted CNE (1) provides a better association with observed industrial noise-induced hearing loss, and (2) provides a single metric applicable to both complex (non-Gaussian [non-G]) and continuous or steady state (Gaussian [G]) noise exposures for predicting noise-induced hearing loss (dose-response curves). Audiometric and noise exposure data were acquired on a population of screened workers (N = 341) from two steel manufacturing plants located in Zhejiang province and a textile manufacturing plant located in Henan province, China. All the subjects from the two steel manufacturing plants (N = 178) were exposed to complex noise, whereas the subjects from textile manufacturing plant (N = 163) were exposed to a G continuous noise. Each subject was given an otologic examination to determine their pure-tone HTL and had their personal 8-hr equivalent A-weighted noise exposure (LAeq) and full-shift noise kurtosis statistic (which is sensitive to the peaks and temporal characteristics of noise exposures) measured. For each subject, an unadjusted and kurtosis-adjusted CNE index for the years worked was created. Multiple linear regression analysis controlling for age was used to determine the relationship between CNE (unadjusted and kurtosis adjusted) and the mean HTL at 3, 4, and 6 kHz (HTL346) among the complex noise-exposed group. In addition, each subject's HTLs from 0.5 to 8.0 kHz were age and sex adjusted using Annex A (ISO-1999) to determine whether they had adjusted high-frequency noise-induced hearing loss (AHFNIHL), defined as an adjusted HTL shift of 30 dB or greater at 3.0, 4.0, or 6.0 kHz in either ear. Dose-response curves for AHFNIHL were developed separately for workers exposed to G and non-G noise using both unadjusted and adjusted CNE as the exposure
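
    A rough sketch of the quantities named above: the sample kurtosis of a noise waveform and a cumulative noise exposure (CNE) index with a generic kurtosis adjustment term. The adjustment form λ·log10(β/3) and the value of λ are placeholders, not the coefficients used in this study.

```python
# Rough sketch: sample kurtosis of a pressure waveform and a cumulative noise
# exposure (CNE) index with a generic kurtosis adjustment. The adjustment form
# lambda*log10(beta/3) and lambda=4.0 are placeholders, not the study's values.
import numpy as np

def kurtosis(x):
    """Plain (non-excess) sample kurtosis beta of a pressure waveform."""
    x = np.asarray(x, dtype=float)
    m = x.mean()
    return ((x - m) ** 4).mean() / (((x - m) ** 2).mean() ** 2)

def cne(laeq_dba, years):
    """Unadjusted cumulative noise exposure in dB(A)-year."""
    return laeq_dba + 10.0 * np.log10(years)

def kurtosis_adjusted_cne(laeq_dba, years, beta, lam=4.0):
    """CNE with LAeq adjusted upward for non-Gaussian (beta > 3) noise."""
    adjusted_laeq = laeq_dba + lam * np.log10(max(beta, 3.0) / 3.0)
    return cne(adjusted_laeq, years)

rng = np.random.default_rng(0)
waveform = rng.laplace(size=100_000)        # impulsive-ish example signal
print("kurtosis:", round(kurtosis(waveform), 2))
print("adjusted CNE:", round(kurtosis_adjusted_cne(95.0, 10, kurtosis(waveform)), 1))
```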

  17. Metric properties of the Utrecht Scale for Evaluation of Rehabilitation-Participation (USER-Participation) in persons with spinal cord injury living in Switzerland

    NARCIS (Netherlands)

    Mader, Luzius; Post, Marcel W M; Ballert, Carolina S; Michel, Gisela; Stucki, Gerold; Brinkhof, Martin W G

    2016-01-01

    OBJECTIVE: To examine the metric properties of the Utrecht Scale for Evaluation of Rehabilitation-Participation (USER-Participation) in persons with spinal cord injury in Switzerland from a classical and item response theory perspective. DESIGN: Cross-sectional survey. SUBJECTS: Persons with spinal

  18. Optimizing Usability Studies by Complementary Evaluation Methods

    NARCIS (Netherlands)

    Schmettow, Martin; Bach, Cedric; Scapin, Dominique

    2014-01-01

    This paper examines combinations of complementary evaluation methods as a strategy for efficient usability problem discovery. A data set from an earlier study is re-analyzed, involving three evaluation methods applied to two virtual environment applications. Results of a mixed-effects logistic

  19. Metrics for evaluating patient navigation during cancer diagnosis and treatment: crafting a policy-relevant research agenda for patient navigation in cancer care.

    Science.gov (United States)

    Guadagnolo, B Ashleigh; Dohan, Daniel; Raich, Peter

    2011-08-01

    Racial and ethnic minorities as well as other vulnerable populations experience disparate cancer-related health outcomes. Patient navigation is an emerging health care delivery innovation that offers promise in improving quality of cancer care delivery to these patients who experience unique health-access barriers. Metrics are needed to evaluate whether patient navigation can improve quality of care delivery, health outcomes, and overall value in health care during diagnosis and treatment of cancer. Information regarding the current state of the science examining patient navigation interventions was gathered via search of the published scientific literature. A focus group of providers, patient navigators, and health-policy experts was convened as part of the Patient Navigation Leadership Summit sponsored by the American Cancer Society. Key metrics were identified for assessing the efficacy of patient navigation in cancer diagnosis and treatment. Patient navigation data exist for all stages of cancer care; however, the literature is more robust for its implementation during prevention, screening, and early diagnostic workup of cancer. Relatively fewer data are reported for outcomes and efficacy of patient navigation during cancer treatment. Metrics are proposed for a policy-relevant research agenda to evaluate the efficacy of patient navigation in cancer diagnosis and treatment. Patient navigation is understudied with respect to its use in cancer diagnosis and treatment. Core metrics are defined to evaluate its efficacy in improving outcomes and mitigating health-access barriers. Copyright © 2011 American Cancer Society.

  20. Comparative evaluation of various optimization methods and the development of an optimization code system SCOOP

    International Nuclear Information System (INIS)

    Suzuki, Tadakazu

    1979-11-01

    Thirty-two programs for linear and nonlinear optimization problems with or without constraints have been developed or incorporated, and their stability, convergence and efficiency have been examined. On the basis of these evaluations, the first version of the optimization code system SCOOP-I has been completed. SCOOP-I is designed to be an efficient, reliable, useful and also flexible system for general applications. The system enables one to find the global optimum for a wide class of problems by selecting the most appropriate optimization method built into it. (author)

  1. Active Metric Learning for Supervised Classification

    OpenAIRE

    Kumaran, Krishnan; Papageorgiou, Dimitri; Chang, Yutong; Li, Minhan; Takáč, Martin

    2018-01-01

    Clustering and classification critically rely on distance metrics that provide meaningful comparisons between data points. We present mixed-integer optimization approaches to find optimal distance metrics that generalize the Mahalanobis metric extensively studied in the literature. Additionally, we generalize and improve upon leading methods by removing reliance on pre-designated "target neighbors," "triplets," and "similarity pairs." Another salient feature of our method is its ability to en...
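
    For reference, the object being learned is a Mahalanobis-type distance d_M(x, y) = sqrt((x − y)ᵀ M (x − y)) with M positive semidefinite. The sketch below simply evaluates such a distance for a hand-picked M; the paper's contribution is learning M via mixed-integer optimization, which is not reproduced here.

```python
# Small sketch of the object being learned: a Mahalanobis distance defined by a
# positive semidefinite matrix M. Here M is fixed by hand; the paper learns it
# via mixed-integer optimization.
import numpy as np

def mahalanobis(x, y, M):
    d = np.asarray(x, float) - np.asarray(y, float)
    return float(np.sqrt(d @ M @ d))

M = np.array([[2.0, 0.5],
              [0.5, 1.0]])          # any PSD matrix defines a valid metric
print(mahalanobis([1.0, 2.0], [3.0, 1.0], M))
```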

  2. Metrics for energy resilience

    International Nuclear Information System (INIS)

    Roege, Paul E.; Collier, Zachary A.; Mancillas, James; McDonagh, John A.; Linkov, Igor

    2014-01-01

    Energy lies at the backbone of any advanced society and constitutes an essential prerequisite for economic growth, social order and national defense. However, there is an Achilles heel in today's energy and technology relationship: namely, a precarious intimacy between energy and the fiscal, social, and technical systems it supports. Recently, widespread and persistent disruptions in energy systems have highlighted the extent of this dependence and the vulnerability of increasingly optimized systems to changing conditions. Resilience is an emerging concept that offers to reconcile considerations of performance under dynamic environments and across multiple time frames by supplementing traditionally static system performance measures to consider behaviors under changing conditions and complex interactions among physical, information and human domains. This paper identifies metrics useful to implement guidance for energy-related planning, design, investment, and operation. Recommendations are presented using a matrix format to provide a structured and comprehensive framework of metrics relevant to a system's energy resilience. The study synthesizes previously proposed metrics and emergent resilience literature to provide a multi-dimensional model intended for use by leaders and practitioners as they transform our energy posture from one of stasis and reaction to one that is proactive and which fosters sustainable growth. - Highlights: • Resilience is the ability of a system to recover from adversity. • There is a need for methods to quantify and measure system resilience. • We developed a matrix-based approach to generate energy resilience metrics. • These metrics can be used in energy planning, system design, and operations

  3. Utilizing Machine Learning and Automated Performance Metrics to Evaluate Robot-Assisted Radical Prostatectomy Performance and Predict Outcomes.

    Science.gov (United States)

    Hung, Andrew J; Chen, Jian; Che, Zhengping; Nilanon, Tanachat; Jarc, Anthony; Titus, Micha; Oh, Paul J; Gill, Inderbir S; Liu, Yan

    2018-05-01

    Surgical performance is critical for clinical outcomes. We present a novel machine learning (ML) method of processing automated performance metrics (APMs) to evaluate surgical performance and predict clinical outcomes after robot-assisted radical prostatectomy (RARP). We trained three ML algorithms utilizing APMs directly from robot system data (training material) and hospital length of stay (LOS; training label) (≤2 days and >2 days) from 78 RARP cases, and selected the algorithm with the best performance. The selected algorithm categorized the cases as "Predicted as expected LOS (pExp-LOS)" and "Predicted as extended LOS (pExt-LOS)." We compared postoperative outcomes of the two groups (Kruskal-Wallis/Fisher's exact tests). The algorithm then predicted individual clinical outcomes, which we compared with actual outcomes (Spearman's correlation/Fisher's exact tests). Finally, we identified the five most relevant APMs adopted by the algorithm during prediction. The "Random Forest-50" (RF-50) algorithm had the best performance, reaching 87.2% accuracy in predicting LOS (73 cases as "pExp-LOS" and 5 cases as "pExt-LOS"). The "pExp-LOS" cases outperformed the "pExt-LOS" cases in surgery time (3.7 hours vs 4.6 hours, p = 0.007), LOS (2 days vs 4 days, p = 0.02), and Foley duration (9 days vs 14 days, p = 0.02). Patient outcomes predicted by the algorithm had a significant association with the "ground truth" in surgery time. The five most relevant APMs, adopted by the algorithm during prediction, were largely related to camera manipulation. To our knowledge, ours is the first study to show that APMs and ML algorithms may help assess surgical RARP performance and predict clinical outcomes. With further accrual of clinical data (oncologic and functional data), this process will become increasingly relevant and valuable in surgical assessment and training.
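
    A hypothetical sketch of the modelling step described above: a random-forest classifier trained on automated performance metrics (APMs) to predict extended length of stay. The feature set and synthetic data are assumptions; the study's actual APMs and labels are not reproduced here.

```python
# Hypothetical sketch of the modelling step: a random forest trained on APM-like
# features to predict extended length of stay. Feature semantics and the
# synthetic data are assumptions, not the study's robot system data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_cases = 78
apms = rng.normal(size=(n_cases, 5))          # e.g. camera-move counts, idle time, ...
extended_los = (apms[:, 0] + 0.5 * rng.normal(size=n_cases) > 0.7).astype(int)

model = RandomForestClassifier(n_estimators=50, random_state=0)
print("CV accuracy:", cross_val_score(model, apms, extended_los, cv=5).mean())

model.fit(apms, extended_los)
print("feature importances:", np.round(model.feature_importances_, 3))
```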

  4. Metrics for building performance assurance

    Energy Technology Data Exchange (ETDEWEB)

    Koles, G.; Hitchcock, R.; Sherman, M.

    1996-07-01

    This report documents part of the work performed in phase I of a Laboratory Directed Research and Development (LDRD) funded project entitled Building Performance Assurances (BPA). The focus of the BPA effort is to transform the way buildings are built and operated in order to improve building performance by facilitating or providing tools, infrastructure, and information. The efforts described herein focus on the development of metrics with which to evaluate building performance and for which information and optimization tools need to be developed. The classes of building performance metrics reviewed are (1) Building Services, (2) First Costs, (3) Operating Costs, (4) Maintenance Costs, and (5) Energy and Environmental Factors. The first category defines the direct benefits associated with buildings; the next three are different kinds of costs associated with providing those benefits; the last category includes concerns that are broader than direct costs and benefits to the building owner and building occupants. The level of detail of the various issues reflects the current state of knowledge in those scientific areas and the authors' ability to determine that state of knowledge, rather than directly reflecting the importance of these issues; the report intentionally does not focus specifically on energy issues. The report describes work in progress, is intended as a resource, and can be used to indicate areas needing more investigation. Other reports on BPA activities are also available.

  5. Evaluation of SEBS, SEBAL, and METRIC models in estimation of the evaporation from the freshwater lakes (Case study: Amirkabir dam, Iran)

    Science.gov (United States)

    Zamani Losgedaragh, Saeideh; Rahimzadegan, Majid

    2018-06-01

    Evapotranspiration (ET) estimation is of great importance due to its key role in water resource management. Surface energy modeling tools such as the Surface Energy Balance Algorithm for Land (SEBAL), Mapping Evapotranspiration with Internalized Calibration (METRIC), and the Surface Energy Balance System (SEBS) can estimate the amount of evapotranspiration for every pixel of a satellite image. The main objective of this research is to investigate evaporation from freshwater bodies using SEBAL, METRIC, and SEBS. For this purpose, the Amirkabir dam reservoir and its nearby agricultural lands, in a semi-arid climate, were selected and studied from 2011 to 2017 as the study area. The study was carried out on 16 Landsat TM5 and OLI satellite images, on which SEBAL, METRIC, and SEBS were implemented. The corresponding pan evaporation measurements on the reservoir bank were used as the ground truth data. According to the results, SEBAL is not a reliable method for evaluating freshwater evaporation, with a coefficient of determination (R2) of 0.36 and a Root Mean Square Error (RMSE) of 5.1 mm. On the other hand, METRIC (R2 of 0.57, RMSE of 2.02 mm) and SEBS (R2 of 0.62, RMSE of 0.93 mm) demonstrated relatively good performance.

  6. SCOPE, Shipping Cask Optimization and Parametric Evaluation

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Description of program or function: Given the neutron and gamma-ray shielding requirements as input, SCOPE may be used as a conceptual design tool for the evaluation of various casks designed to carry square fuel assemblies, circular canisters of nuclear waste material, or circular canisters containing 'intact' spent-fuel assemblies. It may be used to evaluate a specific design or to search for the maximum number of fuel assemblies (or canisters) that might be shipped in a given type of cask. In the 'search' mode, SCOPE will use built-in packing arrangements and the tabulated shielding requirements input by the user to 'design' a cask carrying one fuel assembly (or canister); it will then continue to increment the number of assemblies (or canisters) until one or more of the design limits can no longer be met. In each case (N = 1,2,3...), SCOPE will calculate the steady-state temperature distribution throughout the cask and perform a complete 1-D space/time transient thermal analysis following a postulated half-hour fire; then it will edit the characteristic dimensions of the cask (including fins, if required), the total weight of the loaded cask, the steady-state temperature distribution at selected points, and the maximum transient temperature in key components. With SCOPE, the effects of various design changes may be evaluated quickly and inexpensively. 2 - Method of solution: SCOPE assumes that the user has already made an independent determination of the neutron and gamma-ray shielding requirements for the particular type of cask(s) under study. The amount of shielding required obviously depends on the type of spent fuel or nuclear waste material, its burnup and/or exposure, the decay time, and the number of assemblies or canisters in the cask. Source terms (and spectra) for spent PWR and BWR fuel assemblies are provided at each of 17 decay times, along with recommended neutron and gamma-ray shield thicknesses for Pb, Fe, and U-metal casks containing a

  7. Engineering performance metrics

    Science.gov (United States)

    Delozier, R.; Snyder, N.

    1993-03-01

    Implementation of a Total Quality Management (TQM) approach to engineering work required the development of a system of metrics which would serve as a meaningful management tool for evaluating effectiveness in accomplishing project objectives and in achieving improved customer satisfaction. A team effort was chartered with the goal of developing a system of engineering performance metrics which would measure customer satisfaction, quality, cost effectiveness, and timeliness. The approach to developing this system involved normal systems design phases including conceptual design, detailed design, implementation, and integration. The lessons learned from this effort are explored in this paper. These lessons learned may provide a starting point for other large engineering organizations seeking to institute a performance measurement system for evaluating effectiveness in accomplishing project objectives and in achieving improved customer satisfaction. To facilitate this effort, a team was chartered to assist in the development of the metrics system. This team, consisting of customers and Engineering staff members, was utilized to ensure that the needs and views of the customers were considered in the development of performance measurements. The development of a system of metrics is no different than the development of any type of system. It includes the steps of defining performance measurement requirements, measurement process conceptual design, performance measurement and reporting system detailed design, and system implementation and integration.

  8. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software Quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software Metrics help us understand the technical process that is used to develop a product. The process is measured to improve it and the product is measured to increase quality throughout the life cycle of software. Software Metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If Software Metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research and collect, compile, and analyze SQA Metrics that have been used in other projects that are not currently being used by the SA team and report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  9. Evaluation and improvement of dynamic optimality in electrochemical reactors

    International Nuclear Information System (INIS)

    Vijayasekaran, B.; Basha, C. Ahmed

    2005-01-01

    A systematic approach to formulating the dynamic optimization problem so as to improve dynamic optimality in electrochemical reactors is presented in this paper. The formulation takes account of the diffusion phenomenon at the electrode/electrolyte interface. To demonstrate the present methodology, the optimal time-varying electrode potential for a coupled chemical-electrochemical reaction scheme, which maximizes the production of the desired product in a batch electrochemical reactor with/without recirculation, is determined. The dynamic optimization problem statement based upon this approach is a nonlinear differential algebraic system, and its solution provides information about the optimal policy. The optimal control policy at different conditions is evaluated using the well-known Pontryagin maximum principle. The two-point boundary value problem resulting from the application of the maximum principle is then solved using the control vector iteration technique. These optimal time-varying profiles of electrode potential are then compared to the best uniform operation through the relative improvements of the performance index. The application of the proposed approach to two electrochemical systems, described by ordinary differential equations, shows that the existing electrochemical process control strategy could be improved considerably when the proposed method is incorporated

  10. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory.

    Science.gov (United States)

    Kumar, B Vinodh; Mohan, Thuthi

    2018-01-01

    Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of the process can be measured. This study was done to assess the performance of individual biochemical parameters on a sigma scale by calculating the sigma metrics for individual parameters, and to follow the Westgard guidelines for the appropriate Westgard rules and levels of internal quality control (IQC) that need to be processed to improve target analyte performance based on the sigma metrics. This is a retrospective study, and the data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study are the IQC coefficient of variation percentage and the External Quality Assurance Scheme (EQAS) bias% for 16 biochemical parameters. For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma, while five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed only an average performance on the sigma scale; for the level 2 IQC, the same four analytes as in level 1 showed a performance of ≥6 sigma, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance on the sigma scale. For analytes with low sigma levels, the quality goal index (QGI) was examined; a QGI greater than 1.2 indicated inaccuracy. This study shows that sigma metrics is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes.
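
    The sigma metric itself is conventionally computed as sigma = (TEa − |bias|) / CV, with all three quantities expressed in percent; the sketch below assumes this standard formula and uses illustrative numbers rather than the hospital's data.

```python
# Standard sigma-metric calculation assumed by studies like this one:
# sigma = (TEa - |bias|) / CV, all quantities in percent. The TEa, bias and CV
# values below are illustrative, not the hospital's data.
def sigma_metric(tea_pct, bias_pct, cv_pct):
    return (tea_pct - abs(bias_pct)) / cv_pct

# e.g. an analyte with 10% allowable total error, 2% bias and 1.5% CV
print(round(sigma_metric(10.0, 2.0, 1.5), 2))   # ~5.33 sigma
```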

  11. Science as Knowledge, Practice, and Map Making: The Challenge of Defining Metrics for Evaluating and Improving DOE-Funded Basic Experimental Science

    Energy Technology Data Exchange (ETDEWEB)

    Bodnarczuk, M.

    1993-03-01

    Industrial R&D laboratories have been surprisingly successful in developing performance objectives and metrics that convincingly show that planning, management, and improvement techniques can be value-added to the actual output of R&D organizations. In this paper, I will discuss the more difficult case of developing analogous constructs for DOE-funded non-nuclear, non-weapons basic research, or as I will refer to it - basic experimental science. Unlike most industrial R&D or the bulk of applied science performed at the National Renewable Energy Laboratory (NREL), the purpose of basic experimental science is producing new knowledge (usually published in professional journals) that has no immediate application to the first link (the R) of a planned R&D chain. Consequently, performance objectives and metrics are far more difficult to define. My claim is that if one can successfully define metrics for evaluating and improving DOE-funded basic experimental science (which is the most difficult case), then defining such constructs for DOE-funded applied science should be much less problematic. With the publication of the DOE Standard - Implementation Guide for Quality Assurance Programs for Basic and Applied Research (DOE-ER-STD-6001-92) and the development of a conceptual framework for integrating all the DOE orders, we need to move aggressively toward the threefold next phase: (1) focusing the management elements found in DOE-ER-STD-6001-92 on the main output of national laboratories - the experimental science itself; (2) developing clearer definitions of basic experimental science as practice not just knowledge; and (3) understanding the relationship between the metrics that scientists use for evaluating the performance of DOE-funded basic experimental science, the management elements of DOE-ER-STD-6001-92, and the notion of continuous improvement.

  12. A systematic approach towards the objective evaluation of low-contrast performance in MDCT: Combination of a full-reference image fidelity metric and a software phantom

    International Nuclear Information System (INIS)

    Falck, Christian von; Rodt, Thomas; Waldeck, Stephan; Hartung, Dagmar; Meyer, Bernhard; Wacker, Frank; Shin, Hoen-oh

    2012-01-01

    Objectives: To assess the feasibility of an objective approach for the evaluation of low-contrast detectability in multidetector computed-tomography (MDCT) by combining a virtual phantom containing simulated lesions with an image quality metric. Materials and methods: A low-contrast phantom containing hypodense spheric lesions (−20 HU) was scanned on a 64-slice MDCT scanner at 4 different dose levels (25, 50, 100, 200 mAs). In addition, virtual round hypodense low-contrast lesions (20 HU object contrast) based on real CT data were inserted into the lesion-free section of the datasets. The sliding-thin-slab algorithm was applied to the image data with an increasing slice-thickness from 1 to 15 slices. For each dataset containing simulated lesions a lesion-free counterpart was reconstructed and post-processed in the same manner. The low-contrast performance of all datasets containing virtual lesions was determined using a full-reference image quality metric (modified multiscale structural similarity index, MS-SSIM*). The results were validated against a reader-study of the real lesions. Results: For all dose levels and lesion sizes there was no statistically significant difference between the low-contrast performance as determined by the image quality metric when compared to the reader study (p < 0.05). The intraclass correlation coefficient was 0.72, 0.82, 0.90 and 0.84 for lesion diameters of 4 mm, 5 mm, 8 mm and 10 mm, respectively. The use of the sliding-thin-slab algorithm improves lesion detectability by a factor ranging from 1.15 to 2.69 when compared with the original axial slice (0.625 mm). Conclusion: The combination of a virtual phantom and a full-reference image quality metric enables a systematic, automated and objective evaluation of low-contrast detectability in MDCT datasets and correlates well with the judgment of human readers.
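
    As a simplified stand-in for the full-reference comparison described above, the sketch below scores a synthetic lesion-bearing image against its lesion-free counterpart using the plain structural similarity index from scikit-image rather than the authors' modified MS-SSIM*; the image data are invented.

```python
# Simplified stand-in for the full-reference comparison described above, using
# plain SSIM from scikit-image instead of the authors' modified MS-SSIM*. The
# synthetic "CT slices" and the inserted lesion are invented.
import numpy as np
from skimage.metrics import structural_similarity

rng = np.random.default_rng(0)
background = rng.normal(40.0, 10.0, size=(128, 128))        # lesion-free reference (HU)
with_lesion = background.copy()
yy, xx = np.ogrid[:128, :128]
with_lesion[(yy - 64) ** 2 + (xx - 64) ** 2 < 8 ** 2] -= 20  # -20 HU round lesion

score = structural_similarity(background, with_lesion,
                              data_range=with_lesion.max() - with_lesion.min())
print("SSIM vs lesion-free reference:", round(score, 4))
```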

  13. Evaluation of the effect of advanced coagulation process to optimize ...

    African Journals Online (AJOL)

    Evaluation of the effect of advanced coagulation process to optimize the removal of natural organic matter in water (Case study: drinking water of Mashhad's ... and in addition to giving taste, color and odor to the water, they can intervene in the oxidization and removal of heavy metals such as arsenic, iron and manganese.

  14. Daylight metrics and energy savings

    Energy Technology Data Exchange (ETDEWEB)

    Mardaljevic, John; Heschong, Lisa; Lee, Eleanor

    2009-12-31

    The drive towards sustainable, low-energy buildings has increased the need for simple, yet accurate methods to evaluate whether a daylit building meets minimum standards for energy and human comfort performance. Current metrics do not account for the temporal and spatial aspects of daylight, nor for occupants' comfort or interventions. This paper reviews the historical basis of current compliance methods for achieving daylit buildings, proposes a technical basis for development of better metrics, and provides two case study examples to stimulate dialogue on how metrics can be applied in a practical, real-world context.

  15. Generalized tolerance sensitivity and DEA metric sensitivity

    OpenAIRE

    Neralić, Luka; E. Wendell, Richard

    2015-01-01

    This paper considers the relationship between Tolerance sensitivity analysis in optimization and metric sensitivity analysis in Data Envelopment Analysis (DEA). Herein, we extend the results on the generalized Tolerance framework proposed by Wendell and Chen and show how this framework includes DEA metric sensitivity as a special case. Further, we note how recent results in Tolerance sensitivity suggest some possible extensions of the results in DEA metric sensitivity.

  16. Generalized tolerance sensitivity and DEA metric sensitivity

    Directory of Open Access Journals (Sweden)

    Luka Neralić

    2015-03-01

    Full Text Available This paper considers the relationship between Tolerance sensitivity analysis in optimization and metric sensitivity analysis in Data Envelopment Analysis (DEA). Herein, we extend the results on the generalized Tolerance framework proposed by Wendell and Chen and show how this framework includes DEA metric sensitivity as a special case. Further, we note how recent results in Tolerance sensitivity suggest some possible extensions of the results in DEA metric sensitivity.

  17. Web metrics for library and information professionals

    CERN Document Server

    Stuart, David

    2014-01-01

    This is a practical guide to using web metrics to measure impact and demonstrate value. The web provides an opportunity to collect a host of different metrics, from those associated with social media accounts and websites to more traditional research outputs. This book is a clear guide for library and information professionals as to what web metrics are available and how to assess and use them to make informed decisions and demonstrate value. As individuals and organizations increasingly use the web in addition to traditional publishing avenues and formats, this book provides the tools to unlock web metrics and evaluate the impact of this content. The key topics covered include: bibliometrics, webometrics and web metrics; data collection tools; evaluating impact on the web; evaluating social media impact; investigating relationships between actors; exploring traditional publications in a new environment; web metrics and the web of data; the future of web metrics and the library and information professional.Th...

  18. Evaluation of optimization strategies and the effect of initial conditions on IMAT optimization using a leaf position optimization algorithm

    International Nuclear Information System (INIS)

    Oliver, Mike; Jensen, Michael; Chen, Jeff; Wong, Eugene

    2009-01-01

    Intensity-modulated arc therapy (IMAT) is a rotational variant of intensity-modulated radiation therapy (IMRT) that can be implemented with or without angular dose rate variation. The purpose of this study is to assess optimization strategies and initial conditions using a leaf position optimization (LPO) algorithm altered for variable dose rate IMAT. A concave planning target volume (PTV) with a central cylindrical organ at risk (OAR) was used in this study. The initial IMAT arcs were approximated by multiple static beams at 5 deg. angular increments where multi-leaf collimator (MLC) leaf positions were determined from the beam's eye view to irradiate the PTV but avoid the OAR. For the optimization strategy, two arcs with arc ranges of 280 deg. and 150 deg. were employed and plans were created using LPO alone, variable dose rate optimization (VDRO) alone, simultaneous LPO and VDRO and sequential combinations of these strategies. To assess the MLC initialization effect, three single 360 deg. arc plans with different initial MLC configurations were generated using the simultaneous LPO and VDRO. The effect of changing optimization degrees of freedom was investigated by employing 3 deg., 5 deg. and 10 deg. angular sampling intervals for the two 280 deg., two 150 deg. and single arc plans using LPO and VDRO. The objective function value, a conformity index, a dose homogeneity index, mean dose to OAR and normal tissues were computed and used to evaluate the treatment plans. This study shows that the best optimization strategy for a concave target is to use simultaneous MLC LPO and VDRO. We found that the optimization result is sensitive to the choice of initial MLC aperture shapes suggesting that an LPO-based IMAT plan may not be able to overcome local minima for this geometry. In conclusion, simultaneous MLC leaf position and VDRO are needed with the most appropriate initial conditions (MLC positions, arc ranges and number of arcs) for IMAT.

  19. The use of the kurtosis metric in the evaluation of occupational hearing loss in workers in China: Implications for hearing risk assessment

    Directory of Open Access Journals (Sweden)

    Robert I Davis

    2012-01-01

    Full Text Available This study examined: (1) the value of using the statistical metric, kurtosis [β(t)], along with an energy metric to determine the hazard to hearing from high level industrial noise environments, and (2) the accuracy of the International Standard Organization (ISO-1999:1990) model for median noise-induced permanent threshold shift (NIPTS) estimates with actual recent epidemiological data obtained on 240 highly screened workers exposed to high-level industrial noise in China. A cross-sectional approach was used in this study. Shift-long temporal waveforms of the noise that workers were exposed to for evaluation of noise exposures and audiometric threshold measures were obtained on all selected subjects. The subjects were exposed to only one occupational noise exposure without the use of hearing protection devices. The results suggest that: (1) the kurtosis metric is an important variable in determining the hazards to hearing posed by a high-level industrial noise environment for hearing conservation purposes, i.e., the kurtosis differentiated between the hazardous effects produced by Gaussian and non-Gaussian noise environments, (2) the ISO-1999 predictive model does not accurately estimate the degree of median NIPTS incurred to high level kurtosis industrial noise, and (3) the inherent large variability in NIPTS among subjects emphasize the need to develop and analyze a larger database of workers with well-documented exposures to better understand the effect of kurtosis on NIPTS incurred from high level industrial noise exposures. A better understanding of the role of the kurtosis metric may lead to its incorporation into a new generation of more predictive hearing risk assessment for occupational noise exposure.

  20. The use of the kurtosis metric in the evaluation of occupational hearing loss in workers in China: implications for hearing risk assessment.

    Science.gov (United States)

    Davis, Robert I; Qiu, Wei; Heyer, Nicholas J; Zhao, Yiming; Qiuling Yang, M S; Li, Nan; Tao, Liyuan; Zhu, Liangliang; Zeng, Lin; Yao, Daohua

    2012-01-01

    This study examined: (1) the value of using the statistical metric, kurtosis [β(t)], along with an energy metric to determine the hazard to hearing from high level industrial noise environments, and (2) the accuracy of the International Standard Organization (ISO-1999:1990) model for median noise-induced permanent threshold shift (NIPTS) estimates with actual recent epidemiological data obtained on 240 highly screened workers exposed to high-level industrial noise in China. A cross-sectional approach was used in this study. Shift-long temporal waveforms of the noise that workers were exposed to for evaluation of noise exposures and audiometric threshold measures were obtained on all selected subjects. The subjects were exposed to only one occupational noise exposure without the use of hearing protection devices. The results suggest that: (1) the kurtosis metric is an important variable in determining the hazards to hearing posed by a high-level industrial noise environment for hearing conservation purposes, i.e., the kurtosis differentiated between the hazardous effects produced by Gaussian and non-Gaussian noise environments, (2) the ISO-1999 predictive model does not accurately estimate the degree of median NIPTS incurred to high level kurtosis industrial noise, and (3) the inherent large variability in NIPTS among subjects emphasize the need to develop and analyze a larger database of workers with well-documented exposures to better understand the effect of kurtosis on NIPTS incurred from high level industrial noise exposures. A better understanding of the role of the kurtosis metric may lead to its incorporation into a new generation of more predictive hearing risk assessment for occupational noise exposure.

  1. Evaluating and optimizing the NERSC workload on Knights Landing

    Energy Technology Data Exchange (ETDEWEB)

    Barnes, T; Cook, B; Deslippe, J; Doerfler, D; Friesen, B; He, Y; Kurth, T; Koskela, T; Lobet, M; Malas, T; Oliker, L; Ovsyannikov, A; Sarje, A; Vay, JL; Vincenti, H; Williams, S; Carrier, P; Wichmann, N; Wagner, M; Kent, P; Kerr, C; Dennis, J

    2017-01-30

    NERSC has partnered with 20 representative application teams to evaluate performance on the Xeon-Phi Knights Landing architecture and develop an application-optimization strategy for the greater NERSC workload on the recently installed Cori system. In this article, we present early case studies and summarized results from a subset of the 20 applications highlighting the impact of important architecture differences between the Xeon-Phi and traditional Xeon processors. We summarize the status of the applications and describe the greater optimization strategy that has formed.

  2. Metrics of quantum states

    International Nuclear Information System (INIS)

    Ma Zhihao; Chen Jingling

    2011-01-01

    In this work we study metrics of quantum states, which are natural generalizations of the usual trace metric and Bures metric. Some useful properties of the metrics are proved, such as the joint convexity and contractivity under quantum operations. Our result has a potential application in studying the geometry of quantum states as well as the entanglement detection.

  3. A framework for quantification of groundwater dynamics - redundancy and transferability of hydro(geo-)logical metrics

    Science.gov (United States)

    Heudorfer, Benedikt; Haaf, Ezra; Barthel, Roland; Stahl, Kerstin

    2017-04-01

    A new framework for quantification of groundwater dynamics has been proposed in a companion study (Haaf et al., 2017). In this framework, a number of conceptual aspects of dynamics, such as seasonality, regularity, flashiness or inter-annual forcing, are described and then linked to quantitative metrics. A large number of possible metrics are readily available from the literature, such as Pardé Coefficients, Colwell's Predictability Indices or the Base Flow Index. In the present work, we focus on finding multicollinearity, and in consequence redundancy, among the metrics representing different patterns of dynamics found in groundwater hydrographs. This also serves to verify the categories of dynamics aspects suggested by Haaf et al., 2017. To determine the optimal set of metrics we need to balance the desired minimum number of metrics against the desired maximum descriptive power of the metrics. To do this, a substantial number of candidate metrics are applied to a diverse set of groundwater hydrographs from France, Germany and Austria within the northern alpine and peri-alpine region. By applying Principal Component Analysis (PCA) to the correlation matrix of the metrics, we determine a limited number of relevant metrics that describe the majority of variation in the dataset. The resulting reduced set of metrics comprises an optimized set that can be used to describe the aspects of dynamics that were identified within the groundwater dynamics framework. For some aspects of dynamics a single significant metric could be attributed. Other aspects have a more fuzzy quality that can only be described by an ensemble of metrics and are re-evaluated. The PCA is furthermore applied to groups of groundwater hydrographs containing regimes of similar behaviour in order to explore transferability when applying the metric-based characterization framework to groups of hydrographs from diverse groundwater systems. In conclusion, we identify an optimal number of metrics
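
    An illustrative sketch of the reduction step described above: standardise a (hydrographs × metrics) table and apply PCA to see how much variance a few components, and hence a reduced metric set, can carry. The metric names and random data are assumptions.

```python
# Illustrative sketch of the reduction step: standardise a (hydrographs x metrics)
# table and apply PCA to gauge redundancy among metrics. The metric columns and
# random data are assumptions, not the French/German/Austrian dataset.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n_wells, metric_names = 200, ["seasonality", "flashiness", "BFI", "Parde", "Colwell_P"]
metrics = rng.normal(size=(n_wells, len(metric_names)))
metrics[:, 3] = 0.8 * metrics[:, 0] + 0.2 * rng.normal(size=n_wells)  # redundant metric

pca = PCA().fit(StandardScaler().fit_transform(metrics))
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 2))
```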

  4. A Health-Based Metric for Evaluating the Effectiveness of Noise Barrier Mitigation Associated With Transport Infrastructure Noise

    Directory of Open Access Journals (Sweden)

    Geoffrey P Prendergast

    2017-01-01

    Full Text Available Introduction: This study examines the use of the number of night-time sleep disturbances as a health-based metric to assess the cost effectiveness of rail noise mitigation strategies for situations wherein high-intensity noises, such as freight train pass-bys and wheel squeal, dominate. Materials and Methods: Twenty residential properties adjacent to the existing and proposed rail tracks in a noise catchment area of the Epping to Thornleigh Third Track project were used as a case study. Awakening probabilities were calculated for individuals awakening 1, 3 and 5 times a night when subjected to 10 independent freight train pass-by noise events, using internal maximum sound pressure levels (LAFmax). Results: Awakenings were predicted using a random intercept multivariate logistic regression model. With source mitigation in place, the majority of the residents were still predicted to be awoken at least once per night (median 88.0%), although substantial reductions in the median probabilities of awakening three and five times per night, from 50.9 to 29.4% and 9.2 to 2.7%, respectively, were predicted. This resulted in a cost-effectiveness estimate of 7.6–8.8 fewer people being awoken at least three times per night per A$1 million spent on noise barriers. Conclusion: The study demonstrates that an easily understood metric can be readily used to assist in making decisions related to noise mitigation for large-scale transport projects.
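
    A back-of-envelope version of the reported quantity, the probability of being awoken at least k times by N independent pass-by events, assuming a single per-event awakening probability p. The paper instead fitted a random-intercept multivariate logistic regression on internal LAFmax; p here is invented.

```python
# Back-of-envelope sketch: probability of at least k awakenings from N independent
# pass-by events given a single per-event awakening probability p. The paper used
# a random-intercept logistic regression; p = 0.2 is an invented placeholder.
from math import comb

def prob_at_least(k, n_events=10, p=0.2):
    return sum(comb(n_events, i) * p**i * (1 - p)**(n_events - i)
               for i in range(k, n_events + 1))

for k in (1, 3, 5):
    print(f"P(awoken >= {k} times) = {prob_at_least(k):.3f}")
```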

  5. Analytical performance evaluation of a high-volume hematology laboratory utilizing sigma metrics as standard of excellence.

    Science.gov (United States)

    Shaikh, M S; Moiz, B

    2016-04-01

    Around two-thirds of important clinical decisions about the management of patients are based on laboratory test results. Clinical laboratories are required to adopt quality control (QC) measures to ensure provision of accurate and precise results. Six sigma is a statistical tool which provides an opportunity to assess performance at the highest level of excellence. The purpose of this study was to assess the performance of our hematological parameters on the sigma scale in order to identify gaps and hence areas of improvement in patient care. Twelve analytes included in the study were hemoglobin (Hb), hematocrit (Hct), red blood cell count (RBC), mean corpuscular volume (MCV), red cell distribution width (RDW), total leukocyte count (TLC) with percentages of neutrophils (Neutr%) and lymphocytes (Lymph%), platelet count (Plt), mean platelet volume (MPV), prothrombin time (PT), and fibrinogen (Fbg). Internal quality control data and external quality assurance survey results were utilized for the calculation of sigma metrics for each analyte. Acceptable sigma values of ≥3 were obtained for the majority of the analytes included in the analysis. MCV, Plt, and Fbg performed poorly on both level 1 and level 2 controls, with sigma values of <3. Despite acceptable conventional QC tools, application of sigma metrics can identify analytical deficits and hence prospects for the improvement in clinical laboratories. © 2016 John Wiley & Sons Ltd.
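
    A minimal sketch of the sigma-metric calculation such an evaluation relies on, sigma = (TEa - |bias|) / CV with all terms in percent; the allowable-total-error value below is illustrative, not the limit used in the cited study:

        def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
            """Sigma = (allowable total error - |bias|) / imprecision, all in percent."""
            return (tea_pct - abs(bias_pct)) / cv_pct

        # Illustrative example: hemoglobin with TEa = 7%, bias = 1.2%, CV = 1.5%
        print(round(sigma_metric(7.0, 1.2, 1.5), 2))   # 3.87 -> acceptable (>= 3)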

  6. Complexity and Pilot Workload Metrics for the Evaluation of Adaptive Flight Controls on a Full Scale Piloted Aircraft

    Science.gov (United States)

    Hanson, Curt; Schaefer, Jacob; Burken, John J.; Larson, David; Johnson, Marcus

    2014-01-01

    Flight research has shown the effectiveness of adaptive flight controls for improving aircraft safety and performance in the presence of uncertainties. The National Aeronautics and Space Administration's (NASA) Integrated Resilient Aircraft Control (IRAC) project designed and conducted a series of flight experiments to study the impact of variations in adaptive controller design complexity on performance and handling qualities. A novel complexity metric was devised to compare the degrees of simplicity achieved in three variations of a model reference adaptive controller (MRAC) for NASA's F-18 (McDonnell Douglas, now The Boeing Company, Chicago, Illinois) Full-Scale Advanced Systems Testbed (Gen-2A) aircraft. The complexity measures of these controllers are also compared to that of an earlier MRAC design for NASA's Intelligent Flight Control System (IFCS) project and flown on a highly modified F-15 aircraft (McDonnell Douglas, now The Boeing Company, Chicago, Illinois). Pilot comments during the IRAC research flights pointed to the importance of workload on handling qualities ratings for failure and damage scenarios. Modifications to existing pilot aggressiveness and duty cycle metrics are presented and applied to the IRAC controllers. Finally, while adaptive controllers may alleviate the effects of failures or damage on an aircraft's handling qualities, they also have the potential to introduce annoying changes to the flight dynamics or to the operation of aircraft systems. A nuisance rating scale is presented for the categorization of nuisance side-effects of adaptive controllers.

  7. $\eta$-metric structures

    OpenAIRE

    Gaba, Yaé Ulrich

    2017-01-01

    In this paper, we discuss recent results about generalized metric spaces and fixed point theory. We introduce the notion of $\eta$-cone metric spaces, give some topological properties and prove some fixed point theorems for contractive type maps on these spaces. In particular, we show that these $\eta$-cone metric spaces are natural generalizations of both cone metric spaces and metric type spaces.

  8. Evaluating IMRT and VMAT dose accuracy: Practical examples of failure to detect systematic errors when applying a commonly used metric and action levels

    Energy Technology Data Exchange (ETDEWEB)

    Nelms, Benjamin E. [Canis Lupus LLC, Merrimac, Wisconsin 53561 (United States); Chan, Maria F. [Memorial Sloan-Kettering Cancer Center, Basking Ridge, New Jersey 07920 (United States); Jarry, Geneviève; Lemire, Matthieu [Hôpital Maisonneuve-Rosemont, Montréal, QC H1T 2M4 (Canada); Lowden, John [Indiana University Health - Goshen Hospital, Goshen, Indiana 46526 (United States); Hampton, Carnell [Levine Cancer Institute/Carolinas Medical Center, Concord, North Carolina 28025 (United States); Feygelman, Vladimir [Moffitt Cancer Center, Tampa, Florida 33612 (United States)

    2013-11-15

    Purpose: This study (1) examines a variety of real-world cases where systematic errors were not detected by widely accepted methods for IMRT/VMAT dosimetric accuracy evaluation, and (2) drills down to identify failure modes and their corresponding means for detection, diagnosis, and mitigation. The primary goal of detailing these case studies is to explore different, more sensitive methods and metrics that could be used more effectively for evaluating accuracy of dose algorithms, delivery systems, and QA devices. Methods: The authors present seven real-world case studies representing a variety of combinations of the treatment planning system (TPS), linac, delivery modality, and systematic error type. These case studies are typical of what might be used as part of an IMRT or VMAT commissioning test suite, varying in complexity. Each case study is analyzed according to TG-119 instructions for gamma passing rates and action levels for per-beam and/or composite plan dosimetric QA. Then, each case study is analyzed in-depth with advanced diagnostic methods (dose profile examination, EPID-based measurements, dose difference pattern analysis, 3D measurement-guided dose reconstruction, and dose grid inspection) and more sensitive metrics (2% local normalization/2 mm DTA and estimated DVH comparisons). Results: For these case studies, the conventional 3%/3 mm gamma passing rates exceeded 99% for IMRT per-beam analyses and ranged from 93.9% to 100% for composite plan dose analysis, well above the TG-119 action levels of 90% and 88%, respectively. However, all cases had systematic errors that were detected only by using advanced diagnostic techniques and more sensitive metrics. The systematic errors caused variable but noteworthy impact, including estimated target dose coverage loss of up to 5.5% and local dose deviations up to 31.5%. Types of errors included TPS model settings, algorithm limitations, and modeling and alignment of QA phantoms in the TPS. Most of the errors were
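
    A minimal sketch of the gamma comparison underlying these passing rates, reduced to 1-D dose profiles on a common grid; the %/mm criteria and the local/global normalization switch are the knobs the study varies (the real analyses are 2-D/3-D and use commercial QA software, so this is only an illustration):

        import numpy as np

        def gamma_pass_rate(ref, meas, spacing_mm, dose_frac=0.03, dta_mm=3.0, local=False):
            """Percentage of reference points with gamma <= 1 (simplified, 1-D, same grid)."""
            ref, meas = np.asarray(ref, float), np.asarray(meas, float)
            x = np.arange(ref.size) * spacing_mm
            norm = ref if local else np.full_like(ref, ref.max())   # local vs global dose normalization
            gamma = np.empty(ref.size)
            for i in range(ref.size):
                dose_term = (meas - ref[i]) / (dose_frac * norm[i])
                dist_term = (x - x[i]) / dta_mm
                gamma[i] = np.sqrt(dose_term**2 + dist_term**2).min()
            return 100.0 * float((gamma <= 1.0).mean())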

  9. Evaluation of dose-volume metrics for microbeam radiation therapy dose distributions in head phantoms of various sizes using Monte Carlo simulations

    Science.gov (United States)

    Anderson, Danielle; Siegbahn, E. Albert; Fallone, B. Gino; Serduc, Raphael; Warkentin, Brad

    2012-05-01

    This work evaluates four dose-volume metrics applied to microbeam radiation therapy (MRT) using simulated dosimetric data as input. We seek to improve upon the most frequently used MRT metric, the peak-to-valley dose ratio (PVDR), by analyzing MRT dose distributions from a more volumetric perspective. Monte Carlo simulations were used to calculate dose distributions in three cubic head phantoms: a 2 cm mouse head, an 8 cm cat head and a 16 cm dog head. The dose distribution was calculated for a 4 × 4 mm2 microbeam array in each phantom, as well as a 16 × 16 mm2 array in the 8 cm cat head, and a 32 × 32 mm2 array in the 16 cm dog head. Microbeam widths of 25, 50 and 75 µm and center-to-center spacings of 100, 200 and 400 µm were considered. The metrics calculated for each simulation were the conventional PVDR, the peak-to-mean valley dose ratio (PMVDR), the mean dose and the percentage volume below a threshold dose. The PVDR ranged between 3 and 230 for the 2 cm mouse phantom, and between 2 and 186 for the 16 cm dog phantom depending on geometry. The corresponding ranges for the PMVDR were much smaller, being 2-49 (mouse) and 2-46 (dog), and showed a slightly weaker dependence on phantom size and array size. The ratio of the PMVDR to the PVDR varied from 0.21 to 0.79 for the different collimation configurations, indicating a difference between the geometric dependence on outcome that would be predicted by these two metrics. For unidirectional irradiation, the mean lesion dose was 102%, 79% and 42% of the mean skin dose for the 2 cm mouse, 8 cm cat and 16 cm dog head phantoms, respectively. However, the mean lesion dose recovered to 83% of the mean skin dose in the 16 cm dog phantom in intersecting cross-firing regions. The percentage volume below a 10% dose threshold was highly dependent on geometry, with ranges for the different collimation configurations of 2-87% and 33-96% for the 2 cm mouse and 16 cm dog heads, respectively. The results of this study
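
    The four metrics can be written down compactly. A minimal sketch for a 1-D lateral dose profile, with peak and valley sample locations assumed known from the beam geometry (one common convention; the paper's Monte Carlo dose grids are 3-D):

        import numpy as np

        def mrt_dose_metrics(dose, peak_idx, valley_idx, threshold_frac=0.10):
            dose = np.asarray(dose, float)
            pvdr = dose[peak_idx].max() / dose[valley_idx].min()        # peak-to-valley dose ratio
            pmvdr = dose[peak_idx].mean() / dose[valley_idx].mean()     # peak-to-mean-valley dose ratio
            mean_dose = dose.mean()
            pct_below = 100.0 * float((dose < threshold_frac * dose.max()).mean())
            return pvdr, pmvdr, mean_dose, pct_below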

  10. Evaluation of dose-volume metrics for microbeam radiation therapy dose distributions in head phantoms of various sizes using Monte Carlo simulations

    International Nuclear Information System (INIS)

    Anderson, Danielle; Fallone, B Gino; Warkentin, Brad; Siegbahn, E Albert; Serduc, Raphael

    2012-01-01

    This work evaluates four dose-volume metrics applied to microbeam radiation therapy (MRT) using simulated dosimetric data as input. We seek to improve upon the most frequently used MRT metric, the peak-to-valley dose ratio (PVDR), by analyzing MRT dose distributions from a more volumetric perspective. Monte Carlo simulations were used to calculate dose distributions in three cubic head phantoms: a 2 cm mouse head, an 8 cm cat head and a 16 cm dog head. The dose distribution was calculated for a 4 × 4 mm 2 microbeam array in each phantom, as well as a 16 × 16 mm 2 array in the 8 cm cat head, and a 32 × 32 mm 2 array in the 16 cm dog head. Microbeam widths of 25, 50 and 75 µm and center-to-center spacings of 100, 200 and 400 µm were considered. The metrics calculated for each simulation were the conventional PVDR, the peak-to-mean valley dose ratio (PMVDR), the mean dose and the percentage volume below a threshold dose. The PVDR ranged between 3 and 230 for the 2 cm mouse phantom, and between 2 and 186 for the 16 cm dog phantom depending on geometry. The corresponding ranges for the PMVDR were much smaller, being 2–49 (mouse) and 2–46 (dog), and showed a slightly weaker dependence on phantom size and array size. The ratio of the PMVDR to the PVDR varied from 0.21 to 0.79 for the different collimation configurations, indicating a difference between the geometric dependence on outcome that would be predicted by these two metrics. For unidirectional irradiation, the mean lesion dose was 102%, 79% and 42% of the mean skin dose for the 2 cm mouse, 8 cm cat and 16 cm dog head phantoms, respectively. However, the mean lesion dose recovered to 83% of the mean skin dose in the 16 cm dog phantom in intersecting cross-firing regions. The percentage volume below a 10% dose threshold was highly dependent on geometry, with ranges for the different collimation configurations of 2–87% and 33–96% for the 2 cm mouse and 16 cm dog heads, respectively. The results of this

  11. Uncertainty in BMP evaluation and optimization for watershed management

    Science.gov (United States)

    Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.

    2012-12-01

    Use of computer simulation models has increased substantially to make watershed management decisions and to develop strategies for water quality improvements. These models are often used to evaluate potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, use of simulation models in optimizing selection and placement of best management practices under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One of the limitations of the currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g. precipitation, streamflow, sediment, nutrient and pesticide losses measured, land use) and model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management. We have also applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We have used a watershed model (Soil and Water Assessment Tool or SWAT) to simulate the hydrology and water quality in a mixed land use watershed located in the Midwest USA. The SWAT model was also used to represent various BMPs in the watershed needed to improve water quality. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads. In addition, climate change scenarios also affected uncertainties in SWAT

  12. Optimized data evaluation for k0-based NAA

    International Nuclear Information System (INIS)

    Van Sluijs, R.; Bossus, D.A.W.

    1999-01-01

    k0-NAA allows the simultaneous analysis of up to 67 elements. The k0 method is based on calculations using a special library instead of measuring standards. For an efficient use of the method, the calculations and resulting raw data require optimized evaluation procedures. In this paper, two efficient procedures for nuclide identification and gamma interference correction are outlined. For a fast computation of the source-detector efficiency and coincidence correction factors the matrix interpolation technique is introduced. (author)

  13. Capital Cost Optimization for Prefabrication: A Factor Analysis Evaluation Model

    Directory of Open Access Journals (Sweden)

    Hong Xue

    2018-01-01

    Full Text Available High capital cost is a significant hindrance to the promotion of prefabrication. In order to optimize cost management and reduce capital cost, this study aims to explore the latent factors and a factor analysis evaluation model. Semi-structured interviews were conducted to explore potential variables and then a questionnaire survey was employed to collect professionals’ views on their effects. After data collection, exploratory factor analysis was adopted to explore the latent factors. Seven latent factors were identified, including “Management Index”, “Construction Dissipation Index”, “Productivity Index”, “Design Efficiency Index”, “Transport Dissipation Index”, “Material increment Index” and “Depreciation amortization Index”. With these latent factors, a factor analysis evaluation model (FAEM), divided into a factor analysis model (FAM) and a comprehensive evaluation model (CEM), was established. The FAM was used to explore the effect of observed variables on the high capital cost of prefabrication, while the CEM was used to evaluate the comprehensive cost management level of prefabrication projects. Case studies were conducted to verify the models. The results revealed that collaborative management had a positive effect on capital cost of prefabrication. Material increment costs and labor costs had significant impacts on production cost. This study demonstrated the potential of on-site management and standardization design to reduce capital cost. Hence, collaborative management is necessary for cost management of prefabrication. Innovation and detailed design were needed to improve cost performance. The new form of precast component factories can be explored to reduce transportation cost. Meanwhile, targeted strategies can be adopted for different prefabrication projects. The findings optimized the capital cost and improved the cost performance through providing an evaluation and optimization model, which helps managers to

  14. Evaluation and optimization of feed-in tariffs

    International Nuclear Information System (INIS)

    Kim, Kyoung-Kuk; Lee, Chi-Guhn

    2012-01-01

    A feed-in tariff (FIT) program is an incentive plan that provides investors with a set payment for electricity generated from renewable energy sources that is fed into the power grid. As of today, FIT is being used by over 75 jurisdictions around the world and offers a number of design options to achieve policy goals. The objective of this paper is to propose a quantitative model by which a specific FIT program can be evaluated and hence optimized. We focus on payoff structure, which has a direct impact on the net present value of the investment, and other parameters relevant to investor reaction and electricity prices. We combine cost modeling, option valuation, and consumer choice so as to simulate the performance of a FIT program of interest in various scenarios. The model is used to define an optimization problem from the perspective of a policy maker who wants to increase the contribution of renewable energy to the overall energy supply, while keeping the total burden on ratepayers under control. Numerical studies shed light on the interactions among design options, program parameters, and the performance of a FIT program. - Highlights: ► A quantitative model to evaluate and optimize feed-in tariff policies. ► Net present value of investment in renewable energy under a given feed-in tariff policy. ► Analysis of the interactions of policy options and relevant parameters. ► Recommendations for how to set policy options for a feed-in tariff program.

  15. A Parallel Particle Swarm Optimization Algorithm Accelerated by Asynchronous Evaluations

    Science.gov (United States)

    Venter, Gerhard; Sobieszczanski-Sobieski, Jaroslaw

    2005-01-01

    A parallel Particle Swarm Optimization (PSO) algorithm is presented. Particle swarm optimization is a fairly recent addition to the family of non-gradient based, probabilistic search algorithms that is based on a simplified social model and is closely tied to swarming theory. Although PSO algorithms present several attractive properties to the designer, they are plagued by high computational cost as measured by elapsed time. One approach to reduce the elapsed time is to make use of coarse-grained parallelization to evaluate the design points. Previous parallel PSO algorithms were mostly implemented in a synchronous manner, where all design points within a design iteration are evaluated before the next iteration is started. This approach leads to poor parallel speedup in cases where a heterogeneous parallel environment is used and/or where the analysis time depends on the design point being analyzed. This paper introduces an asynchronous parallel PSO algorithm that greatly improves the parallel efficiency. The asynchronous algorithm is benchmarked on a cluster assembled of Apple Macintosh G5 desktop computers, using the multi-disciplinary optimization of a typical transport aircraft wing as an example.
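
    A minimal sketch of the asynchronous scheduling idea described above, using a thread pool to stand in for the heterogeneous cluster and a cheap placeholder objective; each particle is moved and resubmitted as soon as its own evaluation returns, instead of waiting for the whole swarm (not the benchmarked code, just the scheme):

        import numpy as np
        from concurrent.futures import ThreadPoolExecutor, as_completed

        def objective(x):                      # placeholder for an expensive analysis
            return float(np.sum(x**2))

        def async_pso(dim=4, n_particles=8, n_evals=200, w=0.7, c1=1.5, c2=1.5, seed=0):
            rng = np.random.default_rng(seed)
            x = rng.uniform(-5.0, 5.0, (n_particles, dim))
            v = np.zeros_like(x)
            pbest, pbest_f = x.copy(), np.full(n_particles, np.inf)
            gbest, gbest_f = x[0].copy(), np.inf
            submitted = n_particles
            with ThreadPoolExecutor(max_workers=4) as pool:
                pending = {pool.submit(objective, x[i].copy()): i for i in range(n_particles)}
                while pending:
                    fut = next(as_completed(pending))     # handle whichever point finishes first
                    i = pending.pop(fut)
                    fx = fut.result()
                    if fx < pbest_f[i]:
                        pbest_f[i], pbest[i] = fx, x[i].copy()
                    if fx < gbest_f:
                        gbest_f, gbest = fx, x[i].copy()
                    if submitted < n_evals:               # update and resubmit this particle only
                        r1, r2 = rng.random(dim), rng.random(dim)
                        v[i] = w*v[i] + c1*r1*(pbest[i] - x[i]) + c2*r2*(gbest - x[i])
                        x[i] = x[i] + v[i]
                        pending[pool.submit(objective, x[i].copy())] = i
                        submitted += 1
            return gbest, gbest_f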

  16. Liver fibrosis: in vivo evaluation using intravoxel incoherent motion-derived histogram metrics with histopathologic findings at 3.0 T.

    Science.gov (United States)

    Hu, Fubi; Yang, Ru; Huang, Zixing; Wang, Min; Zhang, Hanmei; Yan, Xu; Song, Bin

    2017-12-01

    To retrospectively determine the feasibility of intravoxel incoherent motion (IVIM) imaging based on histogram analysis for the staging of liver fibrosis (LF) using histopathologic findings as the reference standard. 56 consecutive patients (14 men, 42 women; age range, 15-76 years) with chronic liver diseases (CLDs) were studied using IVIM-DWI with 9 b-values (0, 25, 50, 75, 100, 150, 200, 500, 800 s/mm²) at 3.0 T. Fibrosis stage was evaluated using the METAVIR scoring system. Histogram metrics including mean, standard deviation (Std), skewness, kurtosis, minimum (Min), maximum (Max), range, interquartile (Iq) range, and percentiles (10, 25, 50, 75, 90th) were extracted from apparent diffusion coefficient (ADC), true diffusion coefficient (D), pseudo-diffusion coefficient (D*), and perfusion fraction (f) maps. All histogram metrics among different fibrosis groups were compared using one-way analysis of variance or nonparametric Kruskal-Wallis test. For significant parameters, receiver operating characteristic (ROC) curve analyses were further performed for the staging of LF. Based on their METAVIR stage, the 56 patients were reclassified into three groups as follows: F0-1 group (n = 25), F2-3 group (n = 21), and F4 group (n = 10). The mean, Iq range, and percentiles (50th, 75th, and 90th) of the D* maps showed significant differences between the groups (all P < 0.05), whereas the histogram metrics of the ADC, D, and f maps demonstrated no significant difference among the groups (all P > 0.05). Histogram analysis of the D* map derived from IVIM can be used to stage liver fibrosis in patients with CLDs and provide more quantitative information beyond the mean value.
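
    A minimal sketch of the histogram-metric extraction applied to a fitted parameter map (for example the D* map) within a liver ROI; scipy is assumed for skewness and kurtosis, and the ROI mask is taken as given:

        import numpy as np
        from scipy import stats

        def histogram_metrics(param_map, roi_mask):
            vals = np.asarray(param_map, float)[np.asarray(roi_mask, bool)]
            p10, p25, p50, p75, p90 = np.percentile(vals, [10, 25, 50, 75, 90])
            return {
                "mean": vals.mean(), "std": vals.std(ddof=1),
                "skewness": float(stats.skew(vals)), "kurtosis": float(stats.kurtosis(vals)),
                "min": vals.min(), "max": vals.max(), "range": vals.max() - vals.min(),
                "iq_range": p75 - p25,
                "p10": p10, "p25": p25, "p50": p50, "p75": p75, "p90": p90,
            }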

  17. Process optimization and evaluation of novel baicalin solid nanocrystals

    Directory of Open Access Journals (Sweden)

    Yue PF

    2013-08-01

    Full Text Available The objective of this study was to prepare baicalin solid nanocrystals (BCN-SNS) to enhance the oral bioavailability of baicalin. A Box–Behnken design approach was used for process optimization. The physicochemical properties and pharmacokinetics of the optimal BCN-SNS were investigated. Multiple linear regression analysis for process optimization revealed that the fine BCN-SNS was obtained wherein the optimal values of homogenization pressure, homogenization cycles, amount of TPGS to drug (w/w), and amount of MCCS to drug (w/w) were 850 bar, 25 cycles, 10%, and 10%, respectively. Transmission electron microscopy and scanning electron microscopy results indicated that no significant aggregation or crystal growth could be observed in the redispersed freeze-dried BCN-SNS. Differential scanning calorimetry and X-ray diffraction results showed that BCN remained in a crystalline state. Dissolution velocity of the freeze-dried BCN-SNS powder was distinctly superior compared to those of the crude powder and physical mixture. The bioavailability of BCN in rats was increased remarkably after oral administration of BCN-SNS (P < 0.05), compared with those of BCN or the physical mixture. The SNS might be a good choice for oral administration of poorly soluble BCN, due to an improvement of the bioavailability and dissolution velocity of BCN-SNS. Keywords: baicalin, solid nanocrystals, optimization, in vivo/vitro evaluation

  18. Observationally-based Metrics of Ocean Carbon and Biogeochemical Variables are Essential for Evaluating Earth System Model Projections

    Science.gov (United States)

    Russell, J. L.; Sarmiento, J. L.

    2017-12-01

    The Southern Ocean is central to the climate's response to increasing levels of atmospheric greenhouse gases as it ventilates a large fraction of the global ocean volume. Global coupled climate models and earth system models, however, vary widely in their simulations of the Southern Ocean and its role in, and response to, the ongoing anthropogenic forcing. Due to its complex water-mass structure and dynamics, Southern Ocean carbon and heat uptake depend on a combination of winds, eddies, mixing, buoyancy fluxes and topography. Understanding how the ocean carries heat and carbon into its interior and how the observed wind changes are affecting this uptake is essential to accurately projecting transient climate sensitivity. Observationally-based metrics are critical for discerning processes and mechanisms, and for validating and comparing climate models. As the community shifts toward Earth system models with explicit carbon simulations, more direct observations of important biogeochemical parameters, like those obtained from the biogeochemically-sensored floats that are part of the Southern Ocean Carbon and Climate Observations and Modeling project, are essential. One goal of future observing systems should be to create observationally-based benchmarks that will lead to reducing uncertainties in climate projections, and especially uncertainties related to oceanic heat and carbon uptake.

  19. Characterizing uncertainty when evaluating risk management metrics: risk assessment modeling of Listeria monocytogenes contamination in ready-to-eat deli meats.

    Science.gov (United States)

    Gallagher, Daniel; Ebel, Eric D; Gallagher, Owen; Labarre, David; Williams, Michael S; Golden, Neal J; Pouillot, Régis; Dearfield, Kerry L; Kause, Janell

    2013-04-01

    This report illustrates how the uncertainty about food safety metrics may influence the selection of a performance objective (PO). To accomplish this goal, we developed a model concerning Listeria monocytogenes in ready-to-eat (RTE) deli meats. This application used a second order Monte Carlo model that simulates L. monocytogenes concentrations through a series of steps: the food-processing establishment, transport, retail, the consumer's home and consumption. The model accounted for growth inhibitor use, retail cross contamination, and applied an FAO/WHO dose response model for evaluating the probability of illness. An appropriate level of protection (ALOP) risk metric was selected as the average risk of illness per serving across all consumed servings-per-annum and the model was used to solve for the corresponding performance objective (PO) risk metric as the maximum allowable L. monocytogenes concentration (cfu/g) at the processing establishment where regulatory monitoring would occur. Given uncertainty about model inputs, an uncertainty distribution of the PO was estimated. Additionally, we considered how RTE deli meats contaminated at levels above the PO would be handled by the industry using three alternative approaches. Points on the PO distribution represent the probability that - if the industry complies with a particular PO - the resulting risk-per-serving is less than or equal to the target ALOP. For example, assuming (1) a target ALOP of -6.41 log10 risk of illness per serving, (2) industry concentrations above the PO that are re-distributed throughout the remaining concentration distribution and (3) no dose response uncertainty, establishment PO's of -4.98 and -4.39 log10 cfu/g would be required for 90% and 75% confidence that the target ALOP is met, respectively. The PO concentrations from this example scenario are more stringent than the current typical monitoring level of an absence in 25 g (i.e., -1.40 log10 cfu/g) or a stricter criteria of absence

  20. Physiologically based pharmacokinetic rat model for methyl tertiary-butyl ether; comparison of selected dose metrics following various MTBE exposure scenarios used for toxicity and carcinogenicity evaluation

    International Nuclear Information System (INIS)

    Borghoff, Susan J.; Parkinson, Horace; Leavens, Teresa L.

    2010-01-01

    There are a number of cancer and toxicity studies that have been carried out to assess hazard from methyl tertiary-butyl ether (MTBE) exposure via inhalation and oral administration. MTBE has been detected in surface as well as ground water supplies, which emphasizes the need to assess the risk from exposure via drinking water contamination. Recently an updated rat physiologically based pharmacokinetic (PBPK) model was published that relied on a description of MTBE and its metabolite tertiary-butyl alcohol (TBA) binding to α2u-globulin, a male rat-specific protein. This model was used to predict concentrations of MTBE and TBA in the kidney, a target tissue in the male rat, and can now be used to evaluate route-to-route extrapolation issues concerning MTBE exposures, as well as to compare potential dose metrics that may provide insight into differences in biological responses observed in rats following different routes of MTBE exposure. The objective of this study was to use this model to evaluate the dosimetry of MTBE and TBA in rats following different exposure scenarios used to evaluate the toxicity and carcinogenicity of MTBE, and to compare various dose metrics under these different conditions. Model simulations suggested that although inhalation and drinking water exposures show a similar pattern of MTBE and TBA exposure in the blood and kidney (i.e. concentration-time profiles), the total blood and kidney levels following exposure to 7.5 mg/ml MTBE in the drinking water for 90 days are in the same range as administration of an oral dose of 1000 mg/kg MTBE. Evaluation of the dose metrics also supports that a high oral bolus dose (i.e. 1000 mg/kg MTBE) results in a greater percentage of the dose exhaled as MTBE with a lower percentage metabolized to TBA, as compared to a dose of MTBE delivered over a longer period of time, as in the case of drinking water.

  1. Running from features: Optimized evaluation of inflationary power spectra

    Science.gov (United States)

    Motohashi, Hayato; Hu, Wayne

    2015-08-01

    In models like axion monodromy, temporal features during inflation which are not associated with its ending can produce scalar, and to a lesser extent, tensor power spectra where deviations from scale-free power law spectra can be as large as the deviations from scale invariance itself. Here the standard slow-roll approach breaks down since its parameters evolve on an e-folding scale ΔN much smaller than the e-folds to the end of inflation. Using the generalized slow-roll approach, we show that the expansion of observables in a hierarchy of potential or Hubble evolution parameters comes from a Taylor expansion of the features around an evaluation point that can be optimized. Optimization of the leading-order expression provides a sufficiently accurate approximation for current data as long as the power spectrum can be described over the well-observed few e-folds by the local tilt and running. Standard second-order approaches, often used in the literature, ironically are worse than leading-order approaches due to inconsistent evaluation of observables. We develop a new optimized next-order approach which predicts observables to 10^-3 even for ΔN ∼ 1, where all parameters in the infinite hierarchy are of comparable magnitude. For models with ΔN ≪ 1, the generalized slow-roll approach provides integral expressions that are accurate to second order in the deviation from scale invariance. Their evaluation in the monodromy model provides highly accurate explicit relations between the running oscillation amplitude, frequency, and phase in the curvature spectrum and parameters of the potential.

  2. METRIC context unit architecture

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, R.O.

    1988-01-01

    METRIC is an architecture for a simple but powerful Reduced Instruction Set Computer (RISC). Its speed comes from the simultaneous processing of several instruction streams, with instructions from the various streams being dispatched into METRIC's execution pipeline as they become available for execution. The pipeline is thus kept full, with a mix of instructions for several contexts in execution at the same time. True parallel programming is supported within a single execution unit, the METRIC Context Unit. METRIC's architecture provides for expansion through the addition of multiple Context Units and of specialized Functional Units. The architecture thus spans a range of size and performance from a single-chip microcomputer up through large and powerful multiprocessors. This research concentrates on the specification of the METRIC Context Unit at the architectural level. Performance tradeoffs made during METRIC's design are discussed, and projections of METRIC's performance are made based on simulation studies.

  3. Does Implementation Follow Design? A Case Study of a Workplace Health Promotion Program Using the 4-S Program Design and the PIPE Impact Metric Evaluation Models.

    Science.gov (United States)

    Äikäs, Antti Hermanni; Pronk, Nicolaas P; Hirvensalo, Mirja Hannele; Absetz, Pilvikki

    2017-08-01

    The aim of this study was to describe the content of a multiyear market-based workplace health promotion (WHP) program and to evaluate design and implementation processes in a real-world setting. Data were collected from the databases of the employer and the service provider. It was classified using the 4-S (Size, Scope, Scalability, and Sustainability) and PIPE Impact Metric (Penetration, Implementation) models. Data analysis utilized both qualitative and quantitative methods. Program design covered the evidence-informed best practices well, except for a clear path toward sustainability, cooperation with occupational health care, and support from middle-management supervisors. The penetration rate among participants was high (99%) and the majority (81%) of services were implemented as designed. Study findings indicate that the WHP market would benefit from the use of evidence-based design principles and tendentious decisions to anticipate a long-term implementation process already during the planning phase.

  4. Accounting for observation uncertainties in an evaluation metric of low latitude turbulent air-sea fluxes: application to the comparison of a suite of IPSL model versions

    Science.gov (United States)

    Servonnat, Jérôme; Găinuşă-Bogdan, Alina; Braconnot, Pascale

    2017-09-01

    Turbulent momentum and heat (sensible heat and latent heat) fluxes at the air-sea interface are key components of the overall energetics of the Earth's climate. The evaluation of these fluxes in the climate models is still difficult because of the large uncertainties associated with the reference products. In this paper we present an objective metric accounting for reference uncertainties to evaluate the annual cycle of the low latitude turbulent fluxes of a suite of IPSL climate models. This metric consists of a Hotelling T² test between the simulated and observed field in a reduced space characterized by the dominant modes of variability that are common to both the model and the reference, taking into account the observational uncertainty. The test is thus more severe when uncertainties are small, as is the case for sea surface temperature (SST). The results of the test show that for almost all variables and all model versions the model-reference differences are not zero. It is not possible to distinguish between model versions for sensible heat and meridional wind stress, certainly due to the large observational uncertainties. All model versions share similar biases for the different variables. There is no improvement between the reference versions of the IPSL model used for CMIP3 and CMIP5. The test also reveals that the higher horizontal resolution fails to improve the representation of the turbulent surface fluxes compared to the other versions. The representation of the fluxes is further degraded in a version with improved atmospheric physics with an amplification of some of the biases in the Indian Ocean and in the intertropical convergence zone. The ranking of the model versions for the turbulent fluxes is not correlated with the ranking found for SST. This highlights that despite the fact that SST gradients are important for the large-scale atmospheric circulation patterns, other factors, such as wind speed and air-sea temperature contrast, play an
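
    A minimal sketch of the idea (not the authors' implementation): the model-minus-reference field is projected onto the leading modes of the spread among reference products, and a Hotelling-type T² statistic is formed with the covariance of the reference projections, so that small observational uncertainty makes the test more severe:

        import numpy as np

        def t2_in_reduced_space(model_field, ref_fields, n_modes=3):
            refs = np.asarray(ref_fields, float)           # (n_products, n_gridpoints)
            mean_ref = refs.mean(axis=0)
            anom = refs - mean_ref
            _, _, vt = np.linalg.svd(anom, full_matrices=False)
            modes = vt[:n_modes]                           # leading modes of the reference spread
            z_ref = anom @ modes.T                         # reference scores
            d = (np.asarray(model_field, float) - mean_ref) @ modes.T   # model score
            cov = np.cov(z_ref, rowvar=False)              # observational uncertainty in mode space
            t2 = float(d @ np.linalg.solve(cov, d))        # compared against an F distribution in the paper
            return t2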

  5. Optimal differentiation of high- and low-grade glioma and metastasis: a meta-analysis of perfusion, diffusion, and spectroscopy metrics

    Energy Technology Data Exchange (ETDEWEB)

    Usinskiene, Jurgita; Venius, Jonas; Rynkeviciene, Ryte; Norkus, Darius; Suziedelis, Kestutis [National Cancer Institute, Radiology Center, Vilnius (Lithuania); Ulyte, Agne [Vilnius University, Faculty of Medicine, Vilnius (Lithuania); Bjoernerud, Atle [Oslo University Hospital, Department of Physics, Oslo (Norway); Oslo University Hospital, The Intervention Centre, Oslo (Norway); Katsaros, Vasileios K. [General Anti-Cancer and Oncological Hospital ' ' St. Savvas' ' , Athens (Greece); Letautiene, Simona; Aleknavicius, Eduardas [National Cancer Institute, Radiology Center, Vilnius (Lithuania); Vilnius University, Faculty of Medicine, Vilnius (Lithuania); Rocka, Saulius [Vilnius University, Faculty of Medicine, Vilnius (Lithuania); Faculty of Medicine Vilnius University, Neuroangiosurgery Center, Vilnius (Lithuania); Usinskas, Andrius [Vilnius Gedimino Technical University, Department of Electronic Systems, Vilnius (Lithuania)

    2016-04-15

    To perform a meta-analysis of advanced magnetic resonance imaging (MRI) metrics, including relative cerebral blood volume (rCBV), normalized apparent diffusion coefficient (nADC), and spectroscopy ratios choline/creatine (Cho/Cr) and choline/N-acetyl aspartate (Cho/NAA), for the differentiation of high- and low-grade gliomas (HGG, LGG) and metastases (MTS). For systematic review, 83 articles (dated 2000-2013) were selected from the NCBI database. Twenty-four, twenty-two, and eight articles were included respectively for spectroscopy, rCBV, and nADC meta-analysis. In the meta-analysis, we calculated overall means for rCBV, nADC, Cho/Cr (short TE - from 20 to 35 ms, medium - from 135 to 144 ms), and Cho/NAA for the HGG, LGG, and MTS groups. We used random effects model to obtain weighted averages and select thresholds. Overall means (with 95 % CI) for rCBV, nADC, Cho/Cr (short and medium echo time, TE), and Cho/NAA were: for HGG 5.47 (4.78-6.15), 1.38 (1.16-1.60), 2.40 (1.67-3.13), 3.27 (2.78-3.77), and 4.71 (3.24-6.19); for LGG 2.00 (1.71-2.28), 1.61 (1.36-1.87), 1.46 (1.20-1.72), 1.71 (1.49-1.93), and 2.36 (1.50-3.23); for MTS 5.06 (3.85-6.27), 1.35 (1.06-1.64), 1.89 (1.72-2.06), 3.14 (1.57-4.72), (Cho/NAA was not available). LGG had significantly lower rCBV, Cho/Cr, and Cho/NAA values than HGG or MTS. No significant differences were found for nADC. Best differentiation between HGG and LGG is obtained from rCBV, Cho/Cr, and Cho/NAA metrics. MTS could not be reliably distinguished from HGG by the methods investigated. (orig.)
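
    A minimal sketch of the random-effects pooling step, here in the DerSimonian-Laird form, assuming each included study contributes a mean and a standard error (the study values below are made up for illustration):

        import numpy as np

        def random_effects_mean(means, ses):
            y, se = np.asarray(means, float), np.asarray(ses, float)
            w = 1.0 / se**2                                   # fixed-effect weights
            y_fixed = np.sum(w * y) / np.sum(w)
            q = np.sum(w * (y - y_fixed)**2)                  # heterogeneity statistic
            k = y.size
            tau2 = max(0.0, (q - (k - 1)) / (w.sum() - (w**2).sum() / w.sum()))
            w_star = 1.0 / (se**2 + tau2)                     # random-effects weights
            pooled = np.sum(w_star * y) / np.sum(w_star)
            half_width = 1.96 * np.sqrt(1.0 / np.sum(w_star))
            return pooled, (pooled - half_width, pooled + half_width)

        # Hypothetical rCBV means and standard errors from three studies:
        print(random_effects_mean([5.1, 6.0, 4.9], [0.4, 0.6, 0.5]))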

  6. Optimal differentiation of high- and low-grade glioma and metastasis: a meta-analysis of perfusion, diffusion, and spectroscopy metrics

    International Nuclear Information System (INIS)

    Usinskiene, Jurgita; Venius, Jonas; Rynkeviciene, Ryte; Norkus, Darius; Suziedelis, Kestutis; Ulyte, Agne; Bjoernerud, Atle; Katsaros, Vasileios K.; Letautiene, Simona; Aleknavicius, Eduardas; Rocka, Saulius; Usinskas, Andrius

    2016-01-01

    To perform a meta-analysis of advanced magnetic resonance imaging (MRI) metrics, including relative cerebral blood volume (rCBV), normalized apparent diffusion coefficient (nADC), and spectroscopy ratios choline/creatine (Cho/Cr) and choline/N-acetyl aspartate (Cho/NAA), for the differentiation of high- and low-grade gliomas (HGG, LGG) and metastases (MTS). For systematic review, 83 articles (dated 2000-2013) were selected from the NCBI database. Twenty-four, twenty-two, and eight articles were included respectively for spectroscopy, rCBV, and nADC meta-analysis. In the meta-analysis, we calculated overall means for rCBV, nADC, Cho/Cr (short TE - from 20 to 35 ms, medium - from 135 to 144 ms), and Cho/NAA for the HGG, LGG, and MTS groups. We used random effects model to obtain weighted averages and select thresholds. Overall means (with 95 % CI) for rCBV, nADC, Cho/Cr (short and medium echo time, TE), and Cho/NAA were: for HGG 5.47 (4.78-6.15), 1.38 (1.16-1.60), 2.40 (1.67-3.13), 3.27 (2.78-3.77), and 4.71 (3.24-6.19); for LGG 2.00 (1.71-2.28), 1.61 (1.36-1.87), 1.46 (1.20-1.72), 1.71 (1.49-1.93), and 2.36 (1.50-3.23); for MTS 5.06 (3.85-6.27), 1.35 (1.06-1.64), 1.89 (1.72-2.06), 3.14 (1.57-4.72), (Cho/NAA was not available). LGG had significantly lower rCBV, Cho/Cr, and Cho/NAA values than HGG or MTS. No significant differences were found for nADC. Best differentiation between HGG and LGG is obtained from rCBV, Cho/Cr, and Cho/NAA metrics. MTS could not be reliably distinguished from HGG by the methods investigated. (orig.)

  7. WE-B-304-02: Treatment Planning Evaluation and Optimization Should Be Biologically and Not Dose/volume Based

    International Nuclear Information System (INIS)

    Deasy, J.

    2015-01-01

    The ultimate goal of radiotherapy treatment planning is to find a treatment that will yield a high tumor control probability (TCP) with an acceptable normal tissue complication probability (NTCP). Yet most treatment planning today is not based upon optimization of TCPs and NTCPs, but rather upon meeting physical dose and volume constraints defined by the planner. It has been suggested that treatment planning evaluation and optimization would be more effective if they were biologically and not dose/volume based, and this is the claim debated in this month’s Point/Counterpoint. After a brief overview of biologically and DVH based treatment planning by the Moderator Colin Orton, Joseph Deasy (for biological planning) and Charles Mayo (against biological planning) will begin the debate. Some of the arguments in support of biological planning include: (1) this will result in more effective dose distributions for many patients; (2) DVH-based measures of plan quality are known to have little predictive value; (3) there is little evidence that either D95 or D98 of the PTV is a good predictor of tumor control; (4) sufficient validated outcome prediction models are now becoming available and should be used to drive planning and optimization. Some of the arguments against biological planning include: (1) several decades of experience with DVH-based planning should not be discarded; (2) we do not know enough about the reliability and errors associated with biological models; (3) the radiotherapy community in general has little direct experience with side-by-side comparisons of DVH vs biological metrics and outcomes; (4) it is unlikely that a clinician would accept extremely cold regions in a CTV or hot regions in a PTV, despite having acceptable TCP values. Learning Objectives: To understand dose/volume based treatment planning and its potential limitations; to understand biological metrics such as EUD, TCP, and NTCP; to understand biologically based treatment planning and its potential limitations.
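
    A minimal sketch of the biological metrics named in the debate: a generalized equivalent uniform dose (gEUD) computed from voxel doses, fed into a simple logistic response model; the parameter values are illustrative only, not clinically endorsed numbers:

        import numpy as np

        def geud(doses_gy, a):
            """Generalized EUD; a < 0 for targets, a > 0 for serial normal tissues."""
            d = np.asarray(doses_gy, float)
            return float(np.mean(d**a) ** (1.0 / a))

        def logistic_response(eud_gy, d50_gy, gamma50):
            """Probability of control (TCP) or complication (NTCP) at the given EUD."""
            return 1.0 / (1.0 + (d50_gy / eud_gy) ** (4.0 * gamma50))

        # Illustrative target with a slightly cold sub-volume:
        target_doses = np.array([60.0] * 95 + [54.0] * 5)
        eud = geud(target_doses, a=-10)
        print(round(eud, 2), round(logistic_response(eud, d50_gy=50.0, gamma50=2.0), 3))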

  8. Metric modular spaces

    CERN Document Server

    Chistyakov, Vyacheslav

    2015-01-01

    Aimed toward researchers and graduate students familiar with elements of functional analysis, linear algebra, and general topology; this book contains a general study of modulars, modular spaces, and metric modular spaces. Modulars may be thought of as generalized velocity fields and serve two important purposes: generate metric spaces in a unified manner and provide a weaker convergence, the modular convergence, whose topology is non-metrizable in general. Metric modular spaces are extensions of metric spaces, metric linear spaces, and classical modular linear spaces. The topics covered include the classification of modulars, metrizability of modular spaces, modular transforms and duality between modular spaces, metric  and modular topologies. Applications illustrated in this book include: the description of superposition operators acting in modular spaces, the existence of regular selections of set-valued mappings, new interpretations of spaces of Lipschitzian and absolutely continuous mappings, the existe...

  9. Design Optimization and Evaluation of Different Wind Generator Systems

    DEFF Research Database (Denmark)

    Chen, Zhe; Li, Hui

    2008-01-01

    With rapid development of wind power technologies and significant growth of wind power capacity installed worldwide, various wind generator systems have been developed and built. The objective of this paper is to evaluate various wind generator systems by optimization designs and comparisons. In this paper, seven variable speed constant frequency (VSCF) wind generator systems are investigated, namely permanent magnet synchronous generators with the direct-driven (PMSG_DD), the single-stage gearbox (PMSG_1G) and three-stage gearbox (PMSG_3G) concepts, doubly fed induction generators with the three-stage gearbox (DFIG_3G) and with the single-stage gearbox (DFIG_1G), the electricity excited synchronous generator with the direct-driven (EESG_DD), and the VSCF squirrel cage induction generator with the three-stage gearbox (SCIG_3G). Firstly, the design models of wind turbines, three/single stage gearbox

  10. Formulation, optimization and evaluation of levocetirizine dihyrochloride oral thin strip

    Directory of Open Access Journals (Sweden)

    J Gunjan Patel

    2012-01-01

    Full Text Available The aim of the present research was to develop a fast releasing oral polymeric film, with good mechanical properties, instant disintegration and dissolution, producing an acceptable taste when placed on the tongue. The solvent casting method was used to prepare oral films. Levocetirizine dihydrochloride, an antihistaminic, was incorporated to relieve the symptoms of allergic rhinitis. The polymers selected were HPMC E 15 and PVA. Propylene glycol was the plasticizer used. Nine batches of films with drug were prepared using different combinations of polymers and plasticizer concentration. The resultant films were evaluated for weight variation, content uniformity, folding endurance, thickness, surface pH, in vitro disintegration and in vitro dissolution. The optimized films disintegrated in less than 30 sec, releasing 85-98% of the drug within 2 minutes. The percentage release varied with the concentration of plasticizer and polymer. The films made with HPMC:PVA (1:2) released 96% of the drug in 1 min, which was the best release amongst all.

  11. Multi-objective based on parallel vector evaluated particle swarm optimization for optimal steady-state performance of power systems

    DEFF Research Database (Denmark)

    Vlachogiannis, Ioannis (John); Lee, K Y

    2009-01-01

    In this paper, the state-of-the-art extended particle swarm optimization (PSO) methods for solving multi-objective optimization problems are presented. Among these, we emphasize the co-evolution technique of the parallel vector evaluated PSO (VEPSO), analysed and applied to a multi-objective problem

  12. Multivariate analytical figures of merit as a metric for evaluation of quantitative measurements using comprehensive two-dimensional gas chromatography-mass spectrometry.

    Science.gov (United States)

    Eftekhari, Ali; Parastar, Hadi

    2016-09-30

    The present contribution is devoted to developing multivariate analytical figures of merit (AFOMs) as a new metric for evaluation of quantitative measurements using comprehensive two-dimensional gas chromatography-mass spectrometry (GC×GC-MS). In this regard, a new definition of sensitivity (SEN) is extended to GC×GC-MS data and then other multivariate AFOMs, including analytical SEN (γ), selectivity (SEL) and limit of detection (LOD), are calculated. Also, two frequently used second- and third-order calibration algorithms, multivariate curve resolution-alternating least squares (MCR-ALS) as representative of multi-set methods and parallel factor analysis (PARAFAC) as representative of multi-way methods, are discussed to exploit pure component profiles and to calculate multivariate AFOMs. Different GC×GC-MS data sets with different numbers of components along with various levels of artifacts are simulated and analyzed. Noise, elution time shifts in both chromatographic dimensions, peak overlap and interferences are considered as the main artifacts in this work. Additionally, a new strategy is developed to estimate the noise level using the variance-covariance matrix of residuals, which is very important for calculating multivariate AFOMs. Finally, determination of polycyclic aromatic hydrocarbons (PAHs) in the aromatic fraction of heavy fuel oil (HFO) analyzed by GC×GC-MS is considered as a real case to confirm applicability of the proposed metric in real samples. It should be pointed out that the proposed strategy in this work can be used for other types of comprehensive two-dimensional chromatographic (CTDC) techniques, such as comprehensive two-dimensional liquid chromatography (LC×LC). Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Metrics for image segmentation

    Science.gov (United States)

    Rees, Gareth; Greenway, Phil; Morray, Denise

    1998-07-01

    An important challenge in mapping image-processing techniques onto applications is the lack of quantitative performance measures. From a systems engineering perspective these are essential if system level requirements are to be decomposed into sub-system requirements which can be understood in terms of algorithm selection and performance optimization. Nowhere in computer vision is this more evident than in the area of image segmentation. This is a vigorous and innovative research activity, but even after nearly two decades of progress, it remains almost impossible to answer the question 'what would the performance of this segmentation algorithm be under these new conditions?' To begin to address this shortcoming, we have devised a well-principled metric for assessing the relative performance of two segmentation algorithms. This allows meaningful objective comparisons to be made between their outputs. It also estimates the absolute performance of an algorithm given ground truth. Our approach is an information theoretic one. In this paper, we describe the theory and motivation of our method, and present practical results obtained from a range of state of the art segmentation methods. We demonstrate that it is possible to measure the objective performance of these algorithms, and to use the information so gained to provide clues about how their performance might be improved.
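
    In the same information-theoretic spirit (though not the authors' specific formulation), mutual information and variation of information between a candidate segmentation and a ground-truth labeling can serve as an objective agreement score; labels are assumed to be non-negative integers:

        import numpy as np

        def segmentation_information(labels_a, labels_b):
            a = np.asarray(labels_a).ravel()
            b = np.asarray(labels_b).ravel()
            joint = np.zeros((a.max() + 1, b.max() + 1))
            np.add.at(joint, (a, b), 1.0)                  # joint label histogram
            p_ab = joint / joint.sum()
            p_a, p_b = p_ab.sum(axis=1), p_ab.sum(axis=0)
            nz = p_ab > 0
            h_a = -np.sum(p_a[p_a > 0] * np.log2(p_a[p_a > 0]))
            h_b = -np.sum(p_b[p_b > 0] * np.log2(p_b[p_b > 0]))
            mi = np.sum(p_ab[nz] * np.log2(p_ab[nz] / np.outer(p_a, p_b)[nz]))
            voi = h_a + h_b - 2.0 * mi                     # variation of information: lower is better
            return float(mi), float(voi)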

  14. An Investigation of the Relationship Between Automated Machine Translation Evaluation Metrics and User Performance on an Information Extraction Task

    Science.gov (United States)

    2007-01-01

    ...more reliable than BLEU and that it is easier to understand in terms familiar to NLP researchers. ... METEOR: Researchers at Carnegie Mellon... essential elements of information from output generated by three types of Arabic-English MT engines. The information extraction experiment was one of three... reviewing the task hierarchy and examining the MT output of several engines. A small, prior pilot experiment to evaluate Arabic-English MT engines for

  15. Evaluating MyPlate: an expanded framework using traditional and nontraditional metrics for assessing health communication campaigns.

    Science.gov (United States)

    Levine, Elyse; Abbatangelo-Gray, Jodie; Mobley, Amy R; McLaughlin, Grant R; Herzog, Jill

    2012-01-01

    MyPlate, the icon and multimodal communication plan developed for the 2010 Dietary Guidelines for Americans (DGA), provides an opportunity to consider new approaches to evaluating the effectiveness of communication initiatives. A review of indicators used in assessments for previous DGA communication initiatives finds gaps in accounting for important intermediate and long-term outcomes. This evaluation framework for the MyPlate Communications Initiative builds on well-known and underused models and theories to propose a wide breadth of observations, outputs, and outcomes that can contribute to a fuller assessment of effectiveness. Two areas are suggested to focus evaluation efforts in order to advance understanding of the effectiveness of the MyPlate Communications Initiative: understanding the extent to which messages and products from the initiative are associated with positive changes in social norms toward the desired behaviors, and strategies to increase the effectiveness of communications about DGA in vulnerable populations. Copyright © 2012 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  16. Cardiomyocytes from late embryos and neonates do optimal work and striate best on substrates with tissue-level elasticity: metrics and mathematics.

    Science.gov (United States)

    Majkut, Stephanie F; Discher, Dennis E

    2012-11-01

    In this review, we discuss recent studies on the mechanosensitive morphology and function of cardiomyocytes derived from embryos and neonates. For early cardiomyocytes cultured on substrates of various stiffnesses, contractile function, as measured by force production, work output and calcium handling, is optimized when the culture substrate stiffness mimics that of the tissue from which the cells were obtained. This optimal contractile function corresponds to changes in sarcomeric protein conformation and organization that promote contractile ability. In light of current models for myofibrillogenesis, a recent mathematical model of striation and alignment on elastic substrates helps to illuminate how substrate stiffness modulates early myofibril formation and organization. During embryonic heart formation and maturation, cardiac tissue mechanics change dynamically. Experiments and models highlighted here have important implications for understanding cardiomyocyte differentiation and function in development and perhaps in regeneration processes.

  17. Technology Transfer External Metrics, Research, Success Stories, and Participation on Evaluation Team for the Reusable Launch Vehicle (RLV)

    Science.gov (United States)

    Trivoli, George W.

    1996-01-01

    This research report is divided into four sections. The first section is related to participation on the team that evaluated the proposals for the X-33 project and the Reusable Launch Vehicle (RLV) during mid-May; prior to beginning the 1996 Summer Faculty Fellowship. The second section discusses the various meetings attended related to the technology evaluation process. The third section is related to various research and evaluation activities engaged in by this researcher. The final section discusses several success stories this researcher aided in preparing. Despite the fact that this researcher is not an engineer or science faculty, invaluable knowledge and experience have been gained at MSFC. Although related to the previous summer's research, the research has been new, varied, and challenging. This researcher was fortunate to have had maximum interaction with NASA colleague, David Cockrell. It would be a privilege and honor to continue a relationship with the Technology Transfer Office. In addition, we will attempt to aid in the establishment of a continuous formalized relationship between MSFC and Jacksonville State University. Dr. David Watts, Vice President for Academic Affairs, J.S.U., is interested in having the Technology Division cooperating with MSFC in sharing information and working tech transfer inquiries. The principal benefits gained by this researcher include the opportunity to conduct research in a non-academic, real world environment. In addition, the opportunity to be involved in aiding with the decision process for the choice of the next generation of space transportation system was a once in a lifetime experience. This researcher has gained enhanced respect and understanding of MSFC/NASA staff and facilities.

  18. The metrics of science and technology

    CERN Document Server

    Geisler, Eliezer

    2000-01-01

    Dr. Geisler's far-reaching, unique book provides an encyclopedic compilation of the key metrics to measure and evaluate the impact of science and technology on academia, industry, and government. Focusing on such items as economic measures, patents, peer review, and other criteria, and supported by an extensive review of the literature, Dr. Geisler gives a thorough analysis of the strengths and weaknesses inherent in metric design, and in the use of the specific metrics he cites. His book has already received prepublication attention, and will prove especially valuable for academics in technology management, engineering, and science policy; industrial R&D executives and policymakers; government science and technology policymakers; and scientists and managers in government research and technology institutions. Geisler maintains that the application of metrics to evaluate science and technology at all levels illustrates the variety of tools we currently possess. Each metric has its own unique strengths and...

  19. Metrics for Polyphonic Sound Event Detection

    Directory of Open Access Journals (Sweden)

    Annamaria Mesaros

    2016-05-01

    Full Text Available This paper presents and discusses various metrics proposed for evaluation of polyphonic sound event detection systems used in realistic situations where there are typically multiple sound sources active simultaneously. The system output in this case contains overlapping events, marked as multiple sounds detected as being active at the same time. The polyphonic system output requires a suitable procedure for evaluation against a reference. Metrics from neighboring fields such as speech recognition and speaker diarization can be used, but they need to be partially redefined to deal with the overlapping events. We present a review of the most common metrics in the field and the way they are adapted and interpreted in the polyphonic case. We discuss segment-based and event-based definitions of each metric and explain the consequences of instance-based and class-based averaging using a case study. In parallel, we provide a toolbox containing implementations of presented metrics.
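    A minimal sketch of the segment-based evaluation idea described above is given below. The event-list format (onset, offset, class label), the one-second segment length, and the tiny example data are assumptions for illustration only, not the accompanying toolbox's actual API.

```python
# Hypothetical sketch of a segment-based F-score for polyphonic sound event
# detection. Events are (onset_s, offset_s, label) tuples; the segment length
# and example data are assumptions, not the published toolbox interface.

def to_segments(events, n_segments, seg_len):
    """Mark which event classes are active in each fixed-length segment."""
    active = [set() for _ in range(n_segments)]
    for onset, offset, label in events:
        first, last = int(onset // seg_len), int(offset // seg_len)
        for k in range(first, min(last + 1, n_segments)):
            active[k].add(label)
    return active

def segment_based_f1(reference, estimated, total_time, seg_len=1.0):
    n = int(total_time // seg_len) + 1
    ref = to_segments(reference, n, seg_len)
    est = to_segments(estimated, n, seg_len)
    tp = sum(len(r & e) for r, e in zip(ref, est))
    fp = sum(len(e - r) for r, e in zip(ref, est))
    fn = sum(len(r - e) for r, e in zip(ref, est))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

ref = [(0.0, 2.0, "speech"), (1.0, 3.0, "car")]
est = [(0.0, 1.5, "speech"), (2.0, 3.0, "car")]
print(segment_based_f1(ref, est, total_time=3.0))  # micro-averaged (instance-based) F1
```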

  20. Evaluating and optimizing horticultural regimes in space plant growth facilities

    Science.gov (United States)

    Berkovich, Y.; Chetirkin, R.; Wheeler, R.; Sager, J.

    In designing innovative Space Plant Growth Facilities (SPGF) for long-duration space flight, various limitations must be addressed, including onboard resources: volume, energy consumption, heat transfer and crew labor expenditure. Evaluating onboard resources with the equivalent mass methodology and applying it to the design of such facilities is not precise, owing to structural uncertainty and an incomplete understanding of the properties of all associated hardware and the technology in these systems. We present a simple optimization criterion for horticultural regimes in the SPGF: Qmax = max [M · (EBI)² / (V · E · T)], where M is the crop harvest in terms of total dry biomass in the plant growth system; EBI is the edible biomass index (harvest index); V is the volume occupied by the crop; E is the crop light energy supply during growth; and T is the crop growth duration. The criterion reflects directly the consumption of onboard resources for crop production. We analyzed the efficiency of plant crops and the environmental parameters by evaluating the criterion for 15 salad and 12 wheat crops from the data in the ALS database at Kennedy Space Center. The following conclusions were established: 1. The technology involved in growing salad crops on a cylindrical-type surface provides a more meaningful Q-criterion; 2. Wheat crops were less efficient than leafy greens (salad crops) in terms of resource utilization; 3. Increasing the light intensity supplied to the crop can decrease the efficiency of resource utilization. Using the existing databases and the Q-criterion, we found that the criterion can be used to optimize design and horticultural regimes in the SPGF.
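    A minimal sketch of the Q-criterion defined above is shown below. The variable names, units, and example numbers are assumptions made for illustration; only the formula Q = M · EBI² / (V · E · T) comes from the abstract.

```python
# Minimal sketch of the resource-efficiency criterion Q = M * EBI**2 / (V * E * T).
# Units and example values are hypothetical.

def q_criterion(total_dry_biomass_g, edible_biomass_index,
                crop_volume_m3, light_energy_mol, duration_days):
    """Resource-efficiency criterion for a space plant growth facility crop."""
    return (total_dry_biomass_g * edible_biomass_index ** 2) / (
        crop_volume_m3 * light_energy_mol * duration_days)

# Illustrative comparison of two hypothetical crops
salad = q_criterion(120.0, 0.85, 0.05, 900.0, 28)
wheat = q_criterion(300.0, 0.40, 0.20, 4000.0, 80)
print(salad > wheat)  # leafy greens tend to score higher, as reported above
```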

  1. Brand metrics that matter

    NARCIS (Netherlands)

    Muntinga, D.; Bernritter, S.

    2017-01-01

    The brand is becoming ever more central to the organization. It is therefore essential to measure the brand's health, performance and development. Selecting the right brand metrics, however, is a challenge. An enormous number of metrics competes for brand managers' attention. But which

  2. Privacy Metrics and Boundaries

    NARCIS (Netherlands)

    L-F. Pau (Louis-François)

    2005-01-01

    This paper aims at defining a set of privacy metrics (quantitative and qualitative) for the relation between a privacy protector and an information gatherer. The aims of such metrics are: to allow assessment and comparison of different user scenarios and their differences; for

  3. The Fundamentals of Laparoscopic Surgery and LapVR evaluation metrics may not correlate with operative performance in a novice cohort

    Science.gov (United States)

    Steigerwald, Sarah N.; Park, Jason; Hardy, Krista M.; Gillman, Lawrence; Vergis, Ashley S.

    2015-01-01

    Background Considerable resources have been invested in both low- and high-fidelity simulators in surgical training. The purpose of this study was to investigate if the Fundamentals of Laparoscopic Surgery (FLS, low-fidelity box trainer) and LapVR (high-fidelity virtual reality) training systems correlate with operative performance on the Global Operative Assessment of Laparoscopic Skills (GOALS) global rating scale using a porcine cholecystectomy model in a novice surgical group with minimal laparoscopic experience. Methods Fourteen postgraduate year 1 surgical residents with minimal laparoscopic experience performed tasks from the FLS program and the LapVR simulator as well as a live porcine laparoscopic cholecystectomy. Performance was evaluated using standardized FLS metrics, automatic computer evaluations, and a validated global rating scale. Results Overall, FLS score did not show an association with GOALS global rating scale score on the porcine cholecystectomy. None of the five LapVR task scores were significantly associated with GOALS score on the porcine cholecystectomy. Conclusions Neither the low-fidelity box trainer nor the high-fidelity virtual simulator demonstrated significant correlation with GOALS operative scores. These findings offer caution against the use of these modalities for brief assessments of novice surgical trainees, especially for predictive or selection purposes. PMID:26641071

  4. The Fundamentals of Laparoscopic Surgery and LapVR evaluation metrics may not correlate with operative performance in a novice cohort

    Directory of Open Access Journals (Sweden)

    Sarah N. Steigerwald

    2015-12-01

    Full Text Available Background: Considerable resources have been invested in both low- and high-fidelity simulators in surgical training. The purpose of this study was to investigate if the Fundamentals of Laparoscopic Surgery (FLS, low-fidelity box trainer) and LapVR (high-fidelity virtual reality) training systems correlate with operative performance on the Global Operative Assessment of Laparoscopic Skills (GOALS) global rating scale using a porcine cholecystectomy model in a novice surgical group with minimal laparoscopic experience. Methods: Fourteen postgraduate year 1 surgical residents with minimal laparoscopic experience performed tasks from the FLS program and the LapVR simulator as well as a live porcine laparoscopic cholecystectomy. Performance was evaluated using standardized FLS metrics, automatic computer evaluations, and a validated global rating scale. Results: Overall, FLS score did not show an association with GOALS global rating scale score on the porcine cholecystectomy. None of the five LapVR task scores were significantly associated with GOALS score on the porcine cholecystectomy. Conclusions: Neither the low-fidelity box trainer nor the high-fidelity virtual simulator demonstrated significant correlation with GOALS operative scores. These findings offer caution against the use of these modalities for brief assessments of novice surgical trainees, especially for predictive or selection purposes.

  5. Return to intended oncologic treatment (RIOT): a novel metric for evaluating the quality of oncosurgical therapy for malignancy.

    Science.gov (United States)

    Aloia, Thomas A; Zimmitti, Giuseppe; Conrad, Claudius; Gottumukalla, Vijaya; Kopetz, Scott; Vauthey, Jean-Nicolas

    2014-08-01

    After cancer surgery, complications and disability prevent some patients from receiving subsequent treatments. Given that an inability to complete all intended cancer therapies might negate the oncologic benefits of surgical therapy, strategies to improve return to intended oncologic treatment (RIOT), including minimally invasive surgery (MIS), are being investigated. This project was designed to evaluate liver tumor patients to determine the RIOT rate, risk factors for inability to RIOT, and its impact on survivals. Outcomes for a homogenous cohort of 223 patients who underwent open-approach surgery for metachronous colorectal liver metastases and a group of 27 liver tumor patients treated with MIS hepatectomy were examined. Of the 223 open-approach patients, 167 were offered postoperative therapy, yielding a RIOT rate of 75%. The remaining 56 (25%) patients were unable to receive further treatment due to surgical complications (n = 29 pts) or poor performance status (n = 27 pts). Risk factors associated with inability to RIOT were hypertension (OR 2.2, P = 0.025), multiple preoperative chemotherapy regimens (OR 5.9, P = 0.039), and postoperative complications (OR 2.0, P = 0.039). Inability to RIOT correlated with shorter disease-free and overall survivals. This relationship between RIOT and long-term oncologic outcomes suggests that RIOT rates for both open- and MIS-approach cancer surgery should routinely be reported as a quality indicator. © 2014 Wiley Periodicals, Inc.

  6. Optimizing chronic disease management mega-analysis: economic evaluation.

    Science.gov (United States)

    2013-01-01

    As Ontario's population ages, chronic diseases are becoming increasingly common. There is growing interest in services and care models designed to optimize the management of chronic disease. To evaluate the cost-effectiveness and expected budget impact of interventions in chronic disease cohorts evaluated as part of the Optimizing Chronic Disease Management mega-analysis. Sector-specific costs, disease incidence, and mortality were calculated for each condition using administrative databases from the Institute for Clinical Evaluative Sciences. Intervention outcomes were based on literature identified in the evidence-based analyses. Quality-of-life and disease prevalence data were obtained from the literature. Analyses were restricted to interventions that showed significant benefit for resource use or mortality from the evidence-based analyses. An Ontario cohort of patients with each chronic disease was constructed and followed over 5 years (2006-2011). A phase-based approach was used to estimate costs across all sectors of the health care system. Utility values identified in the literature and effect estimates for resource use and mortality obtained from the evidence-based analyses were applied to calculate incremental costs and quality-adjusted life-years (QALYs). Given uncertainty about how many patients would benefit from each intervention, a system-wide budget impact was not determined. Instead, the difference in lifetime cost between an individual-administered intervention and no intervention was presented. Of 70 potential cost-effectiveness analyses, 8 met our inclusion criteria. All were found to result in QALY gains and cost savings compared with usual care. The models were robust to the majority of sensitivity analyses undertaken, but due to structural limitations and time constraints, few sensitivity analyses were conducted. Incremental cost savings per patient who received intervention ranged between $15 per diabetic patient with specialized nursing to

  7. Evaluating Laboratory Performance on Point-of-Care Glucose Testing with Six Sigma Metric for 151 Institutions in China.

    Science.gov (United States)

    Fei, Yang; Wang, Wei; He, Falin; Zhong, Kun; Wang, Zhiguo

    2015-10-01

    The aim of this study was to use Six Sigma(SM) (Motorola Trademark Holdings, Libertyville, IL) techniques to analyze the quality of point-of-care (POC) glucose testing measurements quantitatively and to provide suggestions for improvement. In total, 151 laboratories in China were included in this investigation in 2014. Bias and coefficient of variation were collected from an external quality assessment and an internal quality control program, respectively, for POC glucose testing organized by the National Center for Clinical Laboratories. The σ values and the Quality Goal Index were used to evaluate the performance of POC glucose meters. There were 27, 30, 57, and 37 participants in the groups using Optium Xceed™ (Abbott Diabetes Care, Alameda, CA), Accu-Chek(®) Performa (Roche, Basel, Switzerland), One Touch Ultra(®) (Abbott), and "other" meters, respectively. The median of the absolute value of percentage difference varied among different lots and different groups. Among all the groups, the Abbott One Touch Ultra group had the smallest median of absolute value of percentage difference except for lot 201411, whereas the "other" group had the largest median in all five lots. More than 85% of participating laboratories satisfied the total allowable error (TEa) requirement in International Organization for Standardization standard 15197:2013, and 85.43% (129/151) of laboratories obtained intralaboratory coefficients of variation less than 1/3 TEa. However, Six Sigma techniques suggested that 41.72% (63/151) to 65.56% (99/151) of the laboratories needed to improve their POC glucose testing performance, in either precision, trueness, or both. Laboratories should pay more attention to the practice of POC glucose testing and take action to improve their performance. Only in this way can POC glucose testing really function well in clinical practice.
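    For readers unfamiliar with the sigma metric and Quality Goal Index mentioned above, the sketch below uses the commonly cited definitions (sigma = (TEa − |bias|) / CV and QGI = |bias| / (1.5 · CV), all inputs in percent). Whether the study applied exactly these conventions, and the example numbers themselves, are assumptions for illustration.

```python
# Hedged sketch of the sigma metric and Quality Goal Index (QGI) for a
# point-of-care glucose meter; formulas follow the commonly cited definitions
# and all inputs are percentages. Example values are invented.

def sigma_metric(tea_pct, bias_pct, cv_pct):
    return (tea_pct - abs(bias_pct)) / cv_pct

def quality_goal_index(bias_pct, cv_pct):
    return abs(bias_pct) / (1.5 * cv_pct)

# Example: an assumed TEa of 15%, a meter with 2% bias and 3% CV
print(sigma_metric(15.0, 2.0, 3.0))     # ~4.3 sigma
print(quality_goal_index(2.0, 3.0))     # ~0.44; values below ~0.8 point to imprecision
```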

  8. Metrics Are Needed for Collaborative Software Development

    Directory of Open Access Journals (Sweden)

    Mojgan Mohtashami

    2011-10-01

    Full Text Available There is a need for metrics for inter-organizational collaborative software development projects, encompassing management and technical concerns. In particular, metrics are needed that are aimed at the collaborative aspect itself, such as readiness for collaboration, the quality and/or the costs and benefits of collaboration in a specific ongoing project. We suggest questions and directions for such metrics, spanning the full lifespan of a collaborative project, from considering the suitability of collaboration through evaluating ongoing projects to final evaluation of the collaboration.

  9. Impedance cardiography – optimization and efficacy evaluation of antihypertensive treatment

    Directory of Open Access Journals (Sweden)

    Katarzyna Panasiuk-Kamińska

    2016-09-01

    Full Text Available Background. Hypertension is a civilization disease which currently affects about 10.5 million people in Poland. The number of patients with diagnosed, untreated hypertension amounts to 18%, and as many as 45% of patients are treated ineffectively, whereas only 26% are treated effectively. Impedance cardiography (IC) is an important tool both in diagnostics and the treatment of hypertensive patients, particularly in the case of antihypertensive treatment resistance. This method allows for the individualized treatment of each patient on the basis of hemodynamic parameters, monitoring of hypertensive patients in the outpatient care setting, and the assessment of cardiovascular risk factors. Objectives. The aim of the study was to evaluate the efficacy of hypotensive medications in patients with hypertension using impedance cardiography. Material and methods. The study involved 60 hypertensive patients, treated with antihypertensives, who failed to achieve the required blood pressure values. The modification of hypertension therapy was based on EBM (evidence-based medicine) and on hemodynamic parameters obtained using impedance cardiography. Results. It was found that high blood pressure therapy based on impedance cardiography parameters has a significant influence on blood pressure reduction compared to EBM-based therapy: below 140/90: 66.8 vs. 55.1% and below 130/80: 23.5 vs. 18.9%. Conclusions. On the basis of this study it was confirmed that impedance cardiography allows for a significant reduction of hypertension and the selection of the most effective therapeutic strategy, providing for the optimization and efficacy of hypertension treatment.

  10. Experimental evaluation of optimization method for developing ultraviolet barrier coatings

    Science.gov (United States)

    Gonome, Hiroki; Okajima, Junnosuke; Komiya, Atsuki; Maruyama, Shigenao

    2014-01-01

    Ultraviolet (UV) barrier coatings can be used to protect many industrial products from UV attack. This study introduces a method of optimizing UV barrier coatings using pigment particles. The radiative properties of the pigment particles were evaluated theoretically, and the optimum particle size was determined from the absorption efficiency and the back-scattering efficiency. UV barrier coatings were prepared with zinc oxide (ZnO) and titanium dioxide (TiO2). The transmittance of the UV barrier coating was calculated theoretically. The radiative transfer in the UV barrier coating was modeled using the radiation element method by ray emission model (REM2). In order to validate the calculated results, the transmittances of these coatings were measured by a spectrophotometer. A UV barrier coating with a low UV transmittance and high VIS transmittance could be achieved. The calculated transmittance showed a spectral tendency similar to the measured one. The use of appropriate particles with optimum size, coating thickness and volume fraction will result in effective UV barrier coatings. UV barrier coatings can be achieved by the application of optical engineering.

  11. How Does Optimism Suppress Immunity? Evaluation of Three Affective Pathways

    OpenAIRE

    Segerstrom, Suzanne C.

    2006-01-01

    Studies have linked optimism to poorer immunity during difficult stressors. In the present report, when first-year law students (N = 46) relocated to attend law school, reducing conflict among curricular and extracurricular goals, optimism predicted larger delayed type hypersensitivity responses, indicating more robust in vivo cellular immunity. However, when students did not relocate, increasing goal conflict, optimism predicted smaller responses. Although this effect has been attributed to ...

  12. Microwave tomography global optimization, parallelization and performance evaluation

    CERN Document Server

    Noghanian, Sima; Desell, Travis; Ashtari, Ali

    2014-01-01

    This book provides a detailed overview on the use of global optimization and parallel computing in microwave tomography techniques. The book focuses on techniques that are based on global optimization and electromagnetic numerical methods. The authors provide parallelization techniques on homogeneous and heterogeneous computing architectures on high performance and general purpose futuristic computers. The book also discusses the multi-level optimization technique, hybrid genetic algorithm and its application in breast cancer imaging.

  13. Energy functionals for Calabi-Yau metrics

    International Nuclear Information System (INIS)

    Headrick, M; Nassar, A

    2013-01-01

    We identify a set of "energy" functionals on the space of metrics in a given Kähler class on a Calabi-Yau manifold, which are bounded below and minimized uniquely on the Ricci-flat metric in that class. Using these functionals, we recast the problem of numerically solving the Einstein equation as an optimization problem. We apply this strategy, using the "algebraic" metrics (metrics for which the Kähler potential is given in terms of a polynomial in the projective coordinates), to the Fermat quartic and to a one-parameter family of quintics that includes the Fermat and conifold quintics. We show that this method yields approximations to the Ricci-flat metric that are exponentially accurate in the degree of the polynomial (except at the conifold point, where the convergence is polynomial), and therefore orders of magnitude more accurate than the balanced metrics, previously studied as approximations to the Ricci-flat metric. The method is relatively fast and easy to implement. On the theoretical side, we also show that the functionals can be used to give a heuristic proof of Yau's theorem.

  14. Holographic Spherically Symmetric Metrics

    Science.gov (United States)

    Petri, Michael

    The holographic principle (HP) conjectures, that the maximum number of degrees of freedom of any realistic physical system is proportional to the system's boundary area. The HP has its roots in the study of black holes. It has recently been applied to cosmological solutions. In this article we apply the HP to spherically symmetric static space-times. We find that any regular spherically symmetric object saturating the HP is subject to tight constraints on the (interior) metric, energy-density, temperature and entropy-density. Whenever gravity can be described by a metric theory, gravity is macroscopically scale invariant and the laws of thermodynamics hold locally and globally, the (interior) metric of a regular holographic object is uniquely determined up to a constant factor and the interior matter-state must follow well defined scaling relations. When the metric theory of gravity is general relativity, the interior matter has an overall string equation of state (EOS) and a unique total energy-density. Thus the holographic metric derived in this article can serve as simple interior 4D realization of Mathur's string fuzzball proposal. Some properties of the holographic metric and its possible experimental verification are discussed. The geodesics of the holographic metric describe an isotropically expanding (or contracting) universe with a nearly homogeneous matter-distribution within the local Hubble volume. Due to the overall string EOS the active gravitational mass-density is zero, resulting in a coasting expansion with Ht = 1, which is compatible with the recent GRB-data.

  15. Multi-Metric Sustainability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cowlin, Shannon [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heimiller, Donna [National Renewable Energy Lab. (NREL), Golden, CO (United States); Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Pless, Jacquelyn [National Renewable Energy Lab. (NREL), Golden, CO (United States); Munoz, David [Colorado School of Mines, Golden, CO (United States)

    2014-12-01

    A readily accessible framework that allows for evaluating impacts and comparing tradeoffs among factors in energy policy, expansion planning, and investment decision making is lacking. Recognizing this, the Joint Institute for Strategic Energy Analysis (JISEA) funded an exploration of multi-metric sustainability analysis (MMSA) to provide energy decision makers with a means to make more comprehensive comparisons of energy technologies. The resulting MMSA tool lets decision makers simultaneously compare technologies and potential deployment locations.

  16. An Evaluation of C1-C3 Hydrochlorofluorocarbon (HCFC) Metrics: Lifetimes, Ozone Depletion Potentials, Radiative Efficiencies, Global Warming and Global Temperature Potentials

    Science.gov (United States)

    Burkholder, J. B.; Papanastasiou, D. K.; Marshall, P.

    2017-12-01

    Hydrochlorofluorocarbons (HCFCs) have been used as chlorofluorocarbon (CFC) substitutes in a number of applications, e.g. refrigerator and air-conditioning systems. Although HCFCs have lower ozone-depletion potentials (ODPs) compared to CFCs, they are potent greenhouse gases. The twenty-eighth meeting of the parties to the Montreal Protocol on Substances that Deplete the Ozone Layer (Kigali, 2016) included a list of 274 HCFCs to be controlled under the Montreal Protocol. However, from this list, only 15 of the HCFCs have values for their atmospheric lifetime, ODP, global warming potential (GWP), and global temperature potential (GTP) that are based on fundamental experimental studies, while 48 are registered compounds. In this work, we present a comprehensive evaluation of the atmospheric lifetimes, ODPs, radiative efficiencies (REs), GWPs, and GTPs for all 274 HCFCs to be included in the Montreal Protocol. Atmospheric lifetimes were estimated based on HCFC reactivity with OH radicals and O(1D), as well as their removal by UV photolysis using structure activity relationships and reactivity trends. ODP values are based on the semi-empirical approach described in the WMO/UNEP ozone assessment. Radiative efficiencies were estimated, based on infrared spectra calculated using theoretical electronic structure methods (Gaussian 09). GWPs and GTPs were calculated relative to CO2 using our estimated atmospheric lifetimes and REs. The details of the methodology will be discussed as well as the associated uncertainties. This study has provided a consistent set of atmospheric metrics for a wide range of HCFCs that support future policy decisions. More accurate metrics for a specific HCFC, if desired, would require fundamental laboratory studies to better define the OH reactivity and infrared absorption spectrum of the compound of interest. Overall, HCFCs within the same family (isomers) show a large ODP, GWP, GTP dependence on the molecular geometry of the isomers. The
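    As a rough illustration of how a GWP follows from an atmospheric lifetime and a radiative efficiency, a heavily hedged sketch is given below. The single-exponential AGWP expression, the mass-based radiative efficiency input, the reference AGWP for CO2, and the example HCFC values are textbook-style assumptions for demonstration only, not numbers from the study above.

```python
import math

# Illustrative GWP sketch: AGWP_x(H) = RE_x * tau_x * (1 - exp(-H / tau_x)),
# GWP_x(H) = AGWP_x(H) / AGWP_CO2(H). The CO2 reference and example inputs are
# assumptions for demonstration.

AGWP_CO2_100YR = 9.17e-14  # W m-2 yr per kg CO2 (approximate literature value)

def agwp(radiative_efficiency_w_m2_kg, lifetime_yr, horizon_yr=100.0):
    """Absolute GWP for a gas removed by first-order (exponential) decay."""
    return radiative_efficiency_w_m2_kg * lifetime_yr * (
        1.0 - math.exp(-horizon_yr / lifetime_yr))

def gwp(radiative_efficiency_w_m2_kg, lifetime_yr, horizon_yr=100.0):
    return agwp(radiative_efficiency_w_m2_kg, lifetime_yr, horizon_yr) / AGWP_CO2_100YR

# Hypothetical HCFC with a 12-year lifetime and an assumed mass-based RE
print(round(gwp(1.6e-11, 12.0)))  # on the order of a few thousand
```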

  17. The Oil Security Metrics Model: A Tool for Evaluating the Prospective Oil Security Benefits of DOE's Energy Efficiency and Renewable Energy R&D Programs

    Energy Technology Data Exchange (ETDEWEB)

    Greene, David L [ORNL; Leiby, Paul Newsome [ORNL

    2006-05-01

    Energy technology R&D is a cornerstone of U.S. energy policy. Understanding the potential for energy technology R&D to solve the nation's energy problems is critical to formulating a successful R&D program. In light of this, the U.S. Congress requested the National Research Council (NRC) to undertake both retrospective and prospective assessments of the Department of Energy's (DOE's) Energy Efficiency and Fossil Energy Research programs (NRC, 2001; NRC, 2005). ("The Congress continued to express its interest in R&D benefits assessment by providing funds for the NRC to build on the retrospective methodology to develop a methodology for assessing prospective benefits." NRC, 2005, p. ES-2) In 2004, the NRC Committee on Prospective Benefits of DOE's Energy Efficiency and Fossil Energy R&D Programs published a report recommending a new framework and principles for prospective benefits assessment. The Committee explicitly deferred the issue of estimating security benefits to future work. Recognizing the need for a rigorous framework for assessing the energy security benefits of its R&D programs, the DOE's Office of Energy Efficiency and Renewable Energy (EERE) developed a framework and approach for defining energy security metrics for R&D programs to use in gauging the energy security benefits of their programs (Lee, 2005). This report describes methods for estimating the prospective oil security benefits of EERE's R&D programs that are consistent with the methodologies of the NRC (2005) Committee and that build on Lee's (2005) framework. Its objective is to define and implement a method that makes use of the NRC's typology of prospective benefits and methodological framework, satisfies the NRC's criteria for prospective benefits evaluation, and permits measurement of that portion of the prospective energy security benefits of EERE's R&D portfolio related to oil. While the Oil Security Metrics (OSM) methodology described

  18. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  19. Supplier selection using different metric functions

    Directory of Open Access Journals (Sweden)

    Omosigho S.E.

    2015-01-01

    Full Text Available Supplier selection is an important component of supply chain management in today’s global competitive environment. Hence, the evaluation and selection of suppliers have received considerable attention in the literature. Many attributes of suppliers, other than cost, are considered in the evaluation and selection process. Therefore, the process of evaluation and selection of suppliers is a multi-criteria decision making process. The methodology adopted to solve the supplier selection problem is intuitionistic fuzzy TOPSIS (Technique for Order Preference by Similarity to the Ideal Solution. Generally, TOPSIS is based on the concept of minimum distance from the positive ideal solution and maximum distance from the negative ideal solution. We examine the deficiencies of using only one metric function in TOPSIS and propose the use of spherical metric function in addition to the commonly used metric functions. For empirical supplier selection problems, more than one metric function should be used.
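    To make the point about metric choice concrete, the sketch below implements a minimal TOPSIS ranking with a pluggable distance function. The decision matrix, weights, and criteria directions are made-up examples, and the spherical metric discussed in the paper is not reproduced here; only the general TOPSIS steps are shown.

```python
import numpy as np

# Minimal TOPSIS sketch with a selectable distance metric; example data are invented.

def topsis(matrix, weights, benefit, distance):
    m = np.asarray(matrix, dtype=float)
    norm = m / np.linalg.norm(m, axis=0)           # vector-normalize each criterion
    v = norm * weights                             # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.array([distance(row, ideal) for row in v])
    d_neg = np.array([distance(row, anti) for row in v])
    return d_neg / (d_pos + d_neg)                 # closeness coefficient per supplier

euclidean = lambda a, b: np.sqrt(np.sum((a - b) ** 2))
manhattan = lambda a, b: np.sum(np.abs(a - b))

suppliers = [[3.0, 0.8, 7.0], [2.5, 0.9, 6.0], [3.5, 0.7, 8.0]]  # cost, quality, delivery
weights = np.array([0.4, 0.4, 0.2])
benefit = np.array([False, True, True])            # cost is a cost-type criterion

print(topsis(suppliers, weights, benefit, euclidean))
print(topsis(suppliers, weights, benefit, manhattan))   # rankings can shift with the metric
```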

  20. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  1. Tracker Performance Metric

    National Research Council Canada - National Science Library

    Olson, Teresa; Lee, Harry; Sanders, Johnnie

    2002-01-01

    .... We have developed the Tracker Performance Metric (TPM) specifically for this purpose. It was designed to measure the output performance, on a frame-by-frame basis, using its output position and quality...

  2. IT Project Management Metrics

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Many software and IT projects fail to meet their objectives for a variety of reasons, among which project management carries considerable weight. In order to have successful projects, lessons learned have to be used, historical data collected, and metrics and indicators computed and compared with those of past projects to avoid failure. This paper presents some metrics that can be used for IT project management.

  3. Mass Customization Measurements Metrics

    DEFF Research Database (Denmark)

    Nielsen, Kjeld; Brunø, Thomas Ditlev; Jørgensen, Kaj Asbjørn

    2014-01-01

    A recent survey has indicated that 17 % of companies have ceased mass customizing less than 1 year after initiating the effort. This paper presents measurement for a company’s mass customization performance, utilizing metrics within the three fundamental capabilities: robust process design, choice...... navigation, and solution space development. A mass customizer when assessing performance with these metrics can identify within which areas improvement would increase competitiveness the most and enable more efficient transition to mass customization....

  4. Evaluation of reactivity shutdown margin for nuclear fuel reload optimization

    International Nuclear Information System (INIS)

    Wong, Hing-Ip; Maldonado, G.I.

    1995-01-01

    The FORMOSA-P code is a nuclear fuel management optimization package that combines simulated annealing (SA) and nodal generalized perturbation theory (GPT). Recent studies at Electricite de France (EdF-Clamart) have produced good results for power-peaking minimizations under multiple limiting control rod configurations. However, since the reactivity shutdown margin is not explicitly treated as an objective or constraint function, then any optimal loading patterns (LPs) are not guaranteed to yield an adequate shutdown margin (SDM). This study describes the implementation of the SDM calculation within a FORMOSA-P optimization. Maintaining all additional computational requirements to a minimum was a key consideration

  5. Evaluation of reactivity shutdown margin for nuclear fuel reload optimization

    International Nuclear Information System (INIS)

    Engrand, P.; Wong, H. I.; Maldonado, G.I.

    1996-01-01

    The FORMOSA-P code is a nuclear fuel management optimization package which combines simulated annealing (SA) and nodal generalized perturbation theory (GPT). Recent studies at Electricite de France have produced good results for power peaking minimizations under multiple limiting control rod configurations. However, since the reactivity shutdown margin is not explicitly treated as an objective or constraint function, then any optimal loading patterns (LPs) are not guaranteed to yield an adequate shutdown margin (SDM). This study describes the implementation of the SDM calculation within a FORMOSA-P optimization. Maintaining all additional computational requirements to a minimum was a key consideration. (authors). 4 refs., 2 figs

  6. Formulation Optimization and In-vitro Evaluation of Oral Floating ...

    African Journals Online (AJOL)

    matrix tablets and to systematically optimize its drug release using varying levels of xanthan gum and hydroxypropyl ... stomach and improve oral bioavailability of drugs that have ... which can affect its sustained release formulation. [19].

  7. Enhancing Evolutionary Optimization in Uncertain Environments by Allocating Evaluations via Multi-armed Bandit Algorithms

    OpenAIRE

    Qiu, Xin; Miikkulainen, Risto

    2018-01-01

    Optimization problems with uncertain fitness functions are common in the real world, and present unique challenges for evolutionary optimization approaches. Existing issues include excessively expensive evaluation, lack of solution reliability, and incapability in maintaining high overall fitness during optimization. Using conversion rate optimization as an example, this paper proposes a series of new techniques for addressing these issues. The main innovation is to augment evolutionary algor...
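    The sketch below illustrates the general idea of allocating noisy fitness evaluations with a multi-armed bandit rule, here UCB1. The noisy fitness function, the candidate set, and the budget are invented for the example; the paper's actual techniques and their integration with the evolutionary algorithm are not reproduced.

```python
import math
import random

# UCB1-style allocation of a fixed budget of noisy fitness evaluations among
# candidate solutions. Candidates, noise model, and budget are illustrative.

def ucb1_allocate(candidates, noisy_fitness, budget):
    counts = [0] * len(candidates)
    means = [0.0] * len(candidates)
    for t in range(1, budget + 1):
        if t <= len(candidates):                   # evaluate each candidate once first
            arm = t - 1
        else:
            arm = max(range(len(candidates)),
                      key=lambda i: means[i] + math.sqrt(2 * math.log(t) / counts[i]))
        reward = noisy_fitness(candidates[arm])
        counts[arm] += 1
        means[arm] += (reward - means[arm]) / counts[arm]   # running mean update
    return means, counts

true_rates = [0.10, 0.12, 0.15]                    # e.g. hypothetical conversion rates
noisy = lambda i: 1.0 if random.random() < true_rates[i] else 0.0
means, counts = ucb1_allocate([0, 1, 2], noisy, budget=2000)
print(counts)   # most evaluations should concentrate on the best candidate
```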

  8. A matrix-algebraic algorithm for the Riemannian logarithm on the Stiefel manifold under the canonical metric

    OpenAIRE

    Zimmermann, Ralf

    2016-01-01

    We derive a numerical algorithm for evaluating the Riemannian logarithm on the Stiefel manifold with respect to the canonical metric. In contrast to the optimization-based approach known from the literature, we work from a purely matrix-algebraic perspective. Moreover, we prove that the algorithm converges locally and exhibits a linear rate of convergence.

  9. A matrix-algebraic algorithm for the Riemannian logarithm on the Stiefel manifold under the canonical metric

    DEFF Research Database (Denmark)

    Zimmermann, Ralf

    2017-01-01

    We derive a numerical algorithm for evaluating the Riemannian logarithm on the Stiefel manifold with respect to the canonical metric. In contrast to the optimization-based approach known from the literature, we work from a purely matrix-algebraic perspective. Moreover, we prove that the algorithm converges locally and exhibits a linear rate of convergence.

  10. Fault Management Metrics

    Science.gov (United States)

    Johnson, Stephen B.; Ghoshal, Sudipto; Haste, Deepak; Moore, Craig

    2017-01-01

    This paper describes the theory and considerations in the application of metrics to measure the effectiveness of fault management. Fault management refers here to the operational aspect of system health management, and as such is considered as a meta-control loop that operates to preserve or maximize the system's ability to achieve its goals in the face of current or prospective failure. As a suite of control loops, the metrics to estimate and measure the effectiveness of fault management are similar to those of classical control loops in being divided into two major classes: state estimation, and state control. State estimation metrics can be classified into lower-level subdivisions for detection coverage, detection effectiveness, fault isolation and fault identification (diagnostics), and failure prognosis. State control metrics can be classified into response determination effectiveness and response effectiveness. These metrics are applied to each and every fault management control loop in the system, for each failure to which they apply, and probabilistically summed to determine the effectiveness of these fault management control loops to preserve the relevant system goals that they are intended to protect.

  11. An Optimization Study on Syngas Production and Economic Evaluation

    Directory of Open Access Journals (Sweden)

    Qasim Faraz

    2016-01-01

    Full Text Available Syngas production in the gas-to-liquid (GTL) process has been studied by several researchers with the goal of increasing production at minimal capital and operating cost. In this study, the syngas production process is simulated and optimized to increase its output, and an economic analysis of the proposed optimized process is carried out. Aspen HYSYS v8.4 is used for all process simulation work in this article. A new configuration based on auto-thermal reforming is rigorously simulated. Results exhibit a substantial rise in syngas production.

  12. Optimization and evaluation of probabilistic-logic sequence models

    DEFF Research Database (Denmark)

    Christiansen, Henning; Lassen, Ole Torp

    Analysis of biological sequence data demands more and more sophisticated and fine-grained models, but these in turn introduce hard computational problems. A class of probabilistic-logic models is considered, which increases the expressibility from the regular and context-free languages of HMMs and SCFGs to, in principle, Turing-complete languages. In general, such models are computationally far too complex for direct use, so optimization by pruning and approximation is needed. The first steps are made towards a methodology for optimizing such models by approximations using auxiliary models.

  13. Cyber threat metrics.

    Energy Technology Data Exchange (ETDEWEB)

    Frye, Jason Neal; Veitch, Cynthia K.; Mateski, Mark Elliot; Michalski, John T.; Harris, James Mark; Trevino, Cassandra M.; Maruoka, Scott

    2012-03-01

    Threats are generally much easier to list than to describe, and much easier to describe than to measure. As a result, many organizations list threats. Fewer describe them in useful terms, and still fewer measure them in meaningful ways. This is particularly true in the dynamic and nebulous domain of cyber threats - a domain that tends to resist easy measurement and, in some cases, appears to defy any measurement. We believe the problem is tractable. In this report we describe threat metrics and models for characterizing threats consistently and unambiguously. The purpose of this report is to support the Operational Threat Assessment (OTA) phase of risk and vulnerability assessment. To this end, we focus on the task of characterizing cyber threats using consistent threat metrics and models. In particular, we address threat metrics and models for describing malicious cyber threats to US FCEB agencies and systems.

  14. Design Optimization and Evaluation of Gastric Floating Matrix Tablet ...

    African Journals Online (AJOL)

    HP

    Abstract. Purpose: To formulate an optimized gastric floating drug delivery system (GFDDS) containing glipizide ... Sodium bicarbonate by geometric mixing then .... order polynomial equation (Eq 4) with added.

  15. Adaptive metric kernel regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    2000-01-01

    Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms...

  16. Adaptive Metric Kernel Regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    1998-01-01

    Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...
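    A rough sketch of the adaptive-metric idea is given below: a Nadaraya-Watson kernel regressor whose per-dimension length scales are chosen by minimising a leave-one-out cross-validation error. The synthetic data, the Gaussian kernel, and the random-search tuning step are illustrative assumptions, not the papers' actual algorithm.

```python
import numpy as np

# Adaptive-metric kernel regression sketch: tune per-dimension length scales by
# minimising leave-one-out (LOO) cross-validation error. Data and search are invented.

def nw_predict(X_train, y_train, X_query, scales):
    d = (X_query[:, None, :] - X_train[None, :, :]) / scales   # scaled differences
    w = np.exp(-0.5 * np.sum(d ** 2, axis=2))                  # Gaussian kernel weights
    return (w @ y_train) / np.maximum(w.sum(axis=1), 1e-12)

def loo_error(X, y, scales):
    d = (X[:, None, :] - X[None, :, :]) / scales
    w = np.exp(-0.5 * np.sum(d ** 2, axis=2))
    np.fill_diagonal(w, 0.0)                                   # leave each point out
    pred = (w @ y) / np.maximum(w.sum(axis=1), 1e-12)
    return float(np.mean((pred - y) ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.standard_normal(200)      # only dimension 0 matters

candidates = [rng.uniform(0.05, 2.0, size=3) for _ in range(300)]
best = min(candidates, key=lambda s: loo_error(X, y, s))
print(best)                                                    # irrelevant dims tend to get larger scales
print(nw_predict(X, y, np.array([[0.5, 0.0, 0.0]]), best))     # close to sin(1.5) ≈ 0.997
```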

  17. METRICS DEVELOPMENT FOR PATENTS.

    Science.gov (United States)

    Veiga, Daniela Francescato; Ferreira, Lydia Masako

    2015-01-01

    To develop a proposal for metrics for patents to be applied in assessing the postgraduate programs of Medicine III - Capes. From the reading and analysis of the 2013 area documents of all 48 areas of Capes, a proposal for metrics for patents was developed to be applied in Medicine III programs. Except for the areas Biotechnology, Food Science, Biological Sciences III, Physical Education, Engineering I, III and IV, and Interdisciplinary, most areas do not adopt a scoring system for patents. The proposal developed was based on the criteria of Biotechnology, with adaptations. In general, deposit, granting and licensing/production will be valued, in ascending order. Higher scores will also be assigned to patents registered abroad and whenever there is participation of students. This proposal can be applied to the item Intellectual Production of the evaluation form, in the subsection Technical Production/Patents. The percentages of 10% for academic programs and 40% for professional Masters should be maintained. The program will be scored as Very Good when it reaches 400 points or more; Good, between 200 and 399 points; Regular, between 71 and 199 points; Weak, up to 70 points; Insufficient, no points.

  18. Quantifying esophagogastric junction contractility with a novel HRM topographic metric, the EGJ-Contractile Integral: normative values and preliminary evaluation in PPI non-responders.

    Science.gov (United States)

    Nicodème, F; Pipa-Muniz, M; Khanna, K; Kahrilas, P J; Pandolfino, J E

    2014-03-01

    Despite its obvious pathophysiological relevance, the clinical utility of measures of esophagogastric junction (EGJ) contractility is unsubstantiated. High-resolution manometry (HRM) may improve upon this with its inherent ability to integrate the magnitude of contractility over time and length of the EGJ. This study aimed to develop a novel HRM metric summarizing EGJ contractility and test its ability to distinguish among subgroups of proton pump inhibitor non-responders (PPI-NRs). 75 normal controls and 88 PPI-NRs were studied. All underwent HRM. PPI-NRs underwent pH-impedance monitoring on PPI therapy scored in terms of acid exposure, number of reflux events, and reflux-symptom correlation and grouped as meeting all criteria, some criteria, or no criteria of abnormality. Control HRM studies were used to establish normal values for candidate EGJ contractility metrics, which were then compared in their ability to differentiate among PPI-NR subgroups. The EGJ contractile integral (EGJ-CI), a metric integrating contractility across the EGJ for three respiratory cycles, best distinguished the All Criteria PPI-NR subgroup from controls and other PPI-NR subgroups. Normal values (median, [IQR]) for this measure were 39 mmHg-cm [25-55 mmHg-cm]. The correlation between the EGJ-CI and a previously proposed metric, the lower esophageal sphincter-pressure integral, which used a fixed 10 s time frame and an atmospheric rather than gastric pressure reference, was weak. Among HRM metrics tested, the EGJ-CI was best in distinguishing PPI-NRs meeting all criteria of abnormality on pH-impedance testing. Future prospective studies are required to explore its utility in management of broader groups of gastroesophageal reflux disease patients. © 2013 John Wiley & Sons Ltd.

  19. Evaluation and optimization of footwear comfort parameters using finite element analysis and a discrete optimization algorithm

    Science.gov (United States)

    Papagiannis, P.; Azariadis, P.; Papanikos, P.

    2017-10-01

    Footwear is subject to bending and torsion deformations that affect comfort perception. Following review of Finite Element Analysis studies of sole rigidity and comfort, a three-dimensional, linear multi-material finite element sole model for quasi-static bending and torsion simulation, overcoming boundary and optimisation limitations, is described. Common footwear materials properties and boundary conditions from gait biomechanics are used. The use of normalised strain energy for product benchmarking is demonstrated along with comfort level determination through strain energy density stratification. Sensitivity of strain energy against material thickness is greater for bending than for torsion, with results of both deformations showing positive correlation. Optimization for a targeted performance level and given layer thickness is demonstrated with bending simulations sufficing for overall comfort assessment. An algorithm for comfort optimization w.r.t. bending is presented, based on a discrete approach with thickness values set in line with practical manufacturing accuracy. This work illustrates the potential of the developed finite element analysis applications to offer viable and proven aids to modern footwear sole design assessment and optimization.

  20. Metrical Phonology and SLA.

    Science.gov (United States)

    Tice, Bradley S.

    Metrical phonology, a linguistic process of phonological stress assessment and diagrammatic simplification of sentence and word stress, is discussed as it is found in the English language with the intention that it may be used in second language instruction. Stress is defined by its physical and acoustical correlates, and the principles of…

  1. Metrics for Probabilistic Geometries

    DEFF Research Database (Denmark)

    Tosi, Alessandra; Hauberg, Søren; Vellido, Alfredo

    2014-01-01

    the distribution over mappings is given by a Gaussian process. We treat the corresponding latent variable model as a Riemannian manifold and we use the expectation of the metric under the Gaussian process prior to define interpolating paths and measure distance between latent points. We show how distances...

  2. Metric Learning for Hyperspectral Image Segmentation

    Science.gov (United States)

    Bue, Brian D.; Thompson, David R.; Gilmore, Martha S.; Castano, Rebecca

    2011-01-01

    We present a metric learning approach to improve the performance of unsupervised hyperspectral image segmentation. Unsupervised spatial segmentation can assist both user visualization and automatic recognition of surface features. Analysts can use spatially-continuous segments to decrease noise levels and/or localize feature boundaries. However, existing segmentation methods use task-agnostic measures of similarity. Here we learn task-specific similarity measures from training data, improving segment fidelity to classes of interest. Multiclass Linear Discriminant Analysis produces a linear transform that optimally separates a labeled set of training classes. This defines a distance metric that generalizes to new scenes, enabling graph-based segmentation that emphasizes key spectral features. We describe tests based on data from the Compact Reconnaissance Imaging Spectrometer (CRISM) in which learned metrics improve segment homogeneity with respect to mineralogical classes.
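    A hedged sketch of the core idea, fitting a multiclass LDA transform on labeled spectra and using distances in the transformed space, is shown below. The synthetic "spectra" and the use of scikit-learn's LDA are assumptions for illustration; the original work's graph-based segmentation step is not reproduced.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Learn a linear transform from labeled training spectra and use it to define
# a task-specific distance between spectra. Data are synthetic placeholders.

rng = np.random.default_rng(1)
n_bands, n_per_class = 50, 100
means = rng.normal(0.0, 1.0, size=(3, n_bands))       # three hypothetical class mean spectra
X = np.vstack([rng.normal(mu, 0.5, size=(n_per_class, n_bands)) for mu in means])
y = np.repeat(np.arange(3), n_per_class)

lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)

def learned_distance(spec_a, spec_b):
    """Distance between two spectra in the LDA-transformed space."""
    a, b = lda.transform([spec_a])[0], lda.transform([spec_b])[0]
    return float(np.linalg.norm(a - b))

print(learned_distance(X[0], X[1]))    # same class: typically small
print(learned_distance(X[0], X[-1]))   # different classes: typically larger
```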

  3. Multi-disciplinary design optimization and performance evaluation of a single stage transonic axial compressor

    International Nuclear Information System (INIS)

    Lee, Sae Il; Lee, Dong Ho; Kim, Kyu Hong; Park, Tae Choon; Lim, Byeung Jun; Kang, Young Seok

    2013-01-01

    The multidisciplinary design optimization method, which integrates aerodynamic performance and structural stability, was utilized in the development of a single-stage transonic axial compressor. An approximation model was created using an artificial neural network for global optimization within given ranges of variables and several design constraints. The genetic algorithm was used for the exploration of the Pareto front to find the maximum objective function value. The final design was chosen after a second-stage gradient-based optimization process to improve the accuracy of the optimization. To validate the design procedure, numerical simulations and compressor tests were carried out to evaluate the aerodynamic performance and safety factor of the optimized compressor. The numerical optimization results and the experimental data are well matched. The optimum shape of the compressor blade is obtained and compared to the baseline design. The proposed optimization framework improves the aerodynamic efficiency and the safety factor.

  4. Optimal contracts based on subjective performance evaluations and reciprocity

    DEFF Research Database (Denmark)

    Sebald, Alexander Christopher; Walzl, Markus

    2015-01-01

    As demonstrated in a recent laboratory experiment (see Sebald & Walzl, 2014), individuals tend to sanction others who subjectively evaluate their performance whenever this assessment falls short of the individuals’ self-evaluation. Interestingly, this is the case even if the individuals’ earnings...

  5. Optimization and evaluation of alkaline potassium permanganate pretreatment of corncob.

    Science.gov (United States)

    Ma, Lijuan; Cui, Youzhi; Cai, Rui; Liu, Xueqiang; Zhang, Cuiying; Xiao, Dongguang

    2015-03-01

    Alkaline potassium permanganate solution (APP) was applied to the pretreatment of corncob with a simple and effective optimization of APP concentration, reaction time, temperature and solid-to-liquid ratio (SLR). The optimized pretreatment conditions were 2% (w/v) potassium permanganate with an SLR of 1:10, treated for 6 h at 50°C. This simple one-step treatment resulted in significant recoveries of 94.56% of the cellulose and 81.47% of the hemicellulose, and removal of 46.79% of the lignin from the corncob. The reducing sugar in the hydrolysate from APP-pretreated corncob was 8.39 g/L after 12 h of enzymatic hydrolysis, which was 1.44- and 1.29-fold higher than those from raw and acid-pretreated corncobs. Physical characteristics, crystallinity and structure of the pretreated corncob were analyzed and assessed by SEM, XRD and FTIR. The APP pretreatment process was novel and enhanced enzymatic hydrolysis of lignocellulose by affecting composition and structural features. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Honorary authorship epidemic in scholarly publications? How the current use of citation-based evaluative metrics make (pseudo)honorary authors from honest contributors of every multi-author article.

    Science.gov (United States)

    Kovacs, Jozsef

    2013-08-01

    The current use of citation-based metrics to evaluate the research output of individual researchers is highly discriminatory because they are uniformly applied to authors of single-author articles as well as contributors of multi-author papers. In the latter case, these quantitative measures are counted, as if each contributor were the single author of the full article. In this way, each and every contributor is assigned the full impact-factor score and all the citations that the article has received. This has a multiplication effect on each contributor's citation-based evaluative metrics of multi-author articles, because the more contributors an article has, the more undeserved credit is assigned to each of them. In this paper, I argue that this unfair system could be made fairer by requesting the contributors of multi-author articles to describe the nature of their contribution, and to assign a numerical value to their degree of relative contribution. In this way, we could create a contribution-specific index of each contributor for each citation metric. This would be a strong disincentive against honorary authorship and publication cartels, because it would transform the current win-win strategy of accepting honorary authors in the byline into a zero-sum game for each contributor.
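    A toy illustration of the contribution-specific index proposed above is shown below: each author's citation credit is scaled by their declared share of the work instead of granting everyone full credit. The papers, shares, and citation counts are invented for the example.

```python
# Contribution-weighted citation credit: scale each citation count by the
# author's declared contribution fraction. Example data are invented.

def contribution_weighted_citations(articles):
    """articles: list of (citations, {author: contribution_fraction})."""
    credit = {}
    for citations, shares in articles:
        for author, fraction in shares.items():
            credit[author] = credit.get(author, 0.0) + citations * fraction
    return credit

papers = [
    (40, {"A": 0.6, "B": 0.3, "C": 0.1}),
    (10, {"A": 0.5, "C": 0.5}),
]
print(contribution_weighted_citations(papers))
# {'A': 29.0, 'B': 12.0, 'C': 9.0} versus 50, 40, 50 under the current full-credit rule
```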

  7. Decision Analysis for Metric Selection on a Clinical Quality Scorecard.

    Science.gov (United States)

    Guth, Rebecca M; Storey, Patricia E; Vitale, Michael; Markan-Aurora, Sumita; Gordon, Randolph; Prevost, Traci Q; Dunagan, Wm Claiborne; Woeltje, Keith F

    2016-09-01

    Clinical quality scorecards are used by health care institutions to monitor clinical performance and drive quality improvement. Because of the rapid proliferation of quality metrics in health care, BJC HealthCare found it increasingly difficult to select the most impactful scorecard metrics while still monitoring metrics for regulatory purposes. A 7-step measure selection process was implemented incorporating Kepner-Tregoe Decision Analysis, which is a systematic process that considers key criteria that must be satisfied in order to make the best decision. The decision analysis process evaluates what metrics will most appropriately fulfill these criteria, as well as identifies potential risks associated with a particular metric in order to identify threats to its implementation. Using this process, a list of 750 potential metrics was narrowed to 25 that were selected for scorecard inclusion. This decision analysis process created a more transparent, reproducible approach for selecting quality metrics for clinical quality scorecards. © The Author(s) 2015.
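    To illustrate the scoring step of such a process, a simplified weighted-criteria sketch is shown below. Kepner-Tregoe Decision Analysis also includes mandatory criteria and risk assessment, which are omitted here; the criteria, weights, and candidate metrics are invented for the example and are not BJC HealthCare's actual selections.

```python
# Hypothetical weighted-criteria scoring step for choosing scorecard metrics.
# Criteria, weights (1-10), and candidate scores (1-10) are invented.

def weighted_score(candidates, weights):
    """candidates: {metric: {criterion: score}}, weights: {criterion: weight}."""
    return {name: sum(weights[c] * s for c, s in scores.items())
            for name, scores in candidates.items()}

weights = {"clinical impact": 10, "data availability": 7, "actionability": 8}
candidates = {
    "30-day readmission": {"clinical impact": 9, "data availability": 8, "actionability": 6},
    "hand-hygiene compliance": {"clinical impact": 6, "data availability": 9, "actionability": 9},
}
ranked = sorted(weighted_score(candidates, weights).items(), key=lambda kv: -kv[1])
print(ranked)   # higher total score = stronger candidate for the scorecard
```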

  8. Comparison of luminance based metrics in different lighting conditions

    DEFF Research Database (Denmark)

    Wienold, J.; Kuhn, T.E.; Christoffersen, J.

    In this study, we evaluate established and newly developed metrics for predicting glare, using data from three different research studies. The evaluation covers two different targets: 1. How well does the user’s perception of glare magnitude correlate with the prediction of the glare metrics? 2. How well do the glare metrics describe the subjects’ disturbance by glare? We applied Spearman correlations, logistic regressions and an accuracy evaluation based on an ROC analysis. The results show that five of the twelve investigated metrics fail at least one of the statistical tests. The other seven metrics, CGI, modified DGI, DGP, Ev, average luminance of the image Lavg, UGP and UGR, pass all statistical tests. DGP, CGI, DGI_mod and UGP have the largest AUC and might be slightly more robust. The accuracy of the predictions of the aforementioned seven metrics for the disturbance by glare lies...

  9. The Study on Food Sensory Evaluation based on Particle Swarm Optimization Algorithm

    OpenAIRE

    Hairong Wang; Huijuan Xu

    2015-01-01

    This study explores the procedures and methods for establishing a food sensory evaluation system based on the particle swarm optimization algorithm, by explaining the interpretation of sensory evaluation and sensory analysis and by reviewing how sensory evaluation is applied in the food industry.
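    For reference, a generic particle swarm optimization sketch is included below, only to illustrate the algorithm named above. The objective function, parameter values, and bounds are assumptions for demonstration, not the study's actual sensory-evaluation model.

```python
import numpy as np

# Generic particle swarm optimization (PSO) sketch; objective and parameters are illustrative.

def pso(objective, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, bounds=(-5, 5)):
    rng = np.random.default_rng(0)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, dim))       # particle positions
    v = np.zeros_like(x)                                    # particle velocities
    pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

# Example: minimize a simple quadratic surrogate of an "evaluation error"
best, val = pso(lambda p: float(np.sum((p - 1.0) ** 2)), dim=4)
print(best, val)
```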

  10. Reliability Evaluation for Optimizing Electricity Supply in a Developing Country

    Directory of Open Access Journals (Sweden)

    Mark Ndubuka NWOHU

    2007-09-01

    Full Text Available The reliability standards for electricity supply in a developing country, such as Nigeria, have to be determined from past engineering principles and practice. Because of the high demand for electrical power due to rapid development, industrialization and rural electrification, the economic, social and political climate in which the electric power supply industry now operates should be examined critically to ensure that the production of electrical power is augmented and remains uninterrupted. This paper presents an economic framework that can be used to optimize electric power system reliability. Finally, the cost models are investigated to take into account the economic analysis of system reliability, which can be periodically updated to improve the overall reliability of the electric power system.

  11. Functional Fit Evaluation to Determine Optimal Ease Requirements in Canadian Forces Chemical Protective Gloves

    National Research Council Canada - National Science Library

    Tremblay-Lutter, Julie

    1995-01-01

    A functional fit evaluation of the Canadian Forces (CF) chemical protective lightweight glove was undertaken in order to quantify the amount of ease required within the glove for optimal functional fit...

  12. Sigma Routing Metric for RPL Protocol

    Directory of Open Access Journals (Sweden)

    Paul Sanmartin

    2018-04-01

Full Text Available This paper presents the adaptation of a specific metric for the RPL protocol in the objective function MRHOF. Among the functions standardized by the IETF, we find OF0, which is based on the minimum hop count, as well as MRHOF, which is based on the Expected Transmission Count (ETX). However, when the network becomes denser or the number of nodes increases, both OF0 and MRHOF introduce long hops, which can generate a bottleneck that restricts the network. The adaptation is proposed to optimize both OFs through a new routing metric. To solve the above problem, the minimum-hop-count and ETX metrics are combined in a new routing metric called SIGMA-ETX, in which the best route is calculated using the standard deviation of the ETX values between each node, as opposed to working with the average ETX along the route. This method ensures better routing performance in dense sensor networks. The simulations are done with the Cooja simulator, based on the Contiki operating system. The simulations show that the proposed optimization outperforms both OF0 and MRHOF by a wide margin in terms of network latency, packet delivery ratio, lifetime, and power consumption.
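
    A small sketch of the contrast the abstract draws between an average-based route cost and a deviation-based cost in the spirit of SIGMA-ETX; the per-hop ETX values are made up, and the actual SIGMA-ETX formula may differ from this simplification.

        import statistics

        def route_cost_etx_sum(etx_per_hop):
            """Classic additive ETX cost along a candidate route (MRHOF-style)."""
            return sum(etx_per_hop)

        def route_cost_sigma_etx(etx_per_hop):
            """Deviation-based cost: prefer routes whose per-hop ETX values are uniform."""
            return statistics.pstdev(etx_per_hop)

        route_a = [1.1, 1.2, 1.1, 1.3]   # many short, even hops
        route_b = [1.0, 3.5]             # one long, lossy hop
        for cost in (route_cost_etx_sum, route_cost_sigma_etx):
            print(cost.__name__, round(cost(route_a), 3), round(cost(route_b), 3))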

  13. Enterprise Sustainment Metrics

    Science.gov (United States)

    2015-06-19

are negatively impacting KPIs” (Parmenter, 2010: 31). In the current state, the Air Force’s AA and PBL metrics are once again split. AA does... must have the authority to “take immediate action to rectify situations that are negatively impacting KPIs” (Parmenter, 2010: 31). 3. Measuring... highest profitability and shareholder value for each company” (2014: 273). By systematically diagramming a process, either through a swim lane flowchart

  14. Optimization evaluation of cutting technology based on mechanical parts

    Science.gov (United States)

    Wang, Yu

    2018-04-01

The relationship between the mechanical manufacturing process and carbon emissions is studied on the basis of the steps of the manufacturing process. A formula for calculating carbon emissions suitable for mechanical manufacturing is derived. On this basis, a green evaluation method for the cold machining of mechanical parts is proposed. The proposed evaluation method is verified, and its data analysed, through an example. The results show that there is a strong relationship between mechanical manufacturing process data and carbon emissions.
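
    The derived carbon-emission formula is not reproduced in the abstract; the sketch below shows the usual energy-and-consumables accounting form such a formula takes for a single cutting operation, with every quantity and emission factor assumed for illustration.

        # Hypothetical carbon accounting for one cutting operation (all values assumed).
        electricity_kwh = 2.4      # machine-tool energy for the operation
        coolant_litre   = 0.3      # cutting fluid consumed
        chip_kg         = 0.8      # material removed as chips

        EF_ELECTRICITY = 0.58      # kg CO2e per kWh   (grid-dependent assumption)
        EF_COOLANT     = 2.85      # kg CO2e per litre (assumption)
        EF_CHIP        = 0.12      # kg CO2e per kg of chips handled (assumption)

        carbon_kg = (electricity_kwh * EF_ELECTRICITY
                     + coolant_litre * EF_COOLANT
                     + chip_kg * EF_CHIP)
        print(f"estimated process emission: {carbon_kg:.2f} kg CO2e")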

  15. Evaluation and Optimization of Godare Starch as a Binder and ...

    African Journals Online (AJOL)

The binding and disintegrating properties of Godare (Colocasia esculenta) starch in paracetamol tablet formulations were evaluated in comparison with potato starch. Tablet crushing strengths (Hs), friabilities (Frs), disintegration times (DTs) and porosities were determined. The results showed that Godare starch has a better ...

  16. Modern tools to evaluate and optimize fire protection systems

    International Nuclear Information System (INIS)

    Alvares, N.J.; Hasegawa, H.K.

    1980-01-01

    Modern techniques, such as fault tree analysis, can be used to obtain engineering descriptions of specific fire protection systems. The analysis allows establishment of an optimum level of fire protection, and evaluates the level of protection provided by various systems. A prime example: the application to fusion energy experiments
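
    A toy fault-tree evaluation of the kind referred to above, assuming independent basic events; the gate structure and probabilities are invented for illustration only.

        def p_or(*p):
            """Probability that at least one independent input event occurs (OR gate)."""
            q = 1.0
            for pi in p:
                q *= (1.0 - pi)
            return 1.0 - q

        def p_and(*p):
            """Probability that all independent input events occur (AND gate)."""
            q = 1.0
            for pi in p:
                q *= pi
            return q

        # Hypothetical top event: "fire not suppressed"
        #   = detection fails OR (sprinkler fails AND manual response is too late)
        p_top = p_or(0.01, p_and(0.05, 0.2))
        print(f"top-event probability ~ {p_top:.4f}")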

  17. Symmetries of the dual metrics

    International Nuclear Information System (INIS)

    Baleanu, D.

    1998-01-01

The geometric duality between the metric g μν and a Killing tensor K μν is studied. The conditions under which the symmetries of the metric g μν and the dual metric K μν coincide are found. Dual spinning space is constructed without the introduction of torsion. The general results are applied to the case of the Kerr-Newman metric

  18. PARALLEL IMPLEMENTATION OF CROSS-LAYER OPTIMIZATION - A PERFORMANCE EVALUATION BASED ON SWARM INTELLIGENCE

    Directory of Open Access Journals (Sweden)

    Vanaja Gokul

    2012-01-01

Full Text Available In distributed systems, real-time optimizations need to be performed dynamically for better utilization of network resources. Real-time optimization can be performed effectively by using Cross Layer Optimization (CLO) within the network operating system. This paper presents a performance evaluation of Cross Layer Optimization (CLO) in comparison with the traditional approach of Single-Layer Optimization (SLO). The experimental study of the parallel implementation of the two approaches indicates that CLO results in a significant improvement in network utilization when compared to SLO. A variant of the Particle Swarm Optimization technique that uses Digital Pheromones (PSODP) for better performance has been used here. A significantly higher speed-up in performance was observed for the parallel implementation of CLO using PSODP on a cluster of nodes.

  19. Performance indices and evaluation of algorithms in building energy efficient design optimization

    International Nuclear Information System (INIS)

    Si, Binghui; Tian, Zhichao; Jin, Xing; Zhou, Xin; Tang, Peng; Shi, Xing

    2016-01-01

    Building energy efficient design optimization is an emerging technique that is increasingly being used to design buildings with better overall performance and a particular emphasis on energy efficiency. To achieve building energy efficient design optimization, algorithms are vital to generate new designs and thus drive the design optimization process. Therefore, the performance of algorithms is crucial to achieving effective energy efficient design techniques. This study evaluates algorithms used for building energy efficient design optimization. A set of performance indices, namely, stability, robustness, validity, speed, coverage, and locality, is proposed to evaluate the overall performance of algorithms. A benchmark building and a design optimization problem are also developed. Hooke–Jeeves algorithm, Multi-Objective Genetic Algorithm II, and Multi-Objective Particle Swarm Optimization algorithm are evaluated by using the proposed performance indices and benchmark design problem. Results indicate that no algorithm performs best in all six areas. Therefore, when facing an energy efficient design problem, the algorithm must be carefully selected based on the nature of the problem and the performance indices that matter the most. - Highlights: • Six indices of algorithm performance in building energy optimization are developed. • For each index, its concept is defined and the calculation formulas are proposed. • A benchmark building and benchmark energy efficient design problem are proposed. • The performance of three selected algorithms are evaluated.

  20. Evaluation of optimal reuse system for hydrofluoric acid wastewater

    Energy Technology Data Exchange (ETDEWEB)

    Won, Chan-Hee [Department of Environmental Engineering, Chonbuk National University, 567 Bakje-daero, Deokjin-Gu, Jeonju, Jeollabuk-Do, 561-756 (Korea, Republic of); Choi, Jeongyun [R and D Center, Samsung Engineering Co. Ltd., 415-10 Woncheon-Dong, Youngtong-Gu, Suwon, Gyeonggi-Do, 443-823 (Korea, Republic of); Chung, Jinwook, E-mail: jin-wook.chung@samsung.com [R and D Center, Samsung Engineering Co. Ltd., 415-10 Woncheon-Dong, Youngtong-Gu, Suwon, Gyeonggi-Do, 443-823 (Korea, Republic of)

    2012-11-15

Highlights: • Coagulation and ion exchange technologies were ineffective in removing fluoride. • Polyamide RO was more efficacious than cellulose RO due to its high flux and rejection. • The spiral wound RO system was preferable to the disc tube RO system for reusing raw hydrofluoric acid wastewater. • Combined coagulation and RO technology can be applied to reuse raw hydrofluoric acid wastewater. - Abstract: The treatment of hydrofluoric acid (HF) wastewater has been an important environmental issue in recent years due to the extensive use of hydrofluoric acid in the chemical and electronics industries, such as semiconductor manufacturing. Coagulation/precipitation and ion exchange technologies have been used to treat HF wastewater, but these conventional methods are ineffective in removing organics, salts, and fluorides, limiting its reuse in terms of water quality and economic feasibility. One promising alternative is reverse osmosis (RO) after lime treatment. Based on a pilot-scale experiment using real HF wastewater discharged from a semiconductor facility, the spiral wound module equipped with polyamide membranes showed excellent flux and chemical cleaning cycles. Our results suggest that coagulation/precipitation and spiral wound RO constitute the optimal combination for reusing HF wastewater.

  1. Kerr metric in cosmological background

    Energy Technology Data Exchange (ETDEWEB)

    Vaidya, P C [Gujarat Univ., Ahmedabad (India). Dept. of Mathematics

    1977-06-01

A metric satisfying Einstein's equation is given which in the vicinity of the source reduces to the well-known Kerr metric and which at large distances reduces to the Robertson-Walker metric of a homogeneous cosmological model. The radius of the event horizon of the Kerr black hole in the cosmological background is found.

  2. Evaluating dynamic covariance matrix forecasting and portfolio optimization

    OpenAIRE

    Sendstad, Lars Hegnes; Holten, Dag Martin

    2012-01-01

    In this thesis we have evaluated the covariance forecasting ability of the simple moving average, the exponential moving average and the dynamic conditional correlation models. Overall we found that a dynamic portfolio can gain significant improvements by implementing a multivariate GARCH forecast. We further divided the global investment universe into sectors and regions in order to investigate the relative portfolio performance of several asset allocation strategies with both variance and c...
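
    A minimal sketch of one of the forecasting schemes mentioned above, an exponentially weighted (RiskMetrics-style) covariance forecast; the decay factor, warm-up window, and synthetic returns are assumptions, and the thesis's GARCH specifications are not reproduced here.

        import numpy as np

        def ewma_covariance(returns, lam=0.94, warmup=20):
            """Exponentially weighted covariance forecast from a (T, N) matrix of returns."""
            T, _ = returns.shape
            cov = np.cov(returns[:warmup].T)          # warm-up estimate
            for t in range(warmup, T):
                r = returns[t:t + 1].T                # column vector of day-t returns
                cov = lam * cov + (1.0 - lam) * (r @ r.T)
            return cov

        rng = np.random.default_rng(1)
        rets = rng.normal(0.0, 0.01, size=(500, 3))   # synthetic daily returns, 3 assets
        print(ewma_covariance(rets).round(6))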

  3. Social Media Metrics Importance and Usage Frequency in Latvia

    Directory of Open Access Journals (Sweden)

    Ronalds Skulme

    2017-12-01

Full Text Available Purpose of the article: The purpose of this paper was to explore which social media marketing metrics are used most often and are most important for marketing experts in Latvia, and which can be used to evaluate marketing campaign effectiveness. Methodology/methods: In order to achieve the aim of this paper, several theoretical and practical research methods were used, such as theoretical literature analysis, surveying and grouping. First, theoretical research about social media metrics was conducted. The authors collected information about social media metric grouping methods and the social media metrics mentioned most frequently in the literature. The collected information was used as the foundation for the expert surveys. The expert surveys were used to collect information from Latvian marketing professionals to determine which social media metrics are used most often and which are most important in Latvia. Scientific aim: The scientific aim of this paper was to identify whether the importance of social media metrics varies depending on the consumer purchase decision stage. Findings: Information about the most important and most often used social media marketing metrics in Latvia was collected. A new social media grouping framework is proposed. Conclusions: The main conclusion is that the importance and usage frequency of social media metrics change depending on the consumer purchase decision stage the metric is used to evaluate.

  4. Global optimization based on noisy evaluations: An empirical study of two statistical approaches

    International Nuclear Information System (INIS)

    Vazquez, Emmanuel; Villemonteix, Julien; Sidorkiewicz, Maryan; Walter, Eric

    2008-01-01

The optimization of the output of complex computer codes often has to be achieved with a small budget of evaluations. Algorithms dedicated to such problems have been developed and compared, such as the Expected Improvement algorithm (EI) or the Informational Approach to Global Optimization (IAGO). However, the influence of noisy evaluation results on the outcome of these comparisons has often been neglected, despite its frequent appearance in industrial problems. In this paper, empirical convergence rates for EI and IAGO are compared when an additive noise corrupts the result of an evaluation. IAGO appears more efficient than EI and various modifications of EI designed to deal with noisy evaluations. Keywords: global optimization; computer simulations; kriging; Gaussian process; noisy evaluations.
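
    For reference, the Expected Improvement criterion has a closed form under a Gaussian posterior; the sketch below is for noise-free minimization, where mu and sigma are the posterior mean and standard deviation at a candidate point and f_min is the best evaluation so far. The noise handling that is the paper's focus is deliberately omitted.

        from scipy.stats import norm

        def expected_improvement(mu, sigma, f_min):
            """EI(x) = E[max(f_min - Y(x), 0)] for a Gaussian posterior Y(x) ~ N(mu, sigma^2)."""
            if sigma <= 0.0:
                return max(f_min - mu, 0.0)
            z = (f_min - mu) / sigma
            return (f_min - mu) * norm.cdf(z) + sigma * norm.pdf(z)

        # Example: candidate point with posterior mean 1.2, std 0.3, incumbent best 1.0
        print(expected_improvement(mu=1.2, sigma=0.3, f_min=1.0))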

  5. Anisotropic rectangular metric for polygonal surface remeshing

    KAUST Repository

    Pellenard, Bertrand

    2013-06-18

    We propose a new method for anisotropic polygonal surface remeshing. Our algorithm takes as input a surface triangle mesh. An anisotropic rectangular metric, defined at each triangle facet of the input mesh, is derived from both a user-specified normal-based tolerance error and the requirement to favor rectangle-shaped polygons. Our algorithm uses a greedy optimization procedure that adds, deletes and relocates generators so as to match two criteria related to partitioning and conformity.

  6. Anisotropic rectangular metric for polygonal surface remeshing

    KAUST Repository

    Pellenard, Bertrand; Morvan, Jean-Marie; Alliez, Pierre

    2013-01-01

    We propose a new method for anisotropic polygonal surface remeshing. Our algorithm takes as input a surface triangle mesh. An anisotropic rectangular metric, defined at each triangle facet of the input mesh, is derived from both a user-specified normal-based tolerance error and the requirement to favor rectangle-shaped polygons. Our algorithm uses a greedy optimization procedure that adds, deletes and relocates generators so as to match two criteria related to partitioning and conformity.

  7. Learning Low-Dimensional Metrics

    OpenAIRE

    Jain, Lalit; Mason, Blake; Nowak, Robert

    2017-01-01

This paper investigates the theoretical foundations of metric learning, focused on key questions that are not fully addressed in prior work: 1) we consider learning general low-dimensional (low-rank) metrics as well as sparse metrics; 2) we develop upper and lower (minimax) bounds on the generalization error; 3) we quantify the sample complexity of metric learning in terms of the dimension of the feature space and the dimension/rank of the underlying metric; 4) we also bound the accuracy ...

  8. Measures and Metrics for Feasibility of Proof-of-Concept Studies With Human Immunodeficiency Virus Rapid Point-of-Care Technologies

    Science.gov (United States)

    Pant Pai, Nitika; Chiavegatti, Tiago; Vijh, Rohit; Karatzas, Nicolaos; Daher, Jana; Smallwood, Megan; Wong, Tom; Engel, Nora

    2017-01-01

    Objective Pilot (feasibility) studies form a vast majority of diagnostic studies with point-of-care technologies but often lack use of clear measures/metrics and a consistent framework for reporting and evaluation. To fill this gap, we systematically reviewed data to (a) catalog feasibility measures/metrics and (b) propose a framework. Methods For the period January 2000 to March 2014, 2 reviewers searched 4 databases (MEDLINE, EMBASE, CINAHL, Scopus), retrieved 1441 citations, and abstracted data from 81 studies. We observed 2 major categories of measures, that is, implementation centered and patient centered, and 4 subcategories of measures, that is, feasibility, acceptability, preference, and patient experience. We defined and delineated metrics and measures for a feasibility framework. We documented impact measures for a comparison. Findings We observed heterogeneity in reporting of metrics as well as misclassification and misuse of metrics within measures. Although we observed poorly defined measures and metrics for feasibility, preference, and patient experience, in contrast, acceptability measure was the best defined. For example, within feasibility, metrics such as consent, completion, new infection, linkage rates, and turnaround times were misclassified and reported. Similarly, patient experience was variously reported as test convenience, comfort, pain, and/or satisfaction. In contrast, within impact measures, all the metrics were well documented, thus serving as a good baseline comparator. With our framework, we classified, delineated, and defined quantitative measures and metrics for feasibility. Conclusions Our framework, with its defined measures/metrics, could reduce misclassification and improve the overall quality of reporting for monitoring and evaluation of rapid point-of-care technology strategies and their context-driven optimization. PMID:29333105

  9. Measures and Metrics for Feasibility of Proof-of-Concept Studies With Human Immunodeficiency Virus Rapid Point-of-Care Technologies: The Evidence and the Framework.

    Science.gov (United States)

    Pant Pai, Nitika; Chiavegatti, Tiago; Vijh, Rohit; Karatzas, Nicolaos; Daher, Jana; Smallwood, Megan; Wong, Tom; Engel, Nora

    2017-12-01

    Pilot (feasibility) studies form a vast majority of diagnostic studies with point-of-care technologies but often lack use of clear measures/metrics and a consistent framework for reporting and evaluation. To fill this gap, we systematically reviewed data to ( a ) catalog feasibility measures/metrics and ( b ) propose a framework. For the period January 2000 to March 2014, 2 reviewers searched 4 databases (MEDLINE, EMBASE, CINAHL, Scopus), retrieved 1441 citations, and abstracted data from 81 studies. We observed 2 major categories of measures, that is, implementation centered and patient centered, and 4 subcategories of measures, that is, feasibility, acceptability, preference, and patient experience. We defined and delineated metrics and measures for a feasibility framework. We documented impact measures for a comparison. We observed heterogeneity in reporting of metrics as well as misclassification and misuse of metrics within measures. Although we observed poorly defined measures and metrics for feasibility, preference, and patient experience, in contrast, acceptability measure was the best defined. For example, within feasibility, metrics such as consent, completion, new infection, linkage rates, and turnaround times were misclassified and reported. Similarly, patient experience was variously reported as test convenience, comfort, pain, and/or satisfaction. In contrast, within impact measures, all the metrics were well documented, thus serving as a good baseline comparator. With our framework, we classified, delineated, and defined quantitative measures and metrics for feasibility. Our framework, with its defined measures/metrics, could reduce misclassification and improve the overall quality of reporting for monitoring and evaluation of rapid point-of-care technology strategies and their context-driven optimization.

  10. Preclinical Evaluations To Identify Optimal Linezolid Regimens for Tuberculosis Therapy

    Science.gov (United States)

    Drusano, George L.; Adams, Jonathan R.; Rodriquez, Jaime L.; Jambunathan, Kalyani; Baluya, Dodge L.; Brown, David L.; Kwara, Awewura; Mirsalis, Jon C.; Hafner, Richard; Louie, Arnold

    2015-01-01

    ABSTRACT Linezolid is an oxazolidinone with potent activity against Mycobacterium tuberculosis. Linezolid toxicity in patients correlates with the dose and duration of therapy. These toxicities are attributable to the inhibition of mitochondrial protein synthesis. Clinically relevant linezolid regimens were simulated in the in vitro hollow-fiber infection model (HFIM) system to identify the linezolid therapies that minimize toxicity, maximize antibacterial activity, and prevent drug resistance. Linezolid inhibited mitochondrial proteins in an exposure-dependent manner, with toxicity being driven by trough concentrations. Once-daily linezolid killed M. tuberculosis in an exposure-dependent manner. Further, 300 mg linezolid given every 12 hours generated more bacterial kill but more toxicity than 600 mg linezolid given once daily. None of the regimens prevented linezolid resistance. These findings show that with linezolid monotherapy, a clear tradeoff exists between antibacterial activity and toxicity. By identifying the pharmacokinetic parameters linked with toxicity and antibacterial activity, these data can provide guidance for clinical trials evaluating linezolid in multidrug antituberculosis regimens. PMID:26530386

  11. Performance evaluation of Genetic Algorithms on loading pattern optimization of PWRs

    International Nuclear Information System (INIS)

    Tombakoglu, M.; Bekar, K.B.; Erdemli, A.O.

    2001-01-01

Genetic Algorithm (GA) based systems are used for search and optimization problems. There are several applications of GAs in the literature successfully applied to loading pattern optimization problems. In this study, we have selected the loading pattern optimization problem of a Pressurised Water Reactor (PWR). The main objective of this work is to evaluate the performance of Genetic Algorithm operators such as regional crossover, crossover and mutation, and selection, as well as the construction of the initial population and its size, for PWR loading pattern optimization problems. The performance of a GA with antithetic variates is compared to a traditional GA. Antithetic variates are used to generate the initial population, and their use with GA operators is also discussed. Finally, the results of multi-cycle optimization problems are discussed for an objective function taking into account cycle burn-up and discharge burn-up. (author)

  12. A new universal colour image fidelity metric

    NARCIS (Netherlands)

    Toet, A.; Lucassen, M.P.

    2003-01-01

    We extend a recently introduced universal grayscale image quality index to a newly developed perceptually decorrelated colour space. The resulting colour image fidelity metric quantifies the distortion of a processed colour image relative to its original version. We evaluated the new colour image

  13. Improving alignment in Tract-based spatial statistics: evaluation and optimization of image registration.

    Science.gov (United States)

    de Groot, Marius; Vernooij, Meike W; Klein, Stefan; Ikram, M Arfan; Vos, Frans M; Smith, Stephen M; Niessen, Wiro J; Andersson, Jesper L R

    2013-08-01

Anatomical alignment in neuroimaging studies is of such importance that considerable effort is put into improving the registration used to establish spatial correspondence. Tract-based spatial statistics (TBSS) is a popular method for comparing diffusion characteristics across subjects. TBSS establishes spatial correspondence using a combination of nonlinear registration and a "skeleton projection" that may break topological consistency of the transformed brain images. We therefore investigated feasibility of replacing the two-stage registration-projection procedure in TBSS with a single, regularized, high-dimensional registration. To optimize registration parameters and to evaluate registration performance in diffusion MRI, we designed an evaluation framework that uses native space probabilistic tractography for 23 white matter tracts, and quantifies tract similarity across subjects in standard space. We optimized parameters for two registration algorithms on two diffusion datasets of different quality. We investigated reproducibility of the evaluation framework, and of the optimized registration algorithms. Next, we compared registration performance of the regularized registration methods and TBSS. Finally, feasibility and effect of incorporating the improved registration in TBSS were evaluated in an example study. The evaluation framework was highly reproducible for both algorithms (R² = 0.993; 0.931). The optimal registration parameters depended on the quality of the dataset in a graded and predictable manner. At optimal parameters, both algorithms outperformed the registration of TBSS, showing feasibility of adopting such approaches in TBSS. This was further confirmed in the example experiment. Copyright © 2013 Elsevier Inc. All rights reserved.

  14. Evaluation of the need for stochastic optimization of out-of-core nuclear fuel management decisions

    International Nuclear Information System (INIS)

    Thomas, R.L. Jr.

    1989-01-01

Work has been completed on utilizing mathematical optimization techniques to optimize out-of-core nuclear fuel management decisions. The objective of such optimization is to minimize the levelized fuel cycle cost over some planning horizon. Typical decision variables include feed enrichments and number of assemblies, burnable poison requirements, and burned fuel to reinsert for every cycle in the planning horizon. Engineering constraints imposed consist of such items as discharge burnup limits, maximum enrichment limit, and target cycle energy productions. Earlier, the authors reported on the development of the OCEON code, which employs the integer Monte Carlo Programming method as the mathematical optimization method. The discharge burnups, and feed enrichment and burnable poison requirements, are evaluated, initially employing a linear reactivity core physics model and refined using a coarse mesh nodal model. The economic evaluation is completed using a modification of the CINCAS methodology. The interest now is to assess the need for stochastic optimization, which will account for cost component and cycle energy production uncertainties. The implication of the present studies is that stochastic optimization with regard to cost component uncertainties need not be completed, since deterministic optimization will identify nearly the same family of near-optimum cycling schemes

  15. Development of a perceptually calibrated objective metric of noise

    Science.gov (United States)

    Keelan, Brian W.; Jin, Elaine W.; Prokushkin, Sergey

    2011-01-01

    A system simulation model was used to create scene-dependent noise masks that reflect current performance of mobile phone cameras. Stimuli with different overall magnitudes of noise and with varying mixtures of red, green, blue, and luminance noises were included in the study. Eleven treatments in each of ten pictorial scenes were evaluated by twenty observers using the softcopy ruler method. In addition to determining the quality loss function in just noticeable differences (JNDs) for the average observer and scene, transformations for different combinations of observer sensitivity and scene susceptibility were derived. The psychophysical results were used to optimize an objective metric of isotropic noise based on system noise power spectra (NPS), which were integrated over a visual frequency weighting function to yield perceptually relevant variances and covariances in CIE L*a*b* space. Because the frequency weighting function is expressed in terms of cycles per degree at the retina, it accounts for display pixel size and viewing distance effects, so application-specific predictions can be made. Excellent results were obtained using only L* and a* variances and L*a* covariance, with relative weights of 100, 5, and 12, respectively. The positive a* weight suggests that the luminance (photopic) weighting is slightly narrow on the long wavelength side for predicting perceived noisiness. The L*a* covariance term, which is normally negative, reflects masking between L* and a* noise, as confirmed in informal evaluations. Test targets in linear sRGB and rendered L*a*b* spaces for each treatment are available at http://www.aptina.com/ImArch/ to enable other researchers to test metrics of their own design and calibrate them to JNDs of quality loss without performing additional observer experiments. Such JND-calibrated noise metrics are particularly valuable for comparing the impact of noise and other attributes, and for computing overall image quality.
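
    A sketch of the final combination step described above, assuming the perceptually filtered L* and a* noise fields are already available (the visual-frequency weighting of the noise power spectra is omitted); the relative weights 100, 5, and 12 are those reported, everything else below is synthetic.

        import numpy as np

        def objective_noise_metric(L_noise, a_noise, w_L=100.0, w_a=5.0, w_La=12.0):
            """Combine filtered L* and a* noise fields into a single objective value
            using variances and the L*a* covariance with the reported relative weights."""
            var_L = np.var(L_noise)
            var_a = np.var(a_noise)
            cov_La = np.cov(L_noise.ravel(), a_noise.ravel())[0, 1]
            return w_L * var_L + w_a * var_a + w_La * cov_La

        rng = np.random.default_rng(2)
        L = rng.normal(0, 1.0, (64, 64))             # synthetic filtered L* noise field
        a = 0.4 * L + rng.normal(0, 0.5, (64, 64))   # correlated a* noise (masking term)
        print(objective_noise_metric(L, a))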

  16. Optimal Sizing and Performance Evaluation of a Renewable Energy Based Microgrid in Future Seaports

    DEFF Research Database (Denmark)

    Baizura Binti Ahamad, Nor; Othman @ Marzuki, Muzaidi Bin; Quintero, Juan Carlos Vasquez

    2018-01-01

This paper presents the optimal design, specifies the dimensions and energy planning, and evaluates the performance of a microgrid to supply electricity to the load by using an integrated microgrid. The integrated system consists of PV, a wind turbine and a battery for grid connection. This paper also analyzes the performance of the designed system based on a seaport located in Copenhagen, Denmark as a case study. The analysis is performed by using the Hybrid Optimization Model for Electric Renewables (HOMER) software, which includes optimization and sensitivity analysis results. The simulation result indicates that the implementation of microgrid technologies would be a convenient solution to supply electricity to the load application (shipboard)...

  17. Optimal Sizing and Performance Evaluation of a Renewable Energy Based Microgrid in Future Seaports

    DEFF Research Database (Denmark)

    Baizura Binti Ahamad, Nor; Othman @ Marzuki, Muzaidi Bin; Quintero, Juan Carlos Vasquez

    2018-01-01

This paper presents the optimal design, specifies the dimensions and energy planning, and evaluates the performance of a microgrid to supply electricity to the load by using an integrated microgrid. The integrated system consists of PV, a wind turbine and a battery for grid connection. This paper also analyzes the performance of the designed system based on a seaport located in Copenhagen, Denmark as a case study. The analysis is performed by using the Hybrid Optimization Model for Electric Renewables (HOMER) software, which includes optimization and sensitivity analysis results. The simulation result indicates...

  18. Evaluation and optimization of the High Resolution Research Tomograph (HRRT)

    International Nuclear Information System (INIS)

    Knoess, C.

    2004-01-01

    Positron Emission Tomography (PET) is an imaging technique used in medicine to determine qualitative and quantitative metabolic parameters in vivo. The High Resolution Research Tomograph (HRRT) is a new high resolution tomograph that was designed for brain studies (312 mm transaxial field-of-view (FOV), 252 mm axial FOV). The detector blocks are arranged in a quadrant sharing design and consist of two crystal layers with dimensions of 2.1 mm x 2.1 mm x 7.5 mm. The main detector material is the newly developed scintillator lutetium oxyorthosilicate (LSO). Events from the different crystal layers are distinguished by Pulse Shape Discrimination (PSD) to gain Depth of Interaction (DOI) information. This will improve the spatial resolution, especially at the edges of the FOV. A prototype of the tomograph was installed at the Max-Planck Institute for Neurological Research in Cologne, Germany in 1999 and was evaluated with respect to spatial resolution, sensitivity, scatter fraction, and count rate behavior. These performance measurements showed that this prototype provided a spatial resolution of around 2.5 mm in a volume big enough to contain the human brain. A comparison with a single layer HRRT prototype showed a 10% worsening of the resolution, despite the fact that DOI was used. Without DOI, the resolution decreased considerably. The sensitivity, as measured with a 22 Na point source, was 46.5 cps/kBq for an energy window of 350-650 keV and 37.9 cps/kBq for an energy window of 400-650 keV, while the scatter fractions were 56% for 350-650 keV and 51% for 400-650 keV, respectively. A daily quality check was developed and implemented that uses the uniform, natural radioactive background of the scintillator material LSO. In 2001, the manufacturer decided to build a series of additional HRRT scanners to try to improve the design (detector electronics, transmission source design, and shielding against out-of-FOV activity) and to eliminate problems (difficult detector

  19. Recursive form of general limited memory variable metric methods

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Vlček, Jan

    2013-01-01

    Roč. 49, č. 2 (2013), s. 224-235 ISSN 0023-5954 Institutional support: RVO:67985807 Keywords : unconstrained optimization * large scale optimization * limited memory methods * variable metric updates * recursive matrix formulation * algorithms Subject RIV: BA - General Mathematics Impact factor: 0.563, year: 2013 http://dml.cz/handle/10338.dmlcz/143365

  20. Sharp metric obstructions for quasi-Einstein metrics

    Science.gov (United States)

    Case, Jeffrey S.

    2013-02-01

    Using the tractor calculus to study smooth metric measure spaces, we adapt results of Gover and Nurowski to give sharp metric obstructions to the existence of quasi-Einstein metrics on suitably generic manifolds. We do this by introducing an analogue of the Weyl tractor W to the setting of smooth metric measure spaces. The obstructions we obtain can be realized as tensorial invariants which are polynomial in the Riemann curvature tensor and its divergence. By taking suitable limits of their tensorial forms, we then find obstructions to the existence of static potentials, generalizing to higher dimensions a result of Bartnik and Tod, and to the existence of potentials for gradient Ricci solitons.

  1. Measurable Control System Security through Ideal Driven Technical Metrics

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Sean McBride; Marie Farrar; Zachary Tudor

    2008-01-01

The Department of Homeland Security National Cyber Security Division supported development of a small set of security ideals as a framework to establish measurable control systems security. Based on these ideals, a draft set of proposed technical metrics was developed to allow control systems owner-operators to track improvements or degradations in their individual control systems security posture. The technical metrics development effort included review and evaluation of over thirty metrics-related documents. On the basis of complexity, ambiguity, or misleading and distorting effects, the metrics identified during the reviews were determined to be weaker than necessary to aid defense against the myriad threats posed by cyber-terrorism to human safety, as well as to economic prosperity. Using the results of our metrics review and the set of security ideals as a starting point for metrics development, we identified thirteen potential technical metrics, with at least one metric supporting each ideal. Two case study applications of the ideals and thirteen metrics to control systems were then performed to establish potential difficulties in applying both the ideals and the metrics. The case studies resulted in no changes to the ideals, and only a few deletions and refinements to the thirteen potential metrics. This led to a final proposed set of ten core technical metrics. To further validate the security ideals, the modifications made to the original thirteen potential metrics, and the final proposed set of ten core metrics, seven separate control systems security assessments performed over the past three years were reviewed for findings and recommended mitigations. These findings and mitigations were then mapped to the security ideals and metrics to assess gaps in their coverage. The mappings indicated that there are no gaps in the security ideals and that the ten core technical metrics provide significant coverage of standard security issues, with 87% coverage. Based...

  2. Completion of a Dislocated Metric Space

    Directory of Open Access Journals (Sweden)

    P. Sumati Kumari

    2015-01-01

    Full Text Available We provide a construction for the completion of a dislocated metric space (abbreviated d-metric space; we also prove that the completion of the metric associated with a d-metric coincides with the metric associated with the completion of the d-metric.

  3. Predicting class testability using object-oriented metrics

    NARCIS (Netherlands)

    M. Bruntink (Magiel); A. van Deursen (Arie)

    2004-01-01

In this paper we investigate factors of the testability of object-oriented software systems. The starting point is given by a study of the literature to obtain both an initial model of testability and existing OO metrics related to testability. Subsequently, these metrics are evaluated

  4. Metric adjusted skew information

    DEFF Research Database (Denmark)

    Hansen, Frank

    2008-01-01

We extend the concept of Wigner-Yanase-Dyson skew information to something we call "metric adjusted skew information" (of a state with respect to a conserved observable). This "skew information" is intended to be a non-negative quantity bounded by the variance (of an observable in a state) that vanishes for observables commuting with the state. We show that the skew information is a convex function on the manifold of states. It also satisfies other requirements, proposed by Wigner and Yanase, for an effective measure-of-information content of a state relative to a conserved observable. We establish a connection between the geometrical formulation of quantum statistics as proposed by Chentsov and Morozova and measures of quantum information as introduced by Wigner and Yanase and extended in this article. We show that the set of normalized Morozova-Chentsov functions describing the possible...

  5. The metric system: An introduction

    Energy Technology Data Exchange (ETDEWEB)

    Lumley, S.M.

    1995-05-01

On July 13, 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on July 25, 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first they examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

  6. Attack-Resistant Trust Metrics

    Science.gov (United States)

    Levien, Raph

    The Internet is an amazingly powerful tool for connecting people together, unmatched in human history. Yet, with that power comes great potential for spam and abuse. Trust metrics are an attempt to compute the set of which people are trustworthy and which are likely attackers. This chapter presents two specific trust metrics developed and deployed on the Advogato Website, which is a community blog for free software developers. This real-world experience demonstrates that the trust metrics fulfilled their goals, but that for good results, it is important to match the assumptions of the abstract trust metric computation to the real-world implementation.

  7. The metric system: An introduction

    Science.gov (United States)

    Lumley, Susan M.

    On 13 Jul. 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on 25 Jul. 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first they examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

  8. Metric-adjusted skew information

    DEFF Research Database (Denmark)

    Liang, Cai; Hansen, Frank

    2010-01-01

We give a truly elementary proof of the convexity of metric-adjusted skew information following an idea of Effros. We extend earlier results of weak forms of superadditivity to general metric-adjusted skew information. Recently, Luo and Zhang introduced the notion of semi-quantum states on a bipartite system and proved superadditivity of the Wigner-Yanase-Dyson skew informations for such states. We extend this result to the general metric-adjusted skew information. We finally show that a recently introduced extension to parameter values 1 ... of (unbounded) metric-adjusted skew information.

  9. Two classes of metric spaces

    Directory of Open Access Journals (Sweden)

    Isabel Garrido

    2016-04-01

    Full Text Available The class of metric spaces (X,d known as small-determined spaces, introduced by Garrido and Jaramillo, are properly defined by means of some type of real-valued Lipschitz functions on X. On the other hand, B-simple metric spaces introduced by Hejcman are defined in terms of some kind of bornologies of bounded subsets of X. In this note we present a common framework where both classes of metric spaces can be studied which allows us to see not only the relationships between them but also to obtain new internal characterizations of these metric properties.

  10. Biobjective Optimization and Evaluation for Transit Signal Priority Strategies at Bus Stop-to-Stop Segment

    Directory of Open Access Journals (Sweden)

    Rui Li

    2016-01-01

    Full Text Available This paper proposes a new optimization framework for the transit signal priority strategies in terms of green extension, red truncation, and phase insertion at the stop-to-stop segment of bus lines. The optimization objective is to minimize both passenger delay and the deviation from bus schedule simultaneously. The objective functions are defined with respect to the segment between bus stops, which can include the adjacent signalized intersections and downstream bus stops. The transit priority signal timing is optimized by using a biobjective optimization framework considering both the total delay at a segment and the delay deviation from the arrival schedules at bus stops. The proposed framework is evaluated using a VISSIM model calibrated with field traffic volume and traffic signal data of Caochangmen Boulevard in Nanjing, China. The optimized TSP-based phasing plans result in the reduced delay and improved reliability, compared with the non-TSP scenario under the different traffic flow conditions in the morning peak hour. The evaluation results indicate the promising performance of the proposed optimization framework in reducing the passenger delay and improving the bus schedule adherence for the urban transit system.
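
    A minimal illustration of the biobjective idea: candidate priority plans are scored on passenger delay and schedule deviation, and only the non-dominated (Pareto-optimal) plans are kept. The plan names and numbers below are invented, not results from the VISSIM study.

        def pareto_front(candidates):
            """Return candidates not dominated in (delay, deviation); smaller is better."""
            front = []
            for c in candidates:
                dominated = any(
                    o["delay"] <= c["delay"] and o["deviation"] <= c["deviation"]
                    and (o["delay"] < c["delay"] or o["deviation"] < c["deviation"])
                    for o in candidates
                )
                if not dominated:
                    front.append(c)
            return front

        plans = [  # hypothetical TSP timing plans evaluated by simulation
            {"plan": "no TSP",          "delay": 54.0, "deviation": 31.0},
            {"plan": "green extension", "delay": 47.0, "deviation": 22.0},
            {"plan": "red truncation",  "delay": 49.0, "deviation": 18.0},
            {"plan": "phase insertion", "delay": 46.0, "deviation": 25.0},
        ]
        print([p["plan"] for p in pareto_front(plans)])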

  11. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    Science.gov (United States)

    Post, J. V.

    1981-01-01

Software quality metrics were extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

  12. A robust approach to optimal matched filter design in ultrasonic non-destructive evaluation (NDE)

    Science.gov (United States)

    Li, Minghui; Hayward, Gordon

    2017-02-01

The matched filter has been demonstrated to be a powerful yet efficient technique for enhancing defect detection and imaging in the ultrasonic non-destructive evaluation (NDE) of coarse-grained materials, provided that the filter is properly designed and optimized. In the literature, in order to accurately approximate the defect echoes, the design utilized the real excitation signals, which made it time consuming and less straightforward to implement in practice. In this paper, we present a more robust and flexible approach to optimal matched filter design using simulated excitation signals; the control parameters are chosen and optimized based on the real scenario of the array transducer, the transmitter-receiver system response, and the test sample, so that the filter response is optimized and depends on the material characteristics. Experiments on industrial samples are conducted and the results confirm the great benefits of the method.
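
    A minimal matched-filter sketch in the spirit of the approach above: the received trace is correlated with a simulated excitation template rather than a measured one. The sampling rate, tone-burst template, and synthetic grain noise are all assumptions for illustration.

        import numpy as np

        def matched_filter(received, template):
            """Correlate the received trace with the time-reversed template (matched filter)."""
            return np.convolve(received, template[::-1], mode="same")

        fs = 50e6                                       # sampling rate (assumed)
        t = np.arange(0, 2e-6, 1 / fs)
        template = np.sin(2 * np.pi * 5e6 * t) * np.hanning(t.size)   # 5 MHz tone burst

        rng = np.random.default_rng(3)
        trace = rng.normal(0, 1.0, 4000)                # grain-noise-like background
        trace[1500:1500 + t.size] += 0.8 * template     # weak defect echo buried in noise

        enhanced = matched_filter(trace, template)
        print(int(np.argmax(np.abs(enhanced))))         # peak should appear near the echo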

  13. Bi-Level Optimization for Available Transfer Capability Evaluation in Deregulated Electricity Market

    Directory of Open Access Journals (Sweden)

    Beibei Wang

    2015-11-01

Full Text Available Available transfer capability (ATC) is the transfer capability remaining in the physical transmission network for further commercial activity over and above already committed uses, which needs to be posted in the electricity market to facilitate competition. ATC evaluation is a complicated task including the determination of total transfer capability (TTC) and existing transfer capability (ETC). In the deregulated electricity market, ETC is decided by the independent system operator's (ISO's) economic dispatch (ED). TTC can then be obtained by a continuation power flow (CPF) method or by an optimal power flow (OPF) method, based on the given ED solutions as well as the ETC. In this paper, a bi-level optimization framework for ATC evaluation is proposed in which ATC results can be obtained simultaneously with the ED and ETC results in the deregulated electricity market. In this bi-level optimization model, ATC evaluation is formulated as the upper-level problem and the ISO's ED is the lower-level problem. The bi-level model is first converted to a mathematical program with equilibrium constraints (MPEC) by recasting the lower-level problem as its Karush-Kuhn-Tucker (KKT) optimality condition. Then, the MPEC is transformed into a mixed-integer linear programming (MILP) problem, which can be solved with the help of available optimization software. In addition, case studies on PJM 5-bus, IEEE 30-bus, and IEEE 118-bus systems are presented to demonstrate the proposed methodology.

  14. Degraded visual environment image/video quality metrics

    Science.gov (United States)

    Baumgartner, Dustin D.; Brown, Jeremy B.; Jacobs, Eddie L.; Schachter, Bruce J.

    2014-06-01

    A number of image quality metrics (IQMs) and video quality metrics (VQMs) have been proposed in the literature for evaluating techniques and systems for mitigating degraded visual environments. Some require both pristine and corrupted imagery. Others require patterned target boards in the scene. None of these metrics relates well to the task of landing a helicopter in conditions such as a brownout dust cloud. We have developed and used a variety of IQMs and VQMs related to the pilot's ability to detect hazards in the scene and to maintain situational awareness. Some of these metrics can be made agnostic to sensor type. Not only are the metrics suitable for evaluating algorithm and sensor variation, they are also suitable for choosing the most cost effective solution to improve operating conditions in degraded visual environments.

  15. DCEO Biotechnology: Tools To Design, Construct, Evaluate, and Optimize the Metabolic Pathway for Biosynthesis of Chemicals

    DEFF Research Database (Denmark)

    Chen, Xiulai; Gao, Cong; Guo, Liang

    2018-01-01

Chemical synthesis is a well established route for producing many chemicals on a large scale, but some drawbacks still exist in this process, such as unstable intermediates, multistep reactions, complex process control, etc. Biobased production provides an attractive alternative to these challenges, but how to make cells into efficient factories is challenging. As a key enabling technology to develop efficient cell factories, design-construction-evaluation-optimization (DCEO) biotechnology, which incorporates the concepts and techniques of pathway design, pathway construction, pathway evaluation, and pathway optimization at the systems level, offers a conceptual and technological framework to exploit potential pathways, modify existing pathways and create new pathways for the optimal production of desired chemicals. Here, we summarize recent progress of DCEO biotechnology and examples of its...

  16. Development of an evaluation method for optimization of maintenance strategy in commercial plant

    International Nuclear Information System (INIS)

    Ito, Satoshi; Shiraishi, Natsuki; Yuki, Kazuhisa; Hashizume, Hidetoshi

    2006-01-01

In this study, a new simulation method is developed for the optimization of maintenance strategy in an NPP as a multiple-objective optimization problem (MOP). The result of operation is evaluated as the average of the following three measures over 3,000 trials: Cost of Electricity (COE) as economic risk, frequency of unplanned shutdown as plant reliability, and unavailability of the Regular Service System (RSS) and Engineering Safety Features (ESF) as safety measures. The following maintenance parameters are considered to evaluate the various risks in plant operation under changing maintenance strategies: planned outage cycle, surveillance cycle, major inspection cycle, and surveillance cycle depending on the value of the Fussel-Vesely importance measure. When the AHP-based decision-making method is used, the results show individual tendencies that depend on the decision-maker. Therefore this study could be useful for resolving the problem of maintenance optimization as a MOP. (author)

  17. Experimental evaluation of optimal Vehicle Dynamic Control based on the State Dependent Riccati Equation technique

    NARCIS (Netherlands)

    Alirezaei, M.; Kanarachos, S.A.; Scheepers, B.T.M.; Maurice, J.P.

    2013-01-01

Development and experimental evaluation of an optimal Vehicle Dynamic Control (VDC) strategy based on the State Dependent Riccati Equation (SDRE) control technique is presented. The proposed nonlinear controller is based on a nonlinear vehicle model with nonlinear tire characteristics. A novel

  18. Metrical Phonology: German Sound System.

    Science.gov (United States)

    Tice, Bradley S.

    Metrical phonology, a linguistic process of phonological stress assessment and diagrammatic simplification of sentence and word stress, is discussed as it is found in the English and German languages. The objective is to promote use of metrical phonology as a tool for enhancing instruction in stress patterns in words and sentences, particularly in…

  19. Extending cosmology: the metric approach

    OpenAIRE

    Mendoza, S.

    2012-01-01

    Comment: 2012, Extending Cosmology: The Metric Approach, Open Questions in Cosmology; Review article for an Intech "Open questions in cosmology" book chapter (19 pages, 3 figures). Available from: http://www.intechopen.com/books/open-questions-in-cosmology/extending-cosmology-the-metric-approach

  20. Numerical Calabi-Yau metrics

    International Nuclear Information System (INIS)

    Douglas, Michael R.; Karp, Robert L.; Lukic, Sergio; Reinbacher, Rene

    2008-01-01

    We develop numerical methods for approximating Ricci flat metrics on Calabi-Yau hypersurfaces in projective spaces. Our approach is based on finding balanced metrics and builds on recent theoretical work by Donaldson. We illustrate our methods in detail for a one parameter family of quintics. We also suggest several ways to extend our results

  1. High resolution metric imaging payload

    Science.gov (United States)

    Delclaud, Y.

    2017-11-01

Alcatel Space Industries has become Europe's leader in the field of high and very high resolution optical payloads, in the framework of earth observation systems able to provide military governments with metric images from space. This leadership allowed ALCATEL to propose for the export market, within a French collaboration framework, a complete space-based system for metric observation.

  2. Weyl metrics and wormholes

    Energy Technology Data Exchange (ETDEWEB)

    Gibbons, Gary W. [DAMTP, University of Cambridge, Wilberforce Road, Cambridge, CB3 0WA U.K. (United Kingdom); Volkov, Mikhail S., E-mail: gwg1@cam.ac.uk, E-mail: volkov@lmpt.univ-tours.fr [Laboratoire de Mathématiques et Physique Théorique, LMPT CNRS—UMR 7350, Université de Tours, Parc de Grandmont, Tours, 37200 France (France)

    2017-05-01

    We study solutions obtained via applying dualities and complexifications to the vacuum Weyl metrics generated by massive rods and by point masses. Rescaling them and extending to complex parameter values yields axially symmetric vacuum solutions containing singularities along circles that can be viewed as singular matter sources. These solutions have wormhole topology with several asymptotic regions interconnected by throats and their sources can be viewed as thin rings of negative tension encircling the throats. For a particular value of the ring tension the geometry becomes exactly flat although the topology remains non-trivial, so that the rings literally produce holes in flat space. To create a single ring wormhole of one metre radius one needs a negative energy equivalent to the mass of Jupiter. Further duality transformations dress the rings with the scalar field, either conventional or phantom. This gives rise to large classes of static, axially symmetric solutions, presumably including all previously known solutions for a gravity-coupled massless scalar field, as for example the spherically symmetric Bronnikov-Ellis wormholes with phantom scalar. The multi-wormholes contain infinite struts everywhere at the symmetry axes, apart from solutions with locally flat geometry.

  3. Metric regularity and subdifferential calculus

    International Nuclear Information System (INIS)

    Ioffe, A D

    2000-01-01

    The theory of metric regularity is an extension of two classical results: the Lyusternik tangent space theorem and the Graves surjection theorem. Developments in non-smooth analysis in the 1980s and 1990s paved the way for a number of far-reaching extensions of these results. It was also well understood that the phenomena behind the results are of metric origin, not connected with any linear structure. At the same time it became clear that some basic hypotheses of the subdifferential calculus are closely connected with the metric regularity of certain set-valued maps. The survey is devoted to the metric theory of metric regularity and its connection with subdifferential calculus in Banach spaces

  4. Multi-objective optimization for generating a weighted multi-model ensemble

    Science.gov (United States)

    Lee, H.

    2017-12-01

    Many studies have demonstrated that multi-model ensembles generally show better skill than each ensemble member. When generating weighted multi-model ensembles, the first step is measuring the performance of individual model simulations using observations. There is a consensus on the assignment of weighting factors based on a single evaluation metric. When considering only one evaluation metric, the weighting factor for each model is proportional to a performance score or inversely proportional to an error for the model. While this conventional approach can provide appropriate combinations of multiple models, the approach confronts a big challenge when there are multiple metrics under consideration. When considering multiple evaluation metrics, it is obvious that a simple averaging of multiple performance scores or model ranks does not address the trade-off problem between conflicting metrics. So far, there seems to be no best method to generate weighted multi-model ensembles based on multiple performance metrics. The current study applies the multi-objective optimization, a mathematical process that provides a set of optimal trade-off solutions based on a range of evaluation metrics, to combining multiple performance metrics for the global climate models and their dynamically downscaled regional climate simulations over North America and generating a weighted multi-model ensemble. NASA satellite data and the Regional Climate Model Evaluation System (RCMES) software toolkit are used for assessment of the climate simulations. Overall, the performance of each model differs markedly with strong seasonal dependence. Because of the considerable variability across the climate simulations, it is important to evaluate models systematically and make future projections by assigning optimized weighting factors to the models with relatively good performance. Our results indicate that the optimally weighted multi-model ensemble always shows better performance than an arithmetic
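
    A sketch of the conventional single-metric weighting described above (weights inversely proportional to each model's error), which the multi-objective optimization then generalizes to several conflicting metrics; the error values and projections below are invented.

        import numpy as np

        def inverse_error_weights(errors):
            """Weights proportional to 1/error, normalized to sum to one."""
            w = 1.0 / np.asarray(errors, dtype=float)
            return w / w.sum()

        model_error = [0.8, 1.6, 0.5, 2.4]           # e.g. RMSE vs. satellite obs (made up)
        weights = inverse_error_weights(model_error)
        projections = np.array([2.1, 2.9, 1.8, 3.4]) # same-variable projections (made up)
        print(weights.round(3), float(weights @ projections))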

  5. Designing Industrial Networks Using Ecological Food Web Metrics.

    Science.gov (United States)

    Layton, Astrid; Bras, Bert; Weissburg, Marc

    2016-10-18

    Biologically Inspired Design (biomimicry) and Industrial Ecology both look to natural systems to enhance the sustainability and performance of engineered products, systems and industries. Bioinspired design (BID) has traditionally focused on the unit-operation and single-product level. In contrast, this paper describes how principles of network organization derived from analysis of ecosystem properties can be applied to industrial system networks. Specifically, this paper examines the applicability of particular food web matrix properties as design rules for economically and biologically sustainable industrial networks, using an optimization model developed for a carpet recycling network. Carpet recycling network designs based on traditional cost- and emissions-based optimization are compared to designs obtained using optimizations based solely on ecological food web metrics. The analysis suggests that networks optimized using food web metrics were also superior from a traditional cost and emissions perspective; correlations between optimization using ecological metrics and traditional optimization ranged generally from 0.70 to 0.96, with flow-based metrics being superior to structural parameters. Four structural food web parameters provided correlations nearly the same as those obtained using all structural parameters, but individual structural parameters provided much less satisfactory correlations. The analysis indicates that bioinspired design principles from ecosystems can lead to both environmentally and economically sustainable industrial resource networks, and represent guidelines for designing sustainable industry networks.

  6. Optimization and Evaluation of Desloratadine Oral Strip: An Innovation in Paediatric Medication

    Directory of Open Access Journals (Sweden)

    Harmanpreet Singh

    2013-01-01

    Full Text Available Patients, especially children, are the most difficult group of the population to treat, mainly because they cannot swallow solid dosage forms. For this reason they are often prescribed liquid dosage forms, but these formulations have their own disadvantages (lack of dose accuracy during administration, spitting by children, spillage, lack of stability, difficulty in transportation, etc.). Oral strip technology is one technology that overcomes these disadvantages. Desloratadine, a descarboethoxy derivative of loratadine, is a second-generation antihistaminic drug approved for use in allergic rhinitis in the paediatric population and is available on the market as a suspension. An attempt has been made to design and optimize an oral strip containing desloratadine as the active ingredient. The oral strip was optimized with the help of an optimal experimental design using polymer concentration, plasticizer type, and plasticizer concentration as independent variables. Prepared oral strips were evaluated for physicochemical parameters, mechanical strength parameters, disintegration time, dissolution, surface pH, and moisture sorption tendency. The optimized formulation was further evaluated by scanning electron microscopy, moisture content, and histological alteration in oral mucosa. Accelerated stability studies were also carried out for the optimized formulation. Results were analysed with the help of various statistical tools.

  7. The Effect of Aerodynamic Evaluators on the Multi-Objective Optimization of Flatback Airfoils

    Science.gov (United States)

    Miller, M.; Slew, K. Lee; Matida, E.

    2016-09-01

    With the long lengths of today's wind turbine rotor blades, there is a need to reduce the mass, thereby requiring stiffer airfoils, while maintaining the aerodynamic efficiency of the airfoils, particularly in the inboard region of the blade where structural demands are highest. Using a genetic algorithm, the multi-objective aero-structural optimization of 30% thick flatback airfoils was systematically performed for a variety of aerodynamic evaluators such as lift-to-drag ratio (Cl/Cd), torque (Ct), and torque-to-thrust ratio (Ct/Cn) to determine their influence on airfoil shape and performance. The airfoil optimized for Ct possessed a 4.8% thick trailing-edge, and a rather blunt leading-edge region which creates high levels of lift and, correspondingly, drag. Its ability to maintain similar levels of lift and drag under forced transition conditions proved its insensitivity to roughness. The airfoil optimized for Cl/Cd displayed relatively poor insensitivity to roughness due to the rather aft-located free transition points. The Ct/Cn optimized airfoil was found to have a very similar shape to that of the Cl/Cd airfoil, with a slightly more blunt leading-edge which aided in providing higher levels of lift and moderate insensitivity to roughness. The influence of the chosen aerodynamic evaluator under the specified conditions and constraints in the optimization of wind turbine airfoils is shown to have a direct impact on the airfoil shape and performance.

  8. Comparative Study of Trace Metrics between Bibliometrics and Patentometrics

    Directory of Open Access Journals (Sweden)

    Fred Y. Ye

    2016-06-01

    Full Text Available Purpose: To comprehensively evaluate the overall performance of a group or an individual in both bibliometrics and patentometrics. Design/methodology/approach: Trace metrics were applied to the top 30 universities in computer sciences in the 2014 Academic Ranking of World Universities (ARWU), the top 30 ESI highly cited papers in the computer sciences field in 2014, as well as the top 30 assignees and the top 30 most cited patents in the National Bureau of Economic Research (NBER) computer hardware and software category. Findings: We found that, by applying trace metrics, the research or marketing impact efficiency, at both group and individual levels, was clearly observed. Furthermore, trace metrics were more sensitive to the different publication-citation distributions than the average citation and h-index were. Research limitations: Trace metrics considered publications with zero citations as negative contributions. One should clarify how he/she evaluates a zero-citation paper or patent before applying trace metrics. Practical implications: Decision makers could regularly examine the performance of their university/company by applying trace metrics and adjust their policies accordingly. Originality/value: Trace metrics could be applied both in bibliometrics and patentometrics and provide a comprehensive view. Moreover, the high sensitivity and unique impact efficiency view provided by trace metrics can facilitate decision makers in examining and adjusting their policies.

  9. Economic and Environmental Evaluation and Optimal Ratio of Natural and Recycled Aggregate Production

    Directory of Open Access Journals (Sweden)

    Milad Ghanbari

    2017-01-01

    Full Text Available The steady increase in the overexploitation of stone quarries, the generation of construction and demolition waste, and the costs of preparing extra landfill space have become environmental and waste management challenges in metropolises. In this paper, aggregate production is studied in two scenarios: scenario 1 representing the production of natural aggregates (NA) and scenario 2 representing the production of recycled aggregates (RA). This study consists of two parts. In the first part, the objective is the environmental assessment (energy consumption and CO2 emission) and economic (cost) evaluation of these two scenarios, which is pursued by the life-cycle assessment (LCA) method. In the second part, the results of the first part are used to estimate the optimal combination of production of NA and RA and thereby find an optimal solution (scenario) for a more eco-friendly aggregate production. The defined formulas and relationships are used to develop a model. The results of model validation show that the optimal ratio, in the optimal scenario, is 50%. The results show that, compared to scenario 1, the optimal scenario improves the energy consumption, CO2 emissions, and production cost by, respectively, 30%, 36%, and 31%, which demonstrates the effectiveness of this optimization.

  10. Standardised metrics for global surgical surveillance.

    Science.gov (United States)

    Weiser, Thomas G; Makary, Martin A; Haynes, Alex B; Dziekan, Gerald; Berry, William R; Gawande, Atul A

    2009-09-26

    Public health surveillance relies on standardised metrics to evaluate disease burden and health system performance. Such metrics have not been developed for surgical services despite increasing volume, substantial cost, and high rates of death and disability associated with surgery. The Safe Surgery Saves Lives initiative of WHO's Patient Safety Programme has developed standardised public health metrics for surgical care that are applicable worldwide. We assembled an international panel of experts to develop and define metrics for measuring the magnitude and effect of surgical care in a population, while taking into account economic feasibility and practicability. This panel recommended six measures for assessing surgical services at a national level: number of operating rooms, number of operations, number of accredited surgeons, number of accredited anaesthesia professionals, day-of-surgery death ratio, and postoperative in-hospital death ratio. We assessed the feasibility of gathering such statistics at eight diverse hospitals in eight countries and incorporated them into the WHO Guidelines for Safe Surgery, in which methods for data collection, analysis, and reporting are outlined.

  11. Evaluating Maximum Photovoltaic Integration in District Distribution Systems Considering Optimal Inverter Dispatch and Cloud Shading Conditions

    DEFF Research Database (Denmark)

    Ding, Tao; Kou, Yu; Yang, Yongheng

    2017-01-01

    As photovoltaic (PV) integration increases in distribution systems, investigating the maximum allowable PV integration capacity for a district distribution system becomes necessary in the planning phase; an optimization model is thus proposed to evaluate the maximum PV integration capacity while ... However, the intermittency of solar PV energy (e.g., due to passing clouds) may affect the PV generation in the district distribution network. To address this issue, the voltage magnitude constraints under the cloud shading conditions should be taken into account in the optimization model, which can ...

  12. Energy evaluation of optimal control strategies for central VWV chiller systems

    International Nuclear Information System (INIS)

    Jin Xinqiao; Du Zhimin; Xiao Xiaokun

    2007-01-01

    Under various conditions, the actual load of heating, ventilation and air conditioning (HVAC) systems is lower than the design load during most operation periods. To save energy and to optimize the controls of chilling systems, the performance of variable water volume (VWV) systems and the characteristics of control systems are analyzed, and three strategies are presented and tested by simulation in this paper. Energy evaluation of the three strategies shows that they can save energy to some extent, and that further potential remains. To minimize the energy consumption of the chilling system, the setpoints for control of the supply chilled water temperature and the supply head of the secondary pump should be optimized simultaneously

  13. Predicting class testability using object-oriented metrics

    OpenAIRE

    Bruntink, Magiel; Deursen, Arie

    2004-01-01

    In this paper we investigate factors of the testability of object-oriented software systems. The starting point is given by a study of the literature to obtain both an initial model of testability and existing OO metrics related to testability. Subsequently, these metrics are evaluated by means of two case studies of large Java systems for which JUnit test cases exist. The goal of this paper is to define and evaluate a set of metrics that can be used to assess the testability of t...

  14. Metrics for comparing plasma mass filters

    Energy Technology Data Exchange (ETDEWEB)

    Fetterman, Abraham J.; Fisch, Nathaniel J. [Department of Astrophysical Sciences, Princeton University, Princeton, New Jersey 08540 (United States)

    2011-10-15

    High-throughput mass separation of nuclear waste may be useful for optimal storage, disposal, or environmental remediation. The most dangerous part of nuclear waste is the fission product, which produces most of the heat and medium-term radiation. Plasmas are well-suited to separating nuclear waste because they can separate many different species in a single step. A number of plasma devices have been designed for such mass separation, but there has been no standardized comparison between these devices. We define a standard metric, the separative power per unit volume, and derive it for three different plasma mass filters: the plasma centrifuge, Ohkawa filter, and the magnetic centrifugal mass filter.
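
    The abstract does not reproduce the metric itself; as context, a minimal sketch of the classical separative-power bookkeeping that such a per-unit-volume metric presumably builds on (the Dirac value function from isotope-separation theory, here divided by an assumed device volume; all numbers are invented):

      import math

      def value_function(x):
          """Dirac value function V(x) = (2x - 1) * ln(x / (1 - x)) for mole fraction x."""
          return (2.0 * x - 1.0) * math.log(x / (1.0 - x))

      def separative_power(feed, cut, x_feed, x_product, x_tails):
          """delta_U = feed * [cut*V(x_product) + (1 - cut)*V(x_tails) - V(x_feed)]."""
          return feed * (cut * value_function(x_product)
                         + (1.0 - cut) * value_function(x_tails)
                         - value_function(x_feed))

      # Hypothetical filter: 1 mol/s feed, cut 0.5, modest enrichment of the light species;
      # the concentrations satisfy the mass balance 0.30 = 0.5*0.45 + 0.5*0.15.
      dU = separative_power(feed=1.0, cut=0.5, x_feed=0.30, x_product=0.45, x_tails=0.15)
      plasma_volume_m3 = 2.0   # hypothetical device volume
      print("separative power per unit volume:", dU / plasma_volume_m3)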

  15. Metrics for comparing plasma mass filters

    International Nuclear Information System (INIS)

    Fetterman, Abraham J.; Fisch, Nathaniel J.

    2011-01-01

    High-throughput mass separation of nuclear waste may be useful for optimal storage, disposal, or environmental remediation. The most dangerous part of nuclear waste is the fission product, which produces most of the heat and medium-term radiation. Plasmas are well-suited to separating nuclear waste because they can separate many different species in a single step. A number of plasma devices have been designed for such mass separation, but there has been no standardized comparison between these devices. We define a standard metric, the separative power per unit volume, and derive it for three different plasma mass filters: the plasma centrifuge, Ohkawa filter, and the magnetic centrifugal mass filter.

  16. A Metric for Heterotic Moduli

    Science.gov (United States)

    Candelas, Philip; de la Ossa, Xenia; McOrist, Jock

    2017-12-01

    Heterotic vacua of string theory are realised, at large radius, by a compact threefold with vanishing first Chern class together with a choice of stable holomorphic vector bundle. These form a wide class of potentially realistic four-dimensional vacua of string theory. Despite all their phenomenological promise, there is little understanding of the metric on the moduli space of these. What is sought is the analogue of special geometry for these vacua. The metric on the moduli space is important in phenomenology as it normalises D-terms and Yukawa couplings. It is also of interest in mathematics, since it generalises the metric, first found by Kobayashi, on the space of gauge field connections, to a more general context. Here we construct this metric, correct to first order in α′, in two ways: first by postulating a metric that is invariant under background gauge transformations of the gauge field, and also by dimensionally reducing heterotic supergravity. These methods agree and the resulting metric is Kähler, as is required by supersymmetry. Checking the metric is Kähler is intricate and the anomaly cancellation equation for the H field plays an essential role. The Kähler potential nevertheless takes a remarkably simple form: it is the Kähler potential of special geometry with the Kähler form replaced by the α′-corrected hermitian form.

  17. Development of quality metrics for ambulatory pediatric cardiology: Infection prevention.

    Science.gov (United States)

    Johnson, Jonathan N; Barrett, Cindy S; Franklin, Wayne H; Graham, Eric M; Halnon, Nancy J; Hattendorf, Brandy A; Krawczeski, Catherine D; McGovern, James J; O'Connor, Matthew J; Schultz, Amy H; Vinocur, Jeffrey M; Chowdhury, Devyani; Anderson, Jeffrey B

    2017-12-01

    In 2012, the American College of Cardiology's (ACC) Adult Congenital and Pediatric Cardiology Council established a program to develop quality metrics to guide ambulatory practices for pediatric cardiology. The council chose five areas on which to focus their efforts; chest pain, Kawasaki Disease, tetralogy of Fallot, transposition of the great arteries after arterial switch, and infection prevention. Here, we sought to describe the process, evaluation, and results of the Infection Prevention Committee's metric design process. The infection prevention metrics team consisted of 12 members from 11 institutions in North America. The group agreed to work on specific infection prevention topics including antibiotic prophylaxis for endocarditis, rheumatic fever, and asplenia/hyposplenism; influenza vaccination and respiratory syncytial virus prophylaxis (palivizumab); preoperative methods to reduce intraoperative infections; vaccinations after cardiopulmonary bypass; hand hygiene; and testing to identify splenic function in patients with heterotaxy. An extensive literature review was performed. When available, previously published guidelines were used fully in determining metrics. The committee chose eight metrics to submit to the ACC Quality Metric Expert Panel for review. Ultimately, metrics regarding hand hygiene and influenza vaccination recommendation for patients did not pass the RAND analysis. Both endocarditis prophylaxis metrics and the RSV/palivizumab metric passed the RAND analysis but fell out during the open comment period. Three metrics passed all analyses, including those for antibiotic prophylaxis in patients with heterotaxy/asplenia, for influenza vaccination compliance in healthcare personnel, and for adherence to recommended regimens of secondary prevention of rheumatic fever. The lack of convincing data to guide quality improvement initiatives in pediatric cardiology is widespread, particularly in infection prevention. Despite this, three metrics were

  18. Implications of Metric Choice for Common Applications of Readmission Metrics

    OpenAIRE

    Davies, Sheryl; Saynina, Olga; Schultz, Ellen; McDonald, Kathryn M; Baker, Laurence C

    2013-01-01

    Objective. To quantify the differential impact on hospital performance of three readmission metrics: all-cause readmission (ACR), 3M Potential Preventable Readmission (PPR), and Centers for Medicare and Medicaid 30-day readmission (CMS).

  19. Accuracy and precision in the calculation of phenology metrics

    DEFF Research Database (Denmark)

    Ferreira, Ana Sofia; Visser, Andre; MacKenzie, Brian

    2014-01-01

    Phytoplankton phenology (the timing of seasonal events) is a commonly used indicator for evaluating responses of marine ecosystems to climate change. However, phenological metrics are vulnerable to observation-related (bloom amplitude, missing data, and observational noise) and analysis-related (temporal ...) factors. A phenology metric is first determined from a noise- and gap-free time series, and again once it has been modified. We show that precision is a greater concern than accuracy for many of these metrics, an important point that has been hereto overlooked in the literature. The variability in precision between ... phenology metrics is substantial, but it can be improved by the use of preprocessing techniques (e.g., gap-filling or smoothing). Furthermore, there are important differences in the inherent variability of the metrics that may be crucial in the interpretation of studies based upon them. Of the considered ...

  20. Thermo-economic evaluation and optimization of the thermo-chemical conversion of biomass into methanol

    International Nuclear Information System (INIS)

    Peduzzi, Emanuela; Tock, Laurence; Boissonnet, Guillaume; Maréchal, François

    2013-01-01

    In a carbon and resources constrained world, thermo-chemical conversion of lignocellulosic biomass into fuels and chemicals is regarded as a promising alternative to fossil resources derived products. Methanol is one potential product which can be used for the synthesis of various chemicals or as a fuel in fuel cells and internal combustion engines. This study focuses on the evaluation and optimization of the thermodynamic and economic performance of methanol production from biomass by applying process integration and optimization techniques. Results reveal the importance of the energy integration and in particular of the cogeneration of electricity for the efficient use of biomass. - Highlights: • A thermo-economic model for biomass conversion into methanol is developed. • Process integration and multi-objective optimization techniques are applied. • Results reveal the importance of energy integration for electricity co-generation

  1. Evaluation of a proposed optimization method for discrete-event simulation models

    Directory of Open Access Journals (Sweden)

    Alexandre Ferreira de Pinho

    2012-12-01

    Full Text Available Optimization methods combined with computer-based simulation have been utilized in a wide range of manufacturing applications. However, in terms of current technology, these methods exhibit low performance, typically manipulating only a single decision variable at a time. Thus, the objective of this article is to evaluate a proposed optimization method for discrete-event simulation models, based on genetic algorithms, which is more efficient in terms of computational time than software packages on the market. It should be emphasized that the response quality is not altered; that is, the proposed method maintains the solutions' effectiveness. The study therefore draws a comparison between the proposed method and a simulation optimization tool that is already available on the market and has been examined in the academic literature. Conclusions are presented, confirming the proposed optimization method's efficiency.
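
    A minimal, generic sketch of a genetic algorithm manipulating several decision variables at once, as described above (not the authors' implementation); a cheap analytic function stands in for the discrete-event simulation model, and all parameter values are invented:

      import random

      def simulate(x):
          """Stand-in for one simulation run; in practice this would call the DES model."""
          return sum((xi - 3.0) ** 2 for xi in x)      # minimise; optimum at x = (3, 3, 3)

      def genetic_algorithm(n_vars=3, pop_size=30, generations=50, mut_rate=0.2):
          pop = [[random.uniform(0, 10) for _ in range(n_vars)] for _ in range(pop_size)]
          for _ in range(generations):
              pop.sort(key=simulate)
              parents = pop[: pop_size // 2]           # truncation selection
              children = []
              while len(children) < pop_size - len(parents):
                  a, b = random.sample(parents, 2)
                  cut = random.randrange(1, n_vars)    # one-point crossover
                  child = a[:cut] + b[cut:]
                  if random.random() < mut_rate:       # mutation
                      i = random.randrange(n_vars)
                      child[i] += random.gauss(0, 0.5)
                  children.append(child)
              pop = parents + children
          return min(pop, key=simulate)

      print(genetic_algorithm())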

  2. Sound Quality Evaluation and Optimization for Interior Noise of Rail Vehicle

    Directory of Open Access Journals (Sweden)

    Kai Hu

    2014-08-01

    Full Text Available A procedure for sound field simulation, sound quality (SQ) evaluation, and optimization of the interior noise of a rail vehicle is investigated in this paper. Firstly, interior noise is measured on site while the subway is running in a tunnel at a speed of 60 km/h. The sound pressure levels (SPLs), loudness, sharpness, and roughness of the measured noise are analyzed. A finite element model for acoustical simulation of the carriage is established using the Actran software. The accuracy and feasibility of the finite element model are verified by comparing the psychoacoustic parameters from the simulations and measurements. Finally, using orthogonal experimental design, the best optimization scheme is put forward, which achieves a sound quality improvement with a 4.81 dB decrease in SPL and a 1.07 sone reduction in loudness. The proposed optimization scheme may be extended to other vehicles to improve the interior acoustic environment.

  3. Evaluation of Analysis by Cross-Validation, Part II: Diagnostic and Optimization of Analysis Error Covariance

    Directory of Open Access Journals (Sweden)

    Richard Ménard

    2018-02-01

    Full Text Available We present a general theory of estimation of analysis error covariances based on cross-validation as well as a geometric interpretation of the method. In particular, we use the variance of passive observation-minus-analysis residuals and show that the true analysis error variance can be estimated, without relying on the optimality assumption. This approach is used to obtain near optimal analyses that are then used to evaluate the air quality analysis error using several different methods at active and passive observation sites. We compare the estimates according to the method of Hollingsworth-Lönnberg, Desroziers et al., a new diagnostic we developed, and the perceived analysis error computed from the analysis scheme, to conclude that, as long as the analysis is near optimal, all estimates agree within a certain error margin.

  4. Using complete measurement statistics for optimal device-independent randomness evaluation

    International Nuclear Information System (INIS)

    Nieto-Silleras, O; Pironio, S; Silman, J

    2014-01-01

    The majority of recent works investigating the link between non-locality and randomness, e.g. in the context of device-independent cryptography, do so with respect to some specific Bell inequality, usually the CHSH inequality. However, the joint probabilities characterizing the measurement outcomes of a Bell test are richer than just the degree of violation of a single Bell inequality. In this work we show how to take this extra information into account in a systematic manner in order to optimally evaluate the randomness that can be certified from non-local correlations. We further show that taking into account the complete set of outcome probabilities is equivalent to optimizing over all possible Bell inequalities, thereby allowing us to determine the optimal Bell inequality for certifying the maximal amount of randomness from a given set of non-local correlations. (paper)

  5. Issues in Benchmark Metric Selection

    Science.gov (United States)

    Crolotte, Alain

    It is true that a metric can influence a benchmark but will esoteric metrics create more problems than they will solve? We answer this question affirmatively by examining the case of the TPC-D metric which used the much debated geometric mean for the single-stream test. We will show how a simple choice influenced the benchmark and its conduct and, to some extent, DBMS development. After examining other alternatives our conclusion is that the “real” measure for a decision-support benchmark is the arithmetic mean.
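
    A small worked example of the effect discussed above (the query times are invented): making one of four queries 100x faster improves the arithmetic mean of the times by about 25% but the geometric mean by about 68%, which is how the choice of metric can steer benchmark tuning.

      import math

      def arithmetic_mean(xs):
          return sum(xs) / len(xs)

      def geometric_mean(xs):
          return math.exp(sum(math.log(x) for x in xs) / len(xs))

      baseline = [10.0, 10.0, 10.0, 10.0]    # per-query times in seconds (invented)
      tuned    = [0.1, 10.0, 10.0, 10.0]     # one query made 100x faster

      for name, mean in (("arithmetic", arithmetic_mean), ("geometric", geometric_mean)):
          print(name, round(mean(baseline), 2), "->", round(mean(tuned), 2))
      # arithmetic: 10.0 -> 7.52 (about 25% better); geometric: 10.0 -> 3.16 (about 68% better)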

  6. Background metric in supergravity theories

    International Nuclear Information System (INIS)

    Yoneya, T.

    1978-01-01

    In supergravity theories, we investigate the conformal anomaly of the path-integral determinant and the problem of fermion zero modes in the presence of a nontrivial background metric. Except in SO(3)-invariant supergravity, there are nonvanishing conformal anomalies. As a consequence, amplitudes around the nontrivial background metric contain unpredictable arbitrariness. The fermion zero modes which are explicitly constructed for the Euclidean Schwarzschild metric are interpreted as an indication of the supersymmetric multiplet structure of a black hole. The degree of degeneracy of a black hole is 2^{4n} in SO(n) supergravity

  7. Generalized Painleve-Gullstrand metrics

    Energy Technology Data Exchange (ETDEWEB)

    Lin Chunyu [Department of Physics, National Cheng Kung University, Tainan 70101, Taiwan (China)], E-mail: l2891112@mail.ncku.edu.tw; Soo Chopin [Department of Physics, National Cheng Kung University, Tainan 70101, Taiwan (China)], E-mail: cpsoo@mail.ncku.edu.tw

    2009-02-02

    An obstruction to the implementation of spatially flat Painlevé-Gullstrand (PG) slicings is demonstrated, and explicitly discussed for Reissner-Nordström and Schwarzschild-anti-de Sitter spacetimes. Generalizations of PG slicings which are not spatially flat but which remain regular at the horizons are introduced. These metrics can be obtained from standard spherically symmetric metrics by physical Lorentz boosts. With these generalized PG metrics, problematic contributions to the imaginary part of the action in the Parikh-Wilczek derivation of Hawking radiation due to the obstruction can be avoided.

  8. A Metrics Approach for Collaborative Systems

    Directory of Open Access Journals (Sweden)

    Cristian CIUREA

    2009-01-01

    Full Text Available This article presents different types of collaborative systems, their structure and classification. This paper defines the concept of a virtual campus as a collaborative system. It builds an architecture for a virtual campus oriented on collaborative training processes. It analyses the quality characteristics of collaborative systems and proposes techniques for metrics construction and validation in order to evaluate them. The article analyzes different ways to increase the efficiency and the performance level in collaborative banking systems.

  9. Muffins Elaborated with Optimized Monoglycerides Oleogels: From Solid Fat Replacer Obtention to Product Quality Evaluation.

    Science.gov (United States)

    Giacomozzi, Anabella S; Carrín, María E; Palla, Camila A

    2018-05-22

    This study demonstrates the effectiveness of using oleogels from high oleic sunflower oil (HOSO) and monoglycerides as solid fat replacers in a sweet bakery product. Firstly, a methodology to obtain oleogels with desired properties, based on mathematical models able to describe relationships between process and product characteristic variables followed by multi-objective optimization, was applied. Later, muffins were prepared with the optimized oleogels and their physicochemical and textural properties were compared with those of muffins formulated using a commercial margarine (Control) or only HOSO. Furthermore, the amount of oil released from muffins over time (1, 7, and 10 days) was measured to evaluate their stability. The replacement of commercial margarine with the optimized oleogels in the muffin formulation led to products with greater spreadability, higher specific volume, similar hardness values, and a more connected and homogeneous crumb structure. Moreover, these products showed a reduction of oil migration of around 50% relative to the Control muffins after 10 days of storage, which indicated that the optimized oleogels can be used satisfactorily to decrease oil loss in this sweet baked product. Fat replacement with the optimized monoglyceride oleogels not only had a positive impact on the quality of the muffins, but also improved their nutritional profile (without trans fat and low in saturated fat). The food industry demands new ways to reduce the use of saturated and trans fats in food formulations. To contribute to this search, oleogels from high oleic sunflower oil and saturated monoglycerides were prepared under optimized conditions in order to obtain a product with similar functionality to margarine, and its potential application as a semisolid fat ingredient in muffins was evaluated. Muffins formulated with oleogels showed improved quality compared with those obtained using a commercial margarine with the added

  10. Optimizing the ASC WAN: evaluating network performance tools for comparing transport protocols.

    Energy Technology Data Exchange (ETDEWEB)

    Lydick, Christopher L.

    2007-07-01

    The Advanced Simulation & Computing Wide Area Network (ASC WAN), which is a high delay-bandwidth network connection between US Department of Energy National Laboratories, is constantly being examined and evaluated for efficiency. One of the current transport-layer protocols which is used, TCP, was developed for traffic demands which are different from that on the ASC WAN. The Stream Control Transport Protocol (SCTP), on the other hand, has shown characteristics which make it more appealing to networks such as these. Most important, before considering a replacement for TCP on any network, a testing tool that performs well against certain criteria needs to be found. In order to try to find such a tool, two popular networking tools (Netperf v.2.4.3 & v.2.4.6 (OpenSS7 STREAMS), and Iperf v.2.0.6) were tested. These tools implement both TCP and SCTP and were evaluated using four metrics: (1) How effectively can the tool reach a throughput near the bandwidth? (2) How much of the CPU does the tool utilize during operation? (3) Is the tool freely and widely available? And, (4) Is the tool actively developed? Following the analysis of those tools, this paper goes further into explaining some recommendations and ideas for future work.

  11. Evaluation and Optimization of a Traditional North-Light Roof on Industrial Plant Energy Consumption

    Energy Technology Data Exchange (ETDEWEB)

    Adriaenssens, Sigrid [Form-Finding Lab, Department of Civil and Environmental Engineering, School of Engineering and Applied Science, Princeton Univ., NJ (United States); Hao Liu [Center for Intelligent and Networked Systems, Department of Automation, Tsinghua National Laboratory for Information Science and Technology, Tsinghua University, Beijing (China); Wahed, Miriam [Form-Finding Lab, Department of Civil and Environmental Engineering, School of Engineering and Applied Science, Princeton Univ., NJ (United States); Qianchuan Zhao [Center for Intelligent and Networked Systems, Department of Automation, Tsinghua National Laboratory for Information Science and Technology, Tsinghua University, Beijing (China)

    2013-04-15

    Increasingly strict energy policies, rising energy prices, and a desire for a positive corporate image currently serve as incentives for multinational corporations to reduce their plants’ energy consumption. This paper quantitatively investigates and discusses the value of a traditional north-light roof using a complete building energy simulation and optimization framework. The findings indicate that the north-light system yields positive building energy performance for several climate zones, including: (i) Humid Subtropical; (ii) Semiarid Continental; (iii) Mediterranean; and (iv) Subtropical Highland. In the Subtropical Highland climate zone, for example, the building energy consumption of a north-light roof is up to 54% less than that of a conventional flat roof. Based on these positive findings, this paper further presents an optimization framework that alters the north-light roof shape to further improve its energy performance. To quantitatively guarantee a high probability of finding satisfactory designs while reducing the computational processing time, ordinal optimization is introduced into the scheme. The Subtropical Highland case study shows a further building energy consumption reduction of 26% for an optimized north-light roof shape. The presented evaluation and optimization framework could be used in designing plants with integrated north-light roofs that aim at energy efficiency while maintaining environmental occupant comfort levels.

  12. Determination of optimal angiographic viewing angles: Basic principles and evaluation study

    International Nuclear Information System (INIS)

    Dumay, A.C.M.; Reiber, J.H.C.; Gerbrands, J.J.

    1994-01-01

    Foreshortening of vessel segments in angiographic (biplane) projection images may cause misinterpretation of the extent and degree of coronary artery disease. The views in which the object of interest is visualized with minimum foreshortening are called optimal views. In this paper the authors present a complete approach to obtain such views with computer-assisted techniques. The object of interest is first visualized in two arbitrary views. Two landmarks of the object are manually defined in the two projection images. With complete information on the projection geometry, the vector representation of the object in three-dimensional space is computed. This vector is perpendicular to a plane in which the views are called optimal. The user has one degree of freedom to define a set of optimal biplane views. The angle between the central beams of the imaging systems can be chosen freely. The computation of the orientation of the object and of the corresponding optimal biplane views has been evaluated with a simple hardware phantom. The mean and the standard deviation of the overall errors in the calculation of the optimal angulation angles were 1.8 degree and 1.3 degree, respectively, when the user defined a rotation angle
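
    A minimal sketch of the underlying geometry (not the authors' algorithm; the landmark coordinates are invented): the reconstructed segment direction determines a foreshortening factor for any viewing axis, and views perpendicular to the segment show it at full length.

      import numpy as np

      p1 = np.array([12.0,  5.0, 30.0])   # hypothetical 3D landmark coordinates (mm)
      p2 = np.array([22.0, 15.0, 34.0])
      d = (p2 - p1) / np.linalg.norm(p2 - p1)   # unit vector along the vessel segment

      def projected_length_fraction(view_axis):
          """Fraction of the true segment length seen in a view along 'view_axis'."""
          v = np.asarray(view_axis, dtype=float)
          v = v / np.linalg.norm(v)
          cos_a = abs(np.dot(v, d))             # foreshortening is worst when |cos| -> 1
          return np.sqrt(1.0 - cos_a**2)

      print(projected_length_fraction([0.0, 0.0, 1.0]))               # an arbitrary view
      print(projected_length_fraction(np.cross(d, [0.0, 0.0, 1.0])))  # a view perpendicular to the segment: 1.0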

  13. Let's Make Metric Ice Cream

    Science.gov (United States)

    Zimmerman, Marianna

    1975-01-01

    Describes a classroom activity which involved sixth grade students in a learning situation including making ice cream, safety procedures in a science laboratory, calibrating a thermometer, using metric units of volume and mass. (EB)

  14. Optimal plot size in the evaluation of papaya scions: proposal and comparison of methods

    Directory of Open Access Journals (Sweden)

    Humberto Felipe Celanti

    Full Text Available Evaluating the quality of scions is extremely important and can be done based on characteristics of shoots and roots. This experiment evaluated the height of the aerial part, stem diameter, number of leaves, petiole length and root length of papaya seedlings. Analyses were performed from a blank trial with 240 seedlings of "Golden Pecíolo Curto". The determination of the optimum plot size was done by applying the method of maximum curvature, the method of maximum curvature of the coefficient of variation, and a new proposed method, which incorporates bootstrap resampling simulation into the maximum curvature method. According to the results obtained, five is the optimal number of seedlings of papaya "Golden Pecíolo Curto" per plot. The proposed method of bootstrap simulation with replacement provides optimal plot sizes equal to or larger than those of the maximum curvature method, and the same plot size as the maximum curvature method of the coefficient of variation.

  15. IDENTIFYING MARKETING EFFECTIVENESS METRICS (Case study: East Azerbaijan's industrial units)

    OpenAIRE

    Faridyahyaie, Reza; Faryabi, Mohammad; Bodaghi Khajeh Noubar, Hossein

    2012-01-01

    The paper attempts to identify marketing effectiveness metrics in industrial units. The metrics investigated in this study are completely applicable and comprehensive, and consequently they can evaluate marketing effectiveness in various industries. The metrics studied include: Market Share, Profitability, Sales Growth, Customer Numbers, Customer Satisfaction and Customer Loyalty. The findings indicate that these six metrics are impressive when measuring marketing effectiveness. Data was ge...

  16. Experiential space is hardly metric

    Czech Academy of Sciences Publication Activity Database

    Šikl, Radovan; Šimeček, Michal; Lukavský, Jiří

    2008-01-01

    Roč. 2008, č. 37 (2008), s. 58-58 ISSN 0301-0066. [European Conference on Visual Perception. 24.08-28.08.2008, Utrecht] R&D Projects: GA ČR GA406/07/1676 Institutional research plan: CEZ:AV0Z70250504 Keywords : visual space perception * metric and non-metric perceptual judgments * ecological validity Subject RIV: AN - Psychology

  17. Coverage Metrics for Model Checking

    Science.gov (United States)

    Penix, John; Visser, Willem; Norvig, Peter (Technical Monitor)

    2001-01-01

    When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.

  18. Phantom metrics with Killing spinors

    Directory of Open Access Journals (Sweden)

    W.A. Sabra

    2015-11-01

    Full Text Available We study metric solutions of Einstein–anti-Maxwell theory admitting Killing spinors. The analogue of the IWP metric which admits a space-like Killing vector is found and is expressed in terms of a complex function satisfying the wave equation in flat (2+1)-dimensional space–time. As examples, electric and magnetic Kasner spaces are constructed by allowing the solution to depend only on the time coordinate. Euclidean solutions are also presented.

  19. Information to be wished with a view to an evaluation of the optimization

    International Nuclear Information System (INIS)

    Oudiz, A.; Despres, A.; Champion, M.; Collinet, J.; Bretheau, F.; Hubert, Ph.

    1998-01-01

    This document gives a general survey of the information needed in order to evaluate optimization. The basic information is qualitative and concerns the general organisation of radiation protection. Information should also be given on personnel training in radiation protection. The quantitative data concern: the individual doses, the collective dose, the dose rates, the indicators of internal exposure, the duration of exposure and the workforce concerned. (N.C.)

  20. An Evaluation of the Sniffer Global Optimization Algorithm Using Standard Test Functions

    Science.gov (United States)

    Butler, Roger A. R.; Slaminka, Edward E.

    1992-03-01

    The performance of Sniffer—a new global optimization algorithm—is compared with that of Simulated Annealing. Using the number of function evaluations as a measure of efficiency, the new algorithm is shown to be significantly better at finding the global minimum of seven standard test functions. Several of the test functions used have many local minima and very steep walls surrounding the global minimum. Such functions are intended to thwart global minimization algorithms.

  1. Evaluating and Optimizing Online Advertising: Forget the Click, but There Are Good Proxies.

    Science.gov (United States)

    Dalessandro, Brian; Hook, Rod; Perlich, Claudia; Provost, Foster

    2015-06-01

    Online systems promise to improve advertisement targeting via the massive and detailed data available. However, there are often too few data on exactly the outcome of interest, such as purchases, for accurate campaign evaluation and optimization (due to low conversion rates, cold start periods, lack of instrumentation of offline purchases, and long purchase cycles). This paper presents a detailed treatment of proxy modeling, which is based on the identification of a suitable alternative (proxy) target variable when data on the true objective is in short supply (or even completely nonexistent). The paper has a two-fold contribution. First, the potential of proxy modeling is demonstrated clearly, based on a massive-scale experiment across 58 real online advertising campaigns. Second, we assess the value of different specific proxies for evaluating and optimizing online display advertising, showing striking results. The results include bad news and good news. The most commonly cited and used proxy is a click on an ad. The bad news is that across a large number of campaigns, clicks are not good proxies for evaluation or for optimization: clickers do not resemble buyers. The good news is that an alternative sort of proxy performs remarkably well: observed visits to the brand's website. Specifically, predictive models built on brand site visits (which are much more common than purchases) do a remarkably good job of predicting which browsers will make a purchase. The practical bottom line: evaluating and optimizing campaigns using clicks seems wrongheaded; however, there is an easy and attractive alternative: use a well-chosen site-visit proxy instead.
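
    A minimal sketch of the proxy-modeling idea described above (simulated data, not the 58 campaigns of the study): fit the targeting model on the common proxy label (site visits) and check how well its scores rank the rarer true outcome (purchases).

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(1)
      n, d = 50_000, 20
      X = rng.normal(size=(n, d))                       # browser features (hypothetical)
      affinity = X @ rng.normal(size=d)                 # latent brand affinity
      visit = rng.random(n) < 1 / (1 + np.exp(-(affinity - 2)))     # common proxy event
      purchase = rng.random(n) < 1 / (1 + np.exp(-(affinity - 6)))  # much rarer true outcome

      proxy_model = LogisticRegression(max_iter=1000).fit(X, visit)
      scores = proxy_model.predict_proba(X)[:, 1]
      print("purchases:", int(purchase.sum()),
            "AUC for ranking purchasers by the proxy model:", round(roc_auc_score(purchase, scores), 3))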

  2. C-program LINOP for the evaluation of film dosemeters by linear optimization. User manual

    International Nuclear Information System (INIS)

    Kragh, P.

    1995-11-01

    Linear programming results in an optimal measuring value for film dosemeters. The Linop program was developed to be used for linear programming. The program permits the evaluation and control of film dosemeters and of all other multi-component dosemeters. This user manual for the Linop program contains the source program, a description of the program and installation and use instructions. The data sets with programs and examples are available upon request. (orig.) [de
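
    A minimal, generic sketch of the idea behind evaluating a multi-component film dosemeter by linear programming (not the LINOP code itself): choose weights for the filter readings so the weighted sum reproduces the calibration doses with the smallest possible worst-case deviation. The calibration data below are invented.

      import numpy as np
      from scipy.optimize import linprog

      R = np.array([            # readings of 3 filter fields under 4 calibration irradiations
          [1.00, 0.60, 0.20],
          [0.80, 0.90, 0.40],
          [0.50, 0.70, 0.90],
          [0.30, 0.40, 1.00],
      ])
      d = np.array([1.0, 1.2, 1.1, 0.9])   # known calibration doses (hypothetical)

      n_cond, n_filt = R.shape
      # Variables: [w_1..w_m, t]; minimise t subject to  -t <= (R w - d)_i <= t.
      c = np.r_[np.zeros(n_filt), 1.0]
      A_ub = np.vstack([np.c_[ R, -np.ones(n_cond)],
                        np.c_[-R, -np.ones(n_cond)]])
      b_ub = np.r_[d, -d]
      res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                    bounds=[(None, None)] * n_filt + [(0, None)])
      print("weights:", res.x[:n_filt], " worst-case deviation:", res.x[-1])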

  3. SU-E-T-368: Evaluating Dosimetric Outcome of Modulated Photon Radiotherapy (XMRT) Optimization for Head and Neck Patients

    Energy Technology Data Exchange (ETDEWEB)

    McGeachy, P; Villarreal-Barajas, JE; Khan, R [University of Calgary, Calgary, AB (Canada); Tom Baker Cancer Centre, Calgary, AB (Canada); Zinchenko, Y [University of Calgary, Calgary, AB (Canada)

    2015-06-15

    Purpose: The dosimetric outcome of optimized treatment plans obtained by modulating the photon beamlet energy and fluence on a small cohort of four Head and Neck (H and N) patients was investigated. This novel optimization technique is denoted XMRT for modulated photon radiotherapy. The dosimetric plans from XMRT for H and N treatment were compared to conventional, 6 MV intensity modulated radiotherapy (IMRT) optimization plans. Methods: An arrangement of two non-coplanar and five coplanar beams was used for all four H and N patients. Both XMRT and IMRT were subject to the same optimization algorithm, with XMRT optimization allowing both 6 and 18 MV beamlets while IMRT was restricted to 6 MV only. The optimization algorithm was based on a linear programming approach with partial-volume constraints implemented via the conditional value-at-risk method. H and N constraints were based on those mentioned in the Radiation Therapy Oncology Group 1016 protocol. XMRT and IMRT solutions were assessed using metrics suggested by International Commission on Radiation Units and Measurements report 83. The Gurobi solver was used in conjunction with the CVX package to solve each optimization problem. Dose calculations and analysis were done in CERR using Monte Carlo dose calculation with VMC++. Results: Both XMRT and IMRT solutions met all clinical criteria. Trade-offs were observed between improved dose uniformity to the primary target volume (PTV1) and increased dose to some of the surrounding healthy organs for XMRT compared to IMRT. On average, IMRT improved dose to the contralateral parotid gland and spinal cord while XMRT improved dose to the brainstem and mandible. Conclusion: Bi-energy XMRT optimization for H and N patients provides benefits in terms of improved dose uniformity to the primary target and reduced dose to some healthy structures, at the expense of increased dose to other healthy structures when compared with IMRT.

  4. An Integrated Modeling Approach to Evaluate and Optimize Data Center Sustainability, Dependability and Cost

    Directory of Open Access Journals (Sweden)

    Gustavo Callou

    2014-01-01

    Full Text Available Data centers have evolved dramatically in recent years, due to the advent of social networking services, e-commerce and cloud computing. The requirements conflict: high availability levels are demanded while sustainability impact and cost must remain low. Approaches that evaluate and optimize these requirements are essential to support designers of data center architectures. Our work aims to propose an integrated approach to estimate and optimize these issues with the support of the developed environment, Mercury. Mercury is a tool for dependability, performance and energy flow evaluation. The tool supports reliability block diagram (RBD), stochastic Petri net (SPN), continuous-time Markov chain (CTMC) and energy flow (EFM) models. The EFM verifies the energy flow on data center architectures, taking into account the energy efficiency and power capacity that each device can provide (assuming power systems) or extract (considering cooling components). The EFM also estimates the sustainability impact and cost issues of data center architectures. Additionally, a methodology is also considered to support the modeling, evaluation and optimization processes. Two case studies are presented to illustrate the adopted methodology on data center power systems.

  5. A critical evaluation of worst case optimization methods for robust intensity-modulated proton therapy planning

    International Nuclear Information System (INIS)

    Fredriksson, Albin; Bokrantz, Rasmus

    2014-01-01

    Purpose: To critically evaluate and compare three worst case optimization methods that have been previously employed to generate intensity-modulated proton therapy treatment plans that are robust against systematic errors. The goal of the evaluation is to identify circumstances when the methods behave differently and to describe the mechanism behind the differences when they occur. Methods: The worst case methods optimize plans to perform as well as possible under the worst case scenario that can physically occur (composite worst case), the combination of the worst case scenarios for each objective constituent considered independently (objectivewise worst case), and the combination of the worst case scenarios for each voxel considered independently (voxelwise worst case). These three methods were assessed with respect to treatment planning for prostate under systematic setup uncertainty. An equivalence with probabilistic optimization was used to identify the scenarios that determine the outcome of the optimization. Results: If the conflict between target coverage and normal tissue sparing is small and no dose-volume histogram (DVH) constraints are present, then all three methods yield robust plans. Otherwise, they all have their shortcomings: Composite worst case led to unnecessarily low plan quality in boundary scenarios that were less difficult than the worst case ones. Objectivewise worst case generally led to nonrobust plans. Voxelwise worst case led to overly conservative plans with respect to DVH constraints, which resulted in excessive dose to normal tissue, and less sharp dose fall-off than the other two methods. Conclusions: The three worst case methods have clearly different behaviors. These behaviors can be understood from which scenarios that are active in the optimization. No particular method is superior to the others under all circumstances: composite worst case is suitable if the conflicts are not very severe or there are DVH constraints whereas
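
    A small numerical sketch of how the first two formulations aggregate the same scenario data differently (the per-scenario penalty values are invented; the voxelwise variant applies the same idea per voxel):

      import numpy as np

      # penalties[s, k] = value of objective k in setup-error scenario s (lower is better)
      penalties = np.array([
          [2.0, 1.0],   # scenario 1: target underdose, OAR overdose
          [1.0, 3.0],   # scenario 2
          [4.0, 0.5],   # scenario 3
      ])
      weights = np.array([1.0, 1.0])

      # Composite worst case: the worst scenario of the weighted-sum objective.
      composite = np.max(penalties @ weights)

      # Objectivewise worst case: each objective takes its own worst scenario,
      # possibly mixing scenarios that cannot occur together.
      objectivewise = np.max(penalties, axis=0) @ weights

      print(composite, objectivewise)   # 4.5 and 7.0 for these numbers
      # The voxelwise variant pushes the same idea down to individual voxels, taking the
      # worst scenario per voxel before objectives are formed, which is why it tends to be
      # the most conservative of the three.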

  6. Relevance of motion-related assessment metrics in laparoscopic surgery.

    Science.gov (United States)

    Oropesa, Ignacio; Chmarra, Magdalena K; Sánchez-González, Patricia; Lamata, Pablo; Rodrigues, Sharon P; Enciso, Silvia; Sánchez-Margallo, Francisco M; Jansen, Frank-Willem; Dankelman, Jenny; Gómez, Enrique J

    2013-06-01

    Motion metrics have become an important source of information when addressing the assessment of surgical expertise. However, their direct relationship with the different surgical skills has not been fully explored. The purpose of this study is to investigate the relevance of motion-related metrics in the evaluation processes of basic psychomotor laparoscopic skills and their correlation with the different abilities sought to measure. A framework for task definition and metric analysis is proposed. An explorative survey was first conducted with a board of experts to identify metrics to assess basic psychomotor skills. Based on the output of that survey, 3 novel tasks for surgical assessment were designed. Face and construct validation was performed, with focus on motion-related metrics. Tasks were performed by 42 participants (16 novices, 22 residents, and 4 experts). Movements of the laparoscopic instruments were registered with the TrEndo tracking system and analyzed. Time, path length, and depth showed construct validity for all 3 tasks. Motion smoothness and idle time also showed validity for tasks involving bimanual coordination and tasks requiring a more tactical approach, respectively. Additionally, motion smoothness and average speed showed a high internal consistency, proving them to be the most task-independent of all the metrics analyzed. Motion metrics are complementary and valid for assessing basic psychomotor skills, and their relevance depends on the skill being evaluated. A larger clinical implementation, combined with quality performance information, will give more insight on the relevance of the results shown in this study.
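
    A minimal sketch of common definitions of several of the metrics named above (the exact TrEndo definitions may differ; the trajectory and thresholds are synthetic):

      import numpy as np

      def motion_metrics(positions, dt, idle_speed=1.0):
          """positions: (N, 3) tip trajectory in mm; dt: sampling interval in s."""
          disp = np.diff(positions, axis=0)                  # per-sample displacement
          step = np.linalg.norm(disp, axis=1)
          speed = step / dt
          accel = np.diff(speed) / dt
          jerk = np.diff(accel) / dt
          return {
              "time_s": len(positions) * dt,
              "path_length_mm": step.sum(),
              "depth_range_mm": np.ptp(positions[:, 2]),     # excursion along the insertion axis
              "average_speed_mm_s": speed.mean(),
              "motion_smoothness": np.sqrt(np.mean(jerk**2)),  # RMS jerk; lower is smoother
              "idle_time_s": dt * np.count_nonzero(speed < idle_speed),
          }

      rng = np.random.default_rng(0)
      track = np.cumsum(rng.normal(scale=0.5, size=(2000, 3)), axis=0)  # synthetic trajectory
      print(motion_metrics(track, dt=0.02))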

  7. Network Community Detection on Metric Space

    Directory of Open Access Journals (Sweden)

    Suman Saha

    2015-08-01

    Full Text Available Community detection in a complex network has been an important problem of much interest in recent years. In general, a community detection algorithm chooses an objective function and captures the communities of the network by optimizing it; various heuristics are then used to solve the optimization problem and extract the communities of interest for the user. In this article, we demonstrate a procedure to transform a graph into points of a metric space and develop methods of community detection with the help of a metric defined for a pair of points. We also study and analyze the community structure of the network therein. The results obtained with our approach are very competitive with most of the well-known algorithms in the literature, which is confirmed over a large collection of datasets. Moreover, the time taken by our algorithm is considerably less than that of other methods, in line with the theoretical findings.

  8. Machine Learning for ATLAS DDM Network Metrics

    CERN Document Server

    Lassnig, Mario; The ATLAS collaboration; Vamosi, Ralf

    2016-01-01

    The increasing volume of physics data is posing a critical challenge to the ATLAS experiment. In anticipation of high luminosity physics, automation of everyday data management tasks has become necessary. Previously many of these tasks required human decision-making and operation. Recent advances in hardware and software have made it possible to entrust more complicated duties to automated systems using models trained by machine learning algorithms. In this contribution we show results from our ongoing automation efforts. First, we describe our framework for distributed data management and network metrics, which automatically extracts and aggregates data, trains models with various machine learning algorithms, and eventually scores the resulting models and parameters. Second, we use these models to forecast metrics relevant for network-aware job scheduling and data brokering. We show the characteristics of the data and evaluate the forecasting accuracy of our models.

  9. Metrics for Diagnosing Undersampling in Monte Carlo Tally Estimates

    International Nuclear Information System (INIS)

    Perfetti, Christopher M.; Rearden, Bradley T.

    2015-01-01

    This study explored the potential of using Markov chain convergence diagnostics to predict the prevalence and magnitude of biases due to undersampling in Monte Carlo eigenvalue and flux tally estimates. Five metrics were applied to two models of pressurized water reactor fuel assemblies and their potential for identifying undersampling biases was evaluated by comparing the calculated test metrics with known biases in the tallies. Three of the five undersampling metrics showed the potential to accurately predict the behavior of undersampling biases in the responses examined in this study.

  10. Best Proximity Point Results in Complex Valued Metric Spaces

    Directory of Open Access Journals (Sweden)

    Binayak S. Choudhury

    2014-01-01

    complex valued metric spaces. We treat the problem as that of finding the global optimal solution of a fixed point equation although the exact solution does not in general exist. We also define and use the concept of P-property in such spaces. Our results are illustrated with examples.

  11. Evaluating Optimism: Developing Children’s Version of Optimistic Attributional Style Questionnaire

    Directory of Open Access Journals (Sweden)

    Gordeeva T.O.,

    2017-08-01

    Full Text Available People differ significantly in how they usually explain to themselves the reasons for events, both positive and negative, that happen in their lives. Psychological research shows that children who tend to think optimistically have certain advantages as compared to their pessimistically thinking peers: they are less likely to suffer from depression, establish more positive relationships with peers, and demonstrate higher academic achievements. This paper describes the process of creating the children’s version of the Optimistic Attributional Style Questionnaire (OASQ-C). This technique is based on the theory of learned helplessness and optimism developed by M. Seligman, L. Abramson and J. Teasdale and is an efficient (compact) tool for measuring optimism as an explanatory style in children and adolescents (9-14 years). Confirmatory factor analysis revealed that this technique has a two-factor structure with acceptable reliability. Validity is supported by the presence of expected correlations between explanatory style and rates of psychological well-being, dispositional optimism, positive attitude to life and its aspects, depression, and academic performance. The outcomes of this technique are not affected by social desirability. The developed questionnaire may be recommended to researchers and school counsellors for evaluating optimism (optimistic thinking) as one of the major factors in the psychological well-being of children; it may also be used in assessing the effectiveness of cognitive-oriented training for adolescents.

  12. Optimization and evaluation of thermoresponsive diclofenac sodium ophthalmic in situ gels.

    Science.gov (United States)

    Asasutjarit, Rathapon; Thanasanchokpibull, Suthira; Fuongfuchat, Asira; Veeranondha, Sukitaya

    2011-06-15

    This work was conducted to optimize and evaluate Pluronic F127-based thermoresponsive diclofenac sodium ophthalmic in situ gels (DS in situ gels). They were prepared by the cold method and their physicochemical properties were investigated, i.e., pH, flow ability, sol-gel transition temperature, gelling capacity and rheological properties. An optimized formulation was selected, and its physicochemical properties before and after autoclaving and its eye irritation potency were investigated in SIRC cells and rabbits. In vivo ophthalmic absorption was evaluated in rabbits. It was found that the physicochemical properties of DS in situ gels were affected by formulation composition. Increasing the Pluronic F127 content decreased the sol-gel transition temperature of the products, while increasing the Pluronic F68 concentration tended to increase the sol-gel transition temperature. In this study, Carbopol 940 did not affect the sol-gel transition temperature, but it did affect the transparency, pH, and gelling capacity of the products. The optimized formulation exhibited sol-gel transition at 32.6 ± 1.1 °C with pseudoplastic flow behavior. It lost some diclofenac sodium content during autoclaving. However, it was assessed as safe for ophthalmic use and significantly increased diclofenac sodium bioavailability in the aqueous humor. In conclusion, the optimized DS in situ gel has potential for use as an alternative to conventional diclofenac sodium eye drops. However, autoclaving was not a suitable sterilization method for this product. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. Transungual Gel of Terbinafine Hydrochloride for the Management of Onychomycosis: Formulation, Optimization, and Evaluation.

    Science.gov (United States)

    Thatai, Purva; Sapra, Bharti

    2017-08-01

    The present study aimed to develop, optimize, and evaluate a microemulsion and a microemulsion-based gel as vehicles for transungual drug delivery of terbinafine hydrochloride for the treatment of onychomycosis. A D-optimal mixture experimental design was adopted to optimize the composition of the microemulsion, with the amounts of oil (X1), Smix (mixture of surfactant and cosurfactant; X2), and water (X3) as the independent variables. The formulations were assessed for permeation (micrograms per square centimeter per hour; Y1), particle size (nanometers; Y2), and solubility of the drug in the formulation (milligrams per milliliter; Y3). The microemulsion containing 3.05% oil, 24.98% Smix, and 71.96% water was selected as the optimized formulation. The microemulsion-based gel showed better penetration (∼5-fold) as well as greater retention (∼9-fold) in the animal hoof compared to the commercial cream. The techniques used to screen penetration enhancers (hydration enhancement factor, ATR-FTIR, SEM, and DSC) revealed a synergistic effect of the combination of urea and N-acetylcysteine in disrupting the structure of the hoof, leading to enhanced penetration of the drug.

  14. Evaluation of the optimal combinations of modulation factor and pitch for Helical TomoTherapy plans made with TomoEdge using Pareto optimal fronts.

    Science.gov (United States)

    De Kerf, Geert; Van Gestel, Dirk; Mommaerts, Lobke; Van den Weyngaert, Danielle; Verellen, Dirk

    2015-09-17

    Modulation factor (MF) and pitch have an impact on Helical TomoTherapy (HT) plan quality, and HT users mostly rely on vendor-recommended settings. This study analyses the effect of these two parameters on both plan quality and treatment time for plans made with the TomoEdge planning software, using the concept of Pareto optimal fronts. More than 450 plans with different combinations of pitch [0.10-0.50] and MF [1.2-3.0] were produced. These HT plans, with a field width (FW) of 5 cm, were created for five head and neck patients, and the homogeneity index, conformity index, dose-near-maximum (D2), and dose-near-minimum (D98) were analysed for the planning target volumes, as well as the mean dose and D2 for the most critical organs at risk. For every dose metric, the median value was plotted against treatment time. A Pareto-like method was used in the analysis to show how pitch and MF influence both treatment time and plan quality. For small pitches (≤0.20), MF does not influence treatment time. The contrary is true for larger pitches (≥0.25), as lowering MF decreases both treatment time and plan quality until the maximum gantry speed is reached. At that point, treatment time saturates and only plan quality decreases further. The Pareto front analysis showed optimal combinations of pitch [0.23-0.45] and MF > 2.0 for an FW of 5 cm. Outside this range, plans become less optimal. As the vendor-recommended settings fall within this range, the use of these settings is validated.
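
    To illustrate the Pareto-front idea used in the record above, the minimal Python sketch below extracts the non-dominated combinations from a set of (treatment time, plan-quality metric) pairs. The data points and the quality metric are invented placeholders, not values or definitions from the study.

```python
from typing import List, Tuple

def pareto_front(points: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Return the non-dominated points when both objectives are minimized
    (e.g. treatment time and a dose metric). A point is dominated if another
    point is no worse in both objectives and strictly better in at least one."""
    front = []
    for p in points:
        dominated = any(
            (q[0] <= p[0] and q[1] <= p[1]) and (q[0] < p[0] or q[1] < p[1])
            for q in points
        )
        if not dominated:
            front.append(p)
    return sorted(front)

if __name__ == "__main__":
    # hypothetical (treatment time [s], plan-quality metric) pairs,
    # one per (pitch, MF) combination
    plans = [(180, 1.10), (200, 1.02), (240, 0.98), (260, 0.99), (300, 0.97)]
    print(pareto_front(plans))  # -> [(180, 1.1), (200, 1.02), (240, 0.98), (300, 0.97)]
```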

  15. Green Chemistry Metrics with Special Reference to Green Analytical Chemistry

    Directory of Open Access Journals (Sweden)

    Marek Tobiszewski

    2015-06-01

    Full Text Available The concept of green chemistry is widely recognized in chemical laboratories. To properly measure an environmental impact of chemical processes, dedicated assessment tools are required. This paper summarizes the current state of knowledge in the field of development of green chemistry and green analytical chemistry metrics. The diverse methods used for evaluation of the greenness of organic synthesis, such as eco-footprint, E-Factor, EATOS, and Eco-Scale are described. Both the well-established and recently developed green analytical chemistry metrics, including NEMI labeling and analytical Eco-scale, are presented. Additionally, this paper focuses on the possibility of the use of multivariate statistics in evaluation of environmental impact of analytical procedures. All the above metrics are compared and discussed in terms of their advantages and disadvantages. The current needs and future perspectives in green chemistry metrics are also discussed.

  16. Green Chemistry Metrics with Special Reference to Green Analytical Chemistry.

    Science.gov (United States)

    Tobiszewski, Marek; Marć, Mariusz; Gałuszka, Agnieszka; Namieśnik, Jacek

    2015-06-12

    The concept of green chemistry is widely recognized in chemical laboratories. To properly measure an environmental impact of chemical processes, dedicated assessment tools are required. This paper summarizes the current state of knowledge in the field of development of green chemistry and green analytical chemistry metrics. The diverse methods used for evaluation of the greenness of organic synthesis, such as eco-footprint, E-Factor, EATOS, and Eco-Scale are described. Both the well-established and recently developed green analytical chemistry metrics, including NEMI labeling and analytical Eco-scale, are presented. Additionally, this paper focuses on the possibility of the use of multivariate statistics in evaluation of environmental impact of analytical procedures. All the above metrics are compared and discussed in terms of their advantages and disadvantages. The current needs and future perspectives in green chemistry metrics are also discussed.

  17. Scalar-metric and scalar-metric-torsion gravitational theories

    International Nuclear Information System (INIS)

    Aldersley, S.J.

    1977-01-01

    The techniques of dimensional analysis and of the theory of tensorial concomitants are employed to study field equations in gravitational theories which incorporate scalar fields of the Brans-Dicke type. Within the context of scalar-metric gravitational theories, a uniqueness theorem for the geometric (or gravitational) part of the field equations is proven and a Lagrangian is determined which is uniquely specified by dimensional analysis. Within the context of scalar-metric-torsion gravitational theories a uniqueness theorem for field Lagrangians is presented and the corresponding Euler-Lagrange equations are given. Finally, an example of a scalar-metric-torsion theory is presented which is similar in many respects to the Brans-Dicke theory and the Einstein-Cartan theory

  18. Traffic Congestion Evaluation and Signal Control Optimization Based on Wireless Sensor Networks: Model and Algorithms

    Directory of Open Access Journals (Sweden)

    Wei Zhang

    2012-01-01

    Full Text Available This paper presents a model and algorithms for traffic flow data monitoring and optimal traffic light control based on wireless sensor networks. Given a scenario in which sensor nodes are sparsely deployed along the segments between signalized intersections, an analytical model is built using the continuum traffic equation, and a method is developed to estimate traffic parameters from the scattered sensor data. Based on the traffic data and the principles of traffic congestion formation, we introduce a congestion factor that can be used to evaluate the real-time traffic congestion status along the segment and to predict the subcritical state of traffic jams. The result is expected to support the timing phase optimization of traffic light control for the purpose of avoiding traffic congestion before it forms. We simulate traffic monitoring based on the Mobile Century dataset and analyze the performance of traffic light control on the VISSIM platform when the congestion factor is introduced into the signal timing optimization model. The simulation results show that this method can improve the spatial-temporal resolution of traffic data monitoring and evaluate traffic congestion status with high precision. It helps to remarkably alleviate urban traffic congestion and decrease the average traffic delay and maximum queue length.

  19. Formulation and Evaluation of Optimized Oxybenzone Microsponge Gel for Topical Delivery

    Directory of Open Access Journals (Sweden)

    Atmaram P. Pawar

    2015-01-01

    Full Text Available Background. Oxybenzone, a broad spectrum sunscreen agent widely used in the form of lotions and creams, has been reported to cause skin irritation, dermatitis, and systemic absorption. Aim. The objective of the present study was to formulate an oxybenzone-loaded microsponge gel for an enhanced sun protection factor with reduced toxicity. Material and Method. Microsponges for topical delivery of oxybenzone were successfully prepared by the quasiemulsion solvent diffusion method. The effects of ethyl cellulose and dichloromethane were optimized by a 3² factorial design. The optimized microsponges were dispersed into the hydrogel and further evaluated. Results. The microsponges were spherical with pore sizes in the range of 0.10–0.22 µm. The optimized formulation had a particle size of 72 ± 0.77 µm and an entrapment efficiency of 96.9 ± 0.52%. The microsponge gel showed controlled release and was nonirritant to rat skin. In the creep recovery test it showed the highest recovery, indicating elasticity. The controlled release of oxybenzone from the microsponge and the barrier effect of the gel result in prolonged retention of oxybenzone with reduced permeation. Conclusion. The evaluation study revealed remarkable, enhanced topical retention of oxybenzone for a prolonged period of time. The formulation also showed an enhanced sun protection factor compared to the marketed preparation, with reduced irritation and toxicity.

  20. A Method for Consensus Reaching in Product Kansei Evaluation Using Advanced Particle Swarm Optimization.

    Science.gov (United States)

    Yang, Yan-Pu

    2017-01-01

    Consumers' opinions toward product design alternatives are often subjective and perceptual; they reflect consumers' perception of a product and can be described using Kansei adjectives. Therefore, Kansei evaluation is often employed to determine consumers' preferences. However, identifying and improving the reliability of consumers' Kansei evaluation opinions toward design alternatives plays an important role in reducing the uncertainty of successful product design. To solve this problem, this study employs a consensus model to measure consistency among consumers' opinions, and an advanced particle swarm optimization (PSO) algorithm combined with the Linearly Decreasing Inertia Weight (LDW) method is proposed for consensus reaching by minimizing the adjustment of consumers' opinions. Furthermore, the process of the proposed method is presented and the details are illustrated using an example of electronic scooter design evaluation. The case study reveals that the proposed method is promising for reaching a consensus by searching for optimal solutions with PSO and for improving the reliability of consumers' evaluation opinions toward design alternatives according to Kansei indexes.
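
    A minimal sketch of particle swarm optimization with a linearly decreasing inertia weight (LDW), the search strategy named in the record above, is shown below. The cost function, bounds, and parameter values are placeholders standing in for a consensus/adjustment measure; they are not taken from the Kansei study itself.

```python
import numpy as np

def pso_ldw(cost, dim, n_particles=30, iters=100,
            w_max=0.9, w_min=0.4, c1=2.0, c2=2.0, bounds=(-1.0, 1.0), seed=0):
    """Particle swarm optimization with a linearly decreasing inertia weight (LDW)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))           # positions
    v = np.zeros_like(x)                                  # velocities
    pbest, pbest_cost = x.copy(), np.array([cost(p) for p in x])
    g = pbest[np.argmin(pbest_cost)].copy()               # global best
    for t in range(iters):
        w = w_max - (w_max - w_min) * t / (iters - 1)     # linearly decreasing inertia
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        costs = np.array([cost(p) for p in x])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
        g = pbest[np.argmin(pbest_cost)].copy()
    return g, pbest_cost.min()

if __name__ == "__main__":
    # Placeholder cost: total adjustment needed to move individual opinion
    # vectors toward a common point (a stand-in for a consensus measure).
    opinions = np.array([[3.0, 4.0, 2.0], [4.0, 3.5, 2.5], [2.5, 4.5, 3.0]])
    cost = lambda x: np.abs(opinions - x).sum()
    best, best_cost = pso_ldw(cost, dim=3, bounds=(1.0, 5.0))
    print(best, best_cost)
```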

  1. Effect of a Home-Based Virtual Reality Intervention for Children with Cerebral Palsy Using Super Pop VR Evaluation Metrics: A Feasibility Study

    Directory of Open Access Journals (Sweden)

    Yuping Chen

    2015-01-01

    Full Text Available Objective. The purpose of this pilot study was to determine whether Super Pop VR, a low-cost virtual reality (VR) system, was a feasible system for documenting improvement in children with cerebral palsy (CP) and whether a home-based VR intervention was effective. Methods. Three children with CP participated in this study and received an 8-week VR intervention (30 minutes × 5 sessions/week) using the commercial EyeToy Play VR system. Reaching kinematics measured by Super Pop VR and two fine motor tools (Bruininks-Oseretsky Test of Motor Proficiency, second edition [BOT-2], and Pediatric Motor Activity Log [PMAL]) were tested before, at the midpoint of, and after the intervention. Results. All children successfully completed the evaluations using the Super Pop VR system at home, where 85% of the reaches collected were used to compute reaching kinematics, which is comparable with the literature using expensive motion analysis systems. Only the child with hemiplegic CP and more impaired arm function improved in reaching kinematics and functional use of the affected hand after the intervention. Conclusion. Super Pop VR proved to be a feasible evaluation tool in children with CP.

  2. Realistic nurse-led policy implementation, optimization and evaluation: novel methodological exemplar.

    Science.gov (United States)

    Noyes, Jane; Lewis, Mary; Bennett, Virginia; Widdas, David; Brombley, Karen

    2014-01-01

    To report the first large-scale realistic nurse-led implementation, optimization and evaluation of a complex children's continuing-care policy. Health policies are increasingly complex, involve multiple Government departments and frequently fail to translate into better patient outcomes. Realist methods have not yet been adapted for policy implementation. Research methodology - Evaluation using theory-based realist methods for policy implementation. An expert group developed the policy and supporting tools. Implementation and evaluation design integrated diffusion of innovation theory with multiple case study and adapted realist principles. Practitioners in 12 English sites worked with Consultant Nurse implementers to manipulate the programme theory and logic of new decision-support tools and care pathway to optimize local implementation. Methods included key-stakeholder interviews, developing practical diffusion of innovation processes using key-opinion leaders and active facilitation strategies and a mini-community of practice. New and existing processes and outcomes were compared for 137 children during 2007-2008. Realist principles were successfully adapted to a shorter policy implementation and evaluation time frame. Important new implementation success factors included facilitated implementation that enabled 'real-time' manipulation of programme logic and local context to best-fit evolving theories of what worked; using local experiential opinion to change supporting tools to more realistically align with local context and what worked; and having sufficient existing local infrastructure to support implementation. Ten mechanisms explained implementation success and differences in outcomes between new and existing processes. Realistic policy implementation methods have advantages over top-down approaches, especially where clinical expertise is low and unlikely to diffuse innovations 'naturally' without facilitated implementation and local optimization. © 2013

  3. Regge calculus from discontinuous metrics

    International Nuclear Information System (INIS)

    Khatsymovsky, V.M.

    2003-01-01

    Regge calculus is considered as a particular case of the more general system in which the link lengths of any two neighbouring 4-tetrahedra do not necessarily coincide on their common face. This system is treated as one described by a metric that is discontinuous on the faces. In the superspace of all discontinuous metrics, the Regge calculus metrics form a hypersurface defined by continuity conditions. The quantum theory of the discontinuous metric system is assumed to be fixed somehow in the form of a quantum measure on (the space of functionals on) the superspace. The problem of reducing this measure to the Regge hypersurface is addressed. The quantum Regge calculus measure is defined from a discontinuous metric measure by inserting a δ-function-like phase factor. The requirement that continuity conditions be imposed in a 'face-independent' way fixes this factor uniquely. The term 'face-independent' means that this factor depends only on the (hyper)plane spanned by the face, not on its form and size. This requirement seems natural from the viewpoint of the existence of a well-defined continuum limit maximally free of lattice artefacts.

  4. Social Media Metrics Importance and Usage Frequency in Latvia

    OpenAIRE

    Ronalds Skulme

    2017-01-01

    Purpose of the article: The purpose of this paper was to explore which social media marketing metrics are most often used and are most important for marketing experts in Latvia and can be used to evaluate marketing campaign effectiveness. Methodology/methods: In order to achieve the aim of this paper several theoretical and practical research methods were used, such as theoretical literature analysis, surveying and grouping. First of all, theoretical research about social media metrics was...

  5. An implementation of particle swarm optimization to evaluate optimal under-voltage load shedding in competitive electricity markets

    Science.gov (United States)

    Hosseini-Bioki, M. M.; Rashidinejad, M.; Abdollahi, A.

    2013-11-01

    Load shedding is a crucial issue in power systems, especially in a restructured electricity environment. Market-driven load shedding in reregulated power systems, associated with security as well as reliability, is investigated in this paper. A technoeconomic multi-objective function is introduced to reveal an optimal load shedding scheme considering maximum social welfare. The proposed optimization problem includes maximizing GENCOs' and loads' profits as well as the loadability limit under normal and contingency conditions. Particle swarm optimization (PSO), a heuristic optimization technique, is utilized to find an optimal load shedding scheme. In a market-driven structure, generators offer their bidding blocks while the dispatchable loads bid their price-responsive demands. An independent system operator (ISO) derives a market clearing price (MCP) while rescheduling the amount of generated power in both pre-contingency and post-contingency conditions. The proposed methodology is developed on a 3-bus system and then applied to a modified IEEE 30-bus test system. The obtained results show the effectiveness of the proposed methodology in implementing optimal load shedding that satisfies social welfare while maintaining the voltage stability margin (VSM), as shown through technoeconomic analyses.

  6. Metrics Evolution in an Energy Research and Development Program

    International Nuclear Information System (INIS)

    Dixon, Brent

    2011-01-01

    All technology programs progress through three phases: Discovery, Definition, and Deployment. The form and application of program metrics needs to evolve with each phase. During the discovery phase, the program determines what is achievable. A set of tools is needed to define program goals, to analyze credible technical options, and to ensure that the options are compatible and meet the program objectives. A metrics system that scores the potential performance of technical options is part of this system of tools, supporting screening of concepts and aiding in the overall definition of objectives. During the definition phase, the program defines what specifically is wanted. What is achievable is translated into specific systems and specific technical options are selected and optimized. A metrics system can help with the identification of options for optimization and the selection of the option for deployment. During the deployment phase, the program shows that the selected system works. Demonstration projects are established and classical systems engineering is employed. During this phase, the metrics communicate system performance. This paper discusses an approach to metrics evolution within the Department of Energy's Nuclear Fuel Cycle R and D Program, which is working to improve the sustainability of nuclear energy.

  7. Evaluation of a commercial packed bed flow hydrogenator for reaction screening, optimization, and synthesis

    Directory of Open Access Journals (Sweden)

    Marian C. Bryan

    2011-08-01

    Full Text Available The performance of the ThalesNano H-Cube®, a commercial packed bed flow hydrogenator, was evaluated in the context of small scale reaction screening and optimization. A model reaction, the reduction of styrene to ethylbenzene through a 10% Pd/C catalyst bed, was used to examine performance at various pressure settings, over sequential runs, and with commercial catalyst cartridges. In addition, the consistency of the hydrogen flow was indirectly measured by in-line UV spectroscopy. Finally, system contamination due to catalyst leaching, and the resolution of this issue, is described. The impact of these factors on the run-to-run reproducibility of the H-Cube® reactor for screening and reaction optimization is discussed.

  8. LOW-CALORIES RAISINS OBTAINED BY COMBINED DEHYDRATION: PROCESS OPTIMIZATION AND EVALUATION OF THE ANTIOXIDANT EFFICIENCY

    Directory of Open Access Journals (Sweden)

    Mariana B. Laborde

    2015-03-01

    Full Text Available A healthy dehydrated food of high nutritional quality and added value was developed: low-calorie raisins obtained by an ultrasound-assisted combined dehydration with a two-stage osmotic treatment (D3S) complemented by drying. Pink Red Globe grapes produced in Mendoza (Argentina) underwent substitution of sugar by the natural sweetener Stevia in two osmotic stages under different conditions (treatment with/without ultrasound; sweetener concentration 18, 20, 22% w/w; time 35, 75, 115 minutes), with evaluation of soluble solids (SS), moisture (M), total polyphenols (PF), antioxidant efficiency (AE) and sugar profile. Multiple optimization of the process by response surface methodology and desirability analysis allowed M to be minimized, SS (Stevia incorporation) to be maximized, and the maximum amount of PF to be preserved. After the first stage, the optimal treatment reduced the major sugars of the grape (sucrose, glucose) by 32%, and by 57% at the end of the dehydration process.

  9. Artificial Neural Networks in Evaluation and Optimization of Modified Release Solid Dosage Forms

    Directory of Open Access Journals (Sweden)

    Zorica Djurić

    2012-10-01

    Full Text Available Implementation of the Quality by Design (QbD approach in pharmaceutical development has compelled researchers in the pharmaceutical industry to employ Design of Experiments (DoE as a statistical tool, in product development. Among all DoE techniques, response surface methodology (RSM is the one most frequently used. Progress of computer science has had an impact on pharmaceutical development as well. Simultaneous with the implementation of statistical methods, machine learning tools took an important place in drug formulation. Twenty years ago, the first papers describing application of artificial neural networks in optimization of modified release products appeared. Since then, a lot of work has been done towards implementation of new techniques, especially Artificial Neural Networks (ANN in modeling of production, drug release and drug stability of modified release solid dosage forms. The aim of this paper is to review artificial neural networks in evaluation and optimization of modified release solid dosage forms.

  10. Critical evaluation and thermodynamic optimization of the U-Pb and U-Sb binary systems

    International Nuclear Information System (INIS)

    Wang, Jian; Jin, Liling; Chen, Chuchu; Rao, Weifeng; Wang, Cuiping; Liu, Xingjun

    2016-01-01

    A complete literature review, critical evaluation and thermodynamic optimization of the phase diagrams and thermodynamic properties of U-Pb and U-Sb binary systems are presented. The CALculation of PHAse Diagrams (CALPHAD) method was used for the thermodynamic optimization, the results of which can reproduce all available reliable experimental phase equilibria and thermodynamic data. The modified quasi-chemical model in the pair approximation (MQMPA) was used for modeling the liquid solution. The Gibbs energies of all terminal solid solutions and intermetallic compounds were described by the compound energy formalism (CEF) model. All reliable experimental data of the U-Pb and U-Sb systems have been reproduced. A self-consistent thermodynamic database has been constructed for these binary systems; this database can be used in liquid-metal fuel reactor (LMFR) research.

  11. Performance Evaluation and Parameter Optimization of SoftCast Wireless Video Broadcast

    Directory of Open Access Journals (Sweden)

    Dongxue Yang

    2015-08-01

    Full Text Available Wireless video broadcast plays an important role in multimedia communication with the emergence of mobile video applications. However, conventional video broadcast designs suffer from a cliff effect due to separated source and channel encoding. The newly proposed SoftCast scheme employs a cross-layer design, whose reconstructed video quality is proportional to the channel condition. In this paper, we provide the performance evaluation and the parameter optimization of the SoftCast system. Optimization principles on parameter selection are suggested to obtain a better video quality, occupy less bandwidth and/or utilize lower complexity. In addition, we compare SoftCast with H.264 in the LTE EPA scenario. The simulation results show that SoftCast provides a better performance in the scalability to channel conditions and the robustness to packet losses.

  12. Evaluation of an optimized shade guide made from porcelain powder mixtures.

    Science.gov (United States)

    Wang, Peng; Wei, Jiaqiang; Li, Qing; Wang, Yining

    2014-12-01

    Color errors associated with current shade guides and problems with color selection and duplication are still challenging for restorative dentists. The purpose of this study was to evaluate an optimized shade guide for visual shade duplication. Color distributions (L*, a*, and b*) of the maxillary left central incisors of 236 participants, whose ages ranged from 20 to 60 years, were measured with a spectrophotometer. Based on this color map, an optimized shade guide was designed with 14 shade tabs evenly distributed within the given color range of the natural incisors. The shade tabs were fabricated with porcelain powder mixtures and conventional laboratory procedures. A comparison of shade duplication using the optimized and Vitapan Classical shade guides was conducted. Thirty Chinese participants were involved, and shade selection for the left maxillary incisors was performed with the 2 shade guides. Metal ceramic crowns were fabricated according to the results of the shade selection. The colors of the shade tabs, natural teeth, and ceramic crowns were measured with a spectrophotometer. The color differences among the natural teeth, the shade tabs, and the corresponding metal ceramic crowns were calculated and analyzed (α=.017). Significant differences were found in both the shade determination and shade duplication phases (P<.017). The total color error with the optimized shade guide was 3.5, significantly less than that with Vitapan (5.1) (P<.001). The optimized shade guide system improved performance not only in the color selection phase but also in the color of the fabricated crowns. Copyright © 2014 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
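
    The color comparisons described above are based on differences between CIELAB (L*, a*, b*) coordinates. As a minimal, hedged illustration, the sketch below computes the classic CIE76 color difference ΔE*ab between two hypothetical Lab triples; the study may have used a different ΔE formula, and the values shown are invented.

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 color difference between two CIELAB triples (L*, a*, b*)."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

if __name__ == "__main__":
    tooth = (72.4, 1.8, 18.2)   # hypothetical natural-tooth color
    crown = (70.1, 2.3, 21.0)   # hypothetical metal ceramic crown color
    print(f"dE*ab = {delta_e_ab(tooth, crown):.2f}")  # ~3.66
```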

  13. Optimization and evaluation of pluronic lecithin organogels as a transdermal delivery vehicle for sinomenine.

    Science.gov (United States)

    Ba, Wenqiang; Li, Zhou; Wang, Lisheng; Wang, Ding; Liao, Weiguo; Fan, Wentao; Wu, Yinai; Liao, Fengyun; Yu, Jianye

    2016-08-01

    The purpose of the present study was to prepare and optimize a sinomenine (SIN) pluronic lecithin organogel (PLO) system, and to evaluate the permeability of the optimized PLO in vitro and in vivo. A Box-Behnken design was used to optimize the PLO, and the optimized formulation contained 19.61% Pluronic F127, 3.60% lecithin and 1.27% SIN. The formulation was evaluated for skin permeation and drug deposition both in vitro and in vivo in comparison with a gel. Permeation and deposition studies of the PLO were carried out with Franz diffusion cells in vitro and with microdialysis in vivo. In the in vitro studies, the permeation rate (Jss) of SIN from the PLO was 146.55 ± 2.93 μg/cm(2)/h, significantly higher than that of the gel (120.39 μg/cm(2)/h), and the amount of SIN deposited in skin from the PLO was 10.08 ± 0.86 μg/cm(2), significantly larger than that from the gel (6.01 ± 0.04 μg/cm(2)). In vivo skin microdialysis studies showed that the maximum concentrations (Cmax) of SIN from the PLO in the permeation study and the drug-deposition study were 150.27 ± 20.85 μg/ml and 67.95 μg/ml, respectively, both significantly higher than those of SIN from the gel (29.66 and 6.73 μg/ml). The results suggest that the PLO can be used as an advantageous transdermal delivery vehicle to enhance the permeation and skin deposition of SIN.

  14. Optimal design of modular cogeneration plants for hospital facilities and robustness evaluation of the results

    International Nuclear Information System (INIS)

    Gimelli, A.; Muccillo, M.; Sannino, R.

    2017-01-01

    Highlights: • A specific methodology has been set up based on genetic optimization algorithm. • Results highlight a tradeoff between primary energy savings (TPES) and simple payback (SPB). • Optimized plant configurations show TPES exceeding 18% and SPB of approximately three years. • The study aims to identify the most stable plant solutions through the robust design optimization. • The research shows how a deterministic definition of the decision variables could lead to an overestimation of the results. - Abstract: The widespread adoption of combined heat and power generation is widely recognized as a strategic goal to achieve significant primary energy savings and lower carbon dioxide emissions. In this context, the purpose of this research is to evaluate the potential of cogeneration based on reciprocating gas engines for some Italian hospital buildings. Comparative analyses have been conducted based on the load profiles of two specific hospital facilities and through the study of the cogeneration system-user interaction. To this end, a specific methodology has been set up by coupling a specifically developed calculation algorithm to a genetic optimization algorithm, and a multi-objective approach has been adopted. The results from the optimization problem highlight a clear trade-off between total primary energy savings (TPES) and simple payback period (SPB). Optimized plant configurations and management strategies show TPES exceeding 18% for the reference hospital facilities and multi–gas engine solutions along with a minimum SPB of approximately three years, thereby justifying the European regulation promoting cogeneration. However, designing a CHP plant for a specific energetic, legislative or market scenario does not guarantee good performance when these scenarios change. For this reason, the proposed methodology has been enhanced in order to focus on some innovative aspects. In particular, this study proposes an uncommon and effective approach

  15. Symmetries of Taub-NUT dual metrics

    International Nuclear Information System (INIS)

    Baleanu, D.; Codoban, S.

    1998-01-01

    Recently geometric duality was analyzed for a metric which admits Killing tensors. An interesting example arises when the manifold has Killing-Yano tensors. The symmetries of the dual metrics in the case of Taub-NUT metric are investigated. Generic and non-generic symmetries of dual Taub-NUT metric are analyzed

  16. Self-benchmarking Guide for Cleanrooms: Metrics, Benchmarks, Actions

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul; Sartor, Dale; Tschudi, William

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.

  17. Self-benchmarking Guide for Laboratory Buildings: Metrics, Benchmarks, Actions

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul; Greenberg, Steve; Sartor, Dale

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.

  18. Image characterization metrics for muon tomography

    Science.gov (United States)

    Luo, Weidong; Lehovich, Andre; Anashkin, Edward; Bai, Chuanyong; Kindem, Joel; Sossong, Michael; Steiger, Matt

    2014-05-01

    Muon tomography uses naturally occurring cosmic rays to detect nuclear threats in containers. Currently there are no systematic image characterization metrics for muon tomography. We propose a set of image characterization methods to quantify the imaging performance of muon tomography. These methods include tests of spatial resolution, uniformity, contrast, signal-to-noise ratio (SNR) and vertical smearing. Simulated phantom data and analysis methods were developed to evaluate metric applicability. Spatial resolution was determined as the FWHM of the point spread functions along the X, Y and Z axes for 2.5 cm tungsten cubes. Uniformity was measured by drawing a volume of interest (VOI) within a large water phantom and was defined as the standard deviation of voxel values divided by the mean voxel value. Contrast was defined as the peak signals of a set of tungsten cubes divided by the mean voxel value of the water background. SNR was defined as the peak signals of the cubes divided by the standard deviation (noise) of the water background. Vertical smearing, i.e. vertical thickness blurring along the zenith axis for a set of 2 cm thick tungsten plates, was defined as the FWHM of the vertical spread function for the plate. These image metrics provide a useful tool to quantify the basic imaging properties of muon tomography.
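
    The uniformity, contrast, and SNR definitions quoted in the abstract translate directly into array operations. The sketch below applies them to synthetic voxel data; the phantom values are invented and the spatial-resolution and vertical-smearing (FWHM) tests are omitted for brevity.

```python
import numpy as np

def uniformity(voxels_in_voi):
    """Standard deviation of voxel values divided by the mean voxel value."""
    v = np.asarray(voxels_in_voi, dtype=float)
    return v.std() / v.mean()

def contrast(peak_signals, background_voxels):
    """Peak signals of the cubes divided by the mean of the water background."""
    return np.asarray(peak_signals, dtype=float) / np.mean(background_voxels)

def snr(peak_signals, background_voxels):
    """Peak signals of the cubes divided by the standard deviation (noise)
    of the water background."""
    return np.asarray(peak_signals, dtype=float) / np.std(background_voxels)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    water_voi = rng.normal(10.0, 1.5, size=(20, 20, 20))   # synthetic background VOI
    tungsten_peaks = np.array([85.0, 92.0, 78.0])           # synthetic cube peak values
    print(f"uniformity = {uniformity(water_voi):.3f}")
    print(f"contrast   = {contrast(tungsten_peaks, water_voi).round(1)}")
    print(f"SNR        = {snr(tungsten_peaks, water_voi).round(1)}")
```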

  19. A Kerr-NUT metric

    International Nuclear Information System (INIS)

    Vaidya, P.C.; Patel, L.K.; Bhatt, P.V.

    1976-01-01

    Using Galilean time and retarded distance as coordinates, the usual Kerr metric is expressed in a form similar to the Newman-Unti-Tamburino (NUT) metric. The combined Kerr-NUT metric is then investigated. In addition to the Kerr and NUT solutions of Einstein's equations, three other types of solutions are derived. These are (i) the radiating Kerr solution, (ii) the radiating NUT solution satisfying R_{ik} = σξ_iξ_k, ξ_iξ^i = 0, and (iii) the associated Kerr solution satisfying R_{ik} = 0. Solution (i) is distinct from and simpler than the one reported earlier by Vaidya and Patel (Phys. Rev. D7:3590 (1973)). Solutions (ii) and (iii) give line elements which have the axis of symmetry as a singular line. (author)

  20. Complexity Metrics for Workflow Nets

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M.P.

    2009-01-01

    analysts have difficulties grasping the dynamics implied by a process model. Recent empirical studies show that people make numerous errors when modeling complex business processes, e.g., about 20 percent of the EPCs in the SAP reference model have design flaws resulting in potential deadlocks, livelocks, etc. It seems obvious that the complexity of the model contributes to design errors and a lack of understanding. It is not easy to measure complexity, however. This paper presents three complexity metrics that have been implemented in the process analysis tool ProM. The metrics are defined for a subclass of Petri nets named Workflow nets, but the results can easily be applied to other languages. To demonstrate the applicability of these metrics, we have applied our approach and tool to 262 relatively complex Protos models made in the context of various student projects. This allows us to validate...
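
    The three metrics of the record above are implemented in ProM and are not reproduced here. As an illustrative stand-in only, the sketch below computes generic structural size/connectivity indicators for a toy Workflow (Petri) net represented as plain Python lists; these are not the paper's metrics.

```python
def workflow_net_complexity(places, transitions, arcs):
    """Simple structural indicators for a Petri (Workflow) net.
    NOTE: illustrative only -- generic size/connectivity measures,
    not the three ProM metrics defined in the paper."""
    n_nodes = len(places) + len(transitions)
    n_arcs = len(arcs)
    return {
        "places": len(places),
        "transitions": len(transitions),
        "arcs": n_arcs,
        # coefficient of network connectivity: arcs per node
        "cnc": n_arcs / n_nodes if n_nodes else 0.0,
    }

if __name__ == "__main__":
    # toy Workflow net: i -> t1 -> {p1, p2} -> t2 -> o (a parallel split/join)
    places = ["i", "p1", "p2", "o"]
    transitions = ["t1", "t2"]
    arcs = [("i", "t1"), ("t1", "p1"), ("t1", "p2"),
            ("p1", "t2"), ("p2", "t2"), ("t2", "o")]
    print(workflow_net_complexity(places, transitions, arcs))
```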

  1. The uniqueness of the Fisher metric as information metric

    Czech Academy of Sciences Publication Activity Database

    Le, Hong-Van

    2017-01-01

    Vol. 69, No. 4 (2017), pp. 879-896 ISSN 0020-3157 Institutional support: RVO:67985840 Keywords: Chentsov's theorem * mixed topology * monotonicity of the Fisher metric Subject RIV: BA - General Mathematics OECD field: Pure mathematics Impact factor: 1.049, year: 2016 https://link.springer.com/article/10.1007%2Fs10463-016-0562-0

  2. Measuring Success: Metrics that Link Supply Chain Management to Aircraft Readiness

    National Research Council Canada - National Science Library

    Balestreri, William

    2002-01-01

    This thesis evaluates and analyzes current strategic management planning methods that develop performance metrics linking supply chain management to aircraft readiness. Our primary focus is the Marine...

  3. Invariant metrics for Hamiltonian systems

    International Nuclear Information System (INIS)

    Rangarajan, G.; Dragt, A.J.; Neri, F.

    1991-05-01

    In this paper, invariant metrics are constructed for Hamiltonian systems. These metrics give rise to norms on the space of homogeneous polynomials of phase-space variables. For an accelerator lattice described by a Hamiltonian, these norms characterize the nonlinear content of the lattice. Therefore, the performance of the lattice can be improved by minimizing the norm as a function of parameters describing the beam-line elements in the lattice. A four-fold increase in the dynamic aperture of a model FODO cell is obtained using this procedure. 7 refs

  4. Generalization of Vaidya's radiation metric

    Energy Technology Data Exchange (ETDEWEB)

    Gleiser, R J; Kozameh, C N [Universidad Nacional de Cordoba (Argentina). Instituto de Matematica, Astronomia y Fisica

    1981-11-01

    In this paper it is shown that if Vaidya's radiation metric is considered from the point of view of kinetic theory in general relativity, the corresponding phase space distribution function can be generalized in a particular way. The new family of spherically symmetric radiation metrics obtained contains Vaidya's as a limiting situation. The Einstein field equations are solved in a ''comoving'' coordinate system. Two arbitrary functions of a single variable are introduced in the process of solving these equations. Particular examples considered are a stationary solution, a nonvacuum solution depending on a single parameter, and several limiting situations.

  5. Energy-Based Metrics for Arthroscopic Skills Assessment.

    Science.gov (United States)

    Poursartip, Behnaz; LeBel, Marie-Eve; McCracken, Laura C; Escoto, Abelardo; Patel, Rajni V; Naish, Michael D; Trejos, Ana Luisa

    2017-08-05

    Minimally invasive skills assessment methods are essential in developing efficient surgical simulators and implementing consistent skills evaluation. Although numerous methods have been investigated in the literature, there is still a need to further improve the accuracy of surgical skills assessment. Energy expenditure can be an indication of motor skills proficiency. The goals of this study are to develop objective metrics based on energy expenditure, normalize these metrics, and investigate classifying trainees using these metrics. To this end, different forms of energy consisting of mechanical energy and work were considered and their values were divided by the related value of an ideal performance to develop normalized metrics. These metrics were used as inputs for various machine learning algorithms including support vector machines (SVM) and neural networks (NNs) for classification. The accuracy of the combination of the normalized energy-based metrics with these classifiers was evaluated through a leave-one-subject-out cross-validation. The proposed method was validated using 26 subjects at two experience levels (novices and experts) in three arthroscopic tasks. The results showed that there are statistically significant differences between novices and experts for almost all of the normalized energy-based metrics. The accuracy of classification using SVM and NN methods was between 70% and 95% for the various tasks. The results show that the normalized energy-based metrics and their combination with SVM and NN classifiers are capable of providing accurate classification of trainees. The assessment method proposed in this study can enhance surgical training by providing appropriate feedback to trainees about their level of expertise and can be used in the evaluation of proficiency.
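
    A hedged sketch of the overall pipeline described above is shown below: an energy-based feature is normalized by the value of an (assumed) ideal performance, and a support vector machine is then evaluated with leave-one-subject-out cross-validation. All trial data, the ideal reference values, and the two-feature layout are synthetic placeholders, not data or parameters from the study.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic trials: each row is one task execution by one subject.
# Columns: mechanical energy and work expended during the trial (arbitrary units).
n_subjects, trials_per_subject = 10, 6
subjects = np.repeat(np.arange(n_subjects), trials_per_subject)
labels = (subjects < n_subjects // 2).astype(int)        # 0 = novice, 1 = expert
raw = rng.normal(loc=np.where(labels[:, None] == 1, 8.0, 14.0),
                 scale=2.0, size=(len(subjects), 2))

# Normalize each energy-based feature by the value of an assumed ideal performance,
# mirroring the abstract's division by the related value of an ideal execution.
ideal = np.array([6.0, 5.0])
features = raw / ideal

# Leave-one-subject-out cross-validation of an SVM classifier.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, features, labels, groups=subjects,
                         cv=LeaveOneGroupOut())
print(f"mean leave-one-subject-out accuracy: {scores.mean():.2f}")
```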

  6. Technical Privacy Metrics: a Systematic Survey

    OpenAIRE

    Wagner, Isabel; Eckhoff, David

    2018-01-01

    The goal of privacy metrics is to measure the degree of privacy enjoyed by users in a system and the amount of protection offered by privacy-enhancing technologies. In this way, privacy metrics contribute to improving user privacy in the digital world. The diversity and complexity of privacy metrics in the literature makes an informed choice of metrics challenging. As a result, instead of using existing metrics, n...

  7. Remarks on G-Metric Spaces

    Directory of Open Access Journals (Sweden)

    Bessem Samet

    2013-01-01

    Full Text Available In 2005, Mustafa and Sims (2006) introduced and studied a new class of generalized metric spaces, which are called G-metric spaces, as a generalization of metric spaces. We establish some useful propositions to show that many fixed point theorems on (nonsymmetric) G-metric spaces given recently by many authors follow directly from well-known theorems on metric spaces. Our technique can be easily extended to other results, as shown in the application.

  8. DLA Energy Biofuel Feedstock Metrics Study

    Science.gov (United States)

    2012-12-11

    [Table excerpt] Metric 1: state invasiveness ranking (e.g., moderately/highly invasive); Metric 2: genetically modified organism (GMO) hazard (Yes/No and hazard category); Metric 3: species hybridization. These metrics are scored across the biofuel life-cycle stages, through stage 4 (biofuel distribution) and stage 5 (biofuel use). ... may utilize GMO microbial or microalgae species across the applicable biofuel life cycles (stages 1–3). The following consequence Metrics 4–6 then...

  9. Contrast-enhanced helical CT of the pancreas. Optimal timing of imaging for pancreatic tumor evaluation

    International Nuclear Information System (INIS)

    Koide, Kazuki; Sekiguchi, Ryuzo

    2001-01-01

    We performed three-phase helical CT in patients with suspected pancreatic tumors and investigated the optimal timing of imaging for evaluation of the pancreatic mass. The pancreatic phase was superior for detecting pancreatic tumors, including islet cell tumors, which may show strong enhancement. However, portal vein-phase imaging was also superior in 16.7% of our patients. Taking into account the examination for hepatic metastases, helical CT of any pancreatic tumor should include images obtained in both the pancreatic and portal vein phases. (author)

  10. TORO: ninety-six-week virologic and immunologic response and safety evaluation of enfuvirtide with an optimized background of antiretrovirals

    NARCIS (Netherlands)

    Reynes, Jacques; Arastéh, Keikawus; Clotet, Bonaventura; Cohen, Calvin; Cooper, David A.; Delfraissy, Jean-François; Eron, Joseph J.; Henry, Keith; Katlama, Christine; Kuritzkes, Daniel R.; Lalezari, Jacob P.; Lange, Joep; Lazzarin, Adriano; Montaner, Julio S. G.; Nelson, Mark; O' Hearn, Mary; Stellbrink, Hans-Jürgen; Trottier, Benoit; Walmsley, Sharon L.; Buss, Neil E.; DeMasi, Ralph; Chung, Jain; Donatacci, Lucille; Guimaraes, Denise; Rowell, Lucy; Valentine, Adeline; Wilkinson, Martin; Salgo, Miklos P.

    2007-01-01

    The additional 48-week optional treatment extension of the T-20 versus Optimized Regimen Only (TORO) studies evaluated long-term safety and efficacy of enfuvirtide (ENF) through week 96 in patients receiving ENF plus optimized background (OB) and patients switching to ENF plus OB from OB alone.

  11. A study on correlation between 2D and 3D gamma evaluation metrics in patient-specific quality assurance for VMAT

    Energy Technology Data Exchange (ETDEWEB)

    Rajasekaran, Dhanabalan, E-mail: dhanabalanraj@gmail.com; Jeevanandam, Prakash; Sukumar, Prabakar; Ranganathan, Arulpandiyan; Johnjothi, Samdevakumar; Nagarajan, Vivekanandan

    2014-01-01

    In this study, we investigated the correlation between 2-dimensional (2D) and 3D gamma analysis using the new PTW OCTAVIUS 4D system for various parameters. For this study, we selected 150 clinically approved volumetric-modulated arc therapy (VMAT) plans of head and neck (50), thoracic (esophagus) (50), and pelvic (cervix) (50) sites. Individual verification plans were created and delivered to the OCTAVIUS 4D phantom. Measured and calculated dose distributions were compared using 2D and 3D gamma analysis with global (maximum), local and selected (isocenter) dose methods. The average gamma passing rate for 2D global gamma analysis in the coronal and sagittal planes was 94.81% ± 2.12% and 95.19% ± 1.76%, respectively, for the commonly used 3-mm/3% criteria with a 10% low-dose threshold. Correspondingly, for the same criteria, the average gamma passing rate for 3D planar global gamma analysis was 95.90% ± 1.57% and 95.61% ± 1.65%. The volumetric 3D gamma passing rate for 3-mm/3% (10% low-dose threshold) global gamma was 96.49% ± 1.49%. Applying more stringent gamma criteria resulted in larger differences between 2D planar and 3D planar gamma analysis across all the global, local, and selected dose gamma evaluation methods. The average gamma passing rate for volumetric 3D gamma analysis was 1.49%, 1.36%, and 2.16% higher when compared with 2D planar analyses (coronal and sagittal combined average) for 3 mm/3% global, local, and selected dose gamma analysis, respectively. On the basis of this wide-ranging analysis and correlation study, we conclude that there is no assured correlation or notable pattern relating planar 2D and volumetric 3D gamma analysis. Owing to the higher passing rates, higher action limits can be set when performing 3D quality assurance. Site-wise action limits may be considered for patient-specific QA in VMAT.
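
    For illustration of the 3%/3 mm global gamma criterion discussed above, the sketch below computes a brute-force 2D global gamma passing rate with a 10% low-dose threshold on synthetic dose planes. It is deliberately simplified relative to the OCTAVIUS 4D/VeriSoft analysis (single plane, no interpolation, invented detector spacing).

```python
import numpy as np

def gamma_pass_rate_2d(ref, meas, spacing_mm=2.5, dta_mm=3.0,
                       dose_pct=3.0, low_dose_cutoff=0.10):
    """Global 2D gamma passing rate (brute force). The dose-difference criterion
    is taken relative to the maximum of the reference dose (global normalization)."""
    ref, meas = np.asarray(ref, float), np.asarray(meas, float)
    dd = dose_pct / 100.0 * ref.max()                 # global dose criterion
    ny, nx = ref.shape
    yy, xx = np.mgrid[0:ny, 0:nx] * spacing_mm        # physical coordinates (mm)
    evaluated, passed = 0, 0
    for iy in range(ny):
        for ix in range(nx):
            if ref[iy, ix] < low_dose_cutoff * ref.max():
                continue                               # below low-dose threshold
            dist2 = (yy - iy * spacing_mm) ** 2 + (xx - ix * spacing_mm) ** 2
            gamma2 = dist2 / dta_mm ** 2 + (meas - ref[iy, ix]) ** 2 / dd ** 2
            evaluated += 1
            passed += gamma2.min() <= 1.0              # gamma <= 1 means the point passes
    return passed / evaluated

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    reference = np.outer(np.hanning(41), np.hanning(41)) * 200.0   # synthetic dose plane (cGy)
    measured = reference * (1.0 + rng.normal(0, 0.01, reference.shape))
    print(f"2D global gamma (3%/3 mm) passing rate: "
          f"{100 * gamma_pass_rate_2d(reference, measured):.1f}%")
```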

  12. SU-E-T-436: Fluence-Based Trajectory Optimization for Non-Coplanar VMAT

    Energy Technology Data Exchange (ETDEWEB)

    Smyth, G; Bamber, JC; Bedford, JL [Joint Department of Physics at The Institute of Cancer Research and The Royal Marsden NHS Foundation Trust, London (United Kingdom); Evans, PM [Centre for Vision, Speech and Signal Processing, University of Surrey, Guildford (United Kingdom); Saran, FH; Mandeville, HC [The Royal Marsden NHS Foundation Trust, Sutton (United Kingdom)

    2015-06-15

    Purpose: To investigate a fluence-based trajectory optimization technique for non-coplanar VMAT for brain cancer. Methods: Single-arc non-coplanar VMAT trajectories were determined using a heuristic technique for five patients. Organ at risk (OAR) volume intersected during raytracing was minimized for two cases: absolute volume and the sum of relative volumes weighted by OAR importance. These trajectories and coplanar VMAT formed starting points for the fluence-based optimization method. Iterative least squares optimization was performed on control points 24° apart in gantry rotation. Optimization minimized the root-mean-square (RMS) deviation of PTV dose from the prescription (relative importance 100), maximum dose to the brainstem (10), optic chiasm (5), globes (5) and optic nerves (5), plus mean dose to the lenses (5), hippocampi (3), temporal lobes (2), cochleae (1) and brain excluding other regions of interest (1). Control point couch rotations were varied in steps of up to 10° and accepted if the cost function improved. Final treatment plans were optimized with the same objectives in an in-house planning system and evaluated using a composite metric - the sum of optimization metrics weighted by importance. Results: The composite metric decreased with fluence-based optimization in 14 of the 15 plans. In the remaining case its overall value, and the PTV and OAR components, were unchanged but the balance of OAR sparing differed. PTV RMS deviation was improved in 13 cases and unchanged in two. The OAR component was reduced in 13 plans. In one case the OAR component increased but the composite metric decreased - a 4 Gy increase in OAR metrics was balanced by a reduction in PTV RMS deviation from 2.8% to 2.6%. Conclusion: Fluence-based trajectory optimization improved plan quality as defined by the composite metric. While dose differences were case specific, fluence-based optimization improved both PTV and OAR dosimetry in 80% of cases.

  13. Application of an integrated model for evaluation and optimization of business projects portfolios

    Directory of Open Access Journals (Sweden)

    Camila Costa Dutra

    2016-12-01

    Full Text Available This work presents an application of an integrated model for the evaluation and probabilistic optimization of project portfolios, integrating economic, risk, and social and environmental impact analyses. The model uses Monte Carlo simulation and linear programming techniques for the treatment of uncertainties and the optimization of the project portfolio. The integrated model was applied in a Brazilian electricity distribution company. The portfolio of selected projects was related to the expansion of the electricity supply in a town in the south of the country, and the analysis horizon was set at ten years. The aim of the application was to maximize the return from the implementation of a substation and a transmission line within a set of projects that are diverse in terms of costs, benefits, and environmental and social impacts. As a result, the model generates: (i) an analysis of each individual project, based on budget information (costs and benefits involved), an estimation of the social and environmental impacts generated by the project, and the risks (uncertainties) involved; and (ii) the optimum combination of projects that the company should prioritize to ensure the best financial return and the lowest social and environmental impacts, thus generating an optimal portfolio.

  14. APPLICATION OF AN INTEGRATED MODEL FOR EVALUATION AND OPTIMIZATION OF BUSINESS PROJECTS PORTFOLIOS

    Directory of Open Access Journals (Sweden)

    Maria Auxiliadora Cannarozzo Tinoco

    2016-12-01

    Full Text Available This work presents an application of an integrated model for the evaluation and probabilistic optimization of project portfolios, integrating economic, risk, and social and environmental impact analyses. The model uses Monte Carlo simulation and linear programming techniques for the treatment of uncertainties and the optimization of the project portfolio. The integrated model was applied in a Brazilian electricity distribution company. The portfolio of selected projects was related to the expansion of the electricity supply in a town in the south of the country, and the analysis horizon was set at ten years. The aim of the application was to maximize the return from the implementation of a substation and a transmission line within a set of projects that are diverse in terms of costs, benefits, and environmental and social impacts. As a result, the model generates: (i) an analysis of each individual project, based on budget information (costs and benefits involved), an estimation of the social and environmental impacts generated by the project, and the risks (uncertainties) involved; and (ii) the optimum combination of projects that the company should prioritize to ensure the best financial return and the lowest social and environmental impacts, thus generating an optimal portfolio.
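
    A minimal, hedged sketch of the kind of probabilistic portfolio selection described in the two records above is given below. The project names, costs, benefits, impact scores, budget, and uncertainty ranges are all invented, and exhaustive enumeration replaces the linear programming step of the actual model for brevity.

```python
import itertools
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical projects: (name, expected cost, expected benefit, impact score 0-1)
projects = [("substation A", 4.0, 7.0, 0.30),
            ("line B",       3.0, 5.5, 0.20),
            ("substation C", 5.0, 6.5, 0.50),
            ("line D",       2.0, 3.0, 0.10)]
budget = 8.0          # total capital available (arbitrary units)
impact_weight = 2.0   # trade-off between monetary return and impact
n_samples = 5000      # Monte Carlo samples for uncertain costs and benefits

def expected_net_benefit(idx):
    """Monte Carlo estimate of the net benefit of a set of projects,
    with +/-20% (cost) and +/-30% (benefit) uniform uncertainty."""
    total = 0.0
    for i in idx:
        _, cost, benefit, _ = projects[i]
        c = rng.uniform(0.8 * cost, 1.2 * cost, n_samples)
        b = rng.uniform(0.7 * benefit, 1.3 * benefit, n_samples)
        total += (b - c).mean()
    return total

best, best_score = None, -np.inf
for r in range(1, len(projects) + 1):
    for idx in itertools.combinations(range(len(projects)), r):
        if sum(projects[i][1] for i in idx) > budget:
            continue  # over budget (expected costs)
        score = expected_net_benefit(idx) - impact_weight * sum(projects[i][3] for i in idx)
        if score > best_score:
            best, best_score = idx, score

print("optimal portfolio:", [projects[i][0] for i in best], f"score = {best_score:.2f}")
```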

  15. Production Task Queue Optimization Based on Multi-Attribute Evaluation for Complex Product Assembly Workshop.

    Science.gov (United States)

    Li, Lian-Hui; Mo, Rong

    2015-01-01

    The production task queue has great significance for manufacturing resource allocation and scheduling decisions. Manual, qualitative queue optimization methods perform poorly and are difficult to apply. A production task queue optimization method based on multi-attribute evaluation is therefore proposed. According to the task attributes, a hierarchical multi-attribute model is established and the indicator quantization methods are given. To calculate the objective indicator weights, criteria importance through intercriteria correlation (CRITIC) is selected from three common methods. To calculate the subjective indicator weights, a BP neural network is used to determine the importance degree of each judge, and a trapezoid fuzzy scale-rough AHP that accounts for the judge importance degree is then put forward. The balanced weight, which integrates the objective and subjective weights, is calculated based on a multi-weight contribution balance model. The technique for order preference by similarity to an ideal solution (TOPSIS), improved by replacing Euclidean distance with relative entropy distance, is used to sequence the tasks and optimize the queue by the weighted indicator values. A case study is given to illustrate the correctness and feasibility of the method.

  16. Production Task Queue Optimization Based on Multi-Attribute Evaluation for Complex Product Assembly Workshop.

    Directory of Open Access Journals (Sweden)

    Lian-Hui Li

    Full Text Available The production task queue has great significance for manufacturing resource allocation and scheduling decisions. Manual, qualitative queue optimization methods perform poorly and are difficult to apply in practice. A production task queue optimization method based on multi-attribute evaluation is therefore proposed. According to the task attributes, a hierarchical multi-attribute model is established and the indicator quantization methods are given. To calculate the objective indicator weights, criteria importance through intercriteria correlation (CRITIC) is selected from three common methods. To calculate the subjective indicator weights, a BP neural network is used to determine the judge importance degree, and a trapezoid fuzzy scale-rough AHP that accounts for the judge importance degree is then put forward. The balanced weight, which integrates the objective and subjective weights, is calculated based on a multi-weight contribution balance model. The technique for order preference by similarity to an ideal solution (TOPSIS), improved by replacing the Euclidean distance with a relative entropy distance, is used to sequence the tasks and optimize the queue by the weighted indicator values. A case study is given to illustrate the correctness and feasibility of the method.

  17. Evaluation of optimized bronchoalveolar lavage sampling designs for characterization of pulmonary drug distribution.

    Science.gov (United States)

    Clewe, Oskar; Karlsson, Mats O; Simonsson, Ulrika S H

    2015-12-01

    Bronchoalveolar lavage (BAL) is a pulmonary sampling technique for characterization of drug concentrations in epithelial lining fluid and alveolar cells. Two hypothetical drugs with different pulmonary distribution rates (fast and slow) were considered. An optimized BAL sampling design was generated assuming no previous information regarding the pulmonary distribution (rate and extent) and with a maximum of two samples per subject. Simulations were performed to evaluate the impact of the number of samples per subject (1 or 2) and the sample size on the relative bias and relative root mean square error of the parameter estimates (rate and extent of pulmonary distribution). The optimized BAL sampling design depends on a characterized plasma concentration time profile, a population plasma pharmacokinetic model, the limit of quantification (LOQ) of the BAL method and involves only two BAL sample time points, one early and one late. The early sample should be taken as early as possible, where concentrations in the BAL fluid ≥ LOQ. The second sample should be taken at a time point in the declining part of the plasma curve, where the plasma concentration is equivalent to the plasma concentration in the early sample. Using a previously described general pulmonary distribution model linked to a plasma population pharmacokinetic model, simulated data using the final BAL sampling design enabled characterization of both the rate and extent of pulmonary distribution. The optimized BAL sampling design enables characterization of both the rate and extent of the pulmonary distribution for both fast and slowly equilibrating drugs.
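
    The two-point rule described above can be sketched numerically. In the snippet below, the one-compartment plasma profile, the LOQ, and the BAL/plasma penetration ratio are all illustrative assumptions, not the characterized population pharmacokinetic model the design actually relies on.

    import numpy as np

    # Hypothetical one-compartment oral plasma profile (not the model used in the study).
    ka, ke, scale = 1.5, 0.2, 100.0
    plasma = lambda t: scale * ka / (ka - ke) * (np.exp(-ke * t) - np.exp(-ka * t))

    loq = 5.0             # limit of quantification of the BAL assay (assumed)
    penetration = 0.3     # assumed BAL/plasma concentration ratio

    t = np.linspace(0.01, 24.0, 2400)   # candidate sampling times (h)
    c_plasma = plasma(t)
    c_bal = penetration * c_plasma      # crude surrogate for BAL-fluid concentrations

    # Early sample: as early as possible while the BAL concentration is still >= LOQ.
    early_idx = int(np.argmax(c_bal >= loq))
    t_early, c_ref = t[early_idx], c_plasma[early_idx]

    # Late sample: on the declining limb, where plasma matches the early-sample level.
    peak_idx = int(np.argmax(c_plasma))
    late_idx = peak_idx + int(np.argmin(np.abs(c_plasma[peak_idx:] - c_ref)))
    print(f"early BAL sample ~{t_early:.2f} h, late BAL sample ~{t[late_idx]:.2f} h")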

  18. Self-organizing weights for Internet AS-graphs and surprisingly simple routing metrics

    DEFF Research Database (Denmark)

    Scholz, Jan Carsten; Greiner, Martin

    The transport capacity of Internet-like communication networks and hence their efficiency may be improved by a factor of 5-10 through the use of highly optimized routing metrics, as demonstrated previously. Numerical determination of such routing metrics can be computationally demanding...... metrics. The new metrics have negligible computational cost and result in an approximately 5-fold performance increase, providing distinguished competitiveness with the computationally costly counterparts. They are applicable to very large networks and easy to implement in today's Internet routing...

  19. Quantum anomalies for generalized Euclidean Taub-NUT metrics

    International Nuclear Information System (INIS)

    Cotaescu, Ion I; Moroianu, Sergiu; Visinescu, Mihai

    2005-01-01

    The generalized Taub-NUT metrics exhibit in general gravitational anomalies. This is in contrast with the fact that the original Taub-NUT metric does not exhibit gravitational anomalies, which is a consequence of the fact that it admits Killing-Yano tensors forming Staeckel-Killing tensors as products. We have found that for axial anomalies, interpreted as the index of the Dirac operator, the presence of Killing-Yano tensors is irrelevant. In order to evaluate the axial anomalies, we compute the index of the Dirac operator with the APS boundary condition on balls and on annular domains. The result is an explicit number-theoretic quantity depending on the radii of the domain. This quantity is 0 for metrics close to the original Taub-NUT metric but it does not vanish in general

  20. Evaluation of optimal dual axis concentrated photovoltaic thermal system with active ventilation using Frog Leap algorithm

    International Nuclear Information System (INIS)

    Gholami, H.; Sarwat, A.I.; Hosseinian, H.; Khalilnejad, A.

    2015-01-01

    Highlights: • Electro-thermal performance of open-loop controlled dual axis CPVT is investigated. • For using the absorbed heat, active ventilation with a heat storage tank is used. • Economic optimization of the system is performed, using Frog Leap algorithm. • Detailed model of all sections is simulated with their characteristics evaluation. • Triple-junction photovoltaic cells, which are the most recent technology, are used. - Abstract: In this study, design and optimization of a concentrated photovoltaic thermal (CPVT) system considering electrical, mechanical, and economical aspects is investigated. For this purpose, each section of the system is simulated in MATLAB, in detail. Triple-junction photovoltaic cells, which are the most recent technology, are used in this study. They are more efficient in comparison to conventional photovoltaic cells. Unlike ordinary procedures, in this work active ventilation is used for absorbing the thermal power of radiation, using heat storage tanks, which not only results in increasing the electrical efficiency of the system through decreasing the temperature, but also leads to storing and managing produced thermal energy and increasing the total efficiency of the system up to 85 percent. The operation of the CPVT system is investigated for total hours of the year, considering the needed thermal load, meteorological conditions, and hourly radiation of Khuznin, a city in Qazvin province, Iran. Finally, the collector used for this system is optimized economically, using frog leap algorithm, which resulted in the cost of 13.4 $/m² for a collector with the optimal distance between tubes of 6.34 cm.

  1. Separable metrics and radiating stars

    Indian Academy of Sciences (India)

    We study the junction condition relating the pressure to heat flux at the boundary of an accelerating and expanding spherically symmetric radiating star. We transform the junction condition to an ordinary differential equation by making a separability assumption on the metric functions in the space–time variables.

  2. Socio-technical security metrics

    NARCIS (Netherlands)

    Gollmann, D.; Herley, C.; Koenig, V.; Pieters, W.; Sasse, M.A.

    2015-01-01

    Report from Dagstuhl seminar 14491. This report documents the program and the outcomes of Dagstuhl Seminar 14491 “Socio-Technical Security Metrics”. In the domain of safety, metrics inform many decisions, from the height of new dikes to the design of nuclear plants. We can state, for example, that

  3. Leading Gainful Employment Metric Reporting

    Science.gov (United States)

    Powers, Kristina; MacPherson, Derek

    2016-01-01

    This chapter will address the importance of intercampus involvement in reporting of gainful employment student-level data that will be used in the calculation of gainful employment metrics by the U.S. Department of Education. The authors will discuss why building relationships within the institution is critical for effective gainful employment…

  4. Simulation of a turbofan engine for evaluation of multivariable optimal control concepts. [(computerized simulation)]

    Science.gov (United States)

    Seldner, K.

    1976-01-01

    The development of control systems for jet engines requires a real-time computer simulation. The simulation provides an effective tool for evaluating control concepts and problem areas prior to actual engine testing. The development and use of a real-time simulation of the Pratt and Whitney F100-PW100 turbofan engine is described. The simulation was used in a multivariable optimal controls research program based on linear quadratic regulator (LQR) theory. The simulation is used to generate linear engine models at selected operating points and to evaluate the control algorithm. To reduce the complexity of the design, it is desirable to reduce the order of the linear model; a technique for doing so is discussed, and selected results from the high- and low-order models are compared. The LQR control algorithms can be programmed on a digital computer, which will control the engine simulation over the desired flight envelope.
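
    A minimal example of the linear quadratic regulator computation at the core of such a program is given below; the system and weighting matrices are arbitrary placeholders standing in for a reduced-order linear engine model, not the F100 model of the report.

    import numpy as np
    from scipy.linalg import solve_continuous_are

    A = np.array([[-1.0, 0.5],
                  [0.0, -2.0]])            # placeholder reduced-order engine dynamics
    B = np.array([[0.0],
                  [1.0]])
    Q = np.diag([10.0, 1.0])               # state weighting
    R = np.array([[0.1]])                  # control weighting

    P = solve_continuous_are(A, B, Q, R)   # solve the algebraic Riccati equation
    K = np.linalg.solve(R, B.T @ P)        # optimal state-feedback gain: u = -K x
    print("LQR gain K =", K)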

  5. Biological-based and physical-based optimization for biological evaluation of prostate patient's plans

    Science.gov (United States)

    Sukhikh, E.; Sheino, I.; Vertinsky, A.

    2017-09-01

    Modern radiation treatment modalities allow the tumor to be irradiated to high dose values while simultaneously keeping the dose to organs at risk (OARs) low. In this paper we study optimal radiation treatment plans made in the Monaco treatment planning system. The first aim of this study was to evaluate the dosimetric features of the Monaco system using biological versus dose-based cost functions for the OARs and irradiation targets (namely tumors) when the full potential of the built-in biological cost functions is utilized. The second aim was to develop criteria for the evaluation of patient dosimetry plans based on the macroscopic radiobiological criteria TCP/NTCP (tumor control probability/normal tissue complication probability). In the framework of the study, four dosimetric plans were created utilizing the full extent of the biological and physical cost functions, using dose-calculation-based treatment planning for IMRT step-and-shoot delivery of stereotactic body radiation therapy (SBRT) in a prostate case (5 fractions of 7 Gy).
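
    The record does not state which radiobiological models underlie the TCP/NTCP criteria, so the sketch below uses two common textbook choices, a Poisson linear-quadratic TCP model and the Lyman-Kutcher-Burman NTCP model driven by generalized EUD, with purely illustrative parameter values and dose distributions.

    import numpy as np
    from math import erf

    def poisson_tcp(dose_per_voxel, alpha=0.3, beta=0.03, n_frac=5, clonogens_per_voxel=1e4):
        """TCP from a Poisson linear-quadratic cell-survival model (illustrative parameters)."""
        d = dose_per_voxel / n_frac                     # dose per fraction
        sf = np.exp(-(alpha * dose_per_voxel + beta * d * dose_per_voxel))
        return float(np.exp(-clonogens_per_voxel * sf).prod())

    def lkb_ntcp(dose_per_voxel, volumes, td50=80.0, m=0.15, n=0.1):
        """LKB NTCP: generalized EUD mapped through a normal CDF (illustrative parameters)."""
        geud = (volumes * dose_per_voxel ** (1.0 / n)).sum() ** n
        t = (geud - td50) / (m * td50)
        return 0.5 * (1.0 + erf(t / np.sqrt(2.0)))

    target_dose = np.full(1000, 35.0)        # 5 x 7 Gy to every target voxel (idealized)
    oar_dose = np.linspace(5.0, 30.0, 500)   # hypothetical OAR dose distribution
    oar_vol = np.full(500, 1.0 / 500)        # equal fractional volumes

    print("TCP  =", round(poisson_tcp(target_dose), 3))
    print("NTCP =", round(lkb_ntcp(oar_dose, oar_vol), 3))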

  6. Optimization, evaluation, and comparison of standard algorithms for image reconstruction with the VIP-PET.

    Science.gov (United States)

    Mikhaylova, E; Kolstein, M; De Lorenzo, G; Chmeissani, M

    2014-07-01

    A novel positron emission tomography (PET) scanner design based on a room-temperature pixelated CdTe solid-state detector is being developed within the framework of the Voxel Imaging PET (VIP) Pathfinder project [1]. The simulation results show the great potential of the VIP to produce high-resolution images even in extremely challenging conditions such as the screening of a human head [2]. With an unprecedentedly high channel density (450 channels/cm³), image reconstruction is a challenge. Optimization is therefore needed to find the best algorithm in order to correctly exploit the promising detector potential. The following reconstruction algorithms are evaluated: 2-D Filtered Backprojection (FBP), Ordered Subset Expectation Maximization (OSEM), List-Mode OSEM (LM-OSEM), and the Origin Ensemble (OE) algorithm. The evaluation is based on the comparison of a true image phantom with a set of reconstructed images obtained by each algorithm. This is achieved by calculation of image quality merit parameters such as the bias, the variance and the mean square error (MSE). A systematic optimization of each algorithm is performed by varying the reconstruction parameters, such as the cutoff frequency of the noise filters and the number of iterations. The region of interest (ROI) analysis of the reconstructed phantom is also performed for each algorithm and the results are compared. Additionally, the performance of the image reconstruction methods is compared by calculating the modulation transfer function (MTF). The reconstruction time is also taken into account to choose the optimal algorithm. The analysis is based on GAMOS [3] simulation including the expected CdTe and electronic specifics.
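
    The figures of merit used in the comparison (bias, variance, and mean square error against the true phantom) reduce to a few lines of array arithmetic, as in the sketch below; the phantom and the set of "reconstructions" are random stand-ins.

    import numpy as np

    rng = np.random.default_rng(1)
    true_phantom = rng.uniform(0.0, 1.0, size=(64, 64))
    # Stand-in for several independent reconstructions of the same phantom by one algorithm.
    recons = true_phantom + rng.normal(0.0, 0.05, size=(10, 64, 64)) + 0.01

    mean_recon = recons.mean(axis=0)
    bias = (mean_recon - true_phantom).mean()
    variance = recons.var(axis=0).mean()
    mse = ((recons - true_phantom) ** 2).mean()
    print(f"bias={bias:.4f}  variance={variance:.4f}  MSE={mse:.4f}")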

  7. Design Optimization and In Vitro-In Vivo Evaluation of Orally Dissolving Strips of Clobazam

    Directory of Open Access Journals (Sweden)

    Rajni Bala

    2014-01-01

    Full Text Available Clobazam orally dissolving strips were prepared by the solvent casting method. A full 3² factorial design was applied for optimization using different concentrations of film forming polymer and disintegrating agent as independent variables and disintegration time, % cumulative drug release, and tensile strength as dependent variables. In addition the prepared films were also evaluated for surface pH, folding endurance, and content uniformity. The optimized film formulation showing the maximum in vitro drug release, satisfactory in vitro disintegration time, and tensile strength was selected for bioavailability study and compared with a reference marketed product (frisium5 tablets) in rabbits. Formulation (F6) was selected by the Design-expert software, which exhibited DT (24 sec), TS (2.85 N/cm2), and in vitro drug release (96.6%). Statistical evaluation revealed no significant difference between the bioavailability parameters of the test film (F6) and the reference product. The mean ratio values (test/reference) of Cmax (95.87%), tmax (71.42%), AUC0−t (98.125%), and AUC0−∞ (99.213%) indicated that the two formulae exhibited comparable plasma level-time profiles.

  8. Design Optimization and In Vitro-In Vivo Evaluation of Orally Dissolving Strips of Clobazam

    Science.gov (United States)

    Bala, Rajni; Khanna, Sushil; Pawar, Pravin

    2014-01-01

    Clobazam orally dissolving strips were prepared by the solvent casting method. A full 3² factorial design was applied for optimization using different concentrations of film forming polymer and disintegrating agent as independent variables and disintegration time, % cumulative drug release, and tensile strength as dependent variables. In addition the prepared films were also evaluated for surface pH, folding endurance, and content uniformity. The optimized film formulation showing the maximum in vitro drug release, satisfactory in vitro disintegration time, and tensile strength was selected for bioavailability study and compared with a reference marketed product (frisium5 tablets) in rabbits. Formulation (F6) was selected by the Design-expert software, which exhibited DT (24 sec), TS (2.85 N/cm2), and in vitro drug release (96.6%). Statistical evaluation revealed no significant difference between the bioavailability parameters of the test film (F6) and the reference product. The mean ratio values (test/reference) of Cmax (95.87%), tmax (71.42%), AUC0−t (98.125%), and AUC0−∞ (99.213%) indicated that the two formulae exhibited comparable plasma level-time profiles. PMID:25328709

  9. Occupant feedback based model predictive control for thermal comfort and energy optimization: A chamber experimental evaluation

    International Nuclear Information System (INIS)

    Chen, Xiao; Wang, Qian; Srebric, Jelena

    2016-01-01

    Highlights: • This study evaluates an occupant-feedback driven Model Predictive Controller (MPC). • The MPC adjusts indoor temperature based on a dynamic thermal sensation (DTS) model. • A chamber model for predicting chamber air temperature is developed and validated. • Experiments show that MPC using DTS performs better than using Predicted Mean Vote. - Abstract: In current centralized building climate control, occupants do not have much opportunity to intervene in the automated control system. This study explores the benefit of using thermal comfort feedback from occupants in the model predictive control (MPC) design based on a novel dynamic thermal sensation (DTS) model. This DTS-based MPC was evaluated in chamber experiments. A hierarchical structure for thermal control was adopted in the chamber experiments. At the high level, an MPC controller calculates the optimal supply air temperature of the chamber heating, ventilation, and air conditioning (HVAC) system, using the feedback of occupants’ votes on thermal sensation. At the low level, the actual supply air temperature is controlled by the chiller/heater using a PI control to achieve the optimal set point. This DTS-based MPC was also compared to an MPC designed based on the Predicted Mean Vote (PMV) model for thermal sensation. The experiment results demonstrated that the DTS-based MPC using occupant feedback allows significant energy saving while maintaining occupant thermal comfort compared to the PMV-based MPC.

  10. Framework to evaluate the worth of hydraulic conductivity data for optimal groundwater resources management in ecologically sensitive areas

    Science.gov (United States)

    Feyen, Luc; Gorelick, Steven M.

    2005-03-01

    We propose a framework that combines simulation optimization with Bayesian decision analysis to evaluate the worth of hydraulic conductivity data for optimal groundwater resources management in ecologically sensitive areas. A stochastic simulation optimization management model is employed to plan regionally distributed groundwater pumping while preserving the hydroecological balance in wetland areas. Because predictions made by an aquifer model are uncertain, groundwater supply systems operate below maximum yield. Collecting data from the groundwater system can potentially reduce predictive uncertainty and increase safe water production. The price paid for improvement in water management is the cost of collecting the additional data. Efficient data collection using Bayesian decision analysis proceeds in three stages: (1) The prior analysis determines the optimal pumping scheme and profit from water sales on the basis of known information. (2) The preposterior analysis estimates the optimal measurement locations and evaluates whether each sequential measurement will be cost-effective before it is taken. (3) The posterior analysis then revises the prior optimal pumping scheme and consequent profit, given the new information. Stochastic simulation optimization employing a multiple-realization approach is used to determine the optimal pumping scheme in each of the three stages. The cost of new data must not exceed the expected increase in benefit obtained in optimal groundwater exploitation. An example based on groundwater management practices in Florida aimed at wetland protection showed that the cost of data collection more than paid for itself by enabling a safe and reliable increase in production.
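
    A toy version of the multiple-realization simulation-optimization step might look like the following: pumping rates are chosen to maximize profit while the predicted wetland drawdown stays below a limit in every conductivity realization. The linear response coefficients, prices, and limits are invented placeholders for what a calibrated groundwater model would supply.

    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(7)
    n_wells, n_real = 3, 50
    price = np.array([1.0, 1.2, 0.9])     # profit per unit pumping rate (hypothetical)

    # response[r, w]: wetland drawdown per unit pumping at well w in realization r.
    response = rng.lognormal(mean=-2.0, sigma=0.4, size=(n_real, n_wells))
    max_drawdown = 0.30                   # hydroecological constraint (m)
    max_rate = 5.0                        # capacity per well

    # linprog minimizes, so negate the profit; one drawdown constraint per realization.
    res = linprog(c=-price, A_ub=response, b_ub=np.full(n_real, max_drawdown),
                  bounds=[(0.0, max_rate)] * n_wells, method="highs")
    print("reliable pumping scheme:", np.round(res.x, 3), "profit:", round(-res.fun, 3))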

  11. Metrics for Evaluating Performance of Prognostics Techniques

    Data.gov (United States)

    National Aeronautics and Space Administration — Prognostics is an emerging concept in condition based maintenance (CBM) of critical systems. Along with developing the fundamentals of being able to confidently...

  12. Metrics for Evaluating Performance of Prognostic Techniques

    Data.gov (United States)

    National Aeronautics and Space Administration — Prognostics is an emerging concept in condition based maintenance (CBM) of critical systems. Along with developing the fundamentals of being able to confidently predict...

  13. Human Performance Metrics for Spacesuit Evaluation

    Data.gov (United States)

    National Aeronautics and Space Administration — Introduction: Human spaceflight and exploration beyond low-earth orbit requires providing crewmembers life support systems in various extreme environments, such as...

  14. Evaluating Algorithm Performance Metrics Tailored for Prognostics

    Data.gov (United States)

    National Aeronautics and Space Administration — Prognostics has taken center stage in Condition Based Maintenance (CBM) where it is desired to estimate Remaining Useful Life (RUL) of a system so that remedial...

  15. Evaluation and optimization of DNA extraction and purification procedures for soil and sediment samples.

    Science.gov (United States)

    Miller, D N; Bryant, J E; Madsen, E L; Ghiorse, W C

    1999-11-01

    We compared and statistically evaluated the effectiveness of nine DNA extraction procedures by using frozen and dried samples of two silt loam soils and a silt loam wetland sediment with different organic matter contents. The effects of different chemical extractants (sodium dodecyl sulfate [SDS], chloroform, phenol, Chelex 100, and guanidinium isothiocyanate), different physical disruption methods (bead mill homogenization and freeze-thaw lysis), and lysozyme digestion were evaluated based on the yield and molecular size of the recovered DNA. Pairwise comparisons of the nine extraction procedures revealed that bead mill homogenization with SDS combined with either chloroform or phenol optimized both the amount of DNA extracted and the molecular size of the DNA (maximum size, 16 to 20 kb). Neither lysozyme digestion before SDS treatment nor guanidine isothiocyanate treatment nor addition of Chelex 100 resin improved the DNA yields. Bead mill homogenization in a lysis mixture containing chloroform, SDS, NaCl, and phosphate-Tris buffer (pH 8) was found to be the best physical lysis technique when DNA yield and cell lysis efficiency were used as criteria. The bead mill homogenization conditions were also optimized for speed and duration with two different homogenizers. Recovery of high-molecular-weight DNA was greatest when we used lower speeds and shorter times (30 to 120 s). We evaluated four different DNA purification methods (silica-based DNA binding, agarose gel electrophoresis, ammonium acetate precipitation, and Sephadex G-200 gel filtration) for DNA recovery and removal of PCR inhibitors from crude extracts. Sephadex G-200 spin column purification was found to be the best method for removing PCR-inhibiting substances while minimizing DNA loss during purification. Our results indicate that for these types of samples, optimum DNA recovery requires brief, low-speed bead mill homogenization in the presence of a phosphate-buffered SDS-chloroform mixture, followed

  16. Thermodynamic evaluation and optimization of the (Na+K+S) system

    International Nuclear Information System (INIS)

    Lindberg, Daniel; Backman, Rainer; Hupa, Mikko; Chartrand, Patrice

    2006-01-01

    The (Na+K+S) system is of primary importance for the combustion of black liquor in the kraft recovery boilers in pulp and paper mills. A thermodynamic evaluation and optimization for the (Na+K+S) system has been made. All available data for the system have been critically evaluated to obtain optimized parameters of thermodynamic models for all phases. The liquid model is the quasichemical model in the quadruplet approximation, which evaluates 1st- and 2nd-nearest-neighbour short-range order. In this model, cations (Na⁺ and K⁺) are assumed to mix on a cationic sublattice, while anions (S²⁻, S₂²⁻, S₃²⁻, S₄²⁻, S₅²⁻, S₆²⁻, S₇²⁻, S₈²⁻, Va⁻) are assumed to mix on an anionic sublattice. The thermodynamic data of the liquid polysulphide components M₂S₁₊ₙ (M = Na, K and n = 1-7) are fitted to ΔG = A(n) + B(n)·T for the reaction M₂S(l) + nS(l) = M₂Sₙ₊₁(l). The solid phases are the alkali alloys, alkali sulphides, several different alkali polysulphides and sulphur. The solid solutions (Na,K), (Na,K)₂S and (Na,K)₂S₂ are modelled using the compound energy formalism. The models can be used to predict the thermodynamic properties and phase equilibria in the multicomponent heterogeneous system. The experimental data are reproduced within experimental error limits for equilibria between solid, liquid and gas. The ternary phase diagram of the system (Na₂S+K₂S+S) has been predicted, as no experimental determinations of the phase diagram have been made previously

  17. Optimization of Scat Detection Methods for a Social Ungulate, the Wild Pig, and Experimental Evaluation of Factors Affecting Detection of Scat.

    Directory of Open Access Journals (Sweden)

    David A Keiter

    Full Text Available Collection of scat samples is common in wildlife research, particularly for genetic capture-mark-recapture applications. Due to high degradation rates of genetic material in scat, large numbers of samples must be collected to generate robust estimates. Optimization of sampling approaches to account for taxa-specific patterns of scat deposition is, therefore, necessary to ensure sufficient sample collection. While scat collection methods have been widely studied in carnivores, research to maximize scat collection and noninvasive sampling efficiency for social ungulates is lacking. Further, environmental factors or scat morphology may influence detection of scat by observers. We contrasted performance of novel radial search protocols with existing adaptive cluster sampling protocols to quantify differences in observed amounts of wild pig (Sus scrofa) scat. We also evaluated the effects of environmental factors (percentage of vegetative ground cover and occurrence of rain immediately prior to sampling) and scat characteristics (fecal pellet size and number) on the detectability of scat by observers. We found that 15- and 20-m radial search protocols resulted in greater numbers of scats encountered than the previously used adaptive cluster sampling approach across habitat types, and that fecal pellet size, number of fecal pellets, percent vegetative ground cover, and recent rain events were significant predictors of scat detection. Our results suggest that use of a fixed-width radial search protocol may increase the number of scats detected for wild pigs, or other social ungulates, allowing more robust estimation of population metrics using noninvasive genetic sampling methods. Further, as fecal pellet size affected scat detection, juvenile or smaller-sized animals may be less detectable than adult or large animals, which could introduce bias into abundance estimates. Knowledge of relationships between environmental variables and scat detection may allow

  18. Robust optimization based upon statistical theory.

    Science.gov (United States)

    Sobotta, B; Söhn, M; Alber, M

    2010-08-01

    Organ movement is still the biggest challenge in cancer treatment despite advances in online imaging. Due to the resulting geometric uncertainties, the delivered dose cannot be predicted precisely at treatment planning time. Consequently, all associated dose metrics (e.g., EUD and maxDose) are random variables with a patient-specific probability distribution. The method that the authors propose makes these distributions the basis of the optimization and evaluation process. The authors start from a model of motion derived from patient-specific imaging. On a multitude of geometry instances sampled from this model, a dose metric is evaluated. The resulting pdf of this dose metric is termed outcome distribution. The approach optimizes the shape of the outcome distribution based on its mean and variance. This is in contrast to the conventional optimization of a nominal value (e.g., PTV EUD) computed on a single geometry instance. The mean and variance allow for an estimate of the expected treatment outcome along with the residual uncertainty. Besides being applicable to the target, the proposed method also seamlessly includes the organs at risk (OARs). The likelihood that a given value of a metric is reached in the treatment is predicted quantitatively. This information reveals potential hazards that may occur during the course of the treatment, thus helping the expert to find the right balance between the risk of insufficient normal tissue sparing and the risk of insufficient tumor control. By feeding this information to the optimizer, outcome distributions can be obtained where the probability of exceeding a given OAR maximum and that of falling short of a given target goal can be minimized simultaneously. The method is applicable to any source of residual motion uncertainty in treatment delivery. Any model that quantifies organ movement and deformation in terms of probability distributions can be used as basis for the algorithm. Thus, it can generate dose
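
    A bare-bones illustration of optimizing the outcome distribution rather than a nominal value is given below: a toy dose model depends on the plan parameters and on a sampled geometric shift, and the objective trades off the mean and the spread of a target EUD over the sampled shifts. Only the idea, not the dose model or the metric choice, is taken from the record.

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(42)
    shifts = rng.normal(0.0, 3.0, size=200)        # sampled patient motion (mm), hypothetical

    def target_eud(weights, shift, a=-10.0):
        # Toy two-beam dose to five target points whose effective position moves with 'shift'.
        x = np.linspace(-10.0, 10.0, 5) + shift
        dose = (weights[0] * np.exp(-(x - 5.0) ** 2 / 200.0)
                + weights[1] * np.exp(-(x + 5.0) ** 2 / 200.0))
        dose = np.maximum(dose, 1e-3)              # guard against non-physical values in the search
        return (np.mean(dose ** a)) ** (1.0 / a)   # generalized EUD (a < 0: cold spots dominate)

    def objective(weights, prescription=60.0, lam=1.0):
        euds = np.array([target_eud(weights, s) for s in shifts])
        # Penalize deviation of the mean outcome from the prescription plus the outcome spread.
        return (euds.mean() - prescription) ** 2 + lam * euds.var()

    res = minimize(objective, x0=np.array([30.0, 30.0]), method="Nelder-Mead")
    euds = np.array([target_eud(res.x, s) for s in shifts])
    print("weights:", np.round(res.x, 2), "EUD mean/std:", round(euds.mean(), 2), round(euds.std(), 2))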

  19. Nanosized sustained-release pyridostigmine bromide microcapsules: process optimization and evaluation of characteristics

    Science.gov (United States)

    Tan, Qunyou; Jiang, Rong; Xu, Meiling; Liu, Guodong; Li, Songlin; Zhang, Jingqing

    2013-01-01

    Background Pyridostigmine bromide (3-[[(dimethylamino)-carbonyl]oxy]-1-methylpyridinium bromide), a reversible inhibitor of cholinesterase, is given orally in tablet form, and a treatment schedule of multiple daily doses is recommended for adult patients. Nanotechnology was used in this study to develop an alternative sustained-release delivery system for pyridostigmine, a synthetic drug with high solubility and poor oral bioavailability, hence a Class III drug according to the Biopharmaceutics Classification System. Novel nanosized pyridostigmine-poly(lactic acid) microcapsules (PPNMCs) were expected to have a longer duration of action than free pyridostigmine and previously reported sustained-release formulations of pyridostigmine. Methods The PPNMCs were prepared using a double emulsion-solvent evaporation method to achieve sustained-release characteristics for pyridostigmine. The preparation process for the PPNMCs was optimized by single-factor experiments. The size distribution, zeta potential, and sustained-release behavior were evaluated in different types of release medium. Results The optimal volume ratio of inner phase to external phase, poly(lactic acid) concentration, polyvinyl alcohol concentration, and amount of pyridostigmine were 1:10, 6%, 3% and 40 mg, respectively. The negatively charged PPNMCs had an average particle size of 937.9 nm. Compared with free pyridostigmine, PPNMCs showed an initial burst release and a subsequent very slow release in vitro. The release profiles for the PPNMCs in four different types of dissolution medium were fitted to the Ritger-Peppas and Weibull models. The similarity between pairs of dissolution profiles for the PPNMCs in different types of medium was statistically significant, and the difference between the release curves for PPNMCs and free pyridostigmine was also statistically significant. Conclusion PPNMCs prepared by the optimized protocol described here were in the nanometer range and had good uniformity

  20. Group covariance and metrical theory

    International Nuclear Information System (INIS)

    Halpern, L.

    1983-01-01

    The a priori introduction of a Lie group of transformations into a physical theory has often proved to be useful; it usually serves to describe special simplified conditions before a general theory can be worked out. Newton's assumptions of absolute space and time are examples where the Euclidean group and translation group have been introduced. These groups were extended to the Galilei group and modified in the special theory of relativity to the Poincare group to describe physics under the given conditions covariantly in the simplest way. The criticism of the a priori character leads to the formulation of the general theory of relativity. The general metric theory does not really give preference to a particular invariance group - even the principle of equivalence can be adapted to a whole family of groups. The physical laws covariantly inserted into the metric space are however adapted to the Poincare group. 8 references

  1. General relativity: An erfc metric

    Science.gov (United States)

    Plamondon, Réjean

    2018-06-01

    This paper proposes an erfc potential to incorporate in a symmetric metric. One key feature of this model is that it relies on the existence of an intrinsic physical constant σ, a star-specific proper length that scales all its surroundings. Based thereon, the new metric is used to study the space-time geometry of a static symmetric massive object, as seen from its interior. The analytical solutions to the Einstein equation are presented, highlighting the absence of singularities and discontinuities in such a model. The geodesics are derived in their second- and first-order differential formats. Recalling the slight impact of the new model on the classical general relativity tests in the solar system, a number of facts and open problems are briefly revisited on the basis of a heuristic definition of σ. A special attention is given to gravitational collapses and non-singular black holes.

  2. Important LiDAR metrics for discriminating forest tree species in Central Europe

    Science.gov (United States)

    Shi, Yifang; Wang, Tiejun; Skidmore, Andrew K.; Heurich, Marco

    2018-03-01

    Numerous airborne LiDAR-derived metrics have been proposed for classifying tree species. Yet an in-depth ecological and biological understanding of the significance of these metrics for tree species mapping remains largely unexplored. In this paper, we evaluated the performance of 37 frequently used LiDAR metrics derived under leaf-on and leaf-off conditions, respectively, for discriminating six different tree species in a natural forest in Germany. We firstly assessed the correlation between these metrics. Then we applied a Random Forest algorithm to classify the tree species and evaluated the importance of the LiDAR metrics. Finally, we identified the most important LiDAR metrics and tested their robustness and transferability. Our results indicated that about 60% of LiDAR metrics were highly correlated to each other (|r| > 0.7). There was no statistically significant difference in tree species mapping accuracy between the use of leaf-on and leaf-off LiDAR metrics. However, combining leaf-on and leaf-off LiDAR metrics significantly increased the overall accuracy from 58.2% (leaf-on) and 62.0% (leaf-off) to 66.5% as well as the kappa coefficient from 0.47 (leaf-on) and 0.51 (leaf-off) to 0.58. Radiometric features, especially intensity related metrics, provided more consistent and significant contributions than geometric features for tree species discrimination. Specifically, the mean intensity of first-or-single returns as well as the mean value of echo width were identified as the most robust LiDAR metrics for tree species discrimination. These results indicate that metrics derived from airborne LiDAR data, especially radiometric metrics, can aid in discriminating tree species in a mixed temperate forest, and represent candidate metrics for tree species classification and monitoring in Central Europe.
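
    The analysis pipeline described above (correlation filtering at |r| > 0.7, Random Forest classification, overall accuracy and kappa, feature importances) can be sketched as follows; the metric values and species labels are synthetic placeholders, not the German forest data.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score, cohen_kappa_score
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(3)
    n_plots, n_metrics = 300, 12
    X = rng.normal(size=(n_plots, n_metrics))     # stand-in for leaf-on + leaf-off metrics
    y = rng.integers(0, 6, size=n_plots)          # six species classes (synthetic labels)

    # Correlation filter: keep the first metric of every highly correlated pair (|r| > 0.7).
    corr = np.abs(np.corrcoef(X, rowvar=False))
    keep = []
    for j in range(n_metrics):
        if all(corr[j, k] <= 0.7 for k in keep):
            keep.append(j)
    X_sel = X[:, keep]

    rf = RandomForestClassifier(n_estimators=500, random_state=0)
    y_pred = cross_val_predict(rf, X_sel, y, cv=5)
    print("overall accuracy:", round(accuracy_score(y, y_pred), 3))
    print("kappa:", round(cohen_kappa_score(y, y_pred), 3))

    rf.fit(X_sel, y)
    print("importances:", np.round(rf.feature_importances_, 3))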

  3. Polyaspartate extraction of cadmium ions from contaminated soil: Evaluation and optimization using central composite design.

    Science.gov (United States)

    Mu'azu, Nuhu Dalhat; Haladu, Shamsuddeen A; Jarrah, Nabeel; Zubair, Mukarram; Essa, Mohammad H; Ali, Shaikh A

    2018-01-15

    The occurrences of heavy metal contaminated sites and soils and the need for devising environmentally friendly solutions have become global issues of serious concern. In this study, polyaspartate (a highly biodegradable agent) was synthesized from L-aspartic acid via a new modified thermal procedure and employed for the extraction of cadmium ions (Cd) from contaminated soil. A response surface methodology approach using a 3⁵ full face-centered central composite design was employed for modeling, evaluating and optimizing the influence of polyaspartate concentration (36-145 mM), polyaspartate/soil ratio (5-25), initial heavy metal concentration (100-500 mg/kg), initial pH (3-6) and extraction time (6-24 h) on the Cd ions extracted into the polyaspartate solution and their residual concentration in the treated soil. The Cd extraction efficacy obtained reached up to 98.8%. An increase in Cd extraction efficiency was associated with an increase in the polyaspartate and Cd concentrations coupled with a lower polyaspartate/soil ratio and initial pH. Under the optimal conditions, characterized by minimal utilization of the polyaspartate and high Cd ion removal, the extractable Cd in the polyaspartate solution reached up to 84.4 mg/L, which yielded 85% Cd extraction efficacy. This study demonstrates the suitability of polyaspartate as an effective, environmentally friendly chelating agent for Cd extraction from contaminated soils. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Evaluating biomass energy strategies for a UK eco-town with an MILP optimization model

    International Nuclear Information System (INIS)

    Keirstead, James; Samsatli, Nouri; Pantaleo, A. Marco; Shah, Nilay

    2012-01-01

    Recent years have shown a marked interest in the construction of eco-towns, showcase developments intended to demonstrate the best in ecologically-sensitive and energy-efficient construction. This paper examines one such development in the UK and considers the role of biomass energy systems. We present an integrated resource modelling framework that identifies an optimized low-cost energy supply system including the choice of conversion technologies, fuel sources, and distribution networks. Our analysis shows that strategies based on imported wood chips, rather than locally converted forestry residues, burned in a mix of ICE and ORC combined heat and power facilities offer the most promise. While there are uncertainties surrounding the precise environmental impacts of these solutions, it is clear that such biomass systems can help eco-towns to meet their target of an 80% reduction in greenhouse gas emissions. -- Highlights: ► An optimization model for urban biomass energy system design is presented. ► Tool selects technologies, operating rates, supply infrastructures. ► Five technology scenarios evaluated for a UK eco-town proposal. ► Results show ICE and ORC CHP units, fed by wood chips, promising. ► Results show biomass can help eco-towns achieve 80% GHG emission reductions.
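
    The record does not disclose the formulation of the integrated resource model, but a minimal mixed-integer program of the same flavour, choosing which conversion technologies to build and how much heat each supplies at minimum annualized cost, might look like the following PuLP sketch with invented technology data.

    import pulp

    # capacity (MWh/yr), annualized capital cost ($/yr), fuel + O&M cost ($/MWh); all invented.
    techs = {
        "wood_chip_ICE_CHP": (8000, 250_000, 45.0),
        "wood_chip_ORC_CHP": (6000, 180_000, 50.0),
        "forestry_residue_boiler": (10_000, 120_000, 60.0),
    }
    heat_demand = 9000   # MWh/yr for the development (hypothetical)

    prob = pulp.LpProblem("eco_town_energy_supply", pulp.LpMinimize)
    build = {t: pulp.LpVariable(f"build_{t}", cat="Binary") for t in techs}
    output = {t: pulp.LpVariable(f"output_{t}", lowBound=0) for t in techs}

    # Minimize annualized capital plus operating cost.
    prob += pulp.lpSum(techs[t][1] * build[t] + techs[t][2] * output[t] for t in techs)
    # Meet the heat demand; a technology can only produce if it is built.
    prob += pulp.lpSum(output[t] for t in techs) >= heat_demand
    for t in techs:
        prob += output[t] <= techs[t][0] * build[t]

    prob.solve(pulp.PULP_CBC_CMD(msg=0))
    for t in techs:
        print(t, "built" if build[t].value() > 0.5 else "not built",
              round(output[t].value(), 1), "MWh/yr")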

  5. Evaluation of 'period-generated' control laws for the time-optimal control of reactor power

    International Nuclear Information System (INIS)

    Bernard, J.A.

    1988-01-01

    Time-Optimal control of neutronic power has recently been achieved by developing control laws that determine the actuator mechanism velocity necessary to produce a specified reactor period. These laws are designated as the 'MIT-SNL Period-Generated Minimum Time Control Laws'. Relative to time-optimal response, they function by altering the rate of change of reactivity so that the instantaneous period is stepped from infinity to its minimum allowed value, held at that value until the desired power level is attained, and then stepped back to infinity. The results of a systematic evaluation of these laws are presented. The behavior of each term in the control laws is shown and the capability of these laws to control properly the reactor power is demonstrated. Factors affecting the implementation of these laws, such as the prompt neutron lifetime and the differential reactivity worth of the actuators, are discussed. Finally, the results of an experimental study in which these laws were used to adjust the power of the 5 MWt MIT Research Reactor are shown. The information presented should be of interest to those designing high performance control systems for test, spacecraft, or, in certain instances, commercial reactors

  6. Metabolomic approach to optimizing and evaluating antibiotic treatment in the axenic culture of cyanobacterium Nostoc flagelliforme.

    Science.gov (United States)

    Han, Pei-pei; Jia, Shi-ru; Sun, Ying; Tan, Zhi-lei; Zhong, Cheng; Dai, Yu-jie; Tan, Ning; Shen, Shi-gang

    2014-09-01

    The application of antibiotic treatment with assistance of metabolomic approach in axenic isolation of cyanobacterium Nostoc flagelliforme was investigated. Seven antibiotics were tested at 1-100 mg L(-1), and order of tolerance of N. flagelliforme cells was obtained as kanamycin > ampicillin, tetracycline > chloromycetin, gentamicin > spectinomycin > streptomycin. Four antibiotics were selected based on differences in antibiotic sensitivity of N. flagelliforme and associated bacteria, and their effects on N. flagelliforme cells including the changes of metabolic activity with antibiotics and the metabolic recovery after removal were assessed by a metabolomic approach based on gas chromatography-mass spectrometry combined with multivariate analysis. The results showed that antibiotic treatment had affected cell metabolism as antibiotics treated cells were metabolically distinct from control cells, but the metabolic activity would be recovered via eliminating antibiotics and the sequence of metabolic recovery time needed was spectinomycin, gentamicin > ampicillin > kanamycin. The procedures of antibiotic treatment have been accordingly optimized as a consecutive treatment starting with spectinomycin, then gentamicin, ampicillin and lastly kanamycin, and proved to be highly effective in eliminating the bacteria as examined by agar plating method and light microscope examination. Our work presented a strategy to obtain axenic culture of N. flagelliforme and provided a method for evaluating and optimizing cyanobacteria purification process through diagnosing target species cellular state.

  7. An Optimization Framework for Investment Evaluation of Complex Renewable Energy Systems

    Directory of Open Access Journals (Sweden)

    David Olave-Rojas

    2017-07-01

    Full Text Available Enhancing the role of renewable energies in existing power systems is one of the most crucial challenges that society faces today. However, the high variability of their generation potential and the temporal disparity between the demand and the generation potential represent technological and operational gaps that burden the massive incorporation of renewable sources into power systems. Energy storage technologies are an alternative to tackle this gap; nonetheless, their incorporation within large-scale power grids calls for decision-making tools that ensure an appropriate design and sizing of power systems that exploit the benefits of incorporating storage facilities along with renewable generation power. In this paper, we present an optimization framework for aiding the evaluation of the strategic design of complex renewable power systems. The developed tool relies on an optimization problem, the generation, transmission, storage energy location and sizing problem, which allows one to compute economically-attractive investment plans given by the location and sizing of generation and storage energy systems, along with the corresponding layout of transmission lines. Results on a real case study (located in the central region of Chile, characterized by carefully-curated data) show the potential of the developed tool for aiding long-term investment planning.

  8. Formulation, optimization, and evaluation of self-emulsifying drug delivery systems of nevirapine.

    Science.gov (United States)

    Chintalapudi, Ramprasad; Murthy, T E G K; Lakshmi, K Rajya; Manohar, G Ganesh

    2015-01-01

    The aim of the present study was to formulate and optimize self-emulsifying drug delivery systems (SEDDS) of nevirapine (NVP) by use of a 2² factorial design to enhance the oral absorption of NVP by improving its solubility, dissolution rate, and diffusion profile. SEDDS are isotropic mixtures of oil, surfactant, co-surfactant and drug that form an oil-in-water microemulsion when introduced into the aqueous phase under gentle agitation. The solubility of NVP in different oils, surfactants, and co-surfactants was determined for the screening of excipients. Pseudo-ternary phase diagrams were constructed by the aqueous titration method, and formulations were developed based on the optimum excipient combinations, with the help of data obtained from the maximum microemulsion region, containing combinations of oil, surfactant, and co-surfactant. The SEDDS formulations were optimized by the 2² factorial design. The optimum SEDDS formulation contains 32.5% oleic acid, 44.16% Tween 20, and 11.9% polyethylene glycol 600 as oil, surfactant, and co-surfactant, respectively. The SEDDS was evaluated for the following: drug content, self-emulsification time, rheological properties, zeta potential, in vitro diffusion, thermodynamic stability, and in vitro dissolution. An increase in dissolution was achieved by the SEDDS compared to the pure form of NVP. Overall, this study suggests that the dissolution and oral bioavailability of NVP could be improved by SEDDS technology.

  9. Evaluation of sample preparation methods and optimization of nickel determination in vegetable tissues

    Directory of Open Access Journals (Sweden)

    Rodrigo Fernando dos Santos Salazar

    2011-02-01

    Full Text Available Nickel, although essential to plants, may be toxic to plants and animals. It is mainly assimilated by food ingestion. However, information about the average levels of elements (including Ni) in edible vegetables from different regions is still scarce in Brazil. The objectives of this study were to: (a) evaluate and optimize a method for preparation of vegetable tissue samples for Ni determination; (b) optimize the analytical procedures for determination by Flame Atomic Absorption Spectrometry (FAAS) and by Electrothermal Atomic Absorption (ETAAS) in vegetable samples; and (c) determine the Ni concentration in vegetables consumed in the cities of Lorena and Taubaté in the Vale do Paraíba, State of São Paulo, Brazil. By means of the analytical technique for determination by ETAAS or FAAS, the results were validated by the test of analyte addition and recovery. The most viable method tested for quantification of this element was HClO4-HNO3 wet digestion. All samples but carrot tissue collected in Lorena contained Ni levels above those permitted by the Brazilian Ministry of Health. The most disturbing results, requiring more detailed studies, were the Ni concentrations measured in carrot samples from Taubaté, where levels were five times higher than permitted by Brazilian regulations.

  10. Evaluation of Optimized WRF Precipitation Forecast over a Complex Topography Region during Flood Season

    Directory of Open Access Journals (Sweden)

    Yuan Li

    2016-11-01

    Full Text Available In recent years, the Weather Research and Forecast (WRF) model has been utilized to generate quantitative precipitation forecasts with higher spatial and temporal resolutions. However, factors including horizontal resolution, domain size, and the physical parameterization scheme have a strong impact on the dynamic downscaling ability of the WRF model. In this study, the influence of these factors has been analyzed in precipitation forecasting for the Xijiang Basin, southern China—a region with complex topography. The results indicate that higher horizontal resolutions always result in higher Critical Success Indexes (CSI), but higher biases as well. Meanwhile, the precipitation forecast skills are also influenced by the combination of microphysics parameterization scheme and cumulus convective parameterization scheme. On the basis of these results, an optimized configuration of the WRF model is built in which the horizontal resolution is 10 km, the microphysics parameterization is the Lin scheme, and the cumulus convective parameterization is the Betts–Miller–Janjic scheme. This configuration is then evaluated by simulating the daily weather during the 2013–2014 flood season. The high Critical Success Index scores and low biases at various thresholds and lead times confirm the high accuracy of the optimized WRF model configuration for the Xijiang Basin. However, the performance of the WRF model varies across the different sub-basins due to the complexity of the mesoscale convective systems (MCSs) over this region.
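
    The verification scores mentioned above follow directly from the contingency table of forecast versus observed precipitation exceedances at a threshold. The sketch below computes the Critical Success Index and the frequency bias on synthetic grids standing in for WRF output and observations.

    import numpy as np

    rng = np.random.default_rng(11)
    forecast = rng.gamma(shape=2.0, scale=5.0, size=(100, 100))     # mm/day, synthetic
    observed = forecast + rng.normal(0.0, 4.0, size=(100, 100))     # imperfect "truth"

    def csi_and_bias(fcst, obs, threshold):
        f, o = fcst >= threshold, obs >= threshold
        hits = np.sum(f & o)
        misses = np.sum(~f & o)
        false_alarms = np.sum(f & ~o)
        csi = hits / (hits + misses + false_alarms)
        bias = (hits + false_alarms) / (hits + misses)              # frequency bias
        return csi, bias

    for thr in (1.0, 10.0, 25.0):
        csi, bias = csi_and_bias(forecast, observed, thr)
        print(f"threshold {thr:5.1f} mm/day  CSI={csi:.3f}  bias={bias:.3f}")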

  11. Optimization and evaluation of clarithromycin floating tablets using experimental mixture design.

    Science.gov (United States)

    Uğurlu, Timucin; Karaçiçek, Uğur; Rayaman, Erkan

    2014-01-01

    The purpose of the study was to prepare and evaluate clarithromycin (CLA) floating tablets using an experimental mixture design for the treatment of Helicobacter pylori, the benefit being provided by prolonged gastric residence time and a controlled plasma level. Ten different formulations were generated based on different molecular weights of hypromellose (HPMC K100, K4M, K15M) by using a simplex lattice design (a sub-class of mixture design) with Minitab 16 software. Sodium bicarbonate and anhydrous citric acid were used as gas generating agents. Tablets were prepared by the wet granulation technique. All of the process variables were fixed. Results for cumulative drug release at the 8th hour (CDR 8th) were statistically analyzed to obtain the optimized formulation (OF). The optimized formulation, which gave a floating lag time lower than 15 s and a total floating time of more than 10 h, was analyzed and compared with the target for CDR 8th (80%). Good agreement was shown between predicted and actual values of CDR 8th, with a variation lower than 1%. The activity of clarithromycin contained in the optimized formula against H. pylori was quantified using a well diffusion agar assay. Diameters of inhibition zones vs. log10 clarithromycin concentrations were plotted in order to obtain a standard curve and determine clarithromycin activity.
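
    The standard-curve step is a simple linear fit of inhibition-zone diameter against log10 concentration, from which unknown activities are read off; the numbers below are made up purely to show the calculation.

    import numpy as np

    conc = np.array([2.0, 4.0, 8.0, 16.0, 32.0])      # standard concentrations, µg/mL (made up)
    zone = np.array([14.1, 17.0, 19.8, 22.9, 25.7])   # inhibition zone diameters, mm (made up)

    slope, intercept = np.polyfit(np.log10(conc), zone, 1)

    # Read the activity of a sample (e.g. drug released from the optimized tablet) off the curve.
    zone_sample = 21.5
    estimated_conc = 10 ** ((zone_sample - intercept) / slope)
    print(f"estimated clarithromycin activity ~ {estimated_conc:.1f} µg/mL")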

  12. Performance evaluation and optimization of fluidized bed boiler in ethanol plant using irreversibility analysis

    Directory of Open Access Journals (Sweden)

    Nugroho Agung Pambudi

    2017-09-01

    Full Text Available This research aims to evaluate the performance of a fluidized bed boiler in an ethanol production plant through exergy and irreversibility analysis. The study also includes the optimization of the pre-heater and the deaerator in order to improve the system efficiency. Operational data from the ethanol production plant were collected between 2015 and early 2016. The total exergy derived from the fuel was determined to be 7783 kJ/s, while the exergy efficiency of the system was found to be 26.19%, with 2214 kJ/s used in steam production and 71.55% lost to component irreversibility and waste heat from the pre-heater. The exergy efficiencies of individual components of the system such as the boiler, deaerator, and pre-heater were found to be 25.82%, 40.13%, and 2.617%, respectively, with the pre-heater having the lowest efficiency. Thus, the pre-heater has the highest potential to significantly improve the efficiency of the boiler system. The optimization of the pre-heater shows that a rise in temperature at the outlet of the pre-heater positively affects the exergy efficiency of the deaerator.

  13. hdm: High-dimensional metrics

    OpenAIRE

    Chernozhukov, Victor; Hansen, Christian; Spindler, Martin

    2016-01-01

    In this article the package High-dimensional Metrics (hdm) is introduced. It is a collection of statistical methods for estimation and quantification of uncertainty in high-dimensional approximately sparse models. It focuses on providing confidence intervals and significance testing for (possibly many) low-dimensional subcomponents of the high-dimensional parameter vector. Efficient estimators and uniformly valid confidence intervals for regression coefficients on target variables (e...

  14. Sensory Metrics of Neuromechanical Trust.

    Science.gov (United States)

    Softky, William; Benford, Criscillia

    2017-09-01

    Today digital sources supply a historically unprecedented component of human sensorimotor data, the consumption of which is correlated with poorly understood maladies such as Internet addiction disorder and Internet gaming disorder. Because both natural and digital sensorimotor data share common mathematical descriptions, one can quantify our informational sensorimotor needs using the signal processing metrics of entropy, noise, dimensionality, continuity, latency, and bandwidth. Such metrics describe in neutral terms the informational diet human brains require to self-calibrate, allowing individuals to maintain trusting relationships. With these metrics, we define the trust humans experience using the mathematical language of computational models, that is, as a primitive statistical algorithm processing finely grained sensorimotor data from neuromechanical interaction. This definition of neuromechanical trust implies that artificial sensorimotor inputs and interactions that attract low-level attention through frequent discontinuities and enhanced coherence will decalibrate a brain's representation of its world over the long term by violating the implicit statistical contract for which self-calibration evolved. Our hypersimplified mathematical understanding of human sensorimotor processing as multiscale, continuous-time vibratory interaction allows equally broad-brush descriptions of failure modes and solutions. For example, we model addiction in general as the result of homeostatic regulation gone awry in novel environments (sign reversal) and digital dependency as a sub-case in which the decalibration caused by digital sensorimotor data spurs yet more consumption of them. We predict that institutions can use these sensorimotor metrics to quantify media richness to improve employee well-being; that dyads and family-size groups will bond and heal best through low-latency, high-resolution multisensory interaction such as shared meals and reciprocated touch; and

  15. Metric reconstruction from Weyl scalars

    Energy Technology Data Exchange (ETDEWEB)

    Whiting, Bernard F; Price, Larry R [Department of Physics, PO Box 118440, University of Florida, Gainesville, FL 32611 (United States)

    2005-08-07

    The Kerr geometry has remained an elusive world in which to explore physics and delve into the more esoteric implications of general relativity. Following the discovery, by Kerr in 1963, of the metric for a rotating black hole, the most major advance has been an understanding of its Weyl curvature perturbations based on Teukolsky's discovery of separable wave equations some ten years later. In the current research climate, where experiments across the globe are preparing for the first detection of gravitational waves, a more complete understanding than concerns just the Weyl curvature is now called for. To understand precisely how comparatively small masses move in response to the gravitational waves they emit, a formalism has been developed based on a description of the whole spacetime metric perturbation in the neighbourhood of the emission region. Presently, such a description is not available for the Kerr geometry. While there does exist a prescription for obtaining metric perturbations once curvature perturbations are known, it has become apparent that there are gaps in that formalism which are still waiting to be filled. The most serious gaps include gauge inflexibility, the inability to include sources-which are essential when the emitting masses are considered-and the failure to describe the l = 0 and 1 perturbation properties. Among these latter properties of the perturbed spacetime, arising from a point mass in orbit, are the perturbed mass and axial component of angular momentum, as well as the very elusive Carter constant for non-axial angular momentum. A status report is given on recent work which begins to repair these deficiencies in our current incomplete description of Kerr metric perturbations.

  16. Metric reconstruction from Weyl scalars

    International Nuclear Information System (INIS)

    Whiting, Bernard F; Price, Larry R

    2005-01-01

    The Kerr geometry has remained an elusive world in which to explore physics and delve into the more esoteric implications of general relativity. Following the discovery, by Kerr in 1963, of the metric for a rotating black hole, the most major advance has been an understanding of its Weyl curvature perturbations based on Teukolsky's discovery of separable wave equations some ten years later. In the current research climate, where experiments across the globe are preparing for the first detection of gravitational waves, a more complete understanding than concerns just the Weyl curvature is now called for. To understand precisely how comparatively small masses move in response to the gravitational waves they emit, a formalism has been developed based on a description of the whole spacetime metric perturbation in the neighbourhood of the emission region. Presently, such a description is not available for the Kerr geometry. While there does exist a prescription for obtaining metric perturbations once curvature perturbations are known, it has become apparent that there are gaps in that formalism which are still waiting to be filled. The most serious gaps include gauge inflexibility, the inability to include sources-which are essential when the emitting masses are considered-and the failure to describe the l = 0 and 1 perturbation properties. Among these latter properties of the perturbed spacetime, arising from a point mass in orbit, are the perturbed mass and axial component of angular momentum, as well as the very elusive Carter constant for non-axial angular momentum. A status report is given on recent work which begins to repair these deficiencies in our current incomplete description of Kerr metric perturbations

  17. Reactor coolant pump service life evaluation for current life cycle optimization and license renewal

    International Nuclear Information System (INIS)

    Doroshuk, B.W.; Berto, D.S.; Robles, M.

    1990-01-01

    This paper reports that, as part of its plant life cycle management and license renewal program, Baltimore Gas and Electric Company (BG and E) has completed a service life evaluation of its reactor coolant pumps, funded jointly by EPRI and performed by ABB Combustion Engineering Nuclear Power. Two of the goals of the BG and E plant life cycle management and license renewal program, and of this evaluation, are to identify actions that would optimize current plant operation and to ensure that license renewal remains a viable option. The reactor coolant pumps (RCPs) at BG and E's Calvert Cliffs Units 1 and 2 are Byron Jackson pumps with a diffuser and a single suction. This pump design is also used in many other nuclear plants. The RCP service life evaluation assessed the effect of all plausible age-related degradation mechanisms (ARDMs) on the RCP components. Cyclic fatigue and thermal embrittlement were two ARDMs identified as having a high potential to limit the service life of the pump case. The pump case is a primary pressure boundary component; hence, ensuring its continued structural integrity is important.

  18. Optimal evaluation of infectious medical waste disposal companies using the fuzzy analytic hierarchy process

    International Nuclear Information System (INIS)

    Ho, Chao Chung

    2011-01-01

    Ever since Taiwan's National Health Insurance implemented the diagnosis-related groups payment system in January 2010, hospital income has declined. Therefore, to meet their medical waste disposal needs, hospitals seek suppliers that provide high-quality services at a low cost. The enactment of the Waste Disposal Act in 1974 had facilitated some improvement in the management of waste disposal. However, since the implementation of the National Health Insurance program, the amount of medical waste from disposable medical products has been increasing. Further, of all the hazardous waste types, the amount of infectious medical waste has increased at the fastest rate. This is because of the increase in the number of items considered as infectious waste by the Environmental Protection Administration. The present study used two important findings from previous studies to determine the critical evaluation criteria for selecting infectious medical waste disposal firms. It employed the fuzzy analytic hierarchy process to set the objective weights of the evaluation criteria and select the optimal infectious medical waste disposal firm through calculation and sorting. The aim was to propose a method of evaluation with which medical and health care institutions could objectively and systematically choose appropriate infectious medical waste disposal firms.
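
    The core computational step the abstract refers to, deriving objective criterion weights from fuzzy pairwise comparisons, can be illustrated with a minimal sketch. The sketch below uses Buckley's geometric-mean method with centroid defuzzification; the three criteria and all judgement values are hypothetical illustrations, not taken from the study.

```python
# Minimal sketch of fuzzy-AHP criterion weighting (geometric-mean method with
# centroid defuzzification). Criteria and judgements are hypothetical.

def f_mul(a, b):   # multiply two triangular fuzzy numbers (l, m, u)
    return (a[0] * b[0], a[1] * b[1], a[2] * b[2])

def f_pow(a, p):   # raise a triangular fuzzy number to a scalar power
    return (a[0] ** p, a[1] ** p, a[2] ** p)

# Pairwise comparisons among three hypothetical criteria:
# cost, service quality, regulatory compliance.
one = (1.0, 1.0, 1.0)
M = [
    [one,                 (2.0, 3.0, 4.0),  (0.5, 1.0, 1.5)],
    [(0.25, 1 / 3, 0.5),  one,              (1 / 3, 0.5, 1.0)],
    [(2 / 3, 1.0, 2.0),   (1.0, 2.0, 3.0),  one],
]

n = len(M)
# Fuzzy geometric mean of each row.
geo = []
for row in M:
    prod = (1.0, 1.0, 1.0)
    for x in row:
        prod = f_mul(prod, x)
    geo.append(f_pow(prod, 1.0 / n))

# Normalise: divide each row mean by the fuzzy sum of all row means
# (dividing by a fuzzy number reverses the (l, m, u) order).
total = tuple(sum(g[i] for g in geo) for i in range(3))
weights_fuzzy = [(g[0] / total[2], g[1] / total[1], g[2] / total[0]) for g in geo]

# Centroid defuzzification, then renormalise so the weights sum to 1.
crisp = [sum(w) / 3.0 for w in weights_fuzzy]
weights = [c / sum(crisp) for c in crisp]
print(dict(zip(["cost", "service quality", "compliance"], weights)))
```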

  19. Sustainability Metrics: The San Luis Basin Project

    Science.gov (United States)

    Sustainability is about promoting humanly desirable dynamic regimes of the environment. Metrics: ecological footprint, net regional product, exergy, emergy, and Fisher Information. Adaptive management: (1) metrics assess problem, (2) specific problem identified, and (3) managemen...

  20. Evaluating the Stability of Feature Selectors that Optimize Feature Subset Cardinality

    Czech Academy of Sciences Publication Activity Database

    Somol, Petr; Novovičová, Jana

    2008-01-01

    Vol. 2008, No. 5342 (2008), pp. 956-966. ISSN 0302-9743. [Joint IAPR International Workshops SSPR 2008 and SPR 2008. Orlando, 04.12.2008-06.12.2008] R&D Projects: GA AV ČR 1ET400750407; GA MŠk 1M0572; GA ČR GA102/07/1594. Grant - others: GA MŠk(CZ) 2C06019. Institutional research plan: CEZ:AV0Z10750506. Keywords: Feature selection * stability * relative weighted consistency measure * sequential search * floating search. Subject RIV: IN - Informatics, Computer Science. http://library.utia.cas.cz/separaty/2008/RO/somol-evaluating the stability of feature selectors that optimize feature subset cardinality.pdf

  1. OPTIMIZATION OF CELL DISRUPTION IN RAPHIDOCELIS SUBCAPITATA AND CHLORELLA VULGARIS FOR BIOMARKER EVALUATION

    Directory of Open Access Journals (Sweden)

    Adeolu Aderemi

    2015-06-01

    Raphidocelis subcapitata and Chlorella vulgaris are bioassay microalgae with rigid cellulosic cell walls which can hinder the release of the intracellular proteins often studied as toxicity biomarkers. Since cell disruption is necessary for recovering intracellular biomolecules from these organisms, this study investigated the efficiency of an ultrasonication bath, an ultrasonication probe, a vortexer, and a bead mill in disintegrating the microalgae for anti-oxidative enzyme extraction. The extent of cell disruption was evaluated and quantified using bright-field microscopy; disrupted algae appeared as ghosts. The greatest disintegration of the microalgae (83-99.6%) was achieved using the bead mill with 0.42-0.6 mm glass beads, while the other methods induced little or no disruption. The degree of cell disruption using the bead mill increased with exposure time, beads-to-solution ratio and agitation speed, while larger beads caused less disruption. The findings reveal that bead milling, with specific parameters optimized, is one of the most effective methods of disintegrating these robust algal cells.

  2. Evaluation of Shielding Wall Optimization in Lead Slowing Down Spectrometer System

    Energy Technology Data Exchange (ETDEWEB)

    Jeon, Ju Young; Kim, Jeong Dong; Lee, Yong Deok [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2015-10-15

    A Lead Slowing Down Spectrometer (LSDS) system is a nondestructive technology for analyzing the fissile isotope content of spent fuel and pyroprocessed material directly and in real time. High-intensity neutrons and gamma rays are generated from the nuclear material (pyroprocessed material, spent nuclear fuel), from the electron beam-target reaction, and from the fission of fissile material. Therefore, a shielding analysis of the LSDS system should be carried out. In this study, borax, B4C, Li2CO3 and resin were chosen for the shielding analysis. The radiation dose limit (<0.1 μSv/hr) was adopted conservatively at the outer wall surface. The covering material could reduce the required concrete wall thickness by 5 cm to 15 cm. The optimized shielding wall evaluation will be used as important data for the design of a future LSDS facility and for the assessment of the shielding door.

  3. Evaluation of optimal silver amount for the removal of methyl iodide on silver-impregnated adsorbents

    International Nuclear Information System (INIS)

    Park, G.I.; Cho, I.H.; Kim, J.H.; Oh, W.Z.

    2001-01-01

    The adsorption characteristics of methyl iodide generated from a simulated off-gas stream on various adsorbents, such as silver-impregnated zeolite (AgX), zeocarbon and activated carbon, were investigated. An extensive evaluation was made of the optimal silver impregnation amount for the removal of methyl iodide at temperatures up to 300 deg. C. The adsorption efficiency for methyl iodide on silver-impregnated adsorbents depends strongly on the impregnation amount and the process temperature. A quantitative comparison of the adsorption efficiencies of the three adsorbents in a fixed bed was carried out. The influence of temperature, methyl iodide concentration and silver impregnation amount on the adsorption efficiency is closely related to the pore characteristics of the adsorbents. The results show that the effective impregnation ratio was about 10 wt%, based on the degree of silver utilization for the removal of methyl iodide. The practical applicability of silver-impregnated zeolite for the removal of radioiodine generated from the DUPIC process was consequently proposed. (author)

  4. Optimizing monoscopic kV fluoro acquisition for prostate intrafraction motion evaluation

    International Nuclear Information System (INIS)

    Adamson, Justus; Wu Qiuwen

    2009-01-01

    Monoscopic kV imaging during radiotherapy has recently been implemented for prostate intrafraction motion evaluation. However, the accuracy of 3D localization techniques based on monoscopic imaging of the prostate, and the effect of acquisition parameters on 3D accuracy, have not been studied in detail, and imaging dose remains a concern. In this paper, we investigate methods to optimize the kV acquisition parameters and imaging protocol to achieve improved 3D localization and 2D image registration accuracy for minimal imaging dose. Prostate motion during radiotherapy was simulated using existing cine-MRI measurements and was used to investigate the accuracy of various 3D localization techniques and the effect of the kV acquisition protocol. We also investigated the relationship between mAs and the accuracy of 2D image registration for localization of fiducial markers, and we measured imaging dose for a 30 cm diameter phantom to evaluate the dose necessary to achieve acceptable image registration accuracy. Simulations showed that the error introduced by assuming the shortest path when localizing the prostate in 3D from monoscopic imaging during a typical IMRT fraction will be less than ∼1.5 mm for 95% of localizations, and will also depend on the prostate motion distribution, treatment duration, and image acquisition and treatment protocol. Most of the uncertainty cannot be reduced by higher imaging frequency or by acquiring during gantry rotation between beams. The measured maximum surface dose to the cylindrical phantom from monoscopic kV intrafraction acquisitions varied between 0.4 and 5.5 mGy, depending on the acquisition protocol, and was lower than the dose required for CBCT (21.1 mGy). Imaging dose can be lowered by ∼15-40% when mAs is optimized with acquisition angle. Images acquired during MV beam delivery require increased mAs to obtain the same level of registration accuracy, with mAs per registration increasing roughly linearly with field size and dose rate.
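
    The "shortest path" assumption mentioned above can be illustrated with a minimal simulation sketch: sample 3D displacements, note that a single projection only resolves motion perpendicular to the beam axis, and take the unresolved along-beam component as the residual 3D error. The Gaussian motion model, its standard deviations and the gantry angle are assumptions for illustration; the study itself used cine-MRI motion traces.

```python
# Minimal sketch of the shortest-path 3D-localization error under monoscopic
# kV imaging. The motion model and geometry are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
# Hypothetical prostate displacement (mm): larger in AP/SI than in LR.
disp = rng.normal(0.0, [0.8, 1.5, 1.5], size=(n, 3))   # (LR, AP, SI)

theta = np.deg2rad(30.0)                                # kV source gantry angle
beam = np.array([np.sin(theta), np.cos(theta), 0.0])    # unresolved direction

# A single projection resolves motion perpendicular to the beam axis; the
# shortest-path estimate sets the along-beam component to zero, so the
# residual 3D error is exactly that unresolved component.
err = np.abs(disp @ beam)
print(f"95th-percentile 3D error: {np.percentile(err, 95):.2f} mm")
```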

  5. Review of tri-generation technologies: Design evaluation, optimization, decision-making, and selection approach

    International Nuclear Information System (INIS)

    Al Moussawi, Houssein; Fardoun, Farouk; Louahlia-Gualous, Hasna

    2016-01-01

    Highlights: • Trigeneration technologies classified and reviewed according to prime movers. • Relevant heat recovery equipment discussed with thermal energy storage. • Trigeneration evaluated based on energy, exergy, economy, environment criteria. • Design, optimization, and decision-making methods classified and presented. • System selection suggested according to user preferences. - Abstract: Electricity, heating, and cooling are the three main components constituting the tripod of energy consumption in residential, commercial, and public buildings all around the world. Their separate generation causes higher fuel consumption, at a time when energy demands and fuel costs are continuously rising. Combined cooling, heating, and power (CCHP), or trigeneration, could be a solution to this challenge, yielding an efficient, reliable, flexible, competitive, and less polluting alternative. A variety of trigeneration technologies are available, and their proper choice is influenced by the conditions and preferences of the energy system employed. In this paper, different types of trigeneration systems are classified according to the prime mover, size and energy sequence usage. A leveled selection procedure is subsequently presented in the consecutive sections. The first level contains the applied prime mover technologies, which are considered to be the heart of any CCHP system. The second level comprises the heat recovery equipment (heating and cooling), whose suitable selection should be compatible with the prime mover used. The third level includes the thermal energy storage system and heat transfer fluid to be employed. For each section of the paper, a survey of conducted studies with CHP/CCHP implementation is presented. A comprehensive table of evaluation criteria for such systems based on energy, exergy, economy, and environment measures is compiled, along with a survey of the methods used in their design, optimization, and decision-making. Moreover, a classification
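
    A minimal sketch of the kind of multi-criteria ranking such a selection procedure ends with, a weighted score over the energy, exergy, economy and environment criteria, is given below; the candidate systems, scores and weights are hypothetical illustrations, not values from the review.

```python
# Minimal sketch of a weighted multi-criteria ranking over the four "E"
# criteria listed in the review (energy, exergy, economy, environment).
# Candidate systems, normalised scores and weights are hypothetical.
candidates = {
    # name: (energy eff., exergy eff., cost score, emissions score) in [0, 1]
    "gas turbine CCHP": (0.80, 0.45, 0.60, 0.55),
    "ICE CCHP":         (0.85, 0.40, 0.70, 0.45),
    "fuel cell CCHP":   (0.75, 0.55, 0.35, 0.80),
}
weights = (0.3, 0.2, 0.3, 0.2)   # user preference over the four criteria

ranked = sorted(
    ((sum(w * s for w, s in zip(weights, scores)), name)
     for name, scores in candidates.items()),
    reverse=True,
)
for score, name in ranked:
    print(f"{name}: {score:.2f}")
```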

  6. Optimizing and benchmarking de novo transcriptome sequencing: from library preparation to assembly evaluation.

    Science.gov (United States)

    Hara, Yuichiro; Tatsumi, Kaori; Yoshida, Michio; Kajikawa, Eriko; Kiyonari, Hiroshi; Kuraku, Shigehiro

    2015-11-18

    RNA-seq enables gene expression profiling in selected spatiotemporal windows and yields massive sequence information with relatively low cost and time investment, even for non-model species. However, there remains considerable room for optimizing its workflow in order to take full advantage of continuously developing sequencing capacity. Transcriptome sequencing for three embryonic stages of the Madagascar ground gecko (Paroedura picta) was performed with the Illumina platform. The output reads were assembled de novo for reconstructing transcript sequences. In order to evaluate the completeness of transcriptome assemblies, we prepared a reference gene set consisting of vertebrate one-to-one orthologs. To take advantage of increased read lengths of >150 nt, we demonstrated a shortened RNA fragmentation time, which resulted in a dramatic shift of the insert size distribution. To evaluate the products of multiple de novo assembly runs incorporating reads with different RNA sources, read lengths, and insert sizes, we introduce a new reference gene set, core vertebrate genes (CVG), consisting of 233 genes that are shared as one-to-one orthologs by all vertebrate genomes examined (29 species). The completeness assessment performed by the computational pipelines CEGMA and BUSCO referring to CVG demonstrated higher accuracy and resolution than with the gene set previously established for this purpose. As a result of the assessment with CVG, we have derived the most comprehensive transcript sequence set of the Madagascar ground gecko by assembling individual libraries and then clustering the assembled sequences based on their overall similarities. Our results provide several insights into optimizing the de novo RNA-seq workflow, including the coordination between library insert size and read length, which manifested in improved connectivity of the assemblies. The approach and assembly assessment with CVG demonstrated here would be applicable to transcriptome analysis of other species as
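
    A minimal sketch of the CVG-style completeness assessment described above is given below: score an assembly by the fraction of the 233 core vertebrate genes recovered completely, partially, or not at all. The hit table is a hypothetical stand-in for CEGMA/BUSCO output, and the 90% threshold for calling a gene "complete" is an assumption.

```python
# Minimal sketch of a CVG-style completeness score: the fraction of the 233
# core vertebrate genes recovered in an assembly. The hit dictionary is a
# hypothetical stand-in for CEGMA/BUSCO output.
def cvg_completeness(hits, n_core=233, full_fraction=0.9):
    """hits maps a core-gene ID to the fraction of its length recovered."""
    full = sum(1 for f in hits.values() if f >= full_fraction)
    partial = sum(1 for f in hits.values() if 0 < f < full_fraction)
    return {
        "complete_%": 100.0 * full / n_core,
        "partial_%": 100.0 * partial / n_core,
        "missing_%": 100.0 * (n_core - full - partial) / n_core,
    }

# Hypothetical assembly: 200 genes nearly complete, 20 only partially hit.
example_hits = {f"CVG{i:03d}": (0.95 if i < 200 else 0.4) for i in range(220)}
print(cvg_completeness(example_hits))
```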

  7. Crowdsourcing metrics of digital collections

    Directory of Open Access Journals (Sweden)

    Tuula Pääkkönen

    2015-12-01

    In the National Library of Finland (NLF) there are millions of digitized newspaper and journal pages, which are openly available via the public website http://digi.kansalliskirjasto.fi. To serve users better, last year the front end was completely overhauled, with its main aim being crowdsourcing features, e.g. giving end-users the opportunity to create digital clippings and a personal scrapbook from the digital collections. But how can you know whether crowdsourcing has had an impact? How much have the crowdsourcing functionalities been used so far? Did crowdsourcing work? In this paper the statistics and metrics of a recent crowdsourcing effort are analysed across the different digitized material types (newspapers, journals, ephemera). The subjects, categories and keywords given by the users are analysed to see which topics are the most appealing. Some notable public uses of the crowdsourced article clippings are highlighted. These metrics give us indications of how the end-users, based on their own interests, are investigating and using the digital collections. Therefore, the suggested metrics illustrate the versatility of the information needs of the users, varying from citizen science to research purposes. By analysing the user patterns, we can respond to the new needs of the users by making minor changes to accommodate the most active participants, while still making the service more approachable for those who are trying out the functionalities for the first time. Participation in the clippings and annotations can enrich the materials in unexpected ways and can possibly pave the way for opportunities to use crowdsourcing more in research contexts as well. This creates more opportunities for the goals of open science, since source data becomes available, making it possible for researchers to reach out to the general public for help. In the long term, utilizing, for example, text mining methods can allow these different end-user segments to

  8. A family of metric gravities

    Science.gov (United States)

    Shuler, Robert

    2018-04-01

    The goal of this paper is to take a completely fresh approach to metric gravity, in which the metric principle is strictly adhered to but its properties in local space-time are derived from conservation principles, not inferred from a global field equation. The global field strength variation then gains some flexibility, but only in the regime of very strong fields (2nd-order terms) whose measurement is now being contemplated. So doing provides a family of similar gravities, differing only in strong fields, which could be developed into meaningful verification targets for strong fields after the manner in which far-field variations were used in the 20th century. General Relativity (GR) is shown to be a member of the family and this is demonstrated by deriving the Schwarzschild metric exactly from a suitable field strength assumption. The method of doing so is interesting in itself because it involves only one differential equation rather than the usual four. Exact static symmetric field solutions are also given for one pedagogical alternative based on potential, and one theoretical alternative based on inertia, and the prospects of experimentally differentiating these are analyzed. Whether the method overturns the conventional wisdom that GR is the only metric theory of gravity and that alternatives must introduce additional interactions and fields is somewhat semantical, depending on whether one views the field strength assumption as a field and whether the assumption that produces GR is considered unique in some way. It is of course possible to have other fields, and the local space-time principle can be applied to field gravities which usually are weak-field approximations having only time dilation, giving them the spatial factor and promoting them to full metric theories. Though usually pedagogical, some of them are interesting from a quantum gravity perspective. Cases are noted where mass measurement errors, or distributions of dark matter, can cause one

  9. Hybrid metric-Palatini stars

    Science.gov (United States)

    Danilǎ, Bogdan; Harko, Tiberiu; Lobo, Francisco S. N.; Mak, M. K.

    2017-02-01

    We consider the internal structure and the physical properties of specific classes of neutron, quark and Bose-Einstein condensate stars in the recently proposed hybrid metric-Palatini gravity theory, which is a combination of the metric and Palatini f(R) formalisms. It turns out that the theory is very successful in accounting for the observed phenomenology, since it unifies local constraints at the Solar System level and the late-time cosmic acceleration, even if the scalar field is very light. In this paper, we derive the equilibrium equations for a spherically symmetric configuration (mass continuity and Tolman-Oppenheimer-Volkoff) in the framework of the scalar-tensor representation of the hybrid metric-Palatini theory, and we investigate their solutions numerically for different equations of state of neutron and quark matter, by adopting for the scalar field potential a Higgs-type form. It turns out that the scalar-tensor definition of the potential can be represented as a Clairaut differential equation, and provides an explicit form for f(R) given by f(R) ~ R + Λ_eff, where Λ_eff is an effective cosmological constant. Furthermore, stellar models described by the stiff fluid, radiation-like, bag model and Bose-Einstein condensate equations of state are explicitly constructed in both general relativity and hybrid metric-Palatini gravity, thus allowing an in-depth comparison between the predictions of these two gravitational theories. As a general result it turns out that, for all the considered equations of state, hybrid gravity stars are more massive than their general relativistic counterparts. Furthermore, two classes of stellar models, corresponding to two particular choices of the functional form of the scalar field (constant value and logarithmic form, respectively), are also investigated. Interestingly enough, in the case of a constant scalar field the equation of state of the matter takes the form of the bag model equation of state describing

  10. The human error rate assessment and optimizing system HEROS - a new procedure for evaluating and optimizing the man-machine interface in PSA

    International Nuclear Information System (INIS)

    Richei, A.; Hauptmanns, U.; Unger, H.

    2001-01-01

    A new procedure allowing the probabilistic evaluation and optimization of the man-machine system is presented. This procedure and the resulting expert system HEROS, an acronym for Human Error Rate Assessment and Optimizing System, are based on fuzzy set theory. Most of the well-known procedures employed for the probabilistic evaluation of human factors involve the use of vague linguistic statements on performance shaping factors to select and to modify basic human error probabilities from the associated databases. This implies a large portion of subjectivity. Vague statements are expressed here in terms of fuzzy numbers or intervals, which allow mathematical operations to be performed on them. A model of the man-machine system is the basis of the procedure. A fuzzy rule-based expert system was derived from ergonomic and psychological studies. Hence, it does not rely on a database whose transferability to situations different from its origin is questionable. In this way, subjective elements are eliminated to a large extent. HEROS facilitates the importance analysis for the evaluation of human factors, which is necessary for optimizing the man-machine system. HEROS is applied to the analysis of a simple diagnosis task of the operating personnel in a nuclear power plant.
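
    A minimal sketch of the kind of fuzzy-number arithmetic such an approach relies on is given below: linguistic ratings of performance shaping factors are represented as triangular fuzzy numbers, combined, and defuzzified into a crisp modifier on a basic human error probability. The linguistic scale, the multiplicative combination and the example values are assumptions for illustration, not HEROS's actual rule base.

```python
# Minimal sketch of combining vague linguistic ratings via triangular fuzzy
# numbers and defuzzifying to a crisp human-error-probability modifier.
# The scale and the multiplicative combination are illustrative assumptions.
LINGUISTIC = {                      # (l, m, u) multipliers on a basic HEP
    "favourable":   (0.3, 0.5, 0.8),
    "neutral":      (0.8, 1.0, 1.2),
    "unfavourable": (1.5, 2.0, 3.0),
}

def f_mul(a, b):                    # product of two triangular fuzzy numbers
    return tuple(x * y for x, y in zip(a, b))

def centroid(f):                    # defuzzify a triangular fuzzy number
    return sum(f) / 3.0

def modified_hep(basic_hep, ratings):
    factor = (1.0, 1.0, 1.0)
    for r in ratings:
        factor = f_mul(factor, LINGUISTIC[r])
    return basic_hep * centroid(factor)

# Example: a diagnosis task rated on three performance shaping factors.
print(modified_hep(1e-3, ["neutral", "unfavourable", "favourable"]))
```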

  11. Context-dependent ATC complexity metric

    NARCIS (Netherlands)

    Mercado Velasco, G.A.; Borst, C.

    2015-01-01

    Several studies have investigated Air Traffic Control (ATC) complexity metrics in a search for a metric that could best capture workload. These studies have shown how daunting the search for a universal workload metric (one that could be applied in different contexts: sectors, traffic patterns,

  12. Properties of C-metric spaces

    Science.gov (United States)

    Croitoru, Anca; Apreutesei, Gabriela; Mastorakis, Nikos E.

    2017-09-01

    The subject of this paper belongs to the theory of approximate metrics [23]. An approximate metric on X is a real-valued function defined on X × X that satisfies only a part of the metric axioms. In a recent paper [23], we introduced a new type of approximate metric, named C-metric, which is a function satisfying only two metric axioms: symmetry and the triangle inequality. The remarkable fact in a C-metric space is that a topological structure induced by the C-metric can be defined. The innovative idea of this paper is that we obtain some convergence properties of a C-metric space in the absence of a metric. In this paper we investigate C-metric spaces. The paper is divided into four sections. Section 1 is the Introduction. In Section 2 we recall some concepts and preliminary results. In Section 3 we present some properties of C-metric spaces, such as convergence properties, a canonical decomposition and a C-fixed point theorem. Finally, in Section 4 some conclusions are highlighted.
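
    For reference, the two axioms retained by a C-metric, as described above, can be written as follows (a sketch of the definition; the remaining standard metric axioms are not required):

```latex
% C-metric d : X \times X \to \mathbb{R}; only these two axioms are assumed.
\begin{align*}
  &\text{(symmetry)}            && d(x,y) = d(y,x)            && \forall\, x, y \in X,\\
  &\text{(triangle inequality)} && d(x,z) \le d(x,y) + d(y,z) && \forall\, x, y, z \in X.
\end{align*}
```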

  13. Robust Design Impact Metrics: Measuring the effect of implementing and using Robust Design

    DEFF Research Database (Denmark)

    Ebro, Martin; Olesen, Jesper; Howard, Thomas J.

    2014-01-01

    Measuring the performance of an organisation’s product development process can be challenging due to the limited use of metrics in R&D. An organisation considering whether to use Robust Design as an integrated part of their development process may find it difficult to define whether it is relevant, and afterwards measure the effect of having implemented it. This publication identifies and evaluates Robust Design-related metrics and finds that 2 metrics are especially useful: 1) Relative amount of R&D Resources spent after Design Verification and 2) Number of ‘change notes’ after Design Verification. The metrics have been applied in a case company to test the assumptions made during the evaluation. It is concluded that the metrics are useful and relevant, but further work is necessary to make a proper overview and categorisation of different types of robustness related metrics.

  14. [Clinical trial data management and quality metrics system].

    Science.gov (United States)

    Chen, Zhao-hua; Huang, Qin; Deng, Ya-zhong; Zhang, Yue; Xu, Yu; Yu, Hao; Liu, Zong-fan

    2015-11-01

    A data quality management system is essential to ensure accurate, complete, consistent, and reliable data collection in clinical research. This paper is devoted to various choices of data quality metrics. They are categorized by study status, e.g. study start-up, conduct, and close-out. In each category, metrics for different purposes are listed according to ALCOA+ principles such as completeness, accuracy, timeliness, traceability, etc. Some frequently used general quality metrics are also introduced. This paper provides as much detailed information as possible for each metric, giving its definition, purpose, evaluation, referenced benchmark, and recommended targets in favor of real practice. It is important that sponsors and data management service providers establish a robust, integrated clinical trial data quality management system to ensure sustainably high quality of clinical trial deliverables. It will also support enterprise-level data evaluation and benchmarking of data quality across projects, sponsors, and data management service providers by using objective metrics from real clinical trials. We hope this will be a significant input to accelerate the improvement of clinical trial data quality in the industry.
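
    As an illustration of what such metrics look like in practice, the sketch below computes two of the quality dimensions named above, completeness and timeliness, from a handful of records. The record layout and the seven-day entry target are hypothetical illustrations, not values from the paper.

```python
# Minimal sketch of two data-quality metrics of the kind listed in the paper
# (completeness and timeliness); layout and the 7-day target are hypothetical.
from datetime import date

records = [
    {"visit": date(2015, 3, 1), "entered": date(2015, 3, 4),  "fields": 40, "missing": 2},
    {"visit": date(2015, 3, 2), "entered": date(2015, 3, 15), "fields": 40, "missing": 0},
    {"visit": date(2015, 3, 5), "entered": date(2015, 3, 9),  "fields": 40, "missing": 5},
]

# Completeness: share of expected data fields actually filled in.
completeness = 1 - sum(r["missing"] for r in records) / sum(r["fields"] for r in records)
# Timeliness: share of records entered within 7 days of the visit.
on_time = sum((r["entered"] - r["visit"]).days <= 7 for r in records) / len(records)

print(f"completeness: {completeness:.1%}, timeliness (<=7 days): {on_time:.1%}")
```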

  15. Evaluation of the detection limit of blood hemoglobin using photoplethysmography based on path-length optimization

    Science.gov (United States)

    Sun, Di; Guo, Chao; Zhang, Ziyang; Han, Tongshuai; Liu, Jin

    2016-10-01

    The measurement of blood hemoglobin concentration (BHC) using photoplethysmography (PPG), which obtains the blood absorption of near-infrared light from the instantaneous pulse of the transmitted light intensity, has not yet been applied clinically owing to insufficient precision. The main challenge may be the insufficiently stable pulse signal, which is very weak and often varies between human bodies or within the same body under different physiological states. We evaluated the detection limit of BHC using PPG as the achievable precision level, which can be considered a best-case result because we recorded relatively stable pulse signals from the subjects using a spectrometer with a high signal-to-noise ratio (SNR), about 30000:1 in the short term. Moreover, we optimized the path length used, applying the optimum-pathlength theory to obtain better sensitivity to the absorption variation in blood. The best detection limit was evaluated as about 1 g/L for BHC, and the best SNR of the pulse for in vivo measurement was about 2000:1 at 1130 and 1250 nm. We further conclude that the SNR of the pulse signal should be better than 400:1 when the required detection limit is set to 5 g/L. Our results provide a useful reference for achieving a desired BHC measurement precision in real applications.
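
    The two figures quoted above (a pulse SNR of about 2000:1 giving a ~1 g/L detection limit, and 400:1 required for 5 g/L) are consistent with an inverse scaling between pulse SNR and detection limit. The sketch below makes that scaling explicit; treating the relationship as exactly inversely proportional is an assumption, not a claim from the study.

```python
# Minimal sketch of the inverse scaling implied by the two quoted figures
# (pulse SNR 2000:1 -> ~1 g/L, 400:1 -> ~5 g/L). Exact inverse
# proportionality is an assumption made here for illustration.
def required_pulse_snr(target_limit_g_per_L, ref_snr=2000.0, ref_limit=1.0):
    return ref_snr * ref_limit / target_limit_g_per_L

for limit in (1.0, 2.0, 5.0):
    print(f"target {limit:.0f} g/L -> pulse SNR >= {required_pulse_snr(limit):.0f}:1")
```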

  16. Preparation, Optimization and Activity Evaluation of PLGA/Streptokinase Nanoparticles Using Electrospray

    Directory of Open Access Journals (Sweden)

    Nasrin Yaghoobi

    2017-04-01

    Purpose: PLGA nanoparticles (NPs) have been extensively investigated as carriers of different drug molecules to enhance their therapeutic effects or preserve them from the aqueous environment. Streptokinase (SK) is an important medicine for thrombotic diseases. Methods: In this study, we used electrospray to encapsulate SK in PLGA NPs and evaluate its activity. This is the first paper to investigate the activity of an electrosprayed enzyme. The effect of three input parameters, namely voltage, internal diameter of the needle (nozzle) and the concentration ratio of polymer to protein, on the size and size distribution (SD) of the NPs was evaluated using artificial neural networks (ANNs). Optimization of the SD has rarely been reported for electrospray so far. Results: The results show that, to obtain the lowest nanoparticle size, the polymer/enzyme ratio and the needle internal diameter (ID) should be low. Also, the minimum SD was obtainable at high values of voltage. The optimum preparation had a mean (SD) size, encapsulation efficiency and loading capacity of 37 (12) nm, 90% and 8.2%, respectively. Nearly 20% of the SK was released in the first 30 minutes, followed by a cumulative release of 41% over 72 h. The activity of the enzyme was also checked 30 min after preparation, and 19.2% activity was retained. Conclusion: Our study showed that electrospraying could be an interesting approach to encapsulate proteins/enzymes in polymeric nanoparticles. However, further work is required to ensure that the activity of the enzyme/protein is maintained after electrospray.
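
    A minimal sketch of the ANN-style response modelling described above, mapping (voltage, needle ID, polymer/protein ratio) to mean particle size and SD, is given below. The training data are synthetic placeholders and the network size is arbitrary; only the input/output structure follows the abstract.

```python
# Minimal sketch of an ANN response model for electrospray: three process
# inputs -> (mean size, SD). Training data are synthetic placeholders, not
# the study's measurements.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.uniform([10, 0.4, 5], [25, 1.2, 50], size=(60, 3))   # kV, mm, ratio
# Synthetic responses: size grows with ratio and needle ID, SD falls with voltage.
y = np.column_stack([
    30 + 2.0 * X[:, 2] + 40 * X[:, 1] + rng.normal(0, 5, 60),   # mean size (nm)
    25 - 0.6 * X[:, 0] + 0.3 * X[:, 2] + rng.normal(0, 2, 60),  # SD (nm)
])

scaler = StandardScaler().fit(X)
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(scaler.transform(X), y)

query = scaler.transform([[15.0, 0.6, 20.0]])   # candidate process settings
print("predicted [mean size nm, SD nm]:", model.predict(query)[0])
```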

  17. MIXOPTIM: A tool for the evaluation and the optimization of the electricity mix in a territory

    Science.gov (United States)

    Bonin, Bernard; Safa, Henri; Laureau, Axel; Merle-Lucotte, Elsa; Miss, Joachim; Richet, Yann

    2014-09-01

    This article presents a method for calculating the generation cost of a mixture of electricity sources by means of a Monte Carlo simulation of the production output, taking into account the fluctuations of the demand and the stochastic nature of the availability of the various power sources that compose the mix. This evaluation shows that, for a given electricity mix, the cost has a non-linear dependence on the demand level. In the second part of the paper, we develop some considerations on the management of intermittency. We develop a method based on the spectral decomposition of the imposed power fluctuations to calculate the minimal amount of controlled power sources needed to follow these fluctuations. This can be converted into a viability criterion of the mix, included in the MIXOPTIM software. In the third part of the paper, the MIXOPTIM cost evaluation method is applied to the multi-criteria optimization of the mix according to three main criteria: the cost of the mix; its impact on climate in terms of CO2 production; and the security of supply.
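
    A minimal sketch of the Monte Carlo idea described in the first part of the abstract is given below: sample the demand and the availability of each source, dispatch in merit order, and average the resulting generation cost. The capacities, costs and availability probabilities are hypothetical, and the crude on/off availability model is an assumption, not MIXOPTIM's actual production model.

```python
# Minimal Monte Carlo sketch of the mix-cost idea: random demand, random
# source availability, merit-order dispatch, average cost per MWh served.
# All numerical inputs below are hypothetical illustrations.
import numpy as np

rng = np.random.default_rng(0)
# (name, capacity GW, availability probability, cost €/MWh), in merit order.
sources = [("nuclear", 40.0, 0.85, 50.0),
           ("wind",    20.0, 0.30, 70.0),
           ("gas",     30.0, 0.95, 110.0)]

def mean_cost(mean_demand_gw, n=50_000):
    demand = rng.normal(mean_demand_gw, 0.1 * mean_demand_gw, n)
    total_cost = np.zeros(n)
    served = np.zeros(n)
    for _, cap, avail, cost in sources:           # dispatch cheapest first
        online = cap * rng.binomial(1, avail, n)  # crude on/off availability
        used = np.clip(np.minimum(online, demand - served), 0, None)
        total_cost += used * cost
        served += used
    ok = served >= demand - 1e-9                  # draws where demand was met
    return np.mean(total_cost[ok] / served[ok])   # €/MWh

for d in (40, 60, 80):
    print(f"mean demand {d} GW -> average cost {mean_cost(d):.1f} €/MWh")
```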

  18. Thermodynamic evaluation and optimization of the (Na+K+S) system

    Energy Technology Data Exchange (ETDEWEB)

    Lindberg, Daniel [Abo Akademi Process Chemistry Centre, Abo Akademi University, Biskopsgatan 8, FI-20500 Turku (Finland)]. E-mail: Daniel.Lindberg@abo.fi; Backman, Rainer [Abo Akademi Process Chemistry Centre, Abo Akademi University, Biskopsgatan 8, FI-20500 Turku (Finland); Energy Technology and Thermal Process Chemistry, Umea University, SE-90187 Umea (Sweden); Hupa, Mikko [Abo Akademi Process Chemistry Centre, Abo Akademi University, Biskopsgatan 8, FI-20500 Turku (Finland); Chartrand, Patrice [Centre de Recherche en Calcul Thermochimique (CRCT), Ecole Polytechnique, Box 6079, Station Downtown, Montreal, Que., H3C 3A7 (Canada)

    2006-07-15

    The (Na+K+S) system is of primary importance for the combustion of black liquor in the kraft recovery boilers in pulp and paper mills. A thermodynamic evaluation and optimization of the (Na+K+S) system has been made. All available data for the system have been critically evaluated to obtain optimized parameters of thermodynamic models for all phases. The liquid model is the quasichemical model in the quadruplet approximation, which evaluates 1st- and 2nd-nearest-neighbour short-range order. In this model, cations (Na⁺ and K⁺) are assumed to mix on a cationic sublattice, while anions (S²⁻, S₂²⁻, S₃²⁻, S₄²⁻, S₅²⁻, S₆²⁻, S₇²⁻, S₈²⁻, Va⁻) are assumed to mix on an anionic sublattice. The thermodynamic data of the liquid polysulphide components M₂S₁₊ₙ (M = Na, K and n = 1-7) are fitted to ΔG = A(n) + B(n)·T for the reaction M₂S(l) + nS(l) = M₂Sₙ₊₁(l). The solid phases are the alkali alloys, alkali sulphides, several different alkali polysulphides and sulphur. The solid solutions (Na,K), (Na,K)₂S and (Na,K)₂S₂ are modelled using the compound energy formalism. The models can be used to predict the thermodynamic properties and phase equilibria in the multicomponent heterogeneous system. The experimental data are reproduced within experimental error limits for equilibria between solid, liquid and gas. The ternary phase diagram of the (Na₂S+K₂S+S) system has been predicted, as no experimental determinations of the phase diagram have been made previously.

  19. Fractional order Darwinian particle swarm optimization applications and evaluation of an evolutionary algorithm

    CERN Document Server

    Couceiro, Micael

    2015-01-01

    This book examines the bottom-up applicability of swarm intelligence to solving multiple problems, such as curve fitting, image segmentation, and swarm robotics. It compares the capabilities of some of the better-known bio-inspired optimization approaches, especially Particle Swarm Optimization (PSO), Darwinian Particle Swarm Optimization (DPSO) and the recently proposed Fractional Order Darwinian Particle Swarm Optimization (FODPSO), and comprehensively discusses their advantages and disadvantages. Further, it demonstrates the superiority and key advantages of using the FODPSO algorithm, suc
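
    For orientation, a minimal sketch of the standard PSO update loop, the baseline that the Darwinian and fractional-order variants build on, is given below. The inertia and acceleration coefficients and the sphere test function are common textbook choices, not taken from the book.

```python
# Minimal sketch of a standard PSO loop (personal-best / global-best update);
# coefficients and the sphere test function are textbook defaults.
import numpy as np

def pso(f, dim=2, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))        # positions
    v = np.zeros_like(x)                              # velocities
    pbest = x.copy()                                  # personal bests
    pbest_val = np.apply_along_axis(f, 1, x)
    gbest = pbest[np.argmin(pbest_val)]               # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)]
    return gbest, f(gbest)

print(pso(lambda p: np.sum(p ** 2)))   # sphere function; optimum at the origin
```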

  20. On characterizations of quasi-metric completeness

    Energy Technology Data Exchange (ETDEWEB)

    Dag, H.; Romaguera, S.; Tirado, P.

    2017-07-01

    Hu proved in [4] that a metric space (X, d) is complete if and only if, for any closed subspace C of (X, d), every Banach contraction on C has a fixed point. Since then several authors have investigated the problem of characterizing metric completeness by means of fixed point theorems. Recently this problem has been studied in the more general context of quasi-metric spaces for different notions of completeness. Here we present a characterization of a kind of completeness for quasi-metric spaces by means of a quasi-metric version of Hu’s theorem. (Author)
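
    For reference, the contraction condition appearing in Hu's characterization quoted above can be stated as follows (a sketch: T : C → C is a Banach contraction on the closed subspace C when):

```latex
% Banach contraction condition on a closed subspace C of the metric space (X, d).
\exists\, k \in [0,1) \ \text{such that} \quad
d(Tx, Ty) \le k \, d(x, y) \qquad \forall\, x, y \in C .
```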