WorldWideScience

Sample records for metric tonnes carbon

  1. The climate response to five trillion tonnes of carbon

    Science.gov (United States)

    Tokarska, Katarzyna B.; Gillett, Nathan P.; Weaver, Andrew J.; Arora, Vivek K.; Eby, Michael

    2016-09-01

    Concrete actions to curtail greenhouse gas emissions have so far been limited on a global scale, and therefore the ultimate magnitude of climate change in the absence of further mitigation is an important consideration for climate policy. Estimates of fossil fuel reserves and resources are highly uncertain, and the amount used under a business-as-usual scenario would depend on prevailing economic and technological conditions. In the absence of global mitigation actions, five trillion tonnes of carbon (5 EgC), corresponding to the lower end of the range of estimates of the total fossil fuel resource, is often cited as an estimate of total cumulative emissions. An approximately linear relationship between global warming and cumulative CO2 emissions is known to hold up to 2 EgC emissions on decadal to centennial timescales; however, in some simple climate models the predicted warming at higher cumulative emissions is less than that predicted by such a linear relationship. Here, using simulations from four comprehensive Earth system models, we demonstrate that CO2-attributable warming continues to increase approximately linearly up to 5 EgC emissions. These models simulate, in response to 5 EgC of CO2 emissions, global mean warming of 6.4-9.5 °C, mean Arctic warming of 14.7-19.5 °C, and mean regional precipitation increases by more than a factor of four. These results indicate that the unregulated exploitation of the fossil fuel resource could ultimately result in considerably more profound climate changes than previously suggested.
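
    The near-linear relationship invoked here is usually summarised by the transient climate response to cumulative carbon emissions (TCRE). As a rough consistency check against the numbers quoted in this record (an illustrative calculation, not part of the original abstract, noting that 5 EgC = 5000 GtC):

    \[
      \Delta T \approx \mathrm{TCRE} \times E_{\mathrm{cum}},
      \qquad
      \mathrm{TCRE} \approx \frac{6.4\ \text{to}\ 9.5\ ^{\circ}\mathrm{C}}{5000\ \mathrm{GtC}}
      \approx 1.3\ \text{to}\ 1.9\ ^{\circ}\mathrm{C}\ \text{per}\ 1000\ \mathrm{GtC}.
    \]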

  2. The role of carbon sequestration and the tonne-year approach in fulfilling the objective of climate convention

    International Nuclear Information System (INIS)

    Korhonen, Riitta; Pingoud, Kim; Savolainen, Ilkka; Matthews, Robert

    2002-01-01

    Carbon can be sequestered from the atmosphere into forests in order to lower the atmospheric carbon dioxide concentration. Tonne-years of sequestered carbon have been suggested as a measure of the global warming impact of such projects of finite lifetime. Simplified example cases illustrate that the tonne-year approach can contradict the objective of stabilising atmospheric greenhouse gas concentrations expressed in the UN Climate Convention: tonne-years generated by a project can indicate that carbon sequestration helps mitigate climate change even when the project's net effect is to increase the CO2 concentration. Hence, use of tonne-years might waste resources intended for fulfilling the objective of the convention. The example cases studied are closely related to the IPCC estimates of global forestation potential by 2050. It is also illustrated that using bioenergy from the reforested areas to replace fossil fuels can, in the long term, contribute more effectively to controlling carbon dioxide concentrations than permanent sequestration of carbon in forests. However, the estimated benefits depend on the time frame considered: whether the interest is in controlling the rate of climate change on a decadal scale, or in controlling or halting climate change on a centennial scale.

  3. How much would five trillion tonnes of carbon warm the climate?

    Science.gov (United States)

    Tokarska, Katarzyna Kasia; Gillett, Nathan P.; Weaver, Andrew J.; Arora, Vivek K.

    2016-04-01

    While estimates of fossil fuel reserves and resources are very uncertain, and the amount that could ultimately be burnt under a business-as-usual scenario would depend on prevailing economic and technological conditions, an amount of five trillion tonnes of carbon (5 EgC), corresponding to the lower end of the range of estimates of the total fossil fuel resource, is often cited as an estimate of total cumulative emissions in the absence of mitigation actions. The IPCC Fifth Assessment Report indicates that an approximately linear relationship between warming and cumulative carbon emissions holds only up to around 2 EgC of emissions. It is typically assumed that at higher cumulative emissions the warming would tend to be less than that predicted by such a linear relationship, with the radiative saturation effect dominating the effects of positive carbon-climate feedbacks at high emissions, as predicted by simple carbon-climate models. We analyze simulations from four state-of-the-art Earth System Models (ESMs) from the Coupled Model Intercomparison Project Phase 5 (CMIP5) and seven Earth System Models of Intermediate Complexity (EMICs), driven by the Representative Concentration Pathway 8.5 Extension scenario (RCP 8.5 Ext), which represents a very high-emission scenario of increasing greenhouse gas concentrations in the absence of climate mitigation policies. Our results demonstrate that while terrestrial and ocean carbon storage varies between the models, the CO2-induced warming continues to increase approximately linearly with cumulative carbon emissions even for higher levels of cumulative emissions, in all four ESMs. Five of the seven EMICs considered simulate a similarly linear response, while two exhibit less warming at higher cumulative emissions for reasons we discuss. The ESMs simulate global mean warming of 6.6-11.0°C, mean Arctic warming of 15.3-19.7°C, and mean regional precipitation increases and decreases by more than a factor of four, in response to 5 EgC of cumulative emissions.

  4. Ten Tonne Plan: Education for Sustainability from a Whole Systems Thinking Perspective

    Science.gov (United States)

    Lewis, Elaine; Mansfield, Caroline; Baudains, Catherine

    2014-01-01

    The "Ten Tonne Plan" is a greenhouse gas emissions reduction initiative that aimed to reduce school emissions by 10 tonnes (metric tons) in one year. A case study was conducted on the impact of this initiative at a primary school in Western Australia. Research investigated student, staff, parent, and community partner perceptions…

  5. The implications of carbon dioxide and methane exchange for the heavy mitigation RCP2.6 scenario under two metrics

    International Nuclear Information System (INIS)

    Huntingford, Chris; Lowe, Jason A.; Howarth, Nicholas; Bowerman, Niel H.A.; Gohar, Laila K.; Otto, Alexander; Lee, David S.; Smith, Stephen M.; Elzen, Michel G.J. den; Vuuren, Detlef P. van; Millar, Richard J.; Allen, Myles R.

    2015-01-01

    Highlights: • Exchanging methane for carbon dioxide emissions affects peak global warming. • Economic constraints severely affect exchange possibilities. • The chosen metric determines whether it is economic to eliminate all removable methane emissions. • If all methane emissions could be removed, this could aid meeting the two-degree warming target. - Abstract: Greenhouse gas emissions associated with Representative Concentration Pathway RCP2.6 could limit global warming to around or below a 2 °C increase since pre-industrial times. However, this scenario implies very large and rapid reductions in both carbon dioxide (CO2) and non-CO2 emissions, and suggests a need to understand the available flexibility in how different greenhouse gases might be abated. There is a growing interest in developing a greater understanding of the particular role of shorter-lived non-CO2 gases as abatement options. We address this here through a sensitivity study of different methane (CH4) emissions pathways to year 2100 and beyond, by including exchanges with CO2 emissions, and with a focus on the related climatic and economic advantages and disadvantages. Metrics exist that characterise gas equivalence in terms of climate change effect per tonne emitted. We analyse the implications of CO2 and CH4 emission exchanges under two commonly considered metrics: the 100-yr Global Warming Potential (GWP-100) and Global Temperature Potential (GTP-100), whilst keeping CO2-equivalent emissions pathways fixed, based on the standard set of emissions usually associated with RCP2.6. An idealised situation in which anthropogenic CH4 emissions are reduced to zero across a period of two decades, with the implementation of such cuts starting almost immediately, gives lower warming than standard RCP2.6 emissions during the 21st and 22nd centuries. This is despite exchanging for higher CO2 emissions. Introducing Marginal Abatement Cost (MAC) curves provides an economic assessment of alternative gas
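
    A minimal sketch of the emission-exchange bookkeeping described in this abstract, assuming illustrative metric values of roughly 28 (GWP-100) and 4 (GTP-100) kg CO2-eq per kg CH4 (order-of-magnitude assumptions, not values taken from the paper):

    # Illustrative CH4-for-CO2 exchange at fixed CO2-equivalent emissions.
    # The metric values below are assumptions, not the paper's numbers.
    METRICS = {"GWP100": 28.0, "GTP100": 4.0}  # kg CO2-eq per kg CH4 (assumed)

    def co2_allowance_from_ch4_cut(ch4_cut_mt, metric):
        """Extra CO2 (Mt) that could be emitted if CH4 emissions are cut by
        ch4_cut_mt (Mt) while total CO2-equivalent emissions stay unchanged."""
        return ch4_cut_mt * METRICS[metric]

    if __name__ == "__main__":
        cut = 100.0  # hypothetical CH4 reduction in Mt
        for name in METRICS:
            print(f"{name}: a {cut:.0f} Mt CH4 cut frees {co2_allowance_from_ch4_cut(cut, name):.0f} Mt CO2")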

  6. Climate metrics and the carbon footprint of livestock products: where’s the beef?

    Science.gov (United States)

    Persson, U. Martin; Johansson, Daniel J. A.; Cederberg, Christel; Hedenus, Fredrik; Bryngelsson, David

    2015-03-01

    The livestock sector is estimated to account for 15% of global greenhouse gas (GHG) emissions, 80% of which originate from ruminant animal systems due to high emissions of methane (CH4) from enteric fermentation and manure management. However, recent analyses have argued that the carbon footprint (CF) of ruminant meat and dairy products is substantially reduced if one adopts alternative metrics for comparing emissions of GHGs—e.g., the 100 year global temperature change potential (GTP100), instead of the commonly used 100 year global warming potential (GWP100)—due to a lower valuation of CH4 emissions. This raises the question of which metric to use. Ideally, the choice of metric should be related to a climate policy goal. Here, we argue that basing current GHG metrics solely on temperature impact 100 years into the future is inconsistent with the current global climate goal of limiting warming to 2 °C, a limit that is likely to be reached well within 100 years. A reasonable GTP value for CH4, accounting for current projections of when 2 °C warming will be reached, is about 18, leading to a current CF of 19 kg CO2-eq. per kg of beef (carcass weight, average European system), 20% lower than if evaluated using GWP100. Further, we show that an application of the GTP metric consistent with a 2 °C climate limit leads to the valuation of CH4 increasing rapidly over time as the temperature ceiling is approached. This means that the CF for beef would rise by around 2.5% per year in the coming decades, surpassing the GWP-based footprint in only ten years. Consequently, the impact on the livestock sector of substituting GTPs for GWPs would be modest in the near term, but could potentially be very large in the future due to a much higher (>50%) and rapidly appreciating CF.
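
    A quick arithmetic check of the crossover claim above, using only the figures quoted in this record (19 kg CO2-eq per kg of beef under GTP, 20% below the GWP100 value, growing at about 2.5% per year):

    \[
      19 \times 1.025^{\,n} \;\geq\; \frac{19}{0.8} \approx 23.8
      \quad\Longrightarrow\quad
      n \;\geq\; \frac{\ln 1.25}{\ln 1.025} \approx 9\ \text{years},
    \]

    consistent with the statement that the GTP-based footprint would surpass the GWP-based footprint in roughly ten years.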

  7. ALICE: structures weighing several tonnes are moved with millimetric precision

    CERN Multimedia

    2005-01-01

    The ALICE collaboration has just conducted one of its most spectacular transport operations to date in lifting the dipole of the muon spectrometer and reassembling it on the other side of the huge solenoid magnet. This incredible feat involved lifting no fewer than 900 tonnes of equipment over the red octagonal yoke inherited from the L3 experiment at a height of 18 metres. Following initial assembly and successful testing at the end of last year (see Bulletin No. 4/2005), the dipole was completely dismantled and moved to the other end of the cavern. The yoke was transported as 28 modules, each weighing 30 tonnes. The most spectacular feat of all, though, was undoubtedly the removal of the two 32-tonne coils. The first of these was moved on 18 April, as recorded in the following photos: A special lifting gantry weighing 5 tonnes had to be developed to move and install the coils. Huge clamps, which can be seen at the front, were used to rotate these enormous 32-tonne components. The whole assembly was raised ...

  8. Ability of Landsat-8 OLI-Derived Texture Metrics in Estimating Aboveground Carbon Stocks of Coppice Oak Forests

    Science.gov (United States)

    Safari, A.; Sohrabi, H.

    2016-06-01

    The role of forests as a reservoir for carbon has prompted the need for timely and reliable estimation of aboveground carbon stocks. Since measurement of the aboveground carbon stocks of forests is a destructive, costly and time-consuming activity, aerial and satellite remote sensing techniques have gained much attention in this field. Although using aerial data to predict aboveground carbon stocks has proved highly accurate, there are challenges related to high acquisition costs, small area coverage, and limited availability of these data. These challenges are more critical for non-commercial forests located in low-income countries. The Landsat program provides repetitive acquisition of high-resolution multispectral data, which are freely available. The aim of this study was to assess the potential of multispectral Landsat 8 Operational Land Imager (OLI)-derived texture metrics in quantifying the aboveground carbon stocks of coppice oak forests in the Zagros Mountains, Iran. We used four different window sizes (3×3, 5×5, 7×7, and 9×9) and four different offsets ([0,1], [1,1], [1,0], and [1,-1]) to derive nine texture metrics (angular second moment, contrast, correlation, dissimilarity, entropy, homogeneity, inverse difference, mean, and variance) from four bands (blue, green, red, and near-infrared). In total, 124 sample plots in two different forests were measured, and carbon was calculated using species-specific allometric models. Stepwise regression analysis was applied to estimate biomass from the derived metrics. Results showed that, in general, larger windows for deriving texture metrics resulted in models with better fitting parameters. In addition, the correlation of the spectral bands for deriving texture metrics in regression models was ranked as b4 > b3 > b2 > b5. The best offset was [1,-1]. Among the different metrics, mean and entropy were entered into most of the regression models. Overall, different models based on derived texture metrics
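
    For readers unfamiliar with grey-level co-occurrence (GLCM) texture metrics, a sketch of how one window's metrics could be computed with scikit-image is shown below (illustrative only; the band handling, quantisation and exact metric set used in the study are assumptions, and scikit-image's graycoprops does not cover all nine metrics listed above):

    import numpy as np
    from skimage.feature import graycomatrix, graycoprops  # spelled greycomatrix/greycoprops in older scikit-image

    def window_texture(window, levels=32):
        """GLCM texture metrics for a single image window (one spectral band).

        The four offsets used in the study ([0,1], [1,1], [1,0], [1,-1]) correspond
        to a pixel distance of 1 at angles 0, 45, 90 and 135 degrees."""
        edges = np.linspace(window.min(), window.max(), levels)
        q = (np.digitize(window, edges) - 1).clip(0, levels - 1).astype(np.uint8)
        glcm = graycomatrix(q, distances=[1],
                            angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                            levels=levels, symmetric=True, normed=True)
        # Only properties known to be supported by graycoprops are used here;
        # mean, variance and entropy would have to be derived from the GLCM directly.
        props = ("ASM", "contrast", "correlation", "dissimilarity", "homogeneity")
        return {p: graycoprops(glcm, p).mean() for p in props}

    if __name__ == "__main__":
        demo = np.random.default_rng(0).integers(0, 255, size=(9, 9)).astype(float)  # stand-in for a 9x9 window
        print(window_texture(demo))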

  9. Is Time the Best Metric to Measure Carbon-Related Climate Change Potential and Tune the Economy Toward Reduced Fossil Carbon Extraction?

    Science.gov (United States)

    DeGroff, F. A.

    2016-12-01

    Anthropogenic changes to non-anthropogenic carbon fluxes are a primary driver of climate change. There currently exists no comprehensive metric to measure and value anthropogenic changes in carbon flux between all states of carbon. Focusing on atmospheric carbon emissions as a measure of anthropogenic activity on the environment ignores the fungible characteristics of carbon that are crucial in both the biosphere and the worldwide economy. Focusing on a single form of inorganic carbon as a proxy metric for the plethora of anthropogenic activity and carbon compounds will prove inadequate, convoluted, and unmanageable. A broader, more basic metric is needed to capture the entirety of carbon activity, particularly in an economic, profit-driven environment. We propose a new metric to measure changes in the temporal distance of carbon, in any form or state, from one state to another. Such a metric would be especially useful for measuring the temporal distance of carbon from sinks such as the atmosphere or oceans. The effect of changes in carbon flux as a result of any human activity can be measured by the difference between the anthropogenic and non-anthropogenic temporal distance. The change in the temporal distance is a measure of the climate change potential, much as voltage is a measure of electrical potential. The integral of the climate change potential is proportional to the anthropogenic climate change. We also propose a logarithmic vector scale for carbon quality, cq, as a measure of anthropogenic changes in carbon flux. The distance between the starting and ending temporal distances of the cq vector represents the change in cq. A base-10 logarithmic scale allows the addition and subtraction of exponents to calculate changes in cq. As anthropogenic activity changes the temporal distance of carbon, the change in cq is measured as cq = β (log10 [mean carbon temporal distance]), where β represents the carbon price coefficient for a particular country. For any
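
    Restated in conventional notation (an interpretation of the abstract's wording; the paper itself may define the quantities differently), the change in carbon quality associated with an activity would be

    \[
      \Delta c_q = \beta \left( \log_{10} \bar{t}_{\mathrm{anthropogenic}}
                              - \log_{10} \bar{t}_{\mathrm{non\text{-}anthropogenic}} \right),
    \]

    where \(\bar{t}\) is the mean carbon temporal distance and \(\beta\) is the country-specific carbon price coefficient mentioned above.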

  10. New Pathways and Metrics for Enhanced, Reversible Hydrogen Storage in Boron-Doped Carbon Nanospaces

    Energy Technology Data Exchange (ETDEWEB)

    Pfeifer, Peter [University of Missouri; Wexler, Carlos [University of Missouri; Hawthorne, M. Frederick [University of Missouri; Lee, Mark W. [University of Missouri; Jalistegi, Satish S. [University of Missouri

    2014-08-14

    This project, since its start in 2007—entitled “Networks of boron-doped carbon nanopores for low-pressure reversible hydrogen storage” (2007-10) and “New pathways and metrics for enhanced, reversible hydrogen storage in boron-doped carbon nanospaces” (2010-13)—is in support of the DOE's National Hydrogen Storage Project, as part of the DOE Hydrogen and Fuel Cells Program’s comprehensive efforts to enable the widespread commercialization of hydrogen and fuel cell technologies in diverse sectors of the economy. Hydrogen storage is widely recognized as a critical enabling technology for the successful commercialization and market acceptance of hydrogen powered vehicles. Storing sufficient hydrogen on board a wide range of vehicle platforms, at energy densities comparable to gasoline, without compromising passenger or cargo space, remains an outstanding technical challenge. Of the main three thrust areas in 2007—metal hydrides, chemical hydrogen storage, and sorption-based hydrogen storage—sorption-based storage, i.e., storage of molecular hydrogen by adsorption on high-surface-area materials (carbons, metal-organic frameworks, and other porous organic networks), has emerged as the most promising path toward achieving the 2017 DOE storage targets of 0.055 kg H2/kg system (“5.5 wt%”) and 0.040 kg H2/liter system. The objective of the project is to develop high-surface-area carbon materials that are boron-doped by incorporation of boron into the carbon lattice at the outset, i.e., during the synthesis of the material. The rationale for boron-doping is the prediction that boron atoms in carbon will raise the binding energy of hydrogen from 4-5 kJ/mol on the undoped surface to 10-14 kJ/mol on a doped surface, and accordingly the hydrogen storage capacity of the material. The mechanism for the increase in binding energy is electron donation from H2 to electron-deficient B atoms, in the form of sp2 boron-carbon bonds. Our team is proud to have

  11. Topographic Metric Predictions of Soil Redistribution and Organic Carbon Distribution in Croplands

    Science.gov (United States)

    Mccarty, G.; Li, X.

    2017-12-01

    Landscape topography is a key factor controlling soil redistribution and soil organic carbon (SOC) distribution in Iowa croplands (USA). In this study, we adopted a combined approach based on carbon (13C) and cesium (137Cs) isotope tracers and digital terrain analysis to understand patterns of SOC redistribution and carbon sequestration dynamics as influenced by landscape topography in tilled cropland under long-term corn/soybean management. The fallout radionuclide 137Cs was used to estimate soil redistribution rates, and a LiDAR-derived DEM was used to obtain a set of topographic metrics for digital terrain analysis. Soil redistribution rates and patterns of SOC distribution were examined across 560 sampling locations at two field sites as well as at larger scale within the watershed. We used δ13C content in SOC to partition C3- and C4-plant-derived carbon density at 127 locations in one of the two field sites, with corn being the primary source of C4 carbon. Topography-based models were developed to simulate SOC distribution and soil redistribution using stepwise ordinary least squares regression (SOLSR) and stepwise principal component regression (SPCR). All topography-based models developed through SPCR and SOLSR demonstrated good simulation performance, explaining more than 62% of the variability in SOC density and soil redistribution rates across the two intensively sampled field sites. However, the SOLSR models showed lower reliability than the SPCR models in predicting SOC density at the watershed scale. Spatial patterns of C3-derived SOC density were highly related to those of SOC density. Topographic metrics exerted substantial influence on C3-derived SOC density, with the SPCR model accounting for 76.5% of the spatial variance. In contrast, C4-derived SOC density had poor spatial structure, likely reflecting the substantial contribution of corn vegetation to recently sequestered SOC density. Results of this study highlighted the utility of topographic SPCR models for scaling
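
    The principal-component-regression idea behind the SPCR models can be sketched as follows (a generic illustration; the study's stepwise component selection, its specific topographic metrics, and the data below are placeholders, not reproductions of the study):

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # X: rows = sampling locations, columns = DEM-derived topographic metrics
    # (slope, curvature, wetness index, ...); y = SOC density or a 137Cs-derived
    # soil redistribution rate. Synthetic placeholders are used here.
    rng = np.random.default_rng(42)
    X = rng.normal(size=(560, 15))                      # 560 locations, 15 hypothetical metrics
    y = X[:, :3] @ np.array([1.0, -0.5, 0.3]) + rng.normal(scale=0.5, size=560)

    # Principal component regression: standardise the metrics, project onto the
    # leading components, then regress the response on the component scores.
    pcr = make_pipeline(StandardScaler(), PCA(n_components=5), LinearRegression())
    pcr.fit(X, y)
    print("training R^2:", round(pcr.score(X, y), 3))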

  12. Achieving Carbon Neutrality in the Global Aluminum Industry

    Science.gov (United States)

    Das, Subodh

    2012-02-01

    In the 21st century, sustainability is widely regarded as the new corporate culture, and leading manufacturing companies (Toyota, GE, and Alcoa) and service companies (Google and Federal Express) are striving towards carbon neutrality. The current carbon footprint of the global aluminum industry is estimated at 500 million metric tonnes carbon dioxide equivalent (CO2eq), representing about 1.7% of global emissions from all sources. For the global aluminum industry, carbon neutrality is defined as a state where the total "in-use" CO2eq saved from all products in current use, including incremental process efficiency improvements, recycling, and urban mining activities, equals the CO2eq expended to produce the global output of aluminum. This paper outlines an integrated and quantifiable plan for achieving "carbon neutrality" in the global aluminum industry by advocating five actionable steps: (1) increase use of "green" electrical energy grid by 8%, (2) reduce process energy needs by 16%, (3) deploy 35% of products in "in-use" energy saving applications, (4) divert 6.1 million metric tonnes/year from landfills, and (5) mine 4.5 million metric tonnes/year from aluminum-rich "urban mines." Since it takes 20 times more energy to make aluminum from bauxite ore than to recycle it from scrap, the global aluminum industry could set a reasonable, self-imposed energy/carbon neutrality goal to incrementally increase the supply of recycled aluminum by at least 1.05 metric tonnes for every tonne of incremental production via primary aluminum smelter capacity. Furthermore, the aluminum industry can and should take a global leadership position by actively developing internationally accepted and approved carbon footprint credit protocols.

  13. Review of the incineration of 500 tonnes of radio-active residues; Bilan de l'incineration de 500 tonnes de residus radioactifs

    Energy Technology Data Exchange (ETDEWEB)

    Rodier, J; Seyfried, P; Charbonneaux, M [Commissariat a l' Energie Atomique, Chusclan (France). Centre de Production de Plutonium de Marcoule]

    1969-07-01

    During its first five years of operation, the incinerator at the Marcoule Centre has burnt almost 500 tonnes of radioactive residues. Improvements in some details of the process have been made during this period; they concern the nature of the materials involved. The technical and radiological results for the installation are very favorable, and have made it possible to maintain a high charge factor. Although the overall economic results are not advantageous in the case of ungraded solid residues, this method nevertheless represents the best available for eliminating oils, solvents, wood and dead animals. It can, furthermore, be useful whenever dilution in the atmosphere can advantageously be exploited to dispose of certain radioelements, such as tritium or carbon-14, in the form of gases or vapours. (author)

  14. Discovery Mondays: Transporting tonnes of equipment with millimetre precision

    CERN Multimedia

    2005-01-01

    Transporting huge, very heavy but also frequently fragile items at CERN often presents a real challenge. The task becomes even more challenging when it involves lowering huge LHC machine and detector components 100 metres below ground. The Laboratory's Transport Service uses various techniques and different types of transport and heavy handling equipment to perform these delicate operations. You will have an opportunity to find out more about how they do their job at the next Discovery Monday event. You will have a close encounter with the trailer used to transport the impressive 15 metre-long, 35-tonne dipole magnets. You will be able to install mock-up magnets in a beam line or test your skill using heavy handling equipment to carry out a most unusual fishing operation. You will be able to take a trip in a three-metre-high lorry and have a once-in-a-lifetime opportunity to operate a crane. You will also be able to take a test drive in the famous roll-over simulator vehicle. At the coming Discovery Monday...

  15. Review of the incineration of 500 tonnes of radio-active residues

    International Nuclear Information System (INIS)

    Rodier, J.; Seyfried, P.; Charbonneaux, M.

    1969-01-01

    During its first five years of operation, the incinerator at the Marcoule Centre has burnt almost 500 tonnes of radioactive residues. Improvements in some details of the process have been made during this period; they concern the nature of the materials involved. The technical and radiological results for the installation are very favorable, and have made it possible to maintain a high charge factor. Although the overall economic results are not advantageous in the case of ungraded solid residues, this method nevertheless represents the best available for eliminating oils, solvents, wood and dead animals. It can, furthermore, be useful whenever dilution in the atmosphere can advantageously be exploited to dispose of certain radioelements, such as tritium or carbon-14, in the form of gases or vapours. (author)

  16. Redefining agricultural yields: from tonnes to people nourished per hectare

    International Nuclear Information System (INIS)

    Cassidy, Emily S; West, Paul C; Gerber, James S; Foley, Jonathan A

    2013-01-01

    Worldwide demand for crops is increasing rapidly due to global population growth, increased biofuel production, and changing dietary preferences. Meeting these growing demands will be a substantial challenge that will tax the capability of our food system and prompt calls to dramatically boost global crop production. However, to increase food availability, we may also consider how the world’s crops are allocated to different uses and whether it is possible to feed more people with current levels of crop production. Of particular interest are the uses of crops as animal feed and as biofuel feedstocks. Currently, 36% of the calories produced by the world’s crops are being used for animal feed, and only 12% of those feed calories ultimately contribute to the human diet (as meat and other animal products). Additionally, human-edible calories used for biofuel production increased fourfold between the years 2000 and 2010, from 1% to 4%, representing a net reduction of available food globally. In this study, we re-examine agricultural productivity, going from using the standard definition of yield (in tonnes per hectare, or similar units) to using the number of people actually fed per hectare of cropland. We find that, given the current mix of crop uses, growing food exclusively for direct human consumption could, in principle, increase available food calories by as much as 70%, which could feed an additional 4 billion people (more than the projected 2–3 billion people arriving through population growth). Even small shifts in our allocation of crops to animal feed and biofuels could significantly increase global food availability, and could be an instrumental tool in meeting the challenges of ensuring global food security. (letter)
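
    The 'people nourished per hectare' accounting can be illustrated with a toy calorie-flow calculation (all numbers below, except the 36% feed share and the 12% feed-to-food conversion quoted above, are placeholder assumptions rather than the study's data):

    # Toy calculation of people nourished per hectare from calorie flows.
    calories_per_ha_per_yr = 15e6    # kcal produced per hectare per year (hypothetical crop)
    share_to_food = 0.55             # fraction of calories eaten directly by people (assumed)
    share_to_feed = 0.36             # fraction used as animal feed (from the abstract)
    feed_to_food = 0.12              # fraction of feed calories returned as animal products (from the abstract)
    needs_per_person_yr = 2700 * 365 # kcal per person per year (assumed requirement)

    delivered = calories_per_ha_per_yr * (share_to_food + share_to_feed * feed_to_food)
    print("people nourished per hectare:", round(delivered / needs_per_person_yr, 2))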

  17. Redefining agricultural yields: from tonnes to people nourished per hectare

    Science.gov (United States)

    Cassidy, Emily S.; West, Paul C.; Gerber, James S.; Foley, Jonathan A.

    2013-09-01

    Worldwide demand for crops is increasing rapidly due to global population growth, increased biofuel production, and changing dietary preferences. Meeting these growing demands will be a substantial challenge that will tax the capability of our food system and prompt calls to dramatically boost global crop production. However, to increase food availability, we may also consider how the world’s crops are allocated to different uses and whether it is possible to feed more people with current levels of crop production. Of particular interest are the uses of crops as animal feed and as biofuel feedstocks. Currently, 36% of the calories produced by the world’s crops are being used for animal feed, and only 12% of those feed calories ultimately contribute to the human diet (as meat and other animal products). Additionally, human-edible calories used for biofuel production increased fourfold between the years 2000 and 2010, from 1% to 4%, representing a net reduction of available food globally. In this study, we re-examine agricultural productivity, going from using the standard definition of yield (in tonnes per hectare, or similar units) to using the number of people actually fed per hectare of cropland. We find that, given the current mix of crop uses, growing food exclusively for direct human consumption could, in principle, increase available food calories by as much as 70%, which could feed an additional 4 billion people (more than the projected 2-3 billion people arriving through population growth). Even small shifts in our allocation of crops to animal feed and biofuels could significantly increase global food availability, and could be an instrumental tool in meeting the challenges of ensuring global food security.

  18. Observationally-based Metrics of Ocean Carbon and Biogeochemical Variables are Essential for Evaluating Earth System Model Projections

    Science.gov (United States)

    Russell, J. L.; Sarmiento, J. L.

    2017-12-01

    The Southern Ocean is central to the climate's response to increasing levels of atmospheric greenhouse gases as it ventilates a large fraction of the global ocean volume. Global coupled climate models and earth system models, however, vary widely in their simulations of the Southern Ocean and its role in, and response to, the ongoing anthropogenic forcing. Due to its complex water-mass structure and dynamics, Southern Ocean carbon and heat uptake depend on a combination of winds, eddies, mixing, buoyancy fluxes and topography. Understanding how the ocean carries heat and carbon into its interior and how the observed wind changes are affecting this uptake is essential to accurately projecting transient climate sensitivity. Observationally-based metrics are critical for discerning processes and mechanisms, and for validating and comparing climate models. As the community shifts toward Earth system models with explicit carbon simulations, more direct observations of important biogeochemical parameters, like those obtained from the biogeochemically-sensored floats that are part of the Southern Ocean Carbon and Climate Observations and Modeling project, are essential. One goal of future observing systems should be to create observationally-based benchmarks that will lead to reducing uncertainties in climate projections, and especially uncertainties related to oceanic heat and carbon uptake.

  19. The path to a successful one-million tonne demonstration of geological sequestration: Characterization, cooperation, and collaboration

    Science.gov (United States)

    Finley, R.J.; Greenberg, S.E.; Frailey, S.M.; Krapac, I.G.; Leetaru, H.E.; Marsteller, S.

    2011-01-01

    The development of the Illinois Basin-Decatur USA test site for a 1 million tonne injection of CO2 into the Mount Simon Sandstone saline reservoir beginning in 2011 has been a multiphase process requiring a wide array of personnel and resources that began in 2003. The process of regional characterization took two years as part of a Phase I effort focused on the entire Illinois Basin, located in Illinois, Indiana, and Kentucky, USA. Seeking the cooperation of an industrial source of CO2 and site selection within the Basin took place during Phase II, while most of the concurrent research emphasis was on a set of small-scale tests of Enhanced Oil Recovery (EOR) and CO2 injection into a coal seam. Phase III began the commitment to the 1 million-tonne test site development through the collaboration of the Archer Daniels Midland Company (ADM), who is providing a site and the CO2 and is developing a compression facility; of Schlumberger Carbon Services, who is providing expertise for operations, drilling, geophysics, risk assessment, and reservoir modelling; and of the Illinois State Geological Survey (ISGS), whose geologists and engineers lead the Midwest Geological Sequestration Consortium (MGSC). Communications and outreach has been a collaborative effort of ADM, ISGS and Schlumberger Carbon Services. The Consortium is one of the seven Regional Carbon Sequestration Partnerships, a carbon sequestration research program supported by the National Energy Technology Laboratory of the U.S. Department of Energy. © 2011 Published by Elsevier Ltd.

  20. Carbon oxidation state as a metric for describing the chemistry of atmospheric organic aerosol

    Energy Technology Data Exchange (ETDEWEB)

    Massachusetts Institute of Technology; Kroll, Jesse H.; Donahue, Neil M.; Jimenez, Jose L.; Kessler, Sean H.; Canagaratna, Manjula R.; Wilson, Kevin R.; Altieri, Katye E.; Mazzoleni, Lynn R.; Wozniak, Andrew S.; Bluhm, Hendrik; Mysak, Erin R.; Smith, Jared D.; Kolb, Charles E.; Worsnop, Douglas R.

    2010-11-05

    A detailed understanding of the sources, transformations, and fates of organic species in the environment is crucial because of the central roles that organics play in human health, biogeochemical cycles, and Earth's climate. However, such an understanding is hindered by the immense chemical complexity of environmental mixtures of organics; for example, atmospheric organic aerosol consists of at least thousands of individual compounds, all of which likely evolve chemically over their atmospheric lifetimes. Here we demonstrate the utility of describing organic aerosol (and other complex organic mixtures) in terms of average carbon oxidation state (OSC), a quantity that always increases with oxidation, and is readily measured using state-of-the-art analytical techniques. Field and laboratory measurements of OSC, using several such techniques, constrain the chemical properties of the organics and demonstrate that the formation and evolution of organic aerosol involves simultaneous changes to both carbon oxidation state and carbon number (nC).
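
    For reference, the average carbon oxidation state of an organic mixture is commonly approximated from elemental ratios (a standard approximation from this line of work, stated here from general knowledge rather than taken from the record itself):

    \[
      \overline{\mathrm{OS}}_{\mathrm{C}} \;\approx\; 2\,\frac{\mathrm{O}}{\mathrm{C}} - \frac{\mathrm{H}}{\mathrm{C}},
    \]

    valid when oxygen and hydrogen dominate the heteroatom content; for example, acetic acid (O:C = 1, H:C = 2) gives an average carbon oxidation state of about 0.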

  1. Energy dependent track structure parametrizations for protons and carbon ions based on nano-metric simulations

    International Nuclear Information System (INIS)

    Frauke, A.; Wilkens, J.J.; Villagrasa, C.; Rabus, H.

    2015-01-01

    The BioQuaRT project within the European Metrology Research Programme aims at correlating ion track structure characteristics with the biological effects of radiation, and develops measurement and simulation techniques for determining ion track structure on different length scales from about 2 nm to about 10 μm. Within this framework, we investigate methods to translate track-structure quantities derived on a nanometer scale to macroscopic dimensions. Input data sets were generated by simulations of ion tracks of protons and carbon ions in liquid water using the Geant4 Monte Carlo toolkit with the Geant4-DNA processes. Based on the energy transfer points - recorded with nanometer resolution - we investigated parametrizations of overall properties of ion track structure. Three different track structure parametrizations have been developed, using the distances to the 10 nearest neighbouring ionisations, the radial energy distribution, and ionisation cluster size distributions. These parametrizations of nanometer-scale track structure form a basis for deriving biologically relevant mean values, which are essential in the clinical situation where each voxel is exposed to a mixed radiation field. (authors)

  2. Semantic metrics

    OpenAIRE

    Hu, Bo; Kalfoglou, Yannis; Dupplaw, David; Alani, Harith; Lewis, Paul; Shadbolt, Nigel

    2006-01-01

    In the context of the Semantic Web, many ontology-related operations, e.g. ontology ranking, segmentation, alignment, articulation, reuse, evaluation, can be boiled down to one fundamental operation: computing the similarity and/or dissimilarity among ontological entities, and in some cases among ontologies themselves. In this paper, we review standard metrics for computing distance measures and we propose a series of semantic metrics. We give a formal account of semantic metrics drawn from a...

  3. Cosmogenic activation of germanium used for tonne-scale rare event search experiments

    Science.gov (United States)

    Wei, W.-Z.; Mei, D.-M.; Zhang, C.

    2017-11-01

    We report a comprehensive study of cosmogenic activation of germanium used for tonne-scale rare event search experiments. The exposure of germanium to cosmic rays on the Earth's surface is simulated, with and without a shielding container, using Geant4 for a given cosmic muon, neutron, and proton energy spectrum. The production rates of various radioactive isotopes are obtained for different sources separately. We find that fast-neutron-induced interactions dominate the production rate of cosmogenic activation. Geant4-based simulation results are compared with the calculation of ACTIVIA and the available experimental data. A reasonable agreement between Geant4 simulations and several experimental data sets is presented. We predict that cosmogenic activation of germanium can set limits on the sensitivity of the next generation of tonne-scale experiments.

  4. Results from the 1 tonne*year Dark Matter Search with XENON1T

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Weakly Interacting Massive Particles (WIMPs) are an excellent candidate for the mysterious Dark Matter in the Universe. The XENON1T experiment at LNGS is the world’s largest and most sensitive experiment for the direct detection of WIMPs via nuclear recoils. Details of the experiment and of the achieved unprecedented low background conditions will be covered and new results from a record exposure of 1 tonne x year will be presented for the first time.

  5. Carbon dioxide emissions from fossil-fuel use, 1751-1950

    Energy Technology Data Exchange (ETDEWEB)

    Andres, R.J.; Fielding, D.J.; Marland, G.; Boden, T.A.; Kumar, N.; Kearney, A.T. [University of Alaska, Fairbanks, AK (US). Inst. of Northern Engineering]

    1999-09-01

    Newly compiled energy statistics allow the complete time series of carbon dioxide (CO2) emissions from fossil-fuel use, from 1751 to the present, to be estimated. The time series begins with 3 × 10^6 metric tonnes of carbon (C). The CO2 flux increased exponentially until World War I. The time series derived here seamlessly joins the modern, 1950-to-present time series. Total cumulative CO2 emissions through 1949 were 61.0 × 10^9 tonnes C from fossil-fuel use, virtually all since the beginning of the Industrial Revolution around 1860. The rate of growth has continued to increase into the present era, generating debate on the probability of enhanced greenhouse warming. In addition to global totals, national totals and 1° global distributions of the data have been calculated.

  6. Electrode fabrication for Lithium-ion batteries by intercalation of carbon nanotubes inside nanometric pores of silver foam

    International Nuclear Information System (INIS)

    Khoshnevisan, B.

    2011-01-01

    Here we report an effort to improve the preparation of working electrodes (Ag + carbon nanotubes) for Li-ion battery applications. Nanoscale silver foam with a high specific area has been employed as a frame for loading carbon nanotubes by the electrophoretic deposition method. On this basis, the prepared electrodes show very good stability and charge-discharge cycle reversibility.

  7. Metric learning

    CERN Document Server

    Bellet, Aurelien; Sebban, Marc

    2015-01-01

    Similarity between objects plays an important role in both human cognitive processes and artificial systems for recognition and categorization. How to appropriately measure such similarities for a given task is crucial to the performance of many machine learning, pattern recognition and data mining methods. This book is devoted to metric learning, a set of techniques to automatically learn similarity and distance functions from data that has attracted a lot of interest in machine learning and related fields in the past ten years. In this book, we provide a thorough review of the metric learnin

  8. The relationship of metals, bifenthrin, physical habitat metrics, grain size, total organic carbon, dissolved oxygen and conductivity to Hyalella sp. abundance in urban California streams.

    Science.gov (United States)

    Hall, Lenwood W; Anderson, Ronald D

    2013-01-01

    The objectives of this study were to determine the relationship between Hyalella sp. abundance in four urban California streams and the following parameters: (1) 8 bulk metals (As, Cd, Cr, Cu, Pb, Hg, Ni, and Zn) and their associated sediment Threshold Effect Levels (TELs); (2) bifenthrin sediment concentrations; (3) 10 habitat metrics and total score; (4) grain size (% sand, silt and clay); (5) Total Organic Carbon (TOC); (6) dissolved oxygen; and (7) conductivity. California stream data used for this study were collected from Kirker Creek (2006 and 2007), Pleasant Grove Creek (2006, 2007 and 2008), Salinas streams (2009 and 2010) and Arcade Creek (2009 and 2010). Hyalella abundance in the four California streams generally declined when metals concentrations were elevated beyond the TELs. There was also a statistically significant negative relationship between Hyalella abundance and % silt for these 4 California streams as Hyalella were generally not present in silt areas. No statistically significant relationships were reported between Hyalella abundance and metals concentrations, bifenthrin concentrations, habitat metrics, % sand, % clay, TOC, dissolved oxygen and conductivity. The results from this study highlight the complexity of assessing which factors are responsible for determining the abundance of amphipods, such as Hyalella sp., in the natural environment.

  9. The implications of carbon dioxide and methane exchange for the heavy mitigation RCP2.6 scenario under two metrics

    NARCIS (Netherlands)

    Huntingford, Chris; Lowe, Jason A.; Howarth, Nicholas; Bowerman, Niel H.A.; Gohar, Laila K.; Otto, Alexander; Lee, David S.; Smith, Stephen M.; den Elzen, Michel G.J.; van Vuuren, Detlef P.; Millar, Richard J.; Allen, Myles R.

    2015-01-01

    Greenhouse gas emissions associated with Representative Concentration Pathway RCP2.6 could limit global warming to around or below a 2°C increase since pre-industrial times. However this scenario implies very large and rapid reductions in both carbon dioxide (CO2) and non-CO2 emissions, and suggests

  10. Metrication manual

    International Nuclear Information System (INIS)

    Harper, A.F.A.; Digby, R.B.; Thong, S.P.; Lacey, F.

    1978-04-01

    In April 1978 a meeting of senior metrication officers convened by the Commonwealth Science Council of the Commonwealth Secretariat, was held in London. The participants were drawn from Australia, Bangladesh, Britain, Canada, Ghana, Guyana, India, Jamaica, Papua New Guinea, Solomon Islands and Trinidad and Tobago. Among other things, the meeting resolved to develop a set of guidelines to assist countries to change to SI and to compile such guidelines in the form of a working manual

  11. Can tonne-scale direct detection experiments discover nuclear dark matter?

    Energy Technology Data Exchange (ETDEWEB)

    Butcher, Alistair; Kirk, Russell; Monroe, Jocelyn; West, Stephen M., E-mail: Alistair.Butcher.2010@live.rhul.ac.uk, E-mail: Russell.Kirk.2008@live.rhul.ac.uk, E-mail: Jocelyn.Monroe@rhul.ac.uk, E-mail: Stephen.West@rhul.ac.uk [Department of Physics, Royal Holloway University of London, Egham, Surrey, TW20 0EX (United Kingdom)

    2017-10-01

    Models of nuclear dark matter propose that the dark sector contains large composite states consisting of dark nucleons, in analogy to Standard Model nuclei. We examine the direct detection phenomenology of a particular class of nuclear dark matter model at the current generation of tonne-scale liquid noble experiments, in particular DEAP-3600 and XENON1T. In our chosen nuclear dark matter scenario, distinctive features arise in the recoil energy spectra due to the non-point-like nature of the composite dark matter state. We calculate the number of events required to distinguish these spectra from those of a standard point-like WIMP state with a decaying exponential recoil spectrum. In the most favourable regions of nuclear dark matter parameter space, we find that a few tens of events are needed to distinguish nuclear dark matter from WIMPs at the 3σ level in a single experiment. Given the total exposure time of DEAP-3600 and XENON1T, we find that at best a 2σ distinction is possible by these experiments individually, while 3σ sensitivity is reached for a range of parameters by the combination of the two experiments. We show that future upgrades of these experiments have the potential to distinguish a large range of nuclear dark matter models from that of a WIMP at greater than 3σ.

  12. Can tonne-scale direct detection experiments discover nuclear dark matter?

    International Nuclear Information System (INIS)

    Butcher, Alistair; Kirk, Russell; Monroe, Jocelyn; West, Stephen M.

    2017-01-01

    Models of nuclear dark matter propose that the dark sector contains large composite states consisting of dark nucleons, in analogy to Standard Model nuclei. We examine the direct detection phenomenology of a particular class of nuclear dark matter model at the current generation of tonne-scale liquid noble experiments, in particular DEAP-3600 and XENON1T. In our chosen nuclear dark matter scenario, distinctive features arise in the recoil energy spectra due to the non-point-like nature of the composite dark matter state. We calculate the number of events required to distinguish these spectra from those of a standard point-like WIMP state with a decaying exponential recoil spectrum. In the most favourable regions of nuclear dark matter parameter space, we find that a few tens of events are needed to distinguish nuclear dark matter from WIMPs at the 3σ level in a single experiment. Given the total exposure time of DEAP-3600 and XENON1T, we find that at best a 2σ distinction is possible by these experiments individually, while 3σ sensitivity is reached for a range of parameters by the combination of the two experiments. We show that future upgrades of these experiments have the potential to distinguish a large range of nuclear dark matter models from that of a WIMP at greater than 3σ.

  13. Carbon dioxide and climate impulse response functions for the computation of greenhouse gas metrics: a multi-model analysis

    Directory of Open Access Journals (Sweden)

    F. Joos

    2013-03-01

    The responses of carbon dioxide (CO2) and other climate variables to an emission pulse of CO2 into the atmosphere are often used to compute the Global Warming Potential (GWP) and Global Temperature change Potential (GTP), to characterize the response timescales of Earth System models, and to build reduced-form models. In this carbon cycle-climate model intercomparison project, which spans the full model hierarchy, we quantify responses to emission pulses of different magnitudes injected under different conditions. The CO2 response shows the known rapid decline in the first few decades followed by a millennium-scale tail. For a 100 Gt-C emission pulse added to a constant CO2 concentration of 389 ppm, 25 ± 9% is still found in the atmosphere after 1000 yr; the ocean has absorbed 59 ± 12% and the land the remainder (16 ± 14%). The response in global mean surface air temperature is an increase of 0.20 ± 0.12 °C within the first twenty years; thereafter and until year 1000, temperature decreases only slightly, whereas ocean heat content and sea level continue to rise. Our best estimate for the Absolute Global Warming Potential, given by the time-integrated response in CO2 at year 100 multiplied by its radiative efficiency, is 92.5 × 10^-15 yr W m^-2 per kg-CO2. This value very likely (5 to 95% confidence) lies within the range of (68 to 117) × 10^-15 yr W m^-2 per kg-CO2. Estimates for the time-integrated response in CO2 published in the IPCC First, Second, and Fourth Assessments and our multi-model best estimate all agree within 15% during the first 100 yr. The integrated CO2 response, normalized by the pulse size, is lower for pre-industrial conditions compared to present day, and lower for smaller pulses than larger pulses. In contrast, the response in temperature, sea level and ocean heat content is less sensitive to these choices. Although choices in pulse size, background concentration, and model lead to uncertainties, the most important and
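
    The Absolute Global Warming Potential quoted above follows the standard definition (given here for context; the symbols follow common usage rather than the paper's own notation):

    \[
      \mathrm{AGWP}_{\mathrm{CO_2}}(H) = \int_0^{H} \mathrm{RE}_{\mathrm{CO_2}}\,\mathrm{IRF}_{\mathrm{CO_2}}(t)\,\mathrm{d}t,
      \qquad
      \mathrm{GWP}_x(H) = \frac{\mathrm{AGWP}_x(H)}{\mathrm{AGWP}_{\mathrm{CO_2}}(H)},
    \]

    where RE is the radiative efficiency, IRF(t) is the fraction of the emission pulse remaining in the atmosphere at time t, and H = 100 yr for the value of 92.5 × 10^-15 yr W m^-2 per kg CO2 reported above.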

  14. Experience of European irradiated fuel transport - the first four hundred tonnes

    International Nuclear Information System (INIS)

    Curtis, H.W.

    1977-01-01

    The paper describes the successful integration of the experience of its three shareholders into an international company providing an irradiated fuel transport service throughout Europe. The experience of transporting more than 400 tonnes of irradiated uranium from fifteen power reactors is used to illustrate the flexibility which the transport organisation requires when the access and handling facilities are different at almost every reactor. Variations in fuel cross sections and lengths of fuel elements used in first generation reactors created the need for first generation flasks with sufficient variants to accommodate all reactor fuels but the trend now is towards standardisation of flasks to perhaps two basic types. Increases in fuel rating have raised the flask shielding and heat dissipation requirements and have influenced the design of later flasks. More stringent criticality acceptance criteria have tended to reduce the flask capacity below the maximum number of elements which could physically be contained. Reprocessing plant acceptance criteria initiated because of the presence of substantial quantities of loose crud released in the flask and the need to transport substantial numbers of failed elements have also reduced the flask capacity. Different modes of transport have been developed to cater for the various limitations on access to reactor sites arising from geographical and routing considerations. The safety record of irradiated fuel transport is examined with explanation of the means whereby this has been achieved. The problems of programming the movement of a pool of flasks for fifteen reactors in eight countries are discussed together with the steps taken to ensure that the service operates fairly to give priority to those reactors with the greatest problems. The transport of European irradiated fuel can be presented as an example of international collaboration which works

  15. Experience of European LWR irradiated fuel transport: the first five hundred tonnes

    International Nuclear Information System (INIS)

    Curtis, H.W.

    1978-01-01

    The paper describes the service provided by an international company specializing in the transport of LWR irradiated fuel throughout Europe. Methods of transport used to the reprocessing plants at La Hague and Windscale include road transport of 38 te flasks over the whole route; transport of flasks of between 55 and 105 te by rail, with movement between the railhead and the reprocessing plant, where required, performed by road using heavy trailers; roll-on, roll-off sea ferries; and charter ships. Different modes of transport have been developed to cater for the various limitations on access to reactor sites arising from geographical and routing considerations. The experience of transporting more than 500 tonnes of irradiated uranium from twenty-one power reactors is used to illustrate the flexibility which the transport organization requires when the access and handling facilities are different at almost every reactor. Variations in the fuel cross-sections and lengths of fuel elements used in first-generation reactors created the need for first-generation flasks with sufficient variants to accommodate all reactor fuels, but the trend now is towards standardization of flasks to perhaps two basic types. The safety record of irradiated fuel transport is examined, with explanation of the means whereby this has been achieved. The problems of programming the movement of a pool of eighteen flasks for twenty-one reactors in eight countries are discussed, together with the steps taken to ensure that the service operates fairly to give priority to those reactors with the greatest problems. The transport of irradiated fuel across several national frontiers is an international task which requires an international company. The transport of European irradiated fuel can be presented as an example of international collaboration which works.

  16. Trefoil bundles of NPD 7-element size fuel irradiated to 9100 MWd/tonne U

    Energy Technology Data Exchange (ETDEWEB)

    Bain, A S; Christie, J; Daniel, A R

    1964-01-15

    NPD prototype elements (1 in. OD, 19 in. long) were assembled into trefoil bundles and irradiated in the X-5 pressurized-water loop of NRX. The first tests were for only a few weeks but showed that elements made by sheathing UO2 pellets in Zircaloy-2 behaved well under irradiation; later similar elements were irradiated for 18000 hours to a burn-up of 9100 MWd/tonne U at ∫k dθ = 40 W/cm. The dimensional stability of all the elements was good. Only those subjected to long irradiation showed progressive diametral increases, and these were attributed to relocation of the UO2 during interim inspections. Length measurements demonstrated that pellet end-dishing is effective in controlling axial expansion, but that for a given depth of dishing the amount of expansion depends on the shoulder width. The extent of grain growth in the UO2 was compatible with previously reported results when the duration of irradiation, density of the fuel, and variations in growth characteristics of the different batches of UO2 are considered. The elements taken to high irradiation released up to 135 ml of fission-product gases, which is 2% of the amount formed. The transverse tensile strength of ring samples from the Zircaloy-2 sheaths increased from 75000 to 95000 lb/in^2 at room temperature, but the ductility dropped. The completely brittle fracture of some rings was due to ZrH2 precipitation. The failure of one element was caused by increased stress due to a higher heat rating, combined with low ductility of the Zircaloy-2 resulting from radiation damage and with precipitation of ZrH2 because of a lower coolant temperature. The fission-product release from the split was not excessive, and the element was easily withdrawn from the loop after operating at full power for four days from the time of the failure. (author)

  17. A 4 tonne demonstrator for large-scale dual-phase liquid argon time projection chambers arXiv

    CERN Document Server

    Aimard, B.; Asaadi, J.; Auger, M.; Aushev, V.; Autiero, D.; Badoi, M.M.; Balaceanu, A.; Balik, G.; Balleyguier, L.; Bechetoille, E.; Belver, D.; Blebea-Apostu, A.M.; Bolognesi, S.; Bordoni, S.; Bourgeois, N.; Bourguille, B.; Bremer, J.; Brown, G.; Brunetti, G.; Caiulo, D.; Calin, M.; Calvo, E.; Campanelli, M.; Cankocak, K.; Cantini, C.; Carlus, B.; Cautisanu, B.M.; Chalifour, M.; Chappuis, A.; Charitonidis, N.; Chatterjee, A.; Chiriacescu, A.; Chiu, P.; Conforti, S.; Cotte, Ph.; Crivelli, P.; Cuesta, C.; Dawson, J.; De Bonis, I.; De La Taille, C.; Delbart, A.; Desforge, D.; Di Luise, S.; Dimitru, B.S.; Doizon, F.; Drancourt, C.; Duchesneau, D.; Dulucq, F.; Dumarchez, J.; Duval, F.; Emery, S.; Ereditato, A.; Esanu, T.; Falcone, A.; Fusshoeller, K.; Gallego-Ros, A.; Galymov, V.; Geffroy, N.; Gendotti, A.; Gherghel-Lascu, M.; Giganti, C.; Gil-Botella, I.; Girerd, C.; Gomoiu, M.C.; Gorodetzky, P.; Hamada, E.; Hanni, R.; Hasegawa, T.; Holin, A.; Horikawa, S.; Ikeno, M.; Jiménez, S.; Jipa, A.; Karolak, M.; Karyotakis, Y.; Kasai, S.; Kasami, K.; Kishishita, T.; Kreslo, I.; Kryn, D.; Lastoria, C.; Lazanu, I.; Lehmann-Miotto, G.; Lira, N.; Loo, K.; Lorca, D.; Lutz, P.; Lux, T.; Maalampi, J.; Maire, G.; Maki, M.; Manenti, L.; Margineanu, R.M.; Marteau, J.; Martin-Chassard, G.; Mathez, H.; Mazzucato, E.; Misitano, G.; Mitrica, B.; Mladenov, D.; Molina Bueno, L.; Moreno Martínez, C.; Mols, J.Ph.; Mosu, T.S.; Mu, W.; Munteanu, A.; Murphy, S.; Nakayoshi, K.; Narita, S.; Navas-Nicolás, D.; Negishi, K.; Nessi, M.; Niculescu-Oglinzanu, M.; Nita, L.; Noto, F.; Noury, A.; Onishchuk, Y.; Palomares, C.; Parvu, M.; Patzak, T.; Pénichot, Y.; Pennacchio, E.; Periale, L.; Pessard, H.; Pietropaolo, F.; Piret, Y.; Popov, B.; Pugnere, D.; Radics, B.; Redondo, D.; Regenfus, C.; Remoto, A.; Resnati, F.; Rigaut, Y.A.; Ristea, C.; Rubbia, A.; Saftoiu, A.; Sakashita, K.; Sanchez, F.; Santos, C.; Scarpelli, A.; Schloesser, C.; Scotto Lavina, L.; Sendai, K.; Sergiampietri, F.; Shahsavarani, S.; Shoji, M.; Sinclair, J.; Soto-Oton, J.; Stanca, D.L.; Stefan, D.; Stroescu, P.; Sulej, R.; Tanaka, M.; Toboaru, V.; Tonazzo, A.; Tromeur, W.; Trzaska, W.H.; Uchida, T.; Vannucci, F.; Vasseur, G.; Verdugo, A.; Viant, T.; Vihonen, S.; Vilalte, S.; Weber, M.; Wu, S.; Yu, J.; Zambelli, L.; Zito, M.

    A 10 kilo-tonne dual-phase liquid argon TPC is one of the detector options considered for the Deep Underground Neutrino Experiment (DUNE). The detector technology relies on amplification of the ionisation charge in ultra-pure argon vapour and offers several advantages compared to the traditional single-phase liquid argon TPCs. A 4.2 tonne dual-phase liquid argon TPC prototype, the largest of its kind, with an active volume of 3 x 1 x 1 m^3 has been constructed and operated at CERN. In this paper we describe in detail the experimental setup and detector components, and report on the operation experience. We also present the first results on the achieved charge amplification, prompt scintillation and electroluminescence detection, and purity of the liquid argon, from analyses of a collected sample of cosmic ray muons.

  18. Reheating experiment in the 35-ton pile; Experience de rechauffage sur la pile de 35 tonnes

    Energy Technology Data Exchange (ETDEWEB)

    Cherot, J; Girard, Y [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1957-07-01

    When the 35-ton pile was started up it was necessary for us, in order to study certain effects (xenon, for example), to know the anti-reactivity value of the rods as a function of their dimensions. In the reheating experiment we made use of the possibility of raising the temperature of the graphite-uranium block by simple (non-nuclear) heating in order to determine the anti-reactivity curves of the rods, and from these the overall temperature coefficient. For the latter we considered two solutions: a first in which the average temperature of the pile is defined as the arithmetical mean of the different values given by the 28 thermocouples distributed throughout the pile; and a second in which the temperature is likened to a poisoning and is weighted by the square of the flux. The way in which the measurements were made is indicated, and the different instruments used are described. The method of reheating does not permit the separation of the temperature coefficients of uranium and of graphite. The precision obtained is only moderate: it suffers from the changes of various parameters required by other manipulations carried out simultaneously (lifetime modulators, for example), and it is limited by the comparatively restricted time allowed, since more careful stabilisation at the different temperature plateaux would have necessitated long periods of reheating. (author)
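
    The abstract above contrasts two definitions of the pile's mean temperature: a plain arithmetic mean of the 28 thermocouple readings, and a mean in which the temperature is treated like a poisoning and weighted by the square of the local flux. The sketch below, with made-up thermocouple and flux values, only illustrates how the two averages differ; it is not taken from the report.

```python
# Illustrative comparison of the two mean-temperature definitions described above.
# Thermocouple readings and relative fluxes are made-up values, not report data.
import numpy as np

temps = np.array([42.0, 45.5, 51.0, 48.2, 39.9])   # deg C at a few of the 28 thermocouples
flux  = np.array([0.6, 0.9, 1.0, 0.8, 0.5])        # relative neutron flux at the same points

arithmetic_mean  = temps.mean()                        # first definition: plain average
flux_sq_weighted = np.average(temps, weights=flux**2)  # second: weighted by flux squared

print(f"arithmetic mean:      {arithmetic_mean:.1f} C")
print(f"flux^2-weighted mean: {flux_sq_weighted:.1f} C")
```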

  19. Metrics of quantum states

    International Nuclear Information System (INIS)

    Ma Zhihao; Chen Jingling

    2011-01-01

    In this work we study metrics of quantum states, which are natural generalizations of the usual trace metric and Bures metric. Some useful properties of the metrics are proved, such as the joint convexity and contractivity under quantum operations. Our result has a potential application in studying the geometry of quantum states as well as the entanglement detection.
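
    The usual trace metric and the Bures metric that this work generalizes can both be computed directly from a pair of density matrices. The sketch below is a minimal illustration assuming NumPy and SciPy; the two 2x2 states are arbitrary examples, and the code shows only the standard formulas, not the generalized metrics introduced in the paper.

```python
# Trace distance and Bures distance between two density matrices -- the two
# classical metrics that the paper's construction generalizes.
import numpy as np
from scipy.linalg import sqrtm

def trace_distance(rho, sigma):
    # T(rho, sigma) = (1/2) * sum of |eigenvalues of rho - sigma|
    return 0.5 * np.sum(np.abs(np.linalg.eigvalsh(rho - sigma)))

def bures_distance(rho, sigma):
    # D_B = sqrt(2 * (1 - sqrt(F))), with fidelity F = (Tr sqrt(sqrt(rho) sigma sqrt(rho)))^2
    s = sqrtm(rho)
    fidelity = np.real(np.trace(sqrtm(s @ sigma @ s))) ** 2
    return np.sqrt(max(0.0, 2.0 * (1.0 - np.sqrt(fidelity))))

rho   = np.array([[0.7, 0.0], [0.0, 0.3]])   # arbitrary example state
sigma = np.array([[0.5, 0.2], [0.2, 0.5]])   # another valid density matrix

print(trace_distance(rho, sigma), bures_distance(rho, sigma))
```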

  20. $\\eta$-metric structures

    OpenAIRE

    Gaba, Yaé Ulrich

    2017-01-01

    In this paper, we discuss recent results about generalized metric spaces and fixed point theory. We introduce the notion of $\eta$-cone metric spaces, give some topological properties and prove some fixed point theorems for contractive type maps on these spaces. In particular we show that these $\eta$-cone metric spaces are natural generalizations of both cone metric spaces and metric type spaces.

  1. Contract for service to produce 50 tonnes sterifeed - lesson learned from semi-commercial scale test production

    International Nuclear Information System (INIS)

    Hassan Hamdani Mutaat; Mat Rasol Awang; Zainon Said; Irwan Md Arif; Alias K Sadeli

    2005-01-01

    A contract for service was awarded to a local company to produce 50 tonnes of Sterifeed over a period of 4 months commencing 3 September 2001. The production was intended to supply enough feed for the testing of Sterifeed on cattle. The appointed contractor was required to manage the production, supply labour and provide transport for the full production operation of the plant. The production performance is discussed based on the labour cost; supervision and control; and skill and training. This report discusses and evaluates the suitability of contracting for service for future commercial-scale production. (Author)

  2. METRIC context unit architecture

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, R.O.

    1988-01-01

    METRIC is an architecture for a simple but powerful Reduced Instruction Set Computer (RISC). Its speed comes from the simultaneous processing of several instruction streams, with instructions from the various streams being dispatched into METRIC's execution pipeline as they become available for execution. The pipeline is thus kept full, with a mix of instructions for several contexts in execution at the same time. True parallel programming is supported within a single execution unit, the METRIC Context Unit. METRIC's architecture provides for expansion through the addition of multiple Context Units and of specialized Functional Units. The architecture thus spans a range of size and performance from a single-chip microcomputer up through large and powerful multiprocessors. This research concentrates on the specification of the METRIC Context Unit at the architectural level. Performance tradeoffs made during METRIC's design are discussed, and projections of METRIC's performance are made based on simulation studies.

  3. Load-cell-based weighing system for weighing 9.1- and 12.7-tonne UF6 cylinders

    International Nuclear Information System (INIS)

    McAuley, W.A.; Kane, W.R.

    1986-01-01

    For the independent verification of UF6 cylinder masses by the International Atomic Energy Agency (IAEA) at uranium enrichment facilities, an 18-tonne capacity Load-Cell-Based Weighing System (LCBWS) has been developed. The system was developed at Brookhaven National Laboratory and the Oak Ridge Gaseous Diffusion Plant and calibrated at the US National Bureau of Standards. The principal components of the LCBWS are two load cells, with readout and ancillary equipment, and a lifting fixture that couples the load cells to a cylinder. Initial experience with the system demonstrates that it has the advantages of transportability, ease of application, stability, and an attainable accuracy of 2 kg or better for a full cylinder.

  4. Metric diffusion along foliations

    CERN Document Server

    Walczak, Szymon M

    2017-01-01

    Up-to-date research in metric diffusion along compact foliations is presented in this book. Beginning with fundamentals from optimal transportation theory and the theory of foliations, the book moves on to cover the Wasserstein distance, the Kantorovich Duality Theorem, and the metrization of the weak topology by the Wasserstein distance. Metric diffusion is defined, the topology of the metric space is studied and the limits of diffused metrics along compact foliations are discussed. Essentials on foliations, holonomy, heat diffusion, and compact foliations are detailed and vital technical lemmas are proved to aid understanding. Graduate students and researchers in geometry, topology and dynamics of foliations and laminations will find this supplement useful as it presents facts about metric diffusion along non-compact foliations and provides a full description of the limit for metrics diffused along foliations with at least one compact leaf in two dimensions.

  5. Metric modular spaces

    CERN Document Server

    Chistyakov, Vyacheslav

    2015-01-01

    Aimed toward researchers and graduate students familiar with elements of functional analysis, linear algebra, and general topology, this book contains a general study of modulars, modular spaces, and metric modular spaces. Modulars may be thought of as generalized velocity fields and serve two important purposes: they generate metric spaces in a unified manner and they provide a weaker convergence, the modular convergence, whose topology is non-metrizable in general. Metric modular spaces are extensions of metric spaces, metric linear spaces, and classical modular linear spaces. The topics covered include the classification of modulars, metrizability of modular spaces, modular transforms and duality between modular spaces, and metric and modular topologies. Applications illustrated in this book include: the description of superposition operators acting in modular spaces, the existence of regular selections of set-valued mappings, new interpretations of spaces of Lipschitzian and absolutely continuous mappings, the existe...

  6. Prognostic Performance Metrics

    Data.gov (United States)

    National Aeronautics and Space Administration — This chapter presents several performance metrics for offline evaluation of prognostics algorithms. A brief overview of different methods employed for performance...

  7. Overview of journal metrics

    Directory of Open Access Journals (Sweden)

    Kihong Kim

    2018-02-01

    Various kinds of metrics used for the quantitative evaluation of scholarly journals are reviewed. The impact factor and related metrics, including the immediacy index and the aggregate impact factor, which are provided by the Journal Citation Reports, are explained in detail. The Eigenfactor score and the article influence score are also reviewed. In addition, journal metrics such as CiteScore, Source Normalized Impact per Paper, SCImago Journal Rank, h-index, and g-index are discussed. Limitations and problems of these metrics are pointed out. We should be cautious not to rely too heavily on such quantitative measures when evaluating journals or researchers.
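
    Two of the author-level indicators mentioned above, the h-index and the g-index, have simple operational definitions that a short sketch can make concrete. The citation counts below are invented for illustration.

```python
# h-index: largest h such that h papers have at least h citations each.
# g-index: largest g such that the top g papers together have at least g^2 citations.
def h_index(citations):
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def g_index(citations):
    ranked = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(ranked, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

papers = [25, 17, 12, 9, 7, 4, 3, 1, 0]    # made-up citation counts
print(h_index(papers), g_index(papers))    # -> 5 8
```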

  8. Irradiation behaviour of solid and hollow U{sub 3}Si fuel elements: results to 15,000 MWd/tonne U

    Energy Technology Data Exchange (ETDEWEB)

    Feraday, M A; Chalder, G H; Cotnam, K D

    1969-06-15

    U{sub 3}Si fuel elements clad in zirconium alloy sheaths have been irradiated to burnups close to 15,000 MWd/tonne U in pressurized water at 220{sup o}C, 98 bars. The results show that the external swelling can be controlled by incorporating free volume in the element. The dimensional stability of such elements is adequate to permit their use in power reactor fuel bundles. A diameter increase of 1.2% had occurred in an element initially containing 12.8% total free volume, after a burnup of 14,700 MWd/tonne U. There was no change in diameter between burnups of 5200 and 14,700 MWd/tonne U. Elements containing 3% total free volume had increased in diameter about 2.5% at 2000 MWd/tonne U compared to 0.2% at 9500 MWd/tonne U for elements containing 22% total free volume. The observed swelling in the U{sub 3}Si is discussed in terms of possible mechanisms. (author)

  9. Brand metrics that matter

    NARCIS (Netherlands)

    Muntinga, D.; Bernritter, S.

    2017-01-01

    The brand is becoming increasingly central to the organization. It is therefore essential to measure the brand's health, performance and development. Selecting the right brand metrics, however, is a challenge. An enormous number of metrics compete for brand managers' attention. But which...

  10. Privacy Metrics and Boundaries

    NARCIS (Netherlands)

    L-F. Pau (Louis-François)

    2005-01-01

    This paper aims at defining a set of privacy metrics (quantitative and qualitative) for the relation between a privacy protector and an information gatherer. The aims of such metrics are: to allow the assessment and comparison of different user scenarios and their differences; for

  11. The ALICE collaboration has just conducted one of its most spectacular transport operations to date: structures weighing several tonnes are moved with millimetric precision

    CERN Multimedia

    Maximilien Brice

    2005-01-01

    The ALICE collaboration has just lifted the dipole of the muon spectrometer and reassembled it on the other side of the huge solenoid magnet. This incredible feat involved lifting no fewer than 900 tonnes of equipment over the red octagonal yoke inherited from the L3 experiment at a height of 18 metres. A special lifting gantry weighing 5 tonnes had to be developed to move and install the coils. Huge clamps, which can be seen at the front, were used to rotate these enormous 32-tonne components. The whole assembly was raised to the cavern ceiling using an overhead travelling crane. With just 3 centimetres to spare below and 2 centimetres above, there was just enough room for the coil to pass. The operation required the overhead travelling crane to be operated with extreme precision. The coil was then placed on a 4.5-metre-high platform on the other side of the magnet.

  12. A Tale of Two Forest Carbon Assessments in the Eastern United States: Forest Use Versus Cover as a Metric of Change

    Science.gov (United States)

    C. W. Woodall; B. F. Walters; M. B. Russell; J. W. Coulston; G. M. Domke; A. W. D' Amato; P. A. Sowers

    2016-01-01

    The dynamics of land-use practices (for example, forest versus settlements) is often a major driver of changes in terrestrial carbon (C). As the management and conservation of forest land uses are considered a means of reducing future atmospheric CO2 concentrations, the monitoring of forest C stocks and stock change by categories of land-use...

  13. Holographic Spherically Symmetric Metrics

    Science.gov (United States)

    Petri, Michael

    The holographic principle (HP) conjectures that the maximum number of degrees of freedom of any realistic physical system is proportional to the system's boundary area. The HP has its roots in the study of black holes. It has recently been applied to cosmological solutions. In this article we apply the HP to spherically symmetric static space-times. We find that any regular spherically symmetric object saturating the HP is subject to tight constraints on the (interior) metric, energy-density, temperature and entropy-density. Whenever gravity can be described by a metric theory, gravity is macroscopically scale invariant and the laws of thermodynamics hold locally and globally, the (interior) metric of a regular holographic object is uniquely determined up to a constant factor and the interior matter-state must follow well defined scaling relations. When the metric theory of gravity is general relativity, the interior matter has an overall string equation of state (EOS) and a unique total energy-density. Thus the holographic metric derived in this article can serve as a simple interior 4D realization of Mathur's string fuzzball proposal. Some properties of the holographic metric and its possible experimental verification are discussed. The geodesics of the holographic metric describe an isotropically expanding (or contracting) universe with a nearly homogeneous matter-distribution within the local Hubble volume. Due to the overall string EOS the active gravitational mass-density is zero, resulting in a coasting expansion with Ht = 1, which is compatible with the recent GRB-data.

  14. Carbon dioxide emissions from fossil-fuel use, 1751-1950

    Energy Technology Data Exchange (ETDEWEB)

    Andres, R.J.; Fielding, D.J. [Alaska Fairbanks Univ., Fairbanks AK (United States). Inst. of Northern Engineering; Marland, G.; Boden, T.A. [Oak Ridge National Lab., TN (United States). Environmental Sciences Div.; Kumar, N.; Kearney, A.T. [153 East 53rd Street, New York, NY (United States)

    1999-09-01

    Newly compiled energy statistics allow for an estimation of the complete time series of carbon dioxide (CO{sub 2}) emissions from fossil-fuel use for the years 1751 to the present. The time series begins with 3 x 10{sup 6} metric tonnes of carbon (C). This initial flux represents the early stages of the fossil-fuel era. The CO{sub 2} flux increased exponentially until World War I. The time series derived here seamlessly joins the modern 1950-to-present time series. Total cumulative CO{sub 2} emissions through 1949 were 61.0 x 10{sup 9} tonnes C from fossil-fuel use, virtually all since the beginning of the Industrial Revolution around 1860. Emissions continue to grow at present, generating debate on the probability of enhanced greenhouse warming. In addition to global totals, national totals and 1 deg global distributions of the data have been calculated. 18 refs, 4 figs, 2 tabs
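
    As a rough check of the figures quoted above (an initial flux of 3 x 10{sup 6} tonnes C in 1751 and a cumulative total of 61 x 10{sup 9} tonnes C through 1949), the sketch below solves for the single constant growth rate that would connect them. This is an illustrative simplification; the actual series is not one smooth exponential.

```python
# Toy reconstruction: find the constant annual growth rate r such that emissions
# starting at 3e6 tC in 1751 accumulate to ~61e9 tC by the end of 1949.
E0, YEARS, TARGET = 3.0e6, range(1751, 1950), 61.0e9

def cumulative(r):
    return sum(E0 * (1.0 + r) ** (y - 1751) for y in YEARS)

lo, hi = 0.0, 0.2
for _ in range(60):                      # bisection: cumulative(r) increases with r
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if cumulative(mid) < TARGET else (lo, mid)

print(f"implied average growth rate: {0.5 * (lo + hi) * 100:.2f} %/yr")  # roughly 3.3 %/yr
```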

  15. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  16. Tracker Performance Metric

    National Research Council Canada - National Science Library

    Olson, Teresa; Lee, Harry; Sanders, Johnnie

    2002-01-01

    .... We have developed the Tracker Performance Metric (TPM) specifically for this purpose. It was designed to measure the output performance, on a frame-by-frame basis, using its output position and quality...

  17. The WellingTONNE Challenge Toolkit: Using the RE-AIM Framework to Evaluate a Community Resource Promoting Healthy Lifestyle Behaviours

    Science.gov (United States)

    Caperchione, Cristina; Coulson, Fiona

    2010-01-01

    Objective: The RE-AIM framework has been recognized as a tool to evaluate the adoption, delivery, and sustainability of an intervention, and estimate its potential public health impact. In this study four dimensions of the RE-AIM framework (adoption, implementation, effectiveness, and maintenance) were used to evaluate the WellingTONNE Challenge…

  18. Evidence for deposition of 10 million tonnes of impact spherules across four continents 12,800 y ago

    Science.gov (United States)

    Wittke, James H.; Weaver, James C.; Bunch, Ted E.; Kennett, James P.; Kennett, Douglas J.; Moore, Andrew M.T.; Hillman, Gordon C.; Tankersly, Kenneth B.; Goodyear, Albert C.; Moore, Christopher R.; Daniel, I. Randolph; Ray, Jack H.; Lopinot, Neal H.; Ferraro, David; Israde-Alcántara, Isabel; Bischoff, James L.; DeCarli, Paul S.; Hermes, Robert E.; Kloosterman, Johan B.; Revay, Zsolt; Howard, George A.; Kimbel, David R.; Kletetschka, Gunther; Nabelek, Ladislav; Lipo, Carl P.; Sakai, Sachiko; West, Allen; Firestone, Richard B.

    2013-01-01

    Airbursts/impacts by a fragmented comet or asteroid have been proposed at the Younger Dryas onset (12.80 ± 0.15 ka) based on identification of an assemblage of impact-related proxies, including microspherules, nanodiamonds, and iridium. Distributed across four continents at the Younger Dryas boundary (YDB), spherule peaks have been independently confirmed in eight studies, but unconfirmed in two others, resulting in continued dispute about their occurrence, distribution, and origin. To further address this dispute and better identify YDB spherules, we present results from one of the largest spherule investigations ever undertaken regarding spherule geochemistry, morphologies, origins, and processes of formation. We investigated 18 sites across North America, Europe, and the Middle East, performing nearly 700 analyses on spherules using energy dispersive X-ray spectroscopy for geochemical analyses and scanning electron microscopy for surface microstructural characterization. Twelve locations rank among the world’s premier end-Pleistocene archaeological sites, where the YDB marks a hiatus in human occupation or major changes in site use. Our results are consistent with melting of sediments to temperatures >2,200 °C by the thermal radiation and air shocks produced by passage of an extraterrestrial object through the atmosphere; they are inconsistent with volcanic, cosmic, anthropogenic, lightning, or authigenic sources. We also produced spherules from wood in the laboratory at >1,730 °C, indicating that impact-related incineration of biomass may have contributed to spherule production. At 12.8 ka, an estimated 10 million tonnes of spherules were distributed across ∼50 million square kilometers, similar to well-known impact strewnfields and consistent with a major cosmic impact event.

  19. IT Project Management Metrics

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Many software and IT projects fail to meet their objectives for a variety of causes, among which the management of the projects carries a high weight. In order to deliver successful projects, lessons learned have to be used, historical data have to be collected, and metrics and indicators have to be computed and compared with those of past projects so that failure can be avoided. This paper presents some metrics that can be used for IT project management.

  20. Mass Customization Measurements Metrics

    DEFF Research Database (Denmark)

    Nielsen, Kjeld; Brunø, Thomas Ditlev; Jørgensen, Kaj Asbjørn

    2014-01-01

    A recent survey has indicated that 17 % of companies have ceased mass customizing less than 1 year after initiating the effort. This paper presents measurements for a company's mass customization performance, utilizing metrics within the three fundamental capabilities: robust process design, choice navigation, and solution space development. A mass customizer assessing performance with these metrics can identify within which areas improvement would increase competitiveness the most and enable a more efficient transition to mass customization.

  1. An Assessment of Geological Carbon Sequestration Options in the Illinois Basin

    Energy Technology Data Exchange (ETDEWEB)

    Robert Finley

    2005-09-30

    The Midwest Geological Sequestration Consortium (MGSC) has investigated the options for geological carbon dioxide (CO{sub 2}) sequestration in the 155,400-km{sup 2} (60,000-mi{sup 2}) Illinois Basin. Within the Basin, underlying most of Illinois, western Indiana, and western Kentucky, are relatively deeper and/or thinner coal resources, numerous mature oil fields, and deep salt-water-bearing reservoirs that are potentially capable of storing CO{sub 2}. The objective of this Assessment was to determine the technical and economic feasibility of using these geological sinks for long-term storage to avoid atmospheric release of CO{sub 2} from fossil fuel combustion and thereby avoid the potential for adverse climate change. The MGSC is a consortium of the geological surveys of Illinois, Indiana, and Kentucky joined by six private corporations, five professional business associations, one interstate compact, two university researchers, two Illinois state agencies, and two consultants. The purpose of the Consortium is to assess carbon capture, transportation, and storage processes and their costs and viability in the three-state Illinois Basin region. The Illinois State Geological Survey serves as Lead Technical Contractor for the Consortium. The Illinois Basin region has annual emissions from stationary anthropogenic sources exceeding 276 million metric tonnes (304 million tons) of CO{sub 2} (>70 million tonnes (77 million tons) carbon equivalent), primarily from coal-fired electric generation facilities, some of which burn almost 4.5 million tonnes (5 million tons) of coal per year. Assessing the options for capture, transportation, and storage of the CO{sub 2} emissions within the region has been a 12-task, 2-year process that has assessed 3,600 million tonnes (3,968 million tons) of storage capacity in coal seams, 140 to 440 million tonnes (154 to 485 million tons) of capacity in mature oil reservoirs, 7,800 million tonnes (8,598 million tons) of capacity in saline
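
    The assessment above quotes regional emissions both as CO2 (276 million tonnes) and as carbon equivalent (>70 million tonnes). The conversion between the two is simply the carbon mass fraction of CO2 (12/44), as the short sketch below illustrates.

```python
# CO2 <-> carbon-equivalent conversion used when comparing the two figures above.
CO2_TO_C = 12.0 / 44.0                 # carbon mass fraction of CO2

co2_mt = 276.0                         # million tonnes CO2 per year (from the assessment)
print(f"{co2_mt} Mt CO2/yr ~= {co2_mt * CO2_TO_C:.1f} Mt C/yr")   # ~75 Mt C, i.e. ">70 million tonnes"
```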

  2. Fault Management Metrics

    Science.gov (United States)

    Johnson, Stephen B.; Ghoshal, Sudipto; Haste, Deepak; Moore, Craig

    2017-01-01

    This paper describes the theory and considerations in the application of metrics to measure the effectiveness of fault management. Fault management refers here to the operational aspect of system health management, and as such is considered as a meta-control loop that operates to preserve or maximize the system's ability to achieve its goals in the face of current or prospective failure. As a suite of control loops, the metrics to estimate and measure the effectiveness of fault management are similar to those of classical control loops in being divided into two major classes: state estimation, and state control. State estimation metrics can be classified into lower-level subdivisions for detection coverage, detection effectiveness, fault isolation and fault identification (diagnostics), and failure prognosis. State control metrics can be classified into response determination effectiveness and response effectiveness. These metrics are applied to each and every fault management control loop in the system, for each failure to which they apply, and probabilistically summed to determine the effectiveness of these fault management control loops to preserve the relevant system goals that they are intended to protect.

  3. Deep Transfer Metric Learning.

    Science.gov (United States)

    Junlin Hu; Jiwen Lu; Yap-Peng Tan; Jie Zhou

    2016-12-01

    Conventional metric learning methods usually assume that the training and test samples are captured in similar scenarios so that their distributions are assumed to be the same. This assumption does not hold in many real visual recognition applications, especially when samples are captured across different data sets. In this paper, we propose a new deep transfer metric learning (DTML) method to learn a set of hierarchical nonlinear transformations for cross-domain visual recognition by transferring discriminative knowledge from the labeled source domain to the unlabeled target domain. Specifically, our DTML learns a deep metric network by maximizing the inter-class variations and minimizing the intra-class variations, and minimizing the distribution divergence between the source domain and the target domain at the top layer of the network. To better exploit the discriminative information from the source domain, we further develop a deeply supervised transfer metric learning (DSTML) method by including an additional objective on DTML, where the output of both the hidden layers and the top layer are optimized jointly. To preserve the local manifold of input data points in the metric space, we present two new methods, DTML with autoencoder regularization and DSTML with autoencoder regularization. Experimental results on face verification, person re-identification, and handwritten digit recognition validate the effectiveness of the proposed methods.
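
    The abstract describes an objective built from three ingredients: intra-class compactness, inter-class separation, and a penalty on the distribution divergence between source and target embeddings. The sketch below (assuming PyTorch) illustrates only that general structure; the network sizes, margin, weighting, and the linear-kernel divergence are illustrative choices, not the authors' DTML/DSTML configuration.

```python
# Minimal sketch of a transfer metric learning objective: pull same-class pairs
# together, push different-class pairs apart, and penalize the divergence between
# source and target embedding distributions. All settings here are illustrative.
import torch
import torch.nn as nn

class MetricNet(nn.Module):
    def __init__(self, in_dim=64, hidden=32, out_dim=16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hidden), nn.Tanh(),
                                 nn.Linear(hidden, out_dim))
    def forward(self, x):
        return self.net(x)

def pair_loss(z, y, margin=1.0):
    d2 = torch.cdist(z, z) ** 2                        # squared distances in the learned metric
    same = (y[:, None] == y[None, :]).float()
    diff = 1.0 - same
    intra = (d2 * same).sum() / same.sum().clamp(min=1.0)
    inter = (torch.relu(margin - d2) * diff).sum() / diff.sum().clamp(min=1.0)
    return intra + inter                               # compact classes, separated classes

def mmd_linear(a, b):
    return ((a.mean(0) - b.mean(0)) ** 2).sum()        # simple mean-embedding divergence

net = MetricNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
xs, ys = torch.randn(128, 64), torch.randint(0, 5, (128,))   # labeled source batch (synthetic)
xt = torch.randn(128, 64)                                    # unlabeled target batch (synthetic)

for _ in range(100):
    zs, zt = net(xs), net(xt)
    loss = pair_loss(zs, ys) + 0.5 * mmd_linear(zs, zt)
    opt.zero_grad(); loss.backward(); opt.step()
```

    In the paper the divergence term is applied at the top layer (and, for DSTML, at hidden layers as well); here a single embedding layer stands in for that structure.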

  4. Cyber threat metrics.

    Energy Technology Data Exchange (ETDEWEB)

    Frye, Jason Neal; Veitch, Cynthia K.; Mateski, Mark Elliot; Michalski, John T.; Harris, James Mark; Trevino, Cassandra M.; Maruoka, Scott

    2012-03-01

    Threats are generally much easier to list than to describe, and much easier to describe than to measure. As a result, many organizations list threats. Fewer describe them in useful terms, and still fewer measure them in meaningful ways. This is particularly true in the dynamic and nebulous domain of cyber threats - a domain that tends to resist easy measurement and, in some cases, appears to defy any measurement. We believe the problem is tractable. In this report we describe threat metrics and models for characterizing threats consistently and unambiguously. The purpose of this report is to support the Operational Threat Assessment (OTA) phase of risk and vulnerability assessment. To this end, we focus on the task of characterizing cyber threats using consistent threat metrics and models. In particular, we address threat metrics and models for describing malicious cyber threats to US FCEB agencies and systems.

  5. Adaptive metric kernel regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    2000-01-01

    Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms...
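
    As an illustration of the idea summarized above, the sketch below implements Nadaraya-Watson kernel regression with a per-dimension length scale (the adapted metric) and picks the scales that minimize a leave-one-out cross-validation error. The candidate scale grid and data are made up; the original algorithm optimizes the metric by gradient-based minimization of a cross-validation estimate rather than a grid search.

```python
# Kernel regression with per-dimension length scales, selected by leave-one-out CV.
import numpy as np

def loo_mse(X, y, scales):
    d = (X[:, None, :] - X[None, :, :]) / scales      # scaled pairwise differences
    w = np.exp(-0.5 * (d ** 2).sum(-1))               # Gaussian kernel weights
    np.fill_diagonal(w, 0.0)                          # leave-one-out: ignore self-weight
    pred = (w @ y) / (w.sum(axis=1) + 1e-12)
    return np.mean((pred - y) ** 2)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = np.sin(2.0 * X[:, 0]) + 0.1 * rng.normal(size=200)   # only dimension 0 is relevant

best = min(
    ((loo_mse(X, y, np.array(s)), s)
     for s in [(0.3, 0.3, 0.3), (0.3, 3.0, 3.0), (3.0, 0.3, 0.3)]),
    key=lambda t: t[0],
)
print("best per-dimension scales:", best[1], "LOO MSE:", round(best[0], 4))
```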

  6. Adaptive Metric Kernel Regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    1998-01-01

    Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...

  7. Design, construction and test run of a two-tonne capacity solar rice dryer with rice-husk-fired auxiliary heater

    International Nuclear Information System (INIS)

    Iloeje, O.C.; Ekechukwu, O.V.; Ezeike, G.O.I.

    1993-09-01

    The design and construction details of a two-tonne per batch capacity natural-circulation solar rice dryer and the highlights of the design of its rice-husk-fired auxiliary heating system, which is still under construction, are presented. The dryer measures approximately 17.7 m long by 9.8 m wide by 6 m high. Preliminary results of a test run on the solar dryer section only are reported. (author). 5 refs, 3 figs

  8. Members of the team responsible for the strength test stand on the plug for the main CMS shaft, on which 2500 tonnes of steel blocks have been placed

    CERN Multimedia

    Maximilien Brice

    2006-01-01

    The plug over the CMS shaft, which will be required to bear the weight of the various detector sub-assemblies when they are lowered into the experiment hall, has just passed a strength test. The plug, a huge 2.2-metre-thick rectangular block of reinforced concrete measuring 15 x 20 metres and weighing 2000 tonnes, underwent its first strength test on 15 May.

  9. Metrical Phonology and SLA.

    Science.gov (United States)

    Tice, Bradley S.

    Metrical phonology, a linguistic process of phonological stress assessment and diagrammatic simplification of sentence and word stress, is discussed as it is found in the English language with the intention that it may be used in second language instruction. Stress is defined by its physical and acoustical correlates, and the principles of…

  10. Engineering performance metrics

    Science.gov (United States)

    Delozier, R.; Snyder, N.

    1993-03-01

    Implementation of a Total Quality Management (TQM) approach to engineering work required the development of a system of metrics which would serve as a meaningful management tool for evaluating effectiveness in accomplishing project objectives and in achieving improved customer satisfaction. A team effort was chartered with the goal of developing a system of engineering performance metrics which would measure customer satisfaction, quality, cost effectiveness, and timeliness. The approach to developing this system involved normal systems design phases including conceptual design, detailed design, implementation, and integration. The lessons learned from this effort will be explored in this paper. These lessons may provide a starting point for other large engineering organizations seeking to institute a performance measurement system for evaluating effectiveness in accomplishing project objectives and in achieving improved customer satisfaction. To facilitate this effort, a team was chartered to assist in the development of the metrics system. This team, consisting of customers and Engineering staff members, was utilized to ensure that the needs and views of the customers were considered in the development of performance measurements. The development of a system of metrics is no different than the development of any type of system. It includes the steps of defining performance measurement requirements, measurement process conceptual design, performance measurement and reporting system detailed design, and system implementation and integration.

  11. Metrics for Probabilistic Geometries

    DEFF Research Database (Denmark)

    Tosi, Alessandra; Hauberg, Søren; Vellido, Alfredo

    2014-01-01

    the distribution over mappings is given by a Gaussian process. We treat the corresponding latent variable model as a Riemannian manifold and we use the expectation of the metric under the Gaussian process prior to define interpolating paths and measure distance between latent points. We show how distances...

  12. How much energy is locked in the USA? Alternative metrics for characterising the magnitude of overweight and obesity derived from BRFSS 2010 data.

    Science.gov (United States)

    Reidpath, Daniel D; Masood, Mohd; Allotey, Pascale

    2014-06-01

    Four metrics to characterise population overweight are described. Behavioural Risk Factors Surveillance System data were used to estimate the weight the US population needed to lose to achieve a healthy BMI, expressed as a mass, a volume, an amount of energy, and an energy value. About 144 million people in the US need to lose a total of 2.4 million metric tonnes. The volume of fat is 2.6 billion litres, equivalent to 1,038 Olympic-size swimming pools. The energy in the fat would power 90,000 households for a year and is worth around 162 million dollars. Four confronting ways of talking about national overweight and obesity are thus described. The value of the metrics remains to be tested.
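
    A back-of-envelope version of the volume figure quoted above can be reproduced from the excess mass, assuming a typical adipose-tissue density of about 0.9 kg/L and an Olympic pool volume of 2,500 m^3 (both assumptions introduced here, not values from the abstract).

```python
# Rough check of the volume figures quoted above.
# Assumed values (not from the abstract): adipose tissue ~0.9 kg/L, Olympic pool = 2500 m^3.
excess_mass_kg   = 2.4e6 * 1000        # 2.4 million metric tonnes
density_kg_per_l = 0.9
pool_litres      = 2500 * 1000

volume_l = excess_mass_kg / density_kg_per_l
print(f"volume: {volume_l / 1e9:.1f} billion litres")         # ~2.7 billion litres
print(f"pools:  {volume_l / pool_litres:.0f} Olympic pools")  # ~1070 pools
```

    The result (~2.7 billion litres, ~1,070 pools) is close to the paper's 2.6 billion litres and 1,038 pools; the small difference presumably reflects the exact density and mass used.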

  13. Metrics for energy resilience

    International Nuclear Information System (INIS)

    Roege, Paul E.; Collier, Zachary A.; Mancillas, James; McDonagh, John A.; Linkov, Igor

    2014-01-01

    Energy lies at the backbone of any advanced society and constitutes an essential prerequisite for economic growth, social order and national defense. However there is an Achilles heel to today's energy and technology relationship; namely a precarious intimacy between energy and the fiscal, social, and technical systems it supports. Recently, widespread and persistent disruptions in energy systems have highlighted the extent of this dependence and the vulnerability of increasingly optimized systems to changing conditions. Resilience is an emerging concept that offers to reconcile considerations of performance under dynamic environments and across multiple time frames by supplementing traditionally static system performance measures to consider behaviors under changing conditions and complex interactions among physical, information and human domains. This paper identifies metrics useful to implement guidance for energy-related planning, design, investment, and operation. Recommendations are presented using a matrix format to provide a structured and comprehensive framework of metrics relevant to a system's energy resilience. The study synthesizes previously proposed metrics and emergent resilience literature to provide a multi-dimensional model intended for use by leaders and practitioners as they transform our energy posture from one of stasis and reaction to one that is proactive and which fosters sustainable growth. - Highlights: • Resilience is the ability of a system to recover from adversity. • There is a need for methods to quantify and measure system resilience. • We developed a matrix-based approach to generate energy resilience metrics. • These metrics can be used in energy planning, system design, and operations

  14. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software Quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software Metrics help us understand the technical process that is used to develop a product. The process is measured to improve it and the product is measured to increase quality throughout the life cycle of software. Software Metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If Software Metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA Metrics that have been used in other projects but are not currently being used by the SA team, and to report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  15. Dynamics Analysis Of Land-Based Carbon Stock In The Region Of Samarinda East Kalimantan Province

    Directory of Open Access Journals (Sweden)

    Zikri Azham

    2017-10-01

    This study aims to determine the dynamics of carbon stocks in various land cover classes in the city of Samarinda. In the calculation of carbon stocks, land cover is divided into only three (3) Classes of Land Cover (CLC): secondary forest, thickets, and shrubs. Research results show that the above-ground carbon (AGC) stocks average 71.93 tonnes/ha for the secondary forest class, 32.34 tonnes/ha for the thickets class and 19.66 tonnes/ha for the shrubs class. The carbon stocks amounted to 2,589,929 tonnes in 2009, 2,347,477 tonnes in 2012 and 2,201,005 tonnes in 2015. The estimated decrease in land-based carbon stock in the city of Samarinda during the period 2009-2015 amounted to 388,943 tonnes, an average of 70,170 tonnes per year (approximately 2.73% per year), corresponding to annual land-sector emissions of 254,538 tonnes of CO2 equivalent.

  16. Carbon emissions and an equitable emission reduction criterion

    International Nuclear Information System (INIS)

    Golomb, Dan

    1999-01-01

    In 1995 world-wide carbon emissions reached 5.8 billion metric tonnes per year (GtC/y). The Kyoto protocol calls for a reduction of carbon emissions from the developed countries (Annex I countries) of 6-8% below 1990 levels on the average, and unspecified commitments for the less developed (non-Annex I) countries. It is doubtful that the Kyoto agreement will be ratified by some parliaments, especially the USA Congress. Furthermore, it is shown that if the non-Annex I countries do not curtail their carbon emissions drastically, global emissions will soar to huge levels by the middle of the next century. An equitable emission criterion is proposed which may lead to a sustainable rate of growth of carbon emissions, and be acceptable to all countries of the world. The criterion links the rate of growth of carbon emissions to the rate of growth of the Gross Domestic Product (GDP). A target criterion of R = 0.15 kgC/$GDP is proposed, which is the current average for western European countries and Japan. This allows for growth of both the GDP and carbon emissions. However, to reach the target in a reasonable time, countries for which R ≤ 0.3 would be allowed a carbon emission growth rate of 1%/y, and countries for which R > 0.3, 0.75%/y. It is shown that by 2050 world-wide carbon emissions would then reach about 10 GtC/y, which is about 3 times less than the Kyoto agreement would allow. (Author)
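
    The compounding implied by the proposed criterion is easy to sketch: emissions grow at 1%/y where R ≤ 0.3 kgC/$GDP and at 0.75%/y otherwise. The two country groups and their shares of the 5.8 GtC/y baseline below are made up for illustration; only the growth rates come from the abstract.

```python
# Sketch of the growth criterion described above, projected from 1995 to 2050.
# Country groupings and their shares of the 5.8 GtC/yr baseline are illustrative.
groups = [
    {"name": "low-intensity group",  "emissions_gtc": 2.8, "R": 0.18},
    {"name": "high-intensity group", "emissions_gtc": 3.0, "R": 0.45},
]

total_2050 = 0.0
for g in groups:
    rate = 0.01 if g["R"] <= 0.3 else 0.0075          # allowed growth rate under the criterion
    total_2050 += g["emissions_gtc"] * (1.0 + rate) ** (2050 - 1995)

print(f"projected global emissions in 2050: {total_2050:.1f} GtC/yr")   # on the order of 10 GtC/yr
```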

  17. Carbon footprint of cartons in Europe - Carbon Footprint methodology and biogenic carbon sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Eriksson, Elin; Karlsson, Per-Erik; Hallberg, Lisa; Jelse, Kristian

    2010-05-15

    A methodology for carbon sequestration in forests used for carton production has been developed and applied. The average Carbon Footprint of converted cartons sold in Europe has been calculated and summarised. A methodology for an EU27 scenario-based assessment of end-of-life treatment has been developed and applied. The average Carbon Footprint represents the total greenhouse gas emissions from one average tonne of virgin-based and recycled fibres produced, converted and printed in Europe.

  18. Enterprise Sustainment Metrics

    Science.gov (United States)

    2015-06-19

    are negatively impacting KPIs" (Parmenter, 2010: 31). In the current state, the Air Force's AA and PBL metrics are once again split. AA does...must have the authority to "take immediate action to rectify situations that are negatively impacting KPIs" (Parmenter, 2010: 31). 3. Measuring...highest profitability and shareholder value for each company" (2014: 273). By systematically diagramming a process, either through a swim lane flowchart

  19. Carbon storage and sequestration by trees in urban and community areas of the United States

    International Nuclear Information System (INIS)

    Nowak, David J.; Greenfield, Eric J.; Hoehn, Robert E.; Lapoint, Elizabeth

    2013-01-01

    Carbon storage and sequestration by urban trees in the United States was quantified to assess the magnitude and role of urban forests in relation to climate change. Urban tree field data from 28 cities and 6 states were used to determine the average carbon density per unit of tree cover. These data were applied to statewide urban tree cover measurements to determine total urban forest carbon storage and annual sequestration by state and nationally. Urban whole tree carbon storage densities average 7.69 kg C m −2 of tree cover and sequestration densities average 0.28 kg C m −2 of tree cover per year. Total tree carbon storage in U.S. urban areas (c. 2005) is estimated at 643 million tonnes ($50.5 billion value; 95% CI = 597 million and 690 million tonnes) and annual sequestration is estimated at 25.6 million tonnes ($2.0 billion value; 95% CI = 23.7 million to 27.4 million tonnes).
    Highlights:
    • Total tree carbon storage in U.S. urban areas (c. 2005) is estimated at 643 million tonnes.
    • Total tree carbon storage in U.S. urban and community areas is estimated at 1.36 billion tonnes.
    • Net carbon sequestration in U.S. urban areas varies by state and is estimated at 18.9 million tonnes per year.
    • Overlap between U.S. forest and urban forest carbon estimates is between 247 million and 303 million tonnes.
    Field and tree cover measurements reveal carbon storage and sequestration by trees in U.S. urban and community areas.
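
    The per-unit-cover densities quoted above can be applied directly to a city's measured tree-cover area. In the sketch below the cover area is hypothetical, and the dollar value per tonne is simply the ratio implied by the abstract ($50.5 billion for 643 million tonnes, roughly $78.5 per tonne C).

```python
# Applying the quoted storage and sequestration densities to a hypothetical city.
STORAGE_KG_PER_M2 = 7.69               # kg C per m^2 of tree cover (from the abstract)
SEQ_KG_PER_M2_YR  = 0.28               # kg C per m^2 of tree cover per year (from the abstract)
USD_PER_TONNE_C   = 50.5e9 / 643e6     # implied value, ~ $78.5 per tonne C

tree_cover_m2 = 120e6                  # hypothetical: 120 km^2 of urban tree cover

storage_t = tree_cover_m2 * STORAGE_KG_PER_M2 / 1000.0
seq_t_yr  = tree_cover_m2 * SEQ_KG_PER_M2_YR / 1000.0
print(f"storage:       {storage_t:,.0f} t C  (~${storage_t * USD_PER_TONNE_C / 1e6:,.0f} million)")
print(f"sequestration: {seq_t_yr:,.0f} t C per year")
```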

  20. Symmetries of the dual metrics

    International Nuclear Information System (INIS)

    Baleanu, D.

    1998-01-01

    The geometric duality between the metric g μν and a Killing tensor K μν is studied. The conditions were found under which the symmetries of the metric g μν and the dual metric K μν are the same. Dual spinning space was constructed without the introduction of torsion. The general results are applied to the case of the Kerr-Newman metric.

  1. Kerr metric in cosmological background

    Energy Technology Data Exchange (ETDEWEB)

    Vaidya, P C [Gujarat Univ., Ahmedabad (India). Dept. of Mathematics

    1977-06-01

    A metric satisfying Einstein's equation is given which in the vicinity of the source reduces to the well-known Kerr metric and which at large distances reduces to the Robertson-Walker metric of a homogeneous cosmological model. The radius of the event horizon of the Kerr black hole in the cosmological background is found.

  2. Learning Low-Dimensional Metrics

    OpenAIRE

    Jain, Lalit; Mason, Blake; Nowak, Robert

    2017-01-01

    This paper investigates the theoretical foundations of metric learning, focused on three key questions that are not fully addressed in prior work: 1) we consider learning general low-dimensional (low-rank) metrics as well as sparse metrics; 2) we develop upper and lower (minimax) bounds on the generalization error; 3) we quantify the sample complexity of metric learning in terms of the dimension of the feature space and the dimension/rank of the underlying metric; 4) we also bound the accuracy ...

  3. Metrics with vanishing quantum corrections

    International Nuclear Information System (INIS)

    Coley, A A; Hervik, S; Gibbons, G W; Pope, C N

    2008-01-01

    We investigate solutions of the classical Einstein or supergravity equations that solve any set of quantum corrected Einstein equations in which the Einstein tensor plus a multiple of the metric is equated to a symmetric conserved tensor T μν (g αβ , ∂ τ g αβ , ∂ τ ∂ σ g αβ , ...) constructed from sums of terms involving contractions of the metric and powers of arbitrary covariant derivatives of the curvature tensor. A classical solution, such as an Einstein metric, is called universal if, when evaluated on that Einstein metric, T μν is a multiple of the metric. A Ricci flat classical solution is called strongly universal if, when evaluated on that Ricci flat metric, T μν vanishes. It is well known that pp-waves in four spacetime dimensions are strongly universal. We focus attention on a natural generalization: Einstein metrics with holonomy Sim(n - 2) in which all scalar invariants are zero or constant. In four dimensions we demonstrate that the generalized Ghanam-Thompson metric is weakly universal and that the Goldberg-Kerr metric is strongly universal; indeed, we show that universality extends to all four-dimensional Sim(2) Einstein metrics. We also discuss generalizations to higher dimensions

  4. Sharp metric obstructions for quasi-Einstein metrics

    Science.gov (United States)

    Case, Jeffrey S.

    2013-02-01

    Using the tractor calculus to study smooth metric measure spaces, we adapt results of Gover and Nurowski to give sharp metric obstructions to the existence of quasi-Einstein metrics on suitably generic manifolds. We do this by introducing an analogue of the Weyl tractor W to the setting of smooth metric measure spaces. The obstructions we obtain can be realized as tensorial invariants which are polynomial in the Riemann curvature tensor and its divergence. By taking suitable limits of their tensorial forms, we then find obstructions to the existence of static potentials, generalizing to higher dimensions a result of Bartnik and Tod, and to the existence of potentials for gradient Ricci solitons.

  5. Completion of a Dislocated Metric Space

    Directory of Open Access Journals (Sweden)

    P. Sumati Kumari

    2015-01-01

    We provide a construction for the completion of a dislocated metric space (abbreviated d-metric space); we also prove that the completion of the metric associated with a d-metric coincides with the metric associated with the completion of the d-metric.

  6. Metric adjusted skew information

    DEFF Research Database (Denmark)

    Hansen, Frank

    2008-01-01

    We extend the concept of Wigner-Yanase-Dyson skew information to something we call "metric adjusted skew information" (of a state with respect to a conserved observable). This "skew information" is intended to be a non-negative quantity bounded by the variance (of an observable in a state) that vanishes for observables commuting with the state. We show that the skew information is a convex function on the manifold of states. It also satisfies other requirements, proposed by Wigner and Yanase, for an effective measure-of-information content of a state relative to a conserved observable. We establish a connection between the geometrical formulation of quantum statistics as proposed by Chentsov and Morozova and measures of quantum information as introduced by Wigner and Yanase and extended in this article. We show that the set of normalized Morozova-Chentsov functions describing the possible...

  7. Deep uncertainty and broad heterogeneity in country-level social cost of carbon

    Science.gov (United States)

    Ricke, K.; Drouet, L.; Caldeira, K.; Tavoni, M.

    2017-12-01

    The social cost of carbon (SCC) is a commonly employed metric of the expected economic damages from carbon dioxide (CO2) emissions. Recent estimates of the SCC range from approximately $10 per tonne of CO2 to as much as $1,000/tCO2, but these have been computed at the global level. While useful in an optimal policy context, a world-level approach obscures the heterogeneous geography of climate damages and vast differences in country-level contributions to the global SCC, as well as climate and socio-economic uncertainties, which are much larger at the regional level. For the first time, we estimate country-level contributions to the SCC using recent climate and carbon-cycle model projections, empirical climate-driven economic damage estimations, and information from the Shared Socio-economic Pathways. Central specifications show high global SCC values (median: $417/tCO2, 66% confidence interval: $168-793/tCO2) with country-level contributions ranging from -$11 (-$8 to -$14)/tCO2 to $86 ($50-158)/tCO2. We quantify the climate-, scenario- and economic-damage-driven uncertainties associated with the calculated values of the SCC. We find that while the magnitude of the country-level social cost of carbon is highly uncertain, the relative positioning among countries is consistent. Countries incurring large fractions of the global cost include India, China, and the United States. The share of the SCC distributed among countries is robust, indicating climate change winners and losers from a geopolitical perspective.

  8. The metric system: An introduction

    Energy Technology Data Exchange (ETDEWEB)

    Lumley, S.M.

    1995-05-01

    On July 13, 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on July 25, 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first they examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

  9. Attack-Resistant Trust Metrics

    Science.gov (United States)

    Levien, Raph

    The Internet is an amazingly powerful tool for connecting people together, unmatched in human history. Yet, with that power comes great potential for spam and abuse. Trust metrics are an attempt to compute the set of which people are trustworthy and which are likely attackers. This chapter presents two specific trust metrics developed and deployed on the Advogato Website, which is a community blog for free software developers. This real-world experience demonstrates that the trust metrics fulfilled their goals, but that for good results, it is important to match the assumptions of the abstract trust metric computation to the real-world implementation.

  10. The metric system: An introduction

    Science.gov (United States)

    Lumley, Susan M.

    On 13 Jul. 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on 25 Jul. 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first they examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

  11. Metric-adjusted skew information

    DEFF Research Database (Denmark)

    Liang, Cai; Hansen, Frank

    2010-01-01

    We give a truly elementary proof of the convexity of metric-adjusted skew information following an idea of Effros. We extend earlier results on weak forms of superadditivity to general metric-adjusted skew information. Recently, Luo and Zhang introduced the notion of semi-quantum states on a bipartite system and proved superadditivity of the Wigner-Yanase-Dyson skew informations for such states. We extend this result to the general metric-adjusted skew information. We finally show that a recently introduced extension to parameter values 1 ... of (unbounded) metric-adjusted skew information.

  12. Two classes of metric spaces

    Directory of Open Access Journals (Sweden)

    Isabel Garrido

    2016-04-01

    The class of metric spaces (X,d) known as small-determined spaces, introduced by Garrido and Jaramillo, is properly defined by means of some type of real-valued Lipschitz functions on X. On the other hand, B-simple metric spaces introduced by Hejcman are defined in terms of some kind of bornologies of bounded subsets of X. In this note we present a common framework where both classes of metric spaces can be studied, which allows us to see not only the relationships between them but also to obtain new internal characterizations of these metric properties.

  13. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    Science.gov (United States)

    Post, J. V.

    1981-01-01

    Software quality metrics was extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

  14. The ALICE collaboration has just conducted one of its most spectacular transport operations to date: structures weighing several tonnes are moved with millimetric precision

    CERN Multimedia

    Maximilien Brice

    2005-01-01

    The ALICE collaboration has just lifted the dipole of the muon spectrometer and reassembled it on the other side of the huge solenoid magnet. This incredible feat involved lifting no fewer than 900 tonnes of equipment over the red octagonal yoke inherited from the L3 experiment at a height of 18 metres. The work resumed on 19 April, the following day. The coil was turned over into an upright position and lifted over the blue yoke of the muon spectrometer's dipole magnet. Remarkable precision was required yet again. The space between the red magnet inherited from the L3 experiment and the descending coil was no more than a few centimetres and this tiny gap had to be maintained throughout the operation to bring the 6-metre high coil down into position.

  15. Multimetric indices: How many metrics?

    Science.gov (United States)

    Multimetric indices (MMIs) often include 5 to 15 metrics, each representing a different attribute of assemblage condition, such as species diversity, tolerant taxa, and nonnative taxa. Is there an optimal number of metrics for MMIs? To explore this question, I created 1000 9-met...

  16. Metrical Phonology: German Sound System.

    Science.gov (United States)

    Tice, Bradley S.

    Metrical phonology, a linguistic process of phonological stress assessment and diagrammatic simplification of sentence and word stress, is discussed as it is found in the English and German languages. The objective is to promote use of metrical phonology as a tool for enhancing instruction in stress patterns in words and sentences, particularly in…

  17. Extending cosmology: the metric approach

    OpenAIRE

    Mendoza, S.

    2012-01-01

    Comment: 2012, Extending Cosmology: The Metric Approach, Open Questions in Cosmology; Review article for an Intech "Open questions in cosmology" book chapter (19 pages, 3 figures). Available from: http://www.intechopen.com/books/open-questions-in-cosmology/extending-cosmology-the-metric-approach

  18. Numerical Calabi-Yau metrics

    International Nuclear Information System (INIS)

    Douglas, Michael R.; Karp, Robert L.; Lukic, Sergio; Reinbacher, Rene

    2008-01-01

    We develop numerical methods for approximating Ricci flat metrics on Calabi-Yau hypersurfaces in projective spaces. Our approach is based on finding balanced metrics and builds on recent theoretical work by Donaldson. We illustrate our methods in detail for a one parameter family of quintics. We also suggest several ways to extend our results

  19. High resolution metric imaging payload

    Science.gov (United States)

    Delclaud, Y.

    2017-11-01

    Alcatel Space Industries has become Europe's leader in the field of high and very high resolution optical payloads, in the frame work of earth observation system able to provide military government with metric images from space. This leadership allowed ALCATEL to propose for the export market, within a French collaboration frame, a complete space based system for metric observation.

  20. Weyl metrics and wormholes

    Energy Technology Data Exchange (ETDEWEB)

    Gibbons, Gary W. [DAMTP, University of Cambridge, Wilberforce Road, Cambridge, CB3 0WA U.K. (United Kingdom); Volkov, Mikhail S., E-mail: gwg1@cam.ac.uk, E-mail: volkov@lmpt.univ-tours.fr [Laboratoire de Mathématiques et Physique Théorique, LMPT CNRS—UMR 7350, Université de Tours, Parc de Grandmont, Tours, 37200 France (France)

    2017-05-01

    We study solutions obtained via applying dualities and complexifications to the vacuum Weyl metrics generated by massive rods and by point masses. Rescaling them and extending to complex parameter values yields axially symmetric vacuum solutions containing singularities along circles that can be viewed as singular matter sources. These solutions have wormhole topology with several asymptotic regions interconnected by throats and their sources can be viewed as thin rings of negative tension encircling the throats. For a particular value of the ring tension the geometry becomes exactly flat although the topology remains non-trivial, so that the rings literally produce holes in flat space. To create a single ring wormhole of one metre radius one needs a negative energy equivalent to the mass of Jupiter. Further duality transformations dress the rings with the scalar field, either conventional or phantom. This gives rise to large classes of static, axially symmetric solutions, presumably including all previously known solutions for a gravity-coupled massless scalar field, as for example the spherically symmetric Bronnikov-Ellis wormholes with phantom scalar. The multi-wormholes contain infinite struts everywhere at the symmetry axes, apart from solutions with locally flat geometry.

  1. Metrics for image segmentation

    Science.gov (United States)

    Rees, Gareth; Greenway, Phil; Morray, Denise

    1998-07-01

    An important challenge in mapping image-processing techniques onto applications is the lack of quantitative performance measures. From a systems engineering perspective these are essential if system level requirements are to be decomposed into sub-system requirements which can be understood in terms of algorithm selection and performance optimization. Nowhere in computer vision is this more evident than in the area of image segmentation. This is a vigorous and innovative research activity, but even after nearly two decades of progress, it remains almost impossible to answer the question 'what would the performance of this segmentation algorithm be under these new conditions?' To begin to address this shortcoming, we have devised a well-principled metric for assessing the relative performance of two segmentation algorithms. This allows meaningful objective comparisons to be made between their outputs. It also estimates the absolute performance of an algorithm given ground truth. Our approach is an information theoretic one. In this paper, we describe the theory and motivation of our method, and present practical results obtained from a range of state of the art segmentation methods. We demonstrate that it is possible to measure the objective performance of these algorithms, and to use the information so gained to provide clues about how their performance might be improved.

  2. Metric regularity and subdifferential calculus

    International Nuclear Information System (INIS)

    Ioffe, A D

    2000-01-01

    The theory of metric regularity is an extension of two classical results: the Lyusternik tangent space theorem and the Graves surjection theorem. Developments in non-smooth analysis in the 1980s and 1990s paved the way for a number of far-reaching extensions of these results. It was also well understood that the phenomena behind the results are of metric origin, not connected with any linear structure. At the same time it became clear that some basic hypotheses of the subdifferential calculus are closely connected with the metric regularity of certain set-valued maps. The survey is devoted to the metric theory of metric regularity and its connection with subdifferential calculus in Banach spaces

  3. METRICS DEVELOPMENT FOR PATENTS.

    Science.gov (United States)

    Veiga, Daniela Francescato; Ferreira, Lydia Masako

    2015-01-01

    To develop a proposal for metrics for patents to be applied in assessing the postgraduate programs of Medicine III - Capes. From the reading and analysis of the 2013 area documents of all the 48 areas of Capes, a proposal for metrics for patents was developed to be applied in Medicine III programs. Except for the areas Biotechnology, Food Science, Biological Sciences III, Physical Education, Engineering I, III and IV and Interdisciplinary, most areas do not adopt a scoring system for patents. The proposal developed was based on the criteria of Biotechnology, with adaptations. In general, the deposit, the granting and the licensing/production are valued, in ascending order. Higher scores are also assigned to patents registered abroad and whenever there is participation of students. This proposal can be applied to the item Intellectual Production of the evaluation form, in the subsection Technical Production/Patents. The percentage of 10% for academic programs and 40% for Professional Masters should be maintained. The program will be scored as Very Good when it reaches 400 points or more; Good, between 200 and 399 points; Regular, between 71 and 199 points; Weak, up to 70 points; Insufficient, no points.
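
    The proposed banding can be expressed as a small classification rule (an illustration of the thresholds listed above; how individual patents accumulate points is not detailed in this record):

        # Banding a program's total patent score into the proposed Capes ratings.
        def classify_program(points: int) -> str:
            """Map a total patent score to the rating bands proposed above."""
            if points >= 400:
                return "Very Good"
            if points >= 200:
                return "Good"
            if points >= 71:
                return "Regular"
            if points >= 1:
                return "Weak"
            return "Insufficient"

        print(classify_program(250))   # Good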

  4. A Metric for Heterotic Moduli

    Science.gov (United States)

    Candelas, Philip; de la Ossa, Xenia; McOrist, Jock

    2017-12-01

    Heterotic vacua of string theory are realised, at large radius, by a compact threefold with vanishing first Chern class together with a choice of stable holomorphic vector bundle. These form a wide class of potentially realistic four-dimensional vacua of string theory. Despite all their phenomenological promise, there is little understanding of the metric on the moduli space of these. What is sought is the analogue of special geometry for these vacua. The metric on the moduli space is important in phenomenology as it normalises D-terms and Yukawa couplings. It is also of interest in mathematics, since it generalises the metric, first found by Kobayashi, on the space of gauge field connections, to a more general context. Here we construct this metric, correct to first order in α′, in two ways: first by postulating a metric that is invariant under background gauge transformations of the gauge field, and also by dimensionally reducing heterotic supergravity. These methods agree and the resulting metric is Kähler, as is required by supersymmetry. Checking the metric is Kähler is intricate and the anomaly cancellation equation for the H field plays an essential role. The Kähler potential nevertheless takes a remarkably simple form: it is the Kähler potential of special geometry with the Kähler form replaced by the α′-corrected hermitian form.

  5. Implications of Metric Choice for Common Applications of Readmission Metrics

    OpenAIRE

    Davies, Sheryl; Saynina, Olga; Schultz, Ellen; McDonald, Kathryn M; Baker, Laurence C

    2013-01-01

    Objective. To quantify the differential impact on hospital performance of three readmission metrics: all-cause readmission (ACR), 3M Potential Preventable Readmission (PPR), and Centers for Medicare and Medicaid 30-day readmission (CMS).

  6. Issues in Benchmark Metric Selection

    Science.gov (United States)

    Crolotte, Alain

    It is true that a metric can influence a benchmark but will esoteric metrics create more problems than they will solve? We answer this question affirmatively by examining the case of the TPC-D metric which used the much debated geometric mean for the single-stream test. We will show how a simple choice influenced the benchmark and its conduct and, to some extent, DBMS development. After examining other alternatives our conclusion is that the “real” measure for a decision-support benchmark is the arithmetic mean.
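
    As a quick illustration of why the choice matters (the numbers below are made up, not taken from Crolotte's chapter), the geometric mean rewards a single very fast query far more than the arithmetic mean does:

        # Illustrative comparison of arithmetic vs. geometric mean for per-query
        # benchmark times; the two metrics can rank systems differently.
        import math

        def arithmetic_mean(xs):
            return sum(xs) / len(xs)

        def geometric_mean(xs):
            return math.exp(sum(math.log(x) for x in xs) / len(xs))

        # Hypothetical single-stream query times (seconds) for two systems.
        system_a = [10, 10, 10, 10]   # uniformly average
        system_b = [1, 14, 14, 14]    # one outstanding query, the rest slower

        for name, times in [("A", system_a), ("B", system_b)]:
            print(name, arithmetic_mean(times), round(geometric_mean(times), 2))
        # A: arithmetic 10.0,  geometric 10.0
        # B: arithmetic 10.75 (worse than A), geometric ~7.24 (better than A)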

  7. Background metric in supergravity theories

    International Nuclear Information System (INIS)

    Yoneya, T.

    1978-01-01

    In supergravity theories, we investigate the conformal anomaly of the path-integral determinant and the problem of fermion zero modes in the presence of a nontrivial background metric. Except in SO(3) -invariant supergravity, there are nonvanishing conformal anomalies. As a consequence, amplitudes around the nontrivial background metric contain unpredictable arbitrariness. The fermion zero modes which are explicitly constructed for the Euclidean Schwarzschild metric are interpreted as an indication of the supersymmetric multiplet structure of a black hole. The degree of degeneracy of a black hole is 2/sup 4n/ in SO(n) supergravity

  8. Generalized Painleve-Gullstrand metrics

    Energy Technology Data Exchange (ETDEWEB)

    Lin Chunyu [Department of Physics, National Cheng Kung University, Tainan 70101, Taiwan (China)], E-mail: l2891112@mail.ncku.edu.tw; Soo Chopin [Department of Physics, National Cheng Kung University, Tainan 70101, Taiwan (China)], E-mail: cpsoo@mail.ncku.edu.tw

    2009-02-02

    An obstruction to the implementation of spatially flat Painleve-Gullstrand (PG) slicings is demonstrated, and explicitly discussed for Reissner-Nordstroem and Schwarzschild-anti-deSitter spacetimes. Generalizations of PG slicings which are not spatially flat but which remain regular at the horizons are introduced. These metrics can be obtained from standard spherically symmetric metrics by physical Lorentz boosts. With these generalized PG metrics, problematic contributions to the imaginary part of the action in the Parikh-Wilczek derivation of Hawking radiation due to the obstruction can be avoided.
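
    For reference (standard textbook material, not reproduced from the paper), the spatially flat PG form of the Schwarzschild metric, whose generalizations the authors discuss, can be written in units G = c = 1 as

    $$ ds^2 = -\left(1-\frac{2M}{r}\right)dt^2 + 2\sqrt{\frac{2M}{r}}\,dt\,dr + dr^2 + r^2 d\Omega^2
            = -dt^2 + \left(dr + \sqrt{\tfrac{2M}{r}}\,dt\right)^2 + r^2 d\Omega^2 , $$

    so that constant-t slices carry the flat metric dr^2 + r^2 dΩ^2 and the line element remains regular at the horizon r = 2M.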

  9. Daylight metrics and energy savings

    Energy Technology Data Exchange (ETDEWEB)

    Mardaljevic, John; Heschong, Lisa; Lee, Eleanor

    2009-12-31

    The drive towards sustainable, low-energy buildings has increased the need for simple, yet accurate methods to evaluate whether a daylit building meets minimum standards for energy and human comfort performance. Current metrics do not account for the temporal and spatial aspects of daylight, nor for occupants' comfort or interventions. This paper reviews the historical basis of current compliance methods for achieving daylit buildings, proposes a technical basis for development of better metrics, and provides two case study examples to stimulate dialogue on how metrics can be applied in a practical, real-world context.

  10. Next-Generation Metrics: Responsible Metrics & Evaluation for Open Science

    Energy Technology Data Exchange (ETDEWEB)

    Wilsdon, J.; Bar-Ilan, J.; Peters, I.; Wouters, P.

    2016-07-01

    Metrics evoke a mixed reaction from the research community. A commitment to using data to inform decisions makes some enthusiastic about the prospect of granular, real-time analysis of research and its wider impacts. Yet we only have to look at the blunt use of metrics such as journal impact factors, h-indices and grant income targets, to be reminded of the pitfalls. Some of the most precious qualities of academic culture resist simple quantification, and individual indicators often struggle to do justice to the richness and plurality of research. Too often, poorly designed evaluation criteria are "dominating minds, distorting behaviour and determining careers" (Lawrence, 2007). Metrics hold real power: they are constitutive of values, identities and livelihoods. How to exercise that power to more positive ends has been the focus of several recent and complementary initiatives, including the San Francisco Declaration on Research Assessment (DORA), the Leiden Manifesto and The Metric Tide (a UK government review of the role of metrics in research management and assessment). Building on these initiatives, the European Commission, under its new Open Science Policy Platform, is now looking to develop a framework for responsible metrics for research management and evaluation, which can be incorporated into the successor framework to Horizon 2020. (Author)

  11. CarbonSAFE Rocky Mountain Phase I : Seismic Characterization of the Navajo Reservoir, Buzzard Bench, Utah

    Science.gov (United States)

    Haar, K. K.; Balch, R. S.; Lee, S. Y.

    2017-12-01

    The CarbonSAFE Rocky Mountain project team is in the initial phase of investigating the regulatory, financial and technical feasibility of commercial-scale CO2 capture and storage from two coal-fired power plants in the northwest region of the San Rafael Swell, Utah. The reservoir interval is the Jurassic Navajo Sandstone, an eolian dune deposit that at present serves as the salt water disposal reservoir for Ferron Sandstone coal-bed methane production in the Drunkards Wash field and Buzzard Bench area of central Utah. In the study area the Navajo sandstone is approximately 525 feet thick and is at an average depth of about 7000 feet below the surface. If sufficient porosity and permeability exist, reservoir depth and thickness would provide storage for up to 100,000 metric tonnes of CO2 per square mile, based on preliminary estimates. This reservoir has the potential to meet the DOE's requirement of having the ability to store at least 50 million metric tons of CO2 and fulfills the DOE's initiative to develop protocols for commercially sequestering carbon sourced from coal-fired power plants. A successful carbon storage project requires thorough structural and stratigraphic characterization of the reservoir, seal and faults, thereby allowing the creation of a comprehensive geologic model with subsequent simulations to evaluate CO2/brine migration and long-term effects. Target formation lithofacies and subfacies data gathered from outcrop mapping and laboratory analysis of core samples were developed into a geologic model. Synthetic seismic was modeled from this, allowing us to seismically characterize the lithofacies of the target formation. This seismic characterization data was then employed in the interpretation of 2D legacy lines which provided stratigraphic and structural control for more accurate model development of the northwest region of the San Rafael Swell. Developing baseline interpretations such as this is crucial toward long-term carbon storage
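
    A rough volumetric check of the quoted per-square-mile figure (a back-of-the-envelope sketch; the porosity, storage-efficiency factor and CO2 density below are illustrative assumptions, not values reported by the study):

        # Back-of-the-envelope CO2 storage capacity per square mile of the Navajo
        # Sandstone, using the standard volumetric estimate
        #   mass = area * thickness * porosity * storage_efficiency * rho_CO2.
        # All reservoir parameters below are assumed for illustration only.
        SQ_MILE_M2 = 2.59e6     # m^2 in one square mile
        THICKNESS_M = 160.0     # ~525 ft of Navajo Sandstone
        POROSITY = 0.12         # assumed average porosity
        EFFICIENCY = 0.003      # assumed storage efficiency factor
        RHO_CO2 = 700.0         # kg/m^3, supercritical CO2 at ~7000 ft depth (assumed)

        mass_kg = SQ_MILE_M2 * THICKNESS_M * POROSITY * EFFICIENCY * RHO_CO2
        print(f"{mass_kg / 1000:,.0f} tonnes CO2 per square mile")
        # ~104,000 tonnes with these assumptions, i.e. the same order of magnitude
        # as the 100,000 tonnes per square mile preliminary estimate quoted above.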

  12. Let's Make Metric Ice Cream

    Science.gov (United States)

    Zimmerman, Marianna

    1975-01-01

    Describes a classroom activity which involved sixth grade students in a learning situation including making ice cream, safety procedures in a science laboratory, calibrating a thermometer, using metric units of volume and mass. (EB)

  13. Experiential space is hardly metric

    Czech Academy of Sciences Publication Activity Database

    Šikl, Radovan; Šimeček, Michal; Lukavský, Jiří

    2008-01-01

    Roč. 2008, č. 37 (2008), s. 58-58 ISSN 0301-0066. [European Conference on Visual Perception. 24.08-28.08.2008, Utrecht] R&D Projects: GA ČR GA406/07/1676 Institutional research plan: CEZ:AV0Z70250504 Keywords : visual space perception * metric and non-metric perceptual judgments * ecological validity Subject RIV: AN - Psychology

  14. Coverage Metrics for Model Checking

    Science.gov (United States)

    Penix, John; Visser, Willem; Norvig, Peter (Technical Monitor)

    2001-01-01

    When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.

  15. Phantom metrics with Killing spinors

    Directory of Open Access Journals (Sweden)

    W.A. Sabra

    2015-11-01

    We study metric solutions of Einstein–anti-Maxwell theory admitting Killing spinors. The analogue of the IWP metric which admits a space-like Killing vector is found and is expressed in terms of a complex function satisfying the wave equation in flat (2+1)-dimensional space–time. As examples, electric and magnetic Kasner spaces are constructed by allowing the solution to depend only on the time coordinate. Euclidean solutions are also presented.

  16. Developing a Metric for the Cost of Green House Gas Abatement

    Science.gov (United States)

    2017-02-28

    The authors introduce the levelized cost of carbon (LCC), a metric that can be used to evaluate MassDOT CO2 abatement projects in terms of their cost-effectiveness. The study presents ways in which the metric can be used to rank projects. The data ar...
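
    The report's exact formulation is not given in this record; as a sketch, a levelized cost of carbon can be computed as the discounted cost of a project divided by its discounted tonnes of CO2 abated (the project numbers and discounting convention below are hypothetical):

        # Illustrative levelized cost of carbon (LCC), in $ per tonne CO2 abated:
        #   LCC = sum_t cost_t/(1+r)^t  /  sum_t abatement_t/(1+r)^t
        def levelized_cost_of_carbon(costs, abatement_tonnes, discount_rate):
            """costs and abatement_tonnes are per-year lists of equal length."""
            pv_cost = sum(c / (1 + discount_rate) ** t for t, c in enumerate(costs))
            pv_abated = sum(a / (1 + discount_rate) ** t
                            for t, a in enumerate(abatement_tonnes))
            return pv_cost / pv_abated

        # Hypothetical transit project: $2M up front, $50k/yr O&M, 1,500 t CO2/yr avoided.
        costs = [2_000_000] + [50_000] * 19
        abated = [0] + [1_500] * 19
        print(round(levelized_cost_of_carbon(costs, abated, 0.03), 2))
        # ~126 dollars per tonne of CO2 abated with these assumed inputs.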

  17. The Majorana Demonstrator: Progress towards showing the feasibility of a tonne-scale 76Ge neutrinoless double-beta decay experiment

    Science.gov (United States)

    Finnerty, P.; Aguayo, E.; Amman, M.; Avignone, F. T., Iii; Barabash, A. S.; Barton, P. J.; Beene, J. R.; Bertrand, F. E.; Boswell, M.; Brudanin, V.; Busch, M.; Chan, Y.-D.; Christofferson, C. D.; Collar, J. I.; Combs, D. C.; Cooper, R. J.; Detwiler, J. A.; Doe, P. J.; Efremenko, Yu; Egorov, V.; Ejiri, H.; Elliott, S. R.; Esterline, J.; Fast, J. E.; Fields, N.; Fraenkle, F. M.; Galindo-Uribarri, A.; Gehman, V. M.; Giovanetti, G. K.; Green, M. P.; Guiseppe, V. E.; Gusey, K.; Hallin, A. L.; Hazama, R.; Henning, R.; Hoppe, E. W.; Horton, M.; Howard, S.; Howe, M. A.; Johnson, R. A.; Keeter, K. J.; Kidd, M. F.; Knecht, A.; Kochetov, O.; Konovalov, S. I.; Kouzes, R. T.; LaFerriere, B. D.; Leon, J.; Leviner, L. E.; Loach, J. C.; Luke, P. N.; MacMullin, S.; Marino, M. G.; Martin, R. D.; Merriman, J. H.; Miller, M. L.; Mizouni, L.; Nomachi, M.; Orrell, J. L.; Overman, N. R.; Perumpilly, G.; Phillips, D. G., Ii; Poon, A. W. P.; Radford, D. C.; Rielage, K.; Robertson, R. G. H.; Ronquest, M. C.; Schubert, A. G.; Shima, T.; Shirchenko, M.; Snavely, K. J.; Steele, D.; Strain, J.; Timkin, V.; Tornow, W.; Varner, R. L.; Vetter, K.; Vorren, K.; Wilkerson, J. F.; Yakushev, E.; Yaver, H.; Young, A. R.; Yu, C.-H.; Yumatov, V.; Majorana Collaboration

    2014-03-01

    The Majorana Demonstrator will search for the neutrinoless double-beta decay (0νββ) of the 76Ge isotope with a mixed array of enriched and natural germanium detectors. The observation of this rare decay would indicate the neutrino is its own anti-particle, demonstrate that lepton number is not conserved, and provide information on the absolute mass scale of the neutrino. The Demonstrator is being assembled at the 4850-foot level of the Sanford Underground Research Facility in Lead, South Dakota. The array will be contained in a low-background environment and surrounded by passive and active shielding. The goals for the Demonstrator are: demonstrating a background rate of less than 3 counts per tonne-year in the 4 keV region of interest (ROI) surrounding the 2039 keV 76Ge endpoint energy; establishing the technology required to build a tonne-scale germanium-based double-beta decay experiment; testing the recent claim of observation of 0νββ [1]; and performing a direct search for light WIMPs (3-10 GeV/c^2).

  18. Scalar-metric and scalar-metric-torsion gravitational theories

    International Nuclear Information System (INIS)

    Aldersley, S.J.

    1977-01-01

    The techniques of dimensional analysis and of the theory of tensorial concomitants are employed to study field equations in gravitational theories which incorporate scalar fields of the Brans-Dicke type. Within the context of scalar-metric gravitational theories, a uniqueness theorem for the geometric (or gravitational) part of the field equations is proven and a Lagrangian is determined which is uniquely specified by dimensional analysis. Within the context of scalar-metric-torsion gravitational theories a uniqueness theorem for field Lagrangians is presented and the corresponding Euler-Lagrange equations are given. Finally, an example of a scalar-metric-torsion theory is presented which is similar in many respects to the Brans-Dicke theory and the Einstein-Cartan theory

  19. Regge calculus from discontinuous metrics

    International Nuclear Information System (INIS)

    Khatsymovsky, V.M.

    2003-01-01

    Regge calculus is considered as a particular case of the more general system where the linklengths of any two neighbouring 4-tetrahedra do not necessarily coincide on their common face. This system is treated as one described by a metric discontinuous on the faces. In the superspace of all discontinuous metrics the Regge calculus metrics form some hypersurface defined by continuity conditions. Quantum theory of the discontinuous metric system is assumed to be fixed somehow in the form of quantum measure on (the space of functionals on) the superspace. The problem of reducing this measure to the Regge hypersurface is addressed. The quantum Regge calculus measure is defined from a discontinuous metric measure by inserting the δ-function-like phase factor. The requirement that continuity conditions be imposed in a 'face-independent' way fixes this factor uniquely. The term 'face-independent' means that this factor depends only on the (hyper)plane spanned by the face, not on its form and size. This requirement seems to be natural from the viewpoint of existence of the well-defined continuum limit maximally free of lattice artefacts

  20. Symmetries of Taub-NUT dual metrics

    International Nuclear Information System (INIS)

    Baleanu, D.; Codoban, S.

    1998-01-01

    Recently geometric duality was analyzed for a metric which admits Killing tensors. An interesting example arises when the manifold has Killing-Yano tensors. The symmetries of the dual metrics in the case of Taub-NUT metric are investigated. Generic and non-generic symmetries of dual Taub-NUT metric are analyzed

  1. A Kerr-NUT metric

    International Nuclear Information System (INIS)

    Vaidya, P.C.; Patel, L.K.; Bhatt, P.V.

    1976-01-01

    Using Galilean time and retarded distance as coordinates, the usual Kerr metric is expressed in a form similar to the Newman-Unti-Tamburino (NUT) metric. The combined Kerr-NUT metric is then investigated. In addition to the Kerr and NUT solutions of Einstein's equations, three other types of solutions are derived. These are (i) the radiating Kerr solution, (ii) the radiating NUT solution satisfying Rsub(ik) = sigmaxisub(i)xisub(k), xisub(i)xisup(i) = 0, and (iii) the associated Kerr solution satisfying Rsub(ik) = 0. Solution (i) is distinct from and simpler than the one reported earlier by Vaidya and Patel (Phys. Rev.; D7:3590 (1973)). Solutions (ii) and (iii) give line elements which have the axis of symmetry as a singular line. (author)

  2. Complexity Metrics for Workflow Nets

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M.P.

    2009-01-01

    Analysts have difficulties grasping the dynamics implied by a process model. Recent empirical studies show that people make numerous errors when modeling complex business processes, e.g., about 20 percent of the EPCs in the SAP reference model have design flaws resulting in potential deadlocks, livelocks, etc. It seems obvious that the complexity of the model contributes to design errors and a lack of understanding. It is not easy to measure complexity, however. This paper presents three complexity metrics that have been implemented in the process analysis tool ProM. The metrics are defined for a subclass of Petri nets named Workflow nets, but the results can easily be applied to other languages. To demonstrate the applicability of these metrics, we have applied our approach and tool to 262 relatively complex Protos models made in the context of various student projects. This allows us to validate ...

  3. The uniqueness of the Fisher metric as information metric

    Czech Academy of Sciences Publication Activity Database

    Le, Hong-Van

    2017-01-01

    Roč. 69, č. 4 (2017), s. 879-896 ISSN 0020-3157 Institutional support: RVO:67985840 Keywords : Chentsov’s theorem * mixed topology * monotonicity of the Fisher metric Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 1.049, year: 2016 https://link.springer.com/article/10.1007%2Fs10463-016-0562-0

  4. Thermodynamic metrics and optimal paths.

    Science.gov (United States)

    Sivak, David A; Crooks, Gavin E

    2012-05-11

    A fundamental problem in modern thermodynamics is how a molecular-scale machine performs useful work, while operating away from thermal equilibrium without excessive dissipation. To this end, we derive a friction tensor that induces a Riemannian manifold on the space of thermodynamic states. Within the linear-response regime, this metric structure controls the dissipation of finite-time transformations, and bestows optimal protocols with many useful properties. We discuss the connection to the existing thermodynamic length formalism, and demonstrate the utility of this metric by solving for optimal control parameter protocols in a simple nonequilibrium model.
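
    As context (the standard thermodynamic-length expressions from the general literature, not copied from this paper), the friction tensor ζ_ij defines a metric on control-parameter space, and for a protocol λ(t) of duration τ one has, in the linear-response regime,

    $$ \langle W_{\mathrm{ex}} \rangle \;\approx\; \int_0^{\tau} \dot{\lambda}^{i}(t)\, \zeta_{ij}\big(\lambda(t)\big)\, \dot{\lambda}^{j}(t)\, dt , \qquad \mathcal{L} \;=\; \int_0^{\tau} \sqrt{\dot{\lambda}^{i}\, \zeta_{ij}\, \dot{\lambda}^{j}}\; dt , $$

    so that the excess (dissipated) work is bounded below by L^2/τ and optimal protocols are geodesics of the metric ζ_ij traversed at a constant dissipation rate.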

  5. Invariant metrics for Hamiltonian systems

    International Nuclear Information System (INIS)

    Rangarajan, G.; Dragt, A.J.; Neri, F.

    1991-05-01

    In this paper, invariant metrics are constructed for Hamiltonian systems. These metrics give rise to norms on the space of homogeneous polynomials of phase-space variables. For an accelerator lattice described by a Hamiltonian, these norms characterize the nonlinear content of the lattice. Therefore, the performance of the lattice can be improved by minimizing the norm as a function of parameters describing the beam-line elements in the lattice. A four-fold increase in the dynamic aperture of a model FODO cell is obtained using this procedure. 7 refs

  6. Generalization of Vaidya's radiation metric

    Energy Technology Data Exchange (ETDEWEB)

    Gleiser, R J; Kozameh, C N [Universidad Nacional de Cordoba (Argentina). Instituto de Matematica, Astronomia y Fisica

    1981-11-01

    In this paper it is shown that if Vaidya's radiation metric is considered from the point of view of kinetic theory in general relativity, the corresponding phase space distribution function can be generalized in a particular way. The new family of spherically symmetric radiation metrics obtained contains Vaidya's as a limiting situation. The Einstein field equations are solved in a ''comoving'' coordinate system. Two arbitrary functions of a single variable are introduced in the process of solving these equations. Particular examples considered are a stationary solution, a nonvacuum solution depending on a single parameter, and several limiting situations.

  7. Technical Privacy Metrics: a Systematic Survey

    OpenAIRE

    Wagner, Isabel; Eckhoff, David

    2018-01-01

    The goal of privacy metrics is to measure the degree of privacy enjoyed by users in a system and the amount of protection offered by privacy-enhancing technologies. In this way, privacy metrics contribute to improving user privacy in the digital world. The diversity and complexity of privacy metrics in the literature makes an informed choice of metrics challenging. As a result, instead of using existing metrics, n...

  8. Remarks on G-Metric Spaces

    Directory of Open Access Journals (Sweden)

    Bessem Samet

    2013-01-01

    In 2005, Mustafa and Sims (2006) introduced and studied a new class of generalized metric spaces, which are called G-metric spaces, as a generalization of metric spaces. We establish some useful propositions to show that many fixed point theorems on (nonsymmetric) G-metric spaces given recently by many authors follow directly from well-known theorems on metric spaces. Our technique can be easily extended to other results as shown in application.

  9. DLA Energy Biofuel Feedstock Metrics Study

    Science.gov (United States)

    2012-12-11

    Metric 1: state invasiveness ranking (e.g., moderately/highly invasive); Metric 2: genetically modified organism (GMO) hazard, Yes/No and hazard category; Metric 3: species hybridization. The life-cycle stages considered include Stage 4 (biofuel distribution) and Stage 5 (biofuel use). Feedstocks may utilize GMO microbial or microalgae species across the applicable biofuel life-cycle stages (stages 1-3). The following consequence Metrics 4-6 then ...

  10. Mainstreaming Low-Carbon Climate-Resilient growth pathways into Development Finance Institutions' activities. A research program on the standards, tools and metrics to support transition to the low-carbon climate-resilient development models. Paper 2 - Lessons from the use of climate-related decision-making standards and tools by DFIs to facilitate the transition to a low-carbon, climate-resilient future

    International Nuclear Information System (INIS)

    Cochran, Ian; Eschalier, Claire; Deheza, Mariana

    2015-10-01

    The integration or 'mainstreaming' of climate change into development finance decisions poses a broad number of operational challenges. Drawing from the current practice of Development Finance Institutions (DFIs), this paper first identifies three families of tools and metrics used by DFIs to integrate both mitigation and adaptation objectives into investment decision making. Based on this analysis, it then establishes a framework for integrating carbon standards and tools into the upstream strategic and downstream assessment stages of investment decision making. It principally considers the integration into the assessment of direct project finance and investment, but also looks at budget support, programmatic and indirect interventions. Finally, the paper identifies the next steps to build on existing tools and indicators that currently focus on climate finance tracking to those that foster the alignment of long-term development with the 2 deg. C climate objective. This alignment implies moving from 'static' assessment tools - that identify whether or not emissions are reduced or resiliency is increased by an action - to a 'dynamic' process within which the 'transition impact' is assessed. (authors)

  11. Energy Metrics for State Government Buildings

    Science.gov (United States)

    Michael, Trevor

    Measuring true progress towards energy conservation goals requires the accurate reporting and accounting of energy consumption. An accurate energy metrics framework is also a critical element for verifiable Greenhouse Gas Inventories. Energy conservation in government can reduce expenditures on energy costs leaving more funds available for public services. In addition to monetary savings, conserving energy can help to promote energy security, air quality, and a reduction of carbon footprint. With energy consumption/GHG inventories recently produced at the Federal level, state and local governments are beginning to also produce their own energy metrics systems. In recent years, many states have passed laws and executive orders which require their agencies to reduce energy consumption. In June 2008, SC state government established a law to achieve a 20% energy usage reduction in state buildings by 2020. This study examines case studies from other states who have established similar goals to uncover the methods used to establish an energy metrics system. Direct energy consumption in state government primarily comes from buildings and mobile sources. This study will focus exclusively on measuring energy consumption in state buildings. The case studies reveal that many states including SC are having issues gathering the data needed to accurately measure energy consumption across all state buildings. Common problems found include a lack of enforcement and incentives that encourage state agencies to participate in any reporting system. The case studies are aimed at finding the leverage used to gather the needed data. The various approaches at coercing participation will hopefully reveal methods that SC can use to establish the accurate metrics system needed to measure progress towards its 20% by 2020 energy reduction goal. Among the strongest incentives found in the case studies is the potential for monetary savings through energy efficiency. Framing energy conservation

  12. Separable metrics and radiating stars

    Indian Academy of Sciences (India)

    We study the junction condition relating the pressure to heat flux at the boundary of an accelerating and expanding spherically symmetric radiating star. We transform the junction condition to an ordinary differential equation by making a separability assumption on the metric functions in the space–time variables.

  13. Socio-technical security metrics

    NARCIS (Netherlands)

    Gollmann, D.; Herley, C.; Koenig, V.; Pieters, W.; Sasse, M.A.

    2015-01-01

    Report from Dagstuhl seminar 14491. This report documents the program and the outcomes of Dagstuhl Seminar 14491 “Socio-Technical Security Metrics”. In the domain of safety, metrics inform many decisions, from the height of new dikes to the design of nuclear plants. We can state, for example, that

  14. Leading Gainful Employment Metric Reporting

    Science.gov (United States)

    Powers, Kristina; MacPherson, Derek

    2016-01-01

    This chapter will address the importance of intercampus involvement in reporting of gainful employment student-level data that will be used in the calculation of gainful employment metrics by the U.S. Department of Education. The authors will discuss why building relationships within the institution is critical for effective gainful employment…

  15. Carbonization

    Energy Technology Data Exchange (ETDEWEB)

    Hennebutte, H G; Goutal, E

    1921-07-04

    Materials such as coal, peat, or schist are subjected to a rising temperature in successive stages in an apparatus in which the distillation products are withdrawn at each stage. For example, in a three-stage process, the acid products of the first or low-temperature stage are fixed in a suitable reagent, the basic products of the second or higher-temperature stage are absorbed in an acid reagent, with hydrocarbons retained by solvents, while the products of the third stage are subjected to a pyrogenation process carried out in a closed vessel. Throughout, the material is subjected in stages to a rising temperature, the gasified products being withdrawn at each stage and prevented as far as possible from mixing with the carbonized products.

  16. Hominin Sites and Paleolakes Drilling Project. Chew Bahir, southern Ethiopia: How to get from three tonnes of sediment core to > 500 ka of continuous climate history?

    Science.gov (United States)

    Foerster, Verena; Asrat, Asfawossen; Cohen, Andrew S.; Gromig, Raphael; Günter, Christina; Junginger, Annett; Lamb, Henry F.; Schaebitz, Frank; Trauth, Martin H.

    2016-04-01

    In search of the environmental context of the evolution and dispersal of Homo sapiens and our close relatives within and beyond the African continent, the ICDP-funded Hominin Sites and Paleolakes Drilling Project (HSPDP) has recently cored five fluvio-lacustrine archives of climate change in East Africa. The sediment cores collected in Ethiopia and Kenya are expected to provide valuable insights into East African environmental variability during the last ~3.5 Ma. The tectonically-bound Chew Bahir basin in the southern Ethiopian rift is one of the five sites within HSPDP, located in close proximity to the Lower Omo River valley, the site of the oldest known fossils of anatomically modern humans. In late 2014, the two cores (279 and 266 m long respectively, HSPDP-CHB14-2A and 2B) were recovered, summing up to nearly three tonnes of mostly calcareous clays and silts. Deciphering an environmental record from multiple records, from the source region of modern humans could eventually allow us to reconstruct the pronounced variations of moisture availability during the transition into Middle Stone Age, and its implications for the origin and dispersal of Homo sapiens. Here we present the first results of our analysis of the Chew Bahir cores. Following the HSPDP protocols, the two parallel Chew Bahir sediment cores have been merged into one single, 280 m long and nearly continuous (>90%) composite core on the basis of a high resolution MSCL data set (e.g., magnetic susceptibility, gamma ray density, color intensity transects, core photographs). Based on the obvious cyclicities in the MSCL, correlated with orbital cycles, the time interval covered by our sediment archive of climate change is inferred to span the last 500-600 kyrs. Combining our first results from the long cores with the results from the accomplished pre-study of short cores taken in 2009/10 along a NW-SE transect across the basin (Foerster et al., 2012, Trauth et al., 2015), we have developed a hypothesis

  17. Carbon Value Analysis of Batang Gadis National Park, Mandailing Natal Regency, North Sumatera Province, Indonesia

    Science.gov (United States)

    Daulay, Dini Novalanty Ohara; Hidayat, Jafron Wasiq

    2018-02-01

    Global warming is an important issue worldwide and has a negative effect on human life. One indicator of global warming is the increase in greenhouse gases, i.e. carbon dioxide, from human activities. Deforestation and forest degradation are the second largest contributor of carbon to the atmosphere, after the use of fossil fuels by industry and transportation. As the lungs of the world, forests are able to produce renewable energy sources, i.e. biomass. Forest carbon stock in above-ground biomass (AGB) is the pool most strongly affected by deforestation and forest degradation. Therefore, it is necessary to study the carbon potential of forests. The purpose of this research is to determine the carbon stock value in Batang Gadis National Park, Mandailing Natal Regency, North Sumatera Province, Indonesia. The carbon potential stored in this forest vegetation is calculated using an AGB allometric equation based on diameter at breast height (dbh = 1.3 m), tree height, and wood density. Data were obtained from a secondary source, the 2016 Asset Assessment Report on State-Controlled Forest Natural Resources of Batang Gadis National Park. The study locations were Pagar Gunung and Sopo Tinjak Villages. Carbon stock values were calculated under the assumption that half of the biomass is carbon stock, using the Australian carbon price of about AUD 11.82 (Australian dollars) and the EU price of €5 (US$6). The results showed that the total biomass in Pagar Gunung and Sopo Tinjak Villages amounted to 259.83 tonnes and 160.89 tonnes, respectively. From the total biomass, the total carbon stocks (C) and CO2 stocks in the two villages are 210.36 tonnes (129.92 tonnes in Pagar Gunung Village and 80.45 tonnes in Sopo Tinjak Village) and 772.03 tonnes (476.79 tonnes in Pagar Gunung Village and 295.24 tonnes in Sopo Tinjak Village). By using the carbon prices prevailing in the Australian Emission Trading System (ETS) and the EU ETS (AUD 11.82/t CO2e and €5 (US
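
    The reported figures can be reproduced with a short calculation (a sketch of the arithmetic implied by the abstract; the carbon fraction of 0.5 is stated above, while the carbon-to-CO2 factor of 3.67, roughly 44/12, is inferred from the reported totals):

        # Reproducing the Batang Gadis carbon-stock figures reported above:
        # carbon = 0.5 * biomass; CO2 = 3.67 * carbon; value = CO2 * price.
        CARBON_FRACTION = 0.5   # half of biomass assumed to be carbon (as stated)
        C_TO_CO2 = 3.67         # ~44/12, molecular-weight ratio of CO2 to C (inferred)
        PRICE_AUD = 11.82       # AUD per tonne CO2e (Australian price cited above)

        biomass = {"Pagar Gunung": 259.83, "Sopo Tinjak": 160.89}   # tonnes AGB

        for village, b in biomass.items():
            carbon = CARBON_FRACTION * b
            co2 = C_TO_CO2 * carbon
            print(f"{village}: C = {carbon:.2f} t, CO2 = {co2:.2f} t, "
                  f"value ~ AUD {co2 * PRICE_AUD:,.0f}")
        # Matches, to rounding, the reported 129.92 / 476.79 t for Pagar Gunung
        # and 80.45 / 295.24 t for Sopo Tinjak.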

  18. Carbon Value Analysis of Batang Gadis National Park, Mandailing Natal Regency, North Sumatera Province, Indonesia

    Directory of Open Access Journals (Sweden)

    Novalanty Ohara Daulay Dini

    2018-01-01

    Global warming is an important issue worldwide and has a negative effect on human life. One indicator of global warming is the increase in greenhouse gases, i.e. carbon dioxide, from human activities. Deforestation and forest degradation are the second largest contributor of carbon to the atmosphere, after the use of fossil fuels by industry and transportation. As the lungs of the world, forests are able to produce renewable energy sources, i.e. biomass. Forest carbon stock in above-ground biomass (AGB) is the pool most strongly affected by deforestation and forest degradation. Therefore, it is necessary to study the carbon potential of forests. The purpose of this research is to determine the carbon stock value in Batang Gadis National Park, Mandailing Natal Regency, North Sumatera Province, Indonesia. The carbon potential stored in this forest vegetation is calculated using an AGB allometric equation based on diameter at breast height (dbh = 1.3 m), tree height, and wood density. Data were obtained from a secondary source, the 2016 Asset Assessment Report on State-Controlled Forest Natural Resources of Batang Gadis National Park. The study locations were Pagar Gunung and Sopo Tinjak Villages. Carbon stock values were calculated under the assumption that half of the biomass is carbon stock, using the Australian carbon price of about AUD 11.82 (Australian dollars) and the EU price of €5 (US$6). The results showed that the total biomass in Pagar Gunung and Sopo Tinjak Villages amounted to 259.83 tonnes and 160.89 tonnes, respectively. From the total biomass, the total carbon stocks (C) and CO2 stocks in the two villages are 210.36 tonnes (129.92 tonnes in Pagar Gunung Village and 80.45 tonnes in Sopo Tinjak Village) and 772.03 tonnes (476.79 tonnes in Pagar Gunung Village and 295.24 tonnes in Sopo Tinjak Village). By using the carbon prices prevailing in the Australian Emission Trading System (ETS) and the EU ETS (AUD 11.82/t

  19. Group covariance and metrical theory

    International Nuclear Information System (INIS)

    Halpern, L.

    1983-01-01

    The a priori introduction of a Lie group of transformations into a physical theory has often proved to be useful; it usually serves to describe special simplified conditions before a general theory can be worked out. Newton's assumptions of absolute space and time are examples where the Euclidian group and translation group have been introduced. These groups were extended to the Galilei group and modified in the special theory of relativity to the Poincare group to describe physics under the given conditions covariantly in the simplest way. The criticism of the a priori character leads to the formulation of the general theory of relativity. The general metric theory does not really give preference to a particular invariance group - even the principle of equivalence can be adapted to a whole family of groups. The physical laws covariantly inserted into the metric space are however adapted to the Poincare group. 8 references

  20. General relativity: An erfc metric

    Science.gov (United States)

    Plamondon, Réjean

    2018-06-01

    This paper proposes an erfc potential to incorporate in a symmetric metric. One key feature of this model is that it relies on the existence of an intrinsic physical constant σ, a star-specific proper length that scales all its surroundings. Based thereon, the new metric is used to study the space-time geometry of a static symmetric massive object, as seen from its interior. The analytical solutions to the Einstein equation are presented, highlighting the absence of singularities and discontinuities in such a model. The geodesics are derived in their second- and first-order differential formats. Recalling the slight impact of the new model on the classical general relativity tests in the solar system, a number of facts and open problems are briefly revisited on the basis of a heuristic definition of σ. A special attention is given to gravitational collapses and non-singular black holes.

  1. hdm: High-dimensional metrics

    OpenAIRE

    Chernozhukov, Victor; Hansen, Christian; Spindler, Martin

    2016-01-01

    In this article the package High-dimensional Metrics (hdm) is introduced. It is a collection of statistical methods for estimation and quantification of uncertainty in high-dimensional approximately sparse models. It focuses on providing confidence intervals and significance testing for (possibly many) low-dimensional subcomponents of the high-dimensional parameter vector. Efficient estimators and uniformly valid confidence intervals for regression coefficients on target variables (e...

  2. Multi-Metric Sustainability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cowlin, Shannon [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heimiller, Donna [National Renewable Energy Lab. (NREL), Golden, CO (United States); Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Pless, Jacquelyn [National Renewable Energy Lab. (NREL), Golden, CO (United States); Munoz, David [Colorado School of Mines, Golden, CO (United States)

    2014-12-01

    A readily accessible framework that allows for evaluating impacts and comparing tradeoffs among factors in energy policy, expansion planning, and investment decision making is lacking. Recognizing this, the Joint Institute for Strategic Energy Analysis (JISEA) funded an exploration of multi-metric sustainability analysis (MMSA) to provide energy decision makers with a means to make more comprehensive comparisons of energy technologies. The resulting MMSA tool lets decision makers simultaneously compare technologies and potential deployment locations.

  3. Sensory Metrics of Neuromechanical Trust.

    Science.gov (United States)

    Softky, William; Benford, Criscillia

    2017-09-01

    Today digital sources supply a historically unprecedented component of human sensorimotor data, the consumption of which is correlated with poorly understood maladies such as Internet addiction disorder and Internet gaming disorder. Because both natural and digital sensorimotor data share common mathematical descriptions, one can quantify our informational sensorimotor needs using the signal processing metrics of entropy, noise, dimensionality, continuity, latency, and bandwidth. Such metrics describe in neutral terms the informational diet human brains require to self-calibrate, allowing individuals to maintain trusting relationships. With these metrics, we define the trust humans experience using the mathematical language of computational models, that is, as a primitive statistical algorithm processing finely grained sensorimotor data from neuromechanical interaction. This definition of neuromechanical trust implies that artificial sensorimotor inputs and interactions that attract low-level attention through frequent discontinuities and enhanced coherence will decalibrate a brain's representation of its world over the long term by violating the implicit statistical contract for which self-calibration evolved. Our hypersimplified mathematical understanding of human sensorimotor processing as multiscale, continuous-time vibratory interaction allows equally broad-brush descriptions of failure modes and solutions. For example, we model addiction in general as the result of homeostatic regulation gone awry in novel environments (sign reversal) and digital dependency as a sub-case in which the decalibration caused by digital sensorimotor data spurs yet more consumption of them. We predict that institutions can use these sensorimotor metrics to quantify media richness to improve employee well-being; that dyads and family-size groups will bond and heal best through low-latency, high-resolution multisensory interaction such as shared meals and reciprocated touch; and

  4. Metric reconstruction from Weyl scalars

    Energy Technology Data Exchange (ETDEWEB)

    Whiting, Bernard F; Price, Larry R [Department of Physics, PO Box 118440, University of Florida, Gainesville, FL 32611 (United States)

    2005-08-07

    The Kerr geometry has remained an elusive world in which to explore physics and delve into the more esoteric implications of general relativity. Following the discovery, by Kerr in 1963, of the metric for a rotating black hole, the most major advance has been an understanding of its Weyl curvature perturbations based on Teukolsky's discovery of separable wave equations some ten years later. In the current research climate, where experiments across the globe are preparing for the first detection of gravitational waves, a more complete understanding than concerns just the Weyl curvature is now called for. To understand precisely how comparatively small masses move in response to the gravitational waves they emit, a formalism has been developed based on a description of the whole spacetime metric perturbation in the neighbourhood of the emission region. Presently, such a description is not available for the Kerr geometry. While there does exist a prescription for obtaining metric perturbations once curvature perturbations are known, it has become apparent that there are gaps in that formalism which are still waiting to be filled. The most serious gaps include gauge inflexibility, the inability to include sources-which are essential when the emitting masses are considered-and the failure to describe the l = 0 and 1 perturbation properties. Among these latter properties of the perturbed spacetime, arising from a point mass in orbit, are the perturbed mass and axial component of angular momentum, as well as the very elusive Carter constant for non-axial angular momentum. A status report is given on recent work which begins to repair these deficiencies in our current incomplete description of Kerr metric perturbations.

  5. Metric reconstruction from Weyl scalars

    International Nuclear Information System (INIS)

    Whiting, Bernard F; Price, Larry R

    2005-01-01

    The Kerr geometry has remained an elusive world in which to explore physics and delve into the more esoteric implications of general relativity. Following the discovery, by Kerr in 1963, of the metric for a rotating black hole, the most major advance has been an understanding of its Weyl curvature perturbations based on Teukolsky's discovery of separable wave equations some ten years later. In the current research climate, where experiments across the globe are preparing for the first detection of gravitational waves, a more complete understanding than concerns just the Weyl curvature is now called for. To understand precisely how comparatively small masses move in response to the gravitational waves they emit, a formalism has been developed based on a description of the whole spacetime metric perturbation in the neighbourhood of the emission region. Presently, such a description is not available for the Kerr geometry. While there does exist a prescription for obtaining metric perturbations once curvature perturbations are known, it has become apparent that there are gaps in that formalism which are still waiting to be filled. The most serious gaps include gauge inflexibility, the inability to include sources-which are essential when the emitting masses are considered-and the failure to describe the l = 0 and 1 perturbation properties. Among these latter properties of the perturbed spacetime, arising from a point mass in orbit, are the perturbed mass and axial component of angular momentum, as well as the very elusive Carter constant for non-axial angular momentum. A status report is given on recent work which begins to repair these deficiencies in our current incomplete description of Kerr metric perturbations

  6. Sustainability Metrics: The San Luis Basin Project

    Science.gov (United States)

    Sustainability is about promoting humanly desirable dynamic regimes of the environment. Metrics: ecological footprint, net regional product, exergy, emergy, and Fisher Information. Adaptive management: (1) metrics assess problem, (2) specific problem identified, and (3) managemen...

  7. Properties of Activated Carbon Prepared from Coconut Shells in ...

    African Journals Online (AJOL)

    Materials commonly used for the preparation of activated carbons include coal and coconut shells. Ghana generates over 30,000 tonnes of coconut shells annually from coconut oil processing activities, but apart from a small percentage of the shells, which is burned as fuel, the remainder is usually dumped as waste.

  8. Historical storage budgets of organic carbon, nutrient and contaminant elements in saltmarsh sediments: Biogeochemical context for managed realignment, Humber Estuary, UK

    International Nuclear Information System (INIS)

    Andrews, J.E.; Samways, G.; Shimmield, G.B.

    2008-01-01

    Biogeochemical data from Welwick marsh (Humber Estuary, UK), an actively accreting saltmarsh, provide a decadal-centennial-scale natural analogue for likely future biogeochemical storage effects of managed realignment sites accreting either intertidal muds or saltmarsh. Marsh topographic profiles and progradation history from aerial photographs were combined with ¹³⁷Cs and niobium contamination history to establish and verify chronology and sediment mass accumulation. These data, combined with down-core measurements of particulate organic carbon (Corg), organic nitrogen (Norg), particle-reactive phosphorus and selected contaminant metal (Zn, Pb, Cu, As and Nb) contents, were then used to calculate sediment and chemical storage terms and to quantify changes in these over time. These data are used to help predict likely future biogeochemical storage changes at managed realignment sites in the estuary. The net effect of returning some 26 km² of reclaimed land to intertidal environments now (about 25% of the maximum possible realignment storage identified for the estuary) could result in the storage of some 40,000 tonnes a⁻¹ of sediment, which would also bury about 800 tonnes a⁻¹ of Corg and 40 tonnes a⁻¹ of Norg. Particulate contaminant P burial would be around 25 tonnes a⁻¹, along with ∼6 tonnes a⁻¹ contaminant Zn, 3 tonnes a⁻¹ contaminant Pb, and ∼1 tonne a⁻¹ contaminant As and Cu. The study also shows that reclamation activities in the outer estuary since the mid-1700s have prevented, in total, the deposition of about 10 million tonnes of sediment, along with 320,000 tonnes of Corg and 16,000 tonnes of Norg. The study provides a mid-1990s baseline against which future measurements at the site can determine changes in burial fluxes and improvement or deterioration in contaminant metal contents of the sediments. The data are directly relevant for local managed realignment sites but also broadly indicative for sites generally on the European

  9. Reducing the environmental impact of trials: a comparison of the carbon footprint of the CRASH-1 and CRASH-2 clinical trials

    Science.gov (United States)

    2011-01-01

    Background All sectors of the economy, including the health research sector, must reduce their carbon emissions. The UK National Institute for Health Research has recently prepared guidelines on how to minimize the carbon footprint of research. We compare the carbon emissions from two international clinical trials in order to identify where emissions reductions can be made. Methods We conducted a carbon audit of two clinical trials (the CRASH-1 and CRASH-2 trials), quantifying the carbon dioxide emissions produced over a one-year audit period. Carbon emissions arising from the coordination centre, freight delivery, trial-related travel and commuting were calculated and compared. Results The total emissions in carbon dioxide equivalents during the one-year audit period were 181.3 tonnes for CRASH-1 and 108.2 tonnes for CRASH-2. In total, CRASH-1 emitted 924.6 tonnes of carbon dioxide equivalents compared with 508.5 tonnes for CRASH-2. The CRASH-1 trial recruited 10,008 patients over 5.1 years, corresponding to 92 kg of carbon dioxide per randomized patient. The CRASH-2 trial recruited 20,211 patients over 4.7 years, corresponding to 25 kg of carbon dioxide per randomized patient. The largest contributor to emissions in CRASH-1 was freight delivery of trial materials (86.0 tonnes, 48% of total emissions), whereas the largest contributor in CRASH-2 was energy use by the trial coordination centre (54.6 tonnes, 30% of total emissions). Conclusions Faster patient recruitment in the CRASH-2 trial largely accounted for its greatly increased carbon efficiency in terms of emissions per randomized patient. Lighter trial materials and web-based data entry also contributed to the overall lower carbon emissions in CRASH-2 as compared to CRASH-1. Trial Registration Numbers CRASH-1: ISRCTN74459797 CRASH-2: ISRCTN86750102 PMID:21291517
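
    As a quick arithmetic check of the per-patient figures above, the sketch below (illustrative only, using just the totals and recruitment numbers quoted in the abstract) reproduces the reported carbon intensity per randomized patient.

```python
# Reproduce the per-patient carbon intensities quoted in the abstract.
trials = {
    "CRASH-1": {"total_tonnes_co2e": 924.6, "patients": 10_008},
    "CRASH-2": {"total_tonnes_co2e": 508.5, "patients": 20_211},
}

for name, data in trials.items():
    kg_per_patient = data["total_tonnes_co2e"] * 1000 / data["patients"]
    print(f"{name}: {kg_per_patient:.0f} kg CO2e per randomized patient")
# Prints roughly 92 kg for CRASH-1 and 25 kg for CRASH-2, matching the report.
```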

  10. Reducing the environmental impact of trials: a comparison of the carbon footprint of the CRASH-1 and CRASH-2 clinical trials

    Directory of Open Access Journals (Sweden)

    Roberts Ian

    2011-02-01

    Full Text Available Abstract Background All sectors of the economy, including the health research sector, must reduce their carbon emissions. The UK National Institute for Health Research has recently prepared guidelines on how to minimize the carbon footprint of research. We compare the carbon emissions from two international clinical trials in order to identify where emissions reductions can be made. Methods We conducted a carbon audit of two clinical trials (the CRASH-1 and CRASH-2 trials), quantifying the carbon dioxide emissions produced over a one-year audit period. Carbon emissions arising from the coordination centre, freight delivery, trial-related travel and commuting were calculated and compared. Results The total emissions in carbon dioxide equivalents during the one-year audit period were 181.3 tonnes for CRASH-1 and 108.2 tonnes for CRASH-2. In total, CRASH-1 emitted 924.6 tonnes of carbon dioxide equivalents compared with 508.5 tonnes for CRASH-2. The CRASH-1 trial recruited 10,008 patients over 5.1 years, corresponding to 92 kg of carbon dioxide per randomized patient. The CRASH-2 trial recruited 20,211 patients over 4.7 years, corresponding to 25 kg of carbon dioxide per randomized patient. The largest contributor to emissions in CRASH-1 was freight delivery of trial materials (86.0 tonnes, 48% of total emissions), whereas the largest contributor in CRASH-2 was energy use by the trial coordination centre (54.6 tonnes, 30% of total emissions). Conclusions Faster patient recruitment in the CRASH-2 trial largely accounted for its greatly increased carbon efficiency in terms of emissions per randomized patient. Lighter trial materials and web-based data entry also contributed to the overall lower carbon emissions in CRASH-2 as compared to CRASH-1. Trial Registration Numbers CRASH-1: ISRCTN74459797 CRASH-2: ISRCTN86750102

  11. Crowdsourcing metrics of digital collections

    Directory of Open Access Journals (Sweden)

    Tuula Pääkkönen

    2015-12-01

    Full Text Available In the National Library of Finland (NLF) there are millions of digitized newspaper and journal pages, which are openly available via the public website http://digi.kansalliskirjasto.fi. To serve users better, last year the front end was completely overhauled, with crowdsourcing features as its main aim, e.g., by giving end-users the opportunity to create digital clippings and a personal scrapbook from the digital collections. But how can you know whether crowdsourcing has had an impact? How much have the crowdsourcing functionalities been used so far? Did crowdsourcing work? In this paper the statistics and metrics of a recent crowdsourcing effort are analysed across the different digitized material types (newspapers, journals, ephemera). The subjects, categories and keywords given by the users are analysed to see which topics are the most appealing. Some notable public uses of the crowdsourced article clippings are highlighted. These metrics give us indications on how the end-users, based on their own interests, are investigating and using the digital collections. Therefore, the suggested metrics illustrate the versatility of the information needs of the users, varying from citizen science to research purposes. By analysing the user patterns, we can respond to the new needs of the users by making minor changes to accommodate the most active participants, while still making the service more approachable for those who are trying out the functionalities for the first time. Participation in the clippings and annotations can enrich the materials in unexpected ways and can possibly pave the way for opportunities of using crowdsourcing more also in research contexts. This creates more opportunities for the goals of open science since source data becomes available, making it possible for researchers to reach out to the general public for help. In the long term, utilizing, for example, text mining methods can allow these different end-user segments to

  12. A family of metric gravities

    Science.gov (United States)

    Shuler, Robert

    2018-04-01

    The goal of this paper is to take a completely fresh approach to metric gravity, in which the metric principle is strictly adhered to but its properties in local space-time are derived from conservation principles, not inferred from a global field equation. The global field strength variation then gains some flexibility, but only in the regime of very strong fields (2nd-order terms) whose measurement is now being contemplated. So doing provides a family of similar gravities, differing only in strong fields, which could be developed into meaningful verification targets for strong fields after the manner in which far-field variations were used in the 20th century. General Relativity (GR) is shown to be a member of the family and this is demonstrated by deriving the Schwarzschild metric exactly from a suitable field strength assumption. The method of doing so is interesting in itself because it involves only one differential equation rather than the usual four. Exact static symmetric field solutions are also given for one pedagogical alternative based on potential, and one theoretical alternative based on inertia, and the prospects of experimentally differentiating these are analyzed. Whether the method overturns the conventional wisdom that GR is the only metric theory of gravity and that alternatives must introduce additional interactions and fields is somewhat semantical, depending on whether one views the field strength assumption as a field and whether the assumption that produces GR is considered unique in some way. It is of course possible to have other fields, and the local space-time principle can be applied to field gravities which usually are weak-field approximations having only time dilation, giving them the spatial factor and promoting them to full metric theories. Though usually pedagogical, some of them are interesting from a quantum gravity perspective. Cases are noted where mass measurement errors, or distributions of dark matter, can cause one
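
    For readers unfamiliar with the target of the derivation mentioned above, the Schwarzschild line element that any such field-strength assumption must reproduce is the standard general-relativistic result (quoted here from textbook GR, not from the paper):

```latex
ds^2 = -\left(1 - \frac{2GM}{rc^2}\right)c^2\,dt^2
       + \left(1 - \frac{2GM}{rc^2}\right)^{-1}dr^2
       + r^2\left(d\theta^2 + \sin^2\theta\,d\varphi^2\right)
```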

  13. Hybrid metric-Palatini stars

    Science.gov (United States)

    Danilǎ, Bogdan; Harko, Tiberiu; Lobo, Francisco S. N.; Mak, M. K.

    2017-02-01

    We consider the internal structure and the physical properties of specific classes of neutron, quark and Bose-Einstein condensate stars in the recently proposed hybrid metric-Palatini gravity theory, which is a combination of the metric and Palatini f(R) formalisms. It turns out that the theory is very successful in accounting for the observed phenomenology, since it unifies local constraints at the Solar System level and the late-time cosmic acceleration, even if the scalar field is very light. In this paper, we derive the equilibrium equations for a spherically symmetric configuration (mass continuity and Tolman-Oppenheimer-Volkoff) in the framework of the scalar-tensor representation of the hybrid metric-Palatini theory, and we investigate their solutions numerically for different equations of state of neutron and quark matter, by adopting for the scalar field potential a Higgs-type form. It turns out that the scalar-tensor definition of the potential can be represented as a Clairaut differential equation, and provides an explicit form for f(R) given by f(R) ∼ R + Λeff, where Λeff is an effective cosmological constant. Furthermore, stellar models, described by the stiff fluid, radiation-like, bag model and the Bose-Einstein condensate equations of state are explicitly constructed in both general relativity and hybrid metric-Palatini gravity, thus allowing an in-depth comparison between the predictions of these two gravitational theories. As a general result it turns out that for all the considered equations of state, hybrid gravity stars are more massive than their general relativistic counterparts. Furthermore, two classes of stellar models corresponding to two particular choices of the functional form of the scalar field (constant value, and logarithmic form, respectively) are also investigated. Interestingly enough, in the case of a constant scalar field the equation of state of the matter takes the form of the bag model equation of state describing
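
    The equilibrium equations referred to above reduce, in the pure general-relativistic limit, to the familiar mass-continuity and Tolman-Oppenheimer-Volkoff system; the textbook GR form is quoted below for orientation (the hybrid metric-Palatini version derived in the paper adds scalar-field terms not shown here):

```latex
\frac{dm}{dr} = 4\pi r^{2}\rho, \qquad
\frac{dP}{dr} = -\,\frac{G\left(\rho + P/c^{2}\right)\left(m + 4\pi r^{3}P/c^{2}\right)}{r^{2}\left(1 - 2Gm/(rc^{2})\right)}
```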

  14. Metrics for Evaluation of Student Models

    Science.gov (United States)

    Pelanek, Radek

    2015-01-01

    Researchers use many different metrics for evaluation of performance of student models. The aim of this paper is to provide an overview of commonly used metrics, to discuss properties, advantages, and disadvantages of different metrics, to summarize current practice in educational data mining, and to provide guidance for evaluation of student…

  15. Context-dependent ATC complexity metric

    NARCIS (Netherlands)

    Mercado Velasco, G.A.; Borst, C.

    2015-01-01

    Several studies have investigated Air Traffic Control (ATC) complexity metrics in a search for a metric that could best capture workload. These studies have shown how daunting the search for a universal workload metric (one that could be applied in different contexts: sectors, traffic patterns,

  16. Properties of C-metric spaces

    Science.gov (United States)

    Croitoru, Anca; Apreutesei, Gabriela; Mastorakis, Nikos E.

    2017-09-01

    The subject of this paper belongs to the theory of approximate metrics [23]. An approximate metric on X is a real-valued function defined on X × X that satisfies only a part of the metric axioms. In a recent paper [23], we introduced a new type of approximate metric, named a C-metric, that is, a function which satisfies only two of the metric axioms: symmetry and the triangle inequality. The remarkable fact in a C-metric space is that a topological structure induced by the C-metric can be defined. The innovative idea of this paper is that we obtain some convergence properties of a C-metric space in the absence of a metric. In this paper we investigate C-metric spaces. The paper is divided into four sections. Section 1 is the Introduction. In Section 2 we recall some concepts and preliminary results. In Section 3 we present some properties of C-metric spaces, such as convergence properties, a canonical decomposition and a C-fixed point theorem. Finally, in Section 4 some conclusions are highlighted.
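
    In the notation of the abstract, a C-metric on a set X keeps only symmetry and the triangle inequality, with nothing else assumed; stated compactly (our paraphrase of the definition, not a quotation from the paper):

```latex
d : X \times X \to \mathbb{R}, \qquad
d(x,y) = d(y,x), \qquad
d(x,z) \le d(x,y) + d(y,z) \quad \text{for all } x, y, z \in X.
```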

  17. The nuclear fuel cycle versus the carbon cycle

    International Nuclear Information System (INIS)

    Ewing, R.C.

    2005-01-01

    Nuclear power provides approximately 17% of the world's electricity, which is equivalent to a reduction in carbon emissions of ∼0.5 gigatonnes (Gt) of C/yr. This is a modest reduction as compared with global emissions of carbon, ∼7 Gt C/yr. Most analyses suggest that in order to have a significant and timely impact on carbon emissions, carbon-free sources, such as nuclear power, would have to expand total production of energy by factors of three to ten by 2050. A three-fold increase in nuclear power capacity would result in a projected reduction in carbon emissions of 1 to 2 Gt C/yr, depending on the type of carbon-based energy source that is displaced. This three-fold increase utilizing present nuclear technologies would result in 25,000 metric tonnes (t) of spent nuclear fuel (SNF) per year, containing over 200 t of plutonium. This is compared to a present global inventory of approximately 280,000 t of SNF and >1,700 t of Pu. A nuclear weapon can be fashioned from as little as 5 kg of ²³⁹Pu. However, there is considerable technological flexibility in the nuclear fuel cycle. There are three types of nuclear fuel cycles that might be utilized for the increased production of energy: open, closed, or a symbiotic combination of different types of reactor (such as thermal and fast neutron reactors). The neutron energy spectrum has a significant effect on the fission product yield, and the consumption of long-lived actinides, by fission, is best achieved by fast neutrons. Within each cycle, the volume and composition of the high-level nuclear waste and fissile material depend on the type of nuclear fuel, the amount of burn-up, the extent of radionuclide separation during reprocessing, and the types of materials used to immobilize different radionuclides. As an example, a ²³²Th-based fuel cycle can be used to breed fissile ²³³U with minimum production of Pu. In this paper, I will contrast the production of excess carbon in the form of CO₂ from fossil fuels with

  18. Carbon charges and natural gas use in China

    International Nuclear Information System (INIS)

    Skeer, Jeffrey; Wang Yanjia

    2006-01-01

    Substitution of natural gas for coal in China's power sector could significantly reduce emissions of carbon dioxide, but gas-fired power is generally more costly than coal-fired power in China today. This paper explores how carbon charges and carbon sequestration technology might tip the balance in favour of gas. The costs of electricity from new coal-fired and gas-fired power plants in China are compared under various assumptions about fuel costs, exchange rates, carbon dioxide charges, and application of carbon sequestration technology. Under average cost conditions today, gas-fired power is roughly two-thirds more costly than coal-fired power. But with a charge of $20/tonne of carbon dioxide, the costs of gas- and coal-fired power would typically be about equal. Over the longer term, carbon sequestration technology could be economical with a carbon dioxide charge of $22/tonne or more under typical cost conditions, but gas with sequestration would not have a clear cost advantage over coal with sequestration unless the charge exceeded $35/tonne.
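
    The break-even logic described above can be sketched as a levelized-cost comparison in which a per-tonne CO2 charge is added in proportion to each technology's emission intensity. The generation costs and intensities in the snippet are hypothetical placeholders chosen so that the break-even lands near the paper's headline figure of about $20/tonne; they are not values taken from the study.

```python
# Illustrative only: how a CO2 charge shifts the coal-vs-gas cost comparison.
# Baseline costs (USD/MWh) and emission intensities (t CO2/MWh) are hypothetical.
def cost_with_charge(base_usd_per_mwh, t_co2_per_mwh, charge_usd_per_t):
    return base_usd_per_mwh + t_co2_per_mwh * charge_usd_per_t

coal = {"base": 30.0, "intensity": 1.0}
gas = {"base": 42.0, "intensity": 0.4}

for charge in (0, 10, 20, 30):
    c = cost_with_charge(coal["base"], coal["intensity"], charge)
    g = cost_with_charge(gas["base"], gas["intensity"], charge)
    print(f"charge {charge:>2} USD/t CO2: coal {c:5.1f} vs gas {g:5.1f} USD/MWh")

# Break-even charge = (gas base - coal base) / (coal intensity - gas intensity)
breakeven = (gas["base"] - coal["base"]) / (coal["intensity"] - gas["intensity"])
print(f"break-even charge ≈ {breakeven:.0f} USD/t CO2")  # 20 with these inputs
```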

  19. Mainstreaming Low-Carbon Climate-Resilient growth pathways into Development Finance Institutions' activities. A research project on the standards, tools and metrics to support transition to the low-carbon climate-resilient development models. Paper 1 - Climate and development finance institutions: linking climate finance, development finance and the transition to low-carbon, climate-resilient economic models

    International Nuclear Information System (INIS)

    Eschalier, Claire; Cochran, Ian; Deheza, Mariana; Risler, Ophelie; Forestier, Pierre

    2015-10-01

    Development finance institutions (DFIs) are in a position to be key actors in aligning development and the 2 deg. C challenge. One of the principal challenges today is to scale up the financial flows to the trillions of dollars per year necessary to achieve the 2 deg. C long-term objectives. Achieving this transition to a low-carbon, climate resilient (LCCR) economic model requires the integration or 'mainstreaming' of climate issues as a prism through which all investment decisions should be made. This paper presents an overview of the opportunities and challenges of linking an LCCR transition with the objectives of development finance. It first presents the two-fold challenge of climate change and development for countries around the world. Second, the paper explores the role of development finance institutions and their support for the transition to a low-carbon, climate-resilient economic model. Finally, it examines a necessary paradigm shift to integrate climate and development objectives to establish an 'LCCR development model' able to simultaneously tackle development priorities and needs for resilient, low-carbon growth. This will necessitate a move from focusing on a 'siloed' vision of climate finance to a means of aligning activities across the economy with the LCCR objectives to ensure that the majority of investments are coherent with this long-term transition. (authors)

  20. The carbon footprint of a renal service in the United Kingdom.

    Science.gov (United States)

    Connor, A; Lillywhite, R; Cooke, M W

    2010-12-01

    Anthropogenic climate change presents a major global health threat. However, the very provision of healthcare itself is associated with a significant environmental impact. Carbon footprinting techniques are increasingly used outside of the healthcare sector to assess greenhouse gas emissions and inform strategies to reduce them. This study represents the first assessment of the carbon footprint of an individual specialty service to include both direct and indirect emissions. This was a component analysis study. Activity data were collected for building energy use, travel and procurement. Established emissions factors were applied to reconcile this data to carbon dioxide equivalents (CO2eq) per year. The Dorset Renal Service has a carbon footprint of 3006 tonnes CO2eq per annum, of which 381 tonnes CO2eq (13% of overall emissions) result from building energy use, 462 tonnes CO2eq from travel (15%) and 2163 tonnes CO2eq (72%) from procurement. The contributions of the major subsectors within procurement are: pharmaceuticals, 1043 tonnes CO2eq (35% of overall emissions); medical equipment, 753 tonnes CO2eq (25%). The emissions associated with healthcare episodes were estimated at 161 kg CO2eq per bed day for an inpatient admission and 22 kg CO2eq for an outpatient appointment. These results suggest that carbon-reduction strategies focusing upon supply chain emissions are likely to yield the greatest benefits. Sustainable waste management and strategies to reduce emissions associated with building energy use and travel will also be important. A transformation in the way that clinical care is delivered is required, such that lower carbon clinical pathways, treatments and technologies are embraced. The estimations of greenhouse gas emissions associated with outpatient appointments and inpatient stays calculated here may facilitate modelling of the emissions of alternative pathways of care.
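
    The component analysis described above amounts to applying emission factors to activity data for each sector and summing; the sketch below simply re-derives the total and the sector shares from the figures reported in the abstract.

```python
# Sector totals (tonnes CO2eq per annum) for the Dorset Renal Service, as reported.
sectors = {"building energy": 381, "travel": 462, "procurement": 2163}
total = sum(sectors.values())
print(f"total footprint: {total} tonnes CO2eq/yr")          # 3006
for name, tonnes in sectors.items():
    print(f"  {name:>15}: {tonnes:4d} t CO2eq ({100 * tonnes / total:.0f}%)")
# Shares come out at roughly 13%, 15% and 72%, matching the reported breakdown.
```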

  1. On characterizations of quasi-metric completeness

    Energy Technology Data Exchange (ETDEWEB)

    Dag, H.; Romaguera, S.; Tirado, P.

    2017-07-01

    Hu proved in [4] that a metric space (X, d) is complete if and only if for any closed subspace C of (X, d), every Banach contraction on C has a fixed point. Since then several authors have investigated the problem of characterizing metric completeness by means of fixed point theorems. Recently this problem has been studied in the more general context of quasi-metric spaces for different notions of completeness. Here we present a characterization of a kind of completeness for quasi-metric spaces by means of a quasi-metric version of Hu's theorem. (Author)

  2. The Metric of Colour Space

    DEFF Research Database (Denmark)

    Gravesen, Jens

    2015-01-01

    The space of colours is a fascinating space. It is a real vector space, but no matter what inner product you put on the space the resulting Euclidean distance does not correspond to human perception of difference between colours. In 1942 MacAdam performed the first experiments on colour matching and found the MacAdam ellipses which are often interpreted as defining the metric tensor at their centres. An important question is whether it is possible to define colour coordinates such that the Euclidean distance in these coordinates correspond to human perception. Using cubic splines to represent...

  3. Product Operations Status Summary Metrics

    Science.gov (United States)

    Takagi, Atsuya; Toole, Nicholas

    2010-01-01

    The Product Operations Status Summary Metrics (POSSUM) computer program provides a readable view into the state of the Phoenix Operations Product Generation Subsystem (OPGS) data pipeline. POSSUM provides a user interface that can search the data store, collect product metadata, and display the results in an easily-readable layout. It was designed with flexibility in mind for support in future missions. Flexibility over various data store hierarchies is provided through the disk-searching facilities of Marsviewer. This is a proven program that has been in operational use since the first day of the Phoenix mission.

  4. Resolving society's energy trilemma through the Energy Justice Metric

    International Nuclear Information System (INIS)

    Heffron, Raphael J.; McCauley, Darren; Sovacool, Benjamin K.

    2015-01-01

    Carbon dioxide emissions continue to increase to the detriment of society in many forms. One of the difficulties faced is the imbalance between the competing aims of economics, politics and the environment which form the trilemma of energy policy. This article advances that this energy trilemma can be resolved through energy justice. Energy justice develops the debate on energy policy to one that highlights cosmopolitanism, progresses thinking beyond economics and incorporates a new futuristic perspective. To capture these dynamics of energy justice, this research developed an Energy Justice Metric (EJM) that involves the calculation of several metrics: (1) a country (national) EJM; (2) an EJM for different energy infrastructure; and (3) an EJM which is incorporated into economic models that derive costs for energy infrastructure projects. An EJM is modeled for China, the European Union and the United States, and for different energy infrastructure in the United Kingdom. The EJM is plotted on a Ternary Phase Diagram which is used in the sciences for analyzing the relationship (trilemma) of three forms of matter. The development of an EJM can provide a tool for decision-making on energy policy and one that solves the energy trilemma with a just and equitable approach. - Highlights: • Energy justice advances energy policy with cosmopolitanism and new economic-thinking. • An Energy Justice Metric is developed and captures the dynamics of energy justice. • The Energy Justice Metric (EJM) compares countries, and energy infrastructure. • EJM provides an energy policy decision-making tool that is just and equitable.

  5. Deep Energy Retrofit Performance Metric Comparison: Eight California Case Studies

    Energy Technology Data Exchange (ETDEWEB)

    Walker, Iain [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fisher, Jeremy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Less, Brennan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2014-06-01

    In this paper we will present the results of monitored annual energy use data from eight residential Deep Energy Retrofit (DER) case studies using a variety of performance metrics. For each home, the details of the retrofits were analyzed, diagnostic tests to characterize the home were performed and the homes were monitored for total and individual end-use energy consumption for approximately one year. Annual performance in site and source energy, as well as carbon dioxide equivalent (CO2e) emissions were determined on a per house, per person and per square foot basis to examine the sensitivity to these different metrics. All eight DERs showed consistent success in achieving substantial site energy and CO2e reductions, but some projects achieved very little, if any source energy reduction. This problem emerged in those homes that switched from natural gas to electricity for heating and hot water, resulting in energy consumption dominated by electricity use. This demonstrates the crucial importance of selecting an appropriate metric to be used in guiding retrofit decisions. Also, due to the dynamic nature of DERs, with changes in occupancy, size, layout, and comfort, several performance metrics might be necessary to understand a project’s success.
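
    The sensitivity to the chosen metric noted above is easiest to see with site-to-source conversion factors: source energy weights each delivered kilowatt-hour by the primary energy needed to supply it, so electrifying gas loads can cut site energy while leaving source energy flat or higher. The conversion factors and consumption figures in the sketch are illustrative assumptions, not the case-study data.

```python
# Illustrative: site vs. source energy before and after an all-electric retrofit.
# Site-to-source factors below are typical assumptions, not the study's values.
SOURCE_FACTOR = {"electricity": 3.0, "natural_gas": 1.1}

def site_and_source(annual_kwh_by_fuel):
    site = sum(annual_kwh_by_fuel.values())
    source = sum(kwh * SOURCE_FACTOR[fuel] for fuel, kwh in annual_kwh_by_fuel.items())
    return site, source

pre = {"electricity": 6_000, "natural_gas": 18_000}    # kWh/yr, hypothetical
post = {"electricity": 14_000, "natural_gas": 0}       # all-electric after retrofit

for label, use in (("pre-retrofit ", pre), ("post-retrofit", post)):
    site, source = site_and_source(use)
    print(f"{label}: site {site:>6,} kWh, source {source:>8,.0f} kWh")
# Site energy falls from 24,000 to 14,000 kWh, yet source energy rises from
# 37,800 to 42,000 kWh in this example, hence the importance of the chosen metric.
```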

  6. Web metrics for library and information professionals

    CERN Document Server

    Stuart, David

    2014-01-01

    This is a practical guide to using web metrics to measure impact and demonstrate value. The web provides an opportunity to collect a host of different metrics, from those associated with social media accounts and websites to more traditional research outputs. This book is a clear guide for library and information professionals as to what web metrics are available and how to assess and use them to make informed decisions and demonstrate value. As individuals and organizations increasingly use the web in addition to traditional publishing avenues and formats, this book provides the tools to unlock web metrics and evaluate the impact of this content. The key topics covered include: bibliometrics, webometrics and web metrics; data collection tools; evaluating impact on the web; evaluating social media impact; investigating relationships between actors; exploring traditional publications in a new environment; web metrics and the web of data; the future of web metrics and the library and information professional.Th...

  7. Metrics for building performance assurance

    Energy Technology Data Exchange (ETDEWEB)

    Koles, G.; Hitchcock, R.; Sherman, M.

    1996-07-01

    This report documents part of the work performed in phase I of a Laboratory Directed Research and Development (LDRD) funded project entitled Building Performance Assurances (BPA). The focus of the BPA effort is to transform the way buildings are built and operated in order to improve building performance by facilitating or providing tools, infrastructure, and information. The efforts described herein focus on the development of metrics with which to evaluate building performance and for which information and optimization tools need to be developed. The classes of building performance metrics reviewed are (1) Building Services, (2) First Costs, (3) Operating Costs, (4) Maintenance Costs, and (5) Energy and Environmental Factors. The first category defines the direct benefits associated with buildings; the next three are different kinds of costs associated with providing those benefits; the last category includes concerns that are broader than direct costs and benefits to the building owner and building occupants. The level of detail of the various issues reflects the current state of knowledge in those scientific areas and the ability to determine that state of knowledge, rather than directly reflecting the importance of these issues; it intentionally does not specifically focus on energy issues. The report describes work in progress and is intended as a resource that can be used to indicate the areas needing more investigation. Other reports on BPA activities are also available.

  8. Measuring Carbon Footprint of Flexible Pavement Construction Project in Indonesia

    Science.gov (United States)

    Hatmoko, Jati Utomo Dwi; Hidayat, Arif; Setiawati, Apsari; Prasetyo, Stefanus Catur Adi

    2018-02-01

    Road infrastructure in Indonesia is mainly dominated by the flexible pavement type. Its construction process, however, has raised concerns in terms of its environmental impacts. This study aims to track and measure the carbon footprint of flexible pavement. The objectives are to map the construction process in relation to greenhouse gas (GHG) emissions and to quantify them in terms of carbon dioxide equivalents (CO2e) as generated by the production and transportation of raw materials and the operation of plant off-site and on-site. Data collection was done through site observations and interviews with project stakeholders. The results show total emissions of 70.888 tonnes CO2e, consisting of 34.248 tonnes CO2e (48.31%) from off-site activities and 36.640 tonnes CO2e (51.687%) from on-site activities. The two highest CO2e contributions were generated by the use of plant for asphalt concrete laying activities, accounting for 34.827 tonnes CO2e (49.130%), and material transportation, accounting for 24.921 tonnes CO2e (35.155%). These findings provide a new perspective on the carbon footprint of flexible pavement and suggest the urgent need for more efficient and environmentally friendly plant in the construction process, as it makes the most significant contribution to the CO2e. This study provides valuable understanding of the environmental impact of typical flexible pavement projects in Indonesia, and can further be used for developing a green road framework.
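
    As a quick consistency check on the reported breakdown, the sketch below re-derives the category shares from the tonnage figures quoted in the abstract.

```python
# Reproduce the reported emission shares for the flexible pavement project.
off_site, on_site = 34.248, 36.640                    # tonnes CO2e
total = off_site + on_site                            # 70.888 tonnes CO2e
asphalt_laying_plant, material_transport = 34.827, 24.921

print(f"total: {total:.3f} t CO2e")
print(f"off-site share: {100 * off_site / total:.2f}%")                    # ~48.31%
print(f"on-site share:  {100 * on_site / total:.2f}%")                     # ~51.69%
print(f"asphalt laying plant: {100 * asphalt_laying_plant / total:.2f}%")  # ~49.13%
print(f"material transport:   {100 * material_transport / total:.2f}%")    # ~35.16%
```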

  9. A composite efficiency metrics for evaluation of resource and energy utilization

    International Nuclear Information System (INIS)

    Yang, Siyu; Yang, Qingchun; Qian, Yu

    2013-01-01

    Polygeneration systems are commonly found in the chemical and energy industries. These systems often involve chemical conversions and energy conversions. Studies of these systems are interdisciplinary, mainly involving the fields of chemical engineering, energy engineering, environmental science, and economics. Each of these fields has developed an isolated index system different from the others. Analyses of polygeneration systems are therefore very likely to give biased results when only the indexes from one field are used. This paper is motivated by this problem to develop a new composite efficiency metric for polygeneration systems. This new metric is based on the second law of thermodynamics, i.e. exergy theory. We introduce the exergy cost of waste treatment as an energy penalty into the conventional exergy efficiency. Using this new metric could avoid the situation of spending too much energy to increase production, or of sacrificing production capacity to save energy consumption. The composite metric is studied on a simplified co-production process, syngas to methanol and electricity. The advantage of the new efficiency metric is manifested by comparison with carbon element efficiency, energy efficiency, and exergy efficiency. Results show that the new metric could give a more rational analysis than the other indexes. - Highlights: • The composite efficiency metric gives a balanced evaluation of resource utilization and energy utilization. • This efficiency uses the exergy for waste treatment as the energy penalty. • This efficiency is applied on a simplified co-production process. • Results show that the composite metric is better than energy efficiencies and resource efficiencies.
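
    One plausible reading of the composite metric described above is an exergy efficiency whose denominator is augmented by the exergy spent on treating the wastes the process generates; the function below encodes that reading (an interpretation of the abstract, not the paper's exact formula, and the numbers are hypothetical).

```python
def composite_exergy_efficiency(exergy_of_products, exergy_of_inputs,
                                exergy_for_waste_treatment):
    """Useful exergy out divided by exergy in plus the exergy cost
    ("energy penalty") of treating process wastes. This is one possible
    reading of the composite metric, not the paper's exact definition."""
    return exergy_of_products / (exergy_of_inputs + exergy_for_waste_treatment)

# Hypothetical figures for a syngas-to-methanol-and-electricity co-production case:
eta = composite_exergy_efficiency(exergy_of_products=620.0,
                                  exergy_of_inputs=1000.0,
                                  exergy_for_waste_treatment=45.0)
print(f"composite exergy efficiency ≈ {eta:.2f}")  # about 0.59
```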

  10. Utilisation of high carbon pulverised fuel ash

    OpenAIRE

    Mahmud, Maythem Naji

    2011-01-01

    Coal combustion by-products generated from coal-fired power plants cause enormous problems for disposal unless a way can be found to utilize these by-products through resource recovery programs. The implementation of air act regulations to reduce NOx emissions has resulted in millions of tonnes of pulverised fuel ash (PFA) accumulating with a high percentage of unburned carbon, making it unsaleable for the cement industry. Moreover, alternative fuels such as biomass and imported coals were suggested...

  11. Metric approach to quantum constraints

    International Nuclear Information System (INIS)

    Brody, Dorje C; Hughston, Lane P; Gustavsson, Anna C T

    2009-01-01

    A framework for deriving equations of motion for constrained quantum systems is introduced and a procedure for its implementation is outlined. In special cases, the proposed new method, which takes advantage of the fact that the space of pure states in quantum mechanics has both a symplectic structure and a metric structure, reduces to a quantum analogue of the Dirac theory of constraints in classical mechanics. Explicit examples involving spin-1/2 particles are worked out in detail: in the first example, our approach coincides with a quantum version of the Dirac formalism, while the second example illustrates how a situation that cannot be treated by Dirac's approach can nevertheless be dealt with in the present scheme.

  12. Metrics for Business Process Models

    Science.gov (United States)

    Mendling, Jan

    Up until now, there has been little research on why people introduce errors in real-world business process models. In a more general context, Simon [404] points to the limitations of cognitive capabilities and concludes that humans act rationally only to a certain extent. Concerning modeling errors, this argument would imply that human modelers lose track of the interrelations of large and complex models due to their limited cognitive capabilities and introduce errors that they would not insert in a small model. A recent study by Mendling et al. [275] explores in how far certain complexity metrics of business process models have the potential to serve as error determinants. The authors conclude that complexity indeed appears to have an impact on error probability. Before we can test such a hypothesis in a more general setting, we have to establish an understanding of how we can define determinants that drive error probability and how we can measure them.

  13. The extreme carbon dioxide outburst at the Menzengraben potash mine 7 July 1953

    DEFF Research Database (Denmark)

    Hedlund, Frank Huess

    2012-01-01

    Carbon dioxide is an asphyxiant and an irritant gas. An extreme outburst of carbon dioxide took place on 7 July 1953 in a potash mine in the former East Germany. During 25 min, a large amount of CO2 was blown out of the mine shaft with great force. There was no wind, and concentrated CO2 accumulated… It is concluded that 1100–3900 tonnes of CO2 were blown out of the mine shaft, possibly with intensities around 4 tonnes/s. It is also concluded that the large majority of the gas escaped as a near-vertical high-velocity jet with only little loss of momentum due to impingement. The release was modelled using… histories to date include sudden releases of CO2 of up to 50 tonnes only, far too small to provide a suitable empirical perspective on predicted hazard distances for CCS projects. The 1953 outburst contributes to filling this gap…

  14. Active Metric Learning for Supervised Classification

    OpenAIRE

    Kumaran, Krishnan; Papageorgiou, Dimitri; Chang, Yutong; Li, Minhan; Takáč, Martin

    2018-01-01

    Clustering and classification critically rely on distance metrics that provide meaningful comparisons between data points. We present mixed-integer optimization approaches to find optimal distance metrics that generalize the Mahalanobis metric extensively studied in the literature. Additionally, we generalize and improve upon leading methods by removing reliance on pre-designated "target neighbors," "triplets," and "similarity pairs." Another salient feature of our method is its ability to en...
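
    For context, the Mahalanobis family that the abstract says is being generalized is parameterized by a positive semidefinite matrix M; a minimal sketch of that distance is below (standard definition, not code from the paper; the matrix M here is random and purely illustrative).

```python
import numpy as np

def mahalanobis_distance(x, y, M):
    """Generalized (Mahalanobis-type) distance d_M(x, y) = sqrt((x - y)^T M (x - y)).
    M must be positive semidefinite; M = identity recovers the Euclidean distance,
    and M = inverse covariance gives the classical Mahalanobis metric."""
    diff = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.sqrt(diff @ M @ diff))

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))
M = A @ A.T                                   # a random PSD matrix, illustration only
x, y = [1.0, 0.0, 2.0], [0.0, 1.0, 1.0]
print(mahalanobis_distance(x, y, M))
print(mahalanobis_distance(x, y, np.eye(3)))  # Euclidean case: sqrt(3) ≈ 1.732
```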

  15. On Nakhleh's metric for reduced phylogenetic networks

    OpenAIRE

    Cardona, Gabriel; Llabrés, Mercè; Rosselló, Francesc; Valiente Feruglio, Gabriel Alejandro

    2009-01-01

    We prove that Nakhleh’s metric for reduced phylogenetic networks is also a metric on the classes of tree-child phylogenetic networks, semibinary tree-sibling time consistent phylogenetic networks, and multilabeled phylogenetic trees. We also prove that it separates distinguishable phylogenetic networks. In this way, it becomes the strongest dissimilarity measure for phylogenetic networks available so far. Furthermore, we propose a generalization of that metric that separates arbitrary phyl...

  16. Generalized tolerance sensitivity and DEA metric sensitivity

    OpenAIRE

    Neralić, Luka; E. Wendell, Richard

    2015-01-01

    This paper considers the relationship between Tolerance sensitivity analysis in optimization and metric sensitivity analysis in Data Envelopment Analysis (DEA). Herein, we extend the results on the generalized Tolerance framework proposed by Wendell and Chen and show how this framework includes DEA metric sensitivity as a special case. Further, we note how recent results in Tolerance sensitivity suggest some possible extensions of the results in DEA metric sensitivity.

  17. The definitive guide to IT service metrics

    CERN Document Server

    McWhirter, Kurt

    2012-01-01

    Used just as they are, the metrics in this book will bring many benefits to both the IT department and the business as a whole. Details of the attributes of each metric are given, enabling you to make the right choices for your business. You may prefer and are encouraged to design and create your own metrics to bring even more value to your business - this book will show you how to do this, too.

  18. Generalized tolerance sensitivity and DEA metric sensitivity

    Directory of Open Access Journals (Sweden)

    Luka Neralić

    2015-03-01

    Full Text Available This paper considers the relationship between Tolerance sensitivity analysis in optimization and metric sensitivity analysis in Data Envelopment Analysis (DEA). Herein, we extend the results on the generalized Tolerance framework proposed by Wendell and Chen and show how this framework includes DEA metric sensitivity as a special case. Further, we note how recent results in Tolerance sensitivity suggest some possible extensions of the results in DEA metric sensitivity.

  19. Common Metrics for Human-Robot Interaction

    Science.gov (United States)

    Steinfeld, Aaron; Lewis, Michael; Fong, Terrence; Scholtz, Jean; Schultz, Alan; Kaber, David; Goodrich, Michael

    2006-01-01

    This paper describes an effort to identify common metrics for task-oriented human-robot interaction (HRI). We begin by discussing the need for a toolkit of HRI metrics. We then describe the framework of our work and identify important biasing factors that must be taken into consideration. Finally, we present suggested common metrics for standardization and a case study. Preparation of a larger, more detailed toolkit is in progress.

  20. Chaotic inflation with metric and matter perturbations

    International Nuclear Information System (INIS)

    Feldman, H.A.; Brandenberger, R.H.

    1989-01-01

    A perturbative scheme to analyze the evolution of both metric and scalar field perturbations in an expanding universe is developed. The scheme is applied to study chaotic inflation with initial metric and scalar field perturbations present. It is shown that initial gravitational perturbations with wavelength smaller than the Hubble radius rapidly decay. The metric simultaneously picks up small perturbations determined by the matter inhomogeneities. Both are frozen in once the wavelength exceeds the Hubble radius. (orig.)

  1. Gravitational lensing in metric theories of gravity

    International Nuclear Information System (INIS)

    Sereno, Mauro

    2003-01-01

    Gravitational lensing in metric theories of gravity is discussed. I introduce a generalized approximate metric element, inclusive of both post-post-Newtonian contributions and a gravitomagnetic field. Following Fermat's principle and standard hypotheses, I derive the time delay function and deflection angle caused by an isolated mass distribution. Several astrophysical systems are considered. In most of the cases, the gravitomagnetic correction offers the best perspectives for an observational detection. Actual measurements distinguish only marginally different metric theories from each other

  2. ROE Carbon Storage - Forest Biomass

    Data.gov (United States)

    U.S. Environmental Protection Agency — This polygon dataset depicts the density of forest biomass in counties across the United States, in terms of metric tons of carbon per square mile of land area....

  3. An assessment of uncertainty in forest carbon budget projections

    Science.gov (United States)

    Linda S. Heath; James E. Smith

    2000-01-01

    Estimates of uncertainty are presented for projections of forest carbon inventory and average annual net carbon flux on private timberland in the US using the model FORCARB. Uncertainty in carbon inventory was approximately ±9% (2000 million metric tons) of the estimated median in the year 2000, rising to 11% (2800 million metric tons) in projection year 2040...

  4. About the possibility of a generalized metric

    International Nuclear Information System (INIS)

    Lukacs, B.; Ladik, J.

    1991-10-01

    The metric (the structure of the space-time) may be dependent on the properties of the object measuring it. The case of size dependence of the metric was examined. For this dependence the simplest possible form of the metric tensor has been constructed which fulfils the following requirements: there be two extremal characteristic scales; the metric be unique and the usual one between them; the change be sudden in the neighbourhood of these scales; the size of the human body appear as a parameter (postulated on the basis of some philosophical arguments). Estimates have been made for the two extremal length scales according to existing observations. (author) 19 refs

  5. Forests and carbon storage

    Science.gov (United States)

    Michael G. Ryan

    2008-01-01

    Forests store much carbon and their growth can be a carbon sink if disturbance or harvesting has killed or removed trees or if trees that can now regrow are planted where they did not historically occur. Forests and long-lived wood products currently offset 310 million metric tons of U.S. fossil fuel emissions of carbon--20 percent of the total (Pacala et al. 2007)....

  6. Carbon storage and sequestration by trees in urban and community areas of the United States.

    Science.gov (United States)

    Nowak, David J; Greenfield, Eric J; Hoehn, Robert E; Lapoint, Elizabeth

    2013-07-01

    Carbon storage and sequestration by urban trees in the United States was quantified to assess the magnitude and role of urban forests in relation to climate change. Urban tree field data from 28 cities and 6 states were used to determine the average carbon density per unit of tree cover. These data were applied to statewide urban tree cover measurements to determine total urban forest carbon storage and annual sequestration by state and nationally. Urban whole tree carbon storage densities average 7.69 kg C m⁻² of tree cover and sequestration densities average 0.28 kg C m⁻² of tree cover per year. Total tree carbon storage in U.S. urban areas (c. 2005) is estimated at 643 million tonnes ($50.5 billion value; 95% CI = 597 million and 690 million tonnes) and annual sequestration is estimated at 25.6 million tonnes ($2.0 billion value; 95% CI = 23.7 million to 27.4 million tonnes). Published by Elsevier Ltd.
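
    The national totals above follow from the per-unit-cover densities multiplied by urban tree-cover area; the sketch below back-calculates the tree cover implied by each reported total (the small mismatch between the two implied areas presumably reflects rounding and the state-by-state aggregation).

```python
# Carbon density per unit of urban tree cover, as reported in the abstract.
storage_density_kg_per_m2 = 7.69          # kg C per m2 of tree cover
sequestration_kg_per_m2_yr = 0.28         # kg C per m2 of tree cover per year

total_storage_tonnes = 643e6              # national urban tree carbon storage
total_sequestration_tonnes_yr = 25.6e6    # national annual sequestration

cover_from_storage_km2 = total_storage_tonnes * 1000 / storage_density_kg_per_m2 / 1e6
cover_from_sequestration_km2 = (total_sequestration_tonnes_yr * 1000
                                / sequestration_kg_per_m2_yr / 1e6)
print(f"tree cover implied by storage:       {cover_from_storage_km2:,.0f} km2")
print(f"tree cover implied by sequestration: {cover_from_sequestration_km2:,.0f} km2")
# Both come out in the vicinity of 84,000 to 91,000 km2 of urban tree cover.
```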

  7. Carbon storage in Ontario's forests, 2000-2100

    International Nuclear Information System (INIS)

    Colombo, S.J.; Chen, J.; Ter-Mikaelian, M.T.

    2007-01-01

    One of the greatest challenges facing modern society is rapid climate change resulting from greenhouse gas emissions to the atmosphere, primarily in the form of carbon dioxide from the burning of fossil fuels. The effects of climate change on natural environments will inevitably affect people as well if left unchecked. In addition to many other societal benefits, forests store large amounts of carbon. As a result, it is necessary to understand how forest management and natural processes affect forest carbon storage. Such information can be utilized to manage forests so that they function as carbon sinks and help reduce greenhouse gas concentrations in the atmosphere. This report employed data about Ontario's forest structure and information from the forest management planning process and past harvests to describe carbon in forests and wood products today and through to the end of this century. The paper described the methods used for the study which included modification of the United States national forest carbon model, FORCARB2, to predict Ontario's forest carbon budgets in order to make carbon projections congruent with forest management plans. The modified forest carbon model, which is called FORCARB-ON, predicts carbon in live trees, understory vegetation, forest floor, standing and down dead wood, and soil. Ontario's managed forests are projected to increase carbon storage by 433 million tonnes from 2000 to 2100. The largest forest sink will be in wood products, accounting for 364 million tonnes of carbon storage over the century. 22 refs., 1 tab., 3 figs

  8. Life cycle study. Carbon dioxide emissions lower in electric heating than in oil heating

    Energy Technology Data Exchange (ETDEWEB)

    Heikkinen, A.; Jaervinen, P.; Nikula, A.

    1996-11-01

    A primary objective of energy conservation is to cut carbon dioxide emissions. A comparative study of the various heating forms, based on the life cycle approach, showed that the carbon dioxide emissions resulting from heating are appreciably lower now that electric heating has become more common. The level of carbon dioxide emissions in Finland would have been millions of tonnes higher had oil heating been chosen instead of electric heating. (orig.)

  9. Enhancing Authentication Models Characteristic Metrics via ...

    African Journals Online (AJOL)

    In this work, we derive the universal characteristic metrics set for authentication models based on security, usability and design issues. We then compute the probability of the occurrence of each characteristic metrics in some single factor and multifactor authentication models in order to determine the effectiveness of these ...

  10. Gravitational Metric Tensor Exterior to Rotating Homogeneous ...

    African Journals Online (AJOL)

    The covariant and contravariant metric tensors exterior to a homogeneous spherical body rotating uniformly about a common φ axis with constant angular velocity ω are constructed. The constructed metric tensors in this gravitational field have seven non-zero distinct components. The Lagrangian for this gravitational field is ...

  11. Invariant metric for nonlinear symplectic maps

    Indian Academy of Sciences (India)

    In this paper, we construct an invariant metric in the space of homogeneous polynomials of a given degree (≥ 3). The homogeneous polynomials specify a nonlinear symplectic map which in turn represents a Hamiltonian system. By minimizing the norm constructed out of this metric as a function of system parameters, we ...

  12. Finite Metric Spaces of Strictly negative Type

    DEFF Research Database (Denmark)

    Hjorth, Poul G.

    If a finite metric space is of strictly negative type then its transfinite diameter is uniquely realized by an infinite extent (“load vector”). Finite metric spaces that have this property include all trees, and all finite subspaces of Euclidean and Hyperbolic spaces. We prove that if the distance...

  13. Fixed point theory in metric type spaces

    CERN Document Server

    Agarwal, Ravi P; O’Regan, Donal; Roldán-López-de-Hierro, Antonio Francisco

    2015-01-01

    Written by a team of leading experts in the field, this volume presents a self-contained account of the theory, techniques and results in metric type spaces (in particular in G-metric spaces); that is, the text approaches this important area of fixed point analysis beginning from the basic ideas of metric space topology. The text is structured so that it leads the reader from preliminaries and historical notes on metric spaces (in particular G-metric spaces) and on mappings, to Banach type contraction theorems in metric type spaces, fixed point theory in partially ordered G-metric spaces, fixed point theory for expansive mappings in metric type spaces, generalizations, present results and techniques in a very general abstract setting and framework. Fixed point theory is one of the major research areas in nonlinear analysis. This is partly due to the fact that in many real world problems fixed point theory is the basic mathematical tool used to establish the existence of solutions to problems which arise natur...

  14. Metric solution of a spinning mass

    International Nuclear Information System (INIS)

    Sato, H.

    1982-01-01

    Studies on a particular class of asymptotically flat and stationary metric solutions called the Kerr-Tomimatsu-Sato class are reviewed with regard to their derivation and properties. For further study, an almost complete list of papers on the Tomimatsu-Sato metrics is given. (Auth.)

  15. On Information Metrics for Spatial Coding.

    Science.gov (United States)

    Souza, Bryan C; Pavão, Rodrigo; Belchior, Hindiael; Tort, Adriano B L

    2018-04-01

    The hippocampal formation is involved in navigation, and its neuronal activity exhibits a variety of spatial correlates (e.g., place cells, grid cells). The quantification of the information encoded by spikes has been standard procedure to identify which cells have spatial correlates. For place cells, most of the established metrics derive from Shannon's mutual information (Shannon, 1948), and convey information rate in bits/s or bits/spike (Skaggs et al., 1993, 1996). Despite their widespread use, the performance of these metrics in relation to the original mutual information metric has never been investigated. In this work, using simulated and real data, we find that the current information metrics correlate less with the accuracy of spatial decoding than the original mutual information metric. We also find that the top informative cells may differ among metrics, and show a surrogate-based normalization that yields comparable spatial information estimates. Since different information metrics may identify different neuronal populations, we discuss current and alternative definitions of spatially informative cells, which affect the metric choice. Copyright © 2018 IBRO. Published by Elsevier Ltd. All rights reserved.
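
    The most widely used of the metrics derived from Shannon's mutual information is the Skaggs et al. (1993) spatial information score in bits per spike, computed from occupancy probabilities and position-conditioned firing rates; a minimal sketch of the standard formula follows (not code from the paper).

```python
import numpy as np

def spatial_information_bits_per_spike(occupancy_prob, rate_map):
    """Skaggs et al. (1993) spatial information:
        I = sum_i p_i * (lambda_i / lambda) * log2(lambda_i / lambda),
    where p_i is the occupancy probability of spatial bin i, lambda_i the mean
    firing rate in bin i, and lambda the overall mean firing rate."""
    p = np.asarray(occupancy_prob, dtype=float)
    lam_i = np.asarray(rate_map, dtype=float)
    lam = float(np.sum(p * lam_i))
    if lam <= 0:
        return 0.0
    ratio = lam_i / lam
    mask = ratio > 0
    return float(np.sum(p[mask] * ratio[mask] * np.log2(ratio[mask])))

# Toy example: four spatial bins, uniform occupancy, one strongly preferred bin.
print(spatial_information_bits_per_spike([0.25, 0.25, 0.25, 0.25],
                                         [8.0, 0.5, 0.5, 1.0]))  # ≈ 0.98 bits/spike
```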

  16. Validation of Metrics for Collaborative Systems

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2008-01-01

    Full Text Available This paper describes the new concepts of collaborative systems metrics validation. The paper defines the quality characteristics of collaborative systems. A metric is proposed to estimate the quality level of collaborative systems, and measurements of collaborative systems quality are performed using specially designed software.

  17. Validation of Metrics for Collaborative Systems

    OpenAIRE

    Ion IVAN; Cristian CIUREA

    2008-01-01

    This paper describes the new concepts of collaborative systems metrics validation. The paper defines the quality characteristics of collaborative systems. A metric is proposed to estimate the quality level of collaborative systems, and measurements of collaborative systems quality are performed using specially designed software.

  18. Software Power Metric Model: An Implementation | Akwukwuma ...

    African Journals Online (AJOL)

    ... and the execution time (TIME) in each case was recorded. We then obtained the application's function point count. Our results show that the proposed metric is computable, consistent in its use of units, and is programming language independent. Keywords: Software attributes, Software power, measurement, Software metric, ...

  19. Metrics for border management systems.

    Energy Technology Data Exchange (ETDEWEB)

    Duggan, Ruth Ann

    2009-07-01

    There are as many unique and disparate manifestations of border systems as there are borders to protect. Border Security is a highly complex system analysis problem with global, regional, national, sector, and border element dimensions for land, water, and air domains. The complexity increases with the multiple, and sometimes conflicting, missions for regulating the flow of people and goods across borders, while securing them for national security. These systems include frontier border surveillance, immigration management and customs functions that must operate in a variety of weather, terrain, operational conditions, cultural constraints, and geopolitical contexts. As part of a Laboratory Directed Research and Development Project 08-684 (Year 1), the team developed a reference framework to decompose this complex system into international/regional, national, and border elements levels covering customs, immigration, and border policing functions. This generalized architecture is relevant to both domestic and international borders. As part of year two of this project (09-1204), the team determined relevant relative measures to better understand border management performance. This paper describes those relative metrics and how they can be used to improve border management systems.

  20. The metrics of science and technology

    CERN Document Server

    Geisler, Eliezer

    2000-01-01

    Dr. Geisler's far-reaching, unique book provides an encyclopedic compilation of the key metrics to measure and evaluate the impact of science and technology on academia, industry, and government. Focusing on such items as economic measures, patents, peer review, and other criteria, and supported by an extensive review of the literature, Dr. Geisler gives a thorough analysis of the strengths and weaknesses inherent in metric design, and in the use of the specific metrics he cites. His book has already received prepublication attention, and will prove especially valuable for academics in technology management, engineering, and science policy; industrial R&D executives and policymakers; government science and technology policymakers; and scientists and managers in government research and technology institutions. Geisler maintains that the application of metrics to evaluate science and technology at all levels illustrates the variety of tools we currently possess. Each metric has its own unique strengths and...

  1. Smart Grid Status and Metrics Report Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Balducci, Patrick J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Antonopoulos, Chrissi A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clements, Samuel L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gorrissen, Willy J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kirkham, Harold [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ruiz, Kathleen A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, David L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Weimar, Mark R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gardner, Chris [APQC, Houston, TX (United States); Varney, Jeff [APQC, Houston, TX (United States)

    2014-07-01

    A smart grid uses digital power control and communication technology to improve the reliability, security, flexibility, and efficiency of the electric system, from large generation through the delivery systems to electricity consumers and a growing number of distributed generation and storage resources. To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. The Smart Grid Status and Metrics Report defines and examines 21 metrics that collectively provide insight into the grid’s capacity to embody these characteristics. This appendix presents papers covering each of the 21 metrics identified in Section 2.1 of the Smart Grid Status and Metrics Report. These metric papers were prepared in advance of the main body of the report and collectively form its informational backbone.

  2. Metrics for Polyphonic Sound Event Detection

    Directory of Open Access Journals (Sweden)

    Annamaria Mesaros

    2016-05-01

    Full Text Available This paper presents and discusses various metrics proposed for evaluation of polyphonic sound event detection systems used in realistic situations where there are typically multiple sound sources active simultaneously. The system output in this case contains overlapping events, marked as multiple sounds detected as being active at the same time. The polyphonic system output requires a suitable procedure for evaluation against a reference. Metrics from neighboring fields such as speech recognition and speaker diarization can be used, but they need to be partially redefined to deal with the overlapping events. We present a review of the most common metrics in the field and the way they are adapted and interpreted in the polyphonic case. We discuss segment-based and event-based definitions of each metric and explain the consequences of instance-based and class-based averaging using a case study. In parallel, we provide a toolbox containing implementations of presented metrics.

  3. Nuclear power and carbon dioxide free automobiles

    International Nuclear Information System (INIS)

    Pendergast, D.R.

    1999-01-01

    Nuclear energy has been developed as a major source of electric power in Canada. Electricity from nuclear energy already avoids the emission of about 100 million tonnes of carbon dioxide to the atmosphere in Canada, a significant fraction of the 619 million tonnes of Canadian greenhouse gas emissions in 1995. However, the current scope of application of electricity to end-use energy needs in Canada limits the contribution nuclear energy can make to carbon dioxide emission reduction. Nuclear energy can also contribute to emissions reduction through expansion of the use of electricity to less traditional applications. Transportation, in particular, contributed 165 million tonnes of carbon dioxide to the Canadian atmosphere in 1995. Canada's fleet of personal vehicles consisted of 16.9 million cars and light trucks. These vehicles were driven on average 21,000 km/year and generated 91 million tonnes of greenhouse gases expressed as a CO2 equivalent. Technology to improve the efficiency of cars is under development and is expected to improve fuel consumption from the 1995 level of about 10 litres/100 km of gasoline to under 3 litres/100 km expressed as a gasoline energy equivalent. The development of this technology, which may ultimately lead to the practical use of hydrogen as a portable source of energy for transportation, is reviewed. Fuel supply life-cycle greenhouse gas releases for several personal vehicle energy supply systems are then estimated. Very substantial reductions of greenhouse gas emissions are possible through efficiency improvements and a change to less carbon-intensive fuels such as natural gas. CO2 emissions from on-board natural-gas-fuelled versions of hybrid electric cars would decrease to approximately 25 million tonnes/year from the current 91 million tonnes/year. The ultimate reduction identified is through the use of hydrogen fuel produced via electricity from CANDU power

  4. Evaluation Metrics for Simulations of Tropical South America

    Science.gov (United States)

    Gallup, S.; Baker, I. T.; Denning, A. S.; Cheeseman, M.; Haynes, K. D.; Phillips, M.

    2017-12-01

    The evergreen broadleaf forest of the Amazon Basin is the largest rainforest on earth, and has teleconnections to global climate and carbon cycle characteristics. This region defies simple characterization, spanning large gradients in total rainfall and seasonal variability. Broadly, the region can be thought of as trending from light-limited in its wettest areas to water-limited near the ecotone, with individual landscapes possibly exhibiting the characteristics of either (or both) limitations during an annual cycle. A basin-scale classification of mean behavior has been elusive, and ecosystem response to seasonal cycles and anomalous drought events has resulted in some disagreement in the literature, to say the least. However, new observational platforms and instruments make characterization of the heterogeneity and variability more feasible. To evaluate simulations of ecophysiological function, we develop metrics that correlate various observational products with meteorological variables such as precipitation and radiation. Observations include eddy covariance fluxes, Solar Induced Fluorescence (SIF, from GOME2 and OCO2), biomass and vegetation indices. We find that the modest correlation between SIF and precipitation decreases with increasing annual precipitation, although the relationship is not consistent between products. Biomass increases with increasing precipitation. Although vegetation indices are generally correlated with biomass and precipitation, they can saturate or experience retrieval issues during cloudy periods. Using these observational products and relationships, we develop a set of model evaluation metrics. These metrics are designed to call attention to models that get "the right answer only if it's for the right reason," and provide an opportunity for more critical evaluation of model physics. These metrics represent a testbed that can be applied to multiple models as a means to evaluate their performance in tropical South America.

  5. Robustness Metrics: Consolidating the multiple approaches to quantify Robustness

    DEFF Research Database (Denmark)

    Göhler, Simon Moritz; Eifler, Tobias; Howard, Thomas J.

    2016-01-01

    robustness metrics; 3) Functional expectancy and dispersion robustness metrics; and 4) Probability of conformance robustness metrics. The goal was to give a comprehensive overview of robustness metrics and guidance to scholars and practitioners to understand the different types of robustness metrics...

  6. Partial rectangular metric spaces and fixed point theorems.

    Science.gov (United States)

    Shukla, Satish

    2014-01-01

    The purpose of this paper is to introduce the concept of partial rectangular metric spaces as a generalization of rectangular metric and partial metric spaces. Some properties of partial rectangular metric spaces and some fixed point results for quasitype contraction in partial rectangular metric spaces are proved. Some examples are given to illustrate the observed results.

  7. Industrial Scale Synthesis of Carbon Nanotubes Via Fluidized Bed Chemical Vapor Deposition: A Senior Design Project

    Science.gov (United States)

    Smith, York R.; Fuchs, Alan; Meyyappan, M.

    2010-01-01

    Senior year chemical engineering students designed a process to produce 10 000 tonnes per annum of single wall carbon nanotubes (SWNT) and also conducted bench-top experiments to synthesize SWNTs via fluidized bed chemical vapor deposition techniques. This was an excellent pedagogical experience because it related to the type of real world design…

  8. Measuring Information Security: Guidelines to Build Metrics

    Science.gov (United States)

    von Faber, Eberhard

    Measuring information security is a genuine interest of security managers. With metrics they can develop their security organization's visibility and standing within the enterprise or public authority as a whole. Organizations using information technology need to use security metrics. Despite the clear demands and advantages, security metrics are often poorly developed or ineffective parameters are collected and analysed. This paper describes best practices for the development of security metrics. First, attention is drawn to motivation, showing both requirements and benefits. The main body of this paper lists things which need to be observed (characteristics of metrics), things which can be measured (how measurements can be conducted) and steps for the development and implementation of metrics (procedures and planning). Analysis and communication are also key when using security metrics. Examples are also given in order to develop a better understanding. The author wants to resume, continue and develop the discussion about a topic which is, or increasingly will be, a critical factor of success for security managers in larger organizations.

  9. Characterising risk - aggregated metrics: radiation and noise

    International Nuclear Information System (INIS)

    Passchier, W.

    1998-01-01

    The characterisation of risk is an important phase in the risk assessment - risk management process. From the multitude of risk attributes a few have to be selected to obtain a risk characteristic or profile that is useful for risk management decisions and implementation of protective measures. One way to reduce the number of attributes is aggregation. In the field of radiation protection such an aggregated metric is firmly established: effective dose. For protection against environmental noise the Health Council of the Netherlands recently proposed a set of aggregated metrics for noise annoyance and sleep disturbance. The presentation will discuss similarities and differences between these two metrics and practical limitations. The effective dose has proven its usefulness in designing radiation protection measures, which are related to the level of risk associated with the radiation practice in question, given that implicit judgements on radiation induced health effects are accepted. However, as the metric does not take into account the nature of radiation practice, it is less useful in policy discussions on the benefits and harm of radiation practices. With respect to the noise exposure metric, only one effect is targeted (annoyance), and the differences between sources are explicitly taken into account. This should make the metric useful in policy discussions with respect to physical planning and siting problems. The metric proposed has only significance on a population level, and can not be used as a predictor for individual risk. (author)

  10. Energy functionals for Calabi-Yau metrics

    International Nuclear Information System (INIS)

    Headrick, M; Nassar, A

    2013-01-01

    We identify a set of ''energy'' functionals on the space of metrics in a given Kähler class on a Calabi-Yau manifold, which are bounded below and minimized uniquely on the Ricci-flat metric in that class. Using these functionals, we recast the problem of numerically solving the Einstein equation as an optimization problem. We apply this strategy, using the ''algebraic'' metrics (metrics for which the Kähler potential is given in terms of a polynomial in the projective coordinates), to the Fermat quartic and to a one-parameter family of quintics that includes the Fermat and conifold quintics. We show that this method yields approximations to the Ricci-flat metric that are exponentially accurate in the degree of the polynomial (except at the conifold point, where the convergence is polynomial), and therefore orders of magnitude more accurate than the balanced metrics, previously studied as approximations to the Ricci-flat metric. The method is relatively fast and easy to implement. On the theoretical side, we also show that the functionals can be used to give a heuristic proof of Yau's theorem

  11. Metrics Are Needed for Collaborative Software Development

    Directory of Open Access Journals (Sweden)

    Mojgan Mohtashami

    2011-10-01

    Full Text Available There is a need for metrics for inter-organizational collaborative software development projects, encompassing management and technical concerns. In particular, metrics are needed that are aimed at the collaborative aspect itself, such as readiness for collaboration, the quality and/or the costs and benefits of collaboration in a specific ongoing project. We suggest questions and directions for such metrics, spanning the full lifespan of a collaborative project, from considering the suitability of collaboration through evaluating ongoing projects to final evaluation of the collaboration.

  12. Indefinite metric fields and the renormalization group

    International Nuclear Information System (INIS)

    Sherry, T.N.

    1976-11-01

    The renormalization group equations are derived for the Green functions of an indefinite metric field theory. In these equations one retains the mass dependence of the coefficient functions, since in the indefinite metric theories the masses cannot be neglected. The behavior of the effective coupling constant in the asymptotic and infrared limits is analyzed. The analysis is illustrated by means of a simple model incorporating indefinite metric fields. The model scales at first order, and at this order also the effective coupling constant has both ultra-violet and infra-red fixed points, the former being the bare coupling constant

  13. Metric learning for DNA microarray data analysis

    International Nuclear Information System (INIS)

    Takeuchi, Ichiro; Nakagawa, Masao; Seto, Masao

    2009-01-01

    In many microarray studies, gene set selection is an important preliminary step for subsequent main task such as tumor classification, cancer subtype identification, etc. In this paper, we investigate the possibility of using metric learning as an alternative to gene set selection. We develop a simple metric learning algorithm aiming to use it for microarray data analysis. Exploiting a property of the algorithm, we introduce a novel approach for extending the metric learning to be adaptive. We apply the algorithm to previously studied microarray data on malignant lymphoma subtype identification.

  14. Software metrics a rigorous and practical approach

    CERN Document Server

    Fenton, Norman

    2014-01-01

    A Framework for Managing, Measuring, and Predicting Attributes of Software Development Products and Processes. Reflecting the immense progress in the development and use of software metrics in the past decades, Software Metrics: A Rigorous and Practical Approach, Third Edition provides an up-to-date, accessible, and comprehensive introduction to software metrics. Like its popular predecessors, this third edition discusses important issues, explains essential concepts, and offers new approaches for tackling long-standing problems. New to the Third Edition: This edition contains new material relevant

  15. Metrics, Media and Advertisers: Discussing Relationship

    Directory of Open Access Journals (Sweden)

    Marco Aurelio de Souza Rodrigues

    2014-11-01

    Full Text Available This study investigates how Brazilian advertisers are adapting to new media and its attention metrics. In-depth interviews were conducted with advertisers in 2009 and 2011. In 2009, new media and its metrics were celebrated as innovations that would increase advertising campaigns overall efficiency. In 2011, this perception has changed: New media’s profusion of metrics, once seen as an advantage, started to compromise its ease of use and adoption. Among its findings, this study argues that there is an opportunity for media groups willing to shift from a product-focused strategy towards a customer-centric one, through the creation of new, simple and integrative metrics

  16. Networks and centroid metrics for understanding football

    African Journals Online (AJOL)

    Gonçalo Dias

    games. However, it seems that the centroid metric, supported only by the position of players in the field ...... the strategy adopted by the coach (Gama et al., 2014). ... centroid distance as measures of team's tactical performance in youth football.

  17. Clean Cities Annual Metrics Report 2009 (Revised)

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, C.

    2011-08-01

    Document provides Clean Cities coalition metrics about the use of alternative fuels; the deployment of alternative fuel vehicles, hybrid electric vehicles (HEVs), and idle reduction initiatives; fuel economy activities; and programs to reduce vehicle miles driven.

  18. Metric Guidelines Inservice and/or Preservice

    Science.gov (United States)

    Granito, Dolores

    1978-01-01

    Guidelines are given for designing teacher training for going metric. The guidelines were developed from existing guidelines, journal articles, a survey of colleges, and the detailed reactions of a panel. (MN)

  19. Science and Technology Metrics and Other Thoughts

    National Research Council Canada - National Science Library

    Harman, Wayne; Staton, Robin

    2006-01-01

    This report explores the subject of science and technology metrics and other topics to begin to provide Navy managers, as well as scientists and engineers, additional tools and concepts with which to...

  20. Using Activity Metrics for DEVS Simulation Profiling

    Directory of Open Access Journals (Sweden)

    Muzy A.

    2014-01-01

    Full Text Available Activity metrics can be used to profile DEVS models before and during the simulation. It is critical to get good activity metrics of models before and during their simulation. Having a means to compute the a-priori activity of components (analytic activity) may be worthwhile when simulating a model (or parts of it) for the first time. Afterwards, during the simulation, the analytic activity can be corrected using the dynamic one. In this paper, we introduce the McCabe cyclomatic complexity metric (MCA) to compute analytic activity. Both static and simulation activity metrics have been implemented through a plug-in of the DEVSimPy (DEVS Simulator in Python language) environment and applied to DEVS models.
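
    Purely as an illustration of the kind of static count that a cyclomatic-complexity-based activity score relies on, here is a rough McCabe estimate for Python source using the standard-library ast module; it is a simplified stand-in, not the DEVSimPy plug-in described above, and the set of counted node types is an assumption.

```python
import ast

def cyclomatic_complexity(source: str) -> int:
    """Rough McCabe complexity: 1 + number of decision points in the source."""
    tree = ast.parse(source)
    decisions = 0
    for node in ast.walk(tree):
        if isinstance(node, (ast.If, ast.IfExp, ast.For, ast.While, ast.ExceptHandler)):
            decisions += 1
        elif isinstance(node, ast.BoolOp):    # each extra and/or adds a path
            decisions += len(node.values) - 1
    return decisions + 1

print(cyclomatic_complexity("def f(x):\n    return 1 if x > 0 else -1"))  # -> 2
```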

  1. Evaluating and Estimating the WCET Criticality Metric

    DEFF Research Database (Denmark)

    Jordan, Alexander

    2014-01-01

    a programmer (or compiler) from targeting optimizations the right way. A possible resort is to use a metric that targets WCET and which can be efficiently computed for all code parts of a program. Similar to dynamic profiling techniques, which execute code with input that is typically expected...... for the application, based on WCET analysis we can indicate how critical a code fragment is, in relation to the worst-case bound. Computing such a metric on top of static analysis, incurs a certain overhead though, which increases with the complexity of the underlying WCET analysis. We present our approach...... to estimate the Criticality metric, by relaxing the precision of WCET analysis. Through this, we can reduce analysis time by orders of magnitude, while only introducing minor error. To evaluate our estimation approach and share our garnered experience using the metric, we evaluate real-time programs, which...

  2. 16 CFR 1511.8 - Metric references.

    Science.gov (United States)

    2010-01-01

    16 CFR 1511.8, Metric references (Title 16, Commercial Practices; Consumer Product Safety Commission, Federal Hazardous Substances Act Regulations): ... parentheses for convenience and information only. ...

  3. Flight Crew State Monitoring Metrics, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — eSky will develop specific crew state metrics based on the timeliness, tempo and accuracy of pilot inputs required by the H-mode Flight Control System (HFCS)....

  4. Supplier selection using different metric functions

    Directory of Open Access Journals (Sweden)

    Omosigho S.E.

    2015-01-01

    Full Text Available Supplier selection is an important component of supply chain management in today’s global competitive environment. Hence, the evaluation and selection of suppliers have received considerable attention in the literature. Many attributes of suppliers, other than cost, are considered in the evaluation and selection process. Therefore, the process of evaluation and selection of suppliers is a multi-criteria decision making process. The methodology adopted to solve the supplier selection problem is intuitionistic fuzzy TOPSIS (Technique for Order Preference by Similarity to the Ideal Solution). Generally, TOPSIS is based on the concept of minimum distance from the positive ideal solution and maximum distance from the negative ideal solution. We examine the deficiencies of using only one metric function in TOPSIS and propose the use of a spherical metric function in addition to the commonly used metric functions. For empirical supplier selection problems, more than one metric function should be used.
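
    To make the distance-to-ideal idea concrete, the following sketch implements a plain (crisp) TOPSIS ranking with the usual Euclidean metric; the decision matrix, weights, and the assumption that all criteria are benefit criteria are illustrative, and the paper's intuitionistic fuzzy and spherical-metric variants are not reproduced here.

```python
import numpy as np

def topsis(decision_matrix, weights):
    """Rank alternatives by relative closeness to the ideal solution.

    decision_matrix : alternatives x criteria (all criteria assumed to be benefits)
    weights         : importance of each criterion, summing to 1
    """
    norm = decision_matrix / np.linalg.norm(decision_matrix, axis=0)
    v = norm * weights                                  # weighted normalised matrix
    ideal, anti_ideal = v.max(axis=0), v.min(axis=0)
    d_plus = np.linalg.norm(v - ideal, axis=1)          # Euclidean distance to ideal
    d_minus = np.linalg.norm(v - anti_ideal, axis=1)    # ... and to the anti-ideal
    closeness = d_minus / (d_plus + d_minus)
    return np.argsort(-closeness), closeness

suppliers = np.array([[7.0, 9.0, 9.0],
                      [8.0, 7.0, 8.0],
                      [9.0, 6.0, 8.0]])
ranking, scores = topsis(suppliers, np.array([0.5, 0.3, 0.2]))
print(ranking, scores)
```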

  5. Classroom reconstruction of the Schwarzschild metric

    OpenAIRE

    Kassner, Klaus

    2015-01-01

    A promising way to introduce general relativity in the classroom is to study the physical implications of certain given metrics, such as the Schwarzschild one. This involves lower mathematical expenditure than an approach focusing on differential geometry in its full glory and permits one to emphasize physical aspects before attacking the field equations. Even so, in terms of motivation, the lack of justification for the metric employed may pose an obstacle. The paper discusses how to establish the we...

  6. Marketing communication metrics for social media

    OpenAIRE

    Töllinen, Aarne; Karjaluoto, Heikki

    2011-01-01

    The objective of this paper is to develop a conceptual framework for measuring the effectiveness of social media marketing communications. Specifically, we study whether the existing marketing communications performance metrics are still valid in the changing digitalised communications landscape, or whether it is time to rethink them, or even to devise entirely new metrics. Recent advances in information technology and marketing bring a need to re-examine measurement models. We combine two im...

  7. Some observations on a fuzzy metric space

    Energy Technology Data Exchange (ETDEWEB)

    Gregori, V.

    2017-07-01

    Let $(X,d)$ be a metric space. In this paper we provide some observations about the fuzzy metric space in the sense of Kramosil and Michalek $(Y,N,\wedge)$, where $Y$ is the set of non-negative real numbers $[0,\infty[$ and $N(x,y,t)=1$ if $d(x,y)\leq t$ and $N(x,y,t)=0$ if $d(x,y)\geq t$. (Author)
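
    A small sketch of the construction mentioned in the abstract, turning an ordinary metric d into a Kramosil-Michalek style fuzzy metric N; the boundary convention N = 1 when d(x, y) <= t is an assumption made to keep the example single-valued.

```python
def make_fuzzy_metric(d):
    """Kramosil-Michalek style fuzzy metric induced by an ordinary metric d.

    Boundary convention assumed here: N(x, y, t) = 1 when d(x, y) <= t, else 0.
    """
    def N(x, y, t):
        if t <= 0:
            return 0.0
        return 1.0 if d(x, y) <= t else 0.0
    return N

# Usage with the usual metric on the real line
N = make_fuzzy_metric(lambda x, y: abs(x - y))
print(N(1.0, 3.5, 2.0), N(1.0, 3.5, 3.0))  # 0.0 1.0
```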

  8. Area Regge calculus and discontinuous metrics

    International Nuclear Information System (INIS)

    Wainwright, Chris; Williams, Ruth M

    2004-01-01

    Taking the triangle areas as independent variables in the theory of Regge calculus can lead to ambiguities in the edge lengths, which can be interpreted as discontinuities in the metric. We construct solutions to area Regge calculus using a triangulated lattice and find that on a spacelike or timelike hypersurface no such discontinuity can arise. On a null hypersurface however, we can have such a situation and the resulting metric can be interpreted as a so-called refractive wave

  9. Carbon footprint assessment of Western Australian Groundwater Recycling Scheme

    Science.gov (United States)

    Simms, Andrew; Hamilton, Stacey; Biswas, Wahidul K.

    2017-04-01

    This research has determined the carbon footprint or the carbon dioxide equivalent (CO2 eq) of potable water production from a groundwater recycling scheme, consisting of the Beenyup wastewater treatment plant, the Beenyup groundwater replenishment trial plant and the Wanneroo groundwater treatment plant in Western Australia, using a life cycle assessment approach. It was found that the scheme produces 1300 tonnes of CO2 eq per gigalitre (GL) of water produced, which is 933 tonnes of CO2 eq higher than the desalination plant at Binningup in Western Australia powered by 100% renewable-energy-generated electricity. A Monte Carlo Simulation uncertainty analysis calculated a Coefficient of Variation value of 5.4%, thus confirming the accuracy of the simulation. Electricity input accounts for 83% of the carbon dioxide equivalent produced during the production of potable water. The chosen mitigation strategy was to consider the use of renewable energy to generate electricity for the carbon-intensive groundwater replenishment trial plant. Depending on the local situation, a maximum of 93% and a minimum of 21% greenhouse gas saving from electricity use can be attained at the groundwater replenishment trial plant by replacing grid electricity with renewable electricity. In addition, the consideration of vibrational separation (V-Sep), which helps reduce waste generation and chemical use, resulted in a saving of 4.03 tonnes of CO2 eq per GL of water produced by the plant.

  10. Carbon footprint assessment of Western Australian Groundwater Recycling Scheme.

    Science.gov (United States)

    Simms, Andrew; Hamilton, Stacey; Biswas, Wahidul K

    2017-04-01

    This research has determined the carbon footprint or the carbon dioxide equivalent (CO2 eq) of potable water production from a groundwater recycling scheme, consisting of the Beenyup wastewater treatment plant, the Beenyup groundwater replenishment trial plant and the Wanneroo groundwater treatment plant in Western Australia, using a life cycle assessment approach. It was found that the scheme produces 1300 tonnes of CO2 eq per gigalitre (GL) of water produced, which is 933 tonnes of CO2 eq higher than the desalination plant at Binningup in Western Australia powered by 100% renewable-energy-generated electricity. A Monte Carlo Simulation uncertainty analysis calculated a Coefficient of Variation value of 5.4%, thus confirming the accuracy of the simulation. Electricity input accounts for 83% of the carbon dioxide equivalent produced during the production of potable water. The chosen mitigation strategy was to consider the use of renewable energy to generate electricity for the carbon-intensive groundwater replenishment trial plant. Depending on the local situation, a maximum of 93% and a minimum of 21% greenhouse gas saving from electricity use can be attained at the groundwater replenishment trial plant by replacing grid electricity with renewable electricity. In addition, the consideration of vibrational separation (V-Sep), which helps reduce waste generation and chemical use, resulted in a saving of 4.03 tonnes of CO2 eq per GL of water produced by the plant.
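
    A back-of-the-envelope check of the figures quoted in the two records above; the numbers come straight from the abstract, and treating the 83% electricity share as directly separable is a simplifying assumption.

```python
# Figures quoted in the abstract (tonnes of CO2 eq per gigalitre of potable water)
scheme_footprint = 1300.0    # groundwater recycling scheme
excess_over_desal = 933.0    # excess over the renewable-powered desalination plant
electricity_share = 0.83     # fraction of the footprint due to electricity input

desalination_footprint = scheme_footprint - excess_over_desal     # ~367 t CO2 eq/GL
electricity_emissions = scheme_footprint * electricity_share      # ~1079 t CO2 eq/GL
print(desalination_footprint, round(electricity_emissions))
```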

  11. Relaxed metrics and indistinguishability operators: the relationship

    Energy Technology Data Exchange (ETDEWEB)

    Martin, J.

    2017-07-01

    In 1982, the notion of indistinguishability operator was introduced by E. Trillas in order to fuzzify the crisp notion of equivalence relation (\cite{Trillas}). In the study of such a class of operators, an outstanding property must be pointed out. Concretely, there exists a duality relationship between indistinguishability operators and metrics. The aforesaid relationship was deeply studied by several authors who introduced a few techniques to generate metrics from indistinguishability operators and vice-versa (see, for instance, \cite{BaetsMesiar,BaetsMesiar2}). In recent years a new generalization of the metric notion has been introduced in the literature with the purpose of developing mathematical tools for quantitative models in Computer Science and Artificial Intelligence (\cite{BKMatthews,Ma}). The aforementioned generalized metrics are known as relaxed metrics. The main target of this talk is to present a study of the duality relationship between indistinguishability operators and relaxed metrics in such a way that the aforementioned classical techniques to generate both concepts, one from the other, can be extended to the new framework. (Author)

  12. Baby universe metric equivalent to an interior black-hole metric

    International Nuclear Information System (INIS)

    Gonzalez-Diaz, P.F.

    1991-01-01

    It is shown that the maximally extended metric corresponding to a large wormhole is the unique possible wormhole metric whose baby universe sector is conformally equivalent to the maximal inextendible Kruskal metric corresponding to the interior region of a Schwarzschild black hole whose gravitational radius is half the wormhole neck radius. The physical implications of this result in the black hole evaporation process are discussed. (orig.)

  13. Red mud as a carbon sink: variability, affecting factors and environmental significance.

    Science.gov (United States)

    Si, Chunhua; Ma, Yingqun; Lin, Chuxia

    2013-01-15

    The capacity of red mud to sequester CO(2) varied markedly due to differences in bauxite type, processing and disposal methods. Calcium carbonates were the dominant mineral phases responsible for the carbon sequestration in the investigated red mud types. The carbon sequestration capacity of red mud was not fully exploited due to shortages of soluble divalent cations for formation of stable carbonate minerals. Titanate and silicate ions were the two major oxyanions that appeared to strongly compete with carbonate ions for the available soluble Ca. Supply of additional soluble Ca and Mg could be a viable pathway for maximizing carbon sequestration in red mud and simultaneously reducing the causticity of red mud. It is roughly estimated that over 100 million tonnes of CO(2) have been unintentionally sequestered in red mud around the world to date through the natural weathering of historically produced red mud. Based on the current production rate of red mud, it is likely that some 6 million tonnes of CO(2) will be sequestered annually through atmospheric carbonation. If appropriate technologies are in place for incorporating binding cations into red mud, approximately 6 million tonnes of additional CO(2) can be captured and stored in the red mud while the hazardousness of red mud is simultaneously reduced. Copyright © 2012 Elsevier B.V. All rights reserved.

  14. The dynamics of metric-affine gravity

    International Nuclear Information System (INIS)

    Vitagliano, Vincenzo; Sotiriou, Thomas P.; Liberati, Stefano

    2011-01-01

    Highlights: → The role and the dynamics of the connection in metric-affine theories is explored. → The most general second order action does not lead to a dynamical connection. → Including higher order invariants excites new degrees of freedom in the connection. → f(R) actions are also discussed and shown to be a non-representative class. - Abstract: Metric-affine theories of gravity provide an interesting alternative to general relativity: in such an approach, the metric and the affine (not necessarily symmetric) connection are independent quantities. Furthermore, the action should include covariant derivatives of the matter fields, with the covariant derivative naturally defined using the independent connection. As a result, in metric-affine theories a direct coupling involving matter and connection is also present. The role and the dynamics of the connection in such theories is explored. We employ power counting in order to construct the action and search for the minimal requirements it should satisfy for the connection to be dynamical. We find that for the most general action containing lower order invariants of the curvature and the torsion the independent connection does not carry any dynamics. It actually reduces to the role of an auxiliary field and can be completely eliminated algebraically in favour of the metric and the matter field, introducing extra interactions with respect to general relativity. However, we also show that including higher order terms in the action radically changes this picture and excites new degrees of freedom in the connection, making it (or parts of it) dynamical. Constructing actions that constitute exceptions to this rule requires significant fine tuning and/or extra a priori constraints on the connection. We also consider f(R) actions as a particular example in order to show that they constitute a distinct class of metric-affine theories with special properties, and as such they cannot be used as representative toy

  15. Evaluation metrics for biostatistical and epidemiological collaborations.

    Science.gov (United States)

    Rubio, Doris McGartland; Del Junco, Deborah J; Bhore, Rafia; Lindsell, Christopher J; Oster, Robert A; Wittkowski, Knut M; Welty, Leah J; Li, Yi-Ju; Demets, Dave

    2011-10-15

    Increasing demands for evidence-based medicine and for the translation of biomedical research into individual and public health benefit have been accompanied by the proliferation of special units that offer expertise in biostatistics, epidemiology, and research design (BERD) within academic health centers. Objective metrics that can be used to evaluate, track, and improve the performance of these BERD units are critical to their successful establishment and sustainable future. To develop a set of reliable but versatile metrics that can be adapted easily to different environments and evolving needs, we consulted with members of BERD units from the consortium of academic health centers funded by the Clinical and Translational Science Award Program of the National Institutes of Health. Through a systematic process of consensus building and document drafting, we formulated metrics that covered the three identified domains of BERD practices: the development and maintenance of collaborations with clinical and translational science investigators, the application of BERD-related methods to clinical and translational research, and the discovery of novel BERD-related methodologies. In this article, we describe the set of metrics and advocate their use for evaluating BERD practices. The routine application, comparison of findings across diverse BERD units, and ongoing refinement of the metrics will identify trends, facilitate meaningful changes, and ultimately enhance the contribution of BERD activities to biomedical research. Copyright © 2011 John Wiley & Sons, Ltd.

  16. A Metric on Phylogenetic Tree Shapes.

    Science.gov (United States)

    Colijn, C; Plazzotta, G

    2018-01-01

    The shapes of evolutionary trees are influenced by the nature of the evolutionary process but comparisons of trees from different processes are hindered by the challenge of completely describing tree shape. We present a full characterization of the shapes of rooted branching trees in a form that lends itself to natural tree comparisons. We use this characterization to define a metric, in the sense of a true distance function, on tree shapes. The metric distinguishes trees from random models known to produce different tree shapes. It separates trees derived from tropical versus USA influenza A sequences, which reflect the differing epidemiology of tropical and seasonal flu. We describe several metrics based on the same core characterization, and illustrate how to extend the metric to incorporate trees' branch lengths or other features such as overall imbalance. Our approach allows us to construct addition and multiplication on trees, and to create a convex metric on tree shapes which formally allows computation of average tree shapes. © The Author(s) 2017. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.

  17. Future of the PCI Readmission Metric.

    Science.gov (United States)

    Wasfy, Jason H; Yeh, Robert W

    2016-03-01

    Between 2013 and 2014, the Centers for Medicare and Medicaid Services and the National Cardiovascular Data Registry publicly reported risk-adjusted 30-day readmission rates after percutaneous coronary intervention (PCI) as a pilot project. A key strength of this public reporting effort included risk adjustment with clinical rather than administrative data. Furthermore, because readmission after PCI is common, expensive, and preventable, this metric has substantial potential to improve quality and value in American cardiology care. Despite this, concerns about the metric exist. For example, few PCI readmissions are caused by procedural complications, limiting the extent to which improved procedural technique can reduce readmissions. Also, similar to other readmission measures, PCI readmission is associated with socioeconomic status and race. Accordingly, the metric may unfairly penalize hospitals that care for underserved patients. Perhaps in the context of these limitations, Centers for Medicare and Medicaid Services has not yet included PCI readmission among metrics that determine Medicare financial penalties. Nevertheless, provider organizations may still wish to focus on this metric to improve value for cardiology patients. PCI readmission is associated with low-risk chest discomfort and patient anxiety. Therefore, patient education, improved triage mechanisms, and improved care coordination offer opportunities to minimize PCI readmissions. Because PCI readmission is common and costly, reducing PCI readmission offers provider organizations a compelling target to improve the quality of care, and also performance in contracts that involve shared financial risk. © 2016 American Heart Association, Inc.

  18. Uletutshiliss tonnõ tekstilja / Jevgeni Kapov

    Index Scriptorium Estoniae

    Kapov, Jevgeni

    2006-01-01

    Eighteen tonnes of textile products with a total value of 700,000 kroons, produced at the AS Polytex.ee.printing textile factory in Narva and dispatched to a customer in France, never reached their destination; the police and the company are working together to trace the missing goods and the firm that organised the transport.

  19. Forest carbon trading : legal, policy, ecological and aboriginal issues

    International Nuclear Information System (INIS)

    Elgie, S.

    2005-01-01

    Canada's forest ecosystems store 88 billion tonnes of carbon, with trees alone storing 13 billion tonnes, twice the global annual carbon emissions. Carbon trading could affect forest management. Certain types of forest carbon projects will offer cost-effective carbon sequestration options. This paper addresses current concerns about forest carbon trading such as phony carbon gains, biodiversity impact and increased fossil fuel emissions. Statistics were presented with information on global carbon stocks. The Kyoto Protocol requires that Canada must count all changes in forest carbon stocks resulting from afforestation, reforestation or deforestation, and that Canada has the option of counting carbon stock changes from forest management. The decision must be made by 2006, and considerations are whether to present projected net source or sink, or whether to count current commercially managed areas or all timber productive areas. An outline of federal constitutional authority regarding Kyoto was presented, including limits and risks of trade and treaty powers. The economics of forest carbon were outlined with reference to increasing forest carbon storage. A two-pronged approach was advised, with avoided logging and plantation and intensive management securing carbon and timber benefits. Examples of pre-Kyoto pilots were presented, including the SaskPower project, the Little Red River Cree project and the Labrador Innu project. The disadvantages of offset trading were presented. It was concluded that forest carbon markets are part of a larger vision for sustainable development in Canada's north, especially for aboriginal peoples, and may indicate a growing market for ecological services. Constitutional limits to federal power to regulate carbon trading are not insurmountable, but require care. Ownership of forest carbon rights raises important policy and legal issues, including aboriginal rights, efficiency and equity. An estimated cost of forest carbon projects

  20. g-Weak Contraction in Ordered Cone Rectangular Metric Spaces

    Directory of Open Access Journals (Sweden)

    S. K. Malhotra

    2013-01-01

    Full Text Available We prove some common fixed-point theorems for the ordered g-weak contractions in cone rectangular metric spaces without assuming the normality of cone. Our results generalize some recent results from cone metric and cone rectangular metric spaces into ordered cone rectangular metric spaces. Examples are provided which illustrate the results.

  1. Defining a Progress Metric for CERT RMM Improvement

    Science.gov (United States)

    2017-09-14

    ... defendable resource allocation decisions. Technical metrics measure aspects of controls implemented through technology (systems, software, hardware) ... an implementation metric would be the percentage of users who have received anti-phishing training. Effectiveness/efficiency metrics measure whether

  2. NASA education briefs for the classroom. Metrics in space

    Science.gov (United States)

    The use of metric measurement in space is summarized for classroom use. Advantages of the metric system over the English measurement system are described. Some common metric units are defined, as are special units for astronomical study. International system unit prefixes and a conversion table of metric/English units are presented. Questions and activities for the classroom are recommended.

  3. SOCIAL METRICS APPLIED TO SMART TOURISM

    Directory of Open Access Journals (Sweden)

    O. Cervantes

    2016-09-01

    Full Text Available We present a strategy to make productive use of semantically-related social data, from a user-centered semantic network, in order to help users (tourists and citizens in general) to discover cultural heritage, points of interest and available services in a smart city. This data can be used to personalize recommendations in a smart tourism application. Our approach is based on flow centrality metrics typically used in social network analysis: flow betweenness, flow closeness and eccentricity. These metrics are useful to discover relevant nodes within the network yielding nodes that can be interpreted as suggestions (venues or services) to users. We describe the semantic network built on a graph model, as well as the social metrics algorithms used to produce recommendations. We also present challenges and results from a prototypical implementation applied to the case study of the City of Puebla, Mexico.

  4. Landscape pattern metrics and regional assessment

    Science.gov (United States)

    O'Neill, R. V.; Riitters, K.H.; Wickham, J.D.; Jones, K.B.

    1999-01-01

    The combination of remote imagery data, geographic information systems software, and landscape ecology theory provides a unique basis for monitoring and assessing large-scale ecological systems. The unique feature of the work has been the need to develop and interpret quantitative measures of spatial pattern: the landscape indices. This article reviews what is known about the statistical properties of these pattern metrics and suggests some additional metrics based on island biogeography, percolation theory, hierarchy theory, and economic geography. Assessment applications of this approach have required interpreting the pattern metrics in terms of specific environmental endpoints, such as wildlife and water quality, and research into how to represent synergistic effects of many overlapping sources of stress.

  5. A bi-metric theory of gravitation

    International Nuclear Information System (INIS)

    Rosen, N.

    1975-01-01

    The bi-metric theory of gravitation proposed previously is simplified in that the auxiliary conditions are discarded, the two metric tensors being tied together only by means of the boundary conditions. Some of the properties of the field of a particle are investigated; there is no black hole, and it appears that no gravitational collapse can take place. Although the proposed theory and general relativity are at present observationally indistinguishable, some differences are pointed out which may some day be susceptible of observation. An alternative bi-metric theory is considered which gives for the precession of the perihelion 5/6 of the value given by general relativity; it seems less satisfactory than the present theory from the aesthetic point of view. (author)

  6. Steiner trees for fixed orientation metrics

    DEFF Research Database (Denmark)

    Brazil, Marcus; Zachariasen, Martin

    2009-01-01

    We consider the problem of constructing Steiner minimum trees for a metric defined by a polygonal unit circle (corresponding to s = 2 weighted legal orientations in the plane). A linear-time algorithm to enumerate all angle configurations for degree three Steiner points is given. We provide...... a simple proof that the angle configuration for a Steiner point extends to all Steiner points in a full Steiner minimum tree, such that at most six orientations suffice for edges in a full Steiner minimum tree. We show that the concept of canonical forms originally introduced for the uniform orientation...... metric generalises to the fixed orientation metric. Finally, we give an O(s n) time algorithm to compute a Steiner minimum tree for a given full Steiner topology with n terminal leaves....

  7. Metrical and dynamical aspects in complex analysis

    CERN Document Server

    2017-01-01

    The central theme of this reference book is the metric geometry of complex analysis in several variables. Bridging a gap in the current literature, the text focuses on the fine behavior of the Kobayashi metric of complex manifolds and its relationships to dynamical systems, hyperbolicity in the sense of Gromov and operator theory, all very active areas of research. The modern points of view expressed in these notes, collected here for the first time, will be of interest to academics working in the fields of several complex variables and metric geometry. The different topics are treated coherently and include expository presentations of the relevant tools, techniques and objects, which will be particularly useful for graduate and PhD students specializing in the area.

  8. Social Metrics Applied to Smart Tourism

    Science.gov (United States)

    Cervantes, O.; Gutiérrez, E.; Gutiérrez, F.; Sánchez, J. A.

    2016-09-01

    We present a strategy to make productive use of semantically-related social data, from a user-centered semantic network, in order to help users (tourists and citizens in general) to discover cultural heritage, points of interest and available services in a smart city. This data can be used to personalize recommendations in a smart tourism application. Our approach is based on flow centrality metrics typically used in social network analysis: flow betweenness, flow closeness and eccentricity. These metrics are useful to discover relevant nodes within the network yielding nodes that can be interpreted as suggestions (venues or services) to users. We describe the semantic network built on a graph model, as well as the social metrics algorithms used to produce recommendations. We also present challenges and results from a prototypical implementation applied to the case study of the City of Puebla, Mexico.
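
    The centrality measures named above can be sketched with networkx as follows; the toy venue graph, its weights, and the use of networkx's current-flow implementations for the "flow" centralities are assumptions made for illustration, not data or code from the study.

```python
import networkx as nx

# Toy semantic network of venues/services, weighted by semantic relatedness
G = nx.Graph()
G.add_weighted_edges_from([
    ("cathedral", "museum", 0.9),
    ("museum", "cafe", 0.5),
    ("cafe", "hotel", 0.7),
    ("museum", "hotel", 0.4),
    ("cathedral", "plaza", 0.8),
    ("plaza", "hotel", 0.6),
])

flow_betweenness = nx.current_flow_betweenness_centrality(G, weight="weight")
flow_closeness = nx.current_flow_closeness_centrality(G, weight="weight")
eccentricity = nx.eccentricity(G)   # hop-count eccentricity of each node

# Venues that channel the most "flow" become the top suggestions
suggestions = sorted(flow_betweenness, key=flow_betweenness.get, reverse=True)[:3]
print(suggestions, eccentricity)
```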

  9. Validation of Metrics as Error Predictors

    Science.gov (United States)

    Mendling, Jan

    In this chapter, we test the validity of metrics that were defined in the previous chapter for predicting errors in EPC business process models. In Section 5.1, we provide an overview of how the analysis data is generated. Section 5.2 describes the sample of EPCs from practice that we use for the analysis. Here we discuss a disaggregation by the EPC model group and by error as well as a correlation analysis between metrics and error. Based on this sample, we calculate a logistic regression model for predicting error probability with the metrics as input variables in Section 5.3. In Section 5.4, we then test the regression function for an independent sample of EPC models from textbooks as a cross-validation. Section 5.5 summarizes the findings.

  10. Metric Learning for Hyperspectral Image Segmentation

    Science.gov (United States)

    Bue, Brian D.; Thompson, David R.; Gilmore, Martha S.; Castano, Rebecca

    2011-01-01

    We present a metric learning approach to improve the performance of unsupervised hyperspectral image segmentation. Unsupervised spatial segmentation can assist both user visualization and automatic recognition of surface features. Analysts can use spatially-continuous segments to decrease noise levels and/or localize feature boundaries. However, existing segmentation methods use task-agnostic measures of similarity. Here we learn task-specific similarity measures from training data, improving segment fidelity to classes of interest. Multiclass Linear Discriminant Analysis produces a linear transform that optimally separates a labeled set of training classes. This defines a distance metric that generalizes to new scenes, enabling graph-based segmentation that emphasizes key spectral features. We describe tests based on data from the Compact Reconnaissance Imaging Spectrometer (CRISM) in which learned metrics improve segment homogeneity with respect to mineralogical classes.
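
    A minimal sketch of the general recipe described above: learn a linear transform with multiclass LDA from labelled training pixels, then measure similarity in the transformed space. scikit-learn's LinearDiscriminantAnalysis and the random toy spectra are stand-ins; no CRISM data or mission code is implied.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Toy "spectra": 300 labelled training pixels, 20 spectral bands, 3 classes
X_train = rng.normal(size=(300, 20))
y_train = rng.integers(0, 3, size=300)

# Learn a linear transform that best separates the labelled classes
lda = LinearDiscriminantAnalysis(n_components=2).fit(X_train, y_train)

def learned_distance(a, b):
    """Distance between two spectra in the LDA-transformed (task-specific) space."""
    ta, tb = lda.transform(np.vstack([a, b]))
    return float(np.linalg.norm(ta - tb))

print(learned_distance(X_train[0], X_train[1]))
```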

  11. Kerr metric in the deSitter background

    International Nuclear Information System (INIS)

    Vaidya, P.C.

    1984-01-01

    In addition to the Kerr metric with cosmological constant Λ several other metrics are presented giving a Kerr-like solution of Einstein's equations in the background of deSitter universe. A new metric of what may be termed as rotating deSitter space-time devoid of matter but containing null fluid with twisting null rays, has been presented. This metric reduces to the standard deSitter metric when the twist in the rays vanishes. Kerr metric in this background is the immediate generalization of Schwarzschild's exterior metric with cosmological constant. (author)

  12. Active Metric Learning from Relative Comparisons

    OpenAIRE

    Xiong, Sicheng; Rosales, Rómer; Pei, Yuanli; Fern, Xiaoli Z.

    2014-01-01

    This work focuses on active learning of distance metrics from relative comparison information. A relative comparison specifies, for a data point triplet $(x_i,x_j,x_k)$, that instance $x_i$ is more similar to $x_j$ than to $x_k$. Such constraints, when available, have been shown to be useful toward defining appropriate distance metrics. In real-world applications, acquiring constraints often requires considerable human effort. This motivates us to study how to select and query the most useful ...

  13. Heuristic extension of the Schwarzschild metric

    International Nuclear Information System (INIS)

    Espinosa, J.M.

    1982-01-01

    The Schwarzschild solution of Einstein's equations of gravitation has several singularities. It is known that the singularity at r = 2Gm/c² is only apparent, a result of the coordinates in which the solution was found. Paradoxical results occurring near the singularity show the system of coordinates is incomplete. We introduce a simple, two-dimensional metric with an apparent singularity that makes it incomplete. By a straightforward, heuristic procedure we extend and complete this simple metric. We then use the same procedure to give a heuristic derivation of the Kruskal system of coordinates, which is known to extend the Schwarzschild manifold past its apparent singularity and produce a complete manifold

  14. Metric inhomogeneous Diophantine approximation in positive characteristic

    DEFF Research Database (Denmark)

    Kristensen, Simon

    2011-01-01

    We obtain asymptotic formulae for the number of solutions to systems of inhomogeneous linear Diophantine inequalities over the field of formal Laurent series with coefficients from a finite field, which are valid for almost every such system. Here `almost every' is with respect to Haar measure...... of the coefficients of the homogeneous part when the number of variables is at least two (singly metric case), and with respect to the Haar measure of all coefficients for any number of variables (doubly metric case). As consequences, we derive zero-one laws in the spirit of the Khintchine-Groshev Theorem and zero

  15. Metric inhomogeneous Diophantine approximation in positive characteristic

    DEFF Research Database (Denmark)

    Kristensen, S.

    We obtain asymptotic formulae for the number of solutions to systems of inhomogeneous linear Diophantine inequalities over the field of formal Laurent series with coefficients from a finite field, which are valid for almost every such system. Here 'almost every' is with respect to Haar measure...... of the coefficients of the homogeneous part when the number of variables is at least two (singly metric case), and with respect to the Haar measure of all coefficients for any number of variables (doubly metric case). As consequences, we derive zero-one laws in the spirit of the Khintchine--Groshev Theorem and zero

  16. Jacobi-Maupertuis metric and Kepler equation

    Science.gov (United States)

    Chanda, Sumanto; Gibbons, Gary William; Guha, Partha

    This paper studies the application of the Jacobi-Eisenhart lift, Jacobi metric and Maupertuis transformation to the Kepler system. We start by reviewing fundamentals and the Jacobi metric. Then we study various ways to apply the lift to Kepler-related systems: first as conformal description and Bohlin transformation of Hooke’s oscillator, second in contact geometry and third in Houri’s transformation [T. Houri, Liouville integrability of Hamiltonian systems and spacetime symmetry (2016), www.geocities.jp/football_physician/publication.html], coupled with Milnor’s construction [J. Milnor, On the geometry of the Kepler problem, Am. Math. Mon. 90 (1983) 353-365] with eccentric anomaly.

  17. Common metrics. Comparing the warming effect of climate forcers in climate policy; Common metrics. Laempenemiseen vaikuttavien paeaestoejen yhteismitallistaminen ilmastopolitiikassa

    Energy Technology Data Exchange (ETDEWEB)

    Lindroos, T. J.; Ekholm, T.; Savolainen, I.

    2012-11-15

    Climate policy needs a relatively simple method to compare the warming effect of different greenhouse gases (GHGs); otherwise it would be necessary to negotiate a separate reduction target for each gas. At the moment, the Global Warming Potential (GWP) concept is used to compare different GHGs. The numerical values of GWP factors have been updated alongside scientific understanding, and the majority seems content with the GWP. From 2005 onwards there have been many proposals for alternative metrics. The best known is the Global Temperature change Potential (GTP) concept, which measures the change of temperature, as global climate policies do. The choice between metrics is a multicriteria decision which should consider at least coherence with climate policy and cost efficiency. The GWP concept may be a little more difficult to understand than the GTP, but it is more cost efficient. Alongside the new metrics, scientists and politicians have started to discuss new emission species which have an effect on warming. These Short Lived Climate Forcers (SLCFs) have either a warming or a cooling effect. Their effect can be presented with GWP and GTP, but the uncertainties in the emission factors are large. In total, SLCFs reduced the overall emissions of the EU by approximately 1% in the year 2000; NOx and SOx (cooling) and black carbon (warming) emissions were the biggest factors. The EU is planning to reduce SLCF emissions to achieve health and environmental benefits, but at the same time this reduces the effect of the EU's climate policies by approximately 10%. Uncertainties in the estimates are large. (orig.)
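
    To make the comparison concrete, the sketch below computes a GWP in the usual way, as the ratio of absolute GWPs (time-integrated radiative forcing per kilogram emitted) over a chosen horizon. The parameter values and the simplified CO2 impulse-response function are rounded, illustrative assumptions, not numbers taken from this report.

```python
# Illustrative GWP calculation (not the report's own code). For a gas with a
# single exponential lifetime, AGWP(H) = A * tau * (1 - exp(-H/tau)); CO2 uses
# a simplified multi-exponential impulse response. Parameter values are rough.
import numpy as np

def agwp_single_lifetime(A, tau, H):
    """Absolute GWP (W m-2 yr kg-1) for radiative efficiency A (W m-2 kg-1),
    e-folding lifetime tau (yr), and time horizon H (yr)."""
    return A * tau * (1.0 - np.exp(-H / tau))

def agwp_co2(A_co2, H, a=(0.22, 0.26, 0.34, 0.18),
             tau=(float("inf"), 173.0, 19.0, 1.2)):
    """CO2 AGWP with a simplified multi-exponential decay (illustrative values)."""
    total = 0.0
    for ai, ti in zip(a, tau):
        # The infinite-lifetime fraction integrates to ai * H exactly.
        total += ai * H if np.isinf(ti) else ai * ti * (1.0 - np.exp(-H / ti))
    return A_co2 * total

if __name__ == "__main__":
    H = 100.0
    A_CH4, TAU_CH4 = 1.3e-13, 12.4   # illustrative, per kg of CH4
    A_CO2 = 1.7e-15                  # illustrative, per kg of CO2
    gwp_ch4 = agwp_single_lifetime(A_CH4, TAU_CH4, H) / agwp_co2(A_CO2, H)
    print(f"GWP-100 of CH4 (illustrative, direct effect only): {gwp_ch4:.0f}")
```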

  18. Mainstreaming Low-Carbon Climate-Resilient growth pathways into Development Finance Institutions' activities A research program on the standards, tools and metrics to support transition to the low-carbon climate-resilient development model. Paper 3 - Case Study: Integration of Climate Change into the operational activities of Agence Francaise de Developpement

    International Nuclear Information System (INIS)

    Eschalier, Claire; Deheza, Mariana; Cochran, Ian; Risler, Ophelie; Forestier, Pierre

    2015-10-01

    This case study examines the AFD's integration of climate and transition-related information and tools into its activities. It first presents the general investment process and the range of financial instruments used by AFD. Second, the framework elaborated in paper 2 of this series is used to analyze the upstream and downstream integration of long-term climate and transition objectives. It begins with the analysis of the upstream standards and information that are applied to transpose AFD's global strategy and Climate Action Plan into local and sectoral intervention plans and to guide AFD's initial project screening. It then explores the tools and instruments that are used during the downstream process for project- and program-level assessments and optimization, before the final investment decision is made. Although the tools and standards implemented by AFD constitute a solid base for mainstreaming climate considerations into its activities, it seems that they could be further developed to allow for a more qualitative assessment of a project's contribution to the 'low-carbon transformation' of a given country's economy. A number of opportunities and challenges to build on AFD's existing tools are identified to take this next step - first among which is the need to work with recipient countries and other development finance institutions to identify country-specific low-carbon climate resilient development pathways. (authors)

  19. Increase in observed net carbon dioxide uptake by land and oceans during the past 50 years.

    Science.gov (United States)

    Ballantyne, A P; Alden, C B; Miller, J B; Tans, P P; White, J W C

    2012-08-02

    One of the greatest sources of uncertainty for future climate predictions is the response of the global carbon cycle to climate change. Although approximately one-half of total CO2 emissions is at present taken up by combined land and ocean carbon reservoirs, models predict a decline in future carbon uptake by these reservoirs, resulting in a positive carbon-climate feedback. Several recent studies suggest that rates of carbon uptake by the land and ocean have remained constant or declined in recent decades. Other work, however, has called into question the reported decline. Here we use global-scale atmospheric CO2 measurements, CO2 emission inventories and their full range of uncertainties to calculate changes in global CO2 sources and sinks during the past 50 years. Our mass balance analysis shows that net global carbon uptake has increased significantly by about 0.05 billion tonnes of carbon per year and that global carbon uptake doubled, from 2.4 ± 0.8 to 5.0 ± 0.9 billion tonnes per year, between 1960 and 2010. Therefore, it is very unlikely that both land and ocean carbon sinks have decreased on a global scale. Since 1959, approximately 350 billion tonnes of carbon have been emitted by humans to the atmosphere, of which about 55 per cent has moved into the land and oceans. Thus, identifying the mechanisms and locations responsible for increasing global carbon uptake remains a critical challenge in constraining the modern global carbon budget and predicting future carbon-climate interactions.
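
    The mass-balance logic summarized above can be written down in a few lines. The sketch below uses rounded, illustrative numbers (not the paper's data) and the standard conversion of about 2.124 GtC per ppm of atmospheric CO2:

```python
# Back-of-envelope version of the mass-balance bookkeeping described above
# (illustrative numbers, not the paper's data): net land+ocean carbon uptake
# is inferred as total emissions minus the observed atmospheric CO2 growth.
PPM_TO_GTC = 2.124  # approximately 2.124 GtC per ppm of atmospheric CO2

def net_uptake_gtc(fossil_emissions_gtc, landuse_emissions_gtc, co2_growth_ppm):
    """Net global carbon uptake by land and ocean sinks, in GtC per year."""
    atmospheric_growth_gtc = co2_growth_ppm * PPM_TO_GTC
    return fossil_emissions_gtc + landuse_emissions_gtc - atmospheric_growth_gtc

if __name__ == "__main__":
    # Rough, rounded values for a recent year (assumptions for illustration).
    uptake = net_uptake_gtc(9.5, 1.0, 2.4)
    print(f"{uptake:.1f} GtC/yr taken up by land and ocean combined")
```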

  20. Carbon dioxide emissions from fossil fuel consumption and cement manufacture, 1751-1991; and an estimate of their isotopic composition and latitudinal distribution

    Energy Technology Data Exchange (ETDEWEB)

    Andres, R.J.; Marland, G.; Boden, T.; Bischof, S.

    1994-10-01

    This work briefly discusses four of the current research emphases at Oak Ridge National Laboratory regarding the emission of carbon dioxide (CO2) from fossil fuel consumption, natural gas flaring and cement manufacture. These emphases include: (1) updating the 1950-to-present time series of CO2 emissions from fossil fuel consumption and cement manufacture, (2) extending this time series back to 1751, (3) gridding the data at 1° by 1° resolution, and (4) estimating the isotopic signature of these emissions. In 1991, global emissions of CO2 from fossil fuel and cement increased 1.5% over 1990 levels to 6188 × 10^6 metric tonnes C. The Kuwaiti oil fires can account for all of the increase. Recently published energy data (Etemad et al., 1991) allow extension of the CO2 emissions time series back to 1751. Preliminary examination shows good agreement with two other, but shorter, energy time series. A latitudinal distribution of carbon emissions is being completed. A southward shift in the major mass of CO2 emissions is occurring, from European-North American latitudes towards central-southeast Asian latitudes, reflecting the growth of population and industrialization at these lower latitudes. The carbon isotopic signature of these emissions has been re-examined. The emissions of the last two decades are approximately 1‰ lighter than previously reported (Tans, 1981). This lightening of the emissions signature is due to fossil fuel gases and liquids, including a revision of their δ13C isotopic signature and an increased production rate.

  1. Silk industry and carbon footprint mitigation

    Science.gov (United States)

    Giacomin, A. M.; Garcia, J. B., Jr.; Zonatti, W. F.; Silva-Santos, M. C.; Laktim, M. C.; Baruque-Ramos, J.

    2017-10-01

    Currently there is a concern with issues related to sustainability and more conscious consumption habits. The carbon footprint measures the total amount of greenhouse gas (GHG) emissions produced directly and indirectly by human activities and is usually expressed in tonnes of carbon dioxide (CO2) equivalents. The present study takes into account data collected from the scientific literature regarding the carbon footprint, garments produced with silk fiber, and the role of mulberry as a CO2 mitigation tool. There is an indication of a positive correlation between silk garments and carbon footprint mitigation when the cultivation of mulberry trees is included in the calculation: the cultivated mulberry area mitigates CO2 equivalents amounting to about 735 times the weight of the silk fiber it produces. At the same time, additional research is needed to identify and evaluate methods to publicize this positive correlation and thereby contribute to a more sustainable fashion industry.

  2. Measuring Carbon Footprint of Flexible Pavement Construction Project in Indonesia

    Directory of Open Access Journals (Sweden)

    Utomo Dwi Hatmoko Jati

    2018-01-01

    Road infrastructure in Indonesia is mainly dominated by the flexible pavement type. Its construction process, however, has raised concerns about its environmental impacts. This study aims to track and measure the carbon footprint of flexible pavement. The objectives are to map the construction process in relation to greenhouse gas (GHG) emissions and to quantify them in terms of carbon dioxide equivalents (CO2e) as generated by the production and transportation of raw materials and the operation of plant off-site and on-site. Data collection was done through site observations and interviews with project stakeholders. The results show total emissions of 70.888 tonnes CO2e, consisting of 34.248 tonnes CO2e (48.31%) from off-site activities and 36.640 tonnes CO2e (51.687%) from on-site activities. The two highest CO2e emissions were generated by the use of plant for asphalt concrete laying activities, accounting for 34.827 tonnes CO2e (49.130%), and material transportation, accounting for 24.921 tonnes CO2e (35.155%). These findings provide a new perspective on the carbon footprint of flexible pavement and suggest an urgent need for more efficient and environmentally friendly plant in the construction process, as plant use shows the most significant contribution to the CO2e. This study provides valuable understanding of the environmental impact of typical flexible pavement projects in Indonesia and can further be used for developing a green road framework.
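
    The bookkeeping behind such figures is essentially activity data multiplied by emission factors, summed by location. The sketch below illustrates that structure; the activities, quantities and emission factors are invented assumptions, not the study's data.

```python
# Minimal sketch of a construction carbon-footprint tally:
# CO2e per activity = activity quantity x emission factor, summed by location.
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    location: str           # "off-site" or "on-site"
    quantity: float         # e.g. tonnes of asphalt, litres of diesel, t-km hauled
    emission_factor: float  # tonnes CO2e per unit of quantity

def footprint(activities):
    """Return (total, by-location) CO2e in tonnes for a list of activities."""
    by_location = {}
    for a in activities:
        by_location[a.location] = (by_location.get(a.location, 0.0)
                                   + a.quantity * a.emission_factor)
    return sum(by_location.values()), by_location

if __name__ == "__main__":
    acts = [
        Activity("asphalt production", "off-site", 1200.0, 0.025),
        Activity("material transport", "off-site", 90000.0, 0.0001),
        Activity("asphalt laying plant", "on-site", 15000.0, 0.0027),
    ]
    total, split = footprint(acts)
    print(f"total: {total:.1f} t CO2e, split: {split}")
```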

  3. Quantitative properties of the Schwarzschild metric

    Czech Academy of Sciences Publication Activity Database

    Křížek, Michal; Křížek, Filip

    2018-01-01

    Vol. 2018, No. 1 (2018), pp. 1-10. Institutional support: RVO:67985840. Keywords: exterior and interior Schwarzschild metric; proper radius; coordinate radius. Subject RIV: BA - General Mathematics. OECD field: Applied mathematics. http://astro.shu-bg.net/pasb/index_files/Papers/2018/SCHWARZ8.pdf

  4. Strong Ideal Convergence in Probabilistic Metric Spaces

    Indian Academy of Sciences (India)

    In the present paper we introduce the concepts of strongly ideal convergent sequence and strong ideal Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong ideal limit points and the strong ideal cluster points of a sequence in this ...

  5. lakemorpho: Calculating lake morphometry metrics in R.

    Science.gov (United States)

    Hollister, Jeffrey; Stachelek, Joseph

    2017-01-01

    Metrics describing the shape and size of lakes, known as lake morphometry metrics, are important for any limnological study. In cases where a lake has long been the subject of study, these data are often already collected and openly available. Many other lakes have these data collected, but access is challenging as they are often stored on individual computers (or worse, in filing cabinets) and are available only to the primary investigators. The vast majority of lakes fall into a third category in which the data are not available. This makes broad-scale modelling of lake ecology a challenge, as some of the key information about in-lake processes is unavailable. While this valuable in situ information may be difficult to obtain, several national datasets exist that may be used to model and estimate lake morphometry. In particular, digital elevation models and hydrography have been shown to be predictive of several lake morphometry metrics. The R package lakemorpho has been developed to utilize these data and estimate the following morphometry metrics: surface area, shoreline length, major axis length, minor axis length, major and minor axis length ratio, shoreline development, maximum depth, mean depth, volume, maximum lake length, mean lake width, maximum lake width, and fetch. In this software tool article we describe the motivation behind developing lakemorpho, discuss the implementation in R, and describe the use of lakemorpho with an example of a typical use case.
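
    Two of the listed metrics are simple enough to compute from basic measurements, which the short sketch below illustrates. lakemorpho itself is an R package; this Python snippet is only an illustration of the metric definitions, with invented input values.

```python
# Illustration of two standard lake morphometry metrics (not the lakemorpho
# package itself): shoreline development and mean depth.
import math

def shoreline_development(shoreline_length_m, surface_area_m2):
    """Ratio of shoreline length to the circumference of a circle of equal area;
    1.0 for a perfectly circular lake, larger for more convoluted shorelines."""
    return shoreline_length_m / (2.0 * math.sqrt(math.pi * surface_area_m2))

def mean_depth(volume_m3, surface_area_m2):
    """Mean depth is simply lake volume divided by surface area."""
    return volume_m3 / surface_area_m2

if __name__ == "__main__":
    area, shoreline, volume = 2.5e6, 9400.0, 7.5e6   # illustrative values
    print(f"shoreline development: {shoreline_development(shoreline, area):.2f}")
    print(f"mean depth: {mean_depth(volume, area):.1f} m")
```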

  6. Contraction theorems in fuzzy metric space

    International Nuclear Information System (INIS)

    Farnoosh, R.; Aghajani, A.; Azhdari, P.

    2009-01-01

    In this paper, the results on fuzzy contractive mappings proposed by Dorel Mihet will be proved for B-contractions and C-contractions in the case of George and Veeramani fuzzy metric spaces. The existence of a fixed point under weaker conditions will be proved; that is, instead of the convergence of a subsequence, p-convergence of a subsequence is used.

  7. Inferring feature relevances from metric learning

    DEFF Research Database (Denmark)

    Schulz, Alexander; Mokbel, Bassam; Biehl, Michael

    2015-01-01

    Powerful metric learning algorithms have been proposed in the last years which do not only greatly enhance the accuracy of distance-based classifiers and nearest neighbor database retrieval, but which also enable the interpretability of these operations by assigning explicit relevance weights...

  8. DIGITAL MARKETING: SUCCESS METRICS, FUTURE TRENDS

    OpenAIRE

    Preeti Kaushik

    2017-01-01

    Abstract – Business marketing is one of the areas that has been tremendously affected by the digital world in the last few years. Digital marketing refers to advertising through digital channels. This paper provides a detailed study of metrics to measure the success of digital marketing platforms and a glimpse of the future of these technologies by 2020.

  9. Assessing Software Quality Through Visualised Cohesion Metrics

    Directory of Open Access Journals (Sweden)

    Timothy Shih

    2001-05-01

    Cohesion is one of the most important factors for software quality as well as maintainability, reliability and reusability. Module cohesion is defined as a quality attribute that seeks to measure the singleness of purpose of a module. A module of poor quality can be a serious obstacle to system quality. In order to design software of good quality, software managers and engineers need to introduce cohesion metrics to measure and produce desirable software. Highly cohesive software is thought to be a desirable construction. In this paper, we propose a function-oriented cohesion metric based on the analysis of live variables, live span and the visualization of the processing element dependency graph. We give six typical cohesion examples to be measured as our experiments and justification. A well-defined, well-normalized, well-visualized and well-tested cohesion metric is thus proposed to indicate and enhance software cohesion strength. Furthermore, this cohesion metric can easily be incorporated into software CASE tools to help software engineers improve software quality.

  10. Metric propositional neighborhood logics on natural numbers

    DEFF Research Database (Denmark)

    Bresolin, Davide; Della Monica, Dario; Goranko, Valentin

    2013-01-01

    Metric Propositional Neighborhood Logic (MPNL) over natural numbers. MPNL features two modalities referring, respectively, to an interval that is “met by” the current one and to an interval that “meets” the current one, plus an infinite set of length constraints, regarded as atomic propositions...

  11. Calabi–Yau metrics and string compactification

    Directory of Open Access Journals (Sweden)

    Michael R. Douglas

    2015-09-01

    Yau proved an existence theorem for Ricci-flat Kähler metrics in the 1970s, but we still have no closed form expressions for them. Nevertheless there are several ways to get approximate expressions, both numerical and analytical. We survey some of this work and explain how it can be used to obtain physical predictions from superstring theory.

  12. Goedel-type metrics in various dimensions

    International Nuclear Information System (INIS)

    Guerses, Metin; Karasu, Atalay; Sarioglu, Oezguer

    2005-01-01

    Goedel-type metrics are introduced and used in producing charged dust solutions in various dimensions. The key ingredient is a (D - 1)-dimensional Riemannian geometry which is then employed in constructing solutions to the Einstein-Maxwell field equations with a dust distribution in D dimensions. The only essential field equation in the procedure turns out to be the source-free Maxwell's equation in the relevant background. Similarly the geodesics of this type of metric are described by the Lorentz force equation for a charged particle in the lower dimensional geometry. It is explicitly shown with several examples that Goedel-type metrics can be used in obtaining exact solutions to various supergravity theories and in constructing spacetimes that contain both closed timelike and closed null curves and that contain neither of these. Among the solutions that can be established using non-flat backgrounds, such as the Tangherlini metrics in (D - 1)-dimensions, there exists a class which can be interpreted as describing black-hole-type objects in a Goedel-like universe

  13. Strong Statistical Convergence in Probabilistic Metric Spaces

    OpenAIRE

    Şençimen, Celaleddin; Pehlivan, Serpil

    2008-01-01

    In this article, we introduce the concepts of strongly statistically convergent sequence and strong statistically Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong statistical limit points and the strong statistical cluster points of a sequence in this space and investigate the relations between these concepts.

  14. Language Games: University Responses to Ranking Metrics

    Science.gov (United States)

    Heffernan, Troy A.; Heffernan, Amanda

    2018-01-01

    League tables of universities that measure performance in various ways are now commonplace, with numerous bodies providing their own rankings of how institutions throughout the world are seen to be performing on a range of metrics. This paper uses Lyotard's notion of language games to theorise that universities are regaining some power over being…

  15. A new universal colour image fidelity metric

    NARCIS (Netherlands)

    Toet, A.; Lucassen, M.P.

    2003-01-01

    We extend a recently introduced universal grayscale image quality index to a newly developed perceptually decorrelated colour space. The resulting colour image fidelity metric quantifies the distortion of a processed colour image relative to its original version. We evaluated the new colour image

  16. Standardised metrics for global surgical surveillance.

    Science.gov (United States)

    Weiser, Thomas G; Makary, Martin A; Haynes, Alex B; Dziekan, Gerald; Berry, William R; Gawande, Atul A

    2009-09-26

    Public health surveillance relies on standardised metrics to evaluate disease burden and health system performance. Such metrics have not been developed for surgical services despite increasing volume, substantial cost, and high rates of death and disability associated with surgery. The Safe Surgery Saves Lives initiative of WHO's Patient Safety Programme has developed standardised public health metrics for surgical care that are applicable worldwide. We assembled an international panel of experts to develop and define metrics for measuring the magnitude and effect of surgical care in a population, while taking into account economic feasibility and practicability. This panel recommended six measures for assessing surgical services at a national level: number of operating rooms, number of operations, number of accredited surgeons, number of accredited anaesthesia professionals, day-of-surgery death ratio, and postoperative in-hospital death ratio. We assessed the feasibility of gathering such statistics at eight diverse hospitals in eight countries and incorporated them into the WHO Guidelines for Safe Surgery, in which methods for data collection, analysis, and reporting are outlined.

  17. A Lagrangian-dependent metric space

    International Nuclear Information System (INIS)

    El-Tahir, A.

    1989-08-01

    A generalized Lagrangian-dependent metric of the static isotropic spacetime is derived. Its behaviour should be governed by imposing physical constraints that allow one to avert the pathological features of gravity in the strong-field domain. This would restrict the choice of the Lagrangian form. (author). 10 refs

  18. Clean Cities 2011 Annual Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, C.

    2012-12-01

    This report details the petroleum savings and vehicle emissions reductions achieved by the U.S. Department of Energy's Clean Cities program in 2011. The report also details other performance metrics, including the number of stakeholders in Clean Cities coalitions, outreach activities by coalitions and national laboratories, and alternative fuel vehicles deployed.

  19. Clean Cities 2010 Annual Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, C.

    2012-10-01

    This report details the petroleum savings and vehicle emissions reductions achieved by the U.S. Department of Energy's Clean Cities program in 2010. The report also details other performance metrics, including the number of stakeholders in Clean Cities coalitions, outreach activities by coalitions and national laboratories, and alternative fuel vehicles deployed.

  20. Genetic basis of a cognitive complexity metric

    NARCIS (Netherlands)

    Hansell, Narelle K; Halford, Graeme S; Andrews, Glenda; Shum, David H K; Harris, Sarah E; Davies, Gail; Franic, Sanja; Christoforou, Andrea; Zietsch, Brendan; Painter, Jodie; Medland, Sarah E; Ehli, Erik A; Davies, Gareth E; Steen, Vidar M; Lundervold, Astri J; Reinvang, Ivar; Montgomery, Grant W; Espeseth, Thomas; Hulshoff Pol, Hilleke E; Starr, John M; Martin, Nicholas G; Le Hellard, Stephanie; Boomsma, Dorret I; Deary, Ian J; Wright, Margaret J

    2015-01-01

    Relational complexity (RC) is a metric reflecting capacity limitation in relational processing. It plays a crucial role in higher cognitive processes and is an endophenotype for several disorders. However, the genetic underpinnings of complex relational processing have not been investigated. Using

  1. Genetic Basis of a Cognitive Complexity Metric

    NARCIS (Netherlands)

    Hansell, N.K.; Halford, G.S.; Andrews, G.; Shum, D.H.K.; Harris, S.E.; Davies, G.; Franic, S.; Christoforou, A.; Zietsch, B.; Painter, J.; Medland, S.E.; Ehli, E.A.; Davies, G.E.; Steen, V.M.; Lundervold, A.J.; Reinvang, I.; Montgomery, G.W.; Espeseth, T.; Hulshoff Pol, H.E.; Starr, J.M.; Martin, N.G.; Le Hellard, S.; Boomsma, D.I.; Deary, I.J.; Wright, M.J.

    2015-01-01

    Relational complexity (RC) is a metric reflecting capacity limitation in relational processing. It plays a crucial role in higher cognitive processes and is an endophenotype for several disorders. However, the genetic underpinnings of complex relational processing have not been investigated. Using

  2. Business model metrics : An open repository

    NARCIS (Netherlands)

    Heikkila, M.; Bouwman, W.A.G.A.; Heikkila, J.; Solaimani, S.; Janssen, W.

    2015-01-01

    Development of successful business models has become a necessity in turbulent business environments, but compared to research on business modeling tools, attention to the role of metrics in designing business models in literature is limited. Building on existing approaches to business models and

  3. Software quality metrics aggregation in industry

    NARCIS (Netherlands)

    Mordal, K.; Anquetil, N.; Laval, J.; Serebrenik, A.; Vasilescu, B.N.; Ducasse, S.

    2013-01-01

    With the growing need for quality assessment of entire software systems in the industry, new issues are emerging. First, because most software quality metrics are defined at the level of individual software components, there is a need for aggregation methods to summarize the results at the system

  4. Invariance group of the Finsler metric function

    International Nuclear Information System (INIS)

    Asanov, G.S.

    1985-01-01

    An invariance group of the Finsler metric function is introduced and studied that directly generalizes the respective concept (a group of Euclidean rotations) of Riemannian geometry. A sequential description of the isotopic invariance of physical fields on the basis of Finsler geometry is possible in terms of this group

  5. Sigma Routing Metric for RPL Protocol

    Directory of Open Access Journals (Sweden)

    Paul Sanmartin

    2018-04-01

    This paper presents the adaptation of a specific metric for the RPL protocol in the objective function MRHOF. Among the objective functions standardized by the IETF we find OF0, which is based on the minimum hop count, and MRHOF, which is based on the Expected Transmission Count (ETX). However, when the network becomes denser or the number of nodes increases, both OF0 and MRHOF introduce long hops, which can generate a bottleneck that restricts the network. The adaptation is proposed to optimize both OFs through a new routing metric. To solve the above problem, the metrics of the minimum number of hops and the ETX are combined by designing a new routing metric called SIGMA-ETX, in which the best route is calculated using the standard deviation of the ETX values between each node, as opposed to working with the ETX average along the route. This method ensures better routing performance in dense sensor networks. The simulations are done with the Cooja simulator, based on the Contiki operating system. The simulations showed that the proposed optimization outperforms both OF0 and MRHOF by a wide margin in terms of network latency, packet delivery ratio, lifetime, and power consumption.
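
    Read this way, the route-selection rule can be stated in a few lines. The sketch below is our interpretation of the idea as described in the abstract (not the authors' implementation), with invented per-hop ETX values:

```python
# Hedged sketch of the SIGMA-ETX idea as described above: score each candidate
# route by the standard deviation of its per-hop ETX values and prefer the
# route with the lowest score (most uniform link quality).
import statistics

def sigma_etx(route_etx):
    """Per-route metric: population standard deviation of the per-hop ETX values."""
    return statistics.pstdev(route_etx)

def best_route(candidate_routes):
    """Pick the route whose per-hop ETX values are most uniform."""
    return min(candidate_routes, key=sigma_etx)

if __name__ == "__main__":
    routes = [
        [1.1, 1.2, 1.1, 1.3],   # several short, uniform hops
        [1.0, 1.0, 3.5],        # contains one long, lossy hop
    ]
    print("selected route:", best_route(routes))
```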

  6. Forest carbon benefits, costs and leakage effects of carbon reserve scenarios in the United States

    Science.gov (United States)

    Prakash Nepal; Peter J. Ince; Kenneth E. Skog; Sun J. Chang

    2013-01-01

    This study evaluated the potential effectiveness of future carbon reserve scenarios, where U.S. forest landowners would hypothetically be paid to sequester carbon on their timberland and forego timber harvests for 100 years. Scenarios featured direct payments to landowners of $0 (baseline), $5, $10, or $15 per metric ton of additional forest carbon sequestered on the...

  7. Observable traces of non-metricity: New constraints on metric-affine gravity

    Science.gov (United States)

    Delhom-Latorre, Adrià; Olmo, Gonzalo J.; Ronco, Michele

    2018-05-01

    Relaxing the Riemannian condition to incorporate geometric quantities such as torsion and non-metricity may allow us to explore new physics associated with defects in a hypothetical space-time microstructure. Here we show that non-metricity produces observable effects in quantum fields in the form of 4-fermion contact interactions, thereby allowing us to constrain the scale of non-metricity to be greater than 1 TeV by using results on Bhabha scattering. Our analysis is carried out in the framework of a wide class of theories of gravity in the metric-affine approach. The bound obtained represents an improvement of several orders of magnitude over previous experimental constraints.

  8. Conformal and related changes of metric on the product of two almost contact metric manifolds.

    OpenAIRE

    Blair, D. E.

    1990-01-01

    This paper studies conformal and related changes of the product metric on the product of two almost contact metric manifolds. It is shown that if one factor is Sasakian, the other is not, but that locally the second factor is of the type studied by Kenmotsu. The results are more general and given in terms of trans-Sasakian, α-Sasakian and β-Kenmotsu structures.

  9. Metrics for measuring distances in configuration spaces

    International Nuclear Information System (INIS)

    Sadeghi, Ali; Ghasemi, S. Alireza; Schaefer, Bastian; Mohr, Stephan; Goedecker, Stefan; Lill, Markus A.

    2013-01-01

    In order to characterize molecular structures we introduce configurational fingerprint vectors which are counterparts of quantities used experimentally to identify structures. The Euclidean distance between the configurational fingerprint vectors satisfies the properties of a metric and can therefore safely be used to measure dissimilarities between configurations in the high dimensional configuration space. In particular we show that these metrics are a perfect and computationally cheap replacement for the root-mean-square distance (RMSD) when one has to decide whether two noise contaminated configurations are identical or not. We introduce a Monte Carlo approach to obtain the global minimum of the RMSD between configurations, which is obtained from a global minimization over all translations, rotations, and permutations of atomic indices
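
    The key property of such a fingerprint is that the Euclidean distance between fingerprint vectors behaves as a true metric and is insensitive to the nuisance transformations that plague RMSD. The sketch below is a simplified stand-in, not the paper's exact fingerprint: it uses the sorted eigenvalues of the interatomic distance matrix, which are invariant under translations, rotations and permutations of atomic indices.

```python
# Simplified stand-in for a configurational fingerprint: sorted eigenvalues of
# the interatomic distance matrix, compared by Euclidean distance.
import numpy as np

def fingerprint(positions):
    """positions: (n_atoms, 3) array. Returns a sorted-eigenvalue fingerprint."""
    diff = positions[:, None, :] - positions[None, :, :]
    dist_matrix = np.linalg.norm(diff, axis=-1)        # symmetric (n, n) matrix
    return np.sort(np.linalg.eigvalsh(dist_matrix))

def configuration_distance(pos_a, pos_b):
    """Euclidean distance between the two fingerprint vectors."""
    return float(np.linalg.norm(fingerprint(pos_a) - fingerprint(pos_b)))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    a = rng.normal(size=(5, 3))
    b = a[::-1] + rng.normal(scale=1e-3, size=(5, 3))  # permuted atoms + noise
    d = configuration_distance(a, b)
    print(f"distance between near-identical configurations: {d:.4f}")
```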

  10. A perceptual metric for photo retouching.

    Science.gov (United States)

    Kee, Eric; Farid, Hany

    2011-12-13

    In recent years, advertisers and magazine editors have been widely criticized for taking digital photo retouching to an extreme. Impossibly thin, tall, and wrinkle- and blemish-free models are routinely splashed onto billboards, advertisements, and magazine covers. The ubiquity of these unrealistic and highly idealized images has been linked to eating disorders and body image dissatisfaction in men, women, and children. In response, several countries have considered legislating the labeling of retouched photos. We describe a quantitative and perceptually meaningful metric of photo retouching. Photographs are rated on the degree to which they have been digitally altered by explicitly modeling and estimating geometric and photometric changes. This metric correlates well with perceptual judgments of photo retouching and can be used to objectively judge by how much a retouched photo has strayed from reality.

  11. Metric-Aware Secure Service Orchestration

    Directory of Open Access Journals (Sweden)

    Gabriele Costa

    2012-12-01

    Secure orchestration is an important concern in the Internet of Services. Next to providing the required functionality, composite services must also provide a reasonable level of security in order to protect sensitive data. Thus, the orchestrator needs to check whether the complex service is able to satisfy certain properties. Some properties are expressed with metrics for a precise definition of requirements. Thus, the problem is to analyse the values of metrics for a complex business process. In this paper we extend our previous work on the analysis of secure orchestration with quantifiable properties. We show how to define, verify and enforce quantitative security requirements in one framework together with other security properties. The proposed approach should help to select the most suitable service architecture and guarantee fulfilment of the declared security requirements.

  12. Machine Learning for ATLAS DDM Network Metrics

    CERN Document Server

    Lassnig, Mario; The ATLAS collaboration; Vamosi, Ralf

    2016-01-01

    The increasing volume of physics data is posing a critical challenge to the ATLAS experiment. In anticipation of high luminosity physics, automation of everyday data management tasks has become necessary. Previously many of these tasks required human decision-making and operation. Recent advances in hardware and software have made it possible to entrust more complicated duties to automated systems using models trained by machine learning algorithms. In this contribution we show results from our ongoing automation efforts. First, we describe our framework for distributed data management and network metrics, automatically extract and aggregate data, train models with various machine learning algorithms, and eventually score the resulting models and parameters. Second, we use these models to forecast metrics relevant for network-aware job scheduling and data brokering. We show the characteristics of the data and evaluate the forecasting accuracy of our models.

  13. Beyond Lovelock gravity: Higher derivative metric theories

    Science.gov (United States)

    Crisostomi, M.; Noui, K.; Charmousis, C.; Langlois, D.

    2018-02-01

    We consider theories describing the dynamics of a four-dimensional metric, whose Lagrangian is diffeomorphism invariant and depends at most on second derivatives of the metric. Imposing degeneracy conditions we find a set of Lagrangians that, apart from the Einstein-Hilbert one, are either trivial or contain more than 2 degrees of freedom. Among the partially degenerate theories, we recover Chern-Simons gravity, endowed with constraints whose structure suggests the presence of instabilities. Then, we enlarge the class of parity violating theories of gravity by introducing new "chiral scalar-tensor theories." Although they all raise the same concern as Chern-Simons gravity, they can nevertheless make sense as low energy effective field theories or, by restricting them to the unitary gauge (where the scalar field is uniform), as Lorentz breaking theories with a parity violating sector.

  14. Using improved technology for widespread application of a geological carbon sequestration study

    Science.gov (United States)

    Raney, J.

    2013-12-01

    The Kansas Geological Survey is part of an ongoing collaboration between DOE-NETL, academia, and the petroleum industry to investigate the feasibility of carbon utilization and storage in Kansas. Latest findings in the 25,000 mi2 study area in southern Kansas estimate that CO2 storage capacity ranges from 8.8 to 75.5 billion metric tons in a deep Lower Ordovician-age Arbuckle saline aquifer. In addition, an estimated 100 million tonnes of CO2 could be used for extracting additional oil from Kansas' fields, making transitions to carbon management economic. This partnership has a rare opportunity to synchronize abundant, yet previously disseminated knowledge into a cohesive scientific process to optimize sequestration site selection and implementation strategies. Following a thorough characterization, a small-scale CO2 injection of 70,000 tonnes will be implemented in Wellington Field in Sumner County, including a five-plot miscible CO2-EOR flood of a Mississippian reservoir followed by the underlying Arbuckle saline aquifer. Best practices and lessons learned from the field study will improve estimates of CO2 storage capacity and plume migration models, and identify potential leakage pathways, to pursue safe and effective geological carbon sequestration at commercial scales. A highly accessible and multifunctional online database is being developed throughout the study that integrates all acquired geological, physical, chemical, and hydrogeologic knowledge. This public database incorporates tens of thousands of data points into easily viewable formats for user downloads. An Interactive Project Map Viewer is a key mechanism to present the scientific research, and will delineate compartment candidates and reservoirs matching reference criteria or user-defined attributes. This tool uses a familiar pan and zoom interface to filter regional project data or scale down to detailed digitized information from over 3,300 carefully selected preexisting Kansas wells. A Java-based log

  15. High-Dimensional Metrics in R

    OpenAIRE

    Chernozhukov, Victor; Hansen, Chris; Spindler, Martin

    2016-01-01

    The package High-dimensional Metrics (hdm) is an evolving collection of statistical methods for estimation and quantification of uncertainty in high-dimensional approximately sparse models. It focuses on providing confidence intervals and significance testing for (possibly many) low-dimensional subcomponents of the high-dimensional parameter vector. Efficient estimators and uniformly valid confidence intervals for regression coefficients on target variables (e.g., treatment or poli...

  16. Interiors of Vaidya's radiating metric: Gravitational collapse

    International Nuclear Information System (INIS)

    Fayos, F.; Jaen, X.; Llanta, E.; Senovilla, J.M.M.

    1992-01-01

    Using the Darmois junction conditions, we give the necessary and sufficient conditions for the matching of a general spherically symmetric metric to a Vaidya radiating solution. We present also these conditions in terms of the physical quantities of the corresponding energy-momentum tensors. The physical interpretation of the results and their possible applications are studied, and we also perform a detailed analysis of previous work on the subject by other authors

  17. Anisotropic rectangular metric for polygonal surface remeshing

    KAUST Repository

    Pellenard, Bertrand

    2013-06-18

    We propose a new method for anisotropic polygonal surface remeshing. Our algorithm takes as input a surface triangle mesh. An anisotropic rectangular metric, defined at each triangle facet of the input mesh, is derived from both a user-specified normal-based tolerance error and the requirement to favor rectangle-shaped polygons. Our algorithm uses a greedy optimization procedure that adds, deletes and relocates generators so as to match two criteria related to partitioning and conformity.

  18. A Metrics Approach for Collaborative Systems

    Directory of Open Access Journals (Sweden)

    Cristian CIUREA

    2009-01-01

    This article presents different types of collaborative systems, their structure and classification. The paper defines the concept of a virtual campus as a collaborative system and builds an architecture for a virtual campus oriented toward collaborative training processes. It analyses the quality characteristics of collaborative systems and proposes techniques for metrics construction and validation in order to evaluate them. The article also analyzes different ways to increase the efficiency and the performance level in collaborative banking systems.

  19. Preserved Network Metrics across Translated Texts

    Science.gov (United States)

    Cabatbat, Josephine Jill T.; Monsanto, Jica P.; Tapang, Giovanni A.

    2014-09-01

    Co-occurrence language networks based on Bible translations and the Universal Declaration of Human Rights (UDHR) translations in different languages were constructed and compared with random text networks. Among the considered network metrics, the network size, N, the normalized betweenness centrality (BC), and the average k-nearest neighbors, knn, were found to be the most preserved across translations. Moreover, similar frequency distributions of co-occurring network motifs were observed for translated texts networks.

  20. Anisotropic rectangular metric for polygonal surface remeshing

    KAUST Repository

    Pellenard, Bertrand; Morvan, Jean-Marie; Alliez, Pierre

    2013-01-01

    We propose a new method for anisotropic polygonal surface remeshing. Our algorithm takes as input a surface triangle mesh. An anisotropic rectangular metric, defined at each triangle facet of the input mesh, is derived from both a user-specified normal-based tolerance error and the requirement to favor rectangle-shaped polygons. Our algorithm uses a greedy optimization procedure that adds, deletes and relocates generators so as to match two criteria related to partitioning and conformity.

  1. Smart Grid Status and Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Balducci, Patrick J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Weimar, Mark R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kirkham, Harold [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-07-01

    To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. It measures 21 metrics to provide insight into the grid’s capacity to embody these characteristics. This report looks across a spectrum of smart grid concerns to measure the status of smart grid deployment and impacts.

  2. Metrics in Keplerian orbits quotient spaces

    Science.gov (United States)

    Milanov, Danila V.

    2018-03-01

    Quotient spaces of Keplerian orbits are important instruments for the modelling of orbit samples of celestial bodies on a large time span. We suppose that variations of the orbital eccentricities, inclinations and semi-major axes remain sufficiently small, while arbitrary perturbations are allowed for the arguments of pericentres or longitudes of the nodes, or both. The distance between orbits or their images in quotient spaces serves as a numerical criterion for such problems of Celestial Mechanics as search for common origin of meteoroid streams, comets, and asteroids, asteroid families identification, and others. In this paper, we consider quotient sets of the non-rectilinear Keplerian orbits space H. Their elements are identified irrespective of the values of pericentre arguments or node longitudes. We prove that distance functions on the quotient sets, introduced in Kholshevnikov et al. (Mon Not R Astron Soc 462:2275-2283, 2016), satisfy metric space axioms and discuss the theoretical and practical importance of this result. Isometric embeddings of the quotient spaces into R^n, and a space of compact subsets of H with the Hausdorff metric, are constructed. The Euclidean representations of the orbit spaces find their applications in a problem of orbit averaging and computational algorithms specific to Euclidean space. We also explore completions of H and its quotient spaces with respect to corresponding metrics and establish a relation between elements of the extended spaces and rectilinear trajectories. The distance between an orbit and subsets of elliptic and hyperbolic orbits is calculated. This quantity provides an upper bound for the metric value in a problem of close orbits identification. Finally, the invariance of the equivalence relations in H under coordinate change is discussed.

  3. The Planck Vacuum and the Schwarzschild Metrics

    Directory of Open Access Journals (Sweden)

    Daywitt W. C.

    2009-07-01

    The Planck vacuum (PV) is assumed to be the source of the visible universe. So under conditions of sufficient stress, there must exist a pathway through which energy from the PV can travel into this universe. Conversely, the passage of energy from the visible universe to the PV must also exist under the same stressful conditions. The following examines two versions of the Schwarzschild metric equation for compatibility with this open-pathway idea.

  4. Metrics and Its Function in Poetry

    Institute of Scientific and Technical Information of China (English)

    XIAO Zhong-qiong; CHEN Min-jie

    2013-01-01

    Poetry is a special combination of musical and linguistic qualities - of sounds both regarded as pure sound and as meaningful speech. Part of the pleasure of poetry lies in its relationship with music. Metrics, including rhythm and meter, is an important method for poetry to express poetic sentiment. Through the introduction of poetic language and typical examples, the writer of this paper tries to discuss the relationship between sound and meaning.

  5. Image characterization metrics for muon tomography

    Science.gov (United States)

    Luo, Weidong; Lehovich, Andre; Anashkin, Edward; Bai, Chuanyong; Kindem, Joel; Sossong, Michael; Steiger, Matt

    2014-05-01

    Muon tomography uses naturally occurring cosmic rays to detect nuclear threats in containers. Currently there are no systematic image characterization metrics for muon tomography. We propose a set of image characterization methods to quantify the imaging performance of muon tomography. These methods include tests of spatial resolution, uniformity, contrast, signal-to-noise ratio (SNR) and vertical smearing. Simulated phantom data and analysis methods were developed to evaluate metric applicability. Spatial resolution was determined as the FWHM of the point spread functions along the X, Y and Z axes for 2.5 cm tungsten cubes. Uniformity was measured by drawing a volume of interest (VOI) within a large water phantom and defined as the standard deviation of voxel values divided by the mean voxel value. Contrast was defined as the peak signals of a set of tungsten cubes divided by the mean voxel value of the water background. SNR was defined as the peak signals of cubes divided by the standard deviation (noise) of the water background. Vertical smearing, i.e. vertical thickness blurring along the zenith axis for a set of 2 cm thick tungsten plates, was defined as the FWHM of the vertical spread function for the plate. These image metrics provide a useful tool to quantify the basic imaging properties of muon tomography.
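
    The uniformity, contrast and SNR definitions quoted above translate directly into code. The following sketch applies them to toy volume data; the array shapes and values are illustrative assumptions, not the study's phantoms.

```python
# Minimal sketch of the image-characterization metrics defined above
# (uniformity, contrast, SNR), applied to toy reconstructed-volume data.
import numpy as np

def uniformity(background_voi):
    """Standard deviation of voxel values divided by their mean."""
    return float(np.std(background_voi) / np.mean(background_voi))

def contrast(peak_signals, background_voi):
    """Peak signal of the targets divided by the mean background value."""
    return float(np.mean(peak_signals) / np.mean(background_voi))

def snr(peak_signals, background_voi):
    """Peak signal of the targets divided by the background standard deviation."""
    return float(np.mean(peak_signals) / np.std(background_voi))

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    water = rng.normal(loc=10.0, scale=1.5, size=(20, 20, 20))  # background VOI
    tungsten_peaks = np.array([85.0, 92.0, 88.0])                # cube peak values
    print(f"uniformity: {uniformity(water):.3f}")
    print(f"contrast:   {contrast(tungsten_peaks, water):.1f}")
    print(f"SNR:        {snr(tungsten_peaks, water):.1f}")
```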

  6. A Fundamental Metric for Metal Recycling Applied to Coated Magnesium

    NARCIS (Netherlands)

    Meskers, C.E.M.; Reuter, M.A.; Boin, U.; Kvithyld, A.

    2008-01-01

    A fundamental metric for the assessment of the recyclability and, hence, the sustainability of coated magnesium scrap is presented; this metric combines kinetics and thermodynamics. The recycling process, consisting of thermal decoating and remelting, was studied by thermogravimetry and differential

  7. Ideal Based Cyber Security Technical Metrics for Control Systems

    Energy Technology Data Exchange (ETDEWEB)

    W. F. Boyer; M. A. McQueen

    2007-10-01

    Much of the world's critical infrastructure is at risk from attack through electronic networks connected to control systems. Security metrics are important because they provide the basis for management decisions that affect the protection of the infrastructure. A cyber security technical metric is the security relevant output from an explicit mathematical model that makes use of objective measurements of a technical object. A specific set of technical security metrics are proposed for use by the operators of control systems. Our proposed metrics are based on seven security ideals associated with seven corresponding abstract dimensions of security. We have defined at least one metric for each of the seven ideals. Each metric is a measure of how nearly the associated ideal has been achieved. These seven ideals provide a useful structure for further metrics development. A case study shows how the proposed metrics can be applied to an operational control system.

  8. 43 CFR 12.915 - Metric system of measurement.

    Science.gov (United States)

    2010-10-01

    ... procurements, grants, and other business-related activities. Metric implementation may take longer where the... recipient, such as when foreign competitors are producing competing products in non-metric units. (End of...

  9. The Jacobi metric for timelike geodesics in static spacetimes

    Science.gov (United States)

    Gibbons, G. W.

    2016-01-01

    It is shown that the free motion of massive particles moving in static spacetimes is given by the geodesics of an energy-dependent Riemannian metric on the spatial sections, analogous to Jacobi's metric in classical dynamics. In the massless limit Jacobi's metric coincides with the energy-independent Fermat or optical metric. For stationary metrics, it is known that the motion of massless particles is given by the geodesics of an energy-independent Finslerian metric of Randers type. The motion of massive particles is governed by neither a Riemannian nor a Finslerian metric. The properties of the Jacobi metric for massive particles moving outside the horizon of a Schwarzschild black hole are described. By contrast with the massless case, the Gaussian curvature of the equatorial sections is not always negative.
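
    For orientation, the construction can be written compactly; the form below is the standard statement of the result sketched in this record (consult the cited paper for the precise conventions):

```latex
% For a static metric
%   ds^2 = -V^2(x)\,dt^2 + g_{ij}(x)\,dx^i dx^j ,
% a timelike geodesic of a particle of mass m and conserved energy E projects
% onto a geodesic of the energy-dependent Jacobi metric on the spatial sections
\[
  J_{ij} \;=\; \bigl(E^{2} - m^{2}V^{2}\bigr)\,\frac{g_{ij}}{V^{2}} ,
\]
% which, in the massless limit m -> 0, reduces (up to the constant factor E^2)
% to the energy-independent optical (Fermat) metric g_{ij}/V^2.
```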

  10. Factor structure of the Tomimatsu-Sato metrics

    International Nuclear Information System (INIS)

    Perjes, Z.

    1989-02-01

    Based on an earlier result stating that δ = 3 Tomimatsu-Sato (TS) metrics can be factored over the field of integers, an analogous representation for higher TS metrics was sought. It is shown that the factoring property of TS metrics follows from the structure of special Hankel determinants. A set of linear algebraic equations determining the factors was defined, and the factors of the first five TS metrics were tabulated, together with their primitive factors. (R.P.) 4 refs.; 2 tabs

  11. What can article-level metrics do for you?

    Science.gov (United States)

    Fenner, Martin

    2013-10-01

    Article-level metrics (ALMs) provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, usage statistics, discussions in online comments and social media, social bookmarking, and recommendations. In this essay, we describe why article-level metrics are an important extension of traditional citation-based journal metrics and provide a number of examples from ALM data collected for PLOS Biology.

  12. A convergence theory for probabilistic metric spaces | Jäger ...

    African Journals Online (AJOL)

    We develop a theory of probabilistic convergence spaces based on Tardiff's neighbourhood systems for probabilistic metric spaces. We show that the resulting category is a topological universe and we characterize a subcategory that is isomorphic to the category of probabilistic metric spaces. Keywords: Probabilistic metric ...

  13. Understanding Acceptance of Software Metrics--A Developer Perspective

    Science.gov (United States)

    Umarji, Medha

    2009-01-01

    Software metrics are measures of software products and processes. Metrics are widely used by software organizations to help manage projects, improve product quality and increase efficiency of the software development process. However, metrics programs tend to have a high failure rate in organizations, and developer pushback is one of the sources…

  14. Modified intuitionistic fuzzy metric spaces and some fixed point theorems

    International Nuclear Information System (INIS)

    Saadati, R.; Sedghi, S.; Shobe, N.

    2008-01-01

    The intuitionistic fuzzy metric space has extra conditions (see [Gregori V, Romaguera S, Veereamani P. A note on intuitionistic fuzzy metric spaces. Chaos, Solitons and Fractals 2006;28:902-5]). In this paper, we consider modified intuitionistic fuzzy metric spaces and prove some fixed point theorems in these spaces. All the results presented in this paper are new

  15. Tide or Tsunami? The Impact of Metrics on Scholarly Research

    Science.gov (United States)

    Bonnell, Andrew G.

    2016-01-01

    Australian universities are increasingly resorting to the use of journal metrics such as impact factors and ranking lists in appraisal and promotion processes, and are starting to set quantitative "performance expectations" which make use of such journal-based metrics. The widespread use and misuse of research metrics is leading to…

  16. Robustness of climate metrics under climate policy ambiguity

    International Nuclear Information System (INIS)

    Ekholm, Tommi; Lindroos, Tomi J.; Savolainen, Ilkka

    2013-01-01

    Highlights: • We assess the economic impacts of using different climate metrics. • The setting is cost-efficient scenarios for three interpretations of the 2C target. • With each target setting, the optimal metric is different. • Therefore policy ambiguity prevents the selection of an optimal metric. • Robust metric values that perform well with multiple policy targets however exist. -- Abstract: A wide array of alternatives has been proposed as the common metrics with which to compare the climate impacts of different emission types. Different physical and economic metrics and their parameterizations give diverse weights between e.g. CH4 and CO2, and fixing the metric from one perspective makes it sub-optimal from another. As the aims of global climate policy involve some degree of ambiguity, it is not possible to determine a metric that would be optimal and consistent with all policy aims. This paper evaluates the cost implications of using predetermined metrics in cost-efficient mitigation scenarios. Three formulations of the 2 °C target, including both deterministic and stochastic approaches, shared a wide range of metric values for CH4 with which the mitigation costs are only slightly above the cost-optimal levels. Therefore, although ambiguity in current policy might prevent us from selecting an optimal metric, it can be possible to select robust metric values that perform well with multiple policy targets

  17. Graev metrics on free products and HNN extensions

    DEFF Research Database (Denmark)

    Slutsky, Konstantin

    2014-01-01

    We give a construction of two-sided invariant metrics on free products (possibly with amalgamation) of groups with two-sided invariant metrics and, under certain conditions, on HNN extensions of such groups. Our approach is similar to the Graev's construction of metrics on free groups over pointed...

  18. The universal connection and metrics on moduli spaces

    International Nuclear Information System (INIS)

    Massamba, Fortune; Thompson, George

    2003-11-01

    We introduce a class of metrics on gauge theoretic moduli spaces. These metrics are made out of the universal matrix that appears in the universal connection construction of M. S. Narasimhan and S. Ramanan. As an example we construct metrics on the c_2 = 1 SU(2) moduli space of instantons on R^4 for various universal matrices. (author)

  19. ST-intuitionistic fuzzy metric space with properties

    Science.gov (United States)

    Arora, Sahil; Kumar, Tanuj

    2017-07-01

    In this paper, we define the ST-intuitionistic fuzzy metric space, and the notions of convergence and the completeness property of Cauchy sequences are studied. Further, we prove some properties of ST-intuitionistic fuzzy metric spaces. Finally, we introduce the concept of a symmetric ST-intuitionistic fuzzy metric space.

  20. Term Based Comparison Metrics for Controlled and Uncontrolled Indexing Languages

    Science.gov (United States)

    Good, B. M.; Tennis, J. T.

    2009-01-01

    Introduction: We define a collection of metrics for describing and comparing sets of terms in controlled and uncontrolled indexing languages and then show how these metrics can be used to characterize a set of languages spanning folksonomies, ontologies and thesauri. Method: Metrics for term set characterization and comparison were identified and…
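
    The record does not spell out its specific metrics, so the sketch below only illustrates one common family of term-set comparisons (set overlap and coverage) on which such characterizations typically build; the two vocabularies shown are invented:

```python
# Illustrative term-set comparison metrics for two indexing vocabularies
# (assumed examples, not necessarily the metrics defined in the paper).
def jaccard(terms_a, terms_b):
    """|A intersection B| / |A union B| for two sets of index terms."""
    a, b = set(terms_a), set(terms_b)
    return len(a & b) / len(a | b) if (a or b) else 0.0

def coverage(terms_a, terms_b):
    """Fraction of vocabulary A whose terms also appear in vocabulary B."""
    a, b = set(terms_a), set(terms_b)
    return len(a & b) / len(a) if a else 0.0

if __name__ == "__main__":
    thesaurus = {"climate change", "carbon dioxide", "greenhouse gas", "metric ton"}
    folksonomy = {"climate change", "co2", "carbon dioxide", "global warming"}
    print(f"jaccard:  {jaccard(thesaurus, folksonomy):.2f}")
    print(f"coverage: {coverage(thesaurus, folksonomy):.2f}")
```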

  1. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  2. Otherwise Engaged : Social Media from Vanity Metrics to Critical Analytics

    NARCIS (Netherlands)

    Rogers, R.

    2018-01-01

    Vanity metrics is a term that captures the measurement and display of how well one is doing in the “success theater” of social media. The notion of vanity metrics implies a critique of metrics concerning both the object of measurement as well as their capacity to measure unobtrusively or only to

  3. Meter Detection in Symbolic Music Using Inner Metric Analysis

    NARCIS (Netherlands)

    de Haas, W.B.; Volk, A.

    2016-01-01

    In this paper we present PRIMA: a new model tailored to symbolic music that detects the meter and the first downbeat position of a piece. Given onset data, the metrical structure of a piece is interpreted using the Inner Metric Analysis (IMA) model. IMA identifies the strong and weak metrical

  4. Regional Sustainability: The San Luis Basin Metrics Project

    Science.gov (United States)

    There are a number of established, scientifically supported metrics of sustainability. Many of the metrics are data intensive and require extensive effort to collect data and compute. Moreover, individual metrics may not capture all aspects of a system that are relevant to sust...

  5. Assessing the metrics of climate change. Current methods and future possibilities

    Energy Technology Data Exchange (ETDEWEB)

    Fuglestveit, Jan S.; Berntsen, Terje K.; Godal, Odd; Sausen, Robert; Shine, Keith P.; Skodvin, Tora

    2001-07-01

    With the principle of comprehensiveness embedded in the UN Framework Convention on Climate Change (Art. 3), a multi-gas abatement strategy with emphasis also on non-CO2 greenhouse gases as targets for reduction and control measures has been adopted in the international climate regime. In the Kyoto Protocol, the comprehensive approach is made operative as the aggregate anthropogenic carbon dioxide equivalent emissions of six specified greenhouse gases or groups of gases (Art. 3). With this operationalisation, the emissions of a set of greenhouse gases with very different atmospheric lifetimes and radiative properties are transformed into one common unit - CO2 equivalents. This transformation is based on the Global Warming Potential (GWP) index, which in turn is based on the concept of radiative forcing. The GWP metric and its application in policy making has been debated, and several other alternative concepts have been suggested. In this paper, we review existing and alternative metrics of climate change, with particular emphasis on radiative forcing and GWPs, in terms of their scientific performance. This assessment focuses on questions such as the climate impact (end point) against which gases are weighted; the extent to which and how temporality is included, both with regard to emission control and with regard to climate impact; how cost issues are dealt with; and the sensitivity of the metrics to various assumptions. It is concluded that the radiative forcing concept is a robust and useful metric of the potential climatic impact of various agents and that there are prospects for improvement by weighing different forcings according to their effectiveness. We also find that although the GWP concept is associated with serious shortcomings, it retains advantages over any of the proposed alternatives in terms of political feasibility. Alternative metrics, however, make a significant contribution to addressing important issues, and this contribution should be taken

  6. Assessing the metrics of climate change. Current methods and future possibilities

    International Nuclear Information System (INIS)

    Fuglestveit, Jan S.; Berntsen, Terje K.; Godal, Odd; Sausen, Robert; Shine, Keith P.; Skodvin, Tora

    2001-01-01

    With the principle of comprehensiveness embedded in the UN Framework Convention on Climate Change (Art. 3), a multi-gas abatement strategy with emphasis also on non-CO2 greenhouse gases as targets for reduction and control measures has been adopted in the international climate regime. In the Kyoto Protocol, the comprehensive approach is made operative as the aggregate anthropogenic carbon dioxide equivalent emissions of six specified greenhouse gases or groups of gases (Art. 3). With this operationalisation, the emissions of a set of greenhouse gases with very different atmospheric lifetimes and radiative properties are transformed into one common unit - CO2 equivalents. This transformation is based on the Global Warming Potential (GWP) index, which in turn is based on the concept of radiative forcing. The GWP metric and its application in policy making has been debated, and several other alternative concepts have been suggested. In this paper, we review existing and alternative metrics of climate change, with particular emphasis on radiative forcing and GWPs, in terms of their scientific performance. This assessment focuses on questions such as the climate impact (end point) against which gases are weighted; the extent to which and how temporality is included, both with regard to emission control and with regard to climate impact; how cost issues are dealt with; and the sensitivity of the metrics to various assumptions. It is concluded that the radiative forcing concept is a robust and useful metric of the potential climatic impact of various agents and that there are prospects for improvement by weighing different forcings according to their effectiveness. We also find that although the GWP concept is associated with serious shortcomings, it retains advantages over any of the proposed alternatives in terms of political feasibility. Alternative metrics, however, make a significant contribution to addressing important issues, and this contribution should be taken

  7. Extremal limits of the C metric: Nariai, Bertotti-Robinson, and anti-Nariai C metrics

    International Nuclear Information System (INIS)

    Dias, Oscar J.C.; Lemos, Jose P.S.

    2003-01-01

    In two previous papers we have analyzed the C metric in a background with a cosmological constant Λ, namely, the de Sitter (dS) C metric (Λ>0) and the anti-de Sitter (AdS) C metric (Λ<0). Here the extremal limits of these solutions are analyzed for Λ>0, Λ=0, and Λ<0. In the limiting geometry (dS_2 x S̃_2), to each point in the deformed two-sphere S̃_2 corresponds a dS_2 spacetime, except for one point which corresponds to a dS_2 spacetime with an infinite straight strut or string. There are other important new features that appear. One expects that the solutions found in this paper are unstable and decay into a slightly nonextreme black hole pair accelerated by a strut or by strings. Moreover, the Euclidean versions of these solutions mediate the quantum process of black hole pair creation that accompanies the decay of the dS and AdS spaces

  8. Massless and massive quanta resulting from a mediumlike metric tensor

    International Nuclear Information System (INIS)

    Soln, J.

    1985-01-01

    A simple model of the ''primordial'' scalar field theory is presented in which the metric tensor is a generalization of the metric tensor from electrodynamics in a medium. The radiation signal corresponding to the scalar field propagates with a velocity that is generally less than c. This signal can be associated simultaneously with imaginary and real effective (momentum-dependent) masses. The requirement that the imaginary effective mass vanishes, which we take to be the prerequisite for the vacuumlike signal propagation, leads to the ''spontaneous'' splitting of the metric tensor into two distinct metric tensors: one metric tensor gives rise to masslesslike radiation and the other to a massive particle. (author)

  9. Principle of space existence and De Sitter metric

    International Nuclear Information System (INIS)

    Mal'tsev, V.K.

    1990-01-01

    The selection principle for the solutions of the Einstein equations suggested in a series of papers implies the existence of space (g_ik ≠ 0) only in the presence of matter (T_ik ≠ 0). This selection principle (principle of space existence, in the Markov terminology) implies, in the general case, the absence of the cosmological solution with the De Sitter metric. On the other hand, the De Sitter metric is necessary for describing both inflation and deflation periods of the Universe. It is shown that the De Sitter metric is also allowed by the selection principle under discussion if the metric experiences the evolution into the Friedmann metric

  10. Pragmatic security metrics applying metametrics to information security

    CERN Document Server

    Brotby, W Krag

    2013-01-01

    Other books on information security metrics discuss number theory and statistics in academic terms. Light on mathematics and heavy on utility, PRAGMATIC Security Metrics: Applying Metametrics to Information Security breaks the mold. This is the ultimate how-to-do-it guide for security metrics. Packed with time-saving tips, the book offers easy-to-follow guidance for those struggling with security metrics. Step by step, it clearly explains how to specify, develop, use, and maintain an information security measurement system (a comprehensive suite of metrics) to

  11. Atmospheric carbon reduction by urban trees

    International Nuclear Information System (INIS)

    Nowak, D.J.

    1993-01-01

    Trees, because they sequester atmospheric carbon through their growth process and conserve energy in urban areas, have been suggested as one means to combat increasing levels of atmospheric carbon. Analysis of the urban forest in Oakland, California (21% tree cover), reveals a tree carbon storage level of 11.0 metric tons/hectare. Trees in the area of the 1991 fire in Oakland stored approximately 14,500 metric tons of carbon, 10% of the total amount stored by Oakland's urban forest. National urban forest carbon storage in the United States (28% tree cover) is estimated at between 350 and 750 million metric tons. Establishment of 10 million urban trees annually over the next 10 years is estimated to sequester and offset the production of 363 million metric tons of carbon over the next 50 years, less than 1% of the estimated carbon emissions in the United States over the same time period. Advantages and limitations of managing urban trees to reduce atmospheric carbon are discussed. 36 refs., 2 figs., 3 tabs
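
    A quick back-of-the-envelope check of the "less than 1%" claim above (a sketch only; the assumed US emission rate of roughly 1.35 billion metric tons of carbon per year is an early-1990s order-of-magnitude figure, not a value from the paper):

    ```python
    # Rough check of the urban-tree offset share (illustrative assumptions only).
    tree_offset_tc = 363e6               # metric tons C offset over 50 years (from the abstract)
    us_emissions_tc_per_year = 1.35e9    # assumed US emissions, metric tons C/year (order of magnitude)
    years = 50

    share = tree_offset_tc / (us_emissions_tc_per_year * years)
    print(f"Urban-tree offset as a share of 50-year US emissions: {share:.2%}")  # roughly 0.5%
    ```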

  12. Classification in medical images using adaptive metric k-NN

    Science.gov (United States)

    Chen, C.; Chernoff, K.; Karemore, G.; Lo, P.; Nielsen, M.; Lauze, F.

    2010-03-01

    The performance of the k-nearest neighbors (k-NN) classifier is highly dependent on the distance metric used to identify the k nearest neighbors of the query points. The standard Euclidean distance is commonly used in practice. This paper investigates the performance of the k-NN classifier with respect to different adaptive metrics in the context of medical imaging. We propose using adaptive metrics such that the structure of the data is better described, introducing some unsupervised learning knowledge into k-NN. Four different adaptive metrics are investigated: a theoretical metric based on the assumption that images are drawn from the Brownian Image Model (BIM); a normalized metric based on the variance of the data; an empirical metric based on the empirical covariance matrix of the unlabeled data; and an optimized metric obtained by minimizing the classification error. The spectral structure of the empirical covariance also allows Principal Component Analysis (PCA) to be performed on it, which results in the subspace metrics. The metrics are evaluated on two data sets: lateral X-rays of the lumbar aortic/spine region, where we use k-NN for performing abdominal aorta calcification detection; and mammograms, where we use k-NN for breast cancer risk assessment. The results show that an appropriate choice of metric can improve classification.
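
    As a minimal sketch of the general idea (not the authors' exact estimators), the snippet below runs k-NN with a quadratic-form distance whose matrix is the inverse of the empirical covariance of unlabeled data, loosely corresponding to the "empirical metric" above; all arrays and parameter values are hypothetical.

    ```python
    import numpy as np

    def knn_predict(X_train, y_train, X_query, k=5, VI=None):
        """k-NN with the adaptive distance d(x, y)^2 = (x - y)^T VI (x - y); VI=None gives Euclidean."""
        if VI is None:
            VI = np.eye(X_train.shape[1])
        preds = []
        for q in X_query:
            diff = X_train - q                                  # differences to the query point
            d2 = np.einsum('nd,de,ne->n', diff, VI, diff)       # squared adaptive distances
            nearest = np.argsort(d2)[:k]                        # indices of the k nearest neighbours
            labels, counts = np.unique(y_train[nearest], return_counts=True)
            preds.append(labels[np.argmax(counts)])             # majority vote
        return np.array(preds)

    # "Empirical metric": inverse covariance estimated from unlabeled data (hypothetical data).
    rng = np.random.default_rng(0)
    X_unlabeled = rng.normal(size=(200, 3))
    VI_empirical = np.linalg.inv(np.cov(X_unlabeled, rowvar=False))

    X_train = rng.normal(size=(100, 3))
    y_train = (X_train[:, 0] > 0).astype(int)
    print(knn_predict(X_train, y_train, rng.normal(size=(5, 3)), k=7, VI=VI_empirical))
    ```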

  13. THE ROLE OF ARTICLE LEVEL METRICS IN SCIENTIFIC PUBLISHING

    Directory of Open Access Journals (Sweden)

    Vladimir TRAJKOVSKI

    2016-04-01

    Emerging article-level metrics do not exclude traditional metrics based on citations to the journal, but complement them. Article-level metrics (ALMs) provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, usage statistics, discussions in online comments and social media, social bookmarking, and recommendations. In this editorial, the role of article-level metrics in publishing scientific papers is described. ALMs are rapidly emerging as important tools to quantify how individual articles are being discussed, shared, and used. Data sources depend on the tool, but they include classic citation-based indicators, academic social networks (Mendeley, CiteULike, Delicious) and social media (Facebook, Twitter, blogs, and YouTube). The most popular tools used to apply these new metrics are: Public Library of Science - Article-Level Metrics, Altmetric, Impactstory and Plum Analytics. The Journal Impact Factor (JIF) does not consider impact or influence beyond citation counts, and only as reflected in Thomson Reuters' Web of Science® database. The JIF provides an indicator related to the journal, but not to an individual published paper. Thus, altmetrics now become an alternative metric for the performance assessment of individual scientists and their scholarly publications. Macedonian scholarly publishers have to work on implementing article-level metrics in their e-journals. It is a way to increase their visibility and impact in the world of science.

  14. Outsourced Similarity Search on Metric Data Assets

    DEFF Research Database (Denmark)

    Yiu, Man Lung; Assent, Ira; Jensen, Christian S.

    2012-01-01

    This paper considers a cloud computing setting in which similarity querying of metric data is outsourced to a service provider. The data is to be revealed only to trusted users, not to the service provider or anyone else. Users query the server for the most similar data objects to a query example... Outsourcing offers the data owner scalability and a low initial investment. The need for privacy may be due to the data being sensitive (e.g., in medicine), valuable (e.g., in astronomy), or otherwise confidential. Given this setting, the paper presents techniques that transform the data prior to supplying...

  15. New Metrics from a Fractional Gravitational Field

    International Nuclear Information System (INIS)

    El-Nabulsi, Rami Ahmad

    2017-01-01

    Agop et al. proved in Commun. Theor. Phys. (2008) that a Reissner–Nordstrom-type metric is obtained if the gauge gravitational field in a fractal spacetime is constructed by means of the concepts of scale relativity. We prove in this short communication that a similar result is obtained if gravity in D spacetime dimensions is fractionalized by means of the Glaeske–Kilbas–Saigo fractional. Besides, non-singular gravitational fields are obtained without using extra dimensions. We present a few examples to show that these gravitational fields hold a number of motivating features in spacetime physics. (paper)

  16. Multi-Robot Assembly Strategies and Metrics

    Science.gov (United States)

    MARVEL, JEREMY A.; BOSTELMAN, ROGER; FALCO, JOE

    2018-01-01

    We present a survey of multi-robot assembly applications and methods and describe trends and general insights into the multi-robot assembly problem for industrial applications. We focus on fixtureless assembly strategies featuring two or more robotic systems. Such robotic systems include industrial robot arms, dexterous robotic hands, and autonomous mobile platforms, such as automated guided vehicles. In this survey, we identify the types of assemblies that are enabled by utilizing multiple robots, the algorithms that synchronize the motions of the robots to complete the assembly operations, and the metrics used to assess the quality and performance of the assemblies. PMID:29497234

  17. Metric preheating and limitations of linearized gravity

    International Nuclear Information System (INIS)

    Bassett, Bruce A.; Tamburini, Fabrizio; Kaiser, David I.; Maartens, Roy

    1999-01-01

    During the preheating era after inflation, resonant amplification of quantum field fluctuations takes place. Recently it has become clear that this must be accompanied by resonant amplification of scalar metric fluctuations, since the two are united by Einstein's equations. Furthermore, this 'metric preheating' enhances particle production, and leads to gravitational rescattering effects even at linear order. In multi-field models with strong preheating (q>>1), metric perturbations are driven non-linear, with the strongest amplification typically on super-Hubble scales (k→0). This amplification is causal, being due to the super-Hubble coherence of the inflaton condensate, and is accompanied by resonant growth of entropy perturbations. The amplification invalidates the use of the linearized Einstein field equations, irrespective of the amount of fine-tuning of the initial conditions. This has serious implications on all scales - from large-angle cosmic microwave background (CMB) anisotropies to primordial black holes. We investigate the (q,k) parameter space in a two-field model, and introduce the time to non-linearity, t_nl, as the timescale for the breakdown of the linearized Einstein equations. t_nl is a robust indicator of resonance behavior, showing the fine structure in q and k that one expects from a quasi-Floquet system, and we argue that t_nl is a suitable generalization of the static Floquet index in an expanding universe. Backreaction effects are expected to shut down the linear resonances, but cannot remove the existing amplification, which threatens the viability of strong preheating when confronted with the CMB. Mode-mode coupling and turbulence tend to re-establish scale invariance, but this process is limited by causality and for small k the primordial scale invariance of the spectrum may be destroyed. We discuss ways to escape the above conclusions, including secondary phases of inflation and preheating solely to fermions. The exclusion principle

  18. Alternative kinetic energy metrics for Lagrangian systems

    Science.gov (United States)

    Sarlet, W.; Prince, G.

    2010-11-01

    We examine Lagrangian systems on R^n with standard kinetic energy terms for the possibility of additional, alternative Lagrangians with kinetic energy metrics different to the Euclidean one. Using the techniques of the inverse problem in the calculus of variations we find necessary and sufficient conditions for the existence of such Lagrangians. We illustrate the problem in two and three dimensions with quadratic and cubic potentials. As an aside we show that the well-known anomalous Lagrangians for the Coulomb problem can be removed by switching on a magnetic field, providing an appealing resolution of the ambiguous quantizations of the hydrogen atom.
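
    To make the setting concrete (a sketch of the standard framing of the inverse problem, not the authors' derivation), the question is whether a system with the Euclidean kinetic energy admits an alternative Lagrangian built on a different kinetic energy metric that reproduces the same equations of motion:

    ```latex
    % Standard Lagrangian with the Euclidean kinetic energy metric \delta_{ij}:
    L(q,\dot q) = \tfrac{1}{2}\,\delta_{ij}\,\dot q^i \dot q^j - V(q)
    % Candidate alternative Lagrangian with a kinetic energy metric g_{ij}(q):
    \tilde L(q,\dot q) = \tfrac{1}{2}\,g_{ij}(q)\,\dot q^i \dot q^j - \tilde V(q)
    % The inverse problem asks for conditions on g_{ij} and \tilde V such that
    \frac{d}{dt}\frac{\partial \tilde L}{\partial \dot q^i} - \frac{\partial \tilde L}{\partial q^i} = 0
    \quad\Longleftrightarrow\quad
    \ddot q^i + \delta^{ij}\,\partial_j V(q) = 0 .
    ```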

  19. Differential geometry bundles, connections, metrics and curvature

    CERN Document Server

    Taubes, Clifford Henry

    2011-01-01

    Bundles, connections, metrics and curvature are the 'lingua franca' of modern differential geometry and theoretical physics. This book will supply a graduate student in mathematics or theoretical physics with the fundamentals of these objects. Many of the tools used in differential topology are introduced and the basic results about differentiable manifolds, smooth maps, differential forms, vector fields, Lie groups, and Grassmannians are all presented here. Other material covered includes the basic theorems about geodesics and Jacobi fields, the classification theorem for flat connections, the

  20. Multi-Robot Assembly Strategies and Metrics.

    Science.gov (United States)

    Marvel, Jeremy A; Bostelman, Roger; Falco, Joe

    2018-02-01

    We present a survey of multi-robot assembly applications and methods and describe trends and general insights into the multi-robot assembly problem for industrial applications. We focus on fixtureless assembly strategies featuring two or more robotic systems. Such robotic systems include industrial robot arms, dexterous robotic hands, and autonomous mobile platforms, such as automated guided vehicles. In this survey, we identify the types of assemblies that are enabled by utilizing multiple robots, the algorithms that synchronize the motions of the robots to complete the assembly operations, and the metrics used to assess the quality and performance of the assemblies.

  1. Indefinite metric and regularization of electrodynamics

    International Nuclear Information System (INIS)

    Gaudin, M.

    1984-06-01

    The invariant regularization of Pauli and Villars in quantum electrodynamics can be considered as deriving from a local and causal lagrangian theory for spin 1/2 bosons, by introducing an indefinite metric and a condition on the allowed states similar to the Lorentz condition. The consequence is the asymptotic freedom of the photon's propagator. We present a calculation of the effective charge to the fourth order in the coupling as a function of the auxiliary masses, the theory avoiding all mass divergencies to this order [fr

  2. Metrics for comparing plasma mass filters

    Energy Technology Data Exchange (ETDEWEB)

    Fetterman, Abraham J.; Fisch, Nathaniel J. [Department of Astrophysical Sciences, Princeton University, Princeton, New Jersey 08540 (United States)

    2011-10-15

    High-throughput mass separation of nuclear waste may be useful for optimal storage, disposal, or environmental remediation. The most dangerous part of nuclear waste is the fission product, which produces most of the heat and medium-term radiation. Plasmas are well-suited to separating nuclear waste because they can separate many different species in a single step. A number of plasma devices have been designed for such mass separation, but there has been no standardized comparison between these devices. We define a standard metric, the separative power per unit volume, and derive it for three different plasma mass filters: the plasma centrifuge, Ohkawa filter, and the magnetic centrifugal mass filter.

  3. Metrics for comparing plasma mass filters

    International Nuclear Information System (INIS)

    Fetterman, Abraham J.; Fisch, Nathaniel J.

    2011-01-01

    High-throughput mass separation of nuclear waste may be useful for optimal storage, disposal, or environmental remediation. The most dangerous part of nuclear waste is the fission product, which produces most of the heat and medium-term radiation. Plasmas are well-suited to separating nuclear waste because they can separate many different species in a single step. A number of plasma devices have been designed for such mass separation, but there has been no standardized comparison between these devices. We define a standard metric, the separative power per unit volume, and derive it for three different plasma mass filters: the plasma centrifuge, Ohkawa filter, and the magnetic centrifugal mass filter.

  4. Decision Analysis for Metric Selection on a Clinical Quality Scorecard.

    Science.gov (United States)

    Guth, Rebecca M; Storey, Patricia E; Vitale, Michael; Markan-Aurora, Sumita; Gordon, Randolph; Prevost, Traci Q; Dunagan, Wm Claiborne; Woeltje, Keith F

    2016-09-01

    Clinical quality scorecards are used by health care institutions to monitor clinical performance and drive quality improvement. Because of the rapid proliferation of quality metrics in health care, BJC HealthCare found it increasingly difficult to select the most impactful scorecard metrics while still monitoring metrics for regulatory purposes. A 7-step measure selection process was implemented incorporating Kepner-Tregoe Decision Analysis, which is a systematic process that considers key criteria that must be satisfied in order to make the best decision. The decision analysis process evaluates what metrics will most appropriately fulfill these criteria, as well as identifies potential risks associated with a particular metric in order to identify threats to its implementation. Using this process, a list of 750 potential metrics was narrowed to 25 that were selected for scorecard inclusion. This decision analysis process created a more transparent, reproducible approach for selecting quality metrics for clinical quality scorecards. © The Author(s) 2015.

  5. Balanced metrics for vector bundles and polarised manifolds

    DEFF Research Database (Denmark)

    Garcia Fernandez, Mario; Ross, Julius

    2012-01-01

    We consider a notion of balanced metrics for triples (X, L, E) which depend on a parameter α, where X is a smooth complex manifold with an ample line bundle L and E is a holomorphic vector bundle over X. For generic choice of α, we prove that the limit of a convergent sequence of balanced metrics leads to a Hermitian-Einstein metric on E and a constant scalar curvature Kähler metric in c_1(L). For special values of α, limits of balanced metrics are solutions of a system of coupled equations relating a Hermitian-Einstein metric on E and a Kähler metric in c_1(L). For this, we compute the top two...

  6. Construction of Einstein-Sasaki metrics in D≥7

    International Nuclear Information System (INIS)

    Lue, H.; Pope, C. N.; Vazquez-Poritz, J. F.

    2007-01-01

    We construct explicit Einstein-Kaehler metrics in all even dimensions D=2n+4≥6, in terms of a 2n-dimensional Einstein-Kaehler base metric. These are cohomogeneity 2 metrics which have the new feature of including a NUT-type parameter, or gravomagnetic charge, in addition to mass and rotation parameters. Using a canonical construction, these metrics all yield Einstein-Sasaki metrics in dimensions D=2n+5≥7. As is commonly the case in this type of construction, for suitable choices of the free parameters the Einstein-Sasaki metrics can extend smoothly onto complete and nonsingular manifolds, even though the underlying Einstein-Kaehler metric has conical singularities. We discuss some explicit examples in the case of seven-dimensional Einstein-Sasaki spaces. These new spaces can provide supersymmetric backgrounds in M theory, which play a role in the AdS_4/CFT_3 correspondence

  7. National Metrical Types in Nineteenth Century Art Song

    Directory of Open Access Journals (Sweden)

    Leigh VanHandel

    2010-01-01

    William Rothstein’s article “National metrical types in music of the eighteenth and early nineteenth centuries” (2008) proposes a distinction between the metrical habits of 18th and early 19th century German music and those of Italian and French music of that period. Based on theoretical treatises and compositional practice, he outlines these national metrical types and discusses the characteristics of each type. This paper presents the results of a study designed to determine whether, and to what degree, Rothstein’s characterizations of national metrical types are present in 19th century French and German art song. Studying metrical habits in this genre may provide a lens into changing metrical conceptions of 19th century theorists and composers, as well as to the metrical habits and compositional style of individual 19th century French and German art song composers.

  8. Metrication: An economic wake-up call for US industry

    Science.gov (United States)

    Carver, G. P.

    1993-03-01

    As the international standard of measurement, the metric system is one key to success in the global marketplace. International standards have become an important factor in international economic competition. Non-metric products are becoming increasingly unacceptable in world markets that favor metric products. Procurement is the primary federal tool for encouraging and helping U.S. industry to convert voluntarily to the metric system. Besides the perceived unwillingness of the customer, certain regulatory language, and certain legal definitions in some states, there are no major impediments to conversion of the remaining non-metric industries to metric usage. Instead, there are good reasons for changing, including an opportunity to rethink many industry standards and to take advantage of size standardization. Also, when the remaining industries adopt the metric system, they will come into conformance with federal agencies engaged in similar activities.

  9. Electrochemical processing of carbon dioxide.

    Science.gov (United States)

    Oloman, Colin; Li, Hui

    2008-01-01

    With respect to the negative role of carbon dioxide on our climate, it is clear that the time is ripe for the development of processes that convert CO2 into useful products. The electroreduction of CO2 is a prime candidate here, as the reaction at near-ambient conditions can yield organics such as formic acid, methanol, and methane. Recent laboratory work on the 100 A scale has shown that reduction of CO2 to formate (HCO2-) may be carried out in a trickle-bed continuous electrochemical reactor under industrially viable conditions. Presuming the problems of cathode stability and formate crossover can be overcome, this type of reactor is proposed as the basis for a commercial operation. The viability of corresponding processes for electrosynthesis of formate salts and/or formic acid from CO2 is examined here through conceptual flowsheets for two process options, each converting CO2 at the rate of 100 tonnes per day.

  10. Cyclic occurrence of fire and its role in carbon dynamics along an edaphic moisture gradient in longleaf pine ecosystems.

    Directory of Open Access Journals (Sweden)

    Andrew Whelan

    Fire regulates the structure and function of savanna ecosystems, yet we lack understanding of how cyclic fire affects savanna carbon dynamics. Furthermore, it is largely unknown how predicted changes in climate may impact the interaction between fire and carbon cycling in these ecosystems. This study utilizes a novel combination of prescribed fire, eddy covariance (EC) and statistical techniques to investigate carbon dynamics in frequently burned longleaf pine savannas along a gradient of soil moisture availability (mesic, intermediate and xeric). This research approach allowed us to investigate the complex interactions between carbon exchange and cyclic fire along the ecological amplitude of longleaf pine. Three years of EC measurements of net ecosystem exchange (NEE) show that the mesic site was a net carbon sink (NEE = -2.48 tonnes C ha^-1), while the intermediate and xeric sites were net carbon sources (NEE = 1.57 and 1.46 tonnes C ha^-1, respectively), but when carbon losses due to fuel consumption were taken into account, all three sites were carbon sources (10.78, 7.95 and 9.69 tonnes C ha^-1 at the mesic, intermediate and xeric sites, respectively). Nonetheless, rates of NEE returned to pre-fire levels 1-2 months following fire. Consumption of leaf area by prescribed fire was associated with a reduction in NEE post-fire, and the system quickly recovered its carbon uptake capacity 30-60 days post fire. While losses due to fire affected carbon balances on short time scales (instantaneous to a few months), drought conditions over the final two years of the study were a more important driver of net carbon loss on yearly to multi-year time scales. However, longer-term observations over greater environmental variability and additional fire cycles would help to more precisely examine interactions between fire and climate and make future predictions about carbon dynamics in these systems.

  11. Fanpage metrics analysis. "Study on content engagement"

    Science.gov (United States)

    Rahman, Zoha; Suberamanian, Kumaran; Zanuddin, Hasmah Binti; Moghavvemi, Sedigheh; Nasir, Mohd Hairul Nizam Bin Md

    2016-08-01

    Social media is now regarded as an excellent communicative tool to connect directly with consumers. One of the most significant ways to connect with consumers through these social networking sites (SNS) is to create a Facebook fanpage with brand content and to place different posts periodically on these fanpages. In measuring social networking sites' effectiveness, corporate houses now analyze metrics such as the engagement rate and the number of comments, shares and likes on fanpages. It is therefore important for marketers to know the effectiveness of different contents or posts on fanpages in order to increase fan responsiveness and engagement. In the study the authors analyzed a total of 1834 brand posts from 17 international electronics brands. Nine months of data (December 2014 to August 2015), available online on the brands' fan pages, were collected for analysis. An econometric analysis was conducted using EViews 9 to determine the impact of different content types on fanpage engagement. The study picked the four most frequently posted content types to determine their impact on PTA (People Talking About) metrics and fanpage engagement activities.

  12. Network Community Detection on Metric Space

    Directory of Open Access Journals (Sweden)

    Suman Saha

    2015-08-01

    Community detection in a complex network is an important problem of much interest in recent years. In general, a community detection algorithm chooses an objective function and captures the communities of the network by optimizing the objective function, and then one uses various heuristics to solve the optimization problem to extract the interesting communities for the user. In this article, we demonstrate the procedure to transform a graph into points of a metric space and develop the methods of community detection with the help of a metric defined for a pair of points. We have also studied and analyzed the community structure of the network therein. The results obtained with our approach are very competitive with those of most of the well-known algorithms in the literature, and this is justified over a large collection of datasets. On the other hand, it can be observed that the time taken by our algorithm is considerably less than that of other methods, which justifies the theoretical findings.

  13. Value of the Company and Marketing Metrics

    Directory of Open Access Journals (Sweden)

    André Luiz Ramos

    2013-12-01

    Thinking about marketing strategies from a resource-based perspective (Barney, 1991), which treats assets as tangible, organizational or human, and from Constantin and Luch's vision (1994), where strategic resources can be tangible or intangible, internal or external to the firm, raises a research approach linking Marketing and Finance. According to Srivastava, Shervani and Fahey (1998) there are 3 types of market assets which generate firm value. Firm value can be measured by discounted cash flow, committing marketing activities to value generation forecasts (Anderson, 1982; Day, Fahey, 1988; Doyle, 2000; Rust et al., 2004a). The economic value of marketing strategies and marketing metrics is attracting the attention of strategy researchers and marketing managers, making clear the need to build a bridge able to articulate marketing and finance from a strategic perspective. This article proposes an analytical framework based on different scientific approaches involving the risk and return promoted by marketing strategies, and points out advances concerning both methodological approaches and marketing strategies and their impact on firm metrics and value, using Srinivasan and Hanssens (2009) as a starting point.

  14. Defining a standard metric for electricity savings

    International Nuclear Information System (INIS)

    Koomey, Jonathan; Akbari, Hashem; Blumstein, Carl; Brown, Marilyn; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B; Greenberg, Steve

    2010-01-01

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T and D losses. Displacing such a plant for one year would save 3 billion kWh/year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question, Dr Arthur H Rosenfeld.
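
    The stated figures can be reproduced with a short back-of-the-envelope calculation; the emissions intensity of roughly 1 tonne CO2 per MWh for an existing coal plant is an assumption consistent with, but not quoted from, the letter.

    ```python
    # Reproduce the "Rosenfeld" avoided-plant figures from the stated parameters.
    capacity_mw = 500          # existing coal plant
    capacity_factor = 0.70     # fraction of the year at full output
    td_losses = 0.07           # transmission and distribution losses
    co2_per_mwh = 1.0          # assumed tonnes CO2 per MWh (typical for existing coal)

    hours_per_year = 8760
    busbar_mwh = capacity_mw * hours_per_year * capacity_factor   # ~3.07 million MWh generated
    metered_kwh = busbar_mwh * (1 - td_losses) * 1000             # energy delivered at the meter
    co2_tonnes = busbar_mwh * co2_per_mwh                         # avoided emissions

    print(f"Savings at the meter: {metered_kwh / 1e9:.1f} billion kWh/year")            # ~2.9, i.e. about 3
    print(f"Avoided emissions: {co2_tonnes / 1e6:.1f} million metric tons CO2/year")    # ~3.1, i.e. about 3
    ```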

  15. Defining a standard metric for electricity savings

    Energy Technology Data Exchange (ETDEWEB)

    Koomey, Jonathan [Lawrence Berkeley National Laboratory and Stanford University, PO Box 20313, Oakland, CA 94620-0313 (United States); Akbari, Hashem; Blumstein, Carl; Brown, Marilyn; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B; Greenberg, Steve, E-mail: JGKoomey@stanford.ed

    2010-01-15

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T and D losses. Displacing such a plant for one year would save 3 billion kWh/year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question, Dr Arthur H Rosenfeld.

  16. Covariant electrodynamics in linear media: Optical metric

    Science.gov (United States)

    Thompson, Robert T.

    2018-03-01

    While the postulate of covariance of Maxwell's equations for all inertial observers led Einstein to special relativity, it was the further demand of general covariance—form invariance under general coordinate transformations, including between accelerating frames—that led to general relativity. Several lines of inquiry over the past two decades, notably the development of metamaterial-based transformation optics, have spurred a greater interest in the role of geometry and space-time covariance for electrodynamics in ponderable media. I develop a generally covariant, coordinate-free framework for electrodynamics in general dielectric media residing in curved background space-times. In particular, I derive a relation for the spatial medium parameters measured by an arbitrary timelike observer. In terms of those medium parameters I derive an explicit expression for the pseudo-Finslerian optical metric of birefringent media and show how it reduces to a pseudo-Riemannian optical metric for nonbirefringent media. This formulation provides a basis for a unified approach to ray and congruence tracing through media in curved space-times that may smoothly vary among positively refracting, negatively refracting, and vacuum.

  17. Axisymmetric plasma equilibria in a Kerr metric

    Science.gov (United States)

    Elsässer, Klaus

    2001-10-01

    Plasma equilibria near a rotating black hole are considered within the multifluid description. An isothermal two-component plasma with electrons and positrons or ions is determined by four structure functions and the boundary conditions. These structure functions are the Bernoulli function and the toroidal canonical momentum per mass for each species. The quasi-neutrality assumption (no charge density, no toroidal current) allows Maxwell's equations to be solved analytically for any axisymmetric stationary metric, and the fluid equations to be reduced to one single scalar equation for the stream function χ of the positrons or ions, respectively. The basic smallness parameter is the ratio of the skin depth of the electrons to the scale length of the metric and fluid quantities, and, in the case of an electron-ion plasma, the mass ratio m_e/m_i. The χ-equation can be solved by standard methods, and simple solutions for a Kerr geometry are available; they show characteristic flow patterns, depending on the structure functions and the boundary conditions.

  18. Defining a Standard Metric for Electricity Savings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Marilyn; Akbari, Hashem; Blumstein, Carl; Koomey, Jonathan; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H.; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B.; Greenberg, Steve; Hafemeister, David; Harris, Jeff; Harvey, Hal; Heitz, Eric; Hirst, Eric; Hummel, Holmes; Kammen, Dan; Kelly, Henry; Laitner, Skip; Levine, Mark; Lovins, Amory; Masters, Gil; McMahon, James E.; Meier, Alan; Messenger, Michael; Millhone, John; Mills, Evan; Nadel, Steve; Nordman, Bruce; Price, Lynn; Romm, Joe; Ross, Marc; Rufo, Michael; Sathaye, Jayant; Schipper, Lee; Schneider, Stephen H; Sweeney, James L; Verdict, Malcolm; Vorsatz, Diana; Wang, Devra; Weinberg, Carl; Wilk, Richard; Wilson, John; Worrell, Ernst

    2009-03-01

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T&D losses. Displacing such a plant for one year would save 3 billion kW h per year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question, Dr. Arthur H. Rosenfeld.

  19. Strategies for Financing Large-scale Carbon Capture and Storage Power Plants in China

    OpenAIRE

    Liang, X.; Liu, H.; Reiner, D.

    2014-01-01

    Building on previous stakeholder consultations from 2006 to 2010, we conduct a financial analysis for a generic CCS power plant in China. In comparison with conventional thermal generation technologies, a coal-fired power plant with CCS requires either a 70% higher on-grid electricity tariff or carbon price support of approximately US$50/tonne CO2 in the absence of any other incentive mechanisms or financing strategies. Given the difficulties of relying on any one single measure to finance a ...

  20. Comprehensive Metric Education Project: Implementing Metrics at a District Level Administrative Guide.

    Science.gov (United States)

    Borelli, Michael L.

    This document details the administrative issues associated with guiding a school district through its metrication efforts. Issues regarding staff development, curriculum development, and the acquisition of instructional resources are considered. Alternative solutions are offered. Finally, an overall implementation strategy is discussed with…

  1. Biochar production for carbon sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Thakkar, J.; Kumar, A. [Alberta Univ., Edmonton, AB (Canada). Dept. of Mechanical Engineering

    2010-07-01

    This study examined the use of agricultural biomass for biochar production and its storage in a landfill to sequester carbon. Capturing the energy from biomass that would otherwise decay is among the many options available to mitigate the impact of the greenhouse gas (GHG) emissions associated with fossil fuel consumption. Biochar is a solid fuel which can be produced from agricultural biomass such as wheat and barley straw. This organic solid can be produced by slow pyrolysis of straw. A conceptual techno-economic model based on actual data was used to estimate the cost of producing biochar from straw in a centralized plant. The objectives of the study were to estimate the overall delivered cost of straw to the charcoal production plant; estimate the transportation costs of charcoal to the landfill site; estimate the cost of landfill; and estimate the overall cost of carbon sequestration through a charcoal landfill. According to preliminary results, the cost of carbon sequestration through this pathway is greater than $50 per tonne of carbon dioxide.
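
    The structure of such a cost roll-up can be illustrated with a toy calculation; every number below is a placeholder rather than a value from the study, and the carbon fraction of biochar is likewise an assumption.

    ```python
    # Toy roll-up of a biochar-to-landfill sequestration cost (all inputs hypothetical).
    straw_delivered = 60.0          # $/tonne biochar: straw procurement and delivery
    pyrolysis = 80.0                # $/tonne biochar: conversion (capital + operating)
    transport_to_landfill = 15.0    # $/tonne biochar
    landfill = 20.0                 # $/tonne biochar: placement and cover

    carbon_fraction = 0.7           # assumed carbon mass fraction of biochar
    co2_per_tonne_char = carbon_fraction * 44.0 / 12.0   # tonnes CO2 sequestered per tonne biochar

    total_cost = straw_delivered + pyrolysis + transport_to_landfill + landfill
    print(f"Sequestration cost: ${total_cost / co2_per_tonne_char:.0f} per tonne CO2")
    ```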

  2. Social Media Metrics Importance and Usage Frequency in Latvia

    Directory of Open Access Journals (Sweden)

    Ronalds Skulme

    2017-12-01

    Purpose of the article: The purpose of this paper was to explore which social media marketing metrics are most often used and most important for marketing experts in Latvia and can be used to evaluate marketing campaign effectiveness. Methodology/methods: In order to achieve the aim of this paper, several theoretical and practical research methods were used, such as theoretical literature analysis, surveying and grouping. First, theoretical research about social media metrics was conducted. The authors collected information about social media metric grouping methods and the social media metrics mentioned most frequently in the literature. The collected information was used as the foundation for the expert surveys. The expert surveys were used to collect information from Latvian marketing professionals to determine which social media metrics are used most often and which are most important in Latvia. Scientific aim: The scientific aim of this paper was to identify whether the importance of social media metrics varies depending on the consumer purchase decision stage. Findings: Information about the most important and most often used social media marketing metrics in Latvia was collected. A new social media grouping framework is proposed. Conclusions: The main conclusion is that the importance and usage frequency of social media metrics change depending on the consumer purchase decision stage the metric is used to evaluate.

  3. Antarctic sea ice losses drive gains in benthic carbon drawdown.

    Science.gov (United States)

    Barnes, D K A

    2015-09-21

    Climate forcing of sea-ice losses from the Arctic and West Antarctic are blueing the poles. These losses are accelerating, reducing Earth's albedo and increasing heat absorption. Subarctic forest (area expansion and increased growth) and ice-shelf losses (resulting in new phytoplankton blooms which are eaten by benthos) are the only significant described negative feedbacks acting to counteract the effects of increasing CO2 on a warming planet, together accounting for uptake of ∼10^7 tonnes of carbon per year. Most sea-ice loss to date has occurred over polar continental shelves, which are richly, but patchily, colonised by benthic animals. Most polar benthos feeds on microscopic algae (phytoplankton), which has shown increased blooms coincident with sea-ice losses. Here, growth responses of Antarctic shelf benthos to sea-ice losses and phytoplankton increases were investigated. Analysis of two decades of benthic collections showed strong increases in annual production of shelf seabed carbon in West Antarctic bryozoans. These were calculated to have nearly doubled to >2x10^5 tonnes of carbon per year since the 1980s. Annual production of bryozoans is median within wider Antarctic benthos, so upscaling to include other benthos (combined study species typically constitute ∼3% benthic biomass) suggests an increased drawdown of ∼2.9x10^6 tonnes of carbon per year. This drawdown could become sequestration because polar continental shelves are typically deeper than most modern iceberg scouring, bacterial breakdown rates are slow, and benthos is easily buried. To date, most sea-ice losses have been Arctic, so, if hyperboreal benthos shows a similar increase in drawdown, polar continental shelves would represent Earth's largest negative feedback to climate change. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Green Net Value Added as a Sustainability Metric Based on ...

    Science.gov (United States)

    Sustainability measurement in economics involves evaluation of environmental and economic impact in an integrated manner. In this study, system level economic data are combined with environmental impact from a life cycle assessment (LCA) of a common product. We are exploring a costing approach that captures traditional costs but also incorporates externality costs to provide a convenient, easily interpretable metric. Green Net Value Added (GNVA) is a type of full cost accounting that incorporates total revenue, the cost of materials and services, depreciation, and environmental externalities. Two, but not all, of the potential environmental impacts calculated by the standard LCIA method (TRACI) could be converted to externality cost values. We compute externality costs disaggregated by upstream sectors, full cost, and GNVA to evaluate the relative sustainability of Bounty® paper towels manufactured at two production facilities. We found that the longer running, more established line had a higher GNVA than the newer line. The dominant factors contributing to externality costs are calculated to come from the stationary sources in the supply chain: electricity generation (27-35%), refineries (20-21%), pulp and paper making (15-23%). Health related externalities from Particulate Matter (PM2.5) and Carbon Dioxide equivalent (CO2e) emissions appear largely driven by electricity usage and emissions by the facilities, followed by pulp processing and transport. Supply

  5. The impact of future carbon prices on CCS investment for power generation in China

    International Nuclear Information System (INIS)

    Wu, Ning; Parsons, John E.; Polenske, Karen R.

    2013-01-01

    Carbon capture and storage (CCS) in China is currently discussed extensively, but few in-depth analyses focusing on its economics are available. In this study, we answer two related questions about the development of CCS and power generation technologies in China: (1) what is the breakeven carbon-dioxide price to justify CCS installation investment for Integrated Gasification Combined Cycle (IGCC) and pulverized coal (PC) power plants, and, (2) what are the risks associated with investment in CCS. To answer these questions, we build a net present value model for IGCC and PC plants with a capacity of 600 MW, with assumptions best representing the current technologies in China. Then, we run a sensitivity analysis of capital costs and fuel costs to reveal their impact on the carbon price, and analyze the risk to investment returns caused by carbon price volatility. Our study shows that in China, a breakeven carbon price of $61/tonne is required to justify investment in CCS for PC plants, and $72/tonne for IGCC plants. In this analysis, we also advise investors on the impact of capital and fuel costs on the carbon price and suggest optimal timing for CCS investment. - Highlights: ► We collect data on CCS and power generation which best represents technologies and costs in China. ► We model power plants' net present value to find the breakeven carbon prices. ► IGCC needs $72 per tonne to break even while PC requires $61 in China. ► Capital and fuel costs impact the carbon prices noticeably. ► We also examine the sensitivity, impact on return and time for investment
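
    A minimal sketch of how such a breakeven price can be backed out of a net present value model: find the carbon price at which the discounted payments for avoided CO2 offset the discounted extra cost of adding CCS. The plant parameters below are placeholders, not the study's inputs.

    ```python
    # Breakeven carbon price for adding CCS to a plant (illustrative inputs only).
    def breakeven_carbon_price(extra_capex, extra_opex_per_year, co2_avoided_per_year,
                               lifetime_years, discount_rate):
        """Carbon price ($/tonne) at which the NPV of CCS revenues equals the NPV of CCS costs."""
        annuity = sum(1.0 / (1.0 + discount_rate) ** t for t in range(1, lifetime_years + 1))
        extra_cost_npv = extra_capex + extra_opex_per_year * annuity
        avoided_co2_npv = co2_avoided_per_year * annuity          # discounted tonnes of CO2 avoided
        return extra_cost_npv / avoided_co2_npv

    price = breakeven_carbon_price(
        extra_capex=900e6,            # additional capital cost for capture and compression ($)
        extra_opex_per_year=60e6,     # additional O&M and energy-penalty cost ($/year)
        co2_avoided_per_year=3.0e6,   # tonnes CO2 avoided per year
        lifetime_years=30,
        discount_rate=0.08,
    )
    print(f"Breakeven carbon price: ${price:.0f}/tonne CO2")
    ```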

  6. Measurable Control System Security through Ideal Driven Technical Metrics

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Sean McBride; Marie Farrar; Zachary Tudor

    2008-01-01

    The Department of Homeland Security National Cyber Security Division supported development of a small set of security ideals as a framework to establish measurable control systems security. Based on these ideals, a draft set of proposed technical metrics was developed to allow control systems owner-operators to track improvements or degradations in their individual control systems security posture. The technical metrics development effort included review and evaluation of over thirty metrics-related documents. On the basis of complexity, ambiguity, or misleading and distorting effects, the metrics identified during the reviews were determined to be weaker than necessary to aid defense against the myriad threats posed by cyber-terrorism to human safety, as well as to economic prosperity. Using the results of our metrics review and the set of security ideals as a starting point for metrics development, we identified thirteen potential technical metrics - with at least one metric supporting each ideal. Two case study applications of the ideals and thirteen metrics to control systems were then performed to establish potential difficulties in applying both the ideals and the metrics. The case studies resulted in no changes to the ideals, and only a few deletions and refinements to the thirteen potential metrics. This led to a final proposed set of ten core technical metrics. To further validate the security ideals, the modifications made to the original thirteen potential metrics, and the final proposed set of ten core metrics, seven separate control systems security assessments performed over the past three years were reviewed for findings and recommended mitigations. These findings and mitigations were then mapped to the security ideals and metrics to assess gaps in their coverage. The mappings indicated that there are no gaps in the security ideals and that the ten core technical metrics provide significant coverage of standard security issues with 87% coverage. Based

  7. Towards a meaningful metric for the quantification of GHG emissions of electric vehicles (EVs)

    International Nuclear Information System (INIS)

    Manjunath, Archana; Gross, George

    2017-01-01

    A key motivator for wider deployment of electric vehicles (EVs) – vehicles that are fully powered by a battery charged from grid electricity – is to bring about environmental cleanliness. This goal is based on the fact that EVs produce zero tailpipe emissions. However, the generation and transmission of the charging electricity produce associated carbon emissions that are not explicitly accounted for by current measurement metrics for EV greenhouse gas (GHG) emissions, and as such the notion of environmental cleanliness of EVs becomes questionable. In this paper, we propose a comprehensive metric to quantify the actual environmental impacts of EVs. The new metric, which we call the electric vehicle emissions index (EVEI), captures CO2 emissions in the electricity production-to-consumption stages. Our metric is the first that provides transparency in the comparison of total emissions among various EV models, as well as in the side-by-side comparison of an EV with a gasoline vehicle (GV). Illustrative results indicate that the actual environmental impacts of an EV may show wide spatial variations and, in some cases, these impacts may be even greater than those of a GV. Such insights that the EVEI provides may be useful in a wide range of applications, particularly in policy and incentive formulation. - Highlights: • We propose the Electric Vehicle Emission Index (EVEI) metric. • EVEI indicates the EV environmental impacts w.r.t gasoline vehicles (GVs). • Fuel economy and resource mix are the major contributors to emissions. • Results indicate EVs may prove to be dirtier than GVs in certain areas of usage. • Insights may prove to be valuable to policy and incentive formulation.
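
    The comparison the EVEI enables can be sketched as a simplified per-kilometre emissions calculation (not the paper's exact index); the grid intensities, vehicle efficiencies and loss factors below are assumptions for illustration.

    ```python
    # Simplified per-kilometre CO2 comparison of an EV against a gasoline vehicle (GV).
    def ev_gco2_per_km(kwh_per_100km, grid_gco2_per_kwh, td_loss=0.06, charge_eff=0.90):
        """Upstream CO2 for an EV, including charging and T&D losses."""
        kwh_at_plant = (kwh_per_100km / 100.0) / (charge_eff * (1.0 - td_loss))
        return kwh_at_plant * grid_gco2_per_kwh

    def gv_gco2_per_km(litres_per_100km, gco2_per_litre=2310.0):
        """Tailpipe CO2 for a gasoline vehicle (~2.31 kg CO2 per litre of gasoline)."""
        return (litres_per_100km / 100.0) * gco2_per_litre

    print(f"EV on a coal-heavy grid : {ev_gco2_per_km(18.0, 900.0):.0f} g CO2/km")
    print(f"EV on a low-carbon grid : {ev_gco2_per_km(18.0, 150.0):.0f} g CO2/km")
    print(f"Gasoline vehicle        : {gv_gco2_per_km(7.0):.0f} g CO2/km")
    # On a coal-heavy mix the EV can exceed the GV, illustrating the spatial variation noted above.
    ```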

  8. Comparison of luminance based metrics in different lighting conditions

    DEFF Research Database (Denmark)

    Wienold, J.; Kuhn, T.E.; Christoffersen, J.

    In this study, we evaluate established and newly developed metrics for predicting glare using data from three different research studies. The evaluation covers two different targets: 1. How well does the user’s perception of glare magnitude correlate with the prediction of the glare metrics? 2. How well do the glare metrics describe the subjects’ disturbance by glare? We applied Spearman correlations, logistic regressions and an accuracy evaluation based on an ROC analysis. The results show that five of the twelve investigated metrics fail at least one of the statistical tests. The other seven metrics, CGI, modified DGI, DGP, Ev, average luminance of the image Lavg, UGP and UGR, pass all statistical tests. DGP, CGI, DGI_mod and UGP have the largest AUC and might be slightly more robust. The accuracy of the predictions of the aforementioned seven metrics for the disturbance by glare lies...

  9. Performance metrics for the evaluation of hyperspectral chemical identification systems

    Science.gov (United States)

    Truslow, Eric; Golowich, Steven; Manolakis, Dimitris; Ingle, Vinay

    2016-02-01

    Remote sensing of chemical vapor plumes is a difficult but important task for many military and civilian applications. Hyperspectral sensors operating in the long-wave infrared regime have well-demonstrated detection capabilities. However, the identification of a plume's chemical constituents, based on a chemical library, is a multiple hypothesis testing problem which standard detection metrics do not fully describe. We propose using an additional performance metric for identification based on the so-called Dice index. Our approach partitions and weights a confusion matrix to develop both the standard detection metrics and identification metric. Using the proposed metrics, we demonstrate that the intuitive system design of a detector bank followed by an identifier is indeed justified when incorporating performance information beyond the standard detection metrics.
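
    The Dice index itself is a standard set-overlap measure; as a sketch (not the paper's full confusion-matrix partitioning and weighting scheme), it can be computed per library chemical from identification counts as follows.

        # Dice index for one library chemical from identification counts.
        # The paper partitions and weights a full confusion matrix; this only
        # illustrates the underlying Dice formula 2*TP / (2*TP + FP + FN).

        def dice_index(true_pos, false_pos, false_neg):
            denom = 2 * true_pos + false_pos + false_neg
            return 2 * true_pos / denom if denom else 0.0

        # Example: 80 correct identifications, 15 spurious, 10 missed.
        print(dice_index(80, 15, 10))  # ~0.86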

  10. Metrics correlation and analysis service (MCAS)

    International Nuclear Information System (INIS)

    Baranovski, Andrew; Dykstra, Dave; Garzoglio, Gabriele; Hesselroth, Ted; Mhashilkar, Parag; Levshina, Tanya

    2010-01-01

    The complexity of Grid workflow activities and their associated software stacks inevitably involves multiple organizations, ownership, and deployment domains. In this setting, important and common tasks such as the correlation and display of metrics and debugging information (fundamental ingredients of troubleshooting) are challenged by the informational entropy inherent to independently maintained and operated software components. Because such an information pool is disorganized, it is a difficult environment for business intelligence analysis i.e. troubleshooting, incident investigation, and trend spotting. The mission of the MCAS project is to deliver a software solution to help with adaptation, retrieval, correlation, and display of workflow-driven data and of type-agnostic events, generated by loosely coupled or fully decoupled middleware.

  11. Metrics correlation and analysis service (MCAS)

    International Nuclear Information System (INIS)

    Baranovski, Andrew; Dykstra, Dave; Garzoglio, Gabriele; Hesselroth, Ted; Mhashilkar, Parag; Levshina, Tanya

    2009-01-01

    The complexity of Grid workflow activities and their associated software stacks inevitably involves multiple organizations, ownership, and deployment domains. In this setting, important and common tasks such as the correlation and display of metrics and debugging information (fundamental ingredients of troubleshooting) are challenged by the informational entropy inherent to independently maintained and operated software components. Because such an information 'pond' is disorganized, it is a difficult environment for business intelligence analysis, i.e. troubleshooting, incident investigation and trend spotting. The mission of the MCAS project is to deliver a software solution to help with adaptation, retrieval, correlation, and display of workflow-driven data and of type-agnostic events, generated by disjoint middleware.

  12. Development of Technology Transfer Economic Growth Metrics

    Science.gov (United States)

    Mastrangelo, Christina M.

    1998-01-01

    The primary objective of this project is to determine the feasibility of producing technology transfer metrics that answer the question: Do NASA/MSFC technical assistance activities impact economic growth? The data for this project resides in a 7800-record database maintained by Tec-Masters, Incorporated. The technology assistance data results from survey responses from companies and individuals who have interacted with NASA via a Technology Transfer Agreement, or TTA. The goal of this project was to determine if the existing data could provide indications of increased wealth. This work demonstrates that there is evidence that companies that used NASA technology transfer have a higher job growth rate than the rest of the economy. It also shows that the jobs being supported are jobs in higher wage SIC codes, and this indicates improvements in personal wealth. Finally, this work suggests that with correct data, the wealth issue may be addressed.

  13. MESUR metrics from scholarly usage of resources

    CERN Document Server

    CERN. Geneva; Van de Sompel, Herbert

    2007-01-01

    Usage data is increasingly regarded as a valuable resource in the assessment of scholarly communication items. However, the development of quantitative, usage-based indicators of scholarly impact is still in its infancy. The Digital Library Research & Prototyping Team at the Los Alamos National Laboratory's Research library has therefore started a program to expand the set of usage-based tools for the assessment of scholarly communication items. The two-year MESUR project, funded by the Andrew W. Mellon Foundation, aims to define and validate a range of usage-based impact metrics, and issue guidelines with regards to their characteristics and proper application. The MESUR project is constructing a large-scale semantic model of the scholarly community that seamlessly integrates a wide range of bibliographic, citation and usage data. Functioning as a reference data set, this model is analyzed to characterize the intricate networks of typed relationships that exist in the scholarly community. The resulting c...

  14. Einstein metrics and Brans-Dicke superfields

    International Nuclear Information System (INIS)

    Marques, S.

    1988-01-01

    A space conformal to the Einstein space-time is obtained here by making the transition from an internal bosonic space, constructed with the constant Majorana spinors in the Majorana representation, to a bosonic ''superspace'' through the use of Einstein vierbeins. These spaces are related to a Grassmann space constructed with the Majorana spinors referred to above, where the ''metric'' is a function of internal bosonic coordinates. The conformal function is a scale factor in the zone of gravitational radiation. A conformal function dependent on space-time coordinates can be constructed in that region when we introduce Majorana spinors which are functions of those coordinates. With this we obtain a scalar field of Brans-Dicke type. 11 refs

  15. Advanced reactors: the case for metric design

    International Nuclear Information System (INIS)

    Ruby, L.

    1986-01-01

    The author argues that DOE should insist that all design specifications for advanced reactors be in the International System of Units (SI) in accordance with the Metric Conversion Act of 1975. Despite a lack of leadership from the federal government, industry has had to move toward conversion in order to compete on world markets. The US is the only major country without a scheduled conversion program. SI avoids the disadvantages of ambiguous names, non-coherent units, multiple units for the same quantity, multiple definitions, as well as barriers to international exchange and marketing and problems in comparing safety and code parameters. With a first step by DOE, the Nuclear Regulatory Commission should add the same requirements to reactor licensing guidelines. 4 references

  16. Analytical Cost Metrics : Days of Future Past

    Energy Technology Data Exchange (ETDEWEB)

    Prajapati, Nirmal [Colorado State Univ., Fort Collins, CO (United States); Rajopadhye, Sanjay [Colorado State Univ., Fort Collins, CO (United States); Djidjev, Hristo Nikolov [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-20

    As we move towards the exascale era, the new architectures must be capable of running the massive computational problems efficiently. Scientists and researchers are continuously investing in tuning the performance of extreme-scale computational problems. These problems arise in almost all areas of computing, ranging from big data analytics, artificial intelligence, search, machine learning, virtual/augmented reality, computer vision, image/signal processing to computational science and bioinformatics. With Moore’s law driving the evolution of hardware platforms towards exascale, the dominant performance metric (time efficiency) has now expanded to also incorporate power/energy efficiency. Therefore the major challenge that we face in computing systems research is: “how to solve massive-scale computational problems in the most time/power/energy efficient manner?”

  17. Clean Cities 2013 Annual Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, C.; Singer, M.

    2014-10-01

    Each year, the U.S. Department of Energy asks its Clean Cities program coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterize the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction (IR) initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this 2013 Annual Metrics Report.

  18. Clean Cities 2014 Annual Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Caley [National Renewable Energy Lab. (NREL), Golden, CO (United States); Singer, Mark [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-12-22

    Each year, the U.S. Department of Energy asks its Clean Cities program coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterize the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction (IR) initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this 2014 Annual Metrics Report.

  19. Outsourced similarity search on metric data assets

    KAUST Repository

    Yiu, Man Lung

    2012-02-01

    This paper considers a cloud computing setting in which similarity querying of metric data is outsourced to a service provider. The data is to be revealed only to trusted users, not to the service provider or anyone else. Users query the server for the most similar data objects to a query example. Outsourcing offers the data owner scalability and a low-initial investment. The need for privacy may be due to the data being sensitive (e.g., in medicine), valuable (e.g., in astronomy), or otherwise confidential. Given this setting, the paper presents techniques that transform the data prior to supplying it to the service provider for similarity queries on the transformed data. Our techniques provide interesting trade-offs between query cost and accuracy. They are then further extended to offer an intuitive privacy guarantee. Empirical studies with real data demonstrate that the techniques are capable of offering privacy while enabling efficient and accurate processing of similarity queries.

  20. Special metrics and group actions in geometry

    CERN Document Server

    Fino, Anna; Musso, Emilio; Podestà, Fabio; Vezzoni, Luigi

    2017-01-01

    The volume is a follow-up to the INdAM meeting “Special metrics and quaternionic geometry” held in Rome in November 2015. It offers a panoramic view of a selection of cutting-edge topics in differential geometry, including 4-manifolds, quaternionic and octonionic geometry, twistor spaces, harmonic maps, spinors, complex and conformal geometry, homogeneous spaces and nilmanifolds, special geometries in dimensions 5–8, gauge theory, symplectic and toric manifolds, exceptional holonomy and integrable systems. The workshop was held in honor of Simon Salamon, a leading international scholar at the forefront of academic research who has made significant contributions to all these subjects. The articles published here represent a compelling testimony to Salamon’s profound and longstanding impact on the mathematical community. Target readership includes graduate students and researchers working in Riemannian and complex geometry, Lie theory and mathematical physics.

  1. Recommended metric for tracking visibility progress in the Regional Haze Rule.

    Science.gov (United States)

    Gantt, Brett; Beaver, Melinda; Timin, Brian; Lorang, Phil

    2018-05-01

    For many national parks and wilderness areas with special air quality protections (Class I areas) in the western United States (U.S.), wildfire smoke and dust events can have a large impact on visibility. The U.S. Environmental Protection Agency's (EPA) 1999 Regional Haze Rule used the 20% haziest days to track visibility changes over time even if they are dominated by smoke or dust. Visibility on the 20% haziest days has remained constant or degraded over the last 16 yr at some Class I areas despite widespread emission reductions from anthropogenic sources. To better track visibility changes specifically associated with anthropogenic pollution sources rather than natural sources, the EPA has revised the Regional Haze Rule to track visibility on the 20% most anthropogenically impaired (hereafter, most impaired) days rather than the haziest days. To support the implementation of this revised requirement, the EPA has proposed (but not finalized) a recommended metric for characterizing the anthropogenic and natural portions of the daily extinction budget at each site. This metric selects the 20% most impaired days based on these portions using a "delta deciview" approach to quantify the deciview scale impact of anthropogenic light extinction. Using this metric, sulfate and nitrate make up the majority of the anthropogenic extinction in 2015 on these days, with natural extinction largely made up of organic carbon mass in the eastern U.S. and a combination of organic carbon mass, dust components, and sea salt in the western U.S. For sites in the western U.S., the seasonality of days selected as the 20% most impaired is different than the seasonality of the 20% haziest days, with many more winter and spring days selected. Applying this new metric to the 2000-2015 period across sites representing Class I areas results in substantial changes in the calculated visibility trend for the northern Rockies and southwest U.S., but little change for the eastern U.S.
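
    For reference, the deciview haze index underlying the rule is dv = 10 ln(b_ext / 10 Mm^-1); a minimal sketch of a "delta deciview" style calculation follows, with a purely hypothetical split of a day's extinction budget into natural and anthropogenic parts (not values from the rule or the paper).

        import math

        # Deciview haze index and a "delta deciview" style impairment value.
        # Extinction is in inverse megameters (Mm^-1); the natural/anthropogenic
        # split below is hypothetical.

        def deciview(b_ext_mm1):
            """Haze index in deciviews for total light extinction b_ext (Mm^-1)."""
            return 10.0 * math.log(b_ext_mm1 / 10.0)

        def delta_deciview(b_natural, b_anthropogenic):
            """Deciview impact attributable to the anthropogenic extinction."""
            return deciview(b_natural + b_anthropogenic) - deciview(b_natural)

        # Example day: 30 Mm^-1 natural (smoke/dust) plus 20 Mm^-1 anthropogenic.
        print(round(delta_deciview(30.0, 20.0), 2))  # ~5.11 dv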

  2. Quasi-metrics, midpoints and applications

    Energy Technology Data Exchange (ETDEWEB)

    Valero, O.

    2017-07-01

    In applied sciences, the scientific community uses simultaneously different kinds of information coming from several sources in order to infer a conclusion or working decision. In the literature there are many techniques for merging the information and providing, hence, meaningful fused data. In most practical cases such fusion methods are based on aggregation operators on some numerical values, i.e. the aim of the fusion process is to obtain a representative number from a finite sequence of numerical data. In the aforementioned cases, the input data presents some kind of imprecision and for this reason it is represented as fuzzy sets. Moreover, in such problems the comparisons between the numerical values that represent the information described by the fuzzy sets become necessary. The aforementioned comparisons are made by means of a distance defined on fuzzy sets. Thus, the numerical operators aggregating distances between fuzzy sets as incoming data play a central role in applied problems. Recently, J.J. Nieto and A. Torres gave some applications of the aggregation of distances on fuzzy sets to the study of real medical data [Nieto]. These applications are based on the notion of segment joining two given fuzzy sets and on the notion of set of midpoints between fuzzy sets. A few results obtained by Nieto and Torres have been generalized in turn by Casasnovas and Rosselló [Casas, Casas2]. Nowadays, quasi-metrics provide efficient tools in some fields of computer science and in bioinformatics. Motivated by the exposed facts, a study of segments joining two fuzzy sets and of midpoints between fuzzy sets when the measure used for comparisons is a quasi-metric has been made in [Casas3, SebVal2013, TiradoValero]. (Author)

  3. Mountaineer Commercial Scale Carbon Capture and Storage (CCS) Project

    Energy Technology Data Exchange (ETDEWEB)

    Deanna Gilliland; Matthew Usher

    2011-12-31

    The Final Technical Report documents all work performed during the award period on the Mountaineer Commercial Scale Carbon Capture & Storage project. This report presents the findings and conclusions produced as a consequence of this work. As identified in the Cooperative Agreement DE-FE0002673, AEP's objective of the Mountaineer Commercial Scale Carbon Capture and Storage (MT CCS II) project is to design, build and operate a commercial scale carbon capture and storage (CCS) system capable of treating a nominal 235 MWe slip stream of flue gas from the outlet duct of the Flue Gas Desulfurization (FGD) system at AEP's Mountaineer Power Plant (Mountaineer Plant), a 1300 MWe coal-fired generating station in New Haven, WV. The CCS system is designed to capture 90% of the CO{sub 2} from the incoming flue gas using the Alstom Chilled Ammonia Process (CAP) and compress, transport, inject and store 1.5 million tonnes per year of the captured CO{sub 2} in deep saline reservoirs. Specific Project Objectives include: (1) Achieve a minimum of 90% carbon capture efficiency during steady-state operations; (2) Demonstrate progress toward capture and storage at less than a 35% increase in cost of electricity (COE); (3) Store CO{sub 2} at a rate of 1.5 million tonnes per year in deep saline reservoirs; and (4) Demonstrate commercial technology readiness of the integrated CO{sub 2} capture and storage system.
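
    As a rough sanity check only (the capacity factor and coal emission intensity below are assumed values, not figures from the report), the stated 1.5 million tonnes per year is consistent with 90% capture on a nominal 235 MWe slip stream:

        # Back-of-envelope check of the 1.5 Mt/yr figure (assumed inputs, not
        # values from the report).

        SLIP_STREAM_MWE = 235        # nominal slip stream treated by the CAP system
        CAPACITY_FACTOR = 0.85       # assumed plant availability
        EMISSION_INTENSITY = 0.95    # assumed tonnes CO2 per MWh for coal generation
        CAPTURE_EFFICIENCY = 0.90    # design capture efficiency from the abstract

        hours_per_year = 8760
        co2_captured = (SLIP_STREAM_MWE * CAPACITY_FACTOR * hours_per_year
                        * EMISSION_INTENSITY * CAPTURE_EFFICIENCY)
        print(f"{co2_captured / 1e6:.2f} million tonnes CO2 per year")  # ~1.50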

  4. Prospective life cycle carbon abatement for pyrolysis biochar systems in the UK

    International Nuclear Information System (INIS)

    Hammond, Jim; Shackley, Simon; Sohi, Saran; Brownsort, Peter

    2011-01-01

    Life cycle assessment (LCA) of slow pyrolysis biochar systems (PBS) in the UK for small, medium and large scale process chains and ten feedstocks was performed, assessing carbon abatement and electricity production. Pyrolysis biochar systems appear to offer greater carbon abatement than other bioenergy systems. Carbon abatement of 0.7-1.3 t CO2 equivalent per oven dry tonne of feedstock processed was found. In terms of delivered energy, medium to large scale PBS abates 1.4-1.9 t CO2e/MWh, which compares to average carbon emissions of 0.05-0.30 t CO2e/MWh for other bioenergy systems. The largest contribution to PBS carbon abatement is from the feedstock carbon stabilised in biochar (40-50%), followed by the less certain indirect effects of biochar in the soil (25-40%) - mainly due to an increase in soil organic carbon levels. Change in soil organic carbon levels was found to be a key sensitivity. Electricity production off-setting emissions from fossil fuels accounted for 10-25% of carbon abatement. The LCA suggests that provided 43% of the carbon in the biochar remains stable, PBS will out-perform direct combustion of biomass at 33% efficiency in terms of carbon abatement, even if there is no beneficial effect upon soil organic carbon levels from biochar application. - Research highlights: → Biochar systems offer greater carbon abatement than combustion or gasification. → Carbon abatement of 0.7-1.4 t CO2e/dry tonne of feedstock processed was found. → Change in soil organic carbon stocks induced by biochar is the key sensitivity. → Biochar systems produce less electricity than combustion or gasification.

  5. Analytic convergence of harmonic metrics for parabolic Higgs bundles

    Science.gov (United States)

    Kim, Semin; Wilkin, Graeme

    2018-04-01

    In this paper we investigate the moduli space of parabolic Higgs bundles over a punctured Riemann surface with varying weights at the punctures. We show that the harmonic metric depends analytically on the weights and the stable Higgs bundle. This gives a Higgs bundle generalisation of a theorem of McOwen on the existence of hyperbolic cone metrics on a punctured surface within a given conformal class, and a generalisation of a theorem of Judge on the analytic parametrisation of these metrics.

  6. Exact solutions of strong gravity in generalized metrics

    International Nuclear Information System (INIS)

    Hojman, R.; Smailagic, A.

    1981-05-01

    We consider classical solutions for the strong gravity theory of Salam and Strathdee in a wider class of metrics with positive, zero and negative curvature. It turns out that such solutions exist and their relevance for quark confinement is explored. Only metrics with positive curvature (spherical symmetry) give a confining potential in a simple picture of the scalar hadron. This supports the idea of describing the hadron as a closed microuniverse of the strong metric. (author)

  7. An accurate metric for the spacetime around neutron stars

    OpenAIRE

    Pappas, George

    2016-01-01

    The problem of having an accurate description of the spacetime around neutron stars is of great astrophysical interest. For astrophysical applications, one needs to have a metric that captures all the properties of the spacetime around a neutron star. Furthermore, an accurate appropriately parameterised metric, i.e., a metric that is given in terms of parameters that are directly related to the physical structure of the neutron star, could be used to solve the inverse problem, which is to inf...

  8. Problems in Systematic Application of Software Metrics and Possible Solution

    OpenAIRE

    Rakic, Gordana; Budimac, Zoran

    2013-01-01

    Systematic application of software metric techniques can lead to significant improvements of the quality of a final software product. However, there is still the evident lack of wider utilization of software metrics techniques and tools due to many reasons. In this paper we investigate some limitations of contemporary software metrics tools and then propose construction of a new tool that would solve some of the problems. We describe the promising prototype, its internal structure, and then f...

  9. Two-dimensional manifolds with metrics of revolution

    International Nuclear Information System (INIS)

    Sabitov, I Kh

    2000-01-01

    This is a study of the topological and metric structure of two-dimensional manifolds with a metric that is locally a metric of revolution. In the case of compact manifolds this problem can be thoroughly investigated, and in particular it is explained why there are no closed analytic surfaces of revolution in R^3 other than a sphere and a torus (moreover, in the smoothness class C^∞ such surfaces, understood in a certain generalized sense, exist in any topological class)

  10. A software quality model and metrics for risk assessment

    Science.gov (United States)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the basis for a discussion on risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.

  11. Chaos of discrete dynamical systems in complete metric spaces

    International Nuclear Information System (INIS)

    Shi Yuming; Chen Guanrong

    2004-01-01

    This paper is concerned with chaos of discrete dynamical systems in complete metric spaces. Discrete dynamical systems governed by continuous maps in general complete metric spaces are first discussed, and two criteria of chaos are then established. As a special case, two corresponding criteria of chaos for discrete dynamical systems in compact subsets of metric spaces are obtained. These results have extended and improved the existing relevant results of chaos in finite-dimensional Euclidean spaces

  12. Development of quality metrics for ambulatory pediatric cardiology: Infection prevention.

    Science.gov (United States)

    Johnson, Jonathan N; Barrett, Cindy S; Franklin, Wayne H; Graham, Eric M; Halnon, Nancy J; Hattendorf, Brandy A; Krawczeski, Catherine D; McGovern, James J; O'Connor, Matthew J; Schultz, Amy H; Vinocur, Jeffrey M; Chowdhury, Devyani; Anderson, Jeffrey B

    2017-12-01

    In 2012, the American College of Cardiology's (ACC) Adult Congenital and Pediatric Cardiology Council established a program to develop quality metrics to guide ambulatory practices for pediatric cardiology. The council chose five areas on which to focus their efforts: chest pain, Kawasaki Disease, tetralogy of Fallot, transposition of the great arteries after arterial switch, and infection prevention. Here, we sought to describe the process, evaluation, and results of the Infection Prevention Committee's metric design process. The infection prevention metrics team consisted of 12 members from 11 institutions in North America. The group agreed to work on specific infection prevention topics including antibiotic prophylaxis for endocarditis, rheumatic fever, and asplenia/hyposplenism; influenza vaccination and respiratory syncytial virus prophylaxis (palivizumab); preoperative methods to reduce intraoperative infections; vaccinations after cardiopulmonary bypass; hand hygiene; and testing to identify splenic function in patients with heterotaxy. An extensive literature review was performed. When available, previously published guidelines were used fully in determining metrics. The committee chose eight metrics to submit to the ACC Quality Metric Expert Panel for review. Ultimately, metrics regarding hand hygiene and influenza vaccination recommendation for patients did not pass the RAND analysis. Both endocarditis prophylaxis metrics and the RSV/palivizumab metric passed the RAND analysis but fell out during the open comment period. Three metrics passed all analyses, including those for antibiotic prophylaxis in patients with heterotaxy/asplenia, for influenza vaccination compliance in healthcare personnel, and for adherence to recommended regimens of secondary prevention of rheumatic fever. The lack of convincing data to guide quality improvement initiatives in pediatric cardiology is widespread, particularly in infection prevention. Despite this, three metrics were

  13. Carbon sequestration research and development

    Energy Technology Data Exchange (ETDEWEB)

    Reichle, Dave; Houghton, John; Kane, Bob; Ekmann, Jim; and others

    1999-12-31

    Predictions of global energy use in the next century suggest a continued increase in carbon emissions and rising concentrations of carbon dioxide (CO{sub 2}) in the atmosphere unless major changes are made in the way we produce and use energy--in particular, how we manage carbon. For example, the Intergovernmental Panel on Climate Change (IPCC) predicts in its 1995 ''business as usual'' energy scenario that future global emissions of CO{sub 2} to the atmosphere will increase from 7.4 billion tonnes of carbon (GtC) per year in 1997 to approximately 26 GtC/year by 2100. IPCC also projects a doubling of atmospheric CO{sub 2} concentration by the middle of next century and growing rates of increase beyond. Although the effects of increased CO{sub 2} levels on global climate are uncertain, many scientists agree that a doubling of atmospheric CO{sub 2} concentrations could have a variety of serious environmental consequences. The goal of this report is to identify key areas for research and development (R&D) that could lead to an understanding of the potential for future use of carbon sequestration as a major tool for managing carbon emissions. Under the leadership of DOE, researchers from universities, industry, other government agencies, and DOE national laboratories were brought together to develop the technical basis for conceiving a science and technology road map. That effort has resulted in this report, which develops much of the information needed for the road map.

  14. 10 CFR 600.306 - Metric system of measurement.

    Science.gov (United States)

    2010-01-01

    ... cause significant inefficiencies or loss of markets to United States firms. (b) Recipients are... Requirements for Grants and Cooperative Agreements With For-Profit Organizations General § 600.306 Metric... Competitiveness Act of 1988 (15 U.S.C. 205) and implemented by Executive Order 12770, states that: (1) The metric...

  15. On the topology defined by Thurston's asymmetric metric

    DEFF Research Database (Denmark)

    Papadopoulos, Athanase; Theret, Guillaume

    2007-01-01

    that the topology that the asymmetric metric L induces on Teichmüller space is the same as the usual topology. Furthermore, we show that L satisfies the axioms of a (not necessarily symmetric) metric in the sense of Busemann and conclude that L is complete in the sense of Busemann....

  16. Path integral measure for first-order and metric gravities

    International Nuclear Information System (INIS)

    Aros, Rodrigo; Contreras, Mauricio; Zanelli, Jorge

    2003-01-01

    The equivalence between the path integrals for first-order gravity and the standard torsion-free, metric gravity in 3 + 1 dimensions is analysed. Starting with the path integral for first-order gravity, the correct measure for the path integral of the metric theory is obtained

  17. Converging from Branching to Linear Metrics on Markov Chains

    DEFF Research Database (Denmark)

    Bacci, Giorgio; Bacci, Giovanni; Larsen, Kim Guldstrand

    2015-01-01

    time in the size of the MC. The upper-approximants are Kantorovich-like pseudometrics, i.e. branching-time distances, that converge point-wise to the linear-time metrics. This convergence is interesting in itself, since it reveals a nontrivial relation between branching and linear-time metric...

  18. Effects of Metric Change on Workers’ Tools and Training.

    Science.gov (United States)

    1981-07-01

    understanding of the metric system, and particularly a lack of fluency in converting customary measurements to metric measurements, may increase the... assembly, installing, and repairing occupations; 84 Painting, plastering, waterproofing, cementing, and related occupations; 85 Excavating, grading, paving, and related occupations; 86 Construction occupations, n.e.c.; 89 Structural work

  19. 48 CFR 611.002-70 - Metric system implementation.

    Science.gov (United States)

    2010-10-01

    ... with security, operations, economic, technical, logistical, training and safety requirements. (3) The... total cost of the retrofit, including redesign costs, exceeds $50,000; (ii) Metric is not the accepted... office with an explanation for the disapproval. (7) The in-house operating metric costs shall be...

  20. Empirical analysis of change metrics for software fault prediction

    NARCIS (Netherlands)

    Choudhary, Garvit Rajesh; Kumar, Sandeep; Kumar, Kuldeep; Mishra, Alok; Catal, Cagatay

    2018-01-01

    A quality assurance activity, known as software fault prediction, can reduce development costs and improve software quality. The objective of this study is to investigate change metrics in conjunction with code metrics to improve the performance of fault prediction models. Experimental studies are

  1. Predicting class testability using object-oriented metrics

    NARCIS (Netherlands)

    M. Bruntink (Magiel); A. van Deursen (Arie)

    2004-01-01

    In this paper we investigate factors of the testability of object-oriented software systems. The starting point is given by a study of the literature to obtain both an initial model of testability and existing OO metrics related to testability. Subsequently, these metrics are evaluated

  2. Comparative Study of Trace Metrics between Bibliometrics and Patentometrics

    Directory of Open Access Journals (Sweden)

    Fred Y. Ye

    2016-06-01

    Full Text Available Purpose: To comprehensively evaluate the overall performance of a group or an individual in both bibliometrics and patentometrics. Design/methodology/approach: Trace metrics were applied to the top 30 universities in the 2014 Academic Ranking of World Universities (ARWU) — computer sciences, the top 30 ESI highly cited papers in the computer sciences field in 2014, as well as the top 30 assignees and the top 30 most cited patents in the National Bureau of Economic Research (NBER) computer hardware and software category. Findings: We found that, by applying trace metrics, the research or marketing impact efficiency, at both group and individual levels, was clearly observed. Furthermore, trace metrics were more sensitive to the different publication-citation distributions than the average citation and h-index were. Research limitations: Trace metrics considered publications with zero citations as negative contributions. One should clarify how he/she evaluates a zero-citation paper or patent before applying trace metrics. Practical implications: Decision makers could regularly examine the performance of their university/company by applying trace metrics and adjust their policies accordingly. Originality/value: Trace metrics could be applied both in bibliometrics and patentometrics and provide a comprehensive view. Moreover, the high sensitivity and unique impact efficiency view provided by trace metrics can facilitate decision makers in examining and adjusting their policies.

  3. Self-dual metrics with self-dual Killing vectors

    International Nuclear Information System (INIS)

    Tod, K.P.; Ward, R.S.

    1979-01-01

    Twistor methods are used to derive a class of solutions to Einstein's vacuum equations, with anti-self dual Weyl tensor. In particular, all metrics with a Killing vector whose derivative is anti-self-dual and which admit a real positive-definite section are exhibited and shown to coincide with the metrics of Hawking. (author)

  4. Scalar metric fluctuations in space-time matter inflation

    International Nuclear Information System (INIS)

    Anabitarte, Mariano; Bellini, Mauricio

    2006-01-01

    Using the Ponce de Leon background metric, which describes a 5D universe in an apparent vacuum, G-bar_AB = 0, we study the effective 4D evolution of both the inflaton and gauge-invariant scalar metric fluctuations, in the recently introduced model of space-time matter inflation

  5. 22 CFR 226.15 - Metric system of measurement.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Metric system of measurement. 226.15 Section 226.15 Foreign Relations AGENCY FOR INTERNATIONAL DEVELOPMENT ADMINISTRATION OF ASSISTANCE AWARDS TO U.S. NON-GOVERNMENTAL ORGANIZATIONS Pre-award Requirements § 226.15 Metric system of measurement. (a...

  6. Presic-Boyd-Wong Type Results in Ordered Metric Spaces

    Directory of Open Access Journals (Sweden)

    Satish Shukla

    2014-04-01

    Full Text Available The purpose of this paper is to prove some Presic-Boyd-Wong type fixed point theorems in ordered metric spaces. The results of this paper generalize the famous results of Presic and Boyd-Wong in ordered metric spaces. We also initiate the homotopy result in product spaces. Some examples are provided which illustrate the results proved herein.

  7. A heuristic way of obtaining the Kerr metric

    International Nuclear Information System (INIS)

    Enderlein, J.

    1997-01-01

    An intuitive, straightforward way of finding the metric of a rotating black hole is presented, based on the algebra of differential forms. The representation obtained for the metric displays a simplicity which is not obvious in the usual Boyer Lindquist coordinates. copyright 1997 American Association of Physics Teachers

  8. On the L2-metric of vortex moduli spaces

    NARCIS (Netherlands)

    Baptista, J.M.

    2011-01-01

    We derive general expressions for the Kähler form of the L2-metric in terms of standard 2-forms on vortex moduli spaces. In the case of abelian vortices in gauged linear sigma-models, this allows us to compute explicitly the Kähler class of the L2-metric. As an application we compute the total

  9. Probabilistic G-Metric space and some fixed point results

    Directory of Open Access Journals (Sweden)

    A. R. Janfada

    2013-01-01

    Full Text Available In this note we introduce the notions of generalized probabilistic metric spaces and generalized Menger probabilistic metric spaces. After making our elementary observations and proving some basic properties of these spaces, we are going to prove some fixed point results in these spaces.

  10. Socio-Technical Security Metrics (Dagstuhl Seminar 14491)

    NARCIS (Netherlands)

    Gollmann, Dieter; Herley, Cormac; Koenig, Vincent; Pieters, Wolter; Sasse, Martina Angela

    2015-01-01

    This report documents the program and the outcomes of Dagstuhl Seminar 14491 "Socio-Technical Security Metrics". In the domain of safety, metrics inform many decisions, from the height of new dikes to the design of nuclear plants. We can state, for example, that the dikes should be high enough to

  11. Radiating c metric: an example of a proper Ricci Collineation

    International Nuclear Information System (INIS)

    Aulestia, L.; Nunez, L.; Patino, A.; Rago, H.; Herrera, L.

    1984-01-01

    A generalization of the charged c metric to the nonstationary case is given. The possibility of associating the energy-momentum tensor with the electromagnetic or neutrino field is discussed. It is shown that, for a specific choice of the time-dependent parameters, the metric admits at least a two-parameter group of proper Ricci collineations

  12. On the Metric-based Approximate Minimization of Markov Chains

    DEFF Research Database (Denmark)

    Bacci, Giovanni; Bacci, Giorgio; Larsen, Kim Guldstrand

    2018-01-01

    In this paper we address the approximate minimization problem of Markov Chains (MCs) from a behavioral metric-based perspective. Specifically, given a finite MC and a positive integer k, we are looking for an MC with at most k states having minimal distance to the original. The metric considered...

  13. On the Metric-Based Approximate Minimization of Markov Chains

    DEFF Research Database (Denmark)

    Bacci, Giovanni; Bacci, Giorgio; Larsen, Kim Guldstrand

    2017-01-01

    We address the behavioral metric-based approximate minimization problem of Markov Chains (MCs), i.e., given a finite MC and a positive integer k, we are interested in finding a k-state MC of minimal distance to the original. By considering as metric the bisimilarity distance of Desharnais et al...

  14. Implementing Metrics at a District Level. Administrative Guide. Revised Edition.

    Science.gov (United States)

    Borelli, Michael L.; Morelli, Sandra Z.

    Administrative concerns in implementing metrics at a district level are discussed and specific recommendations are made regarding them. The paper considers the extent and manner of staff training necessary, the curricular changes associated with metrics, and the distinctions between elementary and secondary programs. Appropriate instructional…

  15. 20 CFR 435.15 - Metric system of measurement.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Metric system of measurement. 435.15 Section 435.15 Employees' Benefits SOCIAL SECURITY ADMINISTRATION UNIFORM ADMINISTRATIVE REQUIREMENTS FOR... metric system is the preferred measurement system for U.S. trade and commerce. The Act requires each...

  16. Choosing the Greenest Synthesis: A Multivariate Metric Green Chemistry Exercise

    Science.gov (United States)

    Mercer, Sean M.; Andraos, John; Jessop, Philip G.

    2012-01-01

    The ability to correctly identify the greenest of several syntheses is a particularly useful asset for young chemists in the growing green economy. The famous univariate metrics atom economy and environmental factor provide insufficient information to allow for a proper selection of a green process. Multivariate metrics, such as those used in…
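
    For readers unfamiliar with the two univariate metrics named in the abstract, a brief illustration of atom economy and the environmental (E) factor follows; the molecular weights and masses used are hypothetical, not taken from the article.

        # Illustrative calculation of atom economy and E-factor (hypothetical values).

        def atom_economy(product_mw, reactant_mws):
            """Percent of reactant mass (by molecular weight) ending up in the product."""
            return 100.0 * product_mw / sum(reactant_mws)

        def e_factor(total_waste_kg, product_kg):
            """Mass of waste generated per mass of desired product."""
            return total_waste_kg / product_kg

        # Hypothetical synthesis: product MW 180 g/mol from reactants of MW 120 and 100.
        print(f"Atom economy: {atom_economy(180, [120, 100]):.1f}%")             # ~81.8%
        print(f"E-factor: {e_factor(total_waste_kg=5.0, product_kg=1.0):.1f}")   # 5.0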

  17. 76 FR 53885 - Patent and Trademark Resource Centers Metrics

    Science.gov (United States)

    2011-08-30

    ... DEPARTMENT OF COMMERCE United States Patent and Trademark Office Patent and Trademark Resource Centers Metrics ACTION: Proposed collection; comment request. SUMMARY: The United States Patent and... ``Patent and Trademark Resource Centers Metrics comment'' in the subject line of the message. Mail: Susan K...

  18. Author Impact Metrics in Communication Sciences and Disorder Research

    Science.gov (United States)

    Stuart, Andrew; Faucette, Sarah P.; Thomas, William Joseph

    2017-01-01

    Purpose: The purpose was to examine author-level impact metrics for faculty in the communication sciences and disorder research field across a variety of databases. Method: Author-level impact metrics were collected for faculty from 257 accredited universities in the United States and Canada. Three databases (i.e., Google Scholar, ResearchGate,…

  19. Evaluating hydrological model performance using information theory-based metrics

    Science.gov (United States)

    The accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use the information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m...
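
    The record does not state which information-theoretic measures were used; as one plausible sketch, the Kullback-Leibler divergence between binned observed and simulated streamflow distributions could serve as such a complementary metric (synthetic data below, illustration only).

        import numpy as np
        from scipy.stats import entropy

        # Sketch of one possible information-theory-based comparison: KL divergence
        # between binned observed and simulated streamflow (synthetic data only).

        def kl_divergence(observed, simulated, bins=20):
            lo = min(observed.min(), simulated.min())
            hi = max(observed.max(), simulated.max())
            p, _ = np.histogram(observed, bins=bins, range=(lo, hi), density=True)
            q, _ = np.histogram(simulated, bins=bins, range=(lo, hi), density=True)
            return entropy(p + 1e-12, q + 1e-12)   # KL(P || Q), in nats

        rng = np.random.default_rng(0)
        obs = rng.lognormal(mean=2.0, sigma=0.5, size=1000)   # synthetic "observed" flows
        sim = rng.lognormal(mean=2.1, sigma=0.6, size=1000)   # synthetic "simulated" flows
        print(kl_divergence(obs, sim))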

  20. Using metrics in stability of stochastic programming problems

    Czech Academy of Sciences Publication Activity Database

    Houda, Michal

    2005-01-01

    Roč. 13, č. 1 (2005), s. 128-134 ISSN 0572-3043 R&D Projects: GA ČR(CZ) GA402/04/1294 Institutional research plan: CEZ:AV0Z10750506 Keywords : stochastic programming * quantitative stability * Wasserstein metrics * Kolmogorov metrics * simulation study Subject RIV: BB - Applied Statistics, Operational Research

  1. A Practical Method for Collecting Social Media Campaign Metrics

    Science.gov (United States)

    Gharis, Laurie W.; Hightower, Mary F.

    2017-01-01

    Today's Extension professionals are tasked with more work and fewer resources. Integrating social media campaigns into outreach efforts can be an efficient way to meet work demands. If resources go toward social media, a practical method for collecting metrics is needed. Collecting metrics adds one more task to the workloads of Extension…

  2. Discriminatory Data Mapping by Matrix-Based Supervised Learning Metrics

    NARCIS (Netherlands)

    Strickert, M.; Schneider, P.; Keilwagen, J.; Villmann, T.; Biehl, M.; Hammer, B.

    2008-01-01

    Supervised attribute relevance detection using cross-comparisons (SARDUX), a recently proposed method for data-driven metric learning, is extended from dimension-weighted Minkowski distances to metrics induced by a data transformation matrix Ω for modeling mutual attribute dependence. Given class
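
    As a sketch of the kind of matrix-induced metric referred to here (not the SARDUX learning procedure itself), a distance d_Ω(x, y) = ||Ω(x − y)|| with a hypothetical transformation matrix Ω can be written as follows.

        import numpy as np

        # Distance induced by a transformation matrix Omega: d(x, y) = ||Omega (x - y)||.
        # Omega here is a hypothetical matrix, not one learned by SARDUX.

        def omega_distance(x, y, omega):
            diff = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
            return float(np.linalg.norm(omega @ diff))

        omega = np.array([[1.0, 0.5],
                          [0.0, 2.0]])    # hypothetical learned transformation
        print(omega_distance([1.0, 2.0], [0.0, 1.0], omega))   # 2.5 under Omega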

  3. Performance evaluation of routing metrics for wireless mesh networks

    CSIR Research Space (South Africa)

    Nxumalo, SL

    2009-08-01

    Full Text Available for WMN. The routing metrics have not been compared with QoS parameters. This paper describes work in progress on a project in which researchers compare the performance of different routing metrics in WMN using a wireless test bed. Researchers...

  4. 27 CFR 4.72 - Metric standards of fill.

    Science.gov (United States)

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Metric standards of fill. 4.72 Section 4.72 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND TRADE BUREAU, DEPARTMENT OF THE TREASURY LIQUORS LABELING AND ADVERTISING OF WINE Standards of Fill for Wine § 4.72 Metric...

  5. ISS Logistics Hardware Disposition and Metrics Validation

    Science.gov (United States)

    Rogers, Toneka R.

    2010-01-01

    I was assigned to the Logistics Division of the International Space Station (ISS)/Spacecraft Processing Directorate. The Division consists of eight NASA engineers and specialists that oversee the logistics portion of the Checkout, Assembly, and Payload Processing Services (CAPPS) contract. Boeing, their sub-contractors and the Boeing Prime contract out of Johnson Space Center, provide the Integrated Logistics Support for the ISS activities at Kennedy Space Center. Essentially they ensure that spares are available to support flight hardware processing and the associated ground support equipment (GSE). Boeing maintains a Depot for electrical, mechanical and structural modifications and/or repair capability as required. My assigned task was to learn project management techniques utilized by NASA and its contractors to provide an efficient and effective logistics support infrastructure to the ISS program. Within the Space Station Processing Facility (SSPF) I was exposed to Logistics support components, such as the NASA Spacecraft Services Depot (NSSD) capabilities, Mission Processing tools, techniques and Warehouse support issues, required for integrating Space Station elements at the Kennedy Space Center. I also supported the identification of near-term ISS Hardware and Ground Support Equipment (GSE) candidates for excessing/disposition prior to October 2010; and the validation of several Logistics Metrics used by the contractor to measure logistics support effectiveness.

  6. Securing Health Sensing Using Integrated Circuit Metric

    Science.gov (United States)

    Tahir, Ruhma; Tahir, Hasan; McDonald-Maier, Klaus

    2015-01-01

    Convergence of technologies from several domains of computing and healthcare has aided in the creation of devices that can help health professionals in monitoring their patients remotely. An increase in networked healthcare devices has resulted in incidents related to data theft, medical identity theft and insurance fraud. In this paper, we discuss the design and implementation of a secure lightweight wearable health sensing system. The proposed system is based on an emerging security technology called Integrated Circuit Metric (ICMetric) that extracts the inherent features of a device to generate a unique device identification. In this paper, we provide details of how the physical characteristics of a health sensor can be used for the generation of hardware “fingerprints”. The obtained fingerprints are used to deliver security services like authentication, confidentiality, secure admission and symmetric key generation. The generated symmetric key is used to securely communicate the health records and data of the patient. Based on experimental results and the security analysis of the proposed scheme, it is apparent that the proposed system enables high levels of security for health monitoring in a resource-optimized manner. PMID:26492250
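
    The following is only a schematic sketch of the kind of flow the paper describes (the actual ICMetric feature extraction and key generation are more elaborate): hypothetical device features are quantized, hashed into a device identification, and a symmetric session key is derived from it.

        import hashlib
        import hmac

        # Schematic ICMetric-style flow (hypothetical features and simplified key
        # derivation; not the exact scheme from the paper).

        def device_id_from_features(features, precision=2):
            """Quantize inherent device features and hash them into a stable ID."""
            quantized = ",".join(f"{f:.{precision}f}" for f in features)
            return hashlib.sha256(quantized.encode()).digest()

        def derive_session_key(device_id, nonce):
            """Derive a symmetric session key from the device ID and a fresh nonce."""
            return hmac.new(device_id, nonce, hashlib.sha256).digest()

        features = [0.1312, -0.0474, 2.0891]   # hypothetical accelerometer bias readings
        icmetric_id = device_id_from_features(features)
        key = derive_session_key(icmetric_id, nonce=b"\x00" * 16)
        print(key.hex())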

  7. Metric integration architecture for product development

    Science.gov (United States)

    Sieger, David B.

    1997-06-01

    Present-day product development endeavors utilize the concurrent engineering philosophy as a logical means for incorporating a variety of viewpoints into the design of products. Since this approach provides no explicit procedural provisions, it is necessary to establish at least a mental coupling with a known design process model. The central feature of all such models is the management and transformation of information. While these models assist in structuring the design process, characterizing the basic flow of operations that are involved, they provide no guidance facilities. The significance of this feature, and the role it plays in the time required to develop products, is increasing in importance due to the inherent process dynamics, system/component complexities, and competitive forces. The methodology presented in this paper involves the use of a hierarchical system structure, discrete event system specification (DEVS), and multidimensional state variable based metrics. This approach is unique in its capability to quantify designer's actions throughout product development, provide recommendations about subsequent activity selection, and coordinate distributed activities of designers and/or design teams across all design stages. Conceptual design tool implementation results are used to demonstrate the utility of this technique in improving the incremental decision making process.

  8. Creating meaningful business continuity management programme metrics.

    Science.gov (United States)

    Strong, Brian

    2010-11-01

    The popular axiom, 'what gets measured gets done', is often applied in the quality management and continuous improvement disciplines. This truism is also useful to business continuity practitioners as they continually strive to prove the value of their organisation's investment in a business continuity management (BCM) programme. BCM practitioners must also remain relevant to their organisations as executives focus on the bottom line and maintaining stakeholder confidence. It seems that executives always find a way, whether in a hallway or elevator, to ask BCM professionals about the company's level of readiness. When asked, they must be ready with an informed response. The establishment of a process to measure business continuity programme performance and organisational readiness has emerged as a key component of US Department of Homeland Security 'Voluntary Private Sector Preparedness (PS-Prep) Program' standards where the overarching goal is to improve private sector preparedness for disasters and emergencies. The purpose of this paper is two-fold: to introduce continuity professionals to best practices that should be considered when developing a BCM metrics programme as well as providing a case study of how a large health insurance company researched, developed and implemented a process to measure BCM programme performance and company readiness.

  9. Viscous shear in the Kerr metric

    International Nuclear Information System (INIS)

    Anderson, M.R.; Lemos, J.P.S.

    1988-01-01

    Models of viscous flows on to black holes commonly assume a zero-torque boundary condition at the radius of the last stable Keplerian orbit. It is here shown that this condition is wrong. The viscous torque is generally non-zero at both the last stable orbit and the horizon itself. The existence of a non-zero viscous torque at the horizon does not require the transfer of energy or angular momentum across any spacelike distance, and so does not violate causality. Further, in comparison with the viscous torque in the distant, Newtonian regime, the viscous torque on the horizon is often reversed, so that angular momentum is viscously advected inwards rather than outwards. This phenomenon is first suggested by an analysis of the quasi-stationary case, and then demonstrated explicitly for a series of cold, dynamical flows which fall freely from the last stable orbit in the Schwarzschild and Kerr metrics. In the steady flows constructed here, the net torque on the hole is always directed in the usual sense; any reversal in the viscous torque is offset by an increase in the convected flux of angular momentum. (author)

  10. On degenerate metrics, dark matter and unification

    Science.gov (United States)

    Searight, Trevor P.

    2017-12-01

    A five-dimensional theory of relativity is presented which suggests that gravitation and electromagnetism may be unified using a degenerate metric. There are four fields (in the four-dimensional sense): a tensor field, two vector fields, and a scalar field, and they are unified with a combination of a gauge-like invariance and a reflection symmetry which means that both vector fields are photons. The gauge-like invariance implies that the fifth dimension is not directly observable; it also implies that charge is a constant of motion. The scalar field is analogous to the Brans-Dicke scalar field, and the theory tends towards the Einstein-Maxwell theory in the limit as the coupling constant tends to infinity. As there is some scope for fields to vary in the fifth dimension, it is possible for the photons to have wave behaviour in the fifth dimension. The wave behaviour has two effects: it gives mass to the photons, and it prevents them from interacting directly with normal matter. These massive photons still act as a source of gravity, however, and therefore they are candidates for dark matter.

  11. Relativistic gas in a Schwarzschild metric

    International Nuclear Information System (INIS)

    Kremer, Gilberto M

    2013-01-01

    A relativistic gas in a Schwarzschild metric is studied within the framework of a relativistic Boltzmann equation in the presence of gravitational fields, where Marle’s model for the collision operator of the Boltzmann equation is employed. The transport coefficients of the bulk and shear viscosities and thermal conductivity are determined from the Chapman–Enskog method. It is shown that the transport coefficients depend on the gravitational potential. Expressions for the transport coefficients in the presence of weak gravitational fields in the non-relativistic (low temperature) and ultra-relativistic (high temperature) limiting cases are given. Apart from the temperature gradient the heat flux has two relativistic terms. The first one, proposed by Eckart, is due to the inertia of energy and represents an isothermal heat flux when matter is accelerated. The other, suggested by Tolman, is proportional to the gravitational potential gradient and indicates that—in the absence of an acceleration field—a state of equilibrium of a relativistic gas in a gravitational field can be attained only if the temperature gradient is counterbalanced by a gravitational potential gradient. (paper)

  12. Securing Health Sensing Using Integrated Circuit Metric

    Directory of Open Access Journals (Sweden)

    Ruhma Tahir

    2015-10-01

    Full Text Available Convergence of technologies from several domains of computing and healthcare has aided in the creation of devices that can help health professionals in monitoring their patients remotely. An increase in networked healthcare devices has resulted in incidents related to data theft, medical identity theft and insurance fraud. In this paper, we discuss the design and implementation of a secure lightweight wearable health sensing system. The proposed system is based on an emerging security technology called Integrated Circuit Metric (ICMetric) that extracts the inherent features of a device to generate a unique device identification. In this paper, we provide details of how the physical characteristics of a health sensor can be used for the generation of hardware “fingerprints”. The obtained fingerprints are used to deliver security services like authentication, confidentiality, secure admission and symmetric key generation. The generated symmetric key is used to securely communicate the health records and data of the patient. Based on experimental results and the security analysis of the proposed scheme, it is apparent that the proposed system enables high levels of security for health monitoring in a resource-optimized manner.

  13. Securing health sensing using integrated circuit metric.

    Science.gov (United States)

    Tahir, Ruhma; Tahir, Hasan; McDonald-Maier, Klaus

    2015-10-20

    Convergence of technologies from several domains of computing and healthcare has aided in the creation of devices that can help health professionals in monitoring their patients remotely. An increase in networked healthcare devices has resulted in incidents related to data theft, medical identity theft and insurance fraud. In this paper, we discuss the design and implementation of a secure lightweight wearable health sensing system. The proposed system is based on an emerging security technology called Integrated Circuit Metric (ICMetric) that extracts the inherent features of a device to generate a unique device identification. In this paper, we provide details of how the physical characteristics of a health sensor can be used for the generation of hardware "fingerprints". The obtained fingerprints are used to deliver security services like authentication, confidentiality, secure admission and symmetric key generation. The generated symmetric key is used to securely communicate the health records and data of the patient. Based on experimental results and the security analysis of the proposed scheme, it is apparent that the proposed system enables high levels of security for health monitoring in a resource-optimized manner.

  14. Genetic basis of a cognitive complexity metric.

    Directory of Open Access Journals (Sweden)

    Narelle K Hansell

    Full Text Available Relational complexity (RC) is a metric reflecting capacity limitation in relational processing. It plays a crucial role in higher cognitive processes and is an endophenotype for several disorders. However, the genetic underpinnings of complex relational processing have not been investigated. Using the classical twin model, we estimated the heritability of RC and genetic overlap with intelligence (IQ), reasoning, and working memory in a twin and sibling sample aged 15-29 years (N = 787). Further, in an exploratory search for genetic loci contributing to RC, we examined associated genetic markers and genes in our Discovery sample and selected loci for replication in four independent samples (ALSPAC, LBC1936, NTR, NCNG), followed by meta-analysis (N>6500) at the single marker level. Twin modelling showed RC is highly heritable (67%), has considerable genetic overlap with IQ (59%), and is a major component of genetic covariation between reasoning and working memory (72%). At the molecular level, we found preliminary support for four single-marker loci (one in the gene DGKB), and at a gene-based level for the NPS gene, having influence on cognition. These results indicate that genetic sources influencing relational processing are a key component of the genetic architecture of broader cognitive abilities. Further, they suggest a genetic cascade, whereby genetic factors influencing capacity limitation in relational processing have a flow-on effect to more complex cognitive traits, including reasoning and working memory, and ultimately, IQ.
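
    The classical twin design mentioned above infers heritability from the difference between monozygotic and dizygotic twin correlations. As a hedged illustration only, the sketch below applies Falconer's simple variance decomposition; the study itself used structural-equation twin modelling, and the correlations shown here are invented, not the study's estimates.

```python
def falconer_decomposition(r_mz, r_dz):
    """Back-of-the-envelope twin-study variance components (Falconer's formulas).

    r_mz, r_dz: phenotypic correlations for monozygotic and dizygotic pairs.
    Returns (A, C, E): additive genetic (heritability), shared environment,
    and unique environment plus measurement error, as proportions of variance.
    """
    a2 = 2.0 * (r_mz - r_dz)   # additive genetic share
    c2 = 2.0 * r_dz - r_mz     # shared-environment share
    e2 = 1.0 - r_mz            # unique environment and error
    return a2, c2, e2

# Illustrative correlations only (not values from the study above).
print(falconer_decomposition(r_mz=0.67, r_dz=0.34))   # approx (0.66, 0.01, 0.33)
```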

  15. Metrics for comparing dynamic earthquake rupture simulations

    Science.gov (United States)

    Barall, Michael; Harris, Ruth A.

    2014-01-01

    Earthquakes are complex events that involve a myriad of interactions among multiple geologic features and processes. One of the tools that is available to assist with their study is computer simulation, particularly dynamic rupture simulation. A dynamic rupture simulation is a numerical model of the physical processes that occur during an earthquake. Starting with the fault geometry, friction constitutive law, initial stress conditions, and assumptions about the condition and response of the near‐fault rocks, a dynamic earthquake rupture simulation calculates the evolution of fault slip and stress over time as part of the elastodynamic numerical solution (Ⓔ see the simulation description in the electronic supplement to this article). The complexity of the computations in a dynamic rupture simulation makes it challenging to verify that the computer code is operating as intended, because there are no exact analytic solutions against which these codes’ results can be directly compared. One approach for checking if dynamic rupture computer codes are working satisfactorily is to compare each code’s results with the results of other dynamic rupture codes running the same earthquake simulation benchmark. To perform such a comparison consistently, it is necessary to have quantitative metrics. In this paper, we present a new method for quantitatively comparing the results of dynamic earthquake rupture computer simulation codes.
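
    The specific metrics of this paper are not reproduced in this record, but the general idea of comparing two codes' waveforms on a common benchmark can be illustrated with a simple normalized RMS misfit. The sketch below is a generic stand-in, not the method defined in the paper, and the synthetic slip-rate pulses are invented for illustration.

```python
import numpy as np

def rms_misfit(series_a, series_b):
    """Normalized RMS misfit between two codes' time series sampled on the
    same time axis (e.g., slip rate at one on-fault receiver)."""
    a = np.asarray(series_a, dtype=float)
    b = np.asarray(series_b, dtype=float)
    return np.sqrt(np.mean((a - b) ** 2)) / np.sqrt(np.mean(a ** 2))

t = np.linspace(0.0, 5.0, 501)
code_a = np.exp(-(t - 2.00) ** 2)   # synthetic slip-rate pulse from code A
code_b = np.exp(-(t - 2.05) ** 2)   # slightly shifted result from code B
print(rms_misfit(code_a, code_b))   # small value -> good agreement
```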

  16. Characterizing granular networks using topological metrics

    Science.gov (United States)

    Dijksman, Joshua A.; Kovalcinova, Lenka; Ren, Jie; Behringer, Robert P.; Kramar, Miroslav; Mischaikow, Konstantin; Kondic, Lou

    2018-04-01

    We carry out a direct comparison of experimental and numerical realizations of the exact same granular system as it undergoes shear jamming. We adjust the numerical methods used to optimally represent the experimental settings and outcomes up to microscopic contact force dynamics. Measures presented here range from microscopic through mesoscopic to systemwide characteristics of the system. Topological properties of the mesoscopic force networks provide a key link between microscales and macroscales. We report two main findings: (1) The number of particles in the packing that have at least two contacts is a good predictor for the mechanical state of the system, regardless of strain history and packing density. All measures explored in both experiments and numerics, including stress-tensor-derived measures and contact numbers, depend in a universal manner on the fraction of nonrattler particles, fNR. (2) The force network topology also tends to show this universality, yet the shape of the master curve depends much more on the details of the numerical simulations. In particular, we show that adding force noise to the numerical data set can significantly alter the topological features in the data. We conclude that both fNR and topological metrics are useful measures to consider when quantifying the state of a granular system.
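
    The fraction of non-rattler particles used above, fNR, is simply the share of particles with at least two contacts. A minimal sketch, assuming a hypothetical contact list as input, is:

```python
import numpy as np

def fraction_non_rattlers(num_particles, contacts):
    """Fraction of particles with at least two contacts (f_NR).

    contacts: iterable of (i, j) index pairs, one entry per contact.
    """
    count = np.zeros(num_particles, dtype=int)
    for i, j in contacts:
        count[i] += 1
        count[j] += 1
    return float(np.mean(count >= 2))

# Toy example: 5 particles, particle 4 is an isolated rattler -> f_NR = 0.8
print(fraction_non_rattlers(5, [(0, 1), (1, 2), (2, 0), (2, 3), (3, 0)]))
```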

  17. Applying Halstead's Metric to Oberon Language

    Directory of Open Access Journals (Sweden)

    Fawaz Ahmed Masoud

    1999-12-01

    Full Text Available Oberon is a small, simple and difficult programming language. The guiding principle of Oberon was a quote from Albert Einstein: "Make it as simple as possible, but not simpler". Oberon language is based on a few fundamental concepts that are easy to understand and use. It supports two programming paradigms: the procedural paradigm, and the object-oriented paradigm. This paper provides the application of Halstead's software science theory to Oberon programs. Applying Halstead's metric to the Oberon language has provided the analysis and measurements for module and within-module maintenance complexity of programs written in Oberon. This type of analysis provides a manager or programmer with enough information about the maintenance complexity of the Oberon programs, so they can be aware of how much effort they need to maintain a certain Oberon program. The maintenance complexity of the programs written in Oberon or any other language is based on counting the number of operators and operands within the statements of the tested program. The counting process is accomplished by a program written in C language. Results are obtained, analyzed, and discussed in detail.
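
    Halstead's measures are computed from counts of distinct and total operators and operands. The sketch below applies the standard formulas (vocabulary, length, volume, difficulty, effort) to a hand-counted token stream; the tokenisation of a real Oberon program is assumed to have been done elsewhere.

```python
import math

def halstead(operators, operands):
    """Basic Halstead measures from operator and operand token streams."""
    n1, n2 = len(set(operators)), len(set(operands))   # distinct counts
    N1, N2 = len(operators), len(operands)              # total counts
    vocabulary = n1 + n2
    length = N1 + N2
    volume = length * math.log2(vocabulary)
    difficulty = (n1 / 2.0) * (N2 / n2)
    effort = difficulty * volume
    return {"vocabulary": vocabulary, "length": length, "volume": volume,
            "difficulty": difficulty, "effort": effort}

# Tokens of a tiny assignment statement:  x := a + b
print(halstead(operators=[":=", "+"], operands=["x", "a", "b"]))
```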

  18. Using Publication Metrics to Highlight Academic Productivity and Research Impact

    Science.gov (United States)

    Carpenter, Christopher R.; Cone, David C.; Sarli, Cathy C.

    2016-01-01

    This article provides a broad overview of widely available measures of academic productivity and impact using publication data and highlights uses of these metrics for various purposes. Metrics based on publication data include measures such as number of publications, number of citations, the journal impact factor score, and the h-index, as well as emerging document-level metrics. Publication metrics can be used for a variety of purposes, including tenure and promotion, grant applications and renewal reports, benchmarking, recruiting efforts, and administrative purposes such as departmental or university performance reports. The authors also highlight practical applications of measuring and reporting academic productivity and impact to emphasize and promote individual investigators, grant applications, or department output. PMID:25308141
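
    Of the metrics listed, the h-index is the easiest to reproduce: it is the largest h such that h of an author's papers each have at least h citations. A minimal sketch with made-up citation counts:

```python
def h_index(citations):
    """h-index: largest h such that h papers have at least h citations each."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))   # -> 4
print(h_index([25, 8, 5, 3, 3]))   # -> 3
```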

  19. Accuracy and precision in the calculation of phenology metrics

    DEFF Research Database (Denmark)

    Ferreira, Ana Sofia; Visser, Andre; MacKenzie, Brian

    2014-01-01

    Phytoplankton phenology (the timing of seasonal events) is a commonly used indicator for evaluating responses of marine ecosystems to climate change. However, phenological metrics are vulnerable to observation- (bloom amplitude, missing data, and observational noise) and analysis-related (temporal ...). A phenology metric is first determined from a noise- and gap-free time series, and again once it has been modified. We show that precision is a greater concern than accuracy for many of these metrics, an important point that has been hereto overlooked in the literature. The variability in precision between phenology metrics is substantial, but it can be improved by the use of preprocessing techniques (e.g., gap-filling or smoothing). Furthermore, there are important differences in the inherent variability of the metrics that may be crucial in the interpretation of studies based upon them. Of the considered ...

  20. Degraded visual environment image/video quality metrics

    Science.gov (United States)

    Baumgartner, Dustin D.; Brown, Jeremy B.; Jacobs, Eddie L.; Schachter, Bruce J.

    2014-06-01

    A number of image quality metrics (IQMs) and video quality metrics (VQMs) have been proposed in the literature for evaluating techniques and systems for mitigating degraded visual environments. Some require both pristine and corrupted imagery. Others require patterned target boards in the scene. None of these metrics relates well to the task of landing a helicopter in conditions such as a brownout dust cloud. We have developed and used a variety of IQMs and VQMs related to the pilot's ability to detect hazards in the scene and to maintain situational awareness. Some of these metrics can be made agnostic to sensor type. Not only are the metrics suitable for evaluating algorithm and sensor variation, they are also suitable for choosing the most cost effective solution to improve operating conditions in degraded visual environments.

  1. Developing a Security Metrics Scorecard for Healthcare Organizations.

    Science.gov (United States)

    Elrefaey, Heba; Borycki, Elizabeth; Kushniruk, Andrea

    2015-01-01

    In healthcare, information security is a key aspect of protecting a patient's privacy and ensuring systems availability to support patient care. Security managers need to measure the performance of security systems and this can be achieved by using evidence-based metrics. In this paper, we describe the development of an evidence-based security metrics scorecard specific to healthcare organizations. Study participants were asked to comment on the usability and usefulness of a prototype of a security metrics scorecard that was developed based on current research in the area of general security metrics. Study findings revealed that scorecards need to be customized for the healthcare setting in order for the security information to be useful and usable in healthcare organizations. The study findings resulted in the development of a security metrics scorecard that matches the healthcare security experts' information requirements.

  2. A practical approach to determine dose metrics for nanomaterials.

    Science.gov (United States)

    Delmaar, Christiaan J E; Peijnenburg, Willie J G M; Oomen, Agnes G; Chen, Jingwen; de Jong, Wim H; Sips, Adriënne J A M; Wang, Zhuang; Park, Margriet V D Z

    2015-05-01

    Traditionally, administered mass is used to describe doses of conventional chemical substances in toxicity studies. For deriving toxic doses of nanomaterials, mass and chemical composition alone may not adequately describe the dose, because particles with the same chemical composition can have completely different toxic mass doses depending on properties such as particle size. Other dose metrics such as particle number, volume, or surface area have been suggested, but consensus is lacking. The discussion regarding the most adequate dose metric for nanomaterials clearly needs a systematic, unbiased approach. In the present study, the authors propose such an approach and apply it to results from in vitro and in vivo experiments with silver and silica nanomaterials. The proposed approach is shown to provide a convenient tool to systematically investigate and interpret dose metrics of nanomaterials. Recommendations for study designs aimed at investigating dose metrics are provided. © 2015 SETAC.
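
    For monodisperse spherical particles, the alternative dose metrics mentioned above (particle number and surface area) follow from the mass dose by simple geometry. The sketch below is illustrative only; the particle size, density and dose are invented, and the calculation ignores polydispersity and agglomeration.

```python
import math

def dose_metrics(mass_ug, diameter_nm, density_g_cm3):
    """Convert a mass dose into particle-number and surface-area doses,
    assuming monodisperse spherical particles."""
    r_cm = diameter_nm * 1e-7 / 2.0                      # nm -> cm, radius
    mass_g = mass_ug * 1e-6
    particle_mass_g = density_g_cm3 * (4.0 / 3.0) * math.pi * r_cm ** 3
    n_particles = mass_g / particle_mass_g
    surface_cm2 = n_particles * 4.0 * math.pi * r_cm ** 2
    return n_particles, surface_cm2

# Example: 10 ug of 20 nm silver particles (density ~10.5 g/cm3)
n, s = dose_metrics(10.0, 20.0, 10.5)
print(f"{n:.3g} particles, {s:.3g} cm2 of surface area")
```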

  3. Fisher information metrics for binary classifier evaluation and training

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Different evaluation metrics for binary classifiers are appropriate to different scientific domains and even to different problems within the same domain. This presentation focuses on the optimisation of event selection to minimise statistical errors in HEP parameter estimation, a problem that is best analysed in terms of the maximisation of Fisher information about the measured parameters. After describing a general formalism to derive evaluation metrics based on Fisher information, three more specific metrics are introduced for the measurements of signal cross sections in counting experiments (FIP1) or distribution fits (FIP2) and for the measurements of other parameters from distribution fits (FIP3). The FIP2 metric is particularly interesting because it can be derived from any ROC curve, provided that prevalence is also known. In addition to its relation to measurement errors when used as an evaluation criterion (which makes it more interesting than the ROC AUC), a further advantage of the FIP2 metric is ...

  4. Modelling short-rotation coppice and tree planting for urban carbon management - a citywide analysis.

    Science.gov (United States)

    McHugh, Nicola; Edmondson, Jill L; Gaston, Kevin J; Leake, Jonathan R; O'Sullivan, Odhran S

    2015-10-01

    The capacity of urban areas to deliver provisioning ecosystem services is commonly overlooked and underutilized. Urban populations have globally increased fivefold since 1950, and they disproportionately consume ecosystem services and contribute to carbon emissions, highlighting the need to increase urban sustainability and reduce environmental impacts of urban dwellers. Here, we investigated the potential for increasing carbon sequestration, and biomass fuel production, by planting trees and short-rotation coppice (SRC), respectively, in a mid-sized UK city as a contribution to meeting national commitments to reduce CO2 emissions. Iterative GIS models were developed using high-resolution spatial data. The models were applied to patches of public and privately owned urban greenspace suitable for planting trees and SRC, across the 73 km2 area of the city of Leicester. We modelled tree planting with a species mix based on the existing tree populations, and SRC with willow and poplar to calculate biomass production in new trees, and carbon sequestration into harvested biomass over 25 years. An area of 11 km2 comprising 15% of the city met criteria for tree planting and had the potential over 25 years to sequester 4200 tonnes of carbon above-ground. Of this area, 5.8 km2 also met criteria for SRC planting and over the same period this could yield 71 800 tonnes of carbon in harvested biomass. The harvested biomass could supply energy to over 1566 domestic homes or 30 municipal buildings, resulting in avoided carbon emissions of 29 236 tonnes of carbon over 25 years when compared to heating by natural gas. Together with the net carbon sequestration into trees, a total reduction of 33 419 tonnes of carbon in the atmosphere could be achieved in 25 years by combined SRC and tree planting across the city. Synthesis and applications. We demonstrate that urban greenspaces in a typical UK city are underutilized for provisioning ecosystem services by trees and

  5. Diffuse volcanic emissions of carbon dioxide from Vulcano Island, Italy.

    Science.gov (United States)

    Baubron, J C; Allard, P; Toutain, J P

    1990-03-01

    RECENT investigations on Mount Etna (Sicily)(1-3) have revealed that volcanoes may release abundant carbon dioxide not only from their active craters, but also from their flanks, as diffuse soil emanations. Here we present analyses of soil gases and air in water wells on Vulcano Island which provide further evidence of such lateral degassing. Nearly pure carbon dioxide, enriched in helium and radon, escapes from the slopes of the Fossa active cone, adding a total output of 30 tonnes per day to the fumarolic crater discharge (180 tonnes CO(2) per day). This emanation has similar He/CO(2) and (13)C/(12)C ratios to those of the crater fumaroles (300–500 degrees C) and therefore a similar volcanic origin. Gases rich in carbon dioxide also escape at sea level along the isthmus between the Fossa and Vulcanello volcanic cones, but their depletion in both He and (13)C suggests a distinct source. Diffuse volcanic gas emanations, once their genetic link with central fumarole degassing has been demonstrated, can be used for continuous volcano monitoring, at safe distances from active craters. Such monitoring has been initiated at Vulcano, where soil and well emanations of nearly pure CO(2) themselves represent a threat to the local population.

  6. 77 FR 12832 - Non-RTO/ISO Performance Metrics; Commission Staff Request Comments on Performance Metrics for...

    Science.gov (United States)

    2012-03-02

    ... Performance Metrics; Commission Staff Request Comments on Performance Metrics for Regions Outside of RTOs and... performance communicate about the benefits of RTOs and, where appropriate, (2) changes that need to be made to... common set of performance measures for markets both within and outside of ISOs/RTOs. As recommended by...

  7. A comparison theorem of the Kobayashi metric and the Bergman metric on a class of Reinhardt domains

    International Nuclear Information System (INIS)

    Weiping Yin.

    1990-03-01

    A comparison theorem for the Kobayashi and Bergman metric is given on a class of Reinhardt domains in C n . In the meantime, we obtain a class of complete invariant Kaehler metrics for these domains of the special cases. (author). 5 refs

  8. Carbon storage and recycling in short-rotation energy crops

    International Nuclear Information System (INIS)

    Ranney, J.W.; Wright, L.L.; Mitchell, C.P.

    1991-01-01

    Short-rotation energy crops can play a significant role in storing carbon compared to the agricultural land uses they would displace. However, these plantations' roles in avoiding further use of fossil fuel and in taking pressure off native forests for energy uses provide longer-term carbon benefits than the plantation carbon sequestration itself. The fast growth and harvest frequency of plantations tend to limit the amount of above- and below-ground carbon storage in them. The primary components of plantation carbon sequestering compared to sustained agricultural practices involve above-ground wood, possible increased soil carbon, litter layer formation, and increased root biomass. On average, short-rotation plantations in total may increase carbon inventories by about 30 to 40 tonnes per hectare over about a 20- to 56-year period when displacing cropland. This is about double the storage of cropland and about one-half the storage in human-impacted forests. The sequestration benefit of wood energy crops over cropland would be negated in about 75 to 100 years by the use of fossil fuels to tend the plantations and handle biomass. Plantation interactions with other land uses and the total landscape carbon inventory are important in assessing the relative role plantations play in terrestrial and atmospheric carbon dynamics. It is speculated that plantations, when viewed in this context, could generate a global leveling of net carbon emissions for approximately 10 to 20 years.

  9. Gamut Volume Index: a color preference metric based on meta-analysis and optimized colour samples.

    Science.gov (United States)

    Liu, Qiang; Huang, Zheng; Xiao, Kaida; Pointer, Michael R; Westland, Stephen; Luo, M Ronnier

    2017-07-10

    A novel metric named Gamut Volume Index (GVI) is proposed for evaluating the colour preference of lighting. This metric is based on the absolute gamut volume of optimized colour samples. The optimal colour set of the proposed metric was obtained by optimizing the weighted average correlation between the metric predictions and the subjective ratings for 8 psychophysical studies. The performance of 20 typical colour metrics was also investigated, which included colour difference based metrics, gamut based metrics, memory based metrics as well as combined metrics. It was found that the proposed GVI outperformed the existing counterparts, especially for the conditions where correlated colour temperatures differed.
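
    The core quantity behind the GVI, the absolute gamut volume of a set of colour samples, can be approximated as the volume of the convex hull of their coordinates in a roughly perceptually uniform colour space. The sketch below assumes such 3-D coordinates are already available; the sample values are random placeholders, not the optimized sample set of the paper.

```python
import numpy as np
from scipy.spatial import ConvexHull

def gamut_volume(sample_coordinates):
    """Volume of the convex hull spanned by colour samples rendered under a
    test light source, expressed in a (roughly) uniform 3-D colour space."""
    return ConvexHull(np.asarray(sample_coordinates, dtype=float)).volume

rng = np.random.default_rng(2)
# Placeholder: 3-D colour coordinates (e.g., lightness and two chroma axes).
samples = rng.normal(size=(50, 3)) * [10.0, 25.0, 25.0] + [50.0, 0.0, 0.0]
print(gamut_volume(samples))
```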

  10. Carbon Management In the Post-Cap-and-Trade Carbon Economy

    Science.gov (United States)

    DeGroff, F. A.

    2013-12-01

    This abstract outlines an economic model that integrates carbon externalities seamlessly into the national and international economies. The model incorporates a broad carbon metric used to value all carbon in the biosphere, as well as all transnational commerce. The model minimizes the cost associated with carbon management, and allows for the variation in carbon avidity between jurisdictions. When implemented over time, the model reduces the deadweight loss while minimizing social cost, thus maximizing the marginal social benefit commonly associated with Pigouvian taxes. Once implemented, the model provides a comprehensive economic construct for governments, industry and consumers to efficiently weigh the cost of carbon, and effectively participate in helping to reduce their direct and indirect use of carbon, while allowing individual jurisdictions to decide their own carbon value, without the need for explicit, express agreement of all countries. The model uses no credits, requires no caps, and matches climate changing behavior to costs. The steps to implement the model for a particular jurisdiction are: 1) Define the Carbon Metric to value changes in Carbon Quality. 2) Apply the Carbon Metric to assess the Carbon Toll a) for all changes in Carbon Quality and b) for imports and exports. This economic model has 3 clear advantages. 1) The carbon pricing and cost scheme use existing and generally accepted accounting methodologies to ensure the veracity and verifiability of carbon management efforts with minimal effort and expense using standard auditing protocols. Implementing this economic model will not require any special training, tools, or systems for any entity to achieve their minimum carbon target goals within their jurisdictional framework. 2) Given the spectrum of carbon affinities worldwide, the model recognizes and provides for flexible carbon pricing regimes, but does not penalize domestic carbon-consuming producers subject to imports from exporters in

  11. Self-benchmarking Guide for Cleanrooms: Metrics, Benchmarks, Actions

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul; Sartor, Dale; Tschudi, William

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.

  12. Self-benchmarking Guide for Laboratory Buildings: Metrics, Benchmarks, Actions

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul; Greenberg, Steve; Sartor, Dale

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.

  13. Metrics for Performance Evaluation of Patient Exercises during Physical Therapy.

    Science.gov (United States)

    Vakanski, Aleksandar; Ferguson, Jake M; Lee, Stephen

    2017-06-01

    The article proposes a set of metrics for evaluation of patient performance in physical therapy exercises. A taxonomy is employed that classifies the metrics into quantitative and qualitative categories, based on the level of abstraction of the captured motion sequences. Further, the quantitative metrics are classified into model-less and model-based metrics, in reference to whether the evaluation employs the raw measurements of patient-performed motions, or whether the evaluation is based on a mathematical model of the motions. The reviewed metrics include root-mean-square distance, Kullback-Leibler divergence, log-likelihood, heuristic consistency, Fugl-Meyer Assessment, and similar. The metrics are evaluated for a set of five human motions captured with a Kinect sensor. The metrics can potentially be integrated into a system that employs machine learning for modelling and assessment of the consistency of patient performance in a home-based therapy setting. Automated performance evaluation can overcome the inherent subjectivity in human-performed therapy assessment, and it can increase adherence to prescribed therapy plans and reduce healthcare costs.
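
    Two of the model-less metrics named above, the root-mean-square distance and the Kullback-Leibler divergence, are easy to sketch for a single joint-angle trajectory. The example below fits 1-D Gaussians for the KL term and uses synthetic data; it is a simplified illustration, not the authors' evaluation pipeline.

```python
import numpy as np

def rms_distance(patient, reference):
    """Root-mean-square distance between two equal-length motion sequences
    (frames x joint coordinates)."""
    diff = np.asarray(patient, float) - np.asarray(reference, float)
    return np.sqrt(np.mean(np.sum(diff ** 2, axis=1)))

def gaussian_kl(p_samples, q_samples):
    """KL divergence KL(P||Q) between 1-D Gaussians fitted to two sample sets."""
    mu_p, var_p = np.mean(p_samples), np.var(p_samples)
    mu_q, var_q = np.mean(q_samples), np.var(q_samples)
    return 0.5 * (np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)

reference = np.sin(np.linspace(0, np.pi, 50))[:, None]   # idealized joint angle
patient = reference + 0.05 * np.random.default_rng(0).normal(size=(50, 1))
print(rms_distance(patient, reference))
print(gaussian_kl(patient.ravel(), reference.ravel()))
```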

  14. Relevance of motion-related assessment metrics in laparoscopic surgery.

    Science.gov (United States)

    Oropesa, Ignacio; Chmarra, Magdalena K; Sánchez-González, Patricia; Lamata, Pablo; Rodrigues, Sharon P; Enciso, Silvia; Sánchez-Margallo, Francisco M; Jansen, Frank-Willem; Dankelman, Jenny; Gómez, Enrique J

    2013-06-01

    Motion metrics have become an important source of information when addressing the assessment of surgical expertise. However, their direct relationship with the different surgical skills has not been fully explored. The purpose of this study is to investigate the relevance of motion-related metrics in the evaluation processes of basic psychomotor laparoscopic skills and their correlation with the different abilities sought to measure. A framework for task definition and metric analysis is proposed. An explorative survey was first conducted with a board of experts to identify metrics to assess basic psychomotor skills. Based on the output of that survey, 3 novel tasks for surgical assessment were designed. Face and construct validation was performed, with focus on motion-related metrics. Tasks were performed by 42 participants (16 novices, 22 residents, and 4 experts). Movements of the laparoscopic instruments were registered with the TrEndo tracking system and analyzed. Time, path length, and depth showed construct validity for all 3 tasks. Motion smoothness and idle time also showed validity for tasks involving bimanual coordination and tasks requiring a more tactical approach, respectively. Additionally, motion smoothness and average speed showed a high internal consistency, proving them to be the most task-independent of all the metrics analyzed. Motion metrics are complementary and valid for assessing basic psychomotor skills, and their relevance depends on the skill being evaluated. A larger clinical implementation, combined with quality performance information, will give more insight on the relevance of the results shown in this study.
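
    Several of the validated measures above (time, path length, depth, average speed, idle time) can be computed directly from a tracked instrument-tip trajectory. The sketch below assumes a generic position/time log rather than the TrEndo output format, and the idle-speed threshold is an arbitrary placeholder.

```python
import numpy as np

def motion_metrics(positions, timestamps, idle_speed=1.0):
    """Basic motion-analysis metrics from a tracked instrument tip.

    positions: (N, 3) array of tip coordinates in mm
    timestamps: (N,) array of times in seconds
    idle_speed: speed threshold (mm/s) below which the instrument counts as idle
    """
    positions = np.asarray(positions, dtype=float)
    timestamps = np.asarray(timestamps, dtype=float)
    steps = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    dts = np.diff(timestamps)
    speeds = steps / dts
    total_time = timestamps[-1] - timestamps[0]
    return {
        "time": total_time,
        "path_length": steps.sum(),
        "average_speed": steps.sum() / total_time,
        "idle_time": dts[speeds < idle_speed].sum(),
        "depth_range": positions[:, 2].max() - positions[:, 2].min(),
    }

t = np.linspace(0, 10, 101)
track = np.column_stack([np.cos(t), np.sin(t), 0.1 * t])   # synthetic trajectory (mm)
print(motion_metrics(track, t))
```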

  15. An accurate metric for the spacetime around rotating neutron stars

    Science.gov (United States)

    Pappas, George

    2017-04-01

    The problem of having an accurate description of the spacetime around rotating neutron stars is of great astrophysical interest. For astrophysical applications, one needs to have a metric that captures all the properties of the spacetime around a rotating neutron star. Furthermore, an accurate, appropriately parametrized metric, i.e. a metric that is given in terms of parameters that are directly related to the physical structure of the neutron star, could be used to solve the inverse problem, which is to infer the properties of the structure of a neutron star from astrophysical observations. In this work, we present such an approximate stationary and axisymmetric metric for the exterior of rotating neutron stars, which is constructed using the Ernst formalism and is parametrized by the relativistic multipole moments of the central object. This metric is given in terms of an expansion in Weyl-Papapetrou coordinates with the multipole moments as free parameters and is shown to be extremely accurate in capturing the physical properties of a neutron star spacetime as they are calculated numerically in general relativity. Because the metric is given in terms of an expansion, the expressions are much simpler and easier to implement, in contrast to previous approaches. For the parametrization of the metric in general relativity, the recently discovered universal 3-hair relations are used to produce a three-parameter metric. Finally, a straightforward extension of this metric is given for scalar-tensor theories with a massless scalar field, which also admit a formulation in terms of an Ernst potential.

  16. Flexible dynamic operation of solar-integrated power plant with solvent based post-combustion carbon capture (PCC) process

    International Nuclear Information System (INIS)

    Qadir, Abdul; Sharma, Manish; Parvareh, Forough; Khalilpour, Rajab; Abbas, Ali

    2015-01-01

    Highlights: • Flexible operation of power and PCC plant may significantly increase operational revenue. • Higher optimal carbon capture rates observed with solar thermal energy input. • Solar thermal repowering of the power plant provides highest net revenue. • Constant optimal capture rate observed for one of the flexible operation cases. • Up to 42% higher revenue generation observed between two cases with solar input. - Abstract: This paper examines flexible operation of solvent-based post-combustion carbon capture (PCC) for the reduction of power plant carbon emissions while minimizing revenue loss due to the reduced power plant electricity output. The study is conducted using a model superstructure enveloping three plants; a power plant, a PCC plant and a solar thermal field where the power plant and PCC plant are operated flexibly under the influence of hourly electricity market and weather conditions. Reduced (surrogate) models for the reboiler duty and auxiliary power requirement for the carbon capture plant are generated and applied to simulate and compare four cases, (A) power plant with PCC, (B) power plant with solar assisted PCC, (C) power plant with PCC and solar repowering – variable net electricity output and (D) power plant with PCC and solar repowering – fixed net electricity output. Such analyses are conducted under dynamic conditions including power plant part-load operation while varying the capture rate to optimize the revenue of the power plant. Each case was simulated with a lower carbon price of $25/tonne-CO 2 and a higher price of $50/tonne-CO 2 . The comparison of cases B–D found that optimal revenue generation for case C can be up to 42% higher than that of solar-assisted PCC (case B). Case C is found to be the most profitable with the lowest carbon emissions intensity and is found to exhibit a constant capture rate for both carbon prices. The optimal revenue for case D is slightly lower than case C for the lower carbon

  17. Carbon captured from the air

    Energy Technology Data Exchange (ETDEWEB)

    Keith, D. [Calgary Univ., AB (Canada)

    2008-10-15

    This article presented an innovative way to achieve the efficient capture of atmospheric carbon. A team of scientists from the University of Calgary's Institute for Sustainable Energy, Environment and Economy have shown that it is possible to reduce carbon dioxide (CO{sub 2}) using a simple machine that can capture the trace amount of CO{sub 2} present in ambient air at any place on the planet. The thermodynamics of capturing the small concentrations of CO{sub 2} from the air is only slightly more difficult than capturing much larger concentrations of CO{sub 2} from power plants. The research is significant because it offers a way to capture CO{sub 2} emissions from transportation sources such as vehicles and airplanes, which represent more than half of the greenhouse gases emitted on Earth. The energy efficient and cost effective air capture technology could complement other approaches for reducing emissions from the transportation sector, such as biofuels and electric vehicles. Air capture differs from carbon capture and storage (CCS) technology used at coal-fired power plants where CO{sub 2} is captured and pipelined for permanent storage underground. Air capture can capture the CO{sub 2} that is present in ambient air and store it wherever it is cheapest. The team at the University of Calgary showed that CO{sub 2} could be captured directly from the air with less than 100 kWhrs of electricity per tonne of CO{sub 2}. A custom-built tower was able to capture the equivalent of 20 tonnes per year of CO{sub 2} on a single square meter of scrubbing material. The team devised a way to use a chemical process from the pulp and paper industry to cut the energy cost of air capture in half. Although the technology is only in its early stage, it appears that CO{sub 2} could be captured from the air with an energy demand comparable to that needed for CO{sub 2} capture from conventional power plants, but costs will be higher. The simple, reliable and scalable technology

  18. Carbon captured from the air

    International Nuclear Information System (INIS)

    Keith, D.

    2008-01-01

    This article presented an innovative way to achieve the efficient capture of atmospheric carbon. A team of scientists from the University of Calgary's Institute for Sustainable Energy, Environment and Economy have shown that it is possible to reduce carbon dioxide (CO 2 ) using a simple machine that can capture the trace amount of CO 2 present in ambient air at any place on the planet. The thermodynamics of capturing the small concentrations of CO 2 from the air is only slightly more difficult than capturing much larger concentrations of CO 2 from power plants. The research is significant because it offers a way to capture CO 2 emissions from transportation sources such as vehicles and airplanes, which represent more than half of the greenhouse gases emitted on Earth. The energy efficient and cost effective air capture technology could complement other approaches for reducing emissions from the transportation sector, such as biofuels and electric vehicles. Air capture differs from carbon capture and storage (CCS) technology used at coal-fired power plants where CO 2 is captured and pipelined for permanent storage underground. Air capture can capture the CO 2 that is present in ambient air and store it wherever it is cheapest. The team at the University of Calgary showed that CO 2 could be captured directly from the air with less than 100 kWhrs of electricity per tonne of CO 2 . A custom-built tower was able to capture the equivalent of 20 tonnes per year of CO 2 on a single square meter of scrubbing material. The team devised a way to use a chemical process from the pulp and paper industry to cut the energy cost of air capture in half. Although the technology is only in its early stage, it appears that CO 2 could be captured from the air with an energy demand comparable to that needed for CO 2 capture from conventional power plants, but costs will be higher. The simple, reliable and scalable technology offers an opportunity to build a commercial-scale plant. 1 fig

  19. The AGIS metric and time of test: A replication study

    OpenAIRE

    Counsell, S; Swift, S; Tucker, A

    2016-01-01

    Visual Field (VF) tests and corresponding data are commonly used in clinical practices to manage glaucoma. The standard metric used to measure glaucoma severity is the Advanced Glaucoma Intervention Studies (AGIS) metric. We know that time of day when VF tests are applied can influence a patient’s AGIS metric value; a previous study showed that this was the case for a data set of 160 patients. In this paper, we replicate that study using data from 2468 patients obtained from Moorfields Eye Ho...

  20. Metric space construction for the boundary of space-time

    International Nuclear Information System (INIS)

    Meyer, D.A.

    1986-01-01

    A distance function between points in space-time is defined and used to consider the manifold as a topological metric space. The properties of the distance function are investigated: conditions under which the metric and manifold topologies agree, the relationship with the causal structure of the space-time and with the maximum lifetime function of Wald and Yip, and in terms of the space of causal curves. The space-time is then completed as a topological metric space; the resultant boundary is compared with the causal boundary and is also calculated for some pertinent examples

  1. Metrics for assessing retailers based on consumer perception

    Directory of Open Access Journals (Sweden)

    Klimin Anastasii

    2017-01-01

    Full Text Available The article suggests a new way of looking at trading platforms, called “metrics.” Metrics are a way to look at the point of sale largely from the buyer’s side. The buyer enters the store and makes buying decisions based on factors that the seller often does not consider, or considers only in part, because he “does not see” them, since he is not a buyer. The article proposes a classification of retailers, metrics and a methodology for their determination, and presents the results of an audit of retailers in St. Petersburg using the proposed methodology.

  2. Predicting class testability using object-oriented metrics

    OpenAIRE

    Bruntink, Magiel; Deursen, Arie

    2004-01-01

    In this paper we investigate factors of the testability of object-oriented software systems. The starting point is given by a study of the literature to obtain both an initial model of testability and existing OO metrics related to testability. Subsequently, these metrics are evaluated by means of two case studies of large Java systems for which JUnit test cases exist. The goal of this paper is to define and evaluate a set of metrics that can be used to assess the testability of t...

  3. Inflation with non-minimal coupling. Metric vs. Palatini formulations

    International Nuclear Information System (INIS)

    Bauer, F.; Demir, D.A.; Izmir Institute of Technology

    2008-03-01

    We analyze non-minimally coupled scalar field theories in metric (second-order) and Palatini (first-order) formalisms in a comparative fashion. After contrasting them in a general setup, we specialize to inflation and find that the two formalisms differ in their predictions for various cosmological parameters. The main reason is that dependencies on the non-minimal coupling parameter are different in the two formalisms. For successful inflation, the Palatini approach prefers a much larger value for the non-minimal coupling parameter than the Metric approach. Unlike the Metric formalism, in Palatini the inflaton stays well below the Planck scale, thereby providing a natural inflationary epoch. (orig.)

  4. Metrics for Diagnosing Undersampling in Monte Carlo Tally Estimates

    International Nuclear Information System (INIS)

    Perfetti, Christopher M.; Rearden, Bradley T.

    2015-01-01

    This study explored the potential of using Markov chain convergence diagnostics to predict the prevalence and magnitude of biases due to undersampling in Monte Carlo eigenvalue and flux tally estimates. Five metrics were applied to two models of pressurized water reactor fuel assemblies and their potential for identifying undersampling biases was evaluated by comparing the calculated test metrics with known biases in the tallies. Three of the five undersampling metrics showed the potential to accurately predict the behavior of undersampling biases in the responses examined in this study.

  5. Kerr-Newman metric in deSitter background

    International Nuclear Information System (INIS)

    Patel, L.K.; Koppar, S.S.; Bhatt, P.V.

    1987-01-01

    In addition to the Kerr-Newman metric with cosmological constant, several other metrics are presented giving Kerr-Newman type solutions of the Einstein-Maxwell field equations in the background of the deSitter universe. The electromagnetic field in all the solutions is assumed to be source-free. A new metric of what may be termed an electrovac rotating deSitter space-time, a space-time devoid of matter but containing a source-free electromagnetic field and a null fluid with twisting rays, has been presented. In the absence of the electromagnetic field, these solutions reduce to those discussed by Vaidya (1984). 8 refs. (author)

  6. Comparison of routing metrics for wireless mesh networks

    CSIR Research Space (South Africa)

    Nxumalo, SL

    2011-09-01

    Full Text Available in each and every relay node so as to find the next hop for the packet. A routing metric is simply a measure used for selecting the best path, used by a routing protocol. Figure 2 shows the relationship between a routing protocol and the routing... on its QoS-awareness level. The routing metrics that considered QoS the most were selected from each group. This section discusses the four routing metrics that were compared in this paper, which are: hop count (HOP), expected transmission count (ETX...
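
    Of the metrics compared above, hop count simply counts the links on a path, while ETX weights each link by the expected number of transmissions, commonly estimated as 1/(df x dr) from the forward and reverse delivery ratios. A minimal sketch with invented delivery ratios:

```python
def link_etx(delivery_fwd, delivery_rev):
    """Expected transmission count of one wireless link,
    estimated from forward and reverse packet delivery ratios."""
    return 1.0 / (delivery_fwd * delivery_rev)

def path_metrics(links):
    """links: list of (forward ratio, reverse ratio) pairs, one per hop."""
    etx = sum(link_etx(df, dr) for df, dr in links)
    return {"hop_count": len(links), "etx": etx}

# Two candidate paths: fewer but lossier hops vs. more but cleaner hops.
print(path_metrics([(0.6, 0.7), (0.5, 0.9)]))                  # 2 hops, higher ETX
print(path_metrics([(0.95, 0.9), (0.9, 0.9), (0.95, 0.95)]))   # 3 hops, lower ETX
```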

  7. Hermitian-Einstein metrics on parabolic stable bundles

    International Nuclear Information System (INIS)

    Li Jiayu; Narasimhan, M.S.

    1995-12-01

    Let M-bar be a compact complex manifold of complex dimension two with a smooth Kaehler metric and D a smooth divisor on M-bar. If E is a rank 2 holomorphic vector bundle on M-bar with a stable parabolic structure along D, we prove the existence of a metric on E' = E restricted to M-bar\D (compatible with the parabolic structure) which is Hermitian-Einstein with respect to the restriction of the Kaehler metric to M-bar\D. A converse is also proved. (author). 24 refs

  8. Culture, intangibles and metrics in environmental management.

    Science.gov (United States)

    Satterfield, Terre; Gregory, Robin; Klain, Sarah; Roberts, Mere; Chan, Kai M

    2013-03-15

    The demand for better representation of cultural considerations in environmental management is increasingly evident. As two cases in point, ecosystem service approaches increasingly include cultural services, and resource planners recognize indigenous constituents and the cultural knowledge they hold as key to good environmental management. Accordingly, collaborations between anthropologists, planners, decision makers and biodiversity experts about the subject of culture are increasingly common-but also commonly fraught. Those whose expertise is culture often engage in such collaborations because they worry a practitioner from 'elsewhere' will employ a 'measure of culture' that is poorly or naively conceived. Those from an economic or biophysical training must grapple with the intangible properties of culture as they intersect with economic, biological or other material measures. This paper seeks to assist those who engage in collaborations to characterize cultural benefits or impacts relevant to decision-making in three ways; by: (i) considering the likely mindset of would-be collaborators; (ii) providing examples of tested approaches that might enable innovation; and (iii) characterizing the kinds of obstacles that are in principle solvable through methodological alternatives. We accomplish these tasks in part by examining three cases wherein culture was a critical variable in environmental decision making: risk management in New Zealand associated with Māori concerns about genetically modified organisms; cultural services to assist marine planning in coastal British Columbia; and a decision-making process involving a local First Nation about water flows in a regulated river in western Canada. We examine how 'culture' came to be manifest in each case, drawing from ethnographic and cultural-models interviews and using subjective metrics (recommended by theories of judgment and decision making) to express cultural concerns. We conclude that the characterization of

  9. US Rocket Propulsion Industrial Base Health Metrics

    Science.gov (United States)

    Doreswamy, Rajiv

    2013-01-01

    The number of active liquid rocket engine and solid rocket motor development programs has severely declined since the "space race" of the 1950s and 1960s. This downward trend has been exacerbated by the retirement of the Space Shuttle, the transition from the Constellation Program to the Space Launch System (SLS), and similar activity in DoD programs. In addition, with consolidation in the industry, the rocket propulsion industrial base (RPIB) is under stress. To improve the "health" of the RPIB, we need to understand the current condition of the RPIB, how this compares to past history, and the trend of RPIB health. This drives the need for a concise set of "metrics", analogous to the basic data a physician uses to determine the state of health of his patients: easy to measure and collect, with the trend often more useful than the actual data point, and usable to focus on problem areas and develop preventative measures. The RPIB is the nation's capability to conceive, design, develop, manufacture, test, and support missions using liquid rocket engines and solid rocket motors that are critical to its national security, economic health and growth, and future scientific needs. The RPIB encompasses US government, academic, and commercial (including industry primes and their supplier base) research, development, test, evaluation, and manufacturing capabilities and facilities. The RPIB includes the skilled workforce, related intellectual property, engineering and support services, and supply chain operations and management. This definition touches the five main segments of the U.S. RPIB as categorized by the USG: defense, intelligence community, civil government, academia, and commercial sector.

  10. Assessment of Brine Management for Geologic Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Breunig, Hanna M. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; Birkholzer, Jens T. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; Borgia, Andrea [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; Price, Phillip N. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; Oldenburg, Curtis M. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; McKone, Thomas E. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division

    2013-06-13

    Geologic carbon sequestration (GCS) is the injection of carbon dioxide (CO2), typically captured from stationary emission sources, into deep geologic formations to prevent its entry into the atmosphere. Active pilot facilities run by regional United States (US) carbon sequestration partnerships inject on the order of one million metric tonnes (mt) CO2 annually, while the US electric power sector emits over 2000 million mt-CO2 annually. GCS is likely to play an increasing role in US carbon mitigation initiatives, but scaling up GCS poses several challenges. Injecting CO2 into sedimentary basins raises fluid pressure in the pore space, which is typically already occupied by naturally occurring, or native, brine. The resulting elevated pore pressures increase the likelihood of induced seismicity, of brine or CO2 escaping into potable groundwater resources, and of CO2 escaping into the atmosphere. Brine extraction is one method for pressure management, in which brine in the injection formation is brought to the surface through extraction wells. Removal of the brine makes room for the CO2 and decreases pressurization. Although the technology required for brine extraction is mature, this form of pressure management will only be applicable if there are cost-effective and sustainable methods of disposing of the extracted brine. Brine extraction, treatment, and disposal may increase the already substantial capital, energy, and water demands of Carbon dioxide Capture and Sequestration (CCS). But, regionally specific brine management strategies may be able to treat the extracted water as a source of revenue, energy, and water to subsidize CCS costs, while minimizing environmental impacts. By this approach, value from the extracted water would be recovered before disposing of any resulting byproducts. Until a price is placed on carbon, we expect that utilities and other CO2 sources will be

  11. Comparison of Highly Resolved Model-Based Exposure Metrics for Traffic-Related Air Pollutants to Support Environmental Health Studies

    Directory of Open Access Journals (Sweden)

    Shih Ying Chang

    2015-12-01

    Full Text Available Human exposure to air pollution in many studies is represented by ambient concentrations from space-time kriging of observed values. Space-time kriging techniques based on a limited number of ambient monitors may fail to capture the concentration from local sources. Further, because people spend more time indoors, using ambient concentration to represent exposure may cause error. To quantify the associated exposure error, we computed a series of six different hourly-based exposure metrics at 16,095 Census blocks of three counties in North Carolina for CO, NOx, PM2.5, and elemental carbon (EC) during 2012. These metrics include ambient background concentration from space-time ordinary kriging (STOK), ambient on-road concentration from the Research LINE source dispersion model (R-LINE), a hybrid concentration combining STOK and R-LINE, and their associated indoor concentrations from an indoor infiltration mass balance model. Using a hybrid-based indoor concentration as the standard, the comparison showed that outdoor STOK metrics yielded large error at both population (67% to 93%) and individual level (average bias between −10% and 95%). For pollutants with significant contribution from on-road emission (EC and NOx), the on-road based indoor metric performs the best at the population level (error less than 52%). At the individual level, however, the STOK-based indoor concentration performs the best (average bias below 30%). For PM2.5, due to the relatively low contribution from on-road emission (7%), the STOK-based indoor metric performs the best at both population (error below 40%) and individual level (error below 25%). The results of the study will help future epidemiology studies to select appropriate exposure metrics and reduce potential bias in exposure characterization.
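
    The indoor metrics above rely on an infiltration mass balance. A common single-compartment, steady-state form (not necessarily the exact model used in the study) scales the outdoor level by an infiltration factor P*a/(a + k); the parameter values below are placeholders.

```python
def indoor_concentration(c_outdoor, air_exchange, penetration=1.0, deposition=0.0):
    """Steady-state indoor concentration from an outdoor level using a
    single-compartment infiltration mass balance with no indoor sources:

        C_in = C_out * P * a / (a + k)

    air_exchange (a) and deposition (k) in 1/h, penetration (P) dimensionless.
    """
    f_inf = penetration * air_exchange / (air_exchange + deposition)
    return f_inf * c_outdoor

# Example: PM2.5 infiltrating a home with 0.5 1/h air exchange
print(indoor_concentration(c_outdoor=12.0, air_exchange=0.5,
                           penetration=0.8, deposition=0.2))  # ~6.9 ug/m3
```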

  12. An Assessment of Geological Carbon Storage Options in the Illinois Basin: Validation Phase

    Energy Technology Data Exchange (ETDEWEB)

    Finley, Robert

    2012-12-01

    The Midwest Geological Sequestration Consortium (MGSC) assessed the options for geological carbon dioxide (CO{sub 2}) storage in the 155,400 km{sup 2} (60,000 mi{sup 2}) Illinois Basin, which underlies most of Illinois, western Indiana, and western Kentucky. The region has annual CO{sub 2} emissions of about 265 million metric tonnes (292 million tons), primarily from 122 coal-fired electric generation facilities, some of which burn almost 4.5 million tonnes (5 million tons) of coal per year (U.S. Department of Energy, 2010). Validation Phase (Phase II) field tests gathered pilot data to update the Characterization Phase (Phase I) assessment of options for capture, transportation, and storage of CO{sub 2} emissions in three geological sink types: coal seams, oil fields, and saline reservoirs. Four small-scale field tests were conducted to determine the properties of rock units that control injectivity of CO{sub 2}, assess the total storage resources, examine the security of the overlying rock units that act as seals for the reservoirs, and develop ways to control and measure the safety of injection and storage processes. The MGSC designed field test operational plans for pilot sites based on the site screening process, MVA program needs, the selection of equipment related to CO{sub 2} injection, and design of a data acquisition system. Reservoir modeling, computational simulations, and statistical methods assessed and interpreted data gathered from the field tests. Monitoring, Verification, and Accounting (MVA) programs were established to detect leakage of injected CO{sub 2} and ensure public safety. Public outreach and education remained an important part of the project; meetings and presentations informed public and private regional stakeholders of the results and findings. A miscible (liquid) CO{sub 2} flood pilot project was conducted in the Clore Formation sandstone (Mississippian System, Chesterian Series) at Mumford Hills Field in Posey County, southwestern

  13. Narrowing the Gap Between QoS Metrics and Web QoE Using Above-the-fold Metrics

    OpenAIRE

    da Hora, Diego Neves; Asrese, Alemnew; Christophides, Vassilis; Teixeira, Renata; Rossi, Dario

    2018-01-01

    Page load time (PLT) is still the most common application Quality of Service (QoS) metric to estimate the Quality of Experience (QoE) of Web users. Yet, recent literature abounds with proposals for alternative metrics (e.g., Above The Fold, SpeedIndex and variants) that aim at better estimating user QoE. The main purpose of this work is thus to thoroughly investigate a mapping between established and recently proposed objective metrics and user QoE. We obtain ground tr...
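
    Metrics in the SpeedIndex family integrate the unfinished fraction of the above-the-fold area over time, so earlier visual progress yields a lower (better) score even when the final load time is unchanged. A minimal sketch with invented render progressions:

```python
def speed_index(times_ms, visual_completeness):
    """SpeedIndex-style score: integral of (1 - visual completeness) over time,
    approximated with the trapezoidal rule."""
    total = 0.0
    for k in range(1, len(times_ms)):
        dt = times_ms[k] - times_ms[k - 1]
        unfinished = (1.0 - visual_completeness[k - 1]) + (1.0 - visual_completeness[k])
        total += 0.5 * unfinished * dt
    return total

# Two pages with the same load time (3 s) but different render progressions.
t = [0, 500, 1000, 2000, 3000]
print(speed_index(t, [0.0, 0.1, 0.2, 0.5, 1.0]))   # late render -> higher (worse)
print(speed_index(t, [0.0, 0.6, 0.9, 1.0, 1.0]))   # early render -> lower (better)
```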

  14. GPS Device Testing Based on User Performance Metrics

    Science.gov (United States)

    2015-10-02

    1. Rationale for a Test Program Based on User Performance Metrics ; 2. Roberson and Associates Test Program ; 3. Status of, and Revisions to, the Roberson and Associates Test Program ; 4. Comparison of Roberson and DOT/Volpe Programs

  15. Quantum anomalies for generalized Euclidean Taub-NUT metrics

    International Nuclear Information System (INIS)

    Cotaescu, Ion I; Moroianu, Sergiu; Visinescu, Mihai

    2005-01-01

    The generalized Taub-NUT metrics in general exhibit gravitational anomalies. This contrasts with the original Taub-NUT metric, which exhibits no gravitational anomalies as a consequence of admitting Killing-Yano tensors that form Staeckel-Killing tensors as products. We have found that for axial anomalies, interpreted as the index of the Dirac operator, the presence of Killing-Yano tensors is irrelevant. In order to evaluate the axial anomalies, we compute the index of the Dirac operator with the APS boundary condition on balls and on annular domains. The result is an explicit number-theoretic quantity depending on the radii of the domain. This quantity is 0 for metrics close to the original Taub-NUT metric but does not vanish in general.

  16. Analyses Of Two End-User Software Vulnerability Exposure Metrics

    Energy Technology Data Exchange (ETDEWEB)

    Jason L. Wright; Miles McQueen; Lawrence Wellman

    2012-08-01

    The risk due to software vulnerabilities will not be completely resolved in the near future. It is therefore important to put reliable vulnerability measures into the hands of end-users so that they can make informed decisions about the relative security exposure incurred by choosing one software package over another. To that end, we propose two new security metrics: average active vulnerabilities (AAV) and vulnerability free days (VFD). These metrics capture both the speed with which new vulnerabilities are reported to vendors and the rate at which software vendors fix them. We then examine how the metrics are computed using currently available datasets and demonstrate their estimation in a simulation experiment using four different browsers as a case study. Finally, we discuss how the metrics may be used by the various stakeholders of software and how they may inform software usage decisions.
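    As a rough illustration of how such metrics could be computed from disclosure and fix dates, here is a hedged sketch based only on how the abstract describes AAV and VFD; the paper's exact definitions and data handling may differ, and the dates below are invented.

```python
# Sketch (one reading of the abstract, not the paper's reference code):
#   AAV -- average number of vulnerabilities that are public but unfixed per day
#   VFD -- number of days in the window with zero such active vulnerabilities
from datetime import date, timedelta

def aav_and_vfd(vulns, start, end):
    """vulns: iterable of (disclosed_date, fixed_date_or_None) pairs."""
    days = (end - start).days + 1
    active_per_day = []
    for i in range(days):
        day = start + timedelta(days=i)
        active = sum(1 for disclosed, fixed in vulns
                     if disclosed <= day and (fixed is None or fixed > day))
        active_per_day.append(active)
    aav = sum(active_per_day) / days
    vfd = sum(1 for a in active_per_day if a == 0)
    return aav, vfd

if __name__ == "__main__":
    vulns = [(date(2012, 1, 3), date(2012, 1, 10)),
             (date(2012, 1, 8), None)]          # second one still unfixed
    print(aav_and_vfd(vulns, date(2012, 1, 1), date(2012, 1, 31)))
```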

  17. Research on cardiovascular disease prediction based on distance metric learning

    Science.gov (United States)

    Ni, Zhuang; Liu, Kui; Kang, Guixia

    2018-04-01

    Distance metric learning algorithms have been widely applied to medical diagnosis and have exhibited their strengths in classification problems. The k-nearest neighbour (KNN) classifier is an efficient method that treats each feature equally. Large margin nearest neighbour classification (LMNN) improves the accuracy of KNN by learning a global distance metric, but it does not consider the locality of data distributions. In this paper, we propose a new distance metric algorithm named COS-SUBLMNN, which adopts a cosine metric within LMNN and pays more attention to local features of the data, in order to overcome this shortcoming of LMNN and improve classification accuracy. The proposed methodology is verified on CVD patient vectors derived from real-world medical data. The experimental results show that our method provides higher accuracy than KNN and LMNN, which demonstrates the effectiveness of the risk prediction model for CVDs based on COS-SUBLMNN.
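    For orientation, the sketch below shows plain k-nearest-neighbour classification under a cosine distance, the basic ingredient the abstract builds on. It is not the authors' COS-SUBLMNN method, which additionally learns a locally adapted metric; the toy data are invented.

```python
# Minimal KNN classifier with cosine distance (illustrative only).
import numpy as np

def cosine_distance(a, b):
    return 1.0 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def knn_predict(X_train, y_train, x, k=3):
    d = np.array([cosine_distance(row, x) for row in X_train])
    nearest = np.argsort(d)[:k]                       # indices of k closest points
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]                  # majority vote

if __name__ == "__main__":
    X = np.array([[1.0, 0.1], [0.9, 0.2], [0.1, 1.0], [0.2, 0.8]])
    y = np.array([0, 0, 1, 1])
    print(knn_predict(X, y, np.array([0.95, 0.15])))  # expected class 0
```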

  18. Computing the Gromov hyperbolicity constant of a discrete metric space

    KAUST Repository

    Ismail, Anas

    2012-01-01

    , and many other areas of research. The Gromov hyperbolicity constant of several families of graphs and geometric spaces has been determined. However, so far, the only known algorithm for calculating the Gromov hyperbolicity constant δ of a discrete metric
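    For context on the quantity being computed, the Gromov hyperbolicity constant of a finite metric space can always be obtained by brute force from the four-point condition; the sketch below is that O(n^4) baseline, offered only as an illustration and not as the algorithm discussed in this record.

```python
# Brute-force four-point Gromov hyperbolicity constant of a finite metric space.
from itertools import combinations

def gromov_delta(d):
    """d: symmetric n x n matrix of pairwise distances (list of lists)."""
    n = len(d)
    delta = 0.0
    for x, y, z, w in combinations(range(n), 4):
        # the three pairings of the quadruple into two pairs
        sums = sorted([d[x][y] + d[z][w],
                       d[x][z] + d[y][w],
                       d[x][w] + d[y][z]], reverse=True)
        delta = max(delta, (sums[0] - sums[1]) / 2.0)
    return delta

if __name__ == "__main__":
    # shortest-path distances on a path graph with 4 vertices (a tree): delta = 0
    d = [[0, 1, 2, 3],
         [1, 0, 1, 2],
         [2, 1, 0, 1],
         [3, 2, 1, 0]]
    print(gromov_delta(d))  # 0.0
```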

  19. Some applications on tangent bundle with Kaluza-Klein metric

    Directory of Open Access Journals (Sweden)

    Murat Altunbaş

    2017-01-01

    Full Text Available In this paper, the differential equations of geodesics, and the parallelism, incompressibility and closeness conditions of the horizontal and complete lifts of vector fields, are investigated with respect to the Kaluza-Klein metric on the tangent bundle.

  20. Curvature properties of four-dimensional Walker metrics

    International Nuclear Information System (INIS)

    Chaichi, M; Garcia-Rio, E; Matsushita, Y

    2005-01-01

    A Walker n-manifold is a semi-Riemannian manifold, which admits a field of parallel null r-planes, r ≤ n/2. In the present paper we study curvature properties of a Walker 4-manifold (M, g) which admits a field of parallel null 2-planes. The metric g is necessarily of neutral signature (+ + - -). Such a Walker 4-manifold is the lowest dimensional example not of Lorentz type. There are three functions of coordinates which define a Walker metric. Some recent work shows that a Walker 4-manifold of restricted type whose metric is characterized by two functions exhibits a large variety of symplectic structures, Hermitian structures, Kaehler structures, etc. For such a restricted Walker 4-manifold, we shall study mainly curvature properties, e.g., conditions for a Walker metric to be Einstein, Osserman, or locally conformally flat, etc. One of our main results is the exact solutions to the Einstein equations for a restricted Walker 4-manifold

  1. Office Skills: Metric Problems in the Typing Classroom

    Science.gov (United States)

    Panagoplos, Nicholas A.

    1978-01-01

    Discusses problems of metric conversion in the typewriting classroom, as most typewriters have spacing in inches, and shows how to teach students to adjust their typewritten work for this spacing. (MF)

  2. On a Theorem of Khan in a Generalized Metric Space

    Directory of Open Access Journals (Sweden)

    Jamshaid Ahmad

    2013-01-01

    Full Text Available Existence and uniqueness of fixed points are established for a mapping satisfying a contractive condition involving a rational expression on a generalized metric space. Several particular cases and applications as well as some illustrative examples are given.

  3. A bridge role metric model for nodes in software networks.

    Directory of Open Access Journals (Sweden)

    Bo Li

    Full Text Available A bridge role metric model is put forward in this paper. Compared with previous metric models, our approach, which treats a large-scale object-oriented software system as a complex network, is inherently more realistic. To acquire the nodes and links of an undirected network, a new model is presented that captures the crucial connectivity of a module or hub, rather than only centrality as in previous metric models. Two previous metric models are described for comparison. In addition, the relationship between the Bre results and node degrees is well fitted by a power law. The model reflects many realistic characteristics of actual software structures, and a hydropower simulation system is taken as an example. This paper contributes to an accurate understanding of the module design of software systems and is expected to be beneficial to software engineering practices.
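    As a loose illustration of treating a software system as an undirected network of modules, the sketch below computes degree and betweenness centrality as crude proxies for a bridging role. It is not the paper's Bre metric, and the dependency graph is an invented toy example (requires networkx).

```python
# Toy module-dependency network with simple "bridge" proxies (not the Bre metric).
import networkx as nx

edges = [("core", "io"), ("core", "ui"), ("io", "net"),
         ("ui", "plot"), ("core", "db"), ("db", "net")]

G = nx.Graph(edges)
degree = dict(G.degree())                      # connectivity of each module
betweenness = nx.betweenness_centrality(G)     # how often a module lies on shortest paths

for node in G.nodes():
    print(f"{node:5s} degree={degree[node]} betweenness={betweenness[node]:.2f}")
```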

  4. Tripled Fixed Point in Ordered Multiplicative Metric Spaces

    Directory of Open Access Journals (Sweden)

    Laishram Shanjit

    2017-06-01

    Full Text Available In this paper, we present some triple fixed point theorems in partially ordered multiplicative metric spaces depended on another function. Our results generalise the results of [6] and [5].

  5. Effecting IT infrastructure culture change: management by processes and metrics

    Science.gov (United States)

    Miller, R. L.

    2001-01-01

    This talk describes the processes and metrics used by Jet Propulsion Laboratory to bring about the required IT infrastructure culture change to update and certify, as Y2K compliant, thousands of computers and millions of lines of code.

  6. Metrics for Objective Assessment of Surgical Skills Workshop

    National Research Council Canada - National Science Library

    Satava, Richard

    2001-01-01

    On 9-10 July, 2001 the Metrics for Objective Assessment of Surgical Skills Workshop convened an international assemblage of subject matter experts in objective assessment of surgical technical skills...

  7. Resilient Control Systems Practical Metrics Basis for Defining Mission Impact

    Energy Technology Data Exchange (ETDEWEB)

    Craig G. Rieger

    2014-08-01

    "Resilience” describes how systems operate at an acceptable level of normalcy despite disturbances or threats. In this paper we first consider the cognitive, cyber-physical interdependencies inherent in critical infrastructure systems and how resilience differs from reliability to mitigate these risks. Terminology and metrics basis are provided to integrate the cognitive, cyber-physical aspects that should be considered when defining solutions for resilience. A practical approach is taken to roll this metrics basis up to system integrity and business case metrics that establish “proper operation” and “impact.” A notional chemical processing plant is the use case for demonstrating how the system integrity metrics can be applied to establish performance, and

  8. New fixed and periodic point results on cone metric spaces

    Directory of Open Access Journals (Sweden)

    Ghasem Soleimani Rad

    2014-05-01

    Full Text Available In this paper, several fixed point theorems for T-contractions of two maps on cone metric spaces under the normality condition are proved. The obtained results extend and generalize well-known comparable results in the literature.

  9. Green Chemistry Metrics with Special Reference to Green Analytical Chemistry

    Directory of Open Access Journals (Sweden)

    Marek Tobiszewski

    2015-06-01

    Full Text Available The concept of green chemistry is widely recognized in chemical laboratories. To properly measure an environmental impact of chemical processes, dedicated assessment tools are required. This paper summarizes the current state of knowledge in the field of development of green chemistry and green analytical chemistry metrics. The diverse methods used for evaluation of the greenness of organic synthesis, such as eco-footprint, E-Factor, EATOS, and Eco-Scale are described. Both the well-established and recently developed green analytical chemistry metrics, including NEMI labeling and analytical Eco-scale, are presented. Additionally, this paper focuses on the possibility of the use of multivariate statistics in evaluation of environmental impact of analytical procedures. All the above metrics are compared and discussed in terms of their advantages and disadvantages. The current needs and future perspectives in green chemistry metrics are also discussed.

  10. Green Chemistry Metrics with Special Reference to Green Analytical Chemistry.

    Science.gov (United States)

    Tobiszewski, Marek; Marć, Mariusz; Gałuszka, Agnieszka; Namieśnik, Jacek

    2015-06-12

    The concept of green chemistry is widely recognized in chemical laboratories. To properly measure an environmental impact of chemical processes, dedicated assessment tools are required. This paper summarizes the current state of knowledge in the field of development of green chemistry and green analytical chemistry metrics. The diverse methods used for evaluation of the greenness of organic synthesis, such as eco-footprint, E-Factor, EATOS, and Eco-Scale are described. Both the well-established and recently developed green analytical chemistry metrics, including NEMI labeling and analytical Eco-scale, are presented. Additionally, this paper focuses on the possibility of the use of multivariate statistics in evaluation of environmental impact of analytical procedures. All the above metrics are compared and discussed in terms of their advantages and disadvantages. The current needs and future perspectives in green chemistry metrics are also discussed.
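    As a worked micro-example of one of the mass-based greenness metrics named in the two records above, the sketch below computes an E-factor. The masses are invented, and the definition follows common usage (waste generated per kilogram of product) rather than any specific procedure from these papers.

```python
# Worked micro-example of the E-factor greenness metric (illustrative values).

def e_factor(total_input_kg, product_kg):
    """Waste generated per kilogram of product (waste = inputs - product)."""
    waste_kg = total_input_kg - product_kg
    return waste_kg / product_kg

if __name__ == "__main__":
    # e.g. 30 kg of reagents and solvents yielding 5 kg of product
    print(e_factor(total_input_kg=30.0, product_kg=5.0))  # E-factor = 5.0
```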

  11. Rotationally symmetric extremal pseudo-Kähler metrics of non ...

    Indian Academy of Sciences (India)

    Xiaojuan Duan

    Department of Applied Mathematics, Xiamen University of Technology, Xiamen 361024, China. Published online 22 March 2018. Abstract: ... of Kähler Ricci-flat metrics that depend on a parameter a. When a → 0+.

  12. Greenroads : a sustainability performance metric for roadway design and construction.

    Science.gov (United States)

    2009-11-01

    Greenroads is a performance metric for quantifying sustainable practices associated with roadway design and construction. Sustainability is defined as having seven key components: ecology, equity, economy, extent, expectations, experience and exposure.

  13. Software bug prediction using object-oriented metrics

    Indian Academy of Sciences (India)

    Dharmendra Lal Gupta

    Department of Computer Science and Engineering, Mewar University, Chittorgarh 312901, India. ... the object-oriented technology has been widely accepted ... whereas project metrics cover the number of staff members involved in ...

  14. An inheritance complexity metric for object-oriented code: A ...

    Indian Academy of Sciences (India)

    Department of Computer Engineering, Atilim University, 06836, Ankara, Turkey ... applied our metric on a real project for empirical validation and compared it with ... being insufficiently generalized or too implementation technology dependent.

  15. The entire sequence over Musielak p-metric space

    Directory of Open Access Journals (Sweden)

    C. Murugesan

    2016-04-01

    Full Text Available In this paper, we introduce Fibonacci numbers of the Γ2(F) sequence space over p-metric spaces defined by a Musielak function and examine some topological properties of the resulting spaces.

  16. The independence of software metrics taken at different life-cycle stages

    Science.gov (United States)

    Kafura, D.; Canning, J.; Reddy, G.

    1984-01-01

    Over the past few years a large number of software metrics have been proposed and, to varying degrees, subjected to empirical validation demonstrating their utility in the software development process. This work studies attempts to classify these metrics and to determine whether the metrics in these different classes measure distinct attributes of the software product. Statistical analysis is used to determine the degree of relationship among the metrics.
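    A minimal illustration of the kind of statistical analysis described, assuming Pearson correlation as the measure of relationship between metrics; the metric values below are invented.

```python
# Pairwise correlations among several software metrics (toy values).
import numpy as np

metrics = {
    "loc":        [120, 340, 90, 560, 210],   # lines of code per module
    "cyclomatic": [8, 25, 5, 40, 14],         # cyclomatic complexity
    "fan_out":    [3, 9, 2, 12, 5],           # calls to other modules
}

values = np.array(list(metrics.values()))
print(list(metrics.keys()))
print(np.corrcoef(values).round(2))           # Pearson correlation matrix
```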

  17. Reconstructing an economic space from a market metric

    OpenAIRE

    Mendes, R. Vilela; Araújo, Tanya; Louçã, Francisco

    2002-01-01

    Using a metric related to the returns correlation, a method is proposed to reconstruct an economic space from market data. A reduced subspace, associated with the systematic structure of the market, is identified and its dimension related to the number of terms in factor models. Examples are worked out involving sets of companies from the DJIA and S&P500 indexes. Having a metric defined in the space of companies, network topology coefficients may be used to extract further information from ...
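    The abstract does not spell out its metric; a standard choice in this literature is the correlation-based distance d_ij = sqrt(2(1 − ρ_ij)), sketched below on invented toy returns as an illustration of the general idea rather than the paper's exact construction.

```python
# Correlation-based distance matrix between assets (illustrative sketch).
import numpy as np

def correlation_distance_matrix(returns):
    """returns: (n_days, n_assets) array of asset returns."""
    rho = np.corrcoef(returns, rowvar=False)   # asset-asset correlation matrix
    return np.sqrt(2.0 * (1.0 - rho))          # d_ij = sqrt(2 * (1 - rho_ij))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    toy_returns = rng.normal(size=(250, 4))    # 250 days, 4 assets
    print(correlation_distance_matrix(toy_returns).round(3))
```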

  18. Comparison of exit time moment spectra for extrinsic metric balls

    DEFF Research Database (Denmark)

    Hurtado, Ana; Markvorsen, Steen; Palmer, Vicente

    2012-01-01

    We prove explicit upper and lower bounds for the $L^1$-moment spectra for the Brownian motion exit time from extrinsic metric balls of submanifolds $P^m$ in ambient Riemannian spaces $N^n$. We assume that $P$ and $N$ both have controlled radial curvatures (mean curvature and sectional curvature...... obtain new intrinsic comparison results for the exit time spectra for metric balls in the ambient manifolds $N^n$ themselves....

  19. Wireless Sensor Network Metrics for Real-Time Systems

    Science.gov (United States)

    2009-05-20

    Wireless Sensor Network Metrics for Real-Time Systems. Phoebus Wei-Chih Chen, Electrical Engineering and Computer Sciences, University of California at ... wireless sensor networks (WSNs) is moving from studies of WSNs in isolation toward studies where the WSN is treated as a component of a larger system

  20. Vacuum structure for indefinite-metric quantum field theory

    International Nuclear Information System (INIS)

    Rabuffo, I.; Vitiello, G.

    1978-01-01

    An approach to indefinite-metric QFT is presented in which the fundamental state of the theory is constructed by taking advantage of the existence of infinitely many unitarily inequivalent representations of the commutation relations. Use of the metric operator eta is avoided. Physical states are positive normed states. The probabilistic interpretation of the norms is fully recovered. An application to a simple model is given. Considerations on the statistical aspects of the construction conclude the paper