WorldWideScience

Sample records for billion metric tons

  1. The updated billion-ton resource assessment

    Science.gov (United States)

    Anthony Turhollow; Robert Perlack; Laurence Eaton; Matthew Langholtz; Craig Brandt; Mark Downing; Lynn Wright; Kenneth Skog; Chad Hellwinckel; Bryce Stokes; Patricia Lebow

    2014-01-01

    This paper summarizes the results of an update to a resource assessment, published in 2005, commonly referred to as the Billion-Ton Study (BTS). The updated results are consistent with the 2005 BTS in terms of overall magnitude. The 2005 BTS projected between 860 and 1240 Tg of biomass available in the 2050 timeframe, while the Billion-Ton Update (BT2), for a price of...

  2. Sneak Peek to the 2016 Billion-Ton Report

    Energy Technology Data Exchange (ETDEWEB)

    None

    2016-06-01

    The 2005 Billion-Ton Study became a landmark resource for bioenergy stakeholders, detailing for the first time the potential to produce at least one billion dry tons of biomass annually in a sustainable manner from U.S. agriculture and forest resources. The 2011 U.S. Billion-Ton Update expanded and updated the analysis, and in 2016, the U.S. Department of Energy’s Bioenergy Technologies Office plans to release the 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy.

  3. Summary and Comparison of the 2016 Billion-Ton Report with the 2011 U.S. Billion-Ton Update

    Energy Technology Data Exchange (ETDEWEB)

    None

    2016-06-01

    In terms of the magnitude of the resource potential, the results of the 2016 Billion-Ton Report (BT16) are consistent with the original 2005 Billion-Ton Study (BTS) and the 2011 report, U.S. Billion-Ton Update: Biomass Supply for a Bioenergy and Bioproducts Industry (BT2). An effort was made to reevaluate the potential forestland, agricultural, and waste resources at the roadside and then to extend the analysis for the major resource fractions by adding transportation costs to a biorefinery under specified logistics assumptions.

  4. Regional Feedstock Partnership Summary Report: Enabling the Billion-Ton Vision

    Energy Technology Data Exchange (ETDEWEB)

    Owens, Vance N. [South Dakota State Univ., Brookings, SD (United States). North Central Sun Grant Center; Karlen, Douglas L. [Dept. of Agriculture Agricultural Research Service, Ames, IA (United States). National Lab. for Agriculture and the Environment; Lacey, Jeffrey A. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Process Science and Technology Division

    2016-07-12

    The U.S. Department of Energy (DOE) and the Sun Grant Initiative established the Regional Feedstock Partnership (referred to as the Partnership) to address information gaps associated with enabling the vision of a sustainable, reliable, billion-ton U.S. bioenergy industry by the year 2030 (i.e., the Billion-Ton Vision). Over the past 7 years (2008–2014), the Partnership has been successful at advancing the biomass feedstock production industry in the United States, with notable accomplishments. The Billion-Ton Study identifies the technical potential to expand domestic biomass production to offset up to 30% of U.S. petroleum consumption, while continuing to meet demands for food, feed, fiber, and export. This study verifies for the biofuels and chemical industries that a real and substantial resource base could justify the significant investment needed to develop robust conversion technologies and commercial-scale facilities. DOE and the Sun Grant Initiative established the Partnership to demonstrate and validate the assumptions underpinning the Billion-Ton Vision to supply a sustainable and reliable source of lignocellulosic feedstock to a large-scale bioenergy industry. This report discusses the accomplishments of the Partnership, with references to accompanying scientific publications. These accomplishments include advances in sustainable feedstock production, feedstock yield, yield stability and stand persistence, energy crop commercialization readiness, information transfer, assessment of the economic impacts of achieving the Billion-Ton Vision, and the impact of feedstock species and environmental conditions on feedstock quality characteristics.

  5. 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy

    Energy Technology Data Exchange (ETDEWEB)

    None

    2016-07-06

    This product builds on previous efforts, namely the 2005 Billion-Ton Study (BTS) and the 2011 U.S. Billion-Ton Update (BT2). With each report, greater perspective is gained on the potential of biomass resources to contribute to a national energy strategy. Similarly, each successive report introduces new questions regarding commercialization challenges. BTS quantified the broad biophysical potential of biomass nationally, and BT2 elucidated the potential economic availability of these resources. These reports clearly established the potential availability of up to one billion tons of biomass resources nationally. However, many questions remain, including but not limited to crop yields, climate change impacts, logistical operations, and systems integration across production, harvest, and conversion. The present report aims to address many of these questions through empirically modeled energy crop yields, scenario analysis of resources delivered to biorefineries, and the addition of new feedstocks. Volume 2 of the 2016 Billion-Ton Report is expected to be released by the end of 2016. It seeks to evaluate environmental sustainability indicators of select scenarios from volume 1 and potential climate change impacts on future supplies.

  6. 2016 Billion-Ton Report: Environmental Sustainability Effects of Select Scenarios from Volume 1 (Volume 2)

    Energy Technology Data Exchange (ETDEWEB)

    Efroymson, R. A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Langholtz, M. H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, K. E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Stokes, B. J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-13

    On behalf of all the authors and contributors, it is a great privilege to present the 2016 Billion-Ton Report (BT16), volume 2: Environmental Sustainability Effects of Select Scenarios from volume 1. This report represents the culmination of several years of collaborative effort among national laboratories, government agencies, academic institutions, and industry. BT16 was developed to support the U.S. Department of Energy’s efforts towards national goals of energy security and associated quality of life.

  7. 2016 Billion-ton report: Advancing domestic resources for a thriving bioeconomy, Volume 1: Economic availability of feedstock

    Science.gov (United States)

    M.H. Langholtz; B.J. Stokes; L.M. Eaton

    2016-01-01

    This product builds on previous efforts, namely the 2005 Billion-Ton Study (BTS) and the 2011 U.S. Billion-Ton Update (BT2). With each report, greater perspective is gained on the potential of biomass resources to contribute to a national energy strategy. Similarly, each successive report introduces new questions regarding commercialization challenges. BTS quantified...

  8. U.S. Billion-Ton Update: Biomass Supply for a Bioenergy and Bioproducts Industry

    Energy Technology Data Exchange (ETDEWEB)

    Downing, Mark [ORNL; Eaton, Laurence M [ORNL; Graham, Robin Lambert [ORNL; Langholtz, Matthew H [ORNL; Perlack, Robert D [ORNL; Turhollow Jr, Anthony F [ORNL; Stokes, Bryce [Navarro Research & Engineering; Brandt, Craig C [ORNL

    2011-08-01

    The report, Biomass as Feedstock for a Bioenergy and Bioproducts Industry: The Technical Feasibility of a Billion-Ton Annual Supply (generally referred to as the Billion-Ton Study or 2005 BTS), was an estimate of 'potential' biomass based on numerous assumptions about current and future inventory, production capacity, availability, and technology. The analysis was made to determine if conterminous U.S. agriculture and forestry resources had the capability to produce at least one billion dry tons of sustainable biomass annually to displace 30% or more of the nation's present petroleum consumption. An effort was made to use conservative estimates to assure confidence in having sufficient supply to reach the goal. The potential biomass was projected to be reasonably available around mid-century when large-scale biorefineries are likely to exist. The study emphasized primary sources of forest- and agriculture-derived biomass, such as logging residues, fuel treatment thinnings, crop residues, and perennially grown grasses and trees. These primary sources have the greatest potential to supply large, reliable, and sustainable quantities of biomass. While the primary sources were emphasized, estimates of secondary residue and tertiary waste resources of biomass were also provided. The original Billion-Ton Resource Assessment, published in 2005, was divided into two parts: forest-derived resources and agriculture-derived resources. The forest resources included residues produced during the harvesting of merchantable timber, forest residues, and small-diameter trees that could become available through initiatives to reduce fire hazards and improve forest health; forest residues from land conversion; fuelwood extracted from forests; residues generated at primary forest product processing mills; and urban wood wastes, municipal solid wastes (MSW), and construction and demolition (C&D) debris. For these forest resources, only residues, wastes, and small

  9. Taking out 1 billion tons of CO2: The magic of China's 11th Five-Year Plan?

    International Nuclear Information System (INIS)

    Lin Jiang; Zhou Nan; Levine, Mark; Fridley, David

    2008-01-01

    China's 11th Five-Year Plan (FYP) sets an ambitious target for energy-efficiency improvement: energy intensity of the country's gross domestic product (GDP) should be reduced by 20% from 2005 to 2010 [National Development and Reform Commission (NDRC), 2006. Overview of the 11th Five Year Plan for National Economic and Social Development. NDRC, Beijing]. This is the first time that a quantitative and binding target has been set for energy efficiency, and signals a major shift in China's strategic thinking about its long-term economic and energy development. The 20% energy-intensity target also translates into an annual reduction of over 1.5 billion tons of CO2 by 2010, making the Chinese effort one of the most significant carbon mitigation efforts in the world today. While it is still too early to tell whether China will achieve this target, this paper attempts to understand the trend in energy intensity in China and to explore a variety of options toward meeting the 20% target using a detailed end-use energy model.
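
    As a rough illustration of how an intensity target of this kind translates into avoided emissions, the sketch below combines the 20% intensity cut with assumed values for 2005 energy use, GDP growth, and the CO2 intensity of energy. Every numeric input is an illustrative assumption, not a figure taken from the paper.

    ```python
    # Back-of-the-envelope sketch: translating a 20% cut in energy intensity
    # (energy per unit of GDP) over 2005-2010 into avoided CO2 emissions.
    # All inputs below are assumptions chosen for illustration only.

    energy_2005_btce = 2.2   # assumed 2005 primary energy use, billion tonnes of coal equivalent
    gdp_growth = 0.10        # assumed average annual GDP growth rate, 2005-2010
    co2_per_tce = 2.5        # assumed tonnes of CO2 per tonne of coal equivalent
    years = 5
    intensity_cut = 0.20     # the 11th FYP target

    gdp_factor = (1 + gdp_growth) ** years
    energy_frozen = energy_2005_btce * gdp_factor        # 2010 energy if intensity stayed at the 2005 level
    energy_target = energy_frozen * (1 - intensity_cut)  # 2010 energy if the 20% cut is achieved
    avoided_co2_gt = (energy_frozen - energy_target) * co2_per_tce

    print(f"Avoided CO2 in 2010: ~{avoided_co2_gt:.1f} billion tonnes")
    # With these illustrative inputs the result is on the order of 1.8 Gt,
    # broadly consistent with the paper's "over 1.5 billion tons of CO2".
    ```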

  10. Taking out 1 billion tons of CO2: The magic of China's 11th Five-Year Plan?

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Nan; Lin, Jiang; Zhou, Nan; Levine, Mark; Fridley, David

    2007-07-01

    China's 11th Five-Year Plan (FYP) sets an ambitious target for energy-efficiency improvement: energy intensity of the country's gross domestic product (GDP) should be reduced by 20% from 2005 to 2010 (NDRC, 2006). This is the first time that a quantitative and binding target has been set for energy efficiency, and signals a major shift in China's strategic thinking about its long-term economic and energy development. The 20% energy intensity target also translates into an annual reduction of over 1.5 billion tons of CO2 by 2010, making the Chinese effort one of the most significant carbon mitigation efforts in the world today. While it is still too early to tell whether China will achieve this target, this paper attempts to understand the trend in energy intensity in China and to explore a variety of options toward meeting the 20% target using a detailed end-use energy model.

  11. Land-Use Change and the Billion Ton 2016 Resource Assessment: Understanding the Effects of Land Management on Environmental Indicators

    Science.gov (United States)

    Kline, K. L.; Eaton, L. M.; Efroymson, R.; Davis, M. R.; Dunn, J.; Langholtz, M. H.

    2016-12-01

    The federal government, led by the U.S. Department of Energy (DOE), quantified potential U.S. biomass resources for expanded production of renewable energy and bioproducts in the 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy (BT16) (DOE 2016). Volume 1 of the report provides analysis of projected supplies from 2015 to 2040. Volume 2 (forthcoming) evaluates changes in environmental indicators for water quality and quantity, carbon, air quality, and biodiversity associated with production scenarios in BT16 volume 1. This presentation will review land-use allocations under the projected biomass production scenarios and the changes in land management that are implied, including drivers of direct and indirect LUC. National and global concerns such as deforestation and displacement of food production are addressed. The choice of reference scenario, input parameters, and constraints (e.g., regarding land classes, availability, and productivity) drives LUC results in any model simulation; these choices are reviewed to put BT16 impacts into context. The principal LUC implied in BT16 supply scenarios involves the transition of 25 to 47 million acres (net) from annual crops in the 2015 baseline to perennial cover by 2040 under the base case and the 3% yield growth case, respectively. We conclude that clear definitions of land parameters and effects are essential to assess LUC. A lack of consistency in parameters and outcomes of historic LUC analyses in the U.S. underscores the need for science-based approaches.

  12. Biomass as Feedstock for a Bioenergy and Bioproducts Industry: The Technical Feasability of a Billion-Ton Annual Supply

    Energy Technology Data Exchange (ETDEWEB)

    Perlack, R.D.

    2005-12-15

    The purpose of this report is to determine whether the land resources of the United States are capable of producing a sustainable supply of biomass sufficient to displace 30 percent or more of the country's present petroleum consumption--the goal set by the Advisory Committee in their vision for biomass technologies. Accomplishing this goal would require approximately 1 billion dry tons of biomass feedstock per year.

  13. Biomass as Feedstock for a Bioenergy and Bioproducts Industry: The Technical Feasibility of a Billion-Ton Annual Supply, April 2005

    Energy Technology Data Exchange (ETDEWEB)

    None

    2005-04-01

    The purpose of this report is to determine whether the land resources of the United States are capable of producing a sustainable supply of biomass sufficient to displace 30 percent or more of the country’s present petroleum consumption – the goal set by the Biomass R&D Technical Advisory Committee in their vision for biomass technologies. Accomplishing this goal would require approximately 1 billion dry tons of biomass feedstock per year.

  14. Semiportable load-cell-based weighing system prototype of 18.14-metric-ton (20-ton) capacity for UF6 cylinder weight verifications: description and testing procedure

    International Nuclear Information System (INIS)

    McAuley, W.A.

    1984-01-01

    The 18.14-metric-ton-capacity (20-ton) Load-Cell-Based Weighing System (LCBWS) prototype tested at the Oak Ridge (Tennessee) Gaseous Diffusion Plant March 20-30, 1984, is semiportable and has the potential for being highly accurate. Designed by Brookhaven National Laboratory, it can be moved to cylinders for weighing as opposed to the widely used operating philosophy of most enrichment facilities of moving cylinders to stationary accountability scales. Composed mainly of commercially available, off-the-shelf hardware, the system's principal elements are two load cells that sense the weight (i.e., force) of a uranium hexafluoride (UF6) cylinder suspended from the LCBWS while the cylinder is in the process of being weighed. Portability is achieved by its attachment to a double-hook, overhead-bridge crane. The LCBWS prototype is designed to weigh 9.07- and 12.70-metric ton (10- and 14-ton) UF6 cylinders. A detailed description of the LCBWS is given, design information and criteria are supplied, a testing procedure is outlined, and initial test results are reported. A major objective of the testing is to determine the reliability and accuracy of the system. Other testing objectives include the identification of (1) potential areas for system improvements and (2) procedural modifications that will reflect an improved and more efficient system. The testing procedure described includes, but is not limited to, methods that account for temperature sensitivity of the instrumentation, the local variation in the acceleration due to gravity, and buoyancy effects. Operational and safety considerations are noted. A preliminary evaluation of the March test data indicates that the LCBWS prototype has the potential to achieve an accuracy in the vicinity of 1 kg
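
    The corrections mentioned in the testing procedure can be illustrated with a generic force-measurement relation. This is a textbook-style sketch of the local-gravity and air-buoyancy corrections for a suspended load, not the report's own equations; the symbols are introduced here purely for illustration.

    ```latex
    % Illustrative correction of a load-cell reading for local gravity and air buoyancy.
    %   F         : net downward force sensed by the load cells
    %   g         : local acceleration due to gravity (varies from site to site)
    %   \rho_air  : density of the ambient air
    %   \rho_cyl  : bulk density of the suspended cylinder (mass over displaced volume)
    \[
      m \;=\; \frac{F}{\,g\left(1 - \rho_{\mathrm{air}}/\rho_{\mathrm{cyl}}\right)\,}
    \]
    ```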

  15. 50 CFR Table 1a to Part 660... - 2009, Specifications of ABCs, OYs, and HGs, by Management Area (weights in metric tons)

    Science.gov (United States)

    2010-10-01

    Table 1a to Part 660, Subpart C (Wildlife and Fisheries; Fishery Conservation and Management; National Oceanic and Atmospheric Administration, Department of ...): 2009 specifications of ABCs, OYs, and HGs, by management area (weights in metric tons).

  16. 50 CFR Table 2a to Part 660... - 2010, Specifications of ABCs, OYs, and HGs, by Management Area (weights in metric tons)

    Science.gov (United States)

    2010-10-01

    Table 2a to Part 660, Subpart C (Wildlife and Fisheries; Fishery Conservation and Management; National Oceanic and Atmospheric Administration, Department of ...): 2010 specifications of ABCs, OYs, and HGs, by management area (weights in metric tons).

  17. 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy, Volume 2: Environmental Sustainability Effects of Select Scenarios from Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Efroymson, Rebecca Ann [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Langholtz, Matthew H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-11

    With the goal of understanding environmental effects of a growing bioeconomy, the U.S. Department of Energy (DOE), national laboratories, and U.S. Forest Service research laboratories, together with academic and industry collaborators, undertook a study to estimate environmental effects of potential biomass production scenarios in the United States, with an emphasis on agricultural and forest biomass. Potential effects investigated include changes in soil organic carbon (SOC), greenhouse gas (GHG) emissions, water quality and quantity, air emissions, and biodiversity. Effects of altered land-management regimes were analyzed based on select county-level biomass-production scenarios for 2017 and 2040 taken from the 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy (BT16), volume 1, which assumes that the land bases for agriculture and forestry would not change over time. The scenarios reflect constraints on biomass supply (e.g., excluded areas; implementation of management practices; and consideration of food, feed, forage, and fiber demands and exports) that are intended to address sustainability concerns. Nonetheless, both beneficial and adverse environmental effects might be expected. To characterize these potential effects, this research sought to estimate where and under what modeled scenarios or conditions positive and negative environmental effects could occur nationwide. The report also includes a discussion of land-use change (LUC) (i.e., land management change) assumptions associated with the scenario transitions (but not including analysis of indirect LUC [ILUC]), analyses of climate sensitivity of feedstock productivity under a set of potential scenarios, and a qualitative environmental effects analysis of algae production under carbon dioxide (CO2) co-location scenarios. Because BT16 biomass supplies are simulated independent of a defined end use, most analyses do not include benefits from displacing fossil fuels or other

  18. 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy, Volume 2: Environmental Sustainability Effects of Select Scenarios from Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Efroymson, Rebecca Ann [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Langholtz, Matthew H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Kristen [Dept. of Energy (DOE), Washington DC (United States); Stokes, Bryce [Allegheny Science & Technology, LLC, Bridgeport, WV (United States); Brandt, Craig C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davis, Maggie R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hellwinckel, Chad [Univ. of Tennessee, Knoxville, TN (United States); Kline, Keith L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Eaton, Laurence M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Dunn, Jennifer [Argonne National Lab. (ANL), Argonne, IL (United States); Canter, Christina E. [Argonne National Lab. (ANL), Argonne, IL (United States); Qin, Zhangcai [Argonne National Lab. (ANL), Argonne, IL (United States); Cai, Hao [Argonne National Lab. (ANL), Argonne, IL (United States); Wang, Michael [Argonne National Lab. (ANL), Argonne, IL (United States); Scott, D. Andrew [USDA Forest Service, Normal, AL (United States); Jager, Henrietta I. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wu, May [Argonne National Lab. (ANL), Argonne, IL (United States); Ha, Miae [Argonne National Lab. (ANL), Argonne, IL (United States); Baskaran, Latha Malar [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kreig, Jasmine A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Rau, Benjamin [USDA Forest Service, Aiken, SC (United States); Muwamba, Augustine [Univ. of Georgia, Athens, GA (United States); Trettin, Carl [USDA Forest Service, Aiken, SC (United States); Panda, Sudhanshu [Univ. of North Georgia, Oakwood, GA (United States); Amatya, Devendra M. [USDA Forest Service, Aiken, SC (United States); Tollner, Ernest W. [USDA Forest Service, Aiken, SC (United States); Sun, Ge [USDA Forest Service, Aiken, SC (United States); Zhang, Liangxia [USDA Forest Service, Aiken, SC (United States); Duan, Kai [North Carolina State Univ., Raleigh, NC (United States); Warner, Ethan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Zhang, Yimin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Inman, Daniel [National Renewable Energy Lab. (NREL), Golden, CO (United States); Eberle, Annika [National Renewable Energy Lab. (NREL), Golden, CO (United States); Carpenter, Alberta [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heath, Garvin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hettinger, Dylan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wang, Gangsheng [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sutton, Nathan J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Busch, Ingrid Karin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Donner, Deahn M. [USDA Forest Service, Aiken, SC (United States); Wigley, T. Bently [National Council for Air and Stream Improvement (NCASI), Research Triangle Park, NC (United States); Miller, Darren A. [Weyerhaeuser Company, Federal Way, WA (United States); Coleman, Andre [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wigmosta, Mark [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Pattullo, Molly [Univ. of Tennessee, Knoxville, TN (United States); Mayes, Melanie [Oak Ridge National Lab. 
(ORNL), Oak Ridge, TN (United States); Daly, Christopher [Oregon State Univ., Corvallis, OR (United States); Halbleib, Mike [Oregon State Univ., Corvallis, OR (United States); Negri, Cristina [Argonne National Lab. (ANL), Argonne, IL (United States); Turhollow, Anthony F. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bonner, Ian [Monsanto Company, Twin Falls, ID (United States); Dale, Virginia H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-01

    With the goal of understanding environmental effects of a growing bioeconomy, the U.S. Department of Energy (DOE), national laboratories, and U.S. Forest Service research laboratories, together with academic and industry collaborators, undertook a study to estimate environmental effects of potential biomass production scenarios in the United States, with an emphasis on agricultural and forest biomass. Potential effects investigated include changes in soil organic carbon (SOC), greenhouse gas (GHG) emissions, water quality and quantity, air emissions, and biodiversity. Effects of altered land-management regimes were analyzed based on select county-level biomass-production scenarios for 2017 and 2040 taken from the 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy (BT16), volume 1, which assumes that the land bases for agriculture and forestry would not change over time. The scenarios reflect constraints on biomass supply (e.g., excluded areas; implementation of management practices; and consideration of food, feed, forage, and fiber demands and exports) that are intended to address sustainability concerns. Nonetheless, both beneficial and adverse environmental effects might be expected. To characterize these potential effects, this research sought to estimate where and under what modeled scenarios or conditions positive and negative environmental effects could occur nationwide. The report also includes a discussion of land-use change (LUC) (i.e., land management change) assumptions associated with the scenario transitions (but not including analysis of indirect LUC [ILUC]), analyses of climate sensitivity of feedstock productivity under a set of potential scenarios, and a qualitative environmental effects analysis of algae production under carbon dioxide (CO2) co-location scenarios. Because BT16 biomass supplies are simulated independent of a defined end use, most analyses do not include benefits from displacing fossil fuels or

  19. Proposed method for assigning metric tons of heavy metal values to defense high-level waste forms to be disposed of in a geologic repository

    International Nuclear Information System (INIS)

    1987-08-01

    A proposed method is described for assigning an equivalent metric ton heavy metal (eMTHM) value to defense high-level waste forms to be disposed of in a geologic repository. This method for establishing a curie equivalency between defense high-level waste and irradiated commercial fuel is based on the ratio of defense fuel exposure to the typical commercial fuel exposure (MWd/MTHM). Application of this technique to defense high-level wastes is described. Additionally, this proposed technique is compared to several alternate calculations for eMTHM. 15 refs., 2 figs., 10 tabs.
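
    One plausible reading of the exposure-ratio method described above, written out as a formula; the exact definition is given in the report itself, and the symbols here are illustrative only.

    ```latex
    % Illustrative form of an equivalent-MTHM assignment based on fuel exposure (burnup).
    % BU denotes exposure in MWd/MTHM for defense fuel and for typical commercial fuel.
    \[
      \mathrm{eMTHM}_{\mathrm{defense}}
      \;=\;
      \mathrm{MTHM}_{\mathrm{defense}} \times
      \frac{BU_{\mathrm{defense}}}{BU_{\mathrm{commercial}}}
    \]
    ```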

  20. Missing billions.

    Science.gov (United States)

    Conly, S

    1997-01-01

    This article discusses funding of population programs that support the Cairo International Conference on Population and Development's Plan of Action. The Plan of Action calls for a quadrupling of annual financial commitments for population programs to $17 billion by the year 2000 and $22 billion by 2015. The increased expenditures would cover the increased demand for services from unmet need and population growth. Donor countries are expected to increase their share from the current 25% to about 33%, or $5.7 billion by the year 2000. The estimates are in 1993 constant dollars. $17 billion is less than the $40 billion that is spent worldwide on playing golf. During 1993-94, general donor support increased to $1.2 billion. Denmark, Germany, Japan, the Netherlands, the United Kingdom, and the United States increased their support. The United States doubled its support for population programs during 1992-95 to $583 million. During 1996-97 the US Congress cut funding back to the 1995 level. France, Italy, Spain, Belgium, and Austria have lagged in support for population programs, both now and in the past. Equal burden sharing would require the US to increase funding to $1.9 billion. Developed country assistance declined to the lowest share of combined gross national product since 1970. This shifts the burden to multilateral sources. The European Union is committed to increasing its funding, and the World Bank increased funding for population and reproductive health to about $600 million in 1996 from $424 million in 1994. Bangladesh, China, India, Indonesia, Mexico, South Africa, and Turkey accounted for 85% of all government expenditures on family planning in developing countries. External donors in Africa are the main support of family planning. Private consumers in Latin America pay most of the costs of family planning. External assistance will be needed for some time.

  1. Feeding six billion.

    Science.gov (United States)

    Brown, L R

    1989-01-01

    Between 1986 and 1988, drought damage to crops caused the grain supply to decrease, and the price of grain worldwide increased 50%. However, in 1989 higher prices and better weather did not result in a rebuilding of reserves lost in previous years. According to the US Agriculture Department, the 1989 harvest will be 13 million tons short of the projected 1684 million tons of consumption. If grain stocks cannot be replenished this year, then when will they be replenished? There are a variety of problems causing this situation. Lack of crop land and irrigation water prevents expansion. Diminishing returns from fertilizer inputs, deforestation, soil erosion, and pollution are all decreasing yields. Growth in food production worldwide has stabilized. Between 1950 and 1984 the world grain harvest increased 2.6 times, or 3%/year. But between 1985 and 1990 that same growth was only 0.2%/year. While this is too short a time to establish a trend, it does suggest a slowdown in worldwide food production. Every year 24 billion tons of topsoil are lost to water and wind erosion, and the world population grows by 88 million annually. Together, these 2 trends indicate an impending disaster. There is no reason to believe that food production is going to continue to grow as fast as the population; thus, population growth must be drastically curtailed. The UN has changed its projected level of population stabilization from 10 billion to 14 billion based on the fact that worldwide population growth has dropped only to 1.7%. Family planning programs have not been as successful as was hoped, partly because the US has withdrawn a large amount of funding due to political pressure from conservatives. The outlook is not good: as the per capita food share shrinks, malnutrition and starvation will continue to grow. Food prices will rise sharply and many more people will be unable to afford food. In many developing countries, people spend 70% of their income on food. This is already occurring as measured by a worldwide

  2. Six billion and counting

    OpenAIRE

    Leisinger, Klaus M.; Schmitt, Karin M.; Pandya-Lorch, Rajul

    2002-01-01

    In 1999 global population surpassed 6 billion people, and this number rises by about 70-80 million people each year. "Six Billion and Counting" examines the consequences of continuing population growth for the world's resource systems and for national and global food security. Leisinger, Schmitt, and Pandya-Lorch offer here a sober analysis of a complex and alarming situation. They assess the progress the world has made in controlling population growth and point to the areas where future diff...

  3. Ton That Tung's livers.

    Science.gov (United States)

    Helling, Thomas S; Azoulay, Daniel

    2014-06-01

    Born in the early 20th century, the Vietnamese surgeon Ton That Tung received his medical education in French colonial Indochina at the fledgling l'Ecole de Médecine de Hanoi, the first indigenous medical school in Southeast Asia. The beneficiary of a postgraduate position at the medical school, Ton That Tung subsequently obtained his surgical training at the Phù Doãn Hospital in Hanoi and concurrently developed a passion for the study of liver anatomy, pathology, and surgery. His contributions to an understanding of liver anatomy based on meticulous dissection of autopsy specimens antedated and rivaled later work by the famous Western anatomists Couinaud, Healey, Schroy, and others. Ton That Tung's contributions, however, were overshadowed by the intense national struggles of the Vietnamese to establish independent rule and self-governance from the French and by eventual alignment with eastern bloc Communist countries, thus isolating much of his work behind the "Iron Curtain" until well after the end of the Cold War. Nevertheless, Ton That Tung remains a pioneer in liver anatomy and liver surgery. His commitment to surgical science and, more importantly, to the Vietnamese people stands as a tribute to the tireless pursuit of his ideals.

  4. Connecting the last billion

    OpenAIRE

    Ben David, Yahel

    2015-01-01

    The last billion people to join the online world are likely to face at least one of two obstacles. Part I: Rural Internet Access. Rural, sparsely populated areas make conventional infrastructure investments unfeasible: big corporations attempt to address this challenge via the launch of Low-Earth-Orbiting (LEO) satellite constellations, fleets of high-altitude balloons, and giant solar-powered drones; although these grandiose initiatives hold potential, they are costly and risky. At the same time...

  5. Defining a standard metric for electricity savings

    International Nuclear Information System (INIS)

    Koomey, Jonathan; Akbari, Hashem; Blumstein, Carl; Brown, Marilyn; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B; Greenberg, Steve

    2010-01-01

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T&D losses. Displacing such a plant for one year would save 3 billion kWh/year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question: Dr Arthur H Rosenfeld.
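
    The arithmetic behind the proposed unit is simple enough to check directly. The sketch below uses the plant parameters quoted in the abstract (500 MW, 70% capacity factor, 7% T&D losses); the coal emissions factor is an assumed round value, chosen only so the result lands near the roughly 3 million metric tons of CO2 cited above.

    ```python
    # Back-of-the-envelope check of one "Rosenfeld": the annual savings equivalent
    # to displacing a 500 MW existing coal plant at a 70% capacity factor,
    # with 7% transmission and distribution (T&D) losses.

    capacity_mw = 500
    capacity_factor = 0.70
    td_losses = 0.07
    hours_per_year = 8760
    co2_tons_per_mwh = 1.0   # assumed emissions factor for an existing coal plant (illustrative)

    generation_mwh = capacity_mw * hours_per_year * capacity_factor  # generated at the plant
    delivered_kwh = generation_mwh * (1 - td_losses) * 1000          # delivered at the meter
    avoided_co2_tons = generation_mwh * co2_tons_per_mwh

    print(f"Savings at the meter: ~{delivered_kwh / 1e9:.1f} billion kWh/year")           # ~2.9
    print(f"Avoided emissions:    ~{avoided_co2_tons / 1e6:.1f} million metric tons CO2")  # ~3.1
    ```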

  6. Defining a standard metric for electricity savings

    Energy Technology Data Exchange (ETDEWEB)

    Koomey, Jonathan [Lawrence Berkeley National Laboratory and Stanford University, PO Box 20313, Oakland, CA 94620-0313 (United States); Akbari, Hashem; Blumstein, Carl; Brown, Marilyn; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B; Greenberg, Steve, E-mail: JGKoomey@stanford.ed

    2010-01-15

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T&D losses. Displacing such a plant for one year would save 3 billion kWh/year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question: Dr Arthur H Rosenfeld.

  7. Defining a Standard Metric for Electricity Savings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Marilyn; Akbari, Hashem; Blumstein, Carl; Koomey, Jonathan; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H.; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B.; Greenberg, Steve; Hafemeister, David; Harris, Jeff; Harvey, Hal; Heitz, Eric; Hirst, Eric; Hummel, Holmes; Kammen, Dan; Kelly, Henry; Laitner, Skip; Levine, Mark; Lovins, Amory; Masters, Gil; McMahon, James E.; Meier, Alan; Messenger, Michael; Millhone, John; Mills, Evan; Nadel, Steve; Nordman, Bruce; Price, Lynn; Romm, Joe; Ross, Marc; Rufo, Michael; Sathaye, Jayant; Schipper, Lee; Schneider, Stephen H; Sweeney, James L; Verdict, Malcolm; Vorsatz, Diana; Wang, Devra; Weinberg, Carl; Wilk, Richard; Wilson, John; Worrell, Ernst

    2009-03-01

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T&D losses. Displacing such a plant for one year would save 3 billion kWh per year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question: Dr. Arthur H. Rosenfeld.

  8. Transportation system benefits of early deployment of a 75-ton multipurpose canister system

    International Nuclear Information System (INIS)

    Wankerl, M.W.; Schmid, S.P.

    1995-01-01

    In 1993 the US Civilian Radioactive Waste Management System (CRWMS) began developing two multipurpose canister (MPC) systems to provide a standardized method for interim storage and transportation of spent nuclear fuel (SNF) at commercial nuclear power plants. One is a 75-ton concept with an estimated payload of about 6 metric tons (t) of SNF, and the other is a 125-ton concept with an estimated payload of nearly 11 t of SNF. These payloads are two to three times the payloads of the largest currently certified US rail transport casks, the IF-300. Although it is recognized that a fully developed 125-ton MPC system is likely to provide greater cost and radiation exposure benefits than the lower-capacity 75-ton MPC, the authors of this paper suggest that development and deployment of the 75-ton MPC prior to developing and deploying a 125-ton MPC is a desirable strategy. Reasons that support this strategy are discussed in this paper.

  9. 12 billion DM for Germany

    International Nuclear Information System (INIS)

    Anon.

    1975-01-01

    The German atomic industry has achieved its breakthrough onto the world market: Brazil has ordered eight nuclear electricity generating plants from the Siemens-AEG subsidiary Kraftwerk-Union. US companies attacked the twelve billion DM deal, the biggest export order in the history of German industry, to no avail: the contract is to be signed in Bonn this week. (orig./LH)

  10. Metric Tensor Vs. Metric Extensor

    OpenAIRE

    Fernández, V. V.; Moya, A. M.; Rodrigues Jr, Waldyr A.

    2002-01-01

    In this paper we give a comparison between the formulation of the concept of metric for a real vector space of finite dimension in terms of tensors and extensors. A nice property of metric extensors is that they have inverses which are also themselves metric extensors. This property is not shared by metric tensors because tensors do not have inverses. We relate the definition of determinant of a metric extensor with the classical determinant of the corresponding matrix as...

  11. Countdown to Six Billion Teaching Kit.

    Science.gov (United States)

    Zero Population Growth, Inc., Washington, DC.

    This teaching kit features six activities focused on helping students understand the significance of the world population reaching six billion for our society and our environment. Featured activities include: (1) History of the World: Part Six Billion; (2) A Woman's Place; (3) Baby-O-Matic; (4) Earth: The Apple of Our Eye; (5) Needs vs. Wants; and…

  12. Biomass as feedstock for a bioenergy and bioproducts industry: The technical feasibility of a billion-ton annual supply

    Energy Technology Data Exchange (ETDEWEB)

    Perlack, Robert D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wright, Lynn L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Turhollow, Anthony F. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Graham, Robin L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Stokes, Bryce J. [U.S. Department of Agriculture, Washington, D.C. (United States); Erbach, Donald C. [U.S. Department of Agriculture, Washington, D.C. (United States)

    2005-04-01

    The purpose of this report is to determine whether the land resources of the United States are capable of producing a sustainable supply of biomass sufficient to displace 30% or more of the country's present petroleum consumption.

  13. 10 billion years of massive Galaxies

    NARCIS (Netherlands)

    Taylor, Edward Nairne Cunningham

    2009-01-01

    The most massive galaxies in the local universe are not forming new stars -- but we don’t know why. As a step towards figuring out why big galaxies stop forming stars, we set out to measure when they stop forming stars. By looking at how the colors of massive galaxies have changed over 10 billion years...

  14. Metric learning

    CERN Document Server

    Bellet, Aurelien; Sebban, Marc

    2015-01-01

    Similarity between objects plays an important role in both human cognitive processes and artificial systems for recognition and categorization. How to appropriately measure such similarities for a given task is crucial to the performance of many machine learning, pattern recognition and data mining methods. This book is devoted to metric learning, a set of techniques to automatically learn similarity and distance functions from data that has attracted a lot of interest in machine learning and related fields in the past ten years. In this book, we provide a thorough review of the metric learnin

  15. Multi-Ton Argon and Xenon

    Energy Technology Data Exchange (ETDEWEB)

    Alarcon, Ricardo; Balascuta, Septimiu; Alton, Drew; Aprile, Elena; Giboni, Karl-Ludwig; Haruyama, Tom; Lang, Rafael; Melgarejo, Antonio Jesus; Ni, Kaixuan; Plante, Guillaume; Choi, Bin [et al.

    2009-01-01

    There is a wide range of astronomical evidence that the visible stars and gas in all galaxies, including our own, are immersed in a much larger cloud of non-luminous matter, typically an order of magnitude greater in total mass. The existence of this "dark matter" is consistent with evidence from large-scale galaxy surveys and microwave background measurements, indicating that the majority of matter in the universe is non-baryonic. The nature of this non-baryonic component is still totally unknown, and the resolution of the "dark matter puzzle" is of fundamental importance to cosmology, astrophysics, and elementary particle physics. A leading explanation, motivated by supersymmetry theory, is the existence of as yet undiscovered Weakly Interacting Massive Particles (WIMPs), formed in the early universe and subsequently clustered in association with normal matter. WIMPs could, in principle, be detected in terrestrial experiments by their collisions with ordinary nuclei, giving observable low energy (< 100 keV) nuclear recoils. The predicted low collision rates require ultra-low background detectors with large (0.1-10 ton) target masses, located in deep underground sites to eliminate neutron background from cosmic ray muons. The establishment of the Deep Underground Science and Engineering Laboratory for large-scale experiments of this type would strengthen the current leadership of US researchers in this and other particle astrophysics areas. We propose to detect nuclear recoils by scintillation and ionization in ton-scale liquid noble gas targets, using techniques already proven in experiments at the 0.01-0.1 ton level. The experimental challenge is to identify these events in the presence of background events from gammas, neutrons, and alphas.

  16. ICARUS 600 ton: A status report

    CERN Document Server

    Vignoli, C; Badertscher, A; Barbieri, E; Benetti, P; Borio di Tigliole, A; Brunetti, R; Bueno, A; Calligarich, E; Campanelli, Mario; Carli, F; Carpanese, C; Cavalli, D; Cavanna, F; Cennini, P; Centro, S; Cesana, A; Chen, C; Chen, Y; Cinquini, C; Cline, D; De Mitri, I; Dolfini, R; Favaretto, D; Ferrari, A; Gigli Berzolari, A; Goudsmit, P; He, K; Huang, X; Li, Z; Lu, F; Ma, J; Mannocchi, G; Mauri, F; Mazza, D; Mazzone, L; Montanari, C; Nurzia, G P; Otwinowski, S; Palamara, O; Pascoli, D; Pepato, A; Periale, L; Petrera, S; Piano Mortari, Giovanni; Piazzoli, A; Picchi, P; Pietropaolo, F; Rancati, T; Rappoldi, A; Raselli, G L; Rebuzzi, D; Revol, J P; Rico, J; Rossella, M; Rossi, C; Rubbia, C; Rubbia, A; Sala, P; Scannicchio, D; Sergiampietri, F; Suzuki, S; Terrani, M; Ventura, S; Verdecchia, M; Wang, H; Woo, J; Xu, G; Xu, Z; Zhang, C; Zhang, Q; Zheng, S

    2000-01-01

    The goal of the ICARUS Project is the installation of a multi-kiloton LAr TPC in the underground Gran Sasso Laboratory. The programme foresees the realization of the detector in a modular way. The first step is the construction of a 600 ton module which is now at an advanced phase. It will be mounted and tested in Pavia in one year and then it will be moved to Gran Sasso for the final operation. The major cryogenic and purification systems and the mechanical components of the detector have been constructed and tested in a 10 m3 prototype. The results of these tests are here summarized.

  17. ICARUS 600 ton: A status report

    Science.gov (United States)

    Vignoli, C.; Arneodo, F.; Badertscher, A.; Barbieri, E.; Benetti, P.; di Tigliole, A. Borio; Brunetti, R.; Bueno, A.; Calligarich, E.; Campanelli, M.; Carli, F.; Carpanese, C.; Cavalli, D.; Cavanna, F.; Cennini, P.; Centro, S.; Cesana, A.; Chen, C.; Chen, Y.; Cinquini, C.; Cline, D.; De Mitri, I.; Dolfini, R.; Favaretto, D.; Ferrari, A.; Berzolari, A. Gigli; Goudsmit, P.; He, K.; Huang, X.; Li, Z.; Lu, F.; Ma, J.; Mannocchi, G.; Mauri, F.; Mazza, D.; Mazzone, L.; Montanari, C.; Nurzia, G. P.; Otwinowski, S.; Palamara, O.; Pascoli, D.; Pepato, A.; Periale, L.; Petrera, S.; Piano-Mortari, G.; Piazzoli, A.; Picchi, P.; Pietropaolo, F.; Rancati, T.; Rappoldi, A.; Raselli, G. L.; Rebuzzi, D.; Revol, J. P.; Rico, J.; Rossella, M.; Rossi, C.; Rubbia, A.; Rubbia, C.; Sala, P.; Scannicchio, D.; Sergiampietri, F.; Suzuki, S.; Terrani, M.; Ventura, S.; Verdecchia, M.; Wang, H.; Woo, J.; Xu, G.; Xu, Z.; Zhang, C.; Zhang, Q.; Zheng, S.

    2000-05-01

    The goal of the ICARUS Project is the installation of a multi-kiloton LAr TPC in the underground Gran Sasso Laboratory. The programme foresees the realization of the detector in a modular way. The first step is the construction of a 600 ton module which is now at an advanced phase. It will be mounted and tested in Pavia in one year and then it will be moved to Gran Sasso for the final operation. The major cryogenic and purification systems and the mechanical components of the detector have been constructed and tested in a 10 m3 prototype. The results of these tests are here summarized.

  18. FY97 nuclear-related budgets total 493 billion yen (4.4 billion dollars)

    International Nuclear Information System (INIS)

    Anon.

    1996-01-01

    On September 13, the Atomic Energy Commission of Japan announced the estimated nuclear-related budget requests for FY1997 (April 1997 - March 1998), giving the breakdowns for eight ministries and agencies. The total amount requested by the government bodies was 493.3 billion yen, a 0.8% increase as compared with FY96. This figure includes the budget requests of the Science and Technology Agency (STA), the Ministry of International Trade and Industry (MITI), the Ministry of Foreign Affairs, the Ministry of Transport, the Ministry of Agriculture, Forestry and Fisheries, the Okinawa Development Agency, and the Ministry of Home Affairs, but excludes the budget request made by the Ministry of Education. The budget requests of STA and MITI are 360 billion yen and 126 billion yen, respectively. On August 29, STA released its estimated FY97 budget request. The nuclear-related 360.4 billion yen is 0.9% more than the year before. Of this sum, 199.9 billion yen is in the general account, and 160.6 billion yen is in the special account for power source development. The details of the nuclear-related amounts are explained. On August 26, MITI released its estimated budget request for FY97, and of the nuclear-related 125.7 billion yen (0.1% increase from FY96), 200 million yen is in the general account, and 98.9 billion yen and 26.6 billion yen are in the special accounts for power resource development and power source diversification, respectively. (K.I.)

  19. Four billion people facing severe water scarcity.

    Science.gov (United States)

    Mekonnen, Mesfin M; Hoekstra, Arjen Y

    2016-02-01

    Freshwater scarcity is increasingly perceived as a global systemic risk. Previous global water scarcity assessments, measuring water scarcity annually, have underestimated experienced water scarcity by failing to capture the seasonal fluctuations in water consumption and availability. We assess blue water scarcity globally at a high spatial resolution on a monthly basis. We find that two-thirds of the global population (4.0 billion people) live under conditions of severe water scarcity at least 1 month of the year. Nearly half of those people live in India and China. Half a billion people in the world face severe water scarcity all year round. Putting caps on water consumption by river basin, increasing water-use efficiencies, and better sharing of the limited freshwater resources will be key in reducing the threat posed by water scarcity on biodiversity and human welfare.

  20. Metrication manual

    International Nuclear Information System (INIS)

    Harper, A.F.A.; Digby, R.B.; Thong, S.P.; Lacey, F.

    1978-04-01

    In April 1978 a meeting of senior metrication officers convened by the Commonwealth Science Council of the Commonwealth Secretariat, was held in London. The participants were drawn from Australia, Bangladesh, Britain, Canada, Ghana, Guyana, India, Jamaica, Papua New Guinea, Solomon Islands and Trinidad and Tobago. Among other things, the meeting resolved to develop a set of guidelines to assist countries to change to SI and to compile such guidelines in the form of a working manual

  1. Endemic Cardiovascular Diseases of the Poorest Billion.

    Science.gov (United States)

    Kwan, Gene F; Mayosi, Bongani M; Mocumbi, Ana O; Miranda, J Jaime; Ezzati, Majid; Jain, Yogesh; Robles, Gisela; Benjamin, Emelia J; Subramanian, S V; Bukhman, Gene

    2016-06-14

    The poorest billion people are distributed throughout the world, though most are concentrated in rural sub-Saharan Africa and South Asia. Cardiovascular disease (CVD) data can be sparse in low- and middle-income countries beyond urban centers. Despite this urban bias, CVD registries from the poorest countries have long revealed a predominance of nonatherosclerotic stroke, hypertensive heart disease, nonischemic and Chagas cardiomyopathies, rheumatic heart disease, and congenital heart anomalies, among others. Ischemic heart disease has been relatively uncommon. Here, we summarize what is known about the epidemiology of CVDs among the world's poorest people and evaluate the relevance of global targets for CVD control in this population. We assessed both primary data sources and the 2013 Global Burden of Disease Study modeled estimates in the world's 16 poorest countries, where 62% of the population are among the poorest billion. We found that ischemic heart disease accounted for only 12% of the combined CVD and congenital heart anomaly disability-adjusted life years (DALYs) in the poorest countries, compared with 51% of DALYs in high-income countries. We found that as little as 53% of the combined CVD and congenital heart anomaly burden (1629/3049 DALYs per 100 000) was attributed to behavioral or metabolic risk factors in the poorest countries (eg, in Niger, 82% of the population is among the poorest billion) compared with 85% of the combined CVD and congenital heart anomaly burden (4439/5199 DALYs) in high-income countries. Further, of the combined CVD and congenital heart anomaly burden, 34% was accrued in people under age 30 years in the poorest countries, while only 3% is accrued under age 30 years in high-income countries. We conclude that although the current global targets for noncommunicable disease and CVD control will help diminish premature CVD death in the poorest populations, they are not sufficient. Specifically, the current framework (1) excludes deaths of

  2. Les tons du langage tambouriné et les tons du langage parlé. Texte ...

    African Journals Online (AJOL)

    A.J. Smet presents, in French translation, an unpublished text by Placide Tempels concerning the relationship between the tones of drummed language and those of spoken language. This text is at the origin of the correspondence between Tempels and Hulstaert. That correspondence had a definite influence on the drafting of the ...

  3. Oxygen - A Four Billion Year History

    DEFF Research Database (Denmark)

    Canfield, Donald Eugene

    The air we breathe is twenty-one percent oxygen, an amount higher than on any other known world. While we may take our air for granted, Earth was not always an oxygenated planet. How did it become this way? Oxygen is the most current account of the history of atmospheric oxygen on Earth. Donald...... of fields, including geology, paleontology, geochemistry, biochemistry, animal physiology, and microbiology, to explain why our oxygenated Earth became the ideal place for life. Describing which processes, both biological and geological, act to control oxygen levels in the atmosphere, Canfield traces...... the records of oxygen concentrations through time. Readers learn about the great oxidation event, the tipping point 2.3 billion years ago when the oxygen content of the Earth increased dramatically, and Canfield examines how oxygenation created a favorable environment for the evolution of large animals. He...

  4. Origins fourteen billion years of cosmic evolution

    CERN Document Server

    Tyson, Neil deGrasse

    2004-01-01

    Origins explores cosmic science's stunning new insights into the formation and evolution of our universe--of the cosmos, of galaxies and galaxy clusters, of stars within galaxies, of planets that orbit those stars, and of different forms of life that take us back to the first three seconds and forward through three billion years of life on Earth to today's search for life on other planets. Drawing on the current cross-pollination of geology, biology and astrophysics, Origins explains the thrilling daily breakthroughs in our knowledge of the universe from dark energy to life on Mars to the mysteries of space and time. Distilling complex science in clear and lively prose, co-authors Neil deGrasse Tyson and Donald Goldsmith conduct a galvanising tour of the cosmos revealing what the universe has been up to while turning part of itself into us.

  5. Oxygen - A Four Billion Year History

    DEFF Research Database (Denmark)

    Canfield, Donald Eugene

    The air we breathe is twenty-one percent oxygen, an amount higher than on any other known world. While we may take our air for granted, Earth was not always an oxygenated planet. How did it become this way? Oxygen is the most current account of the history of atmospheric oxygen on Earth. Donald Canfield--one of the world's leading authorities on geochemistry, earth history, and the early oceans--covers this vast history, emphasizing its relationship to the evolution of life and the evolving chemistry of the Earth. With an accessible and colorful first-person narrative, he draws from a variety of fields, including geology, paleontology, geochemistry, biochemistry, animal physiology, and microbiology, to explain why our oxygenated Earth became the ideal place for life. Describing which processes, both biological and geological, act to control oxygen levels in the atmosphere, Canfield traces the records of oxygen concentrations through time. Readers learn about the great oxidation event, the tipping point 2.3 billion years ago when the oxygen content of the Earth increased dramatically, and Canfield examines how oxygenation created a favorable environment for the evolution of large animals.

  6. Death of the TonB Shuttle Hypothesis.

    Science.gov (United States)

    Gresock, Michael G; Savenkova, Marina I; Larsen, Ray A; Ollis, Anne A; Postle, Kathleen

    2011-01-01

    A complex of ExbB, ExbD, and TonB couples cytoplasmic membrane (CM) proton motive force (pmf) to the active transport of large, scarce, or important nutrients across the outer membrane (OM). TonB interacts with OM transporters to enable ligand transport. Several mechanical models and a shuttle model explain how TonB might work. In the mechanical models, TonB remains attached to the CM during energy transduction, while in the shuttle model the TonB N terminus leaves the CM to deliver conformationally stored potential energy to OM transporters. Previous studies suggested that TonB did not shuttle based on the activity of a GFP-TonB fusion that was anchored in the CM by the GFP moiety. When we recreated the GFP-TonB fusion to extend those studies, in our hands it was proteolytically unstable, giving rise to potentially shuttleable degradation products. Recently, we discovered that a fusion of the Vibrio cholerae ToxR cytoplasmic domain to the N terminus of TonB was proteolytically stable. ToxR-TonB was able to be completely converted into a proteinase K-resistant conformation in response to loss of pmf in spheroplasts and exhibited an ability to form a pmf-dependent formaldehyde crosslink to ExbD, both indicators of its location in the CM. Most importantly, ToxR-TonB had the same relative specific activity as wild-type TonB. Taken together, these results provide conclusive evidence that TonB does not shuttle during energy transduction. We had previously concluded that TonB shuttles based on the use of an Oregon Green(®) 488 maleimide probe to assess periplasmic accessibility of N-terminal TonB. Here we show that the probe was permeant to the CM, thus permitting the labeling of the TonB N-terminus. These former results are reinterpreted in the context that TonB does not shuttle, and suggest the existence of a signal transduction pathway from OM to cytoplasm.

  7. From Throw Weights to Metric Tons: The Radioactive Waste Problems of Russia's Northern Fleet

    National Research Council Canada - National Science Library

    Pruefer, Donald

    2000-01-01

    In a microcosm of the shortsighted planning, reckless development and lack of ecological concern that epitomized the Soviet era, 70 decommissioned nuclear submarines are currently moored in ports...

  8. Agroecohydrology: Key to Feeding 9 Billion?

    Science.gov (United States)

    Herrick, J.

    2011-12-01

    Agricultural production necessary to feed 9 billion people in 2050 depends on increased production on existing croplands, and expanding onto 'marginal' lands. A high proportion of these lands are marginal because they are too steep or too dry to reliably support crop production. These same characteristics increase their susceptibility to accelerated erosion, leading (for most soil profiles) to further reductions in plant available water as infiltration and soil profile water holding capacity decline. Sustaining production on these marginal lands will require careful land use planning. In this paper, we present a land use planning framework that integrates 4 elements: (1) potential production (based on soil profile characteristics), (2) edaphic, topographic and climatic limitations to production, (3) soil resistance to degradation, and (4) resilience. This framework expands existing land capability classification systems through the integration of biophysical feedbacks and thresholds. State and transition models, similar to those currently applied to rangelands in the United States and other countries, are used to organize and communicate knowledge about the sustainability of different land use changes and management actions at field to regional scales. This framework emphasizes hydrologic characteristics of soil profiles and landscapes over fertility because fertility declines are more easily addressed through increased inputs. The presentation will conclude with a discussion of how research in ecohydrology can be more effectively focused to support sustainable food production in the context of increasingly rapid social and economic changes throughout the world.

  9. Uranium in Canada: Billion-dollar industry

    International Nuclear Information System (INIS)

    Whillans, R.T.

    1989-01-01

    In 1988, Canada maintained its position as the world's leading producer and exporter of uranium; five primary uranium producers reported concentrate output containing 12,400 MT of uranium, or about one-third of Western production. Uranium shipments made by these producers in 1988 exceeded 13,200 MT, worth Canadian $1.1 billion. Because domestic requirements represent only 15% of current Canadian output, most of Canada's uranium production is available for export. Despite continued market uncertainty in 1988, Canada's uranium producers signed new sales contracts for some 14,000 MT, twice the 1987 level. About 90% of this new volume is with the US, now Canada's major uranium customer. The recent implementation of the Canada/US Free Trade agreement brings benefits to both countries; the uranium industries in each can now develop in an orderly, free market. Canada's uranium industry was restructured and consolidated in 1988 through merger and acquisition; three new uranium projects advanced significantly. Canada's new policy on nonresident ownership in the uranium mining sector, designed to encourage both Canadian and foreign investment, should greatly improve efforts to finance the development of recent Canadian uranium discoveries

  10. Death of the TonB shuttle hypothesis

    Directory of Open Access Journals (Sweden)

    Michael George Gresock

    2011-10-01

    Full Text Available A complex of ExbB, ExbD, and TonB transduces cytoplasmic membrane (CM proton motive force (pmf to outer membrane (OM transporters so that large, scarce, and important nutrients can be released into the periplasmic space for subsequent transport across the CM. TonB is the component that interacts with the OM transporters and enables ligand transport, and several mechanical models and a shuttle model explain how TonB might work. In the mechanical models, TonB remains attached to the CM during energy transduction, while in the shuttle model the TonB N terminus leaves the CM to deliver conformationally stored potential energy to OM transporters. Previous efforts to test the shuttle model by anchoring TonB to the CM by fusion to a large globular cytoplasmic protein have been hampered by the proteolytic susceptibility of the fusion constructs. Here we confirm that GFP-TonB, tested in a previous study by another laboratory, again gave rise to full-length TonB and slightly larger potentially shuttleable fragments that prevented unambiguous interpretation of the data. Recently, we discovered that a fusion of the Vibrio cholerae ToxR cytoplasmic domain to the N terminus of TonB was proteolytically stable. ToxR-TonB was able to be completely converted into a proteinase K-resistant conformation in response to loss of pmf in spheroplasts and exhibited an ability to form a pmf-dependent formaldehyde crosslink to ExbD, both indicators of its location in the CM. Most importantly, ToxR-TonB had the same relative specific activity as wild-type TonB. Taken together, these results provide the first conclusive evidence that TonB does not shuttle during energy transduction. The interpretations of our previous study, which concluded that TonB shuttled in vivo, were complicated by the fact that the probe used in those studies, Oregon Green® 488 maleimide, was permeant to the CM and could label proteins, including a TonB ∆TMD derivative, confined exclusively to the

  11. La carbonatation du béton armé

    CERN Document Server

    Girard, C

    1998-01-01

    This document presents the various aspects of carbonation of reinforced concrete: a description of the chemical reactions that lead to the carbonation of concrete; how the pH of the concrete changes, leaving the reinforcement in an acidic, and therefore corrosive, environment; a description of the consequences for the concrete and the reinforcement (surface blistering, spalling and loss of strength); methods for investigating damage to reinforced concrete structures (visual, electrical, and by core drilling); passive and active prevention (placement of the reinforcement during construction, use of stainless reinforcement bars or of bars coated with epoxy paint, protection of the steel during remediation work); principles for remediating reinforced concrete in older structures (removal of damaged concrete, passivation and protection of the reinforcement and reconstitution with resin mortar, or passivation by infiltration of the concrete); and the impact on CERN structures, with the foreseeable financial consequences.

  12. Metric Tannakian Duality

    OpenAIRE

    Daenzer, Calder

    2011-01-01

    We incorporate metric data into the framework of Tannaka-Krein duality. Thus, for any group with left invariant metric, we produce a dual metric on its category of unitary representations. We characterize the conditions under which a "double-dual" metric on the group may be recovered from the metric on representations, and provide conditions under which a metric agrees with its double-dual. We also consider some applications to T-duality and quantum Gromov-Hausdorff distance.

  13. Oral diseases affect some 3.9 billion people.

    Science.gov (United States)

    Richards, Derek

    2013-01-01

    Medline, Embase, Lilacs. Published and unpublished observational population-based studies presenting information on the prevalence, incidence, case fatality and cause-specific mortality related to untreated caries, severe periodontitis and severe tooth loss between January 1980 and December 2010. There were no language restrictions. Study quality was assessed using the STROBE checklist (http://www.strobe-statement.org/). Prevalence estimates were calculated on the database for all age-gender-country-year groups using a specifically developed Bayesian meta-regression tool. Disability-adjusted life-years (DALYs) and years lived with disability (YLDs) metrics were used to quantify the disease burden. Disability weights were calculated based on population-based surveys in five countries (USA, Peru, Tanzania, Bangladesh and Indonesia) and an open Internet survey. Uncertainties in estimates were examined using Monte Carlo simulation techniques with uncertainty levels presented as the 2.5th and 97.5th centiles, which can be interpreted as a 95% UI. Oral diseases remain highly prevalent in 2010 affecting 3.9 billion people. Untreated caries in permanent teeth was the most prevalent condition evaluated for the entire GBD (Global Burden of Disease) 2010 Study with a global prevalence of 35% for all ages combined. Severe periodontitis and untreated caries in deciduous teeth were the 6th and 10th most prevalent conditions, affecting, respectively, 11% and 9% of the global population. Oral conditions combined accounted for 15 million DALYs globally (1.9% of all YLDs and 0.6% of all DALYs), implying an average health loss of 224 years per 100,000 people. DALYs due to oral conditions increased 20.8% between 1990 and 2010, mainly due to population growth and aging. While DALYs due to severe periodontitis and untreated caries increased, those due to severe tooth loss decreased. The findings highlight the challenge in responding to the diversity of urgent oral health needs world
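
    As a rough plausibility check of the 224-years-per-100,000 figure above, one can divide the 15 million DALYs by the world population; the population value below is an assumption (roughly the 2010 figure), not a number from the study:

    ```python
    # Back-of-the-envelope check; the population figure is assumed, not from the paper.
    total_dalys = 15e6                   # oral-condition DALYs reported for 2010
    world_population_2010 = 6.8e9        # assumed
    rate_per_100k = total_dalys / world_population_2010 * 1e5
    print(round(rate_per_100k))          # ~221, close to the reported 224 per 100,000
    ```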

  14. Response Surface Model (RSM)-based Benefit Per Ton Estimates

    Science.gov (United States)

    The tables below are updated versions of the tables appearing in The influence of location, source, and emission type in estimates of the human health benefits of reducing a ton of air pollution (Fann, Fulcher and Hubbell 2009).

  15. Antifungal effect of TONS504-photodynamic therapy on Malassezia furfur.

    Science.gov (United States)

    Takahashi, Hidetoshi; Nakajima, Susumu; Sakata, Isao; Iizuka, Hajime

    2014-10-01

    Numerous reports indicate therapeutic efficacy of photodynamic therapy (PDT) against skin tumors, acne and for skin rejuvenation. However, few reports exist regarding its efficacy for fungal skin diseases. In order to determine the antifungal effect, PDT was applied on Malassezia furfur. M. furfur was cultured in the presence of a novel cationic photosensitizer, TONS504, and was irradiated with a 670-nm diode laser. TONS504-PDT showed a significant antifungal effect against M. furfur. The effect was irradiation dose- and TONS504 concentration-dependent and the maximal effect was observed at 100 J/cm2 and 1 μg/mL, respectively. In conclusion, TONS504-PDT showed antifungal effect against M. furfur in vitro, and may be a new therapeutic modality for M. furfur-related skin disorders. © 2014 Japanese Dermatological Association.

  16. Random Kaehler metrics

    Energy Technology Data Exchange (ETDEWEB)

    Ferrari, Frank, E-mail: frank.ferrari@ulb.ac.be [Service de Physique Theorique et Mathematique, Universite Libre de Bruxelles and International Solvay Institutes, Campus de la Plaine, CP 231, 1050 Bruxelles (Belgium); Klevtsov, Semyon, E-mail: semyon.klevtsov@ulb.ac.be [Service de Physique Theorique et Mathematique, Universite Libre de Bruxelles and International Solvay Institutes, Campus de la Plaine, CP 231, 1050 Bruxelles (Belgium); ITEP, B. Cheremushkinskaya 25, Moscow 117218 (Russian Federation); Zelditch, Steve, E-mail: zelditch@math.northwestern.edu [Department of Mathematics, Northwestern University, Evanston, IL 60208 (United States)

    2013-04-01

    The purpose of this article is to propose a new method to define and calculate path integrals over metrics on a Kaehler manifold. The main idea is to use finite dimensional spaces of Bergman metrics, as an approximation to the full space of Kaehler metrics. We use the theory of large deviations to decide when a sequence of probability measures on the spaces of Bergman metrics tends to a limit measure on the space of all Kaehler metrics. Several examples are considered.

  17. Proton collider breaks the six-billion-dollar barrier

    CERN Document Server

    Vaughan, C

    1990-01-01

    The SSC will cost at least $1 billion more than its estimated final price of $5.9 billion. Critics in Congress believe the final bill could be double that figure. The director of the SSC blames most of the increase in cost on technical problems with developing the superconducting magnets for the SSC (1/2 page).

  18. Function valued metric spaces

    Directory of Open Access Journals (Sweden)

    Madjid Mirzavaziri

    2010-11-01

    Full Text Available In this paper we introduce the notion of an ℱ-metric, as a function valued distance mapping, on a set X and we investigate the theory of ℱ-metric spaces. We show that every metric space may be viewed as an ℱ-metric space and every ℱ-metric space (X, δ) can be regarded as a topological space (X, τδ). In addition, we prove that the category of the so-called extended ℱ-metric spaces properly contains the category of metric spaces. We also introduce the concept of an `ℱ-metric space as a completion of an ℱ-metric space and, as an application to topology, we prove that each normal topological space is `ℱ-metrizable.

  19. Metric Tannakian duality

    Science.gov (United States)

    Daenzer, Calder

    2013-08-01

    We incorporate metric data into the framework of Tannaka-Krein duality. Thus, for any group with left invariant metric, we produce a dual metric on its category of unitary representations. We characterize the conditions under which a "double-dual" metric on the group may be recovered from the metric on representations, and provide conditions under which a metric agrees with its double-dual. We also explore a diverse class of possible applications of the theory, including applications to T-duality and to quantum Gromov-Hausdorff distance.

  20. Intuitionistic fuzzy metric spaces

    International Nuclear Information System (INIS)

    Park, Jin Han

    2004-01-01

    Using the idea of intuitionistic fuzzy set due to Atanassov [Intuitionistic fuzzy sets. in: V. Sgurev (Ed.), VII ITKR's Session, Sofia June, 1983; Fuzzy Sets Syst. 20 (1986) 87], we define the notion of intuitionistic fuzzy metric spaces as a natural generalization of fuzzy metric spaces due to George and Veeramani [Fuzzy Sets Syst. 64 (1994) 395] and prove some known results of metric spaces including Baire's theorem and the Uniform limit theorem for intuitionistic fuzzy metric spaces

  1. God particle disappears down 6 billion pound drain

    CERN Multimedia

    Henderson, M

    2001-01-01

    An estimated 6 billion pounds has been spent looking for the Higgs particle over the last three decades. Recent results from LEP though, are now causing some scientists to doubt that it exists at all (1 page).

  2. Factory Acceptance Test Procedure Westinghouse 100 ton Hydraulic Trailer

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1994-01-01

    This Factory Acceptance Test Procedure (FAT) is for the Westinghouse 100 Ton Hydraulic Trailer. The trailer will be used for the removal of the 101-SY pump. This procedure includes: safety check and safety procedures; pre-operation check out; startup; leveling trailer; functional/proofload test; proofload testing; and rolling load test

  3. Software Metrics Capability Evaluation Guide

    National Research Council Canada - National Science Library

    Budlong, Faye

    1995-01-01

    ...: disseminating information regarding the U.S. Air Force Policy on software metrics, providing metrics information to the public through CrossTalk, conducting customer workshops in software metrics, guiding metrics technology adoption programs...

  4. Metrics That Matter.

    Science.gov (United States)

    Prentice, Julia C; Frakt, Austin B; Pizer, Steven D

    2016-04-01

    Increasingly, performance metrics are seen as key components for accurately measuring and improving health care value. Disappointment in the ability of chosen metrics to meet these goals is exemplified in a recent Institute of Medicine report that argues for a consensus-building process to determine a simplified set of reliable metrics. Overall health care goals should be defined and then metrics to measure these goals should be considered. If appropriate data for the identified goals are not available, they should be developed. We use examples from our work in the Veterans Health Administration (VHA) on validating waiting time and mental health metrics to highlight other key issues for metric selection and implementation. First, we focus on the need for specification and predictive validation of metrics. Second, we discuss strategies to maintain the fidelity of the data used in performance metrics over time. These strategies include using appropriate incentives and data sources, using composite metrics, and ongoing monitoring. Finally, we discuss the VA's leadership in developing performance metrics through a planned upgrade in its electronic medical record system to collect more comprehensive VHA and non-VHA data, increasing the ability to comprehensively measure outcomes.

  5. Metric modular spaces

    CERN Document Server

    Chistyakov, Vyacheslav

    2015-01-01

    Aimed toward researchers and graduate students familiar with elements of functional analysis, linear algebra, and general topology; this book contains a general study of modulars, modular spaces, and metric modular spaces. Modulars may be thought of as generalized velocity fields and serve two important purposes: generate metric spaces in a unified manner and provide a weaker convergence, the modular convergence, whose topology is non-metrizable in general. Metric modular spaces are extensions of metric spaces, metric linear spaces, and classical modular linear spaces. The topics covered include the classification of modulars, metrizability of modular spaces, modular transforms and duality between modular spaces, metric  and modular topologies. Applications illustrated in this book include: the description of superposition operators acting in modular spaces, the existence of regular selections of set-valued mappings, new interpretations of spaces of Lipschitzian and absolutely continuous mappings, the existe...

  6. Metric diffusion along foliations

    CERN Document Server

    Walczak, Szymon M

    2017-01-01

    Up-to-date research in metric diffusion along compact foliations is presented in this book. Beginning with fundamentals from optimal transportation theory and the theory of foliations, the book moves on to cover the Wasserstein distance, the Kantorovich Duality Theorem, and the metrization of the weak topology by the Wasserstein distance. Metric diffusion is defined, the topology of the metric space is studied, and the limits of diffused metrics along compact foliations are discussed. Essentials on foliations, holonomy, heat diffusion, and compact foliations are detailed and vital technical lemmas are proved to aid understanding. Graduate students and researchers in geometry, topology and dynamics of foliations and laminations will find this supplement useful as it presents facts about metric diffusion along non-compact foliations and provides a full description of the limit for metrics diffused along a foliation with at least one compact leaf in two dimensions.

  7. Software Metrics: Measuring Haskell

    OpenAIRE

    Ryder, Chris; Thompson, Simon

    2005-01-01

    Software metrics have been used in software engineering as a mechanism for assessing code quality and for targeting software development activities, such as testing or refactoring, at areas of a program that will most benefit from them. Haskell has many tools for software engineering, such as testing, debugging and refactoring tools, but software metrics have mostly been neglected. The work presented in this paper identifies a collection of software metrics for use with Haskell programs. Thes...

  8. -Metric Space: A Generalization

    Directory of Open Access Journals (Sweden)

    Farshid Khojasteh

    2013-01-01

    Full Text Available We introduce the notion of -metric as a generalization of a metric by replacing the triangle inequality with a more generalized inequality. We investigate the topology of the spaces induced by a -metric and present some essential properties of it. Further, we give characterization of well-known fixed point theorems, such as the Banach and Caristi types in the context of such spaces.

  9. Adaptive Metric Dimensionality Reduction

    OpenAIRE

    Gottlieb, Lee-Ad; Kontorovich, Aryeh; Krauthgamer, Robert

    2013-01-01

    We study adaptive data-dependent dimensionality reduction in the context of supervised learning in general metric spaces. Our main statistical contribution is a generalization bound for Lipschitz functions in metric spaces that are doubling, or nearly doubling. On the algorithmic front, we describe an analogue of PCA for metric spaces: namely an efficient procedure that approximates the data's intrinsic dimension, which is often much lower than the ambient dimension. Our approach thus leverag...

  10. Metric graphic sets

    Science.gov (United States)

    Garces, I. J. L.; Rosario, J. B.

    2017-10-01

    For an ordered subset W = {w1, w2, …, wk} of vertices in a connected graph G and a vertex v of G, the metric representation of v with respect to W is the k-vector r(v|W) = (d(v, w1), d(v, w2), …, d(v, wk)), where d(v, wi) is the distance of the vertices v and wi in G. The set W is called a resolving set of G if r(u|W) = r(v|W) implies u = v. The metric dimension of G, denoted by β(G), is the minimum cardinality of a resolving set of G, and a resolving set of G with cardinality equal to its metric dimension is called a metric basis of G. A set T of vectors is called a positive lattice set if all the coordinates in each vector of T are positive integers. A positive lattice set T consisting of n k-vectors is called a metric graphic set if there exists a simple connected graph G of order n + k with β(G) = k such that T = {r(ui|S) : ui ∈ V(G)\S, 1 ≤ i ≤ n} for some metric basis S = {s1, s2, …, sk} of G. If such G exists, then we say G is a metric graphic realization of T. In this paper, we introduce the concept of metric graphic sets anchored on the concept of metric dimension and provide some characterizations. We also give necessary and sufficient conditions for any positive lattice set consisting of 2 k-vectors to be a metric graphic set. We provide an upper bound for the sum of all the coordinates of any metric graphic set and enumerate some properties of positive lattice sets consisting of n 2-vectors that are not metric graphic sets.
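
    Since the definitions above reduce to shortest-path distances, a small sketch can make them concrete. The graph, function names, and example below are illustrative only, not taken from the paper:

    ```python
    # Sketch: metric representations r(v|W) and resolving-set check via BFS distances.
    from collections import deque

    def bfs_distances(adj, source):
        """Shortest-path distances from source in an unweighted, connected graph."""
        dist = {source: 0}
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        return dist

    def metric_representation(adj, W):
        """Map each vertex v to the k-vector (d(v, w1), ..., d(v, wk))."""
        dists = [bfs_distances(adj, w) for w in W]
        return {v: tuple(d[v] for d in dists) for v in adj}

    def is_resolving_set(adj, W):
        """W resolves G iff all metric representations are pairwise distinct."""
        reps = metric_representation(adj, W)
        return len(set(reps.values())) == len(reps)

    # Example: a path on 4 vertices; a single endpoint already resolves it.
    adj = {1: [2], 2: [1, 3], 3: [2, 4], 4: [3]}
    print(metric_representation(adj, [1]))   # {1: (0,), 2: (1,), 3: (2,), 4: (3,)}
    print(is_resolving_set(adj, [1]))        # True, so the metric dimension of a path is 1
    ```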

  11. Prognostic Performance Metrics

    Data.gov (United States)

    National Aeronautics and Space Administration — This chapter presents several performance metrics for offline evaluation of prognostics algorithms. A brief overview of different methods employed for performance...

  12. Topics in Metric Approximation

    Science.gov (United States)

    Leeb, William Edward

    This thesis develops effective approximations of certain metrics that occur frequently in pure and applied mathematics. We show that distances that often arise in applications, such as the Earth Mover's Distance between two probability measures, can be approximated by easily computed formulas for a wide variety of ground distances. We develop simple and easily computed characterizations both of norms measuring a function's regularity -- such as the Lipschitz norm -- and of their duals. We are particularly concerned with the tensor product of metric spaces, where the natural notion of regularity is not the Lipschitz condition but the mixed Lipschitz condition. A theme that runs throughout this thesis is that snowflake metrics (metrics raised to a power less than 1) are often better-behaved than ordinary metrics. For example, we show that snowflake metrics on finite spaces can be approximated by the average of tree metrics with a distortion bounded by intrinsic geometric characteristics of the space and not the number of points. Many of the metrics for which we characterize the Lipschitz space and its dual are snowflake metrics. We also present applications of the characterization of certain regularity norms to the problem of recovering a matrix that has been corrupted by noise. We are able to achieve an optimal rate of recovery for certain families of matrices by exploiting the relationship between mixed-variable regularity conditions and the decay of a function's coefficients in a certain orthonormal basis.

  13. Overview of journal metrics

    Directory of Open Access Journals (Sweden)

    Kihong Kim

    2018-02-01

    Full Text Available Various kinds of metrics used for the quantitative evaluation of scholarly journals are reviewed. The impact factor and related metrics, including the immediacy index and the aggregate impact factor, which are provided by the Journal Citation Reports, are explained in detail. The Eigenfactor score and the article influence score are also reviewed. In addition, journal metrics such as CiteScore, Source Normalized Impact per Paper, SCImago Journal Rank, h-index, and g-index are discussed. Limitations and problems of these metrics are pointed out. We should be cautious about relying too heavily on these quantitative measures when evaluating journals or researchers.
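
    For the two-year impact factor mentioned above, the underlying arithmetic is simply citations divided by citable items; a minimal sketch with hypothetical numbers:

    ```python
    # Illustrative sketch of the two-year journal impact factor; the numbers are
    # hypothetical, not taken from the Journal Citation Reports.
    def impact_factor(citations_to_prev_two_years: int, citable_items_prev_two_years: int) -> float:
        """IF(year Y) = citations received in Y to items published in Y-1 and Y-2,
        divided by the number of citable items published in Y-1 and Y-2."""
        return citations_to_prev_two_years / citable_items_prev_two_years

    # A journal with 150 citable items over the previous two years and 450
    # citations to them this year has an impact factor of 3.0.
    print(impact_factor(450, 150))  # 3.0
    ```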

  14. Cosmic rays and the biosphere over 4 billion years

    DEFF Research Database (Denmark)

    Svensmark, Henrik

    2006-01-01

    Variations in the flux of cosmic rays (CR) at Earth during the last 4.6 billion years are constructed from information about the star formation rate in the Milky Way and the evolution of the solar activity. The constructed CR signal is compared with variations in the Earth's biological productivity as recorded in the isotope delta C-13, which spans more than 3 billion years. CR and fluctuations in biological productivity show a remarkable correlation and indicate that the evolution of climate and the biosphere on the Earth is closely linked to the evolution of the Milky Way.

  15. TonB-dependent transporters and their occurrence in cyanobacteria

    Directory of Open Access Journals (Sweden)

    von Haeseler Arndt

    2009-10-01

    Full Text Available Abstract Background Different iron transport systems evolved in Gram-negative bacteria during evolution. Most of these transport systems depend on outer membrane localized TonB-dependent transporters (TBDTs), a periplasm-facing TonB protein and a plasma membrane localized machinery (ExbBD). So far, iron chelators (siderophores), oligosaccharides and polypeptides have been identified as substrates of TBDTs. For iron transport, three uptake systems are defined: the lactoferrin/transferrin binding proteins, the porphyrin-dependent transporters and the siderophore-dependent transporters. However, for cyanobacteria almost nothing is known about possible TonB-dependent uptake systems for iron or other substrates. Results We have screened all publicly available eubacterial genomes for sequences representing (putative) TBDTs. Based on sequence similarity, we identified 195 clusters, where elements of one cluster may possibly recognize similar substrates. For Anabaena sp. PCC 7120 we identified 22 genes as putative TBDTs covering almost all known TBDT subclasses. This is a high number of TBDTs compared to other cyanobacteria. The expression of the 22 putative TBDTs individually depends on the presence of iron, copper or nitrogen. Conclusion We exemplified on TBDTs the power of CLANS-based classification, which demonstrates its importance for future application in systems biology. In addition, the tentative substrate assignment based on characterized proteins will stimulate the research of TBDTs in different species. For cyanobacteria, the atypical dependence of TBDT gene expression on different nutrients points to a yet unknown regulatory mechanism. In addition, we were able to address the hypothesized absence of TonB in cyanobacteria by identifying corresponding sequences.

  16. Project Management Metrics

    Directory of Open Access Journals (Sweden)

    Radu MARSANU

    2010-04-01

    Full Text Available Metrics and indicators used for the evaluation of IT project management have the advantage of providing rigorous details about the required effort and the boundaries of the IT deliverables. There are some disadvantages as well, due to the fact that the input data contain errors and the value of a metric depends on the quality of the data used in the models.

  17. Project Management Metrics

    OpenAIRE

    Radu MARSANU

    2010-01-01

    Metrics and indicators used for the evaluation of IT project management have the advantage of providing rigorous details about the required effort and the boundaries of the IT deliverables. There are some disadvantages as well, due to the fact that the input data contain errors and the value of a metric depends on the quality of the data used in the models.

  18. Computational visual distinctness metric

    NARCIS (Netherlands)

    Martínez-Baena, J.; Toet, A.; Fdez-Vidal, X.R.; Garrido, A.; Rodríguez-Sánchez, R.

    1998-01-01

    A new computational visual distinctness metric based on principles of the early human visual system is presented. The metric is applied to quantify (1) the visual distinctness of targets in complex natural scenes and (2) the perceptual differences between compressed and uncompressed images. The new

  19. Arbitrary Metrics in Psychology

    Science.gov (United States)

    Blanton, Hart; Jaccard, James

    2006-01-01

    Many psychological tests have arbitrary metrics but are appropriate for testing psychological theories. Metric arbitrariness is a concern, however, when researchers wish to draw inferences about the true, absolute standing of a group or individual on the latent psychological dimension being measured. The authors illustrate this in the context of 2…

  20. Metrics for Probabilistic Geometries

    DEFF Research Database (Denmark)

    Tosi, Alessandra; Hauberg, Søren; Vellido, Alfredo

    2014-01-01

    We investigate the geometrical structure of probabilistic generative dimensionality reduction models using the tools of Riemannian geometry. We explicitly define a distribution over the natural metric given by the models. We provide the necessary algorithms to compute expected metric tensors where...

  1. Privacy Metrics and Boundaries

    NARCIS (Netherlands)

    L-F. Pau (Louis-François)

    2005-01-01

    This paper aims at defining a set of privacy metrics (quantitative and qualitative) in the case of the relation between a privacy protector and an information gatherer. The aims with such metrics are: to allow assessing and comparing different user scenarios and their differences; for

  2. Metrics for Cosmetology.

    Science.gov (United States)

    Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.

    Designed to meet the job-related metric measurement needs of cosmetology students, this instructional package on cosmetology is part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational terminology, measurement terms, and tools currently in use. Each of the…

  3. Tracker Performance Metric

    National Research Council Canada - National Science Library

    Olson, Teresa; Lee, Harry; Sanders, Johnnie

    2002-01-01

    ... (sometimes referred to as confidence) state. The TPM can also be used as a measure of algorithm performance to compare against the Trackability Metric. The Trackability Metric was developed by AMCOM to determine how "trackable" a set of data should be. The TPM will be described and results presented.

  4. Metrics for Secretarial, Stenography.

    Science.gov (United States)

    Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.

    Designed to meet the job-related metric measurement needs of secretarial, stenography students, this instructional package is one of three for the business and office occupations cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational terminology,…

  5. Working Paper 5: Beyond Collier's Bottom Billion | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2010-12-16

    Dec 16, 2010 ... The heart of the narrative presented in the book is that a group of almost 60 countries, with a population of about a billion people, are caught in four main traps. Their prospects for escaping the traps are poor, and they need a set of actions from the international community to achieve the rapid rates of growth ...

  6. Congress OKs $2 Billion Boost for the NIH.

    Science.gov (United States)

    2017-07-01

    President Donald Trump last week signed a $1.1 trillion spending bill for fiscal year 2017, including a welcome $2 billion boost for the NIH that will support former Vice President Joe Biden's Cancer Moonshot initiative, among other priorities. However, researchers who rely heavily on NIH grant funding remain concerned about proposed cuts for 2018. ©2017 American Association for Cancer Research.

  7. The Metric Lens : Visualizing Metrics and Structure on Software Diagrams

    NARCIS (Netherlands)

    Byelas, Heorhiy; Telea, Alexandru; Hassan, AE; Zaidman, A; DiPenta, M

    2008-01-01

    We present the metric lens, a new visualization of method-level code metrics atop UML class diagrams, which allows performing metric-metric and metric-structure correlations on large diagrams. We demonstrate an interactive visualization tool in which users can quickly specify a wide palette of

  8. Acceptance test report for the Westinghouse 100 ton hydraulic trailer

    International Nuclear Information System (INIS)

    Barrett, R.A.

    1995-01-01

    The SY-101 Equipment Removal System 100 Ton Hydraulic Trailer was designed and built by KAMP Systems, Inc. Performance of the Acceptance Test Procedure at KAMP's facility in Ontario, California (termed Phase 1 in this report) was interrupted by discrepancies noted with the main hydraulic cylinder. The main cylinder was removed and sent to REMCO for repair while the trailer was sent to Lampson's facility in Pasco, Washington. The Acceptance Test Procedure was modified and performance resumed at Lampson (termed Phase 2 in this report) after receipt of the repaired cylinder. At the successful conclusion of Phase 2 testing the trailer was accepted as meeting all the performance criteria specified

  9. Responsible for 45 000 tons CO2 emissions

    International Nuclear Information System (INIS)

    Nedrelid, Ola N.

    2006-01-01

    Waste combustion enjoys much better tax conditions in Sweden than in Norway. Today waste is being transported from Norway to Sweden, resulting in 45 000 tons of CO2 emissions every year, when the waste could have remained in Norway and been utilized as recovered energy in district heating. The tax regime, however, does not provide the conditions for profitable use of the waste in Norway. The district heating association is disappointed with the new government's proposed fiscal budget, which only worsens the competitive situation for Norway handling its own waste (ml)

  10. Gaia: Science with 1 billion objects in three dimensions

    Science.gov (United States)

    Prusti, Timo

    2018-02-01

    Gaia is an operational satellite in the ESA science programme. It is gathering data for more than a billion objects. Gaia measures positions and motions of stars in our Milky Way Galaxy, but captures many asteroids and extragalactic sources as well. The first data release has already been made and exploitation by the world-wide scientific community is underway. Further data releases will follow, with progressively increasing accuracy. Gaia is well on its way to providing its promised set of fundamental astronomical data.

  11. Document de travail 5: Beyond Collier's Bottom Billion | CRDI ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Dec 16, 2010 ... Paul Collier's book, The Bottom Billion, has attracted great interest in the development field. It rests on the thesis that a group of nearly 60 countries, with a total population of about one billion people, are caught in four main traps.

  12. Ubiquitous Supercritical Wing Design Cuts Billions in Fuel Costs

    Science.gov (United States)

    2015-01-01

    A Langley Research Center engineer’s work in the 1960s and ’70s to develop a wing with better performance near the speed of sound resulted in a significant increase in subsonic efficiency. The design was shared with industry. Today, Renton, Washington-based Boeing Commercial Airplanes, as well as most other plane manufacturers, apply it to all their aircraft, saving the airline industry billions of dollars in fuel every year.

  13. Cost of solving mysteries of universe: $6.7 billion

    CERN Multimedia

    Overbye, Dennis

    2007-01-01

    "An international consortium of physicists on Thursday released the first detailed design of what they believe will be the next big thing in physics. The machine, 20 miles long, will slam together electrons and their opposites, positrons, to produce fireballs of energy re-creating conditions when the universe was only a trillionth of a second old. It would cost about $6.7 billion." (1 page)

  14. General perceptual contrast metric

    Science.gov (United States)

    Liberg, Anna; Hasler, David

    2003-06-01

    A combined achromatic and chromatic contrast metric for digital images and video is presented in this paper. Our work is aimed at tuning any parametric rendering algorithm in an automated way by computing how much detail an observer perceives in a rendered scene. The contrast metric is based on contrast analysis in the spatial domain of image sub-bands constructed by pyramidal decomposition of the image. The proposed contrast metric is the sum of the perceptual contrast of every pixel in the image at different detail levels corresponding to different viewing distances. The novel metric shows high correlation with subjective experiments. Important applications include optimal parameter setting for any image rendering or contrast enhancement technique, and auto-exposure of an image-capturing device.
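
    A minimal sketch of the general idea described above (pyramidal decomposition, then summing local contrast over detail levels); the filter sizes, pyramid depth, and normalization below are assumptions for illustration, not the authors' metric:

    ```python
    # Multi-scale achromatic contrast: sum of band-pass / low-pass ratios over a Gaussian pyramid.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def multiscale_contrast(gray: np.ndarray, levels: int = 4, eps: float = 1e-6) -> float:
        """Sum of mean |band-pass| / low-pass ratios over pyramid levels."""
        total = 0.0
        current = gray.astype(float)
        for _ in range(levels):
            lowpass = gaussian_filter(current, sigma=2.0)
            band = current - lowpass                  # detail at this scale
            total += np.mean(np.abs(band) / (lowpass + eps))
            current = lowpass[::2, ::2]               # downsample for the next level
        return total

    # Usage on a synthetic image; higher values indicate more perceived detail.
    img = np.random.rand(256, 256)
    print(multiscale_contrast(img))
    ```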

  15. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  16. Tracker Performance Metric

    National Research Council Canada - National Science Library

    Olson, Teresa; Lee, Harry; Sanders, Johnnie

    2002-01-01

    .... We have developed the Tracker Performance Metric (TPM) specifically for this purpose. It was designed to measure the output performance, on a frame-by-frame basis, using its output position and quality...

  17. A metric for success

    Science.gov (United States)

    Carver, Gary P.

    1994-05-01

    The federal agencies are working with industry to ease adoption of the metric system. The goal is to help U.S. industry compete more successfully in the global marketplace, increase exports, and create new jobs. The strategy is to use federal procurement, financial assistance, and other business-related activities to encourage voluntary conversion. Based upon the positive experiences of firms and industries that have converted, federal agencies have concluded that metric use will yield long-term benefits that are beyond any one-time costs or inconveniences. It may be time for additional steps to move the Nation out of its dual-system comfort zone and continue to progress toward metrication. This report includes 'Metric Highlights in U.S. History'.

  18. IT Project Management Metrics

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Many software and IT projects fail to meet their objectives for a variety of causes, among which project management carries a high weight. In order to deliver projects successfully, lessons learned have to be applied, historical data collected, and metrics and indicators computed and compared with past projects to avoid failure. This paper presents some metrics that can be used for IT project management.

  19. IT Project Management Metrics

    OpenAIRE

    Paul POCATILU

    2007-01-01

    Many software and IT projects fail to meet their objectives for a variety of causes, among which project management carries a high weight. In order to deliver projects successfully, lessons learned have to be applied, historical data collected, and metrics and indicators computed and compared with past projects to avoid failure. This paper presents some metrics that can be used for IT project management.

  20. Mass Customization Measurements Metrics

    DEFF Research Database (Denmark)

    Nielsen, Kjeld; Brunø, Thomas Ditlev; Jørgensen, Kaj Asbjørn

    2014-01-01

    A recent survey has indicated that 17 % of companies have ceased mass customizing less than 1 year after initiating the effort. This paper presents measurements for a company's mass customization performance, utilizing metrics within the three fundamental capabilities: robust process design, choice navigation, and solution space development. A mass customizer assessing performance with these metrics can identify the areas in which improvement would increase competitiveness the most, enabling a more efficient transition to mass customization.

  1. Distance Metric Tracking

    Science.gov (United States)

    2016-03-02

    estimation [13, 2], and manifold learning [19]. Such unsupervised methods do not have the benefit of human input on the distance metric, and overly rely...to be defined that is related to the task at hand. Many supervised and semi-supervised distance metric learning approaches have been developed [17... Unsupervised PCA seeks to identify a set of axes that best explain the variance contained in the data. LDA takes a supervised approach, minimizing the intra

  2. Dilution Refrigeration of Multi-Ton Cold Masses

    CERN Document Server

    Wikus, P; CERN. Geneva

    2007-01-01

    Dilution refrigeration is the only means to provide continuous cooling at temperatures below 250 mK. Future experiments featuring multi-ton cold masses require a new generation of dilution refrigeration systems, capable of providing a heat sink below 10 mK at cooling powers which exceed the performance of present systems considerably. This thesis presents some advances towards dilution refrigeration of multi-ton masses in this temperature range. A new method using numerical simulation to predict the cooling power of a dilution refrigerator of a given design has been developed in the framework of this thesis project. This method does not only allow to take into account the differences between an actual and an ideal continuous heat exchanger, but also to quantify the impact of an additional heat load on an intermediate section of the dilute stream. In addition, transient behavior can be simulated. The numerical model has been experimentally verified with a dilution refrigeration system which has been designed, ...
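
    For orientation, the continuous cooling power of a dilution refrigerator is usually estimated with the textbook approximation sketched below; this is standard background assumed here rather than taken from the thesis, and the flow rate and temperatures are illustrative values only:

    ```python
    # Textbook approximation (see e.g. Pobell, "Matter and Methods at Low Temperatures"):
    # Q_dot ~ n3_dot * (95*T_mc**2 - 11*T_ex**2) watts, with n3_dot the 3He molar flow
    # rate [mol/s], T_mc the mixing-chamber temperature and T_ex the temperature of the
    # incoming 3He after the last heat exchanger.
    def cooling_power(n3_dot_mol_per_s: float, T_mc: float, T_ex: float) -> float:
        """Approximate continuous cooling power of the mixing chamber in watts."""
        return n3_dot_mol_per_s * (95.0 * T_mc**2 - 11.0 * T_ex**2)

    # Example: 1 mmol/s circulation, 10 mK mixing chamber, 15 mK incoming 3He.
    print(cooling_power(1e-3, 0.010, 0.015))  # ~7e-6 W, i.e. a few microwatts
    ```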

  3. Taking out one billion tons of carbon: the magic of China's 11th Five-Year Plan

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Jiang; Zhou, Nan; Levine, Mark D.; Fridley, David

    2007-05-01

    China's 11th Five-Year Plan (FYP) sets an ambitious target for energy-efficiency improvement: energy intensity of the country's gross domestic product (GDP) should be reduced by 20 percent from 2005 to 2010 (NDRC, 2006). This is the first time that a quantitative and binding target has been set for energy efficiency, and it signals a major shift in China's strategic thinking about its long-term economic and energy development. The 20 percent energy intensity target also translates into an annual reduction of over one billion tons of CO2 by 2010, making the Chinese effort one of the most significant carbon mitigation efforts in the world today. While it is still too early to tell whether China will achieve this target, this paper attempts to understand the trend in energy intensity in China and to explore a variety of options toward meeting the 20 percent target using a detailed end-use energy model.
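
    To see how a 20-percent intensity target can map onto an absolute reduction of roughly a billion tons per year, the toy calculation below compares 2010 emissions under frozen intensity with the target case; every input figure is an assumption for illustration, not a number from the paper:

    ```python
    # Purely illustrative arithmetic; the baseline emissions and growth rate are assumed.
    co2_2005_gt = 5.4          # assumed 2005 CO2 emissions, gigatons
    gdp_growth = 0.10          # assumed annual GDP growth rate
    years = 5
    frozen_intensity_2010 = co2_2005_gt * (1 + gdp_growth) ** years   # no efficiency gains
    with_target_2010 = frozen_intensity_2010 * (1 - 0.20)             # 20% lower intensity
    avoided_gt = frozen_intensity_2010 - with_target_2010
    print(f"Avoided CO2 in 2010: about {avoided_gt:.1f} Gt/yr")       # ~1.7 Gt under these assumptions
    ```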

  4. Inducing Weinhold's metric from Euclidean and Riemannian metrics

    International Nuclear Information System (INIS)

    Andresen, B.; Berry, R.S.; Ihrig, E.; Salamon, P.

    1987-01-01

    We show that Weinhold's metric cannot be introduced on the equation of state surface from a Euclidean metric in the ambient space of all extensive state variables, whereas it can be induced if the ambient space is assumed only to have a Riemannian metric. This metric, however, is not unique. (orig.)

  5. Fault Management Metrics

    Science.gov (United States)

    Johnson, Stephen B.; Ghoshal, Sudipto; Haste, Deepak; Moore, Craig

    2017-01-01

    This paper describes the theory and considerations in the application of metrics to measure the effectiveness of fault management. Fault management refers here to the operational aspect of system health management, and as such is considered as a meta-control loop that operates to preserve or maximize the system's ability to achieve its goals in the face of current or prospective failure. As a suite of control loops, the metrics to estimate and measure the effectiveness of fault management are similar to those of classical control loops in being divided into two major classes: state estimation, and state control. State estimation metrics can be classified into lower-level subdivisions for detection coverage, detection effectiveness, fault isolation and fault identification (diagnostics), and failure prognosis. State control metrics can be classified into response determination effectiveness and response effectiveness. These metrics are applied to each and every fault management control loop in the system, for each failure to which they apply, and probabilistically summed to determine the effectiveness of these fault management control loops to preserve the relevant system goals that they are intended to protect.
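
    The "probabilistic summing" of per-failure control-loop metrics described above can be illustrated with a toy example; the structure, weights, and numbers are all assumed, not taken from the paper:

    ```python
    # Toy illustration of probabilistically combining per-failure metrics into one score.
    failures = [
        # (probability of failure mode, detection effectiveness, response effectiveness)
        (0.6, 0.99, 0.95),
        (0.3, 0.90, 0.80),
        (0.1, 0.50, 0.60),
    ]
    effectiveness = sum(p * det * resp for p, det, resp in failures) / sum(p for p, _, _ in failures)
    print(f"Overall fault-management effectiveness: {effectiveness:.2f}")
    ```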

  6. Deep Transfer Metric Learning.

    Science.gov (United States)

    Junlin Hu; Jiwen Lu; Yap-Peng Tan; Jie Zhou

    2016-12-01

    Conventional metric learning methods usually assume that the training and test samples are captured in similar scenarios so that their distributions are assumed to be the same. This assumption does not hold in many real visual recognition applications, especially when samples are captured across different data sets. In this paper, we propose a new deep transfer metric learning (DTML) method to learn a set of hierarchical nonlinear transformations for cross-domain visual recognition by transferring discriminative knowledge from the labeled source domain to the unlabeled target domain. Specifically, our DTML learns a deep metric network by maximizing the inter-class variations and minimizing the intra-class variations, and minimizing the distribution divergence between the source domain and the target domain at the top layer of the network. To better exploit the discriminative information from the source domain, we further develop a deeply supervised transfer metric learning (DSTML) method by including an additional objective on DTML, where the output of both the hidden layers and the top layer are optimized jointly. To preserve the local manifold of input data points in the metric space, we present two new methods, DTML with autoencoder regularization and DSTML with autoencoder regularization. Experimental results on face verification, person re-identification, and handwritten digit recognition validate the effectiveness of the proposed methods.
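
    A rough numpy sketch of the three ingredients the abstract names (intra-class compactness, inter-class separation, and a divergence penalty between source and target distributions); the kernel choice, weights, and function names are assumptions for illustration, not the authors' implementation:

    ```python
    import numpy as np

    def intra_class_scatter(X, y):
        """Mean squared distance of samples to their class centroids."""
        return np.mean([np.sum((X[y == c] - X[y == c].mean(axis=0)) ** 2, axis=1).mean()
                        for c in np.unique(y)])

    def inter_class_scatter(X, y):
        """Mean squared distance between class centroids and the global centroid."""
        mu = X.mean(axis=0)
        return np.mean([np.sum((X[y == c].mean(axis=0) - mu) ** 2) for c in np.unique(y)])

    def mmd_linear(Xs, Xt):
        """Maximum mean discrepancy with a linear kernel: ||mean(Xs) - mean(Xt)||^2."""
        return float(np.sum((Xs.mean(axis=0) - Xt.mean(axis=0)) ** 2))

    def transfer_metric_objective(Xs, ys, Xt, alpha=1.0, beta=1.0):
        """Smaller is better: compact classes, separated classes, aligned domains."""
        return (intra_class_scatter(Xs, ys)
                - alpha * inter_class_scatter(Xs, ys)
                + beta * mmd_linear(Xs, Xt))

    # Toy usage with random two-class source features and unlabeled target features.
    rng = np.random.default_rng(0)
    Xs, ys = rng.normal(size=(40, 8)), np.repeat([0, 1], 20)
    Xt = rng.normal(loc=0.5, size=(30, 8))
    print(transfer_metric_objective(Xs, ys, Xt))
    ```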

  7. Ants at Ton Nga Chang Wildlife Sanctuary, Songkhla

    Directory of Open Access Journals (Sweden)

    Watanasit, S.

    2005-03-01

    Full Text Available The aim of this study was to investigate the diversity of ants at Ton Nga Chang Wildlife Sanctuary, Hat Yai, Songkhla. Three line transects (100 m each) were randomly set up in 2 types of forest area, disturbed and undisturbed. Hand collecting (HC) and leaf litter sampling (LL) were applied for ant collection within a time limit of 30 minutes for each method. This study was carried out every month during February 2002-February 2003. The results showed that 206 species were placed under 8 subfamilies: Aenictinae, Cerapachyinae, Dolichoderinae, Formicinae, Leptanillinae, Myrmicinae, Ponerinae and Pseudomyrmecinae. Study sites and collection methods could divide ant species into 2 groups, whereas seasonal change could not distinguish the groups in the DCA multivariate analysis.

  8. Developing economic environment: energy for a billion people

    International Nuclear Information System (INIS)

    Sethna, H.N.; Chandramouli, R.; Manaktala, S.P.

    1996-01-01

    The ongoing reforms in the Indian economy provide an interesting canvas for optimal development of the energy sector serving the needs of a billion people. It will be necessary, in the global interest, to avoid the pitfalls of developing an energy-intensive society as in the West and to remain within the realms of sustainable development. The paper also deals with the strategies to be adopted for energy conservation, rehabilitation of existing plants, optimal utilisation of hydro and thermal capacities through integrated grid operation on a commercial basis, and the setting up of pumped-storage plants. 9 tabs

  9. Cyber threat metrics.

    Energy Technology Data Exchange (ETDEWEB)

    Frye, Jason Neal; Veitch, Cynthia K.; Mateski, Mark Elliot; Michalski, John T.; Harris, James Mark; Trevino, Cassandra M.; Maruoka, Scott

    2012-03-01

    Threats are generally much easier to list than to describe, and much easier to describe than to measure. As a result, many organizations list threats. Fewer describe them in useful terms, and still fewer measure them in meaningful ways. This is particularly true in the dynamic and nebulous domain of cyber threats - a domain that tends to resist easy measurement and, in some cases, appears to defy any measurement. We believe the problem is tractable. In this report we describe threat metrics and models for characterizing threats consistently and unambiguously. The purpose of this report is to support the Operational Threat Assessment (OTA) phase of risk and vulnerability assessment. To this end, we focus on the task of characterizing cyber threats using consistent threat metrics and models. In particular, we address threat metrics and models for describing malicious cyber threats to US FCEB agencies and systems.

  10. Permanence of metric fractals

    Directory of Open Access Journals (Sweden)

    Kyril Tintarev

    2007-05-01

    Full Text Available The paper studies energy functionals on quasimetric spaces, defined by quadratic measure-valued Lagrangeans. This general model of medium, known as metric fractals, includes nested fractals and sub-Riemannian manifolds. In particular, the quadratic form of the Lagrangean satisfies Sobolev inequalities with the critical exponent determined by the quasimetric homogeneous dimension, which is also involved in the asymptotic distribution of the form's eigenvalues. This paper verifies that the axioms of the metric fractal are preserved by space products, leading thus to examples of non-differentiable media of arbitrary intrinsic dimension.

  11. The Noncommutative Ward Metric

    Directory of Open Access Journals (Sweden)

    Marco Maceda

    2010-06-01

    Full Text Available We analyze the moduli-space metric in the static nonabelian charge-two sector of the Moyal-deformed CP^1 sigma model in 1+2 dimensions. After carefully reviewing the commutative results of Ward and Ruback, the noncommutative Kähler potential is expanded in powers of dimensionless moduli. In two special cases we sum the perturbative series to analytic expressions. For any nonzero value of the noncommutativity parameter, the logarithmic singularity of the commutative metric is expelled from the origin of the moduli space and possibly altogether.

  12. Adaptive metric kernel regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    2000-01-01

    Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms...
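
    As an illustration of the idea described above (not the authors' implementation), the sketch below adapts per-dimension length scales of a Gaussian kernel smoother by minimising a leave-one-out cross-validation error; the function names and the search grid are assumptions made for this example.

      import numpy as np

      # Sketch of adaptive metric kernel regression: a Nadaraya-Watson smoother
      # whose per-dimension length scales are chosen by minimising a
      # leave-one-out cross-validation estimate of the squared error.
      def nw_predict(X_train, y_train, X_query, scales):
          d = (X_query[:, None, :] - X_train[None, :, :]) / scales  # scaled differences
          w = np.exp(-0.5 * np.sum(d * d, axis=-1))                 # Gaussian kernel weights
          return (w @ y_train) / np.maximum(w.sum(axis=1), 1e-12)

      def loo_error(X, y, scales):
          d = (X[:, None, :] - X[None, :, :]) / scales
          w = np.exp(-0.5 * np.sum(d * d, axis=-1))
          np.fill_diagonal(w, 0.0)                                  # leave each point out
          pred = (w @ y) / np.maximum(w.sum(axis=1), 1e-12)
          return np.mean((pred - y) ** 2)

      def fit_scales(X, y, grid=tuple(np.logspace(-1, 1, 9))):
          # crude coordinate-wise search over per-dimension scales
          scales = np.ones(X.shape[1])
          for j in range(X.shape[1]):
              errs = []
              for s in grid:
                  trial = scales.copy()
                  trial[j] = s
                  errs.append(loo_error(X, y, trial))
              scales[j] = grid[int(np.argmin(errs))]
          return scales

    Dimensions that receive large scales contribute little to the kernel distance, which is how the adapted metric effectively performs variable selection.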

  13. Adaptive Metric Kernel Regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    1998-01-01

    Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...

  14. A two-billion-year history for the lunar dynamo.

    Science.gov (United States)

    Tikoo, Sonia M; Weiss, Benjamin P; Shuster, David L; Suavet, Clément; Wang, Huapei; Grove, Timothy L

    2017-08-01

    Magnetic studies of lunar rocks indicate that the Moon generated a core dynamo with surface field intensities of ~20 to 110 μT between at least 4.25 and 3.56 billion years ago (Ga). The field subsequently declined to <~4 μT by 3.19 Ga, but it has been unclear whether the dynamo had terminated by this time or just greatly weakened in intensity. We present analyses that demonstrate that the melt glass matrix of a young regolith breccia was magnetized in a ~5 ± 2 μT dynamo field at ~1 to ~2.5 Ga. These data extend the known lifetime of the lunar dynamo by at least 1 billion years. Such a protracted history requires an extraordinarily long-lived power source like core crystallization or precession. No single dynamo mechanism proposed thus far can explain the strong fields inferred for the period before 3.56 Ga while also allowing the dynamo to persist in such a weakened state beyond ~2.5 Ga. Therefore, our results suggest that the dynamo was powered by at least two distinct mechanisms operating during early and late lunar history.

  15. The Boring Billion, a slingshot for Complex Life on Earth.

    Science.gov (United States)

    Mukherjee, Indrani; Large, Ross R; Corkrey, Ross; Danyushevsky, Leonid V

    2018-03-13

    The period 1800 to 800 Ma ("Boring Billion") is believed to mark a delay in the evolution of complex life, primarily due to low levels of oxygen in the atmosphere. Earlier studies highlight the remarkably flat C and Cr isotope records and low trace element trends during the so-called stasis, caused by prolonged nutrient, climatic, atmospheric and tectonic stability. In contrast, we suggest a first-order variability of bio-essential trace element availability in the oceans by combining systematic sampling of the Proterozoic rock record with sensitive geochemical analyses of marine pyrite by the LA-ICP-MS technique. We also recall that several critical biological evolutionary events, such as the appearance of eukaryotes, the origin of multicellularity and sexual reproduction, and the first major diversification of eukaryotes (crown group), occurred during this period. Therefore, it appears possible that the period of low nutrient trace elements (1800-1400 Ma) caused evolutionary pressures which became an essential trigger for promoting biological innovations in the eukaryotic domain. Later periods of stress-free conditions, with relatively high nutrient trace element concentrations, facilitated diversification. We propose that the "Boring Billion" was a period of sequential stepwise evolution and diversification of complex eukaryotes, triggering evolutionary pathways that made possible the later rise of micro-metazoans and their macroscopic counterparts.

  16. Empowering billions with food safety and food security

    International Nuclear Information System (INIS)

    Pillai, Suresh D.

    2009-01-01

    Full text: Millions of people die needlessly every year due to contaminated water and food, and many millions more are starving due to an inadequate supply of food. Billions of pounds of food are unnecessarily wasted due to insect and other damage. Deaths and illness due to contaminated or inadequate food are at catastrophic levels in many regions of the world. A majority of food- and water-borne illnesses and deaths are preventable. They can be prevented by improved food production methods, improved food processing technologies, improved food distribution systems and improved personal hygiene. Food irradiation technology is over 100 years old. Yet this technology is poorly understood by governments and corporate decision makers all around the world. Many consumers are also, unfortunately, misinformed about this technology. There is an urgent need for nations and people around the world to empower themselves with the knowledge and the expertise to harness this powerful technology. Widespread and sensible adoption of this technology can empower billions around the world with clean and abundant food supplies. It is unconscionable in the 21st century for governments to allow people to die or go hungry when the technology to prevent this is readily available.

  17. Health impact metrics for air pollution management strategies.

    Science.gov (United States)

    Martenies, Sheena E; Wilkins, Donele; Batterman, Stuart A

    2015-12-01

    Health impact assessments (HIAs) inform policy and decision making by providing information regarding future health concerns, and quantitative HIAs now are being used for local and urban-scale projects. HIA results can be expressed using a variety of metrics that differ in meaningful ways, and guidance is lacking with respect to best practices for the development and use of HIA metrics. This study reviews HIA metrics pertaining to air quality management and presents evaluative criteria for their selection and use. These are illustrated in a case study where PM2.5 concentrations are lowered from 10 to 8 μg/m³ in an urban area of 1.8 million people. Health impact functions are used to estimate the number of premature deaths, unscheduled hospitalizations and other morbidity outcomes. The most common metric in recent quantitative HIAs has been the number of cases of adverse outcomes avoided. Other metrics include time-based measures, e.g., disability-adjusted life years (DALYs), monetized impacts, functional-unit based measures, e.g., benefits per ton of emissions reduced, and other economic indicators, e.g., cost-benefit ratios. These metrics are evaluated by considering their comprehensiveness, the spatial and temporal resolution of the analysis, how equity considerations are facilitated, and the analysis and presentation of uncertainty. In the case study, the greatest number of avoided cases occurs for low-severity morbidity outcomes, e.g., asthma exacerbations (n=28,000) and minor restricted-activity days (n=37,000), while DALYs and monetized impacts are driven by the severity, duration and value assigned to a relatively low number of premature deaths (n=190 to 230 per year). The selection of appropriate metrics depends on the problem context and boundaries, the severity of impacts, and community values regarding health. The number of avoided cases provides an estimate of the number of people affected, and monetized impacts facilitate additional economic analyses.
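
    A minimal sketch of the kind of health impact function calculation referred to above; the log-linear concentration-response form is a common convention, and the baseline rate and coefficient used below are illustrative placeholders, not values from the study.

      import math

      def avoided_cases(pop, baseline_rate, beta, delta_c):
          # Common log-linear concentration-response form used in health impact
          # functions: avoided = pop * baseline_rate * (1 - exp(-beta * delta_c)).
          return pop * baseline_rate * (1.0 - math.exp(-beta * delta_c))

      # Hypothetical inputs: 1.8 million people, PM2.5 lowered by 2 ug/m3;
      # baseline_rate and beta are assumed placeholders.
      print(avoided_cases(pop=1.8e6, baseline_rate=0.008, beta=0.006, delta_c=2.0))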

  18. Engineering performance metrics

    Science.gov (United States)

    Delozier, R.; Snyder, N.

    1993-03-01

    Implementation of a Total Quality Management (TQM) approach to engineering work required the development of a system of metrics which would serve as a meaningful management tool for evaluating effectiveness in accomplishing project objectives and in achieving improved customer satisfaction. A team effort was chartered with the goal of developing a system of engineering performance metrics which would measure customer satisfaction, quality, cost effectiveness, and timeliness. The approach to developing this system involved normal systems design phases including conceptual design, detailed design, implementation, and integration. The lessons learned from this effort will be explored in this paper. They may provide a starting point for other large engineering organizations seeking to institute a performance measurement system for accomplishing project objectives and achieving improved customer satisfaction. To facilitate this effort, a team was chartered to assist in the development of the metrics system. This team, consisting of customers and Engineering staff members, was utilized to ensure that the needs and views of the customers were considered in the development of performance measurements. The development of a system of metrics is no different from the development of any other type of system. It includes the steps of defining performance measurement requirements, measurement process conceptual design, performance measurement and reporting system detailed design, and system implementation and integration.

  19. Arbitrary Metrics Redux

    Science.gov (United States)

    Blanton, Hart; Jaccard, James

    2006-01-01

    Reducing the arbitrariness of a metric is distinct from the pursuit of validity, rational zero points, data transformations, standardization, and the types of statistical procedures one uses to analyze interval-level versus ordinal-level data. A variety of theoretical, methodological, and statistical tools can assist researchers who wish to make…

  20. Universal hypermultiplet metrics

    International Nuclear Information System (INIS)

    Ketov, Sergei V.

    2001-01-01

    Some instanton corrections to the universal hypermultiplet moduli space metric of the type IIA string theory compactified on a Calabi-Yau threefold arise due to multiple wrapping of BPS membranes and five-branes around certain cycles of the Calabi-Yau. The classical universal hypermultiplet metric is locally equivalent to the Bergmann metric of the symmetric quaternionic space SU(2,1)/U(2), whereas its generic quaternionic deformations are governed by the integrable SU(∞) Toda equation. We calculate the exact (non-perturbative) UH metrics in the special cases of (i) the D-instantons (the wrapped D2-branes) in the absence of five-branes, and (ii) the five-brane instantons with vanishing charges, in the absence of D-instantons. The solutions of the first type preserve the U(1)xU(1) classical symmetry, while they can be interpreted as the gravitational dressing of the hyper-Kähler D-instanton solutions. The solutions of the second type preserve the non-abelian SU(2) classical symmetry, while they can be interpreted as the gradient flows in the universal hypermultiplet moduli space.

  1. Metrics for energy resilience

    International Nuclear Information System (INIS)

    Roege, Paul E.; Collier, Zachary A.; Mancillas, James; McDonagh, John A.; Linkov, Igor

    2014-01-01

    Energy lies at the backbone of any advanced society and constitutes an essential prerequisite for economic growth, social order and national defense. However there is an Achilles heel to today's energy and technology relationship; namely a precarious intimacy between energy and the fiscal, social, and technical systems it supports. Recently, widespread and persistent disruptions in energy systems have highlighted the extent of this dependence and the vulnerability of increasingly optimized systems to changing conditions. Resilience is an emerging concept that offers to reconcile considerations of performance under dynamic environments and across multiple time frames by supplementing traditionally static system performance measures to consider behaviors under changing conditions and complex interactions among physical, information and human domains. This paper identifies metrics useful to implement guidance for energy-related planning, design, investment, and operation. Recommendations are presented using a matrix format to provide a structured and comprehensive framework of metrics relevant to a system's energy resilience. The study synthesizes previously proposed metrics and emergent resilience literature to provide a multi-dimensional model intended for use by leaders and practitioners as they transform our energy posture from one of stasis and reaction to one that is proactive and which fosters sustainable growth. - Highlights: • Resilience is the ability of a system to recover from adversity. • There is a need for methods to quantify and measure system resilience. • We developed a matrix-based approach to generate energy resilience metrics. • These metrics can be used in energy planning, system design, and operations

  2. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software Quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software Metrics help us understand the technical process that is used to develop a product. The process is measured to improve it and the product is measured to increase quality throughout the life cycle of software. Software Metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If Software Metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA Metrics that have been used in other projects but are not currently being used by the SA team, and to report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  3. WFIRST: Extragalactic Science over Twelve Billion Years of Cosmic History

    Science.gov (United States)

    Dickinson, Mark; Robertson, Brant; Ferguson, Henry C.; Furlanetto, Steve; Greene, Jenny; Madau, Piero; Marrone, Dan; Shapley, Alice; Stark, Daniel P.; Wechsler, Risa; Woosley, Stan; WFIRST-EXPO Science Investigation Team

    2018-01-01

    WFIRST’s infrared multiband imaging and spectroscopy from space over thousands of square degrees will revolutionize our understanding of galaxy formation and evolution. When combined with unique guest observer programs that provide ultradeep IR imaging and spectroscopy over areas >100x larger than achieved by Hubble Space Telescope, WFIRST will provide the first complete picture of star formation and stellar mass build-up in galaxies over twelve billion years of cosmic history. The WFIRST Extragalactic Potential Observations (WFIRST-EXPO) Science Investigation Team has identified a host of guest observer and archival programs where WFIRST can transform our views of the connections between the star formation, environment, morphology, stellar mass, and dark matter halo properties of galaxies, and determined how WFIRST can singularly probe the connection between early galaxies and the process of cosmic reionization. We present these WFIRST capabilities, and discuss how the science from WFIRST relates to other major forthcoming space- and ground-based facilities.

  4. Characterization of Arsenic Contamination on Rust from Ton Containers

    Energy Technology Data Exchange (ETDEWEB)

    Gary S. Groenewold; Recep Avci; Robert V. Fox; Muhammedin Deliorman; Jayson Suo; Laura Kellerman

    2013-01-01

    The speciation and spatial distribution of arsenic on rusted steel surfaces affect both measurement and removal approaches. The chemistry of arsenic residing in the rust of ton containers that held the chemical warfare agents bis(2-chloroethyl)sulfide (sulfur mustard) and 2-chlorovinyldichloroarsine (Lewisite) is of particular interest, because while the agents have been decontaminated, residual arsenic could pose a health or environmental risk. The chemistry and distribution of arsenic in rust samples were probed using imaging secondary ion mass spectrometry (SIMS), X-ray photoelectron spectroscopy (XPS), Auger electron spectroscopy, and scanning electron microscopy/energy dispersive X-ray spectroscopy (SEM/EDX). Arsenic in the +3 and/or +5 oxidation state is homogeneously distributed at the very topmost layer of the rust samples, and is intimately associated with iron. Sputter depth profiling followed by SIMS and XPS shows As at a depth of several nm, in some cases in a reduced form. The SEM/EDX experiments show that As is present at a depth of several microns, but is inhomogeneously distributed; most locations contained oxidized As at concentrations of a few percent, however several locations showed very high As in a metallic form. These results indicate that the rust material must be removed if the steel containers are to be cleared of arsenic.

  5. Quality metrics in endoscopy.

    Science.gov (United States)

    Gurudu, Suryakanth R; Ramirez, Francisco C

    2013-04-01

    Endoscopy has evolved in the past 4 decades to become an important tool in the diagnosis and management of many digestive diseases. Greater focus on endoscopic quality has highlighted the need to ensure competency among endoscopists. A joint task force of the American College of Gastroenterology and the American Society for Gastrointestinal Endoscopy has proposed several quality metrics to establish competence and help define areas of continuous quality improvement. These metrics represent quality in endoscopy pertinent to pre-, intra-, and postprocedural periods. Quality in endoscopy is a dynamic and multidimensional process that requires continuous monitoring of several indicators and benchmarking with local and national standards. Institutions and practices should have a process in place for credentialing endoscopists and for the assessment of competence regarding individual endoscopic procedures.

  6. Metrics for Energy Resilience

    Energy Technology Data Exchange (ETDEWEB)

    Paul E. Roege; Zachary A. Collier; James Mancillas; John A. McDonagh; Igor Linkov

    2014-09-01

    Energy lies at the backbone of any advanced society and constitutes an essential prerequisite for economic growth, social order and national defense. However there is an Achilles heel to today's energy and technology relationship; namely a precarious intimacy between energy and the fiscal, social, and technical systems it supports. Recently, widespread and persistent disruptions in energy systems have highlighted the extent of this dependence and the vulnerability of increasingly optimized systems to changing conditions. Resilience is an emerging concept that offers to reconcile considerations of performance under dynamic environments and across multiple time frames by supplementing traditionally static system performance measures to consider behaviors under changing conditions and complex interactions among physical, information and human domains. This paper identifies metrics useful to implement guidance for energy-related planning, design, investment, and operation. Recommendations are presented using a matrix format to provide a structured and comprehensive framework of metrics relevant to a system's energy resilience. The study synthesizes previously proposed metrics and emergent resilience literature to provide a multi-dimensional model intended for use by leaders and practitioners as they transform our energy posture from one of stasis and reaction to one that is proactive and which fosters sustainable growth.

  7. Metrics and Assessment

    Directory of Open Access Journals (Sweden)

    Todd Carpenter

    2015-07-01

    Full Text Available An important and timely plenary session at the 2015 UKSG Conference and Exhibition focused on the role of metrics in research assessment. The two excellent speakers had slightly divergent views. Todd Carpenter from NISO (National Information Standards Organization) argued that altmetrics aren't alt anymore and that downloads and other forms of digital interaction, including social media reference, reference tracking, personal library saving, and secondary linking activity now provide mainstream approaches to the assessment of scholarly impact. James Wilsdon is professor of science and democracy in the Science Policy Research Unit at the University of Sussex and is chair of the Independent Review of the Role of Metrics in Research Assessment commissioned by the Higher Education Funding Council for England (HEFCE). The outcome of this review will inform the work of HEFCE and the other UK higher education funding bodies as they prepare for the future of the Research Excellence Framework. He is more circumspect, arguing that metrics cannot and should not be used as a substitute for informed judgement. This article provides a summary of both presentations.

  8. Symmetries of the dual metrics

    International Nuclear Information System (INIS)

    Baleanu, D.

    1998-01-01

    The geometric duality between the metric gμν and a Killing tensor Kμν is studied. Conditions were found under which the symmetries of the metric gμν and the dual metric Kμν are the same. The dual spinning space was constructed without introducing torsion. The general results are applied to the case of the Kerr-Newman metric.

  9. Monochromatic metrics are generalized Berwald

    OpenAIRE

    Bartelmeß, Nina; Matveev, Vladimir S.

    2017-01-01

    We show that monochromatic Finsler metrics, i.e., Finsler metrics such that any two tangent spaces are isomorphic as normed spaces, are generalized Berwald metrics, i.e., there exists an affine connection, possibly with torsion, that preserves the Finsler function.

  10. Spacetime Metrics from Gauge Potentials

    Directory of Open Access Journals (Sweden)

    Ettore Minguzzi

    2014-03-01

    Full Text Available I present an approach to gravity in which the spacetime metric is constructed from a non-Abelian gauge potential with values in the Lie algebra of the group U(2) (or the Lie algebra of quaternions). If the curvature of this potential vanishes, the metric reduces to a canonical curved background form reminiscent of the Friedmann S3 cosmological metric.

  11. The Role of TonB Gene in Edwardsiella ictaluri Virulence

    Directory of Open Access Journals (Sweden)

    Hossam Abdelhamed

    2017-12-01

    Full Text Available Edwardsiella ictaluri is a Gram-negative facultative intracellular pathogen that causes enteric septicemia in catfish (ESC). Stress factors including poor water quality, poor diet, rough handling, overcrowding, and water temperature fluctuations increase fish susceptibility to ESC. The TonB energy transducing system (TonB-ExbB-ExbD) and TonB-dependent transporters of Gram-negative bacteria support active transport of scarce resources including iron, an essential micronutrient for bacterial virulence. Deletion of the tonB gene attenuates virulence in several pathogenic bacteria. In the current study, the role of TonB (NT01EI_RS07425) in iron acquisition and E. ictaluri virulence was investigated. To accomplish this, the E. ictaluri tonB gene was in-frame deleted. Growth kinetics, iron utilization, and virulence of the EiΔtonB mutant were determined. Loss of TonB caused a significant reduction in bacterial growth in iron-depleted medium (p > 0.05). The EiΔtonB mutant grew similarly to wild-type E. ictaluri when ferric iron was added to the iron-depleted medium. The EiΔtonB mutant was significantly attenuated in catfish compared with the parent strain (21.69% vs. 46.91% mortality). Catfish surviving infection with EiΔtonB had significant protection against ESC compared with naïve fish (100% vs. 40.47% survival). These findings indicate that TonB participates in pathogenesis of ESC and is an important E. ictaluri virulence factor.

  12. 77 FR 16224 - Billion Auto, Inc.; Analysis of Proposed Consent Order To Aid Public Comment

    Science.gov (United States)

    2012-03-20

    ... FEDERAL TRADE COMMISSION [File No. 112 3209] Billion Auto, Inc.; Analysis of Proposed Consent... "Billion Auto, File No. 112 3209" on your comment, and file your comment online at https://ftcpublic... April 16, 2012. Write "Billion Auto, File No. 112 3209" on your comment. Your comment--including your...

  13. Criticality safety review of 2 1/2-, 10-, and 14-ton UF6 cylinders

    International Nuclear Information System (INIS)

    Broadhead, B.L.

    1991-10-01

    Currently, UF6 cylinders designed to contain 2 1/2 tons of UF6 are classified as Fissile Class 2 packages with a transport index (TI) of 5 for the purpose of transportation. The 10-ton UF6 cylinders are classified as Fissile Class 1 with no TI assigned for transportation. The 14-ton cylinders, although not certified for transport with enrichments greater than 1 wt % because they have no approved overpack, can be used in on-site operations for enrichments greater than 1 wt %. The maximum 235U enrichments for these cylinders are 5.0 wt % for the 2 1/2-ton cylinder and 4.5 wt % for the 10- and 14-ton cylinders. This work reviews the suitability for reclassification of the 2 1/2-ton UF6 packages as Fissile Class 1 with a maximum 235U enrichment of 5 wt %. Additionally, the 10- and 14-ton cylinders are reviewed to address a change in maximum 235U enrichment from 4.5 to 5 wt %. Based on this evaluation, the 2 1/2-ton UF6 cylinders meet the 10 CFR 71 criteria for Fissile Class 1 packages, and no TI is needed for criticality safety purposes; however, a TI may be required based on radiation from the packages. Similarly, the 10- and 14-ton UF6 packages appear acceptable for a maximum enrichment rating change to 5 wt % 235U. 11 refs., 13 figs., 7 tabs

  14. Orbital forcing of climate 1.4 billion years ago.

    Science.gov (United States)

    Zhang, Shuichang; Wang, Xiaomei; Hammarlund, Emma U; Wang, Huajian; Costa, M Mafalda; Bjerrum, Christian J; Connelly, James N; Zhang, Baomin; Bian, Lizeng; Canfield, Donald E

    2015-03-24

    Fluctuating climate is a hallmark of Earth. As one transcends deep into Earth time, however, both the evidence for and the causes of climate change become difficult to establish. We report geochemical and sedimentological evidence for repeated, short-term climate fluctuations from the exceptionally well-preserved ∼1.4-billion-year-old Xiamaling Formation of the North China Craton. We observe two patterns of climate fluctuations: On long time scales, over what amounts to tens of millions of years, sediments of the Xiamaling Formation record changes in geochemistry consistent with long-term changes in the location of the Xiamaling relative to the position of the Intertropical Convergence Zone. On shorter time scales, and within a precisely calibrated stratigraphic framework, cyclicity in sediment geochemical dynamics is consistent with orbital control. In particular, sediment geochemical fluctuations reflect what appear to be orbitally forced changes in wind patterns and ocean circulation as they influenced rates of organic carbon flux, trace metal accumulation, and the source of detrital particles to the sediment.

  15. A Unification of G-Metric, Partial Metric, and b-Metric Spaces

    Directory of Open Access Journals (Sweden)

    Nawab Hussain

    2014-01-01

    Full Text Available Using the concepts of G-metric, partial metric, and b-metric spaces, we define a new concept of generalized partial b-metric space. Topological and structural properties of the new space are investigated and certain fixed point theorems for contractive mappings in such spaces are obtained. Some examples are provided here to illustrate the usability of the obtained results.
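
    For orientation, the two standard notions being combined can be stated as follows (textbook definitions of a b-metric and a partial metric, not the paper's generalized axioms):

      % b-metric: triangle inequality relaxed by a constant s >= 1
      d(x,y) \le s\,\bigl[ d(x,z) + d(z,y) \bigr], \qquad s \ge 1
      % partial metric: self-distance need not be zero
      p(x,x) \le p(x,y), \qquad p(x,y) = p(y,x), \qquad
      p(x,y) \le p(x,z) + p(z,y) - p(z,z), \qquad
      x = y \iff p(x,x) = p(x,y) = p(y,y)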

  16. Random Kähler metrics

    International Nuclear Information System (INIS)

    Ferrari, Frank; Klevtsov, Semyon; Zelditch, Steve

    2013-01-01

    The purpose of this article is to propose a new method to define and calculate path integrals over metrics on a Kähler manifold. The main idea is to use finite dimensional spaces of Bergman metrics, as an approximation to the full space of Kähler metrics. We use the theory of large deviations to decide when a sequence of probability measures on the spaces of Bergman metrics tends to a limit measure on the space of all Kähler metrics. Several examples are considered.

  17. Standard for metric practice

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    This standard gives guidance for application of the modernized metric system in the United States. The International System of Units, developed and maintained by the General Conference on Weights and Measures (abbreviated CGPM from the official French name Conférence Générale des Poids et Mesures), is intended as a basis for worldwide standardization of measurement units. The name International System of Units and the international abbreviation SI were adopted by the 11th CGPM in 1960. SI is a complete, coherent system that is being universally adopted.

  18. Visualization Challenges of a Subduction Simulation Using One Billion Markers

    Science.gov (United States)

    Rudolph, M. L.; Gerya, T. V.; Yuen, D. A.

    2004-12-01

    Recent advances in supercomputing technology have permitted us to study the multiscale, multicomponent fluid dynamics of subduction zones at unprecedented resolutions, down to about the length of a football field. We have performed numerical simulations using one billion tracers over a grid of about 80 thousand points in two dimensions. These runs were performed with a thermal-chemical simulation that accounts for hydration and partial melting in the thermal, mechanical, petrological, and rheological domains. From these runs, we have observed several geophysically interesting phenomena, including the development of plumes with unmixed mantle composition as well as plumes with mixed mantle/crust components. Unmixed plumes form at depths greater than 100 km (5-10 km above the upper interface of the subducting slab) and consist of partially molten wet peridotite. Mixed plumes form at lesser depth directly from the subducting slab and contain partially molten hydrated oceanic crust and sediments. These high-resolution simulations have also spurred the development of new visualization methods. We have created a new web-based interface to data from our subduction simulation and other high-resolution 2D data that uses a hierarchical data format to achieve response times of less than one second when accessing data files on the order of 3 GB. This interface, WEB-IS4, uses a Javascript and HTML frontend coupled with a C and PHP backend; it allows the user to perform region-of-interest zooming and real-time colormap selection, and can return relevant statistics for the data in the region of interest.
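
    The region-of-interest zooming described for WEB-IS4 amounts to reading only a small, possibly strided, window of a large gridded dataset. A minimal sketch of that access pattern is given below using Python and h5py; the file and dataset names are hypothetical, and this is not the actual C/PHP backend of WEB-IS4.

      import h5py
      import numpy as np

      def read_roi(path, dataset, y0, y1, x0, x1, step=1):
          # Read a (possibly downsampled) window of a large 2-D dataset without
          # loading the whole file; only the requested slab touches disk.
          with h5py.File(path, "r") as f:
              roi = f[dataset][y0:y1:step, x0:x1:step]
          return np.asarray(roi)

      # Hypothetical usage on a gridded field from the simulation output:
      # roi = read_roi("subduction_fields.h5", "temperature", 0, 4000, 0, 4000, step=8)
      # print(roi.shape, float(roi.mean()))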

  19. TonEBP modulates the protective effect of taurine in ischemia-induced cytotoxicity in cardiomyocytes.

    Science.gov (United States)

    Yang, Y J; Han, Y Y; Chen, K; Zhang, Y; Liu, X; Li, S; Wang, K Q; Ge, J B; Liu, W; Zuo, J

    2015-12-17

    Taurine, which is found at high concentration in the heart, exerts several protective actions on the myocardium. Physiologically, the high level of taurine in the heart is maintained by a taurine transporter (TauT), the expression of which is suppressed under ischemic insult. Although taurine supplementation upregulates TauT expression, elevates the intracellular taurine content and ameliorates the ischemic injury of cardiomyocytes (CMs), little is known about the regulatory mechanisms by which taurine governs TauT expression under ischemia. In this study, we describe the TonE (tonicity-responsive element)/TonEBP (TonE-binding protein) pathway involved in taurine-regulated TauT expression in ischemic CMs. Taurine inhibited the ubiquitin-dependent proteasomal degradation of TonEBP, promoted the translocation of TonEBP into the nucleus, enhanced TauT promoter activity and finally upregulated TauT expression in CMs. In addition, we observed that TonEBP had an anti-apoptotic and anti-oxidative role in CMs under ischemia. Moreover, the protective effects of taurine on myocardial ischemia were TonEBP dependent. Collectively, our findings suggest that TonEBP is a core molecule in the protective mechanism of taurine in CMs under ischemic insult.

  20. TonEBP modulates the protective effect of taurine in ischemia-induced cytotoxicity in cardiomyocytes

    Science.gov (United States)

    Yang, Y J; Han, Y Y; Chen, K; Zhang, Y; Liu, X; Li, S; Wang, K Q; Ge, J B; Liu, W; Zuo, J

    2015-01-01

    Taurine, which is found at high concentration in the heart, exerts several protective actions on the myocardium. Physiologically, the high level of taurine in the heart is maintained by a taurine transporter (TauT), the expression of which is suppressed under ischemic insult. Although taurine supplementation upregulates TauT expression, elevates the intracellular taurine content and ameliorates the ischemic injury of cardiomyocytes (CMs), little is known about the regulatory mechanisms by which taurine governs TauT expression under ischemia. In this study, we describe the TonE (tonicity-responsive element)/TonEBP (TonE-binding protein) pathway involved in taurine-regulated TauT expression in ischemic CMs. Taurine inhibited the ubiquitin-dependent proteasomal degradation of TonEBP, promoted the translocation of TonEBP into the nucleus, enhanced TauT promoter activity and finally upregulated TauT expression in CMs. In addition, we observed that TonEBP had an anti-apoptotic and anti-oxidative role in CMs under ischemia. Moreover, the protective effects of taurine on myocardial ischemia were TonEBP dependent. Collectively, our findings suggest that TonEBP is a core molecule in the protective mechanism of taurine in CMs under ischemic insult. PMID:26673669

  1. 46 CFR 130.110 - Internal communications on OSVs of less than 100 gross tons.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 4 2010-10-01 2010-10-01 false Internal communications on OSVs of less than 100 gross... Internal communications on OSVs of less than 100 gross tons. Each vessel of less than 100 gross tons... have a fixed means of communication between the pilothouse and the place where the auxiliary means of...

  2. Formulation des bétons autoplaçants : Optimisation du squelette ...

    African Journals Online (AJOL)

    Mix design of self-compacting concrete: optimization of the granular skeleton by the Dreux-Gorisse graphical method. ... Many approaches have been developed around the world for the mix design of self-compacting concrete. ... Keywords: SCC - Formulation - Optimization - Aggregate - Additions - Characterization ...

  3. Completion of a Dislocated Metric Space

    Directory of Open Access Journals (Sweden)

    P. Sumati Kumari

    2015-01-01

    Full Text Available We provide a construction for the completion of a dislocated metric space (abbreviated d-metric space); we also prove that the completion of the metric associated with a d-metric coincides with the metric associated with the completion of the d-metric.

  4. Understanding Traditional Research Impact Metrics.

    Science.gov (United States)

    Butler, Joseph S; Sebastian, Arjun S; Kaye, I David; Wagner, Scott C; Morrissey, Patrick B; Schroeder, Gregory D; Kepler, Christopher K; Vaccaro, Alexander R

    2017-05-01

    Traditionally, the success of a researcher has been judged by the number of publications he or she has published in peer-reviewed, indexed, high-impact journals. However, to quantify the impact of research in the wider scientific community, a number of traditional metrics have been used, including the Impact Factor, SCImago Journal Rank, Eigenfactor Score, and Article Influence Score. This article attempts to provide a broad overview of the main traditional impact metrics that have been used to assess scholarly output and research impact. We determine that there is no perfect all-encompassing metric to measure research impact, and, in the modern era, no single traditional metric is capable of accommodating all facets of research impact. Academics and researchers should be aware of the advantages and limitations of traditional metrics and should be judicious when selecting any metrics for an objective assessment of scholarly output and research impact.
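
    As a concrete example of one traditional metric named above, the two-year journal Impact Factor for a year y is conventionally computed as

      \mathrm{IF}_y \;=\;
      \frac{\text{citations received in year } y \text{ to items published in years } y-1 \text{ and } y-2}
           {\text{number of citable items published in years } y-1 \text{ and } y-2}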

  5. Tomographic reconstruction of quantum metrics

    Science.gov (United States)

    Laudato, Marco; Marmo, Giuseppe; Mele, Fabio M.; Ventriglia, Franco; Vitale, Patrizia

    2018-02-01

    In the framework of quantum information geometry we investigate the relationship between monotone metric tensors uniquely defined on the space of quantum tomograms, once the tomographic scheme is chosen, and monotone quantum metrics on the space of quantum states, classified by operator monotone functions, according to the Petz classification theorem. We show that different metrics can be related through a change in the tomographic map and prove that there exists a bijective relation between monotone quantum metrics associated with different operator monotone functions. Such a bijective relation is uniquely defined in terms of solutions of a first order second degree differential equation for the parameters of the involved tomographic maps. We first exhibit an example of a non-linear tomographic map that connects a monotone metric with a new one, which is not monotone. Then we provide a second example where two monotone metrics are uniquely related through their tomographic parameters.

  6. Metric adjusted skew information

    DEFF Research Database (Denmark)

    Hansen, Frank

    2008-01-01

    We extend the concept of Wigner-Yanase-Dyson skew information to something we call "metric adjusted skew information" (of a state with respect to a conserved observable). This "skew information" is intended to be a non-negative quantity bounded by the variance (of an observable in a state) that vanishes for observables commuting with the state. We show that the skew information is a convex function on the manifold of states. It also satisfies other requirements, proposed by Wigner and Yanase, for an effective measure-of-information content of a state relative to a conserved observable. We establish a connection between the geometrical formulation of quantum statistics as proposed by Chentsov and Morozova and measures of quantum information as introduced by Wigner and Yanase and extended in this article. We show that the set of normalized Morozova-Chentsov functions describing the possible...

  7. Random metric spaces and universality

    International Nuclear Information System (INIS)

    Vershik, A M

    2004-01-01

    The notion of random metric space is defined, and it is proved that such a space is isometric to the Urysohn universal metric space with probability one. The main technique is the study of universal and random distance matrices; properties of metric (in particular, universal) spaces are related to properties of distance matrices. Examples of other categories in which randomness and universality coincide (graphs, and so on) are given

  8. The metric system: An introduction

    Science.gov (United States)

    Lumley, Susan M.

    On 13 Jul. 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system, which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost-effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on 25 Jul. 1991, President George Bush signed Executive Order 12770, which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first we examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

  9. Two classes of metric spaces

    Directory of Open Access Journals (Sweden)

    Isabel Garrido

    2016-04-01

    Full Text Available The class of metric spaces (X,d known as small-determined spaces, introduced by Garrido and Jaramillo, are properly defined by means of some type of real-valued Lipschitz functions on X. On the other hand, B-simple metric spaces introduced by Hejcman are defined in terms of some kind of bornologies of bounded subsets of X. In this note we present a common framework where both classes of metric spaces can be studied which allows us to see not only the relationships between them but also to obtain new internal characterizations of these metric properties.

  10. Metric-adjusted skew information

    DEFF Research Database (Denmark)

    Liang, Cai; Hansen, Frank

    2010-01-01

    We give a truly elementary proof of the convexity of metric-adjusted skew information following an idea of Effros. We extend earlier results of weak forms of superadditivity to general metric-adjusted skew information. Recently, Luo and Zhang introduced the notion of semi-quantum states on a bipartite system and proved superadditivity of the Wigner-Yanase-Dyson skew informations for such states. We extend this result to the general metric-adjusted skew information. We finally show that a recently introduced extension to parameter values 1 < p ≤ 2 of the Wigner-Yanase-Dyson skew information is a special case of (unbounded) metric-adjusted skew information.

  11. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    Science.gov (United States)

    Post, J. V.

    1981-01-01

    Software quality metrics were extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

  12. On Indistinguishability Operators, Fuzzy Metrics and Modular Metrics

    Directory of Open Access Journals (Sweden)

    Juan-José Miñana

    2017-12-01

    Full Text Available The notion of indistinguishability operator was introduced by Trillas, E. in 1982, with the aim of fuzzifying the crisp notion of equivalence relation. Such operators allow for measuring the similarity between objects when there is a limitation on the accuracy of the performed measurement or a certain degree of similarity can be only determined between the objects being compared. Since Trillas introduced such kind of operators, many authors have studied their properties and applications. In particular, an intensive research line is focused on the metric behavior of indistinguishability operators. Specifically, the existence of a duality between metrics and indistinguishability operators has been explored. In this direction, a technique to generate metrics from indistinguishability operators, and vice versa, has been developed by several authors in the literature. Nowadays, such a measurement of similarity is provided by the so-called fuzzy metrics when the degree of similarity between objects is measured relative to a parameter. The main purpose of this paper is to extend the notion of indistinguishability operator in such a way that the measurements of similarity are relative to a parameter and, thus, classical indistinguishability operators and fuzzy metrics can be retrieved as a particular case. Moreover, we discuss the relationship between the new operators and metrics. Concretely, we prove the existence of a duality between them and the so-called modular metrics, which provide a dissimilarity measurement between objects relative to a parameter. The new duality relationship allows us, on the one hand, to introduce a technique for generating the new indistinguishability operators from modular metrics and vice versa and, on the other hand, to derive, as a consequence, a technique for generating fuzzy metrics from modular metrics and vice versa. Furthermore, we yield examples that illustrate the new results.
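
    A classical instance of the metric/indistinguishability duality mentioned above, for the Łukasiewicz t-norm (a standard textbook example, not the modular-metric construction of the paper):

      % from an indistinguishability operator E (Lukasiewicz t-norm) to a bounded
      % (pseudo)metric d_E, and from a metric d bounded by 1 back to an operator E_d
      d_E(x,y) \;=\; 1 - E(x,y), \qquad
      E_d(x,y) \;=\; \max\{\, 1 - d(x,y),\; 0 \,\}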

  13. Determination of a Screening Metric for High Diversity DNA Libraries.

    Directory of Open Access Journals (Sweden)

    Nicholas J Guido

    Full Text Available The fields of antibody engineering, enzyme optimization and pathway construction rely increasingly on screening complex variant DNA libraries. These highly diverse libraries allow researchers to sample a maximized sequence space and, therefore, more rapidly identify proteins with significantly improved activity. The current state of the art in synthetic biology allows for libraries with billions of variants, pushing the limits of researchers' ability to qualify libraries for screening by measuring the traditional quality metrics of fidelity and diversity of variants. Instead, when screening variant libraries, researchers typically use a generic, and often insufficient, oversampling rate based on a common rule-of-thumb. We have developed methods to calculate a library-specific oversampling metric, based on fidelity, diversity, and representation of variants, which informs researchers, prior to screening the library, of the amount of oversampling required to ensure that the desired fraction of variant molecules will be sampled. To derive this oversampling metric, we developed a novel alignment tool to efficiently measure frequency counts of individual nucleotide variant positions using next-generation sequencing data. Next, we apply a method based on the "coupon collector" probability theory to construct a curve of upper bound estimates of the sampling size required for any desired variant coverage. The calculated oversampling metric will guide researchers to maximize their efficiency in using highly variant libraries.
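
    A minimal sketch of the coupon-collector bound mentioned above, under the idealized assumption that all variants are equally likely (the paper's metric additionally accounts for library fidelity and variant representation); function names are illustrative.

      import math

      def expected_screens(n_variants, coverage=0.95):
          # Expected number of clones to screen so that `coverage` of n_variants
          # equally likely variants are each seen at least once (coupon collector).
          k = math.ceil(coverage * n_variants)
          return n_variants * sum(1.0 / j for j in range(n_variants - k + 1, n_variants + 1))

      def oversampling_factor(coverage=0.95):
          # Large-library approximation: screens / variants -> ln(1 / (1 - coverage)),
          # e.g. roughly 3-fold oversampling for 95% coverage.
          return math.log(1.0 / (1.0 - coverage))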

  14. Metric representation of DNA sequences.

    Science.gov (United States)

    Wu, Z B

    2000-07-01

    A metric representation of DNA sequences is borrowed from symbolic dynamics. Using this representation, the pattern seen in the chaos game representation of DNA sequences is explained as the suppression of certain nucleotide strings in the DNA sequences. Frequencies of short nucleotide strings and suppression of the shortest ones in the DNA sequences can be determined by using the metric representation.
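
    For illustration, a common form of such a representation is the chaos game map of a sequence into the unit square; the corner assignment below is one conventional choice and not necessarily the one used in the paper.

      # Chaos game representation: each nucleotide is a corner of the unit square
      # and the walk moves halfway toward it; k-mers map to dyadic subsquares,
      # so suppressed strings show up as empty regions of the plot.
      CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

      def cgr_points(seq):
          x, y = 0.5, 0.5
          points = []
          for base in seq.upper():
              cx, cy = CORNERS[base]
              x, y = (x + cx) / 2.0, (y + cy) / 2.0
              points.append((x, y))
          return points

      # Example: print(cgr_points("ACGTACGT")[:3])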

  15. Extending cosmology: the metric approach

    OpenAIRE

    Mendoza, S.

    2012-01-01

    Comment: 2012, Extending Cosmology: The Metric Approach, Open Questions in Cosmology; Review article for an Intech "Open questions in cosmology" book chapter (19 pages, 3 figures). Available from: http://www.intechopen.com/books/open-questions-in-cosmology/extending-cosmology-the-metric-approach

  16. Metrics for Stage Lighting Technology.

    Science.gov (United States)

    Cooper, Gloria S., Ed; Magisos, Joel H., Ed.

    Designed to meet the job-related metric measurement needs of stage lighting technology students, this instructional package is one of five for the arts and humanities occupations cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational terminology,…

  17. Metric regularity and subdifferential calculus

    International Nuclear Information System (INIS)

    Ioffe, A D

    2000-01-01

    The theory of metric regularity is an extension of two classical results: the Lyusternik tangent space theorem and the Graves surjection theorem. Developments in non-smooth analysis in the 1980s and 1990s paved the way for a number of far-reaching extensions of these results. It was also well understood that the phenomena behind the results are of metric origin, not connected with any linear structure. At the same time it became clear that some basic hypotheses of the subdifferential calculus are closely connected with the metric regularity of certain set-valued maps. The survey is devoted to the metric theory of metric regularity and its connection with subdifferential calculus in Banach spaces

  18. Metrics for image segmentation

    Science.gov (United States)

    Rees, Gareth; Greenway, Phil; Morray, Denise

    1998-07-01

    An important challenge in mapping image-processing techniques onto applications is the lack of quantitative performance measures. From a systems engineering perspective these are essential if system level requirements are to be decomposed into sub-system requirements which can be understood in terms of algorithm selection and performance optimization. Nowhere in computer vision is this more evident than in the area of image segmentation. This is a vigorous and innovative research activity, but even after nearly two decades of progress, it remains almost impossible to answer the question 'what would the performance of this segmentation algorithm be under these new conditions?' To begin to address this shortcoming, we have devised a well-principled metric for assessing the relative performance of two segmentation algorithms. This allows meaningful objective comparisons to be made between their outputs. It also estimates the absolute performance of an algorithm given ground truth. Our approach is an information theoretic one. In this paper, we describe the theory and motivation of our method, and present practical results obtained from a range of state of the art segmentation methods. We demonstrate that it is possible to measure the objective performance of these algorithms, and to use the information so gained to provide clues about how their performance might be improved.
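
    As one example of an information-theoretic comparison between two segmentations, the sketch below computes the variation of information, a standard measure offered here for illustration rather than as the authors' metric.

      import numpy as np

      def variation_of_information(seg_a, seg_b):
          # Variation of information between two non-negative integer label images
          # of the same shape: VI = H(A) + H(B) - 2 I(A; B); lower means more similar.
          a = np.asarray(seg_a).ravel()
          b = np.asarray(seg_b).ravel()
          joint = np.zeros((a.max() + 1, b.max() + 1))
          np.add.at(joint, (a, b), 1.0)            # joint label histogram
          p_ab = joint / a.size
          p_a, p_b = p_ab.sum(axis=1), p_ab.sum(axis=0)
          nz = p_ab > 0
          h_a = -np.sum(p_a[p_a > 0] * np.log2(p_a[p_a > 0]))
          h_b = -np.sum(p_b[p_b > 0] * np.log2(p_b[p_b > 0]))
          i_ab = np.sum(p_ab[nz] * np.log2(p_ab[nz] / np.outer(p_a, p_b)[nz]))
          return h_a + h_b - 2.0 * i_ab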

  19. Weyl metrics and wormholes

    Science.gov (United States)

    Gibbons, Gary W.; Volkov, Mikhail S.

    2017-05-01

    We study solutions obtained via applying dualities and complexifications to the vacuum Weyl metrics generated by massive rods and by point masses. Rescaling them and extending to complex parameter values yields axially symmetric vacuum solutions containing singularities along circles that can be viewed as singular matter sources. These solutions have wormhole topology with several asymptotic regions interconnected by throats and their sources can be viewed as thin rings of negative tension encircling the throats. For a particular value of the ring tension the geometry becomes exactly flat although the topology remains non-trivial, so that the rings literally produce holes in flat space. To create a single ring wormhole of one metre radius one needs a negative energy equivalent to the mass of Jupiter. Further duality transformations dress the rings with the scalar field, either conventional or phantom. This gives rise to large classes of static, axially symmetric solutions, presumably including all previously known solutions for a gravity-coupled massless scalar field, as for example the spherically symmetric Bronnikov-Ellis wormholes with phantom scalar. The multi-wormholes contain infinite struts everywhere at the symmetry axes, apart from solutions with locally flat geometry.

  20. Diabetes mellitus typ 2 bland barn och tonåringar : en kvalitativ litteraturstudie

    OpenAIRE

    Lundberg, Sandra; Östman, Daniela

    2012-01-01

    The aim of this thesis is to use a qualitative literature study to identify preventive measures against type 2 diabetes mellitus among children and adolescents. To clarify and create an understanding of these preventive measures, the authors have chosen to examine the causes of and risk factors for type 2 diabetes among children and adolescents. The literature study is carried out to highlight knowledge about type 2 diabetes among children and adolescents. In order to...

  1. 19 CFR 4.100 - Licensing of vessels of less than 30 net tons.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 1 2010-04-01 2010-04-01 false Licensing of vessels of less than 30 net tons. 4.100 Section 4.100 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY VESSELS IN FOREIGN AND DOMESTIC TRADES General § 4.100 Licensing of vessels of less than 30 net tons. (a) The application...

  2. METRICS DEVELOPMENT FOR PATENTS.

    Science.gov (United States)

    Veiga, Daniela Francescato; Ferreira, Lydia Masako

    2015-01-01

    To develop a proposal for metrics for patents to be applied in assessing the postgraduate programs of Medicine III - Capes. From the reading and analysis of the 2013 area documents of all 48 areas of Capes, a proposal for metrics for patents was developed to be applied in Medicine III programs. Except for the areas Biotechnology, Food Science, Biological Sciences III, Physical Education, Engineering I, III and IV, and Interdisciplinary, most areas do not adopt a scoring system for patents. The proposal developed was based on the criteria of Biotechnology, with adaptations. In general, deposits, grants, and licensing/production are valued in ascending order. Higher scores are also assigned to patents registered abroad and whenever students participate. This proposal can be applied to the item Intellectual Production of the evaluation form, in the subsection Technical Production/Patents. The percentage of 10% for academic programs and 40% for professional master's programs should be maintained. The program will be scored as Very Good when it reaches 400 points or over; Good, between 200 and 399 points; Regular, between 71 and 199 points; Weak, up to 70 points; Insufficient, no points.

  3. Interactive (statistical) visualisation and exploration of a billion objects with vaex

    NARCIS (Netherlands)

    Breddels, M. A.

    2016-01-01

    With new catalogues arriving such as the Gaia DR1, containing more than a billion objects, new methods of handling and visualizing these data volumes are needed. We show that by calculating statistics on a regular (N-dimensional) grid, visualizations of a billion objects can be done within a second.
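
    A minimal sketch of the underlying idea (assumed for illustration, not the vaex API): reduce a huge point catalogue to counts on a regular grid and visualize the grid instead of the individual points.

      import numpy as np

      def binned_counts(x, y, bins=256, limits=((0.0, 1.0), (0.0, 1.0))):
          # Reduce a point catalogue to counts on a regular 2-D grid; the grid,
          # not the individual points, is what gets plotted.
          counts, _, _ = np.histogram2d(x, y, bins=bins, range=limits)
          return counts

      rng = np.random.default_rng(0)
      x, y = rng.random(1_000_000), rng.random(1_000_000)   # stand-in catalogue
      grid = binned_counts(x, y)                             # 256 x 256 summary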

  4. Nuclear budget for FY1991 up 3.6% to 409.7 billion yen

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    A total of yen409.7 billion was approved for the Government's draft nuclear energy budget for fiscal 1991 on December 28, when the Cabinet gave its approval. The total, the highest ever, was divided into yen182.6 billion for the general account and yen227.1 billion for the special account for power resources development, representing a 3.6% increase over the ongoing fiscal year's level of yen395.5 billion. The draft budget will be examined for approval by the Diet by the end of March. The nuclear energy budget devoted to research and development projects governed by the Science and Technology Agency amounts to yen306.4 billion, up 3.5% and exceeding yen300 billion for the first time. The nuclear budget for the Ministry of International Trade and Industry is yen98.1 billion, up 3.5%. For the other ministries, including the Ministry of Foreign Affairs, yen5.1 billion was allotted to nuclear energy-related projects. The Government had decided to raise the unit cost of the power plant siting promotion subsidies in the special account for power resources development by 25% --- from yen600/kW to yen750/kW --- in order to support the siting of plants. Consequently, the power resources siting accounts of the special accounts for both STA and MITI showed high growth rates: 6.3% and 7.5%, respectively. (N.K.)
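
    A quick Python check of the arithmetic quoted in this record (figures in billions of yen, rounded as in the original):

        general_account = 182.6
        special_account = 227.1
        total_fy1991 = general_account + special_account      # 409.7
        total_fy1990 = 395.5
        growth_pct = (total_fy1991 / total_fy1990 - 1) * 100   # about 3.6
        print(round(total_fy1991, 1), round(growth_pct, 1))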

  5. Community access networks: how to connect the next billion to the ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Community access networks: how to connect the next billion to the Internet. Despite recent progress with mobile technology diffusion, more than four billion people worldwide are unconnected and have limited access to global communication infrastructure. The cost of implementing connectivity infrastructure in underserved ...

  6. Diamond's 2-billion-year growth charts tectonic shift in early Earth's carbon cycle

    NARCIS (Netherlands)

    Davies, G.R.; Gress, M.U.

    2017-01-01

    A study of tiny mineral 'inclusions' within diamonds from Botswana has shown that diamond crystals can take billions of years to grow. One diamond was found to contain silicate material that formed 2.3 billion years ago in its interior and a 250 million-year-old garnet crystal towards its outer rim,

  7. A Metric for Heterotic Moduli

    Science.gov (United States)

    Candelas, Philip; de la Ossa, Xenia; McOrist, Jock

    2017-12-01

    Heterotic vacua of string theory are realised, at large radius, by a compact threefold with vanishing first Chern class together with a choice of stable holomorphic vector bundle. These form a wide class of potentially realistic four-dimensional vacua of string theory. Despite all their phenomenological promise, there is little understanding of the metric on the moduli space of these vacua. What is sought is the analogue of special geometry for these vacua. The metric on the moduli space is important in phenomenology as it normalises D-terms and Yukawa couplings. It is also of interest in mathematics, since it generalises the metric, first found by Kobayashi, on the space of gauge field connections, to a more general context. Here we construct this metric, correct to first order in α′, in two ways: first by postulating a metric that is invariant under background gauge transformations of the gauge field, and also by dimensionally reducing heterotic supergravity. These methods agree and the resulting metric is Kähler, as is required by supersymmetry. Checking the metric is Kähler is intricate and the anomaly cancellation equation for the H field plays an essential role. The Kähler potential nevertheless takes a remarkably simple form: it is the Kähler potential of special geometry with the Kähler form replaced by the α′-corrected hermitian form.

  8. Colonoscopy quality: metrics and implementation.

    Science.gov (United States)

    Calderwood, Audrey H; Jacobson, Brian C

    2013-09-01

    Colonoscopy is an excellent area for quality improvement because it is high volume, has significant associated risk and expense, and there is evidence that variability in its performance affects outcomes. The best end point for validation of quality metrics in colonoscopy is colorectal cancer incidence and mortality, but a more readily accessible metric is the adenoma detection rate. Fourteen quality metrics were proposed in 2006, and these are described in this article. Implementation of quality improvement initiatives involves rapid assessments and changes on an iterative basis, and can be done at the individual, group, or facility level.
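
    The adenoma detection rate mentioned above is simply the fraction of screening colonoscopies in which at least one adenoma is found; a minimal Python sketch with invented data:

        # Adenomas found per screening colonoscopy (toy data, not from the article).
        adenomas_per_exam = [2, 0, 1, 0, 0, 3, 1, 0]
        adr = sum(1 for n in adenomas_per_exam if n >= 1) / len(adenomas_per_exam)
        print(f"ADR = {adr:.0%}")  # ADR = 50%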

  9. Daylight metrics and energy savings

    Energy Technology Data Exchange (ETDEWEB)

    Mardaljevic, John; Heschong, Lisa; Lee, Eleanor

    2009-12-31

    The drive towards sustainable, low-energy buildings has increased the need for simple, yet accurate methods to evaluate whether a daylit building meets minimum standards for energy and human comfort performance. Current metrics do not account for the temporal and spatial aspects of daylight, nor of occupants comfort or interventions. This paper reviews the historical basis of current compliance methods for achieving daylit buildings, proposes a technical basis for development of better metrics, and provides two case study examples to stimulate dialogue on how metrics can be applied in a practical, real-world context.

  10. Next-Generation Metrics: Responsible Metrics & Evaluation for Open Science

    Energy Technology Data Exchange (ETDEWEB)

    Wilsdon, J.; Bar-Ilan, J.; Peters, I.; Wouters, P.

    2016-07-01

    Metrics evoke a mixed reaction from the research community. A commitment to using data to inform decisions makes some enthusiastic about the prospect of granular, real-time analysis of research and its wider impacts. Yet we only have to look at the blunt use of metrics such as journal impact factors, h-indices and grant income targets to be reminded of the pitfalls. Some of the most precious qualities of academic culture resist simple quantification, and individual indicators often struggle to do justice to the richness and plurality of research. Too often, poorly designed evaluation criteria are “dominating minds, distorting behaviour and determining careers” (Lawrence, 2007). Metrics hold real power: they are constitutive of values, identities and livelihoods. How to exercise that power to more positive ends has been the focus of several recent and complementary initiatives, including the San Francisco Declaration on Research Assessment (DORA), the Leiden Manifesto and The Metric Tide (a UK government review of the role of metrics in research management and assessment). Building on these initiatives, the European Commission, under its new Open Science Policy Platform, is now looking to develop a framework for responsible metrics for research management and evaluation, which can be incorporated into the successor framework to Horizon 2020. (Author)

  11. Management Infrastructure and Metrics Definition

    OpenAIRE

    Loomis, Charles

    2010-01-01

    This document describes the project management bodies, the software development process, and the tools to support them. It also contains a description of the metrics that will be collected over the lifetime of the project to gauge progress.

  12. Hyperbolic geometry for colour metrics.

    Science.gov (United States)

    Farup, Ivar

    2014-05-19

    It is well established from both colour difference and colour order perspectives that the colour space cannot be Euclidean. In spite of this, most colour spaces still in use today are Euclidean, and the best Euclidean colour metrics are performing comparably to state-of-the-art non-Euclidean metrics. In this paper, it is shown that a transformation from Euclidean to hyperbolic geometry (i.e., constant negative curvature) for the chromatic plane can significantly improve the performance of Euclidean colour metrics to the point where they are statistically significantly better than state-of-the-art non-Euclidean metrics on standard data sets. The resulting hyperbolic geometry nicely models both qualitatively and quantitatively the hue super-importance phenomenon observed in colour order systems.
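
    To make the idea concrete, a generic constant-negative-curvature (Poincaré disk) distance is sketched below in Python and compared with the Euclidean distance for two points in a chromatic plane; the paper's actual coordinate transformation may differ, so treat this only as an illustration of what a hyperbolic metric looks like:

        import numpy as np

        def poincare_distance(u, v):
            # Distance in the Poincare disk model (points must lie inside the unit disk).
            u, v = np.asarray(u, float), np.asarray(v, float)
            num = 2.0 * np.sum((u - v) ** 2)
            den = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
            return np.arccosh(1.0 + num / den)

        a, b = [0.1, 0.2], [0.4, -0.3]
        print(poincare_distance(a, b), np.linalg.norm(np.subtract(a, b)))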

  13. Using TRACI for Sustainability Metrics

    Science.gov (United States)

    TRACI, the Tool for the Reduction and Assessment of Chemical and other environmental Impacts, has been developed for sustainability metrics, life cycle impact assessment, and product and process design impact assessment for developing increasingly sustainable products, processes,...

  14. Let's Make Metric Ice Cream

    Science.gov (United States)

    Zimmerman, Marianna

    1975-01-01

    Describes a classroom activity which involved sixth grade students in a learning situation including making ice cream, safety procedures in a science laboratory, calibrating a thermometer, using metric units of volume and mass. (EB)

  15. Experiential space is hardly metric

    Czech Academy of Sciences Publication Activity Database

    Šikl, Radovan; Šimeček, Michal; Lukavský, Jiří

    2008-01-01

    Roč. 2008, č. 37 (2008), s. 58-58 ISSN 0301-0066. [European Conference on Visual Perception. 24.08-28.08.2008, Utrecht] R&D Projects: GA ČR GA406/07/1676 Institutional research plan: CEZ:AV0Z70250504 Keywords : visual space perception * metric and non-metric perceptual judgments * ecological validity Subject RIV: AN - Psychology

  16. Phantom metrics with Killing spinors

    Directory of Open Access Journals (Sweden)

    W.A. Sabra

    2015-11-01

    Full Text Available We study metric solutions of Einstein–anti-Maxwell theory admitting Killing spinors. The analogue of the IWP metric which admits a space-like Killing vector is found and is expressed in terms of a complex function satisfying the wave equation in flat (2+1)-dimensional space–time. As examples, electric and magnetic Kasner spaces are constructed by allowing the solution to depend only on the time coordinate. Euclidean solutions are also presented.

  17. Implementing metrics for process improvement

    OpenAIRE

    McAuley, Angela

    1993-01-01

    There is increasing interest in the use of metrics to control the software development process, to demonstrate productivity and value, and to identify areas for process improvement. Research work completed to date is based on the implementation of metrics in a 'standard' software development environment, and follows either a top-down or bottom-up approach. With the advent of further European unity, many companies are producing localised products, i.e., products which are translated and adapted t...

  18. Comportement en flexion des bétons fibrés sous chargement cyclique

    OpenAIRE

    Boulekbache Bensaid; Hamrat Mostefa; Chemrouk Mohamed; Amziane Sofiane

    2014-01-01

    This paper presents the results of an experimental study on the flexural behaviour of steel-fibre-reinforced concretes. The effect of the concrete's rheology on fibre orientation, and the influence of that orientation on the mechanical properties, are studied. The anchorage stiffness of the fibres, investigated through cyclic tests, is related to the rheological and mechanical characteristics of the matrix. The results show that the fluidity of the concretes is an essential parameter for the orientation of the...

  19. Non-metric chaotic inflation

    Energy Technology Data Exchange (ETDEWEB)

    Enqvist, Kari [Physics Department, University of Helsinki, and Helsinki Institute of Physics, FIN-00014 Helsinki (Finland); Koivisto, Tomi [Institute for Theoretical Physics and Spinoza Institute, Leuvenlaan 4, 3584 CE Utrecht (Netherlands); Rigopoulos, Gerasimos, E-mail: kari.enqvist@helsinki.fi, E-mail: T.S.Koivisto@astro.uio.no, E-mail: rigopoulos@physik.rwth-aachen.de [Institut für Theoretische Teilchenphysik und Kosmologie, RWTH Aachen University, D-52056 Aachen (Germany)

    2012-05-01

    We consider inflation within the context of what is arguably the simplest non-metric extension of Einstein gravity. There, non-metricity is described by a single graviscalar field with a non-minimal kinetic coupling to the inflaton field Ψ, parameterized by a single parameter γ. There is a simple equivalent description in terms of a massless field and an inflaton with a modified potential. We discuss the implications of non-metricity for chaotic inflation and find that it significantly alters the inflaton dynamics for field values Ψ ≳ M_P/γ, dramatically changing the qualitative behaviour in this regime. In the equivalent single-field description this is described as a cuspy potential that forms a barrier beyond which the inflaton becomes a ghost field. This imposes an upper bound on the possible number of e-folds. For the simplest chaotic inflation models, the spectral index and the tensor-to-scalar ratio receive small corrections dependent on the non-metricity parameter. We also argue that significant post-inflationary non-metricity may be generated.

  20. Symmetries of Taub-NUT dual metrics

    International Nuclear Information System (INIS)

    Baleanu, D.; Codoban, S.

    1998-01-01

    Recently geometric duality was analyzed for a metric which admits Killing tensors. An interesting example arises when the manifold has Killing-Yano tensors. The symmetries of the dual metrics in the case of Taub-NUT metric are investigated. Generic and non-generic symmetries of dual Taub-NUT metric are analyzed

  1. Requirement Metrics for Risk Identification

    Science.gov (United States)

    Hammer, Theodore; Huffman, Lenore; Wilson, William; Rosenberg, Linda; Hyatt, Lawrence

    1996-01-01

    The Software Assurance Technology Center (SATC) is part of the Office of Mission Assurance of the Goddard Space Flight Center (GSFC). The SATC's mission is to assist National Aeronautics and Space Administration (NASA) projects to improve the quality of software which they acquire or develop. The SATC's efforts are currently focused on the development and use of metric methodologies and tools that identify and assess risks associated with software performance and scheduled delivery. This starts at the requirements phase, where the SATC, in conjunction with software projects at GSFC and other NASA centers is working to identify tools and metric methodologies to assist project managers in identifying and mitigating risks. This paper discusses requirement metrics currently being used at NASA in a collaborative effort between the SATC and the Quality Assurance Office at GSFC to utilize the information available through the application of requirements management tools.

  2. Moduli spaces of riemannian metrics

    CERN Document Server

    Tuschmann, Wilderich

    2015-01-01

    This book studies certain spaces of Riemannian metrics on both compact and non-compact manifolds. These spaces are defined by various sign-based curvature conditions, with special attention paid to positive scalar curvature and non-negative sectional curvature, though we also consider positive Ricci and non-positive sectional curvature. If we form the quotient of such a space of metrics under the action of the diffeomorphism group (or possibly a subgroup) we obtain a moduli space. Understanding the topology of both the original space of metrics and the corresponding moduli space form the central theme of this book. For example, what can be said about the connectedness or the various homotopy groups of such spaces? We explore the major results in the area, but provide sufficient background so that a non-expert with a grounding in Riemannian geometry can access this growing area of research.

  3. Complexity Metrics for Workflow Nets

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M.P.

    2009-01-01

    analysts have difficulties grasping the dynamics implied by a process model. Recent empirical studies show that people make numerous errors when modeling complex business processes, e.g., about 20 percent of the EPCs in the SAP reference model have design flaws resulting in potential deadlocks, livelocks......, etc. It seems obvious that the complexity of the model contributes to design errors and a lack of understanding. It is not easy to measure complexity, however. This paper presents three complexity metrics that have been implemented in the process analysis tool ProM. The metrics are defined...... for a subclass of Petri nets named Workflow nets, but the results can easily be applied to other languages. To demonstrate the applicability of these metrics, we have applied our approach and tool to 262 relatively complex Protos models made in the context of various student projects. This allows us to validate...

  4. The uniqueness of the Fisher metric as information metric

    Czech Academy of Sciences Publication Activity Database

    Le, Hong-Van

    2017-01-01

    Roč. 69, č. 4 (2017), s. 879-896 ISSN 0020-3157 Institutional support: RVO:67985840 Keywords : Chentsov’s theorem * mixed topology * monotonicity of the Fisher metric Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 1.049, year: 2016 https://link.springer.com/article/10.1007%2Fs10463-016-0562-0

  5. The uniqueness of the Fisher metric as information metric

    Czech Academy of Sciences Publication Activity Database

    Le, Hong-Van

    2017-01-01

    Roč. 69, č. 4 (2017), s. 879-896 ISSN 0020-3157 Institutional support: RVO:67985840 Keywords : Chentsov’s theorem * mixed topology * monotonicity of the Fisher metric Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 1.049, year: 2016 https://link.springer.com/article/10.1007%2Fs10463-016-0562-0

  6. Probability measures on metric spaces

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    In this book, the author gives a cohesive account of the theory of probability measures on complete metric spaces (which is viewed as an alternative approach to the general theory of stochastic processes). After a general description of the basics of topology on the set of measures, the author discusses regularity, tightness, and perfectness of measures, properties of sampling distributions, and metrizability and compactness theorems. Next, he describes arithmetic properties of probability measures on metric groups and locally compact abelian groups. Covered in detail are notions such as decom

  7. Invariant metrics for Hamiltonian systems

    International Nuclear Information System (INIS)

    Rangarajan, G.; Dragt, A.J.; Neri, F.

    1991-05-01

    In this paper, invariant metrics are constructed for Hamiltonian systems. These metrics give rise to norms on the space of homogeneous polynomials of phase-space variables. For an accelerator lattice described by a Hamiltonian, these norms characterize the nonlinear content of the lattice. Therefore, the performance of the lattice can be improved by minimizing the norm as a function of parameters describing the beam-line elements in the lattice. A four-fold increase in the dynamic aperture of a model FODO cell is obtained using this procedure. 7 refs

  8. Thermodynamic Metrics and Optimal Paths

    Energy Technology Data Exchange (ETDEWEB)

    Sivak, David; Crooks, Gavin

    2012-05-08

    A fundamental problem in modern thermodynamics is how a molecular-scale machine performs useful work, while operating away from thermal equilibrium without excessive dissipation. To this end, we derive a friction tensor that induces a Riemannian manifold on the space of thermodynamic states. Within the linear-response regime, this metric structure controls the dissipation of finite-time transformations, and bestows optimal protocols with many useful properties. We discuss the connection to the existing thermodynamic length formalism, and demonstrate the utility of this metric by solving for optimal control parameter protocols in a simple nonequilibrium model.

  9. Technical Privacy Metrics: a Systematic Survey

    OpenAIRE

    Wagner, Isabel; Eckhoff, David

    2015-01-01

    The goal of privacy metrics is to measure the degree of privacy enjoyed by users in a system and the amount of protection offered by privacy-enhancing technologies. In this way, privacy metrics contribute to improving user privacy in the digital world. The diversity and complexity of privacy metrics in the literature makes an informed choice of metrics challenging. As a result, redundant new metrics are proposed frequently, and privacy studies are often incomparable. In this survey we allevia...

  10. Remarks on G-Metric Spaces

    Directory of Open Access Journals (Sweden)

    Bessem Samet

    2013-01-01

    Full Text Available In 2005, Mustafa and Sims (2006) introduced and studied a new class of generalized metric spaces, which are called G-metric spaces, as a generalization of metric spaces. We establish some useful propositions to show that many fixed point theorems on (nonsymmetric) G-metric spaces given recently by many authors follow directly from well-known theorems on metric spaces. Our technique can be easily extended to other results as shown in application.

  11. A method to press powder at 6000 ton using small amount of explosive

    Science.gov (United States)

    Hilmi, Ahmad Humaizi; Azmi, Nor Azmaliana; Ismail, Ariffin

    2017-12-01

    Large-die hydraulic presses are among the key instruments in making jumbo aircraft. Such machines can produce aircraft components such as wing spars, landing gear supports and armor plates. Countries such as the USA, Russia, Germany, Japan, Korea and China have large-die hydraulic presses that can press 50,000 tons. In Malaysia, heavy-duty presses are available from companies such as Proton, which builds car chassis. However, those presses are not able to produce better bulkheads for the engines, fuselage, and wings of an aircraft. This paper presents the design of an apparatus that uses 50 grams of commercial-grade explosive to produce 6000 tons of compaction. This is a first step towards producing a larger-scale apparatus that can deliver a 50,000-ton press. The design was done using AUTODYN blast simulation software. According to the results, the maximum load the apparatus can withstand is 6000 tons, contributed by 50 grams of commercial explosive (Emulex). Explosive charges larger than 50 grams will lead to catastrophic failure. Fabrication of the apparatus was completed; however, testing of the apparatus is not presented in this article.

  12. Safety analysis report on the ''Paducah Tiger'' protective overpack for 10-ton cylinders of uranium hexafluoride

    International Nuclear Information System (INIS)

    Stitt, D.H.

    1975-01-01

    The ''Paducah Tiger'' is a protective overpack used in the shipment of 10-ton cylinders of enriched UF6. The calculations and tests which were made are described, and they indicate that the overpack is in compliance with the type B packaging requirements of ERDA Manual Chapter 0529 and Title 10, Code of Federal Regulations, Part 71. (U.S.)

  13. 10'000 ton ALICE gets her UK-built "Brain"

    CERN Multimedia

    Maddock, Julia

    2007-01-01

    For one of the four LHC experiments, called ALICE, the process got a step closer last week when a crucial part of the 10'000-ton detector, the British-built Central Trigger Processor (CTP), was installed in the ALICE cavern, some 150 feet underground. (plus background information about ALICE) (2,5 pages)

  14. Confined Mobility of TonB and FepA in Escherichia coli Membranes.

    Directory of Open Access Journals (Sweden)

    Yoriko Lill

    Full Text Available The important process of nutrient uptake in Escherichia coli, in many cases, involves transit of the nutrient through a class of beta-barrel proteins in the outer membrane known as TonB-dependent transporters (TBDTs and requires interaction with the inner membrane protein TonB. Here we have imaged the mobility of the ferric enterobactin transporter FepA and TonB by tracking them in the membranes of live E. coli with single-molecule resolution at time-scales ranging from milliseconds to seconds. We employed simple simulations to model/analyze the lateral diffusion in the membranes of E.coli, to take into account both the highly curved geometry of the cell and artifactual effects expected due to finite exposure time imaging. We find that both molecules perform confined lateral diffusion in their respective membranes in the absence of ligand with FepA confined to a region [Formula: see text] μm in radius in the outer membrane and TonB confined to a region [Formula: see text] μm in radius in the inner membrane. The diffusion coefficient of these molecules on millisecond time-scales was estimated to be [Formula: see text] μm2/s and [Formula: see text] μm2/s for FepA and TonB, respectively, implying that each molecule is free to diffuse within its domain. Disruption of the inner membrane potential, deletion of ExbB/D from the inner membrane, presence of ligand or antibody to FepA and disruption of the MreB cytoskeleton was all found to further restrict the mobility of both molecules. Results are analyzed in terms of changes in confinement size and interactions between the two proteins.
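
    For readers unfamiliar with how such diffusion coefficients are obtained, a minimal Python sketch of the standard mean-squared-displacement estimate for a 2-D track is given below (simulated data and a single lag time; this is not the authors' analysis pipeline):

        import numpy as np

        rng = np.random.default_rng(0)
        dt, d_true = 0.01, 0.05                      # frame time (s), true D (um^2/s)
        steps = rng.normal(0.0, np.sqrt(2 * d_true * dt), size=(10_000, 2))
        track = np.cumsum(steps, axis=0)             # x, y positions (um)

        lag = 1                                      # one-frame lag
        disp = track[lag:] - track[:-lag]
        msd = np.mean(np.sum(disp ** 2, axis=1))
        print(msd / (4 * lag * dt))                  # close to d_true for 2-D diffusion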

  15. Socio-technical security metrics

    NARCIS (Netherlands)

    Gollmann, D.; Herley, C.; Koenig, V.; Pieters, W.; Sasse, M.A.

    2015-01-01

    Report from Dagstuhl seminar 14491. This report documents the program and the outcomes of Dagstuhl Seminar 14491 “Socio-Technical Security Metrics”. In the domain of safety, metrics inform many decisions, from the height of new dikes to the design of nuclear plants. We can state, for example, that

  16. Separable metrics and radiating stars

    Indian Academy of Sciences (India)

    We study the junction condition relating the pressure to heat flux at the boundary of an accelerating and expanding spherically symmetric radiating star. We transform the junction condition to an ordinary differential equation by making a separability assumption on the metric functions in the space–time variables.

  17. Warped products and Einstein metrics

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seongtag [Department of Mathematics Education, Inha University, Incheon 402-751 (Korea, Republic of)

    2006-05-19

    Warped product construction is an important method to produce a new metric with a base manifold and a fibre. We construct compact base manifolds with a positive scalar curvature which do not admit any non-trivial Einstein warped product, and noncompact complete base manifolds which do not admit any non-trivial Ricci-flat Einstein warped product. (letter to the editor)

  18. Geometry of Cuts and Metrics

    NARCIS (Netherlands)

    M. Deza; M. Laurent (Monique)

    1997-01-01

    Cuts and metrics are well-known objects that arise - independently, but with many deep and fascinating connections - in diverse fields: in graph theory, combinatorial optimization, geometry of numbers, combinatorial matrix theory, statistical physics, VLSI design etc. This book offers a

  19. Linear and Branching System Metrics

    NARCIS (Netherlands)

    J., Hilston; de Alfaro, Luca; Faella, Marco; M.Z., Kwiatkowska; Telek, M.; Stoelinga, Mariëlle Ida Antoinette

    We extend the classical system relations of trace inclusion, trace equivalence, simulation, and bisimulation to a quantitative setting in which propositions are interpreted not as boolean values, but as elements of arbitrary metric spaces. Trace inclusion and equivalence give rise to asymmetrical

  20. Axiomatic Testing of Structure Metrics

    NARCIS (Netherlands)

    van den Berg, Klaas; van den Broek, P.M.

    1994-01-01

    In this paper, axiomatic testing of software metrics is described. The testing is based on representation axioms from the measurement theory. In a case study, the axioms are given for the formal relational structure and the empirical relational structure. Two approaches of axiomatic testing are

  1. Axiomatic Testing of Structure Metrics

    NARCIS (Netherlands)

    van den Berg, Klaas; van den Broek, P.M.

    In this paper, axiomatic testing of software metrics will be described. The testing is based on representation axioms from the measurement theory. In a case study, the axioms are given for the formal relational structure and the empirical relational structure. Two approaches of axiomatic testing are

  2. Separable metrics and radiating stars

    Indian Academy of Sciences (India)

    2016-12-14

    Dec 14, 2016 ... Abstract. We study the junction condition relating the pressure to heat flux at the boundary of an accelerating and expanding spherically symmetric radiating star. We transform the junction condition to an ordinary differential equation by making a separability assumption on the metric functions in the ...

  3. A ton is not always a ton: A road-test of landfill, manure, and afforestation/reforestation offset protocols in the U.S. carbon market

    International Nuclear Information System (INIS)

    Lee, Carrie M.; Lazarus, Michael; Smith, Gordon R.; Todd, Kimberly; Weitz, Melissa

    2013-01-01

    Highlights:
    • Protocols are the foundation of an offset program.
    • Using sample projects, we “road test” landfill, manure and afforestation protocols from 5 programs.
    • For a given project, we find large variation in the volume of offsets generated.
    • Harmonization of protocols can increase the likelihood that “a ton is a ton”.
    • Harmonization can enhance prospects for linking emission trading systems.
    Abstract: The outcome of recent international climate negotiations suggests we are headed toward a more fragmented carbon market, with multiple emission trading and offset programs operating in parallel. To effectively harmonize and link across programs, it will be important to ensure that, across offset programs and protocols, a “ton is a ton”. In this article, we consider how sample offset projects in the U.S. carbon market are treated across protocols from five programs: the Clean Development Mechanism, Climate Action Reserve, Chicago Climate Exchange, Regional Greenhouse Gas Initiative, and the U.S. EPA's former program, Climate Leaders. We find that differences among protocols for landfill methane, manure management, and afforestation/reforestation project types in accounting boundary definitions, baseline setting methods, measurement rules, emission factors, and discounts lead to differences in offsets credited that are often significant (e.g. greater than 50%). We suggest opportunities for modification and harmonization of protocols that can improve offset quality and credibility and enhance prospects for future linking of trading units and systems
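
    As a toy illustration of the kind of cross-protocol spread the authors report, a short Python sketch (all numbers invented, units tCO2e for one hypothetical project):

        offsets = {"Protocol A": 10_000, "Protocol B": 16_500, "Protocol C": 7_800}
        low, high = min(offsets.values()), max(offsets.values())
        print(f"spread = {(high - low) / low:.0%} of the lowest estimate")  # 112%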

  4. In vivo evidence of TonB shuttling between the cytoplasmic and outer membrane in Escherichia coli.

    Science.gov (United States)

    Larsen, Ray A; Letain, Tracy E; Postle, Kathleen

    2003-07-01

    Gram-negative bacteria are able to convert potential energy inherent in the proton gradient of the cytoplasmic membrane into active nutrient transport across the outer membrane. The transduction of energy is mediated by TonB protein. Previous studies suggest a model in which TonB makes sequential and cyclic contact with proteins in each membrane, a process called shuttling. A key feature of shuttling is that the amino-terminal signal anchor must quit its association with the cytoplasmic membrane, and TonB becomes associated solely with the outer membrane. However, the initial studies did not exclude the possibility that TonB was artifactually pulled from the cytoplasmic membrane by the fractionation process. To resolve this ambiguity, we devised a method to test whether the extreme TonB amino-terminus, located in the cytoplasm, ever became accessible to the cys-specific, cytoplasmic membrane-impermeant molecule, Oregon Green(R) 488 maleimide (OGM) in vivo. A full-length TonB and a truncated TonB were modified to carry a sole cysteine at position 3. Both full-length TonB and truncated TonB (consisting of the amino-terminal two-thirds) achieved identical conformations in the cytoplasmic membrane, as determined by their abilities to cross-link to the cytoplasmic membrane protein ExbB and their abilities to respond conformationally to the presence or absence of proton motive force. Full-length TonB could be amino-terminally labelled in vivo, suggesting that it was periplasmically exposed. In contrast, truncated TonB, which did not associate with the outer membrane, was not specifically labelled in vivo. The truncated TonB also acted as a control for leakage of OGM across the cytoplasmic membrane. Further, the extent of labelling for full-length TonB correlated roughly with the proportion of TonB found at the outer membrane. These findings suggest that TonB does indeed disengage from the cytoplasmic membrane during energy transduction and shuttle to the outer membrane.

  5. Group covariance and metrical theory

    International Nuclear Information System (INIS)

    Halpern, L.

    1983-01-01

    The a priori introduction of a Lie group of transformations into a physical theory has often proved to be useful; it usually serves to describe special simplified conditions before a general theory can be worked out. Newton's assumptions of absolute space and time are examples where the Euclidean group and translation group have been introduced. These groups were extended to the Galilei group and modified in the special theory of relativity to the Poincare group to describe physics under the given conditions covariantly in the simplest way. The criticism of the a priori character leads to the formulation of the general theory of relativity. The general metric theory does not really give preference to a particular invariance group - even the principle of equivalence can be adapted to a whole family of groups. The physical laws covariantly inserted into the metric space are however adapted to the Poincare group. 8 references

  6. Fuzzy polynucleotide spaces and metrics.

    Science.gov (United States)

    Nieto, Juan J; Torres, A; Georgiou, D N; Karakasidis, T E

    2006-04-01

    The study of genetic sequences is of great importance in biology and medicine. Mathematics is playing an important role in the study of genetic sequences and, generally, in bioinformatics. In this paper, we extend the work concerning the Fuzzy Polynucleotide Space (FPS) introduced in Torres, A., Nieto, J.J., 2003. The fuzzy polynucleotide space: basic properties. Bioinformatics 19(5): 587-592, and Nieto, J.J., Torres, A., Vazquez-Trasande, M.M., 2003. A metric space to study differences between polynucleotides. Appl. Math. Lett. 27: 1289-1294, by studying distances between nucleotides and some complete genomes using several metrics. We also present new results concerning the notions of similarity, difference and equality between polynucleotides. The results are encouraging since they demonstrate how the notions of distance and similarity between polynucleotides in the FPS can be employed in the analysis of genetic material.

  7. Quality Metrics in Inpatient Neurology.

    Science.gov (United States)

    Dhand, Amar

    2015-12-01

    Quality of care in the context of inpatient neurology is the standard of performance by neurologists and the hospital system as measured against ideal models of care. There are growing regulatory pressures to define health care value through concrete quantifiable metrics linked to reimbursement. Theoretical models of quality acknowledge its multimodal character with quantitative and qualitative dimensions. For example, the Donabedian model distils quality as a phenomenon of three interconnected domains, structure-process-outcome, with each domain mutually influential. The actual measurement of quality may be implicit, as in peer review in morbidity and mortality rounds, or explicit, in which criteria are prespecified and systemized before assessment. As a practical contribution, in this article a set of candidate quality indicators for inpatient neurology based on an updated review of treatment guidelines is proposed. These quality indicators may serve as an initial blueprint for explicit quality metrics long overdue for inpatient neurology.

  8. 50 CFR Table 1c to Part 660... - 2009, Open Access and Limited Entry Allocations by Species or Species Group (weights in metric tons)

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false 2009, Open Access and Limited Entry... Part 660, Subpart G—2009, Open Access and Limited Entry Allocations by Species or Species Group... limited entry fixed gear, 2.5 mt for directed open access, 4.9 mt for Washington recreational, 16.0 mt for...

  9. A stationary q-metric

    Science.gov (United States)

    Toktarbay, S.; Quevedo, H.

    2014-10-01

    We present a stationary generalization of the static q-metric, the simplest generalization of the Schwarzschild solution that contains a quadrupole parameter. It possesses three independent parameters that are related to the mass, quadrupole moment and angular momentum. We investigate the geometric and physical properties of this exact solution of Einstein's vacuum equations, and show that it can be used to describe the exterior gravitational field of rotating, axially symmetric, compact objects.

  10. Sensory Metrics of Neuromechanical Trust.

    Science.gov (United States)

    Softky, William; Benford, Criscillia

    2017-09-01

    Today digital sources supply a historically unprecedented component of human sensorimotor data, the consumption of which is correlated with poorly understood maladies such as Internet addiction disorder and Internet gaming disorder. Because both natural and digital sensorimotor data share common mathematical descriptions, one can quantify our informational sensorimotor needs using the signal processing metrics of entropy, noise, dimensionality, continuity, latency, and bandwidth. Such metrics describe in neutral terms the informational diet human brains require to self-calibrate, allowing individuals to maintain trusting relationships. With these metrics, we define the trust humans experience using the mathematical language of computational models, that is, as a primitive statistical algorithm processing finely grained sensorimotor data from neuromechanical interaction. This definition of neuromechanical trust implies that artificial sensorimotor inputs and interactions that attract low-level attention through frequent discontinuities and enhanced coherence will decalibrate a brain's representation of its world over the long term by violating the implicit statistical contract for which self-calibration evolved. Our hypersimplified mathematical understanding of human sensorimotor processing as multiscale, continuous-time vibratory interaction allows equally broad-brush descriptions of failure modes and solutions. For example, we model addiction in general as the result of homeostatic regulation gone awry in novel environments (sign reversal) and digital dependency as a sub-case in which the decalibration caused by digital sensorimotor data spurs yet more consumption of them. We predict that institutions can use these sensorimotor metrics to quantify media richness to improve employee well-being; that dyads and family-size groups will bond and heal best through low-latency, high-resolution multisensory interaction such as shared meals and reciprocated touch; and
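
    One of the signal-processing metrics listed above, Shannon entropy, can be estimated for any discretised signal; a minimal Python sketch (the synthetic signal and the binning are arbitrary choices, not the authors' formulation):

        import numpy as np

        rng = np.random.default_rng(1)
        signal = rng.normal(size=100_000)            # stand-in sensor stream
        counts, _ = np.histogram(signal, bins=64)
        p = counts / counts.sum()
        p = p[p > 0]
        entropy_bits = float(-(p * np.log2(p)).sum())
        print(f"{entropy_bits:.2f} bits per sample (64-bin discretisation)")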

  11. Multi-Metric Sustainability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cowlin, Shannon [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heimiller, Donna [National Renewable Energy Lab. (NREL), Golden, CO (United States); Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Pless, Jacquelyn [National Renewable Energy Lab. (NREL), Golden, CO (United States); Munoz, David [Colorado School of Mines, Golden, CO (United States)

    2014-12-01

    A readily accessible framework that allows for evaluating impacts and comparing tradeoffs among factors in energy policy, expansion planning, and investment decision making is lacking. Recognizing this, the Joint Institute for Strategic Energy Analysis (JISEA) funded an exploration of multi-metric sustainability analysis (MMSA) to provide energy decision makers with a means to make more comprehensive comparisons of energy technologies. The resulting MMSA tool lets decision makers simultaneously compare technologies and potential deployment locations.

  12. Design and analysis of a 1-ton prototype of the Jinping Neutrino Experiment

    Science.gov (United States)

    Wang, Zongyi; Wang, Yuanqing; Wang, Zhe; Chen, Shaomin; Du, Xinxi; Zhang, Tianxiong; Guo, Ziyi; Yuan, Huanxin

    2017-05-01

    The Jinping Neutrino Experiment will perform in-depth research on solar neutrinos and geo-neutrinos. Two structural options (i.e., cylindrical and spherical schemes) are proposed for the Jinping detector based on other successful underground neutrino detectors. Several key factors in the design are also discussed in detail. A 1-ton prototype of the Jinping experiment is proposed based on physics requirements. Subsequently, the structural design, installation procedure, and mechanical analysis of the neutrino detector prototype are discussed. The results show that the maximum Mises stresses on the acrylic vessel, stainless steel truss, and the tank are all lower than the design values of the strengths. The stability requirement of the stainless steel truss in the detector prototype is satisfied. Consequently, the structural scheme for the 1-ton prototype is safe and reliable.

  13. Diabetes and depression: A review with special focus on India

    Directory of Open Access Journals (Sweden)

    Megha Thakur

    2015-10-01

    Full Text Available Diabetes, a psychologically challenging condition for the patients and their care givers, has been found to be a significant risk factor for depression. Depression may be a critical barrier to effective diabetes management. The accompanying fatigue remarkably lowers the motivation for self-care, often leading to lowered physical and emotional well-being, poor markers of diabetes control, poor adherence to medication, and increased mortality among individuals with diabetes. A very small proportion of the diabetes patients with depression get diagnosed, and furthermore, only a handful of the ones diagnosed get treated for depression. Despite the fact that 80 percent of the people with type 2 diabetes reside in low and middle income countries, most of the evidence on diabetes and depression comes from high income countries. This review offers a summary of existing evidence and the potential gaps that need to be addressed.

  14. Design and analysis of a 1-ton prototype of the Jinping Neutrino Experiment

    International Nuclear Information System (INIS)

    Wang, Zongyi; Wang, Yuanqing; Wang, Zhe; Chen, Shaomin; Du, Xinxi; Zhang, Tianxiong; Guo, Ziyi; Yuan, Huanxin

    2017-01-01

    The Jinping Neutrino Experiment will perform in-depth research on solar neutrinos and geo-neutrinos. Two structural options (i.e., cylindrical and spherical schemes) are proposed for the Jinping detector based on other successful underground neutrino detectors. Several key factors in the design are also discussed in detail. A 1-ton prototype of the Jinping experiment is proposed based on physics requirements. Subsequently, the structural design, installation procedure, and mechanical analysis of the neutrino detector prototype are discussed. The results show that the maximum Mises stresses on the acrylic vessel, stainless steel truss, and the tank are all lower than the design values of the strengths. The stability requirement of the stainless steel truss in the detector prototype is satisfied. Consequently, the structural scheme for the 1-ton prototype is safe and reliable.

  15. Design and analysis of a 1-ton prototype of the Jinping Neutrino Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Zongyi, E-mail: wangzongyi1990@outlook.com [School of Civil Engineering, Wuhan University, Wuhan 430072 (China); Wang, Yuanqing [Key Laboratory of Civil Engineering Safety and Durability of Education Ministry, Tsinghua University, Beijing 100084 (China); Wang, Zhe; Chen, Shaomin [Department of Engineering Physics, Tsinghua University, Beijing 100084 (China); Du, Xinxi [School of Civil Engineering, Wuhan University, Wuhan 430072 (China); Zhang, Tianxiong [School of Civil Engineering, Tianjin University, Tianjin 300072 (China); Guo, Ziyi [Department of Engineering Physics, Tsinghua University, Beijing 100084 (China); Yuan, Huanxin [School of Civil Engineering, Wuhan University, Wuhan 430072 (China)

    2017-05-21

    The Jinping Neutrino Experiment will perform in-depth research on solar neutrinos and geo-neutrinos. Two structural options (i.e., cylindrical and spherical schemes) are proposed for the Jinping detector based on other successful underground neutrino detectors. Several key factors in the design are also discussed in detail. A 1-ton prototype of the Jinping experiment is proposed based on physics requirements. Subsequently, the structural design, installation procedure, and mechanical analysis of the neutrino detector prototype are discussed. The results show that the maximum Mises stresses on the acrylic vessel, stainless steel truss, and the tank are all lower than the design values of the strengths. The stability requirement of the stainless steel truss in the detector prototype is satisfied. Consequently, the structural scheme for the 1-ton prototype is safe and reliable.

  16. Measuring Sustainability: Deriving Metrics From Objectives (Presentation)

    Science.gov (United States)

    The definition of 'sustain', to keep in existence, provides some insight into the metrics that are required to measure sustainability and adequately respond to assure sustainability. Keeping something in existence implies temporal and spatial contexts and requires metrics that g...

  17. Framework for Information Age Assessment Metrics

    National Research Council Canada - National Science Library

    Augustine, Thomas H; Broyles, James W

    2004-01-01

    ... all of these metrics. Specifically this paper discusses an Information Age Framework for Assessment Metrics and relates its elements to the fundamental facets of a C4ISR enterprise architecture...

  18. Almost contact metric 3-submersions

    Directory of Open Access Journals (Sweden)

    Bill Watson

    1984-01-01

    Full Text Available An almost contact metric 3-submersion is a Riemannian submersion π from an almost contact metric manifold (M^{4m+3}, (φ_i, ξ_i, η_i)_{i=1}^{3}, g) onto an almost quaternionic manifold (N^{4n}, (J_i)_{i=1}^{3}, h) which commutes with the structure tensors of type (1,1); i.e., π_*φ_i = J_i π_*, for i = 1, 2, 3. For various restrictions on ∇φ_i (e.g., M is 3-Sasakian), we show corresponding limitations on the second fundamental form of the fibres and on the complete integrability of the horizontal distribution. Concomitantly, relations are derived between the Betti numbers of a compact total space and the base space. For instance, if M is 3-quasi-Sasakian (dΦ = 0), then b_1(N) ≤ b_1(M). The respective φ_i-holomorphic sectional and bisectional curvature tensors are studied and several unexpected results are obtained. As an example, if X and Y are orthogonal horizontal vector fields on the 3-contact (a relatively weak structure) total space of such a submersion, then the respective holomorphic bisectional curvatures satisfy B_{φ_i}(X, Y) = B′_{J′_i}(X_*, Y_*) − 2. Applications to the real differential geometry of Yang-Mills field equations are indicated, based on the fact that a principal SU(2)-bundle over a compactified realized space-time can be given the structure of an almost contact metric 3-submersion.

  19. Mine design for producing 100,000 tons per day of uranium-bearing Chattanooga Shale

    International Nuclear Information System (INIS)

    Hoe, H.L.

    1979-01-01

    Chattanooga Shale, underlying some 40,000 square miles in the southeastern United States, is considered to be a potentially large, low-grade source of uranium. The area in and near Dekalb County, Tennessee, appears to be the most likely site for commercial development. This paper deals with the mine design, mining procedures, equipment requirements, and operating maintenance costs for an underground mining complex capable of producing 100,000 tons of Chattanooga Shale per day for delivery to a beneficiation process

  20. AGILE detection of gamma-ray emission from the FSRQ Ton 0599

    Science.gov (United States)

    Bulgarelli, A.; Parmiggiani, N.; Lucarelli, F.; Verrecchia, F.; Pittori, C.; Tavani, M.; Vercellone, S.; Colafrancesco, S.; Cardillo, M.; Piano, G.; Ursi, A.; Fioretti, V.; Pilia, M.; Donnarumma, I.; Gianotti, F.; Trifoglio, M.; Giuliani, A.; Mereghetti, S.; Caraveo, P.; Perotti, F.; Chen, A.; Argan, A.; Costa, E.; Del Monte, E.; Evangelista, Y.; Feroci, M.; Lazzarotto, F.; Lapshov, I.; Pacciani, L.; Soffitta, P.; Sabatini, S.; Vittorini, V.; Pucella, G.; Rapisarda, M.; Di Cocco, G.; Fuschino, F.; Galli, M.; Labanti, C.; Marisaldi, M.; Pellizzoni, A.; Trois, A.; Barbiellini, G.; Vallazza, E.; Longo, F.; Morselli, A.; Picozza, P.; Prest, M.; Lipari, P.; Zanello, D.; Cattaneo, P. W.; Rappoldi, A.; Ferrari, A.; Paoletti, F.; Antonelli, A.; Giommi, P.; Salotti, L.; Valentini, G.; D'Amico, F.

    2017-12-01

    AGILE is detecting increasing gamma-ray emission above 100 MeV from a source positionally consistent with the FSRQ Ton 0599. Integrating from 2017-12-16 05:45 to 2017-12-18 05:45 UT, a preliminary maximum likelihood analysis yields a detection above 5 sigma and a flux F(E > 100 MeV)=(2.0 +/- 0.6) x 10^-6 ph cm^-2 s^-1.

  1. Comportement d'un béton à hautes performances à base de laitier ...

    African Journals Online (AJOL)

    The use of high-performance concrete (HPC) incorporating supplementary cementitious materials such as fly ash, silica fume or hydraulic slag ... reinforcement bars, which are in turn attacked. It is possible to modify the ... sudden quenching with pressurized water; it is a sand with a 0/5 mm grading.

  2. Comportement d'un béton à hautes performances à base de laitier ...

    African Journals Online (AJOL)

    The use of high-performance concrete (HPC) incorporating supplementary cementitious materials such as fly ash, silica fume or cementitious hydraulic slag has increased considerably over the last two decades. HPC structures last longer and entail maintenance costs ...

  3. Overall view of the AA hall dominated by the 50 ton crane (Donges).

    CERN Multimedia

    1980-01-01

    A 50 ton, 32 metre span overhead travelling crane was mounted in one of the bays of Hall 193 (AA). An identical crane was mounted on the other bay. See also photo 8004261. For photos of the AA in different phases of completion (between 1979 and 1982) see: 7911303, 7911597X, 8004261, 8004608X, 8005563X, 8005565X, 8006716X, 8006722X, 8010939X, 8010941X, 8202324, 8202658X, 8203628X.

  4. Object-Oriented Metrics Which Predict Maintainability

    OpenAIRE

    Li, Wei; Henry, Sallie M.

    1993-01-01

    Software metrics have been studied in the procedural paradigm as a quantitative means of assessing the software development process as well as the quality of software products. Several studies have validated that various metrics are useful indicators of maintenance effort in the procedural paradigm. However, software metrics have rarely been studied in the object oriented paradigm. Very few metrics have been proposed to measure object oriented systems, and the proposed ones have not been v...

  5. A proposal of knowledge engineering metrics

    OpenAIRE

    Britos, Paola Verónica; García Martínez, Ramón; Hauge, Ødwin

    2005-01-01

    Metrics used in the development of expert systems are not a well-investigated problem area. This article suggests some metrics to be used to measure the maturity of the conceptualization process and the complexity of the decision process in the problem domain. We propose some further work to be done with these metrics. Applying those metrics brings new and interesting problems concerning the structure of knowledge to the surface.

  6. Nuclear criticality safety study for increased enrichment limit in 2 1/2-ton (30B) UF6 cylinders

    International Nuclear Information System (INIS)

    Tayloe, R.W. Jr.; Davis, T.C.; Lindenschmidt, D.J.; Fentiman, A.W.

    1992-10-01

    The current U-235 enrichment limit for 30B 2 1/2-ton UF6 cylinders is 5.0 percent. This feasibility study examined the nuclear criticality safety issues associated with increasing the U-235 enrichment limit in 2 1/2-ton cylinders to 10.0 percent. Operations affected include cylinder cleaning, filling, storage, sampling, and transfer

  7. 46 CFR 25.25-17 - Survival craft requirements for uninspected passenger vessels of at least 100 gross tons.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 1 2010-10-01 2010-10-01 false Survival craft requirements for uninspected passenger... Survival craft requirements for uninspected passenger vessels of at least 100 gross tons. (a) Each uninspected passenger vessel of at least 100 gross tons must have adequate survival craft with enough capacity...

  8. TonB-Dependent Utilization of Dihydroxamate Xenosiderophores in Synechocystis sp. PCC 6803.

    Science.gov (United States)

    Babykin, Michael M; Obando, Tobias S A; Zinchenko, Vladislav V

    2018-02-01

    In Gram-negative bacteria, transport of ferric siderophores through outer membrane is a complex process that requires specific outer membrane transporters and energy-transducing TonB-ExbB-ExbD system in the cytoplasmic membrane. The genome of the non-siderophore-producing cyanobacterium Synechocystis sp. PCC 6803 encodes all putative components of the siderophore-mediated iron uptake system. So far, there has been no experimental evidence for the existence of such a pathway in this organism. On the contrary, its reductive iron uptake pathway has been studied in detail. We demonstrate that Synechocystis sp. PCC 6803 is capable of using dihydroxamate xenosiderophores, either ferric schizokinen (FeSK) or a siderophore of the filamentous cyanobacterium Anabaena variabilis ATCC 29413 (SAV), as the sole source of iron. Inactivation of the tonB gene or the exbB1-exbD1 gene cluster resulted in an inability to utilize these siderophores. At the same time, the inactivation of the feoB gene encoding FeoB plasma membrane ferrous iron transporter, or one of the futB or futC genes encoding permease and ATPase subunit of FutABC ferric iron transporter, did not impair the ability of cells to utilize FeSK or SAV as the sole source of iron for growth. Our data suggest that cyanobacterium Synechocystis sp. PCC 6803 is capable of acquiring iron-siderophore complexes in a TonB-dependent manner without iron reduction in the periplasm.

  9. Crowdsourcing metrics of digital collections

    Directory of Open Access Journals (Sweden)

    Tuula Pääkkönen

    2015-12-01

    Full Text Available In the National Library of Finland (NLF) there are millions of digitized newspaper and journal pages, which are openly available via the public website http://digi.kansalliskirjasto.fi. To serve users better, last year the front end was completely overhauled with its main aim being crowdsourcing features, e.g., giving end-users the opportunity to create digital clippings and a personal scrapbook from the digital collections. But how can you know whether crowdsourcing has had an impact? How much have the crowdsourcing functionalities been used so far? Did crowdsourcing work? In this paper the statistics and metrics of a recent crowdsourcing effort are analysed across the different digitized material types (newspapers, journals, ephemera). The subjects, categories and keywords given by the users are analysed to see which topics are the most appealing. Some notable public uses of the crowdsourced article clippings are highlighted. These metrics give us indications of how the end-users, based on their own interests, are investigating and using the digital collections. Therefore, the suggested metrics illustrate the versatility of the information needs of the users, varying from citizen science to research purposes. By analysing the user patterns, we can respond to the new needs of the users by making minor changes to accommodate the most active participants, while still making the service more approachable for those who are trying out the functionalities for the first time. Participation in the clippings and annotations can enrich the materials in unexpected ways and can possibly pave the way for opportunities of using crowdsourcing more also in research contexts. This creates more opportunities for the goals of open science since source data becomes available, making it possible for researchers to reach out to the general public for help. In the long term, utilizing, for example, text mining methods can allow these different end-user segments to

  10. Quality metrics for sensor images

    Science.gov (United States)

    Ahumada, AL

    1993-01-01

    Methods are needed for evaluating the quality of augmented visual displays (AVID). Computational quality metrics will help summarize, interpolate, and extrapolate the results of human performance tests with displays. The FLM Vision group at NASA Ames has been developing computational models of visual processing and using them to develop computational metrics for similar problems. For example, display modeling systems use metrics for comparing proposed displays, halftone optimization methods use metrics to evaluate the difference between the halftone and the original, and image compression methods minimize the predicted visibility of compression artifacts. The visual discrimination models take as input two arbitrary images A and B and compute an estimate of the probability that a human observer will report that A is different from B. If A is an image that one desires to display and B is the actual displayed image, such an estimate can be regarded as an image quality metric reflecting how well B approximates A. There are additional complexities associated with the problem of evaluating the quality of radar and IR enhanced displays for AVID tasks. One important problem is the question of whether intruding obstacles are detectable in such displays. Although the discrimination model can handle detection situations by making B the original image A plus the intrusion, this detection model makes the inappropriate assumption that the observer knows where the intrusion will be. Effects of signal uncertainty need to be added to our models. A pilot needs to make decisions rapidly. The models need to predict not just the probability of a correct decision, but the probability of a correct decision by the time the decision needs to be made; that is, the models need to predict latency as well as accuracy. Luce and Green have generated models for auditory detection latencies. Similar models are needed for visual detection. Most image quality models are designed for static imagery

  11. Validation in the Software Metric Development Process

    NARCIS (Netherlands)

    van den Berg, Klaas; van den Broek, P.M.

    In this paper the validation of software metrics will be examined. Two approaches will be combined: representational measurement theory and a validation network scheme. The development process of a software metric will be described, together with validities for the three phases of the metric

  12. Modeling and analysis of metrics databases

    OpenAIRE

    Paul, Raymond A.

    1999-01-01

    The main objective of this research is to propose a comprehensive framework for quality and risk management in the software development process based on analysis and modeling of software metrics data. Existing software metrics work has focused mainly on the type of metrics to be collected ...

  13. Invariant Matsumoto metrics on homogeneous spaces

    OpenAIRE

    Salimi Moghaddam, H.R.

    2014-01-01

    In this paper we consider invariant Matsumoto metrics which are induced by invariant Riemannian metrics and invariant vector fields on homogeneous spaces, and then we give the flag curvature formula for them. We also study the special cases of naturally reductive spaces and bi-invariant metrics. We end the article by giving some examples of geodesically complete Matsumoto spaces.

  14. Context-dependent ATC complexity metric

    NARCIS (Netherlands)

    Mercado Velasco, G.A.; Borst, C.

    2015-01-01

    Several studies have investigated Air Traffic Control (ATC) complexity metrics in a search for a metric that could best capture workload. These studies have shown how daunting the search for a universal workload metric (one that could be applied in different contexts: sectors, traffic patterns,

  15. Towards a catalog format for software metrics

    NARCIS (Netherlands)

    Bouwers, E.; Visser, J.; Van Deursen, A.

    2014-01-01

    In the past two decades both the industry and the research community have proposed hundreds of metrics to track software projects, evaluate quality or estimate effort. Unfortunately, it is not always clear which metric works best in a particular context. Even worse, for some metrics there is little

  16. A Common Metric for Integrating Research Findings.

    Science.gov (United States)

    Haladyna, Tom

    The choice of a common metric for the meta-analysis (quantitative synthesis) of correlational and experimental research studies is presented and justified. First, a background for the problem of identifying a common metric is presented. Second, the percentage of accounted variance (PAV) is described as the metric of choice, and reasons are given…

  17. Readability of the web: a study on 1 billion web pages

    NARCIS (Netherlands)

    de Heus, Marije; Hiemstra, Djoerd

    We have performed a readability study on more than 1 billion web pages. The Automated Readability Index was used to determine the average grade level required to easily comprehend a website. Some of the results are that a 16-year-old can easily understand 50% of the web and an 18-year-old can easily
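
    The grade-level figure quoted above comes from the Automated Readability Index. As a rough illustration of how such a score is computed, the sketch below applies the commonly cited ARI formula, which combines characters per word and words per sentence; the simple tokenization rules are assumptions made for this example, not the ones used in the study.

        import re

        def automated_readability_index(text: str) -> float:
            """Commonly cited form of the ARI:
            4.71 * (characters / words) + 0.5 * (words / sentences) - 21.43,
            read as an approximate US school grade level."""
            words = re.findall(r"[A-Za-z0-9']+", text)
            sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
            if not words or not sentences:
                raise ValueError("need at least one word and one sentence")
            characters = sum(len(w) for w in words)
            return 4.71 * characters / len(words) + 0.5 * len(words) / len(sentences) - 21.43

        print(round(automated_readability_index(
            "We have performed a readability study on more than one billion web pages."), 1))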

  18. Price of next big thing in physics: $6.7 billion

    CERN Multimedia

    Overbye, Dennis

    2007-01-01

    The price of exploring inner space went up Thursday. The machine, discussed at a news conference in Beijing, will be 20 miles long and would cost about $6.7 billion and 13,000 person-years of labor to build. (1.5 pages)

  19. U of M seeking $1.1 billion in projects for Soudan Mine lab.

    CERN Multimedia

    2003-01-01

    The University of Minnesota is hoping that groundbreaking research underway at its labs at the Soudan Underground Mine near Tower will help secure up to $1.1 billion in the next 5 to 20 years to expand its work into particle physics (1 page).

  20. How much energy is locked in the USA? Alternative metrics for characterising the magnitude of overweight and obesity derived from BRFSS 2010 data.

    Science.gov (United States)

    Reidpath, Daniel D; Masood, Mohd; Allotey, Pascale

    2014-06-01

    Four metrics to characterise population overweight are described. Behavioral Risk Factor Surveillance System (BRFSS) data were used to estimate the weight the US population needed to lose to achieve a BMI below 25, along with the associated volume, energy, and energy value. About 144 million people in the US need to lose 2.4 million metric tonnes. The volume of fat is 2.6 billion litres, or 1,038 Olympic-size swimming pools. The energy in the fat would power 90,000 households for a year and is worth around 162 million dollars. Four confronting ways of talking about national overweight and obesity are described. The value of the metrics remains to be tested.
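
    The headline numbers above are unit conversions from the estimated excess mass; a back-of-the-envelope check is sketched below. The fat density and the nominal Olympic pool volume are assumptions made for this illustration, not values taken from the paper, so the totals land in the same ballpark rather than matching exactly.

        # Rough check of the abstract's conversions (assumed constants, not from the paper).
        excess_mass_tonnes = 2.4e6            # 2.4 million metric tonnes of excess weight
        fat_density_kg_per_l = 0.9            # approximate density of body fat, kg per litre
        pool_volume_l = 50 * 25 * 2 * 1000    # nominal 50 m x 25 m x 2 m Olympic pool, in litres

        volume_l = excess_mass_tonnes * 1000 / fat_density_kg_per_l
        print(f"volume: {volume_l / 1e9:.1f} billion litres")   # about 2.7 billion litres
        print(f"pools:  {volume_l / pool_volume_l:,.0f}")       # on the order of 1,000 pools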

  1. Some Equivalences between Cone b-Metric Spaces and b-Metric Spaces

    Directory of Open Access Journals (Sweden)

    Poom Kumam

    2013-01-01

    Full Text Available We introduce a b-metric on the cone b-metric space and then prove some equivalences between them. As applications, we show that fixed point theorems on cone b-metric spaces can be obtained from fixed point theorems on b-metric spaces.

  2. Angles between Curves in Metric Measure Spaces

    Directory of Open Access Journals (Sweden)

    Han Bang-Xian

    2017-09-01

    Full Text Available The goal of the paper is to study the angle between two curves in the framework of metric (and metric measure) spaces. More precisely, we give a new notion of angle between two curves in a metric space. Such a notion has a natural interplay with optimal transportation and is particularly well suited for metric measure spaces satisfying the curvature-dimension condition. Indeed one of the main results is the validity of the cosine formula on RCD*(K, N) metric measure spaces. As a consequence, the newly introduced notions are compatible with the corresponding classical ones for Riemannian manifolds, Ricci limit spaces and Alexandrov spaces.

  3. On characterizations of quasi-metric completeness

    Energy Technology Data Exchange (ETDEWEB)

    Dag, H.; Romaguera, S.; Tirado, P.

    2017-07-01

    Hu proved in [4] that a metric space (X, d) is complete if and only if for any closed subspace C of (X, d), every Banach contraction on C has a fixed point. Since then several authors have investigated the problem of characterizing metric completeness by means of fixed point theorems. Recently this problem has been studied in the more general context of quasi-metric spaces for different notions of completeness. Here we present a characterization of a kind of completeness for quasi-metric spaces by means of a quasi-metric version of Hu's theorem. (Author)

  4. Web metrics for library and information professionals

    CERN Document Server

    Stuart, David

    2014-01-01

    This is a practical guide to using web metrics to measure impact and demonstrate value. The web provides an opportunity to collect a host of different metrics, from those associated with social media accounts and websites to more traditional research outputs. This book is a clear guide for library and information professionals as to what web metrics are available and how to assess and use them to make informed decisions and demonstrate value. As individuals and organizations increasingly use the web in addition to traditional publishing avenues and formats, this book provides the tools to unlock web metrics and evaluate the impact of this content. The key topics covered include: bibliometrics, webometrics and web metrics; data collection tools; evaluating impact on the web; evaluating social media impact; investigating relationships between actors; exploring traditional publications in a new environment; web metrics and the web of data; the future of web metrics and the library and information professional.Th...

  5. Information Distances versus Entropy Metric

    Directory of Open Access Journals (Sweden)

    Bo Hu

    2017-06-01

    Full Text Available Information distance has become an important tool in a wide variety of applications. Various types of information distance have been proposed over the years. These information distance measures are different from the entropy metric, as the former are based on Kolmogorov complexity and the latter on Shannon entropy. However, for any computable probability distribution, up to a constant, the expected value of Kolmogorov complexity equals the Shannon entropy. We study the similar relationship between entropy and information distance. We also study the relationship between entropy and the normalized versions of information distances.
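
    Because Kolmogorov complexity is uncomputable, practical work usually approximates the normalized information distance with the normalized compression distance, substituting a real compressor for K. The sketch below uses zlib for that substitution; it illustrates the general idea only and is not one of the constructions analysed in the paper.

        import zlib

        def c(data: bytes) -> int:
            """Compressed length, a computable stand-in for Kolmogorov complexity."""
            return len(zlib.compress(data, 9))

        def ncd(x: bytes, y: bytes) -> float:
            """Normalized compression distance:
            NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
            cx, cy = c(x), c(y)
            return (c(x + y) - min(cx, cy)) / max(cx, cy)

        s1 = b"the quick brown fox jumps over the lazy dog " * 20
        s2 = b"the quick brown fox jumps over the lazy cat " * 20
        print(ncd(s1, s1), ncd(s1, s2))   # similar inputs give a smaller distance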

  6. The Metric of Colour Space

    DEFF Research Database (Denmark)

    Gravesen, Jens

    2015-01-01

    The space of colours is a fascinating space. It is a real vector space, but no matter what inner product you put on the space the resulting Euclidean distance does not correspond to human perception of difference between colours. In 1942 MacAdam performed the first experiments on colour matching...... and found the MacAdam ellipses which are often interpreted as defining the metric tensor at their centres. An important question is whether it is possible to define colour coordinates such that the Euclidean distance in these coordinates correspond to human perception. Using cubic splines to represent...

  7. Adaptive Optics Metrics & QC Scheme

    Science.gov (United States)

    Girard, Julien H.

    2017-09-01

    "There are many Adaptive Optics (AO) fed instruments on Paranal and more to come. To monitor their performances and assess the quality of the scientific data, we have developed a scheme and a set of tools and metrics adapted to each flavour of AO and each data product. Our decisions to repeat observations or not depends heavily on this immediate quality control "zero" (QC0). Atmospheric parameters monitoring can also help predict performances . At the end of the chain, the user must be able to find the data that correspond to his/her needs. In Particular, we address the special case of SPHERE."

  8. Les tons dans les dictionnaires de langues gabonaises: situation et perspectives

    Directory of Open Access Journals (Sweden)

    Thierry Afane Otsaga

    2011-10-01

    Full Text Available

    Résumé: Most dictionaries of the Gabonese languages published to date have not really taken tonality into account in the compilation of their articles. Yet it is now unanimously recognized that tone plays an important role in the functioning of the Bantu languages in general and of the Gabonese languages in particular. Dictionaries, as tools for learning, standardizing, preserving and correctly using a language, can no longer neglect this aspect in their lexicographic treatment. Despite the large number of tones that exist in the Gabonese languages and the difficulty of marking them all in the presentation of articles, it is nevertheless necessary to find ways of highlighting each tone without making the dictionary too complicated to use. The aim of this presentation is to formulate, for these dictionaries, guidelines that make it possible to take the tonality of the Gabonese languages into account while keeping the dictionary easy to use. Before formulating these guidelines, the presentation first attempts to clarify the problem by analysing and enumerating the different tones identified to date in the Gabonese languages. In addition, we consider certain viewpoints that have sought to identify tones in dictionaries, giving examples of the treatment of tonality in some dictionaries of tonal languages other than the Gabonese languages.

    Mots-clés: DICTIONARIES, GABONESE LANGUAGES, TONES, TONALITY, ARTICLE COMPILATION, PERSPECTIVES, BANTU LANGUAGES, LEXICOGRAPHIC TREATMENT, LEXICOGRAPHIC REFERENCE WORKS, MELODIC CONTOUR, DIACRITICS, ORTHOGRAPHY, PRONUNCIATION

    Abstract: Tones in Dictionaries of the Gabonese Languages: State and Perspectives. Until now most dictionaries that have appeared in the Gabonese languages have not really taken tonality into account in

  9. Metrics for building performance assurance

    Energy Technology Data Exchange (ETDEWEB)

    Koles, G.; Hitchcock, R.; Sherman, M.

    1996-07-01

    This report documents part of the work performed in phase I of a Laboratory Directed Research and Development (LDRD) funded project entitled Building Performance Assurances (BPA). The focus of the BPA effort is to transform the way buildings are built and operated in order to improve building performance by facilitating or providing tools, infrastructure, and information. The efforts described herein focus on the development of metrics with which to evaluate building performance and for which information and optimization tools need to be developed. The classes of building performance metrics reviewed are (1) Building Services, (2) First Costs, (3) Operating Costs, (4) Maintenance Costs, and (5) Energy and Environmental Factors. The first category defines the direct benefits associated with buildings; the next three are different kinds of costs associated with providing those benefits; the last category includes concerns that are broader than direct costs and benefits to the building owner and building occupants. The level of detail of the various issues reflects the current state of knowledge in those scientific areas and the ability of the authors to determine that state of knowledge, rather than directly reflecting the importance of these issues; the report intentionally does not specifically focus on energy issues. The report describes work in progress and is intended as a resource that can be used to indicate the areas needing more investigation. Other reports on BPA activities are also available.

  10. Production of 415-ton ingots for backup rolls in the conditions of PJSC "Energomashspetsstal"

    Directory of Open Access Journals (Sweden)

    Maxim Victorovich Efimov

    2013-10-01

    Full Text Available The paper gives the results of work on melting, pouring, forging and heat treatment of 415-ton 45H3M1F steel ingots as semi-finished products for manufacturing large backup rolls at the request of the «Severstal» Company. For the production of the rolls, a new grade of steel based on 0.45% C and 3% Cr, additionally alloyed with Mo and V, was developed at PJSC «Energomashspetsstal» and applied in industry. Steel of the bainite class provides the best combination of strength and plasticity properties. The metal was prepared by melting 7 heats in arc steel-melting furnaces with capacities of 100 and 50 tons, with subsequent out-of-furnace treatment on ladle-furnace and ladle-degassing units. The steel was poured from four steel-pouring ladles into a vertical mould under vacuum through a tundish ladle, with the stream protected by argon. The forging of the ingots was conducted on the automated 150 MN forging system. The obtained billets were subjected to primary heat treatment, which consisted of heating for recrystallization, isothermal soaking to remove residual stresses, additional dehydrogenization to give the material lower hardness, and controlled cooling to prevent snowflake formation. Heating of the ingots for forging and preliminary heat treatment was carried out in heat treatment furnaces with a carrying capacity of up to 500 tons. For the heat treatment of the backup roll, a horizontal sprayer unit was used. The turning of the backup roll was performed on a lathe with one face-plate, and the final mechanical treatment was conducted on a Hercules NWD 1500×18000 CNC machine tool. The finished product, with a mass of 225 tons, had a barrel diameter of 2,360 mm, a barrel length of 4,800 mm and an overall length of 10,650 mm.

  11. 3CR249.1 and Ton202-luminous QSOs in interacting systems

    International Nuclear Information System (INIS)

    Stockton, A.; MacKenty, J.W.

    1983-01-01

    The morphologies of the extended emission-line regions around the QSOs 3CR249.1 and Ton202 indicate that they have, in both cases, resulted from interactions between two galaxies of roughly equal mass and that the galaxies involved possessed extensive rotating gaseous disks before their encounters. These two examples provide support for the position that luminous QSOs are frequently activated by violent interactions. More generally, the distribution of gas around luminous QSOs promises to be a sensitive and useful indicator of the recent history of their physical environment. (author)

  12. Interactive (statistical) visualisation and exploration of a billion objects with vaex

    Science.gov (United States)

    Breddels, M. A.

    2017-06-01

    With new catalogues arriving, such as the Gaia DR1 containing more than a billion objects, new methods of handling and visualizing these data volumes are needed. We show that by calculating statistics on a regular (N-dimensional) grid, visualizations of a billion objects can be done within a second on a modern desktop computer. This is achieved using memory mapping of hdf5 files together with a simple binning algorithm, which are part of a Python library called vaex. This enables efficient interactive exploration of large datasets, making science exploration of large catalogues feasible. Vaex is a Python library and an application which allows for interactive exploration and visualization. The motivation for developing vaex is the catalogue of the Gaia satellite; however, vaex can also be used on SPH or N-body simulations, any other (future) catalogues such as SDSS, Pan-STARRS, LSST, etc., or other tabular data. The homepage for vaex is http://vaex.astro.rug.nl.
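
    The second-scale interactivity described here rests on reducing the full table to counts on a regular grid before anything is drawn. The sketch below is a generic NumPy illustration of that memory-mapped binning idea; the file name, column and grid shape are invented for the example, and it does not use vaex's actual API.

        import numpy as np

        # Illustration only: bin one memory-mapped column onto a regular 1-D grid,
        # then work with the tiny grid instead of the raw rows.
        n = 10_000_000                                    # stand-in for a billion-row column
        x = np.memmap("x.dat", dtype=np.float32, mode="w+", shape=(n,))
        x[:] = np.random.default_rng(0).normal(size=n)    # fake data for the demo

        nbins, lo, hi = 256, -5.0, 5.0
        counts = np.zeros(nbins, dtype=np.int64)
        for start in range(0, n, 1_000_000):              # stream through the file in chunks
            chunk = x[start:start + 1_000_000]
            idx = np.clip(((chunk - lo) * nbins / (hi - lo)).astype(np.int64), 0, nbins - 1)
            np.add.at(counts, idx, 1)

        print(counts.sum(), counts.argmax())              # all rows binned; densest grid cell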

  13. Quality metrics for detailed clinical models.

    Science.gov (United States)

    Ahn, SunJu; Huff, Stanley M; Kim, Yoon; Kalra, Dipak

    2013-05-01

    To develop quality metrics for detailed clinical models (DCMs) and test their validity. Based on existing quality criteria which did not include formal metrics, we developed quality metrics by applying the ISO/IEC 9126 software quality evaluation model. The face and content validity of the initial quality metrics were assessed by 9 international experts. Content validity was defined as agreement by over 70% of the panelists. For eliciting opinions and achieving consensus among the panelists, a two-round Delphi survey was conducted. Valid quality metrics were considered reliable if agreement between two evaluators' assessments of two example DCMs was over 0.60 in terms of the kappa coefficient. After reliability and validity were tested, the final DCM quality metrics were selected. According to the results of the reliability test, the degree of agreement was high (a kappa coefficient of 0.73). Based on the results of the reliability test, 8 quality evaluation domains and 29 quality metrics were finalized as DCM quality metrics. The quality metrics were validated by a panel of international DCM experts. Therefore, we expect that the metrics, which constitute essential qualitative and quantitative quality requirements for DCMs, can be used to support rational decision-making by DCM developers and clinical users. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
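
    The reliability figure reported above (a kappa coefficient of 0.73) is an inter-rater agreement statistic. A minimal sketch of Cohen's kappa for two raters is given below; the example ratings are invented, and the study may have used a weighted variant.

        from collections import Counter

        def cohens_kappa(rater_a, rater_b):
            """Cohen's kappa: (p_o - p_e) / (1 - p_e), observed vs. chance agreement."""
            n = len(rater_a)
            p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
            ca, cb = Counter(rater_a), Counter(rater_b)
            p_e = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)
            return (p_o - p_e) / (1 - p_e)

        # Invented ratings of six DCM quality items by two evaluators
        eval_1 = ["pass", "pass", "fail", "pass", "fail", "pass"]
        eval_2 = ["pass", "fail", "fail", "pass", "fail", "pass"]
        print(round(cohens_kappa(eval_1, eval_2), 2))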

  14. Accomplishing rural electrification for over a billion people: Approaches towards sustainable solutions

    OpenAIRE

    Rahman, Mizanur Md.

    2014-01-01

    Access to electricity appears to be a prerequisite to materialize social, economic, and human development in the underprivileged rural areas. However, 1.1 billion rural people in the world, almost all of them living in developing countries, still do not have access to electricity. Although the rural electrification process poses more challenges than urban electrification, rural areas are blessed with abundant and relatively evenly distributed renewable energy resources. To facilitate electric...

  15. How to Bring Solar Energy to Seven Billion People (LBNL Science at the Theater)

    Energy Technology Data Exchange (ETDEWEB)

    Wadia, Cyrus

    2009-04-06

    By exploiting the powers of nanotechnology and taking advantage of non-toxic, Earth-abundant materials, Berkeley Lab's Cyrus Wadia has fabricated new solar cell devices that have the potential to be several orders of magnitude less expensive than conventional solar cells. And by mastering the chemistry of these materials, and the economics of solar energy, he envisions bringing electricity to the 1.2 billion people now living without it.

  16. Areva - First quarter 2009 revenue climbs 8.5% to 3.003 billion euros

    International Nuclear Information System (INIS)

    2009-04-01

    First quarter 2009 revenue was up 8.5% compared with the same period last year, to 3.003 billion euros. At constant exchange rates and consolidation scope, growth came to 3.9%. Currency translation had a positive impact of 57 million euros over the quarter. Changes in the consolidation scope had an impact of 66 million euros, primarily due to the consolidation of acquisitions made in 2008 in Transmission and Distribution and in Renewable Energies. The growth engines for first quarter revenue were the Reactors and Services division and the Transmission and Distribution division, with growth of 9.2% and 16.1% respectively. Outside France, revenue rose to 2.032 billion euros, compared with 1.857 billion euros in the first quarter of 2008, and represents 68% of total revenue. Orders were steady in the first quarter, particularly in the Front End, which posted several significant contracts with US and Asian utilities, and in Transmission and Distribution, with orders up sharply in Asia and South America. As of March 31, 2009, the group's backlog reached 49.5 billion euros, for 28.3% growth year-on-year, including 31.3% growth in Nuclear and 10.2% in Transmission and Distribution. For the year as a whole, the group confirms its outlook for backlog and revenue growth as well as rising operating income. It should be noted that revenue may vary significantly from one quarter to the next in nuclear operations. Accordingly, quarterly data cannot be viewed as a reliable indicator of annual trends.

  17. A 17-billion-solar-mass black hole in a group galaxy with a diffuse core.

    Science.gov (United States)

    Thomas, Jens; Ma, Chung-Pei; McConnell, Nicholas J; Greene, Jenny E; Blakeslee, John P; Janish, Ryan

    2016-04-21

    Quasars are associated with and powered by the accretion of material onto massive black holes; the detection of highly luminous quasars with redshifts greater than z = 6 suggests that black holes of up to ten billion solar masses already existed 13 billion years ago. Two possible present-day 'dormant' descendants of this population of 'active' black holes have been found in the galaxies NGC 3842 and NGC 4889 at the centres of the Leo and Coma galaxy clusters, which together form the central region of the Great Wall--the largest local structure of galaxies. The most luminous quasars, however, are not confined to such high-density regions of the early Universe; yet dormant black holes of this high mass have not yet been found outside of modern-day rich clusters. Here we report observations of the stellar velocity distribution in the galaxy NGC 1600--a relatively isolated elliptical galaxy near the centre of a galaxy group at a distance of 64 megaparsecs from Earth. We use orbit superposition models to determine that the black hole at the centre of NGC 1600 has a mass of 17 billion solar masses. The spatial distribution of stars near the centre of NGC 1600 is rather diffuse. We find that the region of depleted stellar density in the cores of massive elliptical galaxies extends over the same radius as the gravitational sphere of influence of the central black holes, and interpret this as the dynamical imprint of the black holes.

  18. Two ten-billion-solar-mass black holes at the centres of giant elliptical galaxies.

    Science.gov (United States)

    McConnell, Nicholas J; Ma, Chung-Pei; Gebhardt, Karl; Wright, Shelley A; Murphy, Jeremy D; Lauer, Tod R; Graham, James R; Richstone, Douglas O

    2011-12-08

    Observational work conducted over the past few decades indicates that all massive galaxies have supermassive black holes at their centres. Although the luminosities and brightness fluctuations of quasars in the early Universe suggest that some were powered by black holes with masses greater than 10 billion solar masses, the remnants of these objects have not been found in the nearby Universe. The giant elliptical galaxy Messier 87 hosts the hitherto most massive known black hole, which has a mass of 6.3 billion solar masses. Here we report that NGC 3842, the brightest galaxy in a cluster at a distance from Earth of 98 megaparsecs, has a central black hole with a mass of 9.7 billion solar masses, and that a black hole of comparable or greater mass is present in NGC 4889, the brightest galaxy in the Coma cluster (at a distance of 103 megaparsecs). These two black holes are significantly more massive than predicted by linearly extrapolating the widely used correlations between black-hole mass and the stellar velocity dispersion or bulge luminosity of the host galaxy. Although these correlations remain useful for predicting black-hole masses in less massive elliptical galaxies, our measurements suggest that different evolutionary processes influence the growth of the largest galaxies and their black holes.

  19. BUILDING A BILLION SPATIO-TEMPORAL OBJECT SEARCH AND VISUALIZATION PLATFORM

    Directory of Open Access Journals (Sweden)

    D. Kakkar

    2017-10-01

    Full Text Available With funding from the Sloan Foundation and Harvard Dataverse, the Harvard Center for Geographic Analysis (CGA has developed a prototype spatio-temporal visualization platform called the Billion Object Platform or BOP. The goal of the project is to lower barriers for scholars who wish to access large, streaming, spatio-temporal datasets. The BOP is now loaded with the latest billion geo-tweets, and is fed a real-time stream of about 1 million tweets per day. The geo-tweets are enriched with sentiment and census/admin boundary codes when they enter the system. The system is open source and is currently hosted on Massachusetts Open Cloud (MOC, an OpenStack environment with all components deployed in Docker orchestrated by Kontena. This paper will provide an overview of the BOP architecture, which is built on an open source stack consisting of Apache Lucene, Solr, Kafka, Zookeeper, Swagger, scikit-learn, OpenLayers, and AngularJS. The paper will further discuss the approach used for harvesting, enriching, streaming, storing, indexing, visualizing and querying a billion streaming geo-tweets.

  20. Building a Billion Spatio-Temporal Object Search and Visualization Platform

    Science.gov (United States)

    Kakkar, D.; Lewis, B.

    2017-10-01

    With funding from the Sloan Foundation and Harvard Dataverse, the Harvard Center for Geographic Analysis (CGA) has developed a prototype spatio-temporal visualization platform called the Billion Object Platform or BOP. The goal of the project is to lower barriers for scholars who wish to access large, streaming, spatio-temporal datasets. The BOP is now loaded with the latest billion geo-tweets, and is fed a real-time stream of about 1 million tweets per day. The geo-tweets are enriched with sentiment and census/admin boundary codes when they enter the system. The system is open source and is currently hosted on Massachusetts Open Cloud (MOC), an OpenStack environment with all components deployed in Docker orchestrated by Kontena. This paper will provide an overview of the BOP architecture, which is built on an open source stack consisting of Apache Lucene, Solr, Kafka, Zookeeper, Swagger, scikit-learn, OpenLayers, and AngularJS. The paper will further discuss the approach used for harvesting, enriching, streaming, storing, indexing, visualizing and querying a billion streaming geo-tweets.

  1. Fast similarity search for learned metrics.

    Science.gov (United States)

    Kulis, Brian; Jain, Prateek; Grauman, Kristen

    2009-12-01

    We introduce a method that enables scalable similarity search for learned metrics. Given pairwise similarity and dissimilarity constraints between some examples, we learn a Mahalanobis distance function that captures the examples' underlying relationships well. To allow sublinear time similarity search under the learned metric, we show how to encode the learned metric parameterization into randomized locality-sensitive hash functions. We further formulate an indirect solution that enables metric learning and hashing for vector spaces whose high dimensionality makes it infeasible to learn an explicit transformation over the feature dimensions. We demonstrate the approach applied to a variety of image data sets, as well as a systems data set. The learned metrics improve accuracy relative to commonly used metric baselines, while our hashing construction enables efficient indexing with learned distances and very large databases.
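
    The learned metric here is a Mahalanobis distance, d_M(x, y) = sqrt((x - y)^T M (x - y)) for a positive semidefinite M. The sketch below only evaluates such a distance for an assumed M; it does not reproduce the paper's metric learning or its locality-sensitive hashing construction.

        import numpy as np

        def mahalanobis(x: np.ndarray, y: np.ndarray, M: np.ndarray) -> float:
            """Mahalanobis distance d_M(x, y) = sqrt((x - y)^T M (x - y)).
            M must be positive semidefinite; M = I gives the Euclidean distance."""
            d = x - y
            return float(np.sqrt(d @ M @ d))

        rng = np.random.default_rng(0)
        A = rng.normal(size=(3, 3))
        M = A @ A.T                      # arbitrary PSD matrix standing in for a learned metric
        x, y = rng.normal(size=3), rng.normal(size=3)
        print(mahalanobis(x, y, M), mahalanobis(x, y, np.eye(3)))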

  2. Techniques et systèmes de renfort des structures en béton

    CERN Document Server

    Miranda-Vizuete, J

    2000-01-01

    Although it is called "artificial stone", concrete is a living material that changes throughout its service life. It changes because the structure it belongs to itself undergoes changes. These changes arise either from modifications or renovations, or from an alteration of the structure's load-bearing capacity due to an increase in loads. In most cases they call for reinforcement. Reinforcing a concrete structure consists of improving the mechanical characteristics of its constituent elements so that it offers better strength both at the serviceability state and at the ultimate resistance state. This document presents the most widely used methods of structural reinforcement, including the incorporation of metal profiles, the enlargement of structural cross-sections and, more recently, reinforcement by the addition of external composite materials.

  3. Clinical comparison of the ProTon and Tono-Pen tonometers with the Goldmann applanation tonometer.

    Science.gov (United States)

    Midelfart, A; Wigers, A

    1994-12-01

    A clinical evaluation of the new electronic ProTon tonometer was performed, comparing the values of intraocular pressure (IOP) measured using this instrument with those determined by a similar instrument, the Tono-Pen XL, and by Goldmann applanation tonometry. The mean IOP measured in 106 eyes with the ProTon tonometer was not significantly different from that determined with Goldmann applanation, while the IOP values measured with the Tono-Pen XL were significantly lower. The 95% limits of agreement between applanation tonometry and ProTon tonometry were between -4 mm Hg and 5 mm Hg, and between applanation tonometry and Tono-Pen XL tonometry between -3 mm Hg and 8 mm Hg. The ProTon tonometer appears to have a higher level of accuracy than the Tono-Pen XL tonometer in clinical practice.

  4. Metric approach to quantum constraints

    International Nuclear Information System (INIS)

    Brody, Dorje C; Hughston, Lane P; Gustavsson, Anna C T

    2009-01-01

    A framework for deriving equations of motion for constrained quantum systems is introduced and a procedure for its implementation is outlined. In special cases, the proposed new method, which takes advantage of the fact that the space of pure states in quantum mechanics has both a symplectic structure and a metric structure, reduces to a quantum analogue of the Dirac theory of constraints in classical mechanics. Explicit examples involving spin-1/2 particles are worked out in detail: in the first example, our approach coincides with a quantum version of the Dirac formalism, while the second example illustrates how a situation that cannot be treated by Dirac's approach can nevertheless be dealt with in the present scheme.

  5. Metrics Are Needed for Collaborative Software Development

    OpenAIRE

    Mojgan Mohtashami; Cyril S. Ku; Thomas J. Marlowe

    2011-01-01

    There is a need for metrics for inter-organizational collaborative software development projects, encompassing management and technical concerns. In particular, metrics are needed that are aimed at the collaborative aspect itself, such as readiness for collaboration, the quality and/or the costs and benefits of collaboration in a specific ongoing project. We suggest questions and directions for such metrics, spanning the full lifespan of a collaborative project, from considering the suitabili...

  6. A Metric Observer for Induction Motors Control

    Directory of Open Access Journals (Sweden)

    Mohamed Benbouzid

    2016-01-01

    Full Text Available This paper deals with the application of a metric observer to induction motors. Firstly, assuming that stator currents and speed are measured, a metric observer is designed to estimate the rotor fluxes. Secondly, assuming that only stator currents are measured, another metric observer is derived to estimate rotor fluxes and speed. The validity of the proposed observer is checked through simulations on a 4 kW induction motor drive.

  7. The definitive guide to IT service metrics

    CERN Document Server

    McWhirter, Kurt

    2012-01-01

    Used just as they are, the metrics in this book will bring many benefits to both the IT department and the business as a whole. Details of the attributes of each metric are given, enabling you to make the right choices for your business. You may prefer and are encouraged to design and create your own metrics to bring even more value to your business - this book will show you how to do this, too.

  8. Chaotic inflation with metric and matter perturbations

    International Nuclear Information System (INIS)

    Feldman, H.A.; Brandenberger, R.H.

    1989-01-01

    A perturbative scheme to analyze the evolution of both metric and scalar field perturbations in an expanding universe is developed. The scheme is applied to study chaotic inflation with initial metric and scalar field perturbations present. It is shown that initial gravitational perturbations with wavelength smaller than the Hubble radius rapidly decay. The metric simultaneously picks up small perturbations determined by the matter inhomogeneities. Both are frozen in once the wavelength exceeds the Hubble radius. (orig.)

  9. Areva excellent business volume: backlog as of december 31, 2008: + 21.1% to 48.2 billion euros. 2008 revenue: + 10.4% to 13.2 billion euros

    International Nuclear Information System (INIS)

    2009-01-01

    AREVA's backlog stood at 48.2 billion euros as of December 31, 2008, for 21.1% growth year-on-year, including 21.8% growth in Nuclear and 16.5% growth in Transmission and Distribution. The Nuclear backlog came to 42.5 billion euros at December 31, 2008. The Transmission and Distribution backlog came to 5.7 billion euros at year-end. The group recognized revenue of 13.2 billion euros in 2008, for year-on-year growth of 10.4% (+9.8% like-for-like). Revenue outside France was up 10.5% to 9.5 billion euros, representing 72% of total revenue. Revenue was up 6.5% in the Nuclear businesses (up 6.3% LFL), with strong performance in the Reactors and Services division (+10.9% LFL) and the Front End division (+7.2% LFL). The Transmission and Distribution division recorded growth of 17% (+15.8% LFL). Revenue for the fourth quarter of 2008 rose to 4.1 billion euros, up 5.2% (+1.6% LFL) from that of the fourth quarter of 2007. Revenue for the Front End division rose to 3.363 billion euros in 2008, up 7.1% over 2007 (+7.2% LFL). Foreign exchange (currency translations) had a negative impact of 53 million euros. Revenue for the Reactors and Services division rose to 3.037 billion euros, up 11.8% over 2007 (+10.9% LFL). Foreign exchange (currency translations) had a negative impact of 47 million euros. Revenue for the Back End division came to 1.692 billion euros, a drop of 2.7% (-2.5% LFL). Foreign exchange (currency translations) had a negative impact of 3.5 million euros. Revenue for the Transmission and Distribution division rose to 5.065 billion euros in 2008, up 17.0% (+15.8% LFL)

  10. Semantic Metrics for Object Oriented Design

    Science.gov (United States)

    Etzkorn, Lethe

    2003-01-01

    The purpose of this proposal is to research a new suite of object-oriented (OO) software metrics, called semantic metrics, that have the potential to help software engineers identify fragile, low-quality code sections much earlier in the development cycle than is possible with traditional OO metrics. With earlier and better fault detection, software maintenance will be less time consuming and expensive, and software reusability will be improved. Because it is less costly to correct faults found earlier than to correct faults found later in the software lifecycle, the overall cost of software development will be reduced. Semantic metrics can be derived from the knowledge base of a program understanding system. A program understanding system is designed to understand a software module. Once understanding is complete, the knowledge base contains digested information about the software module. Various semantic metrics can be collected on the knowledge base. This new kind of metric measures domain complexity, or the relationship of the software to its application domain, rather than implementation complexity, which is what traditional software metrics measure. A semantic metric will thus map much more closely to qualities humans are interested in, such as cohesion and maintainability, than is possible using traditional metrics, which are calculated using only syntactic aspects of software.

  11. About the possibility of a generalized metric

    International Nuclear Information System (INIS)

    Lukacs, B.; Ladik, J.

    1991-10-01

    The metric (the structure of space-time) may depend on the properties of the object measuring it. The case of size dependence of the metric is examined. For this dependence, the simplest possible form of the metric tensor has been constructed which fulfils the following requirements: there be two extremal characteristic scales; the metric be unique and the usual one between them; the change be sudden in the neighbourhood of these scales; and the size of the human body appear as a parameter (postulated on the basis of some philosophical arguments). Estimates have been made for the two extremal length scales according to existing observations. (author) 19 refs

  12. Bounds for phylogenetic network space metrics.

    Science.gov (United States)

    Francis, Andrew; Huber, Katharina T; Moulton, Vincent; Wu, Taoyang

    2018-04-01

    Phylogenetic networks are a generalization of phylogenetic trees that allow for representation of reticulate evolution. Recently, a space of unrooted phylogenetic networks was introduced, where such a network is a connected graph in which every vertex has degree 1 or 3 and whose leaf-set is a fixed set X of taxa. This space, denoted [Formula: see text], is defined in terms of two operations on networks-the nearest neighbor interchange and triangle operations-which can be used to transform any network with leaf set X into any other network with that leaf set. In particular, it gives rise to a metric d on [Formula: see text] which is given by the smallest number of operations required to transform one network in [Formula: see text] into another in [Formula: see text]. The metric generalizes the well-known NNI-metric on phylogenetic trees which has been intensively studied in the literature. In this paper, we derive a bound for the metric d as well as a related metric [Formula: see text] which arises when restricting d to the subset of [Formula: see text] consisting of all networks with [Formula: see text] vertices, [Formula: see text]. We also introduce two new metrics on networks-the SPR and TBR metrics-which generalize the metrics on phylogenetic trees with the same name and give bounds for these new metrics. We expect our results to eventually have applications to the development and understanding of network search algorithms.

  13. Kerr-Schild metrics revisited. Pt. 1

    International Nuclear Information System (INIS)

    Gergely, L.A.; Perjes, Z.

    1993-04-01

    The particular way Kerr-Schild metrics incorporate a congruence of null curves in space-time is a sure source of fascination. The Kerr-Schild pencil of metrics g_{ab} + Δ l_a l_b is investigated in the generic case when it maps an arbitrary vacuum space-time with metric g_{ab} to a vacuum space-time. The theorem is proved that this generic case does not contain the shear-free subclass as a smooth limit. It is shown that one of the Kota-Perjes metrics is a solution in the shearing class. (R.P.) 15 refs

  14. ROUTING BASE CONGESTION CONTROL METRICS IN MANETS

    Directory of Open Access Journals (Sweden)

    Sandeep Dalal

    2014-09-01

    Full Text Available A mobile ad hoc network is self-configuring and adaptive. Due to node mobility we cannot predict the load on the network, which leads to congestion, one of the most widely researched areas in MANETs. Many congestion control techniques and metrics have been proposed to overcome congestion before it occurs or after it has occurred. In this survey we identify the currently used congestion control metrics. We also propose a congestion control metric, RFR (resource free ratio), which considers the three most important parameters to provide congestion-free route discovery. Further, we show the results of node selection based on fuzzy logic calculations using the proposed metric.

  15. Comparing Evaluation Metrics for Sentence Boundary Detection

    National Research Council Canada - National Science Library

    Liu, Yang; Shriberg, Elizabeth

    2007-01-01

    .... This paper compares alternative evaluation metrics including the NIST error rate, classification error rate per word boundary, precision and recall, ROC curves, DET curves, precision-recall curves...

  16. Simulation information regarding Sandia National Laboratories trinity capability improvement metric.

    Energy Technology Data Exchange (ETDEWEB)

    Agelastos, Anthony Michael; Lin, Paul T.

    2013-10-01

    Sandia National Laboratories, Los Alamos National Laboratory, and Lawrence Livermore National Laboratory each selected a representative simulation code to be used as a performance benchmark for the Trinity Capability Improvement Metric. Sandia selected SIERRA Low Mach Module: Nalu, which is a fluid dynamics code that solves many variable-density, acoustically incompressible problems of interest spanning from laminar to turbulent flow regimes, since it is fairly representative of implicit codes that have been developed under ASC. The simulations for this metric were performed on the Cielo Cray XE6 platform during dedicated application time and the chosen case utilized 131,072 Cielo cores to perform a canonical turbulent open jet simulation within an approximately 9-billion-element unstructured-hexahedral computational mesh. This report will document some of the results from these simulations as well as provide instructions to perform these simulations for comparison.

  17. Distance in Metric Trees and Banach Spaces

    Science.gov (United States)

    Alansari, Monairah

    This thesis contains results on metric trees and Banach spaces. There is a common thread: the distance function. In the case of metric trees, special metrics such as the radial and river metrics yield characterization theorems. In the case of Banach spaces, we consider the distance from a point in the Banach space to a subspace, and by putting conditions on subspaces we obtain results for the speed of convergence of the error of best approximation. We first introduce the concept of metric trees, study some of their properties, and provide a new representation of metric trees using a special set of metric rays, which we call "crossing point sets". We capture the four-point condition from these sets and show an equivalence between metric trees with the radial and river metrics and the crossing point sets. As an application of our characterization of metric trees via crossing point sets, we are able to index Brownian motions by a metric tree. The second part of this thesis contains results on the error of best approximation in the context of Banach spaces. The error of best approximation to x from S is denoted by rho(x, S), defined as rho(x, S) = inf_{y in S} d(x, y). Note that the well-known Weierstrass approximation theorem states that every continuous function defined on a closed interval [a,b] can be uniformly approximated by a polynomial function. The Weierstrass approximation theorem, however, gives no information about the speed of convergence of rho(f, Y_n), whereas the Bernstein Lethargy Theorem (BLT) is precisely about the speed of convergence of rho(f, Y_n). We consider a condition on subspaces in order to improve the bounds given in Bernstein's Lethargy Theorem for Banach spaces.

  18. Simple emission metrics for climate impacts

    Directory of Open Access Journals (Sweden)

    B. Aamaas

    2013-06-01

    Full Text Available In the context of climate change, emissions of different species (e.g., carbon dioxide and methane) are not directly comparable since they have different radiative efficiencies and lifetimes. Since comparisons via detailed climate models are computationally expensive and complex, emission metrics were developed to allow a simple and straightforward comparison of the estimated climate impacts of emissions of different species. Emission metrics are not unique, and a variety of different emission metrics has been proposed, with the key choices being the climate impacts and time horizon to use for comparisons. In this paper, we present analytical expressions and describe how to calculate common emission metrics for different species. We include the climate metrics radiative forcing, integrated radiative forcing, temperature change and integrated temperature change, in both absolute form and normalised to a reference gas. We consider pulse emissions, sustained emissions and emission scenarios. The species are separated into three types: CO2, which has a complex decay over time; species with a simple exponential decay; and ozone precursors (NOx, CO, VOC), which indirectly affect climate via various chemical interactions. We also discuss deriving Impulse Response Functions, radiative efficiency, regional dependencies, consistency within and between metrics, and uncertainties. We present various applications to highlight key uses of emission metrics, which show that emissions of CO2 are important regardless of what metric and time horizon is used, but that the importance of short-lived climate forcers varies greatly depending on the metric choices made. Further, the ranking of countries by emissions changes very little with different metrics despite large differences in metric values, except for the shortest time horizons (GWP20).
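
    For reference, the textbook form of the most widely used of these metrics, the global warming potential for a pulse emission, is sketched below for a species with radiative efficiency A_x and a single exponential lifetime τ_x; these are the standard definitions and may differ in detail from the expressions derived in the paper.

        \mathrm{AGWP}_x(H) = \int_0^{H} \mathrm{RF}_x(t)\,\mathrm{d}t
                           = A_x \tau_x \left(1 - e^{-H/\tau_x}\right),
        \qquad
        \mathrm{GWP}_x(H) = \frac{\mathrm{AGWP}_x(H)}{\mathrm{AGWP}_{\mathrm{CO_2}}(H)}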

  19. On the relation of the generalized Schwarzschild metric and Tolman metric

    International Nuclear Information System (INIS)

    Sharshekeev, O.Sh.

    1977-01-01

    The relation of the generalized Schwarzschild metric (the Schwarzschild metric taking into account the four-dimensional curvature tensor) to the Tolman metric is considered. It is shown that the solution of the Schwarzschild problem in the Tolman metric is also quite correct. The obtained solutions meet the following requirements: the correspondence principle is satisfied, and the functional determinant of the transformation is finite everywhere, excluding the centre, where a singular point is to be

  20. Building a 70 billion word corpus of English from ClueWeb

    OpenAIRE

    Pomikálek Jan; Rychlý Pavel; Jakubíček Miloš

    2012-01-01

    This work describes the process of creation of a 70 billion word text corpus of English. We used an existing language resource, namely the ClueWeb09 dataset, as source for the corpus data. Processing such a vast amount of data presented several challenges, mainly associated with pre-processing (boilerplate cleaning, text de-duplication) and post-processing (indexing for efficient corpus querying using the CQL – Corpus Query Language) steps. In this paper we explain how we tackled them: we des...

  1. A field like today's? The strength of the geomagnetic field 1.1 billion years ago

    Science.gov (United States)

    Sprain, Courtney J.; Swanson-Hysell, Nicholas L.; Fairchild, Luke M.; Gaastra, Kevin

    2018-02-01

    Paleomagnetic data from ancient rocks are one of the few types of observational data that can be brought to bear on the long-term evolution of Earth's core. A recent compilation of paleointensity estimates from throughout Earth history has been interpreted to indicate that Earth's magnetic field strength increased in the Mesoproterozoic (between 1.5 and 1.0 billion years ago), with this increase taken to mark the onset of inner core nucleation. However, much of the data within the Precambrian paleointensity database are from Thellier-style experiments with non-ideal behavior that manifests in results such as double-slope Arai plots. Choices made when interpreting these data may significantly change conclusions about long-term trends in the intensity of Earth's geomagnetic field. In this study, we present new paleointensity results from volcanics of the ˜1.1 billion-year-old North American Midcontinent Rift. While most of the results exhibit non-ideal double-slope or sagging behavior in Arai plots, some flows have more ideal single-slope behavior, leading to paleointensity estimates that may be some of the best constraints on the strength of Earth's field for this time. Taken together, new and previously published paleointensity data from the Midcontinent Rift yield a median field strength estimate of 56.0 ZAm2—very similar to the median for the past 300 million years. These field strength estimates are distinctly higher than those for the preceding billion years after excluding ca. 1.3 Ga data that may be biased by non-ideal behavior—consistent with an increase in field strength in the late Mesoproterozoic. However, given that ˜90 per cent of paleointensity estimates from 1.1 to 0.5 Ga come from the Midcontinent Rift, it is difficult to evaluate whether these high values relative to those estimated for the preceding billion years are the result of a stepwise, sustained increase in dipole moment. Regardless, paleointensity estimates from the Midcontinent

  2. Development and Manufacturing Technology of Prototype Monoblock Low Pressure Rotor Shaft by 650ton Large Ingot

    Energy Technology Data Exchange (ETDEWEB)

    Song, Duk-Yong; Kim, Dong-Soo; Kim, Jungyeup; Lee, Jongwook; Ko, Seokhee [Doosan Heavy Industries and Construction, Changwon(Korea, Republic of)

    2016-10-15

    In order to establish the manufacturing technology for a monoblock LP rotor shaft, DHI has produced a prototype monoblock LP rotor shaft with a maximum diameter of φ2,800 mm using a 650-ton ingot and investigated the mechanical properties and the internal quality of the ingot. As a result, the quality and mechanical properties required for the large rotor shaft for nuclear power plants met the target. These results indicate that DHI can contribute to meeting the increasing demand for high-efficiency, high-capacity nuclear power plant components. Additionally, some tests such as high cycle fatigue (HCF), low cycle fatigue (LCF), fracture toughness (K1C/J1C) and dynamic crack propagation velocity (da/dN) are in progress.

  3. Development of NUPAC 140B 100 ton rail/barge cask

    International Nuclear Information System (INIS)

    1990-04-01

    The NuPac 140-B 100 Ton Rail/Barge Shipping Cask Preliminary Design Report (PDR) presents a general introduction to, and description of, the NuPac 140-B Cask and its fuel payload. The NuPac 140-B Cask, Model: NuPac 140-B, is being designed by Nuclear Packaging, Inc., to meet or exceed all NRC and Department of Transportation regulations governing the shipment of radioactive material. Specifically the Cask is being developed as a safe means of transporting spent light-water-reactor (LWR) fuels from existing and proposed reactor facilities to a repository and/or a monitored retrievable storage (MRS) facility. The primary transportation mode is by railroad, although the shipping package is designed to be transported by barge and by truck shipment on a special overweight basis for short distances. This feature allows the servicing of reactor sites and other facilities which lack direct railroad access

  4. Cracked lifting lug welds on ten-ton UF₆ cylinders

    Energy Technology Data Exchange (ETDEWEB)

    Dorning, R.E. [Martin Marietta Energy Systems, Inc., Piketon, OH (United States)

    1991-12-31

    Ten-ton, Type 48X, UF₆ cylinders are used at the Portsmouth Gaseous Diffusion Plant to withdraw enriched uranium hexafluoride from the cascade, transfer enriched uranium hexafluoride to customer cylinders, and feed enriched product to the cascade. To accomplish these activities, the cylinders are lifted by cranes and straddle carriers which engage the cylinder lifting lugs. In August of 1988, weld cracks on two lifting lugs were discovered during preparation to lift a cylinder. The cylinder was rejected and tagged out, and an investigating committee formed to determine the cause of cracking and recommend remedial actions. Further investigation revealed the problem may be general to this class of cylinder in this use cycle. This paper discusses the actions taken at the Portsmouth site to deal with the cracked lifting lug weld problem. The actions include inspection activities, interim corrective actions, metallurgical evaluation of cracked welds, weld repairs, and current monitoring/inspection program.

  5. Light Readout for a 1 ton Liquid Argon Dark Matter Detector

    CERN Document Server

    Boccone, Vittorio; Baudis, Laura; Otyugova, Polina; Regenfus, Christian

    2010-01-01

    Evidence for dark matter (DM) has been reported using astronomical observations in systems such as the Bullet cluster. Weakly interacting massive particles (WIMPs), in particular the lightest neutralino, are the most popular DM candidates within the Minimal Supersymmetric Standard Model (MSSM). Many groups in the world are focussing their attention on the direct detection of DM in the laboratory. The detectors should have large target masses and excellent noise rejection capabilities because of the small cross section between DM and ordinary matter (σ_WIMP-nucleon < 4 · 10⁻⁸ pb). Noble liquids are today considered to be one of the best options for large-size DM experiments, as they have a relatively low ionization energy, good scintillation properties and long electron lifetime. Moreover noble liquid detectors are easily scalable to large masses. This thesis deals with the development of a large (1 ton) LAr WIMP detector (ArDM) which could measure simultaneously light and charge from the scintilla...

  6. Modified Cooling System for Low Temperature Experiments in a 3000 Ton Multi-Anvil Press

    Science.gov (United States)

    Secco, R.; Yong, W.

    2017-12-01

    A new modified cooling system for a 3000-ton multi-anvil press has been developed to reach temperatures below room temperature at high pressures. The new system is much simpler in design, easier to make and use, and has the same cooling capability as the previous design (Secco and Yong, RSI, 2016). The key component of the new system is a steel ring surrounding the module wedges that contains liquid nitrogen (LN2), which flows freely through an entrance port to flood the interior of the pressure module. Upper and lower O-rings on the ring seal in the liquid while permitting modest compression, and a thermally insulating layer of foam is attached to the outside of the ring. The same temperature of 220 K reached with the two different cooling systems suggests that thermal equilibrium is reached between the removal of heat by LN2 and the influx of heat through the massive steel components of this press.

  7. Analysis of the Effects of Sea Disposal on a One-Ton Container

    Science.gov (United States)

    Jackson, Wde C.; Jackson, Karen E.; Fasanella, Edwin L.; Kelley, John

    2007-01-01

    Excess and obsolete stocks of chemical warfare material (CWM) were sea disposed by the United States between 1919 and 1970. One-ton containers were used for bulk storage of CWM and were the largest containers sea disposed. Disposal depths ranged from 300 to 17,000 feet. Based on a Type D container assembly drawing, three independent analyses (one corrosion and two structural) were performed on the containers to address the corrosion resistance from prolonged exposure to sea water and the structural response during the descent. Corrosion predictions were made using information about corrosion rates and the disposal environment. The structural analyses employed two different finite element codes and were used to predict the buckling and material response of the container during sea disposal. The results of these investigations are summarized below. Detailed reports on each study are contained in the appendices.

  8. Synthesis and characterization of Al-TON zeolite using a dialkylimizadolium as structure-directing agent

    Energy Technology Data Exchange (ETDEWEB)

    Lopes, Christian Wittee; Pergher, Sibele Berenice Castella, E-mail: chriswittee@gmail.com [Universidade Federal do Rio Grande do Norte (UFRN), Natal, RN (Brazil); Villarroel-Rocha, Jhonny [Laboratorio de Solidos Porosos, Instituto de Fisica Aplicada, Universidad Nacional de San Luis, Chacabuco, San Luis (Argentina); Silva, Bernardo Araldi Da; Mignoni, Marcelo Luis [Universidade Regional Integrada, Erechim, RS (Brazil)

    2016-11-15

    In this work, the synthesis of zeolites using 1-butyl-3-methylimidazolium chloride [C{sub 4}MI]Cl as a structure-directing agent was investigated. The organic cation shows effectiveness and selectivity for the synthesis of TON zeolites under different reaction conditions compared to the traditional structure-directing agent, 1,8-diaminooctane. The 1-butyl-3-methylimidazolium cation leads to highly crystalline materials, and its role as OSDA under our synthesis conditions has been confirmed by characterization techniques. ICP-OES confirms the presence of Al in the samples, and {sup 27}Al MAS NMR analysis indicated that aluminum atoms were incorporated in tetrahedral coordination. Scanning electron microscopy indicated that by changing the crystallization condition (static or stirring), zeolites with different crystal sizes were obtained, which consequently affects the textural properties of the zeolites. Moreover, by varying some synthesis parameters, MFI zeolite can also be obtained. (author)

  9. Consumption data for 100,000 tons/annum auto gasoline

    Energy Technology Data Exchange (ETDEWEB)

    Hochstetter, H.

    1944-06-28

    This report is a table accompanied by a prefatory note of explanation. The table gives consumption figures for the production of 100,000 tons of auto gasoline per year from brown coal, bituminous coal, petroleum, and brown coal tar, based on data from German plants. Under material consumption, the table lists dry raw material, power, steam, water, fuel, hygas credit, hydrogen, coke, and excess watergas credit. Under personnel requirements, the table lists operators and millwrights. Under iron requirements, the table lists iron for equipment, for building, and for repairs. The note explains that the personnel count does not include support personnel, a tabulation of which would add forty percent to the total count. 1 table.

  10. Optimal Transportation and Curvature of Metric Spaces

    OpenAIRE

    Eskin, Thomas

    2013-01-01

    In this thesis we study the notion of non-negative Ricci curvature for compact metric measure spaces introduced by Lott and Villani in their article (2009): Ricci curvature for metric measure spaces via optimal transport. We also define and prove the required prerequisites concerning length spaces, convex analysis, measure theory, and optimal transportation.

  11. Enhancing Authentication Models Characteristic Metrics via ...

    African Journals Online (AJOL)

    In this work, we derive the universal characteristic metrics set for authentication models based on security, usability and design issues. We then compute the probability of the occurrence of each characteristic metrics in some single factor and multifactor authentication models in order to determine the effectiveness of these ...

  12. Quantitative metric theory of continued fractions

    Indian Academy of Sciences (India)

    ... (log log n)^{1/2+ε} almost everywhere with respect to the Lebesgue measure. Keywords: continued fractions; ergodic averages; metric theory of numbers. Mathematics Subject Classification: Primary 11K50; Secondary 28D99. In this paper, we use a quantitative L2-ergodic theorem to study the metrical ...

  13. Metric solution of a spinning mass

    International Nuclear Information System (INIS)

    Sato, H.

    1982-01-01

    Studies on a particular class of asymptotically flat and stationary metric solutions called the Kerr-Tomimatsu-Sato class are reviewed about its derivation and properties. For a further study, an almost complete list of the papers worked on the Tomimatsu-Sato metrics is given. (Auth.)

  14. Finite Metric Spaces of Strictly Negative Type

    DEFF Research Database (Denmark)

    Hjorth, Poul; Lisonek, P.; Markvorsen, Steen

    1998-01-01

    We prove that, if a finite metric space is of strictly negative type, then its transfinite diameter is uniquely realized by the infinite extender (load vector). Finite metric spaces that have this property include all spaces on two, three, or four points, all trees, and all finite subspaces of Eu...

  15. Gravitational Metric Tensor Exterior to Rotating Homogeneous ...

    African Journals Online (AJOL)

    The covariant and contravariant metric tensors exterior to a homogeneous spherical body rotating uniformly about a common φ axis with constant angular velocity ω are constructed. The constructed metric tensors in this gravitational field have seven non-zero distinct components. The Lagrangian for this gravitational field is ...

  16. Invariant metric for nonlinear symplectic maps

    Indian Academy of Sciences (India)

    In this paper, we construct an invariant metric in the space of homogeneous polynomials of a given degree (≥ 3). The homogeneous polynomials specify a nonlinear symplectic map which in turn represents a Hamiltonian system. By minimizing the norm constructed out of this metric as a function of system parameters, we ...

  17. On Information Metrics for Spatial Coding.

    Science.gov (United States)

    Souza, Bryan C; Pavão, Rodrigo; Belchior, Hindiael; Tort, Adriano B L

    2018-04-01

    The hippocampal formation is involved in navigation, and its neuronal activity exhibits a variety of spatial correlates (e.g., place cells, grid cells). The quantification of the information encoded by spikes has been standard procedure to identify which cells have spatial correlates. For place cells, most of the established metrics derive from Shannon's mutual information (Shannon, 1948), and convey information rate in bits/s or bits/spike (Skaggs et al., 1993, 1996). Despite their widespread use, the performance of these metrics in relation to the original mutual information metric has never been investigated. In this work, using simulated and real data, we find that the current information metrics correlate less with the accuracy of spatial decoding than the original mutual information metric. We also find that the top informative cells may differ among metrics, and show a surrogate-based normalization that yields comparable spatial information estimates. Since different information metrics may identify different neuronal populations, we discuss current and alternative definitions of spatially informative cells, which affect the metric choice. Copyright © 2018 IBRO. Published by Elsevier Ltd. All rights reserved.
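
    For context, the bits/spike quantity attributed above to Skaggs et al. is conventionally written as I = Σ_i p_i (λ_i/λ) log2(λ_i/λ), where p_i is the occupancy probability of spatial bin i, λ_i the mean firing rate in bin i, and λ the occupancy-weighted mean rate. A minimal Python sketch of that standard formula (an illustration, not code from the paper):

        import numpy as np

        def skaggs_information_per_spike(occupancy_prob, rate_map):
            """Spatial information in bits/spike, following the standard
            Skaggs-style formula: sum_i p_i * (lam_i/lam) * log2(lam_i/lam)."""
            p = np.asarray(occupancy_prob, dtype=float)
            lam_i = np.asarray(rate_map, dtype=float)
            lam = float(np.sum(p * lam_i))          # occupancy-weighted mean rate
            if lam <= 0.0:
                return 0.0                          # silent cell: no information
            ratio = lam_i / lam
            mask = ratio > 0                        # log2(0) terms contribute 0
            return float(np.sum(p[mask] * ratio[mask] * np.log2(ratio[mask])))

        # A cell firing only in one of four equally visited bins carries 2 bits/spike.
        print(skaggs_information_per_spike([0.25] * 4, [8.0, 0.0, 0.0, 0.0]))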

  18. Fixed point theory in metric type spaces

    CERN Document Server

    Agarwal, Ravi P; O’Regan, Donal; Roldán-López-de-Hierro, Antonio Francisco

    2015-01-01

    Written by a team of leading experts in the field, this volume presents a self-contained account of the theory, techniques and results in metric type spaces (in particular in G-metric spaces); that is, the text approaches this important area of fixed point analysis beginning from the basic ideas of metric space topology. The text is structured so that it leads the reader from preliminaries and historical notes on metric spaces (in particular G-metric spaces) and on mappings, to Banach type contraction theorems in metric type spaces, fixed point theory in partially ordered G-metric spaces, fixed point theory for expansive mappings in metric type spaces, generalizations, present results and techniques in a very general abstract setting and framework. Fixed point theory is one of the major research areas in nonlinear analysis. This is partly due to the fact that in many real world problems fixed point theory is the basic mathematical tool used to establish the existence of solutions to problems which arise natur...

  19. Software Power Metric Model: An Implementation | Akwukwuma ...

    African Journals Online (AJOL)

    ... and the execution time (TIME) in each case was recorded. We then obtain the application functions point count. Our result shows that the proposed metric is computable, consistent in its use of unit, and is programming language independent. Keywords: Software attributes, Software power, measurement, Software metric, ...

  20. Validation of Metrics for Collaborative Systems

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2008-01-01

    Full Text Available This paper describes the new concepts of collaborative systems metrics validation. The paper defines the quality characteristics of collaborative systems. A metric is proposed to estimate the quality level of collaborative systems, and measurements of collaborative systems quality are performed using specially designed software.

  1. Fuzzy Set Field and Fuzzy Metric

    OpenAIRE

    Gebray, Gebru; Reddy, B. Krishna

    2014-01-01

    The notion of a fuzzy set field is introduced. A fuzzy metric is redefined on a fuzzy set field and on an arbitrary fuzzy set in a field. The redefined metric is between fuzzy points and captures both the fuzziness and the crisp properties of vectors. In addition, a fuzzy magnitude of a fuzzy point in a field is defined.

  2. BASIC DESIGN KAPAL PENGANGKUT BATUBARA 200 TON SEBAGAI JALUR ALTERNATIF RUTE SUNGAI LEMATANG

    Directory of Open Access Journals (Sweden)

    Budianto Budianto

    2016-10-01

    Full Text Available As an alternative to a dedicated haul road, river transport on the Lematang River with a purpose-designed mining carrier for coal distribution is one solution for keeping the provincial roads in good condition and avoiding the congestion caused by convoys of coal dump trucks. The geography, the shifting of the river course, and the silting of the Lematang River make it difficult to operate coal carriers on this river transport system. Using a tug-and-barge type coal carrier leads to difficulties in maneuvering; vessels frequently run aground because of the silting, and the barge section can snag because the turning angle becomes too wide. In addition, attention must be paid to the curved and winding course of the Lematang River, to social issues, and to other obstacles such as sling cables, bridges, river depth, driftwood, etc. One technology that can be used is the SPB (Self-Propelled Barge). An SPB coal carrier has the advantage of good maneuverability when navigating river sections. Its cargo deck is located behind the accommodation, which improves sight lines and eases maneuvering, although the cargo capacity is limited by the available draft. The design must take into account geographic factors, river depth, social factors, and the economics of the vessel, so as to yield an effective and efficient shallow-draft SPB coal carrier. The SPB designed here, with a capacity of 200 tons (equivalent to 20 dump trucks), is intended to carry coal at a speed of 12 knots (with 2 x 250 HP engines) and consumes 1.77 tons of fuel at an SFOC of 160 grams
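
    As a rough consistency check of the machinery figures quoted above (a sketch only; it assumes 2 x 250 HP ≈ 373 kW of total installed power, that the SFOC of 160 grams is per kWh, and continuous running at full power, none of which is stated explicitly in the abstract):

        # Rough fuel-burn check for the 200-ton SPB under the assumptions above.
        power_kw = 2 * 250 * 0.7457              # 2 x 250 HP expressed in kW (~373 kW)
        sfoc_g_per_kwh = 160.0                   # assumed unit: grams per kWh
        burn_kg_per_h = power_kw * sfoc_g_per_kwh / 1000.0
        hours_per_1_77_t = 1.77 * 1000.0 / burn_kg_per_h
        print(f"{burn_kg_per_h:.1f} kg/h of fuel, so 1.77 t lasts about {hours_per_1_77_t:.0f} h of running")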

  3. Community Extreme Tonnage User Service (CETUS): A 5000 Ton Open Research Facility in the United States

    Science.gov (United States)

    Danielson, L. R.; Righter, K.; Vander Kaaden, K. E.; Rowland, R. L., II; Draper, D. S.; McCubbin, F. M.

    2017-12-01

    Large sample volume 5000 ton multi-anvil presses have contributed to the exploration of deep Earth and planetary interiors, the synthesis of ultra-hard and other novel materials, and serve as a large-sample complement to the pressure and temperature regimes already attainable by diamond anvil cell experiments. However, no such facility exists in the Western Hemisphere. We are establishing an open user facility for the entire research community, with the unique capability of a 5000 ton multi-anvil and deformation press, HERA (High pressure Experimental Research Apparatus), supported by a host of extant co-located experimental and analytical laboratories and research staff. We offer a wide range of complementary and/or preparatory experimental options. Any required synthesis of materials or follow-up experiments can be carried out in controlled-atmosphere furnaces, piston cylinders, the multi-anvil, or the experimental impact apparatus. Additionally, our division houses two machine shops that would facilitate any modification or custom work necessary for development of CETUS, one for general fabrication and one located specifically within our experimental facilities. We also have a general sample preparation laboratory, specifically for experimental samples, that allows users to quickly and easily prepare samples for e-beam analyses and more. Our focus as contract staff is on serving the scientific needs of our users and collaborators. We are seeking community expert input on multiple aspects of this facility, such as experimental assembly design, module modifications, immediate projects, and future innovation initiatives. We've built a cooperative network of 12 (and growing) collaborating institutions, including COMPRES. CETUS is a coordinated effort leveraging HERA with our extant experimental, analytical, and planetary process modelling instrumentation and expertise in order to create a comprehensive model of the origin and evolution of our solar system and beyond. We are looking to engage

  4. Reconstruction and Analysis for the DUNE 35-ton Liquid Argon Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Wallbank, Michael James [Sheffield U.

    2018-01-01

    Neutrino physics is approaching the precision era, with current and future experiments aiming to perform highly accurate measurements of the parameters which govern the phenomenon of neutrino oscillations. The ultimate ambition with these results is to search for evidence of CP-violation in the lepton sector, currently hinted at in the world-leading analyses from present experiments, which may explain the dominance of matter over antimatter in the Universe. The Deep Underground Neutrino Experiment (DUNE) is a future long-baseline experiment based at Fermi National Accelerator Laboratory (FNAL), with a far detector at the Sanford Underground Research Facility (SURF) and a baseline of 1300 km. In order to make the required precision measurements, the far detector will consist of 40 kton liquid argon and an embedded time projection chamber. This promising technology is still in development and, since each detector module is around a factor 15 larger than any previous experiment employing this design, prototyping the detector and design choices is critical to the success of the experiment. The 35-ton experiment was constructed for this purpose and will be described in detail in this thesis. The outcomes of the 35-ton prototype are already influencing DUNE and, following the successes and lessons learned from the experiment, confidence can be taken forward to the next stage of the DUNE programme. The main oscillation signal at DUNE will be electron neutrino appearance from the muon neutrino beam. High-precision studies of these νe interactions requires advanced processing and event reconstruction techniques, particularly in the handling of showering particles such as electrons and photons. Novel methods developed for the purposes of shower reconstruction in liquid argon are presented with an aim to successfully develop a selection to use in a νe charged-current analysis, and a first-generation selection using the new techniques is presented.

  5. Methods and results for stress analyses on 14-ton, thin-wall depleted UF6 cylinders

    International Nuclear Information System (INIS)

    Kirkpatrick, J.R.; Chung, C.K.; Frazier, J.L.; Kelley, D.K.

    1996-10-01

    Uranium enrichment operations at the three US gaseous diffusion plants produce depleted uranium hexafluoride (DUF6) as a residual product. At the present time, the inventory of DUF6 in this country is more than half a million tons. The inventory of DUF6 is contained in metal storage cylinders, most of which are located at the gaseous diffusion plants. The principal objective of the project is to ensure the integrity of the cylinders and so prevent an environmental hazard from release of the cylinders' contents into the atmosphere. Another objective is to maintain the cylinders in such a manner that the DUF6 may eventually be converted to a less hazardous material for final disposition. An important task in the DUF6 cylinder management project is determining how much corrosion of the walls can be tolerated before the cylinders are in danger of being damaged during routine handling and shipping operations. Another task is determining how to handle cylinders that have already been damaged in a manner that will minimize the chance that a breach will occur or that the size of an existing breach will be significantly increased. A number of finite element stress analysis (FESA) calculations have been done to analyze the stresses for three conditions: (1) while the cylinder is being lifted, (2) when a cylinder is resting on two cylinders under it in the customary two-tier stacking array, and (3) when a cylinder is resting on its chocks on the ground. Various documents describe some of the results and discuss some of the methods whereby they have been obtained. The objective of the present report is to document as many as possible of the FESA cases done at Oak Ridge for 14-ton thin-wall cylinders, giving results and a description of the calculations in some detail

  6. The metrics of science and technology

    CERN Document Server

    Geisler, Eliezer

    2000-01-01

    Dr. Geisler's far-reaching, unique book provides an encyclopedic compilation of the key metrics to measure and evaluate the impact of science and technology on academia, industry, and government. Focusing on such items as economic measures, patents, peer review, and other criteria, and supported by an extensive review of the literature, Dr. Geisler gives a thorough analysis of the strengths and weaknesses inherent in metric design, and in the use of the specific metrics he cites. His book has already received prepublication attention, and will prove especially valuable for academics in technology management, engineering, and science policy; industrial R&D executives and policymakers; government science and technology policymakers; and scientists and managers in government research and technology institutions. Geisler maintains that the application of metrics to evaluate science and technology at all levels illustrates the variety of tools we currently possess. Each metric has its own unique strengths and...

  7. Some Properties of Metric Polytope Constraints

    Directory of Open Access Journals (Sweden)

    V. A. Bondarenko

    2014-01-01

    Full Text Available The integrality recognition problem is considered on the sequence M_{n,k} of nested Boolean quadric polytope relaxations, including the rooted semimetric polytope M_n and the metric polytope M_{n,3}. Constraints of the metric polytope cut off all faces of the rooted semimetric polytope that contain only fractional vertices, which allows the integrality recognition problem on M_n to be solved in polynomial time. To solve the integrality recognition problem on the metric polytope, we consider the possibility of cutting off all fractional faces of M_{n,3} by some relaxation M_{n,k}. We represent the coordinates of the metric polytope in homogeneous form by a three-dimensional block matrix. We show that, to answer the question of cutting off the fractional faces of the metric polytope, it is sufficient to consider only constraints of the triangle inequality form.
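
    The triangle-inequality constraints referred to above are, in the standard description of the metric polytope, the inequalities x_ij <= x_ik + x_kj together with the perimeter inequalities x_ij + x_ik + x_kj <= 2 for every triple {i, j, k}. A minimal membership check along those lines (an illustration of that standard constraint system, not the authors' algorithm):

        from itertools import combinations
        import numpy as np

        def in_metric_polytope(x, tol=1e-9):
            """Check whether a symmetric matrix x satisfies the standard
            metric-polytope constraints: for every triple {i, j, k},
            each triangle inequality x_ij <= x_ik + x_kj and the
            perimeter inequality x_ij + x_ik + x_kj <= 2."""
            n = x.shape[0]
            for i, j, k in combinations(range(n), 3):
                a, b, c = x[i, j], x[i, k], x[j, k]
                if a > b + c + tol or b > a + c + tol or c > a + b + tol:
                    return False
                if a + b + c > 2.0 + tol:
                    return False
            return True

        # Pairwise values for three points spread along a unit segment lie in the polytope.
        x = np.array([[0.0, 0.5, 1.0],
                      [0.5, 0.0, 0.5],
                      [1.0, 0.5, 0.0]])
        print(in_metric_polytope(x))   # True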

  8. Smart Grid Status and Metrics Report Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Balducci, Patrick J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Antonopoulos, Chrissi A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clements, Samuel L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gorrissen, Willy J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kirkham, Harold [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ruiz, Kathleen A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, David L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Weimar, Mark R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gardner, Chris [APQC, Houston, TX (United States); Varney, Jeff [APQC, Houston, TX (United States)

    2014-07-01

    A smart grid uses digital power control and communication technology to improve the reliability, security, flexibility, and efficiency of the electric system, from large generation through the delivery systems to electricity consumers and a growing number of distributed generation and storage resources. To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. The Smart Grid Status and Metrics Report defines and examines 21 metrics that collectively provide insight into the grid’s capacity to embody these characteristics. This appendix presents papers covering each of the 21 metrics identified in Section 2.1 of the Smart Grid Status and Metrics Report. These metric papers were prepared in advance of the main body of the report and collectively form its informational backbone.

  9. Metrics for border management systems.

    Energy Technology Data Exchange (ETDEWEB)

    Duggan, Ruth Ann

    2009-07-01

    There are as many unique and disparate manifestations of border systems as there are borders to protect. Border Security is a highly complex system analysis problem with global, regional, national, sector, and border element dimensions for land, water, and air domains. The complexity increases with the multiple, and sometimes conflicting, missions for regulating the flow of people and goods across borders, while securing them for national security. These systems include frontier border surveillance, immigration management and customs functions that must operate in a variety of weather, terrain, operational conditions, cultural constraints, and geopolitical contexts. As part of a Laboratory Directed Research and Development Project 08-684 (Year 1), the team developed a reference framework to decompose this complex system into international/regional, national, and border elements levels covering customs, immigration, and border policing functions. This generalized architecture is relevant to both domestic and international borders. As part of year two of this project (09-1204), the team determined relevant relative measures to better understand border management performance. This paper describes those relative metrics and how they can be used to improve border management systems.

  10. Similarity metrics for surgical process models.

    Science.gov (United States)

    Neumuth, Thomas; Loebe, Frank; Jannin, Pierre

    2012-01-01

    The objective of this work is to introduce a set of similarity metrics for comparing surgical process models (SPMs). SPMs are progression models of surgical interventions that support quantitative analyses of surgical activities, supporting systems engineering or process optimization. Five different similarity metrics are presented and proven. These metrics deal with several dimensions of process compliance in surgery, including granularity, content, time, order, and frequency of surgical activities. The metrics were experimentally validated using 20 clinical data sets each for cataract interventions, craniotomy interventions, and supratentorial tumor resections. The clinical data sets were controllably modified in simulations, which were iterated ten times, resulting in a total of 600 simulated data sets. The simulated data sets were subsequently compared to the original data sets to empirically assess the predictive validity of the metrics. We show that the results of the metrics for the surgical process models correlate significantly with the controlled modifications, demonstrating that the metrics meet predictive validity. The clinical use of the metrics was exemplified by assessing the learning curves of observers during surgical process model acquisition. Measuring similarity between surgical processes is a complex task. However, metrics for computing the similarity between surgical process models are needed in many uses in the field of medical engineering. These metrics are essential whenever two SPMs need to be compared, such as during the evaluation of technical systems, the education of observers, or the determination of surgical strategies. These metrics are key figures that provide a solid base for medical decisions, such as during validation of sensor systems for use in operating rooms in the future. Copyright © 2011 Elsevier B.V. All rights reserved.
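
    The five metrics themselves are not reproduced in this record. Purely as an illustration of one of the dimensions mentioned (the order of activities), the following sketch scores two activity sequences with a normalized edit distance; it is not one of the metrics defined by the authors:

        def order_similarity(seq_a, seq_b):
            """Illustrative order similarity between two activity sequences:
            1 minus the normalized Levenshtein (edit) distance."""
            m, n = len(seq_a), len(seq_b)
            d = [[0] * (n + 1) for _ in range(m + 1)]
            for i in range(m + 1):
                d[i][0] = i
            for j in range(n + 1):
                d[0][j] = j
            for i in range(1, m + 1):
                for j in range(1, n + 1):
                    cost = 0 if seq_a[i - 1] == seq_b[j - 1] else 1
                    d[i][j] = min(d[i - 1][j] + 1,          # deletion
                                  d[i][j - 1] + 1,          # insertion
                                  d[i - 1][j - 1] + cost)   # substitution
            return 1.0 - d[m][n] / max(m, n, 1)

        print(order_similarity(["incision", "dissection", "suture"],
                               ["incision", "coagulation", "dissection", "suture"]))  # 0.75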

  11. Parametrization and Classification of 20 Billion LSST Objects: Lessons from SDSS

    Energy Technology Data Exchange (ETDEWEB)

    Ivezic, Z.; /Washington U., Seattle, Astron. Dept.; Axelrod, T.; /Large Binocular Telescope, Tucson; Becker, A.C.; /Washington U., Seattle, Astron. Dept.; Becla, J.; /SLAC; Borne, K.; /George Mason U.; Burke, David L.; /SLAC; Claver, C.F.; /NOAO, Tucson; Cook, K.H.; /LLNL, Livermore; Connolly, A.; /Washington U., Seattle, Astron. Dept.; Gilmore, D.K.; /SLAC; Jones, R.L.; /Washington U., Seattle, Astron. Dept.; Juric, M.; /Princeton, Inst. Advanced Study; Kahn, Steven M.; /SLAC; Lim, K-T.; /SLAC; Lupton, R.H.; /Princeton U.; Monet, D.G.; /Naval Observ., Flagstaff; Pinto, P.A.; /Arizona U.; Sesar, B.; /Washington U., Seattle, Astron. Dept.; Stubbs, Christopher W.; /Harvard U.; Tyson, J.Anthony; /UC, Davis

    2011-11-10

    The Large Synoptic Survey Telescope (LSST) will be a large, wide-field ground-based system designed to obtain, starting in 2015, multiple images of the sky that is visible from Cerro Pachon in Northern Chile. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will observe a 20,000 deg{sup 2} region about 1000 times during the anticipated 10 years of operations (distributed over six bands, ugrizy). Each 30-second long visit will deliver 5{sigma} depth for point sources of r {approx} 24.5 on average. The co-added map will be about 3 magnitudes deeper, and will include 10 billion galaxies and a similar number of stars. We discuss various measurements that will be automatically performed for these 20 billion sources, and how they can be used for classification and determination of source physical and other properties. We provide a few classification examples based on SDSS data, such as color classification of stars, color-spatial proximity search for wide-angle binary stars, orbital-color classification of asteroid families, and the recognition of main Galaxy components based on the distribution of stars in the position-metallicity-kinematics space. Guided by these examples, we anticipate that two grand classification challenges for LSST will be (1) rapid and robust classification of sources detected in difference images, and (2) simultaneous treatment of diverse astrometric and photometric time series measurements for an unprecedentedly large number of objects.

  12. A parts-per-billion measurement of the antiproton magnetic moment

    CERN Document Server

    Smorra, C; Borchert, M J; Harrington, J A; Higuchi, T; Nagahama, H; Tanaka, T; Mooser, A; Schneider, G; Blaum, K; Matsuda, Y; Ospelkaus, C; Quint, W; Walz, J; Yamazaki, Y; Ulmer, S

    2016-01-01

    Precise comparisons of the fundamental properties of matter–antimatter conjugates provide sensitive tests of charge–parity–time (CPT) invariance, which is an important symmetry that rests on basic assumptions of the standard model of particle physics. Experiments on mesons, leptons and baryons have compared different properties of matter–antimatter conjugates with fractional uncertainties at the parts-per-billion level or better. One specific quantity, however, has so far only been known to a fractional uncertainty at the parts-per-million level: the magnetic moment of the antiproton, μ_p̄. The extraordinary difficulty in measuring μ_p̄ with high precision is caused by its intrinsic smallness; for example, it is 660 times smaller than the magnetic moment of the positron. Here we report a high-precision measurement of μ_p̄ in units of the nuclear magneton μ_N with a fractional precision of 1.5 parts per billion (68% confidence level). We use a two-particle spectroscopy method in an advanced cryogenic ...

  13. Energy tax price tag for CPI: $1.2 billion, jobs, and production

    International Nuclear Information System (INIS)

    Begley, R.

    1993-01-01

    If President Clinton's proposed energy tax had been fully in place last year, it would have cost the US chemical industry an additional $1.2 billion and 9,900 jobs, according to Chemical Manufacturers Association (CMA; Washington) estimates. It also would have driven output down 3% and prices up 5%, CMA says. Allen Lenz, CMA director/trade and economics, says the increase in production costs that would accompany the tax will not be shared by foreign competitors, cannot be neutralized with higher border taxes because of existing trade agreements, and provides another reason to move production offshore. Worse, the US chemical industry's generally impressive trade surplus declined by $2.5 billion last year, and a further drop is projected for this year. The margin of error gets thinner all the time as competition increases, Lenz says. We're not concerned only with the chemical industry, but the rest of US-based manufacturing, because they take half our output, he adds. One problem is the energy intensiveness of the chemical process industries; a CMA report says that 55% of the cost of producing ethylene glycol is energy related. And double taxation of such things as coproducts returned for credit to oil refineries could add up to $115 million/year, the report says

  14. Stability of equidimensional pseudo-single-domain magnetite over billion-year timescales.

    Science.gov (United States)

    Nagy, Lesleis; Williams, Wyn; Muxworthy, Adrian R; Fabian, Karl; Almeida, Trevor P; Conbhuí, Pádraig Ó; Shcherbakov, Valera P

    2017-09-26

    Interpretations of paleomagnetic observations assume that naturally occurring magnetic particles can retain their primary magnetic recording over billions of years. The ability to retain a magnetic recording is inferred from laboratory measurements, where heating causes demagnetization on the order of seconds. The theoretical basis for this inference comes from previous models that assume only the existence of small, uniformly magnetized particles, whereas the carriers of paleomagnetic signals in rocks are usually larger, nonuniformly magnetized particles, for which there is no empirically complete, thermally activated model. This study has developed a thermally activated numerical micromagnetic model that can quantitatively determine the energy barriers between stable states in nonuniform magnetic particles on geological timescales. We examine in detail the thermal stability characteristics of equidimensional cuboctahedral magnetite and find that, contrary to previously published theories, such nonuniformly magnetized particles provide greater magnetic stability than their uniformly magnetized counterparts. Hence, nonuniformly magnetized grains, which are commonly the main remanence carrier in meteorites and rocks, can record and retain high-fidelity magnetic recordings over billions of years.

  15. Plate tectonic influences on Earth's baseline climate: a 2 billion-year record

    Science.gov (United States)

    McKenzie, R.; Evans, D. A.; Eglington, B. M.; Planavsky, N.

    2017-12-01

    Plate tectonic processes present strong influences on the long-term carbon cycle, and thus global climate. Here we utilize multiple aspects of the geologic record to assess the role plate tectonics has played in driving major icehouse-greenhouse transitions for the past 2 billion years. Refined paleogeographic reconstructions allow us to quantitatively assess the area of continents in various latitudinal belts throughout this interval. From these data we are able to test the hypothesis that concentrating continental masses in low latitudes will drive cooler climates due to increased silicate weathering. We further superimpose records of events that are believed to increase the 'weatherability' of the crust, such as large igneous province emplacement, island-arc accretion, and continental collisional belts. Climatic records are then compared with global detrital zircon U-Pb age data as a proxy for continental magmatism. Our results show a consistent relationship between zircon-generating magmatism and icehouse-greenhouse transitions for > 2 billion years, whereas paleogeographic records show no clear, consistent relationship between continental configurations and prominent climate transitions. Volcanic outgassing appears to exert a first-order control on major baseline climatic shifts; however, paleogeography likely plays an important role in the magnitude of this change. Notably, climatic extremes, such as the Cryogenian icehouse, occur during a combination of reduced volcanism and end-member concentrations of low-latitude continents.

  16. Families of quasi-pseudo-metrics generated by probabilistic quasi-pseudo-metric spaces

    Directory of Open Access Journals (Sweden)

    Mariusz T. Grabiec

    2008-03-01

    Full Text Available This paper contains a study of families of quasi-pseudo-metrics (the concept of a quasi-pseudo-metric was introduced by Wilson (1931), Albert (1941) and Kelly (1963)) generated by probabilistic quasi-pseudo-metric spaces, which are a generalization of probabilistic metric spaces (PM-spaces, shortly) [2, 3, 4, 6]. The idea of PM-spaces was introduced by Menger (1942, 1951), Schweizer and Sklar (1983) and Serstnev (1965). Families of pseudo-metrics generated by PM-spaces and those generalizing PM-spaces have been described by Stevens (1968) and Nishiure (1970).
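
    For reference, the standard definition behind the terminology (stated here as a sketch of the textbook notion, not quoted from the paper): a quasi-pseudo-metric drops the symmetry axiom ("quasi") and allows distinct points at distance zero ("pseudo").

        A function $p : X \times X \to [0,\infty)$ is a quasi-pseudo-metric on $X$ if,
        for all $x, y, z \in X$:
        \[
          \text{(i)}\ \ p(x,x) = 0,
          \qquad
          \text{(ii)}\ \ p(x,z) \le p(x,y) + p(y,z).
        \]
        Symmetry $p(x,y) = p(y,x)$ is not required, and $p(x,y) = 0$ need not imply $x = y$;
        adding both requirements recovers an ordinary metric.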

  17. Fighter agility metrics. M.S. Thesis

    Science.gov (United States)

    Liefer, Randall K.

    1990-01-01

    Fighter flying qualities and combat capabilities are currently measured and compared in terms relating to vehicle energy, angular rates and sustained acceleration. Criteria based on these measurable quantities have evolved over the past several decades and are routinely used to design aircraft structures, aerodynamics, propulsion and control systems. While these criteria, or metrics, have the advantage of being well understood, easily verified and repeatable during test, they tend to measure the steady state capability of the aircraft and not its ability to transition quickly from one state to another. Proposed new metrics to assess fighter aircraft agility are collected and analyzed. A framework for classification of these new agility metrics is developed and applied. A complete set of transient agility metrics is evaluated with a high fidelity, nonlinear F-18 simulation. Test techniques and data reduction methods are proposed. A method of providing cuing information to the pilot during flight test is discussed. The sensitivity of longitudinal and lateral agility metrics to deviations from the pilot cues is studied in detail. The metrics are shown to be largely insensitive to reasonable deviations from the nominal test pilot commands. Instrumentation required to quantify agility via flight test is also considered. With one exception, each of the proposed new metrics may be measured with instrumentation currently available.

  18. SAPHIRE 8 Quality Assurance Software Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Kurt G. Vedros

    2011-08-01

    The purpose of this review of software metrics is to examine the quality of the metrics gathered in the 2010 IV&V and to set an outline for results of updated metrics runs to be performed. We find from the review that the maintenance of the accepted quality standards presented in the SAPHIRE 8 initial Independent Verification and Validation (IV&V) of April 2010 is most easily achieved by continuing to utilize the tools used in that effort while adding a metric of bug tracking and resolution. Recommendations from the final IV&V were to continue periodic measurable metrics such as McCabe's complexity measure to ensure quality is maintained. The software tools used to measure quality in the IV&V were CodeHealer, Coverage Validator, Memory Validator, Performance Validator, and Thread Validator. These are evaluated based on their capabilities. We attempted to run their latest revisions with the newer Delphi 2010 based SAPHIRE 8 code that has been developed and were successful with all of the Validator series of tools on small tests. Another recommendation from the IV&V was to incorporate a bug tracking and resolution metric. To improve our capability of producing this metric, we integrated our current web reporting system with the SpiraTest test management software purchased earlier this year to track requirements traceability.

  19. An Underwater Color Image Quality Evaluation Metric.

    Science.gov (United States)

    Yang, Miao; Sowmya, Arcot

    2015-12-01

    Quality evaluation of underwater images is a key goal of underwater video image retrieval and intelligent processing. To date, no metric has been proposed for underwater color image quality evaluation (UCIQE). The special absorption and scattering characteristics of the water medium do not allow direct application of natural color image quality metrics especially to different underwater environments. In this paper, subjective testing for underwater image quality has been organized. The statistical distribution of the underwater image pixels in the CIELab color space related to subjective evaluation indicates the sharpness and colorful factors correlate well with subjective image quality perception. Based on these, a new UCIQE metric, which is a linear combination of chroma, saturation, and contrast, is proposed to quantify the non-uniform color cast, blurring, and low-contrast that characterize underwater engineering and monitoring images. Experiments are conducted to illustrate the performance of the proposed UCIQE metric and its capability to measure the underwater image enhancement results. They show that the proposed metric has comparable performance to the leading natural color image quality metrics and the underwater grayscale image quality metrics available in the literature, and can predict with higher accuracy the relative amount of degradation with similar image content in underwater environments. Importantly, UCIQE is a simple and fast solution for real-time underwater video processing. The effectiveness of the presented measure is also demonstrated by subjective evaluation. The results show better correlation between the UCIQE and the subjective mean opinion score.
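
    The abstract describes UCIQE as a linear combination of chroma, saturation, and contrast computed in the CIELab space. A structural sketch of such a score follows; the weighting coefficients c1, c2, c3 and the exact statistics used for each term are placeholders to be taken from the published paper, not the authors' calibration:

        import numpy as np

        def uciqe_like_score(lab_image, c1=1.0, c2=1.0, c3=1.0):
            """Sketch of a UCIQE-style score: a weighted sum of (1) the standard
            deviation of chroma, (2) a luminance-contrast term, and (3) the mean
            saturation, all computed from a CIELab image (H x W x 3). The
            coefficients and per-term statistics are illustrative placeholders."""
            L, a, b = lab_image[..., 0], lab_image[..., 1], lab_image[..., 2]
            chroma = np.sqrt(a ** 2 + b ** 2)
            sigma_chroma = chroma.std()
            contrast_l = np.percentile(L, 99) - np.percentile(L, 1)
            saturation = chroma / (np.sqrt(chroma ** 2 + L ** 2) + 1e-12)
            mu_saturation = saturation.mean()
            return c1 * sigma_chroma + c2 * contrast_l + c3 * mu_saturation

        # Example on a synthetic CIELab-like array: L in [0, 100], a and b in [-30, 30].
        rng = np.random.default_rng(0)
        lab = rng.random((64, 64, 3)) * np.array([100.0, 60.0, 60.0]) - np.array([0.0, 30.0, 30.0])
        print(uciqe_like_score(lab))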

  20. Optimal Lyapunov metrics of expansive homeomorphisms

    International Nuclear Information System (INIS)

    Dovbysh, S A

    2006-01-01

    We sharpen the following results of Reddy, Sakai and Fried: any expansive homeomorphism of a metrizable compactum admits a Lyapunov metric compatible with the topology, and if we also assume the existence of a local product structure (that is, if the homeomorphism is an A*-homeomorphism in the terminology of Alekseev and Yakobson, or possesses hyperbolic canonical coordinates in the terminology of Bowen, or together with the metric compactum constitutes a Smale space in the terminology of Ruelle), then we also obtain the validity of Ruelle's technical axiom on the Lipschitz property of the homeomorphism, its inverse, and the local product structure. It is shown that any expansive homeomorphism admits a Lyapunov metric such that the homeomorphism on local stable (resp. unstable) 'manifolds' is approximately representable on a small scale as a contraction (resp. expansion) with constant coefficient λ_s (resp. λ_u^{-1}) in this metric. For A*-homeomorphisms, we prove that the desired metric can be approximately represented on a small scale as the direct sum of metrics corresponding to the canonical coordinates determined by the local product structure and that local 'manifolds' are 'flat' in some sense. It is also proved that the lower bounds for the contraction constants λ_s and expansion constants λ_u of A*-homeomorphisms are attained simultaneously for some metric that satisfies all the conditions described

  1. Robust Transfer Metric Learning for Image Classification.

    Science.gov (United States)

    Ding, Zhengming; Fu, Yun

    2017-02-01

    Metric learning has attracted increasing attention due to its critical role in image analysis and classification. Conventional metric learning always assumes that the training and test data are sampled from the same or similar distribution. However, to build an effective distance metric, we need abundant supervised knowledge (i.e., side/label information), which is generally inaccessible in practice because of the expensive labeling cost. In this paper, we develop a robust transfer metric learning (RTML) framework to effectively assist the unlabeled target learning by transferring the knowledge from the well-labeled source domain. Specifically, RTML exploits knowledge transfer to mitigate the domain shift in two directions, i.e., sample space and feature space. In the sample space, domain-wise and class-wise adaption schemes are adopted to bridge the gap of marginal and conditional distribution disparities across two domains. In the feature space, our metric is built in a marginalized denoising fashion with a low-rank constraint, which makes it more robust to noisy data in reality. Furthermore, we design an explicit rank constraint regularizer to replace the NP-hard rank minimization problem to guide the low-rank metric learning. Experimental results on several standard benchmarks demonstrate the effectiveness of our proposed RTML by comparing it with state-of-the-art transfer learning and metric learning algorithms.
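
    As a generic illustration of the kind of low-rank Mahalanobis metric the abstract alludes to (a sketch under assumptions, not the authors' RTML algorithm), the metric can be parameterized by a rank-r factor L so that M = L^T L is positive semidefinite and low-rank by construction:

        import numpy as np

        def low_rank_mahalanobis(L):
            """Return a distance function d(x, y) = sqrt((x-y)^T L^T L (x-y)).
            Parameterizing by a rank-r matrix L (r < feature dimension) keeps
            M = L^T L positive semidefinite and low-rank; RTML itself learns the
            metric jointly with its transfer and denoising terms."""
            def dist(x, y):
                diff = L @ (np.asarray(x, dtype=float) - np.asarray(y, dtype=float))
                return float(np.linalg.norm(diff))
            return dist

        rng = np.random.default_rng(0)
        L = rng.normal(size=(2, 5))          # rank-2 metric over 5-dimensional features
        d = low_rank_mahalanobis(L)
        print(d(rng.normal(size=5), rng.normal(size=5)))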

  2. Characterising risk - aggregated metrics: radiation and noise

    International Nuclear Information System (INIS)

    Passchier, W.

    1998-01-01

    The characterisation of risk is an important phase in the risk assessment - risk management process. From the multitude of risk attributes a few have to be selected to obtain a risk characteristic or profile that is useful for risk management decisions and implementation of protective measures. One way to reduce the number of attributes is aggregation. In the field of radiation protection such an aggregated metric is firmly established: effective dose. For protection against environmental noise the Health Council of the Netherlands recently proposed a set of aggregated metrics for noise annoyance and sleep disturbance. The presentation will discuss similarities and differences between these two metrics and practical limitations. The effective dose has proven its usefulness in designing radiation protection measures, which are related to the level of risk associated with the radiation practice in question, given that implicit judgements on radiation induced health effects are accepted. However, as the metric does not take into account the nature of radiation practice, it is less useful in policy discussions on the benefits and harm of radiation practices. With respect to the noise exposure metric, only one effect is targeted (annoyance), and the differences between sources are explicitly taken into account. This should make the metric useful in policy discussions with respect to physical planning and siting problems. The metric proposed has only significance on a population level, and can not be used as a predictor for individual risk. (author)

  3. Impact of humans on the flux of terrestrial sediment to the global coastal ocean.

    Science.gov (United States)

    Syvitski, James P M; Vörösmarty, Charles J; Kettner, Albert J; Green, Pamela

    2005-04-15

    Here we provide global estimates of the seasonal flux of sediment, on a river-by-river basis, under modern and prehuman conditions. Humans have simultaneously increased the sediment transport by global rivers through soil erosion (by 2.3 +/- 0.6 billion metric tons per year), yet reduced the flux of sediment reaching the world's coasts (by 1.4 +/- 0.3 billion metric tons per year) because of retention within reservoirs. Over 100 billion metric tons of sediment and 1 to 3 billion metric tons of carbon are now sequestered in reservoirs constructed largely within the past 50 years. African and Asian rivers carry a greatly reduced sediment load; Indonesian rivers deliver much more sediment to coastal areas.

  4. Mars’ First Billion Years: Key Findings, Key Unsolved Paradoxes, and Future Exploration

    Science.gov (United States)

    Ehlmann, Bethany

    2017-10-01

    In the evolution of terrestrial planets, the first billion years are the period most shrouded in mystery: How vigorous is early atmospheric loss? How do planetary climates respond to a brightening sun? When and how are plate tectonic recycling processes initiated? How do voluminous volcanism and heavy impact bombardment influence the composition of the atmosphere? Under what conditions might life arise? Looking outward to terrestrial planets around other stars, the record from Venus, Earth and Mars in this solar system is crucial for developing models of physical and chemical processes. Of these three worlds, Mars provides the longest record of planetary evolution from the first billion years, comprising >50% of exposed geologic units, which are only lightly overprinted by later processes. Orbital observations of the last decade have revealed abundant evidence for surface waters in the form of lakes, valley networks, and evidence of chemically open-system near-surface weathering. Groundwaters at temperatures ranging from just above freezing to hydrothermal have also left a rich record of process in the mineralogical record. A suite of environments - similar in diversity to Earth's - has been discovered on Mars, with water pH, temperature, redox, and chemistries varying in space and time. Here, I will focus on the consequences of the aqueous alteration of the Martian crust on the composition of the atmosphere based on recent work studying aspects of the volatile budget (Usui et al., 2015; Edwards & Ehlmann, 2015; Hu et al., 2015; Jakosky et al., 2017, Wordsworth et al., 2017, and Ehlmann, in prep.). The solid crust and mantle of Mars act as volatile reservoirs and volatile sources through volcanism, mineral precipitation, and release of gases. We examine the extent to which the budget is understood or ill-understood for hydrogen and carbon, and associated phases H2O, CO2, and CH4. Additionally, I identify some key stratigraphies where a combination of focused in

  5. Metrics Are Needed for Collaborative Software Development

    Directory of Open Access Journals (Sweden)

    Mojgan Mohtashami

    2011-10-01

    Full Text Available There is a need for metrics for inter-organizational collaborative software development projects, encompassing management and technical concerns. In particular, metrics are needed that are aimed at the collaborative aspect itself, such as readiness for collaboration, the quality and/or the costs and benefits of collaboration in a specific ongoing project. We suggest questions and directions for such metrics, spanning the full lifespan of a collaborative project, from considering the suitability of collaboration through evaluating ongoing projects to final evaluation of the collaboration.

  6. On metrics and super-Riemann surfaces

    International Nuclear Information System (INIS)

    Hodgkin, L.

    1987-01-01

    It is shown that any super-Riemann surface M admits a large space of metrics (in a rather basic sense); while if M is of compact genus g type, g>1, M admits a unique metric whose lift to the universal cover is superconformally equivalent to the standard (Baranov-Shvarts) metric on the super-half plane. This explains the relation between the different methods of calculation of the upper Teichmueller space by the author (using arbitrary superconformal transformations) and Crane and Rabin (using only isometries). (orig.)

  7. Finite Metric Spaces of Strictly negative Type

    DEFF Research Database (Denmark)

    Hjorth, Poul G.

    If a finite metric space is of strictly negative type then its transfinite diameter is uniquely realized by an infinite extent ("load vector"). Finite metric spaces that have this property include all trees, and all finite subspaces of Euclidean and Hyperbolic spaces. We prove that if the distance matrix of a finite metric space is both hypermetric and regular, then it is of strictly negative type. We show that the strictly negative type finite subspaces of spheres are precisely those which do not contain two pairs of antipodal points.
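
    For reference, the notion of (strictly) negative type used above, stated as a sketch of the standard definition rather than quoted from the paper:

        A finite metric space $(X,d)$ with $X = \{x_1,\dots,x_n\}$ is of negative type if
        \[
          \sum_{i=1}^{n} \sum_{j=1}^{n} c_i\, c_j\, d(x_i, x_j) \;\le\; 0
          \qquad \text{for all } c \in \mathbb{R}^n \text{ with } \sum_{i=1}^{n} c_i = 0,
        \]
        and of strictly negative type if equality holds only for $c = 0$.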

  8. Applying Sigma Metrics to Reduce Outliers.

    Science.gov (United States)

    Litten, Joseph

    2017-03-01

    Sigma metrics can be used to predict assay quality, allowing easy comparison of instrument quality and predicting which tests will require minimal quality control (QC) rules to monitor the performance of the method. A Six Sigma QC program can result in fewer controls and fewer QC failures for methods with a sigma metric of 5 or better. The higher the number of methods with a sigma metric of 5 or better, the lower the costs for reagents, supplies, and control material required to monitor the performance of the methods. Copyright © 2016 Elsevier Inc. All rights reserved.
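
    The sigma metric referred to above is conventionally computed from the allowable total error (TEa), the observed bias, and the imprecision (CV), all expressed in percent. A minimal worked sketch using that standard formula (the numbers are hypothetical, not taken from the article):

        def sigma_metric(tea_pct, bias_pct, cv_pct):
            """Conventional laboratory sigma metric: (TEa - |bias|) / CV,
            with all three terms expressed in percent."""
            return (tea_pct - abs(bias_pct)) / cv_pct

        # Hypothetical assay: TEa = 10%, bias = 1.5%, CV = 1.6%  ->  sigma ~ 5.3,
        # i.e. a candidate for the minimal-QC rules discussed in the abstract.
        print(round(sigma_metric(10.0, 1.5, 1.6), 1))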

  9. Metric learning for DNA microarray data analysis

    International Nuclear Information System (INIS)

    Takeuchi, Ichiro; Nakagawa, Masao; Seto, Masao

    2009-01-01

    In many microarray studies, gene set selection is an important preliminary step for subsequent main tasks such as tumor classification, cancer subtype identification, etc. In this paper, we investigate the possibility of using metric learning as an alternative to gene set selection. We develop a simple metric learning algorithm with the aim of using it for microarray data analysis. Exploiting a property of the algorithm, we introduce a novel approach for extending the metric learning to be adaptive. We apply the algorithm to previously studied microarray data on malignant lymphoma subtype identification.

  10. Software metrics a rigorous and practical approach

    CERN Document Server

    Fenton, Norman

    2014-01-01

    A Framework for Managing, Measuring, and Predicting Attributes of Software Development Products and Processes. Reflecting the immense progress in the development and use of software metrics in the past decades, Software Metrics: A Rigorous and Practical Approach, Third Edition provides an up-to-date, accessible, and comprehensive introduction to software metrics. Like its popular predecessors, this third edition discusses important issues, explains essential concepts, and offers new approaches for tackling long-standing problems. New to the Third Edition: this edition contains new material relevant

  11. Constructing experimental devices for half-ton synthesis of gadolinium-loaded liquid scintillator and its performance

    Science.gov (United States)

    Park, Young Seo; Jang, Yeong Min; Joo, Kyung Kwang

    2018-04-01

    This paper briefly describes the various experimental devices constructed for half-ton synthesis of gadolinium (Gd)-loaded liquid scintillator (GdLS) and also includes the performance and detailed chemical and physical results of a 0.5% high-concentration GdLS. Various feasibility studies on useful apparatus for loading Gd into solvents have been carried out. The transmittance, Gd concentration, density, light yield, and moisture content were measured for quality control. We show that, with the help of adequate automated experimental devices and tools, it is possible to perform ton-scale synthesis of GdLS at a moderate laboratory scale without difficulty. The synthesized GdLS met the required chemical, optical, and physical properties and various safety requirements. These synthesis devices can be scaled up for massive next-generation neutrino experiments of several hundred tons.

  12. Study of light detection and sensitivity for a ton-scale liquid xenon dark matter detector

    International Nuclear Information System (INIS)

    Wei, Y; Lin, Q; Xiao, X; Ni, K

    2013-01-01

    Ton-scale liquid xenon detectors operated in two-phase mode have recently been proposed and are being constructed to explore the favored parameter space for Weakly Interacting Massive Particle (WIMP) dark matter. To achieve a better light collection efficiency while limiting the number of electronics channels compared to the previous generation of detectors, large-size photomultiplier tubes (PMTs) such as the 3-inch-diameter R11410 from Hamamatsu are suggested to replace the 1-inch-square R8520 PMTs. In a two-phase xenon dark matter detector, two PMT arrays on the top and bottom are usually used. In this study, we compare the performance of two different ton-scale liquid xenon detector configurations with the same number of either R11410 (config.1) or R8520 (config.2) PMTs for the top PMT array, while both use R11410 PMTs for the bottom array. The self-shielding of liquid xenon suppresses the background from the PMTs, and the dominant background is from the pp solar neutrinos in the central fiducial volume. The light collection efficiency for the primary scintillation light is largely affected by the xenon purity and the reflectivity of the reflectors. In the optimistic situation with a 10 m light absorption length and a 95% reflectivity, the light collection efficiency is 43% (34%) for config.1 (config.2). In the conservative situation with a 2.5 m light absorption length and a 85% reflectivity, the value is only 18% (13%) for config.1 (config.2). The difference between the two configurations is due to the larger PMT coverage on the top for config.1. The slightly different position resolutions for the two configurations have a negligible effect on the sensitivity. Based on the above considerations, we estimate the sensitivity reach of the two detector configurations. Both configurations can reach a sensitivity of 2-3 × 10⁻⁴⁷ cm² for the spin-independent WIMP-nucleon cross section for 100 GeV/c² WIMPs after two live-years of operation. The one with R8520 PMTs for the top

  13. SWRO-PRO System in “Mega-ton Water System” for Energy Reduction and Low Environmental Impact

    Directory of Open Access Journals (Sweden)

    Masaru Kurihara

    2018-01-01

    Full Text Available Reverse osmosis (RO) membranes have been widely applied in seawater desalination (SWRO) and wastewater reclamation as the main desalination technology since 2000. SWRO plants face challenges to reduce energy consumption and brine disposal to lessen marine pollution. To tackle these challenges, a SWRO-PRO (Pressure Retarded Osmosis) System was proposed in the “Mega-ton Water System” project under the Japanese national project of the “Funding Program for World-Leading Innovative R&D on Science and Technology” (FIRST Program). To reduce the energy consumption of the main SWRO plant, an innovative low-pressure SWRO membrane and a next-generation energy recovery device (ERD) were developed by the “Mega-ton Water System” project. In addition to this research and development, a new membrane process has been proposed and confirmed as a low-pressure multi-stage SWRO (LMS). A brine conversion two-stage SWRO system was invented 20 years ago, and has been in operation for over 15 years. Application of the SWRO membrane process to actual commercial plants was an important research theme. The low-pressure multi-stage SWRO system (LMS) was an innovative method of introducing a low-pressure membrane, and the membrane elements in the pressure vessel were designed to avoid heavy fouling of the lead elements. As a result of these developments at mega-ton scale SWRO plants, a 20% energy reduction was possible in the SWRO system of the “Mega-ton Water System”. In the development of the PRO process, a PRO hollow fiber membrane module with a maximum membrane power density of 13.3 W/m², using a 10-inch module, was established at a prototype PRO plant. Thus, a 30% energy reduction was possible using the SWRO-PRO System in the “Mega-ton Water System” at mega-ton scale SWRO plants. The brine disposal problem was also solved by this system.

  14. Missing billions. How the Australian government's climate policy is penalising farmers

    International Nuclear Information System (INIS)

    Riguet, T.

    2006-10-01

    The Climate Institute analysis suggests ratifying the Kyoto Protocol and implementing a national emissions trading scheme today could provide Australian farmers with an income of $1.8 billion over the period 2008-2012, due to the emissions saved by limiting land clearing. Separately, a report to the National Farmers Federation by the Allen Consulting Group earlier this year concluded that a carbon emission trading system which recognised Kyoto Protocol rules could create an additional income stream of $0.7-0.9 billion over a five year period from revenue to farmers from forestry sinks. These two studies suggest that ratification of the Kyoto Protocol and the introduction of a national emissions trading scheme could provide farmers an income stream in the order of $2.5 billion. A central tenet of the Federal Government's greenhouse policy for over a decade has been to not ratify Kyoto, but to meet its Kyoto target - a national emissions increase of 8% from 1990 levels, in the period 2008-2012. Australia's National Greenhouse Gas Accounts show that farmers, by reducing land clearing rates since 1990, have offset substantial increases in greenhouse gas emissions from other sectors, mainly energy. Official Federal Government projections show that without land clearing reductions, Australia's greenhouse emissions would be 30% above 1990 levels by 2010. Australia's farmers have been responsible for virtually the entire share of the nation's greenhouse gas emissions reductions, but their efforts, worth around $2 billion, have not been recognised or financially rewarded by the Government. By reducing land clearing, farmers have already reduced greenhouse gas emissions by about 75 million tonnes since 1990. By 2010, the savings are projected to be about 83 million tonnes. This level of emissions reductions is equivalent to eliminating the total annual emissions of New Zealand or Ireland. Over that same period, emissions from energy and transport have and continue to sky

  15. Saving billions of dollars--and physicians' time--by streamlining billing practices.

    Science.gov (United States)

    Blanchfield, Bonnie B; Heffernan, James L; Osgood, Bradford; Sheehan, Rosemary R; Meyer, Gregg S

    2010-06-01

    The U.S. system of billing third parties for health care services is complex, expensive, and inefficient. Physicians end up using nearly 12 percent of their net patient service revenue to cover the costs of excessive administrative complexity. A single transparent set of payment rules for multiple payers, a single claim form, and standard rules of submission, among other innovations, would reduce the burden on the billing offices of physician organizations. On a national scale, our hypothetical modeling of these changes would translate into $7 billion of savings annually for physician and clinical services. Four hours of professional time per physician and five hours of practice support staff time could be saved each week.

  16. Titanium isotopic evidence for felsic crust and plate tectonics 3.5 billion years ago.

    Science.gov (United States)

    Greber, Nicolas D; Dauphas, Nicolas; Bekker, Andrey; Ptáček, Matouš P; Bindeman, Ilya N; Hofmann, Axel

    2017-09-22

    Earth exhibits a dichotomy in elevation and chemical composition between the continents and ocean floor. Reconstructing when this dichotomy arose is important for understanding when plate tectonics started and how the supply of nutrients to the oceans changed through time. We measured the titanium isotopic composition of shales to constrain the chemical composition of the continental crust exposed to weathering and found that shales of all ages have a uniform isotopic composition. This can only be explained if the emerged crust was predominantly felsic (silica-rich) since 3.5 billion years ago, requiring an early initiation of plate tectonics. We also observed a change in the abundance of biologically important nutrients phosphorus and nickel across the Archean-Proterozoic boundary, which might have helped trigger the rise in atmospheric oxygen. Copyright © 2017 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  17. Learnometrics: Metrics for Learning Objects (Learnometrics: metrieken voor leerobjecten)

    OpenAIRE

    Ochoa, Xavier

    2008-01-01

    - Introduction - Quantitative Analysis of the Publication of Learning Objects - Quantitative Analysis of the Reuse of Learning Objects - Metadata Quality Metrics for Learning Objects - Relevance Ranking Metrics for Learning Objects - Metrics Service Architecture and Use Cases - Conclusions

  18. Ici, la priorité est aux piétons / Pedestrians have the right of way at CERN

    CERN Multimedia

    2002-01-01

    Au CERN, nous sommes tous piétons, très souvent automobilistes et parfois cyclistes. Mais peu importe notre moyen de locomotion si l'on reste vigilant et si l'on se rappelle que le piéton est un usager de la route à part entière, mais plus vulnérable. / At CERN, we are all pedestrians, often drivers, and occasionally cyclists. But our means of locomotion does not matter so long as we exercise caution and remember that a pedestrian is a road user in his own right, but a more vulnerable one.

  19. A parts-per-billion measurement of the antiproton magnetic moment

    Science.gov (United States)

    Smorra, C.; Sellner, S.; Borchert, M. J.; Harrington, J. A.; Higuchi, T.; Nagahama, H.; Tanaka, T.; Mooser, A.; Schneider, G.; Bohman, M.; Blaum, K.; Matsuda, Y.; Ospelkaus, C.; Quint, W.; Walz, J.; Yamazaki, Y.; Ulmer, S.

    2017-10-01

    Precise comparisons of the fundamental properties of matter–antimatter conjugates provide sensitive tests of charge–parity–time (CPT) invariance, which is an important symmetry that rests on basic assumptions of the standard model of particle physics. Experiments on mesons, leptons and baryons have compared different properties of matter–antimatter conjugates with fractional uncertainties at the parts-per-billion level or better. One specific quantity, however, has so far only been known to a fractional uncertainty at the parts-per-million level: the magnetic moment of the antiproton, μp̄. The extraordinary difficulty in measuring μp̄ with high precision is caused by its intrinsic smallness; for example, it is 660 times smaller than the magnetic moment of the positron. Here we report a high-precision measurement of μp̄ in units of the nuclear magneton μN with a fractional precision of 1.5 parts per billion (68% confidence level). We use a two-particle spectroscopy method in an advanced cryogenic multi-Penning trap system. Our result μp̄ = ‑2.7928473441(42)μN (where the number in parentheses represents the 68% confidence interval on the last digits of the value) improves the precision of the previous best measurement by a factor of approximately 350. The measured value is consistent with the proton magnetic moment, μp = 2.792847350(9)μN, and is in agreement with CPT invariance. Consequently, this measurement constrains the magnitude of certain CPT-violating effects to below 1.8 × 10⁻²⁴ gigaelectronvolts, and a possible splitting of the proton–antiproton magnetic moments by CPT-odd dimension-five interactions to below 6 × 10⁻¹² Bohr magnetons.

  20. China. Country profile. [China's billion consumers are a rapidly changing market].

    Science.gov (United States)

    Hardee, K

    1984-10-01

    This article provides a summary of demographic, social, and economic characteristics of the People's Republic of China. Chinese leaders project that achievement of the 4 modernizations (agriculture, industry, science, and technology) will double the per capita income level to $800/year by 2000. Although industrial and agricultural growth have outpaced population growth, stringent population control is considered necessary for continued economic development. China's 1982 population was 1.008 billion, with a birth rate of 20.91, a death rate of 6.36, and a 14.55 rate of natural increase. The growth rate declined from 1.3% in 1982 to 1.15% in 1983. To achieve its goal of preventing the population from exceeding 1.2 billion by the year 2000, the government urges couples to have only 1 child. This policy has been successful in the cities but faces opposition in the rural areas. The sex ratio is 106 males to every 100 females, and there is concern about female infanticide. In 1982 the average household size ranged from a high of 5.2 persons in Qinghai and Yunnan to a low of 3.6 persons in Shanghai. 39% of the population lives in nuclear families without relatives. The literacy rate stood at 77% of those over 12 years of age in 1982, but males outnumber females at higher levels of education. China's campaign to improve health has focused on preventive measures, and there are an estimated 3-5 million health care workers. The 1982 labor force participation rate for those 15-64 years of age was 87.7%, with 44% of workers employed in agriculture. 76.6% of women work, primarily in labor-intensive, low-wage occupations.

  1. ICI bites demerger bullet, Zeneca guns for £1.3-billion rights issue

    International Nuclear Information System (INIS)

    Jackson, D.; Alperowicz, N.

    1993-01-01

    Any lingering doubts as to ICI's (London) intentions to follow through on its demerger proposals were dispelled last week. The company will hive off its bioscience business into Zeneca Group plc, which will make a £1.3-billion ($1.9-billion) rights issue in June 1993. Shareholders, whose approval for the historic move will be sought in late May, will receive one fully paid Zeneca share for each ICI share. Proceeds from the rights issue will be used to reduce Zeneca's indebtedness to ICI by about 70%. Acknowledging that ICI had 'spread the jam too thinly' during its expansion in the 1980s, chief executive Ronnie Hampel says the new ICI will be a 'cost-conscious, no-frills' organization and that businesses that failed to perform would be restructured or closed. He is 'not expecting any help from the economy' in 1993. Of ICI's remaining petrochemicals and plastics businesses, Hampel says that despite 'stringent measures to reduce the cost base … it is clear they will not reach a return on capital that will justify reinvestment by ICI.' He does not see them as closure candidates but as 'businesses that will require further restructuring.' Hampel notes 'a dozen clearly identified areas for expansion,' including paints, catalysts, titanium dioxide, and chlorofluorocarbon replacements. Losses in materials, where substantial rationalization has failed to halt the slide, will be reduced on completion of the DuPont deal - expected by midyear. 'Further measures' would be necessary for the 'residual bit of advanced materials in the US,' he says

  2. Searching for Organics Preserved in 4.5 Billion Year Old Salt

    Science.gov (United States)

    Zolensky, Michael E.; Fries, M.; Steele, A.; Bodnar, R.

    2012-01-01

    Our understanding of early solar system fluids took a dramatic turn a decade ago with the discovery of fluid inclusion-bearing halite (NaCl) crystals in the matrix of two freshly fallen brecciated H chondrite falls, Monahans and Zag. Both meteorites are regolith breccias, and contain xenolithic halite (and minor admixed sylvite -- KCl) crystals in their regolith lithologies. The halites are purple to dark blue, due to the presence of color centers (electrons in anion vacancies) which slowly accumulated as ⁴⁰K (in sylvite) decayed over billions of years. The halites were dated by K-Ar, Rb-Sr and I-Xe systematics to be 4.5 billion years old. The "blue" halites were a fantastic discovery for the following reasons: (1) Halite+sylvite can be dated (K is in sylvite and will substitute for Na in halite, Rb substitutes in halite for Na, and I substitutes for Cl). (2) The blue color is lost if the halite dissolves on Earth and reprecipitates (because the newly-formed halite has no color centers), so the color serves as a "freshness" or pristinity indicator. (3) Halite frequently contains aqueous fluid inclusions. (4) Halite contains no structural oxygen, carbon or hydrogen, making them ideal materials to measure these isotopic systems in any fluid inclusions. (5) It is possible to directly measure fluid inclusion formation temperatures, and thus directly measure the temperature of the mineralizing aqueous fluid. In addition to these two ordinary chondrites, halite grains have been reliably reported in several ureilites, an additional ordinary chondrite (Jilin), and in the carbonaceous chondrite (Murchison), although these reports were unfortunately not taken seriously. We have lately found additional fluid inclusions in carbonates in several additional carbonaceous chondrites. Meteoritic aqueous fluid inclusions are apparently relatively widespread in meteorites, though very small and thus difficult to analyze.

  3. A parts-per-billion measurement of the antiproton magnetic moment.

    Science.gov (United States)

    Smorra, C; Sellner, S; Borchert, M J; Harrington, J A; Higuchi, T; Nagahama, H; Tanaka, T; Mooser, A; Schneider, G; Bohman, M; Blaum, K; Matsuda, Y; Ospelkaus, C; Quint, W; Walz, J; Yamazaki, Y; Ulmer, S

    2017-10-18

    Precise comparisons of the fundamental properties of matter-antimatter conjugates provide sensitive tests of charge-parity-time (CPT) invariance, which is an important symmetry that rests on basic assumptions of the standard model of particle physics. Experiments on mesons, leptons and baryons have compared different properties of matter-antimatter conjugates with fractional uncertainties at the parts-per-billion level or better. One specific quantity, however, has so far only been known to a fractional uncertainty at the parts-per-million level: the magnetic moment of the antiproton, μp̄. The extraordinary difficulty in measuring μp̄ with high precision is caused by its intrinsic smallness; for example, it is 660 times smaller than the magnetic moment of the positron. Here we report a high-precision measurement of μp̄ in units of the nuclear magneton μN with a fractional precision of 1.5 parts per billion (68% confidence level). We use a two-particle spectroscopy method in an advanced cryogenic multi-Penning trap system. Our result μp̄ = -2.7928473441(42)μN (where the number in parentheses represents the 68% confidence interval on the last digits of the value) improves the precision of the previous best measurement by a factor of approximately 350. The measured value is consistent with the proton magnetic moment, μp = 2.792847350(9)μN, and is in agreement with CPT invariance. Consequently, this measurement constrains the magnitude of certain CPT-violating effects to below 1.8 × 10⁻²⁴ gigaelectronvolts, and a possible splitting of the proton-antiproton magnetic moments by CPT-odd dimension-five interactions to below 6 × 10⁻¹² Bohr magnetons.

  4. MPLS/VPN traffic engineering: SLA metrics

    Science.gov (United States)

    Cherkaoui, Omar; MacGibbon, Brenda; Blais, Michel; Serhrouchni, Ahmed

    2001-07-01

    Traffic engineering must be concerned with a broad definition of service that includes network availability, reliability and stability, as well as traditional traffic data on loss, throughput, delay and jitter. MPLS and Virtual Private Networks (VPNs) significantly contribute to security and Quality of Service (QoS) within communication networks, but there remains a need for metric measurement and evaluation. The purpose of this paper is to propose a methodology which gives a measure for LSP (Label Switching Paths) metrics in VPN MPLS networks. We propose here a statistical method for the evaluation of those metrics. Statistical methodology is very important in this type of study since there is a large amount of data to consider. We use the notions of sample surveys, self-similar processes, linear regression, additive models and bootstrapping. The results obtained allows us to estimate the different metrics for such SLAs.
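
    As one concrete illustration of the statistical machinery mentioned above, the following minimal Python sketch (hypothetical data and thresholds, not taken from the paper) bootstraps a confidence interval for a 95th-percentile one-way delay, the kind of per-LSP metric an SLA might bound.

        import random

        def percentile(values, q):
            """Nearest-rank style percentile on a sorted copy (q in [0, 1])."""
            s = sorted(values)
            k = min(len(s) - 1, max(0, int(round(q * (len(s) - 1)))))
            return s[k]

        def bootstrap_ci(samples, stat, n_boot=2000, alpha=0.05):
            """Percentile-bootstrap confidence interval for an arbitrary statistic."""
            estimates = sorted(
                stat([random.choice(samples) for _ in samples]) for _ in range(n_boot)
            )
            lo = estimates[int((alpha / 2) * (n_boot - 1))]
            hi = estimates[int((1 - alpha / 2) * (n_boot - 1))]
            return lo, hi

        # Hypothetical per-packet one-way delays (ms) measured on one LSP.
        delays_ms = [random.lognormvariate(3.0, 0.3) for _ in range(500)]
        low, high = bootstrap_ci(delays_ms, lambda xs: percentile(xs, 0.95))
        print(f"95th-percentile delay: {percentile(delays_ms, 0.95):.1f} ms "
              f"(95% CI {low:.1f}-{high:.1f} ms)")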

  5. Einstein metrics on tangent bundles of spheres

    Energy Technology Data Exchange (ETDEWEB)

    Dancer, Andrew S [Jesus College, Oxford University, Oxford OX1 3DW (United Kingdom); Strachan, Ian A B [Department of Mathematics, University of Hull, Hull HU6 7RX (United Kingdom)

    2002-09-21

    We give an elementary treatment of the existence of complete Kaehler-Einstein metrics with nonpositive Einstein constant and underlying manifold diffeomorphic to the tangent bundle of the (n+1)-sphere.

  6. Medicare Contracting - Redacted Benchmark Metric Reports

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Centers for Medicare and Medicaid Services has compiled aggregate national benchmark cost and workload metrics using data submitted to CMS by the AB MACs and the...

  7. Metric Guidelines Inservice and/or Preservice

    Science.gov (United States)

    Granito, Dolores

    1978-01-01

    Guidelines are given for designing teacher training for going metric. The guidelines were developed from existing guidelines, journal articles, a survey of colleges, and the detailed reactions of a panel. (MN)

  8. Variational principles for amenable metric mean dimensions

    OpenAIRE

    Chen, Ercai; Dou, Dou; Zheng, Dongmei

    2017-01-01

    In this paper, we prove variational principles between metric mean dimension and rate distortion function for countable discrete amenable group actions, which extend recent results by Lindenstrauss and Tsukamoto.

  9. Science and Technology Metrics and Other Thoughts

    National Research Council Canada - National Science Library

    Harman, Wayne; Staton, Robin

    2006-01-01

    This report explores the subject of science and technology metrics and other topics to begin to provide Navy managers, as well as scientists and engineers, additional tools and concepts with which to...

  10. Clean Cities Annual Metrics Report 2009 (Revised)

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, C.

    2011-08-01

    Document provides Clean Cities coalition metrics about the use of alternative fuels; the deployment of alternative fuel vehicles, hybrid electric vehicles (HEVs), and idle reduction initiatives; fuel economy activities; and programs to reduce vehicle miles driven.

  11. Metrics, Media and Advertisers: Discussing Relationship

    Directory of Open Access Journals (Sweden)

    Marco Aurelio de Souza Rodrigues

    2014-11-01

    Full Text Available This study investigates how Brazilian advertisers are adapting to new media and its attention metrics. In-depth interviews were conducted with advertisers in 2009 and 2011. In 2009, new media and its metrics were celebrated as innovations that would increase advertising campaigns overall efficiency. In 2011, this perception has changed: New media’s profusion of metrics, once seen as an advantage, started to compromise its ease of use and adoption. Among its findings, this study argues that there is an opportunity for media groups willing to shift from a product-focused strategy towards a customer-centric one, through the creation of new, simple and integrative metrics

  12. Flight Crew State Monitoring Metrics, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — eSky will develop specific crew state metrics based on the timeliness, tempo and accuracy of pilot inputs required by the H-mode Flight Control System (HFCS)....

  13. Performance metrics used by freight transport providers.

    Science.gov (United States)

    2008-09-30

    The newly-established National Cooperative Freight Research Program (NCFRP) has allocated $300,000 in funding to a project entitled Performance Metrics for Freight Transportation (NCFRP 03). The project is scheduled for completion in September ...

  14. New quality metrics for digital image resizing

    Science.gov (United States)

    Kim, Hongseok; Kumara, Soundar

    2007-09-01

    Digital image rescaling by interpolation has been intensively researched over past decades, and still receives constant attention from many applications such as medical diagnosis, super-resolution, image blow-up, nano-manufacturing, etc. However, there are no agreed-upon metrics to objectively assess and compare the quality of resized images. Some existing measures such as peak signal-to-noise ratio (PSNR) or mean squared error (MSE), widely used in the image restoration area, do not always coincide with the opinions of viewers. Enlarged digital images generally suffer from two major artifacts, blurring and zigzagging; those undesirable effects, especially around edges, significantly degrade the overall perceptual image quality. We propose two new image quality metrics to measure the degree of these two major defects, and compare several existing interpolation methods using the proposed metrics. We also evaluate the validity of image quality metrics by comparing rank correlations.
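
    For reference, the two conventional measures the abstract contrasts with viewer opinion, MSE and PSNR, can be computed as follows; this is a generic NumPy sketch (8-bit images assumed), not the new metrics proposed in the paper.

        import numpy as np

        def mse(reference, test):
            """Mean squared error between two images of equal shape."""
            diff = reference.astype(np.float64) - test.astype(np.float64)
            return float(np.mean(diff ** 2))

        def psnr(reference, test, peak=255.0):
            """Peak signal-to-noise ratio in dB (higher is better)."""
            err = mse(reference, test)
            return float("inf") if err == 0 else 10.0 * np.log10(peak ** 2 / err)

        # Hypothetical usage: compare a degraded enlargement against the original.
        original = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
        degraded = np.clip(original + np.random.normal(0, 5, original.shape), 0, 255)
        print(f"MSE = {mse(original, degraded):.2f}, PSNR = {psnr(original, degraded):.2f} dB")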

  15. Supplier selection using different metric functions

    Directory of Open Access Journals (Sweden)

    Omosigho S.E.

    2015-01-01

    Full Text Available Supplier selection is an important component of supply chain management in today’s global competitive environment. Hence, the evaluation and selection of suppliers have received considerable attention in the literature. Many attributes of suppliers, other than cost, are considered in the evaluation and selection process. Therefore, the process of evaluation and selection of suppliers is a multi-criteria decision making process. The methodology adopted to solve the supplier selection problem is intuitionistic fuzzy TOPSIS (Technique for Order Preference by Similarity to the Ideal Solution). Generally, TOPSIS is based on the concept of minimum distance from the positive ideal solution and maximum distance from the negative ideal solution. We examine the deficiencies of using only one metric function in TOPSIS and propose the use of a spherical metric function in addition to the commonly used metric functions. For empirical supplier selection problems, more than one metric function should be used.
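
    To make the role of the distance function explicit, here is a minimal crisp TOPSIS sketch in Python with a pluggable metric (the intuitionistic-fuzzy and spherical-metric details of the paper are omitted; the matrix, weights, and criteria below are hypothetical).

        import numpy as np

        def topsis(matrix, weights, benefit, distance=None):
            """Rank alternatives by closeness to the ideal solution.

            matrix:   alternatives x criteria scores
            weights:  criterion weights summing to 1
            benefit:  True for benefit criteria, False for cost criteria
            distance: callable(a, b) -> float; Euclidean if None
            """
            if distance is None:
                distance = lambda a, b: np.linalg.norm(a - b)
            m = np.asarray(matrix, dtype=float)
            # Vector normalization, then weighting.
            v = m / np.linalg.norm(m, axis=0) * np.asarray(weights, dtype=float)
            ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
            anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
            d_pos = np.array([distance(row, ideal) for row in v])
            d_neg = np.array([distance(row, anti) for row in v])
            return d_neg / (d_pos + d_neg)  # closeness coefficient, higher is better

        # Hypothetical supplier data: columns = cost, quality, delivery reliability.
        scores = topsis(
            matrix=[[70, 0.90, 0.95], [60, 0.80, 0.90], [80, 0.95, 0.85]],
            weights=[0.4, 0.35, 0.25],
            benefit=[False, True, True],
        )
        print("Closeness coefficients:", np.round(scores, 3))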

  16. Environmental metrics for community health improvement.

    Science.gov (United States)

    Jakubowski, Benjamin; Frumkin, Howard

    2010-07-01

    Environmental factors greatly affect human health. Accordingly, environmental metrics are a key part of the community health information base. We review environmental metrics relevant to community health, including measurements of contaminants in environmental media, such as air, water, and food; measurements of contaminants in people (biomonitoring); measurements of features of the built environment that affect health; and measurements of "upstream" environmental conditions relevant to health. We offer a set of metrics (including unhealthy exposures, such as pollutants, and health-promoting assets, such as parks and green space) selected on the basis of relevance to health outcomes, magnitude of associated health outcomes, corroboration in the peer-reviewed literature, and data availability, especially at the community level, and we recommend ways to use these metrics most effectively.

  17. Eight Tons of Material Footprint—Suggestion for a Resource Cap for Household Consumption in Finland

    Directory of Open Access Journals (Sweden)

    Michael Lettenmeier

    2014-07-01

    Full Text Available The paper suggests a sustainable material footprint of eight tons per person per year as a resource cap target for household consumption in Finland. This means an 80% (factor 5) reduction from the present Finnish average. The material footprint is used as a synonym for the Total Material Requirement (TMR) calculated for products and activities. The paper suggests how to allocate the sustainable material footprint to different consumption components on the basis of earlier household studies, as well as other studies, on the material intensity of products, services, and infrastructures. It analyzes requirements, opportunities, and challenges for future developments in technology and lifestyle, also taking into account that future lifestyles are expected to show a high degree of diversity. The targets and approaches are discussed for the consumption components of nutrition, housing, household goods, mobility, leisure activities, and other purposes. The paper states that a sustainable level of natural resource use by households is achievable and that it can be roughly allocated to different consumption components in order to illustrate the need for a change in lifestyles. While the absolute material footprint of all the consumption components will have to decrease, the relative share of nutrition, the most basic human need, in the total material footprint is expected to rise, whereas much smaller shares than at present are proposed for housing and especially mobility. For reducing material resource use to the sustainable level suggested, both social innovations and technological developments are required.

  18. Materials Discarded in the U.S. Municipal Waste Stream, 1960 to 2009 (in tons)

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has collected and reported data on the generation and disposal of waste in the United States for more than 30 years. We use this information to measure the success of waste reduction and recycling programs across the country. Our trash, or municipal solid waste (MSW), is made up of the things we commonly use and then throw away. These materials include items such as packaging, food scraps, grass clippings, sofas, computers, tires, and refrigerators. MSW does not include industrial, hazardous, or construction waste. The data on Materials Discarded in the Municipal Waste Stream, 1960 to 2009, provides estimated data in thousands of tons discarded after recycling and compost recovery for the years 1960, 1970, 1980, 1990, 2000, 2005, 2007, 2008, and 2009. In this data set, discards include combustion with energy recovery. This data table does not include construction & demolition debris, industrial process wastes, or certain other wastes. The Other category includes electrolytes in batteries and fluff pulp, feces, and urine in disposable diapers. Details may not add to totals due to rounding.

  19. Statistical downscaling of temperature using three techniques in the Tons River basin in Central India

    Science.gov (United States)

    Duhan, Darshana; Pandey, Ashish

    2015-08-01

    In this study, downscaling models were developed for projections of monthly maximum and minimum air temperature at three stations, namely Allahabad, Satna, and Rewa, in the Tons River basin, a sub-basin of the Ganges River in Central India. Three downscaling techniques, namely multiple linear regression (MLR), artificial neural network (ANN), and least square support vector machine (LS-SVM), were used for the development of the models, and the best identified model was used to simulate the future predictand (temperature) using the third-generation Canadian Coupled Global Climate Model (CGCM3) simulation of the A2 emission scenario for the period 2001-2100. The performance of the models was evaluated based on four statistical performance indicators. To reduce the bias in the monthly projected temperature series, a bias correction technique was employed. The results show that all the models are able to simulate temperature; however, the LS-SVM models perform slightly better than ANN and MLR. The best identified LS-SVM models are then employed to project future temperature. The results of the future projections show increasing trends in maximum and minimum temperature for the A2 scenario. Further, it is observed that minimum temperature will increase at a greater rate than maximum temperature.
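
    The regression branch of such a downscaling exercise can be sketched in a few lines; the example below uses synthetic predictors and a simple mean-and-variance bias correction (not the authors' CGCM3/LS-SVM setup): it fits a multiple linear regression from large-scale predictors to station temperature and then adjusts the simulated series toward the observed distribution.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic monthly data: two large-scale predictors and an observed station temperature.
        n = 360
        predictors = rng.normal(size=(n, 2))
        observed = 25.0 + 2.0 * predictors[:, 0] - 1.0 * predictors[:, 1] + rng.normal(0, 0.5, n)

        # Multiple linear regression (MLR) by least squares, with an intercept column.
        X = np.column_stack([np.ones(n), predictors])
        coeffs, *_ = np.linalg.lstsq(X, observed, rcond=None)

        # Stand-in GCM-driven series: a "historical" run and a warmer "future" scenario run.
        historical = X @ coeffs + rng.normal(0, 0.8, n)
        future_predictors = rng.normal(loc=0.3, size=(n, 2))
        future_raw = np.column_stack([np.ones(n), future_predictors]) @ coeffs

        # Simple bias correction: scale variance and shift mean using the historical run vs observations.
        scale = observed.std() / historical.std()
        corrected = (future_raw - historical.mean()) * scale + observed.mean()
        print(f"projected mean change: {corrected.mean() - observed.mean():.2f} degC")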

  20. VALIDACIÓN RESISTIVA ESTRUCTURAL DE UN VARADERO PARA EMBARCACIONES DE 600 ton.

    Directory of Open Access Journals (Sweden)

    Carlos Novo Soto

    2005-09-01

    Full Text Available In the present work, the load-bearing capacity of the structure of a slipway is validated on the basis of strength and stiffness conditions. The slipway consists of 2 trapezoidal carriages, on which 6 bogies move transversally, allowing the docked vessel to be transferred to the side berths; it also has 3 motors with their respective reduction systems and drums, one of which hauls out the vessel supported on the bogies and the 2 cradle carriages, another is used to return the vessel to the sea, and the last is used to move the vessel on the bogies, transversally to the cradle carriages, toward the side berths. Given the structural complexity of the system, a physical-mathematical model is developed which, through the application of the Finite Element Method, allows the maximum equivalent von Mises stresses and the maximum displacements to be obtained; with these results, the load-bearing capacity of the structure to haul out a 600-ton vessel is finally validated through Finite Element Analysis.

  1. 1000–ton testing machine for cyclic fatigue tests of materials at liquid nitrogen temperatures

    International Nuclear Information System (INIS)

    Khitruk, A. A.; Klimchenko, Yu. A.; Kovalchuk, O. A.; Marushin, E. L.; Mednikov, A. A.; Nasluzov, S. N.; Privalova, E. K.; Rodin, I. Yu.; Stepanov, D. B.; Sukhanova, M. V.

    2014-01-01

    One of the main tasks of superconducting magnet R&D is to determine the mechanical and fatigue properties of structural materials and critical design elements in the cryogenic temperature range. This paper describes a new facility built around the industrial 1000-ton (10 MN) testing machine Schenk PC10.0S. Special equipment was developed to provide mechanical and cyclic tensile fatigue tests of large-scale samples at liquid nitrogen temperature and in a given load range. The main feature of the developed testing machine is the cryostat, in which a device converting the standard compression force of the testing machine into the tensile force applied to the test object is placed. The control system provides remote control of the test and the acquisition, processing, and presentation of test data. As an example of the testing machine's operation, the test program and results of the cyclic tensile fatigue tests of a full-scale helium inlet sample of the ITER PF1 coil are presented.

  2. High temperature experiments on a 4 tons UF6 container TENERIFE program

    Energy Technology Data Exchange (ETDEWEB)

    Casselman, C.; Duret, B.; Seiler, J.M.; Ringot, C.; Warniez, P.

    1991-12-31

    The paper presents an experimental program (called TENERIFE) whose aim is to investigate the behaviour of a cylinder containing UF₆ when exposed to a high temperature fire, for model validation. Taking into account the experiments performed in the past, the modelling needs further information in order to be able to predict the behaviour of a real-size cylinder when engulfed in an 800°C fire, as specified in the regulation. The main unknowns are related to (1) the UF₆ behaviour beyond the critical point, (2) the relationship between temperature field and internal pressure and (3) the equivalent conductivity of the solid UF₆. In order to investigate these phenomena in a representative way it is foreseen to perform experiments with a cylinder of real diameter, but reduced length, containing 4 tons of UF₆. This cylinder will be placed in an electrically heated furnace. A confinement vessel prevents any dispersion of UF₆. The heat flux delivered by the furnace will be calibrated by specific tests. The cylinder will be changed for each test.

  3. Performance and Results of the LBNE 35 Ton Membrane Cryostat Prototype

    Science.gov (United States)

    Montanari, David; Adamowski, Mark; Hahn, Alan; Norris, Barry; Reichenbacher, Juergen; Rucinski, Russell; Stewart, Jim; Tope, Terry

    We report on the performance and commissioning of the first membrane cryostat to be used for scientific application. The Long Baseline Neutrino Experiment (LBNE) has designed and fabricated a membrane cryostat prototype in collaboration with Ishikawajima-Harima Heavy Industries Co., Ltd. (IHI). LBNE has designed and fabricated the supporting cryogenic system infrastructure and successfully commissioned and operated the first membrane cryostat. Original goals of the prototype are: to demonstrate the membrane cryostat technology in terms of thermal performance, feasibility for liquid argon and leak tightness; to demonstrate that we can remove all the impurities from the vessel and achieve the purity requirements in a membrane cryostat without evacuation; to demonstrate that we can achieve and maintain the purity requirements of the liquid argon using mol sieve and copper filters. The purity requirements of a large liquid argon detector such as LBNE are contaminants below 200 parts per trillion (ppt) oxygen equivalent. LBNE is planning the design and construction of a large liquid argon detector. This presentation will present requirements, design and construction of the LBNE 35 ton membrane cryostat prototype, and detail the commissioning and performance. The experience and results of this prototype are extremely important for the development of the LBNE detector.

  4. The economic value of one ton CO2: what system of reference for public action?

    International Nuclear Information System (INIS)

    2007-04-01

    Given the convergence of scientific analyses of global warming and its consequences for the planet - evaluated for years by the Intergovernmental Panel on Climate Change (IPCC) - it is no longer possible to postpone the efforts required to reduce our emissions of greenhouse gases substantially. However, the choice of actions to take and the calendar of priorities are proving complex to define: the social and economic consequences are great, and neither France (which represents 2% of global emissions) nor Europe (15%) are up to treating the problem independently of the rest of the world. Faced with this challenge, and with budgetary constraints imposing a rationalisation of expenditure, public action must have measuring instruments at its disposal: the value of one ton of carbon is one such instrument. This Strategic Newswatch has a twofold objective: to recall the usefulness of this reference value which, though it cannot guarantee the validity of different public policies, may contribute to ensuring their consistency; and to present the different approaches and difficulties that producing such a reference system introduces. (author)

  5. New Gromov-Inspired Metrics on Phylogenetic Tree Space.

    Science.gov (United States)

    Liebscher, Volkmar

    2018-03-01

    We present a new class of metrics for unrooted phylogenetic X-trees inspired by the Gromov-Hausdorff distance for (compact) metric spaces. These metrics can be efficiently computed by linear or quadratic programming. They are robust under NNI operations, too. The local behaviour of the metrics shows that they are different from any previously introduced metrics. The performance of the metrics is briefly analysed on random weighted and unweighted trees as well as random caterpillars.

  6. Target Scattering Metrics: Model-Model and Model Data comparisons

    Science.gov (United States)

    2017-12-13

    ...be suitable for input to classification schemes. The investigated metrics are then applied to model-data comparisons. Targets used in the TIER simulations for the metrics study include a stainless steel replica of an artillery shell. Four potential metrics were investigated; the metric based on 2D cross-correlation is typically used in classification algorithms. Model-model comparisons

  7. Transport impacts on atmosphere and climate: Metrics

    Science.gov (United States)

    Fuglestvedt, J. S.; Shine, K. P.; Berntsen, T.; Cook, J.; Lee, D. S.; Stenke, A.; Skeie, R. B.; Velders, G. J. M.; Waitz, I. A.

    2010-12-01

    The transport sector emits a wide variety of gases and aerosols, with distinctly different characteristics which influence climate directly and indirectly via chemical and physical processes. Tools that allow these emissions to be placed on some kind of common scale in terms of their impact on climate have a number of possible uses, such as: in agreements and emission trading schemes; when considering potential trade-offs between changes in emissions resulting from technological or operational developments; and/or for comparing the different environmental impacts of transport activities. Many of the non-CO₂ emissions from the transport sector are short-lived substances, not currently covered by the Kyoto Protocol. There are formidable difficulties in developing metrics, and these are particularly acute for such short-lived species. One difficulty concerns the choice of an appropriate structure for the metric (which may depend on, for example, the design of any climate policy it is intended to serve) and the associated value judgements on the appropriate time periods to consider; these choices affect the perception of the relative importance of short- and long-lived species. A second difficulty is the quantification of input parameters (due to underlying uncertainty in atmospheric processes). In addition, for some transport-related emissions, the values of metrics (unlike the gases included in the Kyoto Protocol) depend on where and when the emissions are introduced into the atmosphere - both the regional distribution and, for aircraft, the distribution as a function of altitude, are important. In this assessment of such metrics, we present Global Warming Potentials (GWPs) as these have traditionally been used in the implementation of climate policy. We also present Global Temperature Change Potentials (GTPs) as an alternative metric, as this, or a similar metric, may be more appropriate for use in some circumstances. We use radiative forcings and lifetimes
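
    For readers unfamiliar with the two quantities named above, their standard textbook definitions (not quoted from this record) can be written as follows, where RF_x(t) is the radiative forcing at time t after a unit-mass pulse emission of species x and ΔT_x(H) is the resulting global mean temperature change at time horizon H:

        \mathrm{GWP}_H(x) = \frac{\int_0^H \mathrm{RF}_x(t)\,\mathrm{d}t}{\int_0^H \mathrm{RF}_{\mathrm{CO_2}}(t)\,\mathrm{d}t},
        \qquad
        \mathrm{GTP}_H(x) = \frac{\Delta T_x(H)}{\Delta T_{\mathrm{CO_2}}(H)}

    The choice of the horizon H is precisely the value judgement the abstract refers to: short-lived species appear far more important under a 20-year horizon than under a 100-year one.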

  8. Static and Dynamic Software Quality Metric Tools

    OpenAIRE

    Mayo, Kevin A.; Wake, Steven A.; Henry, Sallie M.

    1990-01-01

    The ability to detect and predict poor software quality is of major importance to software engineers, managers, and quality assurance organizations. Poor software quality leads to increased development costs and expensive maintenance. With so much attention on exacerbated budgetary constraints, a viable alternative is necessary. Software quality metrics are designed for this purpose. Metrics measure aspects of code or PDL representations, and can be collected and used throughout the life ...

  9. Effective dimension in some general metric spaces

    Directory of Open Access Journals (Sweden)

    Elvira Mayordomo

    2014-03-01

    Full Text Available We introduce the concept of effective dimension for a general metric space. Effective dimension was defined by Lutz in (Lutz 2003) for Cantor space and has also been extended to Euclidean space. Our extension to other metric spaces is based on a supergale characterization of Hausdorff dimension. We present here the concept of constructive dimension and its characterization in terms of Kolmogorov complexity. Further research directions are indicated.

  10. GRC GSFC TDRSS Waveform Metrics Report

    Science.gov (United States)

    Mortensen, Dale J.

    2013-01-01

    The report presents software metrics and porting metrics for the GGT Waveform. The porting was from a ground-based COTS SDR, the SDR-3000, to the CoNNeCT JPL SDR. The report does not address any of the Operating Environment (OE) software development, nor the original TDRSS waveform development at GSFC for the COTS SDR. With regard to STRS, the report presents compliance data and lessons learned.

  11. Autonomous Exploration Using an Information Gain Metric

    Science.gov (United States)

    2016-03-01

    ...navigation goals, serving to drive an autonomous system. By continually moving to these navigation goals and taking measurements, the system works to... (ARL-TR-7638, US Army Research Laboratory, March 2016; by Nicholas C Fung, Jason M Gregory, and John G Rogers, Computational and...)

  12. A Laplacian on Metric Measure Spaces

    DEFF Research Database (Denmark)

    Kokkendorff, Simon Lyngby

    2006-01-01

    We introduce a Laplacian on a class of metric measure spaces via a direct pointwise mean value definition. Fundamental properties of this Laplacian, such as its symmetry as an operator on functions satisfying a Neumann or Dirichlet condition, are established.

  13. Engineering Design Handbook. Metric Conversion Guide

    Science.gov (United States)

    1976-07-01

    ...result of international economic and political situations, the metric question was not seriously considered until the 1950s. Then, the opening of... Law 90-472, authorizing the Department of Commerce to conduct the United States Metric Study, was passed by Congress. 1975: The Deputy Secretary of

  14. Area Regge calculus and discontinuous metrics

    International Nuclear Information System (INIS)

    Wainwright, Chris; Williams, Ruth M

    2004-01-01

    Taking the triangle areas as independent variables in the theory of Regge calculus can lead to ambiguities in the edge lengths, which can be interpreted as discontinuities in the metric. We construct solutions to area Regge calculus using a triangulated lattice and find that on a spacelike or timelike hypersurface no such discontinuity can arise. On a null hypersurface however, we can have such a situation and the resulting metric can be interpreted as a so-called refractive wave

  15. Some observations on a fuzzy metric space

    Energy Technology Data Exchange (ETDEWEB)

    Gregori, V.

    2017-07-01

    Let $(X,d)$ be a metric space. In this paper we provide some observations about the fuzzy metric space in the sense of Kramosil and Michalek $(Y,N,\wedge)$, where $Y$ is the set of non-negative real numbers $[0,\infty[$ and $N(x,y,t)=1$ if $d(x,y)\leq t$ and $N(x,y,t)=0$ if $d(x,y)> t$. (Author)

  16. Node self-connections in network metrics.

    Science.gov (United States)

    Saura, Santiago

    2018-02-01

    Zamborain-Mason et al. (Ecol. Lett., 20, 2017, 815-831) state that they have newly proposed network metrics that account for node self-connections. Network metrics incorporating node self-connections, also referred to as intranode (intrapatch) connectivity, were however already proposed before and have been widely used in a variety of conservation planning applications. © 2017 The Author. Ecology Letters published by CNRS and John Wiley & Sons Ltd.

  17. Etude comparative de la cinétique de la réaction d’hydratation des bétons autoplaçants et des bétons vibrés

    Directory of Open Access Journals (Sweden)

    Ahmed Gargouri

    2014-04-01

    Indeed, the exothermic nature of the chemical reaction of cement can induce expansion and contraction deformations. Moreover, the capillary depression created by the water consumption due to cement hydration leads to drying shrinkage. These deformations can cause micro-cracking that may affect the long-term durability of the structure, especially for thick structures. Hence the importance of studying the hydration kinetics of these non-conventional concretes and comparing them with that of traditional vibrated concretes. The evolution of the adiabatic temperature, as well as the time variation of the degree of hydration, is determined for the self-compacting concrete and the vibrated concrete. The analysis of the experimental results obtained shows that the change in composition considerably modifies the kinetics of the hydration reaction.

  18. Relaxed metrics and indistinguishability operators: the relationship

    Energy Technology Data Exchange (ETDEWEB)

    Martin, J.

    2017-07-01

    In 1982, the notion of indistinguishability operator was introduced by E. Trillas in order to fuzzify the crisp notion of equivalence relation (\cite{Trillas}). In the study of such a class of operators, an outstanding property must be pointed out. Concretely, there exists a duality relationship between indistinguishability operators and metrics. The aforesaid relationship was deeply studied by several authors that introduced a few techniques to generate metrics from indistinguishability operators and vice-versa (see, for instance, \cite{BaetsMesiar,BaetsMesiar2}). In the last years a new generalization of the metric notion has been introduced in the literature with the purpose of developing mathematical tools for quantitative models in Computer Science and Artificial Intelligence (\cite{BKMatthews,Ma}). The aforementioned generalized metrics are known as relaxed metrics. The main target of this talk is to present a study of the duality relationship between indistinguishability operators and relaxed metrics in such a way that the aforementioned classical techniques to generate both concepts, one from the other, can be extended to the new framework. (Author)
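
    The duality mentioned here can be illustrated with a small sketch. Below is a minimal, hypothetical Python example (not taken from the cited works) assuming values already scaled into [0,1] and the Łukasiewicz t-norm; it builds a metric from an indistinguishability operator via d = 1 − E and spot-checks transitivity and the triangle inequality numerically.

        import itertools
        import random

        def t_lukasiewicz(a, b):
            """Lukasiewicz t-norm T_L(a, b) = max(a + b - 1, 0)."""
            return max(a + b - 1.0, 0.0)

        def metric_from_indistinguishability(E):
            """Classical duality: d(x, y) = 1 - E(x, y)."""
            return lambda x, y: 1.0 - E(x, y)

        # Example indistinguishability operator on real numbers, clipped to [0, 1].
        E = lambda x, y: max(0.0, 1.0 - abs(x - y))
        d = metric_from_indistinguishability(E)

        # Spot-check T_L-transitivity of E and the triangle inequality of d.
        pts = [random.random() for _ in range(20)]
        for x, y, z in itertools.product(pts, repeat=3):
            assert E(x, z) >= t_lukasiewicz(E(x, y), E(y, z)) - 1e-12
            assert d(x, z) <= d(x, y) + d(y, z) + 1e-12
        print("duality checks passed on", len(pts), "random points")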

  19. Peano compactifications and property S metric spaces

    Directory of Open Access Journals (Sweden)

    R. F. Dickman

    1980-01-01

    Full Text Available Let (X,d) denote a locally connected, connected separable metric space. We say that X is S-metrizable provided there is a topologically equivalent metric ρ on X such that (X,ρ) has Property S, i.e. for any ϵ>0, X is the union of finitely many connected sets of ρ-diameter less than ϵ. It is well-known that S-metrizable spaces are locally connected and that if ρ is a Property S metric for X, then the usual metric completion (X˜,ρ˜) of (X,ρ) is a compact, locally connected, connected metric space, i.e. (X˜,ρ˜) is a Peano compactification of (X,ρ). There are easily constructed examples of locally connected, connected metric spaces which fail to be S-metrizable; however, the author does not know of a non-S-metrizable space (X,d) which has a Peano compactification. In this paper we conjecture that: if (P,ρ) is a Peano compactification of (X,ρ|X), then X must be S-metrizable. Several (new) necessary and sufficient conditions for a space to be S-metrizable are given, together with an example of a non-S-metrizable space which fails to have a Peano compactification.

  20. Almost convex metrics and Peano compactifications

    Directory of Open Access Journals (Sweden)

    R. F. Dickman

    1982-01-01

    Full Text Available Let (X,d) denote a locally connected, connected separable metric space. We say that X is S-metrizable provided there is a topologically equivalent metric ρ on X such that (X,ρ) has Property S, i.e., for any ϵ>0, X is the union of finitely many connected sets of ρ-diameter less than ϵ. It is well-known that S-metrizable spaces are locally connected and that if ρ is a Property S metric for X, then the usual metric completion (X˜,ρ˜) of (X,ρ) is a compact, locally connected, connected metric space; i.e., (X˜,ρ˜) is a Peano compactification of (X,ρ). In an earlier paper, the author conjectured that if a space (X,d) has a Peano compactification, then it must be S-metrizable. In this paper, that conjecture is shown to be false; however, the connected spaces which have Peano compactifications are shown to be exactly those having a totally bounded, almost convex metric. Several related results are given.

  1. Metrics for Offline Evaluation of Prognostic Performance

    Directory of Open Access Journals (Sweden)

    Sankalita Saha

    2010-01-01

    Full Text Available Prognostic performance evaluation has gained significant attention in the past few years. Currently, prognostics concepts lack standard definitions and suffer from ambiguous and inconsistent interpretations. This lack of standards is in part due to the varied end-user requirements for different applications, time scales, available information, domain dynamics, etc., to name a few. The research community has used a variety of metrics largely based on convenience and their respective requirements. Very little attention has been focused on establishing a standardized approach to compare different efforts. This paper presents several new evaluation metrics tailored for prognostics that were recently introduced and were shown to effectively evaluate various algorithms as compared to other conventional metrics. Specifically, this paper presents a detailed discussion on how these metrics should be interpreted and used. These metrics have the capability of incorporating probabilistic uncertainty estimates from prognostic algorithms. In addition to quantitative assessment they also offer a comprehensive visual perspective that can be used in designing the prognostic system. Several methods are suggested to customize these metrics for different applications. Guidelines are provided to help choose one method over another based on distribution characteristics. Various issues faced by prognostics and its performance evaluation are discussed, followed by a formal notational framework to help standardize subsequent developments.
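
    As a flavor of the kind of time-indexed accuracy criteria the paper describes, here is a small hypothetical Python sketch of an alpha-lambda style check: a remaining-useful-life (RUL) prediction made partway through a unit's life must fall within +/- alpha of the true RUL (the names, thresholds, and the simplified deterministic form are illustrative, not the paper's exact definitions, which also handle probabilistic estimates).

        def alpha_lambda_pass(t_prediction, rul_predicted, t_end_of_life, alpha=0.2):
            """Return True if the RUL prediction made at time t_prediction lies within
            +/- alpha * (true RUL) of the ground-truth RUL."""
            rul_true = t_end_of_life - t_prediction
            return abs(rul_predicted - rul_true) <= alpha * rul_true

        # Hypothetical run-to-failure case: failure at t = 200 h, predictions every 40 h.
        t_eol = 200.0
        predictions = {40: 175.0, 80: 130.0, 120: 85.0, 160: 45.0}  # time -> predicted RUL (h)
        for t, rul_hat in predictions.items():
            ok = alpha_lambda_pass(t, rul_hat, t_eol)
            print(f"t={t:>3} h: predicted RUL {rul_hat:5.1f} h, "
                  f"true {t_eol - t:5.1f} h -> {'pass' if ok else 'fail'}")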

  2. 48 CFR 611.002-70 - Metric system implementation.

    Science.gov (United States)

    2010-10-01

    ... of measurement sensitive processes and systems to the metric system. Soft metric means the result of... with security, operations, economic, technical, logistical, training and safety requirements. (3) The...

  3. Baby universe metric equivalent to an interior black-hole metric

    International Nuclear Information System (INIS)

    Gonzalez-Diaz, P.F.

    1991-01-01

    It is shown that the maximally extended metric corresponding to a large wormhole is the unique possible wormhole metric whose baby universe sector is conformally equivalent to the maximal inextendible Kruskal metric corresponding to the interior region of a Schwarzschild black hole whose gravitational radius is half the wormhole neck radius. The physical implications of this result in the black hole evaporation process are discussed. (orig.)

  4. 'Plug and play' vochtsensor biedt eenvoud (interview met o.a. Ton Baltissen, Pieter van Dalfsen en Jos Balendonck)

    NARCIS (Netherlands)

    Jagers, F.; Baltissen, A.H.M.C.; Dalfsen, van P.; Balendonck, J.

    2013-01-01

    Recent research by PPO-Wageningen UR Boomkwekerij shows that tree nurseries growing plants in pots or containers can save up to 40% on water and fertilizer application if irrigation is applied more efficiently. Interview with, among others, Ton Baltissen, Pieter van Dalfsen and Jos Balendonck.

  5. Avian pathogenic Escherichia coli ΔtonB mutants are safe and protective live-attenuated vaccine candidates.

    Science.gov (United States)

    Holden, Karen M; Browning, Glenn F; Noormohammadi, Amir H; Markham, Philip; Marenda, Marc S

    2014-10-10

    Avian pathogenic Escherichia coli (APEC) cause colibacillosis, a serious respiratory disease in poultry. Most APEC strains possess TonB-dependent outer membrane transporters for the siderophores salmochelin and aerobactin, which both contribute to their capacity to cause disease. To assess the potential of iron transport deficient mutants as vaccine candidates, the tonB gene was deleted in the APEC wild type strain E956 and a Δfur (ferric uptake repressor) mutant of E956. The growth of the ΔtonB and ΔtonB/Δfur mutants was impaired in iron-restricted conditions, but not in iron-replete media. Day old chicks were exposed to aerosols of the mutants to assess their efficacy as live attenuated vaccines. At day 18, the birds were challenged with aerosols of the virulent parent strain E956. Both mutants conferred protection against colibacillosis; weight gains and lesion scores were significantly different between the vaccinated groups and an unvaccinated challenged control group. Thus mutation of iron uptake systems can be used as a platform technology to generate protective live attenuated vaccines against extraintestinal E. coli infections, and potentially a range of Gram negative pathogens of importance in veterinary medicine. Copyright © 2014. Published by Elsevier B.V.

  6. The structure of TON1937 from archaeon Thermococcus onnurineus NA1 reveals a eukaryotic HEAT-like architecture.

    Science.gov (United States)

    Jeong, Jae-Hee; Kim, Yi-Seul; Rojviriya, Catleya; Cha, Hyung Jin; Ha, Sung-Chul; Kim, Yeon-Gil

    2013-10-01

    The members of the ARM/HEAT repeat-containing protein superfamily in eukaryotes have been known to mediate protein-protein interactions by using their concave surface. However, little is known about the ARM/HEAT repeat proteins in prokaryotes. Here we report the crystal structure of TON1937, a hypothetical protein from the hyperthermophilic archaeon Thermococcus onnurineus NA1. The structure reveals a crescent-shaped molecule composed of a double layer of α-helices with seven anti-parallel α-helical repeats. A structure-based sequence alignment of the α-helical repeats identified a conserved pattern of hydrophobic or aliphatic residues reminiscent of the consensus sequence of eukaryotic HEAT repeats. The individual repeats of TON1937 also share high structural similarity with the canonical eukaryotic HEAT repeats. In addition, the concave surface of TON1937 is proposed to be its potential binding interface based on this structural comparison and its surface properties. These observations lead us to speculate that the archaeal HEAT-like repeats of TON1937 have evolved to engage in protein-protein interactions in the same manner as eukaryotic HEAT repeats. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. 46 CFR 171.124 - Watertight integrity above the margin line in a vessel less than 100 gross tons.

    Science.gov (United States)

    2010-10-01

    ... Watertight integrity above the margin line in a vessel less than 100 gross tons. Section 171.124, Shipping, COAST GUARD, DEPARTMENT OF HOMELAND SECURITY... Watertight Integrity Above the Margin Line § 171.124 Watertight integrity above the margin line in a vessel less than...

  8. 46 CFR 171.122 - Watertight integrity above the margin line in a vessel of 100 gross tons or more.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping, Vol. 7 (2010-10-01) — Watertight integrity above the margin line in a vessel of 100 gross tons or more. Section 171.122, Shipping, COAST GUARD, DEPARTMENT OF HOMELAND... Watertight Integrity Above the Margin Line § 171.122 Watertight integrity above the margin line in a vessel...

  9. 33 CFR 157.43 - Discharges of clean and segregated ballast: Seagoing tank vessels of 150 gross tons or more.

    Science.gov (United States)

    2010-07-01

    ... discharge point for segregated ballast. (c) All discharges of clean ballast and segregated ballast must be... 33 Navigation and Navigable Waters, Vol. 2 (2010-07-01) — Discharges of clean and... Discharges of clean and segregated ballast: Seagoing tank vessels of 150 gross tons or more. (a) Clean...

  10. Active Seismic Monitoring Using High-Power Moveable 40-TONS Vibration Sources in Altay-Sayn Region of Russia

    Science.gov (United States)

    Soloviev, V. M.; Seleznev, V. S.; Emanov, A. F.; Kashun, V. N.; Elagin, S. A.; Romanenko, I.; Shenmayer, A. E.; Serezhnikov, N.

    2013-05-01

    The paper presents data from operational vibroseismic observations using high-power stationary 100-ton and moveable 40-ton vibration sources, which have been carried out in Russia for 30 years. It is shown that investigations using high-power vibration sources open new possibilities for studying the stress-strain state of the Earth's crust and upper mantle and the tectonic processes within them. Special attention is given to developing operational seismic sounding of the Earth's crust and upper mantle using high-power 40-ton vibration sources. Experimental research has demonstrated the high stability and repeatability of the vibration signals. Long experiments lasting many days, with vibration source sessions every two hours, were carried out to estimate monitoring accuracy. It was determined that the repeatability of the vibroseismic signals (measured as the travel-time difference of P- and S-waves from the crystalline basement between repeated sessions) is on the order of 10^-3 to 10^-4 s, ten times smaller than the annual variations of kinematic parameters revealed by routine vibroseismic observations. It is shown that on hard, high-velocity ground the radiated spectrum becomes narrowband and shifts to higher frequencies, while the number of higher harmonics grows; when radiating on soft sedimentary ground (sand, clay), the source spectrum in the near zone is more broadband and the correlograms are more compact. The correspondence of wave fields from 40-ton vibration sources and from explosions, for reference waves from boundaries in the Earth's crust and upper mantle at recording distances of 400 km, has been confirmed by many experiments in various regions of Russia; a technique for grouping high-power vibration sources was developed to increase radiation efficiency and recording distance. According to the results of long-term vibroseismic monitoring near Novosibirsk (1997-2012) there are

  11. Community Extreme Tonnage User Service (CETUS): A 5000 Ton Open Research Facility in the United States

    Science.gov (United States)

    Danielson, L.; Righter, K.; McCubbin, F.

    2016-01-01

    Large-sample-volume 5000 ton multi-anvil presses have contributed to the exploration of deep Earth and planetary interiors and to the synthesis of ultra-hard and other novel materials, and they serve as a sample complement to pressure and temperature regimes already attainable by diamond anvil cell experiments. However, no such facility exists on the North American continent. We propose the establishment of an open user facility for COMPRES members and the entire research community, with the unique capability of a 5000 ton (or larger) press, supported by a host of extant co-located experimental and analytical laboratories and research staff. We offer a wide range of complementary and/or preparatory experimental options. Any required synthesis of materials or follow-up experiments can be carried out in controlled-atmosphere furnaces, piston cylinders, multi-anvil presses, or experimental impact apparatus. Additionally, our division houses two machine shops that would facilitate any modification or custom work necessary for the development of CETUS: one for general fabrication and one located within our experimental facilities. We also have a general sample preparation laboratory, dedicated to experimental samples, that allows users to quickly and easily prepare samples for e-beam analyses and more. A service we can offer to COMPRES community members in general, and to visiting CETUS users specifically, is a multitude of analytical instrumentation literally steps away from the experimental laboratories. This year we will be pursuing site funding of our laboratories through NASA's Planetary Science Directorate, which should result in substantial cost savings to all visiting users and supports our mission of interagency cooperation for the enhancement of science for all (see companion PSAMS abstract). The PI is in a unique position, as an employee of Jacobs Technology, to draw funding from multiple sources, including industry and commerce. We submitted a Planetary Major Equipment

  12. Cryo-Compression System in a 3000 Ton Multi-Anvil Press

    Science.gov (United States)

    Secco, R. A.; Yong, W.

    2016-12-01

    Most large volume high pressure devices are capable of high temperature experiments that are typically achieved by using localized resistive heating of a metal foil, graphite or ceramic sleeve inside a thermally insulated sample volume in a high pressure cell. Low temperatures at high pressures are needed for physical property studies of materials that comprise planetary bodies in the outer solar system. However, low temperatures are more difficult to achieve mainly because the massive steel components of the press, which are in good thermal contact with each other under high load, act as large heat reservoirs and pathways that encumber the removal of heat from the pressure cell. We describe a new custom-designed system under development for a 3000 ton multi-anvil press to reach temperatures below 295K at high pressures. The system was designed to remove heat selectively and conductively from the sample volume through six of the eight WC cubes in direct contact with the octahedral pressure cell. Cooling fins made of Cu are sandwiched between, and in thermal contact with, neighboring anvil faces and are each connected to a dedicated Cu heat exchanger chamber through which liquid nitrogen flows. The chamber internal geometry consists of either square pillars that double the internal surface area of the rectangular parallelepiped enclosed volume or continuous walls separated by valleys. Gas from each chamber is vented to the lab through an exhaust pipe. High pressure results will be presented of several temperature monitoring points in the center of the pressure cell and on the surfaces of the WC cubes and steel wedges which recorded the time-dependent cooling progress. Temperature stability tests will also be presented.

  13. Analysis of internal crack in a six-ton P91 ingot

    Directory of Open Access Journals (Sweden)

    Jing-an Yang

    2016-05-01

    Full Text Available P91 is a new kind of heat-resistant, high-tensile steel. It can be extruded after ingot casting and is widely used for pipes in power plants. However, due to its mushy freezing characteristics, a lack of feeding in the ingot center often generates defects such as porosity and cracks. A six-ton P91 ingot was cast and sliced, and a representative part of the longitudinal section was inspected in detail. The morphology of the crack-like defects was examined by high-energy industrial X-ray CT and reconstructed with 3D software. There are five main defect regions larger than 200 mm3, four of which are interconnected. These initiated from continuous liquid films and were then torn apart by excessive tensile stress within the brittle temperature range (BTR). A 3D FEM thermo-mechanical simulation was carried out to analyze the formation of the porosity and internal crack defects. The shrinkage porosity results and Niyama values revealed that the center of the ingot suffers from inadequate feeding. Several criteria based on thermal and mechanical models were used to evaluate the susceptibility to hot crack formation. The Clyne and Davies criterion and Katgerman's criterion successfully predicted the high hot-crack susceptibility in the ingot center. Six typical locations in the longitudinal section were chosen to analyze the evolution of stresses and strains during the BTR. Locations in the defect region showed the highest tensile stresses and relatively high strain values, while the other locations showed either low tensile stresses or low strain values. In conclusion, hot cracks develop only when stress and strain exceed threshold values at the same time during the BTR.
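
    For reference, the Niyama criterion invoked above is conventionally computed from local solidification conditions, with low values flagging regions prone to shrinkage porosity. A minimal statement of the standard definition (not specific to this study):

    ```latex
    % Niyama criterion, evaluated near the end of solidification
    Ny = \frac{G}{\sqrt{\dot{T}}}
    \qquad
    \begin{aligned}
    G       &: \text{local thermal gradient (e.g. K/mm)}\\
    \dot{T} &: \text{local cooling rate (e.g. K/s)}
    \end{aligned}
    ```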

  14. Development and operation of a 30 ton/ day gasification and melting plant for municipal solid wastes

    International Nuclear Information System (INIS)

    Jung, Hae Young; Seo, Yong-Chil; Cho, Sung-Jin; Lee, Jang-Su; Lee, Ki-Bae; Jeong, Dae-Woon; Kim, Woo-Hyun; Roh, Seon-Ah; Min, Tai-Jin

    2010-01-01

    As part of the effort to increase the recycling rate of end-of-life vehicles (ELVs) enforced by governmental regulation, automobile shredder residue (ASR) was treated by a thermal method to convert waste to energy. Gasification and melting experimental processes at lab scale (1 kg/hour) and pilot scale (5 ton/day) were installed. ASR collected from a domestic shredding company was tested in lab-scale and pilot-scale gasification and melting processes similar to a shaft-type gasification melting furnace. The characteristics of the syngas, tar and residue (slag) generated from the conversion process (gasification and melting) were analyzed to provide information for further utilizing them as fuel and recyclable materials in scaled-up plants. A series of experiments was conducted at various air equivalence ratios (ERs), and the syngas composition, carbon conversion efficiency, heating value of the syngas, and yield and characteristics of the slag were analyzed. Finally, the slag generated from the process was recycled using various alternative technologies. In summary, an energy conversion technology for ASR with minimal residue production, based on gasification and slag utilization, has been developed. The main components in the product gas were H2, CO, CH4 and CO2; the concentrations of C2H4 and C2H6 were lower. The product gas can be used as a clean fuel gas with a heating value ranging from 2.5 to 14.0 MJ/m3. Most of the slag generated from the process can be further fabricated into valuable, usable products. Such a combined technology would result in achieving almost zero waste release from ELVs. (author)
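
    The air equivalence ratio (ER) varied in these runs is, as commonly defined in gasification studies (the record itself does not restate it), the actual air supply relative to the stoichiometric requirement:

    ```latex
    \mathrm{ER} = \frac{(\text{air/fuel})_{\text{actual}}}{(\text{air/fuel})_{\text{stoichiometric}}}
    ```

    Gasification operates at ER well below 1, i.e. with a deliberately sub-stoichiometric air supply, which is why a combustible product gas is obtained.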

  15. TENERIFE program: high temperature experiments on a 4-ton UF6 container

    International Nuclear Information System (INIS)

    Casselman, C.; Duret, B.; Seiler, J.M.; Ringot, C.; Warniez, P.; Wataru, M.; Shiomi, S.; Ozaki, S.; Yamakawa, H.

    1993-01-01

    To determine the inputs for a future thermo-mechanical code, we need a better understanding of the thermo-physical evolution of the UF6 that pressurizes the container. This evolution is a function of: a) the heat transfer rate from the fire to the container; b) the UF6 behaviour in the container. These tests are essentially analytical, at simulated fire temperatures between 800 and 1000°C. They use a representative mass of UF6 (around 4 tons). The tests will not seek to rupture the test container, which has the same diameter as the 48Y container but a shorter length. These tests, carried out under realistic conditions (typical thermal gradient at the wall, characteristic period for UF6 internal mass transfer), should make it possible to improve knowledge of two fundamental phenomena: 1) vaporization of UF6 in contact with the heated wall (around 400°C), a phenomenon which controls the kinetics of internal pressurization of the container; 2) the equivalent conductivity of solid UF6, a phenomenon linked to heat transfer by UF6 vaporization-condensation through the solid's porosities and which depends on the diameter of the container. In addition, they will allow the influence of other parameters to be studied, such as the UF6 container filling mode or the mechanical characteristics of the container material. A UF6 container fitted with instruments (wall temperature, UF6 temperature, pressure) is heated by a rapid thermal transient in a radiating furnace where the temperature and thermal power supplied can be measured. The test continues until pre-established thresholds have been reached: 1) a strain threshold measured on the container surface (strain gauges positioned on the outside); 2) a maximum UF6 temperature threshold; 3) a container internal pressure threshold. (J.P.N.)

  16. An Ionospheric Metric Study Using Operational Models

    Science.gov (United States)

    Sojka, J. J.; Schunk, R. W.; Thompson, D. C.; Scherliess, L.; Harris, T. J.

    2006-12-01

    One of the outstanding challenges in upgrading ionospheric operational models is quantifying their improvement. This challenge is not necessarily one of absolute accuracy, but rather of answering the question, "Is the newest operational model an improvement over its predecessor under operational scenarios?" There are few documented cases where ionospheric models are compared either with each other or against "ground truth". For example, a CEDAR workshop team, PRIMO, spent almost a decade carrying out a model comparison with ionosonde and incoherent scatter radar measurements from the Millstone Hill, Massachusetts location [Anderson et al., 1998]. The result of this study was that all models were different, and specific conditions could be found under which each was the "best" model. Similarly, a National Space Weather Metrics ionospheric challenge was held and results were presented at a National Space Weather meeting. The results were again found to be open to interpretation, and issues with the value of the specific metrics were raised (Fuller-Rowell, private communication, 2003). Hence, unlike the tropospheric weather community, which has established metrics and exercised them on new models over many decades to quantify improvement, the ionospheric community has not yet settled on a metric of both scientific and operational value. We report on a study in which metrics were used to compare various forms of the International Reference Ionosphere (IRI), the Ionospheric Forecast Model (IFM), and the Utah State University Global Assimilation of Ionospheric Measurements (USU-GAIM) models. The ground truth for this study was a group of 11 ionosonde data sets taken between 20 March and 19 April 2004. The metric parameter was the ionosphere's critical frequency, and the metric was referenced to the IRI. Hence, the study addressed the specific question of what improvement the IFM and USU-GAIM models offer over the IRI. Both strengths (improvements) and weaknesses of these models are discussed.

  17. The dynamics of metric-affine gravity

    International Nuclear Information System (INIS)

    Vitagliano, Vincenzo; Sotiriou, Thomas P.; Liberati, Stefano

    2011-01-01

    Highlights: → The role and the dynamics of the connection in metric-affine theories is explored. → The most general second order action does not lead to a dynamical connection. → Including higher order invariants excites new degrees of freedom in the connection. → f(R) actions are also discussed and shown to be a non-representative class. - Abstract: Metric-affine theories of gravity provide an interesting alternative to general relativity: in such an approach, the metric and the affine (not necessarily symmetric) connection are independent quantities. Furthermore, the action should include covariant derivatives of the matter fields, with the covariant derivative naturally defined using the independent connection. As a result, in metric-affine theories a direct coupling involving matter and connection is also present. The role and the dynamics of the connection in such theories is explored. We employ power counting in order to construct the action and search for the minimal requirements it should satisfy for the connection to be dynamical. We find that for the most general action containing lower order invariants of the curvature and the torsion the independent connection does not carry any dynamics. It actually reduces to the role of an auxiliary field and can be completely eliminated algebraically in favour of the metric and the matter fields, introducing extra interactions with respect to general relativity. However, we also show that including higher order terms in the action radically changes this picture and excites new degrees of freedom in the connection, making it (or parts of it) dynamical. Constructing actions that constitute exceptions to this rule requires significant fine tuning and/or extra a priori constraints on the connection. We also consider f(R) actions as a particular example in order to show that they constitute a distinct class of metric-affine theories with special properties, and as such they cannot be used as representative toy

  18. Nordic energy co-operation can save the equivalent of 4 - 10 billion USD

    International Nuclear Information System (INIS)

    Lind, Oddvar

    2000-01-01

    Better co-ordination of the energy and environment policies among the Nordic countries can be very profitable from a socio-economic point of view and facilitate the fulfilment of the Kyoto agreement. A Swedish calculation shows that up to 10 billion USD can be saved by building a trans-Nordic gas pipeline while at the same time preparing for a common implementation of the Kyoto agreement, combined with increased electricity trade, improved efficiency and increased use of renewable energy sources. The consumption of natural gas must then increase threefold over the next 25 years. There is no alternative to natural gas with the same potential if coal and oil are to be replaced to reduce the emission of carbon dioxide. The importance of natural gas is further increased by the phase-out of nuclear energy in Sweden. After 2025 the use of natural gas will be reduced, and by 2040 biomass energy, wind energy and solar energy will contribute as much as natural gas, that is, 250 TWh. Throughout the entire period more than half of the electricity production will be hydropower. It is presupposed that the cogeneration sector and the district heating network are substantially expanded, even in South Norway. The Nordic energy system is quite flexible with respect to fulfilling future CO2 targets. Although the Nordic countries have different commitments under the Kyoto agreement, they will profit economically from acting jointly within the sum of their individual emission quotas.

  19. Providing safe drinking water to 1.2 billion unserved people

    Energy Technology Data Exchange (ETDEWEB)

    Gadgil, Ashok J.; Derby, Elisabeth A.

    2003-06-01

    Despite substantial advances in the past 100 years in public health, technology and medicine, 20% of the world population, mostly comprised of the poor population segments in developing countries (DCs), still does not have access to safe drinking water. To reach the United Nations (UN) Millennium Goal of halving the number of people without access to safe water by 2015, the global community will need to provide an additional one billion urban residents and 600 million rural residents with safe water within the next twelve years. This paper examines current water treatment measures and implementation methods for delivery of safe drinking water, and offers suggestions for making progress towards the goal of providing a timely and equitable solution for safe water provision. For water treatment, based on the serious limitations of boiling water and chlorination, we suggest an approach based on filtration coupled with ultraviolet (UV) disinfection, combined with public education. Additionally, owing to the capacity limitations for non-governmental organizations (NGOs) to take on this task primarily on their own, we suggest a strategy based on financially sustainable models that include the private sector as well as NGOs.

  20. A Massive Galaxy in Its Core Formation Phase Three Billion Years After the Big Bang

    Science.gov (United States)

    Nelson, Erica; van Dokkum, Pieter; Franx, Marijn; Brammer, Gabriel; Momcheva, Ivelina; Schreiber, Natascha M. Forster; da Cunha, Elisabete; Tacconi, Linda; Bezanson, Rachel; Kirkpatrick, Allison

    2014-01-01

    Most massive galaxies are thought to have formed their dense stellar cores at early cosmic epochs. However, cores in their formation phase have not yet been observed. Previous studies have found galaxies with high gas velocity dispersions or small apparent sizes, but so far no objects have been identified with both the stellar structure and the gas dynamics of a forming core. Here we present a candidate core in formation 11 billion years ago, at z = 2.3. GOODS-N-774 has a stellar mass of 1.0 × 10^11 solar masses, a half-light radius of 1.0 kpc, and a star formation rate of 90 (+45/−20) solar masses per year. The star-forming gas has a velocity dispersion of 317 ± 30 km/s, amongst the highest ever measured. It is similar to the stellar velocity dispersions of the putative descendants of GOODS-N-774: compact quiescent galaxies at z ≈ 2 (refs. 8-11) and giant elliptical galaxies in the nearby Universe. Galaxies such as GOODS-N-774 appear to be rare; however, from the star formation rate and size of the galaxy we infer that many star-forming cores may be heavily obscured, and could be missed in optical and near-infrared surveys.

  1. A large neutral fraction of cosmic hydrogen a billion years after the Big Bang.

    Science.gov (United States)

    Wyithe, J Stuart B; Loeb, Abraham

    2004-02-26

    The fraction of ionized hydrogen left over from the Big Bang provides evidence for the time of formation of the first stars and quasar black holes in the early Universe; such objects provide the high-energy photons necessary to ionize hydrogen. Spectra of the two most distant known quasars show nearly complete absorption of photons with wavelengths shorter than the Lyman alpha transition of neutral hydrogen, indicating that hydrogen in the intergalactic medium (IGM) had not been completely ionized at a redshift of z approximately 6.3, about one billion years after the Big Bang. Here we show that the IGM surrounding these quasars had a neutral hydrogen fraction of tens of per cent before the quasar activity started, much higher than the previous lower limits of approximately 0.1 per cent. Our results, when combined with the recent inference of a large cumulative optical depth to electron scattering after cosmological recombination therefore suggest the presence of a second peak in the mean ionization history of the Universe.

  2. Development of multicomponent parts-per-billion-level gas standards of volatile toxic organic compounds

    International Nuclear Information System (INIS)

    Rhoderick, G.C.; Zielinski, W.L. Jr.

    1990-01-01

    This paper reports that the demand for stable, low-concentration multicomponent standards of volatile toxic organic compounds for quantifying national and state measurement of ambient air quality and hazardous waste incineration emissions has markedly increased in recent years. In response to this demand, a microgravimetric technique was developed and validated for preparing such standards; these standards ranged in concentration from several parts per million (ppm) down to one part per billion (ppb) and in complexity from one organic up to 17. Studies using the gravimetric procedure to prepare mixtures of different groups of organics. including multi-components mixtures in the 5 to 20 ppb range, revealed a very low imprecision. This procedure is based on the separate gravimetric introduction of individual organics into an evacuated gas cylinder, followed by the pressurized addition of a precalculated amount of pure nitrogen. Additional studies confirmed the long-term stability of these mixtures. The uncertainty of the concentrations of the individual organics at the 95% confidence level ranged from less than 1% relative at 1 ppm to less than 10% relative at 1 ppb. Over 100 primary gravimetric standards have been developed, validated, and used for certifying the concentrations of a variety of mixtures for monitoring studies

  3. Potentially biogenic carbon preserved in a 4.1 billion-year-old zircon.

    Science.gov (United States)

    Bell, Elizabeth A; Boehnke, Patrick; Harrison, T Mark; Mao, Wendy L

    2015-11-24

    Evidence of life on Earth is manifestly preserved in the rock record. However, the microfossil record only extends to ∼3.5 billion years (Ga), the chemofossil record arguably to ∼3.8 Ga, and the rock record to 4.0 Ga. Detrital zircons from Jack Hills, Western Australia range in age up to nearly 4.4 Ga. From a population of over 10,000 Jack Hills zircons, we identified one >3.8-Ga zircon that contains primary graphite inclusions. Here, we report carbon isotopic measurements on these inclusions in a concordant, 4.10 ± 0.01-Ga zircon. We interpret these inclusions as primary due to their enclosure in a crack-free host as shown by transmission X-ray microscopy and their crystal habit. Their δ13C_PDB of −24 ± 5‰ is consistent with a biogenic origin and may be evidence that a terrestrial biosphere had emerged by 4.1 Ga, or ∼300 My earlier than has been previously proposed.
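
    The δ13C value quoted above is the standard per-mil deviation of the sample's 13C/12C ratio from the PDB reference standard:

    ```latex
    \delta^{13}\mathrm{C}_{\mathrm{PDB}} =
    \left( \frac{\left({}^{13}\mathrm{C}/{}^{12}\mathrm{C}\right)_{\text{sample}}}
                {\left({}^{13}\mathrm{C}/{}^{12}\mathrm{C}\right)_{\text{PDB}}} - 1 \right)
    \times 1000 \quad \text{(in ‰)}
    ```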

  4. The controversial "Cambrian" fossils of the Vindhyan are real but more than a billion years older.

    Science.gov (United States)

    Bengtson, Stefan; Belivanova, Veneta; Rasmussen, Birger; Whitehouse, Martin

    2009-05-12

    The age of the Vindhyan sedimentary basin in central India is controversial, because geochronology indicating early Proterozoic ages clashes with reports of Cambrian fossils. We present here an integrated paleontologic-geochronologic investigation to resolve this conundrum. New sampling of Lower Vindhyan phosphoritic stromatolitic dolomites from the northern flank of the Vindhyans confirms the presence of fossils most closely resembling those found elsewhere in Cambrian deposits: annulated tubes, embryo-like globules with polygonal surface pattern, and filamentous and coccoidal microbial fabrics similar to Girvanella and Renalcis. None of the fossils, however, can be ascribed to uniquely Cambrian or Ediacaran taxa. Indeed, the embryo-like globules are not interpreted as fossils at all but as former gas bubbles trapped in mucus-rich cyanobacterial mats. Direct dating of the same fossiliferous phosphorite yielded a Pb-Pb isochron of 1,650 ± 89 (2σ) million years ago, confirming the Paleoproterozoic age of the fossils. New U-Pb geochronology of zircons from tuffaceous mudrocks in the Lower Vindhyan Porcellanite Formation on the southern flank of the Vindhyans give comparable ages. The Vindhyan phosphorites provide a window of 3-dimensionally preserved Paleoproterozoic fossils resembling filamentous and coccoidal cyanobacteria and filamentous eukaryotic algae, as well as problematic forms. Like Neoproterozoic phosphorites a billion years later, the Vindhyan deposits offer important new insights into the nature and diversity of life, and in particular, the early evolution of multicellular eukaryotes.

  5. Enhanced cellular preservation by clay minerals in 1 billion-year-old lakes.

    Science.gov (United States)

    Wacey, David; Saunders, Martin; Roberts, Malcolm; Menon, Sarath; Green, Leonard; Kong, Charlie; Culwick, Timothy; Strother, Paul; Brasier, Martin D

    2014-07-28

    Organic-walled microfossils provide the best insights into the composition and evolution of the biosphere through the first 80 percent of Earth history. The mechanism of microfossil preservation affects the quality of biological information retained and informs understanding of early Earth palaeo-environments. We here show that 1 billion-year-old microfossils from the non-marine Torridon Group are remarkably preserved by a combination of clay minerals and phosphate, with clay minerals providing the highest fidelity of preservation. Fe-rich clay mostly occurs in narrow zones in contact with cellular material and is interpreted as an early microbially-mediated phase enclosing and replacing the most labile biological material. K-rich clay occurs within and exterior to cell envelopes, forming where the supply of Fe had been exhausted. Clay minerals inter-finger with calcium phosphate that co-precipitated with the clays in the sub-oxic zone of the lake sediments. This type of preservation was favoured in sulfate-poor environments where Fe-silicate precipitation could outcompete Fe-sulfide formation. This work shows that clay minerals can provide an exceptionally high fidelity of microfossil preservation and extends the known geological range of this fossilization style by almost 500 Ma. It also suggests that the best-preserved microfossils of this time may be found in low-sulfate environments.

  6. The DECam Plane Survey: Optical Photometry of Two Billion Objects in the Southern Galactic Plane

    Science.gov (United States)

    Schlafly, E. F.; Green, G. M.; Lang, D.; Daylan, T.; Finkbeiner, D. P.; Lee, A.; Meisner, A. M.; Schlegel, D.; Valdes, F.

    2018-02-01

    The DECam Plane Survey is a five-band optical and near-infrared survey of the southern Galactic plane with the Dark Energy Camera at Cerro Tololo. The survey is designed to reach past the main-sequence turn-off of old populations at the distance of the Galactic center through a reddening E(B-V) of 1.5 mag. Typical single-exposure depths are 23.7, 22.8, 22.3, 21.9, and 21.0 mag (AB) in the grizY bands, with seeing around 1″. The footprint covers the Galactic plane with |b| ≲ 4°, 5° > l > -120°. The survey pipeline simultaneously solves for the positions and fluxes of tens of thousands of sources in each image, delivering positions and fluxes of roughly two billion stars with better than 10 mmag precision. Most of these objects are highly reddened and deep in the Galactic disk, probing the structure and properties of the Milky Way and its interstellar medium. The fully-processed images and derived catalogs are publicly available.

  7. Rapid oxygenation of Earth’s atmosphere 2.33 billion years ago

    Science.gov (United States)

    Luo, Genming; Ono, Shuhei; Beukes, Nicolas J.; Wang, David T.; Xie, Shucheng; Summons, Roger E.

    2016-01-01

    Molecular oxygen (O2) is, and has been, a primary driver of biological evolution and shapes the contemporary landscape of Earth’s biogeochemical cycles. Although “whiffs” of oxygen have been documented in the Archean atmosphere, substantial O2 did not accumulate irreversibly until the Early Paleoproterozoic, during what has been termed the Great Oxygenation Event (GOE). The timing of the GOE and the rate at which this oxygenation took place have been poorly constrained until now. We report the transition (that is, from being mass-independent to becoming mass-dependent) in multiple sulfur isotope signals of diagenetic pyrite in a continuous sedimentary sequence in three coeval drill cores in the Transvaal Supergroup, South Africa. These data precisely constrain the GOE to 2.33 billion years ago. The new data suggest that the oxygenation occurred rapidly—within 1 to 10 million years—and was followed by a slower rise in the ocean sulfate inventory. Our data indicate that a climate perturbation predated the GOE, whereas the relationships among GOE, “Snowball Earth” glaciation, and biogeochemical cycling will require further stratigraphic correlation supported with precise chronologies and paleolatitude reconstructions. PMID:27386544
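
    For context, the mass-independent sulfur isotope signal referred to above is conventionally quantified as the deviation Δ33S from the mass-dependent fractionation line (reference exponent 0.515); values near zero indicate mass-dependent behaviour:

    ```latex
    \Delta^{33}\mathrm{S} = \delta^{33}\mathrm{S}
      - 1000 \left[ \left( 1 + \frac{\delta^{34}\mathrm{S}}{1000} \right)^{0.515} - 1 \right]
    ```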

  8. Sharing global CO2 emission reductions among one billion high emitters.

    Science.gov (United States)

    Chakravarty, Shoibal; Chikkatur, Ananth; de Coninck, Heleen; Pacala, Stephen; Socolow, Robert; Tavoni, Massimo

    2009-07-21

    We present a framework for allocating a global carbon reduction target among nations, in which the concept of "common but differentiated responsibilities" refers to the emissions of individuals instead of nations. We use the income distribution of a country to estimate how its fossil fuel CO2 emissions are distributed among its citizens, from which we build up a global CO2 distribution. We then propose a simple rule to derive a universal cap on global individual emissions and find corresponding limits on national aggregate emissions from this cap. All of the world's high CO2-emitting individuals are treated the same, regardless of where they live. Any future global emission goal (target and time frame) can be converted into national reduction targets, which are determined by "Business as Usual" projections of national carbon emissions and in-country income distributions. For example, reducing projected global emissions in 2030 by 13 Gt CO2 would require the engagement of 1.13 billion high emitters, roughly equally distributed in 4 regions: the U.S., the OECD minus the U.S., China, and the non-OECD minus China. We also modify our methodology to place a floor on emissions of the world's lowest CO2 emitters and demonstrate that climate mitigation and alleviation of extreme poverty are largely decoupled.
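
    A schematic of the kind of cap-finding rule described above, not the authors' actual model or data: given per-person emission estimates, choose the universal individual cap so that the emissions above the cap sum to the required global reduction. The population array and reduction target below are hypothetical.

    ```python
    import numpy as np

    def find_universal_cap(per_capita_emissions: np.ndarray, required_reduction: float) -> float:
        """Bisect for the cap c such that sum(max(e - c, 0)) equals the required reduction."""
        lo, hi = 0.0, float(per_capita_emissions.max())
        for _ in range(100):  # plenty of iterations for float precision
            mid = 0.5 * (lo + hi)
            excess = np.clip(per_capita_emissions - mid, 0.0, None).sum()
            if excess > required_reduction:
                lo = mid   # cap too low removes too much; raise it
            else:
                hi = mid
        return 0.5 * (lo + hi)

    # Hypothetical example: 1e6 individuals with lognormal per-capita emissions (tCO2/yr)
    # and a reduction target equal to 10% of total emissions.
    rng = np.random.default_rng(0)
    emissions = rng.lognormal(mean=1.0, sigma=1.0, size=1_000_000)
    cap = find_universal_cap(emissions, 0.10 * emissions.sum())
    print(f"universal cap ≈ {cap:.2f} tCO2/yr; "
          f"{(emissions > cap).mean():.1%} of individuals exceed it")
    ```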

  9. Large data analysis: automatic visual personal identification in a demography of 1.2 billion persons

    Science.gov (United States)

    Daugman, John

    2014-05-01

    The largest biometric deployment in history is now underway in India, where the Government is enrolling the iris patterns (among other data) of all 1.2 billion citizens. The purpose of the Unique Identification Authority of India (UIDAI) is to ensure fair access to welfare benefits and entitlements, to reduce fraud, and enhance social inclusion. Only a minority of Indian citizens have bank accounts; only 4 percent possess passports; and less than half of all aid money reaches its intended recipients. A person who lacks any means of establishing their identity is excluded from entitlements and does not officially exist; thus the slogan of UIDAI is: "To give the poor an identity." This ambitious program enrolls a million people every day, across 36,000 stations run by 83 agencies, with a 3-year completion target for the entire national population. The halfway point was recently passed with more than 600 million persons now enrolled. In order to detect and prevent duplicate identities, every iris pattern that is enrolled is first compared against all others enrolled so far; thus the daily workflow now requires 600 trillion (or 600 million-million) iris cross-comparisons. Avoiding identity collisions (False Matches) requires high biometric entropy, and achieving the tremendous match speed requires phase bit coding. Both of these requirements are being delivered operationally by wavelet methods developed by the author for encoding and comparing iris patterns, which will be the focus of this "Large Data Award" presentation.
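
    The quoted daily workload follows directly from the figures in the record: roughly a million new enrolments per day, each checked against the ~600 million identities already on file:

    ```latex
    10^{6}\ \tfrac{\text{enrolments}}{\text{day}} \times 6 \times 10^{8}\ \text{enrolled identities}
      \;=\; 6 \times 10^{14} = 600\ \text{trillion cross-comparisons per day}
    ```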

  10. Biomechanical metrics of aesthetic perception in dance.

    Science.gov (United States)

    Bronner, Shaw; Shippen, James

    2015-12-01

    The brain may be tuned to evaluate aesthetic perception through perceptual chunking when we observe the grace of the dancer. We modelled biomechanical metrics to explain biological determinants of aesthetic perception in dance. Eighteen expert (EXP) and intermediate (INT) dancers performed développé arabesque in three conditions: (1) slow tempo, (2) slow tempo with relevé, and (3) fast tempo. To compare biomechanical metrics of the kinematic data, we calculated intra-excursion variability, principal component analysis (PCA), and dimensionless jerk for the gesture limb. Observers, all trained dancers, viewed motion-capture stick figures of the trials and ranked each for (1) aesthetic proficiency and (2) movement smoothness. Statistical analyses included group-by-condition repeated-measures ANOVA for the metric data; Mann-Whitney U and Friedman's rank tests for the nonparametric rank data; Spearman's rho correlations to compare aesthetic rankings and metrics; and linear regression to examine which metric best quantified the observers' aesthetic rankings. Analyses of the dance movements revealed differences between groups and conditions, consistent with the view that the brain combines sensory motor elements into integrated units of behaviour. In this representation, the chunk of information which is remembered, and to which the observer reacts, is the elemental mode shape of the motion rather than physical displacements. This suggests that reduction of redundant information to a simplistic dimensionality is related to the experienced observer's aesthetic perception.
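
    As an illustration of one of the smoothness metrics named above, a common dimensionless-jerk formulation (the duration- and amplitude-normalised integral of squared jerk; the record does not state which variant the authors used) can be computed from a sampled trajectory. The movement profile below is a hypothetical stand-in for a marker trace:

    ```python
    import numpy as np

    def dimensionless_jerk(position: np.ndarray, fs: float) -> float:
        """Duration- and amplitude-normalised integrated squared jerk of a 1-D trajectory.

        position : sampled position of e.g. a limb marker, shape (n,)
        fs       : sampling frequency in Hz
        Lower values indicate smoother movement under this normalisation.
        """
        dt = 1.0 / fs
        jerk = np.gradient(np.gradient(np.gradient(position, dt), dt), dt)
        duration = (len(position) - 1) * dt
        amplitude = position.max() - position.min()
        return float((jerk ** 2).sum() * dt * duration ** 5 / amplitude ** 2)

    # Hypothetical example: a minimum-jerk-like point-to-point movement sampled at 120 Hz
    fs, T = 120.0, 1.5
    t = np.linspace(0.0, T, int(fs * T))
    s = t / T
    pos = 10 * (10 * s**3 - 15 * s**4 + 6 * s**5)  # classic minimum-jerk profile, 10 cm
    print(f"dimensionless jerk ≈ {dimensionless_jerk(pos, fs):.1f}")
    ```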

  11. Pragmatic Metrics for Monitoring Science Data Centers

    Science.gov (United States)

    Moses, J. F.; Behnke, J.

    2003-12-01

    Science data metrics and their analysis are critical components to the end-to-end data and service flow for science data centers. The Earth Science Data and Information System Project has collected records of EOS science data archive, processing and distribution metrics from NASA's Distributed Active Archive Centers since 1996. The ESDIS Science Operations Office and the DAAC data centers have cooperated to develop a DAAC metrics reporting capability called the EOSDIS Data Gathering and Reporting Systems (EDGRS). This poster illustrates EDGRS processes and metrics data applications. EDGRS currently accesses detailed archive and distribution metrics from nine DAAC sites and transfers results to a centralized collection system on a routine basis. After automated quality checks the records are immediately made available through a web-based Graphic User Interface. Users can obtain standard graphs and prepare custom queries to generate specific reports for monitoring science data processing progress. Applications are illustrated that explore methods for performing data availability studies and performance analyses. Improvements are planned to support granule-level science data accounting and characterization of product distribution.

  12. Future of the PCI Readmission Metric.

    Science.gov (United States)

    Wasfy, Jason H; Yeh, Robert W

    2016-03-01

    Between 2013 and 2014, the Centers for Medicare and Medicaid Services and the National Cardiovascular Data Registry publicly reported risk-adjusted 30-day readmission rates after percutaneous coronary intervention (PCI) as a pilot project. A key strength of this public reporting effort was risk adjustment with clinical rather than administrative data. Furthermore, because readmission after PCI is common, expensive, and preventable, this metric has substantial potential to improve quality and value in American cardiology care. Despite this, concerns about the metric exist. For example, few PCI readmissions are caused by procedural complications, limiting the extent to which improved procedural technique can reduce readmissions. Also, similar to other readmission measures, PCI readmission is associated with socioeconomic status and race. Accordingly, the metric may unfairly penalize hospitals that care for underserved patients. Perhaps in the context of these limitations, the Centers for Medicare and Medicaid Services has not yet included PCI readmission among the metrics that determine Medicare financial penalties. Nevertheless, provider organizations may still wish to focus on this metric to improve value for cardiology patients. PCI readmission is associated with low-risk chest discomfort and patient anxiety. Therefore, patient education, improved triage mechanisms, and improved care coordination offer opportunities to minimize PCI readmissions. Because PCI readmission is common and costly, reducing PCI readmission offers provider organizations a compelling target to improve the quality of care, and also performance in contracts involving shared financial risk. © 2016 American Heart Association, Inc.

  13. Exploring model-based target discrimination metrics

    Science.gov (United States)

    Witus, Gary; Weathersby, Marshall

    2004-08-01

    Visual target discrimination has occurred when the observer can say "I see a target THERE!" and can designate the target location. Target discrimination occurs when a perceived shape is sufficiently similar to one or more of the instances the observer has been trained on. Marr defined vision as "knowing what is where by seeing." Knowing "what" requires prior knowledge. Target discrimination requires model-based visual processing. Model-based signature metrics attempt to answer the question "to what extent does the target in the image resemble a training image?" Model-based signature metrics attempt to represent the effects of high-level top-down visual cognition, in addition to low-level bottom-up effects. Recent advances in realistic 3D target rendering and computer-vision object recognition have made model-based signature metrics more practical. The human visual system almost certainly does NOT use the same processing algorithms as computer-vision object recognition, but some processing elements and the overall effects are similar. It remains to be determined whether model-based metrics explain the variance in human performance. The purpose of this paper is to explain and illustrate the model-based approach to signature metrics.

  14. NASA education briefs for the classroom. Metrics in space

    Science.gov (United States)

    1982-01-01

    The use of metric measurement in space is summarized for classroom use. Advantages of the metric system over the English measurement system are described. Some common metric units are defined, as are special units for astronomical study. International system unit prefixes and a conversion table of metric/English units are presented. Questions and activities for the classroom are recommended.

  15. Codes in W*-Metric Spaces: Theory and Examples

    Science.gov (United States)

    Bumgardner, Christopher J.

    2011-01-01

    We introduce a "W*"-metric space, which is a particular approach to non-commutative metric spaces where a "quantum metric" is defined on a von Neumann algebra. We generalize the notion of a quantum code and quantum error correction to the setting of finite dimensional "W*"-metric spaces, which includes codes and error correction for classical…

  16. On Convergence of Fixed Points in Fuzzy Metric Spaces

    Directory of Open Access Journals (Sweden)

    Yonghong Shen

    2013-01-01

    Full Text Available We mainly focus on the convergence of the sequence of fixed points for some different sequences of contraction mappings or fuzzy metrics in fuzzy metric spaces. Our results provide a novel research direction for fixed point theory in fuzzy metric spaces as well as a substantial extension of several important results from classical metric spaces.

  17. Galaxy evolution. Evidence for mature bulges and an inside-out quenching phase 3 billion years after the Big Bang.

    Science.gov (United States)

    Tacchella, S; Carollo, C M; Renzini, A; Förster Schreiber, N M; Lang, P; Wuyts, S; Cresci, G; Dekel, A; Genzel, R; Lilly, S J; Mancini, C; Newman, S; Onodera, M; Shapley, A; Tacconi, L; Woo, J; Zamorani, G

    2015-04-17

    Most present-day galaxies with stellar masses ≥10^11 solar masses show no ongoing star formation and are dense spheroids. Ten billion years ago, similarly massive galaxies were typically forming stars at rates of hundreds of solar masses per year. It is debated how star formation ceased, on which time scales, and how this "quenching" relates to the emergence of dense spheroids. We measured stellar mass and star-formation rate surface density distributions in star-forming galaxies at redshift 2.2 with ~1-kiloparsec resolution. We find that, in the most massive galaxies, star formation is quenched from the inside out, on time scales of less than 1 billion years in the inner regions and up to a few billion years in the outer disks. These galaxies sustain high star-formation activity at large radii, while hosting fully grown and already quenched bulges in their cores. Copyright © 2015, American Association for the Advancement of Science.

  18. AREVA - First quarter 2011 revenue: 2.7% growth like for like to 1.979 billion euros

    International Nuclear Information System (INIS)

    2011-01-01

    The group reported consolidated revenue of 1.979 billion euros in the first quarter of 2011, representing 2.2% growth compared with the first quarter of 2010 (+2.7% like-for-like). The increase was driven by the Mining / Front End Business Group (+20.8% LFL). Revenue from outside France rose 12.0% to 1.22 billion euros and represented 62% of total revenue. The impacts of foreign exchange and changes in consolidation scope were negligible during the period. The March 11 events in Japan had no significant impact on the group's performance in the first quarter of 2011. The group's backlog of 43.5 billion euros at March 31, 2011 was stable in relation to March 31, 2010. Growth in the backlog of the Mining / Front End and Renewable Energies Business Groups offset the partial depletion of the backlog in the Reactors and Services and Back End Business Groups as contracts were completed.

  19. Backlog at December 31, 2007: euro 39.8 billion, up by 55% from year-end 2006. 2007 sales revenue: euro 11.9 billion, up by 9.8% (+10.4% like-for-like)

    International Nuclear Information System (INIS)

    2008-01-01

    The AREVA group's backlog reached a record level of euro 39.834 billion as of December 31, 2007, up by 55% from year-end 2006. In Nuclear, the backlog was euro 34.927 billion at year-end 2007 (+58%), due in particular to the signature of a contract of record value with the Chinese utility CGNPC. The series of agreements concluded provides, among other things, for the construction of two new-generation EPR nuclear islands and the supply of all the materials and services needed for their operation through 2027. CGNPC also bought 35% of the production of UraMin, the mining company acquired by AREVA in August 2007. Industrial cooperation on the back end of the cycle was launched with the signature of an agreement between China and France. In addition, the group signed several long-term contracts of significant value, particularly with KHNP of South Korea, EDF and Japanese utilities. The Transmission and Distribution division won several major contracts in Libya and Qatar at the end of the year totalling close to euro 750 million. For the entire year, new orders grew by 34% to euro 5.816 billion; the backlog, meanwhile, grew by 40% to euro 4.906 billion at year-end. The group posted sales revenue of euro 11.923 billion in 2007, up by 9.8% (+10.4% like-for-like) relative to 2006 sales of euro 10.863 billion. Sales revenue for the fourth quarter of 2007 rose to euro 3.858 billion, for growth of 16.7% (+18.8% like-for-like) over one year. Sales revenue for the year was marked by: - Growth of 7.6% (+10.6% like-for-like) in Front End sales revenue, which rose to euro 3.140 billion; the division's Enrichment operations posted strong growth. - Sales up by 17.5% (+15.2% like-for-like) to euro 2.717 billion in the Reactors and Services division; sales revenue was driven in particular by the growth of Services operations, after weak demand in 2006, by progress on OL3 construction, and by the start of Flamanville 3, the second EPR. For the Back End division

  20. Rainbow Rindler metric and Unruh effect

    Science.gov (United States)

    Yadav, Gaurav; Komal, Baby; Majhi, Bibhas Ranjan

    2017-11-01

    The energy of a particle moving on a space-time can, in principle, affect the background metric. The modifications depend on the ratio of the particle's energy to the Planck energy; this framework is known as rainbow gravity. Here we find explicit expressions for the coordinate transformations from rainbow Minkowski space-time to an accelerated frame. The corresponding metric is also obtained, which we call the rainbow Rindler metric. As far as we are aware, this has not previously been done in a concrete manner. Here it is derived from first principles, so all the parameters are properly identified. The advantage of this is that the calculated Unruh temperature is compatible with the Hawking temperature of the rainbow black hole horizon obtained earlier. Since the accelerated frame is of considerable importance in revealing various properties of gravity, we believe that the present result will not only fill that gap but also help to explore different aspects of the rainbow gravity paradigm.
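
    For orientation, the ordinary Unruh temperature for a frame with proper acceleration a, to which a rainbow-modified expression would be expected to reduce when the particle energy is far below the Planck scale, is the standard result (the rainbow-corrected form itself is not reproduced here):

    ```latex
    T_{\mathrm{U}} = \frac{\hbar\, a}{2 \pi\, c\, k_{\mathrm{B}}}
    ```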

  1. Evaluating and Estimating the WCET Criticality Metric

    DEFF Research Database (Denmark)

    Jordan, Alexander

    2014-01-01

    Static analysis tools that are used for worst-case execution time (WCET) analysis of real-time software just provide partial information on an analyzed program. Only the longest-executing path, which currently determines the WCET bound is indicated to the programmer. This limited view can prevent...... a programmer (or compiler) from targeting optimizations the right way. A possible resort is to use a metric that targets WCET and which can be efficiently computed for all code parts of a program. Similar to dynamic profiling techniques, which execute code with input that is typically expected...... to estimate the Criticality metric, by relaxing the precision of WCET analysis. Through this, we can reduce analysis time by orders of magnitude, while only introducing minor error. To evaluate our estimation approach and share our garnered experience using the metric, we evaluate real-time programs, which...

  2. SOCIAL METRICS APPLIED TO SMART TOURISM

    Directory of Open Access Journals (Sweden)

    O. Cervantes

    2016-09-01

    Full Text Available We present a strategy to make productive use of semantically-related social data, from a user-centered semantic network, in order to help users (tourists and citizens in general) to discover cultural heritage, points of interest and available services in a smart city. This data can be used to personalize recommendations in a smart tourism application. Our approach is based on flow centrality metrics typically used in social network analysis: flow betweenness, flow closeness and eccentricity. These metrics are useful to discover relevant nodes within the network, yielding nodes that can be interpreted as suggestions (venues or services) to users. We describe the semantic network built on a graph model, as well as the social metrics algorithms used to produce recommendations. We also present challenges and results from a prototypical implementation applied to the case study of the City of Puebla, Mexico.
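
    A minimal sketch of how the three centrality measures named above could be computed for a venue/service graph. It uses networkx's current-flow variants as stand-ins for the flow centralities (the authors' own algorithms and data model are not reproduced here), and the toy graph is hypothetical:

    ```python
    import networkx as nx

    # Hypothetical user-venue-service graph; edge weights model relatedness strength.
    G = nx.Graph()
    G.add_weighted_edges_from([
        ("user_a", "cathedral", 3.0), ("user_a", "museum", 1.0),
        ("user_b", "museum", 2.0), ("user_b", "cafe", 1.0),
        ("cathedral", "guided_tour", 2.0), ("museum", "guided_tour", 1.0),
        ("cafe", "cathedral", 1.0),
    ])

    # Current-flow (random-walk) centralities as stand-ins for flow betweenness/closeness.
    betweenness = nx.current_flow_betweenness_centrality(G, weight="weight")
    closeness = nx.current_flow_closeness_centrality(G, weight="weight")
    eccentricity = nx.eccentricity(G)  # max hop-distance to any other node

    # Rank venues/services by betweenness as candidate recommendations.
    candidates = [n for n in G if not n.startswith("user_")]
    for node in sorted(candidates, key=betweenness.get, reverse=True):
        print(f"{node:12s} betweenness={betweenness[node]:.3f} "
              f"closeness={closeness[node]:.3f} ecc={eccentricity[node]}")
    ```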

  3. The conformal metric structure of Geometrothermodynamics

    Science.gov (United States)

    Bravetti, Alessandro; Lopez-Monsalvo, Cesar S.; Nettel, Francisco; Quevedo, Hernando

    2013-03-01

    We present a thorough analysis of the invariance of the most widely used metrics in the Geometrothermodynamics programme. We centre our attention on the invariance of the curvature of the space of equilibrium states under a change of fundamental representation. Assuming that the systems under consideration can be described by a fundamental relation which is a homogeneous function of a definite order, we demonstrate that such invariance is only compatible with total Legendre transformations in the present form of the programme. We give the explicit form of a metric which is invariant under total Legendre transformations and whose induced metric produces a curvature which is independent of the fundamental representation. Finally, we study a generic system with two degrees of freedom whose fundamental relation is homogeneous of order one.

  4. Steiner trees for fixed orientation metrics

    DEFF Research Database (Denmark)

    Brazil, Marcus; Zachariasen, Martin

    2009-01-01

    We consider the problem of constructing Steiner minimum trees for a metric defined by a polygonal unit circle (corresponding to s = 2 weighted legal orientations in the plane). A linear-time algorithm to enumerate all angle configurations for degree three Steiner points is given. We provide...... a simple proof that the angle configuration for a Steiner point extends to all Steiner points in a full Steiner minimum tree, such that at most six orientations suffice for edges in a full Steiner minimum tree. We show that the concept of canonical forms originally introduced for the uniform orientation...... metric generalises to the fixed orientation metric. Finally, we give an O(s n) time algorithm to compute a Steiner minimum tree for a given full Steiner topology with n terminal leaves....

  5. A bi-metric theory of gravitation

    International Nuclear Information System (INIS)

    Rosen, N.

    1975-01-01

    The bi-metric theory of gravitation proposed previously is simplified in that the auxiliary conditions are discarded, the two metric tensors being tied together only by means of the boundary conditions. Some of the properties of the field of a particle are investigated; there is no black hole, and it appears that no gravitational collapse can take place. Although the proposed theory and general relativity are at present observationally indistinguishable, some differences are pointed out which may some day be susceptible of observation. An alternative bi-metric theory is considered which gives for the precession of the perihelion 5/6 of the value given by general relativity; it seems less satisfactory than the present theory from the aesthetic point of view. (author)

  6. Metric Learning for Hyperspectral Image Segmentation

    Science.gov (United States)

    Bue, Brian D.; Thompson, David R.; Gilmore, Martha S.; Castano, Rebecca

    2011-01-01

    We present a metric learning approach to improve the performance of unsupervised hyperspectral image segmentation. Unsupervised spatial segmentation can assist both user visualization and automatic recognition of surface features. Analysts can use spatially-continuous segments to decrease noise levels and/or localize feature boundaries. However, existing segmentation methods use task-agnostic measures of similarity. Here we learn task-specific similarity measures from training data, improving segment fidelity to classes of interest. Multiclass Linear Discriminant Analysis produces a linear transform that optimally separates a labeled set of training classes. This defines a distance metric that generalizes to new scenes, enabling graph-based segmentation that emphasizes key spectral features. We describe tests based on data from the Compact Reconnaissance Imaging Spectrometer (CRISM) in which learned metrics improve segment homogeneity with respect to mineralogical classes.
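
    A minimal sketch of the general idea (fit a multiclass linear discriminant transform on labelled spectra, then measure pixel similarity in the transformed space). This is not the authors' pipeline, and the data here are random stand-ins for real hyperspectral spectra:

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(42)

    # Stand-in training data: 300 labelled "spectra" with 50 bands and 3 mineral classes.
    n_bands, n_classes = 50, 3
    class_means = rng.normal(size=(n_classes, n_bands))
    labels = rng.integers(0, n_classes, size=300)
    spectra = class_means[labels] + 0.5 * rng.normal(size=(300, n_bands))

    # Fit multiclass LDA; its projection defines a task-specific distance metric.
    lda = LinearDiscriminantAnalysis(n_components=n_classes - 1).fit(spectra, labels)

    def learned_distance(x: np.ndarray, y: np.ndarray) -> float:
        """Euclidean distance between two spectra in the LDA-transformed space."""
        xt, yt = lda.transform(np.vstack([x, y]))
        return float(np.linalg.norm(xt - yt))

    # Pixels from the same class should be closer under the learned metric.
    a = class_means[0] + 0.5 * rng.normal(size=n_bands)
    b = class_means[0] + 0.5 * rng.normal(size=n_bands)
    c = class_means[1] + 0.5 * rng.normal(size=n_bands)
    print(f"same class: {learned_distance(a, b):.2f}   different class: {learned_distance(a, c):.2f}")
    ```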

  7. Social Metrics Applied to Smart Tourism

    Science.gov (United States)

    Cervantes, O.; Gutiérrez, E.; Gutiérrez, F.; Sánchez, J. A.

    2016-09-01

    We present a strategy to make productive use of semantically-related social data, from a user-centered semantic network, in order to help users (tourists and citizens in general) to discover cultural heritage, points of interest and available services in a smart city. This data can be used to personalize recommendations in a smart tourism application. Our approach is based on flow centrality metrics typically used in social network analysis: flow betweenness, flow closeness and eccentricity. These metrics are useful to discover relevant nodes within the network, yielding nodes that can be interpreted as suggestions (venues or services) to users. We describe the semantic network built on a graph model, as well as the social metrics algorithms used to produce recommendations. We also present challenges and results from a prototypical implementation applied to the case study of the City of Puebla, Mexico.

  8. Metrical and dynamical aspects in complex analysis

    CERN Document Server

    2017-01-01

    The central theme of this reference book is the metric geometry of complex analysis in several variables. Bridging a gap in the current literature, the text focuses on the fine behavior of the Kobayashi metric of complex manifolds and its relationships to dynamical systems, hyperbolicity in the sense of Gromov and operator theory, all very active areas of research. The modern points of view expressed in these notes, collected here for the first time, will be of interest to academics working in the fields of several complex variables and metric geometry. The different topics are treated coherently and include expository presentations of the relevant tools, techniques and objects, which will be particularly useful for graduate and PhD students specializing in the area.

  9. Robustness Metrics: Consolidating the multiple approaches to quantify Robustness

    DEFF Research Database (Denmark)

    Göhler, Simon Moritz; Eifler, Tobias; Howard, Thomas J.

    2016-01-01

    ... this ambiguity can have significant influence on the strategies used to combat variability, the way it is quantified and, ultimately, the quality of the final design. In this contribution the literature on robustness metrics was systematically reviewed. From the 108 relevant publications found, 38 metrics were determined to be conceptually different from one another. The metrics were classified by their meaning and interpretation, based on the types of information necessary to calculate the metrics. Four different classes were identified: 1) sensitivity robustness metrics; 2) size of feasible design space robustness metrics; 3) functional expectancy and dispersion robustness metrics; and 4) probability of conformance robustness metrics. The goal was to give a comprehensive overview of robustness metrics and guidance to scholars and practitioners to understand the different types of robustness metrics ...

  10. Failure analysis of clutch disc in a two-ton truck; Análisis de falla del disco de embrague de un camión de 2 ton

    Directory of Open Access Journals (Sweden)

    Diego Leon Perea Salcedo

    2013-06-01

    Full Text Available This article presents the failure analysis of a clutch disc in a two-ton truck. The failed part was made of high-carbon spring steel. Although the fracture surface was very thin and difficult to interpret, its examination revealed chevron marks and the presence of faint beach marks. The failed parts were made of hardened 1070 carbon steel. The cushioning plate experienced alternating shear and bending stresses. The final failure was caused by shock loading.

  11. Failure analysis of clutch disc in a two-ton truck; Análisis de falla del disco de embrague de un camión de 2 ton

    OpenAIRE

    Diego Leon Perea Salcedo

    2013-01-01

    This article presents the failure analysis of a clutch disc in a two-ton truck. The failed part was made of high-carbon spring steel. Although the fracture surface was very thin and difficult to interpret, its examination revealed chevron marks and the presence of faint beach marks. The failed parts were made of hardened 1070 carbon steel. The cushioning plate experienced alternating shear and bending stresses. The final failure was caused by shock loading.

  12. On metric divergences of probability measures

    Czech Academy of Sciences Publication Activity Database

    Vajda, Igor

    2009-01-01

    Roč. 45, č. 6 (2009), s. 885-900 ISSN 0023-5954 R&D Projects: GA MŠk(CZ) 1M0572; GA ČR GA102/07/1131 Institutional research plan: CEZ:AV0Z10750506 Keywords : Metric divergences * Hellinger divergence * Le Cam divergence * Jensen-Shannon divergence * Total variation Subject RIV: BD - Theory of Information Impact factor: 0.445, year: 2009 http://library.utia.cas.cz/separaty/2010/SI/vajda-on metric divergences of probability measures.pdf
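
    For reference, three of the divergences named in the keywords are commonly defined as follows (normalization conventions vary between authors; the paper is concerned with when powers of such divergences are metrics):

    ```latex
    % D(\cdot\|\cdot) denotes the Kullback--Leibler divergence.
    \begin{align*}
      V(P,Q)           &= \sum_i \lvert p_i - q_i \rvert
                        && \text{(total variation)}\\
      H^2(P,Q)         &= \tfrac{1}{2}\sum_i \bigl(\sqrt{p_i}-\sqrt{q_i}\bigr)^2
                        && \text{(squared Hellinger)}\\
      \mathrm{JS}(P,Q) &= \tfrac{1}{2}\,D(P\,\|\,M) + \tfrac{1}{2}\,D(Q\,\|\,M),
                         \quad M = \tfrac{1}{2}(P+Q)
                        && \text{(Jensen--Shannon)}
    \end{align*}
    ```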

  13. Federal Procurement Metrication Appropriateness and Methods.

    Science.gov (United States)

    1982-09-18

    Federal Procurement Metrication Appropriateness and Methods (Final Report). Science Management Corporation, Washington DC; M. A. Coella; 18 Sep 82. ... reflect the views of the U.S. Metric Board. ... Science Management Corporation, 1120 Connecticut Avenue ...

  14. Jacobi-Maupertuis metric and Kepler equation

    Science.gov (United States)

    Chanda, Sumanto; Gibbons, Gary William; Guha, Partha

    This paper studies the application of the Jacobi-Eisenhart lift, Jacobi metric and Maupertuis transformation to the Kepler system. We start by reviewing fundamentals and the Jacobi metric. Then we study various ways to apply the lift to Kepler-related systems: first as conformal description and Bohlin transformation of Hooke’s oscillator, second in contact geometry and third in Houri’s transformation [T. Houri, Liouville integrability of Hamiltonian systems and spacetime symmetry (2016), www.geocities.jp/football_physician/publication.html], coupled with Milnor’s construction [J. Milnor, On the geometry of the Kepler problem, Am. Math. Mon. 90 (1983) 353-365] with eccentric anomaly.
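
    For context, the Jacobi (Maupertuis) metric reviewed here is, for a natural system with kinetic metric g_ij and potential V at fixed energy E, commonly written as follows; its geodesics reproduce the mechanical trajectories at that energy:

    ```latex
    ds_J^2 \;=\; 2\,\bigl(E - V(q)\bigr)\, g_{ij}\, dq^i\, dq^j
    ```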

  15. What Metrics Accurately Reflect Surgical Quality?

    Science.gov (United States)

    Ibrahim, Andrew M; Dimick, Justin B

    2018-01-29

    Surgeons are increasingly under pressure to measure and improve their quality. While there is broad consensus that we ought to track surgical quality, there is far less agreement about which metrics matter most. This article reviews the important statistical concepts of case mix and chance as they apply to understanding the observed wide variation in surgical quality. We then discuss the benefits and drawbacks of current measurement strategies through the framework of structure, process, and outcomes approaches. Finally, we describe emerging new metrics, such as video evaluation and network optimization, that are likely to take on an increasingly important role in the future of measuring surgical quality.

  16. A generalization of Vaidya's radiation metric

    International Nuclear Information System (INIS)

    Gleiser, R.J.; Kozameh, C.N.

    1981-01-01

    In this paper it is shown that if Vaidya's radiation metric is considered from the point of view of kinetic theory in general relativity, the corresponding phase space distribution function can be generalized in a particular way. The new family of spherically symmetric radiation metrics obtained contains Vaidya's as a limiting situation. The Einstein field equations are solved in a ''comoving'' coordinate system. Two arbitrary functions of a single variable are introduced in the process of solving these equations. Particular examples considered are a stationary solution, a nonvacuum solution depending on a single parameter, and several limiting situations. (author)
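
    For reference (sign and coordinate conventions vary), the Vaidya radiating metric that this work generalizes is usually written with a retarded null coordinate u and mass function m(u) as:

    ```latex
    ds^2 \;=\; -\Bigl(1 - \frac{2m(u)}{r}\Bigr)\,du^2 \;-\; 2\,du\,dr
               \;+\; r^2\bigl(d\theta^2 + \sin^2\theta\, d\varphi^2\bigr)
    ```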

  17. U.S. Air Force Spent Billions on F117 Engine Sustainment Without Knowing What a Fair Price Was

    Science.gov (United States)

    2016-03-11

    Report No. DODIG-2016-059, March 11, 2016. Results in Brief: U.S. Air Force Spent Billions on F117 Engine Sustainment Without Knowing What a Fair Price Was. ... reasonable. It states that "the fact that a price is included in a catalog does not, in and of itself, make it fair and reasonable" and further refers ... Visit us at www.dodig.mil

  18. On the Plane Geometry with Generalized Absolute Value Metric

    Directory of Open Access Journals (Sweden)

    A. Bayar

    2008-01-01

    Full Text Available Metric spaces are among the most important and widely studied topics in mathematics. In recent years, mathematicians began to investigate metrics other than the Euclidean metric. These metrics also find their place in the computer age, in addition to their importance in geometry. In this paper, we consider the plane geometry with the generalized absolute value metric, define trigonometric functions and a norm, and then give a plane tiling example for engineers involving Schwarz's inequality in this plane.

  19. Layout finishing of a 28nm, 3 billions transistors, multi-core processor

    Science.gov (United States)

    Morey-Chaisemartin, Philippe; Beisser, Eric

    2013-06-01

    Designing a fully new 256-core processor is a great challenge for a fabless startup. In addition to all the architecture, functionality and timing issues, the layout by itself is a bottleneck due to all the process constraints of a 28nm technology. As developers of advanced layout finishing solutions, we were involved in the design flow of this huge chip with its 3 billion transistors. We had to face the issue of dummy pattern instantiation with respect to design constraints. All the design rules to generate the "dummies" are clearly defined in the Design Rule Manual, and some automatic procedures are provided by the foundry itself, but these routines do not take the designer's requests into account. Such a chip embeds both digital parts and analog modules for clock and power management. Each of these two design types has its own set of constraints. In both cases, the insertion of dummies should not introduce unexpected variations leading to malfunctions. For example, on digital parts where signal race conditions are critical on long wires or buses, the introduction of uncontrolled parasitics along these nets is highly critical. For analog devices such as high-frequency and high-sensitivity comparators, the exact symmetry of the two parts of a current mirror generator should be guaranteed. Thanks to the easily customizable features of our dummy insertion tool, we were able to configure it to meet all the designer requirements as well as the process constraints. This paper presents all these advanced key features as well as the layout tricks used to fulfill all requirements.

  20. No Photon Left Behind: How Billions of Spectral Lines are Transforming Planetary Sciences

    Science.gov (United States)

    Villanueva, Geronimo L.

    2014-06-01

    With the advent of realistic potential energy surface (PES) and dipole moment surface (DMS) descriptions, theoretically computed linelists can now synthesize accurate spectral parameters for billions of spectral lines sampling the untamed high-energy molecular domain. Although the initial driver for these databases was the characterization of stellar spectra, these theoretical databases, in combination with decades of precise experimental studies (nicely compiled in community databases such as HITRAN and GEISA), are leading to unprecedented precision in the characterization of planetary atmospheres. Cometary sciences are among the most affected by this spectroscopic revolution. Even though comets are relatively cold bodies (T~100 K), their infrared molecular emission is mainly defined by non-LTE solar fluorescence induced by a high-energy source (the Sun, T~5600 K). In order to interpret high-resolution spectra of comets acquired with extremely powerful telescopes (e.g., Keck, VLT, NASA-IRTF), we have developed advanced non-LTE fluorescence models that integrate the high-energy dynamic range of ab initio databases (e.g., BT2, VTT, HPT2, BYTe, TROVE) and the precision of laboratory and semi-empirical compilations (e.g., HITRAN, GEISA, CDMS, WKMC, SELP, IUPAC). These new models allow us to calculate realistic non-LTE pumps, cascades, branching ratios, and emission rates for a broad range of excitation regimes for H2O, HDO, HCN, HNC and NH3. We have applied elements of these compilations to the study of Mars spectra, and we are now exploring their application to modeling non-LTE emission in exoplanets. In this presentation, we present the application of these advanced models to interpret high-resolution spectra of comets, Mars and exoplanets.

  1. Measured performance of a 3 ton LiBr absorption water chiller and its effect on cooling system operation

    Science.gov (United States)

    Namkoong, D.

    1976-01-01

    A three ton lithium bromide absorption water chiller was tested for a number of conditions involving hot water input, chilled water, and the cooling water. The primary influences on chiller capacity were the hot water inlet temperature and the cooling water inlet temperature. One combination of these two parameters extended the output to as much as 125% of design capacity, but no combination could lower the capacity to below 60% of design. A cooling system was conceptually designed so that it could provide several modes of operation. Such flexibility is needed for any solar cooling system to be able to accommodate the varying solar energy collection and the varying building demand. It was concluded that a three-ton absorption water chiller with the kind of performance that was measured can be incorporated into a cooling system such as that proposed, to provide efficient cooling over the specified ranges of operating conditions.

  2. Measured performance of a 3-ton LiBr absorption water chiller and its effect on cooling system operation

    Science.gov (United States)

    Namkoong, D.

    1976-01-01

    A 3-ton lithium bromide absorption water chiller was tested for a number of conditions involving hot-water input, chilled water, and the cooling water. The primary influences on chiller capacity were the hot water inlet temperature and the cooling water inlet temperature. One combination of these two parameters extended the output to as much as 125% of design capacity, but no combination could lower the capacity to below 60% of design. A cooling system was conceptually designed so that it could provide several modes of operation. Such flexibility is needed for any solar cooling system to be able to accommodate the varying solar energy collection and the varying building demand. It is concluded that a 3-ton absorption water chiller with the kind of performance that was measured can be incorporated into a cooling system such as that proposed, to provide efficient cooling over the specified ranges of operating conditions.

  3. Caractérisation d'un béton autoplaçant avec addition de laitier ...

    African Journals Online (AJOL)

    DK

    Reinforcing concrete with metallic fibres, synthetic fibres or a hybrid system is a classical approach. ... tensile behaviour reflecting a rather brittle response; the constitutive law is linear. Table 3: Fibre characteristics. Fibre type: polypropylene, Diss ... granulated ... and glass powder.

  4. Metrics and Evaluation Models for Accessible Television

    DEFF Research Database (Denmark)

    Li, Dongxiao; Looms, Peter Olaf

    2014-01-01

    ... to compare. Using case studies from three emerging economies (Argentina, Brazil and China) as well as industrialized nations (including Canada, Denmark, the United Kingdom and the USA), this paper examines the situation facing television accessibility. Having identified and discussed existing metrics...

  5. Vehicle Integrated Prognostic Reasoner (VIPR) Metric Report

    Science.gov (United States)

    Cornhill, Dennis; Bharadwaj, Raj; Mylaraswamy, Dinkar

    2013-01-01

    This document outlines a set of metrics for evaluating the diagnostic and prognostic schemes developed for the Vehicle Integrated Prognostic Reasoner (VIPR), a system-level reasoner that encompasses the multiple levels of large, complex systems such as those for aircraft and spacecraft. VIPR health managers are organized hierarchically and operate together to derive diagnostic and prognostic inferences from symptoms and conditions reported by a set of diagnostic and prognostic monitors. For layered reasoners such as VIPR, the overall performance cannot be evaluated by metrics solely directed toward timely detection and accuracy of estimation of the faults in individual components. Among other factors, overall vehicle reasoner performance is governed by the effectiveness of the communication schemes between monitors and reasoners in the architecture, and the ability to propagate and fuse relevant information to make accurate, consistent, and timely predictions at different levels of the reasoner hierarchy. We outline an extended set of diagnostic and prognostics metrics that can be broadly categorized as evaluation measures for diagnostic coverage, prognostic coverage, accuracy of inferences, latency in making inferences, computational cost, and sensitivity to different fault and degradation conditions. We report metrics from Monte Carlo experiments using two variations of an aircraft reference model that supported both flat and hierarchical reasoning.

  6. All You Need to Know About Metric

    Science.gov (United States)

    American Metric Journal, 1974

    1974-01-01

    Information found necessary for South Africa's citizens to learn during their recent conversion to the metric system is presented. Twelve terms and prefixes are suggested that satisfy practically all ordinary needs. Tables are given for the most commonly used measures, with relationships between different units indicated. (LS)

  7. Strong ideal convergence in probabilistic metric spaces

    Indian Academy of Sciences (India)

    In the present paper we introduce the concepts of strongly ideal convergent sequence and strong ideal Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. ... also important applications in nonlinear analysis [2]. The theory was brought to ...

  8. Calabi–Yau metrics and string compactification

    Directory of Open Access Journals (Sweden)

    Michael R. Douglas

    2015-09-01

    Full Text Available Yau proved an existence theorem for Ricci-flat Kähler metrics in the 1970s, but we still have no closed form expressions for them. Nevertheless there are several ways to get approximate expressions, both numerical and analytical. We survey some of this work and explain how it can be used to obtain physical predictions from superstring theory.

  9. Reuse metrics and measurement: A framework

    Science.gov (United States)

    Reifer, Donald J.

    1990-01-01

    The lessons learned and experience gleaned are described by those who have started to implement the reuse metrics and measurement framework used in controlling the development of common avionics and software for its affiliated aircraft programs. The framework was developed to permit the measurement of the long-term costs/benefits resulting from the creation and use of Reusable Software Objects (RSOs). The framework also monitors the efficiency and effectiveness of the Software Reuse Library (SRL). The metrics and measurement framework, which was established to allow determinations and findings to be made relative to software reuse, is defined. Seven criteria that were used to guide the establishment of the proposed reuse framework are discussed. Object recapture and creation metrics are explained, along with their normalized use in effort, productivity, and quality determination. Single and multiple reuse instance versions of a popular cost model are presented, which use these metrics and the proposed measurement scheme to predict software effort and duration under various reuse assumptions. Studies using this model to predict actuals taken from the RCI database of over 1000 completed projects are discussed.

  10. Assessing Software Quality Through Visualised Cohesion Metrics

    Directory of Open Access Journals (Sweden)

    Timothy Shih

    2001-05-01

    Full Text Available Cohesion is one of the most important factors for software quality, as well as maintainability, reliability and reusability. Module cohesion is defined as a quality attribute that seeks to measure the singleness of purpose of a module. A module of poor quality can be a serious obstacle to system quality. In order to design software of good quality, software managers and engineers need to introduce cohesion metrics to measure and produce desirable software. Highly cohesive software is considered a desirable construction. In this paper, we propose a function-oriented cohesion metric based on the analysis of live variables, live span and the visualization of the processing-element dependency graph. We give six typical cohesion examples to be measured as our experiments and justification. The result is a well-defined, well-normalized, well-visualized and well-tested cohesion metric that indicates, and thus helps enhance, software cohesion strength. Furthermore, this cohesion metric can easily be incorporated into software CASE tools to help software engineers improve software quality.
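
    As a toy illustration only (a much-simplified live-span score, not the metric defined in the paper), one can model each statement as the set of variables it touches and measure how concentrated the variables' live spans are:

    ```python
    # Toy live-span-based cohesion score: a variable is treated as "live" from the
    # first to the last statement in which it appears; the score is the average
    # fraction of the function's variables that are live at each statement.
    def cohesion_score(statements):
        """statements: list of sets, each holding the variables used/defined there."""
        all_vars = set().union(*statements)
        spans = {v: (min(i for i, s in enumerate(statements) if v in s),
                     max(i for i, s in enumerate(statements) if v in s))
                 for v in all_vars}
        live_counts = [sum(1 for lo, hi in spans.values() if lo <= i <= hi)
                       for i in range(len(statements))]
        return sum(live_counts) / (len(statements) * len(all_vars))

    # A function whose statements all touch the same variables scores higher
    # than one split into two unrelated halves.
    print(cohesion_score([{"x", "y"}, {"x", "y"}, {"x", "y"}]))  # 1.0
    print(cohesion_score([{"a"}, {"a"}, {"b"}, {"b"}]))          # 0.5
    ```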

  11. Invariant metric for nonlinear symplectic maps

    Indian Academy of Sciences (India)

    ... a function of system parameters, we demonstrate that the performance of a nonlinear Hamiltonian system is enhanced. Keywords: invariant metric; symplectic maps; performance optimization. PACS Nos 05.45. ...

  12. Random shortest path metrics with applications

    NARCIS (Netherlands)

    Engels, Christian; Manthey, Bodo; Raghavendra Rao, B.V.; Brieden, A.; Görgülü, Z.-K.; Krug, T.; Kropat, E.; Meyer-Nieberg, S.; Mihelcic, G.; Pickl, S.W.

    2012-01-01

    We consider random metric instances for optimization problems obtained as follows: Every edge of a complete graph gets a weight drawn independently at random. And then the length of an edge is the length of a shortest path with respect to these weights that connects its two endpoints. We prove that

  13. Business model metrics : An open repository

    NARCIS (Netherlands)

    Heikkila, M.; Bouwman, W.A.G.A.; Heikkila, J.; Solaimani, S.; Janssen, W.

    2015-01-01

    Development of successful business models has become a necessity in turbulent business environments, but compared to research on business modeling tools, attention in the literature to the role of metrics in designing business models is limited. Building on existing approaches to business models and

  14. Metrical musings on Littlewood and friends

    DEFF Research Database (Denmark)

    Haynes, A.; Jensen, Jonas Lindstrøm; Kristensen, Simon

    We prove a metrical result on a family of conjectures related to the Littlewood conjecture, namely the original Littlewood conjecture, the mixed Littlewood conjecture of de Mathan and Teulié and a hybrid between a conjecture of Cassels and the Littlewood conjecture. It is shown that the set of nu...

  15. Click Model-Based Information Retrieval Metrics

    NARCIS (Netherlands)

    Chuklin, A.; Serdyukov, P.; de Rijke, M.

    2013-01-01

    In recent years many models have been proposed that are aimed at predicting clicks of web search users. In addition, some information retrieval evaluation metrics have been built on top of a user model. In this paper we bring these two directions together and propose a common approach to converting

  16. Strong Ideal Convergence in Probabilistic Metric Spaces

    Indian Academy of Sciences (India)

    In the present paper we introduce the concepts of strongly ideal convergent sequence and strong ideal Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong ideal limit points and the strong ideal cluster points of a sequence in this ...

  17. A Paradigm for Security Metrics in Counterinsurgency

    Science.gov (United States)

    2011-06-10

    The rest go on with their old measurements and expect me to fit them. — George Bernard Shaw, playwright. The history of COIN security metrics ... at an estimated 263,000 in 1962. The book Souvenirs de la Bataille d'Alger, written by Saadi Yacef in 1962, inspired the movie Battle of Algiers.

  18. Contraction theorems in fuzzy metric space

    International Nuclear Information System (INIS)

    Farnoosh, R.; Aghajani, A.; Azhdari, P.

    2009-01-01

    In this paper, the results on fuzzy contractive mappings proposed by Dorel Mihet will be proved for B-contractions and C-contractions in the case of George and Veeramani fuzzy metric spaces. The existence of a fixed point under weaker conditions will be proved; that is, instead of convergence of a subsequence, p-convergence of a subsequence is used.

  19. Clean Cities 2010 Annual Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, C.

    2012-10-01

    This report details the petroleum savings and vehicle emissions reductions achieved by the U.S. Department of Energy's Clean Cities program in 2010. The report also details other performance metrics, including the number of stakeholders in Clean Cities coalitions, outreach activities by coalitions and national laboratories, and alternative fuel vehicles deployed.

  20. Clean Cities 2011 Annual Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, C.

    2012-12-01

    This report details the petroleum savings and vehicle emissions reductions achieved by the U.S. Department of Energy's Clean Cities program in 2011. The report also details other performance metrics, including the number of stakeholders in Clean Cities coalitions, outreach activities by coalitions and national laboratories, and alternative fuel vehicles deployed.

  1. A new universal colour image fidelity metric

    NARCIS (Netherlands)

    Toet, A.; Lucassen, M.P.

    2003-01-01

    We extend a recently introduced universal grayscale image quality index to a newly developed perceptually decorrelated colour space. The resulting colour image fidelity metric quantifies the distortion of a processed colour image relative to its original version. We evaluated the new colour image
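
    For context (standard formulation; the paper's notation may differ), the grayscale universal image quality index being extended combines, over image patches x and y, their means, variances and covariance:

    ```latex
    Q \;=\; \frac{4\,\sigma_{xy}\,\mu_x\,\mu_y}
                 {\bigl(\sigma_x^2 + \sigma_y^2\bigr)\bigl(\mu_x^2 + \mu_y^2\bigr)}
    ```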

  2. Strong ideal convergence in probabilistic metric spaces

    Indian Academy of Sciences (India)

    In the present paper we introduce the concepts of strongly ideal convergent sequence and strong ideal Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong ideal limit points and the strong ideal cluster points of a sequence in this ...

  3. Outsourced Similarity Search on Metric Data Assets

    DEFF Research Database (Denmark)

    Yiu, Man Lung; Assent, Ira; Jensen, Christian S.

    2012-01-01

    This paper considers a cloud computing setting in which similarity querying of metric data is outsourced to a service provider. The data is to be revealed only to trusted users, not to the service provider or anyone else. Users query the server for the most similar data objects to a query example...

  4. A Lagrangian-dependent metric space

    International Nuclear Information System (INIS)

    El-Tahir, A.

    1989-08-01

    A generalized Lagrangian-dependent metric of the static isotropic spacetime is derived. Its behaviour should be governed by imposing physical constraints that allow one to avert the pathological features of gravity in the strong-field domain. This would restrict the choice of the Lagrangian form. (author). 10 refs

  5. Product fixed points in ordered metric spaces

    OpenAIRE

    Turinici, Mihai

    2011-01-01

    All product fixed point results in ordered metric spaces based on linear contractive conditions are but a vectorial form of the fixed point statement due to Nieto and Rodriguez-Lopez [Order, 22 (2005), 223-239], under the lines in Matkowski [Bull. Acad. Pol. Sci. (Ser. Sci. Math. Astronom. Phys.), 21 (1973), 323-324].

  6. Quantitative properties of the Schwarzschild metric

    Czech Academy of Sciences Publication Activity Database

    Křížek, Michal; Křížek, Filip

    2018-01-01

    Roč. 2018, č. 1 (2018), s. 1-10 Institutional support: RVO:67985840 Keywords : exterior and interior Schwarzschild metric * proper radius * coordinate radius Subject RIV: BA - General Mathematics OBOR OECD: Applied mathematics http://astro.shu-bg.net/pasb/index_files/Papers/2018/SCHWARZ8.pdf

  7. Visualizing energy landscapes with metric disconnectivity graphs.

    Science.gov (United States)

    Smeeton, Lewis C; Oakley, Mark T; Johnston, Roy L

    2014-07-30

    The visualization of multidimensional energy landscapes is important, providing insight into the kinetics and thermodynamics of a system, as well as the range of structures a system can adopt. It is, however, highly nontrivial, with the number of dimensions required for a faithful reproduction of the landscape far higher than can be represented in two or three dimensions. Metric disconnectivity graphs provide a possible solution, incorporating the landscape connectivity information present in disconnectivity graphs with structural information in the form of a metric. In this study, we present a new software package, PyConnect, which is capable of producing both disconnectivity graphs and metric disconnectivity graphs in two or three dimensions. We present as a test case the analysis of the 69-bead BLN coarse-grained model protein and show that, by choosing appropriate order parameters, metric disconnectivity graphs can resolve correlations between structural features on the energy landscape and the landscape's energetic and kinetic properties. Copyright © 2014 The Authors. Journal of Computational Chemistry published by Wiley Periodicals, Inc.

  8. A possible molecular metric for biological evolvability

    Indian Academy of Sciences (India)

    2012-06-25

    Jun 25, 2012 ... Proteins manifest themselves as phenotypic traits, retained or lost in living systems via evolutionary pressures. Simply ... the first such metric by utilizing the recently discovered stoichiometric margin of life for all known naturally occurring ... tures), at the molecular level, are still debated under the context.

  9. DIGITAL MARKETING: SUCCESS METRICS, FUTURE TRENDS

    OpenAIRE

    Preeti Kaushik

    2017-01-01

    Abstract – Business marketing is one of the areas that has been tremendously affected by the digital world in the last few years. Digital marketing refers to advertising through digital channels. This paper provides a detailed study of metrics to measure the success of digital marketing platforms and a glimpse of the future of these technologies by 2020.

  10. Description of the Sandia Validation Metrics Project

    International Nuclear Information System (INIS)

    TRUCANO, TIMOTHY G.; EASTERLING, ROBERT G.; DOWDING, KEVIN J.; PAEZ, THOMAS L.; URBINA, ANGEL; ROMERO, VICENTE J.; RUTHERFORD, BRIAN M.; HILLS, RICHARD GUY

    2001-01-01

    This report describes the underlying principles and goals of the Sandia ASCI Verification and Validation Program Validation Metrics Project. It also gives a technical description of two case studies, one in structural dynamics and the other in thermomechanics, that serve to focus the technical work of the project in Fiscal Year 2001

  11. Inferring feature relevances from metric learning

    DEFF Research Database (Denmark)

    Schulz, Alexander; Mokbel, Bassam; Biehl, Michael

    2015-01-01

    Powerful metric learning algorithms have been proposed in the last years which do not only greatly enhance the accuracy of distance-based classifiers and nearest neighbor database retrieval, but which also enable the interpretability of these operations by assigning explicit relevance weights to ...

  12. Assessment of Reusing 14-Ton, Thin-Wall, Depleted UF6 Cylinders as LLW Disposal Containers

    International Nuclear Information System (INIS)

    O'Connor, D.G.; Poole, A.B.; Shelton, J.H.

    2000-01-01

    ... of 48 inches and nominally contain 14 tons (12.7 MT) of DUF6, were originally designed and fabricated for temporary storage of DUF6. They were fabricated from pressure-vessel-grade steels according to the provisions of the ASME Boiler and Pressure Vessel Code. Cylinders are stored in open yards at the three sites and, due to historical storage techniques, were subject to corrosion. Roughly 10,000 of the 14TTW cylinders are considered substandard due to corrosion and other structural anomalies caused by mishandling. This means that approximately 40,000 14TTW cylinders could be made available as containers for LLW disposal. In order to demonstrate the use of 14TTW cylinders as LLW disposal containers, several qualifying tasks need to be performed. Two demonstrations are being considered using 14TTW cylinders: one demonstration using contaminated soil and one demonstration using U3O8. The objectives of this report are to determine how much information is known that could be used to support the demonstrations, and how much additional work will need to be done in order to conduct the demonstrations. Information associated with the following four qualifying tasks is evaluated in this report

  13. Extremal limits of the C metric: Nariai, Bertotti-Robinson, and anti-Nariai C metrics

    Science.gov (United States)

    Dias, Óscar J.; Lemos, José P.

    2003-11-01

    In two previous papers we have analyzed the C metric in a background with a cosmological constant Λ, namely, the de Sitter (dS) C metric (Λ>0) and the anti-de Sitter (AdS) C metric (Λ<0), as well as the C metric in flat spacetime (Λ=0). These exact solutions describe a pair of accelerated black holes in the flat or cosmological constant background, with the acceleration A being provided by a strut in between that pushes away the two black holes or, alternatively, by strings hanging from infinity that pull them in. In this paper we analyze the extremal limits of the C metric in a background with a generic cosmological constant Λ>0, Λ=0, and Λ<0. It is well known that one can generate the Nariai metric from the Schwarzschild-de Sitter solution by taking an appropriate limit, where the black hole event horizon approaches the cosmological horizon. Similarly, one can generate the Bertotti-Robinson metric from the Reissner-Nordström metric by taking the limit of the Cauchy horizon going into the event horizon of the black hole, as well as the anti-Nariai metric by taking an appropriate solution and limit. Using these methods we generate the C-metric counterparts of the Nariai, Bertotti-Robinson, and anti-Nariai solutions, among others. These C-metric counterparts are conformal to the product of two two-dimensional manifolds of constant curvature, the conformal factor depending on the angular coordinate. In addition, the C-metric extremal solutions have a conical singularity at least at one of the poles of their angular surfaces. We give a physical interpretation to these solutions, e.g., in the Nariai C metric (with topology dS2 × S̃2) to each point in the deformed two-sphere S̃2 corresponds a dS2 spacetime, except for one point which corresponds to a dS2 spacetime with an infinite straight strut or string. There are other important new features that appear. One expects that the solutions found in this paper are unstable and decay into a slightly nonextreme black hole pair accelerated by a strut or by strings. Moreover, the Euclidean version of these ...

  14. Selections of the metric projection operator and strict solarity of sets with continuous metric projection

    Science.gov (United States)

    Alimov, A. R.

    2017-07-01

    In a broad class of finite-dimensional Banach spaces, we show that a closed set with lower semicontinuous metric projection is a strict sun, admits a continuous selection of the metric projection operator onto it, has contractible intersections with balls, and its (nonempty) intersection with any closed ball is a retract of this ball. For sets with continuous metric projection, a number of new results relating the solarity of such sets to the stability of the operator of best approximation are obtained. Bibliography 25 titles.

  15. 77 FR 15052 - Dataset Workshop-U.S. Billion Dollar Disasters Dataset (1980-2011): Assessing Dataset Strengths...

    Science.gov (United States)

    2012-03-14

    .... Pathways to overcome accuracy and bias issues will be an important focus. Participants will consider: Historical development and current state of the U.S. Billion Dollar Disasters Report; What additional data... dataset; Examination of unique uncertainties related to the cost of each of the major types of weather and...

  16. Constraint on a Varying Proton-Electron Mass Ratio 1.5 Billion Years after the Big Bang

    NARCIS (Netherlands)

    Bagdonaite, J.; Ubachs, W.M.G.; Murphy, M.T.; Withmore, J.B.

    2015-01-01

    A molecular hydrogen absorber at a lookback time of 12.4 billion years, corresponding to 10% of the age of the Universe today, is analyzed to put a constraint on a varying proton-electron mass ratio, μ. A high resolution spectrum of the J1443+2724 quasar, which was observed with the Very Large Telescope ...

  17. Subsampled open-reference clustering creates consistent, comprehensive OTU definitions and scales to billions of sequences

    Directory of Open Access Journals (Sweden)

    Jai Ram Rideout

    2014-08-01

    Full Text Available We present a performance-optimized algorithm, subsampled open-reference OTU picking, for assigning marker gene (e.g., 16S rRNA) sequences generated on next-generation sequencing platforms to operational taxonomic units (OTUs) for microbial community analysis. This algorithm provides benefits over de novo OTU picking (clustering can be performed largely in parallel, reducing runtime) and over closed-reference OTU picking (all reads are clustered, not only those that match a reference database sequence with high similarity). Because more of our algorithm can be run in parallel relative to “classic” open-reference OTU picking, it makes open-reference OTU picking tractable on massive amplicon sequence data sets (though on smaller data sets, “classic” open-reference OTU clustering is often faster). We illustrate that here by applying it to the first 15,000 samples sequenced for the Earth Microbiome Project (1.3 billion V4 16S rRNA amplicons). To the best of our knowledge, this is the largest OTU picking run ever performed, and we estimate that our new algorithm runs in less than 1/5 the time that would be required by “classic” open-reference OTU picking. We show that subsampled open-reference OTU picking yields results that are highly correlated with those generated by “classic” open-reference OTU picking through comparisons on three well-studied datasets. An implementation of this algorithm is provided in the popular QIIME software package, which uses uclust for read clustering. All analyses were performed using QIIME’s uclust wrappers, though we provide details (aided by the open-source code in our GitHub repository) that will allow implementation of subsampled open-reference OTU picking independently of QIIME (e.g., in a compiled programming language, where runtimes should be further reduced). Our analyses should generalize to other implementations of these OTU picking algorithms. Finally, we present a comparison of parameter settings in ...
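
    The sketch below is illustrative only (it is not QIIME or uclust, which use optimized aligners and the subsampling step described above); it shows the two-stage open-reference idea: a closed-reference pass against known sequences, followed by greedy de novo clustering of the reads that failed to match.

    ```python
    # Toy open-reference OTU picking: closed-reference assignment first, then a
    # greedy centroid-based de novo pass for unmatched reads.
    from difflib import SequenceMatcher

    def similarity(a, b):
        return SequenceMatcher(None, a, b).ratio()

    def open_reference_otus(reads, references, threshold=0.97):
        otus = {f"ref_{i}": [] for i in range(len(references))}
        unmatched = []
        for r in reads:  # closed-reference step
            best = max(range(len(references)),
                       key=lambda i: similarity(r, references[i]))
            if similarity(r, references[best]) >= threshold:
                otus[f"ref_{best}"].append(r)
            else:
                unmatched.append(r)
        centroids = []  # de novo step on the leftovers
        for r in unmatched:
            for j, c in enumerate(centroids):
                if similarity(r, c) >= threshold:
                    otus[f"denovo_{j}"].append(r)
                    break
            else:
                centroids.append(r)
                otus[f"denovo_{len(centroids) - 1}"] = [r]
        return {k: v for k, v in otus.items() if v}
    ```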

  18. The Other Inconvenient Truth: Feeding 9 Billion While Sustaining the Earth System

    Science.gov (United States)

    Foley, J. A.

    2010-12-01

    As the international community focuses on climate change as the great challenge of our era, we have been largely ignoring another looming problem — the global crisis in agriculture, food security and the environment. Our use of land, particularly for agriculture, is absolutely essential to the success of the human race: we depend on agriculture to supply us with food, feed, fiber, and, increasingly, biofuels. Without a highly efficient, productive, and resilient agricultural system, our society would collapse almost overnight. But we are demanding more and more from our global agricultural systems, pushing them to their very limits. Continued population growth (adding more than 70 million people to the world every year), changing dietary preferences (including more meat and dairy consumption), rising energy prices, and increasing needs for bioenergy sources are putting tremendous pressure on the world’s resources. And, if we want any hope of keeping up with these demands, we’ll need to double the agricultural production of the planet in the next 30 to 40 years. Meeting these huge new agricultural demands will be one of the greatest challenges of the 21st century. At present, it is completely unclear how (and if) we can do it. If this wasn’t enough, we must also address the massive environmental impacts of our current agricultural practices, which new evidence indicates rival the impacts of climate change. Simply put, providing for the basic needs of 9 billion-plus people, without ruining the biosphere in the process, will be one of the greatest challenges our species has ever faced. In this presentation, I will present a new framework for evaluating and assessing global patterns of agriculture, food / fiber / fuel production, and their relationship to the earth system, particularly in terms of changing stocks and flows of water, nutrients and carbon in our planetary environment. This framework aims to help us manage the challenges of increasing global food

  19. Strongly baryon-dominated disk galaxies at the peak of galaxy formation ten billion years ago.

    Science.gov (United States)

    Genzel, R; Schreiber, N M Förster; Übler, H; Lang, P; Naab, T; Bender, R; Tacconi, L J; Wisnioski, E; Wuyts, S; Alexander, T; Beifiori, A; Belli, S; Brammer, G; Burkert, A; Carollo, C M; Chan, J; Davies, R; Fossati, M; Galametz, A; Genel, S; Gerhard, O; Lutz, D; Mendel, J T; Momcheva, I; Nelson, E J; Renzini, A; Saglia, R; Sternberg, A; Tacchella, S; Tadaki, K; Wilman, D

    2017-03-15

    In the cold dark matter cosmology, the baryonic components of galaxies (stars and gas) are thought to be mixed with and embedded in non-baryonic and non-relativistic dark matter, which dominates the total mass of the galaxy and its dark-matter halo. In the local (low-redshift) Universe, the mass of dark matter within a galactic disk increases with disk radius, becoming appreciable and then dominant in the outer, baryonic regions of the disks of star-forming galaxies. This results in rotation velocities of the visible matter within the disk that are constant or increasing with disk radius, a hallmark of the dark-matter model. Comparisons between the dynamical mass, inferred from these velocities in rotational equilibrium, and the sum of the stellar and cold-gas mass at the peak epoch of galaxy formation ten billion years ago, inferred from ancillary data, suggest high baryon fractions in the inner, star-forming regions of the disks. Although this implied baryon fraction may be larger than in the local Universe, the systematic uncertainties (owing to the chosen stellar initial-mass function and the calibration of gas masses) render such comparisons inconclusive in terms of the mass of dark matter. Here we report rotation curves (showing rotation velocity as a function of disk radius) for the outer disks of six massive star-forming galaxies, and find that the rotation velocities are not constant, but decrease with radius. We propose that this trend arises because of a combination of two main factors: first, a large fraction of the massive high-redshift galaxy population was strongly baryon-dominated, with dark matter playing a smaller part than in the local Universe; and second, the large velocity dispersion in high-redshift disks introduces a substantial pressure term that leads to a decrease in rotation velocity with increasing radius. The effect of both factors appears to increase with redshift. Qualitatively, the observations suggest that baryons in the early (high ...

  20. Four billion years of ophiolites reveal secular trends in oceanic crust formation

    Directory of Open Access Journals (Sweden)

    Harald Furnes

    2014-07-01

    Full Text Available We combine a geological, geochemical and tectonic dataset from 118 ophiolite complexes of the major global Phanerozoic orogenic belts with similar datasets of ophiolites from 111 Precambrian greenstone belts to construct an overview of oceanic crust generation over 4 billion years. Geochemical discrimination systematics built on immobile trace elements reveal that the basaltic units of the Phanerozoic ophiolites are dominantly subduction-related (75%), linked to backarc processes and characterized by a strong MORB component, similar to ophiolites in Precambrian greenstone sequences (85%). The remaining 25% of Phanerozoic subduction-unrelated ophiolites are mainly (74%) of Mid-Ocean-Ridge (MORB) type, in contrast to the equal proportion of Rift/Continental Margin, Plume, and MORB type ophiolites in the Precambrian greenstone belts. Throughout the Phanerozoic there are large geochemical variations in major and trace elements, but for average element values calculated in 5 bins of 100-million-year intervals there are no obvious secular trends. By contrast, basaltic units in the ophiolites of the Precambrian greenstones (calculated in 12 bins of 250-million-year intervals, starting in the late Paleo- to early Mesoproterozoic, ca. 2.0–1.8 Ga) exhibit an apparent decrease in the average values of incompatible elements such as Ti, P, Zr, Y and Nb, and an increase in the compatible elements Ni and Cr with deeper time to the end of the Archean and into the Hadean. These changes can be attributed to decreasing degrees of partial melting of the upper mantle from the Hadean/Archean to the Present. The onset of geochemical changes coincides with the timing of detectable changes in the structural architecture of the ophiolites, such as greater volumes of gabbro and more common sheeted dyke complexes, and lesser occurrences of ocelli (varioles) in the pillow lavas in ophiolites younger than 2 Ga. The global data from the Precambrian ophiolites, representative of nearly 50 ...

  1. High accuracy for China's 1990 Census: refuting a rumour about China's population topping 1.4 billion.

    Science.gov (United States)

    1991-02-01

    In response to newspaper reports that China's population had topped 1.4 billion, a spokesman from the Office of Population Census under the State Council issued a statement refuting the claim, pointing out that the highly accurate 1990 census estimates the population to be 1.13 billion. US newspapers, including the Boston Globe, and the Shijie ribao, an international daily in Chinese, recently cited reports from a Japanese newspaper claiming that China's population had exceeded 1.4 billion. But as the official explained, China has carefully monitored its population size. Every year since 1982, the country has carried out a sample survey on population change, and in 1988, it conducted a national 1% population sample survey on fertility and birth control. In 1982, a national census placed China's population at 1.0817 billion. So considering that the sample surveys over the past 8 years have indicated an annual net increase in population of about 17 million, it is impossible for China's population to have topped 1.4 billion. Furthermore, the 1990 census enumerated all the unplanned births not previously registered, and carefully monitored for possible underreporting for the floating population. 2 general checks for people without fixed living quarters took place on June 28 and 29. And on July 8 and 9, officials conducted a national make-up registration. Enumerators visited and registered floating persons who did have fixed living quarters. Furthermore, officials conducted a follow-up sample survey on the quality of the registration. This showed that the rate of underregistration in the 1990 census was 0.6/1000 population--a figure of 680,000 nationally.

  2. Tonometer calibration in Brasília, Brazil Calibragem de tonômetros em Brasília, Brasil

    Directory of Open Access Journals (Sweden)

    Fernanda Pires da Silva Abrão

    2009-06-01

    Full Text Available PURPOSE: To determine calibration errors of Goldmann applanation tonometers in ophthalmic clinics of Brasília, Brazil, and correlate the findings with variables related to tonometer model and utilization. METHODS: Tonometers from ophthalmic clinics in Brasília, Brazil, were checked for calibration errors. A standard Goldmann applanation tonometer checking tool was used to assess the calibration error. Only one trained individual made all verifications, with a masked reading of the results. Data on the model, age, daily use, frequency of calibration checking and the nature of the ophthalmic department (private or public) were collected and correlated with the observed errors. RESULTS: One hundred tonometers were checked for calibration. Forty-seven percent (47/100) were out of the 1 mmHg range at at least one checking point. Tonometers mounted on a slit lamp, less than 5 years old, used on fewer than 20 patients daily, checked for calibration on a yearly basis, and those from private offices exhibited a lower rate of inaccuracy, but only the first variable was statistically significant. Sixty-one percent of tonometers in public hospitals were out of calibration. CONCLUSION: Calibration of tonometers in the capital of Brazil is poor; those from general hospitals are the worst, and this fact can lead to inaccurate detection and assessment of glaucoma patients, above all in the population under government assistance. OBJECTIVES: To determine the calibration errors of Goldmann applanation tonometers in ophthalmology clinics of Brasília, Brazil, and to correlate them with variables related to the model and use of the devices. METHODS: Tonometers from ophthalmology clinics in Brasília had their calibration checked using a standard cylinder supplied by the manufacturer of the devices. All verifications were performed by a single previously trained examiner, and the readings were masked by an independent observer. The measurements ...

  3. Information metric on instanton moduli spaces in nonlinear σ models

    International Nuclear Information System (INIS)

    Yahikozawa, Shigeaki

    2004-01-01

    We study the information metric on instanton moduli spaces in two-dimensional nonlinear σ models. In the CP^1 model, the information metric on the moduli space of one instanton with the topological charge Q=k (k≥1) is a three-dimensional hyperbolic metric, which corresponds to the Euclidean anti-de Sitter space-time metric in three dimensions, and the overall scale factor of the information metric is 4k^2/3; this means that the sectional curvature is -3/(4k^2). We also calculate the information metric in the CP^2 model
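
    As a quick consistency check using only standard facts (not the paper's derivation): rescaling the unit-curvature hyperbolic upper-half-space metric by a constant factor divides the sectional curvature by that factor, so a scale factor of 4k^2/3 corresponds to curvature -3/(4k^2):

    ```latex
    ds^2 \;=\; \frac{4k^2}{3}\,\frac{dx^2 + dy^2 + dz^2}{z^2}
    \qquad\Longrightarrow\qquad
    K \;=\; -\,\frac{3}{4k^2}
    ```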

  4. Machine Learning for ATLAS DDM Network Metrics

    CERN Document Server

    Lassnig, Mario; The ATLAS collaboration; Vamosi, Ralf

    2016-01-01

    The increasing volume of physics data is posing a critical challenge to the ATLAS experiment. In anticipation of high luminosity physics, automation of everyday data management tasks has become necessary. Previously many of these tasks required human decision-making and operation. Recent advances in hardware and software have made it possible to entrust more complicated duties to automated systems using models trained by machine learning algorithms. In this contribution we show results from our ongoing automation efforts. First, we describe our framework for distributed data management and network metrics, automatically extract and aggregate data, train models with various machine learning algorithms, and eventually score the resulting models and parameters. Second, we use these models to forecast metrics relevant for network-aware job scheduling and data brokering. We show the characteristics of the data and evaluate the forecasting accuracy of our models.

  5. Data Complexity Metrics for XML Web Services

    Directory of Open Access Journals (Sweden)

    MISRA, S.

    2009-06-01

    Full Text Available Web services that are based on eXtensible Markup Language (XML) technologies enable the integration of diverse IT processes and systems and have been gaining extraordinary acceptance, from the most basic to the most complicated business and scientific processes. Maintainability is one of the important factors that affect the quality of Web services, which can be seen as a kind of software project. The effective management of any type of software project requires modelling, measurement, and quantification. This study presents a metric for the assessment of the quality of Web services in terms of their maintainability. For this purpose we propose a data complexity metric that can be evaluated by analyzing the WSDL (Web Service Description Language) documents used for describing Web services.
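
    As a rough illustration of the idea (a hypothetical, much-simplified count-based score, not the metric proposed in the paper), one can parse a WSDL document and count the message, operation and schema element declarations it contains:

    ```python
    # Hypothetical simplified data-complexity score for a WSDL document.
    import xml.etree.ElementTree as ET

    def wsdl_complexity(path):
        root = ET.parse(path).getroot()
        counts = {"message": 0, "operation": 0, "element": 0}
        for node in root.iter():
            tag = node.tag.split("}")[-1]  # strip the XML namespace
            if tag in counts:
                counts[tag] += 1
        # Weight schema element declarations higher, since they carry the data model.
        score = counts["message"] + counts["operation"] + 2 * counts["element"]
        return counts, score

    # counts, score = wsdl_complexity("service.wsdl")  # hypothetical file name
    ```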

  6. A perceptual metric for photo retouching.

    Science.gov (United States)

    Kee, Eric; Farid, Hany

    2011-12-13

    In recent years, advertisers and magazine editors have been widely criticized for taking digital photo retouching to an extreme. Impossibly thin, tall, and wrinkle- and blemish-free models are routinely splashed onto billboards, advertisements, and magazine covers. The ubiquity of these unrealistic and highly idealized images has been linked to eating disorders and body image dissatisfaction in men, women, and children. In response, several countries have considered legislating the labeling of retouched photos. We describe a quantitative and perceptually meaningful metric of photo retouching. Photographs are rated on the degree to which they have been digitally altered by explicitly modeling and estimating geometric and photometric changes. This metric correlates well with perceptual judgments of photo retouching and can be used to objectively judge by how much a retouched photo has strayed from reality.

  7. Metrics for measuring distances in configuration spaces.

    Science.gov (United States)

    Sadeghi, Ali; Ghasemi, S Alireza; Schaefer, Bastian; Mohr, Stephan; Lill, Markus A; Goedecker, Stefan

    2013-11-14

    In order to characterize molecular structures we introduce configurational fingerprint vectors which are counterparts of quantities used experimentally to identify structures. The Euclidean distance between the configurational fingerprint vectors satisfies the properties of a metric and can therefore safely be used to measure dissimilarities between configurations in the high dimensional configuration space. In particular we show that these metrics are a perfect and computationally cheap replacement for the root-mean-square distance (RMSD) when one has to decide whether two noise contaminated configurations are identical or not. We introduce a Monte Carlo approach to obtain the global minimum of the RMSD between configurations, which is obtained from a global minimization over all translations, rotations, and permutations of atomic indices.
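
    A minimal sketch of the general idea (an illustrative fingerprint, not the ones defined in the paper): use the sorted eigenvalues of the interatomic distance matrix as a fingerprint vector, which is invariant under translations, rotations and permutations of atomic indices, and compare configurations by the Euclidean distance between fingerprints.

    ```python
    # Illustrative configurational fingerprint: sorted eigenvalues of the
    # pairwise-distance matrix, compared with a Euclidean distance.
    import numpy as np

    def fingerprint(positions):
        """positions: (N, 3) array of Cartesian atomic coordinates."""
        diff = positions[:, None, :] - positions[None, :, :]
        dmat = np.linalg.norm(diff, axis=-1)      # interatomic distances
        return np.sort(np.linalg.eigvalsh(dmat))  # sorted eigenvalues

    def config_distance(pos_a, pos_b):
        return float(np.linalg.norm(fingerprint(pos_a) - fingerprint(pos_b)))

    # A rigidly rotated, translated and re-indexed copy has (numerically) zero distance.
    rng = np.random.default_rng(1)
    a = rng.normal(size=(5, 3))
    theta = 0.7
    R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0,            0.0,           1.0]])
    b = (a @ R.T)[rng.permutation(5)] + np.array([1.0, -2.0, 0.5])
    print(config_distance(a, b))  # ~0 up to floating-point error
    ```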

  8. Metric-Aware Secure Service Orchestration

    Directory of Open Access Journals (Sweden)

    Gabriele Costa

    2012-12-01

    Full Text Available Secure orchestration is an important concern in the internet of service. Next to providing the required functionality the composite services must also provide a reasonable level of security in order to protect sensitive data. Thus, the orchestrator has a need to check whether the complex service is able to satisfy certain properties. Some properties are expressed with metrics for precise definition of requirements. Thus, the problem is to analyse the values of metrics for a complex business process. In this paper we extend our previous work on analysis of secure orchestration with quantifiable properties. We show how to define, verify and enforce quantitative security requirements in one framework with other security properties. The proposed approach should help to select the most suitable service architecture and guarantee fulfilment of the declared security requirements.

  9. Anisotropic rectangular metric for polygonal surface remeshing

    KAUST Repository

    Pellenard, Bertrand

    2013-06-18

    We propose a new method for anisotropic polygonal surface remeshing. Our algorithm takes as input a surface triangle mesh. An anisotropic rectangular metric, defined at each triangle facet of the input mesh, is derived from both a user-specified normal-based tolerance error and the requirement to favor rectangle-shaped polygons. Our algorithm uses a greedy optimization procedure that adds, deletes and relocates generators so as to match two criteria related to partitioning and conformity.

  10. Quantitative metric theory of continued fractions

    Indian Academy of Sciences (India)

    Quantitative versions of the central results of the metric theory of continued fractions were given primarily by C. De Vroedt. In this paper we give improvements of the bounds involved. For a real number $x$, let
    $$x = c_0 + \cfrac{1}{c_1 + \cfrac{1}{c_2 + \cfrac{1}{c_3 + \cfrac{1}{c_4 + \ddots}}}}.$$
    A sample result we prove is that, given $\epsilon > 0$, $\bigl(c_1(x)\cdots c_n(x)\bigr)^{1/n}$ ...
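
    A small numerical illustration of the quantity in the sample result (assuming the standard continued-fraction expansion; for almost every x the geometric mean of the partial quotients tends to Khinchin's constant, roughly 2.685):

    ```python
    # Partial quotients c_1, ..., c_n of a real number and their geometric mean
    # (c_1 * ... * c_n)^(1/n), the quantity appearing in the sample result.
    import math

    def partial_quotients(x, n):
        cs = []
        for _ in range(n):
            c = math.floor(x)
            cs.append(c)
            frac = x - c
            if frac == 0:
                break
            x = 1.0 / frac
        return cs

    cs = partial_quotients(math.pi, 15)[1:]  # drop c_0, keep c_1..c_n
    geo_mean = math.exp(sum(math.log(c) for c in cs) / len(cs))
    print(cs, round(geo_mean, 3))
    ```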

  11. Agile Metrics: Progress Monitoring of Agile Contractors

    Science.gov (United States)

    2014-01-01

    epic. The short timeframe is usually called an iteration or, in Scrum-based teams, a sprint; multiple iterations make up a release [Lapham 2011]. ... [Rawsthorne 2012] Rawsthorne, Dan. Monitoring Scrum Projects with AgileEVM and Earned Business Value Metrics (EBV). 2012. ... "AgileEVM – Earned Value Management in Scrum Projects." Presented at Agile2006, 23-28 July 2006. [USAF 2008] United States Air Force. United ...

  12. Primordial magnetic fields from metric perturbations

    CERN Document Server

    Maroto, A L

    2001-01-01

    We study the amplification of electromagnetic vacuum fluctuations induced by the evolution of scalar metric perturbations at the end of inflation. Such perturbations break the conformal invariance of Maxwell equations in Friedmann-Robertson-Walker backgrounds and allow the growth of magnetic fields on super-Hubble scales. We estimate the strength of the fields generated by this mechanism on galactic scales and compare the results with the present bounds on the galactic dynamo seed fields.

  13. Design Management: Metrics and Visual Tools

    OpenAIRE

    Abou Ibrahim, Hisham; Hamzeh, Farook

    2017-01-01

    The iterative and multidisciplinary nature of design complicates its management. Nonetheless, the lack of adequate tools that can be used to manage design dynamics negatively affects the design process as well as the quality of the final design deliverables. In this regard, this paper introduces new metrics to measure information flow in BIM projects, and elaborates on the Level of Development (LOD) concept to reflect the design maturity of model elements in the corresponding design context. ...

  14. Metrics in Keplerian orbits quotient spaces

    Science.gov (United States)

    Milanov, Danila V.

    2018-03-01

    Quotient spaces of Keplerian orbits are important instruments for the modelling of orbit samples of celestial bodies over a large time span. We suppose that variations of the orbital eccentricities, inclinations and semi-major axes remain sufficiently small, while arbitrary perturbations are allowed for the arguments of pericentres or longitudes of the nodes, or both. The distance between orbits or their images in quotient spaces serves as a numerical criterion for such problems of celestial mechanics as the search for a common origin of meteoroid streams, comets, and asteroids, the identification of asteroid families, and others. In this paper, we consider quotient sets of the non-rectilinear Keplerian orbits space H. Their elements are identified irrespective of the values of pericentre arguments or node longitudes. We prove that distance functions on the quotient sets, introduced in Kholshevnikov et al. (Mon Not R Astron Soc 462:2275-2283, 2016), satisfy metric space axioms and discuss the theoretical and practical importance of this result. Isometric embeddings of the quotient spaces into R^n, and into a space of compact subsets of H with the Hausdorff metric, are constructed. The Euclidean representations of the orbit spaces find their applications in the problem of orbit averaging and in computational algorithms specific to Euclidean space. We also explore completions of H and its quotient spaces with respect to the corresponding metrics and establish a relation between elements of the extended spaces and rectilinear trajectories. The distance between an orbit and the subsets of elliptic and hyperbolic orbits is calculated. This quantity provides an upper bound for the metric value in the problem of identifying close orbits. Finally, the invariance of the equivalence relations in H under coordinate changes is discussed.

  15. Smart Grid Status and Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Balducci, Patrick J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Weimar, Mark R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kirkham, Harold [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-07-01

    To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. It measures 21 metrics to provide insight into the grid’s capacity to embody these characteristics. This report looks across a spectrum of smart grid concerns to measure the status of smart grid deployment and impacts.

  16. Quantitative Embeddability and Connectivity in Metric Spaces

    Science.gov (United States)

    Eriksson-Bique, Sylvester David

    This thesis studies three analytic and quantitative questions on doubling metric (measure) spaces. These results are largely independent and will be presented in separate chapters. The first question concerns representing metric spaces arising from complete Riemannian manifolds in Euclidean space. More precisely, we find bi-Lipschitz embeddings f of subsets A of complete Riemannian manifolds M of dimension n into R^N, where N may depend on a bound on the curvature and the diameter of A. The main difficulty here is to control the distortion of such embeddings in terms of the curvature of the manifold. In constructing the embeddings, we will study the collapsing theory of manifolds in detail and at multiple scales. Similar techniques give embeddings for subsets of complete Riemannian orbifolds and quotient metric spaces. The second part of the thesis answers a question about finding quantitative and weak conditions that ensure large families of rectifiable curves connecting pairs of points. These families of rectifiable curves are quantified in terms of Poincare inequalities. We identify a new quantitative connectivity condition in terms of curve fragments, which is equivalent to possessing a Poincare inequality with some exponent. The connectivity condition arises naturally in three different contexts, and we present methods to find Poincare inequalities for the spaces involved. In particular, we prove such inequalities for spaces with weak curvature bounds and thus resolve a question of Tapio Rajala. In the final part of the thesis we study the local geometry of spaces admitting differentiation of Lipschitz functions with certain Banach space targets. The main result shows that such spaces can be characterized in terms of Poincare inequalities and doubling conditions. In fact, such spaces can be covered by countably many pieces, each of which is an isometric subset of a doubling metric measure space admitting a Poincare inequality. In proving this, we will find a new way to

  17. Metric entropy in linear inverse scattering

    Directory of Open Access Journals (Sweden)

    M. A. Maisto

    2016-09-01

    Full Text Available The role of multiple views and/or multiple frequencies in the achievable performance of linear inverse scattering problems is addressed. To this end, the impact of views and frequencies on the Kolmogorov entropy measure is studied. In this way, the metric information that the data can convey back about the unknown can be estimated. For the sake of simplicity, the study deals with strip scatterers and the cases of discrete angles of incidence and/or frequencies.
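
    One common proxy for the information content of such data is the singular-value spectrum of the discretized scattering (radiation) operator. The sketch below counts the significant singular values of a strip's far-field operator as observation angles are added; the unit wavelength, the strip half-width, and the 1% threshold are assumed values for illustration, not the paper's actual entropy computation.

    import numpy as np

    k = 2 * np.pi                 # wavenumber (wavelength = 1), assumed
    a = 2.0                       # strip half-width, assumed
    x = np.linspace(-a, a, 200)   # sample points on the strip
    dx = x[1] - x[0]

    def significant_singular_values(n_views, threshold=1e-2):
        """Number of singular values of the discretized far-field operator above a relative threshold."""
        theta = np.linspace(-np.pi / 2, np.pi / 2, n_views)     # observation angles
        A = np.exp(1j * k * np.outer(np.sin(theta), x)) * dx    # far-field radiation operator
        s = np.linalg.svd(A, compute_uv=False)
        return int(np.sum(s > threshold * s[0]))

    # The count grows with the number of views and then saturates,
    # reflecting the finite information content of the data.
    for n_views in (5, 10, 20, 40, 80):
        print(n_views, significant_singular_values(n_views))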

  18. The Planck Vacuum and the Schwarzschild Metrics

    Directory of Open Access Journals (Sweden)

    Daywitt W. C.

    2009-07-01

    Full Text Available The Planck vacuum (PV) is assumed to be the source of the visible universe. So under conditions of sufficient stress, there must exist a pathway through which energy from the PV can travel into this universe. Conversely, the passage of energy from the visible universe to the PV must also exist under the same stressful conditions. The following examines two versions of the Schwarzschild metric equation for compatibility with this open-pathway idea.

  19. Preserved Network Metrics across Translated Texts

    Science.gov (United States)

    Cabatbat, Josephine Jill T.; Monsanto, Jica P.; Tapang, Giovanni A.

    2014-09-01

    Co-occurrence language networks based on Bible translations and the Universal Declaration of Human Rights (UDHR) translations in different languages were constructed and compared with random text networks. Among the considered network metrics, the network size, N, the normalized betweenness centrality (BC), and the average k-nearest neighbors, knn, were found to be the most preserved across translations. Moreover, similar frequency distributions of co-occurring network motifs were observed for translated text networks.
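
    A minimal sketch of how such metrics might be computed for parallel texts, assuming a simple adjacent-word co-occurrence rule and the networkx library; the toy sentences are illustrative, not the authors' corpus or pipeline.

    import networkx as nx

    def cooccurrence_network(text):
        """Word co-occurrence network: adjacent words within a sentence share an edge."""
        G = nx.Graph()
        for sentence in text.lower().split("."):
            words = sentence.split()
            G.add_edges_from(zip(words, words[1:]))
        return G

    text_en = "In the beginning was the word. The word was with god."
    text_de = "Im Anfang war das Wort. Das Wort war bei Gott."

    for label, text in [("en", text_en), ("de", text_de)]:
        G = cooccurrence_network(text)
        N = G.number_of_nodes()
        bc = sum(nx.betweenness_centrality(G).values()) / N       # mean normalized BC
        knn = sum(nx.average_neighbor_degree(G).values()) / N     # mean nearest-neighbor degree
        print(f"{label}: N={N}  <BC>={bc:.3f}  <knn>={knn:.2f}")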

  20. Ideal Based Cyber Security Technical Metrics for Control Systems

    Energy Technology Data Exchange (ETDEWEB)

    W. F. Boyer; M. A. McQueen

    2007-10-01

    Much of the world's critical infrastructure is at risk from attack through electronic networks connected to control systems. Security metrics are important because they provide the basis for management decisions that affect the protection of the infrastructure. A cyber security technical metric is the security-relevant output from an explicit mathematical model that makes use of objective measurements of a technical object. A specific set of technical security metrics is proposed for use by the operators of control systems. Our proposed metrics are based on seven security ideals associated with seven corresponding abstract dimensions of security. We have defined at least one metric for each of the seven ideals. Each metric is a measure of how nearly the associated ideal has been achieved. These seven ideals provide a useful structure for further metrics development. A case study shows how the proposed metrics can be applied to an operational control system.
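
    In the spirit of the definition above (a technical metric as the output of an explicit model applied to objective measurements), a toy example for one hypothetical ideal, minimal exposure with current patches, is sketched below; the ideal, the measurements, and the weights are assumptions for illustration and are not taken from the paper's seven ideals.

    from dataclasses import dataclass

    @dataclass
    class HostMeasurement:
        open_ports: int          # objective measurement: listening ports found by a scan
        required_ports: int      # ports the control function actually needs
        days_since_patch: int    # objective measurement: patch age in days

    def exposure_metric(hosts, max_patch_age=30):
        """0.0 = far from the ideal, 1.0 = ideal achieved (no extra ports, current patches)."""
        scores = []
        for h in hosts:
            port_score = h.required_ports / max(h.open_ports, h.required_ports, 1)
            patch_score = min(1.0, max_patch_age / max(h.days_since_patch, 1))
            scores.append(0.5 * port_score + 0.5 * patch_score)
        return sum(scores) / len(scores)

    fleet = [HostMeasurement(12, 3, 90), HostMeasurement(4, 4, 10), HostMeasurement(6, 2, 45)]
    print(f"exposure metric = {exposure_metric(fleet):.2f}")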