WorldWideScience

Sample records for billion metric tons

  1. Sneak Peek to the 2016 Billion-Ton Report

    Energy Technology Data Exchange (ETDEWEB)

    None

    2016-06-01

    The 2005 Billion-Ton Study became a landmark resource for bioenergy stakeholders, detailing for the first time the potential to produce at least one billion dry tons of biomass annually in a sustainable manner from U.S. agriculture and forest resources. The 2011 U.S. Billion-Ton Update expanded and updated the analysis, and in 2016, the U.S. Department of Energy’s Bioenergy Technologies Office plans to release the 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy.

  2. Summary and Comparison of the 2016 Billion-Ton Report with the 2011 U.S. Billion-Ton Update

    Energy Technology Data Exchange (ETDEWEB)

    None

    2016-06-01

    In terms of the magnitude of the resource potential, the results of the 2016 Billion-Ton Report (BT16) are consistent with the original 2005 Billion-Ton Study (BTS) and the 2011 report, U.S. Billion-Ton Update: Biomass Supply for a Bioenergy and Bioproducts Industry (BT2). An effort was made to reevaluate the potential forestland, agricultural, and waste resources at the roadside and then to extend the analysis by adding transportation costs to a biorefinery, under specified logistics assumptions, for the major resource fractions.

  3. Regional Feedstock Partnership Summary Report: Enabling the Billion-Ton Vision

    Energy Technology Data Exchange (ETDEWEB)

    Owens, Vance N. [South Dakota State Univ., Brookings, SD (United States). North Central Sun Grant Center; Karlen, Douglas L. [Dept. of Agriculture Agricultural Research Service, Ames, IA (United States). National Lab. for Agriculture and the Environment; Lacey, Jeffrey A. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Process Science and Technology Division

    2016-07-12

    The U.S. Department of Energy (DOE) and the Sun Grant Initiative established the Regional Feedstock Partnership (referred to as the Partnership) to address information gaps associated with enabling the vision of a sustainable, reliable, billion-ton U.S. bioenergy industry by the year 2030 (i.e., the Billion-Ton Vision). Over the past 7 years (2008–2014), the Partnership has been successful at advancing the biomass feedstock production industry in the United States, with notable accomplishments. The Billion-Ton Study identifies the technical potential to expand domestic biomass production to offset up to 30% of U.S. petroleum consumption, while continuing to meet demands for food, feed, fiber, and export. This study verifies for the biofuels and chemical industries that a real and substantial resource base could justify the significant investment needed to develop robust conversion technologies and commercial-scale facilities. DOE and the Sun Grant Initiative established the Partnership to demonstrate and validate the assumptions underpinning the Billion-Ton Vision to supply a sustainable and reliable source of lignocellulosic feedstock to a large-scale bioenergy industry. This report discusses the accomplishments of the Partnership, with references to accompanying scientific publications. These accomplishments include advances in sustainable feedstock production, feedstock yield, yield stability and stand persistence, energy crop commercialization readiness, information transfer, assessment of the economic impacts of achieving the Billion-Ton Vision, and the impact of feedstock species and environmental conditions on feedstock quality characteristics.

  4. 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy

    Energy Technology Data Exchange (ETDEWEB)

    None

    2016-07-06

    This product builds on previous efforts, namely the 2005 Billion-Ton Study (BTS) and the 2011 U.S. Billion-Ton Update (BT2). With each report, greater perspective is gained on the potential of biomass resources to contribute to a national energy strategy. Similarly, each successive report introduces new questions regarding commercialization challenges. BTS quantified the broad biophysical potential of biomass nationally, and BT2 elucidated the potential economic availability of these resources. These reports clearly established the potential availability of up to one billion tons of biomass resources nationally. However, many questions remain, including but not limited to crop yields, climate change impacts, logistical operations, and systems integration across production, harvest, and conversion. The present report aims to address many of these questions through empirically modeled energy crop yields, scenario analysis of resources delivered to biorefineries, and the addition of new feedstocks. Volume 2 of the 2016 Billion-Ton Report is expected to be released by the end of 2016. It seeks to evaluate environmental sustainability indicators of select scenarios from volume 1 and potential climate change impacts on future supplies.

  5. 2016 Billion-Ton Report: Environmental Sustainability Effects of Select Scenarios from Volume 1 (Volume 2)

    Energy Technology Data Exchange (ETDEWEB)

    Efroymson, R. A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Langholtz, M. H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, K. E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Stokes, B. J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-13

    On behalf of all the authors and contributors, it is a great privilege to present the 2016 Billion-Ton Report (BT16), volume 2: Environmental Sustainability Effects of Select Scenarios from volume 1. This report represents the culmination of several years of collaborative effort among national laboratories, government agencies, academic institutions, and industry. BT16 was developed to support the U.S. Department of Energy’s efforts towards national goals of energy security and associated quality of life.

  6. 2016 Billion-ton report: Advancing domestic resources for a thriving bioeconomy, Volume 1: Economic availability of feedstock

    Science.gov (United States)

    M.H. Langholtz; B.J. Stokes; L.M. Eaton

    2016-01-01

    This product builds on previous efforts, namely the 2005 Billion-Ton Study (BTS) and the 2011 U.S. Billion-Ton Update (BT2). With each report, greater perspective is gained on the potential of biomass resources to contribute to a national energy strategy. Similarly, each successive report introduces new questions regarding commercialization challenges. BTS quantified...

  7. U.S. Billion-Ton Update: Biomass Supply for a Bioenergy and Bioproducts Industry

    Energy Technology Data Exchange (ETDEWEB)

    Downing, Mark [ORNL; Eaton, Laurence M [ORNL; Graham, Robin Lambert [ORNL; Langholtz, Matthew H [ORNL; Perlack, Robert D [ORNL; Turhollow Jr, Anthony F [ORNL; Stokes, Bryce [Navarro Research & Engineering; Brandt, Craig C [ORNL

    2011-08-01

    The report, Biomass as Feedstock for a Bioenergy and Bioproducts Industry: The Technical Feasibility of a Billion-Ton Annual Supply (generally referred to as the Billion-Ton Study or 2005 BTS), was an estimate of 'potential' biomass based on numerous assumptions about current and future inventory, production capacity, availability, and technology. The analysis was made to determine if conterminous U.S. agriculture and forestry resources had the capability to produce at least one billion dry tons of sustainable biomass annually to displace 30% or more of the nation's present petroleum consumption. An effort was made to use conservative estimates to assure confidence in having sufficient supply to reach the goal. The potential biomass was projected to be reasonably available around mid-century when large-scale biorefineries are likely to exist. The study emphasized primary sources of forest- and agriculture-derived biomass, such as logging residues, fuel treatment thinnings, crop residues, and perennially grown grasses and trees. These primary sources have the greatest potential to supply large, reliable, and sustainable quantities of biomass. While the primary sources were emphasized, estimates of secondary residue and tertiary waste resources of biomass were also provided. The original Billion-Ton Resource Assessment, published in 2005, was divided into two parts: forest-derived resources and agriculture-derived resources. The forest resources included residues produced during the harvesting of merchantable timber, forest residues, and small-diameter trees that could become available through initiatives to reduce fire hazards and improve forest health; forest residues from land conversion; fuelwood extracted from forests; residues generated at primary forest product processing mills; and urban wood wastes, municipal solid wastes (MSW), and construction and demolition (C&D) debris. For these forest resources, only residues, wastes, and small

  8. Taking out 1 billion tons of CO2: The magic of China's 11th Five-Year Plan?

    International Nuclear Information System (INIS)

    Lin Jiang; Zhou Nan; Levine, Mark; Fridley, David

    2008-01-01

    China's 11th Five-Year Plan (FYP) sets an ambitious target for energy-efficiency improvement: energy intensity of the country's gross domestic product (GDP) should be reduced by 20% from 2005 to 2010 [National Development and Reform Commission (NDRC), 2006. Overview of the 11th Five Year Plan for National Economic and Social Development. NDRC, Beijing]. This is the first time that a quantitative and binding target has been set for energy efficiency, and signals a major shift in China's strategic thinking about its long-term economic and energy development. The 20% energy-intensity target also translates into an annual reduction of over 1.5 billion tons of CO2 by 2010, making the Chinese effort one of the most significant carbon mitigation efforts in the world today. While it is still too early to tell whether China will achieve this target, this paper attempts to understand the trend in energy intensity in China and to explore a variety of options toward meeting the 20% target using a detailed end-use energy model.

  9. Land-Use Change and the Billion Ton 2016 Resource Assessment: Understanding the Effects of Land Management on Environmental Indicators

    Science.gov (United States)

    Kline, K. L.; Eaton, L. M.; Efroymson, R.; Davis, M. R.; Dunn, J.; Langholtz, M. H.

    2016-12-01

    The federal government, led by the U.S. Department of Energy (DOE), quantified potential U.S. biomass resources for expanded production of renewable energy and bioproducts in the 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy (BT16) (DOE 2016). Volume 1 of the report provides analysis of projected supplies from 2015 to 2040. Volume 2 (forthcoming) evaluates changes in environmental indicators for water quality and quantity, carbon, air quality, and biodiversity associated with production scenarios in BT16 volume 1. This presentation will review land-use allocations under the projected biomass production scenarios and the changes in land management that are implied, including drivers of direct and indirect LUC. National and global concerns such as deforestation and displacement of food production are addressed. The choices of reference scenario, input parameters, and constraints (e.g., regarding land classes, availability, and productivity) drive LUC results in any model simulation and are reviewed to put BT16 impacts into context. The principal LUC implied in BT16 supply scenarios involves the transition of 25-to-47 million acres (net) from annual crops in the 2015 baseline to perennial cover by 2040 under the base case and 3% yield growth case, respectively. We conclude that clear definitions of land parameters and effects are essential to assess LUC. A lack of consistency in parameters and outcomes of historic LUC analysis in the U.S. underscores the need for science-based approaches.

  10. Taking out 1 billion tons of CO2: The magic of China's 11th Five-Year Plan?

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Nan; Lin, Jiang; Zhou, Nan; Levine, Mark; Fridley, David

    2007-07-01

    China's 11th Five-Year Plan (FYP) sets an ambitious target for energy-efficiency improvement: energy intensity of the country's gross domestic product (GDP) should be reduced by 20% from 2005 to 2010 (NDRC, 2006). This is the first time that a quantitative and binding target has been set for energy efficiency, and signals a major shift in China's strategic thinking about its long-term economic and energy development. The 20% energy intensity target also translates into an annual reduction of over 1.5 billion tons of CO2 by 2010, making the Chinese effort one of the most significant carbon mitigation efforts in the world today. While it is still too early to tell whether China will achieve this target, this paper attempts to understand the trend in energy intensity in China and to explore a variety of options toward meeting the 20% target using a detailed end-use energy model.

  11. Biomass as Feedstock for a Bioenergy and Bioproducts Industry: The Technical Feasibility of a Billion-Ton Annual Supply

    Energy Technology Data Exchange (ETDEWEB)

    Perlack, R.D.

    2005-12-15

    The purpose of this report is to determine whether the land resources of the United States are capable of producing a sustainable supply of biomass sufficient to displace 30 percent or more of the country's present petroleum consumption--the goal set by the Advisory Committee in their vision for biomass technologies. Accomplishing this goal would require approximately 1 billion dry tons of biomass feedstock per year.

  12. Biomass as Feedstock for a Bioenergy and Bioproducts Industry: The Technical Feasibility of a Billion-Ton Annual Supply, April 2005

    Energy Technology Data Exchange (ETDEWEB)

    None

    2005-04-01

    The purpose of this report is to determine whether the land resources of the United States are capable of producing a sustainable supply of biomass sufficient to displace 30 percent or more of the country’s present petroleum consumption – the goal set by the Biomass R&D Technical Advisory Committee in their vision for biomass technologies. Accomplishing this goal would require approximately 1 billion dry tons of biomass feedstock per year.

  13. Semiportable load-cell-based weighing system prototype of 18.14-metric-ton (20-ton) capacity for UF6 cylinder weight verifications: description and testing procedure

    International Nuclear Information System (INIS)

    McAuley, W.A.

    1984-01-01

    The 18.14-metric-ton-capacity (20-ton) Load-Cell-Based Weighing System (LCBWS) prototype tested at the Oak Ridge (Tennessee) Gaseous Diffusion Plant on March 20-30, 1984, is semiportable and has the potential for being highly accurate. Designed by Brookhaven National Laboratory, it can be moved to cylinders for weighing, as opposed to the practice at most enrichment facilities of moving cylinders to stationary accountability scales. Composed mainly of commercially available, off-the-shelf hardware, the system's principal elements are two load cells that sense the weight (i.e., force) of a uranium hexafluoride (UF6) cylinder suspended from the LCBWS while the cylinder is being weighed. Portability is achieved by its attachment to a double-hook, overhead-bridge crane. The LCBWS prototype is designed to weigh 9.07- and 12.70-metric-ton (10- and 14-ton) UF6 cylinders. A detailed description of the LCBWS is given, design information and criteria are supplied, a testing procedure is outlined, and initial test results are reported. A major objective of the testing is to determine the reliability and accuracy of the system. Other testing objectives include the identification of (1) potential areas for system improvements and (2) procedural modifications that will reflect an improved and more efficient system. The testing procedure described includes, but is not limited to, methods that account for temperature sensitivity of the instrumentation, the local variation in the acceleration due to gravity, and buoyancy effects. Operational and safety considerations are noted. A preliminary evaluation of the March test data indicates that the LCBWS prototype has the potential to have an accuracy in the vicinity of 1 kg.
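
    The abstract names corrections for local gravity and air buoyancy without detailing them. The following minimal Python sketch illustrates how such corrections are conventionally applied to a force-based (load-cell) reading; every numeric parameter is a hypothetical placeholder, and this is not the LCBWS procedure itself.

```python
# Illustrative sketch of gravity and buoyancy corrections for a load-cell
# weighing, as mentioned in the abstract above. All values are hypothetical.

def corrected_mass_kg(reading_kg: float,
                      g_local: float = 9.7985,   # m/s^2 at the weighing site (assumed)
                      g_cal: float = 9.8066,     # m/s^2 at the calibration site (assumed)
                      rho_air: float = 1.20,     # kg/m^3 ambient air density (assumed)
                      rho_ref: float = 8000.0,   # kg/m^3 density of calibration weights
                      rho_obj: float = 3500.0) -> float:  # kg/m^3 bulk density of the loaded cylinder (assumed)
    """Estimate the true mass behind an indicated load-cell reading."""
    # A load cell senses force, so a reading taken where gravity differs from
    # the calibration site is rescaled by the ratio of the two accelerations.
    mass = reading_kg * g_cal / g_local
    # Conventional air-buoyancy correction relative to the calibration weights.
    mass *= (1 - rho_air / rho_ref) / (1 - rho_air / rho_obj)
    return mass

# Example: a nominal 12.70-metric-ton (12,700 kg) indicated reading.
print(f"{corrected_mass_kg(12700.0):.1f} kg")
```

    Temperature sensitivity of the instrumentation, also mentioned above, would typically be handled by a further multiplicative correction derived from the load cells' calibration data.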

  14. 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy, Volume 2: Environmental Sustainability Effects of Select Scenarios from Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Efroymson, Rebecca Ann [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Langholtz, Matthew H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-11

    With the goal of understanding environmental effects of a growing bioeconomy, the U.S. Department of Energy (DOE), national laboratories, and U.S. Forest Service research laboratories, together with academic and industry collaborators, undertook a study to estimate environmental effects of potential biomass production scenarios in the United States, with an emphasis on agricultural and forest biomass. Potential effects investigated include changes in soil organic carbon (SOC), greenhouse gas (GHG) emissions, water quality and quantity, air emissions, and biodiversity. Effects of altered land-management regimes were analyzed based on select county-level biomass-production scenarios for 2017 and 2040 taken from the 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy (BT16), volume 1, which assumes that the land bases for agriculture and forestry would not change over time. The scenarios reflect constraints on biomass supply (e.g., excluded areas; implementation of management practices; and consideration of food, feed, forage, and fiber demands and exports) that are intended to address sustainability concerns. Nonetheless, both beneficial and adverse environmental effects might be expected. To characterize these potential effects, this research sought to estimate where and under what modeled scenarios or conditions positive and negative environmental effects could occur nationwide. The report also includes a discussion of land-use change (LUC) (i.e., land management change) assumptions associated with the scenario transitions (but not including analysis of indirect LUC [ILUC]), analyses of climate sensitivity of feedstock productivity under a set of potential scenarios, and a qualitative environmental effects analysis of algae production under carbon dioxide (CO2) co-location scenarios. Because BT16 biomass supplies are simulated independently of a defined end use, most analyses do not include benefits from displacing fossil fuels or other

  15. Production equipment development needs for a 700 metric ton/year light water reactor mixed oxide fuel manufacturing plant

    International Nuclear Information System (INIS)

    Blahnik, D.E.

    1977-09-01

    A literature search and survey of fuel suppliers were conducted to determine how much development of production equipment is needed for a 700-metric-ton/y LWR mixed-oxide (UO2-PuO2) fuel fabrication plant. Results indicate that moderate to major production equipment development is needed in the powder and pellet processing areas. The equipment in the rod and assembly processing areas needs only minor development effort. Required equipment development for a 700 MT/y plant is not anticipated to delay startup of the plant. The development, whether major or minor, can be done well within the time frame for licensing and construction of the plant as long as conventional production equipment is used.

  16. 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy, Volume 2: Environmental Sustainability Effects of Select Scenarios from Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Efroymson, Rebecca Ann [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Langholtz, Matthew H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Kristen [Dept. of Energy (DOE), Washington DC (United States); Stokes, Bryce [Allegheny Science & Technology, LLC, Bridgeport, WV (United States); Brandt, Craig C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davis, Maggie R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hellwinckel, Chad [Univ. of Tennessee, Knoxville, TN (United States); Kline, Keith L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Eaton, Laurence M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Dunn, Jennifer [Argonne National Lab. (ANL), Argonne, IL (United States); Canter, Christina E. [Argonne National Lab. (ANL), Argonne, IL (United States); Qin, Zhangcai [Argonne National Lab. (ANL), Argonne, IL (United States); Cai, Hao [Argonne National Lab. (ANL), Argonne, IL (United States); Wang, Michael [Argonne National Lab. (ANL), Argonne, IL (United States); Scott, D. Andrew [USDA Forest Service, Normal, AL (United States); Jager, Henrietta I. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wu, May [Argonne National Lab. (ANL), Argonne, IL (United States); Ha, Miae [Argonne National Lab. (ANL), Argonne, IL (United States); Baskaran, Latha Malar [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kreig, Jasmine A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Rau, Benjamin [USDA Forest Service, Aiken, SC (United States); Muwamba, Augustine [Univ. of Georgia, Athens, GA (United States); Trettin, Carl [USDA Forest Service, Aiken, SC (United States); Panda, Sudhanshu [Univ. of North Georgia, Oakwood, GA (United States); Amatya, Devendra M. [USDA Forest Service, Aiken, SC (United States); Tollner, Ernest W. [USDA Forest Service, Aiken, SC (United States); Sun, Ge [USDA Forest Service, Aiken, SC (United States); Zhang, Liangxia [USDA Forest Service, Aiken, SC (United States); Duan, Kai [North Carolina State Univ., Raleigh, NC (United States); Warner, Ethan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Zhang, Yimin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Inman, Daniel [National Renewable Energy Lab. (NREL), Golden, CO (United States); Eberle, Annika [National Renewable Energy Lab. (NREL), Golden, CO (United States); Carpenter, Alberta [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heath, Garvin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hettinger, Dylan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wang, Gangsheng [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sutton, Nathan J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Busch, Ingrid Karin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Donner, Deahn M. [USDA Forest Service, Aiken, SC (United States); Wigley, T. Bently [National Council for Air and Stream Improvement (NCASI), Research Triangle Park, NC (United States); Miller, Darren A. [Weyerhaeuser Company, Federal Way, WA (United States); Coleman, Andre [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wigmosta, Mark [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Pattullo, Molly [Univ. of Tennessee, Knoxville, TN (United States); Mayes, Melanie [Oak Ridge National Lab. 
(ORNL), Oak Ridge, TN (United States); Daly, Christopher [Oregon State Univ., Corvallis, OR (United States); Halbleib, Mike [Oregon State Univ., Corvallis, OR (United States); Negri, Cristina [Argonne National Lab. (ANL), Argonne, IL (United States); Turhollow, Anthony F. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bonner, Ian [Monsanto Company, Twin Falls, ID (United States); Dale, Virginia H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-01

    With the goal of understanding environmental effects of a growing bioeconomy, the U.S. Department of Energy (DOE), national laboratories, and U.S. Forest Service research laboratories, together with academic and industry collaborators, undertook a study to estimate environmental effects of potential biomass production scenarios in the United States, with an emphasis on agricultural and forest biomass. Potential effects investigated include changes in soil organic carbon (SOC), greenhouse gas (GHG) emissions, water quality and quantity, air emissions, and biodiversity. Effects of altered land-management regimes were analyzed based on select county-level biomass-production scenarios for 2017 and 2040 taken from the 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy (BT16), volume 1, which assumes that the land bases for agriculture and forestry would not change over time. The scenarios reflect constraints on biomass supply (e.g., excluded areas; implementation of management practices; and consideration of food, feed, forage, and fiber demands and exports) that are intended to address sustainability concerns. Nonetheless, both beneficial and adverse environmental effects might be expected. To characterize these potential effects, this research sought to estimate where and under what modeled scenarios or conditions positive and negative environmental effects could occur nationwide. The report also includes a discussion of land-use change (LUC) (i.e., land management change) assumptions associated with the scenario transitions (but not including analysis of indirect LUC [ILUC]), analyses of climate sensitivity of feedstock productivity under a set of potential scenarios, and a qualitative environmental effects analysis of algae production under carbon dioxide (CO2) co-location scenarios. Because BT16 biomass supplies are simulated independently of a defined end use, most analyses do not include benefits from displacing fossil fuels or

  17. Proposed method for assigning metric tons of heavy metal values to defense high-level waste forms to be disposed of in a geologic repository

    International Nuclear Information System (INIS)

    1987-08-01

    A proposed method is described for assigning an equivalent metric ton heavy metal (eMTHM) value to defense high-level waste forms to be disposed of in a geologic repository. This method for establishing a curie equivalency between defense high-level waste and irradiated commercial fuel is based on the ratio of defense fuel exposure to the typical commercial fuel exposure, MWd/MTHM. Application of this technique to defense high-level wastes is described. Additionally, this proposed technique is compared to several alternate calculations for eMTHM. 15 refs., 2 figs., 10 tabs.

  18. Connecting the last billion

    OpenAIRE

    Ben David, Yahel

    2015-01-01

    The last billion people to join the online world are likely to face at least one of two obstacles. Part I: Rural Internet Access. Rural, sparsely populated areas make conventional infrastructure investments unfeasible: big corporations attempt to address this challenge via the launch of Low-Earth-Orbiting (LEO) satellite constellations, fleets of high-altitude balloons, and giant solar-powered drones; although these grandiose initiatives hold potential, they are costly and risky. At the same time...

  19. Defining a standard metric for electricity savings

    International Nuclear Information System (INIS)

    Koomey, Jonathan; Akbari, Hashem; Blumstein, Carl; Brown, Marilyn; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B; Greenberg, Steve

    2010-01-01

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T&D losses. Displacing such a plant for one year would save 3 billion kWh/year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question: Dr Arthur H Rosenfeld.
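
    As a quick check of the figures quoted above, the arithmetic behind the avoided-plant metric can be reproduced in a few lines. The plant parameters come from the abstract; the coal emission factor used below (roughly 1 metric ton of CO2 per MWh generated) is an assumption typical of existing plants, not a value taken from the letter.

```python
# Back-of-the-envelope reproduction of the "Rosenfeld" avoided-plant numbers.
HOURS_PER_YEAR = 8760

capacity_mw = 500        # prototypical existing coal plant (from the abstract)
capacity_factor = 0.70   # fraction of the year the plant effectively runs
td_losses = 0.07         # transmission and distribution losses
co2_t_per_mwh = 1.0      # assumed emission factor, metric tons CO2 per MWh generated

generation_mwh = capacity_mw * HOURS_PER_YEAR * capacity_factor  # ~3.07e6 MWh
delivered_mwh = generation_mwh * (1 - td_losses)                 # ~2.85e6 MWh at the meter
co2_million_t = generation_mwh * co2_t_per_mwh / 1e6             # ~3.1 million metric tons

print(f"Generation:   {generation_mwh / 1e6:.2f} billion kWh/yr")  # 1e6 MWh = 1 billion kWh
print(f"At the meter: {delivered_mwh / 1e6:.2f} billion kWh/yr")
print(f"Avoided CO2:  {co2_million_t:.1f} million metric tons/yr")
```

    Rounded to one significant figure, these are the 3 billion kWh per year and 3 million metric tons of CO2 per year quoted in the letter.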

  20. Defining a standard metric for electricity savings

    Energy Technology Data Exchange (ETDEWEB)

    Koomey, Jonathan [Lawrence Berkeley National Laboratory and Stanford University, PO Box 20313, Oakland, CA 94620-0313 (United States); Akbari, Hashem; Blumstein, Carl; Brown, Marilyn; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B; Greenberg, Steve, E-mail: JGKoomey@stanford.ed

    2010-01-15

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T&D losses. Displacing such a plant for one year would save 3 billion kWh/year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question: Dr Arthur H Rosenfeld.

  1. Defining a Standard Metric for Electricity Savings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Marilyn; Akbari, Hashem; Blumstein, Carl; Koomey, Jonathan; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H.; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B.; Greenberg, Steve; Hafemeister, David; Harris, Jeff; Harvey, Hal; Heitz, Eric; Hirst, Eric; Hummel, Holmes; Kammen, Dan; Kelly, Henry; Laitner, Skip; Levine, Mark; Lovins, Amory; Masters, Gil; McMahon, James E.; Meier, Alan; Messenger, Michael; Millhone, John; Mills, Evan; Nadel, Steve; Nordman, Bruce; Price, Lynn; Romm, Joe; Ross, Marc; Rufo, Michael; Sathaye, Jayant; Schipper, Lee; Schneider, Stephen H; Sweeney, James L; Verdict, Malcolm; Vorsatz, Diana; Wang, Devra; Weinberg, Carl; Wilk, Richard; Wilson, John; Worrell, Ernst

    2009-03-01

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T&D losses. Displacing such a plant for one year would save 3 billion kWh per year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question: Dr. Arthur H. Rosenfeld.

  2. Transportation system benefits of early deployment of a 75-ton multipurpose canister system

    International Nuclear Information System (INIS)

    Wankerl, M.W.; Schmid, S.P.

    1995-01-01

    In 1993 the US Civilian Radioactive Waste Management System (CRWMS) began developing two multipurpose canister (MPC) systems to provide a standardized method for interim storage and transportation of spent nuclear fuel (SNF) at commercial nuclear power plants. One is a 75-ton concept with an estimated payload of about 6 metric tons (t) of SNF, and the other is a 125-ton concept with an estimated payload of nearly 11 t of SNF. These payloads are two to three times the payloads of the largest currently certified US rail transport casks, the IF-300. Although it is recognized that a fully developed 125-ton MPC system is likely to provide a greater cost benefit and radiation exposure benefit than the lower-capacity 75-ton MPC, the authors of this paper suggest that development and deployment of the 75-ton MPC prior to developing and deploying a 125-ton MPC is a desirable strategy. Reasons that support this strategy are discussed in this paper.

  3. Winglets Save Billions of Dollars in Fuel Costs

    Science.gov (United States)

    2010-01-01

    The upturned ends now featured on many airplane wings are saving airlines billions of dollars in fuel costs. Called winglets, the drag-reducing technology was advanced through the research of Langley Research Center engineer Richard Whitcomb and through flight tests conducted at Dryden Flight Research Center. Seattle-based Aviation Partners Boeing -- a partnership between Aviation Partners Inc., of Seattle, and The Boeing Company, of Chicago -- manufactures Blended Winglets, a unique design featured on Boeing aircraft around the world. These winglets have saved more than 2 billion gallons of jet fuel to date, representing a cost savings of more than $4 billion and a reduction of almost 21.5 million tons in carbon dioxide emissions.

  4. A billion-dollar bonanza

    International Nuclear Information System (INIS)

    Isaacs, J.

    1993-01-01

    In late May -- only weeks after Congress had rejected the president's economic stimulus package because it would add to the federal deficit -- the House of Representatives generously allocated an extra $1.2 billion to the Pentagon. This article discusses some of the rationalizations House members gave for the gift and describes the attempts of a bipartisan group to defeat this request for funds propounded by Pennsylvania Democrat John Murtha. The gist of the arguments for and against the $1.2 billion and the results of votes on the bill are presented.

  5. 12 billion DM for Germany

    International Nuclear Information System (INIS)

    Anon.

    1975-01-01

    The German atomic industry has achieved the breakthrough to the world market: Brazil orders eight nuclear electricity generating plants from the Siemens-AEG subsidiary Kraftwerk-Union. US companies attacked the twelve-billion-DM deal, the biggest export order in the history of German industry. To no avail: the contract is to be signed in Bonn this week. (orig./LH) [de

  6. Nuclear business worth billions begins

    International Nuclear Information System (INIS)

    Beer, G.; Marcan, P.; Slovak, K.

    2005-01-01

    specific data regarding the direct costs of decommissioning. Preliminary estimates put the figure at 50 billion Slovak crowns (1.28 billion EUR), but the actual costs will mainly depend on the volume of nuclear waste to be disposed of. (authors)

  7. Semantic metrics

    OpenAIRE

    Hu, Bo; Kalfoglou, Yannis; Dupplaw, David; Alani, Harith; Lewis, Paul; Shadbolt, Nigel

    2006-01-01

    In the context of the Semantic Web, many ontology-related operations, e.g. ontology ranking, segmentation, alignment, articulation, reuse, evaluation, can be boiled down to one fundamental operation: computing the similarity and/or dissimilarity among ontological entities, and in some cases among ontologies themselves. In this paper, we review standard metrics for computing distance measures and we propose a series of semantic metrics. We give a formal account of semantic metrics drawn from a...

  8. Countdown to Six Billion Teaching Kit.

    Science.gov (United States)

    Zero Population Growth, Inc., Washington, DC.

    This teaching kit features six activities focused on helping students understand the significance of the world population reaching six billion for our society and our environment. Featured activities include: (1) History of the World: Part Six Billion; (2) A Woman's Place; (3) Baby-O-Matic; (4) Earth: The Apple of Our Eye; (5) Needs vs. Wants; and…

  9. Biomass as feedstock for a bioenergy and bioproducts industry: The technical feasibility of a billion-ton annual supply

    Energy Technology Data Exchange (ETDEWEB)

    Perlack, Robert D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wright, Lynn L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Turhollow, Anthony F. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Graham, Robin L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Stokes, Bryce J. [U.S. Department of Agriculture, Washington, D.C. (United States); Erbach, Donald C. [U.S. Department of Agriculture, Washington, D.C. (United States)

    2005-04-01

    The purpose of this report is to determine whether the land resources of the United States are capable of producing a sustainable supply of biomass sufficient to displace 30% or more of the country's present petroleum consumption.

  10. Metric learning

    CERN Document Server

    Bellet, Aurelien; Sebban, Marc

    2015-01-01

    Similarity between objects plays an important role in both human cognitive processes and artificial systems for recognition and categorization. How to appropriately measure such similarities for a given task is crucial to the performance of many machine learning, pattern recognition and data mining methods. This book is devoted to metric learning, a set of techniques to automatically learn similarity and distance functions from data that has attracted a lot of interest in machine learning and related fields in the past ten years. In this book, we provide a thorough review of the metric learnin

  11. Seven Billion People: Fostering Productive Struggle

    Science.gov (United States)

    Murawska, Jaclyn M.

    2018-01-01

    How can a cognitively demanding real-world task such as the Seven Billion People problem promote productive struggle "and" help shape students' mathematical dispositions? Driving home from school one evening, Jaclyn Murawska heard a commentator on the radio announce three statements: (1) experts had determined that the world population…

  12. Multi-Ton Argon and Xenon

    Energy Technology Data Exchange (ETDEWEB)

    Alarcon, Ricardo; Balascuta, Septimiu; Alton, Drew; Aprile, Elena; Giboni, Karl-Ludwig; Haruyama, Tom; Lang, Rafael; Melgarejo, Antonio Jesus; Ni, Kaixuan; Plante, Guillaume; Choi, Bin [et al.

    2009-01-01

    There is a wide range of astronomical evidence that the visible stars and gas in all galaxies, including our own, are immersed in a much larger cloud of non-luminous matter, typically an order of magnitude greater in total mass. The existence of this ''dark matter'' is consistent with evidence from large-scale galaxy surveys and microwave background measurements, indicating that the majority of matter in the universe is non-baryonic. The nature of this non-baryonic component is still totally unknown, and the resolution of the ''dark matter puzzle'' is of fundamental importance to cosmology, astrophysics, and elementary particle physics. A leading explanation, motivated by supersymmetry theory, is the existence of as yet undiscovered Weakly Interacting Massive Particles (WIMPs), formed in the early universe and subsequently clustered in association with normal matter. WIMPs could, in principle, be detected in terrestrial experiments by their collisions with ordinary nuclei, giving observable low energy (< 100 keV) nuclear recoils. The predicted low collision rates require ultra-low background detectors with large (0.1-10 ton) target masses, located in deep underground sites to eliminate neutron background from cosmic ray muons. The establishment of the Deep Underground Science and Engineering Laboratory for large-scale experiments of this type would strengthen the current leadership of US researchers in this and other particle astrophysics areas. We propose to detect nuclear recoils by scintillation and ionization in ton-scale liquid noble gas targets, using techniques already proven in experiments at the 0.01-0.1 ton level. The experimental challenge is to identify these events in the presence of background events from gammas, neutrons, and alphas.

  13. ICARUS 600 ton: A status report

    CERN Document Server

    Vignoli, C; Badertscher, A; Barbieri, E; Benetti, P; Borio di Tigliole, A; Brunetti, R; Bueno, A; Calligarich, E; Campanelli, Mario; Carli, F; Carpanese, C; Cavalli, D; Cavanna, F; Cennini, P; Centro, S; Cesana, A; Chen, C; Chen, Y; Cinquini, C; Cline, D; De Mitri, I; Dolfini, R; Favaretto, D; Ferrari, A; Gigli Berzolari, A; Goudsmit, P; He, K; Huang, X; Li, Z; Lu, F; Ma, J; Mannocchi, G; Mauri, F; Mazza, D; Mazzone, L; Montanari, C; Nurzia, G P; Otwinowski, S; Palamara, O; Pascoli, D; Pepato, A; Periale, L; Petrera, S; Piano Mortari, Giovanni; Piazzoli, A; Picchi, P; Pietropaolo, F; Rancati, T; Rappoldi, A; Raselli, G L; Rebuzzi, D; Revol, J P; Rico, J; Rossella, M; Rossi, C; Rubbia, C; Rubbia, A; Sala, P; Scannicchio, D; Sergiampietri, F; Suzuki, S; Terrani, M; Ventura, S; Verdecchia, M; Wang, H; Woo, J; Xu, G; Xu, Z; Zhang, C; Zhang, Q; Zheng, S

    2000-01-01

    The goal of the ICARUS Project is the installation of a multi-kiloton LAr TPC in the underground Gran Sasso Laboratory. The programme foresees the realization of the detector in a modular way. The first step is the construction of a 600 ton module which is now at an advanced phase. It will be mounted and tested in Pavia in one year and then it will be moved to Gran Sasso for the final operation. The major cryogenic and purification systems and the mechanical components of the detector have been constructed and tested in a 10 m3 prototype. The results of these tests are here summarized.

  14. Metrication manual

    International Nuclear Information System (INIS)

    Harper, A.F.A.; Digby, R.B.; Thong, S.P.; Lacey, F.

    1978-04-01

    In April 1978 a meeting of senior metrication officers convened by the Commonwealth Science Council of the Commonwealth Secretariat, was held in London. The participants were drawn from Australia, Bangladesh, Britain, Canada, Ghana, Guyana, India, Jamaica, Papua New Guinea, Solomon Islands and Trinidad and Tobago. Among other things, the meeting resolved to develop a set of guidelines to assist countries to change to SI and to compile such guidelines in the form of a working manual

  15. FY97 nuclear-related budgets total 493 billion yen (4.4 billion dollars)

    International Nuclear Information System (INIS)

    Anon.

    1996-01-01

    On September 13, the Atomic Energy Commission of Japan announced the estimated nuclear-related budget requests for FY1997 (April 1997 - March 1998), giving the breakdowns for eight ministries and agencies. The total amount requested by the government bodies was 493.3 billion yen, a 0.8% increase compared with FY96. This figure includes the budget requests of the Science and Technology Agency (STA), the Ministry of International Trade and Industry (MITI), the Ministry of Foreign Affairs, the Ministry of Transport, the Ministry of Agriculture, Forestry and Fisheries, the Okinawa Development Agency, and the Ministry of Home Affairs, but excludes the budget request made by the Ministry of Education. The budget requests of STA and MITI are 360 billion yen and 126 billion yen, respectively. On August 29, STA released its estimated FY97 budget request. The nuclear-related 360.4 billion yen is 0.9% more than the year before. Of this sum, 199.9 billion yen is in the general account, and 160.6 billion yen is in the special account for power source development. The details of the nuclear-related amounts are explained. On August 26, MITI released its estimated budget request for FY97, and of the nuclear-related 125.7 billion yen (a 0.1% increase from FY96), 200 million yen is in the general account, and 98.9 billion yen and 26.6 billion yen are in the special accounts for power resource development and power source diversification, respectively. (K.I.)

  16. Four billion people facing severe water scarcity.

    Science.gov (United States)

    Mekonnen, Mesfin M; Hoekstra, Arjen Y

    2016-02-01

    Freshwater scarcity is increasingly perceived as a global systemic risk. Previous global water scarcity assessments, measuring water scarcity annually, have underestimated experienced water scarcity by failing to capture the seasonal fluctuations in water consumption and availability. We assess blue water scarcity globally at a high spatial resolution on a monthly basis. We find that two-thirds of the global population (4.0 billion people) live under conditions of severe water scarcity at least 1 month of the year. Nearly half of those people live in India and China. Half a billion people in the world face severe water scarcity all year round. Putting caps on water consumption by river basin, increasing water-use efficiencies, and better sharing of the limited freshwater resources will be key in reducing the threat posed by water scarcity on biodiversity and human welfare.

  17. Improvements in BTS estimation of ton-miles

    Science.gov (United States)

    2004-08-01

    Ton-miles (one ton of freight shipped one mile) is the primary physical measure of freight transportation output. This paper describes improved measurements of ton-miles for air, truck, rail, water, and pipeline modes. Each modal measure contains a d...

  18. $17 billion needed by year 2000.

    Science.gov (United States)

    Finger, W R

    1995-09-01

    The United Nations Population Fund (UNFPA) estimates that US$17 billion will be needed to fund reproductive health care in developing countries by the year 2000. About US$10 billion of this would go for family planning; currently, the amount spent on family planning is about US$5 billion. Donors are focusing on fewer countries because of limited resources. The United States Agency for International Development (USAID) is planning to phase out support for family planning in Jamaica and Brazil because the programs there have advanced sufficiently. Resources will be shifted to countries with more pressing needs. Dr. Richard Osborn, senior technical officer for UNFPA, states that UNFPA works with national program managers in allocating resources at the macro level (commodities, training). Currently, two-thirds of family planning funds spent worldwide come from developing country governments (mainly China, India, Indonesia, Mexico, South Africa, Turkey, and Bangladesh). Sustaining programs, much less adding new services, will be difficult. User fees and public-private partnerships are being considered; worldwide, consumers currently provide about 14% of family planning funds (the portion is higher in most Latin American countries). In a few countries, insurance, social security, and other public-private arrangements contribute. Social marketing programs are being considered that would remove constraints on prescriptions and prices and improve the quality of services so that clients would be more willing to pay for contraceptives. Although governments are attempting to fit family planning into their health care budgets, estimates at the national level are difficult to make. Standards are needed to make expenditure estimates quickly and at low cost, according to Dr. Barbara Janowitz of FHI, which is developing guidelines. Studies in Bangladesh, Ecuador, Ghana, Mexico, and the Philippines are being conducted, with the assistance of The Evaluation Project at the Population

  19. Reheating experiment in the 35-ton pile

    International Nuclear Information System (INIS)

    Cherot, J.; Girard, Y.

    1957-01-01

    When the 35-ton pile was started up it was necessary, in order to study certain effects (xenon, for example), to know the antireactivity value of the rods as a function of their dimensions. In the reheating experiment we made use of the possibility of raising the temperature of the graphite-uranium block by simple heating in order to determine the antireactivity curves of the rods, and from these the overall temperature coefficient. For the latter we considered two solutions: first, one in which the average temperature of the pile is defined as the arithmetic mean of the values given by the 28 thermocouples distributed throughout the pile; second, one in which the temperature is likened to a poisoning and is weighted by the square of the flux. The way in which the measurements were made is indicated, and the different instruments used are described. The method of reheating does not permit the separation of the temperature coefficients of uranium and graphite. The precision obtained is only moderate; it suffers from the changes of various parameters required by other manipulations carried out simultaneously (lifetime modulators, for example), and it is limited by the comparatively restricted time allowed. It is evident, of course, that more careful stabilisation at the different plateaux chosen would have required longer periods of reheating. (author) [fr

  20. From Throw Weights to Metric Tons: The Radioactive Waste Problems of Russia's Northern Fleet

    National Research Council Canada - National Science Library

    Pruefer, Donald

    2000-01-01

    .... In a microcosm of the shortsighted planning, reckless development and lack of ecological concern that epitomized the Soviet era, 70 decommissioned nuclear submarines are currently moored in ports...

  1. Origins fourteen billion years of cosmic evolution

    CERN Document Server

    Tyson, Neil deGrasse

    2004-01-01

    Origins explores cosmic science's stunning new insights into the formation and evolution of our universe--of the cosmos, of galaxies and galaxy clusters, of stars within galaxies, of planets that orbit those stars, and of different forms of life that take us back to the first three seconds and forward through three billion years of life on Earth to today's search for life on other planets. Drawing on the current cross-pollination of geology, biology and astrophysics, Origins explains the thrilling daily breakthroughs in our knowledge of the universe from dark energy to life on Mars to the mysteries of space and time. Distilling complex science in clear and lively prose, co-authors Neil deGrasse Tyson and Donald Goldsmith conduct a galvanising tour of the cosmos revealing what the universe has been up to while turning part of itself into us.

  2. Death of the TonB Shuttle Hypothesis.

    Science.gov (United States)

    Gresock, Michael G; Savenkova, Marina I; Larsen, Ray A; Ollis, Anne A; Postle, Kathleen

    2011-01-01

    A complex of ExbB, ExbD, and TonB couples cytoplasmic membrane (CM) proton motive force (pmf) to the active transport of large, scarce, or important nutrients across the outer membrane (OM). TonB interacts with OM transporters to enable ligand transport. Several mechanical models and a shuttle model explain how TonB might work. In the mechanical models, TonB remains attached to the CM during energy transduction, while in the shuttle model the TonB N terminus leaves the CM to deliver conformationally stored potential energy to OM transporters. Previous studies suggested that TonB did not shuttle based on the activity of a GFP-TonB fusion that was anchored in the CM by the GFP moiety. When we recreated the GFP-TonB fusion to extend those studies, in our hands it was proteolytically unstable, giving rise to potentially shuttleable degradation products. Recently, we discovered that a fusion of the Vibrio cholerae ToxR cytoplasmic domain to the N terminus of TonB was proteolytically stable. ToxR-TonB was able to be completely converted into a proteinase K-resistant conformation in response to loss of pmf in spheroplasts and exhibited an ability to form a pmf-dependent formaldehyde crosslink to ExbD, both indicators of its location in the CM. Most importantly, ToxR-TonB had the same relative specific activity as wild-type TonB. Taken together, these results provide conclusive evidence that TonB does not shuttle during energy transduction. We had previously concluded that TonB shuttles based on the use of an Oregon Green(®) 488 maleimide probe to assess periplasmic accessibility of N-terminal TonB. Here we show that the probe was permeant to the CM, thus permitting the labeling of the TonB N-terminus. These former results are reinterpreted in the context that TonB does not shuttle, and suggest the existence of a signal transduction pathway from OM to cytoplasm.

  3. Death of the TonB shuttle hypothesis

    Directory of Open Access Journals (Sweden)

    Michael George Gresock

    2011-10-01

    Full Text Available A complex of ExbB, ExbD, and TonB transduces cytoplasmic membrane (CM proton motive force (pmf to outer membrane (OM transporters so that large, scarce, and important nutrients can be released into the periplasmic space for subsequent transport across the CM. TonB is the component that interacts with the OM transporters and enables ligand transport, and several mechanical models and a shuttle model explain how TonB might work. In the mechanical models, TonB remains attached to the CM during energy transduction, while in the shuttle model the TonB N terminus leaves the CM to deliver conformationally stored potential energy to OM transporters. Previous efforts to test the shuttle model by anchoring TonB to the CM by fusion to a large globular cytoplasmic protein have been hampered by the proteolytic susceptibility of the fusion constructs. Here we confirm that GFP-TonB, tested in a previous study by another laboratory, again gave rise to full-length TonB and slightly larger potentially shuttleable fragments that prevented unambiguous interpretation of the data. Recently, we discovered that a fusion of the Vibrio cholerae ToxR cytoplasmic domain to the N terminus of TonB was proteolytically stable. ToxR-TonB was able to be completely converted into a proteinase K-resistant conformation in response to loss of pmf in spheroplasts and exhibited an ability to form a pmf-dependent formaldehyde crosslink to ExbD, both indicators of its location in the CM. Most importantly, ToxR-TonB had the same relative specific activity as wild-type TonB. Taken together, these results provide the first conclusive evidence that TonB does not shuttle during energy transduction. The interpretations of our previous study, which concluded that TonB shuttled in vivo, were complicated by the fact that the probe used in those studies, Oregon Green® 488 maleimide, was permeant to the CM and could label proteins, including a TonB ∆TMD derivative, confined exclusively to the

  4. Uranium in Canada: Billion-dollar industry

    International Nuclear Information System (INIS)

    Whillans, R.T.

    1989-01-01

    In 1988, Canada maintained its position as the world's leading producer and exporter of uranium; five primary uranium producers reported concentrate output containing 12,400 MT of uranium, or about one-third of Western production. Uranium shipments made by these producers in 1988 exceeded 13,200 MT, worth Canadian $1.1 billion. Because domestic requirements represent only 15% of current Canadian output, most of Canada's uranium production is available for export. Despite continued market uncertainty in 1988, Canada's uranium producers signed new sales contracts for some 14,000 MT, twice the 1987 level. About 90% of this new volume is with the US, now Canada's major uranium customer. The recent implementation of the Canada/US Free Trade Agreement brings benefits to both countries; the uranium industries in each can now develop in an orderly, free market. Canada's uranium industry was restructured and consolidated in 1988 through merger and acquisition; three new uranium projects advanced significantly. Canada's new policy on nonresident ownership in the uranium mining sector, designed to encourage both Canadian and foreign investment, should greatly improve efforts to finance the development of recent Canadian uranium discoveries.

  5. Metrics of quantum states

    International Nuclear Information System (INIS)

    Ma Zhihao; Chen Jingling

    2011-01-01

    In this work we study metrics of quantum states, which are natural generalizations of the usual trace metric and the Bures metric. Some useful properties of the metrics are proved, such as joint convexity and contractivity under quantum operations. Our results have potential applications in studying the geometry of quantum states as well as entanglement detection.
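
    For reference, the two baseline distances this record generalizes can be written out; the following are the standard textbook definitions of the trace distance and the Bures distance (not formulas quoted from the paper):

\[
D_{\mathrm{tr}}(\rho,\sigma)=\tfrac{1}{2}\,\mathrm{Tr}\,|\rho-\sigma|,\qquad
D_{B}(\rho,\sigma)=\sqrt{2\bigl(1-\sqrt{F(\rho,\sigma)}\bigr)},\qquad
F(\rho,\sigma)=\Bigl(\mathrm{Tr}\sqrt{\sqrt{\rho}\,\sigma\sqrt{\rho}}\Bigr)^{2}.
\]

    Contractivity then means \(D(\Phi(\rho),\Phi(\sigma))\le D(\rho,\sigma)\) for every completely positive trace-preserving map \(\Phi\).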

  6. $\eta$-metric structures

    OpenAIRE

    Gaba, Yaé Ulrich

    2017-01-01

    In this paper, we discuss recent results about generalized metric spaces and fixed point theory. We introduce the notion of $\eta$-cone metric spaces, give some topological properties, and prove some fixed point theorems for contractive-type maps on these spaces. In particular we show that these $\eta$-cone metric spaces are natural generalizations of both cone metric spaces and metric type spaces.

  7. Response Surface Model (RSM)-based Benefit Per Ton Estimates

    Science.gov (United States)

    The tables below are updated versions of the tables appearing in The influence of location, source, and emission type in estimates of the human health benefits of reducing a ton of air pollution (Fann, Fulcher and Hubbell 2009).

  8. 305 Building 2 ton bridge crane and monorail assembly analysis

    International Nuclear Information System (INIS)

    Axup, M.D.

    1995-12-01

    The analyses in the appendix of this document evaluate the integrity of the existing bridge crane structure, as depicted on drawing H-3-34292, for a bridge crane and monorail assembly with a load rating of 2 tons. This bridge crane and monorail assembly is a modification of a 1 1/2 ton rated manipulator bridge crane which originally existed in the 305 building

  9. Proton collider breaks the six-billion-dollar barrier

    CERN Multimedia

    Vaughan, C

    1990-01-01

    The SSC will cost at least $1 billion more than its estimated final price of $5.9 billion. Critics in Congress believe the final bill could be double that figure. The director of the SSC blames most of the increase in cost on technical problems with developing the superconducting magnets for the SSC (1/2 page).

  10. METRIC context unit architecture

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, R.O.

    1988-01-01

    METRIC is an architecture for a simple but powerful Reduced Instruction Set Computer (RISC). Its speed comes from the simultaneous processing of several instruction streams, with instructions from the various streams being dispatched into METRIC's execution pipeline as they become available for execution. The pipeline is thus kept full, with a mix of instructions for several contexts in execution at the same time. True parallel programming is supported within a single execution unit, the METRIC Context Unit. METRIC's architecture provides for expansion through the addition of multiple Context Units and of specialized Functional Units. The architecture thus spans a range of size and performance from a single-chip microcomputer up through large and powerful multiprocessors. This research concentrates on the specification of the METRIC Context Unit at the architectural level. Performance tradeoffs made during METRIC's design are discussed, and projections of METRIC's performance are made based on simulation studies.

  11. Factory Acceptance Test Procedure Westinghouse 100 ton Hydraulic Trailer

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1994-01-01

    This Factory Acceptance Test Procedure (FAT) is for the Westinghouse 100 Ton Hydraulic Trailer. The trailer will be used for the removal of the 101-SY pump. This procedure includes: safety check and safety procedures; pre-operation check out; startup; leveling trailer; functional/proofload test; proofload testing; and rolling load test

  12. Metric diffusion along foliations

    CERN Document Server

    Walczak, Szymon M

    2017-01-01

    Up-to-date research in metric diffusion along compact foliations is presented in this book. Beginning with fundamentals from optimal transportation theory and the theory of foliations, this book moves on to cover the Wasserstein distance, the Kantorovich Duality Theorem, and the metrization of the weak topology by the Wasserstein distance. Metric diffusion is defined, the topology of the metric space is studied, and the limits of diffused metrics along compact foliations are discussed. Essentials on foliations, holonomy, heat diffusion, and compact foliations are detailed, and vital technical lemmas are proved to aid understanding. Graduate students and researchers in the geometry, topology and dynamics of foliations and laminations will find this supplement useful, as it presents facts about metric diffusion along non-compact foliations and provides a full description of the limit for metrics diffused along a foliation with at least one compact leaf in two dimensions.
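
    As a pointer for readers, the Wasserstein distance referred to in this summary has the standard optimal-transport form (a textbook definition, not a formula taken from the book):

\[
W_{p}(\mu,\nu)=\Bigl(\inf_{\pi\in\Pi(\mu,\nu)}\int_{X\times X} d(x,y)^{p}\,\mathrm{d}\pi(x,y)\Bigr)^{1/p},
\]

    where \(\Pi(\mu,\nu)\) denotes the set of couplings of \(\mu\) and \(\nu\), i.e. probability measures on \(X\times X\) with marginals \(\mu\) and \(\nu\).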

  13. Metric modular spaces

    CERN Document Server

    Chistyakov, Vyacheslav

    2015-01-01

    Aimed toward researchers and graduate students familiar with elements of functional analysis, linear algebra, and general topology, this book contains a general study of modulars, modular spaces, and metric modular spaces. Modulars may be thought of as generalized velocity fields and serve two important purposes: they generate metric spaces in a unified manner and they provide a weaker convergence, the modular convergence, whose topology is non-metrizable in general. Metric modular spaces are extensions of metric spaces, metric linear spaces, and classical modular linear spaces. The topics covered include the classification of modulars, metrizability of modular spaces, modular transforms and duality between modular spaces, and metric and modular topologies. Applications illustrated in this book include: the description of superposition operators acting in modular spaces, the existence of regular selections of set-valued mappings, new interpretations of spaces of Lipschitzian and absolutely continuous mappings, the existe...

  14. Prognostic Performance Metrics

    Data.gov (United States)

    National Aeronautics and Space Administration — This chapter presents several performance metrics for offline evaluation of prognostics algorithms. A brief overview of different methods employed for performance...

  15. Overview of journal metrics

    Directory of Open Access Journals (Sweden)

    Kihong Kim

    2018-02-01

    Full Text Available Various kinds of metrics used for the quantitative evaluation of scholarly journals are reviewed. The impact factor and related metrics, including the immediacy index and the aggregate impact factor, which are provided by the Journal Citation Reports, are explained in detail. The Eigenfactor score and the article influence score are also reviewed. In addition, journal metrics such as CiteScore, Source Normalized Impact per Paper, SCImago Journal Rank, h-index, and g-index are discussed. Limitations and problems that these metrics have are pointed out. We should be cautious not to rely too heavily on these quantitative measures when evaluating journals or researchers.
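
    For concreteness, the two-year impact factor mentioned above is commonly computed as follows (this is the usual description of the Journal Citation Reports metric, not a formula from the article):

\[
\mathrm{IF}_{Y}=\frac{\text{citations received in year }Y\text{ to items published in }Y-1\text{ and }Y-2}{\text{number of citable items published in }Y-1\text{ and }Y-2}.
\]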

  16. Brand metrics that matter

    NARCIS (Netherlands)

    Muntinga, D.; Bernritter, S.

    2017-01-01

    The brand is increasingly central to the organization. It is therefore essential to measure the brand's health, performance, and development. Selecting the right brand metrics is a challenge, however. An enormous number of metrics competes for brand managers' attention. But which

  17. Privacy Metrics and Boundaries

    NARCIS (Netherlands)

    L-F. Pau (Louis-François)

    2005-01-01

    This paper aims at defining a set of privacy metrics (quantitative and qualitative) for the relation between a privacy protector and an information gatherer. The aims of such metrics are: to allow assessment and comparison of different user scenarios and their differences; for

  18. How do you interpret a billion primary care records?

    Directory of Open Access Journals (Sweden)

    Martin Heaven

    2017-04-01

    To establish this, we explored just over 1 billion unique Read-coded records generated between 1999 and 2015 by GP practices participating in the provision of anonymised records to SAIL, aligning, filtering and summarising the data in a series of observational exercises to generate hypotheses related to the capture and recording of the data. Results: A fascinating journey through 1 billion GP-practice-generated pieces of information, embarked upon to aid interpretation of our Supporting People results and providing insights into the patterns of recording within GP data.

  19. Cosmic rays and the biosphere over 4 billion years

    DEFF Research Database (Denmark)

    Svensmark, Henrik

    2006-01-01

    Variations in the flux of cosmic rays (CR) at Earth during the last 4.6 billion years are constructed from information about the star formation rate in the Milky Way and the evolution of the solar activity. The constructed CR signal is compared with variations in the Earth's biological productivity as recorded in the isotope delta C-13, which spans more than 3 billion years. CR and fluctuations in biological productivity show a remarkable correlation and indicate that the evolution of climate and the biosphere on the Earth is closely linked to the evolution of the Milky Way.

  20. Holographic Spherically Symmetric Metrics

    Science.gov (United States)

    Petri, Michael

    The holographic principle (HP) conjectures, that the maximum number of degrees of freedom of any realistic physical system is proportional to the system's boundary area. The HP has its roots in the study of black holes. It has recently been applied to cosmological solutions. In this article we apply the HP to spherically symmetric static space-times. We find that any regular spherically symmetric object saturating the HP is subject to tight constraints on the (interior) metric, energy-density, temperature and entropy-density. Whenever gravity can be described by a metric theory, gravity is macroscopically scale invariant and the laws of thermodynamics hold locally and globally, the (interior) metric of a regular holographic object is uniquely determined up to a constant factor and the interior matter-state must follow well defined scaling relations. When the metric theory of gravity is general relativity, the interior matter has an overall string equation of state (EOS) and a unique total energy-density. Thus the holographic metric derived in this article can serve as simple interior 4D realization of Mathur's string fuzzball proposal. Some properties of the holographic metric and its possible experimental verification are discussed. The geodesics of the holographic metric describe an isotropically expanding (or contracting) universe with a nearly homogeneous matter-distribution within the local Hubble volume. Due to the overall string EOS the active gravitational mass-density is zero, resulting in a coasting expansion with Ht = 1, which is compatible with the recent GRB-data.

  1. One billion cubic meters of gas produced in Kikinda area

    Energy Technology Data Exchange (ETDEWEB)

    Vicicevic, M; Duric, N

    1969-10-01

    The Kikinda gas reservoir has just passed a milestone in producing one billion cubic meters of natural gas. The reservoir was discovered in 1962, and its present production amounts to 26 million cu m. One of the biggest problems was formation of hydrates, which has successfully been solved by using methanol. Four tables show production statistics by years and productive formations.

  2. International collaboration in SSC (or any $4 billion scientific project)

    International Nuclear Information System (INIS)

    Lederman, L.M.

    1988-01-01

    In this paper, the author discusses the superconducting supercollider. This is a project that costs U.S. $4.4 billion. The author spends a short time giving the motivation (which is a scientific motivation) and also giving the idea of how it is possible, with U.S. deficits

  3. Working Paper 5: Beyond Collier's Bottom Billion | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2010-12-16

    Dec 16, 2010 ... The heart of the narrative presented in the book is that a group of almost 60 countries, with a population of about a billion people, are caught in four main traps. Their prospects for escaping the traps are poor, and they need a set of actions from the international community to achieve the rapid rates of growth ...

  4. Congress OKs $2 Billion Boost for the NIH.

    Science.gov (United States)

    2017-07-01

    President Donald Trump last week signed a $1.1 trillion spending bill for fiscal year 2017, including a welcome $2 billion boost for the NIH that will support former Vice President Joe Biden's Cancer Moonshot initiative, among other priorities. However, researchers who rely heavily on NIH grant funding remain concerned about proposed cuts for 2018. ©2017 American Association for Cancer Research.

  5. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  6. Tracker Performance Metric

    National Research Council Canada - National Science Library

    Olson, Teresa; Lee, Harry; Sanders, Johnnie

    2002-01-01

    .... We have developed the Tracker Performance Metric (TPM) specifically for this purpose. It was designed to measure the output performance, on a frame-by-frame basis, using its output position and quality...

  7. IT Project Management Metrics

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Many software and IT projects fail to meet their objectives for a variety of causes, among which project management carries a high weight. In order to have successful projects, lessons learned have to be applied, historical data collected, and metrics and indicators computed and compared with those of past projects to avoid failure. This paper presents some metrics that can be used for IT project management.

  8. Mass Customization Measurements Metrics

    DEFF Research Database (Denmark)

    Nielsen, Kjeld; Brunø, Thomas Ditlev; Jørgensen, Kaj Asbjørn

    2014-01-01

    A recent survey has indicated that 17% of companies have ceased mass customizing less than 1 year after initiating the effort. This paper presents measurements of a company’s mass customization performance, utilizing metrics within the three fundamental capabilities: robust process design, choice navigation, and solution space development. A mass customizer assessing performance with these metrics can identify in which areas improvement would increase competitiveness the most and enable a more efficient transition to mass customization.

  9. Responsible for 45 000 tons CO2 emissions

    International Nuclear Information System (INIS)

    Nedrelid, Ola N.

    2006-01-01

    Waste combustion enjoys much better tax conditions in Sweden than in Norway. Today waste is transported from Norway to Sweden, resulting in 45 000 tons of CO2 emissions every year, when the waste could have remained in Norway and been utilized as recovered energy in district heating. The tax regime, however, does not provide the conditions for profitable use of the waste in Norway. The district heating association is disappointed with the new government's proposed fiscal budget, which only worsens the competitive situation for Norway handling its own waste.

  10. Acceptance test report for the Westinghouse 100 ton hydraulic trailer

    International Nuclear Information System (INIS)

    Barrett, R.A.

    1995-01-01

    The SY-101 Equipment Removal System 100 Ton Hydraulic Trailer was designed and built by KAMP Systems, Inc. Performance of the Acceptance Test Procedure at KAMP's facility in Ontario, California (termed Phase 1 in this report) was interrupted by discrepancies noted with the main hydraulic cylinder. The main cylinder was removed and sent to REMCO for repair while the trailer was sent to Lampson's facility in Pasco, Washington. The Acceptance Test Procedure was modified and performance resumed at Lampson (termed Phase 2 in this report) after receipt of the repaired cylinder. At the successful conclusion of Phase 2 testing the trailer was accepted as meeting all the performance criteria specified

  11. Dilution Refrigeration of Multi-Ton Cold Masses

    CERN Document Server

    Wikus, P; CERN. Geneva

    2007-01-01

    Dilution refrigeration is the only means to provide continuous cooling at temperatures below 250 mK. Future experiments featuring multi-ton cold masses require a new generation of dilution refrigeration systems, capable of providing a heat sink below 10 mK at cooling powers which exceed the performance of present systems considerably. This thesis presents some advances towards dilution refrigeration of multi-ton masses in this temperature range. A new method using numerical simulation to predict the cooling power of a dilution refrigerator of a given design has been developed in the framework of this thesis project. This method not only allows the differences between an actual and an ideal continuous heat exchanger to be taken into account, but also quantifies the impact of an additional heat load on an intermediate section of the dilute stream. In addition, transient behavior can be simulated. The numerical model has been experimentally verified with a dilution refrigeration system which has been designed, ...

  12. A two-billion-year history for the lunar dynamo.

    Science.gov (United States)

    Tikoo, Sonia M; Weiss, Benjamin P; Shuster, David L; Suavet, Clément; Wang, Huapei; Grove, Timothy L

    2017-08-01

    Magnetic studies of lunar rocks indicate that the Moon generated a core dynamo with surface field intensities of ~20 to 110 μT between at least 4.25 and 3.56 billion years ago (Ga). The field subsequently declined, and the dynamo appears to have persisted in a weakened state, extending its known lifetime by at least 1 billion years. Such a protracted history requires an extraordinarily long-lived power source like core crystallization or precession. No single dynamo mechanism proposed thus far can explain the strong fields inferred for the period before 3.56 Ga while also allowing the dynamo to persist in such a weakened state beyond ~2.5 Ga. Therefore, our results suggest that the dynamo was powered by at least two distinct mechanisms operating during early and late lunar history.

  13. Oncology pharma costs to exceed $150 billion by 2020.

    Science.gov (United States)

    2016-10-01

    Worldwide costs of oncology drugs will rise above $150 billion by 2020, according to a report by the IMS Institute for Healthcare Informatics. Many factors are in play, according to IMS, including the new wave of expensive immunotherapies. Pembrolizumab (Keytruda), priced at $150,000 per year per patient, and nivolumab (Opdivo), priced at $165,000, may be harbingers of the market for cancer immunotherapies.

  14. Ubiquitous Supercritical Wing Design Cuts Billions in Fuel Costs

    Science.gov (United States)

    2015-01-01

    A Langley Research Center engineer’s work in the 1960s and ’70s to develop a wing with better performance near the speed of sound resulted in a significant increase in subsonic efficiency. The design was shared with industry. Today, Renton, Washington-based Boeing Commercial Airplanes, as well as most other plane manufacturers, apply it to all their aircraft, saving the airline industry billions of dollars in fuel every year.

  15. Gaia: Science with 1 billion objects in three dimensions

    Science.gov (United States)

    Prusti, Timo

    2018-02-01

    Gaia is an operational satellite in the ESA science programme. It is gathering data for more than a billion objects. Gaia measures positions and motions of stars in our Milky Way Galaxy, but captures many asteroids and extragalactic sources as well. The first data release has already been made and exploitation by the world-wide scientific community is underway. Further data releases will be made with further increasing accuracy. Gaia is well underway to provide its promised set of fundamental astronomical data.

  16. Document de travail 5: Beyond Collier's Bottom Billion | CRDI ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Dec 16, 2010 ... Paul Collier's book, The Bottom Billion, has generated considerable interest in the development field. It rests on the thesis that a group of nearly 60 countries, whose total population approaches one billion people, are caught in four main traps.

  17. Cost of solving mysteries of universe: $6.7 billion

    CERN Multimedia

    Overbye, Dennis

    2007-01-01

    "An international consortium of physicists on Thursday released the first detailed design of what they believe will be the next big thing in physics. The machine, 20 miles long, will slam together electrons and their opposites, positrons, to produce fireballs of energy re-creating conditions when the universe was only a trillionth of a second old. It would cost about $6.7 billion." (1 page)

  18. $35 billion habit: will nuclear cost overruns bankrupt the utilities

    International Nuclear Information System (INIS)

    Morgan, R.E.

    1980-01-01

    The Nuclear Regulatory Commission (NRC) has proposed some 150 modifications in the design and operation of nuclear power plants as a result of the accident at Three Mile Island. The Atomic Industrial Forum estimates the total cost of the NRC's proposed rule changes at $35.5 billion ($3.5 billion in capital costs for the entire industry, and $32 billion in outage and construction-delay costs to the utilities) for existing facilities and for those with construction well underway. The changes range from improved training for reactor workers to a major overhaul of the reactor-containment design. The nuclear industry is asking the NRC to modify the proposals citing excessive costs (like the $100 million changes needed for a plant that cost $17 million to build) and safety (some of the complex regulations may interfere with safety). Financing the changes has become a major problem for the utilities. If the regulators allow all the costs to be passed along to the consumer, the author feels electricity will be too expensive for the consumer

  19. Fault Management Metrics

    Science.gov (United States)

    Johnson, Stephen B.; Ghoshal, Sudipto; Haste, Deepak; Moore, Craig

    2017-01-01

    This paper describes the theory and considerations in the application of metrics to measure the effectiveness of fault management. Fault management refers here to the operational aspect of system health management, and as such is considered as a meta-control loop that operates to preserve or maximize the system's ability to achieve its goals in the face of current or prospective failure. As a suite of control loops, the metrics to estimate and measure the effectiveness of fault management are similar to those of classical control loops in being divided into two major classes: state estimation, and state control. State estimation metrics can be classified into lower-level subdivisions for detection coverage, detection effectiveness, fault isolation and fault identification (diagnostics), and failure prognosis. State control metrics can be classified into response determination effectiveness and response effectiveness. These metrics are applied to each and every fault management control loop in the system, for each failure to which they apply, and probabilistically summed to determine the effectiveness of these fault management control loops to preserve the relevant system goals that they are intended to protect.
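
    The abstract describes stage-level effectiveness metrics applied to each fault-management control loop and each failure, then probabilistically summed. A minimal illustrative sketch of that kind of roll-up is below; the failure modes, probabilities, and the simple product-and-weighted-sum combination rule are assumptions made for illustration, not the paper's actual formulation.

```python
# Hypothetical roll-up of fault-management effectiveness metrics.
# Failure modes, probabilities, and the combination rule are illustrative only.
failure_modes = [
    # (probability of failure, detection coverage, detection effectiveness,
    #  isolation effectiveness, response effectiveness)
    (0.02, 0.95, 0.90, 0.85, 0.92),
    (0.01, 0.80, 0.85, 0.90, 0.88),
    (0.005, 0.99, 0.97, 0.95, 0.90),
]

def loop_effectiveness(cov, det, iso, resp):
    """Effectiveness of one control loop for one failure, modeled here
    (as an assumption) as the product of its stage-level metrics."""
    return cov * det * iso * resp

# Probability-weighted sum over failures: the expected fraction of failure
# probability that the fault-management system successfully handles.
total_p = sum(p for p, *_ in failure_modes)
protected = sum(p * loop_effectiveness(*metrics) for p, *metrics in failure_modes)
print(f"overall fault-management effectiveness: {protected / total_p:.3f}")
```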

  20. Deep Transfer Metric Learning.

    Science.gov (United States)

    Junlin Hu; Jiwen Lu; Yap-Peng Tan; Jie Zhou

    2016-12-01

    Conventional metric learning methods usually assume that the training and test samples are captured in similar scenarios so that their distributions are assumed to be the same. This assumption does not hold in many real visual recognition applications, especially when samples are captured across different data sets. In this paper, we propose a new deep transfer metric learning (DTML) method to learn a set of hierarchical nonlinear transformations for cross-domain visual recognition by transferring discriminative knowledge from the labeled source domain to the unlabeled target domain. Specifically, our DTML learns a deep metric network by maximizing the inter-class variations and minimizing the intra-class variations, and minimizing the distribution divergence between the source domain and the target domain at the top layer of the network. To better exploit the discriminative information from the source domain, we further develop a deeply supervised transfer metric learning (DSTML) method by including an additional objective on DTML, where the output of both the hidden layers and the top layer are optimized jointly. To preserve the local manifold of input data points in the metric space, we present two new methods, DTML with autoencoder regularization and DSTML with autoencoder regularization. Experimental results on face verification, person re-identification, and handwritten digit recognition validate the effectiveness of the proposed methods.
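
    To make the objective described here concrete, the sketch below computes the three ingredients named in the abstract: intra-class compactness, inter-class separation, and a source/target distribution-divergence penalty at the top layer. It is a schematic NumPy illustration with a single nonlinear layer and a linear-kernel MMD surrogate; the architecture, loss form, and trade-off weights are placeholders, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy labeled source-domain features and unlabeled target-domain features.
Xs = rng.normal(size=(100, 32))          # source samples
ys = rng.integers(0, 5, size=100)        # source labels (5 classes)
Xt = rng.normal(size=(80, 32))           # target samples
W = rng.normal(size=(32, 16)) * 0.1      # stand-in for the top layer of a metric network

def embed(X, W):
    return np.tanh(X @ W)                # one nonlinear transformation layer

def intra_inter(Z, y):
    """Mean intra-class scatter and mean squared distance between class centroids."""
    classes = np.unique(y)
    centroids = np.stack([Z[y == c].mean(axis=0) for c in classes])
    intra = np.mean([np.mean(np.sum((Z[y == c] - centroids[i]) ** 2, axis=1))
                     for i, c in enumerate(classes)])
    diffs = centroids[:, None, :] - centroids[None, :, :]
    inter = np.sum(diffs ** 2, axis=-1)[np.triu_indices(len(classes), k=1)].mean()
    return intra, inter

def linear_mmd(Zs, Zt):
    """Squared distance between domain means (a linear-kernel MMD surrogate)."""
    return np.sum((Zs.mean(axis=0) - Zt.mean(axis=0)) ** 2)

Zs, Zt = embed(Xs, W), embed(Xt, W)
intra, inter = intra_inter(Zs, ys)
loss = intra - 0.1 * inter + 1.0 * linear_mmd(Zs, Zt)   # weights are placeholders
print(f"intra={intra:.3f} inter={inter:.3f} mmd={linear_mmd(Zs, Zt):.3f} loss={loss:.3f}")
```

    In the paper's setting this kind of loss would be minimized over the network weights, rather than merely evaluated as above.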

  1. Taking out one billion tons of carbon: the magic of China's 11th Five-Year Plan

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Jiang; Zhou, Nan; Levine, Mark D.; Fridley, David

    2007-05-01

    China's 11th Five-Year Plan (FYP) sets an ambitious target for energy-efficiency improvement: energy intensity of the country's gross domestic product (GDP) should be reduced by 20 percent from 2005 to 2010 (NDRC, 2006). This is the first time that a quantitative and binding target has been set for energy efficiency, and it signals a major shift in China's strategic thinking about its long-term economic and energy development. The 20 percent energy intensity target also translates into an annual reduction of over one billion tons of CO2 by 2010, making the Chinese effort one of the most significant carbon mitigation efforts in the world today. While it is still too early to tell whether China will achieve this target, this paper attempts to understand the trend in energy intensity in China and to explore a variety of options toward meeting the 20 percent target using a detailed end-use energy model.
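
    A back-of-the-envelope illustration of how an intensity target becomes an absolute reduction: freeze intensity at its 2005 level to get a business-as-usual emissions path, then remove 20 percent of that path. The 2005 emissions level and GDP growth rate below are rough assumptions for illustration (and CO2 intensity is assumed to track energy intensity); they are not values taken from the paper.

```python
# Illustrative arithmetic only; the 2005 emissions level and growth rate are assumptions.
co2_2005_gt = 5.4      # assumed 2005 CO2 emissions, billion metric tons (Gt)
gdp_growth = 0.10      # assumed average annual GDP growth rate, 2005-2010
years = 5

bau_2010 = co2_2005_gt * (1 + gdp_growth) ** years   # frozen-2005-intensity path
with_target_2010 = 0.8 * bau_2010                    # 20% lower intensity by 2010
reduction = bau_2010 - with_target_2010

print(f"business-as-usual 2010 emissions: {bau_2010:.2f} Gt CO2")
print(f"with the 20% intensity target:    {with_target_2010:.2f} Gt CO2")
print(f"implied annual reduction by 2010: {reduction:.2f} Gt CO2")   # comes out above 1 Gt
```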

  2. Increase of alcohol yield per ton of pulp

    Energy Technology Data Exchange (ETDEWEB)

    Sokolov, B N

    1957-01-01

    Digestion processes of cellulose were studied under production conditions. When the digestion was carried out with acid having 5.2% total SO2 and 0.92% CaO, the concentration of total sugars in the spent liquor was 1.8 to 2.5%. When the acidity was reduced to 4.8% total SO2 and 0.82% CaO, all other conditions being the same, the sugar concentration in the spent liquor increased to 3.0 to 3.7%. The importance of the acid strength and CaO content of the cooking liquor was further demonstrated at the end of 1955. At that time the total SO2 in the acid rose to 8% while the amount of CaO remained practically the same, 0.85 to 0.90%. These conditions permitted an increase in the amount of chips by 25 to 30%, which further changed the CaO:wood ratio and created conditions favorable for an improved yield of sugar. The increase in the activity of the acid was reflected favorably in the degree of hydrolysis of the hemicelluloses and in the degree to which the oligosaccharides or polysaccharides were hydrolyzed to simple sugars. At that time the yield of alcohol reached 53 l/ton of unbleached pulp. The process was further improved in 1956 by the use of successive washings; at the end of the digestion period the concentrated spent liquor was piped to the alcohol unit. The yield of alcohol reached 59.4 l/ton of pulp. Sugar recovery from the tank was 92.5% of that theoretically possible. Further improvements resulted from saturating the wood chips with acid under variable pressures. As a result, the base of the cooking acid was reduced to 0.7 to 0.72% and, at the end of the process, the liquor contained 0.03 to 0.06% CaO instead of 0.2 to 0.18%. The alcohol yield per ton of pulp then rose to 66.8 l.

  3. Cyber threat metrics.

    Energy Technology Data Exchange (ETDEWEB)

    Frye, Jason Neal; Veitch, Cynthia K.; Mateski, Mark Elliot; Michalski, John T.; Harris, James Mark; Trevino, Cassandra M.; Maruoka, Scott

    2012-03-01

    Threats are generally much easier to list than to describe, and much easier to describe than to measure. As a result, many organizations list threats. Fewer describe them in useful terms, and still fewer measure them in meaningful ways. This is particularly true in the dynamic and nebulous domain of cyber threats - a domain that tends to resist easy measurement and, in some cases, appears to defy any measurement. We believe the problem is tractable. In this report we describe threat metrics and models for characterizing threats consistently and unambiguously. The purpose of this report is to support the Operational Threat Assessment (OTA) phase of risk and vulnerability assessment. To this end, we focus on the task of characterizing cyber threats using consistent threat metrics and models. In particular, we address threat metrics and models for describing malicious cyber threats to US FCEB agencies and systems.

  4. Adaptive metric kernel regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    2000-01-01

    Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows the importance of different dimensions to be adjusted automatically. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms...
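
    A minimal sketch of the idea summarized here, assuming a Gaussian kernel and a crude grid search: Nadaraya-Watson smoothing with one length scale per input dimension, where the scales are chosen by minimizing a leave-one-out cross-validation error. This is an illustration of the general approach, not the authors' algorithm.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 3))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.normal(size=200)   # only dimension 0 is relevant

def nw_predict(Xtr, ytr, Xte, scales):
    """Nadaraya-Watson regression with an axis-aligned (diagonal) input metric."""
    d2 = (((Xte[:, None, :] - Xtr[None, :, :]) / scales) ** 2).sum(axis=-1)
    K = np.exp(-0.5 * d2)
    return (K @ ytr) / (K.sum(axis=1) + 1e-12)

def loo_error(X, y, scales):
    """Leave-one-out squared error, removing each point's own kernel weight."""
    d2 = (((X[:, None, :] - X[None, :, :]) / scales) ** 2).sum(axis=-1)
    K = np.exp(-0.5 * d2)
    np.fill_diagonal(K, 0.0)
    pred = (K @ y) / (K.sum(axis=1) + 1e-12)
    return np.mean((pred - y) ** 2)

# Crude grid search over per-dimension length scales (the "adaptive metric");
# large scales effectively down-weight irrelevant dimensions.
grid = [0.05, 0.2, 1.0, 5.0]
best = min(product(grid, repeat=3), key=lambda s: loo_error(X, y, np.array(s)))
print("selected per-dimension scales:", best)
print("prediction at the origin:", nw_predict(X, y, np.zeros((1, 3)), np.array(best)))
```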

  5. Adaptive Metric Kernel Regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    1998-01-01

    Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...

  6. Matrix model and time-like linear dilaton matter

    International Nuclear Information System (INIS)

    Takayanagi, Tadashi

    2004-01-01

    We consider a matrix model description of the 2d string theory whose matter part is given by a time-like linear dilaton CFT. This is equivalent to the c=1 matrix model with a deformed, but very simple Fermi surface. Indeed, after a Lorentz transformation, the corresponding 2d spacetime is a conventional linear dilaton background with a time-dependent tachyon field. We show that the tree level scattering amplitudes in the matrix model perfectly agree with those computed in the world-sheet theory. The classical trajectories of fermions correspond to the decaying D-branes in the time-like linear dilaton CFT. We also discuss the ground ring structure. Furthermore, we study the properties of the time-like Liouville theory by applying this matrix model description. We find that its ground ring structure is very similar to that of the minimal string. (author)

  7. Acceptance test report for the Westinghouse 100 ton hydraulic trailer

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, R.A.

    1995-03-06

    The SY-101 Equipment Removal System 100 Ton Hydraulic Trailer was designed and built by KAMP Systems, Inc. Performance of the Acceptance Test Procedure at KAMP's facility in Ontario, California (termed Phase 1 in this report) was interrupted by discrepancies noted with the main hydraulic cylinder. The main cylinder was removed and sent to REMCO for repair while the trailer was sent to Lampson's facility in Pasco, Washington. The Acceptance Test Procedure was modified and performance resumed at Lampson (termed Phase 2 in this report) after receipt of the repaired cylinder. At the successful conclusion of Phase 2 testing the trailer was accepted as meeting all the performance criteria specified.

  8. Ants at Ton Nga Chang Wildlife Sanctuary, Songkhla

    Directory of Open Access Journals (Sweden)

    Watanasit, S.

    2005-03-01

    Full Text Available The aim of this study was to investigate the diversity of ants at Ton Nga Chang Wildlife Sanctuary, Hat Yai, Songkhla. Three line transects (100 m each) were randomly set up in 2 types of forest area, disturbed and undisturbed. Hand collecting (HC) and leaf litter sampling (LL) were applied for ant collection within a time limit of 30 minutes for each method. This study was carried out every month during February 2002-February 2003. The results showed that 206 species were placed under 8 subfamilies: Aenictinae, Cerapachyinae, Dolichoderinae, Formicinae, Leptanillinae, Myrmicinae, Ponerinae and Pseudomyrmecinae. Study sites and collection methods could divide ant species into 2 groups, whereas seasonal change could not distinguish the groups by DCA of multivariate analysis.

  9. Dark matter sensitivity of multi-ton liquid xenon detectors

    International Nuclear Information System (INIS)

    Schumann, Marc; Bütikofer, Lukas; Baudis, Laura; Kish, Alexander; Selvi, Marco

    2015-01-01

    We study the sensitivity of multi-ton-scale time projection chambers using a liquid xenon target, e.g., the proposed DARWIN instrument, to spin-independent and spin-dependent WIMP-nucleon scattering interactions. Taking into account realistic backgrounds from the detector itself as well as from neutrinos, we examine the impact of exposure, energy threshold, background rejection efficiency and energy resolution on the dark matter sensitivity. With an exposure of 200 t × y and assuming detector parameters which have been already demonstrated experimentally, spin-independent cross sections as low as 2.5 × 10⁻⁴⁹ cm² can be probed for WIMP masses around 40 GeV/c². Additional improvements in terms of background rejection and exposure will further increase the sensitivity, while the ultimate WIMP science reach will be limited by neutrinos scattering coherently off the xenon nuclei.

  10. Metrical Phonology and SLA.

    Science.gov (United States)

    Tice, Bradley S.

    Metrical phonology, a linguistic process of phonological stress assessment and diagrammatic simplification of sentence and word stress, is discussed as it is found in the English language with the intention that it may be used in second language instruction. Stress is defined by its physical and acoustical correlates, and the principles of…

  11. Engineering performance metrics

    Science.gov (United States)

    Delozier, R.; Snyder, N.

    1993-03-01

    Implementation of a Total Quality Management (TQM) approach to engineering work required the development of a system of metrics which would serve as a meaningful management tool for evaluating effectiveness in accomplishing project objectives and in achieving improved customer satisfaction. A team effort was chartered with the goal of developing a system of engineering performance metrics which would measure customer satisfaction, quality, cost effectiveness, and timeliness. The approach to developing this system involved normal systems design phases, including conceptual design, detailed design, implementation, and integration. The lessons learned from this effort will be explored in this paper. These lessons learned may provide a starting point for other large engineering organizations seeking to institute a performance measurement system for accomplishing project objectives and achieving improved customer satisfaction. To facilitate this effort, a team was chartered to assist in the development of the metrics system. This team, consisting of customers and Engineering staff members, was utilized to ensure that the needs and views of the customers were considered in the development of performance measurements. The development of a system of metrics is no different than the development of any type of system. It includes the steps of defining performance measurement requirements, measurement process conceptual design, performance measurement and reporting system detailed design, and system implementation and integration.

  12. Metrics for Probabilistic Geometries

    DEFF Research Database (Denmark)

    Tosi, Alessandra; Hauberg, Søren; Vellido, Alfredo

    2014-01-01

    the distribution over mappings is given by a Gaussian process. We treat the corresponding latent variable model as a Riemannian manifold and we use the expectation of the metric under the Gaussian process prior to define interpolating paths and measure distance between latent points. We show how distances...
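
    For orientation, the metric in question is the pullback of the Euclidean metric of the observation space through the mapping; for a deterministic smooth map the standard expression is the one below, and the record's approach replaces it by its expectation under the Gaussian-process prior over mappings. This is a generic relation, not notation quoted from the paper.

\[
M(z)=J(z)^{\mathsf{T}}J(z),\qquad J(z)=\frac{\partial f(z)}{\partial z},\qquad
\text{with the stochastic map handled via }\;\mathbb{E}_{f\sim\mathcal{GP}}\bigl[M(z)\bigr].
\]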

  13. Metrics for energy resilience

    International Nuclear Information System (INIS)

    Roege, Paul E.; Collier, Zachary A.; Mancillas, James; McDonagh, John A.; Linkov, Igor

    2014-01-01

    Energy lies at the backbone of any advanced society and constitutes an essential prerequisite for economic growth, social order and national defense. However there is an Achilles heel to today's energy and technology relationship; namely a precarious intimacy between energy and the fiscal, social, and technical systems it supports. Recently, widespread and persistent disruptions in energy systems have highlighted the extent of this dependence and the vulnerability of increasingly optimized systems to changing conditions. Resilience is an emerging concept that offers to reconcile considerations of performance under dynamic environments and across multiple time frames by supplementing traditionally static system performance measures to consider behaviors under changing conditions and complex interactions among physical, information and human domains. This paper identifies metrics useful to implement guidance for energy-related planning, design, investment, and operation. Recommendations are presented using a matrix format to provide a structured and comprehensive framework of metrics relevant to a system's energy resilience. The study synthesizes previously proposed metrics and emergent resilience literature to provide a multi-dimensional model intended for use by leaders and practitioners as they transform our energy posture from one of stasis and reaction to one that is proactive and which fosters sustainable growth. - Highlights: • Resilience is the ability of a system to recover from adversity. • There is a need for methods to quantify and measure system resilience. • We developed a matrix-based approach to generate energy resilience metrics. • These metrics can be used in energy planning, system design, and operations

  14. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software Quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software Metrics help us understand the technical process that is used to develop a product. The process is measured to improve it and the product is measured to increase quality throughout the life cycle of software. Software Metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If Software Metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research and collect, compile, and analyze SQA Metrics that have been used in other projects, that are not currently being used by the SA team, and report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  15. Empowering billions with food safety and food security

    International Nuclear Information System (INIS)

    Pillai, Suresh D.

    2009-01-01

    Full text: There are millions of people who die needlessly every year due to contaminated water and food. There are many millions more who are starving due to an inadequate supply of food. Billions of pounds of food are unnecessarily wasted due to insect and other damage. Deaths and illness due to contaminated food or inadequate food are at catastrophic levels in many regions of the world. A majority of the food- and water-borne illnesses and deaths are preventable. They can be prevented by improved food production methods, improved food processing technologies, improved food distribution systems and improved personal hygiene. Food irradiation technology is over 100 years old. Yet, this technology is poorly understood by governments and corporate decision makers all around the world. Many consumers are also unfortunately misinformed about this technology. There is an urgent need for nations and people around the world to empower themselves with the knowledge and the expertise to harness this powerful technology. Widespread and sensible adoption of this technology can empower billions around the world with clean and abundant food supplies. It is unconscionable in the 21st century for governments to allow people to die or go hungry when the technology to prevent this is readily available

  16. The Boring Billion, a slingshot for Complex Life on Earth.

    Science.gov (United States)

    Mukherjee, Indrani; Large, Ross R; Corkrey, Ross; Danyushevsky, Leonid V

    2018-03-13

    The period 1800 to 800 Ma ("Boring Billion") is believed to mark a delay in the evolution of complex life, primarily due to low levels of oxygen in the atmosphere. Earlier studies highlight the remarkably flat C, Cr isotopes and low trace element trends during the so-called stasis, caused by prolonged nutrient, climatic, atmospheric and tectonic stability. In contrast, we suggest a first-order variability of bio-essential trace element availability in the oceans by combining systematic sampling of the Proterozoic rock record with sensitive geochemical analyses of marine pyrite by LA-ICP-MS technique. We also recall that several critical biological evolutionary events, such as the appearance of eukaryotes, origin of multicellularity & sexual reproduction, and the first major diversification of eukaryotes (crown group) occurred during this period. Therefore, it appears possible that the period of low nutrient trace elements (1800-1400 Ma) caused evolutionary pressures which became an essential trigger for promoting biological innovations in the eukaryotic domain. Later periods of stress-free conditions, with relatively high nutrient trace element concentration, facilitated diversification. We propose that the "Boring Billion" was a period of sequential stepwise evolution and diversification of complex eukaryotes, triggering evolutionary pathways that made possible the later rise of micro-metazoans and their macroscopic counterparts.

  17. Enterprise Sustainment Metrics

    Science.gov (United States)

    2015-06-19

    are negatively impacting KPIs” (Parmenter, 2010: 31). In the current state, the Air Force’s AA and PBL metrics are once again split . AA does...must have the authority to “take immediate action to rectify situations that are negatively impacting KPIs” (Parmenter, 2010: 31). 3. Measuring...highest profitability and shareholder value for each company” (2014: 273). By systematically diagraming a process, either through a swim lane flowchart

  18. Ultrarelativistic heavy ion collisions: the first billion seconds

    Energy Technology Data Exchange (ETDEWEB)

    Baym, Gordon

    2016-12-15

    I first review the early history of the ultrarelativistic heavy ion program, starting with the 1974 Bear Mountain Workshop, and the 1983 Aurora meeting of the U.S. Nuclear Science Committee, just one billion seconds ago, which laid out the initial science goals of an ultrarelativistic collider. The primary goal, to discover the properties of nuclear matter at the highest energy densities, included finding new states of matter – primarily the quark-gluon plasma – and using collisions to open a new window on related problems of matter in cosmology, neutron stars, supernovae, and elsewhere. To bring out how the study of heavy ions and hot, dense matter in QCD has been fulfilling these goals, I concentrate on a few topics: the phase diagram of matter in QCD, and connections of heavy ion physics to cold atoms, cosmology, and neutron stars.

  19. Orbital forcing of climate 1.4 billion years ago

    DEFF Research Database (Denmark)

    Zhang, Shuichang; Wang, Xiaomei; Hammarlund, Emma U

    2015-01-01

    Fluctuating climate is a hallmark of Earth. As one transcends deep into Earth time, however, both the evidence for and the causes of climate change become difficult to establish. We report geochemical and sedimentological evidence for repeated, short-term climate fluctuations from the exceptionally well-preserved ∼1.4-billion-year-old Xiamaling Formation of the North China Craton. We observe two patterns of climate fluctuations: On long time scales, over what amounts to tens of millions of years, sediments of the Xiamaling Formation record changes in geochemistry consistent with long-term changes ...; on shorter time scales, the sediments reflect what appear to be orbitally forced changes in wind patterns and ocean circulation as they influenced rates of organic carbon flux, trace metal accumulation, and the source of detrital particles to the sediment.

  20. Symmetries of the dual metrics

    International Nuclear Information System (INIS)

    Baleanu, D.

    1998-01-01

    The geometric duality between the metric g_μν and a Killing tensor K_μν is studied. The conditions under which the symmetries of the metric g_μν and the dual metric K_μν are the same were found. Dual spinning space was constructed without the introduction of torsion. The general results are applied to the case of the Kerr-Newman metric

  1. Kerr metric in cosmological background

    Energy Technology Data Exchange (ETDEWEB)

    Vaidya, P C [Gujarat Univ., Ahmedabad (India). Dept. of Mathematics

    1977-06-01

    A metric satisfying Einstein's equation is given which in the vicinity of the source reduces to the well-known Kerr metric and which at large distances reduces to the Robertson-Walker metric of a homogeneous cosmological model. The radius of the event horizon of the Kerr black hole in the cosmological background is found.
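
    For context, the two limiting forms mentioned in this record are standard. In Boyer-Lindquist coordinates (with G = c = 1) the Kerr metric, and in comoving coordinates the Robertson-Walker metric, read as follows; these are textbook expressions, not the paper's interpolating solution itself:

\[
ds^{2}_{\mathrm{Kerr}}=-\Bigl(1-\frac{2Mr}{\Sigma}\Bigr)dt^{2}-\frac{4Mar\sin^{2}\theta}{\Sigma}\,dt\,d\phi+\frac{\Sigma}{\Delta}\,dr^{2}+\Sigma\,d\theta^{2}+\Bigl(r^{2}+a^{2}+\frac{2Ma^{2}r\sin^{2}\theta}{\Sigma}\Bigr)\sin^{2}\theta\,d\phi^{2},
\]

    with \(\Sigma=r^{2}+a^{2}\cos^{2}\theta\) and \(\Delta=r^{2}-2Mr+a^{2}\), and

\[
ds^{2}_{\mathrm{RW}}=-dt^{2}+a(t)^{2}\Bigl[\frac{dr^{2}}{1-kr^{2}}+r^{2}\bigl(d\theta^{2}+\sin^{2}\theta\,d\phi^{2}\bigr)\Bigr].
\]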

  2. The Role of TonB Gene in Edwardsiella ictaluri Virulence

    Directory of Open Access Journals (Sweden)

    Hossam Abdelhamed

    2017-12-01

    Full Text Available Edwardsiella ictaluri is a Gram-negative facultative intracellular pathogen that causes enteric septicemia in catfish (ESC). Stress factors including poor water quality, poor diet, rough handling, overcrowding, and water temperature fluctuations increase fish susceptibility to ESC. The TonB energy transducing system (TonB-ExbB-ExbD) and TonB-dependent transporters of Gram-negative bacteria support active transport of scarce resources including iron, an essential micronutrient for bacterial virulence. Deletion of the tonB gene attenuates virulence in several pathogenic bacteria. In the current study, the role of TonB (NT01EI_RS07425) in iron acquisition and E. ictaluri virulence was investigated. To accomplish this, the E. ictaluri tonB gene was in-frame deleted. Growth kinetics, iron utilization, and virulence of the EiΔtonB mutant were determined. Loss of TonB caused a significant reduction in bacterial growth in iron-depleted medium (p > 0.05). The EiΔtonB mutant grew similarly to wild-type E. ictaluri when ferric iron was added to the iron-depleted medium. The EiΔtonB mutant was significantly attenuated in catfish compared with the parent strain (21.69 vs. 46.91% mortality). Catfish surviving infection with EiΔtonB had significant protection against ESC compared with naïve fish (100 vs. 40.47% survival). These findings indicate that TonB participates in pathogenesis of ESC and is an important E. ictaluri virulence factor.

  3. Learning Low-Dimensional Metrics

    OpenAIRE

    Jain, Lalit; Mason, Blake; Nowak, Robert

    2017-01-01

    This paper investigates the theoretical foundations of metric learning, focused on three key questions that are not fully addressed in prior work: 1) we consider learning general low-dimensional (low-rank) metrics as well as sparse metrics; 2) we develop upper and lower (minimax) bounds on the generalization error; 3) we quantify the sample complexity of metric learning in terms of the dimension of the feature space and the dimension/rank of the underlying metric; 4) we also bound the accuracy ...
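
    As background, the "metric" learned in this line of work is typically a (pseudo-)Mahalanobis distance parameterized by a positive semidefinite matrix, with low-rank or sparse structure imposed on that matrix; the display below is the standard parameterization rather than notation taken from the paper.

\[
d_{M}(x,y)^{2}=(x-y)^{\mathsf{T}}M\,(x-y),\qquad M=LL^{\mathsf{T}}\succeq 0,\quad L\in\mathbb{R}^{d\times r},\; r\ll d.
\]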

  4. Criticality safety review of 2 1/2-, 10-, and 14-ton UF6 cylinders

    International Nuclear Information System (INIS)

    Broadhead, B.L.

    1991-10-01

    Currently, UF6 cylinders designed to contain 2 1/2 tons of UF6 are classified as Fissile Class 2 packages with a transport index (TI) of 5 for the purpose of transportation. The 10-ton UF6 cylinders are classified as Fissile Class 1 with no TI assigned for transportation. The 14-ton cylinders, although not certified for transport with enrichments greater than 1 wt % because they have no approved overpack, can be used in on-site operations for enrichments greater than 1 wt %. The maximum 235U enrichments for these cylinders are 5.0 wt % for the 2 1/2-ton cylinder and 4.5 wt % for the 10- and 14-ton cylinders. This work reviews the suitability for reclassification of the 2 1/2-ton UF6 packages as Fissile Class 1 with a maximum 235U enrichment of 5 wt %. Additionally, the 10- and 14-ton cylinders are reviewed to address a change in maximum 235U enrichment from 4.5 to 5 wt %. Based on this evaluation, the 2 1/2-ton UF6 cylinders meet the 10 CFR 71 criteria for Fissile Class 1 packages, and no TI is needed for criticality safety purposes; however, a TI may be required based on radiation from the packages. Similarly, the 10- and 14-ton UF6 packages appear acceptable for a maximum enrichment rating change to 5 wt % 235U. 11 refs., 13 figs., 7 tabs

  5. Fuel efficient stoves for the poorest two billion

    Science.gov (United States)

    Gadgil, Ashok

    2012-03-01

    About 2 billion people cook their daily meals on generally inefficient, polluting, biomass cookstoves. The fuels include twigs and leaves, agricultural waste, animal dung, firewood, and charcoal. Exposure to resulting smoke leads to acute respiratory illness, and cancers, particularly among women cooks, and their infant children near them. Resulting annual mortality estimate is almost 2 million deaths, higher than that from malaria or tuberculosis. There is a large diversity of cooking methods (baking, boiling, long simmers, brazing and roasting), and a diversity of pot shapes and sizes in which the cooking is undertaken. Fuel-efficiency and emissions depend on the tending of the fire (and thermal power), type of fuel, stove characteristics, and fit of the pot to the stove. Thus, no one perfect fuel-efficient low-emitting stove can suit all users. Affordability imposes a further severe constraint on the stove design. For various economic strata within the users, a variety of stove designs may be appropriate and affordable. In some regions, biomass is harvested non-renewably for cooking fuel. There is also increasing evidence that black carbon emitted from stoves is a significant contributor to atmospheric forcing. Thus improved biomass stoves can also help mitigate global climate change. The speaker will describe specific work undertaken to design, develop, test, and disseminate affordable fuel-efficient stoves for internally displaced persons (IDPs) of Darfur, Sudan, where the IDPs face hardship, humiliation, hunger, and risk of sexual assault owing to their dependence on local biomass for cooking their meals.

  6. Metrics with vanishing quantum corrections

    International Nuclear Information System (INIS)

    Coley, A A; Hervik, S; Gibbons, G W; Pope, C N

    2008-01-01

    We investigate solutions of the classical Einstein or supergravity equations that solve any set of quantum corrected Einstein equations in which the Einstein tensor plus a multiple of the metric is equated to a symmetric conserved tensor T_μν(g_αβ, ∂_τ g_αβ, ∂_τ ∂_σ g_αβ, ...) constructed from sums of terms involving contractions of the metric and powers of arbitrary covariant derivatives of the curvature tensor. A classical solution, such as an Einstein metric, is called universal if, when evaluated on that Einstein metric, T_μν is a multiple of the metric. A Ricci flat classical solution is called strongly universal if, when evaluated on that Ricci flat metric, T_μν vanishes. It is well known that pp-waves in four spacetime dimensions are strongly universal. We focus attention on a natural generalization: Einstein metrics with holonomy Sim(n - 2) in which all scalar invariants are zero or constant. In four dimensions we demonstrate that the generalized Ghanam-Thompson metric is weakly universal and that the Goldberg-Kerr metric is strongly universal; indeed, we show that universality extends to all four-dimensional Sim(2) Einstein metrics. We also discuss generalizations to higher dimensions.

  7. Sharp metric obstructions for quasi-Einstein metrics

    Science.gov (United States)

    Case, Jeffrey S.

    2013-02-01

    Using the tractor calculus to study smooth metric measure spaces, we adapt results of Gover and Nurowski to give sharp metric obstructions to the existence of quasi-Einstein metrics on suitably generic manifolds. We do this by introducing an analogue of the Weyl tractor W to the setting of smooth metric measure spaces. The obstructions we obtain can be realized as tensorial invariants which are polynomial in the Riemann curvature tensor and its divergence. By taking suitable limits of their tensorial forms, we then find obstructions to the existence of static potentials, generalizing to higher dimensions a result of Bartnik and Tod, and to the existence of potentials for gradient Ricci solitons.

  8. Completion of a Dislocated Metric Space

    Directory of Open Access Journals (Sweden)

    P. Sumati Kumari

    2015-01-01

    Full Text Available We provide a construction for the completion of a dislocated metric space (abbreviated d-metric space); we also prove that the completion of the metric associated with a d-metric coincides with the metric associated with the completion of the d-metric.
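
    For readers unfamiliar with the term, a dislocated metric is usually axiomatized as below (this is the commonly used definition, stated here for orientation; the paper's conventions may differ in detail): a function \(d:X\times X\to[0,\infty)\) such that

\[
d(x,y)=0\;\Rightarrow\;x=y,\qquad d(x,y)=d(y,x),\qquad d(x,y)\le d(x,z)+d(z,y),
\]

    so that, unlike an ordinary metric, the self-distance \(d(x,x)\) need not vanish.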

  9. TonEBP modulates the protective effect of taurine in ischemia-induced cytotoxicity in cardiomyocytes

    Science.gov (United States)

    Yang, Y J; Han, Y Y; Chen, K; Zhang, Y; Liu, X; Li, S; Wang, K Q; Ge, J B; Liu, W; Zuo, J

    2015-01-01

    Taurine, which is found at high concentration in the heart, exerts several protective actions on the myocardium. Physiologically, the high level of taurine in the heart is maintained by a taurine transporter (TauT), the expression of which is suppressed under ischemic insult. Although taurine supplementation upregulates TauT expression, elevates the intracellular taurine content and ameliorates the ischemic injury of cardiomyocytes (CMs), little is known about the regulatory mechanisms by which taurine governs TauT expression under ischemia. In this study, we describe the TonE (tonicity-responsive element)/TonEBP (TonE-binding protein) pathway involved in the taurine-regulated TauT expression in ischemic CMs. Taurine inhibited the ubiquitin-dependent proteasomal degradation of TonEBP, promoted the translocation of TonEBP into the nucleus, enhanced TauT promoter activity and finally upregulated TauT expression in CMs. In addition, we observed that TonEBP had an anti-apoptotic and anti-oxidative role in CMs under ischemia. Moreover, the protective effects of taurine on myocardial ischemia were TonEBP dependent. Collectively, our findings suggest that TonEBP is a core molecule in the protective mechanism of taurine in CMs under ischemic insult. PMID:26673669

  10. Visualization Challenges of a Subduction Simulation Using One Billion Markers

    Science.gov (United States)

    Rudolph, M. L.; Gerya, T. V.; Yuen, D. A.

    2004-12-01

    Recent advances in supercomputing technology have permitted us to study the multiscale, multicomponent fluid dynamics of subduction zones at unprecedented resolutions, down to about the length of a football field. We have performed numerical simulations using one billion tracers over a grid of about 80 thousand points in two dimensions. These runs were performed using a thermal-chemical simulation that accounts for hydration and partial melting in the thermal, mechanical, petrological, and rheological domains. From these runs, we have observed several geophysically interesting phenomena, including the development of plumes with unmixed mantle composition as well as plumes with mixed mantle/crust components. Unmixed plumes form at depths greater than 100 km (5-10 km above the upper interface of the subducting slab) and consist of partially molten wet peridotite. Mixed plumes form at lesser depths directly from the subducting slab and contain partially molten hydrated oceanic crust and sediments. These high-resolution simulations have also spurred the development of new visualization methods. We have created a new web-based interface to data from our subduction simulation and other high-resolution 2D data that uses a hierarchical data format to achieve response times of less than one second when accessing data files on the order of 3 GB. This interface, WEB-IS4, uses a Javascript and HTML frontend coupled with a C and PHP backend; it allows the user to perform region-of-interest zooming and real-time colormap selection, and can return relevant statistics for the data in the region of interest.
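
    The sub-second region-of-interest access described above rests on precomputing coarser versions of the data so that any requested window can be served at a resolution that fits a fixed pixel budget. The sketch below illustrates that general idea in Python with a simple 2x2-block-averaged pyramid; the function names, the pixel budget, and the random stand-in field are assumptions for illustration, not the actual WEB-IS4 implementation.

        import numpy as np

        def build_pyramid(field, levels=5):
            """Precompute a multi-resolution pyramid by 2x2 block averaging."""
            pyramid = [np.asarray(field, dtype=float)]
            for _ in range(levels - 1):
                f = pyramid[-1]
                f = f[: f.shape[0] // 2 * 2, : f.shape[1] // 2 * 2]   # trim to even size
                pyramid.append(f.reshape(f.shape[0] // 2, 2, f.shape[1] // 2, 2).mean(axis=(1, 3)))
            return pyramid

        def query_roi(pyramid, row0, row1, col0, col1, max_pixels=512 * 512):
            """Return the requested window at the finest level that fits the pixel
            budget, together with simple statistics for that window."""
            for level, f in enumerate(pyramid):
                s = 2 ** level
                tile = f[row0 // s: max(row1 // s, row0 // s + 1),
                         col0 // s: max(col1 // s, col0 // s + 1)]
                if tile.size <= max_pixels:
                    break
            stats = {"level": level, "min": float(tile.min()),
                     "max": float(tile.max()), "mean": float(tile.mean())}
            return tile, stats

        # usage sketch: one snapshot field, then a zoomed query
        field = np.random.rand(4096, 4096)
        pyr = build_pyramid(field)
        tile, stats = query_roi(pyr, 100, 3000, 200, 3900)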

  11. Metric adjusted skew information

    DEFF Research Database (Denmark)

    Hansen, Frank

    2008-01-01

    We extend the concept of Wigner-Yanase-Dyson skew information to something we call "metric adjusted skew information" (of a state with respect to a conserved observable). This "skew information" is intended to be a non-negative quantity bounded by the variance (of an observable in a state) that vanishes for observables commuting with the state. We show that the skew information is a convex function on the manifold of states. It also satisfies other requirements, proposed by Wigner and Yanase, for an effective measure-of-information content of a state relative to a conserved observable. We establish a connection between the geometrical formulation of quantum statistics as proposed by Chentsov and Morozova and measures of quantum information as introduced by Wigner and Yanase and extended in this article. We show that the set of normalized Morozova-Chentsov functions describing the possible ...

  12. The metric system: An introduction

    Energy Technology Data Exchange (ETDEWEB)

    Lumley, S.M.

    1995-05-01

    On July 13, 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on July 25, 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first they examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

  13. Attack-Resistant Trust Metrics

    Science.gov (United States)

    Levien, Raph

    The Internet is an amazingly powerful tool for connecting people together, unmatched in human history. Yet, with that power comes great potential for spam and abuse. Trust metrics are an attempt to compute which people are trustworthy and which are likely attackers. This chapter presents two specific trust metrics developed and deployed on the Advogato website, a community blog for free software developers. This real-world experience demonstrates that the trust metrics fulfilled their goals, but that for good results, it is important to match the assumptions of the abstract trust metric computation to the real-world implementation.

  14. The metric system: An introduction

    Science.gov (United States)

    Lumley, Susan M.

    On 13 Jul. 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on 25 Jul. 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first they examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

  15. Metric-adjusted skew information

    DEFF Research Database (Denmark)

    Liang, Cai; Hansen, Frank

    2010-01-01

    We give a truly elementary proof of the convexity of metric-adjusted skew information following an idea of Effros. We extend earlier results of weak forms of superadditivity to general metric-adjusted skew information. Recently, Luo and Zhang introduced the notion of semi-quantum states on a bipartite system and proved superadditivity of the Wigner-Yanase-Dyson skew informations for such states. We extend this result to the general metric-adjusted skew information. We finally show that a recently introduced extension to parameter values 1 ... of (unbounded) metric-adjusted skew information ...

  16. Two classes of metric spaces

    Directory of Open Access Journals (Sweden)

    Isabel Garrido

    2016-04-01

    Full Text Available The class of metric spaces (X,d) known as small-determined spaces, introduced by Garrido and Jaramillo, are properly defined by means of some type of real-valued Lipschitz functions on X. On the other hand, B-simple metric spaces introduced by Hejcman are defined in terms of some kind of bornologies of bounded subsets of X. In this note we present a common framework where both classes of metric spaces can be studied, which allows us not only to see the relationships between them but also to obtain new internal characterizations of these metric properties.

  17. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    Science.gov (United States)

    Post, J. V.

    1981-01-01

    Software quality metrics were extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

  18. Multimetric indices: How many metrics?

    Science.gov (United States)

    Multimetric indices (MMIs) often include 5 to 15 metrics, each representing a different attribute of assemblage condition, such as species diversity, tolerant taxa, and nonnative taxa. Is there an optimal number of metrics for MMIs? To explore this question, I created 1000 9-met...

  19. Metrical Phonology: German Sound System.

    Science.gov (United States)

    Tice, Bradley S.

    Metrical phonology, a linguistic process of phonological stress assessment and diagrammatic simplification of sentence and word stress, is discussed as it is found in the English and German languages. The objective is to promote use of metrical phonology as a tool for enhancing instruction in stress patterns in words and sentences, particularly in…

  20. Extending cosmology: the metric approach

    OpenAIRE

    Mendoza, S.

    2012-01-01

    Comment: 2012, Extending Cosmology: The Metric Approach, Open Questions in Cosmology; Review article for an Intech "Open questions in cosmology" book chapter (19 pages, 3 figures). Available from: http://www.intechopen.com/books/open-questions-in-cosmology/extending-cosmology-the-metric-approach

  1. Numerical Calabi-Yau metrics

    International Nuclear Information System (INIS)

    Douglas, Michael R.; Karp, Robert L.; Lukic, Sergio; Reinbacher, Rene

    2008-01-01

    We develop numerical methods for approximating Ricci flat metrics on Calabi-Yau hypersurfaces in projective spaces. Our approach is based on finding balanced metrics and builds on recent theoretical work by Donaldson. We illustrate our methods in detail for a one parameter family of quintics. We also suggest several ways to extend our results

  2. High resolution metric imaging payload

    Science.gov (United States)

    Delclaud, Y.

    2017-11-01

    Alcatel Space Industries has become Europe's leader in the field of high and very high resolution optical payloads, in the framework of Earth observation systems able to provide military and government users with metric images from space. This leadership allowed ALCATEL to propose for the export market, within a French collaboration framework, a complete space-based system for metric observation.

  3. Weyl metrics and wormholes

    Energy Technology Data Exchange (ETDEWEB)

    Gibbons, Gary W. [DAMTP, University of Cambridge, Wilberforce Road, Cambridge, CB3 0WA U.K. (United Kingdom); Volkov, Mikhail S., E-mail: gwg1@cam.ac.uk, E-mail: volkov@lmpt.univ-tours.fr [Laboratoire de Mathématiques et Physique Théorique, LMPT CNRS—UMR 7350, Université de Tours, Parc de Grandmont, Tours, 37200 France (France)

    2017-05-01

    We study solutions obtained via applying dualities and complexifications to the vacuum Weyl metrics generated by massive rods and by point masses. Rescaling them and extending to complex parameter values yields axially symmetric vacuum solutions containing singularities along circles that can be viewed as singular matter sources. These solutions have wormhole topology with several asymptotic regions interconnected by throats and their sources can be viewed as thin rings of negative tension encircling the throats. For a particular value of the ring tension the geometry becomes exactly flat although the topology remains non-trivial, so that the rings literally produce holes in flat space. To create a single ring wormhole of one metre radius one needs a negative energy equivalent to the mass of Jupiter. Further duality transformations dress the rings with the scalar field, either conventional or phantom. This gives rise to large classes of static, axially symmetric solutions, presumably including all previously known solutions for a gravity-coupled massless scalar field, as for example the spherically symmetric Bronnikov-Ellis wormholes with phantom scalar. The multi-wormholes contain infinite struts everywhere at the symmetry axes, apart from solutions with locally flat geometry.

  4. Metrics for image segmentation

    Science.gov (United States)

    Rees, Gareth; Greenway, Phil; Morray, Denise

    1998-07-01

    An important challenge in mapping image-processing techniques onto applications is the lack of quantitative performance measures. From a systems engineering perspective these are essential if system level requirements are to be decomposed into sub-system requirements which can be understood in terms of algorithm selection and performance optimization. Nowhere in computer vision is this more evident than in the area of image segmentation. This is a vigorous and innovative research activity, but even after nearly two decades of progress, it remains almost impossible to answer the question 'what would the performance of this segmentation algorithm be under these new conditions?' To begin to address this shortcoming, we have devised a well-principled metric for assessing the relative performance of two segmentation algorithms. This allows meaningful objective comparisons to be made between their outputs. It also estimates the absolute performance of an algorithm given ground truth. Our approach is an information theoretic one. In this paper, we describe the theory and motivation of our method, and present practical results obtained from a range of state of the art segmentation methods. We demonstrate that it is possible to measure the objective performance of these algorithms, and to use the information so gained to provide clues about how their performance might be improved.
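
    As an illustration of an information-theoretic comparison between two segmentations (not necessarily the specific metric developed in the paper), the sketch below computes the mutual information and the variation of information between two label maps; the synthetic ground-truth array is an assumption used only for the usage example.

        import numpy as np

        def information_metrics(seg_a, seg_b):
            """Return (mutual information, variation of information) in bits for two
            label maps of equal shape; lower VI means closer agreement."""
            a = np.asarray(seg_a).ravel()
            b = np.asarray(seg_b).ravel()
            _, a_idx = np.unique(a, return_inverse=True)
            _, b_idx = np.unique(b, return_inverse=True)
            joint = np.zeros((a_idx.max() + 1, b_idx.max() + 1))
            np.add.at(joint, (a_idx, b_idx), 1.0)           # joint label histogram
            p_ab = joint / a.size
            p_a = p_ab.sum(axis=1, keepdims=True)
            p_b = p_ab.sum(axis=0, keepdims=True)
            nz = p_ab > 0
            h_a = -np.sum(p_a[p_a > 0] * np.log2(p_a[p_a > 0]))
            h_b = -np.sum(p_b[p_b > 0] * np.log2(p_b[p_b > 0]))
            mi = np.sum(p_ab[nz] * np.log2(p_ab[nz] / (p_a @ p_b)[nz]))
            return mi, h_a + h_b - 2.0 * mi

        # usage: score an algorithm's output against ground truth
        truth = np.random.randint(0, 4, size=(64, 64))
        output = truth.copy()
        output[:8, :8] = 0                                  # introduce a small disagreement
        mi, vi = information_metrics(truth, output)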

  5. Metric regularity and subdifferential calculus

    International Nuclear Information System (INIS)

    Ioffe, A D

    2000-01-01

    The theory of metric regularity is an extension of two classical results: the Lyusternik tangent space theorem and the Graves surjection theorem. Developments in non-smooth analysis in the 1980s and 1990s paved the way for a number of far-reaching extensions of these results. It was also well understood that the phenomena behind the results are of metric origin, not connected with any linear structure. At the same time it became clear that some basic hypotheses of the subdifferential calculus are closely connected with the metric regularity of certain set-valued maps. The survey is devoted to the metric theory of metric regularity and its connection with subdifferential calculus in Banach spaces

  6. Determination of a Screening Metric for High Diversity DNA Libraries.

    Science.gov (United States)

    Guido, Nicholas J; Handerson, Steven; Joseph, Elaine M; Leake, Devin; Kung, Li A

    2016-01-01

    The fields of antibody engineering, enzyme optimization and pathway construction rely increasingly on screening complex variant DNA libraries. These highly diverse libraries allow researchers to sample a maximized sequence space; and therefore, more rapidly identify proteins with significantly improved activity. The current state of the art in synthetic biology allows for libraries with billions of variants, pushing the limits of researchers' ability to qualify libraries for screening by measuring the traditional quality metrics of fidelity and diversity of variants. Instead, when screening variant libraries, researchers typically use a generic, and often insufficient, oversampling rate based on a common rule-of-thumb. We have developed methods to calculate a library-specific oversampling metric, based on fidelity, diversity, and representation of variants, which informs researchers, prior to screening the library, of the amount of oversampling required to ensure that the desired fraction of variant molecules will be sampled. To derive this oversampling metric, we developed a novel alignment tool to efficiently measure frequency counts of individual nucleotide variant positions using next-generation sequencing data. Next, we apply a method based on the "coupon collector" probability theory to construct a curve of upper bound estimates of the sampling size required for any desired variant coverage. The calculated oversampling metric will guide researchers to maximize their efficiency in using highly variant libraries.
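
    The "coupon collector" bound mentioned above can be sketched with a few lines of arithmetic: under the simplifying assumption of uniform variant representation (the paper additionally corrects for fidelity and measured representation, which is not reproduced here), the expected number of clones screened to observe a target fraction of the variants follows from differences of harmonic numbers. The function and threshold values below are illustrative assumptions.

        import math

        _EULER_GAMMA = 0.5772156649015329

        def _harmonic(k):
            """k-th harmonic number (asymptotic form for large k)."""
            if k <= 0:
                return 0.0
            if k < 100:
                return sum(1.0 / i for i in range(1, k + 1))
            return math.log(k) + _EULER_GAMMA + 1.0 / (2 * k)

        def expected_screens(num_variants, coverage_fraction):
            """Expected number of uniformly sampled clones needed so that, on average,
            `coverage_fraction` of `num_variants` distinct variants are seen at least
            once: E = N * (H_N - H_{N*(1-f)})."""
            n = num_variants
            remaining = int(round(n * (1.0 - coverage_fraction)))
            return n * (_harmonic(n) - _harmonic(remaining))

        # usage: oversampling factor to see 95% of a one-million-variant library
        n = 10 ** 6
        print(expected_screens(n, 0.95) / n)   # ~3.0, i.e. roughly 3-fold oversampling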

  7. Determination of a Screening Metric for High Diversity DNA Libraries.

    Directory of Open Access Journals (Sweden)

    Nicholas J Guido

    Full Text Available The fields of antibody engineering, enzyme optimization and pathway construction rely increasingly on screening complex variant DNA libraries. These highly diverse libraries allow researchers to sample a maximized sequence space; and therefore, more rapidly identify proteins with significantly improved activity. The current state of the art in synthetic biology allows for libraries with billions of variants, pushing the limits of researchers' ability to qualify libraries for screening by measuring the traditional quality metrics of fidelity and diversity of variants. Instead, when screening variant libraries, researchers typically use a generic, and often insufficient, oversampling rate based on a common rule-of-thumb. We have developed methods to calculate a library-specific oversampling metric, based on fidelity, diversity, and representation of variants, which informs researchers, prior to screening the library, of the amount of oversampling required to ensure that the desired fraction of variant molecules will be sampled. To derive this oversampling metric, we developed a novel alignment tool to efficiently measure frequency counts of individual nucleotide variant positions using next-generation sequencing data. Next, we apply a method based on the "coupon collector" probability theory to construct a curve of upper bound estimates of the sampling size required for any desired variant coverage. The calculated oversampling metric will guide researchers to maximize their efficiency in using highly variant libraries.

  8. Twitching motility and biofilm formation are associated with tonB1 in Xylella fastidiosa.

    Science.gov (United States)

    Cursino, Luciana; Li, Yaxin; Zaini, Paulo A; De La Fuente, Leonardo; Hoch, Harvey C; Burr, Thomas J

    2009-10-01

    A mutation in the Xylella fastidiosa tonB1 gene resulted in loss of twitching motility and in significantly less biofilm formation as compared with a wild type. The altered motility and biofilm phenotypes were restored by complementation with a functional copy of the gene. The mutation affected virulence as measured by Pierce's disease symptoms on grapevines. The role of TonB1 in twitching and biofilm formation appears to be independent of the characteristic iron-uptake function of this protein. This is the first report demonstrating a functional role for a tonB homolog in X. fastidiosa.

  9. A SWIRE Picture is Worth Billions of Years

    Science.gov (United States)

    2005-01-01

    [Figures removed; captions: Figure 1: SWIRE View of Distant Galaxies; Figures 2-4.] These spectacular images, taken by the Spitzer Wide-area Infrared Extragalactic (SWIRE) Legacy project, encapsulate one of the primary objectives of the Spitzer mission: to connect the evolution of galaxies from the distant, or early, universe to the nearby, or present day, universe. The Tadpole galaxy (main image) is the result of a recent galactic interaction in the local universe. Although these galactic mergers are rare in the universe's recent history, astronomers believe that they were much more common in the early universe. Thus, SWIRE team members will use this detailed image of the Tadpole galaxy to help understand the nature of the 'faint red-orange specks' of the early universe. The larger picture (figure 2) depicts one-sixteenth of the SWIRE survey field called ELAIS-N1. In this image, the bright blue sources are hot stars in our own Milky Way, which range anywhere from 3 to 60 times the mass of our Sun. The fainter green spots are cooler stars and galaxies beyond the Milky Way whose light is dominated by older stellar populations. The red dots are dusty galaxies that are undergoing intense star formation. The faintest specks of red-orange are galaxies billions of light-years away in the distant universe. Figure 3 features an unusual ring-like galaxy called CGCG 275-022. The red spiral arms indicate that this galaxy is very dusty and perhaps undergoing intense star formation. The star-forming activity could have been initiated by a near head-on collision with another galaxy. The most distant galaxies that SWIRE is able to detect are revealed in a zoom of deep space (figure 4). The colors in this feature represent the same objects as those in the larger field image of ELAIS-N1. The observed SWIRE

  10. METRICS DEVELOPMENT FOR PATENTS.

    Science.gov (United States)

    Veiga, Daniela Francescato; Ferreira, Lydia Masako

    2015-01-01

    To develop a proposal for metrics for patents to be applied in assessing the postgraduate programs of Medicine III - Capes. From the reading and analysis of the 2013 area documents of all 48 areas of Capes, a proposal for metrics for patents was developed to be applied in Medicine III programs. Except for the areas Biotechnology, Food Science, Biological Sciences III, Physical Education, Engineering I, III and IV, and Interdisciplinary, most areas do not adopt a scoring system for patents. The proposal developed was based on the criteria of Biotechnology, with adaptations. In general, deposit, granting, and licensing/production are valued in ascending order. Higher scores are also assigned to patents registered abroad and whenever students participate. This proposal can be applied to the item Intellectual Production of the evaluation form, in the subsection Technical Production/Patents. The percentage of 10% for academic programs and 40% for Professional Masters should be maintained. The program will be scored as Very Good when it reaches 400 points or over; Good, between 200 and 399 points; Regular, between 71 and 199 points; Weak, up to 70 points; Insufficient, no points.
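
    The rating thresholds stated in the abstract translate directly into a small scoring function; the sketch below is only an illustration of those cut-offs, not an official Capes tool.

        def classify_program(points):
            """Map accumulated patent points to the proposed Medicine III rating."""
            if points is None or points <= 0:
                return "Insufficient"      # no points
            if points <= 70:
                return "Weak"
            if points <= 199:
                return "Regular"
            if points <= 399:
                return "Good"
            return "Very Good"             # 400 points or over

        # usage
        for p in (0, 50, 150, 250, 400):
            print(p, classify_program(p))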

  11. A Metric for Heterotic Moduli

    Science.gov (United States)

    Candelas, Philip; de la Ossa, Xenia; McOrist, Jock

    2017-12-01

    Heterotic vacua of string theory are realised, at large radius, by a compact threefold with vanishing first Chern class together with a choice of stable holomorphic vector bundle. These form a wide class of potentially realistic four-dimensional vacua of string theory. Despite all their phenomenological promise, there is little understanding of the metric on the moduli space of these. What is sought is the analogue of special geometry for these vacua. The metric on the moduli space is important in phenomenology as it normalises D-terms and Yukawa couplings. It is also of interest in mathematics, since it generalises the metric, first found by Kobayashi, on the space of gauge field connections, to a more general context. Here we construct this metric, correct to first order in α′, in two ways: first by postulating a metric that is invariant under background gauge transformations of the gauge field, and also by dimensionally reducing heterotic supergravity. These methods agree and the resulting metric is Kähler, as is required by supersymmetry. Checking the metric is Kähler is intricate and the anomaly cancellation equation for the H field plays an essential role. The Kähler potential nevertheless takes a remarkably simple form: it is the Kähler potential of special geometry with the Kähler form replaced by the α′-corrected hermitian form.

  12. Implications of Metric Choice for Common Applications of Readmission Metrics

    OpenAIRE

    Davies, Sheryl; Saynina, Olga; Schultz, Ellen; McDonald, Kathryn M; Baker, Laurence C

    2013-01-01

    Objective. To quantify the differential impact on hospital performance of three readmission metrics: all-cause readmission (ACR), 3M Potential Preventable Readmission (PPR), and Centers for Medicare and Medicaid 30-day readmission (CMS).

  13. Issues in Benchmark Metric Selection

    Science.gov (United States)

    Crolotte, Alain

    It is true that a metric can influence a benchmark but will esoteric metrics create more problems than they will solve? We answer this question affirmatively by examining the case of the TPC-D metric which used the much debated geometric mean for the single-stream test. We will show how a simple choice influenced the benchmark and its conduct and, to some extent, DBMS development. After examining other alternatives our conclusion is that the “real” measure for a decision-support benchmark is the arithmetic mean.
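
    The sensitivity the abstract alludes to is easy to reproduce: with hypothetical per-query timings, the arithmetic and geometric means can rank two systems in opposite order, because the geometric mean damps the influence of a single very slow query. The timing values below are made up for illustration.

        import math

        def arithmetic_mean(times):
            return sum(times) / len(times)

        def geometric_mean(times):
            return math.exp(sum(math.log(t) for t in times) / len(times))

        # hypothetical per-query elapsed times (seconds) for two systems
        system_a = [1.0, 1.0, 1.0, 1.0, 100.0]   # one very slow query
        system_b = [8.0, 8.0, 8.0, 8.0, 8.0]     # uniformly mediocre

        for name, t in (("A", system_a), ("B", system_b)):
            print(name, round(arithmetic_mean(t), 2), round(geometric_mean(t), 2))
        # arithmetic mean favours B (20.8 vs 8.0); geometric mean favours A (2.51 vs 8.0)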

  14. Background metric in supergravity theories

    International Nuclear Information System (INIS)

    Yoneya, T.

    1978-01-01

    In supergravity theories, we investigate the conformal anomaly of the path-integral determinant and the problem of fermion zero modes in the presence of a nontrivial background metric. Except in SO(3)-invariant supergravity, there are nonvanishing conformal anomalies. As a consequence, amplitudes around the nontrivial background metric contain unpredictable arbitrariness. The fermion zero modes which are explicitly constructed for the Euclidean Schwarzschild metric are interpreted as an indication of the supersymmetric multiplet structure of a black hole. The degree of degeneracy of a black hole is 2^(4n) in SO(n) supergravity.

  15. Generalized Painleve-Gullstrand metrics

    Energy Technology Data Exchange (ETDEWEB)

    Lin Chunyu [Department of Physics, National Cheng Kung University, Tainan 70101, Taiwan (China)], E-mail: l2891112@mail.ncku.edu.tw; Soo Chopin [Department of Physics, National Cheng Kung University, Tainan 70101, Taiwan (China)], E-mail: cpsoo@mail.ncku.edu.tw

    2009-02-02

    An obstruction to the implementation of spatially flat Painleve-Gullstrand (PG) slicings is demonstrated, and explicitly discussed for Reissner-Nordstroem and Schwarzschild-anti-deSitter spacetimes. Generalizations of PG slicings which are not spatially flat but which remain regular at the horizons are introduced. These metrics can be obtained from standard spherically symmetric metrics by physical Lorentz boosts. With these generalized PG metrics, problematic contributions to the imaginary part of the action in the Parikh-Wilczek derivation of Hawking radiation due to the obstruction can be avoided.

  16. Daylight metrics and energy savings

    Energy Technology Data Exchange (ETDEWEB)

    Mardaljevic, John; Heschong, Lisa; Lee, Eleanor

    2009-12-31

    The drive towards sustainable, low-energy buildings has increased the need for simple, yet accurate methods to evaluate whether a daylit building meets minimum standards for energy and human comfort performance. Current metrics do not account for the temporal and spatial aspects of daylight, nor for occupants' comfort or interventions. This paper reviews the historical basis of current compliance methods for achieving daylit buildings, proposes a technical basis for development of better metrics, and provides two case study examples to stimulate dialogue on how metrics can be applied in a practical, real-world context.

  17. Next-Generation Metrics: Responsible Metrics & Evaluation for Open Science

    Energy Technology Data Exchange (ETDEWEB)

    Wilsdon, J.; Bar-Ilan, J.; Peters, I.; Wouters, P.

    2016-07-01

    Metrics evoke a mixed reaction from the research community. A commitment to using data to inform decisions makes some enthusiastic about the prospect of granular, real-time analysis of research and its wider impacts. Yet we only have to look at the blunt use of metrics such as journal impact factors, h-indices and grant income targets to be reminded of the pitfalls. Some of the most precious qualities of academic culture resist simple quantification, and individual indicators often struggle to do justice to the richness and plurality of research. Too often, poorly designed evaluation criteria are "dominating minds, distorting behaviour and determining careers" (Lawrence, 2007). Metrics hold real power: they are constitutive of values, identities and livelihoods. How to exercise that power to more positive ends has been the focus of several recent and complementary initiatives, including the San Francisco Declaration on Research Assessment (DORA1), the Leiden Manifesto2 and The Metric Tide3 (a UK government review of the role of metrics in research management and assessment). Building on these initiatives, the European Commission, under its new Open Science Policy Platform4, is now looking to develop a framework for responsible metrics for research management and evaluation, which can be incorporated into the successor framework to Horizon 2020. (Author)

  18. Let's Make Metric Ice Cream

    Science.gov (United States)

    Zimmerman, Marianna

    1975-01-01

    Describes a classroom activity which involved sixth grade students in a learning situation including making ice cream, safety procedures in a science laboratory, calibrating a thermometer, using metric units of volume and mass. (EB)

  19. Nuclear budget for FY1991 up 3.6% to 409.7 billion yen

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    A total of yen 409.7 billion was approved for the Government's nuclear energy draft budget for fiscal 1991 on December 28, when the Cabinet gave its approval. The total, the highest ever, was divided into yen 182.6 billion for the general account and yen 227.1 billion for the special account for power resources development, representing a 3.6% increase over the ongoing fiscal year's level of yen 395.5 billion. The draft budget will be examined for approval by the Diet by the end of March. The nuclear energy budget devoted to research and development projects governed by the Science and Technology Agency amounts to yen 306.4 billion, up 3.5%, exceeding yen 300 billion for the first time. The nuclear budget for the Ministry of International Trade and Industry is yen 98.1 billion, up 3.5%. For the other ministries, including the Ministry of Foreign Affairs, yen 5.1 billion was allotted to nuclear energy-related projects. The Government had decided to raise the unit cost of the power plant siting promotion subsidies in the special account for power resources development by 25%, from yen 600/kW to yen 750/kW, in order to support the siting of plants. Consequently, the power resources siting accounts of the special accounts for both STA and MITI showed high growth rates: 6.3% and 7.5%, respectively. (N.K.)

  20. Community access networks: how to connect the next billion to the ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Community access networks: how to connect the next billion to the Internet. Despite recent progress with mobile technology diffusion, more than four billion people worldwide are unconnected and have limited access to global communication infrastructure. The cost of implementing connectivity infrastructure in underserved ...

  1. Interactive (statistical) visualisation and exploration of a billion objects with vaex

    NARCIS (Netherlands)

    Breddels, M. A.

    2016-01-01

    With new catalogues arriving such as the Gaia DR1, containing more than a billion objects, new methods of handling and visualizing these data volumes are needed. We show that by calculating statistics on a regular (N-dimensional) grid, visualizations of a billion objects can be done within a second
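
    The core idea of gridded statistics can be sketched in plain NumPy: instead of drawing a billion points, one accumulates counts (and any per-bin statistic) on a regular grid and renders the grid. The function below is an illustrative sketch of that idea, not vaex's actual API, and the mock catalogue is an assumption.

        import numpy as np

        def binned_statistics(x, y, values, bins=256, limits=((0.0, 1.0), (0.0, 1.0))):
            """Count points and average `values` on a regular 2D grid."""
            (xmin, xmax), (ymin, ymax) = limits
            ix = np.clip(((x - xmin) / (xmax - xmin) * bins).astype(np.int64), 0, bins - 1)
            iy = np.clip(((y - ymin) / (ymax - ymin) * bins).astype(np.int64), 0, bins - 1)
            flat = ix * bins + iy
            counts = np.bincount(flat, minlength=bins * bins).reshape(bins, bins)
            sums = np.bincount(flat, weights=values, minlength=bins * bins).reshape(bins, bins)
            means = np.where(counts > 0, sums / np.maximum(counts, 1), np.nan)
            return counts, means

        # usage with a mock catalogue: positions plus one quantity (e.g. a magnitude)
        rng = np.random.default_rng(0)
        x, y, mag = rng.random(10**6), rng.random(10**6), rng.normal(15.0, 2.0, 10**6)
        counts, mean_mag = binned_statistics(x, y, mag, bins=512)   # plot e.g. log(counts)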

  2. Experiential space is hardly metric

    Czech Academy of Sciences Publication Activity Database

    Šikl, Radovan; Šimeček, Michal; Lukavský, Jiří

    2008-01-01

    Vol. 2008, No. 37 (2008), pp. 58-58 ISSN 0301-0066. [European Conference on Visual Perception. 24.08-28.08.2008, Utrecht] R&D Projects: GA ČR GA406/07/1676 Institutional research plan: CEZ:AV0Z70250504 Keywords: visual space perception * metric and non-metric perceptual judgments * ecological validity Subject RIV: AN - Psychology

  3. Coverage Metrics for Model Checking

    Science.gov (United States)

    Penix, John; Visser, Willem; Norvig, Peter (Technical Monitor)

    2001-01-01

    When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.

  4. Phantom metrics with Killing spinors

    Directory of Open Access Journals (Sweden)

    W.A. Sabra

    2015-11-01

    Full Text Available We study metric solutions of Einstein-anti-Maxwell theory admitting Killing spinors. The analogue of the IWP metric which admits a space-like Killing vector is found and is expressed in terms of a complex function satisfying the wave equation in flat (2+1)-dimensional space-time. As examples, electric and magnetic Kasner spaces are constructed by allowing the solution to depend only on the time coordinate. Euclidean solutions are also presented.

  5. Potential effects of the next 100 billion hamburgers sold by McDonald's.

    Science.gov (United States)

    Spencer, Elsa H; Frank, Erica; McIntosh, Nichole F

    2005-05-01

    McDonald's has sold >100 billion beef-based hamburgers worldwide with a potentially considerable health impact. This paper explores whether there would be any advantages if the next 100 billion burgers were instead plant-based burgers. Nutrient composition of the beef hamburger patty and the McVeggie burger patty were obtained from the McDonald's website; sales data were obtained from the McDonald's customer service. Consuming 100 billion McDonald's beef burgers versus the same company's McVeggie burgers would provide, approximately, on average, an additional 550 million pounds of saturated fat and 1.2 billion total pounds of fat, as well as 1 billion fewer pounds of fiber, 660 million fewer pounds of protein, and no difference in calories. These data suggest that the McDonald's new McVeggie burger represents a less harmful fast-food choice than the beef burger.
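
    The aggregate figures quoted above are simple scaling arithmetic: a per-patty nutrient difference multiplied by 100 billion burgers and converted to pounds. The per-patty differences below are back-of-the-envelope assumptions chosen only to reproduce the abstract's order of magnitude; they are not taken from McDonald's published nutrition data.

        GRAMS_PER_POUND = 453.592
        BURGERS = 100e9

        # assumed per-patty differences (beef patty minus McVeggie patty), in grams
        per_patty_difference_g = {
            "saturated fat": 2.5,    # beef patty has more
            "total fat": 5.5,        # beef patty has more
            "fiber": -4.5,           # beef patty has less
            "protein": -3.0,         # beef patty has less
        }

        for nutrient, diff_g in per_patty_difference_g.items():
            total_lb = diff_g * BURGERS / GRAMS_PER_POUND
            direction = "more" if diff_g > 0 else "fewer"
            print(f"{nutrient}: {abs(total_lb) / 1e6:,.0f} million pounds {direction}")
        # e.g. 2.5 g/patty * 1e11 patties / 453.6 g/lb ~ 550 million pounds of saturated fat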

  6. Scalar-metric and scalar-metric-torsion gravitational theories

    International Nuclear Information System (INIS)

    Aldersley, S.J.

    1977-01-01

    The techniques of dimensional analysis and of the theory of tensorial concomitants are employed to study field equations in gravitational theories which incorporate scalar fields of the Brans-Dicke type. Within the context of scalar-metric gravitational theories, a uniqueness theorem for the geometric (or gravitational) part of the field equations is proven and a Lagrangian is determined which is uniquely specified by dimensional analysis. Within the context of scalar-metric-torsion gravitational theories a uniqueness theorem for field Lagrangians is presented and the corresponding Euler-Lagrange equations are given. Finally, an example of a scalar-metric-torsion theory is presented which is similar in many respects to the Brans-Dicke theory and the Einstein-Cartan theory

  7. Regge calculus from discontinuous metrics

    International Nuclear Information System (INIS)

    Khatsymovsky, V.M.

    2003-01-01

    Regge calculus is considered as a particular case of the more general system where the linklengths of any two neighbouring 4-tetrahedra do not necessarily coincide on their common face. This system is treated as the one described by a metric discontinuous on the faces. In the superspace of all discontinuous metrics, the Regge calculus metrics form some hypersurface defined by continuity conditions. Quantum theory of the discontinuous metric system is assumed to be fixed somehow in the form of a quantum measure on (the space of functionals on) the superspace. The problem of reducing this measure to the Regge hypersurface is addressed. The quantum Regge calculus measure is defined from a discontinuous metric measure by inserting the δ-function-like phase factor. The requirement that continuity conditions be imposed in a 'face-independent' way fixes this factor uniquely. The term 'face-independent' means that this factor depends only on the (hyper)plane spanned by the face, not on its form and size. This requirement seems to be natural from the viewpoint of the existence of a well-defined continuum limit maximally free of lattice artefacts.

  8. Symmetries of Taub-NUT dual metrics

    International Nuclear Information System (INIS)

    Baleanu, D.; Codoban, S.

    1998-01-01

    Recently geometric duality was analyzed for a metric which admits Killing tensors. An interesting example arises when the manifold has Killing-Yano tensors. The symmetries of the dual metrics in the case of Taub-NUT metric are investigated. Generic and non-generic symmetries of dual Taub-NUT metric are analyzed

  9. Operation and maintenance techniques of 1 ton bucket elevator in IMEF

    International Nuclear Information System (INIS)

    Soong, Woong Sup

    1999-04-01

    The IMEF pool is used as a pathway between the pool and the hot cell in order to transfer (incoming and outgoing) irradiated materials. Transfer is performed by a 1 ton bucket elevator which moves inside the rectangular tube installed between the pool and the M1 hot cell. The allowable load capacity of the bucket elevator is 1 ton and its size is 25 X 25 X 150 cm. The bucket is driven by a chain system which moves up and down along the guide rail. The guide rail is installed in a rectangular tube that is tilted about 63 degrees. The chain, which moves by the roller sliding method, is driven by a sprocket wheel rotated by the shaft, and the shaft is driven by a gear-reducing motor. In this report, operation and maintenance techniques of the 1 ton bucket elevator in IMEF are described in detail. (Author). 8 refs., 14 tabs., 6 figs

  10. Operation and maintenance techniques of 1 ton bucket elevator in IMEF

    Energy Technology Data Exchange (ETDEWEB)

    Soong, Woong Sup

    1999-04-01

    The IMEF pool is used as a pathway between the pool and the hot cell in order to transfer (incoming and outgoing) irradiated materials. Transfer is performed by a 1 ton bucket elevator which moves inside the rectangular tube installed between the pool and the M1 hot cell. The allowable load capacity of the bucket elevator is 1 ton and its size is 25 X 25 X 150 cm. The bucket is driven by a chain system which moves up and down along the guide rail. The guide rail is installed in a rectangular tube that is tilted about 63 degrees. The chain, which moves by the roller sliding method, is driven by a sprocket wheel rotated by the shaft, and the shaft is driven by a gear-reducing motor. In this report, operation and maintenance techniques of the 1 ton bucket elevator in IMEF are described in detail. (Author). 8 refs., 14 tabs., 6 figs.

  11. A Kerr-NUT metric

    International Nuclear Information System (INIS)

    Vaidya, P.C.; Patel, L.K.; Bhatt, P.V.

    1976-01-01

    Using Galilean time and retarded distance as coordinates, the usual Kerr metric is expressed in a form similar to the Newman-Unti-Tamburino (NUT) metric. The combined Kerr-NUT metric is then investigated. In addition to the Kerr and NUT solutions of Einstein's equations, three other types of solutions are derived. These are (i) the radiating Kerr solution, (ii) the radiating NUT solution satisfying R_ik = σ ξ_i ξ_k, ξ_i ξ^i = 0, and (iii) the associated Kerr solution satisfying R_ik = 0. Solution (i) is distinct from and simpler than the one reported earlier by Vaidya and Patel (Phys. Rev. D 7:3590 (1973)). Solutions (ii) and (iii) give line elements which have the axis of symmetry as a singular line. (author)

  12. Complexity Metrics for Workflow Nets

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M.P.

    2009-01-01

    ... analysts have difficulties grasping the dynamics implied by a process model. Recent empirical studies show that people make numerous errors when modeling complex business processes, e.g., about 20 percent of the EPCs in the SAP reference model have design flaws resulting in potential deadlocks, livelocks, etc. It seems obvious that the complexity of the model contributes to design errors and a lack of understanding. It is not easy to measure complexity, however. This paper presents three complexity metrics that have been implemented in the process analysis tool ProM. The metrics are defined for a subclass of Petri nets named Workflow nets, but the results can easily be applied to other languages. To demonstrate the applicability of these metrics, we have applied our approach and tool to 262 relatively complex Protos models made in the context of various student projects. This allows us to validate ...

  13. Twitching motility and biofilm formation are associated with tonB1 in Xylella fastidiosa

    OpenAIRE

    Cursino, Luciana; Li, Yaxin; Zaini, Paulo A.; De La Fuente, Leonardo; Hoch, Harvey C.; Burr, Thomas J.

    2017-01-01

    A mutation in the Xylella fastidiosa tonB1 gene resulted in loss of twitching motility and in significantly less biofilm formation as compared with a wild type. The altered motility and biofilm phenotypes were restored by complementation with a functional copy of the gene. The mutation affected virulence as measured by Pierce's disease symptoms on grapevines. The role of TonB1 in twitching and biofilm formation appears to be independent of the characteristic iron-uptake function of this prote...

  14. The uniqueness of the Fisher metric as information metric

    Czech Academy of Sciences Publication Activity Database

    Le, Hong-Van

    2017-01-01

    Vol. 69, No. 4 (2017), pp. 879-896 ISSN 0020-3157 Institutional support: RVO:67985840 Keywords: Chentsov's theorem * mixed topology * monotonicity of the Fisher metric Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 1.049, year: 2016 https://link.springer.com/article/10.1007%2Fs10463-016-0562-0

  15. Thermodynamic metrics and optimal paths.

    Science.gov (United States)

    Sivak, David A; Crooks, Gavin E

    2012-05-11

    A fundamental problem in modern thermodynamics is how a molecular-scale machine performs useful work, while operating away from thermal equilibrium without excessive dissipation. To this end, we derive a friction tensor that induces a Riemannian manifold on the space of thermodynamic states. Within the linear-response regime, this metric structure controls the dissipation of finite-time transformations, and bestows optimal protocols with many useful properties. We discuss the connection to the existing thermodynamic length formalism, and demonstrate the utility of this metric by solving for optimal control parameter protocols in a simple nonequilibrium model.
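
    For reference, the central linear-response relations the abstract refers to can be sketched as follows (notation and the β prefactor follow the thermodynamic-length literature; this is a summary sketch, not a transcription of the paper's equations): the friction tensor is a time integral of equilibrium force fluctuations, it defines a Riemannian metric on control-parameter space, and the excess work of a slow protocol is the corresponding quadratic form integrated along the path.

        % friction tensor from equilibrium fluctuations of the conjugate forces X_i = -dH/d(lambda_i)
        \zeta_{ij}(\lambda) \;=\; \beta \int_{0}^{\infty} \mathrm{d}t\,
            \bigl\langle \delta X_{j}(0)\, \delta X_{i}(t) \bigr\rangle_{\lambda},
        \qquad \delta X_{i} = X_{i} - \langle X_{i} \rangle_{\lambda}

        % excess (dissipated) work of a slow protocol lambda(t), and the thermodynamic length
        W_{\mathrm{ex}} \;\simeq\; \int_{0}^{\tau} \mathrm{d}t\;
            \dot{\lambda}^{\mathsf{T}}\, \zeta(\lambda(t))\, \dot{\lambda},
        \qquad
        \mathcal{L} \;=\; \int_{0}^{\tau} \mathrm{d}t\,
            \sqrt{\dot{\lambda}^{\mathsf{T}}\, \zeta(\lambda(t))\, \dot{\lambda}}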

  16. Invariant metrics for Hamiltonian systems

    International Nuclear Information System (INIS)

    Rangarajan, G.; Dragt, A.J.; Neri, F.

    1991-05-01

    In this paper, invariant metrics are constructed for Hamiltonian systems. These metrics give rise to norms on the space of homogeneous polynomials of phase-space variables. For an accelerator lattice described by a Hamiltonian, these norms characterize the nonlinear content of the lattice. Therefore, the performance of the lattice can be improved by minimizing the norm as a function of parameters describing the beam-line elements in the lattice. A four-fold increase in the dynamic aperture of a model FODO cell is obtained using this procedure. 7 refs

  17. Generalization of Vaidya's radiation metric

    Energy Technology Data Exchange (ETDEWEB)

    Gleiser, R J; Kozameh, C N [Universidad Nacional de Cordoba (Argentina). Instituto de Matematica, Astronomia y Fisica

    1981-11-01

    In this paper it is shown that if Vaidya's radiation metric is considered from the point of view of kinetic theory in general relativity, the corresponding phase space distribution function can be generalized in a particular way. The new family of spherically symmetric radiation metrics obtained contains Vaidya's as a limiting situation. The Einstein field equations are solved in a ''comoving'' coordinate system. Two arbitrary functions of a single variable are introduced in the process of solving these equations. Particular examples considered are a stationary solution, a nonvacuum solution depending on a single parameter, and several limiting situations.

  18. Technical Privacy Metrics: a Systematic Survey

    OpenAIRE

    Wagner, Isabel; Eckhoff, David

    2018-01-01

    The goal of privacy metrics is to measure the degree of privacy enjoyed by users in a system and the amount of protection offered by privacy-enhancing technologies. In this way, privacy metrics contribute to improving user privacy in the digital world. The diversity and complexity of privacy metrics in the literature makes an informed choice of metrics challenging. As a result, instead of using existing metrics, n...

  19. Remarks on G-Metric Spaces

    Directory of Open Access Journals (Sweden)

    Bessem Samet

    2013-01-01

    Full Text Available In 2005, Mustafa and Sims (2006) introduced and studied a new class of generalized metric spaces, which are called G-metric spaces, as a generalization of metric spaces. We establish some useful propositions to show that many fixed point theorems on (nonsymmetric) G-metric spaces given recently by many authors follow directly from well-known theorems on metric spaces. Our technique can be easily extended to other results, as shown in application.

  20. DLA Energy Biofuel Feedstock Metrics Study

    Science.gov (United States)

    2012-12-11

    [Tabular content flattened in extraction; recoverable items:] Metric 1: State invasiveness ranking (e.g., moderately/highly invasive). Metric 2: Genetically modified organism (GMO) hazard, Yes/No and hazard category. Metric 3: Species hybridization. Life-cycle stages considered include Stage 4 (biofuel distribution) and Stage 5 (biofuel use). ... may utilize GMO microbial or microalgae species across the applicable biofuel life cycles (stages 1-3). The following consequence Metrics 4-6 then

  1. Separable metrics and radiating stars

    Indian Academy of Sciences (India)

    We study the junction condition relating the pressure to heat flux at the boundary of an accelerating and expanding spherically symmetric radiating star. We transform the junction condition to an ordinary differential equation by making a separability assumption on the metric functions in the space–time variables.

  2. Socio-technical security metrics

    NARCIS (Netherlands)

    Gollmann, D.; Herley, C.; Koenig, V.; Pieters, W.; Sasse, M.A.

    2015-01-01

    Report from Dagstuhl seminar 14491. This report documents the program and the outcomes of Dagstuhl Seminar 14491 “Socio-Technical Security Metrics”. In the domain of safety, metrics inform many decisions, from the height of new dikes to the design of nuclear plants. We can state, for example, that

  3. Leading Gainful Employment Metric Reporting

    Science.gov (United States)

    Powers, Kristina; MacPherson, Derek

    2016-01-01

    This chapter will address the importance of intercampus involvement in reporting of gainful employment student-level data that will be used in the calculation of gainful employment metrics by the U.S. Department of Education. The authors will discuss why building relationships within the institution is critical for effective gainful employment…

  4. Safety analysis report on the ''Paducah Tiger'' protective overpack for 10-ton cylinders of uranium hexafluoride

    International Nuclear Information System (INIS)

    Stitt, D.H.

    1975-01-01

    The ''Paducah Tiger'' is a protective overpack used in the shipment of 10-ton cylinders of enriched UF6. The calculations and tests that were made are described; they indicate that the overpack is in compliance with the Type B packaging requirements of ERDA Manual Chapter 0529 and Title 10, Code of Federal Regulations, Part 71. (U.S.)

  5. A method to press powder at 6000 ton using small amount of explosive

    Science.gov (United States)

    Hilmi, Ahmad Humaizi; Azmi, Nor Azmaliana; Ismail, Ariffin

    2017-12-01

    Large-die hydraulic presses are among the key instruments in making jumbo planes. Such machines can produce aircraft components such as wing spars, landing gear supports and armor plates. Superpower nations such as the USA, Russia, Germany, Japan, Korea and China have large-die hydraulic presses that can press 50,000 tons. In Malaysia, heavy-duty presses are available from companies such as Proton, which builds chassis for cars. However, those presses are not able to produce better bulkheads for the engines, fuselage, and wings of an aircraft. This paper presents the design of an apparatus that uses 50 grams of commercial-grade explosive to produce 6000 tons of compaction. This is a first step towards producing a larger-scale apparatus that can deliver a 50,000-ton press. The design was done using the AUTODYN blast simulation software. According to the results, the maximum load the apparatus can withstand was 6000 tons, which was contributed by 50 grams of commercial explosive (Emulex). Explosive charges larger than 50 grams will lead to catastrophic failure. Fabrication of the apparatus was completed. However, testing of the apparatus is not presented in this article.

  6. 10'000 ton ALICE gets her UK-built "Brain"

    CERN Multimedia

    Maddock, Julia

    2007-01-01

    For one of the four LHC experiments, called ALICE, the process got a step closer last week when a crucial part of the 10'000-ton detector, the British-built Central Trigger Processor (CTP), was installed in the ALICE cavern, some 150 feet underground. (plus background information about ALICE) (2.5 pages)

  7. Structure-activity correlations for TON, FER and MOR in the hydroisomerization of n-butane

    NARCIS (Netherlands)

    Pieterse, J.A.Z.; Seshan, Kulathuiyer; Lercher, J.A.

    2000-01-01

    n-Butane hydroconversion was studied over (Pt-loaded) molecular sieves with TON, FER, and MOR morphology. The conversion occurs via a complex interplay of mono- and bimolecular bifunctional acid mechanism and monofunctional platinum-catalyzed hydrogenolysis. Hydroisomerization occurs bimolecularly

  8. Applied molecular simulations over FER-, TON- and AEL-type zeolites

    NARCIS (Netherlands)

    Domokos, L.; Lefferts, Leonardus; Seshan, Kulathuiyer; Lercher, J.A.

    2001-01-01

    Interaction and transport of representative (un)saturated hydrocarbon molecules involved in the proposed reaction network of n-butene isomerization in zeolites FER, TON, and AEL have been studied by classic molecular modeling calculations. Docking of the guest molecules into the zeolite frameworks

  9. Optical flare observed in the flaring gamma-ray blazar Ton 599

    Science.gov (United States)

    Pursimo, Tapio; Sagues, Ana; Telting, John; Ojha, Roopesh

    2017-11-01

    We report optical photometry of the flat spectrum radio quasar Ton 599, obtained with the 2.56m Nordic Optical Telescope in La Palma, to look for any enhanced optical activity associated with a recent flare in the daily averaged gamma-ray flux (ATel#10931, ATel#10937).

  10. Confined Mobility of TonB and FepA in Escherichia coli Membranes.

    Directory of Open Access Journals (Sweden)

    Yoriko Lill

    Full Text Available The important process of nutrient uptake in Escherichia coli, in many cases, involves transit of the nutrient through a class of beta-barrel proteins in the outer membrane known as TonB-dependent transporters (TBDTs) and requires interaction with the inner membrane protein TonB. Here we have imaged the mobility of the ferric enterobactin transporter FepA and TonB by tracking them in the membranes of live E. coli with single-molecule resolution at time-scales ranging from milliseconds to seconds. We employed simple simulations to model/analyze the lateral diffusion in the membranes of E. coli, to take into account both the highly curved geometry of the cell and artifactual effects expected due to finite exposure time imaging. We find that both molecules perform confined lateral diffusion in their respective membranes in the absence of ligand, with FepA confined to a region [Formula: see text] μm in radius in the outer membrane and TonB confined to a region [Formula: see text] μm in radius in the inner membrane. The diffusion coefficient of these molecules on millisecond time-scales was estimated to be [Formula: see text] μm2/s and [Formula: see text] μm2/s for FepA and TonB, respectively, implying that each molecule is free to diffuse within its domain. Disruption of the inner membrane potential, deletion of ExbB/D from the inner membrane, presence of ligand or antibody to FepA, and disruption of the MreB cytoskeleton were all found to further restrict the mobility of both molecules. Results are analyzed in terms of changes in confinement size and interactions between the two proteins.
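
    The reported diffusion coefficients come from analyzing single-molecule trajectories; a generic way to obtain such an estimate is from the short-lag slope of the mean-squared displacement (MSD ≈ 4Dt in two dimensions). The sketch below illustrates that standard approach on a synthetic confined track; it is not the authors' simulation-based analysis, which additionally accounts for membrane curvature and finite exposure time.

        import numpy as np

        def msd(track, max_lag):
            """Time-averaged mean-squared displacement of one 2D track (shape [T, 2])."""
            return np.array([np.mean(np.sum((track[lag:] - track[:-lag]) ** 2, axis=1))
                             for lag in range(1, max_lag + 1)])

        def short_time_D(track, dt, fit_lags=4):
            """Estimate D from the initial MSD slope: MSD(t) ~ 4*D*t in 2D."""
            lags = np.arange(1, fit_lags + 1) * dt
            slope = np.polyfit(lags, msd(track, fit_lags), 1)[0]
            return slope / 4.0

        # usage on a synthetic confined random walk (lengths in um, times in s)
        rng = np.random.default_rng(1)
        dt, D_true, radius = 0.01, 0.1, 0.3
        pos = np.zeros((2000, 2))
        for i in range(1, 2000):
            step = pos[i - 1] + rng.normal(0.0, np.sqrt(2 * D_true * dt), 2)
            pos[i] = step if np.linalg.norm(step) < radius else pos[i - 1]   # crude reflecting wall
        print(short_time_D(pos, dt))   # close to D_true at short lag times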

  11. A ton is not always a ton: A road-test of landfill, manure, and afforestation/reforestation offset protocols in the U.S. carbon market

    International Nuclear Information System (INIS)

    Lee, Carrie M.; Lazarus, Michael; Smith, Gordon R.; Todd, Kimberly; Weitz, Melissa

    2013-01-01

    Highlights: • Protocols are the foundation of an offset program. • Using sample projects, we "road test" landfill, manure and afforestation protocols from 5 programs. • For a given project, we find large variation in the volume of offsets generated. • Harmonization of protocols can increase the likelihood that "a ton is a ton". • Harmonization can enhance prospects for linking emission trading systems. -- Abstract: The outcome of recent international climate negotiations suggests we are headed toward a more fragmented carbon market, with multiple emission trading and offset programs operating in parallel. To effectively harmonize and link across programs, it will be important to ensure that, across offset programs and protocols, "a ton is a ton". In this article, we consider how sample offset projects in the U.S. carbon market are treated across protocols from five programs: the Clean Development Mechanism, Climate Action Reserve, Chicago Climate Exchange, Regional Greenhouse Gas Initiative, and the U.S. EPA's former program, Climate Leaders. We find that differences among protocols for landfill methane, manure management, and afforestation/reforestation project types in accounting boundary definitions, baseline setting methods, measurement rules, emission factors, and discounts lead to differences in offsets credited that are often significant (e.g. greater than 50%). We suggest opportunities for modification and harmonization of protocols that can improve offset quality and credibility and enhance prospects for future linking of trading units and systems

  12. Fiscal 1988 draft budget for nuclear energy up 1.9% to yen 369 billion

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    At the cabinet meeting held on December 28, the government approved the fiscal 1988 draft budget, with a general account of yen 56.6 trillion. The nuclear energy related budget is yen 181.124 billion from the general account and yen 186.098 billion from the special account for power sources development, totalling yen 367.222 billion, up 1.9% on the previous year. The largest appropriation goes to the Science and Technology Agency (STA), totalling yen 271 billion. The STA is promoting safety studies and R and D for extensive nuclear energy utilization, but the budget shows a 0.7% decrease from the previous year, reflecting completion of the construction of JT-60, which is one of the Agency's major projects. MITI, with its budget of yen 91 billion, will carry on policies related to the promotion of the commercial nuclear power program as well as support for the industrialization program of the nuclear fuel cycle. The nuclear-related budget of the Ministry of Foreign Affairs is yen 2.8 billion, consisting mainly of IAEA subscriptions and contributions and OECD/NEA subscriptions. Besides these three government agencies, a large sum of yen 1.2 billion is allocated to the Okinawa Development Agency for the prevention and elimination of melon-flies in Kume Island and islands around Okinawa main island. The draft government budget will be submitted to the ordinary session of the Diet when it resumes towards the end of January. After deliberation in the Budget Committees of the House of Representatives and the House of Councilors, the draft budget will be put to the vote in the plenary session. Assuming that all proceeds smoothly, the budget is expected to be approved by the end of March without any major revision. (author)

  13. Asset Decommissioning Risk Metrics for Floating Structures in the Gulf of Mexico.

    Science.gov (United States)

    Kaiser, Mark J

    2015-08-01

    Public companies in the United States are required to report standardized values of their proved reserves and asset retirement obligations on an annual basis. When compared, these two measures provide an aggregate indicator of corporate decommissioning risk but, because of their consolidated nature, cannot readily be decomposed at a more granular level. The purpose of this article is to introduce a decommissioning risk metric defined in terms of the ratio of the expected value of an asset's reserves to its expected cost of decommissioning. Asset decommissioning risk (ADR) is more difficult to compute than a consolidated corporate risk measure, but can be used to quantify the decommissioning risk of structures and to perform regional comparisons, and also provides market signals of future decommissioning activity. We formalize two risk metrics for decommissioning and apply the ADR metric to the deepwater Gulf of Mexico (GOM) floater inventory. Deepwater oil and gas structures are expensive to construct, and at the end of their useful life, will be expensive to decommission. The value of proved reserves for the 42 floating structures in the GOM circa January 2013 is estimated to range between $37 and $80 billion for future oil prices between 60 and 120 $/bbl, which is about 10 to 20 times greater than the estimated $4.3 billion to decommission the inventory. Eni's Allegheny and MC Offshore's Jolliet tension leg platforms have ADR metrics less than one and are approaching the end of their useful life. Application of the proposed metrics in the regulatory review of supplemental bonding requirements in the U.S. Outer Continental Shelf is suggested to complement the current suite of financial metrics employed. © 2015 Society for Risk Analysis.
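
    The asset decommissioning risk (ADR) ratio described above is straightforward to compute once reserve values and decommissioning costs are estimated. A minimal sketch follows; the figures are the aggregate Gulf of Mexico numbers quoted in the abstract, used purely to illustrate the ratio, not a reconstruction of the article's per-asset analysis.

    ```python
    def adr(reserve_value_usd: float, decommissioning_cost_usd: float) -> float:
        """Asset decommissioning risk: expected reserve value / expected decommissioning cost."""
        return reserve_value_usd / decommissioning_cost_usd

    # Aggregate GOM floater figures quoted in the abstract (circa January 2013)
    decom_cost = 4.3e9                      # estimated cost to decommission the inventory
    for oil_price, reserves in [(60, 37e9), (120, 80e9)]:
        print(f"oil at {oil_price} $/bbl: aggregate ADR ~ {adr(reserves, decom_cost):.1f}")

    # An asset with ADR < 1 (reserves worth less than its decommissioning cost)
    # is flagged as approaching the end of its useful life.
    ```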

  14. Group covariance and metrical theory

    International Nuclear Information System (INIS)

    Halpern, L.

    1983-01-01

    The a priori introduction of a Lie group of transformations into a physical theory has often proved to be useful; it usually serves to describe special simplified conditions before a general theory can be worked out. Newton's assumptions of absolute space and time are examples where the Euclidean group and translation group have been introduced. These groups were extended to the Galilei group and modified in the special theory of relativity to the Poincare group to describe physics under the given conditions covariantly in the simplest way. The criticism of the a priori character leads to the formulation of the general theory of relativity. The general metric theory does not really give preference to a particular invariance group - even the principle of equivalence can be adapted to a whole family of groups. The physical laws covariantly inserted into the metric space are however adapted to the Poincare group. 8 references

  15. General relativity: An erfc metric

    Science.gov (United States)

    Plamondon, Réjean

    2018-06-01

    This paper proposes an erfc potential to incorporate in a symmetric metric. One key feature of this model is that it relies on the existence of an intrinsic physical constant σ, a star-specific proper length that scales all its surroundings. Based thereon, the new metric is used to study the space-time geometry of a static symmetric massive object, as seen from its interior. The analytical solutions to the Einstein equation are presented, highlighting the absence of singularities and discontinuities in such a model. The geodesics are derived in their second- and first-order differential formats. Recalling the slight impact of the new model on the classical general relativity tests in the solar system, a number of facts and open problems are briefly revisited on the basis of a heuristic definition of σ. Special attention is given to gravitational collapses and non-singular black holes.

  16. hdm: High-dimensional metrics

    OpenAIRE

    Chernozhukov, Victor; Hansen, Christian; Spindler, Martin

    2016-01-01

    In this article the package High-dimensional Metrics (\\texttt{hdm}) is introduced. It is a collection of statistical methods for estimation and quantification of uncertainty in high-dimensional approximately sparse models. It focuses on providing confidence intervals and significance testing for (possibly many) low-dimensional subcomponents of the high-dimensional parameter vector. Efficient estimators and uniformly valid confidence intervals for regression coefficients on target variables (e...

  17. Multi-Metric Sustainability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cowlin, Shannon [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heimiller, Donna [National Renewable Energy Lab. (NREL), Golden, CO (United States); Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Pless, Jacquelyn [National Renewable Energy Lab. (NREL), Golden, CO (United States); Munoz, David [Colorado School of Mines, Golden, CO (United States)

    2014-12-01

    A readily accessible framework that allows for evaluating impacts and comparing tradeoffs among factors in energy policy, expansion planning, and investment decision making is lacking. Recognizing this, the Joint Institute for Strategic Energy Analysis (JISEA) funded an exploration of multi-metric sustainability analysis (MMSA) to provide energy decision makers with a means to make more comprehensive comparisons of energy technologies. The resulting MMSA tool lets decision makers simultaneously compare technologies and potential deployment locations.

  18. Sensory Metrics of Neuromechanical Trust.

    Science.gov (United States)

    Softky, William; Benford, Criscillia

    2017-09-01

    Today digital sources supply a historically unprecedented component of human sensorimotor data, the consumption of which is correlated with poorly understood maladies such as Internet addiction disorder and Internet gaming disorder. Because both natural and digital sensorimotor data share common mathematical descriptions, one can quantify our informational sensorimotor needs using the signal processing metrics of entropy, noise, dimensionality, continuity, latency, and bandwidth. Such metrics describe in neutral terms the informational diet human brains require to self-calibrate, allowing individuals to maintain trusting relationships. With these metrics, we define the trust humans experience using the mathematical language of computational models, that is, as a primitive statistical algorithm processing finely grained sensorimotor data from neuromechanical interaction. This definition of neuromechanical trust implies that artificial sensorimotor inputs and interactions that attract low-level attention through frequent discontinuities and enhanced coherence will decalibrate a brain's representation of its world over the long term by violating the implicit statistical contract for which self-calibration evolved. Our hypersimplified mathematical understanding of human sensorimotor processing as multiscale, continuous-time vibratory interaction allows equally broad-brush descriptions of failure modes and solutions. For example, we model addiction in general as the result of homeostatic regulation gone awry in novel environments (sign reversal) and digital dependency as a sub-case in which the decalibration caused by digital sensorimotor data spurs yet more consumption of them. We predict that institutions can use these sensorimotor metrics to quantify media richness to improve employee well-being; that dyads and family-size groups will bond and heal best through low-latency, high-resolution multisensory interaction such as shared meals and reciprocated touch; and

  19. Metric reconstruction from Weyl scalars

    Energy Technology Data Exchange (ETDEWEB)

    Whiting, Bernard F; Price, Larry R [Department of Physics, PO Box 118440, University of Florida, Gainesville, FL 32611 (United States)

    2005-08-07

    The Kerr geometry has remained an elusive world in which to explore physics and delve into the more esoteric implications of general relativity. Following the discovery, by Kerr in 1963, of the metric for a rotating black hole, the most major advance has been an understanding of its Weyl curvature perturbations based on Teukolsky's discovery of separable wave equations some ten years later. In the current research climate, where experiments across the globe are preparing for the first detection of gravitational waves, a more complete understanding than concerns just the Weyl curvature is now called for. To understand precisely how comparatively small masses move in response to the gravitational waves they emit, a formalism has been developed based on a description of the whole spacetime metric perturbation in the neighbourhood of the emission region. Presently, such a description is not available for the Kerr geometry. While there does exist a prescription for obtaining metric perturbations once curvature perturbations are known, it has become apparent that there are gaps in that formalism which are still waiting to be filled. The most serious gaps include gauge inflexibility, the inability to include sources-which are essential when the emitting masses are considered-and the failure to describe the l = 0 and 1 perturbation properties. Among these latter properties of the perturbed spacetime, arising from a point mass in orbit, are the perturbed mass and axial component of angular momentum, as well as the very elusive Carter constant for non-axial angular momentum. A status report is given on recent work which begins to repair these deficiencies in our current incomplete description of Kerr metric perturbations.

  20. Metric reconstruction from Weyl scalars

    International Nuclear Information System (INIS)

    Whiting, Bernard F; Price, Larry R

    2005-01-01

    The Kerr geometry has remained an elusive world in which to explore physics and delve into the more esoteric implications of general relativity. Following the discovery, by Kerr in 1963, of the metric for a rotating black hole, the most major advance has been an understanding of its Weyl curvature perturbations based on Teukolsky's discovery of separable wave equations some ten years later. In the current research climate, where experiments across the globe are preparing for the first detection of gravitational waves, a more complete understanding than concerns just the Weyl curvature is now called for. To understand precisely how comparatively small masses move in response to the gravitational waves they emit, a formalism has been developed based on a description of the whole spacetime metric perturbation in the neighbourhood of the emission region. Presently, such a description is not available for the Kerr geometry. While there does exist a prescription for obtaining metric perturbations once curvature perturbations are known, it has become apparent that there are gaps in that formalism which are still waiting to be filled. The most serious gaps include gauge inflexibility, the inability to include sources-which are essential when the emitting masses are considered-and the failure to describe the l = 0 and 1 perturbation properties. Among these latter properties of the perturbed spacetime, arising from a point mass in orbit, are the perturbed mass and axial component of angular momentum, as well as the very elusive Carter constant for non-axial angular momentum. A status report is given on recent work which begins to repair these deficiencies in our current incomplete description of Kerr metric perturbations

  1. Implications of enhanced effectiveness of vincristine sulfate/ε-viniferin combination compared to vincristine sulfate only on HepG2 cells

    Directory of Open Access Journals (Sweden)

    Filiz Özdemir

    2016-12-01

    Full Text Available Objective: This study was designed to investigate the effects of ε-viniferin (ε-VNF) on the mitochondrial pathway of apoptosis and on late apoptosis in HepG2 cell lines. To observe these effects, ε-VNF and vincristine sulfate (VNC), anti-cancer drugs used for treatment on HepG2 cells, were administered either alone or in combination at different time intervals. Methods: Mitochondrial membrane potential changes in the cells (ΔΨm) were evaluated using the cationic dye JC-1, while Bax and Bcl-2 expression levels were analyzed with RT-PCR and caspase-3 activity was analyzed using a kit. For detection of apoptotic activity, an in situ TUNEL assay was performed. Results: When 98.3 µM ε-VNF, 52.5 µM VNC and the 11.25+15.8 µM VNC+ε-VNF combination were compared with the control group, ΔΨm changes at the 6th hour were found to be 19.5%, 5.5%, 24.6%, and 3.5%, respectively. These findings show that the combination group (24.6% resulted in early apoptosis of the cell at the 6th hour. Bax mRNA expression increased at the 24th hour in the VNC+ε-VNF group compared to the control group (160%, and caspase-3 activation increased in the 1.25+15.8 µM [VNC+ε-VNF] group compared to the control group at the 48th hour. The detection of DNA fragments in HepG2 cells within 24 hours suggests direct apoptosis. Conclusion: These findings demonstrate that the doses administered to the VNC+ε-VNF combination group were lower than those administered to groups using one agent alone (e.g. VNC, which reduces the possibility of administering toxic doses.

  2. Sustainability Metrics: The San Luis Basin Project

    Science.gov (United States)

    Sustainability is about promoting humanly desirable dynamic regimes of the environment. Metrics: ecological footprint, net regional product, exergy, emergy, and Fisher Information. Adaptive management: (1) metrics assess problem, (2) specific problem identified, and (3) managemen...

  3. 50 CFR Table 1b to Part 660... - 2009, Harvest Guidelines for Minor Rockfish by Depth Sub-groups (weights in metric tons)

    Science.gov (United States)

    2010-10-01

    ... population in the California Current ecosystem, a non-quantitative assessment was conducted in 2007. The... 50 percent of the 2008 ABC and OY values. The stock is expected to remain at its current equilibrium... catcher/processors, 60.0 mt for motherships, and 105.0 mt for shore-based. r Canary rockfish—A canary...

  4. 50 CFR Table 1c to Part 660... - 2009, Open Access and Limited Entry Allocations by Species or Species Group (weights in metric tons)

    Science.gov (United States)

    2010-10-01

    ... population in the California Current ecosystem, a non-quantitative assessment was conducted in 2007. The... 50 percent of the 2008 ABC and OY values. The stock is expected to remain at its current equilibrium... fishery: 85.0 mt for catcher/processors, 60.0 mt for motherships, and 105.0 mt for shore-based. r/ Canary...

  5. 50 CFR Table 2b to Part 660... - 2010, and Beyond, Harvest Guidelines for Minor Rockfish by Depth Sub-groups (weights in metric tons)

    Science.gov (United States)

    2010-10-01

    ... unexploited rockfish population in the California Current ecosystem, a non-quantitative assessment was... current equilibrium with these harvest specifications. q Widow rockfish was assessed in 2005, and an... Canary rockfish—A canary rockfish stock assessment was completed in 2007 and the stock was estimated to...

  6. In vivo evidence of TonB shuttling between the cytoplasmic and outer membrane in Escherichia coli.

    Science.gov (United States)

    Larsen, Ray A; Letain, Tracy E; Postle, Kathleen

    2003-07-01

    Gram-negative bacteria are able to convert potential energy inherent in the proton gradient of the cytoplasmic membrane into active nutrient transport across the outer membrane. The transduction of energy is mediated by TonB protein. Previous studies suggest a model in which TonB makes sequential and cyclic contact with proteins in each membrane, a process called shuttling. A key feature of shuttling is that the amino-terminal signal anchor must quit its association with the cytoplasmic membrane, and TonB becomes associated solely with the outer membrane. However, the initial studies did not exclude the possibility that TonB was artifactually pulled from the cytoplasmic membrane by the fractionation process. To resolve this ambiguity, we devised a method to test whether the extreme TonB amino-terminus, located in the cytoplasm, ever became accessible to the cys-specific, cytoplasmic membrane-impermeant molecule, Oregon Green(R) 488 maleimide (OGM) in vivo. A full-length TonB and a truncated TonB were modified to carry a sole cysteine at position 3. Both full-length TonB and truncated TonB (consisting of the amino-terminal two-thirds) achieved identical conformations in the cytoplasmic membrane, as determined by their abilities to cross-link to the cytoplasmic membrane protein ExbB and their abilities to respond conformationally to the presence or absence of proton motive force. Full-length TonB could be amino-terminally labelled in vivo, suggesting that it was periplasmically exposed. In contrast, truncated TonB, which did not associate with the outer membrane, was not specifically labelled in vivo. The truncated TonB also acted as a control for leakage of OGM across the cytoplasmic membrane. Further, the extent of labelling for full-length TonB correlated roughly with the proportion of TonB found at the outer membrane. These findings suggest that TonB does indeed disengage from the cytoplasmic membrane during energy transduction and shuttle to the outer membrane.

  7. Crowdsourcing metrics of digital collections

    Directory of Open Access Journals (Sweden)

    Tuula Pääkkönen

    2015-12-01

    Full Text Available In the National Library of Finland (NLF there are millions of digitized newspaper and journal pages, which are openly available via the public website http://digi.kansalliskirjasto.fi. To serve users better, last year the front end was completely overhauled, with a main aim of adding crowdsourcing features, e.g., by giving end-users the opportunity to create digital clippings and a personal scrapbook from the digital collections. But how can you know whether crowdsourcing has had an impact? How much have the crowdsourcing functionalities been used so far? Did crowdsourcing work? In this paper the statistics and metrics of a recent crowdsourcing effort are analysed across the different digitized material types (newspapers, journals, ephemera. The subjects, categories and keywords given by the users are analysed to see which topics are the most appealing. Some notable public uses of the crowdsourced article clippings are highlighted. These metrics give us indications on how the end-users, based on their own interests, are investigating and using the digital collections. Therefore, the suggested metrics illustrate the versatility of the information needs of the users, varying from citizen science to research purposes. By analysing the user patterns, we can respond to the new needs of the users by making minor changes to accommodate the most active participants, while still making the service more approachable for those who are trying out the functionalities for the first time. Participation in the clippings and annotations can enrich the materials in unexpected ways and can possibly pave the way for opportunities of using crowdsourcing more also in research contexts. This creates more opportunities for the goals of open science since source data becomes ­available, making it possible for researchers to reach out to the general public for help. In the long term, utilizing, for example, text mining methods can allow these different end-user segments to

  8. A family of metric gravities

    Science.gov (United States)

    Shuler, Robert

    2018-04-01

    The goal of this paper is to take a completely fresh approach to metric gravity, in which the metric principle is strictly adhered to but its properties in local space-time are derived from conservation principles, not inferred from a global field equation. The global field strength variation then gains some flexibility, but only in the regime of very strong fields (2nd-order terms) whose measurement is now being contemplated. So doing provides a family of similar gravities, differing only in strong fields, which could be developed into meaningful verification targets for strong fields after the manner in which far-field variations were used in the 20th century. General Relativity (GR) is shown to be a member of the family and this is demonstrated by deriving the Schwarzschild metric exactly from a suitable field strength assumption. The method of doing so is interesting in itself because it involves only one differential equation rather than the usual four. Exact static symmetric field solutions are also given for one pedagogical alternative based on potential, and one theoretical alternative based on inertia, and the prospects of experimentally differentiating these are analyzed. Whether the method overturns the conventional wisdom that GR is the only metric theory of gravity and that alternatives must introduce additional interactions and fields is somewhat semantical, depending on whether one views the field strength assumption as a field and whether the assumption that produces GR is considered unique in some way. It is of course possible to have other fields, and the local space-time principle can be applied to field gravities which usually are weak-field approximations having only time dilation, giving them the spatial factor and promoting them to full metric theories. Though usually pedagogical, some of them are interesting from a quantum gravity perspective. Cases are noted where mass measurement errors, or distributions of dark matter, can cause one

  9. Hybrid metric-Palatini stars

    Science.gov (United States)

    Danilǎ, Bogdan; Harko, Tiberiu; Lobo, Francisco S. N.; Mak, M. K.

    2017-02-01

    We consider the internal structure and the physical properties of specific classes of neutron, quark and Bose-Einstein condensate stars in the recently proposed hybrid metric-Palatini gravity theory, which is a combination of the metric and Palatini f (R ) formalisms. It turns out that the theory is very successful in accounting for the observed phenomenology, since it unifies local constraints at the Solar System level and the late-time cosmic acceleration, even if the scalar field is very light. In this paper, we derive the equilibrium equations for a spherically symmetric configuration (mass continuity and Tolman-Oppenheimer-Volkoff) in the framework of the scalar-tensor representation of the hybrid metric-Palatini theory, and we investigate their solutions numerically for different equations of state of neutron and quark matter, by adopting for the scalar field potential a Higgs-type form. It turns out that the scalar-tensor definition of the potential can be represented as a Clairaut differential equation, and provides an explicit form for f (R ) given by f (R )˜R +Λeff, where Λeff is an effective cosmological constant. Furthermore, stellar models, described by the stiff fluid, radiation-like, bag model and the Bose-Einstein condensate equations of state are explicitly constructed in both general relativity and hybrid metric-Palatini gravity, thus allowing an in-depth comparison between the predictions of these two gravitational theories. As a general result it turns out that for all the considered equations of state, hybrid gravity stars are more massive than their general relativistic counterparts. Furthermore, two classes of stellar models corresponding to two particular choices of the functional form of the scalar field (constant value, and logarithmic form, respectively) are also investigated. Interestingly enough, in the case of a constant scalar field the equation of state of the matter takes the form of the bag model equation of state describing
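
    For reference, the equilibrium equations referred to above take the following standard form in general relativity (mass continuity and Tolman-Oppenheimer-Volkoff); the hybrid metric-Palatini theory modifies these with scalar-field terms that are not reproduced here:

    ```latex
    \frac{dm}{dr} = 4\pi r^{2}\rho ,
    \qquad
    \frac{dP}{dr} = -\,\frac{G\bigl(\rho + P/c^{2}\bigr)\bigl(m + 4\pi r^{3}P/c^{2}\bigr)}{r^{2}\bigl(1 - 2Gm/(rc^{2})\bigr)} .
    ```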

  10. Metrics for Evaluation of Student Models

    Science.gov (United States)

    Pelanek, Radek

    2015-01-01

    Researchers use many different metrics for evaluation of performance of student models. The aim of this paper is to provide an overview of commonly used metrics, to discuss properties, advantages, and disadvantages of different metrics, to summarize current practice in educational data mining, and to provide guidance for evaluation of student…

  11. Context-dependent ATC complexity metric

    NARCIS (Netherlands)

    Mercado Velasco, G.A.; Borst, C.

    2015-01-01

    Several studies have investigated Air Traffic Control (ATC) complexity metrics in a search for a metric that could best capture workload. These studies have shown how daunting the search for a universal workload metric (one that could be applied in different contexts: sectors, traffic patterns,

  12. Properties of C-metric spaces

    Science.gov (United States)

    Croitoru, Anca; Apreutesei, Gabriela; Mastorakis, Nikos E.

    2017-09-01

    The subject of this paper belongs to the theory of approximate metrics [23]. An approximate metric on X is a real application defined on X × X that satisfies only a part of the metric axioms. In a recent paper [23], we introduced a new type of approximate metric, named C-metric, that is an application which satisfies only two metric axioms: symmetry and triangular inequality. The remarkable fact in a C-metric space is that a topological structure induced by the C-metric can be defined. The innovative idea of this paper is that we obtain some convergence properties of a C-metric space in the absence of a metric. In this paper we investigate C-metric spaces. The paper is divided into four sections. Section 1 is for Introduction. In Section 2 we recall some concepts and preliminary results. In Section 3 we present some properties of C-metric spaces, such as convergence properties, a canonical decomposition and a C-fixed point theorem. Finally, in Section 4 some conclusions are highlighted.
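
    As a compact restatement of the definition used above: a C-metric on X is a map d : X × X → R that is required to satisfy only symmetry and the triangle inequality,

    ```latex
    d(x,y) = d(y,x), \qquad d(x,z) \le d(x,y) + d(y,z) \qquad \text{for all } x, y, z \in X ,
    ```

    while the remaining standard metric axioms (non-negativity, d(x,x) = 0, and the identity of indiscernibles) are not assumed.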

  13. Diabetes and depression: A review with special focus on India

    Directory of Open Access Journals (Sweden)

    Megha Thakur

    2015-10-01

    Full Text Available Diabetes, a psychologically challenging condition for the patients and their care givers, has been found to be a significant risk factor for depression. Depression may be a critical barrier to effective diabetes management. The accompanying fatigue remarkably lowers the motivation for self-care, often leading to lowered physical and emotional well-being, poor markers of diabetes control, poor adherence to medication, and increased mortality among individuals with diabetes. A very small proportion of the diabetes patients with depression get diagnosed, and furthermore, only a handful of the ones diagnosed get treated for depression. Despite the fact that 80 percent of the people with type 2 diabetes reside in low and middle income countries, most of the evidence on diabetes and depression comes from high income countries. This review offers a summary of existing evidence and the potential gaps that need to be addressed.

  14. Design and analysis of a 1-ton prototype of the Jinping Neutrino Experiment

    International Nuclear Information System (INIS)

    Wang, Zongyi; Wang, Yuanqing; Wang, Zhe; Chen, Shaomin; Du, Xinxi; Zhang, Tianxiong; Guo, Ziyi; Yuan, Huanxin

    2017-01-01

    The Jinping Neutrino Experiment will perform in-depth research on solar neutrinos and geo-neutrinos. Two structural options (i.e., cylindrical and spherical schemes) are proposed for the Jinping detector based on other successful underground neutrino detectors. Several key factors in the design are also discussed in detail. A 1-ton prototype of the Jinping experiment is proposed based on physics requirements. Subsequently, the structural design, installation procedure, and mechanical analysis of the neutrino detector prototype are discussed. The results show that the maximum Mises stresses on the acrylic vessel, stainless steel truss, and the tank are all lower than the design values of the strengths. The stability requirement of the stainless steel truss in the detector prototype is satisfied. Consequently, the structural scheme for the 1-ton prototype is safe and reliable.

  15. Design and analysis of a 1-ton prototype of the Jinping Neutrino Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Zongyi, E-mail: wangzongyi1990@outlook.com [School of Civil Engineering, Wuhan University, Wuhan 430072 (China); Wang, Yuanqing [Key Laboratory of Civil Engineering Safety and Durability of Education Ministry, Tsinghua University, Beijing 100084 (China); Wang, Zhe; Chen, Shaomin [Department of Engineering Physics, Tsinghua University, Beijing 100084 (China); Du, Xinxi [School of Civil Engineering, Wuhan University, Wuhan 430072 (China); Zhang, Tianxiong [School of Civil Engineering, Tianjin University, Tianjin 300072 (China); Guo, Ziyi [Department of Engineering Physics, Tsinghua University, Beijing 100084 (China); Yuan, Huanxin [School of Civil Engineering, Wuhan University, Wuhan 430072 (China)

    2017-05-21

    The Jinping Neutrino Experiment will perform in-depth research on solar neutrinos and geo-neutrinos. Two structural options (i.e., cylindrical and spherical schemes) are proposed for the Jinping detector based on other successful underground neutrino detectors. Several key factors in the design are also discussed in detail. A 1-ton prototype of the Jinping experiment is proposed based on physics requirements. Subsequently, the structural design, installation procedure, and mechanical analysis of the neutrino detector prototype are discussed. The results show that the maximum Mises stresses on the acrylic vessel, stainless steel truss, and the tank are all lower than the design values of the strengths. The stability requirement of the stainless steel truss in the detector prototype is satisfied. Consequently, the structural scheme for the 1-ton prototype is safe and reliable.

  16. 1.6 billion euros for nuclear research through the 'Horizon 2020' program

    International Nuclear Information System (INIS)

    Anon.

    2014-01-01

    The European Union Council has approved the budget for the future European program for research and innovation called 'Horizon 2020'. A global funding of 77 billion euros has been allocated to 'Horizon 2020' for the 2014 to 2020 years. The share for nuclear sciences will reach 1.6 billion euros and will break down as follows: 316 million euros for fundamental research on fission, 728 million euros for fundamental research on fusion (ITER not included) and 560 million euros for the research projects of the European Joint Research Center (JRC). (A.C.)

  17. Electron capture detection of sulphur gases in carbon dioxide at the parts-per-billion level

    International Nuclear Information System (INIS)

    Pick, M.E.

    1979-01-01

    A gas chromatograph with an electron capture detector has been used to determine sulphur gases in CO₂ at the parts-per-billion level, with particular application to the analysis of coolant from CO₂ cooled nuclear reactors. For COS, CS₂, CH₃SH, H₂S and (CH₃)₂S₂ the detector has a sensitivity comparable with the more commonly used flame photometric detector, but it is much less sensitive towards (CH₃)₂S and thiophene. In addition, the paper describes a simple method for trapping sulphur gases which might enable detection of sub parts-per-billion levels of sulphur compounds. (Auth.)

  18. Behaviour of a high-performance concrete based on slag ...

    African Journals Online (AJOL)

    The use of high-performance concrete (HPC) incorporating supplementary cementitious materials such as fly ash, silica fume or hydraulic slag ... reinforcement bars, which are in turn attacked. It is possible to modify the ... rapid quenching with pressurized water; it is a sand with a 0/5 mm grading.

  19. The design of steel string crane with lifting capacity 10 tons

    International Nuclear Information System (INIS)

    Syamsurrijal Ramdja

    2007-01-01

    The steel string (sling) used for a lifting crane of the overhead travelling crane type, with a lifting capacity of 10 tons, is designed. Compared to other string types, steel string offers several advantages. In this design, the selection of the string type is the primary concern, and the factor of safety is the governing criterion, in accordance with current standards. From the resulting design, the specification and service life of the steel string are obtained. (author)
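
    As a rough illustration of the role the factor of safety plays in such a selection (not the author's actual procedure), the minimum required breaking load of the sling follows from the working load and an assumed safety factor; the factor of 6 below is only a placeholder, not a value taken from the report.

    ```python
    def required_breaking_load(working_load_t: float, safety_factor: float) -> float:
        """Minimum breaking load (tonnes) a sling must provide for a given working load."""
        return working_load_t * safety_factor

    # 10-ton crane from the record; the safety factor is an assumed placeholder value
    print(required_breaking_load(10.0, safety_factor=6.0))  # -> 60.0 tonnes
    ```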

  20. Mine design for producing 100,000 tons per day of uranium-bearing Chattanooga Shale

    International Nuclear Information System (INIS)

    Hoe, H.L.

    1979-01-01

    Chattanooga Shale, underlying some 40,000 square miles in the southeastern United States, is considered to be a potentially large, low-grade source of uranium. The area in and near Dekalb County, Tennessee, appears to be the most likely site for commercial development. This paper deals with the mine design, mining procedures, equipment requirements, and operating maintenance costs for an underground mining complex capable of producing 100,000 tons of Chattanooga Shale per day for delivery to a beneficiation process

  1. Overall view of the AA hall dominated by the 50 ton crane (Donges).

    CERN Multimedia

    1980-01-01

    A 50 ton, 32 metre span overhead travelling crane was mounted in one of the bays of Hall 193 (AA). An identical crane was mounted on the other bay. See also photo 8004261. For photos of the AA in different phases of completion (between 1979 and 1982) see: 7911303, 7911597X, 8004261, 8004608X, 8005563X, 8005565X, 8006716X, 8006722X, 8010939X, 8010941X, 8202324, 8202658X, 8203628X .

  2. Flexural behaviour of fibre-reinforced concretes under cyclic loading

    Directory of Open Access Journals (Sweden)

    Boulekbache Bensaid

    2014-04-01

    Full Text Available This paper presents the results of an experimental study on the flexural behaviour of steel-fibre-reinforced concretes. The effect of the concrete rheology on fibre orientation, and the influence of that orientation on the mechanical properties, are studied. The anchorage stiffness of the fibres, investigated through cyclic tests, is related to the rheological and mechanical characteristics of the matrix. The results show that the fluidity of the concrete is an essential parameter of fibre orientation. Once an orientation favourable to mechanical efficiency is obtained, the flexural strength is markedly improved.

  3. 46 CFR 25.25-17 - Survival craft requirements for uninspected passenger vessels of at least 100 gross tons.

    Science.gov (United States)

    2010-10-01

    Survival craft requirements for uninspected passenger vessels of at least 100 gross tons. (a) Each uninspected passenger vessel of at least 100 gross tons must have adequate survival craft with enough capacity...

  4. How much energy is locked in the USA? Alternative metrics for characterising the magnitude of overweight and obesity derived from BRFSS 2010 data.

    Science.gov (United States)

    Reidpath, Daniel D; Masood, Mohd; Allotey, Pascale

    2014-06-01

    Four metrics to characterise population overweight are described. Behavioural Risk Factors Surveillance System data were used to estimate the weight the US population needed to lose to achieve a healthy BMI, expressed in terms of mass, volume, energy, and energy value. About 144 million people in the US need to lose 2.4 million metric tonnes. The volume of fat is 2.6 billion litres - 1,038 Olympic size swimming pools. The energy in the fat would power 90,000 households for a year and is worth around 162 million dollars. Four confronting ways of talking about national overweight and obesity are described. The value of the metrics remains to be tested.
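
    The mass-to-volume and volume-to-pools arithmetic in the abstract can be roughly reproduced with common physical constants; the fat density and energy density used below are approximate assumptions, not values taken from the paper, and the household comparison is omitted because it depends on an assumed per-household energy use.

    ```python
    # Rough reproduction of the abstract's arithmetic (assumed constants, not the paper's).
    excess_mass_kg = 2.4e9            # 2.4 million metric tonnes
    fat_density_kg_per_l = 0.9        # approximate density of body fat (assumption)
    olympic_pool_l = 2.5e6            # ~2,500 m^3 per Olympic pool
    fat_energy_mj_per_kg = 37.0       # ~9 kcal/g for fat (assumption)

    volume_l = excess_mass_kg / fat_density_kg_per_l
    print(f"volume ~ {volume_l / 1e9:.1f} billion litres")             # ~2.7 billion litres
    print(f"pools  ~ {volume_l / olympic_pool_l:,.0f} Olympic pools")  # ~1,000 pools
    print(f"energy ~ {excess_mass_kg * fat_energy_mj_per_kg / 1e9:.0f} PJ")  # ~89 PJ
    ```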

  5. On characterizations of quasi-metric completeness

    Energy Technology Data Exchange (ETDEWEB)

    Dag, H.; Romaguera, S.; Tirado, P.

    2017-07-01

    Hu proved in [4] that a metric space (X, d) is complete if and only if, for any closed subspace C of (X, d), every Banach contraction on C has a fixed point. Since then several authors have investigated the problem of characterizing metric completeness by means of fixed point theorems. Recently this problem has been studied in the more general context of quasi-metric spaces for different notions of completeness. Here we present a characterization of a kind of completeness for quasi-metric spaces by means of a quasi-metric version of Hu’s theorem. (Author)
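
    For reference, the Banach contractions appearing in Hu's characterization are the self-maps T of a closed subspace C satisfying the standard contraction condition

    ```latex
    d(Tx, Ty) \le c\, d(x, y) \qquad \text{for all } x, y \in C, \text{ with a fixed constant } 0 \le c < 1 ,
    ```

    and Hu's theorem says (X, d) is complete exactly when every such map on every closed C ⊆ X has a fixed point; the article transfers this style of characterization to quasi-metric spaces.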

  6. The Metric of Colour Space

    DEFF Research Database (Denmark)

    Gravesen, Jens

    2015-01-01

    The space of colours is a fascinating space. It is a real vector space, but no matter what inner product you put on the space the resulting Euclidean distance does not correspond to human perception of difference between colours. In 1942 MacAdam performed the first experiments on colour matching and found the MacAdam ellipses which are often interpreted as defining the metric tensor at their centres. An important question is whether it is possible to define colour coordinates such that the Euclidean distance in these coordinates corresponds to human perception. Using cubic splines to represent...

  7. Product Operations Status Summary Metrics

    Science.gov (United States)

    Takagi, Atsuya; Toole, Nicholas

    2010-01-01

    The Product Operations Status Summary Metrics (POSSUM) computer program provides a readable view into the state of the Phoenix Operations Product Generation Subsystem (OPGS) data pipeline. POSSUM provides a user interface that can search the data store, collect product metadata, and display the results in an easily-readable layout. It was designed with flexibility in mind for support in future missions. Flexibility over various data store hierarchies is provided through the disk-searching facilities of Marsviewer. This is a proven program that has been in operational use since the first day of the Phoenix mission.

  8. Web metrics for library and information professionals

    CERN Document Server

    Stuart, David

    2014-01-01

    This is a practical guide to using web metrics to measure impact and demonstrate value. The web provides an opportunity to collect a host of different metrics, from those associated with social media accounts and websites to more traditional research outputs. This book is a clear guide for library and information professionals as to what web metrics are available and how to assess and use them to make informed decisions and demonstrate value. As individuals and organizations increasingly use the web in addition to traditional publishing avenues and formats, this book provides the tools to unlock web metrics and evaluate the impact of this content. The key topics covered include: bibliometrics, webometrics and web metrics; data collection tools; evaluating impact on the web; evaluating social media impact; investigating relationships between actors; exploring traditional publications in a new environment; web metrics and the web of data; the future of web metrics and the library and information professional.Th...

  9. Community access networks: how to connect the next billion to the ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Community access networks: how to connect the next billion to the Internet ... services is a prerequisite to sustainable socio-economic development. ... It will provide case studies and formulate recommendations with respect to ... An IDRC delegation will join international delegates and city representatives at the ICLEI World ...

  10. Guangdong Aluminum to Raise RMB 3 billion for New Production Base in Guizhou

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    On July 7, a loan signing ceremony was held between the Guangdong Aluminum Group, China Construction Bank, Hua Xia Bank and Guangzhou Bank Consortium. It is reported that these banks will provide Guangdong Aluminum Group with RMB 30 billion for an aluminum oxide and supporting bauxite mining project in Guizhou.

  11. Readability of the web: a study on 1 billion web pages

    NARCIS (Netherlands)

    de Heus, Marije; Hiemstra, Djoerd

    We have performed a readability study on more than 1 billion web pages. The Automated Readability Index was used to determine the average grade level required to easily comprehend a website. Some of the results are that a 16-year-old can easily understand 50% of the web and an 18-year old can easily
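
    The Automated Readability Index mentioned above maps simple text statistics to a US school grade level; a minimal sketch of the standard published formula follows (this is background knowledge, not code from the study).

    ```python
    import re

    def automated_readability_index(text: str) -> float:
        """Standard ARI: 4.71*(chars/words) + 0.5*(words/sentences) - 21.43."""
        words = re.findall(r"[A-Za-z0-9']+", text)
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        chars = sum(len(w) for w in words)
        return 4.71 * chars / len(words) + 0.5 * len(words) / len(sentences) - 21.43

    print(automated_readability_index("The cat sat on the mat. It was quite happy."))
    ```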

  12. Anhui Tongling Invests 1 Billion Yuan to Set up “Copper Industry Fund”

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    On September 12, the signing ceremony for "Anhui Copper Industry Fund" set up by Anhui Tongling Development & Investment Group Co., Ltd. and Shanghai V. Stone Investment Management Co., Ltd. was held in Tongling. The fund is 1 billion yuan.

  13. Spatial variability in oceanic redox structure 1.8 billion years ago

    DEFF Research Database (Denmark)

    Poulton, Simon W.; Fralick, Philip W.; Canfield, Donald Eugene

    2010-01-01

    to reconstruct oceanic redox conditions from the 1.88- to 1.83-billion-year-old Animikie group from the Superior region, North America. We find that surface waters were oxygenated, whereas at mid-depths, anoxic and sulphidic (euxinic) conditions extended over 100 km from the palaeoshoreline. The spatial extent...

  14. Price of next big thing in physics: $6.7 billion

    CERN Multimedia

    Overbye, Dennis

    2007-01-01

    The price of exploring inner space went up Thursday. The machine, discussed at a news conference in Beijing, will be 20 miles long and would cost about $6.7 billion and 13,000 person-years of labor to build. (1.5 pages)

  15. Universities Report $1.8-Billion in Earnings on Inventions in 2011

    Science.gov (United States)

    Blumenstyk, Goldie

    2012-01-01

    Universities and their inventors earned more than $1.8-billion from commercializing their academic research in the 2011 fiscal year, collecting royalties from new breeds of wheat, from a new drug for the treatment of HIV, and from longstanding arrangements over enduring products like Gatorade. Northwestern University earned the most of any…

  16. U of M seeking $1.1 billion in projects for Soudan Mine lab.

    CERN Multimedia

    2003-01-01

    The University of Minnesota is hoping that groundbreaking research underway at its labs at the Soudan Underground Mine near Tower will help secure up to $1.1 billion in the next 5 to 20 years to expand its work into particle physics (1 page).

  17. Systems resilience for multihazard environments: definition, metrics, and valuation for decision making.

    Science.gov (United States)

    Ayyub, Bilal M

    2014-02-01

    The United Nations Office for Disaster Risk Reduction reported that the 2011 natural disasters, including the earthquake and tsunami that struck Japan, resulted in $366 billion in direct damages and 29,782 fatalities worldwide. Storms and floods accounted for up to 70% of the 302 natural disasters worldwide in 2011, with earthquakes producing the greatest number of fatalities. Average annual losses in the United States amount to about $55 billion. Enhancing community and system resilience could lead to massive savings through risk reduction and expeditious recovery. The rational management of such reduction and recovery is facilitated by an appropriate definition of resilience and associated metrics. In this article, a resilience definition is provided that meets a set of requirements with clear relationships to the metrics of the relevant abstract notions of reliability and risk. Those metrics also meet logically consistent requirements drawn from measure theory, and provide a sound basis for the development of effective decision-making tools for multihazard environments. Improving the resiliency of a system to meet target levels requires the examination of system enhancement alternatives in economic terms, within a decision-making framework. Relevant decision analysis methods would typically require the examination of resilience based on its valuation by society at large. The article provides methods for valuation and benefit-cost analysis based on concepts from risk analysis and management. © 2013 Society for Risk Analysis.
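
    The article develops its own resilience metrics; as a hedged illustration of the general idea (not necessarily the formulation adopted there), resilience is often quantified as the time-averaged retained functionality Q(t) of a system over a planning horizon T following a disruption,

    ```latex
    \mathrm{Res} = \frac{1}{T} \int_{t_{0}}^{t_{0}+T} Q(t)\, dt ,
    ```

    so that a system that loses little functionality, or recovers it quickly, scores close to 1.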

  18. Metrics for building performance assurance

    Energy Technology Data Exchange (ETDEWEB)

    Koles, G.; Hitchcock, R.; Sherman, M.

    1996-07-01

    This report documents part of the work performed in phase I of a Laboratory Directors Research and Development (LDRD) funded project entitled Building Performance Assurances (BPA). The focus of the BPA effort is to transform the way buildings are built and operated in order to improve building performance by facilitating or providing tools, infrastructure, and information. The efforts described herein focus on the development of metrics with which to evaluate building performance and for which information and optimization tools need to be developed. The classes of building performance metrics reviewed are (1) Building Services, (2) First Costs, (3) Operating Costs, (4) Maintenance Costs, and (5) Energy and Environmental Factors. The first category defines the direct benefits associated with buildings; the next three are different kinds of costs associated with providing those benefits; the last category includes concerns that are broader than direct costs and benefits to the building owner and building occupants. The level of detail of the various issues reflects the current state of knowledge in those scientific areas and the ability to determine that state of knowledge, rather than directly reflecting the importance of these issues; the report intentionally does not specifically focus on energy issues. The report describes work in progress and is intended as a resource that can be used to indicate the areas needing more investigation. Other reports on BPA activities are also available.

  19. Valorisation and Recycling of Plastic Waste in Concrete

    Directory of Open Access Journals (Sweden)

    Benimam Samir

    2014-04-01

    Full Text Available The valorisation of waste in civil engineering is an important sector insofar as the products one wishes to obtain are not subject to overly rigorous quality criteria. Waste recycling touches on two very important impacts, namely the environmental impact and the economic impact. Thus, in several countries around the world, different wastes are used in the construction field, and especially in cement or concrete, as powder, fibres or aggregates. This work focuses on the valorisation of a waste that is harmful to the environment given its bulky and unsightly character: plastic waste. Three types of plastic waste are added to the concrete, in the form of grains and of fibres (corrugated and straight). The fresh-state properties (workability, entrained air and density) and hardened-state properties (compressive strength, tensile strength, shrinkage and water absorption) of the different concretes produced are analysed and compared with their respective control mixes. From the experimental results it can be concluded that reinforcing the cementitious matrix with corrugated plastic fibres shows a clear improvement in the tensile strength of the concrete, as well as a remarkable decrease in its water absorption capacity when plastic grains are used.

  20. Metric approach to quantum constraints

    International Nuclear Information System (INIS)

    Brody, Dorje C; Hughston, Lane P; Gustavsson, Anna C T

    2009-01-01

    A framework for deriving equations of motion for constrained quantum systems is introduced and a procedure for its implementation is outlined. In special cases, the proposed new method, which takes advantage of the fact that the space of pure states in quantum mechanics has both a symplectic structure and a metric structure, reduces to a quantum analogue of the Dirac theory of constraints in classical mechanics. Explicit examples involving spin-1/2 particles are worked out in detail: in the first example, our approach coincides with a quantum version of the Dirac formalism, while the second example illustrates how a situation that cannot be treated by Dirac's approach can nevertheless be dealt with in the present scheme.

  1. Metrics for Business Process Models

    Science.gov (United States)

    Mendling, Jan

    Up until now, there has been little research on why people introduce errors in real-world business process models. In a more general context, Simon [404] points to the limitations of cognitive capabilities and concludes that humans act rationally only to a certain extent. Concerning modeling errors, this argument would imply that human modelers lose track of the interrelations of large and complex models due to their limited cognitive capabilities and introduce errors that they would not insert in a small model. A recent study by Mendling et al. [275] explores in how far certain complexity metrics of business process models have the potential to serve as error determinants. The authors conclude that complexity indeed appears to have an impact on error probability. Before we can test such a hypothesis in a more general setting, we have to establish an understanding of how we can define determinants that drive error probability and how we can measure them.

  2. Active Metric Learning for Supervised Classification

    OpenAIRE

    Kumaran, Krishnan; Papageorgiou, Dimitri; Chang, Yutong; Li, Minhan; Takáč, Martin

    2018-01-01

    Clustering and classification critically rely on distance metrics that provide meaningful comparisons between data points. We present mixed-integer optimization approaches to find optimal distance metrics that generalize the Mahalanobis metric extensively studied in the literature. Additionally, we generalize and improve upon leading methods by removing reliance on pre-designated "target neighbors," "triplets," and "similarity pairs." Another salient feature of our method is its ability to en...
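
    The Mahalanobis metrics that this line of work generalizes have the standard form

    ```latex
    d_{M}(x, y) = \sqrt{(x - y)^{\top} M \, (x - y)}, \qquad M \succeq 0 ,
    ```

    where learning the positive semidefinite matrix M from labelled data is what adapts the distance to the classification task.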

  3. On Nakhleh's metric for reduced phylogenetic networks

    OpenAIRE

    Cardona, Gabriel; Llabrés, Mercè; Rosselló, Francesc; Valiente Feruglio, Gabriel Alejandro

    2009-01-01

    We prove that Nakhleh’s metric for reduced phylogenetic networks is also a metric on the classes of tree-child phylogenetic networks, semibinary tree-sibling time consistent phylogenetic networks, and multilabeled phylogenetic trees. We also prove that it separates distinguishable phylogenetic networks. In this way, it becomes the strongest dissimilarity measure for phylogenetic networks available so far. Furthermore, we propose a generalization of that metric that separates arbitrary phyl...

  4. Generalized tolerance sensitivity and DEA metric sensitivity

    OpenAIRE

    Neralić, Luka; E. Wendell, Richard

    2015-01-01

    This paper considers the relationship between Tolerance sensitivity analysis in optimization and metric sensitivity analysis in Data Envelopment Analysis (DEA). Herein, we extend the results on the generalized Tolerance framework proposed by Wendell and Chen and show how this framework includes DEA metric sensitivity as a special case. Further, we note how recent results in Tolerance sensitivity suggest some possible extensions of the results in DEA metric sensitivity.

  5. The definitive guide to IT service metrics

    CERN Document Server

    McWhirter, Kurt

    2012-01-01

    Used just as they are, the metrics in this book will bring many benefits to both the IT department and the business as a whole. Details of the attributes of each metric are given, enabling you to make the right choices for your business. You may prefer and are encouraged to design and create your own metrics to bring even more value to your business - this book will show you how to do this, too.

  6. Generalized tolerance sensitivity and DEA metric sensitivity

    Directory of Open Access Journals (Sweden)

    Luka Neralić

    2015-03-01

    Full Text Available This paper considers the relationship between Tolerance sensitivity analysis in optimization and metric sensitivity analysis in Data Envelopment Analysis (DEA. Herein, we extend the results on the generalized Tolerance framework proposed by Wendell and Chen and show how this framework includes DEA metric sensitivity as a special case. Further, we note how recent results in Tolerance sensitivity suggest some possible extensions of the results in DEA metric sensitivity.

  7. Common Metrics for Human-Robot Interaction

    Science.gov (United States)

    Steinfeld, Aaron; Lewis, Michael; Fong, Terrence; Scholtz, Jean; Schultz, Alan; Kaber, David; Goodrich, Michael

    2006-01-01

    This paper describes an effort to identify common metrics for task-oriented human-robot interaction (HRI). We begin by discussing the need for a toolkit of HRI metrics. We then describe the framework of our work and identify important biasing factors that must be taken into consideration. Finally, we present suggested common metrics for standardization and a case study. Preparation of a larger, more detailed toolkit is in progress.

  8. Chaotic inflation with metric and matter perturbations

    International Nuclear Information System (INIS)

    Feldman, H.A.; Brandenberger, R.H.

    1989-01-01

    A perturbative scheme to analyze the evolution of both metric and scalar field perturbations in an expanding universe is developed. The scheme is applied to study chaotic inflation with initial metric and scalar field perturbations present. It is shown that initial gravitational perturbations with wavelength smaller than the Hubble radius rapidly decay. The metric simultaneously picks up small perturbations determined by the matter inhomogeneities. Both are frozen in once the wavelength exceeds the Hubble radius. (orig.)

  9. Gravitational lensing in metric theories of gravity

    International Nuclear Information System (INIS)

    Sereno, Mauro

    2003-01-01

    Gravitational lensing in metric theories of gravity is discussed. I introduce a generalized approximate metric element, inclusive of both post-post-Newtonian contributions and a gravitomagnetic field. Following Fermat's principle and standard hypotheses, I derive the time delay function and deflection angle caused by an isolated mass distribution. Several astrophysical systems are considered. In most of the cases, the gravitomagnetic correction offers the best perspectives for an observational detection. Actual measurements distinguish only marginally different metric theories from each other
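
    As a point of reference for the kind of metric-theory comparison described above, the first-order deflection of light by a mass M at impact parameter b in the parametrized post-Newtonian framework is commonly written

    ```latex
    \hat{\alpha} = \frac{1 + \gamma}{2} \cdot \frac{4 G M}{c^{2} b} ,
    ```

    with γ = 1 in general relativity; this is background knowledge, not a formula quoted from the article, which also treats post-post-Newtonian and gravitomagnetic terms.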

  10. Interactive (statistical) visualisation and exploration of a billion objects with vaex

    Science.gov (United States)

    Breddels, M. A.

    2017-06-01

    With new catalogues arriving such as the Gaia DR1, containing more than a billion objects, new methods of handling and visualizing these data volumes are needed. We show that by calculating statistics on a regular (N-dimensional) grid, visualizations of a billion objects can be done within a second on a modern desktop computer. This is achieved using memory mapping of hdf5 files together with a simple binning algorithm, which are part of a Python library called vaex. This enables efficient exploration of large datasets interactively, making science exploration of large catalogues feasible. Vaex is a Python library and an application, which allows for interactive exploration and visualization. The motivation for developing vaex is the catalogue of the Gaia satellite; however, vaex can also be used on SPH or N-body simulations, any other (future) catalogues such as SDSS, Pan-STARRS, LSST, etc., or other tabular data. The homepage for vaex is http://vaex.astro.rug.nl.
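    The speed quoted above comes from reducing the raw rows to statistics on a regular grid before any plotting happens. A minimal sketch of that binning idea in plain NumPy (illustrative only; the actual vaex API adds memory-mapped columns, lazy expressions and much more):

        import numpy as np

        # Synthetic stand-in for two columns of a large catalogue.
        rng = np.random.default_rng(0)
        x = rng.normal(size=1_000_000)
        y = rng.normal(size=1_000_000)

        # Count objects on a regular 256x256 grid; visualization then works on
        # the small grid instead of the individual rows.
        counts, xedges, yedges = np.histogram2d(x, y, bins=256, range=[[-5, 5], [-5, 5]])
        print(counts.shape, counts.sum())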

  11. About the possibility of a generalized metric

    International Nuclear Information System (INIS)

    Lukacs, B.; Ladik, J.

    1991-10-01

    The metric (the structure of the space-time) may be dependent on the properties of the object measuring it. The case of size dependence of the metric was examined. For this dependence, the simplest possible form of the metric tensor has been constructed which fulfils the following requirements: there be two extremal characteristic scales; the metric be unique and reduce to the usual one between them; the change be sudden in the neighbourhood of these scales; the size of the human body appear as a parameter (postulated on the basis of some philosophical arguments). Estimates have been made for the two extremal length scales according to existing observations. (author) 19 refs

  12. Areva - First quarter 2009 revenue climbs 8.5% to 3.003 billion euros

    International Nuclear Information System (INIS)

    2009-04-01

    First quarter 2009 revenue was up 8.5% compared with the same period last year, to 3.003 billion euros. At constant exchange rates and consolidation scope, growth came to 3.9%. Currency translation had a positive impact of 57 million euros over the quarter. Changes in the consolidation scope had an impact of 66 million euros, primarily due to the consolidation of acquisitions made in 2008 in Transmission and Distribution and in Renewable Energies. The growth engines for first quarter revenue were the Reactors and Services division and the Transmission and Distribution division, with growth of 9.2% and 16.1% respectively. Outside France, revenue rose to 2.032 billion euros, compared with 1.857 billion euros in the first quarter of 2008, and represents 68% of total revenue. Orders were steady in the first quarter, particularly in the Front End, which posted several significant contracts with US and Asian utilities, and in Transmission and Distribution, with orders up sharply in Asia and South America. As of March 31, 2009, the group's backlog reached 49.5 billion euros, for 28.3% growth year-on-year, including 31.3% growth in Nuclear and 10.2% in Transmission and Distribution. For the year as a whole, the group confirms its outlook for backlog and revenue growth as well as rising operating income. It should be noted that revenue may vary significantly from one quarter to the next in nuclear operations. Accordingly, quarterly data cannot be viewed as a reliable indicator of annual trends.

  13. Dongfeng has fixed a sales goal of 80 billion yuan supported by Nissan

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    The Nissan and Dongfeng Group-based Dongfeng Automobile Co., Ltd, which has the largest investment in the history of the industry, opened officially for business on July 1. With a total investment of USD 2 billion and 70,000 employees, the company is the first joint venture in China which plans a full range of truck, light commercial and passenger vehicles. According to president Nakamura, the company has established a management

  14. Galaxy growth in a massive halo in the first billion years of cosmic history

    Science.gov (United States)

    Marrone, D. P.; Spilker, J. S.; Hayward, C. C.; Vieira, J. D.; Aravena, M.; Ashby, M. L. N.; Bayliss, M. B.; Béthermin, M.; Brodwin, M.; Bothwell, M. S.; Carlstrom, J. E.; Chapman, S. C.; Chen, Chian-Chou; Crawford, T. M.; Cunningham, D. J. M.; De Breuck, C.; Fassnacht, C. D.; Gonzalez, A. H.; Greve, T. R.; Hezaveh, Y. D.; Lacaille, K.; Litke, K. C.; Lower, S.; Ma, J.; Malkan, M.; Miller, T. B.; Morningstar, W. R.; Murphy, E. J.; Narayanan, D.; Phadke, K. A.; Rotermund, K. M.; Sreevani, J.; Stalder, B.; Stark, A. A.; Strandet, M. L.; Tang, M.; Weiß, A.

    2018-01-01

    According to the current understanding of cosmic structure formation, the precursors of the most massive structures in the Universe began to form shortly after the Big Bang, in regions corresponding to the largest fluctuations in the cosmic density field. Observing these structures during their period of active growth and assembly—the first few hundred million years of the Universe—is challenging because it requires surveys that are sensitive enough to detect the distant galaxies that act as signposts for these structures and wide enough to capture the rarest objects. As a result, very few such objects have been detected so far. Here we report observations of a far-infrared-luminous object at redshift 6.900 (less than 800 million years after the Big Bang) that was discovered in a wide-field survey. High-resolution imaging shows it to be a pair of extremely massive star-forming galaxies. The larger is forming stars at a rate of 2,900 solar masses per year, contains 270 billion solar masses of gas and 2.5 billion solar masses of dust, and is more massive than any other known object at a redshift of more than 6. Its rapid star formation is probably triggered by its companion galaxy at a projected separation of 8 kiloparsecs. This merging companion hosts 35 billion solar masses of stars and has a star-formation rate of 540 solar masses per year, but has an order of magnitude less gas and dust than its neighbour and physical conditions akin to those observed in lower-metallicity galaxies in the nearby Universe. These objects suggest the presence of a dark-matter halo with a mass of more than 100 billion solar masses, making it among the rarest dark-matter haloes that should exist in the Universe at this epoch.

  15. Techniques et systèmes de renfort des structures en béton

    CERN Document Server

    Miranda-Vizuete, J

    2000-01-01

    Although known as "artificial stone", concrete is a living material that changes throughout its service life. It changes because the structure it is part of itself undergoes changes, arising either from modifications or renovations, or from an alteration of its load-bearing capacity caused by an increase in loads. In most cases, these changes call for strengthening. Strengthening a concrete structure consists of improving the mechanical characteristics of its constituent elements so that it offers better strength both under service conditions and at ultimate resistance. This document presents the most widely used strengthening methods, including the addition of steel profiles, the enlargement of the structural cross-section and, more recently, the bonding of external composite materials.

  16. Drought analysis in the Tons River Basin, India during 1969-2008

    Science.gov (United States)

    Meshram, Sarita Gajbhiye; Gautam, Randhir; Kahya, Ercan

    2018-05-01

    The primary focus of this study is the analysis of droughts in the Tons River Basin during the period 1969-2008. Precipitation data observed at four gauging stations are used to identify drought over the study area. The event of drought is derived from the standardized precipitation index (SPI) on a 3-month scale. Our results indicated that severe drought occurred in the Allahabad, Rewa, and Satna stations in the years 1973 and 1979. The droughts in this region had occurred mainly due to erratic behavior in monsoons, especially due to long breaks between monsoons. During the drought years, the deficiency of the annual rainfall in the analysis of annual rainfall departure had varied from -26% in 1976 to -60% in 1973 at Allahabad station in the basin. The maximum deficiency of annual and seasonal rainfall recorded in the basin is 60%. The maximum seasonal rainfall departure observed in the basin is in the order of -60% at Allahabad station in 1973, while maximum annual rainfall departure had been recorded as -60% during 1979 at the Satna station. Extreme dry events (z score < -2) were detected during July, August, and September. Moreover, severe dry events were observed in August, September, and October. The drought conditions in the Tons River Basin are dominantly driven by total rainfall throughout the period between June and November.
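    The SPI and z-score classification used above can be illustrated with a simplified standardized index: aggregate precipitation over 3-month windows and convert the totals to z-scores. The operational SPI additionally fits a gamma distribution to the totals before transforming to a standard normal; the sketch below omits that step and uses toy data.

        import numpy as np

        def standardized_index(monthly_precip, window=3):
            """Simplified SPI-like index: rolling 3-month totals as z-scores.
            The operational SPI fits a gamma distribution instead of assuming
            the totals are normally distributed."""
            p = np.asarray(monthly_precip, dtype=float)
            totals = np.convolve(p, np.ones(window), mode="valid")
            return (totals - totals.mean()) / totals.std()

        rng = np.random.default_rng(1)
        precip = rng.gamma(shape=2.0, scale=40.0, size=480)  # 40 years of monthly totals (toy data)
        z = standardized_index(precip)
        print("extreme dry months (z < -2):", int((z < -2).sum()))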

  17. A 17-billion-solar-mass black hole in a group galaxy with a diffuse core.

    Science.gov (United States)

    Thomas, Jens; Ma, Chung-Pei; McConnell, Nicholas J; Greene, Jenny E; Blakeslee, John P; Janish, Ryan

    2016-04-21

    Quasars are associated with and powered by the accretion of material onto massive black holes; the detection of highly luminous quasars with redshifts greater than z = 6 suggests that black holes of up to ten billion solar masses already existed 13 billion years ago. Two possible present-day 'dormant' descendants of this population of 'active' black holes have been found in the galaxies NGC 3842 and NGC 4889 at the centres of the Leo and Coma galaxy clusters, which together form the central region of the Great Wall--the largest local structure of galaxies. The most luminous quasars, however, are not confined to such high-density regions of the early Universe; yet dormant black holes of this high mass have not yet been found outside of modern-day rich clusters. Here we report observations of the stellar velocity distribution in the galaxy NGC 1600--a relatively isolated elliptical galaxy near the centre of a galaxy group at a distance of 64 megaparsecs from Earth. We use orbit superposition models to determine that the black hole at the centre of NGC 1600 has a mass of 17 billion solar masses. The spatial distribution of stars near the centre of NGC 1600 is rather diffuse. We find that the region of depleted stellar density in the cores of massive elliptical galaxies extends over the same radius as the gravitational sphere of influence of the central black holes, and interpret this as the dynamical imprint of the black holes.

  18. BUILDING A BILLION SPATIO-TEMPORAL OBJECT SEARCH AND VISUALIZATION PLATFORM

    Directory of Open Access Journals (Sweden)

    D. Kakkar

    2017-10-01

    With funding from the Sloan Foundation and Harvard Dataverse, the Harvard Center for Geographic Analysis (CGA) has developed a prototype spatio-temporal visualization platform called the Billion Object Platform or BOP. The goal of the project is to lower barriers for scholars who wish to access large, streaming, spatio-temporal datasets. The BOP is now loaded with the latest billion geo-tweets, and is fed a real-time stream of about 1 million tweets per day. The geo-tweets are enriched with sentiment and census/admin boundary codes when they enter the system. The system is open source and is currently hosted on Massachusetts Open Cloud (MOC), an OpenStack environment with all components deployed in Docker orchestrated by Kontena. This paper will provide an overview of the BOP architecture, which is built on an open source stack consisting of Apache Lucene, Solr, Kafka, Zookeeper, Swagger, scikit-learn, OpenLayers, and AngularJS. The paper will further discuss the approach used for harvesting, enriching, streaming, storing, indexing, visualizing and querying a billion streaming geo-tweets.

  19. Building a Billion Spatio-Temporal Object Search and Visualization Platform

    Science.gov (United States)

    Kakkar, D.; Lewis, B.

    2017-10-01

    With funding from the Sloan Foundation and Harvard Dataverse, the Harvard Center for Geographic Analysis (CGA) has developed a prototype spatio-temporal visualization platform called the Billion Object Platform or BOP. The goal of the project is to lower barriers for scholars who wish to access large, streaming, spatio-temporal datasets. The BOP is now loaded with the latest billion geo-tweets, and is fed a real-time stream of about 1 million tweets per day. The geo-tweets are enriched with sentiment and census/admin boundary codes when they enter the system. The system is open source and is currently hosted on Massachusetts Open Cloud (MOC), an OpenStack environment with all components deployed in Docker orchestrated by Kontena. This paper will provide an overview of the BOP architecture, which is built on an open source stack consisting of Apache Lucene, Solr, Kafka, Zookeeper, Swagger, scikit-learn, OpenLayers, and AngularJS. The paper will further discuss the approach used for harvesting, enriching, streaming, storing, indexing, visualizing and querying a billion streaming geo-tweets.
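    The enrichment step described above (attaching a sentiment label and a census/admin boundary code to each geo-tweet as it streams through the pipeline) can be sketched as a single function. The field names, sentiment rule and boundary lookup below are toy placeholders, not the BOP's actual components:

        # Toy stand-ins for the sentiment classifier and boundary lookup used in
        # the real pipeline (names and fields here are hypothetical).
        def toy_sentiment(text):
            return "positive" if "good" in text.lower() else "neutral"

        def toy_admin_code(lon, lat):
            return f"{int(lon) // 10}_{int(lat) // 10}"  # crude 10-degree grid cell

        def enrich(tweet):
            """Attach sentiment and an admin-boundary code to one geo-tweet."""
            lon, lat = tweet["coordinates"]
            tweet["sentiment"] = toy_sentiment(tweet["text"])
            tweet["admin_code"] = toy_admin_code(lon, lat)
            return tweet

        print(enrich({"text": "Good morning Boston", "coordinates": (-71.06, 42.36)}))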

  20. Two ten-billion-solar-mass black holes at the centres of giant elliptical galaxies.

    Science.gov (United States)

    McConnell, Nicholas J; Ma, Chung-Pei; Gebhardt, Karl; Wright, Shelley A; Murphy, Jeremy D; Lauer, Tod R; Graham, James R; Richstone, Douglas O

    2011-12-08

    Observational work conducted over the past few decades indicates that all massive galaxies have supermassive black holes at their centres. Although the luminosities and brightness fluctuations of quasars in the early Universe suggest that some were powered by black holes with masses greater than 10 billion solar masses, the remnants of these objects have not been found in the nearby Universe. The giant elliptical galaxy Messier 87 hosts the hitherto most massive known black hole, which has a mass of 6.3 billion solar masses. Here we report that NGC 3842, the brightest galaxy in a cluster at a distance from Earth of 98 megaparsecs, has a central black hole with a mass of 9.7 billion solar masses, and that a black hole of comparable or greater mass is present in NGC 4889, the brightest galaxy in the Coma cluster (at a distance of 103 megaparsecs). These two black holes are significantly more massive than predicted by linearly extrapolating the widely used correlations between black-hole mass and the stellar velocity dispersion or bulge luminosity of the host galaxy. Although these correlations remain useful for predicting black-hole masses in less massive elliptical galaxies, our measurements suggest that different evolutionary processes influence the growth of the largest galaxies and their black holes.
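    The black-hole-mass correlations mentioned above are usually expressed as power laws in the stellar velocity dispersion σ or the bulge luminosity; for the velocity dispersion, a common parametrization (with intercept and slope fitted to a galaxy sample, and values differing between studies) is

        \log_{10}\!\left(\frac{M_{\mathrm{BH}}}{M_{\odot}}\right)
          = \alpha + \beta \,\log_{10}\!\left(\frac{\sigma}{200\ \mathrm{km\,s^{-1}}}\right).

    The NGC 3842 and NGC 4889 measurements lie well above an extrapolation of such fits, which is the sense in which they are "significantly more massive than predicted".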

  1. Combustion characteristics and NO formation for biomass blends in a 35-ton-per-hour travelling grate utility boiler.

    Science.gov (United States)

    Li, Zhengqi; Zhao, Wei; Li, Ruiyang; Wang, Zhenwang; Li, Yuan; Zhao, Guangbo

    2009-04-01

    Measurements were taken for a 35-ton-per-hour biomass-fired travelling grate boiler. Local mean concentrations of O2, CO, SO2 and NO gas species and gas temperatures were determined in the region above the grate. For a 28-ton-per-hour load, the mass ratios of biomass fly ash and boiler slag were 42% and 58%, the boiler efficiency was 81.56%, and the concentrations of NOx and SO2 at 6% O2 were 257 and 84 mg/m3. For an 18-ton-per-hour load, the fuel burning zone was nearer to the inlet than it was for the 28-ton-per-hour load, and the contents of CO and NO in the fuel burning zone above the grate were lower.

  2. Areva excellent business volume: backlog as of december 31, 2008: + 21.1% to 48.2 billion euros. 2008 revenue: + 10.4% to 13.2 billion euros

    International Nuclear Information System (INIS)

    2009-01-01

    AREVA's backlog stood at 48.2 billion euros as of December 31, 2008, for 21.1% growth year-on-year, including 21.8% growth in Nuclear and 16.5% growth in Transmission and Distribution. The Nuclear backlog came to 42.5 billion euros at December 31, 2008. The Transmission and Distribution backlog came to 5.7 billion euros at year-end. The group recognized revenue of 13.2 billion euros in 2008, for year-on-year growth of 10.4% (+9.8% like-for-like). Revenue outside France was up 10.5% to 9.5 billion euros, representing 72% of total revenue. Revenue was up 6.5% in the Nuclear businesses (up 6.3% LFL), with strong performance in the Reactors and Services division (+10.9% LFL) and the Front End division (+7.2% LFL). The Transmission and Distribution division recorded growth of 17% (+15.8% LFL). Revenue for the fourth quarter of 2008 rose to 4.1 billion euros, up 5.2% (+1.6% LFL) from that of the fourth quarter of 2007. Revenue for the Front End division rose to 3.363 billion euros in 2008, up 7.1% over 2007 (+7.2% LFL). Foreign exchange (currency translations) had a negative impact of 53 million euros. Revenue for the Reactors and Services division rose to 3.037 billion euros, up 11.8% over 2007 (+10.9% LFL). Foreign exchange (currency translations) had a negative impact of 47 million euros. Revenue for the Back End division came to 1.692 billion euros, a drop of 2.7% (-2.5% LFL). Foreign exchange (currency translations) had a negative impact of 3.5 million euros. Revenue for the Transmission and Distribution division rose to 5.065 billion euros in 2008, up 17.0% (+15.8% LFL)

  3. Enhancing Authentication Models Characteristic Metrics via ...

    African Journals Online (AJOL)

    In this work, we derive the universal characteristic metrics set for authentication models based on security, usability and design issues. We then compute the probability of the occurrence of each characteristic metric in some single-factor and multifactor authentication models in order to determine the effectiveness of these ...

  4. Gravitational Metric Tensor Exterior to Rotating Homogeneous ...

    African Journals Online (AJOL)

    The covariant and contravariant metric tensors exterior to a homogeneous spherical body rotating uniformly about a common φ axis with constant angular velocity ω are constructed. The constructed metric tensors in this gravitational field have seven non-zero distinct components. The Lagrangian for this gravitational field is ...

  5. Invariant metric for nonlinear symplectic maps

    Indian Academy of Sciences (India)

    In this paper, we construct an invariant metric in the space of homogeneous polynomials of a given degree (≥ 3). The homogeneous polynomials specify a nonlinear symplectic map which in turn represents a Hamiltonian system. By minimizing the norm constructed out of this metric as a function of system parameters, we ...

  6. Finite Metric Spaces of Strictly negative Type

    DEFF Research Database (Denmark)

    Hjorth, Poul G.

    If a finite metric space is of strictly negative type then its transfinite diameter is uniquely realized by an infinite extent ("load vector"). Finite metric spaces that have this property include all trees, and all finite subspaces of Euclidean and Hyperbolic spaces. We prove that if the distance...
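    For orientation, the property referred to above has a standard definition (background, not quoted from the paper): a finite metric space (X, d) with points x_1, ..., x_n is of negative type when

        \sum_{i,j=1}^{n} a_i a_j \, d(x_i, x_j) \le 0
        \qquad \text{whenever} \qquad \sum_{i=1}^{n} a_i = 0,

    and of strictly negative type when equality forces a_1 = ... = a_n = 0. Trees and finite subsets of Euclidean space satisfy the strict version, which is what the result above builds on.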

  7. Fixed point theory in metric type spaces

    CERN Document Server

    Agarwal, Ravi P; O’Regan, Donal; Roldán-López-de-Hierro, Antonio Francisco

    2015-01-01

    Written by a team of leading experts in the field, this volume presents a self-contained account of the theory, techniques and results in metric type spaces (in particular in G-metric spaces); that is, the text approaches this important area of fixed point analysis beginning from the basic ideas of metric space topology. The text is structured so that it leads the reader from preliminaries and historical notes on metric spaces (in particular G-metric spaces) and on mappings, to Banach type contraction theorems in metric type spaces, fixed point theory in partially ordered G-metric spaces, fixed point theory for expansive mappings in metric type spaces, generalizations, present results and techniques in a very general abstract setting and framework. Fixed point theory is one of the major research areas in nonlinear analysis. This is partly due to the fact that in many real world problems fixed point theory is the basic mathematical tool used to establish the existence of solutions to problems which arise natur...

  8. Metric solution of a spinning mass

    International Nuclear Information System (INIS)

    Sato, H.

    1982-01-01

    Studies on a particular class of asymptotically flat and stationary metric solutions, the Kerr-Tomimatsu-Sato class, are reviewed with regard to their derivation and properties. For further study, an almost complete list of the papers on the Tomimatsu-Sato metrics is given. (Auth.)

  9. On Information Metrics for Spatial Coding.

    Science.gov (United States)

    Souza, Bryan C; Pavão, Rodrigo; Belchior, Hindiael; Tort, Adriano B L

    2018-04-01

    The hippocampal formation is involved in navigation, and its neuronal activity exhibits a variety of spatial correlates (e.g., place cells, grid cells). The quantification of the information encoded by spikes has been standard procedure to identify which cells have spatial correlates. For place cells, most of the established metrics derive from Shannon's mutual information (Shannon, 1948), and convey information rate in bits/s or bits/spike (Skaggs et al., 1993, 1996). Despite their widespread use, the performance of these metrics in relation to the original mutual information metric has never been investigated. In this work, using simulated and real data, we find that the current information metrics correlate less with the accuracy of spatial decoding than the original mutual information metric. We also find that the top informative cells may differ among metrics, and show a surrogate-based normalization that yields comparable spatial information estimates. Since different information metrics may identify different neuronal populations, we discuss current and alternative definitions of spatially informative cells, which affect the metric choice. Copyright © 2018 IBRO. Published by Elsevier Ltd. All rights reserved.
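    The bits-per-spike information referred to above is typically computed from a firing-rate map and an occupancy map; a minimal sketch of that calculation, assuming both maps have already been estimated:

        import numpy as np

        def spatial_information(rate_map, occupancy):
            """Skaggs-style spatial information in bits/spike:
            sum_i p_i * (r_i / R) * log2(r_i / R), where p_i is the occupancy
            probability of bin i, r_i its firing rate and R the overall mean rate."""
            p = occupancy / occupancy.sum()
            mean_rate = np.sum(p * rate_map)
            active = rate_map > 0
            ratio = rate_map[active] / mean_rate
            return np.sum(p[active] * ratio * np.log2(ratio))

        rate_map = np.array([0.5, 2.0, 8.0, 1.0])      # Hz per spatial bin (toy values)
        occupancy = np.array([10.0, 10.0, 5.0, 25.0])  # seconds spent per bin (toy values)
        print(spatial_information(rate_map, occupancy))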

  10. Validation of Metrics for Collaborative Systems

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2008-01-01

    This paper describes the new concepts of collaborative systems metrics validation. The paper defines the quality characteristics of collaborative systems. A metric is proposed to estimate the quality level of collaborative systems. Measurements of collaborative systems quality are performed using specially designed software.

  11. Validation of Metrics for Collaborative Systems

    OpenAIRE

    Ion IVAN; Cristian CIUREA

    2008-01-01

    This paper describes the new concepts of collaborative systems metrics validation. The paper defines the quality characteristics of collaborative systems. A metric is proposed to estimate the quality level of collaborative systems. Measurements of collaborative systems quality are performed using specially designed software.

  12. Software Power Metric Model: An Implementation | Akwukwuma ...

    African Journals Online (AJOL)

    ... and the execution time (TIME) in each case was recorded. We then obtain the application's function point count. Our result shows that the proposed metric is computable, consistent in its use of units, and is programming language independent. Keywords: Software attributes, Software power, measurement, Software metric, ...

  13. Metrics for border management systems.

    Energy Technology Data Exchange (ETDEWEB)

    Duggan, Ruth Ann

    2009-07-01

    There are as many unique and disparate manifestations of border systems as there are borders to protect. Border Security is a highly complex system analysis problem with global, regional, national, sector, and border element dimensions for land, water, and air domains. The complexity increases with the multiple, and sometimes conflicting, missions for regulating the flow of people and goods across borders, while securing them for national security. These systems include frontier border surveillance, immigration management and customs functions that must operate in a variety of weather, terrain, operational conditions, cultural constraints, and geopolitical contexts. As part of a Laboratory Directed Research and Development Project 08-684 (Year 1), the team developed a reference framework to decompose this complex system into international/regional, national, and border elements levels covering customs, immigration, and border policing functions. This generalized architecture is relevant to both domestic and international borders. As part of year two of this project (09-1204), the team determined relevant relative measures to better understand border management performance. This paper describes those relative metrics and how they can be used to improve border management systems.

  14. The metrics of science and technology

    CERN Document Server

    Geisler, Eliezer

    2000-01-01

    Dr. Geisler's far-reaching, unique book provides an encyclopedic compilation of the key metrics to measure and evaluate the impact of science and technology on academia, industry, and government. Focusing on such items as economic measures, patents, peer review, and other criteria, and supported by an extensive review of the literature, Dr. Geisler gives a thorough analysis of the strengths and weaknesses inherent in metric design, and in the use of the specific metrics he cites. His book has already received prepublication attention, and will prove especially valuable for academics in technology management, engineering, and science policy; industrial R&D executives and policymakers; government science and technology policymakers; and scientists and managers in government research and technology institutions. Geisler maintains that the application of metrics to evaluate science and technology at all levels illustrates the variety of tools we currently possess. Each metric has its own unique strengths and...

  15. Smart Grid Status and Metrics Report Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Balducci, Patrick J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Antonopoulos, Chrissi A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clements, Samuel L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gorrissen, Willy J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kirkham, Harold [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ruiz, Kathleen A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, David L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Weimar, Mark R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gardner, Chris [APQC, Houston, TX (United States); Varney, Jeff [APQC, Houston, TX (United States)

    2014-07-01

    A smart grid uses digital power control and communication technology to improve the reliability, security, flexibility, and efficiency of the electric system, from large generation through the delivery systems to electricity consumers and a growing number of distributed generation and storage resources. To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. The Smart Grid Status and Metrics Report defines and examines 21 metrics that collectively provide insight into the grid’s capacity to embody these characteristics. This appendix presents papers covering each of the 21 metrics identified in Section 2.1 of the Smart Grid Status and Metrics Report. These metric papers were prepared in advance of the main body of the report and collectively form its informational backbone.

  16. Metrics for Polyphonic Sound Event Detection

    Directory of Open Access Journals (Sweden)

    Annamaria Mesaros

    2016-05-01

    This paper presents and discusses various metrics proposed for evaluation of polyphonic sound event detection systems used in realistic situations where there are typically multiple sound sources active simultaneously. The system output in this case contains overlapping events, marked as multiple sounds detected as being active at the same time. The polyphonic system output requires a suitable procedure for evaluation against a reference. Metrics from neighboring fields such as speech recognition and speaker diarization can be used, but they need to be partially redefined to deal with the overlapping events. We present a review of the most common metrics in the field and the way they are adapted and interpreted in the polyphonic case. We discuss segment-based and event-based definitions of each metric and explain the consequences of instance-based and class-based averaging using a case study. In parallel, we provide a toolbox containing implementations of presented metrics.
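    A segment-based metric of the kind discussed above can be computed directly from binary activity matrices (classes by time segments). The sketch below shows only the micro-averaged F-score core of that idea; the toolbox the authors provide covers the full set of segment-based and event-based metrics:

        import numpy as np

        def segment_based_f1(reference, estimated):
            """Micro-averaged segment-based F1 from binary (class x segment) matrices."""
            ref = np.asarray(reference, dtype=bool)
            est = np.asarray(estimated, dtype=bool)
            tp = np.logical_and(ref, est).sum()
            fp = np.logical_and(~ref, est).sum()
            fn = np.logical_and(ref, ~est).sum()
            precision = tp / (tp + fp) if tp + fp else 0.0
            recall = tp / (tp + fn) if tp + fn else 0.0
            return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

        reference = [[1, 1, 0, 0], [0, 1, 1, 0]]  # two classes, four 1-second segments
        estimated = [[1, 0, 0, 0], [0, 1, 1, 1]]
        print(segment_based_f1(reference, estimated))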

  17. Proposed Performance-Based Metrics for the Future Funding of Graduate Medical Education: Starting the Conversation.

    Science.gov (United States)

    Caverzagie, Kelly J; Lane, Susan W; Sharma, Niraj; Donnelly, John; Jaeger, Jeffrey R; Laird-Fick, Heather; Moriarty, John P; Moyer, Darilyn V; Wallach, Sara L; Wardrop, Richard M; Steinmann, Alwin F

    2017-12-12

    Graduate medical education (GME) in the United States is financed by contributions from both federal and state entities that total over $15 billion annually. Within institutions, these funds are distributed with limited transparency to achieve ill-defined outcomes. To address this, the Institute of Medicine convened a committee on the governance and financing of GME to recommend finance reform that would promote a physician training system that meets society's current and future needs. The resulting report provided several recommendations regarding the oversight and mechanisms of GME funding, including implementation of performance-based GME payments, but did not provide specific details about the content and development of metrics for these payments. To initiate a national conversation about performance-based GME funding, the authors asked: What should GME be held accountable for in exchange for public funding? In answer to this question, the authors propose 17 potential performance-based metrics for GME funding that could inform future funding decisions. Eight of the metrics are described as exemplars to add context and to help readers obtain a deeper understanding of the inherent complexities of performance-based GME funding. The authors also describe considerations and precautions for metric implementation.

  18. Development and Manufacturing Technology of Prototype Monoblock Low Pressure Rotor Shaft by 650ton Large Ingot

    Energy Technology Data Exchange (ETDEWEB)

    Song, Duk-Yong; Kim, Dong-Soo; Kim, Jungyeup; Lee, Jongwook; Ko, Seokhee [Doosan Heavy Industries and Construction, Changwon(Korea, Republic of)

    2016-10-15

    In order to establish the manufacturing technology for a monoblock LP rotor shaft, DHI has produced a prototype monoblock LP rotor shaft with a maximum diameter of φ 2,800 mm from a 650-ton ingot and investigated the mechanical properties and the internal quality of the ingot. As a result, the quality and mechanical properties required for a large nuclear power plant rotor shaft met the target. These results indicate that DHI can contribute to meeting the growing demand for high-efficiency, high-capacity nuclear power plants. Additionally, some tests such as high cycle fatigue (HCF), low cycle fatigue (LCF), fracture toughness (K1C/J1C) and dynamic crack propagation velocity (da/dN) are in progress.

  19. Modified Cooling System for Low Temperature Experiments in a 3000 Ton Multi-Anvil Press

    Science.gov (United States)

    Secco, R.; Yong, W.

    2017-12-01

    A new modified cooling system for a 3000-ton multi-anvil press has been developed to reach temperatures below room temperature at high pressures. The new system is much simpler in design, easier to make and use, and has the same cooling capability as the previous design (Secco and Yong, RSI, 2016). The key component of the new system is a steel ring surrounding the module wedges that contains liquid nitrogen (LN2), which flows freely through an entrance port to flood the interior of the pressure module. Upper and lower O-rings on the ring seal in the liquid while permitting modest compression, and a thermally insulating layer of foam is attached to the outside of the ring. The same temperature of 220 K reached with two different cooling systems suggests that thermal equilibrium is reached between the removal of heat by LN2 and the influx of heat through the massive steel components of this press.

  20. Synthesis and characterization of Al-TON zeolite using a dialkylimizadolium as structure-directing agent

    Energy Technology Data Exchange (ETDEWEB)

    Lopes, Christian Wittee; Pergher, Sibele Berenice Castella, E-mail: chriswittee@gmail.com [Universidade Federal do Rio Grande do Norte (UFRN), Natal, RN (Brazil); Villarroel-Rocha, Jhonny [Laboratorio de Solidos Porosos, Instituto de Fisica Aplicada, Universidad Nacional de San Luis, Chacabuco, San Luis (Argentina); Silva, Bernardo Araldi Da; Mignoni, Marcelo Luis [Universidade Regional Integrada, Erechim, RS (Brazil)

    2016-11-15

    In this work, the synthesis of zeolites using 1-butyl-3-methylimidazolium chloride [C4MI]Cl as a structure-directing agent was investigated. The organic cation shows effectiveness and selectivity for the syntheses of TON zeolites under different reaction conditions compared to the traditional structure-directing agent, 1,8-diaminooctane. The 1-butyl-3-methylimidazolium cation leads to highly crystalline materials, and its role as OSDA in our synthesis conditions has been confirmed by characterization techniques. ICP-OES confirms the presence of Al in the samples, and 27Al MAS NMR analysis indicated that aluminum atoms were incorporated in tetrahedral coordination. Scanning electron microscopy indicated that, by changing the crystallization condition (static or stirring), zeolites with different crystal sizes were obtained, which consequently affects the textural properties of the zeolites. Moreover, by varying some synthesis parameters, MFI zeolite can also be obtained. (author)

  1. Cracked lifting lug welds on ten-ton UF6 cylinders

    Energy Technology Data Exchange (ETDEWEB)

    Dorning, R.E. [Martin Marietta Energy Systems, Inc., Piketon, OH (United States)

    1991-12-31

    Ten-ton, Type 48X, UF6 cylinders are used at the Portsmouth Gaseous Diffusion Plant to withdraw enriched uranium hexafluoride from the cascade, transfer enriched uranium hexafluoride to customer cylinders, and feed enriched product to the cascade. To accomplish these activities, the cylinders are lifted by cranes and straddle carriers which engage the cylinder lifting lugs. In August of 1988, weld cracks on two lifting lugs were discovered during preparation to lift a cylinder. The cylinder was rejected and tagged out, and an investigating committee formed to determine the cause of cracking and recommend remedial actions. Further investigation revealed the problem may be general to this class of cylinder in this use cycle. This paper discusses the actions taken at the Portsmouth site to deal with the cracked lifting lug weld problem. The actions include inspection activities, interim corrective actions, metallurgical evaluation of cracked welds, weld repairs, and current monitoring/inspection program.

  2. Light Readout for a 1 ton Liquid Argon Dark Matter Detector

    CERN Document Server

    Boccone, Vittorio; Baudis, Laura; Otyugova, Polina; Regenfus, Christian

    2010-01-01

    Evidence for dark matter (DM) has been reported using astronomical observations in systems such as the Bullet cluster. Weakly interacting massive particles (WIMPs), in particular the lightest neutralino, are the most popular DM candidates within the Minimal Supersymmetric Standard Model (MSSM). Many groups in the world are focussing their attention on the direct detection of DM in the laboratory. The detectors should have large target masses and excellent noise rejection capabilities because of the small cross section between DM and ordinary matter (σ(WIMP-nucleon) < 4 · 10^-8 pb). Noble liquids are today considered to be one of the best options for large-size DM experiments, as they have a relatively low ionization energy, good scintillation properties and long electron lifetime. Moreover, noble liquid detectors are easily scalable to large masses. This thesis deals with the development of a large (1 ton) LAr WIMP detector (ArDM) which could measure simultaneously light and charge from the scintilla...

  3. Community Extreme Tonnage User Service (CETUS): A 5000 Ton Open Research Facility in the United States

    Science.gov (United States)

    Danielson, L. R.; Righter, K.; Vander Kaaden, K. E.; Rowland, R. L., II; Draper, D. S.; McCubbin, F. M.

    2017-12-01

    Large sample volume 5000 ton multi-anvil presses have contributed to the exploration of deep Earth and planetary interiors, synthesis of ultra-hard and other novel materials, and serve as a sample complement to pressure and temperature regimes already attainable by diamond anvil cell experiments. However, no such facility exists in the Western Hemisphere. We are establishing an open user facility for the entire research community, with the unique capability of a 5000 ton multi-anvil and deformation press, HERA (High pressure Experimental Research Apparatus), supported by a host of extant co-located experimental and analytical laboratories and research staff. We offer a wide range of complementary and/or preparatory experimental options. Any required synthesis of materials or follow-up experiments can be carried out in controlled-atmosphere furnaces, piston cylinders, a multi-anvil press, or the experimental impact apparatus. Additionally, our division houses two machine shops that would facilitate any modification or custom work necessary for development of CETUS, one for general fabrication and one located specifically within our experimental facilities. We also have a general sample preparation laboratory, specifically for experimental samples, that allows users to quickly and easily prepare samples for e-beam analyses and more. Our focus as contract staff is on serving the scientific needs of our users and collaborators. We are seeking community expert input on multiple aspects of this facility, such as experimental assembly design, module modifications, immediate projects, and future innovation initiatives. We've built a cooperative network of 12 (and growing) collaborating institutions, including COMPRES. CETUS is a coordinated effort leveraging HERA with our extant experimental, analytical, and planetary process modelling instrumentation and expertise in order to create a comprehensive model of the origin and evolution of our solar system and beyond. We are looking to engage

  4. Methods and results for stress analyses on 14-ton, thin-wall depleted UF6 cylinders

    International Nuclear Information System (INIS)

    Kirkpatrick, J.R.; Chung, C.K.; Frazier, J.L.; Kelley, D.K.

    1996-10-01

    Uranium enrichment operations at the three US gaseous diffusion plants produce depleted uranium hexafluoride (DUF6) as a residual product. At the present time, the inventory of DUF6 in this country is more than half a million tons. The inventory of DUF6 is contained in metal storage cylinders, most of which are located at the gaseous diffusion plants. The principal objective of the project is to ensure the integrity of the cylinders to prevent causing an environmental hazard by releasing the contents of the cylinders into the atmosphere. Another objective is to maintain the cylinders in such a manner that the DUF6 may eventually be converted to a less hazardous material for final disposition. An important task in the DUF6 cylinders management project is determining how much corrosion of the walls can be tolerated before the cylinders are in danger of being damaged during routine handling and shipping operations. Another task is determining how to handle cylinders that have already been damaged in a manner that will minimize the chance that a breach will occur or that the size of an existing breach will be significantly increased. A number of finite element stress analysis (FESA) calculations have been done to analyze the stresses for three conditions: (1) while the cylinder is being lifted, (2) when a cylinder is resting on two cylinders under it in the customary two-tier stacking array, and (3) when a cylinder is resting on its chocks on the ground. Various documents describe some of the results and discuss some of the methods whereby they have been obtained. The objective of the present report is to document as many of the FESA cases done at Oak Ridge for 14-ton thin-wall cylinders as possible, giving results and a description of the calculations in some detail.

  5. Reconstruction and Analysis for the DUNE 35-ton Liquid Argon Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Wallbank, Michael James [Sheffield U.

    2018-01-01

    Neutrino physics is approaching the precision era, with current and future experiments aiming to perform highly accurate measurements of the parameters which govern the phenomenon of neutrino oscillations. The ultimate ambition with these results is to search for evidence of CP-violation in the lepton sector, currently hinted at in the world-leading analyses from present experiments, which may explain the dominance of matter over antimatter in the Universe. The Deep Underground Neutrino Experiment (DUNE) is a future long-baseline experiment based at Fermi National Accelerator Laboratory (FNAL), with a far detector at the Sanford Underground Research Facility (SURF) and a baseline of 1300 km. In order to make the required precision measurements, the far detector will consist of 40 kton of liquid argon with an embedded time projection chamber. This promising technology is still in development and, since each detector module is around a factor of 15 larger than any previous experiment employing this design, prototyping the detector and design choices is critical to the success of the experiment. The 35-ton experiment was constructed for this purpose and will be described in detail in this thesis. The outcomes of the 35-ton prototype are already influencing DUNE and, following the successes and lessons learned from the experiment, confidence can be taken forward to the next stage of the DUNE programme. The main oscillation signal at DUNE will be electron neutrino appearance from the muon neutrino beam. High-precision studies of these νe interactions require advanced processing and event reconstruction techniques, particularly in the handling of showering particles such as electrons and photons. Novel methods developed for shower reconstruction in liquid argon are presented, with the aim of developing a selection for use in a νe charged-current analysis; a first-generation selection using the new techniques is presented.

  6. A field like today's? The strength of the geomagnetic field 1.1 billion years ago

    Science.gov (United States)

    Sprain, Courtney J.; Swanson-Hysell, Nicholas L.; Fairchild, Luke M.; Gaastra, Kevin

    2018-06-01

    Palaeomagnetic data from ancient rocks are one of the few types of observational data that can be brought to bear on the long-term evolution of Earth's core. A recent compilation of palaeointensity estimates from throughout Earth history has been interpreted to indicate that Earth's magnetic field strength increased in the Mesoproterozoic (between 1.5 and 1.0 billion years ago), with this increase taken to mark the onset of inner core nucleation. However, much of the data within the Precambrian palaeointensity database are from Thellier-style experiments with non-ideal behaviour that manifests in results such as double-slope Arai plots. Choices made when interpreting these data may significantly change conclusions about long-term trends in the intensity of Earth's geomagnetic field. In this study, we present new palaeointensity results from volcanics of the ~1.1-billion-year-old North American Midcontinent Rift. While most of the results exhibit non-ideal double-slope or sagging behaviour in Arai plots, some flows have more ideal single-slope behaviour leading to palaeointensity estimates that may be some of the best constraints on the strength of Earth's field for this time. Taken together, new and previously published palaeointensity data from the Midcontinent Rift yield a median field strength estimate of 56.0 ZAm^2—very similar to the median for the past 300 Myr. These field strength estimates are distinctly higher than those for the preceding billion years (Ga) after excluding ca. 1.3 Ga data that may be biased by non-ideal behaviour—consistent with an increase in field strength in the late Mesoproterozoic. However, given that ~90 per cent of palaeointensity estimates from 1.1 to 0.5 Ga come from the Midcontinent Rift, it is difficult to evaluate whether these high values relative to those estimated for the preceding billion years are the result of a stepwise, sustained increase in dipole moment. Regardless, palaeointensity estimates from the Midcontinent

  7. Robustness Metrics: Consolidating the multiple approaches to quantify Robustness

    DEFF Research Database (Denmark)

    Göhler, Simon Moritz; Eifler, Tobias; Howard, Thomas J.

    2016-01-01

    robustness metrics; 3) Functional expectancy and dispersion robustness metrics; and 4) Probability of conformance robustness metrics. The goal was to give a comprehensive overview of robustness metrics and guidance to scholars and practitioners to understand the different types of robustness metrics...

  8. Partial rectangular metric spaces and fixed point theorems.

    Science.gov (United States)

    Shukla, Satish

    2014-01-01

    The purpose of this paper is to introduce the concept of partial rectangular metric spaces as a generalization of rectangular metric and partial metric spaces. Some properties of partial rectangular metric spaces and some fixed point results for quasitype contraction in partial rectangular metric spaces are proved. Some examples are given to illustrate the observed results.
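    For orientation, the rectangular (Branciari) metric that is being generalized here replaces the triangle inequality by a quadrilateral inequality (standard background definition, not quoted from the paper):

        d(x, y) \le d(x, u) + d(u, v) + d(v, y)
        \quad \text{for all distinct } u, v \notin \{x, y\},

    while the "partial" variant additionally allows non-zero self-distance d(x, x), as in partial metric spaces.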

  9. Measuring Information Security: Guidelines to Build Metrics

    Science.gov (United States)

    von Faber, Eberhard

    Measuring information security is a genuine interest of security managers. With metrics they can develop their security organization's visibility and standing within the enterprise or public authority as a whole. Organizations using information technology need to use security metrics. Despite the clear demands and advantages, security metrics are often poorly developed or ineffective parameters are collected and analysed. This paper describes best practices for the development of security metrics. First attention is drawn to motivation, showing both requirements and benefits. The main body of this paper lists things which need to be observed (characteristics of metrics), things which can be measured (how measurements can be conducted) and steps for the development and implementation of metrics (procedures and planning). Analysis and communication are also key when using security metrics. Examples are also given in order to develop a better understanding. The author wants to resume, continue and develop the discussion about a topic which is, or increasingly will be, a critical factor of success for security managers in larger organizations.

  10. Characterising risk - aggregated metrics: radiation and noise

    International Nuclear Information System (INIS)

    Passchier, W.

    1998-01-01

    The characterisation of risk is an important phase in the risk assessment - risk management process. From the multitude of risk attributes a few have to be selected to obtain a risk characteristic or profile that is useful for risk management decisions and implementation of protective measures. One way to reduce the number of attributes is aggregation. In the field of radiation protection such an aggregated metric is firmly established: effective dose. For protection against environmental noise the Health Council of the Netherlands recently proposed a set of aggregated metrics for noise annoyance and sleep disturbance. The presentation will discuss similarities and differences between these two metrics and practical limitations. The effective dose has proven its usefulness in designing radiation protection measures, which are related to the level of risk associated with the radiation practice in question, given that implicit judgements on radiation induced health effects are accepted. However, as the metric does not take into account the nature of radiation practice, it is less useful in policy discussions on the benefits and harm of radiation practices. With respect to the noise exposure metric, only one effect is targeted (annoyance), and the differences between sources are explicitly taken into account. This should make the metric useful in policy discussions with respect to physical planning and siting problems. The metric proposed has only significance on a population level, and can not be used as a predictor for individual risk. (author)
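    The effective dose mentioned above as the firmly established aggregated metric in radiation protection is a doubly weighted sum over radiation types and tissues (the standard ICRP formulation, given here for context):

        E = \sum_{T} w_T \sum_{R} w_R \, D_{T,R},

    where D_{T,R} is the mean absorbed dose to tissue T from radiation type R, w_R is the radiation weighting factor, and w_T is the tissue weighting factor, with the w_T summing to 1. This aggregation over tissues and radiation types is exactly what makes the metric convenient for protection decisions and, as noted above, blind to the nature of the practice that produced the dose.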

  11. Energy functionals for Calabi-Yau metrics

    International Nuclear Information System (INIS)

    Headrick, M; Nassar, A

    2013-01-01

    We identify a set of "energy" functionals on the space of metrics in a given Kähler class on a Calabi-Yau manifold, which are bounded below and minimized uniquely on the Ricci-flat metric in that class. Using these functionals, we recast the problem of numerically solving the Einstein equation as an optimization problem. We apply this strategy, using the "algebraic" metrics (metrics for which the Kähler potential is given in terms of a polynomial in the projective coordinates), to the Fermat quartic and to a one-parameter family of quintics that includes the Fermat and conifold quintics. We show that this method yields approximations to the Ricci-flat metric that are exponentially accurate in the degree of the polynomial (except at the conifold point, where the convergence is polynomial), and therefore orders of magnitude more accurate than the balanced metrics, previously studied as approximations to the Ricci-flat metric. The method is relatively fast and easy to implement. On the theoretical side, we also show that the functionals can be used to give a heuristic proof of Yau's theorem.
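    As background for the optimization problem described above: on a Calabi-Yau n-fold with holomorphic n-form Ω, the Ricci-flat Kähler metric in a given class is characterized (up to normalization) by

        \omega_{\mathrm{CY}}^{\,n} \;\propto\; \Omega \wedge \bar{\Omega},

    and writing ω_CY as the reference form ω plus i∂∂̄φ turns this into a complex Monge-Ampère equation for the Kähler potential correction φ; a functional that is minimized uniquely on the Ricci-flat metric is, in effect, minimized exactly when this equation is satisfied.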

  12. Metrics Are Needed for Collaborative Software Development

    Directory of Open Access Journals (Sweden)

    Mojgan Mohtashami

    2011-10-01

    There is a need for metrics for inter-organizational collaborative software development projects, encompassing management and technical concerns. In particular, metrics are needed that are aimed at the collaborative aspect itself, such as readiness for collaboration, the quality and/or the costs and benefits of collaboration in a specific ongoing project. We suggest questions and directions for such metrics, spanning the full lifespan of a collaborative project, from considering the suitability of collaboration through evaluating ongoing projects to final evaluation of the collaboration.

  13. Indefinite metric fields and the renormalization group

    International Nuclear Information System (INIS)

    Sherry, T.N.

    1976-11-01

    The renormalization group equations are derived for the Green functions of an indefinite metric field theory. In these equations one retains the mass dependence of the coefficient functions, since in the indefinite metric theories the masses cannot be neglected. The behavior of the effective coupling constant in the asymptotic and infrared limits is analyzed. The analysis is illustrated by means of a simple model incorporating indefinite metric fields. The model scales at first order, and at this order also the effective coupling constant has both ultra-violet and infra-red fixed points, the former being the bare coupling constant
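    The mass-dependent renormalization group equations referred to above take the familiar Callan-Symanzik form, written here schematically (a generic textbook form, not copied from the paper; sign conventions vary):

        \left[\mu \frac{\partial}{\partial \mu}
              + \beta(g)\, \frac{\partial}{\partial g}
              + \gamma_m(g)\, m \frac{\partial}{\partial m}
              - n\, \gamma(g) \right] G^{(n)}(p_1, \dots, p_n; g, m, \mu) = 0.

    The point made in the abstract is that for indefinite-metric fields the m(∂/∂m) term cannot be dropped, so the coefficient functions retain their mass dependence.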

  14. Metric learning for DNA microarray data analysis

    International Nuclear Information System (INIS)

    Takeuchi, Ichiro; Nakagawa, Masao; Seto, Masao

    2009-01-01

    In many microarray studies, gene set selection is an important preliminary step for subsequent main tasks such as tumor classification, cancer subtype identification, etc. In this paper, we investigate the possibility of using metric learning as an alternative to gene set selection. We develop a simple metric learning algorithm intended for microarray data analysis. Exploiting a property of the algorithm, we introduce a novel approach for extending the metric learning to be adaptive. We apply the algorithm to previously studied microarray data on malignant lymphoma subtype identification.
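    One simple way to see metric learning as an alternative to gene selection is to learn per-gene weights for a weighted Euclidean (diagonal Mahalanobis) distance and use that distance in a nearest-neighbour classifier: uninformative genes receive small weights instead of being discarded outright. The sketch below uses a Fisher-ratio heuristic for the weights and is purely illustrative; it is not the algorithm of the paper:

        import numpy as np

        def fisher_weights(X, y):
            """Per-gene weights from a ratio of between-class to within-class
            variance (illustrative heuristic, not the paper's method)."""
            classes = np.unique(y)
            means = np.array([X[y == c].mean(axis=0) for c in classes])
            within = np.array([X[y == c].var(axis=0) for c in classes]).mean(axis=0)
            between = means.var(axis=0)
            return between / (within + 1e-12)

        def weighted_distance(a, b, w):
            return np.sqrt(np.sum(w * (a - b) ** 2))

        rng = np.random.default_rng(2)
        X = rng.normal(size=(60, 500))   # 60 samples, 500 genes (toy data)
        y = np.repeat([0, 1], 30)
        X[y == 1, :10] += 1.5            # only the first 10 genes carry class signal
        w = fisher_weights(X, y)
        # In the learned metric, an across-class pair should generally look
        # farther apart than a within-class pair.
        print(weighted_distance(X[0], X[31], w), weighted_distance(X[0], X[1], w))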

  15. Software metrics a rigorous and practical approach

    CERN Document Server

    Fenton, Norman

    2014-01-01

    A Framework for Managing, Measuring, and Predicting Attributes of Software Development Products and ProcessesReflecting the immense progress in the development and use of software metrics in the past decades, Software Metrics: A Rigorous and Practical Approach, Third Edition provides an up-to-date, accessible, and comprehensive introduction to software metrics. Like its popular predecessors, this third edition discusses important issues, explains essential concepts, and offers new approaches for tackling long-standing problems.New to the Third EditionThis edition contains new material relevant

  16. Energy tax price tag for CPI: $1.2 billion, jobs, and production

    International Nuclear Information System (INIS)

    Begley, R.

    1993-01-01

    If President Clinton's proposed energy tax had been fully in place last year, it would have cost the US chemical industry an additional $1.2 billion and 9,900 jobs, according to Chemical Manufacturers Association (CMA; Washington) estimates. It also would have driven output down 3% and prices up 5%, CMA says. Allen Lenz, CMA director/trade and economics, says the increase in production costs that would accompany the tax will not be shared by foreign competitors, cannot be neutralized with higher border taxes because of existing trade agreements, and provides another reason to move production offshore. Worse, the US chemical industry's generally impressive trade surplus declined by $2.5 billion last year, and a further drop is projected for this year. The margin of error gets thinner all the time as competition increases, Lenz says. We're not concerned only with the chemical industry, but the rest of US-based manufacturing, because they take half our output, he adds. One problem is the energy intensiveness of the chemical process industries-a CMA report says that 55% of the cost of producing ethylene glycol is energy related. And double taxation of such things as coproducts returned for credit to oil refineries could add up to $115 million/year, the report says.

  17. A parts-per-billion measurement of the antiproton magnetic moment

    CERN Document Server

    Smorra, C; Borchert, M J; Harrington, J A; Higuchi, T; Nagahama, H; Tanaka, T; Mooser, A; Schneider, G; Blaum, K; Matsuda, Y; Ospelkaus, C; Quint, W; Walz, J; Yamazaki, Y; Ulmer, S

    2017-01-01

    Precise comparisons of the fundamental properties of matter–antimatter conjugates provide sensitive tests of charge–parity–time (CPT) invariance [1], which is an important symmetry that rests on basic assumptions of the standard model of particle physics. Experiments on mesons [2], leptons [3, 4] and baryons [5, 6] have compared different properties of matter–antimatter conjugates with fractional uncertainties at the parts-per-billion level or better. One specific quantity, however, has so far only been known to a fractional uncertainty at the parts-per-million level [7, 8]: the magnetic moment of the antiproton. The extraordinary difficulty in measuring it with high precision is caused by its intrinsic smallness; for example, it is 660 times smaller than the magnetic moment of the positron [3]. Here we report a high-precision measurement of the antiproton magnetic moment in units of the nuclear magneton μN with a fractional precision of 1.5 parts per billion (68% confidence level). We use a two-particle spectroscopy method in an advanced cryogenic ...

  18. Plate tectonic influences on Earth's baseline climate: a 2 billion-year record

    Science.gov (United States)

    McKenzie, R.; Evans, D. A.; Eglington, B. M.; Planavsky, N.

    2017-12-01

    Plate tectonic processes exert strong influences on the long-term carbon cycle, and thus global climate. Here we utilize multiple aspects of the geologic record to assess the role plate tectonics has played in driving major icehouse–greenhouse transitions for the past 2 billion years. Refined paleogeographic reconstructions allow us to quantitatively assess the area of continents in various latitudinal belts throughout this interval. From these data we are able to test the hypothesis that concentrating continental masses in low latitudes will drive cooler climates due to increased silicate weathering. We further superimpose records of events that are believed to increase the 'weatherability' of the crust, such as large igneous province emplacement, island-arc accretion, and continental collisional belts. Climatic records are then compared with global detrital zircon U-Pb age data as a proxy for continental magmatism. Our results show a consistent relationship between zircon-generating magmatism and icehouse–greenhouse transitions for > 2 billion years, whereas paleogeographic records show no clear, consistent relationship between continental configurations and prominent climate transitions. Volcanic outgassing appears to exert a first-order control on major baseline climatic shifts; however, paleogeography likely plays an important role in the magnitude of this change. Notably, climatic extremes, such as the Cryogenian icehouse, occur during a combination of reduced volcanism and end-member concentrations of low-latitude continents.

  19. Effective interventions for unintentional injuries: a systematic review and mortality impact assessment among the poorest billion.

    Science.gov (United States)

    Vecino-Ortiz, Andres I; Jafri, Aisha; Hyder, Adnan A

    2018-05-01

    Between 1990 and 2015, global injury mortality declined, but in countries where the poorest billion live, injuries are becoming an increasingly prevalent cause of death. The vulnerability of this population requires immediate attention from policy makers to implement effective interventions that lessen the burden of injuries in these countries. Our aim was two-fold: first, to review all the evidence on effective interventions for the five main types of unintentional injury; and second, to estimate the potential number of lives saved by effective injury interventions among the poorest billion. For our systematic review we used references in the Disease Control Priorities third edition, and searched PubMed and the Cochrane database for papers published until Sept 10, 2016, using a comprehensive search strategy to find interventions for the five major causes of unintentional injuries: road traffic crashes, falls, drowning, burns, and poisoning. Studies were included if they presented evidence with significant effect sizes for any outcome; no inclusions or exclusions were made on the basis of where the study was carried out (ie, low-income, middle-income, or high-income country). Then we used data from the Global Burden of Disease 2015 study and a Monte Carlo simulation technique to estimate the potential annual attributable number of lives saved among the poorest billion by these evidence-based injury interventions. We estimated results for 84 countries where the poorest billion live. From the 513 papers identified, 47 were eligible for inclusion. We identified 11 interventions that had an effect on injury mortality. For road traffic deaths, the most successful interventions in preventing deaths are speed enforcement (>80 000 lives saved per year) and drink-driving enforcement (>60 000 lives saved per year). Interventions potentially most effective in preventing deaths from drowning are formal swimming lessons for children younger than 14 years (>25 000 lives
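
    A purely illustrative sketch of the Monte Carlo step described above, with made-up inputs (the study's actual death counts and effect sizes are not reproduced here): uncertainty in baseline deaths and in an intervention's relative risk reduction is propagated to a distribution of annual lives saved.

        # Illustrative only: hypothetical numbers, not the study's data or model.
        import numpy as np

        rng = np.random.default_rng(42)
        n_draws = 100_000
        baseline_deaths = rng.normal(200_000, 20_000, n_draws)   # assumed annual deaths
        relative_reduction = rng.beta(40, 60, n_draws)           # assumed effect size ~0.4
        lives_saved = baseline_deaths * relative_reduction
        lo, med, hi = np.percentile(lives_saved, [2.5, 50, 97.5])
        print(f"lives saved/yr: median {med:,.0f} (95% UI {lo:,.0f}-{hi:,.0f})")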

  20. Effective interventions for unintentional injuries: a systematic review and mortality impact assessment among the poorest billion

    Directory of Open Access Journals (Sweden)

    Andres I Vecino-Ortiz, PhD

    2018-05-01

    Full Text Available Summary: Background: Between 1990 and 2015, global injury mortality declined, but in countries where the poorest billion live, injuries are becoming an increasingly prevalent cause of death. The vulnerability of this population requires immediate attention from policy makers to implement effective interventions that lessen the burden of injuries in these countries. Our aim was two-fold: first, to review all the evidence on effective interventions for the five main types of unintentional injury; and second, to estimate the potential number of lives saved by effective injury interventions among the poorest billion. Methods: For our systematic review we used references in the Disease Control Priorities third edition, and searched PubMed and the Cochrane database for papers published until Sept 10, 2016, using a comprehensive search strategy to find interventions for the five major causes of unintentional injuries: road traffic crashes, falls, drowning, burns, and poisoning. Studies were included if they presented evidence with significant effect sizes for any outcome; no inclusions or exclusions were made on the basis of where the study was carried out (ie, low-income, middle-income, or high-income country). Then we used data from the Global Burden of Disease 2015 study and a Monte Carlo simulation technique to estimate the potential annual attributable number of lives saved among the poorest billion by these evidence-based injury interventions. We estimated results for 84 countries where the poorest billion live. Findings: From the 513 papers identified, 47 were eligible for inclusion. We identified 11 interventions that had an effect on injury mortality. For road traffic deaths, the most successful interventions in preventing deaths are speed enforcement (>80 000 lives saved per year) and drink-driving enforcement (>60 000 lives saved per year). Interventions potentially most effective in preventing deaths from drowning are formal swimming

  1. Constructing experimental devices for half-ton synthesis of gadolinium-loaded liquid scintillator and its performance

    Science.gov (United States)

    Park, Young Seo; Jang, Yeong Min; Joo, Kyung Kwang

    2018-04-01

    This paper briefly describes the various experimental devices constructed for half-ton synthesis of gadolinium(Gd)-loaded liquid scintillator (GdLS), and also presents the performance and detailed chemical and physical results of a 0.5% high-concentration GdLS. Various feasibility studies on apparatus useful for loading Gd into solvents have been carried out. The transmittance, Gd concentration, density, light yield, and moisture content were measured for quality control. We show that, with the help of adequate automated experimental devices and tools, it is possible to perform ton-scale synthesis of GdLS at a moderate laboratory scale without difficulty. The synthesized GdLS satisfied the required chemical, optical, and physical properties and various safety requirements. These synthesis devices can be scaled up for massive next-generation neutrino experiments of several hundred tons.

  2. Study of light detection and sensitivity for a ton-scale liquid xenon dark matter detector

    International Nuclear Information System (INIS)

    Wei, Y; Lin, Q; Xiao, X; Ni, K

    2013-01-01

    Ton-scale liquid xenon detectors operated in two-phase mode have recently been proposed and are being constructed to explore the favored parameter space for Weakly Interacting Massive Particle (WIMP) dark matter. To achieve a better light collection efficiency while limiting the number of electronics channels compared to the previous generation of detectors, large-size photomultiplier tubes (PMTs) such as the 3-inch-diameter R11410 from Hamamatsu are suggested to replace the 1-inch-square R8520 PMTs. In a two-phase xenon dark matter detector, two PMT arrays, on the top and bottom, are usually used. In this study, we compare the performance of two different ton-scale liquid xenon detector configurations with the same number of either R11410 (config. 1) or R8520 (config. 2) PMTs for the top array, while both use R11410 PMTs for the bottom array. The self-shielding of liquid xenon suppresses the background from the PMTs, and the dominant background in the central fiducial volume is from pp solar neutrinos. The light collection efficiency for the primary scintillation light is largely affected by the xenon purity and the reflectivity of the reflectors. In the optimistic situation, with a 10 m light absorption length and a 95% reflectivity, the light collection efficiency is 43% (34%) for config. 1 (config. 2). In the conservative situation, with a 2.5 m light absorption length and an 85% reflectivity, the value is only 18% (13%) for config. 1 (config. 2). The difference between the two configurations is due to the larger PMT coverage on the top for config. 1. The slightly different position resolutions for the two configurations have a negligible effect on the sensitivity. Based on the above considerations, we estimate the sensitivity reach of the two detector configurations. Both configurations can reach a sensitivity of 2–3 × 10⁻⁴⁷ cm² for the spin-independent WIMP-nucleon cross section for 100 GeV/c² WIMPs after two live-years of operation. The one with R8520 PMTs for the top

  3. Metrics, Media and Advertisers: Discussing Relationship

    Directory of Open Access Journals (Sweden)

    Marco Aurelio de Souza Rodrigues

    2014-11-01

    Full Text Available This study investigates how Brazilian advertisers are adapting to new media and its attention metrics. In-depth interviews were conducted with advertisers in 2009 and 2011. In 2009, new media and its metrics were celebrated as innovations that would increase advertising campaigns overall efficiency. In 2011, this perception has changed: New media’s profusion of metrics, once seen as an advantage, started to compromise its ease of use and adoption. Among its findings, this study argues that there is an opportunity for media groups willing to shift from a product-focused strategy towards a customer-centric one, through the creation of new, simple and integrative metrics

  4. Networks and centroid metrics for understanding football

    African Journals Online (AJOL)

    Gonçalo Dias

    … games. However, it seems that the centroid metric, supported only by the position of players in the field … the strategy adopted by the coach (Gama et al., 2014) … centroid distance as measures of team's tactical performance in youth football.

  5. Clean Cities Annual Metrics Report 2009 (Revised)

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, C.

    2011-08-01

    Document provides Clean Cities coalition metrics about the use of alternative fuels; the deployment of alternative fuel vehicles, hybrid electric vehicles (HEVs), and idle reduction initiatives; fuel economy activities; and programs to reduce vehicle miles driven.

  6. Metric Guidelines Inservice and/or Preservice

    Science.gov (United States)

    Granito, Dolores

    1978-01-01

    Guidelines are given for designing teacher training for going metric. The guidelines were developed from existing guidelines, journal articles, a survey of colleges, and the detailed reactions of a panel. (MN)

  7. Science and Technology Metrics and Other Thoughts

    National Research Council Canada - National Science Library

    Harman, Wayne; Staton, Robin

    2006-01-01

    This report explores the subject of science and technology metrics and other topics to begin to provide Navy managers, as well as scientists and engineers, additional tools and concepts with which to...

  8. Using Activity Metrics for DEVS Simulation Profiling

    Directory of Open Access Journals (Sweden)

    Muzy A.

    2014-01-01

    Full Text Available Activity metrics can be used to profile DEVS models before and during the simulation. It is critical to get good activity metrics of models before and during their simulation. Having a means to compute the a-priori activity of components (analytic activity) may be worthwhile when simulating a model (or parts of it) for the first time. Afterwards, during the simulation, the analytic activity can be corrected using the dynamic one. In this paper, we introduce the McCabe cyclomatic complexity metric (MCA) to compute analytic activity. Both static and simulation activity metrics have been implemented through a plug-in of the DEVSimPy (DEVS Simulator in Python language) environment and applied to DEVS models.
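
    A minimal sketch (assumptions ours, not the plug-in's implementation) of the static score the abstract refers to: McCabe's cyclomatic complexity V(G) = E - N + 2P, computed from a control-flow graph given as an edge list.

        # McCabe cyclomatic complexity from a control-flow graph edge list.
        def cyclomatic_complexity(edges, n_components=1):
            nodes = {n for e in edges for n in e}
            return len(edges) - len(nodes) + 2 * n_components

        # Tiny CFG: entry -> branch -> (then | else) -> exit
        cfg = [("entry", "branch"), ("branch", "then"), ("branch", "else"),
               ("then", "exit"), ("else", "exit")]
        print(cyclomatic_complexity(cfg))   # 5 - 5 + 2 = 2 (one decision point)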

  9. Evaluating and Estimating the WCET Criticality Metric

    DEFF Research Database (Denmark)

    Jordan, Alexander

    2014-01-01

    … a programmer (or compiler) from targeting optimizations the right way. A possible resort is to use a metric that targets WCET and which can be efficiently computed for all code parts of a program. Similar to dynamic profiling techniques, which execute code with input that is typically expected … for the application, based on WCET analysis we can indicate how critical a code fragment is, in relation to the worst-case bound. Computing such a metric on top of static analysis incurs a certain overhead though, which increases with the complexity of the underlying WCET analysis. We present our approach … to estimate the Criticality metric, by relaxing the precision of WCET analysis. Through this, we can reduce analysis time by orders of magnitude, while only introducing minor error. To evaluate our estimation approach and share our garnered experience using the metric, we evaluate real-time programs, which …

  10. 16 CFR 1511.8 - Metric references.

    Science.gov (United States)

    2010-01-01

    16 CFR 1511.8, Metric references (Commercial Practices; Consumer Product Safety Commission, Federal Hazardous Substances Act Regulations): … given in parentheses for convenience and information only. …

  11. Flight Crew State Monitoring Metrics, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — eSky will develop specific crew state metrics based on the timeliness, tempo and accuracy of pilot inputs required by the H-mode Flight Control System (HFCS)....

  12. Supplier selection using different metric functions

    Directory of Open Access Journals (Sweden)

    Omosigho S.E.

    2015-01-01

    Full Text Available Supplier selection is an important component of supply chain management in today's global competitive environment. Hence, the evaluation and selection of suppliers have received considerable attention in the literature. Many attributes of suppliers, other than cost, are considered in the evaluation and selection process. Therefore, the process of evaluation and selection of suppliers is a multi-criteria decision making process. The methodology adopted to solve the supplier selection problem is intuitionistic fuzzy TOPSIS (Technique for Order Preference by Similarity to the Ideal Solution). Generally, TOPSIS is based on the concept of minimum distance from the positive ideal solution and maximum distance from the negative ideal solution. We examine the deficiencies of using only one metric function in TOPSIS and propose the use of the spherical metric function in addition to the commonly used metric functions. For empirical supplier selection problems, more than one metric function should be used.
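
    A hedged sketch of the classical crisp TOPSIS closeness computation, showing how swapping the metric function changes the scores; this is not the paper's intuitionistic-fuzzy variant, and its proposed spherical metric is not reproduced here. All supplier ratings and weights below are illustrative, and all criteria are assumed to be benefit criteria.

        # Crisp TOPSIS with a pluggable distance metric (illustrative data).
        import numpy as np

        def topsis_scores(matrix, weights, metric):
            M = matrix / np.linalg.norm(matrix, axis=0)   # vector-normalize each criterion
            V = M * weights                               # weighted normalized matrix
            ideal, anti = V.max(axis=0), V.min(axis=0)    # benefit criteria assumed
            d_pos = np.array([metric(row, ideal) for row in V])
            d_neg = np.array([metric(row, anti) for row in V])
            return d_neg / (d_pos + d_neg)                # closeness coefficient

        euclid = lambda a, b: np.sqrt(((a - b) ** 2).sum())
        chebyshev = lambda a, b: np.abs(a - b).max()

        suppliers = np.array([[7, 9, 9], [8, 7, 8], [9, 6, 8], [6, 7, 8]], float)
        w = np.array([0.4, 0.3, 0.3])
        print(topsis_scores(suppliers, w, euclid))
        print(topsis_scores(suppliers, w, chebyshev))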

  13. Nouvelles Techniques d'Intervention sur la Corrosion des Armatures du Béton Armé

    CERN Document Server

    Colloca, C

    1999-01-01

    The main types of damage observed in the passive reinforcement of reinforced concrete are generalized corrosion and localized corrosion. These degradations are caused either by carbonation of the concrete or by contact with pure water or chloride-laden water penetrating the pores and surface cracks. This document presents new intervention techniques, based on long-established principles, introduced for the electrochemical treatment of the altered zones associated with these different conditions. Realkalization (in the case of carbonated concrete) raises the pH of the concrete and restores a level of alkalinity that guarantees passivation of the reinforcement. Desalination (in the case of chloride-contaminated concrete) removes chloride ions through the surface of the concrete. The advantages of these treatments over earlier techniques are appreciable when their execution time and lower cost are considered.

  14. Classroom reconstruction of the Schwarzschild metric

    OpenAIRE

    Kassner, Klaus

    2015-01-01

    A promising way to introduce general relativity in the classroom is to study the physical implications of certain given metrics, such as the Schwarzschild one. This involves less mathematical expenditure than an approach focusing on differential geometry in its full glory and permits emphasis of physical aspects before attacking the field equations. Even so, in terms of motivation, the lack of justification for the metric employed may pose an obstacle. The paper discusses how to establish the we...
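
    For reference (standard textbook form, not quoted from the abstract), the Schwarzschild line element that such classroom treatments study is, in Schwarzschild coordinates,

        \[
          ds^{2} = -\left(1-\frac{2GM}{c^{2}r}\right)c^{2}\,dt^{2}
                   + \left(1-\frac{2GM}{c^{2}r}\right)^{-1}dr^{2}
                   + r^{2}\left(d\theta^{2}+\sin^{2}\theta\,d\varphi^{2}\right).
        \]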

  15. Marketing communication metrics for social media

    OpenAIRE

    Töllinen, Aarne; Karjaluoto, Heikki

    2011-01-01

    The objective of this paper is to develop a conceptual framework for measuring the effectiveness of social media marketing communications. Specifically, we study whether the existing marketing communications performance metrics are still valid in the changing digitalised communications landscape, or whether it is time to rethink them, or even to devise entirely new metrics. Recent advances in information technology and marketing bring a need to re-examine measurement models. We combine two im...

  16. Some observations on a fuzzy metric space

    Energy Technology Data Exchange (ETDEWEB)

    Gregori, V.

    2017-07-01

    Let $(X,d)$ be a metric space. In this paper we provide some observations about the fuzzy metric space in the sense of Kramosil and Michalek $(Y,N,\wedge)$, where $Y$ is the set of non-negative real numbers $[0,\infty[$ and $N(x,y,t)=1$ if $d(x,y)\leq t$ and $N(x,y,t)=0$ if $d(x,y)\geq t$. (Author)

  17. Area Regge calculus and discontinuous metrics

    International Nuclear Information System (INIS)

    Wainwright, Chris; Williams, Ruth M

    2004-01-01

    Taking the triangle areas as independent variables in the theory of Regge calculus can lead to ambiguities in the edge lengths, which can be interpreted as discontinuities in the metric. We construct solutions to area Regge calculus using a triangulated lattice and find that on a spacelike or timelike hypersurface no such discontinuity can arise. On a null hypersurface however, we can have such a situation and the resulting metric can be interpreted as a so-called refractive wave

  18. Mars’ First Billion Years: Key Findings, Key Unsolved Paradoxes, and Future Exploration

    Science.gov (United States)

    Ehlmann, Bethany

    2017-10-01

    In the evolution of terrestrial planets, the first billion years are the period most shrouded in mystery: How vigorous is early atmospheric loss? How do planetary climates respond to a brightening sun? When and how are plate tectonic recycling processes initiated? How do voluminous volcanism and heavy impact bombardment influence the composition of the atmosphere? Under what conditions might life arise? Looking outward to terrestrial planets around other stars, the record from Venus, Earth and Mars in this solar system is crucial for developing models of physical and chemical processes. Of these three worlds, Mars provides the longest record of planetary evolution from the first billion years, comprising >50% of exposed geologic units, which are only lightly overprinted by later processes. Orbital observations of the last decade have revealed abundant evidence for surface waters in the form of lakes, valley networks, and evidence of chemically open-system near-surface weathering. Groundwaters at temperatures ranging from just above freezing to hydrothermal have also left a rich record of process in the mineralogical record. A suite of environments, similar in diversity to Earth's, has been discovered on Mars, with water pH, temperature, redox, and chemistries varying in space and time. Here, I will focus on the consequences of the aqueous alteration of the Martian crust on the composition of the atmosphere, based on recent work studying aspects of the volatile budget (Usui et al., 2015; Edwards & Ehlmann, 2015; Hu et al., 2015; Jakosky et al., 2017; Wordsworth et al., 2017; and Ehlmann, in prep.). The solid crust and mantle of Mars act as volatile reservoirs and volatile sources through volcanism, mineral precipitation, and release of gases. We examine the extent to which the budget is understood or ill-understood for hydrogen and carbon, and the associated phases H2O, CO2, and CH4. Additionally, I identify some key stratigraphies where a combination of focused in

  19. Relaxed metrics and indistinguishability operators: the relationship

    Energy Technology Data Exchange (ETDEWEB)

    Martin, J.

    2017-07-01

    In 1982, the notion of indistinguishability operator was introduced by E. Trillas in order to fuzzify the crisp notion of equivalence relation [Trillas]. In the study of such a class of operators, an outstanding property must be pointed out. Concretely, there exists a duality relationship between indistinguishability operators and metrics. The aforesaid relationship was deeply studied by several authors who introduced a few techniques to generate metrics from indistinguishability operators and vice versa (see, for instance, [BaetsMesiar, BaetsMesiar2]). In the last years a new generalization of the metric notion has been introduced in the literature with the purpose of developing mathematical tools for quantitative models in Computer Science and Artificial Intelligence [BKMatthews, Ma]. The aforementioned generalized metrics are known as relaxed metrics. The main target of this talk is to present a study of the duality relationship between indistinguishability operators and relaxed metrics in such a way that the aforementioned classical techniques to generate both concepts, one from the other, can be extended to the new framework. (Author)

  20. Ici, la priorité est aux piétons / Pedestrians have the right of way at CERN

    CERN Multimedia

    2002-01-01

    At CERN, we are all pedestrians, very often drivers, and occasionally cyclists. But our means of locomotion does not matter so long as we remain vigilant and remember that a pedestrian is a road user in his own right, but a more vulnerable one.

  1. Consideration on the 1 ton bucket elevator installed under water of pool in IMEF

    Energy Technology Data Exchange (ETDEWEB)

    Song, Ung Sup; Lee, Jong Heon; Lee, Hong Gi; Choo, Yong Sun; Jung, Yang Hong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2004-07-01

    The bucket elevator, which can transfer examination capsules or fuel from the pool to the hot cell, is installed under the water of the pool (3x6x10 m) in IMEF. The allowable load is 1 ton and the dimensions of the bucket are 25x25x150 cm. The bucket travels up and down along an incline of about 63 degrees, driven by a chain system. A specially made chain catches a lug on the bucket, which slides on rollers between the right and left guide rails fixed inside a square tube, and moves it up and down; the chain is driven by a sprocket wheel installed below the hot cell working table. The sprocket wheel is driven through a two-stage drive shaft by a reduction-geared motor installed just outside the M1 hot cell. Operation is started by pushing a button on the operating panel located at the front of the M1 hot cell.

  2. Study of hot cracking potential in a 6-ton steel ingot casting

    Science.gov (United States)

    Yang, Jing'an; Liu, Baicheng; Shen, Houfa

    2018-04-01

    A new hot cracking potential (HCP) criterion for the appearance of hot tearing in steel ingot castings is proposed. The maximum value of the first principal stress, divided by the dynamic yield strength in the brittle temperature range (BTR), was used to identify the HCP. Experiments were carried out on a 6-ton P91 steel ingot in which severe hot tearing was detected in the upper centerline. Another ingot, with a better heat-preservation riser and without hot tearing, was used for comparison. Samples were obtained from the area of the ingot body with hot tearing, and their morphologies were inspected by X-ray high-energy industrial computed tomography. The carbon and sulfur distributions around the hot tearing were characterized by an infrared-spectrometry carbon and sulfur analyzer. High-temperature mechanical properties were obtained with a Gleeble thermal simulation machine under different strain rates. Then, thermo-mechanical simulations using an elasto-viscoplastic finite-element model were conducted to analyze the stress and strain evolution during ingot solidification. The results showed that the hot tearing area, which was rich in both carbon and sulfur, was under excessive tensile stress in the BTR, bearing the highest HCP.
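
    In symbols (notation ours, restating the sentence above), the criterion can be written as

        \[
          \mathrm{HCP} \;=\; \max_{T\,\in\,\mathrm{BTR}} \frac{\sigma_{1}(T)}{\sigma_{y}^{\mathrm{dyn}}(T)},
        \]

    where σ₁ is the first principal stress and σ_y^dyn the dynamic yield strength; larger values indicate a higher risk of hot tearing.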

  3. 1000–ton testing machine for cyclic fatigue tests of materials at liquid nitrogen temperatures

    Energy Technology Data Exchange (ETDEWEB)

    Khitruk, A. A.; Klimchenko, Yu. A.; Kovalchuk, O. A.; Marushin, E. L.; Mednikov, A. A.; Nasluzov, S. N.; Privalova, E. K.; Rodin, I. Yu.; Stepanov, D. B.; Sukhanova, M. V. [The D.V. Efremov Scientific Research Institute of Electrophysical Apparatus (NIIEFA), 3 Doroga na Metallostroy, Metallostroy, Saint Petersburg 196641 (Russian Federation)

    2014-01-29

    One of the main tasks in superconducting magnet R&D is to determine the mechanical and fatigue properties of structural materials and critical design elements in the cryogenic temperature range. This paper describes a new facility built around the industrial 1000-ton (10 MN) testing machine Schenk PC10.0S. Special equipment was developed to provide mechanical and cyclic tensile fatigue tests of large-scale samples at liquid nitrogen temperature and over a given load range. The main feature of the developed testing machine is the cryostat, in which the device converting the standard compressive force of the testing machine into the tensile force applied to the test object is placed. The control system provides remote control of the test and the acquisition, processing and presentation of test data. As an example of the testing machine operation, the test program and test results of the cyclic tensile fatigue tests of a full-scale helium inlet sample of the ITER PF1 coil are presented.

  4. 1000–ton testing machine for cyclic fatigue tests of materials at liquid nitrogen temperatures

    International Nuclear Information System (INIS)

    Khitruk, A. A.; Klimchenko, Yu. A.; Kovalchuk, O. A.; Marushin, E. L.; Mednikov, A. A.; Nasluzov, S. N.; Privalova, E. K.; Rodin, I. Yu.; Stepanov, D. B.; Sukhanova, M. V.

    2014-01-01

    One of the main tasks in superconducting magnet R&D is to determine the mechanical and fatigue properties of structural materials and critical design elements in the cryogenic temperature range. This paper describes a new facility built around the industrial 1000-ton (10 MN) testing machine Schenk PC10.0S. Special equipment was developed to provide mechanical and cyclic tensile fatigue tests of large-scale samples at liquid nitrogen temperature and over a given load range. The main feature of the developed testing machine is the cryostat, in which the device converting the standard compressive force of the testing machine into the tensile force applied to the test object is placed. The control system provides remote control of the test and the acquisition, processing and presentation of test data. As an example of the testing machine operation, the test program and test results of the cyclic tensile fatigue tests of a full-scale helium inlet sample of the ITER PF1 coil are presented

  5. High temperature experiments on a 4 tons UF6 container TENERIFE program

    Energy Technology Data Exchange (ETDEWEB)

    Casselman, C.; Duret, B.; Seiler, J.M.; Ringot, C.; Warniez, P.

    1991-12-31

    The paper presents an experimental program (called TENERIFE) whose aim is to investigate the behaviour of a cylinder containing UF6 when exposed to a high-temperature fire, for model validation. Taking into account the experiments performed in the past, the modelling needs further information in order to be able to predict the behaviour of a real-size cylinder when engulfed in an 800°C fire, as specified in the regulations. The main unknowns are related to (1) the UF6 behaviour beyond the critical point, (2) the relationship between the temperature field and the internal pressure and (3) the equivalent conductivity of the solid UF6. In order to investigate these phenomena in a representative way it is foreseen to perform experiments with a cylinder of real diameter, but reduced length, containing 4 tons of UF6. This cylinder will be placed in an electrically heated furnace. A confinement vessel prevents any dispersion of UF6. The heat flux delivered by the furnace will be calibrated by specific tests. The cylinder will be changed for each test.

  6. Eight Tons of Material Footprint—Suggestion for a Resource Cap for Household Consumption in Finland

    Directory of Open Access Journals (Sweden)

    Michael Lettenmeier

    2014-07-01

    Full Text Available The paper suggests a sustainable material footprint of eight tons, per person, in a year as a resource cap target for household consumption in Finland. This means an 80% (factor 5 reduction from the present Finnish average. The material footprint is used as a synonym to the Total Material Requirement (TMR calculated for products and activities. The paper suggests how to allocate the sustainable material footprint to different consumption components on the basis of earlier household studies, as well as other studies, on the material intensity of products, services, and infrastructures. It analyzes requirements, opportunities, and challenges for future developments in technology and lifestyle, also taking into account that future lifestyles are supposed to show a high degree of diversity. The targets and approaches are discussed for the consumption components of nutrition, housing, household goods, mobility, leisure activities, and other purposes. The paper states that a sustainable level of natural resource use by households is achievable and it can be roughly allocated to different consumption components in order to illustrate the need for a change in lifestyles. While the absolute material footprint of all the consumption components will have to decrease, the relative share of nutrition, the most basic human need, in the total material footprint is expected to rise, whereas much smaller shares than at present are proposed for housing and especially mobility. For reducing material resource use to the sustainable level suggested, both social innovations, and technological developments are required.
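
    Worked arithmetic implied by the stated 80% (factor 5) reduction:

        \[
          8\ \mathrm{t} \;\approx\; 40\ \mathrm{t}\times(1-0.80) \;=\; \frac{40\ \mathrm{t}}{5},
        \]

    i.e., the eight-ton target implies a present Finnish average material footprint of roughly 40 tons per person per year (a value inferred here from the stated reduction factor, not quoted from the paper).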

  7. Shrinkage Porosity Criterion and Its Application to A 5.5 Ton Steel Ingot

    Directory of Open Access Journals (Sweden)

    Zhang C.

    2016-06-01

    Full Text Available In order to predict the distribution of shrinkage porosity in steel ingots efficiently and accurately, a criterion R√L and a method to obtain its threshold value were proposed. The criterion R√L was derived based on the solidification characteristics of steel ingots and the pressure gradient in the mushy zone, in which the physical properties, the thermal parameters, the structure of the mushy zone and the secondary dendrite arm spacing were all taken into consideration. The threshold value of the criterion R√L was obtained by combining numerical simulation of ingot solidification with the total solidification shrinkage rate. Prediction of the shrinkage porosity in a 5.5-ton ingot of 2Cr13 steel with the criterion R√L > 0.21 m·°C^(1/2)·s^(-3/2) agreed well with the results of experimental sectioning. Based on this criterion, optimization of the ingot was carried out by decreasing the height-to-diameter ratio and increasing the taper, which successfully eliminated the centreline porosity and further proved the applicability of this criterion.

  8. VALIDACIÓN RESISTIVA ESTRUCTURAL DE UN VARADERO PARA EMBARCACIONES DE 600 ton.

    Directory of Open Access Journals (Sweden)

    Carlos Novo Soto

    2005-09-01

    Full Text Available This work validates the load-bearing capacity of the structure of a slipway on the basis of strength and stiffness conditions. The slipway consists of 2 trapezoidal cradle carriages, over which 6 bogies move transversely, allowing the docked vessel to be transferred to the lay-by berths; it also has 3 motors with their respective reduction systems and winch drums: one hauls out the vessel supported on the bogies and the 2 cradle carriages, another is used to return the vessel to the sea, and the last is used to move the vessel on the bogies, transversely to the cradle carriages, towards the lay-by berths. Given the structural complexity of the system, a physical-mathematical model is developed which, through application of the Finite Element Method, yields the maximum von Mises equivalent stresses and the maximum displacements; with these, the load-bearing capacity of the structure to haul out a 600-ton vessel is finally validated through Finite Element Analysis.

  9. The economic value of one ton CO2: what system of reference for public action?

    International Nuclear Information System (INIS)

    2007-04-01

    Given the convergence of scientific analyses of global warming and its consequences for the planet - evaluated for years by the Intergovernmental Panel on Climate Change (IPCC) - it is no longer possible to postpone the efforts required to reduce our emissions of greenhouse gases substantially. However, the choice of actions to take and the calendar of priorities are proving complex to define: the social and economic consequences are great, and neither France (which represents 2% of global emissions) nor Europe (15%) are up to treating the problem independently of the rest of the world. Faced with this challenge, and with budgetary constraints imposing a rationalisation of expenditure, public action must have measuring instruments at its disposal: the value of one ton of carbon is one such instrument. This Strategic Newswatch has a twofold objective: to recall the usefulness of this reference value which, though it cannot guarantee the validity of different public policies, may contribute to ensuring their consistency; and to present the different approaches and difficulties that producing such a reference system introduces. (author)

  10. Baby universe metric equivalent to an interior black-hole metric

    International Nuclear Information System (INIS)

    Gonzalez-Diaz, P.F.

    1991-01-01

    It is shown that the maximally extended metric corresponding to a large wormhole is the unique possible wormhole metric whose baby universe sector is conformally equivalent to the maximal inextendible Kruskal metric corresponding to the interior region of a Schwarzschild black hole whose gravitational radius is half the wormhole neck radius. The physical implications of this result for the black hole evaporation process are discussed. (orig.)

  11. Saving billions of dollars--and physicians' time--by streamlining billing practices.

    Science.gov (United States)

    Blanchfield, Bonnie B; Heffernan, James L; Osgood, Bradford; Sheehan, Rosemary R; Meyer, Gregg S

    2010-06-01

    The U.S. system of billing third parties for health care services is complex, expensive, and inefficient. Physicians end up using nearly 12 percent of their net patient service revenue to cover the costs of excessive administrative complexity. A single transparent set of payment rules for multiple payers, a single claim form, and standard rules of submission, among other innovations, would reduce the burden on the billing offices of physician organizations. On a national scale, our hypothetical modeling of these changes would translate into $7 billion of savings annually for physician and clinical services. Four hours of professional time per physician and five hours of practice support staff time could be saved each week.

  12. Missing billions. How the Australian government's climate policy is penalising farmers

    International Nuclear Information System (INIS)

    Riguet, T.

    2006-10-01

    The Climate Institute analysis suggests ratifying the Kyoto Protocol and implementing a national emissions trading scheme today could provide Australian farmers with an income of $1.8 billion over the period 2008-2012, due to the emissions saved by limiting land clearing. Separately, a report to the National Farmers Federation by the Allen Consulting Group earlier this year concluded that a carbon emission trading system which recognised Kyoto Protocol rules could create an additional income stream of $0.7-0.9 billion over a five year period from revenue to farmers from forestry sinks. These two studies suggest that ratification of the Kyoto Protocol and the introduction of a national emissions trading scheme could provide farmers an income stream in the order of $2.5 billion. A central tenet of the Federal Government's greenhouse policy for over a decade has been to not ratify Kyoto, but to meet its Kyoto target - a national emissions increase of 8% from 1990 levels, in the period 2008-2012. Australia's National Greenhouse Gas Accounts show that farmers, by reducing land clearing rates since 1990, have offset substantial increases in greenhouse gas emissions from other sectors, mainly energy. Official Federal Government projections show that without land clearing reductions, Australia's greenhouse emissions would be 30% above 1990 levels by 2010. Australia's farmers have been responsible for virtually the entire share of the nation's greenhouse gas emissions reductions, but their efforts, worth around $2 billion, have not been recognised or financially rewarded by the Government. By reducing land clearing, farmers have already reduced greenhouse gas emissions by about 75 million tonnes since 1990. By 2010, the savings are projected to be about 83 million tonnes. This level of emissions reductions is equivalent to eliminating the total annual emissions of New Zealand or Ireland. Over that same period, emissions from energy and transport have skyrocketed and continue to do so.

  13. Etude comparative de la cinétique de la réaction d’hydratation des bétons autoplaçants et des bétons vibrés

    Directory of Open Access Journals (Sweden)

    Ahmed Gargouri

    2014-04-01

    Indeed, the exothermic nature of the chemical reaction of cement can induce expansion and contraction deformations. In addition, the capillary depression created by the consumption of water due to cement hydration causes drying shrinkage. These deformations can lead to micro-cracking that may affect the long-term durability of the structure, especially for thick structural elements. Hence the importance of studying the hydration kinetics of these non-conventional concretes and comparing it with that of traditional vibrated concretes. The adiabatic temperature evolution, as well as the variation of the degree of hydration with time, is determined for the self-compacting concrete and the vibrated concrete. Analysis of the experimental results obtained shows that the change in composition considerably modifies the kinetics of the hydration reaction.

  14. Megacity Green Infrastructure Converts Water into Billions of Dollars in Ecosystem Services

    Science.gov (United States)

    Endreny, T. A.; Ulgiati, S.; Santagata, R.

    2016-12-01

    Cities can invest in green infrastructure to purposefully couple water with urban tree growth, thereby generating ecosystem services and supporting human wellbeing, as advocated by United Nations sustainable development initiatives. This research estimates the value of tree-based ecosystem services in order to help megacities assess the benefits relative to the costs of such investments. We inventoried tree cover across the metropolitan areas of 10 megacities, in 5 continents and biomes, and developed biophysical scaling equations using i-Tree tools to estimate the value of tree cover in reducing air pollution, stormwater, building energy use, and carbon emissions. Metropolitan areas ranged from 1173 to 18,720 sq km (median value 2530 sq km), with median tree cover of 21%, and potential additional tree cover of 19%, of this area. Median tree cover density was 39 m²/capita (compared with a global value of 7800 m²/capita), with lower density in desert and tropical biomes, and higher density in temperate biomes. Using water to support trees led to median benefits of $1.2 billion/yr from reductions in CO, NO2, SO2, PM10, and PM2.5, $27 million/yr in avoided stormwater processing by wastewater facilities, $1.2 million/yr in building energy heating and cooling savings, and $20 million/yr in CO2 sequestration. These ecosystem service benefits contributed between 0.1% and 1% of megacity GDP, with a median contribution of 0.3%. Adjustment of benefit values between different city economies considered factors such as purchasing power parity and emergy-to-money ratio conversions. Green infrastructure costs billions of dollars less than grey infrastructure, and stormwater-based grey infrastructure provides fewer benefits. This analysis suggests megacities should invest in tree-based green infrastructure to maintain and increase ecosystem service benefits, manage their water resources, and improve human wellbeing.

  15. IRON AND {alpha}-ELEMENT PRODUCTION IN THE FIRST ONE BILLION YEARS AFTER THE BIG BANG

    Energy Technology Data Exchange (ETDEWEB)

    Becker, George D.; Carswell, Robert F. [Kavli Institute for Cosmology and Institute of Astronomy, Madingley Road, Cambridge, CB3 0HA (United Kingdom); Sargent, Wallace L. W. [Palomar Observatory, California Institute of Technology, Pasadena, CA 91125 (United States); Rauch, Michael, E-mail: gdb@ast.cam.ac.uk, E-mail: acalver@ast.cam.ac.uk, E-mail: wws@astro.caltech.edu, E-mail: mr@obs.carnegiescience.edu [Carnegie Observatories, 813 Santa Barbara Street, Pasadena, CA 91101 (United States)

    2012-01-10

    We present measurements of carbon, oxygen, silicon, and iron in quasar absorption systems existing when the universe was roughly one billion years old. We measure column densities in nine low-ionization systems at 4.7 < z < 6.3 using Keck, Magellan, and Very Large Telescope optical and near-infrared spectra with moderate to high resolution. The column density ratios among C II, O I, Si II, and Fe II are nearly identical to sub-damped Lyα systems (sub-DLAs) and metal-poor ([M/H] ≤ -1) DLAs at lower redshifts, with no significant evolution over 2 ≲ z ≲ 6. The estimated intrinsic scatter in the ratio of any two elements is also small, with a typical rms deviation of ≲ 0.1 dex. These facts suggest that dust depletion and ionization effects are minimal in our z > 4.7 systems, as in the lower-redshift DLAs, and that the column density ratios are close to the intrinsic relative element abundances. The abundances in our z > 4.7 systems are therefore likely to represent the typical integrated yields from stellar populations within the first gigayear of cosmic history. Due to the time limit imposed by the age of the universe at these redshifts, our measurements thus place direct constraints on the metal production of massive stars, including iron yields of prompt supernovae. The lack of redshift evolution further suggests that the metal inventories of most metal-poor absorption systems at z ≳ 2 are also dominated by massive stars, with minimal contributions from delayed Type Ia supernovae or winds from asymptotic giant branch stars. The relative abundances in our systems broadly agree with those in very metal-poor, non-carbon-enhanced Galactic halo stars. This is consistent with the picture in which present-day metal-poor stars were potentially formed as early as one billion years after the big bang.

  16. Searching for Organics Preserved in 4.5 Billion Year Old Salt

    Science.gov (United States)

    Zolensky, Michael E.; Fries, M.; Steele, A.; Bodnar, R.

    2012-01-01

    Our understanding of early solar system fluids took a dramatic turn a decade ago with the discovery of fluid inclusion-bearing halite (NaCl) crystals in the matrix of two freshly fallen brecciated H chondrite falls, Monahans and Zag. Both meteorites are regolith breccias and contain xenolithic halite (and minor admixed sylvite, KCl) crystals in their regolith lithologies. The halites are purple to dark blue, due to the presence of color centers (electrons in anion vacancies) which slowly accumulated as 40K (in sylvite) decayed over billions of years. The halites were dated by K-Ar, Rb-Sr and I-Xe systematics to be 4.5 billion years old. The "blue" halites were a fantastic discovery for the following reasons: (1) Halite+sylvite can be dated (K is in sylvite and will substitute for Na in halite, Rb substitutes in halite for Na, and I substitutes for Cl). (2) The blue color is lost if the halite dissolves on Earth and reprecipitates (because the newly-formed halite has no color centers), so the color serves as a "freshness" or pristinity indicator. (3) Halite frequently contains aqueous fluid inclusions. (4) Halite contains no structural oxygen, carbon or hydrogen, making it an ideal material in which to measure these isotopic systems in any fluid inclusions. (5) It is possible to directly measure fluid inclusion formation temperatures, and thus directly measure the temperature of the mineralizing aqueous fluid. In addition to these two ordinary chondrites, halite grains have been reliably reported in several ureilites, an additional ordinary chondrite (Jilin), and in the carbonaceous chondrite Murchison, although these reports were unfortunately not taken seriously. We have lately found additional fluid inclusions in carbonates in several additional carbonaceous chondrites. Meteoritic aqueous fluid inclusions are apparently relatively widespread in meteorites, though very small and thus difficult to analyze.

  17. ICI bites demerger bullet, Zeneca guns for £1.3-billion rights issue

    International Nuclear Information System (INIS)

    Jackson, D.; Alperowicz, N.

    1993-01-01

    Any lingering doubts as to ICI's (London) intentions to follow through on its demerger proposals were dispelled last week. The company will hive off its bioscience business into Zeneca Group plc, which will make a £1.3-billion ($1.9 billion) rights issue in June 1993. Shareholders, whose approval for the historic move will be sought in late May, will receive one fully paid Zeneca share for each ICI share. Proceeds from the rights issue will be used to reduce Zeneca's indebtedness to ICI by about 70%. Acknowledging that ICI had 'spread the jam too thinly' during its expansion in the 1980s, chief executive Ronnie Hampel says the new ICI will be a 'cost-conscious, no-frills' organization and that businesses that failed to perform would be restructured or closed. He is 'not expecting any help from the economy' in 1993. Of ICI's remaining petrochemicals and plastics businesses, Hampel says that despite 'stringent measures to reduce the cost base … it is clear they will not reach a return on capital that will justify reinvestment by ICI.' He does not see them as closure candidates but as 'businesses that will require further restructuring.' Hampel notes 'a dozen clearly identified areas for expansion,' including paints, catalysts, titanium dioxide, and chlorofluorocarbon replacements. Losses in materials, where substantial rationalization has failed to halt the slide, will be reduced on completion of the DuPont deal - expected by midyear. 'Further measures' would be necessary for the 'residual bit of advanced materials in the US,' he says

  18. A parts-per-billion measurement of the antiproton magnetic moment.

    Science.gov (United States)

    Smorra, C; Sellner, S; Borchert, M J; Harrington, J A; Higuchi, T; Nagahama, H; Tanaka, T; Mooser, A; Schneider, G; Bohman, M; Blaum, K; Matsuda, Y; Ospelkaus, C; Quint, W; Walz, J; Yamazaki, Y; Ulmer, S

    2017-10-18

    Precise comparisons of the fundamental properties of matter-antimatter conjugates provide sensitive tests of charge-parity-time (CPT) invariance, which is an important symmetry that rests on basic assumptions of the standard model of particle physics. Experiments on mesons, leptons and baryons have compared different properties of matter-antimatter conjugates with fractional uncertainties at the parts-per-billion level or better. One specific quantity, however, has so far only been known to a fractional uncertainty at the parts-per-million level: the magnetic moment of the antiproton, μp̄. The extraordinary difficulty in measuring μp̄ with high precision is caused by its intrinsic smallness; for example, it is 660 times smaller than the magnetic moment of the positron. Here we report a high-precision measurement of μp̄ in units of the nuclear magneton μN with a fractional precision of 1.5 parts per billion (68% confidence level). We use a two-particle spectroscopy method in an advanced cryogenic multi-Penning trap system. Our result μp̄ = -2.7928473441(42) μN (where the number in parentheses represents the 68% confidence interval on the last digits of the value) improves the precision of the previous best measurement by a factor of approximately 350. The measured value is consistent with the proton magnetic moment, μp = 2.792847350(9) μN, and is in agreement with CPT invariance. Consequently, this measurement constrains the magnitude of certain CPT-violating effects to below 1.8 × 10⁻²⁴ gigaelectronvolts, and a possible splitting of the proton-antiproton magnetic moments by CPT-odd dimension-five interactions to below 6 × 10⁻¹² Bohr magnetons.
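
    As a quick check of the quoted precision (arithmetic ours, using only the numbers stated above):

        \[
          \frac{\delta\mu_{\bar p}}{|\mu_{\bar p}|}
          = \frac{0.0000000042}{2.7928473441}
          \approx 1.5\times10^{-9}
          = 1.5\ \text{parts per billion}.
        \]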

  19. The dynamics of metric-affine gravity

    International Nuclear Information System (INIS)

    Vitagliano, Vincenzo; Sotiriou, Thomas P.; Liberati, Stefano

    2011-01-01

    Highlights: → The role and the dynamics of the connection in metric-affine theories are explored. → The most general second order action does not lead to a dynamical connection. → Including higher order invariants excites new degrees of freedom in the connection. → f(R) actions are also discussed and shown to be a non-representative class. - Abstract: Metric-affine theories of gravity provide an interesting alternative to general relativity: in such an approach, the metric and the affine (not necessarily symmetric) connection are independent quantities. Furthermore, the action should include covariant derivatives of the matter fields, with the covariant derivative naturally defined using the independent connection. As a result, in metric-affine theories a direct coupling involving matter and connection is also present. The role and the dynamics of the connection in such theories are explored. We employ power counting in order to construct the action and search for the minimal requirements it should satisfy for the connection to be dynamical. We find that for the most general action containing lower order invariants of the curvature and the torsion the independent connection does not carry any dynamics. It actually reduces to the role of an auxiliary field and can be completely eliminated algebraically in favour of the metric and the matter field, introducing extra interactions with respect to general relativity. However, we also show that including higher order terms in the action radically changes this picture and excites new degrees of freedom in the connection, making it (or parts of it) dynamical. Constructing actions that constitute exceptions to this rule requires significant fine tuning and/or extra a priori constraints on the connection. We also consider f(R) actions as a particular example in order to show that they constitute a distinct class of metric-affine theories with special properties, and as such they cannot be used as representative toy models.

  20. Articulated vehicles of 25 meter and 60 ton in The Netherlands: the start of a pilot project

    NARCIS (Netherlands)

    Hoogvelt, R.B.J.; Huijbers, J.J.W.

    1998-01-01

    At this moment the total allowable length of an articulated vehicle in The Netherlands is 18.35 meter and its total weight is 50 ton. Several Dutch transportation organisations requested a pilot project with longer and heavier vehicles for heavy goods transportation. Because of the environmental

  1. Active Seismic Monitoring Using High-Power Moveable 40-TONS Vibration Sources in Altay-Sayn Region of Russia

    Science.gov (United States)

    Soloviev, V. M.; Seleznev, V. S.; Emanov, A. F.; Kashun, V. N.; Elagin, S. A.; Romanenko, I.; Shenmayer, A. E.; Serezhnikov, N.

    2013-05-01

    The paper presents data from routine vibroseismic observations using high-power stationary 100-ton and moveable 40-ton vibration sources, which have been carried out in Russia for 30 years. It is shown that investigations using high-power vibration sources open new possibilities for studying the stress-strain state of the Earth's crust and the upper mantle and the tectonic processes in them. Special attention is given to developing routine seismic sounding of the Earth's crust and the upper mantle using high-power 40-ton vibration sources. Experimental studies have demonstrated the high stability and repeatability of the vibration signals. Multi-day experiments, with vibration-source sessions every two hours, were carried out to estimate monitoring accuracy. The repeatability of the vibroseismic signals (measured as the travel-time difference of P- and S-waves from the crystalline basement between repeated sessions) was estimated at 10⁻³-10⁻⁴ s. This is an order of magnitude smaller than the annual variations of kinematic parameters revealed here by routine vibroseismic observations. It is shown that on hard, high-velocity ground the radiated spectrum becomes narrowband and shifts toward higher frequencies, while the number of high-frequency harmonics grows. When radiating on soft sedimentary ground (sand, clay), the near-field spectrum of the vibration source is more broadband and the correlograms are more compact. The correspondence of the wave fields from 40-ton vibration sources and from explosions, for reference waves from boundaries in the Earth's crust and the upper mantle at recording distances of 400 km, has been confirmed by many experiments in various regions of Russia; a technique of grouping high-power vibration sources was developed to increase the effectiveness of radiation and the recording distance. According to the results of long-term vibroseismic monitoring near Novosibirsk (1997-2012) there are

  2. Investigation of the thermal behavior of 2 1/2 ton cylinder protective overpack

    International Nuclear Information System (INIS)

    Park, S.H.

    1988-01-01

    UF 6 cylinders containing reactor grade enriched uranium are transported in protective overpacks. Recently, the design of the 2 1/2 ton UF 6 cylinder overpack was modified to insure the safety of the cylinder inside the overpack. Modifications include a continuous stainless steel liner from the outer surface to the inner surface of the overpack and step joints between the upper and lower halves of the overpack. The effects of a continuous stainless steel liner and moisture in the insulation layer of a UF 6 cylinder protective overpack were investigated with a numerical code. Results were compared with limited available field data. The purpose of comparing the numerical results with field data is to insure the validity of the numerical analysis and the physical properties used in the analysis. The study indicates that the continuous stainless steel liner did not influence the heat transfer rate much from the outer surface of the overpack to the 30B cylinder inside. The effect of step joints was not modeled due to the difficulty of quantifying the leakage rate through the gap. With a continuous stainless steel liner from the outside of the overpack to the inside, the overpack satisfies the thermal design criteria of protecting the cylinder inside for a minimum of 30 minutes when the overpack is exposed to a fire. The effect of moisture inside the insulation layer in the overpack is to reduce the energy to the cylinder with its high thermal capacity. The high pressure steam generated from the moisture will be relieved externally through the vent holes on the outer surface of the overpack. Although these holes are sealed after the overpack is dried, the plug sealing the holes will melt when the overpack is exposed to a fire

  3. TENERIFE program: high-temperature experiments on a 4-ton UF6 container

    International Nuclear Information System (INIS)

    Casselman, C.; Duret, B.; Seiler, J.M.; Ringot, C.; Warniez, P.; Wataru, M.; Shiomi, S.; Ozaki, S.; Yamakawa, H.

    1993-01-01

    To determine the input for a future thermo-mechanical code, a better understanding is needed of the thermo-physical evolution of the UF6 that pressurizes the container. This evolution is a function of: a) the heat transfer rate from the fire to the container, and b) the behaviour of the UF6 in the container. The tests are essentially analytical, at simulated fire temperatures between 800 and 1000 °C, and use a representative mass of UF6 (around 4 tons). The tests will not seek to rupture the test container, which has the same diameter as the 48Y container but a shorter length. These tests, carried out under realistic conditions (typical thermal gradient at the wall, characteristic period for internal UF6 mass transfer), should make it possible to improve knowledge of two fundamental phenomena: 1) vaporization of UF6 in contact with the heated wall (around 400 °C), a phenomenon which controls the kinetics of container internal pressurization; 2) the equivalent conductivity of solid UF6, a phenomenon linked to heat transfer by UF6 vaporization-condensation through the solid's porosities and which depends on the diameter of the container. In addition, the tests will allow the influence of other parameters to be studied, such as the UF6 container filling mode or the mechanical characteristics of the container material. A UF6 container fitted with instruments (wall temperature, UF6 temperature, pressure) is heated by a rapid thermal transient in a radiating furnace in which the temperature and thermal power supplied can be measured. The test continues until pre-established thresholds are reached: 1) a strain threshold measured on the container surface (strain gauges positioned on the outside), 2) a maximum UF6 temperature threshold, 3) a container internal pressure threshold. (J.P.N.)

  4. Analysis of internal crack in a six-ton P91 ingot

    Directory of Open Access Journals (Sweden)

    Jing-an Yang

    2016-05-01

    Full Text Available P91 is a new kind of heat-resistant, high-tensile steel. It can be extruded after ingot casting and is widely used for piping in power plants. However, due to its mushy freezing characteristics, a lack of feeding in the ingot center often generates defects such as porosity and cracks. A six-ton P91 ingot was cast and sliced, and a representative part of the longitudinal section was inspected in detail. The morphology of crack-like defects was examined by high-energy industrial X-ray CT and reconstructed with 3D software. There are five main defect regions larger than 200 mm3, four of which are interconnected. These initiated from a continuous liquid film and were then torn apart by excessive tensile stress within the brittle temperature range (BTR). A 3D FEM thermo-mechanical simulation was carried out to analyze the formation of porosity and internal crack defects. The shrinkage porosity and Niyama values revealed that the center of the ingot suffers from inadequate feeding. Several criteria based on thermal and mechanical models were used to evaluate the susceptibility to hot crack formation; a worked illustration of such a criterion is sketched after this abstract. The Clyne and Davies criterion and Katgerman's criterion successfully predicted the high hot-crack susceptibility in the ingot center. Six typical locations in the longitudinal section were chosen for analysis of the stress and strain evolution during the BTR. Locations in the defect region showed the highest tensile stresses and relatively high strain values, while other locations showed either low tensile stresses or low strain values. In conclusion, a hot crack develops only when stress and strain exceed a threshold value at the same time during the BTR.
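
    As a rough illustration of the thermal criteria mentioned above, the Niyama value is conventionally computed as the local thermal gradient divided by the square root of the local cooling rate, with low values flagging feeding problems. The Python sketch below uses invented numbers; it is not taken from the paper's simulation.

      import math

      def niyama(thermal_gradient, cooling_rate):
          """Niyama criterion Ny = G / sqrt(dT/dt).

          thermal_gradient: local temperature gradient G near the end of
              solidification (e.g. K/mm); cooling_rate: dT/dt (K/s).
          Lower values indicate a higher risk of shrinkage porosity.
          """
          return thermal_gradient / math.sqrt(cooling_rate)

      # Hypothetical values for two locations in an ingot section
      centre = niyama(thermal_gradient=0.4, cooling_rate=0.05)   # poorly fed centre
      surface = niyama(thermal_gradient=2.0, cooling_rate=0.5)   # well fed near-surface zone
      print(f"Ny centre  = {centre:.2f}")    # ~1.79
      print(f"Ny surface = {surface:.2f}")   # ~2.83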

  5. Development and operation of a 30 ton/day gasification and melting plant for municipal solid wastes

    International Nuclear Information System (INIS)

    Jung, Hae Young; Seo, Yong-Chil; Cho, Sung-Jin; Lee, Jang-Su; Lee, Ki-Bae; Jeong, Dae-Woon; Kim, Woo-Hyun; Roh, Seon-Ah; Min, Tai-Jin

    2010-01-01

    As part of efforts to increase the recycling rate of end-of-life vehicles (ELVs) enforced by governmental regulation, automobile shredder residue (ASR) was considered for treatment by a thermal method converting waste to energy. Gasification and melting experimental processes at lab scale (1 kg/hour) and pilot scale (5 ton/day) were installed. ASR collected from a domestic shredding company was tested in lab-scale and pilot-scale gasification and melting processes similar to a shaft-type gasification melting furnace. The characteristics of the syngas, tar and residue (slag) generated from the conversion process (gasification and melting) were analyzed to provide information for further utilizing them as fuel and recyclable materials in scaled-up plants. A series of experiments was conducted with various air equivalence ratios (ERs), and the syngas composition, carbon conversion efficiency, heating value of the syngas, and the yield and characteristics of the slag were analyzed. Finally, slag generated from the process was recycled with various alternative technologies. In summary, an energy conversion technology for ASR with minimal residue production, based on gasification and slag utilization, has been developed. The main components in the product gas were H2, CO, CH4 and CO2; concentrations of C2H4 and C2H6 were lower. The product gas can be used as a clean fuel gas with a heating value ranging from 2.5 to 14.0 MJ/m3. Most of the slag generated from the process can be further fabricated into valuable and usable products. Such a combined technology would achieve almost zero waste release from ELVs. (author)

  6. Evaluation metrics for biostatistical and epidemiological collaborations.

    Science.gov (United States)

    Rubio, Doris McGartland; Del Junco, Deborah J; Bhore, Rafia; Lindsell, Christopher J; Oster, Robert A; Wittkowski, Knut M; Welty, Leah J; Li, Yi-Ju; Demets, Dave

    2011-10-15

    Increasing demands for evidence-based medicine and for the translation of biomedical research into individual and public health benefit have been accompanied by the proliferation of special units that offer expertise in biostatistics, epidemiology, and research design (BERD) within academic health centers. Objective metrics that can be used to evaluate, track, and improve the performance of these BERD units are critical to their successful establishment and sustainable future. To develop a set of reliable but versatile metrics that can be adapted easily to different environments and evolving needs, we consulted with members of BERD units from the consortium of academic health centers funded by the Clinical and Translational Science Award Program of the National Institutes of Health. Through a systematic process of consensus building and document drafting, we formulated metrics that covered the three identified domains of BERD practices: the development and maintenance of collaborations with clinical and translational science investigators, the application of BERD-related methods to clinical and translational research, and the discovery of novel BERD-related methodologies. In this article, we describe the set of metrics and advocate their use for evaluating BERD practices. The routine application, comparison of findings across diverse BERD units, and ongoing refinement of the metrics will identify trends, facilitate meaningful changes, and ultimately enhance the contribution of BERD activities to biomedical research. Copyright © 2011 John Wiley & Sons, Ltd.

  7. A Metric on Phylogenetic Tree Shapes.

    Science.gov (United States)

    Colijn, C; Plazzotta, G

    2018-01-01

    The shapes of evolutionary trees are influenced by the nature of the evolutionary process but comparisons of trees from different processes are hindered by the challenge of completely describing tree shape. We present a full characterization of the shapes of rooted branching trees in a form that lends itself to natural tree comparisons. We use this characterization to define a metric, in the sense of a true distance function, on tree shapes. The metric distinguishes trees from random models known to produce different tree shapes. It separates trees derived from tropical versus USA influenza A sequences, which reflect the differing epidemiology of tropical and seasonal flu. We describe several metrics based on the same core characterization, and illustrate how to extend the metric to incorporate trees' branch lengths or other features such as overall imbalance. Our approach allows us to construct addition and multiplication on trees, and to create a convex metric on tree shapes which formally allows computation of average tree shapes. © The Author(s) 2017. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.

  8. Future of the PCI Readmission Metric.

    Science.gov (United States)

    Wasfy, Jason H; Yeh, Robert W

    2016-03-01

    Between 2013 and 2014, the Centers for Medicare and Medicaid Services and the National Cardiovascular Data Registry publicly reported risk-adjusted 30-day readmission rates after percutaneous coronary intervention (PCI) as a pilot project. A key strength of this public reporting effort was risk adjustment with clinical rather than administrative data. Furthermore, because readmission after PCI is common, expensive, and preventable, this metric has substantial potential to improve quality and value in American cardiology care. Despite this, concerns about the metric exist. For example, few PCI readmissions are caused by procedural complications, limiting the extent to which improved procedural technique can reduce readmissions. Also, similar to other readmission measures, PCI readmission is associated with socioeconomic status and race. Accordingly, the metric may unfairly penalize hospitals that care for underserved patients. Perhaps in the context of these limitations, the Centers for Medicare and Medicaid Services has not yet included PCI readmission among the metrics that determine Medicare financial penalties. Nevertheless, provider organizations may still wish to focus on this metric to improve value for cardiology patients. PCI readmission is associated with low-risk chest discomfort and patient anxiety. Therefore, patient education, improved triage mechanisms, and improved care coordination offer opportunities to minimize PCI readmissions. Because PCI readmission is common and costly, reducing PCI readmission offers provider organizations a compelling target to improve the quality of care, and also performance in contracts that involve shared financial risk. © 2016 American Heart Association, Inc.

  9. g-Weak Contraction in Ordered Cone Rectangular Metric Spaces

    Directory of Open Access Journals (Sweden)

    S. K. Malhotra

    2013-01-01

    Full Text Available We prove some common fixed-point theorems for ordered g-weak contractions in cone rectangular metric spaces without assuming the normality of the cone. Our results generalize some recent results from cone metric and cone rectangular metric spaces to ordered cone rectangular metric spaces. Examples are provided which illustrate the results.

  10. Defining a Progress Metric for CERT RMM Improvement

    Science.gov (United States)

    2017-09-14

    REV-03.18.2016.0 Defining a Progress Metric for CERT-RMM Improvement Gregory Crabb Nader Mehravari David Tobar September 2017 TECHNICAL ...fendable resource allocation decisions. Technical metrics measure aspects of controls implemented through technology (systems, software, hardware...implementation metric would be the percentage of users who have received anti-phishing training. • Effectiveness/efficiency metrics measure whether

  11. NASA education briefs for the classroom. Metrics in space

    Science.gov (United States)

    The use of metric measurement in space is summarized for classroom use. Advantages of the metric system over the English measurement system are described. Some common metric units are defined, as are special units for astronomical study. International system unit prefixes and a conversion table of metric/English units are presented. Questions and activities for the classroom are recommended.

  12. One billion year-old Mid-continent Rift leaves virtually no clues in the mantle

    Science.gov (United States)

    Bollmann, T. A.; Frederiksen, A. W.; van der Lee, S.; Wolin, E.; Revenaugh, J.; Wiens, D.; Darbyshire, F. A.; Aleqabi, G. I.; Wysession, M. E.; Stein, S.; Jurdy, D. M.

    2017-12-01

    We measured the relative arrival times of more than forty-six thousand teleseismic P waves recorded by seismic stations of Earthscope's Superior Province Rifting Earthscope Experiment (SPREE) and combined them with a similar number of such measurements from other seismic stations in the larger region. SPREE recorded seismic waves for two and a half years around the prominent, one-billion-year-old Mid-continent Rift structure. The curvilinear Mid-continent Rift (MR) is distinguished by voluminous one-billion-year-old lava flows, which produce a prominent gravity high along the MR. As with other seismic waves, these lava flows, along with their underplated counterpart, slightly slow down the measured teleseismic P waves on average, compared to P waves that did not traverse structures beneath the Mid-continent Rift. However, the variance in the P wave arrival times in these two groups is nearly ten times higher than their average difference. In a seismic-tomographic inversion, we mapped all measured arrival times into structures deep beneath the crust, in the Earth's mantle. Beneath the crust we generally find relatively high P velocities, indicating relatively cool and undeformable mantle structures. However, the uppermost mantle beneath the MR shows several patches of slightly decreased P velocities. These patches coincide with where the gravity anomalies peak, in Iowa and along the northern Minnesota/Wisconsin border. We will report on the likelihood that these anomalies are indeed a remaining mantle-lithospheric signature of the MR, or whether these patches indirectly reflect the presence of the lava flows and their underplated counterparts at the crust-mantle interface. Other structures of interest and of varying depth extent in our tomographic image are located at 1) the intersection of the Superior Craton with the Penokean Province and the Marshfield Terrane west of the MR in southern Minnesota, 2) the intersection of the Penokean, Yavapai, and Mazatzal Terranes

  13. SOCIAL METRICS APPLIED TO SMART TOURISM

    Directory of Open Access Journals (Sweden)

    O. Cervantes

    2016-09-01

    Full Text Available We present a strategy to make productive use of semantically-related social data, from a user-centered semantic network, in order to help users (tourists and citizens in general) to discover cultural heritage, points of interest and available services in a smart city. This data can be used to personalize recommendations in a smart tourism application. Our approach is based on flow centrality metrics typically used in social network analysis: flow betweenness, flow closeness and eccentricity. These metrics are useful for discovering relevant nodes within the network, yielding nodes that can be interpreted as suggestions (venues or services) to users. We describe the semantic network built on a graph model, as well as the social metrics algorithms used to produce recommendations. We also present challenges and results from a prototypical implementation applied to the case study of the City of Puebla, Mexico.
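
    To make the centrality idea above concrete, the sketch below computes flow betweenness, flow closeness and eccentricity on a toy venue graph using the networkx Python library; the graph, the node names and the final ranking step are illustrative assumptions, not data from the study.

      import networkx as nx

      # Toy "venue/service" graph standing in for the user-centered semantic network.
      G = nx.Graph()
      G.add_edges_from([
          ("cathedral", "museum"), ("museum", "cafe"), ("cafe", "hotel"),
          ("cathedral", "plaza"), ("plaza", "cafe"), ("plaza", "hotel"),
      ])

      flow_betweenness = nx.current_flow_betweenness_centrality(G)
      flow_closeness = nx.current_flow_closeness_centrality(G)
      eccentricity = nx.eccentricity(G)

      # Rank venues by flow betweenness as a crude "suggestion" score.
      for node, score in sorted(flow_betweenness.items(), key=lambda kv: -kv[1]):
          print(f"{node:10s} betweenness={score:.3f} "
                f"closeness={flow_closeness[node]:.3f} ecc={eccentricity[node]}")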

  14. Landscape pattern metrics and regional assessment

    Science.gov (United States)

    O'Neill, R. V.; Riitters, K.H.; Wickham, J.D.; Jones, K.B.

    1999-01-01

    The combination of remote imagery data, geographic information systems software, and landscape ecology theory provides a unique basis for monitoring and assessing large-scale ecological systems. The unique feature of this work has been the need to develop and interpret quantitative measures of spatial pattern: the landscape indices. This article reviews what is known about the statistical properties of these pattern metrics and suggests some additional metrics based on island biogeography, percolation theory, hierarchy theory, and economic geography. Assessment applications of this approach have required interpreting the pattern metrics in terms of specific environmental endpoints, such as wildlife and water quality, and research into how to represent synergistic effects of many overlapping sources of stress.

  15. A bi-metric theory of gravitation

    International Nuclear Information System (INIS)

    Rosen, N.

    1975-01-01

    The bi-metric theory of gravitation proposed previously is simplified in that the auxiliary conditions are discarded, the two metric tensors being tied together only by means of the boundary conditions. Some of the properties of the field of a particle are investigated; there is no black hole, and it appears that no gravitational collapse can take place. Although the proposed theory and general relativity are at present observationally indistinguishable, some differences are pointed out which may some day be susceptible of observation. An alternative bi-metric theory is considered which gives for the precession of the perihelion 5/6 of the value given by general relativity; it seems less satisfactory than the present theory from the aesthetic point of view. (author)

  16. Steiner trees for fixed orientation metrics

    DEFF Research Database (Denmark)

    Brazil, Marcus; Zachariasen, Martin

    2009-01-01

    We consider the problem of constructing Steiner minimum trees for a metric defined by a polygonal unit circle (corresponding to s = 2 weighted legal orientations in the plane). A linear-time algorithm to enumerate all angle configurations for degree three Steiner points is given. We provide...... a simple proof that the angle configuration for a Steiner point extends to all Steiner points in a full Steiner minimum tree, such that at most six orientations suffice for edges in a full Steiner minimum tree. We show that the concept of canonical forms originally introduced for the uniform orientation...... metric generalises to the fixed orientation metric. Finally, we give an O(s n) time algorithm to compute a Steiner minimum tree for a given full Steiner topology with n terminal leaves....

  17. Metrical and dynamical aspects in complex analysis

    CERN Document Server

    2017-01-01

    The central theme of this reference book is the metric geometry of complex analysis in several variables. Bridging a gap in the current literature, the text focuses on the fine behavior of the Kobayashi metric of complex manifolds and its relationships to dynamical systems, hyperbolicity in the sense of Gromov and operator theory, all very active areas of research. The modern points of view expressed in these notes, collected here for the first time, will be of interest to academics working in the fields of several complex variables and metric geometry. The different topics are treated coherently and include expository presentations of the relevant tools, techniques and objects, which will be particularly useful for graduate and PhD students specializing in the area.

  18. Social Metrics Applied to Smart Tourism

    Science.gov (United States)

    Cervantes, O.; Gutiérrez, E.; Gutiérrez, F.; Sánchez, J. A.

    2016-09-01

    We present a strategy to make productive use of semantically-related social data, from a user-centered semantic network, in order to help users (tourists and citizens in general) to discover cultural heritage, points of interest and available services in a smart city. This data can be used to personalize recommendations in a smart tourism application. Our approach is based on flow centrality metrics typically used in social network analysis: flow betweenness, flow closeness and eccentricity. These metrics are useful for discovering relevant nodes within the network, yielding nodes that can be interpreted as suggestions (venues or services) to users. We describe the semantic network built on a graph model, as well as the social metrics algorithms used to produce recommendations. We also present challenges and results from a prototypical implementation applied to the case study of the City of Puebla, Mexico.

  19. Validation of Metrics as Error Predictors

    Science.gov (United States)

    Mendling, Jan

    In this chapter, we test the validity of metrics that were defined in the previous chapter for predicting errors in EPC business process models. In Section 5.1, we provide an overview of how the analysis data is generated. Section 5.2 describes the sample of EPCs from practice that we use for the analysis. Here we discuss a disaggregation by the EPC model group and by error, as well as a correlation analysis between metrics and errors. Based on this sample, we calculate a logistic regression model for predicting error probability with the metrics as input variables in Section 5.3. In Section 5.4, we then test the regression function for an independent sample of EPC models from textbooks as a cross-validation. Section 5.5 summarizes the findings.
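
    A minimal sketch of the kind of logistic regression step described for Section 5.3, using scikit-learn; the three metric columns, their values and the error labels are invented for illustration and are not the EPC sample analysed in the chapter.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Hypothetical data: each row holds metric values for one process model
      # (e.g. size, connector count, depth); y marks whether the model has an error.
      X = np.array([[12, 3, 2], [45, 11, 5], [8, 1, 1], [60, 15, 6],
                    [20, 4, 3], [75, 20, 7], [15, 2, 2], [50, 12, 5]])
      y = np.array([0, 1, 0, 1, 0, 1, 0, 1])

      model = LogisticRegression().fit(X, y)
      print(model.predict_proba([[30, 8, 4]])[0, 1])  # estimated error probability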

  20. Metric Learning for Hyperspectral Image Segmentation

    Science.gov (United States)

    Bue, Brian D.; Thompson, David R.; Gilmore, Martha S.; Castano, Rebecca

    2011-01-01

    We present a metric learning approach to improve the performance of unsupervised hyperspectral image segmentation. Unsupervised spatial segmentation can assist both user visualization and automatic recognition of surface features. Analysts can use spatially-continuous segments to decrease noise levels and/or localize feature boundaries. However, existing segmentation methods use task-agnostic measures of similarity. Here we learn task-specific similarity measures from training data, improving segment fidelity to classes of interest. Multiclass Linear Discriminant Analysis produces a linear transform that optimally separates a labeled set of training classes. This transform defines a distance metric that generalizes to new scenes, enabling graph-based segmentation that emphasizes key spectral features. We describe tests based on data from the Compact Reconnaissance Imaging Spectrometer (CRISM) in which learned metrics improve segment homogeneity with respect to mineralogical classes.
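
    The sketch below illustrates the general idea of using a multiclass LDA transform as a learned distance, with scikit-learn and random stand-in data; it is not the CRISM pipeline itself, and the band count, class labels and spectra are assumptions.

      import numpy as np
      from scipy.spatial.distance import cdist
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      # Hypothetical labelled training spectra (rows = pixels, columns = bands).
      rng = np.random.default_rng(0)
      X_train = rng.normal(size=(60, 10))
      y_train = np.repeat([0, 1, 2], 20)

      lda = LinearDiscriminantAnalysis(n_components=2).fit(X_train, y_train)

      # The learned linear transform defines a task-specific distance:
      # distances are measured between projected spectra rather than raw ones.
      X_new = rng.normal(size=(5, 10))
      D = cdist(lda.transform(X_new), lda.transform(X_new))
      print(D.round(2))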

  1. Kerr metric in the deSitter background

    International Nuclear Information System (INIS)

    Vaidya, P.C.

    1984-01-01

    In addition to the Kerr metric with cosmological constant Λ, several other metrics are presented giving Kerr-like solutions of Einstein's equations in the background of the de Sitter universe. A new metric, of what may be termed rotating de Sitter space-time, devoid of matter but containing a null fluid with twisting null rays, is presented. This metric reduces to the standard de Sitter metric when the twist in the rays vanishes. The Kerr metric in this background is the immediate generalization of Schwarzschild's exterior metric with cosmological constant. (author)
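
    As a point of reference for the non-rotating limit, the Schwarzschild-de Sitter (Kottler) line element, which the Kerr-de Sitter solutions above generalize, can be written (in units G = c = 1) as:

      \[
        ds^{2} = -\Bigl(1-\frac{2M}{r}-\frac{\Lambda r^{2}}{3}\Bigr)dt^{2}
                 + \Bigl(1-\frac{2M}{r}-\frac{\Lambda r^{2}}{3}\Bigr)^{-1}dr^{2}
                 + r^{2}\,d\Omega^{2},
      \]

    which reduces to the static de Sitter metric for M = 0 and to the Schwarzschild exterior metric for Λ = 0.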

  2. Active Metric Learning from Relative Comparisons

    OpenAIRE

    Xiong, Sicheng; Rosales, Rómer; Pei, Yuanli; Fern, Xiaoli Z.

    2014-01-01

    This work focuses on active learning of distance metrics from relative comparison information. A relative comparison specifies, for a data point triplet $(x_i,x_j,x_k)$, that instance $x_i$ is more similar to $x_j$ than to $x_k$. Such constraints, when available, have been shown to be useful for defining appropriate distance metrics. In real-world applications, acquiring constraints often requires considerable human effort. This motivates us to study how to select and query the most useful ...
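
    As a small illustration of how a single relative comparison constrains a metric (this is not the paper's active query-selection strategy), the sketch below checks a triplet constraint under a Mahalanobis distance with a hypothetical weight matrix M.

      import numpy as np

      def mahalanobis_sq(x, y, M):
          d = x - y
          return float(d @ M @ d)

      def satisfied(triplet, M):
          # A triplet (x_i, x_j, x_k) encodes "x_i is more similar to x_j than to x_k".
          x_i, x_j, x_k = triplet
          return mahalanobis_sq(x_i, x_j, M) < mahalanobis_sq(x_i, x_k, M)

      # Hypothetical 2-D points: the comparison says a is more like b than like c.
      a, b, c = np.array([0.0, 0.0]), np.array([0.0, 2.0]), np.array([1.0, 0.0])

      print(satisfied((a, b, c), np.eye(2)))            # False under the plain Euclidean metric
      print(satisfied((a, b, c), np.diag([1.0, 0.1])))  # True once feature 2 is down-weighted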

  3. Heuristic extension of the Schwarzschild metric

    International Nuclear Information System (INIS)

    Espinosa, J.M.

    1982-01-01

    The Schwarzschild solution of Einstein's equations of gravitation has several singularities. It is known that the singularity at r = 2Gm/c^2 is only apparent, a result of the coordinates in which the solution was found. Paradoxical results occurring near the singularity show that the coordinate system is incomplete. We introduce a simple, two-dimensional metric with an apparent singularity that makes it incomplete. By a straightforward, heuristic procedure we extend and complete this simple metric. We then use the same procedure to give a heuristic derivation of the Kruskal system of coordinates, which is known to extend the Schwarzschild manifold past its apparent singularity and produce a complete manifold
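
    For reference, a standard textbook form of the Kruskal coordinates and the extended line element for the exterior region (r > 2M, units G = c = 1) is sketched below; the notation follows common conventions rather than the paper's own.

      \[
        X = \Bigl(\frac{r}{2M}-1\Bigr)^{1/2} e^{r/4M}\cosh\frac{t}{4M},
        \qquad
        T = \Bigl(\frac{r}{2M}-1\Bigr)^{1/2} e^{r/4M}\sinh\frac{t}{4M},
      \]
      \[
        ds^{2} = \frac{32M^{3}}{r}\,e^{-r/2M}\bigl(-dT^{2}+dX^{2}\bigr) + r^{2}d\Omega^{2},
        \qquad
        T^{2}-X^{2} = \Bigl(1-\frac{r}{2M}\Bigr)e^{r/2M},
      \]

    in which nothing singular happens at r = 2M; only r = 0 remains a genuine curvature singularity.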

  4. Metric inhomogeneous Diophantine approximation in positive characteristic

    DEFF Research Database (Denmark)

    Kristensen, Simon

    2011-01-01

    We obtain asymptotic formulae for the number of solutions to systems of inhomogeneous linear Diophantine inequalities over the field of formal Laurent series with coefficients from a finite field, which are valid for almost every such system. Here `almost every' is with respect to Haar measure...... of the coefficients of the homogeneous part when the number of variables is at least two (singly metric case), and with respect to the Haar measure of all coefficients for any number of variables (doubly metric case). As consequences, we derive zero-one laws in the spirit of the Khintchine-Groshev Theorem and zero...

  5. Metric inhomogeneous Diophantine approximation in positive characteristic

    DEFF Research Database (Denmark)

    Kristensen, S.

    We obtain asymptotic formulae for the number of solutions to systems of inhomogeneous linear Diophantine inequalities over the field of formal Laurent series with coefficients from a finite field, which are valid for almost every such system. Here 'almost every' is with respect to Haar measure...... of the coefficients of the homogeneous part when the number of variables is at least two (singly metric case), and with respect to the Haar measure of all coefficients for any number of variables (doubly metric case). As consequences, we derive zero-one laws in the spirit of the Khintchine--Groshev Theorem and zero...

  6. Jacobi-Maupertuis metric and Kepler equation

    Science.gov (United States)

    Chanda, Sumanto; Gibbons, Gary William; Guha, Partha

    This paper studies the application of the Jacobi-Eisenhart lift, Jacobi metric and Maupertuis transformation to the Kepler system. We start by reviewing fundamentals and the Jacobi metric. Then we study various ways to apply the lift to Kepler-related systems: first as conformal description and Bohlin transformation of Hooke’s oscillator, second in contact geometry and third in Houri’s transformation [T. Houri, Liouville integrability of Hamiltonian systems and spacetime symmetry (2016), www.geocities.jp/football_physician/publication.html], coupled with Milnor’s construction [J. Milnor, On the geometry of the Kepler problem, Am. Math. Mon. 90 (1983) 353-365] with eccentric anomaly.
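
    For orientation, one common convention for the Jacobi(-Maupertuis) metric of a natural Hamiltonian at fixed energy E, and its Kepler specialization, is sketched below; the overall factor varies between references.

      \[
        H = \tfrac{1}{2}\,g^{ij}p_{i}p_{j} + V(q), \qquad
        ds_{J}^{2} = 2\bigl(E - V(q)\bigr)\,g_{ij}\,dq^{i}dq^{j},
      \]

    so that trajectories of energy E are (reparametrized) geodesics of ds_J^2. For the planar Kepler problem with V(r) = -k/r this becomes

      \[
        ds_{J}^{2} = 2\Bigl(E + \frac{k}{r}\Bigr)\bigl(dr^{2} + r^{2}d\varphi^{2}\bigr).
      \]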

  7. Prodigious degassing of a billion years of accumulated radiogenic helium at Yellowstone

    Science.gov (United States)

    Lowenstern, Jacob B.; Evans, William C.; Bergfeld, D.; Hunt, Andrew G.

    2014-01-01

    Helium is used as a critical tracer throughout the Earth sciences, where its relatively simple isotopic systematics is used to trace degassing from the mantle, to date groundwater and to time the rise of continents. The hydrothermal system at Yellowstone National Park is famous for its high helium-3/helium-4 isotope ratio, commonly cited as evidence for a deep mantle source for the Yellowstone hotspot. However, much of the helium emitted from this region is actually radiogenic helium-4 produced within the crust by α-decay of uranium and thorium. Here we show, by combining gas emission rates with chemistry and isotopic analyses, that crustal helium-4 emission rates from Yellowstone exceed (by orders of magnitude) any conceivable rate of generation within the crust. It seems that helium has accumulated for (at least) many hundreds of millions of years in Archaean (more than 2.5 billion years old) cratonic rocks beneath Yellowstone, only to be liberated over the past two million years by intense crustal metamorphism induced by the Yellowstone hotspot. Our results demonstrate the extremes in variability of crustal helium efflux on geologic timescales and imply crustal-scale open-system behaviour of helium in tectonically and magmatically active regions.

  8. Natural gas. The LNG trade exceeds the 100 billion m3 mark

    International Nuclear Information System (INIS)

    Anon.

    1997-01-01

    1996 was a particularly favourable year for the international natural gas industry, with a 10% increase in international trade. The worldwide commercialized production of natural gas (2,310 billion m3) was up 5% with respect to the previous year, with a strong increase in the OECD countries (+15.5%), in particular in the North Sea. High growth rates were also recorded in Latin America (9.5%) and the Middle East (8%). Natural gas production in the CIS (Commonwealth of Independent States) reached 714 Gm3 in 1996, of which 600 Gm3 came from the Russian Federation. International trade rose by 10% and reached 429.3 Gm3. LNG tanker trade increased by 10 Gm3, mainly in the Asian market (Japan and South Korea). Natural gas consumption growth was also high (+4.9%), reaching 11.6% in Europe due to climate conditions and increasing electric power demand. (J.S.)

  9. Rapid analysis of perchlorate in drinking water at parts per billion levels using microchip electrophoresis.

    Science.gov (United States)

    Gertsch, Jana C; Noblitt, Scott D; Cropek, Donald M; Henry, Charles S

    2010-05-01

    A microchip capillary electrophoresis (MCE) system has been developed for the determination of perchlorate in drinking water. The United States Environmental Protection Agency (USEPA) recently proposed a health advisory limit for perchlorate in drinking water of 15 parts per billion (ppb), a level requiring large, sophisticated instrumentation, such as ion chromatography coupled with mass spectrometry (IC-MS), for detection. An inexpensive, portable system is desired for routine online monitoring applications of perchlorate in drinking water. Here, we present an MCE method using contact conductivity detection for perchlorate determination. The method has several advantages, including reduced analysis times relative to IC, inherent portability, high selectivity, and minimal sample pretreatment. Resolution of perchlorate from more abundant ions was achieved using zwitterionic, sulfobetaine surfactants, N-hexadecyl-N,N-dimethyl-3-ammonio-1-propane sulfonate (HDAPS) and N-tetradecyl-N,N-dimethyl-3-ammonio-1-propane sulfonate (TDAPS). The system performance and the optimization of the separation chemistry, including the use of these surfactants to resolve perchlorate from other anions, are discussed in this work. The system is capable of detection limits of 3.4 +/- 1.8 ppb (n = 6) in standards and 5.6 +/- 1.7 ppb (n = 6) in drinking water.

  10. Billion-scale production of hepatocyte-like cells from human induced pluripotent stem cells.

    Science.gov (United States)

    Yamashita, Tomoki; Takayama, Kazuo; Sakurai, Fuminori; Mizuguchi, Hiroyuki

    2018-02-19

    Human induced pluripotent stem (iPS) cell-derived hepatocyte-like cells are expected to be utilized in drug screening and regenerative medicine. However, hepatocyte-like cells have not been fully used in such applications because it is difficult to produce such cells on a large scale. In this study, we tried to establish a method to mass produce hepatocyte-like cells using a three-dimensional (3D) cell culture bioreactor called the Rotary Cell Culture System (RCCS). RCCS enabled us to obtain homogenous hepatocyte-like cells on a billion scale (>10^9 cells). The gene expression levels of some hepatocyte markers (alpha-1 antitrypsin, cytochrome (CYP) 1A2, CYP2D6, and hepatocyte nuclear factor 4alpha) were higher in 3D-cultured hepatocyte-like cells than in 2D-cultured hepatocyte-like cells. This result suggests that RCCS could provide more suitable conditions for hepatocyte maturation than the conventional 2D cell culture conditions. In addition, more than 90% of hepatocyte-like cells were positive for albumin and could uptake low-density lipoprotein in the culture medium. We succeeded in the large-scale production of homogenous and functional hepatocyte-like cells from human iPS cells. This technology will be useful in drug screening and regenerative medicine, which require enormous numbers of hepatocyte-like cells. Copyright © 2018 Elsevier Inc. All rights reserved.

  11. Evidence for oxygenic photosynthesis half a billion years before the Great Oxidation Event

    Science.gov (United States)

    Planavsky, Noah J.; Asael, Dan; Hofmann, Axel; Reinhard, Christopher T.; Lalonde, Stefan V.; Knudsen, Andrew; Wang, Xiangli; Ossa Ossa, Frantz; Pecoits, Ernesto; Smith, Albertus J. B.; Beukes, Nicolas J.; Bekker, Andrey; Johnson, Thomas M.; Konhauser, Kurt O.; Lyons, Timothy W.; Rouxel, Olivier J.

    2014-04-01

    The early Earth was characterized by the absence of oxygen in the ocean-atmosphere system, in contrast to the well-oxygenated conditions that prevail today. Atmospheric concentrations first rose to appreciable levels during the Great Oxidation Event, roughly 2.5-2.3 Gyr ago. The evolution of oxygenic photosynthesis is generally accepted to have been the ultimate cause of this rise, but it has proved difficult to constrain the timing of this evolutionary innovation. The oxidation of manganese in the water column requires substantial free oxygen concentrations, and thus any indication that Mn oxides were present in ancient environments would imply that oxygenic photosynthesis was ongoing. Mn oxides are not commonly preserved in ancient rocks, but there is a large fractionation of molybdenum isotopes associated with the sorption of Mo onto the Mn oxides that would be retained. Here we report Mo isotopes from rocks of the Sinqeni Formation, Pongola Supergroup, South Africa. These rocks formed no less than 2.95 Gyr ago in a nearshore setting. The Mo isotopic signature is consistent with interaction with Mn oxides. We therefore infer that oxygen produced through oxygenic photosynthesis began to accumulate in shallow marine settings at least half a billion years before the accumulation of significant levels of atmospheric oxygen.

  12. Biologic agents in rheumatology: unmet issues after 200 trials and $200 billion sales.

    Science.gov (United States)

    Ioannidis, John P A; Karassa, Fotini B; Druyts, Eric; Thorlund, Kristian; Mills, Edward J

    2013-11-01

    Anti-TNF agents and other biologic therapies are widely prescribed for a variety of indications, with total sales that exceed $200 billion to date. In rheumatic diseases, biologic agents have now been studied in more than 200 randomized clinical trials and over 100 subsequent meta-analyses; however, the information obtained does not always meet the needs of patients and clinicians. In this Review, we discuss the current issues concerning the evidence derived from such studies: potential biases favouring positive results; a paucity of head-to-head comparisons between biologically active agents; overwhelming involvement of manufacturer sponsors in trials and in the synthesis of the evidence; the preference for trials with limited follow-up; and the potential for spurious findings on adverse events, leading to endless debates about malignancy risk. We debate the responsibilities of regulatory authorities, the pharmaceutical industry and academia in attempting to solve these shortcomings and challenges. We propose that improvements in the evidence regarding biologic treatments that are continually being added to the therapeutic armamentarium for rheumatic diseases might require revisiting the design and conduct of studies. For example, trials with long-term follow-up that are independent of the pharmaceutical industry, head-to-head comparisons of therapeutic agents and the use of rigorous clinical outcomes should be considered, and public availability of raw data endorsed.

  13. Nordic energy co-operation can save the equivalent of 4 - 10 billion USD

    International Nuclear Information System (INIS)

    Lind, Oddvar

    2000-01-01

    Better co-ordination of the energy and environment policies among the Nordic countries can be very profitable from the socio-economic point of view and can facilitate fulfilment of the Kyoto agreement. A Swedish calculation shows that up to 10 billion USD can be saved by building a trans-Nordic gas pipeline while preparing for joint implementation of the Kyoto agreement, combined with increased electricity trade, efficiency improvements and increased use of renewable energy sources. The consumption of natural gas must then triple over the next 25 years. No alternative to natural gas has the same potential for replacing coal and oil to reduce carbon dioxide emissions. The importance of natural gas is further increased by the phase-out of nuclear energy in Sweden. After 2025 the use of natural gas will decline, and by 2040 biomass, wind and solar energy will contribute as much as natural gas, that is, 250 TWh. Throughout the entire period more than half of the electricity production will be hydropower. It is presupposed that the cogeneration sector and the district heating network are substantially expanded, even in southern Norway. The Nordic energy system is quite flexible with respect to meeting future CO2 targets. Although the Nordic countries have different commitments under the Kyoto agreement, they will profit economically from acting jointly within the sum of their individual emission quotas

  14. Development of multicomponent parts-per-billion-level gas standards of volatile toxic organic compounds

    International Nuclear Information System (INIS)

    Rhoderick, G.C.; Zielinski, W.L. Jr.

    1990-01-01

    This paper reports that the demand for stable, low-concentration multicomponent standards of volatile toxic organic compounds for quantifying national and state measurements of ambient air quality and hazardous waste incineration emissions has markedly increased in recent years. In response to this demand, a microgravimetric technique was developed and validated for preparing such standards; these standards ranged in concentration from several parts per million (ppm) down to one part per billion (ppb) and in complexity from one organic compound up to 17. Studies using the gravimetric procedure to prepare mixtures of different groups of organics, including multicomponent mixtures in the 5 to 20 ppb range, revealed very low imprecision. The procedure is based on the separate gravimetric introduction of individual organics into an evacuated gas cylinder, followed by the pressurized addition of a precalculated amount of pure nitrogen. Additional studies confirmed the long-term stability of these mixtures. The uncertainty of the concentrations of the individual organics at the 95% confidence level ranged from less than 1% relative at 1 ppm to less than 10% relative at 1 ppb. Over 100 primary gravimetric standards have been developed, validated, and used for certifying the concentrations of a variety of mixtures for monitoring studies
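
    As a simple worked example of how a gravimetric preparation translates into a nominal mole-fraction concentration, the Python sketch below uses hypothetical masses for a single organic in nitrogen; it is not the procedure described in the paper and ignores purity corrections and uncertainty propagation.

      def mole_fraction_ppb(m_analyte_g, M_analyte, m_nitrogen_g, M_nitrogen=28.0134):
          """Nominal concentration (mole fraction, in ppb) of one organic compound
          added gravimetrically to nitrogen.  Masses in grams, molar masses in g/mol."""
          n_a = m_analyte_g / M_analyte
          n_n2 = m_nitrogen_g / M_nitrogen
          return 1e9 * n_a / (n_a + n_n2)

      # Hypothetical example: 1.2 mg of benzene (78.11 g/mol) in 8 kg of nitrogen
      print(round(mole_fraction_ppb(0.0012, 78.11, 8000.0), 1))  # ~53.8 ppb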

  15. A large neutral fraction of cosmic hydrogen a billion years after the Big Bang.

    Science.gov (United States)

    Wyithe, J Stuart B; Loeb, Abraham

    2004-02-26

    The fraction of ionized hydrogen left over from the Big Bang provides evidence for the time of formation of the first stars and quasar black holes in the early Universe; such objects provide the high-energy photons necessary to ionize hydrogen. Spectra of the two most distant known quasars show nearly complete absorption of photons with wavelengths shorter than the Lyman alpha transition of neutral hydrogen, indicating that hydrogen in the intergalactic medium (IGM) had not been completely ionized at a redshift of z approximately 6.3, about one billion years after the Big Bang. Here we show that the IGM surrounding these quasars had a neutral hydrogen fraction of tens of per cent before the quasar activity started, much higher than the previous lower limits of approximately 0.1 per cent. Our results, when combined with the recent inference of a large cumulative optical depth to electron scattering after cosmological recombination therefore suggest the presence of a second peak in the mean ionization history of the Universe.

  16. A Massive Galaxy in Its Core Formation Phase Three Billion Years After the Big Bang

    Science.gov (United States)

    Nelson, Erica; van Dokkum, Pieter; Franx, Marijn; Brammer, Gabriel; Momcheva, Ivelina; Schreiber, Natascha M. Forster; da Cunha, Elisabete; Tacconi, Linda; Bezanson, Rachel; Kirkpatrick, Allison

    2014-01-01

    Most massive galaxies are thought to have formed their dense stellar cores at early cosmic epochs. However, cores in their formation phase have not yet been observed. Previous studies have found galaxies with high gas velocity dispersions or small apparent sizes, but so far no objects have been identified with both the stellar structure and the gas dynamics of a forming core. Here we present a candidate core in formation 11 billion years ago, at z = 2.3. GOODS-N-774 has a stellar mass of 1.0 × 10^11 solar masses, a half-light radius of 1.0 kpc, and a star formation rate of 90 (+45/-20) solar masses per year. The star-forming gas has a velocity dispersion of 317 ± 30 km/s, among the highest ever measured. It is similar to the stellar velocity dispersions of the putative descendants of GOODS-N-774: compact quiescent galaxies at z of approximately 2 and giant elliptical galaxies in the nearby Universe. Galaxies such as GOODS-N-774 appear to be rare; however, from the star formation rate and size of the galaxy we infer that many star-forming cores may be heavily obscured, and could be missed in optical and near-infrared surveys.

  17. Providing safe drinking water to 1.2 billion unserved people

    Energy Technology Data Exchange (ETDEWEB)

    Gadgil, Ashok J.; Derby, Elisabeth A.

    2003-06-01

    Despite substantial advances in the past 100 years in public health, technology and medicine, 20% of the world population, mostly comprised of the poor population segments in developing countries (DCs), still does not have access to safe drinking water. To reach the United Nations (UN) Millennium Goal of halving the number of people without access to safe water by 2015, the global community will need to provide an additional one billion urban residents and 600 million rural residents with safe water within the next twelve years. This paper examines current water treatment measures and implementation methods for delivery of safe drinking water, and offers suggestions for making progress towards the goal of providing a timely and equitable solution for safe water provision. For water treatment, based on the serious limitations of boiling water and chlorination, we suggest an approach based on filtration coupled with ultraviolet (UV) disinfection, combined with public education. Additionally, owing to the capacity limitations for non-governmental organizations (NGOs) to take on this task primarily on their own, we suggest a strategy based on financially sustainable models that include the private sector as well as NGOs.

  18. Broadcasts for a billion: the growth of commercial television in China.

    Science.gov (United States)

    Schmuck, C

    1987-01-01

    At present, Chinese television reaches 35% of the population (80-90% in urban areas) and is used by the government as a source of education and information. In recognition of the potential market represented by 1.1 billion consumers, Western advertisers have commissioned elaborate market research studies. Drama, sports, news, and movies are consistently identified as the favorite types of programming among Chinese television viewers. About 75% of Beijing adults watch television daily, making the medium both an important target for advertising campaigns and a way for Westerners to influence Chinese business and government leaders. Western advertisers have tended to concentrate their investments in the more urban, affluent regions where products have the greatest likelihood of being sold. There has been a recent trend, however, toward industrial commercials, with British and French companies buying television time to promote their image as partners in China's modernization. Local television stations are key to the future of commercial advertising on Chinese television. In many provinces, local stations have developed a unique character and portray different sociocultural values than the national channel. Outside advertisers have sometimes experienced problems with local networks that substitute local advertising without informing the network. To correct this situation, the government is enacting pro-sponsor regulations that forbid the preemption of the national channel and its advertisements. At the same time, efforts are being made to improve relationships with local television stations by either paying them a fee or airing local commercials on the national network.

  19. Large data analysis: automatic visual personal identification in a demography of 1.2 billion persons

    Science.gov (United States)

    Daugman, John

    2014-05-01

    The largest biometric deployment in history is now underway in India, where the Government is enrolling the iris patterns (among other data) of all 1.2 billion citizens. The purpose of the Unique Identification Authority of India (UIDAI) is to ensure fair access to welfare benefits and entitlements, to reduce fraud, and to enhance social inclusion. Only a minority of Indian citizens have bank accounts; only 4 percent possess passports; and less than half of all aid money reaches its intended recipients. A person who lacks any means of establishing their identity is excluded from entitlements and does not officially exist; thus the slogan of UIDAI is: "To give the poor an identity." This ambitious program enrolls a million people every day, across 36,000 stations run by 83 agencies, with a 3-year completion target for the entire national population. The halfway point was recently passed, with more than 600 million persons now enrolled. In order to detect and prevent duplicate identities, every iris pattern that is enrolled is first compared against all others enrolled so far; thus the daily workflow now requires 600 trillion (600 million-million) iris cross-comparisons. Avoiding identity collisions (False Matches) requires high biometric entropy, and achieving the tremendous match speed requires phase bit coding. Both of these requirements are being delivered operationally by wavelet methods developed by the author for encoding and comparing iris patterns, which will be the focus of this "Large Data Award" presentation.
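
    The comparison step underlying such iris matching is a masked fractional Hamming distance between binary phase codes. The sketch below illustrates it on synthetic bit vectors; the code length, the amount of injected noise and the scores quoted in the comments are illustrative assumptions, not operational UIDAI parameters.

      import numpy as np

      def fractional_hamming_distance(code_a, code_b, mask_a, mask_b):
          """Fraction of disagreeing bits, counted only where both codes are valid."""
          valid = mask_a & mask_b
          disagree = (code_a ^ code_b) & valid
          return disagree.sum() / valid.sum()

      rng = np.random.default_rng(1)
      a = rng.integers(0, 2, 2048, dtype=np.uint8)       # synthetic iris code
      b = a.copy()
      b[rng.choice(2048, 200, replace=False)] ^= 1        # noisy re-sample of the "same eye"
      mask = np.ones(2048, dtype=np.uint8)                # all bits valid in this toy case

      print(fractional_hamming_distance(a, b, mask, mask))  # ~0.10; codes from different
                                                            # eyes cluster near 0.5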

  20. AREVA - First quarter 2011 revenue: 2.7% growth like for like to 1.979 billion euros

    International Nuclear Information System (INIS)

    2011-01-01

    The group reported consolidated revenue of 1.979 billion euros in the first quarter of 2011, representing 2.2% growth compared with the first quarter of 2010 (+2.7% like-for-like). The increase was driven by the Mining / Front End Business Group (+20.8% LFL). Revenue from outside France rose 12.0% to 1.22 billion euros and represented 62% of total revenue. The impacts of foreign exchange and changes in consolidation scope were negligible during the period. The March 11 events in Japan had no significant impact on the group's performance in the first quarter of 2011. The group's backlog of 43.5 billion euros at March 31, 2011 was stable in relation to March 31, 2010. Growth in the backlog of the Mining / Front End and Renewable Energies Business Groups offset the partial depletion of the backlog in the Reactors and Services and Back End Business Groups as contracts were completed

  1. Galaxy evolution. Evidence for mature bulges and an inside-out quenching phase 3 billion years after the Big Bang.

    Science.gov (United States)

    Tacchella, S; Carollo, C M; Renzini, A; Förster Schreiber, N M; Lang, P; Wuyts, S; Cresci, G; Dekel, A; Genzel, R; Lilly, S J; Mancini, C; Newman, S; Onodera, M; Shapley, A; Tacconi, L; Woo, J; Zamorani, G

    2015-04-17

    Most present-day galaxies with stellar masses ≥10^11 solar masses show no ongoing star formation and are dense spheroids. Ten billion years ago, similarly massive galaxies were typically forming stars at rates of hundreds of solar masses per year. It is debated how star formation ceased, on which time scales, and how this "quenching" relates to the emergence of dense spheroids. We measured stellar mass and star-formation rate surface density distributions in star-forming galaxies at redshift 2.2 with ~1-kiloparsec resolution. We find that, in the most massive galaxies, star formation is quenched from the inside out, on time scales of less than 1 billion years in the inner regions and up to a few billion years in the outer disks. These galaxies sustain high star-formation activity at large radii while hosting fully grown and already quenched bulges in their cores. Copyright © 2015, American Association for the Advancement of Science.

  2. Backlog at December 31, 2007: euro 39.8 billion, up by 55% from year-end 2006. 2007 sales revenue: euro 11.9 billion, up by 9.8% (+10.4% like-for-like)

    International Nuclear Information System (INIS)

    2008-01-01

    The AREVA group's backlog reached a record level of euro 39.834 billion as of December 31, 2007, up by 55% from year-end 2006. In Nuclear, the backlog was euro 34.927 billion at year-end 2007 (+58%), due in particular to the signature of a contract for a record amount with the Chinese utility CGNPC. The series of agreements concluded provide, among other things, for the construction of two new-generation EPR nuclear islands and the supply of all of the materials and services needed for their operation through 2027. CGNPC also bought 35% of the production of UraMin, the mining company acquired by AREVA in August 2007. Industrial cooperation in the Back End of the cycle was launched with the signature of an agreement between China and France. In addition, the group signed several long-term contracts in significant amounts, particularly with KHNP of South Korea, EDF and Japanese utilities. The Transmission and Distribution division won several major contracts in Libya and Qatar at the end of the year, approaching a total of euro 750 million. For the entire year, new orders grew by 34% to euro 5.816 billion. The backlog, meanwhile, grew by 40% to euro 4.906 billion at year-end. The group recorded sales revenue of euro 11.923 billion in 2007, up by 9.8% (+10.4% like-for-like) compared with 2006 sales of euro 10.863 billion. Sales revenue for the fourth quarter of 2007 rose to euro 3.858 billion, for growth of 16.7% (+18.8% like-for-like) over one year. Sales revenue for the year was marked by: - Growth of 7.6% (+10.6% like-for-like) in Front End sales revenue, which rose to euro 3.140 billion. The division's Enrichment operations posted strong growth. - Sales were up by 17.5% (+15.2% like-for-like) to euro 2.717 billion in the Reactors and Services division. Sales revenue was driven in particular by the growth of Services operations, after weak demand in 2006, by progress on OL3 construction, and by the start of Flamanville 3, the second EPR. For the Back End division

  3. Quantitative properties of the Schwarzschild metric

    Czech Academy of Sciences Publication Activity Database

    Křížek, Michal; Křížek, Filip

    2018-01-01

    Roč. 2018, č. 1 (2018), s. 1-10 Institutional support: RVO:67985840 Keywords : exterior and interior Schwarzschild metric * proper radius * coordinate radius Subject RIV: BA - General Mathematics OBOR OECD: Applied mathematics http://astro.shu-bg.net/pasb/index_files/Papers/2018/SCHWARZ8.pdf
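
    For context, the exterior Schwarzschild line element and the relation between coordinate radius and proper radial distance, which the keywords above refer to, can be written in standard textbook form as:

      \[
        ds^{2} = -\Bigl(1-\frac{r_{s}}{r}\Bigr)c^{2}dt^{2}
                 + \Bigl(1-\frac{r_{s}}{r}\Bigr)^{-1}dr^{2}
                 + r^{2}\bigl(d\theta^{2}+\sin^{2}\theta\,d\varphi^{2}\bigr),
        \qquad r_{s}=\frac{2GM}{c^{2}},
      \]
      \[
        \ell(r_{1},r_{2}) = \int_{r_{1}}^{r_{2}}\frac{dr}{\sqrt{1-r_{s}/r}} \; > \; r_{2}-r_{1},
      \]

    so the proper (ruler) radial distance between two spheres always exceeds the difference of their coordinate radii.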

  4. Strong Ideal Convergence in Probabilistic Metric Spaces

    Indian Academy of Sciences (India)

    In the present paper we introduce the concepts of strongly ideal convergent sequence and strong ideal Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong ideal limit points and the strong ideal cluster points of a sequence in this ...

  5. lakemorpho: Calculating lake morphometry metrics in R.

    Science.gov (United States)

    Hollister, Jeffrey; Stachelek, Joseph

    2017-01-01

    Metrics describing the shape and size of lakes, known as lake morphometry metrics, are important for any limnological study. In cases where a lake has long been the subject of study, these data have often already been collected and are openly available. Many other lakes have these data collected, but access is challenging as the data are often stored on individual computers (or worse, in filing cabinets) and are available only to the primary investigators. The vast majority of lakes fall into a third category in which the data are not available. This makes broad-scale modelling of lake ecology a challenge, as some of the key information about in-lake processes is unavailable. While this valuable in situ information may be difficult to obtain, several national datasets exist that may be used to model and estimate lake morphometry. In particular, digital elevation models and hydrography have been shown to be predictive of several lake morphometry metrics. The R package lakemorpho has been developed to utilize these data and estimate the following morphometry metrics: surface area, shoreline length, major axis length, minor axis length, major and minor axis length ratio, shoreline development, maximum depth, mean depth, volume, maximum lake length, mean lake width, maximum lake width, and fetch. In this software tool article we describe the motivation behind developing lakemorpho, discuss the implementation in R, and describe the use of lakemorpho with an example of a typical use case.
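
    One of the listed metrics, shoreline development, has a simple closed form: the shoreline length divided by the circumference of a circle with the same surface area. The Python sketch below (not the R package itself, and with invented inputs) illustrates the calculation.

      import math

      def shoreline_development(shoreline_length_m, surface_area_m2):
          """Shoreline development index: 1.0 for a perfectly circular lake,
          larger for elongated or highly indented shorelines."""
          return shoreline_length_m / (2.0 * math.sqrt(math.pi * surface_area_m2))

      # Hypothetical lake: 12.4 km of shoreline around a 3.1 km^2 surface area
      print(round(shoreline_development(12_400.0, 3_100_000.0), 2))  # ~1.99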

  6. Contraction theorems in fuzzy metric space

    International Nuclear Information System (INIS)

    Farnoosh, R.; Aghajani, A.; Azhdari, P.

    2009-01-01

    In this paper, the results on fuzzy contractive mappings proposed by Dorel Mihet will be proved for B-contractions and C-contractions in the case of George and Veeramani fuzzy metric spaces. The existence of a fixed point under weaker conditions will be proved; that is, p-convergence of a subsequence is used instead of convergence of a subsequence.

  7. Inferring feature relevances from metric learning

    DEFF Research Database (Denmark)

    Schulz, Alexander; Mokbel, Bassam; Biehl, Michael

    2015-01-01

    Powerful metric learning algorithms have been proposed in recent years which not only greatly enhance the accuracy of distance-based classifiers and nearest neighbor database retrieval, but also enable the interpretability of these operations by assigning explicit relevance weights

  8. DIGITAL MARKETING: SUCCESS METRICS, FUTURE TRENDS

    OpenAIRE

    Preeti Kaushik

    2017-01-01

    Abstract – Business marketing is one of the areas that has been tremendously affected by the digital world in the last few years. Digital marketing refers to advertising through digital channels. This paper provides a detailed study of the metrics used to measure the success of digital marketing platforms and a glimpse of the future of these technologies by 2020.

  9. Assessing Software Quality Through Visualised Cohesion Metrics

    Directory of Open Access Journals (Sweden)

    Timothy Shih

    2001-05-01

    Full Text Available Cohesion is one of the most important factors for software quality as well as maintainability, reliability and reusability. Module cohesion is defined as a quality attribute that seeks to measure the singleness of purpose of a module. A module of poor quality can be a serious obstacle to system quality. In order to achieve good software quality, software managers and engineers need to introduce cohesion metrics to measure and produce desirable software. Highly cohesive software is thought to be a desirable construction. In this paper, we propose a function-oriented cohesion metric based on the analysis of live variables, live span and the visualization of the processing-element dependency graph. We give six typical cohesion examples to be measured as our experiments and justification. A well-defined, well-normalized, well-visualized and well-tested cohesion metric is therefore proposed to indicate and thus enhance software cohesion strength. Furthermore, this cohesion metric can easily be incorporated into a software CASE tool to help software engineers improve software quality.
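
    The sketch below is only a simplified, hypothetical illustration of variable-based cohesion, scoring the fraction of statement pairs in a function that share at least one variable; it is not the live-variable/live-span metric defined in the paper.

      from itertools import combinations

      def variable_sharing_cohesion(statement_vars):
          """Toy cohesion score: fraction of statement pairs that reference at
          least one common variable.  statement_vars is a list of sets of
          variable names, one set per statement of a function."""
          pairs = list(combinations(statement_vars, 2))
          if not pairs:
              return 1.0
          shared = sum(1 for a, b in pairs if a & b)
          return shared / len(pairs)

      cohesive = [{"total", "x"}, {"total", "n"}, {"total", "n", "mean"}]
      scattered = [{"a"}, {"b"}, {"c", "d"}]
      print(variable_sharing_cohesion(cohesive))   # 1.0
      print(variable_sharing_cohesion(scattered))  # 0.0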

  10. Metric propositional neighborhood logics on natural numbers

    DEFF Research Database (Denmark)

    Bresolin, Davide; Della Monica, Dario; Goranko, Valentin

    2013-01-01

    Metric Propositional Neighborhood Logic (MPNL) over natural numbers. MPNL features two modalities referring, respectively, to an interval that is “met by” the current one and to an interval that “meets” the current one, plus an infinite set of length constraints, regarded as atomic propositions...

  11. Calabi–Yau metrics and string compactification

    Directory of Open Access Journals (Sweden)

    Michael R. Douglas

    2015-09-01

    Yau proved an existence theorem for Ricci-flat Kähler metrics in the 1970s, but we still have no closed form expressions for them. Nevertheless there are several ways to get approximate expressions, both numerical and analytical. We survey some of this work and explain how it can be used to obtain physical predictions from superstring theory.

  12. Goedel-type metrics in various dimensions

    International Nuclear Information System (INIS)

    Guerses, Metin; Karasu, Atalay; Sarioglu, Oezguer

    2005-01-01

    Goedel-type metrics are introduced and used in producing charged dust solutions in various dimensions. The key ingredient is a (D - 1)-dimensional Riemannian geometry which is then employed in constructing solutions to the Einstein-Maxwell field equations with a dust distribution in D dimensions. The only essential field equation in the procedure turns out to be the source-free Maxwell's equation in the relevant background. Similarly the geodesics of this type of metric are described by the Lorentz force equation for a charged particle in the lower dimensional geometry. It is explicitly shown with several examples that Goedel-type metrics can be used in obtaining exact solutions to various supergravity theories and in constructing spacetimes that contain both closed timelike and closed null curves and that contain neither of these. Among the solutions that can be established using non-flat backgrounds, such as the Tangherlini metrics in (D - 1)-dimensions, there exists a class which can be interpreted as describing black-hole-type objects in a Goedel-like universe
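
    For reference, the ansatz behind these constructions is commonly written in the following form, where u is a vector field and h is the metric of the (D - 1)-dimensional Riemannian background (a standard way of presenting Goedel-type metrics, paraphrased rather than quoted from the paper):

    \[
      ds^{2} = -\bigl(dt + u_{i}\,dx^{i}\bigr)^{2} + h_{ij}\,dx^{i}\,dx^{j}, \qquad i, j = 1, \dots, D-1 .
    \]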

  13. Strong Statistical Convergence in Probabilistic Metric Spaces

    OpenAIRE

    Şençimen, Celaleddin; Pehlivan, Serpil

    2008-01-01

    In this article, we introduce the concepts of strongly statistically convergent sequence and strong statistically Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong statistical limit points and the strong statistical cluster points of a sequence in this space and investigate the relations between these concepts.
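
    As background, the classical notion being strengthened here is statistical convergence, usually defined through natural density (a textbook formulation in an ordinary metric space, not taken from the article):

    \[
      x_n \xrightarrow{\ \mathrm{stat}\ } x \iff \forall \varepsilon > 0:\ \delta\bigl(\{\, n \in \mathbb{N} : d(x_n, x) \ge \varepsilon \,\}\bigr) = 0,
      \qquad \delta(A) = \lim_{n \to \infty} \frac{\bigl|\{\, k \le n : k \in A \,\}\bigr|}{n}.
    \]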

  14. Language Games: University Responses to Ranking Metrics

    Science.gov (United States)

    Heffernan, Troy A.; Heffernan, Amanda

    2018-01-01

    League tables of universities that measure performance in various ways are now commonplace, with numerous bodies providing their own rankings of how institutions throughout the world are seen to be performing on a range of metrics. This paper uses Lyotard's notion of language games to theorise that universities are regaining some power over being…

  15. A new universal colour image fidelity metric

    NARCIS (Netherlands)

    Toet, A.; Lucassen, M.P.

    2003-01-01

    We extend a recently introduced universal grayscale image quality index to a newly developed perceptually decorrelated colour space. The resulting colour image fidelity metric quantifies the distortion of a processed colour image relative to its original version. We evaluated the new colour image
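
    The grayscale index being extended is, to the best of our knowledge, the Wang-Bovik universal image quality index; the sketch below implements that underlying index globally (in practice it is applied in sliding windows, and the colour-space extension described in the record is not reproduced here).

    import numpy as np

    def universal_quality_index(x, y):
        """Universal image quality index Q between two grayscale images.

        Q = 4*cov(x, y)*mean(x)*mean(y) / ((var(x)+var(y))*(mean(x)^2+mean(y)^2));
        Q = 1 only when the two images are identical.
        """
        x = np.asarray(x, dtype=float).ravel()
        y = np.asarray(y, dtype=float).ravel()
        mx, my = x.mean(), y.mean()
        vx, vy = x.var(), y.var()
        cxy = ((x - mx) * (y - my)).mean()
        return 4 * cxy * mx * my / ((vx + vy) * (mx**2 + my**2))

    rng = np.random.default_rng(0)
    original = rng.uniform(0, 255, size=(64, 64))
    noisy = original + rng.normal(0, 10, size=original.shape)
    print(universal_quality_index(original, original))  # 1.0
    print(universal_quality_index(original, noisy))     # < 1.0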

  16. Standardised metrics for global surgical surveillance.

    Science.gov (United States)

    Weiser, Thomas G; Makary, Martin A; Haynes, Alex B; Dziekan, Gerald; Berry, William R; Gawande, Atul A

    2009-09-26

    Public health surveillance relies on standardised metrics to evaluate disease burden and health system performance. Such metrics have not been developed for surgical services despite increasing volume, substantial cost, and high rates of death and disability associated with surgery. The Safe Surgery Saves Lives initiative of WHO's Patient Safety Programme has developed standardised public health metrics for surgical care that are applicable worldwide. We assembled an international panel of experts to develop and define metrics for measuring the magnitude and effect of surgical care in a population, while taking into account economic feasibility and practicability. This panel recommended six measures for assessing surgical services at a national level: number of operating rooms, number of operations, number of accredited surgeons, number of accredited anaesthesia professionals, day-of-surgery death ratio, and postoperative in-hospital death ratio. We assessed the feasibility of gathering such statistics at eight diverse hospitals in eight countries and incorporated them into the WHO Guidelines for Safe Surgery, in which methods for data collection, analysis, and reporting are outlined.

  17. A Lagrangian-dependent metric space

    International Nuclear Information System (INIS)

    El-Tahir, A.

    1989-08-01

    A generalized Lagrangian-dependent metric of the static isotropic spacetime is derived. Its behaviour should be governed by imposing physical constraints that allow one to avert the pathological features of gravity in the strong-field domain. This would restrict the choice of the Lagrangian form. (author). 10 refs

  18. Clean Cities 2011 Annual Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, C.

    2012-12-01

    This report details the petroleum savings and vehicle emissions reductions achieved by the U.S. Department of Energy's Clean Cities program in 2011. The report also details other performance metrics, including the number of stakeholders in Clean Cities coalitions, outreach activities by coalitions and national laboratories, and alternative fuel vehicles deployed.

  19. Clean Cities 2010 Annual Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, C.

    2012-10-01

    This report details the petroleum savings and vehicle emissions reductions achieved by the U.S. Department of Energy's Clean Cities program in 2010. The report also details other performance metrics, including the number of stakeholders in Clean Cities coalitions, outreach activities by coalitions and national laboratories, and alternative fuel vehicles deployed.

  20. Genetic basis of a cognitive complexity metric

    NARCIS (Netherlands)

    Hansell, Narelle K; Halford, Graeme S; Andrews, Glenda; Shum, David H K; Harris, Sarah E; Davies, Gail; Franic, Sanja; Christoforou, Andrea; Zietsch, Brendan; Painter, Jodie; Medland, Sarah E; Ehli, Erik A; Davies, Gareth E; Steen, Vidar M; Lundervold, Astri J; Reinvang, Ivar; Montgomery, Grant W; Espeseth, Thomas; Hulshoff Pol, Hilleke E; Starr, John M; Martin, Nicholas G; Le Hellard, Stephanie; Boomsma, Dorret I; Deary, Ian J; Wright, Margaret J

    2015-01-01

    Relational complexity (RC) is a metric reflecting capacity limitation in relational processing. It plays a crucial role in higher cognitive processes and is an endophenotype for several disorders. However, the genetic underpinnings of complex relational processing have not been investigated. Using

  1. Genetic Basis of a Cognitive Complexity Metric

    NARCIS (Netherlands)

    Hansell, N.K.; Halford, G.S.; Andrews, G.; Shum, D.H.K.; Harris, S.E.; Davies, G.; Franic, S.; Christoforou, A.; Zietsch, B.; Painter, J.; Medland, S.E.; Ehli, E.A.; Davies, G.E.; Steen, V.M.; Lundervold, A.J.; Reinvang, I.; Montgomery, G.W.; Espeseth, T.; Hulshoff Pol, H.E.; Starr, J.M.; Martin, N.G.; Le Hellard, S.; Boomsma, D.I.; Deary, I.J.; Wright, M.J.

    2015-01-01

    Relational complexity (RC) is a metric reflecting capacity limitation in relational processing. It plays a crucial role in higher cognitive processes and is an endophenotype for several disorders. However, the genetic underpinnings of complex relational processing have not been investigated. Using

  2. Business model metrics : An open repository

    NARCIS (Netherlands)

    Heikkila, M.; Bouwman, W.A.G.A.; Heikkila, J.; Solaimani, S.; Janssen, W.

    2015-01-01

    Development of successful business models has become a necessity in turbulent business environments, but compared to research on business modeling tools, attention to the role of metrics in designing business models in literature is limited. Building on existing approaches to business models and

  3. Software quality metrics aggregation in industry

    NARCIS (Netherlands)

    Mordal, K.; Anquetil, N.; Laval, J.; Serebrenik, A.; Vasilescu, B.N.; Ducasse, S.

    2013-01-01

    With the growing need for quality assessment of entire software systems in the industry, new issues are emerging. First, because most software quality metrics are defined at the level of individual software components, there is a need for aggregation methods to summarize the results at the system

  4. Invariance group of the Finsler metric function

    International Nuclear Information System (INIS)

    Asanov, G.S.

    1985-01-01

    An invariance group of the Finsler metric function is introduced and studied that directly generalizes the respective concept (a group of Euclidean rotations) of Riemann geometry. A sequential description of the isotopic invariance of physical fields on the basis of Finsler geometry is possible in terms of this group

  5. Sigma Routing Metric for RPL Protocol

    Directory of Open Access Journals (Sweden)

    Paul Sanmartin

    2018-04-01

    This paper presents the adaptation of a specific metric for the RPL protocol in the objective function MRHOF. Among the functions standardized by the IETF, we find OF0, which is based on the minimum hop count, as well as MRHOF, which is based on the Expected Transmission Count (ETX). However, when the network becomes denser or the number of nodes increases, both OF0 and MRHOF introduce long hops, which can generate a bottleneck that restricts the network. The adaptation is proposed to optimize both OFs through a new routing metric. To solve the above problem, the metrics of the minimum number of hops and the ETX are combined by designing a new routing metric called SIGMA-ETX, in which the best route is calculated using the standard deviation of the ETX values between each node, as opposed to working with the average ETX along the route. This method ensures better routing performance in dense sensor networks. The simulations are done through the Cooja simulator, based on the Contiki operating system. The simulations showed that the proposed optimization outperforms both OF0 and MRHOF by a high margin in terms of network latency, packet delivery ratio, lifetime, and power consumption.
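
    A minimal sketch of the route-selection idea described above, based on our reading of the abstract (names are ours): rank candidate routes by the standard deviation of their per-hop ETX values rather than by the mean, so that routes containing one very poor link are penalised even when their mean ETX looks good.

    import statistics

    def sigma_etx(route_etx):
        """SIGMA-ETX-style score: standard deviation of per-hop ETX values."""
        return statistics.pstdev(route_etx)

    def mean_etx(route_etx):
        """Conventional MRHOF-style score: mean per-hop ETX."""
        return statistics.fmean(route_etx)

    # Two hypothetical candidate routes with the same mean ETX (2.0):
    balanced = [2.0, 2.0, 2.0, 2.0]   # uniform links
    lopsided = [1.0, 1.0, 1.0, 5.0]   # one very poor link (potential bottleneck)

    print(mean_etx(balanced), mean_etx(lopsided))    # 2.0 2.0  -> tie under mean ETX
    print(sigma_etx(balanced), sigma_etx(lopsided))  # 0.0 1.73 -> prefer the balanced route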

  6. Tendances Carbone no. 75 'The CDM: let's not discard a tool that raised over US$ 200 billion'

    International Nuclear Information System (INIS)

    Shishlov, Igor

    2012-01-01

    Among the publications of CDC Climat Research, 'Tendances Carbone' bulletin specifically studies the developments of the European market for CO2 allowances. This issue addresses the following points: Everyone wonders which miraculous instrument will enable the Green Climate Fund to mobilize the pledged US$100 billion per year in climate finance by 2020. Developing countries are now asking for interim targets to quench their mounting skepticism that this level of commitment can be reached. In the meantime paradoxically, the Clean Development Mechanism (CDM) - a tool that managed to leverage over US$200 billion of mostly private investment for climate change mitigation - is left dying without much regret

  7. Measured performance of a 3 ton LiBr absorption water chiller and its effect on cooling system operation

    Science.gov (United States)

    Namkoong, D.

    1976-01-01

    A three ton lithium bromide absorption water chiller was tested for a number of conditions involving hot water input, chilled water, and the cooling water. The primary influences on chiller capacity were the hot water inlet temperature and the cooling water inlet temperature. One combination of these two parameters extended the output to as much as 125% of design capacity, but no combination could lower the capacity to below 60% of design. A cooling system was conceptually designed so that it could provide several modes of operation. Such flexibility is needed for any solar cooling system to be able to accommodate the varying solar energy collection and the varying building demand. It was concluded that a three-ton absorption water chiller with the kind of performance that was measured can be incorporated into a cooling system such as that proposed, to provide efficient cooling over the specified ranges of operating conditions.

  8. Measured performance of a 3-ton LiBr absorption water chiller and its effect on cooling system operation

    Science.gov (United States)

    Namkoong, D.

    1976-01-01

    A 3-ton lithium bromide absorption water chiller was tested for a number of conditions involving hot-water input, chilled water, and the cooling water. The primary influences on chiller capacity were the hot water inlet temperature and the cooling water inlet temperature. One combination of these two parameters extended the output to as much as 125% of design capacity, but no combination could lower the capacity to below 60% of design. A cooling system was conceptually designed so that it could provide several modes of operation. Such flexibility is needed for any solar cooling system to be able to accommodate the varying solar energy collection and the varying building demand. It is concluded that a 3-ton absorption water chiller with the kind of performance that was measured can be incorporated into a cooling system such as that proposed, to provide efficient cooling over the specified ranges of operating conditions.

  9. Fluoresceination of FepA during colicin B killing: effects of temperature, toxin and TonB.

    Science.gov (United States)

    Smallwood, Chuck R; Marco, Amparo Gala; Xiao, Qiaobin; Trinh, Vy; Newton, Salete M C; Klebba, Phillip E

    2009-06-01

    We studied the reactivity of 35 genetically engineered Cys sulphydryl groups at different locations in Escherichia coli FepA. Modification of surface loop residues by fluorescein maleimide (FM) was strongly temperature-dependent in vivo, whereas reactivity at other sites was much less affected. Control reactions with bovine serum albumin showed that the temperature dependence of loop residue reactivity was unusually high, indicating that conformational changes in multiple loops (L2, L3, L4, L5, L7, L8, L10) transform the receptor to a more accessible form at 37 degrees C. At 0 degrees C colicin B binding impaired or blocked labelling at 8 of 10 surface loop sites, presumably by steric hindrance. Overall, colicin B adsorption decreased the reactivity of more than half of the 35 sites, in both the N- and C- domains of FepA. However, colicin B penetration into the cell at 37 degrees C did not augment the chemical modification of any residues in FepA. The FM modification patterns were similarly unaffected by the tonB locus. FepA was expressed at lower levels in a tonB host strain, but when we accounted for this decrease its FM labelling was comparable whether TonB was present or absent. Thus we did not detect TonB-dependent structural changes in FepA, either alone or when it interacted with colicin B at 37 degrees C. The only changes in chemical modification were reductions from steric hindrance when the bacteriocin bound to the receptor protein. The absence of increases in the reactivity of N-domain residues argues against the idea that the colicin B polypeptide traverses the FepA channel.

  10. Analysis of precious metals at parts-per-billion levels in industrial applications

    International Nuclear Information System (INIS)

    Tickner, James; O'Dwyer, Joel; Roach, Greg; Smith, Michael; Van Haarlem, Yves

    2015-01-01

    Precious metals, including gold and the platinum group metals (notably Pt, Pd and Rh), are mined commercially at concentrations of a few parts-per-million and below. Mining and processing operations demand sensitive and rapid analysis at concentrations down to about 100 parts-per-billion (ppb). In this paper, we discuss two technologies being developed to meet this challenge: X-ray fluorescence (XRF) and gamma-activation analysis (GAA). We have designed on-stream XRF analysers capable of measuring targeted elements in slurries with precisions in the 35–70 ppb range. For the past two years, two on-stream analysers have been in continuous operation at a precious metals concentrator plant. The simultaneous measurement of feed and waste stream grades provides real-time information on metal recovery, allowing changes in operating conditions and plant upsets to be detected and corrected more rapidly. Separately, we have been developing GAA for the measurement of gold as a replacement for the traditional laboratory fire-assay process. High-energy Bremsstrahlung X-rays are used to excite gold via the 197Au(γ,γ′)197mAu reaction, and the gamma-rays released in the decay of the metastable state are then counted. We report on work to significantly improve accuracy and detection limits. - Highlights: • X-ray fluorescence analysis at sub-parts-per-million concentration in bulk materials. • Gamma activation analysis of gold at high accuracy and low concentrations. • Use of advanced Monte Carlo techniques to optimise radiation-based analysers. • Industrial application of XRF and GAA technologies for minerals processing.

  11. With billions more on earth we should not back the wheel

    International Nuclear Information System (INIS)

    Blix, H.

    1999-01-01

    In a lecture given in Nice, Dr. Hans Blix, former Director General of the Vienna International Atomic Energy Agency (IAEA), warned against turning back the wheel of energy policy development. Blix sees major deficits in information among a number of politicians bearing responsibility in important European states. This was evident not only from the opt-out plans in Sweden and Germany, but also from the request to delete nuclear power from the catalog of safe and sustainable energy sources, which had been rejected in the European Parliament by a very slim majority. Actually, there were indications of the public acceptance of nuclear power being broader - also with a view to problems of climate - than was evident from statements by politicians. Blix accused politicians of being opportunists playing to antinuclear voters. A factor influencing this situation could be the fact that the true consequences of opting out of the use of nuclear power, including higher carbon dioxide emissions, always became apparent long after the next election date. Blix asked what the nuclear industry could do to convince the public that nuclear power, compared with other forms of energy, should be classified as a sustainable energy source not only in terms of its price, but also with respect to its environmental impact. In this connection, the former Director General of IAEA mentioned five areas in which the nuclear industry was required to make progress on its own. These included the need for still more safety in all areas of nuclear technology worldwide, from exploration in the uranium mines to the storage of nuclear waste. In particular, those responsible should seek to attain public acceptance of the need for nuclear waste management and storage strategies as soon as possible. Ultimately, this required nuclear organizations to seek a direct dialog with the public. Blix concluded with the statement that it was simply unconvincing to venture into a future for billions of people on earth

  12. No Photon Left Behind: How Billions of Spectral Lines are Transforming Planetary Sciences

    Science.gov (United States)

    Villanueva, Geronimo L.

    2014-06-01

    With the advent of realistic potential energy surface (PES) and dipole moment surface (DMS) descriptions, theoretically computed linelists can now synthesize accurate spectral parameters for billions of spectral lines sampling the untamed high-energy molecular domain. Although the initial driver for these databases was the characterization of stellar spectra, in combination with decades of precise experimental studies (nicely compiled in community databases such as HITRAN and GEISA) they are leading to unprecedented precision in the characterization of planetary atmospheres. Cometary sciences are among the most affected by this spectroscopic revolution. Even though comets are relatively cold bodies (T ~ 100 K), their infrared molecular emission is mainly defined by non-LTE solar fluorescence induced by a high-energy source (Sun, T ~ 5600 K). In order to interpret high-resolution spectra of comets acquired with extremely powerful telescopes (e.g., Keck, VLT, NASA-IRTF), we have developed advanced non-LTE fluorescence models that integrate the high-energy dynamic range of ab-initio databases (e.g., BT2, VTT, HPT2, BYTe, TROVE) and the precision of laboratory and semi-empirical compilations (e.g., HITRAN, GEISA, CDMS, WKMC, SELP, IUPAC). These new models allow us to calculate realistic non-LTE pumps, cascades, branching-ratios, and emission rates for a broad range of excitation regimes for H2O, HDO, HCN, HNC and NH3. We have applied elements of these compilations to the study of Mars spectra, and we are now exploring their application to modeling non-LTE emission in exoplanets. In this presentation, we present applications of these advanced models to interpret high-resolution spectra of comets, Mars and exoplanets.

  13. Assessment of Reusing 14-Ton, Thin-Wall, Depleted UF6 Cylinders as LLW Disposal Containers

    International Nuclear Information System (INIS)

    O'Connor, D.G.; Poole, A.B.; Shelton, J.H.

    2000-01-01

    of 48 inches and nominally contain 14 tons (12.7 MT) of DUF6, were originally designed and fabricated for temporary storage of DUF6. They were fabricated from pressure-vessel-grade steels according to the provisions of the ASME Boiler and Pressure Vessel Code. Cylinders are stored in open yards at the three sites and, due to historical storage techniques, were subject to corrosion. Roughly 10,000 of the 14TTW cylinders are considered substandard due to corrosion and other structural anomalies caused by mishandling. This means that approximately 40,000 14TTW cylinders could be made available as containers for LLW disposal. In order to demonstrate the use of 14TTW cylinders as LLW disposal containers, several qualifying tasks need to be performed. Two demonstrations are being considered using 14TTW cylinders--one demonstration using contaminated soil and one demonstration using U3O8. The objectives of this report are to determine how much information is known that could be used to support the demonstrations, and how much additional work will need to be done in order to conduct the demonstrations. Information associated with the following four qualifying tasks is evaluated in this report

  14. Reheating experiment in the 35-ton pile; Experience de rechauffage sur la pile de 35 tonnes

    Energy Technology Data Exchange (ETDEWEB)

    Cherot, J; Girard, Y [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1957-07-01

    When the 35-ton pile was started up it was necessary for us, in order to study certain effects (xenon, for example), to know the anti-reactivity value of the rods as a function of their dimensions. We have made use of the possibility, in the reheating experiment, of raising the temperature of the graphite-uranium block by simple heating, in order to determine the anti-reactivity curves of the rods, and from that the overall temperature coefficient. For the latter we have considered two solutions: a first in which the average temperature of the pile is defined as the arithmetical mean of the different values given by the 28 thermocouples distributed throughout the pile; a second in which the temperature is likened to a poisoning and is weighted by the square of the flux. The way in which the measurements have been made is indicated, and the different instruments used are described. The method of reheating does not permit the separation of the temperature coefficients of uranium and of graphite. The precision obtained is only moderate; it suffers from the changes of various parameters required by other manipulations carried out simultaneously (lifetime modulators, for example), and it is ultimately a function of the comparatively restricted time allowed. It is evident, of course, that more careful stabilisation at the different plateaux chosen would have necessitated long periods of reheating. (author)

  15. Observable traces of non-metricity: New constraints on metric-affine gravity

    Science.gov (United States)

    Delhom-Latorre, Adrià; Olmo, Gonzalo J.; Ronco, Michele

    2018-05-01

    Relaxing the Riemannian condition to incorporate geometric quantities such as torsion and non-metricity may allow us to explore new physics associated with defects in a hypothetical space-time microstructure. Here we show that non-metricity produces observable effects in quantum fields in the form of 4-fermion contact interactions, thereby allowing us to constrain the scale of non-metricity to be greater than 1 TeV by using results on Bhabha scattering. Our analysis is carried out in the framework of a wide class of theories of gravity in the metric-affine approach. The bound obtained represents an improvement of several orders of magnitude over previous experimental constraints.
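
    Schematically, the kind of operator constrained by such an analysis is a dimension-six four-fermion contact term suppressed by the non-metricity scale M; the form below is our shorthand rather than the paper's exact interaction, and the 1 TeV figure is the bound quoted in the abstract.

    \[
      \mathcal{L}_{\mathrm{int}} \sim \frac{1}{M^{2}}\,\bigl(\bar{\psi}\gamma_{\mu}\psi\bigr)\bigl(\bar{\psi}\gamma^{\mu}\psi\bigr),
      \qquad \text{Bhabha-scattering data} \ \Rightarrow\ M \gtrsim 1~\mathrm{TeV}.
    \]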

  16. Conformal and related changes of metric on the product of two almost contact metric manifolds.

    OpenAIRE

    Blair, D. E.

    1990-01-01

    This paper studies conformal and related changes of the product metric on the product of two almost contact metric manifolds. It is shown that if one factor is Sasakian, the other is not, but that locally the second factor is of the type studied by Kenmotsu. The results are more general and given in terms of trans-Sasakian, α-Sasakian and β-Kenmotsu structures.

  17. Constraint on a Varying Proton-Electron Mass Ratio 1.5 Billion Years after the Big Bang

    NARCIS (Netherlands)

    Bagdonaite, J.; Ubachs, W.M.G.; Murphy, M.T.; Withmore, J.B.

    2015-01-01

    A molecular hydrogen absorber at a lookback time of 12.4 billion years, corresponding to 10% of the age of the Universe today, is analyzed to put a constraint on a varying proton-electron mass ratio, μ. A high resolution spectrum of the J1443+2724 quasar, which was observed with the Very Large

  18. Rapid emergence of subaerial landmasses and onset of a modern hydrologic cycle 2.5 billion years ago.

    Science.gov (United States)

    Bindeman, I N; Zakharov, D O; Palandri, J; Greber, N D; Dauphas, N; Retallack, G J; Hofmann, A; Lackey, J S; Bekker, A

    2018-05-01

    The history of the growth of continental crust is uncertain, and several different models that involve a gradual, decelerating, or stepwise process have been proposed [1-4]. Even more uncertain is the timing and the secular trend of the emergence of most landmasses above the sea (subaerial landmasses), with estimates ranging from about one billion to three billion years ago [5-7]. The area of emerged crust influences global climate feedbacks and the supply of nutrients to the oceans [8], and therefore connects Earth's crustal evolution to surface environmental conditions [9-11]. Here we use the triple-oxygen-isotope composition of shales from all continents, spanning 3.7 billion years, to provide constraints on the emergence of continents over time. Our measurements show a stepwise total decrease of 0.08 per mille in the average triple-oxygen-isotope value of shales across the Archaean-Proterozoic boundary. We suggest that our data are best explained by a shift in the nature of water-rock interactions, from near-coastal in the Archaean era to predominantly continental in the Proterozoic, accompanied by a decrease in average surface temperatures. We propose that this shift may have coincided with the onset of a modern hydrological cycle owing to the rapid emergence of continental crust with near-modern average elevation and aerial extent roughly 2.5 billion years ago.
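
    For reference, triple-oxygen-isotope compositions of this kind are conventionally reported with the linearised delta notation and the deviation Δ'17O from a reference fractionation line of slope λ; these are the standard definitions in the field, and the specific λ used in the paper is not reproduced here.

    \[
      \Delta'^{17}\mathrm{O} = \delta'^{17}\mathrm{O} - \lambda\,\delta'^{18}\mathrm{O},
      \qquad \delta'^{x}\mathrm{O} = 10^{3}\,\ln\!\bigl(\delta^{x}\mathrm{O}/10^{3} + 1\bigr).
    \]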

  19. Tonometer calibration in Brasília, Brazil Calibragem de tonômetros em Brasília, Brasil

    Directory of Open Access Journals (Sweden)

    Fernanda Pires da Silva Abrão

    2009-06-01

    PURPOSE: To determine calibration errors of Goldmann applanation tonometers in ophthalmic clinics of Brasília, Brazil, and correlate the findings with variables related to tonometer model and utilization. METHODS: Tonometers from ophthalmic clinics in Brasília, Brazil, were checked for calibration errors. A standard Goldmann applanation tonometer checking tool was used to assess the calibration error. Only one trained individual made all verifications, with a masked reading of the results. Data on the model, age, daily use, frequency of calibration checking and the nature of the ophthalmic department - private or public - were collected and correlated with the observed errors. RESULTS: One hundred tonometers were checked for calibration. Forty-seven percent (47/100) were outside the 1 mmHg range at at least one checking point. Tonometers mounted on a slit lamp, less than 5 years old, used on fewer than 20 patients daily, checked for calibration on a yearly basis, and those from private offices exhibited a lower rate of inaccuracy, but only the first variable was statistically significant. Sixty-one percent of tonometers in public hospitals were out of calibration. CONCLUSION: Calibration of tonometers in the capital of Brazil is poor; those from general hospitals are the worst, and this fact can lead to inaccurate detection and assessment of glaucoma patients, above all in the population under government assistance.

  20. Strongly baryon-dominated disk galaxies at the peak of galaxy formation ten billion years ago.

    Science.gov (United States)

    Genzel, R; Schreiber, N M Förster; Übler, H; Lang, P; Naab, T; Bender, R; Tacconi, L J; Wisnioski, E; Wuyts, S; Alexander, T; Beifiori, A; Belli, S; Brammer, G; Burkert, A; Carollo, C M; Chan, J; Davies, R; Fossati, M; Galametz, A; Genel, S; Gerhard, O; Lutz, D; Mendel, J T; Momcheva, I; Nelson, E J; Renzini, A; Saglia, R; Sternberg, A; Tacchella, S; Tadaki, K; Wilman, D

    2017-03-15

    In the cold dark matter cosmology, the baryonic components of galaxies - stars and gas - are thought to be mixed with and embedded in non-baryonic and non-relativistic dark matter, which dominates the total mass of the galaxy and its dark-matter halo. In the local (low-redshift) Universe, the mass of dark matter within a galactic disk increases with disk radius, becoming appreciable and then dominant in the outer, baryonic regions of the disks of star-forming galaxies. This results in rotation velocities of the visible matter within the disk that are constant or increasing with disk radius - a hallmark of the dark-matter model. Comparisons between the dynamical mass, inferred from these velocities in rotational equilibrium, and the sum of the stellar and cold-gas mass at the peak epoch of galaxy formation ten billion years ago, inferred from ancillary data, suggest high baryon fractions in the inner, star-forming regions of the disks. Although this implied baryon fraction may be larger than in the local Universe, the systematic uncertainties (owing to the chosen stellar initial-mass function and the calibration of gas masses) render such comparisons inconclusive in terms of the mass of dark matter. Here we report rotation curves (showing rotation velocity as a function of disk radius) for the outer disks of six massive star-forming galaxies, and find that the rotation velocities are not constant, but decrease with radius. We propose that this trend arises because of a combination of two main factors: first, a large fraction of the massive high-redshift galaxy population was strongly baryon-dominated, with dark matter playing a smaller part than in the local Universe; and second, the large velocity dispersion in high-redshift disks introduces a substantial pressure term that leads to a decrease in rotation velocity with increasing radius. The effect of both factors appears to increase with redshift. Qualitatively, the observations suggest that baryons in the early (high

  1. Four billion years of ophiolites reveal secular trends in oceanic crust formation

    Directory of Open Access Journals (Sweden)

    Harald Furnes

    2014-07-01

    We combine a geological, geochemical and tectonic dataset from 118 ophiolite complexes of the major global Phanerozoic orogenic belts with similar datasets of ophiolites from 111 Precambrian greenstone belts to construct an overview of oceanic crust generation over 4 billion years. Geochemical discrimination systematics built on immobile trace elements reveal that the basaltic units of the Phanerozoic ophiolites are dominantly subduction-related (75%), linked to backarc processes and characterized by a strong MORB component, similar to ophiolites in Precambrian greenstone sequences (85%). The remaining 25% of Phanerozoic subduction-unrelated ophiolites are mainly (74%) of Mid-Ocean-Ridge (MORB) type, in contrast to the equal proportion of Rift/Continental Margin, Plume, and MORB type ophiolites in the Precambrian greenstone belts. Throughout the Phanerozoic there are large geochemical variations in major and trace elements, but for average element values calculated in 5 bins of 100-million-year intervals there are no obvious secular trends. By contrast, basaltic units in the ophiolites of the Precambrian greenstones (calculated in 12 bins of 250-million-year intervals), starting in the late Paleo- to early Mesoproterozoic (ca. 2.0–1.8 Ga), exhibit an apparent decrease in the average values of incompatible elements such as Ti, P, Zr, Y and Nb, and an increase in the compatible elements Ni and Cr with deeper time to the end of the Archean and into the Hadean. These changes can be attributed to decreasing degrees of partial melting of the upper mantle from the Hadean/Archean to the Present. The onset of geochemical changes coincides with the timing of detectable changes in the structural architecture of the ophiolites, such as greater volumes of gabbro and more common sheeted dyke complexes, and lesser occurrences of ocelli (varioles) in the pillow lavas in ophiolites younger than 2 Ga. The global data from the Precambrian ophiolites, representative of nearly 50

  2. Subsampled open-reference clustering creates consistent, comprehensive OTU definitions and scales to billions of sequences.

    Science.gov (United States)

    Rideout, Jai Ram; He, Yan; Navas-Molina, Jose A; Walters, William A; Ursell, Luke K; Gibbons, Sean M; Chase, John; McDonald, Daniel; Gonzalez, Antonio; Robbins-Pianka, Adam; Clemente, Jose C; Gilbert, Jack A; Huse, Susan M; Zhou, Hong-Wei; Knight, Rob; Caporaso, J Gregory

    2014-01-01

    We present a performance-optimized algorithm, subsampled open-reference OTU picking, for assigning marker gene (e.g., 16S rRNA) sequences generated on next-generation sequencing platforms to operational taxonomic units (OTUs) for microbial community analysis. This algorithm provides benefits over de novo OTU picking (clustering can be performed largely in parallel, reducing runtime) and closed-reference OTU picking (all reads are clustered, not only those that match a reference database sequence with high similarity). Because more of our algorithm can be run in parallel relative to "classic" open-reference OTU picking, it makes open-reference OTU picking tractable on massive amplicon sequence data sets (though on smaller data sets, "classic" open-reference OTU clustering is often faster). We illustrate that here by applying it to the first 15,000 samples sequenced for the Earth Microbiome Project (1.3 billion V4 16S rRNA amplicons). To the best of our knowledge, this is the largest OTU picking run ever performed, and we estimate that our new algorithm runs in less than 1/5 the time than would be required of "classic" open reference OTU picking. We show that subsampled open-reference OTU picking yields results that are highly correlated with those generated by "classic" open-reference OTU picking through comparisons on three well-studied datasets. An implementation of this algorithm is provided in the popular QIIME software package, which uses uclust for read clustering. All analyses were performed using QIIME's uclust wrappers, though we provide details (aided by the open-source code in our GitHub repository) that will allow implementation of subsampled open-reference OTU picking independently of QIIME (e.g., in a compiled programming language, where runtimes should be further reduced). Our analyses should generalize to other implementations of these OTU picking algorithms. Finally, we present a comparison of parameter settings in QIIME's OTU picking workflows and

  3. Subsampled open-reference clustering creates consistent, comprehensive OTU definitions and scales to billions of sequences

    Directory of Open Access Journals (Sweden)

    Jai Ram Rideout

    2014-08-01

    We present a performance-optimized algorithm, subsampled open-reference OTU picking, for assigning marker gene (e.g., 16S rRNA) sequences generated on next-generation sequencing platforms to operational taxonomic units (OTUs) for microbial community analysis. This algorithm provides benefits over de novo OTU picking (clustering can be performed largely in parallel, reducing runtime) and closed-reference OTU picking (all reads are clustered, not only those that match a reference database sequence with high similarity). Because more of our algorithm can be run in parallel relative to “classic” open-reference OTU picking, it makes open-reference OTU picking tractable on massive amplicon sequence data sets (though on smaller data sets, “classic” open-reference OTU clustering is often faster). We illustrate that here by applying it to the first 15,000 samples sequenced for the Earth Microbiome Project (1.3 billion V4 16S rRNA amplicons). To the best of our knowledge, this is the largest OTU picking run ever performed, and we estimate that our new algorithm runs in less than 1/5 the time than would be required of “classic” open reference OTU picking. We show that subsampled open-reference OTU picking yields results that are highly correlated with those generated by “classic” open-reference OTU picking through comparisons on three well-studied datasets. An implementation of this algorithm is provided in the popular QIIME software package, which uses uclust for read clustering. All analyses were performed using QIIME’s uclust wrappers, though we provide details (aided by the open-source code in our GitHub repository) that will allow implementation of subsampled open-reference OTU picking independently of QIIME (e.g., in a compiled programming language, where runtimes should be further reduced). Our analyses should generalize to other implementations of these OTU picking algorithms. Finally, we present a comparison of parameter settings in

  4. Metrics for measuring distances in configuration spaces

    International Nuclear Information System (INIS)

    Sadeghi, Ali; Ghasemi, S. Alireza; Schaefer, Bastian; Mohr, Stephan; Goedecker, Stefan; Lill, Markus A.

    2013-01-01

    In order to characterize molecular structures we introduce configurational fingerprint vectors which are counterparts of quantities used experimentally to identify structures. The Euclidean distance between the configurational fingerprint vectors satisfies the properties of a metric and can therefore safely be used to measure dissimilarities between configurations in the high dimensional configuration space. In particular we show that these metrics are a perfect and computationally cheap replacement for the root-mean-square distance (RMSD) when one has to decide whether two noise contaminated configurations are identical or not. We introduce a Monte Carlo approach to obtain the global minimum of the RMSD between configurations, which is obtained from a global minimization over all translations, rotations, and permutations of atomic indices
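
    A toy illustration of the idea of a permutation-invariant fingerprint (a generic descriptor, not the specific fingerprint construction of the paper): build the vector of sorted eigenvalues of the interatomic distance matrix, so that relabelling atoms leaves the fingerprint, and hence the Euclidean distance between fingerprints, unchanged.

    import numpy as np

    def fingerprint(positions):
        """Toy configurational fingerprint: sorted eigenvalues of the
        pairwise-distance matrix (invariant under translation, rotation
        and permutation of atomic indices)."""
        pos = np.asarray(positions, dtype=float)
        dist = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
        return np.sort(np.linalg.eigvalsh(dist))

    def configuration_distance(a, b):
        """Euclidean distance between fingerprints as a dissimilarity measure."""
        return float(np.linalg.norm(fingerprint(a) - fingerprint(b)))

    square = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
    permuted = [square[i] for i in (2, 0, 3, 1)]            # same structure, relabelled
    distorted = [(0, 0, 0), (1.1, 0, 0), (1, 1, 0), (0, 0.9, 0)]

    print(configuration_distance(square, permuted))   # ~0: identical structures
    print(configuration_distance(square, distorted))  # > 0: genuinely different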

  5. A perceptual metric for photo retouching.

    Science.gov (United States)

    Kee, Eric; Farid, Hany

    2011-12-13

    In recent years, advertisers and magazine editors have been widely criticized for taking digital photo retouching to an extreme. Impossibly thin, tall, and wrinkle- and blemish-free models are routinely splashed onto billboards, advertisements, and magazine covers. The ubiquity of these unrealistic and highly idealized images has been linked to eating disorders and body image dissatisfaction in men, women, and children. In response, several countries have considered legislating the labeling of retouched photos. We describe a quantitative and perceptually meaningful metric of photo retouching. Photographs are rated on the degree to which they have been digitally altered by explicitly modeling and estimating geometric and photometric changes. This metric correlates well with perceptual judgments of photo retouching and can be used to objectively judge by how much a retouched photo has strayed from reality.

  6. Metric-Aware Secure Service Orchestration

    Directory of Open Access Journals (Sweden)

    Gabriele Costa

    2012-12-01

    Secure orchestration is an important concern in the internet of services. Besides providing the required functionality, composite services must also provide a reasonable level of security in order to protect sensitive data. Thus, the orchestrator needs to check whether the complex service is able to satisfy certain properties. Some properties are expressed with metrics for a precise definition of requirements. The problem is therefore to analyse the values of metrics for a complex business process. In this paper we extend our previous work on the analysis of secure orchestration with quantifiable properties. We show how to define, verify and enforce quantitative security requirements in one framework together with other security properties. The proposed approach should help to select the most suitable service architecture and guarantee fulfilment of the declared security requirements.

  7. Machine Learning for ATLAS DDM Network Metrics

    CERN Document Server

    Lassnig, Mario; The ATLAS collaboration; Vamosi, Ralf

    2016-01-01

    The increasing volume of physics data is posing a critical challenge to the ATLAS experiment. In anticipation of high luminosity physics, automation of everyday data management tasks has become necessary. Previously many of these tasks required human decision-making and operation. Recent advances in hardware and software have made it possible to entrust more complicated duties to automated systems using models trained by machine learning algorithms. In this contribution we show results from our ongoing automation efforts. First, we describe our framework for distributed data management and network metrics, which automatically extracts and aggregates data, trains models with various machine learning algorithms, and eventually scores the resulting models and parameters. Second, we use these models to forecast metrics relevant for network-aware job scheduling and data brokering. We show the characteristics of the data and evaluate the forecasting accuracy of our models.
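
    A minimal sketch of the train/score/forecast step, using generic scikit-learn code on synthetic data; the feature and target names are invented, and this is not the ATLAS framework itself.

    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.metrics import mean_absolute_error
    from sklearn.model_selection import train_test_split

    # Hypothetical aggregated network metrics: features might be recent queue
    # lengths, link throughputs and time of day; the target is a transfer time.
    rng = np.random.default_rng(42)
    X = rng.normal(size=(5000, 6))
    y = 30 + 5 * X[:, 0] - 3 * X[:, 1] + rng.normal(scale=2, size=5000)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)

    # Score the model, then use it to forecast the metric for new samples.
    print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
    print("forecast:", model.predict(X_test[:3]))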

  8. Beyond Lovelock gravity: Higher derivative metric theories

    Science.gov (United States)

    Crisostomi, M.; Noui, K.; Charmousis, C.; Langlois, D.

    2018-02-01

    We consider theories describing the dynamics of a four-dimensional metric, whose Lagrangian is diffeomorphism invariant and depends at most on second derivatives of the metric. Imposing degeneracy conditions we find a set of Lagrangians that, apart from the Einstein-Hilbert one, are either trivial or contain more than 2 degrees of freedom. Among the partially degenerate theories, we recover Chern-Simons gravity, endowed with constraints whose structure suggests the presence of instabilities. Then, we enlarge the class of parity violating theories of gravity by introducing new "chiral scalar-tensor theories." Although they all raise the same concern as Chern-Simons gravity, they can nevertheless make sense as low energy effective field theories or, by restricting them to the unitary gauge (where the scalar field is uniform), as Lorentz breaking theories with a parity violating sector.

  9. High-Dimensional Metrics in R

    OpenAIRE

    Chernozhukov, Victor; Hansen, Chris; Spindler, Martin

    2016-01-01

    The package High-dimensional Metrics (hdm) is an evolving collection of statistical methods for estimation and quantification of uncertainty in high-dimensional approximately sparse models. It focuses on providing confidence intervals and significance testing for (possibly many) low-dimensional subcomponents of the high-dimensional parameter vector. Efficient estimators and uniformly valid confidence intervals for regression coefficients on target variables (e.g., treatment or poli...

  10. Interiors of Vaidya's radiating metric: Gravitational collapse

    International Nuclear Information System (INIS)

    Fayos, F.; Jaen, X.; Llanta, E.; Senovilla, J.M.M.

    1992-01-01

    Using the Darmois junction conditions, we give the necessary and sufficient conditions for the matching of a general spherically symmetric metric to a Vaidya radiating solution. We present also these conditions in terms of the physical quantities of the corresponding energy-momentum tensors. The physical interpretation of the results and their possible applications are studied, and we also perform a detailed analysis of previous work on the subject by other authors

  11. Anisotropic rectangular metric for polygonal surface remeshing

    KAUST Repository

    Pellenard, Bertrand

    2013-06-18

    We propose a new method for anisotropic polygonal surface remeshing. Our algorithm takes as input a surface triangle mesh. An anisotropic rectangular metric, defined at each triangle facet of the input mesh, is derived from both a user-specified normal-based tolerance error and the requirement to favor rectangle-shaped polygons. Our algorithm uses a greedy optimization procedure that adds, deletes and relocates generators so as to match two criteria related to partitioning and conformity.

  12. A Metrics Approach for Collaborative Systems

    Directory of Open Access Journals (Sweden)

    Cristian CIUREA

    2009-01-01

    This article presents different types of collaborative systems, their structure and classification. The paper defines the concept of a virtual campus as a collaborative system. It builds an architecture for a virtual campus oriented toward collaborative training processes. It analyses the quality characteristics of collaborative systems and proposes techniques for metric construction and validation in order to evaluate them. The article also analyzes different ways to increase the efficiency and the performance level of collaborative banking systems.

  13. Preserved Network Metrics across Translated Texts

    Science.gov (United States)

    Cabatbat, Josephine Jill T.; Monsanto, Jica P.; Tapang, Giovanni A.

    2014-09-01

    Co-occurrence language networks based on Bible translations and the Universal Declaration of Human Rights (UDHR) translations in different languages were constructed and compared with random text networks. Among the considered network metrics, the network size, N, the normalized betweenness centrality (BC), and the average k-nearest neighbors, knn, were found to be the most preserved across translations. Moreover, similar frequency distributions of co-occurring network motifs were observed for translated texts networks.
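
    The metrics named above can be computed with standard graph libraries; a small networkx sketch on a toy co-occurrence network (the corpus-building step for real Bible or UDHR translations is omitted, and the example sentence is arbitrary).

    import networkx as nx

    # Toy word co-occurrence network: an edge links words that appear adjacently.
    words = "in the beginning was the word and the word was with god".split()
    G = nx.Graph()
    G.add_edges_from(zip(words, words[1:]))

    N = G.number_of_nodes()                               # network size
    bc = nx.betweenness_centrality(G, normalized=True)    # normalized betweenness centrality
    knn = nx.average_neighbor_degree(G)                   # average degree of nearest neighbors

    print("N =", N)
    print("mean normalized BC =", sum(bc.values()) / N)
    print("mean knn =", sum(knn.values()) / N)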

  14. Anisotropic rectangular metric for polygonal surface remeshing

    KAUST Repository

    Pellenard, Bertrand; Morvan, Jean-Marie; Alliez, Pierre

    2013-01-01

    We propose a new method for anisotropic polygonal surface remeshing. Our algorithm takes as input a surface triangle mesh. An anisotropic rectangular metric, defined at each triangle facet of the input mesh, is derived from both a user-specified normal-based tolerance error and the requirement to favor rectangle-shaped polygons. Our algorithm uses a greedy optimization procedure that adds, deletes and relocates generators so as to match two criteria related to partitioning and conformity.

  15. Smart Grid Status and Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Balducci, Patrick J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Weimar, Mark R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kirkham, Harold [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-07-01

    To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. It measures 21 metrics to provide insight into the grid’s capacity to embody these characteristics. This report looks across a spectrum of smart grid concerns to measure the status of smart grid deployment and impacts.

  16. Metrics in Keplerian orbits quotient spaces

    Science.gov (United States)

    Milanov, Danila V.

    2018-03-01

    Quotient spaces of Keplerian orbits are important instruments for the modelling of orbit samples of celestial bodies over a large time span. We suppose that variations of the orbital eccentricities, inclinations and semi-major axes remain sufficiently small, while arbitrary perturbations are allowed for the arguments of pericentres or longitudes of the nodes, or both. The distance between orbits or their images in quotient spaces serves as a numerical criterion for such problems of Celestial Mechanics as the search for a common origin of meteoroid streams, comets, and asteroids, the identification of asteroid families, and others. In this paper, we consider quotient sets of the non-rectilinear Keplerian orbits space H. Their elements are identified irrespective of the values of pericentre arguments or node longitudes. We prove that distance functions on the quotient sets, introduced in Kholshevnikov et al. (Mon Not R Astron Soc 462:2275-2283, 2016), satisfy metric space axioms and discuss the theoretical and practical importance of this result. Isometric embeddings of the quotient spaces into R^n, and a space of compact subsets of H with the Hausdorff metric, are constructed. The Euclidean representations of the orbit spaces find their applications in a problem of orbit averaging and in computational algorithms specific to Euclidean space. We also explore completions of H and its quotient spaces with respect to the corresponding metrics and establish a relation between elements of the extended spaces and rectilinear trajectories. The distance between an orbit and subsets of elliptic and hyperbolic orbits is calculated. This quantity provides an upper bound for the metric value in a problem of close orbits identification. Finally, the invariance of the equivalence relations in H under coordinate changes is discussed.

  17. The Planck Vacuum and the Schwarzschild Metrics

    Directory of Open Access Journals (Sweden)

    Daywitt W. C.

    2009-07-01

    The Planck vacuum (PV) is assumed to be the source of the visible universe. So under conditions of sufficient stress, there must exist a pathway through which energy from the PV can travel into this universe. Conversely, the passage of energy from the visible universe to the PV must also exist under the same stressful conditions. The following examines two versions of the Schwarzschild metric equation for compatibility with this open-pathway idea.

  18. Metrics and Its Function in Poetry

    Institute of Scientific and Technical Information of China (English)

    XIAO Zhong-qiong; CHEN Min-jie

    2013-01-01

    Poetry is a special combination of musical and linguistic qualities - of sounds regarded both as pure sound and as meaningful speech. Part of the pleasure of poetry lies in its relationship with music. Metrics, including rhythm and meter, is an important method for poetry to express poetic sentiment. Through the introduction of poetic language and typical examples, the writer of this paper tries to discuss the relationship between sound and meaning.

  19. TNF-α promotes nuclear enrichment of the transcription factor TonEBP/NFAT5 to selectively control inflammatory but not osmoregulatory responses in nucleus pulposus cells.

    Science.gov (United States)

    Johnson, Zariel I; Doolittle, Alexandra C; Snuggs, Joseph W; Shapiro, Irving M; Le Maitre, Christine L; Risbud, Makarand V

    2017-10-20

    Intervertebral disc degeneration (IDD) causes chronic back pain and is linked to production of proinflammatory molecules by nucleus pulposus (NP) and other disc cells. Activation of tonicity-responsive enhancer-binding protein (TonEBP)/NFAT5 by non-osmotic stimuli, including proinflammatory molecules, occurs in cells involved in immune response. However, whether inflammatory stimuli activate TonEBP in NP cells and whether TonEBP controls inflammation during IDD is unknown. We show that TNF-α, but not IL-1β or LPS, promoted nuclear enrichment of TonEBP protein. However, TNF-α-mediated activation of TonEBP did not cause induction of osmoregulatory genes. RNA sequencing showed that 8.5% of TNF-α transcriptional responses were TonEBP-dependent and identified genes regulated by both TNF-α and TonEBP. These genes were over-enriched in pathways and diseases related to inflammatory response and inhibition of matrix metalloproteases. Based on RNA-sequencing results, we further investigated regulation of the novel TonEBP targets CXCL1, CXCL2, and CXCL3. TonEBP acted synergistically with TNF-α and LPS to induce CXCL1-proximal promoter activity. Interestingly, this regulation required a highly conserved NF-κB-binding site but not a predicted TonE, suggesting cross-talk between these two members of the Rel family. Finally, analysis of human NP tissue showed that TonEBP expression correlated with the canonical osmoregulatory targets TauT/SLC6A6, SMIT/SLC5A3, and AR/AKR1B1, supporting in vitro findings that the inflammatory milieu during IDD does not interfere with TonEBP osmoregulation. In summary, whereas TonEBP participates in the proinflammatory response to TNF-α, therapeutic strategies targeting this transcription factor for treatment of disc disease must spare osmoprotective, prosurvival, and matrix homeostatic activities. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  20. Image characterization metrics for muon tomography

    Science.gov (United States)

    Luo, Weidong; Lehovich, Andre; Anashkin, Edward; Bai, Chuanyong; Kindem, Joel; Sossong, Michael; Steiger, Matt

    2014-05-01

    Muon tomography uses naturally occurring cosmic rays to detect nuclear threats in containers. Currently there are no systematic image characterization metrics for muon tomography. We propose a set of image characterization methods to quantify the imaging performance of muon tomography. These methods include tests of spatial resolution, uniformity, contrast, signal-to-noise ratio (SNR) and vertical smearing. Simulated phantom data and analysis methods were developed to evaluate metric applicability. Spatial resolution was determined as the FWHM of the point spread functions along the X, Y and Z axes for 2.5 cm tungsten cubes. Uniformity was measured by drawing a volume of interest (VOI) within a large water phantom and defined as the standard deviation of voxel values divided by the mean voxel value. Contrast was defined as the peak signals of a set of tungsten cubes divided by the mean voxel value of the water background. SNR was defined as the peak signals of the cubes divided by the standard deviation (noise) of the water background. Vertical smearing, i.e. vertical thickness blurring along the zenith axis for a set of 2 cm thick tungsten plates, was defined as the FWHM of the vertical spread function for the plate. These image metrics provided a useful tool to quantify the basic imaging properties for muon tomography.
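
    The uniformity, contrast and SNR definitions above reduce to simple array operations; a small sketch under assumed inputs (a water-only background volume of interest and the peak signal of each tungsten cube), with the numbers purely hypothetical.

    import numpy as np

    def image_metrics(background_voi, cube_peaks):
        """Uniformity, contrast and SNR as defined in the record above.

        background_voi: voxel values inside a water-only volume of interest.
        cube_peaks:     peak signal of each tungsten cube.
        """
        bg = np.asarray(background_voi, dtype=float)
        peaks = np.asarray(cube_peaks, dtype=float)
        uniformity = bg.std() / bg.mean()   # std of the VOI over its mean
        contrast = peaks / bg.mean()        # peak signal over mean background
        snr = peaks / bg.std()              # peak signal over background noise
        return uniformity, contrast, snr

    rng = np.random.default_rng(1)
    background = rng.normal(loc=10.0, scale=1.0, size=10_000)
    print(image_metrics(background, [50.0, 48.0, 52.0]))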

  1. A Fundamental Metric for Metal Recycling Applied to Coated Magnesium

    NARCIS (Netherlands)

    Meskers, C.E.M.; Reuter, M.A.; Boin, U.; Kvithyld, A.

    2008-01-01

    A fundamental metric for the assessment of the recyclability and, hence, the sustainability of coated magnesium scrap is presented; this metric combines kinetics and thermodynamics. The recycling process, consisting of thermal decoating and remelting, was studied by thermogravimetry and differential

  2. Ideal Based Cyber Security Technical Metrics for Control Systems

    Energy Technology Data Exchange (ETDEWEB)

    W. F. Boyer; M. A. McQueen

    2007-10-01

    Much of the world's critical infrastructure is at risk from attack through electronic networks connected to control systems. Security metrics are important because they provide the basis for management decisions that affect the protection of the infrastructure. A cyber security technical metric is the security relevant output from an explicit mathematical model that makes use of objective measurements of a technical object. A specific set of technical security metrics is proposed for use by the operators of control systems. Our proposed metrics are based on seven security ideals associated with seven corresponding abstract dimensions of security. We have defined at least one metric for each of the seven ideals. Each metric is a measure of how nearly the associated ideal has been achieved. These seven ideals provide a useful structure for further metrics development. A case study shows how the proposed metrics can be applied to an operational control system.

  3. 43 CFR 12.915 - Metric system of measurement.

    Science.gov (United States)

    2010-10-01

    ... procurements, grants, and other business-related activities. Metric implementation may take longer where the... recipient, such as when foreign competitors are producing competing products in non-metric units. (End of...

  4. The Jacobi metric for timelike geodesics in static spacetimes

    Science.gov (United States)

    Gibbons, G. W.

    2016-01-01

    It is shown that the free motion of massive particles moving in static spacetimes is given by the geodesics of an energy-dependent Riemannian metric on the spatial sections analogous to Jacobi's metric in classical dynamics. In the massless limit Jacobi's metric coincides with the energy-independent Fermat or optical metric. For stationary metrics, it is known that the motion of massless particles is given by the geodesics of an energy-independent Finslerian metric of Randers type. The motion of massive particles is governed by neither a Riemannian nor a Finslerian metric. The properties of the Jacobi metric for massive particles moving outside the horizon of a Schwarzschild black hole are described. By contrast with the massless case, the Gaussian curvature of the equatorial sections is not always negative.
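
    For orientation, in the notation commonly used for this construction (not quoted from the
    abstract), a static spacetime and the corresponding Jacobi metric for a particle of mass m
    and conserved energy E take the form

      \[
        ds^2 = -V(x)\,dt^2 + h_{ij}\,dx^i\,dx^j ,
        \qquad
        g^{J}_{ij} = \bigl(E^2 - m^2 V\bigr)\,\frac{h_{ij}}{V} ,
      \]

    so that in the massless limit m → 0 the Jacobi metric becomes E^2 h_{ij}/V, i.e. the
    energy-independent Fermat (optical) metric up to an overall constant factor, as stated above.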

  5. Factor structure of the Tomimatsu-Sato metrics

    International Nuclear Information System (INIS)

    Perjes, Z.

    1989-02-01

    Based on an earlier result stating that δ = 3 Tomimatsu-Sato (TS) metrics can be factored over the field of integers, an analogous representation for higher TS metrics was sought. It is shown that the factoring property of TS metrics follows from the structure of special Hankel determinants. A set of linear algebraic equations determining the factors was defined, and the factors of the first five TS metrics were tabulated, together with their primitive factors. (R.P.) 4 refs.; 2 tabs

  6. What can article-level metrics do for you?

    Science.gov (United States)

    Fenner, Martin

    2013-10-01

    Article-level metrics (ALMs) provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, usage statistics, discussions in online comments and social media, social bookmarking, and recommendations. In this essay, we describe why article-level metrics are an important extension of traditional citation-based journal metrics and provide a number of examples from ALM data collected for PLOS Biology.

  7. A convergence theory for probabilistic metric spaces | Jäger ...

    African Journals Online (AJOL)

    We develop a theory of probabilistic convergence spaces based on Tardiff's neighbourhood systems for probabilistic metric spaces. We show that the resulting category is a topological universe and we characterize a subcategory that is isomorphic to the category of probabilistic metric spaces. Keywords: Probabilistic metric ...

  8. Understanding Acceptance of Software Metrics--A Developer Perspective

    Science.gov (United States)

    Umarji, Medha

    2009-01-01

    Software metrics are measures of software products and processes. Metrics are widely used by software organizations to help manage projects, improve product quality and increase efficiency of the software development process. However, metrics programs tend to have a high failure rate in organizations, and developer pushback is one of the sources…

  9. Modified intuitionistic fuzzy metric spaces and some fixed point theorems

    International Nuclear Information System (INIS)

    Saadati, R.; Sedghi, S.; Shobe, N.

    2008-01-01

    The intuitionistic fuzzy metric space has extra conditions (see [Gregori V, Romaguera S, Veereamani P. A note on intuitionistic fuzzy metric spaces. Chaos, Solitons and Fractals 2006;28:902-5]). In this paper, we consider modified intuitionistic fuzzy metric spaces and prove some fixed point theorems in these spaces. All the results presented in this paper are new.

  10. Tide or Tsunami? The Impact of Metrics on Scholarly Research

    Science.gov (United States)

    Bonnell, Andrew G.

    2016-01-01

    Australian universities are increasingly resorting to the use of journal metrics such as impact factors and ranking lists in appraisal and promotion processes, and are starting to set quantitative "performance expectations" which make use of such journal-based metrics. The widespread use and misuse of research metrics is leading to…

  11. Robustness of climate metrics under climate policy ambiguity

    International Nuclear Information System (INIS)

    Ekholm, Tommi; Lindroos, Tomi J.; Savolainen, Ilkka

    2013-01-01

    Highlights: • We assess the economic impacts of using different climate metrics. • The setting is cost-efficient scenarios for three interpretations of the 2 °C target. • With each target setting, the optimal metric is different. • Therefore policy ambiguity prevents the selection of an optimal metric. • Robust metric values that perform well with multiple policy targets however exist. -- Abstract: A wide array of alternatives has been proposed as the common metrics with which to compare the climate impacts of different emission types. Different physical and economic metrics and their parameterizations give diverse weights between e.g. CH4 and CO2, and fixing the metric from one perspective makes it sub-optimal from another. As the aims of global climate policy involve some degree of ambiguity, it is not possible to determine a metric that would be optimal and consistent with all policy aims. This paper evaluates the cost implications of using predetermined metrics in cost-efficient mitigation scenarios. Three formulations of the 2 °C target, including both deterministic and stochastic approaches, shared a wide range of metric values for CH4 with which the mitigation costs are only slightly above the cost-optimal levels. Therefore, although ambiguity in current policy might prevent us from selecting an optimal metric, it can be possible to select robust metric values that perform well with multiple policy targets
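
    For context, the most widely used physical metric in this literature is the global warming
    potential (GWP); the textbook definition below is included for orientation and is not quoted
    from the paper:

      \[
        \mathrm{GWP}_i(H) = \frac{\int_0^H \mathrm{RF}_i(t)\,dt}{\int_0^H \mathrm{RF}_{\mathrm{CO_2}}(t)\,dt} ,
      \]

    the ratio of the time-integrated radiative forcing of a pulse emission of gas i to that of an
    equal mass of CO2 over a chosen time horizon H. The choice of H is one of the parameterizations
    that shifts the weight given to CH4 relative to CO2.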

  12. Graev metrics on free products and HNN extensions

    DEFF Research Database (Denmark)

    Slutsky, Konstantin

    2014-01-01

    We give a construction of two-sided invariant metrics on free products (possibly with amalgamation) of groups with two-sided invariant metrics and, under certain conditions, on HNN extensions of such groups. Our approach is similar to Graev's construction of metrics on free groups over pointed...

  13. The universal connection and metrics on moduli spaces

    International Nuclear Information System (INIS)

    Massamba, Fortune; Thompson, George

    2003-11-01

    We introduce a class of metrics on gauge theoretic moduli spaces. These metrics are made out of the universal matrix that appears in the universal connection construction of M. S. Narasimhan and S. Ramanan. As an example we construct metrics on the c_2 = 1 SU(2) moduli space of instantons on R^4 for various universal matrices. (author)

  14. ST-intuitionistic fuzzy metric space with properties

    Science.gov (United States)

    Arora, Sahil; Kumar, Tanuj

    2017-07-01

    In this paper, we define the ST-intuitionistic fuzzy metric space, and the notions of convergence and completeness of Cauchy sequences are studied. Further, we prove some properties of ST-intuitionistic fuzzy metric spaces. Finally, we introduce the concept of a symmetric ST-intuitionistic fuzzy metric space.

  15. Term Based Comparison Metrics for Controlled and Uncontrolled Indexing Languages

    Science.gov (United States)

    Good, B. M.; Tennis, J. T.

    2009-01-01

    Introduction: We define a collection of metrics for describing and comparing sets of terms in controlled and uncontrolled indexing languages and then show how these metrics can be used to characterize a set of languages spanning folksonomies, ontologies and thesauri. Method: Metrics for term set characterization and comparison were identified and…

  16. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  17. Otherwise Engaged : Social Media from Vanity Metrics to Critical Analytics

    NARCIS (Netherlands)

    Rogers, R.

    2018-01-01

    Vanity metrics is a term that captures the measurement and display of how well one is doing in the “success theater” of social media. The notion of vanity metrics implies a critique of metrics concerning both the object of measurement as well as their capacity to measure unobtrusively or only to

  18. Meter Detection in Symbolic Music Using Inner Metric Analysis

    NARCIS (Netherlands)

    de Haas, W.B.; Volk, A.

    2016-01-01

    In this paper we present PRIMA: a new model tailored to symbolic music that detects the meter and the first downbeat position of a piece. Given onset data, the metrical structure of a piece is interpreted using the Inner Metric Analysis (IMA) model. IMA identifies the strong and weak metrical

  19. Regional Sustainability: The San Luis Basin Metrics Project

    Science.gov (United States)

    There are a number of established, scientifically supported metrics of sustainability. Many of the metrics are data intensive and require extensive effort to collect data and compute. Moreover, individual metrics may not capture all aspects of a system that are relevant to sust...

  20. The software product assurance metrics study: JPL's software systems quality and productivity

    Science.gov (United States)

    Bush, Marilyn W.

    1989-01-01

    The findings are reported of the Jet Propulsion Laboratory (JPL)/Software Product Assurance (SPA) Metrics Study, conducted as part of a larger JPL effort to improve software quality and productivity. Until recently, no comprehensive data had been assembled on how JPL manages and develops software-intensive systems. The first objective was to collect data on software development from as many projects and for as many years as possible. Results from five projects are discussed. These results reflect 15 years of JPL software development, representing over 100 data points (systems and subsystems), over a third of a billion dollars, over four million lines of code and 28,000 person months. Analysis of this data provides a benchmark for gauging the effectiveness of past, present and future software development work. In addition, the study is meant to encourage projects to record existing metrics data and to gather future data. The SPA long term goal is to integrate the collection of historical data and ongoing project data with future project estimations.

  1. Determining the potential benefits for the freight carriage by road in Spain facing an increase in vehicles gvm 40 to 44 tons

    Energy Technology Data Exchange (ETDEWEB)

    Martinez Reguero, A.H.; Campos Cacheda, J.M.

    2016-07-01

    A very significant percentage of the products shipped by road in Spain using heavy goods vehicles (HGV) travel at the 40-ton GVM (gross vehicle mass) limit. Any change aimed at increasing productivity in that vehicle category would have a very positive effect on the road freight transport market, by lowering transport costs, decreasing environmental costs, rationalizing the sector and improving the logistics market. This paper therefore discusses the improvement derived from moving HGVs that currently have a limitation of 40 tons GVM to a new limit of 44 tons GVM, establishing the potential benefits that would follow from the change. (Author)

  2. The 2011-2015 physical and monetary balance for electricity: spending of over euro 50 billion in 2015

    International Nuclear Information System (INIS)

    Guggemos, Fabien; Meilhac, Christophe; Riedinger, Nicolas; Martial, Elodie; Mombel, David; Moreau, Sylvain; Bottin, Anne; Lavail, Jennyfer

    2017-09-01

    Electricity consumers (excluding the electricity sector itself) spent euro 52 billion in 2015 to consume 446 TWh. Taxes accounted for 27% of that expenditure (of which around one-half contributed to financing renewable sources of electricity and to geographical price adjustments), the cost of transmission 27%, and that of supply (including production and sales) 46%. Trade with other countries showed a positive balance of euro 2.3 billion. The residential sector was the main consuming sector, accounting for 35% of physical deliveries. Given the transmission and sales costs, higher on average for households than for businesses, the residential sector accounted for a greater proportion of the spending (48%). Conversely, industry accounted for 24% of physical consumption but only 15% of spending. The share of the services sector was around one-third, in both physical-unit and monetary terms.

  3. Extremal limits of the C metric: Nariai, Bertotti-Robinson, and anti-Nariai C metrics

    International Nuclear Information System (INIS)

    Dias, Oscar J.C.; Lemos, Jose P.S.

    2003-01-01

    In two previous papers we have analyzed the C metric in a background with a cosmological constant Λ, namely, the de Sitter (dS) C metric (Λ > 0) and the anti-de Sitter (AdS) C metric (Λ < 0). Here the extremal limits of these metrics are analyzed for Λ > 0, Λ = 0, and Λ < 0. In the dS_2 × S̃^2 limit, to each point in the deformed two-sphere S̃^2 corresponds a dS_2 spacetime, except for one point which corresponds to a dS_2 spacetime with an infinite straight strut or string. There are other important new features that appear. One expects that the solutions found in this paper are unstable and decay into a slightly nonextreme black hole pair accelerated by a strut or by strings. Moreover, the Euclidean version of these solutions mediates the quantum process of black hole pair creation that accompanies the decay of the dS and AdS spaces.

  4. Massless and massive quanta resulting from a mediumlike metric tensor

    International Nuclear Information System (INIS)

    Soln, J.

    1985-01-01

    A simple model of the ''primordial'' scalar field theory is presented in which the metric tensor is a generalization of the metric tensor from electrodynamics in a medium. The radiation signal corresponding to the scalar field propagates with a velocity that is generally less than c. This signal can be associated simultaneously with imaginary and real effective (momentum-dependent) masses. The requirement that the imaginary effective mass vanishes, which we take to be the prerequisite for the vacuumlike signal propagation, leads to the ''spontaneous'' splitting of the metric tensor into two distinct metric tensors: one metric tensor gives rise to masslesslike radiation and the other to a massive particle. (author)

  5. Principle of space existence and De Sitter metric

    International Nuclear Information System (INIS)

    Mal'tsev, V.K.

    1990-01-01

    The selection principle for the solutions of the Einstein equations suggested in a series of papers implies the existence of space (g_ik ≠ 0) only in the presence of matter (T_ik ≠ 0). This selection principle (principle of space existence, in the Markov terminology) implies, in the general case, the absence of the cosmological solution with the De Sitter metric. On the other hand, the De Sitter metric is necessary for describing both inflation and deflation periods of the Universe. It is shown that the De Sitter metric is also allowed by the selection principle under discussion if the metric experiences the evolution into the Friedmann metric

  6. Pragmatic security metrics applying metametrics to information security

    CERN Document Server

    Brotby, W Krag

    2013-01-01

    Other books on information security metrics discuss number theory and statistics in academic terms. Light on mathematics and heavy on utility, PRAGMATIC Security Metrics: Applying Metametrics to Information Security breaks the mold. This is the ultimate how-to-do-it guide for security metrics. Packed with time-saving tips, the book offers easy-to-follow guidance for those struggling with security metrics. Step by step, it clearly explains how to specify, develop, use, and maintain an information security measurement system (a comprehensive suite of metrics) to

  7. $17 billion needed for population programme to year 2000: Dr. Nafis Sadik launches State of World Population Report.

    Science.gov (United States)

    1995-01-01

    Dr. Nafis Sadik, Executive Director of the United Nations Population Fund (UNFPA), in her address on July 11 to the Foreign Press Association in London on the occasion of the release of the "1995 State of the World Population Report," stated that governments needed to invest in people, and that the estimated amount needed to reduce population numbers in developing countries was $17 billion for the year 2000. Two-thirds of the cost would be supplied by the developing countries. She said that coordinating population policies globally through such documents as the Programme of Action from the Cairo Conference would aid in slowing population growth. World population, currently 5.7 billion, is projected to reach 7.1-7.83 billion in 2015 and 7.9-11.9 billion in 2050. She also noted that certain conditions faced by women bear upon unsustainable population growth. The cycle of poverty continues in developing countries because very young mothers, who face higher risks in pregnancy and childbirth than those who delay childbearing until after the age of 20, are less likely to continue their education, more likely to have lower-paying jobs, and have a higher rate of separation and divorce. The isolation of women from widespread political participation and the marginalization of women's concerns from mainstream topics has resulted in ineffective family planning programs, including prevention of illness or impairment related to pregnancy or childbirth. Women, in most societies, cannot fully participate in economic and public life, have limited access to positions of influence and power, have narrower occupational choices and lower earnings than men, and must struggle to reconcile activities outside the home with their traditional roles. Sustainable development can only be achieved when social development expands opportunities for individuals (men and women), and their families, empowering them in the attainment of their social, economic, political, and cultural aspirations.

  8. How Brazil Transferred Billions to Foreign Coffee Importers: The International Coffee Agreement, Rent Seeking and Export Tax Rebates

    OpenAIRE

    Jarvis, Lovell S.

    2003-01-01

    Rent seeking is well known, but empirical evidence of its effects is relatively rare. This paper analyzes how domestic and international rent seeking caused Brazil to provide coffee export tax rebates that transferred foreign exchange to coffee importers. Although Brazil was the world's largest exporter, it began to pay export tax rebates to selected coffee importers in 1965 and, by 1988, had paid rebates totaling $8 billion. Brazil explained these rebates as a mechanism to price disc...

  9. Classification in medical images using adaptive metric k-NN

    Science.gov (United States)

    Chen, C.; Chernoff, K.; Karemore, G.; Lo, P.; Nielsen, M.; Lauze, F.

    2010-03-01

    The performance of the k-nearest neighbors (k-NN) classifier is highly dependent on the distance metric used to identify the k nearest neighbors of the query points. The standard Euclidean distance is commonly used in practice. This paper investigates the performance of the k-NN classifier with respect to different adaptive metrics in the context of medical imaging. We propose using adaptive metrics such that the structure of the data is better described, introducing some unsupervised learning knowledge into k-NN. Four different metrics are estimated: a theoretical metric based on the assumption that images are drawn from the Brownian Image Model (BIM), a normalized metric based on the variance of the data, an empirical metric based on the empirical covariance matrix of the unlabeled data, and an optimized metric obtained by minimizing the classification error. The spectral structure of the empirical covariance also leads to Principal Component Analysis (PCA) performed on it, which yields the subspace metrics. The metrics are evaluated on two data sets: lateral X-rays of the lumbar aortic/spine region, where we use k-NN for performing abdominal aorta calcification detection; and mammograms, where we use k-NN for breast cancer risk assessment. The results show that an appropriate choice of metric can improve classification.
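
    As a minimal sketch of the "empirical metric" idea described above (a k-NN classifier whose
    distance is the Mahalanobis distance induced by the empirical covariance of the data), the
    following self-contained example uses hypothetical synthetic data and is not the authors'
    implementation.

      import numpy as np

      def mahalanobis_knn_predict(X_train, y_train, X_query, k=5):
          """k-NN prediction under the Mahalanobis metric built from the empirical covariance."""
          vi = np.linalg.inv(np.cov(X_train, rowvar=False))    # inverse empirical covariance
          preds = []
          for q in X_query:
              d = X_train - q
              dist2 = np.einsum('ij,jk,ik->i', d, vi, d)       # squared Mahalanobis distances
              nearest = np.argsort(dist2)[:k]                  # indices of the k nearest neighbors
              preds.append(np.bincount(y_train[nearest]).argmax())  # majority vote
          return np.array(preds)

      # Toy usage (labels must be small non-negative integers for np.bincount)
      rng = np.random.default_rng(1)
      X_train = rng.normal(size=(200, 4))
      y_train = (X_train[:, 0] > 0).astype(int)
      X_query = rng.normal(size=(10, 4))
      print(mahalanobis_knn_predict(X_train, y_train, X_query, k=7))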

  10. THE ROLE OF ARTICLE LEVEL METRICS IN SCIENTIFIC PUBLISHING

    Directory of Open Access Journals (Sweden)

    Vladimir TRAJKOVSKI

    2016-04-01

    Full Text Available Emerging metrics based on the article level do not exclude traditional metrics based on citations to the journal, but complement them. Article-level metrics (ALMs) provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, statistics of usage, discussions in online comments and social media, social bookmarking, and recommendations. In this editorial, the role of article-level metrics in publishing scientific papers is described. Article-Level Metrics (ALMs) are rapidly emerging as important tools to quantify how individual articles are being discussed, shared, and used. Data sources depend on the tool, but they include classic citation-based indicators, academic social networks (Mendeley, CiteULike, Delicious) and social media (Facebook, Twitter, blogs, and YouTube). The most popular tools used to apply these new metrics are: Public Library of Science - Article-Level Metrics, Altmetric, Impactstory and Plum Analytics. The Journal Impact Factor (JIF) does not consider impact or influence beyond citation counts, as these counts are reflected only through Thomson Reuters' Web of Science® database. The JIF provides an indicator related to the journal, but not to an individual published paper. Thus, altmetrics have now become an alternative metric for the performance assessment of individual scientists and their scholarly publications. Macedonian scholarly publishers have to work on implementing article-level metrics in their e-journals. It is a way to increase their visibility and impact in the world of science.

  11. Outsourced Similarity Search on Metric Data Assets

    DEFF Research Database (Denmark)

    Yiu, Man Lung; Assent, Ira; Jensen, Christian S.

    2012-01-01

    This paper considers a cloud computing setting in which similarity querying of metric data is outsourced to a service provider. The data is to be revealed only to trusted users, not to the service provider or anyone else. Users query the server for the most similar data objects to a query example. Outsourcing offers the data owner scalability and a low initial investment. The need for privacy may be due to the data being sensitive (e.g., in medicine), valuable (e.g., in astronomy), or otherwise confidential. Given this setting, the paper presents techniques that transform the data prior to supplying...

  12. New Metrics from a Fractional Gravitational Field

    International Nuclear Information System (INIS)

    El-Nabulsi, Rami Ahmad

    2017-01-01

    Agop et al. proved in Commun. Theor. Phys. (2008) that a Reissner–Nordström-type metric is obtained if the gauge gravitational field in a fractal spacetime is constructed by means of concepts of scale relativity. We prove in this short communication that a similar result is obtained if gravity in D spacetime dimensions is fractionalized by means of the Glaeske–Kilbas–Saigo fractional. Besides, non-singular gravitational fields are obtained without using extra dimensions. We present a few examples to show that these gravitational fields hold a number of motivating features in spacetime physics. (paper)

  13. Energy Metrics for State Government Buildings

    Science.gov (United States)

    Michael, Trevor

    Measuring true progress towards energy conservation goals requires the accurate reporting and accounting of energy consumption. An accurate energy metrics framework is also a critical element for verifiable Greenhouse Gas Inventories. Energy conservation in government can reduce expenditures on energy costs leaving more funds available for public services. In addition to monetary savings, conserving energy can help to promote energy security, air quality, and a reduction of carbon footprint. With energy consumption/GHG inventories recently produced at the Federal level, state and local governments are beginning to also produce their own energy metrics systems. In recent years, many states have passed laws and executive orders which require their agencies to reduce energy consumption. In June 2008, SC state government established a law to achieve a 20% energy usage reduction in state buildings by 2020. This study examines case studies from other states who have established similar goals to uncover the methods used to establish an energy metrics system. Direct energy consumption in state government primarily comes from buildings and mobile sources. This study will focus exclusively on measuring energy consumption in state buildings. The case studies reveal that many states including SC are having issues gathering the data needed to accurately measure energy consumption across all state buildings. Common problems found include a lack of enforcement and incentives that encourage state agencies to participate in any reporting system. The case studies are aimed at finding the leverage used to gather the needed data. The various approaches at coercing participation will hopefully reveal methods that SC can use to establish the accurate metrics system needed to measure progress towards its 20% by 2020 energy reduction goal. Among the strongest incentives found in the case studies is the potential for monetary savings through energy efficiency. Framing energy conservation

  14. Multi-Robot Assembly Strategies and Metrics

    Science.gov (United States)

    MARVEL, JEREMY A.; BOSTELMAN, ROGER; FALCO, JOE

    2018-01-01

    We present a survey of multi-robot assembly applications and methods and describe trends and general insights into the multi-robot assembly problem for industrial applications. We focus on fixtureless assembly strategies featuring two or more robotic systems. Such robotic systems include industrial robot arms, dexterous robotic hands, and autonomous mobile platforms, such as automated guided vehicles. In this survey, we identify the types of assemblies that are enabled by utilizing multiple robots, the algorithms that synchronize the motions of the robots to complete the assembly operations, and the metrics used to assess the quality and performance of the assemblies. PMID:29497234

  15. Metric preheating and limitations of linearized gravity

    International Nuclear Information System (INIS)

    Bassett, Bruce A.; Tamburini, Fabrizio; Kaiser, David I.; Maartens, Roy

    1999-01-01

    During the preheating era after inflation, resonant amplification of quantum field fluctuations takes place. Recently it has become clear that this must be accompanied by resonant amplification of scalar metric fluctuations, since the two are united by Einstein's equations. Furthermore, this 'metric preheating' enhances particle production, and leads to gravitational rescattering effects even at linear order. In multi-field models with strong preheating (q>>1), metric perturbations are driven non-linear, with the strongest amplification typically on super-Hubble scales (k→0). This amplification is causal, being due to the super-Hubble coherence of the inflaton condensate, and is accompanied by resonant growth of entropy perturbations. The amplification invalidates the use of the linearized Einstein field equations, irrespective of the amount of fine-tuning of the initial conditions. This has serious implications on all scales - from large-angle cosmic microwave background (CMB) anisotropies to primordial black holes. We investigate the (q,k) parameter space in a two-field model, and introduce the time to non-linearity, t_nl, as the timescale for the breakdown of the linearized Einstein equations. t_nl is a robust indicator of resonance behavior, showing the fine structure in q and k that one expects from a quasi-Floquet system, and we argue that t_nl is a suitable generalization of the static Floquet index in an expanding universe. Backreaction effects are expected to shut down the linear resonances, but cannot remove the existing amplification, which threatens the viability of strong preheating when confronted with the CMB. Mode-mode coupling and turbulence tend to re-establish scale invariance, but this process is limited by causality and for small k the primordial scale invariance of the spectrum may be destroyed. We discuss ways to escape the above conclusions, including secondary phases of inflation and preheating solely to fermions. The exclusion principle

  16. Alternative kinetic energy metrics for Lagrangian systems

    Science.gov (United States)

    Sarlet, W.; Prince, G.

    2010-11-01

    We examine Lagrangian systems on R^n with standard kinetic energy terms for the possibility of additional, alternative Lagrangians with kinetic energy metrics different to the Euclidean one. Using the techniques of the inverse problem in the calculus of variations we find necessary and sufficient conditions for the existence of such Lagrangians. We illustrate the problem in two and three dimensions with quadratic and cubic potentials. As an aside we show that the well-known anomalous Lagrangians for the Coulomb problem can be removed by switching on a magnetic field, providing an appealing resolution of the ambiguous quantizations of the hydrogen atom.

  17. Differential geometry bundles, connections, metrics and curvature

    CERN Document Server

    Taubes, Clifford Henry

    2011-01-01

    Bundles, connections, metrics and curvature are the 'lingua franca' of modern differential geometry and theoretical physics. This book will supply a graduate student in mathematics or theoretical physics with the fundamentals of these objects. Many of the tools used in differential topology are introduced and the basic results about differentiable manifolds, smooth maps, differential forms, vector fields, Lie groups, and Grassmanians are all presented here. Other material covered includes the basic theorems about geodesics and Jacobi fields, the classification theorem for flat connections, the

  18. Multi-Robot Assembly Strategies and Metrics.

    Science.gov (United States)

    Marvel, Jeremy A; Bostelman, Roger; Falco, Joe

    2018-02-01

    We present a survey of multi-robot assembly applications and methods and describe trends and general insights into the multi-robot assembly problem for industrial applications. We focus on fixtureless assembly strategies featuring two or more robotic systems. Such robotic systems include industrial robot arms, dexterous robotic hands, and autonomous mobile platforms, such as automated guided vehicles. In this survey, we identify the types of assemblies that are enabled by utilizing multiple robots, the algorithms that synchronize the motions of the robots to complete the assembly operations, and the metrics used to assess the quality and performance of the assemblies.

  19. Indefinite metric and regularization of electrodynamics

    International Nuclear Information System (INIS)

    Gaudin, M.

    1984-06-01

    The invariant regularization of Pauli and Villars in quantum electrodynamics can be considered as deriving from a local and causal lagrangian theory for spin 1/2 bosons, by introducing an indefinite metric and a condition on the allowed states similar to the Lorentz condition. The consequences are the asymptotic freedom of the photon's propagator. We present a calculation of the effective charge to the fourth order in the coupling as a function of the auxiliary masses, the theory avoiding all mass divergencies to this order [fr

  20. Metrics for comparing plasma mass filters

    Energy Technology Data Exchange (ETDEWEB)

    Fetterman, Abraham J.; Fisch, Nathaniel J. [Department of Astrophysical Sciences, Princeton University, Princeton, New Jersey 08540 (United States)

    2011-10-15

    High-throughput mass separation of nuclear waste may be useful for optimal storage, disposal, or environmental remediation. The most dangerous part of nuclear waste is the fission product, which produces most of the heat and medium-term radiation. Plasmas are well-suited to separating nuclear waste because they can separate many different species in a single step. A number of plasma devices have been designed for such mass separation, but there has been no standardized comparison between these devices. We define a standard metric, the separative power per unit volume, and derive it for three different plasma mass filters: the plasma centrifuge, Ohkawa filter, and the magnetic centrifugal mass filter.
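
    The abstract does not reproduce the formula, but separative power is conventionally built from
    the classical (Dirac) value function of isotope-separation theory; the standard definitions
    below are given for orientation only and may differ in detail from the per-unit-volume version
    derived by the authors:

      \[
        V(x) = (2x - 1)\,\ln\frac{x}{1-x} ,
        \qquad
        \delta U = P\,V(x_P) + W\,V(x_W) - F\,V(x_F) ,
      \]

    where F, P, and W are the feed, product, and waste flow rates, x_F, x_P, and x_W are the
    corresponding fractions of the species being separated, and dividing δU by the device volume
    gives a separative power per unit volume.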

  1. Metrics for comparing plasma mass filters

    International Nuclear Information System (INIS)

    Fetterman, Abraham J.; Fisch, Nathaniel J.

    2011-01-01

    High-throughput mass separation of nuclear waste may be useful for optimal storage, disposal, or environmental remediation. The most dangerous part of nuclear waste is the fission product, which produces most of the heat and medium-term radiation. Plasmas are well-suited to separating nuclear waste because they can separate many different species in a single step. A number of plasma devices have been designed for such mass separation, but there has been no standardized comparison between these devices. We define a standard metric, the separative power per unit volume, and derive it for three different plasma mass filters: the plasma centrifuge, Ohkawa filter, and the magnetic centrifugal mass filter.

  2. Decision Analysis for Metric Selection on a Clinical Quality Scorecard.

    Science.gov (United States)

    Guth, Rebecca M; Storey, Patricia E; Vitale, Michael; Markan-Aurora, Sumita; Gordon, Randolph; Prevost, Traci Q; Dunagan, Wm Claiborne; Woeltje, Keith F

    2016-09-01

    Clinical quality scorecards are used by health care institutions to monitor clinical performance and drive quality improvement. Because of the rapid proliferation of quality metrics in health care, BJC HealthCare found it increasingly difficult to select the most impactful scorecard metrics while still monitoring metrics for regulatory purposes. A 7-step measure selection process was implemented incorporating Kepner-Tregoe Decision Analysis, which is a systematic process that considers key criteria that must be satisfied in order to make the best decision. The decision analysis process evaluates what metrics will most appropriately fulfill these criteria, as well as identifies potential risks associated with a particular metric in order to identify threats to its implementation. Using this process, a list of 750 potential metrics was narrowed to 25 that were selected for scorecard inclusion. This decision analysis process created a more transparent, reproducible approach for selecting quality metrics for clinical quality scorecards. © The Author(s) 2015.

  3. Balanced metrics for vector bundles and polarised manifolds

    DEFF Research Database (Denmark)

    Garcia Fernandez, Mario; Ross, Julius

    2012-01-01

    We consider a notion of balanced metrics for triples (X, L, E) which depend on a parameter α, where X is a smooth complex manifold with an ample line bundle L and E is a holomorphic vector bundle over X. For generic choice of α, we prove that the limit of a convergent sequence of balanced metrics leads to a Hermitian-Einstein metric on E and a constant scalar curvature Kähler metric in c_1(L). For special values of α, limits of balanced metrics are solutions of a system of coupled equations relating a Hermitian-Einstein metric on E and a Kähler metric in c_1(L). For this, we compute the top two...

  4. Construction of Einstein-Sasaki metrics in D≥7

    International Nuclear Information System (INIS)

    Lue, H.; Pope, C. N.; Vazquez-Poritz, J. F.

    2007-01-01

    We construct explicit Einstein-Kaehler metrics in all even dimensions D=2n+4≥6, in terms of a 2n-dimensional Einstein-Kaehler base metric. These are cohomogeneity 2 metrics which have the new feature of including a NUT-type parameter, or gravomagnetic charge, in addition to mass and rotation parameters. Using a canonical construction, these metrics all yield Einstein-Sasaki metrics in dimensions D=2n+5≥7. As is commonly the case in this type of construction, for suitable choices of the free parameters the Einstein-Sasaki metrics can extend smoothly onto complete and nonsingular manifolds, even though the underlying Einstein-Kaehler metric has conical singularities. We discuss some explicit examples in the case of seven-dimensional Einstein-Sasaki spaces. These new spaces can provide supersymmetric backgrounds in M theory, which play a role in the AdS_4/CFT_3 correspondence.

  5. National Metrical Types in Nineteenth Century Art Song

    Directory of Open Access Journals (Sweden)

    Leigh VanHandel

    2010-01-01

    Full Text Available William Rothstein's article "National metrical types in music of the eighteenth and early nineteenth centuries" (2008) proposes a distinction between the metrical habits of 18th and early 19th century German music and those of Italian and French music of that period. Based on theoretical treatises and compositional practice, he outlines these national metrical types and discusses the characteristics of each type. This paper presents the results of a study designed to determine whether, and to what degree, Rothstein's characterizations of national metrical types are present in 19th century French and German art song. Studying metrical habits in this genre may provide a lens into changing metrical conceptions of 19th century theorists and composers, as well as to the metrical habits and compositional style of individual 19th century French and German art song composers.

  6. Metrication: An economic wake-up call for US industry

    Science.gov (United States)

    Carver, G. P.

    1993-03-01

    As the international standard of measurement, the metric system is one key to success in the global marketplace. International standards have become an important factor in international economic competition. Non-metric products are becoming increasingly unacceptable in world markets that favor metric products. Procurement is the primary federal tool for encouraging and helping U.S. industry to convert voluntarily to the metric system. Besides the perceived unwillingness of the customer, certain regulatory language, and certain legal definitions in some states, there are no major impediments to conversion of the remaining non-metric industries to metric usage. Instead, there are good reasons for changing, including an opportunity to rethink many industry standards and to take advantage of size standardization. Also, when the remaining industries adopt the metric system, they will come into conformance with federal agencies engaged in similar activities.

  7. THE SOUTHERN FRAGMENT OF THE SIBERIAN CRATON: “LANDSCAPE” HISTORY OVER TWO BILLION YEARS

    Directory of Open Access Journals (Sweden)

    Arkady M. Stanevich

    2010-01-01

    background of sub-continental sedimentation. In the Late Paleozoic, the geologic development was marked by a major transformation of the pattern of tectonic structures, which was most likely related to intraplate extension and thinning of the continental crust. In the Middle and Late Carboniferous (Fig. 4A), the integrated Tungusskiy sedimentation basin was formed as a result of continuous and uniform bending. In the Early Permian (see Fig. 4Б), positive tectonic movements led to significant dewatering of the Paleozoic basins, so that they turned into a washed-out area. Overall uplift of the Siberian Platform preconditioned climate changes, such as aridization and climate cooling. In the Mesozoic, landscapes were presented by a combination of flat uplands and wide river valleys with swampy plains and lakes wherein carbonaceous sediments were accumulated. Basic volcanism with shield eruptions and sub-volcanic rocks was typical then. In the Jurassic (see Fig. 4B), elements observed in the recent topography of the Siberian Platform were formed. In that period, major structural transformation occurred in association with the largest diastrophic cycles in the territory of Eastern Asia, including formation of the Baikal rift and its branches. From the analyses of the available data briefly presented above, it is obvious that this period of two billion years in Earth's history includes numerous epochs of diastrophic processes of tremendous destructive capacity. Unconformities between formations differing in age by millions and even hundreds of millions of years, such as those dating back to the Pre-Cambrian, suggest quite realistic yet astounding visions. Against the background of scenarios of floods, rock up-thrusts, volcanic explosions and earthquakes evidenced from the very remote past, the current geological and climatic phenomena may seem quite trivial.

  8. Ton H.M. van Schaik and Karin Strengers-Olde Kalter, Het Arme Roomse Leven. Geschiedenis van de katholieke caritas in de stad Utrecht

    Directory of Open Access Journals (Sweden)

    Joost van Genabeek

    2017-07-01

    Full Text Available Ton H.M. van Schaik and Karin Strengers-Olde Kalter, Het Arme Roomse Leven. Geschiedenis van de katholieke caritas in de stad Utrecht (Hilversum: Verloren, 2016), 276 pp., ISBN 978 90 8704 578 4.

  9. Transportability Testing of the Family of Medium Tactical Vehicles (FMTV) 10-Ton Dump Truck, TP-94-01, Rev. 2, June 2004, "Transportability Testing Procedures"

    National Research Council Canada - National Science Library

    Barickman, Philip W

    2007-01-01

    ... (FMTV) 10-ton dump truck manufactured by Stewart and Stevenson Systems, Inc., Sealy, Texas. The testing was conducted in accordance with TP-94-01, Revision 2, June 2004 "Transportability Testing Procedures...

  10. Un nouveau béton auto-cicatrisant grâce à l’incorporation de bactéries

    NARCIS (Netherlands)

    Wiktor, V.; Jonkers, H.M.

    2011-01-01

    The formation of a continuous network of cracks contributes to an increase in the permeability of concrete, thereby significantly reducing its resistance to attack by aggressive agents dissolved in water. In order to increase the autogenous healing capacity of concrete, certain healing agents

  11. Fanpage metrics analysis. "Study on content engagement"

    Science.gov (United States)

    Rahman, Zoha; Suberamanian, Kumaran; Zanuddin, Hasmah Binti; Moghavvemi, Sedigheh; Nasir, Mohd Hairul Nizam Bin Md

    2016-08-01

    Social media is now recognized as an excellent communicative tool to connect directly with consumers. One of the most significant ways to connect with consumers through these social networking sites (SNS) is to create a Facebook fan page with brand content and to place different posts periodically on these fan pages. In measuring the effectiveness of social networking sites, corporate houses now analyze metrics such as engagement rate and the number of comments, shares and likes on fan pages. It is therefore very important for marketers to know the effectiveness of different contents or posts on fan pages in order to increase fan responsiveness and the engagement rate. In the study the authors analyzed a total of 1,834 brand posts from 17 international electronics brands. Nine months of data (December 2014 to August 2015), available online on the brands' fan pages, were collected for analysis. An econometric analysis was conducted using EViews 9 to determine the impact of different contents on fan page engagement. The study picked the four most frequently posted content types to determine their impact on the PTA (People Talking About) metric and fan page engagement activities.
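
    The kind of regression described above, post engagement explained by content type, can be
    written in a few lines; the sketch below is a stand-in for the EViews analysis, using the
    statsmodels package and entirely hypothetical data.

      import pandas as pd
      import statsmodels.formula.api as smf

      # Hypothetical stand-in for the fan-page data set: one row per brand post,
      # with its content type and an engagement score.
      posts = pd.DataFrame({
          "content_type": ["photo", "video", "link", "status", "photo", "video"] * 50,
          "engagement":   [120, 340, 60, 45, 150, 300] * 50,
      })

      # Regress engagement on content-type dummies (the first category is the baseline).
      model = smf.ols("engagement ~ C(content_type)", data=posts).fit()
      print(model.summary())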

  12. Network Community Detection on Metric Space

    Directory of Open Access Journals (Sweden)

    Suman Saha

    2015-08-01

    Full Text Available Community detection in a complex network is an important problem that has attracted much interest in recent years. In general, a community detection algorithm chooses an objective function and captures the communities of the network by optimizing it; one then uses various heuristics to solve the optimization problem and extract the communities of interest to the user. In this article, we demonstrate the procedure to transform a graph into points of a metric space and develop methods of community detection with the help of a metric defined for a pair of points. We have also studied and analyzed the community structure of the network therein. The results obtained with our approach are very competitive with most of the well-known algorithms in the literature, and this is justified over a large collection of datasets. On the other hand, the time taken by our algorithm is considerably less than that of other methods, which supports the theoretical findings.
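
    The general recipe described above, mapping the vertices of a graph to points of a metric
    space and then detecting communities from the pairwise metric, can be illustrated with a
    minimal stand-in that uses shortest-path distances and hierarchical clustering; it is not the
    authors' algorithm.

      import networkx as nx
      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import squareform

      # Toy graph: two cliques joined by a short bridge (hypothetical data)
      G = nx.barbell_graph(5, 1)

      # Transform the graph into points of a metric space: here the metric between two
      # vertices is simply their shortest-path distance.
      D = np.asarray(nx.floyd_warshall_numpy(G), dtype=float)

      # Detect communities directly from the pairwise metric (average-linkage clustering).
      Z = linkage(squareform(D, checks=False), method="average")
      communities = fcluster(Z, t=2, criterion="maxclust")
      print(communities)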

  13. Value of the Company and Marketing Metrics

    Directory of Open Access Journals (Sweden)

    André Luiz Ramos

    2013-12-01

    Full Text Available Thinking marketing strategies from a resource-based perspective (Barney, 1991), which proposes assets as tangible, organizational or human, and from Constantin and Luch's vision (1994), where strategic resources can be tangible or intangible, internal or external to the firm, raises a research approach on marketing and finance. According to Srivastava, Shervani and Fahey (1998) there are three types of market assets which generate firm value. Firm value can be measured by discounted cash flow, linking marketing activities to value generation forecasts (Anderson, 1982; Day, Fahey, 1988; Doyle, 2000; Rust et al., 2004a). The economic value of marketing strategies and marketing metrics is attracting the attention of strategy researchers and marketing managers, making clear the need to build a bridge able to articulate marketing and finance from a strategic perspective. This article proposes an analytical framework based on different scientific approaches involving the risk and return promoted by marketing strategies, and points out advances concerning both methodological approaches and marketing strategies and their impact on firm metrics and value, using Srinivasan and Hanssens (2009) as a starting point.

  14. Covariant electrodynamics in linear media: Optical metric

    Science.gov (United States)

    Thompson, Robert T.

    2018-03-01

    While the postulate of covariance of Maxwell's equations for all inertial observers led Einstein to special relativity, it was the further demand of general covariance—form invariance under general coordinate transformations, including between accelerating frames—that led to general relativity. Several lines of inquiry over the past two decades, notably the development of metamaterial-based transformation optics, has spurred a greater interest in the role of geometry and space-time covariance for electrodynamics in ponderable media. I develop a generally covariant, coordinate-free framework for electrodynamics in general dielectric media residing in curved background space-times. In particular, I derive a relation for the spatial medium parameters measured by an arbitrary timelike observer. In terms of those medium parameters I derive an explicit expression for the pseudo-Finslerian optical metric of birefringent media and show how it reduces to a pseudo-Riemannian optical metric for nonbirefringent media. This formulation provides a basis for a unified approach to ray and congruence tracing through media in curved space-times that may smoothly vary among positively refracting, negatively refracting, and vacuum.
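
    For the nonbirefringent case mentioned at the end of the abstract, the pseudo-Riemannian
    optical metric reduces, for an isotropic impedance-matched medium, to the classic Gordon form;
    the expression below is the standard textbook construction rather than a formula taken from
    the paper, with u^μ the 4-velocity of the medium, n its refractive index, and signature
    (-,+,+,+):

      \[
        \bar{g}_{\mu\nu} = g_{\mu\nu} + \Bigl(1 - \frac{1}{n^{2}}\Bigr) u_{\mu} u_{\nu} ,
        \qquad u_{\mu}u^{\mu} = -1 ,
      \]

    and light rays inside the medium follow null geodesics of \bar{g}_{\mu\nu} rather than of the
    background metric g_{\mu\nu}.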

  15. Axisymmetric plasma equilibria in a Kerr metric

    Science.gov (United States)

    Elsässer, Klaus

    2001-10-01

    Plasma equilibria near a rotating black hole are considered within the multifluid description. An isothermal two-component plasma with electrons and positrons or ions is determined by four structure functions and the boundary conditions. These structure functions are the Bernoulli function and the toroidal canonical momentum per mass for each species. The quasi-neutrality assumption (no charge density, no toroidal current) allows one to solve Maxwell's equations analytically for any axisymmetric stationary metric, and to reduce the fluid equations to one single scalar equation for the stream function χ of the positrons or ions, respectively. The basic smallness parameter is the ratio of the skin depth of electrons to the scale length of the metric and fluid quantities, and, in the case of an electron-ion plasma, the mass ratio m_e/m_i. The χ-equation can be solved by standard methods, and simple solutions for a Kerr geometry are available; they show characteristic flow patterns, depending on the structure functions and the boundary conditions.

  16. Hypertonic-induced lamin A/C synthesis and distribution to nucleoplasmic speckles is mediated by TonEBP/NFAT5 transcriptional activator

    International Nuclear Information System (INIS)

    Favale, Nicolas O.; Sterin Speziale, Norma B.; Fernandez Tome, Maria C.

    2007-01-01

    Lamin A/C is the most studied nucleoskeletal constituent. Lamin A/C expression indicates cell differentiation and is also a structural component of nuclear speckles, which are involved in gene expression regulation. Hypertonicity has been reported to induce renal epithelial cell differentiation and expression of TonEBP (NFAT5), a transcriptional activator of hypertonicity-induced gene transcription. In this paper, we investigate the effect of hypertonicity on lamin A/C expression in MDCK cells and the involvement of TonEBP. Hypertonicity increased lamin A/C expression and its distribution to nucleoplasm with speckled pattern. Microscopy showed codistribution of TonEBP and lamin A/C in nucleoplasmic speckles, and immunoprecipitation demonstrated their interaction. TonEBP silencing caused lamin A/C redistribution from nucleoplasmic speckles to the nuclear rim, followed by lamin decrease, thus showing that hypertonicity induces lamin A/C speckles through a TonEBP-dependent mechanism. We suggest that lamin A/C speckles could serve TonEBP as scaffold thus favoring its role in hypertonicity

  17. Comprehensive Metric Education Project: Implementing Metrics at a District Level Administrative Guide.

    Science.gov (United States)

    Borelli, Michael L.

    This document details the administrative issues associated with guiding a school district through its metrication efforts. Issues regarding staff development, curriculum development, and the acquisition of instructional resources are considered. Alternative solutions are offered. Finally, an overall implementation strategy is discussed with…

  18. LLNL's Big Science Capabilities Help Spur Over $796 Billion in U.S. Economic Activity: Sequencing the Human Genome

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Jeffrey S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-28

    LLNL's successful history of taking on big science projects extends beyond national security and has helped create billions of dollars per year in new economic activity. One example is LLNL's role in helping sequence the human genome. Over $796 billion in new economic activity in over half a dozen fields has been documented since LLNL successfully completed this Grand Challenge.

  19. Social Media Metrics Importance and Usage Frequency in Latvia

    Directory of Open Access Journals (Sweden)

    Ronalds Skulme

    2017-12-01

    Full Text Available Purpose of the article: The purpose of this paper was to explore which social media marketing metrics are most often used and most important for marketing experts in Latvia and can be used to evaluate marketing campaign effectiveness. Methodology/methods: In order to achieve the aim of this paper several theoretical and practical research methods were used, such as theoretical literature analysis, surveying and grouping. First of all, theoretical research about social media metrics was conducted. The authors collected information about social media metric grouping methods and the most frequently mentioned social media metrics in the literature. The collected information was used as the foundation for the expert surveys. The expert surveys were used to collect information from Latvian marketing professionals to determine which social media metrics are used most often and which are most important in Latvia. Scientific aim: The scientific aim of this paper was to identify whether the importance of social media metrics varies depending on the consumer purchase decision stage. Findings: Information about the most important and most often used social media marketing metrics in Latvia was collected. A new social media grouping framework is proposed. Conclusions: The main conclusion is that the importance and usage frequency of social media metrics change depending on the consumer purchase decision stage the metric is used to evaluate.

  20. The giant superconducting magnet system of 10,000 tons mass for fusion experiment at Cadarache, France

    International Nuclear Information System (INIS)

    Sahu, A.K.

    2013-01-01

    The International Thermonuclear Experimental Reactor (ITER) being built at Cadarache, France has many unique features and is one of the biggest scientific adventures in the history of science and technology. Seven partners (India, EU, US, China, Japan, Korea and Russia) have set up an international organization situated at Cadarache, France to provide direction and co-ordination for R and D and construction of this project. The R and D labs and manufacturing industries are spread across these seven partner countries. Components manufactured in these countries will be transported to Cadarache in France for assembly. The Institute for Plasma Research, Bhat, Gandhinagar, Gujarat is coordinating the project activities on behalf of India. The magnet system, required for confinement and control of the plasma leading to the fusion reaction in ITER, is one of the key systems of this project. There are 18 TF (Toroidal Field) coils, 6 PF (Poloidal Field) coils, 6 CS (Central Solenoid) coils and 18 correction coils (CC), all of which are of superconducting type. All TF and CS coils have Nb3Sn superconductor and all PF and CC coils have NbTi superconductor. Each TF coil has a height of 15 m, a width of 9 m and a mass of 330 tons. The biggest PF coil has a diameter of 24 m and a mass of 300 tons. The total mass of these superconducting magnet systems is about 10,000 tons. The use of Nb3Sn superconductor for manufacturing superconducting cables had not previously reached a mature stage; this project gave a thrust to significant R and D activities worldwide, and thanks to this project it is now a mature and reliable technology. The jacketing and manufacturing of long cables need special infrastructure up to about 760 m long at industry. The special building built for PF coil winding at the ITER Cadarache site is of size 250 m x 45 m. All these coils are made using cable-in-conduit conductors (CICC). These long CICCs have to carry currents as high as 68 kA in the case of TF coils. Due to this high current and

  1. Unlocking the EUR53 billion savings from smart meters in the EU. How increasing the adoption of dynamic tariffs could make or break the EU's smart grid investment

    International Nuclear Information System (INIS)

    Faruqui, Ahmad; Hledik, Ryan; Harris, Dan

    2010-01-01

    We estimate the cost of installing smart meters in the EU to be EUR51 billion, and that operational savings will be worth between EUR26 and 41 billion, leaving a gap of EUR10-25 billion between benefits and costs. Smart meters can fill this gap because they enable the provision of dynamic pricing, which reduces peak demand and lowers the need for building and running expensive peaking power plants. The present value of savings in peaking infrastructure could be as high as EUR67 billion for the EU if policy-makers can overcome barriers to consumers adopting dynamic tariffs, but only EUR14 billion otherwise. We outline a number of ways to increase the adoption of dynamic tariffs. (author)
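
    As a quick illustration of the arithmetic in this record, the sketch below recomputes the cost-benefit gap from the figures quoted in the abstract; the pairing of low/high estimates in the net-benefit lines is illustrative only and is not the authors' model.

```python
# Back-of-the-envelope recalculation of the gap quoted in the abstract.
# All figures (EUR billion) come from the abstract; the low/high pairings
# in the net-benefit lines are illustrative assumptions, not the authors' scenarios.
meter_cost = 51.0                      # estimated installation cost
operational_savings = (26.0, 41.0)     # low and high estimates
peaking_savings_with_tariffs = 67.0    # with wide dynamic-tariff adoption
peaking_savings_without = 14.0         # without it

for label, savings in zip(("low", "high"), operational_savings):
    print(f"Gap with {label} operational savings: EUR {meter_cost - savings:.0f} billion")

net_with = operational_savings[1] + peaking_savings_with_tariffs - meter_cost
net_without = operational_savings[0] + peaking_savings_without - meter_cost
print(f"Net benefit with dynamic tariffs:    EUR {net_with:.0f} billion")
print(f"Net benefit without dynamic tariffs: EUR {net_without:.0f} billion")
```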

  2. Molecular dynamics beyond the limits: Massive scaling on 72 racks of a BlueGene/P and supercooled glass dynamics of a 1 billion particles system

    KAUST Repository

    Allsopp, Nicholas

    2012-04-01

    We report scaling results on the world's largest supercomputer of our recently developed Billions-Body Molecular Dynamics (BBMD) package, which was especially designed for massively parallel simulations of the short-range atomic dynamics in structural glasses and amorphous materials. The code was able to scale up to 72 racks of an IBM BlueGene/P, with a measured 89% efficiency for a system with 100 billion particles. The code speed, with 0.13 s per iteration in the case of 1 billion particles, paves the way to the study of billion-body structural glasses with a resolution increase of two orders of magnitude with respect to the largest simulation ever reported. We demonstrate the effectiveness of our code by studying the liquid-glass transition of an exceptionally large system made of a binary mixture of 1 billion particles. © 2012.
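
    For readers unfamiliar with figures like the 89% quoted above, the minimal sketch below shows how weak-scaling efficiency is conventionally computed from per-iteration timings; both timings here are hypothetical values chosen only to illustrate the calculation.

```python
# Minimal sketch of a weak-scaling efficiency calculation (the convention
# behind figures like the 89% quoted above). Both per-iteration times below
# are hypothetical; they do not reproduce the BBMD benchmark.

def weak_scaling_efficiency(t_baseline: float, t_scaled: float) -> float:
    # Under weak scaling the problem grows with the machine, so the ideal
    # per-iteration time stays constant and efficiency is the ratio of times.
    return t_baseline / t_scaled

t_small_partition = 0.115   # seconds per iteration on a small partition (assumed)
t_full_machine = 0.130      # seconds per iteration on the full machine (assumed)

print(f"Weak-scaling efficiency: {weak_scaling_efficiency(t_small_partition, t_full_machine):.0%}")
```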

  3. Measurable Control System Security through Ideal Driven Technical Metrics

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Sean McBride; Marie Farrar; Zachary Tudor

    2008-01-01

    The Department of Homeland Security National Cyber Security Division supported development of a small set of security ideals as a framework to establish measurable control systems security. Based on these ideals, a draft set of proposed technical metrics was developed to allow control systems owner-operators to track improvements or degradations in their individual control systems security posture. The technical metrics development effort included review and evaluation of over thirty metrics-related documents. On the basis of complexity, ambiguity, or misleading and distorting effects, the metrics identified during the reviews were determined to be weaker than necessary to aid defense against the myriad threats posed by cyber-terrorism to human safety, as well as to economic prosperity. Using the results of our metrics review and the set of security ideals as a starting point for metrics development, we identified thirteen potential technical metrics - with at least one metric supporting each ideal. Two case study applications of the ideals and thirteen metrics to control systems were then performed to establish potential difficulties in applying both the ideals and the metrics. The case studies resulted in no changes to the ideals and only a few deletions and refinements to the thirteen potential metrics, leading to a final proposed set of ten core technical metrics. To further validate the security ideals, the modifications made to the original thirteen potential metrics, and the final proposed set of ten core metrics, seven separate control systems security assessments performed over the past three years were reviewed for findings and recommended mitigations. These findings and mitigations were then mapped to the security ideals and metrics to assess gaps in their coverage. The mappings indicated that there are no gaps in the security ideals and that the ten core technical metrics provide significant coverage of standard security issues, with 87% coverage. Based
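
    The coverage figure at the end of this record is essentially a mapping exercise; a toy version, with entirely made-up finding and metric names, might look like the sketch below.

```python
# Toy illustration of the coverage calculation implied above: map each
# assessment finding to the core metrics that would have flagged it, and
# report the fraction of findings covered. All names here are invented.
findings_to_metrics = {
    "default passwords on HMI accounts":  {"account hygiene"},
    "unpatched historian server":         {"patch latency"},
    "flat control network":               {"network segmentation"},
    "no security event logging":          set(),   # not covered by any metric
}

covered = sum(1 for metrics in findings_to_metrics.values() if metrics)
coverage = covered / len(findings_to_metrics)
print(f"{coverage:.0%} of findings are covered by at least one core metric")
```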

  4. Durability of concrete structures and robustness with respect to accidental actions

    OpenAIRE

    TOUTLEMONDE, François; LABORATOIRE CENTRAL DES PONTS ET CHAUSSEES - LCPC

    2007-01-01

    Experience with the real-world behaviour of existing concrete structures, especially under extreme environmental conditions, under severe accidental loading, or after long-term ageing in service, highlights the need to better integrate the objective of durability into structural design, the stakes of new reliability-based concepts for durability-oriented design, the value of performance-based material specification, the importance of the...

  5. Two new species of nematodes (Nematoda) from highly mineralized rivers of Lake El'ton basin, Russia.

    Science.gov (United States)

    Gusakov, Vladimir A; Gagarin, Vladimir G

    2016-09-05

    Two new nematode species, Mesodorylaimus rivalis sp. n. and Allodiplogaster media sp. n., from the highly mineralized rivers of the El'ton Lake basin (Russia) are described and illustrated from numerous mature females and males. Mesodorylaimus rivalis sp. n. is similar to M. vulvapapillatus Bagaturia & Eliava, 1966, but differs from it in the longer body, shorter spicules and longer female prerectum. Allodiplogaster media sp. n. resembles A. lupata (Shoshin, 1989) Kanzaki, Ragsdale & Giblin-Davis, 2014 and A. mordax (Shoshin, 1989) Kanzaki, Ragsdale & Giblin-Davis, 2014, but differs from the first species in having a shorter pharynx, shorter outer labial setae, longer spicules and different ratio between anterior and posterior pharynx sections, and from A. mordax in the thinner body, shorter pharynx and longer spicules.

  6. arXiv Photon detector system performance in the DUNE 35-ton prototype liquid argon time projection chamber

    CERN Document Server

    Adams, D.L.; Anderson, J.T.; Bagby, L.; Baird, M.; Barr, G.; Barros, N.; Biery, K.; Blake, A.; Blaufuss, E.; Boone, T.; Booth, A.; Brailsford, D.; Buchanan, N.; Chatterjee, A.; Convery, M.; Davies, J.; Dealtry, T.; DeLurgio, P.; Deuerling, G.; Dharmapalan, R.; Djurcic, Z.; Drake, G.; Eberly, B.; Freeman, J.; Glavin, S.; Gomes, R.A.; Goodman, M.C.; Graham, M.; Hahn, A.; Haigh, J.T.; Hartnell, J.; Higuera, A.; Himmel, A.; Insler, J.; Jacobsen, J.; Junk, T.; Kirby, B.; Klein, J.; Kudryavtsev, V.A.; Kutter, T.; Li, Y.; Li, X.; Lin, S.; Martin-Albo, J.; McConkey, N.; Moura, C.A.; Mufson, S.; Nicholls, T.C.; Nowak, J.; Oberling, M.; Paley, J.; Qian, X.; Raaf, J.L.; Rivera, D.; Santucci, G.; Sinev, G.; Spooner, N.J. C.; Stancari, M.; Stancu, I.; Stefan, D.; Stewart, J.; Stock, J.; Strauss, T.; Sulej, R.; Sun, Y.; Thiesse, M.; Thompson, L.F.; Tsai, Y.T.; Wallbank, M.; Warburton, T.K.; Warner, D.; Whittington, D.; Wilson, R.J.; Worcester, M.; Worcester, E.; Yang, T.; Zhang, C.

    The 35-ton prototype for the Deep Underground Neutrino Experiment far detector was a single-phase liquid argon time projection chamber with an integrated photon detector system, all situated inside a membrane cryostat. The detector took cosmic-ray data for six weeks during the period of February 1, 2016 to March 12, 2016. The performance of the photon detection system was checked with these data. An installed photon detector was demonstrated to measure the arrival times of cosmic-ray muons with a resolution better than 32 ns, limited by the timing of the trigger system. A measurement of the timing resolution using closely-spaced calibration pulses yielded a resolution of 15 ns for pulses at a level of 6 photo-electrons. Scintillation light from cosmic-ray muons was observed to be attenuated with increasing distance with a characteristic length of $155 \pm 28$ cm.
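
    The characteristic attenuation length quoted above is the decay constant of an exponential fit of light yield versus distance; a sketch of such a fit on synthetic data (not the DUNE 35-ton data) is given below.

```python
# Sketch of an exponential-attenuation fit of the kind that yields a
# characteristic length like the ~155 cm quoted above. The data points are
# synthetic, generated only to illustrate the fitting procedure.
import numpy as np
from scipy.optimize import curve_fit

def light_yield(distance, n0, att_length):
    # Expected signal (photo-electrons) vs. track distance, in cm.
    return n0 * np.exp(-distance / att_length)

rng = np.random.default_rng(0)
distance = np.linspace(20, 300, 15)                         # cm
truth = light_yield(distance, 40.0, 155.0)                  # toy "true" curve
measured = truth * rng.normal(1.0, 0.05, size=truth.size)   # add 5% scatter

popt, pcov = curve_fit(light_yield, distance, measured, p0=(30.0, 100.0))
att_fit, att_err = popt[1], np.sqrt(pcov[1, 1])
print(f"Fitted characteristic length: {att_fit:.0f} +/- {att_err:.0f} cm")
```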

  7. Comparison of luminance based metrics in different lighting conditions

    DEFF Research Database (Denmark)

    Wienold, J.; Kuhn, T.E.; Christoffersen, J.

    In this study, we evaluate established and newly developed metrics for predicting glare using data from three different research studies. The evaluation covers two different targets: 1. How well does the user's perception of glare magnitude correlate with the prediction of the glare metrics? 2. How well do the glare metrics describe the subjects' disturbance by glare? We applied Spearman correlations, logistic regressions and an accuracy evaluation based on an ROC analysis. The results show that five of the twelve investigated metrics fail at least one of the statistical tests. The other seven metrics CGI, modified DGI, DGP, Ev, average luminance of the image Lavg, UGP and UGR pass all statistical tests. DGP, CGI, DGI_mod and UGP have the largest AUC and might be slightly more robust. The accuracy of the predictions of the aforementioned seven metrics for the disturbance by glare lies
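
    A minimal version of the two evaluation targets described in this record, Spearman correlation against perceived magnitude and an ROC/AUC test against binary disturbance responses, is sketched below on randomly generated data.

```python
# Minimal sketch of the two evaluation targets described above, run on
# randomly generated data: (1) Spearman correlation of a metric against
# perceived glare magnitude, (2) ROC/AUC against binary disturbance votes.
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 200
metric = rng.uniform(0.2, 0.6, n)                 # hypothetical DGP-like values
perceived = metric + rng.normal(0, 0.05, n)       # toy subjective magnitude
disturbed = (perceived > 0.40).astype(int)        # toy binary disturbance vote

rho, p_value = spearmanr(metric, perceived)
auc = roc_auc_score(disturbed, metric)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.1e}), AUC = {auc:.2f}")
```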

  8. Performance metrics for the evaluation of hyperspectral chemical identification systems

    Science.gov (United States)

    Truslow, Eric; Golowich, Steven; Manolakis, Dimitris; Ingle, Vinay

    2016-02-01

    Remote sensing of chemical vapor plumes is a difficult but important task for many military and civilian applications. Hyperspectral sensors operating in the long-wave infrared regime have well-demonstrated detection capabilities. However, the identification of a plume's chemical constituents, based on a chemical library, is a multiple hypothesis testing problem which standard detection metrics do not fully describe. We propose using an additional performance metric for identification based on the so-called Dice index. Our approach partitions and weights a confusion matrix to develop both the standard detection metrics and identification metric. Using the proposed metrics, we demonstrate that the intuitive system design of a detector bank followed by an identifier is indeed justified when incorporating performance information beyond the standard detection metrics.
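
    The Dice index mentioned here reduces, in its simplest set-based form, to twice the overlap divided by the summed sizes; a toy version for one plume pixel is sketched below (the paper's weighted, confusion-matrix-based construction is more elaborate).

```python
# Toy set-based Dice index for chemical identification in a single plume
# pixel; the chemical names are invented, and the paper's weighted,
# confusion-matrix-based construction is more elaborate than this.
def dice_index(truth: set, reported: set) -> float:
    # Dice = 2 |A ∩ B| / (|A| + |B|); 1.0 means a perfect identification.
    if not truth and not reported:
        return 1.0
    return 2 * len(truth & reported) / (len(truth) + len(reported))

true_constituents = {"SF6", "NH3"}        # hypothetical ground truth
identified = {"SF6", "NH3", "CH4"}        # identifier output, one false alarm
print(f"Dice index: {dice_index(true_constituents, identified):.2f}")   # 0.80
```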

  9. Metrics correlation and analysis service (MCAS)

    International Nuclear Information System (INIS)

    Baranovski, Andrew; Dykstra, Dave; Garzoglio, Gabriele; Hesselroth, Ted; Mhashilkar, Parag; Levshina, Tanya

    2010-01-01

    The complexity of Grid workflow activities and their associated software stacks inevitably involves multiple organizations, ownership, and deployment domains. In this setting, important and common tasks such as the correlation and display of metrics and debugging information (fundamental ingredients of troubleshooting) are challenged by the informational entropy inherent to independently maintained and operated software components. Because such an information pool is disorganized, it is a difficult environment for business intelligence analysis i.e. troubleshooting, incident investigation, and trend spotting. The mission of the MCAS project is to deliver a software solution to help with adaptation, retrieval, correlation, and display of workflow-driven data and of type-agnostic events, generated by loosely coupled or fully decoupled middleware.

  10. Metrics correlation and analysis service (MCAS)

    International Nuclear Information System (INIS)

    Baranovski, Andrew; Dykstra, Dave; Garzoglio, Gabriele; Hesselroth, Ted; Mhashilkar, Parag; Levshina, Tanya

    2009-01-01

    The complexity of Grid workflow activities and their associated software stacks inevitably involves multiple organizations, ownership, and deployment domains. In this setting, important and common tasks such as the correlation and display of metrics and debugging information (fundamental ingredients of troubleshooting) are challenged by the informational entropy inherent to independently maintained and operated software components. Because such an information 'pond' is disorganized, it is a difficult environment for business intelligence analysis, i.e. troubleshooting, incident investigation and trend spotting. The mission of the MCAS project is to deliver a software solution to help with adaptation, retrieval, correlation, and display of workflow-driven data and of type-agnostic events, generated by disjoint middleware.

  11. Development of Technology Transfer Economic Growth Metrics

    Science.gov (United States)

    Mastrangelo, Christina M.

    1998-01-01

    The primary objective of this project is to determine the feasibility of producing technology transfer metrics that answer the question: Do NASA/MSFC technical assistance activities impact economic growth? The data for this project resides in a 7800-record database maintained by Tec-Masters, Incorporated. The technology assistance data results from survey responses from companies and individuals who have interacted with NASA via a Technology Transfer Agreement, or TTA. The goal of this project was to determine if the existing data could provide indications of increased wealth. This work demonstrates that there is evidence that companies that used NASA technology transfer have a higher job growth rate than the rest of the economy. It also shows that the jobs being supported are jobs in higher wage SIC codes, and this indicates improvements in personal wealth. Finally, this work suggests that with correct data, the wealth issue may be addressed.

  12. MESUR metrics from scholarly usage of resources

    CERN Document Server

    CERN. Geneva; Van de Sompel, Herbert

    2007-01-01

    Usage data is increasingly regarded as a valuable resource in the assessment of scholarly communication items. However, the development of quantitative, usage-based indicators of scholarly impact is still in its infancy. The Digital Library Research & Prototyping Team at the Los Alamos National Laboratory's Research library has therefore started a program to expand the set of usage-based tools for the assessment of scholarly communication items. The two-year MESUR project, funded by the Andrew W. Mellon Foundation, aims to define and validate a range of usage-based impact metrics, and issue guidelines with regards to their characteristics and proper application. The MESUR project is constructing a large-scale semantic model of the scholarly community that seamlessly integrates a wide range of bibliographic, citation and usage data. Functioning as a reference data set, this model is analyzed to characterize the intricate networks of typed relationships that exist in the scholarly community. The resulting c...

  13. Einstein metrics and Brans-Dicke superfields

    International Nuclear Information System (INIS)

    Marques, S.

    1988-01-01

    A space conformal to the Einstein space-time is obtained here, making the transition from an internal bosonic space, constructed with the constant Majorana spinors in the Majorana representation, to a bosonic "superspace" through the use of Einstein vierbeins. These spaces are related to a Grassmann space constructed with the Majorana spinors referred to above, where the "metric" is a function of internal bosonic coordinates. The conformal function is a scale factor in the zone of gravitational radiation. A conformal function dependent on space-time coordinates can be constructed in that region when we introduce Majorana spinors which are functions of those coordinates. With this we obtain a scalar field of Brans-Dicke type. 11 refs
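
    For orientation, a conformal rescaling of the kind referred to in this record has the standard form below (generic notation, not necessarily the author's construction), with the conformal factor playing the role of a Brans-Dicke-type scalar.

```latex
% Standard form of a conformal rescaling of the Einstein metric; in
% Brans-Dicke-type theories the conformal factor is tied to a scalar
% field \phi (generic notation, not necessarily the author's).
\[
  \tilde{g}_{\mu\nu}(x) = \Omega^{2}(x)\, g_{\mu\nu}(x),
  \qquad
  \Omega^{2}(x) \propto \phi^{-1}(x).
\]
```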

  14. Advanced reactors: the case for metric design

    International Nuclear Information System (INIS)

    Ruby, L.

    1986-01-01

    The author argues that DOE should insist that all design specifications for advanced reactors be in the International System of Units (SI) in accordance with the Metric Conversion Act of 1975. Despite a lack of leadership from the federal government, industry has had to move toward conversion in order to compete on world markets. The US is the only major country without a scheduled conversion program. SI avoids the disadvantages of ambiguous names, non-coherent units, multiple units for the same quantity, multiple definitions, as well as barriers to international exchange and marketing and problems in comparing safety and code parameters. With a first step by DOE, the Nuclear Regulatory Commission should add the same requirements to reactor licensing guidelines. 4 references

  15. Analytical Cost Metrics : Days of Future Past

    Energy Technology Data Exchange (ETDEWEB)

    Prajapati, Nirmal [Colorado State Univ., Fort Collins, CO (United States); Rajopadhye, Sanjay [Colorado State Univ., Fort Collins, CO (United States); Djidjev, Hristo Nikolov [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-20

    As we move towards the exascale era, the new architectures must be capable of running the massive computational problems efficiently. Scientists and researchers are continuously investing in tuning the performance of extreme-scale computational problems. These problems arise in almost all areas of computing, ranging from big data analytics, artificial intelligence, search, machine learning, virtual/augmented reality, computer vision, image/signal processing to computational science and bioinformatics. With Moore’s law driving the evolution of hardware platforms towards exascale, the dominant performance metric (time efficiency) has now expanded to also incorporate power/energy efficiency. Therefore the major challenge that we face in computing systems research is: “how to solve massive-scale computational problems in the most time/power/energy efficient manner?”

  16. Clean Cities 2013 Annual Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, C.; Singer, M.

    2014-10-01

    Each year, the U.S. Department of Energy asks its Clean Cities program coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterize the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction (IR) initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this 2013 Annual Metrics Report.

  17. Clean Cities 2014 Annual Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Caley [National Renewable Energy Lab. (NREL), Golden, CO (United States); Singer, Mark [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-12-22

    Each year, the U.S. Department of Energy asks its Clean Cities program coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterize the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction (IR) initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this 2014 Annual Metrics Report.

  18. Outsourced similarity search on metric data assets

    KAUST Repository

    Yiu, Man Lung

    2012-02-01

    This paper considers a cloud computing setting in which similarity querying of metric data is outsourced to a service provider. The data is to be revealed only to trusted users, not to the service provider or anyone else. Users query the server for the most similar data objects to a query example. Outsourcing offers the data owner scalability and a low initial investment. The need for privacy may be due to the data being sensitive (e.g., in medicine), valuable (e.g., in astronomy), or otherwise confidential. Given this setting, the paper presents techniques that transform the data prior to supplying it to the service provider for similarity queries on the transformed data. Our techniques provide interesting trade-offs between query cost and accuracy. They are then further extended to offer an intuitive privacy guarantee. Empirical studies with real data demonstrate that the techniques are capable of offering privacy while enabling efficient and accurate processing of similarity queries.
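
    The general idea, transforming the data so the server can still answer distance queries without seeing raw coordinates, can be illustrated with a toy isometry; the sketch below is not the paper's scheme and offers only weak privacy, but it shows why nearest-neighbour answers survive such a transformation.

```python
# Toy illustration (not the paper's scheme): a secret rotation plus offset
# preserves Euclidean distances, so a server holding only the transformed
# points still returns the correct nearest neighbour. Real schemes need
# stronger privacy guarantees than this.
import numpy as np

rng = np.random.default_rng(7)
data = rng.normal(size=(1000, 8))     # owner's metric data (toy)
query = rng.normal(size=8)

Q, _ = np.linalg.qr(rng.normal(size=(8, 8)))   # secret orthogonal rotation
t = rng.normal(size=8)                         # secret offset

server_data = data @ Q + t            # what the service provider stores
server_query = query @ Q + t          # what a trusted user sends per query

idx_server = np.argmin(np.linalg.norm(server_data - server_query, axis=1))
idx_plain = np.argmin(np.linalg.norm(data - query, axis=1))
assert idx_server == idx_plain
print("Nearest neighbour preserved under the transformation:", idx_server)
```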

  19. Special metrics and group actions in geometry

    CERN Document Server

    Fino, Anna; Musso, Emilio; Podestà, Fabio; Vezzoni, Luigi

    2017-01-01

    The volume is a follow-up to the INdAM meeting “Special metrics and quaternionic geometry” held in Rome in November 2015. It offers a panoramic view of a selection of cutting-edge topics in differential geometry, including 4-manifolds, quaternionic and octonionic geometry, twistor spaces, harmonic maps, spinors, complex and conformal geometry, homogeneous spaces and nilmanifolds, special geometries in dimensions 5–8, gauge theory, symplectic and toric manifolds, exceptional holonomy and integrable systems. The workshop was held in honor of Simon Salamon, a leading international scholar at the forefront of academic research who has made significant contributions to all these subjects. The articles published here represent a compelling testimony to Salamon’s profound and longstanding impact on the mathematical community. Target readership includes graduate students and researchers working in Riemannian and complex geometry, Lie theory and mathematical physics.

  20. Operational Efficiencies and Simulated Performance of Big Data Analytics Platform over Billions of Patient Records of a Hospital System

    Directory of Open Access Journals (Sweden)

    Dillon Chrimes

    2017-01-01

    Full Text Available Big Data Analytics (BDA) is important for utilizing data from hospital systems to reduce healthcare costs. BDA enables interactive, dynamic querying of large volumes of patient data for healthcare. The study objective was to establish a high-performance, interactive BDA platform for a hospital system. A Hadoop/MapReduce framework was established at the University of Victoria (UVic) with Compute Canada/WestGrid to form a Healthcare BDA (HBDA) platform with an HBase (NoSQL) database, using hospital-specific metadata and file ingestion. Patient data profiles and the clinical workflow were derived from the Vancouver Island Health Authority (VIHA), Victoria, BC, Canada. The proof-of-concept implementation tested patient data representative of the entire provincial hospital systems. We cross-referenced all data profiles and metadata with real patient data used in clinical reporting. Query performance tested Apache tools in Hadoop's ecosystem. At optimized iteration, Hadoop Distributed File System (HDFS) ingestion required three seconds, but HBase required four to twelve hours to complete the Reducer of MapReduce. HBase bulkloads took a week for one billion records (10 TB) and over two months for three billion (30 TB). Simple and complex queries returned results in about two seconds for one and three billion records, respectively. Apache Drill outperformed Apache Spark; however, it was restricted to running more simplified queries and had poor usability for healthcare. Jupyter on Spark offered high performance and customization to run all queries simultaneously with high usability. The BDA platform with HBase distributed over Hadoop was established successfully; however, some inconsistencies of MapReduce limited operational efficiencies. The importance of Hadoop/MapReduce to the platform's performance is discussed.
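
    The kind of interactive query the platform runs from Jupyter on Spark might look roughly like the sketch below; the path, table and column names are hypothetical (not the VIHA/UVic schema), and in the study the data are served from HBase over HDFS rather than a Parquet file.

```python
# Rough sketch of an interactive Spark query of the kind described above.
# The HDFS path and column names are hypothetical, not the VIHA/UVic schema;
# in the study the data are served from HBase rather than a Parquet file.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("hbda-query-sketch").getOrCreate()

encounters = spark.read.parquet("hdfs:///hbda/patient_encounters")  # hypothetical path

# "Complex" query: admissions and average length of stay per diagnosis and year
result = (
    encounters
    .groupBy("diagnosis_code", F.year("admit_date").alias("year"))
    .agg(F.count("*").alias("admissions"),
         F.avg("length_of_stay").alias("avg_los"))
    .orderBy(F.desc("avg_los"))
)
result.show(20)
```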