WorldWideScience

Sample records for billion metric tons

  1. Summary and Comparison of the 2016 Billion-Ton Report with the 2011 U.S. Billion-Ton Update

    Energy Technology Data Exchange (ETDEWEB)

    None

    2016-06-01

    In terms of the magnitude of the resource potential, the results of the 2016 Billion-Ton Report (BT16) are consistent with the original 2005 Billion-Ton Study (BTS) and the 2011 report, U.S. Billion-Ton Update: Biomass Supply for a Bioenergy and Bioproducts Industry (BT2). An effort was made to reevaluate the potential forestland, agricultural, and waste resources at the roadside, then to extend the analysis to major resource fractions by adding transportation costs to a biorefinery under specified logistics assumptions.

  2. Sneak Peek to the 2016 Billion-Ton Report

    Energy Technology Data Exchange (ETDEWEB)

    None

    2016-06-01

    The 2005 Billion-Ton Study became a landmark resource for bioenergy stakeholders, detailing for the first time the potential to produce at least one billion dry tons of biomass annually in a sustainable manner from U.S. agriculture and forest resources. The 2011 U.S. Billion-Ton Update expanded and updated the analysis, and in 2016, the U.S. Department of Energy’s Bioenergy Technologies Office plans to release the 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy.

  3. Regional Feedstock Partnership Summary Report: Enabling the Billion-Ton Vision

    Energy Technology Data Exchange (ETDEWEB)

    Owens, Vance N. [South Dakota State Univ., Brookings, SD (United States). North Central Sun Grant Center; Karlen, Douglas L. [Dept. of Agriculture Agricultural Research Service, Ames, IA (United States). National Lab. for Agriculture and the Environment; Lacey, Jeffrey A. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Process Science and Technology Division

    2016-07-12

    The U.S. Department of Energy (DOE) and the Sun Grant Initiative established the Regional Feedstock Partnership (referred to as the Partnership) to address information gaps associated with enabling the vision of a sustainable, reliable, billion-ton U.S. bioenergy industry by the year 2030 (i.e., the Billion-Ton Vision). Over the past 7 years (2008–2014), the Partnership has been successful at advancing the biomass feedstock production industry in the United States, with notable accomplishments. The Billion-Ton Study identifies the technical potential to expand domestic biomass production to offset up to 30% of U.S. petroleum consumption, while continuing to meet demands for food, feed, fiber, and export. This study verifies for the biofuels and chemical industries that a real and substantial resource base could justify the significant investment needed to develop robust conversion technologies and commercial-scale facilities. DOE and the Sun Grant Initiative established the Partnership to demonstrate and validate the assumptions underpinning the Billion-Ton Vision to supply a sustainable and reliable source of lignocellulosic feedstock to a large-scale bioenergy industry. This report discusses the accomplishments of the Partnership, with references to accompanying scientific publications. These accomplishments include advances in sustainable feedstock production, feedstock yield, yield stability and stand persistence, energy crop commercialization readiness, information transfer, assessment of the economic impacts of achieving the Billion-Ton Vision, and the impact of feedstock species and environmental conditions on feedstock quality characteristics.

  4. 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy

    Energy Technology Data Exchange (ETDEWEB)

    None

    2016-07-06

    This product builds on previous efforts, namely the 2005 Billion-Ton Study (BTS) and the 2011 U.S. Billion-Ton Update (BT2). With each report, greater perspective is gained on the potential of biomass resources to contribute to a national energy strategy. Similarly, each successive report introduces new questions regarding commercialization challenges. BTS quantified the broad biophysical potential of biomass nationally, and BT2 elucidated the potential economic availability of these resources. These reports clearly established the potential availability of up to one billion tons of biomass resources nationally. However, many questions remain, including but not limited to crop yields, climate change impacts, logistical operations, and systems integration across production, harvest, and conversion. The present report aims to address many of these questions through empirically modeled energy crop yields, scenario analysis of resources delivered to biorefineries, and the addition of new feedstocks. Volume 2 of the 2016 Billion-Ton Report is expected to be released by the end of 2016. It seeks to evaluate environmental sustainability indicators of select scenarios from volume 1 and potential climate change impacts on future supplies.

  5. 2016 Billion-ton report: Advancing domestic resources for a thriving bioeconomy, Volume 1: Economic availability of feedstock

    Science.gov (United States)

    M.H. Langholtz; B.J. Stokes; L.M. Eaton

    2016-01-01

    This product builds on previous efforts, namely the 2005 Billion-Ton Study (BTS) and the 2011 U.S. Billion-Ton Update (BT2). With each report, greater perspective is gained on the potential of biomass resources to contribute to a national energy strategy. Similarly, each successive report introduces new questions regarding commercialization challenges. BTS quantified...

  6. U.S. Billion-Ton Update: Biomass Supply for a Bioenergy and Bioproducts Industry

    Energy Technology Data Exchange (ETDEWEB)

    Downing, Mark [ORNL; Eaton, Laurence M [ORNL; Graham, Robin Lambert [ORNL; Langholtz, Matthew H [ORNL; Perlack, Robert D [ORNL; Turhollow Jr, Anthony F [ORNL; Stokes, Bryce [Navarro Research & Engineering; Brandt, Craig C [ORNL

    2011-08-01

    The report, Biomass as Feedstock for a Bioenergy and Bioproducts Industry: The Technical Feasibility of a Billion-Ton Annual Supply (generally referred to as the Billion-Ton Study or 2005 BTS), was an estimate of 'potential' biomass based on numerous assumptions about current and future inventory, production capacity, availability, and technology. The analysis was made to determine if conterminous U.S. agriculture and forestry resources had the capability to produce at least one billion dry tons of sustainable biomass annually to displace 30% or more of the nation's present petroleum consumption. An effort was made to use conservative estimates to assure confidence in having sufficient supply to reach the goal. The potential biomass was projected to be reasonably available around mid-century when large-scale biorefineries are likely to exist. The study emphasized primary sources of forest- and agriculture-derived biomass, such as logging residues, fuel treatment thinnings, crop residues, and perennially grown grasses and trees. These primary sources have the greatest potential to supply large, reliable, and sustainable quantities of biomass. While the primary sources were emphasized, estimates of secondary residue and tertiary waste resources of biomass were also provided. The original Billion-Ton Resource Assessment, published in 2005, was divided into two parts: forest-derived resources and agriculture-derived resources. The forest resources included residues produced during the harvesting of merchantable timber, forest residues, and small-diameter trees that could become available through initiatives to reduce fire hazards and improve forest health; forest residues from land conversion; fuelwood extracted from forests; residues generated at primary forest product processing mills; and urban wood wastes, municipal solid wastes (MSW), and construction and demolition (C&D) debris. For these forest resources, only residues, wastes, and small

  7. 2016 Billion-Ton Report: Environmental Sustainability Effects of Select Scenarios from Volume 1 (Volume 2)

    Energy Technology Data Exchange (ETDEWEB)

    Efroymson, R. A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Langholtz, M. H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, K. E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Stokes, B. J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-13

    On behalf of all the authors and contributors, it is a great privilege to present the 2016 Billion-Ton Report (BT16), volume 2: Environmental Sustainability Effects of Select Scenarios from volume 1. This report represents the culmination of several years of collaborative effort among national laboratories, government agencies, academic institutions, and industry. BT16 was developed to support the U.S. Department of Energy’s efforts towards national goals of energy security and associated quality of life.

  8. Semiportable load-cell-based weighing system prototype of 18.14-metric-ton (20-ton) capacity for UF6 cylinder weight verifications: description and testing procedure

    International Nuclear Information System (INIS)

    McAuley, W.A.

    1984-01-01

    The 18.14-metric-ton-capacity (20-ton) Load-Cell-Based Weighing System (LCBWS) prototype tested at the Oak Ridge (Tennessee) Gaseous Diffusion Plant March 20-30, 1984, is semiportable and has the potential for being highly accurate. Designed by Brookhaven National Laboratory, it can be moved to cylinders for weighing, as opposed to the widely used operating philosophy of most enrichment facilities of moving cylinders to stationary accountability scales. Composed mainly of commercially available, off-the-shelf hardware, the system's principal elements are two load cells that sense the weight (i.e., force) of a uranium hexafluoride (UF6) cylinder suspended from the LCBWS while the cylinder is in the process of being weighed. Portability is achieved by its attachment to a double-hook, overhead-bridge crane. The LCBWS prototype is designed to weigh 9.07- and 12.70-metric-ton (10- and 14-ton) UF6 cylinders. A detailed description of the LCBWS is given, design information and criteria are supplied, a testing procedure is outlined, and initial test results are reported. A major objective of the testing is to determine the reliability and accuracy of the system. Other testing objectives include the identification of (1) potential areas for system improvements and (2) procedural modifications that will reflect an improved and more efficient system. The testing procedure described includes, but is not limited to, methods that account for temperature sensitivity of the instrumentation, the local variation in the acceleration due to gravity, and buoyancy effects. Operational and safety considerations are noted. A preliminary evaluation of the March test data indicates that the LCBWS prototype has the potential to have an accuracy in the vicinity of 1 kg.

  9. Biomass as Feedstock for a Bioenergy and Bioproducts Industry: The Technical Feasibility of a Billion-Ton Annual Supply, April 2005

    Energy Technology Data Exchange (ETDEWEB)

    None

    2005-04-01

    The purpose of this report is to determine whether the land resources of the United States are capable of producing a sustainable supply of biomass sufficient to displace 30 percent or more of the country’s present petroleum consumption – the goal set by the Biomass R&D Technical Advisory Committee in their vision for biomass technologies. Accomplishing this goal would require approximately 1 billion dry tons of biomass feedstock per year.

  10. Taking out 1 billion tons of CO2: The magic of China's 11th Five-Year Plan?

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Nan; Lin, Jiang; Zhou, Nan; Levine, Mark; Fridley, David

    2007-07-01

    China's 11th Five-Year Plan (FYP) sets an ambitious target for energy-efficiency improvement: energy intensity of the country's gross domestic product (GDP) should be reduced by 20% from 2005 to 2010 (NDRC, 2006). This is the first time that a quantitative and binding target has been set for energy efficiency, and it signals a major shift in China's strategic thinking about its long-term economic and energy development. The 20% energy intensity target also translates into an annual reduction of over 1.5 billion tons of CO2 by 2010, making the Chinese effort one of the most significant carbon mitigation efforts in the world today. While it is still too early to tell whether China will achieve this target, this paper attempts to understand the trend in energy intensity in China and to explore a variety of options toward meeting the 20% target using a detailed end-use energy model.

  11. Taking out 1 billion tons of CO2: The magic of China's 11th Five-Year Plan?

    International Nuclear Information System (INIS)

    Lin Jiang; Zhou Nan; Levine, Mark; Fridley, David

    2008-01-01

    China's 11th Five-Year Plan (FYP) sets an ambitious target for energy-efficiency improvement: energy intensity of the country's gross domestic product (GDP) should be reduced by 20% from 2005 to 2010 [National Development and Reform Commission (NDRC), 2006. Overview of the 11th Five Year Plan for National Economic and Social Development. NDRC, Beijing]. This is the first time that a quantitative and binding target has been set for energy efficiency, and it signals a major shift in China's strategic thinking about its long-term economic and energy development. The 20% energy-intensity target also translates into an annual reduction of over 1.5 billion tons of CO2 by 2010, making the Chinese effort one of the most significant carbon mitigation efforts in the world today. While it is still too early to tell whether China will achieve this target, this paper attempts to understand the trend in energy intensity in China and to explore a variety of options toward meeting the 20% target using a detailed end-use energy model.

  12. Defining a standard metric for electricity savings

    International Nuclear Information System (INIS)

    Koomey, Jonathan; Akbari, Hashem; Blumstein, Carl; Brown, Marilyn; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B; Greenberg, Steve

    2010-01-01

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T&D losses. Displacing such a plant for one year would save 3 billion kWh/year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question: Dr. Arthur H. Rosenfeld.
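
    The avoided-plant arithmetic above is easy to reproduce. A minimal sketch in Python, assuming a round emission factor of 1.0 metric ton of CO2 per MWh generated for an existing coal plant (an assumption of this sketch, not a figure restated from the letter):

```python
# Back-of-the-envelope check of the proposed "Rosenfeld" unit:
# a 500 MW existing coal plant, 70% capacity factor, 7% T&D losses.
# The 1.0 t CO2/MWh emission factor is an assumed round number
# typical of an existing coal plant, not quoted from the letter.
HOURS_PER_YEAR = 8760

capacity_mw = 500
capacity_factor = 0.70
td_losses = 0.07
co2_t_per_mwh = 1.0  # assumption

generation_mwh = capacity_mw * HOURS_PER_YEAR * capacity_factor  # at the busbar
meter_kwh = generation_mwh * (1 - td_losses) * 1000              # delivered
co2_t = generation_mwh * co2_t_per_mwh

print(f"{meter_kwh / 1e9:.2f} billion kWh/year at the meter")  # rounds to ~3 billion
print(f"{co2_t / 1e6:.2f} million metric tons CO2/year")       # ~3 million
```

    The delivered savings come out to about 2.85 billion kWh/year and avoided emissions to about 3.1 million metric tons, consistent with the letter's rounded figures of 3 billion kWh and 3 million metric tons.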

  13. Defining a standard metric for electricity savings

    Energy Technology Data Exchange (ETDEWEB)

    Koomey, Jonathan [Lawrence Berkeley National Laboratory and Stanford University, PO Box 20313, Oakland, CA 94620-0313 (United States); Akbari, Hashem; Blumstein, Carl; Brown, Marilyn; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B; Greenberg, Steve, E-mail: JGKoomey@stanford.ed

    2010-01-15

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T and D losses. Displacing such a plant for one year would save 3 billion kWh/year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question: Dr. Arthur H. Rosenfeld.

  14. Defining a Standard Metric for Electricity Savings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Marilyn; Akbari, Hashem; Blumstein, Carl; Koomey, Jonathan; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H.; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B.; Greenberg, Steve; Hafemeister, David; Harris, Jeff; Harvey, Hal; Heitz, Eric; Hirst, Eric; Hummel, Holmes; Kammen, Dan; Kelly, Henry; Laitner, Skip; Levine, Mark; Lovins, Amory; Masters, Gil; McMahon, James E.; Meier, Alan; Messenger, Michael; Millhone, John; Mills, Evan; Nadel, Steve; Nordman, Bruce; Price, Lynn; Romm, Joe; Ross, Marc; Rufo, Michael; Sathaye, Jayant; Schipper, Lee; Schneider, Stephen H; Sweeney, James L; Verdict, Malcolm; Vorsatz, Diana; Wang, Devra; Weinberg, Carl; Wilk, Richard; Wilson, John; Worrell, Ernst

    2009-03-01

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T&D losses. Displacing such a plant for one year would save 3 billion kWh per year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question: Dr. Arthur H. Rosenfeld.

  15. Winglets Save Billions of Dollars in Fuel Costs

    Science.gov (United States)

    2010-01-01

    The upturned ends now featured on many airplane wings are saving airlines billions of dollars in fuel costs. Called winglets, the drag-reducing technology was advanced through the research of Langley Research Center engineer Richard Whitcomb and through flight tests conducted at Dryden Flight Research Center. Seattle-based Aviation Partners Boeing -- a partnership between Aviation Partners Inc., of Seattle, and The Boeing Company, of Chicago -- manufactures Blended Winglets, a unique design featured on Boeing aircraft around the world. These winglets have saved more than 2 billion gallons of jet fuel to date, representing a cost savings of more than $4 billion and a reduction of almost 21.5 million tons in carbon dioxide emissions.

  16. Production equipment development needs for a 700 metric ton/year light water reactor mixed oxide fuel manufacturing plant

    International Nuclear Information System (INIS)

    Blahnik, D.E.

    1977-09-01

    A literature search and survey of fuel suppliers was conducted to determine how much development of production equipment is needed for a 700 metric tons/y LWR mixed-oxide (UO2--PuO2) fuel fabrication plant. Results indicate that moderate to major production equipment development is needed in the powder and pellet processing areas. The equipment in the rod and assembly processing areas needs only minor development effort. Required equipment development for a 700 MT/y plant is not anticipated to delay startup of the plant. The development, whether major or minor, can be done well within the time frame for licensing and construction of the plant as long as conventional production equipment is used.

  17. Proposed method for assigning metric tons of heavy metal values to defense high-level waste forms to be disposed of in a geologic repository

    International Nuclear Information System (INIS)

    1987-08-01

    A proposed method is described for assigning an equivalent metric ton heavy metal (eMTHM) value to defense high-level waste forms to be disposed of in a geologic repository. This method for establishing a curie equivalency between defense high-level waste and irradiated commercial fuel is based on the ratio of defense fuel exposure to the typical commercial fuel exposure (MWd/MTHM). Application of this technique to defense high-level wastes is described. Additionally, this proposed technique is compared to several alternate calculations for eMTHM. 15 refs., 2 figs., 10 tabs.
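
    The exposure-ratio idea described above can be sketched arithmetically: an equivalent-MTHM value follows from dividing the thermal exposure represented by the defense fuel by a typical commercial burnup in MWd/MTHM. The function and numbers below are illustrative assumptions for the sketch, not values taken from the report.

```python
def equivalent_mthm(defense_exposure_mwd: float,
                    commercial_burnup_mwd_per_mthm: float) -> float:
    """Equivalent metric tons heavy metal (eMTHM) for a defense
    high-level waste form, per the exposure-ratio method sketched
    above. Both arguments are illustrative placeholders, not
    figures from the report."""
    return defense_exposure_mwd / commercial_burnup_mwd_per_mthm

# Hypothetical example: a waste form representing 66,000 MWd of
# defense fuel exposure, measured against an assumed commercial
# burnup of 33,000 MWd/MTHM, would be assigned 2.0 eMTHM.
print(equivalent_mthm(66_000, 33_000))  # → 2.0
```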

  18. Land-Use Change and the Billion Ton 2016 Resource Assessment: Understanding the Effects of Land Management on Environmental Indicators

    Science.gov (United States)

    Kline, K. L.; Eaton, L. M.; Efroymson, R.; Davis, M. R.; Dunn, J.; Langholtz, M. H.

    2016-12-01

    The federal government, led by the U.S. Department of Energy (DOE), quantified potential U.S. biomass resources for expanded production of renewable energy and bioproducts in the 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy (BT16) (DOE 2016). Volume 1 of the report provides analysis of projected supplies from 2015 to 2040. Volume 2 (forthcoming) evaluates changes in environmental indicators for water quality and quantity, carbon, air quality, and biodiversity associated with production scenarios in BT16 volume 1. This presentation will review land-use allocations under the projected biomass production scenarios and the changes in land management that are implied, including drivers of direct and indirect LUC. National and global concerns such as deforestation and displacement of food production are addressed. The choice of reference scenario, input parameters, and constraints (e.g., regarding land classes, availability, and productivity) drives LUC results in any model simulation, and these are reviewed to put BT16 impacts into context. The principal LUC implied in BT16 supply scenarios involves the transition of 25 to 47 million acres (net) from annual crops in the 2015 baseline to perennial cover by 2040 under the base case and 3% yield growth case, respectively. We conclude that clear definitions of land parameters and effects are essential to assess LUC. A lack of consistency in parameters and outcomes of historical LUC analysis in the U.S. underscores the need for science-based approaches.

  19. Transportation system benefits of early deployment of a 75-ton multipurpose canister system

    International Nuclear Information System (INIS)

    Wankerl, M.W.; Schmid, S.P.

    1995-01-01

    In 1993 the US Civilian Radioactive Waste Management System (CRWMS) began developing two multipurpose canister (MPC) systems to provide a standardized method for interim storage and transportation of spent nuclear fuel (SNF) at commercial nuclear power plants. One is a 75-ton concept with an estimated payload of about 6 metric tons (t) of SNF, and the other is a 125-ton concept with an estimated payload of nearly 11 t of SNF. These payloads are two to three times the payload of the largest currently certified US rail transport cask, the IF-300. Although it is recognized that a fully developed 125-ton MPC system is likely to provide greater cost and radiation-exposure benefits than the lower-capacity 75-ton MPC, the authors of this paper suggest that development and deployment of the 75-ton MPC prior to developing and deploying a 125-ton MPC is a desirable strategy. Reasons that support this strategy are discussed in this paper.

  20. Immersion Freezing of Coal Combustion Ash Particles from the Texas Panhandle

    Science.gov (United States)

    Whiteside, C. L.; Tobo, Y.; Mulamba, O.; Brooks, S. D.; Mirrielees, J.; Hiranuma, N.

    2017-12-01

    Coal combustion aerosol particles contribute to the concentrations of ice-nucleating particles (INPs) in the atmosphere. In particular, immersion freezing can be considered one of the most important mechanisms for INP formation in supercooled tropospheric clouds that exist at temperatures between 0°C and -38°C. The U.S. contains more than 550 operating coal-burning plants, which consumed 7.2 × 10⁸ metric tons of coal (in 2016) to generate a total annual electricity output of >2 billion MWh, resulting in the emission of at least 4.9 × 10⁵ metric tons of PM10 (particulate matter smaller than 10 µm in diameter). In Texas alone, 19 combustion plants generate 0.15 billion MWh of electricity and >2.4 × 10⁴ metric tons of PM10. Here we present the immersion freezing behavior of combustion fly ash and bottom ash particles collected in the Texas Panhandle region. Two types of particulate samples, namely fly ash and bottom ash, were examined; results of electron microscopy on both ash types will also be presented to relate the crystallographic and chemical properties to their ice nucleation abilities.
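
    Treating the ">" quantities in the abstract as approximate bounds, the PM10 emission intensity per unit of electricity implied by these figures can be checked with simple division (a sketch using only the abstract's numbers):

```python
# Implied PM10 intensity of coal generation, from the figures above.
# Quantities carrying ">" or "at least" in the abstract are treated
# here as point estimates, so these intensities are rough only.
us_pm10_t = 4.9e5      # metric tons PM10 emitted (at least), US, 2016
us_gen_mwh = 2e9       # MWh generated (more than), US coal plants
tx_pm10_t = 2.4e4      # metric tons PM10 (more than), Texas
tx_gen_mwh = 0.15e9    # MWh generated, Texas coal plants

print(f"US:    {us_pm10_t / us_gen_mwh * 1000:.2f} kg PM10 per MWh")
print(f"Texas: {tx_pm10_t / tx_gen_mwh * 1000:.2f} kg PM10 per MWh")
```

    On these figures, the national fleet emits roughly 0.25 kg of PM10 per MWh, and the Texas plants roughly 0.16 kg per MWh.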

  1. Biomass as Feedstock for a Bioenergy and Bioproducts Industry: The Technical Feasibility of a Billion-Ton Annual Supply

    Energy Technology Data Exchange (ETDEWEB)

    Perlack, R.D.

    2005-12-15

    The purpose of this report is to determine whether the land resources of the United States are capable of producing a sustainable supply of biomass sufficient to displace 30 percent or more of the country's present petroleum consumption--the goal set by the Advisory Committee in their vision for biomass technologies. Accomplishing this goal would require approximately 1 billion dry tons of biomass feedstock per year.

  2. Economic assessment of mushroom project commercialisation

    International Nuclear Information System (INIS)

    Mat Rasol Awang; Rosnani Abdul Rashid; Hassan Hamdani Hassan Mutaat; Meswan Maskom

    2010-01-01

    The mushroom market is worth US $45 billion, comprising US $28-30 billion from food, US $9-10 billion from medicinal products, and US $3.5-4 billion from wild mushrooms. Malaysia's mushroom import deficit over the years 2001-2007 was 40,933 metric tons, worth RM 187.7 million. The existing local market is lucrative, and the potential world market is very large. With cultivation technology in place and an understanding of the key value chains in the cultivation technology processes, this paper assesses the economics of a mushroom commercialization project as a case study. (author)

  3. 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy, Volume 2: Environmental Sustainability Effects of Select Scenarios from Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Efroymson, Rebecca Ann [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Langholtz, Matthew H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-11

    With the goal of understanding environmental effects of a growing bioeconomy, the U.S. Department of Energy (DOE), national laboratories, and U.S. Forest Service research laboratories, together with academic and industry collaborators, undertook a study to estimate environmental effects of potential biomass production scenarios in the United States, with an emphasis on agricultural and forest biomass. Potential effects investigated include changes in soil organic carbon (SOC), greenhouse gas (GHG) emissions, water quality and quantity, air emissions, and biodiversity. Effects of altered land-management regimes were analyzed based on select county-level biomass-production scenarios for 2017 and 2040 taken from the 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy (BT16), volume 1, which assumes that the land bases for agriculture and forestry do not change over time. The scenarios reflect constraints on biomass supply (e.g., excluded areas; implementation of management practices; and consideration of food, feed, forage, and fiber demands and exports) intended to address sustainability concerns. Nonetheless, both beneficial and adverse environmental effects might be expected. To characterize these potential effects, this research sought to estimate where and under what modeled scenarios or conditions positive and negative environmental effects could occur nationwide. The report also includes a discussion of land-use change (LUC) (i.e., land management change) assumptions associated with the scenario transitions (but not including analysis of indirect LUC [ILUC]), analyses of climate sensitivity of feedstock productivity under a set of potential scenarios, and a qualitative environmental effects analysis of algae production under carbon dioxide (CO2) co-location scenarios. Because BT16 biomass supplies are simulated independent of a defined end use, most analyses do not include benefits from displacing fossil fuels or other

  4. Doctor Shopping Behavior and the Diversion of Prescription Opioids.

    Science.gov (United States)

    Simeone, Ronald

    2017-01-01

    "Doctor shopping" as a means of prescription opioid diversion is examined. The number and percentage of prescriptions and morphine-equivalent milligrams diverted in this manner are estimated by state and molecule for the period 2008-2012. Eleven billion prescriptions with unique patient, doctor, and pharmacy identifiers were used to construct diversion "events" that involved between 1 and 6 unique doctors and between 1 and 6 unique pharmacies. Diversion thresholds were established based on the probability of each contingency. A geographically widespread decline occurred between 2008 and 2012. The number of prescriptions diverted fell from approximately 4.30 million (1.75% of all prescriptions) in 2008 to approximately 3.37 million (1.27% of all prescriptions) in 2012, and the number of morphine-equivalent milligrams fell from approximately 6.55 metric tons (2.95% of total metric tons) in 2008 to approximately 4.87 metric tons (2.19% of total metric tons) in 2012. Diversion control efforts have likely been effective. But given increases in opioid-related deaths, opioid-related drug treatment admissions, and the more specific resurgence of heroin-related events, it is clear that additional public health measures are required.

  5. extent of use of ict by fish farmers in isoko agricultural zone of delta ...

    African Journals Online (AJOL)

    Mr. TONY A

    various innovations disseminated by the project were adopted by the rice ... imported was 0.84 million metric tons but the price was N30.31 billion. This position ... [table excerpt: counts of rice production facilities, including tube wells and bore holes]

  6. Death of the TonB Shuttle Hypothesis.

    Science.gov (United States)

    Gresock, Michael G; Savenkova, Marina I; Larsen, Ray A; Ollis, Anne A; Postle, Kathleen

    2011-01-01

    A complex of ExbB, ExbD, and TonB couples cytoplasmic membrane (CM) proton motive force (pmf) to the active transport of large, scarce, or important nutrients across the outer membrane (OM). TonB interacts with OM transporters to enable ligand transport. Several mechanical models and a shuttle model explain how TonB might work. In the mechanical models, TonB remains attached to the CM during energy transduction, while in the shuttle model the TonB N terminus leaves the CM to deliver conformationally stored potential energy to OM transporters. Previous studies suggested that TonB did not shuttle based on the activity of a GFP-TonB fusion that was anchored in the CM by the GFP moiety. When we recreated the GFP-TonB fusion to extend those studies, in our hands it was proteolytically unstable, giving rise to potentially shuttleable degradation products. Recently, we discovered that a fusion of the Vibrio cholerae ToxR cytoplasmic domain to the N terminus of TonB was proteolytically stable. ToxR-TonB could be converted completely into a proteinase K-resistant conformation in response to loss of pmf in spheroplasts and exhibited an ability to form a pmf-dependent formaldehyde crosslink to ExbD, both indicators of its location in the CM. Most importantly, ToxR-TonB had the same relative specific activity as wild-type TonB. Taken together, these results provide conclusive evidence that TonB does not shuttle during energy transduction. We had previously concluded that TonB shuttles based on the use of an Oregon Green® 488 maleimide probe to assess periplasmic accessibility of N-terminal TonB. Here we show that the probe was permeant to the CM, thus permitting the labeling of the TonB N-terminus. These former results are reinterpreted in the context that TonB does not shuttle, and suggest the existence of a signal transduction pathway from OM to cytoplasm.

  7. Zulia rich coal seams to fuel Venezuela

    Energy Technology Data Exchange (ETDEWEB)

    1983-06-16

    In March, 1982, Carbozulia awarded a contract to Fluor Corp. to provide basic engineering services, including mine planning and geology, for the two-phase project. The open pit mine and ancillary facilities, valued at more than $200 million, will provide steam and metallurgical coal for domestic use. The site, Mina Paso Diablo, is located about 60 miles northwest of Maracaibo. Upon phase one completion sometime in 1987, the mill will start production, gradually increasing to 4 million metric tons per year. This will increase to 6.4 million metric tons when phase two is completed. In addition to the mine, the Venezuelan government plans to build an industrial complex along Lake Maracaibo. Corpozulia will build a steel-rolling mill and add a 350,000-metric-ton-per-year coking oven, which will consume about 7% of the mine's metallurgical-coal production. Another government-owned firm, Electric Energy of Venezuela, plans to build a thermo-electric plant nearby. Two 250-megawatt units are planned initially, with potential to add another six units. At full capacity, the plant will burn more than 90% of the coal produced from the mine. Mina Paso Diablo contains one of Latin America's largest proven coal reserves - about 350 million metric tons - with guesstimates running as high as 4 billion metric tons for the Zulia coal basin. The coal is of superior quality, running about 12,000 to 13,000 Btu per lb, with low ash and sulphur content.

  8. FY97 nuclear-related budgets total 493 billion yen (4.4 billion dollars)

    International Nuclear Information System (INIS)

    Anon.

    1996-01-01

    On September 13, the Atomic Energy Commission of Japan announced the estimated nuclear-related budget requests for FY1997 (April 1997 - March 1998), giving the breakdowns for eight ministries and agencies. The total amount requested by the government bodies was 493.3 billion yen, a 0.8% increase compared with FY96. This figure includes the budget requests of the Science and Technology Agency (STA), the Ministry of International Trade and Industry (MITI), the Ministry of Foreign Affairs, the Ministry of Transport, the Ministry of Agriculture, Forestry and Fisheries, the Okinawa Development Agency, and the Ministry of Home Affairs, but excludes the budget request made by the Ministry of Education. The budget requests of STA and MITI are 360 billion yen and 126 billion yen, respectively. On August 29, STA released its estimated FY97 budget request. The nuclear-related 360.4 billion yen is 0.9% more than in the year before. Of this sum, 199.9 billion yen is in the general account, and 160.6 billion yen is in the special account for power source development. The details of the nuclear-related amounts are explained. On August 26, MITI released its estimated budget request for FY97, and of the nuclear-related 125.7 billion yen (a 0.1% increase from FY96), 200 million yen is in the general account, and 98.9 billion yen and 26.6 billion yen are in the special accounts for power resource development and power source diversification, respectively. (K.I.)

  9. Death of the TonB shuttle hypothesis

    Directory of Open Access Journals (Sweden)

    Michael George Gresock

    2011-10-01

    Full Text Available A complex of ExbB, ExbD, and TonB transduces cytoplasmic membrane (CM) proton motive force (pmf) to outer membrane (OM) transporters so that large, scarce, and important nutrients can be released into the periplasmic space for subsequent transport across the CM. TonB is the component that interacts with the OM transporters and enables ligand transport, and several mechanical models and a shuttle model explain how TonB might work. In the mechanical models, TonB remains attached to the CM during energy transduction, while in the shuttle model the TonB N terminus leaves the CM to deliver conformationally stored potential energy to OM transporters. Previous efforts to test the shuttle model by anchoring TonB to the CM by fusion to a large globular cytoplasmic protein have been hampered by the proteolytic susceptibility of the fusion constructs. Here we confirm that GFP-TonB, tested in a previous study by another laboratory, again gave rise to full-length TonB and slightly larger potentially shuttleable fragments that prevented unambiguous interpretation of the data. Recently, we discovered that a fusion of the Vibrio cholerae ToxR cytoplasmic domain to the N terminus of TonB was proteolytically stable. ToxR-TonB could be converted completely into a proteinase K-resistant conformation in response to loss of pmf in spheroplasts and exhibited an ability to form a pmf-dependent formaldehyde crosslink to ExbD, both indicators of its location in the CM. Most importantly, ToxR-TonB had the same relative specific activity as wild-type TonB. Taken together, these results provide the first conclusive evidence that TonB does not shuttle during energy transduction. The interpretations of our previous study, which concluded that TonB shuttled in vivo, were complicated by the fact that the probe used in those studies, Oregon Green® 488 maleimide, was permeant to the CM and could label proteins, including a TonB ∆TMD derivative, confined exclusively to the

  10. Consumption of materials in the United States, 1900-1995

    Science.gov (United States)

    Matos, G.; Wagner, L.; ,

    1998-01-01

    The flows of nonfood and nonfuel materials through the economy have significant impact on our lives and the world around us. Growing populations and economies demand more goods, services, and infrastructure. Since the beginning of the twentieth century, the types of materials consumed in the United States have significantly changed. In 1900, on a per-weight basis, almost half of the materials consumed were from renewable resources, such as wood, fibers, and agricultural products, the rest being derived from nonrenewable resources. By 1995, the consumption of renewable resources had declined dramatically, to only 8% of total consumption. During this century, the quantity of materials consumed has grown, from 161 million metric tons in 1900 to 2.8 billion metric tons by 1995, an equivalent of 10 metric tons per person per year. Of all the materials consumed during this century, more than half were consumed in the last 25 years. This paper examines the general historical shifts in materials consumption and presents an analysis of different measurements of materials use and the significance of their trends.

  11. Improvements in BTS estimation of ton-miles

    Science.gov (United States)

    2004-08-01

    Ton-miles (one ton of freight shipped one mile) is the primary physical measure of freight transportation output. This paper describes improved measurements of ton-miles for air, truck, rail, water, and pipeline modes. Each modal measure contains a d...

  12. Cellulose nanomaterials as additives for cementitious materials

    Science.gov (United States)

    Tengfei Fu; Robert J. Moon; Pablo Zavatierri; Jeffrey Youngblood; William Jason Weiss

    2017-01-01

    Cementitious materials cover a very broad area of industries/products (buildings, streets and highways, water and waste management, and many others; see Fig. 20.1). Annual production of cements is on the order of 4 billion metric tons [2]. In general these industries want stronger, cheaper, more durable concrete, with faster setting times, faster rates of strength gain...

  13. Asset Decommissioning Risk Metrics for Floating Structures in the Gulf of Mexico.

    Science.gov (United States)

    Kaiser, Mark J

    2015-08-01

    Public companies in the United States are required to report standardized values of their proved reserves and asset retirement obligations on an annual basis. When compared, these two measures provide an aggregate indicator of corporate decommissioning risk but, because of their consolidated nature, cannot readily be decomposed at a more granular level. The purpose of this article is to introduce a decommissioning risk metric defined in terms of the ratio of the expected value of an asset's reserves to its expected cost of decommissioning. Asset decommissioning risk (ADR) is more difficult to compute than a consolidated corporate risk measure, but can be used to quantify the decommissioning risk of structures and to perform regional comparisons, and also provides market signals of future decommissioning activity. We formalize two risk metrics for decommissioning and apply the ADR metric to the deepwater Gulf of Mexico (GOM) floater inventory. Deepwater oil and gas structures are expensive to construct, and at the end of their useful life, will be expensive to decommission. The value of proved reserves for the 42 floating structures in the GOM circa January 2013 is estimated to range between $37 and $80 billion for future oil prices between 60 and 120 $/bbl, which is about 10 to 20 times greater than the estimated $4.3 billion to decommission the inventory. Eni's Allegheny and MC Offshore's Jolliet tension leg platforms have ADR metrics less than one and are approaching the end of their useful life. Application of the proposed metrics in the regulatory review of supplemental bonding requirements in the U.S. Outer Continental Shelf is suggested to complement the current suite of financial metrics employed. © 2015 Society for Risk Analysis.
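    The ADR ratio described above reduces to a simple quotient; the following is an illustrative sketch (not the authors' implementation), applied to the aggregate Gulf of Mexico figures quoted in the abstract rather than to a single asset:

```python
def asset_decommissioning_risk(expected_reserve_value: float,
                               expected_decom_cost: float) -> float:
    """ADR: expected value of an asset's proved reserves divided by its
    expected decommissioning cost (same currency units). An ADR below 1
    flags an asset whose remaining reserves no longer cover retirement."""
    if expected_decom_cost <= 0:
        raise ValueError("decommissioning cost must be positive")
    return expected_reserve_value / expected_decom_cost

# Aggregate GOM floater figures from the abstract, in billions of dollars:
# reserves of $37B-$80B vs. ~$4.3B to decommission the whole inventory.
low = asset_decommissioning_risk(37.0, 4.3)    # ~8.6
high = asset_decommissioning_risk(80.0, 4.3)   # ~18.6
# roughly the "10 to 20 times greater" range quoted in the abstract
```

Computing the metric per structure rather than per company is the point of the ADR: the consolidated corporate figures cannot be decomposed, while a per-asset ratio can rank individual platforms by decommissioning risk.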

  14. The upper pennsylvanian pittsburgh coal bed: Resources and mine models

    Science.gov (United States)

    Watson, W.D.; Ruppert, L.F.; Tewalt, S.J.; Bragg, L.J.

    2001-01-01

    The U.S. Geological Survey recently completed a digital coal resource assessment model of the Upper Pennsylvanian Pittsburgh coal bed, which indicates that after subtracting mined-out coal, 16 billion short tons (14 billion tonnes) remain of the original 34 billion short tons (31 billion tonnes) of coal. When technical, environmental, and social restrictions are applied to the remaining Pittsburgh coal model, only 12 billion short tons (11 billion tonnes) are available for mining. Our assessment models estimate that up to 0.61 billion short tons (0.55 billion tonnes), 2.7 billion short tons (2.4 billion tonnes), and 8.5 billion short tons (7.7 billion tonnes) could be available for surface mining, continuous mining, and longwall mining, respectively. This analysis is an example of a second-generation regional coal availability study designed to model recoverability characteristics for all the major coal beds in the United States. © 2001 International Association for Mathematical Geology.
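    The parenthetical tonne figures above are unit conversions of the short-ton estimates (rounded to two significant figures in the abstract). Using the standard factor of 0.90718474 metric tons per short ton, a minimal check:

```python
# 1 short ton = 2,000 lb = 0.90718474 metric tons (tonnes)
SHORT_TON_IN_TONNES = 0.90718474

def short_tons_to_tonnes(short_tons: float) -> float:
    """Convert a quantity in short tons to metric tons."""
    return short_tons * SHORT_TON_IN_TONNES

# e.g. the original 34 billion short tons of Pittsburgh coal:
# 34 * 0.90718474 ≈ 30.8 billion tonnes (reported as 31 billion),
# and the 8.5 billion short tons for longwall mining ≈ 7.7 billion tonnes.
```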

  15. The Role of TonB Gene in Edwardsiella ictaluri Virulence

    Directory of Open Access Journals (Sweden)

    Hossam Abdelhamed

    2017-12-01

    Full Text Available Edwardsiella ictaluri is a Gram-negative facultative intracellular pathogen that causes enteric septicemia in catfish (ESC). Stress factors including poor water quality, poor diet, rough handling, overcrowding, and water temperature fluctuations increase fish susceptibility to ESC. The TonB energy transducing system (TonB-ExbB-ExbD) and TonB-dependent transporters of Gram-negative bacteria support active transport of scarce resources including iron, an essential micronutrient for bacterial virulence. Deletion of the tonB gene attenuates virulence in several pathogenic bacteria. In the current study, the role of TonB (NT01EI_RS07425) in iron acquisition and E. ictaluri virulence was investigated. To accomplish this, the E. ictaluri tonB gene was in-frame deleted. Growth kinetics, iron utilization, and virulence of the EiΔtonB mutant were determined. Loss of TonB caused a significant reduction in bacterial growth in iron-depleted medium (p < 0.05). The EiΔtonB mutant grew similarly to wild-type E. ictaluri when ferric iron was added to the iron-depleted medium. The EiΔtonB mutant was significantly attenuated in catfish compared with the parent strain (21.69 vs. 46.91% mortality). Catfish surviving infection with EiΔtonB had significant protection against ESC compared with naïve fish (100 vs. 40.47% survival). These findings indicate that TonB participates in pathogenesis of ESC and is an important E. ictaluri virulence factor.

  16. Systems resilience for multihazard environments: definition, metrics, and valuation for decision making.

    Science.gov (United States)

    Ayyub, Bilal M

    2014-02-01

    The United Nations Office for Disaster Risk Reduction reported that the 2011 natural disasters, including the earthquake and tsunami that struck Japan, resulted in $366 billion in direct damages and 29,782 fatalities worldwide. Storms and floods accounted for up to 70% of the 302 natural disasters worldwide in 2011, with earthquakes producing the greatest number of fatalities. Average annual losses in the United States amount to about $55 billion. Enhancing community and system resilience could lead to massive savings through risk reduction and expeditious recovery. The rational management of such reduction and recovery is facilitated by an appropriate definition of resilience and associated metrics. In this article, a resilience definition is provided that meets a set of requirements with clear relationships to the metrics of the relevant abstract notions of reliability and risk. Those metrics also meet logically consistent requirements drawn from measure theory, and provide a sound basis for the development of effective decision-making tools for multihazard environments. Improving the resiliency of a system to meet target levels requires the examination of system enhancement alternatives in economic terms, within a decision-making framework. Relevant decision analysis methods would typically require the examination of resilience based on its valuation by society at large. The article provides methods for valuation and benefit-cost analysis based on concepts from risk analysis and management. © 2013 Society for Risk Analysis.

  17. A Go-to-Market Strategy: Promoting Private Sector Solutions to the Threat of Proliferation

    Science.gov (United States)

    2013-04-01

    indicators reveal that these problems, often subsumed under the seemingly innocuous heading of “transnational threats,” are a growing cancer on the ... trade is worth an estimated $322 billion annually with 52,356 metric tons of opium, cannabis, cocaine, and amphetamine-type stimulants (ATS) ... of medical isotopes to the sites that secure the material. Regulators are also now starting to consider another critical component in the

  18. A billion-dollar bonanza

    International Nuclear Information System (INIS)

    Isaacs, J.

    1993-01-01

    In late May -- only weeks after Congress had rejected the president's economic stimulus package because it would add to the federal deficit -- the House of Representatives generously allocated an extra $1.2 billion to the Pentagon. This article discusses some of the rationalizations House members gave for the gift and describes the attempts of a bipartisan group to defeat this request for funds propounded by Pennsylvania Democrat John Murtha. The gist of the arguments for and against the $1.2 billion and the results of votes on the bill are presented

  19. An independent assessment of CO{sub 2} capture research needs

    Energy Technology Data Exchange (ETDEWEB)

    St. John, B. [INTECH, Inc., Gaithersburg, MD (United States)

    1993-12-31

    The United States generates on the order of five billion metric tons of CO{sub 2} annually. Of this, approximately 1.8 billion metric tons is from electric utilities. Other industrial sources of CO{sub 2}, such as cement plants, coke ovens, ammonia plants, oil refineries, etc. are small relative to the emissions from power plants. The majority of the emissions from U.S. electric utilities are from coal-fired power plants. Thus, any large scale program to control CO{sub 2} emissions needs to include abatement of CO{sub 2} from power plants. Currently, there are very few proven options to mitigate CO{sub 2} emissions: (1) Improve thermal efficiency, thereby decreasing the amount of CO{sub 2} generated per unit of output. (2) Improve the efficiency of end use. (3) Convert to lower carbon fuels or non-fossil energy sources. (4) Plant trees to offset CO{sub 2} emitted. (5) Produce a concentrated CO{sub 2} stream for utilization or disposal. The first four options are well known and are being actively pursued at the present time. This paper examines the last option from the perspective that the gap between what is needed and what is available defines the research and development opportunities.

  20. ADVANCEMENTS IN CONCRETE TECHNOLOGY

    OpenAIRE

    Shri Purvansh B. Shah; Shri Prakash D. Gohil; Shri Hiren J. Chavda; Shri Tejas D. Khediya

    2015-01-01

    Developing and maintaining world’s infrastructure to meet the future needs of industrialized and developing countries is necessary to economically grow and improve the quality of life. The quality and performance of concrete plays a key role for most of infrastructure including commercial, industrial, residential and military structures, dams, power plants. Concrete is the single largest manufactured material in the world and accounts for more than 6 billion metric tons of materials annual...

  1. Projections of highway vehicle population, energy demand, and CO{sub 2} emissions in India through 2040.

    Energy Technology Data Exchange (ETDEWEB)

    Arora, S.; Vyas, A.; Johnson, L.; Energy Systems

    2011-02-22

    This paper presents projections of motor vehicles, oil demand, and carbon dioxide (CO{sub 2}) emissions for India through the year 2040. The populations of highway vehicles and two-wheelers are projected under three different scenarios on the basis of economic growth and average household size in India. The results show that by 2040, the number of highway vehicles in India would be 206-309 million. The oil demand projections for the Indian transportation sector are based on a set of nine scenarios arising out of three vehicle-growth and three fuel-economy scenarios. The combined effects of vehicle-growth and fuel-economy scenarios, together with the change in annual vehicle usage, result in a projected demand in 2040 by the transportation sector in India of 404-719 million metric tons (8.5-15.1 million barrels per day). The corresponding annual CO{sub 2} emissions are projected to be 1.2-2.2 billion metric tons.

  2. Countdown to Six Billion Teaching Kit.

    Science.gov (United States)

    Zero Population Growth, Inc., Washington, DC.

    This teaching kit features six activities focused on helping students understand the significance of the world population reaching six billion for our society and our environment. Featured activities include: (1) History of the World: Part Six Billion; (2) A Woman's Place; (3) Baby-O-Matic; (4) Earth: The Apple of Our Eye; (5) Needs vs. Wants; and…

  3. Geology and undiscovered resource assessment of the potash-bearing Pripyat and Dnieper-Donets Basins, Belarus and Ukraine

    Science.gov (United States)

    Cocker, Mark D.; Orris, Greta J.; Dunlap, Pamela; Lipin, Bruce R.; Ludington, Steve; Ryan, Robert J.; Słowakiewicz, Mirosław; Spanski, Gregory T.; Wynn, Jeff; Yang, Chao

    2017-08-03

    six potash mines in the Starobin area. Published reserves in the Pripyat Basin area are about 7.3 billion metric tons of potash ore (about 1.3 billion metric tons of K2O), mostly from potash-bearing salt horizons in the Starobin and Petrikov mine areas. The 15,160-square-kilometer area of the Pripyat Basin underlain by Famennian potash-bearing salt contains as many as 60 known potash-bearing salt horizons. Rough estimates of the total mineral endowment associated with stratabound Famennian salt horizons in the Pripyat Basin range from 80 to 200 billion metric tons of potash-bearing salt that could contain 15 to 30 billion metric tons of K2O. Parameters (including the number of economic potash horizons, grades, and depths) for these estimates are not published, so the estimates are not easily confirmed. Historically, reserves have been estimated above a depth of 1,200 meters (m) (approximately the depths of conventional underground mining). Additional undiscovered K2O resources could be significantly greater in the remainder of the Famennian salt, depending on the extents and grades of the 60 identified potash horizons above the USGS assessment depth of 3,000 m in the remainder of the tract. Increasing ambient temperatures with increasing depths in the eastern parts of the Pripyat Basin may require a solution mining process, which is aided by higher temperatures. No resource or reserve data have been published, and little is known about stratabound Famennian and Frasnian salt in the Dnieper-Donets Basin. These Upper Devonian salt units dip to the southeast and extend to depths of 15–19 kilometers (km) or greater. The tract of stratabound Famennian salt that lies above a depth of 3 km, the depth above which potash is technically recoverable by solution mining, underlies an area of about 15,600 square kilometers (km2). If Upper Devonian salt units in the Dnieper-Donets Basin contain potash-bearing strata similar to salt of the same age in the Pripyat Basin, then the

  4. Determination of a Screening Metric for High Diversity DNA Libraries.

    Science.gov (United States)

    Guido, Nicholas J; Handerson, Steven; Joseph, Elaine M; Leake, Devin; Kung, Li A

    2016-01-01

    The fields of antibody engineering, enzyme optimization and pathway construction rely increasingly on screening complex variant DNA libraries. These highly diverse libraries allow researchers to sample a maximized sequence space; and therefore, more rapidly identify proteins with significantly improved activity. The current state of the art in synthetic biology allows for libraries with billions of variants, pushing the limits of researchers' ability to qualify libraries for screening by measuring the traditional quality metrics of fidelity and diversity of variants. Instead, when screening variant libraries, researchers typically use a generic, and often insufficient, oversampling rate based on a common rule-of-thumb. We have developed methods to calculate a library-specific oversampling metric, based on fidelity, diversity, and representation of variants, which informs researchers, prior to screening the library, of the amount of oversampling required to ensure that the desired fraction of variant molecules will be sampled. To derive this oversampling metric, we developed a novel alignment tool to efficiently measure frequency counts of individual nucleotide variant positions using next-generation sequencing data. Next, we apply a method based on the "coupon collector" probability theory to construct a curve of upper bound estimates of the sampling size required for any desired variant coverage. The calculated oversampling metric will guide researchers to maximize their efficiency in using highly variant libraries.
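    The "coupon collector" reasoning above can be illustrated with a simplified sketch. Assuming, unlike the authors' metric (which also incorporates fidelity and measured variant frequencies from sequencing data), that all n variants are equally frequent, the chance a given variant is still unseen after m draws is (1 - 1/n)^m, so the number of draws needed for an expected coverage fraction f solves 1 - (1 - 1/n)^m = f:

```python
import math

def oversampling_factor(n_variants: int, coverage: float) -> float:
    """Draws per variant needed so that the expected fraction of an
    equally weighted n-variant library sampled at least once reaches
    `coverage` (0 < coverage < 1). Solves 1 - (1 - 1/n)^m = coverage
    for m, then normalizes by library size."""
    draws = math.log(1.0 - coverage) / math.log(1.0 - 1.0 / n_variants)
    return draws / n_variants

# Seeing 95% of a 10^6-variant library takes about 3x oversampling;
# 99% takes about 4.6x -- a library-specific rate, not a fixed rule of thumb.
```

Under the uniform assumption the factor depends almost entirely on the target coverage, which is why a single generic oversampling rate is insufficient once fidelity errors and skewed variant representation enter the calculation.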

  5. Determination of a Screening Metric for High Diversity DNA Libraries.

    Directory of Open Access Journals (Sweden)

    Nicholas J Guido

    Full Text Available The fields of antibody engineering, enzyme optimization and pathway construction rely increasingly on screening complex variant DNA libraries. These highly diverse libraries allow researchers to sample a maximized sequence space; and therefore, more rapidly identify proteins with significantly improved activity. The current state of the art in synthetic biology allows for libraries with billions of variants, pushing the limits of researchers' ability to qualify libraries for screening by measuring the traditional quality metrics of fidelity and diversity of variants. Instead, when screening variant libraries, researchers typically use a generic, and often insufficient, oversampling rate based on a common rule-of-thumb. We have developed methods to calculate a library-specific oversampling metric, based on fidelity, diversity, and representation of variants, which informs researchers, prior to screening the library, of the amount of oversampling required to ensure that the desired fraction of variant molecules will be sampled. To derive this oversampling metric, we developed a novel alignment tool to efficiently measure frequency counts of individual nucleotide variant positions using next-generation sequencing data. Next, we apply a method based on the "coupon collector" probability theory to construct a curve of upper bound estimates of the sampling size required for any desired variant coverage. The calculated oversampling metric will guide researchers to maximize their efficiency in using highly variant libraries.

  6. Multi-Ton Argon and Xenon

    Energy Technology Data Exchange (ETDEWEB)

    Alarcon, Ricardo; Balascuta, Septimiu; Alton, Drew; Aprile, Elena; Giboni, Karl-Ludwig; Haruyama, Tom; Lang, Rafael; Melgarejo, Antonio Jesus; Ni, Kaixuan; Plante, Guillaume; Choi, Bin [et al.]

    2009-01-01

    There is a wide range of astronomical evidence that the visible stars and gas in all galaxies, including our own, are immersed in a much larger cloud of non-luminous matter, typically an order of magnitude greater in total mass. The existence of this ''dark matter'' is consistent with evidence from large-scale galaxy surveys and microwave background measurements, indicating that the majority of matter in the universe is non-baryonic. The nature of this non-baryonic component is still totally unknown, and the resolution of the ''dark matter puzzle'' is of fundamental importance to cosmology, astrophysics, and elementary particle physics. A leading explanation, motivated by supersymmetry theory, is the existence of as yet undiscovered Weakly Interacting Massive Particles (WIMPs), formed in the early universe and subsequently clustered in association with normal matter. WIMPs could, in principle, be detected in terrestrial experiments by their collisions with ordinary nuclei, giving observable low energy (< 100 keV) nuclear recoils. The predicted low collision rates require ultra-low background detectors with large (0.1-10 ton) target masses, located in deep underground sites to eliminate neutron background from cosmic ray muons. The establishment of the Deep Underground Science and Engineering Laboratory for large-scale experiments of this type would strengthen the current leadership of US researchers in this and other particle astrophysics areas. We propose to detect nuclear recoils by scintillation and ionization in ton-scale liquid noble gas targets, using techniques already proven in experiments at the 0.01-0.1 ton level. The experimental challenge is to identify these events in the presence of background events from gammas, neutrons, and alphas.

  7. Inventory and Policy Reduction Potential of Greenhouse Gas and Pollutant Emissions of Road Transportation Industry in China

    Directory of Open Access Journals (Sweden)

    Ye Li

    2016-11-01

    Full Text Available In recent years, emissions from the road transportation industry in China have been increasing rapidly. To evaluate the reduction potential of greenhouse gas and pollutant emissions of the industry in China, its emission inventory was calculated and a scenario analysis was created for the period between 2012 and 2030 in this paper. Based on the Long-range Energy Alternatives Planning System (LEAP) model, the development of China’s road transportation industry in two scenarios, business-as-usual (BAU) and comprehensive-mitigation (CM), was simulated. The CM scenario includes nine measures: Fuel Economy Standards, Auto Emission Standards, Energy-saving Technology, Tax Policy, Eco-driving, Logistics Informatization, Vehicle Liquidation, Electric Vehicles, and Alternative Fuels. The cumulative energy and emission reductions of these specific measures were evaluated. Our results demonstrate that China’s road transportation produced 881 million metric tons of CO2 and emitted 1420 thousand tons of CO, 2150 thousand tons of NOx, 148 thousand tons of PM10, and 745 thousand tons of HC in 2012. The reduction potential is quite large, and road freight transportation is the key mitigation subsector, accounting for 85%–92% of the total emissions. For energy conservation and carbon emission mitigation, logistics informatization is the most effective measure, potentially reducing 1.80 billion tons of coal equivalent and 3.83 billion tons of CO2 from 2012 to 2030. In terms of air pollutant emission mitigation, the auto emission standards measure performs best with respect to NOx, PM10, and HC emission mitigation, and the logistics informatization measure performs best for CO emission reduction. In order to maximize the mitigation potential of China’s road transportation industry, the government needs to implement these measures in a timely and strict fashion.

  8. Assessment of coal geology, resources, and reserves in the Montana Powder River Basin

    Science.gov (United States)

    Haacke, Jon E.; Scott, David C.; Osmonson, Lee M.; Luppens, James A.; Pierce, Paul E.; Gunderson, Jay A.

    2013-01-01

    The purpose of this report is to summarize geology, coal resources, and coal reserves in the Montana Powder River Basin assessment area in southeastern Montana. This report represents the fourth assessment area within the Powder River Basin to be evaluated in the continuing U.S. Geological Survey regional coal assessment program. There are four active coal mines in the Montana Powder River Basin assessment area: the Spring Creek and Decker Mines, both near Decker; the Rosebud Mine, near Colstrip; and the Absaloka Mine, west of Colstrip. During 2011, coal production from these four mines totaled approximately 36 million short tons. A fifth mine, the Big Sky, had significant production from 1969-2003; however, it is no longer in production and has since been reclaimed. Total coal production from all five mines in the Montana Powder River Basin assessment area from 1968 to 2011 was approximately 1.4 billion short tons. The Rosebud/Knobloch coal bed near Colstrip and the Anderson, Dietz 2, and Dietz 3 coal beds near Decker contain the largest deposits of surface-minable, low-sulfur, subbituminous coal currently being mined in the assessment area. A total of 26 coal beds were identified during this assessment, 18 of which were modeled and evaluated to determine in-place coal resources. The total original coal resource in the Montana Powder River Basin assessment area for the 18 coal beds assessed was calculated to be 215 billion short tons. Available coal resources, which are part of the original coal resource remaining after subtracting restrictions and areas of burned coal, are about 162 billion short tons. Restrictions included railroads, Federal interstate highways, urban areas, alluvial valley floors, state parks, national forests, and mined-out areas. It was determined that 10 of the 18 coal beds had sufficient areal extent and thickness to be evaluated for recoverable surface resources (Roland (Baker), Smith, Anderson, Dietz 2, Dietz 3, Canyon, Werner

  9. 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy, Volume 2: Environmental Sustainability Effects of Select Scenarios from Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Efroymson, Rebecca Ann [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Langholtz, Matthew H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Kristen [Dept. of Energy (DOE), Washington DC (United States); Stokes, Bryce [Allegheny Science & Technology, LLC, Bridgeport, WV (United States); Brandt, Craig C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davis, Maggie R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hellwinckel, Chad [Univ. of Tennessee, Knoxville, TN (United States); Kline, Keith L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Eaton, Laurence M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Dunn, Jennifer [Argonne National Lab. (ANL), Argonne, IL (United States); Canter, Christina E. [Argonne National Lab. (ANL), Argonne, IL (United States); Qin, Zhangcai [Argonne National Lab. (ANL), Argonne, IL (United States); Cai, Hao [Argonne National Lab. (ANL), Argonne, IL (United States); Wang, Michael [Argonne National Lab. (ANL), Argonne, IL (United States); Scott, D. Andrew [USDA Forest Service, Normal, AL (United States); Jager, Henrietta I. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wu, May [Argonne National Lab. (ANL), Argonne, IL (United States); Ha, Miae [Argonne National Lab. (ANL), Argonne, IL (United States); Baskaran, Latha Malar [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kreig, Jasmine A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Rau, Benjamin [USDA Forest Service, Aiken, SC (United States); Muwamba, Augustine [Univ. of Georgia, Athens, GA (United States); Trettin, Carl [USDA Forest Service, Aiken, SC (United States); Panda, Sudhanshu [Univ. of North Georgia, Oakwood, GA (United States); Amatya, Devendra M. [USDA Forest Service, Aiken, SC (United States); Tollner, Ernest W. 
[USDA Forest Service, Aiken, SC (United States); Sun, Ge [USDA Forest Service, Aiken, SC (United States); Zhang, Liangxia [USDA Forest Service, Aiken, SC (United States); Duan, Kai [North Carolina State Univ., Raleigh, NC (United States); Warner, Ethan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Zhang, Yimin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Inman, Daniel [National Renewable Energy Lab. (NREL), Golden, CO (United States); Eberle, Annika [National Renewable Energy Lab. (NREL), Golden, CO (United States); Carpenter, Alberta [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heath, Garvin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hettinger, Dylan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wang, Gangsheng [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sutton, Nathan J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Busch, Ingrid Karin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Donner, Deahn M. [USDA Forest Service, Aiken, SC (United States); Wigley, T. Bently [National Council for Air and Stream Improvement (NCASI), Research Triangle Park, NC (United States); Miller, Darren A. [Weyerhaeuser Company, Federal Way, WA (United States); Coleman, Andre [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wigmosta, Mark [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Pattullo, Molly [Univ. of Tennessee, Knoxville, TN (United States); Mayes, Melanie [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Daly, Christopher [Oregon State Univ., Corvallis, OR (United States); Halbleib, Mike [Oregon State Univ., Corvallis, OR (United States); Negri, Cristina [Argonne National Lab. (ANL), Argonne, IL (United States); Turhollow, Anthony F. [Oak Ridge National Lab. 
(ORNL), Oak Ridge, TN (United States); Bonner, Ian [Monsanto Company, Twin Falls, ID (United States); Dale, Virginia H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-01

With the goal of understanding environmental effects of a growing bioeconomy, the U.S. Department of Energy (DOE), national laboratories, and U.S. Forest Service research laboratories, together with academic and industry collaborators, undertook a study to estimate environmental effects of potential biomass production scenarios in the United States, with an emphasis on agricultural and forest biomass. Potential effects investigated include changes in soil organic carbon (SOC), greenhouse gas (GHG) emissions, water quality and quantity, air emissions, and biodiversity. Effects of altered land-management regimes were analyzed based on select county-level biomass-production scenarios for 2017 and 2040 taken from the 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy (BT16), volume 1, which assumes that the land bases for agriculture and forestry would not change over time. The scenarios reflect constraints on biomass supply (e.g., excluded areas; implementation of management practices; and consideration of food, feed, forage, and fiber demands and exports) that are intended to address sustainability concerns. Nonetheless, both beneficial and adverse environmental effects might be expected. To characterize these potential effects, this research sought to estimate where and under what modeled scenarios or conditions positive and negative environmental effects could occur nationwide. The report also includes a discussion of land-use change (LUC) (i.e., land-management change) assumptions associated with the scenario transitions (but not including analysis of indirect LUC [ILUC]), analyses of climate sensitivity of feedstock productivity under a set of potential scenarios, and a qualitative environmental effects analysis of algae production under carbon dioxide (CO2) co-location scenarios. Because BT16 biomass supplies are simulated independent of a defined end use, most analyses do not include benefits from displacing fossil fuels or

  10. How much energy is locked in the USA? Alternative metrics for characterising the magnitude of overweight and obesity derived from BRFSS 2010 data.

    Science.gov (United States)

    Reidpath, Daniel D; Masood, Mohd; Allotey, Pascale

    2014-06-01

    Four metrics to characterise population overweight are described. Behavioral Risk Factor Surveillance System data were used to estimate the weight the US population needed to lose to achieve a BMI below 25, expressed as mass, volume, energy, and energy value. About 144 million people in the US need to lose 2.4 million metric tonnes. The volume of fat is 2.6 billion litres, or 1,038 Olympic-size swimming pools. The energy in the fat would power 90,000 households for a year and is worth around 162 million dollars. Four confronting ways of talking about national overweight and obesity are described. The value of the metrics remains to be tested.
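    The mass-to-volume conversion behind these figures can be reproduced with rough physical constants; the fat density and pool volume below are standard approximations chosen for illustration, not values taken from the paper.

```python
# Rough back-of-envelope check of the population-fat metrics above.
# Assumed constants (not from the paper): adipose tissue density ~0.9 kg/L,
# Olympic pool volume ~2.5 million litres.
EXCESS_MASS_KG = 2.4e9          # 2.4 million metric tonnes
FAT_DENSITY_KG_PER_L = 0.9
POOL_VOLUME_L = 2.5e6

volume_l = EXCESS_MASS_KG / FAT_DENSITY_KG_PER_L   # ~2.7e9 L
pools = volume_l / POOL_VOLUME_L                   # ~1,070 pools

print(f"{volume_l:.2e} L, about {pools:.0f} Olympic pools")
```

    These approximations land within a few percent of the reported 2.6 billion litres and 1,038 pools; the small gap suggests the authors used slightly different density or pool-volume assumptions.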

  11. 40 CFR 98.193 - Calculating GHG emissions.

    Science.gov (United States)

    2010-07-01

    .... Calcium oxide and magnesium oxide content must be analyzed monthly for each lime type: ER30OC09.073 Where... subpart) (metric tons CO2/metric tons MgO). CaOi,n = Calcium oxide content for lime type i, for month n... = Calcium oxide content for sold lime byproduct/waste type i, for month n (metric tons CaO/metric ton lime...

  12. Taking out one billion tons of carbon: the magic of China's 11th Five-Year Plan

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Jiang; Zhou, Nan; Levine, Mark D.; Fridley, David

    2007-05-01

    China's 11th Five-Year Plan (FYP) sets an ambitious target for energy-efficiency improvement: energy intensity of the country's gross domestic product (GDP) should be reduced by 20 percent from 2005 to 2010 (NDRC, 2006). This is the first time that a quantitative and binding target has been set for energy efficiency, and it signals a major shift in China's strategic thinking about its long-term economic and energy development. The 20 percent energy intensity target also translates into an annual reduction of over one billion tons of CO2 by 2010, making the Chinese effort one of the most significant carbon mitigation efforts in the world today. While it is still too early to tell whether China will achieve this target, this paper attempts to understand the trend in energy intensity in China and to explore a variety of options toward meeting the 20 percent target using a detailed end-use energy model.
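    The 20 percent cut over five years implies a compound annual decline in energy intensity; the quick calculation below is our own illustration of the required pace, not a figure from the paper.

```python
# Required average annual decline in energy intensity for a 20% cut
# over the 5 years from 2005 to 2010 (compound, not linear).
target_ratio = 0.80   # 2010 intensity relative to 2005
years = 5

annual_decline = 1 - target_ratio ** (1 / years)
print(f"required decline: {annual_decline:.2%} per year")  # ~4.4%/yr
```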

  13. Proposed Performance-Based Metrics for the Future Funding of Graduate Medical Education: Starting the Conversation.

    Science.gov (United States)

    Caverzagie, Kelly J; Lane, Susan W; Sharma, Niraj; Donnelly, John; Jaeger, Jeffrey R; Laird-Fick, Heather; Moriarty, John P; Moyer, Darilyn V; Wallach, Sara L; Wardrop, Richard M; Steinmann, Alwin F

    2017-12-12

    Graduate medical education (GME) in the United States is financed by contributions from both federal and state entities that total over $15 billion annually. Within institutions, these funds are distributed with limited transparency to achieve ill-defined outcomes. To address this, the Institute of Medicine convened a committee on the governance and financing of GME to recommend finance reform that would promote a physician training system that meets society's current and future needs. The resulting report provided several recommendations regarding the oversight and mechanisms of GME funding, including implementation of performance-based GME payments, but did not provide specific details about the content and development of metrics for these payments. To initiate a national conversation about performance-based GME funding, the authors asked: What should GME be held accountable for in exchange for public funding? In answer to this question, the authors propose 17 potential performance-based metrics for GME funding that could inform future funding decisions. Eight of the metrics are described as exemplars to add context and to help readers obtain a deeper understanding of the inherent complexities of performance-based GME funding. The authors also describe considerations and precautions for metric implementation.

  14. Areva excellent business volume: backlog as of december 31, 2008: + 21.1% to 48.2 billion euros. 2008 revenue: + 10.4% to 13.2 billion euros

    International Nuclear Information System (INIS)

    2009-01-01

    AREVA's backlog stood at 48.2 billion euros as of December 31, 2008, for 21.1% growth year-on-year, including 21.8% growth in Nuclear and 16.5% growth in Transmission and Distribution. The Nuclear backlog came to 42.5 billion euros at December 31, 2008. The Transmission and Distribution backlog came to 5.7 billion euros at year-end. The group recognized revenue of 13.2 billion euros in 2008, for year-on-year growth of 10.4% (+9.8% like-for-like). Revenue outside France was up 10.5% to 9.5 billion euros, representing 72% of total revenue. Revenue was up 6.5% in the Nuclear businesses (up 6.3% LFL), with strong performance in the Reactors and Services division (+10.9% LFL) and the Front End division (+7.2% LFL). The Transmission and Distribution division recorded growth of 17% (+15.8% LFL). Revenue for the fourth quarter of 2008 rose to 4.1 billion euros, up 5.2% (+1.6% LFL) from that of the fourth quarter of 2007. Revenue for the Front End division rose to 3.363 billion euros in 2008, up 7.1% over 2007 (+7.2% LFL). Foreign exchange (currency translations) had a negative impact of 53 million euros. Revenue for the Reactors and Services division rose to 3.037 billion euros, up 11.8% over 2007 (+10.9% LFL). Foreign exchange (currency translations) had a negative impact of 47 million euros. Revenue for the Back End division came to 1.692 billion euros, a drop of 2.7% (-2.5% LFL). Foreign exchange (currency translations) had a negative impact of 3.5 million euros. Revenue for the Transmission and Distribution division rose to 5.065 billion euros in 2008, up 17.0% (+15.8% LFL)

  15. TonEBP modulates the protective effect of taurine in ischemia-induced cytotoxicity in cardiomyocytes

    Science.gov (United States)

    Yang, Y J; Han, Y Y; Chen, K; Zhang, Y; Liu, X; Li, S; Wang, K Q; Ge, J B; Liu, W; Zuo, J

    2015-01-01

    Taurine, which is found at high concentration in the heart, exerts several protective actions on myocardium. Physiologically, the high level of taurine in the heart is maintained by a taurine transporter (TauT), the expression of which is suppressed under ischemic insult. Although taurine supplementation upregulates TauT expression, elevates the intracellular taurine content and ameliorates the ischemic injury of cardiomyocytes (CMs), little is known about the regulatory mechanisms of taurine governing TauT expression under ischemia. In this study, we describe the TonE (tonicity-responsive element)/TonEBP (TonE-binding protein) pathway involved in the taurine-regulated TauT expression in ischemic CMs. Taurine inhibited the ubiquitin-dependent proteasomal degradation of TonEBP, promoted the translocation of TonEBP into the nucleus, enhanced TauT promoter activity and finally upregulated TauT expression in CMs. In addition, we observed that TonEBP had an anti-apoptotic and anti-oxidative role in CMs under ischemia. Moreover, the protective effects of taurine on myocardial ischemia were TonEBP dependent. Collectively, our findings suggest that TonEBP is a core molecule in the protective mechanism of taurine in CMs under ischemic insult. PMID:26673669

  16. Proton collider breaks the six-billion-dollar barrier

    CERN Multimedia

    Vaughan, C

    1990-01-01

    The SSC will cost at least 1 billion dollars more than its estimated final price of 5.9 billion dollars. Critics in Congress believe the final bill could be double that figure. The director of the SSC blames most of the cost increase on technical problems with developing the superconducting magnets for the SSC (1/2 page).

  17. Criticality safety review of 2 1/2-, 10-, and 14-ton UF6 cylinders

    International Nuclear Information System (INIS)

    Broadhead, B.L.

    1991-10-01

    Currently, UF6 cylinders designed to contain 2 1/2 tons of UF6 are classified as Fissile Class 2 packages with a transport index (TI) of 5 for the purpose of transportation. The 10-ton UF6 cylinders are classified as Fissile Class 1 with no TI assigned for transportation. The 14-ton cylinders, although not certified for transport with enrichments greater than 1 wt % because they have no approved overpack, can be used in on-site operations at enrichments greater than 1 wt %. The maximum 235U enrichments for these cylinders are 5.0 wt % for the 2 1/2-ton cylinder and 4.5 wt % for the 10- and 14-ton cylinders. This work reviews the suitability for reclassification of the 2 1/2-ton UF6 packages as Fissile Class 1 with a maximum 235U enrichment of 5 wt %. Additionally, the 10- and 14-ton cylinders are reviewed to address a change in maximum 235U enrichment from 4.5 to 5 wt %. Based on this evaluation, the 2 1/2-ton UF6 cylinders meet the 10 CFR 71 criteria for Fissile Class 1 packages, and no TI is needed for criticality safety purposes; however, a TI may be required based on radiation from the packages. Similarly, the 10- and 14-ton UF6 packages appear acceptable for a maximum enrichment rating change to 5 wt % 235U. 11 refs., 13 figs., 7 tabs

  18. 305 Building 2 ton bridge crane and monorail assembly analysis

    International Nuclear Information System (INIS)

    Axup, M.D.

    1995-12-01

    The analyses in the appendix of this document evaluate the integrity of the existing bridge crane structure, as depicted on drawing H-3-34292, for a bridge crane and monorail assembly with a load rating of 2 tons. This bridge crane and monorail assembly is a modification of a 1 1/2 ton rated manipulator bridge crane which originally existed in the 305 building

  19. In vivo evidence of TonB shuttling between the cytoplasmic and outer membrane in Escherichia coli.

    Science.gov (United States)

    Larsen, Ray A; Letain, Tracy E; Postle, Kathleen

    2003-07-01

    Gram-negative bacteria are able to convert potential energy inherent in the proton gradient of the cytoplasmic membrane into active nutrient transport across the outer membrane. The transduction of energy is mediated by TonB protein. Previous studies suggest a model in which TonB makes sequential and cyclic contact with proteins in each membrane, a process called shuttling. A key feature of shuttling is that the amino-terminal signal anchor must quit its association with the cytoplasmic membrane, and TonB becomes associated solely with the outer membrane. However, the initial studies did not exclude the possibility that TonB was artifactually pulled from the cytoplasmic membrane by the fractionation process. To resolve this ambiguity, we devised a method to test whether the extreme TonB amino-terminus, located in the cytoplasm, ever became accessible to the cys-specific, cytoplasmic membrane-impermeant molecule, Oregon Green(R) 488 maleimide (OGM) in vivo. A full-length TonB and a truncated TonB were modified to carry a sole cysteine at position 3. Both full-length TonB and truncated TonB (consisting of the amino-terminal two-thirds) achieved identical conformations in the cytoplasmic membrane, as determined by their abilities to cross-link to the cytoplasmic membrane protein ExbB and their abilities to respond conformationally to the presence or absence of proton motive force. Full-length TonB could be amino-terminally labelled in vivo, suggesting that it was periplasmically exposed. In contrast, truncated TonB, which did not associate with the outer membrane, was not specifically labelled in vivo. The truncated TonB also acted as a control for leakage of OGM across the cytoplasmic membrane. Further, the extent of labelling for full-length TonB correlated roughly with the proportion of TonB found at the outer membrane. 
These findings suggest that TonB does indeed disengage from the cytoplasmic membrane during energy transduction and shuttle to the outer membrane.

  20. 40 CFR 98.65 - Procedures for estimating missing data.

    Science.gov (United States)

    2010-07-01

    ... factor (1.6 metric tons CO2/metric ton aluminum produced). MPp = Metal production from prebake process... produced). MPs = Metal production from Sderberg process (metric tons Al). (b) For other parameters, use the... PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Aluminum Production § 98.65 Procedures for...

  1. Connecting the last billion

    OpenAIRE

    Ben David, Yahel

    2015-01-01

    The last billion people to join the online world are likely to face at least one of two obstacles. Part I: Rural Internet Access. Rural, sparsely populated areas make conventional infrastructure investments unfeasible. Big corporations attempt to address this challenge via the launch of Low-Earth-Orbiting (LEO) satellite constellations, fleets of high-altitude balloons, and giant solar-powered drones; although these grandiose initiatives hold potential, they are costly and risky. At the same time...

  2. An evaluation of the regional supply of biomass at three midwestern sites

    Energy Technology Data Exchange (ETDEWEB)

    English, B.C.; Dillivan, K.D.; Ojo, M.A.; Alexander, R.R. [Univ. of Tennessee, Knoxville, TN (United States); Graham, R.L. [Oak Ridge National Lab., TN (United States)

    1993-12-31

    Research has been conducted on both the agronomy and the conversion of biomass. However, few studies have been initiated that combine the knowledge of growing biomass with site-specific resource availability information. An economic appraisal of how much biomass might be grown in a specific area for a given price has only just been initiated. This paper examines the economics of introducing biomass production to three representative midwestern areas centered on the following counties: Orange County, Indiana; Olmsted County, Minnesota; and Cass County, North Dakota. Using a regional linear programming model, estimates of economic feasibility as well as environmental impacts are made. At a price of $53 per metric ton, the biomass supplied to the plant gate is 183,251 metric tons. At $62 per metric ton, the biomass supply increases to almost 1 million metric tons. The model predicts a maximum price of $88 per metric ton; at this price, 2,748,476 metric tons of biomass are produced.
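    The three reported price-quantity pairs sketch a plant-gate supply schedule; the piecewise-linear interpolation below is an illustration built from those three points only, not a reconstruction of the paper's linear programming model.

```python
# Illustrative supply schedule from the three reported points
# ($/metric ton -> metric tons). Intermediate values are interpolated
# and are not outputs of the paper's LP model.
PRICES = [53, 62, 88]                          # $/metric ton
QUANTITIES = [183_251, 1_000_000, 2_748_476]   # metric tons supplied

def supply(price):
    """Metric tons supplied at a given price, linearly interpolated."""
    if price < PRICES[0]:
        return 0                   # below the lowest reported price
    if price >= PRICES[-1]:
        return QUANTITIES[-1]      # model's predicted maximum
    for (p0, q0), (p1, q1) in zip(zip(PRICES, QUANTITIES),
                                  zip(PRICES[1:], QUANTITIES[1:])):
        if p0 <= price <= p1:
            return q0 + (q1 - q0) * (price - p0) / (p1 - p0)

print(supply(62))  # 1000000.0
```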

  3. Geochemical proxies for understanding paleoceanography

    Digital Repository Service at National Institute of Oceanography (India)

    Nath, B.N.

    of size and volume. The annual delivery of all types of terrigenous material to the oceans is about 25 to 33 billion tons/yr compared to ~2-3 billion tons/yr of volcanogenic sedimentation and ~1.8 billion tons/yr of biogenic sedimentation... (0.1 to 0.2%) in most of the deep-sea sediments regardless of delivery rate to the seafloor and hence has limitations. Consequently, studies on high frequency changes in the major sediment components on the orbital time from the Oman margin reveal...

  4. Sharp metric obstructions for quasi-Einstein metrics

    Science.gov (United States)

    Case, Jeffrey S.

    2013-02-01

    Using the tractor calculus to study smooth metric measure spaces, we adapt results of Gover and Nurowski to give sharp metric obstructions to the existence of quasi-Einstein metrics on suitably generic manifolds. We do this by introducing an analogue of the Weyl tractor W to the setting of smooth metric measure spaces. The obstructions we obtain can be realized as tensorial invariants which are polynomial in the Riemann curvature tensor and its divergence. By taking suitable limits of their tensorial forms, we then find obstructions to the existence of static potentials, generalizing to higher dimensions a result of Bartnik and Tod, and to the existence of potentials for gradient Ricci solitons.

  5. 12 billion DM for Germany

    International Nuclear Information System (INIS)

    Anon.

    1975-01-01

    The German atomic industry has achieved a breakthrough on the world market: Brazil has ordered eight nuclear electricity generating plants from the Siemens-AEG subsidiary Kraftwerk-Union. US concerns attacked the twelve billion DM deal, the biggest export order in the history of German industry, to no avail: the contract is to be signed in Bonn this week. (orig./LH)

  6. Dilution Refrigeration of Multi-Ton Cold Masses

    CERN Document Server

    Wikus, P; CERN. Geneva

    2007-01-01

    Dilution refrigeration is the only means to provide continuous cooling at temperatures below 250 mK. Future experiments featuring multi-ton cold masses require a new generation of dilution refrigeration systems, capable of providing a heat sink below 10 mK at cooling powers which exceed the performance of present systems considerably. This thesis presents some advances towards dilution refrigeration of multi-ton masses in this temperature range. A new method using numerical simulation to predict the cooling power of a dilution refrigerator of a given design has been developed in the framework of this thesis project. This method does not only allow to take into account the differences between an actual and an ideal continuous heat exchanger, but also to quantify the impact of an additional heat load on an intermediate section of the dilute stream. In addition, transient behavior can be simulated. The numerical model has been experimentally verified with a dilution refrigeration system which has been designed, ...

  7. $\\eta$-metric structures

    OpenAIRE

    Gaba, Yaé Ulrich

    2017-01-01

    In this paper, we discuss recent results about generalized metric spaces and fixed point theory. We introduce the notion of $\\eta$-cone metric spaces, give some topological properties and prove some fixed point theorems for contractive type maps on these spaces. In particular we show that these $\\eta$-cone metric spaces are natural generalizations of both cone metric spaces and metric type spaces.

  8. The software product assurance metrics study: JPL's software systems quality and productivity

    Science.gov (United States)

    Bush, Marilyn W.

    1989-01-01

    The findings are reported of the Jet Propulsion Laboratory (JPL)/Software Product Assurance (SPA) Metrics Study, conducted as part of a larger JPL effort to improve software quality and productivity. Until recently, no comprehensive data had been assembled on how JPL manages and develops software-intensive systems. The first objective was to collect data on software development from as many projects and for as many years as possible. Results from five projects are discussed. These results reflect 15 years of JPL software development, representing over 100 data points (systems and subsystems), over a third of a billion dollars, over four million lines of code and 28,000 person months. Analysis of this data provides a benchmark for gauging the effectiveness of past, present and future software development work. In addition, the study is meant to encourage projects to record existing metrics data and to gather future data. The SPA long term goal is to integrate the collection of historical data and ongoing project data with future project estimations.

  9. Potential effects of the next 100 billion hamburgers sold by McDonald's.

    Science.gov (United States)

    Spencer, Elsa H; Frank, Erica; McIntosh, Nichole F

    2005-05-01

    McDonald's has sold >100 billion beef-based hamburgers worldwide with a potentially considerable health impact. This paper explores whether there would be any advantages if the next 100 billion burgers were instead plant-based burgers. Nutrient compositions of the beef hamburger patty and the McVeggie burger patty were obtained from the McDonald's website; sales data were obtained from the McDonald's customer service. Consuming 100 billion McDonald's beef burgers versus the same company's McVeggie burgers would provide, approximately, on average, an additional 550 million pounds of saturated fat and 1.2 billion total pounds of fat, as well as 1 billion fewer pounds of fiber, 660 million fewer pounds of protein, and no difference in calories. These data suggest that the McDonald's new McVeggie burger represents a less harmful fast-food choice than the beef burger.
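    Dividing the aggregate differences by 100 billion burgers gives the approximate per-burger deltas implied by the abstract; this is our own arithmetic, assuming the standard conversion 1 lb = 453.6 g.

```python
# Per-burger nutrient differences implied by the aggregate figures above.
BURGERS = 100e9
LB_TO_G = 453.6

aggregates_lb = {
    "saturated fat": 550e6,   # extra with beef
    "total fat": 1.2e9,       # extra with beef
    "fiber": 1e9,             # extra with veggie
    "protein": 660e6,         # extra with veggie
}

per_burger_g = {k: v * LB_TO_G / BURGERS for k, v in aggregates_lb.items()}
for name, grams in per_burger_g.items():
    print(f"{name}: ~{grams:.1f} g per burger")
```

    This works out to roughly 2.5 g of saturated fat and 5.4 g of total fat more per beef burger, versus about 4.5 g more fiber and 3.0 g more protein per veggie burger.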

  10. Seven Billion People: Fostering Productive Struggle

    Science.gov (United States)

    Murawska, Jaclyn M.

    2018-01-01

    How can a cognitively demanding real-world task such as the Seven Billion People problem promote productive struggle and help shape students' mathematical dispositions? Driving home from school one evening, Jaclyn Murawska heard a commentator on the radio announce three statements: (1) experts had determined that the world population…

  11. Operation and maintenance techniques of 1 ton bucket elevator in IMEF

    International Nuclear Information System (INIS)

    Soong, Woong Sup

    1999-04-01

    The IMEF pool is used as a pathway between the pool and the hot cell in order to transfer (incoming and outgoing) irradiated materials. Transfer is performed by a 1 ton bucket elevator which moves inside a rectangular tube installed between the pool and the M1 hot cell. The allowable load capacity of the bucket elevator is 1 ton, and its size is 25 X 25 X 150 cm. The bucket is driven by a chain system which moves up and down along a guide rail. The guide rail is installed in a rectangular tube tilted at about 63 degrees. The chain, which moves by the roller sliding method, is driven by a sprocket wheel rotated by a shaft, and the shaft is driven by a gear-reducing motor. In this report, operation and maintenance techniques of the 1 ton bucket elevator in IMEF are described in detail. (Author). 8 refs., 14 tabs., 6 figs

  12. Operation and maintenance techniques of 1 ton bucket elevator in IMEF

    Energy Technology Data Exchange (ETDEWEB)

    Soong, Woong Sup

    1999-04-01

    The IMEF pool is used as a pathway between the pool and the hot cell in order to transfer (incoming and outgoing) irradiated materials. Transfer is performed by a 1 ton bucket elevator which moves inside a rectangular tube installed between the pool and the M1 hot cell. The allowable load capacity of the bucket elevator is 1 ton, and its size is 25 X 25 X 150 cm. The bucket is driven by a chain system which moves up and down along a guide rail. The guide rail is installed in a rectangular tube tilted at about 63 degrees. The chain, which moves by the roller sliding method, is driven by a sprocket wheel rotated by a shaft, and the shaft is driven by a gear-reducing motor. In this report, operation and maintenance techniques of the 1 ton bucket elevator in IMEF are described in detail. (Author). 8 refs., 14 tabs., 6 figs.

  13. New Techniques for Treating Reinforcement Corrosion in Reinforced Concrete

    CERN Document Server

    Colloca, C

    1999-01-01

    The main types of damage observed in the passive reinforcement of reinforced concrete are generalized corrosion and localized corrosion. These degradations are caused either by carbonation of the concrete or by contact with pure water or chloride-laden water penetrating the pores and surface cracks. This document presents new intervention techniques, based on long-established principles, introduced for the electrochemical treatment of altered zones under these different conditions. Realkalization (in the case of carbonated concrete) raises the pH of the concrete and restores a level of alkalinity that guarantees passivation of the reinforcement. Desalination (in the case of chloride-contaminated concrete) removes chloride ions through the surface of the concrete. The advantages of these treatments over older techniques are appreciable when their shorter execution time and lower cost are considered.

  14. A comparative study of the hydration reaction kinetics of self-compacting and vibrated concretes

    Directory of Open Access Journals (Sweden)

    Ahmed Gargouri

    2014-04-01

    Indeed, the exothermic nature of the chemical reaction of cement can induce expansion and contraction deformations. Moreover, the capillary depression created by the water consumption due to cement hydration leads to drying shrinkage. These deformations can cause micro-cracking that may affect the long-term durability of the structure, especially for massive elements. Hence the importance of studying the hydration kinetics of these non-conventional concretes and comparing them to that of traditional vibrated concretes. The evolution of the adiabatic temperature, as well as the variation of the degree of hydration as a function of time, is determined for the self-compacting concrete and the vibrated concrete. Analysis of the experimental results obtained shows that the change in composition considerably modifies the kinetics of the hydration reaction.

  15. Nuclear budget for FY1991 up 3.6% to 409.7 billion yen

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    A total of ¥409.7 billion was approved for the Government's draft nuclear energy budget for fiscal 1991 on December 28, when the Cabinet gave its approval. The total, the highest ever, was divided into ¥182.6 billion for the general account and ¥227.1 billion for the special account for power resources development, representing a 3.6% increase over the ongoing fiscal year's level of ¥395.5 billion. The draft budget will be examined for approval by the Diet session by the end of March. The nuclear energy budget devoted to research and development projects governed by the Science and Technology Agency amounts to ¥306.4 billion, up 3.5%, exceeding ¥300 billion for the first time. The nuclear budget for the Ministry of International Trade and Industry is ¥98.1 billion, up 3.5%. For the other ministries, including the Ministry of Foreign Affairs, ¥5.1 billion was allotted to nuclear energy-related projects. The Government had decided to raise the unit cost of the power plant siting promotion subsidies in the special account for power resources development by 25% (from ¥600/kW to ¥750/kW) in order to support the siting of plants. Consequently, the power resources siting accounts of the special accounts for both STA and MITI showed high growth rates: 6.3% and 7.5%, respectively. (N.K.)

  16. Community access networks: how to connect the next billion to the ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Community access networks: how to connect the next billion to the Internet. Despite recent progress with mobile technology diffusion, more than four billion people worldwide are unconnected and have limited access to global communication infrastructure. The cost of implementing connectivity infrastructure in underserved ...

  17. Future energy, exotic energy

    Energy Technology Data Exchange (ETDEWEB)

    Dumon, R

    1974-01-01

    The Detroit Energy Conference has highlighted the declining oil reserves, estimated worldwide at 95 billion tons vs. an annual rate of consumption of over 3 billion tons. The present problem is one of price; also, petroleum seems too valuable to be simply burned. New sources must come into action before 1985. The most abundant is coal, with 600 billion tons of easily recoverable reserves; then comes oil shale with a potential of 400 billion tons of oil. Exploitation at the rate of 55 to 140 million tons/yr is planned in the U.S. after 1985. More exotic and impossible to estimate quantitatively are such sources as wind, tides, and the thermal energy of the oceans; these are probably far in the future. The same is true of solar and geothermal energy in large amounts. The only other realistic energy source is nuclear energy: the European Economic Community looks forward to covering 60% of its energy needs from nuclear energy in the year 2000. Even today, from 400 MW upward, a nuclear generating plant is more economical than a fossil-fueled one. Conservation will become the byword, and profound changes in society are to be expected.

  18. Four billion people facing severe water scarcity.

    Science.gov (United States)

    Mekonnen, Mesfin M; Hoekstra, Arjen Y

    2016-02-01

    Freshwater scarcity is increasingly perceived as a global systemic risk. Previous global water scarcity assessments, measuring water scarcity annually, have underestimated experienced water scarcity by failing to capture the seasonal fluctuations in water consumption and availability. We assess blue water scarcity globally at a high spatial resolution on a monthly basis. We find that two-thirds of the global population (4.0 billion people) live under conditions of severe water scarcity at least 1 month of the year. Nearly half of those people live in India and China. Half a billion people in the world face severe water scarcity all year round. Putting caps on water consumption by river basin, increasing water-use efficiencies, and better sharing of the limited freshwater resources will be key in reducing the threat posed by water scarcity to biodiversity and human welfare.

  19. Semantic metrics

    OpenAIRE

    Hu, Bo; Kalfoglou, Yannis; Dupplaw, David; Alani, Harith; Lewis, Paul; Shadbolt, Nigel

    2006-01-01

    In the context of the Semantic Web, many ontology-related operations, e.g. ontology ranking, segmentation, alignment, articulation, reuse, evaluation, can be boiled down to one fundamental operation: computing the similarity and/or dissimilarity among ontological entities, and in some cases among ontologies themselves. In this paper, we review standard metrics for computing distance measures and we propose a series of semantic metrics. We give a formal account of semantic metrics drawn from a...

  20. Metric modular spaces

    CERN Document Server

    Chistyakov, Vyacheslav

    2015-01-01

    Aimed toward researchers and graduate students familiar with elements of functional analysis, linear algebra, and general topology; this book contains a general study of modulars, modular spaces, and metric modular spaces. Modulars may be thought of as generalized velocity fields and serve two important purposes: generate metric spaces in a unified manner and provide a weaker convergence, the modular convergence, whose topology is non-metrizable in general. Metric modular spaces are extensions of metric spaces, metric linear spaces, and classical modular linear spaces. The topics covered include the classification of modulars, metrizability of modular spaces, modular transforms and duality between modular spaces, metric  and modular topologies. Applications illustrated in this book include: the description of superposition operators acting in modular spaces, the existence of regular selections of set-valued mappings, new interpretations of spaces of Lipschitzian and absolutely continuous mappings, the existe...

  1. Next-Generation Metrics: Responsible Metrics & Evaluation for Open Science

    Energy Technology Data Exchange (ETDEWEB)

    Wilsdon, J.; Bar-Ilan, J.; Peters, I.; Wouters, P.

    2016-07-01

    Metrics evoke a mixed reaction from the research community. A commitment to using data to inform decisions makes some enthusiastic about the prospect of granular, real-time analysis of research and its wider impacts. Yet we only have to look at the blunt use of metrics such as journal impact factors, h-indices and grant income targets to be reminded of the pitfalls. Some of the most precious qualities of academic culture resist simple quantification, and individual indicators often struggle to do justice to the richness and plurality of research. Too often, poorly designed evaluation criteria are “dominating minds, distorting behaviour and determining careers” (Lawrence, 2007). Metrics hold real power: they are constitutive of values, identities and livelihoods. How to exercise that power to more positive ends has been the focus of several recent and complementary initiatives, including the San Francisco Declaration on Research Assessment (DORA1), the Leiden Manifesto2 and The Metric Tide3 (a UK government review of the role of metrics in research management and assessment). Building on these initiatives, the European Commission, under its new Open Science Policy Platform4, is now looking to develop a framework for responsible metrics for research management and evaluation, which can be incorporated into the successor framework to Horizon 2020. (Author)

  2. How do you interpret a billion primary care records?

    Directory of Open Access Journals (Sweden)

    Martin Heaven

    2017-04-01

    To establish this, we explored just over 1 billion unique Read-coded records generated between 1999 and 2015 by GP practices participating in the provision of anonymised records to SAIL, aligning, filtering and summarising the data in a series of observational exercises to generate hypotheses related to the capture and recording of the data. Results: A fascinating journey through 1 billion GP-practice-generated pieces of information, embarked upon to aid interpretation of our Supporting People results, and providing insights into the patterns of recording within GP data.

  3. Interactive (statistical) visualisation and exploration of a billion objects with vaex

    NARCIS (Netherlands)

    Breddels, M. A.

    2016-01-01

    With new catalogues arriving such as the Gaia DR1, containing more than a billion objects, new methods of handling and visualizing these data volumes are needed. We show that by calculating statistics on a regular (N-dimensional) grid, visualizations of a billion objects can be done within a second.
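The grid-statistics idea behind this approach can be sketched with plain NumPy; this is an illustrative stand-in, not vaex's actual API:

```python
import numpy as np

# Instead of plotting every object, accumulate statistics on a regular grid in
# a single pass, then visualize the (small) grid. A 100k-point stand-in is used
# here in place of a billion-object catalogue.
rng = np.random.default_rng(0)
x = rng.normal(size=100_000)  # stand-in for one catalogue column
y = rng.normal(size=100_000)  # stand-in for another column

# One-pass binning onto a 256x256 grid; only the grid needs to be rendered.
counts, xedges, yedges = np.histogram2d(x, y, bins=256, range=[[-5, 5], [-5, 5]])
print(counts.shape, int(counts.sum()))
```

The key design point is that the cost of visualization becomes proportional to the grid size, not the catalogue size, so the same code scales to a billion rows if the binning pass is streamed or parallelized.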

  4. Twitching motility and biofilm formation are associated with tonB1 in Xylella fastidiosa.

    Science.gov (United States)

    Cursino, Luciana; Li, Yaxin; Zaini, Paulo A; De La Fuente, Leonardo; Hoch, Harvey C; Burr, Thomas J

    2009-10-01

    A mutation in the Xylella fastidiosa tonB1 gene resulted in loss of twitching motility and in significantly less biofilm formation as compared with a wild type. The altered motility and biofilm phenotypes were restored by complementation with a functional copy of the gene. The mutation affected virulence as measured by Pierce's disease symptoms on grapevines. The role of TonB1 in twitching and biofilm formation appears to be independent of the characteristic iron-uptake function of this protein. This is the first report demonstrating a functional role for a tonB homolog in X. fastidiosa.

  5. Estimating current and future global urban domestic material consumption

    Science.gov (United States)

    Baynes, Timothy Malcolm; Kaviti Musango, Josephine

    2018-06-01

    Urban material resource requirements are significant at the global level and these are expected to expand with future urban population growth. However, there are no global scale studies on the future material consumption of urban areas. This paper provides estimates of global urban domestic material consumption (DMC) in 2050 using three approaches based on: current gross statistics; a regression model; and a transition theoretic logistic model. All methods use UN urban population projections and assume a simple ‘business-as-usual’ scenario wherein historical aggregate trends in income and material flow continue into the future. A collation of data for 152 cities provided a year 2000 world average DMC/capita estimate, 12 tons/person/year (±22%), which we combined with UN population projections to produce a first-order estimation of urban DMC at 2050 of ~73 billion tons/year (±22%). Urban DMC/capita was found to be significantly correlated (R² > 0.9) to urban GDP/capita and area per person through a power law relation used to obtain a second estimate of 106 billion tons (±33%) in 2050. The inelastic exponent of the power law indicates a global tendency for relative decoupling of direct urban material consumption with increasing income. These estimates are global and influenced by the current proportion of developed-world cities in the global population of cities (and in our sample data). A third method employed a logistic model of transitions in urban DMC/capita with regional resolution. This method estimated global urban DMC to rise from approximately 40 billion tons/year in 2010 to ~90 billion tons/year in 2050 (modelled range: 66–111 billion tons/year). DMC/capita across different regions was estimated to converge from a range of 5–27 tons/person/year in the year 2000 to around 8–17 tons/person/year in 2050. The urban population does not increase proportionally during this period and thus the global average DMC/capita increases from ~12 to ~14 tons/person/year.
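The abstract's first-order estimate can be checked with back-of-the-envelope arithmetic; the implied 2050 urban population below is back-calculated from the quoted figures rather than taken from the UN projections:

```python
# First-order check of the gross-statistics estimate: world-average urban DMC
# of ~12 tons/person/year times a projected 2050 urban population gives the
# quoted ~73 billion tons/year. The population figure is implied, not sourced.
dmc_per_capita = 12.0                              # tons/person/year (±22%)
urban_pop_2050 = 73e9 / dmc_per_capita             # implied urban population
global_dmc_2050 = dmc_per_capita * urban_pop_2050 / 1e9  # billion tons/year
print(round(urban_pop_2050 / 1e9, 1), round(global_dmc_2050))  # → 6.1 73
```

An implied urban population of roughly 6.1 billion in 2050 is consistent with UN projections of about two-thirds of a ~9-10 billion world population being urban, which is why the simple product is a plausible first-order estimate.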

  6. Response Surface Model (RSM)-based Benefit Per Ton Estimates

    Science.gov (United States)

    The tables below are updated versions of the tables appearing in The influence of location, source, and emission type in estimates of the human health benefits of reducing a ton of air pollution (Fann, Fulcher and Hubbell 2009).

  7. Robust Exploration and Commercial Missions to the Moon Using LANTR Propulsion and In-Situ Propellants Derived From Lunar Polar Ice (LPI) Deposits

    Science.gov (United States)

    Borowski, Stanley K.; Ryan, Stephen W.; Burke, Laura M.; McCurdy, David R.; Fittje, James E.; Joyner, Claude R.

    2017-01-01

    Since the 1960s, scientists have conjectured that water ice could survive in the cold, permanently shadowed craters located at the Moon's poles. The Clementine (1994), Lunar Prospector (1998), Chandrayaan-1 (2008), and Lunar Reconnaissance Orbiter (LRO) and Lunar CRater Observation and Sensing Satellite (LCROSS) (2009) lunar probes have provided data indicating the existence of large quantities of water ice at the lunar poles. The Mini-SAR onboard Chandrayaan-1 discovered more than 40 permanently shadowed craters near the lunar north pole that are thought to contain 600 million metric tons of water ice. Using neutron spectrometer data, the Lunar Prospector science team estimated the water ice content of the regolith in the Moon's polar cold traps (1.5 ±0.8 wt%) and estimated the total amount of water at both poles at 2 billion metric tons. Using Mini-RF and spectrometry data, the LRO/LCROSS science team estimated the water ice content of the regolith in the south polar region to be 5.6 ±2.9 wt%. On the basis of the above scientific data, it appears that the water ice content can vary from 1-10 wt% and the total quantity of LPI at both poles can range from 600 million to 2 billion metric tons. NTP offers significant benefits for lunar missions and can take advantage of the leverage provided by using LDPs, when they become available, by transitioning to LANTR propulsion. LANTR provides a variable thrust and Isp capability, shortens burn times and extends engine life, and allows bipropellant operation. The combination of LANTR and LDP has performance capability equivalent to that of a hypothetical gaseous-fuel-core NTR (effective Isp 1575 s) and can lead to a robust LTS with unique mission capabilities that include short-transit-time crewed cargo transports and routine commuter flights to the Moon. The biggest challenge to making this vision a reality will be the production of increasing amounts of LDP and the development of propellant depots in LEO, LLO and LPO. An industry

  8. Confined Mobility of TonB and FepA in Escherichia coli Membranes.

    Directory of Open Access Journals (Sweden)

    Yoriko Lill

    Full Text Available The important process of nutrient uptake in Escherichia coli, in many cases, involves transit of the nutrient through a class of beta-barrel proteins in the outer membrane known as TonB-dependent transporters (TBDTs and requires interaction with the inner membrane protein TonB. Here we have imaged the mobility of the ferric enterobactin transporter FepA and TonB by tracking them in the membranes of live E. coli with single-molecule resolution at time-scales ranging from milliseconds to seconds. We employed simple simulations to model/analyze the lateral diffusion in the membranes of E.coli, to take into account both the highly curved geometry of the cell and artifactual effects expected due to finite exposure time imaging. We find that both molecules perform confined lateral diffusion in their respective membranes in the absence of ligand with FepA confined to a region [Formula: see text] μm in radius in the outer membrane and TonB confined to a region [Formula: see text] μm in radius in the inner membrane. The diffusion coefficient of these molecules on millisecond time-scales was estimated to be [Formula: see text] μm2/s and [Formula: see text] μm2/s for FepA and TonB, respectively, implying that each molecule is free to diffuse within its domain. Disruption of the inner membrane potential, deletion of ExbB/D from the inner membrane, presence of ligand or antibody to FepA and disruption of the MreB cytoskeleton was all found to further restrict the mobility of both molecules. Results are analyzed in terms of changes in confinement size and interactions between the two proteins.

  9. A method to press powder at 6000 ton using small amount of explosive

    Science.gov (United States)

    Hilmi, Ahmad Humaizi; Azmi, Nor Azmaliana; Ismail, Ariffin

    2017-12-01

    Large-die hydraulic presses are among the key instruments in making jumbo planes. Such machines can produce aircraft components such as wing spars, landing gear supports and armor plates. Nations such as the USA, Russia, Germany, Japan, Korea and China have large-die hydraulic presses that can press 50,000 tons. In Malaysia, heavy-duty presses are available from companies such as Proton, which builds chassis for cars. However, those presses are not able to produce better bulkheads for the engines, fuselage, and wings of an aircraft. This paper presents the design of an apparatus that uses 50 grams of commercial-grade explosive to produce 6000 tons of compaction. This is a first step towards producing a larger-scale apparatus that can produce a 50,000-ton press. The design was done using the AUTODYN blast simulation software. According to the results, the maximum load the apparatus can withstand was 6000 tons, which was contributed by 50 grams of commercial explosive (Emulex). An explosive charge larger than 50 grams will lead to catastrophic failure. Fabrication of the apparatus has been completed; however, testing of the apparatus is not presented in this article.

  10. Cosmic rays and the biosphere over 4 billion years

    DEFF Research Database (Denmark)

    Svensmark, Henrik

    2006-01-01

    Variations in the flux of cosmic rays (CR) at Earth during the last 4.6 billion years are constructed from information about the star formation rate in the Milky Way and the evolution of solar activity. The constructed CR signal is compared with variations in the Earth's biological productivity as recorded in the isotope delta C-13, which spans more than 3 billion years. CR and fluctuations in biological productivity show a remarkable correlation and indicate that the evolution of climate and the biosphere on the Earth is closely linked to the evolution of the Milky Way.

  11. Baby universe metric equivalent to an interior black-hole metric

    International Nuclear Information System (INIS)

    Gonzalez-Diaz, P.F.

    1991-01-01

    It is shown that the maximally extended metric corresponding to a large wormhole is the unique possible wormhole metric whose baby universe sector is conformally equivalent to the maximal inextendible Kruskal metric corresponding to the interior region of a Schwarzschild black hole whose gravitational radius is half the wormhole neck radius. The physical implications of this result for the black hole evaporation process are discussed. (orig.)

  12. $17 billion needed by year 2000.

    Science.gov (United States)

    Finger, W R

    1995-09-01

    The United Nations Population Fund (UNFPA) estimates that US$17 billion will be needed to fund reproductive health care in developing countries by the year 2000. About US$10 billion would go for family planning; currently, the amount spent on family planning is about US$5 billion. Donors are focusing on fewer countries because of limited resources. The United States Agency for International Development (USAID) is planning to phase out support for family planning in Jamaica and Brazil because the programs there have advanced sufficiently. Resources will be shifted to countries with more pressing needs. Dr. Richard Osborn, senior technical officer for UNFPA, states that UNFPA works with national program managers in allocating resources at the macro level (commodities, training). Currently, two-thirds of family planning funds spent worldwide come from developing country governments (mainly China, India, Indonesia, Mexico, South Africa, Turkey, and Bangladesh). Sustaining programs, much less adding new services, will be difficult. User fees and public-private partnerships are being considered; worldwide, consumers currently provide about 14% of family planning funds (the portion is higher in most Latin American countries). In a few countries, insurance, social security, and other public-private arrangements contribute. Social marketing programs are being considered that would remove constraints on prescriptions and prices and improve the quality of services so that clients would be more willing to pay for contraceptives. Although governments are attempting to fit family planning into their health care budgets, estimates at the national level are difficult to make. Standards are needed to make expenditure estimates quickly and at low cost, according to Dr. Barbara Janowitz of FHI, which is developing guidelines. Studies in Bangladesh, Ecuador, Ghana, Mexico, and the Philippines are being conducted, with the assistance of The Evaluation Project at the Population

  13. Properties of C-metric spaces

    Science.gov (United States)

    Croitoru, Anca; Apreutesei, Gabriela; Mastorakis, Nikos E.

    2017-09-01

    The subject of this paper belongs to the theory of approximate metrics [23]. An approximate metric on X is a real-valued function defined on X × X that satisfies only a part of the metric axioms. In a recent paper [23], we introduced a new type of approximate metric, named C-metric, which is a function satisfying only two metric axioms: symmetry and the triangle inequality. The remarkable fact in a C-metric space is that a topological structure induced by the C-metric can be defined. The innovative idea of this paper is that we obtain some convergence properties of a C-metric space in the absence of a metric. In this paper we investigate C-metric spaces. The paper is divided into four sections. Section 1 is the Introduction. In Section 2 we recall some concepts and preliminary results. In Section 3 we present some properties of C-metric spaces, such as convergence properties, a canonical decomposition and a C-fixed point theorem. Finally, in Section 4 some conclusions are highlighted.
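The two axioms retained by a C-metric, as described in the abstract, can be written explicitly:

```latex
% A C-metric d : X \times X \to \mathbb{R} keeps only two of the usual
% metric axioms (no identity-of-indiscernibles, no non-negativity assumed):
\begin{align*}
  d(x, y) &= d(y, x)                 && \text{(symmetry)} \\
  d(x, z) &\le d(x, y) + d(y, z)     && \text{(triangle inequality)}
\end{align*}
```

Dropping the remaining axioms is what makes the convergence results nontrivial: without identity of indiscernibles, limits need not be unique, which is why the paper's canonical decomposition is needed.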

  14. Learning Low-Dimensional Metrics

    OpenAIRE

    Jain, Lalit; Mason, Blake; Nowak, Robert

    2017-01-01

    This paper investigates the theoretical foundations of metric learning, focused on four key questions that are not fully addressed in prior work: 1) we consider learning general low-dimensional (low-rank) metrics as well as sparse metrics; 2) we develop upper and lower (minimax) bounds on the generalization error; 3) we quantify the sample complexity of metric learning in terms of the dimension of the feature space and the dimension/rank of the underlying metric; 4) we also bound the accuracy ...
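The object being learned can be illustrated with a small sketch (the generic low-rank Mahalanobis form, not the paper's algorithm): a metric d_M(x, y)² = (x − y)ᵀ M (x − y) with M = LᵀL, so M has rank at most the number of rows of L:

```python
import numpy as np

# Generic low-rank metric sketch. L is the (here random, normally learned)
# d x D linear map with d << D; M = L^T L is PSD with rank <= d.
rng = np.random.default_rng(1)
D_feat, d_rank = 10, 2
L = rng.normal(size=(d_rank, D_feat))
M = L.T @ L

x, y = rng.normal(size=D_feat), rng.normal(size=D_feat)
diff = x - y
dist_sq = diff @ M @ diff
# Equivalently, the squared Euclidean distance after projecting by L:
assert np.isclose(dist_sq, np.sum((L @ diff) ** 2))
print(int(np.linalg.matrix_rank(M)), bool(dist_sq >= 0))
```

The equivalence in the assertion is why low-rank metric learning doubles as dimensionality reduction: distances under M are ordinary distances in the d-dimensional projected space.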

  15. Evaluation of selected chemical processes for production of low-cost silicon, phase 2

    Science.gov (United States)

    Blocher, J. M., Jr.; Browning, M. F.; Wilson, W. J.; Carmichael, D. C.

    1977-01-01

    Potential designs for an integrated fluidized-bed reactor/zinc vaporizer/SiCl4 preheater unit are being considered and heat-transfer calculations have been initiated on versions of the zinc vaporizer section. Estimates of the cost of the silicon prepared in the experimental facility have been made for projected capacities of 25, 50, 75, and 100 metric tons of silicon per year. A 35 percent saving is obtained in going from the 25 metric ton/year level to the 50 metric ton/year level. This analysis, coupled with the recognition that use of two reactors in the 50 metric ton/year version allows for continued operation (at reduced capacity) with one reactor shut down, has resulted in a recommendation for adoption of an experimental facility capacity of 50 metric ton/year or greater. At this stage, the change to a larger size facility would not increase the design costs appreciably. In the experimental support program, the effects of seed bed particle size and depth were studied, and operation of the miniplant with a new zinc vaporizer was initiated, revealing the need for modification of the latter.

  16. Scalar-metric and scalar-metric-torsion gravitational theories

    International Nuclear Information System (INIS)

    Aldersley, S.J.

    1977-01-01

    The techniques of dimensional analysis and of the theory of tensorial concomitants are employed to study field equations in gravitational theories which incorporate scalar fields of the Brans-Dicke type. Within the context of scalar-metric gravitational theories, a uniqueness theorem for the geometric (or gravitational) part of the field equations is proven and a Lagrangian is determined which is uniquely specified by dimensional analysis. Within the context of scalar-metric-torsion gravitational theories, a uniqueness theorem for field Lagrangians is presented and the corresponding Euler-Lagrange equations are given. Finally, an example of a scalar-metric-torsion theory is presented which is similar in many respects to the Brans-Dicke theory and the Einstein-Cartan theory.

  17. Metrics of quantum states

    International Nuclear Information System (INIS)

    Ma Zhihao; Chen Jingling

    2011-01-01

    In this work we study metrics of quantum states, which are natural generalizations of the usual trace metric and Bures metric. Some useful properties of the metrics are proved, such as the joint convexity and contractivity under quantum operations. Our result has a potential application in studying the geometry of quantum states as well as the entanglement detection.

  18. METRIC context unit architecture

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, R.O.

    1988-01-01

    METRIC is an architecture for a simple but powerful Reduced Instruction Set Computer (RISC). Its speed comes from the simultaneous processing of several instruction streams, with instructions from the various streams being dispatched into METRIC's execution pipeline as they become available for execution. The pipeline is thus kept full, with a mix of instructions for several contexts in execution at the same time. True parallel programming is supported within a single execution unit, the METRIC Context Unit. METRIC's architecture provides for expansion through the addition of multiple Context Units and of specialized Functional Units. The architecture thus spans a range of size and performance from a single-chip microcomputer up through large and powerful multiprocessors. This research concentrates on the specification of the METRIC Context Unit at the architectural level. Performance tradeoffs made during METRIC's design are discussed, and projections of METRIC's performance are made based on simulation studies.

  19. Potash: a global overview of evaporate-related potash resources, including spatial databases of deposits, occurrences, and permissive tracts: Chapter S in Global mineral resource assessment

    Science.gov (United States)

    Orris, Greta J.; Cocker, Mark D.; Dunlap, Pamela; Wynn, Jeff C.; Spanski, Gregory T.; Briggs, Deborah A.; Gass, Leila; Bliss, James D.; Bolm, Karen S.; Yang, Chao; Lipin, Bruce R.; Ludington, Stephen; Miller, Robert J.; Słowakiewicz, Mirosław

    2014-01-01

    Potash is mined worldwide to provide potassium, an essential nutrient for food crops. Evaporite-hosted potash deposits are the largest source of salts that contain potassium in water-soluble form, including potassium chloride, potassium-magnesium chloride, potassium sulfate, and potassium nitrate. Thick sections of evaporitic salt that form laterally continuous strata in sedimentary evaporite basins are the most common host for stratabound and halokinetic potash-bearing salt deposits. Potash-bearing basins may host tens of millions to more than 100 billion metric tons of potassium oxide (K2O). Examples of these deposits include those in the Elk Point Basin in Canada, the Pripyat Basin in Belarus, the Solikamsk Basin in Russia, and the Zechstein Basin in Germany.

  20. A retrospective analysis of benefits and impacts of U.S. renewable portfolio standards

    International Nuclear Information System (INIS)

    Barbose, Galen; Wiser, Ryan; Heeter, Jenny; Mai, Trieu; Bird, Lori; Bolinger, Mark; Carpenter, Alberta; Heath, Garvin; Keyser, David; Macknick, Jordan; Mills, Andrew; Millstein, Dev

    2016-01-01

    As states consider revising or developing renewable portfolio standards (RPS), they are evaluating policy costs, benefits, and other impacts. We present the first U.S. national-level assessment of state RPS program benefits and impacts, focusing on new renewable electricity resources used to meet RPS compliance obligations in 2013. In our central-case scenario, reductions in life-cycle greenhouse gas emissions from displaced fossil fuel-generated electricity resulted in $2.2 billion of global benefits. Health and environmental benefits from reductions in criteria air pollutants (sulfur dioxide, nitrogen oxides, and particulate matter 2.5) were even greater, estimated at $5.2 billion in the central case. Further benefits accrued in the form of reductions in water withdrawals and consumption for power generation. Finally, although best considered resource transfers rather than net societal benefits, new renewable electricity generation used for RPS compliance in 2013 also supported nearly 200,000 U.S.-based gross jobs and reduced wholesale electricity prices and natural gas prices, saving consumers a combined $1.3–$4.9 billion. In total, the estimated benefits and impacts well exceed previous estimates of RPS compliance costs. - Highlights: •Benefits of satisfying U.S. renewable portfolio standards in 2013 were evaluated. •Carbon dioxide (equivalent) was cut by 59 million metric tons (worth $2.2 billion). •Reduced air pollution provided $5.2 billion in health and environmental benefits. •Water withdrawals (830 billion gal) and consumption (27 billion gal) were reduced. •Job/economic, electricity price, and natural gas price impacts were also evaluated.
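Tallying the abstract's central-case dollar figures (noting, as the authors do, that the consumer savings are resource transfers rather than net societal benefits, so the combined total is illustrative only):

```python
# Central-case 2013 figures quoted in the abstract, in billions of dollars.
ghg_benefits = 2.2             # global climate benefits
health_benefits = 5.2          # air-quality health/environmental benefits
consumer_savings = (1.3, 4.9)  # electricity + natural gas price impacts (range)

# Illustrative combined range; transfers and benefits are mixed here on purpose
# to show how the "well exceed compliance costs" comparison is assembled.
total_low = ghg_benefits + health_benefits + consumer_savings[0]
total_high = ghg_benefits + health_benefits + consumer_savings[1]
print(round(total_low, 1), round(total_high, 1))  # → 8.7 12.3
```

So even before counting water savings and jobs, the quantified 2013 figures sum to roughly $8.7-12.3 billion.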

  1. Valorisation et Recyclage des Déchets Plastiques dans le Béton

    Directory of Open Access Journals (Sweden)

    Benimam Samir

    2014-04-01

    Full Text Available The valorization of waste in civil engineering is an important sector insofar as the desired products are not subject to overly rigorous quality criteria. Waste recycling touches on two very important kinds of impact, namely the environmental impact and the economic impact. Thus, in many countries of the world, various wastes are used in the field of construction, and especially in cement or concrete, as powder, fibers, or aggregates. This work concerns the valorization of a waste that is harmful to the environment given its bulky and unsightly character: plastic waste. Three types of plastic waste are added to concrete, in the form of grains and of fibers (corrugated and straight). The properties in the fresh state (workability, entrapped air, and density) and in the hardened state (compressive strength, tensile strength, shrinkage, and water absorption) of the various concretes produced are analyzed and compared with their respective controls. From the experimental results, it can be concluded that reinforcing the cement matrix with corrugated plastic fibers shows a clear improvement in the tensile strength of the concrete, along with a remarkable decrease in its water absorption capacity when plastic grains are used.

  2. Observable traces of non-metricity: New constraints on metric-affine gravity

    Science.gov (United States)

    Delhom-Latorre, Adrià; Olmo, Gonzalo J.; Ronco, Michele

    2018-05-01

    Relaxing the Riemannian condition to incorporate geometric quantities such as torsion and non-metricity may allow us to explore new physics associated with defects in a hypothetical space-time microstructure. Here we show that non-metricity produces observable effects in quantum fields in the form of 4-fermion contact interactions, thereby allowing us to constrain the scale of non-metricity to be greater than 1 TeV by using results on Bhabha scattering. Our analysis is carried out in the framework of a wide class of theories of gravity in the metric-affine approach. The bound obtained represents an improvement of several orders of magnitude over previous experimental constraints.

  3. Increase of alcohol yield per ton of pulp

    Energy Technology Data Exchange (ETDEWEB)

    Sokolov, B N

    1957-01-01

    Digestion processes of cellulose were studied under production conditions. When the digestion was carried out with acid having 5.2% total SO/sub 2/ and 0.92% CaO, the concentration of total sugars in the spent liquor was 1.8 to 2.5%. When the acidity was reduced to 4.8% total SO/sub 2/ and 0.82% CaO, all other conditions being the same, the sugar concentration in the spent liquor increased to 3.0 to 3.7%. The importance of the acid strength and CaO content of the cooking liquor was further demonstrated at the end of 1955. At that time the total SO/sub 2/ in the acid rose to 8% while the amount of CaO remained practically the same, 0.85 to 0.90%. These conditions permitted an increase in the amount of chips by 25 to 30%, which further changed the ratio CaO:wood and created conditions favorable for an improved yield of sugar. The increase in the activity of the acid was reflected favorably in the degree of hydrolysis of the hemicelluloses and in the degree to which the oligosaccharides or polysaccharides were hydrolyzed to simple sugars. At that time the yield of alcohol reached 53 l/ton of unbleached pulp. The process was further improved in 1956 by the use of successive washings; at the end of the digestion period the concentrated spent liquor was piped to the alcohol unit. The yield of alcohol reached 59.4 l/ton of pulp. Sugar recovery from the tank was 92.5% of that theoretically possible. Further improvements resulted from saturating the wood chips with acid under variable pressures. As a result, the base of the cooking acid was reduced to 0.7 to 0.72% and, at the end of the process, the liquor contained 0.03 to 0.06% CaO instead of 0.2 to 0.18%. The alcohol yield per ton of pulp then rose to 66.8 l.

  4. Factory Acceptance Test Procedure Westinghouse 100 ton Hydraulic Trailer

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1994-01-01

    This Factory Acceptance Test Procedure (FAT) is for the Westinghouse 100 Ton Hydraulic Trailer. The trailer will be used for the removal of the 101-SY pump. This procedure includes: safety check and safety procedures; pre-operation check out; startup; leveling trailer; functional/proofload test; proofload testing; and rolling load test.

  5. Soviet Military Power

    Science.gov (United States)

    1987-03-01

    …some 27,000 meters of bridging equipment, … million metric tons of arms and ammunition, and 60 million metric tons of petrol (fuel)… During a war, supplies from strategic stocks, along with over 9 million metric tons of petrol… has been a target for new Soviet overtures through political influence operations… Deputy Foreign Minister Vladimir Petrovskiy went to Tunisia, Iraq…

  6. ICARUS 600 ton: A status report

    CERN Document Server

    Vignoli, C; Badertscher, A; Barbieri, E; Benetti, P; Borio di Tigliole, A; Brunetti, R; Bueno, A; Calligarich, E; Campanelli, Mario; Carli, F; Carpanese, C; Cavalli, D; Cavanna, F; Cennini, P; Centro, S; Cesana, A; Chen, C; Chen, Y; Cinquini, C; Cline, D; De Mitri, I; Dolfini, R; Favaretto, D; Ferrari, A; Gigli Berzolari, A; Goudsmit, P; He, K; Huang, X; Li, Z; Lu, F; Ma, J; Mannocchi, G; Mauri, F; Mazza, D; Mazzone, L; Montanari, C; Nurzia, G P; Otwinowski, S; Palamara, O; Pascoli, D; Pepato, A; Periale, L; Petrera, S; Piano Mortari, Giovanni; Piazzoli, A; Picchi, P; Pietropaolo, F; Rancati, T; Rappoldi, A; Raselli, G L; Rebuzzi, D; Revol, J P; Rico, J; Rossella, M; Rossi, C; Rubbia, C; Rubbia, A; Sala, P; Scannicchio, D; Sergiampietri, F; Suzuki, S; Terrani, M; Ventura, S; Verdecchia, M; Wang, H; Woo, J; Xu, G; Xu, Z; Zhang, C; Zhang, Q; Zheng, S

    2000-01-01

    The goal of the ICARUS Project is the installation of a multi-kiloton LAr TPC in the underground Gran Sasso Laboratory. The programme foresees the realization of the detector in a modular way. The first step is the construction of a 600 ton module which is now at an advanced phase. It will be mounted and tested in Pavia in one year and then it will be moved to Gran Sasso for the final operation. The major cryogenic and purification systems and the mechanical components of the detector have been constructed and tested in a 10 m3 prototype. The results of these tests are here summarized.

  7. Foreign Agricultural Trade of the United States March/April 1991

    OpenAIRE

    Warden, Thomas

    1991-01-01

    U.S. agricultural exports fell to $16.5 billion (55 million tons) during the first 5 months of FY 1991, a decline of $1.4 billion and 14 million tons... Agricultural imports remained at $9.4 billion during the first 5 months of FY 1991... USDA allocated $200 million to 47 trade organizations to conduct export market promotions for a variety of agricultural products during FY 1991... Japan was the top market for U.S. agricultural exports during the past 3 fiscal years, followed by the Soviet...

  8. BUILDING A BILLION SPATIO-TEMPORAL OBJECT SEARCH AND VISUALIZATION PLATFORM

    Directory of Open Access Journals (Sweden)

    D. Kakkar

    2017-10-01

    Full Text Available With funding from the Sloan Foundation and Harvard Dataverse, the Harvard Center for Geographic Analysis (CGA) has developed a prototype spatio-temporal visualization platform called the Billion Object Platform or BOP. The goal of the project is to lower barriers for scholars who wish to access large, streaming, spatio-temporal datasets. The BOP is now loaded with the latest billion geo-tweets, and is fed a real-time stream of about 1 million tweets per day. The geo-tweets are enriched with sentiment and census/admin boundary codes when they enter the system. The system is open source and is currently hosted on Massachusetts Open Cloud (MOC), an OpenStack environment with all components deployed in Docker orchestrated by Kontena. This paper will provide an overview of the BOP architecture, which is built on an open source stack consisting of Apache Lucene, Solr, Kafka, Zookeeper, Swagger, scikit-learn, OpenLayers, and AngularJS. The paper will further discuss the approach used for harvesting, enriching, streaming, storing, indexing, visualizing and querying a billion streaming geo-tweets.

  9. Building a Billion Spatio-Temporal Object Search and Visualization Platform

    Science.gov (United States)

    Kakkar, D.; Lewis, B.

    2017-10-01

    With funding from the Sloan Foundation and Harvard Dataverse, the Harvard Center for Geographic Analysis (CGA) has developed a prototype spatio-temporal visualization platform called the Billion Object Platform or BOP. The goal of the project is to lower barriers for scholars who wish to access large, streaming, spatio-temporal datasets. The BOP is now loaded with the latest billion geo-tweets, and is fed a real-time stream of about 1 million tweets per day. The geo-tweets are enriched with sentiment and census/admin boundary codes when they enter the system. The system is open source and is currently hosted on Massachusetts Open Cloud (MOC), an OpenStack environment with all components deployed in Docker orchestrated by Kontena. This paper will provide an overview of the BOP architecture, which is built on an open source stack consisting of Apache Lucene, Solr, Kafka, Zookeeper, Swagger, scikit-learn, OpenLayers, and AngularJS. The paper will further discuss the approach used for harvesting, enriching, streaming, storing, indexing, visualizing and querying a billion streaming geo-tweets.

  10. $35 billion habit: will nuclear cost overruns bankrupt the utilities

    International Nuclear Information System (INIS)

    Morgan, R.E.

    1980-01-01

    The Nuclear Regulatory Commission (NRC) has proposed some 150 modifications in the design and operation of nuclear power plants as a result of the accident at Three Mile Island. The Atomic Industrial Forum estimates the total cost of the NRC's proposed rule changes at $35.5 billion ($3.5 billion in capital costs for the entire industry, and $32 billion in outage and construction-delay costs to the utilities) for existing facilities and for those with construction well underway. The changes range from improved training for reactor workers to a major overhaul of the reactor-containment design. The nuclear industry is asking the NRC to modify the proposals, citing excessive costs (like the $100 million in changes needed for a plant that cost $17 million to build) and safety concerns (some of the complex regulations may interfere with safety). Financing the changes has become a major problem for the utilities. If the regulators allow all the costs to be passed along, the author believes electricity will become too expensive for the consumer.

  11. Backlog at December 31, 2007: euro 39.8 billion, up by 55% from year-end 2006. 2007 sales revenue: euro 11.9 billion, up by 9.8% (+10.4% like-for-like)

    International Nuclear Information System (INIS)

    2008-01-01

    The AREVA group's backlog reached a record level of euro 39.834 billion as of December 31, 2007, up by 55% from that of year-end 2006. In Nuclear, the backlog was euro 34.927 billion at year-end 2007 (+58%), due in particular to the signature of a contract in a record amount with the Chinese utility CGNPC. The series of agreements concluded provide among other things for the construction of two new-generation EPR nuclear islands and the supply of all of the materials and services needed for their operation through 2027. CGNPC also bought 35% of the production of UraMin, the mining company acquired by AREVA in August 2007. Industrial cooperation in the Back End of the cycle was launched with the signature of an agreement between China and France. In addition, the group signed several long-term contracts in significant amounts, particularly with KHNP of South Korea, EDF and Japanese utilities. The Transmission and Distribution division won several major contracts in Libya and Qatar at the end of the year approaching a total of euro 750 million. For the entire year, new orders grew by 34% to euro 5.816 billion. The backlog, meanwhile, grew by 40% to euro 4.906 billion at year-end. The group cleared sales revenue of euro 11.923 billion in 2007, up by 9.8% (+10.4% like-for-like) in relation to 2006 sales of euro 10.863 billion. Sales revenue for the 4th quarter of 2007 rose to euro 3.858 billion, for growth of 16.7% (+18.8% like-for-like) over one year. Sales revenue for the year was marked by: - Growth of 7.6% (+10.6% like-for-like) in Front End sales revenue, which rose to euro 3.140 billion. The division's Enrichment operations posted strong growth. - Sales were up by 17.5% (+15.2% like-for-like) to euro 2.717 billion in the Reactors and Services division. Sales revenue was driven in particular by the growth of Services operations, after weak demand in 2006, by progress on OL3 construction, and by the start of Flamanville 3, the second EPR.
For the Back End division

  12. Implications of enhanced effectiveness of vincristine sulfate/ε-viniferin combination compared to vincristine sulfate only on HepG2 cells

    Directory of Open Access Journals (Sweden)

    Filiz Özdemir

    2016-12-01

    Full Text Available Objective: This study was designed to investigate the effects of ε-viniferin (ε-VNF) on the mitochondrial pathway of apoptosis and on late apoptosis in HepG2 cell lines. To observe these effects, ε-VNF and vincristine sulfate (VNC), anti-cancer drugs used for treatment on HepG2 cells, were administered either alone or in combination at different time intervals. Methods: Mitochondrial membrane potential changes in the cells (ΔΨm) were evaluated using the cationic dye JC-1, while Bax and Bcl-2 expression levels were analyzed with RT-PCR and caspase-3 activity was analyzed using a kit. For detection of apoptotic activity, an in situ TUNEL assay was performed. Results: When 98.3 µM ε-VNF, 52.5 µM VNC and the 11.25+15.8 µM VNC+ε-VNF combination were compared with the control group, ΔΨm changes at the 6th hour were found to be 19.5%, 5.5%, 24.6%, and 3.5%, respectively. These findings show that the combination group (24.6%) resulted in early apoptosis of the cell at the 6th hour. Bax mRNA expression increased at the 24th hour in the VNC+ε-VNF group compared to the control group (160%), and caspase-3 activation increased in the 11.25+15.8 µM VNC+ε-VNF group compared to the control group at the 48th hour. The detection of DNA fragments in HepG2 cells within 24 hours suggests direct apoptosis. Conclusion: These findings demonstrate that the doses administered to the VNC+ε-VNF combination group were lower than those administered to groups using one agent alone (e.g. VNC), which reduces the possibility of administering toxic doses.

  13. Rhenium

    Science.gov (United States)

    John, David A.; Seal, Robert R.; Polyak, Désirée E.; Schulz, Klaus J.; DeYoung, John H.; Seal, Robert R.; Bradley, Dwight C.

    2017-12-19

    Rhenium is one of the rarest elements in Earth’s continental crust; its estimated average crustal abundance is less than 1 part per billion. Rhenium is a metal that has an extremely high melting point and a heat-stable crystalline structure. More than 80 percent of the rhenium consumed in the world is used in high-temperature superalloys, especially those used to make turbine blades for jet aircraft engines. Rhenium’s other major application is in platinum-rhenium catalysts used in petroleum refining. Rhenium rarely occurs as a native element or as its own sulfide mineral; most rhenium is present as a substitute for molybdenum in molybdenite. Annual world mine production of rhenium is about 50 metric tons. Nearly all primary rhenium production (that is, rhenium produced by mining rather than through recycling) is as a byproduct of copper mining, and about 80 percent of the rhenium obtained through mining is recovered from the flue dust produced during the roasting of molybdenite concentrates from porphyry copper deposits. Molybdenite in porphyry copper deposits can contain hundreds to several thousand grams per metric ton of rhenium, although the estimated rhenium grades of these deposits range from less than 0.1 gram per metric ton to about 0.6 gram per metric ton. Continental-arc porphyry copper-(molybdenum-gold) deposits supply most of the world’s rhenium production and have large inferred rhenium resources. Porphyry copper mines in Chile account for about 55 percent of the world’s mine production of rhenium; rhenium is also recovered from porphyry copper deposits in the United States, Armenia, Kazakhstan, Mexico, Peru, Russia, and Uzbekistan. Sediment-hosted strata-bound copper deposits in Kazakhstan (of the sandstone type) and in Poland (of the reduced-facies, or Kupferschiefer, type) account for most other rhenium produced by mining. These types of deposits also have large amounts of identified rhenium resources. The future supply of rhenium is likely

  14. 10'000 ton ALICE gets her UK-built "Brain"

    CERN Multimedia

    Maddock, Julia

    2007-01-01

    For one of the four LHC experiments, called ALICE, the process got a step closer last week when a crucial part of the 10'000-ton detector, the British-built Central Trigger Processor (CTP), was installed in the ALICE cavern, some 150 feet underground. (plus background information about ALICE) (2.5 pages)

  15. Metric diffusion along foliations

    CERN Document Server

    Walczak, Szymon M

    2017-01-01

    Up-to-date research in metric diffusion along compact foliations is presented in this book. Beginning with fundamentals from optimal transportation theory and the theory of foliations, this book moves on to cover the Wasserstein distance, the Kantorovich Duality Theorem, and the metrization of the weak topology by the Wasserstein distance. Metric diffusion is defined, the topology of the metric space is studied, and the limits of diffused metrics along compact foliations are discussed. Essentials on foliations, holonomy, heat diffusion, and compact foliations are detailed and vital technical lemmas are proved to aid understanding. Graduate students and researchers in the geometry, topology and dynamics of foliations and laminations will find this supplement useful as it presents facts about metric diffusion along non-compact foliations and provides a full description of the limit for metrics diffused along a foliation with at least one compact leaf in dimension two.

  16. Analyzing and Comparing Biomass Feedstock Supply Systems in China: Corn Stover and Sweet Sorghum Case Studies

    Directory of Open Access Journals (Sweden)

    Lantian Ren

    2015-06-01

    Full Text Available This paper analyzes the rural Chinese biomass supply system and models supply chain operations according to U.S. concepts of logistical unit operations: harvest and collection, storage, transportation, preprocessing, and handling and queuing. In this paper, we quantify the logistics cost of corn stover and sweet sorghum in China under different scenarios. We analyze three scenarios of corn stover logistics from northeast China and three scenarios of sweet sorghum stalk logistics from Inner Mongolia in China. The case study estimates the logistics cost of corn stover and sweet sorghum stalk to be $52.95/dry metric ton and $52.64/dry metric ton, respectively, for the current labor-based biomass logistics system. However, if the feedstock logistics operation is mechanized, the cost of corn stover and sweet sorghum stalk decreases to $36.01/dry metric ton and $35.76/dry metric ton, respectively. The study also includes a sensitivity analysis to identify the cost factors that cause logistics cost variation. Results of the sensitivity analysis show that labor price has the most influence on the logistics cost of corn stover and sweet sorghum stalk, with a variation of $6 to $12/dry metric ton.
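
    The savings from mechanization reported in this record can be cross-checked with a few lines of arithmetic (an editorial sanity check; the dollar figures come from the abstract, the ~32% reduction is derived here, not stated by the authors):

```python
# Logistics costs from the abstract, in $/dry metric ton.
manual = {"corn_stover": 52.95, "sweet_sorghum": 52.64}
mechanized = {"corn_stover": 36.01, "sweet_sorghum": 35.76}

# Percentage reduction achieved by mechanizing each supply chain.
reduction_pct = {
    crop: round(100 * (manual[crop] - mechanized[crop]) / manual[crop], 1)
    for crop in manual
}
print(reduction_pct)  # mechanization cuts logistics cost by roughly a third
```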

  17. Integrating the stabilization of nuclear materials

    Energy Technology Data Exchange (ETDEWEB)

    Dalton, H.F. [Department of Energy, Washington, DC (United States)

    1996-05-01

    In response to Recommendation 94-1 of the Defense Nuclear Facilities Safety Board, the Department of Energy committed to stabilizing specific nuclear materials within 3 and 8 years. These efforts are underway. The Department has already repackaged the plutonium at Rocky Flats and metal turnings at Savannah River that had been in contact with plastic. As this effort proceeds, we begin to look at activities beyond stabilization and prepare for the final disposition of these materials. To describe the plutonium materials being stabilized, Figure 1 illustrates the quantities of plutonium in various forms that will be stabilized. Plutonium as metal comprises 8.5 metric tons. Plutonium oxide contains 5.5 metric tons of plutonium. Plutonium residues and solutions, together, contain 7 metric tons of plutonium. Figure 2 shows the quantity of plutonium-bearing material in these four categories. In this depiction, 200 metric tons of plutonium residues and 400 metric tons of solutions containing plutonium constitute most of the material in the stabilization program. So, it is not surprising that much of the work in stabilization is directed toward the residues and solutions, even though they contain less of the plutonium.
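
    The quantities quoted for Figures 1 and 2 imply the totals below (an editorial check on the record's arithmetic; the masses are from the abstract, the derived total and fraction are ours):

```python
# Plutonium content by form, in metric tons (from the abstract).
plutonium_content = {"metal": 8.5, "oxide": 5.5, "residues_and_solutions": 7.0}
total_pu = sum(plutonium_content.values())  # total plutonium to be stabilized

# Bulk plutonium-bearing material, in metric tons.
bulk_material = {"residues": 200.0, "solutions": 400.0}

# Residues and solutions dominate the bulk mass but hold only a third of the
# plutonium, which is why stabilization effort concentrates on them.
pu_fraction_in_residues = plutonium_content["residues_and_solutions"] / total_pu
print(total_pu, round(pu_fraction_in_residues, 2))
```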

  18. Flexural behavior of fiber-reinforced concretes under cyclic loading

    Directory of Open Access Journals (Sweden)

    Boulekbache Bensaid

    2014-04-01

    Full Text Available This paper presents the results of an experimental study of the flexural behavior of steel-fiber-reinforced concretes. We study the effect of the rheology of the concrete on the orientation of the fibers and the influence of that orientation on the mechanical properties. The stiffness of fiber anchorage, studied through cyclic tests, is related to the rheological and mechanical characteristics of the matrix. The results show that the fluidity of the concrete is an essential parameter for fiber orientation. Once an orientation favorable to mechanical efficiency is obtained, the flexural strength is markedly improved.

  19. Fault Management Metrics

    Science.gov (United States)

    Johnson, Stephen B.; Ghoshal, Sudipto; Haste, Deepak; Moore, Craig

    2017-01-01

    This paper describes the theory and considerations in the application of metrics to measure the effectiveness of fault management. Fault management refers here to the operational aspect of system health management, and as such is considered a meta-control loop that operates to preserve or maximize the system's ability to achieve its goals in the face of current or prospective failure. As a suite of control loops, the metrics to estimate and measure the effectiveness of fault management are similar to those of classical control loops in being divided into two major classes: state estimation and state control. State estimation metrics can be classified into lower-level subdivisions for detection coverage, detection effectiveness, fault isolation and fault identification (diagnostics), and failure prognosis. State control metrics can be classified into response determination effectiveness and response effectiveness. These metrics are applied to each and every fault management control loop in the system, for each failure to which they apply, and probabilistically summed to determine the effectiveness of these fault management control loops to preserve the relevant system goals that they are intended to protect.

  20. Completion of a Dislocated Metric Space

    Directory of Open Access Journals (Sweden)

    P. Sumati Kumari

    2015-01-01

    Full Text Available We provide a construction for the completion of a dislocated metric space (abbreviated d-metric space; we also prove that the completion of the metric associated with a d-metric coincides with the metric associated with the completion of the d-metric.

  1. The financial attractiveness assessment of large waste management projects registered as clean development mechanism

    International Nuclear Information System (INIS)

    Bufoni, André Luiz; Oliveira, Luciano Basto; Rosa, Luiz Pinguelli

    2015-01-01

    Highlights: • Projects are not financially attractive without registration as CDMs. • WM benchmarks and indicators are converging and reducing in variance. • A sensitivity analysis reveals that revenue has more of an effect on the financial results. • Results indicate that an extensive database would reduce WM project risk and capital costs. • Disclosure standards would make information more comparable worldwide. - Abstract: This study illustrates the financial analyses for demonstration and assessment of additionality presented in the project design documents (PDD) and enclosed documents of the 431 large Clean Development Mechanism (CDM) projects classified as the ‘waste handling and disposal sector’ (13) over the past ten years (2004–2014). The expected certified emissions reductions (CER) of these projects total 63.54 million metric tons of CO2eq, where eight countries account for 311 projects and 43.36 million metric tons. All of the projects declare themselves ‘not financially attractive’ without CER, with an estimated sum of negative results of approximately half a billion US$. The results indicate that WM benchmarks and indicators are converging and reducing in variance, and the sensitivity analysis reveals that revenues have a greater effect on the financial results. This work concludes that an extensive financial database with simple standards for disclosure would greatly diminish statement problems and make information more comparable, reducing the risk and capital costs of WM projects
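
    The concentration of the portfolio in a handful of countries can be quantified from the figures in the abstract (an editorial calculation; the derived shares are not stated in the record):

```python
# Portfolio figures from the abstract: 431 projects and 63.54 million metric
# tons CO2eq of expected CERs in total, of which eight countries account for
# 311 projects and 43.36 million metric tons.
projects_total, projects_top8 = 431, 311
cer_total, cer_top8 = 63.54, 43.36

share_projects = round(100 * projects_top8 / projects_total, 1)  # % of projects
share_cer = round(100 * cer_top8 / cer_total, 1)                 # % of CERs
print(share_projects, share_cer)
```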

  2. The financial attractiveness assessment of large waste management projects registered as clean development mechanism

    Energy Technology Data Exchange (ETDEWEB)

    Bufoni, André Luiz, E-mail: bufoni@facc.ufrj.br [Energy Planning Program, Universidade Federal do Rio de Janeiro PPE/COPPE/UFRJ (Brazil); Oliveira, Luciano Basto [International Virtual Institute of Global Changes IVIG/COPPE/UFRJ (Brazil); Rosa, Luiz Pinguelli [Energy Planning Program, Universidade Federal do Rio de Janeiro PPE/COPPE/UFRJ (Brazil)

    2015-09-15

    Highlights: • Projects are not financially attractive without registration as CDMs. • WM benchmarks and indicators are converging and reducing in variance. • A sensitivity analysis reveals that revenue has more of an effect on the financial results. • Results indicate that an extensive database would reduce WM project risk and capital costs. • Disclosure standards would make information more comparable worldwide. - Abstract: This study illustrates the financial analyses for demonstration and assessment of additionality presented in the project design documents (PDD) and enclosed documents of the 431 large Clean Development Mechanism (CDM) projects classified as the ‘waste handling and disposal sector’ (13) over the past ten years (2004–2014). The expected certified emissions reductions (CER) of these projects total 63.54 million metric tons of CO2eq, where eight countries account for 311 projects and 43.36 million metric tons. All of the projects declare themselves ‘not financially attractive’ without CER, with an estimated sum of negative results of approximately half a billion US$. The results indicate that WM benchmarks and indicators are converging and reducing in variance, and the sensitivity analysis reveals that revenues have a greater effect on the financial results. This work concludes that an extensive financial database with simple standards for disclosure would greatly diminish statement problems and make information more comparable, reducing the risk and capital costs of WM projects.

  3. Analysis and evaluation of the result of the 1995 mechanization program

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Shik; Lee, Kyung Woon; Kim, Oak Hwan; Kim, Dae Kyung [Korea Institute of Geology Mining and Materials, Taejon (Korea, Republic of)

    1996-12-01

    The shrinking coal market has been forcing the coal industry to make exceptional rationalization and restructuring efforts since the end of the eighties. To the competition from crude oil and natural gas has been added the growing pressure from rising wages and rising production costs as the workings get deeper. To improve the competitive position of the coal mines against oil and gas through cost reduction, studies to improve the mining system have been carried out. The production of 1995 amounted to 5,719 thousand tons, which is less than the previous year's production by 1,719 thousand tons. The investment put toward coal mining mechanization totaled 7,478 million won, consisting of 4,176 million won of government subsidy and 3,302 million won of the companies' own capital. By increasing the mechanization rate from 72% of the previous year to 75% and overall O.M.S. from 1.63 ton/man·shift to 1.68 ton/man·shift, the mechanization program got 218.0 billion won of returns compared to the base year 1980, which can be broken down as follows: 1) 89.3 billion won from increase in productivity, 2) 22.0 billion won from increase in calorific value, 3) 5.4 billion won from increase in recovery rate, 4) 101.3 billion won from reduction in accident rate. (author). tabs., 5 figs.
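
    The returns breakdown and the investment split quoted in this record can be verified to sum correctly (an editorial arithmetic check; all figures are taken from the abstract):

```python
# Returns attributed to the 1995 mechanization program, in billion won,
# relative to the 1980 base year.
returns_billion_won = {
    "productivity_increase": 89.3,
    "calorific_value_increase": 22.0,
    "recovery_rate_increase": 5.4,
    "accident_rate_reduction": 101.3,
}
total_returns = round(sum(returns_billion_won.values()), 1)

# Mechanization investment, in million won.
investment_million_won = {"government_subsidy": 4_176, "company_capital": 3_302}
total_investment = sum(investment_million_won.values())

print(total_returns, total_investment)  # should match 218.0 and 7,478
```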

  4. A two-billion-year history for the lunar dynamo.

    Science.gov (United States)

    Tikoo, Sonia M; Weiss, Benjamin P; Shuster, David L; Suavet, Clément; Wang, Huapei; Grove, Timothy L

    2017-08-01

    Magnetic studies of lunar rocks indicate that the Moon generated a core dynamo with surface field intensities of ~20 to 110 μT between at least 4.25 and 3.56 billion years ago (Ga). The field subsequently declined, and our results extend the known lifetime of the lunar dynamo by at least 1 billion years. Such a protracted history requires an extraordinarily long-lived power source like core crystallization or precession. No single dynamo mechanism proposed thus far can explain the strong fields inferred for the period before 3.56 Ga while also allowing the dynamo to persist in such a weakened state beyond ~2.5 Ga. Therefore, our results suggest that the dynamo was powered by at least two distinct mechanisms operating during early and late lunar history.

  5. Cow power: the energy and emissions benefits of converting manure to biogas

    International Nuclear Information System (INIS)

    Cuellar, Amanda D; Webber, Michael E

    2008-01-01

    This report consists of a top-level aggregate analysis of the total potential for converting livestock manure into a domestic renewable fuel source (biogas) that could be used to help states meet renewable portfolio standard requirements and reduce greenhouse gas (GHG) emissions. In the US, livestock agriculture produces over one billion tons of manure annually on a renewable basis. Most of this manure is disposed of in lagoons or stored outdoors to decompose. Such disposal methods emit methane and nitrous oxide, two important GHGs with 21 and 310 times the global warming potential of carbon dioxide, respectively. In total, GHG emissions from the agricultural sector in the US amounted to 536 million metric tons (MMT) of carbon dioxide equivalent, or 7% of the total US emissions in 2005. Of this agricultural contribution, 51 to 118 MMT of carbon dioxide equivalent resulted from livestock manure emissions alone, with trends showing this contribution increasing from 1990 to 2005. Thus, limiting GHG emissions from manure represents a valuable starting point for mitigating agricultural contributions to global climate change. Anaerobic digestion, a process that converts manure to methane-rich biogas, can lower GHG emissions from manure significantly. Using biogas as a substitute for other fossil fuels, such as coal for electricity generation, replaces two GHG sources, manure and coal combustion, with a less carbon-intensive source, namely biogas combustion. The biogas energy potential was calculated using values for the amount of biogas energy that can be produced per animal unit (defined as 1000 pounds of animal) per day and the number of animal units in the US. The 95 million animal units in the country could produce nearly 1 quad of renewable energy per year, amounting to approximately 1% of the US total energy consumption. Converting the biogas into electricity using standard microturbines could produce 88 ± 20 billion kWh, or 2.4 ± 0.6% of annual electricity
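
    The record's electricity figures are mutually consistent, which a short calculation makes visible (an editorial check; the implied total US consumption of roughly 3,700 billion kWh is our inference from the stated 88 billion kWh and 2.4% share, not a number in the abstract):

```python
# Central values from the abstract: 88 billion kWh of biogas electricity,
# equal to 2.4% of annual US electricity consumption.
biogas_kwh = 88e9
share_of_us_electricity = 0.024

# Total US consumption implied by those two numbers, in billion kWh.
implied_total_billion_kwh = round(biogas_kwh / share_of_us_electricity / 1e9)
print(implied_total_billion_kwh)  # roughly mid-2000s US electricity consumption
```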

  6. Metrics with vanishing quantum corrections

    International Nuclear Information System (INIS)

    Coley, A A; Hervik, S; Gibbons, G W; Pope, C N

    2008-01-01

    We investigate solutions of the classical Einstein or supergravity equations that solve any set of quantum-corrected Einstein equations in which the Einstein tensor plus a multiple of the metric is equated to a symmetric conserved tensor T_μν(g_αβ, ∂_τ g_αβ, ∂_τ ∂_σ g_αβ, ...) constructed from sums of terms involving contractions of the metric and powers of arbitrary covariant derivatives of the curvature tensor. A classical solution, such as an Einstein metric, is called universal if, when evaluated on that Einstein metric, T_μν is a multiple of the metric. A Ricci-flat classical solution is called strongly universal if, when evaluated on that Ricci-flat metric, T_μν vanishes. It is well known that pp-waves in four spacetime dimensions are strongly universal. We focus attention on a natural generalization: Einstein metrics with holonomy Sim(n - 2) in which all scalar invariants are zero or constant. In four dimensions we demonstrate that the generalized Ghanam-Thompson metric is weakly universal and that the Goldberg-Kerr metric is strongly universal; indeed, we show that universality extends to all four-dimensional Sim(2) Einstein metrics. We also discuss generalizations to higher dimensions

  7. Petroleum and natural gas in the Federal Republic of Germany 2001; Erdoel und Erdgas in der Bundesrepublik Deutschland 2001

    Energy Technology Data Exchange (ETDEWEB)

    Pasternak, M.; Kosinowski, M.; Loesch, J.; Messner, J.; Sedlacek, R.

    2002-07-01

    After the preceding years had seen the lowest level of exploration activity, the year 2001 marked a distinct increase in exploration. This became quite visible in the number of completed exploration wells and in the extent of seismic surveys. Two of four new-field wildcats were completed in 2001, though without success. One of the three additional exploration wells (new-pool test and new-tectonic block) found gas. In production drilling, nine of twelve wells were successful. Footage increased by nearly 25% compared to the preceding year and amounted to 54,030 metres. Viewed against the record since 1945, this is the second-lowest value, following the previous year's all-time low. The rise in footage was achieved exclusively by exploration drilling; footage of production drilling was slightly down. In geophysical prospecting, 3D seismic activity reached the level of earlier peak years. The total area of 2,400 square kilometres even exceeded the previous maximum; the main contribution was delivered by a single offshore survey in the German North Sea. 2D seismic activity was rather low and amounted to 450 kilometres in total. The licensed area for oil and gas exploration was reduced even further. As in previous years, several expired licences were not extended, and in some cases parts of the licence areas were relinquished. One licence around the city of Bremen was newly awarded. Two licences were awarded due to reorganisation of licence areas or adaptation to boundaries. In the state of Nordrhein-Westfalen, three smaller licence areas were newly awarded with the aim of producing gas from abandoned coal mines. German oil production increased to 3.44 million metric tons due to the rise in production from the oil field Mittelplate to 1.6 million tons. In all other fields oil production decreased because of normal decline in production.
Total remaining proven and probable oil reserves are reduced by 3 million tons

  8. Applied molecular simulations over FER-, TON- and AEL-type zeolites

    NARCIS (Netherlands)

    Domokos, L.; Lefferts, Leonardus; Seshan, Kulathuiyer; Lercher, J.A.

    2001-01-01

    Interaction and transport of representative (un)saturated hydrocarbon molecules involved in the proposed reaction network of n-butene isomerization in zeolites FER, TON, and AEL have been studied by classic molecular modeling calculations. Docking of the guest molecules into the zeolite frameworks

  9. SOME CONSIDERATIONS ON THE PROSPECTS OF SORGHUM CROP

    Directory of Open Access Journals (Sweden)

    Agatha POPESCU

    2014-10-01

Full Text Available The purpose of the paper was to analyze the state of sorghum at the world, EU, and Romanian level in order to establish the main trends for the future of this crop. Sorghum is an important cereal, ranking 5th after maize, rice, wheat and barley at the world level, due to its importance in human nutrition, animal feed, the production of bioethanol and green energy, and its good impact on the environment. It is cultivated on all the continents, in tropical, subtropical and temperate areas, due to its resistance to drought, production potential, low inputs and production cost. It is an alternative to maize, being increasingly utilized as a substitute in animal diets. World sorghum production reached 63,811 thousand metric tons in 2014, the main producers being the USA, Mexico, Nigeria, India, Argentina, Ethiopia, Sudan and China. World consumption of sorghum reached 63,148 thousand metric tons and is continuously increasing. Sorghum exports accounted for 7,690 thousand metric tons in 2014, of which the USA exported 4,600 thousand metric tons. Besides the USA, other exporting countries are Argentina, Australia, Ethiopia, India, Nigeria and Uruguay, while the main importing countries are China, Japan, Chile, Colombia, Mexico, the EU and Sudan. In 2014, the EU produced 576 thousand metric tons of sorghum, imported 200 thousand metric tons, and consumed 770 thousand metric tons. The main EU producers of sorghum are France, Italy, Romania, Spain and Hungary. In 2012, Romania cultivated 20,000 ha of sorghum, 18 times more than in 2007. Also in 2012, Romania produced 37.5 thousand tons of sorghum grain, 31 times more than in 2007. The sorghum yield was 1,875 kg/ha, 66% higher in 2012 than in 2007. These figures show the increasing importance of sorghum at the world level. Because Romania is situated in a geographical area suitable for producing sorghum, it could increase production and become a more important supplier

  10. Remarks on G-Metric Spaces

    Directory of Open Access Journals (Sweden)

    Bessem Samet

    2013-01-01

Full Text Available In 2005, Mustafa and Sims (2006) introduced and studied a new class of generalized metric spaces, called G-metric spaces, as a generalization of metric spaces. We establish some useful propositions to show that many fixed point theorems on (nonsymmetric) G-metric spaces given recently by many authors follow directly from well-known theorems on metric spaces. Our technique can be easily extended to other results, as shown in an application.

  11. America's War on Cocaine: The National Drug Control Supply Reduction Strategy

    National Research Council Canada - National Science Library

    Rasicot, Gary

    2004-01-01

    ...) estimates that the annual U.S. consumption of cocaine exceeds 300 metric tons. To satisfy this demand for the illicit drug, the IACM estimates that over 500 metric tons of cocaine is shipped from South America to the United States annually...

  12. Metric-adjusted skew information

    DEFF Research Database (Denmark)

    Liang, Cai; Hansen, Frank

    2010-01-01

We give a truly elementary proof of the convexity of metric-adjusted skew information following an idea of Effros. We extend earlier results of weak forms of superadditivity to general metric-adjusted skew information. Recently, Luo and Zhang introduced the notion of semi-quantum states on a bipartite system and proved superadditivity of the Wigner-Yanase-Dyson skew informations for such states. We extend this result to the general metric-adjusted skew information. We finally show that a recently introduced extension to parameter values 1 ... of (unbounded) metric-adjusted skew information.

  13. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    Science.gov (United States)

    Post, J. V.

    1981-01-01

    Software quality metrics was extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

  14. The metric system: An introduction

    Science.gov (United States)

    Lumley, Susan M.

On 13 Jul. 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on 25 Jul. 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first the reasons behind the nation's conversion to the metric system are examined. The second part of this report is on applying the metric system.

  15. The metric system: An introduction

    Energy Technology Data Exchange (ETDEWEB)

    Lumley, S.M.

    1995-05-01

On July 13, 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on July 25, 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first the reasons behind the nation's conversion to the metric system are examined. The second part of this report is on applying the metric system.

  16. Methods and calculations for regional, continental, and global dose assessments from a hypothetical fuel reprocessing facility

    International Nuclear Information System (INIS)

    Schubert, J.F.; Kern, C.D.; Cooper, R.E.; Watts, J.R.

    1978-01-01

    The Savannah River Laboratory (SRL) is coordinating an interlaboratory effort to provide, test, and use state-of-the-art methods for calculating the environmental impact to an offsite population from the normal releases of radionuclides during the routine operation of a fuel-reprocessing plant. Results of this effort are the estimated doses to regional, continental, and global populations. Estimates are based upon operation of a hypothetical reprocessing plant at a site in the southeastern United States. The hypothetical plant will reprocess fuel used at a burn rate of 30 megawatts/metric ton and a burnup of 33,000 megawatt days/metric ton. All fuel will have been cooled for at least 365 days. The plant will have a 10 metric ton/day capacity and an assumed 3000 metric ton/year (82 percent online plant operation) output. Lifetime of the plant is assumed to be 40 years
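The abstract's throughput figures are internally consistent; a quick back-of-envelope check (this arithmetic is ours, not part of the original record) shows that a 10 metric ton/day capacity at 82% online operation reproduces the stated 3,000 metric ton/year output:

```python
# Consistency check of the hypothetical reprocessing plant's throughput figures.
capacity_per_day = 10        # metric tons of fuel per day
online_fraction = 0.82       # 82% online plant operation
annual_output = capacity_per_day * 365 * online_fraction
print(round(annual_output))  # ~2,993 metric tons/year, i.e. the quoted 3,000 t/yr
```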

  17. Attack-Resistant Trust Metrics

    Science.gov (United States)

    Levien, Raph

The Internet is an amazingly powerful tool for connecting people together, unmatched in human history. Yet, with that power comes great potential for spam and abuse. Trust metrics are an attempt to compute which people are trustworthy and which are likely attackers. This chapter presents two specific trust metrics developed and deployed on the Advogato Website, a community blog for free software developers. This real-world experience demonstrates that the trust metrics fulfilled their goals, but that for good results, it is important to match the assumptions of the abstract trust metric computation to the real-world implementation.

  18. Fiscal 1988 draft budget for nuclear energy up 1.9% to yen 369 billion

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

At the cabinet meeting held on December 28, the government approved the fiscal 1988 draft budget, with a general account of yen 56.6 trillion. The nuclear energy related budget is yen 181.124 billion from the general account and yen 186.098 billion from the special account for power sources development, totaling yen 367.222 billion, up 1.9% on the previous year. The largest appropriation goes to the Science and Technology Agency (STA), totaling yen 271 billion. The STA is promoting safety studies and R and D for extensive nuclear energy utilization, but its budget shows a 0.7% decrease from the previous year, reflecting completion of the construction of JT-60, one of the Agency's major projects. MITI, with its budget of yen 91 billion, will carry on policies related to the promotion of the commercial nuclear power program as well as support for the industrialization program of the nuclear fuel cycle. The nuclear-related budget of the Ministry of Foreign Affairs is yen 2.8 billion, consisting mainly of IAEA subscriptions and contributions and OECD/NEA subscriptions. Besides these three government agencies, a large sum of yen 1.2 billion is allocated to the Okinawa Development Agency for the prevention and elimination of melon-flies in Kume Island and the islands around Okinawa main island. The draft government budget will be submitted to the ordinary session of the Diet when it resumes towards the end of January. After deliberation in the Budget Committees of the House of Representatives and the House of Councilors, the draft budget will be put to the vote in the plenary session. Assuming that all proceeds smoothly, the budget is expected to be approved by the end of March without any major revision. (author)

  19. Transportation Energy Futures Series: Freight Transportation Demand: Energy-Efficient Scenarios for a Low-Carbon Future

    Energy Technology Data Exchange (ETDEWEB)

    Grenzeback, L. R. [Cambridge Systematics Inc., Cambridge, MA (United States); Brown, A. [Cambridge Systematics Inc., Cambridge, MA (United States); Fischer, M. J. [Cambridge Systematics Inc., Cambridge, MA (United States); Hutson, N. [Cambridge Systematics Inc., Cambridge, MA (United States); Lamm, C. R. [Cambridge Systematics Inc., Cambridge, MA (United States); Pei, Y. L. [Cambridge Systematics Inc., Cambridge, MA (United States); Vimmerstedt, L. [Cambridge Systematics Inc., Cambridge, MA (United States); Vyas, A. D. [Cambridge Systematics Inc., Cambridge, MA (United States); Winebrake, J. J. [Cambridge Systematics Inc., Cambridge, MA (United States)

    2013-03-01

    Freight transportation demand is projected to grow to 27.5 billion tons in 2040, and by extrapolation, to nearly 30.2 billion tons in 2050, requiring ever-greater amounts of energy. This report describes the current and future demand for freight transportation in terms of tons and ton-miles of commodities moved by truck, rail, water, pipeline, and air freight carriers. It outlines the economic, logistics, transportation, and policy and regulatory factors that shape freight demand; the possible trends and 2050 outlook for these factors, and their anticipated effect on freight demand and related energy use. After describing federal policy actions that could influence freight demand, the report then summarizes the available analytical models for forecasting freight demand, and identifies possible areas for future action.
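The extrapolation from 27.5 billion tons in 2040 to nearly 30.2 billion tons in 2050 implies a modest underlying growth rate; a quick sketch of the implied rate (inferred here, not stated in the report):

```python
# Infer the annual growth rate implied by the 2040 -> 2050 freight tonnage figures.
tons_2040 = 27.5e9
tons_2050 = 30.2e9
years = 10
implied_rate = (tons_2050 / tons_2040) ** (1 / years) - 1
print(f"{implied_rate:.2%}")  # roughly 0.9% per year
```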

  20. Hypertonic-induced lamin A/C synthesis and distribution to nucleoplasmic speckles is mediated by TonEBP/NFAT5 transcriptional activator

    International Nuclear Information System (INIS)

    Favale, Nicolas O.; Sterin Speziale, Norma B.; Fernandez Tome, Maria C.

    2007-01-01

    Lamin A/C is the most studied nucleoskeletal constituent. Lamin A/C expression indicates cell differentiation and is also a structural component of nuclear speckles, which are involved in gene expression regulation. Hypertonicity has been reported to induce renal epithelial cell differentiation and expression of TonEBP (NFAT5), a transcriptional activator of hypertonicity-induced gene transcription. In this paper, we investigate the effect of hypertonicity on lamin A/C expression in MDCK cells and the involvement of TonEBP. Hypertonicity increased lamin A/C expression and its distribution to nucleoplasm with speckled pattern. Microscopy showed codistribution of TonEBP and lamin A/C in nucleoplasmic speckles, and immunoprecipitation demonstrated their interaction. TonEBP silencing caused lamin A/C redistribution from nucleoplasmic speckles to the nuclear rim, followed by lamin decrease, thus showing that hypertonicity induces lamin A/C speckles through a TonEBP-dependent mechanism. We suggest that lamin A/C speckles could serve TonEBP as scaffold thus favoring its role in hypertonicity

  1. Symmetries of the dual metrics

    International Nuclear Information System (INIS)

    Baleanu, D.

    1998-01-01

The geometric duality between the metric g μν and a Killing tensor K μν is studied. Conditions are found under which the symmetries of the metric g μν and the dual metric K μν coincide. A dual spinning space is constructed without the introduction of torsion. The general results are applied to the case of the Kerr-Newman metric

  2. A choice of renewable or upgraded material from oil palm solid wastes

    International Nuclear Information System (INIS)

    Farid Nasir Ani; Wong Chuan Chin; Hussin Mohd Nor

    2006-01-01

Malaysian palm oil industries produce a large amount of solid waste from the palm oil mills. Malaysia generated around 1.10 million tons of oil palm shells as waste in 1980, an amount that increased to 4.11 million tons in 2002. Disposal of these wastes creates environmental problems, so a process was designed to reuse and recycle them into value-added products. This research used oil palm shells as a renewable material resource, converting them by a thermo-chemical process into pyrolysis oil. The oil can be utilized as fuel or converted to value-added products. Since it contains a significant amount of phenols, solvent extraction was used to recover the useful phenol and phenolic compounds. The extracted oil-palm-shell-based phenol was used in the manufacture of phenol formaldehyde wood adhesives, and its wood-bonding capability was tested against petroleum-based phenol formaldehyde wood adhesives. As for the commercial value of this research, total global consumption of phenol in 2000 was 11.3 million metric tons, worth USD 10.0 billion, so the commercial potential is very high if oil-palm-shell-based phenol can replace petroleum-based phenol. The methods and products entail low manufacturing cost, relatively simple technology, and locally abundant raw material, with comparable wood-bonding performance at a competitive price: an estimated USD 900/ton for petroleum-based phenol versus just USD 250/ton for palm-shell-based phenol
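The quoted market figures are mutually consistent; dividing the market value by global consumption recovers roughly the quoted petroleum-based phenol price (this derivation is ours, not stated as such in the abstract):

```python
# Implied average phenol price from the quoted year-2000 market figures.
global_consumption_tons = 11.3e6  # metric tons of phenol consumed in 2000
market_value_usd = 10.0e9         # USD 10.0 billion
price_per_ton = market_value_usd / global_consumption_tons
print(round(price_per_ton))       # ~885 USD/ton, close to the quoted ~USD 900/ton
```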

  3. Twitching motility and biofilm formation are associated with tonB1 in Xylella fastidiosa

    OpenAIRE

    Cursino, Luciana; Li, Yaxin; Zaini, Paulo A.; De La Fuente, Leonardo; Hoch, Harvey C.; Burr, Thomas J.

    2017-01-01

    A mutation in the Xylella fastidiosa tonB1 gene resulted in loss of twitching motility and in significantly less biofilm formation as compared with a wild type. The altered motility and biofilm phenotypes were restored by complementation with a functional copy of the gene. The mutation affected virulence as measured by Pierce's disease symptoms on grapevines. The role of TonB1 in twitching and biofilm formation appears to be independent of the characteristic iron-uptake function of this prote...

  4. Overview of journal metrics

    Directory of Open Access Journals (Sweden)

    Kihong Kim

    2018-02-01

Full Text Available Various kinds of metrics used for the quantitative evaluation of scholarly journals are reviewed. The impact factor and related metrics, including the immediacy index and the aggregate impact factor, which are provided by the Journal Citation Reports, are explained in detail. The Eigenfactor score and the article influence score are also reviewed. In addition, journal metrics such as CiteScore, Source Normalized Impact per Paper, SCImago Journal Rank, h-index, and g-index are discussed, and their limitations and problems are pointed out. We should be cautious not to rely too heavily on these quantitative measures when evaluating journals or researchers.

  5. Fugitive carbon dioxide: It's not hiding in the ocean

    International Nuclear Information System (INIS)

    Kerr, R.A.

    1992-01-01

    The fugitive carbon is the difference between the 7 billion or so tons that spew as carbon dioxide from smokestacks and burning tropical forests and the 3.4 billion tons known to stay in the atmosphere. Finding the other 3 billion or 4 billion tons has frustrated researchers for the past 15 years. The oceans certainly take up some of it. Any forecast of global warming has to be based on how much of the carbon dioxide released by human activity will remain in the atmosphere, and predictions vary by 30% depending on the mix of oceanic and terrestrial processes assumed to be removing the gas. What's more, those predictions assume that the processes at work today will go on operating. But not knowing where all the carbon is going raises the unnerving possibility that whatever processes are removing it may soon fall down on the job without warning, accelerating any warming. Such concerns add urgency to the question of whether the ocean harbors the missing carbon. But there's no simple way to find out. The obvious strategy might seem to be to measure the carbon content of the ocean repeatedly to see how much it increases year by year. The trouble is that several billion tons of added carbon, though impressive on a human scale, are undetectable against the huge swings in ocean carbon that occur from season to season, year to year, and place to place

  6. Holographic Spherically Symmetric Metrics

    Science.gov (United States)

    Petri, Michael

    The holographic principle (HP) conjectures, that the maximum number of degrees of freedom of any realistic physical system is proportional to the system's boundary area. The HP has its roots in the study of black holes. It has recently been applied to cosmological solutions. In this article we apply the HP to spherically symmetric static space-times. We find that any regular spherically symmetric object saturating the HP is subject to tight constraints on the (interior) metric, energy-density, temperature and entropy-density. Whenever gravity can be described by a metric theory, gravity is macroscopically scale invariant and the laws of thermodynamics hold locally and globally, the (interior) metric of a regular holographic object is uniquely determined up to a constant factor and the interior matter-state must follow well defined scaling relations. When the metric theory of gravity is general relativity, the interior matter has an overall string equation of state (EOS) and a unique total energy-density. Thus the holographic metric derived in this article can serve as simple interior 4D realization of Mathur's string fuzzball proposal. Some properties of the holographic metric and its possible experimental verification are discussed. The geodesics of the holographic metric describe an isotropically expanding (or contracting) universe with a nearly homogeneous matter-distribution within the local Hubble volume. Due to the overall string EOS the active gravitational mass-density is zero, resulting in a coasting expansion with Ht = 1, which is compatible with the recent GRB-data.

  7. 40 CFR 98.423 - Calculating CO2 supply.

    Science.gov (United States)

    2010-07-01

    ... 98.423 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED... quarterly the mass of CO2 in a CO2 stream in metric tons, prior to any subsequent purification, processing... stream in metric tons, prior to any subsequent purification, processing, or compressing, by multiplying...
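The reporting logic excerpted above (measured quarterly stream mass multiplied by the stream's CO2 concentration, prior to purification or compression) can be sketched as follows; the function and variable names are illustrative, not taken from the regulation:

```python
# Illustrative sketch in the spirit of 40 CFR 98.423: annual CO2 supply is the
# per-quarter stream mass times the CO2 weight fraction, summed over quarters.
def annual_co2_supply(quarterly_stream_tons, co2_weight_fractions):
    """Return annual CO2 mass (metric tons) from per-quarter stream masses."""
    return sum(mass * frac
               for mass, frac in zip(quarterly_stream_tons, co2_weight_fractions))

# Hypothetical example: four quarters of 25,000 t streams at 95% CO2 by weight.
supply = annual_co2_supply([25_000.0] * 4, [0.95] * 4)
print(supply)  # 95000.0 metric tons CO2
```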

  8. R and D activities on radiation processing of natural polymers in Pakistan

    International Nuclear Information System (INIS)

    Yasin, Tariq

    2008-01-01

The fishery industry in Pakistan is well established and growing fast. The annual production of shrimp is ∼45,000 metric tons, of which ∼20,000 metric tons are exported. The annual production of crab is 250 metric tons. The estimated waste from these materials is approximately 20,000 metric tons, a huge quantity. This crustacean waste generated by the fishery industries contains chitin, a natural polymer that can be extracted by chemical treatments; deacetylation of chitin gives chitosan, which is soluble in dilute mineral acids. Presently, the main consumers of this waste are animal feed industries. The Pakistan Atomic Energy Commission (PAEC) has started a research program on radiation processing of natural polymers in cooperation with the IAEA and RCA in order to convert this sizable waste into value-added products. This report describes some of our results on the radiation processing of natural polymers and their applications. (author)

  9. Combustion characteristics and NO formation for biomass blends in a 35-ton-per-hour travelling grate utility boiler.

    Science.gov (United States)

    Li, Zhengqi; Zhao, Wei; Li, Ruiyang; Wang, Zhenwang; Li, Yuan; Zhao, Guangbo

    2009-04-01

Measurements were taken for a 35-ton-per-hour biomass-fired travelling grate boiler. Local mean concentrations of O2, CO, SO2 and NO gas species and gas temperatures were determined in the region above the grate. For a 28-ton-per-hour load, the mass ratios of biomass fly ash and boiler slag were 42% and 58%, the boiler efficiency was 81.56%, and the concentrations of NOx and SO2 at 6% O2 were 257 and 84 mg/m3. For an 18-ton-per-hour load, the fuel burning zone was nearer to the inlet than it was for the 28-ton-per-hour load, and the contents of CO and NO in the fuel burning zone above the grate were lower.

  10. Reducing greenhouse gas emissions by inducing energy conservation and distributed generation from elimination of electric utility customer charges

    International Nuclear Information System (INIS)

    Pearce, Joshua M.; Harris, Paul J.

    2007-01-01

This paper quantifies the increased greenhouse gas emissions and negative effect on energy conservation (or 'efficiency penalty') due to electric rate structures that employ an unavoidable customer charge. First, the extent of customer charges was determined from a nationwide survey of US electric tariffs. To eliminate the customer charge nationally while maintaining a fixed sum for electric companies for a given amount of electricity, an increase of 7.12% in the residential electrical rate was found to be necessary. If enacted, this increase in the electric rate would result in a 6.4% reduction in overall electricity consumption, conserving 73 billion kWh, eliminating 44.3 million metric tons of carbon dioxide, and saving the entire US residential sector over $8 billion per year. As shown here, these reductions would come from increased avoidable costs, thus leveraging an increased rate of return on investments in energy efficiency, energy conservation behavior, distributed energy generation, and fuel choices. Finally, limitations of this study and analysis are discussed and conclusions are drawn for proposed energy policy changes
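The headline numbers in the abstract hang together: dividing the avoided emissions and the dollar savings by the conserved electricity recovers a plausible grid-average emission factor and retail price (this consistency check is ours, not the paper's):

```python
# Consistency check on the abstract's nationwide figures.
kwh_saved = 73e9          # 73 billion kWh conserved per year
co2_avoided_kg = 44.3e9   # 44.3 million metric tons CO2, expressed in kg
dollars_saved = 8e9       # over $8 billion per year

emission_factor = co2_avoided_kg / kwh_saved  # kg CO2 per kWh of grid power
implied_price = dollars_saved / kwh_saved     # USD per kWh of residential power
print(f"{emission_factor:.2f} kg CO2/kWh, ${implied_price:.3f}/kWh")
```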

  11. Constructing experimental devices for half-ton synthesis of gadolinium-loaded liquid scintillator and its performance

    Science.gov (United States)

    Park, Young Seo; Jang, Yeong Min; Joo, Kyung Kwang

    2018-04-01

This paper briefly describes the experimental devices constructed for half-ton synthesis of gadolinium(Gd)-loaded liquid scintillator (GdLS) and presents the performance and detailed chemical and physical results of a 0.5% high-concentration GdLS. Various feasibility studies of the apparatus used for loading Gd into solvents have been carried out. The transmittance, Gd concentration, density, light yield, and moisture content were measured for quality control. We show that, with the help of adequate automated experimental devices and tools, ton-scale synthesis of GdLS can be performed at moderate laboratory scale without difficulty. The synthesized GdLS met chemical, optical, and physical requirements as well as various safety requirements. These synthesizing devices can be scaled up for massive next-generation neutrino experiments of several hundred tons.

  12. Emerging trends in regional coal production

    International Nuclear Information System (INIS)

    Watson, W.D.

    1994-01-01

    At an average annual growth rate of 1.9%, the total national demand for coal will increase from 850 million short tons in 1985 to 2 billion short tons annually by the year 2030. A market simulation model (described in this paper) determines the regional pattern of coal production needed to meet these demands. Because compliance or low-sulfur coal resources are a low-cost option for meeting environmental regulations, they could be mined out substantially in the medium term. In the next 15 to 25 years, most of the Eastern compliance coal up to a mining cost of $40 per ton could be mined out and 4 billion short tons of Western compliance coal could be produced. By the year 2030, almost all Eastern low-sulfur coal could be mined out. Most Western compliance coal costing less than $20/ton could be mined out by 2030
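The stated growth rate and endpoints agree: compounding 1.9% per year over the 45 years from 1985 indeed takes 850 million short tons to roughly 2 billion (our arithmetic check, consistent with the abstract):

```python
# Check that 1.9% annual growth carries coal demand from 850 million short tons
# in 1985 to about 2 billion short tons by 2030, as the abstract states.
demand_1985 = 850e6
growth = 0.019
years = 2030 - 1985  # 45 years
demand_2030 = demand_1985 * (1 + growth) ** years
print(f"{demand_2030/1e9:.2f} billion short tons")  # ~1.98, i.e. roughly 2 billion
```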

  13. 19 CFR 360.104 - Steel import monitoring.

    Science.gov (United States)

    2010-04-01

    ... ANALYSIS SYSTEM § 360.104 Steel import monitoring. (a) Throughout the duration of the licensing requirement... include import quantity (metric tons), import Customs value (U.S. $), and average unit value ($/metric ton... and will also present a range of historical data for comparison purposes. Provision of this aggregate...

  14. RESOURCE USE EFFICIENCY OF GROUNDNUT PRODUCTION IN ...

    African Journals Online (AJOL)

    AGROSEARCH UIL

    2012-09-28

    Sep 28, 2012 ... A stratified sampling technique was employed to select 58 respondents. ... there is still opportunity to increase their production to attain optimal economic efficiency. The ... metric tons and an average productivity of 1.4 metric tons /ha. Developing ... The educated population are gainfully employed in some.

  15. Metric regularity and subdifferential calculus

    International Nuclear Information System (INIS)

    Ioffe, A D

    2000-01-01

    The theory of metric regularity is an extension of two classical results: the Lyusternik tangent space theorem and the Graves surjection theorem. Developments in non-smooth analysis in the 1980s and 1990s paved the way for a number of far-reaching extensions of these results. It was also well understood that the phenomena behind the results are of metric origin, not connected with any linear structure. At the same time it became clear that some basic hypotheses of the subdifferential calculus are closely connected with the metric regularity of certain set-valued maps. The survey is devoted to the metric theory of metric regularity and its connection with subdifferential calculus in Banach spaces

  16. Context-dependent ATC complexity metric

    NARCIS (Netherlands)

    Mercado Velasco, G.A.; Borst, C.

    2015-01-01

    Several studies have investigated Air Traffic Control (ATC) complexity metrics in a search for a metric that could best capture workload. These studies have shown how daunting the search for a universal workload metric (one that could be applied in different contexts: sectors, traffic patterns,

  17. Finding of no significant impact: Interim storage of enriched uranium above the maximum historical level at the Y-12 Plant Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    1995-01-01

The US Department of Energy (DOE) has prepared an Environmental Assessment (EA) for the Proposed Interim Storage of Enriched Uranium Above the Maximum Historical Storage Level at the Y-12 Plant, Oak Ridge, Tennessee (DOE/EA-0929, September, 1994). The EA evaluates the environmental effects of transportation, prestorage processing, and interim storage of bounding quantities of enriched uranium at the Y-12 Plant over a ten-year period. The State of Tennessee and the public participated in public meetings and workshops which were held after a predecisional draft EA was released in February 1994, and after the revised pre-approval EA was issued in September 1994. Comments provided by the State and public have been carefully considered by the Department. As a result of this public process, the Department has determined that the Y-12 Plant would store no more than 500 metric tons of highly enriched uranium (HEU) and no more than 6 metric tons of low enriched uranium (LEU). The bounding storage quantities analyzed in the pre-approval EA are 500 metric tons of HEU and 7,105.9 metric tons of LEU. Based on the analyses in the EA, as revised by the attachment to the Finding of No Significant Impact (FONSI), DOE has determined that interim storage of 500 metric tons of HEU and 6 metric tons of LEU at the Y-12 Plant does not constitute a major Federal action significantly affecting the quality of the human environment, within the meaning of the National Environmental Policy Act (NEPA) of 1969. Therefore, an Environmental Impact Statement (EIS) is not required and the Department is issuing this FONSI

  18. Challenges of rapid economic growth in China: Reconciling sustainable energy use, environmental stewardship and social development

    International Nuclear Information System (INIS)

    Li Yong; Oberheitmann, Andreas

    2009-01-01

China aims at quadrupling per-capita GDP by 2020 compared to the year 2000. Without any energy and environmental policy measures, this tremendous economic growth would be associated with a quadrupling of primary energy consumption to 6.3 billion tons of standard coal equivalent (sce) and energy-related CO2 emissions of 13.9 billion tons. Against this background, this paper sets China's need to implement its sustainable development strategy in the quantitative context of the country's economic development and the environmental problems that accompany economic growth. China is urgently searching for a way to ease the negative implications of economic growth and has committed itself to a level of 3.0 billion tons sce of primary energy consumption in 2020. As a consequence, macro-economic energy intensity has to be reduced by 53% by 2020; such a reduction would bring energy intensity 30 percentage points below the year-2000 level of developed countries. As for natural resources, the expected economic growth will increase crude oil net imports to 455 million tons sce in 2020 and 650 million tons sce in 2030. As for regional income distribution, economic growth has helped to decrease existing inequities
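The 53% intensity target follows directly from the figures above: with GDP quadrupling in both scenarios, capping 2020 energy use at 3.0 billion tons sce instead of the no-policy 6.3 billion reduces energy per unit of GDP in the same proportion (our arithmetic, consistent with the abstract):

```python
# Energy intensity = energy / GDP. GDP quadruples in both scenarios, so the
# intensity ratio reduces to the ratio of the energy consumption levels.
bau_energy_2020 = 6.3     # billion tons sce, no-policy scenario
target_energy_2020 = 3.0  # billion tons sce, committed level
intensity_reduction = 1 - target_energy_2020 / bau_energy_2020
print(f"{intensity_reduction:.0%}")  # ~52%, matching the stated 53% target
```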

  19. DLA Energy Biofuel Feedstock Metrics Study

    Science.gov (United States)

    2012-12-11

    moderately/highly invasive; Metric 2: Genetically modified organism (GMO) hazard, Yes/No and Hazard Category; Metric 3: Species hybridization... Stage #4: biofuel distribution; Stage #5: biofuel use. Metric 1: State invasiveness ranking (Yes, Minimal, Minimal, No, No); Metric 2: GMO hazard (Yes)... may utilize GMO microbial or microalgae species across the applicable biofuel life cycles (stages 1–3). The following consequence Metrics 4–6 then

  20. Working Paper 5: Beyond Collier's Bottom Billion | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2010-12-16

    The heart of the narrative presented in the book is that a group of almost 60 countries, with a population of about a billion people, are caught in four main traps. Their prospects for escaping the traps are poor, and they need a set of actions from the international community to achieve the rapid rates of growth ...

  1. Congress OKs $2 Billion Boost for the NIH.

    Science.gov (United States)

    2017-07-01

    President Donald Trump last week signed a $1.1 trillion spending bill for fiscal year 2017, including a welcome $2 billion boost for the NIH that will support former Vice President Joe Biden's Cancer Moonshot initiative, among other priorities. However, researchers who rely heavily on NIH grant funding remain concerned about proposed cuts for 2018. ©2017 American Association for Cancer Research.

  2. 40 CFR 98.343 - Calculating GHG emissions.

    Science.gov (United States)

    2010-07-01

    ... (fraction); default is 1. DOC = Degradable organic carbon from Table HH-1 of this subpart or measurement data, if available [fraction (metric tons C/metric ton waste)]. DOCF = Fraction of DOC dissimilated (fraction); default is 0.5. F = Fraction by volume of CH4 in landfill gas from measurement data, if...
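
    The terms quoted from the regulation combine, in the generic IPCC first-order approach, into a methane generation potential. The sketch below illustrates that combination only; it is not the actual subpart HH equation, which additionally models first-order decay over time and gas-collection efficiency:

```python
# Simplified illustration of how the quoted terms combine into a methane
# generation potential, following the generic IPCC approach (not the full
# 40 CFR 98 subpart HH model).
def ch4_generation_potential(waste_tons, doc, doc_f=0.5, f_ch4=0.5):
    """Metric tons CH4 generated from `waste_tons` of landfilled waste.

    doc    -- degradable organic carbon (metric tons C / metric ton waste)
    doc_f  -- fraction of DOC dissimilated (default 0.5, as in the excerpt)
    f_ch4  -- fraction by volume of CH4 in landfill gas
    """
    # The factor 16/12 converts metric tons of carbon to metric tons of CH4.
    return waste_tons * doc * doc_f * f_ch4 * 16.0 / 12.0

print(ch4_generation_potential(1000, doc=0.20))  # 1000 t waste, DOC 0.20 -> ~66.7 t CH4
```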

  3. Symmetries of Taub-NUT dual metrics

    International Nuclear Information System (INIS)

    Baleanu, D.; Codoban, S.

    1998-01-01

    Recently, geometric duality was analyzed for a metric which admits Killing tensors. An interesting example arises when the manifold has Killing-Yano tensors. The symmetries of the dual metrics in the case of the Taub-NUT metric are investigated. Generic and non-generic symmetries of the dual Taub-NUT metric are analyzed

  4. Responsible for 45 000 tons CO2 emissions

    International Nuclear Information System (INIS)

    Nedrelid, Ola N.

    2006-01-01

    Waste combustion has much better tax conditions in Sweden compared to Norway. Today waste is being transported from Norway to Sweden, resulting in a 45 000 ton emission of CO2 every year, when the waste could have remained in Norway, utilized as regained energy in district heating. The tax regime, however, does not provide the conditions for a profitable use of the waste in Norway. The district heating association is disappointed with the new government's proposed fiscal budget, which only worsens the competitive situation for Norway handling its own waste (ml)

  5. Metric learning

    CERN Document Server

    Bellet, Aurelien; Sebban, Marc

    2015-01-01

    Similarity between objects plays an important role in both human cognitive processes and artificial systems for recognition and categorization. How to appropriately measure such similarities for a given task is crucial to the performance of many machine learning, pattern recognition and data mining methods. This book is devoted to metric learning, a set of techniques to automatically learn similarity and distance functions from data that has attracted a lot of interest in machine learning and related fields in the past ten years. In this book, we provide a thorough review of the metric learnin

  6. Technical Privacy Metrics: a Systematic Survey

    OpenAIRE

    Wagner, Isabel; Eckhoff, David

    2018-01-01

    The goal of privacy metrics is to measure the degree of privacy enjoyed by users in a system and the amount of protection offered by privacy-enhancing technologies. In this way, privacy metrics contribute to improving user privacy in the digital world. The diversity and complexity of privacy metrics in the literature make an informed choice of metrics challenging. As a result, instead of using existing metrics, n...

  7. Optical flare observed in the flaring gamma-ray blazar Ton 599

    Science.gov (United States)

    Pursimo, Tapio; Sagues, Ana; Telting, John; Ojha, Roopesh

    2017-11-01

    We report optical photometry of the flat spectrum radio quasar Ton 599, obtained with the 2.56m Nordic Optical Telescope in La Palma, to look for any enhanced optical activity associated with a recent flare in the daily averaged gamma-ray flux (ATel#10931, ATel#10937).

  8. On Information Metrics for Spatial Coding.

    Science.gov (United States)

    Souza, Bryan C; Pavão, Rodrigo; Belchior, Hindiael; Tort, Adriano B L

    2018-04-01

    The hippocampal formation is involved in navigation, and its neuronal activity exhibits a variety of spatial correlates (e.g., place cells, grid cells). The quantification of the information encoded by spikes has been standard procedure to identify which cells have spatial correlates. For place cells, most of the established metrics derive from Shannon's mutual information (Shannon, 1948), and convey information rate in bits/s or bits/spike (Skaggs et al., 1993, 1996). Despite their widespread use, the performance of these metrics in relation to the original mutual information metric has never been investigated. In this work, using simulated and real data, we find that the current information metrics correlate less with the accuracy of spatial decoding than the original mutual information metric. We also find that the top informative cells may differ among metrics, and show a surrogate-based normalization that yields comparable spatial information estimates. Since different information metrics may identify different neuronal populations, we discuss current and alternative definitions of spatially informative cells, which affect the metric choice. Copyright © 2018 IBRO. Published by Elsevier Ltd. All rights reserved.
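
    The Skaggs et al. bits/spike measure referenced above can be sketched as follows; the binned-rate formulation and the example values are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Sketch of the Skaggs et al. (1993) spatial information in bits/spike:
#   I = sum_i p_i * (l_i / L) * log2(l_i / L),
# where p_i is the occupancy probability of spatial bin i, l_i the mean
# firing rate in bin i, and L the overall mean firing rate.
def skaggs_information(occupancy_p, rates):
    p = np.asarray(occupancy_p, float)
    lam = np.asarray(rates, float)
    mean_rate = np.sum(p * lam)
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = lam / mean_rate
        # Bins with zero firing contribute nothing (0 * log 0 -> 0 by convention).
        terms = np.where(lam > 0, p * ratio * np.log2(ratio), 0.0)
    return float(terms.sum())

# A cell firing only in one of four equally visited bins carries 2 bits/spike:
print(skaggs_information([0.25] * 4, [8.0, 0.0, 0.0, 0.0]))  # -> 2.0
```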

  9. Generalized Painleve-Gullstrand metrics

    Energy Technology Data Exchange (ETDEWEB)

    Lin Chunyu [Department of Physics, National Cheng Kung University, Tainan 70101, Taiwan (China)], E-mail: l2891112@mail.ncku.edu.tw; Soo Chopin [Department of Physics, National Cheng Kung University, Tainan 70101, Taiwan (China)], E-mail: cpsoo@mail.ncku.edu.tw

    2009-02-02

    An obstruction to the implementation of spatially flat Painleve-Gullstrand (PG) slicings is demonstrated, and explicitly discussed for Reissner-Nordstroem and Schwarzschild-anti-deSitter spacetimes. Generalizations of PG slicings which are not spatially flat but which remain regular at the horizons are introduced. These metrics can be obtained from standard spherically symmetric metrics by physical Lorentz boosts. With these generalized PG metrics, problematic contributions to the imaginary part of the action in the Parikh-Wilczek derivation of Hawking radiation due to the obstruction can be avoided.

  10. Research Investments in Global Health: A Systematic Analysis of UK Infectious Disease Research Funding and Global Health Metrics, 1997-2013.

    Science.gov (United States)

    Head, Michael G; Fitchett, Joseph R; Nageshwaran, Vaitehi; Kumari, Nina; Hayward, Andrew; Atun, Rifat

    2016-01-01

    Infectious diseases account for a significant global burden of disease and substantial investment in research and development. This paper presents a systematic assessment of research investments awarded to UK institutions and global health metrics assessing disease burden. We systematically sourced research funding data awarded from public and philanthropic organisations between 1997 and 2013. We screened awards for relevance to infection and categorised data by type of science, disease area and specific pathogen. Investments were compared with mortality, disability-adjusted life years (DALYs) and years lived with disability (YLD) across three time points. Between 1997-2013, there were 7398 awards with a total investment of £3.7 billion. An increase in research funding across 2011-2013 was observed for most disease areas, with notable exceptions being sexually transmitted infections and sepsis research where funding decreased. Most funding remains for pre-clinical research (£2.2 billion, 59.4%). Relative to global mortality, DALYs and YLDs, acute hepatitis C, leishmaniasis and African trypanosomiasis received comparatively high levels of funding. Pneumonia, shigellosis, pertussis, cholera and syphilis were poorly funded across all health metrics. Tuberculosis (TB) consistently attracts relatively less funding than HIV and malaria. Most infections have received increases in research investment, alongside decreases in global burden of disease in 2013. The UK demonstrates research strengths in some neglected tropical diseases such as African trypanosomiasis and leishmaniasis, but syphilis, cholera, shigellosis and pneumonia remain poorly funded relative to their global burden. Acute hepatitis C appears well funded but the figures do not adequately take into account projected future chronic burdens for this condition. These findings can help to inform global policymakers on resource allocation for research investment.

  11. Kerr metric in the deSitter background

    International Nuclear Information System (INIS)

    Vaidya, P.C.

    1984-01-01

    In addition to the Kerr metric with cosmological constant Λ several other metrics are presented giving a Kerr-like solution of Einstein's equations in the background of deSitter universe. A new metric of what may be termed as rotating deSitter space-time devoid of matter but containing null fluid with twisting null rays, has been presented. This metric reduces to the standard deSitter metric when the twist in the rays vanishes. Kerr metric in this background is the immediate generalization of Schwarzschild's exterior metric with cosmological constant. (author)

  12. Kerr metric in cosmological background

    Energy Technology Data Exchange (ETDEWEB)

    Vaidya, P C [Gujarat Univ., Ahmedabad (India). Dept. of Mathematics

    1977-06-01

    A metric satisfying Einstein's equation is given which in the vicinity of the source reduces to the well-known Kerr metric and which at large distances reduces to the Robertson-Walker metric of a homogeneous cosmological model. The radius of the event horizon of the Kerr black hole in the cosmological background is found.

  13. 40 CFR 98.363 - Calculating GHG emissions.

    Science.gov (United States)

    2010-07-01

    ... combustion device (metric tons CH4/yr). V = Average annual volumetric flow rate, calculated in Equation JJ-7... CH4 flow to digester combustion device, as calculated in Equation JJ-6 of this section (metric tons CH4). DE = CH4 destruction efficiency from flaring or burning in engine (lesser of manufacturer's...

  14. 1.6 billion euros for nuclear research through the 'Horizon 2020' program

    International Nuclear Information System (INIS)

    Anon.

    2014-01-01

    The European Union Council has approved the budget for the future European program for research and innovation called 'Horizon 2020'. A global funding of 77 billion euros has been allocated to 'Horizon 2020' for the 2014 to 2020 years. The share for nuclear sciences will reach 1.6 billion euros and will break down as follows: 316 million euros for fundamental research on fission, 728 million euros for fundamental research on fusion (ITER not included) and 560 million euros for the research projects of the European Joint Research Center (JRC). (A.C.)

  15. Tellurium

    Science.gov (United States)

    Goldfarb, Richard J.; Berger, Byron R.; George, Micheal W.; Seal, Robert R.; Schulz, Klaus J.; DeYoung,, John H.; Seal, Robert R.; Bradley, Dwight C.

    2017-12-19

    Tellurium (Te) is a very rare element that averages only 3 parts per billion in Earth’s upper crust. It shows a close association with gold and may be present in orebodies of most gold deposit types at levels of tens to hundreds of parts per million. In large-tonnage mineral deposits, such as porphyry copper and seafloor volcanogenic massive sulfide deposits, sulfide minerals may contain hundreds of parts per million tellurium, although the orebodies likely have overall concentrations of 0.1 to 1.0 parts per million tellurium. Tellurium is presently recovered as a primary ore from only two districts in the world; these are the gold-tellurium epithermal vein deposits located adjacent to one another at Dashuigou and Majiagou (Sichuan Province) in southwestern China, and the epithermal-like mineralization at the Kankberg deposit in the Skellefteå VMS district of Västerbotten County, Sweden. Combined, these two groups of deposits account for about 15 percent (about 70 metric tons) of the annual global production of between 450 and 470 metric tons of tellurium. Most of the world’s tellurium, however, is produced as a byproduct of the mining of porphyry copper deposits. These deposits typically yield concentrations of 1 to 4 percent tellurium in the anode slimes recovered during copper refining. Present production of tellurium from the United States is solely from the anode slimes at ASARCO LLC’s copper refinery in Amarillo, Texas, and may total about 50 metric tons per year. The main uses of tellurium are in photovoltaic solar cells and as an additive to copper, lead, and steel alloys in various types of machinery. The environmental data available regarding the mining of tellurium are limited; most concerns to date have focused on the more-abundant metals present in the large-tonnage deposits from which tellurium is recovered as a byproduct. Global reserves of tellurium are estimated to be 24,000 metric tons, based on the amount of tellurium likely contained in
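
    The "about 15 percent" figure for primary production can be checked against the other numbers given in the abstract; this is a simple consistency sketch using only those figures:

```python
# Consistency check of the production shares quoted above (all figures from
# the abstract; global production is given as a 450-470 t/yr range).
primary_t = 70                      # t/yr from the two primary-ore districts
global_lo, global_hi = 450, 470     # t/yr global production range

share_lo = primary_t / global_hi
share_hi = primary_t / global_lo
print(f"primary-ore share: {share_lo:.1%} to {share_hi:.1%}")  # ~14.9% to ~15.6%
```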

  16. Two classes of metric spaces

    Directory of Open Access Journals (Sweden)

    Isabel Garrido

    2016-04-01

    Full Text Available The class of metric spaces (X, d) known as small-determined spaces, introduced by Garrido and Jaramillo, is properly defined by means of some type of real-valued Lipschitz functions on X. On the other hand, B-simple metric spaces, introduced by Hejcman, are defined in terms of some kind of bornologies of bounded subsets of X. In this note we present a common framework where both classes of metric spaces can be studied, which allows us not only to see the relationships between them but also to obtain new internal characterizations of these metric properties.

  17. A Metric for Heterotic Moduli

    Science.gov (United States)

    Candelas, Philip; de la Ossa, Xenia; McOrist, Jock

    2017-12-01

    Heterotic vacua of string theory are realised, at large radius, by a compact threefold with vanishing first Chern class together with a choice of stable holomorphic vector bundle. These form a wide class of potentially realistic four-dimensional vacua of string theory. Despite all their phenomenological promise, there is little understanding of the metric on the moduli space of these. What is sought is the analogue of special geometry for these vacua. The metric on the moduli space is important in phenomenology as it normalises D-terms and Yukawa couplings. It is also of interest in mathematics, since it generalises the metric, first found by Kobayashi, on the space of gauge field connections, to a more general context. Here we construct this metric, correct to first order in α′, in two ways: first by postulating a metric that is invariant under background gauge transformations of the gauge field, and also by dimensionally reducing heterotic supergravity. These methods agree and the resulting metric is Kähler, as is required by supersymmetry. Checking the metric is Kähler is intricate and the anomaly cancellation equation for the H field plays an essential role. The Kähler potential nevertheless takes a remarkably simple form: it is the Kähler potential of special geometry with the Kähler form replaced by the α′-corrected hermitian form.

  18. 46 CFR 25.25-17 - Survival craft requirements for uninspected passenger vessels of at least 100 gross tons.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 1 2010-10-01 2010-10-01 false Survival craft requirements for uninspected passenger... Survival craft requirements for uninspected passenger vessels of at least 100 gross tons. (a) Each uninspected passenger vessel of at least 100 gross tons must have adequate survival craft with enough capacity...

  19. Polycrystalline silicon availability for photovoltaic and semiconductor industries

    Science.gov (United States)

    Ferber, R. R.; Costogue, E. N.; Pellin, R.

    1982-01-01

    Markets, applications, and production techniques for Siemens process-produced polycrystalline silicon are surveyed. It is noted that as of 1982 a total of six Si materials suppliers were servicing a worldwide total of over 1000 manufacturers of Si-based devices. Besides solar cells, the Si wafers are employed for thyristors, rectifiers, bipolar power transistors, and discrete components for control systems. An estimated 3890 metric tons of semiconductor-grade polycrystalline Si will be used in 1982, and 6200 metric tons by 1985. Although the amount is expected to nearly triple between 1982 and 1989, research is being carried out on the formation of thin films and ribbons for solar cells, thereby eliminating the waste produced in slicing Czochralski-grown crystals. The free-world Si production in 1982 is estimated to be 3050 metric tons. Various new technologies for the formation of polycrystalline Si at lower costs and with less waste are considered. New entries into the industrial Si formation field are projected to produce a 2000 metric ton excess by 1988.

  20. Atmospheric carbon reduction by urban trees

    International Nuclear Information System (INIS)

    Nowak, D.J.

    1993-01-01

    Trees, because they sequester atmospheric carbon through their growth process and conserve energy in urban areas, have been suggested as one means to combat increasing levels of atmospheric carbon. Analysis of the urban forest in Oakland, California (21% tree cover), reveals a tree carbon storage level of 11.0 metric tons/hectare. Trees in the area of the 1991 fire in Oakland stored approximately 14,500 metric tons of carbon, 10% of the total amount stored by Oakland's urban forest. National urban forest carbon storage in the United States (28% tree cover) is estimated at between 350 and 750 million metric tons. Establishment of 10 million urban trees annually over the next 10 years is estimated to sequester and offset the production of 363 million metric tons of carbon over the next 50 years, less than 1% of the estimated carbon emissions in the United States over the same time period. Advantages and limitations of managing urban trees to reduce atmospheric carbon are discussed. 36 refs., 2 figs., 3 tabs
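
    The Oakland figures above can be cross-checked against each other. The sketch below uses only numbers from the abstract; the implied citywide totals are derived quantities, not values reported in the record:

```python
# Internal consistency check of the Oakland figures quoted above. Inputs come
# from the abstract: the 1991 fire area stored 14,500 t of carbon, stated to be
# 10% of the citywide total, and citywide storage density is 11.0 t/ha.
fire_storage_t = 14_500
fire_share = 0.10
density_t_per_ha = 11.0

citywide_storage_t = fire_storage_t / fire_share            # implied total: 145,000 t
implied_area_ha = citywide_storage_t / density_t_per_ha     # implied area at 11.0 t/ha
print(f"implied citywide storage: {citywide_storage_t:,.0f} t")
print(f"implied area at 11.0 t/ha: {implied_area_ha:,.0f} ha")
```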

  1. On characterizations of quasi-metric completeness

    Energy Technology Data Exchange (ETDEWEB)

    Dag, H.; Romaguera, S.; Tirado, P.

    2017-07-01

    Hu proved in [4] that a metric space (X, d) is complete if and only if for any closed subspace C of (X, d), every Banach contraction on C has a fixed point. Since then several authors have investigated the problem of characterizing metric completeness by means of fixed point theorems. Recently this problem has been studied in the more general context of quasi-metric spaces for different notions of completeness. Here we present a characterization of a kind of completeness for quasi-metric spaces by means of a quasi-metric version of Hu’s theorem. (Author)

  2. TNF-α promotes nuclear enrichment of the transcription factor TonEBP/NFAT5 to selectively control inflammatory but not osmoregulatory responses in nucleus pulposus cells.

    Science.gov (United States)

    Johnson, Zariel I; Doolittle, Alexandra C; Snuggs, Joseph W; Shapiro, Irving M; Le Maitre, Christine L; Risbud, Makarand V

    2017-10-20

    Intervertebral disc degeneration (IDD) causes chronic back pain and is linked to production of proinflammatory molecules by nucleus pulposus (NP) and other disc cells. Activation of tonicity-responsive enhancer-binding protein (TonEBP)/NFAT5 by non-osmotic stimuli, including proinflammatory molecules, occurs in cells involved in the immune response. However, whether inflammatory stimuli activate TonEBP in NP cells and whether TonEBP controls inflammation during IDD is unknown. We show that TNF-α, but not IL-1β or LPS, promoted nuclear enrichment of TonEBP protein. However, TNF-α-mediated activation of TonEBP did not cause induction of osmoregulatory genes. RNA sequencing showed that 8.5% of TNF-α transcriptional responses were TonEBP-dependent and identified genes regulated by both TNF-α and TonEBP. These genes were over-enriched in pathways and diseases related to inflammatory response and inhibition of matrix metalloproteases. Based on RNA-sequencing results, we further investigated regulation of the novel TonEBP targets CXCL1, CXCL2, and CXCL3. TonEBP acted synergistically with TNF-α and LPS to induce CXCL1-proximal promoter activity. Interestingly, this regulation required a highly conserved NF-κB-binding site but not a predicted TonE, suggesting cross-talk between these two members of the Rel family. Finally, analysis of human NP tissue showed that TonEBP expression correlated with the canonical osmoregulatory targets TauT/SLC6A6, SMIT/SLC5A3, and AR/AKR1B1, supporting in vitro findings that the inflammatory milieu during IDD does not interfere with TonEBP osmoregulation. In summary, whereas TonEBP participates in the proinflammatory response to TNF-α, therapeutic strategies targeting this transcription factor for treatment of disc disease must spare osmoprotective, prosurvival, and matrix homeostatic activities. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  3. Ozone Induced Premature Mortality and Crop Yield Loss in China

    Science.gov (United States)

    Lin, Y.; Jiang, F.; Wang, H.

    2017-12-01

    Exposure to ambient ozone is a major risk factor for health impacts such as chronic obstructive pulmonary disease (COPD) and causes damage to plants and agricultural crops, but these impacts were usually evaluated separately in earlier studies. We apply the Community Multi-scale Air Quality model to simulate ambient O3 concentrations at a resolution of 36 km × 36 km across China. We then follow the Global Burden of Diseases approach and the AOT40 (accumulated ozone above a threshold of 40 ppb) metric to estimate, respectively, the premature mortality and the yield losses of major grain crops (winter wheat, rice, and corn) across China due to surface ozone exposure. Our results show that ozone exposure led to nearly 67,700 premature deaths and 145 billion USD in losses in 2014. The ozone-induced yield losses of all crop production totaled 78 (49.9–112.6) million metric tons, worth 5.3 (3.4–7.6) billion USD, in China. The relative yield losses ranged from 8.5–14% for winter wheat, 3.9–15% for rice, and 2.2–5.5% for maize. The top four health-affected provinces (Sichuan, Henan, Shandong, Jiangsu) also rank highest in winter wheat and rice yield losses. Our results provide further evidence that surface ozone is becoming an urgent air pollution problem in China, and have important policy implications for China's efforts to alleviate the impacts of air pollution.
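
    The AOT40 metric used above is conventionally defined as the sum of hourly ozone concentrations in excess of 40 ppb, accumulated over daylight hours of the growing season. A minimal sketch of that definition (the hourly series in the example is invented for illustration):

```python
# AOT40 exposure metric: sum of hourly exceedances over 40 ppb (units: ppb*h).
# Hours at or below 40 ppb contribute nothing. In practice the sum runs over
# daylight hours of the growing season; here we just sum whatever is passed in.
def aot40(hourly_ppb):
    return sum(max(c - 40.0, 0.0) for c in hourly_ppb)

# Example: four daylight hours at 35, 45, 60 and 80 ppb.
print(aot40([35, 45, 60, 80]))  # -> 65.0  (0 + 5 + 20 + 40)
```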

  4. Engineering performance metrics

    Science.gov (United States)

    Delozier, R.; Snyder, N.

    1993-03-01

    Implementation of a Total Quality Management (TQM) approach to engineering work required the development of a system of metrics which would serve as a meaningful management tool for evaluating effectiveness in accomplishing project objectives and in achieving improved customer satisfaction. A team effort was chartered with the goal of developing a system of engineering performance metrics which would measure customer satisfaction, quality, cost effectiveness, and timeliness. The approach to developing this system involved the normal systems design phases: conceptual design, detailed design, implementation, and integration. The lessons learned from this effort are explored in this paper; they may provide a starting point for other large engineering organizations seeking to institute a performance measurement system. To facilitate this effort, a team was chartered to assist in the development of the metrics system. This team, consisting of customers and Engineering staff members, was utilized to ensure that the needs and views of the customers were considered in the development of performance measurements. The development of a system of metrics is no different than the development of any type of system. It includes the steps of defining performance measurement requirements, measurement process conceptual design, performance measurement and reporting system detailed design, and system implementation and integration.

  5. Goodwyn project under way off NW Australia

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    This paper reports that the $2 billion (Australian) Goodwyn field development project on Australia's Northwest Shelf is under way with installation of a 17,500 metric ton steel platform jacket. Northwest Shelf project operator Woodside Petroleum Pty. Ltd. and partners are continuing with an extensive exploration program in the Northwest Shelf area. The group expects to begin soon a wide-ranging 3-D seismic survey over the WA-28-P license area and the Northwest Shelf production permits. The goal is to identify new and appraise existing oil and gas prospects in the region for an exploratory drilling campaign to begin in first half 1993. Finding more gas reserves would bode well for extending existing LNG contracts with Japan or competing for new markets in Japan, South Korea, and Taiwan.

  6. Value addition to locally produced soybean in Ghana: production of ...

    African Journals Online (AJOL)

    Ghana produces about 50,000 metric tons of soy beans per annum, of which only about 15 metric tons are utilized. One aspect of utilizing the beans is in the production of soy sauce, a product whose demand is on the increase due to changing food habits of the Ghanaian society. A preliminary attempt to produce soy sauce ...

  7. Quantifying urban forest structure, function, and value: the Chicago Urban Forest Climate Project

    Science.gov (United States)

    E. Gregory McPherson; David Nowak; Gordon Heisler; Sue Grimmond; Catherine Souch; Rich Grant; Rowan Rowntree

    1997-01-01

    This paper is a review of research in Chicago that linked analyses of vegetation structure with forest functions and values. During 1991, the region's trees removed an estimated 5575 metric tons of air pollutants, providing air cleansing worth $9.2 million. Each year they sequester an estimated 315 800 metric tons of carbon. Increasing tree cover 10% or planting...

  8. An assessment of uncertainty in forest carbon budget projections

    Science.gov (United States)

    Linda S. Heath; James E. Smith

    2000-01-01

    Estimates of uncertainty are presented for projections of forest carbon inventory and average annual net carbon flux on private timberland in the US using the model FORCARB. Uncertainty in carbon inventory was approximately ±9% (2000 million metric tons) of the estimated median in the year 2000, rising to 11% (2800 million metric tons) in projection year 2040...

  9. Empowering billions with food safety and food security

    International Nuclear Information System (INIS)

    Pillai, Suresh D.

    2009-01-01

    Full text: Millions of people die needlessly every year due to contaminated water and food, and many millions more are starving due to an inadequate supply of food. Billions of pounds of food are unnecessarily wasted due to insect and other damage. Deaths and illness due to contaminated or inadequate food are at catastrophic levels in many regions of the world. The majority of food- and water-borne illnesses and deaths are preventable: by improved food production methods, improved food processing technologies, improved food distribution systems, and improved personal hygiene. Food irradiation technology is over 100 years old. Yet this technology is poorly understood by governments and corporate decision makers all around the world, and many consumers are unfortunately misinformed about it. There is an urgent need for nations and people around the world to empower themselves with the knowledge and expertise to harness this powerful technology. Widespread and sensible adoption of this technology can empower billions around the world with clean and abundant food supplies. It is unconscionable in the 21st century for governments to allow people to die or go hungry when the technology to prevent this is readily available

  10. Brand metrics that matter

    NARCIS (Netherlands)

    Muntinga, D.; Bernritter, S.

    2017-01-01

    The brand is increasingly central to the organization. It is therefore essential to measure the brand's health, performance, and development. Selecting the right brand metrics, however, is a challenge. An enormous number of metrics competes for the attention of brand managers. But which ...

  11. Privacy Metrics and Boundaries

    NARCIS (Netherlands)

    L-F. Pau (Louis-François)

    2005-01-01

    This paper aims at defining a set of privacy metrics (quantitative and qualitative) for the relation between a privacy protector and an information gatherer. The aims of such metrics are: to allow assessment and comparison of different user scenarios and their differences; for ...

  12. Nuclear business worth billions begins

    International Nuclear Information System (INIS)

    Beer, G.; Marcan, P.; Slovak, K.

    2005-01-01

    specific data regarding the direct costs of decommissioning. Preliminary estimates state 50 billion Slovak crowns (1.28 billion EUR), but the actual costs will mainly depend on the volume of nuclear waste to be disposed of. (authors)

  13. Coal geology and assessment of coal resources and reserves in the Powder River Basin, Wyoming and Montana

    Science.gov (United States)

    Luppens, James A.; Scott, David C.

    2015-01-01

    This report presents the final results of the first assessment of both coal resources and reserves for all significant coal beds in the entire Powder River Basin, northeastern Wyoming and southeastern Montana. The basin covers about 19,500 square miles, exclusive of the part of the basin within the Crow and Northern Cheyenne Indian Reservations in Montana. The Powder River Basin, which contains the largest resources of low-sulfur, low-ash, subbituminous coal in the United States, is the single most important coal basin in the United States. The U.S. Geological Survey used a geology-based assessment methodology to estimate an original coal resource of about 1.16 trillion short tons for 47 coal beds in the Powder River Basin; in-place (remaining) resources are about 1.15 trillion short tons. This is the first time that all beds were mapped individually over the entire basin. A total of 162 billion short tons of recoverable coal resources (coal reserve base) are estimated at a 10:1 stripping ratio or less. An estimated 25 billion short tons of that coal reserve base met the definition of reserves, which are resources that can be economically produced at or below the current sales price at the time of the evaluation. The total underground coal resource in coal beds 10–20 feet thick is estimated at 304 billion short tons.

  14. Un nouveau béton auto-cicatrisant grâce à l’incorporation de bactéries

    NARCIS (Netherlands)

    Wiktor, V.; Jonkers, H.M.

    2011-01-01

    The formation of a continuous network of cracks increases the permeability of concrete, thereby significantly reducing its resistance to attack by aggressive agents dissolved in water. To increase the autogenous healing capacity of concrete, certain healing agents

  15. Price of next big thing in physics: $6.7 billion

    CERN Multimedia

    Overbye, Dennis

    2007-01-01

    The price of exploring inner space went up Thursday. The machine, discussed at a news conference in Beijing, will be 20 miles long and would cost about $6.7 billion and 13,000 person-years of labor to build. (1.5 pages)

  16. Diabetes and depression: A review with special focus on India

    Directory of Open Access Journals (Sweden)

    Megha Thakur

    2015-10-01

    Full Text Available Diabetes, a psychologically challenging condition for patients and their caregivers, has been found to be a significant risk factor for depression. Depression may be a critical barrier to effective diabetes management. The accompanying fatigue remarkably lowers the motivation for self-care, often leading to lowered physical and emotional well-being, poor markers of diabetes control, poor adherence to medication, and increased mortality among individuals with diabetes. A very small proportion of diabetes patients with depression get diagnosed, and furthermore, only a handful of those diagnosed get treated for depression. Despite the fact that 80 percent of people with type 2 diabetes reside in low- and middle-income countries, most of the evidence on diabetes and depression comes from high-income countries. This review offers a summary of existing evidence and the potential gaps that need to be addressed.

  17. Application of natural gas to the direct reduction of iron ore

    Energy Technology Data Exchange (ETDEWEB)

    1975-05-01

    The Gas Committee of the U.N. Economic Commission for Europe evaluated the potential of natural gas for the direct reduction of iron ore. The report, based essentially on that by the Italian representative E. Pasero with comments and observations from experts of the other member countries, indicated the general tendency of the iron and steel industry to use natural gas to reduce production costs by cutting coke consumption. By the end of 1972, gas consumption by these industries was reported at 38.8 million Btu/ton (10.79 Gcal/metric ton) by the Steel Committee of the U.N. Economic Commission at the symposium on the economic and technical aspects of the direct reduction of iron ore, held in September 1972 in Bucharest. In comparison, coke consumption was 9.5 million Btu/ton (2.64 Gcal/metric ton) of steel, liquid hydrocarbons 3.1 million Btu (0.85 Gcal), and electricity 16.1 million Btu (4.46 Gcal). Natural gas was used mainly for ore reduction and generation of the reducing gas in shaft furnaces with backdraft heating circulation, fixed-bed furnaces (HyL type), and fluidized-bed reactors. Processes include the Midrex (shaft furnace), H.I.B. (fluidized bed), and Novalfer (fluidized bed). These processes are used to obtain 4.5 million tons/yr of iron sponge for the production of steel in electric furnaces. The natural gas outlook for direct reduction of iron will depend on local conditions and fuel availability. Its industrial application has been most successful in mini-steel installations, especially in the U.S., Japan, and Western Europe, and it is recommended for developing countries with no steel-industry base.
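The paired units can be reconciled with standard conversion factors: the quoted Gcal-per-metric-ton figures are consistent with the Btu figures read as millions of Btu per short ton. A sketch of the conversion:

```python
# Sanity check (a sketch): convert Btu per short ton to Gcal per metric ton
# using standard factors; small differences from the report are rounding.
CAL_PER_BTU = 252.0                 # ~251.996 calories per Btu
SHORT_TONS_PER_METRIC_TON = 1.1023  # 1 metric ton = ~1.1023 short tons

def btu_per_short_ton_to_gcal_per_metric_ton(btu):
    # Btu/short ton -> cal/short ton -> cal/metric ton -> Gcal/metric ton
    return btu * CAL_PER_BTU * SHORT_TONS_PER_METRIC_TON / 1e9

print(round(btu_per_short_ton_to_gcal_per_metric_ton(38.8e6), 2))  # ~10.78
print(round(btu_per_short_ton_to_gcal_per_metric_ton(9.5e6), 2))   # ~2.64
```

The same factors reproduce the hydrocarbon (3.1 million Btu → ~0.85 Gcal) and electricity (16.1 million Btu → ~4.46 Gcal) figures.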

  18. Cyber threat metrics.

    Energy Technology Data Exchange (ETDEWEB)

    Frye, Jason Neal; Veitch, Cynthia K.; Mateski, Mark Elliot; Michalski, John T.; Harris, James Mark; Trevino, Cassandra M.; Maruoka, Scott

    2012-03-01

    Threats are generally much easier to list than to describe, and much easier to describe than to measure. As a result, many organizations list threats. Fewer describe them in useful terms, and still fewer measure them in meaningful ways. This is particularly true in the dynamic and nebulous domain of cyber threats - a domain that tends to resist easy measurement and, in some cases, appears to defy any measurement. We believe the problem is tractable. In this report we describe threat metrics and models for characterizing threats consistently and unambiguously. The purpose of this report is to support the Operational Threat Assessment (OTA) phase of risk and vulnerability assessment. To this end, we focus on the task of characterizing cyber threats using consistent threat metrics and models. In particular, we address threat metrics and models for describing malicious cyber threats to US FCEB agencies and systems.

  19. Fixed point theory in metric type spaces

    CERN Document Server

    Agarwal, Ravi P; O’Regan, Donal; Roldán-López-de-Hierro, Antonio Francisco

    2015-01-01

    Written by a team of leading experts in the field, this volume presents a self-contained account of the theory, techniques and results in metric type spaces (in particular in G-metric spaces); that is, the text approaches this important area of fixed point analysis beginning from the basic ideas of metric space topology. The text is structured so that it leads the reader from preliminaries and historical notes on metric spaces (in particular G-metric spaces) and on mappings, to Banach type contraction theorems in metric type spaces, fixed point theory in partially ordered G-metric spaces, fixed point theory for expansive mappings in metric type spaces, generalizations, present results and techniques in a very general abstract setting and framework. Fixed point theory is one of the major research areas in nonlinear analysis. This is partly due to the fact that in many real world problems fixed point theory is the basic mathematical tool used to establish the existence of solutions to problems which arise natur...

  20. Deep Transfer Metric Learning.

    Science.gov (United States)

    Junlin Hu; Jiwen Lu; Yap-Peng Tan; Jie Zhou

    2016-12-01

    Conventional metric learning methods usually assume that the training and test samples are captured in similar scenarios so that their distributions are assumed to be the same. This assumption does not hold in many real visual recognition applications, especially when samples are captured across different data sets. In this paper, we propose a new deep transfer metric learning (DTML) method to learn a set of hierarchical nonlinear transformations for cross-domain visual recognition by transferring discriminative knowledge from the labeled source domain to the unlabeled target domain. Specifically, our DTML learns a deep metric network by maximizing the inter-class variations and minimizing the intra-class variations, and minimizing the distribution divergence between the source domain and the target domain at the top layer of the network. To better exploit the discriminative information from the source domain, we further develop a deeply supervised transfer metric learning (DSTML) method by including an additional objective on DTML, where the output of both the hidden layers and the top layer are optimized jointly. To preserve the local manifold of input data points in the metric space, we present two new methods, DTML with autoencoder regularization and DSTML with autoencoder regularization. Experimental results on face verification, person re-identification, and handwritten digit recognition validate the effectiveness of the proposed methods.
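The loss structure described above (minimize intra-class variation, maximize inter-class variation, align source and target distributions) can be sketched in miniature. The single-layer NumPy stand-in below, with a hinge-style inter-class term and a linear-kernel MMD, is an illustrative simplification, not the authors' implementation:

```python
import numpy as np

def linear_mmd(fs, ft):
    # Maximum mean discrepancy (linear kernel): squared distance between
    # the mean embeddings of the source and target domains
    return float(np.sum((fs.mean(axis=0) - ft.mean(axis=0)) ** 2))

def dtml_loss(W, Xs, ys, Xt, alpha=1.0, beta=1.0, margin=1.0):
    """Single-layer stand-in for the deep metric network: f(x) = tanh(x @ W).
    Pulls same-class source pairs together, pushes different-class pairs
    apart up to a margin, and aligns source/target feature distributions."""
    Fs, Ft = np.tanh(Xs @ W), np.tanh(Xt @ W)
    intra = inter = 0.0
    for i in range(len(Fs)):
        for j in range(i + 1, len(Fs)):
            d = float(np.sum((Fs[i] - Fs[j]) ** 2))
            if ys[i] == ys[j]:
                intra += d                      # minimize intra-class variation
            else:
                inter += max(0.0, margin - d)   # maximize inter-class variation
    return intra + alpha * inter + beta * linear_mmd(Fs, Ft)
```

The actual DTML stacks several nonlinear layers and trains by back-propagation; DSTML additionally applies such objectives to the hidden-layer outputs.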

  1. Energy functionals for Calabi-Yau metrics

    International Nuclear Information System (INIS)

    Headrick, M; Nassar, A

    2013-01-01

    We identify a set of ''energy'' functionals on the space of metrics in a given Kähler class on a Calabi-Yau manifold, which are bounded below and minimized uniquely on the Ricci-flat metric in that class. Using these functionals, we recast the problem of numerically solving the Einstein equation as an optimization problem. We apply this strategy, using the ''algebraic'' metrics (metrics for which the Kähler potential is given in terms of a polynomial in the projective coordinates), to the Fermat quartic and to a one-parameter family of quintics that includes the Fermat and conifold quintics. We show that this method yields approximations to the Ricci-flat metric that are exponentially accurate in the degree of the polynomial (except at the conifold point, where the convergence is polynomial), and therefore orders of magnitude more accurate than the balanced metrics, previously studied as approximations to the Ricci-flat metric. The method is relatively fast and easy to implement. On the theoretical side, we also show that the functionals can be used to give a heuristic proof of Yau's theorem

  2. Design and analysis of a 1-ton prototype of the Jinping Neutrino Experiment

    International Nuclear Information System (INIS)

    Wang, Zongyi; Wang, Yuanqing; Wang, Zhe; Chen, Shaomin; Du, Xinxi; Zhang, Tianxiong; Guo, Ziyi; Yuan, Huanxin

    2017-01-01

    The Jinping Neutrino Experiment will perform in-depth research on solar neutrinos and geo-neutrinos. Two structural options (i.e., cylindrical and spherical schemes) are proposed for the Jinping detector based on other successful underground neutrino detectors. Several key factors in the design are also discussed in detail. A 1-ton prototype of the Jinping experiment is proposed based on physics requirements. Subsequently, the structural design, installation procedure, and mechanical analysis of the neutrino detector prototype are discussed. The results show that the maximum Mises stresses on the acrylic vessel, stainless steel truss, and the tank are all lower than the design values of the strengths. The stability requirement of the stainless steel truss in the detector prototype is satisfied. Consequently, the structural scheme for the 1-ton prototype is safe and reliable.

  3. Design and analysis of a 1-ton prototype of the Jinping Neutrino Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Zongyi, E-mail: wangzongyi1990@outlook.com [School of Civil Engineering, Wuhan University, Wuhan 430072 (China); Wang, Yuanqing [Key Laboratory of Civil Engineering Safety and Durability of Education Ministry, Tsinghua University, Beijing 100084 (China); Wang, Zhe; Chen, Shaomin [Department of Engineering Physics, Tsinghua University, Beijing 100084 (China); Du, Xinxi [School of Civil Engineering, Wuhan University, Wuhan 430072 (China); Zhang, Tianxiong [School of Civil Engineering, Tianjin University, Tianjin 300072 (China); Guo, Ziyi [Department of Engineering Physics, Tsinghua University, Beijing 100084 (China); Yuan, Huanxin [School of Civil Engineering, Wuhan University, Wuhan 430072 (China)

    2017-05-21

    The Jinping Neutrino Experiment will perform in-depth research on solar neutrinos and geo-neutrinos. Two structural options (i.e., cylindrical and spherical schemes) are proposed for the Jinping detector based on other successful underground neutrino detectors. Several key factors in the design are also discussed in detail. A 1-ton prototype of the Jinping experiment is proposed based on physics requirements. Subsequently, the structural design, installation procedure, and mechanical analysis of the neutrino detector prototype are discussed. The results show that the maximum Mises stresses on the acrylic vessel, stainless steel truss, and the tank are all lower than the design values of the strengths. The stability requirement of the stainless steel truss in the detector prototype is satisfied. Consequently, the structural scheme for the 1-ton prototype is safe and reliable.

  4. Channel change and bed-material transport in the Umpqua River basin, Oregon

    Science.gov (United States)

    Wallick, J. Rose; O'Connor, Jim E.; Anderson, Scott; Keith, Mackenzie K.; Cannon, Charles; Risley, John C.

    2011-01-01

    The Umpqua River drains 12,103 square kilometers of western Oregon; with headwaters in the Cascade Range, the river flows through portions of the Klamath Mountains and Oregon Coast Range before entering the Pacific Ocean. Above the head of tide, the Umpqua River, along with its major tributaries, the North and South Umpqua Rivers, flows on a mixed bedrock and alluvium bed, alternating between bedrock rapids and intermittent, shallow gravel bars composed of gravel to cobble-sized clasts. These bars have been a source of commercial aggregate since the mid-twentieth century. Below the head of tide, the Umpqua River contains large bars composed of mud and sand. Motivated by ongoing permitting and aquatic habitat concerns related to in-stream gravel mining on the fluvial reaches, this study evaluated spatial and temporal trends in channel change and bed-material transport for 350 kilometers of river channel along the Umpqua, North Umpqua, and South Umpqua Rivers. The assessment produced (1) detailed mapping of the active channel, using aerial photographs and repeat surveys, and (2) a quantitative estimation of bed-material flux that drew upon detailed measurements of particle size and lithology, equations of transport capacity, and a sediment yield analysis. Bed-material transport capacity estimates at 45 sites throughout the South Umpqua and main stem Umpqua Rivers for the period 1951-2008 result in wide-ranging transport capacity estimates, reflecting the difficulty of applying equations of bed-material transport to a supply-limited river. Median transport capacity values calculated from surface-based equations of bedload transport for each of the study reaches provide indications of maximum possible transport rates and range from 8,000 to 27,000 metric tons per year (tons/yr) for the South Umpqua River and 20,000 to 82,000 metric tons/yr for the main stem Umpqua River upstream of the head of tide; the North Umpqua River probably contributes little bed material. A

  5. Transportation Energy Futures Series: Freight Transportation Demand: Energy-Efficient Scenarios for a Low-Carbon Future

    Energy Technology Data Exchange (ETDEWEB)

    Grenzeback, L. R.; Brown, A.; Fischer, M. J.; Hutson, N.; Lamm, C. R.; Pei, Y. L.; Vimmerstedt, L.; Vyas, A. D.; Winebrake, J. J.

    2013-03-01

    Freight transportation demand is projected to grow to 27.5 billion tons in 2040, and to nearly 30.2 billion tons in 2050. This report describes the current and future demand for freight transportation in terms of tons and ton-miles of commodities moved by truck, rail, water, pipeline, and air freight carriers. It outlines the economic, logistics, transportation, and policy and regulatory factors that shape freight demand, the trends and 2050 outlook for these factors, and their anticipated effect on freight demand. After describing federal policy actions that could influence future freight demand, the report then summarizes the capabilities of available analytical models for forecasting freight demand. This is one in a series of reports produced as a result of the Transportation Energy Futures project, a Department of Energy-sponsored multi-agency effort to pinpoint underexplored strategies for reducing GHGs and petroleum dependence related to transportation.
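The projected climb from 27.5 billion tons in 2040 to 30.2 billion tons in 2050 implies a modest average growth rate, which a quick calculation makes explicit:

```python
# Implied compound annual growth rate of freight demand, 2040 -> 2050,
# from the projections quoted in the report
tons_2040, tons_2050 = 27.5e9, 30.2e9
annual_growth = (tons_2050 / tons_2040) ** (1 / 10) - 1
print(f"{annual_growth:.2%}")  # ~0.94% per year
```
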

  6. Possible development of nuclear energy in the European Community and consequences of different reactor strategies

    International Nuclear Information System (INIS)

    Decressin, A.; Haytinck, B.; Orlowski, S.

    1974-01-01

    The Commission of the European Communities recommended stimulating the development of nuclear energy, in order to ensure, in the medium or long term, a diversification of the energy supply sources of the Community. Under such a policy, nuclear energy could cover nearly 80% of the Community's needs in electrical power in the year 2000 - these being estimated at 50% of the total energy needs of the Community - corresponding to 1.3 billion tonnes of oil equivalent (toe) for that year alone. In the year 2000, the installed nuclear capacity in the Community (i.e., nearly 1000 GWe) would imply the consumption of roughly 150,000 metric tons of natural uranium and require 90,000 tons of SWU in enrichment services, whatever ''average'' strategy is considered for the period 1980-2000. The choices between these various strategies made by public or industrial decision centers will be the result of a complex assessment of many factors. In any case, the flow of nuclear material between countries will remain very important, and Community nuclear self-sufficiency based on breeding is not conceivable before the time at which new energy sources could be brought into effective use

  7. Document de travail 5: Beyond Collier's Bottom Billion | CRDI ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    16 Dec. 2010 ... Paul Collier's book, The Bottom Billion, has attracted great interest in the development field. It rests on the thesis that a group of nearly 60 countries, with a combined population of about one billion people, are caught in four main traps.

  8. Regge calculus from discontinuous metrics

    International Nuclear Information System (INIS)

    Khatsymovsky, V.M.

    2003-01-01

    Regge calculus is considered as a particular case of the more general system in which the linklengths of any two neighbouring 4-tetrahedra do not necessarily coincide on their common face. This system is treated as the one described by a metric that is discontinuous on the faces. In the superspace of all discontinuous metrics, the Regge calculus metrics form a hypersurface defined by continuity conditions. The quantum theory of the discontinuous metric system is assumed to be fixed somehow in the form of a quantum measure on (the space of functionals on) the superspace. The problem of reducing this measure to the Regge hypersurface is addressed. The quantum Regge calculus measure is defined from a discontinuous metric measure by inserting a δ-function-like phase factor. The requirement that continuity conditions be imposed in a 'face-independent' way fixes this factor uniquely. The term 'face-independent' means that this factor depends only on the (hyper)plane spanned by the face, not on its form and size. This requirement seems natural from the viewpoint of the existence of a well-defined continuum limit maximally free of lattice artefacts

  9. Numerical Calabi-Yau metrics

    International Nuclear Information System (INIS)

    Douglas, Michael R.; Karp, Robert L.; Lukic, Sergio; Reinbacher, Rene

    2008-01-01

    We develop numerical methods for approximating Ricci flat metrics on Calabi-Yau hypersurfaces in projective spaces. Our approach is based on finding balanced metrics and builds on recent theoretical work by Donaldson. We illustrate our methods in detail for a one parameter family of quintics. We also suggest several ways to extend our results

  10. A parts-per-billion measurement of the antiproton magnetic moment

    CERN Document Server

    Smorra, C; Borchert, M J; Harrington, J A; Higuchi, T; Nagahama, H; Tanaka, T; Mooser, A; Schneider, G; Blaum, K; Matsuda, Y; Ospelkaus, C; Quint, W; Walz, J; Yamazaki, Y; Ulmer, S

    2017-01-01

    Precise comparisons of the fundamental properties of matter–antimatter conjugates provide sensitive tests of charge–parity–time (CPT) invariance, which is an important symmetry that rests on basic assumptions of the standard model of particle physics. Experiments on mesons, leptons and baryons have compared different properties of matter–antimatter conjugates with fractional uncertainties at the parts-per-billion level or better. One specific quantity, however, has so far only been known to a fractional uncertainty at the parts-per-million level: the magnetic moment of the antiproton. The extraordinary difficulty in measuring it with high precision is caused by its intrinsic smallness; for example, it is 660 times smaller than the magnetic moment of the positron. Here we report a high-precision measurement of the antiproton magnetic moment in units of the nuclear magneton μN with a fractional precision of 1.5 parts per billion (68% confidence level). We use a two-particle spectroscopy method in an advanced cryogenic ...

  11. Metrics for Evaluation of Student Models

    Science.gov (United States)

    Pelanek, Radek

    2015-01-01

    Researchers use many different metrics for evaluation of performance of student models. The aim of this paper is to provide an overview of commonly used metrics, to discuss properties, advantages, and disadvantages of different metrics, to summarize current practice in educational data mining, and to provide guidance for evaluation of student…

  12. International collaboration in SSC (or any $4 billion scientific project)

    International Nuclear Information System (INIS)

    Lederman, L.M.

    1988-01-01

    In this paper, the author discusses the Superconducting Super Collider, a project costing U.S. $4.4 billion. The author briefly presents the scientific motivation and also gives an idea of how the project is possible, given U.S. deficits

  13. Active Seismic Monitoring Using High-Power Moveable 40-TONS Vibration Sources in Altay-Sayn Region of Russia

    Science.gov (United States)

    Soloviev, V. M.; Seleznev, V. S.; Emanov, A. F.; Kashun, V. N.; Elagin, S. A.; Romanenko, I.; Shenmayer, A. E.; Serezhnikov, N.

    2013-05-01

    The paper presents data from routine vibroseismic observations using high-power stationary 100-ton and moveable 40-ton vibration sources, which have been carried out in Russia for 30 years. It is shown that investigations using high-power vibration sources open new possibilities for studying the stress-strain state of the Earth's crust and upper mantle and the tectonic processes within them. Special attention is given to developing routine seismic sounding of the Earth's crust and upper mantle using high-power 40-ton vibration sources. Experimental research has demonstrated the high stability and repeatability of vibration effects. Multi-day experiments, with vibration-source sessions every two hours, were carried out to estimate monitoring accuracy. The repeatability of vibroseismic effects (measured as the time difference between repeated sessions of P- and S-waves from the crystalline basement surface) was estimated at 10^-3 to 10^-4 s. This is an order of magnitude smaller than the annual variations of kinematic parameters revealed by routine vibroseismic observations. It is shown that on hard, high-velocity ground the radiated spectrum becomes narrowband and shifts toward higher frequencies, while the number of high-frequency harmonic multiples grows. When radiating on soft sedimentary ground (sand, clay), the source spectrum in the near zone is more broadband and the correlograms more compact. The correspondence of wave fields from 40-ton vibration sources and explosions, by reference waves from boundaries in the Earth's crust and upper mantle at recording distances of 400 km, was proved by many experiments in various regions of Russia; a technique of grouping high-power vibration sources was developed to increase the effectiveness of radiation and the recording distance.
According to the results of long-term vibroseismic monitoring near Novosibirsk (1997-2012) there are

  14. Issues in Benchmark Metric Selection

    Science.gov (United States)

    Crolotte, Alain

    It is true that a metric can influence a benchmark but will esoteric metrics create more problems than they will solve? We answer this question affirmatively by examining the case of the TPC-D metric which used the much debated geometric mean for the single-stream test. We will show how a simple choice influenced the benchmark and its conduct and, to some extent, DBMS development. After examining other alternatives our conclusion is that the “real” measure for a decision-support benchmark is the arithmetic mean.
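The sensitivity difference the author alludes to is easy to reproduce with hypothetical per-query times: the arithmetic mean tracks total elapsed work and is dominated by one slow query, while the geometric mean rewards uniform improvement ratios across queries:

```python
from math import prod

# Hypothetical per-query times (seconds) for a benchmark stream: three fast
# queries and one outlier
times = [1.0, 1.0, 1.0, 100.0]

arithmetic = sum(times) / len(times)     # dominated by the slowest query
geometric = prod(times) ** (1 / len(times))  # damps the outlier's influence

print(arithmetic)           # 25.75
print(round(geometric, 2))  # 3.16
```

Under the geometric mean, halving any one query's time improves the metric by the same factor, regardless of whether that query took 1 second or 100; under the arithmetic mean, only the outlier matters.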

  15. Fluoresceination of FepA during colicin B killing: effects of temperature, toxin and TonB.

    Science.gov (United States)

    Smallwood, Chuck R; Marco, Amparo Gala; Xiao, Qiaobin; Trinh, Vy; Newton, Salete M C; Klebba, Phillip E

    2009-06-01

    We studied the reactivity of 35 genetically engineered Cys sulphydryl groups at different locations in Escherichia coli FepA. Modification of surface loop residues by fluorescein maleimide (FM) was strongly temperature-dependent in vivo, whereas reactivity at other sites was much less affected. Control reactions with bovine serum albumin showed that the temperature dependence of loop residue reactivity was unusually high, indicating that conformational changes in multiple loops (L2, L3, L4, L5, L7, L8, L10) transform the receptor to a more accessible form at 37 degrees C. At 0 degrees C colicin B binding impaired or blocked labelling at 8 of 10 surface loop sites, presumably by steric hindrance. Overall, colicin B adsorption decreased the reactivity of more than half of the 35 sites, in both the N- and C- domains of FepA. However, colicin B penetration into the cell at 37 degrees C did not augment the chemical modification of any residues in FepA. The FM modification patterns were similarly unaffected by the tonB locus. FepA was expressed at lower levels in a tonB host strain, but when we accounted for this decrease its FM labelling was comparable whether TonB was present or absent. Thus we did not detect TonB-dependent structural changes in FepA, either alone or when it interacted with colicin B at 37 degrees C. The only changes in chemical modification were reductions from steric hindrance when the bacteriocin bound to the receptor protein. The absence of increases in the reactivity of N-domain residues argues against the idea that the colicin B polypeptide traverses the FepA channel.

  16. Oncology pharma costs to exceed $150 billion by 2020.

    Science.gov (United States)

    2016-10-01

    Worldwide costs of oncology drugs will rise above $150 billion by 2020, according to a report by the IMS Institute for Healthcare Informatics. Many factors are in play, according to IMS, including the new wave of expensive immunotherapies. Pembrolizumab (Keytruda), priced at $150,000 per year per patient, and nivolumab (Opdivo), priced at $165,000, may be harbingers of the market for cancer immunotherapies.

  17. Cost of solving mysteries of universe: $6.7 billion

    CERN Multimedia

    Overbye, Dennis

    2007-01-01

    "An international consortium of physicists on Thursday released the first detailed design of what they believe will be the next big thing in physics. The machine, 20 miles long, will slam together electrons and their opposites, positrons, to produce fireballs of energy re-creating conditions when the universe was only a trillionth of a second old. It would cost about $6.7 billion." (1 page)

  18. Robustness of climate metrics under climate policy ambiguity

    International Nuclear Information System (INIS)

    Ekholm, Tommi; Lindroos, Tomi J.; Savolainen, Ilkka

    2013-01-01

    Highlights: • We assess the economic impacts of using different climate metrics. • The setting is cost-efficient scenarios for three interpretations of the 2 °C target. • With each target setting, the optimal metric is different. • Therefore policy ambiguity prevents the selection of an optimal metric. • Robust metric values that perform well with multiple policy targets however exist. -- Abstract: A wide array of alternatives has been proposed as the common metrics with which to compare the climate impacts of different emission types. Different physical and economic metrics and their parameterizations give diverse weights between e.g. CH4 and CO2, and fixing the metric from one perspective makes it sub-optimal from another. As the aims of global climate policy involve some degree of ambiguity, it is not possible to determine a metric that would be optimal and consistent with all policy aims. This paper evaluates the cost implications of using predetermined metrics in cost-efficient mitigation scenarios. Three formulations of the 2 °C target, including both deterministic and stochastic approaches, shared a wide range of metric values for CH4 with which the mitigation costs are only slightly above the cost-optimal levels. Therefore, although ambiguity in current policy might prevent us from selecting an optimal metric, it can be possible to select robust metric values that perform well with multiple policy targets
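A climate metric enters such an analysis as a fixed weighting that converts each gas's emissions into a common unit. A minimal sketch, using IPCC AR5 GWP100 values (without climate-carbon feedbacks) as one of the many possible parameterizations the abstract refers to:

```python
# Illustrative metric choice: GWP100 values from IPCC AR5, excluding
# climate-carbon feedbacks. AR4/AR6 values or a GTP metric would weight
# CH4 and N2O differently, which is exactly the ambiguity at issue.
GWP100 = {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0}

def co2_equivalent(emissions_tons):
    """Weight per-gas emissions (metric tons) into tons of CO2-equivalent."""
    return sum(GWP100[gas] * tons for gas, tons in emissions_tons.items())

print(co2_equivalent({"CO2": 1000.0, "CH4": 10.0}))  # 1280.0 t CO2-eq
```

Raising or lowering the CH4 weight shifts cost-efficient mitigation effort between gases, which is why the paper searches for weight values that stay near-optimal across several target formulations.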

  19. Web metrics for library and information professionals

    CERN Document Server

    Stuart, David

    2014-01-01

    This is a practical guide to using web metrics to measure impact and demonstrate value. The web provides an opportunity to collect a host of different metrics, from those associated with social media accounts and websites to more traditional research outputs. This book is a clear guide for library and information professionals as to what web metrics are available and how to assess and use them to make informed decisions and demonstrate value. As individuals and organizations increasingly use the web in addition to traditional publishing avenues and formats, this book provides the tools to unlock web metrics and evaluate the impact of this content. The key topics covered include: bibliometrics, webometrics and web metrics; data collection tools; evaluating impact on the web; evaluating social media impact; investigating relationships between actors; exploring traditional publications in a new environment; web metrics and the web of data; the future of web metrics and the library and information professional.Th...

  20. Partial rectangular metric spaces and fixed point theorems.

    Science.gov (United States)

    Shukla, Satish

    2014-01-01

    The purpose of this paper is to introduce the concept of partial rectangular metric spaces as a generalization of rectangular metric and partial metric spaces. Some properties of partial rectangular metric spaces and some fixed point results for quasitype contraction in partial rectangular metric spaces are proved. Some examples are given to illustrate the observed results.
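The two notions being combined are standard: Matthews' partial metric, in which self-distance need not vanish, and Branciari's rectangular metric, in which the triangle inequality is replaced by a four-point inequality. A sketch of these defining axioms (the paper's combined definition follows this pattern, but this is not its exact statement):

```latex
% Partial metric (Matthews): p : X \times X \to \mathbb{R}_{\ge 0} with
p(x,x) \le p(x,y), \qquad
x = y \iff p(x,x) = p(x,y) = p(y,y), \qquad
p(x,y) = p(y,x),
\\
p(x,y) \le p(x,z) + p(z,y) - p(z,z).

% Rectangular (Branciari) metric: the triangle inequality is replaced by a
% quadrilateral inequality over distinct points u, v \notin \{x, y\}:
d(x,y) \le d(x,u) + d(u,v) + d(v,y).
```
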

  1. A Kerr-NUT metric

    International Nuclear Information System (INIS)

    Vaidya, P.C.; Patel, L.K.; Bhatt, P.V.

    1976-01-01

    Using Galilean time and retarded distance as coordinates, the usual Kerr metric is expressed in a form similar to the Newman-Unti-Tamburino (NUT) metric. The combined Kerr-NUT metric is then investigated. In addition to the Kerr and NUT solutions of Einstein's equations, three other types of solutions are derived. These are (i) the radiating Kerr solution, (ii) the radiating NUT solution satisfying R_{ik} = σ ξ_i ξ_k with ξ_i ξ^i = 0, and (iii) the associated Kerr solution satisfying R_{ik} = 0. Solution (i) is distinct from and simpler than the one reported earlier by Vaidya and Patel (Phys. Rev. D 7:3590 (1973)). Solutions (ii) and (iii) give line elements that have the axis of symmetry as a singular line. (author)

  2. Areva - First quarter 2009 revenue climbs 8.5% to 3.003 billion euros

    International Nuclear Information System (INIS)

    2009-04-01

    First quarter 2009 revenue was up 8.5% compared with the same period last year, to 3.003 billion euros. At constant exchange rates and consolidation scope, growth came to 3.9%. Currency translation had a positive impact of 57 million euros over the quarter. Changes in the consolidation scope had an impact of 66 million euros, primarily due to the consolidation of acquisitions made in 2008 in Transmission and Distribution and in Renewable Energies. The growth engines for first quarter revenue were the Reactors and Services division and the Transmission and Distribution division, with growth of 9.2% and 16.1% respectively. Outside France, revenue rose to 2.032 billion euros, compared with 1.857 billion euros in the first quarter of 2008, and represents 68% of total revenue. Orders were steady in the first quarter, particularly in the Front End, which posted several significant contracts with US and Asian utilities, and in Transmission and Distribution, with orders up sharply in Asia and South America. As of March 31, 2009, the group's backlog reached 49.5 billion euros, for 28.3% growth year-on-year, including 31.3% growth in Nuclear and 10.2% in Transmission and Distribution. For the year as a whole, the group confirms its outlook for backlog and revenue growth as well as rising operating income. It should be noted that revenue may vary significantly from one quarter to the next in nuclear operations. Accordingly, quarterly data cannot be viewed as a reliable indicator of annual trends

  3. Research Investments in Global Health: A Systematic Analysis of UK Infectious Disease Research Funding and Global Health Metrics, 1997–2013

    Science.gov (United States)

    Head, Michael G.; Fitchett, Joseph R.; Nageshwaran, Vaitehi; Kumari, Nina; Hayward, Andrew; Atun, Rifat

    2015-01-01

    Background: Infectious diseases account for a significant global burden of disease and substantial investment in research and development. This paper presents a systematic assessment of research investments awarded to UK institutions and global health metrics assessing disease burden. Methods: We systematically sourced research funding data awarded by public and philanthropic organisations between 1997 and 2013. We screened awards for relevance to infection and categorised data by type of science, disease area and specific pathogen. Investments were compared with mortality, disability-adjusted life years (DALYs) and years lived with disability (YLD) across three time points. Findings: Between 1997 and 2013, there were 7398 awards with a total investment of £3.7 billion. An increase in research funding across 2011–2013 was observed for most disease areas, with notable exceptions being sexually transmitted infections and sepsis research, where funding decreased. Most funding remains for pre-clinical research (£2.2 billion, 59.4%). Relative to global mortality, DALYs and YLDs, acute hepatitis C, leishmaniasis and African trypanosomiasis received comparatively high levels of funding. Pneumonia, shigellosis, pertussis, cholera and syphilis were poorly funded across all health metrics. Tuberculosis (TB) consistently attracts relatively less funding than HIV and malaria. Interpretation: Most infections have received increases in research investment, alongside decreases in global burden of disease in 2013. The UK demonstrates research strengths in some neglected tropical diseases such as African trypanosomiasis and leishmaniasis, but syphilis, cholera, shigellosis and pneumonia remain poorly funded relative to their global burden. Acute hepatitis C appears well funded, but the figures do not adequately take into account projected future chronic burdens for this condition. These findings can help to inform global policymakers on resource allocation for research investment.

  4. Background metric in supergravity theories

    International Nuclear Information System (INIS)

    Yoneya, T.

    1978-01-01

    In supergravity theories, we investigate the conformal anomaly of the path-integral determinant and the problem of fermion zero modes in the presence of a nontrivial background metric. Except in SO(3)-invariant supergravity, there are nonvanishing conformal anomalies. As a consequence, amplitudes around the nontrivial background metric contain unpredictable arbitrariness. The fermion zero modes, which are explicitly constructed for the Euclidean Schwarzschild metric, are interpreted as an indication of the supersymmetric multiplet structure of a black hole. The degree of degeneracy of a black hole is 2^(4n) in SO(n) supergravity.

  5. Daylight metrics and energy savings

    Energy Technology Data Exchange (ETDEWEB)

    Mardaljevic, John; Heschong, Lisa; Lee, Eleanor

    2009-12-31

    The drive towards sustainable, low-energy buildings has increased the need for simple, yet accurate methods to evaluate whether a daylit building meets minimum standards for energy and human comfort performance. Current metrics account neither for the temporal and spatial aspects of daylight, nor for occupants' comfort or interventions. This paper reviews the historical basis of current compliance methods for achieving daylit buildings, proposes a technical basis for development of better metrics, and provides two case study examples to stimulate dialogue on how metrics can be applied in a practical, real-world context.

  6. Initial Response of Pine Seedlings and Weeds to Dried Sewage Sludge in Rehabilitation of an Eroded Forest Site

    Science.gov (United States)

    Charles R. Berry

    1977-01-01

    Dried sewage sludge was applied at rates of 0, 17, 34, and 69 metric tons/ha on a badly eroded forest site in the Piedmont region of northeast Georgia. Production of weed biomass varied directly with the amount of sludge applied. Height growth for both shortleaf and loblolly pine seedlings appeared to be greater on plots receiving 17 metric tons of sludge/ha, but differences...

  7. Metrics for energy resilience

    International Nuclear Information System (INIS)

    Roege, Paul E.; Collier, Zachary A.; Mancillas, James; McDonagh, John A.; Linkov, Igor

    2014-01-01

    Energy lies at the backbone of any advanced society and constitutes an essential prerequisite for economic growth, social order and national defense. However there is an Achilles heel to today's energy and technology relationship; namely a precarious intimacy between energy and the fiscal, social, and technical systems it supports. Recently, widespread and persistent disruptions in energy systems have highlighted the extent of this dependence and the vulnerability of increasingly optimized systems to changing conditions. Resilience is an emerging concept that offers to reconcile considerations of performance under dynamic environments and across multiple time frames by supplementing traditionally static system performance measures to consider behaviors under changing conditions and complex interactions among physical, information and human domains. This paper identifies metrics useful to implement guidance for energy-related planning, design, investment, and operation. Recommendations are presented using a matrix format to provide a structured and comprehensive framework of metrics relevant to a system's energy resilience. The study synthesizes previously proposed metrics and emergent resilience literature to provide a multi-dimensional model intended for use by leaders and practitioners as they transform our energy posture from one of stasis and reaction to one that is proactive and which fosters sustainable growth. - Highlights: • Resilience is the ability of a system to recover from adversity. • There is a need for methods to quantify and measure system resilience. • We developed a matrix-based approach to generate energy resilience metrics. • These metrics can be used in energy planning, system design, and operations

  8. Balanced metrics for vector bundles and polarised manifolds

    DEFF Research Database (Denmark)

    Garcia Fernandez, Mario; Ross, Julius

    2012-01-01

    We consider a notion of balanced metrics for triples (X, L, E) which depend on a parameter α, where X is a smooth complex manifold with an ample line bundle L and E is a holomorphic vector bundle over X. For generic choice of α, we prove that the limit of a convergent sequence of balanced metrics leads to a Hermitian-Einstein metric on E and a constant scalar curvature Kähler metric in c_1(L). For special values of α, limits of balanced metrics are solutions of a system of coupled equations relating a Hermitian-Einstein metric on E and a Kähler metric in c_1(L). For this, we compute the top two...

  9. The metrics of science and technology

    CERN Document Server

    Geisler, Eliezer

    2000-01-01

    Dr. Geisler's far-reaching, unique book provides an encyclopedic compilation of the key metrics to measure and evaluate the impact of science and technology on academia, industry, and government. Focusing on such items as economic measures, patents, peer review, and other criteria, and supported by an extensive review of the literature, Dr. Geisler gives a thorough analysis of the strengths and weaknesses inherent in metric design, and in the use of the specific metrics he cites. His book has already received prepublication attention, and will prove especially valuable for academics in technology management, engineering, and science policy; industrial R&D executives and policymakers; government science and technology policymakers; and scientists and managers in government research and technology institutions. Geisler maintains that the application of metrics to evaluate science and technology at all levels illustrates the variety of tools we currently possess. Each metric has its own unique strengths and...

  10. Erosion and Sedimentation from the Bagley Fire, Eastern Klamath Mountains, Northern CA

    Science.gov (United States)

    De La Fuente, J. A.; Bachmann, S.; Mai, C.; Mikulovsky, R.; Mondry, Z. J.; Rust, B.; Young, D.

    2014-12-01

    The Bagley Fire burned about 19,000 hectares on the Shasta-Trinity National Forest in the late summer of 2012, with soil burn severities of 11% high, 19% moderate and 48% low. Two strong storms in November and December followed the fire. The first storm had a recurrence interval of about 2 years, and generated runoff with a return interval of 10-25 years, causing many road stream crossing failures in parts of the fire. The second storm had a recurrence interval of 25-50 years, and initiated more severe erosion throughout the fire area. Erosional processes were dominated by sheet, rill and gully erosion, and landslides were uncommon. A model predicted high potential for debris flows, but few were documented, and though most stream channels exhibited fresh scour and deposition, residual deposits lacked boulder levees or other evidence of debris flow. Rather, deposits were stratified and friable, suggesting a sediment laden flood flow rather than debris flow origin. The resulting sediment was rich in gravel and finer particles, and poor in larger rock. Soil loss was estimated at 0.5-5.6 cm on most hillslopes. A high resolution DEM (LiDAR) was used to measure gullies, small landslides, and stream scour, and also to estimate sedimentation in Squaw Creek, and Shasta Lake. A soil erosion model was used to estimate surface erosion. Total erosion in the Squaw Creek watershed was estimated at 2.24 million metric tons, which equates to 260 metric tons/hectare. Of this, about 0.89 million metric tons were delivered to the stream system (103 metric tons/hectare). Nearly half of this sediment, 0.41 million metric tons, was temporarily stored in the Squaw Creek channel, and around 0.33 million metric tons of fine sediment were carried into Shasta Lake. Squaw Creek also delivered about 0.17 million metric tons of sand, gravel and cobbles to the lake. This estimate is very tenuous, and was made by measuring the volume of a delta in Shasta Lake from a tributary to Squaw Creek and
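
    The record's erosion totals are mutually consistent, as a quick check shows (the watershed area below is derived from the stated totals, not given in the abstract):

    ```python
    # Totals quoted in the record for the Squaw Creek watershed.
    total_erosion_t = 2.24e6   # metric tons eroded
    per_ha_t = 260             # metric tons per hectare
    delivered_t = 0.89e6       # metric tons delivered to the stream system

    # Implied contributing area and hillslope-to-stream delivery ratio.
    implied_area_ha = total_erosion_t / per_ha_t
    delivery_ratio = delivered_t / total_erosion_t

    print(round(implied_area_ha))     # ~8615 ha
    print(round(delivery_ratio, 2))   # ~0.4, i.e. about 40% delivered
    ```

    The delivered figure of 103 metric tons/hectare also checks out: 103 × ~8615 ha ≈ 0.89 million metric tons.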

  11. Extending cosmology: the metric approach

    OpenAIRE

    Mendoza, S.

    2012-01-01

    Comment: 2012, Extending Cosmology: The Metric Approach, Open Questions in Cosmology; Review article for an Intech "Open questions in cosmology" book chapter (19 pages, 3 figures). Available from: http://www.intechopen.com/books/open-questions-in-cosmology/extending-cosmology-the-metric-approach

  12. Metrics, Media and Advertisers: Discussing Relationship

    Directory of Open Access Journals (Sweden)

    Marco Aurelio de Souza Rodrigues

    2014-11-01

    Full Text Available This study investigates how Brazilian advertisers are adapting to new media and its attention metrics. In-depth interviews were conducted with advertisers in 2009 and 2011. In 2009, new media and its metrics were celebrated as innovations that would increase advertising campaigns' overall efficiency. In 2011, this perception had changed: new media's profusion of metrics, once seen as an advantage, started to compromise its ease of use and adoption. Among its findings, this study argues that there is an opportunity for media groups willing to shift from a product-focused strategy towards a customer-centric one, through the creation of new, simple and integrative metrics.

  13. Measuring Information Security: Guidelines to Build Metrics

    Science.gov (United States)

    von Faber, Eberhard

    Measuring information security is a genuine interest of security managers. With metrics they can develop their security organization's visibility and standing within the enterprise or public authority as a whole. Organizations using information technology need to use security metrics. Despite the clear demands and advantages, security metrics are often poorly developed, or ineffective parameters are collected and analysed. This paper describes best practices for the development of security metrics. First, attention is drawn to motivation, showing both requirements and benefits. The main body of this paper lists things which need to be observed (characteristics of metrics), things which can be measured (how measurements can be conducted) and steps for the development and implementation of metrics (procedures and planning). Analysis and communication are also key when using security metrics. Examples are given in order to develop a better understanding. The author wants to resume, continue and develop the discussion about a topic which is, or increasingly will be, a critical success factor for security managers in larger organizations.

  14. Active Metric Learning for Supervised Classification

    OpenAIRE

    Kumaran, Krishnan; Papageorgiou, Dimitri; Chang, Yutong; Li, Minhan; Takáč, Martin

    2018-01-01

    Clustering and classification critically rely on distance metrics that provide meaningful comparisons between data points. We present mixed-integer optimization approaches to find optimal distance metrics that generalize the Mahalanobis metric extensively studied in the literature. Additionally, we generalize and improve upon leading methods by removing reliance on pre-designated "target neighbors," "triplets," and "similarity pairs." Another salient feature of our method is its ability to en...
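
    The Mahalanobis metric that the paper generalizes is parameterized by a positive semidefinite matrix M; a minimal NumPy illustration of the base metric follows (this is not the authors' mixed-integer optimization method, just the distance it generalizes):

    ```python
    import numpy as np

    def mahalanobis(x, y, M):
        """Distance d_M(x, y) = sqrt((x - y)^T M (x - y)) for a PSD matrix M."""
        d = np.asarray(x, float) - np.asarray(y, float)
        return float(np.sqrt(d @ M @ d))

    # With M = identity, the metric reduces to plain Euclidean distance.
    M = np.eye(2)
    print(mahalanobis([0, 0], [3, 4], M))  # 5.0
    ```

    Metric learning methods of this family search over M (or a factorization of it) so that same-class points end up close and different-class points far apart.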

  15. Projects of SOS FAIM

    Directory of Open Access Journals (Sweden)

    Mees, M.

    1985-01-01

    Full Text Available In Ivory Coast, freshwater fishculture in rural areas is mainly small-scale. This type of breeding in ponds (2 to 4 ares) yields on average 3 metric tons of fish/ha/year and is essentially a subsistence activity, with self-consumption of the products. The yield of intensive pond fishculture of Tilapia nilotica is on average 6 to 7 metric tons/ha/year, but yields greater than 10 metric tons/ha/year are not uncommon. Intensive fishculture in floating cages, which requires a minor investment but more advanced training than pond fishculture, yields on average about 30 to 40 kg/m3/year. However, the effective development of this activity rests on the resolution of problems such as sufficient fry production, feeding, and commercialization.
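
    The per-pond output implied by these figures is modest, as a quick unit conversion shows (1 are = 0.01 ha; the yields are the record's):

    ```python
    ARE_TO_HA = 0.01  # 1 are = 100 m^2 = 0.01 hectare

    def pond_yield_kg(area_ares, yield_t_per_ha):
        """Annual fish output of one pond, in kilograms."""
        return area_ares * ARE_TO_HA * yield_t_per_ha * 1000

    # Rural ponds of 2 to 4 ares at the quoted 3 t/ha/year:
    print(pond_yield_kg(2, 3), pond_yield_kg(4, 3))  # roughly 60 and 120 kg/year
    ```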

  16. Impression on Agricultural Development in Rwanda

    Directory of Open Access Journals (Sweden)

    Fromen, D.

    1985-01-01

  17. Multimetric indices: How many metrics?

    Science.gov (United States)

    Multimetric indices (MMIs) often include 5 to 15 metrics, each representing a different attribute of assemblage condition, such as species diversity, tolerant taxa, and nonnative taxa. Is there an optimal number of metrics for MMIs? To explore this question, I created 1000 9-met...

  18. Biogenic transformed floating macroporous cryogel for uranium sorption from aqueous medium

    International Nuclear Information System (INIS)

    Tripathi, Anuj; Melo, Jose Savio

    2015-01-01

    Radionuclide contamination of the environment affects human health as well as flora and fauna. A recent study suggests that approximately 9000 tons of uranium are delivered to the sea every year by freshwater streams and that approximately 4.58 billion tons of uranium are present in the well-mixed upper surface of the oceans; of this, around 2 billion tons are considered accessible for recovery. Given the need for separation approaches, a wide range of methods such as solvent extraction, ion exchange, complexation and precipitation have been developed for the removal of hazardous substances from water bodies. Adsorption using biomaterials is considered one of the alternative approaches for removal of radionuclides from aqueous streams. The authors' interest is to develop a simple and user-friendly technology for the separation of radionuclides from aqueous subsurfaces.

  19. A ton is not always a ton: A road-test of landfill, manure, and afforestation/reforestation offset protocols in the U.S. carbon market

    International Nuclear Information System (INIS)

    Lee, Carrie M.; Lazarus, Michael; Smith, Gordon R.; Todd, Kimberly; Weitz, Melissa

    2013-01-01

    Highlights: • Protocols are the foundation of an offset program. • Using sample projects, we “road test” landfill, manure and afforestation protocols from 5 programs. • For a given project, we find large variation in the volume of offsets generated. • Harmonization of protocols can increase the likelihood that “a ton is a ton”. • Harmonization can enhance prospects for linking emission trading systems. -- Abstract: The outcome of recent international climate negotiations suggests we are headed toward a more fragmented carbon market, with multiple emission trading and offset programs operating in parallel. To effectively harmonize and link across programs, it will be important to ensure, across offset programs and protocols, that a “ton is a ton”. In this article, we consider how sample offset projects in the U.S. carbon market are treated across protocols from five programs: the Clean Development Mechanism, Climate Action Reserve, Chicago Climate Exchange, Regional Greenhouse Gas Initiative, and the U.S. EPA's former program, Climate Leaders. We find that differences among protocols for landfill methane, manure management, and afforestation/reforestation project types in accounting boundary definitions, baseline setting methods, measurement rules, emission factors, and discounts lead to differences in offsets credited that are often significant (e.g. greater than 50%). We suggest opportunities for modification and harmonization of protocols that can improve offset quality and credibility and enhance prospects for future linking of trading units and systems.

  20. Global agriculture and carbon trade-offs.

    Science.gov (United States)

    Johnson, Justin Andrew; Runge, Carlisle Ford; Senauer, Benjamin; Foley, Jonathan; Polasky, Stephen

    2014-08-26

    Feeding a growing and increasingly affluent world will require expanded agricultural production, which may require converting grasslands and forests into cropland. Such conversions can reduce carbon storage, habitat provision, and other ecosystem services, presenting difficult societal trade-offs. In this paper, we use spatially explicit data on agricultural productivity and carbon storage in a global analysis to find where agricultural extensification should occur to meet growing demand while minimizing carbon emissions from land use change. Selective extensification saves ∼ 6 billion metric tons of carbon compared with a business-as-usual approach, with a value of approximately $1 trillion (2012 US dollars) using recent estimates of the social cost of carbon. This type of spatially explicit geospatial analysis can be expanded to include other ecosystem services and other industries to analyze how to minimize conflicts between economic development and environmental sustainability.
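
    The headline figures imply a per-ton social cost of carbon that can be checked with simple arithmetic (the per-ton value below is derived from the record's two totals, not quoted by the paper):

    ```python
    # Totals from the record (2012 US dollars).
    carbon_saved_t = 6e9     # ~6 billion metric tons of carbon saved
    total_value_usd = 1e12   # ~$1 trillion in avoided social cost

    # Implied social cost per metric ton of carbon.
    implied_scc = total_value_usd / carbon_saved_t
    print(round(implied_scc))  # ~167 USD per metric ton of carbon
    ```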

  1. Summary of the engineering analysis report for the long-term management of depleted uranium hexafluoride

    International Nuclear Information System (INIS)

    Dubrin, J.W.; Rahm-Crites, L.

    1997-09-01

    The Department of Energy (DOE) is reviewing ideas for the long-term management and use of its depleted uranium hexafluoride. DOE owns about 560,000 metric tons (over a billion pounds) of depleted uranium hexafluoride. This material is contained in steel cylinders located in storage yards near Paducah, Kentucky; Portsmouth, Ohio; and at the East Tennessee Technology Park (formerly the K-25 Site) in Oak Ridge, Tennessee. On November 10, 1994, DOE announced its new Depleted Uranium Hexafluoride Management Program by issuing a Request for Recommendations and an Advance Notice of Intent in the Federal Register (59 FR 56324 and 56325). The first part of this program consists of engineering, costs and environmental impact studies. Part one will conclude with the selection of a long-term management plan or strategy. Part two will carry out the selected strategy

  2. Summary of the engineering analysis report for the long-term management of depleted uranium hexafluoride

    Energy Technology Data Exchange (ETDEWEB)

    Dubrin, J.W., Rahm-Crites, L.

    1997-09-01

    The Department of Energy (DOE) is reviewing ideas for the long-term management and use of its depleted uranium hexafluoride. DOE owns about 560,000 metric tons (over a billion pounds) of depleted uranium hexafluoride. This material is contained in steel cylinders located in storage yards near Paducah, Kentucky; Portsmouth, Ohio; and at the East Tennessee Technology Park (formerly the K-25 Site) in Oak Ridge, Tennessee. On November 10, 1994, DOE announced its new Depleted Uranium Hexafluoride Management Program by issuing a Request for Recommendations and an Advance Notice of Intent in the Federal Register (59 FR 56324 and 56325). The first part of this program consists of engineering, costs and environmental impact studies. Part one will conclude with the selection of a long-term management plan or strategy. Part two will carry out the selected strategy.

  3. Metrics for Polyphonic Sound Event Detection

    Directory of Open Access Journals (Sweden)

    Annamaria Mesaros

    2016-05-01

    Full Text Available This paper presents and discusses various metrics proposed for evaluation of polyphonic sound event detection systems used in realistic situations where there are typically multiple sound sources active simultaneously. The system output in this case contains overlapping events, marked as multiple sounds detected as being active at the same time. The polyphonic system output requires a suitable procedure for evaluation against a reference. Metrics from neighboring fields such as speech recognition and speaker diarization can be used, but they need to be partially redefined to deal with the overlapping events. We present a review of the most common metrics in the field and the way they are adapted and interpreted in the polyphonic case. We discuss segment-based and event-based definitions of each metric and explain the consequences of instance-based and class-based averaging using a case study. In parallel, we provide a toolbox containing implementations of presented metrics.
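
    The segment-based idea can be sketched as follows: reference and system output are compared as binary activity grids over (segment, class) cells, so simultaneously active (polyphonic) events are handled naturally. This is a simplified illustration only, not the authors' toolbox implementation:

    ```python
    import numpy as np

    def segment_f1(ref, est):
        """Segment-based F1 over a (segments x classes) binary activity grid."""
        ref, est = np.asarray(ref, bool), np.asarray(est, bool)
        tp = np.sum(ref & est)    # active in both reference and output
        fp = np.sum(~ref & est)   # output activity with no reference support
        fn = np.sum(ref & ~est)   # reference activity the system missed
        return 2 * tp / (2 * tp + fp + fn)

    # 3 segments, 2 event classes; segment 1 has two classes active at once.
    ref = [[1, 1], [1, 0], [0, 1]]
    est = [[1, 1], [0, 0], [0, 1]]
    print(segment_f1(ref, est))  # ~0.857 (tp=3, fp=0, fn=1)
    ```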

  4. Robustness Metrics: Consolidating the multiple approaches to quantify Robustness

    DEFF Research Database (Denmark)

    Göhler, Simon Moritz; Eifler, Tobias; Howard, Thomas J.

    2016-01-01

    robustness metrics; 3) Functional expectancy and dispersion robustness metrics; and 4) Probability of conformance robustness metrics. The goal was to give a comprehensive overview of robustness metrics and guidance to scholars and practitioners to understand the different types of robustness metrics...

  5. Common Metrics for Human-Robot Interaction

    Science.gov (United States)

    Steinfeld, Aaron; Lewis, Michael; Fong, Terrence; Scholtz, Jean; Schultz, Alan; Kaber, David; Goodrich, Michael

    2006-01-01

    This paper describes an effort to identify common metrics for task-oriented human-robot interaction (HRI). We begin by discussing the need for a toolkit of HRI metrics. We then describe the framework of our work and identify important biasing factors that must be taken into consideration. Finally, we present suggested common metrics for standardization and a case study. Preparation of a larger, more detailed toolkit is in progress.

  6. Narrowing the Gap Between QoS Metrics and Web QoE Using Above-the-fold Metrics

    OpenAIRE

    da Hora, Diego Neves; Asrese, Alemnew; Christophides, Vassilis; Teixeira, Renata; Rossi, Dario

    2018-01-01

    International audience; Page load time (PLT) is still the most common application Quality of Service (QoS) metric to estimate the Quality of Experience (QoE) of Web users. Yet, recent literature abounds with proposals for alternative metrics (e.g., Above The Fold, SpeedIndex and variants) that aim at better estimating user QoE. The main purpose of this work is thus to thoroughly investigate a mapping between established and recently proposed objective metrics and user QoE. We obtain ground tr...

  7. The impacts of policy mix for resolving overcapacity in heavy chemical industry and operating national carbon emission trading market in China

    International Nuclear Information System (INIS)

    Li, Wei; Lu, Can; Ding, Yi; Zhang, Yan-Wu

    2017-01-01

    Highlights: • A STIRPAT-embedded dynamic CGE model is utilized to evaluate the overall impact. • Economy and trade increase slightly under the scenario shocks. • The global carbon emission reduction rate ranges from 3.33% to 7.46%. • Carbon emissions peak in 2022, 2024, or 2026 across the simulated scenarios. • Energy intensity decreases by 19.58–23.71% by 2020 in contrast with 2015. -- Abstract: In order to reduce greenhouse gas emissions efficiently and reach the carbon emission peak ahead of 2030, a variety of policy-based interventions grounded in optimizing the energy structure and boosting emission mitigation have been put forward, targeting carbon- and resource-intensive enterprises across China. Both resolving overcapacity in heavy chemical industry and constructing a national carbon trading market have recently taken on a stronger significance. A STIRPAT (Stochastic Impacts by Regression on Population, Affluence, and Technology) embedded dynamic CGE (computable general equilibrium) model is applied in this study to evaluate the simulated effects on China's economy, energy, and household lifestyle. We devise nine scenarios in terms of the two aforementioned mitigation strategies. The results indicate that the optimal policy mix, which best balances economic improvement, energy mix readjustment, and emission reduction, is found to be reducing heavy chemical industry capacity at an annual average rate of 3%, 1%, and 1%, setting the carbon price at 5.8, 11.6, and 14.5 dollars/ton, and distributing annual carbon allowances of 3.5, 7, and 9 billion tons during 2017–2020, 2021–2025, and 2026–2030, respectively.

  8. Ubiquitous Supercritical Wing Design Cuts Billions in Fuel Costs

    Science.gov (United States)

    2015-01-01

    A Langley Research Center engineer’s work in the 1960s and ’70s to develop a wing with better performance near the speed of sound resulted in a significant increase in subsonic efficiency. The design was shared with industry. Today, Renton, Washington-based Boeing Commercial Airplanes, as well as most other plane manufacturers, apply it to all their aircraft, saving the airline industry billions of dollars in fuel every year.

  9. Ants at Ton Nga Chang Wildlife Sanctuary, Songkhla

    Directory of Open Access Journals (Sweden)

    Watanasit, S.

    2005-03-01

    Full Text Available The aim of this study was to investigate the diversity of ants at Ton Nga Chang Wildlife Sanctuary, Hat Yai, Songkhla. Three line transects (100 m each) were randomly set up in 2 types of forest area, disturbed and undisturbed. Hand collecting (HC) and leaf litter sampling (LL) were applied for ant collection, within a time limit of 30 minutes for each method. This study was carried out every month during February 2002–February 2003. The results showed that 206 species were placed under 8 subfamilies: Aenictinae, Cerapachyinae, Dolichoderinae, Formicinae, Leptanillinae, Myrmicinae, Ponerinae and Pseudomyrmecinae. Study sites and collection methods could divide ant species into 2 groups, whereas seasonal change could not, as shown by DCA in multivariate analysis.

  10. Factor structure of the Tomimatsu-Sato metrics

    International Nuclear Information System (INIS)

    Perjes, Z.

    1989-02-01

    Based on an earlier result stating that δ = 3 Tomimatsu-Sato (TS) metrics can be factored over the field of integers, an analogous representation for higher TS metrics was sought. It is shown that the factoring property of TS metrics follows from the structure of special Hankel determinants. A set of linear algebraic equations determining the factors was defined, and the factors of the first five TS metrics were tabulated, together with their primitive factors. (R.P.) 4 refs.; 2 tabs

  11. Energy production for environmental issues in Turkey

    Science.gov (United States)

    Yuksel, Ibrahim; Arman, Hasan; Halil Demirel, Ibrahim

    2017-11-01

    Due to efforts to diversify energy sources, use of natural gas, newly introduced into the Turkish economy, has been growing rapidly. Turkey has large reserves of coal, particularly of lignite: proven lignite reserves are 8.0 billion tons, and estimated total possible reserves are 30 billion tons. Turkey, with its young population, growing energy demand per person, fast-growing urbanization, and economic development, has been one of the fastest-growing power markets of the world for the last two decades. It is expected that the demand for electric energy in Turkey will be 580 billion kWh by the year 2020. Turkey's electric energy demand is growing by about 6-8% yearly due to rapid economic growth. This paper deals with energy demand and consumption for environmental issues in Turkey.
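
    At the quoted 6-8% annual growth, electricity demand doubles roughly every 9 to 12 years, which can be verified directly with the rule for compound growth:

    ```python
    import math

    def doubling_time(rate):
        """Years for demand to double at a constant annual growth rate."""
        return math.log(2) / math.log(1 + rate)

    # The record's 6-8% yearly growth range.
    for r in (0.06, 0.08):
        print(f"{r:.0%}: {doubling_time(r):.1f} years")  # 11.9 and 9.0 years
    ```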

  12. ST-intuitionistic fuzzy metric space with properties

    Science.gov (United States)

    Arora, Sahil; Kumar, Tanuj

    2017-07-01

    In this paper, we define ST-intuitionistic fuzzy metric space, and the notions of convergence and completeness of Cauchy sequences are studied. Further, we prove some properties of ST-intuitionistic fuzzy metric spaces. Finally, we introduce the concept of symmetric ST-intuitionistic fuzzy metric space.

  13. Community access networks: how to connect the next billion to the ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Community access networks: how to connect the next billion to the Internet ... services is a prerequisite to sustainable socio-economic development. ... It will provide case studies and formulate recommendations with respect to ... An IDRC delegation will join international delegates and city representatives at the ICLEI World ...

  14. Interactive (statistical) visualisation and exploration of a billion objects with vaex

    Science.gov (United States)

    Breddels, M. A.

    2017-06-01

    With new catalogues arriving such as the Gaia DR1, containing more than a billion objects, new methods of handling and visualizing these data volumes are needed. We show that by calculating statistics on a regular (N-dimensional) grid, visualizations of a billion objects can be done within a second on a modern desktop computer. This is achieved using memory mapping of hdf5 files together with a simple binning algorithm, which are part of a Python library called vaex. This enables efficient interactive exploration of large datasets, making science exploration of large catalogues feasible. Vaex is a Python library and an application which allows for interactive exploration and visualization. The motivation for developing vaex is the catalogue of the Gaia satellite; however, vaex can also be used on SPH or N-body simulations, any other (future) catalogues such as SDSS, Pan-STARRS, LSST, etc., or other tabular data. The homepage for vaex is http://vaex.astro.rug.nl.
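
    The core idea, computing statistics on a regular grid rather than drawing individual points, can be sketched with plain NumPy (this illustrates the binning step only, not vaex's own API; vaex adds memory-mapped hdf5 I/O and interactivity on top):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x, y = rng.normal(size=(2, 100_000))  # stand-in for two catalogue columns

    # Bin counts on a 256x256 grid: one density image summarizes all points,
    # so render cost depends on the grid size, not the number of objects.
    counts, xedges, yedges = np.histogram2d(x, y, bins=256)

    print(counts.shape, int(counts.sum()))  # (256, 256) 100000
    ```

    The same grid can hold other statistics (means, sums per cell), which is what makes interactive statistical visualization of a billion rows feasible.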

  15. Structure-activity correlations for TON, FER and MOR in the hydroisomerization of n-butane

    NARCIS (Netherlands)

    Pieterse, J.A.Z.; Seshan, Kulathuiyer; Lercher, J.A.

    2000-01-01

    n-Butane hydroconversion was studied over (Pt-loaded) molecular sieves with TON, FER, and MOR morphology. The conversion occurs via a complex interplay of mono- and bimolecular bifunctional acid mechanism and monofunctional platinum-catalyzed hydrogenolysis. Hydroisomerization occurs bimolecularly

  16. Pragmatic security metrics applying metametrics to information security

    CERN Document Server

    Brotby, W Krag

    2013-01-01

    Other books on information security metrics discuss number theory and statistics in academic terms. Light on mathematics and heavy on utility, PRAGMATIC Security Metrics: Applying Metametrics to Information Security breaks the mold. This is the ultimate how-to-do-it guide for security metrics.Packed with time-saving tips, the book offers easy-to-follow guidance for those struggling with security metrics. Step by step, it clearly explains how to specify, develop, use, and maintain an information security measurement system (a comprehensive suite of metrics) to

  17. One billion cubic meters of gas produced in Kikinda area

    Energy Technology Data Exchange (ETDEWEB)

    Vicicevic, M; Duric, N

    1969-10-01

The Kikinda gas reservoir has just passed a milestone, producing one billion cubic meters of natural gas. The reservoir was discovered in 1962, and its present production amounts to 26 million cu m. One of the biggest problems was the formation of hydrates, which was successfully solved by using methanol. Four tables show production statistics by year and by productive formation.

  18. Defining a Progress Metric for CERT RMM Improvement

    Science.gov (United States)

    2017-09-14

Defining a Progress Metric for CERT-RMM Improvement. Gregory Crabb, Nader Mehravari, David Tobar. September 2017. Technical report excerpt: …fendable resource allocation decisions. Technical metrics measure aspects of controls implemented through technology (systems, software, hardware). …implementation metric would be the percentage of users who have received anti-phishing training. • Effectiveness/efficiency metrics measure whether…

  19. IT Project Management Metrics

    Directory of Open Access Journals (Sweden)

    2007-01-01

Full Text Available Many software and IT projects fail to meet their objectives for a variety of causes, among which the management of the projects carries considerable weight. For projects to succeed, lessons learned have to be used, historical data collected, and metrics and indicators computed and compared with those of past projects in order to avoid failure. This paper presents some metrics that can be used for IT project management.

  20. Mass Customization Measurements Metrics

    DEFF Research Database (Denmark)

    Nielsen, Kjeld; Brunø, Thomas Ditlev; Jørgensen, Kaj Asbjørn

    2014-01-01

A recent survey has indicated that 17 % of companies have ceased mass customizing less than 1 year after initiating the effort. This paper presents measurements for a company’s mass customization performance, utilizing metrics within the three fundamental capabilities: robust process design, choice navigation, and solution space development. A mass customizer, when assessing performance with these metrics, can identify within which areas improvement would increase competitiveness the most and enable a more efficient transition to mass customization.

  1. Metrical Phonology: German Sound System.

    Science.gov (United States)

    Tice, Bradley S.

    Metrical phonology, a linguistic process of phonological stress assessment and diagrammatic simplification of sentence and word stress, is discussed as it is found in the English and German languages. The objective is to promote use of metrical phonology as a tool for enhancing instruction in stress patterns in words and sentences, particularly in…

  2. The design of steel string crane with lifting capacity 10 tons

    International Nuclear Information System (INIS)

    Syamsurrijal Ramdja

    2007-01-01

The steel wire rope (sling) used in the lifting mechanism of a 10-ton overhead travelling crane is designed. Compared with other rope types, steel wire rope offers several advantages. In this design, the selection of the rope type is the primary decision, and the safety factor is the governing criterion, in accordance with current standards. The resulting design gives the specification and service life of the steel wire rope. (author)

  3. Construction of Einstein-Sasaki metrics in D≥7

    International Nuclear Information System (INIS)

    Lue, H.; Pope, C. N.; Vazquez-Poritz, J. F.

    2007-01-01

We construct explicit Einstein-Kaehler metrics in all even dimensions D=2n+4≥6, in terms of a 2n-dimensional Einstein-Kaehler base metric. These are cohomogeneity-2 metrics which have the new feature of including a NUT-type parameter, or gravomagnetic charge, in addition to mass and rotation parameters. Using a canonical construction, these metrics all yield Einstein-Sasaki metrics in dimensions D=2n+5≥7. As is commonly the case in this type of construction, for suitable choices of the free parameters the Einstein-Sasaki metrics can extend smoothly onto complete and nonsingular manifolds, even though the underlying Einstein-Kaehler metric has conical singularities. We discuss some explicit examples in the case of seven-dimensional Einstein-Sasaki spaces. These new spaces can provide supersymmetric backgrounds in M-theory, which play a role in the AdS4/CFT3 correspondence.

  4. National Metrical Types in Nineteenth Century Art Song

    Directory of Open Access Journals (Sweden)

    Leigh VanHandel

    2010-01-01

    Full Text Available William Rothstein’s article “National metrical types in music of the eighteenth and early nineteenth centuries” (2008 proposes a distinction between the metrical habits of 18th and early 19th century German music and those of Italian and French music of that period. Based on theoretical treatises and compositional practice, he outlines these national metrical types and discusses the characteristics of each type. This paper presents the results of a study designed to determine whether, and to what degree, Rothstein’s characterizations of national metrical types are present in 19th century French and German art song. Studying metrical habits in this genre may provide a lens into changing metrical conceptions of 19th century theorists and composers, as well as to the metrical habits and compositional style of individual 19th century French and German art song composers.

  5. A Metric on Phylogenetic Tree Shapes.

    Science.gov (United States)

    Colijn, C; Plazzotta, G

    2018-01-01

    The shapes of evolutionary trees are influenced by the nature of the evolutionary process but comparisons of trees from different processes are hindered by the challenge of completely describing tree shape. We present a full characterization of the shapes of rooted branching trees in a form that lends itself to natural tree comparisons. We use this characterization to define a metric, in the sense of a true distance function, on tree shapes. The metric distinguishes trees from random models known to produce different tree shapes. It separates trees derived from tropical versus USA influenza A sequences, which reflect the differing epidemiology of tropical and seasonal flu. We describe several metrics based on the same core characterization, and illustrate how to extend the metric to incorporate trees' branch lengths or other features such as overall imbalance. Our approach allows us to construct addition and multiplication on trees, and to create a convex metric on tree shapes which formally allows computation of average tree shapes. © The Author(s) 2017. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
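A sketch of the kind of characterization described above, assuming a recursive integer labelling of rooted binary tree shapes (leaf = 1; an internal node whose children carry labels j >= k gets j(j-1)/2 + k + 1). Because such a labelling is a bijection between shapes and positive integers, pulling the usual distance on integers back through it yields one simple, if coarse, true metric on shapes; this is an illustration, not the paper's full construction:

```python
# Nested tuples encode rooted binary tree shapes: a leaf is None,
# an internal node is a pair of subtrees.
def shape_label(tree):
    """Map a rooted binary tree shape to a positive integer.

    A leaf gets label 1; an internal node whose children carry labels
    j >= k gets j*(j-1)//2 + k + 1. Equal labels mean equal shapes.
    """
    if tree is None:
        return 1
    j, k = sorted((shape_label(tree[0]), shape_label(tree[1])), reverse=True)
    return j * (j - 1) // 2 + k + 1

def shape_distance(t1, t2):
    # Pulling the distance on integers back through the bijection
    # gives a (coarse) true metric on tree shapes.
    return abs(shape_label(t1) - shape_label(t2))

leaf = None
cherry = (leaf, leaf)          # two leaves joined: label 2
caterpillar3 = (cherry, leaf)  # children labelled 2 and 1: label 3
balanced4 = (cherry, cherry)   # children labelled 2 and 2: label 4
print(shape_distance(caterpillar3, balanced4))
```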

  6. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of software development can be measured. If software metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects and that are not currently being used by the SA team, and to report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  7. Spatial variability in oceanic redox structure 1.8 billion years ago

    DEFF Research Database (Denmark)

    Poulton, Simon W.; Fralick, Philip W.; Canfield, Donald Eugene

    2010-01-01

    to reconstruct oceanic redox conditions from the 1.88- to 1.83-billion-year-old Animikie group from the Superior region, North America. We find that surface waters were oxygenated, whereas at mid-depths, anoxic and sulphidic (euxinic) conditions extended over 100 km from the palaeoshoreline. The spatial extent...

  8. Degraded visual environment image/video quality metrics

    Science.gov (United States)

    Baumgartner, Dustin D.; Brown, Jeremy B.; Jacobs, Eddie L.; Schachter, Bruce J.

    2014-06-01

    A number of image quality metrics (IQMs) and video quality metrics (VQMs) have been proposed in the literature for evaluating techniques and systems for mitigating degraded visual environments. Some require both pristine and corrupted imagery. Others require patterned target boards in the scene. None of these metrics relates well to the task of landing a helicopter in conditions such as a brownout dust cloud. We have developed and used a variety of IQMs and VQMs related to the pilot's ability to detect hazards in the scene and to maintain situational awareness. Some of these metrics can be made agnostic to sensor type. Not only are the metrics suitable for evaluating algorithm and sensor variation, they are also suitable for choosing the most cost effective solution to improve operating conditions in degraded visual environments.

  9. Reserves, the extraction of petroleum and the number of wells in the countries of the world

    Energy Technology Data Exchange (ETDEWEB)

    Pluzhnikov, B I

    1981-01-01

The greatest percentage increases in proven petroleum reserves at the beginning of 1980 were noted in the Philippines (158.8%), the Netherlands (100%), France (68.3%), Oman (47.4%), and Bolivia. Decreases in petroleum reserves were noted in Morocco (-40.5%), Israel (-20.3%), Japan (-15.6%), and so forth. Proven reserves of petroleum, in billions of tons, are as follows: Saudi Arabia, 23; Kuwait, 10; Iran, 5.5; Mexico, 4.5. The extraction of oil in the first half of 1980 amounted to 243 million tons in Saudi Arabia; 213 million tons in the United States; 86 million tons in Iraq; 55 million tons in Venezuela; 53 million tons in Nigeria; 52 million tons in Iraq; 47 million tons in Kuwait; 45 million tons in Libya; and 40 million tons in Great Britain.

  10. The Jacobi metric for timelike geodesics in static spacetimes

    Science.gov (United States)

    Gibbons, G. W.

    2016-01-01

It is shown that the free motion of massive particles moving in static spacetimes is given by the geodesics of an energy-dependent Riemannian metric on the spatial sections, analogous to Jacobi's metric in classical dynamics. In the massless limit, Jacobi's metric coincides with the energy-independent Fermat or optical metric. For stationary metrics, it is known that the motion of massless particles is given by the geodesics of an energy-independent Finslerian metric of Randers type. The motion of massive particles is governed by neither a Riemannian nor a Finslerian metric. The properties of the Jacobi metric for massive particles moving outside the horizon of a Schwarzschild black hole are described. By contrast with the massless case, the Gaussian curvature of the equatorial sections is not always negative.
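The construction summarized above can be written out explicitly. Assuming the standard form for a static spacetime with lapse V depending only on the spatial coordinates (as in Gibbons' paper), a sketch:

```latex
% Static spacetime:
ds^2 = -V^2\, dt^2 + g_{ij}\, dx^i dx^j .
% Timelike geodesics of a particle of mass m and conserved energy E are,
% on the spatial sections, geodesics of the energy-dependent Jacobi metric
J_{ij} = \frac{E^2 - m^2 V^2}{V^2}\, g_{ij} ,
% which for m \to 0 reduces to E^2 g_{ij}/V^2, i.e. the energy-independent
% Fermat (optical) metric g_{ij}/V^2 up to the constant factor E^2.
```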

  11. Relaxed metrics and indistinguishability operators: the relationship

    Energy Technology Data Exchange (ETDEWEB)

    Martin, J.

    2017-07-01

In 1982, the notion of indistinguishability operator was introduced by E. Trillas in order to fuzzify the crisp notion of equivalence relation (\cite{Trillas}). In the study of such a class of operators, an outstanding property must be pointed out. Concretely, there exists a duality relationship between indistinguishability operators and metrics. The aforesaid relationship was deeply studied by several authors, who introduced a few techniques to generate metrics from indistinguishability operators and vice versa (see, for instance, \cite{BaetsMesiar,BaetsMesiar2}). In recent years a new generalization of the metric notion has been introduced in the literature with the purpose of developing mathematical tools for quantitative models in Computer Science and Artificial Intelligence (\cite{BKMatthews,Ma}). The aforementioned generalized metrics are known as relaxed metrics. The main target of this talk is to present a study of the duality relationship between indistinguishability operators and relaxed metrics in such a way that the aforementioned classical techniques to generate both concepts, one from the other, can be extended to the new framework. (Author)
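One classical instance of the duality mentioned above, a standard textbook fact rather than anything specific to this talk: for the Łukasiewicz t-norm T(a, b) = max(a + b - 1, 0), if d is a metric bounded by 1 then E = 1 - d is a reflexive, symmetric, T-transitive indistinguishability operator. A small numerical check:

```python
import itertools

# Points on a line with the bounded metric d(x, y) = min(1, |x - y|).
pts = [0.0, 0.3, 0.7, 1.5, 2.0]
d = lambda x, y: min(1.0, abs(x - y))

# Dual operator E = 1 - d, and the Lukasiewicz t-norm T.
E = lambda x, y: 1.0 - d(x, y)
T = lambda a, b: max(a + b - 1.0, 0.0)

# Verify the indistinguishability axioms on all triples of points.
for x, y, z in itertools.product(pts, repeat=3):
    assert E(x, x) == 1.0                            # reflexivity
    assert E(x, y) == E(y, x)                        # symmetry
    assert E(x, z) >= T(E(x, y), E(y, z)) - 1e-12    # T-transitivity
print("E = 1 - d is a Lukasiewicz indistinguishability operator on these points")
```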

  12. Classification in medical images using adaptive metric k-NN

    Science.gov (United States)

    Chen, C.; Chernoff, K.; Karemore, G.; Lo, P.; Nielsen, M.; Lauze, F.

    2010-03-01

The performance of the k-nearest neighbors (k-NN) classifier is highly dependent on the distance metric used to identify the k nearest neighbors of the query points. The standard Euclidean distance is commonly used in practice. This paper investigates the performance of the k-NN classifier with respect to different adaptive metrics in the context of medical imaging. We propose using adaptive metrics such that the structure of the data is better described, introducing some unsupervised learning knowledge into k-NN. Four different metrics are estimated: a theoretical metric based on the assumption that images are drawn from the Brownian Image Model (BIM); a normalized metric based on the variance of the data; an empirical metric based on the empirical covariance matrix of the unlabeled data; and an optimized metric obtained by minimizing the classification error. The spectral structure of the empirical covariance also lends itself to Principal Component Analysis (PCA), which yields subspace metrics. The metrics are evaluated on two data sets: lateral X-rays of the lumbar aortic/spine region, where we use k-NN for abdominal aorta calcification detection; and mammograms, where we use k-NN for breast cancer risk assessment. The results show that an appropriate choice of metric can improve classification.
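A minimal sketch of one of the adaptive choices mentioned, the empirical metric derived from the covariance of the data, plugged into a plain k-NN classifier. This is illustrative numpy code on toy data, not the paper's implementation:

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3, VI=None):
    """k-NN with the quadratic-form distance d(a, b) = (a-b)^T VI (a-b).

    VI = identity recovers squared Euclidean distance; VI = inverse
    empirical covariance gives a Mahalanobis-style 'empirical' metric.
    """
    if VI is None:
        VI = np.eye(X_train.shape[1])
    preds = []
    for q in X_test:
        diff = X_train - q
        dist = np.einsum('ij,jk,ik->i', diff, VI, diff)  # per-row quadratic form
        nearest = np.argsort(dist)[:k]
        preds.append(np.bincount(y_train[nearest]).argmax())  # majority vote
    return np.array(preds)

# Toy data: two elongated Gaussian classes where a covariance-aware
# metric is better aligned with the data structure than Euclidean.
rng = np.random.default_rng(1)
cov = np.array([[4.0, 0.0], [0.0, 0.1]])
X = np.vstack([rng.multivariate_normal([0, 0], cov, size=100),
               rng.multivariate_normal([0, 1], cov, size=100)])
y = np.array([0] * 100 + [1] * 100)

VI = np.linalg.inv(np.cov(X, rowvar=False))   # empirical metric
pred = knn_predict(X, y, X, k=5, VI=VI)
print("training accuracy:", (pred == y).mean())
```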

  13. Measurable Control System Security through Ideal Driven Technical Metrics

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Sean McBride; Marie Farrar; Zachary Tudor

    2008-01-01

The Department of Homeland Security National Cyber Security Division supported development of a small set of security ideals as a framework to establish measurable control systems security. Based on these ideals, a draft set of proposed technical metrics was developed to allow control systems owner-operators to track improvements or degradations in their individual control systems security posture. The technical metrics development effort included review and evaluation of over thirty metrics-related documents. On the basis of complexity, ambiguity, or misleading and distorting effects, the metrics identified during the reviews were determined to be weaker than necessary to aid defense against the myriad threats posed by cyber-terrorism to human safety, as well as to economic prosperity. Using the results of our metrics review and the set of security ideals as a starting point for metrics development, we identified thirteen potential technical metrics, with at least one metric supporting each ideal. Two case study applications of the ideals and thirteen metrics to control systems were then performed to establish potential difficulties in applying both the ideals and the metrics. The case studies resulted in no changes to the ideals, and only a few deletions and refinements to the thirteen potential metrics. This led to a final proposed set of ten core technical metrics. To further validate the security ideals, the modifications made to the original thirteen potential metrics, and the final proposed set of ten core metrics, seven separate control systems security assessments performed over the past three years were reviewed for findings and recommended mitigations. These findings and mitigations were then mapped to the security ideals and metrics to assess gaps in their coverage. The mappings indicated that there are no gaps in the security ideals and that the ten core technical metrics provide significant coverage of standard security issues, with 87% coverage.

  14. Experiential space is hardly metric

    Czech Academy of Sciences Publication Activity Database

    Šikl, Radovan; Šimeček, Michal; Lukavský, Jiří

    2008-01-01

Vol. 2008, No. 37 (2008), pp. 58-58. ISSN 0301-0066. [European Conference on Visual Perception, 24.08-28.08.2008, Utrecht] R&D Projects: GA ČR GA406/07/1676 Institutional research plan: CEZ:AV0Z70250504 Keywords: visual space perception * metric and non-metric perceptual judgments * ecological validity Subject RIV: AN - Psychology

  15. High resolution metric imaging payload

    Science.gov (United States)

    Delclaud, Y.

    2017-11-01

Alcatel Space Industries has become Europe's leader in the field of high and very high resolution optical payloads, within the framework of Earth observation systems able to provide military and government users with metric images from space. This leadership has allowed Alcatel to propose for the export market, within a French collaboration framework, a complete space-based system for metric observation.

  16. Coal; Le charbon

    Energy Technology Data Exchange (ETDEWEB)

    Teissie, J.; Bourgogne, D. de; Bautin, F. [TotalFinaElf, La Defense, 92 - Courbevoie (France)

    2001-12-15

World coal production represents 3.5 billion tons, plus 900 million tons of lignite. 50% of coal is used for power generation, 16% by the steel-making industry, 5% by cement plants, and 29% for space heating and by other industries such as carbo-chemistry. Coal reserves are enormous, about 1,000 billion tons (i.e. 250 years of consumption at the present-day rate), but their exploitation will be in competition with less costly and less polluting energy sources. This document treats all aspects of coal: origin, composition, calorific value, classification, resources, reserves, production, international trade, sectoral consumption, cost, retail price, safety aspects of coal mining, environmental impacts (solid and gaseous effluents), different technologies of coal-fired power plants and their relative efficiency, and alternative solutions for the recovery of coal energy (fuel cells, liquefaction). (J.S.)

  17. Coal

    International Nuclear Information System (INIS)

    Teissie, J.; Bourgogne, D. de; Bautin, F.

    2001-12-01

World coal production represents 3.5 billion tons, plus 900 million tons of lignite. 50% of coal is used for power generation, 16% by the steel-making industry, 5% by cement plants, and 29% for space heating and by other industries such as carbo-chemistry. Coal reserves are enormous, about 1,000 billion tons (i.e. 250 years of consumption at the present-day rate), but their exploitation will be in competition with less costly and less polluting energy sources. This document treats all aspects of coal: origin, composition, calorific value, classification, resources, reserves, production, international trade, sectoral consumption, cost, retail price, safety aspects of coal mining, environmental impacts (solid and gaseous effluents), different technologies of coal-fired power plants and their relative efficiency, and alternative solutions for the recovery of coal energy (fuel cells, liquefaction). (J.S.)

  18. Gaia: Science with 1 billion objects in three dimensions

    Science.gov (United States)

    Prusti, Timo

    2018-02-01

    Gaia is an operational satellite in the ESA science programme. It is gathering data for more than a billion objects. Gaia measures positions and motions of stars in our Milky Way Galaxy, but captures many asteroids and extragalactic sources as well. The first data release has already been made and exploitation by the world-wide scientific community is underway. Further data releases will be made with further increasing accuracy. Gaia is well underway to provide its promised set of fundamental astronomical data.

  19. Smart Grid Status and Metrics Report Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Balducci, Patrick J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Antonopoulos, Chrissi A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clements, Samuel L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gorrissen, Willy J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kirkham, Harold [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ruiz, Kathleen A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, David L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Weimar, Mark R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gardner, Chris [APQC, Houston, TX (United States); Varney, Jeff [APQC, Houston, TX (United States)

    2014-07-01

    A smart grid uses digital power control and communication technology to improve the reliability, security, flexibility, and efficiency of the electric system, from large generation through the delivery systems to electricity consumers and a growing number of distributed generation and storage resources. To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. The Smart Grid Status and Metrics Report defines and examines 21 metrics that collectively provide insight into the grid’s capacity to embody these characteristics. This appendix presents papers covering each of the 21 metrics identified in Section 2.1 of the Smart Grid Status and Metrics Report. These metric papers were prepared in advance of the main body of the report and collectively form its informational backbone.

  20. Implications of Metric Choice for Common Applications of Readmission Metrics

    OpenAIRE

    Davies, Sheryl; Saynina, Olga; Schultz, Ellen; McDonald, Kathryn M; Baker, Laurence C

    2013-01-01

    Objective. To quantify the differential impact on hospital performance of three readmission metrics: all-cause readmission (ACR), 3M Potential Preventable Readmission (PPR), and Centers for Medicare and Medicaid 30-day readmission (CMS).

  1. Ici, la priorité est aux piétons / Pedestrians have the right of way at CERN

    CERN Multimedia

    2002-01-01

At CERN, we are all pedestrians, very often drivers, and sometimes cyclists. But our means of locomotion matters little as long as we remain vigilant and remember that a pedestrian is a road user in his own right, but a more vulnerable one.

  2. Prognostic Performance Metrics

    Data.gov (United States)

    National Aeronautics and Space Administration — This chapter presents several performance metrics for offline evaluation of prognostics algorithms. A brief overview of different methods employed for performance...

  3. Adaptive metric kernel regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    2000-01-01

Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This makes it possible to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms…

  4. The contemporary cement cycle of the United States

    Science.gov (United States)

    Kapur, A.; Van Oss, H. G.; Keoleian, G.; Kesler, S.E.; Kendall, A.

    2009-01-01

A country-level stock and flow model for cement, an important construction material, was developed based on a material flow analysis framework. Using this model, the contemporary cement cycle of the United States was constructed by analyzing production, import, and export data for different stages of the cement cycle. The United States currently supplies approximately 80% of its cement consumption through domestic production and the rest is imported. The average annual net addition of in-use new cement stock over the period 2000-2004 was approximately 83 million metric tons and amounts to 2.3 tons per capita of concrete. Nonfuel carbon dioxide emissions (42 million metric tons per year) from the calcination phase of cement manufacture account for 62% of the total 68 million tons per year of cement production residues. The end-of-life cement discards are estimated to be 33 million metric tons per year, of which between 30% and 80% is recycled. A significant portion of the infrastructure in the United States is reaching the end of its useful life and will need to be replaced or rehabilitated; this could require far more cement than might be expected from economic forecasts of demand for cement. © 2009 Springer Japan.
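The residue share quoted above is a one-line arithmetic check, using the figures given in the abstract (million metric tons per year):

```python
# Figures taken from the abstract above (million metric tons per year).
calcination_co2 = 42.0   # nonfuel CO2 from the calcination phase
total_residues = 68.0    # total cement-production residues

share = calcination_co2 / total_residues
print(f"calcination share of residues: {share:.0%}")  # rounds to the 62% stated
```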

  5. Universities Report $1.8-Billion in Earnings on Inventions in 2011

    Science.gov (United States)

    Blumenstyk, Goldie

    2012-01-01

    Universities and their inventors earned more than $1.8-billion from commercializing their academic research in the 2011 fiscal year, collecting royalties from new breeds of wheat, from a new drug for the treatment of HIV, and from longstanding arrangements over enduring products like Gatorade. Northwestern University earned the most of any…

  6. Determining the potential benefits for the freight carriage by road in Spain facing an increase in vehicles gvm 40 to 44 tons

    Energy Technology Data Exchange (ETDEWEB)

    Martinez Reguero, A.H.; Campos Cacheda, J.M.

    2016-07-01

A very significant percentage of the goods shipped by road in Spain are carried in heavy goods vehicles (HGVs) of 40 tons GVM (gross vehicle mass). Any change aimed at increasing productivity in that vehicle category would benefit the road freight transport market very positively, by lowering transport costs, decreasing environmental costs, rationalizing the sector, and improving the logistics market. This paper therefore discusses the improvement derived from moving HGVs that currently have a limitation of 40 tons GVM to a new limit of 44 tons GVM, establishing the potential benefits that would follow from the change. (Author)

  7. Energy-Based Metrics for Arthroscopic Skills Assessment.

    Science.gov (United States)

    Poursartip, Behnaz; LeBel, Marie-Eve; McCracken, Laura C; Escoto, Abelardo; Patel, Rajni V; Naish, Michael D; Trejos, Ana Luisa

    2017-08-05

    Minimally invasive skills assessment methods are essential in developing efficient surgical simulators and implementing consistent skills evaluation. Although numerous methods have been investigated in the literature, there is still a need to further improve the accuracy of surgical skills assessment. Energy expenditure can be an indication of motor skills proficiency. The goals of this study are to develop objective metrics based on energy expenditure, normalize these metrics, and investigate classifying trainees using these metrics. To this end, different forms of energy consisting of mechanical energy and work were considered and their values were divided by the related value of an ideal performance to develop normalized metrics. These metrics were used as inputs for various machine learning algorithms including support vector machines (SVM) and neural networks (NNs) for classification. The accuracy of the combination of the normalized energy-based metrics with these classifiers was evaluated through a leave-one-subject-out cross-validation. The proposed method was validated using 26 subjects at two experience levels (novices and experts) in three arthroscopic tasks. The results showed that there are statistically significant differences between novices and experts for almost all of the normalized energy-based metrics. The accuracy of classification using SVM and NN methods was between 70% and 95% for the various tasks. The results show that the normalized energy-based metrics and their combination with SVM and NN classifiers are capable of providing accurate classification of trainees. The assessment method proposed in this study can enhance surgical training by providing appropriate feedback to trainees about their level of expertise and can be used in the evaluation of proficiency.
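The evaluation protocol described, leave-one-subject-out cross-validation, can be sketched as follows. The synthetic one-dimensional features and the nearest-class-mean classifier below are hypothetical stand-ins for the study's normalized energy-based metrics and its SVM/NN classifiers:

```python
import numpy as np

def leave_one_subject_out(X, y, subjects, fit, predict):
    """Leave-one-subject-out CV: all trials of one subject form the test
    fold, so performance is estimated on unseen subjects."""
    accs = []
    for s in np.unique(subjects):
        test = subjects == s
        model = fit(X[~test], y[~test])
        accs.append((predict(model, X[test]) == y[test]).mean())
    return np.array(accs)

# Toy data: experts (label 1) have energy ratios near the ideal value 1,
# novices (label 0) expend noticeably more energy.
rng = np.random.default_rng(0)
n_subj, n_trials = 10, 6
subjects = np.repeat(np.arange(n_subj), n_trials)
labels = np.repeat(np.arange(n_subj) % 2, n_trials)
feats = np.where(labels == 1,
                 rng.normal(1.1, 0.1, n_subj * n_trials),
                 rng.normal(2.5, 0.3, n_subj * n_trials)).reshape(-1, 1)

# Nearest-class-mean classifier as a minimal stand-in for SVM/NN.
fit = lambda X, y: {c: X[y == c].mean(axis=0) for c in np.unique(y)}
predict = lambda m, X: np.array(
    [min(m, key=lambda c: np.linalg.norm(x - m[c])) for x in X])

accs = leave_one_subject_out(feats, labels, subjects, fit, predict)
print("mean leave-one-subject-out accuracy:", accs.mean())
```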

  8. Drought analysis in the Tons River Basin, India during 1969-2008

    Science.gov (United States)

    Meshram, Sarita Gajbhiye; Gautam, Randhir; Kahya, Ercan

    2018-05-01

The primary focus of this study is the analysis of droughts in the Tons River Basin during the period 1969-2008. Precipitation data observed at four gauging stations are used to identify drought over the study area. Drought events are derived from the standardized precipitation index (SPI) on a 3-month scale. Our results indicate that severe drought occurred at the Allahabad, Rewa, and Satna stations in the years 1973 and 1979. The droughts in this region occurred mainly due to erratic monsoon behavior, especially long breaks between monsoons. During the drought years, the deficiency of annual rainfall, expressed as annual rainfall departure, varied from -26% in 1976 to -60% in 1973 at the Allahabad station. The maximum deficiency of annual and seasonal rainfall recorded in the basin is 60%. The maximum seasonal rainfall departure observed in the basin is on the order of -60%, at the Allahabad station in 1973, while the maximum annual rainfall departure was recorded as -60% during 1979 at the Satna station. Extreme dry events (z-score < -2) were detected during July, August, and September. Moreover, severe dry events were observed in August, September, and October. The drought conditions in the Tons River Basin are dominantly driven by total rainfall throughout the period between June and November.
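A simplified version of the 3-month SPI used above can be sketched as follows. This empirical-percentile variant replaces the usual gamma-distribution fit with plotting positions, so it illustrates the idea rather than reproducing the operational index:

```python
import numpy as np
from statistics import NormalDist

def spi3(monthly_precip):
    """Simplified 3-month SPI: rolling 3-month totals are ranked and
    mapped through the inverse normal CDF (empirical stand-in for the
    conventional gamma-distribution fit)."""
    totals = np.convolve(monthly_precip, np.ones(3), mode='valid')  # 3-month sums
    ranks = totals.argsort().argsort() + 1          # 1..n ranks of each total
    probs = (ranks - 0.5) / len(totals)             # plotting positions in (0, 1)
    nd = NormalDist()
    return np.array([nd.inv_cdf(p) for p in probs])

# Synthetic monthly rainfall (10 years) with one pronounced dry spell,
# loosely mimicking the long monsoon breaks described above.
rng = np.random.default_rng(2)
rain = rng.gamma(shape=2.0, scale=50.0, size=120)
rain[60:66] *= 0.05                                 # severe 6-month deficit
spi = spi3(rain)
print("minimum SPI-3:", spi.min())                  # strongly negative in the dry spell
```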

  9. Principle of space existence and De Sitter metric

    International Nuclear Information System (INIS)

    Mal'tsev, V.K.

    1990-01-01

    The selection principle for the solutions of the Einstein equations suggested in a series of papers implies the existence of space (g_ik ≠ 0) only in the presence of matter (T_ik ≠ 0). This selection principle (the principle of space existence, in Markov's terminology) implies, in the general case, the absence of the cosmological solution with the De Sitter metric. On the other hand, the De Sitter metric is necessary for describing both the inflation and deflation periods of the Universe. It is shown that the De Sitter metric is also allowed by the selection principle under discussion if the metric evolves into the Friedmann metric.

  10. What can article-level metrics do for you?

    Science.gov (United States)

    Fenner, Martin

    2013-10-01

    Article-level metrics (ALMs) provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, usage statistics, discussions in online comments and social media, social bookmarking, and recommendations. In this essay, we describe why article-level metrics are an important extension of traditional citation-based journal metrics and provide a number of examples from ALM data collected for PLOS Biology.

  11. About the possibility of a generalized metric

    International Nuclear Information System (INIS)

    Lukacs, B.; Ladik, J.

    1991-10-01

    The metric (the structure of space-time) may depend on the properties of the object measuring it. The case of size dependence of the metric was examined. For this dependence, the simplest possible form of the metric tensor has been constructed which fulfils the following requirements: there are two extremal characteristic scales; the metric is unique and takes the usual form between them; the change is sudden in the neighbourhood of these scales; and the size of the human body appears as a parameter (postulated on the basis of some philosophical arguments). Estimates have been made for the two extremal length scales according to existing observations. (author) 19 refs

  12. A Soil Service Index: Peatland soils as a case study for quantifying the value, vulnerability, and status of soils

    Science.gov (United States)

    Loisel, J.; Harden, J. W.; Hugelius, G.

    2017-12-01

    What are the most important soil services valued by land stewards and planners? Which soil-data metrics can be used to quantify each soil service? What are the steps required to quantitatively index the baseline value of soil services and their vulnerability under different land-use and climate change scenarios? How do we simulate future soil service pathways (or trajectories) under changing management regimes using process-based ecosystem models? What is the potential cost (economic, social, and other) of soil degradation under these scenarios? How sensitive or resilient are soil services to prescribed management practices, and how does sensitivity vary over space and time? We are bringing together a group of scientists and conservation organizations to answer these questions by launching Soil Banker, an open and flexible tool to quantify soil services that can be used at any scale, and by any stakeholder. Our overarching goals are to develop metrics and indices to quantify peatland soil ecosystem services, monitor change of these services, and guide management. This paper describes our methodology applied to peatlands and presents two case studies (Indonesia and Patagonia) demonstrating how Peatland Soil Banker can be deployed as an accounting tool for peatland stocks, as a quantitative measure of peatland health, and as a projection of peatland degradation or enhancement under different land-use cases. Why peatlands? They store about 600 billion tons of carbon that account for ⅓ of the world's soil carbon. Peatlands have dynamic GHG exchanges of CO2, CH4, and NOx with the atmosphere, which play a role in regulating global climate; studies indicate that peatland degradation releases about 2-3 billion tons of CO2 to the atmosphere annually. These ecosystems also provide local and regional ecosystem services: they constitute important components of the N and P cycles, store about 10% of the world's freshwater and buffer large fluxes of freshwater on an annual basis

  13. Readability of the web: a study on 1 billion web pages

    NARCIS (Netherlands)

    de Heus, Marije; Hiemstra, Djoerd

    We have performed a readability study on more than 1 billion web pages. The Automated Readability Index was used to determine the average grade level required to easily comprehend a website. Some of the results are that a 16-year-old can easily understand 50% of the web and an 18-year-old can easily
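The Automated Readability Index used in the study above is a simple closed-form score: ARI = 4.71·(characters/words) + 0.5·(words/sentences) − 21.43, where the result approximates the U.S. school grade level needed to read the text. A minimal sketch, with a naive regex-based tokenizer standing in for whatever tokenization the study actually used:

```python
# Sketch of the Automated Readability Index. The word/sentence tokenization
# here is a crude regex approximation, not the study's exact preprocessing.
import re

def automated_readability_index(text: str) -> float:
    words = re.findall(r"[A-Za-z0-9']+", text)
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    characters = sum(len(w) for w in words)
    return 4.71 * characters / len(words) + 0.5 * len(words) / sentences - 21.43

simple = "The cat sat on the mat. It was warm."
hard = ("Nevertheless, the extraordinarily comprehensive documentation "
        "demonstrated unquestionable methodological sophistication.")
print(automated_readability_index(simple), automated_readability_index(hard))
```

Unlike formulas that require syllable counting, ARI uses only character, word, and sentence counts, which is what makes it practical to apply at the scale of a billion pages.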

  14. Adaptive Metric Kernel Regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    1998-01-01

    Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression...... by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...

  15. Ideal Based Cyber Security Technical Metrics for Control Systems

    Energy Technology Data Exchange (ETDEWEB)

    W. F. Boyer; M. A. McQueen

    2007-10-01

    Much of the world's critical infrastructure is at risk from attack through electronic networks connected to control systems. Security metrics are important because they provide the basis for management decisions that affect the protection of the infrastructure. A cyber security technical metric is the security relevant output from an explicit mathematical model that makes use of objective measurements of a technical object. A specific set of technical security metrics is proposed for use by the operators of control systems. Our proposed metrics are based on seven security ideals associated with seven corresponding abstract dimensions of security. We have defined at least one metric for each of the seven ideals. Each metric is a measure of how nearly the associated ideal has been achieved. These seven ideals provide a useful structure for further metrics development. A case study shows how the proposed metrics can be applied to an operational control system.

  16. THE ROLE OF ARTICLE LEVEL METRICS IN SCIENTIFIC PUBLISHING

    Directory of Open Access Journals (Sweden)

    Vladimir TRAJKOVSKI

    2016-04-01

    Full Text Available Emerging article-level metrics do not exclude traditional citation-based journal metrics, but complement them. Article-level metrics (ALMs) provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, usage statistics, discussions in online comments and social media, social bookmarking, and recommendations. In this editorial, the role of article-level metrics in publishing scientific papers is described. ALMs are rapidly emerging as important tools to quantify how individual articles are being discussed, shared, and used. Data sources depend on the tool, but they include classic citation-based indicators, academic social networks (Mendeley, CiteULike, Delicious) and social media (Facebook, Twitter, blogs, and YouTube). The most popular tools used to apply these new metrics are: Public Library of Science - Article-Level Metrics, Altmetric, Impactstory and Plum Analytics. The Journal Impact Factor (JIF) does not consider impact or influence beyond citation counts, and this count is reflected only through Thomson Reuters' Web of Science® database. The JIF provides an indicator related to the journal, but not to an individual published paper. Thus, altmetrics is now becoming an alternative for performance assessment of individual scientists and their scholarly publications. Macedonian scholarly publishers have to work on implementing article-level metrics in their e-journals. It is the way to increase their visibility and impact in the world of science.

  17. Characterising risk - aggregated metrics: radiation and noise

    International Nuclear Information System (INIS)

    Passchier, W.

    1998-01-01

    The characterisation of risk is an important phase in the risk assessment - risk management process. From the multitude of risk attributes, a few have to be selected to obtain a risk characteristic or profile that is useful for risk management decisions and the implementation of protective measures. One way to reduce the number of attributes is aggregation. In the field of radiation protection such an aggregated metric is firmly established: effective dose. For protection against environmental noise, the Health Council of the Netherlands recently proposed a set of aggregated metrics for noise annoyance and sleep disturbance. The presentation will discuss similarities and differences between these two metrics and practical limitations. The effective dose has proven its usefulness in designing radiation protection measures related to the level of risk associated with the radiation practice in question, given that implicit judgements on radiation-induced health effects are accepted. However, as the metric does not take into account the nature of the radiation practice, it is less useful in policy discussions on the benefits and harm of radiation practices. With respect to the noise exposure metric, only one effect is targeted (annoyance), and the differences between sources are explicitly taken into account. This should make the metric useful in policy discussions with respect to physical planning and siting problems. The proposed metric is significant only at the population level and cannot be used as a predictor of individual risk. (author)

  18. Supplier selection using different metric functions

    Directory of Open Access Journals (Sweden)

    Omosigho S.E.

    2015-01-01

    Full Text Available Supplier selection is an important component of supply chain management in today’s global competitive environment. Hence, the evaluation and selection of suppliers have received considerable attention in the literature. Many attributes of suppliers, other than cost, are considered in the evaluation and selection process. Therefore, the process of evaluation and selection of suppliers is a multi-criteria decision making process. The methodology adopted to solve the supplier selection problem is intuitionistic fuzzy TOPSIS (Technique for Order Preference by Similarity to the Ideal Solution). Generally, TOPSIS is based on the concept of minimum distance from the positive ideal solution and maximum distance from the negative ideal solution. We examine the deficiencies of using only one metric function in TOPSIS and propose the use of spherical metric function in addition to the commonly used metric functions. For empirical supplier selection problems, more than one metric function should be used.
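The role of the metric function in TOPSIS, as discussed in the abstract above, can be made concrete with a small sketch. The decision matrix and weights below are made-up, all criteria are assumed to be benefit criteria, and only the Minkowski family of distances is shown; the paper's spherical metric would be a further alternative plugged into the same slot.

```python
# Sketch of TOPSIS with a pluggable Minkowski distance (p = 1, 2, ...).
# Decision matrix and weights are illustrative, not from the paper.
import numpy as np

def topsis(decision, weights, p=2):
    # Vector-normalize each criterion column, then apply the weights
    v = decision / np.linalg.norm(decision, axis=0) * weights
    ideal, anti = v.max(axis=0), v.min(axis=0)   # benefit criteria assumed
    d_pos = (np.abs(v - ideal) ** p).sum(axis=1) ** (1 / p)
    d_neg = (np.abs(v - anti) ** p).sum(axis=1) ** (1 / p)
    return d_neg / (d_pos + d_neg)               # closeness to ideal, in [0, 1]

suppliers = np.array([[7.0, 9.0, 9.0],
                      [8.0, 7.0, 8.0],
                      [9.0, 6.0, 8.0]])
weights = np.array([0.5, 0.3, 0.2])
scores = topsis(suppliers, weights, p=2)
print("closeness coefficients:", np.round(scores, 3))
```

Re-running with p=1 versus p=2 can reorder the closeness coefficients for some decision matrices, which is the paper's motivation for not relying on a single metric function.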

  19. 77 FR 12832 - Non-RTO/ISO Performance Metrics; Commission Staff Request Comments on Performance Metrics for...

    Science.gov (United States)

    2012-03-02

    ... Performance Metrics; Commission Staff Request Comments on Performance Metrics for Regions Outside of RTOs and... performance communicate about the benefits of RTOs and, where appropriate, (2) changes that need to be made to... common set of performance measures for markets both within and outside of ISOs/RTOs. As recommended by...

  20. Regional Sustainability: The San Luis Basin Metrics Project

    Science.gov (United States)

    There are a number of established, scientifically supported metrics of sustainability. Many of the metrics are data intensive and require extensive effort to collect data and compute. Moreover, individual metrics may not capture all aspects of a system that are relevant to sust...

  1. Energy and Economic Impacts of U.S. Federal Energy and Water Conservation Standards Adopted From 1987 through 2012

    Energy Technology Data Exchange (ETDEWEB)

    Meyers, Stephen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Williams, Alison [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chan, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2013-04-01

    This paper presents estimates of the key impacts of Federal energy and water conservation standards adopted from 1987 through 2012. The standards for consumer products and commercial and industrial equipment include those set by legislation as well as standards adopted by DOE through rulemaking. In 2012, the standards saved an estimated 3.6 quads of primary energy, which is equivalent to 3% of total U.S. energy consumption. The savings in operating costs for households and businesses totaled $51.4 billion. The average household saved $347 in operating costs as a result of residential and plumbing product standards. The estimated reduction in CO2 emissions associated with the standards in 2012 was 198 million metric tons, which is equivalent to 3% of total U.S. CO2 emissions. The estimated cumulative energy savings over the period 1990-2070 amount to 179 quads. Accounting for the increased upfront costs of more-efficient products and the operating cost (energy and water) savings over the products’ lifetime, the standards have a past and projected cumulative net present value (NPV) of consumer benefit of between $1,104 billion and $1,390 billion, using 7 percent and 3 percent discount rates, respectively. The water conservation standards, together with energy conservation standards that also save water, reduced water use by 1.8 trillion gallons in 2012, and will achieve cumulative water savings by 2040 of 54 trillion gallons. The estimated consumer savings in 2012 from reduced water use amounted to $13 billion.
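The cumulative NPV figures quoted in this record (and the companion records below) follow the standard discounting arithmetic: each year's net benefit (operating-cost savings minus incremental equipment cost) is discounted back to a base year and summed. A minimal sketch with invented cash flows, not the report's actual figures:

```python
# Sketch of a cumulative net present value calculation at two discount rates.
# Cash flows are illustrative: an upfront cost in year 0, then annual savings.
def npv(net_benefits, rate):
    # Discount each year's net benefit back to year 0 and sum
    return sum(b / (1 + rate) ** t for t, b in enumerate(net_benefits))

annual_net_benefit = [-5.0] + [2.0] * 30  # $B: year-0 cost, 30 years of savings
npv_3 = npv(annual_net_benefit, 0.03)
npv_7 = npv(annual_net_benefit, 0.07)
print(f"NPV @3% = {npv_3:.1f}  NPV @7% = {npv_7:.1f}")
```

Because the savings arrive in future years, the higher 7 percent discount rate always yields the smaller NPV, which is why the report states a range with the 7 percent figure as the lower bound.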

  2. Energy and Economic Impacts of U.S. Federal Energy and Water Conservation Standards Adopted From 1987 Through 2015

    Energy Technology Data Exchange (ETDEWEB)

    Meyers, Stephen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Williams, Alison [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chan, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Price, Sarah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-02-17

    This paper presents estimates of the key impacts of Federal energy and water conservation standards adopted from 1987 through 2015. The standards for consumer products and commercial and industrial equipment include those set by legislation as well as standards adopted by DOE through rulemaking. In 2015, the standards saved an estimated 4.49 quads of primary energy, which is equivalent to 5% of total U.S. energy consumption. The savings in operating costs for households and businesses totaled $63.4 billion. The average household saved $320 in operating costs as a result of residential appliance standards. The estimated reduction in CO2 emissions associated with the standards in 2015 was 238 million metric tons, which is equivalent to 4.3% of total U.S. CO2 emissions. The estimated cumulative energy savings over the period 1990-2090 amount to 216.9 quads. Accounting for the increased upfront costs of more-efficient products and the operating cost (energy and water) savings over the products’ lifetime, the standards have a cumulative net present value (NPV) of consumer benefit of between $1,627 billion and $1,887 billion, using 7 percent and 3 percent discount rates, respectively. The water conservation standards, together with energy conservation standards that also save water, reduced water use by 1.9 trillion gallons in 2015 and estimated cumulative water savings by 2090 amount to 55 trillion gallons. The estimated consumer savings in 2015 from reduced water use amounted to $12 billion.

  3. Energy and Economic Impacts of U.S. Federal Energy and Water Conservation Standards Adopted From 1987 Through 2013

    Energy Technology Data Exchange (ETDEWEB)

    Meyers, Stephen; Williams, Alison; Chan, Peter

    2014-06-30

    This paper presents estimates of the key impacts of Federal energy and water conservation standards adopted from 1987 through 2013. The standards for consumer products and commercial and industrial equipment include those set by legislation as well as standards adopted by DOE through rulemaking. In 2013, the standards saved an estimated 4.05 quads of primary energy, which is equivalent to 4% of total U.S. energy consumption. The savings in operating costs for households and businesses totaled $56 billion. The average household saved $361 in operating costs as a result of residential and plumbing product standards. The estimated reduction in CO2 emissions associated with the standards in 2013 was 218 million metric tons, which is equivalent to 4% of total U.S. CO2 emissions. The estimated cumulative energy savings over the period 1990-2090 amount to 181 quads. Accounting for the increased upfront costs of more-efficient products and the operating cost (energy and water) savings over the products’ lifetime, the standards have a past and projected cumulative net present value (NPV) of consumer benefit of between $1,271 billion and $1,487 billion, using 7 percent and 3 percent discount rates, respectively. The water conservation standards, together with energy conservation standards that also save water, reduced water use by 1.9 trillion gallons in 2013, and will achieve cumulative water savings by 2090 of 55 trillion gallons. The estimated consumer savings in 2013 from reduced water use amounted to $16 billion.

  4. Energy balances of OECD countries 1991-1992

    International Nuclear Information System (INIS)

    1994-01-01

    Contains a compilation of data on the supply and consumption of solid fuels, oil, gas, electricity and heat. The figures are expressed in million metric tons of oil equivalent. Historical tables summarize key energy and economic indicators as well as production, trade and final consumption data. Each issue includes definitions of products and flows and explanatory notes on the individual country data as well as conversion factors from original units to metric tons of oil equivalent. (author)

  5. La pisciculture de Tilapia nilotica (= Sarotherodon niloticus) dans les eaux continentales de Côte d'Ivoire

    Directory of Open Access Journals (Sweden)

    Vincke, P.

    1985-01-01

    Full Text Available The fish culture of Tilapia nilotica (Sarotherodon niloticus) in the fresh waters of Ivory Coast. In Ivory Coast, freshwater fish culture in rural areas is mainly small-scale. This type of pond breeding (2 to 4 ares) yields on average 3 metric tons of fish/ha/year and represents only a subsistence activity with self-consumption of the products. The yield of intensive pond culture of Tilapia nilotica averages 6 to 7 metric tons/ha/year, but yields greater than 10 metric tons/ha/year are not uncommon. Intensive culture in floating cages, which requires a minor investment but more advanced training than pond culture, yields on average about 30 to 40 kg/m3/year. However, the effective development of this activity depends on solving problems such as sufficient fry production, feeding, and commercialization.

  6. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  7. Metric solution of a spinning mass

    International Nuclear Information System (INIS)

    Sato, H.

    1982-01-01

    Studies on a particular class of asymptotically flat and stationary metric solutions called the Kerr-Tomimatsu-Sato class are reviewed about its derivation and properties. For a further study, an almost complete list of the papers worked on the Tomimatsu-Sato metrics is given. (Auth.)

  8. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  9. Generalized tolerance sensitivity and DEA metric sensitivity

    OpenAIRE

    Neralić, Luka; E. Wendell, Richard

    2015-01-01

    This paper considers the relationship between Tolerance sensitivity analysis in optimization and metric sensitivity analysis in Data Envelopment Analysis (DEA). Herein, we extend the results on the generalized Tolerance framework proposed by Wendell and Chen and show how this framework includes DEA metric sensitivity as a special case. Further, we note how recent results in Tolerance sensitivity suggest some possible extensions of the results in DEA metric sensitivity.

  10. On Nakhleh's metric for reduced phylogenetic networks

    OpenAIRE

    Cardona, Gabriel; Llabrés, Mercè; Rosselló, Francesc; Valiente Feruglio, Gabriel Alejandro

    2009-01-01

    We prove that Nakhleh’s metric for reduced phylogenetic networks is also a metric on the classes of tree-child phylogenetic networks, semibinary tree-sibling time consistent phylogenetic networks, and multilabeled phylogenetic trees. We also prove that it separates distinguishable phylogenetic networks. In this way, it becomes the strongest dissimilarity measure for phylogenetic networks available so far. Furthermore, we propose a generalization of that metric that separates arbitrary phyl...

  11. Generalized tolerance sensitivity and DEA metric sensitivity

    Directory of Open Access Journals (Sweden)

    Luka Neralić

    2015-03-01

    Full Text Available This paper considers the relationship between Tolerance sensitivity analysis in optimization and metric sensitivity analysis in Data Envelopment Analysis (DEA). Herein, we extend the results on the generalized Tolerance framework proposed by Wendell and Chen and show how this framework includes DEA metric sensitivity as a special case. Further, we note how recent results in Tolerance sensitivity suggest some possible extensions of the results in DEA metric sensitivity.

  12. Social Media Metrics Importance and Usage Frequency in Latvia

    Directory of Open Access Journals (Sweden)

    Ronalds Skulme

    2017-12-01

    Full Text Available Purpose of the article: The purpose of this paper was to explore which social media marketing metrics are most often used and most important for marketing experts in Latvia and can be used to evaluate marketing campaign effectiveness. Methodology/methods: In order to achieve the aim of this paper, several theoretical and practical research methods were used, such as theoretical literature analysis, surveying and grouping. First, theoretical research about social media metrics was conducted. The authors collected information about social media metric grouping methods and the most frequently mentioned social media metrics in the literature. The collected information was used as the foundation for the expert surveys. The expert surveys were used to collect information from Latvian marketing professionals to determine which social media metrics are used most often and which are most important in Latvia. Scientific aim: The scientific aim of this paper was to identify whether the importance of social media metrics varies depending on the consumer purchase decision stage. Findings: Information about the most important and most often used social media marketing metrics in Latvia was collected. A new social media grouping framework is proposed. Conclusions: The main conclusion is that the importance and usage frequency of social media metrics change depending on the consumer purchase decision stage the metric is used to evaluate.

  13. A comparison theorem of the Kobayashi metric and the Bergman metric on a class of Reinhardt domains

    International Nuclear Information System (INIS)

    Weiping Yin.

    1990-03-01

    A comparison theorem for the Kobayashi and Bergman metrics is given on a class of Reinhardt domains in C^n. In the meantime, we obtain a class of complete invariant Kaehler metrics for special cases of these domains. (author). 5 refs

  14. The Boring Billion, a slingshot for Complex Life on Earth.

    Science.gov (United States)

    Mukherjee, Indrani; Large, Ross R; Corkrey, Ross; Danyushevsky, Leonid V

    2018-03-13

    The period 1800 to 800 Ma ("Boring Billion") is believed to mark a delay in the evolution of complex life, primarily due to low levels of oxygen in the atmosphere. Earlier studies highlight the remarkably flat C, Cr isotopes and low trace element trends during the so-called stasis, caused by prolonged nutrient, climatic, atmospheric and tectonic stability. In contrast, we suggest a first-order variability of bio-essential trace element availability in the oceans by combining systematic sampling of the Proterozoic rock record with sensitive geochemical analyses of marine pyrite by LA-ICP-MS technique. We also recall that several critical biological evolutionary events, such as the appearance of eukaryotes, origin of multicellularity & sexual reproduction, and the first major diversification of eukaryotes (crown group) occurred during this period. Therefore, it appears possible that the period of low nutrient trace elements (1800-1400 Ma) caused evolutionary pressures which became an essential trigger for promoting biological innovations in the eukaryotic domain. Later periods of stress-free conditions, with relatively high nutrient trace element concentration, facilitated diversification. We propose that the "Boring Billion" was a period of sequential stepwise evolution and diversification of complex eukaryotes, triggering evolutionary pathways that made possible the later rise of micro-metazoans and their macroscopic counterparts.

  15. System dynamics of the competition of municipal solid waste to landfill, electricity, and liquid fuel in California

    Energy Technology Data Exchange (ETDEWEB)

    Westbrook, Jessica; Malczynski, Leonard A.; Manley, Dawn Kataoka

    2014-03-01

    A quantitative system dynamics model was created to evaluate the economic and environmental tradeoffs between biomass to electricity and to liquid fuel using MSW biomass in the state of California as a case study. From an environmental perspective, landfilling represents the worst use of MSW over time, generating more greenhouse gas (GHG) emissions compared to converting MSW to liquid fuel or to electricity. MSW to ethanol results in the greatest displacement of GHG emissions per dollar spent compared to MSW to electricity. MSW to ethanol could save the state of California approximately $60 billion in energy costs by 2050 compared to landfilling, while also reducing GHG emissions state-wide by approximately 140 million metric tons during that timeframe. MSW conversion to electricity creates a significant cost within the state's electricity sector, although some conversion technologies are cost competitive with existing renewable generation.

  16. Handbook of Climate Change Mitigation

    CERN Document Server

    Seiner, John; Suzuki, Toshio; Lackner, Maximilian

    2012-01-01

    There is a mounting consensus that human behavior is changing the global climate and its consequences could be catastrophic. Reducing the 24 billion metric tons of carbon dioxide emissions from stationary and mobile sources is a gigantic task involving both technological challenges and monumental financial and societal costs. The pursuit of sustainable energy resources, environment, and economy has become a complex issue of global scale that affects the daily life of every citizen of the world. The present mitigation activities range from energy conservation, carbon-neutral energy conversion, and advanced combustion processes that produce no greenhouse gases and enable carbon capture and sequestration, to other advanced technologies. From its causes and impacts to its solutions, the issues surrounding climate change involve multidisciplinary science and technology. This handbook will provide a single source of this information. The book will be divided into the following sections: Scientific Evidence of Cl...

  17. Comportement du béton à l'eau de mer. Synthèse bibliographique Concrete Behavior in Marine Environment. a Review Paper

    Directory of Open Access Journals (Sweden)

    Lesage J.

    2006-11-01

    Full Text Available In recent years, in the design of structures for offshore oil and gas exploration and production, concrete structures have sometimes been chosen over steel structures, particularly in the difficult zones of the North Sea. It therefore seemed worthwhile to review current knowledge of the behavior of concrete in seawater. The most serious problems occur in the tidal zone, i.e., the zone where the concrete is alternately immersed and exposed. They are of all kinds: mechanical stresses with erosion and cavitation, capillary action of water with alternating wetting and drying, and freeze-thaw action. Countermeasures are plentiful: massive, even oversized, impact-resistant construction; choice of shapes; careful placement of a quality concrete rich in cement, correctly dosed with aggregates, hard, dense, compact, and impermeable; and the addition of an air-entraining agent to the concrete to reduce frost effects. The use of a seawater-resistant cement and a water/cement ratio of about 0.45 are highly recommended. As for corrosion, it affects concrete structures at all levels, both in the continuously immersed zone and in the alternately immersed zone. Attack of the concrete by the sulfates contained in seawater leads to the formation of Candlot's salts, which degrade the concrete. The remedy is to limit the tricalcium aluminate content of the cement. Attack of the reinforcing steel by chlorides has been the subject of numerous studies; the solution consists mainly of protecting the steel by galvanizing or cathodic protection. In general, concrete placed in the aggressive environment of seawater is subject to significant mechanical and physicochemical stresses, but we know in g

  18. Using Activity Metrics for DEVS Simulation Profiling

    Directory of Open Access Journals (Sweden)

    Muzy A.

    2014-01-01

    Full Text Available Activity metrics can be used to profile DEVS models before and during simulation. It is critical to obtain good activity metrics of models before and during their simulation. Having a means to compute the a-priori activity of components (analytic activity) may be worthwhile when simulating a model (or parts of it) for the first time. Afterwards, during the simulation, the analytic activity can be corrected using the dynamic activity. In this paper, we introduce the McCabe cyclomatic complexity metric (MCA) to compute analytic activity. Both static and simulation activity metrics have been implemented through a plug-in of the DEVSimPy (DEVS Simulator in Python language) environment and applied to DEVS models.
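The McCabe metric named in the abstract above is, in its simplest form, one plus the number of decision points in a piece of code. A minimal sketch below counts decision nodes in a Python AST; the transition-function example is invented for illustration, and the sketch simplifies the full metric (e.g., each boolean operator is counted once rather than per operand).

```python
# Sketch of McCabe cyclomatic complexity: complexity = decision points + 1.
# A simplified counter over a Python AST; the sample function is hypothetical.
import ast

DECISION_NODES = (ast.If, ast.For, ast.While, ast.IfExp,
                  ast.ExceptHandler, ast.comprehension)

def cyclomatic_complexity(source: str) -> int:
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))

code = """
def ext_transition(state, x):
    if x > 0:
        for _ in range(x):
            state += 1
    elif x < 0:
        state -= 1
    return state
"""
# if + elif + for = 3 decision points, so complexity is 4
print("cyclomatic complexity:", cyclomatic_complexity(code))
```

Applied before a simulation run, such a static count gives an analytic proxy for how much branching work a component may perform, which dynamic activity measurements can then correct.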

  19. Tracker Performance Metric

    National Research Council Canada - National Science Library

    Olson, Teresa; Lee, Harry; Sanders, Johnnie

    2002-01-01

    … We have developed the Tracker Performance Metric (TPM) specifically for this purpose. It was designed to measure the output performance, on a frame-by-frame basis, using its output position and quality…

  20. Metrication: An economic wake-up call for US industry

    Science.gov (United States)

    Carver, G. P.

    1993-03-01

    As the international standard of measurement, the metric system is one key to success in the global marketplace. International standards have become an important factor in international economic competition. Non-metric products are becoming increasingly unacceptable in world markets that favor metric products. Procurement is the primary federal tool for encouraging and helping U.S. industry to convert voluntarily to the metric system. Besides the perceived unwillingness of the customer, certain regulatory language, and certain legal definitions in some states, there are no major impediments to conversion of the remaining non-metric industries to metric usage. Instead, there are good reasons for changing, including an opportunity to rethink many industry standards and to take advantage of size standardization. Also, when the remaining industries adopt the metric system, they will come into conformance with federal agencies engaged in similar activities.

  1. Conformal and related changes of metric on the product of two almost contact metric manifolds.

    OpenAIRE

    Blair, D. E.

    1990-01-01

    This paper studies conformal and related changes of the product metric on the product of two almost contact metric manifolds. It is shown that if one factor is Sasakian, the other is not, but that locally the second factor is of the type studied by Kenmotsu. The results are more general and given in terms of trans-Sasakian, α-Sasakian and β-Kenmotsu structures.

  2. The Social Cost of Trading: Measuring the Increased Damages from Sulfur Dioxide Trading in the United States

    Science.gov (United States)

    Henry, David D., III; Muller, Nicholas Z.; Mendelsohn, Robert O.

    2011-01-01

    The sulfur dioxide (SO₂) cap-and-trade program established in the 1990 Clean Air Act Amendments is celebrated for reducing abatement costs ($0.7 to $2.1 billion per year) by allowing emissions allowances to be traded. Unfortunately, places with high marginal costs also tend to have high marginal damages. Ton-for-ton trading reduces…

  3. Extremal limits of the C metric: Nariai, Bertotti-Robinson, and anti-Nariai C metrics

    International Nuclear Information System (INIS)

    Dias, Oscar J.C.; Lemos, Jose P.S.

    2003-01-01

    In two previous papers we have analyzed the C metric in a background with a cosmological constant Λ, namely, the de Sitter (dS) C metric (Λ>0) and the anti-de Sitter (AdS) C metric (Λ<0). […] In the dS₂×S̃² case, to each point in the deformed two-sphere S̃² corresponds a dS₂ spacetime, except for one point, which corresponds to a dS₂ spacetime with an infinite straight strut or string. There are other important new features that appear. One expects that the solutions found in this paper are unstable and decay into a slightly nonextreme black-hole pair accelerated by a strut or by strings. Moreover, the Euclidean versions of these solutions mediate the quantum process of black-hole pair creation that accompanies the decay of the dS and AdS spaces

  4. Graev metrics on free products and HNN extensions

    DEFF Research Database (Denmark)

    Slutsky, Konstantin

    2014-01-01

    We give a construction of two-sided invariant metrics on free products (possibly with amalgamation) of groups with two-sided invariant metrics and, under certain conditions, on HNN extensions of such groups. Our approach is similar to Graev's construction of metrics on free groups over pointed…

  5. Validation of Metrics for Collaborative Systems

    OpenAIRE

    Ion IVAN; Cristian CIUREA

    2008-01-01

    This paper describes new concepts for the validation of collaborative-system metrics. It defines the quality characteristics of collaborative systems, proposes a metric to estimate the quality level of collaborative systems, and reports measurements of collaborative-system quality performed using specially designed software.

  6. g-Weak Contraction in Ordered Cone Rectangular Metric Spaces

    Directory of Open Access Journals (Sweden)

    S. K. Malhotra

    2013-01-01

    Full Text Available We prove some common fixed-point theorems for ordered g-weak contractions in cone rectangular metric spaces without assuming the normality of the cone. Our results generalize some recent results from cone metric and cone rectangular metric spaces to ordered cone rectangular metric spaces. Examples are provided which illustrate the results.

  7. The dynamics of metric-affine gravity

    International Nuclear Information System (INIS)

    Vitagliano, Vincenzo; Sotiriou, Thomas P.; Liberati, Stefano

    2011-01-01

    Highlights: → The role and the dynamics of the connection in metric-affine theories are explored. → The most general second-order action does not lead to a dynamical connection. → Including higher-order invariants excites new degrees of freedom in the connection. → f(R) actions are also discussed and shown to be a non-representative class. - Abstract: Metric-affine theories of gravity provide an interesting alternative to general relativity: in such an approach, the metric and the affine (not necessarily symmetric) connection are independent quantities. Furthermore, the action should include covariant derivatives of the matter fields, with the covariant derivative naturally defined using the independent connection. As a result, in metric-affine theories a direct coupling involving matter and connection is also present. The role and the dynamics of the connection in such theories are explored. We employ power counting in order to construct the action and search for the minimal requirements it should satisfy for the connection to be dynamical. We find that for the most general action containing lower-order invariants of the curvature and the torsion, the independent connection does not carry any dynamics. It actually reduces to the role of an auxiliary field and can be completely eliminated algebraically in favour of the metric and the matter field, introducing extra interactions with respect to general relativity. However, we also show that including higher-order terms in the action radically changes this picture and excites new degrees of freedom in the connection, making it (or parts of it) dynamical. Constructing actions that constitute exceptions to this rule requires significant fine tuning and/or extra a priori constraints on the connection. We also consider f(R) actions as a particular example in order to show that they constitute a distinct class of metric-affine theories with special properties, and as such they cannot be used as representative toy…

  8. The definitive guide to IT service metrics

    CERN Document Server

    McWhirter, Kurt

    2012-01-01

    Used just as they are, the metrics in this book will bring many benefits to both the IT department and the business as a whole. Details of the attributes of each metric are given, enabling you to make the right choices for your business. You may prefer, and are encouraged, to design and create your own metrics to bring even more value to your business; this book shows you how to do that, too.

  9. Galaxy growth in a massive halo in the first billion years of cosmic history

    Science.gov (United States)

    Marrone, D. P.; Spilker, J. S.; Hayward, C. C.; Vieira, J. D.; Aravena, M.; Ashby, M. L. N.; Bayliss, M. B.; Béthermin, M.; Brodwin, M.; Bothwell, M. S.; Carlstrom, J. E.; Chapman, S. C.; Chen, Chian-Chou; Crawford, T. M.; Cunningham, D. J. M.; De Breuck, C.; Fassnacht, C. D.; Gonzalez, A. H.; Greve, T. R.; Hezaveh, Y. D.; Lacaille, K.; Litke, K. C.; Lower, S.; Ma, J.; Malkan, M.; Miller, T. B.; Morningstar, W. R.; Murphy, E. J.; Narayanan, D.; Phadke, K. A.; Rotermund, K. M.; Sreevani, J.; Stalder, B.; Stark, A. A.; Strandet, M. L.; Tang, M.; Weiß, A.

    2018-01-01

    According to the current understanding of cosmic structure formation, the precursors of the most massive structures in the Universe began to form shortly after the Big Bang, in regions corresponding to the largest fluctuations in the cosmic density field. Observing these structures during their period of active growth and assembly—the first few hundred million years of the Universe—is challenging because it requires surveys that are sensitive enough to detect the distant galaxies that act as signposts for these structures and wide enough to capture the rarest objects. As a result, very few such objects have been detected so far. Here we report observations of a far-infrared-luminous object at redshift 6.900 (less than 800 million years after the Big Bang) that was discovered in a wide-field survey. High-resolution imaging shows it to be a pair of extremely massive star-forming galaxies. The larger is forming stars at a rate of 2,900 solar masses per year, contains 270 billion solar masses of gas and 2.5 billion solar masses of dust, and is more massive than any other known object at a redshift of more than 6. Its rapid star formation is probably triggered by its companion galaxy at a projected separation of 8 kiloparsecs. This merging companion hosts 35 billion solar masses of stars and has a star-formation rate of 540 solar masses per year, but has an order of magnitude less gas and dust than its neighbour and physical conditions akin to those observed in lower-metallicity galaxies in the nearby Universe. These objects suggest the presence of a dark-matter halo with a mass of more than 100 billion solar masses, making it among the rarest dark-matter haloes that should exist in the Universe at this epoch.

  10. Gondwana basins and their coal resources in Bangladesh

    International Nuclear Information System (INIS)

    Nehaluddin, M.; Sultan-ul-Islam, M.

    1994-01-01

    Five fault-bounded Gondwana basins have been discovered in northwestern Bangladesh. Several of these basins contain considerable coal deposits. The Gondwana rocks were highly deformed during the Permo-Carboniferous diastrophism and later acquired dynamic characters. In almost all basins, the Permian rocks overlie the Precambrian basement and underlie either Tertiary or Cretaceous sediments; the structural, stratigraphic, and depositional histories of these basins are broadly similar. The sedimentary sequences are composed of light to dark gray, fine to very coarse grained, subangular to subrounded feldspathic sandstone; dark gray carbonaceous shale and sandstone; variegated conglomerate; and thick coal seams (a single seam up to 42.38 m). The rocks often alternate and bear the characteristics of cyclic sedimentation. Depositional environments varied from restricted drainage to an open, fluvially dominated, low- to moderate-sinuosity drainage system. The coal-bearing basins were flanked by vegetated and swampy overbanks. The age of these coals is suggested to be late Permian. Proved and probable coal reserves are 670 and 1,460 million metric tons in the Jamalganj-Paharpur basin, 303 and 389 million metric tons in the Barapukuria basin, and 143 and 685 million metric tons in the Khalaspir basin, respectively. The coal is a high-volatile, low-sulfur, bituminous type that can be used for different forms of thermal conversion. (author)

  11. NASA education briefs for the classroom. Metrics in space

    Science.gov (United States)

    The use of metric measurement in space is summarized for classroom use. Advantages of the metric system over the English measurement system are described. Some common metric units are defined, as are special units for astronomical study. International system unit prefixes and a conversion table of metric/English units are presented. Questions and activities for the classroom are recommended.
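A conversion table like the one the brief describes can be sketched as a small lookup of exact factors (all factor names below are illustrative; the values are the exact definitional conversions):

```python
# Conversion factors from common English (US customary) units to metric units.
TO_METRIC = {
    "inch_to_cm": 2.54,          # exact by definition
    "foot_to_m": 0.3048,         # exact by definition
    "mile_to_km": 1.609344,      # exact by definition
    "pound_to_kg": 0.45359237,   # exact by definition
    "short_ton_to_metric_ton": 0.90718474,  # 2000 lb expressed in metric tons
}

def convert(value: float, factor_name: str) -> float:
    """Convert a value using one of the named factors above."""
    return value * TO_METRIC[factor_name]

# One billion short tons expressed in metric tons:
print(convert(1e9, "short_ton_to_metric_ton"))  # 907184740.0
```

This also shows why "billion tons" is ambiguous across unit systems: a billion short tons is only about 907 million metric tons.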

  12. Enhancing Authentication Models Characteristic Metrics via ...

    African Journals Online (AJOL)

    In this work, we derive the universal characteristic metrics set for authentication models based on security, usability and design issues. We then compute the probability of the occurrence of each characteristic metrics in some single factor and multifactor authentication models in order to determine the effectiveness of these ...

  13. Validation of Metrics for Collaborative Systems

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2008-01-01

    Full Text Available This paper describes new concepts for the validation of collaborative-system metrics. It defines the quality characteristics of collaborative systems, proposes a metric to estimate the quality level of collaborative systems, and reports measurements of collaborative-system quality performed using specially designed software.

  14. Understanding Acceptance of Software Metrics--A Developer Perspective

    Science.gov (United States)

    Umarji, Medha

    2009-01-01

    Software metrics are measures of software products and processes. Metrics are widely used by software organizations to help manage projects, improve product quality and increase efficiency of the software development process. However, metrics programs tend to have a high failure rate in organizations, and developer pushback is one of the sources…

  15. Measuring reliability under epistemic uncertainty: Review on non-probabilistic reliability metrics

    Directory of Open Access Journals (Sweden)

    Kang Rui

    2016-06-01

    Full Text Available In this paper, a systematic review of non-probabilistic reliability metrics is conducted to assist the selection of appropriate reliability metrics for modeling the influence of epistemic uncertainty. Five frequently used non-probabilistic reliability metrics are critically reviewed: evidence-theory-based reliability metrics, interval-analysis-based reliability metrics, fuzzy-interval-analysis-based reliability metrics, possibility-theory-based reliability metrics (posbist reliability), and uncertainty-theory-based reliability metrics (belief reliability). It is pointed out that a qualified reliability metric able to consider the effect of epistemic uncertainty needs to (1) compensate for the conservatism in the estimates of component-level reliability metrics caused by epistemic uncertainty, and (2) satisfy the duality axiom; otherwise it might lead to paradoxical and confusing results in engineering applications. The five commonly used non-probabilistic reliability metrics are compared in terms of these two properties, and the comparison can serve as a basis for selecting appropriate reliability metrics.
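The duality axiom invoked above can be stated compactly (a sketch of the standard formulation; M denotes the measure underlying the reliability metric and A^c the complementary event):

```latex
% Duality axiom for a measure M underlying a reliability metric:
M\{A\} + M\{A^{c}\} = 1
% A possibility measure only guarantees
% \max\bigl(\mathrm{Pos}(A), \mathrm{Pos}(A^{c})\bigr) = 1,
% hence \mathrm{Pos}(A) + \mathrm{Pos}(A^{c}) \ge 1,
% so duality can fail for possibility-based reliability metrics.
```

This is why a metric can assign high "reliability" to both an event and its complement, the kind of paradoxical result the review warns about.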

  16. Comportement d'un béton à hautes performances à base de laitier ...

    African Journals Online (AJOL)

    The use of high-performance concrete (HPC) incorporating supplementary cementitious materials such as fly ash, silica fume, or hydraulic slag … reinforcement bars, which are in turn attacked. It is possible to modify the … sudden cooling with pressurized water; it is a sand with a 0/5 mm grading.

  17. Construction of self-dual codes in the Rosenbloom-Tsfasman metric

    Science.gov (United States)

    Krisnawati, Vira Hari; Nisa, Anzi Lina Ukhtin

    2017-12-01

    Linear codes are very basic and very useful in coding theory. Generally, a linear code is a code over a finite field in the Hamming metric. Among the most interesting families of codes, the family of self-dual codes is a very important one, because self-dual codes are among the best-known error-correcting codes. The concept of the Hamming metric has been developed into the Rosenbloom-Tsfasman metric (RT metric). The inner product in the RT metric differs from the Euclidean inner product that is used to define duality in the Hamming metric, and most codes that are self-dual in the Hamming metric are not self-dual in the RT metric. Moreover, the generator matrix is essential for constructing a code because it contains a basis of the code. Therefore, in this paper we give some theorems and methods to construct self-dual codes in the RT metric by considering properties of the inner product and the generator matrix. We also illustrate examples for each kind of construction.
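As a minimal sketch of how the RT metric differs from the Hamming metric (single-row case only; the full metric is defined on matrices), the RT weight of a vector is the 1-based position of its last nonzero coordinate, and the RT distance is the RT weight of the coordinatewise difference:

```python
def rt_weight(x):
    """Rosenbloom-Tsfasman weight: 1-based index of the last nonzero coordinate (0 for the zero vector)."""
    w = 0
    for i, xi in enumerate(x, start=1):
        if xi != 0:
            w = i
    return w

def rt_distance(x, y, q=2):
    """RT distance over F_q: RT weight of the coordinatewise difference mod q."""
    return rt_weight([(a - b) % q for a, b in zip(x, y)])

print(rt_weight([1, 0, 1, 0]))            # 3: last nonzero entry is in position 3
print(rt_distance([1, 1, 0], [1, 0, 0]))  # 2: vectors differ last in position 2
```

Note how position matters: [1,0,0,0] has Hamming weight 1 and RT weight 1, while [0,0,0,1] also has Hamming weight 1 but RT weight 4.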

  18. Chaotic inflation with metric and matter perturbations

    International Nuclear Information System (INIS)

    Feldman, H.A.; Brandenberger, R.H.

    1989-01-01

    A perturbative scheme to analyze the evolution of both metric and scalar field perturbations in an expanding universe is developed. The scheme is applied to study chaotic inflation with initial metric and scalar field perturbations present. It is shown that initial gravitational perturbations with wavelength smaller than the Hubble radius rapidly decay. The metric simultaneously picks up small perturbations determined by the matter inhomogeneities. Both are frozen in once the wavelength exceeds the Hubble radius. (orig.)

  19. AREVA - First quarter 2011 revenue: 2.7% growth like for like to 1.979 billion euros

    International Nuclear Information System (INIS)

    2011-01-01

    The group reported consolidated revenue of 1.979 billion euros in the first quarter of 2011, representing 2.2% growth compared with the first quarter of 2010 (+2.7% like for like). The increase was driven by the Mining/Front End Business Group (+20.8% LFL). Revenue from outside France rose 12.0% to 1.22 billion euros and represented 62% of total revenue. The impacts of foreign exchange and changes in consolidation scope were negligible during the period. The March 11 events in Japan had no significant impact on the group's performance in the first quarter of 2011. The group's backlog of 43.5 billion euros at March 31, 2011 was stable in relation to March 31, 2010. Growth in the backlog of the Mining/Front End and Renewable Energies Business Groups offset the partial depletion of the backlog in the Reactors and Services and Back End Business Groups as contracts were completed

  20. Two ten-billion-solar-mass black holes at the centres of giant elliptical galaxies.

    Science.gov (United States)

    McConnell, Nicholas J; Ma, Chung-Pei; Gebhardt, Karl; Wright, Shelley A; Murphy, Jeremy D; Lauer, Tod R; Graham, James R; Richstone, Douglas O

    2011-12-08

    Observational work conducted over the past few decades indicates that all massive galaxies have supermassive black holes at their centres. Although the luminosities and brightness fluctuations of quasars in the early Universe suggest that some were powered by black holes with masses greater than 10 billion solar masses, the remnants of these objects have not been found in the nearby Universe. The giant elliptical galaxy Messier 87 hosts the hitherto most massive known black hole, which has a mass of 6.3 billion solar masses. Here we report that NGC 3842, the brightest galaxy in a cluster at a distance from Earth of 98 megaparsecs, has a central black hole with a mass of 9.7 billion solar masses, and that a black hole of comparable or greater mass is present in NGC 4889, the brightest galaxy in the Coma cluster (at a distance of 103 megaparsecs). These two black holes are significantly more massive than predicted by linearly extrapolating the widely used correlations between black-hole mass and the stellar velocity dispersion or bulge luminosity of the host galaxy. Although these correlations remain useful for predicting black-hole masses in less massive elliptical galaxies, our measurements suggest that different evolutionary processes influence the growth of the largest galaxies and their black holes.

  1. Origins fourteen billion years of cosmic evolution

    CERN Document Server

    Tyson, Neil deGrasse

    2004-01-01

    Origins explores cosmic science's stunning new insights into the formation and evolution of our universe--of the cosmos, of galaxies and galaxy clusters, of stars within galaxies, of planets that orbit those stars, and of different forms of life that take us back to the first three seconds and forward through three billion years of life on Earth to today's search for life on other planets. Drawing on the current cross-pollination of geology, biology and astrophysics, Origins explains the thrilling daily breakthroughs in our knowledge of the universe from dark energy to life on Mars to the mysteries of space and time. Distilling complex science in clear and lively prose, co-authors Neil deGrasse Tyson and Donald Goldsmith conduct a galvanising tour of the cosmos revealing what the universe has been up to while turning part of itself into us.

  2. Phantom metrics with Killing spinors

    Directory of Open Access Journals (Sweden)

    W.A. Sabra

    2015-11-01

    Full Text Available We study metric solutions of Einstein-anti-Maxwell theory admitting Killing spinors. The analogue of the IWP metric which admits a space-like Killing vector is found and is expressed in terms of a complex function satisfying the wave equation in flat (2+1)-dimensional space-time. As examples, electric and magnetic Kasner spaces are constructed by allowing the solution to depend only on the time coordinate. Euclidean solutions are also presented.

  3. Pacific Northwest Laboratory environmental technologies available for deployment

    International Nuclear Information System (INIS)

    Slate, S.C.

    1994-07-01

    The Department of Energy created the Office of Environmental Management (EM) to conduct a 30-year-plus, multi-billion-dollar program to manage the wastes and clean up the legacy of over fifty years of nuclear material production. Across the DOE system there are thousands of sites containing millions of metric tons of buried wastes and contaminated soils and groundwater. Additionally, there are nearly 400,000 m³ of highly radioactive wastes in underground storage tanks, over 1,400 different mixed-waste streams, and thousands of contaminated surplus facilities, some exceeding 200,000 m² in size. Costs to remediate all these problems have been estimated at as much as several hundred billion dollars. The tremendous technical challenges of some of these problems and the high costs of using existing technologies led the Department to create the Office of Technology Development (TD) to lead an aggressive, integrated national program to develop and deploy the needed advanced, cost-effective technologies. This program is developing technologies for all major cleanup steps: assessment, characterization, retrieval, treatment, final stabilization, and disposal. Work is focused on the Department's five major problem areas: High-Level Waste Tank Remediation; Contaminant Plume Containment and Remediation; Mixed Waste Characterization, Treatment, and Disposal; Contaminated Soils and Buried Wastes; and Facility Transitioning, Decommissioning, and Final Disposal

  4. Embodied carbon dioxide emission at supra-national scale: A coalition analysis for G7, BRIC, and the rest of the world

    International Nuclear Information System (INIS)

    Chen, Z.M.; Chen, G.Q.

    2011-01-01

    Presented in this study is an empirical analysis of embodied carbon dioxide emissions induced by fossil fuel combustion for the world divided into three supra-national coalitions, i.e., G7, BRIC, and the rest of the world (ROW), via a multi-region input-output model for 2004. Embodied emission intensities for the three coalitions are calculated and compared, with market exchange rates and purchasing power parity used separately to investigate the difference between nominal and real production efficiencies. Emissions embodied in different economic activities such as production, consumption, import, and export are calculated and analyzed accordingly, and remarkable carbon trade imbalances associated with G7 (surplus of 1.53 billion tons, or 36% of its traded emissions) and BRIC (deficit of 1.37 billion tons, or 51% of its traded emissions), and an approximate balance for ROW (deficit of 0.16 billion tons, or 3% of its traded emissions), are concretely revealed. Carbon leakages associated with industry transfer and international trade are illustrated in terms of their impact on global climate policies. Last but not least, per capita consumption-based emissions for G7, BRIC, and ROW are determined as 12.95, 1.53, and 2.22 tons, respectively, and flexible abatement policies as well as equity in per capita entitlements are discussed. - Research highlights: → We compare the embodied CO₂ emissions in 2004 for G7, BRIC, and ROW. → Emissions embodied in production, consumption, import, and export are investigated. → Considerable CO₂ trade surplus and deficit are obtained by G7 and BRIC, respectively. → Per capita embodied emissions are 13, 1.5, and 2.2 tons for G7, BRIC, and ROW, respectively.
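The multi-region input-output accounting described above rests on the Leontief model: embodied emission intensities are the direct emission coefficients propagated through the Leontief inverse. A toy two-region sketch (all numbers illustrative, not from the study):

```python
import numpy as np

# Toy 2-region, 1-sector-per-region input-output system (illustrative numbers only).
A = np.array([[0.20, 0.10],   # technical coefficients: inputs per unit of output
              [0.05, 0.30]])
f = np.array([0.8, 0.4])      # direct CO2 emissions per unit of output
y = np.array([100.0, 200.0])  # final demand of each region

L = np.linalg.inv(np.eye(2) - A)  # Leontief inverse: total output per unit of final demand
epsilon = f @ L                   # embodied (direct + indirect) emission intensities
consumption_based = epsilon * y   # emissions embodied in each region's final demand
production_based = f * (L @ y)    # emissions emitted on each region's territory

# Global totals agree under either accounting; regional gaps are the "carbon trade" imbalance.
print(consumption_based.sum() - production_based.sum())  # 0.0 (up to float rounding)
```

The regional difference `consumption_based - production_based` is exactly the trade surplus/deficit the study reports for G7 and BRIC.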

  5. Tonometer calibration in Brasília, Brazil Calibragem de tonômetros em Brasília, Brasil

    Directory of Open Access Journals (Sweden)

    Fernanda Pires da Silva Abrão

    2009-06-01

    Full Text Available PURPOSE: To determine calibration errors of Goldmann applanation tonometers in ophthalmic clinics of Brasília, Brazil, and to correlate the findings with variables related to tonometer model and utilization. METHODS: Tonometers from ophthalmic clinics in Brasília, Brazil, were checked for calibration errors. A standard Goldmann applanation tonometer checking tool was used to assess the calibration error. A single trained individual made all verifications, with masked reading of the results. Data on the model, age, daily use, frequency of calibration checking, and the nature of the ophthalmic department (private or public) were collected and correlated with the observed errors. RESULTS: One hundred tonometers were checked for calibration. Forty-seven percent (47/100) were outside the 1 mmHg range at one or more checking points. Tonometers mounted on a slit lamp, less than 5 years old, used on fewer than 20 patients daily, checked for calibration on a yearly basis, and those from private offices exhibited lower rates of inaccuracy, but only the first variable was statistically significant. Sixty-one percent of tonometers in public hospitals were out of calibration. CONCLUSION: Calibration of tonometers in the capital of Brazil is poor; those from general hospitals are worse, and this can lead to inaccurate detection and assessment of glaucoma patients, above all in the population under government assistance.

  6. Invariant metric for nonlinear symplectic maps

    Indian Academy of Sciences (India)

    In this paper, we construct an invariant metric in the space of homogeneous polynomials of a given degree (≥ 3). The homogeneous polynomials specify a nonlinear symplectic map which in turn represents a Hamiltonian system. By minimizing the norm constructed out of this metric as a function of system parameters, we ...

  7. Acceptance test report for the Westinghouse 100 ton hydraulic trailer

    International Nuclear Information System (INIS)

    Barrett, R.A.

    1995-01-01

    The SY-101 Equipment Removal System 100 Ton Hydraulic Trailer was designed and built by KAMP Systems, Inc. Performance of the Acceptance Test Procedure at KAMP's facility in Ontario, California (termed Phase 1 in this report) was interrupted by discrepancies noted with the main hydraulic cylinder. The main cylinder was removed and sent to REMCO for repair while the trailer was sent to Lampson's facility in Pasco, Washington. The Acceptance Test Procedure was modified and performance resumed at Lampson (termed Phase 2 in this report) after receipt of the repaired cylinder. At the successful conclusion of Phase 2 testing the trailer was accepted as meeting all the performance criteria specified

  8. Two-dimensional manifolds with metrics of revolution

    International Nuclear Information System (INIS)

    Sabitov, I Kh

    2000-01-01

    This is a study of the topological and metric structure of two-dimensional manifolds with a metric that is locally a metric of revolution. In the case of compact manifolds this problem can be thoroughly investigated, and in particular it is explained why there are no closed analytic surfaces of revolution in R³ other than a sphere and a torus (moreover, in the smoothness class C^∞ such surfaces, understood in a certain generalized sense, exist in any topological class)

  9. Gravitational lensing in metric theories of gravity

    International Nuclear Information System (INIS)

    Sereno, Mauro

    2003-01-01

    Gravitational lensing in metric theories of gravity is discussed. I introduce a generalized approximate metric element, inclusive of both post-post-Newtonian contributions and a gravitomagnetic field. Following Fermat's principle and standard hypotheses, I derive the time delay function and deflection angle caused by an isolated mass distribution. Several astrophysical systems are considered. In most of the cases, the gravitomagnetic correction offers the best perspectives for an observational detection. Actual measurements distinguish only marginally different metric theories from each other
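For orientation, the standard weak-field, point-lens result that such metric-theory analyses generalize can be written with the post-Newtonian parameter γ (a sketch of the textbook formula; γ = 1 in general relativity):

```latex
% Deflection angle of a light ray passing a point mass M
% at impact parameter b, in a metric theory with PPN parameter \gamma:
\hat{\alpha} = \frac{1+\gamma}{2}\,\frac{4GM}{c^{2}b}
% In general relativity (\gamma = 1) this reduces to \hat{\alpha} = 4GM/(c^{2}b).
```

Post-post-Newtonian and gravitomagnetic terms, as discussed in the abstract, add corrections to this leading-order deflection.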

  10. The uniqueness of the Fisher metric as information metric

    Czech Academy of Sciences Publication Activity Database

    Le, Hong-Van

    2017-01-01

    Roč. 69, č. 4 (2017), s. 879-896 ISSN 0020-3157 Institutional support: RVO:67985840 Keywords : Chentsov’s theorem * mixed topology * monotonicity of the Fisher metric Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 1.049, year: 2016 https://link.springer.com/article/10.1007%2Fs10463-016-0562-0

  11. Techniques et systèmes de renfort des structures en béton

    CERN Document Server

    Miranda-Vizuete, J

    2000-01-01

    Although called "artificial stone", concrete is a living material that changes throughout its service life. It changes because the structure it belongs to itself undergoes changes. These changes arise either from modifications or renovations, or from an alteration of its load-bearing capacity through increased loads. In most cases they call for strengthening. Strengthening a concrete structure consists of improving the mechanical characteristics of its component elements so that it offers better strength both in service and at ultimate resistance. This document presents the most widely used strengthening methods, including the incorporation of steel profiles, the enlargement of the structural cross-section, and the more recent technique of bonding external composite materials.

  12. Hybrid metric-Palatini stars

    Science.gov (United States)

    Danilǎ, Bogdan; Harko, Tiberiu; Lobo, Francisco S. N.; Mak, M. K.

    2017-02-01

    We consider the internal structure and the physical properties of specific classes of neutron, quark and Bose-Einstein condensate stars in the recently proposed hybrid metric-Palatini gravity theory, which is a combination of the metric and Palatini f(R) formalisms. The theory is very successful in accounting for the observed phenomenology, since it unifies local constraints at the Solar System level with the late-time cosmic acceleration, even if the scalar field is very light. In this paper, we derive the equilibrium equations for a spherically symmetric configuration (mass continuity and Tolman-Oppenheimer-Volkoff) in the framework of the scalar-tensor representation of the hybrid metric-Palatini theory, and we investigate their solutions numerically for different equations of state of neutron and quark matter, adopting a Higgs-type form for the scalar-field potential. It turns out that the scalar-tensor definition of the potential can be represented as a Clairaut differential equation, and provides an explicit form for f(R) given by f(R) ~ R + Λ_eff, where Λ_eff is an effective cosmological constant. Furthermore, stellar models described by the stiff-fluid, radiation-like, bag-model and Bose-Einstein condensate equations of state are explicitly constructed in both general relativity and hybrid metric-Palatini gravity, allowing an in-depth comparison between the predictions of these two gravitational theories. As a general result, for all the considered equations of state, hybrid-gravity stars are more massive than their general relativistic counterparts. Furthermore, two classes of stellar models corresponding to two particular choices of the functional form of the scalar field (a constant value and a logarithmic form, respectively) are also investigated. Interestingly enough, in the case of a constant scalar field the equation of state of the matter takes the form of the bag-model equation of state describing…
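For reference, the general-relativistic form of the equilibrium equations named in the abstract is the following (the hybrid metric-Palatini theory adds scalar-field contributions on top of these):

```latex
% Mass continuity and Tolman-Oppenheimer-Volkoff equations in general relativity:
\frac{dm}{dr} = 4\pi r^{2}\rho,
\qquad
\frac{dP}{dr} = -\,\frac{G\left(\rho + P/c^{2}\right)\left(m + 4\pi r^{3}P/c^{2}\right)}
                       {r^{2}\left(1 - 2Gm/(rc^{2})\right)}
```

Integrating these outward from the centre with a chosen equation of state P(ρ) until P = 0 yields the stellar radius and total mass, which is the comparison the paper carries out in both theories.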

  13. The universal connection and metrics on moduli spaces

    International Nuclear Information System (INIS)

    Massamba, Fortune; Thompson, George

    2003-11-01

    We introduce a class of metrics on gauge theoretic moduli spaces. These metrics are made out of the universal matrix that appears in the universal connection construction of M. S. Narasimhan and S. Ramanan. As an example we construct metrics on the c_2 = 1 SU(2) moduli space of instantons on R^4 for various universal matrices. (author)

  14. Reproducibility of graph metrics in fMRI networks

    Directory of Open Access Journals (Sweden)

    Qawi K Telesford

    2010-12-01

    Full Text Available The reliability of graph metrics calculated in network analysis is essential to the interpretation of complex network organization. These graph metrics are used to deduce the small-world properties in networks. In this study, we investigated the test-retest reliability of graph metrics from functional magnetic resonance imaging (fMRI) data collected for two runs in 45 healthy older adults. Graph metrics were calculated on data for both runs and compared using intraclass correlation coefficient (ICC) statistics and Bland-Altman (BA) plots. ICC scores describe the level of absolute agreement between two measurements and provide a measure of reproducibility. For mean graph metrics, ICC scores were high for clustering coefficient (ICC = 0.86), global efficiency (ICC = 0.83), path length (ICC = 0.79), and local efficiency (ICC = 0.75); the ICC score for degree was found to be low (ICC = 0.29). ICC scores were also used to generate reproducibility maps in brain space to test voxel-wise reproducibility for unsmoothed and smoothed data. Reproducibility was uniform across the brain for global efficiency and path length, but was only high in network hubs for clustering coefficient, local efficiency and degree. BA plots were used to test the measurement repeatability of all graph metrics. All graph metrics fell within the limits for repeatability. Together, these results suggest that with the exception of degree, mean graph metrics are reproducible and suitable for clinical studies. Further exploration is warranted to better understand reproducibility across the brain on a voxel-wise basis.
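    The ICC scores above come from a two-way ANOVA decomposition of a subjects-by-runs matrix. The sketch below is a minimal, generic implementation of ICC(2,1) (two-way random effects, absolute agreement, single measure), not the authors' code; the sample values are invented for illustration.

```python
import numpy as np

def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single measure.

    data: (n_subjects, k_runs) array of one graph metric, one column per run.
    """
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    # Sums of squares from the two-way ANOVA decomposition
    ss_rows = k * np.sum((data.mean(axis=1) - grand) ** 2)   # between subjects
    ss_cols = n * np.sum((data.mean(axis=0) - grand) ** 2)   # between runs
    ss_err = np.sum((data - grand) ** 2) - ss_rows - ss_cols  # residual
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Perfect run-to-run agreement yields ICC = 1; run-to-run noise pulls it below 1.
print(icc_2_1([[0.30, 0.30], [0.42, 0.42], [0.55, 0.55], [0.61, 0.61]]))
print(icc_2_1([[0.30, 0.35], [0.42, 0.40], [0.55, 0.50], [0.61, 0.66]]))
```

    In practice one would compute this per metric (clustering coefficient, global efficiency, and so on) across all 45 subjects' two runs.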

  15. Greenhouse gas emissions in Hawaii. Household and visitor expenditure analysis

    International Nuclear Information System (INIS)

    Konan, Denise Eby; Chan, Hing Ling

    2010-01-01

    This paper focuses on petroleum use and greenhouse gas emissions associated with economic activities in Hawaii. Data on economic activity, petroleum consumption by type (gasoline, diesel, aviation fuel, residual, propane), and emissions factors are compiled and analyzed. In the baseline year 1997, emissions are estimated to total approximately 23.2 million metric tons of carbon, 181 thousand metric tons of nitrous oxide, and 31 thousand metric tons of methane in terms of carbon-equivalent global warming potential over a 100-year horizon. Air transportation, electricity, and other transportation are the key economic activities responsible for GHG emissions associated with fossil fuel use. More than 22% of total emissions are attributed to visitor expenditures. On a per person per annum basis, emission rates generated by visitor demand are estimated to be higher than those of residents by a factor of 4.3 for carbon, 3.2 for methane, and 4.8 for nitrous oxide. (author)

  16. Complexity Metrics for Workflow Nets

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M.P.

    2009-01-01

    analysts have difficulties grasping the dynamics implied by a process model. Recent empirical studies show that people make numerous errors when modeling complex business processes, e.g., about 20 percent of the EPCs in the SAP reference model have design flaws resulting in potential deadlocks, livelocks, etc. It seems obvious that the complexity of the model contributes to design errors and a lack of understanding. It is not easy to measure complexity, however. This paper presents three complexity metrics that have been implemented in the process analysis tool ProM. The metrics are defined for a subclass of Petri nets named Workflow nets, but the results can easily be applied to other languages. To demonstrate the applicability of these metrics, we have applied our approach and tool to 262 relatively complex Protos models made in the context of various student projects. This allows us to validate...

  17. Sustainability Metrics: The San Luis Basin Project

    Science.gov (United States)

    Sustainability is about promoting humanly desirable dynamic regimes of the environment. Metrics: ecological footprint, net regional product, exergy, emergy, and Fisher Information. Adaptive management: (1) metrics assess problem, (2) specific problem identified, and (3) managemen...

  18. Goedel-type metrics in various dimensions

    International Nuclear Information System (INIS)

    Guerses, Metin; Karasu, Atalay; Sarioglu, Oezguer

    2005-01-01

    Goedel-type metrics are introduced and used in producing charged dust solutions in various dimensions. The key ingredient is a (D - 1)-dimensional Riemannian geometry which is then employed in constructing solutions to the Einstein-Maxwell field equations with a dust distribution in D dimensions. The only essential field equation in the procedure turns out to be the source-free Maxwell's equation in the relevant background. Similarly the geodesics of this type of metric are described by the Lorentz force equation for a charged particle in the lower dimensional geometry. It is explicitly shown with several examples that Goedel-type metrics can be used in obtaining exact solutions to various supergravity theories and in constructing spacetimes that contain both closed timelike and closed null curves and that contain neither of these. Among the solutions that can be established using non-flat backgrounds, such as the Tangherlini metrics in (D - 1)-dimensions, there exists a class which can be interpreted as describing black-hole-type objects in a Goedel-like universe

  19. Standardised metrics for global surgical surveillance.

    Science.gov (United States)

    Weiser, Thomas G; Makary, Martin A; Haynes, Alex B; Dziekan, Gerald; Berry, William R; Gawande, Atul A

    2009-09-26

    Public health surveillance relies on standardised metrics to evaluate disease burden and health system performance. Such metrics have not been developed for surgical services despite increasing volume, substantial cost, and high rates of death and disability associated with surgery. The Safe Surgery Saves Lives initiative of WHO's Patient Safety Programme has developed standardised public health metrics for surgical care that are applicable worldwide. We assembled an international panel of experts to develop and define metrics for measuring the magnitude and effect of surgical care in a population, while taking into account economic feasibility and practicability. This panel recommended six measures for assessing surgical services at a national level: number of operating rooms, number of operations, number of accredited surgeons, number of accredited anaesthesia professionals, day-of-surgery death ratio, and postoperative in-hospital death ratio. We assessed the feasibility of gathering such statistics at eight diverse hospitals in eight countries and incorporated them into the WHO Guidelines for Safe Surgery, in which methods for data collection, analysis, and reporting are outlined.

  20. Developing a Security Metrics Scorecard for Healthcare Organizations.

    Science.gov (United States)

    Elrefaey, Heba; Borycki, Elizabeth; Kushniruk, Andrea

    2015-01-01

    In healthcare, information security is a key aspect of protecting a patient's privacy and ensuring systems availability to support patient care. Security managers need to measure the performance of security systems and this can be achieved by using evidence-based metrics. In this paper, we describe the development of an evidence-based security metrics scorecard specific to healthcare organizations. Study participants were asked to comment on the usability and usefulness of a prototype of a security metrics scorecard that was developed based on current research in the area of general security metrics. Study findings revealed that scorecards need to be customized for the healthcare setting in order for the security information to be useful and usable in healthcare organizations. The study findings resulted in the development of a security metrics scorecard that matches the healthcare security experts' information requirements.

  1. Landscape pattern metrics and regional assessment

    Science.gov (United States)

    O'Neill, R. V.; Riitters, K.H.; Wickham, J.D.; Jones, K.B.

    1999-01-01

    The combination of remote imagery data, geographic information systems software, and landscape ecology theory provides a unique basis for monitoring and assessing large-scale ecological systems. The unique feature of the work has been the need to develop and interpret quantitative measures of spatial pattern: the landscape indices. This article reviews what is known about the statistical properties of these pattern metrics and suggests some additional metrics based on island biogeography, percolation theory, hierarchy theory, and economic geography. Assessment applications of this approach have required interpreting the pattern metrics in terms of specific environmental endpoints, such as wildlife and water quality, and research into how to represent synergistic effects of many overlapping sources of stress.

  2. Anhui Tongling Invests 1 Billion Yuan to Set up “Copper Industry Fund”

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    On September 12, the signing ceremony for "Anhui Copper Industry Fund" set up by Anhui Tongling Development & Investment Group Co., Ltd. and Shanghai V. Stone Investment Management Co., Ltd. was held in Tongling. The fund totals 1 billion yuan.

  3. Self-benchmarking Guide for Cleanrooms: Metrics, Benchmarks, Actions

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul; Sartor, Dale; Tschudi, William

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.
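    As a concrete illustration of the kind of whole-building metric and benchmark comparison the guide outlines, a minimal energy-use-intensity (EUI) check might look like the following. The numbers and benchmark thresholds here are invented for illustration and are not taken from the Labs21 database or the guide's spreadsheet templates.

```python
def energy_use_intensity(annual_kwh, floor_area_m2):
    """Whole-building metric: annual site energy per unit floor area (kWh/m2/yr)."""
    return annual_kwh / floor_area_m2

def benchmark_action(eui, typical=800.0, good_practice=500.0):
    """Map an EUI to a coarse action category (thresholds are illustrative only)."""
    if eui <= good_practice:
        return "at good-practice level; maintain current operation"
    if eui <= typical:
        return "near typical; review system-level metrics for savings"
    return "above typical; prioritize an energy audit"

# Hypothetical facility: 2.6 GWh/yr over 4,000 m2 gives an EUI of 650 kWh/m2/yr
eui = energy_use_intensity(annual_kwh=2_600_000, floor_area_m2=4_000)
print(benchmark_action(eui))
```

    The same pattern extends to the system-level metrics in the guide: compute the metric, compare against the benchmark range, and infer an action.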

  4. Metrics Are Needed for Collaborative Software Development

    Directory of Open Access Journals (Sweden)

    Mojgan Mohtashami

    2011-10-01

    Full Text Available There is a need for metrics for inter-organizational collaborative software development projects, encompassing management and technical concerns. In particular, metrics are needed that are aimed at the collaborative aspect itself, such as readiness for collaboration, the quality and/or the costs and benefits of collaboration in a specific ongoing project. We suggest questions and directions for such metrics, spanning the full lifespan of a collaborative project, from considering the suitability of collaboration through evaluating ongoing projects to final evaluation of the collaboration.

  5. Predicting class testability using object-oriented metrics

    OpenAIRE

    Bruntink, Magiel; Deursen, Arie

    2004-01-01

    In this paper we investigate factors of the testability of object-oriented software systems. The starting point is given by a study of the literature to obtain both an initial model of testability and existing OO metrics related to testability. Subsequently, these metrics are evaluated by means of two case studies of large Java systems for which JUnit test cases exist. The goal of this paper is to define and evaluate a set of metrics that can be used to assess the testability of t...

  6. Software metrics a rigorous and practical approach

    CERN Document Server

    Fenton, Norman

    2014-01-01

    A Framework for Managing, Measuring, and Predicting Attributes of Software Development Products and ProcessesReflecting the immense progress in the development and use of software metrics in the past decades, Software Metrics: A Rigorous and Practical Approach, Third Edition provides an up-to-date, accessible, and comprehensive introduction to software metrics. Like its popular predecessors, this third edition discusses important issues, explains essential concepts, and offers new approaches for tackling long-standing problems.New to the Third EditionThis edition contains new material relevant

  7. Hermitian-Einstein metrics on parabolic stable bundles

    International Nuclear Information System (INIS)

    Li Jiayu; Narasimhan, M.S.

    1995-12-01

    Let M-bar be a compact complex manifold of complex dimension two with a smooth Kaehler metric and D a smooth divisor on M-bar. If E is a rank 2 holomorphic vector bundle on M-bar with a stable parabolic structure along D, we prove the existence of a metric on E' (the restriction of E to M-bar \ D, compatible with the parabolic structure) which is Hermitian-Einstein with respect to the restriction of the Kaehler metric to M-bar \ D. A converse is also proved. (author). 24 refs

  8. Transfer of Plutonium-Uranium Extraction Plant and N Reactor irradiated fuel for storage at the 105-KE and 105-KW fuel storage basins, Hanford Site, Richland Washington

    International Nuclear Information System (INIS)

    1995-07-01

    The U.S. Department of Energy (DOE) needs to remove irradiated fuel from the Plutonium-Uranium Extraction (PUREX) Plant and N Reactor at the Hanford Site, Richland, Washington, to stabilize the facilities in preparation for decontamination and decommissioning (D&D) and to reduce the cost of maintaining the facilities prior to D&D. DOE is proposing to transfer approximately 3.9 metric tons (4.3 short tons) of unprocessed irradiated fuel, by rail, from the PUREX Plant in the 200 East Area and the 105 N Reactor (N Reactor) fuel storage basin in the 100 N Area, to the 105-KE and 105-KW fuel storage basins (K Basins) in the 100 K Area. The fuel would be placed in storage at the K Basins, along with fuel presently stored, and would be dispositioned in the same manner as the other existing irradiated fuel inventory stored in the K Basins. The fuel transfer to the K Basins would consolidate storage of fuels irradiated at N Reactor and the Single Pass Reactors. Approximately 2.9 metric tons (3.2 short tons) of single-pass production reactor, aluminum clad (AC) irradiated fuel in four fuel baskets have been placed into four overpack buckets and stored in the PUREX Plant canyon storage basin to await shipment. In addition, about 0.5 metric tons (0.6 short tons) of zircaloy clad (ZC) and a few AC irradiated fuel elements have been recovered from the PUREX dissolver cell floors, placed in wet fuel canisters, and stored on the canyon deck. A small quantity of ZC fuel, in the form of fuel fragments and chips, is suspected to be in the sludge at the bottom of N Reactor's fuel storage basin. As part of the required stabilization activities at N Reactor, this sludge would be removed from the basin and any identifiable pieces of fuel elements would be recovered, placed in open canisters, and stored in lead lined casks in the storage basin to await shipment. A maximum of 0.5 metric tons (0.6 short tons) of fuel pieces is expected to be recovered

  9. Electron capture detection of sulphur gases in carbon dioxide at the parts-per-billion level

    International Nuclear Information System (INIS)

    Pick, M.E.

    1979-01-01

    A gas chromatograph with an electron capture detector has been used to determine sulphur gases in CO2 at the parts-per-billion level, with particular application to the analysis of coolant from CO2-cooled nuclear reactors. For COS, CS2, CH3SH, H2S and (CH3)2S2 the detector has a sensitivity comparable with the more commonly used flame photometric detector, but it is much less sensitive towards (CH3)2S and thiophene. In addition, the paper describes a simple method for trapping sulphur gases which might enable detection of sub-parts-per-billion levels of sulphur compounds. (Auth.)

  10. Coverage Metrics for Model Checking

    Science.gov (United States)

    Penix, John; Visser, Willem; Norvig, Peter (Technical Monitor)

    2001-01-01

    When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.

  11. Future of the PCI Readmission Metric.

    Science.gov (United States)

    Wasfy, Jason H; Yeh, Robert W

    2016-03-01

    Between 2013 and 2014, the Centers for Medicare and Medicaid Services and the National Cardiovascular Data Registry publicly reported risk-adjusted 30-day readmission rates after percutaneous coronary intervention (PCI) as a pilot project. A key strength of this public reporting effort included risk adjustment with clinical rather than administrative data. Furthermore, because readmission after PCI is common, expensive, and preventable, this metric has substantial potential to improve quality and value in American cardiology care. Despite this, concerns about the metric exist. For example, few PCI readmissions are caused by procedural complications, limiting the extent to which improved procedural technique can reduce readmissions. Also, similar to other readmission measures, PCI readmission is associated with socioeconomic status and race. Accordingly, the metric may unfairly penalize hospitals that care for underserved patients. Perhaps in the context of these limitations, Centers for Medicare and Medicaid Services has not yet included PCI readmission among metrics that determine Medicare financial penalties. Nevertheless, provider organizations may still wish to focus on this metric to improve value for cardiology patients. PCI readmission is associated with low-risk chest discomfort and patient anxiety. Therefore, patient education, improved triage mechanisms, and improved care coordination offer opportunities to minimize PCI readmissions. Because PCI readmission is common and costly, reducing PCI readmission offers provider organizations a compelling target to improve the quality of care, and also performance in contracts involving shared financial risk. © 2016 American Heart Association, Inc.

  12. Lean and Green Hand Surgery.

    Science.gov (United States)

    Van Demark, Robert E; Smith, Vanessa J S; Fiegen, Anthony

    2018-02-01

    Health care in the United States is both expensive and wasteful. The cost of health care in the United States continues to increase every year. Health care spending for 2016 is estimated at $3.35 trillion. Per capita spending ($10,345 per person) is more than twice the average of other developed countries. The United States also leads the world in solid waste production (624,700 metric tons of waste in 2011). The health care industry is second only to the food industry in annual waste production. Each year, health care facilities in the United States produce 4 billion pounds of waste (660 tons per day), with as much as 70%, or around 2.8 billion pounds, produced directly by operating rooms. Waste disposal also accounts for up to 20% of a hospital's annual environmental services budget. Since 1992, waste production by hospitals has increased annually by a rate of at least 15%, due in part to the increased usage of disposables. Reduction in operating room waste would decrease both health care costs and potential environmental hazards. In 2015, the American Association for Hand Surgery along with the American Society for Surgery of the Hand, American Society for Peripheral Nerve Surgery, and the American Society of Reconstructive Microsurgery began the "Lean and Green" surgery project to reduce the amount of waste generated by hand surgery. We recently began our own "Lean and Green" project in our institution. Using "minor field sterility" surgical principles and Wide Awake Local Anesthesia No Tourniquet (WALANT), both surgical costs and surgical waste were decreased while maintaining patient safety and satisfaction. As the current reimbursement model changes from quantity to quality, "Lean and Green" surgery will play a role in the future health care system. Copyright © 2018 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  13. Model assessment using a multi-metric ranking technique

    Science.gov (United States)

    Fitzpatrick, P. J.; Lau, Y.; Alaka, G.; Marks, F.

    2017-12-01

    Validation comparisons of multiple models present challenges when skill levels are similar, especially in regimes dominated by the climatological mean. Assessing skill separation requires advanced validation metrics and identifying adeptness in extreme events, while maintaining simplicity for management decisions. Flexibility for operations is also an asset. This work postulates a weighted tally and consolidation technique which ranks results by multiple types of metrics. Variables include absolute error, bias, acceptable absolute error percentages, outlier metrics, model efficiency, Pearson correlation, Kendall's Tau, reliability index, multiplicative gross error, and root mean squared differences. Other metrics, such as root mean square difference and rank correlation, were also explored, but removed when the information was discovered to be generally duplicative of other metrics. While equal weights are applied, weights could be altered depending on preferred metrics. Two examples are shown comparing ocean models' currents and tropical cyclone products, including experimental products. The importance of using magnitude and direction for tropical cyclone track forecasts instead of distance, along-track, and cross-track errors is discussed. Tropical cyclone intensity and structure prediction are also assessed. Vector correlations are not included in the ranking process, but were found useful in an independent context, and will be briefly reported.
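    The weighted tally and consolidation step can be sketched generically: rank every model under each metric, weight the ranks (equal weights by default, as in the abstract), and sum the tallies. This is an illustrative reconstruction, not the authors' implementation; the metric names and scores below are made up.

```python
def consolidate_rankings(metric_scores, lower_is_better, weights=None):
    """Weighted tally ranking across multiple validation metrics.

    metric_scores: {metric: {model: value}}
    lower_is_better: {metric: bool} (True for error-type metrics)
    weights: optional {metric: weight}; unspecified metrics get 1.0
    Returns the models ordered best-first (lowest weighted rank tally).
    """
    models = list(next(iter(metric_scores.values())))
    tally = {m: 0.0 for m in models}
    for metric, vals in metric_scores.items():
        w = (weights or {}).get(metric, 1.0)
        ordered = sorted(models, key=lambda m: vals[m],
                         reverse=not lower_is_better[metric])
        for rank, model in enumerate(ordered, start=1):
            tally[model] += w * rank  # rank 1 is best under this metric
    return sorted(models, key=lambda m: tally[m])

scores = {
    "absolute_error": {"modelA": 0.8, "modelB": 1.3, "modelC": 1.1},
    "pearson_r":      {"modelA": 0.92, "modelB": 0.70, "modelC": 0.85},
}
print(consolidate_rankings(scores,
                           {"absolute_error": True, "pearson_r": False}))
```

    Altering the weights dictionary is how a preferred metric (say, an extreme-event measure) would be emphasized without changing the consolidation logic.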

  14. Metric learning for DNA microarray data analysis

    International Nuclear Information System (INIS)

    Takeuchi, Ichiro; Nakagawa, Masao; Seto, Masao

    2009-01-01

    In many microarray studies, gene set selection is an important preliminary step for subsequent main tasks such as tumor classification, cancer subtype identification, etc. In this paper, we investigate the possibility of using metric learning as an alternative to gene set selection. We develop a simple metric learning algorithm aimed at microarray data analysis. Exploiting a property of the algorithm, we introduce a novel approach for extending the metric learning to be adaptive. We apply the algorithm to previously studied microarray data on malignant lymphoma subtype identification.
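    The central object in this kind of metric learning is a learned (often diagonal) Mahalanobis matrix that reweights genes rather than discarding them; hard gene selection is the special case where some weights are forced to zero. The sketch below illustrates that idea generically and is not the authors' algorithm; the expression values and weights are invented.

```python
import numpy as np

def mahalanobis_distance(x, y, diag_weights):
    """Distance between expression profiles under a diagonal Mahalanobis metric.

    diag_weights: nonnegative per-gene weights; a zero weight is equivalent
    to removing that gene (the gene-selection special case).
    """
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.sqrt(np.sum(diag_weights * d * d)))

x = np.array([2.0, 0.5, 1.0])   # hypothetical 3-gene expression profiles
y = np.array([0.0, 0.5, 4.0])

# Unit weights reduce to Euclidean distance; reweighting changes which
# samples look close, which is the effect a learned metric exploits.
print(mahalanobis_distance(x, y, np.array([1.0, 1.0, 1.0])))
print(mahalanobis_distance(x, y, np.array([0.0, 1.0, 4.0])))
```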

  15. China’s primary energy demands in 2020: Predictions from an MPSO–RBF estimation model

    International Nuclear Information System (INIS)

    Yu Shiwei; Wei Yiming; Wang Ke

    2012-01-01

    Highlights: ► A Mix-encoding PSO and RBF network-based energy demand forecasting model is proposed. ► The proposed model has simpler structure and smaller estimated errors than other ANN models. ► China’s energy demand could reach 6.25 billion, 4.16 billion, and 5.29 billion tons of coal equivalent (tce). ► China’s energy efficiency in 2020 will increase by more than 30% compared with 2009. - Abstract: In the present study, a Mix-encoding Particle Swarm Optimization and Radial Basis Function (MPSO–RBF) network-based energy demand forecasting model is proposed and applied to forecast China’s energy consumption until 2020. The energy demand is analyzed for the period from 1980 to 2009 based on GDP, population, proportion of industry in GDP, urbanization rate, and share of coal energy. The results reveal that the proposed MPSO–RBF based model has fewer hidden nodes and smaller estimated errors compared with other ANN-based estimation models. The average annual growth of China’s energy demand will be 6.70%, 2.81%, and 5.08% for the period between 2010 and 2020 in three scenarios and could reach 6.25 billion, 4.16 billion, and 5.29 billion tons of coal equivalent in 2020. Regardless of future scenarios, China’s energy efficiency in 2020 will increase by more than 30% compared with 2009.
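    The RBF network at the core of the MPSO–RBF model maps the five predictors (GDP, population, industry share, urbanization rate, coal share) to energy demand through Gaussian hidden units. The forward pass can be sketched as below; the centers, widths, and weights are placeholders standing in for the values the mixed-encoding PSO would optimize, not parameters from the paper.

```python
import numpy as np

def rbf_forward(x, centers, widths, out_weights, bias):
    """Forward pass of a Gaussian radial basis function network.

    x: input vector (normalized predictor variables for one year)
    centers: (n_hidden, n_inputs) RBF centers
    widths: (n_hidden,) Gaussian widths
    out_weights: (n_hidden,) linear output-layer weights
    """
    dist2 = np.sum((centers - x) ** 2, axis=1)      # squared distance to each center
    phi = np.exp(-dist2 / (2.0 * widths ** 2))      # Gaussian hidden activations
    return float(phi @ out_weights + bias)

# Placeholder parameters: in the paper, MPSO selects both the network
# structure (number of hidden nodes) and these continuous values.
centers = np.array([[0.2, 0.4, 0.3, 0.5, 0.7],
                    [0.8, 0.6, 0.5, 0.4, 0.3]])
widths = np.array([0.5, 0.5])
x = np.array([0.2, 0.4, 0.3, 0.5, 0.7])  # hypothetical normalized predictors
print(rbf_forward(x, centers, widths, np.array([1.0, 1.0]), 0.0))
```

    An input lying exactly on a center activates that unit fully (phi = 1), which is why center placement dominates the fit and is worth optimizing with PSO.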

  16. Comparison of luminance based metrics in different lighting conditions

    DEFF Research Database (Denmark)

    Wienold, J.; Kuhn, T.E.; Christoffersen, J.

    In this study, we evaluate established and newly developed metrics for predicting glare using data from three different research studies. The evaluation covers two different targets: 1. How well does the user's perception of glare magnitude correlate with the prediction of the glare metrics? 2. How well do the glare metrics describe the subjects' disturbance by glare? We applied Spearman correlations, logistic regressions and an accuracy evaluation based on an ROC analysis. The results show that five of the twelve investigated metrics fail at least one of the statistical tests. The other seven metrics, CGI, modified DGI, DGP, Ev, average luminance of the image Lavg, UGP and UGR, pass all statistical tests. DGP, CGI, DGI_mod and UGP have the largest AUC and might be slightly more robust. The accuracy of the predictions of the aforementioned seven metrics for the disturbance by glare lies...

  17. A bi-metric theory of gravitation

    International Nuclear Information System (INIS)

    Rosen, N.

    1975-01-01

    The bi-metric theory of gravitation proposed previously is simplified in that the auxiliary conditions are discarded, the two metric tensors being tied together only by means of the boundary conditions. Some of the properties of the field of a particle are investigated; there is no black hole, and it appears that no gravitational collapse can take place. Although the proposed theory and general relativity are at present observationally indistinguishable, some differences are pointed out which may some day be susceptible of observation. An alternative bi-metric theory is considered which gives for the precession of the perihelion 5/6 of the value given by general relativity; it seems less satisfactory than the present theory from the aesthetic point of view. (author)

  18. Rapid emergence of subaerial landmasses and onset of a modern hydrologic cycle 2.5 billion years ago.

    Science.gov (United States)

    Bindeman, I N; Zakharov, D O; Palandri, J; Greber, N D; Dauphas, N; Retallack, G J; Hofmann, A; Lackey, J S; Bekker, A

    2018-05-01

    The history of the growth of continental crust is uncertain, and several different models that involve a gradual, decelerating, or stepwise process have been proposed [1-4]. Even more uncertain is the timing and the secular trend of the emergence of most landmasses above the sea (subaerial landmasses), with estimates ranging from about one billion to three billion years ago [5-7]. The area of emerged crust influences global climate feedbacks and the supply of nutrients to the oceans [8], and therefore connects Earth's crustal evolution to surface environmental conditions [9-11]. Here we use the triple-oxygen-isotope composition of shales from all continents, spanning 3.7 billion years, to provide constraints on the emergence of continents over time. Our measurements show a stepwise total decrease of 0.08 per mille in the average triple-oxygen-isotope value of shales across the Archaean-Proterozoic boundary. We suggest that our data are best explained by a shift in the nature of water-rock interactions, from near-coastal in the Archaean era to predominantly continental in the Proterozoic, accompanied by a decrease in average surface temperatures. We propose that this shift may have coincided with the onset of a modern hydrological cycle owing to the rapid emergence of continental crust with near-modern average elevation and aerial extent roughly 2.5 billion years ago.

  19. Prioritizing Urban Habitats for Connectivity Conservation: Integrating Centrality and Ecological Metrics.

    Science.gov (United States)

    Poodat, Fatemeh; Arrowsmith, Colin; Fraser, David; Gordon, Ascelin

    2015-09-01

    Connectivity among fragmented areas of habitat has long been acknowledged as important for the viability of biological conservation, especially within highly modified landscapes. Identifying important habitat patches in ecological connectivity is a priority for many conservation strategies, and the application of 'graph theory' has been shown to provide useful information on connectivity. Despite the large number of metrics for connectivity derived from graph theory, only a small number have been compared in terms of the importance they assign to nodes in a network. This paper presents a study that aims to define a new set of metrics and compares these with traditional graph-based metrics, used in the prioritization of habitat patches for ecological connectivity. The metrics measured consist of "topological metrics," "ecological metrics," and "integrated metrics." Integrated metrics are a combination of topological and ecological metrics. Eight metrics were applied to the habitat network for the fat-tailed dunnart within Greater Melbourne, Australia. A non-directional network was developed in which nodes were linked to adjacent nodes. These links were then weighted by the effective distance between patches. By applying each of the eight metrics for the study network, nodes were ranked according to their contribution to the overall network connectivity. The structured comparison revealed the similarity and differences in the way the habitat for the fat-tailed dunnart was ranked based on different classes of metrics. Due to the differences in the way the metrics operate, a suitable metric should be chosen that best meets the objectives established by the decision maker.
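    A minimal version of ranking patches in such a weighted, non-directional network, with link weights interpreted as effective distances, might look like the following. This is a generic topological-metric sketch for illustration only, not one of the eight metrics defined in the paper; the patch network is invented.

```python
def patch_strength(adjacency):
    """adjacency: {patch: {neighbor: effective_distance}} (undirected network).

    Strength sums 1/distance over a patch's incident links, so short
    effective distances (easy dispersal) contribute more to connectivity.
    """
    return {p: sum(1.0 / d for d in nbrs.values())
            for p, nbrs in adjacency.items()}

def rank_patches(adjacency):
    """Order patches by their contribution to connectivity, most connected first."""
    strength = patch_strength(adjacency)
    return sorted(strength, key=strength.get, reverse=True)

# Hypothetical four-patch network: A is a hub with three links of
# increasing effective distance.
habitat = {
    "A": {"B": 1.0, "C": 2.0, "D": 4.0},
    "B": {"A": 1.0},
    "C": {"A": 2.0},
    "D": {"A": 4.0},
}
print(rank_patches(habitat))
```

    More sophisticated graph metrics (betweenness, closeness, and the integrated metrics the paper proposes) would rank the same nodes differently, which is exactly the comparison the study structures.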

  20. Comparative Study of Trace Metrics between Bibliometrics and Patentometrics

    Directory of Open Access Journals (Sweden)

    Fred Y. Ye

    2016-06-01

    Full Text Available Purpose: To comprehensively evaluate the overall performance of a group or an individual in both bibliometrics and patentometrics. Design/methodology/approach: Trace metrics were applied to the top 30 universities in the 2014 Academic Ranking of World Universities (ARWU) in computer sciences, the top 30 ESI highly cited papers in the computer sciences field in 2014, as well as the top 30 assignees and the top 30 most cited patents in the National Bureau of Economic Research (NBER) computer hardware and software category. Findings: We found that, by applying trace metrics, the research or marketing impact efficiency was clearly observed at both group and individual levels. Furthermore, trace metrics were more sensitive to the different publication-citation distributions than the average citation and h-index were. Research limitations: Trace metrics considered publications with zero citations as negative contributions. One should clarify how he/she evaluates a zero-citation paper or patent before applying trace metrics. Practical implications: Decision makers could regularly examine the performance of their university/company by applying trace metrics and adjust their policies accordingly. Originality/value: Trace metrics could be applied both in bibliometrics and patentometrics and provide a comprehensive view. Moreover, the high sensitivity and unique impact efficiency view provided by trace metrics can facilitate decision makers in examining and adjusting their policies.
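    For comparison with the trace metrics discussed above, the h-index baseline against which their sensitivity is judged is easy to state in code. This is the standard h-index definition, not the trace-metric computation itself; the citation counts are invented.

```python
def h_index(citations):
    """Largest h such that h publications each have at least h citations."""
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# A zero-citation paper leaves the h-index unchanged, which is one way the
# h-index is insensitive to the citation distribution; trace metrics, which
# treat such papers as negative contributions, respond to it.
print(h_index([10, 8, 5, 4, 3, 0]))
```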

  1. Overall view of the AA hall dominated by the 50 ton crane (Donges).

    CERN Multimedia

    1980-01-01

    A 50 ton, 32 metre span overhead travelling crane was mounted in one of the bays of Hall 193 (AA). An identical crane was mounted on the other bay. See also photo 8004261. For photos of the AA in different phases of completion (between 1979 and 1982) see: 7911303, 7911597X, 8004261, 8004608X, 8005563X, 8005565X, 8006716X, 8006722X, 8010939X, 8010941X, 8202324, 8202658X, 8203628X.

  2. Chem systems

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    This paper reports that world styrene demand, paced by a near doubling of combined requirements in East Asia and Oceania, could reach 19.3 million metric tons by 2000, an average growth rate of 3.7%/year. So concludes Chem Systems Inc., Tarrytown, N.Y., in a study of world styrene markets through the end of the century. Pacific Rim styrene production and consumption throughout the 1990s are predicted to make up increasingly larger shares of world markets, while demand and production lag in the U.S. and western Europe. Demand and capacity in other parts of the world will grow in real terms, increasing combined market shares only slightly. Most of the increase will be driven by demand in East Asia and Oceania, where consumption by century's end is expected to increase to 4.48 million metric tons from 2.25 million tons in 1991. Meantime, Japan's styrene demand in 2000 is projected at 2.64 million tons, a 500,000 ton increase from 1991 demand but a net market loss of 1.9%.

  3. Acceptance test report for the Westinghouse 100 ton hydraulic trailer

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, R.A.

    1995-03-06

    The SY-101 Equipment Removal System 100 Ton Hydraulic Trailer was designed and built by KAMP Systems, Inc. Performance of the Acceptance Test Procedure at KAMP's facility in Ontario, California (termed Phase 1 in this report) was interrupted by discrepancies noted with the main hydraulic cylinder. The main cylinder was removed and sent to REMCO for repair while the trailer was sent to Lampson's facility in Pasco, Washington. The Acceptance Test Procedure was modified and performance resumed at Lampson (termed Phase 2 in this report) after receipt of the repaired cylinder. At the successful conclusion of Phase 2 testing the trailer was accepted as meeting all the performance criteria specified.

  4. U of M seeking $1.1 billion in projects for Soudan Mine lab.

    CERN Multimedia

    2003-01-01

    The University of Minnesota is hoping that groundbreaking research underway at its labs at the Soudan Underground Mine near Tower will help secure up to $1.1 billion in the next 5 to 20 years to expand its work into particle physics (1 page).

  5. Evaluating and Estimating the WCET Criticality Metric

    DEFF Research Database (Denmark)

    Jordan, Alexander

    2014-01-01

    …a programmer (or compiler) from targeting optimizations the right way. A possible resort is to use a metric that targets WCET and which can be efficiently computed for all code parts of a program. Similar to dynamic profiling techniques, which execute code with input that is typically expected for the application, based on WCET analysis we can indicate how critical a code fragment is in relation to the worst-case bound. Computing such a metric on top of static analysis incurs a certain overhead though, which increases with the complexity of the underlying WCET analysis. We present our approach to estimate the Criticality metric by relaxing the precision of WCET analysis. Through this, we can reduce analysis time by orders of magnitude, while only introducing minor error. To evaluate our estimation approach and share our garnered experience using the metric, we evaluate real-time programs…

  6. A convergence theory for probabilistic metric spaces | Jäger ...

    African Journals Online (AJOL)

    We develop a theory of probabilistic convergence spaces based on Tardiff's neighbourhood systems for probabilistic metric spaces. We show that the resulting category is a topological universe and we characterize a subcategory that is isomorphic to the category of probabilistic metric spaces. Keywords: Probabilistic metric ...

  7. Plutonium inventories for stabilization and stabilized materials

    Energy Technology Data Exchange (ETDEWEB)

    Williams, A.K.

    1996-05-01

    The objective of the breakout session was to identify characteristics of materials containing plutonium, the need to stabilize these materials for storage, and plans to accomplish the stabilization activities. All current stabilization activities are driven by the Defense Nuclear Facilities Safety Board Recommendation 94-1 (May 26, 1994) and by the recently completed Plutonium ES&H Vulnerability Assessment (DOE-EH-0415). The Implementation Plan for accomplishing stabilization of plutonium-bearing residues in response to the Recommendation and the Assessment was published by DOE on February 28, 1995. This Implementation Plan (IP) commits to stabilizing problem materials within 3 years, and stabilizing all other materials within 8 years. The IP identifies approximately 20 metric tons of plutonium requiring stabilization and/or repackaging. A further breakdown shows this material to consist of 8.5 metric tons of plutonium metal and alloys, 5.5 metric tons of plutonium as oxide, and 6 metric tons of plutonium as residues. Stabilization of the metal and oxide categories containing greater than 50 weight percent plutonium is covered by the DOE Standard "Criteria for Safe Storage of Plutonium Metals and Oxides," December 1994 (DOE-STD-3013-94). This standard establishes criteria for safe storage of stabilized plutonium metals and oxides for up to 50 years. Each of the DOE sites and contractors with large plutonium inventories has either started or is preparing to start stabilization activities to meet these criteria.

  8. Economic analysis of novel synergistic biofuel (H{sub 2}Bioil) processes

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Navneet R.; Mallapragada, Dharik S.; Agrawal, Rakesh [Purdue University, School of Chemical Engineering, West Lafayette, IN (United States); Tyner, Wallace E. [Purdue University, Department of Agricultural Economics, West Lafayette, IN (United States)

    2012-06-15

    Fast-pyrolysis based processes can be built on a small scale and have higher process carbon and energy efficiency compared to other options. H2Bioil is a novel process based on biomass fast-hydropyrolysis and subsequent hydrodeoxygenation (HDO) and can potentially provide high yields of high-energy-density liquid fuel at relatively low hydrogen consumption. This paper contains a comprehensive financial analysis of the H2Bioil process with hydrogen derived from different sources. Three different carbon tax scenarios are analyzed: no carbon tax, $55/metric ton carbon tax and $110/metric ton carbon tax. The break-even crude oil price for a delivered biomass cost of $94/metric ton when hydrogen is derived from coal, natural gas or nuclear energy ranges from $103 to $116/bbl for no carbon tax and even lower ($99-$111/bbl) for the carbon tax scenarios. This break-even crude oil price compares favorably with the literature estimated prices of fuels from alternate biochemical and thermochemical routes. The impact of the chosen carbon tax is found to be limited relative to the impact of the H2 source on the H2Bioil break-even price. The economic robustness of the processes for hydrogen derived from coal, natural gas, or nuclear energy is seen by an estimated break-even crude oil price of $114-$126/bbl when biomass cost is increased to $121/metric ton. (orig.)

  9. The impact of 'Cash for Clunkers' on greenhouse gas emissions: a life cycle perspective

    International Nuclear Information System (INIS)

    Lenski, Shoshannah M; Keoleian, Gregory A; Bolon, Kevin M

    2010-01-01

    One of the goals of the US Consumer Assistance to Recycle and Save (CARS) Act of 2009, more commonly known as 'Cash for Clunkers', was to improve the US vehicle fleet fuel efficiency. Previous studies of the program's environmental impact have focused mainly on the effect of improved fuel economy, and the resulting reductions in fuel use and emissions during the vehicle use phase. We propose and apply a method for analyzing the net effect of CARS on greenhouse gas emissions from a full vehicle life cycle perspective, including the impact of premature production and retirement of vehicles. We find that CARS had a one-time effect of preventing 4.4 million metric tons of CO2-equivalent emissions, about 0.4% of US annual light-duty vehicle emissions. Of these, 3.7 million metric tons are avoided during the period of the expected remaining life of the inefficient 'clunkers', and 1.5 million metric tons are avoided as consumers purchase vehicles that are more efficient than their next replacement vehicle would otherwise have been. An additional 0.8 million metric tons are emitted as a result of premature manufacturing and disposal of vehicles. These results are sensitive to the remaining lifetime of the 'clunkers' and to the fuel economy of new vehicles in the absence of CARS, suggesting important considerations for policymakers deliberating on the use of accelerated vehicle retirement programs as part of greenhouse gas emissions policy.
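The abstract's life-cycle accounting can be checked with simple arithmetic. A sketch using only the figures quoted above:

```python
# Back-of-the-envelope check of the study's life-cycle accounting
# (all figures in million metric tons CO2-equivalent, from the abstract).
avoided_during_clunker_life = 3.7   # fuel saved over the clunkers' remaining life
avoided_vs_future_purchase = 1.5    # new cars beat the replacement baseline
added_premature_lifecycle = 0.8     # early manufacturing and disposal

net_avoided = (avoided_during_clunker_life
               + avoided_vs_future_purchase
               - added_premature_lifecycle)
print(f"net avoided: {net_avoided:.1f} Mt CO2e")  # 4.4, matching the abstract
```

The three components sum to the headline 4.4 million metric ton figure, which is how the full life-cycle perspective nets the use-phase savings against the premature-production penalty.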

  10. Economic analysis of novel synergistic biofuel (H2Bioil) processes

    International Nuclear Information System (INIS)

    Singh, Navneet R.; Mallapragada, Dharik S.; Agrawal, Rakesh; Tyner, Wallace E.

    2012-01-01

    Fast-pyrolysis based processes can be built on a small scale and have higher process carbon and energy efficiency compared to other options. H2Bioil is a novel process based on biomass fast-hydropyrolysis and subsequent hydrodeoxygenation (HDO) and can potentially provide high yields of high-energy-density liquid fuel at relatively low hydrogen consumption. This paper contains a comprehensive financial analysis of the H2Bioil process with hydrogen derived from different sources. Three different carbon tax scenarios are analyzed: no carbon tax, $55/metric ton carbon tax and $110/metric ton carbon tax. The break-even crude oil price for a delivered biomass cost of $94/metric ton when hydrogen is derived from coal, natural gas or nuclear energy ranges from $103 to $116/bbl for no carbon tax and even lower ($99-$111/bbl) for the carbon tax scenarios. This break-even crude oil price compares favorably with the literature estimated prices of fuels from alternate biochemical and thermochemical routes. The impact of the chosen carbon tax is found to be limited relative to the impact of the H2 source on the H2Bioil break-even price. The economic robustness of the processes for hydrogen derived from coal, natural gas, or nuclear energy is seen by an estimated break-even crude oil price of $114-$126/bbl when biomass cost is increased to $121/metric ton. (orig.)

  11. Uranium in Canada: Billion-dollar industry

    International Nuclear Information System (INIS)

    Whillans, R.T.

    1989-01-01

    In 1988, Canada maintained its position as the world's leading producer and exporter of uranium; five primary uranium producers reported concentrate output containing 12,400 MT of uranium, or about one-third of Western production. Uranium shipments made by these producers in 1988 exceeded 13,200 MT, worth Canadian $1.1 billion. Because domestic requirements represent only 15% of current Canadian output, most of Canada's uranium production is available for export. Despite continued market uncertainty in 1988, Canada's uranium producers signed new sales contracts for some 14,000 MT, twice the 1987 level. About 90% of this new volume is with the US, now Canada's major uranium customer. The recent implementation of the Canada/US Free Trade agreement brings benefits to both countries; the uranium industries in each can now develop in an orderly, free market. Canada's uranium industry was restructured and consolidated in 1988 through merger and acquisition; three new uranium projects advanced significantly. Canada's new policy on nonresident ownership in the uranium mining sector, designed to encourage both Canadian and foreign investment, should greatly improve efforts to finance the development of recent Canadian uranium discoveries

  12. Indefinite metric fields and the renormalization group

    International Nuclear Information System (INIS)

    Sherry, T.N.

    1976-11-01

    The renormalization group equations are derived for the Green functions of an indefinite metric field theory. In these equations one retains the mass dependence of the coefficient functions, since in the indefinite metric theories the masses cannot be neglected. The behavior of the effective coupling constant in the asymptotic and infrared limits is analyzed. The analysis is illustrated by means of a simple model incorporating indefinite metric fields. The model scales at first order, and at this order the effective coupling constant has both ultraviolet and infrared fixed points, the former being the bare coupling constant

  13. Kerr-Newman metric in deSitter background

    International Nuclear Information System (INIS)

    Patel, L.K.; Koppar, S.S.; Bhatt, P.V.

    1987-01-01

    In addition to the Kerr-Newman metric with cosmological constant, several other metrics are presented giving Kerr-Newman type solutions of the Einstein-Maxwell field equations in the background of a deSitter universe. The electromagnetic field in all the solutions is assumed to be source-free. A new metric of what may be termed an electrovac rotating deSitter space-time (a space-time devoid of matter but containing a source-free electromagnetic field and a null fluid with twisting rays) has been presented. In the absence of the electromagnetic field, these solutions reduce to those discussed by Vaidya (1984). 8 refs. (author)

  14. The independence of software metrics taken at different life-cycle stages

    Science.gov (United States)

    Kafura, D.; Canning, J.; Reddy, G.

    1984-01-01

    Over the past few years a large number of software metrics have been proposed and, in varying degrees, a number of these metrics have been subjected to empirical validation which demonstrated the utility of the metrics in the software development process. This study attempts to classify these metrics and to determine whether the metrics in these different classes measure distinct attributes of the software product. Statistical analysis is used to determine the degree of relationship among the metrics.
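A minimal sketch of the kind of statistical analysis described, using invented metric values for a handful of components (the metric columns and numbers are illustrative, not from the study):

```python
# Sketch: measuring how strongly software metrics taken at different
# life-cycle stages relate to each other. All data are invented.
import numpy as np

# Rows: software components; columns: three hypothetical metrics
# (e.g., a design-stage metric, a code-stage metric, a test-stage metric).
metrics = np.array([
    [12, 340,  5],
    [30, 900, 14],
    [ 7, 150,  2],
    [22, 610,  9],
    [15, 420,  6],
], dtype=float)

# Pairwise Pearson correlations between metric columns; values near 1
# suggest the metrics measure overlapping attributes of the product,
# while values near 0 suggest they capture distinct attributes.
corr = np.corrcoef(metrics, rowvar=False)
print(np.round(corr, 2))
```

Low off-diagonal correlations across life-cycle stages would indicate the metric classes are measuring independent attributes, which is the question the study poses.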

  15. Thermodynamic metrics and optimal paths.

    Science.gov (United States)

    Sivak, David A; Crooks, Gavin E

    2012-05-11

    A fundamental problem in modern thermodynamics is how a molecular-scale machine performs useful work, while operating away from thermal equilibrium without excessive dissipation. To this end, we derive a friction tensor that induces a Riemannian manifold on the space of thermodynamic states. Within the linear-response regime, this metric structure controls the dissipation of finite-time transformations, and bestows optimal protocols with many useful properties. We discuss the connection to the existing thermodynamic length formalism, and demonstrate the utility of this metric by solving for optimal control parameter protocols in a simple nonequilibrium model.

  16. Invariant metrics for Hamiltonian systems

    International Nuclear Information System (INIS)

    Rangarajan, G.; Dragt, A.J.; Neri, F.

    1991-05-01

    In this paper, invariant metrics are constructed for Hamiltonian systems. These metrics give rise to norms on the space of homogeneous polynomials of phase-space variables. For an accelerator lattice described by a Hamiltonian, these norms characterize the nonlinear content of the lattice. Therefore, the performance of the lattice can be improved by minimizing the norm as a function of parameters describing the beam-line elements in the lattice. A four-fold increase in the dynamic aperture of a model FODO cell is obtained using this procedure. 7 refs

  17. Steiner trees for fixed orientation metrics

    DEFF Research Database (Denmark)

    Brazil, Marcus; Zachariasen, Martin

    2009-01-01

    We consider the problem of constructing Steiner minimum trees for a metric defined by a polygonal unit circle (corresponding to s = 2 weighted legal orientations in the plane). A linear-time algorithm to enumerate all angle configurations for degree-three Steiner points is given. We provide a simple proof that the angle configuration for a Steiner point extends to all Steiner points in a full Steiner minimum tree, such that at most six orientations suffice for edges in a full Steiner minimum tree. We show that the concept of canonical forms, originally introduced for the uniform orientation metric, generalises to the fixed orientation metric. Finally, we give an O(s n) time algorithm to compute a Steiner minimum tree for a given full Steiner topology with n terminal leaves.

  18. Metric Learning for Hyperspectral Image Segmentation

    Science.gov (United States)

    Bue, Brian D.; Thompson, David R.; Gilmore, Martha S.; Castano, Rebecca

    2011-01-01

    We present a metric learning approach to improve the performance of unsupervised hyperspectral image segmentation. Unsupervised spatial segmentation can assist both user visualization and automatic recognition of surface features. Analysts can use spatially-continuous segments to decrease noise levels and/or localize feature boundaries. However, existing segmentation methods use task-agnostic measures of similarity. Here we learn task-specific similarity measures from training data, improving segment fidelity to classes of interest. Multiclass Linear Discriminant Analysis produces a linear transform that optimally separates a labeled set of training classes. This defines a distance metric that generalizes to new scenes, enabling graph-based segmentation that emphasizes key spectral features. We describe tests based on data from the Compact Reconnaissance Imaging Spectrometer (CRISM) in which learned metrics improve segment homogeneity with respect to mineralogical classes.
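A minimal sketch of the general technique named above: multiclass LDA fitted on labeled data yields a linear transform, and distances measured in the discriminant space form the learned metric. The data here are synthetic Gaussian clusters standing in for spectra; this is not the authors' actual CRISM pipeline:

```python
# Sketch of LDA-based metric learning: fit a discriminant transform on
# labeled "spectra", then compare distances in the learned space.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Toy data: 3 classes, 10 spectral bands, Gaussian clusters (invented).
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(20, 10)) for c in range(3)])
y = np.repeat([0, 1, 2], 20)

# The LDA transform optimally separates the labeled training classes.
lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)
Z = lda.transform(X)

# Euclidean distance in the discriminant space is the learned metric:
# it stretches class-discriminating directions and shrinks nuisance ones.
d_between = np.linalg.norm(Z[0] - Z[25])  # samples from different classes
d_within = np.linalg.norm(Z[0] - Z[1])    # samples from the same class
print(d_between > d_within)
```

A graph-based segmenter can then use these learned distances as edge weights, which is how the approach plugs into the segmentation step described in the abstract.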

  19. Validation of Metrics as Error Predictors

    Science.gov (United States)

    Mendling, Jan

    In this chapter, we test the validity of metrics that were defined in the previous chapter for predicting errors in EPC business process models. In Section 5.1, we provide an overview of how the analysis data is generated. Section 5.2 describes the sample of EPCs from practice that we use for the analysis. Here we discuss a disaggregation by the EPC model group and by error as well as a correlation analysis between metrics and error. Based on this sample, we calculate a logistic regression model for predicting error probability with the metrics as input variables in Section 5.3. In Section 5.4, we then test the regression function for an independent sample of EPC models from textbooks as a cross-validation. Section 5.5 summarizes the findings.
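The regression step described above can be sketched generically. A minimal illustration with invented metric values and error labels, not the chapter's actual EPC sample:

```python
# Sketch: logistic regression mapping process-model metrics to an
# error probability. Metric values and labels are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: metrics for one process model (e.g., size, connector count);
# label 1 means the model was found to contain at least one error.
X = np.array([[10, 2], [45, 9], [12, 3], [60, 15], [8, 1], [52, 11],
              [20, 4], [70, 18], [15, 2], [65, 14]], dtype=float)
y = np.array([0, 1, 0, 1, 0, 1, 0, 1, 0, 1])

clf = LogisticRegression().fit(X, y)

# Predicted error probability for a new, mid-sized model.
p_error = clf.predict_proba([[40, 8]])[0, 1]
print(round(p_error, 2))
```

Cross-validation on an independent sample, as done in Section 5.4 of the chapter, would then check whether the fitted function generalizes beyond the models used for training.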

  20. Potential for thermal coal and Clean Coal Technology (CCT) in the Asia-Pacific. Final technical report

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, C.J.; Long, S.

    1991-11-22

    The Coal Project was able to make considerable progress in understanding the evolving energy situation in Asia and the future role of coal and Clean Coal Technologies. It is clear that there will be major growth in consumption of coal in Asia over the next two decades -- we estimate an increase of 1.2 billion metric tons. Second, all governments are concerned about the environmental impacts of increased coal use; however, enforcement of regulations appears to be quite variable among Asian countries. There is general caution on the part of Asian utilities with respect to the introduction of CCTs. However, there appears to be potential for introduction of CCTs in a few countries by the turn of the century. It is important to emphasize that it will be a long-term effort to succeed in getting CCTs introduced to Asia. The Coal Project recommends that the US CCT program be expanded to allow the early introduction of CCTs in a number of countries.

  1. The global dispersion of microorganisms and pollutants in clouds of desert dust

    Science.gov (United States)

    Griffin, D. W.; Kellogg, C. A.; Garrison, V. H.; Kubilay, N.; Kocak, M.; Shinn, E.

    2003-12-01

    A current estimate of the quantity of dust that is transported some distance in Earth's atmosphere each year is approximately two billion metric tons. Whereas various research projects have been undertaken to understand this planetary process, little has been done to address public and ecosystem health issues. Our research group is currently investigating long-range transport of microorganisms associated with desert dust clouds at various points on the globe via the integration of remote sensing, modeling and microbiological assays. Using a suite of molecular biology techniques, we are identifying cultivable bacteria and fungi and enumerating total bacteria and viruses. Research results indicate that approximately 30% of the microorganisms found in Earth's atmosphere during 'African dust events' are species of bacteria or fungi that have previously been identified as disease-causing agents in terrestrial plants, trees, and animals. This presentation will cover historical research in this field and the implications of microbial and pollutant (metals, pesticides, etc.) transport to downwind ecosystems.

  2. Dark matter sensitivity of multi-ton liquid xenon detectors

    International Nuclear Information System (INIS)

    Schumann, Marc; Bütikofer, Lukas; Baudis, Laura; Kish, Alexander; Selvi, Marco

    2015-01-01

    We study the sensitivity of multi-ton-scale time projection chambers using a liquid xenon target, e.g., the proposed DARWIN instrument, to spin-independent and spin-dependent WIMP-nucleon scattering interactions. Taking into account realistic backgrounds from the detector itself as well as from neutrinos, we examine the impact of exposure, energy threshold, background rejection efficiency and energy resolution on the dark matter sensitivity. With an exposure of 200 t × y and assuming detector parameters which have already been demonstrated experimentally, spin-independent cross sections as low as 2.5 × 10⁻⁴⁹ cm² can be probed for WIMP masses around 40 GeV/c². Additional improvements in terms of background rejection and exposure will further increase the sensitivity, while the ultimate WIMP science reach will be limited by neutrinos scattering coherently off the xenon nuclei

  3. Predicting class testability using object-oriented metrics

    NARCIS (Netherlands)

    M. Bruntink (Magiel); A. van Deursen (Arie)

    2004-01-01

    textabstractIn this paper we investigate factors of the testability of object-oriented software systems. The starting point is given by a study of the literature to obtain both an initial model of testability and existing OO metrics related to testability. Subsequently, these metrics are evaluated

  4. Software Power Metric Model: An Implementation | Akwukwuma ...

    African Journals Online (AJOL)

    ... and the execution time (TIME) in each case was recorded. We then obtain the application's function point count. Our result shows that the proposed metric is computable, consistent in its use of units, and is programming language independent. Keywords: Software attributes, Software power, measurement, Software metric, ...

  5. Meter Detection in Symbolic Music Using Inner Metric Analysis

    NARCIS (Netherlands)

    de Haas, W.B.; Volk, A.

    2016-01-01

    In this paper we present PRIMA: a new model tailored to symbolic music that detects the meter and the first downbeat position of a piece. Given onset data, the metrical structure of a piece is interpreted using the Inner Metric Analysis (IMA) model. IMA identifies the strong and weak metrical

  6. Pace studying worldwide coke production

    International Nuclear Information System (INIS)

    Anon.

    1993-01-01

    Pace Consultants Inc., Houston, has started a multiclient study of world-wide petroleum coke production, examining environmental initiatives and eventually forecasting prices of fuel grade coke. Pace expects coker expansions, increased operating severity, and reduced cycle times to boost coke supply to more than 50 million metric tons/year in 2000, compared with 39.7 million metric tons in 1992. Increased supply and tightened environmental rules in countries consuming large amounts of petroleum coke will be the main factors affecting coke markets. The paper discusses coke quality and the Japanese market

  7. Performance metrics for the evaluation of hyperspectral chemical identification systems

    Science.gov (United States)

    Truslow, Eric; Golowich, Steven; Manolakis, Dimitris; Ingle, Vinay

    2016-02-01

    Remote sensing of chemical vapor plumes is a difficult but important task for many military and civilian applications. Hyperspectral sensors operating in the long-wave infrared regime have well-demonstrated detection capabilities. However, the identification of a plume's chemical constituents, based on a chemical library, is a multiple hypothesis testing problem which standard detection metrics do not fully describe. We propose using an additional performance metric for identification based on the so-called Dice index. Our approach partitions and weights a confusion matrix to develop both the standard detection metrics and identification metric. Using the proposed metrics, we demonstrate that the intuitive system design of a detector bank followed by an identifier is indeed justified when incorporating performance information beyond the standard detection metrics.
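As a hedged illustration of a Dice-style identification score (the authors' exact confusion-matrix partitioning and weighting are not given in the abstract), the set-based Dice coefficient between a plume's true constituents and the identifier's reported constituents can be computed as:

```python
# Sketch of a Dice-style identification score: how well the reported
# chemical list matches the true constituent list of a plume.
def dice_index(true_set, reported_set):
    """Dice coefficient between the true constituents and the
    constituents reported by the identifier (both as sets)."""
    true_set, reported_set = set(true_set), set(reported_set)
    if not true_set and not reported_set:
        return 1.0  # vacuously perfect agreement
    overlap = len(true_set & reported_set)
    return 2.0 * overlap / (len(true_set) + len(reported_set))

# Plume truly contains two chemicals; the identifier reports one of
# them plus a false alarm: Dice = 2*1 / (2 + 2) = 0.5.
print(dice_index({"SF6", "NH3"}, {"SF6", "CH4"}))
```

Unlike a per-chemical detection probability, this score jointly penalizes missed constituents and false alarms, which is the multiple-hypothesis aspect the standard detection metrics fail to capture.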

  8. Curvature properties of four-dimensional Walker metrics

    International Nuclear Information System (INIS)

    Chaichi, M; Garcia-Rio, E; Matsushita, Y

    2005-01-01

    A Walker n-manifold is a semi-Riemannian manifold, which admits a field of parallel null r-planes, r ≤ n/2. In the present paper we study curvature properties of a Walker 4-manifold (M, g) which admits a field of parallel null 2-planes. The metric g is necessarily of neutral signature (+ + - -). Such a Walker 4-manifold is the lowest dimensional example not of Lorentz type. There are three functions of coordinates which define a Walker metric. Some recent work shows that a Walker 4-manifold of restricted type whose metric is characterized by two functions exhibits a large variety of symplectic structures, Hermitian structures, Kaehler structures, etc. For such a restricted Walker 4-manifold, we shall study mainly curvature properties, e.g., conditions for a Walker metric to be Einstein, Osserman, or locally conformally flat, etc. One of our main results is the exact solutions to the Einstein equations for a restricted Walker 4-manifold

  9. Decision Analysis for Metric Selection on a Clinical Quality Scorecard.

    Science.gov (United States)

    Guth, Rebecca M; Storey, Patricia E; Vitale, Michael; Markan-Aurora, Sumita; Gordon, Randolph; Prevost, Traci Q; Dunagan, Wm Claiborne; Woeltje, Keith F

    2016-09-01

    Clinical quality scorecards are used by health care institutions to monitor clinical performance and drive quality improvement. Because of the rapid proliferation of quality metrics in health care, BJC HealthCare found it increasingly difficult to select the most impactful scorecard metrics while still monitoring metrics for regulatory purposes. A 7-step measure selection process was implemented incorporating Kepner-Tregoe Decision Analysis, which is a systematic process that considers key criteria that must be satisfied in order to make the best decision. The decision analysis process evaluates what metrics will most appropriately fulfill these criteria, as well as identifies potential risks associated with a particular metric in order to identify threats to its implementation. Using this process, a list of 750 potential metrics was narrowed to 25 that were selected for scorecard inclusion. This decision analysis process created a more transparent, reproducible approach for selecting quality metrics for clinical quality scorecards. © The Author(s) 2015.

  10. Common fixed point theorems in intuitionistic fuzzy metric spaces and L-fuzzy metric spaces with nonlinear contractive condition

    International Nuclear Information System (INIS)

    Jesic, Sinisa N.; Babacev, Natasa A.

    2008-01-01

    The purpose of this paper is to prove some common fixed point theorems for a pair of R-weakly commuting mappings defined on intuitionistic fuzzy metric spaces [Park JH. Intuitionistic fuzzy metric spaces. Chaos, Solitons and Fractals 2004;22:1039-46] and L-fuzzy metric spaces [Saadati R, Razani A, Adibi H. A common fixed point theorem in L-fuzzy metric spaces. Chaos, Solitons and Fractals, doi:10.1016/j.chaos.2006.01.023], with nonlinear contractive condition, defined with function, first observed by Boyd and Wong [Boyd DW, Wong JSW. On nonlinear contractions. Proc Am Math Soc 1969;20:458-64]. Following Pant [Pant RP. Common fixed points of noncommuting mappings. J Math Anal Appl 1994;188:436-40] we define R-weak commutativity for a pair of mappings and then prove the main results. These results generalize some known results due to Saadati et al., and Jungck [Jungck G. Commuting maps and fixed points. Am Math Mon 1976;83:261-3]. Some examples and comments according to the preceding results are given

  11. 43 CFR 12.915 - Metric system of measurement.

    Science.gov (United States)

    2010-10-01

    ... procurements, grants, and other business-related activities. Metric implementation may take longer where the... recipient, such as when foreign competitors are producing competing products in non-metric units. (End of...

  12. Cophenetic metrics for phylogenetic trees, after Sokal and Rohlf.

    Science.gov (United States)

    Cardona, Gabriel; Mir, Arnau; Rosselló, Francesc; Rotger, Lucía; Sánchez, David

    2013-01-16

    Phylogenetic tree comparison metrics are an important tool in the study of evolution, and hence the definition of such metrics is an interesting problem in phylogenetics. In a paper in Taxon fifty years ago, Sokal and Rohlf proposed to measure quantitatively the difference between a pair of phylogenetic trees by first encoding them by means of their half-matrices of cophenetic values, and then comparing these matrices. This idea has been used several times since then to define dissimilarity measures between phylogenetic trees but, to our knowledge, no proper metric on weighted phylogenetic trees with nested taxa based on this idea has been formally defined and studied yet. Actually, the cophenetic values of pairs of different taxa alone are not enough to single out phylogenetic trees with weighted arcs or nested taxa. For every (rooted) phylogenetic tree T, let its cophenetic vector φ(T) consist of all pairs of cophenetic values between pairs of taxa in T and all depths of taxa in T. It turns out that these cophenetic vectors single out weighted phylogenetic trees with nested taxa. We then define a family of cophenetic metrics dφ,p by comparing these cophenetic vectors by means of Lp norms, and we study, either analytically or numerically, some of their basic properties: neighbors, diameter, distribution, and their rank correlation with each other and with other metrics. The cophenetic metrics can be safely used on weighted phylogenetic trees with nested taxa and no restriction on degrees, and they can be computed in O(n²) time, where n stands for the number of taxa. The metrics dφ,1 and dφ,2 have positively skewed distributions, and they show a low rank correlation with the Robinson-Foulds metric and the nodal metrics, and a very high correlation with each other and with the splitted nodal metrics. The diameter of dφ,p, for p ≥ 1, is in O(n^((p+2)/p)), and thus for low p they are more discriminative, having a wider range of values.
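A minimal sketch of the cophenetic-vector construction on a tiny invented rooted tree with unit edge weights (the paper's general setting allows arbitrary arc weights and nested taxa, which this sketch omits):

```python
# Sketch: cophenetic vector of a small rooted tree, where the cophenetic
# value of a taxon pair is the depth of their lowest common ancestor.
from itertools import combinations

# Tree as child -> parent; the root "r" has parent None.
# Shape: taxa a and b are siblings under internal node x; c hangs off r.
parent = {"a": "x", "b": "x", "x": "r", "c": "r", "r": None}
taxa = ["a", "b", "c"]

def depth(node):
    d = 0
    while parent[node] is not None:
        node = parent[node]
        d += 1
    return d

def ancestors(node):
    out = [node]
    while parent[node] is not None:
        node = parent[node]
        out.append(node)
    return out

def cophenetic(u, v):
    """Depth of the lowest common ancestor of taxa u and v."""
    anc_u = set(ancestors(u))
    for w in ancestors(v):
        if w in anc_u:
            return depth(w)
    raise ValueError("no common ancestor")

# Cophenetic vector phi(T): LCA depths for all taxon pairs plus the
# depths of the taxa themselves, as in the paper's definition.
phi = [cophenetic(u, v) for u, v in combinations(taxa, 2)] + \
      [depth(t) for t in taxa]
print(phi)

# The L1 cophenetic distance between two trees on the same taxa is the
# L1 norm of the difference of their cophenetic vectors.
def d_phi_1(phi1, phi2):
    return sum(abs(x - y) for x, y in zip(phi1, phi2))
```

Comparing two trees then reduces to computing each tree's vector once and taking an Lp norm of the difference, which is where the O(n²) bound on the number of taxon pairs comes from.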

  13. Finite Metric Spaces of Strictly negative Type

    DEFF Research Database (Denmark)

    Hjorth, Poul G.

    If a finite metric space is of strictly negative type, then its transfinite diameter is uniquely realized by an infinite extent (“load vector”). Finite metric spaces that have this property include all trees, and all finite subspaces of Euclidean and hyperbolic spaces. We prove that if the distance...

  14. Gravitational Metric Tensor Exterior to Rotating Homogeneous ...

    African Journals Online (AJOL)

    The covariant and contravariant metric tensors exterior to a homogeneous spherical body rotating uniformly about a common φ-axis with constant angular velocity ω are constructed. The constructed metric tensors in this gravitational field have seven non-zero distinct components. The Lagrangian for this gravitational field is ...

  15. Exact solutions of strong gravity in generalized metrics

    International Nuclear Information System (INIS)

    Hojman, R.; Smailagic, A.

    1981-05-01

    We consider classical solutions for the strong gravity theory of Salam and Strathdee in a wider class of metrics with positive, zero and negative curvature. It turns out that such solutions exist and their relevance for quark confinement is explored. Only metrics with positive curvature (spherical symmetry) give a confining potential in a simple picture of the scalar hadron. This supports the idea of describing the hadron as a closed microuniverse of the strong metric. (author)

  16. Development of quality metrics for ambulatory pediatric cardiology: Infection prevention.

    Science.gov (United States)

    Johnson, Jonathan N; Barrett, Cindy S; Franklin, Wayne H; Graham, Eric M; Halnon, Nancy J; Hattendorf, Brandy A; Krawczeski, Catherine D; McGovern, James J; O'Connor, Matthew J; Schultz, Amy H; Vinocur, Jeffrey M; Chowdhury, Devyani; Anderson, Jeffrey B

    2017-12-01

    In 2012, the American College of Cardiology's (ACC) Adult Congenital and Pediatric Cardiology Council established a program to develop quality metrics to guide ambulatory practices for pediatric cardiology. The council chose five areas on which to focus its efforts: chest pain, Kawasaki disease, tetralogy of Fallot, transposition of the great arteries after arterial switch, and infection prevention. Here, we describe the process, evaluation, and results of the Infection Prevention Committee's metric design process. The infection prevention metrics team consisted of 12 members from 11 institutions in North America. The group agreed to work on specific infection prevention topics including antibiotic prophylaxis for endocarditis, rheumatic fever, and asplenia/hyposplenism; influenza vaccination and respiratory syncytial virus prophylaxis (palivizumab); preoperative methods to reduce intraoperative infections; vaccinations after cardiopulmonary bypass; hand hygiene; and testing to identify splenic function in patients with heterotaxy. An extensive literature review was performed. When available, previously published guidelines were used fully in determining metrics. The committee chose eight metrics to submit to the ACC Quality Metric Expert Panel for review. Ultimately, metrics regarding hand hygiene and influenza vaccination recommendation for patients did not pass the RAND analysis. Both endocarditis prophylaxis metrics and the RSV/palivizumab metric passed the RAND analysis but fell out during the open comment period. Three metrics passed all analyses, including those for antibiotic prophylaxis in patients with heterotaxy/asplenia, for influenza vaccination compliance in healthcare personnel, and for adherence to recommended regimens of secondary prevention of rheumatic fever. The lack of convincing data to guide quality improvement initiatives in pediatric cardiology is widespread, particularly in infection prevention. Despite this, three metrics were

  17. The Fernald Closure Project: Lessons Learned

    International Nuclear Information System (INIS)

    Murphy, Cornelius M.; Carr, Dennis

    2008-01-01

    For nearly 37 years, the U.S. Department of Energy site at Fernald - near Cincinnati, Ohio - produced 230,000 metric tons (250,000 short tons) of high-purity, low-enriched uranium for the U.S. Defense Program, generating more than 5.4 million metric tons (6 million short tons) of liquid and solid waste as it carried out its Cold War mission. The facility was shut down in 1989, and cleanup began in 1992, when Fluor won the contract to clean up the site. Cleaning up Fernald and returning it to the people of Ohio was a $4.4 billion mega environmental-remediation project that was completed in October 2006. The project evolved through four phases: - Conducting remedial-investigation studies to determine the extent of damage to the environment and groundwater at, and adjacent to, the production facilities; - Selecting cleanup criteria - final end states that had to be met to protect human health and the environment; - Selecting and implementing the remedial actions to meet the cleanup goals; - Executing the work in a safe, compliant and cost-effective manner. In the early stages of the project, there were strained relationships - in fact, total distrust - between the local community and the DOE as a result of aquifer contamination and potential health effects to the workers and local residents. To engage citizens and interested stakeholder groups in the decision-making process, the DOE and Fluor developed a public-participation strategy to open the channels of communication with the various parties: site leadership, technical staff and regulators. This approach proved invaluable to the success of the project, which has become a model for future environmental remediation projects. This paper will summarize the history and share lessons learned: from the completion of the uranium-production mission to the implementation of the Records of Decision defining the cleanup standards and the remedies achieved. Lessons learned fall into ten categories: - Regulatory approach with end

  18. Draft environmental statement. Homestake Mining Company: Homestake Mining Company Pitch Project (Saguache County, Colorado)

    International Nuclear Information System (INIS)

    1978-01-01

    The draft concerns the proposed issuance of approvals, permits, and licenses to the Homestake Mining Company for the implementation of the Pitch Project. The Pitch Project consists of mining and milling operations involving uranium ore deposits located in Gunnison National Forest, Saguache County, Colorado. Mining of uranium ore will take place over an estimated period of 20 years; a mill with a nominal capacity of 544 metric tons per day (600 tons per day) will be constructed and operated as long as ore is available. The waste material (tailings) from the mill, also produced at a rate of about 544 metric tons per day (600 tons per day), will be buried onsite at the head end of a natural valley. The environmental impacts are summarized in sections on the existing environment, applicant's proposed mining and milling operation, environmental effects of accidents, monitoring programs, productivity, commitment of resources, alternatives, and cost-benefit evaluation

  19. A new form of the rotating C-metric

    International Nuclear Information System (INIS)

    Hong, Kenneth; Teo, Edward

    2005-01-01

    In a previous paper, we showed that the traditional form of the charged C-metric can be transformed, by a change of coordinates, into one with an explicitly factorizable structure function. This new form of the C-metric has the advantage that its properties become much simpler to analyse. In this paper, we propose an analogous new form for the rotating charged C-metric, with structure function G(ξ) = (1 - ξ²)(1 + r₊Aξ)(1 + r₋Aξ), where r± are the usual locations of the horizons in the Kerr-Newman black hole. Unlike the non-rotating case, this new form is not related to the traditional one by a coordinate transformation. We show that the physical distinction between these two forms of the rotating C-metric lies in the nature of the conical singularities causing the black holes to accelerate apart: the new form is free of torsion singularities and therefore does not contain any closed timelike curves. We claim that this new form should be considered the natural generalization of the C-metric with rotation.

  20. [Clinical trial data management and quality metrics system].

    Science.gov (United States)

    Chen, Zhao-hua; Huang, Qin; Deng, Ya-zhong; Zhang, Yue; Xu, Yu; Yu, Hao; Liu, Zong-fan

    2015-11-01

    A data quality management system is essential to ensure accurate, complete, consistent, and reliable data collection in clinical research. This paper is devoted to various choices of data quality metrics. They are categorized by study status, e.g. study start-up, conduct, and close-out. In each category, metrics for different purposes are listed according to ALCOA+ principles such as completeness, accuracy, timeliness, traceability, etc. Some frequently used general quality metrics are also introduced. This paper provides as much detail as possible for each metric, including definition, purpose, evaluation, referenced benchmark, and recommended targets, in favor of real practice. It is important that sponsors and data management service providers establish a robust integrated clinical trial data quality management system to ensure sustainably high quality of clinical trial deliverables. It will also support enterprise-level data evaluation and benchmarking of data quality across projects, sponsors, and data management service providers by using objective metrics from real clinical trials. We hope this will be a significant input to accelerate the improvement of clinical trial data quality in the industry.

  1. Sigma Routing Metric for RPL Protocol

    Directory of Open Access Journals (Sweden)

    Paul Sanmartin

    2018-04-01

    Full Text Available This paper presents the adaptation of a specific metric for the RPL protocol in the objective function MRHOF. Among the functions standardized by the IETF, we find OF0, which is based on the minimum hop count, as well as MRHOF, which is based on the Expected Transmission Count (ETX). However, when the network becomes denser or the number of nodes increases, both OF0 and MRHOF introduce long hops, which can generate a bottleneck that restricts the network. The adaptation is proposed to optimize both OFs through a new routing metric. To solve the above problem, the metrics of the minimum number of hops and the ETX are combined by designing a new routing metric called SIGMA-ETX, in which the best route is calculated using the standard deviation of ETX values between each node, as opposed to working with the ETX average along the route. This method ensures better routing performance in dense sensor networks. The simulations are done with the Cooja simulator, based on the Contiki operating system. The simulations showed that the proposed optimization outperforms both OF0 and MRHOF by a wide margin in terms of network latency, packet delivery ratio, lifetime, and power consumption.
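A minimal sketch of the core idea - scoring candidate routes by the standard deviation of their per-hop ETX values rather than by their mean - might look as follows. The route data and function names are illustrative, not the paper's implementation:

```python
import statistics

def sigma_etx(route_etx):
    """SIGMA-ETX cost of a route: the standard deviation of its per-hop
    ETX values, so routes mixing very short and very long hops are
    penalized even when their mean ETX is low."""
    return statistics.pstdev(route_etx)

def best_route(candidates):
    """Pick the candidate route (a list of per-hop ETX values) with the
    lowest SIGMA-ETX cost."""
    return min(candidates, key=sigma_etx)

# Two candidate routes with the same mean ETX (2.0):
uniform = [2.0, 2.0, 2.0, 2.0]       # steady, reliable hops
spiky   = [1.0, 1.0, 1.0, 5.0]       # one long, lossy hop
print(best_route([spiky, uniform]))  # → [2.0, 2.0, 2.0, 2.0]
```

Under a mean-ETX objective (as in plain MRHOF) both routes would tie; the standard-deviation objective prefers the route with uniform hops, which is the behavior the abstract describes.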

  2. Socio-Technical Security Metrics (Dagstuhl Seminar 14491)

    NARCIS (Netherlands)

    Gollmann, Dieter; Herley, Cormac; Koenig, Vincent; Pieters, Wolter; Sasse, Martina Angela

    2015-01-01

    This report documents the program and the outcomes of Dagstuhl Seminar 14491 "Socio-Technical Security Metrics". In the domain of safety, metrics inform many decisions, from the height of new dikes to the design of nuclear plants. We can state, for example, that the dikes should be high enough to

  3. Landscape metrics for three-dimension urban pattern recognition

    Science.gov (United States)

    Liu, M.; Hu, Y.; Zhang, W.; Li, C.

    2017-12-01

    Understanding how landscape pattern determines population or ecosystem dynamics is crucial for managing our landscapes. Urban areas are becoming increasingly dominant social-ecological systems, so it is important to understand patterns of urbanization. Most studies of urban landscape pattern examine land-use maps in two dimensions because the acquisition of 3-dimensional information is difficult. We used Brista software based on Quickbird images and aerial photos to interpret the height of buildings, thus incorporating a 3-dimensional approach. We estimated the feasibility and accuracy of this approach. A total of 164,345 buildings in the Liaoning central urban agglomeration of China, which included seven cities, were measured. Twelve landscape metrics were proposed or chosen to describe the urban landscape patterns in 2- and 3-dimensional scales. The ecological and social meaning of landscape metrics were analyzed with multiple correlation analysis. The results showed that classification accuracy compared with field surveys was 87.6%, which means this method for interpreting building height was acceptable. The metrics effectively reflected the urban architecture in relation to number of buildings, area, height, 3-D shape and diversity aspects. We were able to describe the urban characteristics of each city with these metrics. The metrics also captured ecological and social meanings. The proposed landscape metrics provided a new method for urban landscape analysis in three dimensions.

  4. SOCIAL METRICS APPLIED TO SMART TOURISM

    Directory of Open Access Journals (Sweden)

    O. Cervantes

    2016-09-01

    Full Text Available We present a strategy to make productive use of semantically-related social data, from a user-centered semantic network, in order to help users (tourists and citizens in general) to discover cultural heritage, points of interest and available services in a smart city. This data can be used to personalize recommendations in a smart tourism application. Our approach is based on flow centrality metrics typically used in social network analysis: flow betweenness, flow closeness and eccentricity. These metrics are useful to discover relevant nodes within the network, yielding nodes that can be interpreted as suggestions (venues or services) to users. We describe the semantic network built on a graph model, as well as the social metrics algorithms used to produce recommendations. We also present challenges and results from a prototypical implementation applied to the case study of the City of Puebla, Mexico.

  5. Social Metrics Applied to Smart Tourism

    Science.gov (United States)

    Cervantes, O.; Gutiérrez, E.; Gutiérrez, F.; Sánchez, J. A.

    2016-09-01

    We present a strategy to make productive use of semantically-related social data, from a user-centered semantic network, in order to help users (tourists and citizens in general) to discover cultural heritage, points of interest and available services in a smart city. This data can be used to personalize recommendations in a smart tourism application. Our approach is based on flow centrality metrics typically used in social network analysis: flow betweenness, flow closeness and eccentricity. These metrics are useful to discover relevant nodes within the network yielding nodes that can be interpreted as suggestions (venues or services) to users. We describe the semantic network built on graph model, as well as social metrics algorithms used to produce recommendations. We also present challenges and results from a prototypical implementation applied to the case study of the City of Puebla, Mexico.
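Of the three centrality notions mentioned, eccentricity is the simplest to sketch: a node's eccentricity is its greatest shortest-path distance to any other node, and low-eccentricity nodes are central. The toy venue network below is hypothetical, and the flow-based metrics would require max-flow machinery not shown here:

```python
from collections import deque

def eccentricity(adj, start):
    """Eccentricity of a node: the greatest shortest-path distance from
    it to any other reachable node, computed by breadth-first search."""
    dist = {start: 0}
    q = deque([start])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return max(dist.values())

# Hypothetical venue network: museum - plaza - cafe - hotel
adj = {
    "museum": ["plaza"],
    "plaza":  ["museum", "cafe"],
    "cafe":   ["plaza", "hotel"],
    "hotel":  ["cafe"],
}
# Central nodes (low eccentricity) are natural recommendation candidates.
print(sorted(adj, key=lambda n: eccentricity(adj, n))[:2])  # → ['plaza', 'cafe']
```

In a recommender along these lines, the ranking step would feed such central venues into the personalization layer described in the abstract.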

  6. Reproducibility of graph metrics of human brain functional networks.

    Science.gov (United States)

    Deuker, Lorena; Bullmore, Edward T; Smith, Marie; Christensen, Soren; Nathan, Pradeep J; Rockstroh, Brigitte; Bassett, Danielle S

    2009-10-01

    Graph theory provides many metrics of complex network organization that can be applied to analysis of brain networks derived from neuroimaging data. Here we investigated the test-retest reliability of graph metrics of functional networks derived from magnetoencephalography (MEG) data recorded in two sessions from 16 healthy volunteers who were studied at rest and during performance of the n-back working memory task in each session. For each subject's data at each session, we used a wavelet filter to estimate the mutual information (MI) between each pair of MEG sensors in each of the classical frequency intervals from gamma to low delta in the overall range 1-60 Hz. Undirected binary graphs were generated by thresholding the MI matrix and 8 global network metrics were estimated: the clustering coefficient, path length, small-worldness, efficiency, cost-efficiency, assortativity, hierarchy, and synchronizability. Reliability of each graph metric was assessed using the intraclass correlation (ICC). Good reliability was demonstrated for most metrics applied to the n-back data (mean ICC=0.62). Reliability was greater for metrics in lower frequency networks. Higher frequency gamma- and beta-band networks were less reliable at a global level but demonstrated high reliability of nodal metrics in frontal and parietal regions. Performance of the n-back task was associated with greater reliability than measurements on resting state data. Task practice was also associated with greater reliability. Collectively these results suggest that graph metrics are sufficiently reliable to be considered for future longitudinal studies of functional brain network changes.
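The pipeline of thresholding a mutual-information matrix into a binary graph and then computing a global network metric can be sketched as follows. This is a toy 4-sensor example with invented MI values, and the global clustering coefficient stands in for the eight metrics studied:

```python
def threshold_graph(mi, tau):
    """Binarize a symmetric mutual-information matrix: an undirected
    edge exists wherever MI exceeds the threshold tau."""
    n = len(mi)
    return [[1 if i != j and mi[i][j] > tau else 0 for j in range(n)]
            for i in range(n)]

def clustering_coefficient(a):
    """Global clustering coefficient of a binary adjacency matrix:
    3 * (number of triangles) / (number of connected triples)."""
    n = len(a)
    triangles = sum(a[i][j] and a[j][k] and a[i][k]
                    for i in range(n) for j in range(i + 1, n)
                    for k in range(j + 1, n))
    deg = [sum(row) for row in a]
    triples = sum(d * (d - 1) // 2 for d in deg)
    return 3 * triangles / triples if triples else 0.0

# Hypothetical MI between 4 sensors (symmetric, zero diagonal).
mi = [[0.0, 0.9, 0.8, 0.1],
      [0.9, 0.0, 0.7, 0.2],
      [0.8, 0.7, 0.0, 0.1],
      [0.1, 0.2, 0.1, 0.0]]
a = threshold_graph(mi, 0.5)
print(clustering_coefficient(a))  # → 1.0 (the first three sensors form a triangle)
```

Reliability assessment would then repeat this computation on session-1 and session-2 data and correlate the resulting metric values across subjects (e.g., via the ICC).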

  7. Evaluation metrics for biostatistical and epidemiological collaborations.

    Science.gov (United States)

    Rubio, Doris McGartland; Del Junco, Deborah J; Bhore, Rafia; Lindsell, Christopher J; Oster, Robert A; Wittkowski, Knut M; Welty, Leah J; Li, Yi-Ju; Demets, Dave

    2011-10-15

    Increasing demands for evidence-based medicine and for the translation of biomedical research into individual and public health benefit have been accompanied by the proliferation of special units that offer expertise in biostatistics, epidemiology, and research design (BERD) within academic health centers. Objective metrics that can be used to evaluate, track, and improve the performance of these BERD units are critical to their successful establishment and sustainable future. To develop a set of reliable but versatile metrics that can be adapted easily to different environments and evolving needs, we consulted with members of BERD units from the consortium of academic health centers funded by the Clinical and Translational Science Award Program of the National Institutes of Health. Through a systematic process of consensus building and document drafting, we formulated metrics that covered the three identified domains of BERD practices: the development and maintenance of collaborations with clinical and translational science investigators, the application of BERD-related methods to clinical and translational research, and the discovery of novel BERD-related methodologies. In this article, we describe the set of metrics and advocate their use for evaluating BERD practices. The routine application, comparison of findings across diverse BERD units, and ongoing refinement of the metrics will identify trends, facilitate meaningful changes, and ultimately enhance the contribution of BERD activities to biomedical research. Copyright © 2011 John Wiley & Sons, Ltd.

  8. Investigation of ore processing to recover uranium concentrate from sandstone of Pa Lua area on scale of 2 tons of ore per batch

    International Nuclear Information System (INIS)

    Cao Hung Thai; Dinh Manh Thang; Tran Van Son; Le Quang Thai; Bui Dang Hanh; Hoang Bich Ngoc; Nguyen Hong Ha; Phung Vu Phong; Nguyen Khac Tuan

    2003-01-01

    Based on the laboratory results, a system for testing on a scale of 2 tons of uranium ore per batch was established, comprising equipment for crushing and grinding, acid leaching, impurity precipitation and filtration, and drying. The results of testing at the 2-tons-of-ore-per-batch scale showed that uranium recovery in the leach circuit reached at least 90% under the following conditions: a leach-agent supply rate of 50-70 l/m²·h, with the sandstone mixed or incubated with acid before percolation. About 23 kg of filter cake per m³ of solution were disposed of as tailings. Flocculants N101 and A101 (TOAGOSEL, Japan) were used to improve the filtration and washing capacity of the impurity precipitate. Uranium peroxide was precipitated with the addition of hydrogen peroxide. The underflow solids were filtered and calcined. The product contained min. 76% U₃O₈. Water recycling was successfully tested, minimizing water addition to only 0.3 m³/ton of ore. Experimental results at the 2-ton scale showed that the proposed processing flow sheet using direct precipitation can meet all environmental and technical objectives. (CHT)

  9. Nuclear Materials: Reconsidering Wastes and Assets - 13193

    International Nuclear Information System (INIS)

    Michalske, T.A.

    2013-01-01

    The nuclear industry, both in the commercial and the government sectors, has generated large quantities of material that span the spectrum of usefulness, from highly valuable ('assets') to worthless ('wastes'). In many cases, the decision parameters are clear. Transuranic waste and high level waste, for example, have no value and are either in a final disposition path today or - in the case of high level waste - awaiting a policy decision about final disposition. Other materials, though discardable, have intrinsic scientific or market value that may be hidden by the complexity, hazard, or cost of recovery. An informed decision process should acknowledge the asset value, or lack of value, of the complete inventory of materials, and the structure necessary to implement the range of possible options. It is important that informed decisions are made about the asset value for the variety of nuclear materials available. For example, there is a significant quantity of spent fuel available for recycle (an estimated $4 billion value in the Savannah River Site's (SRS) L area alone); in fact, SRS has already blended down more than 300 metric tons of uranium for commercial reactor use. Over 34 metric tons of surplus plutonium is also on a path to be used as commercial fuel. There are other radiological materials that are routinely handled at the site in large quantities that should be viewed as strategically important and/or commercially viable. In some cases, these materials are irreplaceable domestically, and failure to consider their recovery could jeopardize our technological leadership or national defense. The inventories of nuclear materials at SRS that have been characterized as 'waste' include isotopes of plutonium, uranium, americium, and helium. Although planning has been performed to establish the technical and regulatory bases for their discard and disposal, recovery of these materials is both economically attractive and in the national interest. (authors)

  10. Used nuclear materials at Savannah River Site: asset or waste?

    International Nuclear Information System (INIS)

    Magoulas, Virginia

    2013-01-01

    The nuclear industry, both in the commercial and the government sectors, has generated large quantities of material that span the spectrum of usefulness, from highly valuable ''assets'' to worthless ''wastes''. In many cases, the decision parameters are clear. Transuranic waste and high level waste, for example, have no value and are either in a final disposition path today or - in the case of high level waste - awaiting a policy decision about final disposition. Other materials, though discardable, have intrinsic scientific or market value that may be hidden by the complexity, hazard, or cost of recovery. An informed decision process should acknowledge the asset value, or lack of value, of the complete inventory of materials, and the structure necessary to implement the range of possible options. It is important that informed decisions are made about the asset value for the variety of nuclear materials available. For example, there is a significant quantity of spent fuel available for recycle (an estimated $4 billion value in the Savannah River Site's (SRS) L area alone); in fact, SRS has already blended down more than 300 metric tons of uranium for commercial reactor use. Over 34 metric tons of surplus plutonium is also on a path to be used as commercial fuel. There are other radiological materials that are routinely handled at the site in large quantities that should be viewed as strategically important and/or commercially viable. In some cases, these materials are irreplaceable domestically, and failure to consider their recovery could jeopardize our technological leadership or national defense. The inventories of nuclear materials at SRS that have been characterized as ''waste'' include isotopes of plutonium, uranium, americium, and helium. Although planning has been performed to establish the technical and regulatory bases for their discard and disposal, recovery of these materials is both economically attractive and in the national interest.

  11. Generalization of Vaidya's radiation metric

    Energy Technology Data Exchange (ETDEWEB)

    Gleiser, R J; Kozameh, C N [Universidad Nacional de Cordoba (Argentina). Instituto de Matematica, Astronomia y Fisica

    1981-11-01

    In this paper it is shown that if Vaidya's radiation metric is considered from the point of view of kinetic theory in general relativity, the corresponding phase space distribution function can be generalized in a particular way. The new family of spherically symmetric radiation metrics obtained contains Vaidya's as a limiting situation. The Einstein field equations are solved in a ''comoving'' coordinate system. Two arbitrary functions of a single variable are introduced in the process of solving these equations. Particular examples considered are a stationary solution, a nonvacuum solution depending on a single parameter, and several limiting situations.

  12. Marketing communication metrics for social media

    OpenAIRE

    Töllinen, Aarne; Karjaluoto, Heikki

    2011-01-01

    The objective of this paper is to develop a conceptual framework for measuring the effectiveness of social media marketing communications. Specifically, we study whether the existing marketing communications performance metrics are still valid in the changing digitalised communications landscape, or whether it is time to rethink them, or even to devise entirely new metrics. Recent advances in information technology and marketing bring a need to re-examine measurement models. We combine two im...

  13. Productivity of seed agrocenosis of common millet (Panicum miliaceum L.) at varying sowing terms and techniques under the conditions of the Right-Bank Forest-Steppe

    Directory of Open Access Journals (Sweden)

    С. П. Полторецький

    2013-08-01

    Full Text Available The objective of the research is to improve the technology of growing high-quality seeds of broomcorn millet by means of the optimization of sowing terms and methods, aimed at increasing its productivity and improving seed qualities under conditions of unstable moistening of the Right-Bank Forest-Steppe of Ukraine. A scientific literature review indicates that the study of the influence of these technology elements on the formation of sowing qualities and crop capacity of millet seeds has been of a schematic and occasional nature. The issue has not been studied under this region's conditions at all, which is why the research has considerable significance and novelty. Analysis, observations and calculations were done by means of conventional methods. Research results indicate that under conditions of unstable moistening of the southern part of the Right-Bank Forest-Steppe of Ukraine the highest yield of the Slobozhanske and Lana varieties was reached at sowing in drills - 39.2 and 41.0 metric centners per hectare, respectively. That was a 2.4 and 3.9 metric centners per hectare increase against the wide-row sowing. In years with optimal hydrothermal conditions, maximum seed productivity of broomcorn millet at the level of 4.24 to 4.79 metric tons per hectare (Slobozhanske variety) and 4.53 to 5.28 metric tons per hectare (Lana variety) was observed at postponing the sowing terms to the third decade of May. If hydrothermal conditions atypical for the region (drought or excessive moistening) are forecast during the vegetation period of millet, the highest productivity is provided by sowing in the second decade of May. Early sowing in the first decade of May causes a decrease in yield at the level of 0.14 to 0.48 metric tons per hectare (Slobozhanske variety) and 0.14 to 0.48 metric tons per hectare (Lana variety); if the sowing is postponed to the first decade of June, the yield increases by 0.31 to 0.77 and 0.39 to 0.84 metric tons per hectare, respectively. Early

  14. Accuracy and precision in the calculation of phenology metrics

    DEFF Research Database (Denmark)

    Ferreira, Ana Sofia; Visser, Andre; MacKenzie, Brian

    2014-01-01

    Phytoplankton phenology (the timing of seasonal events) is a commonly used indicator for evaluating responses of marine ecosystems to climate change. However, phenological metrics are vulnerable to observation-related (bloom amplitude, missing data, and observational noise) and analysis-related (temporal... a phenology metric is first determined from a noise- and gap-free time series, and again once it has been modified. We show that precision is a greater concern than accuracy for many of these metrics, an important point that has been hereto overlooked in the literature. The variability in precision between phenology metrics is substantial, but it can be improved by the use of preprocessing techniques (e.g., gap-filling or smoothing). Furthermore, there are important differences in the inherent variability of the metrics that may be crucial in the interpretation of studies based upon them. Of the considered...
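To illustrate why such timing metrics are precision-sensitive, consider a simple threshold-based bloom-initiation metric (a hypothetical variant for illustration, not necessarily the definition used in the study) applied to a clean chlorophyll series and to the same series with one noisy observation:

```python
import statistics

def bloom_initiation(series, frac=0.05):
    """Timing metric: index (day) of the first value exceeding the
    series median by a given fraction -- a common threshold-style
    definition of bloom start."""
    thr = statistics.median(series) * (1 + frac)
    for day, value in enumerate(series):
        if value > thr:
            return day
    return None

clean = [1.0, 1.0, 1.1, 1.5, 2.5, 3.0, 2.0, 1.2]
noisy = [1.0, 1.6, 1.1, 1.5, 2.5, 3.0, 2.0, 1.2]  # one spurious spike on day 1
print(bloom_initiation(clean), bloom_initiation(noisy))  # → 3 4
```

A single noisy observation shifts both the median-based threshold and the detected bloom day, which is exactly the kind of precision loss that gap-filling or smoothing is meant to reduce.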

  15. Quantum anomalies for generalized Euclidean Taub-NUT metrics

    International Nuclear Information System (INIS)

    Cotaescu, Ion I; Moroianu, Sergiu; Visinescu, Mihai

    2005-01-01

    The generalized Taub-NUT metrics exhibit in general gravitational anomalies. This is in contrast with the fact that the original Taub-NUT metric does not exhibit gravitational anomalies, which is a consequence of the fact that it admits Killing-Yano tensors forming Staeckel-Killing tensors as products. We have found that for axial anomalies, interpreted as the index of the Dirac operator, the presence of Killing-Yano tensors is irrelevant. In order to evaluate the axial anomalies, we compute the index of the Dirac operator with the APS boundary condition on balls and on annular domains. The result is an explicit number-theoretic quantity depending on the radii of the domain. This quantity is 0 for metrics close to the original Taub-NUT metric but it does not vanish in general

  16. Metrics for Performance Evaluation of Patient Exercises during Physical Therapy.

    Science.gov (United States)

    Vakanski, Aleksandar; Ferguson, Jake M; Lee, Stephen

    2017-06-01

    The article proposes a set of metrics for evaluation of patient performance in physical therapy exercises. A taxonomy is employed that classifies the metrics into quantitative and qualitative categories, based on the level of abstraction of the captured motion sequences. Further, the quantitative metrics are classified into model-less and model-based metrics, according to whether the evaluation employs the raw measurements of patient-performed motions or is based on a mathematical model of the motions. The reviewed metrics include root-mean-square distance, Kullback-Leibler divergence, log-likelihood, heuristic consistency, Fugl-Meyer Assessment, and similar measures. The metrics are evaluated for a set of five human motions captured with a Kinect sensor. The metrics can potentially be integrated into a system that employs machine learning for modelling and assessment of the consistency of patient performance in a home-based therapy setting. Automated performance evaluation can overcome the inherent subjectivity of human-performed therapy assessment, increase adherence to prescribed therapy plans, and reduce healthcare costs.
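Among the model-less quantitative metrics listed, the root-mean-square distance is straightforward to sketch: compare a patient's captured repetition frame-by-frame against a reference sequence. The joint-angle sequences below are invented for illustration:

```python
import math

def rms_distance(reference, patient):
    """Model-less quantitative metric: root-mean-square distance between
    a reference motion sequence and a patient's repetition (same length,
    one joint angle per frame, in degrees)."""
    assert len(reference) == len(patient)
    return math.sqrt(sum((r - p) ** 2 for r, p in zip(reference, patient))
                     / len(reference))

reference = [0.0, 15.0, 30.0, 45.0, 30.0, 15.0, 0.0]  # ideal elbow flexion arc
patient   = [0.0, 12.0, 28.0, 40.0, 33.0, 14.0, 1.0]  # captured repetition
print(round(rms_distance(reference, patient), 2))  # → 2.65
```

Lower values indicate closer adherence to the prescribed motion; a real system would apply this per joint and aggregate across the skeleton tracked by the sensor.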

  17. Self-benchmarking Guide for Laboratory Buildings: Metrics, Benchmarks, Actions

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul; Greenberg, Steve; Sartor, Dale

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of, and identify potential opportunities to reduce energy use in, laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, sponsored by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.
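    A typical whole-building metric in guides of this kind is energy use intensity (annual energy per unit floor area), ranked against peer benchmarks. The sketch below is a minimal illustration with made-up numbers; it is not drawn from the Labs21 database or the guide's own templates.

```python
def energy_use_intensity(annual_kwh, floor_area_m2):
    """Whole-building energy use intensity, kWh per m2 per year."""
    return annual_kwh / floor_area_m2

def fraction_of_peers_worse(eui, peer_euis):
    """Fraction of peer buildings with higher EUI than this one;
    a higher fraction means better relative performance."""
    return sum(1 for b in peer_euis if b > eui) / len(peer_euis)

# Hypothetical lab: 2.4 GWh/yr over 6000 m2, against five peer EUIs
eui = energy_use_intensity(2_400_000, 6000)              # 400.0 kWh/m2/yr
print(eui, fraction_of_peers_worse(eui, [350, 420, 500, 610, 380]))
```

    A benchmarking workflow would compute such metrics for each system as well as the whole building, then look up the actions the guide associates with out-of-range values.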

  18. Massless and massive quanta resulting from a mediumlike metric tensor

    International Nuclear Information System (INIS)

    Soln, J.

    1985-01-01

    A simple model of the ''primordial'' scalar field theory is presented in which the metric tensor is a generalization of the metric tensor from electrodynamics in a medium. The radiation signal corresponding to the scalar field propagates with a velocity that is generally less than c. This signal can be associated simultaneously with imaginary and real effective (momentum-dependent) masses. The requirement that the imaginary effective mass vanishes, which we take to be the prerequisite for the vacuumlike signal propagation, leads to the ''spontaneous'' splitting of the metric tensor into two distinct metric tensors: one metric tensor gives rise to masslesslike radiation and the other to a massive particle. (author)

  19. 76 FR 7633 - Endangered and Threatened Wildlife and Plants; 12-Month Finding on a Petition to List the Pacific...

    Science.gov (United States)

    2011-02-10

    ... walruses consume approximately 3 million metric tons (3,307 tons) of benthic biomass annually, and that the... prey use described here. Walruses typically swallow invertebrates without shells in their entirety (Fay 1982, p. 165). Walruses remove the soft parts of mollusks from their shells by suction, and discard the...

  20. Assessment of Coal Geology, Resources, and Reserves in the Gillette Coalfield, Powder River Basin, Wyoming

    Science.gov (United States)

    Luppens, James A.; Scott, David C.; Haacke, Jon E.; Osmonson, Lee M.; Rohrbacher, Timothy J.; Ellis, Margaret S.

    2008-01-01

    The Gillette coalfield, within the Powder River Basin in east-central Wyoming, is the most prolific coalfield in the United States. In 2006, production from the coalfield totaled over 431 million short tons of coal, which represented over 37 percent of the Nation's total yearly production. The Anderson and Canyon coal beds in the Gillette coalfield contain some of the largest deposits of low-sulfur subbituminous coal in the world. By utilizing the abundance of new data from recent coalbed methane development in the Powder River Basin, this study represents the most comprehensive evaluation of coal resources and reserves in the Gillette coalfield to date. Eleven coal beds were evaluated to determine the in-place coal resources. Six of the eleven coal beds were evaluated for reserve potential given current technology, economic factors, and restrictions to mining. These restrictions included the presence of railroads, a Federal interstate highway, cities, a gas plant, and alluvial valley floors. Other restrictions, such as thickness of overburden, thickness of coal beds, and areas of burned coal were also considered. The total original coal resource in the Gillette coalfield for all eleven coal beds assessed, and no restrictions applied, was calculated to be 201 billion short tons. Available coal resources, which are part of the original coal resource that is accessible for potential mine development after subtracting all restrictions, are about 164 billion short tons (81 percent of the original coal resource). Recoverable coal, which is the portion of available coal remaining after subtracting mining and processing losses, was determined for a stripping ratio of 10:1 or less. After mining and processing losses were subtracted, a total of 77 billion short tons of coal were calculated (48 percent of the original coal resource). Coal reserves are the portion of the recoverable coal that can be mined, processed, and marketed at a profit at the time of the economic
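    The 10:1 stripping-ratio screen mentioned in this record can be sketched as a simple recoverability filter. The code below uses overburden thickness divided by coal-bed thickness as a proxy for the stripping ratio; real assessments compute volume of overburden per ton of coal, so this is an illustrative simplification, and the thicknesses are hypothetical.

```python
def stripping_ratio(overburden_ft, coal_ft):
    """Thickness-based proxy for the stripping ratio: feet of
    overburden per foot of coal (actual assessments use volume of
    overburden removed per ton of coal recovered)."""
    return overburden_ft / coal_ft

def passes_recoverability_screen(overburden_ft, coal_ft, max_ratio=10.0):
    """Screen used in the assessment: only coal with a stripping
    ratio of 10:1 or less is counted toward recoverable coal."""
    return stripping_ratio(overburden_ft, coal_ft) <= max_ratio

print(passes_recoverability_screen(500, 70))  # 500 ft of cover on a 70 ft bed
print(passes_recoverability_screen(900, 50))  # 18:1, too deep to strip-mine
```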

  1. Matrix model and time-like linear dilaton matter

    International Nuclear Information System (INIS)

    Takayanagi, Tadashi

    2004-01-01

    We consider a matrix model description of the 2d string theory whose matter part is given by a time-like linear dilaton CFT. This is equivalent to the c=1 matrix model with a deformed, but very simple Fermi surface. Indeed, after a Lorentz transformation, the corresponding 2d spacetime is a conventional linear dilaton background with a time-dependent tachyon field. We show that the tree level scattering amplitudes in the matrix model perfectly agree with those computed in the world-sheet theory. The classical trajectories of fermions correspond to the decaying D-branes in the time-like linear dilaton CFT. We also discuss the ground ring structure. Furthermore, we study the properties of the time-like Liouville theory by applying this matrix model description. We find that its ground ring structure is very similar to that of the minimal string. (author)

  2. Collaboration in Humanitarian Logistics: Comparative Analysis of Disaster Response in Chile and Haiti 2010

    Science.gov (United States)

    2010-12-01

    hydropower. Exports total $554.8 million (2007 est.) with manufactured items, coffee, oils, cocoa, and mangoes being the main exports. Imports total... Exports total $48.8 billion in copper, fruit, fish products, paper and pulp, chemicals, and wine. Imports total $40.91 billion in petroleum and... relief supplies via USAID Uruguay Two water purification machines Venezuela Seven tons of relief supplies and assessment team of 27 experts

  3. A guide to phylogenetic metrics for conservation, community ecology and macroecology

    Science.gov (United States)

    Cadotte, Marc W.; Carvalho, Silvia B.; Davies, T. Jonathan; Ferrier, Simon; Fritz, Susanne A.; Grenyer, Rich; Helmus, Matthew R.; Jin, Lanna S.; Mooers, Arne O.; Pavoine, Sandrine; Purschke, Oliver; Redding, David W.; Rosauer, Dan F.; Winter, Marten; Mazel, Florent

    2016-01-01

    The use of phylogenies in ecology is increasingly common and has broadened our understanding of biological diversity. Ecological sub‐disciplines, particularly conservation, community ecology and macroecology, all recognize the value of evolutionary relationships but the resulting development of phylogenetic approaches has led to a proliferation of phylogenetic diversity metrics. The use of many metrics across the sub‐disciplines hampers potential meta‐analyses, syntheses, and generalizations of existing results. Further, there is no guide for selecting the appropriate metric for a given question, and different metrics are frequently used to address similar questions. To improve the choice, application, and interpretation of phylo‐diversity metrics, we organize existing metrics by expanding on a unifying framework for phylogenetic information. Generally, questions about phylogenetic relationships within or between assemblages tend to ask three types of question: how much; how different; or how regular? We show that these questions reflect three dimensions of a phylogenetic tree: richness, divergence, and regularity. We classify 70 existing phylo‐diversity metrics based on their mathematical form within these three dimensions and identify ‘anchor’ representatives: for α‐diversity metrics these are PD (Faith's phylogenetic diversity), MPD (mean pairwise distance), and VPD (variation of pairwise distances). By analysing mathematical formulae and using simulations, we use this framework to identify metrics that mix dimensions, and we provide a guide to choosing and using the most appropriate metrics. We show that metric choice requires connecting the research question with the correct dimension of the framework and that there are logical approaches to selecting and interpreting metrics. The guide outlined herein will help researchers navigate the current jungle of indices. PMID:26785932
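    Two of the anchor metrics named in this record, MPD (divergence) and VPD (regularity), can be computed directly from a matrix of pairwise patristic distances. The sketch below uses a hypothetical four-taxon distance table; PD, which requires the tree topology rather than just pairwise distances, is omitted.

```python
from itertools import combinations

def _pairwise(dist, taxa):
    """Collect pairwise distances for all taxon pairs in an assemblage;
    dist maps alphabetically ordered (a, b) tuples to patristic distances."""
    return [dist[tuple(sorted(p))] for p in combinations(taxa, 2)]

def mpd(dist, taxa):
    """MPD: mean pairwise phylogenetic distance (the divergence anchor)."""
    pairs = _pairwise(dist, taxa)
    return sum(pairs) / len(pairs)

def vpd(dist, taxa):
    """VPD: variance of pairwise distances (the regularity anchor)."""
    pairs = _pairwise(dist, taxa)
    m = sum(pairs) / len(pairs)
    return sum((d - m) ** 2 for d in pairs) / len(pairs)

# Hypothetical patristic distances among four taxa
d = {("a", "b"): 2.0, ("a", "c"): 6.0, ("a", "d"): 6.0,
     ("b", "c"): 6.0, ("b", "d"): 6.0, ("c", "d"): 4.0}
print(mpd(d, ["a", "b", "c", "d"]))  # -> 5.0
```

    Comparing MPD and VPD across assemblages separates the "how different" question from the "how regular" one, which is the distinction the framework draws.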

  4. Guangdong Aluminum to Raise RMB 3 billion for New Production Base in Guizhou

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    On July 7, a loan signing ceremony was held between the Guangdong Aluminum Group, China Construction Bank, Hua Xia Bank and Guangzhou Bank Consortium. It is reported that these banks will provide Guangdong Aluminum Group with RMB 30 billion for an aluminum oxide and supporting bauxite mining project in Guizhou.

  5. Ultrarelativistic heavy ion collisions: the first billion seconds

    Energy Technology Data Exchange (ETDEWEB)

    Baym, Gordon

    2016-12-15

    I first review the early history of the ultrarelativistic heavy ion program, starting with the 1974 Bear Mountain Workshop, and the 1983 Aurora meeting of the U.S. Nuclear Science Committee, just one billion seconds ago, which laid out the initial science goals of an ultrarelativistic collider. The primary goal, to discover the properties of nuclear matter at the highest energy densities, included finding new states of matter – the quark-gluon plasma primarily – and using collisions to open a new window on related problems of matter in cosmology, neutron stars, supernovae, and elsewhere. To bring out how the study of heavy ions and hot, dense matter in QCD has been fulfilling these goals, I concentrate on a few topics: the phase diagram of matter in QCD, and connections of heavy ion physics to cold atoms, cosmology, and neutron stars.

  6. A success story: LHC cable production at ALSTOM-MSA

    CERN Document Server

    Mocaer, P; Köhler, C; Verwaerde, C

    2005-01-01

    ITER, when constructed, will use the largest amount of superconducting strand (Nb$_{3}$Sn and NbTi) of any machine ever built. ALSTOM-MSA Magnets and Superconductors SA, "ALSTOM-MSA", received in 1998 the largest orders to date for the delivery of superconducting strands and cables (3100 km of cables for dipole and quadrupole magnets and various strands) for the Large Hadron Collider (LHC) being built at CERN, Geneva. These orders to ALSTOM-MSA correspond to more than 600 metric tons of superconducting strands, an amount comparable to the roughly 600 metric tons of Nb$_{3}$Sn strands and 250 metric tons of NbTi strands necessary for ITER. Starting from small and short R&D programs in the early nineties, ALSTOM-MSA has reached its industrial targets and has, as of September 2004, delivered around 74% of the whole orders with products meeting high quality standards. Production is going on at the contractual delivery rate and with satisfactory financial results, with deliveries to finish around end 2005, taking into...

  7. Disposal of residue from uranium ore processing in France

    International Nuclear Information System (INIS)

    Crochon, Ph.

    2011-01-01

    Between 1949 and 2001, French mines produced 76,000 metric tons of uranium and 50 million metric tons of ore; the processing residues are stored at 17 sites (in ponds enclosed by dykes or in former open-cast mines) subject to ICPE (classified facility for environment protection) regulation. These disposal sites cover surface areas of between one and several tens of hectares, and several thousand to several million metric tons of waste are stored at each. When uranium mining stopped in France, these sites were redeveloped, with caps placed over the residue to provide mechanical and radiological protection. All these sites are still monitored by AREVA. In the last fifteen years, these sites have been the subject of a number of studies, especially regarding the long-term evolution and impact of the residue. These studies are now being pursued within the framework of the national plan for the management of nuclear materials and waste (PNGMDR). A regulatory and institutional framework regarding long-term management of these disposal sites needs to be defined. (author)

  8. Assessing Software Quality Through Visualised Cohesion Metrics

    Directory of Open Access Journals (Sweden)

    Timothy Shih

    2001-05-01

    Full Text Available Cohesion is one of the most important factors for software quality, as well as for maintainability, reliability and reusability. Module cohesion is defined as a quality attribute that seeks to measure the singleness of purpose of a module. A module of poor cohesion can be a serious obstacle to system quality. In order to design software of good quality, software managers and engineers need to introduce cohesion metrics to measure and produce desirable software. Highly cohesive software is considered a desirable construction. In this paper, we propose a function-oriented cohesion metric based on the analysis of live variables, live span and the visualization of the processing-element dependency graph. We measure six typical cohesion examples as our experiments and justification. The result is a well-defined, well-normalized, well-visualized and well-experimented cohesion metric that indicates, and thus helps enhance, software cohesion strength. Furthermore, this cohesion metric can easily be incorporated into a software CASE tool to help software engineers improve software quality.
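    To give the flavor of a function-oriented cohesion measure, the sketch below scores a module by the fraction of function pairs that share at least one variable. This is an illustrative shared-variable score in the same spirit, not the paper's live-variable/live-span metric, and the module contents are hypothetical.

```python
from itertools import combinations

def shared_variable_cohesion(functions):
    """Illustrative cohesion score: fraction of function pairs in a
    module that share at least one variable. `functions` maps each
    function name to the set of variable names it uses."""
    pairs = list(combinations(functions, 2))
    if not pairs:
        return 1.0  # a single-function module is trivially cohesive
    sharing = sum(1 for a, b in pairs if functions[a] & functions[b])
    return sharing / len(pairs)

# Hypothetical module: three functions and the variables they touch
module = {
    "read_input": {"buf", "n"},
    "parse":      {"buf", "tokens"},
    "report":     {"tokens", "out"},
}
print(round(shared_variable_cohesion(module), 2))  # -> 0.67
```

    A score near 1 suggests the functions operate on common data (high cohesion); a score near 0 suggests the module bundles unrelated responsibilities.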

  9. MESUR: USAGE-BASED METRICS OF SCHOLARLY IMPACT

    Energy Technology Data Exchange (ETDEWEB)

    BOLLEN, JOHAN [Los Alamos National Laboratory; RODRIGUEZ, MARKO A. [Los Alamos National Laboratory; VAN DE SOMPEL, HERBERT [Los Alamos National Laboratory

    2007-01-30

    The evaluation of scholarly communication items is now largely a matter of expert opinion or metrics derived from citation data. Both approaches can fail to take into account the myriad of factors that shape scholarly impact. Usage data has emerged as a promising complement to existing methods of assessment, but the formal groundwork to reliably and validly apply usage-based metrics of scholarly impact is lacking. The Andrew W. Mellon Foundation funded MESUR project constitutes a systematic effort to define, validate and cross-validate a range of usage-based metrics of scholarly impact by creating a semantic model of the scholarly communication process. The constructed model will serve as the basis for creating a large-scale semantic network that seamlessly relates citation, bibliographic and usage data from a variety of sources. A subsequent program that uses the established semantic network as a reference data set will determine the characteristics and semantics of a variety of usage-based metrics of scholarly impact. This paper outlines the architecture and methodology adopted by the MESUR project and its future direction.

  10. Classroom reconstruction of the Schwarzschild metric

    OpenAIRE

    Kassner, Klaus

    2015-01-01

    A promising way to introduce general relativity in the classroom is to study the physical implications of certain given metrics, such as the Schwarzschild one. This involves lower mathematical expenditure than an approach focusing on differential geometry in its full glory and permits one to emphasize physical aspects before attacking the field equations. Even so, in terms of motivation, a lack of justification for the metric employed may pose an obstacle. The paper discusses how to establish the we...

  11. Area Regge calculus and discontinuous metrics

    International Nuclear Information System (INIS)

    Wainwright, Chris; Williams, Ruth M

    2004-01-01

    Taking the triangle areas as independent variables in the theory of Regge calculus can lead to ambiguities in the edge lengths, which can be interpreted as discontinuities in the metric. We construct solutions to area Regge calculus using a triangulated lattice and find that on a spacelike or timelike hypersurface no such discontinuity can arise. On a null hypersurface however, we can have such a situation and the resulting metric can be interpreted as a so-called refractive wave

  12. Use of metrics in an effective ALARA program

    International Nuclear Information System (INIS)

    Bates, B.B. Jr.

    1996-01-01

    ALARA radiological protection programs require metrics to meet their objectives. Sources of metrics include external dosimetry; internal dosimetry; radiological occurrences from the occurrence reporting and processing system (ORPS); and radiological incident reports (RIR). The sources themselves contain an abundance of specific "indicators". Choosing the site-specific indicators that will be tracked and trended requires careful review. Justification is needed to defend the indicators selected, and perhaps even stronger justification is needed for those indicators that are available but not chosen as a metric. Historically, the many different sources of information resided in a plethora of locations. Even the same type of metric had data located in different areas and could not be easily totaled for the entire Site. This required the end user to expend valuable time and effort to locate the data they needed. To address this problem, a central metrics database has been developed so that a customer can have all their questions addressed quickly and correctly. The database was developed initially to answer some of the customers' most frequently asked questions. It is now also a tool to communicate the status of the radiation protection program to facility managers. Finally, it also addresses requirements contained in the Rad Con manual and the 10CFR835 implementation guides. The database uses currently available, "user friendly" software and contains information from RIRs, ORPS, and external dosimetry records specific to ALARA performance indicators. The database is expandable to allow new metrics input. Specific reports have been developed to assist customers in their tracking and trending of ALARA metrics. These include quarterly performance indicator reports, monthly radiological incident reports, monthly external dose history and goals tracking reports, and the future use of performance indexing.

  13. Assessment of six dissimilarity metrics for climate analogues

    Science.gov (United States)

    Grenier, Patrick; Parent, Annie-Claude; Huard, David; Anctil, François; Chaumont, Diane

    2013-04-01

    Spatial analogue techniques consist of identifying locations whose recent-past climate is similar in some aspects to the future climate anticipated at a reference location. When identifying analogues, one key step is the quantification of the dissimilarity between two climates separated in time and space, which involves the choice of a metric. In this communication, spatial analogues and their usefulness are briefly discussed. Next, six metrics are presented (the standardized Euclidean distance, the Kolmogorov-Smirnov statistic, the nearest-neighbor distance, the Zech-Aslan energy statistic, the Friedman-Rafsky runs statistic and the Kullback-Leibler divergence), along with a set of criteria used for their assessment. The related case study involves the use of numerical simulations performed with the Canadian Regional Climate Model (CRCM-v4.2.3), from which three annual indicators (total precipitation, heating degree-days and cooling degree-days) are calculated over 30-year periods (1971-2000 and 2041-2070). Results indicate that the six metrics identify comparable analogue regions at a relatively large scale, but best analogues may differ substantially. For best analogues, it is also shown that the uncertainty stemming from the metric choice generally does not exceed that stemming from the simulation or model choice. A synthesis of the advantages and drawbacks of each metric is finally presented, in which the Zech-Aslan energy statistic stands out as the most recommended metric for analogue studies, whereas the Friedman-Rafsky runs statistic is the least recommended, based on this case study.
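    Two of the six dissimilarity metrics listed in this record are simple enough to sketch directly: the standardized Euclidean distance between vectors of annual indicators, and the two-sample Kolmogorov-Smirnov statistic between 30-year samples of a single indicator. The indicator values below are invented for illustration, not CRCM output.

```python
import math

def standardized_euclidean(x, y, sd):
    """Standardized Euclidean distance between two climate-indicator
    vectors, each component scaled by its standard deviation."""
    return math.sqrt(sum(((a - b) / s) ** 2 for a, b, s in zip(x, y, sd)))

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum gap
    between the empirical CDFs of two samples of an indicator."""
    def ecdf(sample, v):
        return sum(1 for x in sample if x <= v) / len(sample)
    values = sorted(set(sample_a) | set(sample_b))
    return max(abs(ecdf(sample_a, v) - ecdf(sample_b, v)) for v in values)

# Hypothetical (precipitation mm, heating degree-days) for two locations
print(standardized_euclidean([900, 4000], [800, 4400], [50, 200]))  # sqrt(8)
# Hypothetical 4-year samples of one annual indicator at two locations
print(ks_statistic([1, 2, 3, 4], [3, 4, 5, 6]))  # -> 0.5
```

    In an analogue search, one of these dissimilarities would be computed between the reference location's projected climate and every candidate location's recent-past climate, and the smallest values retained as analogues.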

  14. Comprehensive Metric Education Project: Implementing Metrics at a District Level Administrative Guide.

    Science.gov (United States)

    Borelli, Michael L.

    This document details the administrative issues associated with guiding a school district through its metrication efforts. Issues regarding staff development, curriculum development, and the acquisition of instructional resources are considered. Alternative solutions are offered. Finally, an overall implementation strategy is discussed with…

  15. Effective use of metrics in an ALARA program

    International Nuclear Information System (INIS)

    Bates, B.B. Jr.

    1996-01-01

    ALARA radiological protection programs require metrics to meet their objectives. Sources of metrics include: external dosimetry; internal dosimetry; radiological occurrences from the occurrence reporting and processing system (ORPS); and radiological incident reports (RIR). The sources themselves contain an abundance of specific ''indicators''. Choosing the site-specific indicators that will be tracked and trended requires careful review. Historically, the many different sources of information resided in a plethora of locations, which required the end users to expend valuable time and effort to locate the data they needed. To address this problem, a central metrics database has been developed so that customers can have all their questions addressed quickly and correctly. The database was developed initially to answer some of the customers' most frequently asked questions. It is now also a tool to communicate the status of the radiation protection program to facility managers. Finally, it also addresses requirements contained in the Rad Con manual and the 10CFR835 implementation guides. The database uses currently available, ''user friendly'' software and contains information from RIRs, ORPS, and external dosimetry records specific to ALARA performance indicators. The database is expandable to allow new metrics input. Specific reports have been developed to assist customers in their tracking and trending of ALARA metrics.

  16. A family of metric gravities

    Science.gov (United States)

    Shuler, Robert

    2018-04-01

    The goal of this paper is to take a completely fresh approach to metric gravity, in which the metric principle is strictly adhered to but its properties in local space-time are derived from conservation principles, not inferred from a global field equation. The global field strength variation then gains some flexibility, but only in the regime of very strong fields (2nd-order terms) whose measurement is now being contemplated. So doing provides a family of similar gravities, differing only in strong fields, which could be developed into meaningful verification targets for strong fields after the manner in which far-field variations were used in the 20th century. General Relativity (GR) is shown to be a member of the family and this is demonstrated by deriving the Schwarzschild metric exactly from a suitable field strength assumption. The method of doing so is interesting in itself because it involves only one differential equation rather than the usual four. Exact static symmetric field solutions are also given for one pedagogical alternative based on potential, and one theoretical alternative based on inertia, and the prospects of experimentally differentiating these are analyzed. Whether the method overturns the conventional wisdom that GR is the only metric theory of gravity and that alternatives must introduce additional interactions and fields is somewhat semantical, depending on whether one views the field strength assumption as a field and whether the assumption that produces GR is considered unique in some way. It is of course possible to have other fields, and the local space-time principle can be applied to field gravities which usually are weak-field approximations having only time dilation, giving them the spatial factor and promoting them to full metric theories. Though usually pedagogical, some of them are interesting from a quantum gravity perspective. Cases are noted where mass measurement errors, or distributions of dark matter, can cause one

  17. Metrication manual

    International Nuclear Information System (INIS)

    Harper, A.F.A.; Digby, R.B.; Thong, S.P.; Lacey, F.

    1978-04-01

    In April 1978 a meeting of senior metrication officers convened by the Commonwealth Science Council of the Commonwealth Secretariat, was held in London. The participants were drawn from Australia, Bangladesh, Britain, Canada, Ghana, Guyana, India, Jamaica, Papua New Guinea, Solomon Islands and Trinidad and Tobago. Among other things, the meeting resolved to develop a set of guidelines to assist countries to change to SI and to compile such guidelines in the form of a working manual

  18. An accurate metric for the spacetime around rotating neutron stars

    Science.gov (United States)

    Pappas, George

    2017-04-01

    The problem of having an accurate description of the spacetime around rotating neutron stars is of great astrophysical interest. For astrophysical applications, one needs to have a metric that captures all the properties of the spacetime around a rotating neutron star. Furthermore, an accurate, appropriately parametrized metric, i.e. a metric that is given in terms of parameters that are directly related to the physical structure of the neutron star, could be used to solve the inverse problem, which is to infer the properties of the structure of a neutron star from astrophysical observations. In this work, we present such an approximate stationary and axisymmetric metric for the exterior of rotating neutron stars, which is constructed using the Ernst formalism and is parametrized by the relativistic multipole moments of the central object. This metric is given in terms of an expansion in the Weyl-Papapetrou coordinates with the multipole moments as free parameters and is shown to be extremely accurate in capturing the physical properties of a neutron star spacetime as they are calculated numerically in general relativity. Because the metric is given in terms of an expansion, the expressions are much simpler and easier to implement, in contrast to previous approaches. For the parametrization of the metric in general relativity, the recently discovered universal 3-hair relations are used to produce a three-parameter metric. Finally, a straightforward extension of this metric is given for scalar-tensor theories with a massless scalar field, which also admit a formulation in terms of an Ernst potential.

  19. Metric reconstruction from Weyl scalars

    Energy Technology Data Exchange (ETDEWEB)

    Whiting, Bernard F; Price, Larry R [Department of Physics, PO Box 118440, University of Florida, Gainesville, FL 32611 (United States)

    2005-08-07

    The Kerr geometry has remained an elusive world in which to explore physics and delve into the more esoteric implications of general relativity. Following the discovery, by Kerr in 1963, of the metric for a rotating black hole, the most major advance has been an understanding of its Weyl curvature perturbations based on Teukolsky's discovery of separable wave equations some ten years later. In the current research climate, where experiments across the globe are preparing for the first detection of gravitational waves, a more complete understanding than concerns just the Weyl curvature is now called for. To understand precisely how comparatively small masses move in response to the gravitational waves they emit, a formalism has been developed based on a description of the whole spacetime metric perturbation in the neighbourhood of the emission region. Presently, such a description is not available for the Kerr geometry. While there does exist a prescription for obtaining metric perturbations once curvature perturbations are known, it has become apparent that there are gaps in that formalism which are still waiting to be filled. The most serious gaps include gauge inflexibility, the inability to include sources-which are essential when the emitting masses are considered-and the failure to describe the l = 0 and 1 perturbation properties. Among these latter properties of the perturbed spacetime, arising from a point mass in orbit, are the perturbed mass and axial component of angular momentum, as well as the very elusive Carter constant for non-axial angular momentum. A status report is given on recent work which begins to repair these deficiencies in our current incomplete description of Kerr metric perturbations.

  1. Using Publication Metrics to Highlight Academic Productivity and Research Impact

    Science.gov (United States)

    Carpenter, Christopher R.; Cone, David C.; Sarli, Cathy C.

    2016-01-01

    This article provides a broad overview of widely available measures of academic productivity and impact using publication data, and highlights uses of these metrics for various purposes. Metrics based on publication data include measures such as the number of publications, number of citations, the journal impact factor score, and the h-index, as well as emerging document-level metrics. Publication metrics can be used for a variety of purposes, including tenure and promotion, grant applications and renewal reports, benchmarking, recruiting efforts, and administrative reporting of departmental or university performance. The authors also highlight practical applications of measuring and reporting academic productivity and impact to emphasize and promote individual investigators, grant applications, or department output. PMID:25308141
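    Of the metrics named in this record, the h-index has a simple standard definition: the largest h such that the author has h papers with at least h citations each. A minimal sketch (citation counts are hypothetical):

```python
def h_index(citations):
    """h-index: largest h such that h papers have >= h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank   # this paper still has enough citations for its rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # -> 4 (four papers with >= 4 citations)
print(h_index([25, 8, 5, 3, 3]))  # -> 3
```

    The second example shows a known property of the metric: one very highly cited paper (25 citations) does not raise the h-index, which is one reason articles like this one recommend reporting several complementary metrics.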

  2. Comparison of routing metrics for wireless mesh networks

    CSIR Research Space (South Africa)

    Nxumalo, SL

    2011-09-01

    Full Text Available ... in each and every relay node so as to find the next hop for the packet. A routing metric is simply a measure used by a routing protocol to select the best path. Figure 2 shows the relationship between a routing protocol and the routing... on its QoS-awareness level. The routing metrics that considered QoS the most were selected from each group. This section discusses the four routing metrics that were compared in this paper, which are: hop count (HOP), expected transmission count (ETX...
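The expected transmission count (ETX) named here is conventionally computed per link from the forward and reverse probe delivery ratios, and a path's ETX is the sum over its links; hop count simply counts the links. A minimal sketch under that standard definition (function names are illustrative, not from the paper):

```python
def link_etx(d_f, d_r):
    """ETX of one link: expected transmissions per delivered-and-acked
    packet, given forward (d_f) and reverse (d_r) delivery ratios in (0, 1]."""
    return 1.0 / (d_f * d_r)

def path_metrics(links):
    """links: list of (d_f, d_r) pairs, one per hop.
    Returns (hop count, path ETX)."""
    return len(links), sum(link_etx(f, r) for f, r in links)

# a perfect link followed by a link that loses half the forward probes
hops, etx = path_metrics([(1.0, 1.0), (0.5, 1.0)])
# hops == 2; etx == 3.0 (1 expected transmission + 2)
```

Hop count would rank this path purely by its two hops; ETX additionally penalizes the lossy link, which is why it is often preferred in wireless mesh networks.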

  3. The metrics and correlates of physician migration from Africa

    Directory of Open Access Journals (Sweden)

    Arah Onyebuchi A

    2007-05-01

    Full Text Available Abstract Background Physician migration from poor to rich countries is considered an important contributor to the growing health workforce crisis in the developing world. This is particularly true for Africa. The perceived magnitude of such migration for each source country might, however, depend on the choice of metrics used in the analysis. This study examined the influence of choice of migration metrics on the rankings of African countries that suffered the most physician migration, and investigated the correlates of physician migration. Methods Ranking and correlational analyses were conducted on African physician migration data adjusted for bilateral net flows, and supplemented with developmental, economic and health system data. The setting was the 53 African birth countries of African-born physicians working in nine wealthier destination countries. Three metrics of physician migration were used: total number of physician émigrés; emigration fraction defined as the proportion of the potential physician pool working in destination countries; and physician migration density defined as the number of physician émigrés per 1000 population of the African source country. Results Rankings based on any of the migration metrics differed substantially from those based on the other two metrics. Although the emigration fraction and physician migration density metrics gave proportionality to the migration crisis, only the latter was consistently associated with source countries' workforce capacity, health, health spending, economic and development characteristics. As such, higher physician migration density was seen among African countries with relatively higher health workforce capacity (0.401 ≤ r ≤ 0.694, p ≤ 0.011), health status, health spending, and development. Conclusion The perceived magnitude of physician migration is sensitive to the choice of metrics. Complementing the emigration fraction, the physician migration density is a metric
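The two proportional metrics defined in the abstract, emigration fraction and physician migration density, reduce to simple ratios. A sketch directly from the paper's definitions (function and variable names are ours):

```python
def emigration_fraction(emigres, physicians_at_home):
    """Proportion of the potential physician pool working abroad."""
    return emigres / (emigres + physicians_at_home)

def migration_density(emigres, source_population):
    """Physician emigres per 1000 population of the source country."""
    return 1000.0 * emigres / source_population

# a hypothetical country: 500 emigres, 1500 physicians at home, 5M people
print(emigration_fraction(500, 1500))      # 0.25
print(migration_density(500, 5_000_000))   # 0.1 per 1000 population
```

The two metrics can rank the same country very differently: a small country can have a modest absolute number of émigrés yet a high migration density, which is the sensitivity the study documents.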

  4. Description of the Sandia National Laboratories science, technology & engineering metrics process.

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, Gretchen B.; Watkins, Randall D.; Trucano, Timothy Guy; Burns, Alan Richard; Oelschlaeger, Peter

    2010-04-01

    There has been a concerted effort since 2007 to establish a dashboard of metrics for the Science, Technology, and Engineering (ST&E) work at Sandia National Laboratories. These metrics are to provide a self-assessment mechanism for the ST&E Strategic Management Unit (SMU) to complement external expert review and advice and various internal self-assessment processes. The data and analysis will help ST&E Managers plan, implement, and track strategies and work in order to support the critical success factors of nurturing core science and enabling laboratory missions. The purpose of this SAND report is to provide a guide for those who want to understand the ST&E SMU metrics process. This report provides an overview of why the ST&E SMU wants a dashboard of metrics, some background on metrics for ST&E programs from existing literature and past Sandia metrics efforts, a summary of work completed to date, specifics on the portfolio of metrics that have been chosen and the implementation process that has been followed, and plans for the coming year to improve the ST&E SMU metrics process.

  5. Relevance of motion-related assessment metrics in laparoscopic surgery.

    Science.gov (United States)

    Oropesa, Ignacio; Chmarra, Magdalena K; Sánchez-González, Patricia; Lamata, Pablo; Rodrigues, Sharon P; Enciso, Silvia; Sánchez-Margallo, Francisco M; Jansen, Frank-Willem; Dankelman, Jenny; Gómez, Enrique J

    2013-06-01

    Motion metrics have become an important source of information when addressing the assessment of surgical expertise. However, their direct relationship with the different surgical skills has not been fully explored. The purpose of this study is to investigate the relevance of motion-related metrics in the evaluation processes of basic psychomotor laparoscopic skills and their correlation with the different abilities sought to measure. A framework for task definition and metric analysis is proposed. An explorative survey was first conducted with a board of experts to identify metrics to assess basic psychomotor skills. Based on the output of that survey, 3 novel tasks for surgical assessment were designed. Face and construct validation was performed, with focus on motion-related metrics. Tasks were performed by 42 participants (16 novices, 22 residents, and 4 experts). Movements of the laparoscopic instruments were registered with the TrEndo tracking system and analyzed. Time, path length, and depth showed construct validity for all 3 tasks. Motion smoothness and idle time also showed validity for tasks involving bimanual coordination and tasks requiring a more tactical approach, respectively. Additionally, motion smoothness and average speed showed a high internal consistency, proving them to be the most task-independent of all the metrics analyzed. Motion metrics are complementary and valid for assessing basic psychomotor skills, and their relevance depends on the skill being evaluated. A larger clinical implementation, combined with quality performance information, will give more insight on the relevance of the results shown in this study.

  6. Left-invariant Einstein metrics on S3 × S3

    Science.gov (United States)

    Belgun, Florin; Cortés, Vicente; Haupt, Alexander S.; Lindemann, David

    2018-06-01

    The classification of homogeneous compact Einstein manifolds in dimension six is an open problem. We consider the remaining open case, namely left-invariant Einstein metrics g on G = SU(2) × SU(2) = S3 × S3. Einstein metrics are critical points of the total scalar curvature functional for fixed volume. The scalar curvature S of a left-invariant metric g is constant and can be expressed as a rational function in the parameters determining the metric. The critical points of S, subject to the volume constraint, are given by the zero locus of a system of polynomials in the parameters. In general, however, the determination of the zero locus is apparently out of reach. Instead, we consider the case where the isotropy group K of g in the group of motions is non-trivial. When K ≇ Z2 we prove that the Einstein metrics on G are given by (up to homothety) either the standard metric or the nearly Kähler metric, based on representation-theoretic arguments and computer algebra. For the remaining case K ≅ Z2 we present partial results.

  7. Active Metric Learning from Relative Comparisons

    OpenAIRE

    Xiong, Sicheng; Rosales, Rómer; Pei, Yuanli; Fern, Xiaoli Z.

    2014-01-01

    This work focuses on active learning of distance metrics from relative comparison information. A relative comparison specifies, for a data point triplet $(x_i,x_j,x_k)$, that instance $x_i$ is more similar to $x_j$ than to $x_k$. Such constraints, when available, have been shown to be useful toward defining appropriate distance metrics. In real-world applications, acquiring constraints often requires considerable human effort. This motivates us to study how to select and query the most useful ...
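A relative comparison of the kind described, $x_i$ more similar to $x_j$ than to $x_k$, can be checked against a candidate Mahalanobis metric M. The sketch below only illustrates the constraint itself, not the paper's active-selection algorithm:

```python
import numpy as np

def mahalanobis_sq(x, y, M):
    """Squared distance (x - y)^T M (x - y) under a PSD matrix M."""
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(d @ M @ d)

def satisfies_triplet(x_i, x_j, x_k, M):
    """True if x_i is closer to x_j than to x_k under the metric M."""
    return mahalanobis_sq(x_i, x_j, M) < mahalanobis_sq(x_i, x_k, M)

M = np.eye(2)  # plain Euclidean metric as a starting point
# (0,0) is closer to (1,0) than to (0,3) under the identity metric
print(satisfies_triplet([0, 0], [1, 0], [0, 3], M))  # True
```

Metric-learning methods of this family adjust M so that as many queried triplets as possible are satisfied, while active learning decides which triplet to ask a human about next.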

  8. LLNL's Big Science Capabilities Help Spur Over $796 Billion in U.S. Economic Activity: Sequencing the Human Genome

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Jeffrey S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-28

    LLNL’s successful history of taking on big science projects spans beyond national security and has helped create billions of dollars per year in new economic activity. One example is LLNL’s role in helping sequence the human genome. Over $796 billion in new economic activity in over half a dozen fields has been documented since LLNL successfully completed this Grand Challenge.

  9. A practical approach to determine dose metrics for nanomaterials.

    Science.gov (United States)

    Delmaar, Christiaan J E; Peijnenburg, Willie J G M; Oomen, Agnes G; Chen, Jingwen; de Jong, Wim H; Sips, Adriënne J A M; Wang, Zhuang; Park, Margriet V D Z

    2015-05-01

    Traditionally, administered mass is used to describe doses of conventional chemical substances in toxicity studies. For deriving toxic doses of nanomaterials, mass and chemical composition alone may not adequately describe the dose, because particles with the same chemical composition can have completely different toxic mass doses depending on properties such as particle size. Other dose metrics such as particle number, volume, or surface area have been suggested, but consensus is lacking. The discussion regarding the most adequate dose metric for nanomaterials clearly needs a systematic, unbiased approach to determine the most appropriate dose metric for nanomaterials. In the present study, the authors propose such an approach and apply it to results from in vitro and in vivo experiments with silver and silica nanomaterials. The proposed approach is shown to provide a convenient tool to systematically investigate and interpret dose metrics of nanomaterials. Recommendations for study designs aimed at investigating dose metrics are provided. © 2015 SETAC.
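To see why mass alone underdetermines the dose, one can convert a mass dose into particle-number and surface-area doses under the idealization of identical spherical particles: halving the diameter at fixed mass multiplies the particle number by eight and the total surface area by two. This back-of-the-envelope conversion is ours for illustration, not the authors' procedure:

```python
import math

def dose_from_mass(mass_ug, diameter_nm, density_g_cm3):
    """Particle-number and surface-area (cm^2) doses for a given mass dose,
    assuming identical spherical particles (an idealization)."""
    d_cm = diameter_nm * 1e-7                          # nm -> cm
    mass_per_particle = density_g_cm3 * math.pi * d_cm**3 / 6.0
    n = (mass_ug * 1e-6) / mass_per_particle           # ug -> g, then count
    surface_cm2 = n * math.pi * d_cm**2
    return n, surface_cm2

# same 1 ug mass dose at an assumed silver-like density of 10.5 g/cm^3
n_100, s_100 = dose_from_mass(1.0, 100, 10.5)
n_50, s_50 = dose_from_mass(1.0, 50, 10.5)
# 8x as many 50 nm particles, 2x the surface area, for the same mass
```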

  10. Information-theoretic semi-supervised metric learning via entropy regularization.

    Science.gov (United States)

    Niu, Gang; Dai, Bo; Yamada, Makoto; Sugiyama, Masashi

    2014-08-01

    We propose a general information-theoretic approach to semi-supervised metric learning called SERAPH (SEmi-supervised metRic leArning Paradigm with Hypersparsity) that does not rely on the manifold assumption. Given the probability parameterized by a Mahalanobis distance, we maximize its entropy on labeled data and minimize its entropy on unlabeled data following entropy regularization. For metric learning, entropy regularization improves manifold regularization by considering the dissimilarity information of unlabeled data in the unsupervised part, and hence it allows the supervised and unsupervised parts to be integrated in a natural and meaningful way. Moreover, we regularize SERAPH by trace-norm regularization to encourage low-dimensional projections associated with the distance metric. The nonconvex optimization problem of SERAPH could be solved efficiently and stably by either a gradient projection algorithm or an EM-like iterative algorithm whose M-step is convex. Experiments demonstrate that SERAPH compares favorably with many well-known metric learning methods, and the learned Mahalanobis distance possesses high discriminability even under noisy environments.
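The Mahalanobis distances learned by methods such as SERAPH take the form d(x, y) = sqrt((x − y)ᵀ M (x − y)) with M positive semi-definite; writing M = LᵀL shows why a trace-norm penalty on M encourages a low-dimensional projection L. A minimal illustration of that equivalence (not the SERAPH optimizer itself):

```python
import numpy as np

def learned_distance(x, y, L):
    """Distance under M = L.T @ L: Euclidean distance after projecting
    the difference vector with L (low-rank L = low-dim projection)."""
    z = L @ (np.asarray(x, dtype=float) - np.asarray(y, dtype=float))
    return float(np.linalg.norm(z))

# with L the identity, this reduces to ordinary Euclidean distance
print(learned_distance([0, 0], [3, 4], np.eye(2)))  # 5.0

# a rank-1 L keeps only the first coordinate of the difference
L1 = np.array([[1.0, 0.0]])
print(learned_distance([0, 0], [3, 4], L1))  # 3.0
```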

  11. Fisher information metrics for binary classifier evaluation and training

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Different evaluation metrics for binary classifiers are appropriate to different scientific domains and even to different problems within the same domain. This presentation focuses on the optimisation of event selection to minimise statistical errors in HEP parameter estimation, a problem that is best analysed in terms of the maximisation of Fisher information about the measured parameters. After describing a general formalism to derive evaluation metrics based on Fisher information, three more specific metrics are introduced for the measurements of signal cross sections in counting experiments (FIP1) or distribution fits (FIP2) and for the measurements of other parameters from distribution fits (FIP3). The FIP2 metric is particularly interesting because it can be derived from any ROC curve, provided that prevalence is also known. In addition to its relation to measurement errors when used as an evaluation criterion (which makes it more interesting than the ROC AUC), a further advantage of the FIP2 metric is ...

  12. Reconstruction and Analysis for the DUNE 35-ton Liquid Argon Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Wallbank, Michael James [Sheffield U.

    2018-01-01

    Neutrino physics is approaching the precision era, with current and future experiments aiming to perform highly accurate measurements of the parameters which govern the phenomenon of neutrino oscillations. The ultimate ambition with these results is to search for evidence of CP-violation in the lepton sector, currently hinted at in the world-leading analyses from present experiments, which may explain the dominance of matter over antimatter in the Universe. The Deep Underground Neutrino Experiment (DUNE) is a future long-baseline experiment based at Fermi National Accelerator Laboratory (FNAL), with a far detector at the Sanford Underground Research Facility (SURF) and a baseline of 1300 km. In order to make the required precision measurements, the far detector will consist of 40 kton liquid argon and an embedded time projection chamber. This promising technology is still in development and, since each detector module is around a factor 15 larger than any previous experiment employing this design, prototyping the detector and design choices is critical to the success of the experiment. The 35-ton experiment was constructed for this purpose and will be described in detail in this thesis. The outcomes of the 35-ton prototype are already influencing DUNE and, following the successes and lessons learned from the experiment, confidence can be taken forward to the next stage of the DUNE programme. The main oscillation signal at DUNE will be electron neutrino appearance from the muon neutrino beam. High-precision studies of these νe interactions requires advanced processing and event reconstruction techniques, particularly in the handling of showering particles such as electrons and photons. Novel methods developed for the purposes of shower reconstruction in liquid argon are presented with an aim to successfully develop a selection to use in a νe charged-current analysis, and a first-generation selection using the new techniques is presented.

  13. Metrical and dynamical aspects in complex analysis

    CERN Document Server

    2017-01-01

    The central theme of this reference book is the metric geometry of complex analysis in several variables. Bridging a gap in the current literature, the text focuses on the fine behavior of the Kobayashi metric of complex manifolds and its relationships to dynamical systems, hyperbolicity in the sense of Gromov and operator theory, all very active areas of research. The modern points of view expressed in these notes, collected here for the first time, will be of interest to academics working in the fields of several complex variables and metric geometry. The different topics are treated coherently and include expository presentations of the relevant tools, techniques and objects, which will be particularly useful for graduate and PhD students specializing in the area.

  14. Quarterly coal report, January--March 1992

    International Nuclear Information System (INIS)

    Young, P.

    1992-01-01

    The United States produced 257 million short tons of coal in the first quarter of 1992. This was the second highest quarterly production level ever recorded. US coal exports in January through March of 1992 were 25 million short tons, the highest first quarter since 1982. The leading destinations for US coal exports were Japan, Italy, France, and the Netherlands, together receiving 46 percent of the total. Coal exports for the first quarter of 1992 were valued at $1 billion, based on an average price of $42.28 per short ton. Steam coal exports totaled 10 million short tons, an increase of 34 percent over the level a year earlier. Metallurgical coal exports amounted to 15 million short tons, about the same as a year earlier. US coal consumption for January through March 1992 was 221 million short tons, 2 million short tons more than a year earlier (Table 45). All sectors but the residential and commercial sector reported increased coal consumption.
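The quoted export value is consistent with the quoted volume and average price: 25 million short tons at $42.28 per short ton comes to roughly $1.06 billion, which the report rounds to $1 billion.

```python
tons = 25_000_000        # first-quarter 1992 US coal exports, short tons
price_per_ton = 42.28    # reported average price, dollars per short ton
value = tons * price_per_ton
# about $1.06 billion, matching the report's rounded "$1 billion" figure
print(f"${value / 1e9:.2f} billion")
```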

  15. Important LiDAR metrics for discriminating forest tree species in Central Europe

    Science.gov (United States)

    Shi, Yifang; Wang, Tiejun; Skidmore, Andrew K.; Heurich, Marco

    2018-03-01

    Numerous airborne LiDAR-derived metrics have been proposed for classifying tree species. Yet an in-depth ecological and biological understanding of the significance of these metrics for tree species mapping remains largely unexplored. In this paper, we evaluated the performance of 37 frequently used LiDAR metrics derived under leaf-on and leaf-off conditions, respectively, for discriminating six different tree species in a natural forest in Germany. We firstly assessed the correlation between these metrics. Then we applied a Random Forest algorithm to classify the tree species and evaluated the importance of the LiDAR metrics. Finally, we identified the most important LiDAR metrics and tested their robustness and transferability. Our results indicated that about 60% of LiDAR metrics were highly correlated to each other (|r| > 0.7). There was no statistically significant difference in tree species mapping accuracy between the use of leaf-on and leaf-off LiDAR metrics. However, combining leaf-on and leaf-off LiDAR metrics significantly increased the overall accuracy from 58.2% (leaf-on) and 62.0% (leaf-off) to 66.5% as well as the kappa coefficient from 0.47 (leaf-on) and 0.51 (leaf-off) to 0.58. Radiometric features, especially intensity related metrics, provided more consistent and significant contributions than geometric features for tree species discrimination. Specifically, the mean intensity of first-or-single returns as well as the mean value of echo width were identified as the most robust LiDAR metrics for tree species discrimination. These results indicate that metrics derived from airborne LiDAR data, especially radiometric metrics, can aid in discriminating tree species in a mixed temperate forest, and represent candidate metrics for tree species classification and monitoring in Central Europe.
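A common way to act on the finding that about 60% of metric pairs were highly correlated (|r| > 0.7) is to greedily keep each metric only if it is not strongly correlated with any metric already kept. This is a generic sketch of that filtering step, not the authors' exact procedure (the metric names below are hypothetical):

```python
import numpy as np

def drop_correlated(X, names, threshold=0.7):
    """Keep each metric (column of X) only if its |r| with every
    already-kept metric is at or below the threshold."""
    r = np.corrcoef(X, rowvar=False)
    kept = []
    for j in range(X.shape[1]):
        if all(abs(r[j, k]) <= threshold for k in kept):
            kept.append(j)
    return [names[j] for j in kept]

a = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
X = np.column_stack([a, 2 * a, np.array([5.0, 1.0, 4.0, 2.0, 3.0])])
# the second column duplicates the first (r = 1), so it is dropped
print(drop_correlated(X, ["height_mean", "height_x2", "intensity_mean"]))
```

The surviving metrics can then be fed to a Random Forest, whose feature importances identify the most useful ones, as done in the study.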

  16. Human Performance Optimization Metrics: Consensus Findings, Gaps, and Recommendations for Future Research.

    Science.gov (United States)

    Nindl, Bradley C; Jaffin, Dianna P; Dretsch, Michael N; Cheuvront, Samuel N; Wesensten, Nancy J; Kent, Michael L; Grunberg, Neil E; Pierce, Joseph R; Barry, Erin S; Scott, Jonathan M; Young, Andrew J; OʼConnor, Francis G; Deuster, Patricia A

    2015-11-01

    Human performance optimization (HPO) is defined as "the process of applying knowledge, skills and emerging technologies to improve and preserve the capabilities of military members, and organizations to execute essential tasks." The lack of consensus for operationally relevant and standardized metrics that meet joint military requirements has been identified as the single most important gap for research and application of HPO. In 2013, the Consortium for Health and Military Performance hosted a meeting to develop a toolkit of standardized HPO metrics for use in military and civilian research, and potentially for field applications by commanders, units, and organizations. Performance was considered from a holistic perspective as being influenced by various behaviors and barriers. To accomplish the goal of developing a standardized toolkit, key metrics were identified and evaluated across a spectrum of domains that contribute to HPO: physical performance, nutritional status, psychological status, cognitive performance, environmental challenges, sleep, and pain. These domains were chosen based on relevant data with regard to performance enhancers and degraders. The specific objectives at this meeting were to (a) identify and evaluate current metrics for assessing human performance within selected domains; (b) prioritize metrics within each domain to establish a human performance assessment toolkit; and (c) identify scientific gaps and the needed research to more effectively assess human performance across domains. This article provides a summary of 150 total HPO metrics across multiple domains that can be used as a starting point, the beginning of an HPO toolkit: physical fitness (29 metrics), nutrition (24 metrics), psychological status (36 metrics), cognitive performance (35 metrics), environment (12 metrics), sleep (9 metrics), and pain (5 metrics). These metrics can be particularly valuable as the military emphasizes a renewed interest in Human Dimension efforts.

  17. Can Tweets Predict Citations? Metrics of Social Impact Based on Twitter and Correlation with Traditional Metrics of Scientific Impact

    Science.gov (United States)

    2011-01-01

    Background Citations in peer-reviewed articles and the impact factor are generally accepted measures of scientific impact. Web 2.0 tools such as Twitter, blogs or social bookmarking tools provide the possibility to construct innovative article-level or journal-level metrics to gauge impact and influence. However, the relationship of these new metrics to traditional metrics such as citations is not known. Objective (1) To explore the feasibility of measuring social impact of and public attention to scholarly articles by analyzing buzz in social media, (2) to explore the dynamics, content, and timing of tweets relative to the publication of a scholarly article, and (3) to explore whether these metrics are sensitive and specific enough to predict highly cited articles. Methods Between July 2008 and November 2011, all tweets containing links to articles in the Journal of Medical Internet Research (JMIR) were mined. For a subset of 1573 tweets about 55 articles published between issues 3/2009 and 2/2010, different metrics of social media impact were calculated and compared against subsequent citation data from Scopus and Google Scholar 17 to 29 months later. A heuristic to predict the top-cited articles in each issue through tweet metrics was validated. Results A total of 4208 tweets cited 286 distinct JMIR articles. The distribution of tweets over the first 30 days after article publication followed a power law (Zipf, Bradford, or Pareto distribution), with most tweets sent on the day when an article was published (1458/3318, 43.94% of all tweets in a 60-day period) or on the following day (528/3318, 15.9%), followed by a rapid decay. The Pearson correlations between tweetations and citations were moderate and statistically significant, with correlation coefficients ranging from .42 to .72 for the log-transformed Google Scholar citations, but were less clear for Scopus citations and rank correlations. A linear multivariate model with time and tweets as significant

  18. Orbital forcing of climate 1.4 billion years ago

    DEFF Research Database (Denmark)

    Zhang, Shuichang; Wang, Xiaomei; Hammarlund, Emma U

    2015-01-01

    Fluctuating climate is a hallmark of Earth. As one transcends deep into Earth time, however, both the evidence for and the causes of climate change become difficult to establish. We report geochemical and sedimentological evidence for repeated, short-term climate fluctuations from the exceptionally well-preserved ∼1.4-billion-year-old Xiamaling Formation of the North China Craton. We observe two patterns of climate fluctuations: On long time scales, over what amounts to tens of millions of years, sediments of the Xiamaling Formation record changes in geochemistry consistent with long-term changes ... reflect what appear to be orbitally forced changes in wind patterns and ocean circulation as they influenced rates of organic carbon flux, trace metal accumulation, and the source of detrital particles to the sediment.

  19. Rainbows without unicorns: metric structures in theories with modified dispersion relations

    International Nuclear Information System (INIS)

    Lobo, Iarley P.; Loret, Niccolo; Nettel, Francisco

    2017-01-01

    Rainbow metrics are a widely used approach to the metric formalism for theories with modified dispersion relations. They have had a huge success in the quantum gravity phenomenology literature, since they allow one to introduce momentum-dependent space-time metrics into the description of systems with a modified dispersion relation. In this paper, we introduce the reader to some realizations of this general idea: the original rainbow metrics proposal, the momentum-space-inspired metric and a Finsler geometry approach. As the main result of this work we also present an alternative definition of a four-velocity dependent metric which allows one to handle the massless limit. This paper aims to highlight some of their properties and how to properly describe their relativistic realizations. (orig.)

  20. Rainbows without unicorns: metric structures in theories with modified dispersion relations

    Science.gov (United States)

    Lobo, Iarley P.; Loret, Niccoló; Nettel, Francisco

    2017-07-01

    Rainbow metrics are a widely used approach to the metric formalism for theories with modified dispersion relations. They have had a huge success in the quantum gravity phenomenology literature, since they allow one to introduce momentum-dependent space-time metrics into the description of systems with a modified dispersion relation. In this paper, we introduce the reader to some realizations of this general idea: the original rainbow metrics proposal, the momentum-space-inspired metric and a Finsler geometry approach. As the main result of this work we also present an alternative definition of a four-velocity dependent metric which allows one to handle the massless limit. This paper aims to highlight some of their properties and how to properly describe their relativistic realizations.

  1. Rainbows without unicorns: metric structures in theories with modified dispersion relations

    Energy Technology Data Exchange (ETDEWEB)

    Lobo, Iarley P. [Università 'La Sapienza', Dipartimento di Fisica, Rome (Italy); ICRANet, Pescara (Italy); CAPES Foundation, Ministry of Education of Brazil, Brasilia (Brazil); Universidade Federal da Paraiba, Departamento de Fisica, Joao Pessoa, PB (Brazil); INFN Sezione Roma 1 (Italy); Loret, Niccolo [Ruder Boskovic Institute, Division of Theoretical Physics, Zagreb (Croatia); Nettel, Francisco [Università 'La Sapienza', Dipartimento di Fisica, Rome (Italy); Universidad Nacional Autonoma de Mexico, Instituto de Ciencias Nucleares, Mexico (Mexico); INFN Sezione Roma 1 (Italy)

    2017-07-15

    Rainbow metrics are a widely used approach to the metric formalism for theories with modified dispersion relations. They have had a huge success in the quantum gravity phenomenology literature, since they allow one to introduce momentum-dependent space-time metrics into the description of systems with a modified dispersion relation. In this paper, we introduce the reader to some realizations of this general idea: the original rainbow metrics proposal, the momentum-space-inspired metric and a Finsler geometry approach. As the main result of this work we also present an alternative definition of a four-velocity dependent metric which allows one to handle the massless limit. This paper aims to highlight some of their properties and how to properly describe their relativistic realizations. (orig.)

  2. Temporal variability of daily personal magnetic field exposure metrics in pregnant women.

    Science.gov (United States)

    Lewis, Ryan C; Evenson, Kelly R; Savitz, David A; Meeker, John D

    2015-01-01

    Recent epidemiology studies of power-frequency magnetic fields and reproductive health have characterized exposures using data collected from personal exposure monitors over a single day, possibly resulting in exposure misclassification due to temporal variability in daily personal magnetic field exposure metrics, but relevant data in adults are limited. We assessed the temporal variability of daily central tendency (time-weighted average, median) and peak (upper percentiles, maximum) personal magnetic field exposure metrics over 7 consecutive days in 100 pregnant women. When exposure was modeled as a continuous variable, central tendency metrics had substantial reliability, whereas peak metrics had fair (maximum) to moderate (upper percentiles) reliability. The predictive ability of a single-day metric to accurately classify participants into exposure categories based on a weeklong metric depended on the selected exposure threshold, with sensitivity decreasing with increasing exposure threshold. Consistent with the continuous measures analysis, sensitivity was higher for central tendency metrics than for peak metrics. If there is interest in peak metrics, more than 1 day of measurement is needed over the window of disease susceptibility to minimize measurement error, but 1 day may be sufficient for central tendency metrics.
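The time-weighted average central-tendency metric used in such exposure studies is just an average of the measured levels weighted by how long each level persisted. A minimal sketch of that computation (not the monitors' own algorithm; units are illustrative):

```python
def time_weighted_average(readings):
    """readings: (duration, level) pairs; returns the TWA over the
    whole monitoring period."""
    total_time = sum(duration for duration, _ in readings)
    return sum(duration * level for duration, level in readings) / total_time

# 30 minutes at 0.2 uT and 90 minutes at 0.6 uT average to 0.5 uT
print(time_weighted_average([(30, 0.2), (90, 0.6)]))
```

Peak metrics (the maximum or an upper percentile of the same readings) vary much more from day to day than this weighted mean, which is the study's central finding.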

  3. Development of soil quality metrics using mycorrhizal fungi

    Energy Technology Data Exchange (ETDEWEB)

    Baar, J.

    2010-07-01

    Based on the Treaty on Biological Diversity of Rio de Janeiro in 1992 for maintaining and increasing biodiversity, several countries have started programmes monitoring soil quality and the above- and below-ground biodiversity. Within the European Union, policy makers are working on legislation for soil protection and management. Therefore, indicators are needed to monitor the status of the soils, and these indicators, reflecting the soil quality, can be integrated in working standards or soil quality metrics. Soil micro-organisms, particularly arbuscular mycorrhizal fungi (AMF), are indicative of soil changes. These soil fungi live in symbiosis with the great majority of plants and are sensitive to changes in the physico-chemical conditions of the soil. The aim of this study was to investigate whether AMF are reliable and sensitive indicators for disturbances in the soils and can be used for the development of soil quality metrics. It was also studied whether soil quality metrics based on AMF meet requirements for applicability by users and policy makers. Ecological criteria were set for the development of soil quality metrics for different soils. Multiple root samples containing AMF from various locations in The Netherlands were analyzed. The results of the analyses were related to the defined criteria. This resulted in two soil quality metrics, one for sandy soils and a second one for clay soils, with six different categories ranging from very bad to very good. These soil quality metrics meet the majority of requirements for applicability and are potentially useful for the development of legislation for the protection of soil quality. (Author) 23 refs.

  4. Measures of agreement between computation and experiment: validation metrics.

    Energy Technology Data Exchange (ETDEWEB)

    Barone, Matthew Franklin; Oberkampf, William Louis

    2005-08-01

    With the increasing role of computational modeling in engineering design, performance estimation, and safety assessment, improved methods are needed for comparing computational results and experimental measurements. Traditional methods of graphically comparing computational and experimental results, though valuable, are essentially qualitative. Computable measures are needed that can quantitatively compare computational and experimental results over a range of input, or control, variables and sharpen assessment of computational accuracy. This type of measure has been recently referred to as a validation metric. We discuss various features that we believe should be incorporated in a validation metric and also features that should be excluded. We develop a new validation metric that is based on the statistical concept of confidence intervals. Using this fundamental concept, we construct two specific metrics: one that requires interpolation of experimental data and one that requires regression (curve fitting) of experimental data. We apply the metrics to three example problems: thermal decomposition of a polyurethane foam, a turbulent buoyant plume of helium, and compressibility effects on the growth rate of a turbulent free-shear layer. We discuss how the present metrics are easily interpretable for assessing computational model accuracy, as well as the impact of experimental measurement uncertainty on the accuracy assessment.
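The confidence-interval idea can be illustrated in miniature: at a single input setting with replicated experiments, a t-based interval on the model-experiment difference tells you whether the observed disagreement is distinguishable from measurement noise. The authors' metrics handle whole response curves via interpolation or regression; this one-point sketch is ours, with illustrative names:

```python
import math
import statistics

def validation_interval(model_value, replicates, t_crit):
    """Two-sided confidence interval on (model - experimental mean) at one
    input setting; t_crit is the t quantile for len(replicates) - 1 dof."""
    n = len(replicates)
    diff = model_value - statistics.mean(replicates)
    half_width = t_crit * statistics.stdev(replicates) / math.sqrt(n)
    return diff - half_width, diff + half_width

# model predicts 10.5; two experimental replicates measured 9.0 and 11.0
lo, hi = validation_interval(10.5, [9.0, 11.0], t_crit=2.0)
# the interval straddles zero, so the model is consistent with the data
print(lo, hi)
```

An interval that excludes zero would quantify, with stated confidence, that model and experiment genuinely disagree at that setting.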

  5. Molecular dynamics beyond the limits: Massive scaling on 72 racks of a BlueGene/P and supercooled glass dynamics of a 1-billion-particle system

    KAUST Repository

    Allsopp, Nicholas

    2012-04-01

    We report scaling results on the world's largest supercomputer of our recently developed Billions-Body Molecular Dynamics (BBMD) package, which was especially designed for massively parallel simulations of the short-range atomic dynamics in structural glasses and amorphous materials. The code was able to scale up to 72 racks of an IBM BlueGene/P, with a measured 89% efficiency for a system with 100 billion particles. The code speed, 0.13 s per iteration in the case of 1 billion particles, paves the way to the study of billion-body structural glasses with a resolution increase of two orders of magnitude with respect to the largest simulation ever reported. We demonstrate the effectiveness of our code by studying the liquid-glass transition of an exceptionally large system made of a binary mixture of 1 billion particles. © 2012.
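    Back-of-envelope arithmetic on the figures quoted above gives a feel for the throughput; the weak-scaling helper and the small-partition timing in it are illustrative assumptions, not measurements from the paper.

```python
# Throughput implied by 1 billion particles at 0.13 s per MD iteration.
particles = 1_000_000_000
seconds_per_step = 0.13
updates_per_second = particles / seconds_per_step
print(f"{updates_per_second:.2e} particle-updates/s")  # ≈ 7.69e+09

def weak_scaling_efficiency(t_small, t_large):
    """Ideal weak scaling keeps time per step constant as cores and
    particles grow together, so efficiency is the ratio of the times."""
    return t_small / t_large

# A hypothetical 0.116 s/step on a small partition vs 0.13 s/step at full
# scale would correspond to the reported ~89% efficiency.
print(f"{weak_scaling_efficiency(0.116, 0.13):.0%}")  # 89%
```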

  6. Tide or Tsunami? The Impact of Metrics on Scholarly Research

    Science.gov (United States)

    Bonnell, Andrew G.

    2016-01-01

    Australian universities are increasingly resorting to the use of journal metrics such as impact factors and ranking lists in appraisal and promotion processes, and are starting to set quantitative "performance expectations" which make use of such journal-based metrics. The widespread use and misuse of research metrics is leading to…

  7. Term Based Comparison Metrics for Controlled and Uncontrolled Indexing Languages

    Science.gov (United States)

    Good, B. M.; Tennis, J. T.

    2009-01-01

    Introduction: We define a collection of metrics for describing and comparing sets of terms in controlled and uncontrolled indexing languages and then show how these metrics can be used to characterize a set of languages spanning folksonomies, ontologies and thesauri. Method: Metrics for term set characterization and comparison were identified and…

  8. Software metrics: The key to quality software on the NCC project

    Science.gov (United States)

    Burns, Patricia J.

    1993-01-01

    Network Control Center (NCC) Project metrics are captured during the implementation and testing phases of the NCCDS software development lifecycle. The metrics data collection and reporting function has interfaces with all elements of the NCC project. Close collaboration with all project elements has resulted in the development of a defined and repeatable set of metrics processes. The resulting data are used to plan and monitor release activities on a weekly basis. The use of graphical outputs facilitates the interpretation of progress and status. The successful application of metrics throughout the NCC project has been instrumental in the delivery of quality software. The use of metrics on the NCC Project supports the needs of the technical and managerial staff. This paper describes the project, the functions supported by metrics, the data that are collected and reported, how the data are used, and the improvements in the quality of deliverable software since the metrics processes and products have been in use.
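    A weekly metrics roll-up of the kind described, where per-release data feed a trend that is reviewed each week, might look like the following sketch; the defect-density metric and all numbers are hypothetical, not NCC data.

```python
# Hypothetical weekly roll-up: defects found per thousand source lines
# (KSLOC), the kind of defined, repeatable metric used to monitor a release.
weekly = [
    {"week": 1, "defects": 14, "ksloc": 120},
    {"week": 2, "defects": 9,  "ksloc": 123},
    {"week": 3, "defects": 4,  "ksloc": 125},
]
for row in weekly:
    row["density"] = row["defects"] / row["ksloc"]

# A strictly falling density across weeks signals improving quality.
trend_improving = all(a["density"] > b["density"]
                      for a, b in zip(weekly, weekly[1:]))
print(trend_improving)  # True
```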

  9. 48 CFR 611.002-70 - Metric system implementation.

    Science.gov (United States)

    2010-10-01

    ... with security, operations, economic, technical, logistical, training and safety requirements. (3) The... total cost of the retrofit, including redesign costs, exceeds $50,000; (ii) Metric is not the accepted... office with an explanation for the disapproval. (7) The in-house operating metric costs shall be...

  10. Metrics for assessing retailers based on consumer perception

    Directory of Open Access Journals (Sweden)

    Klimin Anastasii

    2017-01-01

    Full Text Available The article suggests a new way of looking at trading platforms, called “metrics.” Metrics view the point of sale largely from the buyer’s side. The buyer enters the store and makes buying decisions based on factors that the seller often does not consider, or considers only in part, because the seller “does not see” them, not being a buyer. The article proposes a classification of retailers, metrics, and a methodology for determining them, and presents the results of an audit of retailers in St. Petersburg using the proposed methodology.

  11. Chaos of discrete dynamical systems in complete metric spaces

    International Nuclear Information System (INIS)

    Shi Yuming; Chen Guanrong

    2004-01-01

    This paper is concerned with chaos of discrete dynamical systems in complete metric spaces. Discrete dynamical systems governed by continuous maps in general complete metric spaces are first discussed, and two criteria of chaos are established. As a special case, two corresponding criteria of chaos for discrete dynamical systems on compact subsets of metric spaces are obtained. These results extend and improve existing results on chaos in finite-dimensional Euclidean spaces.

  12. Energy tax price tag for CPI: $1.2 billion, jobs, and production

    International Nuclear Information System (INIS)

    Begley, R.

    1993-01-01

    If President Clinton's proposed energy tax had been fully in place last year, it would have cost the US chemical industry an additional $1.2 billion and 9,900 jobs, according to Chemical Manufacturers Association (CMA; Washington) estimates. It also would have driven output down 3% and prices up 5%, CMA says. Allen Lenz, CMA director/trade and economics, says the increase in production costs that would accompany the tax would not be shared by foreign competitors, cannot be neutralized with higher border taxes because of existing trade agreements, and provides another reason to move production offshore. Worse, the US chemical industry's generally impressive trade surplus declined by $2.5 billion last year, and a further drop is projected for this year. The margin of error gets thinner all the time as competition increases, Lenz says. "We're not concerned only with the chemical industry, but with the rest of US-based manufacturing, because it takes half our output," he adds. One problem is the energy intensiveness of the chemical process industries: a CMA report says that 55% of the cost of producing ethylene glycol is energy related. And double taxation of such things as coproducts returned for credit to oil refineries could add up to $115 million/year, the report says.

  13. The correlation of metrics in complex networks with applications in functional brain networks

    International Nuclear Information System (INIS)

    Li, C; Wang, H; Van Mieghem, P; De Haan, W; Stam, C J

    2011-01-01

    An increasing number of network metrics have been applied in network analysis. If metric relations were known better, we could more effectively characterize networks by a small set of metrics to discover the association between network properties/metrics and network functioning. In this paper, we investigate the linear correlation coefficients between widely studied network metrics in three network models (Barabási–Albert graphs, Erdős–Rényi random graphs and Watts–Strogatz small-world graphs) as well as in functional brain networks of healthy subjects. The metric correlations, which we have observed and theoretically explained, motivate us to propose a small representative set of metrics by including only one metric from each subset of mutually strongly dependent metrics. The following contributions are considered important. (a) A network with a given degree distribution can indeed be characterized by a small representative set of metrics. (b) Unweighted networks, which are obtained from weighted functional brain networks with a fixed threshold, and Erdős–Rényi random graphs follow a similar degree distribution. Moreover, their metric correlations and the resultant representative metrics are similar as well. This verifies the influence of degree distribution on metric correlations. (c) Most metric correlations can be explained analytically. (d) Interestingly, the most studied metrics so far, the average shortest path length and the clustering coefficient, are strongly correlated and thus redundant, whereas spectral metrics, though only studied recently in the context of complex networks, seem to be essential in network characterization. This representative set of metrics tends to both sufficiently and effectively characterize networks with a given degree distribution. In the study of a specific network, however, we have to at least consider the representative set so that important network properties will not be neglected.

  14. Fuel briquettes from wood and agricultural residues

    Energy Technology Data Exchange (ETDEWEB)

    Natividad, R.A.

    1982-01-01

    A short review of the production and uses of briquettes and of the machinery available for briquetting fine dry, coarse dry and coarse wet raw materials. The potential of a fuel briquette industry in the Philippines, with an estimated annual production of 217 million tons of sawdust, 2.09 billion tons of rice hulls and 2.87 million tons of coconut husks, is discussed. Studies at the Forest Products Research and Development Institute (FPRDI) have shown that sawdust, coir dust and rice hull briquettes with 1-2% resin binder have heating values of 6882, 5839 and 3913 cal/g, respectively.
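    For comparison with values usually quoted in SI units, the cal/g figures above convert to MJ/kg as sketched below; the thermochemical calorie (4.184 J) is an assumption, since the abstract does not state which calorie was used.

```python
CAL_TO_J = 4.184  # thermochemical calorie in joules (assumed)

def cal_per_g_to_mj_per_kg(hv_cal_per_g):
    # 1 cal/g = 4.184 J/g = 4.184 kJ/kg = 0.004184 MJ/kg
    return hv_cal_per_g * CAL_TO_J / 1000.0

for name, hv in [("sawdust", 6882), ("coir dust", 5839), ("rice hulls", 3913)]:
    print(f"{name}: {cal_per_g_to_mj_per_kg(hv):.1f} MJ/kg")
```

This puts the sawdust briquettes at roughly 28.8 MJ/kg, comparable to a typical bituminous coal, while rice hull briquettes come in around 16.4 MJ/kg.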

  15. Otherwise Engaged : Social Media from Vanity Metrics to Critical Analytics

    NARCIS (Netherlands)

    Rogers, R.

    2018-01-01

    Vanity metrics is a term that captures the measurement and display of how well one is doing in the “success theater” of social media. The notion of vanity metrics implies a critique of metrics concerning both the object of measurement and their capacity to measure unobtrusively or only to…

  16. IDENTIFYING MARKETING EFFECTIVENESS METRICS (Case study: East Azerbaijan`s industrial units)

    OpenAIRE

    Faridyahyaie, Reza; Faryabi, Mohammad; Bodaghi Khajeh Noubar, Hossein

    2012-01-01

    The paper attempts to identify marketing effectiveness metrics in industrial units. The metrics investigated in this study are completely applicable and comprehensive, and consequently they can evaluate marketing effectiveness in various industries. The metrics studied include: Market Share, Profitability, Sales Growth, Customer Numbers, Customer Satisfaction and Customer Loyalty. The findings indicate that these six metrics are impressive when measuring marketing effectiveness. Data was ge...

  17. A parts-per-billion measurement of the antiproton magnetic moment.

    Science.gov (United States)

    Smorra, C; Sellner, S; Borchert, M J; Harrington, J A; Higuchi, T; Nagahama, H; Tanaka, T; Mooser, A; Schneider, G; Bohman, M; Blaum, K; Matsuda, Y; Ospelkaus, C; Quint, W; Walz, J; Yamazaki, Y; Ulmer, S

    2017-10-18

    Precise comparisons of the fundamental properties of matter-antimatter conjugates provide sensitive tests of charge-parity-time (CPT) invariance, which is an important symmetry that rests on basic assumptions of the standard model of particle physics. Experiments on mesons, leptons and baryons have compared different properties of matter-antimatter conjugates with fractional uncertainties at the parts-per-billion level or better. One specific quantity, however, has so far only been known to a fractional uncertainty at the parts-per-million level: the magnetic moment of the antiproton, μ_p̄. The extraordinary difficulty in measuring μ_p̄ with high precision is caused by its intrinsic smallness; for example, it is 660 times smaller than the magnetic moment of the positron. Here we report a high-precision measurement of μ_p̄ in units of the nuclear magneton μ_N with a fractional precision of 1.5 parts per billion (68% confidence level). We use a two-particle spectroscopy method in an advanced cryogenic multi-Penning trap system. Our result, μ_p̄ = -2.7928473441(42) μ_N (where the number in parentheses represents the 68% confidence interval on the last digits of the value), improves the precision of the previous best measurement by a factor of approximately 350. The measured value is consistent with the proton magnetic moment, μ_p = 2.792847350(9) μ_N, and is in agreement with CPT invariance. Consequently, this measurement constrains the magnitude of certain CPT-violating effects to below 1.8 × 10^-24 gigaelectronvolts, and a possible splitting of the proton-antiproton magnetic moments by CPT-odd dimension-five interactions to below 6 × 10^-12 Bohr magnetons.
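    The quoted 1.5 p.p.b. follows directly from the parenthetical uncertainty notation, as this short check illustrates (the variable names are ours):

```python
# -2.7928473441(42) μ_N: the (42) is the 1-sigma uncertainty on the
# last two quoted digits, i.e. 0.0000000042 μ_N.
value = 2.7928473441
sigma = 0.0000000042
ppb = sigma / value * 1e9  # fractional uncertainty in parts per billion
print(f"{ppb:.2f} parts per billion")  # ≈ 1.50
```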

  18. Some observations on a fuzzy metric space

    Energy Technology Data Exchange (ETDEWEB)

    Gregori, V.

    2017-07-01

    Let $(X,d)$ be a metric space. In this paper we provide some observations about the fuzzy metric space in the sense of Kramosil and Michalek $(Y,N,\wedge)$, where $Y$ is the set of non-negative real numbers $[0,\infty[$ and $N(x,y,t)=1$ if $d(x,y)\leq t$ and $N(x,y,t)=0$ if $d(x,y)>t$. (Author)
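    A quick numerical sanity check of this induced fuzzy metric, using the ordinary metric on the reals, confirms the fuzzy triangle inequality N(x,z,t+s) ≥ min(N(x,y,t), N(y,z,s)) on random inputs; the test harness is our own sketch, not part of the paper.

```python
import random

def d(x, y):
    return abs(x - y)  # ordinary metric on the reals

def N(x, y, t):
    # Induced fuzzy metric: jumps to 1 once t reaches the distance d(x, y)
    return 1.0 if d(x, y) <= t else 0.0

# Spot-check the fuzzy triangle inequality with the min t-norm (wedge):
rng = random.Random(0)
ok = all(
    N(x, z, t + s) >= min(N(x, y, t), N(y, z, s))
    for _ in range(1000)
    for x, y, z, t, s in [[rng.uniform(0, 5) for _ in range(5)]]
)
print(ok)  # True
```

The check always passes because the ordinary triangle inequality d(x,z) ≤ d(x,y) + d(y,z) forces N(x,z,t+s) = 1 whenever both terms on the right-hand side equal 1.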

  19. A 17-billion-solar-mass black hole in a group galaxy with a diffuse core.

    Science.gov (United States)

    Thomas, Jens; Ma, Chung-Pei; McConnell, Nicholas J; Greene, Jenny E; Blakeslee, John P; Janish, Ryan

    2016-04-21

    Quasars are associated with and powered by the accretion of material onto massive black holes; the detection of highly luminous quasars with redshifts greater than z = 6 suggests that black holes of up to ten billion solar masses already existed 13 billion years ago. Two possible present-day 'dormant' descendants of this population of 'active' black holes have been found in the galaxies NGC 3842 and NGC 4889 at the centres of the Leo and Coma galaxy clusters, which together form the central region of the Great Wall--the largest local structure of galaxies. The most luminous quasars, however, are not confined to such high-density regions of the early Universe; yet dormant black holes of this high mass have not yet been found outside of modern-day rich clusters. Here we report observations of the stellar velocity distribution in the galaxy NGC 1600--a relatively isolated elliptical galaxy near the centre of a galaxy group at a distance of 64 megaparsecs from Earth. We use orbit superposition models to determine that the black hole at the centre of NGC 1600 has a mass of 17 billion solar masses. The spatial distribution of stars near the centre of NGC 1600 is rather diffuse. We find that the region of depleted stellar density in the cores of massive elliptical galaxies extends over the same radius as the gravitational sphere of influence of the central black holes, and interpret this as the dynamical imprint of the black holes.

  20. 22 CFR 226.15 - Metric system of measurement.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Metric system of measurement. 226.15 Section 226.15 Foreign Relations AGENCY FOR INTERNATIONAL DEVELOPMENT ADMINISTRATION OF ASSISTANCE AWARDS TO U.S. NON-GOVERNMENTAL ORGANIZATIONS Pre-award Requirements § 226.15 Metric system of measurement. (a...