WorldWideScience

Sample records for minimum mapping unit

  1. State cigarette minimum price laws - United States, 2009.

    Science.gov (United States)

    2010-04-09

    Cigarette price increases reduce the demand for cigarettes and thereby reduce smoking prevalence, cigarette consumption, and youth initiation of smoking. Excise tax increases are the most effective government intervention to increase the price of cigarettes, but cigarette manufacturers use trade discounts, coupons, and other promotions to counteract the effects of these tax increases and appeal to price-sensitive smokers. State cigarette minimum price laws, initiated by states in the 1940s and 1950s to protect tobacco retailers from predatory business practices, typically require a minimum percentage markup to be added to the wholesale and/or retail price. If a statute prohibits trade discounts from the minimum price calculation, these laws have the potential to counteract discounting by cigarette manufacturers. To assess the status of cigarette minimum price laws in the United States, CDC surveyed state statutes and identified those states with minimum price laws in effect as of December 31, 2009. This report summarizes the results of that survey, which determined that 25 states had minimum price laws for cigarettes (median wholesale markup: 4.00%; median retail markup: 8.00%), and seven of those states also expressly prohibited the use of trade discounts in the minimum retail price calculation. Minimum price laws can help prevent trade discounting from eroding the positive effects of state excise tax increases and higher cigarette prices on public health.
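The markup calculation described in this record can be sketched in a few lines. The snippet below is an illustration with invented prices (only the 4.00%/8.00% median markups come from the abstract), not CDC's methodology:

```python
def minimum_retail_price(manufacturer_price, wholesale_markup, retail_markup,
                         trade_discount=0.0, discounts_prohibited=True):
    """Apply wholesale then retail percentage markups to a base price.

    If the statute prohibits trade discounts from the minimum price
    calculation, markups are applied to the undiscounted price.
    """
    base = (manufacturer_price if discounts_prohibited
            else manufacturer_price - trade_discount)
    return base * (1 + wholesale_markup) * (1 + retail_markup)

# A hypothetical $5.00 pack with a $0.50 manufacturer discount,
# using the median markups reported in the survey:
with_ban = minimum_retail_price(5.00, 0.04, 0.08, 0.50, True)       # ≈ $5.62
without_ban = minimum_retail_price(5.00, 0.04, 0.08, 0.50, False)   # ≈ $5.05
```

Prohibiting the discount keeps the price floor higher, which is the mechanism the abstract describes for preserving the effect of excise tax increases.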

  2. Genetic maps and physical units

    International Nuclear Information System (INIS)

    Karunakaran, V.; Holt, G.

    1976-01-01

    The relationships between physical and genetic units are examined. Genetic mapping involves the detection of linkage of genes and the measurement of recombination frequencies. The genetic distance is measured in map units and is proportional to the recombination frequencies between linked markers. Physical mapping of genophores, particularly the simple genomes of bacteriophages and bacterial plasmids, can be achieved through heteroduplex analysis. Genetic distances are dependent on recombination frequencies and, therefore, can only be correlated accurately with physical unit lengths if the recombination frequency is constant throughout the entire genome. Methods are available to calculate the equivalent length of DNA per average map unit in different organisms; such estimates indicate significant differences from one organism to another. Gene lengths can also be calculated from the number of amino acids in a specified polypeptide, by relating this to the number of nucleotides required to code for such a polypeptide. Many attempts have been made to relate microdosimetric measurements to radiobiological data. For irradiation effects involving deletion of genetic material, such a detailed correlation may be possible in systems where heteroduplex analysis or amino acid sequencing can be performed. The problems that DNA packaging and other functional associations within the cell pose for interpreting such data are also discussed.
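The gene-length estimate mentioned above is simple arithmetic: three nucleotides code for each amino acid. A minimal sketch (the optional stop-codon term is our addition for illustration):

```python
def coding_length_nt(num_amino_acids, include_stop=True):
    """Minimum coding length in nucleotides for a polypeptide:
    three nucleotides per amino acid, plus an optional stop codon."""
    return 3 * num_amino_acids + (3 if include_stop else 0)

# A 300-residue polypeptide needs at least:
coding_length_nt(300)   # 903 nucleotides (900 without the stop codon)
```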

  3. Minimum Map of Social Institutional Network: a multidimensional strategy for research in Nursing

    Directory of Open Access Journals (Sweden)

    Diene Monique Carlos

    2016-06-01

    Objective: To analyze the use of a methodological strategy in qualitative research - the Minimum Map of the Social Institutional Network - as a proposal for understanding phenomena from a multidimensional perspective. Method: Methodological theoretical essay reflecting on the use of innovative methodological strategies in nursing research, grounded in the fundamentals of the Complex Paradigm. Results: The minimum map of the External Social Institutional Network aims to identify institutional linkages and gaps for the intervention work of the surveyed institutions. The use of these maps has provided important advances in the know-how of qualitative research in Health and Nursing. Conclusions: In this perspective, the use of minimum Social Institutional Network maps can be encouraged and enhanced to meet the current demands of the contemporary world, particularly for their flexibility in adapting to diverse research subjects, their breadth and depth of discussion, and their possibilities for articulation with health services.

  4. Minimum Map of Social Institutional Network: a multidimensional strategy for research in Nursing.

    Science.gov (United States)

    Carlos, Diene Monique; Pádua, Elisabete Matallo Marchesini de; Nakano, Ana Márcia Spanó; Ferriani, Maria das Graças Carvalho

    2016-06-01

    To analyze the use of a methodological strategy in qualitative research - the Minimum Map of the Social Institutional Network - as a proposal for understanding phenomena from a multidimensional perspective. Methodological theoretical essay reflecting on the use of innovative methodological strategies in nursing research, grounded in the fundamentals of Complex Thought. The minimum map of the External Social Institutional Network aims to identify institutional linkages and gaps for the intervention work of the surveyed institutions. The use of these maps has provided important advances in the know-how of qualitative research in Health and Nursing. In this perspective, the use of minimum Social Institutional Network maps can be encouraged and enhanced to meet the current demands of the contemporary world, particularly for their flexibility in adapting to diverse research subjects, their breadth and depth of discussion, and their possibilities of articulation with the practice of health services.

  5. Minimum number of transfer units and reboiler duty for multicomponent distillation columns

    International Nuclear Information System (INIS)

    Pleşu, Valentin; Bonet Ruiz, Alexandra Elena; Bonet, Jordi; Llorens, Joan; Iancu, Petrica

    2013-01-01

    Some guidelines to evaluate distillation columns, considering only basic thermodynamic data and principles, are provided in this paper. The method allows a first insight into the problem by simple calculations, without requiring column variables, to ensure rational use of energy and low environmental impact. The separation system is approached in two complementary ways: minimum and infinite reflux flow rate. The minimum reflux provides the minimum energy requirements, and the infinite reflux provides the feasibility conditions. The difficulty of separation can be expressed in terms of the number of transfer units (NTU). The applicability of the method is not mathematically limited by the number of components in the mixture, and it extends to reactive distillation. Several mixtures, including reactive distillation cases, are rigorously simulated as illustrative examples to verify the applicability of the approach. The separation of a mixture by distillation is feasible if a minimum NTU can be calculated between the distillate and bottom products. Once the feasibility of the separation has been verified, the maximum thermal efficiency depends only on the boiling points of the bottom and distillate streams. The minimum energy requirement of the reboiler can then be calculated from the maximum thermal efficiency and the variation of entropy and enthalpy of mixing between distillate and bottom streams. -- Highlights: • Feasibility analysis complemented with difficulty-of-separation parameters • Minimum and infinite reflux simplified models for distillation columns • Minimum number of transfer units (NTU) for packed columns at early design stages • Calculation of minimum distillation energy requirements at early design stages • Thermodynamic cycle approach and efficiency for distillation columns
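As a rough companion to the abstract, the sketch below counts transfer units for a binary mixture at total (infinite) reflux under a constant relative volatility; this is a textbook simplification with invented mixture values, not the authors' multicomponent method:

```python
import numpy as np

def ntu_total_reflux(x_bottom, x_distillate, alpha, n=10_000):
    """NTU = integral of dx / (y*(x) - x) between bottoms and distillate,
    with the operating line y = x at total reflux and a constant
    relative-volatility equilibrium curve y* = alpha x / (1 + (alpha-1) x)."""
    x = np.linspace(x_bottom, x_distillate, n)
    y_eq = alpha * x / (1 + (alpha - 1) * x)
    f = 1.0 / (y_eq - x)
    # trapezoidal rule, written out to avoid NumPy version differences
    return float((0.5 * (f[:-1] + f[1:]) * np.diff(x)).sum())

# Separating a binary mixture (alpha = 2.5) from 5% to 95% light component:
ntu_total_reflux(0.05, 0.95, 2.5)   # ≈ 6.9 transfer units
```

A finite NTU between the two product compositions indicates the separation is feasible, mirroring the feasibility test the abstract describes.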

  6. Bedrock Geologic Map of Vermont - Units

    Data.gov (United States)

    Vermont Center for Geographic Information — The bedrock geology was last mapped at a statewide scale 50 years ago at a scale of 1:250,000 (Doll and others, 1961). The 1961 map was compiled from 1:62,500-scale...

  7. Mapping severe fire potential across the contiguous United States

    Science.gov (United States)

    Brett H. Davis

    2016-01-01

    The Fire Severity Mapping System (FIRESEV) project is an effort to provide critical information and tools to fire managers that enhance their ability to assess potential ecological effects of wildland fire. A major component of FIRESEV is the development of a Severe Fire Potential Map (SFPM), a geographic dataset covering the contiguous United States (CONUS) that...

  8. Heavy Drinkers and the Potential Impact of Minimum Unit Pricing-No Single or Simple Effect?

    Science.gov (United States)

    Gill, J; Black, H; Rush, R; O'May, F; Chick, J

    2017-11-01

    To explore the potential impact of a minimum unit price (MUP: 50 pence per UK unit) on the alcohol consumption of ill Scottish heavy drinkers. Participants were 639 patients attending alcohol treatment services or admitted to hospital with an alcohol-related condition. From their reported expenditure on alcohol in their index week, and assuming this remained unchanged, we estimated the impact of a MUP (50 ppu) on future consumption. Around 15% purchased from both the more expensive on-sale outlets (hotels, pubs, bars) and from off-sales (shops and supermarkets); for them we estimated the change in consumption that might follow MUP if (i) they continued this proportion of on-sales purchasing or (ii) their reported expenditure moved entirely to off-sale purchasing (to maintain consumption levels). Around 69% of drinkers purchased exclusively off-sale alcohol; for some, changed purchasing could support an increase in consumption. While a proportion of our harmed, heavy drinkers might be able to mitigate the impact of MUP by changing purchasing habits, the majority are predicted to reduce purchasing. This analysis, focusing specifically on harmed drinkers, adds a unique dimension to the evidence base informing current pricing policy. From drink purchasing data of heavy drinkers, we estimated the impact of legislating a £0.50 minimum unit price. Over two thirds of drinkers, representing all multiple deprivation quintiles, were predicted to decrease alcohol purchasing; the remainder, hypothetically, could maintain consumption. Our data address an important gap within the evidence base informing policy. © The Author 2017. Medical Council on Alcohol and Oxford University Press. All rights reserved.
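The expenditure-based estimate described above amounts to dividing a fixed weekly spend by the higher of the current unit price and the price floor. A minimal sketch with invented figures (not the study's data):

```python
def units_after_mup(weekly_spend_pence, price_per_unit_pence, mup_pence=50.0):
    """Units of alcohol affordable at a fixed spend once a minimum
    unit price (price floor) applies."""
    return weekly_spend_pence / max(price_per_unit_pence, mup_pence)

# A hypothetical drinker spending 40 pounds/week at 33p per unit currently
# buys ~121 units; the same spend at the 50p floor buys 80 units,
# a reduction of about a third.
before = 4000 / 33
after = units_after_mup(4000, 33)
```

Drinkers already paying above the floor are unaffected, which is why the proportion of cheap off-sale purchasing drives the estimated impact.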

  9. USGS Governmental Unit Boundaries Overlay Map Service from The National Map

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The USGS Governmental Unit Boundaries service from The National Map (TNM) represents major civil areas for the Nation, including States or Territories, counties (or...

  10. Improving boiler unit performance using an optimum robust minimum-order observer

    International Nuclear Information System (INIS)

    Moradi, Hamed; Bakhtiari-Nejad, Firooz

    2011-01-01

    Research highlights: • Multivariable model of a boiler unit with uncertainty. • Design of a robust minimum-order observer. • Development of an optimal functional code in the MATLAB environment. • Finding the optimum region of observer-based controller poles. • Guarantee of robust performance in the presence of parametric uncertainties. - Abstract: To achieve good performance of a utility boiler, dynamic variables such as drum pressure, steam temperature and drum water level must be controlled. In this paper, a linear time-invariant (LTI) model of a boiler system is considered in which the input variables are the feed-water and fuel mass rates. Because some state variables of the boiler system are inaccessible, a minimum-order observer is designed based on Luenberger's model to obtain an estimate x̃ of the true state x. Low design cost and high accuracy of state estimation are the main advantages of the minimum-order observer in comparison with previously designed full-order observers. By applying the observer to the closed-loop system, a regulator is designed. Using an optimal functional code developed in the MATLAB environment, desired observer poles are found such that suitable time-response specifications of the boiler system are achieved and the gain and phase margin values are adjusted within an acceptable range. However, the real dynamic model may be associated with parametric uncertainties. In that case, the optimum region of the poles of the observer-based controller is found such that robust performance of the boiler system against model uncertainties is guaranteed.
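A Luenberger observer of the kind referred to above can be illustrated on a toy discrete-time system. The matrices, gain and input below are invented for a stable two-state example; the paper designs a reduced, minimum-order observer for a boiler model, whereas this sketch uses a full-order observer for brevity:

```python
import numpy as np

# Toy plant: x_{k+1} = A x_k + B u_k, measurement y_k = C x_k
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
# Observer gain chosen so the eigenvalues of (A - L C) lie inside the
# unit circle (here approximately 0.74 and 0.46), so estimation error decays.
L = np.array([[0.5], [0.2]])

def run_observer(x0, xhat0, steps=50):
    """Simulate plant and Luenberger observer; return final estimation error."""
    x, xhat = x0.copy(), xhat0.copy()
    for k in range(steps):
        u = np.array([[np.sin(0.1 * k)]])   # arbitrary known input signal
        y = C @ x                           # measurement of the true state
        # observer update: copy of the plant driven by the output error
        xhat = A @ xhat + B @ u + L @ (y - C @ xhat)
        x = A @ x + B @ u
    return float(np.linalg.norm(x - xhat))

err = run_observer(np.array([[1.0], [-1.0]]), np.zeros((2, 1)))
# err is tiny: the estimate has converged to the true (unmeasured) state
```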

  11. A Lithology Based Map Unit Schema For Onegeology Regional Geologic Map Integration

    Science.gov (United States)

    Moosdorf, N.; Richard, S. M.

    2012-12-01

    A system of lithogenetic categories for a global lithological map (GLiM, http://www.ifbm.zmaw.de/index.php?id=6460&L=3) has been compiled based on analysis of lithology/genesis categories for regional geologic maps for the entire globe. The scheme is presented for discussion and comment. Analysis of units on a variety of regional geologic maps indicates that units are defined based on assemblages of rock types as well as their genetic type. In this compilation of continental geology, outcropping surface materials are dominantly sediment/sedimentary rock; major subdivisions of the sedimentary category include clastic sediment, carbonate sedimentary rocks, clastic sedimentary rocks, mixed carbonate and clastic sedimentary rock, colluvium and residuum. Significant areas of mixed igneous and metamorphic rock are also present. A system of global categories to characterize the lithology of regional geologic units is important for Earth System models of matter fluxes to soils, ecosystems, rivers and oceans, and for regional analysis of Earth surface processes at global scale. Because different applications of the classification scheme will focus on different lithologic constituents in mixed units, an ontology-type representation of the scheme that assigns properties to the units in an analyzable manner will be pursued. The OneGeology project is promoting deployment of geologic map services at million scale for all nations. Although initial efforts are commonly simple scanned-map WMS services, the intention is to move towards data-based map services that categorize map units with standard vocabularies to allow use of a common map legend for better visual integration of the maps (e.g. see OneGeology Europe, http://onegeology-europe.brgm.fr/geoportal/viewer.jsp). Current categorization of regional units with a single lithology from the CGI SimpleLithology vocabulary (http://resource.geosciml.org/201202/Vocab2012html/SimpleLithology201012.html) poorly captures the...

  12. Globally optimal superconducting magnets part I: minimum stored energy (MSE) current density map.

    Science.gov (United States)

    Tieng, Quang M; Vegh, Viktor; Brereton, Ian M

    2009-01-01

    An optimal current density map is crucial in magnet design to provide the initial values within search spaces in an optimization process for determining the final coil arrangement of the magnet. A strategy is outlined for obtaining globally optimal current density maps for the purpose of designing magnets with coaxial cylindrical coils in which the stored energy is minimized within a constrained domain. The current density maps obtained using the proposed method suggest that peak current densities occur around the perimeter of the magnet domain, where adjacent peaks have alternating current directions for the most compact designs. As the dimensions of the domain are increased, the current density maps yield traditional magnet designs with positive current alone. These unique current density maps are obtained by minimizing the stored magnetic energy cost function and therefore suggest magnet coil designs of minimal system energy. Current density maps are provided for a number of different domain arrangements to illustrate the flexibility of the method and the quality of the achievable designs.
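The core idea, minimizing stored energy subject to field constraints, can be sketched as an equality-constrained quadratic program with a closed-form Lagrange solution. Everything below (the energy matrix M, constraint matrix A and target b) is invented for illustration and is far simpler than the paper's formulation:

```python
import numpy as np

# Minimize stored energy E = 0.5 J^T M J subject to field constraints A J = b.
# Stationarity of the Lagrangian gives M J = A^T lam and A J = b, hence
#   J = M^-1 A^T (A M^-1 A^T)^-1 b
rng = np.random.default_rng(0)
M = np.diag([1.0, 2.0, 3.0, 4.0])     # toy positive-definite "energy" matrix
A = rng.standard_normal((2, 4))       # toy linear field-constraint matrix
b = np.array([1.0, -0.5])             # toy target field values

Minv_At = np.linalg.solve(M, A.T)
J = Minv_At @ np.linalg.solve(A @ Minv_At, b)   # minimum-energy current vector

assert np.allclose(A @ J, b)          # constraints are satisfied exactly
```

Because J is the global minimizer over the feasible set, its energy is no greater than that of any other current vector satisfying the same constraints.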

  13. Karst mapping in the United States: Past, present and future

    Science.gov (United States)

    Weary, David J.; Doctor, Daniel H.

    2015-01-01

    The earliest known comprehensive karst map of the entire USA was published by Stringfield and LeGrand (1969), based on compilations of William E. Davies of the U.S. Geological Survey (USGS). Various versions of essentially the same map have been published since. The USGS recently published new digital maps and databases depicting the extent of known karst, potential karst, and pseudokarst areas of the United States of America including Puerto Rico and the U.S. Virgin Islands (Weary and Doctor, 2014). These maps are based primarily on the extent of potentially karstic soluble rock types, and rocks with physical properties conducive to the formation of pseudokarst features. These data were compiled and refined from multiple sources at various spatial resolutions, mostly as digital data supplied by state geological surveys. The database includes polygons delineating areas with potential for karst and that are tagged with attributes intended to facilitate classification of karst regions. Approximately 18% of the surface of the fifty United States is underlain by significantly soluble bedrock. In the eastern United States the extent of outcrop of soluble rocks provides a good first-approximation of the distribution of karst and potential karst areas. In the arid western states, the extent of soluble rock outcrop tends to overestimate the extent of regions that might be considered as karst under current climatic conditions, but the new dataset encompasses those regions nonetheless. This database will be revised as needed, and the present map will be updated as new information is incorporated.

  14. Environmental aspects of engineering geological mapping in the United States

    Science.gov (United States)

    Radbruch-Hall, Dorothy H.

    1979-01-01

    Many engineering geological maps at different scales have been prepared for various engineering and environmental purposes in regions of diverse geological conditions in the United States. They include maps of individual geological hazards and maps showing the effect of land development on the environment. An approach to assessing the environmental impact of land development that is used increasingly in the United States is the study of a single area by scientists from several disciplines, including geology. A study of this type has been made for the National Petroleum Reserve in northern Alaska. In the San Francisco Bay area, a technique has been worked out for evaluating the cost of different types of construction and land development in terms of the cost of a number of kinds of earth science factors. ?? 1979 International Association of Engineering Geology.

  15. Geomorphic Unit Tool (GUT): Applications of Fluvial Mapping

    Science.gov (United States)

    Kramer, N.; Bangen, S. G.; Wheaton, J. M.; Bouwes, N.; Wall, E.; Saunders, C.; Bennett, S.; Fortney, S.

    2017-12-01

    Geomorphic units are the building blocks of rivers and represent distinct habitat patches for many fluvial organisms. We present the Geomorphic Unit Toolkit (GUT), a flexible GIS geomorphic unit mapping tool, to generate maps of fluvial landforms from topography. GUT applies attributes to landforms based on flow stage (Tier 1), topographic signatures (Tier 2), geomorphic characteristics (Tier 3) and patch characteristics (Tier 4) to derive attributed maps at the level of detail required by analysts. We hypothesize that if more rigorous and consistent geomorphic mapping is conducted, better correlations between physical habitat units and ecohydraulic model results will be obtained compared to past work. Using output from GUT for coarse bed tributary streams in the Columbia River Basin, we explore relationships between salmonid habitat and geomorphic spatial metrics. We also highlight case studies of how GUT can be used to showcase geomorphic impact from large wood restoration efforts. Provided high resolution topography exists, this tool can be used to quickly assess changes in fluvial geomorphology in watersheds impacted by human activities.

  16. Stabilizing unstable fixed points of chaotic maps via minimum entropy control

    Energy Technology Data Exchange (ETDEWEB)

    Salarieh, Hassan [Center of Excellence in Design, Robotics and Automation, Department of Mechanical Engineering, Sharif University of Technology, P.O. Box 11365-9567, Tehran (Iran, Islamic Republic of)], E-mail: salarieh@mech.sharif.edu; Alasty, Aria [Center of Excellence in Design, Robotics and Automation, Department of Mechanical Engineering, Sharif University of Technology, P.O. Box 11365-9567, Tehran (Iran, Islamic Republic of)

    2008-08-15

    In this paper the problem of chaos control in nonlinear maps using minimization of an entropy function is investigated. The invariant probability measure of a chaotic dynamics can be used to produce an entropy function in the sense of Shannon. It is shown how this entropy control technique is utilized for chaos elimination. Using only the measured states of a chaotic map, the probability measure of the system is numerically estimated, and this estimated measure is used to obtain an estimate of the entropy of the chaotic map. The control variable of the chaotic system is determined in such a way that the entropy function descends until the chaotic trajectory of the map is replaced with a regular one. The proposed idea is applied to stabilizing the fixed points of the logistic and Hénon maps as case studies. Simulation results show the effectiveness of the method in chaos rejection when only statistical information is available from the systems under study.
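A simplified sketch of the idea: rather than the authors' estimation scheme, we scan a feedback gain for the logistic map at r = 4 and keep the gain that minimizes the Shannon entropy of the orbit's histogram. The control window and gain grid are our own choices, not the paper's algorithm:

```python
import numpy as np

def orbit_entropy(r, K, x0=0.3, n=2000, bins=50):
    """Shannon entropy of the orbit histogram of the controlled logistic map
    x_{k+1} = r x (1 - x) + u, with u = K (x - x*) applied only near the
    unstable fixed point x* = 1 - 1/r (small, local control)."""
    xstar = 1 - 1 / r
    x, xs = x0, []
    for _ in range(n):
        u = K * (x - xstar) if abs(x - xstar) < 0.1 else 0.0
        x = float(np.clip(r * x * (1 - x) + u, 0.0, 1.0))
        xs.append(x)
    counts, _ = np.histogram(xs[500:], bins=bins, range=(0.0, 1.0))
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# Scan gains: the linearized controlled slope at x* is (2 - r) + K, so for
# r = 4 gains near K = 2 stabilize the fixed point and collapse the orbit.
gains = np.linspace(0.0, 3.0, 31)
best_K = min(gains, key=lambda K: orbit_entropy(4.0, K))
# entropy drops from ~3.5 (chaotic orbit) toward 0 (orbit pinned at x*)
```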

  17. Potential benefits of minimum unit pricing for alcohol versus a ban on below cost selling in England 2014: modelling study.

    Science.gov (United States)

    Brennan, Alan; Meng, Yang; Holmes, John; Hill-McManus, Daniel; Meier, Petra S

    2014-09-30

    To evaluate the potential impact of two alcohol control policies under consideration in England: banning below cost selling of alcohol and minimum unit pricing. Modelling study using the Sheffield Alcohol Policy Model version 2.5. England 2014-15. Adults and young people aged 16 or more, including subgroups of moderate, hazardous, and harmful drinkers. Policy to ban below cost selling, which means that the selling price to consumers could not be lower than the tax payable on the product, compared with policies of minimum unit pricing at £0.40 (€0.57; $0.75), 45 p, and 50 p per unit (7.9 g/10 mL) of pure alcohol. Changes in mean consumption in terms of units of alcohol, drinkers' expenditure, and reductions in deaths, illnesses, admissions to hospital, and quality adjusted life years. The proportion of the market affected is a key driver of impact, with just 0.7% of all units estimated to be sold below the duty plus value added tax threshold implied by a ban on below cost selling, compared with 23.2% of units for a 45 p minimum unit price. Below cost selling is estimated to reduce harmful drinkers' mean annual consumption by just 0.08%, around 3 units per year, compared with 3.7% or 137 units per year for a 45 p minimum unit price (an approximately 45 times greater effect). The ban on below cost selling has a small effect on population health, saving an estimated 14 deaths and 500 admissions to hospital per annum. In contrast, a 45 p minimum unit price is estimated to save 624 deaths and 23,700 hospital admissions. Most of the harm reductions (for example, 89% of estimated deaths saved per annum) are estimated to occur in the 5.3% of people who are harmful drinkers. The ban on below cost selling, implemented in England in May 2014, is estimated to have small effects on consumption and health harm. The previously announced policy of a minimum unit price, if set at expected levels between 40 p and 50 p per unit, is estimated to have an approximately 40-50 times greater effect.

  18. MAPPING A BASIC HEALTH UNIT: AN EXPERIENCE REPORT

    Directory of Open Access Journals (Sweden)

    Bárbara Carvalho Malheiros

    2015-01-01

    Background and Objectives: This study is an experience report on the construction of a map of a Basic Health Unit (BHU). The objective was to understand the relevance and importance of mapping a BHU, to acquire more knowledge of the health-disease status of the registered population, and to identify the importance of cartography as a working tool. Case description: After reading selected texts, evaluating information systems and making on-site visits, it was possible to identify the health status of the population of the neighborhoods. The proposed objectives were considered achieved: the mapping captured the assessed population's health-disease situation from a closer-to-reality viewpoint, identifying the number of individuals, their diseases, living situation and health care. Conclusion: The mapping approach is a powerful working tool, allowing the planning of strategic interventions that enable the development of assistance activities aimed at health promotion and disease prevention. KEYWORDS: Mapping; Basic Health Unit; Health Planning.

  19. FLO1K, global maps of mean, maximum and minimum annual streamflow at 1 km resolution from 1960 through 2015

    Science.gov (United States)

    Barbarossa, Valerio; Huijbregts, Mark A. J.; Beusen, Arthur H. W.; Beck, Hylke E.; King, Henry; Schipper, Aafke M.

    2018-03-01

    Streamflow data are highly relevant for a variety of socio-economic and ecological analyses and applications, but a high-resolution global streamflow dataset has so far been lacking. We created FLO1K, a consistent streamflow dataset at a resolution of 30 arc seconds (~1 km) with global coverage. FLO1K comprises mean, maximum and minimum annual flow for each year in the period 1960-2015, provided as spatially continuous gridded layers. We mapped streamflow by means of artificial neural network (ANN) regression. An ensemble of ANNs was fitted on monthly streamflow observations from 6600 monitoring stations worldwide; minimum and maximum annual flows represent the lowest and highest mean monthly flows for a given year. As covariates we used upstream-catchment physiography (area, surface slope, elevation) and year-specific climatic variables (precipitation, temperature, potential evapotranspiration, aridity index and seasonality indices). Validating the maps against independent data indicated good agreement (R2 values up to 91%). FLO1K delivers essential data for freshwater ecology and water resources analyses at a global scale, yet at high spatial resolution.
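The fitting procedure described, an ensemble of ANNs regressed on catchment covariates, can be sketched with a small hand-rolled network on synthetic data. None of this is the FLO1K code, and the covariate names are only suggestive:

```python
import numpy as np

def fit_mlp(X, y, hidden=8, lr=0.05, epochs=500, seed=0):
    """Train a one-hidden-layer tanh network by full-batch gradient descent
    on squared error; returns a prediction function."""
    rng = np.random.default_rng(seed)
    W1 = rng.standard_normal((X.shape[1], hidden)) * 0.5
    b1 = np.zeros(hidden)
    W2 = rng.standard_normal(hidden) * 0.5
    b2 = 0.0
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)
        err = (h @ W2 + b2) - y
        # backpropagation of the mean squared error
        gW2 = h.T @ err / len(y)
        gb2 = err.mean()
        gh = np.outer(err, W2) * (1 - h ** 2)
        gW1 = X.T @ gh / len(y)
        gb1 = gh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
    return lambda Xn: np.tanh(Xn @ W1 + b1) @ W2 + b2

rng = np.random.default_rng(1)
X = rng.standard_normal((300, 3))        # stand-ins for e.g. area, slope, precipitation
y = 0.8 * X[:, 0] + 0.3 * X[:, 1] ** 2   # synthetic "streamflow" target
ensemble = [fit_mlp(X, y, seed=s) for s in range(5)]
pred = np.mean([m(X) for m in ensemble], axis=0)   # ensemble-average prediction
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

Averaging the ensemble members' predictions reduces the variance of any single fitted network, which is the rationale for using an ensemble of ANNs rather than one.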

  20. The Role of the Sheffield Model on the Minimum Unit Pricing of Alcohol Debate: The Importance of a Rhetorical Perspective

    Science.gov (United States)

    Katikireddi, Srinivasa Vittal; Hilton, Shona; Bond, Lyndal

    2016-01-01

    The minimum unit pricing (MUP) alcohol policy debate has been informed by the Sheffield model, a study which predicts impacts of different alcohol pricing policies. This paper explores the Sheffield model's influences on the policy debate by drawing on 36 semi-structured interviews with policy actors who were involved in the policy debate.…

  21. The Implications of the National Minimum Wage for Training Practices and Skill Utilisation in the United Kingdom Hospitality Industry

    Science.gov (United States)

    Norris, Gill; Williams, Steve; Adam-Smith, Derek

    2003-01-01

    Two key issues thrown up by the 1999 introduction of the National Minimum Wage (NMW) in the United Kingdom are its likely impact on employers' training practices in low paying sectors of the economy and the implications for skills. Based on a study of the hospitality industry, this article assesses the limited significance of the differential,…

  22. Map Database for Surficial Materials in the Conterminous United States

    Science.gov (United States)

    Soller, David R.; Reheis, Marith C.; Garrity, Christopher P.; Van Sistine, D. R.

    2009-01-01

    The Earth's bedrock is overlain in many places by a loosely compacted and mostly unconsolidated blanket of sediments in which soils commonly are developed. These sediments generally were eroded from underlying rock, and then were transported and deposited. In places, they exceed 1000 ft (~300 m) in thickness. Where the sediment blanket is absent, bedrock is either exposed or has been weathered to produce a residual soil. For the conterminous United States, a map by Soller and Reheis (2004, scale 1:5,000,000; http://pubs.usgs.gov/of/2003/of03-275/) shows these sediments and the weathered, residual material; for ease of discussion, these are referred to as 'surficial materials'. That map was produced as a PDF file, from an Adobe Illustrator-formatted version of the provisional GIS database. The provisional GIS files were further processed without modifying the content of the published map, and are published here.

  23. Developing a minimum dataset for nursing team leader handover in the intensive care unit: A focus group study.

    Science.gov (United States)

    Spooner, Amy J; Aitken, Leanne M; Corley, Amanda; Chaboyer, Wendy

    2018-01-01

    Despite increasing demand for structured processes to guide clinical handover, nursing handover tools are limited in the intensive care unit. The study aim was to identify key items to include in a minimum dataset for intensive care nursing team leader shift-to-shift handover. This focus group study was conducted in a 21-bed medical/surgical intensive care unit in Australia. Senior registered nurses involved in team leader handovers were recruited. Focus groups were conducted using a nominal group technique to generate and prioritise minimum dataset items. Nurses were presented with content from previous team leader handovers and asked to select which content items to include in a minimum dataset. Participant responses were summarised as frequencies and percentages. Seventeen senior nurses participated in three focus groups. Participants agreed that ISBAR (Identify-Situation-Background-Assessment-Recommendations) was a useful tool to guide clinical handover. Items recommended for inclusion in the minimum dataset (≥65% agreement) included Identify (name, age, days in intensive care), Situation (diagnosis, surgical procedure), Background (significant event(s), management of significant event(s)) and Recommendations (patient plan for next shift, tasks to follow up for next shift). Overall, 30 of the 67 (45%) items in the Assessment category were considered important to include in the minimum dataset and focused on relevant observations and treatment within each body system. Other non-ISBAR items considered important to include related to the ICU (admissions to ICU, staffing/skill mix, theatre cases) and patients (infectious status, site of infection, end of life plan). Items were further categorised into those to include in all handovers and those to discuss only when relevant to the patient. The findings suggest a minimum dataset for intensive care nursing team leader shift-to-shift handover should contain items within ISBAR along with unit- and patient-specific...
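The ≥65% agreement rule described above is easy to operationalize as a vote tally. The ballots below are invented for illustration, not the study's data:

```python
from collections import Counter

def select_items(votes, n_participants, threshold=0.65):
    """Return the handover items selected by at least `threshold` of
    participants. Each ballot is a collection of chosen items; set() guards
    against duplicate entries within one ballot."""
    counts = Counter(item for ballot in votes for item in set(ballot))
    return {item for item, c in counts.items() if c / n_participants >= threshold}

ballots = [
    {"name", "age", "diagnosis"},
    {"name", "diagnosis", "plan"},
    {"name", "age", "diagnosis", "plan"},
    {"name", "diagnosis"},
]
select_items(ballots, 4)   # {'name', 'diagnosis'} — the only items with ≥65% support
```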

  4. Robust methods to create ex vivo minimum deformation atlases for brain mapping.

    Science.gov (United States)

    Janke, Andrew L; Ullmann, Jeremy F P

    2015-02-01

    Highly detailed ex vivo 3D atlases of average structure are of critical importance to neuroscience and its current push to understand the global microstructure of the brain. Multiple single-slice histology sections can no longer provide sufficient detail of inter-slice microstructure and lack out-of-plane resolution. Two ex vivo methods have emerged that can create such detailed models. High-field micro-MRI with the addition of contrast media has allowed intact whole-brain microstructure imaging with an isotropic resolution of 15 μm in the mouse. Blockface imaging has similarly evolved to a point where it is now possible to image an entire brain in a rigorous fashion with an out-of-plane resolution of 10 μm. Although the tissue is destroyed as part of this process, it allows a reconstructed model that is free from cutting artifacts. Both of these methods have been utilised to create minimum deformation atlases (MDAs) that are representative of the respective populations. The MDA atlases give unprecedented insight into the commonality and differences in microstructure in cortical structures in specific taxa. In this paper we provide an overview of how to create such MDA models from ex vivo data. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Basement domain map of the conterminous United States and Alaska

    Science.gov (United States)

    Lund, Karen; Box, Stephen E.; Holm-Denoma, Christopher S.; San Juan, Carma A.; Blakely, Richard J.; Saltus, Richard W.; Anderson, Eric D.; DeWitt, Ed

    2015-01-01

    The basement-domain map is a compilation of basement domains in the conterminous United States and Alaska designed to be used at 1:5,000,000-scale, particularly as a base layer for national-scale mineral resource assessments. Seventy-seven basement domains are represented as eighty-three polygons on the map. The domains are based on interpretations of basement composition, origin, and architecture and developed from a variety of sources. Analysis of previously published basement, lithotectonic, and terrane maps as well as models of planetary development were used to formulate the concept of basement and the methodology of defining domains that spanned the ages of Archean to present but formed through different processes. The preliminary compilations for the study areas utilized these maps, national-scale gravity and aeromagnetic data, published and limited new age and isotopic data, limited new field investigations, and conventional geologic maps. Citation of the relevant source data for compilations and the source and types of original interpretation, as derived from different types of data, are provided in supporting descriptive text and tables.

  6. The Holdridge life zones of the conterminous United States in relation to ecosystem mapping

    Science.gov (United States)

    A.E. Lugo; S. L. Brown; R. Dodson; T. S Smith; H. H. Shugart

    1999-01-01

    Aim: Our main goals were to develop a map of the life zones for the conterminous United States, based on the Holdridge Life Zone system, as a tool for ecosystem mapping, and to compare the map of Holdridge life zones with other global vegetation classification and mapping efforts. Location: The area of interest is the forty-eight contiguous states of the United States....

  7. SU-F-T-78: Minimum Data Set of Measurements for TG 71 Based Electron Monitor-Unit Calculations

    International Nuclear Information System (INIS)

    Xu, H; Guerrero, M; Prado, K; Yi, B

    2016-01-01

    Purpose: Building up a TG-71 based electron monitor-unit (MU) calculation protocol usually involves extensive measurements. This work investigates a minimum data set of measurements and its calculation accuracy and measurement time. Methods: For the 6, 9, 12, 16, and 20 MeV beams of our Varian Clinac-Series linear accelerators, the complete measurements were performed at different depths using 5 square applicators (6, 10, 15, 20 and 25 cm) with different cutouts (2, 3, 4, 6, 10, 15 and 20 cm, up to the applicator size) for 5 different SSDs. For each energy, there were 8 PDD scans and 150 point measurements for applicator factors, cutout factors and effective SSDs, which were then converted to air-gap factors for SSDs of 99–110 cm. The dependence of each dosimetric quantity on field size and SSD was examined to determine the minimum data set of measurements as a subset of the complete measurements. The “missing” data excluded from the minimum data set were approximated by linear or polynomial fitting functions based on the included data. The total measurement time and the calculated electron MUs using the minimum and the complete data sets were compared. Results: The minimum data set includes 4 or 5 PDDs and 51 to 66 point measurements for each electron energy; more PDDs and fewer point measurements are generally needed as energy increases. Using only <50% of the complete measurement time, the minimum data set generates acceptable MU calculation results compared to those with the complete data set: the PDD difference is within 1 mm and the calculated MU difference is less than 1.5%. Conclusion: The measurement data set for TG-71 electron MU calculations can be minimized based on knowledge of how each dosimetric quantity depends on the various setup parameters. The suggested minimum data set allows acceptable MU calculation accuracy and shortens measurement time by a few hours.
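    The fitting approach described above can be illustrated with a short sketch. The cutout sizes and factor values below are hypothetical illustrative numbers, not the paper's measured data; the point is simply that "missing" points excluded from the minimum data set can be reconstructed from the measured subset (here by linear interpolation, one of the two fitting options the abstract mentions).

```python
from bisect import bisect_left

# Hypothetical measured cutout factors (square cutout side in cm -> factor).
# Only a subset of cutout sizes is measured; the rest are reconstructed.
measured = {2.0: 0.92, 4.0: 0.97, 10.0: 1.00, 20.0: 1.01}
sizes = sorted(measured)

def interp_factor(size):
    """Linearly interpolate a cutout factor from the measured subset."""
    if size <= sizes[0]:
        return measured[sizes[0]]
    if size >= sizes[-1]:
        return measured[sizes[-1]]
    i = bisect_left(sizes, size)           # index of the right-hand bracket point
    x0, x1 = sizes[i - 1], sizes[i]
    y0, y1 = measured[x0], measured[x1]
    return y0 + (y1 - y0) * (size - x0) / (x1 - x0)

# Approximate the "missing" cutout sizes excluded from the minimum data set.
for s in (3.0, 6.0, 15.0):
    print(f"cutout {s:4.1f} cm: fitted factor {interp_factor(s):.3f}")
```

A polynomial fit over the same measured subset would follow the same pattern, trading the piecewise-linear shape for a smooth curve.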

  9. Potential Impact of Minimum Unit Pricing for Alcohol in Ireland: Evidence from the National Alcohol Diary Survey.

    Science.gov (United States)

    Cousins, Gráinne; Mongan, Deirdre; Barry, Joe; Smyth, Bobby; Rackard, Marion; Long, Jean

    2016-11-01

    One of the main provisions of the Irish Public Health (Alcohol) Bill is the introduction of a minimum unit price (MUP) for alcohol in Ireland, set at €1.00/standard drink. We sought to identify who will be most affected by the introduction of a MUP, examining the relationship between harmful alcohol consumption, personal income, place of purchase and price paid for alcohol. A nationally representative survey of 3187 respondents aged 18-75 years was used, with each respondent completing a diary of their previous week's alcohol consumption. The primary outcome was purchasing alcohol below the proposed minimum unit price of €1.00 per standard drink. Purchasing below this price was most strongly associated with harmful drinking and low personal annual income, suggesting that a MUP would target those suffering the greatest harm, and reduce alcohol-attributable mortality in Ireland. Further prospective studies are needed to monitor consumption trends and associated harms following the introduction of minimum unit pricing of alcohol. © The Author 2016. Medical Council on Alcohol and Oxford University Press. All rights reserved.

  10. Improving boiler unit performance using an optimum robust minimum-order observer

    Energy Technology Data Exchange (ETDEWEB)

    Moradi, Hamed; Bakhtiari-Nejad, Firooz [Energy and Control Centre of Excellence, Department of Mechanical Engineering, Amirkabir University of Technology, Tehran (Iran, Islamic Republic of)

    2011-03-15

    To achieve good performance of a utility boiler, dynamic variables such as drum pressure, steam temperature and drum water level must be controlled. In this paper, a linear time-invariant (LTI) model of a boiler system is considered in which the input variables are the feed-water and fuel mass rates. Because some state variables of the boiler system are inaccessible, a minimum-order observer is designed based on Luenberger's model to obtain an estimate x̂ of the true state x. Low design cost and high accuracy of state estimation are the main advantages of the minimum-order observer compared with previously designed full-order observers. By applying the observer to the closed-loop system, a regulator system is designed. Using an optimization code developed in the MATLAB environment, desired observer poles are found such that suitable time-response specifications of the boiler system are achieved and the gain and phase margin values are kept within an acceptable range. The real dynamic model, however, may be subject to parametric uncertainties. In that case, the optimum region for the poles of the observer-based controller is found such that robust performance of the boiler system against model uncertainties is guaranteed. (author)
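    A minimal sketch of the reduced-order (minimum-order) Luenberger observer idea, for a generic 2-state LTI system with only the first state measured. The matrix entries, initial conditions and desired observer pole below are made-up textbook values, not the boiler model from the paper; the sketch only shows why such an observer is cheap (a single scalar state here) while the estimation error still converges at a pole we choose.

```python
# Generic system: x1' = a11*x1 + a12*x2, x2' = a21*x1 + a22*x2, measurement y = x1.
a11, a12 = 0.0, 1.0
a21, a22 = -2.0, -3.0

# The minimum-order observer estimates the unmeasured state as x2_hat = z + L*y.
# Its error e = x2 - x2_hat obeys e' = (a22 - L*a12)*e; place that pole at -5.
desired_pole = -5.0
L = (a22 - desired_pole) / a12    # observer gain

# Forward-Euler simulation of the true system and the observer.
dt, steps = 1e-3, 8000
x1, x2 = 1.0, 0.0                 # true state (x2 is not measured)
z = 0.0                           # observer's single internal state
errors = []
for _ in range(steps):
    y = x1
    x2_hat = z + L * y
    errors.append(abs(x2 - x2_hat))
    # z' = (a22 - L*a12)*z + ((a22 - L*a12)*L + a21 - L*a11)*y
    zdot = (a22 - L * a12) * z + ((a22 - L * a12) * L + a21 - L * a11) * y
    x1dot = a11 * x1 + a12 * x2
    x2dot = a21 * x1 + a22 * x2
    z += dt * zdot
    x1 += dt * x1dot
    x2 += dt * x2dot

print(f"initial estimation error {errors[0]:.2f}, final {errors[-1]:.1e}")
```

A full-order observer would carry two internal states for the same job; the reduced-order version carries only as many states as are unmeasured, which is the "low cost of design" advantage the abstract refers to.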

  11. Understanding the Development of Minimum Unit Pricing of Alcohol in Scotland: A Qualitative Study of the Policy Process

    Science.gov (United States)

    Katikireddi, Srinivasa Vittal; Hilton, Shona; Bonell, Chris; Bond, Lyndal

    2014-01-01

    Background: Minimum unit pricing of alcohol is a novel public health policy with the potential to improve population health and reduce health inequalities. Theories of the policy process may help to understand the development of policy innovation and in turn identify lessons for future public health research and practice. This study aims to explain minimum unit pricing’s development by taking a ‘multiple-lenses’ approach to understanding the policy process. In particular, we apply three perspectives of the policy process (Kingdon’s multiple streams, Punctuated-Equilibrium Theory, Multi-Level Governance) to understand how and why minimum unit pricing has developed in Scotland and describe implications for efforts to develop evidence-informed policymaking. Methods: Semi-structured interviews were conducted with policy actors (politicians, civil servants, academics, advocates, industry representatives) involved in the development of MUP (n = 36). Interviewees were asked about the policy process and the role of evidence in policy development. Data from two other sources (a review of policy documents and an analysis of evidence submission documents to the Scottish Parliament) were used for triangulation. Findings: The three perspectives provide complementary understandings of the policy process. Evidence has played an important role in presenting the policy issue of alcohol as a problem requiring action. Scotland-specific data and a change in the policy ‘image’ to a population-based problem contributed to making alcohol-related harms a priority for action. The limited powers of Scottish Government help explain the type of price intervention pursued while distinct aspects of the Scottish political climate favoured the pursuit of price-based interventions. Conclusions: Evidence has played a crucial but complex role in the development of an innovative policy. Utilising different political science theories helps explain different aspects of the policy process

  12. Understanding the development of minimum unit pricing of alcohol in Scotland: a qualitative study of the policy process.

    Science.gov (United States)

    Katikireddi, Srinivasa Vittal; Hilton, Shona; Bonell, Chris; Bond, Lyndal

    2014-01-01

    Minimum unit pricing of alcohol is a novel public health policy with the potential to improve population health and reduce health inequalities. Theories of the policy process may help to understand the development of policy innovation and in turn identify lessons for future public health research and practice. This study aims to explain minimum unit pricing's development by taking a 'multiple-lenses' approach to understanding the policy process. In particular, we apply three perspectives of the policy process (Kingdon's multiple streams, Punctuated-Equilibrium Theory, Multi-Level Governance) to understand how and why minimum unit pricing has developed in Scotland and describe implications for efforts to develop evidence-informed policymaking. Semi-structured interviews were conducted with policy actors (politicians, civil servants, academics, advocates, industry representatives) involved in the development of MUP (n = 36). Interviewees were asked about the policy process and the role of evidence in policy development. Data from two other sources (a review of policy documents and an analysis of evidence submission documents to the Scottish Parliament) were used for triangulation. The three perspectives provide complementary understandings of the policy process. Evidence has played an important role in presenting the policy issue of alcohol as a problem requiring action. Scotland-specific data and a change in the policy 'image' to a population-based problem contributed to making alcohol-related harms a priority for action. The limited powers of Scottish Government help explain the type of price intervention pursued while distinct aspects of the Scottish political climate favoured the pursuit of price-based interventions. Evidence has played a crucial but complex role in the development of an innovative policy. Utilising different political science theories helps explain different aspects of the policy process, with Multi-Level Governance particularly useful for

  13. A mitotically inheritable unit containing a MAP kinase module.

    Science.gov (United States)

    Kicka, Sébastien; Bonnet, Crystel; Sobering, Andrew K; Ganesan, Latha P; Silar, Philippe

    2006-09-05

    Prions are novel kinds of hereditary units, relying solely on proteins, that are infectious and inherited in a non-Mendelian fashion. To date, they are either based on autocatalytic modification of a 3D conformation or on autocatalytic cleavage. Here, we provide further evidence that in the filamentous fungus Podospora anserina, a MAP kinase cascade is probably able to self-activate and generate C, a hereditary unit that bears many similarities to prions and triggers cell degeneration. We show that in addition to the MAPKKK gene, both the MAPKK and MAPK genes are necessary for the propagation of C, and that overexpression of MAPK, like that of MAPKKK, facilitates the appearance of C. We also show that a correlation exists between the presence of C and localization of the MAPK inside nuclei. These data emphasize the resemblance between prions and a self-positively regulated cascade in terms of their transmission. This further expands the concept of protein-based inheritance to regulatory networks that have the ability to self-activate.

  14. Minimum monitor unit per segment IMRT planning and over-shoot-ratio

    International Nuclear Information System (INIS)

    Grigorov, G.; Barnett, R.; Chow, J.

    2004-01-01

    The aim of this work is to describe the modulation quality for dose delivery of small multi-leaf collimator (MLC) fields and low MU/segment. The results were obtained with Pinnacle (V6) and a Varian Clinac 2100 EX (Varis 6.2) linear accelerator. The over-shoot (OS) effect was investigated by comparing integrated multiple segmented exposures to a single exposure with the same number of total MU (1, 2, 3, 4, 5 and 6 MU). To quantify the OS effect, the Over-Shoot-Ratio (OSR) was defined as the ratio of the segmented dose for a 1 cm² field at depth to the static dose for the same field size and depth. The OSR was measured as a function of MU/segment and dose rate. The measured results can be used to optimise IMRT planning and also to calculate the surface dose. The depth dependence of the dose for 1, 2, 3, 4, and 5 MU/segment is presented for a 6 MV photon beam, a dose rate of 100 MU/min and a 1 cm² field at the central axis, where the argument of the function is depth and its parameter is the minimum number of MU/segment. The dependence of the over-shoot ratio on MU/segment is also shown with dose rate (100, 400 and 600 MU/min) as a parameter. The effect increases with dose rate and decreases with increasing minimum number of MU/segment. Having measured the OSR for the 2100 EX linac, it is possible to correct and calibrate the dose of the first segment of an IMRT beam, where the dose to the target and on the surface can exceed the planned dose of 1 MU by 40% and 70% for dose rates of 400 and 600 MU/min respectively. The Over-Shoot-Ratio is an important parameter to be determined as part of routine quality assurance for IMRT and can be used to significantly improve the agreement between planned and delivered doses to the patient.
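    The OSR defined above is simple arithmetic, sketched below. The dose readings are hypothetical numbers chosen only to illustrate the definition (a segmented delivery whose first segment over-shoots), not the paper's measurements.

```python
# Over-Shoot-Ratio: segmented dose divided by static dose for the same
# field size, depth and total MU. Values below are hypothetical.
def overshoot_ratio(segmented_dose, static_dose):
    return segmented_dose / static_dose

static_dose_6mu = 6.00        # cGy, one static 6-MU exposure (hypothetical)
segmented_dose_6x1mu = 6.40   # cGy, six 1-MU segments; the first over-shoots

osr = overshoot_ratio(segmented_dose_6x1mu, static_dose_6mu)
excess = segmented_dose_6x1mu - static_dose_6mu   # extra dose, mostly segment 1
print(f"OSR = {osr:.3f}, excess dose = {excess:.2f} cGy")
```

An OSR above 1 signals over-shoot; measuring it per dose rate and per MU/segment, as the abstract describes, gives the correction to apply to the first segment of an IMRT beam.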

  15. Risk maps for targeting exotic plant pest detection programs in the United States

    Science.gov (United States)

    R.D. Magarey; D.M. Borchert; J.S. Engle; M Garcia-Colunga; Frank H. Koch; et al

    2011-01-01

    In the United States, pest risk maps are used by the Cooperative Agricultural Pest Survey for spatial and temporal targeting of exotic plant pest detection programs. Methods are described to create standardized host distribution, climate and pathway risk maps for the top nationally ranked exotic pest targets. Two examples are provided to illustrate the risk mapping...

  16. TH-CD-209-01: A Greedy Reassignment Algorithm for the PBS Minimum Monitor Unit Constraint

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Y; Kooy, H; Craft, D; Depauw, N; Flanz, J; Clasie, B [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States)

    2016-06-15

    Purpose: To investigate a Greedy Reassignment algorithm to mitigate the effects of low-weight spots in proton pencil beam scanning (PBS) treatment plans. Methods: To convert a plan from the treatment planning system (TPS) into a deliverable plan, post-processing methods can be used to adjust the spot maps to meet the minimum MU constraint. Existing methods include deleting low-weight spots (Cut method), or rounding spots with weight above/below half the limit up/down to the limit/zero (Round method). An alternative method called Greedy Reassignment was developed in this work, in which the lowest-weight spot in the field is removed and its weight reassigned equally among its nearest neighbors. The process is repeated with the next lowest-weight spot until all spots in the field are above the MU constraint. The algorithm performance was evaluated using plans collected from 190 patients (496 fields) treated at our facility. The evaluation criterion was the γ-index pass rate comparing the pre-processed and post-processed dose distributions. A planning metric was further developed to predict the impact of post-processing on treatment plans for various treatment planning, machine, and dose tolerance parameters. Results: For fields with a gamma pass rate of 90±1%, the metric has a standard deviation equal to 18% of the centroid value. This showed that the metric and γ-index pass rate are correlated for the Greedy Reassignment algorithm. Using a 3rd-order polynomial fit to the data, the Greedy Reassignment method had a 1.8 times better metric at a 90% pass rate compared to the other post-processing methods. Conclusion: We showed that the Greedy Reassignment method yields deliverable plans that are closest to the optimized-without-MU-constraint plan from the TPS. The metric developed in this work could help design the minimum MU threshold with the goal of keeping the γ-index pass rate above an acceptable value.
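    A minimal sketch of the Greedy Reassignment idea described above: repeatedly remove the lowest-weight spot that violates the minimum-MU constraint and split its weight equally among its nearest neighbours. The 2-D spot positions, weights, MU limit and the choice of two neighbours are made-up illustrative values, not clinical data or the authors' implementation.

```python
import math

def greedy_reassignment(spots, min_mu, n_neighbors=2):
    """spots: list of ((x, y), weight). Returns spots with every weight >= min_mu."""
    spots = [list(s) for s in spots]                 # mutable copies
    while True:
        below = [s for s in spots if s[1] < min_mu]
        if not below:
            return [(tuple(pos), w) for pos, w in spots]
        lowest = min(below, key=lambda s: s[1])      # lowest-weight violating spot
        spots.remove(lowest)
        if not spots:
            return []
        # Nearest neighbours of the removed spot share its weight equally,
        # so the total MU of the field is conserved.
        spots.sort(key=lambda s: math.dist(s[0], lowest[0]))
        share = lowest[1] / min(n_neighbors, len(spots))
        for s in spots[:n_neighbors]:
            s[1] += share

spot_map = [((0, 0), 5.0), ((1, 0), 0.4), ((2, 0), 3.0), ((0, 1), 0.2)]
cleaned = greedy_reassignment(spot_map, min_mu=1.0)
total_before = sum(w for _, w in spot_map)
total_after = sum(w for _, w in cleaned)
print(cleaned, total_before, total_after)
```

Unlike the Cut method, this conserves total MU; unlike the Round method, it keeps the redistributed weight local to the removed spot, which is why the post-processed plan stays close to the unconstrained TPS plan.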

  17. Reflections on the Value of Mapping the Final Theory Examination in a Molecular Biochemistry Unit

    OpenAIRE

    Eri, Rajaraman; Cook, Anthony; Brown, Natalie

    2014-01-01

    This article assesses the impact of examination mapping as a tool to enhancing assessment and teaching quality in a second-year biochemistry unit for undergraduates. Examination mapping is a process where all questions in a written examination paper are assessed for links to the unit’s intended learning outcomes. We describe how mapping a final written examination helped visualise the impact of the assessment task on intended learning outcomes and skills for that biochemistry unit. The method...

  18. 2014 Update of the United States National Seismic Hazard Maps

    Science.gov (United States)

    Petersen, M.D.; Mueller, C.S.; Haller, K.M.; Moschetti, M.; Harmsen, S.C.; Field, E.H.; Rukstales, K.S.; Zeng, Y.; Perkins, D.M.; Powers, P.; Rezaeian, S.; Luco, N.; Olsen, A.; Williams, R.

    2012-01-01

    The U.S. National Seismic Hazard Maps are revised every six years, corresponding with the update cycle of the International Building Code. These maps cover the conterminous U.S. and will be updated in 2014 using the best-available science that is obtained from colleagues at regional and topical workshops, which are convened in 2012-2013. Maps for Alaska and Hawaii will be updated shortly following this update. Alternative seismic hazard models discussed at the workshops will be implemented in a logic tree framework and will be used to develop the seismic hazard maps and associated products. In this paper we describe the plan to update the hazard maps, the issues raised in workshops up to March 2012, and topics that will be discussed at future workshops. An advisory panel will guide the development of the hazard maps and ensure that the maps are acceptable to a broad segment of the science and engineering communities. These updated maps will then be considered by end-users for inclusion in building codes, risk models, and public policy documents.

  19. Rapid and minimum invasive functional brain mapping by real-time visualization of high gamma activity during awake craniotomy.

    Science.gov (United States)

    Ogawa, Hiroshi; Kamada, Kyousuke; Kapeller, Christoph; Hiroshima, Satoru; Prueckl, Robert; Guger, Christoph

    2014-11-01

    Electrocortical stimulation (ECS) is the gold standard for functional brain mapping during an awake craniotomy. The critical issue is to set aside enough time to identify eloquent cortices by ECS. High gamma activity (HGA) ranging between 80 and 120 Hz on electrocorticogram is assumed to reflect localized cortical processing. In this report, we used real-time HGA mapping and functional neuronavigation integrated with functional magnetic resonance imaging (fMRI) for rapid and reliable identification of motor and language functions. Four patients with intra-axial tumors in their dominant hemisphere underwent preoperative fMRI and lesion resection with an awake craniotomy. All patients showed significant fMRI activation evoked by motor and language tasks. During the craniotomy, we recorded electrocorticogram activity by placing subdural grids directly on the exposed brain surface. Each patient performed motor and language tasks and demonstrated real-time HGA dynamics in hand motor areas and parts of the inferior frontal gyrus. Sensitivity and specificity of HGA mapping were 100% compared with ECS mapping in the frontal lobe, which suggested HGA mapping precisely indicated eloquent cortices. We found different HGA dynamics of language tasks in frontal and temporal regions. Specificities of the motor and language-fMRI did not reach 85%. The results of HGA mapping were mostly consistent with those of ECS mapping, although fMRI tended to overestimate functional areas. This novel technique enables rapid and accurate identification of motor and frontal language areas. Furthermore, real-time HGA mapping sheds light on underlying physiological mechanisms related to human brain functions. Copyright © 2014 Elsevier Inc. All rights reserved.
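    The sensitivity/specificity comparison above is standard confusion-matrix arithmetic, treating ECS mapping as the gold standard at each recorded site. The counts below are hypothetical, chosen only to show the calculation, not the study's data.

```python
# Sensitivity = TP / (TP + FN): fraction of ECS-positive sites flagged by HGA.
# Specificity = TN / (TN + FP): fraction of ECS-negative sites left unflagged.
def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity(tn, fp):
    return tn / (tn + fp)

# Hypothetical electrode-site counts against the ECS gold standard.
tp, fn = 12, 0   # ECS-positive sites correctly flagged by HGA
tn, fp = 50, 0   # ECS-negative sites correctly left unflagged
print(f"sensitivity {sensitivity(tp, fn):.0%}, specificity {specificity(tn, fp):.0%}")
```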

  20. Keeping it wild: mapping wilderness character in the United States.

    Science.gov (United States)

    Carver, Steve; Tricker, James; Landres, Peter

    2013-12-15

    A GIS-based approach is developed to identify the state of wilderness character in US wilderness areas using Death Valley National Park (DEVA) as a case study. A set of indicators and measures are identified by DEVA staff and used as the basis for developing a flexible and broadly applicable framework to map wilderness character using data inputs selected by park staff. Spatial data and GIS methods are used to map the condition of four qualities of wilderness character: natural, untrammelled, undeveloped, and solitude or primitive and unconfined recreation. These four qualities are derived from the US 1964 Wilderness Act and later developed by Landres et al. (2008a) in "Keeping it Wild: An Interagency Strategy to Monitor Trends in Wilderness Character Across the National Wilderness Preservation System." Data inputs are weighted to reflect their importance in relation to other data inputs and the model is used to generate maps of each of the four qualities of wilderness character. The combined map delineates the range of quality of wilderness character in the DEVA wilderness revealing the majority of wilderness character to be optimal quality with the best areas in the northern section of the park. This map will serve as a baseline for monitoring change in wilderness character and for evaluating the spatial impacts of planning alternatives for wilderness and backcountry stewardship plans. The approach developed could be applied to any wilderness area, either in the USA or elsewhere in the world. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Wide-area measurement system-based supervision of protection schemes with minimum number of phasor measurement units.

    Science.gov (United States)

    Gajare, Swaroop; Rao, J Ganeswara; Naidu, O D; Pradhan, Ashok Kumar

    2017-08-13

    Cascade tripping of power lines triggered by maloperation of zone-3 relays during stressed system conditions, such as load encroachment, power swing and voltage instability, has led to many catastrophic power failures worldwide, including Indian blackouts in 2012. With the introduction of wide-area measurement systems (WAMS) into the grids, real-time monitoring of transmission network condition is possible. A phasor measurement unit (PMU) sends time-synchronized data to a phasor data concentrator, which can provide a control signal to substation devices. The latency associated with the communication system makes WAMS suitable for a slower form of protection. In this work, a method to identify the faulted line using synchronized data from strategic PMU locations is proposed. Subsequently, a supervisory signal is generated for specific relays in the system for any disturbance or stressed condition. For a given system, an approach to decide the strategic locations for PMU placement is developed, which can be used for determining the minimum number of PMUs required for application of the method. The accuracy of the scheme is tested for faults during normal and stressed conditions in a New England 39-bus system simulated using EMTDC/PSCAD software. With such a strategy, maloperation of relays can be averted in many situations and thereby blackouts/large-scale disturbances can be prevented.This article is part of the themed issue 'Energy management: flexibility, risk and optimization'. © 2017 The Author(s).
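    The paper develops its own approach to choosing strategic PMU locations; as a generic illustration of the underlying "observe the network with a minimum number of PMUs" idea, here is a simple greedy cover, assuming a PMU at a bus observes that bus and its immediate neighbours. The small 6-bus graph is made up for illustration, not the New England 39-bus system, and this greedy heuristic is not the authors' placement method.

```python
def greedy_pmu_placement(adjacency):
    """adjacency: {bus: set of neighbour buses}. Returns a set of PMU buses
    such that every bus is either a PMU bus or adjacent to one."""
    unobserved = set(adjacency)
    pmus = set()
    while unobserved:
        # Place the next PMU where it observes the most currently-unobserved buses.
        best = max(adjacency, key=lambda b: len(({b} | adjacency[b]) & unobserved))
        pmus.add(best)
        unobserved -= {best} | adjacency[best]
    return pmus

# Toy 6-bus radial network (made-up topology).
grid = {1: {2}, 2: {1, 3, 4}, 3: {2, 4}, 4: {2, 3, 5}, 5: {4, 6}, 6: {5}}
placement = greedy_pmu_placement(grid)
print(placement)
```

Greedy cover gives no optimality guarantee in general; exact minimum placements are usually found with integer programming, but the sketch conveys why a handful of well-placed PMUs can observe a much larger network.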

  2. Representations of minimum unit pricing for alcohol in UK newspapers: a case study of a public health policy debate.

    Science.gov (United States)

    Patterson, Chris; Katikireddi, Srinivasa Vittal; Wood, Karen; Hilton, Shona

    2015-03-01

    Mass media influence public acceptability, and hence feasibility, of public health interventions. This study investigates newsprint constructions of the alcohol problem and minimum unit pricing (MUP). Quantitative content analysis of 901 articles about MUP published in 10 UK and Scottish newspapers between 2005 and 2012. MUP was a high-profile issue, particularly in Scottish publications. Reporting increased steadily between 2008 and 2012, matching the growing status of the debate. The alcohol problem was widely acknowledged, often associated with youths, and portrayed as driven by cheap alcohol, supermarkets and drinking culture. Over-consumption was presented as a threat to health and social order. Appraisals of MUP were neutral, with supportiveness increasing slightly over time. Arguments focused on health impacts more frequently than more emotive perspectives or business interests. Health charities and the NHS were cited slightly more frequently than alcohol industry representatives. Emphases on efficacy, evidence and experts are positive signs for evidence-based policymaking. The high profile of MUP, along with growing support within articles, could reflect growing appetite for action on the alcohol problem. Representations of the problem as structurally driven might engender support for legislative solutions, although cultural explanations remain common. © The Author 2014. Published by Oxford University Press on behalf of Faculty of Public Health.

  3. Quaternary Geologic Map of the Regina 4 Degrees x 6 Degrees Quadrangle, United States and Canada

    Science.gov (United States)

    Fullerton, David S.; Christiansen, Earl A.; Schreiner, Bryan T.; Colton, Roger B.; Clayton, Lee; Bush, Charles A.; Fullerton, David S.

    2007-01-01

    For scientific purposes, the map differentiates Quaternary surficial deposits and materials on the basis of clast lithology or composition, matrix texture or particle size, structure, genesis, stratigraphic relations, engineering geologic properties, and relative age, as shown on the correlation diagram and indicated in the 'Description of Map Units'. Deposits of some constructional landforms, such as end moraines, are distinguished as map units. Deposits of erosional landforms, such as outwash terraces, are not distinguished, although glaciofluvial, ice-contact, fluvial, and lacustrine deposits that are mapped may be terraced. Differentiation of sequences of fluvial and glaciofluvial deposits at this scale is not possible. For practical purposes, the map is a surficial materials map. Materials are distinguished on the basis of lithology or composition, texture or particle size, and other physical, chemical, and engineering characteristics. It is not a map of soils that are recognized and classified in pedology or agronomy. Rather, it is a generalized map of soils as recognized in engineering geology, or of substrata or parent materials in which pedologic or agronomic soils are formed. As a materials map, it serves as a base from which a variety of maps for use in planning engineering, land-use planning, or land-management projects can be derived and from which a variety of maps relating to earth surface processes and Quaternary geologic history can be derived.

  4. Education and Criminal Justice: The Educational Approach to Prison Administration. The United Nations Standard Minimum Rules for the Treatment of Prisoners.

    Science.gov (United States)

    Morin, Lucien; Cosman, J. W.

    The United Nations Standard Minimum Rules for the Treatment of Prisoners do not express the basic principle that would support a serious educational approach to prison administration. The crucial missing rationale is the concept of the inherent dignity of the individual human prisoner. This concept has certain basic educational implications,…

  5. Radiation field mapping in mammography units with TLDs

    Energy Technology Data Exchange (ETDEWEB)

    Castro, J.C.O.; Silva, J.O., E-mail: jonas.silva@ufg.br [Universidade Federal de Goiás (IFG), Goiânia (Brazil). Instituto de Física; Veneziani, G.R. [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo-SP (Brazil). Centro de Metrologia das Radiações

    2017-07-01

    Mammography is the most common imaging technique for breast cancer detection and follow-up. For dosimetry, it is important to know the variation in field intensity. In this work, TLD-100 dosimeters were used to map the radiation field of a mammography system at a hospital in Goiânia/GO. The maximum radiation intensity was found 8 cm from the chest wall. The results obtained could be used to optimize the dosimetry of the equipment studied in this work. (author)

  6. Perspectives on econometric modelling to inform policy: a UK qualitative case study of minimum unit pricing of alcohol.

    Science.gov (United States)

    Katikireddi, Srinivasa V; Bond, Lyndal; Hilton, Shona

    2014-06-01

    Novel policy interventions may lack evaluation-based evidence. Considerations to introduce minimum unit pricing (MUP) of alcohol in the UK were informed by econometric modelling (the 'Sheffield model'). We aim to investigate policy stakeholders' views of the utility of modelling studies for public health policy. In-depth qualitative interviews with 36 individuals involved in MUP policy debates (purposively sampled to include civil servants, politicians, academics, advocates and industry-related actors) were conducted and thematically analysed. Interviewees felt familiar with modelling studies and often displayed detailed understandings of the Sheffield model. Despite this, many were uneasy about the extent to which the Sheffield model could be relied on for informing policymaking and preferred traditional evaluations. A tension was identified between this preference for post hoc evaluations and a desire for evidence derived from local data, with modelling seen to offer high external validity. MUP critics expressed concern that the Sheffield model did not adequately capture the 'real life' world of the alcohol market, which was conceptualized as a complex and, to some extent, inherently unpredictable system. Communication of modelling results was considered intrinsically difficult but presenting an appropriate picture of the uncertainties inherent in modelling was viewed as desirable. There was general enthusiasm for increased use of econometric modelling to inform future policymaking but an appreciation that such evidence should only form one input into the process. Modelling studies are valued by policymakers as they provide contextually relevant evidence for novel policies, but tensions exist with views of traditional evaluation-based evidence. © The Author 2013. Published by Oxford University Press on behalf of the European Public Health Association.

  7. State-level minimum wage and heart disease death rates in the United States, 1980-2015: A novel application of marginal structural modeling.

    Science.gov (United States)

    Van Dyke, Miriam E; Komro, Kelli A; Shah, Monica P; Livingston, Melvin D; Kramer, Michael R

    2018-07-01

    Despite substantial declines since the 1960s, heart disease remains the leading cause of death in the United States (US) and geographic disparities in heart disease mortality have grown. State-level socioeconomic factors might be important contributors to geographic differences in heart disease mortality. This study examined the association between state-level minimum wage increases above the federal minimum wage and heart disease death rates from 1980 to 2015 among 'working age' individuals aged 35-64 years in the US. Annual, inflation-adjusted state and federal minimum wage data were extracted from legal databases and annual state-level heart disease death rates were obtained from CDC Wonder. Although most minimum wage and health studies to date use conventional regression models, we employed marginal structural models to account for possible time-varying confounding. Quasi-experimental, marginal structural models accounting for state, year, and state × year fixed effects estimated the association between increases in the state-level minimum wage above the federal minimum wage and heart disease death rates. In models of 'working age' adults (35-64 years old), a $1 increase in the state-level minimum wage above the federal minimum wage was on average associated with ~6 fewer heart disease deaths per 100,000 (95% CI: -10.4, -1.99), or a state-level heart disease death rate that was 3.5% lower per year. In contrast, for older adults (65+ years old) a $1 increase was on average associated with a 1.1% lower state-level heart disease death rate per year (b = -28.9 per 100,000, 95% CI: -71.1, 13.3). State-level economic policies are important targets for population health research. Copyright © 2018 Elsevier Inc. All rights reserved.

  8. The State Geologic Map Compilation (SGMC) geodatabase of the conterminous United States

    Science.gov (United States)

    Horton, John D.; San Juan, Carma A.; Stoeser, Douglas B.

    2017-06-30

    The State Geologic Map Compilation (SGMC) geodatabase of the conterminous United States (https://doi.org/10.5066/F7WH2N65) represents a seamless, spatial database of 48 State geologic maps that range from 1:50,000 to 1:1,000,000 scale. A national digital geologic map database is essential in interpreting other datasets that support numerous types of national-scale studies and assessments, such as those that provide geochemistry, remote sensing, or geophysical data. The SGMC is a compilation of the individual U.S. Geological Survey releases of the Preliminary Integrated Geologic Map Databases for the United States. The SGMC geodatabase also contains updated data for seven States and seven entirely new State geologic maps that have been added since the preliminary databases were published. Numerous errors have been corrected and enhancements added to the preliminary datasets using thorough quality assurance/quality control procedures. The SGMC is not a truly integrated geologic map database because geologic units have not been reconciled across State boundaries. However, the geologic data contained in each State geologic map have been standardized to allow spatial analyses of lithology, age, and stratigraphy at a national scale.

  9. A minimum price per unit of alcohol: A focus group study to investigate public opinion concerning UK government proposals to introduce new price controls to curb alcohol consumption

    Directory of Open Access Journals (Sweden)

    Lonsdale Adam J

    2012-11-01

    Background UK drinkers regularly consume alcohol in excess of guideline limits. One reason for this may be the high availability of low-cost alcoholic beverages. The introduction of a minimum price per unit of alcohol policy has been proposed as a means to reduce UK alcohol consumption. However, there is little in-depth research investigating public attitudes and beliefs regarding a minimum pricing policy. The aim of the present research was to investigate people’s attitudes and beliefs toward the introduction of a minimum price per unit of alcohol policy and their views on how the policy could be made acceptable to the general public. Methods Twenty-eight focus groups were conducted to gain in-depth data on attitudes, knowledge, and beliefs regarding the introduction of a minimum price per unit of alcohol policy. Participants (total N = 218) were asked to give their opinions about the policy, its possible outcomes, and how its introduction might be made more acceptable. Transcribed focus-group discussions were analysed for emergent themes using inductive thematic content analysis. Results Analysis indicated that participants’ objections to a minimum price had three main themes: (1) scepticism of minimum pricing as an effective means to reduce harmful alcohol consumption; (2) a dislike of the policy for a number of reasons (e.g., it was perceived to ‘punish’ the moderate drinker); and (3) concern that the policy might create or exacerbate existing social problems. There was a general perception that the policy was aimed at ‘problem’ and underage drinkers. Participants expressed some qualified support for the policy but stated that it would only work as part of a wider campaign including other educational elements. Conclusions There was little evidence to suggest that people would support the introduction of a minimum price per unit of alcohol policy. Scepticism about the effectiveness of the policy is likely to represent the most

  11. A minimum price per unit of alcohol: a focus group study to investigate public opinion concerning UK government proposals to introduce new price controls to curb alcohol consumption.

    Science.gov (United States)

    Lonsdale, Adam J; Hardcastle, Sarah J; Hagger, Martin S

    2012-11-23

    UK drinkers regularly consume alcohol in excess of guideline limits. One reason for this may be the high availability of low-cost alcoholic beverages. The introduction of a minimum price per unit of alcohol policy has been proposed as a means to reduce UK alcohol consumption. However, there is little in-depth research investigating public attitudes and beliefs regarding a minimum pricing policy. The aim of the present research was to investigate people's attitudes and beliefs toward the introduction of a minimum price per unit of alcohol policy and their views on how the policy could be made acceptable to the general public. Twenty-eight focus groups were conducted to gain in-depth data on attitudes, knowledge, and beliefs regarding the introduction of a minimum price per unit of alcohol policy. Participants (total N = 218) were asked to give their opinions about the policy, its possible outcomes, and how its introduction might be made more acceptable. Transcribed focus-group discussions were analysed for emergent themes using inductive thematic content analysis. Analysis indicated that participants' objections to a minimum price had three main themes: (1) scepticism of minimum pricing as an effective means to reduce harmful alcohol consumption; (2) a dislike of the policy for a number of reasons (e.g., it was perceived to 'punish' the moderate drinker); and (3) concern that the policy might create or exacerbate existing social problems. There was a general perception that the policy was aimed at 'problem' and underage drinkers. Participants expressed some qualified support for the policy but stated that it would only work as part of a wider campaign including other educational elements. There was little evidence to suggest that people would support the introduction of a minimum price per unit of alcohol policy. Scepticism about the effectiveness of the policy is likely to represent the most significant barrier to public support. 
Findings also suggest that clearer

  12. A Method for Mapping Future Urbanization in the United States

    Directory of Open Access Journals (Sweden)

    Lahouari Bounoua

    2018-04-01

    Cities are poised to absorb additional people. Their sustainability, or ability to accommodate a population increase without depleting resources or compromising future growth, depends on whether they harness the efficiency gains from urban land management. Population is often projected as a bulk national number without details about spatial distribution. We use Landsat and population data in a methodology to project and map U.S. urbanization for the year 2020 and document its spatial pattern. This methodology is important to spatially disaggregate projected population and assist land managers to monitor land use, assess infrastructure and distribute resources. We found the U.S. west coast urban areas to have the fastest population growth with relatively small land consumption resulting in future decrease in per capita land use. Except for Miami (FL), most other U.S. large urban areas, especially in the Midwest, are growing spatially faster than their population and inadvertently consuming land needed for ecosystem services. In large cities, such as New York, Chicago, Houston and Miami, land development is expected more in suburban zones than urban cores. In contrast, in Los Angeles land development within the city core is greater than in its suburbs.

  13. Quaternary Geologic Map of the Lake Superior 4° x 6° Quadrangle, United States and Canada

    Data.gov (United States)

    Department of the Interior — The Quaternary Geologic Map of the Lake Superior 4° x 6° Quadrangle was mapped as part of the Quaternary Geologic Atlas of the United States. The atlas was begun as...

  14. Karst in the United States: a digital map compilation and database

    Science.gov (United States)

    Weary, David J.; Doctor, Daniel H.

    2014-01-01

    This report describes new digital maps delineating areas of the United States, including Puerto Rico and the U.S. Virgin Islands, having karst or the potential for development of karst and pseudokarst. These maps show areas underlain by soluble rocks and also by volcanic rocks, sedimentary deposits, and permafrost that have potential for karst or pseudokarst development. All 50 States contain rocks with potential for karst development, and about 18 percent of their area is underlain by soluble rocks having karst or the potential for development of karst features. The areas of soluble rocks shown are based primarily on selection from State geologic maps of rock units containing significant amounts of carbonate or evaporite minerals. Areas underlain by soluble rocks are further classified by general climate setting, degree of induration, and degree of exposure. Areas having potential for volcanic pseudokarst are those underlain chiefly by basaltic-flow rocks no older than Miocene in age. Areas with potential for pseudokarst features in sedimentary rocks are in relatively unconsolidated rocks from which pseudokarst features, such as piping caves, have been reported. Areas having potential for development of thermokarst features, mapped exclusively in Alaska, contain permafrost in relatively thick surficial deposits containing ground ice. This report includes a GIS database with links from the map unit polygons to online geologic unit descriptions.

  15. Quaternary allostratigraphy of surficial deposit map units at Yucca Mountain, Nevada: A progress report

    International Nuclear Information System (INIS)

    Lundstrom, S.C.; Wesling, J.R.; Swan, F.H.; Taylor, E.M.; Whitney, J.W.

    1993-01-01

    Surficial geologic mapping at Yucca Mountain, Nevada, is relevant to site characterization studies of paleoclimate, tectonics, erosion, flood hazards, and water infiltration. Alluvial, colluvial, and eolian allostratigraphic map units are defined on the basis of age-related surface characteristics and soil development, as well as lithology and sedimentology indicative of provenance and depositional mode. In gravelly alluvial units, which include interbedded debris flows, the authors observe a useful qualitative correlation between surface and soil properties. Map units of estimated middle Pleistocene age typically have a well-developed, varnished desert pavement, and minimal erosional and preserved depositional microrelief, associated with a soil with a reddened Bt horizon and stage 3 carbonate and silica morphology. Older units have greater erosional relief, an eroded argillic horizon and stage 4 carbonate morphology, whereas younger units have greater preservation of depositional morphology, but lack well-developed pavements, rock varnish, and Bt and Kqm soil horizons. Trench and gully-wall exposures show that alluvial, colluvial and eolian dominated surface units are underlain by multiple buried soils separating sedimentologically similar deposits; this stratigraphy increases the potential for understanding the long-term Quaternary paleoenvironmental history of Yucca Mountain. Age estimates for allostratigraphic units, presently based on uranium-trend dating and regional correlation using soil development, will be further constrained by ongoing dating studies that include tephra identification, uranium-series disequilibrium, and thermoluminescence methods

  16. Mapping variation in radon potential both between and within geological units

    International Nuclear Information System (INIS)

    Miles, J C H; Appleton, J D

    2005-01-01

    Previously, the potential for high radon levels in UK houses has been mapped either on the basis of grouping the results of radon measurements in houses by grid squares or by geological units. In both cases, lognormal modelling of the distribution of radon concentrations was applied to allow the estimated proportion of houses above the UK radon Action Level (AL, 200 Bq m⁻³) to be mapped. This paper describes a method of combining the grid square and geological mapping methods to give more accurate maps than either method can provide separately. The land area is first divided up using a combination of bedrock and superficial geological characteristics derived from digital geological map data. Each different combination of geological characteristics may appear at the land surface in many discontinuous locations across the country. HPA has a database of over 430 000 houses in which long-term measurements of radon concentration have been made, and whose locations are accurately known. Each of these measurements is allocated to the appropriate bedrock-superficial geological combination underlying it. Taking each geological combination in turn, the spatial variation of radon potential is mapped, treating the combination as if it were continuous over the land area. All of the maps of radon potential within different geological combinations are then combined to produce a map of variation in radon potential over the whole land surface
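
The lognormal modelling step described above can be sketched numerically: given a geometric mean (GM) and geometric standard deviation (GSD) of indoor radon for a mapping unit, the estimated proportion of homes above the 200 Bq m⁻³ Action Level follows from the lognormal survival function. A minimal sketch; the GM/GSD values are illustrative, not HPA data:

```python
import math

def fraction_above(gm, gsd, action_level=200.0):
    """Estimated fraction of homes exceeding the action level, assuming
    indoor radon is lognormally distributed with geometric mean gm and
    geometric standard deviation gsd (both in Bq/m^3)."""
    z = (math.log(action_level) - math.log(gm)) / math.log(gsd)
    # Standard normal survival function P(Z > z), via the complementary error function
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Illustrative mapping-unit parameters only (not from the paper):
print(f"{fraction_above(gm=40.0, gsd=3.0):.3f}")
```

Note that when the geometric mean equals the Action Level, exactly half the homes are estimated to exceed it, which is a useful sanity check on the formula.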

  17. The Effects of Data Gaps on the Calculated Monthly Mean Maximum and Minimum Temperatures in the Continental United States: A Spatial and Temporal Study.

    Science.gov (United States)

    Stooksbury, David E.; Idso, Craig D.; Hubbard, Kenneth G.

    1999-05-01

    Gaps in otherwise regularly scheduled observations are often referred to as missing data. This paper explores the spatial and temporal impacts that data gaps in the recorded daily maximum and minimum temperatures have on the calculated monthly mean maximum and minimum temperatures. For this analysis 138 climate stations from the United States Historical Climatology Network Daily Temperature and Precipitation Data set were selected. The selected stations had no missing maximum or minimum temperature values during the period 1951-80. The monthly mean maximum and minimum temperatures were calculated for each station for each month. For each month 1-10 consecutive days of data from each station were randomly removed. This was performed 30 times for each simulated gap period. The spatial and temporal impact of the 1-10-day data gaps were compared. The influence of data gaps is most pronounced in the continental regions during the winter and least pronounced in the southeast during the summer. In the north central plains, 10-day data gaps during January produce a standard deviation value greater than 2°C about the 'true' mean. In the southeast, 10-day data gaps in July produce a standard deviation value less than 0.5°C about the mean. The results of this study will be of value in climate variability and climate trend research as well as climate assessment and impact studies.
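
The simulation design described in this abstract (randomly deleting 1-10 consecutive days from a month of daily temperatures and repeating 30 times per gap length, then measuring the spread of the resulting monthly means) can be sketched as follows. The temperature series here is synthetic, not station data from the Historical Climatology Network:

```python
import random
import statistics

def gap_impact(daily_temps, gap_len, trials=30, seed=0):
    """Standard deviation of the monthly mean when gap_len consecutive
    daily values are removed at a random position; repeated `trials`
    times, mirroring the paper's 30 simulations per gap length."""
    rng = random.Random(seed)
    means = []
    for _ in range(trials):
        start = rng.randrange(len(daily_temps) - gap_len + 1)
        kept = daily_temps[:start] + daily_temps[start + gap_len:]
        means.append(statistics.fmean(kept))
    return statistics.stdev(means)

# Synthetic January-like daily maximum series (illustrative only):
rng = random.Random(42)
temps = [-5 + 10 * rng.random() for _ in range(31)]
print(round(gap_impact(temps, gap_len=10), 3))
```

Longer gaps generally widen the spread of the computed monthly mean about the true mean, which is the effect the paper quantifies regionally and seasonally.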

  18. Development of a new USDA plant hardiness zone map for the United States

    Science.gov (United States)

    C. Daly; M.P. Widrlechner; M.D. Halbleib; J.I. Smith; W.P. Gibson

    2012-01-01

    In many regions of the world, the extremes of winter cold are a major determinant of the geographic distribution of perennial plant species and of their successful cultivation. In the United States, the U.S. Department of Agriculture (USDA) Plant Hardiness Zone Map (PHZM) is the primary reference for defining geospatial patterns of extreme winter cold for the...

  19. Using the Large Fire Simulator System to map wildland fire potential for the conterminous United States

    Science.gov (United States)

    LaWen Hollingsworth; James Menakis

    2010-01-01

    This project mapped wildland fire potential (WFP) for the conterminous United States by using the large fire simulation system developed for Fire Program Analysis (FPA) System. The large fire simulation system, referred to here as LFSim, consists of modules for weather generation, fire occurrence, fire suppression, and fire growth modeling. Weather was generated with...

  20. A geomorphic approach to 100-year floodplain mapping for the Conterminous United States

    Science.gov (United States)

    Jafarzadegan, Keighobad; Merwade, Venkatesh; Saksena, Siddharth

    2018-06-01

    Floodplain mapping using hydrodynamic models is difficult in data scarce regions. Additionally, using hydrodynamic models to map floodplain over large stream network can be computationally challenging. Some of these limitations of floodplain mapping using hydrodynamic modeling can be overcome by developing computationally efficient statistical methods to identify floodplains in large and ungauged watersheds using publicly available data. This paper proposes a geomorphic model to generate probabilistic 100-year floodplain maps for the Conterminous United States (CONUS). The proposed model first categorizes the watersheds in the CONUS into three classes based on the height of the water surface corresponding to the 100-year flood from the streambed. Next, the probability that any watershed in the CONUS belongs to one of these three classes is computed through supervised classification using watershed characteristics related to topography, hydrography, land use and climate. The result of this classification is then fed into a probabilistic threshold binary classifier (PTBC) to generate the probabilistic 100-year floodplain maps. The supervised classification algorithm is trained by using the 100-year Flood Insurance Rated Maps (FIRM) from the U.S. Federal Emergency Management Agency (FEMA). FEMA FIRMs are also used to validate the performance of the proposed model in areas not included in the training. Additionally, HEC-RAS model generated flood inundation extents are used to validate the model performance at fifteen sites that lack FEMA maps. Validation results show that the probabilistic 100-year floodplain maps, generated by proposed model, match well with both FEMA and HEC-RAS generated maps. On average, the error of predicted flood extents is around 14% across the CONUS. 
The high accuracy of the validation results shows the reliability of the geomorphic model as an alternative approach for fast and cost effective delineation of 100-year floodplains for the CONUS.

  1. Minimum energy requirements for desalination of brackish groundwater in the United States with comparison to international datasets

    Science.gov (United States)

    Ahdab, Yvana D.; Thiel, Gregory P.; Böhlke, John Karl; Stanton, Jennifer S.; Lienhard, John H.

    2018-01-01

    This paper uses chemical and physical data from a large 2017 U.S. Geological Survey groundwater dataset with wells in the U.S. and three smaller international groundwater datasets with wells primarily in Australia and Spain to carry out a comprehensive investigation of brackish groundwater composition in relation to minimum desalination energy costs. First, we compute the site-specific least work required for groundwater desalination. Least work of separation represents a baseline for specific energy consumption of desalination systems. We develop simplified equations based on the U.S. data for least work as a function of water recovery ratio and a proxy variable for composition, either total dissolved solids, specific conductance, molality or ionic strength. We show that the U.S. correlations for total dissolved solids and molality may be applied to the international datasets. We find that total molality can be used to calculate the least work of dilute solutions with very high accuracy. Then, we examine the effects of groundwater solute composition on minimum energy requirements, showing that separation requirements increase from calcium to sodium for cations and from sulfate to bicarbonate to chloride for anions, for any given TDS concentration. We study the geographic distribution of least work, total dissolved solids, and major ions concentration across the U.S. We determine areas with both low least work and high water stress in order to highlight regions holding potential for desalination to decrease the disparity between high water demand and low water supply. Finally, we discuss the implications of the USGS results on water resource planning, by comparing least work to the specific energy consumption of brackish water reverse osmosis plants and showing the scaling propensity of major electrolytes and silica in the U.S. groundwater samples.
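
The least work of separation as a function of recovery ratio has a simple closed form under textbook idealizations. A hedged sketch, assuming an ideal van 't Hoff solution, complete salt rejection, and NaCl-equivalent TDS; this is a generic approximation, not the property model used in the paper:

```python
import math

R = 8.314462  # universal gas constant, J/(mol*K)

def least_work_per_m3(tds_mg_per_L, recovery, T=298.15, molar_mass=58.44, ions=2):
    """Least work (J per m^3 of product water) at recovery ratio r, for an
    ideal van 't Hoff feed with full salt rejection:
        w = pi_feed * ln(1/(1-r)) / r,  pi_feed = i*c*R*T.
    TDS is treated as NaCl-equivalent (molar_mass 58.44 g/mol, i = 2)."""
    c = tds_mg_per_L / molar_mass  # mol/m^3, since mg/L == g/m^3
    pi_feed = ions * c * R * T     # feed osmotic pressure, Pa
    return pi_feed * math.log(1.0 / (1.0 - recovery)) / recovery

# Brackish feed of 3,000 mg/L (as NaCl) at 50% recovery, in kWh/m^3:
w = least_work_per_m3(3000.0, 0.5) / 3.6e6
print(round(w, 3))
```

As recovery approaches zero, the expression reduces to the feed osmotic pressure, the familiar lower bound for extracting an infinitesimal amount of pure water.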

  2. 76 FR 7096 - Minimum Quality and Handling Standards for Domestic and Imported Peanuts Marketed in the United...

    Science.gov (United States)

    2011-02-09

    ... DEPARTMENT OF AGRICULTURE Agricultural Marketing Service 7 CFR Part 996 [Doc. No. AMS-FV-10-0030... in the United States; Section 610 Review AGENCY: Agricultural Marketing Service, USDA. ACTION: Confirmation of regulations. SUMMARY: This action summarizes the results under the criteria contained in...

  3. Geochemical landscapes of the conterminous United States; new map presentations for 22 elements

    Science.gov (United States)

    Gustavsson, N.; Bolviken, B.; Smith, D.B.; Severson, R.C.

    2001-01-01

    Geochemical maps of the conterminous United States have been prepared for seven major elements (Al, Ca, Fe, K, Mg, Na, and Ti) and 15 trace elements (As, Ba, Cr, Cu, Hg, Li, Mn, Ni, Pb, Se, Sr, V, Y, Zn, and Zr). The maps are based on an ultra low-density geochemical survey consisting of 1,323 samples of soils and other surficial materials collected from approximately 1960-1975. The data were published by Boerngen and Shacklette (1981) and black-and-white point-symbol geochemical maps were published by Shacklette and Boerngen (1984). The data have been reprocessed using weighted-median and Bootstrap procedures for interpolation and smoothing.
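
The weighted-median and bootstrap smoothing mentioned above can be illustrated with a toy distance-weighted median estimate at a single grid node, with a bootstrap resampling spread. This is a generic sketch, not the authors' actual procedure, and the sample coordinates and values are invented:

```python
import math
import random
import statistics

def weighted_median(pairs):
    """Median of (value, weight) pairs by the cumulative-weight rule."""
    pairs = sorted(pairs)
    total = sum(w for _, w in pairs)
    acc = 0.0
    for v, w in pairs:
        acc += w
        if acc >= total / 2.0:
            return v
    return pairs[-1][0]

def smooth_at_node(node, samples, power=2.0, n_boot=200, seed=0):
    """Inverse-distance-weighted median at a grid node, plus a bootstrap
    spread from resampling the neighbouring observations."""
    rng = random.Random(seed)
    x0, y0 = node
    pairs = [
        (v, 1.0 / (math.hypot(x - x0, y - y0) ** power + 1e-9))
        for (x, y, v) in samples
    ]
    est = weighted_median(pairs)
    boots = [
        weighted_median([pairs[rng.randrange(len(pairs))] for _ in pairs])
        for _ in range(n_boot)
    ]
    return est, statistics.stdev(boots)

# Invented soil-sample concentrations at (x, y) locations:
samples = [(0, 0, 10.0), (1, 0, 12.0), (0, 1, 11.0), (2, 2, 30.0)]
est, spread = smooth_at_node((0.5, 0.5), samples)
print(est)
```

The median's resistance to outliers (the distant value 30.0 barely moves the estimate) is the usual motivation for median-based smoothing of sparse, skewed geochemical data.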

  4. Quaternary geologic map of the Austin 4° x 6° quadrangle, United States

    Science.gov (United States)

    State compilations by Moore, David W.; Wermund, E.G.; edited and integrated by Moore, David W.; Richmond, Gerald Martin; Christiansen, Ann Coe; Bush, Charles A.

    1993-01-01

    This map is part of the Quaternary Geologic Atlas of the United States (I-1420). It was first published as a printed edition in 1993. The geologic data have now been captured digitally and are presented here along with images of the printed map sheet and component parts as PDF files. The Quaternary Geologic Map of the Austin 4° x 6° Quadrangle was mapped as part of the Quaternary Geologic Atlas of the United States. The atlas was begun as an effort to depict the areal distribution of surficial geologic deposits and other materials that accumulated or formed during the past 2+ million years, the period that includes all activities of the human species. These materials are at the surface of the Earth. They make up the ground on which we walk, the dirt in which we dig foundations, and the soil in which we grow crops. Most of our human activity is related in one way or another to these surface materials that are referred to collectively by many geologists as regolith, the mantle of fragmental and generally unconsolidated material that overlies the bedrock foundation of the continent. The maps were compiled at 1:1,000,000 scale. In recent years, surficial deposits and materials have become the focus of much interest by scientists, environmentalists, governmental agencies, and the general public. They are the foundations of ecosystems, the materials that support plant growth and animal habitat, and the materials through which travels much of the water required for our agriculture, our industry, and our general well being. They also are materials that easily can become contaminated by pesticides, fertilizers, and toxic wastes. In this context, the value of the surficial geologic map is evident.

  5. Harms to 'others' from alcohol consumption in the minimum unit pricing policy debate: a qualitative content analysis of U.K. newspapers (2005-12).

    Science.gov (United States)

    Wood, Karen; Patterson, Chris; Katikireddi, Srinivasa Vittal; Hilton, Shona

    2014-04-01

    Minimum unit pricing is a fiscal intervention intended to tackle the social and health harms from alcohol to individual drinkers and wider society. This paper presents the first large-scale qualitative examination of how newsprint media framed the debate around the harms of alcohol consumption to 'others' during the development and passing of minimum unit pricing legislation in Scotland. Qualitative content analysis was conducted on seven U.K. and three Scottish national newspapers between 1 January 2005 and 30 June 2012. Relevant articles were identified using the electronic databases Nexis U.K. and Newsbank. A total of 403 articles focused on the harms of alcohol consumption to 'others' and were eligible for detailed coding and analysis. Alcohol harms to wider society and communities were identified as being a worsening issue increasingly affecting everyone through shared economic costs, social disorder, crime and violence. The availability of cheap alcohol was blamed, alongside a minority of 'problem' youth binge drinkers. The harm caused to families was less widely reported. If news reporting encourages the public to perceive the harms caused by alcohol to wider society as having reached crisis point, a population-based intervention may be deemed necessary and acceptable. However, the current focus in news reports on youth binge drinkers may be masking the wider issue of overconsumption across the broader population. © 2013 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.

  6. Non-Markovianity Measure Based on Brukner-Zeilinger Invariant Information for Unital Quantum Dynamical Maps

    Science.gov (United States)

    He, Zhi; Zhu, Lie-Qiang; Li, Li

    2017-03-01

    A non-Markovianity measure based on Brukner-Zeilinger invariant information is proposed to characterize the non-Markovian effects of open systems undergoing unital dynamical maps. The method takes advantage of the non-increasing property of the Brukner-Zeilinger invariant information under completely positive and trace-preserving unital maps. An advantage of the proposed measure is the simplicity of computing the Brukner-Zeilinger invariant information, which depends mainly on the purity of the quantum state. The measure effectively captures the characteristics of non-Markovianity of unital dynamical maps. As concrete applications, we consider two typical non-Markovian noise channels, i.e., the phase damping channel and the random unitary channel, to show the sensitivity of the proposed measure. By investigation, we find that the conditions for detecting non-Markovianity in the phase damping channel are consistent with the results of existing measures of non-Markovianity, i.e., information flow, divisibility and quantum mutual information. However, for the random unitary channel the non-Markovian conditions are the same as those of the information flow, but differ from those of the divisibility and quantum mutual information. Supported by the National Natural Science Foundation of China under Grant No. 61505053, the Natural Science Foundation of Hunan Province under Grant No. 2015JJ3092, the Research Foundation of Education Bureau of Hunan Province, China under Grant No. 16B177, the School Foundation from the Hunan University of Arts and Science under Grant No. 14ZD01
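    The purity dependence noted in the abstract can be sketched numerically. A minimal sketch, assuming a standard Kraus parametrization of phase damping and one common normalization of the invariant information (neither taken from the paper):

```python
import numpy as np

def phase_damping(rho, p):
    """Apply a phase damping channel (a unital CPTP map) with probability p."""
    K0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1.0 - p)]])
    K1 = np.array([[0.0, 0.0], [0.0, np.sqrt(p)]])
    return K0 @ rho @ K0.T + K1 @ rho @ K1.T

def bz_invariant_information(rho):
    """Brukner-Zeilinger invariant information, up to normalization:
    I(rho) = Tr(rho^2) - 1/d, non-increasing under unital CPTP maps."""
    d = rho.shape[0]
    return float(np.real(np.trace(rho @ rho))) - 1.0 / d

plus = np.array([[0.5, 0.5], [0.5, 0.5]])  # |+><+|, maximally dephasing-sensitive
info = [bz_invariant_information(phase_damping(plus, p)) for p in (0.0, 0.3, 0.6, 0.9)]
# Monotone decrease here is the Markovian signature; a revival of I(rho) along a
# time-parametrized evolution would signal non-Markovianity under this measure.
```

    For this dephasing family the information works out to 0.5(1 - p), so it only decays as p grows.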

  7. Non-Markovianity Measure Based on Brukner–Zeilinger Invariant Information for Unital Quantum Dynamical Maps

    International Nuclear Information System (INIS)

    He Zhi; Zhu Lie-Qiang; Li Li

    2017-01-01

    A non-Markovianity measure based on Brukner–Zeilinger invariant information is proposed to characterize the non-Markovian effects of open systems undergoing unital dynamical maps. The method takes advantage of the non-increasing property of the Brukner–Zeilinger invariant information under completely positive and trace-preserving unital maps. An advantage of the proposed measure is the simplicity of computing the Brukner–Zeilinger invariant information, which depends mainly on the purity of the quantum state. The measure effectively captures the characteristics of non-Markovianity of unital dynamical maps. As concrete applications, we consider two typical non-Markovian noise channels, i.e., the phase damping channel and the random unitary channel, to show the sensitivity of the proposed measure. By investigation, we find that the conditions for detecting non-Markovianity in the phase damping channel are consistent with the results of existing measures of non-Markovianity, i.e., information flow, divisibility and quantum mutual information. However, for the random unitary channel the non-Markovian conditions are the same as those of the information flow, but differ from those of the divisibility and quantum mutual information. (paper)

  8. Documentation for the 2014 update of the United States national seismic hazard maps

    Science.gov (United States)

    Petersen, Mark D.; Moschetti, Morgan P.; Powers, Peter M.; Mueller, Charles S.; Haller, Kathleen M.; Frankel, Arthur D.; Zeng, Yuehua; Rezaeian, Sanaz; Harmsen, Stephen C.; Boyd, Oliver S.; Field, Edward; Chen, Rui; Rukstales, Kenneth S.; Luco, Nico; Wheeler, Russell L.; Williams, Robert A.; Olsen, Anna H.

    2014-01-01

    The national seismic hazard maps for the conterminous United States have been updated to account for new methods, models, and data that have been obtained since the 2008 maps were released (Petersen and others, 2008). The input models are improved from those implemented in 2008 by using new ground motion models that have incorporated about twice as many earthquake strong ground shaking data and by incorporating many additional scientific studies that indicate broader ranges of earthquake source and ground motion models. These time-independent maps are shown for 2-percent and 10-percent probability of exceedance in 50 years for peak horizontal ground acceleration as well as 5-hertz and 1-hertz spectral accelerations with 5-percent damping on a uniform firm rock site condition (760 meters per second shear wave velocity in the upper 30 m, VS30). In this report, the 2014 updated maps are compared with the 2008 version of the maps and indicate changes of plus or minus 20 percent over wide areas, with larger changes locally, caused by the modifications to the seismic source and ground motion inputs.
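    The two hazard levels quoted above correspond to mean return periods under the Poisson (time-independent) occurrence assumption such maps use; a quick check of the standard conversion:

```python
import math

def return_period(p_exceed, t_years):
    """Mean return period T implied by probability p of at least one
    exceedance in t years, assuming Poissonian occurrence:
    p = 1 - exp(-t/T)  =>  T = -t / ln(1 - p)."""
    return -t_years / math.log(1.0 - p_exceed)

return_period(0.02, 50)  # ~2475-year ground motion (2% in 50 years)
return_period(0.10, 50)  # ~475-year ground motion (10% in 50 years)
```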

  9. Floodplain Mapping for the Continental United States Using Machine Learning Techniques and Watershed Characteristics

    Science.gov (United States)

    Jafarzadegan, K.; Merwade, V.; Saksena, S.

    2017-12-01

    Using conventional hydrodynamic methods for floodplain mapping in large-scale and data-scarce regions is problematic due to the high cost of these methods, lack of reliable data and uncertainty propagation. In this study a new framework is proposed to generate 100-year floodplains for any gauged or ungauged watershed across the United States (U.S.). This framework uses Flood Insurance Rate Maps (FIRMs) and topographic, climatic and land use data, which are freely available for the entire U.S., for floodplain mapping. The framework consists of three components: a Random Forest classifier for watershed classification, a Probabilistic Threshold Binary Classifier (PTBC) for generating the floodplains, and a lookup table for linking the Random Forest classifier to the PTBC. The effectiveness and reliability of the proposed framework are tested on 145 watersheds from various geographical locations in the U.S. The validation results show that around 80 percent of the watersheds are predicted well, 14 percent have an acceptable fit and fewer than five percent are predicted poorly compared to FIRMs. Another advantage of this framework is its ability to generate floodplains for all small rivers and tributaries. Due to its high accuracy and efficiency, this framework can be used as a preliminary decision-making tool to generate 100-year floodplain maps for data-scarce regions and all tributaries where hydrodynamic methods are difficult to use.
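    The thresholding step can be illustrated with a toy stand-in. This sketch assumes a height-above-nearest-drainage (HAND) terrain predictor and a hypothetical 5 m threshold; the actual PTBC derives probabilistic, watershed-class-specific thresholds via the lookup table:

```python
import numpy as np

def threshold_floodplain(hand_m, threshold_m):
    """Mark cells as inside the 100-year floodplain when their height above
    nearest drainage (metres) is at or below a watershed-specific threshold.
    Illustrative stand-in only, not the paper's calibrated classifier."""
    return hand_m <= threshold_m

hand = np.array([0.5, 2.0, 7.5, 12.0])   # hypothetical HAND values (m)
mask = threshold_floodplain(hand, 5.0)   # [True, True, False, False]
```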

  10. Mapping landscape units in Galicia (Spain): A first step for assessment and management?

    Directory of Open Access Journals (Sweden)

    Corbelle-Rico Eduardo

    2017-12-01

    In the beginning of 2015, the Regional Administration of Galicia (NW Spain) set the requirements for a map of landscape units: it had to be produced in less than 3 months, it should cover the whole territory of the region (29,574 km²), and it should be useful for management at a scale of 1:25,000. With these objectives in mind, we proposed a semi-automatic mapping methodology entirely based on the use of free software (GRASS GIS) and already available cartographic information. Semi-automatic classification of different land-use patterns was at the heart of the proposed process. Consultation with experts of different academic backgrounds took place throughout the project. This consultation process allowed us to identify both problems and opportunities. As could be expected, the diverse epistemic community represented by the expert panel meant that one of the main challenges was to reach consensus on the understanding of the concept of landscape and the decisions leading to the mapping methodology proposed in this paper. This initiated a very interesting debate that, in our view, centred around three main issues: the approach to the landscape, the purpose of the mapping exercise, and the ability to include subjectivity in the analysis.

  11. Dose mapping in working space of KORI unit 1 using MCNPX code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, C. W.; Shin, C. H.; Kim, J. G. [Hanyang University, Seoul (Korea, Republic of); Kim, S. Y. [Innovative Techonology Center for Radiation Safety, Seoul (Korea, Republic of)

    2004-07-01

    Radiation field analysis in nuclear power plants mainly depends on actual measurements. In this study, a computational analysis is performed to overcome the limits of measurement and to provide initial information for unfolding. Radiation field mapping is performed, which makes it possible to analyze the trends of the radiation field over the whole space. Using the MCNPX code, the containment building interior is modeled for KORI unit 1 cycle 21 under operation. Applying the neutron spectrum from the operating reactor as a radiation source, the ambient doses are calculated throughout the containment building interior for neutron and photon fields. Dose mapping is performed for three spaces, 6-20, 20-44, and 44-70 ft from the bottom of the containment building. The radiation distribution in the dose maps shows the effects of the structures and materials of components. With these dose maps, the radiation field analysis covers the region near the detector position. Analysis and prediction are possible for radiation fields from other radiation sources or operating cycles.

  12. Vested Interests in Addiction Research and Policy. The challenge corporate lobbying poses to reducing society’s alcohol problems: insights from UK evidence on minimum unit pricing

    Science.gov (United States)

    McCambridge, Jim; Hawkins, Benjamin; Holden, Chris

    2014-01-01

    Background There has been insufficient research attention to alcohol industry methods of influencing public policies. With the exception of the tobacco industry, there have been few studies of the impact of corporate lobbying on public health policymaking more broadly. Methods We summarize here findings from documentary analyses and interview studies in an integrative review of corporate efforts to influence UK policy on minimum unit pricing (MUP) of alcohol 2007–10. Results Alcohol producers and retailers adopted a long-term, relationship-building approach to policy influence, in which personal contacts with key policymakers were established and nurtured, including when they were not in government. The alcohol industry was successful in achieving access to UK policymakers at the highest levels of government and at all stages of the policy process. Within the United Kingdom, political devolution and the formation for the first time of a Scottish National Party (SNP) government disrupted the existing long-term strategy of alcohol industry actors and created the conditions for evidence-based policy innovations such as MUP. Conclusions Comparisons between policy communities within the United Kingdom and elsewhere are useful to the understanding of how different policy environments are amenable to influence through lobbying. Greater transparency in how policy is made is likely to lead to more effective alcohol and other public policies globally by constraining the influence of vested interests. PMID:24261642

  13. Vested interests in addiction research and policy. The challenge corporate lobbying poses to reducing society's alcohol problems: insights from UK evidence on minimum unit pricing.

    Science.gov (United States)

    McCambridge, Jim; Hawkins, Benjamin; Holden, Chris

    2014-02-01

    There has been insufficient research attention to alcohol industry methods of influencing public policies. With the exception of the tobacco industry, there have been few studies of the impact of corporate lobbying on public health policymaking more broadly. We summarize here findings from documentary analyses and interview studies in an integrative review of corporate efforts to influence UK policy on minimum unit pricing (MUP) of alcohol 2007-10. Alcohol producers and retailers adopted a long-term, relationship-building approach to policy influence, in which personal contacts with key policymakers were established and nurtured, including when they were not in government. The alcohol industry was successful in achieving access to UK policymakers at the highest levels of government and at all stages of the policy process. Within the United Kingdom, political devolution and the formation for the first time of a Scottish National Party (SNP) government disrupted the existing long-term strategy of alcohol industry actors and created the conditions for evidence-based policy innovations such as MUP. Comparisons between policy communities within the United Kingdom and elsewhere are useful to the understanding of how different policy environments are amenable to influence through lobbying. Greater transparency in how policy is made is likely to lead to more effective alcohol and other public policies globally by constraining the influence of vested interests. ©2013 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of The Society for the Study of Addiction.

  14. Classification of hyperspectral imagery using MapReduce on a NVIDIA graphics processing unit (Conference Presentation)

    Science.gov (United States)

    Ramirez, Andres; Rahnemoonfar, Maryam

    2017-04-01

    A hyperspectral image provides a multidimensional representation rich in data, consisting of hundreds of spectral dimensions. Analyzing the spectral and spatial information of such an image with linear and non-linear algorithms results in high computational time. In order to overcome this problem, this research presents a system using a MapReduce-Graphics Processing Unit (GPU) model that can help analyze a hyperspectral image through the use of parallel hardware and a parallel programming model, which is simpler to handle compared to other low-level parallel programming models. Additionally, Hadoop was used as an open-source version of the MapReduce parallel programming model. This research compared classification accuracy and timing results between the Hadoop and GPU systems and tested them against the following test cases: a CPU-and-GPU test case, a CPU-only test case, and a test case where no dimensionality reduction was applied.
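    The MapReduce pattern itself, independent of Hadoop or the GPU back end, can be sketched in a few lines; the one-band classifier here is purely hypothetical:

```python
from collections import Counter
from functools import reduce

def map_phase(pixels, classify):
    """Map step: emit a (predicted_class, 1) pair per pixel."""
    return [(classify(p), 1) for p in pixels]

def reduce_phase(pairs):
    """Reduce step: sum the emitted counts per class key."""
    return reduce(lambda acc, kv: acc + Counter({kv[0]: kv[1]}), pairs, Counter())

classify = lambda p: "vegetation" if p > 0.5 else "soil"  # hypothetical rule
counts = reduce_phase(map_phase([0.9, 0.2, 0.7], classify))
# counts == Counter({'vegetation': 2, 'soil': 1})
```

    In Hadoop the map and reduce phases run as distributed tasks, but the per-record contract is the same.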

  15. Are Alcohol Taxation and Pricing Policies Regressive? Product-Level Effects of a Specific Tax and a Minimum Unit Price for Alcohol.

    Science.gov (United States)

    Vandenberg, Brian; Sharma, Anurag

    2016-07-01

    To compare estimated effects of two policy alternatives, (i) a minimum unit price (MUP) for alcohol and (ii) specific (per-unit) taxation, upon current product prices, per capita spending (A$), and per capita consumption by income quintile, consumption quintile and product type. Estimation of baseline spending and consumption, and modelling policy-to-price and price-to-consumption effects of policy changes using scanner data from a panel of demographically representative Australian households that includes product-level details of their off-trade alcohol spending (n = 885; total observations = 12,505). Robustness checks include alternative price elasticities, tax rates, minimum price thresholds and tax pass-through rates. Current alcohol taxes and alternative taxation and pricing policies are not highly regressive. Any regressive effects are small and concentrated among heavy consumers. The lowest-income consumers currently spend a larger proportion of income (2.3%) on alcohol taxes than the highest-income consumers (0.3%), but the mean amount is small in magnitude [A$5.50 per week (95%CI: 5.18-5.88)]. Both a MUP and specific taxation will have some regressive effects, but the effects are limited, as they are greatest for the heaviest consumers, irrespective of income. Among the policy alternatives, a MUP is more effective in reducing consumption than specific taxation, especially for consumers in the lowest-income quintile: an estimated mean per capita reduction of 11.9 standard drinks per week (95%CI: 11.3-12.6). Policies that increase the cost of the cheapest alcohol can be effective in reducing alcohol consumption, without having highly regressive effects. © The Author 2015. Medical Council on Alcohol and Oxford University Press. All rights reserved.
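    The mechanics of a MUP, and why it bears only on the cheapest products, can be shown with a toy calculation; the A$1.00-per-standard-drink floor below is a hypothetical rate, not one the paper models:

```python
def price_under_mup(shelf_price, std_drinks, mup_per_drink):
    """Shelf price after a minimum unit price takes effect: a container may
    not sell for less than (standard drinks it holds) x (floor per drink)."""
    return max(shelf_price, std_drinks * mup_per_drink)

# A cheap 30-drink cask at A$15 is pushed up by the floor...
price_under_mup(15.0, 30, 1.0)   # -> 30.0
# ...while a premium bottle already above it is untouched.
price_under_mup(40.0, 8, 1.0)    # -> 40.0
```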

  16. Combining forest inventory, satellite remote sensing, and geospatial data for mapping forest attributes of the conterminous United States

    Science.gov (United States)

    Mark Nelson; Greg Liknes; Charles H. Perry

    2009-01-01

    Analysis and display of forest composition, structure, and pattern provides information for a variety of assessments and management decision support. The objective of this study was to produce geospatial datasets and maps of conterminous United States forest land ownership, forest site productivity, timberland, and reserved forest land. Satellite image-based maps of...

  17. Utilizing Multi-Sensor Fire Detections to Map Fires in the United States

    Science.gov (United States)

    Howard, S. M.; Picotte, J. J.; Coan, M. J.

    2014-11-01

    In 2006, the Monitoring Trends in Burn Severity (MTBS) project began as a cooperative effort between the US Forest Service (USFS) and the U.S. Geological Survey (USGS) to map and assess burn severity for all large fires that have occurred in the United States since 1984. Using Landsat imagery, MTBS is mandated to map wildfires and prescribed fires that meet specific size criteria: greater than 1000 acres in the west and 500 acres in the east, regardless of ownership. Relying mostly on federal and state fire occurrence records, over 15,300 individual fires have been mapped. While mapping recorded fires, an additional 2,700 "unknown" or undocumented fires were discovered and assessed. It has become apparent that there are perhaps thousands of undocumented fires in the US that are yet to be mapped. Fire occurrence records alone are inadequate if MTBS is to provide a comprehensive accounting of fire across the US. Additionally, the sheer number of fires to assess has overwhelmed current manual procedures. To address these problems, the National Aeronautics and Space Administration (NASA) Applied Sciences Program is helping to fund the efforts of the USGS and its MTBS partners (USFS, National Park Service) to develop and implement a system to automatically identify fires using satellite data. In near real time, the USGS will combine active fire satellite detections from the MODIS, AVHRR and GOES satellites with Landsat acquisitions. Newly acquired Landsat imagery will be routinely scanned to identify freshly burned area pixels, derive an initial perimeter and tag the burned area with the satellite date and time of detection. Landsat imagery from the early archive will be scanned to identify undocumented fires. Additional automated fire assessment processes will be developed. The USGS will develop these processes using open-source software packages in order to provide freely available tools to local land managers, giving them the capability to assess fires at the local level.

  18. MAPPING GLAUCONITE UNITS USING REMOTE SENSING TECHNIQUES IN NORTH EAST OF IRAN

    Directory of Open Access Journals (Sweden)

    R. Ahmadirouhani

    2014-10-01

    Glauconite is a greenish ferric-iron silicate mineral with a micaceous structure, characteristically formed in shallow marine environments. Glauconite has been used as a pigmentation agent for oil paint, as a contaminant remover in environmental studies, as a source of potassium in plant fertilizers, and in other industries. The Koppeh-dagh basin extends across Iran, Afghanistan and Turkmenistan, and glauconite units exist in this basin. In this research, to enhance and map glauconitic units in the Koppeh-dagh structural zone in northeast Iran, remote sensing techniques such as Spectral Angle Mapper (SAM) classification, band ratio and band composition methods were applied to SPOT, ASTER and Landsat data in three steps.
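    The SAM rule referenced above compares each pixel spectrum to a reference by angle rather than by distance, which makes it insensitive to overall brightness; a minimal implementation (the three-band reference spectrum is a hypothetical placeholder):

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral Angle Mapper distance: the angle (radians) between a pixel
    spectrum and a reference spectrum, arccos(<p,r> / (|p| |r|))."""
    cos_t = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos_t, -1.0, 1.0)))

ref = np.array([0.2, 0.4, 0.6])          # hypothetical glauconite reference
angle = spectral_angle(2.5 * ref, ref)   # uniform scaling leaves the angle at 0.0
```

    A pixel is assigned to the class whose reference spectrum yields the smallest angle, typically subject to a maximum-angle cutoff.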

  19. Quaternary Geologic Map of the Lake of the Woods 4 Degrees x 6 Degrees Quadrangle, United States and Canada

    Science.gov (United States)

    Sado, Edward V.; Fullerton, David S.; Goebel, Joseph E.; Ringrose, Susan M.; Edited and Integrated by Fullerton, David S.

    1995-01-01

    The Quaternary Geologic Map of the Lake of the Woods 4 deg x 6 deg Quadrangle, United States and Canada, was mapped as part of the U.S. Geological Survey Quaternary Geologic Atlas of the United States map series (Miscellaneous Investigations Series I-1420, NM-15). The atlas was begun as an effort to depict the areal distribution of surficial geologic deposits and other materials that accumulated or formed during the past 2+ million years, the period that includes all activities of the human species. These materials are at the surface of the earth. They make up the 'ground' on which we walk, the 'dirt' in which we dig foundations, and the 'soil' in which we grow crops. Most of our human activity is related in one way or another to these surface materials that are referred to collectively by many geologists as regolith, the mantle of fragmental and generally unconsolidated material that overlies the bedrock foundation of the continent. The maps were compiled at 1:1,000,000 scale. This map is a product of collaboration of the Ontario Geological Survey, the Minnesota Geological Survey, the Manitoba Department of Energy and Mines, and the U.S. Geological Survey, and is designed for both scientific and practical purposes. It was prepared in two stages. First, separate maps and map explanations were prepared by the compilers. Second, the maps were combined, integrated, and supplemented by the editor. Map unit symbols were revised to a uniform system of classification and the map unit descriptions were prepared by the editor from information received from the compilers and from additional sources listed under Sources of Information. Diagrams accompanying the map were prepared by the editor. 
For scientific purposes, the map differentiates Quaternary surficial deposits on the basis of lithology or composition, texture or particle size, structure, genesis, stratigraphic relationships, engineering geologic properties, and relative age, as shown on the correlation diagram and

  20. An updated stress map of the continental United States reveals heterogeneous intraplate stress

    Science.gov (United States)

    Levandowski, Will; Herrmann, Robert B.; Briggs, Rich; Boyd, Oliver; Gold, Ryan

    2018-06-01

    Knowledge of the state of stress in Earth's crust is key to understanding the forces and processes responsible for earthquakes. Historically, low rates of natural seismicity in the central and eastern United States have complicated efforts to understand intraplate stress, but recent improvements in seismic networks and the spread of human-induced seismicity have greatly improved data coverage. Here, we compile a nationwide stress map based on formal inversions of focal mechanisms that challenges the idea that deformation in continental interiors is driven primarily by broad, uniform stress fields derived from distant plate boundaries. Despite plate-boundary compression, extension dominates roughly half of the continent, and second-order forces related to lithospheric structure appear to control extension directions. We also show that the states of stress in several active eastern United States seismic zones differ significantly from those of surrounding areas and that these anomalies cannot be explained by transient processes, suggesting that earthquakes are focused by persistent, locally derived sources of stress. Such spatially variable intraplate stress appears to justify the current, spatially variable estimates of seismic hazard. Future work to quantify sources of stress, stressing-rate magnitudes and their relationship with strain and earthquake rates could allow prospective mapping of intraplate hazard.

  1. Mapping the world: cartographic and geographic visualization by the United Nations Geospatial Information Section (formerly Cartographic Section)

    Science.gov (United States)

    Kagawa, Ayako; Le Sourd, Guillaume

    2018-05-01

    Within the United Nations Secretariat, mapping activities began in 1946, and by 1951 the need for maps had increased and an office with a team of cartographers was established. Since then, with the development of technologies including the internet, remote sensing, unmanned aerial systems, relational database management and information systems, geospatial information provides an ever-increasing variety of support to the work of the Organization for planning of operations, decision-making and monitoring of crises. However, the need for maps has remained intact. This presentation aims to highlight some of the cartographic representation styles over the decades by reviewing the evolution of selected maps by the office, and noting the changing cognitive and semiotic aspects of cartographic and geographic visualization required by the United Nations. Through presentation and analysis of these maps, the changing dynamics of the Organization in information management can be reflected, with a reminder of the continuing and expanding deconstructionist role of the cartographer, now the geospatial information management expert.

  2. Implications for alcohol minimum unit pricing advocacy: what can we learn for public health from UK newsprint coverage of key claim-makers in the policy debate?

    Science.gov (United States)

    Hilton, Shona; Wood, Karen; Patterson, Chris; Katikireddi, Srinivasa Vittal

    2014-02-01

    On May 24th 2012, Scotland passed the Alcohol (Minimum Pricing) Bill. Minimum unit pricing (MUP) is an intervention that raises the price of the cheapest alcohol to reduce alcohol consumption and related harms. There is a growing literature on industry's influence in policymaking and media representations of policies, but relatively little about frames used by key claim-makers in the public MUP policy debate. This study elucidates the dynamic interplay between key claim-makers to identify lessons for policy advocacy in the media in the UK and internationally. Content analysis was conducted on 262 articles from seven UK and three Scottish national newspapers between 1st May 2011 and 31st May 2012, retrieved from electronic databases. Advocates' and critics' constructions of the alcohol problem and MUP were examined. Advocates depicted the problem as primarily driven by cheap alcohol and marketing, while critics' constructions focused on youth binge drinkers and dependent drinkers. Advocates justified support by citing the intervention's targeted design, but critics denounced the policy as illegal, likely to encourage illicit trade, unsupported by evidence and likely to be ineffective, while harming the responsible majority, low-income consumers and businesses. Critics' arguments were consistent over time, and single statements often encompassed multiple rationales. This study presents advocates with several important lessons for promoting policies in the media. Firstly, it may be useful to shift focus away from young binge drinkers and heavy drinkers, towards population-level over-consumption. Secondly, advocates might focus on presenting the policy as part of a wider package of alcohol policies. 
Thirdly, emphasis on the success of recent public health policies could help portray the UK and Scotland as world leaders in tackling culturally embedded health and social problems through policy; highlighting past successes when presenting future policies may be a valuable

  3. Geologic Map of the Derain (H-10) Quadrangle on Mercury: The Challenges of Consistently Mapping the Intercrater Plains Unit

    Science.gov (United States)

    Whitten, J. L.; Fassett, C. I.; Ostrach, L. R.

    2018-06-01

    We present the initial mapping of the H-10 quadrangle on Mercury, a region that was imaged for the first time by MESSENGER. The geologic map will assist with further characterization of the intercrater plains and their possible formation mechanism(s).

  4. Quaternary Geologic Map of the Lake Nipigon 4 Degrees x 6 Degrees Quadrangle, United States and Canada

    Science.gov (United States)

    Sado, Edward V.; Fullerton, David S.; Farrand, William R.; Edited and Integrated by Fullerton, David S.

    1994-01-01

    The Quaternary Geologic Map of the Lake Nipigon 4 degree x 6 degree Quadrangle was mapped as part of the Quaternary Geologic Atlas of the United States. The atlas was begun as an effort to depict the areal distribution of surficial geologic deposits and other materials that accumulated or formed during the past 2+ million years, the period that includes all activities of the human species. These materials are at the surface of the earth. They make up the 'ground' on which we walk, the 'dirt' in which we dig foundations, and the 'soil' in which we grow crops. Most of our human activity is related in one way or another to these surface materials that are referred to collectively by many geologists as regolith, the mantle of fragmental and generally unconsolidated material that overlies the bedrock foundation of the continent. The maps were compiled at 1:1,000,000 scale. This map is a product of collaboration of the Ontario Geological Survey, the University of Michigan, and the U.S. Geological Survey, and is designed for both scientific and practical purposes. It was prepared in two stages. First, separate maps and map explanations were prepared by the compilers. Second, the maps were combined, integrated, and supplemented by the editor. Map unit symbols were revised to a uniform system of classification and the map unit descriptions were prepared by the editor from information received from the compilers and from additional sources listed under Sources of Information. Diagrams accompanying the map were prepared by the editor. For scientific purposes, the map differentiates Quaternary surficial deposits on the basis of lithology or composition, texture or particle size, structure, genesis, stratigraphic relationships, engineering geologic properties, and relative age, as shown on the correlation diagram and indicated in the map unit descriptions. Deposits of some constructional landforms, such as kame moraine deposits, are distinguished as map units. Deposits of

  5. The General Urban Plan of Casimcea territorial administrative unit, map of natural and anthropogenic risks

    Directory of Open Access Journals (Sweden)

    Sorin BĂNICĂ

    2013-08-01

    The General Urban Plan represents the legal ground for any development action proposed. After endorsement and approval as required by law, the GUP is an act of authority of the local government for the area in which it applies. The aim is to establish the priority regulations applied in land use planning and the construction of structures. In terms of geographical location, the administrative territory of Casimcea, Tulcea county, falls in the central Northwest Plateau Casimcei. This is the second unit of the Central Dobrogea Plateau. Its geographical location in southeastern Romania, climatic and relief conditions, and anthropogenic pressure expose the Casimcea administrative territorial unit to permanent susceptibility to natural and anthropogenic risks. In this context, we identified the following categories of natural and anthropogenic risks: (i) natural risk phenomena (earthquakes, strong winds, heavy rains, floods caused by overflowing or precipitation, erosion of river banks and torrents, gravitational processes, rain droplet erosion and surface soil erosion); and (ii) anthropogenic risk phenomena (overgrazing, chemical use in agriculture, road transport and electricity infrastructure, wind turbines for electricity production, waste deposits, agro-zootechnical complexes, and human cemeteries). Their extent was captured by creating a map of natural and anthropogenic risks for the Casimcea territorial administrative unit, expressing the share of potentially affected areas as a territorial balance.

  6. How did policy actors use mass media to influence the Scottish alcohol minimum unit pricing debate? Comparative analysis of newspapers, evidence submissions and interviews

    Science.gov (United States)

    Hilton, Shona

    2015-01-01

Aims: To explore how policy actors attempted to deliberately frame public debate around alcohol minimum unit pricing (MUP) in the UK by comparing and contrasting their constructions of the policy in public (newspapers), semi-public (evidence submissions) and private (interviews). Methods: Content analysis was conducted on articles published in ten national newspapers between 1 January 2005 and 30 June 2012. Newsprint data were contrasted with alcohol policy documents, evidence submissions to the Scottish Parliament's Health and Sport Committee and 36 confidential interviews with policy stakeholders (academics, advocates, industry representatives, politicians and civil servants). Findings: A range of policy actors exerted influence both directly (through Parliamentary institutions and political representatives) and indirectly through the mass media. Policy actors were acutely aware of mass media's importance in shaping public opinion and used it tactically to influence policy. They often framed messages in subtly different ways, depending on target audiences. In general, newspapers presented the policy debate in a “balanced” way, but this arguably over-represented hostile perspectives and suggested greater disagreement around the evidence base than was the case. Conclusions: The roles of policy actors vary between public and policy spheres, and how messages are communicated in policy debates depends on perceived strategic advantage. PMID:26045639

  7. Using historical aerial photography and softcopy photogrammetry for waste unit mapping in L Lake

    International Nuclear Information System (INIS)

    Christel, L.M.

    1997-10-01

    L Lake was developed as a cooling water reservoir for the L Reactor at the Savannah River Site. The construction of the lake, which began in the fall of 1984, altered the structure and function of Steel Creek. Completed in the fall of 1985, L Lake has a capacity of 31 million cubic meters and a normal pool of 58 meters. When L Reactor operations ceased in 1988, the water level in the lake still had to be maintained. Site managers are currently trying to determine the feasibility of draining or drawing down the lake in order to save tax dollars. In order to understand the full repercussions of such an undertaking, it was necessary to compile a comprehensive inventory of what the lake bottom looked like prior to filling. Aerial photographs, acquired nine days before the filling of the lake began, were scanned and used for softcopy photogrammetry processing. A one-meter digital elevation model was generated and a digital orthophoto mosaic was created as the base map for the project. Seven categories of features, including the large waste units used to contain the contaminated soil removed from the dam site, were screen digitized and used to generate accurate maps. Other map features include vegetation waste piles, where contaminated vegetation from the flood plain was contained, and ash piles, which are sites where vegetation debris was burned and then covered with clean soil. For all seven categories, the area of disturbance totaled just over 63 hectares. When the screen digitizing was completed, the elevation at the centroid of each disturbance was determined. When the information is used in the Savannah River Site Geographical Information System, it can be used to visualize the various L Lake draw-down scenarios suggested by site managers and hopefully, to support evaluations of the cost effectiveness for each proposed activity

  8. Data layer integration for the national map of the united states

    Science.gov (United States)

    Usery, E.L.; Finn, M.P.; Starbuck, M.

    2009-01-01

    The integration of geographic data layers in multiple raster and vector formats, from many different organizations and at a variety of resolutions and scales, is a significant problem for The National Map of the United States being developed by the U.S. Geological Survey. Our research has examined data integration from a layer-based approach for five of The National Map data layers: digital orthoimages, elevation, land cover, hydrography, and transportation. An empirical approach has included visual assessment by a set of respondents with statistical analysis to establish the meaning of various types of integration. A separate theoretical approach with established hypotheses tested against actual data sets has resulted in an automated procedure for integration of specific layers and is being tested. The empirical analysis has established resolution bounds on meanings of integration with raster datasets and distance bounds for vector data. The theoretical approach has used a combination of theories on cartographic transformation and generalization, such as Töpfer's radical law, and additional research concerning optimum viewing scales for digital images to establish a set of guiding principles for integrating data of different resolutions.
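The generalization principle invoked here has a compact closed form: Töpfer's radical law predicts how many features survive a scale change, n_f = n_a · sqrt(M_a / M_f), where M_a and M_f are the source and target scale denominators. A minimal sketch (the feature counts and scales below are hypothetical, not taken from the study):

```python
import math

def topfer_feature_count(n_source: int, scale_source: float, scale_target: float) -> int:
    """Töpfer's radical law: number of features to retain when generalizing
    from a source scale to a target scale. Scales are given as denominators,
    e.g. 24_000 means 1:24,000."""
    return round(n_source * math.sqrt(scale_source / scale_target))

# Hypothetical example: generalizing 1,200 hydrography features
# from 1:24,000 source data to a 1:1,000,000 target map.
retained = topfer_feature_count(1200, 24_000, 1_000_000)  # -> 186
```

The same relation can be inverted to reason about the "optimum viewing scale" at which a layer's feature density remains legible.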

  9. Very High Resolution Tree Cover Mapping for Continental United States using Deep Convolutional Neural Networks

    Science.gov (United States)

    Ganguly, Sangram; Kalia, Subodh; Li, Shuang; Michaelis, Andrew; Nemani, Ramakrishna R.; Saatchi, Sassan A

    2017-01-01

    Uncertainties in input land cover estimates contribute to a significant bias in modeled above ground biomass (AGB) and carbon estimates from satellite-derived data. The resolution of most currently used passive remote sensing products is not sufficient to capture tree canopy cover of less than ca. 10-20 percent, limiting their utility to estimate canopy cover and AGB for trees outside of forest land. In our study, we created a first-of-its-kind Continental United States (CONUS) tree cover map at a spatial resolution of 1 m for the 2010-2012 epoch using USDA NAIP imagery to address the present uncertainties in AGB estimates. The process involves different tasks, from data acquisition/ingestion to pre-processing and running a state-of-the-art encoder-decoder based deep convolutional neural network (CNN) algorithm for automatically generating a tree/non-tree map for almost a quarter million scenes. The entire processing chain, including generation of the largest existing open source aerial/satellite image training database, was performed at the NEX supercomputing and storage facility. We believe the resulting forest cover product will substantially contribute to filling the gaps in ongoing carbon and ecological monitoring research and help quantify the errors and uncertainties in derived products.

  10. Very High Resolution Tree Cover Mapping for Continental United States using Deep Convolutional Neural Networks

    Science.gov (United States)

    Ganguly, S.; Kalia, S.; Li, S.; Michaelis, A.; Nemani, R. R.; Saatchi, S.

    2017-12-01

    Uncertainties in input land cover estimates contribute to a significant bias in modeled above ground biomass (AGB) and carbon estimates from satellite-derived data. The resolution of most currently used passive remote sensing products is not sufficient to capture tree canopy cover of less than ca. 10-20 percent, limiting their utility to estimate canopy cover and AGB for trees outside of forest land. In our study, we created a first-of-its-kind Continental United States (CONUS) tree cover map at a spatial resolution of 1 m for the 2010-2012 epoch using USDA NAIP imagery to address the present uncertainties in AGB estimates. The process involves different tasks, from data acquisition/ingestion to pre-processing and running a state-of-the-art encoder-decoder based deep convolutional neural network (CNN) algorithm for automatically generating a tree/non-tree map for almost a quarter million scenes. The entire processing chain, including generation of the largest existing open source aerial/satellite image training database, was performed at the NEX supercomputing and storage facility. We believe the resulting forest cover product will substantially contribute to filling the gaps in ongoing carbon and ecological monitoring research and help quantify the errors and uncertainties in derived products.

  11. Transient electromagnetic mapping of clay units in the San Luis Valley, Colorado

    Science.gov (United States)

    Fitterman, David V.; Grauch, V.J.S.

    2010-01-01

    Transient electromagnetic soundings were used to obtain information needed to refine hydrologic models of the San Luis Valley, Colorado. The soundings were able to map an aquitard called the blue clay that separates an unconfined surface aquifer from a deeper confined aquifer. The blue clay forms a conductor with an average resistivity of 6.9 ohm‐m. Above the conductor is a mixture of gray clay and sand. The gray clay has an average resistivity of 21 ohm‐m, while the sand has a resistivity of greater than 100 ohm‐m. The large difference in resistivity of these units makes mapping them with a surface geophysical method relatively easy. The blue clay was deposited at the bottom of Lake Alamosa, which filled most of the San Luis Valley during the Pleistocene. The geometry of the blue clay is influenced by a graben on the eastern side of the valley. The depth to the blue clay is greater over the graben. Along the eastern edge of the valley the blue clay appears to be truncated by faults.
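Because the quoted unit resistivities are so well separated (blue clay ≈ 6.9 ohm-m, gray clay ≈ 21 ohm-m, sand > 100 ohm-m), labeling inverted layers amounts to a threshold classifier. A minimal sketch; the cutoff values are assumptions chosen to bracket the quoted averages, not taken from the study:

```python
def classify_unit(resistivity_ohm_m: float) -> str:
    """Assign a lithologic label from a TEM-inverted layer resistivity.
    Cutoffs (12 and 60 ohm-m) are assumed values bracketing the averages
    quoted in the abstract: blue clay ~6.9, gray clay ~21, sand >100."""
    if resistivity_ohm_m < 12:
        return "blue clay (aquitard)"
    elif resistivity_ohm_m < 60:
        return "gray clay / clay-sand mix"
    else:
        return "sand (aquifer)"

# A hypothetical sounding inverted as layers from the surface downward:
layers = [150.0, 25.0, 7.0]
profile = [classify_unit(r) for r in layers]
```

Applied along a profile of soundings, the depth at which the "blue clay" label first appears traces the top of the aquitard.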

  12. Quantitative analysis of terrain units mapped in the northern quarter of Venus from Venera 15/16 data

    Science.gov (United States)

    Schaber, G. G.

    1991-01-01

    The contacts between 34 geological/geomorphic terrain units in the northern quarter of Venus mapped from Venera 15/16 data were digitized and converted to a Sinusoidal Equal-Area projection. The result was then registered with a merged Pioneer Venus/Venera 15/16 altimetric database, root mean square (rms) slope values, and radar reflectivity values derived from Pioneer Venus. The resulting information includes comparisons among individual terrain units and terrain groups to which they are assigned in regard to percentage of map area covered, elevation, rms slopes, distribution of suspected craters greater than 10 km in diameter.

  13. Changing policy framing as a deliberate strategy for public health advocacy: a qualitative policy case study of minimum unit pricing of alcohol.

    Science.gov (United States)

    Katikireddi, Srinivasa Vittal; Bond, Lyndal; Hilton, Shona

    2014-06-01

    Scotland is the first country in the world to pass legislation introducing a minimum unit price (MUP) for alcohol in an attempt to reduce consumption and associated harms by increasing the price of the cheapest alcohol. We investigated the competing ways in which policy stakeholders presented the debate. We then established whether a change in framing helped explain the policy's emergence. We conducted a detailed policy case study through analysis of evidence submitted to the Scottish parliament, and in-depth, one-to-one interviews (n = 36) with politicians, civil servants, advocates, researchers, and industry representatives. Public- and voluntary-sector stakeholders tended to support MUP, while industry representatives were more divided. Two markedly different ways of presenting alcohol as a policy problem were evident. Critics of MUP (all of whom were related to industry) emphasized social disorder issues, particularly among young people, and hence argued for targeted approaches. In contrast, advocates for MUP (with the exception of those in industry) focused on alcohol as a health issue arising from overconsumption at a population level, thus suggesting that population-based interventions were necessary. Industry stakeholders favoring MUP adopted a hybrid framing, maintaining several aspects of the critical framing. Our interview data showed that public health advocates worked hard to redefine the policy issue by deliberately presenting a consistent alternative framing. Framing alcohol policy as a broad, multisectoral, public health issue that requires a whole-population approach has been crucial to enabling policymakers to seriously consider MUP, and public health advocates intentionally presented alcohol policy in this way. This reframing helped prioritize public health considerations in the policy debate and represents a deliberate strategy for consideration by those advocating for policy change around the world and in other public health areas. © 2014

  14. Changing Policy Framing as a Deliberate Strategy for Public Health Advocacy: A Qualitative Policy Case Study of Minimum Unit Pricing of Alcohol

    Science.gov (United States)

    Katikireddi, Srinivasa Vittal; Bond, Lyndal; Hilton, Shona

    2014-01-01

    Context Scotland is the first country in the world to pass legislation introducing a minimum unit price (MUP) for alcohol in an attempt to reduce consumption and associated harms by increasing the price of the cheapest alcohol. We investigated the competing ways in which policy stakeholders presented the debate. We then established whether a change in framing helped explain the policy's emergence. Methods We conducted a detailed policy case study through analysis of evidence submitted to the Scottish parliament, and in-depth, one-to-one interviews (n = 36) with politicians, civil servants, advocates, researchers, and industry representatives. Findings Public- and voluntary-sector stakeholders tended to support MUP, while industry representatives were more divided. Two markedly different ways of presenting alcohol as a policy problem were evident. Critics of MUP (all of whom were related to industry) emphasized social disorder issues, particularly among young people, and hence argued for targeted approaches. In contrast, advocates for MUP (with the exception of those in industry) focused on alcohol as a health issue arising from overconsumption at a population level, thus suggesting that population-based interventions were necessary. Industry stakeholders favoring MUP adopted a hybrid framing, maintaining several aspects of the critical framing. Our interview data showed that public health advocates worked hard to redefine the policy issue by deliberately presenting a consistent alternative framing. Conclusions Framing alcohol policy as a broad, multisectoral, public health issue that requires a whole-population approach has been crucial to enabling policymakers to seriously consider MUP, and public health advocates intentionally presented alcohol policy in this way. This reframing helped prioritize public health considerations in the policy debate and represents a deliberate strategy for consideration by those advocating for policy change around the world and in

  15. Does our legal minimum drinking age modulate risk of first heavy drinking episode soon after drinking onset? Epidemiological evidence for the United States, 2006–2014

    Directory of Open Access Journals (Sweden)

    Hui G. Cheng

    2016-06-01

Full Text Available Background. State-level ‘age 21’ drinking laws conform generally with the United States National Minimum Drinking Age Act of 1984 (US), and are thought to protect young people from adverse drinking experiences such as heavy episodic drinking (HED, sometimes called ‘binge drinking’). We shed light on this hypothesis while estimating the age-specific risk of transitioning from 1st full drink to 1st HED among 12-to-23-year-old newly incident drinkers, with challenge to a “gender gap” hypothesis and male excess described in HED prevalence reports. Methods. The study population consisted of non-institutionalized civilians in the United States, with nine independently drawn nationally representative samples of more than 40,000 12-to-23-year-olds (2006–2014). Standardized audio computer-assisted self-interviews identified 43,000 newly incident drinkers (all with 1st HED evaluated within 12 months of drinking onset). Estimated age-specific HED risk soon after first full drink is evaluated for males and females. Results. Among 12-to-23-year-old newly incident drinkers, an estimated 20–30% of females and 35–45% of males experienced their 1st HED within 12 months after drinking onset. Before mid-adolescence, there is no male excess in such HED risk. Those who postponed drinking to age 21 are not spared (27% for ‘postponer’ females, 95% CI [24–30]; 42% for ‘postponer’ males, 95% CI [38–45]). An estimated 10–18% of females and 10–28% of males experienced their 1st HED in the same month as their 1st drink; peak HED risk estimates are 18% for ‘postponer’ females (95% CI [15–21]) and 28% for ‘postponer’ males (95% CI [24–31]). Conclusions. In the US, one in three young new drinkers transitions into HED within 12 months after the first drink. Those who postpone the 1st full drink until age 21 are not protected. Furthermore, ‘postponers’ have substantial risk for very rapid transition to HED. A male excess in this transition to HED

  16. Navigating Without Road Maps: The Early Business of Automobile Route Guide Publishing in the United States

    Science.gov (United States)

    Bauer, John T.

    2018-05-01

    In the United States, automobile route guides were important precursors to the road maps that Americans are familiar with today. Listing turn-by-turn directions between cities, they helped drivers navigate unmarked, local roads. This paper examines the early business of route guide publishing through the Official Automobile Blue Book series of guides. It focuses specifically on the expansion, contraction, and eventual decline of the Blue Book publishing empire and also the work of professional "pathfinders" that formed the company's data-gathering infrastructure. Beginning in 1901 with only one volume, the series steadily grew until 1920, when thirteen volumes were required to record thousands of routes throughout the country. Bankruptcy and corporate restructuring in 1921 forced the publishers to condense the guide into a four-volume set in 1922. Competition from emerging sheet maps, along with the nationwide standardization of highway numbers, pushed a switch to an atlas format in 1926. Blue Books, however, could not remain competitive and disappeared after 1937. "Pathfinders" were employed by the publishers and equipped with reliable automobiles. Soon they developed a shorthand notation system for recording field notes and efficiently incorporating them into the development workflow. Although pathfinders did not call themselves cartographers, they were geographical data field collectors and considered their work to be an "art and a science," much the same as modern-day cartographers. The paper concludes with some comments about the place of route guides in the history of American commercial cartography and draws some parallels between "pathfinders" and the digital road mappers of today.

  17. Detailed mapping of surface units on Mars with HRSC color data

    Science.gov (United States)

    Combe, J.-Ph.; Wendt, L.; McCord, T. B.; Neukum, G.

    2008-09-01

    Introduction: making use of HRSC color data. Mapping outcrops of clays, sulfates and ferric oxides provides basic information for deriving the climatic, tectonic and volcanic evolution of Mars, especially the episodes related to the presence of liquid water. The challenge is to resolve the outcrops spatially and to distinguish these components from globally-driven deposits like the iron oxide-rich bright red dust and the basaltic dark sands. The High Resolution Stereo Camera (HRSC) onboard Mars Express has five color filters in the visible and near infrared that are designed for visual interpretation and mapping of various surface units [1]. It also provides information on topography at scales smaller than a pixel (roughness), thanks to the different observation geometry of each color channel. The HRSC dataset is the only one that combines global coverage, a spatial resolution of 200 m/pixel or better, and filtering of colors of light. The present abstract is a work in progress (to be submitted to Planetary and Space Science) that shows the potential and limitations of HRSC color data as visual support and as multispectral images. Various methods are described, from the simplest to more complex ones, in order to demonstrate how to make use of the spectra, because of the specific processing steps they require [2-4]. The objective is to broaden the popularity of HRSC color data, so that they could be used more widely by the scientific community. Results prove that imaging spectrometry and HRSC color data complement each other for mapping outcrop types. Example regions of interest. HRSC is theoretically sensitive to materials with absorption features in the visible and near-infrared up to 1 μm. Therefore, oxide-rich red dust and basalts (pyroxenes) can be mapped, as well as very bright components like water ice [5, 6]. Possible detection of other materials still has to be demonstrated. We first explore regions where unusual mineralogy appears clearly from spectral data. 
Hematite

  18. Geodesy- and geology-based slip-rate models for the Western United States (excluding California) national seismic hazard maps

    Science.gov (United States)

    Petersen, Mark D.; Zeng, Yuehua; Haller, Kathleen M.; McCaffrey, Robert; Hammond, William C.; Bird, Peter; Moschetti, Morgan; Shen, Zhengkang; Bormann, Jayne; Thatcher, Wayne

    2014-01-01

    The 2014 National Seismic Hazard Maps for the conterminous United States incorporate additional uncertainty in the fault slip-rate parameter that controls earthquake-activity rates, beyond what was applied in previous versions of the hazard maps. This additional uncertainty is accounted for by new geodesy- and geology-based slip-rate models for the Western United States. Models that were considered include an updated geologic model based on expert opinion and four combined inversion models informed by both geologic and geodetic input. The two block models considered indicate significantly higher slip rates than the expert opinion and the two fault-based combined inversion models. For the hazard maps, we apply 20 percent weight, with equal weighting, to the two fault-based models. Off-fault geodetic-based models were not considered in this version of the maps. Resulting changes to the hazard maps are generally less than 0.05 g (acceleration of gravity). Future research will improve the maps and interpret differences between the new models.

  19. High Resolution Map of Water Supply and Demand for North East United States

    Science.gov (United States)

    Ehsani, N.; Vorosmarty, C. J.; Fekete, B. M.

    2012-12-01

    Accurate estimates of water supply and demand are crucial elements in water resources management and modeling. As part of our NSF-funded EaSM effort to build a Northeast Regional Earth System Model (NE-RESM) as a framework to improve our understanding and capacity to forecast the implications of planning decisions on the region's environment, ecosystem services, energy and economic systems through the 21st century, we are producing a high resolution map (3' x 3' lat/long) of estimated water supply and use for the northeast region of the United States. Focusing on water demand, results from this study enable us to quantify how demand sources affect the hydrology and thermal-chemical water pollution across the region. To generate this 3-minute resolution map, in which each grid cell has specific estimated monthly domestic, agricultural, thermoelectric and industrial water use, Estimated Use of Water in the United States in 2005 (Kenny et al., 2009) is being coupled to high resolution land cover and land use, irrigation, power plant and population data sets. In addition to water demands, we tried to improve estimates of water supply from the WBM model by improving the way it controls discharge from reservoirs. Reservoirs are key characteristics of the modern hydrologic system, with a particular impact on altering the natural stream flow, thermal characteristics, and biogeochemical fluxes of rivers. Depending on dam characteristics, watershed characteristics and the purpose of building a dam, each reservoir has a specific optimum operating rule. This means that the roughly 84,000 dams in the National Inventory of Dams potentially follow 84,000 different sets of rules for storing and releasing water, which must somehow be accounted for in our modeling exercise. In reality, there is no comprehensive observational dataset depicting these operating rules. Thus, we will simulate these rules. 
Our perspective is not to find the optimum operating rule per se but to find

  20. POLARIS: A 30-meter probabilistic soil series map of the contiguous United States

    Science.gov (United States)

    Chaney, Nathaniel W; Wood, Eric F; McBratney, Alexander B; Hempel, Jonathan W; Nauman, Travis; Brungard, Colby W.; Odgers, Nathan P

    2016-01-01

    A new complete map of soil series probabilities has been produced for the contiguous United States at a 30 m spatial resolution. This innovative database, named POLARIS, is constructed using available high-resolution geospatial environmental data and a state-of-the-art machine learning algorithm (DSMART-HPC) to remap the Soil Survey Geographic (SSURGO) database. This 9 billion grid cell database is possible using available high performance computing resources. POLARIS provides a spatially continuous, internally consistent, quantitative prediction of soil series. It offers potential solutions to the primary weaknesses in SSURGO: 1) unmapped areas are gap-filled using survey data from the surrounding regions, 2) the artificial discontinuities at political boundaries are removed, and 3) the use of high resolution environmental covariate data leads to a spatial disaggregation of the coarse polygons. The geospatial environmental covariates that have the largest role in assembling POLARIS over the contiguous United States (CONUS) are fine-scale (30 m) elevation data and coarse-scale (~ 2 km) estimates of the geographic distribution of uranium, thorium, and potassium. A preliminary validation of POLARIS using the NRCS National Soil Information System (NASIS) database shows variable performance over CONUS. In general, the best performance is obtained at grid cells where DSMART-HPC is most able to reduce the chance of misclassification. The important role of environmental covariates in limiting prediction uncertainty suggests including additional covariates is pivotal to improving POLARIS' accuracy. This database has the potential to improve the modeling of biogeochemical, water, and energy cycles in environmental models; enhance availability of data for precision agriculture; and assist hydrologic monitoring and forecasting to ensure food and water security.

  1. Radiological mapping of functional transcription units of bacteriophage phiX174 and S13

    International Nuclear Information System (INIS)

    Pollock, T.J.; Tessman, I.; Tessman, E.S.

    1978-01-01

    It has been found that the nearest promoter is not always the primary promoter for making translatable message. The technique of ultraviolet mapping was used to determine the location of promoter sites for translated mRNA coded for by bacteriophages phiX174 and S13. The method is based on the theory that the 'target size' for u.v. inactivation of expression of a gene is proportional to the distance between the promoter and the 3' end of the gene. This method has revealed an expected and some unexpected locations for the promoters responsible for gene expression. Ultraviolet-survival curves for expression of phage genes were interpreted in the following way. The contiguous genes D, F, G and H are expressed as a unit under the control of a promoter located near gene D. However, gene B (and probably the adjacent genes K and C) are controlled by a promoter distant from gene B, possibly in the region of gene H, rather than from a promoter located just before gene B. Likewise, gene A is controlled by a promoter distant from gene A. (author)
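The stated theory, that the UV "target size" for inactivating a gene's expression is proportional to the promoter-to-3'-end distance, implies that log-survival of expression falls linearly with dose, with slope proportional to that distance. A minimal numerical sketch, fitting the slope by least squares; the hit constant k and the transcription-unit length are hypothetical, not values from the paper:

```python
import math

def fit_slope(xs, ys):
    """Ordinary least-squares slope of y on x (intercept fitted implicitly)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

def target_size(doses, survival, k):
    """Infer transcription-unit length (promoter to gene 3' end) from a
    UV-survival curve.  Model: S = exp(-k * L * dose), so the slope of
    ln(S) versus dose is -k * L.  k is a per-nucleotide, per-dose hit
    constant in hypothetical units."""
    slope = fit_slope(doses, [math.log(s) for s in survival])
    return -slope / k

k = 1e-5            # hypothetical hits per nucleotide per J/m^2
L_true = 2500       # hypothetical unit length, nucleotides
doses = [0, 20, 40, 60, 80]
survival = [math.exp(-k * L_true * d) for d in doses]
L_est = target_size(doses, survival, k)   # recovers ~2500 nucleotides
```

Comparing fitted target sizes for different genes against the nearest-promoter distances is what reveals the "unexpected" distant promoters, e.g. gene B controlled from near gene H.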

  2. A terrain-based site characterization map of California with implications for the contiguous United States

    Science.gov (United States)

    Yong, Alan K.; Hough, Susan E.; Iwahashi, Junko; Braverman, Amy

    2012-01-01

    We present an approach based on geomorphometry to predict material properties and characterize site conditions using the VS30 parameter (time‐averaged shear‐wave velocity to a depth of 30 m). Our framework consists of an automated terrain classification scheme based on taxonomic criteria (slope gradient, local convexity, and surface texture) that systematically identifies 16 terrain types from 1‐km spatial resolution (30 arcsec) Shuttle Radar Topography Mission digital elevation models (SRTM DEMs). Using 853 VS30 values from California, we apply a simulation‐based statistical method to determine the mean VS30 for each terrain type in California. We then compare the VS30 values with models based on individual proxies, such as mapped surface geology and topographic slope, and show that our systematic terrain‐based approach consistently performs better than semiempirical estimates based on individual proxies. To further evaluate our model, we apply our California‐based estimates to terrains of the contiguous United States. Comparisons of our estimates with 325 VS30 measurements outside of California, as well as estimates based on the topographic slope model, indicate our method to be statistically robust and more accurate. Our approach thus provides an objective and robust method for extending estimates of VS30 for regions where in situ measurements are sparse or not readily available.
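The taxonomic criteria above start from DEM derivatives. As a hedged sketch, here is a toy version that computes slope gradient by central differences and assigns a mean VS30 per slope bin; the real scheme uses 16 classes from slope, convexity, and surface texture, and the bin boundaries and VS30 values below are invented for illustration only:

```python
def slope_grid(dem, cell=30.0):
    """Central-difference slope magnitude (rise/run) for interior cells;
    border cells are left at 0.0 for simplicity."""
    rows, cols = len(dem), len(dem[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            dzdx = (dem[i][j + 1] - dem[i][j - 1]) / (2 * cell)
            dzdy = (dem[i + 1][j] - dem[i - 1][j]) / (2 * cell)
            out[i][j] = (dzdx ** 2 + dzdy ** 2) ** 0.5
    return out

def vs30_from_slope(s):
    """Hypothetical mean VS30 (m/s) per slope-based terrain bin."""
    if s < 0.01: return 200.0   # flat basin fill
    if s < 0.05: return 300.0   # gentle piedmont
    if s < 0.15: return 450.0   # hills
    return 600.0                # mountains

# A tiny synthetic DEM (elevations in meters, 30 m cells):
dem = [[10, 10, 10, 10],
       [10, 11, 12, 13],
       [10, 12, 14, 16],
       [10, 13, 16, 19]]
slopes = slope_grid(dem)
vs30 = [[vs30_from_slope(s) for s in row] for row in slopes]
```

In the study itself, the per-class VS30 means come from a simulation-based fit to 853 measured California values rather than fixed lookup numbers.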

  3. Map showing minimum depth to water in shallow aquifers (1963-72) in the Sugar House quadrangle, Salt Lake County, Utah

    Science.gov (United States)

    Mower, R.W.; Van Horn, Richard

    1973-01-01

    The depth to ground water in shallow aquifers in the Sugar House quadrangle ranges from zero in areas of springs and seeps to more than 10 feet beneath most of the area shown on the map. The depth to water differs from place to place because of irregular topography and the varying capability of different rock materials to transmit water. Ground water also occurs under unconfined and confined conditions in deep aquifers beneath the Sugar House quadrangle, as shown by the block diagram and as described by Hely, Mower, and Harr (1971a, p. 17-111).

  4. Increasing minimum daily temperatures are associated with enhanced pesticide use in cultivated soybean along a latitudinal gradient in the mid-western United States.

    Directory of Open Access Journals (Sweden)

    Lewis H Ziska

Full Text Available Assessments of climate change and food security often do not consider changes to crop production as a function of altered pest pressures. Evaluation of potential changes may be difficult, in part, because management practices are routinely utilized in situ to minimize pest injury. If so, then such practices should, in theory, also change with climate, although this has never been quantified. Chemical (pesticide) applications remain the primary means of managing pests in industrialized countries. While a wide range of climate variables can influence chemical use, minimum daily temperature (the lowest 24 h recorded temperature in a given year) can be associated with the distribution and thermal survival of many agricultural pests in temperate regions. The current study quantifies average pesticide applications since 1999 for commercial soybean grown over a 2100 km North-South latitudinal transect for seven states that varied in minimum daily temperature (1999-2013) from -28.6°C (Minnesota) to -5.1°C (Louisiana). Although soybean yields (per hectare) did not vary by state, total pesticide applications (kg of active ingredient, ai, per hectare) increased from 4.3 to 6.5 over this temperature range. Significant correlations were observed between minimum daily temperatures and kg of ai for all pesticide classes. This suggested that minimum daily temperature could serve as a proxy for pesticide application. Longer term temperature data (1977-2013) indicated greater relative increases in minimum daily temperatures for northern relative to southern states. Using these longer-term trends to determine short-term projections of pesticide use (to 2023) showed a greater comparative increase in herbicide use for soybean in northern states, but a greater increase in insecticide and fungicide use in southern states in a warmer climate. Overall, these data suggest that increases in pesticide application rates may be a means to maintain soybean production in response to rising
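The reported state-level correlations between minimum daily temperature and kg of active ingredient are ordinary Pearson correlations. A sketch with hypothetical state-level values spanning only the ranges quoted in the abstract (-28.6°C to -5.1°C; 4.3 to 6.5 kg ai/ha); the intermediate points are invented:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical means for seven states along the transect:
# minimum daily temperature (deg C) vs pesticide applied (kg ai/ha).
t_min = [-28.6, -22.0, -17.5, -13.0, -9.5, -7.0, -5.1]
kg_ai = [4.3, 4.6, 5.0, 5.4, 5.9, 6.2, 6.5]
r = pearson_r(t_min, kg_ai)   # strongly positive for these values
```

A strongly positive r is what licenses the study's use of minimum daily temperature as a proxy for pesticide application.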

  5. Improved predictive mapping of indoor radon concentrations using ensemble regression trees based on automatic clustering of geological units

    International Nuclear Information System (INIS)

    Kropat, Georg; Bochud, Francois; Jaboyedoff, Michel; Laedermann, Jean-Pascal; Murith, Christophe; Palacios, Martha; Baechler, Sébastien

    2015-01-01

    Purpose: According to estimates, around 230 people die as a result of radon exposure in Switzerland. This public health concern makes reliable indoor radon prediction and mapping methods necessary in order to improve risk communication to the public. The aim of this study was to develop an automated method to classify lithological units according to their radon characteristics and to develop mapping and predictive tools in order to improve local radon prediction. Method: About 240 000 indoor radon concentration (IRC) measurements in about 150 000 buildings were available for our analysis. The automated classification of lithological units was based on k-medoids clustering via pair-wise Kolmogorov distances between IRC distributions of lithological units. For IRC mapping and prediction we used random forests and Bayesian additive regression trees (BART). Results: The automated classification groups lithological units well in terms of their IRC characteristics. In particular, the IRC differences in metamorphic rocks like gneiss are well revealed by this method. The maps produced by random forests soundly represent the regional differences of IRCs in Switzerland and improve the spatial detail compared to existing approaches. We could explain 33% of the variation in IRC data with random forests. Additionally, variable importance as evaluated by random forests shows that building characteristics are less important predictors for IRCs than spatial/geological influences. BART could explain 29% of IRC variability and produced maps that indicate the prediction uncertainty. Conclusion: Ensemble regression trees are a powerful tool to model and understand the multidimensional influences on IRCs. Automatic clustering of lithological units complements this method by facilitating the interpretation of radon properties of rock types. This study provides an important element for radon risk communication. Future approaches should consider taking into account further variables
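
The clustering step described in this abstract can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the IRC samples are synthetic, the two-sample Kolmogorov-Smirnov statistic stands in for the pairwise Kolmogorov distance, and a naive PAM-style k-medoids is assumed rather than whatever implementation the study used.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
# Hypothetical log-IRC samples for five lithological units (synthetic data).
units = [rng.normal(mu, 0.5, 200) for mu in (4.0, 4.1, 5.2, 5.3, 6.5)]

# Pairwise Kolmogorov-Smirnov distances between the empirical IRC distributions.
n = len(units)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = ks_2samp(units[i], units[j]).statistic

def k_medoids(D, k, iters=100):
    """Naive PAM-style k-medoids on a precomputed distance matrix."""
    medoids = list(range(k))  # deterministic start: first k units
    for _ in range(iters):
        labels = np.argmin(D[:, medoids], axis=1)
        new_medoids = []
        for c in range(k):
            members = np.where(labels == c)[0]
            if members.size == 0:          # keep old medoid if a cluster empties
                new_medoids.append(medoids[c])
                continue
            within = D[np.ix_(members, members)].sum(axis=1)
            new_medoids.append(int(members[np.argmin(within)]))
        if new_medoids == medoids:         # converged
            break
        medoids = new_medoids
    return np.argmin(D[:, medoids], axis=1), medoids

labels, medoids = k_medoids(D, k=3)
print("cluster label per lithological unit:", labels)
```

Units with statistically similar IRC distributions end up sharing a cluster label, which is what allows lithological units to be grouped by their radon characteristics rather than by rock name.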

  6. Population-Based Trachoma Mapping in Six Evaluation Units of Papua New Guinea.

    Science.gov (United States)

    Ko, Robert; Macleod, Colin; Pahau, David; Sokana, Oliver; Keys, Drew; Burnett, Anthea; Willis, Rebecca; Wabulembo, Geoffrey; Garap, Jambi; Solomon, Anthony W

    2016-01-01

    We sought to determine the prevalence of trachomatous inflammation - follicular (TF) in children aged 1-9 years, and trachomatous trichiasis (TT) in those aged ≥15 years, in suspected trachoma-endemic areas of Papua New Guinea (PNG). We carried out six population-based prevalence surveys using the protocol developed as part of the Global Trachoma Mapping Project. A total of 19,013 individuals were sampled for inclusion, with 15,641 (82.3%) consenting to participate. Four evaluation units had prevalences of TF in children ≥10%, above which threshold the World Health Organization (WHO) recommends mass drug administration (MDA) of azithromycin for at least three years: Western Province (South Fly/Daru) 11.2% (95% confidence interval, CI, 6.9-17.0%), Southern Highlands (East) 12.2% (95% CI 9.6-15.0%), Southern Highlands (West) 11.7% (95% CI 8.5-15.3%), and West New Britain 11.4% (95% CI 8.7-13.9%). TF prevalence was 5.0-9.9% in Madang (9.4%, 95% CI 6.1-13.0%) and National Capital District (6.0%, 95% CI 3.2-9.1%), where consideration of a single round of MDA is warranted. Cases of TT were not found outside West New Britain, in which four cases were seen, generating an estimated population-level prevalence of TT in adults of 0.10% (95% CI 0.00-0.40%) for West New Britain, below the WHO elimination threshold of 0.2% of those aged ≥15 years. Trachoma is a public health issue in PNG. However, other than in West New Britain, there are few data to support the idea that trachoma is a cause of blindness in PNG. Further research is needed to understand the stimulus for the active trachoma phenotype in these populations.

  7. Youth Employment and the Minimum Wage. Hearing before the Joint Economic Committee, Congress of the United States, Ninety-Eighth Congress, Second Session.

    Science.gov (United States)

    Joint Economic Committee, Washington, DC.

    This congressional hearing contains testimony about the problem of youth unemployment and about the relationship between youth employment opportunities and the minimum wage. A special focus is the administration's proposal for the enactment of a youth employment opportunity wage, under which youth below the age of 20 could be paid 75 percent of…

  8. What Is the Unit of Visual Attention? Object for Selection, but Boolean Map for Access

    Science.gov (United States)

    Huang, Liqiang

    2010-01-01

    In the past 20 years, numerous theories and findings have suggested that the unit of visual attention is the object. In this study, I first clarify 2 different meanings of unit of visual attention, namely the unit of access in the sense of measurement and the unit of selection in the sense of division. In accordance with this distinction, I argue…

  9. Heel Effect: Dose Mapping And Profiling For Mobile C-Arm Fluoroscopy Unit Toshiba SXT-1000A

    International Nuclear Information System (INIS)

    Husaini Salleh; Mohd Khalid Matori; Muhammad Jamal Md Isa; Mohd Ramli Arshad; Shahrul Azlan Azizan; Mohd Firdaus Abdul Rahman; Md Khairusalih Md Zin

    2014-01-01

    The heel effect is a well-known phenomenon in x-ray production. It affects image formation as well as scattered radiation, yet few studies have examined it. This study maps and profiles the dose on the surface of a water phantom using the mobile C-arm unit Toshiba SXT-1000A. Based on the results, the dose profile increases by about 57 % from the anode to the cathode bound of the irradiated area. This result and information can be used as a guide to manipulate the phenomenon for better image quality and radiation safety for this specific and dedicated fluoroscopy unit. (author)

  10. Seep Detection using E/V Nautilus Integrated Seafloor Mapping and Remotely Operated Vehicles on the United States West Coast

    Science.gov (United States)

    Gee, L. J.; Raineault, N.; Kane, R.; Saunders, M.; Heffron, E.; Embley, R. W.; Merle, S. G.

    2017-12-01

    Exploration Vessel (E/V) Nautilus has been mapping the seafloor off the west coast of the United States, from Washington to California, for the past three years with a Kongsberg EM302 multibeam sonar. This system simultaneously collects bathymetry, seafloor backscatter, and water column backscatter data, allowing an integrated approach to mapping that more completely characterizes a region, and has identified over 1,000 seafloor seeps. Hydrographic multibeam sonars like the EM302 were designed for mapping bathymetry; only in the last decade have major mapping projects taken an integrated approach that utilizes the seabed and water column backscatter information in addition to the bathymetry. Nautilus mapping in the Eastern Pacific over the past three years has included a number of seep-specific expeditions, which utilized and adapted the preliminary mapping guidelines that have emerged from research. The likelihood of seep detection is affected by many factors: the environment (seabed geomorphology, surficial sediment, seep location/depth, regional oceanography and biology); the nature of the seeps themselves (size variation, varying flux, depth, and transience); the detection system (the design of hydrographic multibeam sonars limits their use for water column detection); and the platform (variations in the vessel and operations such as noise, speed, and swath overlap). Nautilus integrated seafloor mapping provided multiple indicators of seep locations, but it remains difficult to assess the probability of seep detection. Even when seeps were detected, they have not always been located during ROV dives. However, the presence of associated features (methane hydrate and bacterial mats) serves as evidence of potential seep activity and reinforces the transient nature of the seeps. Not detecting a seep in the water column data does not necessarily indicate that there is not a seep at a given location, but with multiple passes over an area and by the use of other contextual data, an area may

  11. Combined landslide inventory and susceptibility assessment based on different mapping units: an example from the Flemish Ardennes, Belgium

    Directory of Open Access Journals (Sweden)

    M. Van Den Eeckhaut

    2009-03-01

    Full Text Available For a 277 km2 study area in the Flemish Ardennes, Belgium, a landslide inventory and two landslide susceptibility zonations were combined to obtain an optimal landslide susceptibility assessment, in five classes. For the experiment, a regional landslide inventory, a 10 m × 10 m digital representation of topography, and lithological and soil hydrological information obtained from 1:50 000 scale maps, were exploited. In the study area, the regional inventory shows 192 landslides of the slide type, including 158 slope failures that occurred before 1992 (model calibration set) and 34 failures that occurred after 1992 (model validation set). The study area was partitioned into 2.78×106 grid cells and into 1927 topographic units. The latter are hydro-morphological units obtained by subdividing slope units based on terrain gradient. Independent models were prepared for the two terrain subdivisions using discriminant analysis. For grid cells, a single pixel was identified as representative of the landslide depletion area, and geo-environmental information for the pixel was obtained from the thematic maps. The landslide and geo-environmental information was used to model the propensity of the terrain to host landslide source areas. For topographic units, morphologic and hydrologic information and the proportion of lithologic and soil hydrological types in each unit were used to evaluate landslide susceptibility, including the depletion and depositional areas. Uncertainty associated with the two susceptibility models was evaluated, and the model performance was tested using the independent landslide validation set. A heuristic procedure was adopted to combine the landslide inventory and the susceptibility zonations. The procedure makes optimal use of the available landslide and susceptibility information, minimizing the limitations inherent in the inventory and the susceptibility maps. For the established susceptibility classes, regulations to

  12. Combined landslide inventory and susceptibility assessment based on different mapping units: an example from the Flemish Ardennes, Belgium

    Science.gov (United States)

    van den Eeckhaut, M.; Reichenbach, P.; Guzzetti, F.; Rossi, M.; Poesen, J.

    2009-03-01

    For a 277 km2 study area in the Flemish Ardennes, Belgium, a landslide inventory and two landslide susceptibility zonations were combined to obtain an optimal landslide susceptibility assessment, in five classes. For the experiment, a regional landslide inventory, a 10 m × 10 m digital representation of topography, and lithological and soil hydrological information obtained from 1:50 000 scale maps, were exploited. In the study area, the regional inventory shows 192 landslides of the slide type, including 158 slope failures that occurred before 1992 (model calibration set) and 34 failures that occurred after 1992 (model validation set). The study area was partitioned into 2.78×106 grid cells and into 1927 topographic units. The latter are hydro-morphological units obtained by subdividing slope units based on terrain gradient. Independent models were prepared for the two terrain subdivisions using discriminant analysis. For grid cells, a single pixel was identified as representative of the landslide depletion area, and geo-environmental information for the pixel was obtained from the thematic maps. The landslide and geo-environmental information was used to model the propensity of the terrain to host landslide source areas. For topographic units, morphologic and hydrologic information and the proportion of lithologic and soil hydrological types in each unit were used to evaluate landslide susceptibility, including the depletion and depositional areas. Uncertainty associated with the two susceptibility models was evaluated, and the model performance was tested using the independent landslide validation set. A heuristic procedure was adopted to combine the landslide inventory and the susceptibility zonations. The procedure makes optimal use of the available landslide and susceptibility information, minimizing the limitations inherent in the inventory and the susceptibility maps. 
For the established susceptibility classes, regulations to link terrain domains to appropriate land

  13. A Fire Severity Mapping System (FSMS) for real-time management applications and long term planning: Developing a map of the landscape potential for severe fire in the western United States

    Science.gov (United States)

    Gregory K. Dillon; Zachary A. Holden; Penny Morgan; Bob Keane

    2009-01-01

    The Fire Severity Mapping System project is geared toward providing fire managers across the western United States with critical information for dealing with and planning for the ecological effects of wildfire at multiple levels of thematic, spatial, and temporal detail. For this project, we are developing a comprehensive, west-wide map of the landscape potential for...

  14. Assessment and Mapping of the Riverine Hydrokinetic Resource in the Continental United States

    Energy Technology Data Exchange (ETDEWEB)

    Jacobson, Paul T. [Electric Power Research Institute; Ravens, Thomas M. [University of Alaska Anchorage; Cunningham, Keith W. [University of Alaska Fairbanks; Scott, George [National Renewable Energy Laboratory

    2012-12-14

    The U.S. Department of Energy (DOE) funded the Electric Power Research Institute and its collaborative partners, University of Alaska Anchorage, University of Alaska Fairbanks, and the National Renewable Energy Laboratory, to provide an assessment of the riverine hydrokinetic resource in the continental United States. The assessment benefited from input obtained during two workshops attended by individuals with relevant expertise and from a National Research Council panel commissioned by DOE to provide guidance to this and other concurrent, DOE-funded assessments of water-based renewable energy. These sources of expertise provided valuable advice regarding data sources and assessment methodology. The assessment of the hydrokinetic resource in the 48 contiguous states is derived from spatially-explicit data contained in NHDPlus, a GIS-based database containing river segment-specific information on discharge characteristics and channel slope. 71,398 river segments with mean annual discharge greater than 1,000 cubic feet per second (cfs) were included in the assessment. Segments with discharge less than 1,000 cfs were dropped from the assessment, as were river segments with hydroelectric dams. The results for the theoretical and technical resource in the 48 contiguous states were found to be relatively insensitive to the cutoff chosen. Raising the cutoff to 1,500 cfs had no effect on the estimate of the technically recoverable resource, and the theoretical resource was reduced by 5.3%. The segment-specific theoretical resource was estimated from these data using the standard hydrological engineering equation that relates theoretical hydraulic power (Pth, Watts) to discharge (Q, m3 s-1) and hydraulic head or change in elevation (Δh, m) over the length of the segment, where γ is the specific weight of water (9800 N m-3): Pth = γ Q Δh. For Alaska, which is not encompassed by NHDPlus, hydraulic head and discharge data were manually obtained from Idaho National
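
The standard hydrological engineering equation cited in this record reduces to a one-line calculation. A minimal sketch, with an illustrative segment rather than real NHDPlus data (the cfs-to-m3/s conversion constant is standard, not from the report):

```python
# Theoretical hydraulic power of a river segment: Pth = gamma * Q * dh,
# where gamma is the specific weight of water (9800 N m-3), Q the discharge
# (m3 s-1), and dh the hydraulic head / elevation drop over the segment (m).
GAMMA = 9800.0  # N m-3

def theoretical_power_watts(discharge_m3s, head_drop_m):
    return GAMMA * discharge_m3s * head_drop_m

CFS_TO_M3S = 0.0283168  # 1 cubic foot per second in m3 s-1
# Hypothetical segment at the assessment's 1,000 cfs cutoff, dropping 2 m.
p = theoretical_power_watts(1000 * CFS_TO_M3S, 2.0)
print(f"{p / 1e6:.2f} MW")  # ≈ 0.56 MW
```

Summing this segment-specific quantity over all qualifying segments gives the theoretical resource; the technically recoverable resource is a further-constrained subset.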

  15. Using NASA Satellite Observations to Map Wildfire Risk in the United States for Allocation of Fire Management Resources

    Science.gov (United States)

    Farahmand, A.; Reager, J. T., II; Behrangi, A.; Stavros, E. N.; Randerson, J. T.

    2017-12-01

    Fires are a key disturbance globally acting as a catalyst for terrestrial ecosystem change and contributing significantly to both carbon emissions and changes in surface albedo. The socioeconomic impacts of wildfires are also significant, with wildfire activity resulting in billions of dollars of losses every year. Fire size, area burned, and frequency are increasing; thus, predicting fire danger, defined by the United States National Interagency Fire Center (NIFC) as the demand for fire management resources as a function of how far fuel flammability (a function of ignitability, consumability and availability) departs from normal, is an important step toward reducing costs associated with wildfires. Numerous studies have aimed to predict the likelihood of fire danger, but few studies use remote sensing data to map fire danger at scales commensurate with regional management decisions (e.g., deployment of resources nationally throughout fire season with seasonal and monthly prediction). Here, we use NASA Gravity Recovery And Climate Experiment (GRACE) assimilated surface soil moisture, NASA Atmospheric Infrared Sounder (AIRS) vapor pressure deficit, NASA Moderate Resolution Imaging Spectroradiometer (MODIS) enhanced vegetation index products and landcover products, along with US Forest Service historical fire activity data to generate probabilistic monthly fire potential maps in the United States. These maps can be useful not only in government operational allocation of fire management resources, but also in improving understanding of the Earth System and how it is changing in order to refine predictions of fire extremes.

  16. Case closed: research evidence on the positive public health impact of the age 21 minimum legal drinking age in the United States.

    Science.gov (United States)

    DeJong, William; Blanchette, Jason

    2014-01-01

    In 2006, the nonprofit organization Choose Responsibility called for repealing the 1984 National Minimum Drinking Age Act, which had led all 50 states to establish a minimum legal drinking age (MLDA) of 21 years, and allowing the states to lower their MLDA to 18 years. Two years later, the organization assembled a small group of college and university presidents (the Amethyst Initiative) to call publicly for a critical reexamination of the law. Public health and traffic safety experts responded to these efforts by generating new research on the age 21 MLDA, thus warranting an updated review of the literature. This review focuses primarily on research published since 2006, when Choose Responsibility began its public relations campaign to lower the MLDA. Recent research on the age 21 MLDA has reinforced the position that the current law has served the nation well by reducing alcohol-related traffic crashes and alcohol consumption among youths, while also protecting drinkers from long-term negative outcomes they might experience in adulthood, including alcohol and other drug dependence, adverse birth outcomes, and suicide and homicide. The age 21 law saves lives and is unlikely to be overturned. College and university leaders need to put into effect workable policies, stricter enforcement, and other evidence-based prevention efforts that have been demonstrated to reduce underage drinking and alcohol-related problems on campus and are being applied successfully at prominent academic institutions.

  17. Elaboration Of A Classification Of Geomorphologic Units And The Basis Of A Digital Data-Base For Establishing Geomorphologic Maps In Egypt

    International Nuclear Information System (INIS)

    EI Gammal, E.A.; Cherif, O.H.; Abdel Aleem, E.

    2003-01-01

    A database for the classification and description of basic geomorphologic landform units has been prepared for establishing geomorphologic maps of Egyptian terrains. This database includes morpho-structural, lithological, denudational and depositional units. The database is included in tables with proper coding to be used for automatically establishing the colors, symbols and legends of the maps. The system also includes descriptions of the various geomorphic units. The system is designed to be used with the ArcMap software. The AUTOCAD 2000 software has been used to trace the maps. The database has been applied to produce five new geomorphologic maps at a scale of 1:100 000. These are: Wadi Feiran Sheet, Wadi Kid Sheet, Gabal Katherina Sheet in South Sinai, Shelattein area (South Eastern Desert) and Baharia Oasis area (Western Desert)

  18. Material Units, Structures/Landforms, and Stratigraphy for the Global Geologic Map of Ganymede (1:15M)

    Science.gov (United States)

    Patterson, G. Wesley; Head, James W.; Collins, Geoffrey C.; Pappalardo, Robert T.; Prockter, Louis M.; Lucchitta, Baerbel K.

    2008-01-01

    In the coming year a global geological map of Ganymede will be completed that represents the most recent understanding of the satellite on the basis of Galileo mission results. This contribution builds on important previous accomplishments in the study of Ganymede utilizing Voyager data and incorporates the many new discoveries that were brought about by examination of Galileo data. Material units have been defined, structural landforms have been identified, and an approximate stratigraphy has been determined utilizing a global mosaic of the surface with a nominal resolution of 1 km/pixel assembled by the USGS. This mosaic incorporates the best available Voyager and Galileo regional coverage and high resolution imagery (100-200 m/pixel) of characteristic features and terrain types obtained by the Galileo spacecraft. This map has given us a more complete understanding of: 1) the major geological processes operating on Ganymede, 2) the characteristics of the geological units making up its surface, 3) the stratigraphic relationships of geological units and structures, and 4) the geological history inferred from these relationships. A summary of these efforts is provided here.

  19. Translation of Bernstein Coefficients Under an Affine Mapping of the Unit Interval

    Science.gov (United States)

    Alford, John A., II

    2012-01-01

    We derive an expression connecting the coefficients of a polynomial expanded in the Bernstein basis to the coefficients of an equivalent expansion of the polynomial under an affine mapping of the domain. The expression may be useful in the calculation of bounds for multi-variate polynomials.

  20. Okeanos Explorer (EX1606): CAPSTONE Wake Island Unit PRIMNM (ROV & Mapping)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Operations will use the ship’s deep water mapping systems (Kongsberg EM302 multibeam sonar, EK60 split-beam fisheries sonars, ADCPs, and Knudsen 3260 chirp...

  1. Next-generation forest change mapping across the United States: the landscape change monitoring system (LCMS)

    Science.gov (United States)

    Sean P. Healey; Warren B. Cohen; Yang Zhiqiang; Ken Brewer; Evan Brooks; Noel Gorelick; Mathew Gregory; Alexander Hernandez; Chengquan Huang; Joseph Hughes; Robert Kennedy; Thomas Loveland; Kevin Megown; Gretchen Moisen; Todd Schroeder; Brian Schwind; Stephen Stehman; Daniel Steinwand; James Vogelmann; Curtis Woodcock; Limin Yang; Zhe. Zhu

    2015-01-01

    Forest change information is critical in forest planning, ecosystem modeling, and in updating forest condition maps. The Landsat satellite platform has provided consistent observations of the world’s ecosystems since 1972. A number of innovative change detection algorithms have been developed to use the Landsat archive to identify and characterize forest change. The...

  2. Geologic quadrangle maps of the United States: geology of the Casa Diablo Mountain quadrangle, California

    Science.gov (United States)

    Rinehart, C. Dean; Ross, Donald Clarence

    1957-01-01

    The Casa Diablo Mountain quadrangle was mapped in the summers of 1952 and 1953 by the U.S. Geological Survey in cooperation with the California State Division of Mines as part of a study of potential tungsten-bearing areas.

  3. Taxonomic classification of world map units in crop producing areas of Argentina and Brazil with representative US soil series and major land resource areas in which they occur

    Science.gov (United States)

    Huckle, H. F. (Principal Investigator)

    1980-01-01

    The most probable current U.S. taxonomic classifications of the soils estimated to dominate world soil map (WSM) units in selected crop producing states of Argentina and Brazil are presented. Representative U.S. soil series for the units are given. The map units occurring in each state are listed with areal extent and the major U.S. land resource areas in which similar soils most probably occur. Soil series sampled in LARS Technical Report 111579, and the major land resource areas in which they occur, are given with corresponding similar WSM units at the taxonomic subgroup level.

  4. Mapping critical levels of ozone, sulfur dioxide and nitrogen oxide for crops, forests and natural vegetation in the United States

    International Nuclear Information System (INIS)

    Rosenbaum, B.J.; Strickland, T.C.; McDowell, M.K.

    1994-01-01

    Air pollution abatement strategies for controlling nitrogen dioxide, sulfur dioxide, and ozone emissions in the United States focus on a 'standards-based' approach. This approach places limits on air pollution by maintaining a baseline value for air quality, no matter what the ecosystem can or cannot withstand. This paper presents example critical levels maps for the conterminous U.S. developed using the 'effects-based' mapping approach as defined by the United Nations Economic Commission for Europe's Convention on Long-Range Transboundary Air Pollution, Task Force on Mapping. This approach emphasizes the pollution level or load capacity an ecosystem can accommodate before degradation occurs, and allows for analysis of cumulative effects. The paper presents the first stage of an analysis that reports the distribution of exceedances of critical levels for NO2, SO2, and O3 in sensitive forest, crop, and natural vegetation ecosystems in the contiguous United States. It is concluded that extrapolation to surrounding geographic areas requires the analysis of diverse and compounding factors that preclude simple extrapolation methods. Pollutant data depicted in this analysis are limited to locationally specific data, and would be enhanced by utilizing spatial statistics, along with converging associated anthropogenic and climatological factors. Values used for critical levels were derived from current scientific knowledge. While not intended to be definitive values, adjustments will occur as the scientific community gains new insight into pollutant/receptor relationships. We recommend future analyses include a refinement of sensitive receptor data coverages and report relative proportions of exceedances at varying grid scales. 27 refs., 4 figs., 1 tab

  5. Assessment and mapping of slope stability based on slope units: A ...

    Indian Academy of Sciences (India)

    Keywords: shallow landslide; infinite slope stability equation; return period precipitation; assessment; slope unit.

  6. Mapping Investments and Published Outputs in Norovirus Research: A Systematic Analysis of Research Funded in the United States and United Kingdom During 1997-2013.

    Science.gov (United States)

    Head, Michael G; Fitchett, Joseph R; Lichtman, Amos B; Soyode, Damilola T; Harris, Jennifer N; Atun, Rifat

    2016-02-01

    Norovirus accounts for a considerable portion of the global disease burden. Mapping national or international investments relating to norovirus research is limited. We analyzed the focus and type of norovirus research funding awarded to institutions in the United States and United Kingdom during 1997-2013. Data were obtained from key public and philanthropic funders across both countries, and norovirus-related research was identified from study titles and abstracts. Included studies were further categorized by the type of scientific investigation, and awards related to vaccine, diagnostic, and therapeutic research were identified. Norovirus publication trends are also described using data from Scopus. In total, US and United Kingdom funding investment for norovirus research was £97.6 million across 349 awards; 326 awards (amount, £84.9 million) were received by US institutions, and 23 awards (£12.6 million) were received by United Kingdom institutions. Combined, £81.2 million of the funding (83.2%) was for preclinical research, and £16.4 million (16.8%) was for translational science. Investments increased from £1.7 million in 1997 to £11.8 million in 2013. Publication trends showed a consistent temporal increase from 48 in 1997 to 182 in 2013. Despite increases over time, trends in US and United Kingdom funding for norovirus research clearly demonstrate insufficient translational research and limited investment in diagnostics, therapeutics, or vaccine research. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  7. Geoelectric hazard maps for the Mid-Atlantic United States: 100 year extreme values and the 1989 magnetic storm

    Science.gov (United States)

    Love, Jeffrey J.; Lucas, Greg M.; Kelbert, Anna; Bedrosian, Paul A.

    2018-01-01

    Maps of extreme value geoelectric field amplitude are constructed for the Mid‐Atlantic United States, a region with high population density and critically important power grid infrastructure. Geoelectric field time series for the years 1983–2014 are estimated by convolving Earth surface impedances obtained from 61 magnetotelluric survey sites across the Mid‐Atlantic with historical 1 min (2 min Nyquist) measurements of geomagnetic variation obtained from a nearby observatory. Statistical models are fitted to the maximum geoelectric amplitudes occurring during magnetic storms, and extrapolations made to estimate threshold amplitudes only exceeded, on average, once per century. For the Mid‐Atlantic region, 100 year geoelectric exceedance amplitudes have a range of almost 3 orders of magnitude (from 0.04 V/km at a site in southern Pennsylvania to 24.29 V/km at a site in central Virginia), and they have significant geographic granularity, all of which is due to site‐to‐site differences in magnetotelluric impedance. Maps of these 100 year exceedance amplitudes resemble those of the estimated geoelectric amplitudes attained during the March 1989 magnetic storm, and, in that sense, the March 1989 storm resembles what might be loosely called a “100 year” event. The geoelectric hazard maps reported here stand in stark contrast with the 100 year geoelectric benchmarks developed for the North American Electric Reliability Corporation.
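
The extrapolation step described here follows standard extreme-value practice: fit a statistical model to observed storm maxima, then read off the level exceeded on average once per century. An illustrative sketch with synthetic maxima and a Gumbel model (one common choice for maxima, assumed here; the paper does not specify this exact distribution):

```python
import numpy as np
from scipy.stats import gumbel_r

# Hypothetical storm-maximum geoelectric amplitudes (V/km), one per year,
# standing in for a ~32-year observation window like 1983-2014.
rng = np.random.default_rng(1)
annual_maxima = gumbel_r.rvs(loc=0.5, scale=0.3, size=32, random_state=rng)

# Fit the extreme-value model to the maxima, then extrapolate to the
# amplitude exceeded on average once per century (annual exceedance prob. 1%).
loc, scale = gumbel_r.fit(annual_maxima)
level_100yr = gumbel_r.isf(1.0 / 100.0, loc=loc, scale=scale)
print(f"100 yr exceedance amplitude: {level_100yr:.2f} V/km")
```

Repeating this fit site by site, with site-specific magnetotelluric impedances driving the amplitudes, is what produces the geographic granularity in the hazard maps.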


  9. Mapping marginal croplands suitable for cellulosic feedstock crops in the Great Plains, United States

    Science.gov (United States)

    Gu, Yingxin; Wylie, Bruce K.

    2016-01-01

    Growing cellulosic feedstock crops (e.g., switchgrass) for biofuel is more environmentally sustainable than corn-based ethanol. Specifically, this practice can reduce soil erosion and water quality impairment from pesticides and fertilizer, improve ecosystem services and sustainability (e.g., serve as carbon sinks), and minimize impacts on global food supplies. The main goal of this study was to identify high-risk marginal croplands that are potentially suitable for growing cellulosic feedstock crops (e.g., switchgrass) in the US Great Plains (GP). Satellite-derived growing season Normalized Difference Vegetation Index, a switchgrass biomass productivity map obtained from a previous study, US Geological Survey (USGS) irrigation and crop masks, and US Department of Agriculture (USDA) crop indemnity maps for the GP were used in this study. Our hypothesis was that croplands with relatively low crop yield but high productivity potential for switchgrass may be suitable for converting to switchgrass. Areas with relatively low crop indemnity (crop indemnity marginal croplands in the GP are potentially suitable for switchgrass development. The total estimated switchgrass biomass productivity gain from these suitable areas is about 5.9 million metric tons. Switchgrass can be cultivated in either lowland or upland regions in the GP depending on the local soil and environmental conditions. This study improves our understanding of ecosystem services and the sustainability of cropland systems in the GP. Results from this study provide useful information to land managers for making informed decisions regarding switchgrass development in the GP.

  10. Mapping water availability, projected use and cost in the western United States

    Science.gov (United States)

    Tidwell, Vincent C.; Moreland, Barbara D.; Zemlick, Katie M.; Roberts, Barry L.; Passell, Howard D.; Jensen, Daniel; Forsgren, Christopher; Sehlke, Gerald; Cook, Margaret A.; King, Carey W.; Larsen, Sara

    2014-05-01

    New demands for water can be satisfied through a variety of source options. In some basins surface and/or groundwater may be available through permitting with the state water management agency (termed unappropriated water), alternatively water might be purchased and transferred out of its current use to another (termed appropriated water), or non-traditional water sources can be captured and treated (e.g., wastewater). The relative availability and cost of each source are key factors in the development decision. Unfortunately, these measures are location dependent with no consistent or comparable set of data available for evaluating competing water sources. With the help of western water managers, water availability was mapped for over 1200 watersheds throughout the western US. Five water sources were individually examined, including unappropriated surface water, unappropriated groundwater, appropriated water, municipal wastewater and brackish groundwater. Also mapped was projected change in consumptive water use from 2010 to 2030. Associated costs to acquire, convey and treat the water, as necessary, for each of the five sources were estimated. These metrics were developed to support regional water planning and policy analysis with initial application to electric transmission planning in the western US.

  11. Hierarchical Object-Based Mapping of Riverscape Units and in-Stream Mesohabitats Using LiDAR and VHR Imagery

    Directory of Open Access Journals (Sweden)

    Luca Demarchi

    2016-01-01

    Full Text Available In this paper, we present a new, semi-automated methodology for mapping hydromorphological indicators of rivers at a regional scale using multisource remote sensing (RS data. This novel approach is based on the integration of spectral and topographic information within a multilevel, geographic, object-based image analysis (GEOBIA. Different segmentation levels were generated based on the two sources of Remote Sensing (RS data, namely very-high spatial resolution, near-infrared imagery (VHR and high-resolution LiDAR topography. At each level, different input object features were tested with Machine Learning classifiers for mapping riverscape units and in-stream mesohabitats. The GEOBIA approach proved to be a powerful tool for analyzing the river system at different levels of detail and for coupling spectral and topographic datasets, allowing for the delineation of the natural fluvial corridor with its primary riverscape units (e.g., water channel, unvegetated sediment bars, riparian densely-vegetated units, etc. and in-stream mesohabitats with a high level of accuracy, respectively of K = 0.91 and K = 0.83. This method is flexible and can be adapted to different sources of data, with the potential to be implemented at regional scales in the future. The analyzed dataset, composed of VHR imagery and LiDAR data, is nowadays increasingly available at larger scales, notably through European Member States. At the same time, this methodology provides a tool for monitoring and characterizing the hydromorphological status of river systems continuously along the entire channel network and coherently through time, opening novel and significant perspectives to river science and management, notably for planning and targeting actions.

  12. Advanced competencies mapping of critical care nursing: a qualitative research in two Intensive Care Units.

    Science.gov (United States)

    Alfieri, Emanuela; Mori, Marina; Barbui, Valentina; Sarli, Leopoldo

    2017-07-18

    Nowadays, in Italy, the nursing profession has suffered important changes in response to the needs of citizens' health and to improve the quality of the health service in the country.  At the basis of this development there is an increase of the nurses' knowledge, competencies and responsibilities. Currently, the presence of nurses who have followed post-basic training paths, and the subsequent acquisition of advanced clinical knowledge and specializations, has made it essential for the presence of competencies mappings for each specialty, also to differentiate them from general care nurses. The objective is to get a mapping of nurse's individual competencies working in critical care, to analyze the context of the Parma Hospital and comparing it with the Lebanon Heart Hospital in Lebanon. The survey has been done through a series of interviews involving some of the hospital staff, in order to collect opinions about the ICU nurses' competencies. What emerged from the data allowed us to get a list of important abilities, competencies, character traits and  intensive care nurse activities. Italians and Lebanese nurses appear to be prepared from a technical point of view, with a desire for improvement through specializations, masters and enabling courses in advanced health maneuvers. By respondents nurses can seize a strong desire for professional improvement. At the end of our research we were able to draw a list of different individual competencies, behavioral and moral characteristics. The nurse figure has a high potential and large professional improvement prospects, if more taken into account by the health system.

  13. Minimum Wages and Poverty

    OpenAIRE

    Fields, Gary S.; Kanbur, Ravi

    2005-01-01

    Textbook analysis tells us that in a competitive labor market, the introduction of a minimum wage above the competitive equilibrium wage will cause unemployment. This paper makes two contributions to the basic theory of the minimum wage. First, we analyze the effects of a higher minimum wage in terms of poverty rather than in terms of unemployment. Second, we extend the standard textbook model to allow for incomesharing between the employed and the unemployed. We find that there are situation...

  14. Mapping grasslands suitable for cellulosic biofuels in the Greater Platte River Basin, United States

    Science.gov (United States)

    Wylie, Bruce K.; Gu, Yingxin

    2012-01-01

    Biofuels are an important component in the development of alternative energy supplies, which is needed to achieve national energy independence and security in the United States. The most common biofuel product today in the United States is corn-based ethanol; however, its development is limited because of concerns about global food shortages, livestock and food price increases, and water demand increases for irrigation and ethanol production. Corn-based ethanol also potentially contributes to soil erosion, and pesticides and fertilizers affect water quality. Studies indicate that future potential production of cellulosic ethanol is likely to be much greater than grain- or starch-based ethanol. As a result, economics and policy incentives could, in the near future, encourage expansion of cellulosic biofuels production from grasses, forest woody biomass, and agricultural and municipal wastes. If production expands, cultivation of cellulosic feedstock crops, such as switchgrass (Panicum virgatum L.) and miscanthus (Miscanthus species), is expected to increase dramatically. The main objective of this study is to identify grasslands in the Great Plains that are potentially suitable for cellulosic feedstock (such as switchgrass) production. Producing ethanol from noncropland holdings (such as grassland) will minimize the effects of biofuel developments on global food supplies. Our pilot study area is the Greater Platte River Basin, which includes a broad range of plant productivity from semiarid grasslands in the west to the fertile corn belt in the east. The Greater Platte River Basin was the subject of related U.S. Geological Survey (USGS) integrated research projects.

  15. Mapping the potential distribution of the invasive Red Shiner, Cyprinella lutrensis (Teleostei: Cyprinidae) across waterways of the conterminous United States

    Science.gov (United States)

    Poulos, Helen M.; Chernoff, Barry; Fuller, Pam L.; Butman, David

    2012-01-01

    Predicting the future spread of non-native aquatic species continues to be a high priority for natural resource managers striving to maintain biodiversity and ecosystem function. Modeling the potential distributions of alien aquatic species through spatially explicit mapping is an increasingly important tool for risk assessment and prediction. Habitat modeling also facilitates the identification of key environmental variables influencing species distributions. We modeled the potential distribution of an aggressive invasive minnow, the red shiner (Cyprinella lutrensis), in waterways of the conterminous United States using maximum entropy (Maxent). We used inventory records from the USGS Nonindigenous Aquatic Species Database, native records for C. lutrensis from museum collections, and a geographic information system of 20 raster climatic and environmental variables to produce a map of potential red shiner habitat. Summer climatic variables were the most important environmental predictors of C. lutrensis distribution, which was consistent with the high temperature tolerance of this species. Results from this study provide insights into the locations and environmental conditions in the US that are susceptible to red shiner invasion.

  16. Hydrothermal alteration maps of the central and southern Basin and Range province of the United States compiled from Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data

    Science.gov (United States)

    Mars, John L.

    2013-01-01

    Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data and Interactive Data Language (IDL) logical operator algorithms were used to map hydrothermally altered rocks in the central and southern parts of the Basin and Range province of the United States. The hydrothermally altered rocks mapped in this study include (1) hydrothermal silica-rich rocks (hydrous quartz, chalcedony, opal, and amorphous silica), (2) propylitic rocks (calcite-dolomite and epidote-chlorite mapped as separate mineral groups), (3) argillic rocks (alunite-pyrophyllite-kaolinite), and (4) phyllic rocks (sericite-muscovite). A series of hydrothermal alteration maps, which identify the potential locations of hydrothermal silica-rich, propylitic, argillic, and phyllic rocks on Landsat Thematic Mapper (TM) band 7 orthorectified images, and geographic information systems shape files of hydrothermal alteration units are provided in this study.

  17. Mapping and modeling the biogeochemical cycling of turf grasses in the United States.

    Science.gov (United States)

    Milesi, Cristina; Running, Steven W; Elvidge, Christopher D; Dietz, John B; Tuttle, Benjamin T; Nemani, Ramakrishna R

    2005-09-01

    Turf grasses are ubiquitous in the urban landscape of the United States and are often associated with various types of environmental impacts, especially on water resources, yet there have been limited efforts to quantify their total surface and ecosystem functioning, such as their total impact on the continental water budget and potential net ecosystem exchange (NEE). In this study, relating turf grass area to an estimate of fractional impervious surface area, it was calculated that potentially 163,800 km2 (+/- 35,850 km2) of land are cultivated with turf grasses in the continental United States, an area three times larger than that of any irrigated crop. Using the Biome-BGC ecosystem process model, the growth of warm-season and cool-season turf grasses was modeled at a number of sites across the 48 conterminous states under different management scenarios, simulating potential carbon and water fluxes as if the entire turf surface was to be managed like a well-maintained lawn. The results indicate that well-watered and fertilized turf grasses act as a carbon sink. The potential NEE that could derive from the total surface potentially under turf (up to 17 Tg C/yr with the simulated scenarios) would require up to 695 to 900 liters of water per person per day, depending on the modeled water irrigation practices, suggesting that outdoor water conservation practices such as xeriscaping and irrigation with recycled waste-water may need to be extended as many municipalities continue to face increasing pressures on freshwater.

  18. Mapping and Assessment of the United States Ocean Wave Energy Resource

    Energy Technology Data Exchange (ETDEWEB)

    Jacobson, Paul T; Hagerman, George; Scott, George

    2011-12-01

    This project estimates the naturally available and technically recoverable U.S. wave energy resources, using a 51-month Wavewatch III hindcast database developed especially for this study by National Oceanographic and Atmospheric Administration's (NOAA's) National Centers for Environmental Prediction. For total resource estimation, wave power density in terms of kilowatts per meter is aggregated across a unit diameter circle. This approach is fully consistent with accepted global practice and includes the resource made available by the lateral transfer of wave energy along wave crests, which enables wave diffraction to substantially reestablish wave power densities within a few kilometers of a linear array, even for fixed terminator devices. The total available wave energy resource along the U.S. continental shelf edge, based on accumulating unit circle wave power densities, is estimated to be 2,640 TWh/yr, broken down as follows: 590 TWh/yr for the West Coast, 240 TWh/yr for the East Coast, 80 TWh/yr for the Gulf of Mexico, 1570 TWh/yr for Alaska, 130 TWh/yr for Hawaii, and 30 TWh/yr for Puerto Rico. The total recoverable wave energy resource, as constrained by an array capacity packing density of 15 megawatts per kilometer of coastline, with a 100-fold operating range between threshold and maximum operating conditions in terms of input wave power density available to such arrays, yields a total recoverable resource along the U.S. continental shelf edge of 1,170 TWh/yr, broken down as follows: 250 TWh/yr for the West Coast, 160 TWh/yr for the East Coast, 60 TWh/yr for the Gulf of Mexico, 620 TWh/yr for Alaska, 80 TWh/yr for Hawaii, and 20 TWh/yr for Puerto Rico.

  19. Resonating, Rejecting, Reinterpreting: Mapping the Stabilization Discourse in the United Nations Security Council, 2000–14

    Directory of Open Access Journals (Sweden)

    David Curran

    2015-10-01

    Full Text Available This article charts the evolution of the conceptualisation of stabilization in the UN Security Council (UNSC during the period 2001–2014. UNSC open meetings provide an important dataset for a critical review of stabilization discourse and an opportunity to chart the positions of permanent Members, rotating Members and the UN Secretariat towards this concept. This article is the first to conduct an analysis of this material to map the evolution of stabilization in this critical chamber of the UN. This dataset of official statements will be complemented by a review of open source reporting on UNSC meetings and national stabilization doctrines of the ‘P3’ – France, the UK and the US. These countries have developed national stabilization doctrines predominantly to deal with cross-governmental approaches to counterinsurgency operations conducted during the 2000s. The article therefore presents a genealogy of the concept of stabilization in the UNSC to help understand implications for its future development in this multilateral setting. This article begins by examining efforts by the P3 to ‘upload’ their conceptualisations of stabilization into UN intervention frameworks. Secondly, the article uses a content analysis of UNSC debates during 2000–2014 to explore the extent to which the conceptualisation of stabilization resonated with other Council members, were rejected in specific contexts or in general, or were re-interpreted by member states to suit alternative security agendas and interests. Therefore, the article not only examines the UNSC debates surrounding existing UN ‘stabilization operations’ (MONUSCO, MINUSTAH, MINUSCA, MINUSMA, which could be regarded as evidence that this ‘western’ concept has resonated with other UNSC members and relevant UN agencies, but also documents the appearance of stabilization in other contexts too. The article opens new avenues of research into concepts of stabilization within the UN, and

  20. Auxiliary variables for the mapping of the drainage network: spatial correlation between relieve units, lithotypes and springs in Benevente River basin-ES

    Directory of Open Access Journals (Sweden)

    Tony Vinicius Moreira Sampaio

    2014-12-01

    Full Text Available Process of the drainage network mapping present methodological limitations re- sulting in inaccurate maps, restricting their use in environmental studies. Such problems demand the realization of long field surveys to verify the error and the search for auxiliary variables to optimize this works and turn possible the analysis of map accuracy. This research aims at the measurement of the correlation be- tween springs, lithotypes and relieve units, characterized by Roughness Concentration Index (RCI in River Basin Benevente-ES, focusing on the operations of map algebra and the use of spatial statistical techniques. These procedures have identified classes of RCI and lithotypes that present the highest and the lowest correlation with the spatial distribution of springs, indicating its potential use as auxiliary variables to verify the map accuracy.

  1. Mapping Antimicrobial Stewardship in Undergraduate Medical, Dental, Pharmacy, Nursing and Veterinary Education in the United Kingdom.

    Directory of Open Access Journals (Sweden)

    Enrique Castro-Sánchez

    Full Text Available To investigate the teaching of antimicrobial stewardship (AS in undergraduate healthcare educational degree programmes in the United Kingdom (UK.Cross-sectional survey of undergraduate programmes in human and veterinary medicine, dentistry, pharmacy and nursing in the UK. The main outcome measures included prevalence of AS teaching; stewardship principles taught; estimated hours apportioned; mode of content delivery and teaching strategies; evaluation methodologies; and frequency of multidisciplinary learning.80% (112/140 of programmes responded adequately. The majority of programmes teach AS principles (88/109, 80.7%. 'Adopting necessary infection prevention and control precautions' was the most frequently taught principle (83/88, 94.3%, followed by 'timely collection of microbiological samples for microscopy, culture and sensitivity' (73/88, 82.9% and 'minimisation of unnecessary antimicrobial prescribing' (72/88, 81.8%. The 'use of intravenous administration only to patients who are severely ill, or unable to tolerate oral treatment' was reported in ~50% of courses. Only 32/88 (36.3% programmes included all recommended principles.Antimicrobial stewardship principles are included in most undergraduate healthcare and veterinary degree programmes in the UK. However, future professionals responsible for using antimicrobials receive disparate education. Education may be boosted by standardisation and strengthening of less frequently discussed principles.

  2. Mapping watershed potential to contribute phosphorus from geologic materials to receiving streams, southeastern United States

    Science.gov (United States)

    Terziotti, Silvia; Hoos, Anne B.; Harned, Douglas; Garcia, Ana Maria

    2010-01-01

    As part of the southeastern United States SPARROW (SPAtially Referenced Regressions On Watershed attributes) water-quality model implementation, the U.S. Geological Survey created a dataset to characterize the contribution of phosphorus to streams from weathering and erosion of surficial geologic materials. SPARROW provides estimates of total nitrogen and phosphorus loads in surface waters from point and nonpoint sources. The characterization of the contribution of phosphorus from geologic materials is important to help separate the effects of natural or background sources of phosphorus from anthropogenic sources of phosphorus, such as municipal wastewater or agricultural practices. The potential of a watershed to contribute phosphorus from naturally occurring geologic materials to streams was characterized by using geochemical data from bed-sediment samples collected from first-order streams in relatively undisturbed watersheds as part of the multiyear U.S. Geological Survey National Geochemical Survey. The spatial pattern of bed-sediment phosphorus concentration is offered as a tool to represent the best available information at the regional scale. One issue may weaken the use of bed-sediment phosphorus concentration as a surrogate for the potential for geologic materials in the watershed to contribute to instream levels of phosphorus-an unknown part of the variability in bed-sediment phosphorus concentration may be due to the rates of net deposition and processing of phosphorus in the streambed rather than to variability in the potential of the watershed's geologic materials to contribute phosphorus to the stream. Two additional datasets were created to represent the potential of a watershed to contribute phosphorus from geologic materials disturbed by mining activities from active mines and inactive mines.

  3. Mapping integration of midwives across the United States: Impact on access, equity, and outcomes.

    Directory of Open Access Journals (Sweden)

    Saraswathi Vedam

    Full Text Available Our multidisciplinary team examined published regulatory data to inform a 50-state database describing the environment for midwifery practice and interprofessional collaboration. Items (110 detailed differences across jurisdictions in scope of practice, autonomy, governance, and prescriptive authority; as well as restrictions that can affect patient safety, quality, and access to maternity providers across birth settings. A nationwide survey of state regulatory experts (n = 92 verified the 'on the ground' relevance, importance, and realities of local interpretation of these state laws. Using a modified Delphi process, we selected 50/110 key items to include in a weighted, composite Midwifery Integration Scoring (MISS system. Higher scores indicate greater integration of midwives across all settings. We ranked states by MISS scores; and, using reliable indicators in the CDC-Vital Statistics Database, we calculated correlation coefficients between MISS scores and maternal-newborn outcomes by state, as well as state density of midwives and place of birth. We conducted hierarchical linear regression analysis to control for confounding effects of race.MISS scores ranged from lowest at 17 (North Carolina to highest at 61 (Washington, out of 100 points. Higher MISS scores were associated with significantly higher rates of spontaneous vaginal delivery, vaginal birth after cesarean, and breastfeeding, and significantly lower rates of cesarean, preterm birth, low birth weight infants, and neonatal death. MISS scores also correlated with density of midwives and access to care across birth settings. Significant differences in newborn outcomes accounted for by MISS scores persisted after controlling for proportion of African American births in each state.The MISS scoring system assesses the level of integration of midwives and evaluates regional access to high quality maternity care. 
In the United States, higher MISS Scores were associated with significantly

  4. Heat Maps of Hypertension, Diabetes Mellitus, and Smoking in the Continental United States.

    Science.gov (United States)

    Loop, Matthew Shane; Howard, George; de Los Campos, Gustavo; Al-Hamdan, Mohammad Z; Safford, Monika M; Levitan, Emily B; McClure, Leslie A

    2017-01-01

    Geographic variations in cardiovascular mortality are substantial, but descriptions of geographic variations in major cardiovascular risk factors have relied on data aggregated to counties. Herein, we provide the first description of geographic variation in the prevalence of hypertension, diabetes mellitus, and smoking within and across US counties. We conducted a cross-sectional analysis of baseline risk factor measurements and latitude/longitude of participant residence collected from 2003 to 2007 in the REGARDS study (Reasons for Geographic and Racial Differences in Stroke). Of the 30 239 participants, all risk factor measurements and location data were available for 28 887 (96%). The mean (±SD) age of these participants was 64.8(±9.4) years; 41% were black; 55% were female; 59% were hypertensive; 22% were diabetic; and 15% were current smokers. In logistic regression models stratified by race, the median(range) predicted prevalence of the risk factors were as follows: for hypertension, 49% (45%-58%) among whites and 72% (68%-78%) among blacks; for diabetes mellitus, 14% (10%-20%) among whites and 31% (28%-41%) among blacks; and for current smoking, 12% (7%-16%) among whites and 18% (11%-22%) among blacks. Hypertension was most prevalent in the central Southeast among whites, but in the west Southeast among blacks. Diabetes mellitus was most prevalent in the west and central Southeast among whites but in south Florida among blacks. Current smoking was most prevalent in the west Southeast and Midwest among whites and in the north among blacks. Geographic disparities in prevalent hypertension, diabetes mellitus, and smoking exist within states and within counties in the continental United States, and the patterns differ by race. © 2017 American Heart Association, Inc.

  5. Mapping integration of midwives across the United States: Impact on access, equity, and outcomes

    Science.gov (United States)

    Stoll, Kathrin; MacDorman, Marian; Declercq, Eugene; Cramer, Renee; Cheyney, Melissa; Fisher, Timothy; Butt, Emma; Yang, Y. Tony; Powell Kennedy, Holly

    2018-01-01

    birth settings. Significant differences in newborn outcomes accounted for by MISS scores persisted after controlling for proportion of African American births in each state. Conclusion The MISS scoring system assesses the level of integration of midwives and evaluates regional access to high quality maternity care. In the United States, higher MISS Scores were associated with significantly higher rates of physiologic birth, less obstetric interventions, and fewer adverse neonatal outcomes. PMID:29466389

  6. Minimum critical mass systems

    International Nuclear Information System (INIS)

    Dam, H. van; Leege, P.F.A. de

    1987-01-01

    An analysis is presented of thermal systems with minimum critical mass, based on the use of materials with optimum neutron moderating and reflecting properties. The optimum fissile material distributions in the systems are obtained by calculations with standard computer codes, extended with a routine for flat fuel importance search. It is shown that in the minimum critical mass configuration a considerable part of the fuel is positioned in the reflector region. For 239 Pu a minimum critical mass of 87 g is found, which is the lowest value reported hitherto. (author)

  7. Minimum entropy production principle

    Czech Academy of Sciences Publication Activity Database

    Maes, C.; Netočný, Karel

    2013-01-01

    Roč. 8, č. 7 (2013), s. 9664-9677 ISSN 1941-6016 Institutional support: RVO:68378271 Keywords : MINEP Subject RIV: BE - Theoretical Physics http://www.scholarpedia.org/article/Minimum_entropy_production_principle

  8. Removing non-urban roads from the National Land Cover Database to create improved urban maps for the United States, 1992-2011

    Science.gov (United States)

    Soulard, Christopher E.; Acevedo, William; Stehman, Stephen V.

    2018-01-01

    Quantifying change in urban land provides important information to create empirical models examining the effects of human land use. Maps of developed land from the National Land Cover Database (NLCD) of the conterminous United States include rural roads in the developed land class and therefore overestimate the amount of urban land. To better map the urban class and understand how urban lands change over time, we removed rural roads and small patches of rural development from the NLCD developed class and created four wall-to-wall maps (1992, 2001, 2006, and 2011) of urban land. Removing rural roads from the NLCD developed class involved a multi-step filtering process, data fusion using geospatial road and developed land data, and manual editing. Reference data classified as urban or not urban from a stratified random sample was used to assess the accuracy of the 2001 and 2006 urban and NLCD maps. The newly created urban maps had higher overall accuracy (98.7 percent) than the NLCD maps (96.2 percent). More importantly, the urban maps resulted in lower commission error of the urban class (23 percent versus 57 percent for the NLCD in 2006) with the trade-off of slightly inflated omission error (20 percent for the urban map, 16 percent for NLCD in 2006). The removal of approximately 230,000 km2 of rural roads from the NLCD developed class resulted in maps that better characterize the urban footprint. These urban maps are more suited to modeling applications and policy decisions that rely on quantitative and spatially explicit information regarding urban lands.

  9. Continuous bedside pressure mapping and rates of hospital-associated pressure ulcers in a medical intensive care unit.

    Science.gov (United States)

    Behrendt, Robert; Ghaznavi, Amir M; Mahan, Meredith; Craft, Susan; Siddiqui, Aamir

    2014-03-01

    Critically ill patients are vulnerable to the development of hospital-associated pressure ulcers (HAPUs). Positioning of patients is an essential component of pressure ulcer prevention because it off-loads areas of high pressure. However, the effectiveness of such positioning is debatable. A continuous bedside pressure mapping (CBPM) device can provide real-time feedback of optimal body position though a pressure-sensing mat that displays pressure images at a patient's bedside, allowing off-loading of high-pressure areas and possibly preventing HAPU formation. A prospective controlled study was designed to determine if CBPM would reduce the number of HAPUs in patients treated in our medical intensive care unit. In 2 months, 422 patients were enrolled and assigned to beds equipped with or without a CBPM device. Patients' skin was assessed daily and weekly to determine the presence and progress of HAPUs. All patients were turned every 2 hours. CBPM patients were repositioned to off-load high-pressure points during turning, according to a graphic display. The number of newly formed HAPUs was the primary outcome measured. A χ(2) test was then used to compare the occurrence of HAPUs between groups. HAPUs developed in 2 of 213 patients in the CBPM group (0.9%; both stage II) compared with 10 of 209 in the control group (4.8%; all stage II; P = .02). Significantly fewer HAPUs occurred in the CBPM group than the control group, indicating the effectiveness of real-time visual feedback in repositioning of patients to prevent the formation of new HAPUs.

  10. Modeled changes in 100 year Flood Risk and Asset Damages within Mapped Floodplains of the Contiguous United States

    Science.gov (United States)

    Wobus, C. W.; Gutmann, E. D.; Jones, R.; Rissing, M.; Mizukami, N.; Lorie, M.; Mahoney, H.; Wood, A.; Mills, D.; Martinich, J.

    2017-12-01

    A growing body of recent work suggests that the extreme weather events that drive inland flooding are likely to increase in frequency and magnitude in a warming climate, thus increasing monetary damages from flooding in the future. We use hydrologic projections based on the Coupled Model Intercomparison Project Phase 5 (CMIP5) to estimate changes in the frequency of modeled 1% annual exceedance probability flood events at 57,116 locations across the contiguous United States (CONUS). We link these flood projections to a database of assets within mapped flood hazard zones to model changes in inland flooding damages throughout the CONUS over the remainder of the 21st century, under two greenhouse gas (GHG) emissions scenarios. Our model generates early 21st century flood damages that reasonably approximate the range of historical observations, and trajectories of future damages that vary substantially depending on the GHG emissions pathway. The difference in modeled flood damages between higher and lower emissions pathways approaches $4 billion per year by 2100 (in undiscounted 2014 dollars), suggesting that aggressive GHG emissions reductions could generate significant monetary benefits over the long-term in terms of reduced flood risk. Although the downscaled hydrologic data we used have been applied to flood impacts studies elsewhere, this research expands on earlier work to quantify changes in flood risk by linking future flood exposure to assets and damages at a national scale. Our approach relies on a series of simplifications that could ultimately affect damage estimates (e.g., use of statistical downscaling, reliance on a nationwide hydrologic model, and linking damage estimates only to 1% AEP floods). Although future work is needed to test the sensitivity of our results to these methodological choices, our results suggest that monetary damages from inland flooding could be substantially reduced through more aggressive GHG mitigation policies.
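    One common way to turn exceedance probabilities into monetary risk is to integrate damages over the exceedance-probability axis to obtain expected annual damages (EAD). This is a generic illustration with hypothetical probability/damage pairs; the study itself links damages only to the 1% AEP event:

```python
# Expected annual damage (EAD) as the area under the damage-exceedance curve,
# integrated with the trapezoidal rule. Values are hypothetical.
aep =    [0.10, 0.04, 0.02, 0.01, 0.002]  # annual exceedance probabilities
damage = [0.0,  2.0,  5.0,  9.0,  20.0]   # corresponding damages, $ billions

ead = 0.0
for k in range(len(aep) - 1):
    width = aep[k] - aep[k + 1]                      # probability interval
    ead += width * (damage[k] + damage[k + 1]) / 2   # trapezoid area
print(f"EAD ≈ ${ead:.3f} billion per year")
```

    A climate-driven shift in flood frequency enters this calculation by moving the AEP assigned to each damage level, which is why changes in the 1% AEP event propagate directly into annual damages.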

  11. Climate change impacts on flood risk and asset damages within mapped 100-year floodplains of the contiguous United States

    Science.gov (United States)

    Wobus, Cameron; Gutmann, Ethan; Jones, Russell; Rissing, Matthew; Mizukami, Naoki; Lorie, Mark; Mahoney, Hardee; Wood, Andrew W.; Mills, David; Martinich, Jeremy

    2017-12-01

    A growing body of work suggests that the extreme weather events that drive inland flooding are likely to increase in frequency and magnitude in a warming climate, thus potentially increasing flood damages in the future. We use hydrologic projections based on the Coupled Model Intercomparison Project Phase 5 (CMIP5) to estimate changes in the frequency of modeled 1 % annual exceedance probability (1 % AEP, or 100-year) flood events at 57 116 stream reaches across the contiguous United States (CONUS). We link these flood projections to a database of assets within mapped flood hazard zones to model changes in inland flooding damages throughout the CONUS over the remainder of the 21st century. Our model generates early 21st century flood damages that reasonably approximate the range of historical observations and trajectories of future damages that vary substantially depending on the greenhouse gas (GHG) emissions pathway. The difference in modeled flood damages between higher and lower emissions pathways approaches USD 4 billion per year by 2100 (in undiscounted 2014 dollars), suggesting that aggressive GHG emissions reductions could generate significant monetary benefits over the long term in terms of reduced flood damages. Although the downscaled hydrologic data we used have been applied to flood impacts studies elsewhere, this research expands on earlier work to quantify changes in flood risk by linking future flood exposure to assets and damages on a national scale. Our approach relies on a series of simplifications that could ultimately affect damage estimates (e.g., use of statistical downscaling, reliance on a nationwide hydrologic model, and linking damage estimates only to 1 % AEP floods). Although future work is needed to test the sensitivity of our results to these methodological choices, our results indicate that monetary damages from inland flooding could be significantly reduced through substantial GHG mitigation.

  12. Climate change impacts on flood risk and asset damages within mapped 100-year floodplains of the contiguous United States

    Directory of Open Access Journals (Sweden)

    C. Wobus

    2017-12-01

    A growing body of work suggests that the extreme weather events that drive inland flooding are likely to increase in frequency and magnitude in a warming climate, thus potentially increasing flood damages in the future. We use hydrologic projections based on the Coupled Model Intercomparison Project Phase 5 (CMIP5) to estimate changes in the frequency of modeled 1 % annual exceedance probability (1 % AEP, or 100-year) flood events at 57 116 stream reaches across the contiguous United States (CONUS). We link these flood projections to a database of assets within mapped flood hazard zones to model changes in inland flooding damages throughout the CONUS over the remainder of the 21st century. Our model generates early 21st century flood damages that reasonably approximate the range of historical observations and trajectories of future damages that vary substantially depending on the greenhouse gas (GHG) emissions pathway. The difference in modeled flood damages between higher and lower emissions pathways approaches USD 4 billion per year by 2100 (in undiscounted 2014 dollars), suggesting that aggressive GHG emissions reductions could generate significant monetary benefits over the long term in terms of reduced flood damages. Although the downscaled hydrologic data we used have been applied to flood impacts studies elsewhere, this research expands on earlier work to quantify changes in flood risk by linking future flood exposure to assets and damages on a national scale. Our approach relies on a series of simplifications that could ultimately affect damage estimates (e.g., use of statistical downscaling, reliance on a nationwide hydrologic model, and linking damage estimates only to 1 % AEP floods). Although future work is needed to test the sensitivity of our results to these methodological choices, our results indicate that monetary damages from inland flooding could be significantly reduced through substantial GHG mitigation.

  13. Geospatial compilation and digital map of center-pivot irrigated areas in the mid-Atlantic region, United States

    Science.gov (United States)

    Finkelstein, Jason S.; Nardi, Mark R.

    2015-01-01

    To evaluate water availability within the Northern Atlantic Coastal Plain, the U.S. Geological Survey, in cooperation with the University of Delaware Agricultural Extension, created a dataset that maps the number of acres under center-pivot irrigation in the Northern Atlantic Coastal Plain study area. For this study, the extent of the Northern Atlantic Coastal Plain falls within areas of the States of New York, New Jersey, Delaware, Maryland, Virginia, and North Carolina. The irrigation dataset maps about 271,900 acres operated primarily under center-pivot irrigation in 57 counties. Manual digitizing was performed against aerial imagery in a process where operators used observable center-pivot irrigation signatures—such as irrigation arms, concentric wheel paths through cropped areas, and differential colors—to identify and map irrigated areas. The aerial imagery used for digitizing came from a variety of sources and seasons. The imagery contained a variety of spatial resolutions and included online imagery from the U.S. Department of Agriculture National Agricultural Imagery Program, Microsoft Bing Maps, and the Google Maps mapping service. The dates of the source images ranged from 2010 to 2012 for the U.S. Department of Agriculture imagery, whereas maps from the other mapping services were from 2013.

  14. Rising above the Minimum Wage.

    Science.gov (United States)

    Even, William; Macpherson, David

    An in-depth analysis was made of how quickly most people move up the wage scale from minimum wage, what factors influence their progress, and how minimum wage increases affect wage growth above the minimum. Very few workers remain at the minimum wage over the long run, according to this study of data drawn from the 1977-78 May Current Population…

  15. Mapping the universe.

    Science.gov (United States)

    Geller, M J; Huchra, J P

    1989-11-17

    Maps of the galaxy distribution in the nearby universe reveal large coherent structures. The extent of the largest features is limited only by the size of the survey. Voids with a density typically 20 percent of the mean and with diameters of 5000 km s⁻¹ are present in every survey large enough to contain them. Many galaxies lie in thin sheet-like structures. The largest sheet detected so far is the "Great Wall" with a minimum extent of 60 h⁻¹ Mpc × 170 h⁻¹ Mpc, where h is the Hubble constant in units of 100 km s⁻¹ Mpc⁻¹. The frequent occurrence of these structures is one of several serious challenges to our current understanding of the origin and evolution of the large-scale distribution of matter in the universe.
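    The h-scaled distances above follow from Hubble's law, d = v/H0 with H0 = 100 h km s⁻¹ Mpc⁻¹, so a structure quoted in velocity units converts directly to h⁻¹ Mpc. A one-line sketch of the conversion:

```python
# Convert a recession velocity to a comoving distance with Hubble's law,
# d = v / H0, in the h-scaled units used above (H0 = 100 h km/s/Mpc).
def distance_h_inv_mpc(velocity_km_s: float) -> float:
    """Distance in units of h^-1 Mpc for a given recession velocity."""
    H0_over_h = 100.0  # km/s per (h^-1 Mpc)
    return velocity_km_s / H0_over_h

# A void diameter quoted as 5000 km/s corresponds to 50 h^-1 Mpc.
print(distance_h_inv_mpc(5000.0))  # 50.0
```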

  16. Assessing Accessibility and Transport Infrastructure Inequities in Administrative Units in Serbia’s Danube Corridor Based on Multi-Criteria Analysis and Gis Mapping Tools

    Directory of Open Access Journals (Sweden)

    Ana VULEVIC

    2018-02-01

    The Danube regions, especially the sub-national units of governance, must be ready to play an active role in spatial development policies. A precondition for this is good accessibility and the coordinated development of all transport systems in the Danube corridor. The main contribution of this paper is a multi-criteria model to support decision making in the evaluation of transportation accessibility in Serbia's Danube corridor. Geographic Information System (GIS)-based maps indicate existing inequities in the counties' transport infrastructure (between well-connected and isolated counties) in terms of accessibility to central places. Through the research, relevant indicators have been identified. This provides an outline of transportation perspectives regarding the development achieved and fosters increased transportation accessibility in some peripheral Serbian Danube administrative units, i.e., counties (Nomenclature of Territorial Units for Statistics level 3, NUTS 3).

  17. Minimum Error Entropy Classification

    CERN Document Server

    Marques de Sá, Joaquim P; Santos, Jorge M F; Alexandre, Luís A

    2013-01-01

    This book explains the minimum error entropy (MEE) concept applied to data classification machines. Theoretical results on the inner workings of the MEE concept, in its application to solving a variety of classification problems, are presented in the wider realm of risk functionals. Researchers and practitioners will also find in the book a detailed presentation of practical data classifiers using MEE. These include multi‐layer perceptrons, recurrent neural networks, complex-valued neural networks, modular neural networks, and decision trees. A clustering algorithm using a MEE‐like concept is also presented. Examples, tests, evaluation experiments and comparisons with similar machines using classic approaches complement the descriptions.

  18. Performance of USGS one-year earthquake hazard map for natural and induced seismicity in the central and eastern United States

    Science.gov (United States)

    Brooks, E. M.; Stein, S.; Spencer, B. D.; Salditch, L.; Petersen, M. D.; McNamara, D. E.

    2017-12-01

    Seismicity in the central United States has dramatically increased since 2008 due to the injection of wastewater produced by oil and gas extraction. In response, the USGS created a one-year probabilistic hazard model and map for 2016 to describe the increased hazard posed to the central and eastern United States. Using the intensity of shaking reported to the "Did You Feel It?" system during 2016, we assess the performance of this model. Assessing the performance of earthquake hazard maps for natural and induced seismicity is conceptually similar but has practical differences. Maps that have return periods of hundreds or thousands of years, as commonly used for natural seismicity, can be assessed using historical intensity data that also span hundreds or thousands of years. Several different features stand out when assessing the USGS 2016 seismic hazard model for the central and eastern United States from induced and natural earthquakes. First, the model can be assessed as a forecast in one year, because event rates are sufficiently high to permit evaluation with one year of data. Second, because these models are projections from the previous year, implicitly assuming that fluid injection rates remain the same, misfit may reflect changes in human activity. Our results suggest that the model was very successful by the metric implicit in probabilistic seismic hazard assessment: namely, that the fraction of sites at which the maximum shaking exceeded the mapped value is comparable to that expected. The model also did well by a misfit metric that compares the spatial patterns of predicted and maximum observed shaking. This was true for both the central and eastern United States as a whole, and for the region within it with the highest amount of seismicity, Oklahoma and its surrounding area. The model performed least well in northern Texas, overstating hazard, presumably because lower oil and gas prices and regulatory action reduced the water injection volume
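    The exceedance metric described above can be sketched as follows. The site counts are hypothetical; the function simply applies the binomial expectation 1 - (1 - p)^t for a map level with annual exceedance probability p observed over t years:

```python
# Hazard-map performance metric sketch: the fraction of sites where observed
# maximum shaking exceeded the mapped value should be comparable to the
# fraction expected from the map's exceedance probability.
def expected_exceedance_fraction(p_annual: float, years: float) -> float:
    """Expected fraction of sites exceeding a p_annual-probability map level."""
    return 1.0 - (1.0 - p_annual) ** years

# A one-year forecast map with 1% annual exceedance probability:
expected = expected_exceedance_fraction(0.01, 1)
# The same probability level accumulated over 50 years of observations:
long_term = expected_exceedance_fraction(0.01, 50)

observed_fraction = 12 / 1000  # hypothetical: 12 of 1000 sites exceeded the map
print(f"expected {expected:.3f} (1 yr), {long_term:.3f} (50 yr); "
      f"observed {observed_fraction:.3f}")
```

    This is why one-year induced-seismicity maps can be tested with a single year of "Did You Feel It?" data, while maps for natural seismicity need intensity records spanning many decades.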

  19. Do Minimum Wages Fight Poverty?

    OpenAIRE

    David Neumark; William Wascher

    1997-01-01

    The primary goal of a national minimum wage floor is to raise the incomes of poor or near-poor families with members in the work force. However, estimates of employment effects of minimum wages tell us little about whether minimum wages can achieve this goal; even if the disemployment effects of minimum wages are modest, minimum wage increases could result in net income losses for poor families. We present evidence on the effects of minimum wages on family incomes from matched March CPS s...

  20. Analyzing thematic maps and mapping for accuracy

    Science.gov (United States)

    Rosenfield, G.H.

    1982-01-01

    Two problems that exist when attempting to test the accuracy of thematic maps and mapping are: (1) evaluating the accuracy of thematic content, and (2) evaluating the effects of the variables on thematic mapping. Statistical analysis techniques are applicable to both these problems and include techniques for sampling the data and determining their accuracy. In addition, techniques for hypothesis testing, or inferential statistics, are used when comparing the effects of variables. A comprehensive and valid accuracy test of a classification project, such as thematic mapping from remotely sensed data, includes the following components of statistical analysis: (1) sample design, including the sample distribution, sample size, size of the sample unit, and sampling procedure; and (2) accuracy estimation, including estimation of the variance and confidence limits. Careful consideration must be given to the minimum sample size necessary to validate the accuracy of a given classification category. The results of an accuracy test are presented in a contingency table sometimes called a classification error matrix. Usually the rows represent the interpretation, and the columns represent the verification. The diagonal elements represent the correct classifications. The remaining elements of the rows represent errors by commission, and the remaining elements of the columns represent the errors of omission. For tests of hypothesis that compare variables, the general practice has been to use only the diagonal elements from several related classification error matrices. These data are arranged in the form of another contingency table. The columns of the table represent the different variables being compared, such as different scales of mapping. The rows represent the blocking characteristics, such as the various categories of classification.
The values in the cells of the tables might be the counts of correct classification or the binomial proportions of these counts divided by

  1. An Alternative Approach to Mapping Thermophysical Units from Martian Thermal Inertia and Albedo Data Using a Combination of Unsupervised Classification Techniques

    Directory of Open Access Journals (Sweden)

    Eriita Jones

    2014-06-01

    Thermal inertia and albedo provide information on the distribution of surface materials on Mars. These parameters have been mapped globally on Mars by the Thermal Emission Spectrometer (TES) onboard the Mars Global Surveyor. Two-dimensional clusters of thermal inertia and albedo reflect the thermophysical attributes of the dominant materials on the surface. In this paper three automated, non-deterministic, algorithmic classification methods are employed for defining thermophysical units: Expectation Maximisation of a Gaussian Mixture Model; the Iterative Self-Organizing Data Analysis Technique (ISODATA); and Maximum Likelihood. We analyse the behaviour of the thermophysical classes resulting from the three classifiers, operating on the 2007 TES thermal inertia and albedo datasets. Producing a rigorous mapping of thermophysical classes at ~3 km/pixel resolution remains important for constraining the geologic processes that have shaped the Martian surface on a regional scale, and for choosing appropriate landing sites. The results from applying these algorithms are compared to geologic maps, surface data from lander missions, features derived from imaging, and previous classifications of thermophysical units which utilized manual (and potentially more time-consuming) classification methods. These comparisons comprise data suitable for validation of our classifications. Our work shows that a combination of the algorithms (ISODATA and Maximum Likelihood) optimises the sensitivity to the underlying data space, and that new information on Martian surface materials can be obtained by using these methods. We demonstrate that the algorithms used here can be applied to define a finer partitioning of albedo and thermal inertia for a more detailed mapping of surface materials, grain sizes and thermal behaviour of the Martian surface and shallow subsurface, at the ~3 km scale.
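    A toy version of the clustering step, assuming just two thermophysical classes and a handful of hypothetical (thermal inertia, albedo) pairs. Note that the unscaled Euclidean distance is dominated by the inertia axis, which real processing would normalise away:

```python
# A minimal 2-means (Lloyd) clustering sketch in the spirit of ISODATA, run on
# hypothetical (thermal inertia, albedo) pairs. Real TES processing uses full
# global rasters, more clusters, and normalised axes; this only shows the loop.
points = [(150, 0.30), (160, 0.28), (140, 0.32),   # low-inertia, bright (dusty)
          (420, 0.12), (440, 0.10), (400, 0.14)]   # high-inertia, dark (rocky)

def mean(cluster):
    n = len(cluster)
    return (sum(p[0] for p in cluster) / n, sum(p[1] for p in cluster) / n)

centers = [points[0], points[3]]  # seed with one point from each regime
for _ in range(10):               # a few assignment/update iterations
    clusters = [[], []]
    for p in points:
        d = [(p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 for c in centers]
        clusters[d.index(min(d))].append(p)  # assign to nearest center
    centers = [mean(c) for c in clusters]    # recompute cluster means

print(centers)  # ≈ [(150.0, 0.30), (420.0, 0.12)]
```

    ISODATA extends this loop with automatic splitting and merging of clusters, which is what makes it attractive for unsupervised thermophysical mapping.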

  2. Employment effects of minimum wages

    OpenAIRE

    Neumark, David

    2014-01-01

    The potential benefits of higher minimum wages come from the higher wages for affected workers, some of whom are in low-income families. The potential downside is that a higher minimum wage may discourage employers from using the low-wage, low-skill workers that minimum wages are intended to help. Research findings are not unanimous, but evidence from many countries suggests that minimum wages reduce the jobs available to low-skill workers.

  3. Studies on reduction of dosimeter used in the product dose mapping process at Sinagama Plant

    International Nuclear Information System (INIS)

    Sofian Ibrahim; Syuhada Ramli; Cosmos George; Zarina Mohd Nor; Kamarudin Buyong; Shahidan Yob; Nor Ishadi Ismail; Mohd Sidek Othman; Ahsanulkhaliqin Abdul Wahab; Mohd Khairul Azfar Ramli

    2012-01-01

    Product dose mapping is the determination of the best product loading configuration to be used during routine sterilization. In product dose mapping, dosimeters are placed throughout products at strategic locations to determine the zones of minimum and maximum dose. In Sinagama's previous product dose mapping method, a total of 240 ceric-cerous dosimeters were used for a tote. Based on the data obtained from the Irradiator Dose Mapping Report in 2004 and data from recent studies, the number of dosimeters used in product dose mapping can be reduced to 28 units without sacrificing the precision and accuracy of the dose mapping results. This also led to a change of the dosimeter placement method from a plane system to a coordinate system. The 88 % reduction in dosimeter usage will directly reduce dosimeter costs, time, and labor. (author)

  4. 75 FR 6151 - Minimum Capital

    Science.gov (United States)

    2010-02-08

    ... capital and reserve requirements to be issued by order or regulation with respect to a product or activity... minimum capital requirements. Section 1362(a) establishes a minimum capital level for the Enterprises... entities required under this section.\\6\\ \\3\\ The Bank Act's current minimum capital requirements apply to...

  5. Application of a GIS-Based Slope Unit Method for Landslide Susceptibility Mapping along the Longzi River, Southeastern Tibetan Plateau, China

    Directory of Open Access Journals (Sweden)

    Fei Wang

    2017-06-01

    The Longzi River Basin in Tibet is located along the edge of the Himalaya Mountains and is characterized by complex geological conditions and numerous landslides. To evaluate landslide susceptibility in this area, eight basic factors were analyzed comprehensively in order to obtain a final susceptibility map. The eight factors are the slope angle, slope aspect, plan curvature, distance-to-fault, distance-to-river, topographic relief, annual precipitation, and lithology. Except for the rainfall factor, which was extracted from the grid cell, all the factors were extracted and classified by the slope unit, which is the basic unit in geological disaster development. The eight factors were superimposed using the information content method (ICM), and the weight of each factor was acquired through an analytic hierarchy process (AHP). Landslide susceptibility was divided into four categories: low, moderate, high, and very high, accounting for 22.76%, 38.64%, 27.51%, and 11.09% of the study area, respectively. The area under the curve (AUC) accuracies using slope units and grid cells are 82.6% and 84.2%, respectively, which means that both methods are accurate in predicting landslide occurrence. The results show that the high and very high susceptibility areas are distributed throughout the vicinity of the river, with a large component in the north as well as a small portion in the middle and the south. Therefore, it is necessary to conduct landslide warnings in these areas, where the rivers are vast and the population is dense. The susceptibility map can reflect the comprehensive risk of each slope unit, which provides an important reference for later detailed investigations, including research and warning studies.
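    The AHP weighting step can be sketched as a power iteration that recovers the principal eigenvector of a pairwise comparison matrix. The 3×3 judgments below are hypothetical, not the study's:

```python
# Deriving AHP criterion weights by power iteration on the principal
# eigenvector of a pairwise comparison matrix. The matrix is hypothetical
# (e.g., factor 1 judged twice as important as factor 2, four times factor 3).
matrix = [
    [1.0,  2.0, 4.0],
    [0.5,  1.0, 2.0],
    [0.25, 0.5, 1.0],
]

w = [1.0, 1.0, 1.0]          # initial guess
for _ in range(50):          # repeated multiply-and-normalise
    w = [sum(matrix[i][j] * w[j] for j in range(3)) for i in range(3)]
    total = sum(w)
    w = [x / total for x in w]  # weights sum to 1

print([round(x, 4) for x in w])  # [0.5714, 0.2857, 0.1429], i.e. 4/7, 2/7, 1/7
```

    For a perfectly consistent matrix like this one the iteration converges immediately; in practice AHP also checks a consistency ratio before accepting the judgments.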

  6. Mapping of lithologic and structural units using multispectral imagery. [Afar-Triangle/Ethiopia and adjacent areas (Ethiopian Plateau, Somali Plateau, and parts of Yemen and Saudi Arabia)

    Science.gov (United States)

    Kronberg, P. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. ERTS-1 MSS imagery covering the Afar-Triangle/Ethiopia and adjacent regions (Ethiopian Plateau, Somali Plateau, and parts of Yemen and Saudi Arabia) was applied to the mapping of lithologic and structural units of the test area at a scale of 1:1,000,000. Results of the geological evaluation of the ERTS-1 imagery of the Afar have proven the usefulness of this type of satellite data for regional geological mapping. Evaluation of the ERTS images also resulted in new aspects of the structural setting and tectonic development of the Afar-Triangle, where three large rift systems, the oceanic rifts of the Red Sea and Gulf of Aden and the continental East African rift system, seem to meet each other. Surface structures mapped by ERTS do not indicate that the oceanic rift of the Gulf of Aden (Sheba Ridge) continues into the area of continental crust west of the Gulf of Tadjura. ERTS data show that the Wonji fault belt of the African rift system does not enter or cut through the central Afar. The Aysha-Horst is not a horst but an autochthonous spur of the Somali Plateau.

  7. Global Geological Map of Venus

    Science.gov (United States)

    Ivanov, M. A.

    2008-09-01

    units is ~81.7% of the map area, whereas the younger units cover ~14.1% of the surface. Depending upon the estimates of T (750 Ma [36], 500 Ma [37], 300 Ma [38]), the duration of the Fortunian Period can be from 300 m.y. (T=750 Ma) to 120 m.y. (T=300 Ma). The minimum integrated resurfacing rate (both volcanic and tectonic) at this time was from ~1.2 to ~3.1 km²/y. The duration of the Atlian Period is estimated to be from 750 to 300 m.y., and the integrated resurfacing rate during this period could be from ~0.2 to ~0.4 km²/y. Such a significant drop in resurfacing rates suggests that the Fortunian and Atlian periods correspond to two different geodynamic regimes, probably related to different regimes of mantle convection and lithospheric properties.

  8. Evaluation of Electromagnetic Induction to Characterize and Map Sodium-Affected Soils in the Northern Great Plains of the United States

    Science.gov (United States)

    Brevik, E. C.; Heilig, J.; Kempenich, J.; Doolittle, J.; Ulmer, M.

    2012-04-01

    Sodium-affected soils (SAS) cover over 4 million hectares in the Northern Great Plains of the United States. Improving the classification, interpretation, and mapping of SAS is a major goal of the United States Department of Agriculture-Natural Resource Conservation Service (USDA-NRCS) as Northern Great Plains soil surveys are updated. Apparent electrical conductivity (ECa) as measured with ground conductivity meters has shown promise for mapping SAS; however, the use of this geophysical tool needs additional evaluation. This study used an EM-38 MK2-2 meter (Geonics Limited, Mississauga, Ontario), a Trimble AgGPS 114 L-band DGPS (Trimble, Sunnyvale, CA) and the RTmap38MK2 program (Geomar Software, Inc., Mississauga, Ontario) on an Allegro CX field computer (Juniper Systems, North Logan, UT) to collect, observe, and interpret ECa data in the field. The ECa map generated on-site was then used to guide collection of soil samples for soil characterization and to evaluate the influence of soil properties in SAS on ECa as measured with the EM-38 MK2-2. Stochastic models contained in the ESAP software package were used to estimate the SAR and salinity levels from the measured ECa data in 30 cm depth intervals to a depth of 90 cm and for the bulk soil (0 to 90 cm). This technique showed promise, with meaningful spatial patterns apparent in the ECa data. However, many of the stochastic models used for salinity and SAR for individual depth intervals and for the bulk soil had low R-squared values. At both sites, significant variability in soil clay and water contents along with a small number of soil samples taken to calibrate the ECa values to soil properties likely contributed to these low R-squared values.
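    A minimal sketch of the calibration idea behind the R-squared values discussed above: ordinary least squares of a soil property on ECa, with hypothetical data (the study's actual ESAP models are stochastic and depth-resolved):

```python
# Ordinary least squares of a hypothetical sodium adsorption ratio (SAR)
# against apparent electrical conductivity (ECa, mS/m), with R^2 reported.
eca = [10.0, 20.0, 30.0, 40.0, 50.0, 60.0]
sar = [2.1, 3.9, 6.2, 7.8, 10.1, 12.0]

n = len(eca)
mean_x = sum(eca) / n
mean_y = sum(sar) / n
sxx = sum((x - mean_x) ** 2 for x in eca)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(eca, sar))

slope = sxy / sxx                     # least-squares slope
intercept = mean_y - slope * mean_x   # least-squares intercept

ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(eca, sar))
ss_tot = sum((y - mean_y) ** 2 for y in sar)
r_squared = 1.0 - ss_res / ss_tot     # fraction of variance explained

print(f"SAR ≈ {intercept:.2f} + {slope:.3f} * ECa, R^2 = {r_squared:.3f}")
```

    The low R-squared values the authors report correspond to a large residual sum of squares relative to the total variance, i.e., ECa alone explaining little of the SAR or salinity variation at those depths.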

  9. Shape indexes for semi-automated detection of windbreaks in thematic tree cover maps from the central United States

    Science.gov (United States)

    Greg C. Liknes; Dacia M. Meneguzzo; Todd A. Kellerman

    2017-01-01

    Windbreaks are an important ecological resource across the large expanse of agricultural land in the central United States and are often planted in straight-line or L-shaped configurations to serve specific functions. As high-resolution (i.e., <5 m) land cover datasets become more available for these areas, semi- or fully-automated methods for distinguishing...

  10. Genome-Wide Mapping of Transcriptional Regulation and Metabolism Describes Information-Processing Units in Escherichia coli

    Directory of Open Access Journals (Sweden)

    Daniela Ledezma-Tejeida

    2017-08-01

    In the face of changes in their environment, bacteria adjust gene expression levels and produce appropriate responses. The individual layers of this process have been widely studied: the transcriptional regulatory network describes the regulatory interactions that produce changes in the metabolic network, both of which are coordinated by the signaling network, but the interplay between them has never been described in a systematic fashion. Here, we formalize the process of detection and processing of environmental information mediated by individual transcription factors (TFs), utilizing a concept termed genetic sensory response units (GENSOR units), which are composed of four components: (1) a signal, (2) signal transduction, (3) a genetic switch, and (4) a response. We used experimentally validated data sets from two databases to assemble a GENSOR unit for each of the 189 local TFs of Escherichia coli K-12 contained in the RegulonDB database. Further analysis suggested that feedback is a common occurrence in signal processing, and there is a gradient of functional complexity in the response mediated by each TF, as opposed to a one regulator/one pathway rule. Finally, we provide examples of other GENSOR unit applications, such as hypothesis generation, detailed description of cellular decision making, and elucidation of indirect regulatory mechanisms.
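    The four-component GENSOR unit reads naturally as a small data structure. A sketch with an illustrative E. coli arabinose example; RegulonDB's actual schema is far richer than these four strings:

```python
from dataclasses import dataclass

# Minimal data-structure sketch of the four-component GENSOR unit described
# above. Field contents are illustrative, not taken from RegulonDB.
@dataclass
class GensorUnit:
    signal: str          # (1) environmental signal detected
    transduction: str    # (2) how the signal reaches the transcription factor
    genetic_switch: str  # (3) the TF/promoter interaction that is toggled
    response: str        # (4) the metabolic or physiological outcome

ara = GensorUnit(
    signal="extracellular L-arabinose",
    transduction="arabinose binds the transcription factor AraC",
    genetic_switch="AraC activates the araBAD promoter",
    response="expression of arabinose catabolism enzymes",
)
print(ara.genetic_switch)
```

    Feedback, which the authors find to be common, would appear here as the response altering the signal or transduction fields' underlying quantities.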

  11. Use of multi-sensor active fire detections to map fires in the United States: the future of monitoring trends in burn severity

    Science.gov (United States)

    Picotte, Joshua J.; Coan, Michael; Howard, Stephen M.

    2014-01-01

    The effort to utilize satellite-based MODIS, AVHRR, and GOES fire detections from the Hazard Monitoring System (HMS) to identify undocumented fires in Florida and improve the Monitoring Trends in Burn Severity (MTBS) mapping process has yielded promising results. This method was augmented using regression tree models to identify burned/not-burned pixels (BnB) in every Landsat scene (1984–2012) in Worldwide Referencing System 2 Path/Rows 16/40, 17/39, and 18/39. The burned area delineations were combined with the HMS detections to create burned area polygons attributed with their date of fire detection. Within our study area, we processed 88,000 HMS points (2003–2012) and 1,800 Landsat scenes to identify approximately 300,000 burned area polygons. Six percent of these burned area polygons were larger than the 500-acre MTBS minimum size threshold. From this study, we conclude that the process can significantly improve understanding of fire occurrence and improve the efficiency and timeliness of assessing its impacts upon the landscape.

  12. Site investigation SFR. Rock type coding, overview geological mapping and identification of rock units and possible deformation zones in drill cores from the construction of SFR

    Energy Technology Data Exchange (ETDEWEB)

    Petersson, Jesper (Vattenfall Power Consultant AB, Stockholm (Sweden)); Curtis, Philip; Bockgaard, Niclas (Golder Associates AB (Sweden)); Mattsson, Haakan (GeoVista AB, Luleaa (Sweden))

    2011-01-15

    This report presents the rock type coding, overview lithological mapping and identification of rock units and possible deformation zones in drill cores from 32 boreholes associated with the construction of SFR. This work can be seen as complementary to single-hole interpretations of other older SFR boreholes earlier reported in /Petersson and Andersson 2010/: KFR04, KFR08, KFR09, KFR13, KFR35, KFR36, KFR54, KFR55, KFR7A, KFR7B and KFR7C. Due to deficiencies in the available material, the necessary activities have deviated somewhat from the established methodologies used during the recent Forsmark site investigations for the final repository for spent nuclear fuel. The aim of the current work has been, wherever possible, to allow the incorporation of all relevant material from older boreholes in the ongoing SFR geological modelling work in spite of the deficiencies. The activities include: - Rock type coding of the original geological mapping according to the nomenclature used during the preceding Forsmark site investigation. As part of the Forsmark site investigation such rock type coding has already been performed on most of the old SFR boreholes if the original geological mapping results were available. This earlier work has been complemented by rock type coding on two further boreholes: KFR01 and KFR02. - Lithological overview mapping, including documentation of (1) rock types, (2) ductile and brittle-ductile deformation and (3) alteration for drill cores from eleven of the boreholes for which no original geological borehole mapping was available (KFR31, KFR32, KFR34, KFR37, KFR38, KFR51, KFR69, KFR70, KFR71, KFR72 and KFR89). - Identification of possible deformation zones and merging of similar rock types into rock units. This follows SKB's established criteria and methodology of the geological Single-hole interpretation (SHI) process wherever possible. Deviations from the standard SHI process are associated with the lack of data, for example BIPS images

  13. Site investigation SFR. Rock type coding, overview geological mapping and identification of rock units and possible deformation zones in drill cores from the construction of SFR

    International Nuclear Information System (INIS)

    Petersson, Jesper; Curtis, Philip; Bockgaard, Niclas; Mattsson, Haakan

    2011-01-01

    This report presents the rock type coding, overview lithological mapping and identification of rock units and possible deformation zones in drill cores from 32 boreholes associated with the construction of SFR. This work can be seen as complementary to single-hole interpretations of other older SFR boreholes earlier reported in /Petersson and Andersson 2010/: KFR04, KFR08, KFR09, KFR13, KFR35, KFR36, KFR54, KFR55, KFR7A, KFR7B and KFR7C. Due to deficiencies in the available material, the necessary activities have deviated somewhat from the established methodologies used during the recent Forsmark site investigations for the final repository for spent nuclear fuel. The aim of the current work has been, wherever possible, to allow the incorporation of all relevant material from older boreholes in the ongoing SFR geological modelling work in spite of the deficiencies. The activities include: - Rock type coding of the original geological mapping according to the nomenclature used during the preceding Forsmark site investigation. As part of the Forsmark site investigation such rock type coding has already been performed on most of the old SFR boreholes if the original geological mapping results were available. This earlier work has been complemented by rock type coding on two further boreholes: KFR01 and KFR02. - Lithological overview mapping, including documentation of (1) rock types, (2) ductile and brittle-ductile deformation and (3) alteration for drill cores from eleven of the boreholes for which no original geological borehole mapping was available (KFR31, KFR32, KFR34, KFR37, KFR38, KFR51, KFR69, KFR70, KFR71, KFR72 and KFR89). - Identification of possible deformation zones and merging of similar rock types into rock units. This follows SKB's established criteria and methodology of the geological Single-hole interpretation (SHI) process wherever possible. 
Deviations from the standard SHI process are associated with the lack of data, for example BIPS images, or a

  14. Design for minimum energy in interstellar communication

    Science.gov (United States)

    Messerschmitt, David G.

    2015-02-01

    Microwave digital communication at interstellar distances is the foundation of communication of information-bearing signals with extraterrestrial civilizations (SETI and METI). Large distances demand large transmitted power and/or large antennas, while the propagation is transparent over a wide bandwidth. Recognizing a fundamental tradeoff, reducing the energy delivered to the receiver at the expense of wide bandwidth (the opposite of terrestrial objectives) is advantageous. Wide bandwidth also results in simpler design and implementation, allowing circumvention of dispersion and scattering arising in the interstellar medium and from motion effects, and obviating any related processing. The minimum energy delivered to the receiver per bit of information is determined by the cosmic microwave background alone. By mapping a single bit onto a carrier burst, the Morse code invented for the telegraph in 1836 comes closer to this minimum energy than approaches used in modern terrestrial radio. Whereas the terrestrial approach of adding phases and amplitudes increases information capacity while minimizing bandwidth, adding multiple time-frequency locations for carrier bursts increases capacity while minimizing energy per information bit. The resulting location code is simple and yet can approach the minimum energy as bandwidth is expanded. It is consistent with easy discovery, since carrier bursts are energetic, and straightforward modifications to post-detection pattern recognition can identify burst patterns. Time and frequency coherence constraints leading to simple signal discovery are addressed, and observations of the interstellar medium by transmitter and receiver constrain the burst parameters and limit the search scope.
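The receiver-side floor the abstract refers to is the ultra-wideband Shannon limit, where the minimum energy per bit approaches N0·ln 2, with the noise density N0 set by the cosmic microwave background (N0 = kB·T_CMB). A rough back-of-envelope check:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
T_CMB = 2.725        # cosmic microwave background temperature, K

# Ultra-wideband (infinite-bandwidth) limit: E_b >= N0 * ln(2), N0 = k_B * T
e_b_min = K_B * T_CMB * math.log(2)
print(f"{e_b_min:.2e} J per bit")  # → 2.61e-23 J per bit
```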

  15. Application of the Lean Office philosophy and mapping of the value stream in the process of designing the banking units of a financial company

    Directory of Open Access Journals (Sweden)

    Nelson Antônio Calsavara

    2016-09-01

    The purpose of this study is to conduct a critical analysis of the effects of Lean Office on the design process of the banking units of a financial company, and of how the implementation of this philosophy may contribute to productivity by reducing implementation time. A literature review of the Toyota Production System and its methods was conducted, extending to lean thinking and the application of Lean philosophies in services and in the office. A bibliographic and documentary survey of the Lean processes and procedures for opening bank branches was carried out. A Current State Map was developed, modeling current operating procedures. After identification and analysis of waste, proposals were presented for reducing deadlines and for eliminating and grouping stages, with consequent development of the Future State Map, implementation and monitoring of stages, and measurement of estimated time gains in the operation. The result was an estimated 45% reduction, in days, from start to end of the process, leading to the conclusion that implementation of the Lean Office philosophy contributed to the process.

  16. Exploiting differential vegetation phenology for satellite-based mapping of semiarid grass vegetation in the southwestern United States and northern Mexico

    Science.gov (United States)

    Dye, Dennis G.; Middleton, Barry R.; Vogel, John M.; Wu, Zhuoting; Velasco, Miguel G.

    2016-01-01

    We developed and evaluated a methodology for subpixel discrimination and large-area mapping of the perennial warm-season (C4) grass component of vegetation cover in mixed-composition landscapes of the southwestern United States and northern Mexico. We describe the methodology within a general, conceptual framework that we identify as the differential vegetation phenology (DVP) paradigm. We introduce a DVP index, the Normalized Difference Phenometric Index (NDPI), that provides vegetation type-specific information at the subpixel scale by exploiting differential patterns of vegetation phenology detectable in time-series spectral vegetation index (VI) data from multispectral land imagers. We used modified soil-adjusted vegetation index (MSAVI2) data from Landsat to develop the NDPI, and MSAVI2 data from MODIS to compare its performance relative to one alternate DVP metric (difference of spring average MSAVI2 and summer maximum MSAVI2) and two simple, conventional VI metrics (summer average MSAVI2, summer maximum MSAVI2). The NDPI in a scaled form (NDPIs) performed best in predicting variation in perennial C4 grass cover as estimated from landscape photographs at 92 sites (R² = 0.76). The method shows promise for mapping the perennial C4 grass component in landscapes of the Southwest, and potentially for monitoring its response to drought, climate change, grazing and other factors, including land management. With appropriate adjustments, the method could potentially be used for subpixel discrimination and mapping of grass or other vegetation types in other regions where the vegetation components of the landscape exhibit contrasting seasonal patterns of phenology.
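MSAVI2 has a standard closed form, while the exact NDPI definition is not given in this abstract; the sketch below uses MSAVI2 plus a plausible normalized-difference combination of spring and summer phenometrics purely for illustration (the published NDPI may differ):

```python
import math

def msavi2(nir, red):
    """Modified Soil-Adjusted Vegetation Index (MSAVI2), standard closed form."""
    return (2 * nir + 1 - math.sqrt((2 * nir + 1) ** 2 - 8 * (nir - red))) / 2

def ndpi(spring_avg, summer_max):
    """Hypothetical normalized-difference combination of two seasonal
    MSAVI2 phenometrics; illustrative only."""
    return (summer_max - spring_avg) / (summer_max + spring_avg)

# Warm-season (C4) grass signature: low spring greenness, strong summer peak.
spring = msavi2(nir=0.30, red=0.22)
summer = msavi2(nir=0.55, red=0.18)
print(round(ndpi(spring, summer), 2))  # → 0.61
```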

  17. ABOUT SYSTEM MAPPING OF BIOLOGICAL RESOURCES FOR SUBSTANTIATION OF ENVIRONMENTAL MANAGEMENT OF THE ADMINISTRATED UNIT ON THE EXAMPLE OF NOVOSIBIRSK REGION

    Directory of Open Access Journals (Sweden)

    O. N. Nikolaeva

    2017-01-01

    The article considers issues of systematization, modeling and presentation of regional biological resources data. The problem of providing regional state authorities with current biological resources data and an analysis tool is stated, and the necessity of complex analysis of heterogeneous biological resources data in connection with landscape factors is articulated. A system of biological resources' cartographic models (BRCM) is proposed as a tool for regional authorities to apply in practice. The goal and the target audience of the system are named, the principles of cartographic visualization of information in the BRCM are formulated, and the main sources of biological resources data are listed; these sources include state cadastres, monitoring and statistics. The scales for regional and topical biological resources' cartographic models are stated; they comprise two scale groups, one for depicting the region itself and one for its units of internal administrative division. The specifics of cartographic modeling and visualization of relief according to legal requirements for public cartographic data are described. Various options for presenting biological resources' cartographic models, such as digital maps, 3D models and cartographic animation, are described, and examples of maps and cartographic 3D models of Novosibirsk Region forests are shown. A conclusion about the practical challenges addressed with the BRCM is drawn.

  18. Feasibility and utility of mapping disease risk at the neighbourhood level within a Canadian public health unit: an ecological study

    Directory of Open Access Journals (Sweden)

    Wanigaratne Susitha

    2010-05-01

    Background: We conducted spatial analyses to determine the geographic variation of cancer at the neighbourhood level (dissemination areas, or DAs) within the area of a single Ontario public health unit, Wellington-Dufferin-Guelph, covering a population of 238,326 inhabitants. Cancer incidence data between 1999 and 2003 were obtained from the Ontario Cancer Registry and were geocoded down to the level of DA using the enhanced Postal Code Conversion File. The 2001 Census of Canada provided information on the size and age-sex structure of the population at the DA level, in addition to information about selected census covariates, such as average neighbourhood income. Results: Age-standardized incidence ratios for cancer and the prevalence of census covariates were calculated for each of 331 dissemination areas in Wellington-Dufferin-Guelph. The standardized incidence ratios (SIRs) for cancer varied dramatically across the dissemination areas. However, application of the Moran's I statistic, a popular index of spatial autocorrelation, suggested significant spatial patterns for only two cancers, lung and prostate, both in males. Conclusion: This paper demonstrates the feasibility and utility of a systematic approach to identifying neighbourhoods, within the area served by a public health unit, that have significantly higher risks of cancer. This exploratory, ecologic study suggests several hypotheses for these spatial patterns that warrant further investigation. To the best of our knowledge, this is the first Canadian study published in the peer-reviewed literature estimating the risk of relatively rare public health outcomes at a very small areal level, namely dissemination areas.
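Global Moran's I, the spatial autocorrelation index used here, has a simple closed form: I = (n / W) * Σ_ij w_ij z_i z_j / Σ_i z_i², where z are the mean-centred values and W is the sum of the spatial weights. A toy computation over four hypothetical neighbourhoods (the SIR values are made up):

```python
import numpy as np

def morans_i(x, w):
    """Global Moran's I for values x and spatial weight matrix w.

    w[i, j] > 0 when areas i and j are neighbours; diagonal is zero.
    Values near 0 suggest spatial randomness, positive values clustering.
    """
    x = np.asarray(x, dtype=float)
    z = x - x.mean()
    n = len(x)
    num = n * (w * np.outer(z, z)).sum()
    den = w.sum() * (z ** 2).sum()
    return num / den

# Four areas in a row (rook adjacency), with spatially clustered values.
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
sir = [1.8, 1.6, 0.5, 0.4]  # hypothetical standardized incidence ratios
print(round(morans_i(sir, w), 2))  # → 0.39
```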

  19. Mapping water availability, cost and projected consumptive use in the eastern United States with comparisons to the west

    Science.gov (United States)

    Tidwell, Vincent C.; Moreland, Barbie D.; Shaneyfelt, Calvin R.; Kobos, Peter

    2018-01-01

    The availability of freshwater supplies to meet future demand is a growing concern. Water availability metrics are needed to inform future water development decisions. With the help of water managers, water availability was mapped for over 1300 watersheds throughout the 31 contiguous states in the eastern US, complementing a prior study of the west. The compiled set of water availability data is unique in that it considers multiple sources of water (fresh surface and groundwater, wastewater and brackish groundwater); accommodates institutional controls placed on water use; is accompanied by cost estimates to access, treat and convey each unique source of water; and is compared to projected future growth in consumptive water use to 2030. Although few administrative limits have been set on water availability in the east, water managers have identified 315 fresh surface water and 398 fresh groundwater basins (with 151 overlapping basins) as areas of concern (AOCs) where water supply challenges exist due to drought related concerns, environmental flows, groundwater overdraft, or salt water intrusion. This highlights a difference in management: AOCs identified in the east simply require additional permitting, while in the west strict administrative limits are established. Although the east is generally considered 'water rich', roughly a quarter of the basins were identified as AOCs; this is still in strong contrast to the west, where 78% of the surface water basins are operating at or near their administrative limit. Little effort was noted on the part of eastern or western water managers to quantify non-fresh water resources.

  20. Geologic map of outcrop areas of sedimentary units in the eastern part of the Hailey 1 degree x 2 degrees quadrangle and part of the southern part of the Challis 1 degree x 2 degrees quadrangle, south-central Idaho

    Science.gov (United States)

    Link, P.K.; Mahoney, J.B.; Bruner, D.J.; Batatian, L.D.; Wilson, Eric; Williams, F.J.C.

    1995-01-01

    The paper version of the Geologic map of outcrop areas of sedimentary units in the eastern part of the Hailey 1x2 Quadrangle and part of the southern part of the Challis 1x2 Quadrangle, south-central Idaho was compiled by Paul Link and others in 1995. The plate was compiled on a 1:100,000-scale topographic base map. TechniGraphic System, Inc. of Fort Collins, Colorado digitized this map under contract for N. Shock. G. Green edited and prepared the digital version for publication as a GIS database. The digital geologic map database can be queried in many ways to produce a variety of geologic maps.

  1. The current status of mapping karst areas and availability of public sinkhole-risk resources in karst terrains of the United States

    Science.gov (United States)

    Kuniansky, Eve L.; Weary, David J.; Kaufmann, James E.

    2016-01-01

    Subsidence from sinkhole collapse is a common occurrence in areas underlain by water-soluble rocks such as carbonate and evaporite rocks, typical of karst terrain. Almost all 50 States within the United States (excluding Delaware and Rhode Island) have karst areas, with sinkhole damage highest in Florida, Texas, Alabama, Missouri, Kentucky, Tennessee, and Pennsylvania. A conservative estimate of losses to all types of ground subsidence was $125 million per year in 1997. This estimate may now be low, as review of cost reports from the last 15 years indicates that the cost of karst collapses in the United States averages more than $300 million per year. Knowing when a catastrophic event will occur is not possible; however, understanding where such occurrences are likely is possible. The US Geological Survey has developed and maintains national-scale maps of karst areas and areas prone to sinkhole formation. Several States provide additional resources for their citizens; Alabama, Colorado, Florida, Indiana, Iowa, Kentucky, Minnesota, Missouri, Ohio, and Pennsylvania maintain databases of sinkholes or karst features, with Florida, Kentucky, Missouri, and Ohio providing sinkhole reporting mechanisms for the public.

  2. The minimum attention plant inherent safety through LWR simplification

    International Nuclear Information System (INIS)

    Turk, R.S.; Matzie, R.A.

    1987-01-01

    The Minimum Attention Plant (MAP) is a unique small LWR that achieves greater inherent safety, improved operability, and reduced costs through design simplification. The MAP is a self-pressurized, indirect-cycle light water reactor with full natural circulation primary coolant flow and multiple once-through steam generators located within the reactor vessel. A fundamental tenet of the MAP design is its complete reliance on existing LWR technology. This reliance on conventional technology provides an extensive experience base which gives confidence in judging the safety and performance aspects of the design

  3. Mapping information exposure on social media to explain differences in HPV vaccine coverage in the United States.

    Science.gov (United States)

    Dunn, Adam G; Surian, Didi; Leask, Julie; Dey, Aditi; Mandl, Kenneth D; Coiera, Enrico

    2017-05-25

    Together with access, acceptance of vaccines affects human papillomavirus (HPV) vaccine coverage, yet little is known about the media's role. Our aim was to determine whether measures of information exposure derived from Twitter could be used to explain differences in coverage in the United States. We conducted an analysis of exposure to information about HPV vaccines on Twitter, derived from 273.8 million exposures to 258,418 tweets posted between 1 October 2013 and 30 October 2015. Tweets were classified by topic using machine learning methods. Proportional exposure to each topic was used to construct multivariable models for predicting state-level HPV vaccine coverage, and compared to multivariable models constructed using socioeconomic factors: poverty, education, and insurance. Outcome measures included correlations between coverage and the individual topics and socioeconomic factors, and differences in the predictive performance of the multivariable models. Topics corresponding to media controversies were most closely correlated with coverage (both positively and negatively); education and insurance were highest among socioeconomic indicators. Measures of information exposure explained 68% of the variance in one dose 2015 HPV vaccine coverage in females (males: 63%). In comparison, models based on socioeconomic factors explained 42% of the variance in females (males: 40%). Measures of information exposure derived from Twitter explained differences in coverage that were not explained by socioeconomic factors. Vaccine coverage was lower in states where safety concerns, misinformation, and conspiracies made up higher proportions of exposures, suggesting that negative representations of vaccines in the media may reflect or influence vaccine acceptance. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
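The comparison of variance explained by the two predictor sets can be illustrated with ordinary least squares on synthetic data (the topic proportions, socioeconomic features, and coverage values below are made up; the study's actual features and fitting procedure are not reproduced):

```python
import numpy as np

def r_squared(X, y):
    """Variance explained by an ordinary least-squares fit of y on X."""
    X1 = np.column_stack([np.ones(len(y)), X])  # add intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

rng = np.random.default_rng(0)
n_states = 50
topics = rng.random((n_states, 3))  # proportional exposure to 3 topics
socio = rng.random((n_states, 2))   # e.g. education and insurance rates
# Synthetic coverage driven mostly by the topic mix:
coverage = (40 + 30 * topics[:, 0] - 20 * topics[:, 1]
            + 5 * socio[:, 0] + rng.normal(0, 2, n_states))

print(r_squared(topics, coverage) > r_squared(socio, coverage))  # → True
```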

  4. Mapping the Relative Probability of Common Toad Occurrence in Terrestrial Lowland Farm Habitat in the United Kingdom.

    Directory of Open Access Journals (Sweden)

    Rosie D Salazar

    The common toad (Bufo bufo) is of increasing conservation concern in the United Kingdom (UK) due to dramatic population declines occurring in the past century. Many of these population declines coincided with reductions in both terrestrial and aquatic habitat availability and quality, and have been primarily attributed to the effect of agricultural land conversion (of natural and semi-natural habitats to arable and pasture fields) and pond drainage. However, there is little evidence available to link habitat availability with common toad population declines, especially when examined at a broad landscape scale. Assessing such patterns of population decline at the landscape scale requires, for instance, an understanding of how this species uses terrestrial habitat. We intensively studied the terrestrial resource selection of a large population of common toads in Oxfordshire, England, UK. Adult common toads were fitted with passive integrated transponder (PIT) tags to allow detection in the terrestrial environment using a portable PIT antenna once toads left the pond and before going into hibernation (April/May-October 2012 and 2013). We developed a population-level resource selection function (RSF) to assess the relative probability of toad occurrence in the terrestrial environment by collecting location data for 90 recaptured toads. The predicted relative probability of toad occurrence for this population was greatest in wooded habitat near to water bodies; relative probability of occurrence declined dramatically > 50 m from these habitats. Toads also tended to select habitat near to their breeding pond, and toad occurrence was negatively related to urban environments.
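Resource selection functions are commonly specified as an exponential of a linear predictor over habitat covariates, w(x) = exp(β·x). A minimal sketch, with coefficients that are hypothetical and merely match the signs the abstract reports (wooded cover positive, distances and urban cover negative):

```python
import math

def rsf(beta, covariates):
    """Exponential resource selection function w(x) = exp(beta . x).

    The coefficients supplied below are hypothetical, not fitted values
    from the study.
    """
    return math.exp(sum(b * x for b, x in zip(beta, covariates)))

# Covariates: [wooded cover, distance to water (m), distance to pond (m), urban cover]
beta = [1.2, -0.03, -0.01, -0.8]

near = rsf(beta, [1.0, 10.0, 50.0, 0.0])   # wooded, near water and pond
far = rsf(beta, [0.0, 200.0, 500.0, 1.0])  # open urban, far from both
print(near > far)  # → True
```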

  5. Mapping out Map Libraries

    Directory of Open Access Journals (Sweden)

    Ferjan Ormeling

    2008-09-01

    Discussing the requirements for map data quality, map users and their library/archives environment, the paper focuses on the metadata the user would need for a correct and efficient interpretation of the map data. For such a correct interpretation, knowledge of the rules and guidelines according to which the topographers/cartographers work (such as the kinds of data categories to be collected), and of the degree to which these rules and guidelines were indeed followed, is essential. This is not only valid for the old maps stored in our libraries and archives, but perhaps even more so for the new digital files, as the format in which we now have to access our geospatial data. As this would be too much to ask from map librarians/curators, some sort of web 2.0 environment is sought where comments about data quality, completeness and up-to-dateness from knowledgeable map users regarding the specific maps or map series studied can be collected and tagged to scanned versions of these maps on the web. In order not to be subject to the same disadvantages as Wikipedia, where the 'communis opinio', rather than scholarship, seems to be decisive, some checking by map curators of this tagged map-use information would still be needed. Cooperation between map curators and the International Cartographic Association (ICA) map and spatial data use commission to this end is suggested.

  6. The Risk Management of Minimum Return Guarantees

    Directory of Open Access Journals (Sweden)

    Antje Mahayni

    2008-05-01

    Contracts paying a guaranteed minimum rate of return and a fraction of a positive excess rate, which is specified relative to a benchmark portfolio, are closely related to unit-linked life-insurance products and can be considered as alternatives to direct investment in the underlying benchmark. They contain an embedded power option, and the key issue is the tractable and realistic hedging of this option, in order to rigorously justify valuation by arbitrage arguments and to prevent the guarantees from becoming uncontrollable liabilities to the issuer. We show how to determine the contract parameters conservatively and implement robust risk-management strategies.
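Under one simple reading of such a contract, the credited rate is the guaranteed minimum plus a fraction of any positive excess of the benchmark return over that minimum. The sketch below uses that reading with illustrative parameter values; the paper's precise contract specification (and its power-option representation) may differ:

```python
def guaranteed_payoff(benchmark_return, g=0.02, alpha=0.7):
    """Rate credited to the policyholder: guaranteed minimum rate g plus a
    fraction alpha of any positive excess of the benchmark return over g.
    Parameter values are illustrative, not taken from the paper.
    """
    return g + alpha * max(benchmark_return - g, 0.0)

print(round(guaranteed_payoff(0.10), 3))   # benchmark beats guarantee → 0.076
print(round(guaranteed_payoff(-0.05), 3))  # guarantee binds → 0.02
```

The embedded option is visible in the `max(..., 0.0)` term: the issuer has effectively written a call on the benchmark's excess return, which is why hedging it is the key risk-management issue.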

  7. Maps of estimated nitrate and arsenic concentrations in basin-fill aquifers of the southwestern United States

    Science.gov (United States)

    Beisner, Kimberly R.; Anning, David W.; Paul, Angela P.; McKinney, Tim S.; Huntington, Jena M.; Bexfield, Laura M.; Thiros, Susan A.

    2012-01-01

    Human-health concerns and economic considerations associated with meeting drinking-water standards motivated a study of the vulnerability of basin-fill aquifers to nitrate contamination and arsenic enrichment in the southwestern United States. Statistical models were developed by using the random forest classifier algorithm to predict concentrations of nitrate and arsenic across a model grid representing about 190,600 square miles of basin-fill aquifers in parts of Arizona, California, Colorado, Nevada, New Mexico, and Utah. The statistical models, referred to as classifiers, reflect natural and human-related factors that affect aquifer vulnerability to contamination and relate nitrate and arsenic concentrations to explanatory variables representing local- and basin-scale measures of source and aquifer susceptibility conditions. Geochemical variables were not used in concentration predictions because they were not available for the entire study area. The models were calibrated to assess model accuracy on the basis of measured values. Only 2 percent of the area underlain by basin-fill aquifers in the study area was predicted to equal or exceed the U.S. Environmental Protection Agency drinking-water standard for nitrate as N (10 milligrams per liter), whereas 43 percent of the area was predicted to equal or exceed the standard for arsenic (10 micrograms per liter). Areas predicted to equal or exceed the drinking-water standard for nitrate include basins in central Arizona near Phoenix; the San Joaquin Valley, the Santa Ana Inland, and San Jacinto Basins of California; and the San Luis Valley of Colorado. Much of the area predicted to equal or exceed the drinking-water standard for arsenic is within a belt of basins along the western portion of the Basin and Range Physiographic Province that includes almost all of Nevada and parts of California and Arizona. 
Predicted nitrate and arsenic concentrations are substantially lower than the drinking-water standards in much of
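The random forest classifier is a standard ensemble technique: bootstrap the training data, fit a tree to each sample, and predict by majority vote. As a toy stand-in (synthetic basin covariates and bagged one-split "stumps" instead of full trees), the bootstrap-and-vote idea can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_stump(X, y):
    """Exhaustively pick the single-feature threshold rule with best accuracy."""
    best = (-1.0, None)
    for j in range(X.shape[1]):
        for t in X[:, j]:
            for pol in (1, -1):  # polarity: predict True above or below t
                acc = ((pol * (X[:, j] - t) > 0) == y).mean()
                if acc > best[0]:
                    best = (acc, (j, t, pol))
    return best[1]

def forest(X, y, n_trees=25):
    """Bag decision stumps on bootstrap samples and vote: a toy stand-in
    for the random forest classifier used in the study."""
    rules = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(y), len(y))  # bootstrap resample
        rules.append(fit_stump(X[idx], y[idx]))
    def predict(Xq):
        votes = sum((pol * (Xq[:, j] - t) > 0) for j, t, pol in rules)
        return votes > n_trees / 2
    return predict

# Hypothetical basin covariates: [fertilizer use, depth to water, well density]
X = rng.random((200, 3))
y = 0.8 * X[:, 0] - 0.5 * X[:, 1] > 0.1  # "concentration exceeds standard"
predict = forest(X, y)
print((predict(X) == y).mean())  # training accuracy of the ensemble
```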

  8. Minimum Q Electrically Small Antennas

    DEFF Research Database (Denmark)

    Kim, O. S.

    2012-01-01

    Theoretically, the minimum radiation quality factor Q of an isolated resonance can be achieved in a spherical electrically small antenna by combining TM1m and TE1m spherical modes, provided that the stored energy in the antenna spherical volume is totally suppressed. Using closed-form expressions … for a multiarm spherical helix antenna confirm the theoretical predictions. For example, a 4-arm spherical helix antenna with a magnetic-coated perfectly electrically conducting core (ka=0.254) exhibits a Q of 0.66 times the Chu lower bound, or 1.25 times the minimum Q.
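The quoted numbers can be checked against the Chu lower bound for a single-mode electrically small antenna, Q_Chu = 1/(ka)³ + 1/(ka), taking the abstract's ka = 0.254 and the reported factors at face value:

```python
ka = 0.254                   # electrical size quoted in the abstract
q_chu = 1 / ka**3 + 1 / ka   # Chu lower bound for a single TM or TE mode
q_antenna = 0.66 * q_chu     # reported Q of the 4-arm spherical helix
q_min = q_antenna / 1.25     # implied dual-mode minimum Q

print(round(q_chu, 1), round(q_antenna, 1), round(q_min, 1))  # → 65.0 42.9 34.3
```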

  9. Defining biological assemblages (biotopes) of conservation interest in the submarine canyons of the South West Approaches (offshore United Kingdom) for use in marine habitat mapping

    Science.gov (United States)

    Davies, Jaime S.; Howell, Kerry L.; Stewart, Heather A.; Guinan, Janine; Golding, Neil

    2014-06-01

    In 2007, the upper part of a submarine canyon system located in water depths between 138 and 1165 m in the South West (SW) Approaches (North East Atlantic Ocean) was surveyed over a 2-week period. High-resolution multibeam echosounder data covering 1,106 km², and 44 ground-truthing video and image transects were acquired to characterise the biological assemblages of the canyons. The SW Approaches is an area of complex terrain, and intensive ground-truthing revealed the canyons to be dominated by soft sediment assemblages. A combination of multivariate analysis of seabed photographs (184-1059 m) and visual assessment of video ground-truthing identified 12 megabenthic assemblages (biotopes) at an appropriate scale to act as mapping units. Of these biotopes, 5 adhered to current definitions of habitats of conservation concern, 4 of which were classed as Vulnerable Marine Ecosystems. Some of the biotopes correspond to descriptions of communities from other megahabitat features (for example the continental shelf and seamounts), although it appears that the canyons host modified versions, possibly due to the inferred high rates of sedimentation in the canyons. Other biotopes described appear to be unique to canyon features, particularly the sea pen biotope consisting of Kophobelemnon stelliferum and cerianthids.

  10. Fermat and the Minimum Principle

    Indian Academy of Sciences (India)

    Arguably, least action and minimum principles were offered or applied much earlier. This (or these) principle(s) is/are among the fundamental, basic, unifying or organizing ones used to describe a variety of natural phenomena. It considers the amount of energy expended in performing a given action to be the least required ...

  11. Coupling between minimum scattering antennas

    DEFF Research Database (Denmark)

    Andersen, J.; Lessow, H; Schjær-Jacobsen, Hans

    1974-01-01

    Coupling between minimum scattering antennas (MSA's) is investigated by the coupling theory developed by Wasylkiwskyj and Kahn. Only rotationally symmetric power patterns are considered, and graphs of relative mutual impedance are presented as a function of distance and pattern parameters. Crossed...

  12. TOXMAP®: Environmental Health Maps

    Data.gov (United States)

    U.S. Department of Health & Human Services — TOXMAP® is a Geographic Information System (GIS) that uses maps of the United States and Canada to help users visually explore data primarily from the EPA's Toxics...

  13. Quantum mechanics the theoretical minimum

    CERN Document Server

    Susskind, Leonard

    2014-01-01

    From the bestselling author of The Theoretical Minimum, an accessible introduction to the math and science of quantum mechanics. Quantum Mechanics is a (second) book for anyone who wants to learn how to think like a physicist. In this follow-up to the bestselling The Theoretical Minimum, physicist Leonard Susskind and data engineer Art Friedman offer a first course in the theory and associated mathematics of the strange world of quantum mechanics. Quantum Mechanics presents Susskind and Friedman’s crystal-clear explanations of the principles of quantum states, uncertainty and time dependence, entanglement, and particle and wave states, among other topics. An accessible but rigorous introduction to a famously difficult topic, Quantum Mechanics provides a tool kit for amateur scientists to learn physics at their own pace.

  14. Minimum resolvable power contrast model

    Science.gov (United States)

    Qian, Shuai; Wang, Xia; Zhou, Jingjing

    2018-01-01

    Signal-to-noise ratio and MTF are important indices for evaluating the performance of optical systems. However, whether used alone or jointly, they cannot intuitively describe the overall performance of the system. Therefore, an index is proposed to reflect comprehensive system performance: the Minimum Resolvable Radiation Performance Contrast (MRP) model. MRP is an evaluation model that does not involve the human eye. It starts from the radiance of the target and the background, transforms the target and background into equivalent strips, and considers attenuation by the atmosphere, the optical imaging system, and the detector. Combining the signal-to-noise ratio and the MTF yields the Minimum Resolvable Radiation Performance Contrast. Finally, the detection probability model of MRP is given.

  15. Understanding the Minimum Wage: Issues and Answers.

    Science.gov (United States)

    Employment Policies Inst. Foundation, Washington, DC.

    This booklet, which is designed to clarify facts regarding the minimum wage's impact on marketplace economics, contains a total of 31 questions and answers pertaining to the following topics: relationship between minimum wages and poverty; impacts of changes in the minimum wage on welfare reform; and possible effects of changes in the minimum wage…

  16. 5 CFR 551.301 - Minimum wage.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Minimum wage. 551.301 Section 551.301... FAIR LABOR STANDARDS ACT Minimum Wage Provisions Basic Provision § 551.301 Minimum wage. (a)(1) Except... employees wages at rates not less than the minimum wage specified in section 6(a)(1) of the Act for all...

  17. Biomass Maps | Geospatial Data Science | NREL

    Science.gov (United States)

    These maps illustrate the biomass resource in the United States by county. Biomass feedstock data are analyzed both statistically and graphically using a geographic information system by the Geospatial Data Science Team. Available maps include solid biomass resources, such as the Map of Total Biomass Resources in the United States.

  18. TESTING TREE-CLASSIFIER VARIANTS AND ALTERNATE MODELING METHODOLOGIES IN THE EAST GREAT BASIN MAPPING UNIT OF THE SOUTHWEST REGIONAL GAP ANALYSIS PROJECT (SW REGAP)

    Science.gov (United States)

    We tested two methods for dataset generation and model construction, and three tree-classifier variants to identify the most parsimonious and thematically accurate mapping methodology for the SW ReGAP project. Competing methodologies were tested in the East Great Basin mapping un...

  19. Minimum DNBR Prediction Using Artificial Intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong Su; Kim, Ju Hyun; Na, Man Gyun [Chosun University, Gwangju (Korea, Republic of)

    2011-05-15

    The minimum DNBR (MDNBR) for prevention of the boiling crisis and fuel clad melting is a very important factor that should be consistently monitored from a safety perspective. Artificial intelligence methods have been extensively and successfully applied to nonlinear function approximation such as the problem in question of predicting DNBR values. In this paper, a support vector regression (SVR) model and a fuzzy neural network (FNN) model are developed to predict the MDNBR using a number of measured signals from the reactor coolant system. The two models are trained using a training data set and verified against a test data set, which does not include the training data. The proposed MDNBR estimation algorithms were verified using nuclear and thermal data acquired from many numerical simulations of the Yonggwang Nuclear Power Plant Unit 3 (YGN-3)
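The SVR approach described above can be illustrated with a toy kernel regressor. The sketch below uses RBF kernel ridge regression as a simplified stand-in for SVR (full SVR adds an ε-insensitive loss and support-vector sparsity); all data here are synthetic illustrations, not YGN-3 measurements or the paper's model:

```python
import math

# Toy RBF kernel ridge regressor standing in for the abstract's SVR
# model: learn a nonlinear map from measured signals to a target value.
# Training data in any real use would be plant signals and MDNBR values;
# nothing below is actual plant data.

def rbf(x, y, gamma=1.0):
    """Gaussian (RBF) kernel between two feature tuples."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def fit_krr(X, y, lam=1e-3, gamma=1.0):
    """Solve (K + lam*I) alpha = y by Gaussian elimination with pivoting."""
    n = len(X)
    A = [[rbf(X[i], X[j], gamma) + (lam if i == j else 0.0) for j in range(n)]
         + [y[i]] for i in range(n)]
    for col in range(n):                         # forward elimination
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    alpha = [0.0] * n
    for r in range(n - 1, -1, -1):               # back substitution
        alpha[r] = (A[r][n] - sum(A[r][c] * alpha[c]
                                  for c in range(r + 1, n))) / A[r][r]
    return alpha

def predict(X_train, alpha, x, gamma=1.0):
    """Kernel expansion prediction at a new point x."""
    return sum(a * rbf(xi, x, gamma) for a, xi in zip(alpha, X_train))
```

Kernel ridge shares SVR's kernel machinery, so it conveys how a smooth nonlinear mapping from coolant-system signals to MDNBR can be fit from simulation data and then evaluated on points outside the training set.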

  20. On palaeogeographic map

    Directory of Open Access Journals (Sweden)

    Zeng-Zhao Feng

    2016-01-01

    Full Text Available The palaeogeographic map is a graphic representation of physical geographical characteristics in geological history periods and human history periods. It is the most important result of palaeogeographic study. The author, as the Editor-in-Chief of the Journal of Palaeogeography (Chinese Edition and English Edition), wrote this paper to address problems in the articles submitted to and published in the Journal of Palaeogeography in recent years and in the relevant papers and books of others, drawing on his own practice of palaeogeographic study and mapping. The content mainly includes the data of palaeogeographic mapping; the problems of palaeogeographic mapping method; the “Single factor analysis and multifactor comprehensive mapping method —— Methodology of quantitative lithofacies palaeogeography”, i.e., the “4 steps mapping method”; the nomenclature and explanation of each palaeogeographic unit in a palaeogeographic map; the significance of palaeogeographic maps and palaeogeographic articles; the evaluative standards of palaeogeographic maps and palaeogeographic articles; and a self-evaluation. Criticisms and corrections are welcome.

  1. Minimum airflow reset of single-duct VAV terminal boxes

    Science.gov (United States)

    Cho, Young-Hum

    Single duct Variable Air Volume (VAV) systems are currently the most widely used type of HVAC system in the United States. When installing such a system, it is critical to determine the minimum airflow set point of the terminal box, as an optimally selected set point will improve the level of thermal comfort and indoor air quality (IAQ) while at the same time lower overall energy costs. In principle, this minimum rate should be calculated according to the minimum ventilation requirement based on ASHRAE standard 62.1 and maximum heating load of the zone. Several factors must be carefully considered when calculating this minimum rate. Terminal boxes with conventional control sequences may result in occupant discomfort and energy waste. If the minimum rate of airflow is set too high, the AHUs will consume excess fan power, and the terminal boxes may cause significant simultaneous room heating and cooling. At the same time, a rate that is too low will result in poor air circulation and indoor air quality in the air-conditioned space. Currently, many scholars are investigating how to change the algorithm of the advanced VAV terminal box controller without retrofitting. Some of these controllers have been found to effectively improve thermal comfort, indoor air quality, and energy efficiency. However, minimum airflow set points have not yet been identified, nor has controller performance been verified in confirmed studies. In this study, control algorithms were developed that automatically identify and reset terminal box minimum airflow set points, thereby improving indoor air quality and thermal comfort levels, and reducing the overall rate of energy consumption. A theoretical analysis of the optimal minimum airflow and discharge air temperature was performed to identify the potential energy benefits of resetting the terminal box minimum airflow set points. Applicable control algorithms for calculating the ideal values for the minimum airflow reset were developed and
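The minimum ventilation requirement referenced above combines a per-person and a per-floor-area component in ASHRAE 62.1. A minimal sketch, assuming the rates commonly cited for office occupancies (Rp = 5 cfm/person, Ra = 0.06 cfm/ft², Ez = 0.8); the standard's tables should be consulted for a real design:

```python
# ASHRAE Standard 62.1 breathing-zone outdoor airflow:
#   Vbz = Rp * Pz + Ra * Az
# The default Rp (cfm/person), Ra (cfm/ft^2), and Ez values below are
# commonly cited office-space figures, used here only as an assumption;
# confirm against the standard before applying to a real zone.

def breathing_zone_airflow(people, area_ft2, Rp=5.0, Ra=0.06):
    """Minimum breathing-zone outdoor airflow Vbz in cfm for one zone."""
    return Rp * people + Ra * area_ft2

def zone_outdoor_airflow(vbz, Ez=0.8):
    """Voz = Vbz / Ez, where Ez is the zone air distribution effectiveness."""
    return vbz / Ez
```

For a 10-person, 1000 ft² zone this gives Vbz = 110 cfm, the kind of floor value against which a terminal box minimum airflow set point would be checked before resetting it downward.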

  2. The minimum yield in channeling

    International Nuclear Information System (INIS)

    Uguzzoni, A.; Gaertner, K.; Lulli, G.; Andersen, J.U.

    2000-01-01

    A first estimate of the minimum yield was obtained from Lindhard's theory, with the assumption of a statistical equilibrium in the transverse phase-space of channeled particles guided by a continuum axial potential. However, computer simulations have shown that this estimate should be corrected by a fairly large factor, C (approximately equal to 2.5), called the Barrett factor. We have shown earlier that the concept of a statistical equilibrium can be applied to understand this result, with the introduction of a constraint in phase-space due to planar channeling of axially channeled particles. Here we present an extended test of these ideas on the basis of computer simulation of the trajectories of 2 MeV α particles in Si. In particular, the gradual trend towards a full statistical equilibrium is studied. We also discuss the introduction of this modification of standard channeling theory into descriptions of the multiple scattering of channeled particles (dechanneling) by a master equation and show that the calculated minimum yields are in very good agreement with the results of a full computer simulation

  3. Minimum Bias Trigger in ATLAS

    International Nuclear Information System (INIS)

    Kwee, Regina

    2010-01-01

    Since the restart of the LHC in November 2009, ATLAS has collected inelastic pp collisions to perform first measurements of charged particle densities. These measurements will help to constrain various models that phenomenologically describe soft parton interactions. Understanding the trigger efficiencies for different event types is therefore crucial to minimize any possible bias in the event selection. ATLAS uses two main minimum bias triggers, featuring complementary detector components and trigger levels. While a hardware-based first trigger level situated in the forward regions with 2.2 < |η| < 3.8 has been proven to select pp-collisions very efficiently, the Inner Detector based minimum bias trigger uses a random seed on filled bunches and central tracking detectors for the event selection. Both triggers were essential for the analysis of kinematic spectra of charged particles. Their performance and trigger efficiency measurements, as well as studies of possible bias sources, will be presented. We also highlight the advantage of these triggers for particle correlation analyses. (author)

  4. Do Higher Minimum Wages Benefit Health? Evidence From the UK.

    Science.gov (United States)

    Lenhart, Otto

    This study examines the link between minimum wages and health outcomes by using the introduction of the National Minimum Wage (NMW) in the United Kingdom in 1999 as an exogenous variation of earned income. A test for health effects by using longitudinal data from the British Household Panel Survey for a period of ten years was conducted. It was found that the NMW significantly improved several measures of health, including self-reported health status and the presence of health conditions. When examining potential mechanisms, it was shown that changes in health behaviors, leisure expenditures, and financial stress can explain the observed improvements in health.

  5. Integrating laser-range finding, electronic compass measurements and GPS to rapidly map vertical changes in volcanic stratigraphy and constrain unit thicknesses and volumes: two examples from the northern Cordilleran volcanic province

    Science.gov (United States)

    Nogier, M.; Edwards, B. R.; Wetherell, K.

    2005-12-01

    We present preliminary results of laser-range finding-GPS surveys from two separate locations in northern British Columbia, in the south-central northern Cordilleran volcanic province: Hoodoo Mountain volcano and Craven Lake cone. This technique, described in detail below, is appropriate for rapidly measuring changes in vertical thicknesses of units that would either be difficult or impossible to measure by most other techniques. The ability to accurately measure thicknesses of geologic units in otherwise difficult-to-access locations will aid in generating better quantitative estimates of deposit geometries and eruption volumes. Such data are particularly important for constraining quantitative models of magma production and eruption dynamics. The deposits of interest in this study comprised at least partly inaccessible, largely pyroclastic units, although the technique could be used to map any vertical surfaces. The first field location was the northern side of Hoodoo Mountain volcano (56deg47'23.72'N/131deg17'36.97'W/1208m-asl), where a sequence of welded to unwelded, trachytic-phonolitic tephra was deposited in a paleovalley. This deposit is informally referred to as the Pointer Ridge deposit, and it comprises at least 7 distinct subunits. The horizontal limit of the exposures is approximately 1.5km, and the vertical limit is approximately 250m. Three different GPS base stations were used to map the lateral and vertical variations in the deposit. The second field location is north of Craven Lake (56deg54'44.55'N/129deg21'42.17'W/1453m-asl), along Craven Creek, where a sequence of basaltic tephra is overlain by pillow lava and glacial diamicton. This exposure is 200m long and approximately 30m high, much smaller than the area mapped at Hoodoo Mountain. The basaltic tephra appears to comprise 4 distinct sequences (measured thicknesses vary from 3-4m) not including the overlying pillow lava (measured thickness varies from 2 to 10m), and measurements of the

  6. Minimum Delay Moving Object Detection

    KAUST Repository

    Lao, Dong

    2017-11-09

    We present a general framework and method for detection of an object in a video based on apparent motion. The object moves relative to background motion at some unknown time in the video, and the goal is to detect and segment the object as soon as it moves in an online manner. Due to unreliability of motion between frames, more than two frames are needed to reliably detect the object. Our method is designed to detect the object(s) with minimum delay, i.e., frames after the object moves, constraining the false alarms. Experiments on a new extensive dataset for moving object detection show that our method achieves less delay for all false alarm constraints than existing state-of-the-art methods.
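The trade-off the abstract describes, detecting as few frames as possible after the object moves while constraining false alarms, is the classic quickest-detection problem. A minimal one-dimensional CUSUM sketch of that principle (an illustration only, not the authors' video segmentation method):

```python
# Classic one-sided CUSUM statistic: declare a change as soon as the
# accumulated excess of the signal over `drift` crosses `threshold`.
# Illustrates the minimum-delay vs. false-alarm trade-off; it is NOT
# the paper's motion-based segmentation method.

def cusum_detect(samples, drift=0.5, threshold=4.0):
    """Return the index at which a positive mean shift is declared,
    or None if the statistic never crosses the threshold."""
    s = 0.0
    for i, x in enumerate(samples):
        s = max(0.0, s + x - drift)  # reset to 0 while evidence is weak
        if s > threshold:
            return i                 # detection delay = i - true change time
    return None
```

Raising `threshold` lowers the false-alarm rate at the cost of a longer detection delay, which is exactly the constraint curve the minimum-delay formulation optimizes over.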

  7. Approximating the minimum cycle mean

    Directory of Open Access Journals (Sweden)

    Krishnendu Chatterjee

    2013-07-01

    Full Text Available We consider directed graphs where each edge is labeled with an integer weight and study the fundamental algorithmic question of computing the value of a cycle with minimum mean weight. Our contributions are twofold: (1) First, we show that the algorithmic question is reducible in O(n^2) time to the problem of a logarithmic number of min-plus matrix multiplications of n-by-n matrices, where n is the number of vertices of the graph. (2) Second, when the weights are nonnegative, we present the first (1 + ε)-approximation algorithm for the problem; the running time of our algorithm is Õ(n^ω log^3(nW/ε) / ε), where O(n^ω) is the time required for the classic n-by-n matrix multiplication and W is the maximum value of the weights.
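For comparison with the approximation algorithm above, the exact minimum cycle mean can be computed with Karp's classic O(nm) dynamic program; a compact sketch (this is the textbook baseline, not the paper's min-plus reduction):

```python
# Karp's O(n*m) dynamic program for the exact minimum cycle mean of a
# directed graph. d[k][v] is the minimum weight of a walk with exactly
# k edges ending at v, started from a virtual source connected to all
# vertices with weight 0.

def min_cycle_mean(n, edges):
    """n: number of vertices (0..n-1); edges: list of (u, v, w) triples.
    Returns the minimum mean cycle weight, or None if the graph is acyclic."""
    INF = float("inf")
    d = [[INF] * n for _ in range(n + 1)]
    for v in range(n):
        d[0][v] = 0.0
    for k in range(1, n + 1):
        for u, v, w in edges:
            if d[k - 1][u] + w < d[k][v]:
                d[k][v] = d[k - 1][u] + w
    best = None
    for v in range(n):
        if d[n][v] == INF:
            continue
        # Karp's theorem: mu* = min_v max_k (d[n][v] - d[k][v]) / (n - k)
        worst = max((d[n][v] - d[k][v]) / (n - k)
                    for k in range(n) if d[k][v] < INF)
        if best is None or worst < best:
            best = worst
    return best
```

On a graph with a two-edge cycle of weights 1 and 3 and a self-loop of weight 5, the routine returns 2.0, the mean of the cheaper cycle.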

  8. Minimum Delay Moving Object Detection

    KAUST Repository

    Lao, Dong

    2017-01-08

    We present a general framework and method for detection of an object in a video based on apparent motion. The object moves relative to background motion at some unknown time in the video, and the goal is to detect and segment the object as soon as it moves in an online manner. Due to unreliability of motion between frames, more than two frames are needed to reliably detect the object. Our method is designed to detect the object(s) with minimum delay, i.e., frames after the object moves, constraining the false alarms. Experiments on a new extensive dataset for moving object detection show that our method achieves less delay for all false alarm constraints than existing state-of-the-art methods.

  9. Minimum Delay Moving Object Detection

    KAUST Repository

    Lao, Dong; Sundaramoorthi, Ganesh

    2017-01-01

    We present a general framework and method for detection of an object in a video based on apparent motion. The object moves relative to background motion at some unknown time in the video, and the goal is to detect and segment the object as soon as it moves in an online manner. Due to unreliability of motion between frames, more than two frames are needed to reliably detect the object. Our method is designed to detect the object(s) with minimum delay, i.e., frames after the object moves, constraining the false alarms. Experiments on a new extensive dataset for moving object detection show that our method achieves less delay for all false alarm constraints than existing state-of-the-art methods.

  10. Youth minimum wages and youth employment

    NARCIS (Netherlands)

    Marimpi, Maria; Koning, Pierre

    2018-01-01

    This paper performs a cross-country level analysis on the impact of the level of specific youth minimum wages on the labor market performance of young individuals. We use information on the use and level of youth minimum wages, as compared to the level of adult minimum wages as well as to the median

  11. Do Some Workers Have Minimum Wage Careers?

    Science.gov (United States)

    Carrington, William J.; Fallick, Bruce C.

    2001-01-01

    Most workers who begin their careers in minimum-wage jobs eventually gain more experience and move on to higher paying jobs. However, more than 8% of workers spend at least half of their first 10 working years in minimum wage jobs. Those more likely to have minimum wage careers are less educated, minorities, women with young children, and those…

  12. Does the Minimum Wage Affect Welfare Caseloads?

    Science.gov (United States)

    Page, Marianne E.; Spetz, Joanne; Millar, Jane

    2005-01-01

    Although minimum wages are advocated as a policy that will help the poor, few studies have examined their effect on poor families. This paper uses variation in minimum wages across states and over time to estimate the impact of minimum wage legislation on welfare caseloads. We find that the elasticity of the welfare caseload with respect to the…

  13. Minimum income protection in the Netherlands

    NARCIS (Netherlands)

    van Peijpe, T.

    2009-01-01

    This article offers an overview of the Dutch legal system of minimum income protection through collective bargaining, social security, and statutory minimum wages. In addition to collective agreements, the Dutch statutory minimum wage offers income protection to a small number of workers. Its

  14. 10 CFR 1015.505 - Minimum amount of referrals to the Department of Justice.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Minimum amount of referrals to the Department of Justice... THE UNITED STATES Referrals to the Department of Justice § 1015.505 Minimum amount of referrals to the Department of Justice. (a) DOE shall not refer for litigation claims of less than $2,500, exclusive of...

  15. Minimum wage development in the Russian Federation

    OpenAIRE

    Bolsheva, Anna

    2012-01-01

    The aim of this paper is to analyze the effectiveness of the minimum wage policy at the national level in Russia and its impact on living standards in the country. The analysis showed that the national minimum wage in Russia does not serve its original purpose of protecting the lowest wage earners and has no substantial effect on poverty reduction. The national subsistence minimum is too low and cannot be considered an adequate criterion for the setting of the minimum wage. The minimum wage d...

  16. Minimum Delay Moving Object Detection

    KAUST Repository

    Lao, Dong

    2017-05-14

    This thesis presents a general framework and method for detection of an object in a video based on apparent motion. The object moves, at some unknown time, differently than the “background” motion, which can be induced by camera motion. The goal of the proposed method is to detect and segment the object as soon as it moves in an online manner. Since motion estimation can be unreliable between frames, more than two frames are needed to reliably detect the object. Observing more frames before declaring a detection may lead to a more accurate detection and segmentation, since more motion may be observed, leading to a stronger motion cue. However, this leads to greater delay. The proposed method is designed to detect the object(s) with minimum delay, i.e., frames after the object moves, constraining the false alarms, defined as declarations of detection before the object moves or incorrect or inaccurate segmentation at the detection time. Experiments on a new extensive dataset for moving object detection show that our method achieves less delay for all false alarm constraints than existing state-of-the-art methods.

  17. A Preliminary Study on the Use of Mind Mapping as a Visual-Learning Strategy in General Education Science Classes for Arabic Speakers in the United Arab Emirates

    Science.gov (United States)

    Wilson, Kenesha; Copeland-Solas, Eddia; Guthrie-Dixon, Natalie

    2016-01-01

    Mind mapping was introduced as a culturally relevant pedagogy aimed at enhancing the teaching and learning experience in a general education, Environmental Science class for mostly Emirati English Language Learners (ELL). Anecdotal evidence suggests that the students are very artistic and visual and enjoy group-based activities. It was decided to…

  18. Multi-hazard Non-regulatory Risk Maps for Resilient Coastal Communities of Washington State in Pacific Northwest Region of the United States

    Science.gov (United States)

    Cakir, R.; Walsh, T. J.; Zou, Y.; Gufler, T.; Norman, D. K.

    2015-12-01

    Washington Department of Natural Resources - Division of Geology and Earth Resources (WADNR-DGER) partnered with FEMA through the FEMA Cooperating Technical Partners (CTP) program to assess annualized losses from flood and other hazards and to prepare supporting risk-related data for FEMA's coastal RiskMAP projects. We used HAZUS-MH analysis to assess losses from earthquake, flood, and other potential hazards such as landslide and tsunami in the project areas, on shorelines of the Pacific Ocean and Puget Sound in Grays Harbor, Pacific, Skagit, Whatcom, Island, Mason, Clallam, Jefferson and San Juan counties, Washington. FEMA's Hazus-MH tool was applied to estimate losses and damages for each building due to floods and earthquakes. User-defined facilities (UDF) inventory data were prepared and used for individual building damage estimations and for updating general building stocks. Flood depth grids were used to determine which properties are most impacted by flooding. For example, the HAZUS-MH flood model run based on the 1% annual chance event (or 100-year flood) for Grays Harbor County resulted in a total of $161 million in losses to buildings, including residential and commercial properties and other building and occupancy types. A likely M9 megathrust Cascadia earthquake scenario USGS ShakeMap was used for the HAZUS-MH earthquake model; the run based on the Cascadia M9 earthquake for Grays Harbor County resulted in a total of $1.15 billion in losses to the building inventory. We produced GIS-based overlay maps of properties exposed to tsunami, landslide, and liquefaction hazards within the communities. This multi-hazard approach is an essential component of producing non-regulatory maps for FEMA's RiskMAP project, and the maps help further improve local and regional mitigation efforts, emergency response plans, and the overall resiliency plans of the coastal communities in western Washington.

  19. The Effect of an Increased Minimum Wage on Infant Mortality and Birth Weight.

    Science.gov (United States)

    Komro, Kelli A; Livingston, Melvin D; Markowitz, Sara; Wagenaar, Alexander C

    2016-08-01

    To investigate the effects of state minimum wage laws on low birth weight and infant mortality in the United States, we estimated the effects of state-level minimum wage laws on rates of low birth weight and postneonatal mortality using a difference-in-differences approach. A minimum wage of 1 dollar above the federal level was associated with a 1% to 2% decrease in low birth weight births and a 4% decrease in postneonatal mortality. If all states in 2014 had increased their minimum wages by 1 dollar, there would likely have been 2790 fewer low birth weight births and 518 fewer postneonatal deaths for the year.
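The difference-in-differences design named above compares outcome changes in states that raised their minimum wage against changes in states that did not. A minimal two-group, two-period sketch with synthetic numbers (the study itself uses state-year panel data with additional controls):

```python
# Two-group, two-period difference-in-differences estimator: the effect
# is the change in the treated group's mean outcome minus the change in
# the control group's mean outcome, netting out the common time trend.
# All numbers used with this function here are synthetic illustrations.

def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Each argument is a list of outcomes (e.g. state-level rates)."""
    def mean(xs):
        return sum(xs) / len(xs)
    return (mean(treat_post) - mean(treat_pre)) - (mean(ctrl_post) - mean(ctrl_pre))
```

If the treated states' rate falls by 1 point while the control states' rate is flat, the estimator attributes the full 1-point drop to the policy, under the parallel-trends assumption.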

  20. Report on the lands of the arid region of the United States with a more detailed account of the land of Utah with maps

    Science.gov (United States)

    Powell, John Wesley

    1879-01-01

    A report from Maj. J. W. Powell, geologist in charge of the United States Geographical and Geological Survey of the Rocky Mountain Region, upon the lands of the Arid Region of the United States, setting forth the extent of said region, and making suggestions as to the conditions under which the lands embraced within its limit may be rendered available for agricultural and grazing purposes. With the report is transmitted a statement of the rainfall of the western portion of the United States, with reports upon the subject of irrigation by Capt. C. E. Dutton, U. S. A., Prof. A. H. Thompson, and Mr. G. K. Gilbert.

  1. Minimum Additive Waste Stabilization (MAWS)

    International Nuclear Information System (INIS)

    1994-02-01

    In the Minimum Additive Waste Stabilization (MAWS) concept, actual waste streams are utilized as additive resources for vitrification, since they may contain the basic components (glass formers and fluxes) for making a suitable glass or glassy slag. If too much glass former is present, the melt viscosity or temperature will be too high for processing, while if there is too much flux, the durability may suffer. There are therefore optimum combinations of these two important classes of constituents, depending on the criteria required. The challenge is to combine these resources in a way that minimizes the use of non-waste additives yet yields a processable and durable final waste form for disposal. The benefit of this approach is that the volume of the final waste form is minimized (waste loading maximized), since little or no additives are used and vitrification itself results in volume reduction through evaporation of water, combustion of organics, and compaction of the solids into a non-porous glass. This implies a significant reduction in disposal costs through volume reduction alone, and minimizes future risks and costs owing to the long-term durability and leach resistance of glass. This is accomplished by using integrated systems that are both cost-effective and produce an environmentally sound waste form for disposal. Individual component technologies may include: vitrification; thermal destruction; soil washing; gas scrubbing/filtration; and ion-exchange wastewater treatment. The particular combination of technologies will depend on the waste streams to be treated. At the heart of MAWS is vitrification technology, which incorporates all primary and secondary waste streams into a final, long-term, stabilized glass wasteform. The integrated technology approach, and the view of waste streams as resources, is innovative yet practical for cost-effectively treating a broad range of DOE mixed and low-level wastes
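The glass-former/flux balance described above can be illustrated as a one-dimensional blending problem: find the largest waste loading for which the blended glass-former fraction stays inside a processable window. The stream compositions and window bounds below are invented for illustration, not MAWS process data:

```python
# Illustrative blending scan for the glass-former/flux trade-off in the
# MAWS concept: too much former raises melt viscosity/temperature, too
# much flux hurts durability, so the former fraction must stay inside a
# window. Compositions and bounds are made-up numbers, not process data.

def max_waste_loading(waste_former, additive_former, lo, hi, steps=1000):
    """Scan the waste mass fraction w from 1 down to 0; the blend's
    glass-former fraction is w*waste_former + (1 - w)*additive_former.
    Return the largest feasible w, or None if no blend fits the window."""
    for k in range(steps, -1, -1):
        w = k / steps
        former = w * waste_former + (1.0 - w) * additive_former
        if lo - 1e-9 <= former <= hi + 1e-9:
            return w
    return None
```

In this made-up example a former-rich waste (0.9 former fraction) blended with a flux-rich additive stream (0.4) against a 0.55-0.75 window caps the waste loading at w = 0.7, mirroring the report's point that there is an optimum combination rather than a simple maximum of waste content.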

  2. Application of ecological mapping

    International Nuclear Information System (INIS)

    Sherk, J.A.

    1982-01-01

    The US Fish and Wildlife Service has initiated the production of a comprehensive ecological inventory map series for use as a major new planning tool. Important species data along with special land use designations are displayed on 1:250,000 scale topographic base maps. Sets of maps have been published for the Atlantic and Pacific coastal areas of the United States. Preparation of a map set for the Gulf of Mexico is underway at the present time. Potential application of ecological inventory map series information to a typical land disposal facility could occur during the narrowing of the number of possible disposal sites, the design of potential disposal site studies of ecological resources, the preparation of the environmental report, and the regulatory review of license applications. 3 figures, 3 tables

  3. Robustification and Optimization in Repetitive Control For Minimum Phase and Non-Minimum Phase Systems

    Science.gov (United States)

    Prasitmeeboon, Pitcha

    repetitive control FIR compensator. The aim is to reduce the final error level by using real-time frequency response model updates to successively increase the cutoff frequency, each time creating the improved model needed to produce convergence to zero error up to the higher cutoff. Non-minimum phase systems present a difficult design challenge to the sister field of Iterative Learning Control. The third topic investigates to what extent the same challenges appear in RC. One challenge is that the intrinsic non-minimum phase zero mapped from continuous time is close to the pole of the repetitive controller at +1, creating behavior similar to pole-zero cancellation. The near pole-zero cancellation causes slow learning at DC and low frequencies. The Min-Max cost function over the learning rate is presented. The Min-Max can be reformulated as a Quadratically Constrained Linear Programming problem. This approach is shown to be an RC design approach that addresses the main challenge of non-minimum phase systems: obtaining a reasonable learning rate at DC. Although it was illustrated that using the Min-Max objective improves learning at DC and low frequencies compared to other designs, the method requires model accuracy at high frequencies. In the real world, models usually have error at high frequencies. The fourth topic addresses how one can merge the quadratic penalty into the Min-Max cost function to increase robustness at high frequencies. The topic also considers limiting the Min-Max optimization to some frequency interval and applying an FIR zero-phase low-pass filter to cut off the learning for frequencies above that interval.

  4. Mapping of wine industry

    Directory of Open Access Journals (Sweden)

    Віліна Пересадько

    2016-10-01

    Full Text Available Having reviewed a variety of approaches to understanding the essence of the wine industry, studied modern ideas about its future, and analyzed more than 50 maps from the Internet, we have identified the following trends and special features of wine industry mapping in the world: the vast majority of maps display the development of the industry at the regional or national level, whereas there are practically no world maps; wine-growing regions are represented on maps very unevenly; all existing maps of the industry could be classified as analytical ascertaining inventory maps; the dominant methods of cartographic representation are the area method and the qualitative background method, while the sign method and collation maps are rarely used; basically all the Internet maps have low quality, as they are scanned images with poor resolution; and a special feature of recently published maps is the lack of a geographical basis (except for state borders and coastline). We created the wine production and consumption world map «Wine Industry» at a scale of 1:60 000 000 with a simple geographical basis (state names, state borders, major rivers, coastline). It was concluded that, from the methodological point of view, it is incorrect not to show a geographical basis on maps of the wine industry. Analysis of this map allowed us to identify areas of traditional wine-making, potential wine-making areas, and countries which claim to be world leaders in wine production. We also found an imbalance between wine production and wine consumption: production is increasing in South America, China and the United States, while consumption is increasing (mainly due to imported products) in countries where the grape is not a primary agricultural product.

  5. The migratory impact of minimum wage legislation: Puerto Rico, 1970-1987.

    Science.gov (United States)

    Santiago, C E

    1993-01-01

    "This study examines the impact of minimum wage setting on labor migration. A multiple time series framework is applied to monthly data for Puerto Rico from 1970-1987. The results show that net emigration from Puerto Rico to the United States fell in response to significant changes in the manner in which minimum wage policy was conducted, particularly after 1974. The extent of commuter type labor migration between Puerto Rico and the United States is influenced by minimum wage policy, with potentially important consequences for human capital investment and long-term standards of living." excerpt

  6. Estimating Climate Trends: Application to United States Plant Hardiness Zones

    Directory of Open Access Journals (Sweden)

    Nir Y. Krakauer

    2012-01-01

    The United States Department of Agriculture classifies plant hardiness zones based on mean annual minimum temperatures over some past period (currently 1976–2005). Since temperatures are changing, these values may benefit from updating. I outline a multistep methodology involving imputation of missing station values, geostatistical interpolation, and time series smoothing to update a climate variable’s expected value compared to a climatology period and apply it to estimating annual minimum temperature change over the coterminous United States. I show using hindcast experiments that trend estimation gives more accurate predictions of minimum temperatures 1-2 years in advance compared to the previous 30 years’ mean alone. I find that annual minimum temperature increased roughly 2.5 times faster than mean temperature (~2.0 K versus ~0.8 K since 1970), and is already an average of 1.2 ± 0.5 K (regionally up to ~2 K) above the 1976–2005 mean, so that much of the country belongs to warmer hardiness zones compared to the current map. The methods developed may also be applied to estimate changes in other climate variables and geographic regions.
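
The hindcast comparison described above can be sketched as follows: on synthetic data with a steady warming trend, one-step-ahead prediction by extrapolating a least-squares linear trend beats the trailing 30-year mean, whose estimate lags the trend by roughly half a window. All numbers below are illustrative assumptions, not values from the study.

```python
import random

def linear_trend_predict(values):
    # Ordinary least-squares line through (0, v0)...(n-1, v_{n-1}),
    # extrapolated one step ahead to x = n.
    n = len(values)
    xm = (n - 1) / 2
    ym = sum(values) / n
    sxx = sum((i - xm) ** 2 for i in range(n))
    slope = sum((i - xm) * (v - ym) for i, v in enumerate(values)) / sxx
    return ym + slope * (n - xm)

random.seed(0)
# Synthetic annual minimum temperatures: +0.05 K/yr trend plus noise.
series = [0.05 * yr + random.gauss(0, 0.3) for yr in range(60)]

err_mean = err_trend = 0.0
for t in range(30, 60):          # hindcast the last 30 years
    window = series[t - 30:t]
    err_mean += abs(sum(window) / 30 - series[t])
    err_trend += abs(linear_trend_predict(window) - series[t])
print(err_trend < err_mean)
```

Because the 30-year mean estimates the value at the window's midpoint, it trails a steady trend by about 15 years' worth of warming, which is why trend extrapolation wins here.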

  7. Minimum Energy Requirements in Complex Distillation Arrangements

    Energy Technology Data Exchange (ETDEWEB)

    Halvorsen, Ivar J.

    2001-07-01

    Distillation is the most widely used industrial separation technology and distillation units are responsible for a significant part of the total heat consumption in the world's process industry. In this work we focus on directly (fully thermally) coupled column arrangements for separation of multicomponent mixtures. These systems are also denoted Petlyuk arrangements, where a particular implementation is the dividing wall column. Energy savings in the range of 20-40% have been reported with ternary feed mixtures. In addition to energy savings, such integrated units also have the potential for reduced capital cost, making them especially attractive. However, industrial use has been limited, and difficulties in design and control have been reported as the main reasons. Minimum energy results have only been available for ternary feed mixtures and sharp product splits. This motivates further research in this area, and this thesis will hopefully contribute to a better understanding of complex column systems. In the first part we derive the general analytic solution for minimum energy consumption in directly coupled columns for a multicomponent feed and any number of products. To our knowledge, this is a new contribution in the field. The basic assumptions are constant relative volatility, constant pressure and constant molar flows, and the derivation is based on Underwood's classical methods. An important conclusion is that the minimum energy consumption in a complex directly integrated multi-product arrangement is the same as for the most difficult split between any pair of the specified products when we consider the performance of a conventional two-product column. We also present the Vmin-diagram, which is a simple graphical tool for visualisation of minimum energy related to feed distribution. The Vmin-diagram provides a simple means to assess the detailed flow requirements for all parts of a complex directly coupled arrangement. The main purpose in
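
Underwood's classical method, on which the derivation above builds, can be illustrated with a toy calculation: for a ternary feed with constant relative volatilities, the active Underwood root fixes the minimum vapor flow for a sharp A/BC split. The volatilities and feed composition below are invented for illustration; this is a sketch of the textbook single-column method, not of the thesis' general multi-product solution.

```python
def underwood_feed_eq(theta, alpha, z, q):
    # Underwood feed equation: sum_i alpha_i * z_i / (alpha_i - theta) = 1 - q.
    return sum(a * x / (a - theta) for a, x in zip(alpha, z)) - (1 - q)

def bisect(f, lo, hi, tol=1e-12):
    # Simple bisection; f must change sign on [lo, hi].
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Invented ternary example: components A, B, C with relative volatilities
# alpha, feed mole fractions z, and a saturated-liquid feed (q = 1).
alpha = [4.0, 2.0, 1.0]
z = [0.4, 0.3, 0.3]
q = 1.0

# For a sharp A/BC split only the root between alpha_B and alpha_A is active.
theta = bisect(lambda th: underwood_feed_eq(th, alpha, z, q),
               alpha[1] + 1e-9, alpha[0] - 1e-9)

# Minimum vapor flow per unit feed: only A appears in the distillate.
Vmin_over_F = alpha[0] * z[0] / (alpha[0] - theta)
print(f"theta = {theta:.3f}, Vmin/F = {Vmin_over_F:.3f}")
```

Repeating this calculation for every possible split between product pairs is what populates a Vmin-diagram: each split contributes one minimum-vapor point as a function of the feed distribution.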

  9. Minimum emittance of three-bend achromats

    International Nuclear Information System (INIS)

    Li Xiaoyu; Xu Gang

    2012-01-01

    The minimum emittance of three-bend achromats (TBAs) can be calculated with mathematical software in a way that ignores the actual magnet lattice in the matching condition of the dispersion function in phase space. The minimum scaling factors of two kinds of widely used TBA lattices are obtained. Then the relationship between the lengths and the radii of the three dipoles in a TBA is obtained, as is the minimum scaling factor, when the TBA lattice achieves its minimum emittance. The procedure of analysis and the results can be widely applied to achromat lattices, because the calculation is not restricted by the actual lattice. (authors)

  10. A Pareto-Improving Minimum Wage

    OpenAIRE

    Eliav Danziger; Leif Danziger

    2014-01-01

    This paper shows that a graduated minimum wage, in contrast to a constant minimum wage, can provide a strict Pareto improvement over what can be achieved with an optimal income tax. The reason is that a graduated minimum wage requires high-productivity workers to work more to earn the same income as low-productivity workers, which makes it more difficult for the former to mimic the latter. In effect, a graduated minimum wage allows the low-productivity workers to benefit from second-degree pr...

  11. The minimum wage in the Czech enterprises

    OpenAIRE

    Eva Lajtkepová

    2010-01-01

    Although the statutory minimum wage is not a new category, in the Czech Republic we encounter the definition and regulation of a minimum wage for the first time in the 1990 amendment to Act No. 65/1965 Coll., the Labour Code. The specific amount of the minimum wage and the conditions of its operation were then subsequently determined by government regulation in February 1991. Since that time, the value of the minimum wage has been adjusted fifteen times (the last increase was in January 2007). ...

  12. ShoreZone Mapped Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set is a polyline file of mapped ShoreZone units which correspond with data records found in the Unit, Xshr, BioUnit, and BioBand tables of this...

  13. Stochastic variational approach to minimum uncertainty states

    Energy Technology Data Exchange (ETDEWEB)

    Illuminati, F.; Viola, L. [Dipartimento di Fisica, Padova Univ. (Italy)

    1995-05-21

    We introduce a new variational characterization of Gaussian diffusion processes as minimum uncertainty states. We then define a variational method constrained by kinematics of diffusions and Schroedinger dynamics to seek states of local minimum uncertainty for general non-harmonic potentials. (author)

  14. Zero forcing parameters and minimum rank problems

    NARCIS (Netherlands)

    Barioli, F.; Barrett, W.; Fallat, S.M.; Hall, H.T.; Hogben, L.; Shader, B.L.; Driessche, van den P.; Holst, van der H.

    2010-01-01

    The zero forcing number Z(G), which is the minimum number of vertices in a zero forcing set of a graph G, is used to study the maximum nullity/minimum rank of the family of symmetric matrices described by G. It is shown that for a connected graph of order at least two, no vertex is in every zero
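
For small graphs the zero forcing number Z(G) can be computed directly from its definition: repeatedly apply the color-change rule (a colored vertex with exactly one uncolored neighbor forces that neighbor) and search for the smallest initial set whose closure colors every vertex. A brute-force sketch, exponential in the graph order and so suitable only for tiny examples:

```python
from itertools import combinations

def forcing_closure(adj, colored):
    # Repeatedly apply the color-change rule: a colored vertex with exactly
    # one uncolored neighbor forces that neighbor to become colored.
    colored = set(colored)
    changed = True
    while changed:
        changed = False
        for v in list(colored):
            uncolored = [u for u in adj[v] if u not in colored]
            if len(uncolored) == 1:
                colored.add(uncolored[0])
                changed = True
    return colored

def zero_forcing_number(adj):
    # Try vertex subsets in increasing size until one forces the whole graph.
    verts = list(adj)
    for k in range(1, len(verts) + 1):
        for s in combinations(verts, k):
            if forcing_closure(adj, s) == set(verts):
                return k

# Path on 5 vertices: Z = 1; cycle on 5 vertices: Z = 2.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
cycle = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
print(zero_forcing_number(path), zero_forcing_number(cycle))
```

Since Z(G) bounds the maximum nullity of symmetric matrices described by G, even these small examples illustrate how the combinatorial parameter constrains the minimum-rank problem.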

  15. 30 CFR 281.30 - Minimum royalty.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false Minimum royalty. 281.30 Section 281.30 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE INTERIOR OFFSHORE LEASING OF MINERALS OTHER THAN OIL, GAS, AND SULPHUR IN THE OUTER CONTINENTAL SHELF Financial Considerations § 281.30 Minimum royalty...

  16. New Minimum Wage Research: A Symposium.

    Science.gov (United States)

    Ehrenberg, Ronald G.; And Others

    1992-01-01

    Includes "Introduction" (Ehrenberg); "Effect of the Minimum Wage [MW] on the Fast-Food Industry" (Katz, Krueger); "Using Regional Variation in Wages to Measure Effects of the Federal MW" (Card); "Do MWs Reduce Employment?" (Card); "Employment Effects of Minimum and Subminimum Wages" (Neumark,…

  17. Minimum Wage Effects in the Longer Run

    Science.gov (United States)

    Neumark, David; Nizalova, Olena

    2007-01-01

    Exposure to minimum wages at young ages could lead to adverse longer-run effects via decreased labor market experience and tenure, and diminished education and training, while beneficial longer-run effects could arise if minimum wages increase skill acquisition. Evidence suggests that as individuals reach their late 20s, they earn less the longer…

  18. Bistable minimum energy structures (BiMES) for binary robotics

    International Nuclear Information System (INIS)

    Follador, M; Conn, A T; Rossiter, J

    2015-01-01

    Bistable minimum energy structures (BiMES) are devices derived from the union of the concepts of dielectric elastomer minimum energy structures and bistable systems. This article presents this novel approach to active, elastic and bistable structures. BiMES are based on dielectric elastomer actuators (DEAs), which act as antagonists and provide the actuation for switching between the two equilibrium positions. A central elastic beam is the backbone of the structure and is buckled into the minimum energy configurations by the action of the two DEAs. The theory and model of the device are presented, along with its fabrication process. BiMES are considered fundamental units for more complex structures, which are presented and fabricated as proof of concept. Two different ways of combining the multiple units are proposed: a parallel configuration, to make a simple gripper, and a serial configuration, to generate a binary device. The possibility of using the bistable system as a continuous bender actuator, by modulating the actuation voltage of the two DEAs, was also investigated. (paper)

  19. USGS Elevation Contours Overlay Map Service from The National Map

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The USGS Elevation Contours service from The National Map (TNM) consists of contours generated for the conterminous United States from 1- and 1/3 arc-second...

  20. Quantum Programs as Kleisli Maps

    Directory of Open Access Journals (Sweden)

    Abraham Westerbaan

    2017-01-01

    Furber and Jacobs have shown in their study of quantum computation that the category of commutative C*-algebras and PU-maps (positive linear maps which preserve the unit) is isomorphic to the Kleisli category of a comonad on the category of commutative C*-algebras with MIU-maps (linear maps which preserve multiplication, involution and unit). [Furber and Jacobs, 2013] In this paper, we prove a non-commutative variant of this result: the category of C*-algebras and PU-maps is isomorphic to the Kleisli category of a comonad on the subcategory of MIU-maps. A variation on this result has been used to construct a model of Selinger and Valiron's quantum lambda calculus using von Neumann algebras. [Cho and Westerbaan, 2016]

  1. Topographic mapping

    Science.gov (United States)

    ,

    2008-01-01

    The U.S. Geological Survey (USGS) produced its first topographic map in 1879, the same year it was established. Today, more than 100 years and millions of map copies later, topographic mapping is still a central activity for the USGS. The topographic map remains an indispensable tool for government, science, industry, and leisure. Much has changed since early topographers traveled the unsettled West and carefully plotted the first USGS maps by hand. Advances in survey techniques, instrumentation, and design and printing technologies, as well as the use of aerial photography and satellite data, have dramatically improved mapping coverage, accuracy, and efficiency. Yet cartography, the art and science of mapping, may never before have undergone change more profound than today.

  2. A Servicewide Benthic Mapping Program for National Parks

    Science.gov (United States)

    Moses, Christopher S.; Nayegandhi, Amar; Beavers, Rebecca; Brock, John

    2010-01-01

    In 2007, the National Park Service (NPS) Inventory and Monitoring Program directed the initiation of a benthic habitat mapping program in ocean and coastal parks in alignment with the NPS Ocean Park Stewardship 2007-2008 Action Plan. With 74 ocean and Great Lakes parks stretching over more than 5,000 miles of coastline across 26 States and territories, this Servicewide Benthic Mapping Program (SBMP) is essential. This program will deliver benthic habitat maps and their associated inventory reports to NPS managers in a consistent, servicewide format to support informed management and protection of 3 million acres of submerged National Park System natural and cultural resources. The NPS and the U.S. Geological Survey (USGS) convened a workshop June 3-5, 2008, in Lakewood, Colo., to discuss the goals and develop the design of the NPS SBMP with an assembly of experts (Moses and others, 2010) who identified park needs and suggested best practices for inventory and mapping of bathymetry, benthic cover, geology, geomorphology, and some water-column properties. The recommended SBMP protocols include servicewide standards (such as gap analysis, minimum accuracy, final products) as well as standards that can be adapted to fit network and park unit needs (for example, minimum mapping unit, mapping priorities). SBMP Mapping Process. The SBMP calls for a multi-step mapping process for each park, beginning with a gap assessment and data mining to determine data resources and needs. An interagency announcement of intent to acquire new data will provide opportunities to leverage partnerships. Prior to new data acquisition, all involved parties should be included in a scoping meeting held at network scale. Data collection will be followed by processing and interpretation, and finally expert review and publication. After publication, all digital materials will be archived in a common format. SBMP Classification Scheme. The SBMP will map using the Coastal and Marine Ecological

  3. Single-edition quadrangle maps

    Science.gov (United States)

    ,

    1998-01-01

    In August 1993, the U.S. Geological Survey's (USGS) National Mapping Division and the U.S. Department of Agriculture's Forest Service signed an Interagency Agreement to begin a single-edition joint mapping program. This agreement established the coordination for producing and maintaining single-edition primary series topographic maps for quadrangles containing National Forest System lands. The joint mapping program saves money by eliminating duplication of effort by the agencies and results in a more frequent revision cycle for quadrangles containing national forests. Maps are revised on the basis of jointly developed standards and contain normal features mapped by the USGS, as well as additional features required for efficient management of National Forest System lands. Single-edition maps look slightly different but meet the content, accuracy, and quality criteria of other USGS products. The Forest Service is responsible for the land management of more than 191 million acres of land throughout the continental United States, Alaska, and Puerto Rico, including 155 national forests and 20 national grasslands. These areas make up the National Forest System lands and comprise more than 10,600 of the 56,000 primary series 7.5-minute quadrangle maps (15-minute in Alaska) covering the United States. The Forest Service has assumed responsibility for maintaining these maps, and the USGS remains responsible for printing and distributing them. Before the agreement, both agencies published similar maps of the same areas. The maps were used for different purposes, but had comparable types of features that were revised at different times. Now, the two products have been combined into one so that the revision cycle is stabilized and only one agency revises the maps, thus increasing the number of current maps available for National Forest System lands. This agreement has improved service to the public by requiring that the agencies share the same maps and that the maps meet a

  4. Minimum alcohol pricing policies in practice: A critical examination of implementation in Canada.

    Science.gov (United States)

    Thompson, Kara; Stockwell, Tim; Wettlaufer, Ashley; Giesbrecht, Norman; Thomas, Gerald

    2017-02-01

    There is an interest globally in using Minimum Unit Pricing (MUP) of alcohol to promote public health. Canada is the only country to have both implemented and evaluated some forms of minimum alcohol prices, albeit in ways that fall short of MUP. To inform these international debates, we describe the degree to which minimum alcohol prices in Canada meet recommended criteria for being an effective public health policy. We collected data on the implementation of minimum pricing with respect to (1) breadth of application, (2) indexation to inflation and (3) adjustments for alcohol content. Some jurisdictions have implemented recommended practices with respect to minimum prices; however, the full harm reduction potential of minimum pricing is not fully realised due to incomplete implementation. Key concerns include the following: (1) the exclusion of minimum prices for several beverage categories, (2) minimum prices below the recommended minima and (3) prices are not regularly adjusted for inflation or alcohol content. We provide recommendations for best practices when implementing minimum pricing policy.

  5. Fire behavior simulation in Mediterranean forests using the minimum travel time algorithm

    Science.gov (United States)

    Kostas Kalabokidis; Palaiologos Palaiologou; Mark A. Finney

    2014-01-01

    Recent large wildfires in Greece exemplify the need for pre-fire burn probability assessment and possible landscape fire flow estimation to enhance fire planning and resource allocation. The Minimum Travel Time (MTT) algorithm, incorporated as a module in FlamMap version five, provides valuable fire behavior functions while enabling multi-core utilization for the...

  6. Minimum emittance in TBA and MBA lattices

    Science.gov (United States)

    Xu, Gang; Peng, Yue-Mei

    2015-03-01

    For reaching a small emittance in a modern light source, triple bend achromats (TBA), theoretical minimum emittance (TME) and even multiple bend achromats (MBA) have been considered. This paper derived the necessary condition for achieving minimum emittance in TBA and MBA theoretically, where the bending angle of the inner dipoles is a factor of 3^(1/3) larger than that of the outer dipoles. Here, we also calculated the conditions attaining the minimum emittance of TBA related to phase advance in some special cases with a pure mathematics method. These results may give some directions on lattice design.

  7. Minimum emittance in TBA and MBA lattices

    International Nuclear Information System (INIS)

    Xu Gang; Peng Yuemei

    2015-01-01

    For reaching a small emittance in a modern light source, triple bend achromats (TBA), theoretical minimum emittance (TME) and even multiple bend achromats (MBA) have been considered. This paper derived the necessary condition for achieving minimum emittance in TBA and MBA theoretically, where the bending angle of the inner dipoles is a factor of 3^(1/3) larger than that of the outer dipoles. Here, we also calculated the conditions attaining the minimum emittance of TBA related to phase advance in some special cases with a pure mathematics method. These results may give some directions on lattice design. (authors)
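
The 3^(1/3) factor can be checked numerically under standard simplifying assumptions (iso-magnetic dipoles, TME-optimal optics in each dipole, and a factor-3 emittance penalty for the dispersion-matched outer dipoles; these assumptions are mine, not taken from the abstract): minimizing the emittance figure of merit over the inner-to-outer bending-angle ratio lands on 3^(1/3) ≈ 1.442.

```python
def emittance_figure(r, n_inner=1):
    # r = theta_inner / theta_outer for a TBA (2 outer + n_inner inner dipoles).
    # Per-dipole I5 contribution ~ F_j * theta_j**4 (F = 3 for the outer,
    # dispersion-matched dipoles; F = 1 for the TME-like inner dipole);
    # I2 contribution ~ theta_j.  Total bend angle normalized to 1.
    theta_o = 1.0 / (2 + n_inner * r)
    theta_i = r * theta_o
    i5 = 2 * 3 * theta_o ** 4 + n_inner * theta_i ** 4
    i2 = 2 * theta_o + n_inner * theta_i
    return i5 / i2

# Scan the angle ratio; the minimum sits at r = 3**(1/3) ~ 1.442.
ratios = [1 + 0.0001 * k for k in range(10000)]
best = min(ratios, key=emittance_figure)
print(best)
```

Setting the derivative of (6 + r^4)/(2 + r)^4 to zero gives 2 r^3 = 6, i.e. r = 3^(1/3), which the numerical scan reproduces.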

  8. Who Benefits from a Minimum Wage Increase?

    OpenAIRE

    John W. Lopresti; Kevin J. Mumford

    2015-01-01

    This paper addresses the question of how a minimum wage increase affects the wages of low-wage workers. Most studies assume that there is a simple mechanical increase in the wage for workers earning a wage between the old and the new minimum wage, with some studies allowing for spillovers to workers with wages just above this range. Rather than assume that the wages of these workers would have remained constant, this paper estimates how a minimum wage increase impacts a low-wage worker's wage...

  9. Wage inequality, minimum wage effects and spillovers

    OpenAIRE

    Stewart, Mark B.

    2011-01-01

    This paper investigates possible spillover effects of the UK minimum wage. The halt in the growth in inequality in the lower half of the wage distribution (as measured by the 50:10 percentile ratio) since the mid-1990s, in contrast to the continued inequality growth in the upper half of the distribution, suggests the possibility of a minimum wage effect and spillover effects on wages above the minimum. This paper analyses individual wage changes, using both a difference-in-differences estimat...

  10. Looking for an old map

    Science.gov (United States)

    ,

    1996-01-01

    Many people want maps that show an area of the United States as it existed many years ago. These are called historical maps, and there are two types. The most common type consists of special maps prepared by commercial firms to show such historical features as battlefields, military routes, or the paths taken by famous travelers. Typically, these maps are for sale to tourists at the sites of historical events. The other type is the truly old map--one compiled by a surveyor or cartographer many years ago. Lewis and Clark, for example, made maps of their journeys into the Northwest Territories in 1803-6, and originals of some of these maps still exist.

  11. How unprecedented a solar minimum was it?

    Science.gov (United States)

    Russell, C T; Jian, L K; Luhmann, J G

    2013-05-01

    The end of the last solar cycle was at least 3 years late, and to date, the new solar cycle has seen mainly weaker activity since the onset of the rising phase toward the new solar maximum. The newspapers now even report when auroras are seen in Norway. This paper is an update of our review paper written during the deepest part of the last solar minimum [1]. We update the records of solar activity and its consequent effects on the interplanetary fields and solar wind density. The arrival of solar minimum allows us to use two techniques that predict sunspot maximum from readings obtained at solar minimum. It is clear that the Sun is still behaving strangely compared to the last few solar minima even though we are well beyond the minimum phase of the cycle 23-24 transition.

  12. Impact of the Minimum Wage on Compression.

    Science.gov (United States)

    Wolfe, Michael N.; Candland, Charles W.

    1979-01-01

    Assesses the impact of increases in the minimum wage on salary schedules, provides guidelines for creating a philosophy to deal with the impact, and outlines options and presents recommendations. (IRT)

  13. Quantitative Research on the Minimum Wage

    Science.gov (United States)

    Goldfarb, Robert S.

    1975-01-01

    The article reviews recent research examining the impact of minimum wage requirements on the size and distribution of teenage employment and earnings. The studies measure income distribution, employment levels and effect on unemployment. (MW)

  14. Determining minimum lubrication film for machine parts

    Science.gov (United States)

    Hamrock, B. J.; Dowson, D.

    1978-01-01

    Formula predicts minimum film thickness required for fully-flooded ball bearings, gears, and cams. Formula is result of study to determine complete theoretical solution of isothermal elasto-hydrodynamic lubrication of fully-flooded elliptical contacts.

  15. Long Term Care Minimum Data Set (MDS)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Long-Term Care Minimum Data Set (MDS) is a standardized, primary screening and assessment tool of health status that forms the foundation of the comprehensive...

  16. The SME gauge sector with minimum length

    Energy Technology Data Exchange (ETDEWEB)

    Belich, H.; Louzada, H.L.C. [Universidade Federal do Espirito Santo, Departamento de Fisica e Quimica, Vitoria, ES (Brazil)

    2017-12-15

    We study the gauge sector of the Standard Model Extension (SME) with the Lorentz covariant deformed Heisenberg algebra associated to the minimum length. In order to find and estimate corrections, we clarify whether the violation of Lorentz symmetry and the existence of a minimum length are independent phenomena or are, in some way, related. With this goal, we analyze the dispersion relations of this theory. (orig.)

  17. The SME gauge sector with minimum length

    Science.gov (United States)

    Belich, H.; Louzada, H. L. C.

    2017-12-01

    We study the gauge sector of the Standard Model Extension (SME) with the Lorentz covariant deformed Heisenberg algebra associated to the minimum length. In order to find and estimate corrections, we clarify whether the violation of Lorentz symmetry and the existence of a minimum length are independent phenomena or are, in some way, related. With this goal, we analyze the dispersion relations of this theory.

  18. Participatory Maps

    DEFF Research Database (Denmark)

    Salovaara-Moring, Inka

    2016-01-01

    practice. In particular, mapping environmental damage, endangered species, and human-made disasters has become one focal point for environmental knowledge production. This type of digital map has been highlighted as a processual turn in critical cartography, whereas in related computational journalism...... of a geo-visualization within information mapping that enhances embodiment in the experience of the information. InfoAmazonia is defined as a digitally created map-space within which journalistic practice can be seen as dynamic, performative interactions between journalists, ecosystems, space, and species...

  19. Maximum And Minimum Temperature Trends In Mexico For The Last 31 Years

    Science.gov (United States)

    Romero-Centeno, R.; Zavala-Hidalgo, J.; Allende Arandia, M. E.; Carrasco-Mijarez, N.; Calderon-Bustamante, O.

    2013-05-01

    Based on high-resolution (1') daily maps of the maximum and minimum temperatures in Mexico, an analysis of the last 31-year trends is performed. The maps were generated using all the available information from more than 5,000 stations of the Mexican Weather Service (Servicio Meteorológico Nacional, SMN) for the period 1979-2009, along with data from the North American Regional Reanalysis (NARR). The data processing procedure includes a quality control step, in order to eliminate erroneous daily data, and makes use of a high-resolution digital elevation model (from GEBCO), the relationship between air temperature and elevation by means of the average environmental lapse rate, and interpolation algorithms (linear and inverse-distance weighting). Based on the monthly gridded maps for the mentioned period, the maximum and minimum temperature trends calculated by least-squares linear regression and their statistical significance are obtained and discussed.
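
The interpolation step described above (lapse-rate reduction plus inverse-distance weighting) can be sketched as follows; the coordinates, elevations and temperatures are invented for illustration:

```python
import math

# Reduce station temperatures to sea level with an average environmental
# lapse rate, interpolate by inverse-distance weighting, then restore the
# target elevation.
LAPSE_RATE = 6.5e-3  # K per meter (average environmental lapse rate)

def idw_lapse(stations, target, power=2.0):
    # stations: list of (x, y, elev_m, temp_C); target: (x, y, elev_m).
    tx, ty, tz = target
    num = den = 0.0
    for x, y, z, t in stations:
        t_sl = t + LAPSE_RATE * z          # reduce to sea level
        d = math.hypot(x - tx, y - ty)
        if d == 0:
            return t_sl - LAPSE_RATE * tz  # exact hit on a station
        w = d ** -power
        num += w * t_sl
        den += w
    return num / den - LAPSE_RATE * tz     # restore target elevation

# Invented stations at different elevations (coordinates in arbitrary units).
stations = [(0, 0, 2240, 12.0), (1, 0, 50, 26.0), (0, 1, 1500, 16.5)]
print(round(idw_lapse(stations, (0.4, 0.4, 800)), 2))
```

Working at sea level first keeps elevation differences between stations from contaminating the horizontal interpolation, which matters in terrain as rugged as Mexico's.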

  20. Understanding map projections: Chapter 15

    Science.gov (United States)

    Usery, E. Lynn; Kent, Alexander J.; Vujakovic, Peter

    2018-01-01

    It has probably never been more important in the history of cartography than now that people understand how maps work. With increasing globalization, for example, world maps provide a key format for the transmission of information, but are often poorly used. Examples of poor understanding and use of projections and the resultant maps are many; for instance, the use of rectangular world maps in the United Kingdom press to show Chinese and Korean missile ranges as circles, something which can only be achieved on equidistant projections and then only from one launch point (Vujakovic, 2014).

  1. Concept Mapping

    Science.gov (United States)

    Technology & Learning, 2005

    2005-01-01

    Concept maps are graphical ways of working with ideas and presenting information. They reveal patterns and relationships and help students to clarify their thinking, and to process, organize and prioritize. Displaying information visually--in concept maps, word webs, or diagrams--stimulates creativity. Being able to think logically teaches…

  2. Large Scale Landform Mapping Using Lidar DEM

    Directory of Open Access Journals (Sweden)

    Türkay Gökgöz

    2015-08-01

    In this study, LIDAR DEM data was used to obtain a primary landform map in accordance with a well-known methodology. This primary landform map was generalized using the Focal Statistics tool (Majority), considering the minimum area condition in cartographic generalization in order to obtain landform maps at 1:1000 and 1:5000 scales. Both the primary and the generalized landform maps were verified visually with hillshaded DEM and an orthophoto. As a result, these maps provide satisfactory visuals of the landforms. In order to show the effect of generalization, the area of each landform in both the primary and the generalized maps was computed. Consequently, landform maps at large scales could be obtained with the proposed methodology, including generalization using LIDAR DEM.
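
The Focal Statistics (Majority) generalization step can be sketched in a few lines: each cell takes the most frequent landform class in its 3×3 neighborhood, so isolated patches smaller than the minimum mapping area are absorbed by their surroundings. A stdlib-only illustration (the study itself used the GIS tool, not this code):

```python
from collections import Counter

def majority_filter(grid):
    # Replace every cell with the most frequent class in its 3x3 window
    # (windows are clipped at the grid edges).
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            window = [grid[rr][cc]
                      for rr in range(max(0, r - 1), min(rows, r + 2))
                      for cc in range(max(0, c - 1), min(cols, c + 2))]
            out[r][c] = Counter(window).most_common(1)[0][0]
    return out

# A lone 'ridge' cell (class 1) inside a 'plain' (class 0) is absorbed.
grid = [[0, 0, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(majority_filter(grid))
```

Applying the filter repeatedly, or with a larger window, corresponds to generalizing toward a coarser minimum mapping unit.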

  3. Mapping the Information Trace in Local Field Potentials by a Computational Method of Two-Dimensional Time-Shifting Synchronization Likelihood Based on Graphic Processing Unit Acceleration.

    Science.gov (United States)

    Zhao, Zi-Fang; Li, Xue-Zhu; Wan, You

    2017-12-01

    The local field potential (LFP) is a signal reflecting the electrical activity of neurons surrounding the electrode tip. Synchronization between LFP signals provides important details about how neural networks are organized. Synchronization between two distant brain regions is hard to detect using linear synchronization algorithms like correlation and coherence. Synchronization likelihood (SL) is a non-linear synchronization-detecting algorithm widely used in studies of neural signals from two distant brain areas. One drawback of non-linear algorithms is the heavy computational burden. In the present study, we proposed a graphic processing unit (GPU)-accelerated implementation of an SL algorithm with optional 2-dimensional time-shifting. We tested the algorithm with both artificial data and raw LFP data. The results showed that this method revealed detailed information from original data with the synchronization values of two temporal axes, delay time and onset time, and thus can be used to reconstruct the temporal structure of a neural network. Our results suggest that this GPU-accelerated method can be extended to other algorithms for processing time-series signals (like EEG and fMRI) using similar recording techniques.
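
The core idea of synchronization likelihood can be illustrated with a deliberately simplified sketch: embed both channels in a state space by time-delay embedding, and among the time pairs where channel X recurs (its embedded vectors are close), count how often channel Y recurs as well. The full algorithm chooses the critical distances adaptively so that a fixed fraction of pairs recur, and the cited work adds 2-dimensional time-shifting and GPU acceleration; the fixed thresholds and parameters below are simplifying assumptions.

```python
import math

def embed(x, dim=3, lag=2):
    # Time-delay embedding: each sample becomes a dim-dimensional vector.
    return [tuple(x[i + k * lag] for k in range(dim))
            for i in range(len(x) - (dim - 1) * lag)]

def sync_likelihood(x, y, eps_x=0.3, eps_y=0.3, dim=3, lag=2, w=5):
    # Among pairs (i, j) where X's embedded states recur (distance < eps_x),
    # return the fraction for which Y's states also recur (distance < eps_y).
    ex, ey = embed(x, dim, lag), embed(y, dim, lag)
    hits = total = 0
    n = len(ex)
    for i in range(n):
        for j in range(n):
            if abs(i - j) <= w:        # Theiler window: skip near-diagonal pairs
                continue
            if math.dist(ex[i], ex[j]) < eps_x:
                total += 1
                hits += math.dist(ey[i], ey[j]) < eps_y
    return hits / total if total else 0.0

# Two identical sinusoids are maximally synchronized under this measure.
t = [k * 0.1 for k in range(200)]
x = [math.sin(v) for v in t]
print(sync_likelihood(x, x))
```

The 2-D time-shifting variant of the paper would additionally slide one channel against the other in delay and onset time, producing a synchronization value per (delay, onset) pair rather than a single number.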

  4. The minimum wage in the Czech enterprises

    Directory of Open Access Journals (Sweden)

    Eva Lajtkepová

    2010-01-01

    Full Text Available Although the statutory minimum wage is not a new category, in the Czech Republic we encounter the definition and regulation of a minimum wage for the first time in the 1990 amendment to Act No. 65/1965 Coll., the Labour Code. The specific amount of the minimum wage and the conditions of its operation were then subsequently determined by government regulation in February 1991. Since that time, the value of the minimum wage has been adjusted fifteen times (the last increase was in January 2007). The aim of this article is to present selected results of two surveys on the acceptance of the statutory minimum wage by Czech enterprises. The first survey makes use of data collected by questionnaire from 83 small and medium-sized enterprises in the South Moravia Region in 2005; the second uses data from 116 enterprises across the entire Czech Republic (in 2007). The data have been processed by means of the standard methods of descriptive statistics and of the appropriate statistical analyses (Spearman rank correlation coefficient, Kendall coefficient, χ2 independence test, Kruskal-Wallis test, and others).
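One of the listed statistics, the Spearman rank correlation, can be sketched from scratch (a generic illustration; the enterprise figures below are invented, not the study's data):

```python
def ranks(values):
    """1-based ranks, with ties replaced by their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical data: enterprise size (employees) vs. share of
# employees paid the minimum wage (%).
sizes = [12, 25, 40, 80, 150]
shares = [9.0, 7.5, 7.9, 4.0, 2.5]
print(round(spearman(sizes, shares), 3))  # → -0.9
```

A negative coefficient here would indicate that larger enterprises tend to have a smaller share of minimum-wage employees.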

  5. Mapping goal alignment of deployment programs for alternative fuel technologies: An analysis of wide-scope grant programs in the United States

    International Nuclear Information System (INIS)

    Sobin, Nathaniel; Molenaar, Keith; Cahill, Eric

    2012-01-01

    Governments have attempted to advance alternative fuels (AFs) in the on-road transportation sector with the goal of addressing multiple environmental, energy security, economic growth, and technology transition objectives. However there is little agreement, at all governmental levels, on how to prioritize goals and how to measure progress towards goals. Literature suggests that a consistent, aligned, and prioritized approach will increase the effectiveness of deployment efforts. While literature states that goal alignment and prioritization should occur, there are few studies suggesting how to measure the alignment of deployment programs. This paper presents a methodology for measuring goal alignment by applying the theories of goal ambiguity. It then demonstrates this methodology within the context of fuel- and project-neutral (wide-scope) grant programs directed toward AF deployment. This paper analyzes forty-seven (47) wide-scope federal, state, and regional grant programs in the United States, active between 2006 and 2011. On the whole, governments most use deployment grant programs to address environmental concerns and are highly aligned in doing so between agency levels. In contrast, there is much less consensus (and therefore goal alignment) on whether or how governments should address other priorities such as energy security, economic growth, and technology transition. - Highlights: ► Grants that deploy AFs most often address environmental goals and are highly aligned in doing so. ► Economic growth goals are most often addressed by federal AF deployment grant programs. ► Energy security goals are most often addressed by state and regional AF deployment grant programs. ► Technology transition goals are the least aligned when considering alignment across agencies.

  6. Risk control and the minimum significant risk

    International Nuclear Information System (INIS)

    Seiler, F.A.; Alvarez, J.L.

    1996-01-01

    Risk management implies that the risk manager can, by his actions, exercise at least a modicum of control over the risk in question. In the terminology of control theory, a management action is a control signal imposed as feedback on the system to bring about a desired change in the state of the system. In the terminology of risk management, an action is taken to bring a predicted risk to lower values. Even if it is assumed that the management action taken is 100% effective and that the projected risk reduction is infinitely well known, there is a lower limit to the desired effects that can be achieved. It is based on the fact that all risks, such as the incidence of cancer, exhibit a degree of variability due to a number of extraneous factors such as age at exposure, sex, location, and some lifestyle parameters such as smoking or the consumption of alcohol. If the control signal is much smaller than the variability of the risk, the signal is lost in the noise and control is lost. This defines a minimum controllable risk based on the variability of the risk over the population considered. This quantity is the counterpart of the minimum significant risk which is defined by the uncertainties of the risk model. Both the minimum controllable risk and the minimum significant risk are evaluated for radiation carcinogenesis and are shown to be of the same order of magnitude. For a realistic management action, the assumptions of perfectly effective action and perfect model prediction made above have to be dropped, resulting in an effective minimum controllable risk which is determined by both risk limits. Any action below that effective limit is futile, but it is also unethical due to the ethical requirement of doing more good than harm. Finally, some implications of the effective minimum controllable risk on the use of the ALARA principle and on the evaluation of remedial action goals are presented

  7. Globally optimal, minimum stored energy, double-doughnut superconducting magnets.

    Science.gov (United States)

    Tieng, Quang M; Vegh, Viktor; Brereton, Ian M

    2010-01-01

    The use of the minimum stored energy current density map-based methodology of designing closed-bore symmetric superconducting magnets was described recently. The technique is further developed to cater for the design of interventional-type MRI systems, and in particular open symmetric magnets of the double-doughnut configuration. This extends the work to multiple magnet domain configurations. The use of double-doughnut magnets in MRI scanners has previously been hindered by the ability to deliver strong magnetic fields over a sufficiently large volume appropriate for imaging, essentially limiting spatial resolution, signal-to-noise ratio, and field of view. The requirement of dedicated interventional space restricts the manner in which the coils can be arranged and placed. The minimum stored energy optimal coil arrangement ensures that the field strength is maximized over a specific region of imaging. The design method yields open, dual-domain magnets capable of delivering greater field strengths than those used prior to this work, and at the same time it provides an increase in the field-of-view volume. Simulation results are provided for 1-T double-doughnut magnets with at least a 50-cm 1-ppm (parts per million) field of view and 0.7-m gap between the two doughnuts. Copyright (c) 2009 Wiley-Liss, Inc.

  8. Mapping racism.

    Science.gov (United States)

    Moss, Donald B

    2006-01-01

    The author uses the metaphor of mapping to illuminate a structural feature of racist thought, locating the degraded object along vertical and horizontal axes. These axes establish coordinates of hierarchy and of distance. With the coordinates in place, racist thought begins to seem grounded in natural processes. The other's identity becomes consolidated, and parochialism results. The use of this kind of mapping is illustrated via two patient vignettes. The author presents Freud's (1905, 1927) views in relation to such a "mapping" process, as well as Adorno's (1951) and Baldwin's (1965). Finally, the author conceptualizes the crucial status of primitivity in the workings of racist thought.

  9. Minimum qualifications for nuclear criticality safety professionals

    International Nuclear Information System (INIS)

    Ketzlach, N.

    1990-01-01

    A Nuclear Criticality Technology and Safety Training Committee has been established within the U.S. Department of Energy (DOE) Nuclear Criticality Safety and Technology Project to review and, if necessary, develop standards for the training of personnel involved in nuclear criticality safety (NCS). The committee is exploring the need for developing a standard or other mechanism for establishing minimum qualifications for NCS professionals. The development of standards and regulatory guides for nuclear power plant personnel may serve as a guide in developing the minimum qualifications for NCS professionals

  10. A minimum achievable PV electrical generating cost

    International Nuclear Information System (INIS)

    Sabisky, E.S.

    1996-01-01

    The role and share of photovoltaic (PV) generated electricity in our nation's future energy arsenal is primarily dependent on its future production cost. This paper provides a framework for obtaining a minimum achievable electrical generating cost (a lower bound) for fixed, flat-plate photovoltaic systems. A cost of 2.8 cents/kWh (1990$) was derived for a plant located in Southwestern USA sunshine using a cost of money of 8%. In addition, a value of 22 cents/Wp (1990$) was estimated as a minimum module manufacturing cost/price

  11. Lectures on quasiconformal mappings

    CERN Document Server

    Ahlfors, Lars V

    2006-01-01

    Lars Ahlfors's Lectures on Quasiconformal Mappings, based on a course he gave at Harvard University in the spring term of 1964, was first published in 1966 and was soon recognized as the classic it was shortly destined to become. These lectures develop the theory of quasiconformal mappings from scratch, give a self-contained treatment of the Beltrami equation, and cover the basic properties of Teichmüller spaces, including the Bers embedding and the Teichmüller curve. It is remarkable how Ahlfors goes straight to the heart of the matter, presenting major results with a minimum set of prerequisites. Many graduate students and other mathematicians have learned the foundations of the theories of quasiconformal mappings and Teichmüller spaces from these lecture notes. This edition includes three new chapters. The first, written by Earle and Kra, describes further developments in the theory of Teichmüller spaces and provides many references to the vast literature on Teichmüller spaces and quasiconformal ...

  12. Discretization of space and time: determining the values of minimum length and minimum time

    OpenAIRE

    Roatta , Luca

    2017-01-01

    Assuming that space and time can only have discrete values, we obtain the expressions for the minimum length and the minimum time interval. These values are found to coincide exactly with the Planck length and the Planck time, except for the presence of h instead of ħ.
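For reference, the correspondence claimed above can be written out explicitly (a reconstruction from the abstract, with h the Planck constant, G the gravitational constant, and c the speed of light):

```latex
l_{\min} = \sqrt{\frac{hG}{c^{3}}}, \qquad
t_{\min} = \frac{l_{\min}}{c} = \sqrt{\frac{hG}{c^{5}}}
```

Since h = 2πħ, these differ from the Planck length l_P = √(ħG/c³) and the Planck time t_P = √(ħG/c⁵) only by a factor of √(2π).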

  13. Genetic Mapping

    Science.gov (United States)

    ... greatly advanced genetics research. The improved quality of genetic data has reduced the time required to identify a ... cases, a matter of months or even weeks. Genetic mapping data generated by the HGP's laboratories is freely accessible ...

  14. MINIMUM AREAS FOR ELEMENTARY SCHOOL BUILDING FACILITIES.

    Science.gov (United States)

    Pennsylvania State Dept. of Public Instruction, Harrisburg.

    MINIMUM AREA SPACE REQUIREMENTS IN SQUARE FOOTAGE FOR ELEMENTARY SCHOOL BUILDING FACILITIES ARE PRESENTED, INCLUDING FACILITIES FOR INSTRUCTIONAL USE, GENERAL USE, AND SERVICE USE. LIBRARY, CAFETERIA, KITCHEN, STORAGE, AND MULTIPURPOSE ROOMS SHOULD BE SIZED FOR THE PROJECTED ENROLLMENT OF THE BUILDING IN ACCORDANCE WITH THE PROJECTION UNDER THE…

  15. Dirac's minimum degree condition restricted to claws

    NARCIS (Netherlands)

    Broersma, Haitze J.; Ryjacek, Z.; Schiermeyer, I.

    1997-01-01

    Let G be a graph on n ≥ 3 vertices. Dirac's minimum degree condition is the condition that all vertices of G have degree at least n/2. This is a well-known sufficient condition for the existence of a Hamilton cycle in G. We give related sufficiency conditions for the existence of a Hamilton cycle or a
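The condition itself is easy to check mechanically (a minimal sketch; the helper name and example graphs are hypothetical):

```python
def satisfies_dirac(adj):
    """Dirac's condition: an n-vertex graph (n >= 3) in which every
    vertex has degree at least n/2 is guaranteed to be Hamiltonian."""
    n = len(adj)
    return n >= 3 and all(len(neigh) >= n / 2 for neigh in adj.values())

# The 4-cycle satisfies the condition (every degree is 2 = 4/2)...
cycle4 = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
# ...while the 4-vertex path does not (its end vertices have degree 1).
path4 = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(satisfies_dirac(cycle4), satisfies_dirac(path4))  # → True False
```

Note the condition is sufficient, not necessary: graphs failing it (such as the path above) may still fail or succeed at being Hamiltonian for other reasons.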

  16. 7 CFR 33.10 - Minimum requirements.

    Science.gov (United States)

    2010-01-01

    ... Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... ISSUED UNDER AUTHORITY OF THE EXPORT APPLE ACT Regulations § 33.10 Minimum requirements. No person shall... shipment of apples to any foreign destination unless: (a) Apples grade at least U.S. No. 1 or U.S. No. 1...

  17. Minimum Risk Pesticide: Definition and Product Confirmation

    Science.gov (United States)

    Minimum risk pesticides pose little to no risk to human health or the environment and therefore are not subject to regulation under FIFRA. EPA does not do any pre-market review for such products or labels, but violative products are subject to enforcement.

  18. The Minimum Distance of Graph Codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Justesen, Jørn

    2011-01-01

    We study codes constructed from graphs where the code symbols are associated with the edges and the symbols connected to a given vertex are restricted to be codewords in a component code. In particular we treat such codes from bipartite expander graphs coming from Euclidean planes and other...... geometries. We give results on the minimum distances of the codes....

  19. Minimum maintenance solar pump | Assefa | Zede Journal

    African Journals Online (AJOL)

    A minimum maintenance solar pump (MMSP), Fig 1, has been simulated for Addis Ababa, taking solar meteorological data of global radiation, diffuse radiation and ambient air temperature as input to a computer program that has been developed. To increase the performance of the solar pump, by trapping the long-wave ...

  20. Context quantization by minimum adaptive code length

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Wu, Xiaolin

    2007-01-01

    Context quantization is a technique to deal with the issue of context dilution in high-order conditional entropy coding. We investigate the problem of context quantizer design under the criterion of minimum adaptive code length. A property of such context quantizers is derived for binary symbols....
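The minimum adaptive code length criterion can be sketched with a sequential Krichevsky-Trofimov estimator (a simplified stand-in for the adaptive code in the paper; the bit sequence and context assignments are invented):

```python
import math

def adaptive_code_length(bits, contexts):
    """Ideal adaptive code length (in bits) when each binary symbol is
    coded by a Krichevsky-Trofimov estimator in its own context."""
    counts = {}  # context -> [zeros seen, ones seen]
    total = 0.0
    for b, ctx in zip(bits, contexts):
        c = counts.setdefault(ctx, [0, 0])
        p = (c[b] + 0.5) / (c[0] + c[1] + 1.0)
        total += -math.log2(p)
        c[b] += 1
    return total

# Quantizing (merging) two contexts with similar statistics pools their
# counts, so the adaptive code learns faster and the code length drops.
bits = [0, 0, 1, 0, 0, 1, 0, 0]
fine = [0, 1, 0, 1, 0, 1, 0, 1]    # symbols split over two contexts
coarse = [0, 0, 0, 0, 0, 0, 0, 0]  # the two contexts merged into one
print(adaptive_code_length(bits, fine) > adaptive_code_length(bits, coarse))  # → True
```

This illustrates the trade-off a context quantizer must optimize: merging contexts reduces dilution (fewer parameters to learn) but loses conditioning information when the merged contexts have genuinely different statistics.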

  1. 7 CFR 35.13 - Minimum quantity.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Minimum quantity. 35.13 Section 35.13 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... part, transport or receive for transportation to any foreign destination, a shipment of 25 packages or...

  2. Minimum impact house prototype for sustainable building

    NARCIS (Netherlands)

    Götz, E.; Klenner, K.; Lantelme, M.; Mohn, A.; Sauter, S.; Thöne, J.; Zellmann, E.; Drexler, H.; Jauslin, D.

    2010-01-01

    The Minihouse is a prototupe for a sustainable townhouse. On a site of only 29 sqm it offers 154 sqm of urban life. The project 'Minimum Impact House' adresses two important questions: How do we provide living space in the cities without distroying the landscape? How to improve sustainably the

  3. 49 CFR 639.27 - Minimum criteria.

    Science.gov (United States)

    2010-10-01

    ... dollar value to any non-financial factors that are considered by using performance-based specifications..., DEPARTMENT OF TRANSPORTATION CAPITAL LEASES Cost-Effectiveness § 639.27 Minimum criteria. In making the... used where possible and appropriate: (a) Operation costs; (b) Reliability of service; (c) Maintenance...

  4. Computing nonsimple polygons of minimum perimeter

    NARCIS (Netherlands)

    Fekete, S.P.; Haas, A.; Hemmer, M.; Hoffmann, M.; Kostitsyna, I.; Krupke, D.; Maurer, F.; Mitchell, J.S.B.; Schmidt, A.; Schmidt, C.; Troegel, J.

    2018-01-01

    We consider the Minimum Perimeter Polygon Problem (MP3): for a given set V of points in the plane, find a polygon P with holes that has vertex set V , such that the total boundary length is smallest possible. The MP3 can be considered a natural geometric generalization of the Traveling Salesman

  5. Minimum-B mirrors plus EBT principles

    International Nuclear Information System (INIS)

    Yoshikawa, S.

    1983-01-01

    Electrons are heated at the minimum B location(s) created by the multipole field and the toroidal field. Resulting hot electrons can assist plasma confinement by (1) providing mirror, (2) creating azimuthally symmetric toroidal confinement, or (3) creating modified bumpy torus

  6. Completeness properties of the minimum uncertainty states

    Science.gov (United States)

    Trifonov, D. A.

    1993-01-01

    The completeness properties of the Schrodinger minimum uncertainty states (SMUS) and of some of their subsets are considered. The invariant measures and the resolution unity measures for the set of SMUS are constructed and the representation of squeezing and correlating operators and SMUS as superpositions of Glauber coherent states on the real line is elucidated.

  7. Minimum Description Length Shape and Appearance Models

    DEFF Research Database (Denmark)

    Thodberg, Hans Henrik

    2003-01-01

    The Minimum Description Length (MDL) approach to shape modelling is reviewed. It solves the point correspondence problem of selecting points on shapes defined as curves so that the points correspond across a data set. An efficient numerical implementation is presented and made available as open s...

  8. Faster Fully-Dynamic minimum spanning forest

    DEFF Research Database (Denmark)

    Holm, Jacob; Rotenberg, Eva; Wulff-Nilsen, Christian

    2015-01-01

    We give a new data structure for the fully-dynamic minimum spanning forest problem in simple graphs. Edge updates are supported in O(log^4 n / log log n) expected amortized time per operation, improving the O(log^4 n) amortized bound of Holm et al. (STOC’98, JACM’01). We also provide a deterministic data...
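For contrast with the fully-dynamic setting, the static problem is solved offline by Kruskal's algorithm with union-find (a standard textbook sketch, not the data structure of the paper; the example graph is invented):

```python
def minimum_spanning_forest(n, edges):
    """Kruskal's algorithm: return the edges of a minimum spanning
    forest of a (possibly disconnected) graph on vertices 0..n-1.
    Edges are (weight, u, v) tuples."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    forest = []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:        # accept only edges joining two components
            parent[ru] = rv
            forest.append((w, u, v))
    return forest

# Two components: {0,1,2} and {3,4}; weights are hypothetical.
edges = [(1, 0, 1), (2, 1, 2), (3, 0, 2), (4, 3, 4)]
msf = minimum_spanning_forest(5, edges)
print(sum(w for w, _, _ in msf))  # → 7
```

The dynamic problem is what makes the cited result hard: recomputing the forest from scratch after each edge update costs O(m log m), which the cited structure replaces with polylogarithmic amortized time per update.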

  9. Minimum Wage Effects throughout the Wage Distribution

    Science.gov (United States)

    Neumark, David; Schweitzer, Mark; Wascher, William

    2004-01-01

    This paper provides evidence on a wide set of margins along which labor markets can adjust in response to increases in the minimum wage, including wages, hours, employment, and ultimately labor income. Not surprisingly, the evidence indicates that low-wage workers are most strongly affected, while higher-wage workers are little affected. Workers…

  10. Asymptotics for the minimum covariance determinant estimator

    NARCIS (Netherlands)

    Butler, R.W.; Davies, P.L.; Jhun, M.

    1993-01-01

    Consistency is shown for the minimum covariance determinant (MCD) estimators of multivariate location and scale and asymptotic normality is shown for the former. The proofs are made possible by showing a separating ellipsoid property for the MCD subset of observations. An analogous property is shown
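The MCD estimator can be illustrated by exhaustive search over subsets, which is only feasible for tiny samples (a toy sketch; practical implementations use the FAST-MCD heuristic, and the data below are invented):

```python
from itertools import combinations

def mcd_location(points, h):
    """Brute-force minimum covariance determinant for 2-D data: among
    all size-h subsets, pick the one whose sample covariance matrix has
    the smallest determinant; return that subset's mean."""
    def det_cov(subset):
        n = len(subset)
        mx = sum(p[0] for p in subset) / n
        my = sum(p[1] for p in subset) / n
        sxx = sum((p[0] - mx) ** 2 for p in subset) / (n - 1)
        syy = sum((p[1] - my) ** 2 for p in subset) / (n - 1)
        sxy = sum((p[0] - mx) * (p[1] - my) for p in subset) / (n - 1)
        return sxx * syy - sxy * sxy
    best = min(combinations(points, h), key=det_cov)
    n = len(best)
    return (sum(p[0] for p in best) / n, sum(p[1] for p in best) / n)

# An extreme outlier barely moves the MCD location estimate, because
# the tightest h-subset excludes it.
data = [(0.0, 0.0), (1.0, 0.1), (0.2, 1.0),
        (1.1, 1.0), (0.4, 0.6), (100.0, 100.0)]
print(mcd_location(data, h=4))
```

The separating ellipsoid property mentioned in the abstract concerns exactly this selected subset: the h chosen observations can be enclosed by an ellipsoid that excludes the rest.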

  11. Planetary tides during the Maunder sunspot minimum

    International Nuclear Information System (INIS)

    Smythe, C.M.; Eddy, J.A.

    1977-01-01

    Sun-centered planetary conjunctions and tidal potentials are here constructed for the AD1645 to 1715 period of sunspot absence, referred to as the 'Maunder Minimum'. These are found to be effectively indistinguishable from patterns of conjunctions and power spectra of tidal potential in the present era of a well established 11 year sunspot cycle. This places a new and difficult restraint on any tidal theory of sunspot formation. Problems arise in any direct gravitational theory due to the apparently insufficient forces and tidal heights involved. Proponents of the tidal hypothesis usually revert to trigger mechanisms, which are difficult to criticise or test by observation. Any tidal theory rests on the evidence of continued sunspot periodicity and the substantiation of a prolonged period of solar anomaly in the historical past. The 'Maunder Minimum' was the most drastic change in the behaviour of solar activity in the last 300 years; sunspots virtually disappeared for a 70 year period and the 11 year cycle was probably absent. During that time, however, the nine planets were all in their orbits, and planetary conjunctions and tidal potentials were indistinguishable from those of the present era, in which the 11 year cycle is well established. This provides good evidence against the tidal theory. The pattern of planetary tidal forces during the Maunder Minimum was reconstructed to investigate the possibility that the multiple planet forces somehow fortuitously cancelled at the time, that is that the positions of the slower moving planets in the 17th and early 18th centuries were such that conjunctions and tidal potentials were at the time reduced in number and force. There was no striking dissimilarity between the time of the Maunder Minimum and any period investigated. The failure of planetary conjunction patterns to reflect the drastic drop in sunspots during the Maunder Minimum casts doubt on the tidal theory of solar activity, but a more quantitative test

  12. Road networks as collections of minimum cost paths

    Science.gov (United States)

    Wegner, Jan Dirk; Montoya-Zegarra, Javier Alexander; Schindler, Konrad

    2015-10-01

    We present a probabilistic representation of network structures in images. Our target application is the extraction of urban roads from aerial images. Roads appear as thin, elongated, partially curved structures forming a loopy graph, and this complex layout requires a prior that goes beyond standard smoothness and co-occurrence assumptions. In the proposed model the network is represented as a union of 1D paths connecting distant (super-)pixels. A large set of putative candidate paths is constructed in such a way that they include the true network as much as possible, by searching for minimum cost paths in the foreground (road) likelihood. Selecting the optimal subset of candidate paths is posed as MAP inference in a higher-order conditional random field. Each path forms a higher-order clique with a type of clique potential which attracts the member nodes of cliques with high cumulative road evidence to the foreground label. That formulation induces a robust P^N-Potts model, for which a global MAP solution can be found efficiently with graph cuts. Experiments with two road data sets show that the proposed model significantly improves per-pixel accuracies as well as the overall topological network quality with respect to several baselines.
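The minimum cost path search that generates the candidate set can be sketched with plain Dijkstra on a per-pixel cost grid (a generic sketch; the paper works on superpixels, and the grid values below are invented):

```python
import heapq

def min_cost_path(cost, start, goal):
    """Dijkstra shortest path on a 4-connected grid of per-pixel costs
    (low cost = high road likelihood); returns (total cost, path)."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return dist[goal], path[::-1]

# Low cost along the middle row marks high road likelihood.
cost = [[9, 9, 9],
        [1, 1, 1],
        [9, 9, 9]]
total, path = min_cost_path(cost, (1, 0), (1, 2))
print(total, path)  # → 3 [(1, 0), (1, 1), (1, 2)]
```

Running such searches between many distant pixel pairs yields the pool of candidate paths from which the CRF then selects an optimal subset.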

  13. Solving crystal structures with the symmetry minimum function

    International Nuclear Information System (INIS)

    Estermann, M.A.

    1995-01-01

    Unravelling the Patterson function (the auto-correlation function of the crystal structure) (A.L. Patterson, Phys. Rev. 46 (1934) 372) can be the only way of solving crystal structures from neutron and incomplete diffraction data (e.g. powder data) when direct methods for phase determination fail. The negative scattering lengths of certain isotopes and the systematic loss of information caused by incomplete diffraction data invalidate the underlying statistical assumptions made in direct methods. In contrast, the Patterson function depends solely on the quality of the available diffraction data. Simpson et al. (P.G. Simpson et al., Acta Crystallogr. 18 (1965) 169) showed that solving a crystal structure with a particular superposition of origin-shifted Patterson functions, the symmetry minimum function, is advantageous over using the Patterson function alone, for single-crystal X-ray data. This paper describes the extension of the Patterson superposition approach to neutron data and powder data by (a) actively using the negative regions in the Patterson map caused by negative scattering lengths and (b) using maximum entropy Patterson maps (W.I.F. David, Nature 346 (1990) 731). Furthermore, prior chemical knowledge such as bond lengths and angles from known fragments has been included. Two successful structure solutions of a known and a previously unknown structure (M. Hofmann, J. Solid State Chem., in press) illustrate the potential of this new development. ((orig.))

  14. Minimum throttling feedwater control in VVER-1000 and PWR NPPs

    International Nuclear Information System (INIS)

    Symkin, B.E.; Thaulez, F.

    2004-01-01

    This paper presents an approach for the design and implementation of advanced digital control systems that use a minimum-throttling algorithm for the feedwater control. The minimum-throttling algorithm for the feedwater control, i.e. for the control of steam generators level and of the feedwater pumps speed, is applicable for NPPs with variable speed feedwater pumps. It operates in such a way that the feedwater control valve in the most loaded loop is wide open, steam generator level in this loop being controlled by the feedwater pumps speed, while the feedwater control valves in the other loops are slightly throttling under the action of their control system, to accommodate the slight loop imbalances. This has the advantage of minimizing the valve pressure losses hence minimizing the feedwater pumps power consumption and increasing the net MWe. The benefit has been evaluated for specific plants as being roughly 0.7 and 2.4 MW. The minimum throttling mode has the further advantages of lowering the actuator efforts with potential positive impact in actuator life and of minimizing the feedwater pipelines vibrations. The minimum throttling mode of operation has been developed by the Ukrainian company LvivORGRES. It has been applied with great deal of success on several VVER-1000 NPPs, six units of Zaporizhzha in Ukraine plus, with participation of Westinghouse, Kozloduy 5 and 6 in Bulgaria and South Ukraine 1 to 3 in Ukraine. The concept operates with both ON-OFF valves and true control valves. A study, jointly conducted by Westinghouse and LvivORGRES, is ongoing to demonstrate the applicability of the concept to PWRs having variable speed feedwater pumps and having, or installing, digital feedwater control, standalone or as part of a global digital control system. The implementation of the algorithm at VVER-1000 plants provided both safety improvement and direct commercial benefits. The minimum-throttling algorithm will similarly increase the performance of PWRs. 

  15. How Will Higher Minimum Wages Affect Family Life and Children's Well-Being?

    Science.gov (United States)

    Hill, Heather D; Romich, Jennifer

    2018-06-01

    In recent years, new national and regional minimum wage laws have been passed in the United States and other countries. The laws assume that benefits flow not only to workers but also to their children. Adolescent workers will most likely be affected directly given their concentration in low-paying jobs, but younger children may be affected indirectly by changes in parents' work conditions, family income, and the quality of nonparental child care. Research on minimum wages suggests modest and mixed economic effects: Decreases in employment can offset, partly or fully, wage increases, and modest reductions in poverty rates may fade over time. Few studies have examined the effects of minimum wage increases on the well-being of families, adults, and children. In this article, we use theoretical frameworks and empirical evidence concerning the effects on children of parental work and family income to suggest hypotheses about the effects of minimum wage increases on family life and children's well-being.

  16. The Effect of Minimum Wages on Adolescent Fertility: A Nationwide Analysis.

    Science.gov (United States)

    Bullinger, Lindsey Rose

    2017-03-01

    To investigate the effect of minimum wage laws on adolescent birth rates in the United States. I used a difference-in-differences approach and vital statistics data measured quarterly at the state level from 2003 to 2014. All models included state covariates, state and quarter-year fixed effects, and state-specific quarter-year nonlinear time trends, which provided plausibly causal estimates of the effect of minimum wage on adolescent birth rates. A $1 increase in minimum wage reduces adolescent birth rates by about 2%. The effects are driven by non-Hispanic White and Hispanic adolescents. Nationwide, increasing minimum wages by $1 would likely result in roughly 5000 fewer adolescent births annually.
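The difference-in-differences logic behind these estimates reduces, in its simplest two-group two-period form, to one subtraction (the birth-rate figures below are hypothetical, not the study's):

```python
def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Difference-in-differences: the change in the treated group
    minus the change in the control group."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical quarterly adolescent birth rates (per 1,000): a state
# that raised its minimum wage vs. a comparison state that did not.
effect = did_estimate(treat_pre=20.0, treat_post=18.5,
                      ctrl_pre=21.0, ctrl_post=20.5)
print(effect)  # → -1.0
```

The study's actual models generalize this with state and quarter-year fixed effects plus state-specific nonlinear time trends, but the identifying comparison is the same: treated-group change net of the control-group change.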

  17. Correlation of Geophysical and Geotechnical Methods for Sediment Mapping in Sungai Batu, Kedah

    Science.gov (United States)

    Zakaria, M. T.; Taib, A.; Saidin, M. M.; Saad, R.; Muztaza, N. M.; Masnan, S. S. K.

    2018-04-01

    Exploration geophysics is widely used to map the subsurface characteristics of a region, to understand the underlying rock structures and the spatial distribution of rock units. 2-D resistivity and seismic refraction methods were conducted in the Sungai Batu locality with the objective of identifying and mapping the sediment deposit, correlated with borehole records. 2-D resistivity data were acquired using an ABEM SAS4000 system with a pole-dipole array and 2.5 m minimum electrode spacing, while for seismic refraction an ABEM MK8 seismograph was used to record the seismic data, with a 5 kg sledgehammer as the seismic source and a geophone interval of 5 m. The inversion model of the 2-D resistivity result shows resistivity values of 500 Ωm for the hard layer in this study area. The seismic result indicates velocity values of 3600 m/s, interpreted as the hard layer in this locality.

  18. Nowcasting daily minimum air and grass temperature

    Science.gov (United States)

    Savage, M. J.

    2016-02-01

    Site-specific and accurate prediction of daily minimum air and grass temperatures, made available online several hours before their occurrence, would be of significant benefit to several economic sectors and for planning human activities. Site-specific and reasonably accurate nowcasts of the daily minimum temperature several hours before its occurrence, using measured sub-hourly temperatures from earlier in the morning as model inputs, were investigated. Various temperature models were tested for their ability to accurately nowcast daily minimum temperatures 2 or 4 h before sunrise. Temperature datasets used for the model nowcasts included sub-hourly grass and grass-surface (infrared) temperatures from one location in South Africa and air temperatures from four subtropical sites varying in altitude (USA and South Africa) and from one site in central sub-Saharan Africa. The nowcast models employed either exponential or square root functions to describe the rate of nighttime temperature decrease, inverted so as to determine the minimum temperature. The models were also applied in near real-time using an open web-based system to display the nowcasts. Extrapolation algorithms for the site-specific nowcasts were also implemented in a datalogger in an innovative and mathematically consistent manner. Comparison of model 1 (exponential) nowcasts vs measured daily minimum air temperatures yielded root mean square errors (RMSEs) <1 °C for the 2-h ahead nowcasts. Model 2 (also exponential), for which a constant model coefficient (b = 2.2) was used, was usually slightly less accurate but still gave RMSEs <1 °C. Use of model 3 (square root) yielded increased RMSEs for the 2-h ahead comparisons between nowcasted and measured daily minimum air temperatures, increasing to 1.4 °C for some sites. For all sites and all models, the 4-h ahead air temperature nowcasts generally yielded increased RMSEs, <2.1 °C. Comparisons for all model nowcasts of the daily grass
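A square-root nighttime cooling model of the kind described (model 3) can be sketched as a two-point fit and extrapolation (the functional form T(t) = T0 - b*sqrt(t) is an assumption paraphrased from the abstract, and the readings below are invented):

```python
import math

def nowcast_min_sqrt(t1, T1, t2, T2, t_sunrise):
    """Fit T(t) = T0 - b*sqrt(t), with t in hours since cooling onset,
    to two early-morning readings (t1, T1) and (t2, T2), then
    extrapolate to sunrise, when the daily minimum typically occurs."""
    b = (T1 - T2) / (math.sqrt(t2) - math.sqrt(t1))
    T0 = T1 + b * math.sqrt(t1)
    return T0 - b * math.sqrt(t_sunrise)

# Readings at 1 h and 4 h after cooling onset; sunrise at 9 h.
print(nowcast_min_sqrt(1.0, 14.0, 4.0, 12.0, 9.0))  # → 10.0
```

In practice more than two sub-hourly readings would be used with a least-squares fit, which is what makes the 2-h ahead nowcasts more accurate than the 4-h ahead ones: the later fit sees more of the cooling curve.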

  19. Toward an operational framework for fine-scale urban land-cover mapping in Wallonia using submeter remote sensing and ancillary vector data

    Science.gov (United States)

    Beaumont, Benjamin; Grippa, Tais; Lennert, Moritz; Vanhuysse, Sabine; Stephenne, Nathalie; Wolff, Eléonore

    2017-07-01

    Encouraged by the EU INSPIRE directive requirements and recommendations, the Walloon authorities, like other EU regional or national authorities, want to develop operational land-cover (LC) and land-use (LU) mapping methods using existing geodata. Urban planners and environmental monitoring stakeholders of Wallonia currently have to rely on outdated, mixed, and incomplete LC and LU information; the current reference map is 10 years old. Two object-based classification methods, a rule-based and a classifier-based method, are compared for detailed regional urban LC mapping. The added value of using the different existing geospatial datasets in the process is assessed. This includes the comparison between satellite and aerial optical data in terms of mapping accuracies, visual quality of the map, costs, processing, data availability, and property rights. The combination of spectral, tridimensional, and vector data provides accuracy values close to 0.90 for mapping the LC into nine categories with a minimum mapping unit of 15 m2. Such a detailed LC map offers opportunities for fine-scale environmental and spatial planning activities. Still, the regional application poses challenges regarding automation, big data handling, and processing time, which are discussed.

  20. Projective mapping

    DEFF Research Database (Denmark)

    Dehlholm, Christian; Brockhoff, Per B.; Bredie, Wender Laurentius Petrus

    2012-01-01

    by the practical testing environment. As a result of the changes, a reasonable assumption would be to question the consequences caused by the variations in method procedures. Here, the aim is to highlight the proven or hypothetic consequences of variations of Projective Mapping. Presented variations will include...... instructions and influence heavily the product placements and the descriptive vocabulary (Dehlholm et.al., 2012b). The type of assessors performing the method influences results with an extra aspect in Projective Mapping compared to more analytical tests, as the given spontaneous perceptions are much dependent......Projective Mapping (Risvik et.al., 1994) and its Napping (Pagès, 2003) variations have become increasingly popular in the sensory field for rapid collection of spontaneous product perceptions. It has been applied in variations which sometimes are caused by the purpose of the analysis and sometimes...

  1. Dose mapping role in gamma irradiation industry

    International Nuclear Information System (INIS)

    Noriah Mod Ali; John Konsoh Sangau; Mazni Abd Latif

    2002-01-01

    In this study, the role of dosimetry in a gamma irradiator is discussed. Dose distribution mapping in the irradiator is a main requirement in irradiator or chamber commissioning. These distribution data were used to confirm the dosimetry parameters, i.e., exposure time, maximum and minimum dose map points, and dose distribution, which serve as guidelines for optimum product irradiation. (Author)

  2. Solution for Nonlinear Three-Dimensional Intercept Problem with Minimum Energy

    Directory of Open Access Journals (Sweden)

    Henzeh Leeghim

    2013-01-01

    a minimum-energy application, which then generates both the desired initial interceptor velocity and the TOF for the minimum-energy transfer. The optimization problem is formulated using the classical Lagrangian f and g coefficients, which map initial position and velocity vectors to future times, and a universal time variable x. A Newton-Raphson iteration algorithm is introduced for iteratively solving the problem. A generalized problem formulation is introduced for minimizing the TOF as part of the optimization problem. Several examples are presented, and the results are compared with Hohmann transfer solution approaches. The resulting minimum-energy intercept solution algorithm is expected to be broadly useful as a starting iterate for applications spanning targeting, rendezvous, interplanetary trajectory design, and so on.
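The record's universal-variable equations are not reproduced here, so as a hedged illustration of the Newton-Raphson pattern it relies on, the sketch below applies the same iteration to Kepler's equation E − e·sin E = M, the kind of transcendental equation that arises inside Lambert-type intercept solvers; the generic solver and its names are illustrative, not the paper's algorithm.

```python
import math

def newton_raphson(f, fprime, x0, tol=1e-12, max_iter=50):
    """Generic Newton-Raphson iteration: x_{k+1} = x_k - f(x_k) / f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton-Raphson did not converge")

# Example: solve Kepler's equation E - e*sin(E) = M for eccentric anomaly E
e, M = 0.3, 1.2
E = newton_raphson(lambda E: E - e * math.sin(E) - M,
                   lambda E: 1.0 - e * math.cos(E),
                   x0=M)
```

Starting the iteration at x0 = M is a common, robust choice for moderate eccentricities; convergence is quadratic near the root.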

  3. Measurement of Minimum Bias Observables with ATLAS

    CERN Document Server

    Kvita, Jiri; The ATLAS collaboration

    2017-01-01

    The modelling of Minimum Bias (MB) events is a crucial ingredient for learning about the description of soft QCD processes. It also has significant relevance for the simulation of the environment at the LHC with many concurrent pp interactions (“pileup”). The ATLAS collaboration has provided new measurements of the inclusive charged particle multiplicity and its dependence on transverse momentum and pseudorapidity in special data sets with low LHC beam currents, recorded at center-of-mass energies of 8 TeV and 13 TeV. The measurements cover a wide spectrum using charged particle selections with minimum transverse momentum of both 100 MeV and 500 MeV and in various phase space regions of low and high charged particle multiplicities.

  4. Comments on the 'minimum flux corona' concept

    International Nuclear Information System (INIS)

    Antiochos, S.K.; Underwood, J.H.

    1978-01-01

    Hearn's (1975) models of the energy balance and mass loss of stellar coronae, based on a 'minimum flux corona' concept, are critically examined. First, it is shown that the neglect of the relevant length scales for coronal temperature variation leads to an inconsistent computation of the total energy flux F. The stability arguments upon which the minimum flux concept is based are shown to be fallacious. Errors in the computation of the stellar wind contribution to the energy budget are identified. Finally, we criticize Hearn's (1977) suggestion that the model, with a value of the thermal conductivity modified by the magnetic field, can explain the difference between solar coronal holes and quiet coronal regions. (orig.)

  5. Minimum wakefield achievable by waveguide damped cavity

    International Nuclear Information System (INIS)

    Lin, X.E.; Kroll, N.M.

    1995-01-01

    The authors use an equivalent circuit to model a waveguide damped cavity. Both exponentially damped and persistent (decaying as t^(-3/2)) components of the wakefield are derived from this model. The result shows that for a cavity with resonant frequency a fixed interval above waveguide cutoff, the persistent wakefield amplitude is inversely proportional to the external Q value of the damped mode. The competition of the two terms results in an optimal Q value, which gives a minimum wakefield as a function of the distance behind the source particle. The minimum wakefield increases when the resonant frequency approaches the waveguide cutoff. The results agree very well with computer simulation on a real cavity-waveguide system

  6. Protocol for the verification of minimum criteria

    International Nuclear Information System (INIS)

    Gaggiano, M.; Spiccia, P.; Gaetano Arnetta, P.

    2014-01-01

    This Protocol has been prepared with reference to the provisions of Article 8 of the Legislative Decree of May 26, 2000, No. 187. Quality controls of radiological equipment fit within the larger 'quality assurance program' and are intended to ensure the correct operation of the equipment and the maintenance of that state. The pursuit of this objective guarantees that the radiological equipment subjected to those controls also meets the minimum criteria of acceptability set out in Annex V of the aforementioned legislative decree, which establishes the conditions necessary to allow the functions for which each piece of radiological equipment was designed, built, and used. The Protocol is established for the purpose of quality control of radiological equipment of the Cone Beam Computed Tomography type and serves as a reference document, in the sense that compliance with stated tolerances also ensures that the minimum acceptability requirements are met, where applicable.

  7. Low Streamflow Forcasting using Minimum Relative Entropy

    Science.gov (United States)

    Cui, H.; Singh, V. P.

    2013-12-01

    Minimum relative entropy spectral analysis is derived in this study and applied to forecast streamflow time series. The proposed method extends the autocorrelation such that the relative entropy of the underlying process is minimized so that time series data can be forecast. Different priors, such as uniform, exponential, and Gaussian assumptions, are taken to estimate the spectral density depending on the autocorrelation structure. Seasonal and nonseasonal low streamflow series obtained from the Colorado River (Texas) under drought conditions are successfully forecast using the proposed method. Minimum relative entropy determines the spectrum of low streamflow series with higher resolution than the conventional method. The forecast streamflow is compared to predictions using Burg's maximum entropy spectral analysis (MESA) and configurational entropy. The advantages and disadvantages of each method in forecasting low streamflow are discussed.
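The minimum-relative-entropy derivation itself is not reproduced in the record; as a hedged illustration of the autoregressive extrapolation idea that these spectral forecasting methods build on, here is a plain least-squares AR(p) fit and recursive forecast. This is a conventional baseline, not the MRE or Burg method, and the function names are illustrative.

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares fit of an AR(p) model x_t = sum_k a_k * x_{t-1-k}."""
    # Design matrix: row for time t holds [x_{t-1}, x_{t-2}, ..., x_{t-p}]
    rows = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    coeffs, *_ = np.linalg.lstsq(rows, x[p:], rcond=None)
    return coeffs

def forecast(x, coeffs, steps):
    """Iterate the fitted recurrence forward to forecast future values."""
    hist = list(x)
    for _ in range(steps):
        hist.append(sum(c * hist[-k - 1] for k, c in enumerate(coeffs)))
    return np.array(hist[len(x):])
```

On a noiseless sinusoid, which satisfies the exact recurrence x_t = 2·cos(ω)·x_{t-1} − x_{t-2}, the AR(2) fit recovers the coefficients and the forecast continues the series; real low-flow series would of course carry noise and need model-order selection.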

  8. Map Archive Mining: Visual-Analytical Approaches to Explore Large Historical Map Collections

    Directory of Open Access Journals (Sweden)

    Johannes H. Uhl

    2018-04-01

    Historical maps are unique sources of retrospective geographical information. Recently, several map archives containing map series covering large spatial and temporal extents have been systematically scanned and made available to the public. The geographical information contained in such data archives makes it possible to extend geospatial analysis retrospectively beyond the era of digital cartography. However, given the large data volumes of such archives (e.g., more than 200,000 map sheets in the United States Geological Survey topographic map archive) and the low graphical quality of older, manually-produced map sheets, the process to extract geographical information from these map archives needs to be automated to the highest degree possible. To understand the potential challenges (e.g., salient map characteristics and data quality variations) in automating large-scale information extraction tasks for map archives, it is useful to efficiently assess spatio-temporal coverage, approximate map content, and spatial accuracy of georeferenced map sheets at different map scales. Such preliminary analytical steps are often neglected or ignored in the map processing literature but represent critical phases that lay the foundation for any subsequent computational processes including recognition. Exemplified for the United States Geological Survey topographic map and the Sanborn fire insurance map archives, we demonstrate how such preliminary analyses can be systematically conducted using traditional analytical and cartographic techniques, as well as visual-analytical data mining tools originating from machine learning and data science.

  9. Minimum Wage Laws and the Distribution of Employment.

    Science.gov (United States)

    Lang, Kevin

    The desirability of raising the minimum wage has long revolved around just one question: the effect of higher minimum wages on the overall level of employment. An even more critical effect of the minimum wage rests on the composition of employment: who gets the minimum wage job. An examination of employment in eating and drinking establishments…

  10. Affective Maps

    DEFF Research Database (Denmark)

    Salovaara-Moring, Inka

    . In particular, mapping environmental damage, endangered species, and human-made disasters has become one of the focal points of affective knowledge production. These ‘more-than-human geographies’ practices include notions of species, space and territory, and movement towards a new political ecology. This type...... of digital cartographies has been highlighted as the ‘processual turn’ in critical cartography, whereas in related computational journalism it can be seen as an interactive and iterative process of mapping complex and fragile ecological developments. This paper looks at computer-assisted cartography as part...

  11. Minimum intervention dentistry: periodontics and implant dentistry.

    Science.gov (United States)

    Darby, I B; Ngo, L

    2013-06-01

    This article will look at the role of minimum intervention dentistry in the management of periodontal disease. It will discuss the role of appropriate assessment, treatment and risk factors/indicators. In addition, the role of the patient and early intervention in the continuing care of dental implants will be discussed as well as the management of peri-implant disease. © 2013 Australian Dental Association.

  12. Minimum quality standards and international trade

    DEFF Research Database (Denmark)

    Baltzer, Kenneth Thomas

    2011-01-01

    This paper investigates the impact of a non-discriminating minimum quality standard (MQS) on trade and welfare when the market is characterized by imperfect competition and asymmetric information. A simple partial equilibrium model of an international Cournot duopoly is presented in which a domes...... prefer different levels of regulation. As a result, international trade disputes are likely to arise even when regulation is non-discriminating....

  13. ''Reduced'' magnetohydrodynamics and minimum dissipation rates

    International Nuclear Information System (INIS)

    Montgomery, D.

    1992-01-01

    It is demonstrated that all solutions of the equations of ''reduced'' magnetohydrodynamics approach a uniform-current, zero-flow state for long times, given a constant wall electric field, uniform scalar viscosity and resistivity, and uniform mass density. This state is the state of minimum energy dissipation rate for these boundary conditions. No steady-state turbulence is possible. The result contrasts sharply with results for full three-dimensional magnetohydrodynamics before the reduction occurs

  14. Minimum K_2,3-saturated Graphs

    OpenAIRE

    Chen, Ya-Chen

    2010-01-01

    A graph is K_{2,3}-saturated if it has no subgraph isomorphic to K_{2,3}, but does contain a K_{2,3} after the addition of any new edge. We prove that the minimum number of edges in a K_{2,3}-saturated graph on n >= 5 vertices is sat(n, K_{2,3}) = 2n - 3.
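The construction behind the theorem is not reproduced in the record; as a sanity check of the formula at its smallest case, the following exhaustive search over all 2^10 graphs on n = 5 vertices confirms sat(5, K_{2,3}) = 2·5 − 3 = 7. It uses the fact that a (not necessarily induced) K_{2,3} subgraph exists exactly when some pair of vertices has at least three common neighbors.

```python
from itertools import combinations

def has_k23(adj, n):
    """A K_{2,3} subgraph exists iff two vertices share >= 3 common neighbors."""
    return any(len(adj[u] & adj[v]) >= 3 for u, v in combinations(range(n), 2))

def min_sat_edges(n):
    """Brute-force sat(n, K_{2,3}): minimum edges over all saturated graphs."""
    all_edges = list(combinations(range(n), 2))
    best = None
    for mask in range(1 << len(all_edges)):
        edges = [e for i, e in enumerate(all_edges) if mask >> i & 1]
        adj = {v: set() for v in range(n)}
        for a, b in edges:
            adj[a].add(b); adj[b].add(a)
        if has_k23(adj, n):          # graph must itself be K_{2,3}-free
            continue
        saturated = True
        for a, b in all_edges:       # every missing edge must create a K_{2,3}
            if b in adj[a]:
                continue
            adj[a].add(b); adj[b].add(a)
            created = has_k23(adj, n)
            adj[a].remove(b); adj[b].remove(a)
            if not created:
                saturated = False
                break
        if saturated and (best is None or len(edges) < best):
            best = len(edges)
    return best
```

The search is only feasible for tiny n; the paper's result covers all n >= 5.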

  15. Minimum degree and density of binary sequences

    DEFF Research Database (Denmark)

    Brandt, Stephan; Müttel, J.; Rautenbach, D.

    2010-01-01

    For d,k∈N with k ≤ 2d, let g(d,k) denote the infimum density of binary sequences (x_i)_{i∈Z} ∈ {0,1}^Z which satisfy the minimum degree condition σ(x+) ≥ k for all i∈Z with x_i = 1. We reduce the problem of computing g(d,k) to a combinatorial problem related to the generalized k-girth of a graph G which...

  16. Integrating the nursing management minimum data set into the logical observation identifier names and codes system.

    Science.gov (United States)

    Subramanian, Amarnath; Westra, Bonnie; Matney, Susan; Wilson, Patricia S; Delaney, Connie W; Huff, Stan; Huff, Stanley M; Huber, Diane

    2008-11-06

    This poster describes the process used to integrate the Nursing Management Minimum Data Set (NMMDS), an instrument to measure the nursing context of care, into the Logical Observation Identifier Names and Codes (LOINC) system to facilitate contextualization of quality measures. Integration of the first three of 18 elements resulted in 48 new codes including five panels. The LOINC Clinical Committee has approved the presented mapping for their next release.

  17. The First Global Geological Map of Mercury

    Science.gov (United States)

    Prockter, L. M.; Head, J. W., III; Byrne, P. K.; Denevi, B. W.; Kinczyk, M. J.; Fassett, C.; Whitten, J. L.; Thomas, R.; Ernst, C. M.

    2015-12-01

    Geological maps are tools with which to understand the distribution and age relationships of surface geological units and structural features on planetary surfaces. Regional and limited global mapping of Mercury has already yielded valuable science results, elucidating the history and distribution of several types of units and features, such as regional plains, tectonic structures, and pyroclastic deposits. To date, however, no global geological map of Mercury exists, and there is currently no commonly accepted set of standardized unit descriptions and nomenclature. With MESSENGER monochrome image data, we are undertaking the global geological mapping of Mercury at the 1:15M scale applying standard U.S. Geological Survey mapping guidelines. This map will enable the development of the first global stratigraphic column of Mercury, will facilitate comparisons among surface units distributed discontinuously across the planet, and will provide guidelines for mappers so that future mapping efforts will be consistent and broadly interpretable by the scientific community. To date we have incorporated three major datasets into the global geological map: smooth plains units, tectonic structures, and impact craters and basins >20 km in diameter. We have classified most of these craters by relative age on the basis of the state of preservation of morphological features and standard classification schemes first applied to Mercury by the Mariner 10 imaging team. Additional datasets to be incorporated include intercrater plains units and crater ejecta deposits. In some regions MESSENGER color data is used to supplement the monochrome data, to help elucidate different plains units. The final map will be published online, together with a peer-reviewed publication. Further, a digital version of the map, containing individual map layers, will be made publicly available for use within geographic information systems (GISs).

  18. Speedup of minimum discontinuity phase unwrapping algorithm with a reference phase distribution

    Science.gov (United States)

    Liu, Yihang; Han, Yu; Li, Fengjiao; Zhang, Qican

    2018-06-01

    In three-dimensional (3D) shape measurement based on phase analysis, the phase analysis process usually produces a wrapped phase map ranging from −π to π with 2π discontinuities, and thus a phase unwrapping algorithm is necessary to recover the continuous natural phase map from which the 3D height distribution can be restored. The minimum discontinuity phase unwrapping algorithm can be used to solve many different kinds of phase unwrapping problems, but its main drawback is that it requires a large amount of computation and has low efficiency in searching for the improving loop within the phase's discontinuity area. To overcome this drawback, a speedup of the minimum discontinuity phase unwrapping algorithm using the phase distribution on a reference plane is proposed. In this improved algorithm, before the minimum discontinuity phase unwrapping is carried out, an integer number K is calculated from the ratio of the wrapped phase to the natural phase on a reference plane. The jump counts of the unwrapped phase can then be reduced by adding 2Kπ, so the efficiency of the minimum discontinuity phase unwrapping algorithm is significantly improved. Both simulated and experimental results verify the feasibility of the proposed improved algorithm, and both clearly show that the algorithm works very well and has high efficiency.
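The pre-compensation step described above can be sketched in a few lines of numpy; this is a minimal illustration, assuming the unwrapped reference-plane phase is available on the same pixel grid as the wrapped map, and the function name is illustrative. The integer K is the fringe order implied by the reference phase, and adding 2Kπ to the wrapped phase removes most discontinuities before the minimum discontinuity algorithm runs.

```python
import numpy as np

def add_reference_orders(wrapped, ref_phase):
    """Estimate the integer fringe order K from the unwrapped reference-plane
    phase and pre-add 2*K*pi, so far fewer 2*pi jumps remain to unwrap."""
    K = np.round((ref_phase - wrapped) / (2 * np.pi))
    return wrapped + 2 * np.pi * K
```

When the measured surface is close to the reference plane, the residual map is already nearly continuous and the subsequent search for improving loops has little left to do.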

  19. The intercrater plains of Mercury and the Moon: Their nature, origin and role in terrestrial planet evolution. Construction of the paleogeologic maps. Ph.D. Thesis

    Science.gov (United States)

    Leake, M. A.

    1982-01-01

    The post-Caloris surface was derived from the geologic map by plotting all Class 1 and 2 features. To construct the Caloris surface, Class 3 craters were plotted onto the map, as well as all Class 3 plains. However, if P3 plains were adjacent to P2 units, and appeared continuous with other exposures of P3 material, the P2 unit was assumed to overlie the C3 and P3 material. The younger superposed craters were ignored with respect to the Class 3 surface. The boundaries of P3 materials were then continued under the superposed units, using a minimum of reasonable assumptions. For instance, if P2 and P4 plains were adjacent units, no P3 plains were presumed to lie under the P2 material. Similarly, all C3 craters were considered to have some deposits of impact melt after formation, even if they are mapped containing younger units. C3 craters which were superposed with younger units, C1 or C2 craters, and perhaps P2 plains, were redrawn as if later materials had not been emplaced, i.e., in their post-impact, pre-degradation states.

  20. Energetic map

    International Nuclear Information System (INIS)

    2012-01-01

    This report presents the energy map of Uruguay as well as the different systems that delimit political frontiers in the region. It covers the electrical system, electricity, oil and oil derivatives, natural gas, potential studies, biofuels, and wind and solar energy

  1. Necklace maps

    NARCIS (Netherlands)

    Speckmann, B.; Verbeek, K.A.B.

    2010-01-01

    Statistical data associated with geographic regions is nowadays globally available in large amounts and hence automated methods to visually display these data are in high demand. There are several well-established thematic map types for quantitative data on the ratio-scale associated with regions:

  2. Participatory maps

    DEFF Research Database (Denmark)

    Salovaara-Moring, Inka

    towards a new political ecology. This type of digital cartographies has been highlighted as the ‘processual turn’ in critical cartography, whereas in related computational journalism it can be seen as an interactive and iterative process of mapping complex and fragile ecological developments. This paper...

  3. Fluence map segmentation

    International Nuclear Information System (INIS)

    Rosenwald, J.-C.

    2008-01-01

    The lecture addressed the following topics: 'interpreting' the fluence map; the sequencer; reasons for differences between desired and actual fluence maps; the principle of 'step and shoot' segmentation; the large number of solutions for a given fluence map; optimizing 'step and shoot' segmentation; the interdigitation constraint; main algorithms; conclusions on segmentation algorithms (static mode); optimizing intensity levels and monitor units; sliding window sequencing; synchronization to avoid the tongue-and-groove effect; accounting for physical characteristics of the MLC; the importance of corrections for leaf transmission and offset; accounting for MLC mechanical constraints; the 'complexity' factor; incorporating the sequencing into the optimization algorithm; data transfer to the treatment machine; the interface between the record-and-verify system and the accelerator; and conclusions on fluence map segmentation (segmentation is part of the overall inverse planning procedure; 'step and shoot' and 'dynamic' options are available for most TPSs, depending on the accelerator model; the segmentation phase tends to come into the optimization loop; the physical characteristics of the MLC have a large influence on the final dose distribution; the IMRT plans (MU and relative dose distribution) must be carefully validated). (P.A.)

  4. NACP MsTMIP: Unified North American Soil Map

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: This data set provides soil maps for the United States (US) (including Alaska), Canada, Mexico, and a part of Guatemala. The map information content...

  5. NACP MsTMIP: Unified North American Soil Map

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set provides soil maps for the United States (US) (including Alaska), Canada, Mexico, and a part of Guatemala. The map information content includes maximum...

  6. Geologic map of the Murray Quadrangle, Newton County, Arkansas

    Science.gov (United States)

    Hudson, Mark R.; Turner, Kenzie J.

    2016-07-06

    This map summarizes the geology of the Murray quadrangle in the Ozark Plateaus region of northern Arkansas. Geologically, the area is on the southern flank of the Ozark dome, an uplift that has the oldest rocks exposed at its center, in Missouri. Physiographically, the Murray quadrangle is within the Boston Mountains, a high plateau region underlain by Pennsylvanian sandstones and shales. Valleys of the Buffalo River and Little Buffalo River and their tributaries expose an approximately 1,600-ft-thick (488-meter-thick) sequence of Ordovician, Mississippian, and Pennsylvanian carbonate and clastic sedimentary rocks that have been mildly deformed by a series of faults and folds. The Buffalo National River, a park encompassing the Buffalo River and adjacent land and administered by the National Park Service, is present at the northwestern edge of the quadrangle. Mapping for this study was carried out by field inspection of numerous sites and was compiled as a 1:24,000 geographic information system (GIS) database. Locations and elevations of sites were determined with the aid of a global positioning satellite receiver and a hand-held barometric altimeter that was frequently recalibrated at points of known elevation. Hill-shade relief and slope maps derived from a U.S. Geological Survey 10-meter digital elevation model, as well as orthophotographs, were used to help trace ledge-forming units between field traverses within the Upper Mississippian and Pennsylvanian part of the stratigraphic sequence. Strike and dip of beds were typically measured along stream drainages or at well-exposed ledges. Structure contours, constructed on the top of the Boone Formation and the base of a prominent sandstone unit within the Bloyd Formation, were drawn based on the elevations of field sites on these contacts as well as other limiting information for their minimum elevations above hilltops or their maximum elevations below valley bottoms.

  7. Decentralized Pricing in Minimum Cost Spanning Trees

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Moulin, Hervé; Østerdal, Lars Peter

    In the minimum cost spanning tree model we consider decentralized pricing rules, i.e. rules that cover at least the efficient cost while the price charged to each user only depends upon his own connection costs. We define a canonical pricing rule and provide two axiomatic characterizations. First......, the canonical pricing rule is the smallest among those that improve upon the Stand Alone bound, and are either superadditive or piece-wise linear in connection costs. Our second, direct characterization relies on two simple properties highlighting the special role of the source cost.

  8. Iterative Regularization with Minimum-Residual Methods

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg; Hansen, Per Christian

    2007-01-01

    subspaces. We provide a combination of theory and numerical examples, and our analysis confirms the experience that MINRES and MR-II can work as general regularization methods. We also demonstrate theoretically and experimentally that the same is not true, in general, for GMRES and RRGMRES their success......We study the regularization properties of iterative minimum-residual methods applied to discrete ill-posed problems. In these methods, the projection onto the underlying Krylov subspace acts as a regularizer, and the emphasis of this work is on the role played by the basis vectors of these Krylov...... as regularization methods is highly problem dependent....

  9. Iterative regularization with minimum-residual methods

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg; Hansen, Per Christian

    2006-01-01

    subspaces. We provide a combination of theory and numerical examples, and our analysis confirms the experience that MINRES and MR-II can work as general regularization methods. We also demonstrate theoretically and experimentally that the same is not true, in general, for GMRES and RRGMRES - their success......We study the regularization properties of iterative minimum-residual methods applied to discrete ill-posed problems. In these methods, the projection onto the underlying Krylov subspace acts as a regularizer, and the emphasis of this work is on the role played by the basis vectors of these Krylov...... as regularization methods is highly problem dependent....

  10. MAPPING INNOVATION

    DEFF Research Database (Denmark)

    Thuesen, Christian Langhoff; Koch, Christian

    2011-01-01

    By adopting a theoretical framework from strategic niche management research (SNM), this paper presents an analysis of the innovation system of the Danish construction industry. The analysis shows a multifaceted landscape of innovation around an existing regime, built around existing ways of working...... and developed over generations. The regime is challenged from various niches and the socio-technical landscape through trends such as globalization. Three niches (Lean Construction, BIM and System Deliveries) are subject to a detailed analysis showing partly incompatible rationales and various degrees of innovation...... potential. The paper further discusses how existing policymaking operates in a number of tensions, one being between government and governance. Based on the concepts from SNM, the paper introduces an innovation map in order to support the development of meta-governance policymaking. By mapping some...

  11. Mapping filmmaking

    DEFF Research Database (Denmark)

    Gilje, Øystein; Frølunde, Lisbeth; Lindstrand, Fredrik

    2010-01-01

    This chapter concerns mapping patterns in regards to how young filmmakers (age 15 – 20) in the Scandinavian countries learn about filmmaking. To uncover the patterns, we present portraits of four young filmmakers who participated in the Scandinavian research project Making a filmmaker. The focus ...... is on their learning practices and how they create ‘learning paths’ in relation to resources in diverse learning contexts, whether formal, non-formal and informal contexts.

  12. Use of linkage mapping and centrality analysis across habitat gradients to conserve connectivity of gray wolf populations in western North America.

    Science.gov (United States)

    Carroll, Carlos; McRae, Brad H; Brookes, Allen

    2012-02-01

    Centrality metrics evaluate paths between all possible pairwise combinations of sites on a landscape to rank the contribution of each site to facilitating ecological flows across the network of sites. Computational advances now allow application of centrality metrics to landscapes represented as continuous gradients of habitat quality. This avoids the binary classification of landscapes into patch and matrix required by patch-based graph analyses of connectivity. It also avoids the focus on delineating paths between individual pairs of core areas characteristic of most corridor- or linkage-mapping methods of connectivity analysis. Conservation of regional habitat connectivity has the potential to facilitate recovery of the gray wolf (Canis lupus), a species currently recolonizing portions of its historic range in the western United States. We applied 3 contrasting linkage-mapping methods (shortest path, current flow, and minimum-cost-maximum-flow) to spatial data representing wolf habitat to analyze connectivity between wolf populations in central Idaho and Yellowstone National Park (Wyoming). We then applied 3 analogous betweenness centrality metrics to analyze connectivity of wolf habitat throughout the northwestern United States and southwestern Canada to determine where it might be possible to facilitate range expansion and interpopulation dispersal. We developed software to facilitate application of centrality metrics. Shortest-path betweenness centrality identified a minimal network of linkages analogous to those identified by least-cost-path corridor mapping. Current flow and minimum-cost-maximum-flow betweenness centrality identified diffuse networks that included alternative linkages, which will allow greater flexibility in planning. Minimum-cost-maximum-flow betweenness centrality, by integrating both land cost and habitat capacity, allows connectivity to be considered within planning processes that seek to maximize species protection at minimum cost
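The study's software is not reproduced in the record; as a hedged sketch of the simplest of the three metrics it applies, here is Brandes' algorithm for shortest-path betweenness centrality on an unweighted graph (the current-flow and minimum-cost-maximum-flow variants require considerably more machinery). The adjacency-dict representation and function name are illustrative.

```python
from collections import deque

def betweenness(adj):
    """Brandes' algorithm: unweighted shortest-path betweenness centrality.
    adj maps each node to an iterable of its neighbors."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        stack = []
        pred = {v: [] for v in adj}           # predecessors on shortest paths
        sigma = {v: 0 for v in adj}; sigma[s] = 1   # shortest-path counts
        dist = {v: -1 for v in adj}; dist[s] = 0
        q = deque([s])
        while q:                               # BFS from s
            v = q.popleft()
            stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    pred[w].append(v)
        delta = {v: 0.0 for v in adj}
        while stack:                           # accumulate dependencies
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc  # for undirected graphs, each pair is counted in both directions
```

On a habitat grid this would be run over a graph whose edge structure encodes movement cost; here each unordered pair contributes twice, so values can be halved for the conventional undirected normalization.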

  13. Feedback brake distribution control for minimum pitch

    Science.gov (United States)

    Tavernini, Davide; Velenis, Efstathios; Longo, Stefano

    2017-06-01

    The distribution of brake forces between the front and rear axles of a vehicle is typically specified such that the same brake force coefficient is imposed at both front and rear wheels. This condition is known as the 'ideal' distribution, and it is required to deliver the maximum vehicle deceleration and minimum braking distance. For subcritical braking conditions, the same deceleration demand may be delivered by different distributions of front and rear braking forces. In this research we show how to obtain the optimal distribution that minimises the pitch angle of the vehicle and hence enhances the driver's subjective feel during braking. A vehicle model including suspension geometry features is adopted. The problem of the minimum-pitch brake distribution for a varying deceleration demand is solved by means of a model predictive control (MPC) technique. To address the problem of the undesirable pitch rebound caused by a full stop of the vehicle, a second controller is designed and implemented independently of the braking distribution in use. An extended Kalman filter is designed for state estimation and implemented in a high-fidelity environment together with the MPC strategy. The proposed solution is compared with the reference 'ideal' distribution as well as a previous feed-forward solution.
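The 'ideal' distribution mentioned above can be sketched numerically: if both axles operate at the same brake force coefficient a/g, each axle's brake force is proportional to its dynamic vertical load, including longitudinal weight transfer. All vehicle parameters below are hypothetical, and this is only the baseline concept, not the paper's MPC controller.

```python
def ideal_brake_split(m, a, L, b, h, g=9.81):
    """Front/rear brake forces for the 'ideal' distribution: each axle is
    braked in proportion to its dynamic vertical load, so both axles run
    at the same brake force coefficient a/g.
    m: vehicle mass [kg]; a: deceleration [m/s^2]; L: wheelbase [m];
    b: CG-to-rear-axle distance [m]; h: CG height [m]."""
    Fz_front = m * g * b / L + m * a * h / L        # static load + weight transfer
    Fz_rear = m * g * (L - b) / L - m * a * h / L   # static load - weight transfer
    k = a / g                                       # common brake force coefficient
    return k * Fz_front, k * Fz_rear

# hypothetical mid-size car braking at 0.5 g
Ff, Fr = ideal_brake_split(m=1500, a=0.5 * 9.81, L=2.7, b=1.35, h=0.55)
```

The weight-transfer terms cancel in the sum, so total brake force always equals m·a; only the front/rear split changes with deceleration, which is why subcritical demands admit alternative distributions.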

  14. Application of Remote Sensing in Geological Mapping, Case Study al Maghrabah Area - Hajjah Region, Yemen

    Science.gov (United States)

    Al-Nahmi, F.; Saddiqi, O.; Hilali, A.; Rhinane, H.; Baidder, L.; El arabi, H.; Khanbari, K.

    2017-11-01

    Remote sensing technology plays an important role today in geological survey, mapping, analysis and interpretation, providing a unique opportunity to investigate the geological characteristics of remote areas of the earth's surface without the need to gain access to the area on the ground. The aim of this study is to produce a geological map of the study area. The data used are Sentinel-2 imagery. Three processes were applied in this study. The Optimum Index Factor (OIF) is a statistic that can be used to select the optimum combination of three bands in a satellite image; it is based on the total variance within bands and the correlation coefficients between bands. Independent component analysis (ICA; bands 3, 4, 6) is a statistical and computational technique for revealing hidden factors that underlie sets of random variables, measurements, or signals. Minimum Noise Fraction (MNF; bands 1, 2, 3) is used to determine the inherent dimensionality of image data, to segregate noise in the data, and to reduce the computational requirements for subsequent processing. The Optimum Index Factor proved a good method for choosing the best bands for lithological mapping, while ICA and MNF are also practical ways to extract structural geology maps. The results in this paper indicate that the studied area can be divided into four main geological units: basement rocks (meta-volcanics, meta-sediments), sedimentary rocks, intrusive rocks, and volcanic rocks. The method used in this study offers great potential for lithological mapping using Sentinel-2 imagery; the results were compared with existing geologic maps, found to be superior, and could be used to update the existing maps.
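The OIF band-selection step can be sketched as follows. The formula (sum of the three band standard deviations divided by the sum of the absolute pairwise correlation coefficients) is the standard definition of the Optimum Index Factor; the bands below are synthetic stand-ins, not the study's Sentinel-2 data.

```python
import numpy as np
from itertools import combinations

def oif_ranking(bands):
    """Rank all 3-band combinations by the Optimum Index Factor:
    OIF = (sum of band std devs) / (sum of |pairwise correlations|).
    `bands` is {name: 2-D array}; higher OIF = more independent information."""
    names = list(bands)
    flat = {n: bands[n].ravel() for n in names}
    scores = {}
    for trio in combinations(names, 3):
        stds = sum(flat[n].std() for n in trio)
        corrs = sum(abs(np.corrcoef(flat[a], flat[b])[0, 1])
                    for a, b in combinations(trio, 2))
        scores[trio] = stds / corrs
    return sorted(scores.items(), key=lambda kv: -kv[1])

# toy "image": B3 is nearly a copy of B2, while B4 and B6 are independent
rng = np.random.default_rng(0)
b = rng.normal(100, 20, (50, 50))
bands = {"B2": b,
         "B3": b + rng.normal(0, 1, (50, 50)),
         "B4": rng.normal(80, 15, (50, 50)),
         "B6": rng.normal(60, 30, (50, 50))}
ranking = oif_ranking(bands)
```

Because B2 and B3 are almost perfectly correlated, any trio containing both is penalized by the large correlation term, so the top-ranked combination avoids that redundant pair.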

  15. Northern Circumpolar Soils Map, Version 1

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set consists of a circumpolar map of dominant soil characteristics, with a scale of 1:10,000,000, covering the United States, Canada, Greenland, Iceland,...

  16. Do minimum wages reduce poverty? Evidence from Central America ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2010-12-16

    Dec 16, 2010 ... Raising minimum wages has traditionally been considered a way to protect poor ... However, the effect of raising minimum wages remains an empirical question ...

  17. Mapping Resilience

    DEFF Research Database (Denmark)

    Carruth, Susan

    2015-01-01

    ... the relationship between resilience and energy planning, suggesting that planning in, and with, time is a core necessity in this domain. It then reviews four examples of graphically mapping with time, highlighting some of the key challenges, before tentatively proposing a graphical language to be employed by planners when aiming to construct resilient energy plans. It concludes that a graphical language has the potential to be a significant tool, flexibly facilitating cross-disciplinary communication and decision-making, while emphasising that its role is to support imaginative, resilient planning rather than ...

  18. 30 CFR 56.19021 - Minimum rope strength.

    Science.gov (United States)

    2010-07-01

    ... feet: Minimum Value = Static Load × (7.0 − 0.001L). For rope lengths 3,000 feet or greater: Minimum Value = Static Load × 4.0. (b) Friction drum ropes. For rope lengths less than 4,000 feet: Minimum Value = Static Load × (7.0 − 0.0005L). For rope lengths 4,000 feet or greater: Minimum Value = Static Load × 5.0. (c) Tail ropes...
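The tabulated formulas translate into a small calculator. The leading clause of the quoted excerpt is truncated, so the first case is assumed (by symmetry with paragraph (b)) to cover drum ropes shorter than 3,000 feet; the function name and `rope_type` labels are mine, not the regulation's.

```python
def minimum_rope_value(static_load, rope_length_ft, rope_type):
    """Minimum rope strength per the 30 CFR 56.19021 formulas quoted above.
    rope_type: 'drum' (assumed label for the truncated first clause)
    or 'friction' (friction drum ropes). L is rope length in feet."""
    L = rope_length_ft
    if rope_type == "drum":
        factor = (7.0 - 0.001 * L) if L < 3000 else 4.0
    elif rope_type == "friction":
        factor = (7.0 - 0.0005 * L) if L < 4000 else 5.0
    else:
        raise ValueError("rope_type must be 'drum' or 'friction'")
    return static_load * factor
```

Note that each sliding factor meets its fixed floor exactly at the cutoff length (7.0 − 0.001·3000 = 4.0; 7.0 − 0.0005·4000 = 5.0), so the safety factor decreases continuously with rope length.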

  19. Does increasing the minimum wage reduce poverty in developing countries?

    OpenAIRE

    Gindling, T. H.

    2014-01-01

    Do minimum wage policies reduce poverty in developing countries? It depends. Raising the minimum wage could increase or decrease poverty, depending on labor market characteristics. Minimum wages target formal sector workers—a minority of workers in most developing countries—many of whom do not live in poor households. Whether raising minimum wages reduces poverty depends not only on whether formal sector workers lose jobs as a result, but also on whether low-wage workers live in poor househol...

  20. On a Minimum Problem in Smectic Elastomers

    International Nuclear Information System (INIS)

    Buonsanti, Michele; Giovine, Pasquale

    2008-01-01

    Smectic elastomers are layered materials exhibiting a solid-like elastic response along the layer normal and a rubbery one in the plane. Balance equations for smectic elastomers are derived from the general theory of continua with constrained microstructure. In this work we investigate a very simple minimum problem based on multi-well potentials in which the microstructure is taken into account. The set of polymeric strains minimizing the elastic energy contains a one-parameter family of simple strains associated with a micro-variation of the degree of freedom. We develop the energy functional as two terms, the first nematic and the second accounting for the tilting phenomenon; then, working within the rubber-elasticity framework, we minimize over the tilt rotation angle and extract the engineering stress.

  1. Image Segmentation Using Minimum Spanning Tree

    Science.gov (United States)

    Dewi, M. P.; Armiati, A.; Alvini, S.

    2018-04-01

    This research aims to segment digital images. Segmentation separates an object from its background so that the main object can be processed for other purposes. Along with the development of digital image processing applications, the segmentation process becomes increasingly necessary, and the segmented image must be accurate because subsequent processing must interpret the information in the image. This article discusses the application of the minimum spanning tree of a graph to the segmentation of digital images. The method is able to separate an object from the background, converting the image to a binary image: the object of interest is set to white while the background is black, or vice versa.
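A minimal sketch of the minimum-spanning-tree idea: treat pixels as graph nodes, weight 4-neighbour edges by intensity difference, and run Kruskal's algorithm with union-find, skipping edges heavier than a threshold so the tree is cut at large intensity jumps. The paper's exact procedure is not specified in the excerpt, so treat this as one common variant.

```python
def mst_segment(img, thresh):
    """Segment a 2-D grayscale image (list of lists) by Kruskal's MST
    algorithm on the 4-neighbour pixel graph, refusing edges heavier
    than `thresh` (cutting the MST at object boundaries).
    Returns a component label for every pixel."""
    h, w = len(img), len(img[0])
    parent = list(range(h * w))

    def find(x):                      # union-find root with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    edges = []
    for y in range(h):
        for x in range(w):
            if x + 1 < w:             # horizontal neighbour
                edges.append((abs(img[y][x] - img[y][x + 1]),
                              y * w + x, y * w + x + 1))
            if y + 1 < h:             # vertical neighbour
                edges.append((abs(img[y][x] - img[y + 1][x]),
                              y * w + x, (y + 1) * w + x))
    for wgt, a, b in sorted(edges):   # Kruskal: lightest edges first
        if wgt > thresh:
            break                     # heavier edges would merge across boundaries
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb
    return [[find(y * w + x) for x in range(w)] for y in range(h)]

# bright square object on a dark background
img = [[10, 10, 10, 10],
       [10, 200, 200, 10],
       [10, 200, 200, 10],
       [10, 10, 10, 10]]
labels = mst_segment(img, thresh=50)
```

The object pixels end up in one component and the background in another; relabelling the two components white and black yields the binary image described in the abstract.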

  2. Statistical physics when the minimum temperature is not absolute zero

    Science.gov (United States)

    Chung, Won Sang; Hassanabadi, Hassan

    2018-04-01

    In this paper, a nonzero minimum temperature is considered, based on the third law of thermodynamics and the existence of a minimal momentum. From the assumption of a nonzero positive minimum temperature in nature, we deform the definitions of some thermodynamical quantities and investigate nonzero-minimum-temperature corrections to well-known thermodynamical problems.

  3. 12 CFR 564.4 - Minimum appraisal standards.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Minimum appraisal standards. 564.4 Section 564.4 Banks and Banking OFFICE OF THRIFT SUPERVISION, DEPARTMENT OF THE TREASURY APPRAISALS § 564.4 Minimum appraisal standards. For federally related transactions, all appraisals shall, at a minimum: (a...

  4. 29 CFR 505.3 - Prevailing minimum compensation.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Prevailing minimum compensation. 505.3 Section 505.3 Labor... HUMANITIES § 505.3 Prevailing minimum compensation. (a)(1) In the absence of an alternative determination...)(2) of this section, the prevailing minimum compensation required to be paid under the Act to the...

  5. An Empirical Analysis of the Relationship between Minimum Wage ...

    African Journals Online (AJOL)

    An Empirical Analysis of the Relationship between Minimum Wage, Investment and Economic Growth in Ghana. ... In addition, the ratio of public investment to tax revenue must increase as minimum wage increases since such complementary changes are more likely to lead to economic growth. Keywords: minimum wage ...

  6. Minimum Covers of Fixed Cardinality in Weighted Graphs.

    Science.gov (United States)

    White, Lee J.

    Reported is the result of research on combinatorial and algorithmic techniques for information processing. A method is discussed for obtaining minimum covers of specified cardinality from a given weighted graph. By the indicated method, it is shown that the family of minimum covers of varying cardinality is related to the minimum spanning tree of…

  7. Minimum Price Guarantees In a Consumer Search Model

    NARCIS (Netherlands)

    M.C.W. Janssen (Maarten); A. Parakhonyak (Alexei)

    2009-01-01

    textabstractThis paper is the first to examine the effect of minimum price guarantees in a sequential search model. Minimum price guarantees are not advertised and only known to consumers when they come to the shop. We show that in such an environment, minimum price guarantees increase the value of

  8. Employment Effects of Minimum and Subminimum Wages. Recent Evidence.

    Science.gov (United States)

    Neumark, David

    Using a specially constructed panel data set on state minimum wage laws and labor market conditions, Neumark and Wascher (1992) presented evidence that countered the claim that minimum wages could be raised with no cost to employment. They concluded that estimates indicating that minimum wages reduced employment on the order of 1-2 percent for a…

  9. Minimum Wages and Skill Acquisition: Another Look at Schooling Effects.

    Science.gov (United States)

    Neumark, David; Wascher, William

    2003-01-01

    Examines the effects of minimum wage on schooling, seeking to reconcile some of the contradictory results in recent research using Current Population Survey data from the late 1970s through the 1980s. Findings point to negative effects of minimum wages on school enrollment, bolstering the findings of negative effects of minimum wages on enrollment…

  10. Minimum Wage Effects on Educational Enrollments in New Zealand

    Science.gov (United States)

    Pacheco, Gail A.; Cruickshank, Amy A.

    2007-01-01

    This paper empirically examines the impact of minimum wages on educational enrollments in New Zealand. A significant reform to the youth minimum wage since 2000 has resulted in some age groups undergoing a 91% rise in their real minimum wage over the last 10 years. Three panel least squares multivariate models are estimated from a national sample…

  11. 41 CFR 50-201.1101 - Minimum wages.

    Science.gov (United States)

    2010-07-01

    ... 41 Public Contracts and Property Management 1 2010-07-01 2010-07-01 true Minimum wages. 50-201... Contracts PUBLIC CONTRACTS, DEPARTMENT OF LABOR 201-GENERAL REGULATIONS § 50-201.1101 Minimum wages. Determinations of prevailing minimum wages or changes therein will be published in the Federal Register by the...

  12. 29 CFR 4.159 - General minimum wage.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 1 2010-07-01 2010-07-01 true General minimum wage. 4.159 Section 4.159 Labor Office of... General minimum wage. The Act, in section 2(b)(1), provides generally that no contractor or subcontractor... a contract less than the minimum wage specified under section 6(a)(1) of the Fair Labor Standards...

  13. 29 CFR 783.43 - Computation of seaman's minimum wage.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Computation of seaman's minimum wage. 783.43 Section 783.43...'s minimum wage. Section 6(b) requires, under paragraph (2) of the subsection, that an employee...'s minimum wage requirements by reason of the 1961 Amendments (see §§ 783.23 and 783.26). Although...

  14. 24 CFR 891.145 - Owner deposit (Minimum Capital Investment).

    Science.gov (United States)

    2010-04-01

    ... General Program Requirements § 891.145 Owner deposit (Minimum Capital Investment). As a Minimum Capital... Investment shall be one-half of one percent (0.5%) of the HUD-approved capital advance, not to exceed $25,000. ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Owner deposit (Minimum Capital...

  15. 12 CFR 931.3 - Minimum investment in capital stock.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Minimum investment in capital stock. 931.3... CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL STOCK § 931.3 Minimum investment in capital stock. (a) A Bank shall require each member to maintain a minimum investment in the capital stock of the Bank, both...

  16. 9 CFR 147.51 - Authorized laboratory minimum requirements.

    Science.gov (United States)

    2010-01-01

    ... 9 Animals and Animal Products 1 2010-01-01 2010-01-01 false Authorized laboratory minimum requirements. 147.51 Section 147.51 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE... Authorized Laboratories and Approved Tests § 147.51 Authorized laboratory minimum requirements. These minimum...

  17. Mapping of

    Directory of Open Access Journals (Sweden)

    Sayed M. Arafat

    2014-06-01

    A land cover map of North Sinai was produced based on the FAO Land Cover Classification System (LCCS) of 2004. The standard FAO classification scheme provides a standardized system of classification that can be used to analyze spatial and temporal land cover variability in the study area. This approach also has the advantage of facilitating the integration of Sinai land cover mapping products into regional and global land cover datasets. The study area covers a total of 20,310.4 km2 (2,031,040 hectares). The landscape classification was based on SPOT4 data acquired in 2011, using combined multispectral bands at 20 m spatial resolution. A Geographic Information System (GIS) was used to manipulate the attributed classification layers in order to reach the maximum possible accuracy, and to include all necessary information. The vegetative land cover classes identified in the study area are irrigated herbaceous crops, irrigated tree crops and rain-fed tree crops. The non-vegetated land covers in the study area include bare rock, bare soils (stony, very stony and salt crusts), loose and shifting sands, and sand dunes. Water bodies were classified as artificial perennial water bodies (fish ponds and irrigation canals) and natural perennial water bodies such as lakes (standing water). The artificial surfaces include linear and non-linear features.

  18. The minimum sit-to-stand height test: reliability, responsiveness and relationship to leg muscle strength.

    Science.gov (United States)

    Schurr, Karl; Sherrington, Catherine; Wallbank, Geraldine; Pamphlett, Patricia; Olivetti, Lynette

    2012-07-01

    To determine the reliability of the minimum sit-to-stand height test, its responsiveness and its relationship to leg muscle strength among rehabilitation unit inpatients and outpatients. Reliability study using two measurers and two test occasions. Secondary analysis of data from two clinical trials. Inpatient and outpatient rehabilitation services in three public hospitals. Eighteen hospital patients and five others participated in the reliability study. Seventy-two rehabilitation unit inpatients and 80 outpatients participated in the clinical trials. The minimum sit-to-stand height test was assessed using a standard procedure. For the reliability study, a second tester repeated the minimum sit-to-stand height test on the same day. In the inpatient clinical trial the measures were repeated two weeks later. In the outpatient trial the measures were repeated five weeks later. Knee extensor muscle strength was assessed in the clinical trials using a hand-held dynamometer. The reliability for the minimum sit-to-stand height test was excellent (intraclass correlation coefficient (ICC) 0.91, 95% confidence interval (CI) 0.81-0.96). The standard error of measurement was 34 mm. Responsiveness was moderate in the inpatient trial (effect size: 0.53) but small in the outpatient trial (effect size: 0.16). A small proportion (8-17%) of variability in minimum sit-to-stand height test was explained by knee extensor muscle strength. The minimum sit-to-stand height test has excellent reliability and moderate responsiveness in an inpatient rehabilitation setting. Responsiveness in an outpatient rehabilitation setting requires further investigation. Performance is influenced by factors other than knee extensor muscle strength.

  19. Geologic map of the Nepenthes Planum Region, Mars

    Science.gov (United States)

    Skinner, James A.; Tanaka, Kenneth L.

    2018-03-26

    This map product contains a map sheet at 1:1,506,000 scale that shows the geology of the Nepenthes Planum region of Mars, which is located between the cratered highlands that dominate the southern hemisphere and the less-cratered sedimentary plains that dominate the northern hemisphere.  The map region contains cone- and mound-shaped landforms as well as lobate materials that are morphologically similar to terrestrial igneous or mud vents and flows. This map is part of an informal series of small-scale (large-area) maps aimed at refining current understanding of the geologic units and structures that make up the highland-to-lowland transition zone. The map base consists of a controlled Thermal Emission Imaging System (THEMIS) daytime infrared image mosaic (100 meters per pixel resolution) supplemented by a Mars Orbiter Laser Altimeter (MOLA) digital elevation model (463 meters per pixel resolution). The map includes a Description of Map Units and a Correlation of Map Units that describes and correlates units identified across the entire map region. The geologic map was assembled using ArcGIS software by Environmental Systems Research Institute (http://www.esri.com). The ArcGIS project, geodatabase, base map, and all map components are included online as supplemental data.

  20. Minimum relative entropy, Bayes and Kapur

    Science.gov (United States)

    Woodbury, Allan D.

    2011-04-01

    The focus of this paper is to illustrate important philosophies on inversion and the similarities and differences between Bayesian and minimum relative entropy (MRE) methods. The development of each approach is illustrated through the general discrete linear inverse problem. MRE differs from both Bayes and classical statistical methods in that knowledge of moments is used as 'data' rather than sample values. MRE, like Bayes, presumes knowledge of a prior probability distribution and produces the posterior pdf itself. MRE attempts to produce this pdf based on the information provided by new moments. It will use moments of the prior distribution only if new data on these moments are not available. It is important to note that MRE makes a strong statement that the imposed constraints are exact and complete. In this way, MRE is maximally uncommitted with respect to unknown information. In general, since input data are known only to within a certain accuracy, it is important that any inversion method allow for errors in the measured data. The MRE approach can accommodate such uncertainty and, in new work described here, previous results are modified to include a Gaussian prior. A variety of MRE solutions are reproduced under a number of assumed moments, including second-order central moments. Various solutions of Jacobs & van der Geest were repeated and clarified. Menke's weighted minimum length solution was shown to have a basis in information theory, and the classic least-squares estimate is shown to be a solution to MRE under the conditions of more data than unknowns and where we utilize the observed data and their associated noise. An example inverse problem involving a gravity survey over a layered and faulted zone is shown. In all cases the inverse results match quite closely the actual density profile, at least in the upper portions of the profile. The similar results to Bayes presented in are a reflection of the fact that the MRE posterior pdf, and its mean
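A toy illustration of the MRE update described above (not the paper's general discrete linear inverse): for a discrete prior and a single new mean constraint, the distribution minimizing the relative entropy D(q‖p) has the exponential-family form q_i ∝ p_i·exp(λx_i), with the Lagrange multiplier λ found here by bisection.

```python
import math

def mre_posterior(prior, x, target_mean, tol=1e-12):
    """Minimum-relative-entropy update of a discrete prior p subject to
    one new moment constraint E[x] = target_mean. The minimiser of
    D(q || p) is the exponential tilt q_i ∝ p_i * exp(lam * x_i);
    lam is located by bisection (the tilted mean is monotone in lam)."""
    def mean_for(lam):
        wts = [p * math.exp(lam * xi) for p, xi in zip(prior, x)]
        z = sum(wts)
        return sum(w * xi for w, xi in zip(wts, x)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    wts = [p * math.exp(lam * xi) for p, xi in zip(prior, x)]
    z = sum(wts)
    return [w / z for w in wts]

# uniform prior on {0, 1, 2, 3}; impose a mean of 2.2 as the new "data"
q = mre_posterior([0.25] * 4, [0, 1, 2, 3], 2.2)
```

With no new moment information the tilt is λ = 0 and the prior is returned unchanged, which matches the abstract's remark that MRE falls back on the prior's moments only when no new data on them are available.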

  1. Minimum Bias Measurements at the LHC

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00022031; The ATLAS collaboration

    2016-01-01

    Inclusive charged particle measurements at hadron colliders probe the low-energy nonperturbative region of QCD. Pseudorapidity distributions of charged-particles produced in pp collisions at 13 TeV have been measured by the CMS experiment. The ATLAS collaboration has measured the inclusive charged particle multiplicity and its dependence on transverse momentum and pseudorapidity in special data sets with low LHC beam current, recorded at a center-of-mass energy of 13 TeV. The measurements present the first detailed studies in inclusive phase spaces with a minimum transverse momentum of 100 MeV and 500 MeV. The distribution of electromagnetic and hadronic energy in the very forward phase-space has been measured with the CASTOR calorimeters located at a pseudorapidity of -5.2 to -6.6 in the very forward region of CMS. The energy distributions are very powerful benchmarks to study the performance of MPI in hadronic interactions models at 13 TeV collision energy. All measurements are compared with predictions of ...

  2. Topside measurements at Jicamarca during solar minimum

    Directory of Open Access Journals (Sweden)

    D. L. Hysell

    2009-01-01

    Long-pulse topside radar data acquired at Jicamarca and processed using full-profile analysis are compared to data processed using more conventional, range-gated approaches and with analytic and computational models. The salient features of the topside observations include a dramatic increase in the Te/Ti temperature ratio above the F peak at dawn and a local minimum in the topside plasma temperature in the afternoon. The hydrogen ion fraction was found to exhibit hyperbolic tangent-shaped profiles that become shallow (gradually changing) above the O+-H+ transition height during the day. The profile shapes are generally consistent with diffusive equilibrium, although shallowing to the point of changes in inflection can only be accounted for by taking the effects of E×B drifts and meridional winds into account. The SAMI2 model demonstrates this, as well as the substantial effect that drifts and winds can have on topside temperatures. Significant quiet-time variability in the topside composition and temperatures may be due to variability in the mechanical forcing. Correlations between topside measurements and magnetometer data at Jicamarca support this hypothesis.

  4. Designing from minimum to optimum functionality

    Science.gov (United States)

    Bannova, Olga; Bell, Larry

    2011-04-01

    This paper discusses a multifaceted strategy to link NASA Minimal Functionality Habitable Element (MFHE) requirements to a compatible growth plan, leading forward to evolutionary, deployable habitats, including outpost development stages. The discussion begins by reviewing fundamental geometric features inherent in small-scale, vertical and horizontal, pressurized module configuration options to characterize their applicability to stringent MFHE constraints. A scenario is then proposed to incorporate a vertical-core MFHE concept into an expanded architecture, providing continuity of structural form and a logical path from "minimum" to "optimum" design of a habitable module. The paper describes how habitation and logistics accommodations could be pre-integrated into a common Hab/Log Module that serves both habitation and logistics functions. This is offered as a means to reduce redundant development costs and to avoid EVA-intensive on-site adaptation and retrofitting requirements for augmented crew capacity. An evolutionary version of the hard-shell Hab/Log design would have an expandable middle section to afford larger living and working accommodations. In conclusion, the paper illustrates that a number of cargo missions referenced in NASA's 4.0.0 Lunar Campaign Scenario could be eliminated altogether to expedite progress and reduce budgets. The plan concludes with a vertical growth geometry that provides versatile and efficient site development opportunities using a combination of hard Hab/Log modules and a hybrid expandable "CLAM" (Crew Lunar Accommodations Module) element.

  5. Minimum nonuniform graph partitioning with unrelated weights

    Science.gov (United States)

    Makarychev, K. S.; Makarychev, Yu S.

    2017-12-01

    We give a bi-criteria approximation algorithm for the Minimum Nonuniform Graph Partitioning problem, recently introduced by Krauthgamer, Naor, Schwartz and Talwar. In this problem, we are given a graph G = (V, E) and k numbers ρ_1, …, ρ_k. The goal is to partition V into k disjoint sets (bins) P_1, …, P_k satisfying |P_i| ≤ ρ_i|V| for all i, so as to minimize the number of edges cut by the partition. Our bi-criteria algorithm gives an O(√(log|V| · log k)) approximation for the objective function in general graphs and an O(1) approximation in graphs excluding a fixed minor. The approximate solution satisfies the relaxed capacity constraints |P_i| ≤ (5 + ε)ρ_i|V|. This algorithm is an improvement upon the O(log|V|)-approximation algorithm by Krauthgamer, Naor, Schwartz and Talwar. We extend our results to the case of 'unrelated weights' and to the case of 'unrelated d-dimensional weights'. A preliminary version of this work was presented at the 41st International Colloquium on Automata, Languages and Programming (ICALP 2014). Bibliography: 7 titles.

  6. Isoflurane minimum alveolar concentration reduction by fentanyl.

    Science.gov (United States)

    McEwan, A I; Smith, C; Dyar, O; Goodman, D; Smith, L R; Glass, P S

    1993-05-01

    Isoflurane is commonly combined with fentanyl during anesthesia. Because of hysteresis between plasma and effect site, bolus administration of fentanyl does not accurately describe the interaction between these drugs. The purpose of this study was to determine the MAC reduction of isoflurane by fentanyl when both drugs had reached steady biophase concentrations. Seventy-seven patients were randomly allocated to receive either no fentanyl or fentanyl at several predetermined plasma concentrations. Fentanyl was administered using a computer-assisted continuous infusion device. Patients were also randomly allocated to receive a predetermined steady state end-tidal concentration of isoflurane. Blood samples for fentanyl concentration were taken at 10 min after initiation of the infusion and before and immediately after skin incision. A minimum of 20 min was allowed between the start of the fentanyl infusion and skin incision. The reduction in the MAC of isoflurane by the measured fentanyl concentration was calculated using a maximum likelihood solution to a logistic regression model. There was an initial steep reduction in the MAC of isoflurane by fentanyl, with 3 ng/ml resulting in a 63% MAC reduction. A ceiling effect was observed with 10 ng/ml providing only a further 19% reduction in MAC. A 50% decrease in MAC was produced by a fentanyl concentration of 1.67 ng/ml. Defining the MAC reduction of isoflurane by all the opioids allows their more rational administration with inhalational anesthetics and provides a comparison of their relative anesthetic potencies.

  7. Islands of biogeodiversity in arid lands on a polygons map study: Detecting scale invariance patterns from natural resources maps.

    Science.gov (United States)

    Ibáñez, J J; Pérez-Gómez, R; Brevik, Eric C; Cerdà, A

    2016-12-15

    Many maps (geology, hydrology, soil, vegetation, etc.) are created to inventory natural resources. Each of these resources is mapped using a unique set of criteria, including scales and taxonomies. Past research indicates that comparing results of related maps (e.g., soil and geology maps) may aid in identifying mapping deficiencies. Therefore, this study was undertaken in Almeria Province, Spain to (i) compare the underlying map structures of soil and vegetation maps and (ii) investigate whether a vegetation map can provide useful soil information that was not shown on a soil map. Soil and vegetation maps were imported into ArcGIS 10.1 for spatial analysis, and results then exported to Microsoft Excel worksheets for statistical analyses to evaluate fits to linear and power law regression models. Vegetative units were grouped according to the driving forces that determined their presence or absence: (i) climatophilous; (ii) lithologic-climate; and (iii) edaphophylous. The rank abundance plots for both the soil and vegetation maps conformed to Willis or Hollow Curves, meaning the underlying structures of both maps were the same. Edaphophylous map units, which represent 58.5% of the vegetation units in the study area, did not show a good correlation with the soil map. Further investigation revealed that 87% of the edaphohygrophilous units were found in ramblas, ephemeral riverbeds that are not typically classified and mapped as soils in modern systems, even though they meet the definition of soil given by the most commonly used and most modern soil taxonomic systems. Furthermore, these edaphophylous map units tend to be islands of biodiversity that are threatened by anthropogenic activity in the region. Therefore, this study revealed areas that need to be revisited and studied pedologically. The vegetation mapped in these areas and the soils that support it are key components of the earth's critical zone that must be studied, understood, and preserved.

  8. A Phosphate Minimum in the Oxygen Minimum Zone (OMZ) off Peru

    Science.gov (United States)

    Paulmier, A.; Giraud, M.; Sudre, J.; Jonca, J.; Leon, V.; Moron, O.; Dewitte, B.; Lavik, G.; Grasse, P.; Frank, M.; Stramma, L.; Garcon, V.

    2016-02-01

    The Oxygen Minimum Zone (OMZ) off Peru is known to be associated with the advection of Equatorial SubSurface Waters (ESSW), rich in nutrients and poor in oxygen, through the Peru-Chile UnderCurrent (PCUC), but this circulation remains to be refined within the OMZ. During the Pelágico cruise in November-December 2010, measurements of phosphate revealed the presence of a phosphate minimum (Pmin) in various hydrographic stations, which could not be explained so far and could be associated with a specific water mass. This Pmin, localized in a relatively constant layer, corresponds to a mean vertical phosphate decrease of 0.6 µM, though it is highly variable (between 0.1 and 2.2 µM). On average, these Pmin are associated with a predominant mixing of SubTropical Under- and Surface Waters (STUW and STSW: ~20 and ~40%, respectively) within ESSW (~25%), complemented evenly by overlying (ESW, TSW: ~8%) and underlying waters (AAIW, SPDW: ~7%). The hypotheses and mechanisms leading to the Pmin formation in the OMZ are further explored and discussed, considering the physical regional contribution associated with various circulation pathways ventilating the OMZ and the local biogeochemical contribution including the potential diazotrophic activity.

  9. Global-scale high-resolution (~1 km) modelling of mean, maximum and minimum annual streamflow

    Science.gov (United States)

    Barbarossa, Valerio; Huijbregts, Mark; Hendriks, Jan; Beusen, Arthur; Clavreul, Julie; King, Henry; Schipper, Aafke

    2017-04-01

    Quantifying mean, maximum and minimum annual flow (AF) of rivers at ungauged sites is essential for a number of applications, including assessments of global water supply, ecosystem integrity and water footprints. AF metrics can be quantified with spatially explicit process-based models, which might be overly time-consuming and data-intensive for this purpose, or with empirical regression models that predict AF metrics based on climate and catchment characteristics. Yet, so far, regression models have mostly been developed at a regional scale and the extent to which they can be extrapolated to other regions is not known. We developed global-scale regression models that quantify mean, maximum and minimum AF as a function of catchment area and catchment-averaged slope, elevation, and mean, maximum and minimum annual precipitation and air temperature. We then used these models to obtain global 30 arc-seconds (~1 km) maps of mean, maximum and minimum AF for each year from 1960 through 2015, based on a newly developed hydrologically conditioned digital elevation model. We calibrated our regression models based on observations of discharge and catchment characteristics from about 4,000 catchments worldwide, ranging from 10⁰ to 10⁶ km² in size, and validated them against independent measurements as well as the output of a number of process-based global hydrological models (GHMs). The variance explained by our regression models ranged up to 90% and the performance of the models compared well with the performance of existing GHMs. Yet, our AF maps provide a level of spatial detail that cannot yet be achieved by current GHMs.
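A regression of this kind can be sketched in a few lines. The coefficients and synthetic data below are purely illustrative (the actual model also uses slope, elevation and temperature), but the log-log least-squares structure is the same:

```python
import numpy as np

# Synthetic illustration (not the paper's calibration data): mean annual
# flow modelled as log-linear in catchment area and mean precipitation.
rng = np.random.default_rng(42)
n = 500
log_area = rng.uniform(0, 6, n)        # log10 of catchment area, 1 to 1e6 km2
log_precip = rng.uniform(2.5, 3.5, n)  # log10 of annual precipitation, mm/yr
log_af = 0.9 * log_area + 1.4 * log_precip - 4.0 + rng.normal(0, 0.1, n)

# Ordinary least squares: log10(AF) ~ b0 + b1*log10(area) + b2*log10(P)
X = np.column_stack([np.ones(n), log_area, log_precip])
beta, *_ = np.linalg.lstsq(X, log_af, rcond=None)

# Variance explained (R^2) on the calibration sample
pred = X @ beta
r2 = 1 - np.sum((log_af - pred) ** 2) / np.sum((log_af - log_af.mean()) ** 2)
```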

  10. The National Map: from geography to mapping and back again

    Science.gov (United States)

    Kelmelis, John A.; DeMulder, Mark L.; Ogrosky, Charles E.; Van Driel, J. Nicholas; Ryan, Barbara J.

    2003-01-01

    When the means of production for national base mapping were capital intensive, required large production facilities, and had ill-defined markets, Federal Government mapping agencies were the primary providers of the spatial data needed for economic development, environmental management, and national defense. With desktop geographic information systems now ubiquitous, source data available as a commodity from private industry, and the realization that many complex problems faced by society need far more and different kinds of spatial data for their solutions, national mapping organizations must realign their business strategies to meet growing demand and anticipate the needs of a rapidly changing geographic information environment. The National Map of the United States builds on a sound historic foundation of describing and monitoring the land surface and adds a focused effort to produce improved understanding, modeling, and prediction of land-surface change. These added dimensions bring to bear a broader spectrum of geographic science to address extant and emerging issues. Within the overarching construct of The National Map, the U.S. Geological Survey (USGS) is making a transition from data collector to guarantor of national data completeness; from producing paper maps to supporting an online, seamless, integrated database; and from simply describing the Nation’s landscape to linking these descriptions with increased scientific understanding. Implementing the full spectrum of geographic science addresses a myriad of public policy issues, including land and natural resource management, recreation, urban growth, human health, and emergency planning, response, and recovery. Neither these issues nor the science and technologies needed to deal with them are static. A robust research agenda is needed to understand these changes and realize The National Map vision. Initial successes have been achieved. These accomplishments demonstrate the utility of

  11. R Coronae Borealis at the 2003 light minimum

    Science.gov (United States)

    Kameswara Rao, N.; Lambert, David L.; Shetrone, Matthew D.

    2006-08-01

    A set of five high-resolution optical spectra of R CrB obtained in 2003 March is discussed. At the time of the first spectrum (March 8), the star was at V = 12.6, a decline of more than six magnitudes. By March 31, the date of the last observation, the star at V = 9.3 was on the recovery to maximum light (V = 6). The 2003 spectra are compared with the extensive collection of spectra from the 1995-1996 minimum presented previously. Spectroscopic features common to the two minima include the familiar ones also seen in spectra of other R Coronae Borealis stars (RCBs) in decline: sharp emission lines of neutral and singly ionized atoms, broad emission lines including HeI, [NII] 6583 Å, Na D and CaII H & K lines, and blueshifted absorption lines of Na D, and KI resonance lines. Prominent differences between the 2003 and 1995-1996 spectra are seen. The broad Na D and Ca H & K lines in 2003 and 1995-1996 are centred approximately on the mean stellar velocity. The 2003 profiles are fit by a single Gaussian, but in 1995-1996 two Gaussians separated by about 200 km s-1 were required. However, the HeI broad emission lines are fit by a single Gaussian at all times; the emitting He and Na-Ca atoms are probably not colocated. The C2 Phillips 2-0 lines were detected as sharp absorption lines and the C2 Swan band lines as sharp emission lines in 2003, but in 1995-1996 the Swan band emission lines were broad and the Phillips lines were undetected. The 2003 spectra show CI sharp emission lines at minimum light with a velocity changing in 5 d by about 20 km s-1 when the velocity of `metal' sharp lines is unchanged; the CI emission may arise from shock-heated gas. Reexamination of spectra obtained at maximum light in 1995 shows extended blue wings to strong lines with the extension dependent on a line's lower excitation potential; this is the signature of a stellar wind, also revealed by published observations of the HeI 10830 Å line at maximum light. Changes in the cores of the

  12. Self-Mapping in Treating Suicide Ideation: A Case Study

    Science.gov (United States)

    Robertson, Lloyd Hawkeye

    2011-01-01

    This case study traces the development and use of a self-mapping exercise in the treatment of a youth who had been at risk for re-attempting suicide. A life skills exercise was modified to identify units of culture called "memes" from which a map of the youth's self was prepared. A successful treatment plan followed the mapping exercise. The…

  13. More 'mapping' in brain mapping: statistical comparison of effects

    DEFF Research Database (Denmark)

    Jernigan, Terry Lynne; Gamst, Anthony C.; Fennema-Notestine, Christine

    2003-01-01

    The term 'mapping' in the context of brain imaging conveys to most the concept of localization; that is, a brain map is meant to reveal a relationship between some condition or parameter and specific sites within the brain. However, in reality, conventional voxel-based maps of brain function, or for that matter of brain structure, are generally constructed using analyses that yield no basis for inferences regarding the spatial nonuniformity of the effects. In the normal analysis path for functional images, for example, there is nowhere a statistical comparison of the observed effect in any voxel relative to that in any other voxel. Under these circumstances, strictly speaking, the presence of significant activation serves as a legitimate basis only for inferences about the brain as a unit. In their discussion of results, investigators rarely are content to confirm the brain's role, and instead generally prefer…

  14. Human Mind Maps

    Science.gov (United States)

    Glass, Tom

    2016-01-01

    When students generate mind maps, or concept maps, the maps are usually on paper, computer screens, or a blackboard. Human Mind Maps require few resources and little preparation. The main requirements are space where students can move around and a little creativity and imagination. Mind maps can be used for a variety of purposes, and Human Mind…

  15. Minimum inhibitory concentration distribution in environmental Legionella spp. isolates.

    Science.gov (United States)

    Sandalakis, Vassilios; Chochlakis, Dimosthenis; Goniotakis, Ioannis; Tselentis, Yannis; Psaroulaki, Anna

    2014-12-01

    In Greece, standard tests are performed on the water and cooling systems of hotel units, either as part of the surveillance scheme or following a human infection. The purpose of this study was to establish the minimum inhibitory concentration (MIC) distributions of environmental Legionella isolates for six antimicrobials commonly used for the treatment of Legionella infections, by MIC-test methodology. Water samples were collected from 2004 to 2011 from 124 hotels in the four prefectures of Crete (Greece). Sixty-eight (68) Legionella isolates, comprising L. pneumophila serogroups 1, 2, 3, 5, 6, 8, 12, 13, 15, L. anisa, L. rubrilucens, L. maceachernii, L. quinlivanii, L. oakridgensis, and L. taurinensis, were included in the study. MIC-tests were performed on buffered charcoal yeast extract with α-ketoglutarate, L-cysteine, and ferric pyrophosphate. The MICs were read after 2 days of incubation at 36 ± 1 °C in 2.5% CO2. A wide distribution of MICs was recorded for each species and each antibiotic tested. Rifampicin proved to be the most potent antibiotic regardless of the Legionella spp.; tetracycline appeared to have the least activity on our environmental isolates. The MIC-test approach is an easy, although not so cost-effective, way to determine MICs in Legionella spp. These data should be kept in mind especially since these Legionella species may cause human disease.
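MIC distributions like these are commonly summarized by MIC50 and MIC90, the concentrations inhibiting 50% and 90% of isolates. A minimal sketch (the helper name and example MIC values are hypothetical, not data from this study):

```python
import numpy as np

def mic_percentile(mics, p):
    """MIC value inhibiting at least p% of isolates: the smallest
    observed MIC whose cumulative share of isolates reaches p%."""
    mics = np.sort(np.asarray(mics, dtype=float))
    idx = int(np.ceil(p / 100.0 * len(mics))) - 1
    return mics[idx]

# Hypothetical MIC readings (mg/L) for one antibiotic
mics = [0.064, 0.125, 0.125, 0.25, 0.25, 0.25, 0.5, 0.5, 1.0, 2.0]
mic50 = mic_percentile(mics, 50)  # 0.25
mic90 = mic_percentile(mics, 90)  # 1.0
```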

  16. An Integer Programming Formulation of the Minimum Common String Partition Problem.

    Directory of Open Access Journals (Sweden)

    S M Ferdous

    We consider the problem of finding a minimum common string partition (MCSP) of two strings, which is an NP-hard problem. The MCSP problem is closely related to genome comparison and rearrangement, an important field in Computational Biology. In this paper, we map the MCSP problem into a graph applying a prior technique and, using this graph, we develop an Integer Linear Programming (ILP) formulation for the problem. We implement the ILP formulation and compare the results with the state-of-the-art algorithms from the literature. The experimental results are found to be promising.
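To make the problem concrete, here is a brute-force sketch of MCSP for very short strings (not the paper's ILP approach; exponential time, illustration only):

```python
from itertools import combinations, permutations

def min_common_string_partition(s, t):
    """Smallest k such that s can be cut into k blocks whose
    rearrangement concatenates to t, or None if s and t are not
    anagrams (then no common partition exists)."""
    if sorted(s) != sorted(t):
        return None
    n = len(s)
    for k in range(1, n + 1):
        # choose k-1 cut positions inside s
        for cuts in combinations(range(1, n), k - 1):
            bounds = (0,) + cuts + (n,)
            blocks = [s[bounds[i]:bounds[i + 1]] for i in range(k)]
            # does some reordering of the blocks spell t?
            if any("".join(p) == t for p in permutations(blocks)):
                return k
    return n

k = min_common_string_partition("abcab", "ababc")  # 2: "abc"+"ab" vs "ab"+"abc"
```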

  17. Digital engineering aspects of Karst map: a GIS version of Davies, W.E., Simpson, J.H., Ohlmacher, G.C., Kirk, W.S., and Newton, E.G., 1984, Engineering aspects of Karst: U.S. Geological Survey, National Atlas of the United States of America, Scale 1:7,500,000

    Science.gov (United States)

    Tobin, Bret D.; Weary, David J.

    2004-01-01

    These data are digital facsimiles of the original 1984 Engineering Aspects of Karst map by Davies and others. This data set was converted from a printed map to a digital GIS coverage to provide users with a citable national scale karst data set to use for graphic and demonstration purposes until new, improved data are developed. These data may be used freely with proper citation. Because it has been converted to GIS format, these data can be easily projected, displayed and queried for multiple uses in GIS. The karst polygons of the original map were scanned from the stable base negatives of the original, vectorized, edited and then attributed with unit descriptions. All of these processes potentially introduce small errors and distortions to the geography. The original map was produced at a scale of 1:7,500,000; this coverage is not as accurate, and should be used for broad-scale purposes only. It is not intended for any site-specific studies.

  18. The Distribution of the Sample Minimum-Variance Frontier

    OpenAIRE

    Raymond Kan; Daniel R. Smith

    2008-01-01

    In this paper, we present a finite sample analysis of the sample minimum-variance frontier under the assumption that the returns are independent and multivariate normally distributed. We show that the sample minimum-variance frontier is a highly biased estimator of the population frontier, and we propose an improved estimator of the population frontier. In addition, we provide the exact distribution of the out-of-sample mean and variance of sample minimum-variance portfolios. This allows us t...

  19. Minimum Wages and Teen Employment: A Spatial Panel Approach

    OpenAIRE

    Charlene Kalenkoski; Donald Lacombe

    2011-01-01

    The authors employ spatial econometrics techniques and Annual Averages data from the U.S. Bureau of Labor Statistics for 1990-2004 to examine how changes in the minimum wage affect teen employment. Spatial econometrics techniques account for the fact that employment is correlated across states. Such correlation may exist if a change in the minimum wage in a state affects employment not only in its own state but also in other, neighboring states. The authors show that state minimum wages negat...

  20. 30 CFR 75.1431 - Minimum rope strength.

    Science.gov (United States)

    2010-07-01

    ..., including rotation resistant). For rope lengths less than 3,000 feet: Minimum Value=Static Load×(7.0−0.001L) For rope lengths 3,000 feet or greater: Minimum Value=Static Load×4.0 (b) Friction drum ropes. For rope lengths less than 4,000 feet: Minimum Value=Static Load×(7.0−0.0005L) For rope lengths 4,000 feet...
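The regulation's piecewise formulas translate directly into code. A minimal sketch using the factors quoted in the excerpt (the 5.0 factor for long friction drum ropes appears in the parallel §57.19021 and §77.1431 excerpts):

```python
def minimum_rope_strength(static_load, rope_length_ft, rope_type="drum"):
    """Minimum breaking-strength value per the 30 CFR 75.1431 formulas.
    static_load may be in any force unit; the factor is dimensionless."""
    if rope_type == "drum":          # winding drum ropes
        if rope_length_ft < 3000:
            return static_load * (7.0 - 0.001 * rope_length_ft)
        return static_load * 4.0
    if rope_type == "friction":      # friction drum ropes
        if rope_length_ft < 4000:
            return static_load * (7.0 - 0.0005 * rope_length_ft)
        return static_load * 5.0
    raise ValueError("rope_type must be 'drum' or 'friction'")

# A 2,000 ft winding drum rope with a 10,000 lb static load:
value = minimum_rope_strength(10_000, 2_000)  # 10,000 x (7.0 - 2.0) = 50,000 lb
```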

  1. Minimum Wages and the Distribution of Family Incomes

    OpenAIRE

    Dube, Arindrajit

    2017-01-01

    Using the March Current Population Survey data from 1984 to 2013, I provide a comprehensive evaluation of how minimum wage policies influence the distribution of family incomes. I find robust evidence that higher minimum wages shift down the cumulative distribution of family incomes at the bottom, reducing the share of non-elderly individuals with incomes below 50, 75, 100, and 125 percent of the federal poverty threshold. The long run (3 or more years) minimum wage elasticity of the non-elde...

  2. Maps & minds : mapping through the ages

    Science.gov (United States)

    ,

    1984-01-01

    Throughout time, maps have expressed our understanding of our world. Human affairs have been influenced strongly by the quality of maps available to us at the major turning points in our history. "Maps & Minds" traces the ebb and flow of a few central ideas in the mainstream of mapping. Our expanding knowledge of our cosmic neighborhood stems largely from a small number of simple but grand ideas, vigorously pursued.

  3. Minimum Distance Estimation on Time Series Analysis With Little Data

    National Research Council Canada - National Science Library

    Tekin, Hakan

    2001-01-01

    .... Minimum distance estimation has been demonstrated to outperform standard approaches, including maximum likelihood estimators and least squares, in estimating statistical distribution parameters with very small data sets...

  4. 30 CFR 57.19021 - Minimum rope strength.

    Science.gov (United States)

    2010-07-01

    ... feet: Minimum Value=Static Load×(7.0−0.001L) For rope lengths 3,000 feet or greater: Minimum Value=Static Load×4.0. (b) Friction drum ropes. For rope lengths less than 4,000 feet: Minimum Value=Static Load×(7.0−0.0005L) For rope lengths 4,000 feet or greater: Minimum Value=Static Load×5.0. (c) Tail...

  5. 30 CFR 77.1431 - Minimum rope strength.

    Science.gov (United States)

    2010-07-01

    ... feet: Minimum Value=Static Load×(7.0−0.001L) For rope lengths 3,000 feet or greater: Minimum Value=Static Load×4.0 (b) Friction drum ropes. For rope lengths less than 4,000 feet: Minimum Value=Static Load×(7.0−0.0005L) For rope lengths 4,000 feet or greater: Minimum Value=Static Load×5.0 (c) Tail ropes...

  6. Decision trees with minimum average depth for sorting eight elements

    KAUST Repository

    AbouEisha, Hassan M.

    2015-11-19

    We prove that the minimum average depth of a decision tree for sorting 8 pairwise different elements is equal to 620160/8!. We show also that each decision tree for sorting 8 elements, which has minimum average depth (the number of such trees is approximately equal to 8.548×10^326365), has also minimum depth. Both problems were considered by Knuth (1998). To obtain these results, we use tools based on extensions of dynamic programming which allow us to make sequential optimization of decision trees relative to depth and average depth, and to count the number of decision trees with minimum average depth.

  7. Lunar Map Catalog

    Data.gov (United States)

    National Aeronautics and Space Administration — The Lunar Map Catalog includes various maps of the moon's surface, including Apollo landing sites; earthside, farside, and polar charts; photography index maps; zone...

  8. Baby Brain Map

    Science.gov (United States)

    Baby Brain Map, Mar 17, 2016. The Brain Map was adapted in 2006 by ZERO TO ... To view the Baby Brain Map, please visit this page on a browser that supports Adobe Flash Player.

  9. Snapshots for Semantic Maps

    National Research Council Canada - National Science Library

    Nielsen, Curtis W; Ricks, Bob; Goodrich, Michael A; Bruemmer, David; Few, Doug; Walton, Miles

    2004-01-01

    .... Semantic maps are a relatively new approach to information presentation. Semantic maps provide more detail about an environment than typical maps because they are augmented by icons or symbols that provide meaning for places or objects of interest...

  10. Mapping invasive alien Acacia dealbata Link using ASTER multispectral imagery: a case study in central-eastern of Portugal

    Energy Technology Data Exchange (ETDEWEB)

    Martins, F.; Alegria, C.; Artur, G.

    2016-07-01

    Aim of the study: Acacia dealbata is an alien invasive species that is widely spread in Portugal. The main goal of this study was to produce an accurate and detailed map for this invasive species using ASTER multispectral imagery. Area of study: The central-eastern zone of Portugal was used as the study area. This whole area is represented in an ASTER scene covering about 321.1 × 10³ ha. Material and methods: ASTER imagery from two dates (flowering season and dry season) was classified by applying three supervised classifiers (Maximum Likelihood, Support Vector Machine and Artificial Neural Networks) to five different land cover classifications (from the most generic to the most detailed land cover categories). The spectral separability of the land cover categories was analyzed and the accuracy of the 30 produced maps compared. Main results: The highest classification accuracy for acacia mapping was obtained using the flowering season imagery, the Maximum Likelihood classifier and the most detailed land cover classification (overall accuracy of 86%; Kappa statistic of 85%; acacia class Kappa statistic of 100%). As a result, the area occupied by acacia was estimated to be approximately 24,770 ha (i.e. 8% of the study area). Research highlights: The methodology explored proved to be a cost-effective solution for acacia mapping in central-eastern Portugal. The obtained map enables a more accurate and detailed identification of this species' invaded areas due to its spatial resolution (minimum mapping unit of 0.02 ha), providing a substantial improvement compared to the existing national land cover maps to support monitoring and control activities. (Author)
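A minimum mapping unit is typically enforced after classification by merging patches smaller than the MMU into the background. A self-contained sketch (pure-NumPy flood fill; with 1 m pixels a 0.02 ha MMU corresponds to min_pixels=200, while the toy grid below uses min_pixels=2):

```python
import numpy as np

def enforce_minimum_mapping_unit(classes, min_pixels, background=0):
    """Relabel 4-connected patches smaller than the minimum mapping
    unit as background; returns a cleaned copy of the class raster."""
    out = classes.copy()
    visited = np.zeros(classes.shape, dtype=bool)
    rows, cols = classes.shape
    for r0 in range(rows):
        for c0 in range(cols):
            if visited[r0, c0] or classes[r0, c0] == background:
                continue
            # flood-fill one patch of identically classified pixels
            stack, patch = [(r0, c0)], []
            visited[r0, c0] = True
            label = classes[r0, c0]
            while stack:
                r, c = stack.pop()
                patch.append((r, c))
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if (0 <= rr < rows and 0 <= cc < cols
                            and not visited[rr, cc] and classes[rr, cc] == label):
                        visited[rr, cc] = True
                        stack.append((rr, cc))
            if len(patch) < min_pixels:      # patch below the MMU
                for r, c in patch:
                    out[r, c] = background
    return out

grid = np.array([[1, 1, 0, 2],
                 [1, 1, 0, 0],
                 [0, 0, 0, 0]])
cleaned = enforce_minimum_mapping_unit(grid, min_pixels=2)  # lone "2" removed
```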

  11. Training and minimum wages: first evidence from the introduction of the minimum wage in Germany

    Directory of Open Access Journals (Sweden)

    Lutz Bellmann

    2017-06-01

    We analyze the short-run impact of the introduction of the new statutory minimum wage in Germany on further training at the workplace level. Applying difference-in-differences methods to data from the IAB Establishment Panel, we do not find a reduction in the training incidence but a slight reduction in the intensity of training at treated establishments. Effect heterogeneities reveal that the negative impact is mostly driven by employer-financed training. On the worker level, we observe a reduction of training for medium- and high-skilled employees but no significant effects on the training of low-skilled employees.

  12. Minimum weight protection - Gradient method; Protection de poids minimum - Methode du gradient

    Energy Technology Data Exchange (ETDEWEB)

    Danon, R.

    1958-12-15

    After having recalled that, for a mobile installation, total weight is of crucial importance, and that, in the case of a nuclear reactor, a non-negligible part of that weight is the protection, this note presents an iterative method which converges, for a given protection, to a configuration of minimum weight. After a description of the problem, the author presents the theoretical formulation of the gradient method as applied to the case at hand. This application is then discussed, as well as its validity in terms of convergence and uniqueness. Its actual application is then reported, and possibilities for practical application are outlined.

  13. Minimum Propellant Low-Thrust Maneuvers near the Libration Points

    Science.gov (United States)

    Marinescu, A.; Dumitrache, M.

    The impulse technique can certainly bring a vehicle onto orbits around the libration points or close to them. The question that arises is: by what means can the vehicle arrive at the libration points themselves? A first investigation carried out in this paper gives an answer: the use of low-thrust propulsion, which, in addition, can bring the vehicle from the libration points near to or into orbits around these points. This aspect is considered in the present paper, where for the applications we have considered transfers to orbits about the equidistant point L4 and the collinear point L2 of the Earth-moon system. This transfer maneuver can be used to insert a satellite into libration-point orbits. In the Earth-moon system, the points L4 and L5 are of potential interest for the establishment of transponder satellites for interplanetary tracking, because a vehicle at one of the equidistant points is quite stable and remains in its vicinity if perturbed. In contrast, a vehicle at one of the collinear points is quite unstable: it will oscillate along the Earth-moon axis with increasing amplitude and gradually escape from the libration point. Let us assume that a space vehicle equipped with low-thrust propulsion is near a libration point L. We consider planar motion in the restricted three-body frame, in the rotating system centered at L, where the Earth-moon distance D=1. The unit of time T is the period of the moon's orbit divided by 2π and multiplied by the square root of the quantity one plus the moon/Earth mass ratio, and the unit of mass is the Earth's mass. With these conventions, the equations of motion of a vehicle equipped with a low-thrust propulsion installation, in the linear approximation near the libration point, have been established. The parameters of the motion at the beginning and the end of these maneuvers are known, and the variational problem has been formulated as a Lagrange-type problem with fixed extremities.
On established the differential
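In the rotating, nondimensional frame described above, the linearized planar motion near a libration point takes the textbook form (a sketch of the standard restricted three-body linearization with low-thrust acceleration terms added, not the authors' exact system):

```latex
\ddot{x} - 2\dot{y} = U_{xx}\,x + U_{xy}\,y + a_x, \qquad
\ddot{y} + 2\dot{x} = U_{xy}\,x + U_{yy}\,y + a_y
```

where $U_{xx}$, $U_{xy}$, $U_{yy}$ are the second partial derivatives of the effective potential evaluated at the libration point and $(a_x, a_y)$ is the low-thrust acceleration.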

  14. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
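The post-processing step described here, turning a stack of equally likely simulations into an exceedance-probability map, is essentially a one-liner in array form. The lognormal fields and threshold below are synthetic stand-ins for the geostatistical realizations:

```python
import numpy as np

# Sketch of the post-processing step: given a stack of equally likely
# geostatistical simulations (one 2-D grid per realization), the
# probability map is the fraction of realizations exceeding a threshold.
rng = np.random.default_rng(7)
n_realizations, rows, cols = 200, 10, 10
simulations = rng.lognormal(mean=1.0, sigma=0.5,
                            size=(n_realizations, rows, cols))

threshold = 3.0  # hypothetical clean-up level, same units as the data
prob_exceed = (simulations > threshold).mean(axis=0)  # values in [0, 1]
```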

  15. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)

  16. Feedforward mapping for engine control

    OpenAIRE

    Aran, Volkan; Ünel, Mustafa; Unel, Mustafa

    2016-01-01

    Feedforward control is widely used in the electronic control units of internal combustion engines alongside feedback control. However, almost all feedforward control values are stored in table form, also called maps, with engine speed and engine torque on their axes. The table approach limits all interactions to two input dimensions. This paper focuses on the application of Gaussian process modelling of the errors of an inverse parametric model of the valve position. Validation results based on ...
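A conventional engine map of the kind described above is evaluated by bilinear interpolation over the speed and torque axes. A minimal sketch with a hypothetical 3x3 valve-position table (axis values and table entries are invented for illustration):

```python
import numpy as np

def map_lookup(speed_axis, torque_axis, table, speed, torque):
    """Bilinear interpolation into a feedforward map indexed by engine
    speed and torque, clamped to the table edges."""
    s = np.clip(speed, speed_axis[0], speed_axis[-1])
    t = np.clip(torque, torque_axis[0], torque_axis[-1])
    i = max(1, min(np.searchsorted(speed_axis, s), len(speed_axis) - 1))
    j = max(1, min(np.searchsorted(torque_axis, t), len(torque_axis) - 1))
    fs = (s - speed_axis[i - 1]) / (speed_axis[i] - speed_axis[i - 1])
    ft = (t - torque_axis[j - 1]) / (torque_axis[j] - torque_axis[j - 1])
    return ((1 - fs) * (1 - ft) * table[i - 1, j - 1]
            + fs * (1 - ft) * table[i, j - 1]
            + (1 - fs) * ft * table[i - 1, j]
            + fs * ft * table[i, j])

speed_axis = np.array([1000.0, 2000.0, 3000.0])   # rpm breakpoints
torque_axis = np.array([50.0, 100.0, 150.0])      # Nm breakpoints
table = np.array([[10.0, 20.0, 30.0],
                  [15.0, 25.0, 35.0],
                  [20.0, 30.0, 40.0]])
value = map_lookup(speed_axis, torque_axis, table, 1500.0, 75.0)  # 17.5
```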

  17. Habitat Mapping and Classification of the Grand Bay National Estuarine Research Reserve using AISA Hyperspectral Imagery

    Science.gov (United States)

    Rose, K.

    2012-12-01

    Habitat mapping and classification provides essential information for land use planning and ecosystem research, monitoring and management. At the Grand Bay National Estuarine Research Reserve (GRDNERR), Mississippi, habitat characterization of the Grand Bay watershed will also be used to develop a decision-support tool for the NERR's managers and state and local partners. Grand Bay NERR habitat units were identified using a combination of remotely sensed imagery, aerial photography and elevation data. Airborne Imaging Spectrometer for Applications (AISA) hyperspectral data, acquired 5 and 6 May 2010, was analyzed and classified using ENVI v4.8 and v5.0 software. The AISA system was configured to return 63 bands of digital imagery data with a spectral range of 400 to 970 nm (VNIR), spectral resolution (bandwidth) at 8.76 nm, and 1 m spatial resolution. Minimum Noise Fraction (MNF) and Inverse Minimum Noise Fraction were applied to the data prior to using Spectral Angle Mapper ([SAM] supervised) and ISODATA (unsupervised) classification techniques. The resulting class image was exported to ArcGIS 10.0 and visually inspected and compared with the original imagery as well as auxiliary datasets to assist in the attribution of habitat characteristics to the spectral classes, including: National Agricultural Imagery Program (NAIP) aerial photography, Jackson County, MS, 2010; USFWS National Wetlands Inventory, 2007; an existing GRDNERR habitat map (2004), SAV (2009) and salt panne (2002-2003) GIS produced by GRDNERR; and USACE lidar topo-bathymetry, 2005. A field survey to validate the map's accuracy will take place during the 2012 summer season. ENVI's Random Sample generator was used to generate GIS points for a ground-truth survey. The broad range of coastal estuarine habitats and geomorphological features- many of which are transitional and vulnerable to environmental stressors- that have been identified within the GRDNERR point to the value of the Reserve for
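The Spectral Angle Mapper used in the workflow above classifies each pixel by the angle between its spectrum and a reference endmember, which makes it insensitive to illumination scaling. A minimal sketch with hypothetical 4-band spectra (the endmembers, threshold, and function names are illustrative, not from the GRDNERR analysis):

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral angle (radians) between a pixel spectrum and a
    reference endmember, as used by SAM classification."""
    cos = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def sam_classify(pixel, endmembers, max_angle=0.1):
    """Assign the pixel to the endmember with the smallest angle,
    or -1 if no angle is below the threshold."""
    angles = [spectral_angle(pixel, e) for e in endmembers]
    best = int(np.argmin(angles))
    return best if angles[best] <= max_angle else -1

marsh = np.array([0.10, 0.30, 0.50, 0.20])  # hypothetical 4-band spectra
water = np.array([0.05, 0.04, 0.02, 0.01])
label = sam_classify(2.0 * marsh, [marsh, water])  # scale-invariant -> 0
```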

  18. MEDOF - MINIMUM EUCLIDEAN DISTANCE OPTIMAL FILTER

    Science.gov (United States)

    Barton, R. S.

    1994-01-01

    The Minimum Euclidean Distance Optimal Filter program, MEDOF, generates filters for use in optical correlators. The algorithm implemented in MEDOF follows theory put forth by Richard D. Juday of NASA/JSC. This program analytically optimizes filters on arbitrary spatial light modulators such as coupled, binary, full complex, and fractional 2pi phase. MEDOF optimizes these modulators on a number of metrics including: correlation peak intensity at the origin for the centered appearance of the reference image in the input plane, signal to noise ratio including the correlation detector noise as well as the colored additive input noise, peak to correlation energy defined as the fraction of the signal energy passed by the filter that shows up in the correlation spot, and the peak to total energy which is a generalization of PCE that adds the passed colored input noise to the input image's passed energy. The user of MEDOF supplies the functions that describe the following quantities: 1) the reference signal, 2) the realizable complex encodings of both the input and filter SLM, 3) the noise model, possibly colored, as it adds at the reference image and at the correlation detection plane, and 4) the metric to analyze, here taken to be one of the analytical ones like SNR (signal to noise ratio) or PCE (peak to correlation energy) rather than peak to secondary ratio. MEDOF calculates filters for arbitrary modulators and a wide range of metrics as described above. MEDOF examines the statistics of the encoded input image's noise (if SNR or PCE is selected) and the filter SLM's (Spatial Light Modulator) available values. These statistics are used as the basis of a range for searching for the magnitude and phase of k, a pragmatically based complex constant for computing the filter transmittance from the electric field. The filter is produced for the mesh points in those ranges and the value of the metric that results from these points is computed. 
When the search is concluded, the
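
    Of the metrics MEDOF optimizes, peak-to-correlation-energy is the most compact to illustrate: the fraction of the filter-passed signal energy that lands in the central correlation spot. A hedged frequency-domain sketch follows; this is not MEDOF's actual code, and the FFT formulation and function name are assumptions:

```python
import numpy as np

def pce(signal_spectrum, filter_transmittance):
    """Peak-to-correlation-energy for a centered reference: intensity of
    the correlation peak at the origin divided by the total energy passed
    by the filter (by Parseval, the energy of the correlation plane)."""
    corr = np.fft.ifft2(signal_spectrum * filter_transmittance)
    peak_intensity = np.abs(corr[0, 0]) ** 2
    total_energy = np.sum(np.abs(corr) ** 2)
    return peak_intensity / total_energy
```

    A matched filter on a flat spectrum concentrates all passed energy in the peak, giving PCE = 1; realizable SLM encodings (binary, coupled, fractional phase) can only lower it.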

  19. Circum-North Pacific tectonostratigraphic terrane map

    Science.gov (United States)

    Nokleberg, Warren J.; Parfenov, Leonid M.; Monger, James W.H.; Baranov, Boris B.; Byalobzhesky, Stanislav G.; Bundtzen, Thomas K.; Feeney, Tracey D.; Fujita, Kazuya; Gordey, Steven P.; Grantz, Arthur; Khanchuk, Alexander I.; Natal'in, Boris A.; Natapov, Lev M.; Norton, Ian O.; Patton, William W.; Plafker, George; Scholl, David W.; Sokolov, Sergei D.; Sosunov, Gleb M.; Stone, David B.; Tabor, Rowland W.; Tsukanov, Nickolai V.; Vallier, Tracy L.; Wakita, Koji

    1994-01-01

    The companion tectonostratigraphic terrane and overlap assemblage map of the Circum-North Pacific presents a modern description of the major geologic and tectonic units of the region. The map illustrates both the onshore terranes and overlap volcanic assemblages of the region, and the major offshore geologic features. The map is the first collaborative compilation of the geology of the region at a scale of 1:5,000,000 by geologists from the Russian Far East, Japan, Alaska, Canada, and the U.S.A. Pacific Northwest. The map is designed to be a source of geologic information for all scientists interested in the region, and is designed to be used for several purposes, including regional tectonic analyses, mineral resource and metallogenic analyses (Nokleberg and others, 1993, 1994a), petroleum analyses, neotectonic analyses, and analyses of seismic hazards and volcanic hazards. This text contains an introduction, tectonic definitions, acknowledgments, descriptions of postaccretion stratified rock units, descriptions and stratigraphic columns for tectonostratigraphic terranes in onshore areas, and references for the companion map (Sheets 1 to 5). This map is the result of extensive geologic mapping and associated tectonic studies in the Russian Far East, Hokkaido Island of Japan, Alaska, the Canadian Cordillera, and the U.S.A. Pacific Northwest in the last few decades. Geologic mapping suggests that most of this region can be interpreted as a collage of fault-bounded tectonostratigraphic terranes that were accreted onto continental margins around the Circum-

  20. Map of the Physical Sciences

    Energy Technology Data Exchange (ETDEWEB)

    Boyack, Kevin W.

    1999-07-02

    Various efforts to map the structure of science have been undertaken over the years. Using a new tool, VxInsight™, we have mapped and displayed 3000 journals in the physical sciences. This map is navigable and interactively reveals the structure of science at many different levels. Science mapping studies are typically focused at either the macro- or micro-level. At a macro-level such studies seek to determine the basic structural units of science and their interrelationships. The majority of studies are performed at the discipline or specialty level, and seek to inform science policy and technical decision makers. Studies at both levels probe the dynamic nature of science, and the implications of the changes. A variety of databases and methods have been used for these studies. Primary among databases are the citation indices (SCI and SSCI) from the Institute for Scientific Information, which have gained widespread acceptance for bibliometric studies. Maps are most often based on computed similarities between journal articles (co-citation), keywords or topics (co-occurrence or co-classification), or journals (journal-journal citation counts). Once the similarity matrix is defined, algorithms are used to cluster the data.
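
    The last two sentences describe a generic pipeline: derive a journal-journal similarity matrix, then cluster it. A small sketch under stated assumptions (the citation counts are invented, and symmetrized citation exchange is only one of several similarity choices; the abstract does not say which VxInsight uses):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Hypothetical journal-journal citation counts (rows cite columns).
cites = np.array([
    [0, 50, 40, 1, 2],
    [45, 0, 35, 2, 1],
    [38, 30, 0, 3, 2],
    [1, 2, 2, 0, 60],
    [2, 1, 3, 55, 0],
], dtype=float)

# Symmetric similarity from mutual citation exchange, scaled to [0, 1].
sym = cites + cites.T
sim = sym / sym.max()

# Hierarchical clustering works on dissimilarities in condensed form.
dist = squareform(1.0 - sim, checks=False)
labels = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")
```

    With these numbers the first three journals (which cite each other heavily) fall into one cluster and the last two into another, mirroring how journal-level maps recover disciplines.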

  1. The Unexpected Long-Run Impact of the Minimum Wage: An Educational Cascade. NBER Working Paper No. 16355

    Science.gov (United States)

    Sutch, Richard

    2010-01-01

    A neglected but significant long-run consequence of the minimum wage, which was made national policy in the United States in 1938, is its stimulation of capital deepening. This took two forms. First, the engineered shortage of low-skill, low-paying jobs induced teenagers to invest in additional human capital, primarily by extending their…

  2. Six months into Myanmar's minimum wage: Reflecting on progress ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2016-04-25

    Apr 25, 2016 ... Participants examined recent results from an IDRC-funded enterprise survey, ... of a minimum wage, and how they have coped with the new situation.” ... Debate on the impact of minimum wages on employment continues ...

  3. The impact of minimum wages on youth employment in Portugal

    NARCIS (Netherlands)

    S.C. Pereira

    2003-01-01

    From January 1, 1987, the legal minimum wage for workers aged 18 and 19 in Portugal was uprated to the full adult rate, generating a 49.3% increase between 1986 and 1987 in the legal minimum wage for this age group. This shock is used as a "natural experiment" to evaluate the impact of

  4. The Impact Of Minimum Wage On Employment Level And ...

    African Journals Online (AJOL)

    This research work was carried out to analyze the critical impact of the minimum wage on employment level and productivity in Nigeria. A brief literature review on wages and their determination is presented. Models of the minimum wage effect are examined, including research work done by different economists analyzing it ...

  5. 42 CFR 84.134 - Respirator containers; minimum requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Respirator containers; minimum requirements. 84.134... Respirators § 84.134 Respirator containers; minimum requirements. Supplied-air respirators shall be equipped with a substantial, durable container bearing markings which show the applicant's name, the type and...

  6. 42 CFR 84.1134 - Respirator containers; minimum requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Respirator containers; minimum requirements. 84... Combination Gas Masks § 84.1134 Respirator containers; minimum requirements. (a) Except as provided in paragraph (b) of this section each respirator shall be equipped with a substantial, durable container...

  7. 42 CFR 84.197 - Respirator containers; minimum requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Respirator containers; minimum requirements. 84.197... Cartridge Respirators § 84.197 Respirator containers; minimum requirements. Respirators shall be equipped with a substantial, durable container bearing markings which show the applicant's name, the type and...

  8. 42 CFR 84.174 - Respirator containers; minimum requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Respirator containers; minimum requirements. 84.174... Air-Purifying Particulate Respirators § 84.174 Respirator containers; minimum requirements. (a) Except..., durable container bearing markings which show the applicant's name, the type of respirator it contains...

  9. 42 CFR 84.74 - Apparatus containers; minimum requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Apparatus containers; minimum requirements. 84.74...-Contained Breathing Apparatus § 84.74 Apparatus containers; minimum requirements. (a) Apparatus may be equipped with a substantial, durable container bearing markings which show the applicant's name, the type...

  10. 14 CFR 91.155 - Basic VFR weather minimums.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 2 2010-01-01 2010-01-01 false Basic VFR weather minimums. 91.155 Section...) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Flight Rules Visual Flight Rules § 91.155 Basic VFR weather minimums. (a) Except as provided in paragraph (b) of this section and...

  11. 42 CFR 422.382 - Minimum net worth amount.

    Science.gov (United States)

    2010-10-01

    ... that CMS considers appropriate to reduce, control or eliminate start-up administrative costs. (b) After... section. (c) Calculation of the minimum net worth amount—(1) Cash requirement. (i) At the time of application, the organization must maintain at least $750,000 of the minimum net worth amount in cash or cash...

  12. 7 CFR 1610.5 - Minimum Bank loan.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 11 2010-01-01 2010-01-01 false Minimum Bank loan. 1610.5 Section 1610.5 Agriculture Regulations of the Department of Agriculture (Continued) RURAL TELEPHONE BANK, DEPARTMENT OF AGRICULTURE LOAN POLICIES § 1610.5 Minimum Bank loan. A Bank loan will not be made unless the applicant qualifies for a Bank...

  13. 5 CFR 551.601 - Minimum age standards.

    Science.gov (United States)

    2010-01-01

    ... ADMINISTRATION UNDER THE FAIR LABOR STANDARDS ACT Child Labor § 551.601 Minimum age standards. (a) 16-year... subject to its child labor provisions, with certain exceptions not applicable here. (b) 18-year minimum... occupation found and declared by the Secretary of Labor to be particularly hazardous for the employment of...

  14. 76 FR 15368 - Minimum Security Devices and Procedures

    Science.gov (United States)

    2011-03-21

    ... DEPARTMENT OF THE TREASURY Office of Thrift Supervision Minimum Security Devices and Procedures... concerning the following information collection. Title of Proposal: Minimum Security Devices and Procedures... security devices and procedures to discourage robberies, burglaries, and larcenies, and to assist in the...

  15. 76 FR 30243 - Minimum Security Devices and Procedures

    Science.gov (United States)

    2011-05-24

    ... DEPARTMENT OF THE TREASURY Office of Thrift Supervision Minimum Security Devices and Procedures.... Title of Proposal: Minimum Security Devices and Procedures. OMB Number: 1550-0062. Form Number: N/A... respect to the installation, maintenance, and operation of security devices and procedures to discourage...

  16. 12 CFR 567.2 - Minimum regulatory capital requirement.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Minimum regulatory capital requirement. 567.2... Regulatory Capital Requirements § 567.2 Minimum regulatory capital requirement. (a) To meet its regulatory capital requirement a savings association must satisfy each of the following capital standards: (1) Risk...

  17. Minimum bias measurement at 13 TeV

    CERN Document Server

    Orlando, Nicola; The ATLAS collaboration

    2017-01-01

    The modelling of Minimum Bias (MB) events is a crucial ingredient for understanding soft QCD processes and for simulating the environment at the LHC with many concurrent pp interactions (pile-up). We summarise the ATLAS minimum bias measurements with proton-proton collisions at 13 TeV centre-of-mass energy at the Large Hadron Collider.

  18. Solving the minimum flow problem with interval bounds and flows

    Indian Academy of Sciences (India)

    ... with crisp data. In this paper, the idea of Ghiyasvand was extended for solving the minimum flow problem with interval-valued lower, upper bounds and flows. This problem can be solved using two minimum flow problems with crisp data. Then, this result is extended to networks with fuzzy lower, upper bounds and flows.

  19. 47 CFR 25.205 - Minimum angle of antenna elevation.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false Minimum angle of antenna elevation. 25.205 Section 25.205 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES SATELLITE COMMUNICATIONS Technical Standards § 25.205 Minimum angle of antenna elevation. (a) Earth station...

  20. 77 FR 43196 - Minimum Internal Control Standards and Technical Standards

    Science.gov (United States)

    2012-07-24

    ... NATIONAL INDIAN GAMING COMMISSION 25 CFR Parts 543 and 547 Minimum Internal Control Standards [email protected] . SUPPLEMENTARY INFORMATION: Part 543 addresses minimum internal control standards (MICS) for Class II gaming operations. The regulations require tribes to establish controls and implement...

  1. 12 CFR 3.6 - Minimum capital ratios.

    Science.gov (United States)

    2010-01-01

    ... should have well-diversified risks, including no undue interest rate risk exposure; excellent control... 12 Banks and Banking 1 2010-01-01 2010-01-01 false Minimum capital ratios. 3.6 Section 3.6 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY MINIMUM CAPITAL RATIOS; ISSUANCE...

  2. Minimum Competencies in Undergraduate Motor Development. Guidance Document

    Science.gov (United States)

    National Association for Sport and Physical Education, 2004

    2004-01-01

    The minimum competency guidelines in Motor Development described herein at the undergraduate level may be gained in one or more motor development course(s) or through other courses provided in an undergraduate curriculum. The minimum guidelines include: (1) Formulation of a developmental perspective; (2) Knowledge of changes in motor behavior…

  3. 30 CFR 77.606-1 - Rubber gloves; minimum requirements.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Rubber gloves; minimum requirements. 77.606-1... COAL MINES Trailing Cables § 77.606-1 Rubber gloves; minimum requirements. (a) Rubber gloves (lineman's gloves) worn while handling high-voltage trailing cables shall be rated at least 20,000 volts and shall...

  4. 42 CFR 84.117 - Gas mask containers; minimum requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Gas mask containers; minimum requirements. 84.117... SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Gas Masks § 84.117 Gas mask containers; minimum requirements. (a) Gas masks shall be equipped with a substantial...

  5. Decision trees with minimum average depth for sorting eight elements

    KAUST Repository

    AbouEisha, Hassan M.; Chikalov, Igor; Moshkov, Mikhail

    2015-01-01

    We prove that the minimum average depth of a decision tree for sorting 8 pairwise different elements is equal to 620160/8!. We show also that each decision tree for sorting 8 elements, which has minimum average depth (the number of such trees
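
    The quoted constant is easy to sanity-check: a comparison (decision) tree that sorts 8 pairwise different elements has 8! leaves, so its average depth cannot fall below log2(8!), and the paper's optimum sits just above that bound:

```python
import math

# Minimum average depth stated in the abstract: 620160 / 8!
avg_depth = 620160 / math.factorial(8)      # = 15 + 8/21, about 15.381

# Information-theoretic lower bound for any binary decision tree
# with 8! equally likely leaves: average depth >= log2(8!).
lower_bound = math.log2(math.factorial(8))  # about 15.299

assert lower_bound < avg_depth
```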

  6. 30 CFR 18.97 - Inspection of machines; minimum requirements.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Inspection of machines; minimum requirements... TESTING, EVALUATION, AND APPROVAL OF MINING PRODUCTS ELECTRIC MOTOR-DRIVEN MINE EQUIPMENT AND ACCESSORIES Field Approval of Electrically Operated Mining Equipment § 18.97 Inspection of machines; minimum...

  7. 12 CFR 615.5330 - Minimum surplus ratios.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Minimum surplus ratios. 615.5330 Section 615.5330 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM FUNDING AND FISCAL AFFAIRS, LOAN POLICIES AND OPERATIONS, AND FUNDING OPERATIONS Surplus and Collateral Requirements § 615.5330 Minimum...

  8. 19 CFR 144.33 - Minimum quantities to be withdrawn.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Minimum quantities to be withdrawn. 144.33 Section 144.33 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT... Warehouse § 144.33 Minimum quantities to be withdrawn. Unless by special authority of the Commissioner of...

  9. The impact of minimum wage adjustments on Vietnamese wage inequality

    DEFF Research Database (Denmark)

    Hansen, Henrik; Rand, John; Torm, Nina

    Using Vietnamese Labour Force Survey data we analyse the impact of minimum wage changes on wage inequality. Minimum wages serve to reduce local wage inequality in the formal sectors by decreasing the gap between the median wages and the lower tail of the local wage distributions. In contrast, local...

  10. Minimum Moduli in Von Neumann Algebras | Gopalraj | Quaestiones ...

    African Journals Online (AJOL)

    In this paper we answer a question raised in [12] in the affirmative, namely that the essential minimum modulus of an element in a von Neumann algebra, relative to any norm closed two-sided ideal, is equal to the minimum modulus of the element perturbed by an element from the ideal. As a corollary of this result, we ...

  11. 12 CFR 932.8 - Minimum liquidity requirements.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Minimum liquidity requirements. 932.8 Section... CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.8 Minimum liquidity requirements. In addition to meeting the deposit liquidity requirements contained in § 965.3 of this chapter, each Bank...

  12. Is a Minimum Wage an Appropriate Instrument for Redistribution?

    NARCIS (Netherlands)

    A.A.F. Gerritsen (Aart); B. Jacobs (Bas)

    2016-01-01

    We analyze the redistributional (dis)advantages of a minimum wage over income taxation in competitive labor markets, without imposing assumptions on the (in)efficiency of labor rationing. Compared to a distributionally equivalent tax change, a minimum-wage increase raises involuntary

  13. The Minimum Wage and the Employment of Teenagers. Recent Research.

    Science.gov (United States)

    Fallick, Bruce; Currie, Janet

    A study used individual-level data from the National Longitudinal Study of Youth to examine the effects of changes in the federal minimum wage on teenage employment. Individuals in the sample were classified as either likely or unlikely to be affected by these increases in the federal minimum wage on the basis of their wage rates and industry of…

  14. The Minimum Wage, Restaurant Prices, and Labor Market Structure

    Science.gov (United States)

    Aaronson, Daniel; French, Eric; MacDonald, James

    2008-01-01

    Using store-level and aggregated Consumer Price Index data, we show that restaurant prices rise in response to minimum wage increases under several sources of identifying variation. We introduce a general model of employment determination that implies minimum wage hikes cause prices to rise in competitive labor markets but potentially fall in…

  15. The Effect of Minimum Wage Rates on High School Completion

    Science.gov (United States)

    Warren, John Robert; Hamrock, Caitlin

    2010-01-01

    Does increasing the minimum wage reduce the high school completion rate? Previous research has suffered from (1) narrow time horizons, (2) potentially inadequate measures of states' high school completion rates, and (3) potentially inadequate measures of minimum wage rates. Overcoming each of these limitations, we analyze the impact of changes in…

  16. Do minimum wages reduce poverty? Evidence from Central America ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    In all three countries, these multiple minimum wages are negotiated among representatives of the central government, labour unions and the chambers of commerce. Minimum wage legislation applies to all private-sector employees, but in all three countries a large part of the work force is self-employed or works as unpaid ...

  17. Minimum Variance Portfolios in the Brazilian Equity Market

    Directory of Open Access Journals (Sweden)

    Alexandre Rubesam

    2013-03-01

    Full Text Available We investigate minimum variance portfolios in the Brazilian equity market using different methods to estimate the covariance matrix, from the simple model of using the sample covariance to multivariate GARCH models. We compare the performance of the minimum variance portfolios to those of the following benchmarks: (i) the IBOVESPA equity index, (ii) an equally-weighted portfolio, (iii) the maximum Sharpe ratio portfolio and (iv) the maximum growth portfolio. Our results show that the minimum variance portfolio has higher returns with lower risk compared to the benchmarks. We also consider long-short 130/30 minimum variance portfolios and obtain similar results. The minimum variance portfolio invests in relatively few stocks with low βs measured with respect to the IBOVESPA index, being easily replicable by individual and institutional investors alike.
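
    For reference, the unconstrained minimum variance portfolio that such studies build on has a closed form, w = Σ⁻¹1 / (1ᵀΣ⁻¹1); the paper's substance lies in how Σ is estimated (sample covariance through multivariate GARCH) and in the long-short 130/30 constraints, both of which this sketch omits:

```python
import numpy as np

def min_variance_weights(cov):
    """Weights minimizing portfolio variance w' cov w subject only to the
    budget constraint sum(w) = 1 (short positions allowed):
    w = cov^{-1} 1 / (1' cov^{-1} 1)."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)  # cov^{-1} 1 without forming the inverse
    return w / w.sum()
```

    With a diagonal covariance the weights are inversely proportional to the variances; for example, variances (1, 4) give weights (0.8, 0.2).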

  18. Reducing tobacco use and access through strengthened minimum price laws.

    Science.gov (United States)

    McLaughlin, Ian; Pearson, Anne; Laird-Metke, Elisa; Ribisl, Kurt

    2014-10-01

    Higher prices reduce consumption and initiation of tobacco products. A minimum price law that establishes a high statutory minimum price and prohibits the industry's discounting tactics for tobacco products is a promising pricing strategy as an alternative to excise tax increases. Although some states have adopted minimum price laws on the basis of statutorily defined price "markups" over the invoice price, existing state laws have been largely ineffective at increasing the retail price. We analyzed 3 new variations of minimum price laws that hold great potential for raising tobacco prices and reducing consumption: (1) a flat rate minimum price law similar to a recent enactment in New York City, (2) an enhanced markup law, and (3) a law that incorporates both elements.

  19. The impact of minimum wage adjustments on Vietnamese wage inequality

    DEFF Research Database (Denmark)

    Hansen, Henrik; Rand, John; Torm, Nina

    Using Vietnamese Labour Force Survey data we analyse the impact of minimum wage changes on wage inequality. Minimum wages serve to reduce local wage inequality in the formal sectors by decreasing the gap between the median wages and the lower tail of the local wage distributions. In contrast, local wage inequality is increased in the informal sectors. Overall, the minimum wages decrease national wage inequality. Our estimates indicate a decrease in the wage distribution Gini coefficient of about 2 percentage points and an increase in the 10/50 wage ratio of 5-7 percentage points caused by the adjustment of the minimum wages from 2011 to 2012 that levelled the minimum wage across economic sectors.

  20. Biometric recognition via fixation density maps

    Science.gov (United States)

    Rigas, Ioannis; Komogortsev, Oleg V.

    2014-05-01

    This work introduces and evaluates a novel eye movement-driven biometric approach that employs eye fixation density maps for person identification. The proposed feature offers a dynamic representation of the biometric identity, storing rich information regarding the behavioral and physical eye movement characteristics of the individuals. The innate ability of fixation density maps to capture the spatial layout of the eye movements in conjunction with their probabilistic nature makes them a particularly suitable option as an eye movement biometrical trait in cases when free-viewing stimuli is presented. In order to demonstrate the effectiveness of the proposed approach, the method is evaluated on three different datasets containing a wide gamut of stimuli types, such as static images, video and text segments. The obtained results indicate a minimum EER (Equal Error Rate) of 18.3 %, revealing the perspectives on the utilization of fixation density maps as an enhancing biometrical cue during identification scenarios in dynamic visual environments.
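
    The reported 18.3% EER is the operating point where the false rejection rate on genuine comparisons equals the false acceptance rate on impostor comparisons. A rough threshold sweep illustrates the idea; this is a sketch only, the authors' scoring protocol is not specified at this level, and higher scores are assumed to mean better matches:

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """Approximate the EER by sweeping a decision threshold over the
    observed scores and returning the point where the false rejection
    rate (genuine below threshold) and the false acceptance rate
    (impostor at or above threshold) are closest."""
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    best_gap, best_eer = 1.0, 0.0
    for t in thresholds:
        frr = np.mean(genuine < t)
        far = np.mean(impostor >= t)
        if abs(frr - far) < best_gap:
            best_gap, best_eer = abs(frr - far), (frr + far) / 2
    return best_eer
```

    On perfectly separated score sets the sweep returns 0; overlap between the genuine and impostor distributions is what pushes the EER up toward figures like 18.3%.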

  1. Raw data from orientation studies in crystalline rock areas of the southeastern United States. [Maps, tables of field data and analytical data for sections of North and South Carolina and Georgia, previously reported sites of uranium mineralization

    Energy Technology Data Exchange (ETDEWEB)

    Price, V.

    1976-03-01

    Raw data are presented on orientation studies conducted in crystalline rock areas of the Southeast which were chosen because of published references to uranium mineralization. Preliminary data for four orientation study areas are included. These areas are Lamar County, Georgia; Oconee County, South Carolina; Brush Creek, North Carolina; and North Harper, North Carolina. Sample locality maps, tables of field data, and tables of analytical data are included for each study area. (JGB)

  2. Mapping the Heart

    Science.gov (United States)

    Hulse, Grace

    2012-01-01

    In this article, the author describes how her fourth graders made ceramic heart maps. The impetus for this project came from reading "My Map Book" by Sara Fanelli. This book is a collection of quirky, hand-drawn and collaged maps that diagram a child's world. There are maps of her stomach, her day, her family, and her heart, among others. The…

  3. USGS Map Indices Overlay Map Service from The National Map

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The USGS Map Indices service from The National Map (TNM) consists of 1x1 Degree, 30x60 Minute (100K), 15 Minute (63K), 7.5 Minute (24K), and 3.75 Minute grid...

  4. Increasing Efficiency of Soil Fertility Map for Rice Cultivation Using Fuzzy Logic, AHP and GIS

    Directory of Open Access Journals (Sweden)

    javad seyedmohammadi

    2017-02-01

    Full Text Available Introduction: With regard to the increasing population of the country, high agricultural production is essential, and the most suitable approach is to increase production per unit area. Providing sufficient food and other environmental resources while conserving biotic resources for the future will be possible only with optimum exploitation of soil. Among the factors affecting production, balanced addition of fertilizers increases crop production more than the others. With attention to this topic, determination of soil fertility degree is essential for better use of fertilizers and right exploitation of soils. Using fuzzy logic and the Analytic Hierarchy Process (AHP) can be useful for accurate determination of soil fertility degree. Materials and Methods: The study area (east of Rasht city) is located between 49° 31' and 49° 45' E longitude and 37° 7' to 37° 27' N latitude in the north of Guilan Province, northern Iran, on the southern coast of the Caspian Sea. 117 soil samples were taken from 0-30 cm depth in the study area. Air-dried soil samples were crushed and passed through a 2 mm sieve. Available phosphorus, potassium and organic carbon were determined by the sodium bicarbonate, normal ammonium acetate and corrected Walkley-Black methods, respectively. In the first stage, interpolation of the data was done by the kriging method in a GIS context. Then an S-shaped membership function was defined for each parameter and a fuzzy map was prepared. After determination of the membership functions, parameter weights were determined using the AHP technique, and finally the soil fertility map was prepared by overlaying the weighted fuzzy maps. Relative variance and correlation coefficient criteria were used to control group separation accuracy in the fuzzy fertility map. Results and Discussion: With regard to the minimum amounts of the parameters, it appears that some lands of the study area have fertility problems. Therefore, the soil fertility map of the study area distinguishes these lands and presents soil
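
    The S-shaped membership function in the methods is commonly taken to be Zadeh's S-function, rising smoothly from 0 at a lower bound a to 1 at an upper bound b, with crossover 0.5 at the midpoint. A sketch under that assumption (the abstract gives neither the exact parameterization nor the AHP weights):

```python
import numpy as np

def s_membership(x, a, b):
    """Zadeh's S-shaped membership function: 0 below a, 1 above b,
    quadratic pieces meeting at 0.5 at the midpoint m = (a + b) / 2."""
    x = np.asarray(x, dtype=float)
    m = (a + b) / 2.0
    y = np.zeros_like(x)
    rising = (x > a) & (x <= m)
    leveling = (x > m) & (x < b)
    y[rising] = 2 * ((x[rising] - a) / (b - a)) ** 2
    y[leveling] = 1 - 2 * ((x[leveling] - b) / (b - a)) ** 2
    y[x >= b] = 1.0
    return y
```

    In the paper's workflow each soil parameter map (P, K, organic carbon) would be passed through such a function, and the fertility score would then be an AHP-weighted sum of the per-parameter membership maps.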

  5. Teaching the Minimum Wage in Econ 101 in Light of the New Economics of the Minimum Wage.

    Science.gov (United States)

    Krueger, Alan B.

    2001-01-01

    Argues that the recent controversy over the effect of the minimum wage on employment offers an opportunity for teaching introductory economics. Examines eight textbooks to determine topic coverage but finds little consensus. Describes how minimum wage effects should be taught. (RLH)

  6. 7. Annex II: Maps

    OpenAIRE

    Aeberli, Annina

    2012-01-01

    Map 1: States of South Sudan UN OCHA (2012) Republic of South Sudan – States, as of 15 July 2012, Reliefweb http://reliefweb.int/map/south-sudan-republic/republic-south-sudan-states-15-july-2012-reference-map, accessed 31 July 2012. Map 2: Counties of South Sudan UN OCHA (2012) Republic of South Sudan – Counties, as of 16 July 2012, Reliefweb http://reliefweb.int/map/south-sudan-republic/republic-south-sudan-counties-16-july-2012-reference-map, accessed 31 July 2012. Map 3: Eastern Equato...

  7. Applicability of vulnerability maps

    International Nuclear Information System (INIS)

    Andersen, L.J.; Gosk, E.

    1989-01-01

    A number of aspects of vulnerability maps are discussed: the vulnerability concept, mapping purposes, possible users, and the applicability of vulnerability maps. Problems associated with general-type vulnerability mapping, including large-scale maps, a universal pollutant, and a universal pollution scenario, are also discussed. An alternative approach to vulnerability assessment (specific vulnerability mapping for limited areas, a specific pollutant, and a predefined pollution scenario) is suggested. A simplification of the vulnerability concept is proposed in order to make vulnerability mapping more objective and by this means more comparable. An extension of the vulnerability concept to the rest of the hydrogeological cycle (lakes, rivers, and the sea) is proposed. Some recommendations regarding future activities are given.

  8. Differential maps, difference maps, interpolated maps, and long term prediction

    International Nuclear Information System (INIS)

    Talman, R.

    1988-06-01

    Mapping techniques may be thought to be attractive for the long term prediction of motion in accelerators, especially because a simple map can approximately represent an arbitrarily complicated lattice. The intention of this paper is to develop prejudices as to the validity of such methods by applying them to a simple, exactly solvable example. It is shown that a numerical interpolation map, such as can be generated in the accelerator tracking program TEAPOT, predicts the evolution more accurately than an analytically derived differential map of the same order. Even so, in the presence of "appreciable" nonlinearity, it is shown to be impractical to achieve "accurate" prediction beyond some hundreds of cycles of oscillation. This suggests that the value of nonlinear maps is restricted to the parameterization of only the "leading" deviation from linearity. 41 refs., 6 figs

  9. A minimum spanning forest based classification method for dedicated breast CT images

    International Nuclear Information System (INIS)

    Pike, Robert; Sechopoulos, Ioannis; Fei, Baowei

    2015-01-01

Purpose: To develop and test an automated algorithm to classify different types of tissue in dedicated breast CT images. Methods: Images of a single breast of five different patients were acquired with a dedicated breast CT clinical prototype. The breast CT images were processed by a multiscale bilateral filter to reduce noise while keeping edge information and were corrected to overcome cupping artifacts. As skin and glandular tissue have similar CT values on breast CT images, morphologic processing is used to identify the skin based on its position information. A support vector machine (SVM) is trained and the resulting model used to create a pixelwise classification map of fat and glandular tissue. By combining the results of the skin mask with the SVM results, the breast tissue is classified as skin, fat, and glandular tissue. This map is then used to identify markers for a minimum spanning forest that is grown to segment the image using spatial and intensity information. To evaluate the authors’ classification method, they use DICE overlap ratios to compare the results of the automated classification to those obtained by manual segmentation on five patient images. Results: Comparison between the automatic and the manual segmentation shows that the minimum spanning forest based classification method was able to successfully classify dedicated breast CT images with average DICE ratios of 96.9%, 89.8%, and 89.5% for fat, glandular, and skin tissue, respectively. Conclusions: A 2D minimum spanning forest based classification method was proposed and evaluated for classifying the fat, skin, and glandular tissue in dedicated breast CT images. The classification method can be used for dense breast tissue quantification, radiation dose assessment, and other applications in breast imaging
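The DICE overlap ratio used for the evaluation above is straightforward to compute. A minimal sketch (function name and example masks are ours, not from the paper):

```python
import numpy as np

def dice(seg_a, seg_b):
    """DICE overlap ratio between two binary masks: 2|A∩B| / (|A| + |B|)."""
    a = np.asarray(seg_a, dtype=bool)
    b = np.asarray(seg_b, dtype=bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

# Toy automatic vs. manual segmentation masks
auto   = np.array([[1, 1, 0], [0, 1, 0]])
manual = np.array([[1, 1, 0], [0, 0, 1]])
print(dice(auto, manual))  # 2*2 / (3+3) ≈ 0.667
```

A ratio of 1.0 means perfect agreement; the paper's reported values (e.g. 96.9% for fat) correspond to dice ≈ 0.969.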

  10. Ecological units of the Northern Region: Subsections

    Science.gov (United States)

    John A. Nesser; Gary L. Ford; C. Lee Maynard; Debbie Dumroese

    1997-01-01

    Ecological units are described at the subsection level of the Forest Service National Hierarchical Framework of Ecological Units. A total of 91 subsections are delineated on the 1996 map "Ecological Units of the Northern Region: Subsections," based on physical and biological criteria. This document consists of descriptions of the climate, geomorphology,...

  11. VEGETATION MAPPING IN WETLANDS

    Directory of Open Access Journals (Sweden)

    F. PEDROTTI

    2004-01-01

Full Text Available The current work examines the main aspects of wetland vegetation mapping, which can be summarized as: analysis of the ecological-vegetational (ecotone) gradients; vegetation complexes; relationships between vegetation distribution and geomorphology; vegetation of the hydrographic basin to which the wetland in question belongs; and vegetation monitoring with the help of four vegetation maps: a phytosociological map of the real and potential vegetation, a map of vegetation dynamical tendencies, and a map of vegetation series.

  12. Timed Fast Exact Euclidean Distance (tFEED) maps

    NARCIS (Netherlands)

    Kehtarnavaz, Nasser; Schouten, Theo E.; Laplante, Philip A.; Kuppens, Harco; van den Broek, Egon

    2005-01-01

In image and video analysis, distance maps are frequently used. They provide the (Euclidean) distance (ED) of background pixels to the nearest object pixel. In a naive implementation, each object pixel feeds its (exact) ED to each background pixel; the minimum of these values then denotes the ED to the nearest object pixel.
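The naive implementation described above can be sketched as follows (brute force, cost proportional to object pixels × background pixels; fast methods such as the paper's tFEED maps exist precisely to avoid this cost):

```python
import numpy as np

def naive_edt(image):
    """Naive exact Euclidean distance map: for each background pixel (0),
    the distance to the nearest object pixel (1). Object pixels get 0."""
    obj = np.argwhere(image == 1)            # coordinates of object pixels
    dist = np.zeros(image.shape)
    for (r, c), v in np.ndenumerate(image):
        if v == 0:
            # minimum over the EDs "fed" by every object pixel
            dist[r, c] = np.sqrt(((obj - (r, c)) ** 2).sum(axis=1)).min()
    return dist

img = np.array([[1, 0, 0],
                [0, 0, 0]])
print(naive_edt(img))
```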

  13. Minimum and Full Fluidization Velocity for Alumina Used in the Aluminum Smelter

    Directory of Open Access Journals (Sweden)

    Paulo Douglas S. de Vasconcelos

    2011-11-01

Full Text Available Fluidization is an engineering unit operation that occurs when a fluid (liquid or gas) ascends through a bed of particles and the particles reach the minimum fluidization velocity needed to stay in suspension, without being carried away in the ascending flow. From this moment the powder behaves like a liquid at its boiling point, hence the term “fluidization”. This operation is widely used in aluminum smelter processes, for gas dry scrubbing (mass transfer) and, in modern plants, for continuous alumina pot feeding (particle momentum transfer). Understanding the rheology of the alumina fluoride is of vital importance in the design of fluidized beds for gas treatment and of fluidized pipelines for pot feeding. This paper presents experimental and theoretical values of the minimum and full fluidization velocities for the alumina fluoride used to design the state-of-the-art round non-metallic air-fluidized conveyor with multiple outlets.
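The paper derives its own experimental and theoretical values. As a generic first estimate only (not the authors' model), the widely used Wen & Yu (1966) correlation gives the minimum fluidization velocity from particle and gas properties; the property values below are illustrative assumptions for fine alumina in air, not data from the paper:

```python
import math

def u_mf_wen_yu(d_p, rho_p, rho_g, mu, g=9.81):
    """Minimum fluidization velocity [m/s] via the Wen & Yu correlation:
    Re_mf = sqrt(33.7^2 + 0.0408*Ar) - 33.7, Ar = Archimedes number."""
    ar = rho_g * (rho_p - rho_g) * g * d_p ** 3 / mu ** 2  # Archimedes number
    re_mf = math.sqrt(33.7 ** 2 + 0.0408 * ar) - 33.7      # Reynolds at U_mf
    return re_mf * mu / (rho_g * d_p)

# Assumed values: 80 µm particles, 3500 kg/m³ solid, air at ambient conditions
print(u_mf_wen_yu(d_p=80e-6, rho_p=3500.0, rho_g=1.2, mu=1.8e-5))
```

For fine smelter-grade alumina this yields U_mf on the order of millimeters per second, which is why very low air flows suffice to fluidize pot-feeding pipelines.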

  14. Interface unit

    NARCIS (Netherlands)

    Keyson, D.V.; Freudenthal, A.; De Hoogh, M.P.A.; Dekoven, E.A.M.

    2001-01-01

    The invention relates to an interface unit comprising at least a display unit for communication with a user, which is designed for being coupled with a control unit for at least one or more parameters in a living or working environment, such as the temperature setting in a house, which control unit

  15. Setting a minimum age for juvenile justice jurisdiction in California.

    Science.gov (United States)

    S Barnert, Elizabeth; S Abrams, Laura; Maxson, Cheryl; Gase, Lauren; Soung, Patricia; Carroll, Paul; Bath, Eraka

    2017-03-13

    Purpose Despite the existence of minimum age laws for juvenile justice jurisdiction in 18 US states, California has no explicit law that protects children (i.e. youth less than 12 years old) from being processed in the juvenile justice system. In the absence of a minimum age law, California lags behind other states and international practice and standards. The paper aims to discuss these issues. Design/methodology/approach In this policy brief, academics across the University of California campuses examine current evidence, theory, and policy related to the minimum age of juvenile justice jurisdiction. Findings Existing evidence suggests that children lack the cognitive maturity to comprehend or benefit from formal juvenile justice processing, and diverting children from the system altogether is likely to be more beneficial for the child and for public safety. Research limitations/implications Based on current evidence and theory, the authors argue that minimum age legislation that protects children from contact with the juvenile justice system and treats them as children in need of services and support, rather than as delinquents or criminals, is an important policy goal for California and for other national and international jurisdictions lacking a minimum age law. Originality/value California has no law specifying a minimum age for juvenile justice jurisdiction, meaning that young children of any age can be processed in the juvenile justice system. This policy brief provides a rationale for a minimum age law in California and other states and jurisdictions without one.

  16. Setting a minimum age for juvenile justice jurisdiction in California

    Science.gov (United States)

    Barnert, Elizabeth S.; Abrams, Laura S.; Maxson, Cheryl; Gase, Lauren; Soung, Patricia; Carroll, Paul; Bath, Eraka

    2018-01-01

    Purpose Despite the existence of minimum age laws for juvenile justice jurisdiction in 18 US states, California has no explicit law that protects children (i.e. youth less than 12 years old) from being processed in the juvenile justice system. In the absence of a minimum age law, California lags behind other states and international practice and standards. The paper aims to discuss these issues. Design/methodology/approach In this policy brief, academics across the University of California campuses examine current evidence, theory, and policy related to the minimum age of juvenile justice jurisdiction. Findings Existing evidence suggests that children lack the cognitive maturity to comprehend or benefit from formal juvenile justice processing, and diverting children from the system altogether is likely to be more beneficial for the child and for public safety. Research limitations/implications Based on current evidence and theory, the authors argue that minimum age legislation that protects children from contact with the juvenile justice system and treats them as children in need of services and support, rather than as delinquents or criminals, is an important policy goal for California and for other national and international jurisdictions lacking a minimum age law. Originality/value California has no law specifying a minimum age for juvenile justice jurisdiction, meaning that young children of any age can be processed in the juvenile justice system. This policy brief provides a rationale for a minimum age law in California and other states and jurisdictions without one. Paper type Conceptual paper PMID:28299968

  17. Minimum wall pressure coefficient of orifice plate energy dissipater

    Directory of Open Access Journals (Sweden)

    Wan-zheng Ai

    2015-01-01

    Full Text Available Orifice plate energy dissipaters have been successfully used in large-scale hydropower projects due to their simple structure, convenient construction procedure, and high energy dissipation ratio. The minimum wall pressure coefficient of an orifice plate can indirectly reflect its cavitation characteristics: the lower the minimum wall pressure coefficient is, the better the ability of the orifice plate to resist cavitation damage is. Thus, it is important to study the minimum wall pressure coefficient of the orifice plate. In this study, this coefficient and related parameters, such as the contraction ratio, defined as the ratio of the orifice plate diameter to the flood-discharging tunnel diameter; the relative thickness, defined as the ratio of the orifice plate thickness to the tunnel diameter; and the Reynolds number of the flow through the orifice plate, were theoretically analyzed, and their relationships were obtained through physical model experiments. It can be concluded that the minimum wall pressure coefficient is mainly dominated by the contraction ratio and relative thickness. The lower the contraction ratio and relative thickness are, the larger the minimum wall pressure coefficient is. The effects of the Reynolds number on the minimum wall pressure coefficient can be neglected when it is larger than 105. An empirical expression was presented to calculate the minimum wall pressure coefficient in this study.

  18. Geospatial field applications within United States Department of Agriculture, Veterinary Services.

    Science.gov (United States)

    FitzMaurice, Priscilla L; Freier, Jerome E; Geter, Kenneth D

    2007-01-01

    Epidemiologists, veterinary medical officers and animal health technicians within Veterinary Services (VS) are actively utilising global positioning system (GPS) technology to obtain positional data on livestock and poultry operations throughout the United States. Geospatial data, if acquired for monitoring and surveillance purposes, are stored within the VS Generic Database (GDB). If the information is collected in response to an animal disease outbreak, the data are entered into the Emergency Management Response System (EMRS). The Spatial Epidemiology group within the Centers for Epidemiology and Animal Health (CEAH) has established minimum data accuracy standards for geodata acquisition. To ensure that field-collected geographic coordinates meet these minimum standards, field personnel are trained in proper data collection procedures. Positional accuracy is validated with digital atlases, aerial photographs, Web-based parcel maps, or address geocoding. Several geospatial methods and technologies are under investigation for future use within VS. These include the direct transfer of coordinates from GPS receivers to computers, GPS-enabled digital cameras, tablet PCs, and GPS receivers preloaded with custom ArcGIS maps - all with the objective of reducing transcription and data entry errors and improving the ease of data collection in the field.

  19. [Assessment on the ecological suitability in Zhuhai City, Guangdong, China, based on minimum cumulative resistance model].

    Science.gov (United States)

    Li, Jian-fei; Li, Lin; Guo, Luo; Du, Shi-hong

    2016-01-01

Urban landscapes are spatially heterogeneous. Because constructive and ecological land present different resistance values to expansion, each land unit promotes or hinders the expansion of ecological land with different intensity. To compare the promoting and hindering effects within the same land unit, we first compared the minimum cumulative resistance values of the two functions, and then located the balance point between the two landscape processes under a common standard. Following the ecological principle of the minimum limiting factor, and taking the minimum cumulative resistance under the two expansion processes as the evaluation method for urban land ecological suitability, this research took Zhuhai City as the study area and estimated urban ecological suitability by a relative evaluation method using remote sensing images, field surveys, and statistical data. With the support of ArcGIS, five types of indicators (landscape type, ecological value, soil erosion sensitivity, geological disaster sensitivity, and ecological function) were selected as input parameters of the minimum cumulative resistance model to compute urban ecological suitability. The results showed that the ecological suitability of the whole of Zhuhai City falls into five levels: constructive expansion prohibited zone (10.1%), constructive expansion restricted zone (32.9%), key construction zone (36.3%), priority development zone (2.3%), and basic cropland (18.4%). The ecological suitability of the central area of Zhuhai City falls into four levels: constructive expansion prohibited zone (11.6%), constructive expansion restricted zone (25.6%), key construction zone (52.4%), and priority development zone (10.4%). Finally, we put forward a sustainable development framework for Zhuhai City according to these conclusions. On one hand, the government should strictly control the development of the urban center area. On the other hand, the

  20. Design of a minimum emittance nBA lattice

    Science.gov (United States)

    Lee, S. Y.

    1998-04-01

An attempt to design a minimum-emittance n-bend achromat (nBA) lattice has been made. One distinct feature is that dipoles with two different lengths were used. As a multiple-bend achromat, five-bend achromat lattices with six superperiods were designed. The obtained emittance is three times larger than the theoretical minimum. Tunes were chosen to avoid third-order resonances. In order to correct first- and second-order chromaticities, eight families of sextupoles were placed. The obtained emittance of the five-bend achromat lattices is almost equal to the minimum emittance of a five-bend achromat lattice consisting of dipoles with equal length.

  1. [Hospitals failing minimum volumes in 2004: reasons and consequences].

    Science.gov (United States)

    Geraedts, M; Kühnen, C; Cruppé, W de; Blum, K; Ohmann, C

    2008-02-01

    In 2004 Germany introduced annual minimum volumes nationwide on five surgical procedures: kidney, liver, stem cell transplantation, complex oesophageal, and pancreatic interventions. Hospitals that fail to reach the minimum volumes are no longer allowed to perform the respective procedures unless they raise one of eight legally accepted exceptions. The goal of our study was to investigate how many hospitals fell short of the minimum volumes in 2004, whether and how this was justified, and whether hospitals that failed the requirements experienced any consequences. We analysed data on meeting the minimum volume requirements in 2004 that all German hospitals were obliged to publish as part of their biannual structured quality reports. We performed telephone interviews: a) with all hospitals not achieving the minimum volumes for complex oesophageal, and pancreatic interventions, and b) with the national umbrella organisations of all German sickness funds. In 2004, one quarter of all German acute care hospitals (N=485) performed 23,128 procedures where minimum volumes applied. 197 hospitals (41%) did not meet at least one of the minimum volumes. These hospitals performed N=715 procedures (3.1%) where the minimum volumes were not met. In 43% of these cases the hospitals raised legally accepted exceptions. In 33% of the cases the hospitals argued using reasons that were not legally acknowledged. 69% of those hospitals that failed to achieve the minimum volumes for complex oesophageal and pancreatic interventions did not experience any consequences from the sickness funds. However, one third of those hospitals reported that the sickness funds addressed the issue and partially announced consequences for the future. The sickness funds' umbrella organisations stated that there were only sparse activities related to the minimum volumes and that neither uniform registrations nor uniform proceedings in case of infringements of the standards had been agreed upon. 
In spite of the

  2. On the road again: traffic fatalities and auto insurance minimums

    Directory of Open Access Journals (Sweden)

    Pavel A. Yakovlev

    2018-03-01

    Full Text Available Prior research on policy-induced moral hazard effects in the auto insurance market has focused on the impact of compulsory insurance, no-fault liability, and tort liability laws on traffic fatalities. In contrast, this paper examines the moral hazard effect of a previously overlooked policy variable: minimum auto insurance coverage. We hypothesize that state-mandated auto insurance minimums may “over-insure” some drivers, lowering their incentives to drive carefully. Using a longitudinal panel of American states from 1982 to 2006, we find that policy-induced increases in auto insurance minimums are associated with higher traffic fatality rates, ceteris paribus.

  3. Minimum-Cost Reachability for Priced Timed Automata

    DEFF Research Database (Denmark)

    Behrmann, Gerd; Fehnker, Ansgar; Hune, Thomas Seidelin

    2001-01-01

    This paper introduces the model of linearly priced timed automata as an extension of timed automata, with prices on both transitions and locations. For this model we consider the minimum-cost reachability problem: i.e. given a linearly priced timed automaton and a target state, determine...... the minimum cost of executions from the initial state to the target state. This problem generalizes the minimum-time reachability problem for ordinary timed automata. We prove decidability of this problem by offering an algorithmic solution, which is based on a combination of branch-and-bound techniques...
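On a finite weighted graph (no clocks), minimum-cost reachability reduces to a shortest-path search; the paper's contribution is extending this to the infinite symbolic state spaces of priced timed automata via branch-and-bound. A sketch of the finite special case only (names and example graph are ours):

```python
import heapq

def min_cost_reach(edges, source, target):
    """Minimum cost of reaching `target` from `source` on a finite priced
    graph (Dijkstra); returns None if the target is unreachable."""
    graph = {}
    for u, v, cost in edges:
        graph.setdefault(u, []).append((v, cost))
    best, queue = {}, [(0, source)]
    while queue:
        c, u = heapq.heappop(queue)
        if u in best:               # already settled with a lower cost
            continue
        best[u] = c
        if u == target:
            return c
        for v, w in graph.get(u, []):
            if v not in best:
                heapq.heappush(queue, (c + w, v))
    return None

edges = [("s", "a", 2), ("a", "t", 5), ("s", "t", 10)]
print(min_cost_reach(edges, "s", "t"))  # 7
```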

  4. Land suitability maps for waste disposal siting

    International Nuclear Information System (INIS)

    Hrasna, M.

    1996-01-01

    The suitability of geoenvironment for waste disposal depends mainly on its stability and on the danger of groundwater pollution. Besides them, on the land suitability maps for the given purpose also those factors of the factors of the geoenvironment and the landscape should be taken into account, which enable another way of the land use, such as mineral resources, water resources, fertile soils, nature reserves, etc. On the base of the relevant factors influence evaluation - suitable, moderately suitable and unsuitable territorial units are delimited on the maps. The different way of various scale maps compilation is applied, taken into account their different representing feasibilities. (authors)

  5. Hybrid Message-Embedded Cipher Using Logistic Map

    OpenAIRE

    Mishra, Mina; Mankar, V. H.

    2012-01-01

The proposed hybrid message-embedded scheme consists of a hill cipher combined with a message-embedded chaotic scheme. The message-embedded scheme, using a non-linear feedback shift register as the non-linear function and the 1-D logistic map as the chaotic map, is modified, analyzed, and tested for the avalanche property and for strength against known-plaintext and brute-force attacks. The parameter of the logistic map acts as a secret key. As we know, the minimum key space to resist a brute-force attack is 2^100, and it is ...
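As an illustration of the keystream idea only (the paper's actual scheme additionally embeds the message in a non-linear feedback shift register; all names below are ours), a logistic-map keystream with (x0, r) playing the role of the secret key might look like:

```python
def logistic_keystream(x0, r, n, skip=100):
    """Iterate the logistic map x -> r*x*(1-x), discard a transient,
    then emit one byte per iteration. (x0, r) act as the secret key."""
    x = x0
    for _ in range(skip):          # discard transient iterations
        x = r * x * (1 - x)
    out = []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return out

ks = logistic_keystream(x0=0.3141592, r=3.9999, n=8)
cipher = [p ^ k for p, k in zip(b"message!", ks)]       # XOR encrypt
plain = bytes(c ^ k for c, k in zip(cipher, ks))        # XOR decrypt
print(plain)  # b'message!'
```

Decryption works because XOR-ing twice with the same keystream is the identity; the sensitivity of the orbit to (x0, r) is what makes the map usable as a key-dependent generator.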

  6. Minimum Wages and Regional Disparity: An analysis on the evolution of price-adjusted minimum wages and their effects on firm profitability (Japanese)

    OpenAIRE

    MORIKAWA Masayuki

    2013-01-01

    This paper, using prefecture level panel data, empirically analyzes 1) the recent evolution of price-adjusted regional minimum wages and 2) the effects of minimum wages on firm profitability. As a result of rapid increases in minimum wages in the metropolitan areas since 2007, the regional disparity of nominal minimum wages has been widening. However, the disparity of price-adjusted minimum wages has been shrinking. According to the analysis of the effects of minimum wages on profitability us...

  7. Global 30m 2000-2014 Surface Water Dynamics Map Derived from All Landsat 5, 7, and 8

    Science.gov (United States)

    Hudson, A.; Hansen, M.

    2015-12-01

    Water is critical for human life, agriculture, and ecosystems. A better understanding of where it is and how it is changing will enable better management of this valuable resource and guide protection of sensitive ecological areas. Global water maps have typically been representations of surface water at one given time. However, there is both seasonal and interannual variability: rivers meander, lakes disappear, floods arise. To address this ephemeral nature of water, in this study University of Maryland has developed a method that analyzes every Landsat 5, 7, and 8 scene from 1999-2015 to produce global seasonal maps (Winter, Spring, Summer, Fall) of surface water dynamics from 2000-2014. Each Landsat scene is automatically classified into land, water, cloud, haze, shadow, and snow via a decision tree algorithm. The land and water observations are aggregated per pixel into percent occurrence of water in a 3 year moving window for each meteorological season. These annual water percentages form a curve for each season that is discretized into a continuous 3 band RGB map. Frequency of water observation and type of surface water change (loss, gain, peak, or dip) is clearly seen through brightness and hue respectively. Additional data layers include: the year the change began, peak year, minimum year, and the year the change process ended. Currently these maps have been created for 18 1°x1° test tiles scattered around the world, and a portion of the September-November map over Bangladesh is shown below. The entire Landsat archive from 1999-2015 will be processed through a partnership with Google Earth Engine to complete the global product in the coming months. In areas where there is sufficient satellite data density (e.g. the United States), this project could be expanded to 1984-2015. This study provides both scientific researchers and the public an understandable, temporally rich, and globally consistent map showing surface water changes over time.

  8. The Conterminous United States Mineral Appraisal Program; background information to accompany folio of geologic, geochemical, geophysical, and mineral resources maps of the Tonopah 1 by 2 degree Quadrangle, Nevada

    Science.gov (United States)

    John, David A.; Nash, J.T.; Plouff, Donald; Whitebread, D.H.

    1991-01-01

The Tonopah 1° by 2° quadrangle in south-central Nevada was studied by an interdisciplinary research team to appraise its mineral resources. The appraisal is based on geological, geochemical, and geophysical field and laboratory investigations, the results of which are published as a folio of maps, figures, and tables, with accompanying discussions. This circular provides background information on the investigations and integrates the information presented in the folio. The selected bibliography lists references to the geology, geochemistry, geophysics, and mineral deposits of the Tonopah 1° by 2° quadrangle.

  9. Expanding Thurston maps

    CERN Document Server

    Bonk, Mario

    2017-01-01

    This monograph is devoted to the study of the dynamics of expanding Thurston maps under iteration. A Thurston map is a branched covering map on a two-dimensional topological sphere such that each critical point of the map has a finite orbit under iteration. It is called expanding if, roughly speaking, preimages of a fine open cover of the underlying sphere under iterates of the map become finer and finer as the order of the iterate increases. Every expanding Thurston map gives rise to a fractal space, called its visual sphere. Many dynamical properties of the map are encoded in the geometry of this visual sphere. For example, an expanding Thurston map is topologically conjugate to a rational map if and only if its visual sphere is quasisymmetrically equivalent to the Riemann sphere. This relation between dynamics and fractal geometry is the main focus for the investigations in this work.

  10. Terrestrial Ecosystems - Land Surface Forms of the Conterminous United States

    Science.gov (United States)

    Cress, Jill J.; Sayre, Roger G.; Comer, Patrick; Warner, Harumi

    2009-01-01

    As part of an effort to map terrestrial ecosystems, the U.S. Geological Survey has generated land surface form classes to be used in creating maps depicting standardized, terrestrial ecosystem models for the conterminous United States, using an ecosystems classification developed by NatureServe . A biophysical stratification approach, developed for South America and now being implemented globally, was used to model the ecosystem distributions. Since land surface forms strongly influence the differentiation and distribution of terrestrial ecosystems, they are one of the key input layers in this biophysical stratification. After extensive investigation into various land surface form mapping methodologies, the decision was made to use the methodology developed by the Missouri Resource Assessment Partnership (MoRAP). MoRAP made modifications to Hammond's land surface form classification, which allowed the use of 30-meter source data and a 1-km2 window for analyzing the data cell and its surrounding cells (neighborhood analysis). While Hammond's methodology was based on three topographic variables, slope, local relief, and profile type, MoRAP's methodology uses only slope and local relief. Using the MoRAP method, slope is classified as gently sloping when more than 50 percent of the area in a 1-km2 neighborhood has slope less than 8 percent, otherwise the area is considered moderately sloping. Local relief, which is the difference between the maximum and minimum elevation in a neighborhood, is classified into five groups: 0-15 m, 16-30 m, 31-90 m, 91-150 m, and >150 m. The land surface form classes are derived by combining slope and local relief to create eight landform classes: flat plains (gently sloping and local relief = 90 m), low hills (not gently sloping and local relief = 150 m). However, in the USGS application of the MoRAP methodology, an additional local relief group was used (> 400 m) to capture additional local topographic variation. As a result, low
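Using only the two MoRAP inputs and the bin edges stated above (slope class from the 50%/8% rule and the five local-relief groups), the per-neighborhood classification can be sketched as follows (function name and example values are ours):

```python
import numpy as np

def classify_window(slope_pct, relief_m):
    """Classify one 1-km² neighborhood by the two MoRAP variables:
    slope class and local-relief group (bin edges as stated in the text)."""
    # Gently sloping when >50% of the neighborhood has slope < 8%
    gently = np.mean(np.asarray(slope_pct) < 8) > 0.5
    slope_class = "gently sloping" if gently else "moderately sloping"
    # Local relief = max - min elevation, binned into five groups
    edges = [15, 30, 90, 150]
    labels = ["0-15 m", "16-30 m", "31-90 m", "91-150 m", ">150 m"]
    relief_group = labels[np.searchsorted(edges, relief_m)]
    return slope_class, relief_group

print(classify_window(slope_pct=[3, 5, 12, 4], relief_m=42))
```

The eight landform classes are then derived by combining these two outputs, as described above.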

  11. Labour Market Regulations in China: Minimum Wage Policy | CRDI ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    At the same time, wage and income inequalities have grown significantly and wages have fallen. ... wages are set, and the wages' effects on employment and inequality. ... Impact of minimum wage on gender wage gaps in urban China.

  12. EFFECTS DISTRIBUTIVE THE WAGE MINIMUM IN MARKET OF LABOR CEARENSE

    Directory of Open Access Journals (Sweden)

    Joyciane Coelho Vasconcelos

    2015-11-01

Full Text Available This paper analyses the contribution of the minimum wage (MW) to the distribution of labor income in Ceará in the period 2002-2012. The research was based on the National Sample Survey (PNAD) of the Brazilian Institute of Geography and Statistics (IBGE). It used the simulation methodology proposed by DiNardo, Fortin and Lemieux (1996), based on estimated counterfactual kernel density functions. The simulations were performed for females and males. The decompositions revealed that the minimum wage, the degree of formalization, and personal attributes had de-concentrating impacts for both female and male workers. However, for women the de-concentrating effect of the minimum wage in the sample is more intense than for men. In summary, the simulations indicate the importance of the minimum wage in reducing the dispersion of labor income in recent years.

  13. Minimum uncertainty and squeezing in diffusion processes and stochastic quantization

    Science.gov (United States)

    Demartino, S.; Desiena, S.; Illuminati, Fabrizo; Vitiello, Giuseppe

    1994-01-01

    We show that uncertainty relations, as well as minimum uncertainty coherent and squeezed states, are structural properties for diffusion processes. Through Nelson stochastic quantization we derive the stochastic image of the quantum mechanical coherent and squeezed states.
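For reference, the standard relations these states saturate (with ħ the reduced Planck constant; the squeeze parameter s below is the usual convention, not notation taken from the paper):

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
% Coherent states saturate the bound with equal spreads
% (units with \hbar = m = \omega = 1):
\Delta x = \Delta p = \tfrac{1}{\sqrt{2}}
% Squeezed states saturate it with unequal spreads:
\Delta x = \tfrac{e^{-s}}{\sqrt{2}}, \qquad \Delta p = \tfrac{e^{+s}}{\sqrt{2}},
\qquad \Delta x\,\Delta p = \tfrac{1}{2}
```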

  14. Appearance of minimum on the curve of cerium melting

    International Nuclear Information System (INIS)

    Boguslavskij, Yu.Ya.; Grigor'ev, S.B.

    1986-01-01

    It is shown by means of simple and obvious thermodynamical considerations that the reduced stability line continues up to the solid phase boundary. The existence of this line causes the appearance of minimum on the fcc cerium melting curve

  15. Parameterization of ion channeling half-angles and minimum yields

    Energy Technology Data Exchange (ETDEWEB)

    Doyle, Barney L.

    2016-03-15

    A MS Excel program has been written that calculates ion channeling half-angles and minimum yields in cubic bcc, fcc and diamond lattice crystals. All of the tables and graphs in the three Ion Beam Analysis Handbooks that previously had to be manually looked up and read from were programed into Excel in handy lookup tables, or parameterized, for the case of the graphs, using rather simple exponential functions with different power functions of the arguments. The program then offers an extremely convenient way to calculate axial and planar half-angles, minimum yields, effects on half-angles and minimum yields of amorphous overlayers. The program can calculate these half-angles and minimum yields for 〈u v w〉 axes and [h k l] planes up to (5 5 5). The program is open source and available at (http://www.sandia.gov/pcnsc/departments/iba/ibatable.html).

  16. Solving Minimum Cost Multi-Commodity Network Flow Problem ...

    African Journals Online (AJOL)

    ADOWIE PERE

    2018-03-23

Mar 23, 2018 ... network-based modeling framework for integrated fixed and mobile ... Minimum Cost Network Flow Problem (MCNFP) and some ... Unmanned Aerial Vehicle Routing in Traffic Incident ... Ph.D. Thesis, Dept. of Surveying &.

  17. Allocation of optimal distributed generation using GA for minimum ...

    African Journals Online (AJOL)

    user

quality of supply and reliability, in turn extending equipment maintenance intervals and ... The performance of the method is tested on a 33-bus test system and ... minimum real power losses of the system by calculating DG size at different buses.

  18. Suggested benchmarks for shape optimization for minimum stress concentration

    DEFF Research Database (Denmark)

    Pedersen, Pauli

    2008-01-01

    Shape optimization for minimum stress concentration is vital, important, and difficult. New formulations and numerical procedures imply the need for good benchmarks. The available analytical shape solutions rely on assumptions that are seldom satisfied, so here, we suggest alternative benchmarks...

  19. Decoding Reed-Solomon Codes beyond half the minimum distance

    DEFF Research Database (Denmark)

    Høholdt, Tom; Nielsen, Rasmus Refslund

    1999-01-01

    We describe an efficient implementation of M. Sudan's algorithm for decoding Reed-Solomon codes beyond half the minimum distance. Furthermore, we calculate an upper bound on the probability of getting more than one codeword as output...

  20. USING GENETIC ALGORITHM TO SOLVE STEINER MINIMUM SPANNING TREE PROBLEM

    Directory of Open Access Journals (Sweden)

    Öznur İŞÇİ

    2006-03-01

    Full Text Available Genetic algorithms (GA) are stochastic search methods that produce solutions close to the optimum. Beyond its successful application to the traveling salesman problem, the GA is applicable to many combinatorial optimization problems, including quadratic assignment, allocation, job-shop scheduling, preparation of lesson/examination timetables, communication network planning, assembly line balancing, and minimum spanning trees. In this study a Java program is developed to solve the Steiner minimum spanning tree problem by genetic algorithm, and its performance is examined. In tests on problems previously reported in the literature, the GA approach recommended in this study obtained results close to the optimum. For the predetermined points in the study, length and gain are calculated for both the Steiner minimum spanning tree problem and the minimum spanning tree problem.
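
The GA itself is not reproduced in the record, but the objective it minimizes — total tree length, where well-placed Steiner points can beat the plain minimum spanning tree — is easy to illustrate. A minimal sketch using Prim's algorithm (the helper name and the equilateral-triangle example are illustrative assumptions, not the paper's implementation):

```python
import math

def mst_length(points):
    """Total length of the Euclidean minimum spanning tree (Prim's algorithm)."""
    n = len(points)
    in_tree = {0}
    total = 0.0
    while len(in_tree) < n:
        # cheapest edge from the current tree to any outside point
        d, j = min(
            (math.dist(points[i], points[j]), j)
            for i in in_tree
            for j in range(n)
            if j not in in_tree
        )
        total += d
        in_tree.add(j)
    return total

# Three terminals at the corners of a unit equilateral triangle:
# the plain MST uses two sides (length 2), while adding the Fermat
# point as a Steiner point shortens the tree to sqrt(3) ~ 1.732.
terminals = [(0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2)]
plain = mst_length(terminals)
steiner = mst_length(terminals + [(0.5, math.sqrt(3) / 6)])
print(round(plain, 3), round(steiner, 3))
```

A GA for the Steiner problem typically encodes candidate Steiner points in the chromosome and uses an MST routine like this one as the fitness evaluation.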

  1. Resident Assessment Instrument/Minimum Data Set (RAI/MDS)

    Data.gov (United States)

    Department of Veterans Affairs — The Resident Assessment Instrument/Minimum Data Set (RAI/MDS) is a comprehensive assessment and care planning process used by the nursing home industry since 1990 as...

  2. The debate on the economic effects of minimum wage legislation

    Directory of Open Access Journals (Sweden)

    Santos Miguel Ruesga-Benito

    2017-12-01

    Full Text Available The minimum wage was established in the first third of the last century. Since its creation it has been a focus of continuing controversy and an unfinished debate in economics. This work reviews the effects of the minimum wage on employment and other macroeconomic variables, from both theoretical and empirical perspectives. The method is based on a review of the literature and the main economic indicators. The central contribution of this paper is a general reflection on the theoretical and empirical debate on the minimum wage and its effects. The results show that some labor policies are taking into account the effects of austerity strategies, shifting attention towards the implementation or updating of minimum wages, in order to reduce the growing inequalities in the distribution of income, and even poverty levels.

  3. 77 FR 76979 - Pesticides; Revisions to Minimum Risk Exemption

    Science.gov (United States)

    2012-12-31

    ... industries such as animal feed (NAICS code 311119), cosmetics (NAICS code 325620), and soap and detergents... reporting of production to EPA. To meet the criteria for the minimum risk exemption, a pesticide must...

  4. Mapping in the cloud

    CERN Document Server

    Peterson, Michael P

    2014-01-01

    This engaging text provides a solid introduction to mapmaking in the era of cloud computing. It takes students through both the concepts and technology of modern cartography, geographic information systems (GIS), and Web-based mapping. Conceptual chapters delve into the meaning of maps and how they are developed, covering such topics as map layers, GIS tools, mobile mapping, and map animation. Methods chapters take a learn-by-doing approach to help students master application programming interfaces and build other technical skills for creating maps and making them available on the Internet. Th

  5. Mapping with Drupal

    CERN Document Server

    Palazzolo, Alan

    2011-01-01

    Build beautiful interactive maps on your Drupal website, and tell engaging visual stories with your data. This concise guide shows you how to create custom geographical maps from top to bottom, using Drupal 7 tools and out-of-the-box modules. You'll learn how mapping works in Drupal, with examples on how to use intuitive interfaces to map local events, businesses, groups, and other custom data. Although building maps with Drupal can be tricky, this book helps you navigate the system's complexities for creating sophisticated maps that match your site design. Get the knowledge and tools you ne

  6. Putting Portugal on the Map

    Directory of Open Access Journals (Sweden)

    João Ferrão

    2010-01-01

    Full Text Available This paper argues the need to “put Portugal on the map” in a double sense: in a prospective way, in order to place the country on the required map(s), which entails strategic vision and capacity for action; and in an analytical way, to enable us to understand Portugal from the map(s) it is part of, which presupposes a capacity to analyse and understand the current state of affairs. Drawing inspiration from the polymorphic vision of the spatialities of contemporary societies and economies defended by Jessop, Brenner and Jones (2008), we propose a unifying reference framework to “put Portugal on the map”, combining five elements: territory as a geographic location; territory as a unit of reference of the nation-state; places; geographic scales; and networks. The polymorphic nature of the spatialities that characterize, or should characterize, Portugal’s place in the world reflects several, even contradictory, ethical values, interests, preferences, and options. Accordingly, the supported polymorphic spatialities ought to stir up controversy based on knowledge and arguments that are solid from a theoretical and empirical stance, and should make explicit the objectives and values they are based on.

  7. Minimum emittance of isochronous rings for synchrotron light source

    CERN Document Server

    Shoji, Y

    1999-01-01

    Theoretically achievable minimum emittances of isochronous rings for synchrotron light sources are calculated. The rings discussed in this paper consist of isochronous and achromatic bending cells, isochronous TBA (triple bend achromat) cells with negative dispersion, isochronous TBA cells with inverse bends, or isochronous QBA (four bend achromat) cells. We show that the minimum emittances of these rings are roughly 2 to 3 times those of optimized non-isochronous rings.
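
For reference, the benchmark against which the factor of 2 to 3 is quoted is the theoretical minimum emittance (TME) of a standard, non-isochronous cell; in its textbook form (a standard expression, not taken from this record):

```latex
\varepsilon_{\min} = \frac{C_q \, \gamma^{2} \, \theta^{3}}{12\sqrt{15}\, J_x},
\qquad
C_q = \frac{55}{32\sqrt{3}}\,\frac{\hbar}{m_e c} \approx 3.84\times 10^{-13}\ \mathrm{m},
```

where θ is the bending angle per dipole, γ the Lorentz factor, and J_x the horizontal damping partition number.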

  8. Great expectations: Reservation wages and the minimum wage reform

    OpenAIRE

    Fedorets, Alexandra; Filatov, Alexey; Shupe, Cortnie

    2018-01-01

    We use the German Socio-Economic Panel to show that introducing a high-impact statutory minimum wage causes an increase in reservation wages of approximately 4 percent at the low end of the distribution. The shifts in reservation wages and observed wages due to the minimum wage reform are comparable in their magnitude. Additional results show that German citizens adjust their reservation wages more than immigrants. Moreover, suggestive evidence points to a compensation mechanism in which immi...

  9. Pay equity, minimum wage and equality at work

    OpenAIRE

    Rubery, Jill

    2003-01-01

    Reviews the underlying causes of pay discrimination embedded within the organization of the labour market and structures of pay and reward. Discusses the need to focus on pay equity as part of a general strategy of promoting equity and decent work, and examines the case for using minimum wage policies, in comparison to more targeted equal pay policies, to reduce gender pay inequity. Identifies potential obstacles to, or support for, such policies and describes experiences of the use of minimum wages...

  10. A method for minimum risk portfolio optimization under hybrid uncertainty

    Science.gov (United States)

    Egorova, Yu E.; Yazenin, A. V.

    2018-03-01

    In this paper, we investigate a minimum risk portfolio model under hybrid uncertainty when the profitability of financial assets is described by fuzzy random variables. According to Feng, the variance of a portfolio is defined as a crisp value. To aggregate fuzzy information the weakest (drastic) t-norm is used. We construct an equivalent stochastic problem of the minimum risk portfolio model and specify the stochastic penalty method for solving it.
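
The fuzzy-random machinery is not reproduced in the record, but the crisp core of a minimum-risk (minimum-variance) portfolio is standard. A minimal two-asset sketch (the closed form below is the textbook crisp case, not the authors' fuzzy model; the covariance numbers are illustrative):

```python
def min_variance_weights_2x2(cov):
    """Global minimum-variance weights for two assets:
    w1 = (s22 - s12) / (s11 + s22 - 2*s12), w2 = 1 - w1."""
    s11, s12, s22 = cov[0][0], cov[0][1], cov[1][1]
    w1 = (s22 - s12) / (s11 + s22 - 2 * s12)
    return [w1, 1.0 - w1]

def portfolio_variance(w, cov):
    """Quadratic form w' C w for a 2x2 covariance matrix."""
    return sum(w[i] * cov[i][j] * w[j] for i in range(2) for j in range(2))

cov = [[0.04, 0.01],
       [0.01, 0.09]]  # covariance matrix of two asset returns
w = min_variance_weights_2x2(cov)
print([round(x, 4) for x in w], round(portfolio_variance(w, cov), 6))
```

The fuzzy-random extension in the paper replaces the crisp covariance with an aggregate built from fuzzy random returns via the drastic t-norm, and solves the resulting problem with a stochastic penalty method.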

  11. A Minimum Spanning Tree Representation of Anime Similarities

    OpenAIRE

    Wibowo, Canggih Puspo

    2016-01-01

    In this work, a new way to represent Japanese animation (anime) is presented. We applied a minimum spanning tree to show the relations between anime. The distance between anime is calculated through three similarity measurements, namely crew, score histogram, and topic similarities. Finally, centralities are also computed to reveal the most significant anime. The result shows that the minimum spanning tree can be used to determine similar anime. Furthermore, by using centralities c...

  12. Energy and environmental norms on Minimum Vital Flux

    International Nuclear Information System (INIS)

    Maran, S.

    2008-01-01

    By the end of the year, the recommendations on the Minimum Vital Flow will come into force, and operators of hydroelectric power plants will be required to release part of the water from their diversions in order to protect river ecosystems. This article discusses the major energy and environmental consequences of these rules, reports some quantitative evaluations, and discusses proposals for overcoming the weaknesses of the approach used to estimate the Minimum Vital Flow [it

  13. Optimal ship forms for minimum total resistance in shallow water

    OpenAIRE

    Zhao, Lian-en

    1984-01-01

    Optimal ship forms for minimum total resistance in shallow water: An attempt is made to obtain shallow-water optimal ship forms for total resistance by means of "tent" function representation, under the constraints that the main dimensions of the ship and the water-line area are kept constant. The objective function in the quadratic programming is the sum of the wave-making resistance calculated by Sretenski's formula and the viscou...

  14. MINIMUM BRACING STIFFNESS FOR MULTI-COLUMN SYSTEMS: THEORY

    OpenAIRE

    ARISTIZÁBAL-OCHOA, J. DARÍO

    2011-01-01

    A method that determines the minimum bracing stiffness required by a multi-column elastic system to achieve non-sway buckling conditions is proposed. Equations that evaluate the required minimum stiffness of the lateral and torsional bracings and the corresponding "braced" critical buckling load for each column of the story level are derived using the modified stability functions. The following effects are included: 1) the types of end connections (rigid, semirigid, and simple); 2) the bluepr...

  15. Colacium Minimum (Euglenophyta), A New Epiphytic Species For Asia

    Directory of Open Access Journals (Sweden)

    Wołowski Konrad

    2015-12-01

    Full Text Available Colacium minimum Fott & Komárek, known so far from a few localities in Central Europe (Czech Republic), is reported here for the first time from Asia (Thailand). This epiphytic species was found growing on eight taxa of loricated euglenoids. The process of surface colonization of Trachelomonas Ehrenb. and Strombomonas Deflandre taxa by C. minimum in natural populations is briefly discussed and originally documented using LM and SEM.

  16. A theory of compliance with minimum wage legislation

    OpenAIRE

    Jellal, Mohamed

    2012-01-01

    In this paper, we introduce firm heterogeneity in the context of a model of non-compliance with minimum wage legislation. The introduction of heterogeneity in the ease with which firms can be monitored for non-compliance allows us to show that non-compliance will persist in sectors that are relatively difficult to monitor, despite the government implementing non-stochastic monitoring. Moreover, we show that the incentive not to comply is an increasing function of the level of the minimum wag...

  17. Towards a mathematical foundation of minimum-variance theory

    Energy Technology Data Exchange (ETDEWEB)

    Feng Jianfeng [COGS, Sussex University, Brighton (United Kingdom); Zhang Kewei [SMS, Sussex University, Brighton (United Kingdom); Wei Gang [Mathematical Department, Baptist University, Hong Kong (China)

    2002-08-30

    The minimum-variance theory, which accounts for arm and eye movements with noisy signal inputs, was proposed by Harris and Wolpert (1998 Nature 394 780-4). Here we present a detailed theoretical analysis of the theory, and analytical solutions are obtained. Furthermore, we propose a new version of the minimum-variance theory which is more realistic for a biological system. For the new version we show numerically that the variance is considerably reduced. (author)
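
In outline, the Harris-Wolpert framework chooses the motor command that minimizes post-movement positional variance under signal-dependent noise; schematically (a sketch of the standard formulation, not the authors' new version):

```latex
\min_{u(t)} \ \int_{T}^{T+P} \operatorname{Var}\!\bigl[x(t)\bigr]\,dt,
\qquad
\dot{x}(t) = A\,x(t) + B\bigl(u(t) + w(t)\bigr),
\quad
\operatorname{sd}\!\bigl[w(t)\bigr] \propto \lvert u(t)\rvert,
```

where x is the state of the arm or eye, u the control signal, T the movement duration, and P the post-movement period over which the variance is penalized.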

  18. Minimum Wage Policy and Country’s Technical Efficiency

    OpenAIRE

    Karim, Mohd Zaini Abd; Chan, Sok-Gee; Hassan, Sallahuddin

    2016-01-01

    Recently, the government has decided that Malaysia would introduce a minimum wage policy. However, some quarters argued against the idea of a nationwide minimum wage asserting that it will lead to an increase in the cost of doing business and thus will hurt Malaysian competitiveness. Although standard economic theory unambiguously implies that wage floors have a negative impact on employment, the existing empirical literature is not so clear. Some studies have found the expected negative impa...

  19. The Einstein-Hilbert gravitation with minimum length

    Science.gov (United States)

    Louzada, H. L. C.

    2018-05-01

    We study Einstein-Hilbert gravitation with the deformed Heisenberg algebra leading to a minimum length, with the intention of finding and estimating the corrections in this theory, clarifying whether or not it is possible to obtain, by means of the minimum length, a theory in D=4 which is causal, unitary, and provides a massive graviton. To this end, we calculate and analyze the dispersion relations of the considered theory.
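
The deformed Heisenberg algebra referred to here is, in its simplest one-parameter form (a standard generalized-uncertainty-principle expression, assumed rather than quoted from the paper):

```latex
[\hat{x}, \hat{p}] = i\hbar\bigl(1 + \beta \hat{p}^{2}\bigr)
\;\Longrightarrow\;
\Delta x \,\ge\, \frac{\hbar}{2}\!\left(\frac{1}{\Delta p} + \beta\,\Delta p\right)
\;\Longrightarrow\;
\Delta x_{\min} = \hbar\sqrt{\beta},
```

so the deformation parameter β sets the minimal observable length.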

  20. Meso(topo)climatic maps and mapping

    Directory of Open Access Journals (Sweden)

    Ladislav Plánka

    2007-06-01

    Full Text Available Atmospheric characteristics can be studied from many points of view, most often the temporal and the spatial. The temporal standpoint leads either to the production of various synoptic and prognostic maps, which present the actual state of the atmosphere over a short time interval in the past or the near future, or to the production of climatic maps, which present the long-term weather regime. The spatial standpoint distinguishes map works according to the proportions of natural phenomena, whereas the scale of their graphic presentation can differ, depending on the purpose of each work. The paper analyses methods of mapping and climatic map production that display the long-term regime of chosen atmospheric features. These atmospheric features are formed in interaction with the land surface and directly influence people and their activities throughout the country, while at the same time being influenced by anthropogenic interventions in the landscape.