WorldWideScience

Sample records for quantifying global tolerance

  1. Quantifying global exergy resources

    International Nuclear Information System (INIS)

    Hermann, Weston A.

    2006-01-01

    Exergy is used as a common currency to assess and compare the reservoirs of theoretically extractable work we call energy resources. Resources consist of matter or energy with properties different from the predominant conditions in the environment. These differences can be classified as physical, chemical, or nuclear exergy. This paper identifies the primary exergy reservoirs that supply exergy to the biosphere and quantifies the intensive and extensive exergy of their derivative secondary reservoirs, or resources. The interconnecting accumulations and flows among these reservoirs are illustrated to show the path of exergy through the terrestrial system from input to its eventual natural or anthropogenic destruction. The results are intended to assist in evaluation of current resource utilization, help guide fundamental research to enable promising new energy technologies, and provide a basis for comparing the resource potential of future energy options that is independent of technology and cost.
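    The exergy bookkeeping described above rests on standard definitions. As a hedged illustration (textbook forms, not equations taken from this paper), the physical and chemical exergy of a material stream relative to an environmental dead state at temperature T_0 can be written:

```latex
% Physical (thermomechanical) exergy of a stream with enthalpy h and
% entropy s, relative to the dead state (h_0, s_0) at temperature T_0:
e_{\mathrm{ph}} = (h - h_0) - T_0\,(s - s_0)

% Chemical exergy of an ideal mixture of species with mole fractions x_i
% and standard chemical exergies e_{\mathrm{ch},i} (R is the gas constant):
e_{\mathrm{ch}} = \sum_i x_i\, e_{\mathrm{ch},i} + R\,T_0 \sum_i x_i \ln x_i
```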

  2. Quantifying the impacts of global disasters

    Science.gov (United States)

    Jones, L. M.; Ross, S.; Wilson, R. I.; Borrero, J. C.; Brosnan, D.; Bwarie, J. T.; Geist, E. L.; Hansen, R. A.; Johnson, L. A.; Kirby, S. H.; Long, K.; Lynett, P. J.; Miller, K. M.; Mortensen, C. E.; Perry, S. C.; Porter, K. A.; Real, C. R.; Ryan, K. J.; Thio, H. K.; Wein, A. M.; Whitmore, P.; Wood, N. J.

    2012-12-01

    The US Geological Survey, National Oceanic and Atmospheric Administration, California Geological Survey, and other entities are developing a Tsunami Scenario, depicting a realistic outcome of a hypothetical but plausible large tsunami originating in the eastern Aleutian Arc, affecting the west coast of the United States, including Alaska and Hawaii. The scenario includes earth-science effects, damage and restoration of the built environment, and social and economic impacts. Like the earlier ShakeOut and ARkStorm disaster scenarios, the purpose of the Tsunami Scenario is to apply science to quantify the impacts of natural disasters in a way that can be used by decision makers in the affected sectors to reduce the potential for loss. Most natural disasters are local. A major hurricane can destroy a city or damage a long swath of coastline while mostly sparing inland areas. The largest earthquake on record caused strong shaking along 1500 km of Chile, but left the capital relatively unscathed. Previous scenarios have used the local nature of disasters to focus interaction with the user community. However, the capacity for global disasters is growing with the interdependency of the global economy. Earthquakes have disrupted global computer chip manufacturing and caused stock market downturns. Tsunamis, however, can be global in their extent and direct impact. Moreover, the vulnerability of seaports to tsunami damage can increase the global consequences. The Tsunami Scenario is trying to capture the widespread effects while maintaining the close interaction with users that has been one of the most successful features of the previous scenarios. The scenario tsunami occurs in the eastern Aleutians with a source similar to the 2011 Tohoku event. Geologic similarities support the argument that a Tohoku-like source is plausible in Alaska. It creates a major nearfield tsunami in the Aleutian arc and peninsula, a moderate tsunami in the US Pacific Northwest, large but not the

  3. Quantifying and Mapping Global Data Poverty.

    Science.gov (United States)

    Leidig, Mathias; Teeuw, Richard M

    2015-01-01

    Digital information technologies, such as the Internet, mobile phones and social media, provide vast amounts of data for decision-making and resource management. However, access to these technologies, as well as their associated software and training materials, is not evenly distributed: since the 1990s there has been concern about a "Digital Divide" between the data-rich and the data-poor. We present an innovative metric for evaluating international variations in access to digital data: the Data Poverty Index (DPI). The DPI is based on Internet speeds, numbers of computer owners and Internet users, mobile phone ownership and network coverage, as well as provision of higher education. The datasets used to produce the DPI are provided annually for almost all the countries of the world and can be freely downloaded. The index that we present in this 'proof of concept' study is the first to quantify and visualise the problem of global data poverty, using the most recent datasets, for 2013. The effects of severe data poverty, particularly limited access to geoinformatic data, free software and online training materials, are discussed in the context of sustainable development and disaster risk reduction. The DPI highlights countries where support is needed for improving access to the Internet and for the provision of training in geoinformatics. We conclude that the DPI is of value as a potential metric for monitoring the Sustainable Development Goals of the Sendai Framework for Disaster Risk Reduction.
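    The abstract does not specify how the DPI aggregates its indicators, so the following is only a hypothetical sketch of one common construction for composite indices: min-max normalisation of each indicator followed by an equal-weight average. Function names and toy data are invented for illustration.

```python
# Hypothetical sketch: aggregate country-level indicators into a 0-1
# composite index by min-max normalisation and equal-weight averaging.
# The actual DPI weighting scheme is not given in the abstract.

def min_max_normalise(values):
    """Rescale a dict of {country: raw value} to the 0-1 range."""
    lo, hi = min(values.values()), max(values.values())
    return {c: (v - lo) / (hi - lo) for c, v in values.items()}

def composite_index(indicators):
    """Equal-weight mean of several normalised indicators per country."""
    normalised = [min_max_normalise(ind) for ind in indicators]
    countries = normalised[0].keys()
    return {c: sum(n[c] for n in normalised) / len(normalised)
            for c in countries}

# Toy data: internet speed (Mbps) and mobile coverage (%) for 3 countries
speed = {"A": 50.0, "B": 10.0, "C": 30.0}
coverage = {"A": 99.0, "B": 40.0, "C": 80.0}
dpi = composite_index([speed, coverage])
```

    In practice a real index would also need to handle missing country data and choose weights deliberately; equal weighting is simply the most transparent default.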

  4. Quantifying and Mapping Global Data Poverty.

    Directory of Open Access Journals (Sweden)

    Mathias Leidig

    Digital information technologies, such as the Internet, mobile phones and social media, provide vast amounts of data for decision-making and resource management. However, access to these technologies, as well as their associated software and training materials, is not evenly distributed: since the 1990s there has been concern about a "Digital Divide" between the data-rich and the data-poor. We present an innovative metric for evaluating international variations in access to digital data: the Data Poverty Index (DPI). The DPI is based on Internet speeds, numbers of computer owners and Internet users, mobile phone ownership and network coverage, as well as provision of higher education. The datasets used to produce the DPI are provided annually for almost all the countries of the world and can be freely downloaded. The index that we present in this 'proof of concept' study is the first to quantify and visualise the problem of global data poverty, using the most recent datasets, for 2013. The effects of severe data poverty, particularly limited access to geoinformatic data, free software and online training materials, are discussed in the context of sustainable development and disaster risk reduction. The DPI highlights countries where support is needed for improving access to the Internet and for the provision of training in geoinformatics. We conclude that the DPI is of value as a potential metric for monitoring the Sustainable Development Goals of the Sendai Framework for Disaster Risk Reduction.

  5. Teaching Tolerance in a Globalized World

    NARCIS (Netherlands)

    Sandoval-Hernández, Andrés; Isac, Maria Magdalena; Miranda, Daniel

    2018-01-01

    This open access thematic report identifies factors and conditions that can help schools and education systems promote tolerance in a globalized world. The IEA’s International Civic and Citizenship Education Study (ICCS) is a comparative research program designed to investigate the ways in which young people

  6. Global tropospheric ozone modeling: Quantifying errors due to grid resolution

    Science.gov (United States)

    Wild, Oliver; Prather, Michael J.

    2006-06-01

    Ozone production in global chemical models is dependent on model resolution because ozone chemistry is inherently nonlinear, the timescales for chemical production are short, and precursors are artificially distributed over the spatial scale of the model grid. In this study we examine the sensitivity of ozone, its precursors, and its production to resolution by running a global chemical transport model at four different resolutions between T21 (5.6° × 5.6°) and T106 (1.1° × 1.1°) and by quantifying the errors in regional and global budgets. The sensitivity to vertical mixing through the parameterization of boundary layer turbulence is also examined. We find less ozone production in the boundary layer at higher resolution, consistent with slower chemical production in polluted emission regions and greater export of precursors. Agreement with ozonesonde and aircraft measurements made during the NASA TRACE-P campaign over the western Pacific in spring 2001 is consistently better at higher resolution. We demonstrate that the numerical errors in transport processes at a given resolution converge geometrically for a tracer at successively higher resolutions. The convergence in ozone production on progressing from T21 to T42, T63, and T106 resolution is likewise monotonic but indicates that there are still large errors at 120 km scales, suggesting that T106 resolution is too coarse to resolve regional ozone production. Diagnosing the ozone production and precursor transport that follow a short pulse of emissions over east Asia in springtime allows us to quantify the impacts of resolution on both regional and global ozone. Production close to continental emission regions is overestimated by 27% at T21 resolution, by 13% at T42 resolution, and by 5% at T106 resolution. However, subsequent ozone production in the free troposphere is not greatly affected. We find that the export of short-lived precursors such as NOx by convection is overestimated at coarse resolution.
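    The geometric-convergence claim can be made concrete with a small sketch (illustrative numbers only, not the paper's budgets): if the error at successively doubled resolutions shrinks by a roughly constant factor, the resolved limit can be estimated by Richardson extrapolation.

```python
# Sketch: if errors at successively doubled resolutions shrink
# geometrically, the ratio of consecutive errors is roughly constant
# and the resolved value can be estimated by Richardson extrapolation.
# Numbers below are illustrative, not taken from the paper.

def convergence_ratios(errors):
    """Ratios of consecutive errors; near-constant => geometric."""
    return [errors[i] / errors[i + 1] for i in range(len(errors) - 1)]

def richardson_limit(coarse, fine, ratio):
    """Extrapolate to infinite resolution assuming error ratio `ratio`."""
    return fine + (fine - coarse) / (ratio - 1)

# Illustrative regional ozone production estimates (arbitrary units)
# at four successively refined grids, converging to a true value of 100:
estimates = [127.0, 113.0, 106.5, 103.25]
errors = [e - 100.0 for e in estimates]       # 27, 13, 6.5, 3.25
ratios = convergence_ratios(errors)           # roughly constant near 2
limit = richardson_limit(estimates[-2], estimates[-1], ratio=2.0)
```

    The paper's 27% / 13% / 5% overestimates shrink in roughly this geometric fashion, which is what justifies extrapolating toward a resolved value.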

  7. Quantifying the global cellular thiol-disulfide status

    DEFF Research Database (Denmark)

    Hansen, Rosa E; Roth, Doris; Winther, Jakob R

    2009-01-01

    It is widely accepted that the redox status of protein thiols is of central importance to protein structure and folding and that glutathione is an important low-molecular-mass redox regulator. However, the total cellular pools of thiols and disulfides and their relative abundance have never been … determined. In this study, we have assembled a global picture of the cellular thiol-disulfide status in cultured mammalian cells. We have quantified the absolute levels of protein thiols, protein disulfides, and glutathionylated protein (PSSG) in all cellular protein, including membrane proteins. These data … cell types. However, when cells are exposed to a sublethal dose of the thiol-specific oxidant diamide, PSSG levels increase to >15% of all protein cysteine. Glutathione is typically characterized as the "cellular redox buffer"; nevertheless, our data show that protein thiols represent a larger active …

  8. Quantifying the global contribution of alcohol consumption to cardiomyopathy.

    Science.gov (United States)

    Manthey, Jakob; Imtiaz, Sameer; Neufeld, Maria; Rylett, Margaret; Rehm, Jürgen

    2017-05-25

    The global impact of alcohol consumption on deaths due to cardiomyopathy (CM) has not been quantified to date, even though CM contains a subcategory for alcoholic CM with an effect of heavy drinking over time as the postulated underlying causal mechanism. In this feasibility study, a model to estimate the alcohol-attributable fraction (AAF) of CM deaths based on alcohol exposure measures is proposed. A two-step model was developed based on aggregate-level data from 95 countries, including the most populous (data from 2013 or last available year). First, the crude mortality rate of alcoholic CM per 1,000,000 adults was predicted using a negative binomial regression based on prevalence of alcohol use disorders (AUD) and adult alcohol per capita consumption (APC) (n = 52 countries). Second, the proportion of alcoholic CM among all CM deaths (i.e., AAF) was predicted using a fractional response probit regression with alcoholic CM crude mortality rate (from Step 1), AUD prevalence, APC per drinker, and Global Burden of Disease region as predictors. Additional models repeated these steps by sex and for the wider Global Burden of Disease study definition of CM. There were strong correlations (>0.9) between the crude mortality rate of alcoholic CM and the AAFs, supporting the modeling strategy. In the first step, the population-weighted mean crude mortality rate was estimated at 8.4 alcoholic CM deaths per 1,000,000 (95% CI: 7.4-9.3). In the second step, the global AAFs were estimated at 6.9% (95% CI: 5.4-8.4%). Sex-specific figures suggested a lower AAF among females (2.9%, 95% CI: 2.3-3.4%) as compared to males (8.9%, 95% CI: 7.0-10.7%). Larger deviations between observed and predicted AAFs were found in Eastern Europe and Central Asia. The proposed model promises to fill the gap to include AAFs for CM into comparative risk assessments in the future. These predictions likely will be underestimates because of the stigma involved in all fully alcohol
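    The two headline quantities of the model can be sketched directly. The actual study fits a negative binomial regression and a fractional response probit on country-level covariates; the numbers below are invented to echo the headline figures, not the paper's data.

```python
# Minimal sketch of the two quantities the study models: a crude
# mortality rate per 1,000,000 adults and the alcohol-attributable
# fraction (AAF) of cardiomyopathy (CM) deaths. Toy numbers only.

def crude_rate_per_million(alcoholic_cm_deaths, adult_population):
    """Alcoholic CM deaths per 1,000,000 adults."""
    return 1_000_000 * alcoholic_cm_deaths / adult_population

def alcohol_attributable_fraction(alcoholic_cm_deaths, all_cm_deaths):
    """Share of all CM deaths attributable to alcohol."""
    return alcoholic_cm_deaths / all_cm_deaths

# Invented inputs chosen to land on the abstract's headline figures:
rate = crude_rate_per_million(420, 50_000_000)   # 8.4 per million
aaf = alcohol_attributable_fraction(69, 1_000)   # 0.069, i.e. 6.9%
```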

  9. Global tropospheric ozone modeling: Quantifying errors due to grid resolution

    OpenAIRE

    Wild, Oliver; Prather, Michael J

    2006-01-01

    Ozone production in global chemical models is dependent on model resolution because ozone chemistry is inherently nonlinear, the timescales for chemical production are short, and precursors are artificially distributed over the spatial scale of the model grid. In this study we examine the sensitivity of ozone, its precursors, and its production to resolution by running a global chemical transport model at four different resolutions between T21 (5.6° × 5.6°) and T106 (1.1° × 1.1°) and by quant...

  10. Quantifying the Global Marine Biogenic Nitrogen Oxides Emissions

    Science.gov (United States)

    Su, H.; Wang, S.; Lin, J.; Hao, N.; Poeschl, U.; Cheng, Y.

    2017-12-01

    Nitrogen oxides (NOx) are among the most important molecules in atmospheric chemistry and the nitrogen cycle. NOx over ocean areas is traditionally believed to originate from continental outflow or inter-continental shipping emissions. By comparing satellite observations (OMI) with global chemical transport model simulations (GEOS-Chem), we suggest that the model's underestimate of atmospheric NO2 columns over biologically active ocean areas can possibly be attributed to a biogenic source. Nitrification and denitrification in ocean water produce nitrite, which can be further reduced to NO through microbiological processes. We further report global distributions of marine biogenic NO emissions. The newly added emissions improve the agreement between satellite observations and model simulations over large areas. Our model simulations show that marine biogenic NO emissions increase the atmospheric oxidative capacity and aerosol formation rate, providing a closer link between atmospheric chemistry and ocean microbiology.

  11. Quantifying global soil carbon losses in response to warming.

    Science.gov (United States)

    Crowther, T W; Todd-Brown, K E O; Rowe, C W; Wieder, W R; Carey, J C; Machmuller, M B; Snoek, B L; Fang, S; Zhou, G; Allison, S D; Blair, J M; Bridgham, S D; Burton, A J; Carrillo, Y; Reich, P B; Clark, J S; Classen, A T; Dijkstra, F A; Elberling, B; Emmett, B A; Estiarte, M; Frey, S D; Guo, J; Harte, J; Jiang, L; Johnson, B R; Kröel-Dulay, G; Larsen, K S; Laudon, H; Lavallee, J M; Luo, Y; Lupascu, M; Ma, L N; Marhan, S; Michelsen, A; Mohan, J; Niu, S; Pendall, E; Peñuelas, J; Pfeifer-Meister, L; Poll, C; Reinsch, S; Reynolds, L L; Schmidt, I K; Sistla, S; Sokol, N W; Templer, P H; Treseder, K K; Welker, J M; Bradford, M A

    2016-11-30

    The majority of the Earth's terrestrial carbon is stored in the soil. If anthropogenic warming stimulates the loss of this carbon to the atmosphere, it could drive further planetary warming. Despite evidence that warming enhances carbon fluxes to and from the soil, the net global balance between these responses remains uncertain. Here we present a comprehensive analysis of warming-induced changes in soil carbon stocks by assembling data from 49 field experiments located across North America, Europe and Asia. We find that the effects of warming are contingent on the size of the initial soil carbon stock, with considerable losses occurring in high-latitude areas. By extrapolating this empirical relationship to the global scale, we provide estimates of soil carbon sensitivity to warming that may help to constrain Earth system model projections. Our empirical relationship suggests that global soil carbon stocks in the upper soil horizons will fall by 30 ± 30 petagrams of carbon to 203 ± 161 petagrams of carbon under one degree of warming, depending on the rate at which the effects of warming are realized. Under the conservative assumption that the response of soil carbon to warming occurs within a year, a business-as-usual climate scenario would drive the loss of 55 ± 50 petagrams of carbon from the upper soil horizons by 2050. This value is around 12-17 per cent of the expected anthropogenic emissions over this period. Despite the considerable uncertainty in our estimates, the direction of the global soil carbon response is consistent across all scenarios. This provides strong empirical support for the idea that rising temperatures will stimulate the net loss of soil carbon to the atmosphere, driving a positive land carbon-climate feedback that could accelerate climate change.

  12. Quantifying the Intercontinental and Global Reach and Effects of Pollution

    Science.gov (United States)

    Chatfield, Robert B.; Guo, Zitan

    2000-01-01

    The Atmospheric Chemistry Modeling Group is participating in an international effort to explore the projected interactions of the atmosphere with biota, human activity, and the natural environment over the next three decades. The group uses computer simulations and statistical analyses to compare theory and observations of the composition of the lower atmosphere. This study of global habitability change is part of a more ambitious activity to understand global habitability. This broad planetary understanding is central to planetary habitability, biomarker detection, and similar aspects of Astrobiology. The group has made highly detailed studies of immense intercontinental plumes that affect the chemistry of the global atmosphere, especially the region below the ozone (O3) layer whose chemical composition defines the conditions for healthy humans and the biosphere. For some decades there has been concern about the pollution from cities and industrial burning and its possible effect in increasing smog ozone, not only in continental regions, but also in plumes that spread downwind. Recently, there has been new concern about another kind of pollution plume. Projections for a greatly expanded aircraft fleet imply that there will be plumes of nitrogen oxides (NO(x)) from jet exhaust in the Northern Hemisphere downwind of major air traffic routes. Both of these are tied to large-scale O3 in the troposphere, where it is toxic to humans and plant tissues.

  13. Quantifying tolerance indicator values for common stream fish species of the United States

    Science.gov (United States)

    Meador, M.R.; Carlisle, D.M.

    2007-01-01

    The classification of fish species tolerance to environmental disturbance is often used as a means to assess ecosystem conditions. Its use, however, may be problematic because the approach to tolerance classification is based on subjective judgment. We analyzed fish and physicochemical data from 773 stream sites collected as part of the U.S. Geological Survey's National Water-Quality Assessment Program to calculate tolerance indicator values for 10 physicochemical variables using weighted averaging. Tolerance indicator values (TIVs) for ammonia, chloride, dissolved oxygen, nitrite plus nitrate, pH, phosphorus, specific conductance, sulfate, suspended sediment, and water temperature were calculated for 105 common fish species of the United States. Tolerance indicator values for specific conductance and sulfate were correlated (rho = 0.87), and thus, fish species may be co-tolerant to these water-quality variables. We integrated TIVs for each species into an overall tolerance classification for comparisons with judgment-based tolerance classifications. Principal components analysis indicated that the distinction between tolerant and intolerant classifications was determined largely by tolerance to suspended sediment, specific conductance, chloride, and total phosphorus. Factors such as water temperature, dissolved oxygen, and pH may not be as important in distinguishing between tolerant and intolerant classifications, but may help to segregate species classified as moderate. Empirically derived tolerance classifications were 58.8% in agreement with judgment-derived tolerance classifications. Canonical discriminant analysis revealed that few TIVs, primarily chloride, could discriminate among judgment-derived tolerance classifications of tolerant, moderate, and intolerant. To our knowledge, this is the first empirically based understanding of fish species tolerance for stream fishes in the United States.
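    The weighted-averaging step behind the tolerance indicator values can be sketched directly (toy data; the study's actual computation uses abundances and physicochemical measurements across 773 sites).

```python
# Sketch of the weighted-averaging step: a species' tolerance indicator
# value (TIV) for a variable is the abundance-weighted mean of that
# variable over all sites where the species occurs. Toy data only.

def tolerance_indicator_value(abundances, variable_values):
    """Abundance-weighted mean of an environmental variable."""
    total = sum(abundances)
    return sum(a * v for a, v in zip(abundances, variable_values)) / total

# Species counts at four sites and specific conductance (uS/cm) there:
counts = [10, 40, 40, 10]
conductance = [100.0, 300.0, 500.0, 700.0]
tiv = tolerance_indicator_value(counts, conductance)
```

    A species found mostly at high-conductance sites thus receives a high TIV for conductance, which is then combined with the other nine variables into the overall tolerance classification.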

  14. A New Point of View on the Relationship Between Global Solar Irradiation and Sunshine Quantifiers

    Czech Academy of Sciences Publication Activity Database

    Brabec, Marek; Badescu, V.; Dumitrescu, A.; Paulescu, M.

    2016-01-01

    Vol. 126, March (2016), pp. 252-263. ISSN 0038-092X. Institutional support: RVO:67985807. Keywords: global solar irradiation * sunshine quantifiers * sunshine number * Angstrom equation * statistical modeling * regression analysis. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 4.018, year: 2016

  15. Quantifying Spatial Variation in Ecosystem Services Demand : A Global Mapping Approach

    NARCIS (Netherlands)

    Wolff, S.; Schulp, C. J E; Kastner, T.; Verburg, P. H.

    2017-01-01

    Understanding the spatial-temporal variability in ecosystem services (ES) demand can help anticipate externalities of land use change. This study presents new operational approaches to quantify and map demand for three non-commodity ES on a global scale: animal pollination, wild medicinal plants and

  16. Quantifying the Effects of Historical Land Cover Conversion Uncertainty on Global Carbon and Climate Estimates

    Science.gov (United States)

    Di Vittorio, A. V.; Mao, J.; Shi, X.; Chini, L.; Hurtt, G.; Collins, W. D.

    2018-01-01

    Previous studies have examined land use change as a driver of global change, but the translation of land use change into land cover conversion has been largely unconstrained. Here we quantify the effects of land cover conversion uncertainty on the global carbon and climate system using the integrated Earth System Model. Our experiments use identical land use change data and vary land cover conversions to quantify associated uncertainty in carbon and climate estimates. Land cover conversion uncertainty is large, constitutes a 5 ppmv range in estimated atmospheric CO2 in 2004, and generates carbon uncertainty that is equivalent to 80% of the net effects of CO2 and climate and 124% of the effects of nitrogen deposition during 1850-2004. Additionally, land cover uncertainty generates differences in local surface temperature of over 1°C. We conclude that future studies addressing land use, carbon, and climate need to constrain and reduce land cover conversion uncertainties.

  17. Quantifying the causes of differences in tropospheric OH within global models

    Science.gov (United States)

    Nicely, Julie M.; Salawitch, Ross J.; Canty, Timothy; Anderson, Daniel C.; Arnold, Steve R.; Chipperfield, Martyn P.; Emmons, Louisa K.; Flemming, Johannes; Huijnen, Vincent; Kinnison, Douglas E.; Lamarque, Jean-François; Mao, Jingqiu; Monks, Sarah A.; Steenrod, Stephen D.; Tilmes, Simone; Turquety, Solene

    2017-02-01

    The hydroxyl radical (OH) is the primary daytime oxidant in the troposphere and provides the main loss mechanism for many pollutants and greenhouse gases, including methane (CH4). Global mean tropospheric OH differs by as much as 80% among various global models, for reasons that are not well understood. We use neural networks (NNs), trained using archived output from eight chemical transport models (CTMs) that participated in the Polar Study using Aircraft, Remote Sensing, Surface Measurements and Models, of Climate, Chemistry, Aerosols and Transport Model Intercomparison Project (POLMIP), to quantify the factors responsible for differences in tropospheric OH and resulting CH4 lifetime (τCH4) between these models. Annual average τCH4, for loss by OH only, ranges from 8.0 to 11.6 years for the eight POLMIP CTMs. The factors driving these differences were quantified by inputting 3-D chemical fields from one CTM into the trained NN of another CTM. Across all CTMs, the largest mean differences in τCH4 (ΔτCH4) result from variations in chemical mechanisms (ΔτCH4 = 0.46 years), the photolysis frequency (J) of O3 → O(1D) (0.31 years), local O3 (0.30 years), and CO (0.23 years). The ΔτCH4 due to CTM differences in NOx (NO + NO2) is relatively low (0.17 years), although large regional variation in OH between the CTMs is attributed to NOx. Differences in isoprene and J(NO2) have negligible overall effect on globally averaged tropospheric OH, although the extent of OH variations due to each factor depends on the model being examined. This study demonstrates that NNs can serve as a useful tool for quantifying why tropospheric OH varies between global models, provided that essential chemical fields are archived.
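    The field-swap attribution procedure can be illustrated with toy stand-ins for the trained neural networks. The linear surrogates and values below are invented; the study trains NNs on archived 3-D CTM output and swaps whole chemical fields between them.

```python
# Sketch of the field-swap attribution idea: given an emulator of each
# model's OH response, feed model B's input field through model A's
# emulator to estimate how much of the OH difference that field explains.
# The linear surrogates below are toy stand-ins for the trained NNs.

def surrogate_A(o3, j_o1d):
    """Toy emulator of CTM A's OH response (invented coefficients)."""
    return 2.0 * o3 + 5.0 * j_o1d

def surrogate_B(o3, j_o1d):
    """Toy emulator of CTM B's OH response (invented coefficients)."""
    return 2.5 * o3 + 4.0 * j_o1d

fields_A = {"o3": 40.0, "j_o1d": 3.0}   # toy chemical fields, CTM A
fields_B = {"o3": 50.0, "j_o1d": 2.0}   # toy chemical fields, CTM B

oh_A = surrogate_A(**fields_A)
oh_B = surrogate_B(**fields_B)
# Contribution of the O3 field alone: swap in B's O3, hold the rest.
swap_o3 = surrogate_A(o3=fields_B["o3"], j_o1d=fields_A["j_o1d"])
o3_contribution = swap_o3 - oh_A
```

    Repeating the swap for each input field decomposes the inter-model OH difference into per-factor contributions, which is how the per-factor lifetime differences quoted above are obtained.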

  18. Can We Use Regression Modeling to Quantify Mean Annual Streamflow at a Global-Scale?

    Science.gov (United States)

    Barbarossa, V.; Huijbregts, M. A. J.; Hendriks, J. A.; Beusen, A.; Clavreul, J.; King, H.; Schipper, A.

    2016-12-01

    Quantifying mean annual flow of rivers (MAF) at ungauged sites is essential for a number of applications, including assessments of global water supply, ecosystem integrity and water footprints. MAF can be quantified with spatially explicit process-based models, which might be overly time-consuming and data-intensive for this purpose, or with empirical regression models that predict MAF based on climate and catchment characteristics. Yet, regression models have mostly been developed at a regional scale and the extent to which they can be extrapolated to other regions is not known. In this study, we developed a global-scale regression model for MAF using observations of discharge and catchment characteristics from 1,885 catchments worldwide, ranging from 2 to 10⁶ km² in size. In addition, we compared the performance of the regression model with the predictive ability of the spatially explicit global hydrological model PCR-GLOBWB [van Beek et al., 2011] by comparing results from both models to independent measurements. We obtained a regression model explaining 89% of the variance in MAF based on catchment area, mean annual precipitation and air temperature, average slope and elevation. The regression model performed better than PCR-GLOBWB for the prediction of MAF, as root-mean-square error values were lower (0.29 - 0.38 compared to 0.49 - 0.57) and the modified index of agreement was higher (0.80 - 0.83 compared to 0.72 - 0.75). Our regression model can be applied globally at any point of the river network, provided that the input parameters are within the range of values employed in the calibration of the model. The performance is reduced for water scarce regions and further research should focus on improving this aspect of regression-based global hydrological models.

  19. Developing and testing a global-scale regression model to quantify mean annual streamflow

    Science.gov (United States)

    Barbarossa, Valerio; Huijbregts, Mark A. J.; Hendriks, A. Jan; Beusen, Arthur H. W.; Clavreul, Julie; King, Henry; Schipper, Aafke M.

    2017-01-01

    Quantifying mean annual flow of rivers (MAF) at ungauged sites is essential for assessments of global water supply, ecosystem integrity and water footprints. MAF can be quantified with spatially explicit process-based models, which might be overly time-consuming and data-intensive for this purpose, or with empirical regression models that predict MAF based on climate and catchment characteristics. Yet, regression models have mostly been developed at a regional scale and the extent to which they can be extrapolated to other regions is not known. In this study, we developed a global-scale regression model for MAF based on a dataset unprecedented in size, using observations of discharge and catchment characteristics from 1885 catchments worldwide, measuring between 2 and 10⁶ km². In addition, we compared the performance of the regression model with the predictive ability of the spatially explicit global hydrological model PCR-GLOBWB by comparing results from both models to independent measurements. We obtained a regression model explaining 89% of the variance in MAF based on catchment area and catchment averaged mean annual precipitation and air temperature, slope and elevation. The regression model performed better than PCR-GLOBWB for the prediction of MAF, as root-mean-square error (RMSE) values were lower (0.29-0.38 compared to 0.49-0.57) and the modified index of agreement (d) was higher (0.80-0.83 compared to 0.72-0.75). Our regression model can be applied globally to estimate MAF at any point of the river network, thus providing a feasible alternative to spatially explicit process-based global hydrological models.
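    The two skill scores quoted above can be written out directly. This is a minimal sketch with invented values; the modified index of agreement is assumed here to take Willmott's d1 form, which the abstract does not spell out.

```python
# Sketch of the two evaluation metrics quoted in the abstract: RMSE and
# a modified index of agreement d (Willmott's d1 form is assumed here).
# Observation and prediction values are invented for illustration.

def rmse(obs, pred):
    """Root-mean-square error between observations and predictions."""
    return (sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs)) ** 0.5

def modified_index_of_agreement(obs, pred):
    """Willmott's modified index d1: 1 is perfect agreement."""
    mean_obs = sum(obs) / len(obs)
    num = sum(abs(o - p) for o, p in zip(obs, pred))
    den = sum(abs(p - mean_obs) + abs(o - mean_obs)
              for o, p in zip(obs, pred))
    return 1.0 - num / den

obs = [1.0, 2.0, 3.0, 4.0]    # toy observed MAF values
pred = [1.1, 1.9, 3.2, 3.8]   # toy predicted MAF values
r = rmse(obs, pred)
d = modified_index_of_agreement(obs, pred)
```

    Unlike RMSE, the modified index is bounded and uses absolute rather than squared deviations, so it is less dominated by a few large errors.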

  20. Ozone Production in Global Tropospheric Models: Quantifying Errors due to Grid Resolution

    Science.gov (United States)

    Wild, O.; Prather, M. J.

    2005-12-01

    Ozone production in global chemical models is dependent on model resolution because ozone chemistry is inherently nonlinear, the timescales for chemical production are short, and precursors are artificially distributed over the spatial scale of the model grid. In this study we examine the sensitivity of ozone, its precursors, and its production to resolution by running a global chemical transport model at four different resolutions between T21 (5.6° × 5.6°) and T106 (1.1° × 1.1°) and by quantifying the errors in regional and global budgets. The sensitivity to vertical mixing through the parameterization of boundary layer turbulence is also examined. We find less ozone production in the boundary layer at higher resolution, consistent with slower chemical production in polluted emission regions and greater export of precursors. Agreement with ozonesonde and aircraft measurements made during the NASA TRACE-P campaign over the Western Pacific in spring 2001 is consistently better at higher resolution. We demonstrate that the numerical errors in transport processes at a given resolution converge geometrically for a tracer at successively higher resolutions. The convergence in ozone production on progressing from T21 to T42, T63 and T106 resolution is likewise monotonic but still indicates large errors at 120 km scales, suggesting that T106 resolution is still too coarse to resolve regional ozone production. Diagnosing the ozone production and precursor transport that follow a short pulse of emissions over East Asia in springtime allows us to quantify the impacts of resolution on both regional and global ozone. Production close to continental emission regions is overestimated by 27% at T21 resolution, by 13% at T42 resolution, and by 5% at T106 resolution, but subsequent ozone production in the free troposphere is less significantly affected.

  1. Relation of tolerance of ambiguity to global and specific paranormal experience.

    Science.gov (United States)

    Houran, J; Williams, C

    1998-12-01

    We examined the relationship of tolerance of ambiguity to several global factors and specific types of anomalous or paranormal experience. 107 undergraduate students completed MacDonald's 1970 AT-20 and the Anomalous Experiences Inventory of Kumar, Pekala, and Gallagher. Scores on the five subscales of the Anomalous Experiences Inventory correlated differently with tolerance of ambiguity. Global paranormal beliefs, abilities, experiences, and drug use were positively associated with tolerance of ambiguity, whereas a fear of paranormal experience showed a negative relation. The specific types of anomalous experiences that correlated with tolerance of ambiguity often involved internal or physiological experience, e.g., precognitive dreams, memories of reincarnation, visual apparitions, and vestibular alterations. We generally found no effects of age or sex. These results are consistent with the idea that some paranormal experiences are misattributions of internal experience to external ('paranormal') sources, a process analogous to mechanisms underpinning delusions and hallucinations.

  2. Plasticity in thermal tolerance has limited potential to buffer ectotherms from global warming

    Science.gov (United States)

    Gunderson, Alex R.; Stillman, Jonathon H.

    2015-01-01

    Global warming is increasing the overheating risk for many organisms, though the potential for plasticity in thermal tolerance to mitigate this risk is largely unknown. In part, this shortcoming stems from a lack of knowledge about global and taxonomic patterns of variation in tolerance plasticity. To address this critical issue, we test leading hypotheses for broad-scale variation in ectotherm tolerance plasticity using a dataset that includes vertebrate and invertebrate taxa from terrestrial, freshwater and marine habitats. Contrary to expectation, plasticity in heat tolerance was unrelated to latitude or thermal seasonality. However, plasticity in cold tolerance is associated with thermal seasonality in some habitat types. In addition, aquatic taxa have approximately twice the plasticity of terrestrial taxa. Based on the observed patterns of variation in tolerance plasticity, we propose that limited potential for behavioural plasticity (i.e. behavioural thermoregulation) favours the evolution of greater plasticity in physiological traits, consistent with the ‘Bogert effect’. Finally, we find that all ectotherms have relatively low acclimation in thermal tolerance and demonstrate that overheating risk will be minimally reduced by acclimation in even the most plastic groups. Our analysis indicates that behavioural and evolutionary mechanisms will be critical in allowing ectotherms to buffer themselves from extreme temperatures. PMID:25994676

  3. Emulation of a complex global aerosol model to quantify sensitivity to uncertain parameters

    Directory of Open Access Journals (Sweden)

    L. A. Lee

    2011-12-01

    Full Text Available Sensitivity analysis of atmospheric models is necessary to identify the processes that lead to uncertainty in model predictions, to help understand model diversity through comparison of driving processes, and to prioritise research. Assessing the effect of parameter uncertainty in complex models is challenging and often limited by CPU constraints. Here we present a cost-effective application of variance-based sensitivity analysis to quantify the sensitivity of a 3-D global aerosol model to uncertain parameters. A Gaussian process emulator is used to estimate the model output across multi-dimensional parameter space, using information from a small number of model runs at points chosen using a Latin hypercube space-filling design. Gaussian process emulation is a Bayesian approach that uses information from the model runs along with some prior assumptions about the model behaviour to predict model output everywhere in the uncertainty space. We use the Gaussian process emulator to calculate the percentage of expected output variance explained by uncertainty in global aerosol model parameters and their interactions. To demonstrate the technique, we show examples of cloud condensation nuclei (CCN) sensitivity to 8 model parameters in polluted and remote marine environments as a function of altitude. In the polluted environment 95 % of the variance of CCN concentration is described by uncertainty in the 8 parameters (excluding their interaction effects) and is dominated by the uncertainty in the sulphur emissions, which explains 80 % of the variance. However, in the remote region parameter interaction effects become important, accounting for up to 40 % of the total variance. Some parameters are shown to have a negligible individual effect but a substantial interaction effect. Such sensitivities would not be detected in the commonly used single parameter perturbation experiments, which would therefore underpredict total uncertainty. Gaussian process
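
    The emulator-based variance decomposition described above can be sketched in miniature. Everything below is illustrative: a three-parameter analytic function stands in for the expensive aerosol model, and first-order sensitivities are estimated from emulator predictions (assuming scikit-learn and SciPy are available; this is not the authors' setup):

    ```python
    import numpy as np
    from scipy.stats import qmc
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    rng = np.random.default_rng(0)

    # Cheap analytic stand-in for the expensive model: strong effect of
    # x0, weak effect of x1, and an x0*x2 interaction.
    def model(X):
        return 4.0 * X[:, 0] + 0.5 * X[:, 1] + 2.0 * X[:, 0] * X[:, 2]

    d = 3
    # A small Latin hypercube design stands in for the few affordable runs.
    X_train = qmc.LatinHypercube(d=d, seed=0).random(n=40)
    y_train = model(X_train)

    # The emulator is trained once, then queried cheaply everywhere.
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X_train, y_train)

    # First-order sensitivity of input i: variance over x_i of E[y | x_i],
    # as a fraction of total variance, estimated from emulator predictions.
    total_var = gp.predict(rng.random((2000, d))).var()
    sobol = []
    for i in range(d):
        cond_means = []
        for v in np.linspace(0.05, 0.95, 25):
            Xc = rng.random((200, d))
            Xc[:, i] = v                     # fix input i, vary the others
            cond_means.append(gp.predict(Xc).mean())
        sobol.append(np.var(cond_means) / total_var)
    print([f"S_{i}={s:.2f}" for i, s in enumerate(sobol)])
    ```

    The dominant input's index comes out far larger than the weak input's, and the shortfall of the summed first-order indices from 1 signals the interaction contribution, mirroring the paper's polluted-vs-remote contrast.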

  4. Tolerance

    DEFF Research Database (Denmark)

    Tønder, Lars

    Tolerance: A Sensorial Orientation to Politics is an experiment in re-orientation. The book is based on the wager that tolerance exceeds the more prevalent images of self-restraint and repressive benevolence because neither precludes the possibility of a more “active tolerance” motivated by the desire to experiment and to become otherwise. The objective is to discuss what gets lost, conceptually as well as politically, when we neglect the subsistence of active tolerance within other practices of tolerance, and to develop a theory of active tolerance in which tolerance's mobilizing character is linked to a different set of circumstances than the ones suggested by existing models in contemporary democratic theory. Reorienting the discussion of tolerance, the book raises the question of how to disclose new possibilities within our given context of affect and perception. Once we move away from…

  5. Quantified carbon input for maintaining existing soil organic carbon stocks in global wheat systems

    Science.gov (United States)

    Wang, G.

    2017-12-01

    Soil organic carbon (SOC) dynamics in croplands are a crucial component of the global carbon (C) cycle. Depending on local environmental conditions and management practices, a certain C input is generally required to reduce or reverse C loss in agricultural soils. No studies have quantified the critical C input for maintaining SOC at the global scale with high resolution. Such information would provide a baseline map for assessing soil C dynamics under potential changes in management practices and climate, and thus enable development of management strategies to reduce the C footprint from farm to regional scales. We used the soil C model RothC to simulate the critical C input rates needed to maintain existing soil C levels at 0.1° × 0.1° resolution in global wheat systems. On average, the critical C input was estimated to be 2.0 Mg C ha-1 yr-1, with large spatial variability depending on local soil and climatic conditions. Higher C inputs are required in wheat systems of the central United States and western Europe, mainly due to the higher current soil C stocks present in these regions. The critical C input could be effectively estimated using a summary model driven by current SOC level, mean annual temperature, precipitation, and soil clay content.
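
    The "critical input" concept reduces, in the simplest first-order view, to balancing decomposition: if a stock C* decays at rate k, the maintenance input is I = k·C*. The sketch below uses entirely hypothetical values (not RothC output), chosen so the result lands near the 2.0 Mg C ha-1 yr-1 average reported above:

    ```python
    # First-order SOC turnover: with decomposition rate k (yr^-1), the
    # input that exactly balances the loss from a stock C* is I = k * C*.
    # Both values below are hypothetical, for illustration only.
    k = 0.025        # assumed effective decomposition rate, yr^-1
    C_star = 80.0    # assumed current SOC stock, Mg C/ha
    I_crit = k * C_star   # input that holds the stock constant

    # Forward-Euler check over a century: with I = I_crit the stock is flat;
    # any smaller input would make it decline toward I/k.
    stock = C_star
    for _ in range(100):
        stock += I_crit - k * stock
    print(I_crit, stock)
    ```

    RothC itself partitions C among pools with different turnover rates, so the effective k varies with soil and climate, which is exactly the spatial variability the study maps.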

  6. Quantifying the impact of weather extremes on global food security: A spatial bio-economic approach

    Directory of Open Access Journals (Sweden)

    Sika Gbegbelegbe

    2014-08-01

    Full Text Available This study uses a spatial bio-economic modelling framework to estimate the impact of the 2012 weather extreme in the USA on food security in the developing world. The study also quantifies the potential effects of a similar weather extreme occurring in 2050 under climate change. The study results indicate that weather extremes that affect maize productivity in key grain baskets can negatively affect food security in vulnerable countries. The 2012 weather extreme which occurred in the USA reduced US maize production by 29% compared to trend; maize consumption in the country decreased by only 5%, and this resulted in less surplus maize for export from the largest maize exporter in the world. Global maize production decreased by 6% compared to trend. The decrease in global maize production coupled with a reduction in the volume of global maize exports worsened food insecurity in eastern Africa, the Caribbean and Central America, and India. The effects of the weather extreme on global food security would be worse if the latter were to occur under climate change in 2050, assuming no climate change adaptation worldwide over the years. In addition, the hardest-hit regions would remain the same whether the weather extreme occurs in 2012 or in 2050: Sub-Saharan Africa (SSA), South Asia, and the Latin America and Caribbean (LAC) region. However, sustained growth in per capita income across world economies between 2000 and 2050 would allow a few countries in SSA and the LAC region to virtually eliminate hunger within their borders. In these countries, per capita income would be high enough by 2050 to completely offset the negative effect of the weather extreme. The study results are also consistent with USDA's estimates on US and global maize production and consumption in 2012 after the weather extreme. 
Some discrepancy is found on the volume of global maize trade; this implies that the bio-economic model likely overestimates the effect of the

  7. Tolerance

    DEFF Research Database (Denmark)

    Tønder, Lars

    Tolerance: A Sensorial Orientation to Politics is an experiment in re-orientation. The book is based on the wager that tolerance exceeds the more prevalent images of self-restraint and repressive benevolence because neither precludes the possibility of a more “active tolerance” motivated by the d...... these alternatives by returning to the notion of tolerance as the endurance of pain, linking this notion to exemplars and theories relevant to the politics of multiculturalism, religious freedom, and free speech....

  8. An Example of Creative Drama Implementation in Values Education: Mevlana's Global Messages "Love-Respect-Tolerance"

    Science.gov (United States)

    Akhan, Nadire Emel; Altikulaç, Ali

    2014-01-01

    This study aims to discover how social studies teachers' personal and professional values can be improved by having a basis in Mevlana's global messages "love-respect-tolerance" in terms of "Personal and Professional Values-Professional Development", which is the first component of general efficacies of teaching profession. The…

  9. Quantifying the Influence of Global Warming on Unprecedented Extreme Climate Events

    Science.gov (United States)

    Diffenbaugh, Noah S.; Singh, Deepti; Mankin, Justin S.; Horton, Daniel E.; Swain, Daniel L.; Touma, Danielle; Charland, Allison; Liu, Yunjie; Haugen, Matz; Tsiang, Michael

    2017-01-01

    Efforts to understand the influence of historical global warming on individual extreme climate events have increased over the past decade. However, despite substantial progress, events that are unprecedented in the local observational record remain a persistent challenge. Leveraging observations and a large climate model ensemble, we quantify uncertainty in the influence of global warming on the severity and probability of the historically hottest month, hottest day, driest year, and wettest 5-d period for different areas of the globe. We find that historical warming has increased the severity and probability of the hottest month and hottest day of the year at >80% of the available observational area. Our framework also suggests that the historical climate forcing has increased the probability of the driest year and wettest 5-d period at 57% and 41% of the observed area, respectively, although we note important caveats. For the most protracted hot and dry events, the strongest and most widespread contributions of anthropogenic climate forcing occur in the tropics, including increases in probability of at least a factor of 4 for the hottest month and at least a factor of 2 for the driest year. We also demonstrate the ability of our framework to systematically evaluate the role of dynamic and thermodynamic factors such as atmospheric circulation patterns and atmospheric water vapor, and find extremely high statistical confidence that anthropogenic forcing increased the probability of record-low Arctic sea ice extent.

  10. Quantifying the influence of global warming on unprecedented extreme climate events.

    Science.gov (United States)

    Diffenbaugh, Noah S; Singh, Deepti; Mankin, Justin S; Horton, Daniel E; Swain, Daniel L; Touma, Danielle; Charland, Allison; Liu, Yunjie; Haugen, Matz; Tsiang, Michael; Rajaratnam, Bala

    2017-05-09

    Efforts to understand the influence of historical global warming on individual extreme climate events have increased over the past decade. However, despite substantial progress, events that are unprecedented in the local observational record remain a persistent challenge. Leveraging observations and a large climate model ensemble, we quantify uncertainty in the influence of global warming on the severity and probability of the historically hottest month, hottest day, driest year, and wettest 5-d period for different areas of the globe. We find that historical warming has increased the severity and probability of the hottest month and hottest day of the year at >80% of the available observational area. Our framework also suggests that the historical climate forcing has increased the probability of the driest year and wettest 5-d period at 57% and 41% of the observed area, respectively, although we note important caveats. For the most protracted hot and dry events, the strongest and most widespread contributions of anthropogenic climate forcing occur in the tropics, including increases in probability of at least a factor of 4 for the hottest month and at least a factor of 2 for the driest year. We also demonstrate the ability of our framework to systematically evaluate the role of dynamic and thermodynamic factors such as atmospheric circulation patterns and atmospheric water vapor, and find extremely high statistical confidence that anthropogenic forcing increased the probability of record-low Arctic sea ice extent.
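
    The core diagnostic in this framework is the change in probability of exceeding an observed record between a forced climate and a counterfactual one without historical warming. A minimal sketch on synthetic data follows; the Gaussian ensembles, the 0.8-unit warming shift, and the record threshold are illustrative assumptions, not values from the study:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Synthetic stand-ins for large ensembles of, e.g., hottest-month
    # temperature: a counterfactual climate and one shifted by warming.
    counterfactual = rng.normal(loc=0.0, scale=1.0, size=100_000)
    forced = rng.normal(loc=0.8, scale=1.0, size=100_000)

    record = 2.0  # an "unprecedented" threshold from the observed record

    p_counter = (counterfactual >= record).mean()
    p_forced = (forced >= record).mean()
    prob_ratio = p_forced / p_counter   # change in probability due to forcing
    print(f"P(forced)={p_forced:.4f}  P(counterfactual)={p_counter:.4f}  "
          f"ratio={prob_ratio:.1f}")
    ```

    Even a modest mean shift multiplies the exceedance probability of an extreme threshold severalfold, which is why the tail-sensitive probability ratio, rather than the mean change, is the quantity reported for events like the hottest month or driest year.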

  11. Quantifying global fossil-fuel CO2 emissions: from OCO-2 to optimal observing designs

    Science.gov (United States)

    Ye, X.; Lauvaux, T.; Kort, E. A.; Oda, T.; Feng, S.; Lin, J. C.; Yang, E. G.; Wu, D.; Kuze, A.; Suto, H.; Eldering, A.

    2017-12-01

    Cities house more than half of the world's population and are responsible for more than 70% of the world's anthropogenic CO2 emissions. Quantifying emissions from major cities, which number fewer than a hundred intense emitting spots across the globe, should therefore allow us to monitor changes in global fossil-fuel CO2 emissions in an independent, objective way. Satellite platforms provide favorable temporal and spatial coverage to collect urban CO2 data to quantify the anthropogenic contributions to the global carbon budget. We present here the optimal observation design for NASA's OCO-2 and Japan's GOSAT missions, based on real-data (i.e. OCO-2) experiments and Observing System Simulation Experiments (OSSEs) to address different error components in the urban CO2 budget calculation. We identify the major sources of emission uncertainties for various types of cities with different ecosystems and geographical features, such as urban plumes over flat terrains, accumulated enhancements within basins, and complex weather regimes in coastal areas. Atmospheric transport errors were characterized under various meteorological conditions using the Weather Research and Forecasting (WRF) model at 1-km spatial resolution, coupled to the Open-source Data Inventory for Anthropogenic CO2 (ODIAC) emissions. We propose and discuss optimized urban sampling strategies to address difficulties arising from seasonality in cloud cover and emissions and from vegetation density in and around cities, and to address the daytime sampling bias using prescribed diurnal cycles. These factors are combined in pseudo-data experiments in which we evaluate the relative impact of uncertainties on inverse estimates of CO2 emissions for cities across latitudinal and climatological zones. 
We propose here several sampling strategies to minimize the uncertainties in target mode for tracking urban fossil-fuel CO2 emissions over the globe for future satellite missions, such as OCO-3 and future

  12. The “Bad Labor” Footprint: Quantifying the Social Impacts of Globalization

    Directory of Open Access Journals (Sweden)

    Moana S. Simas

    2014-10-01

    Full Text Available The extent to which bad labor conditions across the globe are associated with international trade is unknown. Here, we quantify the bad labor conditions associated with consumption in seven world regions, the “bad labor” footprint. In particular, we analyze how much occupational health damage, vulnerable employment, gender inequality, share of unskilled workers, child labor, and forced labor is associated with the production of internationally traded goods. Our results show that (i) as expected, there is a net flow of bad labor conditions from developing to developed regions; (ii) the production of exported goods in lower income regions contributes to more than half of the bad labor footprints caused by the wealthy lifestyles of affluent regions; (iii) exports from Asia constitute the largest global trade flow measured in the amount of bad labor, while exports from Africa carry the largest burden of bad labor conditions per unit value traded and per unit of total labor required; and (iv) the trade of food products stands out in both volume and intensity of bad labor conditions.
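
    Footprint accounting of this kind typically rests on a Leontief input-output calculation: total output x needed to satisfy final demand y solves x = (I − A)⁻¹y, and the footprint is a labor-intensity vector applied to x. A minimal two-region sketch, with entirely hypothetical coefficients and intensities (the study itself uses a full multi-regional input-output database):

    ```python
    import numpy as np

    # Two-region, one-sector sketch: region 0 is the consuming "developed"
    # region, region 1 the exporting "developing" region. All numbers are
    # hypothetical, chosen only to illustrate the mechanics.
    A = np.array([[0.10, 0.05],    # technical coefficients: inputs needed
                  [0.20, 0.15]])   # per unit of output, by producing region
    y = np.array([100.0, 20.0])    # final demand, by producing region

    # Total output required along the whole supply chain: x = (I - A)^-1 y
    x = np.linalg.solve(np.eye(2) - A, y)

    bad_labor_intensity = np.array([0.5, 3.0])  # bad-labor units per output
    footprint = bad_labor_intensity @ x         # embodied "bad labor"
    print(f"output by region: {x.round(1)}, footprint: {footprint:.1f}")
    ```

    Note that region 1's required output exceeds its own final demand: the difference is production pulled in through trade, which is exactly the flow of bad labor conditions from developing to developed regions the study quantifies.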

  13. Glucose Tolerance, MTHFR C677T and NOS3 G894T Polymorphisms, and Global DNA Methylation in Mixed Ancestry African Individuals

    Directory of Open Access Journals (Sweden)

    Tandi E. Matsha

    2016-01-01

    Full Text Available The aim of this study is to quantify global DNA methylation and investigate the relationship with diabetes status and polymorphisms in MTHFR C677T and NOS3 G894T genes in mixed ancestry subjects from South Africa. Global DNA methylation was measured, and MTHFR rs1801133 and NOS3 rs1799983 polymorphisms were genotyped using high throughput real-time polymerase chain reaction and direct DNA sequencing. Of the 564 participants, 158 (28%) individuals had T2DM, of which 97 (17.2%) were screen-detected cases. Another 119 (21.1%) had prediabetes, that is, impaired fasting glucose, impaired glucose tolerance, or the combination of both, and the remainder 287 (50.9%) had normal glucose tolerance. Global DNA methylation was significantly higher in prediabetes and screen-detected diabetes than in normal glucose tolerance (both p≤0.033) and in screen-detected diabetes compared to known diabetes on treatment (p=0.019). There was no difference in global DNA methylation between known diabetes on treatment and normal glucose tolerance (p>0.999). In multivariable linear regression analysis, only NOS3 was associated with increasing global DNA methylation (β=0.943; 95% CI: 0.286 to 1.560). The association of global DNA methylation with screen-detected diabetes but not treated diabetes suggests that glucose control agents to some extent may be reversing DNA methylation. The association between NOS3 rs1799983 polymorphisms and DNA methylation suggests gene-epigenetic mechanisms through which vascular diabetes complications develop despite adequate metabolic control.

  14. Glucose Tolerance, MTHFR C677T and NOS3 G894T Polymorphisms, and Global DNA Methylation in Mixed Ancestry African Individuals

    Science.gov (United States)

    Mutize, Tinashe; Erasmus, Rajiv T.

    2016-01-01

    The aim of this study is to quantify global DNA methylation and investigate the relationship with diabetes status and polymorphisms in MTHFR C677T and NOS3 G894T genes in mixed ancestry subjects from South Africa. Global DNA methylation was measured, and MTHFR rs1801133 and NOS3 rs1799983 polymorphisms were genotyped using high throughput real-time polymerase chain reaction and direct DNA sequencing. Of the 564 participants, 158 (28%) individuals had T2DM of which 97 (17.2%) were screen-detected cases. Another 119 (21.1%) had prediabetes, that is, impaired fasting glucose, impaired glucose tolerance, or the combination of both, and the remainder 287 (50.9%) had normal glucose tolerance. Global DNA methylation was significantly higher in prediabetes and screen-detected diabetes than in normal glucose tolerance (both p ≤ 0.033) and in screen-detected diabetes compared to known diabetes on treatment (p = 0.019). There was no difference in global DNA methylation between known diabetes on treatment and normal glucose tolerance (p > 0.999). In multivariable linear regression analysis, only NOS3 was associated with increasing global DNA methylation (β = 0.943; 95% CI: 0.286 to 1.560). The association of global DNA methylation with screen-detected diabetes but not treated diabetes suggests that glucose control agents to some extent may be reversing DNA methylation. The association between NOS3 rs1799983 polymorphisms and DNA methylation suggests gene-epigenetic mechanisms through which vascular diabetes complications develop despite adequate metabolic control. PMID:27990443

  15. Global error estimation based on the tolerance proportionality for some adaptive Runge-Kutta codes

    Science.gov (United States)

    Calvo, M.; González-Pinto, S.; Montijano, J. I.

    2008-09-01

    Modern codes for the numerical solution of Initial Value Problems (IVPs) in ODEs are based on adaptive methods that, for a user-supplied tolerance δ, attempt to advance the integration selecting the size of each step so that some measure of the local error is ≈ δ. Although this policy does not ensure that the global errors are under the prescribed tolerance, after the early studies of Stetter [Considerations concerning a theory for ODE-solvers, in: R. Bulirsch, R.D. Grigorieff, J. Schröder (Eds.), Numerical Treatment of Differential Equations, Proceedings of Oberwolfach, 1976, Lecture Notes in Mathematics, vol. 631, Springer, Berlin, 1978, pp. 188-200; Tolerance proportionality in ODE codes, in: R. März (Ed.), Proceedings of the Second Conference on Numerical Treatment of Ordinary Differential Equations, Humboldt University, Berlin, 1980, pp. 109-123] and the extensions of Higham [Global error versus tolerance for explicit Runge-Kutta methods, IMA J. Numer. Anal. 11 (1991) 457-480; The tolerance proportionality of adaptive ODE solvers, J. Comput. Appl. Math. 45 (1993) 227-236; The reliability of standard local error control algorithms for initial value ordinary differential equations, in: Proceedings: The Quality of Numerical Software: Assessment and Enhancement, IFIP Series, Springer, Berlin, 1997], it has been proved that in many existing explicit Runge-Kutta codes the global errors behave asymptotically as some rational power of δ. This step-size policy, for a given IVP, determines at each grid point tn a new step-size hn+1 = h(tn; δ) so that h(t; δ) is a continuous function of t. In this paper a study of the tolerance proportionality property is carried out under a discontinuous step-size policy that does not allow the step size to change if the step-size ratio between two consecutive steps is close to unity. This theory is applied to obtain global error estimations in a few problems that have been solved with
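
    Tolerance proportionality is easy to observe empirically: integrate a problem with a known solution at several tolerances and fit the slope of log(global error) against log(tolerance). The sketch below is illustrative, assuming SciPy's RK45 in solve_ivp rather than the codes studied in the paper:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Test problem with known solution y(t) = exp(-t).
    def f(t, y):
        return -y

    tols = [1e-4, 1e-6, 1e-8]
    errs = []
    for tol in tols:
        sol = solve_ivp(f, (0.0, 5.0), [1.0], method="RK45",
                        rtol=tol, atol=tol)
        errs.append(abs(sol.y[0, -1] - np.exp(-5.0)))  # global error at t=5

    # Tolerance proportionality: global error ~ C * delta^p, so the slope
    # of log(error) against log(tolerance) estimates the power p.
    p = np.polyfit(np.log(tols), np.log(errs), 1)[0]
    print(f"estimated exponent p = {p:.2f}")
    ```

    A fitted p close to 1 would indicate near-linear proportionality for this solver on this problem; the paper's analysis concerns exactly when, and with what rational exponent, such behaviour can be guaranteed.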

  16. Quantifying airborne dispersal routes of pathogens over continents to safeguard global wheat supply.

    Science.gov (United States)

    Meyer, M; Cox, J A; Hitchings, M D T; Burgin, L; Hort, M C; Hodson, D P; Gilligan, C A

    2017-10-01

    Infectious crop diseases spreading over large agricultural areas pose a threat to food security. Aggressive strains of the obligate pathogenic fungus Puccinia graminis f.sp. tritici (Pgt), causing the crop disease wheat stem rust, have been detected in East Africa and the Middle East, where they lead to substantial economic losses and threaten livelihoods of farmers. The majority of commercially grown wheat cultivars worldwide are susceptible to these emerging strains, which pose a risk to global wheat production, because the fungal spores transmitting the disease can be wind-dispersed over regions and even continents [1-11]. Targeted surveillance and control requires knowledge about airborne dispersal of pathogens, but the complex nature of long-distance dispersal poses significant challenges for quantitative research [12-14]. We combine international field surveys, global meteorological data, a Lagrangian dispersion model and high-performance computational resources to simulate a set of disease outbreak scenarios, tracing billions of stochastic trajectories of fungal spores over dynamically changing host and environmental landscapes for more than a decade. This provides the first quantitative assessment of spore transmission frequencies and amounts amongst all wheat producing countries in Southern/East Africa, the Middle East and Central/South Asia. We identify zones of high airborne connectivity that geographically correspond with previously postulated wheat rust epidemiological zones (characterized by endemic disease and free movement of inoculum) [10,15], and regions with genetic similarities in related pathogen populations [16,17]. We quantify the circumstances (routes, timing, outbreak sizes) under which virulent pathogen strains such as 'Ug99' [5,6] pose a threat from long-distance dispersal out of East Africa to the large wheat producing areas in Pakistan and India. Long-term mean spore dispersal trends (predominant direction, frequencies, amounts) are

  17. Quantifying Globalization in Social Work Research: A 10-Year Review of American Social Work Journals

    Science.gov (United States)

    Agbényiga, DeBrenna L.; Huang, Lihua

    2014-01-01

    Measured by the prevalence of journal article contributions, geographic coverage, and international collaboration, this literature review found an increasing level of globalization with respect to American social work research and contribution to the social work profession from 2000-2009. Findings suggest changes are needed in global awareness and…

  18. Quantifying the Global Fresh Water Budget: Capabilities from Current and Future Satellite Sensors

    Science.gov (United States)

    Hildebrand, Peter; Zaitchik, Benjamin

    2007-01-01

    The global water cycle is complex and its components are difficult to measure, particularly at global scales and with the precision needed for assessing climate impacts. Recent advances in satellite observational capabilities, however, are greatly improving our knowledge of the key terms in the fresh water flux budget. Many components of the global water budget, e.g. precipitation, atmospheric moisture profiles, soil moisture, snow cover, and sea ice, are now routinely measured globally using instruments on satellites such as TRMM, AQUA, TERRA, GRACE, and ICESat, as well as on operational satellites. New techniques, many using data assimilation approaches, are providing pathways toward measuring snow water equivalent, evapotranspiration, ground water, and ice mass, as well as improving the measurement quality for other components of the global water budget. This paper evaluates these current and developing satellite capabilities to observe the global fresh water budget, then looks forward to evaluate the potential for improvements that may result from future space missions as detailed by the US Decadal Survey, and from operational plans. Based on these analyses, and on the goal of improved knowledge of the global fresh water budget under the effects of climate change, we suggest some priorities for the future, based on new approaches that may provide the improved measurements and the analyses needed to understand and observe the potential speed-up of the global water cycle under the effects of climate change.

  19. Coordinated approaches to quantify long-term ecosystem dynamics in response to global change

    DEFF Research Database (Denmark)

    Liu, Y.; Melillo, J.; Niu, S.

    2011-01-01

    Many serious ecosystem consequences of climate change will take decades or even centuries to emerge. Long-term ecological responses to global change are strongly regulated by slow processes, such as changes in species composition, carbon dynamics in soil and by long-lived plants, and accumulation of nutrient capitals… a coordinated approach that combines long-term, large-scale global change experiments with process studies and modeling. Long-term global change manipulative experiments, especially in high-priority ecosystems such as tropical forests and high-latitude regions, are essential to maximize information gain… to be the most effective strategy to gain the best information on long-term ecosystem dynamics in response to global change.

  20. Quantifying the consensus on anthropogenic global warming in the scientific literature

    International Nuclear Information System (INIS)

    Cook, John; Nuccitelli, Dana; Winkler, Bärbel; Painting, Rob; Skuce, Andrew; Green, Sarah A; Richardson, Mark; Way, Robert; Jacobs, Peter

    2013-01-01

    We analyze the evolution of the scientific consensus on anthropogenic global warming (AGW) in the peer-reviewed scientific literature, examining 11 944 climate abstracts from 1991–2011 matching the topics ‘global climate change’ or ‘global warming’. We find that 66.4% of abstracts expressed no position on AGW, 32.6% endorsed AGW, 0.7% rejected AGW and 0.3% were uncertain about the cause of global warming. Among abstracts expressing a position on AGW, 97.1% endorsed the consensus position that humans are causing global warming. In a second phase of this study, we invited authors to rate their own papers. Compared to abstract ratings, a smaller percentage of self-rated papers expressed no position on AGW (35.5%). Among self-rated papers expressing a position on AGW, 97.2% endorsed the consensus. For both abstract ratings and authors’ self-ratings, the percentage of endorsements among papers expressing a position on AGW marginally increased over time. Our analysis indicates that the number of papers rejecting the consensus on AGW is a vanishingly small proportion of the published research. (letter)
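
    The headline consensus figure follows directly from the reported category shares: endorsements as a fraction of abstracts that take a position on AGW. Recomputing from the rounded percentages in the abstract reproduces the published 97.1% to within rounding error:

    ```python
    # Category shares reported in the abstract, as percent of 11,944 abstracts.
    no_position, endorse, reject, uncertain = 66.4, 32.6, 0.7, 0.3

    expressing = endorse + reject + uncertain   # abstracts taking a position
    share = 100.0 * endorse / expressing        # endorsement among them
    print(f"endorsement among position-taking abstracts: {share:.1f}%")
    ```

    The small discrepancy from the published 97.1% comes from using rounded percentages rather than the underlying abstract counts.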

  1. Coordinated approaches to quantify long-term ecosystem dynamics in response to global change

    Science.gov (United States)

    Yiqi Luo; Jerry Melillo; Shuli Niu; Claus Beier; James S. Clark; Aime E.T. Classen; Eric Dividson; Jeffrey S. Dukes; R. Dave Evans; Christopher B. Field; Claudia I. Czimczik; Michael Keller; Bruce A. Kimball; Lara M. Kueppers; Richard J. Norby; Shannon L. Pelini; Elise Pendall; Edward Rastetter; Johan Six; Melinda Smith; Mark G. Tjoelker; Margaret S. Torn

    2011-01-01

    Many serious ecosystem consequences of climate change will take decades or even centuries to emerge. Long-term ecological responses to global change are strongly regulated by slow processes, such as changes in species composition, carbon dynamics in soil and by long-lived plants, and accumulation of nutrient capitals. Understanding and predicting these processes...

  2. Quantifying impacts of nitrogen use in European agriculture on global warming potential.

    NARCIS (Netherlands)

    Vries, de W.; Kros, J.; Reinds, G.J.; Butterbach-Bahl, K.

    2011-01-01

    This paper summarizes current knowledge on the impacts of changes in nitrogen (Nr) use in agriculture on the global warming potential (GWP) through its impact on carbon dioxide (CO2), nitrous oxide (N2O) and methane (CH4) emissions from agricultural and terrestrial nonagricultural systems and from

  3. Global transcriptome analysis of Halolamina sp. to decipher the salt tolerance in extremely halophilic archaea.

    Science.gov (United States)

    Kurt-Kızıldoğan, Aslıhan; Abanoz, Büşra; Okay, Sezer

    2017-02-15

    Extremely halophilic archaea survive in hypersaline environments such as salt lakes or salt mines. These microorganisms are therefore good systems for investigating the molecular mechanisms underlying tolerance to high salt concentrations. In this study, a global transcriptome analysis was conducted in an extremely halophilic archaeon, Halolamina sp. YKT1, isolated from a salt mine in Turkey. A comparative RNA-seq analysis was performed using the YKT1 isolate grown at either 2.7 M or 5.5 M NaCl. A total of 2149 genes were predicted to be up-regulated and 1638 genes down-regulated in the presence of 5.5 M NaCl. The salt tolerance of Halolamina sp. YKT1 involves the up-regulation of genes related to membrane transporters, CRISPR-Cas systems, osmoprotectant solutes, oxidative stress proteins, and iron metabolism. On the other hand, genes encoding proteins involved in DNA replication, transcription, translation, and mismatch and nucleotide excision repair were down-regulated. The RNA-seq data were verified for seven up-regulated and six down-regulated genes via qRT-PCR analysis. This comprehensive transcriptome analysis shows that the halophilic archaeon channels its energy towards keeping the intracellular osmotic balance, minimizing the production of nucleic acids and peptides. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Linking regional stakeholder scenarios and shared socioeconomic pathways: Quantified West African food and climate futures in a global context.

    Science.gov (United States)

    Palazzo, Amanda; Vervoort, Joost M; Mason-D'Croz, Daniel; Rutting, Lucas; Havlík, Petr; Islam, Shahnila; Bayala, Jules; Valin, Hugo; Kadi Kadi, Hamé Abdou; Thornton, Philip; Zougmore, Robert

    2017-07-01

    The climate change research community's shared socioeconomic pathways (SSPs) are a set of alternative global development scenarios focused on mitigation of and adaptation to climate change. To use these scenarios as a global context that is relevant for policy guidance at regional and national levels, they have to be connected to an exploration of drivers and challenges informed by regional expertise. In this paper, we present scenarios for West Africa developed by regional stakeholders and quantified using two global economic models, GLOBIOM and IMPACT, in interaction with stakeholder-generated narratives and scenario trends and SSP assumptions. We present this process as an example of linking comparable scenarios across levels to increase coherence with global contexts, while presenting insights about the future of agriculture and food security under a range of future drivers including climate change. In these scenarios, strong economic development increases food security and agricultural development. The latter increases crop and livestock productivity leading to an expansion of agricultural area within the region while reducing the land expansion burden elsewhere. In the context of a global economy, West Africa remains a large consumer and producer of a selection of commodities. However, the growth in population coupled with rising incomes leads to increases in the region's imports. For West Africa, climate change is projected to have negative effects on both crop yields and grassland productivity, and a lack of investment may exacerbate these effects. Linking multi-stakeholder regional scenarios to the global SSPs ensures scenarios that are regionally appropriate and useful for policy development as evidenced in the case study, while allowing for a critical link to global contexts.

  5. Quantifying PM2.5-Meteorology Sensitivities in a Global Climate Model

    Science.gov (United States)

    Westervelt, D. M.; Horowitz, L. W.; Naik, V.; Tai, A. P. K.; Fiore, A. M.; Mauzerall, D. L.

    2016-01-01

Climate change can influence fine particulate matter (PM2.5) concentrations through changes in air pollution meteorology. Knowledge of the extent to which climate change can exacerbate or alleviate air pollution in the future is needed for robust climate and air pollution policy decision-making. To examine the influence of climate on PM2.5, we use the Geophysical Fluid Dynamics Laboratory Coupled Model version 3 (GFDL CM3), a fully coupled chemistry-climate model, combined with future emissions and concentrations provided by the four Representative Concentration Pathways (RCPs). For each of the RCPs, we conduct future simulations in which emissions of aerosols and their precursors are held at 2005 levels while other climate forcing agents evolve in time, such that only climate (and thus meteorology) can influence PM2.5 surface concentrations. We find a small increase in global, annual mean PM2.5 of about 0.21 µg/m³ (5%) for RCP8.5, the scenario with maximum warming. Changes in global mean PM2.5 are at a maximum in the fall and are mainly controlled by sulfate, followed by organic aerosol, with minimal influence of black carbon. RCP2.6 is the only scenario that projects a decrease in global PM2.5 with future climate changes, albeit only by -0.06 µg/m³ (1.5%) by the end of the 21st century. Regional and local changes in PM2.5 are larger, reaching upwards of 2 µg/m³ for polluted (eastern China) and dusty (western Africa) locations on an annually averaged basis in RCP8.5. Using multiple linear regression, we find that future PM2.5 concentrations are most sensitive to local temperature, followed by surface wind and precipitation. PM2.5 concentrations are robustly positively associated with temperature, while negatively related to precipitation and wind speed.
Present-day (2006-2015) modeled sensitivities of PM2.5 to meteorological variables are evaluated against observations and found to agree reasonably well with observed sensitivities (within 10-50%).
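The regression step described in this record can be sketched as follows. This is a minimal illustration with synthetic data, not the authors' GFDL CM3 output; the coefficient signs are merely chosen to match the abstract (positive for temperature, negative for wind and precipitation).

```python
import numpy as np

# Synthetic monthly anomalies; all values and variable names are invented.
rng = np.random.default_rng(0)
n = 120  # e.g. 120 months
temp = rng.normal(0, 1.5, n)    # temperature anomaly (K)
wind = rng.normal(0, 0.8, n)    # surface wind speed anomaly (m/s)
precip = rng.normal(0, 0.5, n)  # precipitation anomaly (mm/day)

# Construct PM2.5 anomalies with the signs reported in the abstract.
pm25 = 0.4 * temp - 0.6 * wind - 0.3 * precip + rng.normal(0, 0.2, n)

# Ordinary least squares: design-matrix columns are [1, T, wind, precip].
X = np.column_stack([np.ones(n), temp, wind, precip])
beta, *_ = np.linalg.lstsq(X, pm25, rcond=None)
intercept, s_temp, s_wind, s_precip = beta
print(s_temp > 0, s_wind < 0, s_precip < 0)
```

The fitted slopes are the meteorological "sensitivities" in the sense the abstract uses the term.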

  6. Quantifying the global and distributional aspects of American household carbon footprint

    International Nuclear Information System (INIS)

    Weber, Christopher L.; Matthews, H. Scott

    2008-01-01

Analysis of household consumption and its environmental impact remains one of the most important topics in sustainability research. Nevertheless, much past and recent work has focused on domestic national averages, neglecting both the growing importance of international trade to the household carbon footprint and the variation between households of different income levels and demographics. Using consumer expenditure surveys and multi-country life cycle assessment techniques, this paper analyzes the global and distributional aspects of the American household carbon footprint. We find that due to recently increased international trade, 30% of total US household CO2 impact in 2004 occurred outside the US. Further, households vary considerably in their CO2 responsibilities: at least a factor-of-ten difference exists between low- and high-impact households, with total household income and expenditure being the best predictors of both the domestic and international portions of the total CO2 impact. The global location of emissions, which cannot be calculated using standard input-output analysis, and the variation of household impacts with income have important ramifications for policies designed to lower consumer impacts on climate change, such as carbon taxes. The effectiveness and fairness of such policies hinge on a proper understanding of how income distributions, rebound effects, and international trade affect them. (author)
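The multi-country accounting this record describes rests on Leontief input-output arithmetic: total output x needed to satisfy final demand y is x = (I - A)^(-1) y, and emissions embodied in that demand are the emission intensities f applied to x. A toy two-region, two-sector sketch with invented numbers (not the paper's data):

```python
import numpy as np

# Technical coefficients (inputs per unit output); rows/cols:
# US-goods, US-services, foreign-goods, foreign-services. Invented values.
A = np.array([
    [0.10, 0.05, 0.02, 0.00],
    [0.04, 0.08, 0.01, 0.01],
    [0.06, 0.02, 0.12, 0.05],
    [0.01, 0.03, 0.04, 0.07],
])
f = np.array([0.9, 0.3, 1.4, 0.5])     # kg CO2 per $ of output, by sector
y = np.array([50.0, 80.0, 10.0, 5.0])  # US household final demand ($)

x = np.linalg.solve(np.eye(4) - A, y)  # total output required (Leontief)
emissions = f * x                      # emissions by producing sector
domestic = emissions[:2].sum()
foreign = emissions[2:].sum()
share_abroad = foreign / emissions.sum()
print(round(share_abroad, 3))
```

The fraction `share_abroad` is the analogue of the paper's finding that a substantial share of household CO2 impact occurs outside the US.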

  7. Quantifying the erosion of natural darkness in the global protected area system.

    Science.gov (United States)

    Gaston, Kevin J; Duffy, James P; Bennie, Jonathan

    2015-08-01

The nighttime light environment of much of the earth has been transformed by the introduction of electric lighting. This impact continues to spread with growth in the human population and the extent of urbanization. This has profound consequences for organismal physiology and behavior and affects the abundances and distributions of species, community structure, and likely ecosystem functions and processes. Protected areas play key roles in buffering biodiversity from a wide range of anthropogenic pressures. We used a calibration of a global satellite data set of nighttime lights to determine how well they are fulfilling this role with regard to artificial nighttime lighting. Globally, areas that are protected tend to be darker at night than those that are not, and, with the exception of Europe, recent regional declines in the proportion of the area that is protected and remains dark have been small. However, much of this effect results from the major contribution to overall protected area coverage by the small proportion of individual protected areas that are very large. Thus, in Europe and North America high proportions of individual protected areas (>17%) have exhibited high levels of nighttime lighting in all recent years, and in several regions (Europe, Asia, South and Central America) high proportions of protected areas (32-42%) have had recent significant increases in nighttime lighting. Limiting and reversing the erosion of nighttime darkness in protected areas will require routine consideration of nighttime conditions when designating and establishing new protected areas; establishment of appropriate buffer zones around protected areas where lighting is prohibited; and landscape-level reductions in artificial nighttime lighting, which is being called for in general to reduce energy use and economic costs. © 2015 Society for Conservation Biology.

  8. Quantifying the effects of the break up of Pangaea on global terrestrial diversification with neutral theory.

    Science.gov (United States)

    Jordan, Sean M R; Barraclough, Timothy G; Rosindell, James

    2016-04-05

    The historic richness of most taxonomic groups increases substantially over geological time. Explanations for this fall broadly into two categories: bias in the fossil record and elevated net rates of diversification in recent periods. For example, the break up of Pangaea and isolation between continents might have increased net diversification rates. In this study, we investigate the effect on terrestrial diversification rates of the increased isolation between land masses brought about by continental drift. We use ecological neutral theory as a means to study geologically complex scenarios tractably. Our models show the effects of simulated geological events that affect all species equally, without the added complexity of further ecological processes. We find that continental drift leads to an increase in diversity only where isolation between continents leads to additional speciation through vicariance, and where higher taxa with very low global diversity are considered. We conclude that continental drift by itself is not sufficient to account for the increase in terrestrial species richness observed in the fossil record. © 2016 The Authors.

  9. Methodological Considerations When Quantifying High-Intensity Efforts in Team Sport Using Global Positioning System Technology.

    Science.gov (United States)

    Varley, Matthew C; Jaspers, Arne; Helsen, Werner F; Malone, James J

    2017-09-01

Sprints and accelerations are popular performance indicators in applied sport. The methods used to define these efforts with athlete-tracking technology can affect the number of efforts reported. This study aimed to determine the influence of different techniques and settings for detecting high-intensity efforts using global positioning system (GPS) data. Velocity and acceleration data from a professional soccer match were recorded via 10-Hz GPS. Velocity data were filtered using either a median or an exponential filter. Acceleration data were derived from velocity data over a 0.2-s time interval (with and without an exponential filter applied) and a 0.3-s time interval. High-speed-running (≥4.17 m/s), sprint (≥7.00 m/s), and acceleration (≥2.78 m/s²) efforts were then identified using minimum-effort durations (0.1-0.9 s) to assess differences in the total number of efforts reported. Different velocity-filtering methods resulted in small to moderate differences (effect size [ES] 0.28-1.09) in the number of high-speed-running and sprint efforts detected, depending on the minimum duration applied. Changes to how high-intensity efforts are defined affect reported data. Therefore, consistency in data processing is advised.
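The core detection logic this record examines can be sketched as counting runs of samples that stay at or above a threshold for at least a minimum duration. The thresholds follow the abstract (high-speed running ≥4.17 m/s at 10 Hz); the velocity trace below is synthetic.

```python
import numpy as np

def count_efforts(velocity, threshold, min_duration_s, hz=10):
    """Count efforts: contiguous runs of samples >= threshold lasting
    at least min_duration_s seconds (sampling rate hz)."""
    min_samples = int(round(min_duration_s * hz))
    above = velocity >= threshold
    count = 0
    run = 0
    for flag in above:
        if flag:
            run += 1
        else:
            if run >= min_samples:
                count += 1
            run = 0
    if run >= min_samples:  # run reaching the end of the trace
        count += 1
    return count

# Synthetic 10-Hz trace: one 0.8-s burst and one 0.2-s burst above 4.17 m/s.
v = np.array([3.0] * 5 + [4.5] * 8 + [3.0] * 5 + [4.5] * 2 + [3.0] * 5)
print(count_efforts(v, 4.17, 0.5))  # → 1 (only the 0.8-s burst qualifies)
```

Lowering the minimum duration to 0.1 s makes the second burst count as well, illustrating the abstract's point that the chosen settings change the reported totals.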

  10. Quantifying invertebrate resistance to floods: a global-scale meta-analysis.

    Science.gov (United States)

    McMullen, Laura E; Lytle, David A

    2012-12-01

    Floods are a key component of the ecology and management of riverine ecosystems around the globe, but it is not clear whether floods have predictable effects on organisms that can allow us to generalize across regions and continents. To address this, we conducted a global-scale meta-analysis to investigate effects of natural and managed floods on invertebrate resistance, the ability of invertebrates to survive flood events. We considered 994 studies for inclusion in the analysis, and after evaluation based on a priori criteria, narrowed our analysis to 41 studies spanning six of the seven continents. We used the natural-log-ratio of invertebrate abundance before and within 10 days after flood events because this measure of effect size can be directly converted to estimates of percent survival. We conducted categorical and continuous analyses that examined the contribution of environmental and study design variables to effect size heterogeneity, and examined differences in effect size among taxonomic groups. We found that invertebrate abundance was lowered by at least one-half after flood events. While natural vs. managed floods were similar in their effect, effect size differed among habitat and substrate types, with pools, sand, and boulders experiencing the strongest effect. Although sample sizes were not sufficient to examine all taxonomic groups, floods had a significant, negative effect on densities of Coleoptera, Eumalacostraca, Annelida, Ephemeroptera, Diptera, Plecoptera, and Trichoptera. Results from this study provide guidance for river flow regime prescriptions that will be applicable across continents and climate types, as well as baseline expectations for future empirical studies of freshwater disturbance.
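The effect-size metric this record uses, the natural log ratio of abundance after vs before a flood, converts directly to percent survival. A minimal sketch with invented abundance counts:

```python
import math

def log_response_ratio(before, after):
    """Natural-log ratio of post-flood to pre-flood abundance."""
    return math.log(after / before)

def percent_survival(lnrr):
    """Back-transform a log response ratio to percent survival."""
    return 100.0 * math.exp(lnrr)

# Hypothetical counts: 200 invertebrates before, 90 within 10 days after.
lnrr = log_response_ratio(before=200, after=90)
print(round(lnrr, 3), round(percent_survival(lnrr), 1))  # -0.799 45.0
```

A negative log ratio thus maps to survival below 100%, consistent with the finding that abundance was lowered by at least one-half after floods.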

  11. Quantifying biological integrity by taxonomic completeness: its utility in regional and global assessments.

    Science.gov (United States)

    Hawkins, Charles P

    2006-08-01

    Water resources managers and conservation biologists need reliable, quantitative, and directly comparable methods for assessing the biological integrity of the world's aquatic ecosystems. Large-scale assessments are constrained by the lack of consistency in the indicators used to assess biological integrity and our current inability to translate between indicators. In theory, assessments based on estimates of taxonomic completeness, i.e., the proportion of expected taxa that were observed (observed/expected, O/E) are directly comparable to one another and should therefore allow regionally and globally consistent summaries of the biological integrity of freshwater ecosystems. However, we know little about the true comparability of O/E assessments derived from different data sets or how well O/E assessments perform relative to other indicators in use. I compared the performance (precision, bias, and sensitivity to stressors) of O/E assessments based on five different data sets with the performance of the indicators previously applied to these data (three multimetric indices, a biotic index, and a hybrid method used by the state of Maine). Analyses were based on data collected from U.S. stream ecosystems in North Carolina, the Mid-Atlantic Highlands, Maine, and Ohio. O/E assessments resulted in very similar estimates of mean regional conditions compared with most other indicators once these indicators' values were standardized relative to reference-site means. However, other indicators tended to be biased estimators of O/E, a consequence of differences in their response to natural environmental gradients and sensitivity to stressors. These results imply that, in some cases, it may be possible to compare assessments derived from different indicators by standardizing their values (a statistical approach to data harmonization). 
In situations where it is difficult to standardize or otherwise harmonize two or more indicators, O/E values can easily be derived from existing
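As a rough sketch of the O/E index described in this record: E is the sum of modeled capture probabilities of reference taxa at a site (conventionally restricted to taxa with probability ≥ 0.5, an assumption here), and O is the number of those taxa actually observed. The taxa and probabilities below are invented for illustration.

```python
# Hypothetical reference-model capture probabilities for a stream site.
expected_probs = {
    "Ephemeroptera": 0.90,
    "Plecoptera": 0.80,
    "Trichoptera": 0.85,
    "Coleoptera": 0.60,
    "Diptera": 0.95,
}
observed = {"Ephemeroptera", "Trichoptera", "Diptera"}

# Restrict to commonly predicted taxa (probability >= 0.5).
common = {t: p for t, p in expected_probs.items() if p >= 0.5}
E = sum(common.values())                      # expected number of taxa
O = sum(1 for t in observed if t in common)   # observed reference taxa
oe = O / E
print(round(oe, 2))  # → 0.73
```

Values near 1 indicate taxonomic completeness close to the reference expectation; lower values indicate biological impairment.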

  12. A global review of freshwater crayfish temperature tolerance, preference, and optimal growth

    Science.gov (United States)

    Westhoff, Jacob T.; Rosenberger, Amanda E.

    2016-01-01

Conservation efforts, environmental planning, and management must account for ongoing ecosystem alteration due to a changing climate, introduced species, and shifting land use. This type of management can be facilitated by an understanding of the thermal ecology of aquatic organisms. However, information on thermal ecology for entire taxonomic groups is rarely compiled or summarized, and reviews of the science can facilitate its advancement. Crayfish are one of the most globally threatened taxa, and ongoing declines and extirpation could have serious consequences on aquatic ecosystem function due to their significant biomass and ecosystem roles. Our goal was to review the literature on thermal ecology for freshwater crayfish worldwide, with emphasis on studies that estimated temperature tolerance, temperature preference, or optimal growth. We also explored relationships between temperature metrics and species distributions. We located 56 studies containing information for at least one of those three metrics, which covered approximately 6% of extant crayfish species worldwide. Information on one or more metrics existed for all 3 genera of Astacidae, 4 of the 12 genera of Cambaridae, and 3 of the 15 genera of Parastacidae. Investigations employed numerous methodological approaches for estimating these parameters, which restricts comparisons among and within species. The only statistically significant relationship we observed between a temperature metric and species range was a negative linear relationship between absolute latitude and optimal growth temperature. We recommend expansion of studies examining the thermal ecology of freshwater crayfish and identify and discuss methodological approaches that can improve standardization and comparability among studies.

  13. Evaluating and Quantifying the Climate-Driven Interannual Variability in Global Inventory Modeling and Mapping Studies (GIMMS) Normalized Difference Vegetation Index (NDVI3g) at Global Scales

    Science.gov (United States)

    Zeng, Fanwei; Collatz, George James; Pinzon, Jorge E.; Ivanoff, Alvaro

    2013-01-01

Satellite observations of surface reflected solar radiation contain information about variability in the absorption of solar radiation by vegetation. Understanding the causes of variability is important for models that use these data to drive land surface fluxes or for benchmarking prognostic vegetation models. Here we evaluated the interannual variability in the new 30.5-year-long global satellite-derived surface reflectance index data, the Global Inventory Modeling and Mapping Studies normalized difference vegetation index (GIMMS NDVI3g). Pearson's correlation and multiple linear stepwise regression analyses were applied to quantify the NDVI interannual variability driven by climate anomalies, and to evaluate the effects of potential interference (snow, aerosols, and clouds) on the NDVI signal. We found ecologically plausible strong controls on NDVI variability by antecedent precipitation and current monthly temperature, with distinct spatial patterns. Precipitation correlations were strongest for temperate to tropical water-limited herbaceous systems, where in some regions and seasons 40% of the NDVI variance could be explained by precipitation anomalies. Temperature correlations were strongest in northern mid- to high-latitudes in the spring and early summer, where up to 70% of the NDVI variance was explained by temperature anomalies. We find that, in western and central North America, winter-spring precipitation determines early summer growth, while more recent precipitation controls NDVI variability in late summer. In contrast, current or prior wet-season precipitation anomalies were correlated with all months of NDVI in sub-tropical herbaceous vegetation. Snow, aerosols, and clouds, as well as unexplained phenomena, still account for part of the NDVI variance despite corrections. Nevertheless, this study demonstrates that GIMMS NDVI3g represents real responses of vegetation to climate variability that are useful for global models.
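The correlation step this record describes can be sketched as Pearson's r between NDVI anomalies and a climate anomaly, with r² giving the fraction of NDVI variance explained. The series below are synthetic stand-ins, not GIMMS NDVI3g data.

```python
import numpy as np

# Synthetic anomalies for one calendar month over 30 years.
rng = np.random.default_rng(1)
n = 30
temp_anom = rng.normal(0, 1.0, n)                        # temperature (K)
ndvi_anom = 0.05 * temp_anom + rng.normal(0, 0.03, n)    # NDVI (unitless)

# Pearson correlation and the variance it explains.
r = np.corrcoef(temp_anom, ndvi_anom)[0, 1]
variance_explained = r ** 2
print(r > 0, 0.0 <= variance_explained <= 1.0)
```

Repeating this per pixel and per month, and retaining the strongest predictors via stepwise regression, yields maps like the abstract's "up to 70% of the NDVI variance" result.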

  14. Quantifying the effect of autonomous adaptation to global river flood projections: application to future flood risk assessments

    Science.gov (United States)

    Kinoshita, Youhei; Tanoue, Masahiro; Watanabe, Satoshi; Hirabayashi, Yukiko

    2018-01-01

This study represents the first attempt to quantify the effects of autonomous adaptation on projections of global flood hazards and to assess future flood risk by including this effect. A vulnerability scenario, which varies according to the autonomous adaptation effect for conventional disaster mitigation efforts, was developed based on historical vulnerability values derived from flood damage records and a river inundation simulation. Coupled with general circulation model outputs and future socioeconomic scenarios, potential future flood fatalities and economic losses were estimated. By including the effect of autonomous adaptation, our multimodel ensemble estimates projected a 2.0% decrease in potential flood fatalities and an 821% increase in potential economic losses by 2100 under the highest emission scenario together with a large population increase. Vulnerability changes reduced potential flood consequences by 64%-72% in terms of potential fatalities and 28%-42% in terms of potential economic losses by 2100. Although socioeconomic changes made the greatest contribution to the potential increased consequences of future floods, about half of the increase in potential economic losses was mitigated by autonomous adaptation. There is a clear positive relationship between the global temperature increase from the pre-industrial level and the estimated mean potential flood economic loss, while the relationship with potential fatalities is negative due to the autonomous adaptation effect. A bootstrapping analysis suggests a significant increase in potential flood fatalities (+5.7%) without any adaptation if the temperature increases by 1.5 °C-2.0 °C, whereas the increase in potential economic loss (+0.9%) was not significant. Our method enables the effects of autonomous adaptation and additional adaptation efforts on climate-induced hazards to be distinguished, which would be essential for the accurate estimation of the cost of adaptation to

  15. Combining global land cover datasets to quantify agricultural expansion into forests in Latin America: Limitations and challenges

    Science.gov (United States)

    Persson, U. Martin

    2017-01-01

While we know that deforestation in the tropics is increasingly driven by commercial agriculture, most tropical countries still lack recent and spatially-explicit assessments of the relative importance of pasture and cropland expansion in causing forest loss. Here we present a spatially explicit quantification of the extent to which cultivated land and grassland expanded at the expense of forests across Latin America in 2001–2011, by combining two “state-of-the-art” global datasets (Global Forest Change forest loss and GlobeLand30-2010 land cover). We further evaluate some of the limitations and challenges in doing this. We find that this approach does capture some of the major patterns of land cover following deforestation, with GlobeLand30-2010’s Grassland class (which we interpret as pasture) being the most common land cover replacing forests across Latin America. However, our analysis also reveals some major limitations to combining these land cover datasets for quantifying pasture and cropland expansion into forest. First, a simple one-to-one translation between GlobeLand30-2010’s Cultivated land and Grassland classes into cropland and pasture, respectively, should not be made without caution, as GlobeLand30-2010 defines its Cultivated land to include some pastures. Comparisons with the TerraClass dataset over the Brazilian Amazon and with previous literature indicate that Cultivated land in GlobeLand30-2010 includes notable amounts of pasture and other vegetation (e.g. in Paraguay and the Brazilian Amazon). This further suggests that the approach taken here generally leads to an underestimation (of up to ~60%) of the role of pasture in replacing forest. Second, a large share (~33%) of the Global Forest Change forest loss is found to still be forest according to GlobeLand30-2010, and our analysis suggests that the accuracy of the combined datasets, especially for areas with heterogeneous land cover and/or small-scale forest loss, is still too poor for

  16. Combining global land cover datasets to quantify agricultural expansion into forests in Latin America: Limitations and challenges.

    Directory of Open Access Journals (Sweden)

    Florence Pendrill

Full Text Available While we know that deforestation in the tropics is increasingly driven by commercial agriculture, most tropical countries still lack recent and spatially-explicit assessments of the relative importance of pasture and cropland expansion in causing forest loss. Here we present a spatially explicit quantification of the extent to which cultivated land and grassland expanded at the expense of forests across Latin America in 2001-2011, by combining two "state-of-the-art" global datasets (Global Forest Change forest loss and GlobeLand30-2010 land cover). We further evaluate some of the limitations and challenges in doing this. We find that this approach does capture some of the major patterns of land cover following deforestation, with GlobeLand30-2010's Grassland class (which we interpret as pasture) being the most common land cover replacing forests across Latin America. However, our analysis also reveals some major limitations to combining these land cover datasets for quantifying pasture and cropland expansion into forest. First, a simple one-to-one translation between GlobeLand30-2010's Cultivated land and Grassland classes into cropland and pasture, respectively, should not be made without caution, as GlobeLand30-2010 defines its Cultivated land to include some pastures. Comparisons with the TerraClass dataset over the Brazilian Amazon and with previous literature indicate that Cultivated land in GlobeLand30-2010 includes notable amounts of pasture and other vegetation (e.g. in Paraguay and the Brazilian Amazon). This further suggests that the approach taken here generally leads to an underestimation (of up to ~60%) of the role of pasture in replacing forest. Second, a large share (~33%) of the Global Forest Change forest loss is found to still be forest according to GlobeLand30-2010, and our analysis suggests that the accuracy of the combined datasets, especially for areas with heterogeneous land cover and/or small-scale forest loss, is still too

  17. Quantifying the role of fire in the Earth system – Part 1: Improved global fire modeling in the Community Earth System Model (CESM1)

    OpenAIRE

    F. Li; S. Levis; D. S. Ward

    2013-01-01

    Modeling fire as an integral part of an Earth system model (ESM) is vital for quantifying and understanding fire–climate–vegetation interactions on a global scale and from an Earth system perspective. In this study, we introduce to the Community Earth System Model (CESM) the new global fire parameterization proposed by Li et al. (2012a, b), now with a more realistic representation of the anthropogenic impacts on fires, with a parameterization of peat fires, and with other minor modifications....

  18. Global analysis of glycoproteins identifies markers of endotoxin tolerant monocytes and GPR84 as a modulator of TNFα expression.

    Science.gov (United States)

    Müller, Mario M; Lehmann, Roland; Klassert, Tilman E; Reifenstein, Stella; Conrad, Theresia; Moore, Christoph; Kuhn, Anna; Behnert, Andrea; Guthke, Reinhard; Driesch, Dominik; Slevogt, Hortense

    2017-04-12

Exposure of human monocytes to lipopolysaccharide (LPS) induces a temporary insensitivity to subsequent LPS challenges, a cellular state called endotoxin tolerance. In this study, we investigated the LPS-induced global glycoprotein expression changes of tolerant human monocytes and THP-1 cells to identify markers and glycoprotein targets capable of modulating the immunosuppressive state. Using hydrazide chemistry and LC-MS/MS analysis, we analyzed glycoprotein expression changes during a 48 h LPS time course. The cellular snapshots at different time points identified 1491 glycoproteins expressed by monocytes and THP-1 cells. Label-free quantitative analysis revealed transient or long-lasting LPS-induced expression changes of secreted or membrane-anchored glycoproteins derived from intracellular membrane-coated organelles or from the plasma membrane. Monocytes and THP-1 cells demonstrated marked differences in the glycoproteins differentially expressed in the tolerant state. Among the shared differentially expressed glycoproteins, G protein-coupled receptor 84 (GPR84) was identified as being capable of modulating pro-inflammatory TNFα mRNA expression in the tolerant cell state when activated with its ligand decanoic acid.

  19. Quantifying atmospheric transport, chemistry, and mixing using a new trajectory-box model and a global atmospheric-chemistry GCM

    Directory of Open Access Journals (Sweden)

    H. Riede

    2009-12-01

Full Text Available We present a novel method for the quantification of transport, chemistry, and mixing along atmospheric trajectories based on a consistent model hierarchy. The hierarchy consists of the new atmospheric-chemistry trajectory-box model CAABA/MJT and the three-dimensional (3-D) global ECHAM/MESSy atmospheric-chemistry (EMAC) general circulation model. CAABA/MJT employs the atmospheric box model CAABA in a configuration using the atmospheric-chemistry submodel MECCA (M), the photochemistry submodel JVAL (J), and the new trajectory submodel TRAJECT (T), to simulate chemistry along atmospheric trajectories, which are provided offline. With the same chemistry submodels coupled to the 3-D EMAC model and consistent initial conditions and physical parameters, a unique consistency between the two models is achieved. Since only mixing processes within the 3-D model are excluded from the model consistency, comparisons of results from the two models allow us to separate and quantify the contributions of transport, chemistry, and mixing along the trajectory pathways. Consistency of transport between the trajectory-box model CAABA/MJT and the 3-D EMAC model is achieved via calculation of kinematic trajectories based on 3-D wind fields from EMAC using the trajectory model LAGRANTO. The combination of the trajectory-box model CAABA/MJT and the trajectory model LAGRANTO can be considered a Lagrangian chemistry-transport model (CTM) moving isolated air parcels. The procedure for obtaining the necessary statistical basis for the quantification method is described, as well as the comprehensive diagnostics with respect to chemistry.

The quantification method presented here allows us to investigate the characteristics of transport, chemistry, and mixing in a grid-based 3-D model. The analysis of chemical processes within the trajectory-box model CAABA/MJT is easily extendable to include, for example, the impact of different transport pathways or of mixing processes onto

  20. Mesenchymal stem cells induce T-cell tolerance and protect the preterm brain after global hypoxia-ischemia.

    Directory of Open Access Journals (Sweden)

    Reint K Jellema

Full Text Available Hypoxic-ischemic encephalopathy (HIE) in preterm infants is a severe disease for which no curative treatment is available. Cerebral inflammation and invasion of activated peripheral immune cells have been shown to play a pivotal role in the etiology of white matter injury, which is the clinical hallmark of HIE in preterm infants. The objective of this study was to assess the neuroprotective and anti-inflammatory effects of intravenously delivered mesenchymal stem cells (MSC) in an ovine model of HIE. In this translational animal model, global hypoxia-ischemia (HI) was induced in instrumented preterm sheep by transient umbilical cord occlusion, which closely mimics the clinical insult. Intravenous administration of 2 × 10^6 MSC/kg reduced microglial proliferation, diminished loss of oligodendrocytes, and reduced demyelination, as determined by histology and Diffusion Tensor Imaging (DTI), in the preterm brain after global HI. These anti-inflammatory and neuroprotective effects of MSC were paralleled by reduced electrographic seizure activity in the ischemic preterm brain. Furthermore, we showed that MSC induced persistent peripheral T-cell tolerance in vivo and reduced invasion of T-cells into the preterm brain following global HI. These findings show, in a preclinical animal model, that intravenously administered MSC reduced cerebral inflammation, protected against white matter injury, and established functional improvement in the preterm brain following global HI. Moreover, we provide evidence that induction of T-cell tolerance by MSC might play an important role in the neuroprotective effects of MSC in HIE. This is the first study to describe a marked neuroprotective effect of MSC in a translational animal model of HIE.

  1. Limited tolerance by insects to high temperatures across tropical elevational gradients and the implications of global warming for extinction.

    Science.gov (United States)

    García-Robledo, Carlos; Kuprewicz, Erin K; Staines, Charles L; Erwin, Terry L; Kress, W John

    2016-01-19

    The critical thermal maximum (CTmax), the temperature at which motor control is lost in animals, has the potential to determine if species will tolerate global warming. For insects, tolerance to high temperatures decreases with latitude, suggesting that similar patterns may exist along elevational gradients as well. This study explored how CTmax varies among species and populations of a group of diverse tropical insect herbivores, the rolled-leaf beetles, across both broad and narrow elevational gradients. Data from 6,948 field observations and 8,700 museum specimens were used to map the elevational distributions of rolled-leaf beetles on two mountains in Costa Rica. CTmax was determined for 1,252 individual beetles representing all populations across the gradients. Initial morphological identifications suggested a total of 26 species with populations at different elevations displaying contrasting upper thermal limits. However, compared with morphological identifications, DNA barcodes (cytochrome oxidase I) revealed significant cryptic species diversity. DNA barcodes identified 42 species and haplotypes across 11 species complexes. These 42 species displayed much narrower elevational distributions and values of CTmax than the 26 morphologically defined species. In general, species found at middle elevations and on mountaintops are less tolerant to high temperatures than species restricted to lowland habitats. Species with broad elevational distributions display high CTmax throughout their ranges. We found no significant phylogenetic signal in CTmax, geography, or elevational range. The narrow variance in CTmax values for most rolled-leaf beetles, especially high-elevation species, suggests that the risk of extinction of insects may be substantial under some projected rates of global warming.

  2. Linking regional stakeholder scenarios and shared socioeconomic pathways: Quantified West African food and climate futures in a global context

    NARCIS (Netherlands)

    Palazzo, Amanda; Vervoort, Joost M.; Mason-D’Croz, Daniel; Rutting, Lucas; Havlík, Petr; Islam, Shahnila; Bayala, Jules; Valin, Hugo; Kadi Kadi, Hamé Abdou; Thornton, Philip; Zougmore, Robert

    2017-01-01

    The climate change research community’s shared socioeconomic pathways (SSPs) are a set of alternative global development scenarios focused on mitigation of and adaptation to climate change. To use these scenarios as a global context that is relevant for policy guidance at regional and national

  3. How to Quantify Human-environment Interactions in the Past: A Global Historical Land Use Data Set for the Holocene

    Science.gov (United States)

    Klein Goldewijk, K.

    2015-12-01

    Land use plays an important role in the climate system. Many ecosystem processes are directly or indirectly climate driven, and together with human driven land use changes, they determine how the land surface will evolve through time. To assess the effects of land cover changes on the climate system, models are required which are capable of simulating interactions between the involved components of the Earth system. Since driving forces for global environmental change differ among regions, a geographically (spatially) explicit modeling approach is called for, so that it can be incorporated in global and regional (climate and/or biophysical) change models in order to enhance our understanding of the underlying processes and thus improve future projections. Some researchers suggest that mankind has shifted from living in the Holocene (~emergence of agriculture) into the Anthropocene (~humans capable of changing the Earth's atmosphere) since the start of the Industrial Revolution. But in the light of the sheer size and magnitude of some historical land use changes (e.g. the Black Plague in the 14th century and the aftermath of the Columbian Exchange in the 16th century), some believe that this point might have occurred earlier in time. There are still many uncertainties and gaps in our knowledge about the importance of land use (change) in the global biogeochemical cycle, and it is crucial that researchers from other disciplines are involved in decreasing the uncertainties. Thus, integrated records of the co-evolving human-environment system over millennia are needed to provide a basis for a deeper understanding of the present and for forecasting the future. This requires the major task of assembling and integrating regional and global historical, archaeological, and paleo-environmental records. Humans cannot predict the future. Here I present a tool for such long term global change studies; it is the latest update (v 3.2) of the History Database of the Global

  4. Thermal tolerance and preference of exploited turbinid snails near their range limit in a global warming hotspot.

    Science.gov (United States)

    Lah, Roslizawati Ab; Benkendorff, Kirsten; Bucher, Daniel

    2017-02-01

    Predicted global climate change has prompted numerous studies of thermal tolerances of marine species. The upper thermal tolerance is unknown for most marine species, but will determine their vulnerability to ocean warming. Gastropods in the family Turbinidae are widely harvested for human consumption. To investigate the responses of turbinid snails to future conditions we determined critical thermal maxima (CTMax) and preferred temperatures of Turbo militaris and Lunella undulata from the tropical-temperate overlap region of northern New South Wales, on the Australian east coast. CTMax were determined at two warming rates: 1°C/30 min and 1°C/12 h. The number of snails that lost attachment to the tank wall was recorded at each temperature increment. At the faster rate, T. militaris had a significantly higher CTMax (34.0°C) than L. undulata (32.2°C). At the slower rate the mean of both species was lower and there was no significant difference between them (29.4°C for T. militaris and 29.6°C for L. undulata). This is consistent with differences in thermal inertia possibly allowing animals to tolerate short periods at higher temperatures than is possible during longer exposure times, but other mechanisms are not discounted. The thermoregulatory behaviour of the turban snails was determined in a horizontal thermal gradient. Both species actively sought out particular temperatures along the gradient, suggesting that behavioural responses may be important in ameliorating short-term temperature changes. The preferred temperatures of both species were higher at night (24.0°C and 26.0°C) than during the day (22.0°C and 23.9°C). As the snails approached their preferred temperature, net hourly displacement decreased. Preferred temperatures were within the average seasonal seawater temperature range in this region. However, with future predicted water temperature trends, the species could experience increased periods of thermal stress, possibly exceeding CTMax and

  5. A computation ANN model for quantifying the global solar radiation: A case study of Al-Aqabah-Jordan

    International Nuclear Information System (INIS)

    Abolgasem, I M; Alghoul, M A; Ruslan, M H; Chan, H Y; Khrit, N G; Sopian, K

    2015-01-01

    In this paper, a computation model is developed to predict the global solar radiation (GSR) in Aqaba city from recorded meteorological data using Artificial Neural Networks (ANN). The data used in this work are global solar radiation (GSR), sunshine duration, maximum and minimum air temperature, and relative humidity. These data are available from the Jordanian meteorological station over a period of two years. The quality of GSR forecasting is compared using different learning algorithms. The decision to change the ANN architecture is essentially based on the predicted results, in order to obtain the best ANN model for monthly and seasonal GSR. Different configuration patterns were tested using the available observed data. It was found that the model using mainly sunshine duration and air temperature as inputs gives accurate results. The ANN model efficiency and the mean square error values show that the prediction model is accurate. The effect of the three learning algorithms on the accuracy of the prediction model at the training and testing stages for each time scale is mostly within the same accuracy range. (paper)
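
    The record above describes an ANN mapping sunshine duration and air temperature to GSR. A minimal sketch of that idea, using synthetic (hypothetical) data in place of the Aqaba station records, a single tanh hidden layer, and plain gradient descent on mean squared error; the paper's actual architecture, learning algorithms, and data differ.

```python
# Sketch of an ANN for GSR prediction. Data are synthetic (hypothetical),
# standing in for the sunshine-duration and air-temperature records used
# in the paper.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly records: [sunshine hours, mean air temperature (deg C)].
X = rng.uniform([6.0, 15.0], [12.0, 40.0], size=(200, 2))
# Hypothetical "true" relation: GSR rises with both inputs, plus noise.
y = 0.9 * X[:, :1] + 0.3 * X[:, 1:] + rng.normal(0.0, 0.5, (200, 1))

Xn = (X - X.mean(0)) / X.std(0)        # normalise inputs before training

# One tanh hidden layer, trained by plain gradient descent on MSE.
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

def forward(inp):
    h = np.tanh(inp @ W1 + b1)
    return h, h @ W2 + b2

initial_mse = float(((forward(Xn)[1] - y) ** 2).mean())

lr = 0.05
for _ in range(2000):
    h, pred = forward(Xn)
    err = 2.0 * (pred - y) / len(y)     # dMSE/dpred
    gW2, gb2 = h.T @ err, err.sum(0)
    dh = (err @ W2.T) * (1.0 - h ** 2)  # backprop through tanh
    gW1, gb1 = Xn.T @ dh, dh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

final_mse = float(((forward(Xn)[1] - y) ** 2).mean())
```

    The point is only the input-hidden-output regression structure; swapping the training loop for other learning algorithms, as the paper does, changes convergence but not this structure.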

  6. Finding Snowmageddon: Detecting and quantifying northeastern U.S. snowstorms in a multi-decadal global climate ensemble

    Science.gov (United States)

    Zarzycki, C. M.

    2017-12-01

    The northeastern coast of the United States is particularly vulnerable to impacts from extratropical cyclones during winter months, which produce heavy precipitation, high winds, and coastal flooding. These impacts are amplified by the proximity of major population centers to common storm tracks and include risks to health and welfare, massive transportation disruption, lost spending productivity, power outages, and structural damage. Historically, understanding regional snowfall in climate models has generally centered around seasonal mean climatologies even though major impacts typically occur at the scales of hours to days. To quantify discrete snowstorms at the event level, we describe a new objective detection algorithm for gridded data based on the Regional Snowfall Index (RSI) produced by NOAA's National Centers for Environmental Information. The algorithm uses 6-hourly precipitation to collocate storm-integrated snowfall with population density to produce a distribution of snowstorms with societally relevant impacts. The algorithm is tested on the Community Earth System Model (CESM) Large Ensemble Project (LENS) data. Present-day distributions of snowfall events are well replicated within the ensemble. We discuss classification sensitivities to assumptions made in determining precipitation phase and snow water equivalent. We also explore projected reductions in mid-century and end-of-century snowstorms due to changes in snowfall rates and precipitation phase, as well as highlight potential improvements in storm representation from refined horizontal resolution in model simulations.
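
    The detection idea above (storm-integrated snowfall collocated with population density) can be sketched as a simple impact score. The threshold and scoring rule below are illustrative assumptions, not NOAA's operational RSI definition.

```python
# Illustrative RSI-style event score: storm-integrated snowfall weighted by
# population, so identical storms score higher over population centres.
# The threshold and scoring rule are assumptions, not NOAA's RSI formula.
import numpy as np

def storm_index(snow_6h, population, snow_threshold=0.1):
    """snow_6h: (time, ny, nx) snowfall per 6-h step (cm);
    population: (ny, nx) people per grid cell."""
    event_total = snow_6h.sum(axis=0)             # storm-integrated snowfall
    affected = event_total >= snow_threshold      # cells meaningfully hit
    return float((event_total * population * affected).sum())
```

    For two identical 30 cm storms, one centred on a cell of 1000 people and one on a cell of 1 person, the score is 30000 versus 30, which is the societally weighted ranking the detection algorithm is after.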

  7. On Day-to-Day Variability of Global Lightning Activity as Quantified from Background Schumann Resonance Observations

    Science.gov (United States)

    Mushtak, V. C.; Williams, E. R.

    2011-12-01

    Among the palette of methods (satellite, VLF, ELF) for monitoring global lightning activity, observations of the background Schumann resonances (SR) provide a unique prospect for estimating integrated global lightning activity in absolute units (coul^2 km^2/sec). This prospect is ensured by the SR waves' low attenuation, with wavelengths commensurate with the dimensions of dominant regional lightning "chimneys", and by the accumulating methodology for background SR techniques. Another benefit is the reduction of SR measurements into a compact set of resonance characteristics (modal frequencies, intensities, and quality factors). Suggested and tested in numerical simulations by T.R. Madden in the 1960s, the idea to invert the SR characteristics for the global lightning source has been further developed, statistically substantiated, and practically realized here on the basis of computing power and a quantity of experimental material way beyond what the SR pioneers had at their disposal. The critical issue of the quality of the input SR parameters is addressed by implementing a statistically substantiated sanitizing procedure to dispose of the fragments of the observed time series containing unrepresentative elements - local interference of various origin and strong ELF transients originating outside the major "chimneys" represented in the source model. As a result of preliminary research, a universal empirical sanitizing criterion has been established. Because the actual observations have been collected from a set of individually organized ELF stations with various equipment sets and calibration techniques, the relative parameters in both input (the intensities) and output (the "chimney" activities) are being used as far as possible in the inversion process to avoid instabilities caused by calibration inconsistencies. The absolute regional activities - and so the sought-for global activity in absolute units - are determined in the

  8. Enhancing E. coli tolerance towards oxidative stress via engineering its global regulator cAMP receptor protein (CRP).

    Directory of Open Access Journals (Sweden)

    Souvik Basak

    Full Text Available Oxidative damage to microbial hosts often occurs under stressful conditions during bioprocessing. Classical strain engineering approaches are usually both time-consuming and labor intensive. Here, we aim to improve E. coli performance under oxidative stress via engineering its global regulator cAMP receptor protein (CRP), which can directly or indirectly regulate the redox-sensing regulators SoxR and OxyR, and ~400 other genes in E. coli. An error-prone PCR technique was employed to introduce modifications to CRP, and three mutants (OM1-OM3) were identified with improved tolerance via H2O2 enrichment selection. The best mutant, OM3, could grow in 12 mM H2O2 with a growth rate of 0.6 h^-1, whereas the growth of the wild type was completely inhibited at this H2O2 concentration. OM3 also elicited enhanced thermotolerance at 48°C as well as resistance against cumene hydroperoxide. The investigation of intracellular reactive oxygen species (ROS), which determine cell viability, indicated that the accumulation of ROS in OM3 was always lower than in WT with or without H2O2 treatment. Genome-wide DNA microarray analysis has shown that not only CRP-regulated genes demonstrated great transcriptional level changes (up to 8.9-fold), but also RpoS- and OxyR-regulated genes (up to 7.7-fold). qRT-PCR data and enzyme activity assays suggested that catalase (katE) could be a major antioxidant enzyme in OM3 instead of alkyl hydroperoxide reductase or superoxide dismutase. To our knowledge, this is the first work on improving E. coli oxidative stress resistance by reframing its transcription machinery through its native global regulator. The positive outcome of this approach suggests that engineering CRP can be successfully implemented as an efficient strain engineering alternative for E. coli.

  9. Deep-Sea DuraFET: A Pressure Tolerant pH Sensor Designed for Global Sensor Networks.

    Science.gov (United States)

    Johnson, Kenneth S; Jannasch, Hans W; Coletti, Luke J; Elrod, Virginia A; Martz, Todd R; Takeshita, Yuichiro; Carlson, Robert J; Connery, James G

    2016-03-15

    Increasing atmospheric carbon dioxide is driving a long-term decrease in ocean pH which is superimposed on daily to seasonal variability. These changes impact ecosystem processes, and they serve as a record of ecosystem metabolism. However, the temporal variability in pH is observed at only a few locations in the ocean because a ship is required to support pH observations of sufficient precision and accuracy. This paper describes a pressure tolerant Ion Sensitive Field Effect Transistor pH sensor that is based on the Honeywell Durafet ISFET die. When combined with an AgCl pseudoreference sensor that is immersed directly in seawater, the system is capable of operating for years at a time on platforms that cycle from depths of several km to the surface. The paper also describes the calibration scheme developed to allow calibrated pH measurements to be derived from the activity of HCl reported by the sensor system over the range of ocean pressure and temperature. Deployments on vertical profiling platforms enable self-calibration in deep waters where pH values are stable. Measurements with the sensor indicate that it is capable of reporting pH with an accuracy of 0.01 or better on the total proton scale and a precision over multiyear periods of 0.005. This system enables a global ocean observing system for ocean pH.

  10. Quantifying global-brain metabolite level changes with whole-head proton MR spectroscopy at 3T.

    Science.gov (United States)

    Davitz, Matthew S; Wu, William E; Soher, Brian J; Babb, James S; Kirov, Ivan I; Huang, Jeffrey; Fatterpekar, Girish; Gonen, Oded

    2017-01-01

    To assess the sensitivity of non-localized, whole-head 1H-MRS to an individual's serial changes in total-brain NAA, Glx, Cr and Cho concentrations - metabolite metrics often used as surrogate markers in neurological pathologies. In this prospective study, four back-to-back (single imaging session) and three serial (successive sessions) non-localizing, ~3 min 1H-MRS (TE/TR/TI = 5/10^4/940 ms) scans were performed on 18 healthy young volunteers: 9 women, 9 men; 29.9±7.6 [mean±standard deviation (SD)] years old. These were analyzed by calculating a within-subject coefficient of variation (CV = SD/mean) to assess intra- and inter-scan repeatability and prediction intervals. This study was Health Insurance Portability and Accountability Act compliant. All subjects gave institutional review board-approved written, informed consent. The intra-scan CVs for NAA, Glx, Cr and Cho were: 3.9±1.8%, 7.3±4.6%, 4.0±3.4% and 2.5±1.6%, and the corresponding inter-scan (longitudinal) values were: 7.0±3.1%, 10.6±5.6%, 7.6±3.5% and 7.0±3.9%. This method is shown to have 80% power to detect changes of 14%, 27%, 26% and 19% between two serial measurements in a given individual. Subject to the assumption that in neurological disorders NAA, Glx, Cr and Cho changes represent brain-only pathology and not muscles, bone marrow, adipose tissue or epithelial cells, this approach enables us to quantify them, thereby adding specificity to the assessment of the total disease load. This will facilitate monitoring diffuse pathologies with faster measurement, more extensive (~90% of the brain) spatial coverage and sensitivity than localized 1H-MRS. Copyright © 2016 Elsevier Inc. All rights reserved.
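
    The repeatability statistics above rest on the within-subject coefficient of variation, CV = SD/mean. A sketch with hypothetical scan values follows; the 2.77 factor (1.96 x sqrt(2)) is the conventional 95% repeatability coefficient, and the paper's 80%-power thresholds may be computed differently.

```python
# Within-subject CV (= SD/mean) and a standard minimal-detectable-change
# estimate between two serial scans. Scan values are hypothetical; the
# paper's 80%-power thresholds may use a different formula.
from statistics import mean, stdev

def within_subject_cv(values):
    return stdev(values) / mean(values)

def minimal_detectable_change(cv, z=1.96):
    # A difference of two scans has SD = sqrt(2) * CV (in relative units),
    # so the 95% repeatability coefficient is 1.96 * sqrt(2) * CV.
    return z * (2.0 ** 0.5) * cv

scans = [10.0, 10.4, 9.8, 10.2]      # hypothetical serial NAA estimates
cv = within_subject_cv(scans)        # ~0.026, i.e. ~2.6%
mdc = minimal_detectable_change(cv)  # ~0.071, i.e. ~7.1%
```

    With a CV of ~2.6%, a serial change smaller than ~7% would be indistinguishable from scan-rescan noise, which is the sense in which the paper's 14-27% detectability figures should be read.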

  11. Estimating uncertainty and its temporal variation related to global climate models in quantifying climate change impacts on hydrology

    Science.gov (United States)

    Shen, Mingxi; Chen, Jie; Zhuan, Meijia; Chen, Hua; Xu, Chong-Yu; Xiong, Lihua

    2018-01-01

    Uncertainty estimation of climate change impacts on hydrology has received much attention in the research community. The choice of a global climate model (GCM) is usually considered as the largest contributor to the uncertainty of climate change impacts. The temporal variation of GCM uncertainty needs to be investigated for making long-term decisions to deal with climate change. Accordingly, this study investigated the temporal variation (mainly long-term) of uncertainty related to the choice of a GCM in predicting climate change impacts on hydrology by using multi-GCMs over multiple continuous future periods. Specifically, twenty CMIP5 GCMs under RCP4.5 and RCP8.5 emission scenarios were adopted to adequately represent this uncertainty envelope, and fifty-one 30-year future periods moving from 2021 to 2100 at 1-year intervals were produced to express the temporal variation. Future climatic and hydrological regimes over all future periods were compared to those in the reference period (1971-2000) using a set of metrics, including mean and extremes. The periodicity of climatic and hydrological changes and their uncertainty were analyzed using wavelet analysis, while the trend was analyzed using the Mann-Kendall trend test and regression analysis. The results showed that both future climate change (precipitation and temperature) and hydrological response predicted by the twenty GCMs were highly uncertain, and the uncertainty increased significantly over time. For example, the change of mean annual precipitation increased from 1.4% in 2021-2050 to 6.5% in 2071-2100 for RCP4.5 in terms of the median value of multi-models, but the projected uncertainty reached 21.7% in 2021-2050 and 25.1% in 2071-2100 for RCP4.5. The uncertainty under a high emission scenario (RCP8.5) was much larger than that under a relatively low emission scenario (RCP4.5). Almost all climatic and hydrological regimes and their uncertainty did not show significant periodicity at the P = .05 significance
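
    The Mann-Kendall trend test used above can be sketched in a few lines. This is the no-ties form of the test; real hydrological series with tied values need the tie-correction term in the variance.

```python
# Mann-Kendall trend test (no-ties form): S counts concordant minus
# discordant pairs; Z uses the standard normal approximation.
import math

def mann_kendall(x):
    n = len(x)
    s = sum((xj > xi) - (xj < xi)
            for i, xi in enumerate(x) for xj in x[i + 1:])
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z  # |z| > 1.96 -> trend significant at P = .05
```

    A strictly increasing series of length 10 gives S = 45 and Z of about 3.94, a significant monotonic trend at P = .05.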

  12. Soil Methanotrophy Model (MeMo v1.0): a process-based model to quantify global uptake of atmospheric methane by soil

    Science.gov (United States)

    Murguia-Flores, Fabiola; Arndt, Sandra; Ganesan, Anita L.; Murray-Tortarolo, Guillermo; Hornibrook, Edward R. C.

    2018-06-01

    Soil bacteria known as methanotrophs are the sole biological sink for atmospheric methane (CH4), a potent greenhouse gas that is responsible for ~20 % of the human-driven increase in radiative forcing since pre-industrial times. Soil methanotrophy is controlled by a plethora of factors, including temperature, soil texture, moisture and nitrogen content, resulting in spatially and temporally heterogeneous rates of soil methanotrophy. As a consequence, the exact magnitude of the global soil sink, as well as its temporal and spatial variability, remains poorly constrained. We developed a process-based model (Methanotrophy Model; MeMo v1.0) to simulate and quantify the uptake of atmospheric CH4 by soils at the global scale. MeMo builds on previous models by Ridgwell et al. (1999) and Curry (2007) by introducing several advances, including (1) a general analytical solution of the one-dimensional diffusion-reaction equation in porous media, (2) a refined representation of nitrogen inhibition on soil methanotrophy, (3) updated factors governing the influence of soil moisture and temperature on CH4 oxidation rates and (4) the ability to evaluate the impact of autochthonous soil CH4 sources on uptake of atmospheric CH4. We show that the improved structural and parametric representation of key drivers of soil methanotrophy in MeMo results in a better fit to observational data. A global simulation of soil methanotrophy for the period 1990-2009 using MeMo yielded an average annual sink of 33.5 ± 0.6 Tg CH4 yr-1. Warm and semi-arid regions (tropical deciduous forest and open shrubland) had the highest CH4 uptake rates of 602 and 518 mg CH4 m-2 yr-1, respectively. In these regions, favourable annual soil moisture content (~20 % saturation) and low seasonal temperature variations (variations < ~6 °C) provided optimal conditions for soil methanotrophy and soil-atmosphere gas exchange. In contrast to previous model analyses, but in agreement with recent observational data
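
    The diffusion-reaction equation MeMo solves has a well-known textbook special case: for a homogeneous semi-infinite soil with constant diffusivity D and first-order oxidation rate constant k, the steady-state profile is C(z) = C_atm exp(-z sqrt(k/D)) and the surface uptake flux is F = C_atm sqrt(D k). A sketch of that special case with illustrative parameter values follows; MeMo's general analytical solution (layered soils, internal CH4 sources) goes well beyond this.

```python
# Textbook special case behind soil CH4 uptake modelling: steady-state 1-D
# diffusion with first-order consumption in a homogeneous semi-infinite
# soil gives C(z) = C_atm * exp(-z * sqrt(k / D)) and a surface uptake
# flux F = C_atm * sqrt(D * k). Parameter values are illustrative only.
import math

def concentration(z, c_atm, D, k):
    return c_atm * math.exp(-z * math.sqrt(k / D))

def uptake_flux(c_atm, D, k):
    return c_atm * math.sqrt(D * k)

c_atm = 1.8   # surface CH4 (arbitrary concentration units)
D = 2.0e-5    # effective diffusivity (m^2 s^-1), illustrative
k = 1.0e-4    # first-order oxidation rate (s^-1), illustrative

analytical = uptake_flux(c_atm, D, k)
# Cross-check against a finite-difference surface gradient, F = -D dC/dz.
h = 1.0e-6
numerical = D * (concentration(0.0, c_atm, D, k)
                 - concentration(h, c_atm, D, k)) / h
```

    The finite-difference gradient at the surface reproduces the closed-form flux, which is the kind of consistency check a general analytical solution makes cheap.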

  13. Soil Methanotrophy Model (MeMo v1.0): a process-based model to quantify global uptake of atmospheric methane by soil

    Directory of Open Access Journals (Sweden)

    F. Murguia-Flores

    2018-06-01

    Full Text Available Soil bacteria known as methanotrophs are the sole biological sink for atmospheric methane (CH4), a potent greenhouse gas that is responsible for ∼20 % of the human-driven increase in radiative forcing since pre-industrial times. Soil methanotrophy is controlled by a plethora of factors, including temperature, soil texture, moisture and nitrogen content, resulting in spatially and temporally heterogeneous rates of soil methanotrophy. As a consequence, the exact magnitude of the global soil sink, as well as its temporal and spatial variability, remains poorly constrained. We developed a process-based model (Methanotrophy Model; MeMo v1.0) to simulate and quantify the uptake of atmospheric CH4 by soils at the global scale. MeMo builds on previous models by Ridgwell et al. (1999) and Curry (2007) by introducing several advances, including (1) a general analytical solution of the one-dimensional diffusion–reaction equation in porous media, (2) a refined representation of nitrogen inhibition on soil methanotrophy, (3) updated factors governing the influence of soil moisture and temperature on CH4 oxidation rates and (4) the ability to evaluate the impact of autochthonous soil CH4 sources on uptake of atmospheric CH4. We show that the improved structural and parametric representation of key drivers of soil methanotrophy in MeMo results in a better fit to observational data. A global simulation of soil methanotrophy for the period 1990–2009 using MeMo yielded an average annual sink of 33.5 ± 0.6 Tg CH4 yr−1. Warm and semi-arid regions (tropical deciduous forest and open shrubland) had the highest CH4 uptake rates of 602 and 518 mg CH4 m−2 yr−1, respectively. In these regions, favourable annual soil moisture content (∼20 % saturation) and low seasonal temperature variations (variations < ∼6 °C) provided optimal conditions for soil methanotrophy and soil–atmosphere gas exchange

  14. Quantifying and reducing the differences in forest CO2-fluxes estimated by eddy covariance, biometric and chamber methods: A global synthesis

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Xingchang; Wang, Chuankuan; Bond-Lamberty, Benjamin

    2017-12-15

    Carbon dioxide (CO2) fluxes between terrestrial ecosystems and the atmosphere are primarily measured with eddy covariance (EC), biometric, and chamber methods. However, it is unclear why the estimates of CO2-fluxes, when measured using these different methods, converge at some sites but diverge at others. We synthesized a novel global dataset of forest CO2-fluxes to evaluate the consistency between EC and biometric or chamber methods for quantifying the CO2 budget in forests. Compared with the other two methods, the EC approach tended to produce a 25% higher estimate of net ecosystem production (NEP; 0.52 Mg C ha-1 yr-1), mainly resulting from its lower estimate of Re; a 10% lower ecosystem respiration (Re; 1.39 Mg C ha-1 yr-1); and a 3% lower gross primary production (GPP; 0.48 Mg C ha-1 yr-1). The discrepancies between EC and the other methods were higher at sites with complex topography and dense canopies versus those with flat topography and open canopies. Forest age also influenced the discrepancy through the change of leaf area index. The open-path EC system induced >50% of the discrepancy in NEP, presumably due to its surface heating effect. These results provided strong evidence that EC produces biased estimates of NEP and Re in forest ecosystems. A global extrapolation suggested that the discrepancies in CO2 fluxes between methods were consistent with a global underestimation of Re, and overestimation of NEP, by the EC method. Accounting for these discrepancies would substantially improve our estimates of the terrestrial carbon budget.

  15. Quantifying Matter

    CERN Document Server

    Angelo, Joseph A

    2011-01-01

    Quantifying Matter explains how scientists learned to measure matter and quantify some of its most fascinating and useful properties. It presents many of the most important intellectual achievements and technical developments that led to the scientific interpretation of substance. Complete with full-color photographs, this exciting new volume describes the basic characteristics and properties of matter. Chapters include: Exploring the Nature of Matter; The Origin of Matter; The Search for Substance; Quantifying Matter During the Scientific Revolution; and Understanding Matter's Electromagnet

  16. Integrating metabolic performance, thermal tolerance, and plasticity enables for more accurate predictions on species vulnerability to acute and chronic effects of global warming.

    Science.gov (United States)

    Magozzi, Sarah; Calosi, Piero

    2015-01-01

    Predicting species vulnerability to global warming requires a comprehensive, mechanistic understanding of sublethal and lethal thermal tolerances. To date, however, most studies investigating species physiological responses to increasing temperature have focused on the underlying physiological traits of either acute or chronic tolerance in isolation. Here we propose an integrative, synthetic approach including the investigation of multiple physiological traits (metabolic performance and thermal tolerance), and their plasticity, to provide more accurate and balanced predictions on species and assemblage vulnerability to both acute and chronic effects of global warming. We applied this approach to more accurately elucidate relative species vulnerability to warming within an assemblage of six caridean prawns occurring in the same geographic, hence macroclimatic, region, but living in different thermal habitats. Prawns were exposed to four incubation temperatures (10, 15, 20 and 25 °C) for 7 days, their metabolic rates and upper thermal limits were measured, and plasticity was calculated according to the concept of Reaction Norms, as well as Q10 for metabolism. Compared to species occupying narrower/more stable thermal niches, species inhabiting broader/more variable thermal environments (including the invasive Palaemon macrodactylus) are likely to be less vulnerable to extreme acute thermal events as a result of their higher upper thermal limits. Nevertheless, they may be at greater risk from chronic exposure to warming due to the greater metabolic costs they incur. Indeed, a trade-off between acute and chronic tolerance was apparent in the assemblage investigated. However, the invasive species P. macrodactylus represents an exception to this pattern, showing elevated thermal limits and plasticity of these limits, as well as a high metabolic control. In general, integrating multiple proxies for species physiological acute and chronic responses to increasing
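
    Q10, used above to summarize the thermal sensitivity of metabolism, is the factor by which a rate increases per 10 °C change in temperature. A one-line sketch with hypothetical rates:

```python
# Q10: the factor by which a metabolic rate increases per 10 deg C,
# computed from rates measured at two temperatures (hypothetical values).
def q10(rate1, rate2, t1, t2):
    return (rate2 / rate1) ** (10.0 / (t2 - t1))

# A rate that doubles between 15 and 25 deg C has Q10 = 2.
example = q10(1.2, 2.4, 15.0, 25.0)
```

    A high Q10 is the "greater metabolic cost of warming" referred to above: the same temperature rise inflates maintenance metabolism more in a high-Q10 species than in a low-Q10 one.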

  17. Assessment of the short-term safety and tolerability of a quantified 80 % ethanol extract from the stem bark of Nauclea pobeguinii (PR 259 CT1) in healthy volunteers: a clinical phase I study.

    Science.gov (United States)

    Mesia, Kahunu; Cimanga, Kanyanga; Tona, Lutete; Mampunza, Ma Miezi; Ntamabyaliro, Nsengi; Muanda, Tsobo; Muyembe, Tamfum; Totté, Jozef; Mets, Tony; Pieters, Luc; Vlietinck, Arnold

    2011-01-01

    The aim of this study was to evaluate the short-term safety and tolerability of an antimalarial herbal medicinal product (PR 259 CT1) consisting of a quantified 80 % ethanol extract from the stem bark of Nauclea pobeguinii when given orally to healthy adult male volunteers. The amount of the major alkaloid strictosamide in the extract was determined by a validated HPLC method and was shown to be 5.6 %. The herbal preparation was formulated in a gelatine capsule form containing 500 mg of PR 259 CT1. A sample of 15 healthy male volunteers, selected using the Lot Quality Assurance of Sampling (LQAS) method, was eligible for inclusion after fulfillment of the inclusion criteria and clinical examination by a physician. The volunteers were treated in an outpatient clinic with a drug regimen of two 500 mg capsules three times daily (every eight hours) for seven days, during meals. Safety and tolerability were monitored clinically, haematologically, biochemically and by electrocardiographic (ECG) examination at days 0, 1, 3, 7 and 14. Adverse effects were recorded by self-reporting of the participants or by detection of abnormalities in clinical examinations by a physician. The oral administration of PR 259 CT1 at high doses of 2 × 500 mg/capsule/day for 7 days was found to induce no significant changes in the concentration levels of all investigated haematological, biochemical, electrocardiogram and vital sign parameters and physical characteristics after 14 days of treatment compared to those seen in the baseline data. The concentration levels of all evaluated parameters were within the normal limits as reported in the literature. All adverse events noted were mild and self-resolving, including increase of appetite (33 %), headache (20 %) and nausea (20 %). Other minor side effects were insomnia, somnolence and asthenia (7 %). Thus, PR 259 CT1 demonstrated sufficient safety and tolerability in healthy volunteers to allow its further development by starting a phase II

  18. Globalization

    Directory of Open Access Journals (Sweden)

    Tulio Rosembuj

    2006-12-01

    There is no single globalization, nor is it the result of an individual agent. We could begin by saying that global action has different angles, that the subjects who perform it differ, and that so do its objectives. The global is an invisible invasion of materials and immediate effects.

  19. Globalization

    OpenAIRE

    Tulio Rosembuj

    2006-01-01

    There is no single globalization, nor is it the result of an individual agent. We could begin by saying that global action has different angles, that the subjects who perform it differ, and that so do its objectives. The global is an invisible invasion of materials and immediate effects.

  20. Quantifying Transmission.

    Science.gov (United States)

    Woolhouse, Mark

    2017-07-01

    Transmissibility is the defining characteristic of infectious diseases. Quantifying transmission matters for understanding infectious disease epidemiology and designing evidence-based disease control programs. Tracing individual transmission events can be achieved by epidemiological investigation coupled with pathogen typing or genome sequencing. Individual infectiousness can be estimated by measuring pathogen loads, but few studies have directly estimated the ability of infected hosts to transmit to uninfected hosts. Individuals' opportunities to transmit infection are dependent on behavioral and other risk factors relevant given the transmission route of the pathogen concerned. Transmission at the population level can be quantified through knowledge of risk factors in the population or phylogeographic analysis of pathogen sequence data. Mathematical model-based approaches require estimation of the per capita transmission rate and basic reproduction number, obtained by fitting models to case data and/or analysis of pathogen sequence data. Heterogeneities in infectiousness, contact behavior, and susceptibility can have substantial effects on the epidemiology of an infectious disease, so estimates of only mean values may be insufficient. For some pathogens, super-shedders (infected individuals who are highly infectious) and super-spreaders (individuals with more opportunities to transmit infection) may be important. Future work on quantifying transmission should involve integrated analyses of multiple data sources.
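    The model-based route described in this record can be sketched in a few lines: under a simple SIR assumption with mean infectious period D, the early exponential growth rate r of incident case counts gives the basic reproduction number as R0 ≈ 1 + rD. The helper below is a hypothetical illustration of that relation, not a method from the paper.

    ```python
    import numpy as np

    def estimate_r0_from_growth(case_counts, dt, infectious_period):
        """Estimate R0 from the early exponential growth rate of incident
        cases, via the simple SIR relation R0 = 1 + r * D.

        case_counts: incident cases per reporting interval (early epidemic only)
        dt: length of one reporting interval (same time unit as D)
        infectious_period: mean infectious period D
        """
        t = np.arange(len(case_counts)) * dt
        # log-linear fit: log(cases) = log(c0) + r * t; slope is the growth rate r
        r, _ = np.polyfit(t, np.log(case_counts), 1)
        return 1.0 + r * infectious_period
    ```

    For instance, synthetic daily counts growing at r = 0.2 per day with a 5-day infectious period recover R0 = 2. Real case data would need care about reporting delays and the validity of the exponential-growth window.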

  1. Global Transcriptome Analysis Reveals Distinct Aluminum-Tolerance Pathways in the Al-Accumulating Species Hydrangea macrophylla and Marker Identification.

    Directory of Open Access Journals (Sweden)

    Haixia Chen

    Hydrangea (Hydrangea macrophylla) is a well-known Al-accumulating plant, showing a high level of aluminum (Al) tolerance and accumulation. Although the physiological mechanisms for detoxification of Al and the roles of Al in blue hydrangea sepals have been reported, the molecular mechanisms of Al tolerance and accumulation in hydrangea are poorly understood. In this study, we conducted a genome-wide transcriptome analysis of Al-responsive genes in the roots and leaves of hydrangea by RNA sequencing (RNA-seq). The assembled hydrangea transcriptome provides a rich source for gene identification and for mining molecular markers, including single nucleotide polymorphisms (SNPs) and simple sequence repeats (SSRs). A total of 401,215 transcripts with an average length of 810.77 bp were assembled, yielding 256,127 unigenes. After annotation, 4,287 genes in the roots and 730 genes in the leaves were found to be up-regulated by Al exposure, while 236 genes in the roots and 719 genes in the leaves were down-regulated. Many transporters, including members of the MATE and ABC families, were involved in transporting Al-citrate complexes from the roots. A plasma membrane Al uptake transporter, the Nramp aluminum transporter, was up-regulated in roots and leaves under Al stress, indicating that it may play an important role in Al tolerance by reducing the level of toxic Al. Although the exact roles of these candidate genes remain to be examined, these results provide a platform for further functional analysis of Al detoxification in hydrangea.

  2. Globalization

    OpenAIRE

    Andrușcă Maria Carmen

    2013-01-01

    Globalization has highlighted an interdependence among nations, sustained by their daily interaction, by the promotion of peace, and by efforts to streamline and increase the effectiveness of the global economy. For globalization to function, the developing countries, which can be helped by the developed ones, must be involved. The international community can contribute to the institution of the development environment of the gl...

  3. Comparative sensitivity and inhibitor tolerance of GlobalFiler® PCR Amplification and Investigator® 24plex QS kits for challenging samples.

    Science.gov (United States)

    Elwick, Kyleen; Mayes, Carrie; Hughes-Stamm, Sheree

    2018-02-17

    In cases such as mass disasters or missing persons, human remains are challenging to identify as they may be fragmented, burnt, buried, decomposed, and/or contaminated with inhibitory substances. This study compares the performance of a relatively new STR kit on the US market (Investigator® 24plex QS kit; Qiagen) with the GlobalFiler® PCR Amplification kit (Thermo Fisher Scientific) when genotyping highly inhibited and low-level DNA samples. DNA samples ranging from 1 ng to 7.8 pg were amplified to define the sensitivity of the two systems. In addition, DNA (1 ng and 0.1 ng input amounts) was spiked with various concentrations of five inhibitors common to human remains (humic acid, melanin, hematin, collagen, calcium). Furthermore, bone samples (N = 5) and tissue samples from decomposed human remains (N = 6) were used as mock casework samples for comparative analysis with both STR kits. The data suggest that the GlobalFiler® kit may be slightly more sensitive than the Investigator® kit: on average, STR profiles appeared more balanced and average peak heights were higher with the GlobalFiler® kit. However, the data also show that the Investigator® kit may be more tolerant of common PCR inhibitors. While both STR kits showed a decrease in alleles as inhibitor concentration increased, more complete profiles were obtained with the Investigator® kit. Of the 11 bone and decomposed tissue samples tested, 8 gave more complete and balanced STR profiles when amplified with the GlobalFiler® kit.

  4. Quantifying the temperature-independent effect of stratospheric aerosol geoengineering on global-mean precipitation in a multi-model ensemble

    International Nuclear Information System (INIS)

    Ferraro, Angus J; Griffiths, Hannah G

    2016-01-01

    The reduction in global-mean precipitation when stratospheric aerosol geoengineering is used to counterbalance global warming from increasing carbon dioxide (CO2) concentrations has been mainly attributed to the temperature-independent effect of CO2 on atmospheric radiative cooling. We demonstrate here that stratospheric sulphate aerosol itself also acts to reduce global-mean precipitation independently of its effects on temperature. The temperature-independent effect of stratospheric aerosol geoengineering on global-mean precipitation is calculated by removing temperature-dependent effects from climate model simulations of the Geoengineering Model Intercomparison Project (GeoMIP). When sulphate aerosol is injected into the stratosphere at a rate of 5 Tg SO2 per year, the aerosol reduces global-mean precipitation by approximately 0.2 %, though multiple ensemble members are required to separate this effect from internal variability. For comparison, the precipitation reduction from the temperature-independent effect of increasing CO2 concentrations under the future RCP4.5 scenario is approximately 0.5 %. The temperature-independent effect of stratospheric sulphate aerosol arises from the aerosol's effect on tropospheric radiative cooling. Radiative transfer calculations show this is mainly due to increased downward emission of infrared radiation by the aerosol, but there is also a contribution from the stratospheric warming the aerosol causes. Our results suggest climate model simulations of solar dimming can capture the main features of the global-mean precipitation response to stratospheric aerosol geoengineering.

  5. Globalization

    DEFF Research Database (Denmark)

    Plum, Maja

    Globalization is often referred to as external to education - a state of affairs facing the modern curriculum with numerous challenges. In this paper it is examined as internal to curriculum; analysed as a problematization in a Foucauldian sense, that is, as a complex of attentions, worries and ways of reasoning, producing curricular variables. The analysis is made through an example of early childhood curriculum in Danish pre-school, and the way the curricular variable of the pre-school child comes into being through globalization as a problematization, carried forth by the comparative practices of PISA...

  6. Globalization

    OpenAIRE

    F. Gerard Adams

    2008-01-01

    The rapid globalization of the world economy is causing fundamental changes in patterns of trade and finance. Some economists have argued that globalization has arrived and that the world is "flat". While the geographic scope of markets has increased, the author argues that new patterns of trade and finance are a result of the discrepancies between "old" countries and "new". As the differences are gradually wiped out, particularly if knowledge and technology spread worldwide, the t...

  7. Quantifying resilience

    Science.gov (United States)

    Allen, Craig R.; Angeler, David G.

    2016-01-01

    The biosphere is under unprecedented pressure, reflected in rapid changes in our global ecological, social, technological and economic systems. In many cases, ecological and social systems can adapt to these changes over time, but when a critical threshold is surpassed, a system under stress can undergo catastrophic change and reorganize into a different state. The concept of resilience, introduced more than 40 years ago in the ecological sciences, captures the behaviour of systems that can occur in alternative states. The original definition of resilience forwarded by Holling (1973) is still the most useful. It defines resilience as the amount of disturbance that a system can withstand before it shifts into an alternative stable state. The idea of alternative stable states has clear and profound implications for ecological management. Coral reefs, for example, are high-diversity systems that provide key ecosystem services such as fisheries and coastal protection. Human impacts are causing significant, ongoing reef degradation, and many reefs have shifted from coral- to algal-dominated states in response to anthropogenic pressures such as elevated water temperatures and overfishing. Understanding and differentiating between the factors that help maintain reefs in coral-dominated states vs. those that facilitate a shift to an undesired algal-dominated state is a critical step towards sound management and conservation of these, and other, important social–ecological systems.

  8. Global W`o'rming and Darwin Revisited: Quantifying Soil Mixing Rates by Non-native Earthworms in Fennoscandian Boreal and Arctic Ecosystems

    Science.gov (United States)

    Wackett, A. A.; Yoo, K.; Cameron, E. K.; Olid, C.; Klaminder, J.

    2017-12-01

    Fennoscandian boreal and arctic ecosystems represent some of the most pristine environments in Europe and store sizeable quantities of soil carbon. Both ecosystems may have evolved without native earthworms since the last glaciation, but are now increasingly subject to arrivals of novel geoengineering earthworm species due to human activities. As a result, invaded areas are devoid of the typical thick organic horizon present in earthworm free forest soils and instead contain carbon-rich mineral (A-horizon) soils at the surface. How rapidly this transition occurs and how it affects the fate of soil organic carbon (SOC) pools is not well known. In this study, we quantify the rates at which earthworm-mediated mixing of forest soils proceeds in these formerly glaciated landscapes. We infer soil mass fluxes using the vertical distribution of 210Pb in soils from Fennoscandia (N=4) and North America (N=1) and quantify annual mixing velocities as well as vertical fluxes of organic and mineral matter throughout the upper soil profiles. Across the sites, mixing velocities generally increase with increasing earthworm biomass and functional group diversity, and our annual mixing rates closely align with those predicted by Darwin for earthworm-engineered ecosystems in the UK 130 years earlier. Reduction of the O-horizon is concomitant with a decrease in surface SOC contents. However, we observe minimal changes to SOC inventories with earthworm invasion across the sites, reflecting the upward translocation of mineral soil and accompanying increase in soil bulk densities. Thus, the reduction or depletion of organic horizon by exotic earthworms does not necessarily involve loss of SOC via earthworm-accelerated decomposition, but is rather compensated for by physical mixing of organic matter and minerals, which may facilitate stabilizing organo-mineral interactions. 
This work constitutes an important step to elucidate how non-native earthworms impact SOC inventories and potentially

  9. A Bayesian method to quantify azimuthal anisotropy model uncertainties: application to global azimuthal anisotropy in the upper mantle and transition zone

    Science.gov (United States)

    Yuan, K.; Beghein, C.

    2018-04-01

    Seismic anisotropy is a powerful tool to constrain mantle deformation, but its existence in the deep upper mantle and topmost lower mantle is still uncertain. Recent results from higher mode Rayleigh waves have, however, revealed the presence of 1 per cent azimuthal anisotropy between 300 and 800 km depth, and changes in azimuthal anisotropy across the mantle transition zone boundaries. This has important consequences for our understanding of mantle convection patterns and deformation of deep mantle material. Here, we propose a Bayesian method to model depth variations in azimuthal anisotropy and to obtain quantitative uncertainties on the fast seismic direction and anisotropy amplitude from phase velocity dispersion maps. We applied this new method to existing global fundamental and higher mode Rayleigh wave phase velocity maps to assess the likelihood of azimuthal anisotropy in the deep upper mantle and to determine whether previously detected changes in anisotropy at the transition zone boundaries are robustly constrained by those data. Our results confirm that deep upper-mantle azimuthal anisotropy is favoured and well constrained by the higher mode data employed. The fast seismic directions are in agreement with our previously published model. The data favour a model characterized, on average, by changes in azimuthal anisotropy at the top and bottom of the transition zone. However, this change in fast axes is not a global feature as there are regions of the model where the azimuthal anisotropy direction is unlikely to change across depths in the deep upper mantle. We were, however, unable to detect any clear pattern or connection with surface tectonics. Future studies will be needed to further improve the lateral resolution of this type of model at transition zone depths.

  10. A statistical approach to quantify uncertainty in carbon monoxide measurements at the Izaña global GAW station: 2008–2011

    Directory of Open Access Journals (Sweden)

    A. J. Gomez-Pelaez

    2013-03-01

    Atmospheric CO in situ measurements are carried out at the Izaña (Tenerife) global GAW (Global Atmosphere Watch Programme of the World Meteorological Organization - WMO) mountain station using a Reduction Gas Analyser (RGA). In situ measurements at Izaña are representative of the subtropical Northeast Atlantic free troposphere, especially during nighttime. We present the measurement system configuration, the response function, the calibration scheme, the data processing, the Izaña 2008–2011 CO nocturnal time series, and the mean diurnal cycle by month. We have developed a rigorous uncertainty analysis for the carbon monoxide measurements carried out at the Izaña station, which could be applied to other GAW stations. We determine the combined standard measurement uncertainty from four contributing components: the uncertainty of the WMO standard gases interpolated over the range of measurement; the uncertainty that takes into account the agreement between the standard gases and the response function used; the uncertainty due to the repeatability of the injections; and the propagated uncertainty related to the temporal consistency of the response function parameters (which also takes into account the covariance between the parameters). The mean value of the combined standard uncertainty decreased significantly after March 2009, from 2.37 nmol mol−1 to 1.66 nmol mol−1, due to improvements in the measurement system. A fifth type of uncertainty, which we call representation uncertainty, is considered when some of the data necessary to compute a temporal mean are absent. Any computed mean also has a propagated uncertainty arising from the uncertainties of the data used to compute it. The law of propagation depends on the type of uncertainty component (random or systematic). In situ hourly means are compared with simultaneous and collocated NOAA flask samples. The uncertainty of the differences is computed and used to determine
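    The combination and propagation rules described in this record follow the standard GUM treatment: independent standard-uncertainty components combine in root-sum-of-squares, and when propagating to a temporal mean, random components average down as 1/sqrt(n) while systematic ones do not. A minimal sketch with hypothetical helper names, not code from the station's processing chain:

    ```python
    import math

    def combined_standard_uncertainty(components):
        """GUM-style combination of independent standard uncertainty
        components: the square root of the sum of their squares."""
        return math.sqrt(sum(u * u for u in components))

    def uncertainty_of_mean(u_random, u_systematic, n):
        """Propagated standard uncertainty of an n-point mean: random
        components shrink as 1/sqrt(n); systematic components do not."""
        return math.sqrt(u_random ** 2 / n + u_systematic ** 2)
    ```

    With illustrative component values of 1.2, 0.8, 0.6 and 0.5 nmol mol−1, the combined standard uncertainty is their quadrature sum, about 1.6 nmol mol−1.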

  11. A statistical approach to quantify uncertainty in carbon monoxide measurements at the Izaña global GAW station: 2008-2011

    Science.gov (United States)

    Gomez-Pelaez, A. J.; Ramos, R.; Gomez-Trueba, V.; Novelli, P. C.; Campo-Hernandez, R.

    2013-03-01

    Atmospheric CO in situ measurements are carried out at the Izaña (Tenerife) global GAW (Global Atmosphere Watch Programme of the World Meteorological Organization - WMO) mountain station using a Reduction Gas Analyser (RGA). In situ measurements at Izaña are representative of the subtropical Northeast Atlantic free troposphere, especially during nighttime. We present the measurement system configuration, the response function, the calibration scheme, the data processing, the Izaña 2008-2011 CO nocturnal time series, and the mean diurnal cycle by month. We have developed a rigorous uncertainty analysis for the carbon monoxide measurements carried out at the Izaña station, which could be applied to other GAW stations. We determine the combined standard measurement uncertainty from four contributing components: the uncertainty of the WMO standard gases interpolated over the range of measurement; the uncertainty that takes into account the agreement between the standard gases and the response function used; the uncertainty due to the repeatability of the injections; and the propagated uncertainty related to the temporal consistency of the response function parameters (which also takes into account the covariance between the parameters). The mean value of the combined standard uncertainty decreased significantly after March 2009, from 2.37 nmol mol-1 to 1.66 nmol mol-1, due to improvements in the measurement system. A fifth type of uncertainty, which we call representation uncertainty, is considered when some of the data necessary to compute a temporal mean are absent. Any computed mean also has a propagated uncertainty arising from the uncertainties of the data used to compute it. The law of propagation depends on the type of uncertainty component (random or systematic). In situ hourly means are compared with simultaneous and collocated NOAA flask samples. The uncertainty of the differences is computed and used to determine whether the differences are

  12. Crafting tolerance

    DEFF Research Database (Denmark)

    Kirchner, Antje; Freitag, Markus; Rapp, Carolin

    2011-01-01

    Ongoing changes in social structures, orientations, and value systems confront us with the growing necessity to address and understand transforming patterns of tolerance, as well as specific aspects such as social tolerance. Based on hierarchical analyses of the latest World Values Survey (2005–08) and national statistics for 28 countries, we assess both individual and contextual aspects that influence an individual's perception of different social groupings. Using a social tolerance index that captures personal attitudes toward these groupings, we present an institutional theory of social tolerance. Our...

  13. Quantifiers and working memory

    NARCIS (Netherlands)

    Szymanik, J.; Zajenkowski, M.

    2010-01-01

    The paper presents a study examining the role of working memory in quantifier verification. We created situations similar to the span task to compare numerical quantifiers of low and high rank, parity quantifiers and proportional quantifiers. The results enrich and support the data obtained

  14. Quantifiers and working memory

    NARCIS (Netherlands)

    Szymanik, J.; Zajenkowski, M.

    2009-01-01

    The paper presents a study examining the role of working memory in quantifier verification. We created situations similar to the span task to compare numerical quantifiers of low and high rank, parity quantifiers and proportional quantifiers. The results enrich and support the data obtained

  15. Quantifying Anthropogenic Dust Emissions

    Science.gov (United States)

    Webb, Nicholas P.; Pierre, Caroline

    2018-02-01

    Anthropogenic land use and land cover change, including local environmental disturbances, moderate rates of wind-driven soil erosion and dust emission. These human-dust cycle interactions impact ecosystems and agricultural production, air quality, human health, biogeochemical cycles, and climate. While the impacts of land use activities and land management on aeolian processes can be profound, the interactions are often complex and assessments of anthropogenic dust loads at all scales remain highly uncertain. Here, we critically review the drivers of anthropogenic dust emission and current evaluation approaches. We then identify and describe opportunities to: (1) develop new conceptual frameworks and interdisciplinary approaches that draw on ecological state-and-transition models to improve the accuracy and relevance of assessments of anthropogenic dust emissions; (2) improve model fidelity and capacity for change detection to quantify anthropogenic impacts on aeolian processes; and (3) enhance field research and monitoring networks to support dust model applications to evaluate the impacts of disturbance processes on local to global-scale wind erosion and dust emissions.

  16. Om tolerance

    DEFF Research Database (Denmark)

    Huggler, Jørgen

    2007-01-01

    The concept of tolerance and its meanings are discussed with a view to clarifying the concept's connections with the state, religion, freedom of expression, sceptical epistemology, anthropology, and pedagogy.

  17. Using global sensitivity analysis to understand higher order interactions in complex models: an application of GSA on the Revised Universal Soil Loss Equation (RUSLE) to quantify model sensitivity and implications for ecosystem services management in Costa Rica

    Science.gov (United States)

    Fremier, A. K.; Estrada Carmona, N.; Harper, E.; DeClerck, F.

    2011-12-01

    Appropriate application of complex models to estimate system behavior requires understanding the influence of model structure and parameter estimates on model output. To date, most researchers perform local rather than global sensitivity analyses because of computational time and the quantity of data produced. Local sensitivity analyses, however, cannot quantify the higher-order interactions among parameters, which can lead to an incomplete analysis of model behavior. To address this concern, we performed a GSA on a commonly applied equation for soil loss: the Revised Universal Soil Loss Equation (RUSLE). The USLE is an empirical model built on plot-scale data from the USA; the Revised version (RUSLE) includes improved equations for a wider range of conditions, with 25 parameters grouped into six factors to estimate long-term plot- and watershed-scale soil loss. Despite RUSLE's widespread application, a complete sensitivity analysis had yet to be performed. In this research, we applied a GSA to plot- and watershed-scale data from the US and Costa Rica to parameterize the RUSLE in an effort to understand the relative importance of model factors and parameters across a wide environmental space. We analyzed the GSA results using Random Forest, a statistical approach that evaluates parameter importance while accounting for higher-order interactions, and used Classification and Regression Trees to show the dominant trends in complex interactions. In all GSA calculations the cover-crop management (C) factor ranks highest among the factors (compared with rain-runoff erosivity, topography, support practices, and soil erodibility). This is counter to previous sensitivity analyses, in which the topographic factor was determined to be the most important. The GSA finding is consistent across multiple model runs, including data from the US, Costa Rica, and a synthetic dataset spanning the widest theoretical space.
The three most important parameters were: Mass density of live and dead roots found in the upper inch
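    The factor-ranking step in this record can be illustrated with a numpy-only variance-based sketch: sample the factors of the multiplicative RUSLE form A = R·K·LS·C·P, then estimate a crude first-order Sobol index for each factor as the variance of conditional bin means over the total variance. The ranges below are hypothetical, chosen so that cover management (C) spans the widest relative range; this is an illustration of variance-based GSA, not the study's Random Forest analysis.

    ```python
    import numpy as np

    def first_order_sobol(x, y, bins=20):
        """Crude first-order Sobol index: variance of the conditional
        means E[y | x in equal-mass bin] over the total variance of y."""
        edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
        idx = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
        bin_means = np.array([y[idx == b].mean() for b in range(bins)])
        return bin_means.var() / y.var()

    rng = np.random.default_rng(0)
    n = 100_000
    # hypothetical uniform ranges for each RUSLE factor
    R = rng.uniform(100, 200, n)     # rainfall-runoff erosivity
    K = rng.uniform(0.2, 0.4, n)     # soil erodibility
    LS = rng.uniform(0.5, 1.5, n)    # topographic factor
    C = rng.uniform(0.001, 0.5, n)   # cover management (widest relative span)
    P = rng.uniform(0.5, 1.0, n)     # support practices
    A = R * K * LS * C * P           # RUSLE soil loss

    S = {name: first_order_sobol(v, A)
         for name, v in [("R", R), ("K", K), ("LS", LS), ("C", C), ("P", P)]}
    ```

    With these (assumed) ranges the C factor dominates the output variance, mirroring the ranking reported in the record; with narrower C ranges the ordering would change.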

  18. Pharmacogenetic meta-analysis of baseline risk factors, pharmacodynamic, efficacy and tolerability endpoints from two large global cardiovascular outcomes trials for darapladib.

    Directory of Open Access Journals (Sweden)

    Astrid Yeo

    Darapladib, a lipoprotein-associated phospholipase A2 (Lp-PLA2) inhibitor, failed to demonstrate efficacy for the primary endpoints in two large phase III cardiovascular outcomes trials, one in stable coronary heart disease patients (STABILITY) and one in acute coronary syndrome patients (SOLID-TIMI 52). No major safety signals were observed, but tolerability issues of diarrhea and odor were common (up to 13%). We hypothesized that genetic variants associated with Lp-PLA2 activity may influence efficacy and tolerability, and therefore performed a comprehensive pharmacogenetic analysis of both trials. We genotyped patients within the STABILITY and SOLID-TIMI 52 trials who provided a DNA sample and consent (n = 13,577 and 10,404 respectively, representing 86% and 82% of the trial participants) using genome-wide arrays with exome content, and performed imputation using a 1000 Genomes reference panel. We investigated baseline and change-from-baseline Lp-PLA2 activity, two efficacy endpoints (major coronary events and myocardial infarction) and tolerability parameters, at genome-wide and candidate-gene level, using a meta-analytic approach. We replicated associations of published loci with baseline Lp-PLA2 activity (APOE, CELSR2, LPA, PLA2G7, LDLR and SCARB1) and identified three novel loci (TOMM5, FRMD5 and LPL) using the GWAS significance threshold P ≤ 5E-08. Review of the PLA2G7 gene (encoding Lp-PLA2) within these datasets identified V279F null allele carriers as well as three other rare exonic null alleles within various ethnic groups; however, none of these variants, nor any other loci associated with Lp-PLA2 activity at baseline, were associated with any of the drug response endpoints.
The analysis of darapladib efficacy endpoints, despite low power, identified six low frequency loci with main genotype effect (though with borderline imputation scores and one common locus (minor allele frequency 0.24 with genotype by treatment interaction effect passing the GWAS

  19. Towards Tolerance

    NARCIS (Netherlands)

    Lisette Kuyper; Jurjen Iedema; Saskia Keuzenkamp

    2013-01-01

    Across Europe, public attitudes towards lesbian, gay and bisexual (LGB) individuals range from broad tolerance to widespread rejection. Attitudes towards homosexuality are more than mere individual opinions, but form part of the social and political structures which foster or hinder the equality

  20. Intolerant tolerance.

    Science.gov (United States)

    Khushf, G

    1994-04-01

    The Hyde Amendment and Roman Catholic attempts to put restrictions on Title X funding have been criticized for being intolerant. However, such criticism fails to appreciate that there are two competing notions of tolerance, one focusing on the limits of state force and accepting pluralism as unavoidable, and the other focusing on the limits of knowledge and advancing pluralism as a good. These two types of tolerance, illustrated in the writings of John Locke and J.S. Mill, each involve an intolerance. In a pluralistic context where the free exercise of religion is respected, John Locke's account of tolerance is preferable. However, it (in a reconstructed form) leads to a minimal state. Positive entitlements to benefits like artificial contraception or nontherapeutic abortions can legitimately be resisted, because an intolerance has already been shown with respect to those that consider the benefit immoral, since their resources have been coopted by taxation to advance an end that is contrary to their own. There is a sliding scale from tolerance (viewed as forbearance) to the affirmation of communal integrity, and this scale maps on to the continuum from negative to positive rights.

  1. Quantifiers for quantum logic

    OpenAIRE

    Heunen, Chris

    2008-01-01

    We consider categorical logic on the category of Hilbert spaces. More generally, in fact, any pre-Hilbert category suffices. We characterise closed subobjects, and prove that they form orthomodular lattices. This shows that quantum logic is just an incarnation of categorical logic, enabling us to establish an existential quantifier for quantum logic, and conclude that there cannot be a universal quantifier.

  2. Quantifying Distributional Model Risk via Optimal Transport

    OpenAIRE

    Blanchet, Jose; Murthy, Karthyek R. A.

    2016-01-01

    This paper deals with the problem of quantifying the impact of model misspecification when computing general expected values of interest. The methodology that we propose is applicable in great generality, in particular, we provide examples involving path dependent expectations of stochastic processes. Our approach consists in computing bounds for the expectation of interest regardless of the probability measure used, as long as the measure lies within a prescribed tolerance measured in terms ...
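    One simple instance of such a bound, for illustration only (the paper's framework is far more general and covers path-dependent expectations): if the quantity of interest f is L-Lipschitz, Kantorovich-Rubinstein duality bounds the change in its expectation by L times the 1-Wasserstein distance, so over a ball of radius delta around the empirical measure the expectation lies within ±L·delta of the empirical mean. The helper name below is hypothetical.

    ```python
    import numpy as np

    def expectation_bounds(samples, f, lipschitz_const, delta):
        """Bounds on E_Q[f] over every measure Q within 1-Wasserstein
        distance delta of the empirical measure of `samples`, assuming
        f is Lipschitz with constant `lipschitz_const`."""
        base = float(np.mean([f(x) for x in samples]))
        slack = lipschitz_const * delta
        return base - slack, base + slack
    ```

    For a 1-Lipschitz f and tolerance delta = 0.5, samples [0, 2] give bounds (0.5, 1.5) around the empirical mean of 1. Tighter, f-specific bounds are what the optimal-transport formulation computes.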

  3. What's so local about global climate change? Testing social theories of environmental degradation to quantify the demographic, economic, and governmental factors associated with energy consumption and carbon dioxide emissions in U.S. metropolitan areas and counties

    Science.gov (United States)

    Tribbia, John Luke

    The STIRPAT method is used to test four social theories of environmental degradation - the treadmill of production, ecological modernization, urban ecological transitions, and human ecology theories - by quantifying variables associated with energy use and CO2 emissions drawn from each theory. The specific findings demonstrate that various demographic, economic, and governmental factors are strongly related to metropolitan-area energy consumption and county-level CO2 emissions. The human ecology, treadmill of production, and urban ecological transitions theories are important for explaining how and why climate-related impacts differ across a wide variety of local areas in the United States. In line with human ecology and treadmill of production theory, environmental degradation is highest in metropolitan areas and counties with large populations and large economies that have various mechanisms in place to facilitate economic growth. By contrast, some U.S. counties are beginning to remedy their impact on the environment by applying economic and governmental resources toward the mitigation of CO2 emissions, which provides evidence in support of urban ecological transitions theory. However, climate change is a complex, cross-scale global environmental problem, and the results in this dissertation confirm that it is locally driven by the same population and economic factors that affect the climate at larger spatial scales; mitigation efforts to reduce energy use and emissions at the local level will therefore be fruitless without a well-coordinated, cross-scale (local to global) ideological shift that puts less priority on economic goals and more on environmental sustainability. These results, and the methodological and theoretical framework applied in this dissertation, thus provide a useful platform for future research that specifically addresses mitigation strategies to reduce local-level environmental impacts.
This dissertation
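    The STIRPAT model referenced above is commonly written as I = a·P^b·A^c·T^d·e (impact as a function of population, affluence, and technology) and estimated as a log-linear regression. As an illustrative sketch only (this is not code from the dissertation, and the data below are synthetic), it can be fit by ordinary least squares:

```python
import numpy as np

# STIRPAT: I = a * P^b * A^c * T^d * e.  Taking logs gives a linear form
#   ln I = ln a + b ln P + c ln A + d ln T + ln e,
# so the elasticities b, c, d can be estimated by ordinary least squares.

rng = np.random.default_rng(0)
n = 200
ln_P = rng.normal(12, 1, n)    # log population (synthetic)
ln_A = rng.normal(10, 1, n)    # log affluence, e.g. GDP per capita (synthetic)
ln_T = rng.normal(2, 0.5, n)   # log technology proxy (synthetic)

# Generate synthetic impacts with assumed "true" elasticities 0.9, 0.5, 0.3.
ln_I = 1.0 + 0.9 * ln_P + 0.5 * ln_A + 0.3 * ln_T + rng.normal(0, 0.1, n)

X = np.column_stack([np.ones(n), ln_P, ln_A, ln_T])
coef, *_ = np.linalg.lstsq(X, ln_I, rcond=None)
intercept, b, c, d = coef
print(f"elasticities: population={b:.2f}, affluence={c:.2f}, technology={d:.2f}")
```

    With synthetic data the recovered elasticities land close to the values used to generate it; on real data the same regression yields the theory-comparison coefficients the abstract describes.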

  4. Connected Car: Quantified Self becomes Quantified Car

    Directory of Open Access Journals (Sweden)

    Melanie Swan

    2015-02-01

    Full Text Available The automotive industry could be facing a situation of profound change and opportunity in the coming decades. There are a number of influencing factors, such as increasing urban and aging populations, self-driving cars, 3D parts printing, energy innovation, and new models of transportation service delivery (Zipcar, Uber). The connected car means that vehicles are now part of the connected world: continuously Internet-connected, generating and transmitting data which on the one hand can be helpfully integrated into applications, like real-time traffic alerts broadcast to smartwatches, but which also raises security and privacy concerns. This paper explores the automotive connected world and describes five killer QS (Quantified Self)-auto sensor applications that link quantified-self sensors (sensors that measure the personal biometrics of individuals, like heart rate) and automotive sensors (sensors that measure driver and passenger biometrics, or quantitative automotive performance metrics like speed and braking activity). The applications are fatigue detection, real-time assistance for parking and accidents, anger management and stress reduction, keyless authentication and digital identity verification, and DIY diagnostics. These kinds of applications help to demonstrate the benefit of connected-world data streams in the automotive industry and beyond, where, more fundamentally for human progress, the automation of both physical and now cognitive tasks is underway.

  5. Heat tolerance in wheat

    DEFF Research Database (Denmark)

    Sharma, Dew Kumari

    As a consequence of global climate change, heat stress together with other abiotic stresses will remain an important determinant of future food security. Wheat (Triticum aestivum L.) is the third most important crop of the world feeding one third of the world population. Being a crop of temperate...... climate, wheat is sensitive to heat stress. We need to understand how our crops will perform in these changing climatic conditions and how we can develop varieties, which are more tolerant. The PhD study focussed on understanding heat tolerance in wheat with a combined approach of plant physiology...... and quantitative genetics in particular, plant phenotyping based quantitative trait loci (QTL) discovery for a physiological trait under heat stress. Chlorophyll a fluorescence trait, Fv/Fm was used as a phenotyping tool, as it reflects the effect of heat stress on maximum photochemical efficiency of photosystem...

  6. Salt Tolerance

    OpenAIRE

    Xiong, Liming; Zhu, Jian-Kang

    2002-01-01

    Studying salt stress is an important means to the understanding of plant ion homeostasis and osmo-balance. Salt stress research also benefits agriculture because soil salinity significantly limits plant productivity on agricultural lands. Decades of physiological and molecular studies have generated a large body of literature regarding potential salt tolerance determinants. Recent advances in applying molecular genetic analysis and genomics tools in the model plant Arabidopsis thaliana are sh...

  7. Infectious Tolerance

    OpenAIRE

    Jonuleit, Helmut; Schmitt, Edgar; Kakirman, Hacer; Stassen, Michael; Knop, Jürgen; Enk, Alexander H.

    2002-01-01

    Regulatory CD4+CD25+ T cells (Treg) are mandatory for maintaining immunologic self-tolerance. We demonstrate that the cell-cell contact–mediated suppression of conventional CD4+ T cells by human CD25+ Treg cells is fixation resistant, independent from membrane-bound TGF-β but requires activation and protein synthesis of CD25+ Treg cells. Coactivation of CD25+ Treg cells with Treg cell–depleted CD4+ T cells results in anergized CD4+ T cells that in turn inhibit the activation of conventional, ...

  8. Infectious Tolerance

    Science.gov (United States)

    Jonuleit, Helmut; Schmitt, Edgar; Kakirman, Hacer; Stassen, Michael; Knop, Jürgen; Enk, Alexander H.

    2002-01-01

    Regulatory CD4+CD25+ T cells (Treg) are mandatory for maintaining immunologic self-tolerance. We demonstrate that the cell-cell contact–mediated suppression of conventional CD4+ T cells by human CD25+ Treg cells is fixation resistant, independent from membrane-bound TGF-β but requires activation and protein synthesis of CD25+ Treg cells. Coactivation of CD25+ Treg cells with Treg cell–depleted CD4+ T cells results in anergized CD4+ T cells that in turn inhibit the activation of conventional, freshly isolated CD4+ T helper (Th) cells. This infectious suppressive activity, transferred from CD25+ Treg cells via cell contact, is cell contact–independent and partially mediated by soluble transforming growth factor (TGF)-β. The induction of suppressive properties in conventional CD4+ Th cells represents a mechanism underlying the phenomenon of infectious tolerance. This explains previously published conflicting data on the role of TGF-β in CD25+ Treg cell–induced immunosuppression. PMID:12119350

  9. Quantifying Productivity Gains from Foreign Investment

    NARCIS (Netherlands)

    C. Fons-Rosen (Christian); S. Kalemli-Ozcan (Sebnem); B.E. Sorensen (Bent); C. Villegas-Sanchez (Carolina)

    2013-01-01

    textabstractWe quantify the causal effect of foreign investment on total factor productivity (TFP) using a new global firm-level database. Our identification strategy relies on exploiting the difference in the amount of foreign investment by financial and industrial investors and simultaneously

  10. Is Time Predictability Quantifiable?

    DEFF Research Database (Denmark)

    Schoeberl, Martin

    2012-01-01

    Computer architects and researchers in the real-time domain have started to investigate processors and architectures optimized for real-time systems. Optimized for real-time systems means time predictable, i.e., architectures where it is possible to statically derive a tight bound of the worst......-case execution time. To compare different approaches we would like to quantify time predictability. That means we need to measure time predictability. In this paper we discuss the different approaches for these measurements and conclude that time predictability is practically not quantifiable. We can only...... compare the worst-case execution time bounds of different architectures....

  11. Quantifying uncertainty and resilience on coral reefs using a Bayesian approach

    International Nuclear Information System (INIS)

    Van Woesik, R

    2013-01-01

    Coral reefs are rapidly deteriorating globally. The contemporary management option favors managing for resilience to provide reefs with the capacity to tolerate human-induced disturbances. Yet resilience is most commonly defined as the capacity of a system to absorb disturbances without changing fundamental processes or functionality. Quantifying no change, or the uncertainty of a null hypothesis, is nonsensical using frequentist statistics, but is achievable using a Bayesian approach. This study outlines a practical Bayesian framework that quantifies the resilience of coral reefs using two inter-related models. The first model examines the functionality of coral reefs in the context of their reef-building capacity, whereas the second model examines the recovery rates of coral cover after disturbances. Quantifying intrinsic rates of increase in coral cover and habitat-specific, steady-state equilibria are useful proxies of resilience. A reduction in the intrinsic rate of increase following a disturbance, or the slowing of recovery over time, can be useful indicators of stress; a change in the steady-state equilibrium suggests a phase shift. Quantifying the uncertainty of key reef-building processes and recovery parameters, and comparing these parameters against benchmarks, facilitates the detection of loss of resilience and provides signals of imminent change. (letter)

  12. Quantifying uncertainty and resilience on coral reefs using a Bayesian approach

    Science.gov (United States)

    van Woesik, R.

    2013-12-01

    Coral reefs are rapidly deteriorating globally. The contemporary management option favors managing for resilience to provide reefs with the capacity to tolerate human-induced disturbances. Yet resilience is most commonly defined as the capacity of a system to absorb disturbances without changing fundamental processes or functionality. Quantifying no change, or the uncertainty of a null hypothesis, is nonsensical using frequentist statistics, but is achievable using a Bayesian approach. This study outlines a practical Bayesian framework that quantifies the resilience of coral reefs using two inter-related models. The first model examines the functionality of coral reefs in the context of their reef-building capacity, whereas the second model examines the recovery rates of coral cover after disturbances. Quantifying intrinsic rates of increase in coral cover and habitat-specific, steady-state equilibria are useful proxies of resilience. A reduction in the intrinsic rate of increase following a disturbance, or the slowing of recovery over time, can be useful indicators of stress; a change in the steady-state equilibrium suggests a phase shift. Quantifying the uncertainty of key reef-building processes and recovery parameters, and comparing these parameters against benchmarks, facilitates the detection of loss of resilience and provides signals of imminent change.
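    The recovery model the two records above describe (an intrinsic rate of increase together with a habitat-specific steady-state equilibrium) is essentially logistic growth in coral cover. A minimal grid-based Bayesian fit is sketched below; this is not the authors' implementation, and all data and parameter values are invented for illustration:

```python
import numpy as np

# Logistic recovery of coral cover after a disturbance:
#   C(t) = K / (1 + ((K - C0) / C0) * exp(-r t)),
# where r is the intrinsic rate of increase and K the steady-state equilibrium.
def logistic(t, r, K, C0):
    return K / (1.0 + ((K - C0) / C0) * np.exp(-r * t))

rng = np.random.default_rng(1)
t = np.arange(0, 15)                         # years since the disturbance
true_r, true_K, C0, sigma = 0.5, 40.0, 5.0, 1.5
obs = logistic(t, true_r, true_K, C0) + rng.normal(0, sigma, t.size)  # synthetic surveys

# Grid-based posterior over (r, K): flat priors, Gaussian likelihood.
r_grid = np.linspace(0.05, 1.5, 120)
K_grid = np.linspace(10, 70, 120)
R, K = np.meshgrid(r_grid, K_grid, indexing="ij")
pred = logistic(t[None, None, :], R[..., None], K[..., None], C0)
log_like = -0.5 * np.sum((obs - pred) ** 2, axis=-1) / sigma**2
post = np.exp(log_like - log_like.max())
post /= post.sum()

r_mean = np.sum(post * R)   # posterior mean intrinsic rate of increase
K_mean = np.sum(post * K)   # posterior mean steady-state equilibrium
print(f"posterior mean r = {r_mean:.2f}, K = {K_mean:.2f}")
```

    The full posterior (not just the means) is what makes the comparison against benchmarks possible: a posterior for r shifted toward zero after a disturbance, or a shifted posterior for K, is the kind of signal of lost resilience or phase shift the abstract refers to.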

  13. Thermosensory reversal effect quantified

    NARCIS (Netherlands)

    Bergmann Tiest, W.M.; Kappers, A.M.L.

    2008-01-01

    At room temperature, some materials feel colder than others due to differences in thermal conductivity, heat capacity and geometry. When the ambient temperature is well above skin temperature, the roles of 'cold' and 'warm' materials are reversed. In this paper, this effect is quantified by

  14. Thermosensory reversal effect quantified

    NARCIS (Netherlands)

    Bergmann Tiest, W.M.; Kappers, A.M.L.

    2008-01-01

    At room temperature, some materials feel colder than others due to differences in thermal conductivity, heat capacity and geometry. When the ambient temperature is well above skin temperature, the roles of ‘cold’ and ‘warm’ materials are reversed. In this paper, this effect is quantified by

  15. Quantifying requirements volatility effects

    NARCIS (Netherlands)

    Kulk, G.P.; Verhoef, C.

    2008-01-01

    In an organization operating in the bancassurance sector we identified a low-risk IT subportfolio of 84 IT projects comprising together 16,500 function points, each project varying in size and duration, for which we were able to quantify its requirements volatility. This representative portfolio

  16. The quantified relationship

    NARCIS (Netherlands)

    Danaher, J.; Nyholm, S.R.; Earp, B.

    2018-01-01

    The growth of self-tracking and personal surveillance has given rise to the Quantified Self movement. Members of this movement seek to enhance their personal well-being, productivity, and self-actualization through the tracking and gamification of personal data. The technologies that make this

  17. Quantifying IT estimation risks

    NARCIS (Netherlands)

    Kulk, G.P.; Peters, R.J.; Verhoef, C.

    2009-01-01

    A statistical method is proposed for quantifying the impact of factors that influence the quality of the estimation of costs for IT-enabled business projects. We call these factors risk drivers as they influence the risk of the misestimation of project costs. The method can effortlessly be

  18. Quantifying light pollution

    International Nuclear Information System (INIS)

    Cinzano, P.; Falchi, F.

    2014-01-01

    In this paper we review new available indicators useful to quantify and monitor light pollution, defined as the alteration of the natural quantity of light in the night environment due to introduction of manmade light. With the introduction of recent radiative transfer methods for the computation of light pollution propagation, several new indicators become available. These indicators represent a primary step in light pollution quantification, beyond the bare evaluation of the night sky brightness, which is an observational effect integrated along the line of sight and thus lacking the three-dimensional information. - Highlights: • We review new available indicators useful to quantify and monitor light pollution. • These indicators are a primary step in light pollution quantification. • These indicators allow to improve light pollution mapping from a 2D to a 3D grid. • These indicators allow carrying out a tomography of light pollution. • We show an application of this technique to an Italian region

  19. An architecture for fault tolerant controllers

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Stoustrup, Jakob

    2005-01-01

    degradation in the sense of guaranteed degraded performance. A number of fault diagnosis problems, fault tolerant control problems, and feedback control with fault rejection problems are formulated/considered, mainly from a fault modeling point of view. The method is illustrated on a servo example including......A general architecture for fault tolerant control is proposed. The architecture is based on the (primary) YJBK parameterization of all stabilizing compensators and uses the dual YJBK parameterization to quantify the performance of the fault tolerant system. The approach suggested can be applied...

  20. Repressive Tolerance

    DEFF Research Database (Denmark)

    Pedersen, Morten Jarlbæk

    2017-01-01

    Consultation of organised interests and others when drafting laws is often seen as an important source of both input and output legitimacy. But whereas the input side of the equation stems from the very process of listening to societal actors, output legitimacy can only be strengthened if consult......Consultation of organised interests and others when drafting laws is often seen as an important source of both input and output legitimacy. But whereas the input side of the equation stems from the very process of listening to societal actors, output legitimacy can only be strengthened...... a substantial effect on the substance of laws – shows that there is a great difference in the amenability of different branches of government but that, in general, authorities do not listen much despite a very strong consultation institution and tradition. A suggestion for an explanation could be pointing...... to an administrative culture of repressive tolerance of organised interests: authorities listen but only reacts in a very limited sense. This bears in it the risk of jeopardising the knowledge transfer from societal actors to administrative ditto thus harming the consultation institutions’ potential for strengthening...

  1. Global, regional, and national disability-adjusted life years (DALYs) for 306 diseases and injuries and healthy life expectancy (HALE) for 188 countries, 1990-2013 : quantifying the epidemiological transition

    NARCIS (Netherlands)

    Murray, Christopher J. L.; Barber, Ryan M.; Foreman, Kyle J.; Ozgoren, Ayse Abbasoglu; Abd-Allah, Foad; Abera, Semaw F.; Aboyans, Victor; Abraham, Jerry P.; Abubakar, Ibrahim; Abu-Raddad, Laith J.; Abu-Rmeileh, Niveen M.; Achoki, Tom; Ackerman, Ilana N.; Ademi, Zanfina; Adou, Arsene K.; Adsuar, Jose C.; Afshin, Ashkan; Agardh, Emilie E.; Alam, Sayed Saidul; Alasfoor, Deena; Albittar, Mohammed I.; Alegretti, Miguel A.; Alemu, Zewdie A.; Alfonso-Cristancho, Rafael; Alhabib, Samia; Ali, Raghib; Alla, Francois; Allebeck, Peter; Almazroa, Mohammad A.; Alsharif, Ubai; Alvarez, Elena; Alvis-Guzman, Nelson; Amare, Azmeraw T.; Ameh, Emmanuel A.; Amini, Heresh; Ammar, Walid; Anderson, H. Ross; Anderson, Benjamin O.; Antonio, Carl Abelardo T.; Anwari, Palwasha; Arnlov, Johan; Arsenijevic, Valentina S. Arsic; Artaman, Al; Asghar, Rana J.; Assadi, Reza; Atkins, Lydia S.; Avila, Marco A.; Awuah, Baffour; Hoek, Hans W.; Singh, Abhishek

    2015-01-01

    Background The Global Burden of Disease Study 2013 (GBD 2013) aims to bring together all available epidemiological data using a coherent measurement framework, standardised estimation methods, and transparent data sources to enable comparisons of health loss over time and across causes, age-sex

  2. Global, regional, and national disability-adjusted life years (DALYs) for 306 diseases and injuries and healthy life expectancy (HALE) for 188 countries, 1990–2013: quantifying the epidemiological transition

    NARCIS (Netherlands)

    Murray, C.J.; Barber, R.M.; Foreman, K.J.; Geleijnse, J.M.

    2015-01-01

    Background The Global Burden of Disease Study 2013 (GBD 2013) aims to bring together all available epidemiological data using a coherent measurement framework, standardised estimation methods, and transparent data sources to enable comparisons of health loss over time and across causes, age–sex

  3. Quantifying linguistic coordination

    DEFF Research Database (Denmark)

    Fusaroli, Riccardo; Tylén, Kristian

    task (Bahrami et al 2010, Fusaroli et al. 2012) we extend to linguistic coordination dynamical measures of recurrence employed in the analysis of sensorimotor coordination (such as heart-rate (Konvalinka et al 2011), postural sway (Shockley 2005) and eye-movements (Dale, Richardson and Kirkham 2012......). We employ nominal recurrence analysis (Orsucci et al 2005, Dale et al 2011) on the decision-making conversations between the participants. We report strong correlations between various indexes of recurrence and collective performance. We argue this method allows us to quantify the qualities...
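    Nominal recurrence analysis, as referenced above, extends recurrence measures to categorical data such as coded conversational turns. The following toy sketch computes a cross-recurrence rate for two symbol sequences; it is a simplified illustration, not the cited authors' code, and the speaker sequences are invented:

```python
import numpy as np

def nominal_recurrence_rate(seq_a, seq_b):
    """Fraction of (i, j) index pairs where the two nominal sequences match.

    Builds the cross-recurrence matrix R[i, j] = 1 iff seq_a[i] == seq_b[j]
    and returns its mean, a minimal recurrence index for categorical data.
    """
    a = np.asarray(seq_a)
    b = np.asarray(seq_b)
    R = a[:, None] == b[None, :]
    return R.mean()

# Toy example: two speakers' coded choices in a joint decision task.
speaker1 = ["left", "left", "up", "down", "left"]
speaker2 = ["left", "up", "up", "down", "right"]
print(nominal_recurrence_rate(speaker1, speaker2))  # 6 matches out of 25 pairs
```

    Richer indexes (diagonal-line structure, determinism) are built on the same matrix; correlating such indexes with task performance is the analysis strategy the abstract describes.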

  4. Tolerance of Snakes to Hypergravity

    Science.gov (United States)

    Lillywhite, H. B.; Ballard, R. E.; Hargens, A. R.

    1994-01-01

    Sensitivity of carotid blood flow to +Gz (head-to-tail) acceleration was studied in six species of snakes hypothesized to show varied adaptive cardiovascular responses to gravity. Blood flow in the proximal carotid artery was measured in 15 snakes before, during, and following stepwise increments of +0.25 Gz force produced on a 2.4 m diameter centrifuge. During centrifugation each snake was confined to a straight position within an individually fitted acrylic tube with the head facing the center of rotation. We measured the centrifugal force at the tail of the snake in order to quantify the maximum intensity of the force gradient promoting antero-posterior pooling of blood. Tolerance to increased gravity was quantified as the acceleration force at which carotid blood flow ceased. This parameter varied according to the gravitational adaptation of species defined by their ecology and behavior. At the extremes, carotid blood flow decreased in response to increasing gravity and approached zero near +1 Gz in aquatic and ground-dwelling species, whereas in climbing species carotid flow was maintained at forces in excess of +2 Gz. Surprisingly, tolerant (arboreal) species withstood hypergravic forces of +2 to +3 Gz for periods up to 1 h without cessation of carotid blood flow or apparent loss of consciousness. Data suggest that the relatively tight skin of the tolerant species provides a natural antigravity suit which is of prime importance in counteracting Gz stress on blood circulation.

  5. Fault tolerant controllers for sampled-data systems

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Stoustrup, Jakob

    2004-01-01

    A general compensator architecture for fault tolerant control (FTC) for sampled-data systems is proposed. The architecture is based on the YJBK parameterization of all stabilizing controllers, and uses the dual YJBK parameterization to quantify the performance of the fault tolerant system. The FTC...

  6. Induced tolerance from a sublethal insecticide leads to cross-tolerance to other insecticides.

    Science.gov (United States)

    Hua, Jessica; Jones, Devin K; Relyea, Rick A

    2014-04-01

    As global pesticide use increases, the ability to rapidly respond to pesticides by increasing tolerance has important implications for the persistence of nontarget organisms. A recent study of larval amphibians discovered that increased tolerance can be induced by an early exposure to low concentrations of a pesticide. Since natural systems are often exposed to a variety of pesticides that vary in mode of action, we need to know whether the induction of increased tolerance to one pesticide confers increased tolerance to other pesticides. Using larval wood frogs (Lithobates sylvaticus), we investigated whether induction of increased tolerance to the insecticide carbaryl (an AChE inhibitor) can induce increased tolerance to other insecticides that have the same mode of action (chlorpyrifos, malathion) or a different mode of action (the Na(+) channel-interfering insecticides permethrin and cypermethrin). We found that embryonic exposure to sublethal concentrations of carbaryl induced higher tolerance to carbaryl and increased cross-tolerance to malathion and cypermethrin but not to chlorpyrifos or permethrin. In one case, the embryonic exposure to carbaryl induced tolerance in a nonlinear pattern (hormesis). These results demonstrate that the newly discovered phenomenon of induced tolerance also provides induced cross-tolerance that is not restricted to pesticides with the same mode of action.

  7. Quantifying the Adaptive Cycle.

    Directory of Open Access Journals (Sweden)

    David G Angeler

    Full Text Available The adaptive cycle was proposed as a conceptual model to portray patterns of change in complex systems. Despite the model having potential for elucidating change across systems, it has been used mainly as a metaphor, describing system dynamics qualitatively. We use a quantitative approach for testing premises (reorganisation, conservatism, adaptation) in the adaptive cycle, using Baltic Sea phytoplankton communities as an example of such complex system dynamics. Phytoplankton organizes in recurring spring and summer blooms, a well-established paradigm in planktology and succession theory, with characteristic temporal trajectories during blooms that may be consistent with adaptive cycle phases. We used long-term (1994-2011) data and multivariate analysis of community structure to assess key components of the adaptive cycle. Specifically, we tested predictions about reorganisation (spring and summer blooms comprise distinct community states), conservatism (community trajectories during individual adaptive cycles are conservative), and adaptation (phytoplankton species during blooms change in the long term). All predictions were supported by our analyses. Results suggest that traditional ecological paradigms such as phytoplankton successional models have potential for moving the adaptive cycle from a metaphor to a framework that can improve our understanding of how complex systems organize and reorganize following collapse. Quantifying reorganization, conservatism, and adaptation provides opportunities to cope with the intricacies and uncertainties associated with fast ecological change driven by shifting system controls. Ultimately, combining traditional ecological paradigms with heuristics of complex system dynamics using quantitative approaches may help refine ecological theory and improve our understanding of the resilience of ecosystems.

  8. Quantifying loopy network architectures.

    Directory of Open Access Journals (Sweden)

    Eleni Katifori

    Full Text Available Biology presents many examples of planar distribution and structural networks having dense sets of closed loops. An archetype of this form of network organization is the vasculature of dicotyledonous leaves, which showcases a hierarchically-nested architecture containing closed loops at many different levels. Although a number of approaches have been proposed to measure aspects of the structure of such networks, a robust metric to quantify their hierarchical organization is still lacking. We present an algorithmic framework, the hierarchical loop decomposition, that allows mapping loopy networks to binary trees, preserving in the connectivity of the trees the architecture of the original graph. We apply this framework to investigate computer-generated graphs, such as artificial models and optimal distribution networks, as well as natural graphs extracted from digitized images of dicotyledonous leaves and the vasculature of rat cerebral neocortex. We calculate various metrics based on the asymmetry, the cumulative size distribution, and the Strahler bifurcation ratios of the corresponding trees, and discuss the relationship of these quantities to the architectural organization of the original graphs. This algorithmic framework decouples the geometric information (exact location of edges and nodes) from the metric topology (connectivity and edge weight), and it ultimately allows us to perform a quantitative statistical comparison between predictions of theoretical models and naturally occurring loopy graphs.
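    Of the tree metrics mentioned in the abstract, the Strahler order (the basis of the bifurcation ratios) is easy to compute once the loopy network has been mapped to a binary tree. The sketch below is illustrative only; the nested-tuple encoding of the tree is an assumption, not the paper's data structure:

```python
def strahler(node):
    """Strahler number of a binary tree.

    Leaves have order 1; an internal node whose children have orders (a, b)
    has order max(a, b) if a != b, and a + 1 if a == b.  Trees are encoded
    as nested 2-tuples (left, right), with None marking a leaf.
    """
    if node is None:  # leaf
        return 1
    left, right = node
    a, b = strahler(left), strahler(right)
    return max(a, b) if a != b else a + 1

# A perfectly balanced tree of depth 2 has Strahler order 3 ...
perfect = ((None, None), (None, None))
# ... while a maximally asymmetric ("caterpillar") tree of the same size has order 2.
caterpillar = (((None, None), None), None)

print(strahler(perfect), strahler(caterpillar))  # prints "3 2"
```

    Balanced (hierarchically nested) trees reach high Strahler orders quickly while asymmetric ones do not, which is why order statistics of the decomposition trees can discriminate the architectures of the original loopy graphs.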

  9. Teaching Tolerance? Associational Diversity and Tolerance Formation

    DEFF Research Database (Denmark)

    Rapp, Carolin; Freitag, Markus

    2015-01-01

    , a closer look is taken at how associational diversity relates to the formation of tolerance and the importance of associations as schools of tolerance are evaluated. The main theoretical argument follows contact theory, wherein regular and enduring contact in diverse settings reduces prejudice and thereby...

  10. Lactose tolerance tests

    Science.gov (United States)

    Hydrogen breath test for lactose tolerance ... Two common methods include the lactose tolerance blood test and the hydrogen breath test. The hydrogen breath test is the preferred method. It measures the amount of hydrogen ...

  11. Quantifying Land and People Exposed to Sea-Level Rise with No Mitigation and 1.5°C and 2.0°C Rise in Global Temperatures to Year 2300

    Science.gov (United States)

    Brown, S.; Nicholls, R. J.; Goodwin, P.; Haigh, I. D.; Lincke, D.; Vafeidis, A. T.; Hinkel, J.

    2018-03-01

    We use multiple synthetic mitigation sea-level scenarios, together with a non-mitigation sea-level scenario from the Warming Acidification and Sea-level Projector model. We find sea-level rise (SLR) continues to accelerate post-2100 for all but the most aggressive mitigation scenarios, indicative of 1.5°C and 2.0°C. Using the Dynamic Interactive Vulnerability Assessment modeling framework, we project land and population exposed in the 1-in-100-year coastal flood plain under SLR and population change. In 2000, the flood plain is estimated at 540 × 10³ km². By 2100, under the mitigation scenarios, it ranges between 610 × 10³ and 640 × 10³ km² (580 × 10³ and 700 × 10³ km² for the 5th and 95th percentiles); thus differences between the mitigation scenarios are small in 2100. However, by 2300, flood plains are projected to increase to between 700 × 10³ and 960 × 10³ km² (610 × 10³ and 1290 × 10³ km²) for the mitigation scenarios, but to 1630 × 10³ km² (1190 × 10³ and 2220 × 10³ km²) for the non-mitigation scenario. The proportion of global population exposed to SLR in 2300 is projected to be between 1.5% and 5.4% (1.2%-7.6%) (assuming no population growth after 2100) for the aggressive mitigation and the non-mitigation scenarios, respectively. Hence, over centennial timescales there are significant benefits to climate change mitigation and temperature stabilization. However, sea levels will continue to rise, albeit at lower rates, so potential impacts will keep increasing, necessitating adaptation of existing coastal infrastructure and the careful planning of new coastal developments.

  12. The Fallacy of Quantifying Risk

    Science.gov (United States)

    2012-09-01

    Defense AT&L: September–October 2012, p. 18. The Fallacy of Quantifying Risk. David E. Frick, Ph.D. Frick is a 35-year veteran of the Department of... a key to risk analysis was "choosing the right technique" of quantifying risk. The weakness in this argument stems not from the assertion that one... of information about the enemy), yet achieving great outcomes. Attempts at quantifying risk are not, in and of themselves, objectionable. Prudence

  13. Recognition and Toleration

    DEFF Research Database (Denmark)

    Lægaard, Sune

    2010-01-01

    Recognition and toleration are ways of relating to the diversity characteristic of multicultural societies. The article concerns the possible meanings of toleration and recognition, and the conflict that is often claimed to exist between these two approaches to diversity. Different forms...... or interpretations of recognition and toleration are considered, confusing and problematic uses of the terms are noted, and the compatibility of toleration and recognition is discussed. The article argues that there is a range of legitimate and importantly different conceptions of both toleration and recognition...

  14. Fault Tolerant Feedback Control

    DEFF Research Database (Denmark)

    Stoustrup, Jakob; Niemann, H.

    2001-01-01

    An architecture for fault tolerant feedback controllers based on the Youla parameterization is suggested. It is shown that the Youla parameterization will give a residual vector directly in connection with the fault diagnosis part of the fault tolerant feedback controller. It turns out...... that there is a separation be-tween the feedback controller and the fault tolerant part. The closed loop feedback properties are handled by the nominal feedback controller and the fault tolerant part is handled by the design of the Youla parameter. The design of the fault tolerant part will not affect the design...... of the nominal feedback con-troller....

  15. Mechanical tolerance stackup and analysis

    CERN Document Server

    Fischer, Bryan R

    2004-01-01

    Background; Dimensioning and Tolerancing; Tolerance Format and Decimal Places; Converting Plus/Minus Dimensions and Tolerances into Equal Bilaterally Toleranced Dimensions; Variation and Sources of Variation; Tolerance Analysis; Worst-case Tolerance Stackups; Statistical Tolerance Stackups; Geometric Dimensioning and Tolerancing (GD&T); Converting Plus/Minus Tolerancing to Positional Tolerancing and Projected Tolerance Zones; Diametral and Radial Tolerance Stackups; Specifying Material Condition Modifiers and Their Effect on Tolerance Stackups; The Tolerance Stackup Sketch; The Tolerance Stackup Report Form; Tolerance S

  16. Multidominance, ellipsis, and quantifier scope

    NARCIS (Netherlands)

    Temmerman, Tanja Maria Hugo

    2012-01-01

    This dissertation provides a novel perspective on the interaction between quantifier scope and ellipsis. It presents a detailed investigation of the scopal interaction between English negative indefinites, modals, and quantified phrases in ellipsis. One of the crucial observations is that a negative

  17. Accident tolerant fuel analysis

    International Nuclear Information System (INIS)

    2014-01-01

    Safety is central to the design, licensing, operation, and economics of Nuclear Power Plants (NPPs). Consequently, the ability to better characterize and quantify safety margin holds the key to improved decision making about light water reactor design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margins management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway research and development (R&D) is to support plant decisions for risk-informed margins management by improving economics and reliability, and sustaining safety, of current NPPs. Goals of the RISMC Pathway are twofold: (1) Develop and demonstrate a risk-assessment method coupled to safety margin quantification that can be used by NPP decision makers as part of their margin recovery strategies. (2) Create an advanced "RISMC toolkit" that enables more accurate representation of NPP safety margin. In order to carry out the R&D needed for the Pathway, the Idaho National Laboratory is performing a series of case studies that will explore methods- and tools-development issues, in addition to being of current interest in their own right. One such study is a comparative analysis of safety margins of plants using different fuel cladding types: specifically, a comparison between current-technology Zircaloy cladding and a notional "accident-tolerant" (e.g., SiC-based) cladding. The present report begins the process of applying capabilities that are still under development to the problem of assessing new fuel designs. The approach and lessons learned from this case study will be included in future Technical Basis Guides produced by the RISMC Pathway. These guides will be the mechanism for developing the specifications for RISMC tools and for defining how plant

  18. Ammonium and nitrate tolerance in lichens

    Energy Technology Data Exchange (ETDEWEB)

    Hauck, Markus, E-mail: mhauck@gwdg.d [Department of Plant Ecology, Albrecht von Haller Institute of Plant Sciences, University of Goettingen, Untere Karspuele 2, 37073 Goettingen (Germany)

    2010-05-15

    Since lichens lack roots and take up water, solutes and gases over the entire thallus surface, these organisms respond more sensitively to changes in atmospheric purity than vascular plants. After a long period in which the effects of sulphur dioxide and acidity were the focus of research on atmospheric chemistry and lichens, the globally increased levels of ammonia and nitrate have increasingly affected lichen vegetation and have given rise to intense research on the tolerance of lichens to nitrogen pollution. The present paper discusses the main findings on the uptake of ammonia and nitrate in the lichen symbiosis and on the tolerance of lichens to eutrophication. Ammonia and nitrate are both efficiently taken up under ambient conditions. The tolerance to high nitrogen levels depends, among other factors, on the capability of the photobiont to provide sufficient amounts of carbon skeletons for ammonia assimilation. Lichens with low productivity are apparently predisposed to be sensitive to excess nitrogen. - Eutrophication has become a global threat to lichen diversity.

  19. 77 FR 12207 - Pyroxasulfone; Pesticide Tolerances

    Science.gov (United States)

    2012-02-29

    ...: Chief, Analytical Chemistry Branch, Environmental Science Center, 701 Mapes Rd., Ft. Meade, MD 20755.... The submitted data for wheat were collected from field trials conducted in Australia and, therefore... global review partners, Australia and Canada, U.S. tolerances for corn grain commodities will be enforced...

  20. Safety aspects of genetically modified crops with abiotic stress tolerance

    NARCIS (Netherlands)

    Liang, C.; Prins, T.W.; Wiel, van de C.C.M.; Kok, E.J.

    2014-01-01

    Abiotic stress, such as drought, salinity, and temperature extremes, significantly reduce crop yields. Hence, development of abiotic stress-tolerant crops by modern biotechnology may contribute to global food security. Prior to introducing genetically modified crops with abiotic stress tolerance to

  1. Air pollution tolerance indices of some plants around Ama industrial ...

    African Journals Online (AJOL)

    Air pollution tolerance indices of some plants around Ama industrial complex in ... The total chlorophyll, ascorbic acid, pH, and relative water content of the leaf ... which contribute to the greenhouse effect, global warming and climate change.

  2. Toleration out of respect?

    DEFF Research Database (Denmark)

    Lægaard, Sune

    2013-01-01

    Under conditions of pluralism different cultures, interests or values can come into conflict, which raises the problem of how to secure peaceful co-existence. The idea of toleration historically emerged as an answer to this problem. Recently Rainer Forst has argued that toleration should not just be based on a modus vivendi designed to secure peaceful co-existence, but should be based on moral reasons. Forst therefore advances what he calls the ‘respect conception’ of toleration as an in itself morally desirable type of relationship, which is furthermore the only conception of toleration that avoids various so-called ‘paradoxes of toleration’. The paper first examines whether Forst’s respect conception can be applied descriptively to distinguish between actual patterns of behaviour and classify different acts of toleration. Then the focus is shifted to toleration out of respect as a normative...

  3. Tolerance in Drosophila

    OpenAIRE

    Atkinson, Nigel S.

    2009-01-01

    The set of genes that underlie ethanol tolerance (inducible resistance) are likely to overlap with the set of genes responsible for ethanol addiction. Whereas addiction is difficult to recognize in simple model systems, behavioral tolerance is readily identifiable and can be induced in large populations of animals. Thus, tolerance lends itself to analysis in model systems with powerful genetics. Drosophila melanogaster has been used by a variety of laboratories for the identification of genes...

  4. Quantifiers in Russian Sign Language

    NARCIS (Netherlands)

    Kimmelman, V.; Paperno, D.; Keenan, E.L.

    2017-01-01

    After presenting some basic genetic, historical and typological information about Russian Sign Language, this chapter outlines the quantification patterns it expresses. It illustrates various semantic types of quantifiers, such as generalized existential, generalized universal, proportional,

  5. Quantified Self in de huisartsenpraktijk [Quantified Self in general practice]

    NARCIS (Netherlands)

    de Groot, Martijn; Timmers, Bart; Kooiman, Thea; van Ittersum, Miriam

    2015-01-01

    Quantified Self stands for the self-measuring person. The number of people entering the care process with self-generated health data will grow in the coming years. Various kinds of activity trackers and health applications for the smartphone make it relatively easy to ... personal

  6. Quantifying uncertainty in estimation of global arthropod species richness

    Czech Academy of Sciences Publication Activity Database

    Hamilton, J. A.; Basset, Y.; Benke, K. K.; Grimbacher, P. S.; Miller, S. E.; Novotný, Vojtěch; Samuelson, G. A.; Stork, N. E.; Weiblen, G. D.; Yen, J. D. L.

    2010-01-01

    Vol. 176, No. 1 (2010), pp. 90-95. ISSN 0003-0147. Institutional research plan: CEZ:AV0Z50070508. Keywords: Coleoptera * host specificity * Latin hypercube sampling. Subject RIV: EH - Ecology, Behaviour. Impact factor: 4.736, year: 2010
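
The keywords above point to Latin hypercube sampling as the route for propagating parameter uncertainty into a global species-richness estimate. A minimal sketch of that idea in Python; the richness model and all parameter ranges below are hypothetical illustrations, not the paper's values:

```python
import random

def latin_hypercube(n, dims, rng=random.Random(0)):
    """Return n points in [0, 1)^dims with one point per stratum in each dimension."""
    cols = []
    for _ in range(dims):
        # One draw from each of the n equal-width strata, then shuffle the column.
        col = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(col)
        cols.append(col)
    return [tuple(col[i] for col in cols) for i in range(n)]

def richness(u):
    """Toy richness model: map unit-cube coordinates onto illustrative ranges."""
    host_species = 30000 + u[0] * 30000   # tropical host-plant species
    beetles_per_host = 2 + u[1] * 8       # effective specialization
    beetle_fraction = 0.2 + u[2] * 0.2    # beetles as a share of all arthropods
    return host_species * beetles_per_host / beetle_fraction

samples = latin_hypercube(1000, 3)
estimates = sorted(richness(u) for u in samples)
lo, hi = estimates[25], estimates[974]    # rough 95% uncertainty interval
```

The stratification guarantees that each parameter's range is covered evenly even with modest sample sizes, which is the reason the technique is favoured over plain Monte Carlo for this kind of uncertainty propagation.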

  7. Compromise and Toleration

    DEFF Research Database (Denmark)

    Rostbøll, Christian F.

    Political compromise is akin to toleration, since both consist of an "agreement to disagree." Compromise and toleration also share a predicament of being regarded as ambiguous virtues that require of us to accept something we actually regard as wrong. However, we misunderstand the nature, justification, and limits of compromise if we see it merely as a matter of toleration. While toleration is mainly a matter of accepting citizens' equal right to co-existence as subjects to law, political compromise includes the parties in making law – it makes them co-authors of law. Toleration entails ... in compromise are more stringent than those for being tolerated. Still, the limits of compromise cannot be drawn too narrowly if it is to retain its value as a form of agreement that respects and embodies the differences of opinion in society.

  8. Tolerances in micro manufacturing

    DEFF Research Database (Denmark)

    Hansen, Hans Nørgaard; Zhang, Yang; Islam, Aminul

    This paper describes a method for analysis of tolerances in micro manufacturing. It proposes a mapping of tolerances to dimensions and compares this with currently available international standards. The analysis documents that tolerances are not scaled down in proportion to the absolute dimension. In practice...

  9. Fault tolerant computing systems

    International Nuclear Information System (INIS)

    Randell, B.

    1981-01-01

    Fault tolerance involves the provision of strategies for error detection, damage assessment, fault treatment and error recovery. A survey is given of the different sorts of strategies used in highly reliable computing systems, together with an outline of recent research on the problems of providing fault tolerance in parallel and distributed computing systems. (orig.)

  10. Toleration out of respect?

    DEFF Research Database (Denmark)

    Lægaard, Sune

    2014-01-01

    ...be based on a modus vivendi designed to secure peaceful co-existence, but should be based on moral reasons. Forst therefore advances what he calls the ‘respect conception’ of toleration as an in itself morally desirable type of relationship, which is furthermore the only conception of toleration...

  11. Recognition and Toleration

    DEFF Research Database (Denmark)

    Lægaard, Sune

    2010-01-01

    Recognition and toleration are ways of relating to the diversity characteristic of multicultural societies. The article concerns the possible meanings of toleration and recognition, and the conflict that is often claimed to exist between these two approaches to diversity. Different forms or inter...

  12. Remember Tolerance Differently

    DEFF Research Database (Denmark)

    Tønder, Lars

    2012-01-01

    This essay questions the linear conception of history which often accompanies the way contemporary democratic theory tends to disavow tolerance's discontinuities and remainders. In the spirit of Foucault's genealogy of descent, the idea is to develop a new sense of tolerance's history, not by invoking a critique external to contemporary democratic theory, but by witnessing the history of tolerance paraliptically, with an eye to what it obscures and yet presupposes.

  13. Quantifying the uncertainty in heritability.

    Science.gov (United States)

    Furlotte, Nicholas A; Heckerman, David; Lippert, Christoph

    2014-05-01

    The use of mixed models to determine narrow-sense heritability and related quantities such as SNP heritability has received much recent attention. Less attention has been paid to the inherent variability in these estimates. One approach for quantifying variability in estimates of heritability is a frequentist approach, in which heritability is estimated using maximum likelihood and its variance is quantified through an asymptotic normal approximation. An alternative approach is to quantify the uncertainty in heritability through its Bayesian posterior distribution. In this paper, we develop the latter approach, make it computationally efficient and compare it to the frequentist approach. We show theoretically that, for a sufficiently large sample size and intermediate values of heritability, the two approaches provide similar results. Using the Atherosclerosis Risk in Communities cohort, we show empirically that the two approaches can give different results and that the variance/uncertainty can remain large.

  14. Using multiple linear regression techniques to quantify carbon ...

    African Journals Online (AJOL)

    Fallow ecosystems provide a significant carbon stock that can be quantified for inclusion in the accounts of global carbon budgets. Process and statistical models of productivity, though useful, are often technically rigid as the conditions for their application are not easy to satisfy. Multiple regression techniques have been ...

  15. A Multirelational Account of Toleration

    DEFF Research Database (Denmark)

    Ferretti, Maria Paola; Lægaard, Sune

    2013-01-01

    Toleration classically denotes a relation between two agents that is characterised by three components: objection, power, and acceptance overriding the objection. Against recent claims that classical toleration is not applicable in liberal democracies and that toleration must therefore either be ...

  16. Benefits of tolerance in public goods games.

    Science.gov (United States)

    Szolnoki, Attila; Chen, Xiaojie

    2015-10-01

    Leaving the joint enterprise when defection is unveiled is always a viable option to avoid being exploited. Although loner strategy helps the population not to be trapped into the tragedy of the commons state, it could offer only a modest income for nonparticipants. In this paper we demonstrate that showing some tolerance toward defectors could not only save cooperation in harsh environments but in fact results in a surprisingly high average payoff for group members in public goods games. Phase diagrams and the underlying spatial patterns reveal the high complexity of evolving states where cyclic dominant strategies or two-strategy alliances can characterize the final state of evolution. We identify microscopic mechanisms which are responsible for the superiority of global solutions containing tolerant players. This phenomenon is robust and can be observed both in well-mixed and in structured populations highlighting the importance of tolerance in our everyday life.
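
The tolerance mechanism described above can be illustrated with a minimal payoff sketch for one public goods group. The strategy set (cooperator, defector, loner, tolerant) follows the standard loner extension of the game, but the parameter values, the inspection cost gamma, and the defector-fraction threshold below are illustrative assumptions rather than the paper's model:

```python
def group_payoffs(n_coop, n_def, n_tol, n_loner, r=3.0, c=1.0, sigma=1.0,
                  gamma=0.5, threshold=0.5):
    """Payoffs in one public goods group containing tolerant players.

    Tolerant players pay an inspection cost gamma and join as cooperators
    only if the fraction of defectors among the committed players is below
    `threshold`; otherwise they fall back on the loner payoff sigma.
    """
    committed = n_coop + n_def
    def_frac = n_def / committed if committed else 0.0
    tol_join = def_frac < threshold
    contributors = n_coop + (n_tol if tol_join else 0)
    participants = contributors + n_def
    # Each contributor pays c; the pot is multiplied by r and shared equally.
    share = r * c * contributors / participants if participants > 1 else 0.0
    return {
        "cooperator": share - c,
        "defector": share,
        "loner": sigma,
        "tolerant": (share - c - gamma) if tol_join else (sigma - gamma),
    }
```

With three cooperators, one defector and one tolerant player (r = 3, c = 1), the defector fraction is 0.25 < 0.5, so the tolerant player joins: the pot is 4 x 3 = 12 split among five participants, giving cooperators 2.4 - 1 = 1.4 and the tolerant player 1.4 - 0.5 = 0.9, still above the loner income of 1 minus nothing for the pure loner but above sigma - gamma for abstaining.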

  17. State, religion and toleration

    DEFF Research Database (Denmark)

    Huggler, Jørgen

    2009-01-01

    Contribution to Religion and State - From separation to cooperation? Legal-philosophical reflections for a de-secularized world (IVR Cracow Special Workshop). Eds. Bart C. Labuschagne & Ari M. Solon. Abstract: Toleration is indeed a complex phenomenon. A discussion of the concept will have to underline not only the broadmindedness and liberty of individuals or of groups, but also the relevant distinctions and arguments in political philosophy, epistemology, philosophy of religion and philosophical anthropology and their connection with educational issues. Through a discussion of these relations, the essay argues three theses: (1) Toleration is not reducible to an ethics of spiritual freedom. (2) Toleration is not neutral to fanaticism. (3) Toleration involves esteem for the person.

  18. Genetic Approaches to Develop Salt Tolerant Germplasm

    KAUST Repository

    Tester, Mark A.

    2015-08-19

    Forty percent of the world's food is produced under irrigation, and this is directly threatened by over-exploitation and changes in the global environment. One way to address this threat is to develop systems for increasing our ability to use lower quality water, in particular saline water. Low cost partial desalination of brackish water, use of saline water for cooling and increases in the salinity tolerance of crops can all contribute to the development of this new agricultural system. In this talk, the focus will be on the use of forward genetic approaches for discovery of genes related to salinity tolerance in barley and tomatoes. Rather than studying salinity tolerance as a trait in itself, we dissect salinity tolerance into a series of components that are hypothesised to contribute to overall salinity tolerance (following the paradigm of Munns & Tester, 2008). For example, one significant component of tolerance of most crop plants to moderate soil salinity is due to the ability to maintain low concentrations of Na+ in the leaves, and much analysis of this aspect has been done (e.g. Roy et al., 2013, 2014). A major site for the control of shoot Na+ accumulation is at the plasma membrane of the mature stele of the root. Alleles of HKT, a major gene underlying this transport process, have been characterized and, in work led by Dr Rana Munns (CSIRO), have now been introgressed into commercial durum wheat and led to significantly increased yields in saline field conditions (Munns et al., 2012). The genotyping of mapping populations is now highly efficient. However, the ability to quantitatively phenotype these populations is now commonly limiting forward progress in plant science. The increasing power of digital imaging and computational technologies offers the opportunity to relieve this phenotyping bottleneck. The Plant Accelerator is a 4500m2 growth facility that provides non-destructive phenotyping of large populations of plants (http

  19. A Theory of Tolerance

    OpenAIRE

    Corneo, Giacomo; Jeanne, Olivier

    2006-01-01

    We develop an economic theory of tolerance where styles of behaviour are invested with symbolic value. Value systems are endogenous and taught by parents to their children. In conjunction with actual behaviour, value systems determine the esteem enjoyed by individuals. Intolerant individuals have all symbolic value invested in a single style of behaviour, whereas tolerant people have diversified values. The proposed model identifies a link between the unpredictability of children's lifestyles...

  20. Quantifying and simulating human sensation

    DEFF Research Database (Denmark)

    Quantifying and simulating human sensation - relating science and technology of indoor climate research. Abstract: In his doctoral thesis from 1970 civil engineer Povl Ole Fanger proposed that the understanding of indoor climate should focus on the comfort of the individual rather than averaged ... this understanding of human sensation was adjusted to technology. I will look into the construction of the equipment, what it measures and the relationship between theory, equipment and tradition.

  1. Quantifying emissions from spontaneous combustion

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-09-01

    Spontaneous combustion can be a significant problem in the coal industry, not only due to the obvious safety hazard and the potential loss of valuable assets, but also with respect to the release of gaseous pollutants, especially CO2, from uncontrolled coal fires. This report reviews methodologies for measuring emissions from spontaneous combustion and discusses methods for quantifying, estimating and accounting for the purpose of preparing emission inventories.

  2. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    Science.gov (United States)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1987-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center for over two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach has been developed to quantify the effects of the random uncertainties. The results of this study indicate that only the variations in geometry have significant effects.

  3. Thermal tolerance ranges and climate variability : A comparison between bivalves from differing climates

    NARCIS (Netherlands)

    Compton, Tanya J.; Rijkenberg, Micha J. A.; Drent, Jan; Piersma, Theunis

    2007-01-01

    The climate variability hypothesis proposes that in variable temperate climates poikilothermic animals have wide thermal tolerance windows, whereas in constant tropical climates they have small thermal tolerance windows. In this study we quantified and compared the upper and lower lethal thermal

  4. Escaping the tolerance trap

    International Nuclear Information System (INIS)

    Hammoudeh, S.; Madan, V.

    1994-01-01

    In order to examine the implications of the weakening of OPEC's responsiveness in adjusting its production levels, this paper explicitly incorporates rigidity in the quantity adjustment mechanism, thereby extending previous research which assumed smooth quantity adjustments. The rigidity is manifested in a tolerance range for the discrepancy between the declared target price and that of the market. This environment gives rise to a 'tolerance trap' which impedes the convergence process and inevitably brings the market to a standstill before it reaches the targeted price and revenue objectives. OPEC's reaction to the standstill has important implications for the achievement of the target-based equilibrium and for the potential collapse of the market price. This paper examines OPEC's policy options in the tolerance trap and reveals that the optimal policy for breaking this impasse and moving closer to the equilibrium point is to reduce output gradually, not to flood the market. (Author)

  5. Quantifying Coral Reef Ecosystem Services

    Science.gov (United States)

    Coral reefs have been declining during the last four decades as a result of both local and global anthropogenic stresses. Numerous research efforts to elucidate the nature, causes, magnitude, and potential remedies for the decline have led to the widely held belief that the recov...

  6. CORAL REEFS. Genomic determinants of coral heat tolerance across latitudes.

    Science.gov (United States)

    Dixon, Groves B; Davies, Sarah W; Aglyamova, Galina A; Meyer, Eli; Bay, Line K; Matz, Mikhail V

    2015-06-26

    As global warming continues, reef-building corals could avoid local population declines through "genetic rescue" involving exchange of heat-tolerant genotypes across latitudes, but only if latitudinal variation in thermal tolerance is heritable. Here, we show an up-to-10-fold increase in odds of survival of coral larvae under heat stress when their parents come from a warmer lower-latitude location. Elevated thermal tolerance was associated with heritable differences in expression of oxidative, extracellular, transport, and mitochondrial functions that indicated a lack of prior stress. Moreover, two genomic regions strongly responded to selection for thermal tolerance in interlatitudinal crosses. These results demonstrate that variation in coral thermal tolerance across latitudes has a strong genetic basis and could serve as raw material for natural selection. Copyright © 2015, American Association for the Advancement of Science.
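
The "up-to-10-fold increase in odds of survival" reported above is an odds ratio. For readers unfamiliar with the measure, a minimal sketch with made-up survival counts (not the study's data):

```python
def odds_ratio(surv_a, n_a, surv_b, n_b):
    """Odds of survival in group A relative to group B."""
    odds_a = surv_a / (n_a - surv_a)   # survivors : non-survivors in A
    odds_b = surv_b / (n_b - surv_b)   # survivors : non-survivors in B
    return odds_a / odds_b

# Hypothetical counts: 80/100 larvae survive heat stress with a warm-origin
# parent vs 30/100 with a cool-origin parent.
ratio = odds_ratio(80, 100, 30, 100)   # (80/20) / (30/70) = 28/3, about 9.3
```

Note that an odds ratio of about 9 corresponds to a much smaller ratio of survival probabilities (0.8 vs 0.3 here), which is why "odds of survival" rather than "survival" is the operative phrase in the abstract.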

  7. Mechanisms of flood tolerance in wheat and rice

    DEFF Research Database (Denmark)

    Herzog, Max

    Most crops are sensitive to excess water, and consequently floods have detrimental effects on crop yields worldwide. In addition, global climate change is expected to regionally increase the number of floods within decades, urging for more flood-tolerant crop cultivars to be released. The aim of this thesis was to assess mechanisms conferring rice (Oryza sativa) and wheat (Triticum aestivum) flood tolerance, focusing on the role of leaf gas films during plant submergence. Reviewing the literature showed that wheat germplasm holds genetic variation towards waterlogging (soil flooding), and highlighted that the contrasting submergence tolerance could rather be governed by tolerance to radical oxygen species or contrasting metabolic responses (other than carbohydrate consumption) to ethylene accumulation. Manipulating leaf gas film presence affected wheat and rice submergence tolerance such as plant growth...

  8. Fault tolerant linear actuator

    Science.gov (United States)

    Tesar, Delbert

    2004-09-14

    In varying embodiments, the fault tolerant linear actuator of the present invention is a new and improved linear actuator with fault tolerance and positional control that may incorporate velocity summing, force summing, or a combination of the two. In one embodiment, the invention offers a velocity summing arrangement with a differential gear between two prime movers driving a cage, which then drives a linear spindle screw transmission. Other embodiments feature two prime movers driving separate linear spindle screw transmissions, one internal and one external, in a totally concentric and compact integrated module.

  9. Inequality, Tolerance, and Growth

    DEFF Research Database (Denmark)

    Bjørnskov, Christian

    This paper argues for the importance of individuals' tolerance of inequality for economic growth. By using the political ideology of governments as a measure of revealed tolerance of inequality, the paper shows that controlling for ideology improves the accuracy with which the effects of inequality are measured. Results show that inequality reduces growth but more so in societies where people perceive it as being relatively unfair. Further results indicate that legal quality and social trust are likely transmission channels for the effects of inequality.

  10. Inequality, Tolerance, and Growth

    DEFF Research Database (Denmark)

    Bjørnskov, Christian

    2004-01-01

    This paper argues for the importance of individuals' tolerance of inequality for economic growth. By using the political ideology of governments as a measure of revealed tolerance of inequality, the paper shows that controlling for ideology improves the accuracy with which the effects of inequality are measured. Results show that inequality reduces growth but more so in societies where people perceive it as being relatively unfair. Further results indicate that legal quality and social trust are likely transmission channels for the effects of inequality.

  11. Quantifying Quantum-Mechanical Processes.

    Science.gov (United States)

    Hsieh, Jen-Hsiang; Chen, Shih-Hsuan; Li, Che-Ming

    2017-10-19

    The act of describing how a physical process changes a system is the basis for understanding observed phenomena. For quantum-mechanical processes in particular, the effect of processes on quantum states profoundly advances our knowledge of the natural world, from understanding counter-intuitive concepts to the development of wholly quantum-mechanical technology. Here, we show that quantum-mechanical processes can be quantified using a generic classical-process model through which any classical strategies of mimicry can be ruled out. We demonstrate the success of this formalism using fundamental processes postulated in quantum mechanics, the dynamics of open quantum systems, quantum-information processing, the fusion of entangled photon pairs, and the energy transfer in a photosynthetic pigment-protein complex. Since our framework does not depend on any specifics of the states being processed, it reveals a new class of correlations in the hierarchy between entanglement and Einstein-Podolsky-Rosen steering and paves the way for the elaboration of a generic method for quantifying physical processes.

  12. The Quantified Self (QS) Movement and Some Emerging Opportunities for the Educational Technology Field

    Science.gov (United States)

    Lee, Victor R.

    2013-01-01

    The Quantified Self (QS) movement is a growing global effort to use new mobile and wearable technologies to automatically obtain personal data about everyday activities. The social and material infrastructure associated with the Quantified Self (QS) movement provides a number of ideas that educational technologists should consider incorporating…

  13. A compact clinical instrument for quantifying suppression.

    Science.gov (United States)

    Black, Joanne M; Thompson, Benjamin; Maehara, Goro; Hess, Robert F

    2011-02-01

    We describe a compact and convenient clinical apparatus for the measurement of suppression based on a previously reported laboratory-based approach. In addition, we report and validate a novel, rapid psychophysical method for measuring suppression using this apparatus, which makes the technique more applicable to clinical practice. By using a Z800 dual pro head-mounted display driven by a Mac laptop, we provide dichoptic stimulation. Global motion stimuli composed of arrays of moving dots are presented to each eye. One set of dots moves in a coherent direction (termed signal) whereas another set of dots moves in a random direction (termed noise). To quantify performance, we measure the signal/noise ratio corresponding to a direction-discrimination threshold. Suppression is quantified by assessing the extent to which it matters which eye sees the signal and which eye sees the noise. A space-saving, head-mounted display using current video technology offers an ideal solution for clinical practice. In addition, our optimized psychophysical method provided results that were in agreement with those produced using the original technique. We made measures of suppression on a group of nine adult amblyopic participants using this apparatus with both the original and new psychophysical paradigms. All participants had measurable suppression ranging from mild to severe. The two different psychophysical methods gave a strong correlation for the strength of suppression (rho = -0.83, p = 0.006). Combining the new apparatus and new psychophysical method creates a convenient and rapid technique for parametric measurement of interocular suppression. In addition, this apparatus constitutes the ideal platform for suppressors to combine information between their eyes in a similar way to binocularly normal people. This provides a convenient way for clinicians to implement the newly proposed binocular treatment of amblyopia that is based on antisuppression training.
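
The procedure described above quantifies suppression by how much the signal/noise threshold changes depending on which eye receives the signal dots. A minimal sketch of that logic using a toy deterministic observer and a 2-down/1-up staircase; the attenuation factor, the observer's internal threshold, and the staircase parameters are all assumptions for illustration, not the authors' method:

```python
def simulate_staircase(attenuation, internal_threshold=0.1, n_reversals=8):
    """2-down/1-up staircase on signal/noise coherence (0..1).

    Toy deterministic observer: a trial is correct whenever the effective
    signal (coherence * attenuation) reaches the observer's internal
    threshold. `attenuation` models interocular suppression of the signal.
    """
    coherence, step = 0.5, 0.1
    correct_run, reversals, last_dir = 0, [], 0
    while len(reversals) < n_reversals:
        correct = coherence * attenuation >= internal_threshold
        if correct:
            correct_run += 1
            if correct_run < 2:
                continue                  # need two in a row to step down
            correct_run, direction = 0, -1
        else:
            correct_run, direction = 0, +1
        if last_dir and direction != last_dir:
            reversals.append(coherence)   # record each change of direction
        last_dir = direction
        coherence = min(1.0, max(0.01, coherence + direction * step))
    # Average the reversal points, discarding the first two as warm-up.
    return sum(reversals[2:]) / len(reversals[2:])

thr_fellow = simulate_staircase(attenuation=1.0)   # signal shown to fellow eye
thr_ambly = simulate_staircase(attenuation=0.3)    # signal attenuated by suppression
suppression_index = thr_ambly / thr_fellow         # >1 indicates suppression
```

For these toy settings the fellow-eye threshold settles near 0.06 and the suppressed-eye threshold near 0.35, a suppression index of roughly 5.8; in the clinical instrument the thresholds come from real dichoptic trials rather than a simulated observer.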

  14. Toleration, Groups, and Multiculturalism

    DEFF Research Database (Denmark)

    Lægaard, Sune

    2014-01-01

    ...have the ability to interfere with the group’s activities, an object of dislike or disapproval, an agent enjoying non-interference or a moral patient. This means that 'toleration of groups' can mean quite different things depending on the exact meaning of 'group' in relation to each component...

  15. Fault Tolerant Control Systems

    DEFF Research Database (Denmark)

    Bøgh, S. A.

    This thesis considered the development of fault tolerant control systems. The focus was on the category of automated processes that do not necessarily comprise a high number of identical sensors and actuators to maintain safe operation, but still have a potential for improving immunity to component...

  16. Toleration and its enemies

    DEFF Research Database (Denmark)

    Jarvad, Ib Martin

    2010-01-01

    After a presentation of the development of freedom of expression in Danish constitutional law, to freedom of the press in European human rights law - the Jersild case- to a right to mock and ridicule other faiths in recent Danish practice, the essay of Locke on toleration is examined, its...

  17. A little toleration, please

    Science.gov (United States)

    McKnight, C.

    2000-01-01

    Value pluralism does not imply relativism or subjectivism about values. What it does is allow respect for an at least limited toleration of values with which one may profoundly disagree. Thus a doctor can respect the autonomy of a patient whose values he does not share. Key Words: Pluralism • multiculturalism • relativism • subjectivism • patient autonomy PMID:11129842

  18. Deconstructing tolerance with clobazam

    Science.gov (United States)

    Wechsler, Robert T.; Sankar, Raman; Montouris, Georgia D.; White, H. Steve; Cloyd, James C.; Kane, Mary Clare; Peng, Guangbin; Tworek, David M.; Shen, Vivienne; Isojarvi, Jouko

    2016-01-01

    Objective: To evaluate potential development of tolerance to adjunctive clobazam in patients with Lennox-Gastaut syndrome. Methods: Eligible patients enrolled in open-label extension study OV-1004, which continued until clobazam was commercially available in the United States or for a maximum of 2 years outside the United States. Enrolled patients started at 0.5 mg·kg−1·d−1 clobazam, not to exceed 40 mg/d. After 48 hours, dosages could be adjusted up to 2.0 mg·kg−1·d−1 (maximum 80 mg/d) on the basis of efficacy and tolerability. Post hoc analyses evaluated mean dosages and drop-seizure rates for the first 2 years of the open-label extension based on responder categories and baseline seizure quartiles in OV-1012. Individual patient listings were reviewed for dosage increases ≥40% and increasing seizure rates. Results: Data from 200 patients were included. For patients free of drop seizures, there was no notable change in dosage over 24 months. For responder groups still exhibiting drop seizures, dosages were increased. Weekly drop-seizure rates for 100% and ≥75% responders demonstrated a consistent response over time. Few patients had a dosage increase ≥40% associated with an increase in seizure rates. Conclusions: Two-year findings suggest that the majority of patients do not develop tolerance to the antiseizure actions of clobazam. Observed dosage increases may reflect best efforts to achieve seizure freedom. It is possible that the clinical development of tolerance to clobazam has been overstated. ClinicalTrials.gov identifier: NCT00518713 and NCT01160770. Classification of evidence: This study provides Class III evidence that the majority of patients do not develop tolerance to clobazam over 2 years of treatment. PMID:27683846

  19. Surface Hydrophobicity Causes SO2 Tolerance in Lichens

    Science.gov (United States)

    Hauck, Markus; Jürgens, Sascha-René; Brinkmann, Martin; Herminghaus, Stephan

    2008-01-01

Background and Aims The superhydrophobicity of the thallus surface in one of the most SO2-tolerant lichen species, Lecanora conizaeoides, suggests that surface hydrophobicity could be a general feature of lichen symbioses controlling their tolerance to SO2. The study described here tests this hypothesis. Methods Water droplets of the size of a raindrop were placed on the surface of air-dry thalli in 50 lichen species of known SO2 tolerance and contact angles were measured to quantify hydrophobicity. Key Results The wettability of lichen thalli ranges from strongly hydrophobic to strongly hydrophilic. SO2 tolerance of the studied lichen species increased with increasing hydrophobicity of the thallus surface. Extraction of extracellular lichen secondary metabolites with acetone reduced, but did not abolish the hydrophobicity of lichen thalli. Conclusions Surface hydrophobicity is the main factor controlling SO2 tolerance in lichens. It presumably originally evolved as an adaptation to wet habitats preventing the depression of net photosynthesis due to supersaturation of the thallus with water. Hydrophilicity of lichen thalli is an adaptation to dry or humid, but not directly rain-exposed habitats. The crucial role of surface hydrophobicity in SO2 tolerance also explains why many markedly SO2-tolerant species are additionally tolerant to other (chemically unrelated) toxic substances including heavy metals. PMID:18077467

  20. Quantifying Evaporation in a Permeable Pavement System

    Science.gov (United States)

    Studies quantifying evaporation from permeable pavement systems are limited to a few laboratory studies and one field application. This research quantifies evaporation for a larger-scale field application by measuring the water balance from lined permeable pavement sections. Th...

  1. Quantifying sound quality in loudspeaker reproduction

    NARCIS (Netherlands)

    Beerends, John G.; van Nieuwenhuizen, Kevin; van den Broek, E.L.

    2016-01-01

    We present PREQUEL: Perceptual Reproduction Quality Evaluation for Loudspeakers. Instead of quantifying the loudspeaker system itself, PREQUEL quantifies the overall loudspeakers' perceived sound quality by assessing their acoustic output using a set of music signals. This approach introduces a

  2. Selective labelling and eradication of antibiotic-tolerant bacterial populations in Pseudomonas aeruginosa biofilms

    DEFF Research Database (Denmark)

    Chua, Song Lin; Yam, Joey Kuok Hoong; Hao, Piliang

    2016-01-01

Drug resistance and tolerance greatly diminish the therapeutic potential of antibiotics against pathogens. Antibiotic tolerance by bacterial biofilms often leads to persistent infections, but its mechanisms are unclear. Here we use a proteomics approach, pulsed stable isotope labelling with amino acids (pulsed-SILAC), to quantify newly expressed proteins in colistin-tolerant subpopulations of Pseudomonas aeruginosa biofilms (colistin is a 'last-resort' antibiotic against multidrug-resistant Gram-negative pathogens). Migration is essential for the formation of colistin-tolerant biofilm...

  3. Quantifier Scope in Categorical Compositional Distributional Semantics

    Directory of Open Access Journals (Sweden)

    Mehrnoosh Sadrzadeh

    2016-08-01

In previous work with J. Hedges, we formalised a generalised quantifiers theory of natural language in categorical compositional distributional semantics with the help of bialgebras. In this paper, we show how quantifier scope ambiguity can be represented in that setting and how this representation can be generalised to branching quantifiers.

  4. Quantifying emission reduction contributions by emerging economics

    Energy Technology Data Exchange (ETDEWEB)

    Moltmann, Sara; Hagemann, Markus; Eisbrenner, Katja; Hoehne, Niklas [Ecofys GmbH, Koeln (Germany); Sterk, Wolfgang; Mersmann, Florian; Ott, Hermann E.; Watanabe, Rie [Wuppertal Institut (Germany)

    2011-04-15

Further action is needed that goes far beyond what has been agreed so far under the United Nations Framework Convention on Climate Change (UNFCCC) and the Kyoto Protocol to 'prevent dangerous anthropogenic interference with the climate system', the ultimate objective of the UNFCCC. It is beyond question that developed countries (Annex I countries) will have to take a leading role. They will have to commit to substantial emission reductions and financing commitments due to their historical responsibility and their financial capability. However, the stabilisation of the climate system will require global emissions to peak within the next decade and decline well below half of current levels by the middle of the century. It is hence a global issue and, thus, depends on the participation of as many countries as possible. This report provides a comparative analysis of greenhouse gas (GHG) emissions, including their national climate plans, of the major emitting developing countries Brazil, China, India, Mexico, South Africa and South Korea. It includes an overview of emissions and economic development and of existing national climate change strategies, uses a consistent methodology for estimating emission reduction potential and costs of mitigation options, provides an estimate of the reductions to be achieved through the national climate plans, and finally compares the results to the allocation of emission rights according to different global effort-sharing approaches. In addition, the report discusses possible nationally appropriate mitigation actions (NAMAs) the six countries could take based on the analysis of mitigation options. This report is an output of the project 'Proposals for quantifying emission reduction contributions by emerging economies' by Ecofys and the Wuppertal Institute for the Federal Environment Agency in Dessau. It builds upon earlier joint work 'Proposals for contributions of emerging economies to the climate

  5. Improving abiotic stress tolerance of quinoa

    DEFF Research Database (Denmark)

    Yang, Aizheng

Global food security faces the challenges of rapid population growth and shortage of water resources. Drought, heat waves and soil salinity are becoming more frequent and extreme due to climatic changes in many regions of the world, resulting in yield reduction of many crops. It is hypothesized that quinoa has the potential to grow under a range of abiotic stresses, tolerating levels regarded as stresses in other crop species. Therefore cultivation of quinoa (Chenopodium quinoa Willd.) could be an alternative option in such regions. Even though quinoa is more tolerant to abiotic stress than most other crops, its productivity declines under severe drought, high salt conditions and harsh climate conditions. Different management approaches including water-saving irrigation methods (such as deficit irrigation, DI and alternate root-zone drying irrigation, ARD), inoculating crop seeds with plant...

  6. Socially-Tolerable Discrimination

    OpenAIRE

    Amegashie, J. Atsu

    2008-01-01

    History is replete with overt discrimination on the basis of race, gender, age, citizenship, ethnicity, marital status, academic performance, health status, volume of market transactions, religion, sexual orientation, etc. However, these forms of discrimination are not equally tolerable. For example, discrimination based on immutable or prohibitively unalterable characteristics such as race, gender, or ethnicity is much less acceptable. Why? I develop a simple rent-seeking model of conflict w...

  7. Quantifying the vitamin D economy.

    Science.gov (United States)

    Heaney, Robert P; Armas, Laura A G

    2015-01-01

Vitamin D enters the body through multiple routes and in a variety of chemical forms. Utilization varies with input, demand, and genetics. Vitamin D and its metabolites are carried in the blood on a Gc protein that has three principal alleles with differing binding affinities and ethnic prevalences. Three major metabolites are produced, which act via two routes, endocrine and autocrine/paracrine, and in two compartments, extracellular and intracellular. Metabolic consumption is influenced by physiological controls, noxious stimuli, and tissue demand. When administered as a supplement, varying dosing schedules produce major differences in serum metabolite profiles. To understand vitamin D's role in human physiology, it is necessary both to identify the foregoing entities, mechanisms, and pathways and, specifically, to quantify them. This review was performed to delineate the principal entities and transitions involved in the vitamin D economy, summarize the status of present knowledge of the applicable rates and masses, draw inferences about functions that are implicit in these quantifications, and point out implications for the determination of adequacy. © The Author(s) 2014. Published by Oxford University Press on behalf of the International Life Sciences Institute. All rights reserved.

  8. Quantify the complexity of turbulence

    Science.gov (United States)

    Tao, Xingtian; Wu, Huixuan

    2017-11-01

Many researchers have used Reynolds stress, power spectrum and Shannon entropy to characterize a turbulent flow, but few of them have measured the complexity of turbulence. Yet as this study shows, conventional turbulence statistics and Shannon entropy have limits when quantifying the flow complexity. Thus, it is necessary to introduce new complexity measures, such as topology complexity and excess information, to describe turbulence. Our test flow is a classic turbulent cylinder wake at Reynolds number 8100. Along the stream-wise direction, the flow becomes more isotropic and the magnitudes of normal Reynolds stresses decrease monotonically. These seem to indicate the flow dynamics becomes simpler downstream. However, the Shannon entropy keeps increasing along the flow direction and the dynamics seems to be more complex, because the large-scale vortices cascade to small eddies, the flow is less correlated and more unpredictable. In fact, these two contradictory observations partially describe the complexity of a turbulent wake. Our measurements (up to 40 diameters downstream of the cylinder) show that the flow's degree-of-complexity actually increases at first and then becomes constant (or drops slightly) along the stream-wise direction. This work was supported by the University of Kansas General Research Fund.
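As a concrete illustration of the Shannon-entropy measure the abstract contrasts with Reynolds stresses, here is a minimal histogram-based estimator applied to an ordered and an unpredictable signal (an illustrative sketch, not the authors' code):

```python
import numpy as np

def shannon_entropy_bits(signal, bins=32):
    """Shannon entropy (in bits) of a signal's amplitude histogram."""
    counts, _ = np.histogram(signal, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                        # drop empty bins (0 * log 0 := 0)
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
t = np.linspace(0, 100, 100_000)
sine = np.sin(t)                        # highly ordered, predictable signal
noise = rng.uniform(-1, 1, t.size)      # uncorrelated, unpredictable signal

# The less predictable signal carries the higher entropy.
print(shannon_entropy_bits(sine), shannon_entropy_bits(noise))
```

The same contrast drives the abstract's observation: downstream of the cylinder the velocity signal becomes less correlated, so its entropy rises even as the Reynolds stresses fall.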

  9. Quantifying Cancer Risk from Radiation.

    Science.gov (United States)

    Keil, Alexander P; Richardson, David B

    2017-12-06

    Complex statistical models fitted to data from studies of atomic bomb survivors are used to estimate the human health effects of ionizing radiation exposures. We describe and illustrate an approach to estimate population risks from ionizing radiation exposure that relaxes many assumptions about radiation-related mortality. The approach draws on developments in methods for causal inference. The results offer a different way to quantify radiation's effects and show that conventional estimates of the population burden of excess cancer at high radiation doses are driven strongly by projecting outside the range of current data. Summary results obtained using the proposed approach are similar in magnitude to those obtained using conventional methods, although estimates of radiation-related excess cancers differ for many age, sex, and dose groups. At low doses relevant to typical exposures, the strength of evidence in data is surprisingly weak. Statements regarding human health effects at low doses rely strongly on the use of modeling assumptions. © 2017 Society for Risk Analysis.

  10. Quantifying China's regional economic complexity

    Science.gov (United States)

    Gao, Jian; Zhou, Tao

    2018-02-01

China has experienced an outstanding economic expansion during the past decades; however, literature on non-monetary metrics that reveal the status of China's regional economic development is still lacking. In this paper, we fill this gap by quantifying the economic complexity of China's provinces through analyzing 25 years' firm data. First, we estimate the regional economic complexity index (ECI), and show that the overall time evolution of provinces' ECI is relatively stable and slow. Then, after linking ECI to the economic development and the income inequality, we find that the explanatory power of ECI is positive for the former but negative for the latter. Next, we compare different measures of economic diversity and explore their relationships with monetary macroeconomic indicators. Results show that the ECI index and the non-linear iteration based Fitness index are comparable, and they both have stronger explanatory power than other benchmark measures. Further multivariate regressions suggest the robustness of our results after controlling for other socioeconomic factors. Our work moves forward a step towards better understanding China's regional economic development and non-monetary macroeconomic indicators.
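The ECI computation itself is not spelled out in the abstract; below is a minimal sketch of the standard Hidalgo-Hausmann method of reflections on a binary region-by-activity matrix. This is an illustration of the general technique, not necessarily the exact procedure used in the cited paper:

```python
import numpy as np

def eci_method_of_reflections(M: np.ndarray, iterations: int = 8) -> np.ndarray:
    """Economic-complexity scores via the method of reflections.

    M[r, a] = 1 if region r is competitive in activity a, else 0.
    Returns a standardized (zero-mean, unit-variance) score per region.
    """
    k_r0 = M.sum(axis=1).astype(float)   # diversity of each region
    k_a0 = M.sum(axis=0).astype(float)   # ubiquity of each activity
    k_r, k_a = k_r0.copy(), k_a0.copy()
    for _ in range(iterations):
        # Each side averages the other side's previous-iteration values.
        k_r, k_a = (M @ k_a) / k_r0, (M.T @ k_r) / k_a0
    return (k_r - k_r.mean()) / k_r.std()

# Toy data: three regions, four activities.
M = np.array([[1, 1, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 1]])
print(eci_method_of_reflections(M))
```

In practice the iteration is run until the ranking stabilizes; production implementations often use the equivalent eigenvector formulation instead.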

  11. Quantifying and Reducing Light Pollution

    Science.gov (United States)

    Gokhale, Vayujeet; Caples, David; Goins, Jordan; Herdman, Ashley; Pankey, Steven; Wren, Emily

    2018-06-01

We describe the current level of light pollution in and around Kirksville, Missouri and around Anderson Mesa near Flagstaff, Arizona. We quantify the amount of light that is projected up towards the sky, instead of the ground, using Unihedron sky quality meters installed at various locations. We also present results from DSLR photometry of several standard stars, and compare the photometric quality of the data collected at locations with varying levels of light pollution. Presently, light fixture shields and ‘warm-colored’ lights are being installed on Truman State University’s campus in order to reduce light pollution. We discuss the experimental procedure we use to test the effectiveness of the different light fixture shields in a controlled setting inside the Del and Norma Robison Planetarium. Apart from negatively affecting the quality of the night sky for astronomers, light pollution adversely affects migratory patterns of some animals and sleep-patterns in humans, increases our carbon footprint, and wastes resources and money. This problem threatens to get particularly acute with the increasing use of outdoor LED lamps. We conclude with a call to action to all professional and amateur astronomers to act against the growing nuisance of light pollution.

  12. Quantifying meniscal kinematics in dogs.

    Science.gov (United States)

    Park, Brian H; Banks, Scott A; Pozzi, Antonio

    2017-11-06

The dog has been used extensively as an experimental model to study meniscal treatments such as meniscectomy, meniscal repair, transplantation, and regeneration. However, there is very little information on meniscal kinematics in the dog. This study used MR imaging to quantify in vitro meniscal kinematics in loaded dog knees in four distinct poses: extension, flexion, internal, and external rotation. A new method was used to track the meniscal poses along the convex and posteriorly tilted tibial plateau. Meniscal displacements were large, displacing 13.5 and 13.7 mm posteriorly on average for the lateral and medial menisci during flexion (p = 0.90). The medial anterior horn and lateral posterior horns were the most mobile structures, showing average translations of 15.9 and 15.1 mm, respectively. Canine menisci are highly mobile and exhibit movements that correlate closely with the relative tibiofemoral positions. © 2017 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res.

  13. Quantifying the invasiveness of species

    Directory of Open Access Journals (Sweden)

    Robert Colautti

    2014-04-01

The success of invasive species has been explained by two contrasting but non-exclusive views: (i) intrinsic factors make some species inherently good invaders; (ii) species become invasive as a result of extrinsic ecological and genetic influences such as release from natural enemies, hybridization or other novel ecological and evolutionary interactions. These viewpoints are rarely distinguished but hinge on distinct mechanisms leading to different management scenarios. To improve tests of these hypotheses of invasion success we introduce a simple mathematical framework to quantify the invasiveness of species along two axes: (i) interspecific differences in performance among native and introduced species within a region, and (ii) intraspecific differences between populations of a species in its native and introduced ranges. Applying these equations to a sample dataset of occurrences of 1,416 plant species across Europe, Argentina, and South Africa, we found that many species are common in their native range but become rare following introduction; only a few introduced species become more common. Biogeographical factors limiting spread (e.g. biotic resistance, time of invasion) therefore appear more common than those promoting invasion (e.g. enemy release). Invasiveness, as measured by occurrence data, is better explained by inter-specific variation in invasion potential than biogeographical changes in performance. We discuss how applying these comparisons to more detailed performance data would improve hypothesis testing in invasion biology and potentially lead to more efficient management strategies.
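The two axes can be illustrated with a toy occurrence-ratio calculation. Everything below (the function name and the specific ratio definitions) is a hypothetical reading of the abstract, not the paper's actual equations:

```python
def invasiveness_axes(occ_introduced: float,
                      occ_native_range: float,
                      mean_occ_native_species: float) -> tuple[float, float]:
    """Hypothetical occurrence-ratio version of the two axes.

    inter: how common the introduced species is relative to the average
           resident native species in the recipient region (axis i).
    intra: how common the species is in its introduced range relative
           to its own native range (axis ii).
    Values above 1 would suggest invasion success on that axis.
    """
    inter = occ_introduced / mean_occ_native_species
    intra = occ_introduced / occ_native_range
    return inter, intra

# A species common at home (40% of surveyed sites) but rare after
# introduction (5%), in a region where natives average 10% occurrence:
inter, intra = invasiveness_axes(0.05, 0.40, 0.10)
print(inter, intra)   # both below 1: not invasive on either axis
```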

  14. Integrated cosmological probes: concordance quantified

    Energy Technology Data Exchange (ETDEWEB)

    Nicola, Andrina; Amara, Adam; Refregier, Alexandre, E-mail: andrina.nicola@phys.ethz.ch, E-mail: adam.amara@phys.ethz.ch, E-mail: alexandre.refregier@phys.ethz.ch [Department of Physics, ETH Zürich, Wolfgang-Pauli-Strasse 27, CH-8093 Zürich (Switzerland)

    2017-10-01

    Assessing the consistency of parameter constraints derived from different cosmological probes is an important way to test the validity of the underlying cosmological model. In an earlier work [1], we computed constraints on cosmological parameters for ΛCDM from an integrated analysis of CMB temperature anisotropies and CMB lensing from Planck, galaxy clustering and weak lensing from SDSS, weak lensing from DES SV as well as Type Ia supernovae and Hubble parameter measurements. In this work, we extend this analysis and quantify the concordance between the derived constraints and those derived by the Planck Collaboration as well as WMAP9, SPT and ACT. As a measure for consistency, we use the Surprise statistic [2], which is based on the relative entropy. In the framework of a flat ΛCDM cosmological model, we find all data sets to be consistent with one another at a level of less than 1σ. We highlight that the relative entropy is sensitive to inconsistencies in the models that are used in different parts of the analysis. In particular, inconsistent assumptions for the neutrino mass break its invariance on the parameter choice. When consistent model assumptions are used, the data sets considered in this work all agree with each other and ΛCDM, without evidence for tensions.
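The relative entropy that underlies the Surprise statistic has a closed form for two one-dimensional Gaussian constraints; the following is a toy sketch of that quantity (illustrative only; the actual analysis is multi-parameter):

```python
import math

def kl_gaussians(mu1: float, s1: float, mu2: float, s2: float) -> float:
    """Relative entropy D(P1 || P2) in nats, for 1-D Gaussians P1, P2."""
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

# Two consistent constraints on a parameter (means differ by half the
# broader posterior's sigma): small relative entropy.
low = kl_gaussians(0.31, 0.01, 0.30, 0.02)
# A discrepant constraint (means differ by 2.5 sigma): much larger value.
high = kl_gaussians(0.35, 0.01, 0.30, 0.02)
print(low, high)
```

The Surprise statistic compares the observed relative entropy against its expected value, so a consistent pair like the first example yields a small Surprise.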

  15. Fueling Global Fishing Fleets

    International Nuclear Information System (INIS)

    Tyedmers, Peter H.; Watson, Reg; Pauly, Daniel

    2005-01-01

Over the course of the 20th century, fossil fuels became the dominant energy input to most of the world's fisheries. Although various analyses have quantified fuel inputs to individual fisheries, to date, no attempt has been made to quantify the global scale and to map the distribution of fuel consumed by fisheries. By integrating data representing more than 250 fisheries from around the world with spatially resolved catch statistics for 2000, we calculate that globally, fisheries burned almost 50 billion L of fuel in the process of landing just over 80 million t of marine fish and invertebrates for an average rate of 620 L/t. Consequently, fisheries account for about 1.2% of global oil consumption, an amount equivalent to that burned by the Netherlands, the 18th-ranked oil consuming country globally, and directly emit more than 130 million t of CO2 into the atmosphere. From an efficiency perspective, the energy content of the fuel burned by global fisheries is 12.5 times greater than the edible protein energy content of the resulting catch
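The headline fuel-intensity rate follows directly from the quoted totals; a back-of-envelope check using only numbers given in the text:

```python
fuel_litres = 50e9     # ~50 billion L of fuel burned by global fisheries in 2000
catch_tonnes = 80e6    # just over 80 million t of marine fish and invertebrates

litres_per_tonne = fuel_litres / catch_tonnes
print(litres_per_tonne)  # 625.0, consistent with the quoted average of ~620 L/t
```

The small gap between 625 and the reported 620 L/t reflects that the quoted totals are rounded ("almost 50 billion", "just over 80 million").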

  16. Neural basis for generalized quantifier comprehension.

    Science.gov (United States)

    McMillan, Corey T; Clark, Robin; Moore, Peachie; Devita, Christian; Grossman, Murray

    2005-01-01

    Generalized quantifiers like "all cars" are semantically well understood, yet we know little about their neural representation. Our model of quantifier processing includes a numerosity device, operations that combine number elements and working memory. Semantic theory posits two types of quantifiers: first-order quantifiers identify a number state (e.g. "at least 3") and higher-order quantifiers additionally require maintaining a number state actively in working memory for comparison with another state (e.g. "less than half"). We used BOLD fMRI to test the hypothesis that all quantifiers recruit inferior parietal cortex associated with numerosity, while only higher-order quantifiers recruit prefrontal cortex associated with executive resources like working memory. Our findings showed that first-order and higher-order quantifiers both recruit right inferior parietal cortex, suggesting that a numerosity component contributes to quantifier comprehension. Moreover, only probes of higher-order quantifiers recruited right dorsolateral prefrontal cortex, suggesting involvement of executive resources like working memory. We also observed activation of thalamus and anterior cingulate that may be associated with selective attention. Our findings are consistent with a large-scale neural network centered in frontal and parietal cortex that supports comprehension of generalized quantifiers.

  17. The global carbon cycle

    International Nuclear Information System (INIS)

    Maier-Reimer, E.

    1991-01-01

    Basic concepts of the global carbon cycle on earth are described; by careful analyses of isotopic ratios, emission history and oceanic ventilation rates are derived, which provide crucial tests for constraining and calibrating models. Effects of deforestation, fertilizing, fossil fuel burning, soil erosion, etc. are quantified and compared, and the oceanic carbon process is evaluated. Oceanic and terrestrial biosphere modifications are discussed and a carbon cycle model is proposed

  18. Urban physiology: city ants possess high heat tolerance.

    Directory of Open Access Journals (Sweden)

    Michael J Angilletta

Urbanization has caused regional increases in temperature that exceed those measured on a global scale, leading to urban heat islands as much as 12 degrees C hotter than their surroundings. Optimality models predict ectotherms in urban areas should tolerate heat better and cold worse than ectotherms in rural areas. We tested these predictions by measuring heat and cold tolerances of leaf-cutter ants from South America's largest city (São Paulo, Brazil). Specifically, we compared thermal tolerances of ants from inside and outside of the city. Knock-down resistance and chill-coma recovery were used as indicators of heat and cold tolerances, respectively. Ants from within the city took 20% longer to lose mobility at 42 degrees C than ants from outside the city. Interestingly, greater heat tolerance came at no obvious expense of cold tolerance; hence, our observations only partially support current theory. Our results indicate that thermal tolerances of some organisms can respond to rapid changes in climate. Predictive models should account for acclimatory and evolutionary responses during climate change.

  19. A Religious Tolerance and Harmony the Qur'anic Perspective

    Directory of Open Access Journals (Sweden)

    Choirul Fuad Yusuf

    2016-01-01

Religious tolerance and harmony are necessary to develop, given today's need for global security and peace. For this purpose, all religions have to be fairly "tolerant" of others. Islam as a revealed religion, whatever its motive, is often perceived and accused as the religion of intolerance and violence. Some political and ideological questions are raised in this context, for example: "Can Islamic faith tolerate other faiths, religions or groups?", "What are actually the Islamic teachings on tolerance and peace or harmony?" and the like. This article attempts to unpack and elaborate how far the Qur'an, as the first and primary source of Islam, teaches tolerance and peace. Using a hermeneutical approach, the writer analyses what the Qur'an actually teaches on the concepts and practices of tolerance. Based on this analysis, he concludes that the Qur'an teaches its followers to respect and implement the doctrine of tolerance and peace. The Muslim world is thus enjoined to tolerate others and respect differences, in order to strengthen global security and peaceful life among nations.

  20. Generating high temperature tolerant transgenic plants: Achievements and challenges.

    Science.gov (United States)

    Grover, Anil; Mittal, Dheeraj; Negi, Manisha; Lavania, Dhruv

    2013-05-01

    Production of plants tolerant to high temperature stress is of immense significance in the light of global warming and climate change. Plant cells respond to high temperature stress by re-programming their genetic machinery for survival and reproduction. High temperature tolerance in transgenic plants has largely been achieved either by over-expressing heat shock protein genes or by altering levels of heat shock factors that regulate expression of heat shock and non-heat shock genes. Apart from heat shock factors, over-expression of other trans-acting factors like DREB2A, bZIP28 and WRKY proteins has proven useful in imparting high temperature tolerance. Besides these, elevating the genetic levels of proteins involved in osmotic adjustment, reactive oxygen species removal, saturation of membrane-associated lipids, photosynthetic reactions, production of polyamines and protein biosynthesis process have yielded positive results in equipping transgenic plants with high temperature tolerance. Cyclic nucleotide gated calcium channel proteins that regulate calcium influxes across the cell membrane have recently been shown to be the key players in induction of high temperature tolerance. The involvement of calmodulins and kinases in activation of heat shock factors has been implicated as an important event in governing high temperature tolerance. Unfilled gaps limiting the production of high temperature tolerant transgenic plants for field level cultivation are discussed. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  1. Fault Tolerant Computer Architecture

    CERN Document Server

    Sorin, Daniel

    2009-01-01

    For many years, most computer architects have pursued one primary goal: performance. Architects have translated the ever-increasing abundance of ever-faster transistors provided by Moore's law into remarkable increases in performance. Recently, however, the bounty provided by Moore's law has been accompanied by several challenges that have arisen as devices have become smaller, including a decrease in dependability due to physical faults. In this book, we focus on the dependability challenge and the fault tolerance solutions that architects are developing to overcome it. The two main purposes

  2. Toleration, Synthesis or Replacement?

    DEFF Research Database (Denmark)

    Holtermann, Jakob v. H.; Madsen, Mikael Rask

    2016-01-01

, in order to answer is not yet another partisan suggestion, but rather an attempt at making intelligible both the oppositions and the possibilities of synthesis between normative and empirical approaches to law. Based on our assessment and rational reconstruction of current arguments and positions, we therefore outline a taxonomy consisting of the following three basic, ideal-types in terms of the epistemological understanding of the interface of law and empirical studies: toleration, synthesis and replacement. This tripartite model proves useful with a view to teasing out and better articulating...

  3. Global warning, global warming

    International Nuclear Information System (INIS)

    Benarde, M.A.

    1992-01-01

This book provides insights into the formidable array of issues which, in a warmer world, could impinge upon every facet of readers' lives. It examines climatic change and the long-term implications of global warming for the ecosystem. Topics include the ozone layer and how it works; the greenhouse effect; the dangers of imbalance and its effects on human and animal life; disruptions to the basic ecology of the planet; and the real scientific evidence for and against aberrant climatic shifts. The author also examines workable social and political programs and changes that must be instituted to avoid ecological disaster

  4. Is Multilingualism Linked to a Higher Tolerance of Ambiguity?

    Science.gov (United States)

    DeWaele, Jean-Marc; Wei, Li

    2013-01-01

    The present study investigates the link between multilingualism and the personality trait Tolerance of Ambiguity (TA) among 2158 mono-, bi- and multilinguals. Monolinguals and bilinguals scored significantly lower on TA compared to multilinguals. A high level of global proficiency of various languages was linked to higher TA scores. A stay abroad…

  5. Nosema Tolerant Honeybees (Apis mellifera) Escape Parasitic Manipulation of Apoptosis

    DEFF Research Database (Denmark)

    Kurze, Christoph; Le Conte, Yves; Dussaubat, Claudia

    2015-01-01

conducted three inoculation experiments to investigate the apoptotic response during infection with the intracellular gut pathogen Nosema ceranae, which is considered a potential global threat to the honeybee (Apis mellifera) and other bee pollinators, in sensitive and tolerant honeybees. To explore...

  6. Ethnopoly promotes tolerance

    CERN Document Server

    CERN Bulletin

    2010-01-01

On Friday 23 April, 225 primary school children from the eight schools in Meyrin-Cointrin and their accompanying adults took part in a big game of Ethnopoly. Private individuals, associations, administrations, shopkeepers and CERN all opened their doors to them to talk about their countries, their customs and what they are doing to promote tolerance and integration.   The CERN stand set up at ForumMeyrin for the Ethnopoly game. Scurrying from one place to another, the 10- and 11-year-olds were made aware of the rich cultural diversity of their commune, which is home to 130 different nationalities. Physicists and engineers from CERN took up residence in the Forum Meyrin for the day in order to talk to the children about the advantages of international collaboration, a subject dear to the Organization's heart. They welcomed around fifty children in the course of the day, conveying to them a message of tolerance: despite their differences, the 10,000 scientists and other members of the CERN...

  7. Against Globalization

    DEFF Research Database (Denmark)

    Philipsen, Lotte; Baggesgaard, Mads Anders

    2013-01-01

In order to understand globalization, we need to consider what globalization is not. That is, in order to understand the mechanisms and elements that work toward globalization, we must, in a sense, read against globalization, highlighting the limitations of the concept and its inherent conflicts. Only by employing this as a critical practice will we be analytically able to gain a dynamic understanding of the forces of globalization as they unfold today and as they have developed historically....

  8. A Consideration of Resistance and Tolerance for Ruminant Nematode Infections

    Directory of Open Access Journals (Sweden)

    Steve eBishop

    2012-12-01

    Full Text Available Debates on the relative merits of resistance (the ability of the host to control the parasite lifecycle) and tolerance (the net impact of infection on host performance) are often lively and unhindered by data or evidence. Resistance generally shows continuous, heritable variation but data are sparser for tolerance, the utility of which will depend upon the disease prevalence. Prevalence is a function of group mean resistance and infection pressure, which itself is influenced by mean resistance. Tolerance will have most value for endemic diseases with a high prevalence, but will be of little value for low prevalence diseases. The conditionality of tolerance on infection status, and hence resistance, makes it difficult to estimate independently of resistance. Tolerance is potentially tractable for nematode infections, as the prevalence of infection is ca. 100% in animals grazing infected pasture, and infection level can be quantified by faecal egg count (FEC). Whilst individual animal phenotypes for tolerance are difficult to estimate, breeding values are estimable if related animals graze pastures of different contamination levels. Selection for resistance, i.e. FEC, provides both direct and indirect benefits from ever decreased pasture contamination and hence decreased infectious challenge. Modelling and experimental studies have shown that such reductions in pasture contamination may lead to substantially increased performance. It is proposed that selection goals addressing nematode infections should include both resistance and performance under challenging conditions. However, there may be benefits from exploiting large datasets in which sires are used across cohorts differing in infection level, to further explore tolerance. This may help to customise breeding objectives, with tolerance given greater weight in heavily parasitized environments.

  9. Quantifying Urban Fragmentation under Economic Transition in Shanghai City, China

    Directory of Open Access Journals (Sweden)

    Heyuan You

    2015-12-01

    Full Text Available Urban fragmentation affects sustainability through multiple impacts on economic, social, and environmental costs. Characterizing the dynamics of urban fragmentation in relation to economic transition should provide implications for sustainability. However, few efforts have addressed this issue. Using the case of Shanghai (China), this paper quantifies urban fragmentation in relation to economic transition. In particular, urban fragmentation is quantified by a time series of remotely sensed images and a set of landscape metrics, and economic transition is described by a set of indicators covering three aspects (globalization, decentralization, and marketization). Results show that urban fragmentation presents an increasing linear trend. Multivariate regression identifies a positive linear correlation between urban fragmentation and economic transition. More specifically, the relative influence differs across the three components of economic transition: the influence of decentralization is stronger than that of globalization and marketization, and the joint influences of decentralization and globalization are the strongest for urban fragmentation. The demonstrated methodology can be applied to other places after suitable adjustment of the economic transition indicators and fragmentation metrics.
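    The multivariate regression described in this record (fragmentation regressed on globalization, decentralization, and marketization indicators) can be sketched as an ordinary least-squares fit. The indicator names and synthetic data below are illustrative assumptions, not the paper's actual measures:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical yearly indicators: globalization, decentralization, marketization.
    # Decentralization is given the largest true coefficient, mirroring the finding
    # that its relative influence on fragmentation is strongest.
    n_years = 30
    X = rng.normal(size=(n_years, 3))
    true_beta = np.array([0.5, 1.2, 0.3])
    y = X @ true_beta + rng.normal(scale=0.05, size=n_years)  # fragmentation index

    # Ordinary least squares with an intercept column.
    A = np.column_stack([np.ones(n_years), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    intercept, beta = coef[0], coef[1:]
    ```

    Ranking the fitted coefficients (here `beta`) is one simple way to compare the relative influence of each transition component.
    
    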

  10. Fault-tolerant computing systems

    International Nuclear Information System (INIS)

    Dal Cin, M.; Hohl, W.

    1991-01-01

    Tests, diagnosis and fault treatment were chosen as the guiding themes of the conference. However, the scope of the conference included reliability, availability, safety and security issues in software and hardware systems as well. The sessions organized for the conference, which was completed by an industrial presentation, were: Keynote Address, Reconfiguration and Recovery, System Level Diagnosis, Voting and Agreement, Testing, Fault-Tolerant Circuits, Array Testing, Modelling, Applied Fault Tolerance, Fault-Tolerant Arrays and Systems, Interconnection Networks, Fault-Tolerant Software. One paper has been indexed separately in the database. (orig./HP)

  11. Commercialization of radiation tolerant camera

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Bum; Choi, Young Soo; Kim, Sun Ku; Lee, Jong Min; Cha, Bung Hun; Lee, Nam Ho; Byun, Eiy Gyo; Yoo, Seun Wook; Choi, Bum Ki; Yoon, Sung Up; Kim, Hyun Gun; Sin, Jeong Hun; So, Suk Il

    1999-12-01

    In this project, a radiation-tolerant camera that withstands a total dose of 10{sup 6} - 10{sup 8} rad was developed. To this end, the radiation effects on camera components were examined and evaluated, and the camera configuration was studied. Based on the evaluation results, the components were selected and the design was performed. A vidicon tube was chosen as the image sensor, and non-browning optics and a camera driving circuit were applied. The controllers needed for the CCTV camera system (lens, light, and pan/tilt controllers) were designed on the concept of remote control. Two types of radiation-tolerant camera were fabricated, for use in underwater or normal environments. (author)

  12. Commercialization of radiation tolerant camera

    International Nuclear Information System (INIS)

    Lee, Yong Bum; Choi, Young Soo; Kim, Sun Ku; Lee, Jong Min; Cha, Bung Hun; Lee, Nam Ho; Byun, Eiy Gyo; Yoo, Seun Wook; Choi, Bum Ki; Yoon, Sung Up; Kim, Hyun Gun; Sin, Jeong Hun; So, Suk Il

    1999-12-01

    In this project, a radiation-tolerant camera that withstands a total dose of 10^6 - 10^8 rad was developed. To this end, the radiation effects on camera components were examined and evaluated, and the camera configuration was studied. Based on the evaluation results, the components were selected and the design was performed. A vidicon tube was chosen as the image sensor, and non-browning optics and a camera driving circuit were applied. The controllers needed for the CCTV camera system (lens, light, and pan/tilt controllers) were designed on the concept of remote control. Two types of radiation-tolerant camera were fabricated, for use in underwater or normal environments. (author)

  13. Verification of Tolerance Chains in Micro Manufacturing

    DEFF Research Database (Denmark)

    Gasparin, Stefania

    .1 – 200 μm). Finally, an optical component is investigated with the purpose of suggesting a quality control approach for micro-manufacturing process through a control of the product. It is a useful method to adopt when the aim is to detect and quantify inconsistency or incompatibilities during a process...... on dimensional and geometrical metrology. If the measurement uncertainty is large compared to the tolerance interval, a small conformance zone is left for process variation. Therefore particular attention has to be paid to the instrument capabilities in order to reduce the measurement uncertainty. Different...... chain. In this way the process parameters can be adjusted in order to fulfil the requirements of the final micro-product....

  14. Salt Tolerance in Soybean

    Institute of Scientific and Technical Information of China (English)

    Tsui-Hung Phang; Guihua Shao; Hon-Ming Lam

    2008-01-01

    Soybean is an important cash crop and its productivity is significantly hampered by salt stress. High salt imposes negative impacts on growth, nodulation, agronomy traits, seed quality and quantity, and thus reduces the yield of soybean. To cope with salt stress, soybean has developed several tolerance mechanisms, including: (i) maintenance of ion homeostasis; (ii) adjustment in response to osmotic stress; (iii) restoration of osmotic balance; and (iv) other metabolic and structural adaptations. The regulatory network for abiotic stress responses in higher plants has been studied extensively in model plants such as Arabidopsis thaliana. Some homologous components involved in salt stress responses have been identified in soybean. In this review, we integrate the relevant works on soybean and propose a working model to describe its salt stress responses at the molecular level.

  15. Delay tolerant networks

    CERN Document Server

    Gao, Longxiang; Luan, Tom H

    2015-01-01

    This brief presents emerging and promising communication methods for network reliability via delay tolerant networks (DTNs). Different from traditional networks, DTNs possess unique features, such as long latency and unstable network topology. As a result, DTNs can be widely applied to critical applications, such as space communications, disaster rescue, and battlefield communications. The brief provides a complete investigation of DTNs and their current applications, from an overview to the latest developments in the area. The core issue of data forwarding in DTNs is tackled, including the importance of social characteristics, an essential feature when mobile devices are used for human communication. Security and privacy issues in DTNs are discussed, along with future work.

  16. Global Strategy

    DEFF Research Database (Denmark)

    Li, Peter Ping

    2013-01-01

    Global strategy differs from domestic strategy in terms of content and process as well as context and structure. The content of global strategy can contain five key elements, while the process of global strategy can have six major stages. These are expounded below. Global strategy is influenced...... by rich and complementary local contexts with diverse resource pools and game rules at the national level to form a broad ecosystem at the global level. Further, global strategy dictates the interaction or balance between different entry strategies at the levels of internal and external networks....

  17. Quantifying forecast quality of IT business value

    NARCIS (Netherlands)

    Eveleens, J.L.; van der Pas, M.; Verhoef, C.

    2012-01-01

    This article discusses how to quantify the forecasting quality of IT business value. We address a common economic indicator often used to determine the business value of project proposals, the Net Present Value (NPV). To quantify the forecasting quality of IT business value, we develop a generalized
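    The Net Present Value indicator referenced in this record can be sketched as follows; this is a minimal textbook illustration of NPV itself, not the authors' forecasting-quality measure:

    ```python
    def npv(rate, cash_flows):
        """Net Present Value: discount each cash flow C_t by (1 + rate)**t.

        cash_flows[0] is the time-0 flow (typically a negative initial investment).
        """
        return sum(c / (1 + rate) ** t for t, c in enumerate(cash_flows))

    # Hypothetical project: invest 1000 now, receive 500 per year for three years,
    # discounted at 10%. A positive NPV suggests the project adds value.
    value = npv(0.10, [-1000, 500, 500, 500])  # ≈ 243.43
    ```

    Comparing such forecast NPVs against realized outcomes is the kind of exercise the article's forecasting-quality analysis builds on.
    
    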

  18. Structural equation models to estimate risk of infection and tolerance to bovine mastitis

    OpenAIRE

    Detilleux, Johann; Theron, Léonard; Duprez, Jean-Noël; Reding, Edouard; Humblet, Marie-France; Planchon, Viviane; Delfosse, Camille; Bertozzi, Carlo; Mainil, Jacques; Hanzen, Christian

    2013-01-01

    Background One method to improve durably animal welfare is to select, as reproducers, animals with the highest ability to resist or tolerate infection. To do so, it is necessary to distinguish direct and indirect mechanisms of resistance and tolerance because selection on these traits is believed to have different epidemiological and evolutionary consequences. Methods We propose structural equation models with latent variables (1) to quantify the latent risk of infection and to identify, amon...

  19. Shaping tolerant attitudes towards immigrants

    DEFF Research Database (Denmark)

    Rapp, Carolin

    2017-01-01

    This article contributes to the ongoing discussion on how tolerance may be fostered in Western European countries and to the question of how contextual factors such as welfare state expenditures may contribute to this formation. Tolerance is understood as a basic democratic principle that helps c...

  20. Legal Quality, Inequality, and Tolerance

    DEFF Research Database (Denmark)

    Bjørnskov, Christian

    Previous findings suggest that income inequality leads to lower legal quality. This paper argues that voters' tolerance of inequality exerts an additional influence. Empirical findings suggest that inequality leads to lower legal quality due to its effect on trust while the tolerance of inequality...

  1. Legal Quality, Inequality, and Tolerance

    DEFF Research Database (Denmark)

    Bjørnskov, Christian

    2004-01-01

    Previous findings suggest that income inequality leads to lower legal quality. This paper argues that voters' tolerance of inequality exerts an additional influence. Empirical findings suggest that inequality leads to lower legal quality due to its effect on trust while the tolerance of inequality...

  2. Tolerance Issue in Kazakh Culture

    Science.gov (United States)

    Aubakirova, Saltanat S.; Ismagambetova, Zukhra N.; Karabayeva, Aliya G.; Rysbekova, Shamshiya S.; Mirzabekova, Alma Sh.

    2016-01-01

    In this article the authors reveal the basic cultural mechanisms that influence the formation of the tolerance strategy in Kazakh and Kazakhstan society, show its basic directions, as well as its importance for the modern Kazakhstan society and the formation of intercultural communication with foreign countries. Tolerance is a necessary element of…

  3. Tolerance-Based Feature Transforms

    NARCIS (Netherlands)

    Reniers, Dennie; Telea, Alexandru

    2007-01-01

    Tolerance-based feature transforms (TFTs) assign to each pixel in an image not only the nearest feature pixels on the boundary (origins), but all origins from the minimum distance up to a user-defined tolerance. In this paper, we compare four simple-to-implement methods for computing TFTs on binary

  4. Global Europa

    DEFF Research Database (Denmark)

    Manners, Ian

    2010-01-01

    at the mythology of ‘global Europa' - the EU in the world. It concludes with a reflection on the way in which the many diverse myths of global Europa compete for daily attention, whether as lore, ideology, or pleasure. In this respect the mythology of global Europa is part of our everyday existence, part of the EU...

  5. Bare quantifier fronting as contrastive topicalization

    Directory of Open Access Journals (Sweden)

    Ion Giurgea

    2015-11-01

    Full Text Available I argue that indefinites (in particular bare quantifiers such as 'something', 'somebody', etc.) which are neither existentially presupposed nor in the restriction of a quantifier over situations can undergo topicalization in a number of Romance languages (Catalan, Italian, Romanian, Spanish), but only if the sentence contains "verum" focus, i.e. focus on a high degree of certainty of the sentence. I analyze these indefinites as contrastive topics, using Büring's (1999) theory (where the term 'S-topic' is used for what I call 'contrastive topic'). I propose that the topic is evaluated in relation to a scalar set including generalized quantifiers such as {λP ∃x P(x), λP MANYx P(x), λP MOSTx P(x), λP ∀x P(x)} or {λP ∃x P(x), λP P(a), λP P(b) …}, and that the contrastive topic is the weakest generalized quantifier in this set. The verum focus, which is part of the "comment" that co-occurs with the "Topic", introduces a set of alternatives including degrees of certainty of the assertion. The speaker asserts that his claim is certainly true or highly probable, contrasting it with stronger claims for which the degree of probability is unknown. This explains the observation that in downward-entailing contexts the fronted quantified DPs are headed by 'all' or 'many', whereas 'some', small numbers or 'at least n' appear in upward-entailing contexts. Unlike other cases of non-specific topics, which are property topics, these are quantifier topics: the topic part is a generalized quantifier, the comment is a property of generalized quantifiers. This explains the narrow scope of the fronted quantified DP.

  6. Quantifying geocode location error using GIS methods

    Directory of Open Access Journals (Sweden)

    Gardner Bennett R

    2007-04-01

    Full Text Available Abstract Background The Metropolitan Atlanta Congenital Defects Program (MACDP) collects maternal address information at the time of delivery for infants and fetuses with birth defects. These addresses have been geocoded by two independent agencies: (1) the Georgia Division of Public Health Office of Health Information and Policy (OHIP) and (2) a commercial vendor. Geographic information system (GIS) methods were used to quantify uncertainty in the two sets of geocodes using orthoimagery and tax parcel datasets. Methods We sampled 599 infants and fetuses with birth defects delivered during 1994–2002 with maternal residence in either Fulton or Gwinnett County. Tax parcel datasets were obtained from the tax assessor's offices of Fulton and Gwinnett County. High-resolution orthoimagery for these counties was acquired from the U.S. Geological Survey. For each of the 599 addresses we attempted to locate the tax parcel corresponding to the maternal address. If the tax parcel was identified, the distance and the angle between the geocode and the residence were calculated. We used simulated data to characterize the impact of geocode location error. In each county 5,000 geocodes were generated and assigned their corresponding Census 2000 tract. Each geocode was then displaced at a random angle by a random distance drawn from the distribution of observed geocode location errors. The census tract of the displaced geocode was determined. We repeated this process 5,000 times and report the percentage of geocodes that resolved into incorrect census tracts. Results Median location error was less than 100 meters for both OHIP and commercial vendor geocodes; the distribution of angles appeared uniform. Median location error was approximately 35% larger in Gwinnett (a suburban county) relative to Fulton (a county with urban and suburban areas). Location error occasionally caused the simulated geocodes to be displaced into incorrect census tracts; the median percentage
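    The Monte Carlo procedure described in this record (displace each geocode at a random angle by a distance drawn from the observed error distribution, then check whether it resolves to a different census tract) can be sketched as below. The grid cells standing in for census tracts, and all function names and parameters, are illustrative assumptions:

    ```python
    import math
    import random

    def displace(x, y, distance, angle):
        """Move a point by `distance` metres at `angle` radians."""
        return x + distance * math.cos(angle), y + distance * math.sin(angle)

    def misclassification_rate(n, cell_size, error_distances, rng):
        """Fraction of simulated geocodes that land in a different grid cell
        (a crude stand-in for a census tract) after random displacement."""
        wrong = 0
        for _ in range(n):
            # Random true location on a 10x10 grid of cells.
            x = rng.uniform(0, 10 * cell_size)
            y = rng.uniform(0, 10 * cell_size)
            d = rng.choice(error_distances)   # draw from "observed" error distribution
            a = rng.uniform(0, 2 * math.pi)   # angles appeared uniform in the study
            x2, y2 = displace(x, y, d, a)
            cell_before = (int(x // cell_size), int(y // cell_size))
            cell_after = (int(x2 // cell_size), int(y2 // cell_size))
            if cell_before != cell_after:
                wrong += 1
        return wrong / n

    rng = random.Random(42)
    # ~2 km cells; hypothetical location errors of 50-200 m, echoing the <100 m median.
    rate = misclassification_rate(5000, 2000.0, [50.0, 100.0, 200.0], rng)
    ```

    With errors small relative to cell size, only points near a boundary change cell, so the misclassification rate stays low, which mirrors the study's finding that location error only occasionally shifts a geocode into the wrong tract.
    
    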

  7. Global usability

    CERN Document Server

    Douglas, Ian

    2011-01-01

    The concept of usability has become an increasingly important consideration in the design of all kinds of technology. As more products are aimed at global markets and developed through internationally distributed teams, usability design needs to be addressed in global terms. Interest in usability as a design issue and specialist area of research and education has developed steadily in North America and Europe since the 1980's. However, it is only over the last ten years that it has emerged as a global concern. Global Usability provides an introduction to the important issues in globalizing des

  8. Tolerance and chimerism.

    Science.gov (United States)

    Kolb, Hans-Jochem; Guenther, Wolfgang; Gyurkocza, Boglarka; Hoetzl, Florian; Simoes, Belinda; Falk, Christine; Schleuning, Michael; Ledderose, Georg

    2003-05-15

    Stem-cell transplantation from human leukocyte antigen (HLA)-haploidentical family members carries a high risk of rejection and graft-versus-host disease (GVHD) if donor and recipient differ by more than one HLA antigen. The authors have developed treatment protocols from studies in dog leukocyte antigen-haploidentical dogs that prevent rejection and modify GVHD to the extent that patients with aggressive hematologic neoplasia can be treated with success. Principal improvements have been achieved in the use of cyclophosphamide and total-body irradiation for conditioning and T-cell depletion for prevention of GVHD. More recently, the combination of marrow and CD6-depleted mobilized donor blood cells (MDBC) has been introduced for HLA-haploidentical transplantation on the basis that CD6-depleted MDBC contain immunoregulatory cells besides stem cells and natural killer cells. Clinical results are reported on 36 patients with high-risk hematologic neoplasia. The results encourage the use of HLA-haploidentical stem-cell transplantation at an earlier stage of the disease. This method could also be of use for tolerance induction in organ transplantation.

  9. Selective labelling and eradication of antibiotic-tolerant bacterial populations in Pseudomonas aeruginosa biofilms.

    Science.gov (United States)

    Chua, Song Lin; Yam, Joey Kuok Hoong; Hao, Piliang; Adav, Sunil S; Salido, May Margarette; Liu, Yang; Givskov, Michael; Sze, Siu Kwan; Tolker-Nielsen, Tim; Yang, Liang

    2016-02-19

    Drug resistance and tolerance greatly diminish the therapeutic potential of antibiotics against pathogens. Antibiotic tolerance by bacterial biofilms often leads to persistent infections, but its mechanisms are unclear. Here we use a proteomics approach, pulsed stable isotope labelling with amino acids (pulsed-SILAC), to quantify newly expressed proteins in colistin-tolerant subpopulations of Pseudomonas aeruginosa biofilms (colistin is a 'last-resort' antibiotic against multidrug-resistant Gram-negative pathogens). Migration is essential for the formation of colistin-tolerant biofilm subpopulations, with colistin-tolerant cells using type IV pili to migrate onto the top of the colistin-killed biofilm. The colistin-tolerant cells employ quorum sensing (QS) to initiate the formation of new colistin-tolerant subpopulations, highlighting multicellular behaviour in antibiotic tolerance development. The macrolide erythromycin, which has been previously shown to inhibit the motility and QS of P. aeruginosa, boosts biofilm eradication by colistin. Our work provides insights on the mechanisms underlying the formation of antibiotic-tolerant populations in bacterial biofilms and indicates research avenues for designing more efficient treatments against biofilm-associated infections.

  10. Selective labelling and eradication of antibiotic-tolerant bacterial populations in Pseudomonas aeruginosa biofilms

    Science.gov (United States)

    Chua, Song Lin; Yam, Joey Kuok Hoong; Hao, Piliang; Adav, Sunil S.; Salido, May Margarette; Liu, Yang; Givskov, Michael; Sze, Siu Kwan; Tolker-Nielsen, Tim; Yang, Liang

    2016-01-01

    Drug resistance and tolerance greatly diminish the therapeutic potential of antibiotics against pathogens. Antibiotic tolerance by bacterial biofilms often leads to persistent infections, but its mechanisms are unclear. Here we use a proteomics approach, pulsed stable isotope labelling with amino acids (pulsed-SILAC), to quantify newly expressed proteins in colistin-tolerant subpopulations of Pseudomonas aeruginosa biofilms (colistin is a ‘last-resort' antibiotic against multidrug-resistant Gram-negative pathogens). Migration is essential for the formation of colistin-tolerant biofilm subpopulations, with colistin-tolerant cells using type IV pili to migrate onto the top of the colistin-killed biofilm. The colistin-tolerant cells employ quorum sensing (QS) to initiate the formation of new colistin-tolerant subpopulations, highlighting multicellular behaviour in antibiotic tolerance development. The macrolide erythromycin, which has been previously shown to inhibit the motility and QS of P. aeruginosa, boosts biofilm eradication by colistin. Our work provides insights on the mechanisms underlying the formation of antibiotic-tolerant populations in bacterial biofilms and indicates research avenues for designing more efficient treatments against biofilm-associated infections. PMID:26892159

  11. Quantify Risk to Manage Cost and Schedule

    National Research Council Canada - National Science Library

    Raymond, Fred

    1999-01-01

    Too many projects suffer from unachievable budget and schedule goals, caused by unrealistic estimates and the failure to quantify and communicate the uncertainty of these estimates to managers and sponsoring executives...

  12. Quantifying drug-protein binding in vivo

    International Nuclear Information System (INIS)

    Buchholz, B; Bench, G; Keating III, G; Palmblad, M; Vogel, J; Grant, P G; Hillegonds, D

    2004-01-01

    Accelerator mass spectrometry (AMS) provides precise quantitation of isotope-labeled compounds that are bound to biological macromolecules such as DNA or proteins. The sensitivity is high enough to allow for sub-pharmacological ("micro-") dosing to determine macromolecular targets without inducing toxicities or altering the system under study, whether it is healthy or diseased. We demonstrated an application of AMS in quantifying the physiologic effects of one dosed chemical compound upon the binding level of another compound in vivo at sub-toxic doses [4]. We are using tissues left from this study to develop protocols for quantifying specific binding to isolated and identified proteins. We also developed a new technique to quantify nanogram to milligram amounts of isolated protein at precisions that are comparable to those for quantifying the bound compound by AMS

  13. New frontiers of quantified self 3

    DEFF Research Database (Denmark)

    Rapp, Amon; Cena, Federica; Kay, Judy

    2017-01-01

    The Quantified Self (QS) field needs to start thinking about how situated needs may affect the use of self-tracking technologies. In this workshop we will focus on the idiosyncrasies of specific categories of users.

  14. Mechanical tolerance stackup and analysis

    CERN Document Server

    Fischer, Bryan R

    2011-01-01

    Use Tolerance Analysis Techniques to Avoid Design, Quality, and Manufacturing Problems Before They Happen Often overlooked and misunderstood, tolerance analysis is a critical part of improving products and their design processes. Because all manufactured products are subject to variation, it is crucial that designers predict and understand how these changes can affect form, fit, and function of parts and assemblies--and then communicate their findings effectively. Written by one of the developers of ASME Y14.5 and other geometric dimensioning and tolerancing (GD&T) standards, Mechanical Tolerance

  15. Advanced cloud fault tolerance system

    Science.gov (United States)

    Sumangali, K.; Benny, Niketa

    2017-11-01

    Cloud computing has become a prevalent on-demand service on the internet to store, manage and process data. A pitfall that accompanies cloud computing is the failures that can be encountered in the cloud. To overcome these failures, we require a fault tolerance mechanism to abstract faults from users. We have proposed a fault tolerant architecture, which is a combination of proactive and reactive fault tolerance. This architecture essentially increases the reliability and the availability of the cloud. In the future, we would like to compare evaluations of our proposed architecture with existing architectures and further improve it.

  16. Tolerance to and cross tolerance between ethanol and nicotine.

    Science.gov (United States)

    Collins, A C; Burch, J B; de Fiebre, C M; Marks, M J

    1988-02-01

    Female DBA mice were subjected to one of four treatments: ethanol-containing or control diets, nicotine (0.2, 1.0, 5.0 mg/kg/hr) infusion or saline infusion. After removal from the liquid diets or cessation of infusion, the animals were challenged with an acute dose of ethanol or nicotine. Chronic ethanol-fed mice were tolerant to the effects of ethanol on body temperature and open field activity and were cross tolerant to the effects of nicotine on body temperature and heart rate. Nicotine infused animals were tolerant to the effects of nicotine on body temperature and rotarod performance and were cross tolerant to the effects of ethanol on body temperature. Ethanol-induced sleep time was decreased in chronic ethanol- but not chronic nicotine-treated mice. Chronic drug treatment did not alter the elimination rate of either drug. Chronic ethanol treatment did not alter the number or affinity of brain nicotinic receptors whereas chronic nicotine treatment elicited an increase in the number of [3H]-nicotine binding sites. Tolerance and cross tolerance between ethanol and nicotine is discussed in terms of potential effects on desensitization of brain nicotinic receptors.

  17. SHADOW GLOBALIZATION

    OpenAIRE

    Larissa Mihaylovna Kapitsa

    2014-01-01

    The article reviews some development trends brought about by globalization, particularly growing tax evasion and tax avoidance, an expansion of illicit financial flows and the proliferation of a global criminal network. The author draws attention to some new phenomena, particularly the cosmopolitanization of some parts of national elites and a deepening divide between national interests and the private interests of elites as a consequence of financial globalization. Modern mass media, both Ru...

  18. Global Mindset

    DEFF Research Database (Denmark)

    Sørensen, Olav Jull

    2016-01-01

    The concept of Global Mindset (GM) – the way to think about the global reality – is on the agenda of multinational companies concomitant with the increase in global complexity, uncertainty and diversity. In spite of a number of studies, the concept is still fluid and far from a managerial.......e. the capability to sense (quickly), reflect (constructively) and act purposefully (for mutual benefit). A case on an MNC is used at the end to show the organizational manifestations of a GM....

  19. TEMPERATURE TOLERANCES AND OSMOREGULATION IN ...

    African Journals Online (AJOL)

    The salinity and temperature tolerances of some burrowing bivalves which occur ... Along most of the estuary the salinity normally remains close to that of seawater (35‰) ...... grapsoid crabs, Hemigrapsus nudus and Hemigrapsus oregonensis.

  20. TOLERANCE OF Abelmoschus esculentus (L

    African Journals Online (AJOL)

    Cletus

    Key words: Tolerance, diesel oil, polluted soil, Abelmoschus esculentus. INTRODUCTION ... errors of the mean values were calculated for the replicate readings and data .... African Schools and Colleges, 2nd Ed. University Press Limited ...

  1. Antibiotic tolerance and microbial biofilms

    DEFF Research Database (Denmark)

    Folkesson, Anders

    Increased tolerance to antimicrobial agents is thought to be an important feature of microbes growing in biofilms. We study the dynamics of antibiotic action within hydrodynamic flow chamber biofilms of Escherichia coli and Pseudomonas aeruginosa using isogenic mutants and fluorescent gene...... expression reporters and we address the question of how biofilm organization affects antibiotic susceptibility. The dynamics of microbial killing is monitored by viable count determination, and confocal laser microscopy. Our work shows that the apparent increased antibiotic tolerance is due to the formation...... of antibiotic tolerant subpopulations within the biofilm. The formation of these subpopulations is highly variable and dependent on the antibiotic used, the biofilm structural organization and the induction of specific tolerance mechanisms....

  2. Cytokine regulation of immune tolerance

    OpenAIRE

    Wu, Jie; Xie, Aini; Chen, Wenhao

    2014-01-01

    The immune system provides defenses against invading pathogens while maintaining immune tolerance to self-antigens. This immune homeostasis is harmonized by the direct interactions between immune cells and the cytokine environment in which immune cells develop and function. Herein, we discuss three non-redundant paradigms by which cytokines maintain or break immune tolerance. We first describe how anti-inflammatory cytokines exert direct inhibitory effects on immune cells to enforce immune ...

  3. Women’s G Tolerance

    Science.gov (United States)

    1986-08-01

    [OCR-garbled abstract; recoverable fragments only:] ... for the groups matched by age (70 pairs), weight (26 pairs), and activity status (84 pairs) ... sickness, uncomfortable feelings of distension in the arms ... a (mass-spring-damper) system having a resonant frequency above about 1 Hz ... straining G tolerance, being dependent on skeletal muscular strength ... cyclic variations in muscular strength and endurance ... of the women's G tolerance study was below 0.1 Hz (11), the production of any significant

  4. Fault-tolerant rotary actuator

    Science.gov (United States)

    Tesar, Delbert

    2006-10-17

    A fault-tolerant actuator module, in a single containment shell, containing two actuator subsystems that are either asymmetrically or symmetrically laid out is provided. Fault tolerance in the actuators of the present invention is achieved by the employment of dual sets of equal resources. Dual resources are integrated into single modules, with each having the external appearance and functionality of a single set of resources.

  5. Behavioral Tolerance to Anticholinergic Agents

    Science.gov (United States)

    1986-11-20

Medicine, 47, 137-141. 7. Kurtz, P.J. (1977) Behavioral and biochemical effects of the carbamate insecticide, mobam. Pharmacology Biochemistry & Behavior...tolerance to marihuana in rats. Pharmacology Biochemistry and Behavior, 1, 73-76. 40. Olson, J. and Carder, B. (1974) Behavioral tolerance to marihuana as a function of amount of prior training. Pharmacology Biochemistry and Behavior, 2, 243-247. 41. Sidman, M. (1960) Tactics of Scientific

  6. Conceptualizing Innovation in Born Global Firms

    DEFF Research Database (Denmark)

    Zijdemans, Erik; Tanev, Stoyan

    2014-01-01

    This research provides insights from recent literature on innovativeness in the environment of born globals. This article will be relevant to researchers interested in born globals and their business environments and, more specifically, the role that innovation plays in their foundation and devel...... of knowledge acquisition, networking capabilities and the lean startup approach in born global innovation. Finally, the article addresses the issue of quantifying and measuring innovativeness....

  7. Iron-Tolerant Cyanobacteria: Ecophysiology and Fingerprinting

    Science.gov (United States)

    Brown, I. I.; Mummey, D.; Lindsey, J.; McKay, D. S.

    2006-01-01

Although the iron-dependent physiology of marine and freshwater cyanobacterial strains has been the focus of extensive study, very few studies dedicated to the physiology and diversity of cyanobacteria inhabiting iron-depositing hot springs have been conducted. One of the few studies that have been conducted [B. Pierson, 1999] found that cyanobacterial members of iron-depositing bacterial mat communities might increase the rate of iron oxidation in situ and that ferrous iron concentrations up to 1 mM significantly stimulated light-dependent consumption of bicarbonate, suggesting a specific role for elevated iron in the photosynthesis of cyanobacteria inhabiting iron-depositing hot springs. Our recent studies pertaining to the diversity and physiology of cyanobacteria populating iron-depositing hot springs in the Greater Yellowstone area (western USA) indicated a number of different isolates exhibiting elevated tolerance to Fe(3+) (up to 1 mM). Moreover, stimulation of growth was observed with increased Fe(3+) (0.02-0.4 mM). Molecular fingerprinting of unialgal isolates revealed a new cyanobacterial genus and species, Chroogloeocystis siderophila, a unicellular cyanobacterium with a significant EPS sheath harboring colloidal Fe(3+) from iron-enriched media. Our preliminary data suggest that some filamentous species of iron-tolerant cyanobacteria are capable of exocytosis of iron precipitated in the cytoplasm. Prior to 2.4 Ga, global oceans were likely significantly enriched in soluble iron [Lindsay et al., 2003], conditions which are not conducive to growth of most contemporary oxygenic cyanobacteria. Thus, iron-tolerant cyanobacteria may have played important physiological and evolutionary roles in Earth's history.

  8. The effect of fasting and body reserves on cold tolerance in 2 pit-building insect predators.

    Science.gov (United States)

    Scharf, Inon; Daniel, Alma; MacMillan, Heath Andrew; Katz, Noa

    2017-06-01

    Pit-building antlions and wormlions are 2 distantly-related insect species, whose larvae construct pits in loose soil to trap small arthropod prey. This convergent evolution of natural histories has led to additional similarities in their natural history and ecology, and thus, these 2 species encounter similar abiotic stress (such as periodic starvation) in their natural habitat. Here, we measured the cold tolerance of the 2 species and examined whether recent feeding or food deprivation, as well as body composition (body mass and lipid content) and condition (quantified as mass-to-size residuals) affect their cold tolerance. In contrast to other insects, in which food deprivation either enhanced or impaired cold tolerance, prolonged fasting had no effect on the cold tolerance of either species, which had similar cold tolerance. The 2 species differed, however, in how cold tolerance related to body mass and lipid content: although body mass was positively correlated with the wormlion cold tolerance, lipid content was a more reliable predictor of cold tolerance in the antlions. Cold tolerance also underwent greater change with ontogeny in wormlions than in antlions. We discuss possible reasons for this lack of effect of food deprivation on both species' cold tolerance, such as their high starvation tolerance (being sit-and-wait predators).

  9. Prediction of Glucose Tolerance without an Oral Glucose Tolerance Test

    Directory of Open Access Journals (Sweden)

    Rohit Babbar

    2018-03-01

Full Text Available Introduction: Impaired glucose tolerance (IGT) is diagnosed by a standardized oral glucose tolerance test (OGTT). However, the OGTT is laborious, and when not performed, glucose tolerance cannot be determined from fasting samples retrospectively. We tested if glucose tolerance status is reasonably predictable from a combination of demographic, anthropometric, and laboratory data assessed at one time point in a fasting state. Methods: Given a set of 22 variables selected upon clinical feasibility, such as sex, age, height, weight, waist circumference, blood pressure, fasting glucose, HbA1c, hemoglobin, mean corpuscular volume, serum potassium, fasting levels of insulin, C-peptide, triglyceride, non-esterified fatty acids (NEFA), proinsulin, prolactin, cholesterol, low-density lipoprotein, HDL, uric acid, liver transaminases, and ferritin, we used supervised machine learning to estimate glucose tolerance status in 2,337 participants of the TUEF study who were recruited before 2012. We tested the performance of 10 different machine learning classifiers on data from 929 participants in the test set who were recruited after 2012. In addition, reproducibility of IGT was analyzed in 78 participants who had 2 repeated OGTTs within 1 year. Results: The most accurate prediction of IGT was reached with the recursive partitioning method (accuracy = 0.78). For all classifiers, mean accuracy was 0.73 ± 0.04. The most important model variable was fasting glucose in all models. Using mean variable importance across all models, fasting glucose was followed by NEFA, triglycerides, HbA1c, and C-peptide. The accuracy of predicting IGT from a previous OGTT was 0.77. Conclusion: Machine learning methods yield moderate accuracy in predicting glucose tolerance from a wide set of clinical and laboratory variables. A substitution of OGTT does not currently seem to be feasible. An important constraint could be the limited reproducibility of glucose tolerance status during a
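The study's best-performing classifier was recursive partitioning, i.e., a decision tree. As a minimal, self-contained stand-in (not the TUEF pipeline), the sketch below fits a depth-one "stump" on synthetic fasting data: it scans candidate thresholds on each variable and keeps the split with the highest training accuracy. All variable names and data here are illustrative assumptions, not the study's.

```python
# Hedged sketch of recursive partitioning for IGT prediction: a one-split
# "stump" on synthetic data. Column 0 plays the role of fasting glucose,
# the dominant predictor reported in the study; everything is made up.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
# Synthetic fasting predictors: "glucose", "NEFA", "triglycerides", "HbA1c"
X = rng.normal(size=(n, 4))
# IGT label driven mostly by column 0 ("fasting glucose"), plus noise
y = (X[:, 0] + 0.3 * X[:, 1] + rng.normal(scale=0.8, size=n) > 1.0).astype(int)
X_tr, y_tr, X_te, y_te = X[:1500], y[:1500], X[1500:], y[1500:]

def fit_stump(X, y):
    """Return (feature, threshold) of the best single split by training accuracy."""
    best = (0, 0.0, 0.0)  # (feature, threshold, accuracy)
    for f in range(X.shape[1]):
        for t in np.quantile(X[:, f], np.linspace(0.05, 0.95, 19)):
            acc = np.mean((X[:, f] > t) == y)
            if acc > best[2]:
                best = (f, t, acc)
    return best[0], best[1]

feat, thr = fit_stump(X_tr, y_tr)
acc = np.mean((X_te[:, feat] > thr) == y_te)
print(f"split on feature {feat}, test accuracy {acc:.2f}")
```

As in the study, the split lands on the "fasting glucose" column because it carries most of the signal; a real tree would recurse on each side of the split.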

  10. Gendering Globalization

    DEFF Research Database (Denmark)

    Siim, Birte

    2009-01-01

The current global financial situation bluntly and brutally brings home the fact that the global and local are closely connected in times of opportunity as well as crises. The articles in this issue of Asia Insights are about the interaction between Asia, particularly China, and the Nordic countries...

  11. Developing Globalization

    DEFF Research Database (Denmark)

    Hansen, Annette Skovsted

    2017-01-01

    This chapter is the first qualitative micro case study of one aspect of globalization: personal networks as a concrete outcome of development assistance spending. The empirical findings related in this paper present circumstantial evidence that Japanese foreign aid has contributed to globalization...

  12. Global Uddannelse

    DEFF Research Database (Denmark)

    Jensen, Niels Rosendal

The anthology deals with "democracy problems in the global context" (Part I) and "democracy problems in education and for public employees" (Part II), tied together by a middle section that reaches toward both poles, the global and the local, by linking them to the relationship between the market

  13. Global Mindsets

    DEFF Research Database (Denmark)

Global Mindsets: Exploration and Perspectives seeks to tackle a topic that is relatively new in research and practice, and is considered by many to be critical for firms seeking to conduct global business. It argues that multiple mindsets exist (across and within organizations), that they operate...... in a global context, and that they are dynamic and undergo change and action. Part of the mindset(s) may depend upon place, situation and context where individuals and organizations operate. The book will examine how the notion of "mindset" is situational and dynamic, especially in a global setting, why...... it is important for future scholars and managers and how it could be conceptualized. Global Mindsets: Exploration and Perspectives is split into two major sections; the first examines where the literature currently is with respect to the knowledge in the field and what conceptual frameworks guide the thinking...

  14. Global warming

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Canada's Green Plan strategy for dealing with global warming is being implemented as a multidepartmental partnership involving all Canadians and the international community. Many of the elements of this strategy are built on an existing base of activities predating the Green Plan. Elements of the strategy include programs to limit emissions of greenhouse gases, such as initiatives to encourage more energy-efficient practices and development of alternate fuel sources; studies and policy developments to help Canadians prepare and adapt to climate change; research on the global warming phenomenon; and stimulation of international action on global warming, including obligations arising out of the Framework Convention on Climate Change. All the program elements have been approved, funded, and announced. Major achievements to date are summarized, including improvements in the Energy Efficiency Act, studies on the socioeconomic impacts of global warming, and participation in monitoring networks. Milestones associated with the remaining global warming initiatives are listed

  15. Quantifying the scale and socioeconomic drivers of bird hunting in Central African forest communities

    NARCIS (Netherlands)

    Whytock, Robin C.; Morgan, Bethan J.; Awa, Taku; Bekokon, Zacharie; Abwe, Ekwoge A.; Buij, Ralph; Virani, Munir; Vickery, Juliet A.; Bunnefeld, Nils

    2018-01-01

    Global biodiversity is threatened by unsustainable exploitation for subsistence and commerce, and tropical forests are facing a hunting crisis. In Central African forests, hunting pressure has been quantified by monitoring changes in the abundance of affected species and by studying wild meat

  16. 75 FR 29908 - Prothioconazole; Pesticide Tolerances

    Science.gov (United States)

    2010-05-28

    .... The straw numerical value (5 ppm) is matched between the U.S. and Codex. The tolerance definition for... lower (0.07 ppm) than the recommended U.S. group tolerance. The 0.07 ppm value is the current U.S. tolerance value for wheat, but will be replaced by the cereal grain group tolerance. Canada does not...

  17. 78 FR 40027 - Novaluron; Pesticide Tolerances

    Science.gov (United States)

    2013-07-03

    ...). This regulation additionally deletes the time- limited tolerance for strawberry, as that tolerance..., pears, potatoes, strawberries, and tomatoes and utilized estimates for PCT for recently registered uses... deletes the time-limited tolerance for strawberry, as that tolerance expired on December 31, 2011. VI...

  18. On flaw tolerance of nacre: a theoretical study

    Science.gov (United States)

    Shao, Yue; Zhao, Hong-Ping; Feng, Xi-Qiao

    2014-01-01

    As a natural composite, nacre has an elegant staggered ‘brick-and-mortar’ microstructure consisting of mineral platelets glued by organic macromolecules, which endows the material with superior mechanical properties to achieve its biological functions. In this paper, a microstructure-based crack-bridging model is employed to investigate how the strength of nacre is affected by pre-existing structural defects. Our analysis demonstrates that owing to its special microstructure and the toughening effect of platelets, nacre has a superior flaw-tolerance feature. The maximal crack size that does not evidently reduce the tensile strength of nacre is up to tens of micrometres, about three orders higher than that of pure aragonite. Through dimensional analysis, a non-dimensional parameter is proposed to quantify the flaw-tolerance ability of nacreous materials in a wide range of structural parameters. This study provides us some inspirations for optimal design of advanced biomimetic composites. PMID:24402917

  19. Quantifying and Explaining Immutability in Scala

    OpenAIRE

    Haller, Philipp; Axelsson, Ludvig

    2017-01-01

    Functional programming typically emphasizes programming with first-class functions and immutable data. Immutable data types enable fault tolerance in distributed systems, and ensure process isolation in message-passing concurrency, among other applications. However, beyond the distinction between reassignable and non-reassignable fields, Scala's type system does not have a built-in notion of immutability for type definitions. As a result, immutability is "by-convention" in Scala, and statisti...

  20. SHADOW GLOBALIZATION

    Directory of Open Access Journals (Sweden)

    Larissa Mihaylovna Kapitsa

    2014-01-01

Full Text Available The article reviews some development trends brought about by globalization, particularly a growing tax evasion and tax avoidance, an expansion of illicit financial flows and the proliferation of a global criminal network. The author draws attention to some new phenomena, particularly the cosmopolitanization of some parts of national elites and a deepening divide between national interests and the private interests of elites as a consequence of financial globalization. Modern mass media, both Russian and foreign, tend to interpret globalization processes exclusively from the position of conformism, and for some researchers globalization became the "sacred cow", which one may only worship. Critical analysis of the processes associated with globalization is given a hostile reception. In response to criticism of globalization, one can hear the very same argument: "globalization is inevitable!" Such a state of affairs, at the very least, causes perplexity. Some of the world development trends observed over the past years raise serious concerns about the security and welfare of the peoples of the world. One such trend has been the globalization of shadow economic activities. Methods of fighting the criminal economy applied in international practice can be grouped into: (1) punitive enforcement (or criminal-legal) methods and (2) socio-economic methods. As the results of various research works show, punitive enforcement methods not supported by socio-economic measures are not effective enough. Toughening the control over criminal economic activities in the absence of preventive and corrective actions aiming to neutralize institutional, social and other stimuli facilitating the criminalization of economic activities can result in large losses of financial assets in the form of mass capital flight

  2. Global Rome

    DEFF Research Database (Denmark)

    Is 21st-century Rome a global city? Is it part of Europe's core or periphery? This volume examines the “real city” beyond Rome's historical center, exploring the diversity and challenges of life in neighborhoods affected by immigration, neoliberalism, formal urban planning, and grassroots social...... movements. The contributors engage with themes of contemporary urban studies–the global city, the self-made city, alternative modernities, capital cities and nations, urban change from below, and sustainability. Global Rome serves as a provocative introduction to the Eternal City and makes an original...

  3. Quantifying graininess of glossy food products

    DEFF Research Database (Denmark)

    Møller, Flemming; Carstensen, Jens Michael

    The sensory quality of yoghurt can be altered when changing the milk composition or processing conditions. Part of the sensory quality may be assessed visually. It is described how a non-contact method for quantifying surface gloss and grains in yoghurt can be made. It was found that the standard...

  4. Quantifying antimicrobial resistance at veal calf farms

    NARCIS (Netherlands)

    Bosman, A.B.; Wagenaar, J.A.; Stegeman, A.; Vernooij, H.; Mevius, D.J.

    2012-01-01

    This study was performed to determine a sampling strategy to quantify the prevalence of antimicrobial resistance on veal calf farms, based on the variation in antimicrobial resistance within and between calves on five farms. Faecal samples from 50 healthy calves (10 calves/farm) were collected. From

  5. QS Spiral: Visualizing Periodic Quantified Self Data

    DEFF Research Database (Denmark)

    Larsen, Jakob Eg; Cuttone, Andrea; Jørgensen, Sune Lehmann

    2013-01-01

    In this paper we propose an interactive visualization technique QS Spiral that aims to capture the periodic properties of quantified self data and let the user explore those recurring patterns. The approach is based on time-series data visualized as a spiral structure. The interactivity includes ...
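The spiral mapping behind such a visualization can be sketched without any plotting library. The code below is our illustration, not the paper's implementation: elapsed hours are mapped onto an Archimedean spiral so that samples taken at the same time of day share the same angle across successive daily turns; the function and parameter names are invented.

```python
# Hedged sketch of a QS-Spiral-style coordinate mapping: angle encodes time
# of day, radius encodes elapsed days, so daily patterns align along rays.
import numpy as np

def spiral_coords(hours_since_start, period_h=24.0, r0=1.0, dr=1.0):
    """Map elapsed hours to (x, y) on an Archimedean spiral.

    One full turn corresponds to one period (default: a day), so events at
    the same time of day share the same angle across turns.
    """
    turns = np.asarray(hours_since_start, dtype=float) / period_h
    theta = 2.0 * np.pi * turns   # angle: position within the period
    r = r0 + dr * turns           # radius: how many periods have elapsed
    return r * np.cos(theta), r * np.sin(theta)

# Two events at 6:00 on consecutive days land at the same angle,
# but at different radii (one turn further out):
x1, y1 = spiral_coords(6.0)    # day 0, 06:00
x2, y2 = spiral_coords(30.0)   # day 1, 06:00
angle1, angle2 = np.arctan2(y1, x1), np.arctan2(y2, x2)
print(abs(float(angle1 - angle2)) < 1e-9, float(np.hypot(x2, y2)) > float(np.hypot(x1, y1)))
```

An interactive version would render each sample as a mark at these coordinates and let the user change the period to probe weekly or monthly recurrences.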

  6. Quantifying recontamination through factory environments - a review

    NARCIS (Netherlands)

    Asselt-den Aantrekker, van E.D.; Boom, R.M.; Zwietering, M.H.; Schothorst, van M.

    2003-01-01

    Recontamination of food products can be the origin of foodborne illnesses and should therefore be included in quantitative microbial risk assessment (MRA) studies. In order to do this, recontamination should be quantified using predictive models. This paper gives an overview of the relevant

  7. Quantifying quantum coherence with quantum Fisher information.

    Science.gov (United States)

    Feng, X N; Wei, L F

    2017-11-14

Quantum coherence is one of the old but always important concepts in quantum mechanics, and now it has been regarded as a necessary resource for quantum information processing and quantum metrology. However, the question of how to quantify quantum coherence has only recently received attention (see, e.g., Baumgratz et al., PRL 113, 140401 (2014)). In this paper we verify that the well-known quantum Fisher information (QFI) can be utilized to quantify quantum coherence, as it satisfies monotonicity under the typical incoherent operations and convexity under the mixing of quantum states. Differing from most purely axiomatic methods, quantifying quantum coherence by QFI could be experimentally testable, as the bound of the QFI is practically measurable. The validity of our proposal is specifically demonstrated with the typical phase-damping and depolarizing evolution processes of a generic single-qubit state, and also by comparing it with the other quantifying methods proposed previously.
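The spectral formula for the QFI makes this proposal easy to check numerically. The sketch below is our illustration, not the authors' code: it computes the QFI of a single qubit with respect to the generator H = σz/2 and verifies that it decreases monotonically under phase damping, one of the incoherent operations the abstract mentions.

```python
# Hedged sketch: single-qubit QFI via the spectral formula
#   F = 2 * sum_{i != j} (l_i - l_j)^2 / (l_i + l_j) * |<i|H|j>|^2,
# checked on a phase-damped |+> state. Generator and damping family are
# our illustrative choices.
import numpy as np

SZ = np.array([[1.0, 0.0], [0.0, -1.0]])
H = SZ / 2.0

def qfi(rho, H):
    """Quantum Fisher information of rho for the unitary family generated by H."""
    vals, vecs = np.linalg.eigh(rho)
    F = 0.0
    for i in range(len(vals)):
        for j in range(len(vals)):
            if i != j and vals[i] + vals[j] > 1e-12:
                hij = vecs[:, i].conj() @ H @ vecs[:, j]
                F += 2.0 * (vals[i] - vals[j]) ** 2 / (vals[i] + vals[j]) * abs(hij) ** 2
    return F

def phase_damped_plus(p):
    """|+><+| after phase damping of strength p: off-diagonals shrink by (1-p)."""
    return np.array([[0.5, 0.5 * (1 - p)], [0.5 * (1 - p), 0.5]])

# For this family the QFI works out to (1-p)^2: maximal for the pure
# superposition, zero for the fully dephased (incoherent) state.
q0, q5, q1 = (qfi(phase_damped_plus(p), H) for p in (0.0, 0.5, 1.0))
print(round(q0, 6), round(q5, 6), round(q1, 6))  # -> 1.0 0.25 0.0
```

The monotone decrease (1.0 → 0.25 → 0.0) under this incoherent channel is exactly the property a coherence measure must satisfy.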

  8. Interbank exposures: quantifying the risk of contagion

    OpenAIRE

    C. H. Furfine

    1999-01-01

    This paper examines the likelihood that failure of one bank would cause the subsequent collapse of a large number of other banks. Using unique data on interbank payment flows, the magnitude of bilateral federal funds exposures is quantified. These exposures are used to simulate the impact of various failure scenarios, and the risk of contagion is found to be economically small.
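The failure-scenario simulation described here can be reduced to a toy round-based cascade. The sketch below is our illustration with made-up numbers, not Furfine's data or model: a bank fails when its loss-given-default-weighted exposures to already-failed counterparties exceed its capital, and losses are propagated until the set of failures stabilizes.

```python
# Hedged toy contagion simulation: propagate an initial bank failure
# through a matrix of bilateral exposures until no further failures occur.
def contagion(exposures, capital, first_failure, lgd=1.0):
    """Return the set of failed banks after the cascade settles.

    exposures[i][j] = amount bank i loses if bank j fails (i's claim on j)
    capital[i]      = loss-absorbing capital of bank i
    lgd             = loss given default (fraction of each claim lost)
    """
    n = len(capital)
    failed = {first_failure}
    changed = True
    while changed:
        changed = False
        for i in range(n):
            if i in failed:
                continue
            loss = lgd * sum(exposures[i][j] for j in failed)
            if loss > capital[i]:
                failed.add(i)
                changed = True
    return failed

# Illustrative numbers: bank 0's failure wipes out bank 1, whose failure
# then topples bank 2; bank 3 holds small, diversified claims and survives.
E = [
    [0, 0, 0, 0],
    [8, 0, 0, 0],   # bank 1 is owed 8 by bank 0
    [0, 6, 0, 0],   # bank 2 is owed 6 by bank 1
    [1, 1, 1, 0],   # bank 3's claims are small and spread out
]
cap = [2, 5, 4, 10]
print(sorted(contagion(E, cap, first_failure=0)))  # -> [0, 1, 2]
```

The paper's empirical finding that contagion risk is "economically small" corresponds, in this toy framing, to real exposure matrices behaving more like bank 3's diversified row than bank 1's concentrated one.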

  9. Power Curve Measurements, quantify the production increase

    DEFF Research Database (Denmark)

    Gómez Arranz, Paula; Vesth, Allan

The purpose of this report is to quantify the production increase on a given turbine with respect to another given turbine. The methodology used is the "side-by-side" comparison method, provided by the client. This method involves the use of two neighboring turbines and it is based...

  10. Quantifying capital goods for waste landfilling

    DEFF Research Database (Denmark)

    Brogaard, Line Kai-Sørensen; Stentsøe, Steen; Willumsen, Hans Christian

    2013-01-01

    Materials and energy used for construction of a hill-type landfill of 4 million m3 were quantified in detail. The landfill is engineered with a liner and leachate collections system, as well as a gas collection and control system. Gravel and clay were the most common materials used, amounting...

  11. Quantifying interspecific coagulation efficiency of phytoplankton

    DEFF Research Database (Denmark)

    Hansen, J.L.S.; Kiørboe, Thomas

    1997-01-01

. nordenskjoeldii. Mutual coagulation between Skeletonema costatum and the non-sticky cells of Ditylum brightwellii also proceeded with half the efficiency of S. costatum alone. The latex beads were suitable to be used as 'standard particles' to quantify the ability of phytoplankton to prime aggregation...

  12. New frontiers of quantified self 2

    DEFF Research Database (Denmark)

    Rapp, Amon; Cena, Federica; Kay, Judy

    2016-01-01

    While the Quantified Self (QS) community is described in terms of "self-knowledge through numbers" people are increasingly demanding value and meaning. In this workshop we aim at refocusing the QS debate on the value of data for providing new services....

  13. Quantifying temporal ventriloquism in audiovisual synchrony perception

    NARCIS (Netherlands)

    Kuling, I.A.; Kohlrausch, A.G.; Juola, J.F.

    2013-01-01

    The integration of visual and auditory inputs in the human brain works properly only if the components are perceived in close temporal proximity. In the present study, we quantified cross-modal interactions in the human brain for audiovisual stimuli with temporal asynchronies, using a paradigm from

  14. Reliability-How to Quantify and Improve?

    Indian Academy of Sciences (India)

Reliability - How to Quantify and Improve? - Improving the Health of Products. N K Srinivasan. General Article, Resonance – Journal of Science Education, Volume 5, Issue 5, May 2000, pp. 55-63.

  15. The global burden of dengue: an analysis from the Global Burden of Disease Study 2013

    NARCIS (Netherlands)

    J.D. Stanaway (Jeffrey D.); D.S. Shepard (Donald); E.A. Undurraga (Eduardo); Halasa, Y.A. (Yara A); L.E. Coffeng (Luc); Brady, O.J. (Oliver J); Hay, S.I. (Simon I); Bedi, N. (Neeraj); I.M. Bensenor (Isabela M.); C.A. Castañeda-Orjuela (Carlos A); T.-W. Chuang (Ting-Wu); K.B. Gibney (Katherine B); Z.A. Memish (Ziad); A. Rafay (Anwar); K.N. Ukwaja (Kingsley N); N. Yonemoto (Naohiro); C.J.L. Murray (Christopher)

    2016-01-01

Background: Dengue is the most common arbovirus infection globally, but its burden is poorly quantified. We estimated dengue mortality, incidence, and burden for the Global Burden of Disease Study 2013. Methods: We modelled mortality from vital registration, verbal autopsy, and

  16. The carbon footprint of global tourism

    Science.gov (United States)

    Lenzen, Manfred; Sun, Ya-Yen; Faturay, Futu; Ting, Yuan-Peng; Geschke, Arne; Malik, Arunima

    2018-06-01

    Tourism contributes significantly to global gross domestic product, and is forecast to grow at an annual 4%, thus outpacing many other economic sectors. However, global carbon emissions related to tourism are currently not well quantified. Here, we quantify tourism-related global carbon flows between 160 countries, and their carbon footprints under origin and destination accounting perspectives. We find that, between 2009 and 2013, tourism's global carbon footprint has increased from 3.9 to 4.5 GtCO2e, four times more than previously estimated, accounting for about 8% of global greenhouse gas emissions. Transport, shopping and food are significant contributors. The majority of this footprint is exerted by and in high-income countries. The rapid increase in tourism demand is effectively outstripping the decarbonization of tourism-related technology. We project that, due to its high carbon intensity and continuing growth, tourism will constitute a growing part of the world's greenhouse gas emissions.
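The origin- versus destination-accounting distinction is simple bookkeeping over a matrix of bilateral emission flows, which a toy example can make concrete. The figures below are invented; only the two allocation rules, summing a country's residents' tourism emissions wherever they travel versus summing emissions occurring on its territory from all visitors, reflect the abstract.

```python
# Hedged toy example of origin vs destination carbon accounting for tourism.
# flows[(o, d)] = emissions embodied in tourism by residents of o visiting d.
flows = {  # arbitrary units, illustrative only
    ("A", "A"): 5.0, ("A", "B"): 2.0,
    ("B", "A"): 1.0, ("B", "B"): 3.0,
}
countries = ("A", "B")

# Origin (residence) perspective: sum over destinations visited.
origin_fp = {c: sum(v for (o, d), v in flows.items() if o == c) for c in countries}
# Destination perspective: sum over visitors received.
dest_fp = {c: sum(v for (o, d), v in flows.items() if d == c) for c in countries}

print(origin_fp)  # -> {'A': 7.0, 'B': 4.0}
print(dest_fp)    # -> {'A': 6.0, 'B': 5.0}
# Both perspectives allocate the same global total, just to different parties.
assert abs(sum(origin_fp.values()) - sum(dest_fp.values())) < 1e-9
```

The paper's finding that the footprint "is exerted by and in high-income countries" means those countries rank high under both allocation rules.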

  17. Global Managers

    DEFF Research Database (Denmark)

    Barakat, Livia L.; Lorenz, Melanie P.; Ramsey, Jase R.

    2016-01-01

    Purpose: – The purpose of this paper is to examine the effect of cultural intelligence (CQ) on the job performance of global managers. Design/methodology/approach: – In total, 332 global managers were surveyed from multinational companies operating in Brazil. The mediating effect of job...... satisfaction was tested on the CQ-job performance relationship. Findings: – The findings suggest that job satisfaction transmits the effect of CQ to job performance, such that global managers high in CQ exhibit more job satisfaction in an international setting, and therefore perform better at their jobs....... Practical implications: – Results imply that global managers should increase their CQ in order to improve their job satisfaction and ultimately perform better in an international context. Originality/value: – The authors make three primary contributions to the international business literature. First...

  18. Hazard tolerance of spatially distributed complex networks

    International Nuclear Information System (INIS)

    Dunn, Sarah; Wilkinson, Sean

    2017-01-01

In this paper, we present a new methodology for quantifying the reliability of complex systems, using techniques from network graph theory. In recent years, network theory has been applied to many areas of research and has allowed us to gain insight into the behaviour of real systems that would otherwise be difficult or impossible to analyse, for example increasingly complex infrastructure systems. Although this work has made great advances in understanding complex systems, the vast majority of these studies only consider a system's topological reliability and largely ignore its spatial component. It has been shown that the omission of this spatial component can have potentially devastating consequences. In this paper, we propose a number of algorithms for generating a range of synthetic spatial networks with different topological and spatial characteristics and identify real-world networks that share the same characteristics. We assess the influence of nodal location and the spatial distribution of highly connected nodes on hazard tolerance by comparing our generic networks to benchmark networks. We discuss the relevance of these findings for real-world networks and show that the combination of topological and spatial configurations renders many real-world networks vulnerable to certain spatial hazards. - Highlights: • We develop a method for quantifying the reliability of real-world systems. • We assess the spatial resilience of synthetic spatially distributed networks. • We form algorithms to generate spatial scale-free and exponential networks. • We show how these "synthetic" networks are proxies for real world systems. • Conclude that many real world systems are vulnerable to spatially coherent hazard.
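The core experiment, removing spatially clustered nodes rather than random ones, can be sketched in a few lines. The code below is our illustration, not the authors' algorithms: it builds a random geometric network, applies a circular hazard footprint, and compares the giant connected component before and after; all parameters are invented.

```python
# Hedged sketch of a spatially coherent hazard on a random geometric network:
# nodes connect when closer than CONNECT_R; the hazard removes every node
# inside a circular footprint; resilience is read off the giant component.
import random
from collections import defaultdict, deque

random.seed(42)
N, CONNECT_R, HAZARD_CENTER, HAZARD_R = 200, 0.15, (0.5, 0.5), 0.25

pos = {i: (random.random(), random.random()) for i in range(N)}
adj = defaultdict(set)
for i in range(N):
    for j in range(i + 1, N):
        if (pos[i][0] - pos[j][0]) ** 2 + (pos[i][1] - pos[j][1]) ** 2 <= CONNECT_R ** 2:
            adj[i].add(j)
            adj[j].add(i)

def giant_component(nodes, adj):
    """Size of the largest connected component restricted to `nodes` (BFS)."""
    nodes, seen, best = set(nodes), set(), 0
    for s in nodes:
        if s in seen:
            continue
        comp, queue = 0, deque([s])
        seen.add(s)
        while queue:
            u = queue.popleft()
            comp += 1
            for v in adj[u]:
                if v in nodes and v not in seen:
                    seen.add(v)
                    queue.append(v)
        best = max(best, comp)
    return best

before = giant_component(range(N), adj)
survivors = [i for i in range(N)
             if (pos[i][0] - HAZARD_CENTER[0]) ** 2 + (pos[i][1] - HAZARD_CENTER[1]) ** 2 > HAZARD_R ** 2]
after = giant_component(survivors, adj)
print(before, after)  # the spatially coherent hazard shrinks the giant component
```

Repeating the removal with the same number of randomly scattered nodes typically fragments the network far less, which is the contrast between topological and spatial hazard tolerance the paper draws.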

  19. Globalization & technology

    DEFF Research Database (Denmark)

    Narula, Rajneesh

    Technology and globalization are interdependent processes. Globalization has a fundamental influence on the creation and diffusion of technology, which, in turn, affects the interdependence of firms and locations. This volume examines the international aspect of this interdependence at two levels...... of innovation" understanding of learning. Narula and Smith reconcile an important paradox. On the one hand, locations and firms are increasingly interdependent through supranational organisations, regional integration, strategic alliances, and the flow of investments, technologies, ideas and people...

  20. Another globalization

    OpenAIRE

    Prof. Ph.D. Ion Bucur

    2007-01-01

The anachronisms and failures of present-day globalization, as well as the flawed system of global governance, have stimulated debates on identifying a more equitable form of globalization that would accelerate economic growth and reduce poverty. The deficiencies of the present international economic institutions, especially their lack of transparency and democratic accountability, urgently call for the reform of ...

  1. Gendered globalization

    DEFF Research Database (Denmark)

    Milwertz, Cecilia Nathansen; Cai, Yiping

    2017-01-01

    Both the People’s Republic of China (PRC) and Nordic countries (Sweden, Iceland, Denmark, Norway and Finland) view gender equality as a social justice issue and are politically committed towards achieving gender equality nationally and internationally. Since China has taken a proactive position...... on globalization and global governance, gender equality is possibly an area that China may wish to explore in collaboration with the Nordic countries....

  2. B cells in operational tolerance.

    Science.gov (United States)

    Chesneau, M; Danger, R; Soulillou, J-P; Brouard, S

    2018-02-16

Transplantation is currently the therapy of choice for end-stage organ failure, even though it requires long-term immunosuppressive therapy, with its numerous side effects, for acceptance of the transplanted organ. In rare cases, however, patients develop operational tolerance, that is, graft survival without immunosuppression. Studies conducted on these patients reveal genetic, phenotypic, and functional signatures. They provide a better understanding of the immunological mechanisms involved in operational tolerance and define biomarkers that could be used to adapt immunosuppressive treatment to the individual, safely reduce immunosuppression doses, and ideally and safely guide immunosuppression withdrawal. This review summarizes studies that suggest a role for B cells as biomarkers of operational tolerance and discusses the use of B cells as a predictive tool for immunologic risk. Copyright © 2018. Published by Elsevier Inc.

  3. Immune tolerance in radiation chimeras

    International Nuclear Information System (INIS)

    Awaya, Kazuhiko; Kuniki, Hiromichi; Neki, Miyuki

    1978-01-01

The establishment of immune tolerance in radiation chimeras and the mechanism maintaining it are discussed. Semiallogeneic radiation chimeras are mostly long-lived, and the hematopoietic organ of such an individual consists mainly of cells derived from the marrow donor, i.e., F1-type cells. F1-type lymphocytes can distinguish parental-strain cells from their own. In these chimeras, an F1 skin graft remains intact as long as the host is alive, showing immune tolerance that is effective throughout its life. In the establishment and maintenance of this immune tolerance, a suppressing mechanism of host type or F1 type seems to be involved. The allogeneic radiation chimera has a very poor long-term survival rate compared with that of the semiallogeneic radiation chimera. Efforts are now being made from the immunological point of view to raise this survival rate. (Ueda, J.)

  4. Global warming

    CERN Document Server

    Hulme, M

    1998-01-01

Global warming - like deforestation, the ozone hole and the loss of species - has become one of the late 20th-century icons of global environmental damage. The threat, if not the reality, of such global climate change has motivated governments, businesses and environmental organisations to take serious action to try to achieve control of the future climate. This culminated last December in Kyoto in the agreement on a legally binding climate protocol. In this series of three lectures I will provide a perspective on the phenomenon of global warming that accepts the scientific basis for our concern, but one that also recognises the dynamic interaction between climate and society that has always existed. The future will be no different. The challenge of global warming is not to pretend it is not happening (as with some pressure groups), nor to pretend it threatens global civilisation (as with other pressure groups), and it is not even a challenge to try and stop it from happening - we are too far down the ro...

  5. Design Life Level: Quantifying risk in a changing climate

    Science.gov (United States)

    Rootzén, Holger; Katz, Richard W.

    2013-09-01

    In the past, the concepts of return levels and return periods have been standard and important tools for engineering design. However, these concepts are based on the assumption of a stationary climate and do not apply to a changing climate, whether local or global. In this paper, we propose a refined concept, Design Life Level, which quantifies risk in a nonstationary climate and can serve as the basis for communication. In current practice, typical hydrologic risk management focuses on a standard (e.g., in terms of a high quantile corresponding to the specified probability of failure for a single year). Nevertheless, the basic information needed for engineering design should consist of (i) the design life period (e.g., the next 50 years, say 2015-2064); and (ii) the probability (e.g., 5% chance) of a hazardous event (typically, in the form of the hydrologic variable exceeding a high level) occurring during the design life period. Capturing both of these design characteristics, the Design Life Level is defined as an upper quantile (e.g., 5%) of the distribution of the maximum value of the hydrologic variable (e.g., water level) over the design life period. We relate this concept and variants of it to existing literature and illustrate how they, and some useful complementary plots, may be computed and used. One practically important consideration concerns quantifying the statistical uncertainty in estimating a high quantile under nonstationarity.
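The Design Life Level lends itself to a short numerical illustration. The sketch below estimates it by Monte Carlo as the upper 5% quantile of the maximum annual value over a 50-year design life, assuming, purely for illustration, a Gumbel distribution whose location parameter drifts linearly upward; all parameter values are invented, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def design_life_level(mu0, trend, beta, years, risk=0.05, n_sim=100_000):
    """Upper `risk` quantile of the distribution of the maximum annual
    value over the design life, under an assumed Gumbel model whose
    location drifts linearly (a stand-in for a nonstationary climate)."""
    t = np.arange(years)
    mu = mu0 + trend * t                       # nonstationary location
    annual = rng.gumbel(loc=mu, scale=beta, size=(n_sim, years))
    life_max = annual.max(axis=1)              # maximum over the design life
    return float(np.quantile(life_max, 1.0 - risk))

# e.g. a 50-year design life with a 5% chance of the hazardous event
dll = design_life_level(mu0=10.0, trend=0.02, beta=1.5, years=50)
```

With `trend=0` this reduces to the classical stationary return-level calculation; a positive trend raises the Design Life Level accordingly.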

  6. Methods to establish flaw tolerances

    International Nuclear Information System (INIS)

    Varga, T.

    1978-01-01

    Three conventional methods used to establish flaw tolerances are compared with new approaches using fracture mechanics. The conventional methods are those based on (a) non-destructive testing methods; (b) fabrication and quality assurance experience; and (c) service and damage experience. Pre-requisites of fracture mechanics methods are outlined, and summaries given of linear elastic mechanics (LEFM) and elastoplastic fracture mechanics (EPFM). The latter includes discussion of C.O.D.(crack opening displacement), the J-integral and equivalent energy. Proposals are made for establishing flaw tolerances. (U.K.)

  7. Plant Distribution Data Show Broader Climatic Limits than Expert-Based Climatic Tolerance Estimates.

    Directory of Open Access Journals (Sweden)

    Caroline A Curtis

Full Text Available Although increasingly sophisticated environmental measures are being applied to species distribution models, the focus remains on using climatic data to provide estimates of habitat suitability. Climatic tolerance estimates based on expert knowledge are available for a wide range of plants via the USDA PLANTS database. We aim to test how climatic tolerance inferred from plant distribution records relates to tolerance estimated by experts. Further, we use this information to identify circumstances when species distributions are more likely to approximate climatic tolerance. We compiled expert knowledge estimates of minimum and maximum precipitation and minimum temperature tolerance for over 1800 conservation plant species from the 'plant characteristics' information in the USDA PLANTS database. We derived climatic tolerance from distribution data downloaded from the Global Biodiversity Information Facility (GBIF) and corresponding climate from WorldClim. We compared expert-derived climatic tolerance to empirical estimates to find the difference between their inferred climate niches (ΔCN), and tested whether ΔCN was influenced by growth form or range size. Climate niches calculated from distribution data were significantly broader than expert-based tolerance estimates (Mann-Whitney p values << 0.001). The average plant could tolerate 24 mm lower minimum precipitation, 14 mm higher maximum precipitation, and 7 °C lower minimum temperatures based on distribution data relative to expert-based tolerance estimates. Species with larger ranges had greater ΔCN for minimum precipitation and minimum temperature. For maximum precipitation and minimum temperature, forbs and grasses tended to have larger ΔCN, while grasses and trees had larger ΔCN for minimum precipitation. Our results show that climatic tolerances derived from distribution data are consistently broader than USDA PLANTS experts' knowledge and likely provide more robust estimates of climatic tolerance, especially for
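The kind of comparison reported here can be illustrated with synthetic data. The sketch below uses a textbook large-sample Mann-Whitney U test (no tie correction) on invented niche-breadth values; neither the numbers nor the implementation come from the study itself.

```python
import numpy as np

rng = np.random.default_rng(7)

def mann_whitney_u(x, y):
    """U statistic (number of (x_i, y_j) pairs with x_i > y_j, +0.5 per
    tie) and its large-sample z-score under the null of equal
    distributions; ties are counted but not variance-corrected."""
    u = sum(float((xi > y).sum()) + 0.5 * float((xi == y).sum()) for xi in x)
    n1, n2 = len(x), len(y)
    mu = n1 * n2 / 2.0
    sigma = (n1 * n2 * (n1 + n2 + 1) / 12.0) ** 0.5
    return u, (u - mu) / sigma

# invented niche breadths (degrees C): distribution-based estimates are
# made broader than expert-based ones, mimicking the study's direction of effect
expert = rng.normal(30.0, 5.0, size=500)
empirical = rng.normal(37.0, 5.0, size=500)
u, z = mann_whitney_u(empirical, expert)
```

A large positive z-score here corresponds to the study's finding that empirically derived niches are systematically broader than expert estimates.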

  8. Tolerating extremism : to what extent should intolerance be tolerated?

    NARCIS (Netherlands)

    Guiora, Amos Neuser

    2013-01-01

    In discussing extremism, the key questions are: to whom is a duty owed and what are the limits of intolerance that are to be tolerated? Answering these questions requires examining limits and rights; analyzing them in the context of extremism is the ‘core’ of this book. While freedom of speech and

  9. Stage- and sex-specific heat tolerance in the yellow dung fly Scathophaga stercoraria

    OpenAIRE

    Blanckenhorn Wolf U.; Gautier Roland; Nick Marcel; Puniamoorthy Nalini; Schäfer Martin A.

    2014-01-01

Thermal tolerance varies at all hierarchical levels of biological organization: among species, populations, and individuals, and even within individuals. Age- or developmental-stage-specific and sex-specific thermal effects have received relatively little attention in the literature, despite being crucial for understanding thermal adaptation in nature and responses to global warming. We document stage- and sex-specific heat tolerance in the yellow dung fly Scathophaga stercoraria (Diptera: Scathophagidae), a...

  10. Tropospheric Ozone Assessment Report: Assessment of global-scale model performance for global and regional ozone distributions, variability, and trends

    Directory of Open Access Journals (Sweden)

    P. J. Young

    2018-01-01

    the problem being addressed, whether biases can be tolerated or corrected, whether the model is appropriately constituted, and whether there is a way to satisfactorily quantify the uncertainty.

  11. Global entanglement in multiparticle systems

    International Nuclear Information System (INIS)

    Meyer, David A.; Wallach, Nolan R.

    2002-01-01

We define a polynomial measure of multiparticle entanglement which is scalable, i.e., which applies to any number of spin-1/2 particles. By evaluating it for three-particle states, for eigenstates of the one-dimensional Heisenberg antiferromagnet, and on quantum error-correcting code subspaces, we illustrate the extent to which it quantifies global entanglement. We also apply it to track the evolution of entanglement during a quantum computation
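One standard way to evaluate this measure (commonly known as the Meyer-Wallach Q) is through single-qubit purities, via the identity Q = 2(1 − (1/n) Σ_k Tr ρ_k²). The minimal sketch below assumes that formulation and checks it on a GHZ state and a product state.

```python
import numpy as np

def reduced_purity(psi, k, n):
    """Purity Tr(rho_k^2) of qubit k's reduced state, for an n-qubit
    pure state given as a length-2**n amplitude vector."""
    amp = np.moveaxis(psi.reshape([2] * n), k, 0).reshape(2, -1)
    rho = amp @ amp.conj().T                   # 2x2 reduced density matrix
    return float(np.real(np.trace(rho @ rho)))

def meyer_wallach_Q(psi):
    """Q = 2 * (1 - average single-qubit purity)."""
    n = int(np.log2(psi.size))
    return 2.0 * (1.0 - np.mean([reduced_purity(psi, k, n) for k in range(n)]))

# GHZ state (|000> + |111>)/sqrt(2): globally entangled, Q = 1
ghz = np.zeros(8); ghz[0] = ghz[7] = 1 / np.sqrt(2)
# product state |000>: no entanglement, Q = 0
prod = np.zeros(8); prod[0] = 1.0
```

Because Q depends only on the n single-qubit reduced states, it needs no multipartite partitioning, which is what makes the measure scalable to any number of particles.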

  12. Quantifying Stock Return Distributions in Financial Markets.

    Science.gov (United States)

    Botta, Federico; Moat, Helen Susannah; Stanley, H Eugene; Preis, Tobias

    2015-01-01

Being able to quantify the probability of large price changes in stock markets is of crucial importance in understanding financial crises that affect the lives of people worldwide. Large changes in stock market prices can arise abruptly, within a matter of minutes, or develop across much longer time scales. Here, we analyze a dataset comprising the stocks forming the Dow Jones Industrial Average at second-by-second resolution in the period from January 2008 to July 2010 in order to quantify the distribution of changes in market prices at a range of time scales. We find that the tails of the distributions of logarithmic price changes, or returns, exhibit power-law decays for time scales ranging from 300 seconds to 3600 seconds. For larger time scales, we find that the distributions' tails exhibit exponential decay. Our findings may inform the development of models of market behavior across varying time scales.
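A common way to quantify such power-law tails is the Hill estimator. The sketch below applies it to synthetic heavy-tailed returns drawn from a Pareto-type distribution, not to the Dow Jones dataset analyzed in the paper; the tail fraction k is an arbitrary illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(0)

def hill_estimator(returns, k):
    """Hill estimate of the tail index alpha, using the k largest
    absolute returns (assumes P(|r| > x) ~ x**-alpha in the tail)."""
    x = np.sort(np.abs(returns))[::-1]         # descending order statistics
    return k / np.sum(np.log(x[:k] / x[k]))

# synthetic returns with a Pareto-type tail, alpha = 3
alpha_true = 3.0
n = 200_000
r = rng.pareto(alpha_true, size=n) * rng.choice([-1.0, 1.0], size=n)
alpha_hat = hill_estimator(r, k=2000)          # should land near alpha_true
```

In practice the estimate is plotted against k (a "Hill plot") to check that it stabilizes before choosing the tail fraction.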

  13. A masking index for quantifying hidden glitches

    OpenAIRE

    Berti-Equille, Laure; Loh, J. M.; Dasu, T.

    2015-01-01

    Data glitches are errors in a dataset. They are complex entities that often span multiple attributes and records. When they co-occur in data, the presence of one type of glitch can hinder the detection of another type of glitch. This phenomenon is called masking. In this paper, we define two important types of masking and propose a novel, statistically rigorous indicator called masking index for quantifying the hidden glitches. We outline four cases of masking: outliers masked by missing valu...

  14. How are the catastrophical risks quantifiable

    International Nuclear Information System (INIS)

    Chakraborty, S.

    1985-01-01

For the assessment and evaluation of industrial risks, the question must be asked: to what extent are catastrophic risks quantifiable? Typical real catastrophic risks and risk assessments based on modelling assumptions have been set against each other in order to put the risks into proper perspective. However, society is risk-averse when there is a catastrophic potential for severe accidents in a large-scale industrial facility, even though the probability of occurrence is extremely low. (orig.) [de

  15. Quantifying Anthropogenic Stress on Groundwater Resources

    OpenAIRE

    Ashraf, Batool; AghaKouchak, Amir; Alizadeh, Amin; Mousavi Baygi, Mohammad; R. Moftakhari, Hamed; Mirchi, Ali; Anjileli, Hassan; Madani, Kaveh

    2017-01-01

    This study explores a general framework for quantifying anthropogenic influences on groundwater budget based on normalized human outflow (hout) and inflow (hin). The framework is useful for sustainability assessment of groundwater systems and allows investigating the effects of different human water abstraction scenarios on the overall aquifer regime (e.g., depleted, natural flow-dominated, and human flow-dominated). We apply this approach to selected regions in the USA, Germany and Iran to e...

  16. Tolerance of snakes to hypergravity

    Science.gov (United States)

    Lillywhite, H. B.; Ballard, R. E.; Hargens, A. R.

    1996-01-01

Sensitivity of carotid blood flow to increased gravitational force acting in the head-to-tail direction (+Gz) was studied in diverse species of snakes hypothesized to show adaptive variation of response. Tolerance to increased gravity was measured as the maximum graded acceleration force at which carotid blood flow ceased and was shown to vary according to the gravitational adaptation of species as defined by their ecology and behavior. Multiple regression analysis showed that gravitational habitat, but not body length, had a significant effect on Gz tolerance. At the extremes, carotid blood flow decreased in response to increasing G force and approached zero near +1 Gz in aquatic and ground-dwelling species, whereas in climbing species carotid flow was maintained at forces in excess of +2 Gz. Tolerant (arboreal) species were able to withstand hypergravic forces of +2 to +3 Gz for periods up to 1 h without cessation of carotid blood flow or loss of body movement and tongue flicking. The data suggest that the relatively tight skin characteristic of tolerant species provides a natural antigravity suit and is of prime importance in counteracting Gz stress on blood circulation.

  17. Assessing Your Board's Risk Tolerance

    Science.gov (United States)

    Griswold, John S.; Jarvis, William F.

    2014-01-01

    In the wake of the financial crisis, trustees of many endowed nonprofit institutions realized that their portfolio was riskier than they thought and their own ability to tolerate loss wasn't as strong as they imagined. What can board and investment committee members do to improve their ability to assess their--and their institution's--capacity for…

  18. Toleration, Multiculturalism and Mistaken Belief

    Science.gov (United States)

    Standish, Paul

    2006-01-01

    Doubts have been expressed about the virtue of toleration, especially in view of what some have seen as its complicity with a morality of anything goes. More rigorous arguments have been provided by Peter Gardner and Harvey Siegel against the relativism evident in certain versions of multiculturalism and in the new religious studies. This article…

  19. Global Issues

    Energy Technology Data Exchange (ETDEWEB)

    Seitz, J.L.

    2001-10-15

Global Issues is an introduction to the nature and background of some of the central issues - economic, social, political, environmental - of modern times. This new edition has been fully updated throughout and features expanded sections on issues such as global warming, biotechnology, and energy. It covers a range of perspectives on a variety of societies, developed and developing, is extensively illustrated with diagrams and photographs, contains guides to further reading, media, and internet resources, and includes suggestions for discussing and studying the material. (author)

  20. Global Inequality

    DEFF Research Database (Denmark)

    Niño-Zarazúa, Miguel; Roope, Laurence; Tarp, Finn

    2017-01-01

    This paper measures trends in global interpersonal inequality during 1975–2010 using data from the most recent version of the World Income Inequality Database (WIID). The picture that emerges using ‘absolute,’ and even ‘centrist’ measures of inequality, is very different from the results obtained...... using standard ‘relative’ inequality measures such as the Gini coefficient or Coefficient of Variation. Relative global inequality has declined substantially over the decades. In contrast, ‘absolute’ inequality, as captured by the Standard Deviation and Absolute Gini, has increased considerably...... and unabated. Like these ‘absolute’ measures, our ‘centrist’ inequality indicators, the Krtscha measure and an intermediate Gini, also register a pronounced increase in global inequality, albeit, in the case of the latter, with a decline during 2005 to 2010. A critical question posed by our findings is whether...

  1. Global Inequality

    DEFF Research Database (Denmark)

    Niño-Zarazúa, Miguel; Roope, Laurence; Tarp, Finn

    2017-01-01

    This paper measures trends in global interpersonal inequality during 1975–2010 using data from the most recent version of the World Income Inequality Database (WIID). The picture that emerges using ‘absolute,’ and even ‘centrist’ measures of inequality, is very different from the results obtained...... by centrist measures such as the Krtscha, could return to 1975 levels, at today's domestic and global per capita income levels, but this would require quite dramatic structural reforms to reduce domestic inequality levels in most countries....... using standard ‘relative’ inequality measures such as the Gini coefficient or Coefficient of Variation. Relative global inequality has declined substantially over the decades. In contrast, ‘absolute’ inequality, as captured by the Standard Deviation and Absolute Gini, has increased considerably...

  2. Global Programs

    Science.gov (United States)

    Lindberg Christensen, Lars; Russo, P.

    2009-05-01

IYA2009 is a global collaboration between almost 140 nations and more than 50 international organisations sharing the same vision. Besides the common brand, mission, vision and goals, the IAU established eleven Cornerstone programmes to support the different IYA2009 stakeholders in organizing events and activities under a common umbrella. These are global activities centred on specific themes and are aligned with IYA2009's main goals. Whether it is the support and promotion of women in astronomy, the preservation of dark-sky sites around the world, or educating and explaining the workings of the Universe to millions, the eleven Cornerstones are key elements in the success of IYA2009. However, the process of implementing global projects across cultural boundaries is challenging and needs central coordination to preserve the pre-established goals. During this talk we will examine the ups and downs of coordinating such a project and present an overview of the principal achievements of the Cornerstones so far.

  3. Global rotation

    International Nuclear Information System (INIS)

    Rosquist, K.

    1980-01-01

Global rotation in cosmological models is defined on an observational basis. A theorem is proved saying that, for rigid motion, the global rotation is equal to the ordinary local vorticity. The global rotation is calculated in the space-time homogeneous class III models, with Gödel's model as a special case. It is shown that, with the exception of Gödel's model, the rotation in these models becomes infinite for finite affine parameter values. In some directions the rotation changes sign and becomes infinite in a direction opposite to the local vorticity. The points of infinite rotation are identified as conjugate points along the null geodesics. The physical interpretation of the infinite rotation is discussed, and a comparison with the behaviour of the area distance at conjugate points is given. (author)

  4. Fighting the Global War on Terror Tolerably: Augmenting the Global Counter Insurgency Strategy with Surrogates

    Science.gov (United States)

    2007-03-27

their cattle. It was left to the BATT to determine how to overcome these predicaments; from forgoing the element of surprise and demonstrating the CAS to planning and executing what might be the first cattle drive to be conducted with CAS and artillery providing security. For all ... demonstrated that the Firqat were not pawns of the British. The relegation of the Harkis to a constant subordinate role effectively castrated them in the

  5. Another globalization

    Directory of Open Access Journals (Sweden)

    Prof. Ph.D. Ion Bucur

    2007-05-01

Full Text Available Noting the anachronisms and failures of present-day globalization, as well as the vitiated system of worldwide governance, has stimulated debates over identifying a more equitable form of globalization that would favor the acceleration of economic growth and the reduction of poverty. The deficiencies of the present international economic institutions, especially their lack of transparency and democratic accountability, urgently demand the reform of the architecture of the international institutional system and the promotion of economic policies that ensure the stability of the world economy and improve international equity.

  6. Measuring Globalization

    OpenAIRE

    Andersen, Torben M.; Herbertsson, Tryggvi Thor

    2003-01-01

The multivariate technique of factor analysis is used to combine several indicators of economic integration and international transactions into a single measure or index of globalization. The index is an alternative to the simple measure of openness based on trade, and it produces a ranking of countries over time for 23 OECD countries. Ireland is ranked as the most globalized country during the 1990s, while the UK was at the top during the 1980s. Some of the most notable changes in the rank...
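As a rough illustration of the idea, the sketch below combines several indicators into a single index by projecting them onto the leading eigenvector of their correlation matrix (a one-factor, PCA-style stand-in for the paper's factor analysis); the indicator values and the implied common factor are entirely synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic indicators for 23 countries (e.g. trade openness, FDI flows,
# portfolio flows, income payments -- the names are illustrative only);
# columns 0-2 are made to co-move with a common "globalization" factor
X = rng.normal(size=(23, 4))
X[:, 1] = 0.8 * X[:, 0] + 0.2 * X[:, 1]
X[:, 2] = 0.7 * X[:, 0] + 0.3 * X[:, 2]

def first_factor_index(X):
    """Standardize the indicators and project them onto the eigenvector
    of the correlation matrix with the largest eigenvalue."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.corrcoef(Z.T))
    return Z @ eigvecs[:, -1]      # eigh sorts eigenvalues ascending

index = first_factor_index(X)
ranking = np.argsort(index)[::-1]  # most "globalized" country first
```

The eigenvector's sign is arbitrary, so in practice the index is oriented so that higher values mean more integration before producing the country ranking.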

  7. Going global

    International Nuclear Information System (INIS)

    Meade, W.; Poirier, J.L.

    1992-01-01

    This article discusses the global market for independent power projects and the increased competition and strategic alliances that are occurring to take advantage of the increasing demand. The topics of the article include the amount of involvement of US companies in the global market, the forces driving the market toward independent power, markets in the United Kingdom, North America, Turkey, Central America, South America, the Caribbean, Europe, the Federal Republic of Germany, India, the former Eastern European countries, Asia and the Pacific nations, and niche markets

  8. Quantifying uncertainties of permafrost carbon–climate feedbacks

    Directory of Open Access Journals (Sweden)

    E. J. Burke

    2017-06-01

Full Text Available The land surface models JULES (Joint UK Land Environment Simulator, two versions) and ORCHIDEE-MICT (Organizing Carbon and Hydrology in Dynamic Ecosystems), each with a revised representation of permafrost carbon, were coupled to the Integrated Model Of Global Effects of climatic aNomalies (IMOGEN) intermediate-complexity climate and ocean carbon uptake model. IMOGEN calculates atmospheric carbon dioxide (CO2) and local monthly surface climate for a given emission scenario, with the land-atmosphere CO2 flux exchange from either JULES or ORCHIDEE-MICT. These simulations include feedbacks associated with permafrost carbon changes in a warming world. Both IMOGEN-JULES and IMOGEN-ORCHIDEE-MICT were forced by historical and three alternative future-CO2-emission scenarios. Those simulations were performed for different climate sensitivities and regional climate change patterns based on 22 different Earth system models (ESMs) used for CMIP3 (phase 3 of the Coupled Model Intercomparison Project), allowing us to explore climate uncertainties in the context of permafrost carbon-climate feedbacks. Three future emission scenarios consistent with three representative concentration pathways were used: RCP2.6, RCP4.5 and RCP8.5. Paired simulations with and without frozen carbon processes were required to quantify the impact of the permafrost carbon feedback on climate change. The additional warming from the permafrost carbon feedback is between 0.2 and 12 % of the change in the global mean temperature (ΔT) by the year 2100 and 0.5 and 17 % of ΔT by 2300, with these ranges reflecting differences in land surface models, climate models and emissions pathway. As a percentage of ΔT, the permafrost carbon feedback has a greater impact on the low-emissions scenario (RCP2.6) than on the higher-emissions scenarios, suggesting that permafrost carbon should be taken into account when evaluating scenarios of heavy mitigation and stabilization

  9. The 'sensory tolerance limit': A hypothetical construct determining exercise performance?

    Science.gov (United States)

    Hureau, Thomas J; Romer, Lee M; Amann, Markus

    2018-02-01

    Neuromuscular fatigue compromises exercise performance and is determined by central and peripheral mechanisms. Interactions between the two components of fatigue can occur via neural pathways, including feedback and feedforward processes. This brief review discusses the influence of feedback and feedforward mechanisms on exercise limitation. In terms of feedback mechanisms, particular attention is given to group III/IV sensory neurons which link limb muscle with the central nervous system. Central corollary discharge, a copy of the neural drive from the brain to the working muscles, provides a signal from the motor system to sensory systems and is considered a feedforward mechanism that might influence fatigue and consequently exercise performance. We highlight findings from studies supporting the existence of a 'critical threshold of peripheral fatigue', a previously proposed hypothesis based on the idea that a negative feedback loop operates to protect the exercising limb muscle from severe threats to homeostasis during whole-body exercise. While the threshold theory remains to be disproven within a given task, it is not generalisable across different exercise modalities. The 'sensory tolerance limit', a more theoretical concept, may address this issue and explain exercise tolerance in more global terms and across exercise modalities. The 'sensory tolerance limit' can be viewed as a negative feedback loop which accounts for the sum of all feedback (locomotor muscles, respiratory muscles, organs, and muscles not directly involved in exercise) and feedforward signals processed within the central nervous system with the purpose of regulating the intensity of exercise to ensure that voluntary activity remains tolerable.

  10. Global Games

    NARCIS (Netherlands)

    van Bottenburg, Maarten

    2001-01-01

    Why is soccer the sport of choice in South America, while baseball has soared to popularity in the Carribean? How did cricket become India's national sport, while China is a stronghold of table tennis? In Global Games, Maarten van Bottenburg asserts that it is the 'hidden competition' of social and

  11. Going global?

    DEFF Research Database (Denmark)

    Fejerskov, Adam Moe; Rasmussen, Christel

    2016-01-01

    occurred at a more micro level. This article explores this issue by studying the international activities of Danish foundations. It finds that grant-making on global issues is increasing, and that several foundations have undergone transformations in their approach to grantmaking, making them surprisingly...

  12. Justice Globalism

    NARCIS (Netherlands)

    Wilson, Erin; Steger, Manfred; Siracusa, Joseph; Battersby, Paul

    2014-01-01

    The pursuit of a global order founded on universal rules extends beyond economics into the normative spheres of law, politics and justice. Justice globalists claim universal principles applicable to all societies irrespective of religion or ideology. This view privileges human rights, democracy and

  13. Physiological determinants of human acute hypoxia tolerance.

    Science.gov (United States)

    2013-11-01

Introduction. We investigated possible physiological determinants of variability in hypoxia tolerance in subjects given a 5-minute normobaric exposure to 25,000 ft equivalent. Physiological tolerance to hypoxia was defined as the magnitude of...

  14. Persistence and drug tolerance in pathogenic yeast

    DEFF Research Database (Denmark)

    Bojsen, Rasmus Kenneth; Regenberg, Birgitte; Folkesson, Sven Anders

    2017-01-01

In this review, we briefly summarize the current understanding of how fungal pathogens can persist through antifungal treatment, without heritable resistance mutations, by forming tolerant persister cells. Fungal infections tolerant to antifungal treatment have become a major medical problem. One mechanism...

  15. Drought and submergence tolerance in plants

    Energy Technology Data Exchange (ETDEWEB)

    Du, Hewei; Zhou, Yufan; Oksenberg, Nir; Ronald, Pamela

    2017-11-14

The invention provides methods of genetically modifying plants to increase tolerance to drought and/or submergence. The invention additionally provides plants having increased drought and/or submergence tolerance engineered using such methods.

  16. Urbanism, Migration, and Tolerance: A Reassessment.

    Science.gov (United States)

    Wilson, Thomas C.

    1991-01-01

    Urbanism's impact on the personality may be stronger than previously thought. Finds that urban residence has a strong positive effect on tolerance. Migration also promotes tolerance, regardless of the size of the destination community. (DM)

  17. The credibility challenge for global fluvial flood risk analysis

    NARCIS (Netherlands)

    Trigg, M.A.; Birch, C.E.; Neal, J.C.; Bates, P.D.; Smith, A.; Sampson, C.C.; Yamazaki, D.; Hirabayashi, Y.; Pappenberger, F.; Dutra, E.; Ward, P.J.; Winsemius, H.C.; Salamon, P.; Dottori, F.; Rudari, R.; Kappes, M.S.; Simpson, A.L.; Hadzilacos, G.; Fewtrell, T.J.

    2016-01-01

    Quantifying flood hazard is an essential component of resilience planning, emergency response, and mitigation, including insurance. Traditionally undertaken at catchment and national scales, recently, efforts have intensified to estimate flood risk globally to better allow consistent and equitable

  18. Quantifier spreading: children misled by ostensive cues

    Directory of Open Access Journals (Sweden)

    Katalin É. Kiss

    2017-04-01

Full Text Available This paper calls attention to a methodological problem of acquisition experiments. It shows that the economy of the stimulus employed in child language experiments may lend an increased ostensive effect to the message communicated to the child. Thus, when the visual stimulus in a sentence-picture matching task is a minimal model abstracting away from the details of the situation, children often regard all the elements of the stimulus as ostensive clues to be represented in the corresponding sentence. The use of such minimal stimuli is mistaken when the experiment aims to test whether or not a certain element of the stimulus is relevant for the linguistic representation or interpretation. The paper illustrates this point with an experiment involving quantifier spreading. It is claimed that children find a universally quantified sentence like 'Every girl is riding a bicycle' to be a false description of a picture showing three girls riding bicycles and a solo bicycle because they are misled to believe that all the elements in the visual stimulus are relevant, hence all of them are to be represented by the corresponding linguistic description. When the iconic drawings were replaced by photos taken in a natural environment rich in accidental details, the occurrence of quantifier spreading was radically reduced. It is shown that an extra object in the visual stimulus can lead to the rejection of the sentence also in the case of sentences involving no quantification, which gives further support to the claim that the source of the problem is not (or not only) the grammatical or cognitive difficulty of quantification but the unintended ostensive effect of the extra object. This article is part of the special collection: Acquisition of Quantification

  19. Ethanol tolerant Pt-alloy cathodes for DEFC applications

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez Valera, F.J. [CINVESTAV Unidad Saltillo, Coahuila (Mexico). Grupo de Recursos Minerales y Energeticos; Savadogo, O. [Ecole Polytechnique de Montreal, Montreal, PQ (Canada). Laboratoire de nouveaux materiaux pour l' energie et l' electrochimie

    2008-07-01

Direct ethanol fuel cells (DEFCs) based on Ru/C cathodes have interesting current density versus cell voltage behaviour. In particular, the selectivity towards the oxygen reduction reaction (ORR) in acid medium in the presence of ethanol was improved when this cathode material was used. This study quantified the degree of tolerance to ethanol and the electrocatalytic activity for the ORR. It compared the specific activity towards the ORR for Pt1Co1/C and Pt3Cr1/C. The study showed that these cathodes have a high tolerance to this alcohol and demonstrated the good performance of this type of Pt-alloy as an oxygen reduction cathode in a DEFC. The performance of the Pt1Co1/C alloy was shown to be better than that of the Pt3Cr1/C, even though the former had a lower Pt content. The enhanced catalytic behaviour of the PtCo/C alloy can be attributed to a higher degree of alloying or to a smaller mean particle size and a larger surface area. Polarization measurements with relatively high ethanol concentrations confirmed the good catalytic behaviour of the PtCo/C alloy as cathode in a DEFC operating at 90 degrees C. Current work is focusing on the variation of Co content in the alloy structure and the analysis of this change in terms of ORR activity, tolerance to ethanol and electrochemical behaviour in a DEFC. 10 refs., 5 figs.

  20. Quantifying information leakage of randomized protocols

    DEFF Research Database (Denmark)

    Biondi, Fabrizio; Legay, Axel; Malacaria, Pasquale

    2015-01-01

The quantification of information leakage provides a quantitative evaluation of the security of a system. We propose the usage of Markovian processes to model deterministic and probabilistic systems. By using a methodology generalizing the lattice of information approach we model refined attackers...... capable of observing the internal behavior of the system, and quantify the information leakage of such systems. We also use our method to obtain an algorithm for the computation of channel capacity from our Markovian models. Finally, we show how to use the method to analyze timed and non-timed attacks...

  1. Characterization of autoregressive processes using entropic quantifiers

    Science.gov (United States)

    Traversaro, Francisco; Redelico, Francisco O.

    2018-01-01

The aim of this contribution is to introduce a novel information plane, the causal-amplitude informational plane. As previous works seem to indicate, the Bandt and Pompe methodology for estimating entropy does not allow one to distinguish between probability distributions, which could be fundamental for simulation or for probability analysis purposes. Once a time series is identified as stochastic by the causal complexity-entropy informational plane, the novel causal-amplitude plane gives a deeper understanding of the time series, quantifying both the autocorrelation strength and the probability distribution of the data extracted from the generating processes. Two examples are presented, one from a climate change model and the other from financial markets.
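The Bandt and Pompe methodology referred to above reduces each short window of a time series to its ordinal pattern and takes the Shannon entropy of the pattern distribution. A minimal stdlib sketch of that idea (function name and defaults are illustrative, not taken from the paper):

```python
from collections import Counter
from math import factorial, log

def permutation_entropy(series, order=3, delay=1):
    """Normalized Bandt-Pompe permutation entropy, in [0, 1].

    Each window of `order` samples is reduced to its ordinal pattern
    (the argsort of the window); the Shannon entropy of the pattern
    frequencies is normalized by log(order!).
    """
    counts = Counter()
    for i in range(len(series) - (order - 1) * delay):
        window = [series[i + j * delay] for j in range(order)]
        # ordinal pattern: index positions sorted by sample value
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] += 1
    total = sum(counts.values())
    h = -sum((c / total) * log(c / total) for c in counts.values())
    return h / log(factorial(order))
```

Note that a strictly monotonic series produces a single pattern and hence zero normalized entropy regardless of the sample values, which illustrates the abstract's point: ordinal patterns alone discard amplitude information, motivating the causal-amplitude plane.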

  2. Quantifying Heuristic Bias: Anchoring, Availability, and Representativeness.

    Science.gov (United States)

    Richie, Megan; Josephson, S Andrew

    2018-01-01

    Construct: Authors examined whether a new vignette-based instrument could isolate and quantify heuristic bias. Heuristics are cognitive shortcuts that may introduce bias and contribute to error. There is no standardized instrument available to quantify heuristic bias in clinical decision making, limiting future study of educational interventions designed to improve calibration of medical decisions. This study presents validity data to support a vignette-based instrument quantifying bias due to the anchoring, availability, and representativeness heuristics. Participants completed questionnaires requiring assignment of probabilities to potential outcomes of medical and nonmedical scenarios. The instrument randomly presented scenarios in one of two versions: Version A, encouraging heuristic bias, and Version B, worded neutrally. The primary outcome was the difference in probability judgments for Version A versus Version B scenario options. Of 167 participants recruited, 139 enrolled. Participants assigned significantly higher mean probability values to Version A scenario options (M = 9.56, SD = 3.75) than Version B (M = 8.98, SD = 3.76), t(1801) = 3.27, p = .001. This result remained significant analyzing medical scenarios alone (Version A, M = 9.41, SD = 3.92; Version B, M = 8.86, SD = 4.09), t(1204) = 2.36, p = .02. Analyzing medical scenarios by heuristic revealed a significant difference between Version A and B for availability (Version A, M = 6.52, SD = 3.32; Version B, M = 5.52, SD = 3.05), t(404) = 3.04, p = .003, and representativeness (Version A, M = 11.45, SD = 3.12; Version B, M = 10.67, SD = 3.71), t(396) = 2.28, p = .02, but not anchoring. Stratifying by training level, students maintained a significant difference between Version A and B medical scenarios (Version A, M = 9.83, SD = 3.75; Version B, M = 9.00, SD = 3.98), t(465) = 2.29, p = .02, but not residents or attendings. Stratifying by heuristic and training level, availability maintained

  3. An index for quantifying flocking behavior.

    Science.gov (United States)

    Quera, Vicenç; Herrando, Salvador; Beltran, Francesc S; Salas, Laura; Miñano, Meritxell

    2007-12-01

One of the classic research topics in adaptive behavior is the collective displacement of groups of organisms such as flocks of birds, schools of fish, herds of mammals, and crowds of people. However, most agent-based simulations of group behavior do not provide a quantitative index for determining the point at which the flock emerges. An index of the aggregation of moving individuals was developed, and an example was provided of how it can be used to quantify the degree to which a group of moving individuals actually forms a flock.
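The abstract does not reproduce the authors' index itself; as a hedged illustration of the kind of scalar order parameter used to detect flock emergence in agent-based models, here is a standard heading-alignment measure (a generic quantity from collective-motion modelling, not the index of Quera et al.):

```python
import math

def polarization(headings):
    """Alignment order parameter in [0, 1] for agent headings (radians).

    1.0 means all agents move in the same direction (a coherent flock);
    values near 0 indicate disordered motion. Computed as the length of
    the mean unit heading vector.
    """
    n = len(headings)
    sx = sum(math.cos(t) for t in headings) / n
    sy = sum(math.sin(t) for t in headings) / n
    return math.hypot(sx, sy)
```

Tracking such a scalar over simulation time gives a threshold-based criterion for the moment a flock emerges, which is the role the paper's index plays.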

  4. 75 FR 17566 - Flutolanil; Pesticide Tolerances

    Science.gov (United States)

    2010-04-07

    ... ppm, and the greater tolerance value is needed to accommodate indirect residues from soybean..., and soybean hay at 2.5 ppm are being revoked since the same tolerance values are being established...; Pesticide Tolerances AGENCY: Environmental Protection Agency (EPA). ACTION: Final rule. SUMMARY: This...

  5. 77 FR 49732 - Cyprodinil; Pesticide Tolerances

    Science.gov (United States)

    2012-08-17

    .../puree (1x) and lemon/lime juice (1x) were used to modify the tolerance values. iii. Cancer. Based on the... the tolerance necessitate a higher value. Additionally, Codex has an established MRL on grape at 3 ppm...; Pesticide Tolerances AGENCY: Environmental Protection Agency (EPA). ACTION: Final rule. SUMMARY: This...

  6. 15 CFR 750.11 - Shipping tolerances.

    Science.gov (United States)

    2010-01-01

    ... in the ECCN applicable to your item reads “ $ value” or “in $ value”, there is no shipping tolerance... is no shipping tolerance with respect to the number of units. However, the value of all of your... shipping tolerance on this license because the items are controlled by an ECCN where “$ value” is the...

  7. 78 FR 18511 - Thiamethoxam; Pesticide Tolerances

    Science.gov (United States)

    2013-03-27

    ... Health Risk Assessment of New Uses on Strawberry, Pistachio, and Citrus; New Tolerance for Tea; and... Uses on Strawberry, Pistachio, and Citrus; New Tolerance for Tea; and Revised PHI and Tolerance for... ``Clothianidin--Aggregate Human Health Risk Assessment of New Uses on Strawberry, Pistachio, and Citrus; New...

  8. 77 FR 28493 - Propylene Oxide; Tolerance Actions

    Science.gov (United States)

    2012-05-15

    .... SUMMARY: EPA is establishing the tree nut crop group tolerance and separate tolerances on pistachio and...; nut, tree, group 14; and pistachio; and in 40 CFR 180.491(a)(2) tolerances for propylene chlorohydrin at 10.0 ppm on nut, pine; nut, tree, group 14; and pistachio. Also, in accordance with current Agency...

  9. 75 FR 26673 - Clethodim; Pesticide Tolerances

    Science.gov (United States)

    2010-05-12

    ... regulation establishes tolerances for residues of clethodim in or on the raw agricultural commodity artichoke... clethodim, in or on the raw agricultural commodity artichoke, globe at 1.3 parts per million (ppm... bushberry subgroup 13-07B tolerance from 3.0 ppm to 0.20 ppm and the globe artichoke tolerance from 1.3 ppm...

  10. Selection and characterisation of high ethanol tolerant ...

    African Journals Online (AJOL)

    15% ethanol tolerance. High level ethanol tolerant Saccharomyces yeast, Orc 6, was investigated for its potential application in ethanologenic fermentations. Data presented in this study revealed that Orc 6 yeast isolate tolerated osmotic stress above 12% (w/v) sorbitol and 15% (w/v) sucrose equivalent of osmotic pressure ...

  11. Zero Tolerance: Advantages and Disadvantages. Research Brief

    Science.gov (United States)

    Walker, Karen

    2009-01-01

    What are the positives and negatives of zero tolerance? What should be considered when examining a school's program? Although there are no definitive definitions of zero tolerance, two commonly used ones are as follows: "Zero tolerance means that a school will automatically and severely punish a student for a variety of infractions" (American Bar…

  12. What is Fault Tolerant Control

    DEFF Research Database (Denmark)

    Blanke, Mogens; Frei, C. W.; Kraus, K.

    2000-01-01

Faults in automated processes will often cause undesired reactions and shut-down of a controlled plant, and the consequences could be damage to the plant, to personnel or the environment. Fault-tolerant control is the synonym for a set of recent techniques that were developed to increase plant...... availability and reduce the risk of safety hazards. Its aim is to prevent simple faults from developing into serious failures. Fault-tolerant control merges several disciplines to achieve this goal, including on-line fault diagnosis, automatic condition assessment and calculation of remedial actions when a fault...... is detected. The envelope of the possible remedial actions is wide. This paper introduces tools to analyze and explore structure and other fundamental properties of an automated system such that any redundancy in the process can be fully utilized to enhance safety and availability....

  13. Human tolerance to space flight

    Science.gov (United States)

    Huntoon, C. L.

    1989-01-01

    Medical studies of astronauts and cosmonauts before, during, and after space missions have identified several effects of weightlessness and other factors that influence the ability of humans to tolerate space flight. Weightlessness effects include space motion sickness, cardiovascular abnormalities, reduction in immune system function, loss of red blood cells, loss of bone mass, and muscle atrophy. Extravehicular activity (EVA) increases the likelihood that decompression sickness may occur. Radiation also gives reason for concern about health of crewmembers, and psychological factors are important on long-term flights. Countermeasures that have been used include sensory preadaptation, prebreathing and use of various air mixtures for EVA, loading with water and electrolytes, exercise, use of pharmacological agents and special diets, and psychological support. It appears that humans can tolerate and recover satisfactorily from at least one year of space flight, but a number of conditions must be further ameliorated before long-duration missions can be considered routine.

  14. Fault Tolerant External Memory Algorithms

    DEFF Research Database (Denmark)

    Jørgensen, Allan Grønlund; Brodal, Gerth Stølting; Mølhave, Thomas

    2009-01-01

    Algorithms dealing with massive data sets are usually designed for I/O-efficiency, often captured by the I/O model by Aggarwal and Vitter. Another aspect of dealing with massive data is how to deal with memory faults, e.g. captured by the adversary based faulty memory RAM by Finocchi and Italiano....... However, current fault tolerant algorithms do not scale beyond the internal memory. In this paper we investigate for the first time the connection between I/O-efficiency in the I/O model and fault tolerance in the faulty memory RAM, and we assume that both memory and disk are unreliable. We show a lower...... bound on the number of I/Os required for any deterministic dictionary that is resilient to memory faults. We design a static and a dynamic deterministic dictionary with optimal query performance as well as an optimal sorting algorithm and an optimal priority queue. Finally, we consider scenarios where...

  15. Copper tolerance of Trichoderma species

    Directory of Open Access Journals (Sweden)

    Jovičić-Petrović Jelena

    2014-01-01

Full Text Available Some Trichoderma strains can persist in ecosystems with high concentrations of heavy metals. The aim of this research was to examine the variability of Trichoderma strains isolated from different ecosystems, based on their morphological properties and restriction analysis of ITS fragments. The fungal growth was tested on potato dextrose agar amended with Cu(II) concentrations ranging from 0.25 to 10 mmol/l, in order to identify copper-resistant strains. The results indicate that some isolated strains of Trichoderma sp. show tolerance to higher copper concentrations. Further research to examine the ability of copper bioaccumulation by tolerant Trichoderma strains is needed. [Project of the Ministry of Science of the Republic of Serbia, no. TR 31080 and no. III 43010]

  16. A damage-tolerant glass.

    Science.gov (United States)

    Demetriou, Marios D; Launey, Maximilien E; Garrett, Glenn; Schramm, Joseph P; Hofmann, Douglas C; Johnson, William L; Ritchie, Robert O

    2011-02-01

    Owing to a lack of microstructure, glassy materials are inherently strong but brittle, and often demonstrate extreme sensitivity to flaws. Accordingly, their macroscopic failure is often not initiated by plastic yielding, and almost always terminated by brittle fracture. Unlike conventional brittle glasses, metallic glasses are generally capable of limited plastic yielding by shear-band sliding in the presence of a flaw, and thus exhibit toughness-strength relationships that lie between those of brittle ceramics and marginally tough metals. Here, a bulk glassy palladium alloy is introduced, demonstrating an unusual capacity for shielding an opening crack accommodated by an extensive shear-band sliding process, which promotes a fracture toughness comparable to those of the toughest materials known. This result demonstrates that the combination of toughness and strength (that is, damage tolerance) accessible to amorphous materials extends beyond the benchmark ranges established by the toughest and strongest materials known, thereby pushing the envelope of damage tolerance accessible to a structural metal.

  17. Engineering yeast transcription machinery for improved ethanol tolerance and production.

    Science.gov (United States)

    Alper, Hal; Moxley, Joel; Nevoigt, Elke; Fink, Gerald R; Stephanopoulos, Gregory

    2006-12-08

    Global transcription machinery engineering (gTME) is an approach for reprogramming gene transcription to elicit cellular phenotypes important for technological applications. Here we show the application of gTME to Saccharomyces cerevisiae for improved glucose/ethanol tolerance, a key trait for many biofuels programs. Mutagenesis of the transcription factor Spt15p and selection led to dominant mutations that conferred increased tolerance and more efficient glucose conversion to ethanol. The desired phenotype results from the combined effect of three separate mutations in the SPT15 gene [serine substituted for phenylalanine (Phe(177)Ser) and, similarly, Tyr(195)His, and Lys(218)Arg]. Thus, gTME can provide a route to complex phenotypes that are not readily accessible by traditional methods.

  18. Modelling Accident Tolerant Fuel Concepts

    Energy Technology Data Exchange (ETDEWEB)

    Hales, Jason Dean [Idaho National Laboratory; Gamble, Kyle Allan Lawrence [Idaho National Laboratory

    2016-05-01

The catastrophic events that occurred at the Fukushima-Daiichi nuclear power plant in 2011 have led to widespread interest in research of alternative fuels and claddings that are proposed to be accident tolerant. The United States Department of Energy (DOE) through its Nuclear Energy Advanced Modeling and Simulation (NEAMS) program has funded an Accident Tolerant Fuel (ATF) High Impact Problem (HIP). The ATF HIP is a three-year project to perform research on two accident tolerant concepts. The final outcome of the ATF HIP will be an in-depth report to the DOE Advanced Fuels Campaign (AFC) giving a recommendation on whether either of the two concepts should be included in their lead test assembly scheduled for placement into a commercial reactor in 2022. The two ATF concepts under investigation in the HIP are uranium silicide fuel and iron-chromium-aluminum (FeCrAl) alloy cladding. Utilizing the expertise of three national laboratory participants (Idaho National Laboratory, Los Alamos National Laboratory, and Argonne National Laboratory), a comprehensive multiscale approach to modeling is being used that includes atomistic modeling, molecular dynamics, rate theory, phase-field, and fuel performance simulations. Model development and fuel performance analysis are critical since a full suite of experimental studies will not be complete before AFC must prioritize concepts for focused development. In this paper, we present simulations of the two proposed accident tolerant fuel systems: U3Si2 fuel with Zircaloy-4 cladding, and UO2 fuel with FeCrAl cladding. Sensitivity analyses are completed using Sandia National Laboratories’ Dakota software to determine which input parameters (e.g., fuel specific heat) have the greatest influence on the output metrics of interest (e.g., fuel centerline temperature). We also outline the multiscale modelling approach being employed. Considerable additional work is required prior to preparing the recommendation report for the Advanced

  19. Historical overview of immunological tolerance.

    Science.gov (United States)

    Schwartz, Ronald H

    2012-04-01

    A fundamental property of the immune system is its ability to mediate self-defense with a minimal amount of collateral damage to the host. The system uses several different mechanisms to achieve this goal, which is collectively referred to as the "process of immunological tolerance." This article provides an introductory historical overview to these various mechanisms, which are discussed in greater detail throughout this collection, and then briefly describes what happens when this process fails, a state referred to as "autoimmunity."

  20. SALT TOLERANCE OF CROP PLANTS

    OpenAIRE

    Hamdia, M. A; Shaddad, M. A. K.

    2010-01-01

    Several environmental factors adversely affect plant growth and development and final yield performance of a crop. Drought, salinity, nutrient imbalances (including mineral toxicities and deficiencies) and extremes of temperature are among the major environmental constraints to crop productivity worldwide. Development of crop plants with stress tolerance, however, requires, among others, knowledge of the physiological mechanisms and genetic controls of the contributing traits at different pla...

  1. Fault Tolerant Wind Farm Control

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Stoustrup, Jakob

    2013-01-01

    In the recent years the wind turbine industry has focused on optimizing the cost of energy. One of the important factors in this is to increase reliability of the wind turbines. Advanced fault detection, isolation and accommodation are important tools in this process. Clearly most faults are deal...... scenarios. This benchmark model is used in an international competition dealing with Wind Farm fault detection and isolation and fault tolerant control....

  2. Enhanced fault-tolerant quantum computing in d-level systems.

    Science.gov (United States)

    Campbell, Earl T

    2014-12-05

    Error-correcting codes protect quantum information and form the basis of fault-tolerant quantum computing. Leading proposals for fault-tolerant quantum computation require codes with an exceedingly rare property, a transversal non-Clifford gate. Codes with the desired property are presented for d-level qudit systems with prime d. The codes use n=d-1 qudits and can detect up to ∼d/3 errors. We quantify the performance of these codes for one approach to quantum computation known as magic-state distillation. Unlike prior work, we find performance is always enhanced by increasing d.

  3. Global swindle of global warming

    NARCIS (Netherlands)

    Zeiler, W.

    2007-01-01

For some people it is still not plausible that we are dealing with the effects of 'Global Warming', the warming of the earth caused mainly by the greenhouse gases released in the burning of fossil fuels. In the media, proponents and opponents are given the floor

  4. SALT TOLERANCE OF CROP PLANTS

    Directory of Open Access Journals (Sweden)

    Hamdia, M. A

    2010-09-01

Full Text Available Several environmental factors adversely affect plant growth and development and the final yield performance of a crop. Drought, salinity, nutrient imbalances (including mineral toxicities and deficiencies) and extremes of temperature are among the major environmental constraints to crop productivity worldwide. Development of crop plants with stress tolerance, however, requires, among others, knowledge of the physiological mechanisms and genetic controls of the contributing traits at different plant developmental stages. In the past 2 decades, biotechnology research has provided considerable insights into the mechanism of biotic stress tolerance in plants at the molecular level. Furthermore, different abiotic stress factors may provoke osmotic stress, oxidative stress and protein denaturation in plants, which lead to similar cellular adaptive responses such as accumulation of compatible solutes, induction of stress proteins, and acceleration of reactive oxygen species scavenging systems. Recently, the authors have tried to improve plant tolerance to salinity injury through chemical treatments (plant hormones, minerals, amino acids, quaternary ammonium compounds, polyamines and vitamins), biofertilizer treatments (asymbiotic nitrogen-fixing bacteria, symbiotic nitrogen-fixing bacteria and mycorrhiza), or enhancement of a process used naturally by plants to minimise the movement of Na+ to the shoot, using genetic modification to amplify the process, helping plants to do what they already do, but to do it much better.

  5. A Generalizable Methodology for Quantifying User Satisfaction

    Science.gov (United States)

    Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung

Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times, rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors, like loudness and sidetone, can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction and discuss how they can be exploited to improve user experience and optimize resource allocation.
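Survival analysis of session times centers on the survival function S(t), the probability that a session lasts longer than t, estimated while correctly handling sessions that were still ongoing when measurement stopped (censoring). A minimal Kaplan-Meier sketch of that core idea (illustrative only; the paper's models are more elaborate):

```python
def kaplan_meier(durations, observed):
    """Kaplan-Meier estimate of the survival curve S(t) for session times.

    durations: session lengths; observed[i] is True if session i ended
    naturally (False = censored, e.g. the measurement window closed).
    Returns parallel lists (event_times, survival_probabilities).
    """
    data = sorted(zip(durations, observed))
    n_at_risk = len(data)
    s, times, surv = 1.0, [], []
    i = 0
    while i < len(data):
        t = data[i][0]
        j, deaths = i, 0
        # count sessions ending exactly at time t (censored ones don't count)
        while j < len(data) and data[j][0] == t:
            deaths += bool(data[j][1])
            j += 1
        if deaths:
            s *= 1.0 - deaths / n_at_risk  # multiply in the conditional survival
            times.append(t)
            surv.append(s)
        n_at_risk -= j - i
        i = j
    return times, surv
```

A steeper drop of S(t) under a degraded network condition than under a good one would indicate lower user satisfaction, which is the comparison the methodology builds on.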

  6. Quantifying the efficiency of river regulation

    Directory of Open Access Journals (Sweden)

    R. Rödel

    2005-01-01

Full Text Available Dam-affected hydrologic time series give rise to uncertainties when they are used for calibrating large-scale hydrologic models or for analysing runoff records. It is therefore necessary to identify and to quantify the impact of impoundments on runoff time series. Two different approaches were employed. The first, classic approach compares the volume of the dams that are located upstream from a station with the annual discharge. The catchment areas of the stations are calculated and then related to geo-referenced dam attributes. The paper introduces a data set of geo-referenced dams linked with 677 gauging stations in Europe. Second, the intensity of the impoundment impact on runoff time series can be quantified more exactly and directly when long-term runoff records are available. Dams cause a change in the variability of flow regimes. This effect can be measured using a linear single-storage model. The dam-caused storage change ΔS can be assessed through the volume of the emptying process between two flow regimes. As an example, the storage change ΔS is calculated for regulated long-term series of the Luleälven in northern Sweden.
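In a linear single-storage model the reservoir obeys S = k·Q, so a recession limb decays as Q(t) = Q0·exp(-t/k) and the storage change between two flow regimes follows from the difference in regime discharge. A hedged sketch of both steps (function and variable names are illustrative, not from the paper):

```python
import math

def recession_constant(q0, qt, dt_days):
    """Storage coefficient k (days) from two points on a recession limb,
    assuming the linear reservoir Q(t) = Q0 * exp(-t / k)."""
    return -dt_days / math.log(qt / q0)

def storage_change(k_days, q_before, q_after):
    """Storage change ΔS (m^3) between two flow regimes for S = k * Q,
    with mean regime discharges in m^3/s and k in days (86 400 s/day)."""
    return k_days * 86400.0 * (q_before - q_after)
```

Under these assumptions, a dam that flattens the regime from 10 to 6 m³/s with k = 20 days implies an emptying volume of about 6.9 million m³.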

  7. Quantifying meta-correlations in financial markets

    Science.gov (United States)

    Kenett, Dror Y.; Preis, Tobias; Gur-Gershgoren, Gitit; Ben-Jacob, Eshel

    2012-08-01

    Financial markets are modular multi-level systems, in which the relationships between the individual components are not constant in time. Sudden changes in these relationships significantly affect the stability of the entire system, and vice versa. Our analysis is based on historical daily closing prices of the 30 components of the Dow Jones Industrial Average (DJIA) from March 15th, 1939 until December 31st, 2010. We quantify the correlation among these components by determining Pearson correlation coefficients, to investigate whether mean correlation of the entire portfolio can be used as a precursor for changes in the index return. To this end, we quantify the meta-correlation - the correlation of mean correlation and index return. We find that changes in index returns are significantly correlated with changes in mean correlation. Furthermore, we study the relationship between the index return and correlation volatility - the standard deviation of correlations for a given time interval. This parameter provides further evidence of the effect of the index on market correlations and their fluctuations. Our empirical findings provide new information and quantification of the index leverage effect, and have implications to risk management, portfolio optimization, and to the increased stability of financial markets.
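The quantities in this abstract can be sketched with plain Pearson statistics: the mean pairwise correlation of the portfolio components over a window, whose time series is then correlated with the index return to give the meta-correlation. A stdlib illustration (generic, not the authors' exact windowing or data):

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def mean_correlation(asset_returns):
    """Mean pairwise Pearson correlation across assets for one window.

    asset_returns: list of per-asset return series over the same window.
    """
    m = len(asset_returns)
    pairs = [(i, j) for i in range(m) for j in range(i + 1, m)]
    return sum(pearson(asset_returns[i], asset_returns[j])
               for i, j in pairs) / len(pairs)
```

Computing mean_correlation over a rolling window yields a mean-correlation time series; the meta-correlation is then pearson() of that series against the index-return series over the same windows.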

  8. Conceived globals

    DEFF Research Database (Denmark)

    Cheraghi, Maryam; Schøtt, Thomas

    2016-01-01

    and culture which have separate effects. Being man, young, educated and having entrepreneurial competencies promote transnational networking extensively. Networking is embedded in culture, in the way that transnational networking is more extensive in secular-rational culture than in traditional culture.......A firm may be conceived global, in the sense that, before its birth, the founding entrepreneur has a transnational network of advisors which provides an embedding for organising the upstart that may include assembling resources and marketing abroad. The purpose is to account for the entrepreneurs...... the intending, starting and operating phases, fairly constantly with only small fluctuations. The firm is conceived global in terms of the entrepreneur's transnational networking already in the pre-birth phase, when the entrepreneur is intending to start the firm. These phase effects hardly depend on attributes...

  9. Global Derivatives

    DEFF Research Database (Denmark)

    Andersen, Torben Juul

    approaches to dealing in the global business environment." - Sharon Brown-Hruska, Commissioner, Commodity Futures Trading Commission, USA. "This comprehensive survey of modern risk management using derivative securities is a fine demonstration of the practical relevance of modern derivatives theory to risk......" provides comprehensive coverage of different types of derivatives, including exchange traded contracts and over-the-counter instruments as well as real options. There is an equal emphasis on the practical application of derivatives and their actual uses in business transactions and corporate risk...... management situations. Its key features include: derivatives are introduced in a global market perspective; describes major derivative pricing models for practical use, extending these principles to valuation of real options; practical applications of derivative instruments are richly illustrated...

  10. Quantifying the Determinants of Evolutionary Dynamics Leading to Drug Resistance.

    Directory of Open Access Journals (Sweden)

    Guillaume Chevereau

Full Text Available The emergence of drug resistant pathogens is a serious public health problem. It is a long-standing goal to predict rates of resistance evolution and design optimal treatment strategies accordingly. To this end, it is crucial to reveal the underlying causes of drug-specific differences in the evolutionary dynamics leading to resistance. However, it remains largely unknown why the rates of resistance evolution via spontaneous mutations and the diversity of mutational paths vary substantially between drugs. Here we comprehensively quantify the distribution of fitness effects (DFE) of mutations, a key determinant of evolutionary dynamics, in the presence of eight antibiotics representing the main modes of action. Using precise high-throughput fitness measurements for genome-wide Escherichia coli gene deletion strains, we find that the width of the DFE varies dramatically between antibiotics and, contrary to conventional wisdom, for some drugs the DFE width is lower than in the absence of stress. We show that this previously underappreciated divergence in DFE width among antibiotics is largely caused by their distinct drug-specific dose-response characteristics. Unlike the DFE, the magnitude of the changes in tolerated drug concentration resulting from genome-wide mutations is similar for most drugs but exceptionally small for the antibiotic nitrofurantoin, i.e., mutations generally have considerably smaller resistance effects for nitrofurantoin than for other drugs. A population genetics model predicts that resistance evolution for drugs with this property is severely limited and confined to reproducible mutational paths. We tested this prediction in laboratory evolution experiments using the "morbidostat", a device for evolving bacteria in well-controlled drug environments.
Nitrofurantoin resistance indeed evolved extremely slowly via reproducible mutations, an almost paradoxical behavior since this drug causes DNA damage and increases the mutation

  11. Quantifying intra- and inter-fractional motion in breast radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Scott, E-mail: scott.jones@health.qld.gov.au [Division of Cancer Services, Radiation Oncology Mater Centre, Princess Alexandra Hospital, Brisbane (Australia); Fitzgerald, Rhys [Division of Cancer Services, Princess Alexandra Hospital, Brisbane (Australia); Owen, Rebecca; Ramsay, Jonathan [Division of Cancer Services, Radiation Oncology Mater Centre, Princess Alexandra Hospital, Brisbane (Australia)

    2015-03-15

The magnitude of intra- and inter-fractional variation in the set-up of breast cancer patients treated with tangential megavoltage photon beams was investigated using an electronic portal imaging device (EPID). Daily cine-EPID images were captured during delivery of the tangential fields for ten breast cancer patients treated in the supine position. Measurements collected from each image included the central lung distance (CLD), central flash distance (CFD), superior axial measurement (SAM) and inferior axial measurement (IAM). The variation of motion within a fraction (intra-fraction) and the variation between fractions (inter-fraction) were analysed to quantify set-up variation and motion due to respiration. Altogether 3775 EPID images were collected from the 10 patients. The effect of respiratory motion during treatment was <0.1 cm standard deviation (SD) in the anterior–posterior (AP) direction. The inter-fraction movement caused by variations in daily set-up was larger, at 0.28 cm SD in the AP direction. Superior–inferior (SI) variation was more difficult to summarise and proved unreliable, as the measurements were taken to an ambiguous point on the images; it was difficult to discern true SI movement from apparent movement induced by AP displacement. There is minimal intra-fractional chest wall motion due to respiration during treatment. Inter-fractional variation was larger; however, on average it remained within the departmental tolerance (0.5 cm) for set-up variations. This review of our current breast technique provides confidence in the feasibility of utilising advanced treatment techniques (field-in-field, intensity-modulated radiotherapy or volumetric modulated arc therapy) following a review of the current imaging protocol.
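The intra-/inter-fraction split used above can be sketched numerically: given per-fraction lists of CLD measurements, the spread within each fraction estimates respiratory motion, while the spread of the per-fraction means estimates daily set-up variation. A minimal sketch; the data and function name are hypothetical, not from the study:

```python
import statistics

def motion_components(cld_by_fraction):
    """Split CLD variation into intra- and inter-fraction components.

    cld_by_fraction: list of lists; one inner list of CLD measurements
    (cm) per treatment fraction, taken from cine-EPID frames.
    """
    # Intra-fraction: spread of measurements within each fraction,
    # averaged over fractions (respiration-driven motion).
    intra = statistics.mean(
        statistics.pstdev(frac) for frac in cld_by_fraction
    )
    # Inter-fraction: spread of the per-fraction means (daily set-up).
    means = [statistics.mean(frac) for frac in cld_by_fraction]
    inter = statistics.pstdev(means)
    return intra, inter

# Hypothetical example: 3 fractions, 4 cine frames each.
fractions = [
    [1.50, 1.55, 1.52, 1.53],
    [1.80, 1.84, 1.82, 1.82],
    [1.20, 1.22, 1.21, 1.21],
]
intra_sd, inter_sd = motion_components(fractions)
```

With data of this shape, the within-fraction spread comes out much smaller than the day-to-day spread, mirroring the study's qualitative finding.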

  12. Regulation of mRNA translation influences hypoxia tolerance

    International Nuclear Information System (INIS)

    Koritzinsky, M.; Wouters, B.G.; Koumenis, C.

    2003-01-01

Hypoxia is a heterogeneous but common characteristic of human tumours, and poor oxygenation is associated with poor prognosis. We believe that the presence of viable hypoxic tumour cells reflects, in part, an adaptation and tolerance of these cells to oxygen deficiency. Since oxidative phosphorylation is compromised during hypoxia, adaptation may involve both the upregulation of glycolysis and the downregulation of energy consumption. mRNA translation is one of the most energy-costly cellular processes, and we and others have shown that global mRNA translation is rapidly inhibited during hypoxia. However, some mRNAs, including those coding for HIF-1α and VEGF, remain efficiently translated during hypoxia. Clearly, the mechanisms responsible for the overall inhibition of translation during hypoxia do not compromise the translation of certain hypoxia-induced mRNA species. We therefore hypothesize that the inhibition of mRNA translation serves to promote hypoxia tolerance in two ways: (i) through conservation of energy and (ii) through differential expression of genes involved in hypoxia adaptation. We have recently identified two pathways that are responsible for the global inhibition of translation during hypoxia. The phosphorylation of the eukaryotic initiation factor eIF2α by the ER-resident kinase PERK results in downregulation of protein synthesis shortly after the onset of hypoxia. In addition, the initiation complex eIF4F is disrupted during long-lasting hypoxic conditions. The identification of the molecular pathways responsible for the inhibition of overall translation during hypoxia has made it possible to investigate their importance for hypoxia tolerance. We have found that mouse embryo fibroblasts that are knockout for PERK, and therefore unable to inhibit protein synthesis efficiently during oxygen deficiency, are significantly less tolerant to hypoxia than their wildtype counterparts. We are currently also investigating the functional significance

  13. Energy globalization

    International Nuclear Information System (INIS)

    Tierno Andres

    1997-01-01

In the future, petroleum may cease to be the world's main energy source, and oil companies will survive only if they adjust to the new winds blowing through the energy sector at large. Owning the resource (petroleum or gas) will no longer be enough for a company to subsist and remain profitable in the long term. The future will depend in great measure on the vision with which oil companies face the globalization that the world's energy sector is beginning to experience. Concepts such as globalization, competition, integration and diversification are ones that companies in the hydrocarbons sector must keep very much in mind. Globalization means staying attentive to what happens in the world beyond the limits of one's own territory, or else being caught by competitive surprises that may originate in very distant places. The search for cleaner energy sources, friendlier to the environment, is not the only threat petroleum should fear: its substitution by electricity in large mass-transit projects, communications technology, optical fibre, and relations with indigenous communities are also aspects that compete with the future of petroleum.

  14. Global overeksponering

    DEFF Research Database (Denmark)

    Rosenstand, Claus A. Foss

    2007-01-01

…changes. The global orientation is expressed, among other things, in the relatively large international network backing the young people in their protests, whether through presence in Copenhagen or through other sympathy actions. Since 11 September 2001, global realities have been exposed in the mass media… then, on 11 September 2001, the lens was opened fully onto a global world in which democratic values do not apply. Let me give just one example: Guantanamo. I shall argue neither for nor against the way the world is arranged, as that is irrelevant to this analysis, but merely point out… with considerably greater force than before. Before 11 September, globalization was portrayed exclusively by the jet set, that is, international politicians, cultural luminaries, scientists and business people acting on familiar rationales. But the jet set no longer holds that privileged position…

  15. Linking waterlogging tolerance with Mn²⁺ toxicity: a case study for barley.

    Science.gov (United States)

    Huang, X; Shabala, S; Shabala, L; Rengel, Z; Wu, X; Zhang, G; Zhou, M

    2015-01-01

Vast agricultural areas are affected by flooding, causing up to 80% yield reduction and resulting in multibillion-dollar losses. Up to now, the focus of plant breeders has been predominantly on the detrimental effects of anoxia, while other (potentially equally important) traits have been essentially neglected; one of these is soil elemental toxicity. Excess water triggers a progressive decrease in soil redox potential, thus increasing the concentration of Mn²⁺, which can be toxic to plants above a specific threshold. This work aimed to quantify the relative contribution of Mn²⁺ toxicity to waterlogging stress tolerance, using barley as a case study. Twenty barley (Hordeum vulgare) genotypes contrasting in waterlogging stress tolerance were studied for their ability to cope with toxic (1 mM) amounts of Mn²⁺ in the root rhizosphere. Under Mn²⁺ toxicity, the chlorophyll content of the most waterlogging-tolerant genotypes (TX9425, Yerong, CPI-71284-48 and CM72) remained above 60% of the control value, whereas sensitive genotypes (Franklin and Naso Nijo) retained less than 35% of the control value. Manganese concentration in leaves was not related to visual Mn²⁺ toxicity symptoms, suggesting that different Mn²⁺ tolerance mechanisms might operate in different tolerant genotypes, i.e. avoidance versus tissue tolerance. The overall significant (r = 0.60) correlation between tolerance to Mn²⁺ toxicity and waterlogging in barley suggests that plant breeding for waterlogging tolerance may be advanced by targeting mechanisms conferring tolerance to Mn²⁺ toxicity, at least in this species. © 2014 German Botanical Society and The Royal Botanical Society of the Netherlands.
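The reported r = 0.60 is a standard Pearson correlation between genotype scores for the two stresses. A minimal sketch with hypothetical genotype scores (the numbers are illustrative, not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical tolerance scores for six genotypes (higher = more tolerant).
mn_tolerance = [8.2, 7.9, 7.5, 6.8, 3.1, 2.5]
waterlogging = [7.8, 8.4, 6.2, 7.1, 3.9, 2.8]
r = pearson_r(mn_tolerance, waterlogging)
```

A strong positive r, as here, is what motivates using Mn²⁺ tolerance as a breeding proxy for waterlogging tolerance.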

  16. Linking salinity stress tolerance with tissue-specific Na+ sequestration in wheat roots

    Directory of Open Access Journals (Sweden)

    Honghong eWu

    2015-02-01

Full Text Available Salinity stress tolerance is a physiologically complex trait conferred by a large array of interacting mechanisms. Among these, vacuolar Na+ sequestration has always been considered one of the key components differentiating between sensitive and tolerant species and genotypes. However, vacuolar Na+ sequestration has rarely been considered in the context of the tissue-specific expression and regulation of the transporters contributing to Na+ removal from the cytosol. In this work, six bread wheat varieties contrasting in their salinity tolerance (three tolerant and three sensitive) were used to assess the importance of vacuolar Na+ sequestration in functionally different root tissues, and to link it with the overall salinity stress tolerance in this species. Roots of 4-day-old wheat seedlings were treated with 100 mM NaCl for 3 days, and Na+ distribution between cytosol and vacuole was then quantified by CoroNa Green fluorescent dye imaging. Our major observations were as follows: (1) salinity stress tolerance correlated positively with vacuolar Na+ sequestration ability in the mature root zone but not in the root apex; (2) contrary to expectations, cytosolic Na+ levels in the root meristem were significantly higher in the salt-tolerant than in the sensitive group, while vacuolar Na+ levels showed the opposite trend; these results are interpreted as meristem cells playing the role of a salt sensor; (3) no significant difference in vacuolar Na+ sequestration ability was found between the sensitive and tolerant groups in either the transition or the elongation zone; (4) the overall Na+ accumulation was highest in the elongation zone, suggesting its role in the osmotic adjustment and turgor maintenance required to drive root expansion growth. Overall, the reported results suggest high tissue-specificity of Na+ uptake, signalling and sequestration in the wheat root. The implications of these findings for plant breeding for salinity stress tolerance are discussed.

  17. Quantifying offshore wind resources from satellite wind maps: Study area the North Sea

    DEFF Research Database (Denmark)

    Hasager, Charlotte Bay; Barthelmie, Rebecca Jane; Christiansen, Merete B.

    2006-01-01

Offshore wind resources are quantified from satellite synthetic aperture radar (SAR) and satellite scatterometer observations at local and regional scale, respectively, at the Horns Rev site in Denmark. The method for wind resource estimation from satellite observations interfaces with the wind atlas analysis and application program (WAsP). An estimate of the wind resource at the new project site at Horns Rev is given based on satellite SAR observations. The comparison of offshore satellite scatterometer winds, global model data and in situ data shows good agreement. Furthermore, the wake effect of the Horns Rev wind farm is quantified from satellite SAR images and compared with state-of-the-art wake model results with good agreement. It is a unique method using satellite observations to quantify the spatial extent of the wake behind large offshore wind farms. Copyright © 2006 John Wiley & Sons, Ltd.
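In WAsP-style resource estimation, the observed wind climate is commonly summarized by a Weibull distribution with scale A and shape k, from which the mean wind power density follows analytically via E[u³] = A³ Γ(1 + 3/k). A sketch under assumed, illustrative parameter values (not the Horns Rev figures):

```python
import math

def mean_power_density(A, k, rho=1.225):
    """Mean wind power density (W/m^2) for a Weibull wind-speed
    distribution with scale A (m/s) and shape k, at air density rho.

    Uses the analytic third moment E[u^3] = A^3 * Gamma(1 + 3/k),
    so that P = 0.5 * rho * E[u^3].
    """
    return 0.5 * rho * A**3 * math.gamma(1.0 + 3.0 / k)

# Hypothetical offshore wind climate: A = 10 m/s, k = 2.2.
pd = mean_power_density(10.0, 2.2)
```

The same one-liner lets satellite-derived Weibull fits be compared directly against mast-based wind atlas estimates.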

  18. Current challenges in quantifying preferential flow through the vadose zone

    Science.gov (United States)

    Koestel, John; Larsbo, Mats; Jarvis, Nick

    2017-04-01

    In this presentation, we give an overview of current challenges in quantifying preferential flow through the vadose zone. A review of the literature suggests that current generation models do not fully reflect the present state of process understanding and empirical knowledge of preferential flow. We believe that the development of improved models will be stimulated by the increasingly widespread application of novel imaging technologies as well as future advances in computational power and numerical techniques. One of the main challenges in this respect is to bridge the large gap between the scales at which preferential flow occurs (pore to Darcy scales) and the scale of interest for management (fields, catchments, regions). Studies at the pore scale are being supported by the development of 3-D non-invasive imaging and numerical simulation techniques. These studies are leading to a better understanding of how macropore network topology and initial/boundary conditions control key state variables like matric potential and thus the strength of preferential flow. Extrapolation of this knowledge to larger scales would require support from theoretical frameworks such as key concepts from percolation and network theory, since we lack measurement technologies to quantify macropore networks at these large scales. Linked hydro-geophysical measurement techniques that produce highly spatially and temporally resolved data enable investigation of the larger-scale heterogeneities that can generate preferential flow patterns at pedon, hillslope and field scales. At larger regional and global scales, improved methods of data-mining and analyses of large datasets (machine learning) may help in parameterizing models as well as lead to new insights into the relationships between soil susceptibility to preferential flow and site attributes (climate, land uses, soil types).

  19. Fault-tolerant architecture: Evaluation methodology

    International Nuclear Information System (INIS)

    Battle, R.E.; Kisner, R.A.

    1992-08-01

    The design and reliability of four fault-tolerant architectures that may be used in nuclear power plant control systems were evaluated. Two architectures are variations of triple-modular-redundant (TMR) systems, and two are variations of dual redundant systems. The evaluation includes a review of methods of implementing fault-tolerant control, the importance of automatic recovery from failures, methods of self-testing diagnostics, block diagrams of typical fault-tolerant controllers, review of fault-tolerant controllers operating in nuclear power plants, and fault tree reliability analyses of fault-tolerant systems

  20. Identification of Proteins Involved in Salinity Tolerance in Salicornia bigelovii

    KAUST Repository

    Salazar Moya, Octavio Ruben

    2017-11-01

    With a global growing demand in food production, agricultural output must increase accordingly. An increased use of saline soils and brackish water would contribute to the required increase in world food production. Abiotic stresses, such as salinity and drought, are also major limiters of crop growth globally - most crops are relatively salt sensitive and are significantly affected when exposed to salt in the range of 50 to 200 mM NaCl. Genomic resources from plants that naturally thrive in highly saline environments have the potential to be valuable in the generation of salt tolerant crops; however, these resources have been largely unexplored. Salicornia bigelovii is a plant native to Mexico and the United States that grows in salt marshes and coastal regions. It can thrive in environments with salt concentrations higher than seawater. In contrast to most crops, S. bigelovii is able to accumulate very high concentrations (in the order of 1.5 M) of Na+ and Cl- in its photosynthetically active succulent shoots. Part of this tolerance is likely to include the storage of Na+ in the vacuoles of the shoots, making S. bigelovii a good model for understanding mechanisms of Na+ compartmentalization in the vacuoles and a good resource for gene discovery. In this research project, phenotypic, genomic, transcriptomic, and proteomic approaches have been used for the identification of candidate genes involved in salinity tolerance in S. bigelovii. The genomes and transcriptomes of three Salicornia species have been sequenced. This information has been used to support the characterization of the salt-induced transcriptome of S. bigelovii shoots and the salt-induced proteome of various organellar membrane enriched fractions from S. bigelovii shoots, which led to the creation of organellar membrane proteomes. Yeast spot assays at different salt concentrations revealed several proteins increasing or decreasing yeast salt tolerance. This work aims to create the basis for

  1. Quantifying the environmental impact of particulate deposition from dry unpaved roadways

    Energy Technology Data Exchange (ETDEWEB)

    Becker, D.L.

    1979-01-01

Airborne dust is the air pollutant most frequently observed to exceed National Ambient Air Quality Standards in rural areas. This pollutant (also referred to as suspended particulates) may originate from point sources (e.g., large areas of bare soil or pollen-producing vegetation). Most sources of atmospheric particulates, whether natural or anthropogenic, are difficult to quantify by means of a source strength (i.e., mass of particulates emitted per unit time). A numerical model was developed for calculating the source strength and quantifying the atmospheric transport and deposition of dust generated on unpaved roadways. This model satisfies the second-order differential equation for the diffusion process and also the equation of mass conservation. Input to the model includes meteorological variables, surface roughness characteristics, and the size distribution and suspended particulate concentration of dust as sampled downwind of an unpaved roadway. By using predetermined tolerance levels of airborne concentrations or of deposition, the maximum allowable vehicular traffic volume can be established. The model may also be used to estimate the reduction in photosynthesis resulting from fugitive dust from point or line sources. The contribution to sedimentation in aquatic bodies resulting from airborne particulates may also be assessed with this model.
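The paper's model is a numerical solution of the diffusion equation; as a much simpler illustration of the same kind of calculation, the classical Gaussian-plume result for an infinite crosswind line source (a common idealization of a road) gives ground-level concentration in closed form. The emission rate, wind speed and dispersion coefficients below are illustrative assumptions, not values from the study:

```python
import math

def line_source_conc(q, u, x, H=0.0):
    """Ground-level concentration (g/m^3) downwind of an infinite
    crosswind line source such as an unpaved road.

    q : emission rate per unit road length (g per m per s)
    u : wind speed perpendicular to the road (m/s)
    x : downwind distance (m)
    H : effective source height (m)

    sigma_z uses an illustrative power-law fit, not a calibrated one.
    """
    sigma_z = 0.1 * x ** 0.85          # assumed vertical dispersion growth
    return (2.0 * q / (math.sqrt(2.0 * math.pi) * sigma_z * u)
            * math.exp(-H ** 2 / (2.0 * sigma_z ** 2)))

# Hypothetical case: 0.05 g/(m*s) dust emission, 3 m/s crosswind.
c_20m = line_source_conc(0.05, 3.0, 20.0)
c_100m = line_source_conc(0.05, 3.0, 100.0)
```

Concentration falls off with downwind distance as the plume deepens, which is the behavior a traffic-volume tolerance calculation would exploit.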

  2. Emission metrics for quantifying regional climate impacts of aviation

    Directory of Open Access Journals (Sweden)

    M. T. Lund

    2017-07-01

Full Text Available This study examines the impacts of emissions from aviation in six source regions on global and regional temperatures. We consider the NOx-induced impacts on ozone and methane, aerosols and contrail-cirrus formation and calculate the global and regional emission metrics: global warming potential (GWP), global temperature change potential (GTP) and absolute regional temperature change potential (ARTP). The GWPs and GTPs vary by a factor of 2–4 between source regions. We find the highest aviation aerosol metric values for South Asian emissions, while contrail-cirrus metrics are higher for Europe and North America, where contrail formation is prevalent, and South America plus Africa, where the optical depth is large once contrails form. The ARTP values illustrate important differences in the latitudinal patterns of radiative forcing (RF) and temperature response: the temperature response in a given latitude band can be considerably stronger than suggested by the RF in that band, emphasizing the importance of large-scale circulation impacts. To place our metrics in context, we quantify temperature change in four broad latitude bands following 1 year of emissions from present-day aviation, including CO2. Aviation over North America and Europe causes the largest net warming impact in all latitude bands, reflecting the higher air traffic activity in these regions. Contrail cirrus gives the largest warming contribution in the short term but remains important, at about 15 % of the CO2 impact in several regions, even after 100 years. Our results also illustrate both the short- and long-term impacts of CO2: while CO2 becomes dominant on longer timescales, it already gives a notable warming contribution 20 years after the emission. Our emission metrics can be further used to estimate regional temperature change under alternative aviation emission scenarios. A first evaluation of the ARTP in the context of aviation suggests that further work to account
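A GWP over horizon H is the ratio of the absolute GWP (the time-integrated radiative forcing of a 1 kg pulse) of the species to that of CO2. For a species whose forcing decays exponentially, the integral is analytic: A·τ·(1 − e^(−H/τ)). The sketch below uses illustrative numbers, not the paper's metric values:

```python
import math

def agwp_exponential(A, tau, H):
    """Absolute GWP (W m^-2 yr per kg) for a species whose radiative
    forcing from a 1 kg pulse decays as A * exp(-t / tau)."""
    return A * tau * (1.0 - math.exp(-H / tau))

def gwp(A, tau, H, agwp_co2):
    """GWP over horizon H years, relative to a supplied CO2 AGWP."""
    return agwp_exponential(A, tau, H) / agwp_co2

# Illustrative numbers only (assumed, not from the paper): a
# short-lived forcer with a 12-year lifetime.
A = 2.1e-13              # W m^-2 per kg (assumed radiative efficiency)
TAU = 12.0               # years (assumed perturbation lifetime)
AGWP_CO2_100 = 9.2e-14   # W m^-2 yr per kg over 100 yr (assumed)
AGWP_CO2_20 = AGWP_CO2_100 * 0.27   # assumed 20-yr CO2 AGWP

gwp100 = gwp(A, TAU, 100.0, AGWP_CO2_100)
gwp20 = gwp(A, TAU, 20.0, AGWP_CO2_20)
```

The pattern this reproduces is the one discussed in the abstract: for short-lived forcers such as contrail cirrus, the 20-year metric exceeds the 100-year one, because the species' forcing is fully integrated early while CO2's keeps accumulating.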

  3. Quantifying Risk in Epidemiological and Ecological Contexts

    OpenAIRE

    Sellman, Stefan

    2018-01-01

The rates of globalization and the growth of the human population put ever-increasing pressure on the agricultural sector to intensify and grow more complex, and with this intensification comes an increased risk of outbreaks of infectious livestock diseases. At the same time, and for the same reasons, the detrimental effect that humans have on other species with which we share the environment has never been more apparent, as the current rates of species loss from ecological communities rival tho...

  4. IRON-TOLERANT CYANOBACTERIA: IMPLICATIONS FOR ASTROBIOLOGY

    Science.gov (United States)

    Brown, Igor I.; Allen, Carlton C.; Mummey, Daniel L.; Sarkisova, Svetlana A.; McKay, David S.

    2006-01-01

The review is dedicated to a new group of extremophiles: iron-tolerant cyanobacteria. The authors have analyzed earlier published articles about the ecology of iron-tolerant cyanobacteria and their diversity, and conclude that contemporary iron-depositing hot springs may be considered partial analogs of the Precambrian environment. They also conclude that the diversity of iron-tolerant cyanobacteria is understudied. The authors further analyzed published data on the physiological peculiarities of iron-tolerant cyanobacteria, concluding that these organisms may oxidize reduced iron through the cyanobacterial photosystem; the involvement of both Reaction Centers 1 and 2 is discussed. The proposal that iron-tolerant protocyanobacteria could have been involved in the generation of banded iron formations is also put forward, and a possible mechanism for the transition from anoxygenic to oxygenic photosynthesis is discussed. In the final part of the review, the authors consider the possible implications of iron-tolerant cyanobacteria for astrobiology.

  5. Quantifying climate risk - the starting point

    International Nuclear Information System (INIS)

    Fairweather, Helen; Luo, Qunying; Liu, De Li; Wiles, Perry

    2007-01-01

Full text: All natural systems have evolved to their current state as a result, inter alia, of the climate in which they developed. Similarly, man-made systems (such as agricultural production) have developed to suit the climate experienced over the last 100 or so years. The capacity of different systems to adapt to changes in climate outside those experienced previously is largely unknown. This results in considerable uncertainty when predicting climate change impacts. However, it is possible to quantify the relative probabilities of a range of potential impacts of climate change. Quantifying current climate risks is an effective starting point for analysing the probable impacts of future climate change and guiding the selection of appropriate adaptation strategies. For a farming system to be viable within the current climate, its profitability must be sustained, and therefore possible adaptation strategies need to be tested for continued viability in a changed climate. The methodology outlined in this paper examines historical patterns of key climate variables (rainfall and temperature) across the season and their influence on the productivity of wheat growing in NSW. This analysis is used to identify the time of year at which the system is most vulnerable to climate variation, within the constraints of the current climate. Wheat yield is used as a measure of productivity, which is also assumed to be a surrogate for profitability. A time series of wheat yields is sorted into ascending order and categorised into five percentile groupings (bounded by the 20th, 40th, 60th and 80th percentiles) for each shire across NSW (~100 years). Five time series of climate data (aggregated daily data from the years in each percentile grouping) are analysed to determine the period that presents the greatest climate risk to the production system. Once this period has been determined, the risk is quantified in terms of the degree of separation of the time series
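The percentile-grouping step can be sketched directly: sort the yield series, split the years into fifths, then pool the climate data for the years in each group. A minimal sketch with hypothetical data (the yield values and years are illustrative):

```python
def percentile_groups(yields, years):
    """Sort yearly wheat yields and split the years into five
    percentile groups (<=20th, 20-40th, 40-60th, 60-80th, >80th)."""
    order = sorted(range(len(yields)), key=lambda i: yields[i])
    n = len(order)
    groups = [[] for _ in range(5)]
    for rank, i in enumerate(order):
        g = min(5 * rank // n, 4)   # which fifth of the sorted list
        groups[g].append(years[i])
    return groups

# Hypothetical 10-year yield series (t/ha) for one shire.
years = list(range(1991, 2001))
yields = [2.1, 3.4, 1.8, 2.9, 3.8, 2.5, 1.5, 3.1, 2.2, 2.7]
groups = percentile_groups(yields, years)
# groups[0] holds the lowest-yield years, groups[4] the highest.
```

The daily climate records for the years in each group would then be aggregated into the five climate time series the abstract describes.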

  6. How to quantify conduits in wood?

    Science.gov (United States)

    Scholz, Alexander; Klepsch, Matthias; Karimi, Zohreh; Jansen, Steven

    2013-01-01

    Vessels and tracheids represent the most important xylem cells with respect to long distance water transport in plants. Wood anatomical studies frequently provide several quantitative details of these cells, such as vessel diameter, vessel density, vessel element length, and tracheid length, while important information on the three dimensional structure of the hydraulic network is not considered. This paper aims to provide an overview of various techniques, although there is no standard protocol to quantify conduits due to high anatomical variation and a wide range of techniques available. Despite recent progress in image analysis programs and automated methods for measuring cell dimensions, density, and spatial distribution, various characters remain time-consuming and tedious. Quantification of vessels and tracheids is not only important to better understand functional adaptations of tracheary elements to environment parameters, but will also be essential for linking wood anatomy with other fields such as wood development, xylem physiology, palaeobotany, and dendrochronology.
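One widely used quantitative summary of vessel measurements is the hydraulically weighted mean diameter, which weights each vessel by its theoretical Hagen–Poiseuille conductivity (proportional to d⁴) so that wide vessels dominate, as they do for actual water transport. A minimal sketch with hypothetical measurements:

```python
def hydraulic_diameter(diameters):
    """Hydraulically weighted mean vessel diameter (same units as the
    input): D_h = sum(d^5) / sum(d^4). Each vessel is weighted by its
    Hagen-Poiseuille conductivity (~ d^4), so wide vessels dominate."""
    return sum(d ** 5 for d in diameters) / sum(d ** 4 for d in diameters)

# Hypothetical vessel diameters (micrometres) from one cross section.
ds = [20.0, 25.0, 30.0, 60.0]
dh = hydraulic_diameter(ds)
arithmetic_mean = sum(ds) / len(ds)
```

Because of the d⁴ weighting, D_h sits well above the arithmetic mean whenever a few wide vessels are present, which is why the two summaries are reported separately in wood anatomical studies.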

  7. Towards Quantifying a Wider Reality: Shannon Exonerata

    Directory of Open Access Journals (Sweden)

    Robert E. Ulanowicz

    2011-10-01

    Full Text Available In 1872 Ludwig von Boltzmann derived a statistical formula to represent the entropy (an apophasis of a highly simplistic system. In 1948 Claude Shannon independently formulated the same expression to capture the positivist essence of information. Such contradictory thrusts engendered decades of ambiguity concerning exactly what is conveyed by the expression. Resolution of widespread confusion is possible by invoking the third law of thermodynamics, which requires that entropy be treated in a relativistic fashion. Doing so parses the Boltzmann expression into separate terms that segregate apophatic entropy from positivist information. Possibly more importantly, the decomposition itself portrays a dialectic-like agonism between constraint and disorder that may provide a more appropriate description of the behavior of living systems than is possible using conventional dynamics. By quantifying the apophatic side of evolution, the Shannon approach to information achieves what no other treatment of the subject affords: It opens the window on a more encompassing perception of reality.
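The decomposition described can be made concrete: for any joint distribution, the total Shannon entropy splits exactly into the mutual information (the "constraint" term) plus the two conditional entropies (the residual "disorder"): H(X,Y) = I(X;Y) + H(X|Y) + H(Y|X). A small sketch, with an arbitrary example distribution:

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a probability list, skipping zeros."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def decompose(joint):
    """Split joint Shannon entropy H(X,Y) into mutual information
    I(X;Y) (constraint) plus the two conditional entropies
    (residual disorder): H(X,Y) = I + H(X|Y) + H(Y|X)."""
    px = [sum(row) for row in joint]             # marginal of X
    py = [sum(col) for col in zip(*joint)]       # marginal of Y
    flat = [p for row in joint for p in row]
    H_xy = entropy(flat)
    I = entropy(px) + entropy(py) - H_xy
    H_x_given_y = H_xy - entropy(py)
    H_y_given_x = H_xy - entropy(px)
    return H_xy, I, H_x_given_y, H_y_given_x

# An arbitrary joint distribution p(x, y) with some dependence.
joint = [[0.30, 0.10],
         [0.05, 0.55]]
H_xy, I, H_x_y, H_y_x = decompose(joint)
```

The identity holds to rounding error for any joint distribution; the interesting question, in Ulanowicz's framing, is the balance between the constraint term I and the residual terms.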

  8. Message passing for quantified Boolean formulas

    International Nuclear Information System (INIS)

    Zhang, Pan; Ramezanpour, Abolfazl; Zecchina, Riccardo; Zdeborová, Lenka

    2012-01-01

    We introduce two types of message passing algorithms for quantified Boolean formulas (QBF). The first type is a message passing based heuristics that can prove unsatisfiability of the QBF by assigning the universal variables in such a way that the remaining formula is unsatisfiable. In the second type, we use message passing to guide branching heuristics of a Davis–Putnam–Logemann–Loveland (DPLL) complete solver. Numerical experiments show that on random QBFs our branching heuristics give robust exponential efficiency gain with respect to state-of-the-art solvers. We also manage to solve some previously unsolved benchmarks from the QBFLIB library. Apart from this, our study sheds light on using message passing in small systems and as subroutines in complete solvers
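For context, the DPLL backbone that such branching heuristics plug into is short enough to sketch. This minimal solver uses naive unit propagation and a first-unassigned branching rule (the slot where a message-passing heuristic would instead rank variables); the clause encoding follows DIMACS conventions:

```python
def dpll(clauses, assignment=None):
    """Minimal DPLL SAT solver. Clauses are lists of nonzero ints
    (DIMACS style: v means variable v is true, -v means false).
    Returns a satisfying assignment dict {var: bool} or None."""
    if assignment is None:
        assignment = {}
    # Unit propagation: repeatedly satisfy clauses with one free literal.
    changed = True
    while changed:
        changed = False
        for c in clauses:
            if any(assignment.get(abs(l)) == (l > 0) for l in c):
                continue                      # clause already satisfied
            unassigned = [l for l in c if abs(l) not in assignment]
            if not unassigned:
                return None                   # clause falsified: conflict
            if len(unassigned) == 1:
                l = unassigned[0]
                assignment[abs(l)] = l > 0    # forced assignment
                changed = True
    # Branch on the first unassigned variable (naive heuristic).
    for c in clauses:
        for l in c:
            if abs(l) not in assignment:
                for value in (True, False):
                    trial = dict(assignment)
                    trial[abs(l)] = value
                    result = dpll(clauses, trial)
                    if result is not None:
                        return result
                return None                   # both branches failed
    return assignment                         # all clauses satisfied

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
model = dpll([[1, 2], [-1, 3], [-2, -3]])
```

A message-passing guide would replace only the "first unassigned variable" line, ordering candidate variables by marginals estimated from the formula's factor graph.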

  9. Quantifying decoherence in continuous variable systems

    Energy Technology Data Exchange (ETDEWEB)

    Serafini, A [Dipartimento di Fisica 'ER Caianiello', Universita di Salerno, INFM UdR Salerno, INFN Sezione Napoli, Gruppo Collegato Salerno, Via S Allende, 84081 Baronissi, SA (Italy); Paris, M G A [Dipartimento di Fisica and INFM, Universita di Milano, Milan (Italy); Illuminati, F [Dipartimento di Fisica 'ER Caianiello', Universita di Salerno, INFM UdR Salerno, INFN Sezione Napoli, Gruppo Collegato Salerno, Via S Allende, 84081 Baronissi, SA (Italy); De Siena, S [Dipartimento di Fisica 'ER Caianiello', Universita di Salerno, INFM UdR Salerno, INFN Sezione Napoli, Gruppo Collegato Salerno, Via S Allende, 84081 Baronissi, SA (Italy)

    2005-04-01

    We present a detailed report on the decoherence of quantum states of continuous variable systems under the action of a quantum optical master equation resulting from the interaction with general Gaussian uncorrelated environments. The rate of decoherence is quantified by relating it to the decay rates of various, complementary measures of the quantum nature of a state, such as the purity, some non-classicality indicators in phase space, and, for two-mode states, entanglement measures and total correlations between the modes. Different sets of physically relevant initial configurations are considered, including one- and two-mode Gaussian states, number states, and coherent superpositions. Our analysis shows that, generally, the use of initially squeezed configurations does not help to preserve the coherence of Gaussian states, whereas it can be effective in protecting coherent superpositions of both number states and Gaussian wavepackets. (review article)

  10. Quantifying decoherence in continuous variable systems

    International Nuclear Information System (INIS)

    Serafini, A; Paris, M G A; Illuminati, F; De Siena, S

    2005-01-01

    We present a detailed report on the decoherence of quantum states of continuous variable systems under the action of a quantum optical master equation resulting from the interaction with general Gaussian uncorrelated environments. The rate of decoherence is quantified by relating it to the decay rates of various, complementary measures of the quantum nature of a state, such as the purity, some non-classicality indicators in phase space, and, for two-mode states, entanglement measures and total correlations between the modes. Different sets of physically relevant initial configurations are considered, including one- and two-mode Gaussian states, number states, and coherent superpositions. Our analysis shows that, generally, the use of initially squeezed configurations does not help to preserve the coherence of Gaussian states, whereas it can be effective in protecting coherent superpositions of both number states and Gaussian wavepackets. (review article)

  11. Crowdsourcing for quantifying transcripts: An exploratory study.

    Science.gov (United States)

    Azzam, Tarek; Harman, Elena

    2016-02-01

    This exploratory study attempts to demonstrate the potential utility of crowdsourcing as a supplemental technique for quantifying transcribed interviews. Crowdsourcing is the harnessing of the abilities of many people to complete a specific task or a set of tasks. In this study multiple samples of crowdsourced individuals were asked to rate and select supporting quotes from two different transcripts. The findings indicate that the different crowdsourced samples produced nearly identical ratings of the transcripts, and were able to consistently select the same supporting text from the transcripts. These findings suggest that crowdsourcing, with further development, can potentially be used as a mixed method tool to offer a supplemental perspective on transcribed interviews. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Animal biometrics: quantifying and detecting phenotypic appearance.

    Science.gov (United States)

    Kühl, Hjalmar S; Burghardt, Tilo

    2013-07-01

    Animal biometrics is an emerging field that develops quantified approaches for representing and detecting the phenotypic appearance of species, individuals, behaviors, and morphological traits. It operates at the intersection between pattern recognition, ecology, and information sciences, producing computerized systems for phenotypic measurement and interpretation. Animal biometrics can benefit a wide range of disciplines, including biogeography, population ecology, and behavioral research. Currently, real-world applications are gaining momentum, augmenting the quantity and quality of ecological data collection and processing. However, to advance animal biometrics will require integration of methodologies among the scientific disciplines involved. Such efforts will be worthwhile because the great potential of this approach rests with the formal abstraction of phenomics, to create tractable interfaces between different organizational levels of life. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Quantifying capital goods for waste incineration

    DEFF Research Database (Denmark)

    Brogaard, Line Kai-Sørensen; Riber, C.; Christensen, Thomas Højlund

    2013-01-01

    Materials and energy used for the construction of modern waste incineration plants were quantified. The data was collected from five incineration plants (72,000–240,000 tonnes per year) built in Scandinavia (Norway, Finland and Denmark) between 2006 and 2012. Concrete for the buildings was the main material used, amounting to 19,000–26,000 tonnes per plant. The quantification further included six main materials, electronic systems, cables and all transportation. The energy used for the actual on-site construction of the incinerators was in the range 4000–5000 MWh. In terms of the environmental burden of producing the materials used in the construction, the assessment showed that, compared to data reported in the literature on direct emissions from the operation of incinerators, the environmental impacts caused by the construction of buildings and machinery (capital goods) could amount to 2–3% with respect to kg CO2 per tonne of waste combusted.

  14. Pendulum Underwater - An Approach for Quantifying Viscosity

    Science.gov (United States)

    Leme, José Costa; Oliveira, Agostinho

    2017-12-01

    The purpose of the experiment presented in this paper is to quantify the viscosity of a liquid. Viscous effects are important in the flow of fluids in pipes, in the bloodstream, in the lubrication of engine parts, and in many other situations. In the present paper, the authors explore the oscillations of a physical pendulum in the form of a long and lightweight wire that carries a ball at its lower end, which is totally immersed in water, so as to determine the water viscosity. The system used represents a viscous damped pendulum and we tried different theoretical models to describe it. The experimental part of the present paper is based on a very simple and low-cost image capturing apparatus that can easily be replicated in a physics classroom. Data on the pendulum's amplitude as a function of time were acquired using digital video analysis with the open source software Tracker.
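The analysis described above can be sketched numerically. The snippet below is a minimal illustration, not the authors' actual procedure: it assumes the simplest candidate model for the immersed ball, linear (Stokes) drag, under which the oscillation amplitude decays exponentially and the fitted damping rate fixes the viscosity. The function name and all numbers are hypothetical.

```python
import numpy as np

def estimate_viscosity(times, amplitudes, m, r):
    """Estimate fluid viscosity from the decaying amplitude of a
    pendulum whose spherical bob (mass m [kg], radius r [m]) is
    immersed in the fluid.

    Assumes linear Stokes drag, F = -6*pi*eta*r*v, which gives
    exponential amplitude decay A(t) = A0*exp(-gamma*t) with
    gamma = 3*pi*eta*r/m.  Valid only at low Reynolds number.
    """
    # Log-linear least-squares fit: ln A = ln A0 - gamma*t
    slope, _intercept = np.polyfit(times, np.log(amplitudes), 1)
    gamma = -slope
    eta = gamma * m / (3.0 * np.pi * r)  # invert gamma = 3*pi*eta*r/m
    return eta, gamma

# Synthetic check: amplitudes from video tracking of a 20 g, 1 cm ball
t = np.linspace(0.0, 60.0, 30)
A = 0.1 * np.exp(-0.05 * t)            # true damping rate 0.05 s^-1
eta, gamma = estimate_viscosity(t, A, m=0.02, r=0.01)
```

In practice the amplitudes extracted with Tracker would be noisy, and the paper compares several damping models rather than assuming Stokes drag outright.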

  15. Quantifying gait patterns in Parkinson's disease

    Science.gov (United States)

    Romero, Mónica; Atehortúa, Angélica; Romero, Eduardo

    2017-11-01

    Parkinson's disease (PD) is characterized by a set of motor symptoms, namely tremor, rigidity, and bradykinesia, which are usually described but not quantified. This work proposes an objective characterization of PD gait patterns by approximating the single stance phase as a single grounded pendulum. This model estimates the force generated by the gait during single support from gait data. This force describes the motion pattern for different stages of the disease. The model was validated using recorded videos of 8 young control subjects, 10 old control subjects and 10 subjects with Parkinson's disease at different stages. The estimated force showed differences among stages of Parkinson's disease, with a decrease in the estimated force for the advanced stages of the illness.

  16. Quantifying brain microstructure with diffusion MRI

    DEFF Research Database (Denmark)

    Novikov, Dmitry S.; Jespersen, Sune N.; Kiselev, Valerij G.

    2016-01-01

    …the potential to quantify the relevant length scales for neuronal tissue, such as the packing correlation length for neuronal fibers, the degree of neuronal beading, and compartment sizes. The second avenue corresponds to the long-time limit, when the observed signal can be approximated as a sum of multiple non-exchanging anisotropic Gaussian components. Here the challenge lies in parameter estimation and in resolving its hidden degeneracies. The third avenue employs multiple diffusion encoding techniques, able to access information not contained in the conventional diffusion propagator. We conclude with our outlook on the future research directions which can open exciting possibilities for developing markers of pathology and development based on methods of studying mesoscopic transport in disordered systems.

  17. Quantifying Temporal Genomic Erosion in Endangered Species.

    Science.gov (United States)

    Díez-Del-Molino, David; Sánchez-Barreiro, Fatima; Barnes, Ian; Gilbert, M Thomas P; Dalén, Love

    2018-03-01

    Many species have undergone dramatic population size declines over the past centuries. Although stochastic genetic processes during and after such declines are thought to elevate the risk of extinction, comparative analyses of genomic data from several endangered species suggest little concordance between genome-wide diversity and current population sizes. This is likely because species-specific life-history traits and ancient bottlenecks overshadow the genetic effect of recent demographic declines. Therefore, we advocate that temporal sampling of genomic data provides a more accurate approach to quantify genetic threats in endangered species. Specifically, genomic data from predecline museum specimens will provide valuable baseline data that enable accurate estimation of recent decreases in genome-wide diversity, increases in inbreeding levels, and accumulation of deleterious genetic variation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Extraction of quantifiable information from complex systems

    CERN Document Server

    Dahmen, Wolfgang; Griebel, Michael; Hackbusch, Wolfgang; Ritter, Klaus; Schneider, Reinhold; Schwab, Christoph; Yserentant, Harry

    2014-01-01

    In April 2007, the Deutsche Forschungsgemeinschaft (DFG) approved the Priority Program 1324 “Mathematical Methods for Extracting Quantifiable Information from Complex Systems.” This volume presents a comprehensive overview of the most important results obtained over the course of the program. Mathematical models of complex systems provide the foundation for further technological developments in science, engineering and computational finance. Motivated by the trend toward steadily increasing computer power, ever more realistic models have been developed in recent years. These models have also become increasingly complex, and their numerical treatment poses serious challenges. Recent developments in mathematics suggest that, in the long run, much more powerful numerical solution strategies could be derived if the interconnections between the different fields of research were systematically exploited at a conceptual level. Accordingly, a deeper understanding of the mathematical foundations as w…

  19. Quantifying the evolution of individual scientific impact.

    Science.gov (United States)

    Sinatra, Roberta; Wang, Dashun; Deville, Pierre; Song, Chaoming; Barabási, Albert-László

    2016-11-04

    Despite the frequent use of numerous quantitative indicators to gauge the professional impact of a scientist, little is known about how scientific impact emerges and evolves in time. Here, we quantify the changes in impact and productivity throughout a career in science, finding that impact, as measured by influential publications, is distributed randomly within a scientist's sequence of publications. This random-impact rule allows us to formulate a stochastic model that uncouples the effects of productivity, individual ability, and luck and unveils the existence of universal patterns governing the emergence of scientific success. The model assigns a unique individual parameter Q to each scientist, which is stable during a career, and it accurately predicts the evolution of a scientist's impact, from the h-index to cumulative citations, and independent recognitions, such as prizes. Copyright © 2016, American Association for the Advancement of Science.

  20. Quantifying creativity: can measures span the spectrum?

    Science.gov (United States)

    Simonton, Dean Keith

    2012-03-01

    Because cognitive neuroscientists have become increasingly interested in the phenomenon of creativity, the issue arises of how creativity is to be optimally measured. Unlike intelligence, which can be assessed across the full range of intellectual ability, creativity measures tend to concentrate on different sections of the overall spectrum. After first defining creativity in terms of the three criteria of novelty, usefulness, and surprise, this article provides an overview of the available measures. Not only do these instruments vary according to whether they focus on the creative process, person, or product, but they differ regarding whether they tap into "little-c" versus "Big-C" creativity; only productivity and eminence measures reach into genius-level manifestations of the phenomenon. The article closes by discussing whether various alternative assessment techniques can be integrated into a single measure that quantifies creativity across the full spectrum.

  1. Quantifying capital goods for waste incineration

    International Nuclear Information System (INIS)

    Brogaard, L.K.; Riber, C.; Christensen, T.H.

    2013-01-01

    Highlights: • Materials and energy used for the construction of waste incinerators were quantified. • The data was collected from five incineration plants in Scandinavia. • Included were six main materials, electronic systems, cables and all transportation. • The capital goods contributed 2–3% compared to the direct emissions impact on GW. - Abstract: Materials and energy used for the construction of modern waste incineration plants were quantified. The data was collected from five incineration plants (72,000–240,000 tonnes per year) built in Scandinavia (Norway, Finland and Denmark) between 2006 and 2012. Concrete for the buildings was the main material used, amounting to 19,000–26,000 tonnes per plant. The quantification further included six main materials, electronic systems, cables and all transportation. The energy used for the actual on-site construction of the incinerators was in the range 4000–5000 MWh. In terms of the environmental burden of producing the materials used in the construction, steel for the building and the machinery contributed the most. The material and energy used for the construction corresponded to the emission of 7–14 kg CO2 per tonne of waste combusted throughout the lifetime of the incineration plant. The assessment showed that, compared to data reported in the literature on direct emissions from the operation of incinerators, the environmental impacts caused by the construction of buildings and machinery (capital goods) could amount to 2–3% with respect to kg CO2 per tonne of waste combusted.

  2. Quantifying structural states of soft mudrocks

    Science.gov (United States)

    Li, B.; Wong, R. C. K.

    2016-05-01

    In this paper, a cm model is proposed to quantify structural states of soft mudrocks, which are dependent on clay fractions and porosities. Physical properties of natural and reconstituted soft mudrock samples are used to derive two parameters in the cm model. With the cm model, a simplified homogenization approach is proposed to estimate geomechanical properties and fabric orientation distributions of soft mudrocks based on the mixture theory. Soft mudrocks are treated as a mixture of nonclay minerals and clay-water composites. Nonclay minerals have a high stiffness and serve as a structural framework of mudrocks when they have a high volume fraction. Clay-water composites occupy the void space among nonclay minerals and serve as an in-fill matrix. With the increase of volume fraction of clay-water composites, there is a transition in the structural state from the framework-supported state to the matrix-supported state. The decreases in shear strength and pore size, as well as the increases in compressibility and fabric anisotropy, are quantitatively related to this transition. The new homogenization approach based on the proposed cm model yields better performance evaluation than common effective medium modeling approaches because the interactions among nonclay minerals and clay-water composites are considered. With wireline logging data, the cm model is applied to quantify the structural states of Colorado shale formations at different depths in the Cold Lake area, Alberta, Canada. Key geomechanical parameters are estimated based on the proposed homogenization approach, and the critical intervals with low-strength shale formations are identified.

  3. Global safety

    Directory of Open Access Journals (Sweden)

    Dorien J. DeTombe

    2010-08-01

    Full Text Available Global Safety is a container concept referring to various threats, such as HIV/AIDS, floods and terrorism; threats with different causes and different effects. These dangers threaten people, the global economy and the stability of states. Policy making for this kind of threat often lacks an overview of the real causes, and interventions are based on too shallow an analysis of the problem, are mono-disciplinary, and focus mostly on the effects. It would be more appropriate to develop policy on these issues by utilizing the approaches, methods and tools that have been developed for complex societal problems. Handling these complex societal problems should be done in a multidisciplinary rather than a mono-disciplinary way. In order to give politicians the opportunity to handle complex problems in a multidisciplinary manner, multidisciplinary research institutes should be created. These institutes would provide politicians with better approaches to handle this type of problem. In these institutes, the knowledge necessary for changing these problems can be created through the use of the Compram methodology, which has been developed specifically for handling complex societal problems. In a six-step approach, experts, actors and policymakers discuss the content of the problem and the possible changes. The framework method uses interviewing, the Group Decision Room, simulation models and scenarios in a cooperative way. The methodology emphasizes the exchange of knowledge and understanding by communication among and between the experts, actors and politicians, meanwhile keeping emotion in mind. The Compram methodology will be further explained in relation to global safety in regard to terrorism, economy, health care and agriculture.

  4. Understanding and quantifying focused, indirect groundwater recharge from ephemeral streams using water table fluctuations

    Science.gov (United States)

    Cuthbert, M. O.; Acworth, R. I.; Andersen, M. S.; Larsen, J. R.; McCallum, A. M.; Rau, G. C.; Tellam, J. H.

    2016-02-01

    Understanding and managing groundwater resources in drylands is a challenging task, but one that is globally important. The dominant process for dryland groundwater recharge is thought to be as focused, indirect recharge from ephemeral stream losses. However, there is a global paucity of data for understanding and quantifying this process and transferable techniques for quantifying groundwater recharge in such contexts are lacking. Here we develop a generalized conceptual model for understanding water table and groundwater head fluctuations due to recharge from episodic events within ephemeral streams. By accounting for the recession characteristics of a groundwater hydrograph, we present a simple but powerful new water table fluctuation approach to quantify focused, indirect recharge over both long term and event time scales. The technique is demonstrated using a new, and globally unparalleled, set of groundwater observations from an ephemeral stream catchment located in NSW, Australia. We find that, following episodic streamflow events down a predominantly dry channel system, groundwater head fluctuations are controlled by pressure redistribution operating at three time scales from vertical flow (days to weeks), transverse flow perpendicular to the stream (weeks to months), and longitudinal flow parallel to the stream (years to decades). In relative terms, indirect recharge decreases almost linearly away from the mountain front, both in discrete monitored events as well as in the long-term average. In absolute terms, the estimated indirect recharge varies from 80 to 30 mm/a with the main uncertainty in these values stemming from uncertainty in the catchment-scale hydraulic properties.
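The recession-corrected water table fluctuation idea described above can be sketched in a few lines. This is a simplified illustration under strong assumptions (a constant antecedent recession rate and a known specific yield), not the authors' generalized method; the function name and all numbers are hypothetical.

```python
import numpy as np

def wtf_recharge(heads, recession_rate, specific_yield, dt=1.0):
    """Event-scale water table fluctuation (WTF) recharge estimate.

    heads: groundwater head time series (m), sampled every dt days.
    recession_rate: antecedent head decline rate (m/day), used to
    extrapolate the pre-event recession beneath the event response.
    Recharge R = Sy * (peak rise above the extrapolated recession).
    """
    heads = np.asarray(heads, dtype=float)
    t = np.arange(len(heads)) * dt
    # Extrapolate the antecedent recession from the first sample
    extrapolated = heads[0] - recession_rate * t
    rise = np.max(heads - extrapolated)   # peak rise above recession
    return specific_yield * rise

# Hypothetical event: head recedes at 0.01 m/day, then a streamflow
# event raises it 0.5 m above the extrapolated recession line
h = [10.0, 9.99, 10.48, 10.40, 10.30]
recharge = wtf_recharge(h, recession_rate=0.01, specific_yield=0.05)
```

The key design choice, as in the abstract, is accounting for the recession the hydrograph would have followed without the event; ignoring it would understate the head rise and hence the recharge.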

  5. Downy mildew intensity in tolerant grapes varieties in highlands of southern Brazil

    Directory of Open Access Journals (Sweden)

    de Bem Betina

    2016-01-01

    Full Text Available The aim of this study was to evaluate the different degrees of tolerance to infection by P. viticola among three genotypes with constitutive resistance, in comparison to susceptible Vitis vinifera varieties. For this purpose, two experiments were conducted at the EPAGRI Experimental Station, located in the city of São Joaquim, Santa Catarina State, during the 2015/16 cycle. In the first experiment, in the field, incidence and severity were quantified, and downy mildew intensity was compared by means of epidemiological variables on the tolerant varieties Bronner, Regent and Cabernet Cortis and the susceptible Sangiovese. In the second experiment, forty leaf discs from the same tolerant genotypes and the susceptible variety Chardonnay were artificially infected with a P. viticola sporangia suspension, and after seven days of incubation the discs were examined and the degree of infection was estimated based on the intensity of sporangiophore formation. Sangiovese showed the highest downy mildew intensity in comparison to the tolerant varieties Cabernet Carbon, Regent and Bronner. Under controlled conditions, the susceptible variety Chardonnay showed higher sporangiophore formation on leaf discs in comparison to the tolerant varieties. All the downy mildew tolerant varieties evaluated showed lower disease development in comparison with the V. vinifera varieties.

  6. The Heat Shock Protein 26 Gene is Required for Ethanol Tolerance in Drosophila

    Directory of Open Access Journals (Sweden)

    Awoyemi A. Awofala

    2011-01-01

    Full Text Available Stress plays an important role in drug- and addiction-related behaviours. However, the mechanisms underlying these behavioural responses are still poorly understood. In the light of recent reports showing consistent regulation of many genes encoding stress proteins, including heat shock proteins, following ethanol exposure in Drosophila, it was hypothesised that the transition to alcohol dependence may involve dysregulation of the circuits that mediate behavioural responses to stressors. Thus, behavioural genetic methodologies were used to investigate the role of the Drosophila hsp26 gene, a small heat shock protein gene induced in response to various stresses, in the development of rapid tolerance to ethanol sedation. Rapid tolerance was quantified as the percentage difference in the mean sedation times between the second and first ethanol exposures. Two independently isolated P-element mutations near the hsp26 gene eliminated the capacity for tolerance. In addition, RNAi-mediated functional knockdown of hsp26 expression in the glial cells and the whole nervous system also caused a defect in tolerance development. The rapid tolerance phenotype of the hsp26 mutants was rescued by expression of the wild-type hsp26 gene in the nervous system. None of these manipulations of the hsp26 gene caused changes in the rate of ethanol absorption. Hsp26 genes are evolutionarily conserved; thus the role of hsp26 in ethanol tolerance may present a new direction for research into alcohol dependency.
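The tolerance metric stated in the abstract is simple enough to express as code. The sketch below just implements that stated definition (percentage difference in mean sedation times between second and first exposure); the sedation times are hypothetical.

```python
def rapid_tolerance(first_st, second_st):
    """Rapid ethanol tolerance, per the abstract's definition:
    the percentage difference in mean sedation time (ST) between
    the second and first ethanol exposure,
    100 * (mean_second - mean_first) / mean_first."""
    m1 = sum(first_st) / len(first_st)
    m2 = sum(second_st) / len(second_st)
    return 100.0 * (m2 - m1) / m1

# Hypothetical sedation times (seconds) for one genotype
pct = rapid_tolerance([10.0, 12.0, 14.0], [15.0, 18.0, 21.0])
print(pct)  # → 50.0
```

A positive value means flies stayed active longer on the second exposure, i.e. tolerance developed; the hsp26 mutants described above would score near zero.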

  7. Global ambitions

    International Nuclear Information System (INIS)

    Scruton, M.

    1996-01-01

    The article discusses global ambitions concerning the Norwegian petroleum industry. With the advent of the NORSOK (Forum for development and operation) cost reduction programme and a specific focus on key sectors of the market, the Norwegian oil industry is beginning to market its considerable technological achievements internationally. Obviously, the good fortune of having tested this technology in a very demanding domestic arena means that Norwegian offshore support companies, having succeeded at home, are perfectly poised to export their expertise to the international sector. Drawing on the traditional strengths of the country's maritime heritage, with mobile rig and specialized vessel business featuring strongly, other key technologies have been developed. 5 figs., 1 tab

  8. The failure-tolerant leader.

    Science.gov (United States)

    Farson, Richard; Keyes, Ralph

    2002-08-01

    "The fastest way to succeed," IBM's Thomas Watson, Sr., once said, "is to double your failure rate." In recent years, more and more executives have embraced Watson's point of view, coming to understand what innovators have always known: Failure is a prerequisite to invention. But while companies may grasp the value of making mistakes at the level of corporate practices, they have a harder time accepting the idea at the personal level. People are afraid to fail, and corporate culture reinforces that fear. In this article, psychologist and former Harvard Business School professor Richard Farson and coauthor Ralph Keyes discuss how companies can reduce the fear of miscues. What's crucial is the presence of failure-tolerant leaders--executives who, through their words and actions, help employees overcome their anxieties about making mistakes and, in the process, create a culture of intelligent risk-taking that leads to sustained innovation. Such leaders don't just accept productive failure, they promote it. Drawing from their research in business, politics, sports, and science, the authors identify common practices among failure-tolerant leaders. These leaders break down the social and bureaucratic barriers that separate them from their followers. They engage at a personal level with the people they lead. They avoid giving either praise or criticism, preferring to take a nonjudgmental, analytical posture as they interact with staff. They openly admit their own mistakes rather than trying to cover them up or shifting the blame. And they try to root out the destructive competitiveness built into most organizations. Above all else, failure-tolerant leaders push people to see beyond traditional definitions of success and failure. They know that as long as a person views failure as the opposite of success, rather than its complement, he or she will never be able to take the risks necessary for innovation.

  9. Recent advances in utilizing transcription factors to improve plant abiotic stress tolerance by transgenic technology

    Directory of Open Access Journals (Sweden)

    Hongyan eWang

    2016-02-01

    Full Text Available Agricultural production and quality are adversely affected by various abiotic stresses worldwide, and this will be exacerbated by the deterioration of the global climate. To feed a growing world population, it is urgent to breed stress-tolerant crops with higher yields and improved quality under multiple environmental stresses. Since conventional breeding approaches have had marginal success due to the complexity of stress tolerance traits, the transgenic approach is now widely used to breed stress-tolerant crops. Identifying and characterizing the critical genes involved in plant stress responses is therefore an essential prerequisite for engineering stress-tolerant crops. Far beyond the manipulation of single functional genes, engineering certain regulatory genes has emerged as an effective strategy for controlling the expression of many stress-responsive genes. Transcription factors (TFs) are good candidates for genetic engineering to breed stress-tolerant crops because of their role as master regulators of many stress-responsive genes. Many TFs belonging to the AP2/EREBP, MYB, WRKY, NAC and bZIP families have been found to be involved in various abiotic stresses, and some TF genes have also been engineered to improve stress tolerance in model and crop plants. In this review, we take five large families of TFs as examples and review the recent progress of TFs involved in plant abiotic stress responses and their potential utilization to improve multiple stress tolerance of crops under field conditions.

  10. Susceptibility and tolerance of rice crop to salt threat: Physiological and metabolic inspections.

    Directory of Open Access Journals (Sweden)

    Nyuk Ling Ma

    Full Text Available Salinity threat is estimated to reduce global rice production by 50%. Comprehensive analysis of the physiological and metabolite changes in rice plants under salinity stress (i.e. tolerant versus susceptible plants) is important to combat higher salinity conditions. In this study, we screened a total of 92 genotypes and selected the most salinity-tolerant line (SS1-14) and the most susceptible line (SS2-18) to conduct comparative physiological and metabolome inspections. We demonstrated that the tolerant line managed to maintain its water and chlorophyll content with a lower incidence of sodium ion accumulation. We also examined the antioxidant activities of these lines: production of ascorbate peroxidase (APX) and catalase (CAT) was significantly higher in the sensitive line, while superoxide dismutase (SOD) was higher in the tolerant line. Partial least squares discriminant analysis (PLS-DA) score plots show significantly different responses for the two lines after exposure to salinity stress. In the tolerant line, there was an upregulation of non-polar metabolites and production of sucrose, GABA and acetic acid, suggesting an important role in salinity adaptation. In contrast, glutamine and putrescine were noticeably high in the susceptible rice. The coordination of different strategies in the tolerant and susceptible lines shows that they responded differently after exposure to salt stress. These findings can assist crop development in terms of developing tolerance mechanisms for rice crops.

  11. Water velocity tolerance in tadpoles of the foothill yellow-legged frog (Rana boylii): Swimming performance, growth, and survival

    Science.gov (United States)

    S. Kupferberg; A. Lind; V. Thill; S. Yarnell

    2011-01-01

    We explored the effects of large magnitude flow fluctuations in rivers with dams, commonly referred to as pulsed flows, on tadpoles of the lotic-breeding Foothill Yellow-legged Frog, Rana boylii. We quantified the velocity conditions in habitats occupied by tadpoles and then conducted experiments to assess the tolerance to values at the upper limit...

  12. Decomposing global crop yield variability

    Science.gov (United States)

    Ben-Ari, Tamara; Makowski, David

    2014-11-01

    Recent food crises have highlighted the need to better understand the between-year variability of agricultural production. Although increasing future production seems necessary, the globalization of commodity markets suggests that the food system would also benefit from enhanced supply stability through a reduction in year-to-year variability. Here, we develop an analytical expression decomposing global crop yield interannual variability into three informative components that quantify how evenly croplands are distributed in the world, the proportion of cultivated areas allocated to regions of above- or below-average variability, and the covariation between yields in distinct world regions. This decomposition is used to identify drivers of interannual yield variations for four major crops (i.e., maize, rice, soybean and wheat) over the period 1961-2012. We show that maize production is fairly spread out but marked by one prominent region with high levels of crop yield interannual variability (which encompasses the North American corn belt in the USA and Canada). In contrast, global rice yields have a small variability because, although spatially concentrated, much of the production is located in regions of below-average variability (i.e., South, Eastern and South Eastern Asia). Because of these contrasting land use allocations, an even distribution of cultivated land across regions would reduce global maize yield variance, but increase the variance of global rice yield. Intermediate results are obtained for soybean and wheat, for which croplands are mainly located in regions with close-to-average variability. At the scale of large world regions, we find that covariances of regional yields have a negligible contribution to global yield variance. The proposed decomposition could be applied at any spatial and time scales, including the yearly time step. By addressing global crop production stability (or lack thereof), our results contribute to the understanding of a key
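The decomposition described above can be illustrated with the standard variance identity for an area-weighted sum of regional yields. This is a simplified sketch of the idea, not the paper's exact three-component expression, and the data below are synthetic.

```python
import numpy as np

def decompose_global_variance(yields, areas):
    """Split the variance of a global (area-weighted) yield into
    within-region variance terms and between-region covariance terms.

    yields: (n_regions, n_years) array of regional yields.
    areas:  cropland area per region, used as weights.

    Uses the identity
      Var(sum_i w_i y_i) = sum_i w_i^2 Var(y_i)
                           + sum_{i != j} w_i w_j Cov(y_i, y_j).
    """
    y = np.asarray(yields, dtype=float)
    w = np.asarray(areas, dtype=float)
    w = w / w.sum()                        # area shares
    cov = np.cov(y)                        # regional covariance matrix
    total = float(w @ cov @ w)             # variance of the global yield
    var_part = float(np.sum(w**2 * np.diag(cov)))
    cov_part = total - var_part
    return total, var_part, cov_part

rng = np.random.default_rng(0)
y = rng.normal(5.0, 1.0, size=(3, 50))     # 3 regions, 50 "years"
total, var_part, cov_part = decompose_global_variance(y, [2.0, 1.0, 1.0])
```

With this identity, the paper's finding that regional covariances contribute little to global variance corresponds to `cov_part` being small relative to `var_part`.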

  13. Noise tolerant spatiotemporal chaos computing.

    Science.gov (United States)

    Kia, Behnam; Kia, Sarvenaz; Lindner, John F; Sinha, Sudeshna; Ditto, William L

    2014-12-01

    We introduce and design a noise-tolerant chaos computing system based on a coupled map lattice (CML) and the noise reduction capabilities inherent in coupled dynamical systems. The resulting spatiotemporal chaos computing system is more robust to noise than a single-map chaos computing system. In this CML-based approach to computing, under the coupled dynamics, the local noise from different nodes of the lattice diffuses across the lattice, and the noise contributions attenuate one another, resulting in a system with less noise content and a more robust chaos computing architecture.

  14. Anhydrobiosis and Freezing-Tolerance

    DEFF Research Database (Denmark)

    McGill, Lorraine; Shannon, Adam; Pisani, Davide

    2015-01-01

    Anhydrobiotic animals can survive the loss of both free and bound water from their cells. While in this state they are also resistant to freezing. This physiology adapts anhydrobiotes to harsh environments and it aids their dispersal. Panagrolaimus davidi, a bacterial feeding anhydrobiotic nematode… …Panagrolaimus strains from tropical, temperate, continental and polar habitats and we analysed their phylogenetic relationships. We found that several other Panagrolaimus isolates can also survive freezing when fully hydrated and that tissue extracts from these freezing-tolerant nematodes can inhibit the growth…

  15. Identification of the submergence tolerance QTL Come Quick Drowning1 (CGD1) in Arabidopsis thaliana

    NARCIS (Netherlands)

    Akman, Melis; Kleine, Rogier; Tienderen, van Peter H.; Schranz, Eric M.

    2017-01-01

    Global climate change is predicted to increase water precipitation fluctuations and lead to localized prolonged floods in agricultural fields and natural plant communities. Thus, understanding the genetic basis of submergence tolerance is crucial in order to improve plant survival under these

  16. Climate tolerances and trait choices shape continental patterns of urban tree biodiversity

    Science.gov (United States)

    G. Darrel Jenerette; Lorraine W. Clarke; Meghan L. Avolio; Diane E. Pataki; Thomas W. Gillespie; Stephanie Pincetl; Dave J. Nowak; Lucy R. Hutyra; Melissa McHale; Joseph P. McFadden; Michael Alonzo

    2016-01-01

    Aim. We propose and test a climate tolerance and trait choice hypothesis of urban macroecological variation in which strong filtering associated with low winter temperatures restricts urban biodiversity while weak filtering associated with warmer temperatures and irrigation allows dispersal of species from a global source pool, thereby...

  17. Marine megaherbivore grazing may increase seagrass tolerance to high nutrient loads

    NARCIS (Netherlands)

    Christianen, M.J.A.; Govers, L.L.; Bouma, T.J.; Kiswara, W.; Roelofs, J.G.M.; Lamers, L.P.M.; Van Katwijk, M.

    2012-01-01

    Populations of marine megaherbivores, including the green turtle (Chelonia mydas), have declined dramatically at a global scale as a result of overharvesting and habitat loss. This decline can be expected to also affect the tolerance of seagrass systems to coastal eutrophication. Until now, however,

  18. Global health and global health ethics

    National Research Council Canada - National Science Library

    Benatar, S. R; Brock, Gillian

    2011-01-01

    ...? What are our responsibilities and how can we improve global health? Global Health and Global Health Ethics addresses these questions from the perspective of a range of disciplines, including medicine, philosophy and the social sciences...

  19. The peer effect on pain tolerance.

    Science.gov (United States)

    Engebretsen, Solveig; Frigessi, Arnoldo; Engø-Monsen, Kenth; Furberg, Anne-Sofie; Stubhaug, Audun; de Blasio, Birgitte Freiesleben; Nielsen, Christopher Sivert

    2018-05-19

    Twin studies have found that approximately half of the variance in pain tolerance can be explained by genetic factors, while shared family environment has a negligible effect. Hence, a large proportion of the variance in pain tolerance is explained by the (non-shared) unique environment. The social environment beyond the family is a potential candidate for explaining some of the variance in pain tolerance. Numerous individual traits have previously been shown to be associated with friendship ties. In this study, we investigate whether pain tolerance is associated with friendship ties. We study the friendship effect on pain tolerance by considering data from the Tromsø Study: Fit Futures I, which contains pain tolerance measurements and social network information for adolescents attending first year of upper secondary school in the Tromsø area in Northern Norway. Pain tolerance was measured with the cold-pressor test (primary outcome), contact heat and pressure algometry. We analyse the data by using statistical methods from social network analysis. Specifically, we compute pairwise correlations in pain tolerance among friends. We also fit network autocorrelation models to the data, where the pain tolerance of an individual is explained by (among other factors) the average pain tolerance of the individual's friends. We find a significant and positive relationship between the pain tolerance of an individual and the pain tolerance of their friends. The estimated effect is that for every 1 s increase in friends' average cold-pressor tolerance time, the expected cold-pressor pain tolerance of the individual increases by 0.21 s (p-value: 0.0049, sample size n=997). This estimated effect is controlled for sex. The friendship effect remains significant when controlling for potential confounders such as lifestyle factors and test sequence among the students. Further investigating the role of sex on this friendship effect, we only find a significant peer effect of male friends
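    The peer-effect regression described above can be sketched on synthetic data. This is a simplified linear-in-means model, not the exact network autocorrelation specification of the study; the network, sample size and coefficients are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n, k = 500, 5  # individuals and friends per individual (illustrative)

# Random friendship lists, excluding self-nomination.
friends = np.array([rng.choice(np.delete(np.arange(n), i), size=k, replace=False)
                    for i in range(n)])

base = rng.normal(0.0, 1.0, n)             # latent individual pain-tolerance component
friends_mean = base[friends].mean(axis=1)  # average over each individual's friends
y = 0.5 * friends_mean + rng.normal(0.0, 0.1, n)  # outcome with a true peer slope of 0.5

# Ordinary least squares of the outcome on the friends' average.
X = np.column_stack([np.ones(n), friends_mean])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
```

    In the actual analysis the friends' observed tolerance (not a latent component) enters the model, and sex and lifestyle confounders are controlled for; the sketch only shows the shape of the regression.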

  20. Quantifying uncertainty in nuclear analytical measurements

    International Nuclear Information System (INIS)

    2004-07-01

    The lack of international consensus on the expression of uncertainty in measurements was recognised by the late 1970s and led, after the issuance of a series of rather generic recommendations, to the publication of a general guide, known as GUM, the Guide to the Expression of Uncertainty in Measurement. This guide, issued in 1993, was based on co-operation over several years by the Bureau International des Poids et Mesures, the International Electrotechnical Commission, the International Federation of Clinical Chemistry, the International Organization for Standardization (ISO), the International Union of Pure and Applied Chemistry, the International Union of Pure and Applied Physics and the Organisation internationale de metrologie legale. The purpose was to promote full information on how uncertainty statements are arrived at and to provide a basis for harmonized reporting and the international comparison of measurement results. The need to provide more specific guidance to different measurement disciplines was soon recognized, and the field of analytical chemistry was addressed by EURACHEM in 1995 in the first edition of a guidance report on Quantifying Uncertainty in Analytical Measurements, produced by a group of experts from the field. That publication translated the general concepts of the GUM into specific applications for analytical laboratories and illustrated the principles with a series of selected examples as a didactic tool. Based on feedback from the actual practice, the EURACHEM publication was extensively reviewed in 1997-1999 under the auspices of the Co-operation on International Traceability in Analytical Chemistry (CITAC), and a second edition was published in 2000. Still, except for a single example on the measurement of radioactivity in GUM, the field of nuclear and radiochemical measurements was not covered. The explicit requirement of ISO standard 17025:1999, General Requirements for the Competence of Testing and Calibration
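    The core prescription that GUM and the EURACHEM/CITAC guide elaborate, combining independent standard uncertainties in quadrature and applying a coverage factor, can be sketched as follows; the component values are illustrative, not taken from the publication.

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-of-squares combination of independent standard uncertainties."""
    return math.sqrt(sum(u * u for u in components))

u_c = combined_standard_uncertainty([0.3, 0.4])  # e.g. counting and calibration terms
U = 2.0 * u_c  # expanded uncertainty with coverage factor k = 2 (~95% coverage)
```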

  1. Quantifying the limitations of small animal positron emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Oxley, D.C. [Department of Physics, University of Liverpool, Liverpool L69 7ZE (United Kingdom)], E-mail: dco@ns.ph.liv.ac.uk; Boston, A.J.; Boston, H.C.; Cooper, R.J.; Cresswell, J.R.; Grint, A.N.; Nolan, P.J.; Scraggs, D.P. [Department of Physics, University of Liverpool, Liverpool L69 7ZE (United Kingdom); Lazarus, I.H. [STFC Daresbury Laboratory, Warrington, WA4 4AD Cheshire (United Kingdom); Beveridge, T.E. [School of Materials and Engineering, Monash University, Melbourne (Australia)

    2009-06-01

    The application of position-sensitive semiconductor detectors in medical imaging is a field of global research interest. The Monte-Carlo simulation toolkit GEANT4 [http://geant4.web.cern.ch/geant4/] was employed to improve the understanding of detailed γ-ray interactions within the small animal Positron Emission Tomography (PET), high-purity germanium (HPGe) imaging system, SmartPET [A.J. Boston, et al., Oral contribution, ANL, Chicago, USA, 2006]. This system has shown promising results in the fields of PET [R.J. Cooper, et al., Nucl. Instr. and Meth. A (2009), accepted for publication] and Compton camera imaging [J.E. Gillam, et al., Nucl. Instr. and Meth. A 579 (2007) 76]. Images for a selection of single and multiple point, line and phantom sources were successfully reconstructed using both a filtered back-projection (FBP) [A.R. Mather, Ph.D. Thesis, University of Liverpool, 2007] and an iterative reconstruction algorithm [A.R. Mather, Ph.D. Thesis, University of Liverpool, 2007]. Simulated data were exploited as an alternative route to a reconstructed image, allowing full quantification of the image distortions introduced in each phase of the data processing. Quantifying the contribution of uncertainty from every system component, from detector to reconstruction algorithm, identifies the areas of the SmartPET project, and of semiconductor PET more broadly, that most need attention.

  2. Challenges in quantifying biosphere-atmosphere exchange of nitrogen species

    Energy Technology Data Exchange (ETDEWEB)

    Sutton, M.A. [Centre for Ecology and Hydrology (CEH), Edinburgh Research Station, Bush Estate, Penicuik, EH26 0QB (United Kingdom)], E-mail: ms@ceh.ac.uk; Nemitz, E. [Centre for Ecology and Hydrology (CEH), Edinburgh Research Station, Bush Estate, Penicuik, EH26 0QB (United Kingdom); Erisman, J.W. [ECN, Clean Fossil Fuels, PO Box 1, 1755 ZG Petten (Netherlands); Beier, C. [Riso National Laboratory, PO Box 49, DK-4000 Roskilde (Denmark); Bahl, K. Butterbach [Institute of Meteorology and Climate Research, Atmos. Environ. Research (IMK-IFU), Research Centre Karlsruhe GmbH, Kreuzeckbahnstr. 19, 82467 Garmisch-Partenkirchen (Germany); Cellier, P. [INRA Unite Mixte de Recherche, 78850 Thiverval-Grignon (France); Vries, W. de [Alterra, Green World Research, PO Box 47, 6700 AA Wageningen (Netherlands); Cotrufo, F. [Dip. Scienze Ambientali, Seconda Universita degli Studi di Napoli, via Vivaldi 43, 81100 Caserta (Italy); Skiba, U.; Di Marco, C.; Jones, S. [Centre for Ecology and Hydrology (CEH), Edinburgh Research Station, Bush Estate, Penicuik, EH26 0QB (United Kingdom); Laville, P.; Soussana, J.F.; Loubet, B. [INRA Unite Mixte de Recherche, 78850 Thiverval-Grignon (France); Twigg, M.; Famulari, D. [Centre for Ecology and Hydrology (CEH), Edinburgh Research Station, Bush Estate, Penicuik, EH26 0QB (United Kingdom); Whitehead, J.; Gallagher, M.W. [School of Earth, Atmospheric and Environmental Sciences, University of Manchester, Williamson Building, Oxford Road, Manchester, M13 9PL (United Kingdom); Neftel, A.; Flechard, C.R. [Agroscope FAL Reckenholz, Federal Research Station for Agroecology and Agriculture, PO Box, CH 8046 Zurich (Switzerland)] (and others)

    2007-11-15

    Recent research in nitrogen exchange with the atmosphere has separated research communities according to N form. The integrated perspective needed to quantify the net effect of N on greenhouse-gas balance is being addressed by the NitroEurope Integrated Project (NEU). Recent advances have depended on improved methodologies, while ongoing challenges include gas-aerosol interactions, organic nitrogen and N₂ fluxes. The NEU strategy applies a 3-tier Flux Network together with a Manipulation Network of global-change experiments, linked by common protocols to facilitate model application. Substantial progress has been made in modelling N fluxes, especially for N₂O, NO and bi-directional NH₃ exchange. Landscape analysis represents an emerging challenge to address the spatial interactions between farms, fields, ecosystems, catchments and air dispersion/deposition. European up-scaling of N fluxes is highly uncertain and a key priority is for better data on agricultural practices. Finally, attention is needed to develop N flux verification procedures to assess compliance with international protocols. - Current N research is separated by form; the challenge is to link N components, scales and issues.

  3. Global teaching of global seismology

    Science.gov (United States)

    Stein, S.; Wysession, M.

    2005-12-01

    Our recent textbook, Introduction to Seismology, Earthquakes, & Earth Structure (Blackwell, 2003) is used in many countries. Part of the reason for this may be our deliberate attempt to write the book for an international audience. This effort appears in several ways. We stress seismology's long tradition of global data interchange. Our brief discussions of the science's history illustrate the contributions of scientists around the world. Perhaps most importantly, our discussions of earthquakes, tectonics, and seismic hazards take a global view. Many examples are from North America, while others are drawn from elsewhere. Our view is that non-North American students should be exposed to North American examples that are type examples, and that North American students should be similarly exposed to examples elsewhere. For example, we illustrate how the Euler vector geometry changes a plate boundary from spreading, to strike-slip, to convergence using both the Pacific-North America boundary from the Gulf of California to Alaska and the Eurasia-Africa boundary from the Azores to the Mediterranean. We illustrate diffuse plate boundary zones using western North America, the Andes, the Himalayas, the Mediterranean, and the East Africa Rift. The subduction zone discussions examine Japan, Tonga, and Chile. We discuss significant earthquakes both in the U.S. and elsewhere, and explore hazard mitigation issues in different contexts. Both comments from foreign colleagues and our experience lecturing overseas indicate that this approach works well. Beyond the specifics of our text, we believe that such a global approach is facilitated by the international traditions of the earth sciences and the world youth culture that gives students worldwide a common culture. For example, a video of the scene in New Madrid, Missouri that arose from a nonsensical earthquake prediction in 1990 elicits similar responses from American and European students.

  4. Global carbon budget 2013

    International Nuclear Information System (INIS)

    Le Quere, C.; Moriarty, R.; Jones, S.D.; Boden, T.A.; Peters, G.P.; Andrew, R.M.; Andres, R.J.; Ciais, P.; Bopp, L.; Maignan, F.; Viovy, N.

    2014-01-01

    Accurate assessment of anthropogenic carbon dioxide (CO₂) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and a methodology to quantify all major components of the global carbon budget, including their uncertainties, based on the combination of a range of data, algorithms, statistics and model estimates and their interpretation by a broad scientific community. We discuss changes compared to previous estimates, consistency within and among components, alongside methodology and data limitations. CO₂ emissions from fossil-fuel combustion and cement production (EFF) are based on energy statistics, while emissions from land-use change (ELUC), mainly deforestation, are based on combined evidence from land-cover change data, fire activity associated with deforestation, and models. The global atmospheric CO₂ concentration is measured directly and its rate of growth (GATM) is computed from the annual changes in concentration. The mean ocean CO₂ sink (SOCEAN) is based on observations from the 1990s, while the annual anomalies and trends are estimated with ocean models. The variability in SOCEAN is evaluated for the first time in this budget with data products based on surveys of ocean CO₂ measurements. The global residual terrestrial CO₂ sink (SLAND) is estimated by the difference of the other terms of the global carbon budget and compared to results of independent dynamic global vegetation models forced by observed climate, CO₂ and land cover change (some including nitrogen-carbon interactions). All uncertainties are reported as ±1σ, reflecting the current capacity to characterise the annual estimates of each component of the global carbon budget. For the last decade available (2003-2012), EFF was 8.6±0.4 GtC yr⁻¹, ELUC 0.9±0.5 GtC yr⁻¹, GATM 4.3±0
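    Using the decade-mean values quoted above (EFF 8.6, ELUC 0.9 and GATM 4.3 GtC per year), the budget identity EFF + ELUC = GATM + SOCEAN + SLAND gives the combined ocean-plus-land sink as a residual; a back-of-envelope check:

```python
EFF, ELUC, GATM = 8.6, 0.9, 4.3  # GtC per year, 2003-2012 means from the abstract

# Budget identity: EFF + ELUC = GATM + SOCEAN + SLAND
combined_sink = EFF + ELUC - GATM  # SOCEAN + SLAND together, as a residual
```

    This gives roughly 5.2 GtC per year taken up jointly by the ocean and land sinks over that decade; partitioning it between SOCEAN and SLAND requires the ocean observations and models described in the abstract.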

  5. Quantifying Urban Groundwater in Environmental Field Observatories

    Science.gov (United States)

    Welty, C.; Miller, A. J.; Belt, K.; Smith, J. A.; Band, L. E.; Groffman, P.; Scanlon, T.; Warner, J.; Ryan, R. J.; Yeskis, D.; McGuire, M. P.

    2006-12-01

    Despite the growing footprint of urban landscapes and their impacts on hydrologic and biogeochemical cycles, comprehensive field studies of urban water budgets are few. The cumulative effects of urban infrastructure (buildings, roads, culverts, storm drains, detention ponds, leaking water supply and wastewater pipe networks) on temporal and spatial patterns of groundwater stores, fluxes, and flowpaths are poorly understood. The goal of this project is to develop expertise and analytical tools for urban groundwater systems that will inform future environmental observatory planning and that can be shared with research teams working in urban environments elsewhere. The work plan for this project draws on a robust set of information resources in Maryland provided by ongoing monitoring efforts of the Baltimore Ecosystem Study (BES), USGS, and the U.S. Forest Service working together with university scientists and engineers from multiple institutions. A key concern is to bridge the gap between small-scale intensive field studies and larger-scale and longer-term hydrologic patterns using synoptic field surveys, remote sensing, numerical modeling, data mining and visualization tools. Using the urban water budget as a unifying theme, we are working toward estimating the various elements of the budget in order to quantify the influence of urban infrastructure on groundwater. 
Efforts include: (1) comparison of base flow behavior from stream gauges in a nested set of watersheds at four different spatial scales from 0.8 to 171 km2, with diverse patterns of impervious cover and urban infrastructure; (2) synoptic survey of well water levels to characterize the regional water table; (3) use of airborne thermal infrared imagery to identify locations of groundwater seepage into streams across a range of urban development patterns; (4) use of seepage transects and tracer tests to quantify the spatial pattern of groundwater fluxes to the drainage network in selected subwatersheds; (5

  6. Tolerance doses for treatment planning

    International Nuclear Information System (INIS)

    Lyman, J.T.

    1985-10-01

    Data for the tolerance of normal tissues or organs to (low-LET) radiation have been compiled from a number of sources which are referenced at the end of this document. These tolerance dose data are ostensibly for uniform irradiation of all or part of an organ, and are for either 5% (TD₅) or 50% (TD₅₀) complication probability. The "size" of the irradiated organ is variously stated in terms of the absolute volume or the fraction of the organ volume irradiated, or the area or the length of the treatment field. The accuracy of these data is questionable. Much of the data represents doses that one or several experienced therapists have estimated could be safely given, rather than quantitative analyses of clinical observations. Because these data have been obtained from multiple sources, with possibly different criteria for the definition of a complication, there are sometimes different values for what is apparently the same endpoint. The data from some sources show a tendency to be quantized in 5 Gy increments; this reflects the size of possible round-off errors. It is believed that all these data have been accumulated without the benefit of 3-D dose distributions and therefore the estimates of the size of the volume and/or the uniformity of the irradiation may be less accurate than is now possible. 19 refs., 4 figs.

  7. Radiation tolerance of amorphous semiconductors

    International Nuclear Information System (INIS)

    Nicolaides, R.V.; DeFeo, S.; Doremus, L.W.

    1976-01-01

    In an attempt to determine the threshold radiation damage in amorphous semiconductors, radiation tests were performed on amorphous semiconductor thin film materials and on threshold and memory devices. The influence of flash x-rays and neutron radiation upon the switching voltages, on- and off-state characteristics, dielectric response, optical transmission, absorption band edge and photoconductivity were measured prior to, during and following irradiation. These extensive tests showed the high radiation tolerance of amorphous semiconductor materials. Electrical and optical properties, other than photoconductivity, have a neutron radiation tolerance threshold above 10¹⁷ nvt in the steady state and 10¹⁴ nvt in short (50 μsec to 16 msec) pulses. Photoconductivity increases by 1½ orders of magnitude at the level of 10¹⁴ nvt (short pulses of 50 μsec). Super flash x-rays up to 5000 rads (Si), 20 nsec, do not initiate switching in off-state samples which are voltage biased up to 90 percent of the threshold voltage. Both memory and threshold amorphous devices are capable of switching on and off during nuclear radiation transients at least as high as 2×10¹⁴ nvt in 50 μsec pulses

  8. Tolerance and potential for adaptation of a Baltic Sea rockweed under predicted climate change conditions.

    Science.gov (United States)

    Rugiu, Luca; Manninen, Iita; Rothäusler, Eva; Jormalainen, Veijo

    2018-03-01

    Climate change is threatening species' persistence worldwide. To predict species responses to climate change we need information not just on their environmental tolerance but also on their adaptive potential. We tested how the foundation species of rocky littoral habitats, Fucus vesiculosus, responds to combined hyposalinity and warming projected to the Baltic Sea by 2070-2099. We quantified responses of replicated populations originating from the entrance, central, and marginal Baltic regions. Using replicated individuals, we tested for the presence of within-population tolerance variation. Future conditions hampered growth and survival of the central and marginal populations whereas the entrance populations fared well. Further, both the among- and within-population variation in responses to climate change indicated existence of genetic variation in tolerance. Such standing genetic variation provides the raw material necessary for adaptation to a changing environment, which may eventually ensure the persistence of the species in the inner Baltic Sea. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Globalizing Denmark

    DEFF Research Database (Denmark)

    Selmer, Jan; Lauring, Jakob

    2013-01-01

    This exploratory article examines the paradox of being open-minded while ethnocentric as expressed in Danish international management practices at the micro level. With a population of 5.4 million, Denmark is one of the smallest of the European countries. The pressure on many small advanced countries to keep up the process of globalization may be substantial, and the economic gains for such countries from adjusting to a more internationally integrated world economy are clear. However, in small-population economies, especially social-democratic welfare states, the internal pressure to integrate counteracts to some extent the need to maintain openness to differences. Thus, a strong economy and a feeling of smug ethnocentrism in Denmark generate a central paradox in thinking about internationalization in Danish society.

  10. Global Geomorphology

    Science.gov (United States)

    Douglas, I.

    1985-01-01

    Any global view of landforms must include an evaluation of the link between plate tectonics and geomorphology. To explain the broad features of the continents and ocean floors, a basic distinction between the tectogene and cratogene parts of the Earth's surface must be made. The tectogene areas are those that are dominated by crustal movements, earthquakes and volcanicity at the present time and are essentially those of the great mountain belts and mid-ocean ridges. Cratogene areas comprise the plate interiors, especially the old lands of Gondwanaland and Laurasia. Fundamental as this division between plate margin areas and plate interiors is, it cannot be said to be a simple case of a distinction between tectonically active and stable areas. Indeed, in terms of megageomorphology, former plate margins and tectonic activity up to 600 million years ago have to be considered.

  11. Global engineering

    International Nuclear Information System (INIS)

    Plass, L.

    2001-01-01

    This article considers the challenges posed by the declining orders in the plant engineering and contracting business in Germany, the need to remain competitive, and essential preconditions for mastering the challenge. The change in engineering approach is illustrated by the building of a methanol plant in Argentina by Lurgi, with the basic engineering completed in Frankfurt with involvement of key personnel from Poland, completely engineered subsystems from a Brazilian subsupplier, and detailed engineering work in Frankfurt. The production of methanol from natural gas using the Lurgi MegaMethanol process is used as a typical example of the industrial plant construction sector. The prerequisites for successful global engineering are listed, and error costs in plant construction, possible savings, and process intensification are discussed

  12. Global warming

    International Nuclear Information System (INIS)

    Houghton, John

    2005-01-01

    'Global warming' is a phrase that refers to the effect on the climate of human activities, in particular the burning of fossil fuels (coal, oil and gas) and large-scale deforestation, which cause emissions to the atmosphere of large amounts of 'greenhouse gases', of which the most important is carbon dioxide. Such gases absorb infrared radiation emitted by the Earth's surface and act as blankets over the surface, keeping it warmer than it would otherwise be. Associated with this warming are changes of climate. The basic science of the 'greenhouse effect' that leads to the warming is well understood. More detailed understanding relies on numerical models of the climate that integrate the basic dynamical and physical equations describing the complete climate system. Many of the likely characteristics of the resulting changes in climate (such as more frequent heat waves, increases in rainfall, increase in frequency and intensity of many extreme climate events) can be identified. Substantial uncertainties remain in knowledge of some of the feedbacks within the climate system (that affect the overall magnitude of change) and in much of the detail of likely regional change. Because of its negative impacts on human communities (including for instance substantial sea-level rise) and on ecosystems, global warming is the most important environmental problem the world faces. Adaptation to the inevitable impacts and mitigation to reduce their magnitude are both necessary. International action is being taken by the world's scientific and political communities. Because of the need for urgent action, the greatest challenge is to move rapidly to much increased energy efficiency and to non-fossil-fuel energy sources.

  13. Global gamesmanship.

    Science.gov (United States)

    MacMillan, Ian C; van Putten, Alexander B; McGrath, Rita Gunther

    2003-05-01

    Competition among multinationals these days is likely to be a three-dimensional game of global chess: The moves an organization makes in one market are designed to achieve goals in another in ways that aren't immediately apparent to its rivals. The authors, all management professors, call this approach "competing under strategic interdependence," or CSI. And where this interdependence exists, the complexity of the situation can quickly overwhelm ordinary analysis. Indeed, most business strategists are terrible at anticipating the consequences of interdependent choices, and they're even worse at using interdependency to their advantage. In this article, the authors offer a process for mapping the competitive landscape and anticipating how your company's moves in one market can influence its competitive interactions in others. They outline the six types of CSI campaigns--onslaughts, contests, guerrilla campaigns, feints, gambits, and harvesting--available to any multiproduct or multimarket corporation that wants to compete skillfully. They cite real-world examples such as the U.S. pricing battle Philip Morris waged with R.J. Reynolds--not to gain market share in the domestic cigarette market but to divert R.J. Reynolds's resources and attention from the opportunities Philip Morris was pursuing in Eastern Europe. And, using data they collected from their studies of consumer-products companies Procter & Gamble and Unilever, the authors describe how to create CSI tables and bubble charts that present a graphical look at the competitive landscape and that may uncover previously hidden opportunities. The CSI mapping process isn't just for global corporations, the authors explain. Smaller organizations that compete with a portfolio of products in just one national or regional market may find it just as useful for planning their next business moves.

  14. Microbial stress tolerance for biofuels. Systems biology

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zonglin Lewis (ed.) [National Center for Agricultural Utilization Research, USDA-ARS, Peoria, IL (United States)

    2012-07-01

    The development of sustainable and renewable biofuels is attracting growing interest. It is vital to develop robust microbial strains for biocatalysts that are able to function under multiple stress conditions. This Microbiology Monograph provides an overview of methods for studying microbial stress tolerance for biofuels applications using a systems biology approach. Topics covered range from mechanisms to methodology for yeast and bacteria, including the genomics of yeast tolerance and detoxification; genetics and regulation of glycogen and trehalose metabolism; programmed cell death; high gravity fermentations; ethanol tolerance; improving biomass sugar utilization by engineered Saccharomyces; the genomics on tolerance of Zymomonas mobilis; microbial solvent tolerance; control of stress tolerance in bacterial host organisms; metabolomics for ethanologenic yeast; automated proteomics work cell systems for strain improvement; and unification of gene expression data for comparable analyses under stress conditions. (orig.)

  15. Quantifying the consensus on anthropogenic global warming in the literature: A re-analysis

    International Nuclear Information System (INIS)

    Tol, Richard S.J.

    2014-01-01

    A claim has been that 97% of the scientific literature endorses anthropogenic climate change (Cook et al., 2013. Environ. Res. Lett. 8, 024024). This claim, frequently repeated in debates about climate policy, does not stand. A trend in composition is mistaken for a trend in endorsement. Reported results are inconsistent and biased. The sample is not representative and contains many irrelevant papers. Overall, data quality is low. Cook's validation test shows that the data are invalid. Data disclosure is incomplete so that key results cannot be reproduced or tested

  16. Quantifying the effects of the break up of Pangaea on global terrestrial diversification with neutral theory

    OpenAIRE

    Jordan, S; Barraclough, T; Rosindell, JL

    2016-01-01

    The historic richness of most taxonomic groups increases substantially over geological time. Explanations for this fall broadly into two categories: bias in the fossil record and elevated net rates of diversification in recent periods. For example, the break up of Pangaea and isolation between continents might have increased net diversification rates. In this study, we investigate the effect on terrestrial diversification rates of the increased isolation between land masses brought about by c...

  17. QUANTIFYING LIFE STYLE IMPACT ON LIFESPAN

    Directory of Open Access Journals (Sweden)

    Antonello Lorenzini

    2012-12-01

    Full Text Available A healthy diet, physical activity and avoiding dangerous habits such as smoking are effective ways of increasing health and lifespan. Although a significant portion of the world's population still suffers from malnutrition, especially children, the most common causes of death in the world today are non-communicable diseases. Overweight and obesity significantly increase the relative risk for the most relevant non-communicable diseases: cardiovascular disease, type II diabetes and some cancers. Childhood overweight also seems to increase the likelihood of disease in adulthood through epigenetic mechanisms. This worrisome trend, now termed "globesity", will deeply impact society unless preventive strategies are put into effect. Researchers of the basic biology of aging have clearly established that animals with short lifespans live longer when their diet is calorie restricted. Although similar experiments carried out on rhesus monkeys, a longer-lived species more closely related to humans, yielded mixed results, overall the available scientific data suggest keeping the body mass index in the "normal" range increases the chances of living a longer, healthier life. This can be successfully achieved both by maintaining a healthy diet and by engaging in physical activity. In this review we will try to quantify the relative impact of lifestyle choices on lifespan.

  18. Stimfit: quantifying electrophysiological data with Python

    Directory of Open Access Journals (Sweden)

    Segundo Jose Guzman

    2014-02-01

    Full Text Available Intracellular electrophysiological recordings provide crucial insights into elementary neuronal signals such as action potentials and synaptic currents. Analyzing and interpreting these signals is essential for a quantitative understanding of neuronal information processing, and requires both fast data visualization and ready access to complex analysis routines. To achieve this goal, we have developed Stimfit, a free software package for cellular neurophysiology with a Python scripting interface and a built-in Python shell. The program supports most standard file formats for cellular neurophysiology and other biomedical signals through the Biosig library. To quantify and interpret the activity of single neurons and communication between neurons, the program includes algorithms to characterize the kinetics of presynaptic action potentials and postsynaptic currents, estimate latencies between pre- and postsynaptic events, and detect spontaneously occurring events. We validate and benchmark these algorithms, give estimation errors, and provide sample use cases, showing that Stimfit represents an efficient, accessible and extensible way to accurately analyze and interpret neuronal signals.
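
    At their simplest, the event-detection routines the abstract describes reduce to finding upward threshold crossings with a refractory gap. A minimal NumPy sketch of that idea follows; it is independent of Stimfit's actual `stf` scripting API, whose functions are not shown in this record:

```python
import numpy as np

def detect_events(trace, dt, threshold, min_interval):
    """Onset times (s) of upward threshold crossings, with a refractory gap."""
    above = trace > threshold
    # rising edges: a below-threshold sample followed by an above-threshold one
    onsets = np.flatnonzero(~above[:-1] & above[1:]) + 1
    events, last = [], -np.inf
    for i in onsets:
        t = i * dt
        if t - last >= min_interval:   # skip crossings inside the refractory gap
            events.append(t)
            last = t
    return events

# synthetic 10 kHz trace with three square "events" on a flat baseline
dt = 1e-4
trace = np.zeros(10000)
for start in (1000, 4000, 7000):
    trace[start:start + 50] = 5.0
print(detect_events(trace, dt, threshold=2.5, min_interval=0.05))
```

    With the synthetic trace above, the three onsets are reported at 0.1, 0.4 and 0.7 s; a real analysis would add template matching and kinetics fitting on top of this skeleton.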

  19. Quantifying capital goods for waste incineration.

    Science.gov (United States)

    Brogaard, L K; Riber, C; Christensen, T H

    2013-06-01

    Materials and energy used for the construction of modern waste incineration plants were quantified. The data was collected from five incineration plants (72,000-240,000 tonnes per year) built in Scandinavia (Norway, Finland and Denmark) between 2006 and 2012. Concrete for the buildings was the main material used amounting to 19,000-26,000 tonnes per plant. The quantification further included six main materials, electronic systems, cables and all transportation. The energy used for the actual on-site construction of the incinerators was in the range 4000-5000 MW h. In terms of the environmental burden of producing the materials used in the construction, steel for the building and the machinery contributed the most. The material and energy used for the construction corresponded to the emission of 7-14 kg CO2 per tonne of waste combusted throughout the lifetime of the incineration plant. The assessment showed that, compared to data reported in the literature on direct emissions from the operation of incinerators, the environmental impacts caused by the construction of buildings and machinery (capital goods) could amount to 2-3% with respect to kg CO2 per tonne of waste combusted. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Fluorescence imaging to quantify crop residue cover

    Science.gov (United States)

    Daughtry, C. S. T.; Mcmurtrey, J. E., III; Chappelle, E. W.

    1994-01-01

    Crop residues, the portion of the crop left in the field after harvest, can be an important management factor in controlling soil erosion. Methods to quantify residue cover are needed that are rapid, accurate, and objective. Scenes with known amounts of crop residue were illuminated with long wave ultraviolet (UV) radiation and fluorescence images were recorded with an intensified video camera fitted with a 453 to 488 nm band pass filter. A light colored soil and a dark colored soil were used as background for the weathered soybean stems. Residue cover was determined by counting the proportion of the pixels in the image with fluorescence values greater than a threshold. Soil pixels had the lowest gray levels in the images. The values of the soybean residue pixels spanned nearly the full range of the 8-bit video data. Classification accuracies typically were within 3 (absolute units) of measured cover values. Video imaging can provide an intuitive understanding of the fraction of the soil covered by residue.
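
    The thresholding step described above is simple to state in code. A toy NumPy sketch (the UV illumination, band-pass filtering and calibration are of course not modelled):

```python
import numpy as np

def residue_cover(image, threshold):
    """Fraction of pixels whose fluorescence gray level exceeds the threshold.

    Mirrors the paper's approach: residue fluoresces under UV while soil
    pixels stay near the bottom of the 8-bit gray-level range.
    """
    return float((np.asarray(image) > threshold).mean())

# toy 8-bit "image": a bright residue strip on a dark soil background
img = np.full((100, 100), 20, dtype=np.uint8)   # soil: low gray levels
img[:, :30] = 180                               # residue: high fluorescence
print(residue_cover(img, threshold=60))         # → 0.3
```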

  1. Quantifying Potential Groundwater Recharge In South Texas

    Science.gov (United States)

    Basant, S.; Zhou, Y.; Leite, P. A.; Wilcox, B. P.

    2015-12-01

    Groundwater in South Texas is heavily relied on for human consumption and for irrigation of food crops. As in most of the southwestern US, woody encroachment has altered the grassland ecosystems here too. While brush removal has been widely implemented in Texas with the objective of increasing groundwater recharge, the linkage between vegetation and groundwater recharge in South Texas is still unclear. Studies have been conducted to understand plant-root-water dynamics at the scale of individual plants. However, little work has been done to quantify the changes in soil water and deep percolation at the landscape scale. Modeling water flow through soil profiles can provide an estimate of the total water flowing into deep percolation. These models are especially powerful when parameterized and calibrated with long-term soil water data. In this study we parameterize the HYDRUS soil water model using long-term soil water data collected in Jim Wells County in South Texas. Soil water was measured at 20 cm intervals down to a depth of 200 cm. The parameterized model will be used to simulate soil water dynamics under a variety of precipitation regimes ranging from well above normal to severe drought conditions. The results from the model will be compared with the changes in the soil moisture profile observed in response to vegetation cover and treatments in a similar study. Comparative studies like this can be used to build new hypotheses, and strengthen existing ones, regarding deep percolation and the role of soil texture and vegetation in groundwater recharge.

  2. Quantifying Anthropogenic Stress on Groundwater Resources.

    Science.gov (United States)

    Ashraf, Batool; AghaKouchak, Amir; Alizadeh, Amin; Mousavi Baygi, Mohammad; R Moftakhari, Hamed; Mirchi, Ali; Anjileli, Hassan; Madani, Kaveh

    2017-10-10

    This study explores a general framework for quantifying anthropogenic influences on the groundwater budget based on normalized human outflow (h_out) and inflow (h_in). The framework is useful for sustainability assessment of groundwater systems and allows investigating the effects of different human water abstraction scenarios on the overall aquifer regime (e.g., depleted, natural flow-dominated, and human flow-dominated). We apply this approach to selected regions in the USA, Germany and Iran to evaluate the current aquifer regime. We subsequently present two scenarios of changes in human water withdrawals and return flow to the system (individually and combined). Results show that approximately one-third of the selected aquifers in the USA, and half of the selected aquifers in Iran, are dominated by human activities, while the selected aquifers in Germany are natural flow-dominated. The scenario analysis results also show that reduced human withdrawals could help with regime change in some aquifers. For instance, in two of the selected USA aquifers, a decrease in anthropogenic influences by ~20% may change the condition of a depleted regime to a natural flow-dominated regime. We specifically highlight a trending threat to the sustainability of groundwater in northwest Iran and California, and the need for more careful assessment and monitoring practices as well as strict regulations to mitigate the negative impacts of groundwater overexploitation.
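
    The regime classification can be sketched from the flow terms alone. In the sketch below, the normalization and the 0.5 dominance threshold are illustrative assumptions, not the paper's definitions:

```python
def aquifer_regime(withdrawal, return_flow, natural_in, natural_out):
    """Classify an aquifer from normalized human outflow/inflow (illustrative).

    h_out: human withdrawal as a share of total outflow.
    h_in:  human return flow as a share of total inflow.
    """
    h_out = withdrawal / (withdrawal + natural_out)
    h_in = return_flow / (return_flow + natural_in)
    total_in = natural_in + return_flow
    total_out = natural_out + withdrawal
    if total_out > total_in:
        state = "depleted"
    elif max(h_out, h_in) > 0.5:          # assumed dominance threshold
        state = "human flow-dominated"
    else:
        state = "natural flow-dominated"
    return h_out, h_in, state

# hypothetical aquifer: heavy pumping, little return flow (units arbitrary)
print(aquifer_regime(withdrawal=8.0, return_flow=1.0,
                     natural_in=10.0, natural_out=2.0))
```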

  3. Quantifying Supply Risk at a Cellulosic Biorefinery

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Jason K [Idaho National Laboratory; Jacobson, Jacob Jordan [Idaho National Laboratory; Cafferty, Kara Grace [Idaho National Laboratory; Lamers, Patrick [Idaho National Laboratory; Roni, MD S [Idaho National Laboratory

    2015-03-01

    In order to increase the sustainability and security of the nation’s energy supply, the U.S. Department of Energy through its Bioenergy Technology Office has set a vision for one billion tons of biomass to be processed for renewable energy and bioproducts annually by the year 2030. The Renewable Fuels Standard caps the amount of corn grain that can be used for ethanol sold in the U.S., and that cap has already been reached. Therefore, making the DOE’s vision a reality requires significant growth in the advanced biofuels industry, where currently three cellulosic biorefineries convert cellulosic biomass to ethanol. Risk mitigation is central to growing the industry beyond its infancy to a level necessary to achieve the DOE vision. This paper focuses on reducing the supply risk faced by a firm that owns a cellulosic biorefinery. It uses risk theory and simulation modeling to build a risk assessment model based on causal relationships of underlying, uncertain, supply-driving variables. Using the model, the paper quantifies the supply risk reduction achieved by converting the supply chain from a conventional supply system (bales and trucks) to an advanced supply system (depots, pellets, and trains). Results imply that the advanced supply system reduces supply system risk, defined as the probability of a unit cost overrun, from 83% in the conventional system to 4% in the advanced system. Reducing cost risk in this nascent industry improves the odds of realizing desired growth.

  4. Quantifying Supply Risk at a Cellulosic Biorefinery

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Jason K.; Jacobson, Jacob J.; Cafferty, Kara G.; Lamers, Patrick; Roni, Mohammad S.

    2015-07-01

    In order to increase the sustainability and security of the nation’s energy supply, the U.S. Department of Energy through its Bioenergy Technology Office has set a vision for one billion tons of biomass to be processed for renewable energy and bioproducts annually by the year 2030. The Renewable Fuels Standard caps the amount of corn grain that can be used for ethanol sold in the U.S., and that cap has already been reached. Therefore, making the DOE’s vision a reality requires significant growth in the advanced biofuels industry, where currently three cellulosic biorefineries convert cellulosic biomass to ethanol. Risk mitigation is central to growing the industry beyond its infancy to a level necessary to achieve the DOE vision. This paper focuses on reducing the supply risk faced by a firm that owns a cellulosic biorefinery. It uses risk theory and simulation modeling to build a risk assessment model based on causal relationships of underlying, uncertain, supply-driving variables. Using the model, the paper quantifies the supply risk reduction achieved by converting the supply chain from a conventional supply system (bales and trucks) to an advanced supply system (depots, pellets, and trains). Results imply that the advanced supply system reduces supply system risk, defined as the probability of a unit cost overrun, from 83% in the conventional system to 4% in the advanced system. Reducing cost risk in this nascent industry improves the odds of realizing desired growth.

  5. Data Used in Quantified Reliability Models

    Science.gov (United States)

    DeMott, Diana; Kleinhammer, Roger K.; Kahn, C. J.

    2014-01-01

    Data is the crux of developing quantitative risk and reliability models; without data there is no quantification. Finding reliability data or failure numbers to quantify fault tree models during the conceptual and design phases is often the quagmire that precludes early decision makers' consideration of the potential risk drivers that will influence a design. The analyst tasked with addressing system or product reliability depends on the availability of data. But where does that data come from, and what does it really apply to? Commercial industries, government agencies, and other international sources might have data similar to what you are looking for. In general, internal and external technical reports, and data based on similar and dissimilar equipment, are often the first and only places checked. A common philosophy is "I have a number - that is good enough". But is it? Have you ever considered the difference in reported data from various federal datasets and technical reports when compared to similar sources from national and/or international datasets? Just how well does your data compare? Understanding how the reported data was derived, and interpreting the information and details associated with the data, is as important as the data itself.

  6. The dairy cow and global climate changes

    OpenAIRE

    Flávio Baccari Jr

    2015-01-01

     High producing dairy cows are more sensitive to heat stress due mainly to their higher resting metabolic rate as compared to low producing and dry cows. Their responses to increasing levels of the temperature-humidity and the black globe-humidity indices are discussed as well as some aspects of heat tolerance as related to body temperature increase and milk production decrease. Some mitigation and adaptation practices are recommended to face the challenges of global climate changes.

  7. Predicting coexistence of plants subject to a tolerance-competition trade-off.

    Science.gov (United States)

    Haegeman, Bart; Sari, Tewfik; Etienne, Rampal S

    2014-06-01

    Ecological trade-offs between species are often invoked to explain species coexistence in ecological communities. However, few mathematical models have been proposed for which coexistence conditions can be characterized explicitly in terms of a trade-off. Here we present a model of a plant community which allows such a characterization. In the model plant species compete for sites where each site has a fixed stress condition. Species differ both in stress tolerance and competitive ability. Stress tolerance is quantified as the fraction of sites with stress conditions low enough to allow establishment. Competitive ability is quantified as the propensity to win the competition for empty sites. We derive the deterministic, discrete-time dynamical system for the species abundances. We prove the conditions under which plant species can coexist in a stable equilibrium. We show that the coexistence conditions can be characterized graphically, clearly illustrating the trade-off between stress tolerance and competitive ability. We compare our model with a recently proposed, continuous-time dynamical system for a tolerance-fecundity trade-off in plant communities, and we show that this model is a special case of the continuous-time version of our model.
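
    The mechanism can be illustrated with a stochastic toy lottery over sites ranked by stress. This sketch is only in the spirit of the model; the paper's deterministic, discrete-time recursion and its coexistence conditions are not reproduced here, and the parameter values are assumptions:

```python
import random

def simulate(n_sites, species, steps, mortality=0.1, seed=0):
    """Toy tolerance-competition lottery (illustration, not the paper's model).

    species: list of (tolerance, competitiveness). A species can establish on
    site j only if the site's stress rank is below its tolerance; empty sites
    are won with probability proportional to competitiveness x abundance.
    """
    rng = random.Random(seed)
    stress = [j / n_sites for j in range(n_sites)]   # site stress ranks in [0, 1)
    # seed each site with a random species that tolerates it
    occ = []
    for j in range(n_sites):
        tol = [i for i, (t, _) in enumerate(species) if stress[j] < t]
        occ.append(rng.choice(tol) if tol else None)
    for _ in range(steps):
        # mortality frees sites
        occ = [None if (s is not None and rng.random() < mortality) else s
               for s in occ]
        counts = [occ.count(i) for i in range(len(species))]
        for j, s in enumerate(occ):
            if s is not None:
                continue
            w = [species[i][1] * counts[i] if stress[j] < species[i][0] else 0.0
                 for i in range(len(species))]
            total = sum(w)
            if total == 0:
                continue
            r = rng.random() * total          # weighted lottery draw
            for i, wi in enumerate(w):
                r -= wi
                if r <= 0:
                    occ[j] = i
                    break
    return [occ.count(i) for i in range(len(species))]

# a stress-tolerant weak competitor vs an intolerant strong competitor
print(simulate(200, [(1.0, 0.3), (0.4, 1.0)], steps=50))
```

    With these assumed parameters the tolerant species persists on the high-stress sites only it can occupy, while the strong competitor holds a share of the low-stress sites: coexistence through the trade-off.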

  8. SCHEME ANALYSIS TREE DIMENSIONS AND TOLERANCES PROCESSING

    OpenAIRE

    Constanta RADULESCU; Liviu Marius CÎRŢÎNĂ; Constantin MILITARU

    2011-01-01

    This paper presents one of the steps that help us to determine the optimal tolerances depending on the technological capability of the processing equipment. To determine the tolerances in this way, it is necessary to study and schematically represent the operations used in the technological process of making a piece. Also in this phase, the tree diagram of the dimensions and machining tolerances is constructed, showing the dimensions and tolerances specified in the execution design. Determination processes, and ...

  9. Generalized tolerance sensitivity and DEA metric sensitivity

    OpenAIRE

    Neralić, Luka; E. Wendell, Richard

    2015-01-01

    This paper considers the relationship between Tolerance sensitivity analysis in optimization and metric sensitivity analysis in Data Envelopment Analysis (DEA). Herein, we extend the results on the generalized Tolerance framework proposed by Wendell and Chen and show how this framework includes DEA metric sensitivity as a special case. Further, we note how recent results in Tolerance sensitivity suggest some possible extensions of the results in DEA metric sensitivity.

  10. Generalized tolerance sensitivity and DEA metric sensitivity

    Directory of Open Access Journals (Sweden)

    Luka Neralić

    2015-03-01

    Full Text Available This paper considers the relationship between Tolerance sensitivity analysis in optimization and metric sensitivity analysis in Data Envelopment Analysis (DEA). Herein, we extend the results on the generalized Tolerance framework proposed by Wendell and Chen and show how this framework includes DEA metric sensitivity as a special case. Further, we note how recent results in Tolerance sensitivity suggest some possible extensions of the results in DEA metric sensitivity.

  11. UNITED STATES DEPARTMENT OF TRANSPORTATION GLOBAL POSITIONING SYSTEM (GPS) ADJACENT BAND COMPATIBILITY ASSESSMENT

    Science.gov (United States)

    2018-04-01

    The goal of the U.S. Department of Transportation (DOT) Global Positioning System (GPS) Adjacent Band Compatibility Assessment is to evaluate the maximum transmitted power levels of adjacent band radiofrequency (RF) systems that can be tolerated by G...

  12. 76 FR 22045 - Fluopicolide; Pesticide Tolerances

    Science.gov (United States)

    2011-04-20

    ... regulation establishes tolerances for residues of fluopicolide and its metabolites in or on multiple... occurred at dose levels where significant maternal toxicity (severe body weight gain decrements and...

  13. Enhancing drought tolerance in C(4) crops.

    Science.gov (United States)

    Lopes, Marta S; Araus, Jose Luis; van Heerden, Philippus D R; Foyer, Christine H

    2011-05-01

    Adaptation to abiotic stresses is a quantitative trait controlled by many different genes. Enhancing the tolerance of crop plants to abiotic stresses such as drought has therefore proved to be somewhat elusive in terms of plant breeding. While many C(4) species have significant agronomic importance, most of the research effort on improving drought tolerance has focused on maize. Ideally, drought tolerance has to be achieved without penalties in yield potential. Possibilities for success in this regard are highlighted by studies on maize hybrids performed over the last 70 years that have demonstrated that yield potential and enhanced stress tolerance are associated traits. However, while our understanding of the molecular mechanisms that enable plants to tolerate drought has increased considerably in recent years, there have been relatively few applications of DNA marker technologies in practical C(4) breeding programmes for improved stress tolerance. Moreover, until recently, targeted approaches to drought tolerance have concentrated largely on shoot parameters, particularly those associated with photosynthesis and stay green phenotypes, rather than on root traits such as soil moisture capture for transpiration, root architecture, and improvement of effective use of water. These root traits are now increasingly considered as important targets for yield improvement in C(4) plants under drought stress. Similarly, the molecular mechanisms underpinning heterosis have considerable potential for exploitation in enhancing drought stress tolerance. While current evidence points to the crucial importance of root traits in drought tolerance in C(4) plants, shoot traits may also be important in maintaining high yields during drought.

  14. Synthesis of Fault-Tolerant Embedded Systems

    DEFF Research Database (Denmark)

    Eles, Petru; Izosimov, Viacheslav; Pop, Paul

    2008-01-01

    This work addresses the issue of design optimization for fault- tolerant hard real-time systems. In particular, our focus is on the handling of transient faults using both checkpointing with rollback recovery and active replication. Fault tolerant schedules are generated based on a conditional...... process graph representation. The formulated system synthesis approaches decide the assignment of fault-tolerance policies to processes, the optimal placement of checkpoints and the mapping of processes to processors, such that multiple transient faults are tolerated, transparency requirements...

  15. Establishing soil loss tolerance: an overview

    Directory of Open Access Journals (Sweden)

    Costanza Di Stefano

    2016-09-01

    Full Text Available Soil loss tolerance is a criterion for establishing whether a soil is potentially subject to erosion risk or productivity loss, and whether a river presents downstream over-sedimentation or other off-site effects at the basin scale. This paper first reviews the concept of tolerable soil loss and summarises the available definitions and the knowledge on recommended values and evaluation criteria. Then a threshold soil loss value, at the annual temporal scale, established for limiting rilling, is used to define the classical soil loss tolerance. Finally, some research needs on tolerable soil loss are listed.

  16. Global consequences of US environmental policies

    International Nuclear Information System (INIS)

    Sedjo, R.A.

    1993-01-01

    Attempts to quantify the financial and social benefits and costs of habitat protection, and critiques of those attempts, have missed a major element: the global environmental consequences. In a global economy linked by international trade, a significant reduction in timber harvests in one region will probably precipitate actions in other regions that may be detrimental to the global environment. These reactions would offset most or all of the alleged environmental benefits. The author uses the spotted owl controversy in the Pacific Northwest to illustrate his points. Global aspects of employment, marketing evaluations, and fossil fuel implications are all discussed. The author feels that responses from environmentally responsible citizens would be influenced if it were more widely known that, in a global system, domestic habitat protection and land-use decisions involve substantial environmental costs elsewhere

  17. The Global Tsunami Model (GTM)

    Science.gov (United States)

    Lorito, S.; Basili, R.; Harbitz, C. B.; Løvholt, F.; Polet, J.; Thio, H. K.

    2017-12-01

    The tsunamis that occurred worldwide in the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but often disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.

  18. Global Carbon Budget 2017

    Science.gov (United States)

    Le Quéré, Corinne; Andrew, Robbie M.; Friedlingstein, Pierre; Sitch, Stephen; Pongratz, Julia; Manning, Andrew C.; Korsbakken, Jan Ivar; Peters, Glen P.; Canadell, Josep G.; Jackson, Robert B.; Boden, Thomas A.; Tans, Pieter P.; Andrews, Oliver D.; Arora, Vivek K.; Bakker, Dorothee C. E.; Barbero, Leticia; Becker, Meike; Betts, Richard A.; Bopp, Laurent; Chevallier, Frédéric; Chini, Louise P.; Ciais, Philippe; Cosca, Catherine E.; Cross, Jessica; Currie, Kim; Gasser, Thomas; Harris, Ian; Hauck, Judith; Haverd, Vanessa; Houghton, Richard A.; Hunt, Christopher W.; Hurtt, George; Ilyina, Tatiana; Jain, Atul K.; Kato, Etsushi; Kautz, Markus; Keeling, Ralph F.; Klein Goldewijk, Kees; Körtzinger, Arne; Landschützer, Peter; Lefèvre, Nathalie; Lenton, Andrew; Lienert, Sebastian; Lima, Ivan; Lombardozzi, Danica; Metzl, Nicolas; Millero, Frank; Monteiro, Pedro M. S.; Munro, David R.; Nabel, Julia E. M. S.; Nakaoka, Shin-ichiro; Nojiri, Yukihiro; Padin, X. Antonio; Peregon, Anna; Pfeil, Benjamin; Pierrot, Denis; Poulter, Benjamin; Rehder, Gregor; Reimer, Janet; Rödenbeck, Christian; Schwinger, Jörg; Séférian, Roland; Skjelvan, Ingunn; Stocker, Benjamin D.; Tian, Hanqin; Tilbrook, Bronte; Tubiello, Francesco N.; van der Laan-Luijkx, Ingrid T.; van der Werf, Guido R.; van Heuven, Steven; Viovy, Nicolas; Vuichard, Nicolas; Walker, Anthony P.; Watson, Andrew J.; Wiltshire, Andrew J.; Zaehle, Sönke; Zhu, Dan

    2018-03-01

    Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere - the global carbon budget - is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and methodology to quantify the five major components of the global carbon budget and their uncertainties. CO2 emissions from fossil fuels and industry (EFF) are based on energy statistics and cement production data, respectively, while emissions from land-use change (ELUC), mainly deforestation, are based on land-cover change data and bookkeeping models. The global atmospheric CO2 concentration is measured directly and its rate of growth (GATM) is computed from the annual changes in concentration. The ocean CO2 sink (SOCEAN) and terrestrial CO2 sink (SLAND) are estimated with global process models constrained by observations. The resulting carbon budget imbalance (BIM), the difference between the estimated total emissions and the estimated changes in the atmosphere, ocean, and terrestrial biosphere, is a measure of imperfect data and understanding of the contemporary carbon cycle. All uncertainties are reported as ±1σ. For the last decade available (2007-2016), EFF was 9.4 ± 0.5 GtC yr-1, ELUC 1.3 ± 0.7 GtC yr-1, GATM 4.7 ± 0.1 GtC yr-1, SOCEAN 2.4 ± 0.5 GtC yr-1, and SLAND 3.0 ± 0.8 GtC yr-1, with a budget imbalance BIM of 0.6 GtC yr-1 indicating overestimated emissions and/or underestimated sinks. For year 2016 alone, the growth in EFF was approximately zero and emissions remained at 9.9 ± 0.5 GtC yr-1. Also for 2016, ELUC was 1.3 ± 0.7 GtC yr-1, GATM was 6.1 ± 0.2 GtC yr-1, SOCEAN was 2.6 ± 0.5 GtC yr-1, and SLAND was 2.7 ± 1.0 GtC yr-1, with a small BIM of -0.3 GtC. GATM continued to be higher in 2016 compared to the past decade (2007-2016), reflecting in part the high fossil emissions and the small SLAND
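
    The budget identity and the decadal numbers quoted in the abstract can be checked directly (values in GtC per year):

```python
# Global carbon budget identity: EFF + ELUC = GATM + SOCEAN + SLAND + BIM
decade = dict(EFF=9.4, ELUC=1.3, GATM=4.7, SOCEAN=2.4, SLAND=3.0)  # 2007-2016 means

def budget_imbalance(c):
    """BIM = estimated total emissions minus estimated redistribution."""
    return (c["EFF"] + c["ELUC"]) - (c["GATM"] + c["SOCEAN"] + c["SLAND"])

print(round(budget_imbalance(decade), 2))  # → 0.6, the reported decadal BIM
```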

  19. Global Carbon Budget 2017

    Directory of Open Access Journals (Sweden)

    C. Le Quéré

    2018-03-01

    Full Text Available Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere – the global carbon budget – is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and methodology to quantify the five major components of the global carbon budget and their uncertainties. CO2 emissions from fossil fuels and industry (EFF) are based on energy statistics and cement production data, respectively, while emissions from land-use change (ELUC), mainly deforestation, are based on land-cover change data and bookkeeping models. The global atmospheric CO2 concentration is measured directly and its rate of growth (GATM) is computed from the annual changes in concentration. The ocean CO2 sink (SOCEAN) and terrestrial CO2 sink (SLAND) are estimated with global process models constrained by observations. The resulting carbon budget imbalance (BIM), the difference between the estimated total emissions and the estimated changes in the atmosphere, ocean, and terrestrial biosphere, is a measure of imperfect data and understanding of the contemporary carbon cycle. All uncertainties are reported as ±1σ. For the last decade available (2007–2016), EFF was 9.4 ± 0.5 GtC yr−1, ELUC 1.3 ± 0.7 GtC yr−1, GATM 4.7 ± 0.1 GtC yr−1, SOCEAN 2.4 ± 0.5 GtC yr−1, and SLAND 3.0 ± 0.8 GtC yr−1, with a budget imbalance BIM of 0.6 GtC yr−1 indicating overestimated emissions and/or underestimated sinks. For year 2016 alone, the growth in EFF was approximately zero and emissions remained at 9.9 ± 0.5 GtC yr−1. Also for 2016, ELUC was 1.3 ± 0.7 GtC yr−1, GATM was 6.1 ± 0.2 GtC yr−1, SOCEAN was 2.6 ± 0.5 GtC yr−1, and SLAND was 2.7 ± 1.0 GtC yr−1, with a small BIM of −0.3 GtC. GATM continued to be

  20. SU-F-T-182: A Stochastic Approach to Daily QA Tolerances On Spot Properties for Proton Pencil Beam Scanning

    International Nuclear Information System (INIS)

    St James, S; Bloch, C; Saini, J

    2016-01-01

    Purpose: Proton pencil beam scanning is used clinically across the United States. There are no current guidelines on tolerances for daily QA specific to pencil beam scanning, specifically related to the individual spot properties (spot width). Using a stochastic method to determine tolerances has the potential to optimize tolerances on individual spots and decrease the number of false positive failures in daily QA. Individual and global spot tolerances were evaluated. Methods: As part of daily QA for proton pencil beam scanning, a field of 16 spots (corresponding to 8 energies) is measured using an array of ion chambers (Matrixx, IBA). Each individual spot is fit to two Gaussian functions (x,y). The spot width (σ) in × and y are recorded (32 parameters). Results from the daily QA were retrospectively analyzed for 100 days of data. The deviations of the spot widths were histogrammed and fit to a Gaussian function. The stochastic spot tolerance was taken to be the mean ± 3σ. Using these results, tolerances were developed and tested against known deviations in spot width. Results: The individual spot tolerances derived with the stochastic method decreased in 30/32 instances. Using the previous tolerances (± 20% width), the daily QA would have detected 0/20 days of the deviation. Using a tolerance of any 6 spots failing the stochastic tolerance, 18/20 days of the deviation would have been detected. Conclusion: Using a stochastic method we have been able to decrease daily tolerances on the spot widths for 30/32 spot widths measured. The stochastic tolerances can lead to detection of deviations that previously would have been picked up on monthly QA and missed by daily QA. This method could be easily extended for evaluation of other QA parameters in proton spot scanning.
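
    The stochastic rule described above (a per-spot band of mean ± 3σ from historical deviations, with the day flagged when any 6 of the 32 spot widths fail) can be sketched as follows; the history values are illustrative, not clinic data:

```python
import statistics

def stochastic_tolerance(deviations):
    """Per-spot band: mean ± 3 sigma of that spot's historical QA deviations."""
    mu = statistics.mean(deviations)
    sigma = statistics.stdev(deviations)
    return mu - 3 * sigma, mu + 3 * sigma

def daily_qa_fails(widths, bands, max_failing_spots=5):
    """Flag the day when more than max_failing_spots spot widths sit outside
    their bands (the abstract's rule: any 6 of the 32 spots failing)."""
    failing = sum(1 for w, (lo, hi) in zip(widths, bands) if not lo <= w <= hi)
    return failing > max_failing_spots

# illustrative history of one spot's width deviations over past QA sessions
history = [0.0, 0.1, -0.1, 0.05, -0.05]
print(stochastic_tolerance(history))
```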

  1. Quantifying Riverscape Connectivity with Graph Theory

    Science.gov (United States)

    Carbonneau, P.; Milledge, D.; Sinha, R.; Tandon, S. K.

    2013-12-01

    Fluvial catchments convey fluxes of water, sediment, nutrients and aquatic biota. At continental scales, crustal topography defines the overall path of channels, whilst at local scales depositional and/or erosional features generally determine the exact path of a channel. Furthermore, constructions such as dams, for either water abstraction or hydropower, often have a significant impact on channel networks. The concept of 'connectivity' is commonly invoked when conceptualising the structure of a river network. This concept is easy to grasp, but there have been uneven efforts across the environmental sciences to actually quantify connectivity. Currently there have been only a few studies reporting quantitative indices of connectivity in the river sciences, notably in the study of avulsion processes. However, the majority of current work describing some form of environmental connectivity in a quantitative manner is in the field of landscape ecology. Driven by the need to quantify habitat fragmentation, landscape ecologists have returned to graph theory. Within this formal setting, landscape ecologists have successfully developed a range of indices which can model connectivity loss. Such formal connectivity metrics are currently needed for a range of applications in the fluvial sciences. One of the most urgent needs relates to dam construction. In the developed world, hydropower development has generally slowed, and in many countries dams are actually being removed. However, this is not the case in the developing world, where hydropower is seen as a key element of low-emissions power security. For example, several dam projects are envisaged in Himalayan catchments in the next two decades. This region is already under severe pressure from climate change and urbanisation, and a better understanding of the network fragmentation which can be expected in this system is urgently needed. In this paper, we apply and adapt connectivity metrics from landscape ecology. We then examine the
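
    One of the simplest graph-theoretic indices, the fraction of node pairs that remain mutually reachable, already captures the fragmentation caused by a dam. A stdlib-only sketch (the index and the toy network are illustrative, not taken from the paper):

```python
from collections import defaultdict, deque

def connected_pairs_fraction(edges, nodes):
    """Fraction of node pairs that can still reach one another: a minimal
    connectivity index in the spirit of those used in landscape ecology."""
    nodes = list(nodes)
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)
    seen, reachable_pairs = set(), 0
    for start in nodes:
        if start in seen:
            continue
        comp, queue = {start}, deque([start])
        while queue:                       # BFS over one connected component
            u = queue.popleft()
            for v in graph[u]:
                if v not in comp:
                    comp.add(v)
                    queue.append(v)
        seen |= comp
        reachable_pairs += len(comp) * (len(comp) - 1) // 2
    n = len(nodes)
    return reachable_pairs / (n * (n - 1) // 2)

# a small dendritic network; removing edge (2, 4) mimics a dam on the main stem
edges = [(1, 2), (2, 3), (2, 4), (4, 5), (4, 6)]
nodes = range(1, 7)
print(connected_pairs_fraction(edges, nodes))                              # → 1.0
print(connected_pairs_fraction([e for e in edges if e != (2, 4)], nodes))  # → 0.4
```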

  2. Quantifying human vitamin kinetics using AMS

    Energy Technology Data Exchange (ETDEWEB)

    Hillegonds, D; Dueker, S; Ognibene, T; Buchholz, B; Lin, Y; Vogel, J; Clifford, A

    2004-02-19

    Tracing vitamin kinetics at physiologic concentrations has been hampered by a lack of quantitative sensitivity for chemically equivalent tracers that could be used safely in healthy people. Instead, elderly or ill volunteers were sought for studies involving pharmacologic doses with radioisotopic labels. These studies fail to be relevant in two ways: vitamins are inherently micronutrients, whose biochemical paths are saturated and distorted by pharmacological doses; and while vitamins remain important for health in the elderly or ill, their greatest effects may be in preventing slow and cumulative diseases through proper consumption throughout youth and adulthood. Neither the target dose nor the target population is available for nutrient metabolic studies through decay counting of radioisotopes at high levels. Stable isotopic labels are quantified by isotope ratio mass spectrometry at levels that trace physiologic vitamin doses, but the natural background of stable isotopes severely limits the time span over which the tracer is distinguishable. Indeed, study periods seldom ranged over a single biological mean life of the labeled nutrients, failing to provide data on the important final elimination phase of the compound. Kinetic data for the absorption phase are similarly rare in micronutrient research because the phase is rapid, requiring many consecutive plasma samples for accurate representation. However, repeated blood samples of sufficient volume for precise stable- or radio-isotope quantitation consume an indefensible amount of the volunteer's blood over a short period. Thus, vitamin pharmacokinetics in humans has often relied on compartmental modeling based upon assumptions and tested only for the short period of maximal blood circulation, a period that poorly reflects absorption or final elimination kinetics except for the simplest models.
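The compartmental models referred to typically begin with a one-compartment scheme with first-order absorption and elimination, whose closed-form solution is the Bateman equation. The rate constants and doses below are illustrative assumptions, not values from any study:

```python
import math

def plasma_concentration(t, dose, ka, ke, v_dist, f_abs=1.0):
    """One-compartment model with first-order absorption (ka) and
    elimination (ke): the Bateman equation. Assumes ka != ke; dose,
    bioavailable fraction f_abs and distribution volume v_dist set
    the concentration scale."""
    coef = f_abs * dose * ka / (v_dist * (ka - ke))
    return coef * (math.exp(-ke * t) - math.exp(-ka * t))

# Illustrative parameters: rapid absorption (ka = 1.0/h), slow
# elimination (ke = 0.1/h); the peak falls at t_max = ln(ka/ke)/(ka-ke).
peak_time = math.log(1.0 / 0.1) / (1.0 - 0.1)
```

The short window of maximal blood circulation criticized in the abstract corresponds to sampling only around `peak_time`, which constrains neither the early absorption rise nor the late elimination tail.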

  3. Quantifying Sentiment and Influence in Blogspaces

    Energy Technology Data Exchange (ETDEWEB)

    Hui, Peter SY; Gregory, Michelle L.

    2010-07-25

    The weblog, or blog, has become a popular form of social media, through which authors can write posts, which can in turn generate feedback in the form of user comments. Considered in totality, a collection of blogs can thus be viewed as an informal record of mass sentiment and opinion. An obvious topic of interest is to mine this collection to obtain some gauge of public sentiment over the wide variety of topics contained therein. However, the sheer size of the so-called blogosphere, combined with the fact that the subjects of posts can vary over a practically limitless number of topics, poses serious challenges when any meaningful analysis is attempted. Namely, the fact that virtually anyone with access to the Internet can author their own blog raises the issue of credibility: should some blogs be considered more influential than others, and consequently, when gauging sentiment with respect to a topic, should some blogs be weighted more heavily than others? In addition, as new posts and comments can be made on an almost constant basis, any blog analysis algorithm must be able to handle such updates efficiently. In this paper, we give a formalization of the blog model. We give formal methods of quantifying sentiment and influence with respect to a hierarchy of topics, with the specific aim of facilitating the computation of a per-topic, influence-weighted sentiment measure. Finally, as efficiency is a specific end goal, we give upper bounds on the time required to update these values with new posts, showing that our analysis and algorithms are scalable.
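In a much-simplified flat form (the paper's formalization handles a topic hierarchy and incremental updates, which this sketch omits), an influence-weighted sentiment measure is a weighted mean over the posts on a topic:

```python
def influence_weighted_sentiment(posts):
    """posts: iterable of (sentiment, influence) pairs, with sentiment
    in [-1, 1] and influence a non-negative per-blog weight.
    Returns the influence-weighted mean sentiment, or 0.0 when all
    weights are zero (no credible signal)."""
    posts = list(posts)  # accept any iterable; we traverse it twice
    total_w = sum(w for _, w in posts)
    if total_w == 0:
        return 0.0
    return sum(s * w for s, w in posts) / total_w
```

Weighting by influence directly encodes the credibility question raised above: a highly influential blog moves the per-topic score more than an obscure one with the same sentiment.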

  4. Quantifying antimicrobial resistance at veal calf farms.

    Directory of Open Access Journals (Sweden)

    Angela B Bosman

    Full Text Available This study was performed to determine a sampling strategy to quantify the prevalence of antimicrobial resistance on veal calf farms, based on the variation in antimicrobial resistance within and between calves on five farms. Faecal samples from 50 healthy calves (10 calves/farm) were collected. From each individual sample and one pooled faecal sample per farm, 90 selected Escherichia coli isolates were tested for their resistance against 25 mg/L amoxicillin, 25 mg/L tetracycline, 0.5 mg/L cefotaxime, 0.125 mg/L ciprofloxacin and 8/152 mg/L trimethoprim/sulfamethoxazole (tmp/s) by replica plating. From each faecal sample another 10 selected E. coli isolates were tested for their resistance by broth microdilution as a reference. Logistic regression analysis was performed to compare the odds of testing an isolate resistant between both test methods (replica plating vs. broth microdilution) and to evaluate the effect of pooling faecal samples. Bootstrap analysis was used to investigate the precision of the estimated prevalence of resistance to each antimicrobial obtained by several simulated sampling strategies. Replica plating showed similar odds of E. coli isolates testing resistant compared to broth microdilution, except for ciprofloxacin (OR 0.29, p ≤ 0.05). Pooled samples in general showed lower odds of an isolate being resistant compared to individual samples, although these differences were not significant. Bootstrap analysis showed that, within each antimicrobial, the various compositions of a pooled sample provided consistent estimates of the mean proportion of resistant isolates. Sampling strategies should be based on the variation in resistance among isolates within faecal samples and between faecal samples, which may vary by antimicrobial. In our study, the optimal sampling strategy from the perspective of precision of the estimated levels of resistance and practicality consists of a pooled faecal sample from 20 individual animals, of which
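The bootstrap step, estimating the precision of a resistance prevalence from resampled isolates, can be sketched as below; the percentile method and its parameters are illustrative assumptions, not necessarily those of the study:

```python
import random

def bootstrap_prevalence_ci(resistant_flags, n_boot=2000, alpha=0.05, seed=1):
    """Percentile-bootstrap confidence interval for the proportion of
    resistant isolates in a sample of 0/1 resistance flags.

    Resamples the isolates with replacement n_boot times and returns
    the (alpha/2, 1 - alpha/2) percentiles of the resampled proportions.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    n = len(resistant_flags)
    props = sorted(
        sum(rng.choices(resistant_flags, k=n)) / n for _ in range(n_boot)
    )
    lo = props[int(n_boot * alpha / 2)]
    hi = props[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi
```

Running this over simulated sampling strategies (varying how many isolates per sample and how many samples per pool) shows how the width of the interval, i.e. the precision, depends on the strategy.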

  5. Quantifying the Clinical Significance of Cannabis Withdrawal

    Science.gov (United States)

    Allsop, David J.; Copeland, Jan; Norberg, Melissa M.; Fu, Shanlin; Molnar, Anna; Lewis, John; Budney, Alan J.

    2012-01-01

    Background and Aims Questions over the clinical significance of cannabis withdrawal have hindered its inclusion as a discrete cannabis induced psychiatric condition in the Diagnostic and Statistical Manual of Mental Disorders (DSM IV). This study aims to quantify functional impairment to normal daily activities from cannabis withdrawal, and looks at the factors predicting functional impairment. In addition the study tests the influence of functional impairment from cannabis withdrawal on cannabis use during and after an abstinence attempt. Methods and Results A volunteer sample of 49 non-treatment seeking cannabis users who met DSM-IV criteria for dependence provided daily withdrawal-related functional impairment scores during a one-week baseline phase and two weeks of monitored abstinence from cannabis with a one month follow up. Functional impairment from withdrawal symptoms was strongly associated with symptom severity (p = 0.0001). Participants with more severe cannabis dependence before the abstinence attempt reported greater functional impairment from cannabis withdrawal (p = 0.03). Relapse to cannabis use during the abstinence period was associated with greater functional impairment from a subset of withdrawal symptoms in high dependence users. Higher levels of functional impairment during the abstinence attempt predicted higher levels of cannabis use at one month follow up (p = 0.001). Conclusions Cannabis withdrawal is clinically significant because it is associated with functional impairment to normal daily activities, as well as relapse to cannabis use. Sample size in the relapse group was small and the use of a non-treatment seeking population requires findings to be replicated in clinical samples. Tailoring treatments to target withdrawal symptoms contributing to functional impairment during a quit attempt may improve treatment outcomes. PMID:23049760

  6. Quantifying seasonal velocity at Khumbu Glacier, Nepal

    Science.gov (United States)

    Miles, E.; Quincey, D. J.; Miles, K.; Hubbard, B. P.; Rowan, A. V.

    2017-12-01

    While the low-gradient debris-covered tongues of many Himalayan glaciers exhibit low surface velocities, quantifying ice flow and its variation through time remains a key challenge for studies aimed at determining the long-term evolution of these glaciers. Recent work has suggested that glaciers in the Everest region of Nepal may show seasonal variability in surface velocity, with ice flow peaking during the summer as monsoon precipitation provides hydrological inputs and thus drives changes in subglacial drainage efficiency. However, satellite and aerial observations of glacier velocity during the monsoon are greatly limited due to cloud cover. Those that do exist do not span the period over which the most dynamic changes occur, and consequently short-term (i.e. daily) changes in flow, as well as the evolution of ice dynamics through the monsoon period, remain poorly understood. In this study, we combine field and remote (satellite image) observations to create a multi-temporal, 3D synthesis of ice deformation rates at Khumbu Glacier, Nepal, focused on the 2017 monsoon period. We first determine net annual and seasonal surface displacements for the whole glacier based on Landsat-8 (OLI) panchromatic data (15m) processed with ImGRAFT. We integrate inclinometer observations from three boreholes drilled by the EverDrill project to determine cumulative deformation at depth, providing a 3D perspective and enabling us to assess the role of basal sliding at each site. We additionally analyze high-frequency on-glacier L1 GNSS data from three sites to characterize variability within surface deformation at sub-seasonal timescales. Finally, each dataset is validated against repeat-dGPS observations at gridded points in the vicinity of the boreholes and GNSS dataloggers. 
These datasets complement one another to infer thermal regime across the debris-covered ablation area of the glacier, and emphasize the seasonal and spatial variability of ice deformation for glaciers in High

  7. Quantifying collective attention from tweet stream.

    Directory of Open Access Journals (Sweden)

    Kazutoshi Sasahara

    Full Text Available Online social media are increasingly facilitating our social interactions, thereby making available a massive "digital fossil" of human behavior. Discovering and quantifying distinct patterns using these data is important for studying social behavior, although the rapid time-variant nature and large volumes of these data make this task difficult and challenging. In this study, we focused on the emergence of "collective attention" on Twitter, a popular social networking service. We propose a simple method for detecting and measuring the collective attention evoked by various types of events. This method exploits the fact that tweeting activity exhibits a burst-like increase and an irregular oscillation when a particular real-world event occurs; otherwise, it follows regular circadian rhythms. The difference between regular and irregular states in the tweet stream was measured using the Jensen-Shannon divergence, which corresponds to the intensity of collective attention. We then associated irregular incidents with their corresponding events that attracted attention and elicited responses from large numbers of people, based on the popularity and the enhancement of key terms in posted messages or "tweets." Next, we demonstrate the effectiveness of this method using a large dataset that contained approximately 490 million Japanese tweets by over 400,000 users, in which we identified 60 cases of collective attention, including one related to the Tohoku-oki earthquake. "Retweet" networks were also investigated to understand collective attention in terms of social interactions. This simple method provides a retrospective summary of collective attention, thereby contributing to the fundamental understanding of social behavior in the digital era.
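The divergence measure named here can be computed directly from binned tweet-rate histograms, e.g. tweets per hour of day in a regular baseline window versus the current window (that binning is an assumption; the abstract does not specify it):

```python
import math

def jensen_shannon_divergence(p, q):
    """Jensen-Shannon divergence between two discrete distributions,
    given as equal-length lists summing to 1. With base-2 logs the
    value lies in [0, 1]; larger values mean the tweet stream is
    further from its regular circadian pattern."""
    def kl(a, b):
        # Kullback-Leibler divergence; terms with a_i = 0 contribute 0.
        return sum(x * math.log2(x / y) for x, y in zip(a, b) if x > 0)
    m = [(x + y) / 2.0 for x, y in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)
```

Unlike raw KL divergence, the JSD is symmetric and always finite, which makes it robust when a burst puts all activity into bins that are empty in the baseline.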

  8. Quantifying the clinical significance of cannabis withdrawal.

    Directory of Open Access Journals (Sweden)

    David J Allsop

    Full Text Available Questions over the clinical significance of cannabis withdrawal have hindered its inclusion as a discrete cannabis-induced psychiatric condition in the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV). This study aims to quantify functional impairment to normal daily activities from cannabis withdrawal, and looks at the factors predicting functional impairment. In addition the study tests the influence of functional impairment from cannabis withdrawal on cannabis use during and after an abstinence attempt. A volunteer sample of 49 non-treatment seeking cannabis users who met DSM-IV criteria for dependence provided daily withdrawal-related functional impairment scores during a one-week baseline phase and two weeks of monitored abstinence from cannabis with a one month follow up. Functional impairment from withdrawal symptoms was strongly associated with symptom severity (p = 0.0001). Participants with more severe cannabis dependence before the abstinence attempt reported greater functional impairment from cannabis withdrawal (p = 0.03). Relapse to cannabis use during the abstinence period was associated with greater functional impairment from a subset of withdrawal symptoms in high dependence users. Higher levels of functional impairment during the abstinence attempt predicted higher levels of cannabis use at one month follow up (p = 0.001). Cannabis withdrawal is clinically significant because it is associated with functional impairment to normal daily activities, as well as relapse to cannabis use. Sample size in the relapse group was small and the use of a non-treatment seeking population requires findings to be replicated in clinical samples. Tailoring treatments to target withdrawal symptoms contributing to functional impairment during a quit attempt may improve treatment outcomes.

  9. Quantifying motion for pancreatic radiotherapy margin calculation

    International Nuclear Information System (INIS)

    Whitfield, Gillian; Jain, Pooja; Green, Melanie; Watkins, Gillian; Henry, Ann; Stratford, Julie; Amer, Ali; Marchant, Thomas; Moore, Christopher; Price, Patricia

    2012-01-01

    Background and purpose: Pancreatic radiotherapy (RT) is limited by uncertain target motion. We quantified 3D patient/organ motion during pancreatic RT and calculated required treatment margins. Materials and methods: Cone-beam computed tomography (CBCT) and orthogonal fluoroscopy images were acquired post-RT delivery from 13 patients with locally advanced pancreatic cancer. Bony setup errors were calculated from CBCT. Inter- and intra-fraction fiducial (clip/seed/stent) motion was determined from CBCT projections and orthogonal fluoroscopy. Results: Using an off-line CBCT correction protocol, systematic (random) setup errors were 2.4 (3.2), 2.0 (1.7) and 3.2 (3.6) mm laterally (left–right), vertically (anterior–posterior) and longitudinally (cranio-caudal), respectively. Fiducial motion varied substantially. Random inter-fractional changes in mean fiducial position were 2.0, 1.6 and 2.6 mm; 95% of intra-fractional peak-to-peak fiducial motion was up to 6.7, 10.1 and 20.6 mm, respectively. Calculated clinical to planning target volume (CTV–PTV) margins were 1.4 cm laterally, 1.4 cm vertically and 3.0 cm longitudinally for 3D conformal RT, reduced to 0.9, 1.0 and 1.8 cm, respectively, if using 4D planning and online setup correction. Conclusions: Commonly used CTV–PTV margins may inadequately account for target motion during pancreatic RT. Our results indicate better immobilisation, individualised allowance for respiratory motion, online setup error correction and 4D planning would improve targeting.
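A standard way to convert systematic (Σ) and random (σ) error SDs into a per-axis CTV-PTV margin is the van Herk recipe, M = 2.5Σ + 0.7σ. The abstract does not state the exact recipe used, and the study's margins also fold in fiducial motion, so this sketch illustrates the recipe alone applied to the quoted bony setup errors:

```python
def van_herk_margin(sigma_systematic_mm, sigma_random_mm):
    """CTV-PTV margin (mm) per axis from the van Herk recipe:
    M = 2.5 * Sigma + 0.7 * sigma, where Sigma is the systematic-error
    SD and sigma the random-error SD, both in mm."""
    return 2.5 * sigma_systematic_mm + 0.7 * sigma_random_mm

# Lateral bony setup errors from the abstract (2.4 mm systematic,
# 3.2 mm random) alone imply a margin of about 8.2 mm, before any
# allowance for fiducial (organ) motion.
lateral_setup_margin = van_herk_margin(2.4, 3.2)
```

The 2.5Σ term dominates, which is why the off-line correction protocol (reducing Σ) shrinks margins more effectively than reducing random error.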

  10. Bracketing effects on risk tolerance

    Directory of Open Access Journals (Sweden)

    Ester Moher

    2010-08-01

    Full Text Available Research has shown that risk tolerance increases when multiple decisions and associated outcomes are presented together in a broader "bracket" rather than one at a time. The present studies disentangle the influence of problem bracketing (presenting multiple investment options together) from that of outcome bracketing (presenting the aggregated outcomes of multiple decisions), factors which have been deliberately confounded in previous research. In the standard version of the bracketing task, in which participants decide how much of an initial endowment to invest in each of a series of repeated, identical gambles, we find a problem bracketing effect but not an outcome bracketing effect. However, this pattern of results does not generalize to the cases of non-identical gambles or discrete choice, where we fail to find the standard bracketing effect.

  11. Euro-Multiculturalism and Toleration

    DEFF Research Database (Denmark)

    Lægaard, Sune

    2014-01-01

    The underlying concept of multiculturalism in many European discussions is different from that made prominent by the classic cases, e.g. in Canada, that have functioned as paradigm cases which the most prominent theories of multiculturalism have been tailored to fit and justify. The paper argues … and b) that this "Euro-multiculturalism" is under-inclusive in the sense that it collapses multiculturalism into standard liberal political theory and fails to explain what is distinctive about multiculturalism. Finally the paper shows that multiculturalism in this sense can involve issues of toleration.

  12. The Holistic Integrity Test (HIT - quantified resilience analysis

    Directory of Open Access Journals (Sweden)

    Dobson Mike

    2016-01-01

    Full Text Available The Holistic Integrity Test (HIT): quantified resilience analysis. Rising sea levels and wider climate change mean we face an increasing risk from flooding and other natural hazards, while tough economic times make it difficult to justify or afford the desired level of engineered risk reduction. Add to this significant uncertainty from a range of future predictions, constantly updated with new science. We therefore need to understand not just how to reduce the risk, but what could happen should above-design-standard events occur. In flood terms this includes not only the direct impacts (damage and loss of life), but the wider cascade impacts on infrastructure systems and the longer-term impacts on the economy and society. Understanding the "what if" is, however, only the first part of the equation; a range of improvement measures to mitigate such effects must also be identified and implemented. These measures should consider reducing the risk, lessening the consequences, aiding the response, and speeding up the recovery. They need to be objectively assessed through quantitative analysis, which underpins them technically and economically; without such analysis, it cannot be predicted how measures will perform if extreme events occur. It is also vital to consider all possible hazards, as measures for one hazard may hinder the response to another. The Holistic Integrity Test (HIT) uses quantitative system analysis and "HITs" the site, its infrastructure, contained dangers and wider regional system to determine how it copes with a range of severe shock events, before, during and after the event, whilst also accounting for uncertainty (as illustrated in figure 1). First presented at the TINCE 2014 Nuclear Conference in Paris in the context of a nuclear facility needing to analyse its site in response to post-Fukushima needs, the HIT is nevertheless universally applicable. The HIT has three key risk reduction goals:

  13. Targeting phenotypically tolerant Mycobacterium tuberculosis

    Science.gov (United States)

    Gold, Ben; Nathan, Carl

    2016-01-01

    While the immune system is credited with averting tuberculosis in billions of individuals exposed to Mycobacterium tuberculosis, the immune system is also culpable for tempering the ability of antibiotics to deliver swift and durable cure of disease. In individuals afflicted with tuberculosis, host immunity produces diverse microenvironmental niches that support suboptimal growth, or complete growth arrest, of M. tuberculosis. The physiological state of nonreplication in bacteria is associated with phenotypic drug tolerance. Many of these host microenvironments, when modeled in vitro by carbon starvation, complete nutrient starvation, stationary phase, acidic pH, reactive nitrogen intermediates, hypoxia, biofilms, and withholding streptomycin from the streptomycin-addicted strain SS18b, render M. tuberculosis profoundly tolerant to many of the antibiotics that are given to tuberculosis patients in a clinical setting. Targeting nonreplicating persisters is anticipated to reduce the duration of antibiotic treatment and rate of post-treatment relapse. Some promising drugs to treat tuberculosis, such as rifampicin and bedaquiline, only kill nonreplicating M. tuberculosis in vitro at concentrations far greater than their minimal inhibitory concentrations against replicating bacilli. There is an urgent demand to identify which of the currently used antibiotics, and which of the molecules in academic and corporate screening collections, have potent bactericidal action on nonreplicating M. tuberculosis. With this goal, we review methods of high throughput screening to target nonreplicating M. tuberculosis and methods to progress candidate molecules. A classification based on structures and putative targets of molecules that have been reported to kill nonreplicating M. tuberculosis revealed a rich diversity in pharmacophores. However, few of these compounds were tested under conditions that would exclude the impact of adsorbed compound acting during the recovery phase of

  14. Quantifying forest mortality with the remote sensing of snow

    Science.gov (United States)

    Baker, Emily Hewitt

    Greenhouse gas emissions have altered global climate significantly, increasing the frequency of drought, fire, and pest-related mortality in forests across the western United States, with increasing area affected each year. Associated changes in forests are of great concern for the public, land managers, and the broader scientific community. These increased stresses have resulted in a widespread, spatially heterogeneous decline of forest canopies, which in turn exerts strong controls on the accumulation and melt of the snowpack, and changes forest-atmosphere exchanges of carbon, water, and energy. Most satellite-based retrievals of summer-season forest data are insufficient to quantify canopy, as opposed to the combination of canopy and undergrowth, since the signals of the two types of vegetation greenness have proven persistently difficult to distinguish. To overcome this issue, this research develops a method to quantify forest canopy cover using winter-season fractional snow-covered area (FSCA) data from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) snow covered area and grain size (MODSCAG) algorithm. In areas where the ground surface and undergrowth are completely snow-covered, a pixel comprises only forest canopy and snow. Following a snowfall event, FSCA initially rises, as snow is intercepted in the canopy, and then falls, as snow unloads. A select set of local minima in a winter FSCA time series forms a threshold at which the canopy is snow-free but the forest understory is snow-covered. This serves as a spatially explicit measurement of forest canopy, and viewable gap fraction (VGF), on a yearly basis. Using this method, we determine that MODIS-observed VGF is significantly correlated with an independent product of yearly crown mortality derived from spectral analysis of Landsat imagery at 25 high-mortality sites in northern Colorado (r = 0.96 ± 0.03, p = 0.03). Additionally, we determine the lag timing between green-stage tree mortality and
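The minima-threshold idea can be sketched as follows: in a winter FSCA time series, local minima mark times when the canopy has unloaded its snow while the understory remains snow-covered, so the FSCA there approximates the viewable gap fraction. Taking the lowest interior local minimum is one simple selection rule, assumed here rather than taken from the study:

```python
def viewable_gap_fraction(fsca_series):
    """Estimate viewable gap fraction (VGF) for one pixel from a winter
    FSCA time series. Local minima occur once canopy snow has unloaded
    but the ground below is still snow-covered, so FSCA then reflects
    the canopy gaps alone.

    Returns the lowest interior local minimum, or None if there is none.
    """
    minima = [
        cur
        for prev, cur, nxt in zip(fsca_series, fsca_series[1:], fsca_series[2:])
        if cur <= prev and cur <= nxt
    ]
    return min(minima) if minima else None
```

Canopy cover then follows as 1 minus VGF, and year-over-year increases in VGF at a pixel flag potential crown mortality.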

  15. Becoming less tolerant with age: sugar maple, shade, and ontogeny.

    Science.gov (United States)

    Sendall, Kerrie M; Lusk, Christopher H; Reich, Peter B

    2015-12-01

    Although shade tolerance is often assumed to be a fixed trait, recent work suggests ontogenetic changes in the light requirements of tree species. We determined the influence of gas exchange, biomass distribution, and self-shading on ontogenetic variation in the instantaneous aboveground carbon balance of Acer saccharum. We quantified the aboveground biomass distributions of 18 juveniles varying in height and growing in low light in a temperate forest understory in Minnesota, USA. Gas exchange rates of leaf and stem tissues were measured, and the crown architecture of each individual was quantified. The YPLANT program was used to estimate the self-shaded fraction of each crown and to model net leaf-level carbon gain. Leaf respiration and photosynthesis per gram of leaf tissue increased with plant size. In contrast, stem respiration rates per gram of stem tissue declined, reflecting a shift in the distribution of stem diameter sizes from smaller (with higher respiration) to larger diameter classes. However, these trends were outweighed by ontogenetic increases in self-shading (which reduces the net photosynthesis realized) and stem mass fraction (which increases the proportion of purely respiratory tissue) in terms of influence on net carbon exchange. As a result, net carbon gain per gram of aboveground plant tissue declined with increasing plant size, and the instantaneous aboveground light compensation point increased. When estimates of root respiration were included to model whole-plant carbon gain and light compensation points, relationships with plant size were even more pronounced. Our findings show how an interplay of gas exchange, self-shading, and biomass distribution shapes ontogenetic changes in shade tolerance.
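The interplay described, photosynthesis discounted by self-shading against respiratory costs that grow with stem mass fraction, can be sketched as a per-gram carbon budget. The variables and rates below are illustrative assumptions, not measurements from the study, and leaf respiration is folded into the net assimilation rate:

```python
def aboveground_carbon_gain(leaf_mass_g, stem_mass_g,
                            net_assim_per_g, stem_resp_per_g,
                            self_shaded_frac):
    """Instantaneous aboveground carbon gain per gram of plant tissue:
    net leaf assimilation on the unshaded leaf fraction minus stem
    respiration, divided by total aboveground mass."""
    gain = (leaf_mass_g * net_assim_per_g * (1.0 - self_shaded_frac)
            - stem_mass_g * stem_resp_per_g)
    return gain / (leaf_mass_g + stem_mass_g)
```

Increasing either `self_shaded_frac` or the stem share of mass lowers the per-gram gain, which is the ontogenetic trend the study reports: larger juveniles need more light just to break even, raising the light compensation point.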

  16. 77 FR 4890 - Damage Tolerance and Fatigue Evaluation for Composite Rotorcraft Structures, and Damage Tolerance...

    Science.gov (United States)

    2012-02-01

    ...-AJ52, 2120-AJ51 Damage Tolerance and Fatigue Evaluation for Composite Rotorcraft Structures, and Damage Tolerance and Fatigue Evaluation for Metallic Structures; Correction AGENCY: Federal Aviation Administration... Tolerance and Fatigue Evaluation for Composite Rotorcraft Structures'' (76 FR 74655), published December 1...

  17. Introduction to Global Urban Climatology

    Science.gov (United States)

    Varquez, A. C. G.; Kanda, M.; Kawano, N.; Darmanto, N. S.; Dong, Y.

    2016-12-01

    Urban heat island (UHI) is a widely investigated phenomenon in the field of urban climate, characterized by the warming of urban areas relative to their surrounding rural environs. Being able to understand the mechanism behind the UHI formation of a city, and to distinguish its impact from that of global climate change, is indispensable when identifying adaptation and mitigation strategies. However, the lack of UHI studies in many cities, especially in developing countries, makes it difficult to generalize the mechanism of UHI formation. There is thus a pressing demand for studies that focus on the simultaneous analysis of UHIs and their trends throughout the world. Hence, we propose a subfield of urban climatology, called "global urban climatology" (GUC), which mainly focuses on the uniform understanding of urban climates across all cities, globally. By using globally applicable methodologies to quantify and compare the urban heat islands of cities with diverse backgrounds, including their geography, climate, socio-demography, and other factors, a universal understanding of the mechanisms underlying the formation of the phenomenon can be established. The implementation of GUC involves the use of globally acquired historical observation networks, gridded meteorological parameters from climate models, and global geographic information system datasets; the construction of a distributed urban parameter database; and the development of techniques necessary to model the urban climate. Research under GUC can be categorized into three approaches. 
The collaborative approach (1st) relies on the collection of data from micro-scale experiments conducted worldwide with the aid or development of professional social networking platforms; the analytical approach (2nd) relies on the use of global weather station datasets and their corresponding objectively analysed global outputs; and the numerical approach (3rd) relies on the global estimation of high-resolution urban-representative parameters as

  18. THE EFFECTS OF GLOBALIZATION IN NATIONAL ACCOUNTS

    Directory of Open Access Journals (Sweden)

    Clementina IVAN-UNGUREANU

    2007-06-01

    Full Text Available In the OECD Handbook on Economic Globalization, the term "globalization" is used to describe "the increasing internationalization of financial markets and of markets for goods and services. Globalization refers above all to a dynamic and multidimensional process of economic integration whereby national resources become more and more internationally mobile while national economies become increasingly interdependent." Understanding globalization requires theory as well as facts, but certainly the facts are key ingredients in any assessment of this important phenomenon. Indeed, the facts are necessary to test the theories and to quantify the importance of what the theories predict. New concepts are emerging as economists address the issues of globalization, and these need to be better defined and measured. In particular, there is a need for standard concepts and definitions in the area of globalization and its effects. Work under way internationally on SNA 93 rev 1 should help considerably in this regard. There is a pressing need for greater coordination and cooperation in this domain. This paper presents some of the issues involved in measuring indicators of globalization and in using those indicators to quantify and describe the phenomenon and to evaluate its economic impact.

  19. A Methodological Approach to Quantifying Plyometric Intensity.

    Science.gov (United States)

    Jarvis, Mark M; Graham-Smith, Phil; Comfort, Paul

    2016-09-01

    Jarvis, MM, Graham-Smith, P, and Comfort, P. A methodological approach to quantifying plyometric intensity. J Strength Cond Res 30(9): 2522-2532, 2016. In contrast to other methods of training, the quantification of plyometric exercise intensity is poorly defined. The purpose of this study was to evaluate the suitability of a range of neuromuscular and mechanical variables to describe the intensity of plyometric exercises. Seven male recreationally active subjects performed a series of 7 plyometric exercises. Neuromuscular activity was measured using surface electromyography (SEMG) at vastus lateralis (VL) and biceps femoris (BF). Surface electromyography data were divided into concentric (CON) and eccentric (ECC) phases of movement. Mechanical output was measured by ground reaction forces and processed to provide peak impact ground reaction force (PF), peak eccentric power (PEP), and impulse (IMP). Statistical analysis was conducted to assess the reliability (intraclass correlation coefficient) and sensitivity (smallest detectable difference) of all variables. Mean values of SEMG demonstrated high reliability (r ≥ 0.82), excluding ECC VL during a 40-cm drop jump (r = 0.74). PF, PEP, and IMP demonstrated high reliability (r ≥ 0.85). Statistical power for force variables was excellent (power = 1.0), and good for SEMG (power ≥ 0.86), excluding CON BF (power = 0.57). There was no significant difference (p > 0.05) in CON SEMG between exercises. Eccentric-phase SEMG only distinguished between exercises involving a landing and those that did not (percentage of maximal voluntary isometric contraction [%MVIC]: no landing, 65 ± 5; landing, 140 ± 8). Peak eccentric power, PF, and IMP all distinguished between exercises. In conclusion, CON neuromuscular activity does not appear to vary when intent is maximal, whereas ECC activity is dependent on the presence of a landing. Force characteristics provide a reliable and sensitive measure enabling precise description of intensity
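The sensitivity measure named here, the smallest detectable difference (SDD), is conventionally derived from the reliability coefficient via the standard error of measurement; the formula below is the standard one, assumed rather than quoted from the paper:

```python
import math

def smallest_detectable_difference(between_subject_sd, icc, z=1.96):
    """SDD from reliability: SEM = SD * sqrt(1 - ICC), and for a
    difference between two measurements SDD = z * sqrt(2) * SEM
    (z = 1.96 for 95% confidence). Units follow the input SD."""
    sem = between_subject_sd * math.sqrt(1.0 - icc)
    return z * math.sqrt(2.0) * sem
```

Higher reliability (ICC closer to 1) shrinks the SDD, so a variable with r ≥ 0.85 can resolve smaller between-exercise differences in intensity than one with r = 0.74.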

  20. Quantifying Permafrost Characteristics with DCR-ERT

    Science.gov (United States)

    Schnabel, W.; Trochim, E.; Munk, J.; Kanevskiy, M. Z.; Shur, Y.; Fortier, R.

    2012-12-01

Geophysical methods are efficient tools for quantifying permafrost characteristics for Arctic road design and engineering. In the Alaskan Arctic, construction and maintenance of roads requires accounting for permafrost: ground that remains below 0 degrees C for two or more years. Features such as ice content and temperature are critical for understanding current and future ground conditions for planning, design, and evaluation of engineering applications. This study focused on the proposed Foothills West Transportation Access project corridor, whose purpose is to construct a new all-season road connecting the Dalton Highway to Umiat. Four major areas were chosen that represented a range of conditions including gravel bars, alluvial plains, tussock tundra (both unburned and burned), high- and low-centered ice-wedge polygons, and an active thermokarst feature. Direct-current resistivity with galvanic contact (DCR-ERT) was applied over transects. In conjunction, complementary site data were obtained, including boreholes, active-layer depths, vegetation descriptions, and site photographs. The boreholes provided information on soil morphology, ice texture, and gravimetric moisture content. Horizontal and vertical resolutions in the DCR-ERT were varied to determine the presence or absence of ground ice; subsurface heterogeneity; and the depth to groundwater (if present). The four main DCR-ERT configurations used were: 84 electrodes with 2 m spacing; 42 electrodes with 0.5 m spacing; 42 electrodes with 2 m spacing; and 84 electrodes with 1 m spacing. In terms of identifying ground-ice characteristics, the higher-horizontal-resolution DCR-ERT transects with either 42 or 84 electrodes and 0.5 or 1 m spacing were best able to differentiate wedge ice. This evaluation is based on a combination of both borehole stratigraphy and surface characteristics. Simulated apparent resistivity values for permafrost areas varied from a low of 4582 Ω m to a high of 10034 Ω m. Previous

  1. Entropy generation method to quantify thermal comfort

    Science.gov (United States)

    Boregowda, S. C.; Tiwari, S. N.; Chaturvedi, S. K.

    2001-01-01

This paper presents a thermodynamic approach to assess the quality of human-thermal environment interaction and to quantify thermal comfort. The approach involves developing an entropy generation term by applying the second law of thermodynamics to the combined human-environment system. The entropy generation term combines both human thermal physiological responses and thermal environmental variables to provide an objective measure of thermal comfort. The original concepts and definitions form the basis for establishing the mathematical relationship between thermal comfort and entropy generation. As a result of this logical and deterministic approach, an Objective Thermal Comfort Index (OTCI) is defined and established as a function of entropy generation. To verify the entropy-based thermal comfort model, human thermal physiological responses due to changes in ambient conditions are simulated using a well-established and validated human thermal model developed at the Institute of Environmental Research of Kansas State University (KSU). The finite-element-based KSU human thermal computer model is utilized as a "Computational Environmental Chamber" to conduct a series of simulations examining human thermal responses to different environmental conditions. The simulation outputs, which include human thermal responses, and the input data, consisting of environmental conditions, are fed into the thermal comfort model. Continuous monitoring of thermal comfort in comfortable and extreme environmental conditions is demonstrated. The OTCI values obtained from the entropy-based model are validated against regression-based Predicted Mean Vote (PMV) values; the PMV values are generated by inserting into the regression equation the same air temperatures and vapor pressures used in the computer simulation. The preliminary results indicate that the OTCI and PMV values correlate well under ideal conditions.
However, an experimental study
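The second-law bookkeeping at the heart of such a model can be illustrated in a few lines. This is my own minimal simplification, not the paper's OTCI formulation: the entropy generated by a steady heat flow Q passing from the body to the environment.

```python
def entropy_generation(q_w, t_body_k, t_env_k):
    """Entropy generation rate (W/K) for a heat flow q_w (W) leaving
    the body at t_body_k (K) into surroundings at t_env_k (K)."""
    return q_w * (1.0 / t_env_k - 1.0 / t_body_k)

# ~100 W of metabolic heat rejected from core temperature to a 20 C room;
# the result is positive, as the second law requires
s_gen = entropy_generation(100.0, 310.15, 293.15)
```

A comfort index can then be defined as a decreasing function of s_gen, which is the spirit of the entropy-based OTCI described above.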

  2. Global challenges

    International Nuclear Information System (INIS)

    Blix, H.

    1990-01-01

A major challenge now facing the world is the supply of energy needed for growth and development in a manner which is not only economically viable but also environmentally acceptable and sustainable in view of the demands of and risks to future generations. The internationally most significant pollutants from energy production through fossil fuels are SO2 and NOx, which cause acid rain, and CO2, which is the most significant contributor to the greenhouse effect. Nuclear power, now providing about 17% of the world's electricity and 5% of the primary energy, is already making a notable contribution to avoiding these emissions. While the industrialized countries will need more energy and especially electricity in the future, the needs of the developing countries are naturally much larger and present a tremendous challenge to the shaping of the world's future energy supply system. The advanced countries will have to accept special responsibilities, as they can most easily use advanced technologies and they have been and remain the main contributors to the environmental problems we now face. Energy conservation and resort to new renewable energy sources, though highly desirable, appear inadequate alone to meet the challenges. The world can hardly afford to do without an increased use of nuclear power, although it is strongly contested in many countries. The objections raised against the nuclear option focus on safety, waste management and disposal problems and the risk of proliferation of nuclear weapons. These issues are not without their problems. The risk of proliferation exists but will not appreciably diminish with lesser global reliance on nuclear power. The waste issue is more of a political than a technical problem. The use of nuclear power, or any other energy source, will never be at zero risk, but the risks are constantly reduced by new techniques and practices. The IAEA sees it as one of its priority tasks to promote such techniques. (author)

  3. Religion, tolerance and national development: The Nigerian ...

    African Journals Online (AJOL)

Tolerance, as the ability to bear with one another in spite of differences in opinion, belief or knowledge, is an indispensable factor for any meaningful progress and development of any nation. To a keen observer of the daily happenings in Nigeria, religious tolerance is more than a topical issue because of its relevance ...

  4. Tolerance based algorithms for the ATSP

    NARCIS (Netherlands)

    Goldengorin, B; Sierksma, G; Turkensteen, M; Hromkovic, J; Nagl, M; Westfechtel, B

    2004-01-01

    In this paper we use arc tolerances, instead of arc costs, to improve Branch-and-Bound type algorithms for the Asymmetric Traveling Salesman Problem (ATSP). We derive new tighter lower bounds based on exact and approximate bottleneck upper tolerance values of the Assignment Problem (AP). It is shown

  5. Political Socialization, Tolerance, and Sexual Identity

    Science.gov (United States)

    Avery, Patricia G.

    2002-01-01

    Key concepts in political socialization, tolerance, groups, rights and responsibilities can be used to understand the way in which young people struggle with sexual identity issues. Educators may promote greater tolerance for homosexuality among heterosexuals by situating sexual identity issues within a broader discussion of democratic principles.…

  6. 75 FR 69353 - Isoxaben; Pesticide Tolerances

    Science.gov (United States)

    2010-11-12

    ...; and pistachio. Dow AgroSciences requested these tolerances under the Federal Food, Drug, and Cosmetic... 0.01 ppm; and nut, tree, group 14 and pistachio at 0.03 ppm. That notice referenced a summary of the... the data supporting the petition, EPA has reduced the tolerances for nut, tree, group 14 and pistachio...

  7. 76 FR 27268 - Glyphosate; Pesticide Tolerance

    Science.gov (United States)

    2011-05-11

    ... ENVIRONMENTAL PROTECTION AGENCY 40 CFR Part 180 [EPA-HQ-OPP-2010-0938; FRL-8872-6] Glyphosate... regulation increases the established tolerance for residues of glyphosate in or on corn, field, forage... tolerance for residues of the herbicide glyphosate, N-(phosphonomethyl) glycine, in or on corn, field...

  8. 7 CFR 51.346 - Tolerances.

    Science.gov (United States)

    2010-01-01

    ... Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE REGULATIONS AND STANDARDS UNDER THE AGRICULTURAL MARKETING ACT OF 1946... Standards for Grades of Apples for Processing Tolerances § 51.346 Tolerances. When a lot of apples is...

  9. 7 CFR 51.306 - Tolerances.

    Science.gov (United States)

    2010-01-01

    ... Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE REGULATIONS AND STANDARDS UNDER THE AGRICULTURAL MARKETING ACT OF 1946... Standards for Grades of Apples Tolerances § 51.306 Tolerances. In order to allow for variations incident to...

  10. 77 FR 60311 - Chlorantraniliprole; Pesticide Tolerances

    Science.gov (United States)

    2012-10-03

    ... sheep, fat to 0.5 ppm. EPA has also increased the existing tolerances in cattle, meat; goat, meat; horse..., horse and sheep, fat at 0.5 ppm, and cattle, goat, horse and sheep, meat at 0.1 ppm. Consistent with the.... Revise the tolerances for cattle, fat; cattle, meat; goat, fat; goat, meat; horse, fat; horse, meat...

  11. Chromium Tolerance and Bioremoval by Cyanobacteria Isolated ...

    African Journals Online (AJOL)

    Two cyanobacterial species Nostoc calcicola HH-12 and Chroococcus minutus HH-11 isolated from a textile mill oxidation pond were examined individually and as consortium for their chromium(VI) tolerance and bioremoval from aqueous solutions. Both species were tolerant to the metal and showed significant increase ...

  12. Effects of Political Knowledge on Political Tolerance

    Science.gov (United States)

    Hall, John Powell

    2018-01-01

Sexual orientation continues to be an explosive issue in American classrooms. Increasing students' political knowledge can reduce the volatility of this issue by increasing tolerance toward the lesbian, gay, bisexual, and transgender community. This relationship between political knowledge and political tolerance has been…

  13. Drought tolerant wheat varieties developed through mutation ...

    African Journals Online (AJOL)

In search of higher-yielding drought-tolerant wheat varieties, one of the Kenyan high-yielding varieties, 'Pasa', was irradiated with gamma rays (at 150, 200, and 250 Gy) in 1997 so as to induce variability and select for drought tolerance. Six mutants (KM10, KM14, KM15, KM18, KM20 and KM21) were selected at M4 for their ...

  14. Stress tolerant crops from nitrogen fixing trees

    Energy Technology Data Exchange (ETDEWEB)

    Becker, R.; Saunders, R.M.

    1983-01-01

    Notes are given on the nutritional quality and uses of: pods of Geoffroea decorticans, a species tolerant of saline and limed soils and saline water; seeds of Olneya tesota which nodulates readily and fixes nitrogen and photosynthesizes at low water potential; and pods of Prosopis chilensis and P. tamarugo which tolerate long periods without rain. 3 references.

  15. Intercultural Education of Tolerance and Hospitality

    Science.gov (United States)

    Dasli, Maria

    2017-01-01

    This paper aims to make a theoretical contribution to the current debate on intercultural education by focusing on the nature and limits of tolerance. Drawing on contemporary theorisations of the concept, it is suggested that while tolerance appears fundamental for confronting issues of difference, it has several caveats. The paper discusses the…

  16. 75 FR 26662 - Fluazinam; Pesticide Tolerances

    Science.gov (United States)

    2010-05-12

    ... due to systemic toxicity and not a result of frank neurotoxicity. No signs of neurotoxicity were... chromatography with electron capture detection (GC/ECD), is available to enforce the tolerance expression for...) enforcement method is also available to enforce the tolerance expression for wine grapes, which includes...

  17. Microstructure, flaw tolerance, and reliability of Ce-TZP and Y-TZP ceramics

    International Nuclear Information System (INIS)

    Readey, M.J.; McCallen, C.L.

    1995-01-01

Ce-TZP and Y-TZP ceramics were heat-treated for various times and temperatures in order to vary the microstructure. Flaw tolerance was investigated using the indentation-strength test. Reliability was quantified using conventional two-parameter Weibull statistics. Some Ce-TZP specimens were indented at slightly elevated temperatures where no transformation was observed. Results indicated that the Ce-TZP specimens were extremely flaw tolerant and showed a relatively high Weibull modulus that scaled with both R-curve behavior and flaw tolerance. Y-TZP, on the other hand, with very little if any R-curve behavior or flaw tolerance, had a low Weibull modulus. The results also show that flaw history, i.e., whether or not a transformation zone exists along the wake of the crack, has a significant influence on strength. Strength was much less dependent on initial crack size when the crack had an associated transformation zone, whereas strength was highly dependent on cracks typical of natural processing defects. It is argued that the improvement in reliability, flaw tolerance, and dependence on flaw history are all ramifications of pronounced R-curve behavior
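Two-parameter Weibull moduli of the kind discussed above are commonly estimated by linearising the Weibull cumulative failure distribution and regressing. A minimal sketch in Python (median-rank probability estimator; the paper does not specify its fitting procedure):

```python
import numpy as np

def weibull_fit(strengths):
    """Fit a two-parameter Weibull distribution to fracture strengths.
    Linearised CDF: ln(-ln(1 - F)) = m*ln(sigma) - m*ln(sigma0)."""
    s = np.sort(np.asarray(strengths, dtype=float))
    n = s.size
    f = (np.arange(1, n + 1) - 0.5) / n      # median-rank failure probabilities
    y = np.log(-np.log(1.0 - f))
    m, c = np.polyfit(np.log(s), y, 1)       # slope m = Weibull modulus
    sigma0 = np.exp(-c / m)                  # characteristic strength
    return m, sigma0
```

A high Weibull modulus (a narrow strength distribution) corresponds to the improved reliability reported here for the flaw-tolerant Ce-TZP.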

  18. Hematopoietic chimerism and transplantation tolerance: a role for regulatory T cells

    Directory of Open Access Journals (Sweden)

    Lise ePasquet

    2011-12-01

The major obstacle in transplantation medicine is rejection of donor tissues by the host's immune system. Immunosuppressive drugs can delay but not prevent loss of transplants, and their efficiency is strongly impacted by inter-individual pharmacokinetic differences. Moreover, due to the global immunosuppression induced and to the broad distribution of their targets amongst human tissues, these drugs have severe side effects. Induction of donor-specific non-responsiveness (i.e., immunological tolerance) to transplants would solve these problems and would substantially ameliorate patients' quality of life. It is widely believed that bone marrow or hematopoietic stem cell transplantation, and the resulting (mixed) hematopoietic chimerism, invariably leads to immunological tolerance to organs of the same donor. A careful analysis of the literature, reviewed here, indeed shows that chimerism consistently prolongs allograft survival. However, in the absence of additional conditioning leading to the development of active regulatory mechanisms, it does not prevent chronic rejection. A central role for active tolerance in transplantation tolerance is also supported by recent data showing that genuine immunological tolerance to organ allografts can be achieved by combining induction of hematopoietic chimerism with infusion of regulatory T lymphocytes. Therefore, conditioning regimens that lead to the establishment of hematopoietic chimerism plus active regulatory mechanisms appear required for induction of genuine tolerance to allogeneic grafts.

  19. Transcriptome alteration in a rice introgression line with enhanced alkali tolerance.

    Science.gov (United States)

    Zhang, Yunhong; Lin, Xiuyun; Ou, Xiufang; Hu, Lanjuan; Wang, Jinming; Yang, Chunwu; Wang, Shucai; Liu, Bao

    2013-07-01

    Alkali stress inhibits plant growth and development and thus limits crop productivity. To investigate the possible genetic basis of alkali tolerance in rice, we generated an introgressed rice line (K83) with significantly enhanced tolerance to alkali stress compared to its recipient parental cultivar (Jijing88). By using microarray analysis, we examined the global gene expression profiles of K83 and Jijing88, and found that more than 1200 genes were constitutively and differentially expressed in K83 in comparison to Jijing88 with 572 genes up- and 654 down-regulated. Upon alkali treatment, a total of 347 genes were found up- and 156 down-regulated in K83 compared to 591 and 187, respectively, in Jijing88. Among the up-regulated genes in both K83 and Jijing88, only 34 were constitutively up-regulated in K83, suggesting that both the constitutive differentially expressed genes in K83 and those induced by alkali treatment are most likely responsible for enhanced alkali tolerance. A gene ontology analysis based on all annotated, differentially expressed genes revealed that genes with expression alterations were enriched in pathways involved in metabolic processes, catalytic activity, and transport and transcription factor activities, suggesting that these pathways are associated with alkali stress tolerance in rice. Our results illuminated the novel genetic aspects of alkali tolerance in rice and established a repertory of potential target genes for biotechnological manipulations that can be used to generate alkali-tolerant rice cultivars. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  20. State Toleration, Religious Recognition and Equality

    DEFF Research Database (Denmark)

    Lægaard, Sune

    2013-01-01

In debates about multiculturalism, it is widely claimed that ‘toleration is not enough’ and that we need to go ‘beyond toleration’ to some form of politics of recognition in order to satisfactorily address contemporary forms of cultural diversity (e.g. the presence in Europe of Muslim minorities)... a conceptual question of whether the relation between states and minorities can be categorised in terms of recognition or toleration, but about a normative question of whether and how toleration and recognition secure equality. When toleration is inadequate, this is often because it institutionalises... and upholds specific inequalities. But politics of recognition may equally well institute inequalities, and in such cases unequal recognition may not be preferable to toleration...

  1. Historical Aspects in Tolerance Phenomenon Research

    Directory of Open Access Journals (Sweden)

    Janat A. Karmanova

    2013-01-01

The article examines the historical aspect of research on the tolerance phenomenon, particularly the study of tolerance in the ages of Antiquity, the Middle Ages, the New Times, and the Enlightenment. It is remarkable that the problem of tolerance, which emerged in Western civilization on religious grounds, laid the foundation for all the other freedoms attained in many countries. Besides, the article attaches special attention to researchers of the East, such as Abu Nasr al-Farabi and Khoja Ahmed Yasawi, and studies the historical aspect of works by the Kazakhstan thinkers A. Kunanbayev, C. Valikhanova, K.B. Zharikbayev, S.K. Kaliyev, A.N. Nysanbayev, A.I. Artemev and others. The analysis of historical research on the tolerance phenomenon brings the author to the conclusion that religious freedom was the starting point for the emergence of new areas in which tolerance is displayed. The content of this phenomenon changed according to the historical peculiarities of the societies' development.

  2. Diagnosis and fault-tolerant control

    CERN Document Server

    Blanke, Mogens; Lunze, Jan; Staroswiecki, Marcel

    2016-01-01

Fault-tolerant control aims at a gradual shutdown response in automated systems when faults occur. It satisfies the industrial demand for enhanced availability and safety, in contrast to traditional reactions to faults, which bring about sudden shutdowns and loss of availability. The book presents effective model-based analysis and design methods for fault diagnosis and fault-tolerant control. Architectural and structural models are used to analyse the propagation of a fault through the process, to test fault detectability, and to find the redundancies in the process that can be used to ensure fault tolerance. It also introduces design methods for diagnostic systems and fault-tolerant controllers, both for continuous processes described by analytical models and for discrete-event systems represented by automata. The book is suitable for engineering students, engineers in industry and researchers who wish to get an overview of the variety of approaches to process diagnosis and fault-tolerant contro...

  3. Algorithms for worst-case tolerance optimization

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans; Madsen, Kaj

    1979-01-01

New algorithms are presented for the solution of optimum tolerance assignment problems. The problems considered are defined mathematically as a worst-case problem (WCP), a fixed tolerance problem (FTP), and a variable tolerance problem (VTP). The basic optimization problem without tolerances... is denoted the zero tolerance problem (ZTP). For solution of the WCP we suggest application of interval arithmetic and also alternative methods. For solution of the FTP an algorithm is suggested which is conceptually similar to algorithms previously developed by the authors for the ZTP. Finally, the VTP... is solved by a double-iterative algorithm in which the inner iteration is performed by the FTP-algorithm. The application of the algorithm is demonstrated by means of relatively simple numerical examples. Basic properties, such as convergence properties, are displayed based on the examples.
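The interval-arithmetic approach suggested for the worst-case problem (WCP) can be sketched by representing each toleranced parameter as an interval and propagating guaranteed bounds through the design function. A toy illustration in Python (the component values and tolerances are invented for the example, and this is only the bound-propagation idea, not the paper's algorithms):

```python
class Interval:
    """Minimal closed-interval arithmetic for worst-case bound propagation."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # sum of intervals: add lower and upper bounds
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # product: take the extremes over all endpoint combinations
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# nominal resistors of 100 and 220 ohms, each with a +/-5% tolerance
r1 = Interval(95.0, 105.0)
r2 = Interval(209.0, 231.0)
series = r1 + r2   # guaranteed worst-case bounds of the series resistance
```

Every design response computed this way is guaranteed to contain the true worst case, at the cost of possible over-estimation (the well-known conservatism of interval arithmetic).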

  4. Cognitive component of Tolerance in Pedagogic Education

    Directory of Open Access Journals (Sweden)

    O. V. Akimova

    2013-01-01

The paper looks at one of the urgent educational problems: the development of tolerance by teachers and students, tolerance being viewed as openness to the acquisition of new knowledge, willingness to understand other people and cooperate with them, and therefore an opportunity for self-development. The paper outlines the ways tolerant attitudes are formed by all subjects of the educational process; the concept of person-oriented teaching is considered basic for tolerance development. To optimize specialists' training for communication at any level of the professional environment, a cognitive-activity educational model is suggested, providing ways out of complicated pedagogical situations. Concepts from cognitive psychology provide the background for the above model. The education in question promotes the intellectual level of prospective teachers and intensifies their creative potential, methodological thinking and practical experience, as well as tolerance development in the process of professional communication.

  5. Cognitive Ability, Principled Reasoning and Political Tolerance

    DEFF Research Database (Denmark)

    Hebbelstrup Rye Rasmussen, Stig; Nørgaard, Asbjørn Sonne

    Individuals are not equally politically tolerant. To explain why, individual differences in emotions and threat have received much scholarly attention in recent years. However, extant research also shows that psychological dispositions, habitual cognitive styles, ideological orientation...... and ‘principled reasoning’ influence political tolerance judgments. The extent to which cognitive ability plays a role has not been entertained even if the capacity to think abstractly, comprehend complex ideas and apply abstract ideas to concrete situations is inherent to both principled tolerance judgment...... and cognitive ability. Cognitive ability, we argue and show, adds to the etiology of political tolerance. In Danish and American samples cognitive ability strongly predicts political tolerance after taking habitual cognitive styles (as measured by personality traits), education, social ideology, and feelings...

  6. 40 CFR 180.564 - Indoxacarb; tolerances for residues.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Indoxacarb; tolerances for residues...) PESTICIDE PROGRAMS TOLERANCES AND EXEMPTIONS FOR PESTICIDE CHEMICAL RESIDUES IN FOOD Specific Tolerances § 180.564 Indoxacarb; tolerances for residues. (a) General. Tolerances are established for residues of...

  7. Quantifier spreading in child eye movements: A case of the Russian quantifier kazhdyj ‘every'

    Directory of Open Access Journals (Sweden)

    Irina A. Sekerina

    2017-07-01

Extensive cross-linguistic work has documented that children up to the age of 9-10 make errors when performing a sentence-picture verification task that pairs spoken sentences containing the universal quantifier 'every' with pictures showing entities in partial one-to-one correspondence. These errors stem from children's difficulties in restricting the domain of a universal quantifier to the appropriate noun phrase and are referred to in the literature as 'quantifier spreading' (q-spreading). We adapted the task to be performed in conjunction with eye-movement recordings using the Visual World Paradigm. Russian-speaking 5-to-6-year-old children (N = 31) listened to sentences like Kazhdyj alligator lezhit v vanne 'Every alligator is lying in a bathtub' and viewed pictures with three alligators, each in a bathtub, and two extra empty bathtubs. Non-spreader children (N = 12) were adult-like in their accuracy, whereas q-spreading children (N = 19) were only 43% correct in interpreting such sentences compared to the control sentences. Eye movements of q-spreading children revealed that more looks to the extra containers (the two empty bathtubs) correlated with higher error rates, reflecting the processing pattern of q-spreading. In contrast, more looks to the distractors in control sentences did not lead to errors in interpretation. We argue that q-spreading errors are caused by interference from the extra entities in the visual context, and our results support the processing-difficulty account of the acquisition of quantification. Interference results in cognitive overload as children have to integrate multiple sources of information, i.e., a visual context with salient extra entities and the spoken sentence in which these entities are mentioned, in real-time processing. This article is part of the special collection: Acquisition of Quantification

  8. Fisheries Exploitation by Albatross Quantified With Lipid Analysis

    Directory of Open Access Journals (Sweden)

    Melinda G. Conners

    2018-04-01

Mortality from incidental bycatch in longline fishery operations is a global threat to seabird populations, and especially so for the albatross family (Diomedeidae), in which 15 out of 22 species are threatened with extinction. Despite the risks, fisheries remain attractive to many species of seabird by providing access to high-energy foods in the form of discarded fish and offal, target fish, and baited hooks. Current policy regarding fisheries management is increasingly aimed at discard reform, exemplified by a discard ban initiated in the European Union Common Fisheries Policy in 2014. While there is global agreement on the importance of minimizing the waste inherent in bycatch and discards, there is also growing concern that we need to understand the extent to which marine animals rely on fisheries-associated resources, especially at the colony and individual levels. We used a novel adaptation of quantitative fatty acid signature analysis (QFASA) to quantify fisheries-associated prey in the diet of two threatened North Pacific albatross species. Diet was estimated with QFASA using multiple lipid classes from stomach oil collected from incubating and chick-brooding Laysan and black-footed albatrosses across three breeding seasons. Prey-specific error was estimated by comparing QFASA-estimated diets with known "simulated" diets, which informed the level of precaution appropriate when interpreting model results. Fisheries-associated prey occurred in the diet of both albatross species across both the incubation and chick-brood stages; however, neither species relied on fisheries food as a dominant food source (it made up <10% of the total pooled proportional diet in each species). While the total diet proportion was low, the incidence of fisheries-associated resources in albatross diets was highest in the 2009-2010 breeding season, when there was a strong central Pacific El Niño. Additionally, the diets of a few individuals consisted almost
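At its core, QFASA chooses the simplex-constrained mixture of prey fatty-acid signatures that best reproduces the predator's signature. A highly simplified sketch (plain least squares solved by projected gradient descent; the real method also applies calibration coefficients and a different distance measure, and the prey signatures below are invented):

```python
import numpy as np

def estimate_diet(predator_sig, prey_sigs, iters=5000, lr=0.01):
    """Estimate diet proportions pi (nonnegative, summing to 1) that
    minimise ||predator_sig - prey_sigs.T @ pi||^2."""
    prey_sigs = np.asarray(prey_sigs, dtype=float)
    k = prey_sigs.shape[0]
    pi = np.full(k, 1.0 / k)                     # start from a uniform diet
    for _ in range(iters):
        grad = 2.0 * prey_sigs @ (prey_sigs.T @ pi - predator_sig)
        pi = np.clip(pi - lr * grad, 0.0, None)  # gradient step, stay >= 0
        pi /= pi.sum()                           # renormalise onto the simplex
    return pi

# three hypothetical prey fatty-acid signatures (rows sum to 1)
prey = np.array([[0.8, 0.1, 0.1],
                 [0.1, 0.8, 0.1],
                 [0.1, 0.1, 0.8]])
predator = 0.7 * prey[0] + 0.3 * prey[1]         # true diet: 70/30/0
```

Running `estimate_diet(predator, prey)` recovers proportions close to the true 70/30/0 mixture; the "prey-specific error" mentioned above corresponds to testing such recovery on simulated diets.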

  9. Understanding and quantifying foliar temperature acclimation for Earth System Models

    Science.gov (United States)

    Smith, N. G.; Dukes, J.

    2015-12-01

Photosynthesis and respiration on land are the two largest carbon fluxes between the atmosphere and Earth's surface. The parameterization of these processes represents a major source of uncertainty in the terrestrial component of the Earth System Models used to project future climate change. Research has shown that much of this uncertainty is due to the parameterization of the temperature responses of leaf photosynthesis and autotrophic respiration, which are typically based on short-term empirical responses. Here, we show that including longer-term responses to temperature, such as temperature acclimation, can help to reduce this uncertainty and improve model performance, leading to drastic changes in future land-atmosphere carbon feedbacks across multiple models. However, these acclimation formulations have many flaws, including an underrepresentation of many important global flora. In addition, these parameterizations were derived from multiple studies that employed differing methodologies. We therefore used a consistent methodology to quantify the short- and long-term temperature responses of maximum Rubisco carboxylation (Vcmax), the maximum rate of ribulose-1,5-bisphosphate regeneration (Jmax), and dark respiration (Rd) in multiple species representing each of the plant functional types used in global-scale land surface models. Short-term temperature responses of each process were measured in individuals acclimated for 7 days at one of 5 temperatures (15-35°C). The comparison of short-term curves from plants acclimated to different temperatures was used to evaluate long-term responses. Our analyses indicated that the instantaneous response of each parameter was highly sensitive to the temperature at which the plants were acclimated. However, we found that this sensitivity was larger in species whose leaves typically experience a greater range of temperatures over the course of their lifespan.
These data indicate that models using previous acclimation formulations are likely incorrectly
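The instantaneous temperature responses measured in studies like this are commonly summarised with an Arrhenius function. A minimal sketch (the base rate and activation energy below are illustrative placeholders, not values from the study):

```python
import math

R = 8.314  # universal gas constant, J mol^-1 K^-1

def arrhenius(k25, ea, t_leaf_c):
    """Instantaneous rate at leaf temperature t_leaf_c (deg C), given
    the rate k25 at 25 C and an activation energy ea (J mol^-1)."""
    tk = t_leaf_c + 273.15
    return k25 * math.exp(ea * (tk - 298.15) / (298.15 * R * tk))

# illustrative Vcmax: 50 umol m^-2 s^-1 at 25 C, Ea = 65 kJ mol^-1
vcmax_35 = arrhenius(50.0, 65000.0, 35.0)  # exceeds the 25 C value
```

In this framing, acclimation amounts to letting k25 and/or ea vary with growth temperature rather than holding them fixed, which is exactly the long-term adjustment the measurements above are designed to constrain.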

  10. Quantifying uncertainties in wind energy assessment

    Science.gov (United States)

    Patlakas, Platon; Galanis, George; Kallos, George

    2015-04-01

The constant rise of wind energy production, and its subsequent penetration of global energy markets during the last decades, has brought new site-selection problems. Such problems arise due to the variability and the uncertainty of wind speed. The study of the lower and upper tails of the wind speed distribution can support the quantification of these uncertainties. Such approaches, focused on extreme wind conditions or on periods below the energy-production threshold, are necessary for better management of operations. Towards this direction, different methodologies are presented for the credible evaluation of potential non-frequent/extreme values of these environmental conditions. The approaches used take into consideration the structural design of the wind turbines according to their lifespan, the turbine failures, the time needed for repairs, as well as the energy production distribution. In this work, a multi-parametric approach for studying extreme wind speed values is discussed based on tools of Extreme Value Theory. In particular, the study is focused on extreme wind speed return periods and the persistence of no energy production, based on a 10-year hindcast dataset from a weather modeling system. More specifically, two methods (Annual Maxima and Peaks Over Threshold) were used for the estimation of extreme wind speeds and their recurrence intervals. Additionally, two different methodologies (intensity given duration and duration given intensity, both based on the Annual Maxima method) were applied to calculate the duration of extreme events, combined with their intensity as well as the event frequency. The obtained results show that the proposed approaches converge, at least on the main findings, for each case. It is also remarkable that, despite the moderate wind speed climate of the area, several consecutive days of no energy production are observed.
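The Annual Maxima method mentioned above fits an extreme-value distribution to yearly wind-speed maxima and reads return levels off its quantile function. A minimal sketch using a method-of-moments Gumbel (Extreme Value Type I) fit; the study's exact estimator is not specified, and the 30-year record below is synthetic:

```python
import numpy as np

EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

def fit_gumbel(annual_maxima):
    """Method-of-moments fit of the Gumbel distribution."""
    x = np.asarray(annual_maxima, dtype=float)
    scale = np.sqrt(6.0) * x.std(ddof=1) / np.pi
    loc = x.mean() - EULER_GAMMA * scale
    return loc, scale

def return_level(t_years, loc, scale):
    """Wind speed exceeded on average once every t_years."""
    return loc - scale * np.log(-np.log(1.0 - 1.0 / t_years))

# synthetic 30-year record of annual maximum wind speed (m/s)
rng = np.random.default_rng(42)
maxima = rng.gumbel(25.0, 3.0, size=30)
loc, scale = fit_gumbel(maxima)
```

The 50-year return level is, by construction, higher than the 10-year one; the Peaks Over Threshold alternative instead fits a generalized Pareto distribution to all exceedances above a chosen threshold.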

  11. Variation in cassava germplasm for tolerance to post-harvest physiological deterioration.

    Science.gov (United States)

    Venturini, M T; Santos, L R; Vildoso, C I A; Santos, V S; Oliveira, E J

    2016-05-06

    Tolerant varieties can effectively control post-harvest physiological deterioration (PPD) of cassava, although knowledge on the genetic variability and inheritance of this trait is needed. The objective of this study was to estimate genetic parameters and identify sources of tolerance to PPD and their stability in cassava accessions. Roots from 418 cassava accessions, grown in four independent experiments, were evaluated for PPD tolerance 0, 2, 5, and 10 days post-harvest. Data were transformed into area under the PPD-progress curve (AUP-PPD) to quantify tolerance. Genetic parameters, stability (Si), adaptability (Ai), and the joint analysis of stability and adaptability (Zi) were obtained via residual maximum likelihood (REML) and best linear unbiased prediction (BLUP) methods. Variance in the genotype (G) x environment (E) interaction and genotypic variance were important for PPD tolerance. Individual broad-sense heritability (hg(2)= 0.38 ± 0.04) and average heritability in accessions (hmg(2)= 0.52) showed high genetic control of PPD tolerance. Genotypic correlation of AUP-PPD in different experiments was of medium magnitude (ȓgA = 0.42), indicating significant G x E interaction. The predicted genotypic values of G x E free of interaction (û + ĝi) showed high variation. Of the 30 accessions with high Zi, 19 were common to û + ĝi, Si, and Ai parameters. The genetic gain with selection of these 19 cassava accessions was -55.94, -466.86, -397.72, and -444.03% for û + ĝi, Si, Ai, and Zi, respectively, compared with the overall mean for each parameter. These results demonstrate the variability and potential of cassava germplasm to introduce PPD tolerance in commercial varieties.
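    The AUP-PPD transformation is a standard area-under-the-progress-curve calculation over the four evaluation days. A minimal sketch, assuming a hypothetical percent-deterioration score (the paper's exact PPD scoring scale is not reproduced here):

    ```python
    def aupc(days, scores):
        """Area under the progress curve by the trapezoidal rule."""
        return sum((scores[i] + scores[i + 1]) / 2 * (days[i + 1] - days[i])
                   for i in range(len(days) - 1))

    days = [0, 2, 5, 10]        # evaluation times (days post-harvest)
    tolerant  = [0, 5, 10, 20]  # hypothetical % root deterioration scores
    sensitive = [0, 20, 45, 80]

    aup_tol = aupc(days, tolerant)
    aup_sen = aupc(days, sensitive)
    ```

    A lower area means slower deterioration, which is why the selection gains reported above are negative percentages relative to the overall mean.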

  12. SU-D-204-02: BED Consistent Extrapolation of Mean Dose Tolerances

    Energy Technology Data Exchange (ETDEWEB)

    Perko, Z; Bortfeld, T; Hong, T; Wolfgang, J; Unkelbach, J [Massachusetts General Hospital, Boston, MA (United States)

    2016-06-15

    Purpose: The safe use of radiotherapy requires the knowledge of tolerable organ doses. For experimental fractionation schemes (e.g. hypofractionation) these are typically extrapolated from traditional fractionation schedules using the Biologically Effective Dose (BED) model. This work demonstrates that using the mean dose in the standard BED equation may overestimate tolerances, potentially leading to unsafe treatments. Instead, extrapolation of mean dose tolerances should take the spatial dose distribution into account. Methods: A formula has been derived to extrapolate mean physical dose constraints such that they are mean BED equivalent. This formula constitutes a modified BED equation where the influence of the spatial dose distribution is summarized in a single parameter, the dose shape factor. To quantify effects we analyzed 14 liver cancer patients previously treated with proton therapy in 5 or 15 fractions, for whom photon IMRT plans were also available. Results: Our work has two main implications. First, in typical clinical plans the dose distribution can have significant effects. When mean dose tolerances are extrapolated from standard fractionation towards hypofractionation they can be overestimated by 10–15%. Second, the shape difference between photon and proton dose distributions can cause 30–40% differences in mean physical dose for plans having the same mean BED. The combined effect when extrapolating proton doses to mean BED equivalent photon doses in traditional 35 fraction regimens resulted in up to 7–8 Gy higher doses than when applying the standard BED formula. This can potentially lead to unsafe treatments (in 1 of the 14 analyzed plans the liver mean dose was above its 32 Gy tolerance). Conclusion: The shape effect should be accounted for to avoid unsafe overestimation of mean dose tolerances, particularly when estimating constraints for hypofractionated regimens. In addition, tolerances established for a given treatment modality cannot be directly transferred to another modality without accounting for the difference in spatial dose distribution.
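    For context, the standard BED extrapolation that the abstract says is unsafe when applied naively to mean doses works as follows. This sketch implements only the textbook equation BED = D(1 + d/(α/β)) and an iso-BED dose conversion between fractionation schemes; the dose values and α/β here are illustrative assumptions, and the paper's shape-factor correction is not reproduced:

    ```python
    import math

    def bed(D, n, ab):
        """Biologically Effective Dose for total dose D given in n fractions."""
        d = D / n  # dose per fraction
        return D * (1 + d / ab)

    def isoeffective_dose(D1, n1, n2, ab):
        """Total dose in n2 fractions with the same BED as D1 in n1 fractions."""
        target = bed(D1, n1, ab)
        # Solve n2 * d * (1 + d/ab) = target for the dose per fraction d
        d = ab / 2 * (math.sqrt(1 + 4 * target / (n2 * ab)) - 1)
        return n2 * d

    # Hypothetical example: a 32 Gy mean-dose limit delivered in 15 fractions,
    # re-expressed for a 5-fraction schedule (alpha/beta = 4 Gy assumed)
    D5 = isoeffective_dose(32.0, 15, 5, 4.0)
    ```

    The paper's point is that feeding the *mean* dose into this equation ignores where in the organ the dose is deposited; their dose shape factor corrects for that.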

  13. Quantifying Economic Value of Coastal Ecosystem Services: A Review

    Directory of Open Access Journals (Sweden)

    Seyedabdolhossein Mehvar

    2018-01-01

    Full Text Available The complexity of quantifying ecosystem services in monetary terms has long been a challenging issue for economists and ecologists. Many case-specific valuation studies have been carried out in various parts of the world. Yet, a coherent review on the valuation of coastal ecosystem services (CES, which systematically describes fundamental concepts, analyzes reported applications, and addresses the issue of climate change (CC impacts on the monetary value of CES is still lacking. Here, we take a step towards addressing this knowledge gap by pursuing a coherent review that aims to provide policy makers and researchers in multidisciplinary teams with a summary of the state-of-the-art and a guideline on the process of economic valuation of CES and potential changes in these values due to CC impacts. The article highlights the main concepts of CES valuation studies and offers a systematic analysis of the best practices by analyzing two global-scale and 30 selected local and regional case studies, in which different CES have been valued. Our analysis shows that coral reefs and mangroves are among the most frequently valued ecosystems, while sea-grass beds are the least considered ones. Currently, tourism and recreation services as well as storm protection are two of the most considered services, representing higher estimated value than other CES. In terms of the valuation techniques used, avoided damage, replacement and substitute cost methods as well as stated preference methods are among the most commonly used. Following the above analysis, we propose a methodological framework that provides step-wise guidance and better insight into the linkages between climate change impacts and the monetary value of CES. This highlights two main types of CC impacts on CES: one being the climate regulation services of coastal ecosystems, and the other being the monetary value of services, which is subject to substantial uncertainty. Finally, a

  14. Quantifying the conservation gains from shared access to linear infrastructure.

    Science.gov (United States)

    Runge, Claire A; Tulloch, Ayesha I T; Gordon, Ascelin; Rhodes, Jonathan R

    2017-12-01

    The proliferation of linear infrastructure such as roads and railways is a major global driver of cumulative biodiversity loss. One strategy for reducing habitat loss associated with development is to encourage linear infrastructure providers and users to share infrastructure networks. We quantified the reductions in biodiversity impact and capital costs under linear infrastructure sharing of a range of potential mine to port transportation links for 47 mine locations operated by 28 separate companies in the Upper Spencer Gulf Region of South Australia. We mapped transport links based on least-cost pathways for different levels of linear-infrastructure sharing and used expert-elicited impacts of linear infrastructure to estimate the consequences for biodiversity. Capital costs were calculated based on estimates of construction costs, compensation payments, and transaction costs. We evaluated proposed mine-port links by comparing biodiversity impacts and capital costs across 3 scenarios: an independent scenario, where no infrastructure is shared; a restricted-access scenario, where the largest mining companies share infrastructure but exclude smaller mining companies from sharing; and a shared scenario where all mining companies share linear infrastructure. Fully shared development of linear infrastructure reduced overall biodiversity impacts by 76% and reduced capital costs by 64% compared with the independent scenario. However, there was considerable variation among companies. Our restricted-access scenario showed only modest biodiversity benefits relative to the independent scenario, indicating that reductions are likely to be limited if the dominant mining companies restrict access to infrastructure, which often occurs without policies that promote sharing of infrastructure. Our research helps illuminate the circumstances under which infrastructure sharing can minimize the biodiversity impacts of development. © 2017 The Authors. Conservation Biology published
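    The least-cost-pathway mapping mentioned above is, at its core, a shortest-path search over a cost raster. A minimal sketch using Dijkstra's algorithm on a toy 4-connected grid; the cost values are invented and the study's actual GIS workflow, cost surfaces, and expert-elicited impact weights are not reproduced:

    ```python
    import heapq

    def least_cost_path(grid, start, goal):
        """Dijkstra over a 2D cost raster (4-connected cells); returns the
        minimum cumulative cost of traversing from start to goal."""
        rows, cols = len(grid), len(grid[0])
        dist = {start: grid[start[0]][start[1]]}
        pq = [(dist[start], start)]
        while pq:
            cost, (r, c) = heapq.heappop(pq)
            if (r, c) == goal:
                return cost
            if cost > dist.get((r, c), float("inf")):
                continue  # stale queue entry
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    ncost = cost + grid[nr][nc]
                    if ncost < dist.get((nr, nc), float("inf")):
                        dist[(nr, nc)] = ncost
                        heapq.heappush(pq, (ncost, (nr, nc)))
        return float("inf")

    # Hypothetical cost raster: higher cells = higher impact/construction cost
    grid = [[1, 1, 5],
            [5, 1, 5],
            [5, 1, 1]]
    cost = least_cost_path(grid, (0, 0), (2, 2))
    ```

    Infrastructure sharing changes the problem by letting several mine-port pairs reuse one corridor, so the per-company cost and the total cleared footprint both fall.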

  15. Quantifying food waste in Hawaii's food supply chain.

    Science.gov (United States)

    Loke, Matthew K; Leung, PingSun

    2015-12-01

    Food waste highlights a considerable loss of resources invested in the food supply chain. While it receives a lot of attention in the global context, the assessment of food waste is deficient at the sub-national level, owing primarily to an absence of quality data. This article serves to explore that gap and aims to quantify the edible weight, economic value, and calorie equivalent of food waste in Hawaii. The estimates are based on available food supply data for Hawaii and the US Department of Agriculture's (USDA's) loss-adjusted food availability data for defined food groups at three stages of the food supply chain. At its highest aggregated level, we estimate Hawaii's food waste generation at 237,122 t or 26% of available food supply in 2010. This is equivalent to food waste of 161.5 kg per person, per annum. Additionally, this food waste is valued at US$1.025 billion annually or the equivalent of 502.6 billion calories. It is further evident that the occurrence of food waste by all three measures is highest at the consumer stage, followed by the distribution and retail stage, and is lowest at the post-harvest and packing stage. The findings suggest that any meaningful intervention to reduce food waste in Hawaii should target the consumer, and distribution and retail stages of the food supply chain. Interventions at the consumer stage should focus on the two protein groups, as well as fresh fruits and fresh vegetables. © The Author(s) 2015.
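    The reported totals imply some simple derived rates. A quick back-of-the-envelope sketch from the abstract's own numbers (the rounded expectations below are derived here, not figures stated in the paper):

    ```python
    # Totals reported for Hawaii, 2010
    waste_t = 237_122      # food waste, metric tonnes
    value_usd = 1.025e9    # economic value, US$
    calories = 502.6e9     # calorie equivalent

    waste_kg = waste_t * 1000
    usd_per_kg = value_usd / waste_kg   # ~US$4.3 of embedded value per kg wasted
    kcal_per_kg = calories / waste_kg   # ~2,100 calories per kg wasted
    ```

    Dividing the total by the reported 161.5 kg per person also recovers an implied population of roughly 1.47 million, consistent with Hawaii's de facto population in 2010.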

  16. Quantifying movement demands of AFL football using GPS tracking.

    Science.gov (United States)

    Wisbey, Ben; Montgomery, Paul G; Pyne, David B; Rattray, Ben

    2010-09-01

    Global positioning system (GPS) monitoring of movement patterns is widespread in elite football including the Australian Football League (AFL). However documented analysis of this activity is lacking. We quantified the movement patterns of AFL football and differences between nomadic (midfield), forward and defender playing positions, and determined whether the physical demands have increased over a four season period. Selected premiership games were monitored during the 2005 (n=80 game files), 2006 (n=244), 2007 (n=632) and 2008 (n=793) AFL seasons. Players were fitted with a shoulder harness containing a GPS unit. GPS data were downloaded after games and the following measures extracted: total distance (km), time in various speed zones, maximum speed, number of surges, accelerations, longest continuous efforts and a derived exertion index representing playing intensity. In 2008 nomadic players covered per game 3.4% more total distance (km), had 4.8% less playing time (min), a 17% higher exertion index (per min), and 23% more time running >18 km/h than forwards and defenders (all p<0.05). Physical demands were substantially higher in the 2008 season compared with 2005: an 8.4% increase in mean speed, a 14% increase in intensity (exertion index) and a 9.0% decrease in playing time (all p<0.05). Nomadic players in AFL work substantially harder than forwards and defenders in covering more ground and at higher running intensities. Increases in the physical demands of AFL football were evident between 2005 and 2008. The increasing speed of the game has implications for game authorities, players and coaching staff.
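    Time-in-speed-zone measures like the ">18 km/h" figure above are computed directly from the GPS speed trace. A minimal sketch with an invented 1 Hz trace; note the study's exertion index is a derived composite whose exact weighting is not published in the abstract, so the rate proxy here is only illustrative:

    ```python
    def time_above(speeds_kmh, sample_s, threshold=18.0):
        """Seconds spent above a running-speed threshold, from GPS samples
        taken every sample_s seconds."""
        return sum(sample_s for v in speeds_kmh if v > threshold)

    def distance_rate(distance_km, minutes):
        """Simple per-minute distance rate; a stand-in for the study's
        proprietary exertion index (assumption, not the actual formula)."""
        return distance_km / minutes

    # Hypothetical 1 Hz GPS speed trace (km/h) over 8 seconds
    trace = [6, 12, 19, 22, 17, 25, 9, 20]
    fast_s = time_above(trace, 1.0)  # seconds above 18 km/h
    ```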

  17. How Will Copper Contamination Constrain Future Global Steel Recycling?

    OpenAIRE

    Daehn, Katrin; Cabrera Serrenho, Andre; Allwood, Julian Mark

    2017-01-01

    Copper in steel causes metallurgical problems, but is pervasive in end-of-life scrap and cannot currently be removed commercially once in the melt. Contamination can be managed to an extent by globally trading scrap for use in tolerant applications and dilution with primary iron sources. However, the viability of long-term strategies can only be evaluated with a complete characterization of copper in the global steel system and this is presented in this paper. The copper concentration of flow...

  18. On the contrast between Germanic and Romance negated quantifiers

    OpenAIRE

    Robert Cirillo

    2009-01-01

    Universal quantifiers can be stranded in the manner described by Sportiche (1988), Giusti (1990) and Shlonsky (1991) in both the Romance and Germanic languages, but a negated universal quantifier can only be stranded in the Germanic languages. The goal of this paper is to show that this contrast between the Romance and the Germanic languages can be explained if one adapts the theory of sentential negation in Zeijlstra (2004) to constituent (quantifier) negation. According to Zeijlstra’s theor...

  19. Tolerance study for the components of the probe-type and hook-type Higher Order Mode couplers for the HL-LHC 800 MHz harmonic system

    CERN Document Server

    Blanco, Esteban

    2016-01-01

    A superconducting 800 MHz second harmonic RF system is one of the considered options as a Landau damping mechanism for HiLumi LHC. The Higher Order Mode (HOM) coupler designs require tight manufacturing tolerances in order to operate at the design specifications. The project consists of defining the mechanical tolerances for the different components of both the probe-type and hook-type HOM coupler. With the use of electromagnetic field simulation software it is possible to identify the critical components of the HOM coupler and to quantify their respective tolerances. The obtained results are discussed in this paper.

  20. Fraud: zero tolerance at CERN

    CERN Multimedia

    2014-01-01

    In this week’s Bulletin, you’ll read that fraudulent activities were uncovered last year by our Internal Audit Service. CERN has a very clearly defined policy in such cases: we base our efforts on prevention through education, we have a policy of protecting those reporting fraud from recrimination, and we have a zero-tolerance policy should fraud be uncovered.   I don’t intend to enter into the details of what occurred, but I’d like to remind you that fraud is a very grave business, and something we take extremely seriously. What do we mean by fraud at CERN? Operational Circular No. 10 on “Principles and procedures governing the investigation of fraud” defines fraud in terms of any deception intended to benefit the perpetrator, or a third party, that results in a loss to the Organization. This loss can be to funds, property or reputation. Thankfully, fraud at CERN is a rare occurrence, but we should never be complacent. ...