WorldWideScience

Sample records for extreme scale era

  1. Extreme Scale Visual Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL]; Potok, Thomas E [ORNL]; Pullum, Laura L [ORNL]; Ramanathan, Arvind [ORNL]; Shipman, Galen M [ORNL]; Thornton, Peter E [ORNL]

    2013-01-01

    Given the scale and complexity of today's data, visual analytics is rapidly becoming a necessity rather than an option for comprehensive exploratory analysis. In this paper, we provide an overview of three applications of visual analytics for addressing the challenges of analyzing climate, text streams, and biosurveillance data. These systems feature varying levels of interaction and high performance computing technology integration to permit exploratory analysis of large and complex data of global significance.

  2. Gravo-Aeroelastic Scaling for Extreme-Scale Wind Turbines

    Energy Technology Data Exchange (ETDEWEB)

    Fingersh, Lee J [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Loth, Eric [University of Virginia]; Kaminski, Meghan [University of Virginia]; Qin, Chao [University of Virginia]; Griffith, D. Todd [Sandia National Laboratories]

    2017-06-09

    A scaling methodology is described in the present paper for extreme-scale wind turbines (rated at 10 MW or more) that allows their sub-scale turbines to capture the key blade dynamics and aeroelastic deflections. For extreme-scale turbines, such deflections and dynamics can be substantial and are primarily driven by centrifugal, thrust, and gravity forces as well as the net torque. Each of these is in turn a function of various wind conditions, including turbulence levels that cause shear, veer, and gust loads. The 13.2 MW rated SNL100-03 rotor design, having a blade length of 100 meters, is herein scaled to the CART3 wind turbine at NREL using 25% geometric scaling, with blade mass and wind speed scaled by gravo-aeroelastic constraints. In order to mimic the ultralight structure of the advanced-concept extreme-scale design, the scaling results indicate that the gravo-aeroelastically scaled blades for the CART3 would be three times lighter and 25% longer than the current CART3 blades. A benefit of this scaling approach is that the scaled wind speeds needed for testing are reduced (in this case by a factor of two), allowing testing under extreme gust conditions to be much more easily achieved. Most importantly, this scaling approach can investigate extreme-scale concepts, including dynamic behaviors and aeroelastic deflections (including flutter), at an extremely small fraction of the full-scale cost.
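
    The factor-of-two reduction in test wind speed and the 25% geometric scale quoted above are mutually consistent under Froude-number (gravity-to-inertia) matching; the following is a minimal sketch of that arithmetic, assuming simple Froude-type gravo-aeroelastic similarity rather than the authors' full derivation.

        \mathrm{Fr} = \frac{V^{2}}{g L} = \text{const}
        \quad\Longrightarrow\quad
        \frac{V_{\mathrm{sub}}}{V_{\mathrm{full}}} = \sqrt{\frac{L_{\mathrm{sub}}}{L_{\mathrm{full}}}} = \sqrt{0.25} = 0.5

    Holding the Froude number constant at a 25% length scale therefore halves the required test wind speeds, matching the factor of two stated in the abstract.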

  3. El Nino, from 1870 to 2014, and other Atmospheric Circulation Forcing by Extreme Apparitions of the Eight Annual, Continental Scale, Aerosol Plumes in the Satellite Era which Point to a Possible Cause for the Current Californian Drought

    Science.gov (United States)

    Potts, K. A.

    2015-12-01

    Eight continental scale aerosol plumes exist each year as the enclosed image shows. Apparitions of seven plumes only exist for a few months in the same season each year whilst the East Asian Plume is visible all year. The aerosol optical depth (AOD) of all the plumes varies enormously interannually with two studies showing the surface radiative forcing of the South East Asian Plume (SEAP) as -150W/m2 and -286W/m2/AOD. I show that the SEAP, created by volcanic aerosols (natural) and biomass burning and gas flares in the oil industry (anthropogenic), is the sole cause of all El Nino events, the greatest interannual perturbation of the atmospheric circulation system. The SEAP creates an El Nino by absorbing solar radiation at the top of the plume which heats the upper atmosphere and cools the surface. This creates a temperature inversion compared to periods without the plume and reduces convection. With reduced convection in SE Asia, the Maritime Continent, the Trade Winds blowing across the Pacific are forced to relax as their exit into the Hadley and Walker Cells is constrained and the reduced Trade Wind speed causes the Sea Surface Temperature (SST) to rise in the central tropical Pacific Ocean as there is a strong negative correlation between wind speed and SST. The warmer SST in the central Pacific creates convection in the region which further reduces the Trade Wind speed and causes the Walker Cell to reverse - a classic El Nino. Having established the ability of such extreme aerosol plumes to create El Nino events I will then show how the South American, West African, Middle East and SEAP plumes create drought in the Amazon, Spain, Darfur and Australia as well as causing the extremely warm autumn and winter in Europe in 2006-07. All these effects are created by the plumes reducing convection in the region of the plume which forces the regional Hadley Cells into anomalous positions thereby creating persistent high pressure cells in the mid latitudes. This

  4. Benchmark Generation and Simulation at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Lagadapati, Mahesh [North Carolina State University (NCSU), Raleigh]; Mueller, Frank [North Carolina State University (NCSU), Raleigh]; Engelmann, Christian [ORNL]

    2016-01-01

    The path to extreme scale high-performance computing (HPC) poses several challenges related to power, performance, resilience, productivity, programmability, data movement, and data management. Investigating the performance of parallel applications at scale on future architectures and the performance impact of different architectural choices is an important component of HPC hardware/software co-design. Simulations using models of future HPC systems and communication traces from applications running on existing HPC systems can offer insight into the performance of future architectures. This work targets technology developed for scalable application tracing of communication events. It focuses on extreme-scale simulation of HPC applications and their communication behavior via lightweight parallel discrete event simulation for performance estimation and evaluation. Instead of simply replaying a trace within a simulator, this work promotes the generation of a benchmark from traces. This benchmark is subsequently exposed to simulation using models to reflect the performance characteristics of future-generation HPC systems. This technique provides a number of benefits, such as eliminating data-intensive trace replay and enabling simulations at different scales. The presented work features novel software co-design aspects, combining the ScalaTrace tool to generate scalable trace files, the ScalaBenchGen tool to generate the benchmark, and the xSim tool to assess the benchmark characteristics within a simulator.

  5. Extreme-scale Algorithms and Solver Resilience

    Energy Technology Data Exchange (ETDEWEB)

    Dongarra, Jack [Univ. of Tennessee, Knoxville, TN (United States)]

    2016-12-10

    A widening gap exists between the peak performance of high-performance computers and the performance achieved by complex applications running on these platforms. Over the next decade, extreme-scale systems will present major new challenges to algorithm development that could amplify this mismatch in such a way that it prevents the productive use of future DOE Leadership computers, due to the following: extreme levels of parallelism due to multicore processors; an increase in system fault rates, requiring algorithms to be resilient beyond just checkpoint/restart; complex memory hierarchies and costly data movement in both energy and performance; heterogeneous system architectures (mixing CPUs, GPUs, etc.); and conflicting goals of performance, resilience, and power requirements.

  6. Cosmological neutrino simulations at extreme scale

    Science.gov (United States)

    Emberson, J. D.; Yu, Hao-Ran; Inman, Derek; Zhang, Tong-Jie; Pen, Ue-Li; Harnois-Déraps, Joachim; Yuan, Shuo; Teng, Huan-Yu; Zhu, Hong-Ming; Chen, Xuelei; Xing, Zhi-Zhong

    2017-08-01

    Constraining neutrino mass remains an elusive challenge in modern physics. Precision measurements are expected from several upcoming cosmological probes of large-scale structure. Achieving this goal relies on an equal level of precision from theoretical predictions of neutrino clustering. Numerical simulations of the non-linear evolution of cold dark matter and neutrinos play a pivotal role in this process. We incorporate neutrinos into the cosmological N-body code CUBEP3M and discuss the challenges associated with pushing to the extreme scales demanded by the neutrino problem. We highlight code optimizations made to exploit modern high performance computing architectures and present a novel method of data compression that reduces the phase-space particle footprint from 24 bytes in single precision to roughly 9 bytes. We scale the neutrino problem to the Tianhe-2 supercomputer and provide details of our production run, named TianNu, which uses 86% of the machine (13 824 compute nodes). With a total of 2.97 trillion particles, TianNu is currently the world’s largest cosmological N-body simulation and improves upon previous neutrino simulations by two orders of magnitude in scale. We finish with a discussion of the unanticipated computational challenges that were encountered during the TianNu runtime.
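
    The roughly 9-byte phase-space footprint mentioned above is consistent with storing particle positions as short integer offsets inside a coarse mesh cell and velocities as coarsely quantized integers. The Python sketch below illustrates that general idea; the field widths, cell size and velocity clip are assumptions for illustration, not the CUBEP3M/TianNu layout.

        import numpy as np

        # Single precision: 3 position + 3 velocity floats = 24 bytes per particle.
        # Assumed compact layout: 3 x uint16 position offsets within the particle's
        # coarse mesh cell (6 bytes) + 3 x int8 quantized velocities (3 bytes) = 9 bytes.
        CELL = 1.0      # coarse-cell size in simulation units (assumed)
        VMAX = 3000.0   # velocity clip for 8-bit quantization (assumed)

        def compress(pos, vel):
            cell = np.floor(pos / CELL).astype(np.int64)   # stored implicitly per cell
            offset = np.round((pos / CELL - cell) * 65535).astype(np.uint16)
            qvel = np.clip(np.round(vel / VMAX * 127), -127, 127).astype(np.int8)
            return cell, offset, qvel

        def decompress(cell, offset, qvel):
            pos = (cell + offset / 65535.0) * CELL
            vel = qvel.astype(np.float32) / 127.0 * VMAX
            return pos, vel

        pos = np.array([12.34, 56.78, 90.12], dtype=np.float32)
        vel = np.array([-250.0, 410.0, 30.0], dtype=np.float32)
        print(decompress(*compress(pos, vel)))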

  7. Extreme Associated Functions: Optimally Linking Local Extremes to Large-scale Atmospheric Circulation Structures

    CERN Document Server

    Panja, Debabrata

    2007-01-01

    We present a new statistical method to optimally link local weather extremes to large-scale atmospheric circulation structures. The method is illustrated using July-August daily mean temperature at 2m height (T2m) time-series over the Netherlands and 500 hPa geopotential height (Z500) time-series over the Euroatlantic region of the ECMWF reanalysis dataset (ERA40). The method identifies patterns in the Z500 time-series that optimally describe, in a precise mathematical sense, the relationship with local warm extremes in the Netherlands. Two patterns are identified; the most important one corresponds to a blocking high pressure system leading to subsidence and calm, dry and sunny conditions over the Netherlands. The second one corresponds to a rare, easterly flow regime bringing warm, dry air into the region. The patterns are robust; they are also identified in shorter subsamples of the total dataset. The method is generally applicable and might prove useful in evaluating the performance of climate models in s...

  8. Cosmological neutrino simulations at extreme scale

    CERN Document Server

    Emberson, J D; Inman, Derek; Zhang, Tong-Jie; Pen, Ue-Li; Harnois-Deraps, Joachim; Yuan, Shuo; Teng, Huan-Yu; Zhu, Hong-Ming; Chen, Xuelei; Xing, Zhi-Zhong

    2016-01-01

    Constraining neutrino mass remains an elusive challenge in modern physics. Precision measurements are expected from several upcoming cosmological probes of large-scale structure. Achieving this goal relies on an equal level of precision from theoretical predictions of neutrino clustering. Numerical simulations of the non-linear evolution of cold dark matter and neutrinos play a pivotal role in this process. We incorporate neutrinos into the cosmological N-body code CUBEP3M and discuss the challenges associated with pushing to the extreme scales demanded by the neutrino problem. We highlight code optimizations made to exploit modern high performance computing architectures and present a novel method of data compression that reduces the phase-space particle footprint from 24 bytes in single precision to roughly 9 bytes. We scale the neutrino problem to the Tianhe-2 supercomputer and provide details of our production run, named TianNu, which uses 86% of the machine (13,824 compute nodes). With a total of 2.97 tr...

  9. LEMON - LHC Era Monitoring for Large-Scale Infrastructures

    Science.gov (United States)

    Marian, Babik; Ivan, Fedorko; Nicholas, Hook; Hector, Lansdale Thomas; Daniel, Lenkes; Miroslav, Siket; Denis, Waldron

    2011-12-01

    At the present time, computer centres are facing a massive rise in virtualization and cloud computing, as these solutions bring advantages to service providers and consolidate computer centre resources. As a result, however, monitoring complexity is increasing. Computer centre management requires not only monitoring servers, network equipment and associated software, but also collecting additional environment and facilities data (e.g. temperature, power consumption, cooling efficiency, etc.) to obtain a good overview of infrastructure performance. The LHC Era Monitoring (Lemon) system addresses these requirements for a very large scale infrastructure. The Lemon agent, which collects data on every client and forwards the samples to the central measurement repository, provides a flexible interface that allows rapid development of new sensors. The system can also report on behalf of remote devices such as switches and power supplies. Online and historical data can be visualized via a web-based interface or retrieved via command-line tools. The Lemon Alarm System component can be used for notifying the operator about error situations. In this article, an overview of Lemon monitoring is provided together with a description of the CERN LEMON production instance. No direct comparison is made with other monitoring tools.

  10. Validation and uncertainty analysis for monthly and extreme precipitation in the ERA-20C reanalysis based on the WZN in-situ measurements

    Science.gov (United States)

    Rustemeier, Elke; Ziese, Markus; Raykova, Kristin; Meyer-Christoffer, Anja; Schneider, Udo; Finger, Peter; Becker, Andreas

    2017-04-01

    The proper representation of precipitation, in particular extreme precipitation, in global reanalyses is still challenging. This paper focuses on the potential of the ERA-20C centennial reanalysis to reproduce precipitation events. The global ERA-20C Reanalysis has been developed within the projects ERA-CLIM and its successor ERA-CLIM2 with the aim of a multi-decadal reanalysis of the global climate system. One of the objectives of ERA-CLIM2 is to provide useful information about the uncertainty of the various parameters. Since precipitation is a prognostic variable, it allows for independent validation by in-situ measurements. For this purpose, the Global Precipitation Climatology Centre (GPCC), operated by the DWD, has compared the ERA-20C Reanalysis with the GPCC observational products "Full Data Monthly Version 7" (FDM-V7) and "Full Data Daily Version 1" (FDD-V1). ERA-20C is based on the ECMWF prediction model IFS version Cy38r1 with a spatial resolution of approximately 125 km and covers the 111 years from 1900 to 2010. The GPCC FDM-V7 raster data product, on the other hand, includes the global land surface in-situ measurements between 1901 and 2013 (Schneider et al., 2014), and the FDD-V1 raster data product covers daily precipitation from 1988 to 2013. The most suitable resolution of 1° was used to validate ERA-20C. For the spatial and temporal validation of the ERA-20C Reanalysis, global temporal scores were calculated on monthly, seasonal and annual time scales. These include, e.g., monthly contingency table scores, correlations, and climate change indices (ETCCDI) for precipitation to determine extreme values and their temporal change (Peterson et al., 2001, Appendix A). Not surprisingly, the regions with the strongest differences are also those with data scarcity, mountain regions with their windward (luv) and lee effects, or monsoon areas. They all show a strong systematic difference and breaks within the time series. Differences between ERA-20C and
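
    The monthly contingency table scores mentioned above reduce, for a chosen precipitation threshold, to counts of hits, misses and false alarms between reanalysis and gridded observations. A generic illustration in Python follows; the threshold and the particular scores shown are assumptions, not the GPCC validation code.

        import numpy as np

        def contingency_scores(obs, model, threshold=1.0):
            """2x2 contingency-table scores for exceeding a precipitation threshold (mm).

            obs, model: collocated monthly precipitation totals.
            Returns probability of detection (POD), false alarm ratio (FAR)
            and frequency bias.
            """
            obs_event = np.asarray(obs) >= threshold
            mod_event = np.asarray(model) >= threshold
            hits = np.sum(obs_event & mod_event)
            misses = np.sum(obs_event & ~mod_event)
            false_alarms = np.sum(~obs_event & mod_event)
            pod = hits / (hits + misses)
            far = false_alarms / (hits + false_alarms)
            bias = (hits + false_alarms) / (hits + misses)
            return pod, far, bias

        print(contingency_scores([0.2, 3.1, 5.0, 0.0, 2.2], [0.5, 2.8, 0.4, 0.1, 3.0]))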

  11. Scaling Extreme Astrophysical Phenomena to the Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Remington, B A

    2007-11-01

    High-energy-density (HED) physics refers broadly to the study of macroscopic collections of matter under extreme conditions of temperature and density. The experimental facilities most widely used for these studies are high-power lasers and magnetic-pinch generators. The HED physics pursued on these facilities is still in its infancy, yet new regimes of experimental science are emerging. Examples from astrophysics include work relevant to planetary interiors, supernovae, astrophysical jets, and accreting compact objects (such as neutron stars and black holes). In this paper, we review a selection of recent results in this new field of HED laboratory astrophysics and provide a brief look ahead to the coming decade.

  12. Future increases in extreme precipitation exceed observed scaling rates

    Science.gov (United States)

    Bao, Jiawei; Sherwood, Steven C.; Alexander, Lisa V.; Evans, Jason P.

    2017-01-01

    Models and physical reasoning predict that extreme precipitation will increase in a warmer climate due to increased atmospheric humidity. Observational tests using regression analysis have reported a puzzling variety of apparent scaling rates, including strong rates in midlatitude locations but weak or negative rates in the tropics. Here we analyse daily extreme precipitation events in several Australian cities to show that temporary local cooling associated with extreme events and associated synoptic conditions reduces these apparent scaling rates, especially in warmer climatic conditions. A regional climate projection ensemble for Australia, which implicitly includes these effects, accurately and robustly reproduces the observed apparent scaling throughout the continent for daily precipitation extremes. Projections from the same model show future daily extremes increasing at rates faster than those inferred from observed scaling. The strongest extremes (99.9th percentile events) scale significantly faster than near-surface water vapour, at 5.7-15% °C⁻¹ depending on model details. This scaling rate is highly correlated with the change in water vapour, implying a trade-off between a more arid future climate and one with strong increases in extreme precipitation. These conclusions are likely to generalize to other regions.
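
    For reference, the benchmark against which such rates are usually judged is the Clausius-Clapeyron rate of near-surface water vapour increase, roughly 7% per degree; an apparent scaling rate is then the regression slope of log extreme precipitation on temperature. The formulation below is generic, not necessarily the exact regression used by the authors.

        \frac{1}{e_{s}} \frac{\mathrm{d}e_{s}}{\mathrm{d}T} = \frac{L_{v}}{R_{v} T^{2}} \approx 7\%\ \mathrm{K^{-1}},
        \qquad
        P_{\mathrm{ext}}(T) \propto e^{\alpha T}
        \;\Rightarrow\;
        \alpha = \frac{\mathrm{d}\ln P_{\mathrm{ext}}}{\mathrm{d}T}

    The 5.7-15% °C⁻¹ range quoted above for the 99.9th percentile thus corresponds to roughly one to two times the Clausius-Clapeyron rate.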

  13. Scaling a Survey Course in Extreme Weather

    Science.gov (United States)

    Samson, P. J.

    2013-12-01

    "Extreme Weather" is a survey-level course offered at the University of Michigan that is broadcast via the web and serves as a research testbed to explore best practices for large class conduct. The course has led to the creation of LectureTools, a web-based student response and note-taking system that has been shown to increase student engagement dramatically in multiple courses by giving students more opportunities to participate in class. Included in this is the capacity to pose image-based questions (see image where question was "Where would you expect winds from the south") as well as multiple choice, ordered list, free response and numerical questions. Research in this class has also explored differences in learning outcomes from those who participate remotely versus those who physically come to class and found little difference. Moreover the technologies used allow instructors to conduct class from wherever they are while the students can still answer questions and engage in class discussion from wherever they are. This presentation will use LectureTools to demonstrate its features. Attendees are encouraged to bring a mobile device to the session to participate.

  14. Extreme Scale Computing to Secure the Nation

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L; McGraw, J R; Johnson, J R; Frincke, D

    2009-11-10

    absence of nuclear testing, a program to: (1) Support a focused, multifaceted program to increase the understanding of the enduring stockpile; (2) Predict, detect, and evaluate potential problems of the aging of the stockpile; (3) Refurbish and re-manufacture weapons and components, as required; and (4) Maintain the science and engineering institutions needed to support the nation's nuclear deterrent, now and in the future'. This program continues to fulfill its national security mission by adding significant new capabilities for producing scientific results through large-scale computational simulation coupled with careful experimentation, including sub-critical nuclear experiments permitted under the CTBT. To develop the computational science and the computational horsepower needed to support its mission, SBSS initiated the Accelerated Strategic Computing Initiative, later renamed the Advanced Simulation & Computing (ASC) program (sidebar: 'History of ASC Computing Program Computing Capability'). The modern 3D computational simulation capability of the ASC program supports the assessment and certification of the current nuclear stockpile through calibration with past underground test (UGT) data. While an impressive accomplishment, continued evolution of national security mission requirements will demand computing resources at a significantly greater scale than we have today. In particular, continued observance and potential Senate confirmation of the Comprehensive Test Ban Treaty (CTBT), together with the U.S. administration's promise for a significant reduction in the size of the stockpile and the inexorable aging and consequent refurbishment of the stockpile, all demand increasing refinement of our computational simulation capabilities. Assessment of the present and future stockpile with increased confidence of its safety and reliability, without reliance upon calibration with past or future test data, is a long-term goal of the ASC program. This

  15. Accounting for choice of measurement scale in extreme value modeling

    OpenAIRE

    Wadsworth, J. L.; Tawn, J. A.; Jonathan, P.

    2010-01-01

    We investigate the effect that the choice of measurement scale has upon inference and extrapolation in extreme value analysis. Separate analyses of variables from a single process on scales which are linked by a nonlinear transformation may lead to discrepant conclusions concerning the tail behavior of the process. We propose the use of a Box--Cox power transformation incorporated as part of the inference procedure to account parametrically for the uncertainty surrounding the scale of extrapo...
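
    The Box-Cox power transformation referred to above has the standard form shown below; the exact parameterization adopted in the paper may differ in detail.

        y(\lambda) =
        \begin{cases}
          \dfrac{x^{\lambda} - 1}{\lambda}, & \lambda \neq 0, \\
          \log x, & \lambda = 0,
        \end{cases}

    with the power λ treated as an additional model parameter, so that uncertainty about the measurement scale propagates into the extreme value inference.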

  16. Climatic forecast: down-scaling and extremes; La prevision climatique: regionalisation et extremes

    Energy Technology Data Exchange (ETDEWEB)

    Deque, M. [Meteo France, Centre National de Recherches Meteorologiques (CNRM), 31 - Toulouse (France)]; Li, L. [Laboratoire de Meteorologie Dynamique, Institut Pierre-Simon Laplace, Ecole Polytechnique, 91 - Palaiseau (France)]

    2007-05-15

    There is a strong demand for specifying the future climate at the local scale and for extreme events. New methods that make better use of climate model output are currently being developed, and the French laboratories involved in the Escrime project are actively participating. (authors)

  17. Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales

    Energy Technology Data Exchange (ETDEWEB)

    Xiu, Dongbin [Univ. of Utah, Salt Lake City, UT (United States)]

    2017-03-03

    The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations on extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately, in high dimensional spaces, resolve stochastic problems with limited smoothness, even containing discontinuities.

  18. Extreme reaction times determine fluctuation scaling in human color vision

    Science.gov (United States)

    Medina, José M.; Díaz, José A.

    2016-11-01

    In modern mental chronometry, human reaction time defines the time elapsed from stimulus presentation until a response occurs and represents a reference paradigm for investigating stochastic latency mechanisms in color vision. Here we examine the statistical properties of extreme reaction times and whether they support fluctuation scaling in the skewness-kurtosis plane. Reaction times were measured for visual stimuli across the cardinal directions of the color space. For all subjects, the results show that very large reaction times deviate from the right tail of reaction time distributions, suggesting the existence of dragon-king events. The results also indicate that extreme reaction times are correlated and shape fluctuation scaling over a wide range of stimulus conditions. The scaling exponent was higher for achromatic than isoluminant stimuli, suggesting distinct generative mechanisms. Our findings open a new perspective for studying failure modes in sensory-motor communications and in complex networks.

  19. Extreme Scale Computing for First-Principles Plasma Physics Research

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Choong-Seock [Princeton University]

    2011-10-12

    World superpowers are in the middle of the “Computnik” race. The US Department of Energy (and National Nuclear Security Administration) wishes to launch exascale computer systems into the scientific (and national security) world by 2018. The objective is to solve important scientific problems and to predict the outcomes using the most fundamental scientific laws, which would not be possible otherwise. Being chosen into the next “frontier” group can be of great benefit to a scientific discipline. An extreme scale computer system requires different types of algorithms and programming philosophy from those we have been accustomed to. Only a handful of scientific codes are blessed to be capable of scalable usage of today’s largest computers in operation at petascale (using more than 100,000 cores concurrently). Fortunately, a few magnetic fusion codes are competing well in this race using the “first principles” gyrokinetic equations. These codes are beginning to study the fusion plasma dynamics in full-scale realistic diverted device geometry in natural nonlinear multiscale, including the large scale neoclassical and small scale turbulence physics, but excluding some ultra fast dynamics. In this talk, most of the above-mentioned topics will be introduced at executive level. Representative properties of the extreme scale computers, modern programming exercises to take advantage of them, and different philosophies in the data flows and analyses will be presented. Examples of the multi-scale multi-physics scientific discoveries made possible by solving the gyrokinetic equations on extreme scale computers will be described. Future directions into “virtual tokamak experiments” will also be discussed.

  1. A Network Contention Model for the Extreme-scale Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Engelmann, Christian [ORNL]; Naughton III, Thomas J [ORNL]

    2015-01-01

    The Extreme-scale Simulator (xSim) is a performance investigation toolkit for high-performance computing (HPC) hardware/software co-design. It permits running an HPC application with millions of concurrent execution threads, while observing its performance in a simulated extreme-scale system. This paper details a newly developed network modeling feature for xSim, eliminating the shortcomings of the existing network modeling capabilities. The approach takes a different path for implementing network contention and bandwidth capacity modeling, using a less synchronous but sufficiently accurate model design. With the new network modeling feature, xSim is able to simulate on-chip and on-node networks with reasonable accuracy and overheads.

  2. Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales

    Energy Technology Data Exchange (ETDEWEB)

    Xiu, Dongbin [Purdue Univ., West Lafayette, IN (United States)]

    2016-06-21

    The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations on extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately, in high dimensional spaces, resolve stochastic problems with limited smoothness, even containing discontinuities.

  3. Large Scale Meteorological Pattern of Extreme Rainfall in Indonesia

    Science.gov (United States)

    Kuswanto, Heri; Grotjahn, Richard; Rachmi, Arinda; Suhermi, Novri; Oktania, Erma; Wijaya, Yosep

    2014-05-01

    Extreme Weather Events (EWEs) cause negative impacts socially, economically, and environmentally. Considering these facts, forecasting EWEs is crucial work. Indonesia has been identified as being among the countries most vulnerable to the risk of natural disasters, such as floods, heat waves, and droughts. Current forecasting of extreme events in Indonesia is carried out by interpreting synoptic maps for several fields without taking into account the link between the observed events in the 'target' area and remote conditions. This situation may cause misidentification of the event, leading to an inaccurate prediction. Grotjahn and Faure (2008) compute composite maps from extreme events (including heat waves and intense rainfall) to help forecasters identify such events in model output. The composite maps show large scale meteorological patterns (LSMP) that occurred during historical EWEs. Some vital information about the EWEs can be acquired from studying such maps, in addition to providing forecaster guidance. Such maps have robust mid-latitude meteorological patterns (for Sacramento and California Central Valley, USA EWEs). We study the performance of the composite approach for tropical weather conditions such as those in Indonesia. Initially, the composite maps are developed to identify and forecast the extreme weather events in Indramayu district, West Java, the main producer of rice in Indonesia, contributing about 60% of the national total rice production. Studying extreme weather events happening in Indramayu is important since EWEs there affect national agricultural and fisheries activities. During a recent EWE, more than a thousand houses in Indramayu suffered serious flooding, with each home more than one meter underwater. The flood also destroyed a thousand hectares of rice plantings in 5 regencies. Identifying the dates of extreme events is one of the most important steps and has to be carried out carefully. An approach has been applied to identify the
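
    A composite large-scale meteorological pattern of the kind described above is essentially an average of an atmospheric field over the identified extreme-event dates minus the climatological mean. The sketch below is a hypothetical illustration of that computation; the field, array shapes and variable names are assumptions, not the authors' code.

        import numpy as np

        def composite_anomaly(field, dates, event_dates):
            """Mean anomaly of `field` (time, lat, lon) over extreme-event dates."""
            climatology = field.mean(axis=0)               # long-term mean map
            mask = np.isin(dates, event_dates)             # select event days
            return field[mask].mean(axis=0) - climatology  # composite anomaly map

        # Toy example: 100 days of a 10 x 20 gridded field, 5 "event" days.
        rng = np.random.default_rng(0)
        field = rng.normal(size=(100, 10, 20))
        dates = np.arange(100)
        print(composite_anomaly(field, dates, np.array([3, 17, 42, 60, 88])).shape)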

  4. Testing for scale-invariance in extreme events, with application to earthquake occurrence

    Science.gov (United States)

    Main, I.; Naylor, M.; Greenhough, J.; Touati, S.; Bell, A.; McCloskey, J.

    2009-04-01

    this case) to assume Gaussian errors. We develop methods to correct for these effects, and show that the current best fit maximum likelihood regression model for the global frequency-moment distribution in the digital era is a power law, i.e. mega-earthquakes continue to follow the Gutenberg-Richter trend of smaller earthquakes with no (as yet) observable cut-off or characteristic extreme event. The results may also have implications for the interpretation of other time-limited geophysical time series that exhibit power-law scaling.

  5. A first global-scale hindcast of extreme sea levels induced by extra-tropical storms

    Science.gov (United States)

    Muis, Sanne; Verlaan, Martin; Winsemius, Hessel; Ward, Philip

    2015-04-01

    Flood risk in coastal areas has been increasing in past years. This can be partly attributed to climate change and rising sea levels that increase the likelihood of coastal flood hazards, but also to increasing flood exposure because the global population and capital are increasingly concentrated in coastal zones. Without action, the increasing trends in flood hazard and exposure will be associated with catastrophic flood losses in the future. The adequate allocation of global investments and prioritization of adaptation actions requires an accurate understanding of current and future coastal flood risk on a global scale. Despite this, global data on extreme sea levels are scarce. A few studies have assessed coastal flood risk at the global scale. To date, these have been either based on extreme water levels in the DIVA database or on observations from tide gauges. Both datasets have limitations when assessing flood risk, including low-probability events, on a fully global scale. Hence, there is a need for an improved estimation of extreme sea level on a global scale. Therefore, we are developing the first global hindcast of coastal water levels, which covers the period 1979-2013. To do this, we apply a global hydrodynamic model which is based on the Delft3D Flexible Mesh software from Deltares. By forcing the model with the tidal potential and meteorological fields derived from the ERA-Interim global reanalysis, we are able to simulate the water levels resulting from tides and surges. Subsequently, we apply extreme value statistics to estimate exceedance probabilities. Similar hydrodynamic modelling efforts have been carried out at the regional scale, but as the modelling of surges in shallow coastal areas requires a high-resolution model grid, generally this approach is computationally too costly on a global scale. However, the recent application of unstructured grids (or flexible mesh) in hydrodynamic models, allowing local refinement of the grid, has enabled
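
    The extreme value step mentioned above, turning a multi-decadal water-level hindcast into exceedance probabilities, can be illustrated with a generic annual-maxima GEV fit. The sketch below uses scipy and synthetic data; it illustrates the standard approach, not the authors' implementation.

        from scipy.stats import genextreme

        # Synthetic stand-in for 35 years (1979-2013) of annual maximum surge levels [m].
        annual_maxima = genextreme.rvs(c=-0.1, loc=1.2, scale=0.3, size=35, random_state=1)

        # Fit a GEV distribution to the annual maxima.
        shape, loc, scale = genextreme.fit(annual_maxima)

        def return_level(T):
            """Water level exceeded on average once every T years."""
            return genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)

        print("100-year return level [m]:", round(return_level(100.0), 2))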

  6. Improving the Performance of the Extreme-scale Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Engelmann, Christian [ORNL]; Naughton III, Thomas J [ORNL]

    2014-01-01

    Investigating the performance of parallel applications at scale on future high-performance computing (HPC) architectures and the performance impact of different architecture choices is an important component of HPC hardware/software co-design. The Extreme-scale Simulator (xSim) is a simulation-based toolkit for investigating the performance of parallel applications at scale. xSim scales to millions of simulated Message Passing Interface (MPI) processes. The overhead introduced by a simulation tool is an important performance and productivity aspect. This paper documents two improvements to xSim: (1) a new deadlock resolution protocol to reduce the parallel discrete event simulation management overhead and (2) a new simulated MPI message matching algorithm to reduce the oversubscription management overhead. The results clearly show a significant performance improvement, such as reducing the simulation overhead for running the NAS Parallel Benchmark suite inside the simulator from 1,020% to 238% for the conjugate gradient (CG) benchmark and from 102% to 0% for the embarrassingly parallel (EP) benchmark, as well as from 37,511% to 13,808% for CG and from 3,332% to 204% for EP with accurate process failure simulation.
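
    The overhead percentages above are most naturally read as the relative slowdown of the simulated run with respect to native execution; a plausible definition (the abstract does not state one explicitly) is

        \mathrm{overhead} = \frac{T_{\mathrm{xSim}} - T_{\mathrm{native}}}{T_{\mathrm{native}}} \times 100\%

    so an overhead of 238% corresponds to the simulated CG benchmark taking roughly 3.4 times as long as the native run, down from about 11 times at 1,020%.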

  7. Large Scale Influences on Drought and Extreme Precipitation Events in the United States

    Science.gov (United States)

    Collow, A.; Bosilovich, M. G.; Koster, R. D.; Eichmann, A.

    2015-12-01

    Observations indicate that extreme weather events are increasing and it is likely that this trend will continue through the 21st century. However, there is uncertainty and disagreement in recent literature regarding the mechanisms by which extreme temperature and precipitation events are increasing, including the suggestion that enhanced Arctic warming has resulted in an increase in blocking events and a more meridional flow. A steady gradual increase in heavy precipitation events has been observed in the Midwestern and Northeastern United States, while the Southwestern United States, particularly California, has experienced suppressed precipitation and an increase in consecutive dry days over the past few years. The frequency, intensity, and duration of heavy precipitation events in the Midwestern United States and Northeastern United States, as well as drought in the Southwestern United States are examined using the Modern Era Retrospective Analysis for Research and Applications Version-2 (MERRA-2). Indices developed by the Expert Team on Climate Change Detection and Indices representing drought and heavy precipitation events have been calculated using the MERRA-2 dataset for the period of 1980 through 2014. Trends in these indices are analyzed and the indices are compared to large scale circulations and climate modes using a composite and statistical linkages approach. Statistically significant correlations are present in the summer months between heavy precipitation events and meridional flow despite the lack of enhanced Arctic warming, contradicting the suggested mechanisms. Weaker, though still significant, correlations are observed in the winter months when the Arctic is warming more rapidly than the Midlatitudes.

  8. Scaling and Intensification of Extreme Precipitation in High-Resolution Climate Change Simulations

    Science.gov (United States)

    Ban, Nikolina; Leutwyler, David; Lüthi, Daniel; Schär, Christoph

    2017-04-01

    Climate change projections of extreme precipitation are of great interest due to hydrological impacts such as droughts, floods, erosion, landslides and debris flows. Despite the trend towards drier conditions over Europe, many climate simulations project increases of heavy precipitation events, while some theoretical studies have raised the possibility of dramatic increases in hourly events (by up to 14% per degree warming). However, conventional climate models are not suited to assess short-term heavy events due to the need to parameterize deep convection. High-resolution climate models with kilometer-scale grid spacing, at which the parameterization of convection can be switched off, significantly improve the simulation of heavy precipitation and can alter the climate change signal (e.g., Ban et al., 2015). Here we present decade-long high-resolution climate change simulations at a horizontal resolution of 2.2 km over Europe on a computational domain with 1536x1536x60 grid points. These simulations have become feasible with a new version of the COSMO model that runs entirely on Graphics Processing Units. We compare a present-day climate simulation, driven by the ERA-Interim reanalysis (Leutwyler et al., 2016), with a Pseudo-Global Warming (PGW) simulation. The PGW simulation is driven by the slowly evolving mean seasonal cycle of the climate changes (derived from the CMIP5 models), superimposed on the ERA-Interim reanalysis. With this approach, the resulting changes are due to large-scale warming of the atmosphere and due to slowly varying circulation changes. We will present the differences in climate change signal between conventional and high-resolution climate models, and discuss the thermodynamic effects on the intensification of extreme precipitation. Ban N., J. Schmidli and C. Schär, 2015: Heavy precipitation in a changing climate: Does short-term summer precipitation increase faster? Geophys. Res. Lett., 42 (4), 1165-1172 Leutwyler, D., D. Lüthi, N. Ban, O. Fuhrer and C
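
    The Pseudo-Global Warming forcing described above can be written schematically as the reanalysis boundary fields plus a slowly varying, monthly climate-change increment; the form below is inferred from the abstract, not the authors' exact operator.

        X_{\mathrm{PGW}}(t) = X_{\mathrm{ERA\text{-}Interim}}(t) + \Delta X_{\mathrm{CMIP5}}\bigl(m(t)\bigr)

    where X stands for a lateral boundary field (temperature, humidity, winds, pressure), m(t) is the calendar month, and ΔX_CMIP5 is the CMIP5-derived mean seasonal cycle of the climate change signal.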

  9. Reducing Waste in Extreme Scale Systems through Introspective Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bautista-Gomez, Leonardo [Argonne National Laboratory (ANL)]; Gainaru, Ana [University of Illinois at Urbana-Champaign, National Center for Supercomputing Applications]; Perarnau, Swann [Argonne National Laboratory (ANL)]; Tiwari, Devesh [ORNL]; Gupta, Saurabh [ORNL]; Engelmann, Christian [ORNL]; Cappello, Franck [Argonne National Laboratory (ANL)]; Snir, Marc [Argonne National Laboratory (ANL)]

    2016-01-01

    Resilience is an important challenge for extreme-scale supercomputers. Today, failures in supercomputers are assumed to be uniformly distributed in time. However, recent studies show that failures in high-performance computing systems are partially correlated in time, generating periods of higher failure density. Our study of the failure logs of multiple supercomputers shows that periods of higher failure density occur with failure rates up to three times higher than the average. We design a monitoring system that listens to hardware events and forwards important events to the runtime to detect those regime changes. We implement a runtime capable of receiving notifications and adapting dynamically. In addition, we build an analytical model to predict the gains that such a dynamic approach could achieve. We demonstrate that in some systems, our approach can reduce the wasted time by over 30%.
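
    The regime-change detection described above can be pictured as a sliding-window estimate of failure density compared against the long-run average. The Python sketch below is a hypothetical illustration of that idea; the window length, threshold factor and the adaptation hook are assumptions, not the monitoring system itself.

        import numpy as np

        def high_density_periods(failure_times, window=24.0, factor=3.0):
            """Flag window start times whose failure count exceeds `factor` x the mean.

            failure_times: sorted failure timestamps in hours.
            The abstract reports failure densities up to ~3x the average, hence factor=3.
            """
            failure_times = np.asarray(failure_times, dtype=float)
            span = failure_times[-1] - failure_times[0]
            mean_per_window = len(failure_times) * window / span
            starts = np.arange(failure_times[0], failure_times[-1] - window, window / 4)
            flagged = []
            for t0 in starts:
                count = np.sum((failure_times >= t0) & (failure_times < t0 + window))
                if count > factor * mean_per_window:
                    flagged.append(t0)  # a runtime could e.g. shorten checkpoint intervals here
            return flagged

        # Regular background failures every 10 h plus a burst of 10 failures around t = 500 h.
        times = sorted(list(np.arange(0.0, 1000.0, 10.0)) + list(500.0 + np.linspace(0.0, 5.0, 10)))
        print(high_density_periods(times)[:3])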

  10. Small-scale characteristics of extremely high latitude aurora

    Directory of Open Access Journals (Sweden)

    J. A. Cumnock

    2009-09-01

    We examine 14 cases of an interesting type of extremely high latitude aurora as identified in the precipitating particles measured by the DMSP F13 satellite. In particular we investigate structures within large-scale arcs for which the particle signatures are made up of a group of multiple distinct thin arcs. These cases are chosen without regard to IMF orientation and are part of a group of 87 events where DMSP F13 SSJ/4 measures emissions which occur near the noon-midnight meridian and are spatially separated from both the dawnside and duskside auroral ovals by wide regions with precipitating particles typical of the polar cap. For 73 of these events the high-latitude aurora consists of a continuous region of precipitating particles. We focus on the remaining 14 of these events where the particle signatures show multiple distinct thin arcs. These events occur during northward or weakly southward IMF conditions and follow a change in IMF By. Correlations are seen between the field-aligned currents and plasma flows associated with the arcs, implying local closure of the FACs. Strong correlations are seen only in the sunlit hemisphere. The convection associated with the multiple thin arcs is localized and has little influence on the large-scale convection. This also implies that the sunward flow along the arcs is unrelated to the overall ionospheric convection.

  11. Scaling extreme astrophysical phenomena to the laboratory - a tutorial

    Science.gov (United States)

    Remington, Bruce A.

    2007-11-01

    The ability to experimentally study scaled aspects of the explosion dynamics of core-collapse supernovae (massive stars that explode from the inside out) or the radiation kinetics of accreting neutron stars or black holes on high energy density (HED) facilities, such as high power lasers and magnetic pinch facilities, is an exciting scientific development over the last two decades. [1,2] Additional areas of research that become accessible on modern HED facilities are studies of fundamental properties of matter in conditions relevant to planetary and stellar interiors, protostellar jet dynamics, and with the added tool of thermonuclear ignition on the National Ignition Facility, excited state ("multi-hit") nuclear physics, possibly relevant to nucleosynthesis. Techniques and methodologies for studying aspects of the physics of such extreme phenomena of the universe in millimeter scale parcels of plasma in the laboratory will be discussed. [1] "Experimental astrophysics with high power lasers and Z pinches," B.A. Remington, R.P. Drake, D.D. Ryutov, Rev. Mod. Phys. 78, 755 (2006). [2] "High energy density laboratory astrophysics," B.A. Remington, Plasma Phys. Cont. Fusion 47, A191 (2005).

  12. EXAMINATION OF THE VALIDITY AND RELIABILITY OF EXTREME SPORTS PARTICIPATION SCALE: PILOT STUDY

    OpenAIRE

    2016-01-01

    The aim of this research is to establish the validity and reliability of the extreme sports participation scale. Accordingly, a validity and reliability study of the scale was carried out, adding to the scope of the scale the dimensions of motives for participating in extreme sports obtained from a focus group study performed within the scope of the examination of the related literature and the research. The scale was applied to individuals who use extreme sports facilities in some provinces in Tu...

  13. Differential Juvenile Hormone Variations in Scale Insect Extreme Sexual Dimorphism.

    Directory of Open Access Journals (Sweden)

    Isabelle Mifom Vea

    Scale insects have evolved extreme sexual dimorphism, as demonstrated by sedentary juvenile-like females and ephemeral winged males. This dimorphism is established during the post-embryonic development; however, the underlying regulatory mechanisms have not yet been examined. We herein assessed the role of juvenile hormone (JH) on the diverging developmental pathways occurring in the male and female Japanese mealybug Planococcus kraunhiae (Kuwana). We provide, for the first time, detailed gene expression profiles related to JH signaling in scale insects. Prior to adult emergence, the transcript levels of JH acid O-methyltransferase, encoding a rate-limiting enzyme in JH biosynthesis, were higher in males than in females, suggesting that JH levels are higher in males. Furthermore, male quiescent pupal-like stages were associated with higher transcript levels of the JH receptor gene, Methoprene-tolerant and its co-activator taiman, as well as the JH early-response genes, Krüppel homolog 1 and broad. The exposure of male juveniles to an ectopic JH mimic prolonged the expression of Krüppel homolog 1 and broad, and delayed adult emergence by producing a supernumeral pupal stage. We propose that male wing development is first induced by up-regulated JH signaling compared to female expression pattern, but a decrease at the end of the prepupal stage is necessary for adult emergence, as evidenced by the JH mimic treatments. Furthermore, wing development seems linked to JH titers as JHM treatments on the pupal stage led to wing deformation. The female pedomorphic appearance was not reflected by the maintenance of high levels of JH. The results in this study suggest that differential variations in JH signaling may be responsible for sex-specific and radically different modes of metamorphosis.

  14. Differential Juvenile Hormone Variations in Scale Insect Extreme Sexual Dimorphism.

    Science.gov (United States)

    Vea, Isabelle Mifom; Tanaka, Sayumi; Shiotsuki, Takahiro; Jouraku, Akiya; Tanaka, Toshiharu; Minakuchi, Chieka

    2016-01-01

    Scale insects have evolved extreme sexual dimorphism, as demonstrated by sedentary juvenile-like females and ephemeral winged males. This dimorphism is established during the post-embryonic development; however, the underlying regulatory mechanisms have not yet been examined. We herein assessed the role of juvenile hormone (JH) on the diverging developmental pathways occurring in the male and female Japanese mealybug Planococcus kraunhiae (Kuwana). We provide, for the first time, detailed gene expression profiles related to JH signaling in scale insects. Prior to adult emergence, the transcript levels of JH acid O-methyltransferase, encoding a rate-limiting enzyme in JH biosynthesis, were higher in males than in females, suggesting that JH levels are higher in males. Furthermore, male quiescent pupal-like stages were associated with higher transcript levels of the JH receptor gene, Methoprene-tolerant and its co-activator taiman, as well as the JH early-response genes, Krüppel homolog 1 and broad. The exposure of male juveniles to an ectopic JH mimic prolonged the expression of Krüppel homolog 1 and broad, and delayed adult emergence by producing a supernumeral pupal stage. We propose that male wing development is first induced by up-regulated JH signaling compared to female expression pattern, but a decrease at the end of the prepupal stage is necessary for adult emergence, as evidenced by the JH mimic treatments. Furthermore, wing development seems linked to JH titers as JHM treatments on the pupal stage led to wing deformation. The female pedomorphic appearance was not reflected by the maintenance of high levels of JH. The results in this study suggest that differential variations in JH signaling may be responsible for sex-specific and radically different modes of metamorphosis.

  15. Faster Parallel Traversal of Scale Free Graphs at Extreme Scale with Vertex Delegates

    KAUST Repository

    Pearce, Roger

    2014-11-01

    At extreme scale, irregularities in the structure of scale-free graphs such as social network graphs limit our ability to analyze these important and growing datasets. A key challenge is the presence of high-degree vertices (hubs), which leads to parallel workload and storage imbalances. The imbalances occur because existing partitioning techniques are not able to effectively partition high-degree vertices. We present techniques to distribute storage, computation, and communication of hubs for extreme scale graphs in distributed memory supercomputers. To balance the hub processing workload, we distribute hub data structures and related computation among a set of delegates. The delegates coordinate using highly optimized, yet portable, asynchronous broadcast and reduction operations. We demonstrate scalability of our new algorithmic technique using Breadth-First Search (BFS), Single Source Shortest Path (SSSP), K-Core Decomposition, and Page-Rank on synthetically generated scale-free graphs. Our results show excellent scalability on large scale-free graphs up to 131K cores of the IBM BG/P, and outperform the best known Graph500 performance on BG/P Intrepid by 15%.
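
    The hub-delegation idea above, spreading the adjacency and work of very high-degree vertices over many ranks instead of a single owner, can be sketched as a partitioning rule. The Python sketch below is schematic; the degree threshold and rank-assignment scheme are assumptions, and the real implementation is a distributed-memory one.

        from collections import defaultdict

        def partition_with_delegates(edges, num_ranks, hub_threshold=1000):
            """Assign edges to ranks, scattering edges of high-degree hubs over all ranks."""
            degree = defaultdict(int)
            for u, v in edges:
                degree[u] += 1
                degree[v] += 1
            hubs = {v for v, d in degree.items() if d >= hub_threshold}

            per_rank = defaultdict(list)
            for i, (u, v) in enumerate(edges):
                if u in hubs or v in hubs:
                    per_rank[i % num_ranks].append((u, v))        # hub edges: round-robin delegates
                else:
                    per_rank[hash(u) % num_ranks].append((u, v))  # low-degree: hashed owner rank
            return per_rank, hubs

        # Toy graph: one hub (vertex 0) with 1999 neighbours plus two ordinary edges.
        edges = [(0, i) for i in range(1, 2000)] + [(5000, 5001), (5002, 5003)]
        parts, hubs = partition_with_delegates(edges, num_ranks=4)
        print(sorted(hubs), {r: len(es) for r, es in sorted(parts.items())})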

  16. Detection of Coherent Structures in Extreme-Scale Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Kamath, C; Iverson, J; Kirk, R; Karypis, G

    2012-03-24

    The analysis of coherent structures is a common problem in many scientific domains ranging from astrophysics to combustion, fusion, and materials science. The data from three-dimensional simulations are analyzed to detect the structures, extract statistics on them, and track them over time to gain insights into the phenomenon being modeled. This analysis is typically done off-line, using data that have been written out by the simulations. However, the move towards extreme scale architectures, with multi-core processors and graphical processing units, will affect how such analysis is done as it is unlikely that the systems will support the I/O bandwidth required for off-line analysis. Moving the analysis in-situ is a solution only if we know a priori what analysis will be done, as well as the algorithms used and their parameter settings. Even then, we need to ensure that this will not substantially increase the memory requirements or the data movement as the former will be limited and the latter will be expensive. In the Exa-DM project, a collaboration between Lawrence Livermore National Laboratory and University of Minnesota, we are exploring ways in which we can address the conflicting demands of coherent structure analysis of simulation data and the architecture of modern parallel systems, while enabling scientific discovery at the exascale. In this paper, we describe our work in two areas: the in situ implementation of an existing algorithm for coherent structure analysis and the use of graph-based techniques to efficiently compress the data.

  17. A Fault-Oblivious Extreme-Scale Execution Environment (FOX)

    Energy Technology Data Exchange (ETDEWEB)

    Van Hensbergen, Eric; Speight, William; Xenidis, Jimi

    2013-03-15

    IBM Research’s contribution to the Fault Oblivious Extreme-scale Execution Environment (FOX) revolved around three core research deliverables: • collaboration with Boston University around the Kittyhawk cloud infrastructure, which both enabled a development and deployment platform for the project team and provided a fault-injection testbed to evaluate prototypes • operating systems research focused on exploring role-based operating system technologies through collaboration with Sandia National Labs on the NIX research operating system and collaboration with the broader IBM Research community around a hybrid operating system model which became known as FusedOS • IBM Research also participated in an advisory capacity with the Boston University SESA project, the core of which was derived from the K42 operating system research project funded in part by DARPA’s HPCS program. Both of these contributions were built on a foundation of previous operating systems research funding by the Department of Energy’s FastOS Program. Through the course of the X-stack funding we were able to develop prototypes, deploy them on production clusters at scale, and make them available to other researchers. As newer hardware, in the form of BlueGene/Q, came online, we were able to port the prototypes to the new hardware and release the source code for the resulting prototypes as open source to the community. In addition to the open source code for the Kittyhawk and NIX prototypes, we were able to bring the BlueGene/Q Linux patches up to a more recent kernel and contribute them for inclusion by the broader Linux community. The lasting impact of the IBM Research work on FOX can be seen in its effect on the shift of IBM’s approach to HPC operating systems from Linux and Compute Node Kernels to role-based approaches as prototyped by the NIX and FusedOS work. This impact can be seen beyond IBM in follow-on ideas being incorporated into the proposals for the Exascale Operating

  18. Dynamics of Saturn's Mid-scale Storms in the Cassini Era.

    Science.gov (United States)

    Del Rio Gaztelurrutia, Teresa; Hueso, R.; Sánchez-Lavega, A.

    2010-10-01

    Convective storms, similar to those on Earth but of much larger scale, develop often in Saturn's atmosphere. During the Voyager flybys of Saturn in 1981, mid-scale storms with a horizontal extension of the order of 1000-3000 km were observed to occur mainly in a narrow tropical-latitude band in the Northern hemisphere at latitudes 38-40 deg North. In contrast with the Voyager era, since the start of the Cassini mission in 2004 similar mid-scale convective activity has concentrated in the so-called "storm alley", a narrow band at the symmetric Southern latitude of 38 deg. In this work, we characterize this storm activity using available visual information provided by the Cassini ISS cameras and the continuous survey from the Earth by the International Outer Planets Watch (IOPW) and its online database PVOL (Hueso et al., Planetary and Space Science, 2010). We study the frequency of appearance of storms with sizes above 2000 km, their characteristic size and lifetime, as well as their interaction with surrounding dynamical features. In particular we examine the possibility that storms might provide a mechanism of injection of energy into Saturn's jets, the influence of storms on the generation of atmospheric vortices, and the analogies and differences between the Voyager-era and present-day jet structure at the relevant latitudes. Acknowledgments: This work has been funded by the Spanish MICIIN AYA2009-10701 with FEDER support and Grupos Gobierno Vasco IT-464

  19. Extreme Samples on the Eysenck Personality Questionnaire Psychoticism Scale.

    Science.gov (United States)

    Loo, Robert; Shiomi, Kunio

    1983-01-01

    Examined the P scale of the Eysenck Personality Questionnaire in three samples of normal Canadian and Japanese students who obtained either very high or very low means on the P scale. Psychometric problems were identified in the sample with a low P mean. Recommended that the scale be renamed. (JAC)

  20. Baseline climatology of extremely high vertical wind shears' values over Europe based on ERA-Interim reanalysis

    Science.gov (United States)

    Palarz, Angelika; Celiński-Mysław, Daniel

    2017-04-01

    The dominant role in the development of deep convection is played by kinematic and thermodynamic conditions, as well as atmospheric circulation, land cover and local relief. Severe thunderstorms are considerably more likely to form in environments with large values of convective available potential energy (CAPE) and a significant magnitude of vertical wind shear (VWS). According to the most recent research, tropospheric wind shear has an important influence on the intensity, longevity and organisation of the primary convective systems - bow echoes, squall lines and supercell thunderstorms. This study, in turn, examines the role of wind structure in controlling the spatial and temporal variability of VWSs over Europe. Considering the importance of kinematic conditions for the formation of convective systems, the research is limited exclusively to the 0-1 km, 0-3 km and 0-6 km wind shears. In order to compute the VWS values, data derived from the ERA-Interim reanalysis for the period 1981-2015 were used. They consisted of U and V wind components with 12-hourly sampling and a horizontal resolution of 0.75×0.75°. The VWS values were calculated as the wind difference between two levels, which entails that the hodograph shape was not considered (e.g. Clark 2013, Pucik et al. 2015). We have analysed both the VWS mean values (MN) and the frequency of VWSs exceeding assumed thresholds (FQ). Taking into account previous studies (e.g. Rasmussen & Blanchard 1998, Schneider et al. 2006, Schaumann & Przybylinski 2012), the thresholds for extremely high values of vertical wind shear were set at 10 m/s for 0-1 km shear, 15 m/s for 0-3 km shear and 18 m/s for 0-6 km shear. Both MN and FQ values were characterised by strong temporal variability, as well as significant spatial differentiation over the research area. A clear diurnal cycle was identified in the case of 0-1 km shear, while seasonal variability was typical for 0-3 km and 0-6 km shears. Regardless of the season, 0-1 km shear reached
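
    As a concrete illustration of the shear computation described above, the bulk shear for a layer is simply the magnitude of the vector wind difference between its bounding levels, and the MN/FQ statistics follow by averaging over time. The sketch below is a minimal Python/NumPy illustration under assumed array names and shapes, not the authors' code; only the layer thresholds (10, 15 and 18 m/s) are taken from the text:

        import numpy as np

        # Thresholds for "extremely high" shear quoted above (m/s); everything else is illustrative.
        THRESHOLDS = {"0-1 km": 10.0, "0-3 km": 15.0, "0-6 km": 18.0}

        def bulk_shear(u_low, v_low, u_up, v_up):
            """Magnitude of the vector wind difference between two levels (m/s)."""
            return np.hypot(u_up - u_low, v_up - v_low)

        def shear_statistics(u_low, v_low, u_up, v_up, layer="0-6 km"):
            """Mean shear (MN) and exceedance frequency (FQ) per grid point.

            Inputs are assumed to be arrays of shape (time, lat, lon) taken from
            12-hourly reanalysis fields."""
            shear = bulk_shear(u_low, v_low, u_up, v_up)
            mn = shear.mean(axis=0)
            fq = (shear >= THRESHOLDS[layer]).mean(axis=0)
            return mn, fq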

  1. Universal scaling properties of extremal cohesive holographic phases

    CERN Document Server

    Goutéraux, B

    2014-01-01

    In this work, we focus on zero-temperature, strongly-coupled, translation-invariant holographic phases at finite density. We show that they can be classified according to the scaling behaviour of the metric, the electric potential and the electric flux. Solutions fall into two classes, depending on whether they break relativistic symmetry or not. We conjecture a universal scaling for the optical conductivity at zero temperature and low frequencies, which reduces to the correct result for both classes of solutions. We also study the scaling behaviour of the electric flux through bulk minimal surfaces, which have been suggested to provide an order parameter for fractionalisation.

  2. Extreme Scaling of Production Visualization Software on Diverse Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Childs, Henry; Pugmire, David; Ahern, Sean; Whitlock, Brad; Howison, Mark; Weber, Gunther; Bethel, E. Wes

    2009-12-22

    We present the results of a series of experiments studying how visualization software scales to massive data sets. Although several paradigms exist for processing large data, we focus on pure parallelism, the dominant approach for production software. These experiments utilized multiple visualization algorithms and were run on multiple architectures. Two types of experiments were performed. For the first, we examined performance at massive scale: 16,000 or more cores and one trillion or more cells. For the second, we studied weak scaling performance. These experiments were performed on the largest data set sizes published to date in visualization literature, and the findings on scaling characteristics and bottlenecks contribute to understanding of how pure parallelism will perform at high levels of concurrency and with very large data sets.

  3. Understanding convective extreme precipitation scaling using observations and an entraining plume model

    NARCIS (Netherlands)

    Loriaux, J.M.; Lenderink, G.; De Roode, S.R.; Siebesma, A.P.

    2013-01-01

    Previously observed twice-Clausius–Clapeyron (2CC) scaling for extreme precipitation at hourly time scales has led to discussions about its origin. The robustness of this scaling is assessed by analyzing a subhourly dataset of 10-min resolution over the Netherlands. The results confirm the validity

  4. Web-based Visual Analytics for Extreme Scale Climate Science

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; Evans, Katherine J [ORNL; Harney, John F [ORNL; Jewell, Brian C [ORNL; Shipman, Galen M [ORNL; Smith, Brian E [ORNL; Thornton, Peter E [ORNL; Williams, Dean N. [Lawrence Livermore National Laboratory (LLNL)

    2014-01-01

    In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.

  5. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing (Dagstuhl Perspectives Workshop 14022)

    OpenAIRE

    Bremer, Peer-Timo; Mohr, Bernd; Pascucci, Valerio; Schulz, Martin

    2014-01-01

    In the first week of January 2014 Dagstuhl hosted a Perspectives Workshop on "Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing". The event brought together two previously separate communities - from Visualization and HPC Performance Analysis - to discuss a long-term joint research agenda. The goal was to identify and address the challenges in using visual representations to understand and optimize the performance of extreme-scale applications running...

  6. Simulating transitional hydrodynamics of the cerebrospinal fluid at extreme scale

    Science.gov (United States)

    Jain, Kartik; Roller, Sabine; Mardal, Kent-Andre

    Chiari malformation type I is a disorder characterized by the herniation of the cerebellar tonsils into the spinal canal through the foramen magnum, resulting in obstruction of cerebrospinal fluid (CSF) outflow. The flow of pulsating, bidirectional CSF is acutely complex due to the anatomy of the conduit containing it - the subarachnoid space. We report lattice Boltzmann method based direct numerical simulations of patient-specific cases with a spatial resolution of 24 μm, amounting to meshes of up to 2 billion cells, conducted on 50,000 cores of the Hazel Hen supercomputer in Stuttgart. The goal is to characterize the intricate dynamics of the CSF at resolutions of the order of the Kolmogorov microscales. Results reveal velocity fluctuations up to ~10 kHz and turbulent kinetic energy ~2 times the mean flow energy in Chiari patients, whereas the flow remains laminar in a control subject. The fluctuations are confined near the craniovertebral junction and are commensurate with the severity of the pathology and the extent of herniation. The results suggest that the manifestation of pathological conditions like Chiari malformation may lead to transitional hydrodynamics of the CSF, and that a prudent calibration of the numerical approach is necessary to avoid overlooking such phenomena.

  7. Scalable Parallel Methods for Analyzing Metagenomics Data at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Daily, Jeffrey A. [Washington State Univ., Pullman, WA (United States)

    2015-05-01

    The field of bioinformatics and computational biology is currently experiencing a data revolution. The exciting prospect of making fundamental biological discoveries is fueling the rapid development and deployment of numerous cost-effective, high-throughput next-generation sequencing technologies. The result is that the DNA and protein sequence repositories are being bombarded with new sequence information. Databases continue to report a Moore’s law-like growth trajectory in their sizes, roughly doubling every 18 months. In what seems to be a paradigm shift, individual projects are now capable of generating billions of raw sequences that need to be analyzed in the presence of already annotated sequence information. While it is clear that data-driven methods, such as sequence homology detection, are becoming the mainstay in the field of computational life sciences, the algorithmic advancements essential for implementing complex data analytics at scale have mostly lagged behind. Sequence homology detection is central to a number of bioinformatics applications including genome sequencing and protein family characterization. Given millions of sequences, the goal is to identify all pairs of sequences that are highly similar (or “homologous”) on the basis of alignment criteria. While there are optimal alignment algorithms to compute pairwise homology, their deployment at large scale is currently not feasible; instead, heuristic methods are used at the expense of quality. In this dissertation, we present the design and evaluation of a parallel implementation for conducting optimal homology detection on distributed memory supercomputers. Our approach uses a combination of techniques from asynchronous load balancing (viz. work stealing, dynamic task counters), data replication, and exact-matching filters to achieve homology detection at scale. Results for a collection of 2.56M sequences show parallel efficiencies of ~75-100% on up to 8K cores

  8. A Fault Oblivious Extreme-Scale Execution Environment

    Energy Technology Data Exchange (ETDEWEB)

    McKie, Jim

    2014-11-20

    The FOX project, funded under the ASCR X-stack I program, developed systems software and runtime libraries for a new approach to data and work distribution for massively parallel, fault oblivious application execution. Our work was motivated by the premise that exascale computing systems will provide a thousand-fold increase in parallelism and a proportional increase in failure rate relative to today’s machines. To deliver the capability of exascale hardware, the systems software must provide the infrastructure to support existing applications while simultaneously enabling efficient execution of new programming models that naturally express dynamic, adaptive, irregular computation; coupled simulations; and massive data analysis in a highly unreliable hardware environment with billions of threads of execution. Our OS research has prototyped new methods to provide efficient resource sharing, synchronization, and protection in a many-core compute node. We have experimented with alternative task/dataflow programming models and shown scalability in some cases to hundreds of thousands of cores. Much of our software is in active development through open source projects. Concepts from FOX are being pursued in next-generation exascale operating systems. Our OS work focused on adaptive, application-tailored OS services optimized for multi- to many-core processors. We developed a new operating system, NIX, that supports role-based allocation of cores to processes and was released as open source. We contributed to the IBM FusedOS project, which promoted the concept of latency-optimized and throughput-optimized cores. We built a task queue library based on a distributed, fault-tolerant key-value store and identified scaling issues. A second fault-tolerant task-parallel library was developed, based on the Linda tuple space model, that used low-level interconnect primitives for optimized communication. We designed fault tolerance mechanisms for task-parallel computations.

  9. Brief Assessment of Motor Function: Content Validity and Reliability of the Upper Extremity Gross Motor Scale

    Science.gov (United States)

    Cintas, Holly Lea; Parks, Rebecca; Don, Sarah; Gerber, Lynn

    2011-01-01

    Content validity and reliability of the Brief Assessment of Motor Function (BAMF) Upper Extremity Gross Motor Scale (UEGMS) were evaluated in this prospective, descriptive study. The UEGMS is one of five BAMF ordinal scales designed for quick documentation of gross, fine, and oral motor skill levels. Designed to be independent of age and…

  10. Responsiveness of SF-36 and Lower Extremity Functional Scale for assessing outcomes in traumatic injuries of lower extremities.

    Science.gov (United States)

    Pan, Shin-Liang; Liang, Huey-Wen; Hou, Wen-Hsuan; Yeh, Tian-Shin

    2014-11-01

    To assess the responsiveness of one generic questionnaire, the Medical Outcomes Study Short Form-36 (SF-36), and one region-specific outcome measure, the Lower Extremity Functional Scale (LEFS), in patients with traumatic injuries of the lower extremities. A prospective and observational study of patients after traumatic injuries of the lower extremities. Assessments were performed at baseline and 3 months later. In-patients and out-patients in two university hospitals in Taiwan. A convenience sample of 109 subjects was evaluated and 94 (86%) were followed. Not applicable. Assessments of responsiveness with a distribution-based approach (effect size, standardized response mean [SRM], minimal detectable change) and an anchor-based approach (receiver operating characteristic [ROC] curve analysis). The LEFS and the physical component score (PCS) of the SF-36 were both responsive to global improvement, with fair-to-good accuracy in discriminating between participants with and without improvement. The area under the curve obtained by ROC analysis for the LEFS and the SF-36 PCS was similar (0.65 vs. 0.70, p=0.26). Our findings revealed comparable responsiveness of the LEFS and the PCS of the SF-36 in a sample of subjects with traumatic injuries of the lower limbs. Either type of functional measure would be suitable for use in clinical trials where improvement in function is an endpoint of interest.
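
    For readers unfamiliar with the indices named above, the distribution-based measures (effect size, SRM) and the anchor-based ROC area can be computed from paired baseline/follow-up scores and a binary improvement anchor as in the sketch below; this is a generic illustration under assumed inputs, not the study's analysis code:

        import numpy as np
        from scipy.stats import mannwhitneyu

        def responsiveness(baseline, followup, improved):
            """baseline, followup: score arrays; improved: boolean anchor of global improvement."""
            change = followup - baseline
            effect_size = change.mean() / baseline.std(ddof=1)   # mean change / baseline SD
            srm = change.mean() / change.std(ddof=1)             # standardized response mean
            # AUC equals the probability that an improved subject shows a larger change
            # than a non-improved one (Mann-Whitney U divided by the number of pairs).
            u, _ = mannwhitneyu(change[improved], change[~improved], alternative="greater")
            auc = u / (improved.sum() * (~improved).sum())
            return effect_size, srm, auc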

  11. Improved neurosensory outcome at 8 years of age of extremely low birthweight children born in Victoria over three distinct eras

    OpenAIRE

    Doyle, L.; Anderson, P.; et al.

    2005-01-01

    Aim: To determine neurosensory outcome at 8 years of age of extremely low birthweight (ELBW) children born in the 1990s, how it varies with birth weight, and how it compares with ELBW children born in the 1980s and 1970s.

  12. Scaling of precipitation extremes with temperature in the French Mediterranean region: What explains the hook shape?

    Science.gov (United States)

    Drobinski, P.; Alonzo, B.; Bastin, S.; Silva, N. Da; Muller, C.

    2016-04-01

    Expected changes to future extreme precipitation remain a key uncertainty associated with anthropogenic climate change. Extreme precipitation has been proposed to scale with the precipitable water content in the atmosphere. Assuming constant relative humidity, this implies an increase of precipitation extremes at a rate of about 7% per °C globally, as indicated by the Clausius-Clapeyron relationship. Increases faster and slower than Clausius-Clapeyron have also been reported. In this work, we examine the scaling between precipitation extremes and temperature in the present climate using simulations and measurements from surface weather stations collected in the framework of the HyMeX and MED-CORDEX programs in Southern France. Of particular interest are departures from the Clausius-Clapeyron thermodynamic expectation, their spatial and temporal distribution, and their origin. Looking at the scaling of precipitation extremes with temperature, two regimes emerge which form a hook shape: one at low temperatures (cooler than around 15°C) with rates of increase close to the Clausius-Clapeyron rate, and one at high temperatures (warmer than about 15°C) with sub-Clausius-Clapeyron and most often negative rates. On average, the region of focus does not seem to exhibit super-Clausius-Clapeyron behavior except at some stations, in contrast to earlier studies. Many factors can contribute to departure from Clausius-Clapeyron scaling: time and spatial averaging, choice of scaling temperature (surface versus condensation level), and precipitation efficiency and vertical velocity in updrafts that are not necessarily constant with temperature. But most importantly, the dynamical contribution of orography to precipitation in the fall over this area, during the so-called "Cevenoles" events, explains the hook shape of the scaling of precipitation extremes.
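
    The roughly 7% per °C Clausius-Clapeyron reference rate can be estimated from station data by binning wet observations by temperature, taking a high percentile per bin, and fitting an exponential slope. The following is a minimal sketch of that generic binning-and-scaling procedure, not the authors' method; the bin width, sample-size cut-off and percentile are chosen purely for illustration:

        import numpy as np

        def scaling_rate(temp, precip, q=0.99, bin_width=2.0, min_samples=100):
            """Percent change of the q-th precipitation percentile per degree Celsius."""
            edges = np.arange(temp.min(), temp.max() + bin_width, bin_width)
            centers, extremes = [], []
            for lo, hi in zip(edges[:-1], edges[1:]):
                sel = (temp >= lo) & (temp < hi) & (precip > 0)
                if sel.sum() >= min_samples:          # require enough wet samples per bin
                    centers.append(0.5 * (lo + hi))
                    extremes.append(np.quantile(precip[sel], q))
            # Exponential scaling P ~ exp(a*T): the slope of log(P) against T gives the rate;
            # Clausius-Clapeyron corresponds to roughly 7% per degree.
            a, _ = np.polyfit(centers, np.log(extremes), 1)
            return 100.0 * (np.exp(a) - 1.0)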

  13. The genome-scale metabolic extreme pathway structure in Haemophilus influenzae shows significant network redundancy.

    Science.gov (United States)

    Papin, Jason A; Price, Nathan D; Edwards, Jeremy S; Palsson, Bernhard Ø

    2002-03-07

    Genome-scale metabolic networks can be characterized by a set of systemically independent and unique extreme pathways. These extreme pathways span a convex, high-dimensional space that circumscribes all potential steady-state flux distributions achievable by the defined metabolic network. Genome-scale extreme pathways associated with the production of non-essential amino acids in Haemophilus influenzae were computed. They offer valuable insight into the functioning of its metabolic network. Three key results were obtained. First, there were multiple internal flux maps corresponding to externally indistinguishable states. It was shown that there was an average of 37 internal states per unique exchange flux vector in H. influenzae when the network was used to produce a single amino acid while allowing carbon dioxide and acetate as carbon sinks. With the inclusion of succinate as an additional output, this ratio increased to 52, a 40% increase. Second, an analysis of the carbon fates illustrated that the extreme pathways were non-uniformly distributed across the carbon fate spectrum. In the detailed case study, 45% of the distinct carbon fate values associated with lysine production represented 85% of the extreme pathways. Third, this distribution fell between distinct systemic constraints. For lysine production, the carbon fate values that represented 85% of the pathways described above corresponded to only 2 distinct ratios of 1:1 and 4:1 between carbon dioxide and acetate. The present study analysed single outputs from one organism, and provides a start to genome-scale extreme pathways studies. These emergent system-level characterizations show the significance of metabolic extreme pathway analysis at the genome-scale.

  14. Large-scale drivers of local precipitation extremes in convection-permitting climate simulations

    Science.gov (United States)

    Chan, Steven C.; Kendon, Elizabeth J.; Roberts, Nigel M.; Fowler, Hayley J.; Blenkinsop, Stephen

    2016-04-01

    The Met Office 1.5-km UKV convection-permitting model (CPM) is used to downscale present-climate and RCP8.5 60-km HadGEM3 GCM simulations. Extreme UK hourly precipitation intensities increase with local near-surface temperature and humidity; for temperature, the simulated rate of increase in the present-climate simulation is about 6.5% per K, which is consistent with observations and theoretical expectations. While extreme intensities are higher in the RCP8.5 simulation as higher temperatures are sampled, there is a decline at the highest temperatures due to circulation and relative humidity changes. Extending the analysis to the broader synoptic scale, it is found that circulation patterns, as diagnosed by MSLP or circulation type, play an increased role in the probability of extreme precipitation in the RCP8.5 simulation. Nevertheless, for both CPM simulations, vertical instability is the principal driver of extreme precipitation.

  15. Trends in Mediterranean gridded temperature extremes and large-scale circulation influences

    Directory of Open Access Journals (Sweden)

    D. Efthymiadis

    2011-08-01

    Two recently-available daily gridded datasets are used to investigate trends in Mediterranean temperature extremes since the mid-20th century. The underlying trends are found to be generally consistent with global trends of temperature and their extremes: cold extremes decrease and warm/hot extremes increase. This consistency is better manifested in the western part of the Mediterranean where changes are most pronounced since the mid-1970s. In the eastern part, a cooling is observed, with a near reversal in the last two decades. This inter-basin discrepancy is clearer in winter, while in summer changes are more uniform and the west-east difference is restricted to the rate of increase of warm/hot extremes, which is higher in central and eastern parts of the Mediterranean over recent decades. Linear regression and correlation analysis reveals some influence of major large-scale atmospheric circulation patterns on the occurrence of these extremes – both in terms of trend and interannual variability. These relationships are not, however, able to account for the most striking features of the observations – in particular the intensification of the increasing trend in warm/hot extremes, which is most evident over the last 15–20 yr in the Central and Eastern Mediterranean.

  16. Scaling precipitation extremes with temperature in the Mediterranean: past climate assessment and projection in anthropogenic scenarios

    Science.gov (United States)

    Drobinski, Philippe; Silva, Nicolas Da; Panthou, Gérémy; Bastin, Sophie; Muller, Caroline; Ahrens, Bodo; Borga, Marco; Conte, Dario; Fosser, Giorgia; Giorgi, Filippo; Güttler, Ivan; Kotroni, Vassiliki; Li, Laurent; Morin, Efrat; Önol, Bariş; Quintana-Segui, Pere; Romera, Raquel; Torma, Csaba Zsolt

    2016-03-01

    In this study we investigate the scaling of precipitation extremes with temperature in the Mediterranean region by assessing, against observations, the present-day and future regional climate simulations performed in the framework of the HyMeX and MED-CORDEX programs. Over the 1979-2008 period, despite differences in quantitative precipitation simulation across the various models, the change in precipitation extremes with respect to temperature is robust and consistent. The spatial variability of the temperature-precipitation extremes relationship displays a hook shape across the Mediterranean, with a negative slope at high temperatures and a slope following Clausius-Clapeyron (CC) scaling at low temperatures. The temperature at which the slope of the temperature-precipitation extremes relation sharply changes (the temperature break) ranges from about 20 °C in the western Mediterranean to lower values further east. In the future climate simulations, the relationship is close to CC-scaling at temperatures below the temperature break, while at high temperatures the negative slope is close to, but somewhat flatter or steeper than, that in the current climate, depending on the model. Overall, models predict more intense precipitation extremes in the future. Adjusting the temperature-precipitation extremes relationship in the present climate using the CC law and the temperature shift in the future allows the recovery of the temperature-precipitation extremes relationship in the future climate. This implies negligible regional changes of relative humidity in the future despite the large warming and drying over the Mediterranean. This suggests that the Mediterranean Sea is the primary source of moisture, which counteracts the drying and warming impacts on relative humidity in parts of the Mediterranean region.

  17. Resolving Planet Formation in the Era of ALMA and Extreme AO Report on the joint ESO/NRAO Conference

    Science.gov (United States)

    Dent, W. R. F.; Hales, A.; Milli, J.

    2016-12-01

    ALMA in its long-baseline configuration, as well as new optical/near-infrared adaptive optics instruments such as SPHERE and GPI, are now able to achieve spatial resolutions considerably better than 0.1 arcseconds. These facilities are enabling us to observe for the first time the regions around young stars where planets form. Already, complex structures including holes, spiral waves and extreme asymmetries are being found in these protoplanetary discs. To discuss these newly-imaged phenomena, and to enable cross-fertilisation of ideas between the two wavelength ranges, a joint ESO/NRAO workshop was held in Santiago. We present here a summary and some highlights of the meeting.

  18. Large-scale Agroecosystem's Resiliency to Extreme Hydrometeorological and Climate Extreme Events in the Missouri River Basin

    Science.gov (United States)

    Munoz-Arriola, F.; Smith, K.; Corzo, G.; Chacon, J.; Carrillo-Cruz, C.

    2015-12-01

    A major challenge for water, energy and food security lies in the capability of agroecosystems and ecosystems to adapt to a changing climate and to land use changes. The interdependency of these forcings, understood through our ability to monitor and model processes across scales, indicates the "depth" of their impact on agroecosystems and ecosystems, and consequently our ability to predict a system's capacity to return to a "normal" state. We are particularly interested in exploring two questions: (1) how do hydrometeorological and climate extreme events (HCEs) affect sub-seasonal to interannual changes in evapotranspiration and soil moisture? And (2) how do agroecosystems recover from the effects of such events? To address these questions we use the Variable Infiltration Capacity (VIC) land surface hydrologic model and the Moderate Resolution Imaging Spectroradiometer Leaf Area Index (MODIS-LAI) over two time spans: 1950-2013 (using a fixed seasonal LAI cycle) and 2001-2013 (using the 8-day MODIS-LAI). VIC is forced by daily, 1/16th-degree resolution precipitation, minimum and maximum temperature, and wind speed. In this large-scale experiment, resiliency is defined by the capacity of a particular agroecosystem, represented by a grid cell's ET, SM and LAI, to return to a historical average. This broad yet simplistic definition will contribute to identifying the components and scales involved in the capacity of agroecosystems and ecosystems to adapt to the incidence of HCEs, and the technologies used to intensify agriculture and diversify its use for food and energy production. Preliminary results show that dynamical changes in land use, tracked by MODIS data, require longer time spans to properly address the influence of technological improvements in crop production as well as the competition for land between biofuel and food production. On the other hand, a fixed seasonal land use cycle allows us only to identify hydrologic changes due mainly to climate variability.

  19. Universality and extremal aging for dynamics of spin glasses on sub-exponential time scales

    CERN Document Server

    Ben Arous, G.

    2010-01-01

    We consider Random Hopping Time (RHT) dynamics of the Sherrington-Kirkpatrick (SK) model and p-spin models of spin glasses. For any of these models and for any inverse temperature we prove that, on time scales that are sub-exponential in the dimension, the properly scaled clock process (time-change process) of the dynamics converges to an extremal process. Moreover, on these time scales, the system exhibits aging-like behavior, which we call extremal aging. In other words, the dynamics of these models age as the random energy model (REM) does. Hence, by extension, this confirms Bouchaud's REM-like trap model as a universal aging mechanism for a wide range of systems which, for the first time, includes the SK model.

  20. Variability of temperature sensitivity of extreme precipitation from a regional-to-local impact scale perspective

    Science.gov (United States)

    Schroeer, K.; Kirchengast, G.

    2016-12-01

    Relating precipitation intensity to temperature is a popular approach to assess potential changes of extreme events in a warming climate. Potential increases in extreme rainfall induced hazards, such as flash flooding, serve as motivation. It has not been addressed whether the temperature-precipitation scaling approach is meaningful on a regional to local level, where the risk of climate and weather impact is dealt with. Substantial variability of temperature sensitivity of extreme precipitation has been found that results from differing methodological assumptions as well as from varying climatological settings of the study domains. Two aspects are consistently found: First, temperature sensitivities beyond the expected consistency with the Clausius-Clapeyron (CC) equation are a feature of short-duration, convective, sub-daily to sub-hourly high-percentile rainfall intensities at mid-latitudes. Second, exponential growth ceases or reverts at threshold temperatures that vary from region to region, as moisture supply becomes limited. Analyses of pooled data, or of single or dispersed stations over large areas make it difficult to estimate the consequences in terms of local climate risk. In this study we test the meaningfulness of the scaling approach from an impact scale perspective. Temperature sensitivities are assessed using quantile regression on hourly and sub-hourly precipitation data from 189 stations in the Austrian south-eastern Alpine region. The observed scaling rates vary substantially, but distinct regional and seasonal patterns emerge. High sensitivity exceeding CC-scaling is seen on the 10-minute scale more than on the hourly scale, in storms shorter than 2 hours duration, and in shoulder seasons, but it is not necessarily a significant feature of the extremes. To be impact relevant, change rates need to be linked to absolute rainfall amounts. We show that high scaling rates occur in lower temperature conditions and thus have smaller effect on absolute

  1. Towards a landscape scale management of pesticides: ERA using changes in modelled occupancy and abundance to assess long-term population impacts of pesticides.

    Science.gov (United States)

    Topping, Chris J; Craig, Peter S; de Jong, Frank; Klein, Michael; Laskowski, Ryszard; Manachini, Barbara; Pieper, Silvia; Smith, Rob; Sousa, José Paulo; Streissl, Franz; Swarowsky, Klaus; Tiktak, Aaldrik; van der Linden, Ton

    2015-12-15

    Pesticides are regulated in Europe and this process includes an environmental risk assessment (ERA) for non-target arthropods (NTA). Traditionally a non-spatial or field-trial assessment is used. In this study we exemplify the introduction of a spatial context to the ERA and suggest a way in which the results of complex models, necessary for proper inclusion of spatial aspects in the ERA, can be presented and evaluated easily using abundance and occupancy ratios (AOR). We used an agent-based simulation system and an existing model for a widespread carabid beetle (Bembidion lampros) to evaluate the impact of a fictitious, highly toxic pesticide on population density and the distribution of beetles in time and space. Landscape structure and field margin management were evaluated by comparing scenario-based ERAs for the beetle. Source-sink dynamics led to an off-crop impact even when no pesticide was present off-crop. In addition, the impacts increased with multi-year application of the pesticide, whereas current ERA considers at most one year. These results further indicated a complex interaction between landscape structure and pesticide effect in time, both in-crop and off-crop, indicating the need for NTA ERA to be conducted at landscape spatial scales and multi-season temporal scales. Use of AOR indices to compare ERA outputs facilitated easy comparison of scenarios, allowing simultaneous evaluation of impacts and planning of mitigation measures. The landscape and population ERA approach also demonstrates the potential to change from regulation of a pesticide in isolation towards the consideration of pesticide management at landscape scales and the provision of biodiversity benefits via inclusion and testing of mitigation measures in authorisation procedures.
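
    The AOR comparison reduces to two ratios between a pesticide scenario and its baseline: the change in mean abundance and the change in the fraction of occupied landscape cells. The sketch below illustrates that idea for gridded model output; the exact index definitions and any time averaging used by the authors are not reproduced here:

        import numpy as np

        def aor(baseline_counts, scenario_counts, presence_threshold=1):
            """Abundance and occupancy ratios of a scenario relative to a baseline.

            Both inputs are assumed to be arrays of beetle counts per landscape cell."""
            abundance_ratio = scenario_counts.mean() / baseline_counts.mean()
            occupancy_baseline = (baseline_counts >= presence_threshold).mean()
            occupancy_scenario = (scenario_counts >= presence_threshold).mean()
            return abundance_ratio, occupancy_scenario / occupancy_baseline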

  2. Assessing Regional Scale Variability in Extreme Value Statistics Under Altered Climate Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Brunsell, Nathaniel [Univ. of Kansas, Lawrence, KS (United States); Mechem, David [Univ. of Kansas, Lawrence, KS (United States); Ma, Chunsheng [Wichita State Univ., KS (United States)

    2015-02-20

    Recent studies have suggested that low-frequency modes of climate variability can significantly influence regional climate. The climatology associated with extreme events has been shown to be particularly sensitive. This has profound implications for droughts, heat waves, and food production. We propose to examine regional climate simulations conducted over the continental United States by applying a recently developed technique which combines wavelet multi-resolution analysis with information theory metrics. This research is motivated by two fundamental questions concerning the spatial and temporal structure of extreme events. These questions are: (1) what temporal scales of the extreme value distributions are most sensitive to alteration by low-frequency climate forcings, and (2) what is the nature of the spatial structure of variation in these timescales? The primary objective is to assess to what extent information theory metrics can be useful in characterizing the nature of extreme weather phenomena. Specifically, we hypothesize that (1) changes in the nature of extreme events will impact the temporal probability density functions and that information theory metrics will be sensitive to these changes, and (2) via a wavelet multi-resolution analysis, we will be able to characterize the relative contribution of different timescales to the stochastic nature of extreme events. In order to address these hypotheses, we propose a unique combination of an established regional climate modeling approach and advanced statistical techniques to assess the effects of low-frequency modes on climate extremes over North America. The behavior of climate extremes in RCM simulations for the 20th century will be compared with statistics calculated from the United States Historical Climatology Network (USHCN) and simulations from the North American Regional Climate Change Assessment Program (NARCCAP). This effort will serve to establish the baseline behavior of climate extremes, the
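
    To make the combination of wavelet multi-resolution analysis and information theory concrete, the sketch below decomposes a series with the Haar wavelet and computes the Shannon entropy of the relative energy across dyadic time scales; this is a generic illustration of the idea, not the project's actual technique or code:

        import numpy as np

        def haar_detail_energies(x, levels=6):
            """Energy of Haar wavelet detail coefficients at each dyadic time scale."""
            approx = np.asarray(x, dtype=float)
            energies = []
            for _ in range(levels):
                n = len(approx) // 2 * 2
                pairs = approx[:n].reshape(-1, 2)
                detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0)
                approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2.0)
                energies.append(np.sum(detail ** 2))
            return np.array(energies)

        def wavelet_entropy(x, levels=6):
            """Shannon entropy (bits) of the relative wavelet energy across scales."""
            e = haar_detail_energies(x, levels)
            p = e / e.sum()
            p = p[p > 0]
            return -np.sum(p * np.log2(p))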

  3. Small-Scale Industries in Jordan in the Globalization Era Performance and Prospects

    Directory of Open Access Journals (Sweden)

    Basem M. Lozi

    2008-01-01

    The research probes the implications of globalization and domestic economic liberalization for small-scale industries and analyses their growth performance. It concludes with policy recommendations to ensure the sustenance and competitive growth of small-scale industries in Jordan. Results of the study show that the growth of small-scale industries in Jordan in terms of units, employment, output and exports has come about through globalization, domestic liberalization and the dilution of sector-specific measures. The government should involve the private sector in the development of infrastructure, to enable efficient monitoring and good facilities for small-scale industries. There is a need to create a fund at the government level for disbursement as margin money, through district industries centers, to small-scale industry units, to encourage them to undertake technological innovations. A technologically vibrant, internationally competitive small-scale industries sector should be encouraged to emerge, to make a sustainable contribution to national income, employment and exports.

  4. Scale parameters in stationary and non-stationary GEV modeling of extreme precipitation

    Science.gov (United States)

    Panagoulia, Dionysia; Economou, Polychronis; Caroni, Chrys

    2013-04-01

    The generalized extreme value (GEV) distribution is often fitted to environmental time series of extreme values such as annual maxima of daily precipitation. We study two methodological issues here. First we compare methods of selecting the best model among a set of 16 GEV models that allow non-stationary scale and location parameters. Results of simulation studies showed that both the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) correctly detected non-stationarity but the BIC was superior in selecting the correct model more often. The second issue is how best to produce confidence intervals (CIs) for the parameters of the model and other quantities such as the return levels that are usually required for hydrological and climatological time series. Four bootstrap CIs - normal, percentile, basic, and bias corrected and accelerated (BCa) - constructed by random-t resampling, fixed-t resampling and the parametric bootstrap methods were compared. CIs for parameters of the stationary model do not present major differences. CIs for the more extreme quantiles tend to become very wide for all bootstrap methods. For non-stationary GEV models with linear time dependence of location or log-linear time dependence of scale, coverage probabilities of the CIs are reasonably accurate for the parameters. For the extreme percentiles, the BCa method is best overall and the fixed-t method also gives good average coverage probabilities.
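
    A minimal sketch of the model-selection step is given below: it fits a stationary GEV and one non-stationary variant (linear time dependence of the location parameter) and returns their AIC and BIC for comparison. It relies on SciPy and does not reproduce the paper's full set of 16 models; note that SciPy's genextreme parameterises the shape with the opposite sign to the usual GEV convention.

        import numpy as np
        from scipy.stats import genextreme
        from scipy.optimize import minimize

        def aic_bic(nll, n_params, n_obs):
            return 2 * n_params + 2 * nll, n_params * np.log(n_obs) + 2 * nll

        def fit_stationary(x):
            c, loc, scale = genextreme.fit(x)                 # c = -shape in the usual convention
            nll = -genextreme.logpdf(x, c, loc, scale).sum()
            return aic_bic(nll, 3, len(x))

        def fit_linear_location(x, t):
            """GEV with location mu(t) = mu0 + mu1*t; scale and shape held constant."""
            def nll(p):
                mu0, mu1, log_scale, c = p
                return -np.sum(genextreme.logpdf(x, c, loc=mu0 + mu1 * t,
                                                 scale=np.exp(log_scale)))
            c0, loc0, scale0 = genextreme.fit(x)              # stationary fit as starting point
            res = minimize(nll, x0=[loc0, 0.0, np.log(scale0), c0], method="Nelder-Mead")
            return aic_bic(res.fun, 4, len(x))

    The model with the lower criterion value is preferred; as the abstract notes, the BIC tended to select the correct structure more often in the simulations.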

  5. Influence of climate variability versus change at multi-decadal time scales on hydrological extremes

    Science.gov (United States)

    Willems, Patrick

    2014-05-01

    Recent studies have shown that rainfall and hydrological extremes do not occur randomly in time, but are subject to multidecadal oscillations. In addition to these oscillations, there are temporal trends due to climate change. Design statistics, such as intensity-duration-frequency (IDF) relationships for extreme rainfall or flow-duration-frequency (QDF) relationships, are affected by both types of temporal changes (short term and long term). This presentation discusses these changes, how they influence water engineering design and decision making, and how this influence can be assessed and taken into account in practice. The multidecadal oscillations in rainfall and hydrological extremes were studied based on a technique for the identification and analysis of changes in extreme quantiles. The statistical significance of the oscillations was evaluated by means of a non-parametric bootstrapping method. Oscillations in large-scale atmospheric circulation were identified as the main drivers for the temporal oscillations in rainfall and hydrological extremes. They also explain why spatial phase shifts (e.g. north-south variations in Europe) exist between the oscillation highs and lows. Besides the multidecadal climate oscillations, several stations show trends during the most recent decades, which may be attributed to climate change as a result of anthropogenic global warming. Such attribution to anthropogenic global warming is, however, uncertain. It can be done based on simulation results with climate models, but it is shown that the climate model results are too uncertain to enable a clear attribution. Water engineering design statistics, such as extreme rainfall IDF or peak or low flow QDF statistics, obviously are influenced by these temporal variations (oscillations, trends). It is shown in the paper, based on the Brussels 10-minute rainfall data, that rainfall design values may be about 20% biased or different when based on short rainfall series of 10 to 15 years length, and

  6. Estimating changes in temperature extremes from millennial-scale climate simulations using generalized extreme value (GEV) distributions

    Science.gov (United States)

    Huang, Whitney K.; Stein, Michael L.; McInerney, David J.; Sun, Shanshan; Moyer, Elisabeth J.

    2016-07-01

    Changes in extreme weather may produce some of the largest societal impacts of anthropogenic climate change. However, it is intrinsically difficult to estimate changes in extreme events from the short observational record. In this work we use millennial runs from the Community Climate System Model version 3 (CCSM3) in equilibrated pre-industrial and possible future (700 and 1400 ppm CO2) conditions to examine both how extremes change in this model and how well these changes can be estimated as a function of run length. We estimate changes to distributions of future temperature extremes (annual minima and annual maxima) in the contiguous United States by fitting generalized extreme value (GEV) distributions. Using 1000-year pre-industrial and future time series, we show that warm extremes largely change in accordance with mean shifts in the distribution of summertime temperatures. Cold extremes warm more than mean shifts in the distribution of wintertime temperatures, but changes in GEV location parameters are generally well explained by the combination of mean shifts and reduced wintertime temperature variability. For cold extremes at inland locations, return levels at long recurrence intervals show additional effects related to changes in the spread and shape of GEV distributions. We then examine uncertainties that result from using shorter model runs. In theory, the GEV distribution can allow prediction of infrequent events using time series shorter than the recurrence interval of those events. To investigate how well this approach works in practice, we estimate 20-, 50-, and 100-year extreme events using segments of varying lengths. We find that even using GEV distributions, time series of comparable or shorter length than the return period of interest can lead to very poor estimates. These results suggest caution when attempting to use short observational time series or model runs to infer infrequent extremes.
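
    For reference, the m-year return levels estimated above follow from inverting the fitted GEV distribution. With location $\mu$, scale $\sigma$, shape $\xi$ and annual exceedance probability $p$ (e.g. $p = 0.01$ for the 100-year level), the standard expression is

        z_p = \mu - \frac{\sigma}{\xi}\left[ 1 - \{-\ln(1 - p)\}^{-\xi} \right], \qquad
        z_p = \mu - \sigma \,\ln\{-\ln(1 - p)\} \quad (\xi = 0).

    Annual minima are handled in the same way after negating the series, which is the usual device for fitting a GEV to block minima.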

  7. Wintertime connections between extreme wind patterns in Spain and large-scale geopotential height field

    Science.gov (United States)

    Pascual, A.; Martín, M. L.; Valero, F.; Luna, M. Y.; Morata, A.

    2013-03-01

    The present study focuses on the variability and the most significant wind speed patterns in Spain during the winter season, as well as on connections between the wind speed field and the geopotential height at 1000 hPa over an Atlantic area. The daily wind speed variability is investigated by means of principal components computed from wind speed observations. Five main modes of variation, accounting for 66% of the variance of the original data, have been identified, highlighting their differences in Spanish wind speed behavior. Connections between the wind speeds and the large-scale atmospheric field were highlighted by means of composite maps, built to give the averaged atmospheric circulation associated with extreme wind speed variability in Spain. Moreover, principal component analysis was also applied to the geopotential heights, providing relationships between the large-scale atmospheric modes and the observed local wind speeds. Such relationships are shown in terms of the cumulative frequency values of wind speed associated with the extreme scores of the large-scale atmospheric modes, identifying which large-scale atmospheric patterns are most dominant in the wind field over Spain.
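
    The principal-component step can be sketched with a standard EOF decomposition of the station-anomaly matrix via an SVD, as below; this is the generic recipe under assumed inputs, not the authors' code, and whether they standardise by station variance is not stated in the abstract:

        import numpy as np

        def wind_speed_eofs(speeds, n_modes=5):
            """EOF/PCA of a (days x stations) wind speed matrix."""
            anomalies = speeds - speeds.mean(axis=0)            # remove station means
            u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
            explained = s ** 2 / np.sum(s ** 2)                 # variance fraction per mode
            pcs = u[:, :n_modes] * s[:n_modes]                  # daily scores (time series)
            eofs = vt[:n_modes]                                 # spatial patterns over stations
            return pcs, eofs, explained[:n_modes]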

  8. Scale orientated analysis of river width changes due to extreme flood hazards

    Directory of Open Access Journals (Sweden)

    G. Krapesch

    2011-08-01

    This paper analyses the morphological effects of extreme floods (recurrence interval >100 years) and examines which parameters best describe the width changes due to erosion, based on 5 affected alpine gravel-bed rivers in Austria. The research was based on vertical aerial photos of the rivers before and after extreme floods, hydrodynamic numerical models and cross-sectional measurements supported by LiDAR data of the rivers. Average width ratios (width after/before the flood) were calculated and correlated with different hydraulic parameters (specific stream power, shear stress, flow area, specific discharge). Depending on the geomorphological boundary conditions of the different rivers, a mean width ratio between 1.12 (Lech River) and 3.45 (Trisanna River) was determined on the reach scale. The specific stream power (SSP) best predicted the mean width ratios of the rivers, especially on the reach scale and sub-reach scale. On the local scale more parameters have to be considered to define the "minimum morphological spatial demand of rivers", which is a crucial parameter for addressing and managing flood hazards and should be used in hazard zone plans and spatial planning.

  9. Operational flood management under large-scale extreme conditions, using the example of the Middle Elbe

    Directory of Open Access Journals (Sweden)

    A. Kron

    2010-06-01

    In addition to precautionary or technical flood protection measures, short-term strategies of operational management, i.e. the initiation and co-ordination of preventive measures during and/or before a flood event, are crucial for the reduction of flood damages. This applies especially to extreme flood events. These events are rare, but may cause a protection measure to be overtopped or even to fail and be destroyed. In such extreme cases, reliable decisions must be made and emergency measures need to be carried out to prevent even larger damages from occurring.

    Based on improved methods for meteorological and hydrological modelling, a range of (physically based) extreme flood scenarios can be derived from historical events by modification of air temperature and humidity, shifting of weather fields and recombination of flood-relevant event characteristics. By coupling the large-scale models with hydraulic and geotechnical models, the whole flood process chain can be analysed right down to the local scale. With the developed GIS-based tool for hydraulic modelling (FlowGIS) and the dike information system (IS-dikes), it is possible to quantify the hazard shortly before or even during a flood event, so that decision makers can evaluate possible options for action in operational mode.

  10. Segmented Ultralight Pre-Aligned Rotor for Extreme-Scale Wind Turbines

    Energy Technology Data Exchange (ETDEWEB)

    Loth, E.; Steele, A.; Ichter, B.; Selig, M.; Moriarty, P.

    2012-01-01

    To alleviate the mass-scaling issues associated with conventional upwind rotors of extreme-scale turbines, a downwind rotor concept is proposed which employs fixed blade curvature based on force alignment at rated conditions. For a given peak stress constraint, the reduction in downwind cantilever loads allows reduced shell and spar thickness, and thus a reduced blade mass as compared with a conventional upwind rotor, especially as rotor sizes approach extreme-scales. To quantify this mass reduction, a Finite Element Analysis was conducted for a 10 MW rated rotor based on the NREL offshore 5 MW baseline wind turbine. The results show that this 'pre-alignment' yields a net downstream deflection of 32 deg, a downward hub-pitch angle of 6 deg, a 20% increase in blade length (to maintain the same radius as the conventional blade), and a net mass savings of about 50% through decreased shell and spar thicknesses. The pre-alignment may also allow a more straightforward and efficient segmentation of the blade since shear stresses near joints are substantially reduced. Segmenting, in turn, can dramatically reduce costs associated with fabrication, transport and assembly for extreme-scale off-shore systems. The pre-aligned geometric curvature can also help alleviate tower wake effects on the blades since blade tips (where shadow effects can be most problematic) are shifted downstream where the tower wake is weaker. In addition, the portion of the tower that is upstream of the blade tips can be faired with an externally-rotating aerodynamic shroud. Furthermore, the downwind rotor can allow a floating off-shore tri-pod platform to reduce tower weight and yaw-control requirements. A simple economic analysis of the segmented ultralight pre-aligned rotor (SUPAR) concept suggests that the overall system cost savings can be as much as 25%, indicating that more detailed (numerical and experimental) investigations are warranted.

  11. Forecasting extreme events in collective dynamics: an analytic signal approach to detecting discrete scale invariance

    CERN Document Server

    Viswanathan, G M

    2006-01-01

    A challenging problem in physics concerns the possibility of forecasting rare but extreme phenomena such as large earthquakes, financial market crashes, and material rupture. A promising line of research involves the early detection of precursory log-periodic oscillations to help forecast extreme events in collective phenomena where discrete scale invariance plays an important role. Here I investigate two distinct approaches towards the general problem of how to detect log-periodic oscillations in arbitrary time series without prior knowledge of the location of the moveable singularity. I first show that the problem has a definite solution in Fourier space; however, the technique involved requires an unrealistically large signal-to-noise ratio. I then show that the quadrature signal obtained via analytic continuation onto the imaginary axis, using the Hilbert transform, necessarily retains the log-periodicities found in the original signal. This finding allows the development of a new method of detecting log-p...
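
    A minimal illustration of the quadrature-signal idea is given below: the analytic signal is obtained with the Hilbert transform and its unwrapped phase is examined against ln(t_c − t), where a log-periodic component appears as a linear trend. For simplicity the critical time t_c is taken as given here, whereas the point of the paper is precisely to cope with an unknown, moveable singularity; the code is only an illustrative sketch, not the author's method.

        import numpy as np
        from scipy.signal import hilbert

        def analytic_signal(x):
            """Quadrature (analytic) signal x + i*H[x] of a mean-removed series."""
            x = np.asarray(x, dtype=float)
            return hilbert(x - x.mean())

        def phase_vs_log_time(x, t, t_c):
            """Unwrapped instantaneous phase against ln(t_c - t).

            A log-periodic oscillation cos(omega*ln(t_c - t) + phi) yields a phase
            that grows linearly in ln(t_c - t)."""
            z = analytic_signal(x)
            return np.log(t_c - t), np.unwrap(np.angle(z))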

  12. On the estimation of the extremal index based on scaling and resampling

    CERN Document Server

    Hamidieh, Kamal; Michailidis, George

    2010-01-01

    The extremal index parameter theta characterizes the degree of local dependence in the extremes of a stationary time series and has important applications in a number of areas, such as hydrology, telecommunications, finance and environmental studies. In this study, a novel estimator for theta based on the asymptotic scaling of block-maxima and resampling is introduced. It is shown to be consistent and asymptotically normal for a large class of m-dependent time series. Further, a procedure for the automatic selection of its tuning parameter is developed and different types of confidence intervals that prove useful in practice are proposed. The performance of the estimator is examined through simulations, which show its highly competitive behavior. Finally, the estimator is applied to three real data sets of daily crude oil prices, daily returns of the S&P 500 stock index, and high-frequency, intra-day traded volumes of a stock. These applications demonstrate additional diagnostic features of statistical plots ...
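
    For orientation, the classical blocks estimator of theta (not the scaling-and-resampling estimator introduced in the paper) compares the exceedance rate of block maxima with the marginal exceedance rate; a minimal sketch under an assumed threshold and block length:

        import numpy as np

        def extremal_index_blocks(x, u, block_length):
            """Classical blocks estimator of the extremal index for threshold u."""
            x = np.asarray(x, dtype=float)
            n = len(x) // block_length * block_length
            blocks = x[:n].reshape(-1, block_length)
            k = blocks.shape[0]
            n_exceed = np.sum(x[:n] > u)                  # marginal exceedances
            k_exceed = np.sum(blocks.max(axis=1) > u)     # blocks with at least one exceedance
            # P(block max <= u) ~ F(u)^(b*theta), so theta follows from the log-ratio below.
            return np.log(1.0 - k_exceed / k) / (block_length * np.log(1.0 - n_exceed / n))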

  13. Improving plot- and regional-scale crop models for simulating impacts of climate variability and extremes

    Science.gov (United States)

    Tao, F.; Rötter, R.

    2013-12-01

    Many studies on global climate report that climate variability is increasing with more frequent and intense extreme events1. There are quite large uncertainties from both the plot- and regional-scale models in simulating impacts of climate variability and extremes on crop development, growth and productivity2,3. One key to reducing the uncertainties is better exploitation of experimental data to eliminate crop model deficiencies and develop better algorithms that more adequately capture the impacts of extreme events, such as high temperature and drought, on crop performance4,5. In the present study, in a first step, the inter-annual variability in wheat yield and climate from 1971 to 2012 in Finland was investigated. Using statistical approaches the impacts of climate variability and extremes on wheat growth and productivity were quantified. In a second step, a plot-scale model, WOFOST6, and a regional-scale crop model, MCWLA7, were calibrated and validated, and applied to simulate wheat growth and yield variability from 1971-2012. Next, the estimated impacts of high temperature stress, cold damage, and drought stress on crop growth and productivity based on the statistical approaches, and on crop simulation models WOFOST and MCWLA were compared. Then, the impact mechanisms of climate extremes on crop growth and productivity in the WOFOST model and MCWLA model were identified, and subsequently, the various algorithm and impact functions were fitted against the long-term crop trial data. Finally, the impact mechanisms, algorithms and functions in WOFOST model and MCWLA model were improved to better simulate the impacts of climate variability and extremes, particularly high temperature stress, cold damage and drought stress for location-specific and large area climate impact assessments. Our studies provide a good example of how to improve, in parallel, the plot- and regional-scale models for simulating impacts of climate variability and extremes, as needed for

  14. Extremely large-scale simulation of a Kardar-Parisi-Zhang model using graphics cards.

    Science.gov (United States)

    Kelling, Jeffrey; Ódor, Géza

    2011-12-01

    The octahedron model introduced recently has been implemented on graphics cards, which permits extremely large-scale simulations via binary lattice gases and bit-coded algorithms. We confirm scaling behavior belonging to the two-dimensional Kardar-Parisi-Zhang universality class and find a surface growth exponent β = 0.2415(15) on 2^17 × 2^17 systems, ruling out β = 1/4 suggested by field theory. The maximum speedup with respect to a single CPU is 240. The steady state has been analyzed by finite-size scaling and a roughness exponent α = 0.393(4) is found. Correction-to-scaling exponents are computed and the power-spectrum density of the steady state is determined. We calculate the universal scaling functions and cumulants and show that the limit distribution can be obtained for the sizes considered. We provide numerical fitting for the small- and large-tail behavior of the steady-state scaling function of the interface width.

  15. Extreme Scale-out SuperMUC Phase 2 - lessons learned

    CERN Document Server

    Hammer, Nicolay; Satzger, Helmut; Allalen, Momme; Block, Alexander; Karmakar, Anupam; Brehm, Matthias; Bader, Reinhold; Iapichino, Luigi; Ragagnin, Antonio; Karakasis, Vasilios; Kranzlmüller, Dieter; Bode, Arndt; Huber, Herbert; Kühn, Martin; Machado, Rui; Grünewald, Daniel; Edelmann, Philipp V F; Röpke, Friedrich K; Wittmann, Markus; Zeiser, Thomas; Wellein, Gerhard; Mathias, Gerald; Schwörer, Magnus; Lorenzen, Konstantin; Federrath, Christoph; Klessen, Ralf; Bamberg, Karl-Ulrich; Ruhl, Hartmut; Schornbaum, Florian; Bauer, Martin; Nikhil, Anand; Qi, Jiaxing; Klimach, Harald; Stüben, Hinnerk; Deshmukh, Abhishek; Falkenstein, Tobias; Dolag, Klaus; Petkova, Margarita

    2016-01-01

    In spring 2015, the Leibniz Supercomputing Centre (Leibniz-Rechenzentrum, LRZ), installed their new Peta-Scale System SuperMUC Phase2. Selected users were invited for a 28 day extreme scale-out block operation during which they were allowed to use the full system for their applications. The following projects participated in the extreme scale-out workshop: BQCD (Quantum Physics), SeisSol (Geophysics, Seismics), GPI-2/GASPI (Toolkit for HPC), Seven-League Hydro (Astrophysics), ILBDC (Lattice Boltzmann CFD), Iphigenie (Molecular Dynamic), FLASH (Astrophysics), GADGET (Cosmological Dynamics), PSC (Plasma Physics), waLBerla (Lattice Boltzmann CFD), Musubi (Lattice Boltzmann CFD), Vertex3D (Stellar Astrophysics), CIAO (Combustion CFD), and LS1-Mardyn (Material Science). The projects were allowed to use the machine exclusively during the 28 day period, which corresponds to a total of 63.4 million core-hours, of which 43.8 million core-hours were used by the applications, resulting in a utilization of 69%. The top 3...

  16. Comparing trends in hydrometeorological average and extreme data sets around the world at different time scales

    Directory of Open Access Journals (Sweden)

    Thomas Rosmann

    2016-03-01

    New hydrological insights for the region: Results indicate that trends can be found for all variables and at all latitudes, with an increase in global temperature over the analysed time period. Fewer trends were observed in the extreme-value data. Trends in discharge data were predominantly negative, and precipitation trends were not very common. In some cases, an opposing pattern was observed in the northern and southern hemispheres. The highest number of trends was found at the annual resolution and the lowest at the daily resolution; nevertheless, trend patterns for discharge remained similar at the different time scales. Some of the factors that might influence these results are discussed.

  17. Scaling analysis of paces of fetal breathing, gross-body and extremity movements

    OpenAIRE

    Govindan, R. B.; Wilson, J. D.; Murphy, P.; Russel, W. A.; Lowery, C. L.

    2007-01-01

    Using detrended fluctuation analysis (DFA), we studied the scaling properties of the time instances (occurrences) of fetal breathing, gross-body, and extremity movements, scored on a second-by-second basis from recorded ultrasound measurements of 49 fetuses. The DFA exponent α of all three movements varied between 0.63 and 1.1. We found an increase in α obtained for the movement due to breathing as a function of the gestational age, while this trend was not observed fo...
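
    For readers unfamiliar with DFA, the sketch below is a minimal first-order DFA (DFA1) on a synthetic second-by-second occurrence series; the binary toy series, window sizes and detrending order are assumptions, not the authors' processing chain. The exponent α is the slope of log F(s) versus log s.

```python
# Minimal DFA1 sketch (assumed details: binary occurrence series, first-order
# detrending); estimates a scaling exponent alpha of the kind reported above.
import numpy as np

def dfa(x, scales):
    """Return the fluctuation function F(s) for each window size s (DFA1)."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    fluct = []
    for s in scales:
        n_win = len(y) // s
        segs = y[:n_win * s].reshape(n_win, s)
        t = np.arange(s)
        f2 = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)       # local linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    return np.array(fluct)

rng = np.random.default_rng(1)
movement = (rng.random(4096) < 0.3).astype(float)    # toy second-by-second occurrences
scales = np.unique(np.logspace(2, 9, 15, base=2).astype(int))
F = dfa(movement, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"DFA exponent alpha ≈ {alpha:.2f}")           # ~0.5 for uncorrelated data
```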

  18. Vineland Adaptive Behavior Scales as a summary of functional outcome of extremely low-birthweight children.

    Science.gov (United States)

    Rosenbaum, P; Saigal, S; Szatmari, P; Hoult, L

    1995-07-01

    This study reports moderate to high Pearson correlations between Vineland Adaptive Behavior Scale (VABS) subscale and total scores and a variety of cognitive, academic and motor performance tests on a population of extremely low-birthweight infants assessed at eight years of age. The subscales describe adaptive behaviour in daily living, communication, motor function and socialization, as well as an adaptive behaviour composite score. Because it can provide a norm-referenced description of functional outcomes and can be used to assess all children regardless of disability, the authors believe that the VABS should be applied uniformly by all groups reporting school-age outcome of neonatal intensive-care populations.

  19. Development of a censored modelling approach for stochastic estimation of rainfall extremes at fine temporal scales

    Science.gov (United States)

    Cross, David; Onof, Christian; Bernardara, Pietro

    2016-04-01

    With the COP21 drawing to a close in December 2015, storms Desmond, Eva and Frank, which swept across the UK and Ireland causing widespread flooding and devastation, have acted as a timely reminder of the need for reliable estimation of rainfall extremes in a changing climate. The frequency and intensity of rainfall extremes are predicted to increase in the UK under anthropogenic climate change, and it is notable that the UK's 24-hour rainfall record of 316 mm, set at Seathwaite, Cumbria in 2009, was broken on 5 December 2015 with 341 mm by storm Desmond at Honister Pass, also in Cumbria. Immediate analysis of the latter by the Centre for Ecology and Hydrology (UK) on 8 December 2015 estimated that this is approximately equivalent to a 1300-year return period event (Centre for Ecology & Hydrology, 2015). Rainfall extremes are typically estimated using extreme value analysis and intensity-duration-frequency curves. This study investigates the potential for using stochastic rainfall simulation with mechanistic rectangular pulse models for the estimation of extreme rainfall. These models have been used since the late 1980s to generate synthetic rainfall time series at point locations for scenario analysis in hydrological studies and climate impact assessment at the catchment scale. Routinely they are calibrated to the full historical hyetograph and used for continuous simulation. However, their extremal performance is variable, with a tendency to underestimate short-duration (hourly and sub-hourly) rainfall extremes, which are often associated with heavy convective rainfall in temperate climates such as the UK's. Focussing on hourly and sub-hourly rainfall, a censored modelling approach is proposed in which rainfall below a low threshold is set to zero prior to model calibration. It is hypothesised that synthetic rainfall time series are poor at estimating extremes because the majority of the training data are not representative of the climatic conditions which give rise to
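
    The core preprocessing step proposed here is simple to state: values below a low threshold are zeroed before calibration. The sketch below illustrates that censoring step on a synthetic hourly series, followed by an annual-maximum GEV fit of the kind used to judge extremal performance; the threshold value, the synthetic rainfall generator and the use of scipy's genextreme are assumptions for illustration only.

```python
# Minimal sketch of the censoring step described above (assumed threshold and
# synthetic hourly data): values below a low threshold are set to zero before
# model calibration, and extremes are summarised via an annual-maximum GEV fit.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(2)
hours_per_year, n_years = 24 * 365, 30
rain = rng.gamma(shape=0.08, scale=2.0, size=hours_per_year * n_years)  # toy hourly mm

threshold_mm = 0.5                                   # assumed low censoring threshold
censored = np.where(rain < threshold_mm, 0.0, rain)

annual_max = censored.reshape(n_years, hours_per_year).max(axis=1)
c, loc, scale = genextreme.fit(annual_max)           # GEV fit to annual hourly maxima
rl_100 = genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale)
print(f"100-year hourly return level ≈ {rl_100:.1f} mm")
```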

  20. Understanding extreme sea levels for broad-scale coastal impact and adaptation analysis

    Science.gov (United States)

    Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Dangendorf, S.; Hinkel, J.; Slangen, A. B. A.

    2017-07-01

    One of the main consequences of mean sea level rise (SLR) on human settlements is an increase in flood risk due to an increase in the intensity and frequency of extreme sea levels (ESL). While substantial research efforts are directed towards quantifying projections and uncertainties of future global and regional SLR, corresponding uncertainties in contemporary ESL have not been assessed and projections are limited. Here we quantify, for the first time at global scale, the uncertainties in present-day ESL estimates, which have by default been ignored in broad-scale sea-level rise impact assessments to date. ESL uncertainties exceed those from global SLR projections and, assuming that we meet the Paris agreement goals, the projected SLR itself by the end of the century in many regions. Both uncertainties in SLR projections and ESL estimates need to be understood and combined to fully assess potential impacts and adaptation needs.

  1. Intensification of the regional scale variability of extreme precipitation derived from RCM simulations and observations

    Science.gov (United States)

    Feldmann, H.; Schädler, G.; Panitz, H.-J.

    2012-04-01

    Future climate change patterns are usually derived from ensembles of coarse global climate model (GCM) simulations, for instance within the Coupled Model Intercomparison Project (CMIP), or from regional climate projections at resolutions of some tens of km, for instance for Europe from the ENSEMBLES or PRUDENCE projects. For regions with complex topography like Central Europe, the horizontal resolution of these climate projections is still too coarse to resolve the typical topographical length scales, and therefore the interplay of the large-scale changes with the regional geography cannot be captured adequately. For this task high-resolution ensemble simulations with regional climate models (RCMs) are needed. The generation of an ensemble of such high-resolution simulations requires great computational effort. With the RCM COSMO-CLM, several simulations with resolutions down to 7 km have been performed, using different driving GCMs and GCM realisations. This ensemble approach is needed to estimate the robustness of the change signals and to account for the uncertainties introduced by differences in the large-scale forcing, due to the variability of the climate change signals caused by the different GCMs or by natural variability. The focus of the study is on the changes of extreme precipitation for the near future, until the middle of the 21st century. An increase of the temporal and spatial variability is found for the precipitation extremes, especially for summer. The change patterns appear to be statistically robust. Based on long-term observational climatologies for the second half of the 20th century, similar structures were found, with areas of decrease and increase only a few tens of kilometres apart from each other. The combination of the findings from the RCM projections and observations suggests a continuation of the trends from the recent past into the near future. Possible causes for the horizontally heterogeneous change patterns are related to weather pattern

  2. Large-scale and spatio-temporal extreme rain events over India: a hydrometeorological study

    Science.gov (United States)

    Ranade, Ashwini; Singh, Nityanand

    2014-02-01

    Frequency, intensity, areal extent (AE) and duration of rain spells during the summer monsoon exhibit large intra-seasonal and inter-annual variations. Important features of the monsoon-period large-scale wet spells over India have been documented. A main monsoon wet spell (MMWS) occurs over the country from 18 June to 16 September, during which 26.5 % of the area receives rainfall 26.3 mm/day. Detailed characteristics of the MMWS-period large-scale extreme rain events (EREs) and spatio-temporal EREs (ST-EREs), each concerning rainfall intensity (RI), AE and rainwater (RW), for durations of 1 to 25 days have been studied using 1° gridded daily rainfall (1951-2007). In EREs the 'same area' (grids) is continuously wet, whereas in ST-EREs 'any area' that is on average under wet conditions for the specified duration is considered. For the different extremes, a second-degree polynomial gave an excellent fit to the increase in the annual maximum RI and RW values with increasing duration. Fluctuations of RI, AE, RW and the date of occurrence (or start) of the EREs and ST-EREs did not show any significant trend. However, fluctuations of the 1° latitude-longitude grid annual and spatial maximum rainfall showed a highly significant increasing trend for durations of 1 to 5 days, and the unprecedented rains of 26-27 July 2005 over Mumbai could be a realization of this trend. The Asia-India monsoon intensity significantly influences the MMWS RW.

  3. Using scaling fluctuation analysis to quantify anthropogenic changes in regional and global precipitation, including extremes

    Science.gov (United States)

    de Lima, Isabel; Lovejoy, Shaun

    2016-04-01

    Anthropic precipitation changes affect the mean as well as the magnitude and frequency of extreme events, and therefore potentially have severe consequences for all aspects of human life. Unfortunately, unlike anthropic temperature changes, precipitation changes of anthropic origin have proven difficult to establish with high statistical significance. For example, when changes have been established for individual precipitation products, the serious divergences found between products reflect our limited ability to estimate areal precipitation even at global scales. In addition to data issues, the usual approaches to assessing changes in precipitation also have methodological issues that hamper their identification. Here we discuss how the situation can be clarified by the systematic application of scaling fluctuation analysis - for example, to determine the scales at which the anthropogenic signal exceeds the natural variability noise (we find that it is roughly 20 years). Following a recent approach for estimating anthropogenic temperature changes, we directly determine the effective sensitivity of the precipitation rate to a doubling of CO2. The novelty of this approach is that it takes CO2 as a surrogate for all anthropogenic forcings and estimates the trend based on the forcing rather than on time - the usual approach. This leads to an improved signal-to-noise ratio and, compared to the usual trend estimates, greater statistical significance; we further improve the signal-to-noise ratio by considering precipitation over the ocean, where anthropogenic increases are strongest, finding statistically significant trends at the 3 to 4 standard deviation level. This approach also permits the first direct estimate of the increase in global precipitation with temperature: we find 1.71 ± 0.62 %/K, which is close to that found by GCMs (2-3 %/K) and well below the value of ≈6-7 %/K predicted on the basis of increases in humidity
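
    The key methodological point, regressing against the forcing rather than against time, can be illustrated with a few lines of code. The sketch below uses an entirely synthetic anomaly series and a toy CO2 curve (all numbers are invented), regressing the precipitation anomaly on log2(CO2/280 ppm) so that the slope reads directly as a change per CO2 doubling.

```python
# Minimal sketch (synthetic data, invented numbers): regressing a precipitation
# anomaly series on log2(CO2) -- a surrogate for anthropogenic forcing -- rather
# than on time, so the slope is the change per CO2 doubling.
import numpy as np

rng = np.random.default_rng(3)
years = np.arange(1950, 2016)
co2 = 310.0 * np.exp(0.004 * (years - 1950))          # toy CO2 curve, ppm
forcing = np.log2(co2 / 280.0)                        # doublings relative to pre-industrial

precip_anom = 2.0 * forcing + rng.normal(0.0, 0.5, years.size)  # % anomaly, synthetic

slope, intercept = np.polyfit(forcing, precip_anom, 1)
resid = precip_anom - (slope * forcing + intercept)
se = np.sqrt(np.sum(resid**2) / (years.size - 2) / np.sum((forcing - forcing.mean())**2))
print(f"sensitivity ≈ {slope:.2f} ± {se:.2f} % per CO2 doubling")
```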

  4. Resilience Design Patterns - A Structured Approach to Resilience at Extreme Scale (version 1.0)

    Energy Technology Data Exchange (ETDEWEB)

    Hukerikar, Saurabh [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Engelmann, Christian [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-10-01

    Reliability is a serious concern for future extreme-scale high-performance computing (HPC) systems. Projections based on the current generation of HPC systems and technology roadmaps suggest very high fault rates in future systems. The errors resulting from these faults will propagate and generate various kinds of failures, which may result in outcomes ranging from result corruptions to catastrophic application crashes. Practical limits on power consumption in HPC systems will require future systems to embrace innovative architectures, increasing the levels of hardware and software complexity. The resilience challenge for extreme-scale HPC systems requires management of various hardware and software technologies that are capable of handling a broad set of fault models at accelerated fault rates. These techniques must seek to improve resilience at reasonable overheads in power consumption and performance. While the HPC community has developed various solutions, application-level as well as system-based, the solution space of HPC resilience techniques remains fragmented. There are no formal methods and metrics to investigate and evaluate resilience holistically in HPC systems that consider impact scope, handling coverage, and performance and power efficiency across the system stack. Additionally, few of the current approaches are portable to newer architectures and software ecosystems, which are expected to be deployed on future systems. In this document, we develop a structured approach to the management of HPC resilience based on the concept of resilience-based design patterns. A design pattern is a general repeatable solution to a commonly occurring problem. We identify the commonly occurring problems and solutions used to deal with faults, errors and failures in HPC systems. The catalog of resilience design patterns provides designers with reusable design elements. We define a design framework that enhances our understanding of the important

  5. Resilience Design Patterns - A Structured Approach to Resilience at Extreme Scale (version 1.1)

    Energy Technology Data Exchange (ETDEWEB)

    Hukerikar, Saurabh [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Engelmann, Christian [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-12-01

    Reliability is a serious concern for future extreme-scale high-performance computing (HPC) systems. Projections based on the current generation of HPC systems and technology roadmaps suggest the prevalence of very high fault rates in future systems. The errors resulting from these faults will propagate and generate various kinds of failures, which may result in outcomes ranging from result corruptions to catastrophic application crashes. Therefore the resilience challenge for extreme-scale HPC systems requires management of various hardware and software technologies that are capable of handling a broad set of fault models at accelerated fault rates. Also, due to practical limits on power consumption in HPC systems, future systems are likely to embrace innovative architectures, increasing the levels of hardware and software complexity. As a result, the techniques that seek to improve resilience must navigate the complex trade-off space between resilience and the overheads in power consumption and performance. While the HPC community has developed various resilience solutions, application-level techniques as well as system-based solutions, the solution space of HPC resilience techniques remains fragmented. There are no formal methods and metrics to investigate and evaluate resilience holistically in HPC systems that consider impact scope, handling coverage, and performance and power efficiency across the system stack. Additionally, few of the current approaches are portable to newer architectures and software environments that will be deployed on future systems. In this document, we develop a structured approach to the management of HPC resilience using the concept of resilience-based design patterns. A design pattern is a general repeatable solution to a commonly occurring problem. We identify the commonly occurring problems and solutions used to deal with faults, errors and failures in HPC systems. Each established solution is described in the form of a pattern that

  6. Cooperative and Noncooperative Strategies for Small-scale Fisheries' Self-governance in the Globalization Era: Implications for Conservation

    Directory of Open Access Journals (Sweden)

    Xavier Basurto

    2013-12-01

    Fishing cooperatives (co-ops) and patron-client relationships are the most common cooperative and noncooperative strategies for self-governance for small-scale fisheries around the world. We studied what drives fishers to choose between these two self-governance arrangements in 12 communities in the Gulf of California, Mexico. The communities depend on similar fishing resources, are located in contiguous portions of the coast, fish roughly the same species, have similar socioeconomic characteristics, and sell to similar markets, yet half of the fisheries are organized around co-ops and the other half work through patron-client arrangements. Using participant observation, in-depth interviews of key informants between 1995 and 2008, and a survey of 55% of the fisheries in the study area, we found that the presence of high transaction costs of commercialization, the desire to acquire fishing licenses, and the existence of traditions of successful collective action among fishing groups within each community strongly influence fishers' choices regarding membership in fishing co-ops. We also examined the implications of our findings for conservation of fishing resources. Given that the emergence of co-ops was associated with high transaction costs of commercialization, we hypothesize that cooperative strategies are more likely than patron-client strategies to emerge in communities in isolated locations. In an era of globalization, in which the rate of development and urbanization will increase in coastal areas, patron-client strategies are likely to become more prevalent among fisheries, but such self-governance strategies are thought to be less conducive to conservation behaviors.

  7. Enabling Structured Exploration of Workflow Performance Variability in Extreme-Scale Environments

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin; Stephan, Eric G.; Raju, Bibi; Altintas, Ilkay; Elsethagen, Todd O.; Krishnamoorthy, Sriram

    2015-11-15

    Workflows are taking an increasingly important role in orchestrating complex scientific processes in extreme-scale and highly heterogeneous environments. However, to date we cannot reliably predict, understand, and optimize workflow performance. Sources of performance variability, and in particular the interdependencies of workflow design, execution environment and system architecture, are not well understood. While there is a rich portfolio of tools for performance analysis, modeling and prediction for single applications in homogeneous computing environments, these are not applicable to workflows, due to the number and heterogeneity of the involved workflow and system components and their strong interdependencies. In this paper, we investigate workflow performance goals and identify factors that could have a relevant impact. Based on our analysis, we propose a new workflow performance provenance ontology, the Open Provenance Model-based WorkFlow Performance Provenance, or OPM-WFPP, that will enable the empirical study of workflow performance characteristics and variability, including complex source attribution.

  8. Recovery Act - CAREER: Sustainable Silicon -- Energy-Efficient VLSI Interconnect for Extreme-Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Chiang, Patrick [Oregon State Univ., Corvallis, OR (United States)

    2014-01-31

    The research goal of this CAREER proposal is to develop energy-efficient, VLSI interconnect circuits and systems that will facilitate future massively-parallel, high-performance computing. Extreme-scale computing will exhibit massive parallelism on multiple vertical levels, from thousands of computational units on a single processor to thousands of processors in a single data center. Unfortunately, the energy required to communicate between these units at every level (on-chip, off-chip, off-rack) will be the critical limitation to energy efficiency. Therefore, the PI's career goal is to become a leading researcher in the design of energy-efficient VLSI interconnect for future computing systems.

  9. Topology-oblivious optimization of MPI broadcast algorithms on extreme-scale platforms

    KAUST Repository

    Hasanov, Khalid

    2015-11-01

    Significant research has been conducted on collective communication operations, in particular MPI broadcast, on distributed-memory platforms. Most of the research efforts aim to optimize the collective operations for particular architectures by taking into account either their topology or platform parameters. In this work we propose a simple but general approach to optimization of the legacy MPI broadcast algorithms which are widely used in MPICH and Open MPI. The proposed optimization technique is designed to address the challenge of the extreme scale of future HPC platforms. It is based on a hierarchical transformation of the traditionally flat logical arrangement of communicating processors. Theoretical analysis and experimental results on IBM BlueGene/P and a cluster of the Grid'5000 platform are presented.
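
    The hierarchical idea, broadcasting first among group leaders and then within each group instead of over one flat communicator, can be sketched with mpi4py as below. This is an illustration of the general transformation, not the paper's implementation; the group size, the two-level depth and the use of Comm.Split are assumptions made for the example.

```python
# Illustrative sketch (not the paper's implementation) of a two-level,
# hierarchical broadcast built on top of a flat MPI communicator with mpi4py.
# Run with, e.g.: mpirun -n 8 python hier_bcast.py
from mpi4py import MPI

GROUP_SIZE = 4                                   # assumed group size parameter

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

group_id = rank // GROUP_SIZE                    # which group this rank belongs to
local = comm.Split(color=group_id, key=rank)     # intra-group communicator
is_leader = (local.Get_rank() == 0)
leaders = comm.Split(color=0 if is_leader else MPI.UNDEFINED, key=rank)

data = {"payload": list(range(10))} if rank == 0 else None

# Step 1: broadcast among group leaders only.
if is_leader:
    data = leaders.bcast(data, root=0)
# Step 2: each leader broadcasts within its own group.
data = local.bcast(data, root=0)

print(f"rank {rank} received {data['payload'][:3]}...")
```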

  10. Data co-processing for extreme scale analysis level II ASC milestone (4745).

    Energy Technology Data Exchange (ETDEWEB)

    Rogers, David; Moreland, Kenneth D.; Oldfield, Ron A.; Fabian, Nathan D.

    2013-03-01

    Exascale supercomputing will embody many revolutionary changes in the hardware and software of high-performance computing. A particularly pressing issue is gaining insight into the science behind the exascale computations. Power and I/O speed constraints will fundamentally change current visualization and analysis workflows. A traditional post-processing workflow involves storing simulation results to disk and later retrieving them for visualization and data analysis. However, at exascale, scientists and analysts will need a range of options for moving data to persistent storage, as the current offline or post-processing pipelines will not be able to capture the data necessary for data analysis of these extreme-scale simulations. This Milestone explores two alternate workflows, characterized as in situ and in transit, and compares them. We find each to have its own merits and faults, and we provide information to help pick the best option for a particular use.

  11. The role of Natural Flood Management in managing floods in large scale basins during extreme events

    Science.gov (United States)

    Quinn, Paul; Owen, Gareth; ODonnell, Greg; Nicholson, Alex; Hetherington, David

    2016-04-01

    There is a strong evidence base showing the negative impacts of land use intensification and soil degradation in NW European river basins on hydrological response and on flood impact downstream. However, the ability to target zones of high runoff production, and the extent to which we can manage flood risk using nature-based flood management solutions, are less well known. A move to planting more trees and having less intensively farmed landscapes is part of natural flood management (NFM) solutions, and these methods suggest that flood risk can be managed in alternative and more holistic ways. So which local NFM methods should be used, where in a large-scale basin should they be deployed, and how does the flow propagate to any point downstream? Generally, how much intervention is needed, and will it compromise food production systems? If we are observing record levels of rainfall and flow, for example during Storm Desmond in December 2015 in the North West of England, what other flood management options are really needed to complement our traditional defences in large basins in the future? In this paper we will show examples of NFM interventions in the UK that have had an impact at local-scale sites. We will demonstrate the impact of interventions at the local, sub-catchment (meso) scale and finally at the large scale. These tools include observations, process-based models and more generalised Flood Impact Models. Issues of synchronisation and the design level of protection will be debated. By reworking observed rainfall and discharge (runoff) for observed extreme events in the River Eden and River Tyne during Storm Desmond, we will show how much flood protection is needed in large-scale basins. The research will thus pose a number of key questions as to how floods may have to be managed in large-scale basins in the future. We will seek to support a method of catchment systems engineering that holds water back across the whole landscape as a major opportunity to manage water

  12. Forcings and Feedbacks on Convection in the 2010 Pakistan Flood: Modeling Extreme Precipitation with Interactive Large-Scale Ascent

    CERN Document Server

    Nie, Ji; Sobel, Adam H

    2016-01-01

    Extratropical extreme precipitation events are usually associated with large-scale flow disturbances, strong ascent and large latent heat release. The causal relationships between these factors are often not obvious, however, and the roles of different physical processes in producing the extreme precipitation event can be difficult to disentangle. Here, we examine the large-scale forcings and convective heating feedback in the precipitation events which caused the 2010 Pakistan flood within the Column Quasi-Geostrophic framework. A cloud-resolving model (CRM) is forced with the large-scale forcings (other than large-scale vertical motion) computed from the quasi-geostrophic omega equation with input data from a reanalysis data set, and the large-scale vertical motion is diagnosed interactively with the simulated convection. Numerical results show that the positive feedback of convective heating to large-scale dynamics is essential in amplifying the precipitation intensity to the observed values. Orographic li...

  13. Towards convection-resolving, global atmospheric simulations with the Model for Prediction Across Scales (MPAS): an extreme scaling experiment

    Directory of Open Access Journals (Sweden)

    D. Heinzeller

    2015-08-01

    The Model for Prediction Across Scales (MPAS) is a novel set of earth-system simulation components and consists of an atmospheric model, an ocean model and a land-ice model. Its distinct features are the use of unstructured Voronoi meshes and C-grid discretisation to address shortcomings of global models on regular grids and of limited-area models nested in a forcing data set, with respect to parallel scalability, numerical accuracy and physical consistency. This makes MPAS a promising tool for conducting climate-related impact studies of, for example, land use changes in a consistent approach. Here, we present an in-depth evaluation of MPAS with regard to technical aspects of performing model runs and scalability for three medium-size meshes on four different High Performance Computing sites with different architectures and compilers. We uncover model limitations and identify new aspects for model optimisation that are introduced by the use of unstructured Voronoi meshes. We further demonstrate the model performance of MPAS in terms of its capability to reproduce the dynamics of the West African Monsoon and its associated precipitation. Comparing 11-month runs for two meshes with observations and a Weather Research & Forecasting (WRF) reference model, we show that MPAS can reproduce the atmospheric dynamics on global and local scales, but that further optimisation is required to address a precipitation excess for this region. Finally, we conduct extreme scaling tests on a global 3 km mesh with more than 65 million horizontal grid cells on up to half a million cores. We discuss necessary modifications of the model code to improve its parallel performance in general and specific to the HPC environment. We confirm good scaling (70% parallel efficiency or better) of the MPAS model and provide numbers on the computational requirements for experiments with the 3 km mesh. In doing so, we show that global, convection-resolving atmospheric

  14. Graph visualization for the analysis of the structure and dynamics of extreme-scale supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Berkbigler, K. P. (Kathryn P.); Bush, B. W. (Brian W.); Davis, Kei; Hoisie, A. (Adolfy); Smith, S. A. (Steve A.)

    2002-01-01

    We are exploring the development and application of information visualization techniques for the analysis of new extreme-scale supercomputer architectures. Modern supercomputers typically comprise very large clusters of commodity SMPs interconnected by possibly dense and often nonstandard networks. The scale, complexity, and inherent nonlocality of the structure and dynamics of this hardware, and of the systems and applications distributed over it, challenge traditional analysis methods. As part of the a la carte team at Los Alamos National Laboratory, which is simulating these advanced architectures, we are exploring advanced visualization techniques and creating tools to provide intuitive exploration, discovery, and analysis of these simulations. This work complements existing and emerging algorithmic analysis tools. Here we give background on the problem domain, a description of a prototypical computer architecture of interest (on the order of 10,000 processors connected by a quaternary fat-tree network), and presentations of several visualizations of the simulation data that make clear the flow of data in the interconnection network.

  15. Intrinsic Origin Of Extreme-Scale Rotation Of Quasar Polarization Vectors

    CERN Document Server

    Silant'ev, N A; Gnedin, Yu N; Natsvlishvili, T M

    2009-01-01

    Extreme-scale alignment of quasar optical polarization vectors at cosmological scales (z ≤ 2) is also characterized by the rotation of the mean position angle χ, with Δχ ≈ 30° per 1 Gpc. Over the observed interval of z the total rotation angle acquires a value of ~90°. We suggest a possible explanation of half of this rotation as a consequence of the physical transformation of an initially vertical magnetic field B∥, directed along the normal N to the surface of the accretion disk, into a horizontal one (perpendicular to N). We found asymptotic analytical expressions for the axially averaged polarization degree p and mean position angle χ for various types of magnetized accretion disks. We also found that during the evolution the case B⊥ ≈ B∥ can be realized, where the position angle χ rotates from 45° to zero. This rotation may occur over a fairly long cosmological time (corresponding to Δz ~ 1-2). The par...

  16. Orographic signature on multiscale statistics of extreme rainfall: A storm-scale study

    Science.gov (United States)

    Ebtehaj, Mohammad; Foufoula-Georgiou, Efi

    2010-12-01

    Rainfall intensity and spatiotemporal patterns often show a strong dependence on the underlying terrain. The main objective of this work is to study the statistical signature imprinted by orography on the spatial structure of rainfall and its temporal evolution at multiple scales, with the aim of developing a consistent theoretical basis for conditional downscaling of precipitation given the topographic information of the underlying terrain. The results of an extensive analysis of the high-resolution stage II Doppler radar data of the Rapidan storm, June 1995, over the Appalachian Mountains are reported in this study. The orographic signature on the elementary statistical structure of the precipitation fields is studied via a variable-intensity thresholding scheme. This signature is further explored at multiple scales via analysis of the dependence of the precipitation fields on the underlying terrain in both the Fourier and wavelet domains. The generalized normal distribution is found to be a suitable probability model to explain the variability of the rainfall wavelet coefficients and its dependence on the underlying elevations. These results provide a new perspective for more accurate statistical downscaling of orographic precipitation over complex terrain, with emphasis on the preservation of extremes.
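
    As a small illustration of the distributional claim, the sketch below fits scipy's generalized normal distribution (gennorm) to level-1 Haar-like detail coefficients of a synthetic field; the field, the hand-rolled Haar details and the single decomposition level are assumptions, not the study's radar processing.

```python
# Minimal sketch (synthetic field, Haar details computed by hand): fitting a
# generalized normal distribution to wavelet detail coefficients, as the study
# does for rainfall fields over complex terrain.
import numpy as np
from scipy.stats import gennorm

rng = np.random.default_rng(4)
field = rng.gamma(shape=0.3, scale=5.0, size=(128, 128))   # toy rainfall field (mm)

# Level-1 Haar detail coefficients along rows: differences of adjacent pairs / sqrt(2).
pairs = field[:, 0::2] - field[:, 1::2]
details = (pairs / np.sqrt(2.0)).ravel()

beta, loc, scale = gennorm.fit(details)
print(f"generalized-normal shape beta ≈ {beta:.2f} (beta = 2 would be Gaussian)")
```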

  17. Communicating Climate Uncertainties: Challenges and Opportunities Related to Spatial Scales, Extreme Events, and the Warming 'Hiatus'

    Science.gov (United States)

    Casola, J. H.; Huber, D.

    2013-12-01

    Many media, academic, government, and advocacy organizations have achieved sophistication in developing effective messages based on scientific information, and can quickly translate salient aspects of emerging climate research and evolving observations. However, there are several ways in which valid messages can be misconstrued by decision makers, leading them to inaccurate conclusions about the risks associated with climate impacts. Three cases will be discussed: 1) Issues of spatial scale in interpreting climate observations: Local climate observations may contradict summary statements about the effects of climate change on larger regional or global spatial scales. Effectively addressing these differences often requires communicators to understand local and regional climate drivers, and the distinction between a 'signal' associated with climate change and local climate 'noise.' Hydrological statistics in Missouri and California are shown to illustrate this case. 2) Issues of complexity related to extreme events: Climate change is typically invoked following a wide range of damaging meteorological events (e.g., heat waves, landfalling hurricanes, tornadoes), regardless of the strength of the relationship between anthropogenic climate change and the frequency or severity of that type of event. Examples are drawn from media coverage of several recent events, contrasting useful and potentially confusing word choices and frames. 3) Issues revolving around climate sensitivity: The so-called 'pause' or 'hiatus' in global warming has reverberated strongly through political and business discussions of climate change. Addressing the recent slowdown in warming yields an important opportunity to raise climate literacy in these communities. Attempts to use recent observations as a wedge between climate 'believers' and 'deniers' are likely to be counterproductive. Examples are drawn from Congressional testimony and media stories. All three cases illustrate ways that decision

  18. A Fast SVD-Hidden-nodes based Extreme Learning Machine for Large-Scale Data Analytics.

    Science.gov (United States)

    Deng, Wan-Yu; Bai, Zuo; Huang, Guang-Bin; Zheng, Qing-Hua

    2016-05-01

    Big dimensional data is a growing trend that is emerging in many real-world contexts, extending from web mining, gene expression analysis, and protein-protein interaction to high-frequency financial data. Nowadays, there is a growing consensus that increasing dimensionality has impeding effects on the performance of classifiers, which is termed the "peaking phenomenon" in the field of machine intelligence. To address the issue, dimensionality reduction is commonly employed as a preprocessing step on the Big dimensional data before building the classifiers. In this paper, we propose an Extreme Learning Machine (ELM) approach for large-scale data analytics. In contrast to existing approaches, we embed hidden nodes that are designed using singular value decomposition (SVD) into the classical ELM. These SVD nodes in the hidden layer are shown to capture the underlying characteristics of the Big dimensional data well, exhibiting excellent generalization performance. The drawback of using SVD on the entire dataset, however, is the high computational complexity involved. To address this, a fast divide-and-conquer approximation scheme is introduced to maintain computational tractability on high-volume data. The resultant algorithm is labeled here as Fast Singular Value Decomposition-Hidden-nodes based Extreme Learning Machine, or FSVD-H-ELM in short. In FSVD-H-ELM, instead of identifying the SVD hidden nodes directly from the entire dataset, SVD hidden nodes are derived from multiple random subsets of data sampled from the original dataset. Comprehensive experiments and comparisons are conducted to assess FSVD-H-ELM against other state-of-the-art algorithms. The results obtained demonstrate the superior generalization performance and efficiency of FSVD-H-ELM.
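
    A much-simplified sketch of the idea, hidden-layer weights taken from the top right singular vectors of a random data subset followed by the classical ridge solve for the output weights, is given below. It is not the authors' FSVD-H-ELM algorithm (which combines several subsets in a divide-and-conquer scheme); the subset size, number of hidden nodes and ridge parameter are arbitrary choices for illustration.

```python
# Simplified sketch (not the authors' exact algorithm) of SVD-derived hidden
# nodes in an ELM: hidden weights come from the top right singular vectors of a
# random data subset; output weights are obtained by a ridge-regularised solve.
import numpy as np

rng = np.random.default_rng(5)
n, d, n_hidden, ridge = 2000, 100, 50, 1e-2

X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (np.tanh(X @ w_true) > 0).astype(float)          # toy binary labels

subset = X[rng.choice(n, size=500, replace=False)]   # random subset for the SVD
_, _, Vt = np.linalg.svd(subset, full_matrices=False)
W_hidden = Vt[:n_hidden].T                            # d x n_hidden SVD-based weights
bias = rng.normal(size=n_hidden)

H = np.tanh(X @ W_hidden + bias)                      # hidden-layer activations
# Ridge-regularised least squares for the output weights (classical ELM step).
beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ y)

pred = (H @ beta > 0.5).astype(float)
print(f"training accuracy ≈ {np.mean(pred == y):.2f}")
```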

  19. Relationship between climate extremes in Romania and their connection to large-scale air circulation

    Science.gov (United States)

    Barbu, Nicu; Ştefan, Sabina

    2015-04-01

    The aim of this paper is to investigate the connection between climate extremes (temperature and precipitation) in Romania and large-scale air circulation. Daily observational data of maximum air temperature and precipitation amount for the period 1961-2010 were used to compute two seasonal indices associated with temperature and precipitation, quantifying their frequency, as follows: the frequency of very warm days (FTmax90: daily maximum temperature ≥ its 90th percentile) and the frequency of very wet days (FPp90: daily precipitation amount ≥ its 90th percentile). Seasonal frequencies of circulation types were calculated from daily circulation types determined using two objective catalogues (GWT - GrossWetter-Typen and WLK - WetterLagenKlassifikation) from the COST733Action. Daily reanalysis data sets (sea level pressure, geopotential height at 925 and 500 hPa, u and v components of the wind vector at 700 hPa, and precipitable water content for the entire atmospheric column) built by NCEP/NCAR, with 2.5°/2.5° lat/lon spatial resolution, were used to determine the circulation types. In order to select the optimal domain size related to FTmax90 and FPp90, the explained variance (EV) was used. The EV determines the relation between the variance among circulation types and the total variance of the variable under consideration; this method quantifies the discriminatory power of a classification. The relationships between climate extremes in Romania and large-scale air circulation were investigated using a multiple linear regression model (MLRM), with FTmax90 and FPp90 as predictands and the circulation types as predictors. In order to select independent predictors for the MLRM, collinearity and multicollinearity analyses were performed. The study period is divided into two parts: the period 1961-2000 is used to train the MLRM and the period 2001-2010 to validate it. The analytical relationship obtained using the MLRM can be used for future projections
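
    The two building blocks, a percentile-exceedance index and an ordinary multiple linear regression on circulation-type frequencies, are easy to sketch. The code below does this on synthetic data; the season length, number of circulation types and threshold definition are assumptions, not the paper's exact setup.

```python
# Minimal sketch (synthetic data): a seasonal FTmax90-style index -- the count
# of days with Tmax at or above its 90th percentile -- regressed on seasonal
# circulation-type frequencies with an ordinary multiple linear regression.
import numpy as np

rng = np.random.default_rng(6)
n_seasons, days_per_season, n_types = 50, 90, 8

tmax = rng.normal(25.0, 5.0, size=(n_seasons, days_per_season))      # toy daily Tmax
p90 = np.percentile(tmax, 90)                                         # climatological threshold
ftmax90 = (tmax >= p90).sum(axis=1).astype(float)                     # seasonal index

circ_freq = rng.dirichlet(np.ones(n_types), size=n_seasons) * days_per_season

# Ordinary least squares: FTmax90 ~ circulation-type frequencies + intercept.
A = np.column_stack([circ_freq, np.ones(n_seasons)])
coef, *_ = np.linalg.lstsq(A, ftmax90, rcond=None)
fitted = A @ coef
r2 = 1 - np.sum((ftmax90 - fitted)**2) / np.sum((ftmax90 - ftmax90.mean())**2)
print(f"R^2 of the regression ≈ {r2:.2f}")
```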

  20. Radiosonde observational evidence of the influence of extreme SST gradient upon atmospheric meso-scale circulation

    Science.gov (United States)

    Nishikawa, H.; Tachibana, Y.; Udagawa, Y.

    2012-12-01

    Although the influence of anomalous midlatitude SST upon local atmospheric circulation has become widely recognized, in particular over the Kuroshio and Gulf Stream regions, observational studies on the influence of the Okhotsk Sea, which lies to the north of the Kuroshio, upon local atmospheric circulation are much fewer than those for the Kuroshio. The SST climate of the Okhotsk Sea is very peculiar. Extremely cold SST spots, whose summertime SST is lower than 5 °C, are formed around the Kuril Islands. Because SSTs are generally determined by local air-sea interaction as well as temperature advection, it is very difficult to isolate only the oceanic influence upon the atmosphere. The SST in this cold spot is, however, dominated by tidal mixing, which is independent of atmospheric processes. This unique condition makes it easier to account for the oceanic influence alone. Although the SST environment of the Okhotsk Sea is well suited to understanding the oceanic influence upon the atmosphere, only a few studies have been carried out in this region because of the difficulty of observations by research vessels in an area where territorial issues between Japan and Russia remain unresolved. Because of the scarcity of direct observations, the Okhotsk Sea has remained poorly understood. In August 2006, GPS radiosonde observations were carried out by the Russian research vessel Khromov in the Sea of Okhotsk through cooperation between Japan and Russia, and a strong SST gradient of about 7 °C per 10 km was observed around the Kuril Islands. The purpose of this study is to present observational evidence of meso-scale atmospheric anticyclonic circulation influenced by the cold oceanic spot around the Kuril Islands. The main findings of the observations are as follows. Meso-scale ageostrophic anticyclonic circulation in the atmospheric marine boundary layer is observed in and around the cold spot. A high air-pressure area, as compared with other surrounding areas, is also located at the

  1. A Pervasive Parallel Processing Framework for Data Visualization and Analysis at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States)

    2014-11-01

    The evolution of the computing world from teraflop to petaflop has been relatively effortless, with several of the existing programming models scaling effectively to the petascale. The migration to exascale, however, poses considerable challenges. All industry trends indicate that the exascale machine will be built using processors containing hundreds to thousands of cores per chip. It can be inferred that efficient concurrency on exascale machines requires a massive number of concurrent threads, each performing many operations on a localized piece of data. Currently, visualization libraries and applications are based on what is known as the visualization pipeline. In the pipeline model, algorithms are encapsulated as filters with inputs and outputs. These filters are connected by setting the output of one component to the input of another. Parallelism in the visualization pipeline is achieved by replicating the pipeline for each processing thread. This works well for today's distributed memory parallel computers but cannot be sustained when operating on processors with thousands of cores. Our project investigates a new visualization framework designed to exhibit the pervasive parallelism necessary for extreme-scale machines. Our framework achieves this by defining algorithms in terms of worklets, which are localized stateless operations. Worklets are atomic operations that execute when invoked, unlike filters, which execute when a pipeline request occurs. The worklet design allows execution on a massive number of lightweight threads with minimal overhead. Only with such fine-grained parallelism can we hope to fill the billions of threads we expect will be necessary for efficient computation on an exascale machine.
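
    As a loose illustration of the contrast between stateless worklets and stateful pipeline filters, the sketch below maps a stateless per-element operation over data chunks in parallel; the function name, chunking and thread pool are assumptions for illustration and have nothing to do with the project's actual framework.

```python
# Illustrative sketch (not the project's framework): a "worklet" as a stateless
# per-element operation mapped over data chunks, in contrast to a stateful
# pipeline filter. Names and chunking are assumptions for illustration.
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def magnitude_worklet(chunk):
    # Stateless: the output depends only on the input chunk handed to it.
    return np.sqrt(np.sum(chunk**2, axis=1))

vectors = np.random.default_rng(9).normal(size=(1_000_000, 3))
chunks = np.array_split(vectors, 8)                   # one chunk per worker thread

with ThreadPoolExecutor(max_workers=8) as pool:
    result = np.concatenate(list(pool.map(magnitude_worklet, chunks)))

print(result[:4])
```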

  2. Characterization of Multi-Scale Atmospheric Conditions Associated with Extreme Precipitation in the Transverse Ranges of Southern California

    Science.gov (United States)

    Oakley, N.; Kaplan, M.; Ralph, F. M.

    2015-12-01

    The east-west oriented Transverse Ranges of Southern California have historically experienced shallow landslides and debris flows that threaten life and property. Steep topography, soil composition, and frequent wildfires make this area susceptible to mass wasting. Extreme rainfall often acts as a trigger for these events. This work characterizes atmospheric conditions at multiple scales during extreme (>99th percentile) 1-day precipitation events in the major sub-ranges of the Transverse Ranges. Totals from these 1-day events generally exceed the established sub-daily intensity-duration thresholds for shallow landslides and debris flows in this region. Daily extreme precipitation values are derived from both gridded and station-based datasets over the period 1958-2014. For each major sub-range, extreme events are clustered by atmospheric feature and direction of moisture transport. A composite analysis of synoptic conditions is produced for each cluster to create a conceptual model of atmospheric conditions favoring extreme precipitation. The vertical structure of the atmosphere during these extreme events is also examined using observed and modeled soundings. Preliminary results show two atmospheric features to be of importance: 1) closed and cutoff low-pressure systems, areas of counter-clockwise circulation that can produce southerly flow orthogonal to the Transverse Range ridge axes; and 2) atmospheric rivers that transport large quantities of water vapor into the region. In some cases, the closed lows and atmospheric rivers work in concert with each other to produce extreme precipitation. Additionally, there is a notable east-west dipole of precipitation totals during some extreme events between the San Gabriel and Santa Ynez Mountains where extreme values are observed in one range and not the other. The cause of this relationship is explored. The results of this work can help forecasters and emergency responders determine the likelihood that an event will

  3. ExM: System Support for Extreme-Scale, Many-Task Applications

    Energy Technology Data Exchange (ETDEWEB)

    Katz, Daniel S

    2011-05-31

    The ever-increasing power of supercomputer systems is both driving and enabling the emergence of new problem-solving methods that require the efficient execution of many concurrent and interacting tasks. Methodologies such as rational design (e.g., in materials science), uncertainty quantification (e.g., in engineering), parameter estimation (e.g., for chemical and nuclear potential functions, and in economic energy systems modeling), massive dynamic graph pruning (e.g., in phylogenetic searches), Monte-Carlo-based iterative fixing (e.g., in protein structure prediction), and inverse modeling (e.g., in reservoir simulation) all have these requirements. These many-task applications frequently have aggregate computing needs that demand the fastest computers. For example, proposed next-generation climate model ensemble studies will involve 1,000 or more runs, each requiring 10,000 cores for a week, to characterize model sensitivity to initial condition and parameter uncertainty. The goal of the ExM project is to achieve the technical advances required to execute such many-task applications efficiently, reliably, and easily on petascale and exascale computers. In this way, we will open up extreme-scale computing to new problem-solving methods and application classes. In this document, we report on the combined technical progress of the collaborative ExM project, and the institutional financial status of the portion of the project at the University of Chicago, over the first 8 months (through April 30, 2011).

  4. Monitoring Deforestation at Sub-Annual Scales as Extreme Events in Landsat Data Cubes

    Directory of Open Access Journals (Sweden)

    Eliakim Hamunyela

    2016-08-01

    Current methods for monitoring deforestation from satellite data at sub-annual scales require pixel time series to have many historical observations in the reference period to model normal forest dynamics before detecting deforestation. However, in some areas, pixel time series often do not have many historical observations. Detecting deforestation at a pixel with scarce historical observations can be improved by complementing the pixel time series with spatial context information. In this work, we propose a data-driven space-time change detection method that detects deforestation events at sub-annual scales in data cubes of satellite image time series. First we spatially normalised observations in the local space-time data cube to reduce seasonality. Subsequently, we detected deforestation by assessing whether a newly acquired observation in the monitoring period is an extreme when compared against spatially normalised values in a local space-time data cube defined over the reference period. We demonstrated our method at two sites, a dry tropical Bolivian forest and a humid tropical Brazilian forest, by varying the spatial and temporal extent of the data cube. We emulated a “near real-time” monitoring scenario, implying that observations in the monitoring period were assessed for deforestation sequentially rather than simultaneously. Using Landsat normalised difference vegetation index (NDVI) time series, we achieved a median temporal detection delay of less than three observations, a producer's accuracy above 70%, a user's accuracy above 65%, and an overall accuracy above 80% at both sites, even when the reference period of the data cube contained only one year of data. Our results also show that large percentile thresholds (e.g., the 5th percentile) achieve higher producer's accuracy and shorter temporal detection delay, whereas smaller percentiles (e.g., the 0.1 percentile) achieve higher user's accuracy but longer temporal detection delay. The
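
    The heart of the test, spatially normalise each time slice of a local data cube and flag a new observation that falls below a low percentile of the normalised reference values, is sketched below on a synthetic NDVI cube; the window size, the 95th-percentile normalisation and the 5th-percentile threshold are assumptions chosen for illustration.

```python
# Minimal sketch (synthetic NDVI cube, assumed normalisation and threshold) of
# the percentile-based test: spatially normalise each time slice of a local
# data cube, then flag a new observation that falls below a low percentile of
# the normalised reference values.
import numpy as np

rng = np.random.default_rng(7)
t_ref, win = 24, 9                                            # reference time steps, 9x9 window

ref_cube = rng.normal(0.8, 0.05, size=(t_ref, win, win))      # reference NDVI
new_obs_window = rng.normal(0.8, 0.05, size=(win, win))
new_obs_window[win // 2, win // 2] = 0.35                     # simulate clearing at the centre pixel

def spatial_normalise(slice_2d):
    # Divide by the spatial 95th percentile of the window to damp seasonality.
    return slice_2d / np.percentile(slice_2d, 95)

ref_norm = np.array([spatial_normalise(s) for s in ref_cube])
centre_ref = ref_norm[:, win // 2, win // 2]

threshold = np.percentile(centre_ref, 5)                      # assumed 5th-percentile threshold
new_value = spatial_normalise(new_obs_window)[win // 2, win // 2]

print("deforestation flagged" if new_value < threshold else "no change flagged")
```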

  5. The nonstationary impact of local temperature changes and ENSO on extreme precipitation at the global scale

    Science.gov (United States)

    Sun, Qiaohong; Miao, Chiyuan; Qiao, Yuanyuan; Duan, Qingyun

    2017-02-01

    The El Niño-Southern Oscillation (ENSO) and local temperature are important drivers of extreme precipitation. Understanding the impact of ENSO and temperature on the risk of extreme precipitation over global land will provide a foundation for risk assessment and climate-adaptive design of infrastructure in a changing climate. In this study, nonstationary generalized extreme value distributions were used to model extreme precipitation over global land for the period 1979-2015, with an ENSO indicator and temperature as covariates. Risk factors were estimated to quantify the contrast between the influence of different ENSO phases and temperature. The results show that extreme precipitation is dominated by ENSO over 22% of global land and by temperature over 26% of global land. With a warming climate, the risk of high-intensity daily extreme precipitation increases at high latitudes but decreases in tropical regions. For ENSO, large parts of North America, southern South America, and southeastern and northeastern China are shown to suffer greater risk in El Niño years, with more than double the chance of intense extreme precipitation in El Niño years compared with La Niña years. Moreover, regions with more intense precipitation are more sensitive to ENSO. Global climate models were used to investigate the changing relationship between extreme precipitation and the covariates. The risk of extreme, high-intensity precipitation increases across high latitudes of the Northern Hemisphere but decreases in middle and lower latitudes under a warming climate scenario, which will likely trigger increases in severe flooding and droughts across the globe. However, there are some uncertainties associated with the influence of ENSO on predictions of future extreme precipitation, with the spatial extent and risk varying among the different models.
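
    A minimal sketch of a nonstationary GEV of this kind, with the location parameter depending linearly on a covariate such as an ENSO index, is shown below; the synthetic data, the choice of covariate structure and the maximum-likelihood fit via scipy are illustrative assumptions, not the study's configuration (note that scipy's genextreme uses shape c = -ξ).

```python
# Sketch (synthetic data) of a nonstationary GEV whose location parameter
# depends linearly on a covariate such as an ENSO index, fitted by maximum
# likelihood; scipy's genextreme parameterises the shape as c = -xi.
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

rng = np.random.default_rng(8)
n = 60
enso = rng.normal(0.0, 1.0, n)                        # toy annual ENSO index
annual_max = genextreme.rvs(c=-0.1, loc=50 + 5 * enso, scale=10, size=n)

def neg_log_lik(params):
    mu0, mu1, log_scale, c = params
    return -np.sum(genextreme.logpdf(annual_max,
                                     c, loc=mu0 + mu1 * enso,
                                     scale=np.exp(log_scale)))

res = minimize(neg_log_lik,
               x0=[np.mean(annual_max), 0.0, np.log(np.std(annual_max)), -0.1],
               method="Nelder-Mead")
mu0, mu1, log_scale, c = res.x
print(f"location sensitivity to ENSO ≈ {mu1:.2f} mm per unit index")
```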

  6. Correlation between the selective control assessment of lower extremity and pediatric balance scale scores in children with spastic cerebral palsy

    Science.gov (United States)

    Lim, Hyoungwon

    2015-01-01

    [Purpose] The purpose of this study was to investigate the correlation between the Selective Control Assessment of Lower Extremity (SCALE) and the Pediatric Balance Scale (PBS) in children with spastic cerebral palsy and, further, to test whether the SCALE is a valid tool for predicting the PBS. [Subjects and Methods] A cross-sectional study was conducted to evaluate the SCALE and PBS in 23 children (9 females, 14 males, GMFCS level I–III) with spastic cerebral palsy. [Results] Both the SCALE and PBS scores for children with spastic hemiplegia were significantly higher than those for children with spastic diplegia. The scores for SCALE items were low for distal parts. The PBS items that were difficult for the participants to perform were items 8, 9, 10, and 14, with item 8 the most difficult, followed by items 9, 10, and 14. The correlation coefficient (0.797) between the SCALE and PBS scores was statistically significant. The correlations between each SCALE item and the PBS scores were also statistically significant. SCALE items were significantly correlated with two PBS dimensions (standing and postural change). [Conclusion] In the SCALE assessment, more severe deficits were observed in the distal parts. Standing and postural changes in the PBS were difficult for the participants to perform. The two tests, the SCALE and PBS, were highly correlated. Therefore, the SCALE is useful for predicting PBS outcomes and is also applicable as a prognostic indicator for treatment planning. PMID:26834323

  7. ERA-40

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — ERA-40 project was to produce and promote the use of a comprehensive set of global analysis describing the state of the atmosphere and land and ocean-wave conditions...

  8. Assessing future climatic changes of rainfall extremes at small spatio-temporal scales

    DEFF Research Database (Denmark)

    Gregersen, Ida Bülow; Sørup, Hjalte Jomo Danielsen; Madsen, Henrik;

    2013-01-01

    Climate change is expected to influence the occurrence and magnitude of rainfall extremes and hence the flood risks in cities. Major impacts of an increased pluvial flood risk are expected to occur at hourly and sub-hourly resolutions. This makes convective storms the dominant rainfall type in relation to urban flooding. The present study focuses on high-resolution regional climate model (RCM) skill in simulating sub-daily rainfall extremes. Temporal and spatial characteristics of output from three different RCM simulations with 25 km resolution are compared to point rainfall extremes estimated...

  9. The New York City Operations Support Tool: Supporting Water Supply Operations for Millions in an Era of Changing Patterns in Hydrological Extreme Events

    Science.gov (United States)

    Matonse, A. H.; Porter, J. H.; Frei, A.

    2015-12-01

    Providing an average 1.1 billion gallons (~4.2 x 10^6 cubic meters) of drinking water per day to approximately nine million people in New York City (NYC) and four upstate counties, the NYC water supply is among the world's largest unfiltered systems. In addition to providing a reliable water supply in terms of water quantity and quality, the city has to fulfill other flow objectives to serve downstream communities. At times, such as during extreme hydrological events, water quality issues may restrict water usage for parts of the system. To support a risk-based water supply decision making process, NYC has developed the Operations Support Tool (OST). OST combines a water supply systems model with reservoir water quality models, near-real-time data ingestion, data base management and an ensemble hydrological forecast. A number of reports have addressed the frequency and intensity of extreme hydrological events across the continental US. In the northeastern US, studies have indicated an increase in the frequency of extremely large precipitation and streamflow events during the most recent decades. During this presentation we describe OST and, using case studies, we demonstrate how this tool has been useful in supporting operational decisions. We also want to motivate a discussion about how ongoing changes in the patterns of hydrological extreme events elevate the challenge faced by water supply managers, and about the role of the scientific community in integrating nonstationarity approaches into hydrologic forecasting and modeling.

  10. Modelling of spatio-temporal precipitation relevant for urban hydrology with focus on scales, extremes and climate change

    DEFF Research Database (Denmark)

    Sørup, Hjalte Jomo Danielsen

    Time series of precipitation are necessary for assessment of urban hydrological systems. In a changed climate this is challenging as climate model output is not directly comparable to observations at the scales relevant for urban hydrology. The focus of this PhD thesis is downscaling...... of precipitation to spatio-temporal scales used in urban hydrology. It investigates several observational data products and identifies relevant scales where climate change and precipitation can be assessed for urban use. Precipitation is modelled at different scales using different stochastic techniques. A weather...... generator is used to produce an artificial spatio-temporal precipitation product that can be used both directly in large scale urban hydrological modelling and for derivation of extreme precipitation statistics relevant for urban hydrology. It is discussed why precipitation time series from a changed...

  11. A Pervasive Parallel Processing Framework for Data Visualization and Analysis at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Kwan-Liu [Univ. of California, Davis, CA (United States)

    2017-02-01

    efficient computation on an exascale computer. This project concludes with a functional prototype containing pervasively parallel algorithms that perform demonstratively well on many-core processors. These algorithms are fundamental for performing data analysis and visualization at extreme scale.

  12. 2009 fault tolerance for extreme-scale computing workshop, Albuquerque, NM - March 19-20, 2009.

    Energy Technology Data Exchange (ETDEWEB)

    Katz, D. S.; Daly, J.; DeBardeleben, N.; Elnozahy, M.; Kramer, B.; Lathrop, S.; Nystrom, N.; Milfeld, K.; Sanielevici, S.; Scott, S.; Votta, L.; Louisiana State Univ.; Center for Exceptional Computing; LANL; IBM; Univ. of Illinois; Shodor Foundation; Pittsburgh Supercomputer Center; Texas Advanced Computing Center; ORNL; Sun Microsystems

    2009-02-01

    This is a report on the third in a series of petascale workshops co-sponsored by Blue Waters and TeraGrid to address challenges and opportunities for making effective use of emerging extreme-scale computing. This workshop was held to discuss fault tolerance on large systems for running large, possibly long-running applications. The main point of the workshop was to have systems people, middleware people (including fault-tolerance experts), and applications people talk about the issues and figure out what needs to be done, mostly at the middleware and application levels, to run such applications on the emerging petascale systems without having faults cause large numbers of application failures. The workshop found that there is considerable interest in fault tolerance, resilience, and reliability of high-performance computing (HPC) systems in general, at all levels of HPC. The only way to recover from faults is through the use of some redundancy, either in space or in time. Redundancy in time, in the form of writing checkpoints to disk and restarting at the most recent checkpoint after a fault that causes an application to crash or halt, is the most common tool used in applications today, but there are questions about how long this can continue to be a good solution as systems and memories grow faster than I/O bandwidth to disk. There is interest both in modifications to this, such as checkpoints to memory, partial checkpoints, and message logging, and in alternative ideas, such as in-memory recovery using residues. We believe that systematic exploration of these ideas holds the most promise for the scientific applications community. Fault tolerance has been an issue of discussion in the HPC community for at least the past 10 years; but much like other issues, the community has managed to put off addressing it during this period. There is a growing recognition that as systems continue to grow to petascale and beyond, the field is approaching the point where we don
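
    The "redundancy in time" idea discussed at the workshop can be illustrated with a minimal, generic checkpoint/restart sketch; the file name, interval, and state layout below are arbitrary choices, not any particular HPC middleware:

    ```python
    # Minimal sketch of redundancy in time: periodically checkpoint application
    # state to disk and resume from the latest checkpoint after a crash.
    import os, pickle

    CKPT = "state.ckpt"                        # hypothetical checkpoint file
    CHECKPOINT_EVERY = 100                     # iterations between checkpoints

    def load_or_init():
        if os.path.exists(CKPT):
            with open(CKPT, "rb") as f:
                return pickle.load(f)          # restart path
        return {"step": 0, "value": 0.0}       # cold-start path

    def save(state):
        tmp = CKPT + ".tmp"
        with open(tmp, "wb") as f:
            pickle.dump(state, f)
        os.replace(tmp, CKPT)                  # atomic rename avoids torn checkpoints

    state = load_or_init()
    for step in range(state["step"], 10_000):
        state["value"] += 1e-3                 # stand-in for real computation
        state["step"] = step + 1
        if state["step"] % CHECKPOINT_EVERY == 0:
            save(state)
    ```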

  13. Scaling and trends of hourly precipitation extremes in two different climate zones – Hong Kong and the Netherlands

    Directory of Open Access Journals (Sweden)

    G. Lenderink

    2011-09-01

    Full Text Available Hourly precipitation extremes in very long time series from the Hong Kong Observatory and the Netherlands are investigated. Using the 2 m dew point temperature from 4 h before the rainfall event as a measure of near-surface absolute humidity, hourly precipitation extremes closely follow a 14% per degree dependency – a scaling twice as large as that following from the Clausius-Clapeyron relation. However, for dew point temperatures above 23 °C no significant dependency on humidity was found. Strikingly, in spite of the large difference in climate, results are almost identical in Hong Kong and the Netherlands for the dew point temperature range where both observational sets have sufficient data. Trends in hourly precipitation extremes show substantial increases over the last century for both De Bilt (the Netherlands) and Hong Kong. For De Bilt, not only the long-term trend but also variations in hourly precipitation extremes on an inter-decadal timescale of 30 yr and longer can be linked very well to the above scaling; there is a very close resemblance between variations in dew point temperature and precipitation intensity, with an inferred dependency of hourly precipitation extremes of 10 to 14% per degree. For Hong Kong there is no connection between variations in humidity and those in precipitation intensity in the wet season, May to September. This is consistent with the zero dependency of precipitation intensity on humidity found for dew points above 23 °C. Yet, outside the wet season, humidity changes do appear to explain the positive trend in hourly precipitation extremes, again following a dependency close to twice the Clausius-Clapeyron relation.
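
    A hedged worked form of the reported dependency (standard exponential-scaling notation, not quoted from the paper):

    ```latex
    % Illustrative exponential form of the dew-point dependency of hourly
    % precipitation extremes. Clausius-Clapeyron scaling corresponds to roughly
    % 7% per degree; the 14% per degree reported above is the "super-CC" case.
    \[
      P_{\mathrm{ext}}(T_d) \;=\; P_0\,(1+\alpha)^{\,T_d - T_{d,0}},
      \qquad
      \alpha_{\mathrm{CC}} \approx 0.07\ \mathrm{K^{-1}},
      \quad
      \alpha_{\mathrm{obs}} \approx 0.14\ \mathrm{K^{-1}}.
    \]
    ```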

  14. What is not CSR: extremes of CSR perception in the world of business and strategic view on it in the era of conscious capitalism

    OpenAIRE

    OKOROCHKOVA ANASTASIA

    2016-01-01

    Extremes of Corporate Social Responsibility (CSR) perception are evident in the business world today. Business leaders and other stakeholders cannot understand what in particular they should practice as CSR, how, and for what purpose, and they often narrow it down to different business activities that do not have any connection with the sustainable development of business. By opposing it to philanthropy and charity; to the practice of social investments; to marketing activities and PR; to the concept of sha...

  15. Extreme rainfall, vulnerability and risk: a continental-scale assessment for South America

    Science.gov (United States)

    Vorosmarty, Charles J.; de Guenni, Lelys Bravo; Wollheim, Wilfred M.; Pellerin, Brian A.; Bjerklie, David M.; Cardoso, Manoel; D'Almeida, Cassiano; Colon, Lilybeth

    2013-01-01

    Extreme weather continues to preoccupy society as a formidable public safety concern bearing huge economic costs. While attention has focused on global climate change and how it could intensify key elements of the water cycle such as precipitation and river discharge, it is the conjunction of geophysical and socioeconomic forces that shapes human sensitivity and risks to weather extremes. We demonstrate here the use of high-resolution geophysical and population datasets together with documentary reports of rainfall-induced damage across South America over a multi-decadal, retrospective time domain (1960–2000). We define and map extreme precipitation hazard, exposure, affected populations, vulnerability and risk, and use these variables to analyse the impact of floods as a water security issue. Geospatial experiments uncover major sources of risk from natural climate variability and population growth, with change in climate extremes bearing a minor role. While rural populations display the greatest relative sensitivity to extreme rainfall, urban settings show the highest rates of increasing risk. In the coming decades, rapid urbanization will make South American cities the focal point of future climate threats but also an opportunity for reducing vulnerability, protecting lives and sustaining economic development through both traditional and ecosystem-based disaster risk management systems.
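
    The hazard/exposure/vulnerability framing above is often operationalized as a multiplicative index on gridded layers; the sketch below is a generic illustration with random stand-in grids, not the authors' actual datasets or formulation:

    ```python
    # Generic illustration (not the authors' exact formulation): combine gridded
    # layers into a relative risk index as risk = hazard * exposure * vulnerability.
    import numpy as np

    rng = np.random.default_rng(0)
    shape = (180, 360)                      # hypothetical 1-degree global grid
    hazard = rng.random(shape)              # e.g. normalized extreme-rainfall frequency
    exposure = rng.random(shape)            # e.g. normalized population density
    vulnerability = rng.random(shape)       # e.g. normalized damage-report index

    risk = hazard * exposure * vulnerability
    risk /= risk.max()                      # relative index in [0, 1]

    top = np.quantile(risk, 0.9)            # top decile of relative risk
    print("grid cells in the top risk decile:", int((risk >= top).sum()))
    ```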

  16. Forcings and feedbacks on convection in the 2010 Pakistan flood: Modeling extreme precipitation with interactive large-scale ascent

    Science.gov (United States)

    Nie, Ji; Shaevitz, Daniel A.; Sobel, Adam H.

    2016-09-01

    Extratropical extreme precipitation events are usually associated with large-scale flow disturbances, strong ascent, and large latent heat release. The causal relationships between these factors are often not obvious, however, and the roles of different physical processes in producing an extreme precipitation event can be difficult to disentangle. Here we examine the large-scale forcings and convective heating feedback in the precipitation events that caused the 2010 Pakistan flood, within the column quasi-geostrophic framework. A cloud-resolving model (CRM) is forced with large-scale forcings (other than large-scale vertical motion) computed from the quasi-geostrophic omega equation using input data from a reanalysis data set, and the large-scale vertical motion is diagnosed interactively with the simulated convection. Numerical results show that the positive feedback of convective heating on large-scale dynamics is essential in amplifying the precipitation intensity to the observed values. Orographic lifting is the most important dynamic forcing in both events, while differential potential vorticity advection also contributes to the triggering of the first event. Horizontal moisture advection modulates the extreme events mainly by setting the environmental humidity, which modulates the amplitude of the convection's response to the dynamic forcings. When the CRM is replaced by either a single-column model (SCM) with parameterized convection or a dry model with a reduced effective static stability, the model results show substantial discrepancies compared with reanalysis data. The reasons for these discrepancies are examined, and the implications for global models and theoretical models are discussed.

  17. Ecological recovery in ERA

    DEFF Research Database (Denmark)

    EFSA Scientific Committee (Scientific Committee); Topping, Christopher John

    2016-01-01

    recognises the importance of more integrated ERAs considering both the local and landscape scales, as well as the possible co-occurrence of multiple potential stressors that fall under the remit of EFSA, which are important when addressing ecological recovery. In this scientific opinion, the Scientific...... ecological recovery for any assessed products, and invasive alien species that are harmful for plant health. This framework proposes an integrative approach based on well-defined specific protection goals, scientific knowledge derived by means of experimentation, modelling and monitoring, and the selection...... Committee gathered scientific knowledge on the potential for the recovery of non-target organisms for the further development of ERA. Current EFSA guidance documents and opinions were reviewed on how ecological recovery is addressed in ERA schemes. In addition, this scientific opinion is based on expert...

  18. Moths produce extremely quiet ultrasonic courtship songs by rubbing specialized scales

    DEFF Research Database (Denmark)

    Nakano, Ryo; Skals, Niels; Takanashi, Takuma

    2008-01-01

    Insects have evolved a marked diversity of mechanisms to produce loud conspicuous sounds for efficient communication. However, the risk of eavesdropping by competitors and predators is high. Here, we describe a mechanism for producing extremely low-intensity ultrasonic songs (46 dB sound pressure...

  19. National scale multivariate extreme value modelling of waves, winds and sea levels

    Directory of Open Access Journals (Sweden)

    Gouldby Ben

    2016-01-01

    Full Text Available It has long been recognised that extreme coastal flooding can arise from the joint occurrence of extreme waves, winds and sea levels. The standard simplified joint probability approach used in England and Wales can result in an underestimation of flood risk unless correction factors are applied. This paper describes the application of a state-of-the-art multivariate extreme value model to offshore winds, waves and sea levels around the coast of England. The methodology overcomes the limitations of the traditional method. The output of the new statistical analysis is a Monte Carlo (MC) simulation comprising many thousands of offshore extreme events, and it is necessary to translate all of these events into overtopping rates for use as input to flood risk assessments. It is computationally impractical to transform all of these MC events from the offshore to the nearshore. Computationally efficient statistical emulators of the SWAN wave transformation model have therefore been constructed. The emulators are used to translate the thousands of offshore MC events to the nearshore. Whilst the methodology has been applied for national flood risk assessment, it has the potential to be implemented for wider use, including climate change impact assessment, nearshore wave climates for detailed local assessments and coastal flood forecasting.
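
    The record does not state the emulator form; assuming a Gaussian-process regressor (a common choice for emulating wave-transformation models), a minimal sketch of the train-on-few-runs, predict-on-many-MC-events workflow might look like this, with a synthetic function standing in for SWAN:

    ```python
    # Sketch of a statistical emulator (assumed Gaussian-process form; the record
    # does not specify the emulator type). A small set of expensive model runs is
    # stood in for by a synthetic function; the emulator then maps many Monte
    # Carlo offshore events (Hs, wind speed, sea level) to a nearshore response.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(1)

    def expensive_model(x):                 # placeholder for a SWAN transformation
        hs, wind, level = x.T
        return 0.4 * hs + 0.05 * wind + 0.8 * level + 0.1 * hs * level

    X_train = rng.uniform([1, 5, 0], [10, 30, 3], size=(60, 3))    # few "runs"
    y_train = expensive_model(X_train)

    gp = GaussianProcessRegressor(kernel=RBF([2, 5, 1]) + WhiteKernel(1e-4),
                                  normalize_y=True).fit(X_train, y_train)

    X_mc = rng.uniform([1, 5, 0], [10, 30, 3], size=(100_000, 3))  # MC offshore events
    nearshore, sd = gp.predict(X_mc, return_std=True)              # cheap predictions
    print(nearshore[:3], sd[:3])
    ```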

  20. Evaluation of large-scale meteorological patterns associated with temperature extremes in the NARCCAP regional climate model simulations

    Science.gov (United States)

    Loikith, Paul C.; Waliser, Duane E.; Lee, Huikyo; Neelin, J. David; Lintner, Benjamin R.; McGinnis, Seth; Mearns, Linda O.; Kim, Jinwon

    2015-12-01

    Large-scale meteorological patterns (LSMPs) associated with temperature extremes are evaluated in a suite of regional climate model (RCM) simulations contributing to the North American Regional Climate Change Assessment Program. LSMPs are characterized through composites of surface air temperature, sea level pressure, and 500 hPa geopotential height anomalies concurrent with extreme temperature days. Six of the seventeen RCM simulations are driven by boundary conditions from reanalysis while the other eleven are driven by one of four global climate models (GCMs). Four illustrative case studies are analyzed in detail. Model fidelity in LSMP spatial representation is high for cold winter extremes near Chicago. Winter warm extremes are captured by most RCMs in northern California, with some notable exceptions. Model fidelity is lower for cool summer days near Houston and extreme summer heat events in the Ohio Valley. Physical interpretation of these patterns and identification of well-simulated cases, such as for Chicago, boosts confidence in the ability of these models to simulate days in the tails of the temperature distribution. Results appear consistent with the expectation that the ability of an RCM to reproduce a realistically shaped frequency distribution for temperature, especially at the tails, is related to its fidelity in simulating LSMPs. Each ensemble member is ranked for its ability to reproduce LSMPs associated with observed warm and cold extremes, identifying systematically high performing RCMs and the GCMs that provide superior boundary forcing. The methodology developed here provides a framework for identifying regions where further process-based evaluation would improve the understanding of simulation error and help guide future model improvement and downscaling efforts.
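
    A minimal compositing sketch on synthetic arrays (the general recipe of averaging circulation anomalies over extreme-temperature days; thresholds and fields here are placeholders, not the NARCCAP data):

    ```python
    # Minimal compositing sketch (synthetic arrays): average 500 hPa geopotential
    # height anomalies over days when local temperature exceeds its 95th percentile,
    # one common way to characterize an LSMP.
    import numpy as np

    rng = np.random.default_rng(2)
    ndays, nlat, nlon = 3650, 40, 60
    t2m = rng.normal(size=ndays)                        # daily temperature at a target point
    z500_anom = rng.normal(size=(ndays, nlat, nlon))    # daily height anomalies

    threshold = np.percentile(t2m, 95)
    hot_days = t2m >= threshold                         # boolean mask of extreme days

    lsmp_composite = z500_anom[hot_days].mean(axis=0)   # (nlat, nlon) composite map
    print(lsmp_composite.shape, int(hot_days.sum()), "extreme days")
    ```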

  1. Analysis of clinical characteristics and efficacy of chronic myeloid leukemia onset with extreme thrombocytosis in the era of tyrosine kinase inhibitors

    Directory of Open Access Journals (Sweden)

    Liu Z

    2017-07-01

    Full Text Available Zhihe Liu, Hongqiong Fan, Yuying Li, Chunshui Liu Department of Hematology, Cancer Center, The First Hospital of Jilin University, Changchun, People’s Republic of China Abstract: The aim of this study was to investigate the clinical characteristics and efficacy of chronic myeloid leukemia (CML) onset with extreme thrombocytosis. A total of 121 newly diagnosed and untreated CML patients in chronic phase with complete clinical information from the First Hospital of Jilin University, from January 2010 to December 2014, were retrospectively recruited. Based on the platelet (PLT) count, 22 patients were assigned to the CML with thrombocytosis (CML-T) group (PLT >1,000×10⁹/L) and 65 patients were classified into the CML without extreme thrombocytosis (CML-N) group (PLT ≤1,000×10⁹/L). Fifty-four point five percent of patients in the CML-T group were female, which was higher than that in the CML-N group (27.7%) (P=0.022). Except for gender, there was no significant difference in the clinical information of patients between the two groups. For the Sokal and Hasford scoring systems, the percentage of patients at high risk in the CML-T group was higher than that in the CML-N group, 95.5% vs 52.3% (P=0.000) and 68.2% vs 41.5% (P=0.031), respectively; however, there was no significant difference for the European Treatment and Outcome Study (EUTOS) scoring system between the two groups (P=0.213). In terms of major molecular response (MMR) rate, the percentage of patients with MMR in the CML-T group was lower than that in the CML-N group at 36 months after tyrosine kinase inhibitor therapy (P=0.037). Up until December 2016, the median event-free survival was 21 months in the CML-T group; however, it was not reached in the CML-N group (P=0.027). The majority of CML patients with extreme thrombocytosis were females, and compared to patients in the CML-N group, the percentage of high-risk patients based on the Sokal and Hasford scoring systems was higher in the CML-T group, and the median

  2. Using Discrete Event Simulation for Programming Model Exploration at Extreme-Scale: Macroscale Components for the Structural Simulation Toolkit (SST).

    Energy Technology Data Exchange (ETDEWEB)

    Wilke, Jeremiah J [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Kenny, Joseph P. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States)

    2015-02-01

    Discrete event simulation provides a powerful mechanism for designing and testing new extreme-scale programming models for high-performance computing. Rather than debug, run, and wait for results on an actual system, a design can first iterate through a simulator. This is particularly useful when test beds cannot be used, i.e., to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the Structural Simulation Toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our particular user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics like call graphs to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.
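
    The event-driven execution model described above can be illustrated with a generic discrete-event core; this is not the SST API, just a priority-queue skeleton of the same idea:

    ```python
    # Generic discrete-event core (illustrative; this is not the SST API):
    # events are (time, action) pairs ordered in a priority queue, and handlers
    # may schedule further events, e.g. to model message sends between ranks.
    import heapq

    class Simulator:
        def __init__(self):
            self.now, self._queue, self._seq = 0.0, [], 0

        def schedule(self, delay, action):
            self._seq += 1                       # tie-breaker keeps ordering stable
            heapq.heappush(self._queue, (self.now + delay, self._seq, action))

        def run(self, until=float("inf")):
            while self._queue and self._queue[0][0] <= until:
                self.now, _, action = heapq.heappop(self._queue)
                action()

    sim = Simulator()

    def send(src, dst, latency=1.5e-6):
        sim.schedule(latency, lambda: print(f"t={sim.now:.2e}s: rank {dst} got msg from {src}"))

    sim.schedule(0.0, lambda: send(0, 1))
    sim.run()
    ```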

  3. Contribution of large-scale circulation anomalies to changes in extreme precipitation frequency in the United States

    Science.gov (United States)

    Yu, Lejiang; Zhong, Shiyuan; Pei, Lisi; Bian, Xindi; Heilman, Warren E.

    2016-04-01

    The mean global climate has warmed as a result of the increasing emission of greenhouse gases induced by human activities. This warming is considered the main reason for the increasing number of extreme precipitation events in the US. While much attention has been given to extreme precipitation events occurring over several days, which are usually responsible for severe flooding over a large region, little is known about how extreme precipitation events that cause flash flooding and occur at sub-daily time scales have changed over time. Here we use the observed hourly precipitation from the North American Land Data Assimilation System Phase 2 forcing datasets to determine trends in the frequency of extreme precipitation events of short (1 h, 3 h, 6 h, 12 h and 24 h) duration for the period 1979-2013. The results indicate an increasing trend in the central and eastern US. Over most of the western US, especially the Southwest and the Intermountain West, the trends are generally negative. These trends can be largely explained by the interdecadal variability of the Pacific Decadal Oscillation and Atlantic Multidecadal Oscillation (AMO), with the AMO making a greater contribution to the trends in both warm and cold seasons.
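
    One simple, generic way to quantify such frequency trends (synthetic data, not the NLDAS-2 analysis itself) is to count annual exceedances of a fixed long-term percentile and fit a linear trend:

    ```python
    # Sketch (synthetic data): count hourly exceedances of a long-term 99th
    # percentile in each year and fit a linear trend to the annual counts.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    years = np.arange(1979, 2014)
    hourly = rng.gamma(shape=0.3, scale=2.0, size=(years.size, 8760))  # synthetic mm/h

    p99 = np.percentile(hourly, 99)                  # fixed long-term threshold
    annual_counts = (hourly > p99).sum(axis=1)       # exceedances per year

    res = stats.linregress(years, annual_counts)
    print(f"trend = {res.slope:+.2f} exceedances per year (p = {res.pvalue:.3f})")
    ```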

  4. Statistical methods for the analysis of rotation measure grids in large scale structures in the SKA era

    CERN Document Server

    Vacca, Valentina; Ensslin, Torsten; Selig, Marco; Junklewitz, Henrik; Greiner, Maksim; Jasche, Jens; Hales, Christopher; Reinecke, Martin; Carretti, Ettore; Feretti, Luigina; Ferrari, Chiara; Giovannini, Gabriele; Govoni, Federica; Horellou, Cathy; Ideguchi, Shinsuke; Johnston-Hollitt, Melanie; Murgia, Matteo; Paladino, Rosita; Pizzo, Roberto Francesco; Scaife, Anna

    2015-01-01

    To better understand the origin and properties of cosmological magnetic fields, a detailed knowledge of magnetic fields in the large-scale structure of the Universe (galaxy clusters, filaments) is crucial. We propose a new statistical approach to study magnetic fields on large scales with the rotation measure grid data that will be obtained with the new generation of radio interferometers.

  5. Rain Characteristics and Large-Scale Environments of Precipitation Objects with Extreme Rain Volumes from TRMM Observations

    Science.gov (United States)

    Zhou, Yaping; Lau, William K M.; Liu, Chuntao

    2013-01-01

    This study adopts a "precipitation object" approach by using 14 years of Tropical Rainfall Measuring Mission (TRMM) Precipitation Feature (PF) and National Centers for Environmental Prediction (NCEP) reanalysis data to study rainfall structure and environmental factors associated with extreme heavy rain events. Characteristics of instantaneous extreme volumetric PFs are examined and compared to those of intermediate and small systems. It is found that instantaneous PFs exhibit a much wider scale range compared to the daily gridded precipitation accumulation range. The top 1% of the rainiest PFs contribute over 55% of total rainfall and have rain volumes 2 orders of magnitude greater than those of the median PFs. We find a threshold near the top 10% beyond which the PFs grow exponentially into larger, deeper, and colder rain systems. NCEP reanalyses show that midlevel relative humidity and total precipitable water increase steadily with increasingly larger PFs, along with a rapid increase of 500 hPa upward vertical velocity beyond the top 10%. This provides the necessary moisture convergence to amplify and sustain the extreme events. The rapid increase in vertical motion is associated with the release of convective available potential energy (CAPE) in mature systems, as is evident in the increase in CAPE of PFs up to the top 10% and the subsequent dropoff. The study illustrates distinct stages in the development of an extreme rainfall event, including: (1) a systematic buildup in large-scale temperature and moisture, (2) a rapid change in rain structure, (3) explosive growth of the PF size, and (4) a release of CAPE before the demise of the event.
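
    The heavy-tailed contribution quoted above (the top 1% of PFs supplying over half the rain) can be reproduced in spirit on synthetic, lognormally distributed rain volumes; the distribution parameters below are arbitrary:

    ```python
    # Sketch (synthetic rain volumes): fraction of total rainfall contributed by
    # the top 1% of precipitation features, mirroring the ~55% figure quoted above.
    import numpy as np

    rng = np.random.default_rng(4)
    rain_volume = rng.lognormal(mean=2.0, sigma=2.0, size=1_000_000)  # per-PF volumes

    cutoff = np.percentile(rain_volume, 99)
    top_share = rain_volume[rain_volume >= cutoff].sum() / rain_volume.sum()
    print(f"top 1% of PFs contribute {100 * top_share:.1f}% of total rain volume")
    ```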

  6. Estimation of Dynamic VaR in Chinese Stock Markets Based on Time Scale and Extreme Value Theory

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The accuracy and time scale invariance of value-at-risk (VaR) measurement methods for different stock indices and at different confidence levels are tested. Extreme value theory (EVT) is applied to model the extreme tail of the standardized residual series of daily/weekly index losses, and parametric and nonparametric methods are used to estimate the parameters of the generalized Pareto distribution (GPD) and dynamic VaR for indices of three stock markets in China. The accuracy and time scale invariance of the risk measurement methods are also examined through a back-testing approach. Results show that not all the indices exhibit time scale invariance; there are some differences in accuracy between different indices at various confidence levels. The most powerful dynamic VaR estimation methods are EVT-GJR-Hill at the 97.5% level for weekly losses on the Shanghai stock market, and EVT-GARCH-MLE (Hill) at the 99.0% level for weekly losses on the Taiwan and Hong Kong stock markets, respectively.
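
    A minimal peaks-over-threshold sketch of the EVT-GPD ingredient (synthetic losses; the paper additionally applies GARCH/GJR filtering and Hill-type estimators, which are omitted here):

    ```python
    # Sketch of peaks-over-threshold VaR with a generalized Pareto tail
    # (illustrative; real applications first filter returns with a GARCH model).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    losses = stats.t.rvs(df=4, size=5000, random_state=rng)   # synthetic daily losses

    u = np.percentile(losses, 95)                              # tail threshold
    excesses = losses[losses > u] - u
    xi, loc, beta = stats.genpareto.fit(excesses, floc=0)      # GPD shape and scale

    def var(alpha, n=losses.size, n_u=excesses.size):
        # Standard POT formula: VaR = u + (beta/xi) * (((n/n_u)*(1-alpha))**(-xi) - 1)
        return u + beta / xi * (((n / n_u) * (1 - alpha)) ** (-xi) - 1)

    print(f"VaR 99% = {var(0.99):.3f}, VaR 97.5% = {var(0.975):.3f}")
    ```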

  7. Extreme ultra-violet movie camera for imaging microsecond time scale magnetic reconnection.

    Science.gov (United States)

    Chai, Kil-Byoung; Bellan, Paul M

    2013-12-01

    An ultra-fast extreme ultra-violet (EUV) movie camera has been developed for imaging magnetic reconnection in the Caltech spheromak/astrophysical jet experiment. The camera consists of a broadband Mo:Si multilayer mirror, a fast-decaying YAG:Ce scintillator, a visible light block, and a high-speed visible light CCD camera. The camera can capture EUV images as fast as 3.3 × 10⁶ frames per second with 0.5 cm spatial resolution. The spectral range is from 20 eV to 60 eV. EUV images reveal strong, transient, highly localized bursts of EUV radiation when magnetic reconnection occurs.

  8. Characteristics of the Eysenck Personality Questionnaire Lie Scale and of Extreme Lie Scorers.

    Science.gov (United States)

    Loo, Robert

    1980-01-01

    Results of statistical analyses suggest that high lie-scorers respond honestly, and that the Lie Scale for the Eysenck Personality Inventory may reflect a personality dimension of interest rather than an extraneous and undesirable factor to be eliminated. (Author)

  9. Nueva Era

    Directory of Open Access Journals (Sweden)

    Alma Mancilla

    2004-01-01

    Full Text Available The article addresses the problem of interpreting new religions through the case study of the New Age (Nueva Era). It first takes up the question of how to define the New Age and the need to take into account the different points of view of the various actors who contribute to its construction as an object of sociological analysis. It then deals with the problems of fieldwork, touching on several points of particular interest for the analysis of religious experience: subjectivity and reflexivity in ethnographic analysis, the stance of methodological atheism, and epistemological exclusion with respect to informants. Finally, it presents some of the problems to be overcome in order to carry out a quantitative analysis of the New Age.

  10. On the Linkage between the Extreme Drought and Pluvial Patterns in China and the Large-Scale Atmospheric Circulation

    Directory of Open Access Journals (Sweden)

    Zengxin Zhang

    2016-01-01

    Full Text Available China is a nation that is affected by a multitude of natural disasters, including droughts and floods. In this paper, the variations of extreme drought and pluvial patterns and their relations to the large-scale atmospheric circulation have been analyzed based on monthly precipitation data from 483 stations during the period 1958–2010 in China. The results show the following: (1) extreme drought and pluvial events in China increased significantly during that period. During the 1959–1966 timeframe, more droughts occurred in South China and more pluvial events were found in North China (DSC-PNC pattern); for the period 1997–2003 (PSC-DNC pattern), the situation is the opposite. (2) There are good relationships among the extreme drought and pluvial events and the Western Pacific Subtropical High, meridional atmospheric moisture flux, atmospheric moisture content, and summer precipitation. (3) A cyclonic atmospheric circulation anomaly occurs in North China, followed by an obvious negative height anomaly and a southerly wind anomaly at 850 hPa and 500 hPa for the DSC-PNC pattern during the summer, and a massive ascending airflow from South China extends to North China at ~50°N. As for the PSC-DNC pattern, the situation contrasts sharply with the DSC-PNC pattern.

  11. Extreme-Scale Stochastic Particle Tracing for Uncertain Unsteady Flow Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Hanqi; He, Wenbin; Seo, Sangmin; Shen, Han-Wei; Peterka, Tom

    2016-11-13

    We present an efficient and scalable solution to estimate uncertain transport behaviors using stochastic flow maps (SFMs) for visualizing and analyzing uncertain unsteady flows. SFM computation is extremely expensive because it requires many Monte Carlo runs to trace densely seeded particles in the flow. We alleviate the computational cost by decoupling the time dependencies in SFMs so that we can process adjacent time steps independently and then compose them together for longer time periods. Adaptive refinement is also used to reduce the number of runs for each location. We then parallelize over tasks—packets of particles in our design—to achieve high efficiency in MPI/thread hybrid programming. Such a task model also enables CPU/GPU coprocessing. We show the scalability on two supercomputers, Mira (up to 1M Blue Gene/Q cores) and Titan (up to 128K Opteron cores and 8K GPUs), that can trace billions of particles in seconds.

  12. Renormalization-group theory for finite-size scaling in extreme statistics.

    Science.gov (United States)

    Györgyi, G; Moloney, N R; Ozogány, K; Rácz, Z; Droz, M

    2010-04-01

    We present a renormalization-group (RG) approach to explain universal features of extreme statistics applied here to independent identically distributed variables. The outlines of the theory have been described in a previous paper, the main result being that finite-size shape corrections to the limit distribution can be obtained from a linearization of the RG transformation near a fixed point, leading to the computation of stable perturbations as eigenfunctions. Here we show details of the RG theory which exhibit remarkable similarities to the RG known in statistical physics. Besides the fixed points explaining universality, and the least stable eigendirections accounting for convergence rates and shape corrections, the similarities include marginally stable perturbations which turn out to be generic for the Fisher-Tippett-Gumbel class. Distribution functions containing unstable perturbations are also considered. We find that, after a transitory divergence, they return to the universal fixed line at the same or at a different point depending on the type of perturbation.
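
    For reference, the Fisher-Tippett-Gumbel limit referred to above, together with the generic form of a finite-size shape correction controlled by the least stable eigendirection (standard EVT/RG facts, not quoted from the paper):

    ```latex
    % Fisher-Tippett-Gumbel limit for the rescaled maximum M_N of N i.i.d.
    % variables, and the generic form of a finite-size shape correction
    % (coefficients and rates are problem dependent):
    \[
      P_N\!\left(\frac{M_N - a_N}{b_N} \le x\right)
      \;\xrightarrow[N\to\infty]{}\; G(x) = \exp\!\left(-e^{-x}\right),
      \qquad
      P_N(x) \;\simeq\; G(x) + \epsilon_N\,\psi_1(x),
    \]
    % where \psi_1 is the least stable eigenfunction of the linearized RG
    % transformation and the decay of \epsilon_N sets the convergence rate.
    ```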

  13. Scaling Relations of Lognormal Type Growth Process with an Extremal Principle of Entropy

    Directory of Open Access Journals (Sweden)

    Zi-Niu Wu

    2017-01-01

    Full Text Available The scale, inflexion point and maximum point are important scaling parameters for studying growth phenomena with a size following the lognormal function. The width of the size function and its entropy depend on the scale parameter (or the standard deviation) and measure the relative importance of production and dissipation involved in the growth process. The Shannon entropy increases monotonically with the scale parameter, but the slope has a minimum at √6/6. This value has been used previously to study the spreading of sprays and epidemic cases. In this paper, this approach of minimizing the entropy slope is discussed in a broader sense and applied to obtain the relationship between the inflexion point and the maximum point. It is shown that this relationship is determined by the base of the natural logarithm, e ≈ 2.718, and exhibits some geometrical similarity to the minimal surface energy principle. Known data from a number of problems, including the swirling rate of the bathtub vortex, droplet splashing, population growth, the distribution of strokes in Chinese characters and the velocity profile of a turbulent jet, are used to assess to what extent the approach of minimizing the entropy slope can be regarded as useful.
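
    For context, the lognormal size function and its differential (Shannon) entropy in the usual parameterization, which indeed grows monotonically with the scale parameter; the paper's exact normalization may differ:

    ```latex
    % Lognormal density and its differential entropy (standard results):
    \[
      f(x) \;=\; \frac{1}{x\,\sigma\sqrt{2\pi}}
      \exp\!\left[-\frac{(\ln x-\mu)^2}{2\sigma^2}\right],
      \qquad
      H[f] \;=\; \mu + \tfrac{1}{2}\ln\!\left(2\pi e\,\sigma^{2}\right),
    \]
    % so H increases monotonically with the scale parameter \sigma, as stated above.
    ```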

  14. Wavelength-Scale Structures as Extremely High Haze Films for Efficient Polymer Solar Cells.

    Science.gov (United States)

    Ham, Juyoung; Dong, Wan Jae; Jung, Gwan Ho; Lee, Jong-Lam

    2016-03-01

    Wavelength-scale inverted pyramid structures with low reflectance and excellent haze have been designed for application to polymer solar cells (PSCs). The wavelength-scale structured haze films are fabricated on the back surface of the glass, without damage to the organic active layer, by using a soft lithographic technique with etched GaN molds. With rigorous coupled-wave analysis optical modeling, we find a shift of the resonance peaks as the pattern diameter increases. Wavelength-scale structures can provide a number of resonances in the long-wavelength part of the spectrum (λ = 650-800 nm), yielding an enhancement of the power conversion efficiency (PCE) of the PSCs. Compared with a flat device (PCE = 7.12%, Jsc = 15.6 mA/cm²), an improved PCE of 8.41% is achieved with the haze film, which is mainly due to the increased short-circuit current density (Jsc) of 17.5 mA/cm². Hence, it opens up exciting opportunities for a variety of PSCs with wavelength-scale structures to further improve performance, simplify complicated processes, and reduce costs.

  15. Extreme maximum temperature events and their relationships with large-scale modes: potential hazard on the Iberian Peninsula

    Science.gov (United States)

    Merino, Andrés; Martín, M. L.; Fernández-González, S.; Sánchez, J. L.; Valero, F.

    2017-07-01

    The aim of this paper is to analyze the spatiotemporal distribution of maximum temperatures in the Iberian Peninsula (IP) by using various extreme maximum temperature indices. Thresholds for determining temperature extreme event (TEE) severity are defined using the 99th percentiles of daily temperature time series for the period 1948 to 2009. The synoptic-scale fields of such events were analyzed in order to better understand the related atmospheric processes. The results indicate that the regions with a higher risk of maximum temperatures are located in the river valleys of the southwest and northeast of the IP, while the Cantabrian coast and mountain ranges are characterized by lower risk. The TEEs were classified, by means of several synoptic fields (sea level pressure, temperature, and geopotential height at 850 and 500 hPa), into four clusters that largely explain their spatiotemporal distribution over the IP. The results of this study show that TEEs mainly occur in association with a ridge extending from subtropical areas. The relationships of TEEs with teleconnection patterns, such as the North Atlantic Oscillation (NAO), Western Mediterranean Oscillation (WeMO), and Mediterranean Oscillation (MO), show that the interannual variability of extreme maximum temperatures is largely controlled by the dominant phase of the WeMO in all seasons except wintertime, when the NAO prevails. Results related to the MO pattern show less relevance to maximum temperature variability. The correct identification of the synoptic patterns linked with the most extreme temperature events associated with each cluster will assist the prediction of events that can pose a natural hazard, thereby providing useful information for decision making and warning systems.

  16. Open Problems in Network-aware Data Management in Exa-scale Computing and Terabit Networking Era

    Energy Technology Data Exchange (ETDEWEB)

    Balman, Mehmet; Byna, Surendra

    2011-12-06

    Accessing and managing large amounts of data is a great challenge in collaborative computing environments where resources and users are geographically distributed. Recent advances in network technology led to next-generation high-performance networks, allowing high-bandwidth connectivity. Efficient use of the network infrastructure is necessary in order to address the increasing data and compute requirements of large-scale applications. We discuss several open problems, evaluate emerging trends, and articulate our perspectives in network-aware data management.

  17. Consistent first-principles pressure scales for diffraction experiments under extreme conditions

    Science.gov (United States)

    Otero-de-la-Roza, Alberto; Luaña Cabal, Víctor

    2012-02-01

    Diamond anvil cell (DAC) diffraction experiments are fundamental in geophysics and materials science for exploring the behavior of solids under very high pressures and temperatures. A factor limiting the accuracy of DAC experiments is the lack of an accurate pressure scale for the calibration materials that extends to the ever-increasing pressure and temperature limits of the technique. In this communication, we address this problem by applying a newly developed technique that allows the calculation of accurate thermodynamic properties from first-principles calculations [Phys. Rev. B 84 (2011) 024109, 84 (2011) 184103]. Three elements are key in this method: i) the quasiharmonic approximation (QHA) and the static energies and phonon frequencies obtained from an electronic structure calculation; ii) the appropriate representation of the equation of state by using averages of strain polynomials; and iii) the correction of the systematic errors caused by the exchange-correlation functional approximation. As a result, we propose accurate pressure scales for typical calibrants that can be used over the whole experimental range of pressures and temperatures. The internal consistency and the agreement with the ruby scale based on experimental data are examined.
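
    For reference, the standard quasiharmonic expressions underlying such pressure scales (textbook form, not quoted from the record):

    ```latex
    % Quasiharmonic Helmholtz free energy and the derived thermal pressure;
    % E_sta(V) is the static energy and \omega_i(V) the phonon frequencies.
    \[
      F(V,T) \;=\; E_{\mathrm{sta}}(V)
      + \sum_i \left[\frac{\hbar\omega_i(V)}{2}
      + k_B T \,\ln\!\left(1 - e^{-\hbar\omega_i(V)/k_B T}\right)\right],
      \qquad
      p(V,T) \;=\; -\left(\frac{\partial F}{\partial V}\right)_{T}.
    \]
    ```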

  18. Detecting Silent Data Corruption for Extreme-Scale Applications through Data Mining

    Energy Technology Data Exchange (ETDEWEB)

    Bautista-Gomez, Leonardo [Argonne National Lab. (ANL), Argonne, IL (United States); Cappello, Franck [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-01-16

    Supercomputers allow scientists to study natural phenomena by means of computer simulations. Next-generation machines are expected to have more components and, at the same time, consume several times less energy per operation. These trends are pushing supercomputer construction to the limits of miniaturization and energy-saving strategies. Consequently, the number of soft errors is expected to increase dramatically in the coming years. While mechanisms are in place to correct or at least detect some soft errors, a significant percentage of those errors pass unnoticed by the hardware. Such silent errors are extremely damaging because they can make applications silently produce wrong results. In this work we propose a technique that leverages certain properties of high-performance computing applications in order to detect silent errors at the application level. Our technique detects corruption solely based on the behavior of the application datasets and is completely application-agnostic. We propose multiple corruption detectors, and we couple them to work together in a fashion transparent to the user. We demonstrate that this strategy can detect the majority of the corruptions, while incurring negligible overhead. We show that with the help of these detectors, applications can have up to 80% coverage against data corruption.
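
    An illustrative detector in the spirit described above (not the authors' code): predict each value of a smoothly evolving dataset by temporal extrapolation and flag deviations outside an adaptive, data-driven bound:

    ```python
    # Illustrative silent-data-corruption check: linear extrapolation in time plus
    # an adaptive residual bound; the injected spike stands in for a bit flip.
    import numpy as np

    def detect_sdc(prev2, prev1, current, k=5.0):
        """Return a boolean mask of points whose jump is suspiciously large."""
        predicted = 2.0 * prev1 - prev2                 # linear extrapolation in time
        residual = np.abs(current - predicted)
        bound = k * (np.median(residual) + 1e-12)       # adaptive, data-driven threshold
        return residual > bound

    rng = np.random.default_rng(6)
    field_t0 = rng.normal(size=10_000)
    field_t1 = field_t0 + 0.01
    field_t2 = field_t1 + 0.01
    field_t2[1234] += 10.0                              # inject a corruption

    flags = detect_sdc(field_t0, field_t1, field_t2)
    print("flagged indices:", np.flatnonzero(flags))
    ```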

  19. Design considerations of 10 kW-scale, extreme ultraviolet SASE FEL for lithography

    CERN Document Server

    Pagani, C; Schneidmiller, E A; Yurkov, M V

    2001-01-01

    The semiconductor industry's growth is driven to a large extent by steady advancements in microlithography. According to the newly updated industry road map, the 70 nm generation is anticipated to be available in the year 2008. However, the path to get there is not clear. The problem of constructing extreme ultraviolet (EUV) quantum lasers for lithography is still unsolved: progress in this field is rather moderate and we cannot expect a significant breakthrough in the near future. Nevertheless, there is a clear path for optical lithography to take us to sub-100 nm dimensions. Theoretical and experimental work in Self-Amplified Spontaneous Emission (SASE) Free Electron Laser (FEL) physics and the physics of superconducting linear accelerators over the last 10 years has pointed to the possibility of generating high-power optical beams with laser-like characteristics in the EUV spectral range. Recently, there have been important advances in demonstrating a high-gain SASE FEL at 100 nm wavelength (J. Andr...

  20. The new political scales of citizenship in a global era: The politics of hydroelectric development in the James Bay Region

    Science.gov (United States)

    Rousseau, Jean

    2000-10-01

    In this dissertation, I examine the current reconfiguration of citizenship in Western societies in a period marked by the expansion of globalisation. Specifically, I focus on the spatial implications of globalisation via a comparative analysis of the politics of hydroelectric development in the James Bay region of Quebec, Canada. I discuss the struggles surrounding its two main phases: the construction of the La Grande complex (1970--1975) and the aborted launching of the Great Whale project (1988--1994). I pay close attention to the mobilisation of the Cree people and ecology groups, who launched successive campaigns to prevent the construction of both projects, reflecting what I depict as a process of jumping scales. Whereas their campaigns were unsuccessful during the first phase, they became much more potent during the second phase. This transition reveals an important remapping of citizenship politics through which its national and territorial scales are being unbundled and no longer exclusively restricted to national state borders. As a result, citizenship politics becomes embedded in various locales---from the local to the emerging global. Each represents a political scene in which citizens can make claims and participate in debates around the recognition of their rights. This dissertation contributes to the debates on citizenship and globalisation. It suggests that globalisation not only imposes new constraints on citizenship policies from above, but also creates new spaces in which citizenship is reframed or, rather, rescaled. This dissertation also provides insights into the implications of globalisation. It illustrates how the globalisation of politics results from non-state actors, reflecting what is called "globalisation from below". This reveals contradictory forms of globalisation and, implicitly, different political projects.

  1. Dissipative Particle Dynamics Simulations at Extreme Scale: GPU Algorithms, Implementation and Applications

    Science.gov (United States)

    Tang, Yu-Hang; Karniadakis, George; Crunch Team

    2014-03-01

    We present a scalable dissipative particle dynamics simulation code, fully implemented on Graphics Processing Units (GPUs) using a hybrid CUDA/MPI programming model, which achieves 10-30 times speedup on a single GPU over 16 CPU cores and almost linear weak scaling across a thousand nodes. A unified framework is developed within which the efficient generation of the neighbor list and the maintenance of particle data locality are addressed. Our algorithm generates strictly ordered neighbor lists in parallel, while the construction is deterministic and makes no use of atomic operations or sorting. Such a neighbor list leads to optimal data loading efficiency when combined with a two-level particle reordering scheme. A faster in situ generation scheme for Gaussian random numbers is proposed using precomputed binary signatures. We designed custom transcendental functions that are fast and accurate for evaluating the pairwise interaction. Computer benchmarks demonstrate the speedup of our implementation over the CPU implementation as well as strong and weak scalability. A large-scale simulation of spontaneous vesicle formation consisting of 128 million particles was conducted to illustrate the practicality of our code in real-world applications. This work was supported by the new Department of Energy Collaboratory on Mathematics for Mesoscopic Modeling of Materials (CM4). Simulations were carried out at the Oak Ridge Leadership Computing Facility through the INCITE program under project BIP017.

  2. Risk prediction of Critical Infrastructures against extreme natural hazards: local and regional scale analysis

    Science.gov (United States)

    Rosato, Vittorio; Hounjet, Micheline; Burzel, Andreas; Di Pietro, Antonio; Tofani, Alberto; Pollino, Maurizio; Giovinazzi, Sonia

    2016-04-01

    Natural hazard events can induce severe impacts on the built environment; they can hit wide and densely populated areas, where there are large numbers of (inter)dependent technological systems whose damage could cause the failure or malfunctioning of further services, spreading the impacts over wider geographical areas. The EU project CIPRNet (Critical Infrastructures Preparedness and Resilience Research Network) is realizing an unprecedented Decision Support System (DSS) which enables operational risk prediction for Critical Infrastructures (CI) by predicting the occurrence of natural events (from long-term weather forecasts to short-term nowcasts), correlating intrinsic vulnerabilities of CI elements with the strengths of the different events, and analysing the resulting Damage Scenario. The Damage Scenario is then transformed into an Impact Scenario, where punctual CI element damages are transformed into micro (local area) or meso (regional) scale service outages. At the smaller scale, the DSS simulates detailed city models (where CI dependencies are explicitly accounted for) that provide important input for crisis management organizations, whereas at the regional scale, by using an approximate system-of-systems model describing systemic interactions, the focus is on raising awareness. The DSS has allowed the development of a novel simulation framework for predicting the shake maps originating from a given seismic event, considering the shock wave propagation in inhomogeneous media and the subsequently produced damage by estimating building vulnerabilities on the basis of a phenomenological model [1, 2]. Moreover, for areas containing river basins, when abundant precipitation is expected, the DSS solves 1D/2D hydrodynamic models of the river basins to predict the runoff and the corresponding flood dynamics. This calculation allows the estimation of the Damage Scenario and triggers the evaluation of the Impact Scenario

  3. HPC Colony II: FAST_OS II: Operating Systems and Runtime Systems at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Moreira, Jose [IBM, Armonk, NY (United States)

    2013-11-13

    HPC Colony II has been a 36-month project focused on providing portable performance for leadership class machines—a task made difficult by the emerging variety of more complex computer architectures. The project attempts to move the burden of portable performance to adaptive system software, thereby allowing domain scientists to concentrate on their field rather than the fine details of a new leadership class machine. To accomplish our goals, we focused on adding intelligence into the system software stack. Our revised components include: new techniques to address OS jitter; new techniques to dynamically address load imbalances; new techniques to map resources according to architectural subtleties and application dynamic behavior; new techniques to dramatically improve the performance of checkpoint-restart; and new techniques to address membership service issues at scale.

  4. West Africa Extreme Rainfall Events and Large-Scale Ocean Surface and Atmospheric Conditions in the Tropical Atlantic

    Directory of Open Access Journals (Sweden)

    S. Ta

    2016-01-01

    Full Text Available Based on daily precipitation from the Global Precipitation Climatology Project (GPCP) data during April–October of the 1997–2014 period, the daily extreme rainfall trends and variability over West Africa are characterized using the 90th-percentile threshold at each grid point. The contribution of the extreme rainfall amount reaches ~50–90% in the northern region while it is ~30–50% in the south. The yearly cumulated extreme rainfall amount shows significant negative trends in the 6°N–12°N, 17°W–10°W and 4°N–7°N, 6°E–10°E domains, while the number of extreme days exhibits nonsignificant trends over West Africa. The empirical orthogonal functions performed on the standardized anomalies show four variability modes: one covering all of West Africa with a focus on the Sahelian region, an eastern mode including the south of Nigeria, a western mode including Guinea, Sierra Leone, Liberia, and Guinea-Bissau, and finally a small region on the coast of Ghana and Togo. These four modes are influenced differently by the large-scale ocean surface and atmospheric conditions in the tropical Atlantic. The results are applicable to planning for the risks associated with these climate hazards, particularly in water resource management and civil defense.

  5. Coupled large-eddy simulation and morphodynamics of a large-scale river under extreme flood conditions

    Science.gov (United States)

    Khosronejad, Ali; Sotiropoulos, Fotis; Stony Brook University Team

    2016-11-01

    We present coupled flow and morphodynamic simulations of extreme flooding in a 3 km long and 300 m wide reach of the Mississippi River in Minnesota, which includes three islands and hydraulic structures. We employ the large-eddy simulation (LES) and bed-morphodynamics modules of the VFS-Geophysics model to investigate the flow and bed evolution of the river during a 500-year flood. The coupling of the two modules is carried out via a fluid-structure interaction approach, using a nested domain to enhance the resolution of bridge scour predictions. The geometrical data of the river, islands and structures are obtained from LiDAR, sub-aqueous sonar and in-situ surveying to construct a digital map of the river bathymetry. Our simulation results for the bed evolution of the river reveal complex sediment dynamics near the hydraulic structures. The numerically captured scour depth near some of the structures reaches a maximum of about 10 m. The data-driven simulation strategy we present in this work exemplifies a practical simulation-based-engineering approach to investigate the resilience of infrastructures to extreme flood events in intricate field-scale riverine systems. This work was funded by a grant from the Minnesota Dept. of Transportation.

  6. Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.

    2014-12-01

    The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and is the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA. In fact, international research projects account for 12% of the INCITE awards in 2014. The INCITE scientific review panel also includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009); and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU-funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current-generation petascale-capable simulation codes towards the performance levels required for running on future exascale systems. One of the techniques pursued by ECMWF is to use Fortran 2008 coarrays to overlap computations and communications and

  7. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem. Mid-year report FY16 Q2

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth D.; Sewell, Christopher (LANL); Childs, Hank (U of Oregon); Ma, Kwan-Liu (UC Davis); Geveci, Berk (Kitware); Meredith, Jeremy (ORNL)

    2016-05-01

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  8. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY15 Q4.

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sewell, Christopher [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Childs, Hank [Univ. of Oregon, Eugene, OR (United States); Ma, Kwan-Liu [Univ. of California, Davis, CA (United States); Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States); Meredith, Jeremy [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-01

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  9. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Mid-year report FY17 Q2

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pugmire, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Rogers, David [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Childs, Hank [Univ. of Oregon, Eugene, OR (United States); Ma, Kwan-Liu [Univ. of California, Davis, CA (United States); Geveci, Berk [Kitware Inc., Clifton Park, NY (United States)

    2017-05-01

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  10. Physicochemical characterization of smoke aerosol during large-scale wildfires: Extreme event of August 2010 in Moscow

    Science.gov (United States)

    Popovicheva, O.; Kistler, M.; Kireeva, E.; Persiantseva, N.; Timofeev, M.; Kopeikin, V.; Kasper-Giebl, A.

    2014-10-01

    Enhancement of biomass burning-related research is essential for assessing the impact of large-scale wildfires on pollution at regional and global scales. Starting on 6 August 2010, Moscow was covered with thick smoke with unusually high PM10 and BC concentrations, strongly affected by the huge forest and peat fires around the megacity. This work presents the first comprehensive physico-chemical characterization of aerosols during the extreme smoke event in Moscow in August 2010. Sampling was performed in the Moscow center and suburbs, as well as one year later, in August 2011, during a period when no biomass burning was observed. Small-scale experimental fires of regional biomass were conducted in the Moscow region. Carbon content, functionalities of organic/inorganic compounds, tracers of biomass burning (anhydrosaccharides), ionic composition, and the structure of smoke were analyzed by thermal-optical analysis, FTIR spectroscopy, liquid and ion chromatography, and electron microscopy. Carbonaceous aerosol in August 2010 was dominated by organic species, with elemental carbon (EC) as a minor component. A high average OC/EC ratio near 27.4 was found, comparable to smoke from smoldering fires of regional biomass and three times the value observed in August 2011. The organic functionalities of the Moscow smoke aerosols were hydroxyl, aliphatic, aromatic, acid and non-acid carbonyl, and nitro compound groups, almost all of which indicate the wildfires around the city as the source of the smoke. The ratio of levoglucosan (LG) to mannosan near 5 confirms the origin of the smoke in coniferous forest fires around the megacity. A low LG/OC ratio near 0.8% indicates degradation of this major molecular tracer of biomass burning in the urban environment. The total concentration of inorganic ions, dominated by sulfate (SO42-) and ammonium (NH4+), was about 5 times higher during the large-scale wildfires than in August 2011. Together with strong sulfate and ammonium absorbance in smoke aerosols, these observations prove the formation of

  11. Final Technical Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    Energy Technology Data Exchange (ETDEWEB)

    Knio, Omar M. [Duke Univ., Durham, NC (United States). Dept. of Mechanical Engineering and Materials Science

    2017-06-06

    QUEST is a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, University of Southern California, Massachusetts Institute of Technology, University of Texas at Austin, and Duke University. The mission of QUEST is to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The Duke effort focused on the development of algorithms and utility software for non-intrusive sparse UQ representations, and on participation in the organization of annual workshops and tutorials to disseminate UQ tools to the community, and to gather input in order to adapt approaches to the needs of SciDAC customers. In particular, fundamental developments were made in (a) multiscale stochastic preconditioners, (b) gradient-based approaches to inverse problems, (c) adaptive pseudo-spectral approximations, (d) stochastic limit cycles, and (e) sensitivity analysis tools for noisy systems. In addition, large-scale demonstrations were performed, namely in the context of ocean general circulation models.
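
    As a concrete illustration of the kind of non-intrusive spectral UQ machinery mentioned above, the following minimal Python sketch performs a one-dimensional pseudo-spectral polynomial chaos projection with Gauss-Hermite quadrature. It is not QUEST software; the scalar `model` and all parameter choices are illustrative assumptions.

```python
# Minimal sketch of a 1-D non-intrusive pseudo-spectral projection
# (Hermite polynomial chaos for a Gaussian input), illustrating the kind of
# UQ building block described above. Assumptions: the model is the simple
# scalar function passed in; QUEST's actual tools are not used.
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, sqrt, pi

def pce_coefficients(model, order, n_quad=20):
    """Project model(xi), xi ~ N(0,1), onto probabilists' Hermite polynomials."""
    x, w = He.hermegauss(n_quad)               # nodes/weights for weight exp(-x^2/2)
    fx = model(x)
    coeffs = []
    for k in range(order + 1):
        basis = He.hermeval(x, [0] * k + [1])  # He_k evaluated at the nodes
        norm = sqrt(2 * pi) * factorial(k)     # integral of He_k^2 exp(-x^2/2)
        coeffs.append(np.sum(w * fx * basis) / norm)
    return np.array(coeffs)

# Toy model: exponential of a standard normal input.
coeffs = pce_coefficients(np.exp, order=6)
mean_pce = coeffs[0]
var_pce = sum(coeffs[k] ** 2 * factorial(k) for k in range(1, 7))
print(mean_pce, var_pce)   # close to exp(1/2) and exp(2) - exp(1), the exact moments
```

    With a smooth model, the projected mean and variance converge rapidly with polynomial order, which is what makes such non-intrusive projections attractive as a UQ building block.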

  12. Two spatial scales in a bleaching event: Corals from the mildest and the most extreme thermal environments escape mortality

    KAUST Repository

    Pineda, Jesús

    2013-07-28

    In summer 2010, a bleaching event decimated the abundant reef flat coral Stylophora pistillata in some areas of the central Red Sea, where a series of coral reefs 100–300 m wide by several kilometers long extends from the coastline to about 20 km offshore. Mortality of corals along the exposed and protected sides of inner (inshore) and mid and outer (offshore) reefs and in situ and satellite sea surface temperatures (SSTs) revealed that the variability in the mortality event corresponded to two spatial scales of temperature variability: 300 m across the reef flat and 20 km across a series of reefs. However, the relationship between coral mortality and habitat thermal severity was opposite at the two scales. SSTs in summer 2010 were similar or increased modestly (0.5°C) in the outer and mid reefs relative to 2009. In the inner reef, 2010 temperatures were 1.4°C above the 2009 seasonal maximum for several weeks. We detected little or no coral mortality in mid and outer reefs. In the inner reef, mortality depended on exposure. Within the inner reef, mortality was modest on the protected (shoreward) side, the most severe thermal environment, with highest overall mean and maximum temperatures. In contrast, acute mortality was observed in the exposed (seaward) side, where temperature fluctuations and upper water temperature values were relatively less extreme. Refuges to thermally induced coral bleaching may include sites where extreme, high-frequency thermal variability may select for coral holobionts preadapted to, and physiologically condition corals to withstand, regional increases in water temperature.

  13. Extreme Postnatal Scaling in Bat Feeding Performance: A View of Ecomorphology from Ontogenetic and Macroevolutionary Perspectives.

    Science.gov (United States)

    Santana, Sharlene E; Miller, Kimberly E

    2016-09-01

    Ecomorphology studies focus on understanding how anatomical and behavioral diversity result in differences in performance, ecology, and fitness. In mammals, the determinate growth of the skeleton entails that bite performance should change throughout ontogeny until the feeding apparatus attains its adult size and morphology. Then, interspecific differences in adult phenotypes are expected to drive food resource partitioning and patterns of lineage diversification. However, formal tests of these predictions are lacking for the majority of mammal groups, and thus our understanding of mammalian ecomorphology remains incomplete. By focusing on a fundamental measure of feeding performance, bite force, and capitalizing on the extraordinary morphological and dietary diversity of bats, we discuss how the intersection of ontogenetic and macroevolutionary changes in feeding performance may impact ecological diversity in these mammals. We integrate data on cranial morphology and bite force gathered through longitudinal studies of captive animals and comparative studies of free-ranging individuals. We demonstrate that ontogenetic trajectories and evolutionary changes in bite force are highly dependent on changes in body and head size, and that bats exhibit dramatic, allometric increases in bite force during ontogeny. Interspecific variation in bite force is highly dependent on differences in cranial morphology and function, highlighting selection for ecological specialization. While more research is needed to determine how ontogenetic changes in size and bite force specifically impact food resource use and fitness in bats, interspecific diversity in cranial morphology and bite performance seems to closely match functional differences in diet. Altogether, these results suggest direct ecomorphological relationships at ontogenetic and macroevolutionary scales in bats.
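
    The allometric analysis described above boils down to a log-log regression of bite force against body size. The sketch below shows that calculation on made-up numbers (the arrays are placeholders, not bat measurements); a fitted slope above the isometric expectation of roughly 0.67 for force versus mass indicates positive allometry.

```python
# Sketch of a standard allometric scaling analysis: fit log10(bite force)
# against log10(body mass) and read the slope as the scaling exponent.
# The arrays below are hypothetical placeholder values, not bat data.
import numpy as np

mass_g = np.array([5.2, 8.1, 11.5, 16.0, 23.4, 35.7, 51.2, 78.9])   # hypothetical
bite_N = np.array([1.1, 1.9, 3.0, 4.6, 7.5, 12.8, 20.1, 34.0])      # hypothetical

slope, intercept = np.polyfit(np.log10(mass_g), np.log10(bite_N), 1)
print(f"allometric exponent = {slope:.2f}")   # isometry for force predicts ~0.67
```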

  14. Information transfer and synchronization among the scales of climate variability: clues for understanding anomalies and extreme events?

    Science.gov (United States)

    Palus, Milan

    2017-04-01

    Deeper understanding of the complex dynamics of the Earth's atmosphere and climate is essential for sustainable development, for mitigation and adaptation strategies for global change, and for prediction of and resilience against extreme events. Traditional (linear) approaches cannot explain or even detect nonlinear interactions of dynamical processes evolving on multiple spatial and temporal scales. The combination of nonlinear dynamics and information theory explains synchronization as a process of adjustment of information rates [1] and causal relations (à la Granger) as information transfer [2]. Information born in dynamical complexity, or information transferred among systems on the way to synchronization, might appear to be an abstract quantity; however, information transfer is tied to a transfer of mass and energy, as demonstrated in a recent study using directed (causal) climate networks [2]. Recently, an information transfer across scales of atmospheric dynamics has been observed [3]. In particular, a climate oscillation with a period of around 7-8 years has been identified as a factor influencing the variability of surface air temperature (SAT) on shorter time scales. Its influence on the amplitude of the SAT annual cycle was estimated in the range 0.7-1.4 °C, and its effect on the overall variability of the SAT anomalies (SATA) leads to changes of 1.5-1.7 °C in the annual SATA means. The strongest effect of the 7-8 year cycle was observed in the winter SATA means, where it reaches 4-5 °C in central European station and reanalysis data [4]. In the dynamics of the El Niño-Southern Oscillation, three principal time scales have been identified: the annual cycle (AC), the quasi-biennial (QB) mode(s) and the low-frequency (LF) variability. An intricate causal network of information flows among these modes helps to understand the occurrence of extreme El Niño events, characterized by synchronization of the QB modes and the AC, and modulation of the QB amplitude by the LF mode. The latter
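
    The information-transfer quantity discussed above can be illustrated with a crude estimator. The sketch below computes a binned transfer entropy, i.e. the conditional mutual information I(Y_{t+1}; X_t | Y_t), for two synthetic series; it is not the author's estimator, and the bin count and series are arbitrary assumptions.

```python
# Minimal sketch of the information-transfer idea: a binned estimate of
# transfer entropy T(X->Y) = I(Y_{t+1}; X_t | Y_t), i.e. the conditional
# mutual information behind Granger-type causality in information terms.
# Crude equal-width-bin estimator on synthetic series, not the study's method.
import numpy as np

def entropy(counts):
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

def transfer_entropy(x, y, bins=8):
    xt, yt, yf = x[:-1], y[:-1], y[1:]
    h_xyz = entropy(np.histogramdd((yf, yt, xt), bins=bins)[0].ravel())
    h_yz = entropy(np.histogramdd((yt, xt), bins=bins)[0].ravel())
    h_xy = entropy(np.histogramdd((yf, yt), bins=bins)[0].ravel())
    h_y = entropy(np.histogram(yt, bins=bins)[0])
    # I(Yf; Xt | Yt) = H(Yf,Yt) + H(Yt,Xt) - H(Yf,Yt,Xt) - H(Yt)
    return h_xy + h_yz - h_xyz - h_y

rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
y = np.roll(x, 1) + 0.5 * rng.standard_normal(5000)    # y driven by lagged x
print(transfer_entropy(x, y), transfer_entropy(y, x))  # first value should be larger
```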

  15. Multifractal Analysis of the Small Time-Scale Boundary-Layer Characteristics of the Wind: the Anisotropy and Extremes

    Science.gov (United States)

    Fitton, G. F.; Tchiguirinskaia, I.; Schertzer, D. J.; Lovejoy, S.

    2012-12-01

    anisotropic over high frequencies, where u1 mostly scales as Bolgiano-Obukhov and u2 scales as Kolmogorov. The scaling law of the vertical shears of the horizontal wind in the array varied from Kolmogorov to Bolgiano-Obukhov with height, depending on the stability conditions. We interpret the results with the anisotropic UM model, which greatly enhances our understanding of the ABL structure. Comparing the two case studies, we found in both cases a multifractality parameter of about 1.6, which remains close to the estimates obtained for the free atmosphere. From the UM parameters, the exponent of the power law of the distribution of the extremes can be predicted. Over small scales, this exponent is about 7.5 for the wind velocity, which is a crucial result for applications within the field of wind energy.

  16. Impacts of Multi-Scale Solar Activity on Climate. Part Ⅰ: Atmospheric Circulation Patterns and Climate Extremes

    Institute of Scientific and Technical Information of China (English)

    Hengyi WENG

    2012-01-01

    The impacts of solar activity on climate are explored in this two-part study. Based on the principles of atmospheric dynamics, Part Ⅰ proposes an amplifying mechanism of solar impacts on winter climate extremes through changes in the atmospheric circulation patterns. This mechanism is supported by analysis of the sunspot number up to the predicted Solar Cycle 24, the historical surface temperature data, and atmospheric variables of the NCEP/NCAR Reanalysis up to February 2011 for Northern Hemisphere winters. For low solar activity, the thermal contrast between the low and high latitudes is enhanced, as is the mid-latitude baroclinic ultra-long wave activity. The land-ocean thermal contrast is also enhanced, which amplifies the topographic waves. The enhanced mid-latitude waves in turn enhance the meridional heat transport from the low to high latitudes, making the atmospheric "heat engine" more efficient than normal. The jets shift southward and the polar vortex is weakened. The Northern Annular Mode (NAM) index tends to be negative. The mid-latitude surface exhibits large-scale convergence and updrafts, which favor the occurrence of extreme weather/climate events. The thermally driven Siberian high is enhanced, which enhances the East Asian winter monsoon (EAWM). For high solar activity, the mid-latitude circulation patterns are less wavy, with less meridional transport. The NAM tends to be positive, and the Siberian high and the EAWM tend to be weaker than normal. Thus the extreme weather/climate events for high solar activity occur in different regions and with different severity from those for low solar activity. The solar influence on the mid- to high-latitude surface temperature and circulation stands out after removing the influence of the El Niño-Southern Oscillation. The atmospheric amplifying mechanism indicates that the solar impacts on climate should not be simply estimated by the magnitude of the change in the solar radiation over solar cycles when it is compared with

  17. Extreme Scale Computing Studies

    Science.gov (United States)

    2010-12-01

    modeling the mantle of the Earth as a "living thing" coupled to the crust for understanding tectonic plate system evolution is a "capability" (Category II ... cooling using a microchannel cooling plate built as part of the package can handle up to 170 W based on existing technology [161]. The water is pumped ... additional memory. Geophysicists within the Southern California Earthquake Center are using full 3D seismic tomography [146] to obtain a 3D elastic

  18. An Extreme Metallicity, Large-Scale Outflow from a Star-Forming Galaxy at z ~ 0.4

    CERN Document Server

    Muzahid, Sowgat; Churchil, Christopher W; Charlton, Jane C; Nielsen, Nikole M; Mathes, Nigel L; Trujillo-Gomez, Sebastian

    2015-01-01

    We present a detailed analysis of a large-scale galactic outflow in the CGM of a massive (M_h ~ 10^12.5 Msun), star-forming (6.9 Msun/yr), sub-L* (0.5 L_B*) galaxy at z=0.39853 that exhibits a wealth of metal-line absorption in the spectra of the background quasar Q 0122-003 at an impact parameter of 163 kpc. The galaxy inclination angle (i = 63 degrees) and the azimuthal angle (Phi = 73 degrees) imply that the QSO sightline is passing through the projected minor axis of the galaxy. The absorption system shows a multiphase, multicomponent structure with ultra-strong, wide velocity spread OVI (log N = 15.16 ± 0.04, V90 = 419 km/s) and NV (log N = 14.69 ± 0.07, V90 = 285 km/s) lines that are extremely rare in the literature. The highly ionized absorption components are well explained as arising in a low density (10^-4.2 cm^-3), diffuse (10 kpc), cool (10^4 K) photoionized gas with a super-solar metallicity ([X/H] > 0.3). From the observed narrowness of the Lyβ profile, the non-detection of SIV absorption, and...

  19. Scientific Grand Challenges: Challenges in Climate Change Science and the Role of Computing at the Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.; Johnson, Gary M.; Washington, Warren M.

    2009-07-02

    The U.S. Department of Energy (DOE) Office of Biological and Environmental Research (BER), in partnership with the Office of Advanced Scientific Computing Research (ASCR), held a workshop on the challenges in climate change science and the role of computing at the extreme scale, November 6-7, 2008, in Bethesda, Maryland. At the workshop, participants identified the scientific challenges facing the field of climate science and outlined the research directions of highest priority that should be pursued to meet these challenges. Representatives from the national and international climate change research community as well as representatives from the high-performance computing community attended the workshop. This group represented a broad mix of expertise. Of the 99 participants, 6 were from international institutions. Before the workshop, each of the four panels prepared a white paper, which provided the starting place for the workshop discussions. The four panels of workshop attendees devoted their efforts to the following themes: Model Development and Integrated Assessment; Algorithms and Computational Environment; Decadal Predictability and Prediction; Data, Visualization, and Computing Productivity. The recommendations of the panels are summarized in the body of this report.

  20. Measuring activity limitations in walking: Development of a hierarchical scale for patients with lower-extremity disorders who live at home

    NARCIS (Netherlands)

    Roorda, LD; Roebroeck, ME; van Tilburg, T; Molenaar, IW; Lankhorst, GJ; Bouter, LM

    2005-01-01

    Objective: To develop a hierarchical scale that measures activity limitations in walking in patients with lower-extremity disorders who live at home. Design: Cross-sectional study. Setting: Orthopedic workshops and outpatient clinics of secondary and tertiary care centers. Participants: Patients (N=

  1. Recent hydrological variability and extreme precipitation events in Moroccan Middle-Atlas mountains: micro-scale analyses of lacustrine sediments

    Science.gov (United States)

    Jouve, Guillaume; Vidal, Laurence; Adallal, Rachid; Bard, Edouard; Benkaddour, Abdel; Chapron, Emmanuel; Courp, Thierry; Dezileau, Laurent; Hébert, Bertil; Rhoujjati, Ali; Simonneau, Anaelle; Sonzogni, Corinne; Sylvestre, Florence; Tachikawa, Kazuyo; Viry, Elisabeth

    2016-04-01

    Since the 1990s, the Mediterranean basin has undergone an increase in precipitation events and extreme droughts that is likely to intensify in the XXI century, and whose origin has been attributed to human activities since 1850 (IPCC, 2013). Regional climate models indicate a strengthening of flood episodes at the end of the XXI century in Morocco (Tramblay et al., 2012). To understand recent hydrological and paleohydrological variability in North Africa, our study focuses on the macro- and micro-scale analysis of sedimentary sequences from Lake Azigza (Moroccan Middle Atlas Mountains) covering the last few centuries. This lake is relevant because local site monitoring revealed that lake water levels are correlated with the precipitation regime (Adallal R., PhD thesis in progress). The aim of our study is to distinguish sedimentary facies characteristic of low and high lake levels, in order to reconstruct past dry and wet periods during the last two hundred years. Here, we present results from sedimentological (lithology, grain size, microstructures under thin sections), geochemical (XRF) and physical (radiography) analyses of short sedimentary cores (64 cm long) taken in the deep basin of Lake Azigza (30 m water depth). The cores have been dated (210Pb and 137Cs radionuclides, and 14C dating). Two main facies were distinguished: one organic-rich facies composed of wood fragments and several reworked layers and characterized by Mn peaks, and a second facies composed of terrigenous clastic sediments, without wood or reworked layers, and characterized by Fe, Ti, Si and K peaks. The first facies is interpreted as a high lake-level stand. Indeed, the highest paleoshoreline is close to the vegetation, and steeper banks can increase the current velocity, allowing the transport of wood fragments during extreme precipitation events. The Mn peaks are interpreted as Mn oxide precipitation under well-oxygenated deep waters after runoff events. The second facies is linked to periods of

  2. Translation, cross-cultural adaptation and analysis of the psychometric properties of the Lower Extremity Functional Scale (LEFS): LEFS-BRAZIL

    Directory of Open Access Journals (Sweden)

    Ligia M. Pereira

    2013-06-01

    BACKGROUND: There is a lack of questionnaires in Brazilian Portuguese to evaluate patient-reported lower limb function. OBJECTIVE: To translate the Lower Extremity Functional Scale (LEFS), cross-culturally adapt it to the Brazilian population, and evaluate its psychometric properties. METHOD: The LEFS was translated by two independent assessors and back-translated to English. The LEFS-Brazil was then tested on 20 patients who answered the questionnaire in the cross-cultural adaptation phase. For the evaluation of the psychometric properties, 100 patients answered the questionnaire. Reliability was tested by two independent assessors. The Medical Outcomes Study 36-Item Short-Form Health Survey (SF-36) was used as the criterion method for construct validity. Sensitivity to change was tested over four consecutive weeks. RESULTS: The internal consistency was α = 0.96. The intra-observer reliability was an intraclass correlation coefficient (ICC) of 0.96, and the interobserver ICC was 0.98; the Bland and Altman mean difference was -1.52 intra-observer and 0.46 interobserver. The correlations between the LEFS and the SF-36 in the first week were: physical function r = 0.82, physical role r = 0.57, emotional role r = 0.43 and mental health r = 0.33. The LEFS was responsive when comparing the mean of the first week to the second, third and fourth weeks and comparing the second to the fourth week. The cut-off point was 11, and the area under the receiver operating characteristic curve was 0.96, 95% CI [0.88; 0.99], with sensitivity = 0.96, 1 - specificity = 0 and standard error = 0.02. CONCLUSION: The LEFS-Brazil is reliable, valid and responsive.
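
    For readers unfamiliar with the reliability statistics quoted above, the sketch below computes Cronbach's alpha, the internal-consistency coefficient reported as α = 0.96, from a respondents-by-items score matrix. The data are synthetic and the code is the textbook formula, not the study's analysis pipeline.

```python
# Minimal sketch of Cronbach's alpha from a respondents x items score matrix.
# `scores` below is a hypothetical data set, not the LEFS-Brazil responses.
import numpy as np

def cronbach_alpha(scores):
    """scores: 2-D array, rows = respondents, columns = items."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(1)
ability = rng.normal(size=(100, 1))            # latent trait
scores = ability + rng.normal(size=(100, 20))  # 20 noisy items measuring one trait
print(round(cronbach_alpha(scores), 2))        # high alpha (about 0.95 with these settings)
```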

  3. Extreme Convective Weather in Future Decades

    Science.gov (United States)

    Gadian, Alan; Burton, Ralph; Groves, James; Blyth, Alan; Warner, James; Holland, Greg; Bruyere, Cindy; Done, James; Thielen, Jutta

    2016-04-01

    WISER (Weather Climate Change Impact Study at Extreme Resolution) is a project designed to analyse changes in extreme weather events in a future climate, using a weather model (WRF) that is able to resolve small-scale processes. A weather model is used specifically to examine convection, which occurs on scales that cannot be resolved by climate models. The regional meso-scale precipitation events, which are critical for understanding climate change impacts, will be analysed. A channel-domain outer model with a resolution of ~20 km drives an inner domain of ~3 km resolution. Results from 1989-1994, 2020-2024 and 2030-2034 will be presented to show the effects of extreme convective events over Western Europe. This presentation will provide details of the project. It will present data from the 1989-1994 ERA-Interim and CCSM-driven simulations, with analysis of the future years defined above. The representation of pdfs of extreme precipitation, outgoing longwave radiation and wind speeds, with a preliminary comparison against observations, will be discussed. It is also planned to use the output to drive the EFAS (European Flood model) to examine the predicted changes in the quantity and frequency of severe and hazardous convective rainfall events, and hence in the frequency of flash flooding due to heavy convective precipitation.

  4. A global quantification of compound precipitation and wind extremes

    Science.gov (United States)

    Martius, Olivia; Pfahl, Stephan; Chevalier, Clément

    2016-07-01

    The concomitant occurrence of extreme precipitation and winds can have severe impacts. Here this concomitant occurrence is quantified globally using ERA-Interim reanalysis data. A logistic regression model is used to determine significant changes in the odds of precipitation extremes given a wind extreme that occurs on the same day, the day before, or the day after. High percentages of cooccurring wind and precipitation extremes are found in coastal regions and in areas with frequent tropical cyclones, with maxima of more than 50% of concomitant events. Strong regional-scale variations in this percentage are related to the interaction of weather systems with topography resulting in Föhn winds, gap winds, and orographic drying and the structure and tracks of extratropical and tropical cyclones. The percentage of concomitant events increases substantially if spatial shifts by one grid point are taken into account. Such spatially shifted but cooccurring events are important in insurance applications.
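
    The central quantity in this kind of analysis is the change in the odds of a precipitation extreme when a wind extreme occurs. For a single binary predictor, the logistic-regression coefficient equals the log of the 2x2-table odds ratio, so the sketch below computes that ratio (with a Wald confidence interval) from synthetic daily exceedance flags; it is a schematic stand-in for the reanalysis-based analysis, not a reproduction of it.

```python
# Sketch of the change in odds of a daily precipitation extreme given a wind
# extreme on the same day. With a single binary predictor, the logistic-
# regression slope equals the log of the 2x2 odds ratio, so the table version
# is shown here. Inputs are synthetic daily exceedance flags.
import numpy as np

rng = np.random.default_rng(2)
wind_ext = rng.random(10000) < 0.02                        # ~2% of days
precip_ext = rng.random(10000) < np.where(wind_ext, 0.30, 0.02)

a = np.sum(wind_ext & precip_ext)       # both extreme
b = np.sum(wind_ext & ~precip_ext)
c = np.sum(~wind_ext & precip_ext)
d = np.sum(~wind_ext & ~precip_ext)

odds_ratio = (a * d) / (b * c)
log_or_se = np.sqrt(1/a + 1/b + 1/c + 1/d)                 # Wald standard error
lo = np.exp(np.log(odds_ratio) - 1.96 * log_or_se)
hi = np.exp(np.log(odds_ratio) + 1.96 * log_or_se)
print(f"odds ratio = {odds_ratio:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```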

  5. A global quantification of compound precipitation and wind extremes

    Science.gov (United States)

    Martius, Olivia; Pfahl, Stephan; Chevalier, Clément

    2017-04-01

    The concomitant occurrence of extreme precipitation and winds can have severe impacts. Here this concomitant occurrence is quantified globally using ERA-Interim reanalysis data. A logistic regression model is used to determine significant changes in the odds ratio of precipitation extremes given a wind extreme occurs on the same day, the day before or the day after. High percentages of co-occurring wind and precipitation extremes are found in coastal regions and in areas with frequent tropical cyclones, with maxima of more than 50% of concomitant events. Strong regional-scale variations in this percentage are related to the interaction of weather systems with topography resulting in Föhn winds, gap winds, and orographic drying, and the structure and tracks of extratropical and tropical cyclones. The percentage of concomitant events increases substantially if spatial shifts by one grid point are taken into account. Such spatially shifted, but co-occurring events are important in insurance applications.

  6. Measuring activity limitations in climbing stairs: development of a hierarchical scale for patients with lower-extremity disorders living at home.

    Science.gov (United States)

    Roorda, Leo D; Roebroeck, Marij E; van Tilburg, Theo; Lankhorst, Gustaaf J; Bouter, Lex M

    2004-06-01

    To develop a hierarchical scale that measures activity limitations in climbing stairs in patients with lower-extremity disorders living at home. Cross-sectional study with Mokken scale analysis of 15 dichotomous items. Outpatient clinics of secondary and tertiary care centers. Patients (N=759; mean age +/- standard deviation, 59.8+/-15.0y; 48% men) living at home, with different lower-extremity disorders: stroke, poliomyelitis, osteoarthritis, amputation, complex regional pain syndrome type I, and diabetic foot problems. Not applicable. (1) Fit of the monotone homogeneity model, indicating whether items can be used for measuring patients; (2) fit of the double monotonicity model, indicating invariant (hierarchical) item ordering; (3) intratest reliability, indicating repeatability of the sum score; and (4) differential item functioning, addressing the validity of comparisons between subgroups of patients. There was (1) good fit of the monotone homogeneity model (coefficient H=.50) for all items for all patients, and for subgroups defined by age, gender, and diagnosis; (2) good fit of the double monotonicity model (coefficient H(T)=.58); (3) good intratest reliability (coefficient rho=.90); and (4) no differential item functioning with respect to age and gender, but differential item functioning for 4 items in amputees compared with nonamputees. A hierarchical scale, with excellent scaling characteristics, has been developed for measuring activity limitations in climbing stairs in patients with lower-extremity disorders who live at home. However, measurements should be interpreted with caution when comparisons are made between patients with and without amputation.
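
    The scalability coefficient H referred to above (e.g. H = .50 for the full item set) comes from Mokken scale analysis. A minimal sketch of Loevinger's H for dichotomous items is given below, using a synthetic response matrix rather than the study's items; it counts observed Guttman errors against those expected under item independence.

```python
# Sketch of Loevinger's overall scalability coefficient H for dichotomous
# items: one minus the ratio of observed to expected Guttman errors, summed
# over item pairs. The response matrix is synthetic, not the study's data.
import numpy as np
from itertools import combinations

def loevinger_h(X):
    X = np.asarray(X)
    n, k = X.shape
    p = X.mean(axis=0)                        # item popularities
    obs_err = exp_err = 0.0
    for i, j in combinations(range(k), 2):
        easy, hard = (i, j) if p[i] >= p[j] else (j, i)
        # Guttman error: endorsing the harder item but not the easier one
        obs_err += np.sum((X[:, easy] == 0) & (X[:, hard] == 1))
        exp_err += n * (1 - p[easy]) * p[hard]
    return 1.0 - obs_err / exp_err

rng = np.random.default_rng(3)
theta = rng.normal(size=(500, 1))                        # latent ability
difficulty = np.linspace(-1.5, 1.5, 15)
X = (theta + rng.normal(size=(500, 15)) > difficulty).astype(int)
print(round(loevinger_h(X), 2))                          # overall H for these items
```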

  7. Libraries in the post-scarcity era

    NARCIS (Netherlands)

    Bodó, B.; Porsdam, H.

    2015-01-01

    In the digital era where, thanks to the ubiquity of electronic copies, the book is no longer a scarce resource, libraries find themselves in an extremely competitive environment. Several different actors are now in a position to provide low cost access to knowledge. One of these competitors are

  8. Libraries in the post-scarcity era

    NARCIS (Netherlands)

    B. Bodó

    2015-01-01

    In the digital era where, thanks to the ubiquity of electronic copies, the book is no longer a scarce resource, libraries find themselves in an extremely competitive environment. Several different actors are now in a position to provide low cost access to knowledge. One of these competitors are shad

  9. Trends in mean and extreme temperatures over Ibadan, Southwest Nigeria

    Science.gov (United States)

    Abatan, Abayomi A.; Osayomi, Tolulope; Akande, Samuel O.; Abiodun, Babatunde J.; Gutowski, William J.

    2017-01-01

    In recent times, Ibadan has been experiencing an increase in mean temperature which appears to be linked to anthropogenic global warming. Previous studies have indicated that the warming may be accompanied by changes in extreme events. This study examined trends in mean and extreme temperatures over Ibadan during 1971-2012 at annual and seasonal scales using the high-resolution atmospheric reanalysis from European Centre for Medium-Range Weather Forecasts (ECMWF) twentieth-century dataset (ERA-20C) at 15 grid points. Magnitudes of linear trends in mean and extreme temperatures and their statistical significance were calculated using ordinary least squares and Mann-Kendall rank statistic tests. The results show that Ibadan has witnessed an increase in annual and seasonal mean minimum temperatures. The annual mean maximum temperature exhibited a non-significant decline in most parts of Ibadan. While trends in cold extremes at annual scale show warming, trends in coldest night show greater warming than in coldest day. At the seasonal scale, we found that Ibadan experienced a mix of positive and negative trends in absolute extreme temperature indices. However, cold extremes show the largest trend magnitudes, with trends in coldest night showing the greatest warming. The results compare well with those obtained from a limited number of stations. This study should inform decision-makers and urban planners about the ongoing warming in Ibadan.
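
    The trend statistics used above (ordinary least-squares slopes and the Mann-Kendall rank test) are straightforward to compute. The sketch below applies both to a made-up annual temperature series; the no-ties variance formula is used, and the numbers are not the ERA-20C values.

```python
# Sketch of an OLS trend slope plus the (tie-free) Mann-Kendall test for a
# monotonic trend. `tmean` is a synthetic annual mean temperature series.
import numpy as np
from math import erf, sqrt

def mann_kendall(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0          # variance without ties
    z = 0.0 if s == 0 else (s - np.sign(s)) / sqrt(var_s)
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided normal p-value
    return s, z, p

years = np.arange(1971, 2013)
tmean = 26.0 + 0.02 * (years - 1971) + np.random.default_rng(4).normal(0, 0.3, years.size)

slope = np.polyfit(years, tmean, 1)[0]
s, z, p = mann_kendall(tmean)
print(f"OLS slope = {slope*10:.2f} degC/decade, MK z = {z:.2f}, p = {p:.3f}")
```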

  10. How extreme are extremes?

    Science.gov (United States)

    Cucchi, Marco; Petitta, Marcello; Calmanti, Sandro

    2016-04-01

    High temperatures have an impact on the energy balance of any living organism and on the operational capabilities of critical infrastructures. Heat-wave indicators have mainly been developed with the aim of capturing the potential impacts on specific sectors (agriculture, health, wildfires, transport, power generation and distribution). However, the ability to capture the occurrence of extreme temperature events is an essential property of a multi-hazard extreme climate indicator. The aim of this study is to develop a standardized heat-wave indicator that can be combined with other indices in order to describe multiple hazards in a single indicator. The proposed approach yields a quantified indicator of the strength of a given extreme. Extremes are usually distributed following exponential or exponential-exponential functions, and it is difficult to quickly assess how strong an extreme event was from its magnitude alone. The proposed approach simplifies the quantitative and qualitative communication of extreme magnitudes.
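
    One simple way to standardize the magnitude of an extreme in the spirit described above is to map it through a fitted tail distribution to a quantile and a return period. The sketch below does this with an exponential fit to synthetic threshold exceedances; it is an illustration of the idea, not the authors' indicator.

```python
# Illustrative standardization of an extreme's magnitude: convert it to a
# non-exceedance probability under a fitted exponential tail and to a return
# period, which is easier to communicate than the raw value. The `excess`
# series is synthetic exceedances over a heat threshold, not observed data.
import numpy as np

rng = np.random.default_rng(5)
excess = rng.exponential(scale=2.0, size=300)      # degC above a heat threshold

scale = excess.mean()                              # ML estimate of the exponential scale

def standardized(value):
    prob = 1.0 - np.exp(-value / scale)            # non-exceedance probability
    return prob, 1.0 / (1.0 - prob)                # and return period in events

for v in (2.0, 6.0, 12.0):
    p, rp = standardized(v)
    print(f"excess {v:4.1f} degC -> quantile {p:.3f}, return period ~{rp:.0f} events")
```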

  11. Remote, Real-time Investigations of Extreme Environments Using High Power and Bandwidth Cabled Observatories: The OOI Regional Scale Nodes

    Science.gov (United States)

    Kelley, D. S.; Delaney, J. R.

    2012-12-01

    Methane hydrate deposits and hydrothermal vents are two of the most extreme environments on Earth. Seismic events and flow of gases from the seafloor support and modulate novel microbial communities within these systems. Although studied intensely for several decades, significant questions remain about the flux of heat, volatiles and microbial material from the subsurface to the hydrosphere in these dynamic environments. Quantification of microbial communities, their structure and abundances, and metabolic activities is in an infant state. To better understand these systems, the National Science Foundation's Ocean Observatory Initiative has installed high power (8 kW), high bandwidth (10 Gb/s) nodes on the seafloor that provide access to active methane seeps at Southern Hydrate Ridge, and at the most magmatically robust volcano on the Juan de Fuca Ridge - Axial Seamount. The real-time interactive capabilities of the cabled observatory are critical to studying gas-hydrate systems because many of the key processes occur over short time scales. Events such as bubble plume formation, the creation of collapse zones, and increased seepage in response to earthquakes require adaptive response and sampling capabilities. To meet these challenges a suite of instruments will be connected to the cable in 2013. These sensors include full resolution sampling by upward-looking sonars, fluid and gas chemical characterization by mass spectrometers and osmo samplers, long-term duration collection of seep imagery from cameras, and in situ manipulation of chemical sensors coupled with flow meters. In concert, this instrument suite will provide quantification of transient and more stable chemical fluxes. Similarly, at Axial Seamount the high bandwidth and high power fiber optic cables will be used to communicate with and power a diverse array of sensors at the summit of the volcano. Real-time high definition video will provide unprecedented views of macrofaunal and microbial communities

  12. On the consideration of scaling properties of extreme rainfall in Madrid (Spain) for developing a generalized intensity-duration-frequency equation and assessing probable maximum precipitation estimates

    Science.gov (United States)

    Casas-Castillo, M. Carmen; Rodríguez-Solà, Raúl; Navarro, Xavier; Russo, Beniamino; Lastra, Antonio; González, Paula; Redaño, Angel

    2016-11-01

    The fractal behavior of extreme rainfall intensities registered between 1940 and 2012 by the Retiro Observatory of Madrid (Spain) has been examined, and a simple scaling regime ranging from 25 min to 3 days of duration has been identified. Thus, an intensity-duration-frequency (IDF) master equation of the location has been constructed in terms of the simple scaling formulation. The scaling behavior of probable maximum precipitation (PMP) for durations between 5 min and 24 h has also been verified. For the statistical estimation of the PMP, an envelope curve of the frequency factor (k_m) based on a total of 10,194 station-years of annual maximum rainfall from 258 stations in Spain has been developed. This curve could be useful to estimate suitable values of PMP at any point of the Iberian Peninsula from basic statistical parameters (mean and standard deviation) of its rainfall series.
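
    The simple-scaling construction of an IDF master equation can be written compactly: if the annual-maximum intensity at duration d equals, in distribution, (d/D)^(-eta) times the intensity at a reference duration D, then every quantile scales the same way. The sketch below builds an IDF table from a Gumbel fit at the reference duration; all parameter values are illustrative assumptions, not the Retiro Observatory estimates.

```python
# Sketch of a simple-scaling IDF master equation: one Gumbel fit at a
# reference duration D plus a scaling exponent eta gives the whole IDF family.
# Parameter values are illustrative, not the fitted Madrid values.
import numpy as np

eta = 0.7                    # hypothetical scaling exponent
D_ref = 60.0                 # reference duration, minutes
mu, beta = 30.0, 8.0         # hypothetical Gumbel location/scale at D_ref (mm/h)

def idf_intensity(duration_min, return_period_yr):
    """Intensity (mm/h) for a given duration and return period."""
    gumbel_quantile = mu - beta * np.log(-np.log(1 - 1 / return_period_yr))
    return gumbel_quantile * (duration_min / D_ref) ** (-eta)

for T in (2, 10, 50):
    row = [f"{idf_intensity(d, T):6.1f}" for d in (10, 30, 60, 180, 1440)]
    print(f"T = {T:2d} yr:", *row, "(mm/h for 10, 30, 60, 180, 1440 min)")
```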

  13. Extreme robustness of scaling in sample space reducing processes explains Zipf’s law in diffusion on directed networks

    Science.gov (United States)

    Corominas-Murtra, Bernat; Hanel, Rudolf; Thurner, Stefan

    2016-09-01

    It has been shown recently that a specific class of path-dependent stochastic processes, which reduce their sample space as they unfold, lead to exact scaling laws in frequency and rank distributions. Such sample space reducing processes offer an alternative new mechanism to understand the emergence of scaling in countless processes. The corresponding power law exponents were shown to be related to noise levels in the process. Here we show that the emergence of scaling is not limited to the simplest SSRPs, but holds for a huge domain of stochastic processes that are characterised by non-uniform prior distributions. We demonstrate mathematically that in the absence of noise the scaling exponents converge to -1 (Zipf’s law) for almost all prior distributions. As a consequence it becomes possible to fully understand targeted diffusion on weighted directed networks and its associated scaling laws in node visit distributions. The presence of cycles can be properly interpreted as playing the same role as noise in SSRPs and, accordingly, determine the scaling exponents. The result that Zipf’s law emerges as a generic feature of diffusion on networks, regardless of its details, and that the exponent of visiting times is related to the amount of cycles in a network could be relevant for a series of applications in traffic-, transport- and supply chain management.
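
    The emergence of Zipf's law in a sample space reducing process is easy to reproduce numerically. The sketch below simulates the simplest SSRP (a sequence of uniform jumps to ever lower states) and fits the exponent of the state-visit distribution, which comes out close to -1 as stated above; the state-space size and number of runs are arbitrary choices.

```python
# Minimal simulation of a sample space reducing process: a run starts at
# state N and repeatedly jumps to a uniformly chosen strictly lower state
# until it reaches state 1. Visit frequencies of the states then follow
# Zipf's law (frequency of state i ~ 1/i).
import numpy as np

rng = np.random.default_rng(6)
N, runs = 1000, 20000
visits = np.zeros(N + 1)

for _ in range(runs):
    state = N
    visits[state] += 1
    while state > 1:
        state = rng.integers(1, state)       # uniform jump to a lower state
        visits[state] += 1

states = np.arange(1, 201)                   # fit over well-sampled states
slope = np.polyfit(np.log(states), np.log(visits[1:201]), 1)[0]
print(f"fitted exponent = {slope:.2f}")      # close to -1, i.e. Zipf's law
```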

  14. Changes in intensity of precipitation extremes in Romania on a very high temporal scale and implications for the validity of the Clausius-Clapeyron relation

    Science.gov (United States)

    Busuioc, Aristita; Baciu, Madalina; Breza, Traian; Dumitrescu, Alexandru; Stoica, Cerasela; Baghina, Nina

    2016-04-01

    Many observational, theoretical, and climate-model-based studies have suggested that warmer climates lead to more intense precipitation events, even when the total annual precipitation is slightly reduced. In this way, it has been suggested that extreme precipitation events may increase at the Clausius-Clapeyron (CC) rate under global warming and the constraint of constant relative humidity. However, recent studies show that the relationship between extreme rainfall intensity and atmospheric temperature is much more complex than the CC relationship would suggest, and depends mainly on the temporal resolution of the precipitation data, the region, the storm type, and whether the analysis is conducted on storm events rather than fixed intervals. The present study examines the dependence between very high temporal resolution extreme rainfall intensity and daily temperature, with respect to verification of the CC relation. To this end, the analysis is conducted on rainfall events rather than fixed intervals, using rainfall data based on graphic records that include intensities (mm/min) calculated over each interval of constant intensity per minute. The part of the year with such data available (April to October) is considered at 5 stations over the interval 1950-2007. For the Bucuresti-Filaret station, the analysis is extended over a longer interval (1898-2007). For each rainfall event, the maximum intensity (mm/min) is retained, and these time series (abbreviated in the following as IMAX) are considered for the further analysis. The IMAX data were divided, based on the daily mean temperature, into 2 °C-wide bins. Bins with fewer than 100 values were excluded. The 90th, 99th and 99.9th percentiles were computed from the binned data using the empirical distribution, and their variability has been compared to the CC scaling (i.e. an exponential relation given by a 7% increase per degree of temperature rise). The results show a dependence close to double the CC relation for
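
    The binning procedure described above can be sketched in a few lines: group per-event maximum intensities into 2 °C bins of daily mean temperature, compute upper percentiles per bin, and compare their exponential growth rate with temperature against the Clausius-Clapeyron reference of roughly 7% per degree. The data below are synthetic stand-ins for the IMAX series, constructed (for illustration only) to scale at about twice the CC rate.

```python
# Sketch of the temperature-binning analysis: upper percentiles of per-event
# maximum intensity per 2-degC temperature bin, compared with the ~7%/degC
# Clausius-Clapeyron rate. Synthetic data, built to scale at roughly 2x CC.
import numpy as np

rng = np.random.default_rng(7)
temp = rng.uniform(2, 28, 20000)                        # daily mean temperature, degC
imax = rng.gamma(2.0, 0.05 * 1.14 ** temp)              # toy intensities, mm/min

edges = np.arange(2, 30, 2)                             # 2 degC-wide bins
rates = []
for q in (90, 99, 99.9):
    centers, pctl = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sel = (temp >= lo) & (temp < hi)
        if sel.sum() >= 100:                            # skip sparse bins
            centers.append(0.5 * (lo + hi))
            pctl.append(np.percentile(imax[sel], q))
    slope = np.polyfit(centers, np.log(pctl), 1)[0]     # d ln(I) / dT
    rates.append(100 * (np.exp(slope) - 1))

print([f"{r:.1f}%/degC" for r in rates], "vs CC ~ 7%/degC")
```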

  15. An Analysis from the Sociology of Law of Extreme Rights-Defending Events in the "We-Media" Era: Taking the "Ji Zhong-xing Case" as an Example

    Institute of Scientific and Technical Information of China (English)

    谢昕欣

    2014-01-01

    The "Ji Zhong-xing Case" is a miniature of Chinese extreme rights-defending events, and it reflects the malfunctioning and defects of the regular rights-defending system. The rights-protection needs of people at the bottom of society cannot be satisfied, and they are forced to seek other, irregular ways of defending their rights. In the "We-Media" era, these extreme events are widely spread over the Internet and self-media, taking on an obvious amplification and demonstration effect and subtly influencing individuals' choices about how to defend their rights. The negative social effects of extreme events in the "We-Media" era highlight that China's social remedy mechanisms are still deficient. Extreme rights-defending events reduce the credibility of regular rights-defending channels; therefore, effective measures should be taken to prevent them.

  16. Extreme robustness of scaling in sample space reducing processes explains Zipf's law in diffusion on directed networks

    CERN Document Server

    Corominas-Murtra, Bernat; Thurner, Stefan

    2016-01-01

    Sample Space Reducing processes (SSRP) offer an alternative new mechanism to understand the emergence of scaling in countless phenomena. We demonstrate that the scaling exponents associated to the dynamics of SSRPs converge to Zipf's law for a large class of systems. We show that Zipf's law emerges as a generic feature of diffusion on directed networks, regardless of its details, and that the exponent of the visiting time distribution is related to the amount of cycles in the network. These results are relevant for a series of applications in traffic, transport, and supply chain management.

  17. Scientific Grand Challenges: Discovery In Basic Energy Sciences: The Role of Computing at the Extreme Scale - August 13-15, 2009, Washington, D.C.

    Energy Technology Data Exchange (ETDEWEB)

    Galli, Giulia [Univ. of California, Davis, CA (United States). Workshop Chair; Dunning, Thom [Univ. of Illinois, Urbana, IL (United States). Workshop Chair

    2009-08-13

    The U.S. Department of Energy’s (DOE) Office of Basic Energy Sciences (BES) and Office of Advanced Scientific Computing Research (ASCR) workshop in August 2009 on extreme-scale computing provided a forum for more than 130 researchers to explore the needs and opportunities that will arise due to expected dramatic advances in computing power over the next decade. This scientific community firmly believes that the development of advanced theoretical tools within chemistry, physics, and materials science—combined with the development of efficient computational techniques and algorithms—has the potential to revolutionize the discovery process for materials and molecules with desirable properties. Doing so is necessary to meet the energy and environmental challenges of the 21st century as described in various DOE BES Basic Research Needs reports. Furthermore, computational modeling and simulation are a crucial complement to experimental studies, particularly when quantum mechanical processes controlling energy production, transformations, and storage are not directly observable and/or controllable. Many processes related to the Earth’s climate and subsurface need better modeling capabilities at the molecular level, which will be enabled by extreme-scale computing.

  18. The Subaru Coronagraphic Extreme Adaptive Optics system: enabling high-contrast imaging on solar-system scales

    CERN Document Server

    Jovanovic, N; Guyon, O; Clergeon, C; Singh, G; Kudo, T; Garrel, V; Newman, K; Doughty, D; Lozi, J; Males, J; Minowa, Y; Hayano, Y; Takato, N; Morino, J; Kuhn, J; Serabyn, E; Norris, B; Tuthill, P; Schworer, G; Stewart, P; Close, L; Huby, E; Perrin, G; Lacour, S; Gauchet, L; Vievard, S; Murakami, N; Oshiyama, F; Baba, N; Matsuo, T; Nishikawa, J; Tamura, M; Lai, O; Marchis, F; Duchene, G; Kotani, T; Woillez, J

    2015-01-01

    The Subaru Coronagraphic Extreme Adaptive Optics (SCExAO) instrument is a multipurpose high-contrast imaging platform designed for the discovery and detailed characterization of exoplanetary systems and serves as a testbed for high-contrast imaging technologies for ELTs. It is a multi-band instrument which makes use of light from 600 to 2500nm allowing for coronagraphic direct exoplanet imaging of the inner 3 lambda/D from the stellar host. Wavefront sensing and control are key to the operation of SCExAO. A partial correction of low-order modes is provided by Subaru's facility adaptive optics system with the final correction, including high-order modes, implemented downstream by a combination of a visible pyramid wavefront sensor and a 2000-element deformable mirror. The well corrected NIR (y-K bands) wavefronts can then be injected into any of the available coronagraphs, including but not limited to the phase induced amplitude apodization and the vector vortex coronagraphs, both of which offer an inner worki...

  19. Study of Straggling and Extreme Cases of Energy Deposition in Micron Scale Silicon Volumes using the DEPFET Detector

    CERN Document Server

    Wilk, Fabian; Schwenker, Benjamin

    The Depleted P-channel Field-Effect Transistor detector is a pixel detector type currently under development. In high energy physics, pixel detectors measure space points along the trajectory of charged particles. They determine the spatial position by measuring the charges created as a result of interactions with the passing particle. Thus the detector’s signals can be used to determine the energy deposited by the particle in single pixels of a pixel matrix. The development of a new detector raises the question whether our simulation models can accurately describe the physical processes – like ionisation and scattering – taking place during operation. The thesis aims to validate one of the current Monte-Carlo simulations (based on the Geant4 simulation package) of high energy straggling processes using experimental data of a test beam run of DEPFET modules. This is done by calculating the spatial distribution of the electron/hole pairs created in extreme cases of ionisation and using this distribution ...

  20. Full Scale Test SSP 34m blade, edgewise loading LTT. Extreme load and PoC_InvE Data report

    DEFF Research Database (Denmark)

    Nielsen, Magda; Roczek-Sieradzan, Agnieszka; Jensen, Find Mølholt

    … in edgewise direction (LTT). The blade has been submitted to thorough examination by means of strain gauges, displacement transducers and a 3D optical measuring system. This data report presents results obtained during full scale testing of the blade up to 80% Risø load, where 80% Risø load corresponds to 100% certification load. These pulls at 80% Risø load were repeated and the results from these pulls were compared. The blade was reinforced according to a Risø DTU invention, where the trailing edge panels are coupled. The coupling is implemented to prevent the out of plane deformations and to reduce peeling stresses in the adhesive joints. Test results from measurements with the reinforcement have been compared to results without the coupling. The report presents only the relevant results for the 80% Risø load and the results applicable for the investigation of the influence of the invention on the profile.

  1. Validation of a scale measuring coping with extreme risks

    Directory of Open Access Journals (Sweden)

    Esperanza López-Vázquez

    2004-06-01

    OBJECTIVE: The objective of this study was to validate, in Mexico, the French coping scale "Échelle Toulousaine de Coping". MATERIAL AND METHODS: In the fall of 2001, the scale questionnaire was applied to 209 subjects living in different areas of Mexico who were exposed to five different types of extreme natural or industrial risks. The discriminatory capacity of the items, as well as the factorial structure and internal consistency of the scale, were analyzed using Mann-Whitney's U test, principal components factor analysis, and Cronbach's alpha. RESULTS: The final scale was composed of 26 items forming two groups: active coping and passive coping. The internal consistency of the instrument was high, both in the total sample and in the subsamples of natural and industrial risks. CONCLUSIONS: The coping scale is reliable and valid for the Mexican population.

  2. Broad Scale Monitoring in the US Forest Service: Institutional Challenges and Collaborative Opportunities for Improving Planning and Decision-Making in an Era of Climate Change

    Science.gov (United States)

    Wurtzebach, Z.

    2016-12-01

    In 2012, the United States Forest Service promulgated new rules to guide Forest planning efforts in accordance with the National Forest Management Act (NFMA). One important component of the 2012 rule is a requirement for Regionally coordinated, cross-boundary "broad scale" monitoring strategies that are designed to inform and facilitate Forest-level adaptive management and planning. This presentation will examine institutional challenges and opportunities for developing effective broad scale monitoring strategies identified in 90 interviews with USFS staff and partner organizations, and in collaborative workshops held in Colorado, Wyoming, Arizona, and New Mexico. Internal barriers to development include funding and human resource constraints, organizational culture, problematic incentives and accountability structures, data management issues, and administrative barriers to collaboration. However, we also identify several opportunities for leveraging interagency collaboration, facilitating multi-level coordination, generating efficiencies in data collection and analysis, and improving strategies for reporting and communication to Forest-level decision-makers and relevant stakeholders.

  3. On the Fine-Scale Topography Regulating Changes in the Atmospheric Hydrological Cycle and Extreme Rainfall over West Africa in Regional Climate Model Projections

    Directory of Open Access Journals (Sweden)

    M. B. Sylla

    2012-01-01

    The ICTP-RegCM3 is used to downscale to 40 km the projections from ECHAM5 over West Africa during the mid and late 21st century. The results show that while ECHAM5 projects a wetter climate along the Gulf of Guinea and drier conditions over the Sahel, RegCM3 produces contrasting changes for low-elevation (negative) and high-elevation (positive) terrains, more marked during the second period. The wetter conditions in the uplands result from an intensification of the atmospheric hydrological cycle, arising as a consequence of more frequent and denser rainy days and leading to larger intensity and more extreme events. Examination of the large-scale dynamics reveals that these conditions are mostly driven by increased low-level moisture convergence, which produces elevated vertical motion above Cameroon's mountainous areas, favoring more atmospheric instability, moisture, and rainfall. This regulation of the climate change signal by high-elevation terrain appears only in RegCM3, as the driving ECHAM5 is smooth along the whole Gulf of Guinea. This consolidates the need to use regional climate models to investigate the regional and local response of the hydrological cycle, daily rainfall, and extreme events to increasing anthropogenic GHG warming for suitable impact studies, specifically over regions with complex topography such as West Africa.

  4. Field limit and nano-scale surface topography of superconducting radio-frequency cavity made of extreme type II superconductor

    CERN Document Server

    Kubo, Takayuki

    2014-01-01

    The field limit of a superconducting radio-frequency cavity made of a type II superconductor with a large Ginzburg-Landau parameter is studied, taking the effects of nano-scale surface topography into account. If the surface is ideally flat, the field limit is imposed by the superheating field. On the surface of a cavity, however, nano-defects are distributed almost continuously and suppress the superheating field everywhere. The field limit is then imposed by an effective superheating field given by the product of the superheating field for an ideally flat surface and a suppression factor that contains the effects of nano-defects. A nano-defect is modeled by a triangular groove with a depth smaller than the penetration depth. Analytical formulas for the suppression factor of bulk and multilayer superconductors are derived in the framework of the London theory. As an immediate application, the suppression factor of dirty Nb processed by electropolishing is evaluated using the results of a surface topographic study. The estimat...

  5. SWAP OBSERVATIONS OF THE LONG-TERM, LARGE-SCALE EVOLUTION OF THE EXTREME-ULTRAVIOLET SOLAR CORONA

    Energy Technology Data Exchange (ETDEWEB)

    Seaton, Daniel B.; De Groof, Anik; Berghmans, David; Nicula, Bogdan [Royal Observatory of Belgium-SIDC, Avenue Circulaire 3, B-1180 Brussels (Belgium); Shearer, Paul [Department of Mathematics, 2074 East Hall, University of Michigan, 530 Church Street, Ann Arbor, MI 48109-1043 (United States)

    2013-11-01

    The Sun Watcher with Active Pixels and Image Processing (SWAP) EUV solar telescope on board the Project for On-Board Autonomy 2 spacecraft has been regularly observing the solar corona in a bandpass near 17.4 nm since 2010 February. With a field of view of 54 × 54 arcmin, SWAP provides the widest-field images of the EUV corona available from the perspective of the Earth. By carefully processing and combining multiple SWAP images, it is possible to produce low-noise composites that reveal the structure of the EUV corona to relatively large heights. A particularly important step in this processing was to remove instrumental stray light from the images by determining and deconvolving SWAP's point-spread function from the observations. In this paper, we use the resulting images to conduct the first-ever study of the evolution of the large-scale structure of the corona observed in the EUV over a three year period that includes the complete rise phase of solar cycle 24. Of particular note is the persistence over many solar rotations of bright, diffuse features composed of open magnetic fields that overlie polar crown filaments and extend to large heights above the solar surface. These features appear to be related to coronal fans, which have previously been observed in white-light coronagraph images and, at low heights, in the EUV. We also discuss the evolution of the corona at different heights above the solar surface and the evolution of the corona over the course of the solar cycle by hemisphere.
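
    The stray-light removal step mentioned above amounts to deconvolving the instrument point-spread function from the images. The sketch below implements the standard Richardson-Lucy iteration on a toy image with a Gaussian PSF; it is a generic illustration of PSF deconvolution, not the SWAP pipeline or its measured PSF.

```python
# Generic sketch of PSF deconvolution via the Richardson-Lucy iteration.
# The Gaussian "PSF" and sparse toy "corona" are placeholders, not SWAP data.
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, iterations=30):
    estimate = np.full_like(image, image.mean())
    psf_mirror = psf[::-1, ::-1]
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, 1e-12)
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Toy example: blur sparse bright points with a known PSF, then deconvolve.
rng = np.random.default_rng(8)
truth = np.zeros((128, 128))
truth[rng.integers(0, 128, 40), rng.integers(0, 128, 40)] = 1.0
x = np.arange(-10, 11)
psf = np.exp(-(x[:, None] ** 2 + x[None, :] ** 2) / 8.0)
psf /= psf.sum()
observed = fftconvolve(truth, psf, mode="same")
restored = richardson_lucy(observed, psf)
# The restored error should be noticeably smaller than the blurred error.
print(np.abs(observed - truth).mean(), np.abs(restored - truth).mean())
```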

  6. Greenland accumulation and its connection to the large-scale atmospheric circulation in ERA-Interim and paleo-climate simulations

    Directory of Open Access Journals (Sweden)

    N. Merz

    2013-07-01

    Full Text Available Accumulation and aerosol chemistry records from Greenland ice cores offer the potential to reconstruct variability in Northern Hemisphere atmospheric circulation over the last millennia. However, an important prerequisite for a reconstruction is the stable relationship between local accumulation at the ice core site with the respective circulation pattern throughout the reconstruction period. We address this stability issue by using a comprehensive climate model and performing time-slice simulations for the present, the pre-industrial, the early Holocene and the last glacial maximum (LGM. The relationships between accumulation, precipitation and atmospheric circulation are investigated on on various time-scales. The analysis shows that the relationship between local accumulation on the Greenland ice sheet and the large-scale circulation undergoes a significant seasonal cycle. As the weights of the individual seasons change, annual mean accumulation variability is not necessarily related to the same atmospheric circulation patterns during the different climate states. Within a season, local Greenland accumulation variability is indeed linked to a consistent circulation pattern, which is observed for all studied climate periods, even for the LGM, however these circulation patterns are specific for different regions on the Greenland ice sheet. The simulated impact of orbital forcing and changes in the ice-sheet topography on accumulation exhibits strong spatial variability emphasizing that accumulation records from different ice core sites cannot be expected to look alike since they include a distinct local signature. Accumulation changes between different climate periods are dominated by changes in the amount of snowfall and are driven by both thermodynamic and dynamic factors. The thermodynamic impact determines the strength of the hydrological cycle, and warmer temperatures are generally accompanied by an increase in Greenland precipitation

  7. Extreme Heat

    Science.gov (United States)


  8. Mandelbrot's Extremism

    NARCIS (Netherlands)

    Beirlant, J.; Schoutens, W.; Segers, J.J.J.

    2004-01-01

    In the sixties Mandelbrot already showed that extreme price swings are more likely than some of us think or incorporate in our models. A modern toolbox for analyzing such rare events can be found in the field of extreme value theory. At the core of extreme value theory lies the modelling of maxima

  9. Translation and cross-cultural adaptation of the lower extremity functional scale into a Brazilian Portuguese version and validation on patients with knee injuries.

    Science.gov (United States)

    Metsavaht, Leonardo; Leporace, Gustavo; Riberto, Marcelo; Sposito, Maria Matilde M; Del Castillo, Letícia N C; Oliveira, Liszt P; Batista, Luiz Alberto

    2012-11-01

    Clinical measurement. To translate and culturally adapt the Lower Extremity Functional Scale (LEFS) into a Brazilian Portuguese version, and to test the construct and content validity and reliability of this version in patients with knee injuries. There is no Brazilian Portuguese version of an instrument to assess the function of the lower extremity after orthopaedic injury. The translation of the original English version of the LEFS into a Brazilian Portuguese version was accomplished using standard guidelines and tested in 31 patients with knee injuries. Subsequently, 87 patients with a variety of knee disorders completed the Brazilian Portuguese LEFS, the Medical Outcomes Study 36-Item Short-Form Health Survey, the Western Ontario and McMaster Universities Osteoarthritis Index, and the International Knee Documentation Committee Subjective Knee Evaluation Form and a visual analog scale for pain. All patients were retested within 2 days to determine reliability of these measures. Validation was assessed by determining the level of association between the Brazilian Portuguese LEFS and the other outcome measures. Reliability was documented by calculating internal consistency, test-retest reliability, and standard error of measurement. The Brazilian Portuguese LEFS had a high level of association with the physical component of the Medical Outcomes Study 36-Item Short-Form Health Survey (r = 0.82), the Western Ontario and McMaster Universities Osteoarthritis Index (r = 0.87), the International Knee Documentation Committee Subjective Knee Evaluation Form (r = 0.82), and the pain visual analog scale (r = -0.60), all statistically significant. The Brazilian Portuguese LEFS had a low level of association with the mental component of the Medical Outcomes Study 36-Item Short-Form Health Survey (r = 0.38). The internal consistency and test-retest reliability of the Brazilian Portuguese version of the LEFS were high. The standard error of measurement was low (3.6) and the agreement was considered high, demonstrated by the small differences between test and retest and the narrow

  10. Are extreme events (statistically) special? (Invited)

    Science.gov (United States)

    Main, I. G.; Naylor, M.; Greenhough, J.; Touati, S.; Bell, A. F.; McCloskey, J.

    2009-12-01

    wrongly in this case) assume Gaussian errors. We develop methods to correct for these effects, and show that the current best fit maximum likelihood regression model for the global frequency-moment distribution in the digital era is a power law, i.e. mega-earthquakes continue to follow the Gutenberg-Richter trend of smaller earthquakes with no (as yet) observable cut-off or characteristic extreme event. The results may also have implications for the interpretation of other time-limited geophysical time series that exhibit power-law scaling.
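
    A minimal illustration of the kind of maximum-likelihood power-law (Gutenberg-Richter) fit described above, using the standard Aki estimator on a synthetic magnitude catalog; the completeness magnitude and the catalog below are assumptions of this sketch, not values from the study.

      # Sketch: maximum-likelihood Gutenberg-Richter b-value (Aki estimator), synthetic catalog.
      import numpy as np

      rng = np.random.default_rng(0)
      m_c = 5.0                                        # assumed completeness magnitude
      # Synthetic magnitudes following an exponential (G-R) law with b = 1 above m_c
      mags = m_c + rng.exponential(scale=1.0 / np.log(10), size=5000)

      def b_value_mle(magnitudes, m_min):
          """Aki (1965) maximum-likelihood b-value and its approximate standard error."""
          m = magnitudes[magnitudes >= m_min]
          b = np.log10(np.e) / (m.mean() - m_min)
          return b, b / np.sqrt(len(m))

      b, b_err = b_value_mle(mags, m_c)
      print(f"b = {b:.2f} +/- {b_err:.2f}")            # close to 1.0 for this synthetic catalog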

  11. Spatial and temporal patterns of bank failure during extreme flood events: Evidence of nonlinearity and self-organised criticality at the basin scale?

    Science.gov (United States)

    Thompson, C. J.; Croke, J. C.; Grove, J. R.

    2012-04-01

    Non-linearity in physical systems provides a conceptual framework to explain complex patterns and forms that derive from complex internal dynamics rather than external forcings, and can be used to inform modeling and improve landscape management. One process that has been investigated previously to explore the existence of a self-organised critical (SOC) system in river systems at the basin scale is bank failure. Spatial trends in bank failure have been quantified previously to determine whether the distribution of bank failures at the basin scale exhibits the necessary power-law magnitude/frequency distribution. More commonly, bank failures are investigated at a small scale using several cross-sections, with strong emphasis on local-scale factors such as bank height, cohesion and hydraulic properties. Advancing our understanding of non-linearity in such processes, however, requires many more studies where both the spatial and temporal measurements of the process can be used to investigate the existence or otherwise of non-linearity and self-organised criticality. This study presents measurements of bank failure throughout the Lockyer catchment in southeast Queensland, Australia, which experienced an extreme flood event in January 2011 resulting in the loss of human lives and geomorphic channel change. The most dominant form of fluvial adjustment consisted of changes in channel geometry and notably widespread bank failures, which were readily identifiable as 'scalloped' shaped failure scarps. The spatial extents of these were mapped using a high-resolution LiDAR-derived digital elevation model and were verified by field surveys and air photos. Pre-flood LiDAR coverage for the catchment also existed, allowing direct comparison of the magnitude and frequency of bank failures from both pre- and post-flood periods. Data were collected and analysed within a GIS framework and investigated for power-law relationships. Bank failures appeared random and occurred
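
    As a sketch of the power-law magnitude/frequency test mentioned above (assuming failure "size" is a continuous scalar such as scar area or volume, which is an assumption of this illustration rather than a detail given in the abstract), the tail exponent can be estimated with the continuous maximum-likelihood estimator of Clauset et al. (2009); the data below are synthetic.

      # Sketch: MLE exponent of a continuous power-law tail p(x) ~ x^-alpha above x_min.
      import numpy as np

      def powerlaw_alpha(sizes, x_min):
          x = np.asarray(sizes, dtype=float)
          x = x[x >= x_min]
          return 1.0 + len(x) / np.sum(np.log(x / x_min)), len(x)

      rng = np.random.default_rng(1)
      sizes = 10.0 * rng.random(2000) ** (-1.0 / 1.5)   # synthetic Pareto sample, alpha = 2.5
      alpha, n_tail = powerlaw_alpha(sizes, x_min=10.0)
      print(f"alpha = {alpha:.2f} estimated from {n_tail} failures above x_min")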

  12. Evaluation of seabed mapping methods for fine-scale classification of extremely shallow benthic habitats - Application to the Venice Lagoon, Italy

    Science.gov (United States)

    Montereale Gavazzi, G.; Madricardo, F.; Janowski, L.; Kruss, A.; Blondel, P.; Sigovini, M.; Foglini, F.

    2016-03-01

    Recent technological developments of multibeam echosounder systems (MBES) allow mapping of benthic habitats with unprecedented detail. MBES can now be employed in extremely shallow waters, challenging data acquisition (as these instruments were often designed for deeper waters) and data interpretation (honed on datasets with resolution sometimes orders of magnitude lower). With extremely high-resolution bathymetry and co-located backscatter data, it is now possible to map the spatial distribution of fine-scale benthic habitats, even identifying the acoustic signatures of single sponges. In this context, it is necessary to understand which of the commonly used segmentation methods is best suited to account for such a level of detail. At the same time, new sampling protocols for precisely geo-referenced ground-truth data need to be developed to validate the benthic environmental classification. This study focuses on a dataset collected in a shallow (2-10 m deep) tidal channel of the Lagoon of Venice, Italy. Using 0.05-m and 0.2-m raster grids, we compared a range of classifications, covering both pixel-based and object-based approaches, including manual classification, the Maximum Likelihood Classifier, Jenks optimization clustering, textural analysis and Object-Based Image Analysis. Through a comprehensive and accurately geo-referenced ground-truth dataset, we were able to identify five different classes of substrate composition, including sponges, mixed submerged aquatic vegetation, mixed detritic bottom (fine and coarse) and unconsolidated bare sediment. We computed estimates of accuracy (namely Overall, User and Producer Accuracies and the Kappa statistic) by cross-tabulating predicted and reference instances. Overall, pixel-based segmentations produced the highest accuracies, and the accuracy assessment is strongly dependent on the number of classes chosen for the thematic output. Tidal channels in the Venice Lagoon are extremely important in terms of habitats and sediment distribution
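
    A generic sketch (not the authors' workflow) of the accuracy assessment described above: cross-tabulating predicted against reference classes and deriving Overall, User's and Producer's accuracies and the Kappa statistic. The class labels and counts are invented for illustration.

      # Sketch: accuracy metrics from a predicted-vs-reference cross tabulation (toy labels).
      import numpy as np

      classes = ["sponge", "mixed SAV", "fine detritic", "coarse detritic", "bare sediment"]
      predicted = np.array([0, 1, 1, 2, 3, 4, 4, 0, 2, 3, 4, 1])
      reference = np.array([0, 1, 2, 2, 3, 4, 3, 0, 2, 3, 4, 1])

      k = len(classes)
      cm = np.zeros((k, k), dtype=int)
      for p, r in zip(predicted, reference):
          cm[r, p] += 1                                 # rows: reference, columns: predicted

      n = cm.sum()
      overall = np.trace(cm) / n
      users = np.diag(cm) / cm.sum(axis=0)              # per predicted class (commission side)
      producers = np.diag(cm) / cm.sum(axis=1)          # per reference class (omission side)
      p_e = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2
      kappa = (overall - p_e) / (1.0 - p_e)
      print(f"overall = {overall:.2f}, kappa = {kappa:.2f}")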

  13. Mechanisms underlying temperature extremes in Iberia: a Lagrangian perspective

    Directory of Open Access Journals (Sweden)

    João A. Santos

    2015-04-01

    Full Text Available The mechanisms underlying the occurrence of temperature extremes in Iberia are analysed considering a Lagrangian perspective of the atmospheric flow, using 6-hourly ERA-Interim reanalysis data for the years 1979–2012. Daily 2-m minimum temperatures below the 1st percentile and 2-m maximum temperatures above the 99th percentile at each grid point over Iberia are selected separately for winter and summer. Four categories of extremes are analysed using 10-d backward trajectories initialized at the extreme temperature grid points close to the surface: winter cold (WCE) and warm extremes (WWE), and summer cold (SCE) and warm extremes (SWE). Air masses leading to temperature extremes are first transported from the North Atlantic towards Europe for all categories. While there is a clear relation to large-scale circulation patterns in winter, the Iberian thermal low is important in summer. Along the trajectories, air mass characteristics are significantly modified through adiabatic warming (air parcel descent), upper-air radiative cooling and near-surface warming (surface heat fluxes and radiation). High residence times over continental areas, such as over northern-central Europe for WCE and, to a lesser extent, over Iberia for SWE, significantly enhance these air mass modifications. Near-surface diabatic warming is particularly striking for SWE. WCE and SWE are responsible for the most extreme conditions in a given year. For WWE and SCE, strong temperature advection associated with important meridional air mass transports are the main driving mechanisms, accompanied by comparatively minor changes in the air mass properties. These results permit a better understanding of mechanisms leading to temperature extremes in Iberia.

  14. Extreme-Value Analysis of China's Emergency Coal Reserve Scale

    Institute of Scientific and Technical Information of China (English)

    牟敦果

    2012-01-01

    Emergency coal reserves are a problem specific to China, yet quantitative studies of their appropriate scale are very scarce. This paper studies the choice of provinces for establishing national and provincial emergency coal reserves from the perspectives of China's coal transportation pattern and of fluctuations in coal demand for power generation. Using a generalized extreme-value approach, it analyses extreme shocks in power-generation coal consumption in coal-importing provinces and determines the reserve scales needed to cope with demand shocks of different probabilities, showing that the current target of a 5-million-ton emergency coal reserve is too low. Finally, the paper puts forward policy suggestions for safeguarding China's coal supply security: establish provincial and national emergency coal reserves; build reserves of coal production capacity and transport capacity; and continue to expand the scale of the emergency coal reserve.
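
    A hedged sketch of the generalized extreme-value calculation described above, applied to invented annual maxima of power-generation coal-demand anomalies (arbitrary units); the study's actual provincial consumption data are not reproduced here.

      # Sketch: fit a GEV to annual demand-shock maxima and read off a 1%-exceedance reserve level.
      from scipy import stats

      # Synthetic annual maxima of coal-demand shocks (arbitrary units)
      annual_max_shock = stats.genextreme.rvs(c=-0.1, loc=100.0, scale=20.0,
                                              size=40, random_state=2)

      shape, loc, scale = stats.genextreme.fit(annual_max_shock)
      # Reserve scale needed to cover a shock with 1% annual exceedance probability
      # (i.e. the 100-year return level of the fitted GEV).
      return_level_100 = stats.genextreme.ppf(0.99, shape, loc=loc, scale=scale)
      print(f"100-year demand shock ~ {return_level_100:.1f} (arbitrary units)")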

  15. Extreme cosmos

    CERN Document Server

    Gaensler, Bryan

    2011-01-01

    The universe is all about extremes. Space has a temperature 270°C below freezing. Stars die in catastrophic supernova explosions a billion times brighter than the Sun. A black hole can generate 10 million trillion volts of electricity. And hypergiants are stars 2 billion kilometres across, larger than the orbit of Jupiter. Extreme Cosmos provides a stunning new view of the way the Universe works, seen through the lens of extremes: the fastest, hottest, heaviest, brightest, oldest, densest and even the loudest. This is an astronomy book that not only offers amazing facts and figures but also re

  16. Sap-flow measurement and scale transferring from sample trees to entire forest stand of Populus euphratica in desert riparian forest in extreme arid region

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Understanding how the transpiration of this vegetation type responds to environmental stress is important for determining the water-balance dynamics of the riparian ecosystem threatened by groundwater depletion. Transpiration and sap flow were measured using the heat-pulse technique. The results were then projected up to the stand level to investigate the stand's water use in relation to climate forcing in the desert riparian forest in an extreme arid region. This study took place from April through October 2003 and from May through October 2004. The experimental site was selected in the Populus euphratica Forest Reserve (101°10′ E, 41°59′ N) in Ejina county, in the lower Heihe River basin, China. The sapwood area was used as a scalar to extrapolate the stand water consumption from the whole-tree water consumption measured by the heat-pulse velocity recorder (HPVR). Scale transferring from a series of individual trees to a stand was done according to the existing natural variations between trees under given environmental conditions. The application of the biometric parameters available at the individual-tree and stand levels proved suitable for this purpose. A significant correlation between the sapwood area and tree diameter at breast height (DBH) was found, and the relationship is well fitted by a power model. On the basis of this prediction model, the sapwood area can be calculated from DBH. The sap-flow density can then be used to extrapolate the stand water use by means of a series of mathematical models.
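
    A minimal sketch (with invented numbers) of the scaling step described above: fit a power model between sapwood area and DBH on the sample trees, sum the predicted sapwood area over all trees in a stand, and multiply by a measured sap-flux density to estimate stand water use.

      # Sketch: power-law sapwood-area model and stand-level scaling (toy data).
      import numpy as np

      dbh = np.array([12.0, 18.0, 25.0, 31.0, 40.0, 47.0])              # cm, sample trees
      sapwood = np.array([95.0, 210.0, 400.0, 620.0, 1010.0, 1400.0])   # cm^2, sample trees

      # Power-law fit in log-log space: log A_s = log a + b * log DBH
      b, log_a = np.polyfit(np.log(dbh), np.log(sapwood), 1)
      a = np.exp(log_a)

      stand_dbh = np.array([10.0, 14.0, 22.0, 28.0, 35.0, 42.0])        # all trees in a plot (toy)
      stand_sapwood_cm2 = np.sum(a * stand_dbh**b)

      sap_flux_density = 15.0        # cm^3 water per cm^2 sapwood per hour (assumed mean value)
      stand_water_use_L_per_h = stand_sapwood_cm2 * sap_flux_density / 1000.0
      print(f"A_s = {a:.2f} * DBH^{b:.2f}; stand water use ~ {stand_water_use_L_per_h:.1f} L/h")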

  17. Cosmology with a stiff matter era

    Science.gov (United States)

    Chavanis, Pierre-Henri

    2015-11-01

    We consider the possibility that the Universe is made of a dark fluid described by a quadratic equation of state P = Kρ², where ρ is the rest-mass density and K is a constant. The energy density ε = ρc² + Kρ² is the sum of two terms: a rest-mass term ρc² that mimics "dark matter" (P = 0) and an internal energy term u = Kρ² = P that mimics a "stiff fluid" (P = ε) in which the speed of sound is equal to the speed of light. In the early universe, the internal energy dominates and the dark fluid behaves as a stiff fluid (P ≈ ε, ε ∝ a⁻⁶). In the late universe, the rest-mass energy dominates and the dark fluid behaves as pressureless dark matter (P ≃ 0, ε ∝ a⁻³). We provide a simple analytical solution of the Friedmann equations for a universe undergoing a stiff matter era, a dark matter era, and a dark energy era due to the cosmological constant. This analytical solution generalizes the Einstein-de Sitter solution describing the dark matter era, and the ΛCDM model describing the dark matter era and the dark energy era. Historically, the possibility of a primordial stiff matter era first appeared in the cosmological model of Zel'dovich where the primordial universe is assumed to be made of a cold gas of baryons. A primordial stiff matter era also occurs in recent cosmological models where dark matter is made of relativistic self-gravitating Bose-Einstein condensates (BECs). When the internal energy of the dark fluid mimicking stiff matter is positive, the primordial universe is singular like in the standard big bang theory. It expands from an initial state with a vanishing scale factor and an infinite density. We consider the possibility that the internal energy of the dark fluid is negative (while, of course, its total energy density is positive), so that it mimics anti-stiff matter. This happens, for example, when the BECs have an attractive self-interaction with a negative scattering length. In that case, the primordial universe is nonsingular and
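
    A hedged reconstruction, in LaTeX notation, of how the two scaling regimes quoted above follow from the quadratic equation of state together with standard FRW relations (this is background for the reader, not the paper's full analytical solution):

      \varepsilon = \rho c^2 + K\rho^2, \qquad \rho \propto a^{-3} \ \text{(rest-mass conservation)}
      \;\Rightarrow\;
      \varepsilon \simeq K\rho^2 \propto a^{-6} \ \text{(early, stiff)}, \qquad
      \varepsilon \simeq \rho c^2 \propto a^{-3} \ \text{(late, dust)}.
      % With H^2 = (\dot a / a)^2 = 8\pi G \varepsilon / (3c^2), these limits give
      % a \propto t^{1/3} during the stiff era and a \propto t^{2/3} during the matter era,
      % before the cosmological constant takes over.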

  18. Second Nuclear Era

    Energy Technology Data Exchange (ETDEWEB)

    Weinberg, A.M.; Spiewak, I.; Barkenbus, J.N.; Livingston, R.S.; Phung, D.L.

    1984-03-01

    The Institute for Energy Analysis with support from The Andrew W. Mellon Foundation has studied the decline of the present nuclear era in the United States and the characteristics of a Second Nuclear Era which might be instrumental in restoring nuclear power to an appropriate place in the energy options of our country. The study has determined that reactors operating today are much safer than they were at the time of the TMI accident. A number of concepts for a supersafe reactor were reviewed and at least two were found that show considerable promise, the PIUS, a Swedish pressurized water design, and a gas-cooled modular design of German and US origin. Although new, safer, incrementally improved, conventional reactors are under study by the nuclear industry, the complete lack of new orders in the United States will slow their introduction and they are likely to be more expensive than present designs. The study recommends that supersafe reactors be taken seriously and that federal and private funds both be used to design and, if feasible, to build a prototype reactor of substantial size. 146 references, 8 figures, 2 tables.

  19. How extreme is extreme hourly precipitation?

    Science.gov (United States)

    Papalexiou, Simon Michael; Dialynas, Yannis G.; Pappas, Christoforos

    2016-04-01

    The importance of accurate representation of precipitation at fine time scales (e.g., hourly), directly associated with flash flood events, is crucial in hydrological design and prediction. The upper part of a probability distribution, known as the distribution tail, determines the behaviour of extreme events. In general, and loosely speaking, tails can be categorized into two families: the subexponential and the hyperexponential family, with the first generating more intense and more frequent extremes compared to the latter. In past studies, the focus has been mainly on daily precipitation, with the Gamma distribution being the most popular model. Here, we investigate the behaviour of tails of hourly precipitation by comparing the upper part of empirical distributions of thousands of records with three general types of tails corresponding to the Pareto, Lognormal, and Weibull distributions. Specifically, we use thousands of hourly rainfall records from all over the USA. The analysis indicates that heavier-tailed distributions describe the observed hourly rainfall extremes better than lighter-tailed ones. Traditional representations of the marginal distribution of hourly rainfall may therefore significantly deviate from the observed behaviour of extremes, with direct implications for hydroclimatic variable modelling and engineering design.
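
    A hedged sketch of the kind of tail comparison described above, fitting Pareto (generalized Pareto), Lognormal and Weibull models to excesses over a high threshold; the hourly series below is synthetic, not one of the USA records used in the study.

      # Sketch: compare candidate tail models on threshold excesses by log-likelihood.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      hourly = rng.gamma(shape=0.5, scale=2.0, size=50_000)       # toy wet-hour depths (mm)
      threshold = np.quantile(hourly, 0.99)
      tail = hourly[hourly > threshold] - threshold                # excesses over threshold

      candidates = {
          "Pareto (GPD)": stats.genpareto,
          "Lognormal": stats.lognorm,
          "Weibull": stats.weibull_min,
      }
      for name, dist in candidates.items():
          params = dist.fit(tail, floc=0.0)                        # keep location fixed at zero
          loglik = np.sum(dist.logpdf(tail, *params))
          print(f"{name:12s}  log-likelihood = {loglik:.1f}")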

  20. ERA-MIN: The European network (ERA-NET) on non-energy raw materials

    Science.gov (United States)

    vidal, o.; christmann, p.; Bol, d.; Goffé, b.; Groth, m.; Kohler, e.; Persson Nelson, k.; Schumacher, k.

    2012-04-01

    Non-energy raw materials are vital for the EU's economy and for the development of environmentally friendly technologies. The EU is the world's largest consumer of non-energy minerals, but it remains dependent on the importation of many metals, as its domestic production is limited to about 3% of world production. We will present the project ERA-MIN, which is an ERA-NET on the Industrial Handling of Raw Materials for European industries, financially supported by the European Commission. The main objectives of ERA-MIN are: 1) Mapping and Networking: interconnecting the members of the currently fragmented European mineral resources research area, with the aim of fostering convergence of public research programs, industry, research institutes, academia and the European Commission; 2) Coordinating: establishing a permanent mechanism for planning and coordination of the European non-energy mineral raw materials research community (ENERC); 3) Roadmapping: defining the most important scientific and technological challenges that should be supported by the EU and its member states; 4) Programming: designing a Joint European Research Programme model and implementing it in a call for proposals open to academic and industrial research. The topics of interest in ERA-MIN are primary continental and marine resources, secondary resources and their related technologies, and substitution and material efficiency, along with transversal topics such as environmental impact, public policy support, mineral intelligence, and public education and teaching. Public scientific research is central to the scope of the ERA-MIN activity, whose consortium is indeed led by a public organisation of fundamental research. Thus, universities and public research organisations are warmly invited to play an active role in defining the scientific questions and challenges that shall determine the European Raw Materials Roadmap and should be addressed by joint programming at the European scale.

  1. Nasionalisme di Era Internet

    Directory of Open Access Journals (Sweden)

    Danu Widhyatmoko

    2015-07-01

    Full Text Available Nationalism and national life are moving into a new phase. The Internet has become a new medium that opens up many opportunities to build a sense of nationalism for the country. This paper contains a review of nationalism in the age of the Internet. It begins with an understanding of nationalism, the character of the Internet, and social media, before turning to nationalism in the era of the Internet. The research method used in this paper is a literature study followed by reflective data analysis. With the reflective analysis method, the authors compared the collected material with the existing literature and with the circumstances or phenomena that occur, so that rational and scientifically grounded conclusions could be drawn.

  2. Attitude extremity, consensus and diagnosticity

    NARCIS (Netherlands)

    van der Pligt, J.; Ester, P.; van der Linden, J.

    1983-01-01

    Studied the effects of attitude extremity on perceived consensus and willingness to ascribe trait terms to others with either pro- or antinuclear attitudes. 611 Ss rated their attitudes toward nuclear energy on a 5-point scale. Results show that attitude extremity affected consensus estimates. Trait

  3. Grassland responses to precipitation extremes

    Science.gov (United States)

    Grassland ecosystems are naturally subjected to periods of prolonged drought and sequences of wet years. Climate change is expected to enhance the magnitude and frequency of extreme events at the intraannual and multiyear scales. Are grassland responses to extreme precipitation simply a response to ...

  4. Gauss-Bonnet Cosmology Unifying Late and Early-time Acceleration Eras with Intermediate Eras

    CERN Document Server

    Oikonomou, V K

    2016-01-01

    In this paper we demonstrate that with vacuum $F(G)$ gravity it is possible to describe the unification of the late and early-time acceleration eras with the radiation and matter domination eras. The Hubble rate of the unified evolution contains two mild singularities, so-called Type IV singularities, and the evolution itself has some appealing features, such as the existence of a deceleration-acceleration transition at late times. We also address quantitatively a fundamental question related to modified gravity descriptions of cosmological evolution: is it possible for all modified gravity descriptions of our Universe's evolution to produce a nearly scale invariant spectrum of primordial curvature perturbations? As we demonstrate, the answer for the $F(G)$ description is no, since the resulting power spectrum is not scale invariant, in contrast to the $F(R)$ description studied in the literature. Therefore, although the cosmological evolution can be realized in the context of vacuum $F(G)$ gravity, the evolu...

  5. Changes of Frequency of Summer Precipitation Extremes over the Yangtze River in Association with Large-scale Oceanic-atmospheric Conditions

    Institute of Scientific and Technical Information of China (English)

    WANG Yi; YAN Zhongwei

    2011-01-01

    Changes of the frequency of precipitation extremes (the number of days with daily precipitation exceeding the 90th percentile of a daily climatology, referred to as R90N) in summer (June-August) over the mid-lower reaches of the Yangtze River are analyzed based on daily observations during 1961-2007. The first singular value decomposition (SVD) mode of R90N is linked to an ENSO-like mode of the sea surface temperature anomalies (SSTA) in the previous winter. Responses of different grades of precipitation events to the climatic mode are compared. It is notable that the frequency of summer precipitation extremes is significantly related to the SSTA in the Pacific, while those of light and moderate precipitation are not. It is suggested that the previously well-recognized impact of ENSO on summer rainfall along the Yangtze River is essentially due to a response in summer precipitation extremes in the region, in association with the East Asia-Pacific (EAP) teleconnection pattern. A negative relationship is found between the East Asian Summer Monsoon (EASM) and precipitation extremes over the mid-lower reaches of the Yangtze River. In contrast, light rainfall processes are independent of the SST and EASM variations.
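
    A minimal sketch (toy data, pandas) of the extreme-frequency index defined above: R90N, the count of summer (June-August) days whose precipitation exceeds the 90th percentile of the daily climatology. The 0.1 mm wet-day cutoff and the use of a single all-wet-days percentile are simplifying assumptions of this sketch, not necessarily the paper's exact definition.

      # Sketch: count summer days above the 90th-percentile daily threshold, per year.
      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(4)
      dates = pd.date_range("1961-01-01", "2007-12-31", freq="D")
      precip = pd.Series(rng.gamma(shape=0.4, scale=8.0, size=len(dates)), index=dates)

      wet = precip[precip > 0.1]                        # wet days only (assumed cutoff)
      p90 = wet.quantile(0.90)                          # 90th percentile of the climatology

      jja = precip[precip.index.month.isin([6, 7, 8])]
      r90n = (jja > p90).groupby(jja.index.year).sum()  # extreme-day count per summer
      print(r90n.head())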

  6. Nanotechnology: A new era for photodetection?

    Science.gov (United States)

    Ambrosio, M.; Ambrosio, A.; Ambrosone, G.; Campajola, L.; Cantele, G.; Carillo, V.; Coscia, U.; Iadonisi, G.; Ninno, D.; Maddalena, P.; Perillo, E.; Raulo, A.; Russo, P.; Trani, F.; Esposito, E.; Grossi, V.; Passacantando, M.; Santucci, S.; Allegrini, M.; Gucciardi, P. G.; Patanè, S.; Bobba, F.; Di Bartolomeo, A.; Giubileo, F.; Iemmo, L.; Scarfato, A.; Cucolo, A. M.

    2009-10-01

    Nowadays we live in the so-called "Silicon Era", in which devices based on the silicon technology permeate all aspects of our daily life. One can simply think how much silicon is in the everyday household objects, gadgets and appliances. The impact of silicon technology has been very relevant in photodetection as well. It enables designing large or very large-scale integration devices, in particular microchips and pixelled detectors like the Silicon Photo Multiplier made of micrometric channels grouped in mm² pixels. However, on the horizon, the recent development of nanotechnologies is opening a new direction in the design of sub-micron photodevices, owing to the capability to deal with individual molecules of compounds or to chemically grow various kinds of materials. Among them, carbon compounds appear to be the most promising materials being chemically very similar to silicon, abundant and easy to handle. In particular, carbon nanotubes (CNT) are a very intriguing new form of material, whose properties are being studied worldwide providing important results. The photoelectric effects observed on carbon nanotubes indicate the possibility to build photodetectors based on CNTs inducing many people to claim that we are at the beginning of a Post Silicon Era or of the Carbon Era. In this paper, we report on the most important achievements obtained on the application of nanotechnologies to photodetection and medical imaging, as well as to the development of radiation detectors for astro-particle physics experiments.

  7. Nanotechnology: A new era for photodetection?

    Energy Technology Data Exchange (ETDEWEB)

    Ambrosio, M. [INFN Sezione di Napoli, Via Cintia 2, 80125 Napoli (Italy)], E-mail: ambrosio@na.infn.it; Ambrosio, A.; Ambrosone, G.; Campajola, L.; Cantele, G.; Carillo, V.; Coscia, U.; Iadonisi, G.; Ninno, D.; Maddalena, P.; Perillo, E.; Raulo, A.; Russo, P.; Trani, F. [INFN Sezione di Napoli, Via Cintia 2, 80125 Napoli (Italy); Dipartimento di Fisica, Università 'Federico II' di Napoli (Italy); Esposito, E. [INFN Sezione di Napoli, Via Cintia 2, 80125 Napoli (Italy); Istituto di Cibernetica 'E. Caianiello' (Italy); Grossi, V.; Passacantando, M.; Santucci, S. [INFN Sezione di Napoli, Via Cintia 2, 80125 Napoli (Italy); Dipartimento di Fisica, Università dell'Aquila (Italy); Allegrini, M. [INFN Sezione di Napoli, Via Cintia 2, 80125 Napoli (Italy); Dipartimento di Fisica, Università di Pisa (Italy); Gucciardi, P.G. [INFN Sezione di Napoli, Via Cintia 2, 80125 Napoli (Italy); CNR-IPCF Sezione di Messina (Italy)] (and others)

    2009-10-21

    Nowadays we live in the so-called 'Silicon Era', in which devices based on the silicon technology permeate all aspects of our daily life. One can simply think how much silicon is in the everyday household objects, gadgets and appliances. The impact of silicon technology has been very relevant in photodetection as well. It enables designing large or very large-scale integration devices, in particular microchips and pixelled detectors like the Silicon Photo Multiplier made of micrometric channels grouped in mm² pixels. However, on the horizon, the recent development of nanotechnologies is opening a new direction in the design of sub-micron photodevices, owing to the capability to deal with individual molecules of compounds or to chemically grow various kinds of materials. Among them, carbon compounds appear to be the most promising materials being chemically very similar to silicon, abundant and easy to handle. In particular, carbon nanotubes (CNT) are a very intriguing new form of material, whose properties are being studied worldwide providing important results. The photoelectric effects observed on carbon nanotubes indicate the possibility to build photodetectors based on CNTs inducing many people to claim that we are at the beginning of a Post Silicon Era or of the Carbon Era. In this paper, we report on the most important achievements obtained on the application of nanotechnologies to photodetection and medical imaging, as well as to the development of radiation detectors for astro-particle physics experiments.

  8. The eras of radiation, matter, and dark energy: new information from the Planck Collaboration

    CERN Document Server

    Cahill, Kevin

    2016-01-01

    Data released by the Planck Collaboration in 2015 imply new dates for the era of radiation, the era of matter, and the era of dark energy. The era of radiation ended, and the era of matter began, when the density of radiation dropped below that of matter. This happened 51,953 \\pm 2236 years after the time of infinite redshift when the ratio a(t)/a_0 of scale factors was (2.9332 \\pm 0.0711) x 10^{-4}. The era of matter ended, and the era of dark energy began, when the density of matter dropped below that of dark energy (assumed constant). This happened 10.1928 \\pm 0.0375 Gyr after the time of infinite redshift when the scale-factor ratio was 0.7646 \\pm 0.0168. The era of dark energy started 3.606 billion years ago. In this pedagogical paper, five figures trace the evolution of the densities of radiation and matter, the scale factor, and the redshift through the eras of radiation, matter, and dark energy.
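
    A back-of-envelope consistency check (not the paper's calculation) of the two scale-factor ratios quoted above, using round Planck-like density parameters; the Omega values below are assumed for illustration.

      # Sketch: era boundaries from density-parameter ratios in a flat FRW model.
      omega_m, omega_lambda, omega_r = 0.308, 0.692, 9.0e-5   # assumed Planck-like values

      a_eq = omega_r / omega_m                    # radiation-matter equality: Omega_r a^-4 = Omega_m a^-3
      a_de = (omega_m / omega_lambda) ** (1 / 3)  # matter-dark-energy equality (constant Lambda)
      print(f"a_eq ~ {a_eq:.4e}  (paper: 2.9332e-4)")
      print(f"a_de ~ {a_de:.4f}     (paper: 0.7646)")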

  9. The reheating era leptogenesis in models with seesaw mechanism

    CERN Document Server

    Hamada, Yuta; Yasuhara, Daiki

    2016-01-01

    The observed baryon asymmetry can be achieved not only by the decay of right-handed neutrinos but also by scattering processes in the reheating era. In the latter scenario, new physics at a high energy scale does not need to be specified, and only two types of higher dimensional operators of the standard model particles are assumed in the previous work. In this paper, we examine the origin of the higher dimensional operators assuming models with a certain seesaw mechanism at the high energy scale. The seesaw mechanism seems to be a simple realization of the reheating era leptogenesis because the lepton number violating interaction is included. We show that the effective interaction giving CP violating phases is provided in several types of models, and also that the reheating era leptogenesis actually works in such models. Additionally, we discuss a possibility for lowering the reheating temperature in the radiative seesaw models, where the large Yukawa coupling is naturally realized.

  10. Comparative study between ERA-20C and ERA INTERIM reanalysis datasets

    Science.gov (United States)

    Krisztina Balázs, Zita; Ihász, István

    2016-04-01

    The continuous development of numerical weather prediction during the 20th century made meteorological forecasts increasingly precise. By the 1990s, scientists therefore sought to re-examine the output of earlier numerical models with the newly available technologies, making it possible to obtain a more accurate picture of the atmosphere's past. Reanalyses were developed to meet this need: they not only represent past atmospheric conditions more precisely, but also help to reveal the errors of the numerical models. These advances underpin trustworthy forecasts and reliable results from global climate models. Thanks to improved data-assimilation methods and further technical developments, several reanalysis projects have been carried out in recent decades. In the current study we carry out a comparative analysis of the two most modern ECMWF reanalysis datasets (ERA INTERIM, ERA-20C). In a first step we selected three periods of ERA-20C (1901-2000, 1901-1950 and 1951-2000) in which we examine several selected parameters, together with a common period (1981-2010) covered by both ERA INTERIM and ERA-20C. Four meteorological parameters are investigated: 500 hPa height, 850 hPa temperature, mean sea level pressure, and ice coverage in the Arctic Circle region. Emphasis is also placed on extreme weather situations. First, we examine the detectability and the changes in frequency of rapid cyclones in the period 1981-2010 in both reanalysis datasets; in addition, we examine the frequency and spatial location of selected cyclones in the three ERA-20C periods (1901-2000, 1901-1950 and 1951-2000). The results reveal the strengths and weaknesses of the two reanalyses, which is a great benefit for all reanalysis users, such as climate researchers, and the developers

  11. Three eras of planetary exploration

    Science.gov (United States)

    Ingersoll, Andrew P.

    2017-01-01

    The number of known exoplanets rose from zero to one in the mid-1990s, and has been doubling approximately every two years ever since. Although this can justifiably be called the beginning of an era, an earlier era began in the 1960s when humankind began exploring the Solar System with spacecraft. Even earlier than that, the era of modern scientific study of the Solar System began with Copernicus, Galileo, Brahe, Kepler and Newton. These eras overlap in time, and many individuals have worked across all three. This Review explores what the past can tell us about the future and what the exploration of the Solar System can teach us about exoplanets, and vice versa. We consider two primary examples: the history of water on Venus and Mars; and the study of Jupiter, including its water, with the Juno spacecraft.

  12. Three eras of climate change

    Energy Technology Data Exchange (ETDEWEB)

    Huq, Saleemul; Toulmin, Camilla

    2006-10-15

    Climate change as a global challenge has evolved through a series of stages in the last few decades. We are now on the brink of a new era which will see the terms of the debate shift once again. The different eras are characterised by the scientific evidence, public perceptions, responses and engagement of different groups to address the problem. In the first era, from the late 1980s to 2000, climate change was seen as an “environmental” problem to do with prevention of future impacts on the planet's climate systems over the next fifty to hundred years, through reductions in emissions of greenhouse gases, known as “mitigation”. The second era can be said to have started around the turn of the millennium, with the recognition that there will be some unavoidable impacts from climate change in the near term (over the next decade or two). These impacts must be coped with through “adaptation”, as well as mitigation, to prevent much more severe and possibly catastrophic impacts in the longer term. It has become clear that many of the impacts of climate change in the near term are likely to fall on the poorest countries and communities. The third era, which we are just about to enter, will see the issue change from tackling an environmental or development problem to a question of “global justice”. It will engage with a much wider array of citizens from around the world than previous eras.

  13. Cosmic string loop distribution on all length scales and at any redshift

    Energy Technology Data Exchange (ETDEWEB)

    Lorenz, Larissa; Ringeval, Christophe [Institute of Mathematics and Physics, Centre for Cosmology, Particle Physics and Phenomenology, Louvain University, 2 Chemin du Cyclotron, 1348 Louvain-la-Neuve (Belgium); Sakellariadou, Mairi, E-mail: larissa.lorenz@uclouvain.be, E-mail: christophe.ringeval@uclouvain.be, E-mail: mairi.sakellariadou@kcl.ac.uk [Department of Physics, King's College, University of London, Strand, London WC2R 2LS (United Kingdom)

    2010-10-01

    We analytically derive the expected number density distribution of Nambu-Goto cosmic string loops at any redshift soon after the time of string formation to today. Our approach is based on the Polchinski-Rocha model of loop formation from long strings which we adjust to fit numerical simulations and complement by a phenomenological modelling of gravitational backreaction. Cosmological evolution drives the loop distribution towards scaling on all length scales in both the radiation and matter era. Memory of any reasonable initial loop distribution in the radiation era is shown to be erased well before Big Bang Nucleosynthesis. In the matter era, the loop distribution reaches full scaling, up to some residual loops from the radiation era which may be present for extremely low string tension. Finally, the number density of loops below the gravitational cutoff is shown to be scale independent, proportional to a negative power of the string tension and insensitive to the details of the backreaction modelling. As an application, we show that the energy density parameter of loops today cannot exceed 10⁻⁵ for currently allowed string tension values, while the loop number density cannot be less than 10⁻⁶ per Mpc³. Our result should provide a more robust basis for studying the cosmological consequences of cosmic string loops.

  14. Gram-scale synthesis, thermal stability, magnetic properties, and microwave absorption application of extremely small Co-C core-shell nanoparticles

    Science.gov (United States)

    Kuang, Daitao; Hou, Lizhen; Yu, Bowen; Liang, Bingbing; Deng, Lianwen; Huang, Han; Ma, Songshan; He, Jun; Wang, Shiliang

    2017-07-01

    Co-C core-shell nanoparticles have been synthesized in large quantity (in grams) by metal-organic chemical vapor deposition with analytical cobalt (III) acetylacetonate as precursor. Extremely small nanoparticles with an average core diameter of 3 nm and a shell thickness of 1-2 nm, and relatively large nanoparticles with an average core diameter of 23 nm and a shell thickness of 5-20 nm, were obtained, depending on the deposition regions. The 3 nm Co nanocores are thermally stable up to 200 °C in air atmosphere, and do not exhibit visible structural and morphological changes after exposure to air at room temperature for 180 d. The extremely small core-shell nanoparticles exhibit typical superparamagnetic behaviors with a small coercivity of 5 Oe, while the relatively large nanoparticles are a typical ferromagnetic material with a high coercivity of 584 Oe. In the microwave absorption tests, a low reflection loss (RL) of -80.3 dB and a large effective bandwidth (frequency range for RL ≤ -10 dB) of 10.1 GHz are obtained in the nanoparticle-paraffin composites with appropriate layer thicknesses and particle contents. This suggests that the as-synthesized Co-C core-shell nanoparticles have a high potential as microwave-absorbing materials.
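
    Reflection-loss values such as those quoted above are conventionally derived from the measured complex permittivity and permeability of the composite via the metal-backed single-layer transmission-line model; the relation below (in LaTeX notation) is that standard model, given here as background rather than as the authors' exact procedure:

      Z_{\mathrm{in}} = Z_0 \sqrt{\mu_r/\varepsilon_r}\,
        \tanh\!\Big( j\,\frac{2\pi f d}{c} \sqrt{\mu_r \varepsilon_r} \Big),
      \qquad
      \mathrm{RL}(\mathrm{dB}) = 20 \log_{10} \left| \frac{Z_{\mathrm{in}} - Z_0}{Z_{\mathrm{in}} + Z_0} \right|,
      % d: absorber layer thickness, f: frequency, c: speed of light,
      % eps_r, mu_r: complex relative permittivity and permeability of the composite;
      % RL <= -10 dB corresponds to at least 90% absorption, which defines the effective bandwidth.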

  15. Injury in the era of genomics.

    Science.gov (United States)

    Cobb, J P; Brownstein, B H; Watson, M A; Shannon, W D; Laramie, J M; Qiu, Y; Stormo, G D; Morrissey, J J; Buchman, T G; Karl, I E; Hotchkiss, R S

    2001-03-01

    The traditional approach to the study of biology employs small-scale experimentation that results in the description of a molecular sequence of known function or relevance. In the era of the genome the reverse is true, as large-scale cloning and gene sequencing come first, followed by the use of computational methods to systematically determine gene function and regulation. The overarching goal of this new approach is to translate the knowledge learned from a systematic, global analysis of genomic data into a complete understanding of biology. For investigators who study shock, the specific goal is to increase understanding of the adaptive response to injury at the level of the entire genome. This review describes our initial experience using DNA microarrays to profile stress-induced changes in gene expression. We conclude that efforts to apply genomics to the study of injury are best coordinated by multi-disciplinary groups, because of the extensive expertise required.

  16. Assessment of the Suitability of a Global Hydrodynamic Model in Simulating a Regional-scale Extreme Flood at Finer Spatial Resolutions

    Science.gov (United States)

    Mateo, C. M. R.; Yamazaki, D.; Kim, H.; Champathong, A.; Oki, T.

    2015-12-01

    Global river models (GRMs) are elemental for large-scale predictions and impact analyses. However, they have limited capability in providing accurate flood information at fine resolution for practical purposes. Hyperresolution (~1 km resolution) modelling is believed to improve the representation of topographical constraints, which consequently results in better predictions of surface water flows and flood inundation at regional to global scales. While numerous studies have shown that finer resolutions improve the predictions of catchment-scale floods using local-scale hydrodynamic models, the impact of finer spatial resolution on predictions of large-scale floods using GRMs is rarely examined. In this study, we assessed the suitability of a state-of-the-art hydrodynamic GRM, CaMa-Flood, in the hyperresolution simulation of a regional-scale flood. The impacts of finer spatial resolution and representation of sub-grid processes on simulating the 2011 immense flooding in the Chao Phraya River Basin, Thailand, were investigated. River maps ranging from 30-arcsecond (~1 km) to 5-arcminute (~10 km) spatial resolutions were generated from 90 m resolution HydroSHEDS maps and SRTM3 DEM. Simulations were executed in each spatial resolution with the new multi-directional downstream connectivity (MDC) scheme in CaMa-Flood turned on and off. While the predictive capability of the model slightly improved with finer spatial resolution when the MDC scheme was turned on, it significantly declined when the MDC scheme was turned off; bias increased by 35% and the NSE coefficient decreased by 60%. These findings indicate that GRMs which assume single-downstream-grid flows are not suitable for hyperresolution modelling because of their limited capability to realistically represent floodplain connectivity. When simulating large-scale floods, the MDC scheme is necessary for the following functions: providing additional storage for overbank flows, and enhancing connectivity between floodplains which allows more realistic
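
    A minimal sketch (toy arrays) of the two skill measures cited above for the CaMa-Flood runs: percentage bias and the Nash-Sutcliffe efficiency (NSE) between simulated and observed discharge.

      # Sketch: percentage bias and NSE for a simulated vs. observed hydrograph (toy values).
      import numpy as np

      def pbias(sim, obs):
          return 100.0 * np.sum(sim - obs) / np.sum(obs)

      def nse(sim, obs):
          return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - np.mean(obs)) ** 2)

      obs = np.array([1200.0, 1800.0, 2600.0, 3900.0, 3100.0, 2000.0])   # m^3/s, observed
      sim = np.array([1100.0, 1900.0, 2500.0, 3400.0, 3300.0, 2100.0])   # m^3/s, simulated
      print(f"bias = {pbias(sim, obs):+.1f} %,  NSE = {nse(sim, obs):.2f}")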

  17. Ecosystem-scale volatile organic compound fluxes during an extreme drought in a broadleaf temperate forest of the Missouri Ozarks (central USA)

    Energy Technology Data Exchange (ETDEWEB)

    Seco, Roger [Univ. of California, Irvine, CA (United States); Karl, Thomas [Univ. of Innsbruck (Austria); Guenther, Alex B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Washington State Univ., Pullman, WA (United States); Hosman, Kevin P. [Univ. of Missouri, Columbia, MO (United States); Pallardy, Stephen G. [Univ. of Missouri, Columbia, MO (United States); Gu, Lianhong [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Geron, Chris [U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Harley, Peter [National Center for Atmospheric Research, Boulder, CO (United States); Kim, Saewung [Univ. of California, Irvine, CA (United States)

    2015-07-07

    Considerable amounts and varieties of biogenic volatile organic compounds (BVOCs) are exchanged between vegetation and the surrounding air. These BVOCs play key ecological and atmospheric roles that must be adequately represented for accurately modeling the coupled biosphere–atmosphere–climate earth system. One key uncertainty in existing models is the response of BVOC fluxes to an important global change process: drought. We describe the diurnal and seasonal variation in isoprene, monoterpene, and methanol fluxes from a temperate forest ecosystem before, during, and after an extreme 2012 drought event in the Ozark region of the central USA. BVOC fluxes were dominated by isoprene, which attained high emission rates of up to 35.4 mg m⁻² h⁻¹ at midday. Methanol fluxes were characterized by net deposition in the morning, changing to a net emission flux through the rest of the daylight hours. Net flux of CO2 reached its seasonal maximum approximately a month earlier than isoprenoid fluxes, which highlights the differential response of photosynthesis and isoprenoid emissions to progressing drought conditions. Nevertheless, both processes were strongly suppressed under extreme drought, although isoprene fluxes remained relatively high compared to reported fluxes from other ecosystems. Methanol exchange was less affected by drought throughout the season, confirming the complex processes driving biogenic methanol fluxes. The fraction of daytime (7–17 h) assimilated carbon released back to the atmosphere combining the three BVOCs measured was 2% of gross primary productivity (GPP) and 4.9% of net ecosystem exchange (NEE) on average for our whole measurement campaign, while exceeding 5% of GPP and 10% of NEE just before the strongest drought phase. The MEGANv2.1 model correctly predicted diurnal variations in fluxes driven mainly by light and temperature, although further research is needed to address model BVOC fluxes

  18. Extreme Photonics & Applications

    CERN Document Server

    Hall, Trevor J; Paredes, Sofia A

    2010-01-01

    "Extreme Photonics & Applications" arises from the 2008 NATO Advanced Study Institute in Laser Control & Monitoring in New Materials, Biomedicine, Environment, Security and Defense. Leading experts in the manipulation of light offered by recent advances in laser physics and nanoscience were invited to give lectures in their fields of expertise and participate in discussions on current research, applications and new directions. The sum of their contributions to this book is a primer for the state of scientific knowledge and the issues within the subject of photonics taken to the extreme frontiers: molding light at the ultra-finest scales, which represents the beginning of the end to limitations in optical science for the benefit of 21st Century technological societies. Laser light is an exquisite tool for physical and chemical research. Physicists have recently developed pulsed lasers with such short durations that one laser shot takes the time of one molecular vibration or one electron rotation in an ...

  19. Climate exposure of US national parks in a new era of change.

    Directory of Open Access Journals (Sweden)

    William B Monahan

    Full Text Available US national parks are challenged by climate and other forms of broad-scale environmental change that operate beyond administrative boundaries and in some instances are occurring at especially rapid rates. Here, we evaluate the climate change exposure of 289 natural resource parks administered by the US National Park Service (NPS), and ask which are presently (past 10 to 30 years) experiencing extreme (95th percentile) climates relative to their 1901-2012 historical range of variability (HRV). We consider parks in a landscape context (including surrounding 30 km) and evaluate both mean and inter-annual variation in 25 biologically relevant climate variables related to temperature, precipitation, frost and wet day frequencies, vapor pressure, cloud cover, and seasonality. We also consider sensitivity of findings to the moving time window of analysis (10, 20, and 30 year windows). Results show that parks are overwhelmingly at the extreme warm end of historical temperature distributions and this is true for several variables (e.g., annual mean temperature, minimum temperature of the coldest month, mean temperature of the warmest quarter). Precipitation and other moisture patterns are geographically more heterogeneous across parks and show greater variation among variables. Across climate variables, recent inter-annual variation is generally well within the range of variability observed since 1901. Moving window size has a measureable effect on these estimates, but parks with extreme climates also tend to exhibit low sensitivity to the time window of analysis. We highlight particular parks that illustrate different extremes and may facilitate understanding responses of park resources to ongoing climate change. We conclude with discussion of how results relate to anticipated future changes in climate, as well as how they can inform NPS and neighboring land management and planning in a new era of change.

  20. Climate exposure of US national parks in a new era of change.

    Science.gov (United States)

    Monahan, William B; Fisichelli, Nicholas A

    2014-01-01

    US national parks are challenged by climate and other forms of broad-scale environmental change that operate beyond administrative boundaries and in some instances are occurring at especially rapid rates. Here, we evaluate the climate change exposure of 289 natural resource parks administered by the US National Park Service (NPS), and ask which are presently (past 10 to 30 years) experiencing extreme (95th percentile) climates relative to their 1901-2012 historical range of variability (HRV). We consider parks in a landscape context (including surrounding 30 km) and evaluate both mean and inter-annual variation in 25 biologically relevant climate variables related to temperature, precipitation, frost and wet day frequencies, vapor pressure, cloud cover, and seasonality. We also consider sensitivity of findings to the moving time window of analysis (10, 20, and 30 year windows). Results show that parks are overwhelmingly at the extreme warm end of historical temperature distributions and this is true for several variables (e.g., annual mean temperature, minimum temperature of the coldest month, mean temperature of the warmest quarter). Precipitation and other moisture patterns are geographically more heterogeneous across parks and show greater variation among variables. Across climate variables, recent inter-annual variation is generally well within the range of variability observed since 1901. Moving window size has a measureable effect on these estimates, but parks with extreme climates also tend to exhibit low sensitivity to the time window of analysis. We highlight particular parks that illustrate different extremes and may facilitate understanding responses of park resources to ongoing climate change. We conclude with discussion of how results relate to anticipated future changes in climate, as well as how they can inform NPS and neighboring land management and planning in a new era of change.
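
    A hedged sketch (synthetic annual series) of the exposure test described above: whether the most recent 10-, 20- or 30-year mean of a climate variable lies at or beyond the 95th percentile of all same-length moving-window means over the 1901-2012 historical range of variability.

      # Sketch: is the most recent moving-window mean extreme relative to the HRV?
      import numpy as np

      rng = np.random.default_rng(5)
      years = np.arange(1901, 2013)
      annual_temp = 8.0 + 0.01 * (years - 1901) + rng.normal(0, 0.5, len(years))  # toy warming series

      def extreme_vs_hrv(series, window):
          means = np.convolve(series, np.ones(window) / window, mode="valid")  # all window means
          recent = means[-1]                                                   # most recent window
          return recent >= np.percentile(means, 95)

      for w in (10, 20, 30):
          print(f"{w:2d}-yr window: recent mean extreme? {extreme_vs_hrv(annual_temp, w)}")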

  1. Deep Chandra, HST-COS, and Megacam Observations of the Phoenix Cluster: Extreme Star Formation and AGN Feedback on Hundred Kiloparsec Scales

    CERN Document Server

    McDonald, M; van Weeren, R J; Applegate, D E; Bayliss, M; Bautz, M W; Benson, B A; Carlstrom, J E; Bleem, L E; Chatzikos, M; Edge, A C; Fabian, A C; Garmire, G P; Hlavacek-Larrondo, J; Jones-Forman, C; Mantz, A B; Miller, E D; Stalder, B; Veilleux, S; Zuhone, J A

    2015-01-01

    We present new ultraviolet, optical, and X-ray data on the Phoenix galaxy cluster (SPT-CLJ2344-4243). Deep optical imaging reveals previously-undetected filaments of star formation, extending to radii of ~50-100 kpc in multiple directions. Combined UV-optical spectroscopy of the central galaxy reveals a massive (2x10^9 Msun), young (~4.5 Myr) population of stars, consistent with a time-averaged star formation rate of 610 +/- 50 Msun/yr. We report a strong detection of OVI(1032,1038) which appears to originate primarily in shock-heated gas, but may contain a substantial contribution (>1000 Msun/yr) from the cooling intracluster medium. We confirm the presence of deep X-ray cavities in the inner ~10 kpc, which are amongst the most extreme examples of radio-mode feedback detected to date, implying jet powers of 2-7 x10^45 erg/s. We provide evidence that the AGN inflating these cavities may have only recently transitioned from "quasar-mode" to "radio-mode", and may currently be insufficient to completely offset ...

  2. Climatology of Vb-cyclones, physical mechanisms and their impact on extreme precipitation over Central Europe

    Directory of Open Access Journals (Sweden)

    M. Messmer

    2015-05-01

    Full Text Available Cyclones which develop over the western Mediterranean and move northeastward are a major source of extreme weather and are known to be responsible for heavy precipitation over Central Europe and the Alps. As the relevant processes triggering these so-called Vb-events and their impact on extreme precipitation are not yet fully understood, this study focusses on gaining insight into the dynamics of past events. For this, a cyclone detection and tracking tool is applied to the ERA-Interim reanalysis (1979–2013) to identify prominent Vb-situations. Precipitation in the ERA-Interim and the E-OBS datasets is used to evaluate case-to-case precipitation amounts and to assess consistency between the two datasets. Both datasets exhibit high variability in precipitation amounts among different Vb-events. While only 23 % of all Vb-events are associated with extreme precipitation, around 15 % of all extreme precipitation days (99th percentile) over the Alpine region are induced by Vb-events, although Vb-cyclones are rare events (2.3 per year). To obtain a better understanding of the variability within Vb-events, the analysis of the 10 heaviest and lowest precipitation Vb-events reveals noticeable differences in the state of the atmosphere. These differences are most pronounced in the geopotential height and potential vorticity fields, indicating a much stronger cyclone for heavy precipitation events. The related differences in wind direction are responsible for the moisture transport around the Alps and the orographical lifting along the Alps. These effects are the main reasons for a disastrous outcome of Vb-events, and consequently are absent in the Vb-events associated with low precipitation. Hence, our results point out that heavy precipitation related to Vb-events is mainly related to large-scale dynamics rather than to thermodynamic processes.
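
    As a rough back-of-envelope check of these proportions (assuming the 99th percentile is taken over all days and that each extreme-producing Vb-event contributes about one extreme day): 1 % of 365 days gives about 3.7 extreme precipitation days per year, of which 15 % (about 0.55 days per year) are Vb-induced; with 2.3 Vb-cyclones per year, that corresponds to roughly 0.55 / 2.3 ≈ 24 % of Vb-events producing an extreme day, consistent with the 23 % quoted above.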

  3. Data on Vietnam Era Veterans.

    Science.gov (United States)

    Veterans Administration, Washington, DC. Office of the Controller.

    Statistical data are presented on Vietnam era veterans for the following topics: employment status, medical status, compensation and pension, education, housing assistance, expenditures, and demographic information. The estimated number and age of veterans in civil life, categorized by sex and state, and the educational attainment of veterans at…

  4. Deep Chandra, HST-COS, and Megacam Observations of the Phoenix Cluster: Extreme Star Formation and AGN Feedback on Hundred Kiloparsec Scales

    Science.gov (United States)

    McDonald, Michael; McNamara, Brian R.; van Weeren, Reinout J.; Applegate, Douglas E.; Bayliss, Matthew; Bautz, Marshall W.; Benson, Bradford A.; Carlstrom, John E.; Bleem, Lindsey E.; Chatzikos, Marios; Edge, Alastair C.; Fabian, Andrew C.; Garmire, Gordon P.; Hlavacek-Larrondo, Julie; Jones-Forman, Christine; Mantz, Adam B.; Miller, Eric D.; Stalder, Brian; Veilleux, Sylvain; ZuHone, John A.

    2015-10-01

    We present new ultraviolet, optical, and X-ray data on the Phoenix galaxy cluster (SPT-CLJ2344-4243). Deep optical imaging reveals previously undetected filaments of star formation, extending to radii of ˜50-100 kpc in multiple directions. Combined UV-optical spectroscopy of the central galaxy reveals a massive (2 × 109 M⊙), young (˜4.5 Myr) population of stars, consistent with a time-averaged star formation rate of 610 ± 50 M⊙ yr-1. We report a strong detection of O vi λλ1032,1038, which appears to originate primarily in shock-heated gas, but may contain a substantial contribution (>1000 M⊙ yr-1) from the cooling intracluster medium (ICM). We confirm the presence of deep X-ray cavities in the inner ˜10 kpc, which are among the most extreme examples of radio-mode feedback detected to date, implying jet powers of 2-7 × 1045 erg s-1. We provide evidence that the active galactic nucleus inflating these cavities may have only recently transitioned from “quasar-mode” to “radio-mode,” and may currently be insufficient to completely offset cooling. A model-subtracted residual X-ray image reveals evidence for prior episodes of strong radio-mode feedback at radii of ˜100 kpc, with extended “ghost” cavities indicating a prior epoch of feedback roughly 100 Myr ago. This residual image also exhibits significant asymmetry in the inner ˜200 kpc (0.15R500), reminiscent of infalling cool clouds, either due to minor mergers or fragmentation of the cooling ICM. Taken together, these data reveal a rapidly evolving cool core which is rich with structure (both spatially and in temperature), is subject to a variety of highly energetic processes, and yet is cooling rapidly and forming stars along thin, narrow filaments.

  5. Extreme erosion response after wildfire in the Upper Ovens, south-east Australia: Assessment of catchment scale connectivity by an intensive field survey

    Science.gov (United States)

    Box, Walter; Keestra, Saskia; Nyman, Petter; Langhans, Christoph; Sheridan, Gary

    2015-04-01

    South-eastern Australia is generally regarded as one of the world's most fire-prone environments because of its high temperatures, low rainfall and flammable native Eucalyptus forests. Modifications to the landscape by fire can lead to significant changes to erosion rates and hydrological processes. Debris flows in particular have been recognised as a process which increases in frequency as a result of fire. This study used a debris flow event in the east Upper Ovens that occurred on 28 February 2013 as a case study for analysing sediment transport processes and connectivity of sediment sources and sinks. Source areas were identified using 15 cm resolution aerial imagery, and a logistic regression model based on fire severity, aridity index and slope was built to predict the locations of source areas. Deposits were measured by making cross-sections using a combination of a differential GPS and a total station. In total 77 cross-sections were made in a 14.1 km2 sub-catchment and distributed based on channel gradient and width. A more detailed estimation was obtained by making more cross-sections where the volume per area is higher. Particle size distributions of source and sink areas were obtained by a combination of field assessment, photographic image analysis, and sieve and laser diffraction. Sediment was locally eroded, transported and deposited depending on factors such as longitudinal gradient, stream power and the composition of bed and bank material. The role of headwaters as sediment sinks changed dramatically as a result of the extreme erosion event in the wildfire-affected areas. Disconnected headwaters became connected to low-order streams due to debris flow processes in the contributing catchment. However, this redistribution of sediment from headwaters to the drainage network was confined to upper reaches of the Ovens. Below this upper part of the catchment the event resulted in redistribution of sediment already existing in the channel through a

  6. Future Perspectives of ERAS: A Narrative Review on the New Applications of an Established Approach

    Science.gov (United States)

    Marchesini, Maurizio; Allegri, Massimo; Fanelli, Guido

    2016-01-01

    The ERAS (Enhanced Recovery After Surgery) approach is a multimodal, perioperative pathway designed to achieve early recovery after surgery. ERAS has shown documented efficacy in elective surgery, and the concept of a "multimodal" and "multidisciplinary" approach still seems to be of greater importance than any single item within ERAS protocols. New perspectives include the use of ERAS in emergency surgery, where efficacy and safety for the outcome have been documented, and where flexibility of traditional items may add benefits for traditionally high-risk patients. Obstetric surgery, as well, may open wide horizons for future research, since extremely poor data are currently available, and ERAS benefits may even extend to the baby. Finally, the concept of "outcome" may be extended when considering the specific setting of cancer surgery, in which variables like cancer recurrence, early access to adjuvant therapies, and, finally, long-term survival are as important as reduced perioperative complications. In this perspective, different items within ERAS protocols should be reinterpreted and eventually integrated towards "protective" techniques, to develop cancer-specific ERAS approaches keeping pace with the specific aims of oncologic surgery. PMID:27504486

  7. Future Perspectives of ERAS: A Narrative Review on the New Applications of an Established Approach

    Directory of Open Access Journals (Sweden)

    Dario Bugada

    2016-01-01

    Full Text Available The ERAS (Enhanced Recovery After Surgery) approach is a multimodal, perioperative pathway designed to achieve early recovery after surgery. ERAS has shown documented efficacy in elective surgery, and the concept of a "multimodal" and "multidisciplinary" approach still seems to be of greater importance than any single item within ERAS protocols. New perspectives include the use of ERAS in emergency surgery, where efficacy and safety for the outcome have been documented, and where flexibility of traditional items may add benefits for traditionally high-risk patients. Obstetric surgery, as well, may open wide horizons for future research, since extremely poor data are currently available, and ERAS benefits may even extend to the baby. Finally, the concept of "outcome" may be extended when considering the specific setting of cancer surgery, in which variables like cancer recurrence, early access to adjuvant therapies, and, finally, long-term survival are as important as reduced perioperative complications. In this perspective, different items within ERAS protocols should be reinterpreted and eventually integrated towards "protective" techniques, to develop cancer-specific ERAS approaches keeping pace with the specific aims of oncologic surgery.

  8. Introductory overview : a new era of asteroseismology

    Science.gov (United States)

    Christensen-Dalsgaard, J.

    2006-08-01

    In the last few years, asteroseismology of solar-like stars has been converted from a dream to a solid reality. New observational facilities, particularly very stable spectrographs, have allowed the detection and study of oscillations in a number of stars on, and just after, the main sequence, placing increasingly strong constraints on the modelling of stellar interiors. Further great advances are expected in the coming years, from continued ground-based efforts and from space missions. Particularly interesting will be the results from CoRoT, to be launched later this year; on a slightly longer time scale the NASA Kepler mission is expected to provide asteroseismic data for a large number of stars. The interpretation of these data will certainly start a new era of realistic stellar modelling with a strong observational base.

  9. Full scale test SSP 34m blade, edgewise loading LTT. Extreme load and PoC_InvE Data report

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, Magda; Roczek-Sieradzan, A.; Jensen, Find M. (and others)

    2010-09-15

    This report is the second report covering the research and demonstration project 'Experimental blade research: Structural mechanisms in current and future large blades under combined loading', supported by the EUDP program. A 34m wind turbine blade from SSP-Technology A/S has been tested in the edgewise direction (LTT). The blade has been subjected to a thorough examination by means of strain gauges, displacement transducers and a 3D optical measuring system. This data report presents results obtained during full scale testing of the blade up to 80% Risoe load, where 80% Risoe load corresponds to 100% certification load. These pulls at 80% Risoe load were repeated and the results from these pulls were compared. The blade was reinforced according to a Risoe DTU invention, where the trailing edge panels are coupled. The coupling is implemented to prevent out-of-plane deformations and to reduce peeling stresses in the adhesive joints. Test results from measurements with the reinforcement have been compared to results without the coupling. The report presents only the relevant results for the 80% Risoe load and the results applicable for the investigation of the influence of the invention on the profile deformation. (Author)

  10. Statistical Model of Extreme Shear

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Hansen, Kurt Schaldemose

    2004-01-01

    In order to continue cost-optimisation of modern large wind turbines, it is important to continuously increase the knowledge of wind field parameters relevant to design loads. This paper presents a general statistical model that offers site-specific prediction of the probability density function...... by a model that, on a statistically consistent basis, describes the most likely spatial shape of an extreme wind shear event. Predictions from the model have been compared with results from an extreme value data analysis, based on a large number of high-sampled full-scale time series measurements...... are consistent, given the inevitable uncertainties associated with the model as well as with the extreme value data analysis. Keywords: statistical model, extreme wind conditions, statistical analysis, turbulence, wind loading, wind shear, wind turbines....

  11. Vaccinology in the genome era

    OpenAIRE

    Rinaudo, C. Daniela; Telford, John L.; Rappuoli, Rino; Seib, Kate L.

    2009-01-01

    Vaccination has played a significant role in controlling and eliminating life-threatening infectious diseases throughout the world, and yet currently licensed vaccines represent only the tip of the iceberg in terms of controlling human pathogens. However, as we discuss in this Review, the arrival of the genome era has revolutionized vaccine development and catalyzed a shift from conventional culture-based approaches to genome-based vaccinology. The availability of complete bacterial genomes h...

  12. Manajemen Pendidikan dalam Era Reformasi

    Directory of Open Access Journals (Sweden)

    Willem Mantja

    2016-02-01

    Full Text Available Planning, organizing, actuating, and controlling are the functions of educational management, as of management in all other fields. The differences between educational management and other kinds of management lie in the components of its substance. Educational management components include instructional, personnel, student, facilities, financial, and school public relations management. Educational managers in the reform era require competence and managerial skills to perform their jobs as professional managers.

  13. The Era of Super Capitalism

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The world has entered the "super capitalism" era when one third of its economic activities are controlled by less than 3 percent of global financial capital. This year, a global economic recession, triggered by the U.S. subprime mortgage crisis, seems unavoidable. To tackle international financial problems, Tao Dong, Chief Economist for Asia at Credit Suisse First Boston in Hong Kong, shared his insights with China Business Journal. Excerpts follow.

  14. Extremely Preterm Birth

    Science.gov (United States)

    ACOG patient FAQ (FAQ173, June 2016; available as PDF and in Spanish) on extremely preterm birth, including when a baby is considered "preterm".

  15. Scale invariant density perturbations from cyclic cosmology

    Science.gov (United States)

    Frampton, Paul Howard

    2016-04-01

    It is shown how quantum fluctuations of the radiation during the contraction era of a comes back empty (CBE) cyclic cosmology can provide density fluctuations which re-enter the horizon during the subsequent expansion era and at lowest order are scale invariant, in a Harrison-Zel’dovich-Peebles sense. It is necessary to be consistent with observations of large scale structure.
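
    For context, a textbook statement (not a result derived in this record) of what "scale invariant in a Harrison-Zel'dovich-Peebles sense" means for the re-entering density fluctuations: the primordial power spectrum takes the form

        $P(k) \propto k^{n_s}$, with $n_s = 1$,

    equivalently the dimensionless curvature-perturbation spectrum $\Delta^2(k) \propto k^{\,n_s-1}$ is independent of scale, so "scale invariant at lowest order" corresponds to $n_s = 1$ with corrections entering only at higher order.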

  16. Primordial black hole formation in the radiative era: investigation of the critical nature of the collapse

    CERN Document Server

    Musco, Ilia; Polnarev, Alexander G

    2008-01-01

    Following on after two previous papers discussing the formation of primordial black holes in the early universe, we present here results from an in-depth investigation of the extent to which primordial black hole formation in the radiative era can be considered as an example of the critical collapse phenomenon. We focus on initial supra-horizon-scale perturbations of a type which could have come from inflation, with only a growing component and no decaying component. In order to study perturbations with amplitudes extremely close to the supposed critical limit, we have modified our previous computer code with the introduction of an adaptive mesh refinement scheme. This has allowed us to follow black hole formation from perturbations whose amplitudes are up to eight orders of magnitude closer to the threshold than we could do before. We find that scaling-law behaviour continues down to the smallest black hole masses that we are able to follow and we see no evidence of shock production such as has been reported...
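
    For reference, the critical-collapse scaling law that the reported "scaling-law behaviour" refers to has the standard form (the exponent quoted is the commonly cited value for a radiation fluid, not a number taken from this record):

        $M_{\rm BH} = K\,(\delta - \delta_c)^{\gamma}, \qquad \gamma \simeq 0.36$ for a radiation-dominated background,

    where $\delta$ is the perturbation amplitude, $\delta_c$ the critical threshold, and $K$ a constant depending on the perturbation shape; following amplitudes ever closer to $\delta_c$ probes ever smaller black hole masses along this power law.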

  17. Characterizing Extreme Ionospheric Storms

    Science.gov (United States)

    Sparks, L.; Komjathy, A.; Altshuler, E.

    2011-12-01

    Ionospheric storms consist of disturbances of the upper atmosphere that generate regions of enhanced electron density typically lasting several hours. Depending upon the storm magnitude, gradients in electron density can sometimes become large and highly localized. The existence of such localized, dense irregularities is a major source of positioning error for users of the Global Positioning System (GPS). Consequently, satellite-based augmentation systems have been implemented to improve the accuracy and to ensure the integrity of user position estimates derived from GPS measurements. Large-scale irregularities generally do not pose a serious threat to estimate integrity as they can be readily detected by such systems. Of greater concern, however, are highly localized irregularities that interfere with the propagation of a signal detected by a user measurement but are poorly sampled by the receivers in the system network. The most challenging conditions have been found to arise following disturbances of large magnitude that occur only rarely over the course of a solar cycle. These extremely disturbed conditions exhibit behavior distinct from moderately disturbed conditions and, hence, have been designated "extreme storms". In this paper we examine and compare the behavior of the extreme ionospheric storms of solar cycle 23 (or, more precisely, extreme storms occurring between January 1, 2000, and December 31, 2008), as represented in maps of vertical total electron content. To identify these storms, we present a robust means of quantifying the regional magnitude of an ionospheric storm. Ionospheric storms are observed frequently to occur in conjunction with magnetic storms, i.e., periods of geophysical activity as measured by magnetometers. While various geomagnetic indices, such as the disturbance storm time (Dst) and the planetary Kp index, have long been used to rank the magnitudes of distinct magnetic storms, no comparable, generally recognized index exists for

  18. Can reanalysis datasets describe the persistent temperature and precipitation extremes over China?

    Science.gov (United States)

    Zhu, Jian; Huang, Dan-Qing; Yan, Pei-Wen; Huang, Ying; Kuang, Xue-Yuan

    2016-08-01

    Persistent temperature and precipitation extremes may bring damage to the economy and to human health due to their intensity, duration and areal coverage. Understanding the quality of reanalysis datasets in describing these extreme events is important for detection, attribution and model evaluation. In this study, the performances of two reanalysis datasets [the twentieth century reanalysis (20CR) and the Interim ECMWF reanalysis (ERA-Interim)] in reproducing the persistent temperature and precipitation extremes in China are evaluated. For the persistent temperature extremes, the two datasets capture the intensity indices better than the frequency indices. The increasing/decreasing trend of persistent warm/cold extremes has been reasonably detected by the two datasets, particularly in the northern part of China. ERA-Interim better reproduces the climatology and tendency of persistent warm extremes, while 20CR has better skill in depicting the persistent cold extremes. For the persistent precipitation extremes, the two datasets have the ability to reproduce the maximum consecutive 5-day precipitation. The two datasets largely underestimate the maximum consecutive dry days over the northern part of China, while overestimating the maximum consecutive wet days over the southern part of China. For the response of the precipitation extremes to temperature variations, ERA-Interim has good ability to depict the relationship among persistent precipitation extremes, local persistent temperature extremes, and global temperature variations over specific regions.
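
    As a concrete illustration of the kind of indices compared here, the sketch below (a minimal example of my own, not code from the study; the 1 mm wet-day threshold and the synthetic data are assumptions) computes the maximum consecutive dry days and the maximum consecutive 5-day precipitation from a daily series, two of the persistent-precipitation measures named in the abstract.

    ```python
    import numpy as np

    def consecutive_dry_days(precip_mm, wet_threshold=1.0):
        """Length of the longest run of days with precipitation below the threshold."""
        dry = precip_mm < wet_threshold
        longest = current = 0
        for is_dry in dry:
            current = current + 1 if is_dry else 0
            longest = max(longest, current)
        return longest

    def max_consecutive_5day_precip(precip_mm):
        """Maximum precipitation accumulated over any 5 consecutive days."""
        precip = np.asarray(precip_mm, dtype=float)
        if precip.size < 5:
            return float(precip.sum())
        windows = np.convolve(precip, np.ones(5), mode="valid")  # 5-day running sums
        return float(windows.max())

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        daily = rng.gamma(shape=0.5, scale=4.0, size=365)   # synthetic daily totals (mm)
        print("CDD   :", consecutive_dry_days(daily))
        print("Rx5day:", round(max_consecutive_5day_precip(daily), 1), "mm")
    ```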

  19. Return Levels of Temperature Extremes in Southern Pakistan

    Science.gov (United States)

    Zahid, Maida; Lucarini, Valerio; Blender, Richard; Caterina Bramati, Maria

    2017-04-01

    Southern Pakistan (Sindh) is one of the hottest regions in the world and is highly vulnerable to temperature extremes. In order to improve rural and urban planning, information about the recurrence of temperature extremes is required. In this work, return levels of the daily maximum temperature Tmax are estimated, as well as the daily maximum wet-bulb temperature TWmax extremes. The method used is the Peak Over Threshold (POT) and it represents a novelty among the approaches previously used for similar studies in this region. Two main datasets are analyzed: temperatures observed in nine meteorological stations in southern Pakistan from 1980 to 2013, and the ERA Interim data for the nearest corresponding locations. The analysis provides the 2, 5, 10, 25, 50 and 100-year Return Levels (RLs) of temperature extremes. The 90% quantile is found to be a suitable threshold for all stations. We find that the RLs of the observed Tmax are above 50°C in northern stations, and above 45°C in the southern stations. The RLs of the observed TWmax exceed 35°C in the region, which is considered as a limit of survivability. The RLs estimated from the ERA Interim data are lower by 3°C to 5°C than the RLs assessed for the nine meteorological stations. A simple bias correction applied to ERA Interim data improves the RLs remarkably, yet discrepancies are still present. The results have potential implications for the risk assessment of extreme temperatures in Sindh.
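
    A minimal sketch of the Peak Over Threshold workflow the abstract describes (an illustration under assumptions, not the authors' code): exceedances of daily Tmax over the 90% quantile are fitted with a Generalized Pareto distribution and converted to 2-100-year return levels; the synthetic input series and variable names are assumptions.

    ```python
    import numpy as np
    from scipy.stats import genpareto

    def pot_return_levels(daily_tmax, years, return_periods=(2, 5, 10, 25, 50, 100),
                          quantile=0.90):
        u = np.quantile(daily_tmax, quantile)              # threshold (90% quantile)
        exceedances = daily_tmax[daily_tmax > u] - u
        xi, _, sigma = genpareto.fit(exceedances, floc=0)  # shape xi, scale sigma
        lam = exceedances.size / years                     # mean exceedances per year
        levels = {}
        for T in return_periods:
            m = lam * T                                    # expected exceedances in T years
            if abs(xi) > 1e-6:
                levels[T] = u + (sigma / xi) * (m**xi - 1.0)
            else:                                          # exponential (xi -> 0) limit
                levels[T] = u + sigma * np.log(m)
        return levels

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        tmax = 35 + 5 * rng.standard_normal(34 * 365)      # synthetic daily Tmax, 34 years
        for T, z in pot_return_levels(tmax, years=34).items():
            print(f"{T:3d}-year return level: {z:.1f} deg C")
    ```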

  20. Advanced Model for Extreme Lift and Improved Aeroacoustics (AMELIA)

    Science.gov (United States)

    Lichtwardt, Jonathan; Paciano, Eric; Jameson, Tina; Fong, Robert; Marshall, David

    2012-01-01

    With the very recent advent of NASA's Environmentally Responsible Aviation Project (ERA), which is dedicated to designing aircraft that will reduce the impact of aviation on the environment, there is a need for research and development of methodologies to minimize fuel burn and emissions and to reduce community noise produced by regional airliners. ERA tackles airframe technology, propulsion technology, and vehicle systems integration to meet performance objectives in the time frame for the aircraft to be at a Technology Readiness Level (TRL) of 4-6 by the year 2020 (deemed N+2). The preceding project that investigated goals similar to ERA's was NASA's Subsonic Fixed Wing (SFW) project. SFW focused on conducting research to improve prediction methods and technologies that will produce lower-noise, lower-emissions, and higher-performing subsonic aircraft for the Next Generation Air Transportation System. The work described here was performed under NASA Research Announcement (NRA) contract #NNL07AA55C, funded by Subsonic Fixed Wing. The project started in 2007 with the specific goal of conducting a large-scale wind tunnel test along with the development of new and improved predictive codes for advanced powered-lift concepts. Many of the predictive codes were used to refine the wind tunnel model outer mold line design. The goal of the large-scale wind tunnel test was to investigate powered-lift technologies and provide an experimental database to validate current and future modeling techniques. The powered-lift concepts investigated were a Circulation Control (CC) wing in conjunction with over-the-wing mounted engines to entrain the exhaust and further increase the lift generated by CC technologies alone. The NRA was a five-year effort; during the first year the objective was to select and refine CESTOL concepts and then to complete a preliminary design of a large-scale wind tunnel model for the large scale test. During the second, third, and fourth years the large-scale wind

  1. Irrigation mitigates against heat extremes

    Science.gov (United States)

    Thiery, Wim; Fischer, Erich; Visser, Auke; Hirsch, Annette L.; Davin, Edouard L.; Lawrence, Dave; Hauser, Mathias; Seneviratne, Sonia I.

    2017-04-01

    Irrigation is an essential practice for sustaining global food production and many regional economies. Emerging scientific evidence indicates that irrigation substantially affects mean climate conditions in different regions of the world. Yet how this practice influences climate extremes is currently unknown. Here we use gridded observations and ensemble simulations with the Community Earth System Model to assess the impacts of irrigation on climate extremes. While the influence of irrigation on annual mean temperatures is limited, we find a large impact on temperature extremes, with a particularly strong cooling during the hottest day of the year (-0.78 K averaged over irrigated land). The strong influence on hot extremes stems from the timing of irrigation and its influence on land-atmosphere coupling strength. Together these effects result in asymmetric temperature responses, with a more pronounced cooling during hot and/or dry periods. The influence of irrigation is even more pronounced when considering subgrid-scale model output, suggesting that local effects of land management are far more important than previously thought. Finally we find that present-day irrigation is partly masking GHG-induced warming of extreme temperatures, with particularly strong effects in South Asia. Our results overall underline that irrigation substantially reduces our exposure to hot temperature extremes and highlight the need to account for irrigation in future climate projections.
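
    A minimal sketch (not the CESM analysis itself) of the diagnostic behind the quoted -0.78 K figure: the hottest day of the year (TXx) is computed per grid cell for an irrigated and a control simulation, and the difference is averaged over irrigated land. The array shapes, the irrigation mask and the synthetic fields below are assumptions.

    ```python
    import numpy as np

    def txx(daily_tmax):
        """Annual maximum of daily maximum temperature for each grid cell.
        daily_tmax has shape (days, lat, lon)."""
        return daily_tmax.max(axis=0)

    if __name__ == "__main__":
        rng = np.random.default_rng(7)
        control = 20 + 10 * rng.random((365, 48, 96))          # synthetic daily Tmax (deg C)
        irrigated = control - 0.5 * rng.random((365, 48, 96))  # irrigation cools hot days
        irrigated_land = rng.random((48, 96)) > 0.8            # assumed irrigation mask
        d_txx = txx(irrigated) - txx(control)
        print("Mean TXx change over irrigated land:",
              round(float(d_txx[irrigated_land].mean()), 2), "K")
    ```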

  2. Multidimensional extremal dependence coefficients

    OpenAIRE

    2017-01-01

    Extreme value modeling has been attracting the attention of researchers in diverse areas such as the environment, engineering, and finance. Multivariate extreme value distributions are particularly suitable for modeling the tails of multidimensional phenomena. The analysis of the dependence among multivariate maxima is useful to evaluate risk. Here we present new multivariate extreme value models, as well as coefficients to assess multivariate extremal dependence.

  3. The European Extreme Right and Religious Extremism

    Directory of Open Access Journals (Sweden)

    Jean-Yves Camus

    2007-12-01

    Full Text Available The ideology of the Extreme Right in Western Europe is rooted in Catholic fundamentalism and Counter-Revolutionary ideas. However, the Extreme Right, like all other political families, has had to adjust to an increasingly secular society. The old link between religion and the Extreme Right has thus been broken and in fact already was when Fascism overtook Europe: Fascism was secular, sometimes even anti-religious, in its essence. Although Catholic fundamentalists still retain strong positions within the apparatus of several Extreme Right parties (Front National, the vote for the Extreme Right is generally weak among regular churchgoers and strong among non-believers. In several countries, the vote for the Extreme Right is stronger among Protestant voters than among Catholics, since while Catholics may support Christian-Democratic parties, there are very few political parties linked to Protestant churches. Presently, it also seems that Paganism is becoming the dominant religious creed within the Extreme Right. In a multicultural Europe, non-Christian forms of religious fundamentalism such as Islamism also exist with ideological similarities to the Extreme Right, but this is not sufficient to categorize Islamism as a form of Fascism. Some Islamist groups seek alliances with the Extreme Right on the basis of their common dislike for Israel and the West, globalization and individual freedom of thought.

  4. Statistical Model of Extreme Shear

    DEFF Research Database (Denmark)

    Hansen, Kurt Schaldemose; Larsen, Gunner Chr.

    2005-01-01

    In order to continue cost-optimisation of modern large wind turbines, it is important to continuously increase the knowledge of wind field parameters relevant to design loads. This paper presents a general statistical model that offers site-specific prediction of the probability density function...... by a model that, on a statistically consistent basis, describes the most likely spatial shape of an extreme wind shear event. Predictions from the model have been compared with results from an extreme value data analysis, based on a large number of full-scale measurements recorded with a high sampling rate...

  5. Galactoseismology in the GAIA Era

    CERN Document Server

    Chakrabarti, Sukanya

    2016-01-01

    The GAIA satellite will provide unprecedented phase-space information for our Galaxy and enable a new era of Galactic dynamics. We may soon see successful realizations of Galactoseismology, i.e., inferring the characteristics of the Galactic potential and sub-structure from a dynamical analysis of observed perturbations in the gas or stellar disk of the Milky Way. Here, we argue that to maximally take advantage of the GAIA data and other complementary surveys, it is necessary to build comprehensive models for both the stars and the gas. We outline several key morphological puzzles of the Galactic disk and proposed solutions that may soon be tested.

  6. Are hourly precipitation extremes increasing faster than daily precipitation extremes?

    Science.gov (United States)

    Barbero, Renaud; Fowler, Hayley; Blenkinsop, Stephen; Lenderink, Geert

    2016-04-01

    Extreme precipitation events appear to be increasing with climate change in many regions of the world, including the United States. These extreme events have large societal impacts, as seen during the recent Texas-Oklahoma flooding in May 2015, which caused several billion dollars in damage and left 47 dead in its path. Better understanding of past changes in the characteristics of extreme rainfall events is thus critical for reliable projections of future changes. Although it has been documented in several studies that daily precipitation extremes are increasing across parts of the contiguous United States, very few studies have looked at hourly extremes. However, this is of primary importance as recent studies on the temperature scaling of extreme precipitation have shown that increases above the Clausius-Clapeyron rate (~7% °C-1) are possible for hourly precipitation. In this study, we used hourly precipitation data (HPD) from the National Climatic Data Center and extracted more than 1,000 stations across the US with more than 40 years of data spanning the period 1950-2010. As hourly measurements are often associated with a range of issues, the data underwent multiple quality control processes to exclude erroneous data. While no significant changes were found in annual maximum precipitation using both hourly and daily resolution datasets, significant increasing trends in terms of frequency of episodes exceeding present-day 95th percentiles of wet hourly/daily precipitation were observed across a significant portion of the US. The fraction of stations with significant increasing trends falls outside the confidence interval range during all seasons but the summer. While less than 12% of stations exhibit significant trends at the daily scale in the wintertime, more than 45% of stations, mostly clustered in the central and northern United States, show significant increasing trends at the hourly scale. This suggests that short-duration storms have increased faster than daily
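
    A back-of-envelope sketch of the Clausius-Clapeyron (CC) scaling invoked above, assuming ~7% per °C for CC and, purely for illustration, twice that rate (14% per °C) as a "super-CC" case for hourly extremes; these rates frame the expectation that hourly extremes can intensify faster than daily ones.

    ```python
    # Expected fractional increase in precipitation intensity after a given warming,
    # under CC and an assumed super-CC (2xCC) scaling rate.
    cc_rate, super_cc_rate = 0.07, 0.14
    for warming in (1.0, 2.0, 3.0):                     # degrees C of local warming
        cc = (1 + cc_rate) ** warming - 1
        super_cc = (1 + super_cc_rate) ** warming - 1
        print(f"+{warming:.0f} deg C: CC ~ {cc:5.1%} increase, super-CC ~ {super_cc:5.1%}")
    ```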

  7. DAiSES: Dynamic Adaptivity in Support of Extreme Scale Department of Energy Project No. ER25622 Prime Contract No. DE-FG02-04ER25622 Final Report for September 15, 2004-September 14, 2008

    Energy Technology Data Exchange (ETDEWEB)

    PI: Patricia J. Teller, Ph.D.

    2009-05-05

    The DAiSES project [Te04] was focused on enabling conventional operating systems, in particular, those running on extreme scale systems, to dynamically customize system resource management in order to offer applications the best possible environment in which to execute. Such dynamic adaptation allows operating systems to modify the execution environment in response to changes in workload behavior and system state. The main challenges of this project included determination of what operating system (OS) algorithms, policies, and parameters should be adapted, when to adapt them, and how to adapt them. We addressed these challenges by using a combination of static analysis and runtime monitoring and adaptation to identify a priori profitable targets of adaptation and effective heuristics that can be used to dynamically trigger adaptation. Dynamic monitoring and adaptation of the OS was provided by either kernel modifications or the use of KernInst and Kperfmon [Wm04]. Since Linux, an open source OS, was our target OS, patches submitted by kernel developers and researchers often facilitated kernel modifications. KernInst operates on unmodified commodity operating systems, i.e., Solaris and Linux; it is fine-grained, thus, there were few constraints on how the underlying OS can be modified. Dynamically adaptive functionality of operating systems, both in terms of policies and parameters, is intended to deliver the maximum attainable performance of a computational environment and meet, as best as possible, the needs of high-performance applications running on extreme scale systems, while meeting system constraints. DAiSES research endeavored to reach this goal by developing methodologies for dynamic adaptation of OS parameters and policies to manage stateful and stateless resources [Te06] and pursuing the following two objectives: (1) Development of mechanisms to dynamically sense, analyze, and adjust common performance metrics, fluctuating workload situations, and

  9. Runtime Systems for Extreme Scale Platforms

    Science.gov (United States)

    2013-12-01

    Only fragments of the abstract are recoverable: "... column of a pivot, called panel factorization, leads to better performance as it exposes more concurrency in the application. ... By addressing these challenges, this dissertation makes concrete contributions towards addressing some of the key software challenges for ..."

  10. Tarbijalepingud rahvusvahelises eraõiguses / Margus Kingisepp

    Index Scriptorium Estoniae

    Kingisepp, Margus, 1969-

    1997-01-01

    On the regulation of consumer contracts in different countries, the 1955 Hague Convention and the 1980 Rome Convention, international jurisdiction over consumer contracts, and the provisions of private international law in Estonian law.

  12. Study on Logistics Distribution Systems of Large-scale Supermarkets in the Low-carbon Era (低碳时代大型超市物流配送系统研究)

    Institute of Scientific and Technical Information of China (English)

    许弢

    2014-01-01

    Full Text Available Addressing the high energy consumption and heavy pollution of large-scale supermarket logistics distribution systems, this paper formulates, with minimum distribution cost as the objective, a logistics distribution model that allows multiple visits under customer time-window constraints, and tests the validity of the model using a large supermarket chain in Wuhan as a case study. The results show that the genetic-algorithm-encoded low-carbon logistics distribution model can minimize distribution cost, providing managers of large supermarkets with a strong basis for evaluating and improving their logistics distribution systems in a low-carbon environment, and a reference for formulating and adjusting related logistics policies.
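
    A minimal sketch (an assumption about the model structure, not the paper's formulation) of the kind of objective such a time-window-constrained distribution model minimizes: a candidate delivery sequence is scored by travel cost plus a penalty for arriving outside each customer's time window, and a genetic algorithm would then evolve permutations that lower this score. The depot location, customers and parameters below are hypothetical.

    ```python
    import math

    DEPOT = (0.0, 0.0)
    CUSTOMERS = {                      # hypothetical customers: (x, y, earliest, latest)
        "A": (2.0, 1.0, 1.0, 4.0),
        "B": (4.0, 3.0, 2.0, 6.0),
        "C": (1.0, 5.0, 3.0, 8.0),
    }

    def route_cost(sequence, speed=1.0, cost_per_km=1.0, penalty_per_hour=10.0):
        """Travel cost of visiting customers in the given order, plus lateness penalties."""
        pos, clock, cost = DEPOT, 0.0, 0.0
        for name in sequence:
            x, y, earliest, latest = CUSTOMERS[name]
            dist = math.hypot(x - pos[0], y - pos[1])
            cost += dist * cost_per_km
            clock += dist / speed
            clock = max(clock, earliest)               # wait if arriving early
            if clock > latest:                         # late arrival is penalized
                cost += (clock - latest) * penalty_per_hour
            pos = (x, y)
        cost += math.hypot(DEPOT[0] - pos[0], DEPOT[1] - pos[1]) * cost_per_km
        return cost

    best = min(["ABC", "ACB", "BAC", "BCA", "CAB", "CBA"], key=route_cost)
    print("Best visiting order:", "-".join(best), "| cost:", round(route_cost(best), 2))
    ```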

  13. Comparing Evaporative Sources of Terrestrial Precipitation and Their Extremes in MERRA Using Relative Entropy

    Science.gov (United States)

    Dirmeyer, Paul A.; Wei, Jiangfeng; Bosilovich, Michael G.; Mocko, David M.

    2014-01-01

    A quasi-isentropic back trajectory scheme is applied to output from the Modern Era Retrospective-analysis for Research and Applications and a land-only replay with corrected precipitation to estimate surface evaporative sources of moisture supplying precipitation over every ice-free land location for the period 1979-2005. The evaporative source patterns for any location and time period are effectively two dimensional probability distributions. As such, the evaporative sources for extreme situations like droughts or wet intervals can be compared to the corresponding climatological distributions using the method of relative entropy. Significant differences are found to be common and widespread for droughts, but not wet periods, when monthly data are examined. At pentad temporal resolution, which is more able to isolate floods and situations of atmospheric rivers, values of relative entropy over North America are typically 50-400 larger than at monthly time scales. Significant differences suggest that moisture transport may be the key to precipitation extremes. Where evaporative sources do not change significantly, it implies other local causes may underlie the extreme events.
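
    A minimal sketch (an assumed implementation, not the MERRA processing code) of the relative-entropy comparison described above: the evaporative source pattern for an extreme period and the corresponding climatological pattern are treated as two-dimensional probability distributions P and Q, and their difference is summarized by the Kullback-Leibler divergence. The grid size and synthetic fields are assumptions.

    ```python
    import numpy as np

    def relative_entropy(p, q, eps=1e-12):
        """KL divergence D(P||Q) for two gridded, non-negative source fields."""
        p = np.asarray(p, dtype=float).ravel()
        q = np.asarray(q, dtype=float).ravel()
        p = p / p.sum()                       # normalize to probability distributions
        q = q / q.sum()
        mask = p > 0                          # terms with p = 0 contribute nothing
        return float(np.sum(p[mask] * np.log(p[mask] / (q[mask] + eps))))

    if __name__ == "__main__":
        rng = np.random.default_rng(2)
        climatology = rng.random((90, 144))                              # assumed grid
        drought_case = climatology * rng.uniform(0.2, 2.0, size=climatology.shape)
        print("Relative entropy:", round(relative_entropy(drought_case, climatology), 4))
    ```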

  14. Major improvement of altimetry sea level estimations using pressure-derived corrections based on ERA-Interim atmospheric reanalysis

    Science.gov (United States)

    Carrere, Loren; Faugère, Yannice; Ablain, Michaël

    2016-06-01

    The new dynamic atmospheric correction (DAC) and dry tropospheric (DT) correction derived from the ERA-Interim meteorological reanalysis have been computed for the 1992-2013 altimeter period. Using these new corrections significantly improves sea level estimations for short temporal signals, reducing the sea surface height (SSH) error by more than 3 cm in the Southern Ocean and in some shallow water regions. The impact of DT_ERA (DT derived from the ERA-Interim meteorological reanalysis) is also significant in the southern high latitudes for these missions. Concerning more recent missions (Jason-1, Jason-2, and Envisat), results are very similar between ERA-Interim and ECMWF-based corrections: on average for the global ocean, the operational DAC becomes slightly better than DAC_ERA only from the year 2006, likely due to the switch of the operational forcing to a higher spatial resolution. At regional scale, both DACs are similar in the deep ocean but DAC_ERA raises the residual crossovers' variance in some shallow water regions, indicating a slight degradation in the most recent years of the study. In the second decade of altimetry, DT_ERA unexpectedly still gives better results than the operational DT. Concerning climate signals, both DAC_ERA and DT_ERA have a low impact on global mean sea level (MSL) rise trends, but they can have a strong impact on long-term regional trends' estimation, up to several millimeters per year locally.

  15. Identification of the non-stationarity of extreme precipitation events and correlations with large-scale ocean-atmospheric circulation patterns: A case study in the Wei River Basin, China

    Science.gov (United States)

    Liu, Saiyan; Huang, Shengzhi; Huang, Qiang; Xie, Yangyang; Leng, Guoyong; Luan, Jinkai; Song, Xiaoyu; Wei, Xiu; Li, Xiangyang

    2017-05-01

    The investigation of extreme precipitation events in terms of variation characteristics, stationarity, and their underlying causes is of great significance to better understand the regional response of the precipitation variability to global climate change. In this study, the Wei River Basin (WRB), a typical eco-environmentally vulnerable region of the Loess Plateau in China was selected as the study region. A set of precipitation indices was adopted to study the changing patterns of precipitation extremes and the stationarity of extreme precipitation events. Furthermore, the correlations between the Pacific Decadal Oscillation (PDO)/El Niño-Southern Oscillation (ENSO) events and precipitation extremes were explored using the cross wavelet technique. The results indicate that: (1) extreme precipitation events in the WRB are characterized by a significant decrease of consecutive wet days (CWD) at the 95% confidence level; (2) compared with annual precipitation, daily precipitation extremes are much more sensitive to changing environments, and the assumption of stationarity of extreme precipitation in the WRB is invalid, especially in the upstream, thereby introducing large uncertainty to the design and management of water conservancy engineering; (3) both PDO and ENSO events have a strong influence on precipitation extremes in the WRB. These findings highlight the importance of examining the validity of the stationarity assumption in extreme hydrological frequency analysis, which has great implications for the prediction of extreme hydrological events.
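
    An illustrative sketch (not the authors' code) of the sort of trend test behind statements like "a significant decrease of consecutive wet days at the 95% confidence level": a non-parametric Mann-Kendall test applied to an annual index series. The synthetic CWD series is an assumption for demonstration only.

    ```python
    import numpy as np
    from scipy.stats import norm

    def mann_kendall(series, alpha=0.05):
        """Two-sided Mann-Kendall trend test (no-ties variance approximation)."""
        x = np.asarray(series, dtype=float)
        n = x.size
        s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        if s > 0:
            z = (s - 1) / np.sqrt(var_s)
        elif s < 0:
            z = (s + 1) / np.sqrt(var_s)
        else:
            z = 0.0
        p = 2 * (1 - norm.cdf(abs(z)))                    # two-sided p-value
        return z, p, p < alpha

    if __name__ == "__main__":
        rng = np.random.default_rng(3)
        years = np.arange(1961, 2011)
        cwd = 12 - 0.05 * (years - years[0]) + rng.normal(0, 1.5, years.size)
        z, p, significant = mann_kendall(cwd)
        print(f"Z = {z:.2f}, p = {p:.3f}, significant at 95%: {significant}")
    ```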

  16. A new global dataset with extreme sea levels and its application for assessing flood risk

    Science.gov (United States)

    Muis, Sanne; Verlaan, Martin; Winsemius, Hessel; Aerts, Jeroen; Ward, Philip

    2016-04-01

    Extreme sea levels, caused by storm surges and high tides, can have devastating societal impacts. The global coastal population is faced with an increasing trend in flood risk, induced by socio-economic development and climate change. Without action, the increasing trends in flood hazard and exposure will be associated with catastrophic flood losses in the future. The adequate allocation of global investments in adaptation requires an accurate understanding of the current and future coastal flood risk on a global scale. Here we present the first global reanalysis of storm surges and extreme sea levels (GTSR dataset) based on dynamical modelling. GTSR covers the entire world's coastline and consists of time series of tides and surges and estimates of extreme values for various return periods. The dataset is based on two different hydrodynamic models: FES2012 for modelling tides, and GSTM for modelling storm surges. GSTM is forced by meteorological fields from ERA-Interim to simulate storm surges for the period 1979-2014. Validation showed that there is very good agreement between modelled and observed sea levels. Only in regions prone to tropical cyclones are extreme sea levels severely underestimated, due to the limited resolution of the meteorological forcing. This will be resolved in future updates of GTSR. As a first application of GTSR, we estimate that 99 million people are exposed to a 1 in 100 year flood. This is almost 40% lower than estimates based on the DIVA dataset, another global dataset of extreme sea levels. We foresee other applications in climate change impact assessment and risk management, such as assessing changes in storminess, estimating the impacts of sea level rise, and providing warning levels to operational models.
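
    A minimal sketch of one standard way (an assumption about methodology, not the GTSR procedure) to turn a multi-decadal surge-plus-tide time series into the return-period estimates mentioned above: annual maxima are fitted with a Gumbel distribution and evaluated at chosen return periods. The synthetic annual maxima are assumptions.

    ```python
    import numpy as np
    from scipy.stats import gumbel_r

    def return_levels_from_annual_maxima(annual_maxima, return_periods=(10, 100, 1000)):
        loc, scale = gumbel_r.fit(annual_maxima)
        # Return level = quantile at non-exceedance probability 1 - 1/T
        return {T: gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
                for T in return_periods}

    if __name__ == "__main__":
        rng = np.random.default_rng(4)
        maxima = 1.5 + 0.3 * rng.gumbel(size=36)    # synthetic annual maxima (m), 1979-2014
        for T, level in return_levels_from_annual_maxima(maxima).items():
            print(f"1-in-{T:4d}-year sea level: {level:.2f} m")
    ```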

  17. The influence of synoptic airflow on UK daily precipitation extremes. Part II: regional climate model and E-OBS data validation

    Energy Technology Data Exchange (ETDEWEB)

    Maraun, Douglas [Leibniz Institute of Marine Sciences (IFM-GEOMAR), Duesternbrooker Weg 20, 24105, Kiel (Germany); Osborn, Timothy J. [School of Environmental Sciences, Climatic Research Unit, Norwich (United Kingdom); Rust, Henning W. [Freie Universitaet Berlin, Institut fuer Meteorologie, Berlin (Germany)

    2012-07-15

    We investigate how well the variability of extreme daily precipitation events across the United Kingdom is represented in a set of regional climate models and the E-OBS gridded data set. Instead of simply evaluating the climatologies of extreme precipitation measures, we develop an approach to validate the representation of physical mechanisms controlling extreme precipitation variability. In part I of this study we applied a statistical model to investigate the influence of the synoptic scale atmospheric circulation on extreme precipitation using observational rain gauge data. More specifically, airflow strength, direction and vorticity are used as predictors for the parameters of the generalised extreme value (GEV) distribution of local precipitation extremes. Here we employ this statistical model for our validation study. In a first step, the statistical model is calibrated against a gridded precipitation data set provided by the UK Met Office. In a second step, the same statistical model is calibrated against 14 ERA40 driven 25 km resolution RCMs from the ENSEMBLES project and the E-OBS gridded data set. Validation indices describing relevant physical mechanisms are derived from the statistical models for observations and RCMs and are compared using pattern standard deviation, pattern correlation and centered pattern root mean squared error as validation measures. The results for the different RCMs and E-OBS are visualised using Taylor diagrams. We show that the RCMs adequately simulate moderately extreme precipitation and the influence of airflow strength and vorticity on precipitation extremes, but show deficits in representing the influence of airflow direction. Also very rare extremes are misrepresented, but this result is afflicted with a high uncertainty. E-OBS shows considerable biases, in particular in regions of sparse data. The proposed approach might be used to validate other physical relationships in regional as well as global climate models. (orig.)
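
    A minimal sketch (standard Taylor-diagram formulas, assumed rather than taken from the paper) of the three validation measures listed above (pattern standard deviation, pattern correlation and centered pattern root mean squared error), computed between a modelled and an observed spatial field; the synthetic fields are assumptions.

    ```python
    import numpy as np

    def taylor_statistics(model_field, obs_field):
        """Pattern standard deviations, pattern correlation and centered RMSE."""
        m = np.asarray(model_field, dtype=float).ravel()
        o = np.asarray(obs_field, dtype=float).ravel()
        m_anom, o_anom = m - m.mean(), o - o.mean()
        std_m, std_o = m_anom.std(), o_anom.std()
        corr = float(np.corrcoef(m, o)[0, 1])
        # centered RMSE^2 = std_m^2 + std_o^2 - 2 * std_m * std_o * corr
        crmse = float(np.sqrt(std_m**2 + std_o**2 - 2 * std_m * std_o * corr))
        return std_m, std_o, corr, crmse

    if __name__ == "__main__":
        rng = np.random.default_rng(5)
        obs = rng.normal(size=(50, 60))
        model = 0.8 * obs + 0.3 * rng.normal(size=obs.shape)   # imperfect model field
        std_m, std_o, corr, crmse = taylor_statistics(model, obs)
        print(f"sigma_model={std_m:.2f} sigma_obs={std_o:.2f} r={corr:.2f} cRMSE={crmse:.2f}")
    ```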

  18. Legacy to the extreme

    NARCIS (Netherlands)

    A. van Deursen (Arie); T. Kuipers (Tobias); L.M.F. Moonen (Leon)

    2000-01-01

    We explore the differences between developing a system using extreme programming techniques, and maintaining a legacy system. We investigate whether applying extreme programming techniques to legacy maintenance is useful and feasible.

  19. Legacy to the extreme

    NARCIS (Netherlands)

    Deursen, A. van; Kuipers, T.; Moonen, L.M.F.

    2000-01-01

    We explore the differences between developing a system using extreme programming techniques, and maintaining a legacy system. We investigate whether applying extreme programming techniques to legacy maintenance is useful and feasible.

  20. Extreme environment electronics

    CERN Document Server

    Cressler, John D

    2012-01-01

    Unfriendly to conventional electronic devices, circuits, and systems, extreme environments represent a serious challenge to designers and mission architects. The first truly comprehensive guide to this specialized field, Extreme Environment Electronics explains the essential aspects of designing and using devices, circuits, and electronic systems intended to operate in extreme environments, including across wide temperature ranges and in radiation-intense scenarios such as space. The Definitive Guide to Extreme Environment Electronics Featuring contributions by some of the world's foremost exp

  1. Deficiently Extremal Gorenstein Algebras

    Indian Academy of Sciences (India)

    Pavinder Singh

    2011-08-01

    The aim of this article is to study the homological properties of deficiently extremal Gorenstein algebras. We prove that if an odd deficiently extremal Gorenstein algebra has a pure minimal free resolution, then its codimension must be odd. As an application, the structure of the pure minimal free resolution of a nearly extremal Gorenstein algebra is obtained.

  2. Extreme value distributions

    CERN Document Server

    Ahsanullah, Mohammad

    2016-01-01

    The aim of the book is to give a thorough account of the basic theory of extreme value distributions. The book covers a wide range of materials available to date and presents the central ideas and results of extreme value distributions. It will be useful to applied statisticians as well as to statisticians interested in working in the area of extreme value distributions. The monograph gives a self-contained treatment of the theory and applications of extreme value distributions.

  3. Statistical Downscaling of Gusts During Extreme European Winter Storms Using Radial-Basis-Function Networks

    Science.gov (United States)

    Voigt, M.; Lorenz, P.; Kruschke, T.; Osinski, R.; Ulbrich, U.; Leckebusch, G. C.

    2012-04-01

    Winter storms and related gusts can cause extensive socio-economic damage. Knowledge about the occurrence and the small-scale structure of such events may help to make regional estimates of storm losses. For a high spatial and temporal representation, the use of dynamical downscaling methods (RCM) is a cost-intensive and time-consuming option and therefore only applicable to a limited number of events. The current study explores a methodology to provide a statistical downscaling, which offers small-scale structured gust fields from an extended set of large-scale structured events. Radial-basis-function (RBF) networks in combination with bidirectional Kohonen (BDK) maps are used to generate the gust fields on a spatial resolution of 7 km from the 6-hourly mean sea level pressure field from ECMWF reanalysis data. BDK maps are a kind of neural network which handles supervised classification problems. In this study they are used to provide prototypes for the RBF network and give a first-order approximation of the output data. A further interpolation is done by the RBF network. For the training process the 50 most extreme storm events over the North Atlantic area from 1957 to 2011 are used, which have been selected from the ECMWF reanalysis datasets ERA40 and ERA-Interim by an objective wind-based tracking algorithm. These events were downscaled dynamically by application of the DWD model chain GME → COSMO-EU. Different model parameters and their influence on the quality of the generated high-resolution gust fields are studied. It is shown that the statistical RBF network approach delivers reasonable results in modeling the regional gust fields for untrained events.
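
    An illustrative sketch of a Gaussian radial-basis-function regression of the kind described above: coarse MSLP-derived predictors are mapped to gusts at one target location, with fixed centres standing in for the BDK-map prototypes. The kernel width, ridge term and synthetic data are assumptions, not values from the study.

    ```python
    import numpy as np

    def gaussian_design_matrix(X, centres, width):
        """Gaussian RBF activations of each sample with respect to each centre."""
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * width**2))

    def fit_rbf(X, y, centres, width, ridge=1e-6):
        """Least-squares weights for a Gaussian RBF network (small ridge term for stability)."""
        phi = gaussian_design_matrix(X, centres, width)
        A = phi.T @ phi + ridge * np.eye(centres.shape[0])
        return np.linalg.solve(A, phi.T @ y)

    def predict_rbf(X, centres, width, weights):
        return gaussian_design_matrix(X, centres, width) @ weights

    if __name__ == "__main__":
        rng = np.random.default_rng(6)
        X_train = rng.normal(size=(50, 8))            # 50 storm events, 8 MSLP predictors
        y_train = X_train @ rng.normal(size=8) + 0.1 * rng.normal(size=50)  # gusts (m/s)
        centres = X_train[rng.choice(50, size=10, replace=False)]           # prototypes
        w = fit_rbf(X_train, y_train, centres, width=2.0)
        print("First 3 fitted gusts:",
              predict_rbf(X_train[:3], centres, width=2.0, weights=w))
    ```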

  4. ERA [Material gráfico

    OpenAIRE

    Fundación para la Etnografía y el Desarrollo de la Artesanía Canaria

    1990-01-01

    Natural alterations due to plant colonization in the interior of the threshing floor (era); some of the flagstones are missing. One ascends along the road that runs along the bottom of the ravine until reaching the area known as Cueva Bermeja, next to the restaurant "El Centro" and the hermitage; the threshing floor is located beside the school. Age: 20th century. Land designation: rustic with environmental protection. Land classification: rustic. BIC declaration: no. A threshing floor of circular plan ...

  5. El lenguaje en la era digital

    Directory of Open Access Journals (Sweden)

    Juan Carlos Vergara Silva

    1998-02-01

    Based on the interrelation between language and thought, this article considers the fundamental role that language plays in the economic, educational and cultural model generated by the emergence of the digital era, or knowledge era. The article highlights the challenges that an era marked by a digital paradigm creates for the development and use of communication skills, both in higher education teaching and in efficient professional practice.

  6. A new era of thromboelastometry.

    Science.gov (United States)

    Crochemore, Tomaz; Piza, Felipe Maia de Toledo; Rodrigues, Roseny Dos Reis; Guerra, João Carlos de Campos; Ferraz, Leonardo José Rolim; Corrêa, Thiago Domingos

    2017-06-12

    Severe hemorrhage requiring transfusion of allogeneic blood is a common complication in the intensive care unit and is associated with increased morbidity and mortality. Prompt recognition and treatment of the causes of bleeding are essential for effective control of hemorrhage, rationalizing the use of allogeneic blood components and thereby preventing their potential adverse effects. Conventional coagulation tests such as prothrombin time and activated partial thromboplastin time have limitations in predicting bleeding and guiding transfusion therapy in critically ill patients. Viscoelastic tests such as thromboelastography and rotational thromboelastometry allow rapid detection of coagulopathy and goal-directed therapy with specific hemostatic drugs. The new era of thromboelastometry relies on its efficacy, practicality, reproducibility and cost-effectiveness to establish itself as the main diagnostic tool and transfusion guide in patients with severe active bleeding.

  7. Current research status of immunology in the genomic era

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    This review updates the current status of immunology research under the influence of genomics, both conceptually and technologically. It particularly highlights the advantages of employing the high-throughput and large-scale technology, the large genomic database, and bioinformatic power in the immunology research. The fast development in the fields of basic immunology, clinical immunology (tumor and infectious immunology) and vaccine designing is illustrated with respect to the successful usage of genomic strategy. We also speculate the future research directions of immunology in the era of genomics and post-genomics.

  8. Recombination era magnetic fields from axion dark matter

    Science.gov (United States)

    Banik, Nilanjan; Christopherson, Adam J.

    2016-02-01

    We introduce a new mechanism for generating magnetic fields in the recombination era. This Harrison-like mechanism utilizes vorticity in baryons that is sourced through the Bose-Einstein condensate of axions via gravitational interactions. The magnetic fields generated are on galactic scales ˜10 kpc and have a magnitude of the order of B ˜ 10^-23 G today. The field has a greater magnitude than those generated from other mechanisms relying on second-order perturbation theory, and is sufficient to provide a seed for battery mechanisms.

  9. Current research status of immunology in the genomic era

    Institute of Scientific and Technical Information of China (English)

    LI HaoWen; LI dinZhi; ZHAO GuoPing; WANG Ying

    2009-01-01

    This review updates the current status of immunology research under the influence of genomics, both conceptually and technologically. It particularly highlights the advantages of employing the high-throughput and large-scale technology, the large genomic database, and bioinformatic power in the immunology research. The fast development in the fields of basic immunology, clinical immunology (tumor and infectious immunology) and vaccine designing is illustrated with respect to the successful usage of genomic strategy. We also speculate the future research directions of immunology in the era of genomics and post-genomics.

  10. Recombination era magnetic fields from axion dark matter

    CERN Document Server

    Banik, Nilanjan

    2015-01-01

    We introduce a new mechanism for generating magnetic fields in the recombination era. This Harrison-like mechanism utilizes vorticity in baryons that is sourced through the Bose-Einstein condensate of axions via gravitational interactions. The magnetic fields generated are on galactic scales $\sim 10\,{\rm kpc}$ and have a magnitude of the order of $B\sim10^{-23}\,{\rm G}$ today. The field has a greater magnitude than those generated from other mechanisms relying on second order perturbation theory, and is sufficient to provide a seed for battery mechanisms.

  11. A THIRD ERA OF MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Liviu NEAMŢU

    2009-12-01

    Full Text Available Management, like any social activity, follows specific stages of development driven by general trends in society. Following these historical trends, this study summarizes the evolution of management in four stages of a full development cycle, from informal management to a fully formalized one. These stages of development are unevenly represented across the various economic regions of the world. The increasing convergence of management patterns and their generalization across schools of management thought shape the current worldwide development of management. The current stage of this evolution may be called "the third era of management", or the "imperial period", in which the pressures that management exerts on individuals, whether employers or subordinates, are enormous. The evolution of companies, markets and national economies, as well as of the global economy, is driven by current trends in management, leading to very strong shifts in the balance of forces. The world economy is engaged in what is called a "war of resources", and the alternative we believe is necessary is "human management", even though speculative trends toward the concentration of capital constrain any plans by state regulators for a global ethical management.

  12. MEMAKNAI SUMPAH PEMUDA DI ERA REFORMASI

    Directory of Open Access Journals (Sweden)

    Sutejo K. Widodo

    2013-03-01

    Full Text Available The moment of Sumpah Pemuda (the Youth Oath) took place 84 years ago, reflecting a spirit of nationalism that is still very important in this Reformation era. This paper endeavors to dig out the deeper meaning of Sumpah Pemuda in the pre-independence era and to apply it to our contemporary situation. The method used here is historical research using literature resources such as articles, books, and other readings on the internet. It is concluded that the spirit of Sumpah Pemuda should serve as contemplative material and a valuable lesson so that the Reformation era may succeed in achieving the national goals stated in the Constitution: a society that is fair, prosperous, and democratic. Keywords: Sumpah Pemuda, Reformation era, nationalism.

  13. Extreme Velocity Wind Sensor

    Science.gov (United States)

    Perotti, Jose; Voska, Ned (Technical Monitor)

    2002-01-01

    This presentation provides an overview of the development of a new hurricane wind sensor (the Extreme Velocity Wind Sensor) for the Kennedy Space Center (KSC), designed to withstand winds of up to three hundred miles per hour. The proposed Extreme Velocity Wind Sensor contains no moveable components that would be exposed to extreme wind conditions. Topics covered include: the need for a new hurricane wind sensor, conceptual design, software applications, computational fluid dynamic simulations of the design concept, preliminary performance tests, and project status.

  14. Pancreas Transplantation in the Modern Era.

    Science.gov (United States)

    Redfield, Robert R; Rickels, Michael R; Naji, Ali; Odorico, Jon S

    2016-03-01

    The field of pancreas transplantation has evolved from an experimental procedure in the 1980s to become a routine transplant in the modern era. With short- and long-term outcomes continuing to improve and the significant mortality, quality-of-life, and end-organ disease benefits, pancreas transplantation should be offered to more patients. In this article, we review current indications, patient selection, surgical considerations, complications, and outcomes in the modern era of pancreas transplantation.

  15. A New Era of Intelligence Research

    Directory of Open Access Journals (Sweden)

    Andrew R. A. Conway

    2014-04-01

    Full Text Available A consensus definition of intelligence remains elusive but there are many reasons to believe that the field of intelligence is entering a new era of significant progress. The convergence of recent advances in psychometrics, cognitive psychology, and neuroscience has set the stage for the development of stronger theories and more sophisticated models. The establishment of a new open access journal as an outlet for new intelligence research is evidence that the new era has begun.

  16. A New Era of Intelligence Research

    OpenAIRE

    Conway, Andrew R. A.

    2014-01-01

    A consensus definition of intelligence remains elusive but there are many reasons to believe that the field of intelligence is entering a new era of significant progress. The convergence of recent advances in psychometrics, cognitive psychology, and neuroscience has set the stage for the development of stronger theories and more sophisticated models. The establishment of a new open access journal as an outlet for new intelligence research is evidence that the new era has begun.

  17. Reanalysis Data Evaluation to Study Temperature Extremes in Siberia

    Science.gov (United States)

    Shulgina, T. M.; Gordov, E. P.

    2014-12-01

    Ongoing global climate change is strongly pronounced in Siberia, with significant warming in the second half of the 20th century and recent extreme events such as the 2010 heat wave and the 2013 flood in Russia's Far East. To improve our understanding of observed climate extremes and to provide regional decision makers with reliable, scientifically based information on the climate state at high spatial and temporal resolution, we need to work with accurate meteorological data. However, of the 231 stations available across Siberia, only 130 provide homogeneous daily temperature time series. The sparse station network, especially at high latitudes, forces us to use reanalysis data, which may, however, differ from observations. To obtain reliable information on temperature extreme "hot spots" in Siberia, we have compared daily temperatures from the ERA-40, ERA Interim, JRA-25, JRA-55, NCEP/DOE and MERRA reanalyses and from the HadEX2 and GHCNDEX gridded datasets with observations from the RIHMI-WDC/CDIAC dataset for the overlap period 1981-2000. Data agreement was estimated at station coordinates, to which reanalysis data were interpolated using the modified Shepard method. Comparison of 20-year averages of annual mean temperature shows general agreement for Siberia except for the Baikal region, where the reanalyses significantly underestimate the observed temperatures. The annual temperatures closest to the observed ones were obtained from ERA-40 and ERA Interim. Furthermore, t-test results show homogeneity of these datasets, which allows one to combine them for long-term time series analysis. In particular, we compared the combined data with observations for percentile-based extreme indices. In Western Siberia, reanalysis and gridded data accurately reproduce observed daily maximum/minimum temperatures. For East Siberia (the Lake Baikal area), ERA Interim slightly underestimates the TN90p and TX90p values. The results obtained allow regional decision-makers to get the required high spatial resolution (0,25°×0
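
    As a rough illustration of the percentile-based indices compared above, the following sketch computes a simplified TX90p-style warm-day index; the series, base period and single fixed threshold are assumptions for demonstration (the ETCCDI definition uses a 5-day calendar window and bootstrap resampling, which is omitted here).

    import numpy as np

    def tx90p(tmax_daily, base_slice):
        """Percentage of days with Tmax above the base-period 90th percentile (simplified TX90p)."""
        threshold = np.percentile(tmax_daily[base_slice], 90)   # base-period 90th percentile
        return 100.0 * (tmax_daily > threshold).mean(), threshold

    # Synthetic 20-year daily series of Tmax anomalies (assumed data, not station records)
    rng = np.random.default_rng(0)
    series = rng.normal(loc=0.0, scale=8.0, size=20 * 365)
    pct, thr = tx90p(series, slice(0, 20 * 365))
    print(f"TX90p = {pct:.1f}% of days above {thr:.1f}")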

  18. TANTANGAN DAKWAH DI ERA GLOBALISASI

    Directory of Open Access Journals (Sweden)

    Istina Rakhmawati

    2015-11-01

    Full Text Available THE CHALLENGE OF DAWA IN THE GLOBALIZATION ERA. The portrait of dawa that has developed so far shows a tendency toward doctrination, in which society is likened to a piggy bank that must be filled with a set of beliefs, moral values and practices of life, to be stored and withdrawn in times of need. One of the crucial issues arising from the process of globalization in relation to religious life is the shrinking space for religiosity in human life. At the same time, we can see that some Muslims tend to accept whatever comes from East or West without reserve. In addition, the phenomenon of globalization that most needs to be studied at present is the spread of viewpoints on family relationships, communal harmony and social life, especially those developed in the developed countries that are in fact the main actors of globalization. Therefore, the development of globalization demands that preachers (dai) become better able to adapt to the changing times. Keywords: the challenge of dawa, globalization.

  19. A new Era in Sustainable Development

    Energy Technology Data Exchange (ETDEWEB)

    Bass, Steve

    2007-03-15

    It is 20 years since the World Commission on Environment and Development — the Brundtland Commission — released its influential report on sustainable development. This is now the declared intention of most governments, many international organisations, and an increasing number of businesses and civil society groups. High profile 'intentions' have given rise to a bewildering array of sustainable development plans, tools and business models. But these have not yet triggered the pace, scale, scope and depth of change that is needed to make development sustainable. They leave the underlying causes of unsustainable development largely undisturbed. They include few means for anticipating non-linear changes – from climate change to economic cycles – and for building resilience to them. Consequently, most environmental and welfare measures continue to decline in almost all countries. Much energy has been spent crafting the sustainable development 'toolkit'. But that energy has been channelled largely through a narrow set of international processes and 'elite' national actors. The results are not yet integral to the machinery of government or business, or people's daily lives. This paper calls for energies to be directed in new ways, constructing a truly global endeavour informed by diverse local actors' evidence of 'what works', and focusing more keenly on long-term futures. The key drivers and challenges of a 'new era in sustainable development' are suggested, to elicit ideas and leadership from a richer vein of experience than has been embraced by the formal international endeavours to date. This paper is the first in a series on the sustainable development futures that face key sectors and stakeholder groups.

  20. Classifying Returns as Extreme

    DEFF Research Database (Denmark)

    Christiansen, Charlotte

    2014-01-01

    I consider extreme returns for the stock and bond markets of 14 EU countries using two classification schemes: One, the univariate classification scheme from the previous literature that classifies extreme returns for each market separately, and two, a novel multivariate classification scheme tha...

  1. Anticipating Future Extreme Climate Events for Alaska Using Dynamical Downscaling and Quantile Mapping

    Science.gov (United States)

    Lader, R.; Walsh, J. E.

    2016-12-01

    Alaska is projected to experience major changes in extreme climate during the 21st century, due to greenhouse warming and exacerbated by polar amplification, wherein the Arctic is warming at twice the rate of the Northern Hemisphere as a whole. Given its complex topography, Alaska displays extreme gradients of temperature and precipitation. However, global climate models (GCMs), which typically have a spatial resolution on the order of 100 km, struggle to replicate these extremes. To help resolve this issue, this study employs dynamically downscaled regional climate simulations and quantile-mapping methodologies to provide a full suite of daily model variables at 20 km spatial resolution for Alaska, from 1970 to 2100. These data include downscaled products of the ERA-Interim reanalysis (1979-2015), the GFDL-CM3 historical run (1970-2005), and the GFDL-CM3 RCP 8.5 run (2006-2100). Due to the limited nature of long-term observations and high-resolution modeling in Alaska, these data enable a broad expansion of extremes analysis. This study uses these data to highlight a subset of the 27 climate extremes indices, previously defined by the Expert Team on Climate Change Detection and Indices, as they pertain to climate change in Alaska. These indices are based on the statistical distributions of daily surface temperature and precipitation and focus on threshold exceedances and percentiles. For example, the annual number of days with a daily maximum temperature greater than 25°C is anticipated to triple in many locations in Alaska by the end of the century. Climate extremes can also refer to long-duration events, such as the record-setting warmth that defined the 2015-16 cold season in Alaska. The downscaled climate model simulations indicate that this past winter will be considered normal by as early as the mid-2040s, if we continue to warm according to the business-as-usual RCP 8.5 emissions scenario. This represents an accelerated warming as compared to projections
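
    The quantile-mapping step mentioned above can be sketched roughly as follows; the arrays and distributions are synthetic assumptions, not the actual 20 km downscaled product, and operational bias correction is usually applied per month or season.

    import numpy as np

    def quantile_map(model_hist, obs_hist, model_future):
        """Bias-correct model_future by matching empirical quantiles of model_hist to obs_hist."""
        # empirical quantile of each future value within the historical model distribution
        ranks = np.searchsorted(np.sort(model_hist), model_future) / len(model_hist)
        ranks = np.clip(ranks, 0.0, 1.0)
        # read the observed/reanalysis value at the same quantile
        return np.quantile(obs_hist, ranks)

    # Toy usage: a model that runs 2 degrees too cold with half the spread (assumed numbers)
    rng = np.random.default_rng(1)
    obs = rng.normal(0.0, 5.0, 10000)
    mod = rng.normal(-2.0, 2.5, 10000)
    future = rng.normal(-1.0, 2.5, 1000)
    corrected = quantile_map(mod, obs, future)
    print(round(corrected.mean(), 2), round(corrected.std(), 2))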

  2. Hydrological extremes in the Aksu-Tarim River Basin: Climatology and regime shift

    Science.gov (United States)

    Tao, Hui; Borth, Hartmut; Fraedrich, Klaus; Schneidereit, Andrea; Zhu, Xiuhua

    2016-04-01

    Precipitation data between 1961 and 2010 from 39 meteorological stations in the Tarim River Basin are analyzed to classify and investigate hydrological drought and wetness conditions using the standardized precipitation index (SPI). The leading temporal and spatial variability patterns of hydrological drought were investigated by applying a principal component analysis with Varimax rotation to the SPI on a time scale of 24 months. The results suggest that the western basin is characterized by a clear tendency towards wetter conditions after the mid-1980s, which results from an increase in the number of wet extremes and can be considered a regime shift. Subdividing the period of analysis into two parts (1961-1986 and 1987-2010), this change can be clearly seen in a shift of the probability distribution function of precipitation events. Composite analyses of monthly mean geopotential height and wind fields from the ERA-40 dataset show that the enhanced wetness in the Tarim River Basin after the mid-1980s is closely related to cyclonic anomalies over the European continent and circulation anomalies over the mid-latitudes of the Northern Hemisphere. Further correlation analysis between the principal components of the SPI and large-scale circulation indices shows that hydrological extremes in the Tarim River Basin correlate with indices related to the polar vortex and the subtropical high.
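
    A schematic version of the SPI calculation used above is sketched below, assuming a synthetic monthly precipitation series; an operational SPI fits each calendar month separately and treats zero-precipitation cases explicitly, which is omitted here.

    import numpy as np
    from scipy import stats

    def spi(monthly_precip, scale=24):
        """Standardized precipitation index on a given accumulation scale (schematic)."""
        acc = np.convolve(monthly_precip, np.ones(scale), mode="valid")   # running 24-month totals
        a, loc, scl = stats.gamma.fit(acc, floc=0)                        # gamma fit with zero location
        prob = stats.gamma.cdf(acc, a, loc=loc, scale=scl)                # non-exceedance probability
        return stats.norm.ppf(prob)                                       # equivalent standard normal deviate

    rng = np.random.default_rng(2)
    precip = rng.gamma(shape=2.0, scale=15.0, size=50 * 12)               # 50 years of synthetic monthly totals
    print(spi(precip)[:5].round(2))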

  3. Historical influence of irrigation on climate extremes

    Science.gov (United States)

    Thiery, Wim; Davin, Edouard L.; Lawrence, Dave; Hauser, Mathias; Seneviratne, Sonia I.

    2016-04-01

    Land irrigation is an essential practice sustaining global food production and many regional economies. During recent decades, irrigation amounts have grown rapidly. Emerging scientific evidence indicates that land irrigation substantially affects mean climate conditions in different regions of the world. However, a thorough understanding of the impact of irrigation on extreme climatic conditions, such as heat waves, droughts or intense precipitation, is still lacking. In this context, we aim to assess the historical influence of irrigation on the occurrence of climate extremes. To this end, two simulations are conducted over the period 1910-2010 with a state-of-the-art global climate model (the Community Earth System Model, CESM): a control simulation including all major anthropogenic and natural external forcings except for irrigation, and a second experiment with transient irrigation enabled. The two simulations are evaluated for their ability to represent (i) hot, dry and wet extremes, using the HadEX2 and ERA-Interim datasets as a reference, and (ii) latent heat fluxes, using LandFlux-EVAL. Assuming a linear combination of climatic responses to different forcings, the difference between the two experiments approximates the influence of irrigation. We will analyse the impact of irrigation on a number of climate indices reflecting the intensity and duration of heat waves, with particular attention to the role of soil moisture changes in modulating climate extremes. Furthermore, the contribution of individual biogeophysical processes to the total impact of irrigation on hot extremes is quantified by applying a surface energy balance decomposition technique to the 90th and 99th percentile surface temperature changes.
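
    The "difference between the two experiments" idea can be illustrated with the small sketch below; the daily temperature arrays and the assumed evaporative cooling are placeholders, not CESM output.

    import numpy as np

    def hot_extreme_response(tmax_control, tmax_irrig, percentiles=(90, 99)):
        """Irrigation-minus-control change in upper-percentile daily maximum temperature."""
        return {p: np.percentile(tmax_irrig, p) - np.percentile(tmax_control, p)
                for p in percentiles}

    rng = np.random.default_rng(3)
    control = rng.normal(30.0, 5.0, 365 * 30)              # synthetic control-run Tmax (deg C)
    irrig = control - rng.gamma(2.0, 0.4, control.size)    # assumed evaporative cooling added to the control run
    print(hot_extreme_response(control, irrig))            # negative values indicate cooler hot extremes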

  4. Different People, Different Views on Era

    Institute of Scientific and Technical Information of China (English)

    Qiu Yuanlun; He Nan

    2010-01-01

    History can be divided into various segments based on its various developmental stages; we call these segments eras. The question of the era has been raised for discussion time and time again, often in connection with major transitions in political systems, economic systems and ideology in world affairs, and sometimes also in connection with far-reaching historical events. In the past 15-20 years, many important issues have prompted people to rethink the question of the era, such as economic restructuring in the East and West, the collapse of the former Soviet Union, the wars in Iraq and the former Yugoslavia, the "9-11" event and the wars in Afghanistan and Iraq, as well as major global changes of all kinds resulting from the process of globalization. It should be pointed out that in Western political, academic and business circles there are very few articles devoted entirely to the definition of era; most people only mention the word "era" or similar expressions here and there in their political speeches or writings.

  5. Ecmwf Global Reanalysis Project, Era-40

    Science.gov (United States)

    Uppala, S.; Kållberg, P.; Simmons, A.; Fiorino, M.; Hernandez, A.; Li, Xu; Onogi, K.; Saarinen, S.; Sokka, N.

    ECMWF is currently performing the ERA-40 reanalysis of the global atmosphere for the period 1957-2001, with support from the European Union and several other organizations. The ERA-40 analyses will complement the existing reanalysis datasets from NCEP (1947-2000) and ECMWF (ERA-15, 1979-1993). Reanalyses in general seek to achieve as great a time consistency as possible within the limitations of the data-assimilation scheme and the available observing systems. The historical ground-based WWW observations and observations from special experiments such as GATE, FGGE and ALPEX have been made available to the project, mainly by NCAR/NCEP, ECMWF, JMA and USNAVY. In addition, and to a much larger extent than in ERA-15, ERA-40 makes use of multichannel satellite radiances through a T159L60 3D-variational assimilation starting with data from the first VTPR sounding instrument in 1972 and continuing up to the present SSM/I, TOVS and ATOVS instruments. Cloud Motion Winds are used from 1979 onwards and EUMETSAT has undertaken to reprocess Meteosat winds for 1983-1988. The length of the period allows studies relating to climate change, long-term trends and fluctuations such as El Nino and the QBO. The presentation will describe the project, including its status, the assimilation system and the characteristics of the observing systems through the period, with indications of their performance and impact on medium-range weather forecasts.

  6. Moving in extreme environments

    DEFF Research Database (Denmark)

    Lucas, Samuel J E; Helge, Jørn W; Schütz, Uwe H W;

    2016-01-01

    This review addresses human capacity for movement in the context of extreme loading and with it the combined effects of metabolic, biomechanical and gravitational stress on the human body. This topic encompasses extreme duration, as occurs in ultra-endurance competitions (e.g. adventure racing...... and transcontinental races) and expeditions (e.g. polar crossings), to the more gravitationally limited load carriage (e.g. in the military context). Juxtaposed to these circumstances is the extreme metabolic and mechanical unloading associated with space travel, prolonged bedrest and sedentary lifestyle, which may...

  7. Extremal surface barriers

    Energy Technology Data Exchange (ETDEWEB)

    Engelhardt, Netta; Wall, Aron C. [Department of Physics, University of California,Santa Barbara, CA 93106 (United States)

    2014-03-13

    We present a generic condition for Lorentzian manifolds to have a barrier that limits the reach of boundary-anchored extremal surfaces of arbitrary dimension. We show that any surface with nonpositive extrinsic curvature is a barrier, in the sense that extremal surfaces cannot be continuously deformed past it. Furthermore, the outermost barrier surface has nonnegative extrinsic curvature. Under certain conditions, we show that the existence of trapped surfaces implies a barrier, and conversely. In the context of AdS/CFT, these barriers imply that it is impossible to reconstruct the entire bulk using extremal surfaces. We comment on the implications for the firewall controversy.

  8. A probabilistic risk assessment for the vulnerability of the European carbon cycle to weather extremes: the ecosystem perspective

    Science.gov (United States)

    Rolinski, S.; Rammig, A.; Walz, A.; von Bloh, W.; van Oijen, M.; Thonicke, K.

    2015-03-01

    Extreme weather events are likely to occur more often under climate change and the resulting effects on ecosystems could lead to a further acceleration of climate change. But not all extreme weather events lead to extreme ecosystem response. Here, we focus on hazardous ecosystem behaviour and identify coinciding weather conditions. We use a simple probabilistic risk assessment based on time series of ecosystem behaviour and climate conditions. Given the risk assessment terminology, vulnerability and risk for the previously defined hazard are estimated on the basis of observed hazardous ecosystem behaviour. We apply this approach to extreme responses of terrestrial ecosystems to drought, defining the hazard as a negative net biome productivity over a 12-month period. We show an application for two selected sites using data for 1981-2010 and then apply the method to the pan-European scale for the same period, based on numerical modelling results (LPJmL for ecosystem behaviour; ERA-Interim data for climate). Our site-specific results demonstrate the applicability of the proposed method, using the SPEI to describe the climate condition. The site in Spain provides an example of vulnerability to drought because the expected value of the SPEI is 0.4 lower for hazardous than for non-hazardous ecosystem behaviour. In northern Germany, on the contrary, the site is not vulnerable to drought because the SPEI expectation values imply wetter conditions in the hazard case than in the non-hazard case. At the pan-European scale, ecosystem vulnerability to drought is calculated in the Mediterranean and temperate region, whereas Scandinavian ecosystems are vulnerable under conditions without water shortages. These first model-based applications indicate the conceptual advantages of the proposed method by focusing on the identification of critical weather conditions for which we observe hazardous ecosystem behaviour in the analysed data set. Application of the method to empirical time
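
    The core of the probabilistic risk assessment described above can be mimicked with the toy calculation below; the NBP and SPEI series are synthetic assumptions rather than LPJmL or ERA-Interim output.

    import numpy as np

    def spei_conditioned_on_hazard(nbp_12m, spei):
        """Expected SPEI for hazardous (negative 12-month NBP) vs non-hazardous behaviour."""
        hazard = nbp_12m < 0.0
        return spei[hazard].mean(), spei[~hazard].mean()

    rng = np.random.default_rng(4)
    spei = rng.normal(0.0, 1.0, 360)                 # 30 years of monthly SPEI (synthetic)
    nbp = 20.0 * spei + rng.normal(0.0, 30.0, 360)   # NBP loosely coupled to dryness (an assumption)
    e_hazard, e_no_hazard = spei_conditioned_on_hazard(nbp, spei)
    print(round(e_hazard - e_no_hazard, 2))          # a negative difference suggests drought vulnerability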

  9. Left-Wing Extremism: The Current Threat

    Energy Technology Data Exchange (ETDEWEB)

    Karl A. Seger

    2001-04-30

    Left-wing extremism is "alive and well" both in the US and internationally. Although the current domestic terrorist threat within the U.S. is focused on right-wing extremists, left-wing extremists are also active and have several objectives. Leftist extremists also pose an espionage threat to U.S. interests. While the threat to the U.S. government from leftist extremists has decreased in the past decade, it has not disappeared. There are individuals and organizations within the U.S. who maintain the same ideology that resulted in the growth of left-wing terrorism in this country in the 1970s and 1980s. Some of the leaders from that era are still communicating from Cuba with their followers in the U.S., and new leaders and groups are emerging.

  10. Analysis of extreme events

    CSIR Research Space (South Africa)

    Khuluse, S

    2009-04-01

    Full Text Available ... (ii) determination of the distribution of the damage and (iii) preparation of products that enable prediction of future risk events. The methodology provided by extreme value theory can also be a powerful tool in risk analysis...

  11. Extreme environments and exobiology.

    Science.gov (United States)

    Friedmann, E I

    1993-01-01

    Ecological research on extreme environments can be applied to exobiological problems such as the question of life on Mars. If life forms (fossil or extant) are found on Mars, their study will help to solve fundamental questions about the nature of life on Earth. Extreme environments that are beyond the range of adaptability of their inhabitants are defined as "absolute extreme". Such environments can serve as terrestrial models for the last stages of life in the history of Mars, when the surface cooled down and atmosphere and water disappeared. The cryptoendolithic microbial community in porous rocks of the Ross Desert in Antarctica and the microbial mats at the bottom of frozen Antarctic lakes are such examples. The microbial communities of Siberian permafrost show that, in frozen but stable communities, long-term survival is possible. In the context of terraforming Mars, selected microorganisms isolated from absolute extreme environments are considered for use in creation of a biological carbon cycle.

  12. Venous Ultrasound (Extremities)

    Science.gov (United States)

    Ultrasound - Venous (Extremities). Venous ultrasound uses sound waves to ... limitations of Venous Ultrasound Imaging? What is Venous Ultrasound Imaging? Ultrasound is safe and painless, and produces ...

  13. Statistics of extremes

    CERN Document Server

    Gumbel, E J

    2012-01-01

    This classic text covers order statistics and their exceedances; exact distribution of extremes; the 1st asymptotic distribution; uses of the 1st, 2nd, and 3rd asymptotes; more. 1958 edition. Includes 44 tables and 97 graphs.

  14. Dynamical downscaling of present climate extremal episodes for the BINGO research site of Cyprus

    Science.gov (United States)

    Zittis, George; Hadjinicolaou, Panos; Bruggeman, Adriana; Camera, Corrado; Lelieveld, Jos

    2016-04-01

    Besides global warming, climate change is expected to cause alterations in precipitation amounts and distribution that can be linked to extreme events such as floods or prolonged droughts. This will have a significant impact on strategic societal sectors that base their activities on water resources. While global climate projections inform us about the long term and weather forecasts give useful information only for a few days or weeks ahead, decision-makers and end-users also need guidance on inter-annual to decadal time scales. In this context, the BINGO (Bringing INnovation to onGOing water management - a better future under climate change) H2020 project aims both at reducing the uncertainty of near-term climate predictions and at developing response strategies in order to better manage the remaining uncertainty. One of the project's main objectives is to develop improved decadal predictions at adequate spatiotemporal scales, with a specific focus on extreme precipitation events. The projected rainfall will eventually be used to drive hydrological impact models. BINGO focuses on research sites that encompass river basins, watersheds and urban areas in six European countries: Norway, Cyprus, Germany, Portugal, the Netherlands and Spain. In this study we present the dynamical downscaling of the ERA-Interim dataset for validation purposes at the research site of Cyprus. Five extreme rainfall periods were identified from the observed precipitation archives and were simulated at very high horizontal resolutions (4 to 1 km) using the WRF limited-area atmospheric model. To optimize the performance of the model we tested combinations of three cumulus and five microphysics parameterization schemes, resulting in 15 simulations for each extreme precipitation event. The model output was compared with daily or hourly (where available) representative rain gauge data. A set of statistical metrics was applied in order to objectively select the best
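
    The 3 x 5 test matrix and gauge-based scoring described above might be organised as in the sketch below; the scheme names, synthetic rainfall and metrics are assumptions for illustration, not the BINGO configuration.

    import itertools
    import numpy as np

    CUMULUS = ["kain_fritsch", "grell_freitas", "tiedtke"]           # assumed cumulus scheme names
    MICROPHYSICS = ["wsm3", "wsm6", "thompson", "morrison", "lin"]   # assumed microphysics scheme names

    def score(simulated, observed):
        """Simple validation metrics against rain-gauge observations."""
        err = simulated - observed
        return {"bias": err.mean(), "rmse": np.sqrt((err ** 2).mean())}

    rng = np.random.default_rng(6)
    gauges = rng.gamma(2.0, 5.0, size=48)                            # 48 h of observed hourly rain (synthetic)
    results = {}
    for cu, mp in itertools.product(CUMULUS, MICROPHYSICS):          # 15 combinations
        simulated = gauges * rng.normal(1.0, 0.3) + rng.normal(0, 1, gauges.size)
        results[(cu, mp)] = score(simulated, gauges)

    best = min(results, key=lambda k: results[k]["rmse"])
    print("lowest-RMSE combination:", best)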

  15. Chinese librarianship in the digital era

    CERN Document Server

    Fang, Conghui

    2013-01-01

    The library in China has been transformed by rapid socioeconomic development and the proliferation of the Internet. The issues faced by Chinese libraries and librarians are those faced by library practitioners more globally; however, China also has its own unique set of issues in the digital era, including developmental imbalances between East and West and between urban and rural areas, and the availability of skilled practitioners. Chinese Librarianship in the Digital Era is the first book on Chinese libraries responding to these issues, and more. The first part of the book places the discussion in historical con

  16. Globalization Era vs. International Order Transformation

    Institute of Scientific and Technical Information of China (English)

    Cui Liru; Ma Zongshi

    2009-01-01

    The world has entered a globalization era in the true sense of the expression. In the words of Thomas Friedman, author of the best seller The World Is Flat, though the globalization process began as early as 1492, when Christopher Columbus's first voyage to the Americas opened up trade between the old and the new world, only now are we ushering in "an entirely new era of globalization 3.0." Unlike its two predecessors, globalizations 1.0 and 2.0, the current one, driven increasingly by non-white, non-western institutions, is a truly worldwide process, a matter of crucial significance.

  17. ERA [Material gráfico

    OpenAIRE

    Fundación para la Etnografía y el Desarrollo de la Artesanía Canaria

    1990-01-01

    Age: 20th century. Land designation: rustic land with landscape protection. Land classification: rustic. Declared BIC (Bien de Interés Cultural): no. At the junction between Fagagesto-Moya and Caideros, take the road towards Fagagesto and turn in at the nearby first entrance on the left, following it for 1200 m and taking first right and then left at the remaining junctions, to the end. Climb the hillside 85 m towards the southeast up to the era (threshing floor). An earthen threshing floor, circular in plan, without a wall, with stones EMPEN...

  18. SISTEM BANK SOAL DAERAH TERKALIBRASI UNTUK MENYONGSONG ERA DESENTRALISASI

    Directory of Open Access Journals (Sweden)

    Heri Retnawati

    2015-08-01

    Full Text Available Abstract: Calibrated Local Government Test Item Bank in the Face of Decentralization. The purpose of this study is to develop a calibrated local-government test item bank system to face the era of decentralization. The study used a research and development (R & D) approach. Data were collected through a Delphi study, documentation, observations, and interviews, and were analysed both quantitatively and qualitatively. The project resulted in a calibrated test item bank system, supplemented with a guide book, that has undergone limited try-out, revision, wide-scale try-out, and dissemination. The system was then enriched with the addition of new items for mathematics and English. It is presented on the website uny.ac.id under the Centre for Policy Studies and Testing System of the Research and Community Service Institution (LPPM) of Yogyakarta State University. Keywords: test item bank, decentralization, calibration, local government

  19. Extreme Value distribution for singular measures

    CERN Document Server

    Faranda, Davide; Turchetti, Giorgio; Vaienti, Sandro

    2011-01-01

    In this paper we perform an analytical and numerical study of Extreme Value distributions in discrete dynamical systems that have a singular measure. Using the block maxima approach described in Faranda et al. [2011], we show numerically that the Extreme Value distribution for these maps can be associated with the Generalised Extreme Value family, where the parameters scale with the information dimension. The numerical analyses are performed on a few low-dimensional maps. For the middle-third Cantor set and the Sierpinski triangle obtained using Iterated Function Systems, the experimental parameters show very good agreement with the theoretical values. For strange attractors like the Lozi and Hénon maps a slower convergence to the Generalised Extreme Value distribution is observed. Even in the presence of large statistics the observed convergence is slower when compared with maps which have an absolutely continuous invariant measure. Nevertheless, and within the computed uncertainty range, the results are in good agree...
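
    A rough numerical companion to the block-maxima procedure described above is sketched below for the Hénon map; the observable, block size and map parameters are assumptions following the general recipe, not the paper's exact setup.

    import numpy as np
    from scipy import stats

    def henon_orbit(n, a=1.4, b=0.3, x0=0.1, y0=0.1):
        """x-coordinates of a Henon-map orbit: x' = 1 - a*x**2 + y, y' = b*x."""
        xs = np.empty(n)
        x, y = x0, y0
        for i in range(n):
            x, y = 1.0 - a * x * x + y, b * x
            xs[i] = x
        return xs

    orbit = henon_orbit(200_000)[1000:]                 # discard the transient
    ref = orbit[0]                                      # reference point on the attractor
    obs = -np.log(np.abs(orbit[1:] - ref) + 1e-12)      # observable peaking near the reference point
    blocks = obs[: len(obs) // 1000 * 1000].reshape(-1, 1000).max(axis=1)
    c, loc, scale = stats.genextreme.fit(blocks)        # GEV fit to block maxima
    print(f"GEV shape={c:.2f}, loc={loc:.2f}, scale={scale:.2f}")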

  20. Scale Space Hierarchy

    NARCIS (Netherlands)

    Kuijper, Arjan; Florack, L.M.J.; Viergever, M.A.

    2002-01-01

    We investigate the deep structure of a scale space image. We concentrate on scale space critical points - points with vanishing gradient with respect to both spatial and scale direction. We show that these points are always saddle points. They turn out to be extremely useful, since the iso-intensity

  1. Extreme Programming: Maestro Style

    Science.gov (United States)

    Norris, Jeffrey; Fox, Jason; Rabe, Kenneth; Shu, I-Hsiang; Powell, Mark

    2009-01-01

    "Extreme Programming: Maestro Style" is the name of a computer programming methodology that has evolved as a custom version of a methodology, called extreme programming that has been practiced in the software industry since the late 1990s. The name of this version reflects its origin in the work of the Maestro team at NASA's Jet Propulsion Laboratory that develops software for Mars exploration missions. Extreme programming is oriented toward agile development of software resting on values of simplicity, communication, testing, and aggressiveness. Extreme programming involves use of methods of rapidly building and disseminating institutional knowledge among members of a computer-programming team to give all the members a shared view that matches the view of the customers for whom the software system is to be developed. Extreme programming includes frequent planning by programmers in collaboration with customers, continually examining and rewriting code in striving for the simplest workable software designs, a system metaphor (basically, an abstraction of the system that provides easy-to-remember software-naming conventions and insight into the architecture of the system), programmers working in pairs, adherence to a set of coding standards, collaboration of customers and programmers, frequent verbal communication, frequent releases of software in small increments of development, repeated testing of the developmental software by both programmers and customers, and continuous interaction between the team and the customers. The environment in which the Maestro team works requires the team to quickly adapt to changing needs of its customers. In addition, the team cannot afford to accept unnecessary development risk. Extreme programming enables the Maestro team to remain agile and provide high-quality software and service to its customers. However, several factors in the Maestro environment have made it necessary to modify some of the conventional extreme

  2. Sea surface temperature variability in the North Western Mediterranean Sea (Gulf of Lion) during the Common Era

    Science.gov (United States)

    Sicre, Marie-Alexandrine; Jalali, Bassem; Martrat, Belen; Schmidt, Sabine; Bassetti, Maria-Angela; Kallel, Nejib

    2016-12-01

    This study investigates the multidecadal-scale variability of sea surface temperatures (SSTs) in the convection region of the Gulf of Lion (NW Mediterranean Sea) over the full past 2000 yr (Common Era) using alkenone biomarkers. Our data show colder SSTs by 1.7 °C over most of the first millennium (200-800 AD) and by 1.3 °C during the Little Ice Age (LIA; 1400-1850 AD) than the 20th century mean (17.9 °C). Although on average warmer, those of the Medieval Climate Anomaly (MCA) (1000-1200 AD) were lower by 1 °C. We found a mean SST warming of 2 °C/100 yr over the last century in close agreement with the 0.22 and 0.26 °C/decade values calculated for the western Mediterranean Sea from in situ and satellite data, respectively. Our results also reveal strongly fluctuating SSTs characterized by cold extremes followed by abrupt warming during the LIA. We suggest that the coldest decades of the LIA were likely caused by prevailing negative EA states and associated anticyclone blocking over the North Atlantic resulting in cold continental northeasterly winds to blow over Western Europe and the Mediterranean region.

  3. Extreme Networks' 10-Gigabit Ethernet enables

    CERN Multimedia

    2002-01-01

    " Extreme Networks, Inc.'s 10-Gigabit switching platform enabled researchers to transfer one Terabyte of information from Vancouver to Geneva across a single network hop, the world's first large-scale, end-to-end transfer of its kind" (1/2 page).

  4. Changes in Wind Speed and Extremes in Beijing during 1960-2008 Based on Homogenized Observations

    Institute of Scientific and Technical Information of China (English)

    LI Zhen; YAN Zhongwei; TU Kai; LIU Weidong; WANG Yingchun

    2011-01-01

    Daily observations of wind speed at 12 stations in the Greater Beijing Area during 1960-2008 were homogenized using the Multiple Analysis of Series for Homogenization method. The linear trends in the regional mean annual and seasonal (winter, spring, summer and autumn) wind speed series were -0.26, -0.39, -0.30, -0.12 and -0.22 m s⁻¹ (10 yr)⁻¹, respectively. Winter showed the greatest magnitude of declining wind speed, followed by spring, autumn and summer. The annual and seasonal frequencies of wind speed extremes (days) also decreased, more prominently in winter than in the other seasons. The declining trends in wind speed and extremes were formed mainly by rapid declines during the 1970s and 1980s. The maximum declining trend in wind speed occurred at Chaoyang (CY), a station within the central business district (CBD) of Beijing with the highest level of urbanization. The declining trends were in general smaller in magnitude away from the city center, except in winter, when the maximum declining trend shifted northeastward to rural Miyun (MY). The influence of urbanization on the annual wind speed was estimated to be about -0.05 m s⁻¹ (10 yr)⁻¹ during 1960-2008, accounting for around one fifth of the regional mean declining trend. The annual and seasonal geostrophic wind speeds around Beijing, based on daily mean sea level pressure (MSLP) from the ERA-40 reanalysis dataset, also exhibited decreasing trends, consistent with the results from site observations. A comparative analysis of the MSLP fields between 1966-1975 and 1992-2001 suggested that the influences of both the winter and summer monsoons on Beijing were weaker in the more recent of the two decades. It is suggested that the bulk of the wind in Beijing is influenced considerably by urbanization, while changes in strong winds or wind speed extremes respond mainly to large-scale climate change in the region.
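
    The two small calculations quoted above (a linear trend expressed per decade, and the urbanization share of the total decline) can be reproduced schematically as follows; the wind series is synthetic, and only the -0.05 and -0.26 m s⁻¹ (10 yr)⁻¹ figures are taken from the abstract.

    import numpy as np

    def trend_per_decade(years, values):
        """Least-squares linear trend, expressed per 10 years."""
        slope_per_year = np.polyfit(years, values, 1)[0]
        return 10.0 * slope_per_year

    years = np.arange(1960, 2009)
    rng = np.random.default_rng(7)
    wind = 3.5 - 0.026 * (years - 1960) + rng.normal(0, 0.15, years.size)   # synthetic annual mean wind speed
    print(round(trend_per_decade(years, wind), 2), "m/s per decade")

    urban_share = -0.05 / -0.26                      # urbanization contribution divided by the total trend
    print(f"urbanization share ~ {urban_share:.0%}") # about one fifth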

  5. Dinosaur dynamics in the Jurassic Era

    Science.gov (United States)

    Lee, Scott

    2010-04-01

    Dinosaurs were fascinating animals and elicit much excitement in the classroom. Analysis of fossilized dinosaur trackways permits one to estimate the locomotion speeds and accelerations of these extinct beasts. Such analysis allows one to apply Newton's laws of motion to examples from the Jurassic Era.
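
    One common way such trackway speed estimates are made (an assumption here, since the abstract does not specify the relation used) is the empirical formula attributed to R. McNeill Alexander, v ≈ 0.25 g^0.5 λ^1.67 h^-1.17, with stride length λ and hip height h; a minimal sketch with hypothetical numbers:

    def trackway_speed(stride_length_m, hip_height_m, g=9.81):
        """Estimated speed (m/s) from fossil trackway stride length and hip height (Alexander-style relation)."""
        return 0.25 * g ** 0.5 * stride_length_m ** 1.67 * hip_height_m ** (-1.17)

    # Hypothetical trackway values for illustration only
    print(round(trackway_speed(stride_length_m=3.0, hip_height_m=2.0), 2), "m/s")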

  6. Faculty Recruitment in an Era of Change

    Science.gov (United States)

    Levine, Marilyn; Schimpf, Martin

    2010-01-01

    Faculty recruitment is a challenge for administration and departments, especially in an era of change in the academy. This article builds on information from an interactive conference panel session that focused on faculty recruitment best practices. The article addresses faculty recruitment strategies that focus on the optimization of search…

  7. Time Management in the Digital Era

    Science.gov (United States)

    Wodarz, Nan

    2013-01-01

    School business officials can strike a balance between setting a long-term strategy and responding to short-term situations by implementing time management strategies. This article presents tips for time management that could help boost productivity and save time in this digital era. Tips include decreasing meeting times via Skype or…

  8. Women Reformers in the Progressive Era.

    Science.gov (United States)

    McDonough, Judith

    1999-01-01

    Presents an overview of women in the Progressive Era, providing a glimpse at how women attempted to reform society and simultaneously change ideas about the role of women at the turn of the 20th century. Reviews the various roles of these women, such as suffragettes, individual freedom activists, and labor organizers. (CMK)

  9. The Post-Genomic Era of Cassava

    Science.gov (United States)

    The genomics era revolutionized our efficiency at gathering and disseminating scientific information required for advancing our understanding of plant biology. In the case of cassava, the genomics revolution has not kept pace with other staple food and fiber crops important to global economies. As a...

  10. Green Era Should Herald Smarter Buildings

    Science.gov (United States)

    Carlson, Scott

    2008-01-01

    In the past two decades, the most elite and ambitious colleges have commissioned buildings by "starchitects" for notoriety. However, people live in a green era now and there is a need for a new kind of star architecture to go with it, one in which the building is a star for its efficiency as well as its elegance. The new star architecture would…

  11. A New Era of Open Access?

    Science.gov (United States)

    Mercieca, Paul; Macauley, Peter

    2008-01-01

    There has been a push for open access journals for more than a decade in a higher education and research environment in which the "publish or perish" syndrome is as dominant as ever. This article examines the success, or otherwise, of open access schemes in light of the Excellence in Research for Australia (ERA) initiative. It compares…

  12. The new era for geo-information

    Institute of Scientific and Technical Information of China (English)

    LI DeRen; SHAO ZhenFeng

    2009-01-01

    Along with the advent of Google Earth, Virtual Earth, the next-generation Internet, Web 2.0, grid computing and the smart sensor web comes a new era for geo-information. In this paper, the main features of the new geo-information era are discussed. This new era is characterized by the following features: the users served extend from professionals to the general public; users are also providers of data and information; geospatial data are no longer acquired by measurement-by-specification but by measurement-on-demand through the smart sensor web; and services shift from being data-driven to being application-driven. Problems brought about by the new geo-information era, especially those confronting geo-information science and the geospatial information industry, are analyzed, including disorder in geographic data collection and information proliferation, quality issues in geographic information updating, security issues in geographic information services, and privacy and property issues in sharing geographic information. Strategies concerning standards, planning, laws, technology and applications are then proposed.

  13. Time Management in the Digital Era

    Science.gov (United States)

    Wodarz, Nan

    2013-01-01

    School business officials can strike a balance between setting a long-term strategy and responding to short-term situations by implementing time management strategies. This article presents tips for time management that could help boost productivity and save time in this digital era. Tips include decreasing meeting times via Skype or…

  14. A New Era of Open Access?

    Science.gov (United States)

    Mercieca, Paul; Macauley, Peter

    2008-01-01

    There has been a push for open access journals for more than a decade in a higher education and research environment in which the "publish or perish" syndrome is as dominant as ever. This article examines the success, or otherwise, of open access schemes in light of the Excellence in Research for Australia (ERA) initiative. It compares…

  15. Statistics of Extremes

    KAUST Repository

    Davison, Anthony C.

    2015-04-10

    Statistics of extremes concerns inference for rare events. Often the events have never yet been observed, and their probabilities must therefore be estimated by extrapolation of tail models fitted to available data. Because data concerning the event of interest may be very limited, efficient methods of inference play an important role. This article reviews this domain, emphasizing current research topics. We first sketch the classical theory of extremes for maxima and threshold exceedances of stationary series. We then review multivariate theory, distinguishing asymptotic independence and dependence models, followed by a description of models for spatial and spatiotemporal extreme events. Finally, we discuss inference and describe two applications. Animations illustrate some of the main ideas. © 2015 by Annual Reviews. All rights reserved.
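
    The threshold-exceedance side of the classical theory reviewed above can be illustrated with a minimal peaks-over-threshold fit; the data, threshold choice and extrapolation level below are assumptions for demonstration only.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(8)
    data = rng.standard_t(df=4, size=20_000)              # heavy-tailed synthetic series
    u = np.quantile(data, 0.95)                           # high threshold (95th percentile, so P(X > u) = 0.05)
    exceedances = data[data > u] - u
    c, loc, scale = stats.genpareto.fit(exceedances, floc=0)   # generalized Pareto fit to the tail

    # P(X > z) = P(X > u) * P(X - u > z - u | X > u), extrapolated beyond the sample range
    z = 6.0
    p_exceed = 0.05 * stats.genpareto.sf(z - u, c, loc=0, scale=scale)
    print(f"estimated P(X > {z}) = {p_exceed:.2e}")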

  16. Extremely deformable structures

    CERN Document Server

    2015-01-01

    Recently, a new research stimulus has derived from the observation that soft structures, such as biological systems, but also rubber and gel, may work in a post critical regime, where elastic elements are subject to extreme deformations, though still exhibiting excellent mechanical performances. This is the realm of ‘extreme mechanics’, to which this book is addressed. The possibility of exploiting highly deformable structures opens new and unexpected technological possibilities. In particular, the challenge is the design of deformable and bi-stable mechanisms which can reach superior mechanical performances and can have a strong impact on several high-tech applications, including stretchable electronics, nanotube serpentines, deployable structures for aerospace engineering, cable deployment in the ocean, but also sensors and flexible actuators and vibration absorbers. Readers are introduced to a variety of interrelated topics involving the mechanics of extremely deformable structures, with emphasis on ...

  17. Precursors of extreme increments

    CERN Document Server

    Hallerberg, S; Holstein, D; Kantz, H; Hallerberg, Sarah; Altmann, Eduardo G.; Holstein, Detlef; Kantz, Holger

    2006-01-01

    We investigate precursors and the predictability of extreme events in time series, where the events consist of large increments between successive time steps. In order to understand the predictability of this class of extreme events, we study analytically the prediction of extreme increments in AR(1) processes. The resulting strategies are then applied to predict sudden increases in wind speed recordings. In both cases we evaluate the success of predictions by constructing receiver operating characteristic (ROC) plots. Surprisingly, we obtain better ROC plots for completely uncorrelated Gaussian random numbers than for AR(1)-correlated data. Furthermore, we observe an increase of predictability with increasing event size. Both effects can be understood by using the likelihood ratio as a summary index for smooth ROC curves.
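
    The basic AR(1) experiment described above can be re-created schematically as follows; the lag-one coefficient, event definition and precursor are assumptions in the spirit of the abstract, not the paper's exact choices.

    import numpy as np

    rng = np.random.default_rng(9)
    a, n = 0.8, 200_000
    x = np.empty(n)
    x[0] = 0.0
    noise = rng.normal(0.0, 1.0, n)
    for i in range(1, n):
        x[i] = a * x[i - 1] + noise[i]                      # AR(1) process

    increments = x[1:] - x[:-1]
    events = increments > np.quantile(increments, 0.99)     # extreme events: largest 1% of increments
    precursor = -x[:-1]                                      # a low current value favours a large upward jump

    # a few points on the ROC curve: hit rate vs false-alarm rate for different alarm thresholds
    for q in (0.5, 0.9, 0.99):
        thr = np.quantile(precursor, q)
        alarm = precursor >= thr
        hit = (alarm & events).sum() / events.sum()
        fa = (alarm & ~events).sum() / (~events).sum()
        print(f"alarm above {q:.0%} precursor quantile: hit rate {hit:.2f}, false-alarm rate {fa:.2f}")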

  18. Weather and Climate Extremes.

    Science.gov (United States)

    1997-09-01

    Antarctica's highest temperature: this extreme exceeded the record of 58°F (14.4°C) that occurred on 20 October 1956 at Esperanza (also known as Bahia Esperanza, Hope Bay), a station in operation from 1945 through the early 1960s (New Zealand Antarctic Society, 1974). World's greatest 24-hour rainfall: 72 in (182.5 cm) at Grand Ilet, La Réunion Island [21°00'S, 55°30'E], 26 January 1980.

  19. Adventure and Extreme Sports.

    Science.gov (United States)

    Gomez, Andrew Thomas; Rao, Ashwin

    2016-03-01

    Adventure and extreme sports often involve unpredictable and inhospitable environments, high velocities, and stunts. These activities vary widely and include sports like BASE jumping, snowboarding, kayaking, and surfing. Increasing interest and participation in adventure and extreme sports warrants understanding by clinicians to facilitate prevention, identification, and treatment of injuries unique to each sport. This article covers alpine skiing and snowboarding, skateboarding, surfing, bungee jumping, BASE jumping, and whitewater sports with emphasis on epidemiology, demographics, general injury mechanisms, specific injuries, chronic injuries, fatality data, and prevention. Overall, most injuries are related to overuse, trauma, and environmental or microbial exposure.

  20. Extremal graph theory

    CERN Document Server

    Bollobas, Bela

    2004-01-01

    The ever-expanding field of extremal graph theory encompasses a diverse array of problem-solving methods, including applications to economics, computer science, and optimization theory. This volume, based on a series of lectures delivered to graduate students at the University of Cambridge, presents a concise yet comprehensive treatment of extremal graph theory.Unlike most graph theory treatises, this text features complete proofs for almost all of its results. Further insights into theory are provided by the numerous exercises of varying degrees of difficulty that accompany each chapter. A

  1. The European Union's Normative Power in a more Global Era

    DEFF Research Database (Denmark)

    Manners, Ian James

    2013-01-01

    The globalising, multilateralising and multipolarising era requires a reconsideration of the nature of European Union (EU) power and actorness in a more global era. The article does this by first looking at the study of the EU in a more global era. Second, the normative power approach will be exa...

  2. Increased threat of tropical cyclones and coastal flooding to New York City during the anthropogenic era.

    Science.gov (United States)

    Reed, Andra J; Mann, Michael E; Emanuel, Kerry A; Lin, Ning; Horton, Benjamin P; Kemp, Andrew C; Donnelly, Jeffrey P

    2015-10-13

    In a changing climate, future inundation of the United States' Atlantic coast will depend on both storm surges during tropical cyclones and the rising relative sea levels on which those surges occur. However, the observational record of tropical cyclones in the North Atlantic basin is too short (A.D. 1851 to present) to accurately assess long-term trends in storm activity. To overcome this limitation, we use proxy sea level records, and downscale three CMIP5 models to generate large synthetic tropical cyclone data sets for the North Atlantic basin; driving climate conditions span from A.D. 850 to A.D. 2005. We compare pre-anthropogenic era (A.D. 850-1800) and anthropogenic era (A.D.1970-2005) storm surge model results for New York City, exposing links between increased rates of sea level rise and storm flood heights. We find that mean flood heights increased by ∼1.24 m (due mainly to sea level rise) from ∼A.D. 850 to the anthropogenic era, a result that is significant at the 99% confidence level. Additionally, changes in tropical cyclone characteristics have led to increases in the extremes of the types of storms that create the largest storm surges for New York City. As a result, flood risk has greatly increased for the region; for example, the 500-y return period for a ∼2.25-m flood height during the pre-anthropogenic era has decreased to ∼24.4 y in the anthropogenic era. Our results indicate the impacts of climate change on coastal inundation, and call for advanced risk management strategies.
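
    The return-period statement above translates into annual and multi-decadal exceedance probabilities via the standard conversion, assuming independence between years; only the 500 yr and 24.4 yr values are taken from the abstract, and the 30-year horizon is an illustrative choice.

    def annual_exceedance_probability(return_period_years):
        """Annual probability of exceeding the flood height with the given return period."""
        return 1.0 / return_period_years

    def prob_at_least_once(return_period_years, horizon_years):
        """Probability of at least one exceedance within the horizon, assuming independent years."""
        p = annual_exceedance_probability(return_period_years)
        return 1.0 - (1.0 - p) ** horizon_years

    for rp in (500.0, 24.4):
        print(f"return period {rp:>6} yr: annual probability "
              f"{annual_exceedance_probability(rp):.3%}, "
              f"chance of at least one event in 30 yr {prob_at_least_once(rp, 30):.1%}")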

  3. Topcolor in the LHC Era

    CERN Document Server

    Simmons, Elizabeth H; Coleppa, Baradhwaj; Logan, Heather E; Martin, Adam

    2011-01-01

    Ongoing LHC searches for the standard model Higgs Boson in WW or ZZ decay modes strongly constrain the top-Higgs state predicted in many models with new dynamics that preferentially affects top quarks. Such a state couples strongly to top-quarks, and is therefore produced through gluon fusion at a rate that can be greatly enhanced relative to the rate for the standard model Higgs boson. As we discuss in this talk, a top-Higgs state with mass less than 300 GeV is excluded at 95% CL if the associated top-pion has a mass of 150 GeV, and the constraint is even stronger if the mass of the top-pion state exceeds the top-quark mass or if the top-pion decay constant is a substantial fraction of the weak scale. These results have significant implications for theories with strong top dynamics, such as topcolor-assisted technicolor, top-seesaw models, and certain Higgsless models.

  4. Extremity perfusion for sarcoma

    NARCIS (Netherlands)

    Hoekstra, Harald Joan

    2008-01-01

    For more than 50 years, the technique of extremity perfusion has been explored in the limb-salvage treatment of local, recurrent, and multifocal sarcomas. The "discovery" of tumor necrosis factor-α in combination with melphalan was a real breakthrough in the treatment of primarily irresectable ext

  5. Hydrological extremes and security

    Science.gov (United States)

    Kundzewicz, Z. W.; Matczak, P.

    2015-04-01

    Economic losses caused by hydrological extremes - floods and droughts - have been on the rise. Hydrological extremes jeopardize human security and impact on societal livelihood and welfare. Security can be generally understood as freedom from threat and the ability of societies to maintain their independent identity and their functional integrity against forces of change. Several dimensions of security are reviewed in the context of hydrological extremes. The traditional interpretation of security, focused on the state military capabilities, has been replaced by a wider understanding, including economic, societal and environmental aspects that get increasing attention. Floods and droughts pose a burden and serious challenges to the state that is responsible for sustaining economic development, and societal and environmental security. The latter can be regarded as the maintenance of ecosystem services, on which a society depends. An important part of it is water security, which can be defined as the availability of an adequate quantity and quality of water for health, livelihoods, ecosystems and production, coupled with an acceptable level of water-related risks to people, environments and economies. Security concerns arise because, over large areas, hydrological extremes - floods and droughts - are becoming more frequent and more severe. In terms of dealing with water-related risks, climate change can increase uncertainties, which makes the state's task to deliver security more difficult and more expensive. However, changes in population size and development, and level of protection, drive exposure to hydrological hazards.

  6. Acute lower extremity ischaemia

    African Journals Online (AJOL)

    tend to impact at arterial bifurcations, the commonest site being the ... Other ominous signs of advanced ischaemia include bluish ... Recommended standards for lower extremity ischaemia*. Doppler signals ... of the embolectomy procedure. An ... in a cath-lab or angio-suite under local ... We serially measure the aPTT and.

  7. Extremity perfusion for sarcoma

    NARCIS (Netherlands)

    Hoekstra, Harald Joan

    2008-01-01

    For more than 50 years, the technique of extremity perfusion has been explored in the limb salvage treatment of local, recurrent, and multifocal sarcomas. The "discovery" of tumor necrosis factor-α in combination with melphalan was a real breakthrough in the treatment of primarily irresectable

  8. Statistics of Local Extremes

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Bierbooms, W.; Hansen, Kurt Schaldemose

    2003-01-01

    A theoretical expression for the probability density function associated with local extremes of a stochastic process is presented. The expression is based on the lower four statistical moments and a bandwidth parameter. The theoretical expression is subsequently verified by comparison with simulated...
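
    The paper's exact density is not reproduced here; as a hedged sketch of the ingredients it names, the snippet below computes the lower four statistical moments of a sample signal together with a classical spectral bandwidth parameter (the particular bandwidth definition is an assumption).

```python
# Sketch of the named ingredients: lower four moments plus a spectral bandwidth parameter.
import numpy as np
from scipy import stats, signal

def moments_and_bandwidth(x, fs=1.0):
    moments = (np.mean(x), np.var(x), stats.skew(x), stats.kurtosis(x, fisher=False))
    f, pxx = signal.welch(x, fs=fs)                   # one-sided power spectral density
    df = f[1] - f[0]
    m0, m2, m4 = (np.sum(pxx * f**k) * df for k in (0, 2, 4))  # spectral moments
    bandwidth = np.sqrt(max(0.0, 1.0 - m2**2 / (m0 * m4)))     # assumed bandwidth definition
    return moments, bandwidth

x = np.random.default_rng(1).standard_normal(2**14)   # toy stationary signal
print(moments_and_bandwidth(x))
```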

  9. de Sitter Extremal Surfaces

    CERN Document Server

    Narayan, K

    2015-01-01

    We study extremal surfaces in de Sitter space in the Poincare slicing in the upper patch, anchored on spatial subregions at the future boundary ${\\cal I}^+$, restricted to constant boundary Euclidean time slices (focussing on strip subregions). We find real extremal surfaces of minimal area as the boundaries of past lightcone wedges of the subregions in question: these are null surfaces with vanishing area. We find also complex extremal surfaces as complex extrema of the area functional, and the area is not always real-valued. In $dS_4$ the area is real and has some structural resemblance with entanglement entropy in a dual $CFT_3$. There are parallels with analytic continuation from the Ryu-Takayanagi expressions for holographic entanglement entropy in $AdS$. We also discuss extremal surfaces in the $dS$ black brane and the de Sitter "bluewall" studied previously. The $dS_4$ black brane complex surfaces exhibit a real finite cutoff-independent extensive piece. In the bluewall geometry, there are real surface...

  10. Moving in extreme environments

    DEFF Research Database (Denmark)

    Lucas, Samuel J E; Helge, Jørn W; Schütz, Uwe H W

    2016-01-01

    and transcontinental races) and expeditions (e.g. polar crossings), to the more gravitationally limited load carriage (e.g. in the military context). Juxtaposed to these circumstances is the extreme metabolic and mechanical unloading associated with space travel, prolonged bedrest and sedentary lifestyle, which may...

  11. MAD about the Large Magellanic Cloud: preparing for the era of Extremely Large Telescopes

    CERN Document Server

    Fiorentino, G; Diolaiti, E; Valenti, E; Cignoni, M; Mackey, A D

    2011-01-01

    We present J, H, Ks photometry from the Multi conjugate Adaptive optics Demonstrator (MAD), a visitor instrument at the VLT, of a resolved stellar population in a small crowded field in the bar of the Large Magellanic Cloud near the globular cluster NGC 1928. In a total exposure time of 6, 36 and 20 minutes, magnitude limits were achieved of J \\sim 20.5 mag, H \\sim 21 mag, and Ks \\sim 20.5 mag respectively, with S/N > 10. This does not reach the level of the oldest Main Sequence Turnoffs, however the resulting Colour-Magnitude Diagrams are the deepest and most accurate obtained so far in the infrared for the LMC bar. We combined our photometry with deep optical photometry from the Hubble Space Telescope/Advanced Camera for Surveys, which is a good match in spatial resolution. The comparison between synthetic and observed CMDs shows that the stellar population of the field we observed is consistent with the star formation history expected for the LMC bar, and that all combinations of IJHKs filters can, with ...

  12. Implementation of the Spanish National Enhanced Recovery Program (ERAS) in Bariatric Surgery: A Pilot Study.

    Science.gov (United States)

    Ruiz-Tovar, Jaime; Royo, Pablo; Muñoz, José L; Duran, Manuel; Redondo, Elisabeth; Ramirez, Jose M

    2016-12-01

    The essence of Enhanced Recovery After Surgery (ERAS) programs is the multimodal approach, and many authors have demonstrated safety and feasibility in fast track bariatric surgery. According to this concept, a multidisciplinary ERAS program for bariatric surgery has been developed by the Spanish fast track group (ERAS Spain). The aim of this study was to analyze the initial implementation of this Spanish National ERAS protocol in bariatric surgery. A multicentric prospective pilot study was performed, including 125 consecutive patients undergoing bariatric surgery at 3 Spanish hospitals between January and June 2015, following the Spanish National ERAS protocol in bariatric surgery. Compliance with the protocol, morbidity, mortality, hospital stay, and readmission were evaluated. Bariatric techniques performed included 68 Roux-en-Y gastric bypass (54.4%) and 57 laparoscopic sleeve gastrectomy (45.6%) cases. All surgeries were laparoscopically performed, with conversion in only 1 case (0.8%). Median postoperative pain, evaluated by visual analogue scale 24 hours after surgery, was 2 (range, 0 to 5). Postoperative nausea or vomiting appeared in 7 patients (5.6%). Complications appeared in 6 patients (4.8%). The reoperation rate was 4%. The mortality rate was 0.8%. The median hospital stay was 2 days (range, 2 to 10 d) and the readmission rate was 2.4%. Compliance with all items of the protocol was achieved in 78.4% of the patients. The Spanish National ERAS protocol is safe and has a high implementation rate. Establishing this protocol in other institutions can be recommended.

  13. Early Onset of Industrial-Era Warming Across the Oceans and Continents

    Science.gov (United States)

    Abram, N.; McGregor, H. V.; Tierney, J. E.; Evans, M. N.; McKay, N.; Kaufman, D. S.; Pages 2k Consortium*, T.

    2016-12-01

    The evolution of industrial-era warming provides critical context for future climate change, and has fundamental importance for determining climate sensitivity and the processes that control regional warming. Palaeoclimate data from the Common Era - a period when natural and anthropogenic climate forcings are reasonably well constrained - provide valuable perspectives on anthropogenic greenhouse warming, but have focused mainly on the Northern Hemisphere using records derived primarily from terrestrial settings. Given the importance of the oceans in determining the pace and regional structure of climate changes, it is essential to extend our palaeoclimate assessments to determine how regional-scale warming developed in the oceans and over land during the Industrial Era. Here we use post-1500CE palaeoclimate records to show that sustained industrial-era warming of the tropical oceans first developed during the mid-19th Century, and was near-synchronous with Northern Hemisphere continental warming. The early onset of sustained, significant warming in palaeoclimate records and model simulations suggests greenhouse forcing of industrial-era warming commenced as early as the mid-19th Century, and included an enhanced equatorial ocean response mechanism. The development of Southern Hemisphere warming is delayed in reconstructions, but this apparent delay is not reproduced in climate simulations. Our findings imply that instrumental records are too short to comprehensively assess anthropogenic climate change, and in some regions 180 years of industrial-era warming has already caused surface temperatures to emerge above pre-industrial variability. *PAGES 2k Consortium authors are: Kaustubh Thirumalai, Belen Martrat, Hugues Goosse, Steven J. Phipps, Eric J. Steig, K. Halimeda Kilbourne, Casey P. Saenger, Jens Zinke, Guillaume Leduc, Jason A. Addison, P. Graham Mortyn, Marit-Solveig Seidenkrantz, Marie-Alexandrine Sicre, Kandasamy Selvaraj, Helena L. Filipsson, Raphael

  14. Composite circulation index of weather extremes (the example for Poland)

    Directory of Open Access Journals (Sweden)

    Zbigniew Ustrnul

    2013-10-01

    Full Text Available The paper describes the implementation of a composite circulation index of weather extremes (CIE). The new index informs about the synoptic conditions favoring the occurrence of extremes on a regional scale. It was evaluated for temperature and precipitation extremes for Poland. Daily homogenized data obtained from 14 weather stations, which cover most of the country, were used. The data used cover the 60-year period from 1951 to 2010. The index is based on relationships between extremes and mesoscale circulation conditions. The core material also included data describing circulation types for the study period. Three different calendars were used (Grosswetterlagen, by Lityński, and by Niedźwiedź). The conditional probability of the occurrence of extremes for particular types was calculated independently using partial indices (Partial Index of weather Extremes – PIE). The higher the index values, the more favorable the synoptic situation for the occurrence of extremes. The results were grouped by describing the probability of temperature and precipitation extremes on the IPCC likelihood scale. Three thresholds corresponding to the frequency of occurrence were identified via a seasonal approach. The CIE was validated using basic correlation coefficients as well as contingency tables based on feature-displacement criteria and Bayesian probability methods. The results confirmed the significance of atmospheric circulation in the formation of temperature and precipitation extremes in Poland. The CIE captures this relationship by estimating the probability of the occurrence of temperature and precipitation extremes conditional on the forecast circulation type.
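
    A hedged sketch of a PIE-style partial index: the conditional frequency of an extreme day given each circulation type. The type labels and the tiny data set are invented for illustration; the actual calendars, thresholds and likelihood categories are those described in the abstract.

```python
# Toy PIE-style calculation: P(extreme day | circulation type) from a daily table.
import pandas as pd

df = pd.DataFrame({
    "circ_type": ["W", "W", "N", "E", "W", "N", "E", "E"],  # daily circulation types (made up)
    "extreme":   [1,   0,   0,   1,   1,   0,   0,   1],    # 1 = extreme temperature/precip day
})

pie = df.groupby("circ_type")["extreme"].mean()  # conditional probability per type
print(pie.sort_values(ascending=False))
```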

  15. Menertawakan Fobia Komunis di Era Reproduksi Digital

    Directory of Open Access Journals (Sweden)

    Triyono Lukmantoro

    2017-04-01

    Full Text Available Abstract. In May-June 2016, the issue of the rise of the Indonesian Communist Party (PKI) and the latent danger of communism appeared again. Excessive fear of the PKI and communism continues to be propagated; this is what is referred to as communist phobia. But the issue is considered so sensitive that it gave birth to criticism. The phenomenon examined here is the presence of a number of meme comics whose contents mock the hammer and sickle symbol and three iconic communist figures, namely D.N. Aidit, Tan Malaka, and Mao Zedong. The meme comics use parody to show incongruities that can only happen in the era of digital reproduction. The idea of meme comics can be traced to Walter Benjamin's thought on the work of art in the age of mechanical reproduction. In that era, the aura was declining. The crisis and the disappearance of the aura increasingly occur in the time of digital reproduction.

  16. Extreme fluctuations and the finite lifetime of the turbulent state.

    Science.gov (United States)

    Goldenfeld, Nigel; Guttenberg, Nicholas; Gioia, Gustavo

    2010-03-01

    We argue that the transition to turbulence is controlled by large amplitude events that follow extreme distribution theory. The theory suggests an explanation for recent observations of the turbulent state lifetime which exhibit superexponential scaling behavior with Reynolds number.
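
    A hedged sketch of the scaling referred to above: if the transition is governed by extreme (Gumbel-type) fluctuations, the lifetime of the turbulent state grows superexponentially with Reynolds number, schematically

```latex
% a and b are empirical constants; this is a sketch of the quoted form, not a derivation
\tau(\mathrm{Re}) \;\propto\; \exp\!\bigl[\exp\bigl(a\,\mathrm{Re} + b\bigr)\bigr]
```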

  17. Overview of ERA Integrated Technology Demonstration (ITD) 51A Ultra-High Bypass (UHB) Integration for Hybrid Wing Body (HWB)

    Science.gov (United States)

    Flamm, Jeffrey D.; James, Kevin D.; Bonet, John T.

    2016-01-01

    The NASA Environmentally Responsible Aviation (ERA) Project was a five-year project broken into two phases. In phase II, high N+2 Technical Readiness Level demonstrations were grouped into Integrated Technology Demonstrations (ITD). This paper describes the work done on ITD-51A: the Vehicle Systems Integration, Engine Airframe Integration Demonstration. Refinement of a Hybrid Wing Body (HWB) aircraft from the possible candidates developed in ERA Phase I was continued. Scaled powered and unpowered wind-tunnel testing, with and without acoustics, in the NASA LARC 14- by 22-foot Subsonic Tunnel, the NASA ARC Unitary Plan Wind Tunnel, and the 40- by 80-foot test section of the National Full-Scale Aerodynamics Complex (NFAC), in conjunction with very closely coupled Computational Fluid Dynamics, was used to demonstrate the fuel burn and acoustic milestone targets of the ERA Project.

  18. Past Eras In Cyclic Cosmological Models

    CERN Document Server

    Frampton, Paul H

    2009-01-01

    In infinitely cyclic cosmology past eras are discussed using set theory and transfinite numbers. One consistent scenario, already in the literature, is where there is always a countably infinite number, $\\aleph_0$, of universes and no big bang. I describe here an alternative where the present number of universes is $\\aleph_0$ and in the infinite past there was only a finite number of universes. In this alternative model it is also possible that there was no big bang.

  19. Microbiology in the post-genomic era.

    Science.gov (United States)

    Medini, Duccio; Serruto, Davide; Parkhill, Julian; Relman, David A; Donati, Claudio; Moxon, Richard; Falkow, Stanley; Rappuoli, Rino

    2008-06-01

    Genomics has revolutionized every aspect of microbiology. Now, 13 years after the first bacterial genome was sequenced, it is important to pause and consider what has changed in microbiology research as a consequence of genomics. In this article, we review the evolving field of bacterial typing and the genomic technologies that enable comparative analysis of multiple genomes and the metagenomes of complex microbial environments, and address the implications of the genomic era for the future of microbiology.

  20. China's Cheap-Labor Era Is Ending

    Institute of Scientific and Technical Information of China (English)

    International Finance News

    2010-01-01

    This year, salaries have increased in eastern China's enterprises; governments in central and western China have raised minimum-wage standards… signals of higher labor costs emerge from time to time. This rising economic phenomenon poses a series of questions: is China's economy stepping out of the cheap-labor era? What opportunities and challenges may lie ahead in this important course of transition?

  1. ERA [Material gráfico

    OpenAIRE

    Fundación para la Etnografía y el Desarrollo de la Artesanía Canaria

    1990-01-01

    Age: 19th century. Land use designation: rural land with protected surroundings. Land classification: rural. Almost circular in shape, with stones set on edge around the rim, paved with no apparent pattern and with large slabs in its interior. Mixed construction, paving stones and slabs. Declared BIC: no. There is a fair amount of rubbish inside and around the era (threshing floor). Vegetation has managed to work its way in between the paving (some areas have lost the ...

  2. Belydenisgebondenheid in ’n postmoderne era

    OpenAIRE

    C.F.C. Coetzee

    2010-01-01

    The binding to confessions in a postmodern era We are experiencing a paradigm shift between Modernism and Postmodernism in almost every sphere of life, and also in the sphere of church and theology. This paradigm shift has far-reaching consequences, especially for churches in the reformed tradition and the practice of reformed theology as far as the binding to the confessions is concerned. From the viewpoint of Postmodernism, there is no absolute truth. This applies also to Scripture. ...

  3. Non-extremal branes

    Directory of Open Access Journals (Sweden)

    Pablo Bueno

    2015-04-01

    Full Text Available We prove that for arbitrary black brane solutions of generic Supergravities there is an adapted system of variables in which the equations of motion are exactly invariant under electric–magnetic duality, i.e. the interchange of a given extended object by its electromagnetic dual. We thus obtain a procedure to automatically construct the electromagnetic dual of a given brane without needing to solve any further equation. We apply this procedure to construct the non-extremal (p,q)-string of Type-IIB String Theory (new in the literature), explicitly showing how the dual (p,q)-five-brane automatically arises in this construction. In addition, we prove that the system of variables used is suitable for a generic characterization of every double-extremal Supergravity brane solution, which we perform in full generality.

  4. Tibetans at extreme altitude.

    Science.gov (United States)

    Wu, Tianyi; Li, Shupin; Ward, Michal P

    2005-01-01

    Between 1960 and 2003, 13 Chinese expeditions successfully reached the summit of Chomolungma (Mt Everest or Sagarmatha). Forty-five of the 80 summiteers were Tibetan highlanders. During these and other high-altitude expeditions in Tibet, a series of medical and physiological investigations were carried out on the Tibetan mountaineers. The results suggest that these individuals are better adapted to high altitude and that, at altitude, they have a greater physical capacity than Han (ethnic Chinese) lowland newcomers. They have higher maximal oxygen uptake, greater ventilation, more brisk hypoxic ventilatory responses, larger lung volumes, greater diffusing capacities, and a better quality of sleep. Tibetans also have a lower incidence of acute mountain sickness and less body weight loss. These differences appear to represent genetic adaptations and are obviously significant for humans at extreme altitude. This paper reviews what is known about the physiologic responses of Tibetans at extreme altitudes.

  5. Extremal periodic wave profiles

    Directory of Open Access Journals (Sweden)

    E. van Groesen

    2007-01-01

    Full Text Available As a contribution to deterministic investigations into extreme fluid surface waves, in this paper wave profiles of prescribed period that have maximal crest height will be investigated. As constraints the values of the momentum and energy integrals are used in a simplified description with the KdV model. The result is that at the boundary of the feasible region in the momentum-energy plane, the only possible profiles are the well known cnoidal wave profiles. Inside the feasible region the extremal profiles of maximal crest height are "cornered" cnoidal profiles: cnoidal profiles of larger period, cut-off and periodically continued with the prescribed period so that at the maximal crest height a corner results.

  6. Extremal Hairy Black Holes

    CERN Document Server

    Gonzalez, P A; Saavedra, Joel; Vasquez, Yerko

    2014-01-01

    We consider a gravitating system consisting of a scalar field minimally coupled to gravity with a self-interacting potential and a U(1) electromagnetic field. Solving the coupled Einstein-Maxwell-scalar system we find exact hairy charged black hole solutions with the scalar field regular everywhere. We go to the zero temperature limit and we study the effect of the scalar field on the near horizon geometry of an extremal black hole. We find that, besides a critical value of the charge of the black hole, there is also a critical value of the charge of the scalar field beyond which the extremal black hole is destabilized. We study the thermodynamics of these solutions and we find that if the space is flat then at low temperature the Reissner-Nordstr\\"om black hole is thermodynamically preferred, while if the space is AdS the hairy charged black hole is thermodynamically preferred at low temperature.

  7. Humorous "Era" stories from the Arilje region

    Directory of Open Access Journals (Sweden)

    Nikolić Desanka P.

    2004-01-01

    Full Text Available This paper discusses humorous stories collected in the village of Brekovo near Arilje; the stories were created in the first half of the 20th century and noted down by Momčilo Jovanović, a villager from Brekovo. Later on, in the 1980's, the stories were passed down to the author of this paper. These narrations are mostly short stories and anecdotes; in a very realistic fashion, the stories depict the life and culture of the Dinaric people of the old Vlah - Zlatibor cultural area, namely the type of person also known as "Era", well known for his wittiness, smartness and wisdom. Based on the analysis of seven Era-stories, the author identified the social and cultural values highlighted in the stories (such as attitudes toward authorities, intergenerational relationships, the status of women, power relations between townsmen and peasants, and a propensity toward justice and truth). In summary, the stories document the mutual influence between traditional culture and the Era personality in this particular rural region; therefore, they could contribute to studies of the character traits of the inhabitants of this area of western Serbia.

  8. Philosophical Pragmatism in the Digital Era

    Directory of Open Access Journals (Sweden)

    Marius Constantin CUCU

    2014-03-01

    Full Text Available The view of philosophical pragmatism warns against the growing danger of the technological modernization of the human being in the mechanization era, which is gradually transforming into and approaching a digital era. Authors such as R. Rorty believe that only a return to the paradigm of human reality separated from metaphysical ideals could keep us away from exacerbations of ideas and the dehumanizing automatisms of technology. In the view of pragmatic philosophy, the human being is not a mechanical, operational construction; on the contrary, he has a consciousness that opts for free actions, which may ultimately prove genuine or not through their success or failure in concrete reality. The aim of this article is to underline that the digital era must be perceived as a product of human ingenuity and its applicative potentialities, and should not be seen as the domination of techne but only as a stage in the development of the technologies that must assist our life. Classification-JEL: A23

  9. Scale interactions in economics: application to the evaluation of the economic damages of climatic change and of extreme events; Interactions d'echelles en economie: application a l'evaluation des dommages economiques du changement climatique et des evenements extremes

    Energy Technology Data Exchange (ETDEWEB)

    Hallegatte, S

    2005-06-15

    Growth models, which neglect economic disequilibria on the assumption that they are temporary, are generally used to evaluate the damages generated by climate change. Through a series of modeling experiments, this work shows the importance of disequilibria and of the endogenous variability of the economy in the evaluation of damages due to extreme events and climate change. It demonstrates that the evaluation of damages cannot be separated from the representation of growth and economic dynamics: welfare losses will depend both on the nature and intensity of the impacts and on the dynamics and state of the economy to which they apply. Thus, the uncertainties about the damages of future climate change stem both from scientific uncertainties and from uncertainties about the future organization of our economies. (J.S.)

  10. Religious Extremism in Pakistan

    Science.gov (United States)

    2014-12-01

    Face (July 2008): 32. 21 Ahmed Rashid , Pakistan on the Brink: The Future of America, Pakistan, and Afghanistan (New York: Viking, 2012). 22 Brian J...promoting extremism. Commentators such as Jessica Stern, Alan Richards, Hussain Haqqani, Ahmed Rashid , and Ali Riaz are a few of the scholars who...www.jstor.org/stable/3183558; See also Ahmed Rashid , Descent Into Chaos: The United States and the Failure of Nation Building in Pakistan, Afghanistan, and

  11. USACE Extreme Sea levels

    Science.gov (United States)

    2014-03-14

    report summarising the results of the research, together with a set of recommendations arising from the research. This report describes progress to...Southampton University at HR Wallingford and subsequent teleconference with Heidi Moritz and Kate White. The notes summarising the findings of the...suggestion was made that we may want to begin talking about extreme water levels separate from storms. Ivan mentioned an analysis of storminess which

  12. Extreme geomagnetically induced currents

    Science.gov (United States)

    Kataoka, Ryuho; Ngwira, Chigomezyo

    2016-12-01

    We propose an emergency alert framework for geomagnetically induced currents (GICs), based on the empirically extreme values and theoretical upper limits of the solar wind parameters and of dB/dt, the time derivative of magnetic field variations at the ground. We expect this framework to be useful for preparing against extreme events. Our analysis is based on a review of various papers, including those presented during Extreme Space Weather Workshops held in Japan in 2011, 2012, 2013, and 2014. Large-amplitude dB/dt values are the major cause of hazards associated with three different types of GICs: (1) slow dB/dt with ring current evolution (RC-type), (2) fast dB/dt associated with auroral electrojet activity (AE-type), and (3) transient dB/dt of sudden commencements (SC-type). We set "caution," "warning," and "emergency" alert levels during the main phase of superstorms with the peak Dst index of less than -300 nT (once per 10 years), -600 nT (once per 60 years), or -900 nT (once per 100 years), respectively. The extreme dB/dt values of the AE-type GICs are 2000, 4000, and 6000 nT/min at the caution, warning, and emergency levels, respectively. For the SC-type GICs, a "transient alert" is also proposed for dB/dt values of 40 nT/s at low latitudes and 110 nT/s at high latitudes, especially when the solar energetic particle flux is unusually high.
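
    A minimal sketch that encodes only the AE-type dB/dt thresholds quoted in the abstract; the full framework also uses Dst-based levels and separate SC-type criteria.

```python
# Classify an AE-type GIC hazard level from dB/dt in nT/min, using the quoted thresholds.
def ae_alert_level(dbdt_nt_per_min: float) -> str:
    if dbdt_nt_per_min >= 6000:
        return "emergency"
    if dbdt_nt_per_min >= 4000:
        return "warning"
    if dbdt_nt_per_min >= 2000:
        return "caution"
    return "none"

print(ae_alert_level(4500))  # -> "warning"
```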

  13. Extremes in nature

    CERN Document Server

    Salvadori, Gianfausto; Kottegoda, Nathabandu T

    2007-01-01

    This book is about the theoretical and practical aspects of the statistics of Extreme Events in Nature. Most importantly, this is the first text in which Copulas are introduced and used in Geophysics. Several topics are fully original, and show how standard models and calculations can be improved by exploiting the opportunities offered by Copulas. In addition, new quantities useful for design and risk assessment are introduced.

  14. Evaluation of NASA's MERRA Precipitation Product in Reproducing the Observed Trend and Distribution of Extreme Precipitation Events in the United States

    Science.gov (United States)

    Ashouri, Hamed; Sorooshian, Soroosh; Hsu, Kuo-Lin; Bosilovich, Michael G.; Lee, Jaechoul; Wehner, Michael F.; Collow, Allison

    2016-01-01

    This study evaluates the performance of NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) precipitation product in reproducing the trend and distribution of extreme precipitation events. Utilizing the extreme value theory, time-invariant and time-variant extreme value distributions are developed to model the trends and changes in the patterns of extreme precipitation events over the contiguous United States during 1979-2010. The Climate Prediction Center (CPC) U.S. Unified gridded observation data are used as the observational dataset. The CPC analysis shows that the eastern and western parts of the United States are experiencing positive and negative trends in annual maxima, respectively. The continental-scale patterns of change found in MERRA seem to reasonably mirror the observed patterns of change found in CPC. This was not expected a priori, given the difficulty in constraining precipitation in reanalysis products. MERRA tends to overestimate the frequency at which the 99th percentile of precipitation is exceeded because this threshold tends to be lower in MERRA, making it easier to be exceeded. This feature is dominant during the summer months. MERRA tends to reproduce spatial patterns of the scale and location parameters of the generalized extreme value and generalized Pareto distributions. However, MERRA underestimates these parameters, particularly over the Gulf Coast states, leading to lower magnitudes in extreme precipitation events. Two issues in MERRA are identified: 1) MERRA shows a spurious negative trend in Nebraska and Kansas, which is most likely related to the changes in the satellite observing system over time that have apparently affected the water cycle in the central United States, and 2) the patterns of positive trend over the Gulf Coast states and along the East Coast seem to be correlated with the tropical cyclones in these regions. The analysis of the trends in the seasonal precipitation extremes indicates that
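
    A hedged sketch of one ingredient of the evaluation described above: fitting a stationary GEV to a series of annual precipitation maxima and reading off a return level. The data here are synthetic and the parameters are illustrative assumptions, not MERRA or CPC values.

```python
# Fit a stationary GEV to synthetic annual maxima and compute a 20-year return level.
from scipy import stats

annual_max = stats.genextreme.rvs(c=-0.1, loc=40.0, scale=10.0, size=32, random_state=42)

c, loc, scale = stats.genextreme.fit(annual_max)        # shape, location, scale
rl20 = stats.genextreme.isf(1.0 / 20.0, c, loc, scale)  # 20-year return level (mm/day)
print(f"shape={c:.2f} loc={loc:.1f} scale={scale:.1f} 20-yr level={rl20:.1f}")
```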

  15. Conceptual Design and Structural Optimization of NASA Environmentally Responsible Aviation (ERA) Hybrid Wing Body Aircraft

    Science.gov (United States)

    Quinlan, Jesse R.; Gern, Frank H.

    2016-01-01

    Simultaneously achieving the fuel consumption and noise reduction goals set forth by NASA's Environmentally Responsible Aviation (ERA) project requires innovative and unconventional aircraft concepts. In response, advanced hybrid wing body (HWB) aircraft concepts have been proposed and analyzed as a means of meeting these objectives. For the current study, several HWB concepts were analyzed using the Hybrid wing body Conceptual Design and structural optimization (HCDstruct) analysis code. HCDstruct is a medium-fidelity finite element based conceptual design and structural optimization tool developed to fill the critical analysis gap existing between lower order structural sizing approaches and detailed, often finite element based sizing methods for HWB aircraft concepts. Whereas prior versions of the tool used a half-model approach in building the representative finite element model, a full wing-tip-to-wing-tip modeling capability was recently added to HCDstruct, which alleviated the symmetry constraints at the model centerline in place of a free-flying model and allowed for more realistic center body, aft body, and wing loading and trim response. The latest version of HCDstruct was applied to two ERA reference cases, including the Boeing Open Rotor Engine Integration On an HWB (OREIO) concept and the Boeing ERA-0009H1 concept, and results agreed favorably with detailed Boeing design data and related Flight Optimization System (FLOPS) analyses. Following these benchmark cases, HCDstruct was used to size NASA's ERA HWB concepts and to perform a related scaling study.

  16. Recent changes in precipitation extremes in Romania

    Directory of Open Access Journals (Sweden)

    Adina-Eliza CROITORU

    2014-11-01

    Full Text Available Changes in daily precipitation extremes have been identified in many studies conducted at local, regional or global scales. In Romania, little research on this issue has been done so far. The present study is focused on the analysis of the trends in daily extreme precipitation indices over a period of 53 years (1961-2013). Data sets of daily precipitation recorded at 34 weather stations were analyzed. Among them, three are located in the Carpathian Mountains area and four are located on the Black Sea Coast. The main goal was to find changes in extreme daily precipitation using a set of 13 indices adopted from the core indices developed by ETCCDMI, with appropriate modifications to suit the studied area. The series of the indices as well as their trends were generated using the RClimDex software. The trends have been calculated using the linear least squares method. The findings are similar to those obtained at the global and European continental scales, and the most noteworthy are: increasing trends dominate for most of the indices, but only about 25% of them are statistically significant at α=0.05; decreasing trends are more specific to the southern area of the country; decreasing trends of R0.1, CDD and CWD dominate in the great majority of locations; and the spatial distribution of the significant slopes in the area is extremely irregular.
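
    A hedged sketch of the trend calculation named above: an ordinary least-squares slope and its p-value for a synthetic annual extreme-precipitation index (the index values are invented; RClimDex itself is not used here).

```python
# Linear least-squares trend of an annual extreme-precipitation index, with significance at 5%.
import numpy as np
from scipy import stats

years = np.arange(1961, 2014)
index = 20 + 0.05 * (years - 1961) + np.random.default_rng(3).normal(0, 2, years.size)

res = stats.linregress(years, index)
print(f"slope={res.slope:.3f}/yr, p={res.pvalue:.3f}, significant at alpha=0.05: {res.pvalue < 0.05}")
```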

  17. Eukaryotic diversity at pH extremes

    Directory of Open Access Journals (Sweden)

    Linda A. Amaral-Zettler

    2013-01-01

    Full Text Available Extremely acidic (pH<3) and extremely alkaline (pH>9) environments support a diversity of single-cell and, to a lesser extent, multicellular eukaryotic life. This study compared alpha and beta diversity in eukaryotic communities from 7 diverse aquatic environments with pH values ranging from 2 to 11 using massively-parallel pyrotag sequencing targeting the V9 hypervariable region of the 18S ribosomal RNA (rRNA) gene. A total of 946 Operational Taxonomic Units (OTUs) were recovered at a 6% cut-off level (94% similarity) across the sampled environments. Hierarchical clustering of the samples segregated the communities into acidic and alkaline groups. Similarity Percentage Analysis (SIMPER) followed by Indicator OTU Analysis (IOA) and Non-metric Multidimensional Scaling (NMDS) were used to determine which characteristic groups of eukaryotic taxa typify acidic or alkaline extremes and the extent to which pH explains eukaryotic community structure in these environments. Spain's Rio Tinto yielded the fewest observed OTUs while Nebraska Sandhills alkaline lakes yielded the most. Distinct OTUs, including metazoan OTUs, numerically dominated pH extreme sites. Indicator OTUs included the diatom Pinnularia and unidentified opisthokonts (Fungi and Filasterea) in the extremely acidic environments, and the ciliate Frontonia across the extremely alkaline sites. Inferred from NMDS, pH explained only a modest fraction of the variation across the datasets, indicating that other factors influence the underlying community structure in these environments. The findings from this study suggest that the ability for eukaryotes to adapt to pH extremes over a broad range of values may be rare, but further study of taxa that can broadly adapt across diverse acidic and alkaline environments respectively present good models for understanding adaptation and should be targeted for future investigations.

  18. Extreme metal music and anger processing

    Directory of Open Access Journals (Sweden)

    Leah Sharman

    2015-05-01

    Full Text Available The claim that listening to extreme music causes anger and expressions of anger such as aggression and delinquency has yet to be substantiated using controlled experimental methods. In this study, 39 extreme music listeners aged 18 to 34 years were subjected to an anger induction, followed by random assignment to 10 minutes of listening to extreme music from their own playlist, or 10 minutes of silence (control). Measures of emotion included heart rate and subjective ratings on the Positive and Negative Affect Scale (PANAS). Results showed that ratings of PANAS hostility, irritability, and stress increased during the anger induction, and decreased after the music or silence. Heart rate increased during the anger induction and was sustained (not increased) in the music condition, and decreased in the silence condition. PANAS active and inspired ratings increased during music listening, an effect that was not seen in controls. The findings indicate that extreme music did not make angry participants angrier; rather, it appeared to match their physiological arousal and result in an increase in positive emotions. Listening to extreme music may represent a healthy way of processing anger for these listeners.

  19. Present-day irrigation mitigates heat extremes

    Science.gov (United States)

    Thiery, Wim; Davin, Edouard L.; Lawrence, David M.; Hirsch, Annette L.; Hauser, Mathias; Seneviratne, Sonia I.

    2017-02-01

    Irrigation is an essential practice for sustaining global food production and many regional economies. Emerging scientific evidence indicates that irrigation substantially affects mean climate conditions in different regions of the world. Yet how this practice influences climate extremes is currently unknown. Here we use ensemble simulations with the Community Earth System Model to assess the impacts of irrigation on climate extremes. An evaluation of the model performance reveals that irrigation has a small yet overall beneficial effect on the representation of present-day near-surface climate. While the influence of irrigation on annual mean temperatures is limited, we find a large impact on temperature extremes, with a particularly strong cooling during the hottest day of the year (-0.78 K averaged over irrigated land). The strong influence on extremes stems from the timing of irrigation and its influence on land-atmosphere coupling strength. Together these effects result in asymmetric temperature responses, with a more pronounced cooling during hot and/or dry periods. The influence of irrigation is even more pronounced when considering subgrid-scale model output, suggesting that local effects of land management are far more important than previously thought. Our results underline that irrigation has substantially reduced our exposure to hot temperature extremes in the past and highlight the need to account for irrigation in future climate projections.
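
    A hedged sketch of the diagnostic behind the quoted hottest-day cooling: the change in TXx (the annual maximum of daily maximum temperature) averaged over ensemble members. The ensembles below are synthetic stand-ins, not Community Earth System Model output.

```python
# Difference in the hottest day of the year (TXx) between "irrigated" and "control" ensembles.
import numpy as np

rng = np.random.default_rng(5)
tmax_ctl = rng.normal(308.0, 4.0, size=(20, 365))         # 20 years of daily Tmax, control (K)
tmax_irr = tmax_ctl - rng.normal(0.8, 0.3, size=(20, 1))  # assumed irrigation cooling per year

txx_change = tmax_irr.max(axis=1).mean() - tmax_ctl.max(axis=1).mean()
print(f"ensemble-mean TXx change: {txx_change:.2f} K")
```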

  20. Epidemiological studies of esophageal cancer in the era of genome-wide association studies

    Institute of Scientific and Technical Information of China (English)

    An-Hui Wang; Yuan Liu; Bo Wang; Yi-Xuan He; Ye-Xian Fang; Yong-Ping Yan

    2014-01-01

    Esophageal cancer (EC) caused about 395000 deaths in 2010. China has the most cases of EC, and EC is the fourth leading cause of cancer death in China. Esophageal squamous cell carcinoma (ESCC) is the predominant histologic type (90%-95%), while the incidence of esophageal adenocarcinoma (EAC) remains extremely low in China. Traditional epidemiological studies have revealed that environmental carcinogens are risk factors for EC. Molecular epidemiological studies revealed that susceptibility to EC is influenced by both environmental and genetic risk factors. Of all the risk factors for EC, some are associated with the risk of ESCC and others with the risk of EAC. However, the details and mechanisms of the risk factors involved in the process of EC are unclear. The advanced methods and techniques used in human genome studies bring a great opportunity for researchers to explore and identify the details of those risk factors or susceptibility genes involved in the process of EC. Human genome epidemiology is a new branch of epidemiology, which leads the epidemiology study from the molecular epidemiology era to the era of genome-wide association studies (GWAS). Here we review the epidemiological studies of EC (especially ESCC) in the era of GWAS, and provide an overview of the general risk factors and those genomic variants (genes, SNPs, miRNAs, proteins) involved in the process of ESCC.

  1. Pulsar Timing Array Based Search for Supermassive Black Hole Binaries in the Square Kilometer Array Era.

    Science.gov (United States)

    Wang, Yan; Mohanty, Soumya D

    2017-04-14

    The advent of next generation radio telescope facilities, such as the Square Kilometer Array (SKA), will usher in an era where a pulsar timing array (PTA) based search for gravitational waves (GWs) will be able to use hundreds of well timed millisecond pulsars rather than the few dozens in existing PTAs. A realistic assessment of the performance of such an extremely large PTA must take into account the data analysis challenge posed by an exponential increase in the parameter space volume due to the large number of so-called pulsar phase parameters. We address this problem and present such an assessment for isolated supermassive black hole binary (SMBHB) searches using a SKA era PTA containing 10^{3} pulsars. We find that an all-sky search will be able to confidently detect nonevolving sources with a redshifted chirp mass of 10^{10}  M_{⊙} out to a redshift of about 28 (corresponding to a rest-frame chirp mass of 3.4×10^{8}  M_{⊙}). We discuss the important implications that the large distance reach of a SKA era PTA has on GW observations from optically identified SMBHB candidates. If no SMBHB detections occur, a highly unlikely scenario in the light of our results, the sky-averaged upper limit on strain amplitude will be improved by about 3 orders of magnitude over existing limits.
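
    A worked check of the quoted numbers: the redshifted chirp mass observed in the PTA band is (1+z) times the rest-frame chirp mass, so

```latex
\mathcal{M}_{\rm rest} \;=\; \frac{\mathcal{M}_z}{1+z} \;=\; \frac{10^{10}\,M_\odot}{1+28} \;\approx\; 3.4\times 10^{8}\,M_\odot
```

    which reproduces the rest-frame value quoted in the abstract.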

  2. Extreme-volatility dynamics in crude oil markets

    Science.gov (United States)

    Jiang, Xiong-Fei; Zheng, Bo; Qiu, Tian; Ren, Fei

    2017-02-01

    Based on concepts and methods from statistical physics, we investigate extreme-volatility dynamics in the crude oil markets, using the high-frequency data from 2006 to 2010 and the daily data from 1986 to 2016. The dynamic relaxation of extreme volatilities is described by a power law, whose exponents usually depend on the magnitude of extreme volatilities. In particular, the relaxation before and after extreme volatilities is time-reversal symmetric at the high-frequency time scale, but time-reversal asymmetric at the daily time scale. This time-reversal asymmetry is mainly induced by exogenous events. However, the dynamic relaxation after exogenous events exhibits the same characteristics as that after endogenous events. An interacting herding model both with and without exogenous driving forces could qualitatively describe the extreme-volatility dynamics.
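
    A hedged sketch of the relaxation analysis described above: estimating a power-law exponent for volatility decay after an extreme event, v(t) ∝ t^(-p), by a log-log least-squares fit to synthetic data.

```python
# Estimate the power-law relaxation exponent p from a synthetic post-event volatility series.
import numpy as np

t = np.arange(1, 200)  # time (e.g. minutes or days) after the extreme event
v = 0.8 * t**-0.35 * np.exp(np.random.default_rng(7).normal(0.0, 0.05, t.size))

slope, intercept = np.polyfit(np.log(t), np.log(v), 1)
print(f"estimated exponent p = {-slope:.2f}")  # close to the assumed 0.35
```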

  3. Plasma Physics of Extreme Astrophysical Environments

    CERN Document Server

    Uzdensky, Dmitri A

    2014-01-01

    Certain classes of astrophysical objects, namely magnetars and central engines of supernovae and gamma-ray bursts (GRBs), are characterized by extreme physical conditions not encountered elsewhere in the Universe. In particular, they possess magnetic fields that exceed the critical quantum field of 44 teragauss. Figuring out how these complex ultra-magnetized systems work requires understanding various plasma processes, both small-scale kinetic and large-scale magnetohydrodynamic (MHD). However, an ultra-strong magnetic field modifies the underlying physics to such an extent that many relevant plasma-physical problems call for building QED-based relativistic quantum plasma physics. In this review, after describing the extreme astrophysical systems of interest and identifying the key relevant plasma-physical problems, we survey the recent progress in the development of such a theory. We discuss how a super-critical field modifies the properties of vacuum and matter and outline the basic theoretical framework f...

  4. Extreme Programming Pocket Guide

    CERN Document Server

    Chromatic

    2003-01-01

    Extreme Programming (XP) is a radical new approach to software development that has been accepted quickly because its core practices--the need for constant testing, programming in pairs, inviting customer input, and the communal ownership of code--resonate with developers everywhere. Although many developers feel that XP is rooted in commonsense, its vastly different approach can bring challenges, frustrations, and constant demands on your patience. Unless you've got unlimited time (and who does these days?), you can't always stop to thumb through hundreds of pages to find the piece of info

  5. Mycetoma of lower extremity

    Directory of Open Access Journals (Sweden)

    Sahariah S

    1978-01-01

    Full Text Available Ten cases of mycetoma of the lower extremity were seen and treated at the Postgraduate Institute of Medical Education & Research, Chandigarh, India, during the years 1973 to 1975. Six were treated by conservative methods, e.g. antibiotics, sulfonamides and immobilization of the part, while the remaining four were submitted to surgery. Four out of six from the first group had recurrences and have been put on second-line therapy. Recurrence occurred in only one case from the second group; he required an above-knee amputation, while the remaining three are free of disease and are well rehabilitated.

  6. Metagenomics of extreme environments.

    Science.gov (United States)

    Cowan, D A; Ramond, J-B; Makhalanyane, T P; De Maayer, P

    2015-06-01

    Whether they are exposed to extremes of heat or cold, or buried deep beneath the Earth's surface, microorganisms have an uncanny ability to survive under these conditions. This ability to survive has fascinated scientists for nearly a century, but the recent development of metagenomics and 'omics' tools has allowed us to make huge leaps in understanding the remarkable complexity and versatility of extremophile communities. Here, in the context of the recently developed metagenomic tools, we discuss recent research on the community composition, adaptive strategies and biological functions of extremophiles.

  7. Winter Storms and Extreme Cold

    Science.gov (United States)


  8. WRF high resolution dynamical downscaling of ERA-Interim for Portugal

    Energy Technology Data Exchange (ETDEWEB)

    Soares, Pedro M.M. [University of Lisbon, Instituto Dom Luiz, Lisbon (Portugal); Faculdade de Ciencias da Universidade de Lisboa, Lisbon (Portugal); Cardoso, Rita M.; Miranda, Pedro M.A.; Medeiros, Joana de [University of Lisbon, Instituto Dom Luiz, Lisbon (Portugal); Belo-Pereira, Margarida; Espirito-Santo, Fatima [Instituto de Meteorologia, Lisbon (Portugal)

    2012-11-15

    This study proposes a dynamically downscaled climatology of Portugal, produced by a high resolution (9 km) WRF simulation, forced by 20 years of ERA-Interim reanalysis (1989-2008), nested in an intermediate domain with 27 km of resolution. The Portuguese mainland is characterized by large precipitation gradients, with observed mean annual precipitation ranging from about 400 to over 2,200 mm, with a very wet northwest and rather dry southeast, largely explained by orographic processes. Model results are compared with all available stations with continuous records, comprising daily information in 32 stations for temperature and 308 for precipitation, through the computation of mean climatologies, standard statistical errors on daily to seasonally timescales, and distributions of extreme events. Results show that WRF at 9 km outperforms ERA-Interim in all analyzed variables, with good results in the representation of the annual cycles in each region. The biases of minimum and maximum temperature are reduced, with improvement of the description of temperature variability at the extreme range of its distribution. The largest gain of the high resolution simulations is visible in the rainiest regions of Portugal, where orographic enhancement is crucial. These improvements are striking in the high ranking percentiles in all seasons, describing extreme precipitation events. WRF results at 9 km compare favorably with published results supporting its use as a high-resolution regional climate model. This higher resolution allows a better representation of extreme events that are of major importance to develop mitigation/adaptation strategies by policy makers and downstream users of regional climate models in applications such as flash floods or heat waves. (orig.)
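
    A hedged sketch of the kind of station-versus-model comparison described above: mean bias and a high-percentile check for daily precipitation. Both series are synthetic; they stand in for a station record and the corresponding WRF grid-point series.

```python
# Mean bias and 95th-percentile comparison between a "station" and a "model" daily precip series.
import numpy as np

rng = np.random.default_rng(11)
obs = rng.gamma(shape=0.5, scale=8.0, size=7300)               # ~20 yr of daily station precip (mm)
mod = obs * rng.lognormal(mean=0.0, sigma=0.3, size=obs.size)  # stand-in "model" series

bias = mod.mean() - obs.mean()
p95_obs, p95_mod = np.percentile(obs, 95), np.percentile(mod, 95)
print(f"mean bias = {bias:.2f} mm/day, 95th pct obs/model = {p95_obs:.1f}/{p95_mod:.1f} mm/day")
```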

  9. La era de la cibercultura publicitaria

    Directory of Open Access Journals (Sweden)

    Dra. Rocío Jiménez

    2001-01-01

    Full Text Available The author takes a tour of the dimension of cyber-advertising, pausing on aspects such as the cybernaut, the power of interaction, beauty and functionality, the concept of creating a product for each cybernaut, the voice of the cybernaut, and the return to barter. She insists on some relevant aspects of cyber-advertising: pull, cybercasting and advertising banners, and closes her work with an analysis of ideas such as the era of advertising cyberculture, the creation of communities and the feedback of content.

  10. PERGESERAN MITOLOGI PESANTREN DI ERA MODERN

    Directory of Open Access Journals (Sweden)

    Arif Junaidi

    2011-12-01

    The main aim of this study is to trace the roots, patterns, and function of beliefs about the karamah (supernatural grace) of the kiai of the Futuhiyah pesantren in Mranggen, Demak, and the extent to which a shift in this mythology has occurred among the community in line with social change in the modern era. This is a field study. A qualitative paradigm was used because the study seeks to uncover meaning. The finding of the study is that the life of the kiai at the Futuhiyah Mranggen pesantren is likewise surrounded by kiai myths, in the sense of the belief that the kiai possesses karamah.

  11. Spectroscopy in the Era of LSST

    CERN Document Server

    Matheson, Thomas; Green, Richard; McConnachie, Alan; Newman, Jeff; Olsen, Knut; Szkody, Paula; Wood-Vasey, W Michael

    2013-01-01

    This report summarizes the results of the 'Spectroscopy in the Era of LSST' workshop held April 11-12, 2013 in Tucson, Arizona. At the workshop, there were breakout sessions covering four broad science topics. These were: time domain science, Galactic structure and stellar populations, galaxies and AGN, and dark energy and cosmology. We present the science cases discussed in these breakout sessions and provide a synthesis of highly desired capabilities that meet needs across all four broad topics. We also present a table that will be useful to characterize the needs of specific science cases in a format that provides a general framework for discussion of future spectroscopic capabilities.

  12. ERA [Material gráfico

    OpenAIRE

    Fundación para la Etnografía y el Desarrollo de la Artesanía Canaria

    1990-01-01

    Age: 20th century. Considerable vegetation in its interior. Land classification: rural. Declared BIC: no. Semicircular structure, slab-paved in some sectors; it has no stones along the edge. Protection level: 0. Along the road that runs from El Aserrador to Carrizal de Tejeda, take the first track on the left; the era is on the right along this track. Ownership: private. Of note are the large slabs in its interior; the slabs are covered with earth and v...

  13. ERA [Material gráfico

    OpenAIRE

    Fundación para la Etnografía y el Desarrollo de la Artesanía Canaria

    1990-01-01

    Age: 20th century. Circular construction made of large slabs that cover it entirely; it has no stone border. Declared BIC: no. In very good condition. Covered with vegetation that has grown between the slabs. Protection level: 0. Past the Lomos de Pedro Afonso towards Cercados de Araña there is a crossing of tracks; take the one on the right and at about 350 metres is the era, below the road. In the surrounding area there are remains of old terraces. Ownership: PRIV...

  14. ERA ROJITA [Material gráfico

    OpenAIRE

    Fundación para la Etnografía y el Desarrollo de la Artesanía Canaria

    1990-01-01

    Age: 19th century. Land use designation: rural land with protected surroundings. Land classification: rural. Circular floor plan, paved with large flagstones over almost its entire surface and with other stones of decimetre size. It is bounded on one side by large stones; on the other side the wall rests on a very wide stone wall and barely negotiates any slope. Located on a small ridge very close to the bed of the small ravine but exposed to the winds. From the era one overlooks part...

  15. [Ethical issues in genome-era].

    Science.gov (United States)

    Kosugi, Shinji

    2016-06-01

    Handling of personal genome information is one of the most important current ethical issues in the era of the next generation sequencer, which is progressing technically at a furious speed, becoming 100,000 times faster in only five years. The author takes up the topics of (1) research and clinical guidelines for the handling of human genome information, (2) incidental and secondary findings of next generation sequencing in clinical exome and genome sequencing, and (3) so-called direct-to-consumer genetic testing services. In topic (2), the ACMG (American College of Medical Genetics and Genomics) recommendations on reporting incidental findings, proposed in 2013 and 2014, are the focus.

  16. Transforming healthcare in the Internet Era.

    Science.gov (United States)

    Detmer, D E

    2001-01-01

    Healthcare services will be transformed in the Internet Era by developments in biotechnology, bioinformatics, health informatics, assimilation of modern business processes, and changing policy expectations. Discoveries in biology and communications technology offer the potential for improvements in health status of individuals and populations. Improved access to information about health and disease will typify early progress. Care in hospitals will shift toward palliation and end-of-life care; curing and prevention will increase in outpatient settings and/or within the home or workplace. Barriers include resistance to change and a lack of a global health information infrastructure that includes financing, standards, and coherent policy.

  17. Studying Stepfamilies: Four Eras of Family Scholarship.

    Science.gov (United States)

    Ganong, Lawrence; Coleman, Marilyn

    2017-07-23

    Historically, there have always been stepfamilies, but until the early 1970s, they remained largely unnoticed by social scientists. Research interest in stepfamilies followed shortly after divorce became the primary precursor to stepfamily formation. Because stepfamilies are structurally diverse and much more complex than nuclear families, they have created considerable challenges for both researchers and clinicians. This article examines four eras of stepfamily scholarship, tracing the development of research questions, study designs and methods, and conceptual frameworks from the mid-1970s to the present and drawing implications for the current state of the field. © 2017 Family Process Institute.

  18. Waxing and Waning of Observed Extreme Annual Tropical Rainfall

    CERN Document Server

    Sukhatme, Jai

    2016-01-01

    We begin by providing observational evidence that the probability of encountering very high and very low annual tropical rainfall has increased significantly in the recent decade (1998-present) as compared to the preceding warming era (1979-1997). These changes over land and ocean are spatially coherent and comprise of a rearrangement of very wet regions and a systematic expansion of dry zones. While the increased likelihood of extremes is consistent with a higher average temperature during the pause (as compared to 1979-1997), it is important to note that the periods considered are also characterized by a transition from a relatively warm to cold phase of the El Nino Southern Oscillation (ENSO). To further probe the relation between contrasting phases of ENSO and extremes in accumulation, a similar comparison is performed between 1960-1978 (another extended cold phase of ENSO) and the aforementioned warming era. Though limited by land-only observations, in this cold-to-warm transition, remarkably, a near-exa...

  19. Teachers as Cultural Mediators: A Comparison of the Accountability Era to the Assimilation Era

    Science.gov (United States)

    Eick, Caroline; Valli, Linda

    2010-01-01

    This article examines teachers' relationships with foreign students during eras marked by large waves of immigration to the United States and by policies that shifted from cultural assimilation (1900-1920) to present-day accountability. We compare teachers' understandings of and instructional practices regarding foreign-born English language…

  20. Interannual variability of the surface mass balance of West Antarctica from ITASE cores and ERA40 reanalyses, 1958-2000

    Energy Technology Data Exchange (ETDEWEB)

    Genthon, C. [CNRS/UJF, Laboratoire de Glaciologie et Geophysique de l' Environnement, 54 Rue Moliere, BP 96, Saint Martin d' Heres cedex (France); Kaspari, S. [University of Maine, Climate Change Institute, Orono, ME (United States); Mayewski, P.A. [University of Maine, Climate Change Institute, Orono, ME (United States); University of Maine, Department of Earth Sciences, Orono, ME (United States)

    2005-06-01

    Time series of west-Antarctic (WA) annual surface mass balance (SMB) from ITASE firn/ice cores are compared with the ECMWF 1958-2001 ERA40 reanalysis-based model forecasts. The ITASE series partially confirm the spatial structure of the signature of El Nino Southern Oscillation (ENSO) in WA precipitation as previously identified in ERA40 and other models. However, an improvement of ERA40's ability to reproduce the west-Antarctic SMB since the 1970s is evidenced and is probably related to the onset and increasing use of satellite data in late 1972 and 1978. Restricting the analysis to the 1973-2000 (satellite) period, interannual correlations between ITASE cores and ERA40 SMB series are generally significant (95% confidence level) but weak. The fraction of common variability increases when the series are spatially averaged, suggesting that small-scale perturbation (SSP) of the large-scale SMB variability significantly contributes to year-to-year variability in single core series. A comparison of stake network and core data from the South Pole suggests that SSP can almost fully obscure the large-scale component of the SMB variability as recorded in a single core. Because of SSP, the 1973-2000 period is too brief to verify whether all aspects of the WA large-scale signatures of ENSO and of the Antarctic Oscillation suggested by ERA40 are confirmed in the core series. More annually resolved field data from cores and stakes, spatially extended using high-resolution ground penetrating radar, are necessary to fully assess the relationship between the Antarctic SMB and the large-scale climate as currently suggested by meteorological and climate models. (orig.)
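
    A hedged sketch of the core-versus-reanalysis comparison described above: the interannual Pearson correlation of two annual SMB series over a common period. Both series are synthetic; the extra noise added to the "core" series plays the role of the small-scale perturbation (SSP) discussed in the abstract.

```python
# Interannual correlation between a synthetic ice-core SMB series and a reanalysis series.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
signal = rng.normal(0.0, 1.0, 28)          # shared large-scale SMB anomaly, 1973-2000
core = signal + rng.normal(0.0, 1.5, 28)   # single core: strong small-scale noise (SSP)
era40 = signal + rng.normal(0.0, 0.5, 28)  # reanalysis estimate of the same signal

r, p = stats.pearsonr(core, era40)
print(f"r = {r:.2f}, p = {p:.3f}")         # the added SSP noise weakens the correlation
```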

  1. Engineering the (In, Al, Ga)N back-barrier to achieve high channel-conductivity for extremely scaled channel-thicknesses in N-polar GaN high-electron-mobility-transistors

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Jing, E-mail: jing@ece.ucsb.edu; Zheng, Xun; Guidry, Matthew; Denninghoff, Dan; Ahmadi, Elahe; Lal, Shalini; Keller, Stacia; Mishra, Umesh K. [Department of Electrical and Computer Engineering, University of California, Santa Barbara, California 93106 (United States); DenBaars, Steven P. [Department of Electrical and Computer Engineering, University of California, Santa Barbara, California 93106 (United States); Materials Department, University of California, Santa Barbara, California 93106 (United States)

    2014-03-03

    Scaling down the channel thickness (t_ch) in GaN/(In, Al, Ga)N high-electron-mobility transistors (HEMTs) is essential to eliminating short-channel effects in sub-100 nm gate length HEMTs. However, this scaling can degrade both the charge density (n_s) and the mobility (μ), thereby reducing channel conductivity. In this study, the back-barrier design in N-polar GaN/(In, Al, Ga)N was engineered to achieve highly conductive channels with t_ch < 5 nm using metal organic chemical vapor deposition. Compositional grading was found to be the most effective approach for mitigating the loss of channel conductivity in structures with t_ch ∼ 3 nm. For a HEMT with a 3-nm-thick channel, a sheet resistance of 329 Ω/◻ and a peak transconductance of 718 mS/mm were demonstrated.
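
    A minimal sketch of the textbook relation between sheet resistance, sheet charge density and mobility that underlies the figures quoted above, R_s = 1/(q n_s μ). The n_s and μ values below are illustrative placeholders, not values reported in the paper; they are merely chosen so the formula returns a resistance of the same order as the quoted 329 Ω/◻.

        # Textbook 2DEG relation R_s = 1 / (q * n_s * mu); n_s and mu are illustrative
        # placeholders (not values reported in the paper).
        q = 1.602e-19          # elementary charge, C
        n_s = 2.0e13           # sheet charge density, cm^-2 (hypothetical)
        mu = 950.0             # electron mobility, cm^2 V^-1 s^-1 (hypothetical)
        R_s = 1.0 / (q * n_s * mu)   # ohms per square
        print(f"sheet resistance ~ {R_s:.0f} ohm/sq")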

  2. Remembrance of ecohydrologic extremes past

    Science.gov (United States)

    Band, L. E.; Hwang, T.

    2013-12-01

    Ecohydrological systems operate at time scales that span several orders of magnitude. Significant processes and feedbacks range from subdaily physiologic response to meteorological drivers, to soil forming and geomorphic processes ranging up through 10^3-10^4 years. While much attention in ecohydrology has focused on ecosystem optimization paradigms, these systems can show significant transience in structure and function, with apparent memory of hydroclimate extremes and regime shifts. While optimization feedbacks can be reconciled with system transience, a better understanding of the time scales and mechanisms of adjustment to increased hydroclimate variability and to specific events is required to understand and predict the dynamics and vulnerability of ecosystems. Under certain circumstances of slowly varying hydroclimate, we hypothesize that ecosystems can remain adjusted to changing climate regimes, without displaying apparent system memory. Alternatively, rapid changes in hydroclimate and increased hydroclimate variability, amplified by well-expressed non-linearity in the processes controlling feedbacks between water, carbon and nutrients, can move ecosystems far from adjusted states. The Coweeta Hydrological Laboratory is typical of humid, broadleaf forests in eastern North America, with a range of forest biomes from northern hardwoods at higher elevations to oak-pine assemblages at lower elevations. The site provides almost 80 years of rainfall-runoff records for a set of watersheds under different management, along with multi-decadal forest plot structural information, soil moisture conditions and stream chemistry. An initial period of multi-decadal cooling was followed by three decades of warming and increased hydroclimate variability. While mean temperature has risen over this time period, precipitation shows no long-term trends in the mean, but has had a significant rise in variability with repeated extreme drought and wet periods. Over this latter

  3. Assessment of multiple daily precipitation statistics in ERA-Interim driven Med-CORDEX and EURO-CORDEX experiments against high resolution observations

    Science.gov (United States)

    Fantini, Adriano; Raffaele, Francesca; Torma, Csaba; Bacer, Sara; Coppola, Erika; Giorgi, Filippo; Ahrens, Bodo; Dubois, Clotilde; Sanchez, Enrique; Verdecchia, Marco

    2016-11-01

    We assess the statistics of different daily precipitation indices in ensembles of Med-CORDEX and EURO-CORDEX experiments at high resolution (grid spacing of 0.11°, or RCM11) and medium resolution (grid spacing of 0.44°, or RCM44) with regional climate models (RCMs) driven by the ERA-Interim reanalysis of observations for the period 1989-2008. The assessment is carried out by comparison with a set of high resolution observation datasets for nine European subregions. The statistics analyzed include quantitative metrics for mean precipitation, daily precipitation probability density functions (PDFs), daily precipitation intensity, frequency, 95th percentile and 95th percentile of dry spell length. We assess an ensemble including all Med-CORDEX and EURO-CORDEX models together and others including the Med-CORDEX and EURO-CORDEX separately. For the All Models ensembles, the RCM11 one shows a remarkable performance in reproducing the spatial patterns and seasonal cycle of mean precipitation over all regions, with a consistent and marked improvement compared to the RCM44 ensemble and the ERA-Interim reanalysis. A good consistency with observations by the RCM11 ensemble (and a substantial improvement compared to RCM44 and ERA-Interim) is found also for the daily precipitation PDFs, mean intensity and, to a lesser extent, the 95th percentile. A general improvement by the RCM11 models is also found when the data are upscaled and intercompared at the 0.44° and 1.5° resolutions. For some regions the RCM11 ensemble overestimates the occurrence of very high intensity events while for one region the models underestimate the occurrence of the most intense extremes. The RCM11 ensemble still shows a general tendency to underestimate the dry day frequency and 95th percentile of dry spell length over wetter regions, with only a marginal improvement compared to the lower resolution models. This indicates that the problem of the excessive production of low precipitation events found
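
    A rough Python sketch of a few of the daily-precipitation statistics listed above (wet-day frequency, mean intensity, wet-day 95th percentile, longest dry spell) for a single grid cell. The 1 mm/day wet-day threshold and the synthetic gamma-distributed series are assumptions, and the longest dry spell is a simplification of the 95th percentile of dry-spell length used in the paper.

        import numpy as np

        def daily_precip_stats(pr_mm, wet_threshold=1.0):
            """Wet-day frequency, mean wet-day intensity, wet-day 95th percentile
            and longest dry spell for one grid cell (pr_mm: daily totals in mm)."""
            pr = np.asarray(pr_mm, dtype=float)
            wet = pr >= wet_threshold
            freq = wet.mean()
            intensity = pr[wet].mean() if wet.any() else 0.0
            p95 = np.percentile(pr[wet], 95) if wet.any() else 0.0
            longest_dry, run = 0, 0
            for is_wet in wet:                 # longest run of consecutive dry days
                run = 0 if is_wet else run + 1
                longest_dry = max(longest_dry, run)
            return freq, intensity, p95, longest_dry

        rng = np.random.default_rng(1)
        pr = rng.gamma(shape=0.4, scale=6.0, size=20 * 365)   # synthetic 20-year series
        print(daily_precip_stats(pr))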

  4. Multidecadal oscillations in rainfall and hydrological extremes

    Science.gov (United States)

    Willems, Patrick

    2013-04-01

    Many studies have anticipated a worldwide increase in the frequency and intensity of precipitation extremes and floods since the last decade(s). Natural variability by climate oscillations partly determines the observed evolution of precipitation extremes. Based on a technique for the identification and analysis of changes in extreme quantiles, it is shown that hydrological extremes have oscillatory behaviour at multidecadal time scales. Results are based on nearly independent extremes extracted from long-term historical time series of precipitation intensities and river flows. Study regions include Belgium - The Netherlands (Meuse basin), Ethiopia (Blue Nile basin) and Ecuador (Paute basin). For Belgium - The Netherlands, the past 100 years showed larger and more frequent hydrological extremes around the 1910s, 1950-1960s, and more recently during the 1990-2000s. Interestingly, the oscillations for southwestern Europe are anti-correlated with those of northwestern Europe, thus with oscillation highs in the 1930-1940s and 1970s. The precipitation oscillation peaks are explained by persistence in atmospheric circulation patterns over the North Atlantic during periods of 10 to 15 years. References: Ntegeka V., Willems P. (2008), 'Trends and multidecadal oscillations in rainfall extremes, based on a more than 100 years time series of 10 minutes rainfall intensities at Uccle, Belgium', Water Resources Research, 44, W07402, doi:10.1029/2007WR006471 Mora, D., Willems, P. (2012), 'Decadal oscillations in rainfall and air temperature in the Paute River Basin - Southern Andes of Ecuador', Theoretical and Applied Climatology, 108(1), 267-282, doi:10.1007/s00704-011-0527-4 Taye, M.T., Willems, P. (2011). 'Influence of climate variability on representative QDF predictions of the upper Blue Nile Basin', Journal of Hydrology, 411, 355-365, doi:10.1016/j.jhydrol.2011.10.019 Taye, M.T., Willems, P. (2012). 'Temporal variability of hydro-climatic extremes in the Blue Nile basin', Water

  5. Precipitation extremes under climate change

    CERN Document Server

    O'Gorman, Paul A

    2015-01-01

    The response of precipitation extremes to climate change is considered using results from theory, modeling, and observations, with a focus on the physical factors that control the response. Observations and simulations with climate models show that precipitation extremes intensify in response to a warming climate. However, the sensitivity of precipitation extremes to warming remains uncertain when convection is important, and it may be higher in the tropics than the extratropics. Several physical contributions govern the response of precipitation extremes. The thermodynamic contribution is robust and well understood, but theoretical understanding of the microphysical and dynamical contributions is still being developed. Orographic precipitation extremes and snowfall extremes respond differently from other precipitation extremes and require particular attention. Outstanding research challenges include the influence of mesoscale convective organization, the dependence on the duration considered, and the need to...
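
    For reference, the robust thermodynamic contribution mentioned above is usually summarized by the Clausius-Clapeyron rate of increase of saturation vapour pressure with temperature; the numbers below are a back-of-envelope illustration, not taken from the book:

        $$\frac{1}{e_{s}}\frac{de_{s}}{dT}=\frac{L_{v}}{R_{v}T^{2}}\approx\frac{2.5\times10^{6}\ \mathrm{J\,kg^{-1}}}{(461\ \mathrm{J\,kg^{-1}\,K^{-1}})(288\ \mathrm{K})^{2}}\approx 0.065\ \mathrm{K^{-1}},$$

    i.e. roughly 6-7% more saturation vapour pressure per kelvin of warming near present-day surface temperatures.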

  6. "Triangular" extremal dilatonic dyons

    CERN Document Server

    Gal'tsov, Dmitri; Orlov, Dmitri

    2014-01-01

    Explicit dyonic dilaton black holes of the four-dimensional Einstein-Maxwell-dilaton theory are known only for two particular values of the dilaton coupling constant $a=1,\sqrt{3}$, while for other $a$ numerical evidence was presented earlier about the existence of extremal dyons in theories with the discrete sequence of dilaton couplings $a=\sqrt{n(n+1)/2}$ with integer $n$. Apart from the lower members $n=1,\,2$, this family of theories does not have motivation from supersymmetry or higher dimensions, and so far the above quantization rule has not been derived analytically. We fill this gap showing that this rule follows from analyticity of the dilaton at the $AdS_2\times S^2$ event horizon with $n$ being the leading dilaton power in the series expansion. We also present a generalization for asymptotically anti-de Sitter dyonic black holes with spherical, plane and hyperbolic topology of the horizon.
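
    Spelling out the quantization rule quoted above, the discrete family of dilaton couplings reads

        $$a_{n}=\sqrt{\tfrac{n(n+1)}{2}}:\qquad a_{1}=1,\quad a_{2}=\sqrt{3},\quad a_{3}=\sqrt{6},\quad a_{4}=\sqrt{10},\ \dots$$

    so the two analytically known cases $a=1$ and $a=\sqrt{3}$ are simply the $n=1$ and $n=2$ members of the sequence.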

  7. Extreme skin depth waveguides

    CERN Document Server

    Jahani, Saman

    2014-01-01

    Recently, we introduced a paradigm shift in light confinement strategy and introduced a class of extreme skin depth (e-skid) photonic structures (S. Jahani and Z. Jacob, "Transparent sub-diffraction optics: nanoscale light confinement without metal," Optica 1, 96-100 (2014)). Here, we analytically establish that figures of merit related to light confinement in dielectric waveguides are fundamentally tied to the skin depth of waves in the cladding. We contrast the propagation characteristics of the fundamental mode of e-skid waveguides and conventional waveguides to show that the decay constant in the cladding is dramatically larger in e-skid waveguides, which is the origin of sub-diffraction confinement. Finally, we propose an approach to verify the reduced skin depth in experiment using the decrease in the Goos-Hänchen phase shift.

  8. Pulsars and Extreme Physics

    Science.gov (United States)

    Bell-Burnell, Jocelyn

    2004-10-01

    Pulsars were discovered 35 years ago. What do we know about them now, and what have they taught us about the extremes of physics? With an average density comparable to that of the nucleus, magnetic fields around 10^8 T and speeds close to c, these objects have stretched our understanding of the behaviour of matter. They serve as extremely accurate clocks with which to carry out precision experiments in relativity. Created in cataclysmic explosions, pulsars are a (stellar) form of life after death. After half a billion revolutions most pulsars finally die, but amazingly some are born again to yet another, even weirder, afterlife. Pulsar research continues lively, delivering exciting, startling and almost unbelievable results!

  9. Scales, scales and more scales.

    Science.gov (United States)

    Weitzenhoffer, Andre M

    2002-01-01

    This article examines the nature, uses, and limitations of the large variety of existing, so-called hypnosis scales; that is, instruments that have been proposed for the assessment of hypnotic behavior. Although the major aim of most of the scales ostensibly seems to be to assess several aspects of hypnotic states, they are found generally to say little about these and much more about responses to suggestions. The greatest application of these scales is to be found in research, but they also have a limited place in clinical work.

  10. Arabic Language Learning in the Post-Method Era [Pembelajaran Bahasa Arab di Era Posmetode]

    Directory of Open Access Journals (Sweden)

    Muhbib Abdul Wahab

    2015-06-01

    Full Text Available This article aims to answer two essential problems: (1) the development map of Arabic learning methods from the 1990s until today, which has so far been formulated unclearly, and (2) the development of Arabic learning in the post-method era by optimizing the teacher's strategic role in the process of Arabic learning. The article draws on bibliographic sources from books and articles in scientific journals on linguistics and Arabic learning. The interpretation of the thought of academics and Arabic linguistics experts was carried out using a historical-critical approach and content analysis for substantive interpretation. B. Kumaravadivelu's concept in Beyond Methods: Macrostrategies for Language Teaching (2003), which requires the teacher to play three essential roles (passive technician, reflective practitioner, and transformative intellectual), is very important for Arabic learning in the post-method era. The principle of at-tharîqatu ahammu min al-mâddah (the method is more important than the content) can be developed into the main principle that "the spirit, professionalism and strategic role of the language educator are more important in teaching Arabic than the method itself", since basically there is no single most appropriate and ideal method for every goal and situation of Arabic learning.

  11. The Application of Citizen Journalism in the Era of Media Convergence [Aplikasi Citizen Journalism di Era Konvergensi Media]

    Directory of Open Access Journals (Sweden)

    Rahmat Edi Irawan

    2014-10-01

    Full Text Available Citizen journalism has now become one of the most developed concepts in television programming. Although the concept was initially more widely used in radio and online media, today, with coverage and image-delivery technology that is easier and cheaper, it provides a place for people to become amateur journalists and can be applied with ease in the medium of television. The application of citizen journalism in television is also increasingly facilitated because television has now entered the era of media convergence, which brings together previously separate media such as television, print, radio and the internet. The era of media convergence allows the concept of citizen journalism to develop further, because the platforms and distribution channels available to amateur journalists are increasingly diverse. However, the equipment that must be provided, the human resources that must be available, and the large capital required mean that not many television stations have opened up multiple platforms to provide space for amateur journalists in citizen journalism.

  12. Molecular anthropology in the genomic era.

    Science.gov (United States)

    Destro-Bisol, Giovanni; Jobling, Mark A; Rocha, Jorge; Novembre, John; Richards, Martin B; Mulligan, Connie; Batini, Chiara; Manni, Franz

    2010-01-01

    Molecular Anthropology is a relatively young field of research. In fact, less than 50 years have passed since the symposium "Classification and Human Evolution" (1962, Burg Wartenstein, Austria), where the term was formally introduced by Emil Zuckerkandl. In this time, Molecular Anthropology has developed both methodologically and theoretically and extended its applications, so covering key aspects of human evolution such as the reconstruction of the history of human populations and peopling processes, the characterization of DNA in extinct humans and the role of adaptive processes in shaping the genetic diversity of our species. In the current scientific panorama, molecular anthropologists have to face a double challenge. As members of the anthropological community, we are strongly committed to the integration of biological findings and other lines of evidence (e.g. linguistic and archaeological), while keeping in line with methodological innovations which are moving the approach from the genetic to the genomic level. In this framework, the meeting "DNA Polymorphisms in Human Populations: Molecular Anthropology in the Genomic Era" (Rome, December 3-5, 2009) offered an opportunity for discussion among scholars from different disciplines, while paying attention to the impact of recent methodological innovations. Here we present an overview of the meeting and discuss perspectives and prospects of Molecular Anthropology in the genomic era.

  13. The end of a remarkable era

    CERN Multimedia

    2011-01-01

    An important era in particle physics is coming to an end: the US Department of Energy announced on Monday that it will not fund an extension to Tevatron running beyond 2011. It is a poignant moment for particle physics as we prepare to bid farewell to a machine that has changed our view of the Universe, and played a significant role in paving the way for the new era that is opening up with the LHC.   The Tevatron has been at the high-energy frontier of particle physics for over a quarter of a century. That’s a remarkable achievement by any account, and the physics results are there to prove it. As well as bringing us the discovery of the top quark in 1995, the Tevatron’s experiments have provided vitally important precision measurements covering the full spectrum of Standard Model physics, not to mention hints of what may lie beyond. With several months of running still to come, it would be a foolish gambler who bet against further new physics emerging before the Teva...

  14. Eukaryotic diversity at pH extremes.

    Science.gov (United States)

    Amaral-Zettler, Linda A

    2012-01-01

    Extremely acidic (pH < 3) and extremely alkaline (pH > 9) environments support a diversity of single-cell and, to a lesser extent, multicellular eukaryotic life. This study compared alpha and beta diversity in eukaryotic communities from seven diverse aquatic environments with pH values ranging from 2 to 11 using massively-parallel pyrotag sequencing targeting the V9 hypervariable region of the 18S ribosomal RNA (rRNA) gene. A total of 946 operational taxonomic units (OTUs) were recovered at a 6% cut-off level (94% similarity) across the sampled environments. Hierarchical clustering of the samples segregated the communities into acidic and alkaline groups. Similarity percentage (SIMPER) analysis followed by indicator OTU analysis (IOA) and non-metric multidimensional scaling (NMDS) were used to determine which characteristic groups of eukaryotic taxa typify acidic or alkaline extremes and the extent to which pH explains eukaryotic community structure in these environments. Spain's Rio Tinto yielded the fewest observed OTUs while Nebraska Sandhills alkaline lakes yielded the most. Distinct OTUs, including metazoan OTUs, numerically dominated pH extreme sites. Indicator OTUs included the diatom Pinnularia and unidentified opisthokonts (Fungi and Filasterea) in the extremely acidic environments, and the ciliate Frontonia across the extremely alkaline sites. Inferred from NMDS, pH explained only a modest fraction of the variation across the datasets, indicating that other factors influence the underlying community structure in these environments. The findings from this study suggest that the ability for eukaryotes to adapt to pH extremes over a broad range of values may be rare, but taxa that can broadly adapt across diverse acidic and alkaline environments present good models for understanding adaptation and should be targeted for future investigations.
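
    A minimal Python sketch of the Bray-Curtis / non-metric multidimensional scaling step described above, applied to a hypothetical OTU-by-sample abundance table; the table, its dimensions and the Dirichlet sampling are placeholders, and the published analysis is not reproduced here.

        import numpy as np
        from scipy.spatial.distance import pdist, squareform
        from sklearn.manifold import MDS

        rng = np.random.default_rng(2)
        # Hypothetical relative-abundance table: 7 samples x 946 OTUs (placeholder data).
        abundance = rng.dirichlet(np.ones(946) * 0.05, size=7)

        # Bray-Curtis dissimilarities between samples, then non-metric MDS in 2-D.
        dissim = squareform(pdist(abundance, metric="braycurtis"))
        nmds = MDS(n_components=2, metric=False, dissimilarity="precomputed", random_state=0)
        coords = nmds.fit_transform(dissim)
        print(coords)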

  15. Biological Extreme Events - Past, Present, and Future

    Science.gov (United States)

    Gutschick, V. P.

    2010-12-01

    Biological extreme events span wide ranges temporally and spatially and in type - population dieoffs, extinctions, ecological reorganizations, changes in biogeochemical fluxes, and more. Driving variables include meteorology, tectonics, orbital changes, anthropogenic changes (land-use change, species introductions, reactive N injection into the biosphere), and evolution (esp. of diseases). However, the mapping of extremes in the drivers onto biological extremes as organismal responses is complex, as laid out originally in the theoretical framework of Gutschick and BassiriRad (New Phytologist [2003] 100:21-42). Responses are nonlinear and dependent on (mostly unknown and) complex temporal sequences - often of multiple environmental variables. The responses are species- and genotype-specific. I review extreme events from the past to the present over wide temporal scales, while noting that they are not wholly informative of responses to the current and near-future drivers for at least two reasons: 1) the current combination of numerous environmental extremes - changes in CO2, temperature, precipitation, reactive N, land fragmentation, O3, etc. - is unprecedented in scope, and 2) adaptive genetic variation for organismal responses is constrained by poorly-characterized genetic structures (in organisms and populations) and by loss of genetic variation by genetic drift over long periods. We may expect radical reorganizations of ecosystem and biogeochemical functions. These changes include many ecosystem services in flood control, crop pollination and insect/disease control, C-water-mineral cycling, and more, as well as direct effects on human health. Predictions of such changes will necessarily be very weak in the critical next few decades, given the great deal of observation, experimentation, and theory construction that will be necessary, on both organisms and drivers. To make the research efforts most effective will require extensive, insightful planning, beginning

  16. Interplanetary shocks and solar wind extremes

    Science.gov (United States)

    Vats, Hari

    The interplanetary shocks have a very high correlation with the annual sunspot numbers during the solar cycle; however, the correlation falls very low on shorter time scales, which poses questions and difficulties for predictability. Space weather is largely controlled by these interplanetary shocks, solar energetic events and the extremes of the solar wind. In fact, most of the solar wind extremes are related to solar energetic phenomena. It is quite well understood that energetic events like flares, filament eruptions, etc. occurring on the Sun produce high-speed extremes both in terms of density and speed. There are also high-speed solar wind streams associated with coronal holes, mainly because the magnetic field lines are open there and the solar plasma finds it easy to escape. These are relatively tenuous high-speed streams and hence create low-intensity geomagnetic storms of longer duration. Solar flares and/or filament eruptions usually release excess coronal mass into the interplanetary medium, and thus these energetic events send out high-density and high-speed solar wind, which is statistically found to produce more intense storms. The other extremes of the solar wind are those in which density and speed are much lower than the normal values. Several such events have been observed and are found to produce space weather consequences of a different kind. It is found that such extremes are more common around the maxima of solar cycles 20 and 23. Most of these have significantly low Alfven Mach numbers. This article is intended to outline the interplanetary and geomagnetic consequences of the solar wind extremes observed by ground-based and satellite systems.

  17. Extreme events in multilayer, interdependent complex networks and control

    Science.gov (United States)

    Chen, Yu-Zhong; Huang, Zi-Gang; Zhang, Hai-Feng; Eisenberg, Daniel; Seager, Thomas P.; Lai, Ying-Cheng

    2015-11-01

    We investigate the emergence of extreme events in interdependent networks. We introduce an inter-layer traffic resource competing mechanism to account for the limited capacity associated with distinct network layers. A striking finding is that, when the number of network layers and/or the overlap among the layers are increased, extreme events can emerge in a cascading manner on a global scale. Asymptotically, there are two stable absorption states: a state free of extreme events and a state full of extreme events, and the transition between them is abrupt. Our results indicate that internal interactions in the multiplex system can yield qualitatively distinct phenomena associated with extreme events that do not occur for independent network layers. An implication is that, e.g., public resource competitions among different service providers can lead to a higher resource requirement than naively expected. We derive an analytical theory to understand the emergence of global-scale extreme events based on the concept of effective betweenness. We also articulate a cost-effective control scheme through increasing the capacity of very few hubs to suppress the cascading process of extreme events so as to protect the entire multi-layer infrastructure against global-scale breakdown.
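
    A minimal Python/networkx sketch of the betweenness-based hub selection that the control scheme above relies on; a single scale-free layer stands in for the interdependent multilayer system, and the paper's "effective betweenness" is not reproduced.

        import networkx as nx

        # Stand-in for one network layer; the multilayer "effective betweenness" of the
        # paper is not implemented here -- this only illustrates hub selection by
        # ordinary betweenness centrality.
        G = nx.barabasi_albert_graph(n=500, m=3, seed=42)
        bc = nx.betweenness_centrality(G)

        # The few highest-betweenness hubs whose capacity would be increased in a
        # cost-effective control scheme of the kind described in the abstract.
        hubs = sorted(bc, key=bc.get, reverse=True)[:5]
        print("candidate hubs:", hubs)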

  18. Extreme Rainfall Events Over Southern Africa: Assessment of a Climate Model to Reproduce Daily Extremes

    Science.gov (United States)

    Williams, C.; Kniveton, D.; Layberry, R.

    2007-12-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The subcontinent is considered especially vulnerable to extreme events, due to a number of factors including extensive poverty, disease and political instability. Rainfall variability and the identification of rainfall extremes are a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the assessment of a state-of-the-art climate model to simulate climate at daily timescales is carried out using satellite-derived rainfall data from the Microwave Infra-Red Algorithm (MIRA). This dataset covers the period from 1993-2002 and the whole of southern Africa at a spatial resolution of 0.1 degree longitude/latitude. Once the model's ability to reproduce extremes has been assessed, idealised regions of SST anomalies are used to force the model, with the overall aim of investigating the ways in which SST anomalies influence rainfall extremes over southern Africa. In this paper, results from sensitivity testing of the UK Meteorological Office Hadley Centre's climate model's domain size are firstly presented. Then simulations of current climate from the model, operating in both regional and global mode, are compared to the MIRA dataset at daily timescales. Thirdly, the ability of the model to reproduce daily rainfall extremes will be assessed, again by a comparison with extremes from the MIRA dataset. Finally, the results from the idealised SST experiments are briefly presented, suggesting associations between rainfall extremes and both local and remote SST anomalies.

  19. On the kinematic detection of accreted streams in the Gaia era: a cautionary tale

    Science.gov (United States)

    Jean-Baptiste, I.; Di Matteo, P.; Haywood, M.; Gómez, A.; Montuori, M.; Combes, F.; Semelin, B.

    2017-08-01

    The ΛCDM cosmological scenario predicts that our Galaxy should contain hundreds of stellar streams in the solar vicinity, fossil relics of the merging history of the Milky Way and more generally of the hierarchical growth of galaxies. Because of the mixing time scales in the inner Galaxy, it has been claimed that these streams should be difficult to detect in configuration space but can still be identifiable in kinematic-related spaces like the energy/angular momenta spaces, E - Lz and L⊥ - Lz, or spaces of orbital/velocity parameters. By means of high-resolution, dissipationless N-body simulations containing between 25 × 106 and 35 × 106 particles, we model the accretion of a series of up to four 1:10 mass ratio satellites then up to eight 1:100 satellites and search systematically for the signature of accretions in these spaces. The novelty of this work with respect to the majority of those already published is our analysis of fully consistent models, where both the satellite(s) and the Milky Way galaxy are "live" systems, which can react to the interaction and experience kinematical heating, tidal effects and dynamical friction (the latter, a process often neglected in previous studies). We find that, in agreement with previous works, all spaces are rich in substructures, but that, contrary to previous works, the origin of these substructures - accreted or in-situ - cannot be determined for the following reasons. In all spaces considered (1) each satellite provides the origin of several independent over-densities; (2) over-densities of multiple satellites overlap; (3) satellites of different masses can produce similar substructures; (4) the overlap between the in-situ and the accreted population is considerable everywhere; and (5) in-situ stars also form substructures in response to the satellite(s') accretion. These points are valid even if the search is restricted to kinematically-selected halo stars only. As we are now entering the "Gaia era", our
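
    A minimal Python sketch of how stars are placed in the E - Lz plane discussed above, assuming an illustrative logarithmic potential rather than the Galactic model used in the simulations; units are kpc and km/s, and the example star is hypothetical.

        import numpy as np

        def energy_and_Lz(R, z, vR, vphi, vz, v0=220.0, Rc=1.0):
            """Specific energy and z-angular momentum in an illustrative logarithmic
            potential Phi = 0.5*v0^2*ln(Rc^2 + R^2 + z^2) (kpc, km/s); this is NOT
            the Galactic model used in the simulations above."""
            Phi = 0.5 * v0**2 * np.log(Rc**2 + R**2 + z**2)
            E = 0.5 * (vR**2 + vphi**2 + vz**2) + Phi
            Lz = R * vphi
            return E, Lz

        # a hypothetical nearby star
        print(energy_and_Lz(R=8.2, z=0.02, vR=-30.0, vphi=210.0, vz=10.0))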

  20. Mangled extremity severity score in children.

    Science.gov (United States)

    Fagelman, Mitchell F; Epps, Howard R; Rang, Mercer

    2002-01-01

    Treatment of the severely traumatized or mangled lower extremity poses significant challenges. The Mangled Extremity Severity Score (MESS) is a scale that uses objective criteria to assist with acute management decisions. Most research on the MESS has been in adults or combined series with few children. The study was performed to investigate the MESS in children exclusively. The MESS was applied retrospectively to 36 patients with grades IIIB and IIIC open lower extremity fractures collected from two level 1 pediatric trauma centers. Patients were divided into limb salvage and primary amputation groups based on the decision of the treating surgeon. In the salvage group there were 18 grade IIIB fractures and 10 grade IIIC fractures. The MESS prediction was accurate in 93% of the injured limbs. In the amputation group eight limbs met the inclusion criteria; the MESS agreed with the treating surgeon in 63% of cases. These findings suggest the MESS should be considered when managing a child with severe lower extremity trauma.

  1. Extreme weather: Subtropical floods and tropical cyclones

    Science.gov (United States)

    Shaevitz, Daniel A.

    Extreme weather events have a large effect on society. As such, it is important to understand these events and to project how they may change in a future, warmer climate. The aim of this thesis is to develop a deeper understanding of two types of extreme weather events: subtropical floods and tropical cyclones (TCs). In the subtropics, the latitude is high enough that quasi-geostrophic dynamics are at least qualitatively relevant, while low enough that moisture may be abundant and convection strong. Extratropical extreme precipitation events are usually associated with large-scale flow disturbances, strong ascent, and large latent heat release. In the first part of this thesis, I examine the possible triggering of convection by the large-scale dynamics and investigate the coupling between the two. Specifically, two examples of extreme precipitation events in the subtropics are analyzed, the 2010 and 2014 floods of India and Pakistan and the 2015 flood of Texas and Oklahoma. I invert the quasi-geostrophic omega equation to decompose the large-scale vertical motion profile into components due to synoptic forcing and diabatic heating. Additionally, I present model results from within the Column Quasi-Geostrophic framework. A single-column model and a cloud-resolving model are forced with the large-scale forcings (other than large-scale vertical motion) computed from the quasi-geostrophic omega equation with input data from a reanalysis data set, and the large-scale vertical motion is diagnosed interactively with the simulated convection. It is found that convection was triggered primarily by mechanically forced orographic ascent over the Himalayas during the India/Pakistan flood and by upper-level Potential Vorticity disturbances during the Texas/Oklahoma flood. Furthermore, a climate attribution analysis was conducted for the Texas/Oklahoma flood and it is found that anthropogenic climate change was responsible for a small amount of rainfall during the event but the
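
    For reference, a standard textbook form of the quasi-geostrophic omega equation with a diabatic heating term (following e.g. Holton; not necessarily the exact formulation inverted in the thesis) is

        $$\left(\nabla^{2}+\frac{f_{0}^{2}}{\sigma}\frac{\partial^{2}}{\partial p^{2}}\right)\omega=\frac{f_{0}}{\sigma}\frac{\partial}{\partial p}\Bigl[\mathbf{V}_{g}\cdot\nabla\bigl(\zeta_{g}+f\bigr)\Bigr]+\frac{1}{\sigma}\nabla^{2}\Bigl[\mathbf{V}_{g}\cdot\nabla\Bigl(-\frac{\partial\Phi}{\partial p}\Bigr)\Bigr]-\frac{\kappa}{\sigma p}\nabla^{2}J,$$

    where ω is the pressure vertical velocity, V_g the geostrophic wind, ζ_g the geostrophic relative vorticity, σ the static stability parameter, Φ the geopotential and J the diabatic heating rate. The right-hand side separates the synoptic (advective) forcing from the diabatic contribution, which is the decomposition exploited above.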

  2. Extreme Precipitation and High-Impact Landslides

    Science.gov (United States)

    Kirschbaum, Dalia; Adler, Robert; Huffman, George; Peters-Lidard, Christa

    2012-01-01

    It is well known that extreme or prolonged rainfall is the dominant trigger of landslides; however, there remain large uncertainties in characterizing the distribution of these hazards and meteorological triggers at the global scale. Researchers have evaluated the spatiotemporal distribution of extreme rainfall and landslides at local and regional scales primarily using in situ data, yet few studies have mapped rainfall-triggered landslide distribution globally due to the dearth of landslide data and consistent precipitation information. This research uses a newly developed Global Landslide Catalog (GLC) and a 13-year satellite-based precipitation record from Tropical Rainfall Measuring Mission (TRMM) data. For the first time, these two unique products provide the foundation to quantitatively evaluate the co-occurrence of precipitation and rainfall-triggered landslides globally. The GLC, available from 2007 to the present, contains information on reported rainfall-triggered landslide events around the world using online media reports, disaster databases, etc. When evaluating this database, we observed that 2010 had a large number of high-impact landslide events relative to previous years. This study considers how variations in extreme and prolonged satellite-based rainfall are related to the distribution of landslides over the same time scales for three active landslide areas: Central America, the Himalayan Arc, and central-eastern China. Several test statistics confirm that TRMM rainfall generally scales with the observed increase in landslide reports and fatal events for 2010 and previous years over each region. These findings suggest that the co-occurrence of satellite precipitation and landslide reports may serve as a valuable indicator for characterizing the spatiotemporal distribution of landslide-prone areas in order to establish a global rainfall-triggered landslide climatology. This research also considers the sources for this extreme rainfall, citing

  3. Women in extreme poverty.

    Science.gov (United States)

    1994-01-01

    Population is estimated to increase from 5.5 billion in 1990 to 10 billion by 2050; the poverty level is expected to increase from 1 billion to 2-3 billion people. Women in development has been promoted throughout the UN and development system, but women in poverty who perform work in the informal sector are still uncounted, and solutions are elusive. The issue of extreme poverty can not be approached as just another natural disaster with immediate emergency relief. Many people live in precarious economic circumstances throughout their lives. Recent research reveals a greater understanding of the underlying causes and the need for inclusion of poor women in sustainable development. Sanitation, water, housing, health facilities need to be improved. Women must have access to education, opportunities for trading, and loans on reasonable terms. UNESCO makes available a book on survival strategies for poor women in the informal sector. The profile shows common problems of illiteracy, broken marriages, and full time involvement in provision of subsistence level existence. Existence is a fragile balance. Jeanne Vickers' "Women and the World" offers simple, low cost interventions for aiding extremely poor women. The 1992 Commission on the Status of Women was held in Vienna. Excerpts from several speeches are provided. The emphasis is on some global responses and an analysis of solutions. The recommendation is for attention to the gender dimension of poverty. Women's dual role contributes to greater disadvantages. Women are affected differently by macroeconomic factors, and that there is intergenerational transfer of poverty. Social services should be viewed as investments and directed to easing the burdens on time and energy. Public programs must be equipped to deal with poverty and to bring about social and economic change. Programs must be aware of the different distribution of resources within households. Women must be recognized as principal economic providers within

  4. Extreme winds in Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Kristensen, L.; Rathmann, O.; Hansen, S.O.

    1999-02-01

    Wind-speed data from four sites in Denmark have been analyzed in order to obtain estimates of the basic wind velocity, which is defined as the 50-year wind speed under standard conditions, i.e. ten-minute averages at the height 10 m over a uniform terrain with the roughness length 0.05 m. The sites are, from west, Skjern (15 years), Kegnaes (7 years), Sprogoe (20 years), and Tystofte (15 years). The data are ten-minute averages of wind speed, wind direction, temperature and pressure. The last two quantities are used to determine the air density ρ. The data are cleaned for terrain effects by means of a slightly modified WASP technique where the sector speed-up factors and roughness lengths are linearly smoothed with a direction resolution of one degree. Assuming geostrophic balance, all the wind-velocity data are transformed to friction velocity u_* and direction at standard conditions by means of the geostrophic drag law for neutral stratification. The basic wind velocity in 30 deg. sectors is obtained through ranking of the largest values of the friction-velocity pressure ρu_*^2/2, taken both once every two months and once every year. The main conclusion is that the basic wind velocity is significantly larger at Skjern, close to the west coast of Jutland, than at any of the other sites. Irrespective of direction, the present standard estimates of the 50-year wind are 25 ± 1 m/s at Skjern and 22 ± 1 m/s at the other three sites. These results are in agreement with those obtained by Jensen and Franck (1970) and Abild (1994) and support the conclusion that the wind climate at the west coast of Jutland is more extreme than in any other part of the country. Simple procedures to translate in a particular direction sector the standard basic wind velocity to conditions with a different roughness length and height are presented. It is shown that a simple scheme makes it possible to calculate the total 50-year extreme load on a general structure without
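
    A minimal Python sketch of the extreme-value step, here a Gumbel fit to synthetic annual maxima followed by reading off the 50-year return level; the paper's ranking of the friction-velocity pressure is not reproduced, and all numbers are placeholders.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        annual_max = 20.0 + 2.0 * rng.gumbel(size=15)      # synthetic annual maxima, m/s

        loc, scale = stats.gumbel_r.fit(annual_max)                  # fit a Gumbel distribution
        v50 = stats.gumbel_r.isf(1.0 / 50.0, loc=loc, scale=scale)   # 50-year return level
        print(f"estimated 50-year wind: {v50:.1f} m/s")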

  5. Stacked Extreme Learning Machines.

    Science.gov (United States)

    Zhou, Hongming; Huang, Guang-Bin; Lin, Zhiping; Wang, Han; Soh, Yeng Chai

    2015-09-01

    Extreme learning machine (ELM) has recently attracted many researchers' interest due to its very fast learning speed, good generalization ability, and ease of implementation. It provides a unified solution that can be used directly to solve regression, binary, and multiclass classification problems. In this paper, we propose a stacked ELMs (S-ELMs) that is specially designed for solving large and complex data problems. The S-ELMs divides a single large ELM network into multiple stacked small ELMs which are serially connected. The S-ELMs can approximate a very large ELM network with small memory requirement. To further improve the testing accuracy on big data problems, the ELM autoencoder can be implemented during each iteration of the S-ELMs algorithm. The simulation results show that the S-ELMs even with random hidden nodes can achieve similar testing accuracy to support vector machine (SVM) while having low memory requirements. With the help of ELM autoencoder, the S-ELMs can achieve much better testing accuracy than SVM and slightly better accuracy than deep belief network (DBN) with much faster training speed.
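
    A minimal Python sketch of the basic single-hidden-layer ELM that the stacked variant builds on (random hidden weights, closed-form least-squares output weights); this is not the S-ELM algorithm itself, and the toy regression data are arbitrary.

        import numpy as np

        class BasicELM:
            """Single-hidden-layer extreme learning machine: random hidden weights,
            output weights solved by least squares (pseudo-inverse)."""
            def __init__(self, n_hidden=100, seed=0):
                self.n_hidden = n_hidden
                self.rng = np.random.default_rng(seed)

            def _hidden(self, X):
                return np.tanh(X @ self.W + self.b)

            def fit(self, X, y):
                self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
                self.b = self.rng.normal(size=self.n_hidden)
                H = self._hidden(X)
                self.beta = np.linalg.pinv(H) @ y      # closed-form output weights
                return self

            def predict(self, X):
                return self._hidden(X) @ self.beta

        # toy regression: learn y = sin(x)
        X = np.linspace(-3, 3, 200).reshape(-1, 1)
        y = np.sin(X).ravel()
        model = BasicELM(n_hidden=50).fit(X, y)
        print("mean squared error:", np.mean((model.predict(X) - y) ** 2))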

  6. Solar extreme events

    CERN Document Server

    Hudson, Hugh S

    2015-01-01

    Solar flares and CMEs have a broad range of magnitudes. This review discusses the possibility of "extreme events," defined as those with magnitudes greater than have been seen in the existing historical record. For most quantitative measures, this direct information does not extend more than a century and a half into the recent past. The magnitude distributions (occurrence frequencies) of solar events (flares/CMEs) typically decrease with the parameter measured or inferred (peak flux, mass, energy, etc.). Flare radiation fluxes tend to follow a power law slightly flatter than $S^{-2}$, where S represents a peak flux; solar particle events (SPEs) follow a still flatter power law up to a limiting magnitude, and then appear to roll over to a steeper distribution, which may take an exponential form or follow a broken power law. This inference comes from the terrestrial $^{14}$C record and from the depth dependence of various radioisotope proxies in the lunar regolith and in meteorites. Recently major new observation...

  7. Detectors in Extreme Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Blaj, G. [SLAC National Accelerator Lab., Menlo Park, CA (United States); Carini, G. [SLAC National Accelerator Lab., Menlo Park, CA (United States); Carron, S. [SLAC National Accelerator Lab., Menlo Park, CA (United States); Haller, G. [SLAC National Accelerator Lab., Menlo Park, CA (United States); Hart, P. [SLAC National Accelerator Lab., Menlo Park, CA (United States); Hasi, J. [SLAC National Accelerator Lab., Menlo Park, CA (United States); Herrmann, S. [SLAC National Accelerator Lab., Menlo Park, CA (United States); Kenney, C. [SLAC National Accelerator Lab., Menlo Park, CA (United States); Segal, J. [SLAC National Accelerator Lab., Menlo Park, CA (United States); Tomada, A. [SLAC National Accelerator Lab., Menlo Park, CA (United States)

    2015-08-06

    Free Electron Lasers opened a new window on imaging the motion of atoms and molecules. At SLAC, FEL experiments are performed at LCLS using 120 Hz pulses with 10^12 - 10^13 photons in 10 femtoseconds (billions of times brighter than the most powerful synchrotrons). This extreme detection environment raises unique challenges, from obvious to surprising. Radiation damage is a constant threat due to accidental exposure to insufficiently attenuated beam, focused beam and formation of ice crystals reflecting the beam onto the detector. Often high-power optical lasers are also used (e.g., 25 TW), increasing the risk of damage or impeding data acquisition through electromagnetic pulses (EMP). The sample can contaminate the detector surface or even produce shrapnel damage. Some experiments require ultra-high vacuum (UHV) with strict design, surface contamination and cooling requirements - also for detectors. The setup is often changed between or during experiments with short turnaround times, risking mechanical and ESD damage, requiring work planning, training of operators and sometimes continuous participation of the LCLS Detector Group in the experiments. The detectors used most often at LCLS are CSPAD cameras for hard x-rays and pnCCDs for soft x-rays.

  8. Implementing Maxwell's Aether Illuminates the Physics of Gravitation:. The Gravity-Electric (G-E) Field, Evident at Every Scale, From the Ionosphere to Spiral Galaxies and a Neutron-Star Extreme

    Science.gov (United States)

    Osmaston, Miles F.

    2013-09-01

    the means for displacing its local density exist; that, we show, is the nature of gravitational action and brings gravitation into the electromagnetic family of forces. Under (B) the particle mass is measured by the aether-sucking capability of its vortex, positive-only gravitation being because the outward-diminishing force developed by each makes mutual convergence at any given point the statistically prevalent expectation. This activity maintains a radial aether (charge) density gradient - the Gravity-Electric (G-E) Field - around and within any gravitationally retained assemblage. So Newton's is an incomplete description of gravitation; the corresponding G-E field is an inseparable facet of the action. The effect on c of that charge density gradient yields gravitational lensing. We find that G-E field action on plasma is astronomically ubiquitous. This strictly radial outward force on ions has the property of increasing the orbital angular momentum of material, by moving it outwards, but at constant tangential velocity. Spiral galaxies no longer require Cold Dark Matter (CDM) to explain this. The force (maybe 30 V m^-1 at the solar surface) has comprehensive relevance to the high orbital a.m. achieved during solar planet formation, to their prograde spins and to exoplanet observations. The growth of high-mass stars is impossible if radiation pressure rules, whereas G-E field repulsion is low during dust-opaque infall, driving their prodigious mass loss rates when infall ceases and the star establishes an ionized environment. Its biggest force-effect (~10^12 V m^-1) is developed at neutron stars, where it is likely the force of supernova explosions, and leads to a fertile model for pulsars and the acceleration of 10^19 eV extreme-energy cosmic rays. Our only directly observed measure of the G-E field is recorded at about 1 V m^-1 in the ionosphere-to-Earth electric potential. And temporary local changes of ionosphere electron density, monitored by radio and satellite, have

  9. Climate projection of extreme wind speed regime in the Arctic

    Science.gov (United States)

    Surkova, Galina; Sokolova, Larisa

    2016-04-01

    Extreme surface wind events over the Arctic (60-90N, 0-360 E) are studied for the modern climate and for its possible future changes on the basis of ERA-Interim reanalysis data and the CMIP5 scenario RCP8.5. Probability distribution functions of horizontal surface (10 m) wind speed were evaluated in every grid point of the reanalysis and model data over the Arctic, as were the wind speeds at the 50th, 95th, 99th and 99.9th percentiles (V0.50, V0.95, V0.99, V0.999). First, changes of V0.50, V0.95, V0.99 and V0.999 were studied on the basis of the ERA-Interim reanalysis for 1981-2010. Results showed regional inhomogeneity of the wind speed trend intensity. The analysis was also made for zonal means and for separate sectors of the Arctic. To study the climate projection of high wind speeds, u and v values were taken from CMIP5 numerical experiments for 1961-1990 (Historical) and 2081-2100 (RCP8.5). The RCP8.5 scenario was chosen as having the most pronounced response in the climate system, which gives more statistical significance to the calculated trends. Modeled extreme wind speeds for the total Arctic and zonal means show rather good agreement with the reanalysis data (compared for the decades 1981-1990 and 1991-2000). At the same time, regional intermodel variability of the wind speed is revealed. Trends of extreme surface wind speed in the 21st century and for 2081-2100 over the Arctic are analyzed for each model. The study was supported by the Russian Science Foundation (project no. 14-37-00038).
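
    A minimal Python sketch of the percentile extraction described above (V0.50, V0.95, V0.99, V0.999) at a single grid point; synthetic Weibull-distributed winds stand in for the ERA-Interim and CMIP5 data.

        import numpy as np

        rng = np.random.default_rng(4)
        wind = 8.0 * rng.weibull(2.0, size=100_000)   # synthetic 10-m wind speeds, m/s

        for p in (50, 95, 99, 99.9):
            print(f"{p}th percentile: {np.percentile(wind, p):.1f} m/s")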

  10. Variability of extreme climate events in the territory and water area of Russia

    Science.gov (United States)

    Serykh, Ilya; Kostianoy, Andrey

    2016-04-01

    The Fourth (2007) and Fifth (2014) Assessment Reports on Climate Change of the Intergovernmental Panel on Climate Change (IPCC) state that in the XXI century, climate change will be accompanied by an increase in the frequency, intensity and duration of extreme nature events such as extreme precipitation and extremely high and low air temperatures. All these will lead to floods, droughts, fires, shallowing of rivers, lakes and water reservoirs, desertification, dust storms, melting of glaciers and permafrost, algal bloom events in the seas, lakes and water reservoirs. In turn, these events will lead to chemical and biological contamination of water, land and air. These events will result in a deterioration of quality of life, significant financial loss due to damage to houses, businesses, roads, agriculture, forestry and tourism, and in many cases they end in loss of life. These predictions are confirmed by the results of the studies presented in the RosHydromet First (2008) and Second (2014) Assessment Reports on Climate Change and its Consequences in the Russian Federation. Scientists' predictions have been repeatedly confirmed in the last 15 years - floods in Novorossiysk (2002), Krymsk and Gelendzhik (2012), the Far East (2013), heat waves in 2010, an unusually cold winter (February) of 2012 and an unusually warm winter of 2013/2014 in the European territory of Russia. In this regard, analysis and forecasting of extreme climate events associated with climate change in the territory of Russia are an extremely important task. This task is complicated by the fact that modern atmospheric models used by IPCC and RosHydromet badly reproduce and predict the intensity of precipitation. We are analyzing meteorological reanalysis data (NCEP/NCAR, 20th Century Reanalysis, ERA-20C, JRA-55) and satellite data (NASA and AVISO) on air, water and land temperature, rainfall, wind speed and cloud cover, water levels in seas and lakes, and the index of vegetation over the past 30-60 years

  11. Extremal almost-Kahler metrics

    CERN Document Server

    Lejmi, Mehdi

    2009-01-01

    We generalize the notion of the Futaki invariant and extremal vector field to the general almost-Kahler case and we prove the periodicity of the extremal vector field when the symplectic form represents an integral cohomology class modulo torsion. We give also an explicit formula of the hermitian scalar curvature which allows us to obtain examples of non-integrable extremal almost-Kahler metrics saturating LeBrun's estimates.

  12. Upper extremity amputations and prosthetics.

    Science.gov (United States)

    Ovadia, Steven A; Askari, Morad

    2015-02-01

    Upper extremity amputations are most frequently indicated by severe traumatic injuries. The location of the injury will determine the level of amputation. Preservation of extremity length is often a goal. The amputation site will have important implications on the functional status of the patient and options for prosthetic reconstruction. Advances in amputation techniques and prosthetic reconstructions promote improved quality of life. In this article, the authors review the principles of upper extremity amputation, including techniques, amputation sites, and prosthetic reconstructions.

  13. Recent and future extreme precipitation over Ukraine

    Science.gov (United States)

    Vyshkvarkova, Olena; Voskresenskaya, Elena

    2014-05-01

    The aim of this study is to analyze the parameters of precipitation extremes and inequality over Ukraine in the recent climate epoch and their possible changes in the future. Observational data from 28 hydrometeorological stations over Ukraine and the output of the GFDL-CM3 model (CMIP5) for the XXI century were used in the study. The concentration index method (J. Martin-Vide, 2004) was used for the study of precipitation inequality, while the extreme precipitation indices recommended by the ETCCDI were used for the frequency of events. Results. Precipitation inequality on the annual and seasonal scales was studied using estimated CI series for 1951-2005. It was found that annual CI values range from 0.58 to 0.64. They increase southward from the north-west (forest zone) and the north-east (forest-steppe zone) of Ukraine. CI maxima are located in the coastal regions of the Black Sea and the Sea of Azov. The annual CI spatial distribution indicates that the contribution of extreme precipitation to annual totals is most significant at the boundary zone between the steppe and marine regions. At the same time, the precipitation pattern at the foothills of the Carpathian Mountains is more homogeneous. The CI minima (0.54) are typical for the winter season in the foothills of the Ukrainian Carpathians. The CI maxima reach 0.71 in spring in the steppe zone close to the Black Sea coast. It should be noted that the greatest ranges of CI maximum and CI minimum deviation are typical for spring. This is associated with patterns of cyclone trajectories in that season. Most of the territory is characterized by a tendency toward a decreasing contribution of extreme precipitation to the total amount (CI linear trends are predominantly negative in all seasons). Decadal and interdecadal variability of precipitation inequality associated with global processes in the ocean-atmosphere system is also studied. It was shown that precipitation inequality over Ukraine is 10-15% stronger in the negative phase of the Pacific Decadal Oscillation and in the positive phase
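
    A rough Python sketch of a Gini-style daily-precipitation concentration measure in the spirit of the Lorenz-curve construction behind the CI; the published Martin-Vide (2004) index uses a class-based exponential fit, so this is only an approximate stand-in, and the synthetic gamma series is an assumption.

        import numpy as np

        def gini_concentration(daily_pr):
            """Gini coefficient of daily precipitation amounts (0 = evenly spread,
            -> 1 = concentrated in a few days). A rough stand-in for the Martin-Vide
            (2004) concentration index, which uses a class-based exponential fit."""
            x = np.sort(np.asarray(daily_pr, dtype=float))
            n = x.size
            ranks = np.arange(1, n + 1)
            return (2.0 * np.sum(ranks * x)) / (n * np.sum(x)) - (n + 1.0) / n

        rng = np.random.default_rng(5)
        daily = rng.gamma(shape=0.4, scale=6.0, size=30 * 365)   # synthetic 30-year series
        print(round(gini_concentration(daily), 2))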

  14. Scandinavian neuroscience during the Nazi era.

    Science.gov (United States)

    Kondziella, Daniel; Hansen, Klaus; Zeidman, Lawrence A

    2013-07-01

    Although Scandinavian neuroscience has a proud history, its status during the Nazi era has been overlooked. In fact, prominent neuroscientists in German-occupied Denmark and Norway, as well as in neutral Sweden, were directly affected. Mogens Fog, Poul Thygesen (Denmark) and Haakon Sæthre (Norway) were resistance fighters, tortured by the Gestapo: Thygesen was imprisoned in concentration camps and Sæthre executed. Jan Jansen (Norway), another neuroscientist resistor, escaped to Sweden, returning under disguise to continue fighting. Fritz Buchthal (Denmark) was one of almost 8000 Jews escaping deportation by fleeing from Copenhagen to Sweden. In contrast, Carl Værnet (Denmark) became a collaborator, conducting inhuman experiments in Buchenwald concentration camp, and Herman Lundborg (Sweden) and Thorleif Østrem (Norway) advanced racial hygiene in order to maintain the "superior genetic pool of the Nordic race." Compared to other Nazi-occupied countries, there was a high ratio of resistance fighters to collaborators and victims among the neuroscientists in Scandinavia.

  15. Astronomy in the Big Data Era

    Directory of Open Access Journals (Sweden)

    Yanxia Zhang

    2015-05-01

    Full Text Available The fields of Astrostatistics and Astroinformatics are vital for dealing with the big data issues now faced by astronomy. Like other disciplines in the big data era, astronomy has many of the "V" characteristics of big data (volume, velocity, variety, and so on). In this paper, we list the different data mining algorithms used in astronomy, along with data mining software and tools related to astronomical applications. We present SDSS, a project often referred to by other astronomical projects, as the most successful sky survey in the history of astronomy and describe the factors influencing its success. We also discuss the success of Astrostatistics and Astroinformatics organizations and the conferences and summer schools on these issues that are held annually. All the above indicates that astronomers and scientists from other areas are ready to face the challenges and opportunities provided by massive data volume.

  16. Big data era in meteor science

    Science.gov (United States)

    Vinković, D.; Gritsevich, M.; Srećković, V.; Pečnik, B.; Szabó, G.; Debattista, V.; Škoda, P.; Mahabal, A.; Peltoniemi, J.; Mönkölä, S.; Mickaelian, A.; Turunen, E.; Kákona, J.; Koskinen, J.; Grokhovsky, V.

    2016-01-01

    Over the last couple of decades technological advancements in observational techniques in meteor science have yielded drastic improvements in the quality, quantity and diversity of meteor data, while even more ambitious instruments are about to become operational. This empowers meteor science to boost its experimental and theoretical horizons and seek more advanced science goals. We review some of the developments that push meteor science into the big data era that requires more complex methodological approaches through interdisciplinary collaborations with other branches of physics and computer science. We argue that meteor science should become an integral part of large surveys in astronomy, aeronomy and space physics, and tackle the complexity of micro-physics of meteor plasma and its interaction with the atmosphere.

  17. Extreme events in gross primary production: a characterization across continents

    Directory of Open Access Journals (Sweden)

    J. Zscheischler

    2014-01-01

    Full Text Available Climate extremes can affect the functioning of terrestrial ecosystems, for instance via a reduction of the photosynthetic capacity or alterations of respiratory processes. Yet the dominant regional and seasonal effects of hydrometeorological extremes are still not well documented. Here we quantify and characterize the role of large spatiotemporal extreme events in gross primary production (GPP) as triggers of continental anomalies. We also investigate seasonal dynamics of extreme impacts on continental GPP anomalies. We find that the 50 largest positive (increase in uptake) and negative (decrease in uptake) extremes on each continent can explain most of the continental variation in GPP, which is in line with previous results obtained at the global scale. We show that negative extremes are larger than positive ones and demonstrate that this asymmetry is particularly strong in South America and Europe. Most extremes in GPP start in early summer. Our analysis indicates that the overall impacts and the spatial extents of GPP extremes are power law distributed with exponents that vary little across continents. Moreover, we show that on all continents and for all data sets the spatial extents play a more important role than durations or maximal GPP anomaly when it comes to the overall impact of GPP extremes. An analysis of possible causes implies that across continents most extremes in GPP can best be explained by water scarcity rather than by extreme temperatures. However, for Europe, South America and Oceania we also identify fire as an important driver. Our findings are consistent with remote sensing products. An independent validation against a literature survey on specific extreme events supports our results to a large extent.
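
    Because the abstract reports power-law distributed impacts and spatial extents, a brief reminder of the standard maximum-likelihood estimator for a continuous power-law exponent may be helpful; the event sizes below are synthetic placeholders, not GPP extremes.

        import numpy as np

        def power_law_exponent(sizes, x_min):
            """MLE of alpha for a continuous power law p(x) ~ x**(-alpha), x >= x_min."""
            x = np.asarray(sizes, dtype=float)
            x = x[x >= x_min]
            return 1.0 + x.size / np.sum(np.log(x / x_min))

        # Synthetic event sizes drawn from a power law with alpha = 1.8 (inverse-CDF sampling)
        rng = np.random.default_rng(1)
        alpha_true, x_min = 1.8, 1.0
        sizes = x_min * (1.0 - rng.random(5000)) ** (-1.0 / (alpha_true - 1.0))
        print(round(power_law_exponent(sizes, x_min), 2))   # should land close to 1.8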

  18. Likelihood estimators for multivariate extremes

    KAUST Repository

    Huser, Raphaël

    2015-11-17

    The main approach to inference for multivariate extremes consists in approximating the joint upper tail of the observations by a parametric family arising in the limit for extreme events. The latter may be expressed in terms of componentwise maxima, high threshold exceedances or point processes, yielding different but related asymptotic characterizations and estimators. The present paper clarifies the connections between the main likelihood estimators, and assesses their practical performance. We investigate their ability to estimate the extremal dependence structure and to predict future extremes, using exact calculations and simulation, in the case of the logistic model.
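
    To make the componentwise-maxima route concrete, here is a hedged sketch (not the paper's code) of maximum-likelihood estimation of the dependence parameter alpha of the bivariate logistic model on unit-Frechet margins. The synthetic maxima are generated from a simple shared-factor max construction purely so the example runs end to end.

        import numpy as np
        from scipy.optimize import minimize_scalar

        def nll_bilogistic(alpha, z1, z2):
            """Negative log-likelihood of the bivariate logistic max-stable model.

            CDF: G(z1, z2) = exp(-V), V = (z1**(-r) + z2**(-r))**(1/r), r = 1/alpha,
            on unit-Frechet margins; the density is G * (V1*V2 - V12).
            """
            r = 1.0 / alpha
            s = z1 ** (-r) + z2 ** (-r)
            v = s ** (1.0 / r)
            v1 = -s ** (1.0 / r - 1.0) * z1 ** (-r - 1.0)                      # dV/dz1
            v2 = -s ** (1.0 / r - 1.0) * z2 ** (-r - 1.0)                      # dV/dz2
            v12 = (1.0 - r) * s ** (1.0 / r - 2.0) * (z1 * z2) ** (-r - 1.0)   # d2V/dz1dz2
            return -np.sum(-v + np.log(v1 * v2 - v12))

        # Synthetic unit-Frechet componentwise maxima with shared-factor dependence
        rng = np.random.default_rng(2)
        n, a = 2000, 0.6
        z0, za, zb = -1.0 / np.log(rng.random((3, n)))         # three iid unit-Frechet samples
        z1 = np.maximum(a * z0, (1 - a) * za)                  # unit-Frechet margin, dependent via z0
        z2 = np.maximum(a * z0, (1 - a) * zb)

        fit = minimize_scalar(nll_bilogistic, bounds=(0.05, 0.999), args=(z1, z2), method="bounded")
        print("fitted alpha:", round(fit.x, 3))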

  19. Variability of Iberian upwelling implied by ERA-40 and ERA-Interim reanalyses

    Directory of Open Access Journals (Sweden)

    José M. R. Alves

    2013-05-01

    Full Text Available The Regional Ocean Modeling System ocean model is used to simulate the decadal evolution of the regional waters in offshore Iberia in response to atmospheric fields given by ECMWF ERA-40 (1961–2001) and ERA-Interim (1989–2008) reanalyses. The simulated sea surface temperature (SST) fields are verified against satellite AVHRR SST, and they are analysed to characterise the variability and trends of coastal upwelling in the region. Opposing trends in upwelling frequency are found at the northern limit, where upwelling has been decreasing in recent decades, and at its southern edge, where there is some evidence of increased upwelling. These results confirm previous observational studies and, more importantly, indicate that observed SST trends are not only due to changes in radiative or atmospheric heat fluxes alone but also due to changes in upwelling dynamics, suggesting that such a process may be relevant in climate change scenarios.

  20. Uncertainties in extreme precipitation under climate change conditions

    DEFF Research Database (Denmark)

    Sunyer Pinya, Maria Antonia

    ... by extreme precipitation pose a threat to human life and cause high economic losses for society. Thus, strategies to adapt to changes in extreme precipitation are currently being developed and established worldwide. Information on the expected changes in extreme precipitation is required for the development of adaptation strategies, but these changes are subject to uncertainties. The focus of this PhD thesis is the quantification of uncertainties in changes in extreme precipitation. It addresses two of the main sources of uncertainty in climate change impact studies: regional climate models (RCMs) and statistical downscaling methods (SDMs). RCMs provide information on climate change at the regional scale. SDMs are used to bias-correct and downscale the outputs of the RCMs to the local scale of interest in adaptation strategies. In the first part of the study, a multi-model ensemble of RCMs from the European ENSEMBLES...
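
    As a minimal illustration of the bias-correction step performed by statistical downscaling methods, the sketch below applies empirical quantile mapping to daily precipitation: model values are mapped onto the observed distribution via their quantiles. All series are synthetic placeholders, and real applications handle wet-day frequency and tail extrapolation more carefully.

        import numpy as np

        def quantile_map(model_hist, obs_hist, model_future):
            """Empirical quantile mapping: transfer model values onto the observed scale."""
            q = np.linspace(0.0, 1.0, 1001)
            model_q = np.quantile(model_hist, q)
            obs_q = np.quantile(obs_hist, q)
            # Find each future value's quantile in the historical model distribution,
            # then read off the observed value at that same quantile.
            future_q = np.interp(model_future, model_q, q)
            return np.interp(future_q, q, obs_q)

        # Hypothetical daily precipitation (mm): the model drizzles too often and too weakly
        rng = np.random.default_rng(3)
        obs = rng.gamma(0.5, 10.0, 5000)
        model_hist = rng.gamma(0.7, 6.0, 5000)
        model_future = rng.gamma(0.7, 7.0, 5000)
        corrected = quantile_map(model_hist, obs, model_future)
        print(round(np.quantile(corrected, 0.99), 1), round(np.quantile(obs, 0.99), 1))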

  1. PROFIL STRATEGI PEMASARAN INTERNASIONAL DI ERA GLOBAL

    Directory of Open Access Journals (Sweden)

    Moh Munir

    2006-01-01

    Full Text Available The year 2006 was entered as an era of recovery of the structure of economic life after years of economic crisis and rising fuel (BBM) prices. In reality, however, the assumed economic growth of 6% still appears too ambitious. State revenue from the non-tax sector remains small, and we are better described as an importing country than as an exporter. We act more as a target market than as a producer or supplier of goods. This reality cannot be denied; it is a consequence of the global market. We have been too complacent and too satisfied with the products we already produce, without considering that one day the same or even better goods will appear at a lower price. It is this condition that forces us to review our international marketing strategy and to ask whether it has adjusted to and answers the global challenge. Many policies apparently still need to be adopted, above all on how to achieve high competitiveness through flagship products in each autonomous region. One concrete approach is to work through three kinds of strategic resources. First, we must put in order the tangible resources, which include employees or human resources, customers, capacity, funds and products. Second are the intangible resources, which include employee skills, customer quality, production cost efficiency and product quality. Third are the very intangible resources, which include employee morale, reputation in the eyes of customers and reputation in the eyes of investors. Besides this approach, the “Satu Kabupaten Satu Kompetensi Inti” (One Regency, One Core Competence, abbreviated Saka-Sakti) model can also be used; this model is in line with regional autonomy policy, where saka means pillar or post and sakti means potency, strength or knowledge. It is thus the pillar of potency, or pillar of strength, relied upon by every regency in Indonesia to build core competences so that it can compete in the global market.

  2. Chirurgie in die Grieks-Romeinse era

    Directory of Open Access Journals (Sweden)

    François P. Retief

    2006-09-01

    Full Text Available In the Greco-Roman era medical treatment characteristically consisted of three elements, namely regimen (diet and healthy lifestyle), medicines and surgery – the last applied only if regimen and medicines had been unsuccessful. Evidence of primitive surgery dates back to the Bronze Age, and Homer's epics contain considerable mention of the surgical management of war wounds, with intervention by the gods. With the advent of empirical medicine in the 5th century BC, surgery figured prominently in the Hippocratic Corpus, with significant contributions especially in the orthopaedic field and in head injuries. The expansion of anatomical and physiological knowledge, based on dissection of human cadavers in Alexandria from the late 4th century BC, gave surgery a boost. By the Roman era, from the 2nd century BC, surgical techniques (and instruments) had improved considerably, but surgery was still practised predominantly by Greek physicians. Physicians were still expected to master all three of the above-mentioned therapeutic modalities, but surgery gained greater prestige and there was increasing specialisation in subdivisions of surgery such as ophthalmology, gynaecology and obstetrics, bladder ailments, and mouth and throat surgery. Military medicine was an important activity in the Roman Empire and advanced trauma surgery in particular. Entering the abdominal and thoracic cavities was no longer necessarily fatal, and veterinary medicine came into being. The first significant surgical textbook after the Hippocratic Corpus was compiled by Celsus in the 1st century AD. From the 3rd century onwards the surgical profession made little progress; the professional language gradually changed from Greek to Latin, and expertise was later transmitted to the Middle Ages and beyond, mainly by Islamic physicians.

  3. Objective criteria accurately predict amputation following lower extremity trauma.

    Science.gov (United States)

    Johansen, K; Daines, M; Howey, T; Helfet, D; Hansen, S T

    1990-05-01

    MESS (Mangled Extremity Severity Score) is a simple rating scale for lower extremity trauma, based on skeletal/soft-tissue damage, limb ischemia, shock, and age. Retrospective analysis of severe lower extremity injuries in 25 trauma victims demonstrated a significant difference between MESS values for 17 limbs ultimately salvaged (mean, 4.88 ± 0.27) and nine requiring amputation (mean, 9.11 ± 0.51) (p < 0.01). A prospective trial of MESS in lower extremity injuries managed at two trauma centers again demonstrated a significant difference between MESS values of 14 salvaged (mean, 4.00 ± 0.28) and 12 doomed (mean, 8.83 ± 0.53) limbs (p < 0.01). In both the retrospective survey and the prospective trial, a MESS value ≥ 7 predicted amputation with 100% accuracy. MESS may be useful in selecting trauma victims whose irretrievably injured lower extremities warrant primary amputation.
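
    For readers who want to see how such a score is tallied, below is a small, hypothetical MESS-style calculator. The component categories follow the abstract (skeletal/soft-tissue energy, limb ischemia, shock, age) and the amputation threshold of 7, but the individual point values are recalled from the published scale and should be verified against the original table before any use.

        # Illustrative MESS-style score; point values are assumptions to be verified.
        SKELETAL = {"low_energy": 1, "medium_energy": 2, "high_energy": 3, "very_high_energy": 4}
        ISCHEMIA = {"reduced_pulse": 1, "pulseless_paresthetic": 2, "cool_paralyzed_insensate": 3}
        SHOCK = {"normotensive": 0, "transient_hypotension": 1, "persistent_hypotension": 2}

        def age_points(years):
            return 0 if years < 30 else (1 if years <= 50 else 2)

        def mess_score(skeletal, ischemia, ischemia_over_6h, shock, age_years):
            ischemia_pts = ISCHEMIA[ischemia] * (2 if ischemia_over_6h else 1)
            return SKELETAL[skeletal] + ischemia_pts + SHOCK[shock] + age_points(age_years)

        score = mess_score("high_energy", "pulseless_paresthetic", True, "transient_hypotension", 45)
        print(score, "-> amputation predicted" if score >= 7 else "-> salvage likely")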

  4. Diabetic foot in the era of telemedicine

    NARCIS (Netherlands)

    Hazenberg, C.E.V.B.

    2013-01-01

    Diabetes mellitus is a common cause of lower extremity pathology such as ulceration, infection and amputation, causing a major socioeconomic burden. In diabetes, selfcare is an essential element of disease management and prevention of ulceration. Diabetes related complications, like neuropathy, reti

  5. A Decade-Long European-Scale Convection-Resolving Climate Simulation on GPUs

    Science.gov (United States)

    Leutwyler, D.; Fuhrer, O.; Ban, N.; Lapillonne, X.; Lüthi, D.; Schar, C.

    2016-12-01

    Convection-resolving models have proven to be very useful tools in numerical weather prediction and in climate research. However, due to their extremely demanding computational requirements, they have so far been limited to short simulations and/or small computational domains. Innovations in the supercomputing domain have led to new supercomputer designs that involve conventional multi-core CPUs and accelerators such as graphics processing units (GPUs). One of the first atmospheric models that has been fully ported to GPUs is the Consortium for Small-Scale Modeling weather and climate model COSMO. This new version allows us to expand the simulation domain to areas spanning continents and to extend the simulated period to up to one decade. We present results from a decade-long, convection-resolving climate simulation over Europe using the GPU-enabled COSMO version on a computational domain with 1536x1536x60 gridpoints. The simulation is driven by the ERA-Interim reanalysis. The results illustrate how the approach allows for the representation of interactions between synoptic-scale and meso-scale atmospheric circulations at scales ranging from 1000 to 10 km. We discuss some of the advantages and prospects of using GPUs, and focus on the performance of the convection-resolving modeling approach on the European scale. Specifically, we investigate the organization of convective clouds and validate hourly rainfall distributions with various high-resolution data sets.

  6. Evaluation of dynamically downscaled extreme temperature using a spatially-aggregated generalized extreme value (GEV) model

    Science.gov (United States)

    Wang, Jiali; Han, Yuefeng; Stein, Michael L.; Kotamarthi, Veerabhadra R.; Huang, Whitney K.

    2016-11-01

    The weather research and forecast (WRF) model downscaling skill in extreme maximum daily temperature is evaluated by using the generalized extreme value (GEV) distribution. While the GEV distribution has been used extensively in climatology and meteorology for estimating probabilities of extreme events, accurately estimating GEV parameters based on data from a single pixel can be difficult, even with fairly long data records. This work proposes a simple method assuming that the shape parameter, the most difficult of the three parameters to estimate, does not vary over a relatively large region. This approach is applied to evaluate 31-year WRF-downscaled extreme maximum temperature through comparison with North American regional reanalysis (NARR) data. Uncertainty in GEV parameter estimates and the statistical significance in the differences of estimates between WRF and NARR are accounted for by conducting a novel bootstrap procedure that makes no assumption of temporal or spatial independence within a year, which is especially important for climate data. Despite certain biases over parts of the United States, overall, WRF shows good agreement with NARR in the spatial pattern and magnitudes of GEV parameter estimates. Both WRF and NARR show a significant increase in extreme maximum temperature over the southern Great Plains and southeastern United States in January and over the western United States in July. The GEV model shows clear benefits from the regionally constant shape parameter assumption, for example, leading to estimates of the location and scale parameters of the model that show coherent spatial patterns.
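
    A hedged sketch of the paper's central idea, a GEV fit in which the shape parameter is shared across locations while location and scale vary per site, is given below. The annual maxima are synthetic, and the code uses scipy's genextreme convention, whose shape argument c equals minus the usual GEV shape xi.

        import numpy as np
        from scipy.stats import genextreme
        from scipy.optimize import minimize

        def neg_loglik(params, data):
            """Joint NLL over m sites: one shared shape c, per-site location and log-scale."""
            m = data.shape[0]
            c = params[0]
            locs = params[1:1 + m]
            scales = np.exp(params[1 + m:1 + 2 * m])
            ll = 0.0
            for j in range(m):
                ll += genextreme.logpdf(data[j], c, loc=locs[j], scale=scales[j]).sum()
            return -ll

        # Synthetic annual maxima for 5 sites that share one shape parameter
        rng = np.random.default_rng(4)
        m, years, c_true = 5, 40, -0.1
        data = np.array([genextreme.rvs(c_true, loc=30.0 + 2.0 * j, scale=3.0, size=years,
                                        random_state=rng) for j in range(m)])

        x0 = np.concatenate(([0.0], data.mean(axis=1), np.log(data.std(axis=1))))
        fit = minimize(neg_loglik, x0, args=(data,), method="Nelder-Mead",
                       options={"maxiter": 20000, "xatol": 1e-6, "fatol": 1e-6})
        print("shared shape estimate (scipy c = -xi):", round(fit.x[0], 3))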

  7. Global Transmission Dynamics of Measles in the Measles Elimination Era.

    Science.gov (United States)

    Furuse, Yuki; Oshitani, Hitoshi

    2017-04-16

    Although there have been many epidemiological reports of the inter-country transmission of measles, systematic analysis of the global transmission dynamics of the measles virus (MV) is limited. In this study, we applied phylogeographic analysis to characterize the global transmission dynamics of the MV using large-scale genetic sequence data (obtained for 7456 sequences) from 115 countries between 1954 and 2015. These analyses reveal the spatial and temporal characteristics of global transmission of the virus, especially in Australia, China, India, Japan, the UK, and the USA in the period since 1990. The transmission is frequently observed, not only within the same region but also among distant and frequently visited areas. Frequencies of export from measles-endemic countries, such as China, India, and Japan are high but decreasing, while the frequencies from countries where measles is no longer endemic, such as Australia, the UK, and the USA, are low but slightly increasing. The world is heading toward measles eradication, but the disease is still transmitted regionally and globally. Our analysis reveals that countries wherein measles is endemic and those having eliminated the disease (apart from occasional outbreaks) both remain a source of global transmission in this measles elimination era. It is therefore crucial to maintain vigilance in efforts to monitor and eradicate measles globally.

  8. Plant microbe interactions in post genomic era: perspectives and applications

    Directory of Open Access Journals (Sweden)

    Jahangir Imam

    2016-09-01

    Full Text Available Deciphering plant-microbe interactions is a promising way to understand the beneficial and pathogenic effects of microbes and to improve crops. Advances in sequencing technologies and the various ‘omics’ tools have impressively accelerated research in the biological sciences in this area. The recent and ongoing developments provide a unique approach to describing these intricate interactions and to testing hypotheses. In the present review, we discuss the role of plant-pathogen interactions in crop improvement. Plant innate immunity has always been an important research topic and has yielded interesting findings, such as the adaptation of unique immune mechanisms of plants against pathogens. The development of new techniques in the post-genomic era has greatly enhanced our understanding of the regulation of plant defense mechanisms against pathogens. The review also provides an overview of beneficial plant-microbe interactions, with special reference to Agrobacterium tumefaciens-plant interactions, where plant-derived signal molecules and plant immune responses are important for pathogenicity and transformation efficiency. The construction of various genome-scale metabolic models of microorganisms and plants has provided a better understanding of the metabolic interactions activated during these interactions. The review also surveys the emerging repertoire of phytopathogens and its impact on plant disease resistance. An outline of the different aspects of plant-pathogen interactions is presented to bridge the gap between plant microbial ecology and plant immune responses.

  9. Plant Microbe Interactions in Post Genomic Era: Perspectives and Applications.

    Science.gov (United States)

    Imam, Jahangir; Singh, Puneet K; Shukla, Pratyoosh

    2016-01-01

    Deciphering plant-microbe interactions is a promising way to understand the beneficial and pathogenic effects of microbes and to improve crops. Advances in sequencing technologies and the various 'omics' tools have impressively accelerated research in the biological sciences in this area. The recent and ongoing developments provide a unique approach to describing these intricate interactions and to testing hypotheses. In the present review, we discuss the role of plant-pathogen interactions in crop improvement. Plant innate immunity has always been an important research topic and has yielded interesting findings, such as the adaptation of unique immune mechanisms of plants against pathogens. The development of new techniques in the post-genomic era has greatly enhanced our understanding of the regulation of plant defense mechanisms against pathogens. The review also provides an overview of beneficial plant-microbe interactions, with special reference to Agrobacterium tumefaciens-plant interactions, where plant-derived signal molecules and plant immune responses are important for pathogenicity and transformation efficiency. The construction of various genome-scale metabolic models of microorganisms and plants has provided a better understanding of the metabolic interactions activated during these interactions. The review also surveys the emerging repertoire of phytopathogens and its impact on plant disease resistance. An outline of the different aspects of plant-pathogen interactions is presented to bridge the gap between plant microbial ecology and plant immune responses.

  10. The NASA Energy and Water Cycle Extreme (NEWSE) Integration Project

    Science.gov (United States)

    House, P. R.; Lapenta, W.; Schiffer, R.

    2008-01-01

    Skillful predictions of water and energy cycle extremes (flood and drought) are elusive. To better understand the mechanisms responsible for water and energy extremes, and to make decisive progress in predicting these extremes, the collaborative NASA Energy and Water cycle Extremes (NEWSE) Integration Project is studying these extremes in the U.S. Southern Great Plains (SGP) during 2006-2007, including their relationships with continental and global scale processes, and assessing their predictability on multiple space and time scales. It is our hypothesis that an integrative analysis of observed extremes which reflects the current understanding of how SST and soil moisture variability influence atmospheric heating and the forcing of planetary waves, incorporating recently available global and regional hydro-meteorological datasets (i.e., precipitation, water vapor, clouds, etc.) in conjunction with advances in data assimilation, can lead to new insights into the factors that lead to persistent drought and flooding. We will show initial results of this project, whose goals are to provide improved definition, attribution and prediction on sub-seasonal to interannual time scales; improved understanding of the mechanisms of decadal drought and its predictability, including the impacts of SST variability and deep soil moisture variability; improved monitoring and attribution, with transition to applications; and a bridging of the gap between hydrological forecasts and stakeholders (utilization of probabilistic forecasts, education, forecast interpretation for different sectors, assessment of uncertainties for different sectors, etc.).

  11. 亚洲季风区过去700年来夏季极端干/湿事件多尺度变化特征分析%Multi-scale Analysis of the Extreme Dry/Wet Events in Asian Monsoon Region in Summer During Last 7 Centuries

    Institute of Scientific and Technical Information of China (English)

    杨萍; 侯威; 颜鹏程

    2016-01-01

    Monsoon failures, mega droughts, and extreme flooding events have repeatedly affected the agrarian peoples of Asia over the past millennium. A scarcity of long-term instrumental climate data for many remote regions of Monsoon Asia impedes progress toward resolving these issues. To better elucidate the spatial complexity of the Asian monsoon, a large-scale, spatially explicit, long-term data set is needed. This context is provided here by the Monsoon Asia Drought Atlas (MADA). The MADA provides a seasonal- to centennial-scale window into the Asian monsoon’s repeated tendency for extended dry and wet extremes with distinct spatial flavors of response. Recently, the ensemble empirical mode decomposition (EEMD) method has been developed for non-linear and non-stationary signal analysis. The EEMD method has been adopted in several fields such as de-noising, ocean surface measurement, metrology, image processing and so on. The method can work on natural (non-linear and non-stationary) signals as well as reduce speckle noise. The EEMD method acts like a filter bank: the signal is decomposed into several intrinsic mode functions (IMFs) whose frequencies are arranged in decreasing order (high to low) after the EEMD processing. The scaling behavior of the EEMD method is similar to the wavelet transform, but the signal resolution in different frequency bands is not reduced by down-sampling. In this paper, we propose the EEMD method to extract the multi-scale characteristics of the variability of extreme dry/wet events in the Asian monsoon area. Using the Palmer Drought Severity Index data of the MADA from 504 stations in the Asian monsoon area in summer from 1300-2005 and the EEMD method, we obtain the series of the number of grids that are extremely or severely wet or dry in this region at an interval of 5 years, and analyze the variations of these series. Based on the nonlinear/nonstationary and multi-scale features of the climate system, applying EEMD to the series of the number of grids which is
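
    The sketch below shows what such a multi-scale decomposition might look like in code. It assumes the third-party PyEMD package (distributed on PyPI as EMD-signal) with its EEMD class and eemd() method, and the drought-index series is a synthetic placeholder rather than MADA data.

        import numpy as np
        from PyEMD import EEMD     # assumed third-party package, installed as "EMD-signal"

        # Synthetic stand-in for a drought-index series: slow trend + decadal cycle + noise
        t = np.arange(0.0, 700.0)                        # e.g., 700 yearly values
        rng = np.random.default_rng(5)
        signal = 0.002 * t + np.sin(2.0 * np.pi * t / 60.0) + 0.5 * rng.standard_normal(t.size)

        eemd = EEMD()                                    # ensemble size and noise amplitude are configurable
        imfs = eemd.eemd(signal, t)                      # intrinsic mode functions, high to low frequency

        for k, imf in enumerate(imfs, start=1):
            print(f"IMF {k}: variance fraction = {imf.var() / signal.var():.2f}")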

  12. Gender, Education, Extremism and Security

    Science.gov (United States)

    Davies, Lynn

    2008-01-01

    This paper examines the complex relationships between gender, education, extremism and security. After defining extremism and fundamentalism, it looks first at the relationship of gender to violence generally, before looking specifically at how this plays out in more extremist violence and terrorism. Religious fundamentalism is also shown to have…

  13. Feeling the extreme : an exploratory study of experienced emotions during extreme sport

    OpenAIRE

    Hetland, Audun

    2009-01-01

    In the current study 13 BASE-jumpers and 18 skydivers reported their emotions immediately after a jump and after a 24 hours delay, using verbal (Likert-like scales) and visual (Feelometer) emotional report. Heart rate measures were also collected during, and 24 hours after the jump. The Feelometer is a newly developed tool enabling the participants to give a moment-to-moment report from a particular event or episode. Given the complexity and dynamics of extreme sport experiences, the Feelomet...

  14. Potentially Extreme Population Displacement and Concentration in the Tropics Under Non-Extreme Warming

    OpenAIRE

    Hsiang, Solomon M.; Sobel, Adam H.

    2016-01-01

    Evidence increasingly suggests that as climate warms, some plant, animal, and human populations may move to preserve their environmental temperature. The distances they must travel to do this depend on how much cooler nearby surface temperatures are. Because large-scale atmospheric dynamics constrain surface temperatures to be nearly uniform near the equator, these displacements can grow to extreme distances in the tropics, even under relatively mild warming scenarios. Here we show that in ...

  15. Feeling the extreme : an exploratory study of experienced emotions during extreme sport

    OpenAIRE

    Hetland, Audun

    2009-01-01

    In the current study 13 BASE-jumpers and 18 skydivers reported their emotions immediately after a jump and after a 24 hours delay, using verbal (Likert-like scales) and visual (Feelometer) emotional report. Heart rate measures were also collected during, and 24 hours after the jump. The Feelometer is a newly developed tool enabling the participants to give a moment-to-moment report from a particular event or episode. Given the complexity and dynamics of extreme sport experiences, the Feelomet...

  16. Projected changes to surface wind characteristics and extremes over North America in CRCM5

    Science.gov (United States)

    Jeong, Dae Il; Sushama, Laxmi

    2017-04-01

    Changes in the tendency of wind speed and direction have significant implications for long-term water cycle, air pollution, arid and semiarid environments, fire activity, and wind energy production. Furthermore, changes in wind extremes have direct impacts on buildings, infrastructures, agriculture, power lines, and trees. This study evaluates projected changes to wind speed characteristics (i.e., seasonal and annual mean, seasonal and diurnal cycles, directional distribution, and extreme events) for the future 2071-2100 period, with respect to the current 1981-2010 period over North America, using four different simulations from the fifth-generation Canadian Regional Climate Model (CRCM5) with two driving GCMs under RCP (Representative Concentration Pathways) 4.5 and 8.5 scenarios. The CRCM5 simulates the climatology of mean sea level pressure gradient and associated wind direction over North America well when compared to ERA-Interim reanalysis dataset. The CRCM5 also reproduces properly the spatial distributions of observed seasonal and annual mean wind speeds obtained from 611 meteorological stations across North America. The CRCM5 simulations generally suggest an increase in future mean wind speed for northern and eastern parts of Canada, due to a decrease of future mean sea level pressure and more intense low pressure air circulation systems already situated in those regions such as Aleutian and Icelandic Lows. Projected changes to annual maximum wind speed show more spatial variability compared to seasonal and annual mean wind speed as extreme wind speed is influenced more by regional-scale features associated with instantaneous surface temperature and air pressure gradients. The CRCM5 simulations suggest some increases in the future 50-year return levels of wind speed, mainly due to changes in the inter-annual variability of annual maximum wind speed. However, the projected changes vary in spatial pattern with the driving GCM fields and emission scenarios
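
    Since the abstract reports 50-year return levels of wind speed, a short reminder of how a T-year return level follows from fitted GEV parameters may be useful; the parameter values below are hypothetical, and the formula is the standard annual-maxima parameterization with shape xi.

        import numpy as np

        def gev_return_level(T, mu, sigma, xi):
            """T-year return level of a GEV(mu, sigma, xi) fitted to annual maxima."""
            y = -np.log(1.0 - 1.0 / T)          # reduced variate for exceedance probability 1/T
            if abs(xi) < 1e-8:                  # Gumbel limit
                return mu - sigma * np.log(y)
            return mu + (sigma / xi) * (y ** (-xi) - 1.0)

        # Hypothetical annual-maximum wind fit: mu = 22 m/s, sigma = 3 m/s, xi = -0.05
        print(round(gev_return_level(50, 22.0, 3.0, -0.05), 1), "m/s")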

  17. Modeling extreme risks in ecology.

    Science.gov (United States)

    Burgman, Mark; Franklin, James; Hayes, Keith R; Hosack, Geoffrey R; Peters, Gareth W; Sisson, Scott A

    2012-11-01

    Extreme risks in ecology are typified by circumstances in which data are sporadic or unavailable, understanding is poor, and decisions are urgently needed. Expert judgments are pervasive and disagreements among experts are commonplace. We outline approaches to evaluating extreme risks in ecology that rely on stochastic simulation, with a particular focus on methods to evaluate the likelihood of extinction and quasi-extinction of threatened species, and the likelihood of establishment and spread of invasive pests. We evaluate the importance of assumptions in these assessments and the potential of some new approaches to account for these uncertainties, including hierarchical estimation procedures and generalized extreme value distributions. We conclude by examining the treatment of consequences in extreme risk analysis in ecology and how expert judgment may better be harnessed to evaluate extreme risks.

  18. Modern Era Retrospective-Analysis for Research and Applications

    Science.gov (United States)

    Bosilovich, Michael G.; Robertson, Franklin R.; Chen, Junye

    2009-01-01

    The Modern Era Retrospective-analysis for Research and Applications (MERRA) reanalysis has produced several years of data, on the way to completing the 1979-present modern satellite era. Here, we present a preliminary evaluation of the years currently available, including comparisons with the existing long reanalyses (ERA-40, JRA-25 and NCEP I and II) as well as with global data sets for the water and energy cycle. Time series show that the MERRA budgets can change with some of the variations in observing systems. We will present all terms of the budgets in MERRA, including the time rates of change and analysis increments (the tendency due to the analysis of observations).

  19. ALPENDRE Y ERA [Material gráfico

    OpenAIRE

    Fundación para la Etnografía y el Desarrollo de la Artesanía Canaria

    1990-01-01

    Age: 19th century. Anthropic: construction of a road that crosses the threshing floor (era). Rubbish and metal sheets are found in the shelter (alpendre). Natural: plant colonisation. Land classification: RUSTIC. An ensemble of threshing floor (era) and shelter (alpendre). The threshing floor has a circular plan, of earth and an arrangement of stones, and is located on a ridge. The shelter is excavated into the rock, in the form of four compartments of different sizes in which the animals were tethered to a series of horizontal poles ...

  20. Response of wheat yield in Spain to large-scale patterns

    Science.gov (United States)

    Hernandez-Barrera, Sara; Rodriguez-Puebla, Concepcion

    2016-04-01

    Crops are vulnerable to extreme climate conditions such as drought, heat stress and frost risk. In a previous study we quantified the influence of these climate conditions on winter wheat in Spain (Hernandez-Barrera et al. 2015). The climate extremes respond to large-scale atmospheric and oceanic patterns. Therefore, a question emerges in our investigation: how do large-scale patterns affect wheat yield? Obtaining and understanding these relationships requires different approaches. In this study, we first obtained the leading mode of observed wheat yield variability to characterize the common variability over different provinces in Spain. Then, the wheat variability is related to different modes of mean sea level pressure, the jet stream and sea surface temperature by using Partial Least-Squares regression, which captures the relevant climate drivers accounting for variations in wheat yield from sowing to harvesting. We used the ERA-Interim reanalysis data and the Extended Reconstructed Sea Surface Temperature (ERSST v3b) data set. The derived model provides insight into the teleconnections between wheat yield and atmospheric and oceanic circulations, and is used to project the wheat yield trend under global warming using outputs of twelve climate models from the Coupled Model Intercomparison Project phase 5 (CMIP5). Hernandez-Barrera S., C. Rodríguez-Puebla and A.J. Challinor. Effects of diurnal temperature range and drought on wheat yield in Spain. Theoretical and Applied Climatology (submitted)
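
    A minimal sketch of the Partial Least-Squares step described above, using scikit-learn's PLSRegression to relate a yield series to a set of large-scale predictors; all arrays are random placeholders standing in for ERA-Interim/ERSST-derived fields and observed yields.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(6)
        n_years, n_predictors = 35, 20                   # e.g., gridded SLP/SST/jet indices
        X = rng.standard_normal((n_years, n_predictors))
        true_w = np.zeros(n_predictors)
        true_w[:3] = [0.8, -0.5, 0.3]
        y = X @ true_w + 0.3 * rng.standard_normal(n_years)   # synthetic wheat-yield anomaly

        pls = PLSRegression(n_components=2)
        pls.fit(X, y)
        print("in-sample R^2:", round(pls.score(X, y), 2))
        print("leading PLS weights (first 5 predictors):", np.round(pls.x_weights_[:5, 0], 2))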

  1. Going Extreme For Small Solutions To Big Environmental Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Bagwell, Christopher E.

    2011-03-31

    This chapter is devoted to the scale, scope, and specific issues confronting the cleanup and long-term disposal of the U.S. nuclear legacy generated during WWII and the Cold War Era. The research reported is aimed at complex microbiological interactions with legacy waste materials generated by past nuclear production activities in the United States. The intended purpose of this research is to identify cost effective solutions to the specific problems (stability) and environmental challenges (fate, transport, exposure) in managing and detoxifying persistent contaminant species. Specifically addressed are high level waste microbiology and bacteria inhabiting plutonium laden soils in the unsaturated subsurface.

  2. Spatial dependence of extreme rainfall

    Science.gov (United States)

    Radi, Noor Fadhilah Ahmad; Zakaria, Roslinazairimah; Satari, Siti Zanariah; Azman, Muhammad Az-zuhri

    2017-05-01

    This study aims to model the spatial extreme daily rainfall process using the max-stable model. The max-stable model is used to capture the dependence structure of spatial properties of extreme rainfall. Three models from max-stable are considered namely Smith, Schlather and Brown-Resnick models. The methods are applied on 12 selected rainfall stations in Kelantan, Malaysia. Most of the extreme rainfall data occur during wet season from October to December of 1971 to 2012. This period is chosen to assure the available data is enough to satisfy the assumption of stationarity. The dependence parameters including the range and smoothness, are estimated using composite likelihood approach. Then, the bootstrap approach is applied to generate synthetic extreme rainfall data for all models using the estimated dependence parameters. The goodness of fit between the observed extreme rainfall and the synthetic data is assessed using the composite likelihood information criterion (CLIC). Results show that Schlather model is the best followed by Brown-Resnick and Smith models based on the smallest CLIC's value. Thus, the max-stable model is suitable to be used to model extreme rainfall in Kelantan. The study on spatial dependence in extreme rainfall modelling is important to reduce the uncertainties of the point estimates for the tail index. If the spatial dependency is estimated individually, the uncertainties will be large. Furthermore, in the case of joint return level is of interest, taking into accounts the spatial dependence properties will improve the estimation process.
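
    To give a feel for how these max-stable families differ in spatial dependence, the sketch below evaluates the pairwise extremal coefficient theta(h) (1 = complete dependence, 2 = independence) for the Smith and Schlather models under assumed, purely illustrative parameters; the Brown-Resnick model has an analogous Husler-Reiss-type expression that is omitted here to avoid variogram-convention ambiguities.

        import numpy as np
        from scipy.stats import norm

        def theta_smith(h, tau):
            """Smith model with isotropic covariance tau**2 * I: theta(h) = 2*Phi(|h| / (2*tau))."""
            return 2.0 * norm.cdf(np.abs(h) / (2.0 * tau))

        def theta_schlather(h, scale):
            """Schlather model with exponential correlation rho(h) = exp(-|h|/scale)."""
            rho = np.exp(-np.abs(h) / scale)
            return 1.0 + np.sqrt((1.0 - rho) / 2.0)       # note: bounded above by about 1.707

        h = np.array([0.0, 10.0, 50.0, 200.0])             # hypothetical station separations (km)
        print("Smith    :", np.round(theta_smith(h, tau=30.0), 3))
        print("Schlather:", np.round(theta_schlather(h, scale=80.0), 3))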

  3. Problematika Pendidikan Islam Sebagai Sub Sistem Pendidikan Nasional di Era Global

    Directory of Open Access Journals (Sweden)

    Moch. Miftachul Choiri

    2011-11-01

    Full Text Available Globalization, which looks like two sides of one coin, has both positive and negative impacts. Globalization, inspired not only by capitalism but also by pragmatism, has in practice affected education in Indonesia. The role of education has become unfamiliar and far removed from society's needs. Globalization raises issues such as competence, standardization, and commercialization. To face this era, what should Islamic education do as a sub-system of national education? The Islamic school (madrasah), as a sub-system of Islamic education in Indonesia, gained extremely strong experience in facing such challenges during the last era of Dutch colonialism. The fact that the madrasah had not only autonomy but also intellectual resources proved that it could fulfill the needs of the Islamic community. These are cultural potentials which should be kept and not be abandoned for the sake of globalization's interests. Globalization as a process of cultural transformation affects the world, and especially the practice of education in Indonesia. Anyone using science and technology can easily access the global culture. The global culture, which is value-free, should be met by the transformation of values that Islamic scholars have long carried out in pesantren (Islamic boarding schools) and Islamic schools (madrasah). In other words, both pesantren and madrasah should not be entrapped in the ideology of capitalism and should serve all people, because the paradigm of Islamic education differs from that of both capitalism and pragmatism. The article elaborates how Islamic education in Indonesia, especially the madrasah, should be positioned in the global era

  4. Explosion Source Phenomena Using Soviet, Test-Era, Waveform Data

    Energy Technology Data Exchange (ETDEWEB)

    Richards, Paul G.; Rautian, Tatyana G.; Khalturin, Vitaly I.; Phillips, W. Scott

    2006-04-12

    During the nuclear testing era, the former Soviet Union carried out extensive observations of underground nuclear explosions, recording both their own shots and those of foreign nuclear states. Between 1961 and 1989, the Soviet Complex Seismological Expedition deployed seismometers at time-varying subsets of over 150 sites to record explosions at regional distances from the Semipalatinsk and Lop Nor test sites and from the shot points of peaceful nuclear explosions. This data set included recordings from broadband, multi-channel ChISS seismometers that produced a series of narrow band outputs, which could then be measured to perform spectral studies. [ChISS is the Russian abbreviation for multichannel spectral seismometer. In this instrument the signal from the seismometer is passed through a system of narrow bandpass filters and recorded on photo paper. ChISS instruments have from 8 to 16 channels in the frequency range from 100 sec to 40 Hz. We used data mostly from 7 channels, ranging from 0.08 to 5 Hz.] Quantitative, pre-digital era investigations of high-frequency source scaling relied on this type of data. To augment data sets of central Central Asia explosions, we have measured and compiled 537 ChISS coda envelopes for 124 events recorded at Talgar, Kazakhstan, at a distance of about 750 km from Semipalatinsk. Envelopes and calibration levels were measured manually from photo paper records for seven bands between 0.08 and 5 Hz. We obtained from 2 to 10 coda envelope measurements per event, depending on the event size and instrument magnification. Coda lengths varied from 250 to 1400 s. For small events, only bands between 0.6 and 2.5 Hz could be measured. Envelope levels were interpolated or extrapolated to 500 s and we have obtained the dependence of this quantity on magnitude. Coda Q was estimated and found to increase from 232 at 0.08 Hz to 1270 at 5 Hz. These relationships were used to construct an average scaling law of coda spectra for Semipalatinsk
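
    The coda-Q estimates mentioned above are conventionally derived from the single back-scattering decay model, in which a narrow-band coda envelope behaves as A(t) ~ t**-1 * exp(-pi*f*t/Qc). The hedged sketch below fits that relation to synthetic envelope values; it is not the Expedition's processing code.

        import numpy as np

        def coda_q(t, amplitude, freq):
            """Estimate coda Q from an envelope A(t) ~ t**-1 * exp(-pi*f*t/Qc).

            Linearized form: ln(A*t) = const - (pi*f/Qc) * t; fit the slope.
            """
            slope, _ = np.polyfit(t, np.log(amplitude * t), 1)
            return -np.pi * freq / slope

        # Synthetic 1 Hz coda envelope with Qc = 600, lapse times 200-800 s
        rng = np.random.default_rng(7)
        t = np.linspace(200.0, 800.0, 120)
        q_true, f = 600.0, 1.0
        amp = 50.0 / t * np.exp(-np.pi * f * t / q_true) * np.exp(0.05 * rng.standard_normal(t.size))
        print("estimated coda Q:", round(coda_q(t, amp, f)))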

  5. THE SZ EFFECT IN THE PLANCK ERA: ASTROPHYSICAL AND COSMOLOGICAL IMPACT

    Directory of Open Access Journals (Sweden)

    Sergio Colafrancesco

    2013-12-01

    Full Text Available The Sunyaev–Zel’dovich effect (SZE) is a relevant probe for cosmology and particle astrophysics. The Planck Era marks a definite step forward in the use of this probe for astrophysics and cosmology. Astrophysical applications to galaxy clusters, galaxies, radiogalaxies and large-scale structures are discussed. Cosmological relevance for the Dark Energy equation of state, modified Gravity scenarios, Dark Matter search, cosmic magnetism and other cosmological applications is also reviewed. Future directions for the study of the SZE and its polarization are finally outlined.

  6. Extreme Thrombocytosis and Cardiovascular Surgery

    Science.gov (United States)

    Natelson, Ethan A.

    2012-01-01

    Extreme thrombocytosis is a major risk factor for excessive bleeding and for thrombosis, either of which can complicate cardiovascular surgical and interventional procedures. Extreme thrombocytosis can also cause an unusual syndrome, erythromelalgia, that results in a type of chronic microvascular occlusive arterial disease. We present the differential diagnosis of conditions that may lead to extreme thrombocytosis, 3 cases (each of which illustrates a different potential complication), and a review of the pertinent medical literature. Correcting excessive thrombocytosis is typically not difficult, whether electively or acutely, and effective therapy usually controls thrombosis and excessive hemorrhage postprocedurally. PMID:23304015

  7. Key Exoplanets in the Era of JWST

    Science.gov (United States)

    Batalha, Natasha; Mandell, Avi; Lewis, Nikole K.; Pontoppidan, Klaus

    2017-01-01

    In 2018, exoplanet science will enter a new era with the launch of the James Webb Space Telescope (JWST). With JWST's observing power, several studies have sought to characterize how the instruments will perform and what atmospheric spectral features could theoretically be detected using transmission spectroscopy. With just two years left until launch, it is imperative that the exoplanet community begins to digest and integrate these studies into their observing plans and strategies. In order to encourage this and to allow all members of the community access to JWST simulations, we present here an open source tool for creating observation simulations of all observatory-supported time-series spectroscopy modes. We describe our tool, PandExo, and use it to calculate the expected signal-to-noise ratio (SNR) for every confirmed planetary system brighter than a J-band magnitude cutoff, and to determine how many hours are needed to attain a SNR of 5 on key molecular absorption bands of H2O, CH4, and CO. We end by determining the number of planets (hot Jupiters, warm Neptunes, super-Earths, etc.) that are currently attainable with JWST.

  8. Flavour in the era of the LHC

    CERN Multimedia

    2006-01-01

    The 4th meeting of the 'Flavour in the era of the LHC'workshop will take place at CERN on 9-11 October, 2006. The goal of this workshop is to outline and document a programme for flavour physics for the next decade, addressing in particular the complementarity and synergy between the discoveries we expect to emerge from the LHC and the potential for accurate measurements of future flavour factories. Over 150 physicists will join in the discussions of the three working groups dedicated to 'Flavour physics at high Q', 'B/D/K decays'and 'Flavour in the lepton sector, EDM's, g-2, etc'. The previous meetings took place in November 2005, and in February and May this year. In addition to the working group sessions, a special miniworkshop dedicated to future prospects for electric dipole moment (EDM) searches and g-2 measurements will be held on 9-10 October. Sensitive EDM and g-2 experiments probe physics in an integral way, and in many cases their physics reach is much higher than the spectrometer searches at th...

  9. Flavour in the era of the LHC

    CERN Multimedia

    2006-01-01

    The 4th meeting of the 'Flavour in the era of the LHC' workshop will take place at CERN on 9-11 October, 2006. The goal of this workshop is to outline and document a programme for flavour physics for the next decade, addressing in particular the complementarity and synergy between the discoveries we expect to emerge from the LHC and the potential for accurate measurements of future flavour factories. Over 150 physicists will join in the discussions of the three working groups dedicated to 'Flavour physics at high Q', 'B/D/K decays' and 'Flavour in the lepton sector, EDM's, g-2, etc'. The previous meetings took place in November 2005, and in February and May this year. In addition to the working group sessions, a special miniworkshop dedicated to future prospects for electric dipole moment (EDM) searches and g-2 measurements will be held on 9-10 October. Sensitive EDM and g-2 experiments probe physics in an integral way, and in many cases their physics reach is much higher than the spectrometer searches at th...

  10. CERN moves into the LHC era

    CERN Multimedia

    2001-01-01

    Dr Hans Eschelbacher (on the left), President of the CERN Council for the last three years, hands over to his successor Maurice Bourquin.  The CERN Council, where the representatives of the 20 Member States of the Organization decide on scientific programmes and financial resources, held its 116th session on 15 December under the chairmanship of Dr. Hans C. Eschelbacher (DE). 'Le Roi est mort. Vive le Roi !' The Large Electron Positron Collider (LEP) era has ended and CERN's future is the Large Hadron Collider (LHC), stated Director General, Prof. Luciano Maiani. He opened his report to Council with a 'homage to LEP', which reached the end of its career during 2000 and is now being dismantled to make way for CERN's next major machine, the LHC collider, in the same 27-kilometre tunnel. The strong indications of a Higgs boson at 115 GeV found during the year were the culmination of LEP's long and distinguished physics career, during which the machine opened up new regimes of precision physics, involvi...

  11. Modelling Viking ERA Water Ice Clouds

    Science.gov (United States)

    Tamppari, L. K.; Wilson, R. J.; Zurek, R. W.; Paige, D. A.

    1999-09-01

    Water ice clouds in the Martian atmosphere are increasingly recognized as a potentially important aspect of the water cycle and a potentially potent mechanism for climate change. In particular, it has been suggested that water ice cloud formation can control the extent of the water column (Kahn, 1990). Further, water ice cloud formation may scavenge dust out of the atmosphere and may prevent cross-equatorial water transport, especially in the northern summer (Clancy, 1996). To address these questions, a combination of modelling and data analysis can be used. The Viking era water ice clouds were identified (Tamppari et al., 1998) from the IRTM data set. Following that, Tamppari et al. (1999) attempted to identify the cloud opacity and temperature using a 1D, 2-layer ice and dust cloud model. However, data fits were sensitive to the surface temperature, dust opacity and temperature, and ice particle mode radius value, as well as the water ice cloud temperature and opacity. This resulted in an underconstrained problem. A Mars GCM will be employed to provide realistic atmospheric conditions as a function of season, latitude, and longitude. The non-unit surface emissivities (Christensen, 1998) will be added and synthetic IRTM brightness temperatures will be calculated. Results of the comparison of the synthetic and measured brightness temperatures will be presented.

  12. Belydenisgebondenheid in ’n postmoderne era

    Directory of Open Access Journals (Sweden)

    C.F.C. Coetzee

    2010-07-01

    Full Text Available The binding to confessions in a postmodern era. We are experiencing a paradigm shift between Modernism and Postmodernism in almost every sphere of life, and also in the sphere of church and theology. This paradigm shift has far-reaching consequences, especially for churches in the reformed tradition and the practice of reformed theology as far as the binding to the confessions is concerned. From the viewpoint of Postmodernism, there is no absolute truth. This applies also to Scripture. As far as hermeneutics is concerned, postmodern thinkers adhere to the principles of deconstruction as formulated by Derrida. According to these principles, a text has no intrinsic meaning but rather creates meaning. There is nothing outside the text. This leads to radical relativism. Over against the postmodern view, reformed hermeneutics maintains that Scripture is the infallible Word of God and proclaims everlasting truth. In the confessions this truth is formulated. Confessions belong to the very essence of the church. The binding to the confessions therefore applies to every member as well as all office-bearers and also professors in theology. In this regard there can be no compromise with Postmodernism.

  13. Bounding Extreme Spacecraft Charging in the Lunar Environment

    Science.gov (United States)

    Minow, Joseph I.; Parker, Linda N.

    2008-01-01

    Robotic and manned spacecraft from the Apollo era demonstrated that the lunar surface in daylight will charge to positive potentials of a few tens of volts because the photoelectron current dominates the charging process. In contrast, potentials of the lunar surface in darkness which were predicted to be on the order of a hundred volts negative in the Apollo era have been shown more recently to reach values of a few hundred volts negative with extremes on the order of a few kilovolts. The recent measurements of night time lunar surface potentials are based on electron beams in the Lunar Prospector Electron Reflectometer data sets interpreted as evidence for secondary electrons generated on the lunar surface accelerated through a plasma sheath from a negatively charged lunar surface. The spacecraft potential was not evaluated in these observations and therefore represents a lower limit to the magnitude of the lunar negative surface potential. This paper will describe a method for obtaining bounds on the magnitude of lunar surface potentials from spacecraft measurements in low lunar orbit based on estimates of the spacecraft potential. We first use Nascap-2k surface charging analyses to evaluate potentials of spacecraft in low lunar orbit and then include the potential drops between the ambient space environment and the spacecraft to the potential drop between the lunar surface and the ambient space environment to estimate the lunar surface potential from the satellite measurements.
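
    As a back-of-the-envelope companion to the charging analyses described above, the sketch below solves a simple planar current balance for the floating potential of a shadowed surface in a Maxwellian plasma, ignoring photoelectron and secondary emission. It is a textbook probe-theory estimate under assumed plasma parameters, not a Nascap-2k calculation.

        import numpy as np
        from scipy.optimize import brentq

        E = 1.602e-19                      # elementary charge (C)
        ME, MP = 9.109e-31, 1.673e-27      # electron and proton masses (kg)

        def floating_potential(n_m3, te_ev, ti_ev):
            """Potential (V) at which electron and ion thermal currents balance on a planar surface."""
            je0 = n_m3 * E * np.sqrt(te_ev * E / (2.0 * np.pi * ME))   # electron thermal current density
            ji0 = n_m3 * E * np.sqrt(ti_ev * E / (2.0 * np.pi * MP))   # ion thermal current density

            def net_current(v):
                # A negative surface repels electrons (Boltzmann factor) and collects ions.
                return je0 * np.exp(v / te_ev) - ji0

            return brentq(net_current, -100.0 * te_ev, 0.0)

        # Assumed solar-wind-like plasma: n = 5 cm^-3, Te = Ti = 10 eV
        print(round(floating_potential(5.0e6, 10.0, 10.0), 1), "V")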

  14. Probing Terrestrial Planet Formation with Extreme Disk Variability

    Science.gov (United States)

    Su, Kate; Rieke, George; Gaspar, Andras; Jackson, Alan

    2016-08-01

    Spitzer has advanced our knowledge about the critical stages of terrestrial planet formation (and in some cases destruction) by discovering young stars orbited by 1.) silica dust emission close to their terrestrial zones indicative of the violent collisions, and 2.) variable disk emission arising from the aftermath of asteroid-size impacts. The variable emission provides a unique opportunity to learn about asteroid-sized bodies in young exoplanetary systems and to explore planetesimal collisions and their aftermaths during the era of terrestrial-planet-building. We propose continued study of debris disk variability, focused in two areas: (1) to provide continuous monitoring of systems where our existing program has discovered substantial variations indicative of major ongoing episodes of planetesimal impacts; and (2) to investigate intensively possible variations in the dust content of systems that show prominent crystalline emission features to establish a link between the two indicators of planet building. Together these objectives will prepare us for the JWST era, when we will again obtain mid-infrared spectra of these systems, and of both higher spectral resolution and signal to noise than has been possible previously. This program will extend the time-domain study of extreme debris disks as an important heritage of the Spitzer warm mission.

  15. Storm-Tracks in ERA-40 and ERA-Interim Reanalyses

    Science.gov (United States)

    Liberato, M. L. R.; Trigo, I. F.; Trigo, R. M.

    2009-04-01

    Extratropical cyclones, their dominant paths, frequency and intensity have long been the object of climatological studies. The analysis of cyclone characteristics for the Euro-Atlantic sector (85°W-70°E; 20°N-75°N) presented here is based on the cyclone detecting and tracking algorithm first developed for the Mediterranean region (Trigo et al., 1999, 2002) and recently extended to a larger Euro-Atlantic region (Trigo, 2006). The objective methodology, which identifies and follows individual lows (Trigo et al. 1999), is applied to 6-hourly geopotential data at 1000-hPa from two reanalyses datasets provided by the European Centre for Medium-Range Weather Forecasts (ECMWF): ERA-40 and ERA-Interim reanalyses. Two storm-track databases are built over the Northern Atlantic European area, spanning the common available extended winter seasons from October 1989 to March 2002. Although relatively short, this common period allows a comparison of systems represented in reanalyses datasets with distinct horizontal resolutions (T106 and T255, respectively). This exercise is mostly focused on the key areas of cyclone formation and dissipation and main cyclone characteristics for the Euro-Atlantic sector. Trigo, I. F., T. D. Davies, and G. R. Bigg, 1999: Objective climatology of cyclones in the Mediterranean region. J. Climate, 12, 1685-1696. Trigo I. F., G. R. Bigg and T. D. Davies, 2002: Climatology of Cyclogenesis Mechanisms in the Mediterranean. Mon. Weather Rev. 130, 549-569. Trigo, I. F. 2006: Climatology and Interannual Variability of Storm-Tracks in the Euro-Atlantic sector: a comparison between ERA-40 and NCEP/NCAR Reanalyses. Clim. Dyn. DOI 10.1007/s00382-005-0065-9.
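
    The detection step of such tracking schemes can be illustrated, very roughly, as a search for local minima in a gridded 1000-hPa geopotential (or mean sea level pressure) field. The sketch below does only that, using scipy's minimum filter on a synthetic field; the actual Trigo et al. algorithm applies additional gradient, depth and tracking criteria.

        import numpy as np
        from scipy.ndimage import minimum_filter

        def detect_lows(field, window=5, max_value=None):
            """Return (row, col) indices of grid points that are local minima of `field`."""
            local_min = field == minimum_filter(field, size=window, mode="nearest")
            if max_value is not None:
                local_min &= field < max_value           # keep only sufficiently deep lows
            return np.argwhere(local_min)

        # Synthetic geopotential-like field with two depressions on a smooth background
        y, x = np.mgrid[0:100, 0:160]
        field = (500.0
                 - 80.0 * np.exp(-((x - 40) ** 2 + (y - 30) ** 2) / 200.0)
                 - 60.0 * np.exp(-((x - 120) ** 2 + (y - 70) ** 2) / 300.0)
                 + np.random.default_rng(8).normal(0.0, 0.5, (100, 160)))
        for r, c in detect_lows(field, window=15, max_value=470.0):
            print("low centre at grid point", (int(r), int(c)))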

  16. Matter Under Extreme Conditions: The Early Years

    CERN Document Server

    Keeler, R Norris

    2010-01-01

    Extreme conditions in natural flows are examined, starting with a turbulent big bang. A hydro-gravitational-dynamics cosmology model is adopted. Planck-Kerr turbulence instability causes Planck-particle turbulent combustion. Inertial-vortex forces induce a non-turbulent kinetic energy cascade to Planck-Kolmogorov scales where vorticity is produced, overcoming 10^113 Pa Planck-Fortov pressures. The spinning, expanding fireball has a slight deficit of Planck antiparticles. Space and mass-energy powered by gluon viscous stresses expand exponentially at speeds >10^25 c. Turbulent temperature and spin fluctuations fossilize at scales larger than ct, where c is light speed and t is time. Because “dark-energy” antigravity forces vanish when inflation ceases, and because turbulence produces entropy, the universe is closed and will collapse and rebound. Density and spin fossils of big bang turbulent mixing trigger structure formation in the plasma epoch. Fragmenting protosuperclustervoids and protoclustervoi...

  17. Effects of Standard Extremity on Mixed Standard Scale Performance Ratings.

    Science.gov (United States)

    1983-03-01


  18. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Bremer, Peer-Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohr, Bernd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pasccci, Valerio [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gamblin, Todd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brunst, Holger [Dresden Univ. of Technology (Germany)

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  19. A Fault-oblivious Extreme-scale Execution Environment

    Energy Technology Data Exchange (ETDEWEB)

    Sadayappan, Ponnuswamy [The Ohio State Univ., Columbus, OH (United States)

    2016-08-31

    Exascale computing systems will provide a thousand-fold increase in parallelism and a proportional increase in failure rate relative to today's machines. Systems software for exascale machines must provide the infrastructure to support existing applications while simultaneously enabling efficient execution of new programming models that naturally express dynamic, adaptive, irregular computation; coupled simulations; and massive data analysis in a highly unreliable hardware environment with billions of threads of execution. We propose a new approach to the data and work distribution model provided by system software based on the unifying formalism of an abstract file system. The proposed hierarchical data model provides simple, familiar visibility and access to data structures through the file system hierarchy, while providing fault tolerance through selective redundancy. The hierarchical task model features work queues whose form and organization are represented as file system objects. Data and work are both first class entities. By exposing the relationships between data and work to the runtime system, information is available to optimize execution time and provide fault tolerance. The data distribution scheme provides replication (where desirable and possible) for fault tolerance and efficiency, and it is hierarchical to make it possible to take advantage of locality. The user, tools, and applications, including legacy applications, can interface with the data, work queues, and one another through the abstract file model. This runtime environment will provide multiple interfaces to support traditional Message Passing Interface applications, languages developed under DARPA's High Productivity Computing Systems program, as well as other, experimental programming models. We will validate our runtime system with pilot codes on existing platforms and will use simulation to validate for exascale-class platforms. In this final report, we summarize research results from the work done at the Ohio State University towards the larger goals of the project listed above.
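
    The abstract-file-system idea of exposing work queues as file system objects can be made concrete with a toy sketch like the one below, in which tasks are files under a queue directory and a worker claims a task through an atomic rename. The directory names are hypothetical and this is only an illustration of the concept, not the project's runtime.

        import json
        import os
        import uuid
        from typing import Optional

        QUEUE, CLAIMED = "workq/pending", "workq/claimed"    # hypothetical queue hierarchy
        for d in (QUEUE, CLAIMED):
            os.makedirs(d, exist_ok=True)

        def enqueue(task: dict) -> str:
            """Expose a unit of work as a file object in the pending-queue directory."""
            name = f"{uuid.uuid4().hex}.json"
            with open(os.path.join(QUEUE, name), "w") as f:
                json.dump(task, f)
            return name

        def claim_one() -> Optional[dict]:
            """Claim a task by atomically renaming it out of the pending directory."""
            for name in sorted(os.listdir(QUEUE)):
                src, dst = os.path.join(QUEUE, name), os.path.join(CLAIMED, name)
                try:
                    os.rename(src, dst)                  # on POSIX only one worker wins the rename
                except FileNotFoundError:                # another worker claimed it first
                    continue
                with open(dst) as f:
                    return json.load(f)
            return None

        enqueue({"kind": "integrate", "block": 17})
        print(claim_one())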

  20. Exploring Asynchronous Many-Task Runtime Systems toward Extreme Scales

    Energy Technology Data Exchange (ETDEWEB)

    Knight, Samuel [08953]; Baker, Gavin Matthew; Gamell, Marc [Rutgers U]; Hollman, David [08953]; Sjaardema, Gregor [SNL]; Kolla, Hemanth [SNL]; Teranishi, Keita; Wilke, Jeremiah J.; Slattengren, Nicole [SNL]; Bennett, Janine Camille

    2015-10-01

    Major exascale computing reports indicate a number of software challenges posed by the dramatic change of system architectures in the near future. While a several-orders-of-magnitude increase in parallelism is the most commonly cited of these, hurdles also include performance heterogeneity of compute nodes across the system, increased imbalance between computational capacity and I/O capabilities, frequent system interrupts, and complex hardware architectures. Asynchronous task-parallel programming models show great promise in addressing these issues, but are not yet fully understood nor sufficiently developed for computational science and engineering application codes. We address these knowledge gaps through quantitative and qualitative exploration of leading candidate solutions in the context of engineering applications at Sandia. In this poster, we evaluate the MiniAero code ported to three leading candidate programming models (Charm++, Legion, and Uintah) to examine the feasibility of inserting new programming model elements into an existing code base.
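    For readers unfamiliar with the asynchronous task-parallel style evaluated in this poster, the toy sketch below (plain Python futures, not Charm++, Legion, or Uintah) illustrates the basic pattern: work is expressed as tasks with explicit data dependencies, and dependent work is scheduled as soon as its inputs are ready rather than at a global barrier. The kernels are stand-ins, not MiniAero code.

        from concurrent.futures import ThreadPoolExecutor, as_completed

        def flux(cell):
            return cell * 0.5     # stand-in for a per-cell flux kernel

        def update(cell, f):
            return cell + f       # stand-in for a kernel that depends on flux(cell)

        if __name__ == "__main__":
            cells = list(range(8))
            results = {}
            with ThreadPoolExecutor() as pool:
                # independent flux tasks are launched asynchronously
                pending = {pool.submit(flux, c): c for c in cells}
                # each dependent update is scheduled as soon as its own input
                # is ready, instead of waiting for all fluxes to finish
                for fut in as_completed(pending):
                    c = pending[fut]
                    results[c] = pool.submit(update, c, fut.result())
                print([results[c].result() for c in cells])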

  1. Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    Energy Technology Data Exchange (ETDEWEB)

    Ghanem, Roger [Univ. of Southern California, Los Angeles, CA (United States)

    2017-04-18

    QUEST was a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, the Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University. The mission of QUEST was to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The USC effort centered on the development of reduced models and efficient algorithms for implementing various components of the UQ pipeline. USC personnel were responsible for the development of adaptive bases, adaptive quadrature, and reduced models to be used in estimation and inference.
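    As an illustration of the forward-propagation step in such a UQ pipeline, the sketch below pushes assumed input uncertainties through a toy model with plain Monte Carlo sampling; the adaptive bases, adaptive quadrature, and reduced models developed under QUEST are considerably more sophisticated, and the model and priors here are purely illustrative.

        import numpy as np

        def model(x):
            """Toy forward model standing in for an expensive simulation."""
            return np.sin(x[0]) + 0.5 * x[1] ** 2

        rng = np.random.default_rng(0)
        n = 10_000
        # assumed uncertain inputs: x0 ~ N(0, 0.1), x1 ~ U(0, 1)
        samples = np.column_stack([rng.normal(0.0, 0.1, n), rng.uniform(0.0, 1.0, n)])
        outputs = np.array([model(x) for x in samples])

        print("mean of quantity of interest:", outputs.mean())
        print("std  of quantity of interest:", outputs.std(ddof=1))
        print("95% interval:", np.percentile(outputs, [2.5, 97.5]))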

  2. Scalable ParaView for Extreme Scale Visualization Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Petascale computing is leading to significant breakthroughs in a number of fields and is revolutionizing the way science is conducted. Data is not knowledge, however,...

  3. Predicting Ice Sheet and Climate Evolution at Extreme Scales

    Energy Technology Data Exchange (ETDEWEB)

    Heimbach, Patrick [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2016-02-06

    A main research objective of PISCEES is the development of formal methods for quantifying uncertainties in ice sheet modeling. Uncertainties in simulating and projecting mass loss from the polar ice sheets arise primarily from initial conditions, surface and basal boundary conditions, and model parameters. In general terms, two main chains of uncertainty propagation may be identified: 1. inverse propagation of observation and/or prior uncertainties onto posterior control variable uncertainties; 2. forward propagation of prior or posterior control variable uncertainties onto those of target output quantities of interest (e.g., climate indices or ice sheet mass loss). A related goal is the development of computationally efficient methods for producing initial conditions for an ice sheet that are close to available present-day observations and essentially free of artificial model drift, which is required in order to be useful for model projections (the “initialization problem”). To be of maximum value, such optimal initial states should be accompanied by “useful” uncertainty estimates that account for the different sources of uncertainties, as well as the degree to which the optimum state is constrained by available observations. The PISCEES proposal outlined two approaches for quantifying uncertainties. The first targets the full exploration of the uncertainty in model projections with sampling-based methods and a workflow managed by DAKOTA (the main delivery vehicle for software developed under QUEST). This is feasible for low-dimensional problems, e.g., those with a handful of global parameters to be inferred. This approach can benefit from derivative/adjoint information, but does not require it, which is why it is often referred to as “non-intrusive”. The second approach makes heavy use of derivative information from model adjoints to address quantifying uncertainty in high dimensions (e.g., basal boundary conditions in ice sheet models). The use of local gradient or Hessian information (i.e., second derivatives of the cost function) requires additional code development and implementation, and is thus often referred to as an “intrusive” approach. Within PISCEES, MIT has been tasked to develop methods for derivative-based UQ, the “intrusive” approach discussed above. These methods rely on the availability of first (adjoint) and second (Hessian) derivative code, developed through intrusive methods such as algorithmic differentiation (AD). While representing a significant burden in terms of code development, derivative-based UQ is able to cope with very high-dimensional uncertainty spaces. That is, unlike sampling methods (all variations of Monte Carlo), the computational burden is independent of the dimension of the uncertainty space. This is a significant advantage for spatially distributed uncertainty fields, such as three-dimensional initial conditions, three-dimensional parameter fields, or two-dimensional surface and basal boundary conditions. Importantly, uncertainty fields for ice sheet models generally fall into this category.
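    One standard way to write down the two propagation chains in the derivative-based ("intrusive") setting described above is through a Laplace (Gaussian) approximation of the posterior; the notation below is generic and is not taken from the PISCEES reports:

        \Sigma_{\mathrm{post}} \approx \bigl(\nabla^2_m J(m^\star)\bigr)^{-1},
        \qquad
        \sigma^2_q \approx \nabla_m q(m^\star)^{\mathsf T}\,\Sigma_{\mathrm{post}}\,\nabla_m q(m^\star),

    where $J(m)$ is the regularized model-data misfit over the control variables $m$, $m^\star$ its minimizer (the initialized state), and $q(m)$ a scalar quantity of interest such as projected ice mass loss. The first relation is the inverse propagation of observational and prior uncertainty onto the controls (via the Hessian), and the second is the forward propagation of the control uncertainty onto the target output.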

  4. Beijing New Era Architectural Design Co.,Ltd.

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Beijing New Era Architectural Design Co., Ltd., an A-Class design enterprise approved by the Ministry of Construction, provides services in industrial and civil architectural design, interior decoration, and consultation for real estate development,

  5. CENET: Cost Efficiency in a New Era with new Technology

    Energy Technology Data Exchange (ETDEWEB)

    Karlsen, Jan E.; Lund, Bjoernar; Bos, Christian F.M.; Stokka, Sigmund

    1997-12-31

    This report relates to the CENET (Cost Efficiency in a New Era with new Technology) project on the oil and gas industry in Europe. Key objectives of the CENET project are to determine the role of RTD (Research and Technology Development) in the European oil and gas industry towards improved value and cost reduction, with a particular focus on the means of developing offshore European marginal fields commercially; to identify RTD areas with the largest potential for improved value and cost reduction, and technological developments and advances which are likely to increase European competitiveness internationally; and to provide guidance to European governments when deciding RTD priorities. A new era with new technology concerns increased oil and gas potential during the next century, a new era with clean, safe and cost efficient energy production, a new era with a new business structure, and globalization of the industry. 44 tabs., 5 figs., 23 tabs.

  6. Eraõigusliku juriidilise isiku organi liikmete õigussuhted / Kalev Saare

    Index Scriptorium Estoniae

    Saare, Kalev, 1974-

    2010-01-01

    On the concept of the organ of a legal person in private law, illustrated by the public limited company and the private limited company, on the formation of the internal relationship of organ members, and on the main obligations forming part of that internal relationship as determined by the General Part of the Civil Code Act.

  7. Dragon paves the way for new spaceflight era

    Science.gov (United States)

    Gwynne, Peter

    2012-07-01

    The success of the first private mission to the International Space Station (ISS) has opened up a new era in commercial spaceflight after SpaceX's Dragon capsule splashed down safely in the Pacific Ocean on 31 May.

  8. Eraõigusliku juriidilise isiku organi liikmete õigussuhted / Kalev Saare

    Index Scriptorium Estoniae

    Saare, Kalev, 1974-

    2010-01-01

    On the concept of the organ of a legal person in private law, illustrated by the public limited company and the private limited company, on the formation of the internal relationship of organ members, and on the main obligations forming part of that internal relationship as determined by the General Part of the Civil Code Act.

  9. Pulsar timing array based search for supermassive black hole binaries in the SKA era

    CERN Document Server

    Wang, Yan

    2016-01-01

    The advent of next generation radio telescope facilities, such as the Square Kilometer Array (SKA), will usher in an era where a Pulsar Timing Array (PTA) based search for gravitational waves (GWs) will be able to use hundreds of well timed millisecond pulsars rather than the few dozen in existing PTAs. A realistic assessment of the performance of such an extremely large PTA must take into account the data analysis challenge posed by an exponential increase in the parameter space volume due to the large number of so-called pulsar phase parameters. We address this problem and present such an assessment for isolated supermassive black hole binary (SMBHB) searches using a SKA era PTA containing $10^3$ pulsars. We find that an all-sky search will be able to confidently detect non-evolving sources with redshifted chirp mass of $10^{10}$ $M_\odot$ out to a redshift of about $28$. The detection of GW signals from optically identified SMBHB candidates similar to PSO J334+01 is assured. If no SMBHB detections occur, ...

  10. Extreme Ionizing-Radiation-Resistant Bacterium

    Science.gov (United States)

    Vaishampayan, Parag A.; Venkateswaran, Kasthuri J.; Schwendner, Petra

    2013-01-01

    There is a growing concern that desiccation- and extreme radiation-resistant, non-spore-forming microorganisms associated with spacecraft surfaces can withstand space environmental conditions and subsequently proliferate on another solar body. Such forward contamination would jeopardize future life detection or sample return technologies. The prime focus of NASA's planetary protection efforts is the development of strategies for inactivating resistance-bearing microorganisms. Eradication techniques can be designed to target resistance-conferring microbial populations by first identifying and understanding the physiologic and biochemical capabilities that confer their elevated tolerance (as is being studied in Deinococcus phoenicis, as a result of this description). Furthermore, hospitals, food, and government agencies frequently use biological indicators to ensure the efficacy of a wide range of radiation-based sterilization processes. Due to its resistance to a variety of perturbations, the non-spore-forming D. phoenicis may be a more appropriate biological indicator than those currently in use. The high flux of cosmic rays during space travel and onto the unshielded surface of Mars poses a significant hazard to the survival of microbial life. Thus, radiation-resistant microorganisms that can survive the extreme radiation, desiccation, and low temperatures experienced during space travel are of particular concern. Spore-forming bacteria, common inhabitants of spacecraft assembly facilities, are known to tolerate these extreme conditions. Since the Viking era, spores have been utilized to assess the degree and level of microbiological contamination on spacecraft and their associated spacecraft assembly facilities. Members of the non-spore-forming bacterial community such as Deinococcus radiodurans can survive acute exposures to ionizing radiation (5 kGy), ultraviolet light (1 kJ/m2), and desiccation (years). These resistive phenotypes of Deinococcus enhance the

  11. Extreme hypertriglyceridemia managed with insulin.

    Science.gov (United States)

    Thuzar, Moe; Shenoy, Vasant V; Malabu, Usman H; Schrale, Ryan; Sangla, Kunwarjit S

    2014-01-01

    Extreme hypertriglyceridemia can lead to acute pancreatitis and rapid lowering of serum triglycerides (TG) is necessary for preventing such life-threatening complications. However, there is no established consensus on the acute management of extreme hypertriglyceridemia. We retrospectively reviewed 10 cases of extreme hypertriglyceridemia with mean serum TG on presentation of 101.5 ± 23.4 mmol/L (8982 ± 2070 mg/dL) managed with insulin. Serum TG decreased by 87 ± 4% in 24 hours in those patients managed with intravenous insulin and fasting and 40 ± 8.4% in those managed with intravenous insulin alone (P = .0003). The clinical course was uncomplicated in all except 1 patient who subsequently developed a pancreatic pseudocyst. Thus, combination of intravenous insulin with fasting appears to be an effective, simple, and safe treatment strategy in immediate management of extreme hypertriglyceridemia. Crown Copyright © 2014. Published by Elsevier Inc. All rights reserved.

  12. Moment methods in extremal geometry

    NARCIS (Netherlands)

    De Laat, D.

    2016-01-01

    In this thesis we develop techniques for solving problems in extremal geometry. We give an infinite dimensional generalization of moment techniques from polynomial optimization. We use this to construct semidefinite programming hierarchies for approximating optimal packing densities and ground state

  13. Moment methods in extremal geometry

    NARCIS (Netherlands)

    De Laat, D.

    2016-01-01

    In this thesis we develop techniques for solving problems in extremal geometry. We give an infinite dimensional generalization of moment techniques from polynomial optimization. We use this to construct semidefinite programming hierarchies for approximating optimal packing densities and ground state

  14. Enforcing patents in the era of 3D printing

    DEFF Research Database (Denmark)

    Ballardini, Rosa Maria; Norrgård, Marcus; Minssen, Timo

    2015-01-01

    This article explores relevant laws and doctrines of patent infringement in Europe with a special emphasis on 3D printing (3DP) technologies. Considering the difficulties that patent owners might face in pursuing direct patent infringement actions in the rapidly evolving era of 3DP, we suggest...... of IP law. Enforcing patents in the era of 3D printing Rosa Maria Ballardini, Marcus Norrgård, and Timo Minssen Journal of Intellectual Property Law & Practice 2015 10: 850-866...

  15. Sino-Pakistan Relations and the Challenges of the Post-Cold War Era

    Directory of Open Access Journals (Sweden)

    Mutahir Ahmed

    2015-04-01

    Full Text Available China has emerged as the world’s second largest economy, and the largest exporter of goods with 9.6 per cent of the global share. Moreover, the last two decades have seen China emerging as an international and regional power of the 21st century. Thus, in order to continue with the economic benefits, China wants peace and stability as well as to play an active role on international and regional fronts. On the other hand, Pakistan, the world’s sixth most populous country, is a major power of South Asia. While having a developed infrastructure and vibrant political and security institutions, Pakistan is nevertheless currently facing many challenges on the economic front, including political instability and religious extremism. This paper is an attempt to analyze the challenges faced by both China and Pakistan in the post-Cold War era.

  16. Ethics of using language editing services in an era of digital communication and heavily multiauthored papers

    CERN Document Server

    Lozano, George A

    2013-01-01

    Scientists of many countries in which English is not the primary language routinely use a variety of manuscript preparation, correction or editing services, a practice that is openly endorsed by many journals and scientific institutions. These services vary tremendously in their scope; at one end there is simple proof-reading, and at the other extreme there is in-depth and extensive peer-reviewing, proposal preparation, statistical analyses, re-writing and co-writing. In this paper, the various types of service are reviewed, along with authorship guidelines, and the question is raised of whether the high-end services surpass most guidelines' criteria for authorship. Three other factors are considered. First, the ease of collaboration possible in the internet era allows multiple iterations between authors and the editing service, so essentially, papers can be co-written. Second, 'editing services' often offer subject-specific experts who comment not only on the language, but interpret and improve scientific co...

  17. From photons to gravitational waves: pulsars in the era of multimessenger astronomy

    Science.gov (United States)

    Razzano, Massimiliano

    2015-08-01

    Multiwavelength astronomy has provided the most complete picture of the Universe so far. In the coming years the second-generation interferometers like Advanced LIGO and Advanced Virgo will reach sensitivities good enough to detect the first gravitational wave signals, opening a new window on the cosmos. In this new era of multimessenger astronomy, pulsars are very promising candidates to be studied using electromagnetic radiation and gravitational waves. In fact, being the endpoints of the evolution of massive stars, they are great tools to understand stellar structure and evolution, as well as the population of the Galaxy. Moreover, they are excellent natural laboratories to probe the laws of physics under extreme conditions of gravity and electromagnetic fields. I will review the multimessenger opportunities for the electromagnetic and gravitational observations of pulsars, highlighting their potential as continuous gravitational wave emitters.

  18. Nonstationary modeling of extreme precipitation in China

    Science.gov (United States)

    Gao, Meng; Mo, Dingyuan; Wu, Xiaoqing

    2016-12-01

    The statistical methods based on extreme value theory have been traditionally used in meteorology and hydrology for a long time. Due to climate change and variability, the hypothesis of stationarity in meteorological or hydrological time series was usually not satisfied. In this paper, a nonstationary extreme value analysis was conducted for annual maximum daily precipitation (AMP) at 631 meteorological stations over China for the period 1951-2013. Stationarity of all 631 AMP time series was firstly tested using KPSS test method, and only 48 AMP time series showed non-stationarity at 5% significance level. The trends of these 48 nonstationary AMP time series were further tested using M-K test method. There were 25 nonstationary AMP time series mainly distributed in southern and western China showing significant positive trend at 5% level. Another 5 nonstationary AMP time series with significant negative trends were near northern urban agglomeration, Sichuan Basin, and central China. For these nonstationary AMP time series with significant positive or negative trends, the location parameter in generalized extreme value (GEV) distribution was assumed to be time-varying, and the trends were successfully characterized by the nonstationary GEV models. For the remaining 18 nonstationary AMP time series mainly in the eastern portion of China, no significant trend was detected. The correlation analysis showed that only 5 nonstationary AMP time series were significantly correlated with one or two of the four climate indices EASMI, WPI, SOI, and PDO. Then, the location and scale parameters in the GEV distribution were modeled as functions of the significantly correlated climate indices. The modeling results in this study showed that the nonstationary GEV distributions performed better than their stationary equivalents. Finally, 20-year and 50-year return levels of precipitation extremes at all 631 stations were estimated using the best fitting distribution for the year 1961
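    A minimal sketch of the kind of nonstationary GEV fit described above is given below, with the location parameter linear in time and the fit obtained by direct likelihood maximization; the data are synthetic and the covariate, optimizer, and starting values are illustrative rather than the authors' implementation.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import genextreme

        rng = np.random.default_rng(1)
        years = np.arange(1951, 2014)
        t = (years - years.mean()) / 10.0     # decades, centered
        # synthetic annual maximum daily precipitation with an upward trend in location
        amp = genextreme.rvs(c=-0.1, loc=50 + 2.0 * t, scale=12, random_state=rng)

        def neg_log_lik(theta):
            c, mu0, mu1, log_sigma = theta
            mu = mu0 + mu1 * t                # time-varying location parameter
            return -genextreme.logpdf(amp, c, loc=mu, scale=np.exp(log_sigma)).sum()

        x0 = np.array([0.0, amp.mean(), 0.0, np.log(amp.std())])
        fit = minimize(neg_log_lik, x0, method="Nelder-Mead")
        c_hat, mu0_hat, mu1_hat, log_sigma_hat = fit.x
        print(f"location trend: {mu1_hat:.2f} per decade, scale: {np.exp(log_sigma_hat):.1f}")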

  19. Summer 2015 Extremes over South Asia within the Historical Perspective

    Science.gov (United States)

    Rastogi, D.; Ashfaq, M.

    2015-12-01

    The South Asian summer of 2015 was marked by weather events of extremely different nature, including hot extremes over India and Pakistan, and wet extremes over the northern, western, and eastern states of India. Interestingly, these extremes occurred against the backdrop of warm sea surface temperature anomalies in the equatorial Pacific, which have historically reduced the strength of the summer monsoon over South Asia. Given the occurrence of contrasting anomalies at large and regional scales, in this study we analyze the 2015 extremes over South Asia within the historical perspective. We study the anomalies in the land, atmospheric, and oceanic processes that potentially led to the regional heat waves and wet extremes throughout the summer, and their connection to the large-scale anomalies in the monsoon dynamics. Additionally, we analyze historical simulations of the CMIP5 GCMs to investigate the likelihood of these anomalies with respect to the pre-industrial time period. Our analysis suggests evolving changes in the monsoon dynamics over South Asia, where lesser-known regional and local drivers have an influence on the historical tele-connections.

  20. Deformations of extremal toric manifolds

    CERN Document Server

    Rollin, Yann

    2012-01-01

    Let $X$ be a compact toric extremal Kähler manifold. Using the work of Székelyhidi, we provide a simple criterion on the fan describing $X$ to ensure the existence of complex deformations of $X$ that carry extremal metrics. As an example, we find new CSC metrics on 4-point blow-ups of $\mathbb{CP}^1 \times \mathbb{CP}^1$.

  1. Large Extremity Peripheral Nerve Repair

    Science.gov (United States)

    2015-10-01

    IL, Kochevar IE, Redmond RW. Large extremity peripheral nerve repair. Military Health System Research Symposium (MHSRS) Fort Lauderdale, FL. August...some notable discoveries that may impact military health care in the near future. There is a clear need in military medicine to improve outcomes in...membranes or “caul” intact was considered extremely lucky. Children were gifted with life-long happiness , the ability to see spirits, and protection

  2. Observed Statistics of Extreme Waves

    Science.gov (United States)

    2006-12-01

    Figure 5. An energy-stealing wave as a solution to the NLS equation. (From: Dysthe and... shown that nonlinear interaction between four colliding waves can produce extreme wave behavior. He utilized the NLS equation in his numerical ...2000) demonstrated the formation of extreme waves using the Korteweg-de Vries (KdV) equation, which is valid in shallow water. It was shown in the
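    For reference, the two model equations mentioned in this record are commonly written in the following standard dimensionless forms (the report's own normalization may differ):

        i\,\frac{\partial \psi}{\partial t} + \frac{1}{2}\,\frac{\partial^2 \psi}{\partial x^2} + |\psi|^2 \psi = 0 \quad \text{(NLS)},
        \qquad
        \frac{\partial u}{\partial t} + 6\,u\,\frac{\partial u}{\partial x} + \frac{\partial^3 u}{\partial x^3} = 0 \quad \text{(KdV)}.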

  3. Weather Extremes Around the World

    Science.gov (United States)

    1974-04-01

    or ever has occurred. According to M. A. Arkin, "... record extremes must be taken with a grain of salt ...." He explains that news of an extreme... the edge of the Danakil Depression, a salt desert. By averaging the annual mean daily maximum temperature of 106°F and the annual mean daily... increased by orographic lifting." As a result of these monsoon disturbances, which are still not fully understood, the eastern Himalayan

  4. Extreme precipitation and extreme streamflow in the Dongjiang River Basin in southern China

    Directory of Open Access Journals (Sweden)

    W. Wang

    2007-07-01

    Full Text Available Extreme hydro-meteorological events have become the focus of more and more studies in the last decade. Due to the complexity of the spatial pattern of changes in precipitation processes, it is still hard to establish a clear view of how precipitation has changed and how it will change in the future. In the present study, changes in extreme precipitation and streamflow processes in the Dongjiang River Basin in southern China are investigated. It was shown that little change is observed in annual extreme precipitation in terms of various indices, but some significant changes are found in the precipitation processes on a monthly basis. The result indicates that when detecting climate changes, besides annual indices, seasonal variations in extreme events should be considered as well. Despite little change in the annual extreme precipitation series, significant changes are detected in several annual extreme flood flow and low-flow series, mainly at the stations along the main channel of the Dongjiang River, which are affected significantly by the operation of several major reservoirs. The result highlights the importance of evaluating the impacts of human activities when assessing changes in extreme streamflows. In addition, three non-parametric methods that are not commonly used in the hydro-meteorology community, i.e., the Kolmogorov–Smirnov test, Levene's test, and the quantile test, are introduced and assessed by Monte Carlo simulation in the present study to test for changes in the distribution, variance, and shift of tails of different groups of data. The Monte Carlo simulation results show that, while all three methods work well for detecting changes between two groups of data with a large sample size (e.g., over 200 points in each group) and a big difference in distribution parameters (e.g., over 100% increase of the scale parameter in a Gamma distribution), none of them is powerful enough for small data sets (e.g., less than 100 points and small distribution
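    A minimal sketch of the kind of Monte Carlo power assessment described above is given below, here only for the Kolmogorov-Smirnov two-sample test applied to Gamma-distributed samples; the sample sizes, shape parameter, and magnitude of the scale change are illustrative.

        import numpy as np
        from scipy.stats import gamma, ks_2samp

        def ks_power(n, scale_increase, trials=1000, alpha=0.05, seed=0):
            """Fraction of trials in which the two-sample KS test detects a
            change of the Gamma scale parameter between two groups of size n."""
            rng = np.random.default_rng(seed)
            hits = 0
            for _ in range(trials):
                a = gamma.rvs(a=2.0, scale=10.0, size=n, random_state=rng)
                b = gamma.rvs(a=2.0, scale=10.0 * scale_increase, size=n, random_state=rng)
                if ks_2samp(a, b).pvalue < alpha:
                    hits += 1
            return hits / trials

        for n in (50, 100, 200):
            # scale_increase=2.0 corresponds to a 100% increase of the scale parameter
            print(n, ks_power(n, scale_increase=2.0))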

  5. Is climate change modifying precipitation extremes?

    Science.gov (United States)

    Montanari, Alberto; Papalexiou, Simon Michael

    2016-04-01

    The title of the present contribution is a relevant question that is frequently posed to scientists, technicians and managers of local authorities. Although several research efforts were recently dedicated to rainfall observation, analysis and modelling, the above question remains essentially unanswered. The question comes from the awareness that the frequency of floods and the related socio-economic impacts are increasing in many countries, and climate change is deemed to be the main trigger. Indeed, identifying the real reasons for the observed increase of flood risk is necessary in order to plan effective mitigation and adaptation strategies. While mitigation of climate change is an extremely important issue at the global level, at small spatial scales several other triggers may interact with it, therefore requiring different mitigation strategies. Similarly, the responsibilities of administrators are radically different at local and global scales. This talk aims to provide insights and information to address the question expressed by its title. High resolution and long term rainfall data will be presented, as well as an analysis of the frequency of their extremes and its progress in time. The results will provide pragmatic indications for the sake of better planning flood risk mitigation policies.

  6. Re-Form: FPGA-Powered True Codesign Flow for High-Performance Computing In The Post-Moore Era

    Energy Technology Data Exchange (ETDEWEB)

    Cappello, Franck; Yoshii, Kazutomo; Finkel, Hal; Cong, Jason

    2016-11-14

    Multicore scaling will end soon because of practical power limits. Dark silicon is becoming an even bigger issue than the end of Moore's law. In the post-Moore era, the energy efficiency of computing will be a major concern, and FPGAs could be a key to maximizing it. In this paper we address severe challenges in the adoption of FPGAs in HPC and describe “Re-form,” an FPGA-powered codesign flow.

  7. Quantum cosmology on (k = -1)-Friedmann-Robertson-Walker Universe evolving from stiff matter era to the dust dominated one

    Science.gov (United States)

    Dariescu, Marina-Aura; Dariescu, Ciprian

    2017-01-01

    This work is devoted to the spatially open Friedmann-Robertson-Walker (FRW) Universe evolving from the stiff matter era to the dust dominated one. Within the quantum analysis based on the Wheeler-DeWitt equation, we derive the wave function of the (k = -1)-FRW Universe with combined matter sources. At the classical level, one deals with the Friedmann equation, which yields a dependence of the scale factor on time that is generally expressed through functional relations involving elliptic integrals.
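    For orientation, the classical starting point referred to above is the Friedmann equation for the open ($k=-1$) FRW model with the two matter components scaling in the standard way (units with $c=1$; the notation is generic and not copied from the paper):

        \left(\frac{\dot a}{a}\right)^{2} = \frac{8\pi G}{3}\left(\frac{\rho_{s0}}{a^{6}} + \frac{\rho_{d0}}{a^{3}}\right) + \frac{1}{a^{2}},

    where the $a^{-6}$ term is the stiff-matter contribution ($p=\rho$) and the $a^{-3}$ term the dust contribution ($p=0$); integrating $dt = da/\dot a$ leads to the functional relations involving elliptic integrals mentioned in the abstract.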

  8. The solar siblings in the Gaia era

    Science.gov (United States)

    Martínez-Barbosa, C. A.; Brown, A. G. A.; Portegies Zwart, S.

    2014-07-01

    We perform realistic simulations of the Sun's birth cluster in order to predict the current distribution of solar siblings in the Galaxy. We study the possibility of finding the solar siblings in the Gaia catalogue by using only positional and kinematic information. We find that the number of solar siblings predicted to be observed by Gaia will be around 100 in the most optimistic case, and that a phase space only search in the Gaia catalogue will be extremely difficult. It is therefore mandatory to combine the chemical tagging technique with phase space selection criteria in order to have any hope of finding the solar siblings.

  9. The solar siblings in the Gaia era

    CERN Document Server

    Martínez-Barbosa, C A; Zwart, S Portegies

    2015-01-01

    We perform realistic simulations of the Sun's birth cluster in order to predict the current distribution of solar siblings in the Galaxy. We study the possibility of finding the solar siblings in the Gaia catalogue by using only positional and kinematic information. We find that the number of solar siblings predicted to be observed by Gaia will be around 100 in the most optimistic case, and that a phase space only search in the Gaia catalogue will be extremely difficult. It is therefore mandatory to combine the chemical tagging technique with phase space selection criteria in order to have any hope of finding the solar siblings.

  10. Recent trends of extreme temperature indices for the Iberian Peninsula

    Science.gov (United States)

    Fonseca, D.; Carvalho, M. J.; Marta-Almeida, M.; Melo-Gonçalves, P.; Rocha, A.

    2016-08-01

    Climate change and extreme climate events have a significant impact on societies and ecosystems. As a result, climate change projections, especially those related to extreme temperature events, have gained increasing importance due to their impacts on the well-being of the population and ecosystems. However, most studies in the field are based on coarse global climate models (GCMs). In this study, we perform a high resolution downscaling simulation to evaluate recent trends of extreme temperature indices. The model used was the Weather Research and Forecasting (WRF) model forced by MPI-ESM-LR, which has been shown to be one of the more robust models for simulating European climate. The domain used in the simulations includes the Iberian Peninsula and the simulation covers the 1986-2005 period (i.e., the recent past). In order to study extreme temperature events, trends were computed using the Theil-Sen method for a set of temperature indices defined by the Expert Team on Climate Change Detection and Indices (ETCCDI). For this, daily values of minimum and maximum temperatures were used. The trends of the indices were computed for annual and seasonal values and the Mann-Kendall trend test was used to evaluate their statistical significance. In order to validate the results, a second simulation, in which WRF was forced by ERA-Interim, was performed. The results suggest an increase in the number of warm days and warm nights, especially during summer, and negative trends for cold nights and cold days in summer and spring. For winter, contrary to expectations, the results suggest an increase in cold days and cold nights (a warming hiatus). This behavior is supported by the WRF simulation forced by ERA-Interim for autumn days, pointing to an extension of the warming hiatus phenomenon to the remaining seasons. These results should be used with caution since the period used to calculate the trends may not be long enough for this purpose. However, the general signs of the trends are similar for
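    A minimal sketch of the trend-estimation and significance-testing steps named above (a Theil-Sen slope with a Mann-Kendall test) applied to a single annual index series is given below; the index values are synthetic and the Mann-Kendall implementation is a generic one without tie or autocorrelation corrections, not the authors' code.

        import numpy as np
        from scipy.stats import norm, theilslopes

        def mann_kendall_p(x):
            """Two-sided Mann-Kendall trend test p-value (no tie correction)."""
            x = np.asarray(x)
            n = len(x)
            s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
            var_s = n * (n - 1) * (2 * n + 5) / 18.0
            z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
            return 2 * (1 - norm.cdf(abs(z)))

        rng = np.random.default_rng(2)
        years = np.arange(1986, 2006)
        warm_days = 30 + 0.4 * (years - years[0]) + rng.normal(0, 3, years.size)  # synthetic index

        slope, intercept, lo, hi = theilslopes(warm_days, years)
        print(f"Theil-Sen slope: {slope:.2f} days/yr (95% CI {lo:.2f} to {hi:.2f})")
        print(f"Mann-Kendall p-value: {mann_kendall_p(warm_days):.3f}")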

  11. War and peace in the Internet era

    Directory of Open Access Journals (Sweden)

    Josep M. Porta Fabregat

    2004-04-01

    Full Text Available This article looks to find the ideological causes that lead human beings to war or peace nowadays, in the Internet era. This proposal is worthy of study as war is not a need in terms of human nature or history: we are capable of war and peace simultaneously. However, why does war survive if we are able to live in peace? In our opinion, the actual cause of conflict is fanaticism. This phenomenon comes from the perversion of the two bases of our civilisation: liberty and rationality. This twofold perversion leads us to believe that we are the Absolute, or at least its instrument.Since the fall of the Berlin wall, this kind of fanaticism has come from the generalised conviction that we are at the "end of history"; in this light, one can conclude that this irrationality is definitive and, thus, that any efforts to achieve world peace are useless. However, we believe that the formula for peace can only be derived from reflection and the effective extension around the world of a technical medium that makes communication between all men possible. This would be able to resolve all the perversions of liberty and rationality and make people aware of the infinite distance between us and the Absolute. However, this reflection is not enough. For this awareness to triumph, the technical and ideological situation represented by the Internet has to spread over the whole planet: liberty for those taking part, rationality to allow for communication among all those connected and universal access. This is the moral trend for the Internet, which in itself encourages progress towards world peace.

  12. Kinesiogenomics: The Genomic Era in Kinesiology

    Directory of Open Access Journals (Sweden)

    Onur ORAL

    2015-08-01

    Full Text Available The application of genomics research in kinesiology may be characterized as kinesiogenomics or sport genetics. Kinesiogenomics is therefore generally known as the study of genetics in the various disciplines of kinesiology, which refers to the study of human movement. In the research field of kinesiogenomics, the scientific goal is to clarify the role of genes in sport performance and to identify various genes, with their different alleles, that affect the response and adaptation of body systems and metabolism. With the help of these genetic research studies in the various disciplines of kinesiology, it will be possible to use genetic testing to predict sport performance or to individualize exercise prescription, with the potential for genetic therapy. Sport performance is expected to be enhanced because the interindividual variation of kinesiogenomic-related traits such as maximal oxygen consumption, muscle fibre composition, and trainability has a strong genetic basis. Genetic research has a large potential for developing the area of kinesiology through the contribution of genetic factors to various phenotypes related to sport performance, including aerobic and anaerobic performance, muscular endurance and strength, motor performance, and some determinants of performance such as morphological, cardiac, and skeletal muscle characteristics. Up till now, kinesiogenomics research studies have moved into a new era utilizing well-phenotyped, large cohorts and genome-wide technologies. The main aim of this review is to summarize the most recent and significant genetic findings in the various disciplines of kinesiology and to predict future expectations and possibilities for kinesiogenomics.

  13. [Prolactinoma treatment status in the cabergoline era].

    Science.gov (United States)

    Watanabe, Shinya; Takano, Shingo; Akutsu, Hiroyoshi; Sato, Hiroshige; Matsumura, Akira

    2011-11-01

    The aim of our study is to report the most adequate therapy for prolactinoma in the cabergoline era. From 2003 to 2009, 27 patients with prolactinoma were treated at our hospital. Patients were categorized into 2 groups. The Cabergoline Group: cabergoline was administered for 5 years and then discontinued. Under this protocol, a case with a normal prolactin level and no visible tumor more than 24 months after the discontinuation of cabergoline was judged as cured. The Operation Group: transsphenoidal surgery (TSS) was performed first. In the Cabergoline Group, 12 cases were cured with 5 years of cabergoline treatment (Cure) and 6 cases were not cured (Not cure). We compared the pretreatment prolactin level, the normalization of the serum level of prolactin, the degree of invasiveness on MRI, regression of the tumor during treatment on MRI, the maximum dose of cabergoline, the degree of pituitary hormone replacement, the frequency of pregnancy, and the follow-up periods between the Cabergoline-cure group, the Cabergoline-not-cure group, and the Operation Group. The normalization rate of the serum prolactin level and the cure rate were 91% and 63%, respectively, in the Cabergoline Group. The pretreatment prolactin level and the frequency of tumor invasiveness on initial MRI were significantly higher in the Cabergoline-not-cure group than in the Cabergoline-cure group. All five women who became pregnant after treatment belonged to the Cabergoline-cure group. In the Operation Group, all 4 cases achieved normalization of the serum prolactin level without visible tumor and with normal pituitary function. Cabergoline for prolactinoma is effective, but the cure rate with continuous use of cabergoline for 5 years was 67%. The factors indicating that cabergoline and/or TSS can cure a prolactinoma are a non-invasive tumor and a pretreatment prolactin level under 200 ng/mL.

  14. Politisasi Birokrasi Pemerintahan Desa Pada Era Reformasi

    Directory of Open Access Journals (Sweden)

    R Widodo Triputro

    2015-12-01

    Full Text Available Public bureaucracy holds a strategic position in the implementation of government, as well as in efforts toward democratization and autonomy at the local and village government levels. A professional bureaucratic apparatus strongly supports improvement in the quality of public services, particularly for social empowerment as the realization of the essence of local and village autonomy. The concept of bureaucratic neutrality needs to be brought into reality in order to promote a bureaucracy that is more oriented toward its main function, namely serving as an apparatus of public service. The long history of Indonesian bureaucracy reflects the recurring politicization of the bureaucracy by the governing regime, with the result that all lines of the bureaucracy become an administrative tool for the centralization of authority. As a consequence, service tends to be directed toward the government (the patron) while neglecting the public service function. This includes the village government sphere, in which the bureaucracy becomes a political machine and, at the same time, serves as an effective controlling tool that limits social access to the public arena. The outcome of a case study conducted in one village of Bantul Regency, with data obtained from government officials and prominent figures in both the regency and village government areas, reveals that the politicization of the village government bureaucracy nowadays is much stronger than it was under the New Order era. On the pretext of democratization and social empowerment, the government (i.e., the regent and his political party) carries out a set of politicization measures in the village government. Given the condition of limited village resources, political euphoria, and conflict resulting from the election process of the village government bureaucratic apparatus, the government intervenes in the village government and its community. A patron-client relation forms between the government and the village government and its community. It is evidenced that the politicization of the village government bureaucracy is being carried out again, among others through the establishment of "Paguyuban Pamong" with its

  15. EPE The Extreme Physics Explorer

    Science.gov (United States)

    Garcia, Michael; Elvis, Martin; Bookbinder, Jay; Brenneman, Laura; Bulbul, Esra; Nulsen, Paul; Patnaude, Dan; Smith, Randall; Bandler, Simon; Okajima, Takashi; Ptak, Andy; Figueroa-Feliciano, Enectali; Chakrabarty, Deepto; Danner, Rolf; Daily, Dean; Fraser, George; Willingale, Richard; Miller, Jon; Turner, T. J.; Risalti, Guido; Galeazzi, Massimiliano

    2012-01-01

    The Extreme Physics Explorer (EPE) is a mission concept that will address fundamental and timely questions in astrophysics which are primary science objectives of IXO. The reach of EPE to the areas outlined in NASA RFI NNH11ZDA018L is shown as a table. The dark green indicates areas in which EPE can do the basic IXO science, and the light green areas where EPE can contribute but will not reach the full IXO capability. To address these science questions, EPE will trace orbits close to the event horizon of black holes, measure black hole spin in active galactic nuclei (AGN), use spectroscopy to characterize outflows and the environment of AGN, map bulk motions and turbulence in galaxy clusters, and observe the process of cosmic feedback where black holes inject energy on galactic and intergalactic scales. EPE gives up the high resolution imaging of IXO in return for lightweight, high TRL foil mirrors which will provide >20 times the effective area of ASTRO-H and similar spatial resolution, with a beam sufficient to study point sources and nearby galaxies and clusters. Advances in micro-calorimeters allow improved performance at high rates with twice the energy resolution of ASTRO-H. A lower TRL option would provide 200 times the area of ASTRO-H using a micro-channel plate optic (MCPO) and a deployable optical bench. Both options are in the middle range of RFI missions at between $600M and $1000M. The EPE foil optic has direct heritage to ASTRO-H, allowing robust cost estimates. The spacecraft is entirely off the shelf and introduces no difficult requirements. The mission could be started and launched in this decade to an L2 orbit, with a three-year lifetime and consumables for 5 years. While ASTRO-H will give us the first taste of high-resolution, non-dispersive X-ray spectroscopy, it will be limited to small numbers of objects in many categories. EPE will give us the first statistically significant samples in each of these categories.

  16. Influence of extreme weather disasters on global crop production.

    Science.gov (United States)

    Lesk, Corey; Rowhani, Pedram; Ramankutty, Navin

    2016-01-07

    In recent years, several extreme weather disasters have partially or completely damaged regional crop production. While detailed regional accounts of the effects of extreme weather disasters exist, the global scale effects of droughts, floods and extreme temperature on crop production are yet to be quantified. Here we estimate for the first time, to our knowledge, national cereal production losses across the globe resulting from reported extreme weather disasters during 1964-2007. We show that droughts and extreme heat significantly reduced national cereal production by 9-10%, whereas our analysis could not identify an effect from floods and extreme cold in the national data. Analysing the underlying processes, we find that production losses due to droughts were associated with a reduction in both harvested area and yields, whereas extreme heat mainly decreased cereal yields. Furthermore, the results highlight ~7% greater production damage from more recent droughts and 8-11% more damage in developed countries than in developing ones. Our findings may help to guide agricultural priorities in international disaster risk reduction and adaptation efforts.

  17. Gamma-Ray Bursts in the Swift Era

    CERN Document Server

    Gehrels, N; Fox, D B; 10.1146/annurev.astro.46.060407.145147

    2009-01-01

    With its rapid-response capability and multiwavelength complement of instruments, the Swift satellite has transformed our physical understanding of gamma-ray bursts (GRBs). Providing high-quality observations of hundreds of bursts, and facilitating a wide range of follow-up observations within seconds of each event, Swift has revealed an unforeseen richness in observed burst properties, shed light on the nature of short-duration bursts, and helped realize the promise of GRBs as probes of the processes and environments of star formation out to the earliest cosmic epochs. These advances have opened new perspectives on the nature and properties of burst central engines, interactions with the burst environment from microparsec to gigaparsec scales, and the possibilities for non-photonic signatures. Our understanding of these extreme cosmic sources has thus advanced substantially; yet more than 40 years after their discovery, GRBs continue to present major challenges on both observational and theoretical fronts.

  18. Mid-Latitude Circulation and Extremes in a Changing Climate

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Gang [Cornell Univ., Ithaca, NY (United States)

    2016-08-04

    Mid-latitude extreme weather events are responsible for a large part of climate-related damage. Yet large uncertainties remain in climate model projections of heat waves, droughts, and heavy rain/snow events on regional scales, limiting our ability to effectively use these projections for climate adaptation and mitigation. These uncertainties can be attributed to both the lack of spatial resolution in the models, and to the lack of a dynamical understanding of these extremes. The approach of this project is to relate the fine-scale features to the large scales in current climate simulations, seasonal re-forecasts, and climate change projections in a very wide range of models, including the atmospheric and coupled models of ECMWF over a range of horizontal resolutions (125 to 10 km), aqua-planet configuration of the Model for Prediction Across Scales and High Order Method Modeling Environments (resolutions ranging from 240 km – 7.5 km) with various physics suites, and selected CMIP5 model simulations. The large scale circulation will be quantified both on the basis of the well tested preferred circulation regime approach, and very recently developed measures, the finite amplitude Wave Activity (FAWA) and its spectrum. The fine scale structures related to extremes will be diagnosed following the latest approaches in the literature. The goal is to use the large scale measures as indicators of the probability of occurrence of the finer scale structures, and hence extreme events. These indicators will then be applied to the CMIP5 models and time-slice projections of a future climate.

  19. Temperature extremes in Western Europe and associated atmospheric anomalies

    Science.gov (United States)

    Carvalho, V. A.; Santos, J. A.

    2009-09-01

    This work's focal point is the analysis of temperature extremes over Western Europe in the period 1957-2007 and their relationship to large-scale anomalies in atmospheric circulation patterns. The study is based on daily temperature time series recorded at a set of meteorological stations covering the target area. The large-scale anomalies are analyzed using data from the National Centers for Environmental Prediction reanalysis project. Firstly, a preliminary statistical analysis was undertaken in order to identify data gaps and erroneous values and to check the homogeneity of the time series, using not only elementary statistical approaches (e.g., chronograms, box-plots, scatter-plots), but also a set of non-parametric statistical tests particularly suitable for the analysis of monthly and seasonal mean temperature time series (e.g., the Wald-Wolfowitz serial correlation test, and the Spearman and Mann-Kendall trend tests). Secondly, based on these results, a selection of the highest quality time series was carried out. Aiming at identifying temperature extremes, we then proceeded to isolate months with temperature values above or below pre-selected thresholds based on the empirical distribution of each time series. In particular, thresholds are based on percentiles computed specifically for each individual temperature record (data adaptive) and not on fixed values. As a result, a calendar of extremely high and extremely low monthly mean temperatures is obtained and the large-scale atmospheric conditions during each extreme are analyzed. Several atmospheric fields are considered in this study (e.g., 2-m maximum and minimum air temperature, sea level pressure, geopotential height, zonal and meridional wind components, vorticity, relative humidity) at different isobaric levels. Results show remarkably different synoptic conditions for temperature extremes in different parts of Western Europe, highlighting the different dynamical mechanisms underlying their
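    A small sketch of the data-adaptive thresholding step described above is given below: months whose mean temperature falls above the 95th or below the 5th percentile of the station's own record are flagged as extremes. The percentile levels and the synthetic series are illustrative; an actual analysis would typically compute thresholds separately for each calendar month and station.

        import numpy as np

        rng = np.random.default_rng(3)
        # synthetic monthly mean temperatures for one station, 1957-2007 (51 years)
        months = np.arange(51 * 12)
        monthly_t = 15 + 8 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 2, months.size)

        # thresholds come from the station's own empirical distribution (data adaptive)
        hi_thr, lo_thr = np.percentile(monthly_t, [95, 5])

        hot_months = np.flatnonzero(monthly_t > hi_thr)
        cold_months = np.flatnonzero(monthly_t < lo_thr)
        print(f"hot-extreme months: {hot_months.size}, cold-extreme months: {cold_months.size}")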

  20. Coastal Cover Change Analysis Program (C-CAP) Great Lakes 1996-era and 2001-era land cover change analysis (NODC Accession 0042437)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains the 1996-era and 2001-era classifications of the Great Lakes and can be used to analyze change. This imagery was collected as part of the...