WorldWideScience

Sample records for extreme scale era

  1. Software challenges in extreme scale systems

    International Nuclear Information System (INIS)

    Sarkar, Vivek; Harrod, William; Snavely, Allan E

    2009-01-01

    Computer systems anticipated in the 2015-2020 timeframe are referred to as Extreme Scale because they will be built using massive multi-core processors with hundreds of cores per chip. The largest capability Extreme Scale system is expected to deliver exascale performance on the order of 10^18 operations per second. These systems pose new critical challenges for software in the areas of concurrency, energy efficiency, and resiliency. In this paper, we discuss the implications of the concurrency and energy efficiency challenges for future software on Extreme Scale systems. From an application viewpoint, the concurrency and energy challenges boil down to the ability to express and manage parallelism and locality by exploring a range of strong-scaling and new-era weak-scaling techniques. For expressing parallelism and locality, the key challenge is to expose all of the intrinsic parallelism and locality in a programming model, while ensuring that this expression is portable across a range of systems. For managing parallelism and locality, the OS-related challenges include parallel scalability, spatial partitioning of OS and application functionality, direct hardware access for inter-processor communication, and asynchronous rather than interrupt-driven events; these are accompanied by runtime system challenges in scheduling, synchronization, memory management, communication, performance monitoring, and power management. We conclude by discussing the importance of software-hardware co-design in addressing the fundamental challenges for application enablement on Extreme Scale systems.

  2. Gravo-Aeroelastic Scaling for Extreme-Scale Wind Turbines

    Energy Technology Data Exchange (ETDEWEB)

    Fingersh, Lee J [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Loth, Eric [University of Virginia]; Kaminski, Meghan [University of Virginia]; Qin, Chao [University of Virginia]; Griffith, D. Todd [Sandia National Laboratories]

    2017-06-09

    A scaling methodology is described in the present paper for extreme-scale wind turbines (rated at 10 MW or more) that allows sub-scale turbines to capture the key blade dynamics and aeroelastic deflections of their full-scale counterparts. For extreme-scale turbines, such deflections and dynamics can be substantial and are primarily driven by centrifugal, thrust, and gravity forces as well as the net torque. Each of these is in turn a function of various wind conditions, including turbulence levels that cause shear, veer, and gust loads. The 13.2 MW rated SNL100-03 rotor design, having a blade length of 100 meters, is herein scaled to the CART3 wind turbine at NREL using 25% geometric scaling, with blade mass and wind speed scaled by gravo-aeroelastic constraints. To mimic the ultralight structure of the advanced-concept extreme-scale design, the scaling results indicate that the gravo-aeroelastically scaled blades for the CART3 would be three times lighter and 25% longer than the current CART3 blades. A benefit of this scaling approach is that the scaled wind speeds needed for testing are reduced (in this case by a factor of two), allowing testing under extreme gust conditions to be much more easily achieved. Most importantly, this scaling approach can investigate extreme-scale concepts, including dynamic behaviors and aeroelastic deflections (including flutter), at an extremely small fraction of the full-scale cost.
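    The reported factor-of-two reduction in test wind speed follows from gravo-aeroelastic (Froude-like) scaling, in which wind speed scales with the square root of the geometric scale factor. A minimal sketch (the rated wind speed used here is a hypothetical placeholder, not a value from the paper):

```python
import math

def froude_scaled_speed(full_scale_speed, length_scale):
    """Froude-like scaling: to keep gravity and aerodynamic loads in
    proportion, wind speed scales with sqrt(geometric scale factor)."""
    return full_scale_speed * math.sqrt(length_scale)

# 25% geometric scale, as in the SNL100-03 -> CART3 example above
print(froude_scaled_speed(11.3, 0.25))  # 5.65 -- half the full-scale speed
```

    With a 25% geometric scale, sqrt(0.25) = 0.5, which matches the halved test wind speeds described in the abstract.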

  3. El Nino, from 1870 to 2014, and other Atmospheric Circulation Forcing by Extreme Apparitions of the Eight Annual, Continental Scale, Aerosol Plumes in the Satellite Era which Point to a Possible Cause for the Current Californian Drought

    Science.gov (United States)

    Potts, K. A.

    2015-12-01

    Eight continental-scale aerosol plumes exist each year as the enclosed image shows. Apparitions of seven plumes only exist for a few months in the same season each year whilst the East Asian Plume is visible all year. The aerosol optical depth (AOD) of all the plumes varies enormously interannually, with two studies showing the surface radiative forcing of the South East Asian Plume (SEAP) as -150 W/m2 and -286 W/m2/AOD. I show that the SEAP, created by volcanic aerosols (natural) and biomass burning and gas flares in the oil industry (anthropogenic), is the sole cause of all El Nino events, the greatest interannual perturbation of the atmospheric circulation system. The SEAP creates an El Nino by absorbing solar radiation at the top of the plume, which heats the upper atmosphere and cools the surface. This creates a temperature inversion compared to periods without the plume and reduces convection. With reduced convection in SE Asia, the Maritime Continent, the Trade Winds blowing across the Pacific are forced to relax as their exit into the Hadley and Walker Cells is constrained, and the reduced Trade Wind speed causes the Sea Surface Temperature (SST) to rise in the central tropical Pacific Ocean, as there is a strong negative correlation between wind speed and SST. The warmer SST in the central Pacific creates convection in the region which further reduces the Trade Wind speed and causes the Walker Cell to reverse - a classic El Nino. Having established the ability of such extreme aerosol plumes to create El Nino events, I will then show how the South American, West African, Middle East and SEAP plumes create drought in the Amazon, Spain, Darfur and Australia, as well as causing the extremely warm autumn and winter in Europe in 2006-07. All these effects are created by the plumes reducing convection in the region of the plume, which forces the regional Hadley Cells into anomalous positions, thereby creating persistent high pressure cells in the mid latitudes. This

  4. Climatic forecast: down-scaling and extremes

    International Nuclear Information System (INIS)

    Deque, M.; Li, L.

    2007-01-01

    There is a strong demand for specifying the future climate at local scale and about extreme events. New methods, allowing a better output from the climate models, are currently being developed and French laboratories involved in the Escrime project are actively participating. (authors)

  5. Extreme-scale Algorithms and Solver Resilience

    Energy Technology Data Exchange (ETDEWEB)

    Dongarra, Jack [Univ. of Tennessee, Knoxville, TN (United States)]

    2016-12-10

    A widening gap exists between the peak performance of high-performance computers and the performance achieved by complex applications running on these platforms. Over the next decade, extreme-scale systems will present major new challenges to algorithm development that could amplify this mismatch in such a way that it prevents the productive use of future DOE Leadership computers, due to the following: extreme levels of parallelism due to multicore processors; an increase in system fault rates, requiring algorithms to be resilient beyond just checkpoint/restart; complex memory hierarchies and costly data movement in both energy and performance; heterogeneous system architectures (mixing CPUs, GPUs, etc.); and conflicting goals of performance, resilience, and power requirements.

  6. Frameworks for visualization at the extreme scale

    International Nuclear Information System (INIS)

    Joy, Kenneth I; Miller, Mark; Childs, Hank; Bethel, E Wes; Clyne, John; Ostrouchov, George; Ahern, Sean

    2007-01-01

    The challenges of visualization at the extreme scale involve issues of scale, complexity, temporal exploration, and uncertainty. The Visualization and Analytics Center for Enabling Technologies (VACET) focuses on leveraging scientific visualization and analytics software technology as an enabling technology for increased scientific discovery and insight. In this paper, we introduce new uses of visualization frameworks through the introduction of Equivalence Class Functions (ECFs). These functions give a new class of derived quantities designed to greatly expand the ability of the end user to explore and visualize data. ECFs are defined over equivalence classes (i.e., groupings) of elements from an original mesh, and produce summary values for the classes as output. ECFs can be used in the visualization process to directly analyze data, or can be used to synthesize new derived quantities on the original mesh. The design of ECFs enables a parallel implementation that allows the use of these techniques on massive data sets that require parallel processing.
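    As described, an ECF maps equivalence classes (groupings) of mesh elements to per-class summary values. A minimal serial sketch of that idea (the paper's framework is parallel; all names here are illustrative):

```python
from collections import defaultdict

def equivalence_class_function(values, class_of, summary=sum):
    """Group per-element values into equivalence classes via class_of, then
    compute one summary value per class -- the essence of an ECF."""
    classes = defaultdict(list)
    for element, value in values.items():
        classes[class_of(element)].append(value)
    return {c: summary(v) for c, v in classes.items()}

# toy mesh: element id -> field value; classes = parity of element id
temps = {0: 1.0, 1: 2.0, 2: 3.0, 3: 4.0}
print(equivalence_class_function(temps, class_of=lambda e: e % 2))
# {0: 4.0, 1: 6.0}
```

    Swapping `summary` for `max`, `min`, or a mean yields other derived quantities over the same classes.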

  7. Extreme-Scale De Novo Genome Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Georganas, Evangelos [Intel Corporation, Santa Clara, CA (United States)]; Hofmeyr, Steven [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Joint Genome Inst.]; Egan, Rob [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division]; Buluc, Aydin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Joint Genome Inst.]; Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Joint Genome Inst.]; Rokhsar, Daniel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division]; Yelick, Katherine [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Joint Genome Inst.]

    2017-09-26

    De novo whole-genome assembly reconstructs genomic sequence from short, overlapping, and potentially erroneous DNA segments and is one of the most important computations in modern genomics. This work presents HipMer, a high-quality end-to-end de novo assembler designed for extreme-scale analysis, via efficient parallelization of the Meraculous code. Genome assembly software has many components, each of which stresses different parts of a computer system. This chapter explains the computational challenges involved in each step of the HipMer pipeline, the key distributed data structures, and the communication costs in detail. We present performance results of assembling the human genome and the large hexaploid wheat genome on large supercomputers using up to tens of thousands of cores.
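    The first stage of assemblers in the Meraculous/HipMer lineage is k-mer analysis over all reads. A serial toy sketch of k-mer counting (HipMer itself distributes this step across nodes with a distributed hash table; the reads below are made up):

```python
from collections import Counter

def count_kmers(reads, k):
    """Count all length-k substrings (k-mers) across a set of reads --
    the counting stage that precedes contig construction in assembly."""
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return counts

counts = count_kmers(["GATTACA", "TTACAG"], 3)
print(counts["TTA"])  # 2 -- this 3-mer occurs in both reads
```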

  8. Quantum universe on extremely small space-time scales

    International Nuclear Information System (INIS)

    Kuzmichev, V.E.; Kuzmichev, V.V.

    2010-01-01

    The semiclassical approach to the quantum geometrodynamical model is used to describe the properties of the Universe on extremely small space-time scales. Under this approach, the matter in the Universe has two components of a quantum nature which behave as antigravitating fluids. The first component does not vanish in the limit h → 0 and can be associated with dark energy. The second component is described by an extremely rigid equation of state and goes to zero after the transition to large space-time scales. On small space-time scales, this quantum correction turns out to be significant. It determines the geometry of the Universe near the initial cosmological singularity point. This geometry is conformal to a unit four-sphere embedded in a five-dimensional Euclidean flat space. During the subsequent expansion of the Universe, on reaching the post-Planck era, the geometry of the Universe changes into one conformal to a unit four-hyperboloid in a five-dimensional Lorentz-signatured flat space. This agrees with the hypothesis about the possible change of geometry after the origin of the expanding Universe from the region near the initial singularity point. The origin of the Universe can be interpreted as a quantum transition of the system from a region in the phase space forbidden for classical motion, but where a trajectory in imaginary time exists, into a region where the equations of motion have the solution describing the evolution of the Universe in real time. Near the boundary between the two regions, from the side of real time, the Universe undergoes an almost exponential expansion which passes smoothly into the expansion under the action of radiation dominating over matter, as described by the standard cosmological model.

  9. LEMON - LHC Era Monitoring for Large-Scale Infrastructures

    International Nuclear Information System (INIS)

    Babik, Marian; Hook, Nicholas; Lansdale, Thomas Hector; Lenkes, Daniel; Siket, Miroslav; Waldron, Denis; Fedorko, Ivan

    2011-01-01

    At the present time computer centres are facing a massive rise in virtualization and cloud computing, as these solutions bring advantages to service providers and consolidate computer centre resources. As a result, however, monitoring complexity is increasing. Computer centre management requires monitoring not only of servers, network equipment, and associated software, but also of additional environment and facilities data (e.g. temperature, power consumption, cooling efficiency) in order to maintain a good overview of infrastructure performance. The LHC Era Monitoring (Lemon) system addresses these requirements for a very large scale infrastructure. The Lemon agent, which collects data on every client and forwards the samples to the central measurement repository, provides a flexible interface that allows rapid development of new sensors. The system can also report on behalf of remote devices such as switches and power supplies. Online and historical data can be visualized via a web-based interface or retrieved via command-line tools. The Lemon Alarm System component can be used to notify the operator about error situations. In this article, an overview of Lemon monitoring is provided together with a description of the CERN LEMON production instance. No direct comparison is made with other monitoring tools.

  10. The Spatial Scaling of Global Rainfall Extremes

    Science.gov (United States)

    Devineni, N.; Xi, C.; Lall, U.; Rahill-Marier, B.

    2013-12-01

    Floods associated with severe storms are a significant source of risk for property, life, and supply chains. These property losses tend to be determined as much by the duration of flooding as by the depth and velocity of inundation. High-duration floods are typically induced by persistent rainfall (up to 30-day duration), as seen recently in Thailand, Pakistan, the Ohio and Mississippi Rivers, France, and Germany. Events related to persistent and recurrent rainfall appear to correspond to the persistence of specific global climate patterns that may be identifiable from global historical data fields, and also from climate models that project future conditions. A clear understanding of the space-time rainfall patterns for events or for a season will help in assessing the spatial distribution of areas likely to have a high/low inundation potential for each type of rainfall forcing. In this paper, we investigate the statistical properties of the spatial manifestation of rainfall exceedances. We also investigate the connection of persistent rainfall events at different latitudinal bands to large-scale climate phenomena such as ENSO. Finally, we present the scaling phenomena of contiguous flooded areas as a result of large-scale organization of long-duration rainfall events. This can be used for spatially distributed flood risk assessment conditional on a particular rainfall scenario. Statistical models for spatio-temporal loss simulation, including model uncertainty, can be developed to support regional and portfolio analysis.

  11. Scaling a Survey Course in Extreme Weather

    Science.gov (United States)

    Samson, P. J.

    2013-12-01

    "Extreme Weather" is a survey-level course offered at the University of Michigan that is broadcast via the web and serves as a research testbed to explore best practices for large-class conduct. The course has led to the creation of LectureTools, a web-based student response and note-taking system that has been shown to increase student engagement dramatically in multiple courses by giving students more opportunities to participate in class. Included in this is the capacity to pose image-based questions (e.g., asking on a weather map, "Where would you expect winds from the south?") as well as multiple-choice, ordered-list, free-response, and numerical questions. Research in this class has also explored differences in learning outcomes between those who participate remotely and those who physically come to class, and found little difference. Moreover, the technologies used allow instructors to conduct class from wherever they are, while the students can still answer questions and engage in class discussion from wherever they are. This presentation will use LectureTools to demonstrate its features. Attendees are encouraged to bring a mobile device to the session to participate.

  12. Asynchronous schemes for CFD at extreme scales

    Science.gov (United States)

    Konduri, Aditya; Donzis, Diego

    2013-11-01

    Recent advances in computing hardware and software have made simulations an indispensable research tool for understanding fluid flow phenomena in complex conditions at great detail. Due to the nonlinear nature of the governing NS equations, simulations of high-Re turbulent flows are computationally very expensive and demand extreme levels of parallelism. Current large simulations are being done on hundreds of thousands of processing elements (PEs). Benchmarks from these simulations show that communication between PEs takes a substantial amount of time, overwhelming the compute time and resulting in substantial waste of compute cycles as PEs remain idle. We investigate a novel approach based on widely used finite-difference schemes in which computations are carried out asynchronously, i.e. synchronization of data among PEs is not enforced and computations proceed regardless of the status of messages. This drastically reduces PE idle time and results in much larger computation rates. We show that while these schemes remain stable, their accuracy is significantly affected. We present new schemes that maintain accuracy under asynchronous conditions and provide a viable path towards exascale computing. Performance of these schemes will be shown for simple models like Burgers' equation.
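    The core idea, advancing a finite-difference update even when neighbor ("halo") data is stale, can be illustrated with a toy explicit diffusion step. This is a generic sketch of the concept, not the authors' asynchrony-tolerant schemes:

```python
import numpy as np

def async_diffusion_step(u, halo_stale, nu, dx, dt):
    """One explicit diffusion update in which the two boundary neighbors
    ('halos') come from a possibly stale copy, mimicking a PE that advances
    without waiting for neighbor messages to arrive."""
    un = u.copy()
    r = nu * dt / dx**2
    for i in range(1, len(u) - 1):
        left = halo_stale[i - 1] if i == 1 else u[i - 1]          # stale left halo
        right = halo_stale[i + 1] if i == len(u) - 2 else u[i + 1]  # stale right halo
        un[i] = u[i] + r * (left - 2.0 * u[i] + right)
    return un

u = np.sin(np.linspace(0.0, np.pi, 16))
u_next = async_diffusion_step(u, halo_stale=0.99 * u, nu=0.1, dx=0.1, dt=0.01)
```

    The scheme remains a valid explicit step; the accuracy loss from the stale halos is exactly the effect the paper's new schemes are designed to control.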

  13. Scaling of Precipitation Extremes Modelled by Generalized Pareto Distribution

    Science.gov (United States)

    Rajulapati, C. R.; Mujumdar, P. P.

    2017-12-01

    Precipitation extremes are often modelled with data from annual maximum series or peaks over threshold series. The Generalized Pareto Distribution (GPD) is commonly used to fit the peaks over threshold series. Scaling of precipitation extremes from larger time scales to smaller time scales when the extremes are modelled with the GPD is burdened with difficulties arising from varying thresholds for different durations. In this study, the scale invariance theory is used to develop a disaggregation model for precipitation extremes exceeding specified thresholds. A scaling relationship is developed for a range of thresholds obtained from a set of quantiles of non-zero precipitation of different durations. The GPD parameters and exceedance rate parameters are modelled by the Bayesian approach and the uncertainty in scaling exponent is quantified. A quantile based modification in the scaling relationship is proposed for obtaining the varying thresholds and exceedance rate parameters for shorter durations. The disaggregation model is applied to precipitation datasets of Berlin City, Germany and Bangalore City, India. From both the applications, it is observed that the uncertainty in the scaling exponent has a considerable effect on uncertainty in scaled parameters and return levels of shorter durations.
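    Under a simple-scaling assumption, the GPD scale parameter for a shorter duration follows from a reference duration via the scaling exponent, and return levels follow the standard GPD formula. A sketch with hypothetical parameter values (the paper's Bayesian estimation and quantile-based threshold modification are not reproduced here):

```python
def scaled_gpd_sigma(sigma_ref, d_ref, d, eta):
    """Simple-scaling sketch: GPD scale parameter for duration d obtained
    from a reference duration d_ref via sigma_d = sigma_ref * (d/d_ref)**eta;
    the shape parameter is assumed duration-invariant."""
    return sigma_ref * (d / d_ref) ** eta

def gpd_return_level(threshold, sigma, xi, rate, T):
    """Standard GPD return level for return period T (years), where `rate`
    is the mean number of threshold exceedances per year."""
    return threshold + (sigma / xi) * ((rate * T) ** xi - 1.0)

# hypothetical values: scale a 24 h fit down to 1 h, then get a 50-year level
sigma_1h = scaled_gpd_sigma(sigma_ref=12.0, d_ref=24.0, d=1.0, eta=0.7)
level_50y = gpd_return_level(threshold=20.0, sigma=sigma_1h, xi=0.1, rate=3.0, T=50.0)
```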

  14. Extreme Scale Computing to Secure the Nation

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L; McGraw, J R; Johnson, J R; Frincke, D

    2009-11-10

    absence of nuclear testing, a progam to: (1) Support a focused, multifaceted program to increase the understanding of the enduring stockpile; (2) Predict, detect, and evaluate potential problems of the aging of the stockpile; (3) Refurbish and re-manufacture weapons and components, as required; and (4) Maintain the science and engineering institutions needed to support the nation's nuclear deterrent, now and in the future'. This program continues to fulfill its national security mission by adding significant new capabilities for producing scientific results through large-scale computational simulation coupled with careful experimentation, including sub-critical nuclear experiments permitted under the CTBT. To develop the computational science and the computational horsepower needed to support its mission, SBSS initiated the Accelerated Strategic Computing Initiative, later renamed the Advanced Simulation & Computing (ASC) program (sidebar: 'History of ASC Computing Program Computing Capability'). The modern 3D computational simulation capability of the ASC program supports the assessment and certification of the current nuclear stockpile through calibration with past underground test (UGT) data. While an impressive accomplishment, continued evolution of national security mission requirements will demand computing resources at a significantly greater scale than we have today. In particular, continued observance and potential Senate confirmation of the Comprehensive Test Ban Treaty (CTBT) together with the U.S administration's promise for a significant reduction in the size of the stockpile and the inexorable aging and consequent refurbishment of the stockpile all demand increasing refinement of our computational simulation capabilities. Assessment of the present and future stockpile with increased confidence of the safety and reliability without reliance upon calibration with past or future test data is a long-term goal of the ASC program. This

  15. Censored rainfall modelling for estimation of fine-scale extremes

    Science.gov (United States)

    Cross, David; Onof, Christian; Winter, Hugo; Bernardara, Pietro

    2018-01-01

    Reliable estimation of rainfall extremes is essential for drainage system design, flood mitigation, and risk quantification. However, traditional techniques lack physical realism and extrapolation can be highly uncertain. In this study, we improve the physical basis for short-duration extreme rainfall estimation by simulating the heavy portion of the rainfall record mechanistically using the Bartlett-Lewis rectangular pulse (BLRP) model. Mechanistic rainfall models have had a tendency to underestimate rainfall extremes at fine temporal scales. Despite this, the simple process representation of rectangular pulse models is appealing in the context of extreme rainfall estimation because it emulates the known phenomenology of rainfall generation. A censored approach to Bartlett-Lewis model calibration is proposed and performed for single-site rainfall from two gauges in the UK and Germany. Extreme rainfall estimation is performed for each gauge at the 5, 15, and 60 min resolutions, and considerations for censor selection are discussed.
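    The censored-calibration idea, using only the heavy portion of the record above a censor threshold, can be sketched by computing exceedance statistics of that portion. This is illustrative only (the actual calibration targets BLRP model moments), and the sample depths are made up:

```python
def censored_stats(series, censor):
    """Statistics of the heavy portion of a rainfall record: values at or
    below the censor are ignored, as in a censored calibration."""
    heavy = [x for x in series if x > censor]
    n = len(heavy)
    mean = sum(heavy) / n if n else 0.0
    return {"n_exceedances": n, "mean_excess": mean - censor if n else 0.0}

rain = [0.0, 0.2, 1.5, 0.0, 3.8, 0.1, 2.2]  # hypothetical hourly depths (mm)
print(censored_stats(rain, censor=1.0))
# {'n_exceedances': 3, 'mean_excess': 1.5}
```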

  16. Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales

    Energy Technology Data Exchange (ETDEWEB)

    Xiu, Dongbin [Univ. of Utah, Salt Lake City, UT (United States)]

    2017-03-03

    The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations on extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately, in high dimensional spaces, resolve stochastic problems with limited smoothness, even containing discontinuities.

  17. Investigating the Scaling Properties of Extreme Rainfall Depth ...

    African Journals Online (AJOL)

    Investigating the Scaling Properties of Extreme Rainfall Depth Series in Oromia Regional State, Ethiopia. ... Science, Technology and Arts Research Journal ... for storm durations ranging from 0.5 to 24 hr observed at a network of rain gauges sited in Oromia regional state were analyzed using an approach based on moments.

  18. Large Scale Influences on Summertime Extreme Precipitation in the Northeastern United States

    Science.gov (United States)

    Collow, Allison B. Marquardt; Bosilovich, Michael G.; Koster, Randal Dean

    2016-01-01

    Observations indicate that over the last few decades there has been a statistically significant increase in precipitation in the northeastern United States and that this can be attributed to an increase in precipitation associated with extreme precipitation events. Here a state-of-the-art atmospheric reanalysis is used to examine such events in detail. Daily extreme precipitation events defined at the 75th and 95th percentile from gridded gauge observations are identified for a selected region within the Northeast. Atmospheric variables from the Modern-Era Retrospective Analysis for Research and Applications, version 2 (MERRA-2), are then composited during these events to illustrate the time evolution of associated synoptic structures, with a focus on vertically integrated water vapor fluxes, sea level pressure, and 500-hectopascal heights. Anomalies of these fields move into the region from the northwest, with stronger anomalies present in the 95th percentile case. Although previous studies show tropical cyclones are responsible for the most intense extreme precipitation events, only 10 percent of the events in this study are caused by tropical cyclones. On the other hand, extreme events resulting from cutoff low pressure systems have increased. The time period of the study was divided in half to determine how the mean composite has changed over time. An arc of lower sea level pressure along the East Coast and a change in the vertical profile of equivalent potential temperature suggest a possible increase in the frequency or intensity of synoptic-scale baroclinic disturbances.

  19. The Era of Kilometer-Scale Neutrino Detectors

    Directory of Open Access Journals (Sweden)

    Francis Halzen

    2013-01-01

    Neutrino astronomy beyond the Sun was first imagined in the late 1950s; by the 1970s, it was realized that kilometer-scale neutrino detectors were required. The first such instrument, IceCube, transforms a cubic kilometer of deep and ultra-transparent Antarctic ice into a particle detector. KM3NeT, an instrument that aims to exploit several cubic kilometers of the deep Mediterranean sea as its detector medium, is in its final design stages. The scientific missions of these instruments include searching for sources of cosmic rays and for dark matter, observing Galactic supernova explosions, and studying the neutrinos themselves. Identifying the accelerators that produce Galactic and extragalactic cosmic rays has been a priority mission of several generations of high-energy gamma-ray and neutrino telescopes; success has been elusive so far. Detecting the gamma-ray and neutrino fluxes associated with cosmic rays reaches a new watershed with the completion of IceCube, the first neutrino detector with sensitivity to the anticipated fluxes. In this paper, we will first revisit the rationale for constructing kilometer-scale neutrino detectors. We will subsequently recall the methods for determining the arrival direction, energy and flavor of neutrinos, and will subsequently describe the architecture of the IceCube and KM3NeT detectors.

  20. Extreme Scale Computing for First-Principles Plasma Physics Research

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Choong-Seock [Princeton University]

    2011-10-12

    World superpowers are in the middle of the “Computnik” race. The US Department of Energy (and National Nuclear Security Administration) wishes to launch exascale computer systems into the scientific (and national security) world by 2018. The objective is to solve important scientific problems and to predict the outcomes using the most fundamental scientific laws, which would not be possible otherwise. Being chosen into the next “frontier” group can be of great benefit to a scientific discipline. An extreme scale computer system requires different types of algorithms and a different programming philosophy from those we have been accustomed to. Only a handful of scientific codes are blessed to be capable of scalable usage of today’s largest computers in operation at petascale (using more than 100,000 cores concurrently). Fortunately, a few magnetic fusion codes are competing well in this race using the “first principles” gyrokinetic equations. These codes are beginning to study the fusion plasma dynamics in full-scale realistic diverted device geometry with its natural nonlinear multiscale character, including the large-scale neoclassical and small-scale turbulence physics, but excluding some ultra-fast dynamics. In this talk, most of the above-mentioned topics will be introduced at an executive level. Representative properties of the extreme scale computers, modern programming exercises to take advantage of them, and different philosophies in the data flows and analyses will be presented. Examples of the multi-scale multi-physics scientific discoveries made possible by solving the gyrokinetic equations on extreme scale computers will be described. Future directions into “virtual tokamak experiments” will also be discussed.

  1. A Network Contention Model for the Extreme-scale Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Engelmann, Christian [ORNL]; Naughton III, Thomas J [ORNL]

    2015-01-01

    The Extreme-scale Simulator (xSim) is a performance investigation toolkit for high-performance computing (HPC) hardware/software co-design. It permits running an HPC application with millions of concurrent execution threads while observing its performance in a simulated extreme-scale system. This paper details a newly developed network modeling feature for xSim that eliminates the shortcomings of the existing network modeling capabilities. The approach takes a different path, implementing network contention and bandwidth capacity modeling with a less synchronous, yet sufficiently accurate, model design. With the new network modeling feature, xSim is able to simulate on-chip and on-node networks with reasonable accuracy and overheads.
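    A network contention model of this general kind can be sketched with a toy latency/bandwidth formula in which concurrent flows share a link's bandwidth equally. This illustrates the modeling idea only; it is not xSim's actual model, and all parameter values are hypothetical:

```python
def transfer_time(msg_bytes, latency_s, bandwidth_bps, concurrent_flows):
    """Toy contention model: a message's transfer time is the link latency
    plus its serialization time, inflated by the number of flows that
    share the link's bandwidth equally."""
    return latency_s + msg_bytes * 8 * concurrent_flows / bandwidth_bps

# 1 MiB message over a 10 Gb/s link shared by 4 flows, with 2 us latency
t = transfer_time(2**20, 2e-6, 10e9, 4)
```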

  2. Temporal and spatial scaling impacts on extreme precipitation

    Science.gov (United States)

    Eggert, B.; Berg, P.; Haerter, J. O.; Jacob, D.; Moseley, C.

    2015-01-01

    Both in the current climate and in the light of climate change, understanding of the causes and risk of precipitation extremes is essential for protection of human life and adequate design of infrastructure. Precipitation extreme events depend qualitatively on the temporal and spatial scales at which they are measured, in part due to the distinct types of rain formation processes that dominate extremes at different scales. To capture these differences, we first filter large datasets of high-resolution radar measurements over Germany (5 min temporally and 1 km spatially) using synoptic cloud observations, to distinguish convective and stratiform rain events. In a second step, for each precipitation type, the observed data are aggregated over a sequence of time intervals and spatial areas. The resulting matrix allows a detailed investigation of the resolutions at which convective or stratiform events are expected to contribute most to the extremes. We analyze where the statistics of the two types differ and discuss at which resolutions transitions occur between dominance of either of the two precipitation types. We characterize the scales at which the convective or stratiform events will dominate the statistics. For both types, we further develop a mapping between pairs of spatially and temporally aggregated statistics. The resulting curve is relevant when deciding on data resolutions where statistical information in space and time is balanced. Our study may hence also serve as a practical guide for modelers, and for planning the space-time layout of measurement campaigns. We also describe a mapping between different pairs of resolutions, possibly relevant when working with mismatched model and observational resolutions, such as in statistical bias correction.
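    The space-time aggregation underlying the resolution matrix described above can be sketched as follows: mean-aggregate a high-resolution (time, x, y) field over blocks of a given temporal and spatial window, then extract the statistic of interest (here the maximum). The toy field is made up:

```python
import numpy as np

def aggregated_max(field, t_win, s_win):
    """Mean-aggregate a (time, x, y) field over blocks of t_win time steps
    and s_win x s_win cells, then return the maximum aggregated value --
    one cell of a space-time resolution matrix."""
    t, nx, ny = field.shape
    tb, xb, yb = t // t_win, nx // s_win, ny // s_win
    blocks = field[:tb * t_win, :xb * s_win, :yb * s_win].reshape(
        tb, t_win, xb, s_win, yb, s_win)
    return float(blocks.mean(axis=(1, 3, 5)).max())

rain = np.arange(16.0).reshape(4, 2, 2)  # toy high-resolution field
print(aggregated_max(rain, t_win=2, s_win=2))  # 11.5
```

    Sweeping `t_win` and `s_win` over a sequence of windows produces the matrix of extreme statistics that the study analyzes per precipitation type.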

  3. Large Scale Meteorological Pattern of Extreme Rainfall in Indonesia

    Science.gov (United States)

    Kuswanto, Heri; Grotjahn, Richard; Rachmi, Arinda; Suhermi, Novri; Oktania, Erma; Wijaya, Yosep

    2014-05-01

    Extreme Weather Events (EWEs) cause negative impacts socially, economically, and environmentally. Given these facts, forecasting EWEs is crucial work. Indonesia has been identified as being among the countries most vulnerable to the risk of natural disasters, such as floods, heat waves, and droughts. Current forecasting of extreme events in Indonesia is carried out by interpreting synoptic maps for several fields without taking into account the link between the observed events in the 'target' area and remote conditions. This situation may cause misidentification of the event, leading to an inaccurate prediction. Grotjahn and Faure (2008) compute composite maps from extreme events (including heat waves and intense rainfall) to help forecasters identify such events in model output. The composite maps show the large-scale meteorological patterns (LSMPs) that occurred during historical EWEs. Vital information about the EWEs can be acquired from studying such maps, in addition to their providing forecaster guidance. Such maps have robust mid-latitude meteorological patterns (for Sacramento and California Central Valley, USA, EWEs). We study the performance of the composite approach for tropical weather conditions such as Indonesia's. Initially, composite maps are developed to identify and forecast extreme weather events in Indramayu district, West Java, the main rice producer in Indonesia, contributing about 60% of the national total rice production. Studying extreme weather events in Indramayu is important since EWEs there affect national agricultural and fisheries activities. During a recent EWE, more than a thousand houses in Indramayu suffered serious flooding, with each home more than one meter underwater. The flood also destroyed a thousand hectares of rice plantings in 5 regencies. Identifying the dates of extreme events is one of the most important steps and has to be carried out carefully. An approach has been applied to identify the

  4. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data, and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space obtained by machine learning techniques, particularly supervised kernel principal component analysis. In this reduced-dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convective activity. We investigate, for individual sites, the exceedance probability at which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs. large-scale).

  5. Neutron Star Astronomy in the era of the European Extremely Large Telescope

    International Nuclear Information System (INIS)

    Mignani, Roberto P.

    2011-01-01

    About 25 isolated neutron stars (INSs) are now detected in the optical domain, mainly thanks to the HST and to VLT-class telescopes. The European Extremely Large Telescope (E-ELT) will yield ∼100 new identifications, many of them from the follow-up of SKA, IXO, and Fermi observations. Moreover, the E-ELT will make it possible to carry out, on a much larger sample, INS observations that still challenge VLT-class telescopes, enabling studies of the structure and composition of the NS interior and of its atmosphere and magnetosphere, as well as searches for debris discs. In this contribution, I outline future perspectives for NS optical astronomy with the E-ELT.

  6. Improving the Performance of the Extreme-scale Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Engelmann, Christian [ORNL; Naughton III, Thomas J [ORNL

    2014-01-01

    Investigating the performance of parallel applications at scale on future high-performance computing (HPC) architectures and the performance impact of different architecture choices is an important component of HPC hardware/software co-design. The Extreme-scale Simulator (xSim) is a simulation-based toolkit for investigating the performance of parallel applications at scale. xSim scales to millions of simulated Message Passing Interface (MPI) processes. The overhead introduced by a simulation tool is an important performance and productivity aspect. This paper documents two improvements to xSim: (1) a new deadlock resolution protocol to reduce the parallel discrete event simulation management overhead and (2) a new simulated MPI message matching algorithm to reduce the oversubscription management overhead. The results clearly show a significant performance improvement, such as reducing the simulation overhead for running the NAS Parallel Benchmark suite inside the simulator from 1,020% to 238% for the conjugate gradient (CG) benchmark and from 102% to 0% for the embarrassingly parallel (EP) benchmark, as well as from 37,511% to 13,808% for CG and from 3,332% to 204% for EP with accurate process failure simulation.
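    The overhead figures quoted above are relative slowdowns. A minimal sketch of the computation, where the wall-clock times are hypothetical values chosen only to reproduce two of the reported CG percentages:

```python
def overhead_pct(t_sim, t_native):
    """Simulation overhead: extra runtime inside the simulator,
    expressed as a percentage of the native runtime."""
    return 100.0 * (t_sim - t_native) / t_native

# Hypothetical times (seconds): 11.20x and 3.38x native runtime
# correspond to the 1,020% and 238% CG overheads reported above.
cg_before = overhead_pct(11.20, 1.0)
cg_after = overhead_pct(3.38, 1.0)
```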

  7. Extreme weather events in southern Germany - Climatological risk and development of a large-scale identification procedure

    Science.gov (United States)

    Matthies, A.; Leckebusch, G. C.; Rohlfing, G.; Ulbrich, U.

    2009-04-01

    Extreme weather events such as thunderstorms, hail, and heavy rain or snowfall can pose a threat to human life and to considerable tangible assets. Yet there is a lack of knowledge about the present-day climatological risk, its economic effects, and its changes due to rising greenhouse gas concentrations. Therefore, parts of the economy that are particularly sensitive to extreme weather events, such as insurance companies and airports, require regional risk analyses, early warning, and prediction systems to cope with such events. Such an attempt is made for southern Germany, in close cooperation with stakeholders. Comparing ERA40 and station data with impact records of Munich Re and Munich Airport, the 90th percentile was found to be a suitable threshold for extreme, impact-relevant precipitation events. Different methods for classifying the causing synoptic situations have been tested on ERA40 reanalyses. An objective scheme for the classification of Lamb's circulation weather types (CWTs) has proved most suitable for correct classification of the large-scale flow conditions. Certain CWTs turned out to be prone to heavy precipitation, while others carry a very low risk of such events. Other large-scale parameters are tested in connection with CWTs to find the combination with the highest skill in identifying extreme precipitation events in climate model data (ECHAM5 and CLM). For example, vorticity advection at 700 hPa shows good results but assumes knowledge of regional orographic particularities. Therefore, ongoing work is focused on additional testing of parameters that indicate deviations from a basic state of the atmosphere, such as the Eady growth rate or the newly developed Dynamic State Index. Evaluation results will be used to estimate the skill of the regional climate model CLM in simulating the frequency and intensity of extreme weather events. Data from the A1B scenario (2000-2050) will be examined for a possible climate change
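    The percentile-threshold step used above can be sketched as follows. The synthetic series, the 0.1 mm wet-day cutoff, and the record length are assumptions for illustration, not the study's station data:

```python
import random

random.seed(1)

# Hypothetical daily station precipitation (mm); zeros are dry days.
series = [random.expovariate(0.2) if random.random() < 0.3 else 0.0
          for _ in range(3650)]

def extreme_threshold(daily, q=0.90, wet_min=0.1):
    """q-th percentile of wet-day amounts, as used above to flag
    impact-relevant extreme precipitation events."""
    wet = sorted(v for v in daily if v >= wet_min)
    return wet[min(int(q * len(wet)), len(wet) - 1)]

threshold = extreme_threshold(series)
extreme_days = [i for i, v in enumerate(series) if v > threshold]
```

    By construction, roughly 10% of wet days exceed the threshold, so a decade of daily data yields on the order of a hundred flagged events.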

  8. Perioperative Optimization of Geriatric Lower Extremity Bypass in the Era of Increased Performance Accountability.

    Science.gov (United States)

    Adkar, Shaunak S; Turley, Ryan S; Benrashid, Ehsan; Lagoo, Sandhya; Shortell, Cynthia K; Mureebe, Leila

    2017-01-01

    The initiation of the Bundled Payments for Care Improvement initiative by the Centers for Medicare and Medicaid Services (CMS) has led to increased financial and performance accountability. As most vascular surgery patients are elderly and reimbursed via CMS, improving their outcomes will be critical for durable financial stability. As a first step in forming a multidisciplinary pathway for elderly vascular patients, we sought to identify modifiable perioperative variables in geriatric patients undergoing lower extremity bypass (LEB). The 2011-2013 LEB-targeted American College of Surgeons National Surgical Quality Improvement Program database was used for this analysis (n = 5316). Patients were stratified by age <65 (n = 2171), 65-74 (n = 1858), 75-84 (n = 1190), and ≥85 (n = 394) years. Comparisons of patient- and procedure-related characteristics and 30-day postoperative outcomes stratified by age groups were performed with Pearson χ2 tests for categorical variables and Wilcoxon rank-sum tests for continuous variables. Increasing age was associated with an increased frequency of cardiopulmonary disease (P < 0.001) and a decreased frequency of diabetes, tobacco use, and prior surgical intervention (P < 0.001). Only 79% and 68% of all patients were on antiplatelet and statin therapies, respectively. Critical limb ischemia occurred more frequently in older patients (P < 0.001). Length of hospital stay, transfusion requirements, and discharge to a skilled nursing facility increased with age (P < 0.001). Thirty-day amputation rates did not differ significantly with age (P = 0.12). Geriatric patients undergoing LEB have unique and potentially modifiable perioperative factors that may improve postoperative outcomes. These

  9. Validity and Reliability of the Upper Extremity Work Demands Scale.

    Science.gov (United States)

    Jacobs, Nora W; Berduszek, Redmar J; Dijkstra, Pieter U; van der Sluis, Corry K

    2017-12-01

    Purpose To evaluate the validity and reliability of the upper extremity work demands (UEWD) scale. Methods Participants from different levels of physical work demands, based on the Dictionary of Occupational Titles categories, were included. A historical database of 74 workers was added for factor analysis. Criterion validity was evaluated by comparing observed and self-reported UEWD scores. To assess structural validity, a factor analysis was executed. For reliability, the difference between two self-reported UEWD scores, the smallest detectable change (SDC), test-retest reliability, and internal consistency were determined. Results Fifty-four participants were observed at work and 51 of them filled in the UEWD twice, with a mean interval of 16.6 days (SD 3.3, range = 10-25 days). Criterion validity of the UEWD scale was moderate (r = .44, p = .001). Factor analysis revealed that 'force and posture' and 'repetition' subscales could be distinguished, with Cronbach's alpha of .79 and .84, respectively. Reliability was good; there was no significant difference between repeated measurements. An SDC of 5.0 was found. Test-retest reliability was good (intraclass correlation coefficient for agreement = .84) and all item-total correlations were >.30. There were two pairs of highly related items. Conclusion Reliability of the UEWD scale was good, but criterion validity was moderate. Based on the current results, a modified UEWD scale (2 items removed, 1 item reworded, divided into 2 subscales) was proposed. Since observation appeared to be an inappropriate gold standard, we advise investigating other types of validity, such as construct validity, in further research.
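    For reference, the internal-consistency and SDC statistics reported above follow standard formulas (Cronbach's alpha; SDC = 1.96·√2·SEM with SEM = SD·√(1−ICC)). A sketch with a made-up score matrix, not the study's data:

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items[i][j] = score of respondent j on item i."""
    k = len(items)
    totals = [sum(col) for col in zip(*items)]
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

def smallest_detectable_change(sd, icc):
    """SDC = 1.96 * sqrt(2) * SEM, with SEM = SD * sqrt(1 - ICC)."""
    return 1.96 * (2 ** 0.5) * sd * ((1 - icc) ** 0.5)

# Made-up scores: 3 items x 4 respondents.
alpha = cronbach_alpha([[3, 4, 2, 5], [2, 4, 2, 4], [3, 5, 1, 4]])
# Illustrative: with a score SD of 5 and the reported ICC of .84,
# the SDC comes out near the 5-point value found in the study.
sdc = smallest_detectable_change(sd=5.0, icc=0.84)
```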

  10. Spatial Scaling of Global Rainfall and Flood Extremes

    Science.gov (United States)

    Devineni, Naresh; Lall, Upmanu; Xi, Chen; Ward, Philip

    2014-05-01

    Floods associated with severe storms are a significant source of risk for property, life, and supply chains. These property losses tend to be determined as much by the duration and spatial extent of flooding as by the depth and velocity of inundation. Long-duration floods are typically induced by persistent rainfall (up to 30-day duration), as seen recently in Thailand, Pakistan, the Ohio and Mississippi River basins, France, and Germany. Events related to persistent and recurrent rainfall appear to correspond to the persistence of specific global climate patterns that may be identifiable from global historical data fields, and also from climate models that project future conditions. In this paper, we investigate the statistical properties of the spatial manifestation of rainfall exceedances and floods. We present the first-ever results of a global analysis of the scaling characteristics of extreme rainfall and flood event durations, volumes, and contiguous flooded areas resulting from the large-scale organization of long-duration rainfall events. Results are organized by latitude and with reference to the phases of ENSO, and reveal surprising invariance across latitude. Speculation as to the potential relation to dynamical factors is presented.

  11. Toward a continuous 405-kyr-calibrated Astronomical Time Scale for the Mesozoic Era

    Science.gov (United States)

    Hinnov, Linda; Ogg, James; Huang, Chunju

    2010-05-01

    Mesozoic cyclostratigraphy is being assembled into a continuous Astronomical Time Scale (ATS) tied to the Earth's cyclic orbital parameters. Recognition of a nearly ubiquitous, dominant ~400-kyr cycling in formations throughout the era has been particularly striking. Composite formations spanning contiguous intervals up to 50 myr clearly express these long-eccentricity cycles, and in some cases, this cycling is defined by third- or fourth-order sea-level sequences. This frequency is associated with the 405-kyr orbital eccentricity cycle, which provides a basic metronome and enables the extension of the well-defined Cenozoic ATS to scale the majority of the Mesozoic Era. This astronomical calibration has a resolution comparable to the 1% to 0.1% precision for radioisotope dating of Mesozoic ash beds, but with the added benefit of providing continuous stratigraphic coverage between dated beds. Extended portions of the Mesozoic ATS provide solutions to long-standing geologic problems of tectonics, eustasy, paleoclimate change, and rates of seafloor spreading.

  12. Spatial Heterogeneity, Scale, Data Character and Sustainable Transport in the Big Data Era

    Science.gov (United States)

    Jiang, Bin

    2018-04-01

    In light of the emergence of big data, I have advocated and argued for a paradigm shift from Tobler's law to scaling law, from Euclidean geometry to fractal geometry, from Gaussian statistics to Paretian statistics, and - more importantly - from Descartes' mechanistic thinking to Alexander's organic thinking. Fractal geometry falls under the third definition of fractal - that is, a set or pattern is fractal if the scaling of far more small things than large ones recurs multiple times (Jiang and Yin 2014) - rather than under the second definition of fractal, which requires a power law between scales and details (Mandelbrot 1982). The new fractal geometry is more towards living geometry that "follows the rules, constraints, and contingent conditions that are, inevitably, encountered in the real world" (Alexander et al. 2012, p. 395), not only for understanding complexity, but also for creating complex or living structure (Alexander 2002-2005). This editorial attempts to clarify why the paradigm shift is essential and to elaborate on several concepts, including spatial heterogeneity (scaling law), scale (or the fourth meaning of scale), data character (in contrast to data quality), and sustainable transport in the big data era.

  13. Differential Juvenile Hormone Variations in Scale Insect Extreme Sexual Dimorphism.

    Directory of Open Access Journals (Sweden)

    Isabelle Mifom Vea

    Full Text Available Scale insects have evolved extreme sexual dimorphism, as demonstrated by sedentary juvenile-like females and ephemeral winged males. This dimorphism is established during the post-embryonic development; however, the underlying regulatory mechanisms have not yet been examined. We herein assessed the role of juvenile hormone (JH on the diverging developmental pathways occurring in the male and female Japanese mealybug Planococcus kraunhiae (Kuwana. We provide, for the first time, detailed gene expression profiles related to JH signaling in scale insects. Prior to adult emergence, the transcript levels of JH acid O-methyltransferase, encoding a rate-limiting enzyme in JH biosynthesis, were higher in males than in females, suggesting that JH levels are higher in males. Furthermore, male quiescent pupal-like stages were associated with higher transcript levels of the JH receptor gene, Methoprene-tolerant and its co-activator taiman, as well as the JH early-response genes, Krüppel homolog 1 and broad. The exposure of male juveniles to an ectopic JH mimic prolonged the expression of Krüppel homolog 1 and broad, and delayed adult emergence by producing a supernumeral pupal stage. We propose that male wing development is first induced by up-regulated JH signaling compared to female expression pattern, but a decrease at the end of the prepupal stage is necessary for adult emergence, as evidenced by the JH mimic treatments. Furthermore, wing development seems linked to JH titers as JHM treatments on the pupal stage led to wing deformation. The female pedomorphic appearance was not reflected by the maintenance of high levels of JH. The results in this study suggest that differential variations in JH signaling may be responsible for sex-specific and radically different modes of metamorphosis.

  14. Scaling and clustering effects of extreme precipitation distributions

    Science.gov (United States)

    Zhang, Qiang; Zhou, Yu; Singh, Vijay P.; Li, Jianfeng

    2012-08-01

    One of the impacts of climate change and human activities on the hydrological cycle is the change in precipitation structure. Closely related to precipitation structure are two characteristics: the volume (m) of wet periods (WPs) and the time interval between WPs, or waiting time (t). Using daily precipitation data for the period 1960-2005 from 590 rain gauge stations in China, these two characteristics are analyzed in terms of the scaling and clustering of precipitation episodes. Our findings indicate that m and t follow similar probability distribution curves, implying that precipitation processes are controlled by similar underlying thermodynamics. Analysis of conditional probability distributions shows a significant dependence of m and t on their previous values of similar volumes, and the dependence tends to be stronger when m is larger or t is longer. A higher probability can thus be expected when high-intensity precipitation is followed by precipitation episodes of similar intensity, and when a long waiting time between WPs is followed by a waiting time of similar duration. This result indicates the clustering of extreme precipitation episodes, so severe droughts or floods are apt to occur in groups.
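    The two characteristics m and t can be extracted from a daily series as sketched below. The 0.1 mm wet-day cutoff is an assumed convention, not necessarily the paper's:

```python
def wet_periods(daily, wet_min=0.1):
    """Split a daily precipitation series into wet-period volumes m
    and the waiting times t (dry-spell lengths, in days) between
    consecutive wet periods."""
    volumes, waits = [], []
    vol, dry = 0.0, 0
    for v in daily:
        if v >= wet_min:
            if vol == 0.0 and volumes and dry > 0:
                waits.append(dry)  # gap between previous WP and this one
            vol += v
            dry = 0
        else:
            if vol > 0.0:
                volumes.append(vol)  # wet period just ended
                vol = 0.0
            dry += 1
    if vol > 0.0:
        volumes.append(vol)
    return volumes, waits
```

    For example, the series [0, 2, 3, 0, 0, 1, 0] contains two wet periods with volumes 5 and 1, separated by a 2-day waiting time.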

  15. Faster Parallel Traversal of Scale Free Graphs at Extreme Scale with Vertex Delegates

    KAUST Repository

    Pearce, Roger

    2014-11-01

    © 2014 IEEE. At extreme scale, irregularities in the structure of scale-free graphs such as social network graphs limit our ability to analyze these important and growing datasets. A key challenge is the presence of high-degree vertices (hubs), which leads to parallel workload and storage imbalances. The imbalances occur because existing partitioning techniques are not able to effectively partition high-degree vertices. We present techniques to distribute the storage, computation, and communication of hubs for extreme-scale graphs on distributed-memory supercomputers. To balance the hub processing workload, we distribute hub data structures and related computation among a set of delegates. The delegates coordinate using highly optimized, yet portable, asynchronous broadcast and reduction operations. We demonstrate the scalability of our new algorithmic technique using Breadth-First Search (BFS), Single Source Shortest Path (SSSP), K-Core Decomposition, and PageRank on synthetically generated scale-free graphs. Our results show excellent scalability on large scale-free graphs up to 131K cores of the IBM BG/P, and outperform the best known Graph500 performance on BG/P Intrepid by 15%.
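    The core idea, spreading a hub's edge list across delegates while low-degree vertices stay on a single home partition, can be sketched as a toy shared-memory stand-in for the paper's distributed-memory MPI implementation. The hub threshold and the home-partition scheme below are assumptions for illustration:

```python
from collections import defaultdict

def delegate_partition(adj, nparts, hub_threshold):
    """Spread the edges of high-degree vertices (hubs) round-robin
    across all partitions ("delegates"); low-degree vertices keep
    their edges on one home partition (vertex id mod nparts,
    assuming integer vertex ids)."""
    parts = [defaultdict(list) for _ in range(nparts)]
    for u, nbrs in adj.items():
        if len(nbrs) >= hub_threshold:
            for i, v in enumerate(nbrs):       # balanced hub workload
                parts[i % nparts][u].append(v)
        else:
            parts[u % nparts][u].extend(nbrs)  # home partition
    return parts

# Star graph: hub 0 with 8 neighbours, plus two leaf vertices.
adj = {0: list(range(1, 9)), 1: [0], 2: [0]}
parts = delegate_partition(adj, nparts=4, hub_threshold=4)
```

    Each of the 4 partitions ends up holding exactly 2 of the hub's 8 edges, which is the storage and workload balance that a hash-by-vertex partitioning cannot achieve for hubs.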

  17. Dynamics Of Saturn'S Mid-scale Storms In The Cassini Era.

    Science.gov (United States)

    Del Rio Gaztelurrutia, Teresa; Hueso, R.; Sánchez-Lavega, A.

    2010-10-01

    Convective storms, similar to those on Earth but of much larger scale, often develop in Saturn's atmosphere. During the Voyager flybys of Saturn in 1981, mid-scale storms with a horizontal extent of the order of 1000-3000 km were observed to occur mainly in a narrow band in the Northern hemisphere at latitudes 38-40 deg North. In contrast with the Voyager era, since the start of the Cassini mission in 2004 similar mid-scale convective activity has concentrated in the so-called "storm alley", a narrow band at the symmetric Southern latitude of 38 deg. In this work, we characterize this storm activity using available visual information provided by the Cassini ISS cameras and the continuous survey from Earth by the International Outer Planets Watch (IOPW) and its online database PVOL (Hueso et al., Planetary and Space Science, 2010). We study the frequency of appearance of storms with sizes above 2000 km, their characteristic size and lifetime, as well as their interaction with surrounding dynamical features. In particular, we examine the possibility that storms might provide a mechanism for injecting energy into Saturn's jets, the influence of storms on the generation of atmospheric vortices, and the analogies and differences between the Voyager and present-day jet structure at the relevant latitudes. Acknowledgments: This work has been funded by the Spanish MICIIN AYA2009-10701 with FEDER support and Grupos Gobierno Vasco IT-464.

  18. Assessing future climatic changes of rainfall extremes at small spatio-temporal scales

    DEFF Research Database (Denmark)

    Gregersen, Ida Bülow; Sørup, Hjalte Jomo Danielsen; Madsen, Henrik

    2013-01-01

    Climate change is expected to influence the occurrence and magnitude of rainfall extremes and hence the flood risks in cities. Major impacts of an increased pluvial flood risk are expected to occur at hourly and sub-hourly resolutions. This makes convective storms the dominant rainfall type in relation to urban flooding. The present study focuses on high-resolution regional climate model (RCM) skill in simulating sub-daily rainfall extremes. Temporal and spatial characteristics of output from three different RCM simulations with 25 km resolution are compared to point rainfall extremes estimated from observed data. The applied RCM data sets represent two different models and two different types of forcing. Temporal changes in observed extreme point rainfall are partly reproduced by the RCM RACMO when forced by ERA40 re-analysis data. Two ECHAM-forced simulations show similar increases...

  19. Spatial extreme value analysis to project extremes of large-scale indicators for severe weather.

    Science.gov (United States)

    Gilleland, Eric; Brown, Barbara G; Ammann, Caspar M

    2013-09-01

    Concurrently high values of the maximum potential wind speed of updrafts (Wmax) and 0-6 km wind shear (Shear) have been found to represent environments conducive to severe weather, which subsequently provides a way to study severe weather in future climates. Here, we employ a model for the product of these variables (WmSh) from the National Center for Atmospheric Research/National Centers for Environmental Prediction reanalysis over North America, conditioned on their having extreme energy in the spatial field, in order to project the predominant spatial patterns of WmSh. The approach is based on the Heffernan and Tawn conditional extreme value model. Results suggest that this technique estimates the spatial behavior of WmSh well, which allows for exploring possible changes in the patterns over time. While the model enables a method for inferring the uncertainty in the patterns, such analysis is difficult with the currently available inference approach. A variation of the method is also explored to investigate how this type of model might be used to qualitatively understand how the spatial patterns of WmSh correspond to extreme river flow events. A case study of river flows from three rivers in northwestern Tennessee is presented, and it is found that advection of WmSh from the Gulf of Mexico prevails during such extreme events, while elsewhere WmSh is generally very low. © 2013 The Authors. Environmetrics published by John Wiley & Sons, Ltd.
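    For context, Wmax is commonly estimated from parcel theory as √(2·CAPE). A sketch of the WmSh product with illustrative inputs; the specific reanalysis fields, units, and any scaling used in the paper are not reproduced here:

```python
from math import sqrt

def wmax(cape):
    """Parcel-theory thermodynamic speed limit for updrafts (m/s),
    Wmax = sqrt(2 * CAPE), with CAPE in J/kg."""
    return sqrt(2.0 * cape)

def wmsh(cape, shear):
    """Product of Wmax and 0-6 km bulk shear (m/s): the severe-weather
    indicator analyzed above."""
    return wmax(cape) * shear

# Illustrative environment: CAPE = 2000 J/kg, 20 m/s of 0-6 km shear.
value = wmsh(2000.0, 20.0)
```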

  20. Adaptation to extreme climate events at a regional scale

    OpenAIRE

    Hoffmann, Christin

    2017-01-01

    A significant increase of the frequency, the intensity and the duration of extreme climate events in Switzerland induces the need to find a strategy to deal with the damages they cause. For more than two decades, mitigation has been the main objective of climate policy. However, due to already high atmospheric carbon concentrations and the inertia of the climate system, climate change is unavoidable to some degree, even if today’s emissions were almost completely cut back. Along with the high...

  1. Identification of large-scale meteorological patterns associated with extreme precipitation in the US northeast

    Science.gov (United States)

    Agel, Laurie; Barlow, Mathew; Feldstein, Steven B.; Gutowski, William J.

    2018-03-01

    Patterns of daily large-scale circulation associated with Northeast US extreme precipitation are identified using both k-means clustering (KMC) and Self-Organizing Maps (SOM) applied to tropopause height. The tropopause height provides a compact representation of the upper-tropospheric potential vorticity, which is closely related to the overall evolution and intensity of weather systems. Extreme precipitation is defined as the top 1% of daily wet-day observations at 35 Northeast stations, 1979-2008. KMC is applied on extreme precipitation days only, while the SOM algorithm is applied to all days in order to place the extreme results into the overall context of patterns for all days. Six tropopause patterns are identified through KMC for extreme day precipitation: a summertime tropopause ridge, a summertime shallow trough/ridge, a summertime shallow eastern US trough, a deeper wintertime eastern US trough, and two versions of a deep cold-weather trough located across the east-central US. Thirty SOM patterns for all days are identified. Results for all days show that 6 SOM patterns account for almost half of the extreme days, although extreme precipitation occurs in all SOM patterns. The same SOM patterns associated with extreme precipitation also routinely produce non-extreme precipitation; however, on extreme precipitation days the troughs, on average, are deeper and the downstream ridges more pronounced. Analysis of other fields associated with the large-scale patterns show various degrees of anomalously strong moisture transport preceding, and upward motion during, extreme precipitation events.
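    The KMC step above clusters flattened daily fields. A plain k-means sketch in that spirit, where toy 2-D points stand in for tropopause-height fields, and k and the iteration count are assumptions:

```python
import random

random.seed(42)

def kmeans(points, k, iters=20):
    """Plain k-means: assign each point to its nearest center, then
    move each center to the mean of its cluster."""
    centers = random.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: sum((a - b) ** 2
                                                for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        centers = [tuple(sum(v) / len(v) for v in zip(*cl)) if cl else centers[j]
                   for j, cl in enumerate(clusters)]
    return centers, clusters

# Two well-separated groups of "fields":
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centers, clusters = kmeans(pts, k=2)
```

    In the actual application, each point would be a flattened tropopause-height map for one extreme-precipitation day, and each resulting center a composite circulation pattern.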

  2. Web-based Visual Analytics for Extreme Scale Climate Science

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; Evans, Katherine J [ORNL; Harney, John F [ORNL; Jewell, Brian C [ORNL; Shipman, Galen M [ORNL; Smith, Brian E [ORNL; Thornton, Peter E [ORNL; Williams, Dean N. [Lawrence Livermore National Laboratory (LLNL)

    2014-01-01

    In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.

  3. Understanding convective extreme precipitation scaling using observations and an entraining plume model

    NARCIS (Netherlands)

    Loriaux, J.M.; Lenderink, G.; De Roode, S.R.; Siebesma, A.P.

    2013-01-01

    Previously observed twice-Clausius–Clapeyron (2CC) scaling for extreme precipitation at hourly time scales has led to discussions about its origin. The robustness of this scaling is assessed by analyzing a subhourly dataset of 10-min resolution over the Netherlands. The results confirm the validity
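    For orientation, Clausius-Clapeyron (CC) scaling corresponds to roughly 7% more atmospheric moisture, and hence extreme intensity, per kelvin of warming, and 2CC to roughly 14% per kelvin. A sketch with illustrative numbers:

```python
def cc_scaling(p0, dT, rate=0.07):
    """Clausius-Clapeyron-type scaling of precipitation intensity:
    rate = 0.07 (~7%/K) for CC, rate = 0.14 (~14%/K) for the 2CC
    scaling discussed above."""
    return p0 * (1.0 + rate) ** dT

# Intensity change of a 30 mm/h extreme for 2 K of warming:
cc = cc_scaling(30.0, 2.0)            # CC: ~34.3 mm/h
two_cc = cc_scaling(30.0, 2.0, 0.14)  # 2CC: ~39.0 mm/h
```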

  4. Large-scale Meteorological Patterns Associated with Extreme Precipitation Events over Portland, OR

    Science.gov (United States)

    Aragon, C.; Loikith, P. C.; Lintner, B. R.; Pike, M.

    2017-12-01

Extreme precipitation events can have profound impacts on human life and infrastructure, with broad implications across a range of stakeholders. Changes to extreme precipitation events are a projected outcome of climate change that warrants further study, especially at regional to local scales. While global climate models are generally capable of simulating mean climate at global-to-regional scales with reasonable skill, resiliency and adaptation decisions are made at local scales where most state-of-the-art climate models are limited by coarse resolution. Characterization of large-scale meteorological patterns associated with extreme precipitation events at local scales can provide climatic information without this scale limitation, thus facilitating stakeholder decision-making. This research will use synoptic climatology as a tool by which to characterize the key large-scale meteorological patterns associated with extreme precipitation events in the Portland, Oregon metro region. Composite analysis of meteorological patterns associated with extreme precipitation days, and associated watershed-specific flooding, is employed to enhance understanding of the climatic drivers behind such events. The self-organizing maps approach is then used to characterize the within-composite variability of the large-scale meteorological patterns associated with extreme precipitation events, allowing us to better understand the different types of meteorological conditions that lead to high-impact precipitation events and associated hydrologic impacts. A more comprehensive understanding of the meteorological drivers of extremes will aid in evaluation of the ability of climate models to capture key patterns associated with extreme precipitation over Portland and to better interpret projections of future climate at impact-relevant scales.
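The composite analysis described in this record can be sketched in a few lines: average a large-scale field over the days whose precipitation exceeds a high percentile, then subtract the all-days climatology. This is an illustrative reconstruction, not the authors' code; the function names and the 95th-percentile threshold are assumptions.

```python
def percentile(values, q):
    """Linear-interpolation sample percentile, q in [0, 100]."""
    s = sorted(values)
    k = (len(s) - 1) * q / 100.0
    f = int(k)
    c = min(f + 1, len(s) - 1)
    return s[f] + (s[c] - s[f]) * (k - f)

def composite_anomaly(field, precip, pct=95):
    """Mean large-scale field on extreme-precipitation days minus the
    all-days mean.  `field` is a list of daily fields (each a flat list
    of grid-point values); `precip` is daily precipitation at the site."""
    threshold = percentile(precip, pct)
    extreme = [i for i, p in enumerate(precip) if p >= threshold]
    n_days, n_pts = len(field), len(field[0])
    clim = [sum(day[j] for day in field) / n_days for j in range(n_pts)]
    comp = [sum(field[i][j] for i in extreme) / len(extreme) for j in range(n_pts)]
    return [comp[j] - clim[j] for j in range(n_pts)]
```

The returned anomaly map is what a composite figure would show: the circulation pattern that accompanies extreme-precipitation days, relative to climatology.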

  5. Extreme daily precipitation in Western Europe with climate change at appropriate spatial scales

    NARCIS (Netherlands)

    Booij, Martijn J.

    2002-01-01

    Extreme daily precipitation for the current and changed climate at appropriate spatial scales is assessed. This is done in the context of the impact of climate change on flooding in the river Meuse in Western Europe. The objective is achieved by determining and comparing extreme precipitation from

  6. Amputations for extremity soft tissue sarcoma in an era of limb salvage treatment : Local control and survival

    NARCIS (Netherlands)

    Stevenson, Marc G; Musters, Annelie H; Geertzen, Jan H B; van Leeuwen, Barbara L; Hoekstra, Harald J; Been, Lukas B

    2018-01-01

    BACKGROUND: Despite multimodality limb salvage treatment (LST) for locally advanced extremity soft tissue sarcoma (ESTS), some patients still need an amputation. Indications for amputation and oncological outcome for these patients are described. METHODS: Between 1996 and 2016, all patients who

  7. Visualization and parallel I/O at extreme scale

    International Nuclear Information System (INIS)

    Ross, R B; Peterka, T; Shen, H-W; Hong, Y; Ma, K-L; Yu, H; Moreland, K

    2008-01-01

    In our efforts to solve ever more challenging problems through computational techniques, the scale of our compute systems continues to grow. As we approach petascale, it becomes increasingly important that all the resources in the system be used as efficiently as possible, not just the floating-point units. Because of hardware, software, and usability challenges, storage resources are often one of the most poorly used and performing components of today's compute systems. This situation can be especially true in the case of the analysis phases of scientific workflows. In this paper we discuss the impact of large-scale data on visual analysis operations and examine a collection of approaches to I/O in the visual analysis process. First we examine the performance of volume rendering on a leadership-computing platform and assess the relative cost of I/O, rendering, and compositing operations. Next we analyze the performance implications of eliminating preprocessing from this example workflow. Then we describe a technique that uses data reorganization to improve access times for data-intensive volume rendering

  8. Scalable Parallel Methods for Analyzing Metagenomics Data at Extreme Scale

    International Nuclear Information System (INIS)

    Daily, Jeffrey A.

    2015-01-01

The field of bioinformatics and computational biology is currently experiencing a data revolution. The exciting prospect of making fundamental biological discoveries is fueling the rapid development and deployment of numerous cost-effective, high-throughput next-generation sequencing technologies. The result is that the DNA and protein sequence repositories are being bombarded with new sequence information. Databases are continuing to report a Moore's law-like growth trajectory in their database sizes, roughly doubling every 18 months. In what seems to be a paradigm-shift, individual projects are now capable of generating billions of raw sequence data that need to be analyzed in the presence of already annotated sequence information. While it is clear that data-driven methods, such as sequencing homology detection, are becoming the mainstay in the field of computational life sciences, the algorithmic advancements essential for implementing complex data analytics at scale have mostly lagged behind. Sequence homology detection is central to a number of bioinformatics applications including genome sequencing and protein family characterization. Given millions of sequences, the goal is to identify all pairs of sequences that are highly similar (or 'homologous') on the basis of alignment criteria. While there are optimal alignment algorithms to compute pairwise homology, their deployment at large scale is currently not feasible; instead, heuristic methods are used at the expense of quality. In this dissertation, we present the design and evaluation of a parallel implementation for conducting optimal homology detection on distributed memory supercomputers. Our approach uses a combination of techniques from asynchronous load balancing (viz. work stealing, dynamic task counters), data replication, and exact-matching filters to achieve homology detection at scale. Results for a collection of 2.56M sequences show parallel efficiencies of ~75-100% on up to 8K cores

  9. Scalable Parallel Methods for Analyzing Metagenomics Data at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Daily, Jeffrey A. [Washington State Univ., Pullman, WA (United States)

    2015-05-01

The field of bioinformatics and computational biology is currently experiencing a data revolution. The exciting prospect of making fundamental biological discoveries is fueling the rapid development and deployment of numerous cost-effective, high-throughput next-generation sequencing technologies. The result is that the DNA and protein sequence repositories are being bombarded with new sequence information. Databases are continuing to report a Moore’s law-like growth trajectory in their database sizes, roughly doubling every 18 months. In what seems to be a paradigm-shift, individual projects are now capable of generating billions of raw sequence data that need to be analyzed in the presence of already annotated sequence information. While it is clear that data-driven methods, such as sequencing homology detection, are becoming the mainstay in the field of computational life sciences, the algorithmic advancements essential for implementing complex data analytics at scale have mostly lagged behind. Sequence homology detection is central to a number of bioinformatics applications including genome sequencing and protein family characterization. Given millions of sequences, the goal is to identify all pairs of sequences that are highly similar (or “homologous”) on the basis of alignment criteria. While there are optimal alignment algorithms to compute pairwise homology, their deployment at large scale is currently not feasible; instead, heuristic methods are used at the expense of quality. In this dissertation, we present the design and evaluation of a parallel implementation for conducting optimal homology detection on distributed memory supercomputers. Our approach uses a combination of techniques from asynchronous load balancing (viz. work stealing, dynamic task counters), data replication, and exact-matching filters to achieve homology detection at scale. Results for a collection of 2.56M sequences show parallel efficiencies of ~75-100% on up to 8K cores
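The "exact-matching filters" mentioned in this record can be illustrated with a k-mer prefilter: only sequence pairs that share at least one exact k-mer are forwarded to the expensive optimal alignment, pruning the all-pairs workload. This is a minimal sketch under assumed names and parameters, not the dissertation's implementation.

```python
from collections import defaultdict
from itertools import combinations

def kmer_filter(seqs, k=3):
    """Return candidate pairs (i, j) of sequences sharing at least one
    exact k-mer.  Only these pairs need the costly optimal alignment."""
    index = defaultdict(set)          # k-mer -> set of sequence indices
    for i, s in enumerate(seqs):
        for p in range(len(s) - k + 1):
            index[s[p:p + k]].add(i)
    pairs = set()
    for ids in index.values():        # every pair sharing this k-mer
        pairs.update(combinations(sorted(ids), 2))
    return sorted(pairs)
```

In a distributed setting the candidate pairs would then be distributed as tasks, which is where the record's work stealing and dynamic task counters come in.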

  10. A Fault Oblivious Extreme-Scale Execution Environment

    Energy Technology Data Exchange (ETDEWEB)

    McKie, Jim

    2014-11-20

The FOX project, funded under the ASCR X-stack I program, developed systems software and runtime libraries for a new approach to the data and work distribution for massively parallel, fault oblivious application execution. Our work was motivated by the premise that exascale computing systems will provide a thousand-fold increase in parallelism and a proportional increase in failure rate relative to today’s machines. To deliver the capability of exascale hardware, the systems software must provide the infrastructure to support existing applications while simultaneously enabling efficient execution of new programming models that naturally express dynamic, adaptive, irregular computation; coupled simulations; and massive data analysis in a highly unreliable hardware environment with billions of threads of execution. Our OS research has prototyped new methods to provide efficient resource sharing, synchronization, and protection in a many-core compute node. We have experimented with alternative task/dataflow programming models and shown scalability in some cases to hundreds of thousands of cores. Much of our software is in active development through open source projects. Concepts from FOX are being pursued in next generation exascale operating systems. Our OS work focused on adaptive, application-tailored OS services optimized for multi- and many-core processors. We developed a new operating system, NIX, that supports role-based allocation of cores to processes, which was released to open source. We contributed to the IBM FusedOS project, which promoted the concept of latency-optimized and throughput-optimized cores. We built a task queue library based on a distributed, fault tolerant key-value store and identified scaling issues. A second fault tolerant task parallel library was developed, based on the Linda tuple space model, that used low level interconnect primitives for optimized communication. We designed fault tolerance mechanisms for task parallel computations

  11. On the nonlinearity of spatial scales in extreme weather attribution statements

    Science.gov (United States)

    Angélil, Oliver; Stone, Daíthí; Perkins-Kirkpatrick, Sarah; Alexander, Lisa V.; Wehner, Michael; Shiogama, Hideo; Wolski, Piotr; Ciavarella, Andrew; Christidis, Nikolaos

    2018-04-01

In the context of ongoing climate change, extreme weather events are drawing increasing attention from the public and news media. A question often asked is how the likelihood of extremes might have been changed by anthropogenic greenhouse-gas emissions. Answers to the question are strongly influenced by the model used and by the duration, spatial extent, and geographic location of the event; some of these factors are often overlooked. Using output from four global climate models, we provide attribution statements characterised by a change in probability of occurrence due to anthropogenic greenhouse-gas emissions, for rainfall and temperature extremes occurring at seven discretised spatial scales and three temporal scales. An understanding of the sensitivity of attribution statements to a range of spatial and temporal scales of extremes allows for the scaling of attribution statements, rendering them relevant to other extremes having similar but non-identical characteristics. This is a procedure simple enough to approximate timely estimates of the anthropogenic contribution to the event probability. Furthermore, since real extremes do not have well-defined physical borders, scaling can help quantify uncertainty around attribution results due to uncertainty around the event definition. Results suggest that the sensitivity of attribution statements to spatial scale is similar across models and that the sensitivity of attribution statements to the model used is often greater than the sensitivity to a doubling or halving of the spatial scale of the event. The use of a range of spatial scales allows us to identify a nonlinear relationship between the spatial scale of the event studied and the attribution statement.
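The attribution statements in this record are built on the change in event probability between a factual (forced) and a counterfactual world, commonly summarized as the fraction of attributable risk, FAR = 1 - P0/P1. A minimal sketch with empirical exceedance probabilities (the variable names are illustrative, not from the paper):

```python
def exceedance_prob(samples, threshold):
    """Empirical probability that a value exceeds the event threshold."""
    return sum(1 for x in samples if x > threshold) / len(samples)

def far(p_factual, p_counterfactual):
    """Fraction of attributable risk: FAR = 1 - P0/P1, where P1 is the
    event probability with anthropogenic forcing and P0 without it."""
    return 1.0 - p_counterfactual / p_factual
```

A FAR of 0.5 means half of the event's probability is attributable to the forcing; the equivalent probability ratio P1/P0 is the "change in probability of occurrence" the record refers to.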

  12. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Conrad, Patrick [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Bigoni, Daniele [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Parno, Matthew [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2017-06-09

QUEST (www.quest-scidac.org) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a history of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems.
A key software product of the MIT QUEST effort is the MIT

  13. Brief Assessment of Motor Function: Content Validity and Reliability of the Upper Extremity Gross Motor Scale

    Science.gov (United States)

    Cintas, Holly Lea; Parks, Rebecca; Don, Sarah; Gerber, Lynn

    2011-01-01

    Content validity and reliability of the Brief Assessment of Motor Function (BAMF) Upper Extremity Gross Motor Scale (UEGMS) were evaluated in this prospective, descriptive study. The UEGMS is one of five BAMF ordinal scales designed for quick documentation of gross, fine, and oral motor skill levels. Designed to be independent of age and…

  14. Fine-scale population structure and the era of next-generation sequencing.

    Science.gov (United States)

    Henn, Brenna M; Gravel, Simon; Moreno-Estrada, Andres; Acevedo-Acevedo, Suehelay; Bustamante, Carlos D

    2010-10-15

    Fine-scale population structure characterizes most continents and is especially pronounced in non-cosmopolitan populations. Roughly half of the world's population remains non-cosmopolitan and even populations within cities often assort along ethnic and linguistic categories. Barriers to random mating can be ecologically extreme, such as the Sahara Desert, or cultural, such as the Indian caste system. In either case, subpopulations accumulate genetic differences if the barrier is maintained over multiple generations. Genome-wide polymorphism data, initially with only a few hundred autosomal microsatellites, have clearly established differences in allele frequency not only among continental regions, but also within continents and within countries. We review recent evidence from the analysis of genome-wide polymorphism data for genetic boundaries delineating human population structure and the main demographic and genomic processes shaping variation, and discuss the implications of population structure for the distribution and discovery of disease-causing genetic variants, in the light of the imminent availability of sequencing data for a multitude of diverse human genomes.

  15. Extreme-Scale Alignments Of Quasar Optical Polarizations And Galactic Dust Contamination

    Science.gov (United States)

    Pelgrims, Vincent

    2017-10-01

Almost twenty years ago the optical polarization vectors from quasars were shown to be aligned over extreme scales. That evidence was later confirmed and enhanced thanks to additional optical data obtained with the ESO instrument FORS2 mounted on the VLT, in Chile. These observations suggest either Galactic foreground contamination of the data or, more interestingly, a cosmological origin. Using 353-GHz polarization data from the Planck satellite, I recently showed that the main features of the extreme-scale alignments of the quasar optical polarization vectors are unaffected by the Galactic thermal dust. This confirms previous studies based on optical starlight polarization and discards the scenario of Galactic contamination. In this talk, I shall briefly review the extreme-scale quasar polarization alignments, discuss the main results submitted to A&A and motivate forthcoming projects at the frontier between Galactic and extragalactic astrophysics.

  16. United States Temperature and Precipitation Extremes: Phenomenology, Large-Scale Organization, Physical Mechanisms and Model Representation

    Science.gov (United States)

    Black, R. X.

    2017-12-01

    We summarize results from a project focusing on regional temperature and precipitation extremes over the continental United States. Our project introduces a new framework for evaluating these extremes emphasizing their (a) large-scale organization, (b) underlying physical sources (including remote-excitation and scale-interaction) and (c) representation in climate models. Results to be reported include the synoptic-dynamic behavior, seasonality and secular variability of cold waves, dry spells and heavy rainfall events in the observational record. We also study how the characteristics of such extremes are systematically related to Northern Hemisphere planetary wave structures and thus planetary- and hemispheric-scale forcing (e.g., those associated with major El Nino events and Arctic sea ice change). The underlying physics of event onset are diagnostically quantified for different categories of events. Finally, the representation of these extremes in historical coupled climate model simulations is studied and the origins of model biases are traced using new metrics designed to assess the large-scale atmospheric forcing of local extremes.

  17. Scaling of precipitation extremes with temperature in the French Mediterranean region: What explains the hook shape?

    Science.gov (United States)

    Drobinski, P.; Alonzo, B.; Bastin, S.; Silva, N. Da; Muller, C.

    2016-04-01

Expected changes to future extreme precipitation remain a key uncertainty associated with anthropogenic climate change. Extreme precipitation has been proposed to scale with the precipitable water content in the atmosphere. Assuming constant relative humidity, this implies an increase of precipitation extremes at a rate of about 7% per °C globally as indicated by the Clausius-Clapeyron relationship. Increases faster and slower than Clausius-Clapeyron have also been reported. In this work, we examine the scaling between precipitation extremes and temperature in the present climate using simulations and measurements from surface weather stations collected in the frame of the HyMeX and MED-CORDEX programs in Southern France. Of particular interest are departures from the Clausius-Clapeyron thermodynamic expectation, their spatial and temporal distribution, and their origin. Looking at the scaling of precipitation extremes with temperature, two regimes emerge which form a hook shape: one at low temperatures (cooler than around 15°C) with rates of increase close to the Clausius-Clapeyron rate and one at high temperatures (warmer than about 15°C) with sub-Clausius-Clapeyron rates and most often negative rates. On average, the region of focus does not seem to exhibit super-Clausius-Clapeyron behavior except at some stations, in contrast to earlier studies. Many factors can contribute to departure from Clausius-Clapeyron scaling: time and spatial averaging, choice of scaling temperature (surface versus condensation level), and precipitation efficiency and vertical velocity in updrafts that are not necessarily constant with temperature. But most importantly, the dynamical contribution of orography to precipitation in the fall over this area during the so-called "Cevenoles" events explains the hook shape of the scaling of precipitation extremes.
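The Clausius-Clapeyron rate of about 7% per °C quoted above can be reproduced from the Magnus approximation to saturation vapor pressure. The Magnus coefficients below are a standard approximation, not taken from this record:

```python
import math

def e_sat(t_celsius):
    """Saturation vapor pressure (hPa), Magnus approximation."""
    return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

def cc_rate(t_celsius, dt=1.0):
    """Fractional increase in saturation vapor pressure per degree C,
    i.e. the Clausius-Clapeyron scaling rate (about 0.06-0.07)."""
    return e_sat(t_celsius + dt) / e_sat(t_celsius) - 1.0
```

In this framing, the "hook" corresponds to observed extreme-percentile precipitation rates falling below cc_rate(T) (sub-CC, often negative) at temperatures warmer than about 15°C.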

  18. The Relationship between Spatial and Temporal Magnitude Estimation of Scientific Concepts at Extreme Scales

    Science.gov (United States)

    Price, Aaron; Lee, H.

    2010-01-01

Many astronomical objects, processes, and events exist and occur at extreme scales of spatial and temporal magnitudes. Our research draws upon the psychological literature, replete with evidence of linguistic and metaphorical links between the spatial and temporal domains, to compare how students estimate spatial and temporal magnitudes associated with objects and processes typically taught in science class. We administered spatial and temporal scale estimation tests, with many astronomical items, to 417 students enrolled in 12 undergraduate science courses. Results show that while the temporal test was more difficult, students’ overall performance patterns between the two tests were mostly similar. However, asymmetrical correlations between the two tests indicate that students think of the extreme ranges of spatial and temporal scales in different ways, which is likely influenced by their classroom experience. When making incorrect estimations, students tended to underestimate the difference between the everyday scale and the extreme scales on both tests. This suggests the use of a common logarithmic mental number line for both spatial and temporal magnitude estimation. However, there are differences between the two tests in the errors students make in the everyday range. Among the implications discussed is the use of spatio-temporal reference frames, instead of smooth bootstrapping, to help students maneuver between scales of magnitude and the use of logarithmic transformations between reference frames. Implications for astronomy range from learning about spectra to large scale galaxy structure.

  19. Regional-Scale High-Latitude Extreme Geoelectric Fields Pertaining to Geomagnetically Induced Currents

    Science.gov (United States)

    Pulkkinen, Antti; Bernabeu, Emanuel; Eichner, Jan; Viljanen, Ari; Ngwira, Chigomezyo

    2015-01-01

    Motivated by the needs of the high-voltage power transmission industry, we use data from the high-latitude IMAGE magnetometer array to study characteristics of extreme geoelectric fields at regional scales. We use 10-s resolution data for years 1993-2013, and the fields are characterized using average horizontal geoelectric field amplitudes taken over station groups that span about 500-km distance. We show that geoelectric field structures associated with localized extremes at single stations can be greatly different from structures associated with regionally uniform geoelectric fields, which are well represented by spatial averages over single stations. Visual extrapolation and rigorous extreme value analysis of spatially averaged fields indicate that the expected range for 1-in-100-year extreme events are 3-8 V/km and 3.4-7.1 V/km, respectively. The Quebec reference ground model is used in the calculations.
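The record's rigorous extreme value analysis is not spelled out; as a minimal illustration of how a 1-in-100-year return level is obtained from block maxima, here is a Gumbel fit by the method of moments. The Gumbel family and the moment estimator are simplifying assumptions for the sketch; a full analysis would typically fit a generalized extreme value distribution by maximum likelihood:

```python
import math

def gumbel_fit(maxima):
    """Method-of-moments Gumbel fit of block (e.g. annual) maxima."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    scale = math.sqrt(6.0 * var) / math.pi
    loc = mean - 0.5772156649 * scale   # Euler-Mascheroni constant
    return loc, scale

def return_level(loc, scale, years):
    """Level exceeded on average once every `years` blocks."""
    p = 1.0 / years
    return loc - scale * math.log(-math.log(1.0 - p))
```

Applied to spatially averaged geoelectric field maxima, return_level(loc, scale, 100) plays the role of the 1-in-100-year amplitude the record reports as 3.4-7.1 V/km.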

  20. Changes and Attribution of Extreme Precipitation in Climate Models: Subdaily and Daily Scales

    Science.gov (United States)

    Zhang, W.; Villarini, G.; Scoccimarro, E.; Vecchi, G. A.

    2017-12-01

Extreme precipitation events are responsible for numerous hazards, including flooding, soil erosion, and landslides. Because of their significant socio-economic impacts, the attribution and projection of these events are of crucial importance to improve our response, mitigation and adaptation strategies. Here we present results from our ongoing work. In terms of attribution, we use idealized experiments [pre-industrial control experiment (PI) and 1% per year increase (1%CO2) in atmospheric CO2] from ten general circulation models produced under the Coupled Model Intercomparison Project Phase 5 (CMIP5) and the fraction of attributable risk to examine the CO2 effects on extreme precipitation at the sub-daily and daily scales. We find that the increased CO2 concentration substantially increases the odds of the occurrence of sub-daily precipitation extremes compared to the daily scale in most areas of the world, with the exception of some regions in the sub-tropics, likely in relation to the subsidence of the Hadley Cell. These results point to the large role that atmospheric CO2 plays in extreme precipitation under an idealized framework. Furthermore, we investigate the changes in extreme precipitation events with the Community Earth System Model (CESM) climate experiments using the scenarios consistent with the 1.5°C and 2°C temperature targets. We find that the frequency of annual extreme precipitation at a global scale increases in both 1.5°C and 2°C scenarios until around 2070, after which the magnitudes of the trend become much weaker or even negative. Overall, the frequency of global annual extreme precipitation is similar between 1.5°C and 2°C for the period 2006-2035, and the changes in extreme precipitation in individual seasons are consistent with those for the entire year. The frequency of extreme precipitation in the 2°C experiments is higher than for the 1.5°C experiment after the late 2030s, particularly for the period 2071-2100.

  1. Extreme value statistics and finite-size scaling at the ecological extinction/laminar-turbulence transition

    Science.gov (United States)

    Shih, Hong-Yan; Goldenfeld, Nigel

    Experiments on transitional turbulence in pipe flow seem to show that turbulence is a transient metastable state since the measured mean lifetime of turbulence puffs does not diverge asymptotically at a critical Reynolds number. Yet measurements reveal that the lifetime scales with Reynolds number in a super-exponential way reminiscent of extreme value statistics, and simulations and experiments in Couette and channel flow exhibit directed percolation type scaling phenomena near a well-defined transition. This universality class arises from the interplay between small-scale turbulence and a large-scale collective zonal flow, which exhibit predator-prey behavior. Why is asymptotically divergent behavior not observed? Using directed percolation and a stochastic individual level model of predator-prey dynamics related to transitional turbulence, we investigate the relation between extreme value statistics and power law critical behavior, and show that the paradox is resolved by carefully defining what is measured in the experiments. We theoretically derive the super-exponential scaling law, and using finite-size scaling, show how the same data can give both super-exponential behavior and power-law critical scaling.
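The super-exponential lifetime scaling discussed above has the form τ(Re) ∝ exp(exp(a·Re + b)), whose signature is that ln ln τ is linear in Re (a simple exponential law would make ln τ linear instead, and a diverging law would have a finite-Re singularity). A sketch with illustrative coefficients, not fitted values from the work:

```python
import math

def puff_lifetime(re, a=0.005, b=-8.0):
    """Super-exponential mean puff lifetime tau ~ exp(exp(a*Re + b)).
    The coefficients a, b are placeholders for illustration only."""
    return math.exp(math.exp(a * re + b))

def log_log_lifetime(re, a=0.005, b=-8.0):
    """ln ln tau, which is linear in Re under super-exponential scaling."""
    return math.log(math.log(puff_lifetime(re, a, b)))
```

Plotting log_log_lifetime against Re and checking for a straight line is the standard diagnostic used to distinguish this extreme-value-like behavior from a diverging lifetime at a critical Reynolds number.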

  2. Assessing Regional Scale Variability in Extreme Value Statistics Under Altered Climate Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Brunsell, Nathaniel [Univ. of Kansas, Lawrence, KS (United States); Mechem, David [Univ. of Kansas, Lawrence, KS (United States); Ma, Chunsheng [Wichita State Univ., KS (United States)

    2015-02-20

Recent studies have suggested that low-frequency modes of climate variability can significantly influence regional climate. The climatology associated with extreme events has been shown to be particularly sensitive. This has profound implications for droughts, heat waves, and food production. We propose to examine regional climate simulations conducted over the continental United States by applying a recently developed technique which combines wavelet multi-resolution analysis with information theory metrics. This research is motivated by two fundamental questions concerning the spatial and temporal structure of extreme events. These questions are 1) what temporal scales of the extreme value distributions are most sensitive to alteration by low-frequency climate forcings and 2) what is the nature of the spatial structure of variation in these timescales? The primary objective is to assess to what extent information theory metrics can be useful in characterizing the nature of extreme weather phenomena. Specifically, we hypothesize that (1) changes in the nature of extreme events will impact the temporal probability density functions and that information theory metrics will be sensitive to these changes and (2) via a wavelet multi-resolution analysis, we will be able to characterize the relative contribution of different timescales on the stochastic nature of extreme events. In order to address these hypotheses, we propose a unique combination of an established regional climate modeling approach and advanced statistical techniques to assess the effects of low-frequency modes on climate extremes over North America. The behavior of climate extremes in RCM simulations for the 20th century will be compared with statistics calculated from the United States Historical Climatology Network (USHCN) and simulations from the North American Regional Climate Change Assessment Program (NARCCAP). This effort will serve to establish the baseline behavior of climate extremes, the
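The combination of wavelet multi-resolution analysis with information theory metrics proposed in this record can be sketched in its simplest form: a Haar decomposition of a time series into per-scale detail coefficients, followed by the Shannon entropy of each scale's coefficient histogram. The Haar basis and histogram entropy here are illustrative stand-ins, not the project's actual metrics:

```python
import math

def haar_details(series):
    """Haar multi-resolution analysis: one list of detail coefficients
    per dyadic scale (series length must be a power of two)."""
    approx, details = list(series), []
    while len(approx) > 1:
        pairs = list(zip(approx[0::2], approx[1::2]))
        details.append([(a - b) / 2.0 for a, b in pairs])
        approx = [(a + b) / 2.0 for a, b in pairs]
    return details

def shannon_entropy(coeffs, bins=8):
    """Shannon entropy (bits) of a histogram of the coefficients."""
    lo, hi = min(coeffs), max(coeffs)
    if hi == lo:
        return 0.0
    counts = [0] * bins
    for c in coeffs:
        idx = min(int((c - lo) / (hi - lo) * bins), bins - 1)
        counts[idx] += 1
    n = len(coeffs)
    return -sum((k / n) * math.log2(k / n) for k in counts if k)
```

Tracking shannon_entropy of each scale's details over time is one simple way to ask the record's question: which timescales of variability respond most to low-frequency forcing.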

  3. Combinations of large-scale circulation anomalies conducive to precipitation extremes in the Czech Republic

    Czech Academy of Sciences Publication Activity Database

    Kašpar, Marek; Müller, Miloslav

    2014-01-01

    Roč. 138, March 2014 (2014), s. 205-212 ISSN 0169-8095 R&D Projects: GA ČR(CZ) GAP209/11/1990 Institutional support: RVO:68378289 Keywords : precipitation extreme * synoptic-scale cause * re-analysis * circulation anomaly Subject RIV: DG - Athmosphere Sciences, Meteorology Impact factor: 2.844, year: 2014 http://www.sciencedirect.com/science/article/pii/S0169809513003372

  4. Extreme-scale alignments of quasar optical polarizations and Galactic dust contamination

    OpenAIRE

    Pelgrims, Vincent

    2017-01-01

    Almost twenty years ago the optical polarization vectors from quasars were shown to be aligned over extreme-scales. That evidence was later confirmed and enhanced thanks to additional optical data obtained with the ESO instrument FORS2 mounted on the VLT, in Chile. These observations suggest either Galactic foreground contamination of the data or, more interestingly, a cosmological origin. Using 353-GHz polarization data from the Planck satellite, I recently showed that the main features of t...

  5. Moths produce extremely quiet ultrasonic courtship songs by rubbing specialized scales

    DEFF Research Database (Denmark)

    Nakano, Ryo; Skals, Niels; Takanashi, Takuma

    2008-01-01

    level at 1 cm) adapted for private sexual communication in the Asian corn borer moth, Ostrinia furnacalis. During courtship, the male rubs specialized scales on the wing against those on the thorax to produce the songs, with the wing membrane underlying the scales possibly acting as a sound resonator....... The male's song suppresses the escape behavior of the female, thereby increasing his mating success. Our discovery of extremely low-intensity ultrasonic communication may point to a whole undiscovered world of private communication, using "quiet" ultrasound....

  6. Influence of climate variability versus change at multi-decadal time scales on hydrological extremes

    Science.gov (United States)

    Willems, Patrick

    2014-05-01

    Recent studies have shown that rainfall and hydrological extremes do not randomly occur in time, but are subject to multidecadal oscillations. In addition to these oscillations, there are temporal trends due to climate change. Design statistics, such as intensity-duration-frequency (IDF) for extreme rainfall or flow-duration-frequency (QDF) relationships, are affected by both types of temporal changes (short term and long term). This presentation discusses these changes, how they influence water engineering design and decision making, and how this influence can be assessed and taken into account in practice. The multidecadal oscillations in rainfall and hydrological extremes were studied based on a technique for the identification and analysis of changes in extreme quantiles. The statistical significance of the oscillations was evaluated by means of a non-parametric bootstrapping method. Oscillations in large scale atmospheric circulation were identified as the main drivers for the temporal oscillations in rainfall and hydrological extremes. They also explain why spatial phase shifts (e.g. north-south variations in Europe) exist between the oscillation highs and lows. Next to the multidecadal climate oscillations, several stations show trends during the most recent decades, which may be attributed to climate change as a result of anthropogenic global warming. Such attribution to anthropogenic global warming is, however, uncertain. It can be done based on simulation results with climate models, but it is shown that the climate model results are too uncertain to enable a clear attribution. Water engineering design statistics, such as extreme rainfall IDF or peak or low flow QDF statistics, obviously are influenced by these temporal variations (oscillations, trends). 
It is shown in the paper, based on the Brussels 10-minute rainfall data, that rainfall design values may be biased or differ by about 20% when based on short rainfall series of 10 to 15 years in length, and
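The non-parametric bootstrap test for oscillations in extreme quantiles mentioned in this record can be sketched roughly as follows (a simplification under editor's assumptions: annual maxima, a 90% quantile, and naive i.i.d. resampling rather than the study's actual procedure):

```python
import numpy as np

def decade_quantile_anomaly(series, start, q=0.9, n_boot=2000, seed=1):
    """Test whether the q-quantile of a 10-year block deviates from what
    resampling the full record would produce by chance."""
    rng = np.random.default_rng(seed)
    block = series[start:start + 10]
    observed = np.quantile(block, q)
    # bootstrap distribution of the q-quantile of random 10-value samples
    boot = np.quantile(
        rng.choice(series, size=(n_boot, 10), replace=True), q, axis=1)
    lo, hi = np.quantile(boot, [0.025, 0.975])
    return observed, (lo, hi), not (lo <= observed <= hi)

rng = np.random.default_rng(0)
annual_max = rng.gumbel(50.0, 10.0, size=100)   # synthetic annual rainfall maxima
obs, band, significant = decade_quantile_anomaly(annual_max, start=40)
```

A decade whose extreme quantile falls outside the bootstrap band would be flagged as an oscillation high or low; the study applies a more elaborate version of this idea across sliding windows and multiple quantiles.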

  7. Scale orientated analysis of river width changes due to extreme flood hazards

    Directory of Open Access Journals (Sweden)

    G. Krapesch

    2011-08-01

This paper analyses the morphological effects of extreme floods (recurrence interval >100 years) and examines which parameters best describe the width changes due to erosion, based on 5 affected alpine gravel bed rivers in Austria. The research was based on vertical aerial photos of the rivers before and after extreme floods, hydrodynamic numerical models and cross sectional measurements supported by LiDAR data of the rivers. Average width ratios (width after/before the flood) were calculated and correlated with different hydraulic parameters (specific stream power, shear stress, flow area, specific discharge). Depending on the geomorphological boundary conditions of the different rivers, a mean width ratio between 1.12 (Lech River) and 3.45 (Trisanna River) was determined on the reach scale. The specific stream power (SSP) best predicted the mean width ratios of the rivers, especially on the reach scale and sub-reach scale. On the local scale, more parameters have to be considered to define the "minimum morphological spatial demand of rivers", which is a crucial parameter for addressing and managing flood hazards and should be used in hazard zone plans and spatial planning.
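The paper's central computation, correlating reach-scale width ratios with hydraulic parameters such as specific stream power, amounts to a simple correlation analysis; a sketch with invented numbers (illustrative values, not the Austrian river data):

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical reach-scale data (illustrative, not from the paper):
width_before = np.array([22.0, 18.0, 30.0, 15.0, 25.0])   # m
width_after  = np.array([28.0, 40.0, 34.0, 48.0, 31.0])   # m
ssp          = np.array([180., 900., 250., 1300., 210.])  # W/m^2, specific stream power

width_ratio = width_after / width_before   # ratio > 1 means widening by erosion
r, p_value = pearsonr(ssp, width_ratio)
```

A strong positive `r` would mirror the paper's finding that SSP best predicts mean width ratios at the reach scale; the local scale, per the abstract, needs additional parameters.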

  8. How do the multiple large-scale climate oscillations trigger extreme precipitation?

    Science.gov (United States)

    Shi, Pengfei; Yang, Tao; Xu, Chong-Yu; Yong, Bin; Shao, Quanxi; Li, Zhenya; Wang, Xiaoyan; Zhou, Xudong; Li, Shu

    2017-10-01

Identifying the links between variations in large-scale climate patterns and precipitation is of tremendous assistance in characterizing surplus or deficit of precipitation, which is especially important for evaluation of local water resources and ecosystems in semi-humid and semi-arid regions. Restricted by current limited knowledge on underlying mechanisms, statistical correlation methods are often used rather than physically based models to characterize the connections. Nevertheless, available correlation methods are generally unable to reveal the interactions among a wide range of climate oscillations and associated effects on precipitation, especially on extreme precipitation. In this work, a probabilistic analysis approach by means of a state-of-the-art Copula-based joint probability distribution is developed to characterize the aggregated behaviors of large-scale climate patterns and their connections to precipitation. This method is employed to identify the complex connections between climate patterns (Atlantic Multidecadal Oscillation (AMO), El Niño-Southern Oscillation (ENSO) and Pacific Decadal Oscillation (PDO)) and seasonal precipitation over a typical semi-humid and semi-arid region, the Haihe River Basin in China. Results show that the interactions among multiple climate oscillations are non-uniform in most seasons and phases. Certain joint extreme phases can significantly trigger extreme precipitation (flood and drought) owing to the amplification effect among climate oscillations.
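The Copula-based joint probability analysis can be sketched with a Gaussian copula (an editor's assumption; the study's copula family and fitting procedure may differ), showing how positive dependence between climate indices amplifies the probability of joint extreme phases:

```python
import numpy as np
from scipy.stats import norm, rankdata

def joint_exceedance(u, v, q=0.9, n_mc=200_000, seed=2):
    """P(U>q, V>q) under a Gaussian copula fitted to two series,
    versus the product (1-q)^2 expected under independence."""
    # map to uniform margins via ranks, then to normal scores
    zu = norm.ppf(rankdata(u) / (len(u) + 1))
    zv = norm.ppf(rankdata(v) / (len(v) + 1))
    rho = np.corrcoef(zu, zv)[0, 1]
    rng = np.random.default_rng(seed)
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n_mc)
    thresh = norm.ppf(q)
    joint = np.mean((z[:, 0] > thresh) & (z[:, 1] > thresh))
    return joint, (1 - q) ** 2

rng = np.random.default_rng(0)
amo = rng.normal(size=600)
enso = 0.6 * amo + 0.8 * rng.normal(size=600)   # correlated synthetic indices
p_joint, p_indep = joint_exceedance(amo, enso)
```

With positively dependent indices, the fitted joint exceedance probability exceeds the independence product, which is the "amplification effect" the abstract attributes to joint extreme phases.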

  9. The Persistence of Informality: Small-Scale Water Providers in Manila’s Post-Privatisation Era

    Directory of Open Access Journals (Sweden)

    Deborah Cheng

    2014-02-01

This article troubles the notion of a formal-informal dichotomy in urban water provision. Whereas expansion of a water utility typically involves the replacement of informal providers, the experience in Manila demonstrates that the rapid connection of low-income areas actually hinges, in part, on the selective inclusion and exclusion of these smaller actors. In this sense, privatisation has not eliminated small-scale water provision, but has led to the reconfiguration of its usage, blurring the boundaries between formal and informal. By examining the spatial and temporal evolution of small-scale water provision in Manila’s post-privatisation era, I show how certain spaces are seen as less serviceable than others. Critically, small providers working in partnership with the utilities are sanctioned because they supplement the utilities’ operations. The areas in which they work are considered served, factoring into aggregate coverage statistics, even though their terms of service are often less desirable than those of households directly connected to the utilities. In contrast, small providers that operate outside of the utilities’ zones of coverage are considered inferior, to be replaced. The result is a differentiation in informality – one in which the private utilities largely determine modes of access and thus the spatialisation of informal water provision.

  10. Improving plot- and regional-scale crop models for simulating impacts of climate variability and extremes

    Science.gov (United States)

    Tao, F.; Rötter, R.

    2013-12-01

Many studies on global climate report that climate variability is increasing with more frequent and intense extreme events [1]. There are quite large uncertainties from both the plot- and regional-scale models in simulating impacts of climate variability and extremes on crop development, growth and productivity [2,3]. One key to reducing the uncertainties is better exploitation of experimental data to eliminate crop model deficiencies and develop better algorithms that more adequately capture the impacts of extreme events, such as high temperature and drought, on crop performance [4,5]. In the present study, in a first step, the inter-annual variability in wheat yield and climate from 1971 to 2012 in Finland was investigated. Using statistical approaches the impacts of climate variability and extremes on wheat growth and productivity were quantified. In a second step, a plot-scale model, WOFOST [6], and a regional-scale crop model, MCWLA [7], were calibrated and validated, and applied to simulate wheat growth and yield variability from 1971 to 2012. Next, the estimated impacts of high temperature stress, cold damage, and drought stress on crop growth and productivity based on the statistical approaches, and on crop simulation models WOFOST and MCWLA were compared. Then, the impact mechanisms of climate extremes on crop growth and productivity in the WOFOST model and MCWLA model were identified, and subsequently, the various algorithms and impact functions were fitted against the long-term crop trial data. Finally, the impact mechanisms, algorithms and functions in the WOFOST model and MCWLA model were improved to better simulate the impacts of climate variability and extremes, particularly high temperature stress, cold damage and drought stress for location-specific and large area climate impact assessments. Our studies provide a good example of how to improve, in parallel, the plot- and regional-scale models for simulating impacts of climate variability and extremes, as needed for

  11. Contribution of large-scale midlatitude disturbances to hourly precipitation extremes in the United States

    Science.gov (United States)

    Barbero, Renaud; Abatzoglou, John T.; Fowler, Hayley J.

    2018-02-01

Midlatitude synoptic weather regimes account for a substantial portion of annual precipitation accumulation as well as multi-day precipitation extremes across parts of the United States (US). However, little attention has been devoted to understanding how synoptic-scale patterns contribute to hourly precipitation extremes. A majority of 1-h annual maximum precipitation (AMP) across the western US were found to be linked to two coherent midlatitude synoptic patterns: disturbances propagating along the jet stream, and cutoff upper-level lows. The influence of these two patterns on 1-h AMP varies geographically. Over 95% of 1-h AMP along the western coastal US were coincident with progressive midlatitude waves embedded within the jet stream, while over 30% of 1-h AMP across the interior western US were coincident with cutoff lows. Between 30 and 60% of 1-h AMP were coincident with the jet stream across the Ohio River Valley and southeastern US, whereas a majority of 1-h AMP over the rest of the central and eastern US were not found to be associated with either midlatitude synoptic feature. Composite analyses for 1-h AMP days coincident with cutoff lows and the jet stream show that anomalous moisture flux and upper-level dynamics are responsible for initiating instability and setting up an environment conducive to 1-h AMP events. While hourly precipitation extremes are generally thought to be purely convective in nature, this study shows that large-scale dynamics and baroclinic disturbances may also contribute to precipitation extremes on sub-daily timescales.

  12. Using GRACE Satellite Gravimetry for Assessing Large-Scale Hydrologic Extremes

    Directory of Open Access Journals (Sweden)

    Alexander Y. Sun

    2017-12-01

Global assessment of the spatiotemporal variability in terrestrial total water storage anomalies (TWSA) in response to hydrologic extremes is critical for water resources management. Using TWSA derived from the Gravity Recovery and Climate Experiment (GRACE) satellites, this study systematically assessed the skill of the TWSA-climatology (TC) approach and breakpoint (BP) detection method for identifying large-scale hydrologic extremes. The TC approach calculates standardized anomalies by using the mean and standard deviation of the GRACE TWSA corresponding to each month. In the BP detection method, the empirical mode decomposition (EMD) is first applied to identify the mean return period of TWSA extremes, and then a statistical procedure is used to identify the actual occurrence times of abrupt changes (i.e., BPs) in TWSA. Both detection methods were demonstrated on basin-averaged TWSA time series for the world’s 35 largest river basins. A nonlinear event coincidence analysis measure was applied to cross-examine abrupt changes detected by these methods with those detected by the Standardized Precipitation Index (SPI). Results show that our EMD-assisted BP procedure is a promising tool for identifying hydrologic extremes using GRACE TWSA data. Abrupt changes detected by the BP method coincide well with those of the SPI anomalies and with documented hydrologic extreme events. Event timings obtained by the TC method were ambiguous for a number of river basins studied, probably because the GRACE data length is too short to derive long-term climatology at this time. The BP approach demonstrates a robust wet-dry anomaly detection capability, which will be important for applications with the upcoming GRACE Follow-On mission.
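The TC approach described in this record reduces to standardising each month against its own calendar-month climatology; a sketch assuming a complete, gap-free monthly series (synthetic data, not GRACE):

```python
import numpy as np

def tc_standardized_anomalies(twsa):
    """Standardise a monthly TWSA series against each calendar month's
    own mean and standard deviation (the TC approach)."""
    twsa = np.asarray(twsa, dtype=float)
    out = np.empty_like(twsa)
    for m in range(12):
        month_vals = twsa[m::12]
        out[m::12] = (month_vals - month_vals.mean()) / month_vals.std()
    return out

rng = np.random.default_rng(3)
months = np.arange(180)                          # 15 years of monthly data
seasonal = 10.0 * np.sin(2 * np.pi * months / 12)
twsa = seasonal + rng.normal(0.0, 2.0, size=180) # seasonal cycle + noise, in cm
anom = tc_standardized_anomalies(twsa)
```

Because each calendar month is standardised separately, the seasonal cycle drops out and large positive or negative `anom` values flag wet or dry extremes; the abstract's caveat applies here too, since 15 "years" is a short record for a stable climatology.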

  13. Extreme events in total ozone: Spatio-temporal analysis from local to global scale

    Science.gov (United States)

    Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; di Rocco, Stefania; Jancso, Leonhardt M.; Peter, Thomas; Davison, Anthony C.

    2010-05-01

dynamics (NAO, ENSO) on total ozone is a global feature in the northern mid-latitudes (Rieder et al., 2010c). In a next step frequency distributions of extreme events are analyzed on a global scale (northern and southern mid-latitudes). A specific focus here is whether findings gained through analysis of long-term European ground-based stations can be clearly identified as a global phenomenon. By showing results from these three types of studies an overview of extreme events in total ozone (and the dynamical and chemical features leading to those) will be presented from local to global scales. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and A.C. Davison (2010): Extreme events in total ozone over Arosa - Part I: Application of extreme value theory, to be submitted to ACPD. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and A.C. Davison (2010): Extreme events in total ozone over Arosa - Part II: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes, to be submitted to ACPD. Rieder, H.E., Jancso, L., Staehelin, J., Maeder, J.A., Ribatet, M., Peter, T., and A.C. Davison (2010): Extreme events in total ozone over the northern mid-latitudes: A case study based on long-term data sets from 5 ground-based stations, in preparation. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998a. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa
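The peaks-over-threshold modelling referenced in this record (cf. the POT package of Ribatet, 2007) can be sketched with a generalized Pareto fit to threshold exceedances; an illustrative Python version with synthetic total-ozone data (the threshold choice and fit settings are the editor's assumptions, not the studies' configuration):

```python
import numpy as np
from scipy.stats import genpareto

def pot_fit(series, threshold):
    """Fit a generalized Pareto distribution (GPD) to threshold
    exceedances, the peaks-over-threshold approach of extreme value theory."""
    excess = series[series > threshold] - threshold
    shape, _, scale = genpareto.fit(excess, floc=0.0)
    return shape, scale, excess

rng = np.random.default_rng(4)
ozone = 330 + 25 * rng.standard_normal(5000)   # synthetic daily total ozone (DU)
threshold = np.quantile(ozone, 0.95)           # a common pragmatic threshold choice
shape, scale, excess = pot_fit(ozone, threshold)
```

The fitted shape and scale parameters then yield return levels and exceedance probabilities for extreme-high (and, with a sign flip, extreme-low) total ozone events.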

  14. The New Era of Precision Cosmology: Testing Gravity at Large Scales

    Science.gov (United States)

    Prescod-Weinstein, Chanda

    2011-01-01

Cosmic acceleration may be the biggest phenomenological mystery in cosmology today. Various explanations for its cause have been proposed, including the cosmological constant, dark energy and modified gravities. Structure formation provides a strong test of any cosmic acceleration model because a successful dark energy model must not inhibit the development of observed large-scale structures. Traditional approaches to studies of structure formation in the presence of dark energy or modified gravity implement the Press & Schechter formalism (PGF). However, does the PGF apply in all cosmologies? The search is on for a better understanding of universality in the PGF. In this talk, I explore the potential for universality and talk about what dark matter haloes may be able to tell us about cosmology. I will also discuss the implications of this and new cosmological experiments for better understanding our theory of gravity.

  15. Extreme Scale FMM-Accelerated Boundary Integral Equation Solver for Wave Scattering

    KAUST Repository

    AbdulJabbar, Mustafa Abdulmajeed

    2018-03-27

    Algorithmic and architecture-oriented optimizations are essential for achieving performance worthy of anticipated energy-austere exascale systems. In this paper, we present an extreme scale FMM-accelerated boundary integral equation solver for wave scattering, which uses FMM as a matrix-vector multiplication inside the GMRES iterative method. Our FMM Helmholtz kernels treat nontrivial singular and near-field integration points. We implement highly optimized kernels for both shared and distributed memory, targeting emerging Intel extreme performance HPC architectures. We extract the potential thread- and data-level parallelism of the key Helmholtz kernels of FMM. Our application code is well optimized to exploit the AVX-512 SIMD units of Intel Skylake and Knights Landing architectures. We provide different performance models for tuning the task-based tree traversal implementation of FMM, and develop optimal architecture-specific and algorithm aware partitioning, load balancing, and communication reducing mechanisms to scale up to 6,144 compute nodes of a Cray XC40 with 196,608 hardware cores. With shared memory optimizations, we achieve roughly 77% of peak single precision floating point performance of a 56-core Skylake processor, and on average 60% of peak single precision floating point performance of a 72-core KNL. These numbers represent nearly 5.4x and 10x speedup on Skylake and KNL, respectively, compared to the baseline scalar code. With distributed memory optimizations, on the other hand, we report near-optimal efficiency in the weak scalability study with respect to both the logarithmic communication complexity as well as the theoretical scaling complexity of FMM. In addition, we exhibit up to 85% efficiency in strong scaling. We compute in excess of 2 billion DoF on the full-scale of the Cray XC40 supercomputer.
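The scaling results reported in this record rest on the standard speedup and efficiency definitions; a minimal sketch of the arithmetic (the timings below are invented for illustration, not the paper's measurements):

```python
def strong_scaling_efficiency(t1, tp, p):
    """Strong scaling: same problem size on p times the cores.
    Returns (speedup, parallel efficiency)."""
    speedup = t1 / tp
    return speedup, speedup / p

def weak_scaling_efficiency(t1, tp):
    """Weak scaling: problem size grows with core count, so the ideal
    runtime stays flat and efficiency is simply t1 / tp."""
    return t1 / tp

# Illustrative timings (seconds), not from the paper:
speedup, eff = strong_scaling_efficiency(t1=1000.0, tp=40.0, p=32)
weak_eff = weak_scaling_efficiency(t1=100.0, tp=118.0)
```

A "near-optimal" weak scaling claim like the paper's corresponds to `weak_eff` staying close to 1 as nodes are added (for FMM, after discounting the expected logarithmic communication term), while the 85% strong-scaling figure corresponds to `eff` at the largest core count.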

  16. Power-law scaling of extreme dynamics near higher-order exceptional points

    Science.gov (United States)

    Zhong, Q.; Christodoulides, D. N.; Khajavikhan, M.; Makris, K. G.; El-Ganainy, R.

    2018-02-01

We investigate the extreme dynamics of non-Hermitian systems near higher-order exceptional points in photonic networks constructed using the bosonic algebra method. We show that strong power oscillations for certain initial conditions can occur as a result of the peculiar eigenspace geometry and its dimensionality collapse near these singularities. By using complementary numerical and analytical approaches, we show that, in the parity-time (PT) phase near exceptional points, the logarithm of the maximum optical power amplification scales linearly with the order of the exceptional point. We focus in our discussion on photonic systems, but we note that our results apply to other physical systems as well.

  17. XVIS: Visualization for the Extreme-Scale Scientific-Computation Ecosystem Final Scientific/Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States); Maynard, Robert [Kitware, Inc., Clifton Park, NY (United States)

    2017-10-27

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. The XVis project brought together collaborators from predominant DOE projects for visualization on accelerators and combining their respective features into a new visualization toolkit called VTK-m.

  18. Extreme-Scale Bayesian Inference for Uncertainty Quantification of Complex Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Biros, George [Univ. of Texas, Austin, TX (United States)

    2018-01-12

Uncertainty quantification (UQ)—that is, quantifying uncertainties in complex mathematical models and their large-scale computational implementations—is widely viewed as one of the outstanding challenges facing the field of CS&E over the coming decade. The EUREKA project set out to address the most difficult class of UQ problems: those for which both the underlying PDE model as well as the uncertain parameters are of extreme scale. In the project we worked on these extreme-scale challenges in the following four areas: 1. Scalable parallel algorithms for sampling and characterizing the posterior distribution that exploit the structure of the underlying PDEs and parameter-to-observable map. These include structure-exploiting versions of the randomized maximum likelihood method, which aims to overcome the intractability of employing conventional MCMC methods for solving extreme-scale Bayesian inversion problems by appealing to and adapting ideas from large-scale PDE-constrained optimization, which have been very successful at exploring high-dimensional spaces. 2. Scalable parallel algorithms for construction of prior and likelihood functions based on learning methods and non-parametric density estimation. Constructing problem-specific priors remains a critical challenge in Bayesian inference, and more so in high dimensions. Another challenge is construction of likelihood functions that capture unmodeled couplings between observations and parameters. We will create parallel algorithms for non-parametric density estimation using high dimensional N-body methods and combine them with supervised learning techniques for the construction of priors and likelihood functions. 3. Bayesian inadequacy models, which augment physics models with stochastic models that represent their imperfections. The success of the Bayesian inference framework depends on the ability to represent the uncertainty due to imperfections of the mathematical model of the phenomena of interest. This is a

  19. Understanding extreme sea levels for broad-scale coastal impact and adaptation analysis

    Science.gov (United States)

    Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Dangendorf, S.; Hinkel, J.; Slangen, A. B. A.

    2017-07-01

    One of the main consequences of mean sea level rise (SLR) on human settlements is an increase in flood risk due to an increase in the intensity and frequency of extreme sea levels (ESL). While substantial research efforts are directed towards quantifying projections and uncertainties of future global and regional SLR, corresponding uncertainties in contemporary ESL have not been assessed and projections are limited. Here we quantify, for the first time at global scale, the uncertainties in present-day ESL estimates, which have by default been ignored in broad-scale sea-level rise impact assessments to date. ESL uncertainties exceed those from global SLR projections and, assuming that we meet the Paris agreement goals, the projected SLR itself by the end of the century in many regions. Both uncertainties in SLR projections and ESL estimates need to be understood and combined to fully assess potential impacts and adaptation needs.
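A common way to obtain the ESL exceedance probabilities discussed in this record is to fit a generalized extreme value (GEV) distribution to annual maximum water levels; a hedged sketch with synthetic data (the GEV choice and the numbers are the editor's assumptions, and the paper's point is precisely that such statistical-model choices carry large uncertainty):

```python
import numpy as np
from scipy.stats import genextreme

def return_level(annual_maxima, return_period):
    """Fit a GEV to annual maxima and return the level exceeded on
    average once per `return_period` years."""
    c, loc, scale = genextreme.fit(annual_maxima)
    return genextreme.ppf(1.0 - 1.0 / return_period, c, loc=loc, scale=scale)

rng = np.random.default_rng(5)
maxima = rng.gumbel(2.0, 0.3, size=60)   # synthetic annual max sea levels (m)
rl100 = return_level(maxima, 100.0)      # estimated 100-year water level
```

Refitting to bootstrap resamples of `maxima` gives a spread of `rl100` values; it is this kind of spread that the authors show can exceed the projected SLR signal itself in many regions.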

  20. DISPATCH: A Numerical Simulation Framework for the Exa-scale Era. I. Fundamentals

    Science.gov (United States)

    Nordlund, Åke; P Ramsey, Jon; Popovas, Andrius; Küffmeier, Michael

    2018-03-01

    We introduce a high-performance simulation framework that permits the semi-independent, task-based solution of sets of partial differential equations, typically manifesting as updates to a collection of `patches' in space-time. A hybrid MPI/OpenMP execution model is adopted, where work tasks are controlled by a rank-local `dispatcher' which selects, from a set of tasks generally much larger than the number of physical cores (or hardware threads), tasks that are ready for updating. The definition of a task can vary, for example, with some solving the equations of ideal magnetohydrodynamics (MHD), others non-ideal MHD, radiative transfer, or particle motion, and yet others applying particle-in-cell (PIC) methods. Tasks do not have to be grid-based, while tasks that are, may use either Cartesian or orthogonal curvilinear meshes. Patches may be stationary or moving. Mesh refinement can be static or dynamic. A feature of decisive importance for the overall performance of the framework is that time steps are determined and applied locally; this allows potentially large reductions in the total number of updates required in cases when the signal speed varies greatly across the computational domain, and therefore a corresponding reduction in computing time. Another feature is a load balancing algorithm that operates `locally' and aims to simultaneously minimise load and communication imbalance. The framework generally relies on already existing solvers, whose performance is augmented when run under the framework, due to more efficient cache usage, vectorisation, local time-stepping, plus near-linear and, in principle, unlimited OpenMP and MPI scaling.

  1. DISPATCH: a numerical simulation framework for the exa-scale era - I. Fundamentals

    Science.gov (United States)

    Nordlund, Åke; Ramsey, Jon P.; Popovas, Andrius; Küffmeier, Michael

    2018-06-01

    We introduce a high-performance simulation framework that permits the semi-independent, task-based solution of sets of partial differential equations, typically manifesting as updates to a collection of `patches' in space-time. A hybrid MPI/OpenMP execution model is adopted, where work tasks are controlled by a rank-local `dispatcher' which selects, from a set of tasks generally much larger than the number of physical cores (or hardware threads), tasks that are ready for updating. The definition of a task can vary, for example, with some solving the equations of ideal magnetohydrodynamics (MHD), others non-ideal MHD, radiative transfer, or particle motion, and yet others applying particle-in-cell (PIC) methods. Tasks do not have to be grid based, while tasks that are, may use either Cartesian or orthogonal curvilinear meshes. Patches may be stationary or moving. Mesh refinement can be static or dynamic. A feature of decisive importance for the overall performance of the framework is that time-steps are determined and applied locally; this allows potentially large reductions in the total number of updates required in cases when the signal speed varies greatly across the computational domain, and therefore a corresponding reduction in computing time. Another feature is a load balancing algorithm that operates `locally' and aims to simultaneously minimize load and communication imbalance. The framework generally relies on already existing solvers, whose performance is augmented when run under the framework, due to more efficient cache usage, vectorization, local time-stepping, plus near-linear and, in principle, unlimited OpenMP and MPI scaling.
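The local time-stepping feature that both DISPATCH records emphasise can be illustrated with a toy dispatcher (hypothetical code, not from the paper): each patch advances with its own time step, and the dispatcher always updates the task that is furthest behind in time, i.e. "ready".

```python
import heapq

def run_local_timestepping(patch_dts, t_end):
    """Advance patches to t_end, each with its own time step; always
    update the patch whose local time is smallest. Returns total updates."""
    # priority queue of (local_time, patch_id)
    heap = [(0.0, pid) for pid in range(len(patch_dts))]
    heapq.heapify(heap)
    updates = 0
    while heap:
        t, pid = heapq.heappop(heap)
        if t >= t_end:
            continue                     # patch finished; drop it
        updates += 1                     # one update of this patch
        heapq.heappush(heap, (t + patch_dts[pid], pid))
    return updates

# One patch with 4x the signal speed needs a 4x smaller step:
n_local  = run_local_timestepping([0.0625, 0.25, 0.25, 0.25], t_end=1.0)
n_global = run_local_timestepping([0.0625] * 4, t_end=1.0)
```

Here local time-stepping performs 28 updates where a global step at the smallest dt would need 64, mirroring the paper's point that update counts (and hence computing time) drop sharply when the signal speed varies across the domain.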

  2. Measurement Properties of the Lower Extremity Functional Scale: A Systematic Review.

    Science.gov (United States)

    Mehta, Saurabh P; Fulton, Allison; Quach, Cedric; Thistle, Megan; Toledo, Cesar; Evans, Neil A

    2016-03-01

    Systematic review of measurement properties. Many primary studies have examined the measurement properties, such as reliability, validity, and sensitivity to change, of the Lower Extremity Functional Scale (LEFS) in different clinical populations. A systematic review summarizing these properties for the LEFS may provide an important resource. To locate and synthesize evidence on the measurement properties of the LEFS and to discuss the clinical implications of the evidence. A literature search was conducted in 4 databases (PubMed, MEDLINE, Embase, and CINAHL), using predefined search terms. Two reviewers performed a critical appraisal of the included studies using a standardized assessment form. A total of 27 studies were included in the review, of which 18 achieved a very good to excellent methodological quality level. The LEFS scores demonstrated excellent test-retest reliability (intraclass correlation coefficients ranging between 0.85 and 0.99) and demonstrated the expected relationships with measures assessing similar constructs (Pearson correlation coefficient values of greater than 0.7). The responsiveness of the LEFS scores was excellent, as suggested by consistently high effect sizes (greater than 0.8) in patients with different lower extremity conditions. Minimal detectable change at the 90% confidence level (MDC90) for the LEFS scores varied between 8.1 and 15.3 across different reassessment intervals in a wide range of patient populations. The pooled estimate of the MDC90 was 6 points and the minimal clinically important difference was 9 points in patients with lower extremity musculoskeletal conditions, which are indicative of true change and clinically meaningful change, respectively. The results of this review support the reliability, validity, and responsiveness of the LEFS scores for assessing functional impairment in a wide array of patient groups with lower extremity musculoskeletal conditions.
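The MDC90 figures quoted in this record follow from a standard formula, MDC90 = 1.645 * SEM * sqrt(2) with SEM = SD * sqrt(1 - ICC); a sketch with illustrative inputs (the baseline SD and ICC below are assumed for demonstration, not the review's pooled values):

```python
import math

def mdc90(sd_baseline, icc):
    """Minimal detectable change at the 90% confidence level.
    SEM = SD * sqrt(1 - ICC); MDC90 = 1.645 * SEM * sqrt(2)."""
    sem = sd_baseline * math.sqrt(1.0 - icc)
    return 1.645 * sem * math.sqrt(2.0)

# Illustrative: LEFS baseline SD of 12 points, test-retest ICC of 0.95
change_threshold = mdc90(sd_baseline=12.0, icc=0.95)
```

With these assumed inputs the threshold comes out near 6 points, the same order as the review's pooled MDC90; a patient's LEFS change below this value cannot be distinguished from measurement error.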

  3. AN AUTOMATIC DETECTION METHOD FOR EXTREME-ULTRAVIOLET DIMMINGS ASSOCIATED WITH SMALL-SCALE ERUPTION

    Energy Technology Data Exchange (ETDEWEB)

    Alipour, N.; Safari, H. [Department of Physics, University of Zanjan, P.O. Box 45195-313, Zanjan (Iran, Islamic Republic of); Innes, D. E. [Max-Planck Institut fuer Sonnensystemforschung, 37191 Katlenburg-Lindau (Germany)

    2012-02-10

Small-scale extreme-ultraviolet (EUV) dimming often surrounds sites of energy release in the quiet Sun. This paper describes a method for the automatic detection of these small-scale EUV dimmings using a feature-based classifier. The method is demonstrated using sequences of 171 Å images taken by the STEREO/Extreme UltraViolet Imager (EUVI) on 2007 June 13 and by Solar Dynamics Observatory/Atmospheric Imaging Assembly on 2010 August 27. The feature identification relies on recognizing structure in sequences of space-time 171 Å images using the Zernike moments of the images. The Zernike moments of space-time slices with events and non-events are distinctive enough to be separated using a support vector machine (SVM) classifier. The SVM is trained using 150 event and 700 non-event space-time slices. We find a total of 1217 events in the EUVI images and 2064 events in the AIA images on the days studied. Most of the events are found between latitudes -35° and +35°. The sizes and expansion speeds of central dimming regions are extracted using a region-growing algorithm. The histograms of the sizes in both EUVI and AIA follow a steep power law with a slope of about -5. The AIA slope extends to smaller sizes before turning over. The mean velocity of 1325 dimming regions seen by AIA is found to be about 14 km s⁻¹.

  4. Modelling of spatio-temporal precipitation relevant for urban hydrology with focus on scales, extremes and climate change

    DEFF Research Database (Denmark)

    Sørup, Hjalte Jomo Danielsen

    -correlation lengths for sub-daily extreme precipitation besides having too low intensities. Especially the wrong spatial correlation structure is disturbing from an urban hydrological point of view as short-term extremes will cover too much ground if derived directly from bias corrected regional climate model output...... of precipitation are compared and used to rank climate models with respect to performance metrics. The four different observational data sets themselves are compared at daily temporal scale with respect to climate indices for mean and extreme precipitation. Data density seems to be a crucial parameter for good...... happening in summer and most of the daily extremes in fall. This behaviour is in good accordance with reality where short term extremes originate in convective precipitation cells that occur when it is very warm and longer term extremes originate in frontal systems that dominate the fall and winter seasons...

  5. Analyzing extreme sea levels for broad-scale impact and adaptation studies

    Science.gov (United States)

    Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Dangendorf, S.; Hinkel, J.; Slangen, A.

    2017-12-01

    Coastal impact and adaptation assessments require detailed knowledge of extreme sea levels (ESL), because increasing damage due to extreme events is one of the major consequences of sea-level rise (SLR) and climate change. Over the last few decades, substantial research efforts have been directed towards improved understanding of past and future SLR; different scenarios were developed with process-based or semi-empirical models and used for coastal impact studies at various temporal and spatial scales to guide coastal management and adaptation efforts. Uncertainties in future SLR are typically accounted for by analyzing the impacts associated with a range of scenarios and model ensembles. ESL distributions are then displaced vertically according to the SLR scenarios under the inherent assumption that we have perfect knowledge of the statistics of extremes. However, there is still a limited understanding of present-day ESL, which is largely ignored in most impact and adaptation analyses. The two key uncertainties stem from: (1) numerical models that are used to generate long time series of storm surge water levels, and (2) statistical models used for determining present-day ESL exceedance probabilities. There is no universally accepted approach to obtaining such values for broad-scale flood risk assessments, and while substantial research has explored SLR uncertainties, we quantify, for the first time globally, key uncertainties in ESL estimates. We find that contemporary ESL uncertainties exceed those from SLR projections and, assuming that the Paris Agreement targets are met, the projected SLR itself by the end of the century. Our results highlight the necessity to further improve our understanding of uncertainties in ESL estimates through (1) continued improvement of the numerical and statistical models used to simulate and analyze coastal water levels, and (2) exploitation of the rich observational database and continued data archeology to obtain longer time series and remove model bias.
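The statistical-model side of ESL estimation is typically an extreme-value fit to observed water-level maxima. A minimal sketch, assuming a generalized extreme value (GEV) fit to synthetic annual maxima with SciPy; all parameter values and the record length are illustrative, not from the study:

```python
# Fit a GEV distribution to (synthetic) annual maximum water levels and
# derive a return level, i.e. an ESL exceedance threshold.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# Hypothetical 60-year record of annual maxima (metres above datum)
annual_maxima = genextreme.rvs(c=-0.1, loc=2.0, scale=0.25, size=60,
                               random_state=rng)

shape, loc, scale = genextreme.fit(annual_maxima)

# 100-year return level: the level exceeded with probability 1/100 per year
rl_100 = genextreme.ppf(1 - 1 / 100, shape, loc=loc, scale=scale)
print(f"100-year return level: {rl_100:.2f} m")
```

The uncertainty discussed in the abstract enters through both the fitted parameters (sampling uncertainty from short records) and the modeled water levels used as input.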

  6. Resilience Design Patterns - A Structured Approach to Resilience at Extreme Scale (version 1.0)

    Energy Technology Data Exchange (ETDEWEB)

    Hukerikar, Saurabh [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Engelmann, Christian [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-10-01

    Reliability is a serious concern for future extreme-scale high-performance computing (HPC) systems. Projections based on the current generation of HPC systems and technology roadmaps suggest very high fault rates in future systems. The errors resulting from these faults will propagate and generate various kinds of failures, which may result in outcomes ranging from result corruptions to catastrophic application crashes. Practical limits on power consumption in HPC systems will require future systems to embrace innovative architectures, increasing the levels of hardware and software complexity. The resilience challenge for extreme-scale HPC systems requires management of various hardware and software technologies that are capable of handling a broad set of fault models at accelerated fault rates. These techniques must seek to improve resilience at reasonable overheads to power consumption and performance. While the HPC community has developed various solutions, application-level as well as system-based solutions, the solution space of HPC resilience techniques remains fragmented. There are no formal methods and metrics to investigate and evaluate resilience holistically in HPC systems that consider impact scope, handling coverage, and performance & power efficiency across the system stack. Additionally, few of the current approaches are portable to newer architectures and software ecosystems, which are expected to be deployed on future systems. In this document, we develop a structured approach to the management of HPC resilience based on the concept of resilience-based design patterns. A design pattern is a general repeatable solution to a commonly occurring problem. We identify the commonly occurring problems and solutions used to deal with faults, errors and failures in HPC systems. The catalog of resilience design patterns provides designers with reusable design elements. We define a design framework that enhances our understanding of the important

  7. Resilience Design Patterns - A Structured Approach to Resilience at Extreme Scale (version 1.1)

    Energy Technology Data Exchange (ETDEWEB)

    Hukerikar, Saurabh [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Engelmann, Christian [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-12-01

    Reliability is a serious concern for future extreme-scale high-performance computing (HPC) systems. Projections based on the current generation of HPC systems and technology roadmaps suggest the prevalence of very high fault rates in future systems. The errors resulting from these faults will propagate and generate various kinds of failures, which may result in outcomes ranging from result corruptions to catastrophic application crashes. Therefore the resilience challenge for extreme-scale HPC systems requires management of various hardware and software technologies that are capable of handling a broad set of fault models at accelerated fault rates. Also, due to practical limits on power consumption in HPC systems, future systems are likely to embrace innovative architectures, increasing the levels of hardware and software complexity. As a result, the techniques that seek to improve resilience must navigate the complex trade-off space between resilience and the overheads to power consumption and performance. While the HPC community has developed various resilience solutions, application-level techniques as well as system-based solutions, the solution space of HPC resilience techniques remains fragmented. There are no formal methods and metrics to investigate and evaluate resilience holistically in HPC systems that consider impact scope, handling coverage, and performance & power efficiency across the system stack. Additionally, few of the current approaches are portable to newer architectures and software environments that will be deployed on future systems. In this document, we develop a structured approach to the management of HPC resilience using the concept of resilience-based design patterns. A design pattern is a general repeatable solution to a commonly occurring problem. We identify the commonly occurring problems and solutions used to deal with faults, errors and failures in HPC systems.
Each established solution is described in the form of a pattern that

  8. Durango: Scalable Synthetic Workload Generation for Extreme-Scale Application Performance Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Carothers, Christopher D. [Rensselaer Polytechnic Institute (RPI); Meredith, Jeremy S. [ORNL; Blanco, Marc [Rensselaer Polytechnic Institute (RPI); Vetter, Jeffrey S. [ORNL; Mubarak, Misbah [Argonne National Laboratory; LaPre, Justin [Rensselaer Polytechnic Institute (RPI); Moore, Shirley V. [ORNL

    2017-05-01

    Performance modeling of extreme-scale applications on accurate representations of potential architectures is critical for designing next generation supercomputing systems because it is impractical to construct prototype systems at scale with new network hardware in order to explore designs and policies. However, these simulations often rely on static application traces that can be difficult to work with because of their size and lack of flexibility to extend or scale up without rerunning the original application. To address this problem, we have created a new technique for generating scalable, flexible workloads from real applications, and we have implemented a prototype, called Durango, that combines a proven analytical performance modeling language, Aspen, with the massively parallel HPC network modeling capabilities of the CODES framework. Our models are compact, parameterized and representative of real applications with computation events. They are not resource intensive to create and are portable across simulator environments. We demonstrate the utility of Durango by simulating the LULESH application in the CODES simulation environment on several topologies and show that Durango is practical to use for simulation without loss of fidelity, as quantified by simulation metrics. During our validation of Durango's generated communication model of LULESH, we found that the original LULESH miniapp code had a latent bug where the MPI_Waitall operation was used incorrectly. This finding underscores the potential need for a tool such as Durango, beyond its benefits for flexible workload generation and modeling. Additionally, we demonstrate the efficacy of Durango's direct integration approach, which links Aspen into CODES as part of the running network simulation model. Here, Aspen generates the application-level computation timing events, which in turn drive the start of a network communication phase. Results show that Durango's performance scales well when

  9. Extreme Temperature Regimes during the Cool Season and their Associated Large-Scale Circulations

    Science.gov (United States)

    Xie, Z.

    2015-12-01

    In the cool season (November-March), extreme temperature events (ETEs) regularly hit the continental United States (US) and have significant societal impacts. According to the anomalies of the surface air temperature (SAT), there are two typical types of ETEs: cold waves (CWs) and warm waves (WWs). This study used cluster analysis to categorize the CWs and WWs into four distinct regimes each and investigated their associated large-scale circulations on the intra-seasonal time scale. Most of the CW regimes have a large areal impact over the continental US; however, the distribution of cold SAT anomalies varies considerably among the four regimes. At sea level, the four CW regimes are characterized by anomalous high pressure over North America (near and to the west of the cold anomaly) with different extensions and orientations. As a result, anomalous northerlies along the east flank of the anomalous high pressure convey cold air into the continental US. In the middle troposphere, the leading two groups feature a large-scale, zonally elongated circulation anomaly pattern, while the other two regimes exhibit a synoptic wavetrain pattern with meridionally elongated features. The WW regimes show some symmetry and anti-symmetry with respect to the CW regimes: they are characterized by anomalous low pressure and southerly winds over North America. The first and fourth groups are affected by remote forcing emanating from the North Pacific, while the others appear to be mainly locally forced.

  10. Recovery Act - CAREER: Sustainable Silicon -- Energy-Efficient VLSI Interconnect for Extreme-Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Chiang, Patrick [Oregon State Univ., Corvallis, OR (United States)

    2014-01-31

    The research goal of this CAREER proposal is to develop energy-efficient VLSI interconnect circuits and systems that will facilitate future massively-parallel, high-performance computing. Extreme-scale computing will exhibit massive parallelism on multiple vertical levels, from thousands of computational units on a single processor to thousands of processors in a single data center. Unfortunately, the energy required to communicate between these units at every level (on-chip, off-chip, off-rack) will be the critical limitation to energy efficiency. Therefore, the PI's career goal is to become a leading researcher in the design of energy-efficient VLSI interconnect for future computing systems.

  11. Data co-processing for extreme scale analysis level II ASC milestone (4745).

    Energy Technology Data Exchange (ETDEWEB)

    Rogers, David; Moreland, Kenneth D.; Oldfield, Ron A.; Fabian, Nathan D.

    2013-03-01

    Exascale supercomputing will embody many revolutionary changes in the hardware and software of high-performance computing. A particularly pressing issue is gaining insight into the science behind the exascale computations. Power and I/O speed constraints will fundamentally change current visualization and analysis workflows. A traditional post-processing workflow involves storing simulation results to disk and later retrieving them for visualization and data analysis. However, at exascale, scientists and analysts will need a range of options for moving data to persistent storage, as the current offline or post-processing pipelines will not be able to capture the data necessary for data analysis of these extreme scale simulations. This Milestone explores two alternate workflows, characterized as in situ and in transit, and compares them. We find each to have its own merits and faults, and we provide information to help pick the best option for a particular use.

  12. Enabling Structured Exploration of Workflow Performance Variability in Extreme-Scale Environments

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin; Stephan, Eric G.; Raju, Bibi; Altintas, Ilkay; Elsethagen, Todd O.; Krishnamoorthy, Sriram

    2015-11-15

    Workflows are taking an increasingly important role in orchestrating complex scientific processes in extreme scale and highly heterogeneous environments. However, to date we cannot reliably predict, understand, and optimize workflow performance. Sources of performance variability and in particular the interdependencies of workflow design, execution environment and system architecture are not well understood. While there is a rich portfolio of tools for performance analysis, modeling and prediction for single applications in homogenous computing environments, these are not applicable to workflows, due to the number and heterogeneity of the involved workflow and system components and their strong interdependencies. In this paper, we investigate workflow performance goals and identify factors that could have a relevant impact. Based on our analysis, we propose a new workflow performance provenance ontology, the Open Provenance Model-based WorkFlow Performance Provenance, or OPM-WFPP, that will enable the empirical study of workflow performance characteristics and variability including complex source attribution.

  13. Topology-oblivious optimization of MPI broadcast algorithms on extreme-scale platforms

    KAUST Repository

    Hasanov, Khalid

    2015-11-01

    Significant research has been conducted in collective communication operations, in particular in MPI broadcast, on distributed memory platforms. Most of the research efforts aim to optimize the collective operations for particular architectures by taking into account either their topology or platform parameters. In this work we propose a simple but general approach to optimization of the legacy MPI broadcast algorithms, which are widely used in MPICH and Open MPI. The proposed optimization technique is designed to address the challenge of extreme scale of future HPC platforms. It is based on hierarchical transformation of the traditionally flat logical arrangement of communicating processors. Theoretical analysis and experimental results on IBM BlueGene/P and a cluster of the Grid'5000 platform are presented.
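The benefit of the hierarchical transformation can be illustrated with a toy cost model that counts sequential message steps for a flat (linear-tree) broadcast versus a two-level arrangement of the same processes. This simplified model ignores message size, segmentation and network topology, and is only a sketch of the idea, not the paper's analysis:

```python
import math

def flat_steps(p):
    """Flat linear-tree broadcast: the root sends the message to each of
    the other p - 1 processes in turn."""
    return p - 1

def hierarchical_steps(p, g):
    """Two-level arrangement: first broadcast among g group leaders,
    then in parallel inside each group of p // g processes."""
    return (g - 1) + (p // g - 1)

p = 4096
# Best divisor-valued group count; the minimum lies near sqrt(p)
best_g = min(range(1, p + 1),
             key=lambda g: hierarchical_steps(p, g) if p % g == 0
             else math.inf)
print(flat_steps(p), hierarchical_steps(p, best_g), best_g)
```

For 4096 processes the flat arrangement needs 4095 sequential steps, while the two-level arrangement with 64 groups of 64 needs only 126, which is the intuition behind applying the transformation to legacy flat algorithms.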

  14. Large-Scale Atmospheric Circulation Patterns Associated with Temperature Extremes as a Basis for Model Evaluation: Methodological Overview and Results

    Science.gov (United States)

    Loikith, P. C.; Broccoli, A. J.; Waliser, D. E.; Lintner, B. R.; Neelin, J. D.

    2015-12-01

    Anomalous large-scale circulation patterns often play a key role in the occurrence of temperature extremes. For example, large-scale circulation can drive horizontal temperature advection or influence local processes that lead to extreme temperatures, such as by inhibiting moderating sea breezes, promoting downslope adiabatic warming, and affecting the development of cloud cover. Additionally, large-scale circulation can influence the shape of temperature distribution tails, with important implications for the magnitude of future changes in extremes. As a result of the prominent role these patterns play in the occurrence and character of extremes, the way in which temperature extremes change in the future will be highly influenced by if and how these patterns change. It is therefore critical to identify and understand the key patterns associated with extremes at local to regional scales in the current climate and to use this foundation as a target for climate model validation. This presentation provides an overview of recent and ongoing work aimed at developing and applying novel approaches to identifying and describing the large-scale circulation patterns associated with temperature extremes in observations and using this foundation to evaluate state-of-the-art global and regional climate models. Emphasis is given to anomalies in sea level pressure and 500 hPa geopotential height over North America using several methods to identify circulation patterns, including self-organizing maps and composite analysis. Overall, evaluation results suggest that models are able to reproduce observed patterns associated with temperature extremes with reasonable fidelity in many cases. Model skill is often highest when and where synoptic-scale processes are the dominant mechanisms for extremes, and lower where sub-grid scale processes (such as those related to topography) are important. 
Where model skill in reproducing these patterns is high, it can be inferred that extremes are

  15. Establishing the Turkish version of the SIGAM mobility scale, and determining its validity and reliability in lower extremity amputees.

    Science.gov (United States)

    Yilmaz, Hülya; Gafuroğlu, Ümit; Ryall, Nicola; Yüksel, Selcen

    2018-02-01

    The aim of this study is to adapt the Special Interest Group in Amputee Medicine (SIGAM) mobility scale to Turkish, and to test its validity and reliability in lower extremity amputees. Adaptation of the scale into Turkish was performed by following the steps in the American Association of Orthopedic Surgeons (AAOS) guideline. The Turkish version of the scale was tested twice, at 0 and 72 hours, on 109 patients who had lower extremity amputations. The reliability of the Turkish version was tested for internal consistency and test-retest reliability. Structural validity was tested using the "scale validity" method. For this purpose, the scores of the Short Form-36 (SF-36), Functional Ambulation Scale (FAS), Get Up and Go Test, and Satisfaction with the Prosthesis Questionnaire (SATPRO) were calculated and analyzed using Spearman's correlation test. Cronbach's alpha coefficient was 0.67 for the Turkish version of the SIGAM mobility scale. Cohen's kappa coefficients were between 0.224 and 0.999. Repeatability according to the results of the SIGAM mobility scale (grades A-F) was 0.822. We found significant and strong positive correlations of the SIGAM mobility scale results with the FAS, Get Up and Go Test, SATPRO, and all of the SF-36 subscales. In our study, the Turkish version of the SIGAM mobility scale was found to be a reliable, valid, and easy-to-use scale for measuring mobility in lower extremity amputees in everyday practice. Implications for Rehabilitation: Amputation is the surgical removal of a severely injured and nonfunctional extremity at the level of one or more bones proximal to the body. Loss of a lower extremity is one of the most important conditions causing functional disability. The Special Interest Group in Amputee Medicine (SIGAM) mobility scale contains 21 questions that evaluate the mobility of lower extremity amputees.
Lack of a specific Turkish scale that evaluates rehabilitation results and mobility of lower extremity amputees, and determines their

  16. The structure and large-scale organization of extreme cold waves over the conterminous United States

    Science.gov (United States)

    Xie, Zuowei; Black, Robert X.; Deng, Yi

    2017-12-01

    Extreme cold waves (ECWs) occurring over the conterminous United States (US) are studied through a systematic identification and documentation of their local synoptic structures, associated large-scale meteorological patterns (LMPs), and forcing mechanisms external to the US. Focusing on the boreal cool season (November-March) for 1950‒2005, a hierarchical cluster analysis identifies three ECW patterns, respectively characterized by cold surface air temperature anomalies over the upper midwest (UM), northwestern (NW), and southeastern (SE) US. Locally, ECWs are synoptically organized by anomalous high pressure and northerly flow. At larger scales, the UM LMP features a zonal dipole in the mid-tropospheric height field over North America, while the NW and SE LMPs each include a zonal wave train extending from the North Pacific across North America into the North Atlantic. The Community Climate System Model version 4 (CCSM4) in general simulates the three ECW patterns quite well and successfully reproduces the observed enhancements in the frequency of their associated LMPs. La Niña and the cool phase of the Pacific Decadal Oscillation (PDO) favor the occurrence of NW ECWs, while the warm PDO phase, low Arctic sea ice extent and high Eurasian snow cover extent (SCE) are associated with elevated SE-ECW frequency. Additionally, high Eurasian SCE is linked to increases in the occurrence likelihood of UM ECWs.
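The hierarchical clustering step used to separate ECW patterns can be sketched with SciPy's Ward linkage on synthetic anomaly "maps". The data, dimensions, and cluster count here are invented stand-ins for the study's SAT anomaly fields, not its actual inputs:

```python
# Toy illustration: group synthetic cold-wave temperature-anomaly patterns
# into three regimes via agglomerative (hierarchical) clustering.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
# Hypothetical: 30 events x 10 grid points, drawn around three distinct
# anomaly patterns (stand-ins for the UM, NW and SE regimes).
centers = rng.normal(size=(3, 10)) * 3.0
events = np.vstack([c + rng.normal(scale=0.5, size=(10, 10))
                    for c in centers])

Z = linkage(events, method="ward")          # build the cluster tree
labels = fcluster(Z, t=3, criterion="maxclust")  # cut it into 3 regimes
print(sorted(set(labels)))
```

Composite LMPs would then be formed by averaging the anomaly fields of the events within each cluster.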

  17. Automatic detection of ischemic stroke based on scaling exponent electroencephalogram using extreme learning machine

    Science.gov (United States)

    Adhi, H. A.; Wijaya, S. K.; Prawito; Badri, C.; Rezal, M.

    2017-03-01

    Stroke is a cerebrovascular disease caused by the obstruction of blood flow to the brain. Stroke is the leading cause of death in Indonesia and the second leading cause worldwide; it is also a major cause of disability. Ischemic stroke accounts for most of all stroke cases. Obstruction of blood flow can cause tissue damage, which results in electrical changes in the brain that can be observed through the electroencephalogram (EEG). In this study, we present the results of automatic detection of ischemic stroke and normal subjects based on the scaling exponent of the EEG obtained through detrended fluctuation analysis (DFA), using an extreme learning machine (ELM) as the classifier. The signal processing was performed on 18 channels of EEG in the range of 0-30 Hz. The scaling exponents of the subjects were used as the input for the ELM to classify ischemic stroke. The performance of the detection was assessed by the values of accuracy, sensitivity and specificity. The results showed that the performance of the proposed method in classifying ischemic stroke was 84% accuracy, 82% sensitivity and 87% specificity, with 120 hidden neurons and the sine function as the activation function of the ELM.
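The scaling exponent used as the classifier input comes from detrended fluctuation analysis. A compact sketch of DFA follows; the window sizes and the white-noise test signal are illustrative (for uncorrelated noise the exponent should come out near 0.5), and this is not the study's exact implementation:

```python
import numpy as np

def dfa_exponent(x, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis: slope of log F(n) vs log n."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    flucts = []
    for n in scales:
        n_seg = len(y) // n
        f2 = []
        for i in range(n_seg):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)     # remove local linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))  # RMS fluctuation at scale n
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(0)
alpha = dfa_exponent(rng.normal(size=4096))  # white noise: alpha ~ 0.5
print(round(alpha, 2))
```

In the study, one such exponent per EEG channel forms the feature vector fed to the ELM.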

  18. Kinetic turbulence simulations at extreme scale on leadership-class systems

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Bei [Princeton Univ., Princeton, NJ (United States); Ethier, Stephane [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Tang, William [Princeton Univ., Princeton, NJ (United States); Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Williams, Timothy [Argonne National Lab. (ANL), Argonne, IL (United States); Ibrahim, Khaled Z. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Madduri, Kamesh [The Pennsylvania State Univ., University Park, PA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2013-01-01

    Reliable predictive simulation capability addressing confinement properties in magnetically confined fusion plasmas is critically important for ITER, a 20 billion dollar international burning plasma device under construction in France. The complex study of kinetic turbulence, which can severely limit the energy confinement and impact the economic viability of fusion systems, requires simulations at extreme scale for such an unprecedented device size. Our newly optimized, global, ab initio particle-in-cell code solving the nonlinear equations underlying gyrokinetic theory achieves excellent performance with respect to "time to solution" at the full capacity of the IBM Blue Gene/Q on the 786,432 cores of Mira at ALCF and, recently, on the 1,572,864 cores of Sequoia at LLNL. Recent multithreading and domain decomposition optimizations in the new GTC-P code represent critically important software advances for modern, low-memory-per-core systems by enabling routine simulations at unprecedented size (130 million grid points, ITER scale) and resolution (65 billion particles).

  19. Communicating Climate Uncertainties: Challenges and Opportunities Related to Spatial Scales, Extreme Events, and the Warming 'Hiatus'

    Science.gov (United States)

    Casola, J. H.; Huber, D.

    2013-12-01

    Many media, academic, government, and advocacy organizations have achieved sophistication in developing effective messages based on scientific information, and can quickly translate salient aspects of emerging climate research and evolving observations. However, there are several ways in which valid messages can be misconstrued by decision makers, leading them to inaccurate conclusions about the risks associated with climate impacts. Three cases will be discussed: 1) Issues of spatial scale in interpreting climate observations: Local climate observations may contradict summary statements about the effects of climate change on larger regional or global spatial scales. Effectively addressing these differences often requires communicators to understand local and regional climate drivers, and the distinction between a 'signal' associated with climate change and local climate 'noise.' Hydrological statistics in Missouri and California are shown to illustrate this case. 2) Issues of complexity related to extreme events: Climate change is typically invoked following a wide range of damaging meteorological events (e.g., heat waves, landfalling hurricanes, tornadoes), regardless of the strength of the relationship between anthropogenic climate change and the frequency or severity of that type of event. Examples are drawn from media coverage of several recent events, contrasting useful and potentially confusing word choices and frames. 3) Issues revolving around climate sensitivity: The so-called 'pause' or 'hiatus' in global warming has reverberated strongly through political and business discussions of climate change. Addressing the recent slowdown in warming yields an important opportunity to raise climate literacy in these communities. Attempts to use recent observations as a wedge between climate 'believers' and 'deniers' are likely to be counterproductive. Examples are drawn from Congressional testimony and media stories. All three cases illustrate ways that decision

  20. A Fast SVD-Hidden-nodes based Extreme Learning Machine for Large-Scale Data Analytics.

    Science.gov (United States)

    Deng, Wan-Yu; Bai, Zuo; Huang, Guang-Bin; Zheng, Qing-Hua

    2016-05-01

    Big dimensional data is a growing trend that is emerging in many real world contexts, extending from web mining, gene expression analysis, protein-protein interaction to high-frequency financial data. Nowadays, there is a growing consensus that the increasing dimensionality poses impeding effects on the performances of classifiers, which is termed the "peaking phenomenon" in the field of machine intelligence. To address the issue, dimensionality reduction is commonly employed as a preprocessing step on the Big dimensional data before building the classifiers. In this paper, we propose an Extreme Learning Machine (ELM) approach for large-scale data analytics. In contrast to existing approaches, we embed hidden nodes that are designed using singular value decomposition (SVD) into the classical ELM. These SVD nodes in the hidden layer are shown to capture the underlying characteristics of the Big dimensional data well, exhibiting excellent generalization performances. The drawback of using SVD on the entire dataset, however, is the high computational complexity involved. To address this, a fast divide and conquer approximation scheme is introduced to maintain computational tractability on high volume data. The resultant algorithm proposed is labeled here as Fast Singular Value Decomposition-Hidden-nodes based Extreme Learning Machine, or FSVD-H-ELM in short. In FSVD-H-ELM, instead of identifying the SVD hidden nodes directly from the entire dataset, SVD hidden nodes are derived from multiple random subsets of data sampled from the original dataset. Comprehensive experiments and comparisons are conducted to assess the FSVD-H-ELM against other state-of-the-art algorithms. The results obtained demonstrated the superior generalization performance and efficiency of the FSVD-H-ELM.

  1. Resilience Design Patterns: A Structured Approach to Resilience at Extreme Scale

    International Nuclear Information System (INIS)

    Engelmann, Christian; Hukerikar, Saurabh

    2017-01-01

    Reliability is a serious concern for future extreme-scale high-performance computing (HPC) systems. Projections based on the current generation of HPC systems and technology roadmaps suggest the prevalence of very high fault rates in future systems. While the HPC community has developed various resilience solutions, application-level techniques as well as system-based solutions, the solution space remains fragmented. There are no formal methods and metrics to integrate the various HPC resilience techniques into composite solutions, nor are there methods to holistically evaluate the adequacy and efficacy of such solutions in terms of their protection coverage, and their performance & power efficiency characteristics. Additionally, few of the current approaches are portable to newer architectures and software environments that will be deployed on future systems. In this paper, we develop a structured approach to the design, evaluation and optimization of HPC resilience using the concept of design patterns. A design pattern is a general repeatable solution to a commonly occurring problem. We identify the problems caused by various types of faults, errors and failures in HPC systems and the techniques used to deal with these events. Each well-known solution that addresses a specific HPC resilience challenge is described in the form of a pattern. We develop a complete catalog of such resilience design patterns, which may be used by system architects, system software and tools developers, application programmers, as well as users and operators as essential building blocks when designing and deploying resilience solutions. We also develop a design framework that enhances a designer's understanding of the opportunities for integrating multiple patterns across layers of the system stack and of the important constraints during implementation of the individual patterns. It is also useful for defining mechanisms and interfaces to coordinate flexible fault management across

  2. A Pervasive Parallel Processing Framework for Data Visualization and Analysis at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States)

    2014-11-01

    The evolution of the computing world from teraflop to petaflop has been relatively effortless, with several of the existing programming models scaling effectively to the petascale. The migration to exascale, however, poses considerable challenges. All industry trends indicate that the exascale machine will be built using processors containing hundreds to thousands of cores per chip. It can be inferred that efficient concurrency on exascale machines requires a massive number of concurrent threads, each performing many operations on a localized piece of data. Currently, visualization libraries and applications are based on what is known as the visualization pipeline. In the pipeline model, algorithms are encapsulated as filters with inputs and outputs. These filters are connected by setting the output of one component to the input of another. Parallelism in the visualization pipeline is achieved by replicating the pipeline for each processing thread. This works well for today’s distributed memory parallel computers but cannot be sustained when operating on processors with thousands of cores. Our project investigates a new visualization framework designed to exhibit the pervasive parallelism necessary for extreme scale machines. Our framework achieves this by defining algorithms in terms of worklets, which are localized stateless operations. Worklets are atomic operations that execute when invoked, unlike filters, which execute when a pipeline request occurs. The worklet design allows execution on a massive number of lightweight threads with minimal overhead. Only with such fine-grained parallelism can we hope to fill the billions of threads we expect will be necessary for efficient computation on an exascale machine.
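
    The worklet idea, stateless per-element operations mapped over data by a scheduler rather than filters pulled by a pipeline, can be sketched as follows. The names and the thread-pool scheduler are illustrative assumptions, not the project's actual framework:

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def magnitude_worklet(point):
        """Stateless operation on one localized piece of data."""
        x, y, z = point
        return (x * x + y * y + z * z) ** 0.5

    def invoke(worklet, data, workers=4):
        """Toy scheduler: map the worklet over the data concurrently.
        Because worklets carry no state, any number of them can run at once."""
        with ThreadPoolExecutor(max_workers=workers) as pool:
            return list(pool.map(worklet, data))

    points = [(3.0, 4.0, 0.0), (1.0, 2.0, 2.0)]
    print(invoke(magnitude_worklet, points))  # → [5.0, 3.0]
    ```

    The contrast with a pipeline filter is that nothing here waits for an upstream request: each invocation is an atomic, independently schedulable unit.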

  3. ExM:System Support for Extreme-Scale, Many-Task Applications

    Energy Technology Data Exchange (ETDEWEB)

    Katz, Daniel S

    2011-05-31

    The ever-increasing power of supercomputer systems is both driving and enabling the emergence of new problem-solving methods that require the efficient execution of many concurrent and interacting tasks. Methodologies such as rational design (e.g., in materials science), uncertainty quantification (e.g., in engineering), parameter estimation (e.g., for chemical and nuclear potential functions, and in economic energy systems modeling), massive dynamic graph pruning (e.g., in phylogenetic searches), Monte-Carlo-based iterative fixing (e.g., in protein structure prediction), and inverse modeling (e.g., in reservoir simulation) all have these requirements. These many-task applications frequently have aggregate computing needs that demand the fastest computers. For example, proposed next-generation climate model ensemble studies will involve 1,000 or more runs, each requiring 10,000 cores for a week, to characterize model sensitivity to initial condition and parameter uncertainty. The goal of the ExM project is to achieve the technical advances required to execute such many-task applications efficiently, reliably, and easily on petascale and exascale computers. In this way, we will open up extreme-scale computing to new problem solving methods and application classes. In this document, we report on combined technical progress of the collaborative ExM project, and the institutional financial status of the portion of the project at University of Chicago, over the first 8 months (through April 30, 2011).

  4. Prototyping a large-scale distributed system for the Great Observatories era - NASA Astrophysics Data System (ADS)

    Science.gov (United States)

    Shames, Peter

    1990-01-01

    The NASA Astrophysics Data System (ADS) is a distributed information system intended to support research in the Great Observatories era, to simplify access to data, and to enable simultaneous analyses of multispectral data sets. Here, the user agent and interface, its functions, and system components are examined, and the system architecture and infrastructure are addressed. The present status of the system and related future activities are also discussed.

  5. The nonstationary impact of local temperature changes and ENSO on extreme precipitation at the global scale

    Science.gov (United States)

    Sun, Qiaohong; Miao, Chiyuan; Qiao, Yuanyuan; Duan, Qingyun

    2017-12-01

    The El Niño-Southern Oscillation (ENSO) and local temperature are important drivers of extreme precipitation. Understanding the impact of ENSO and temperature on the risk of extreme precipitation over global land will provide a foundation for risk assessment and climate-adaptive design of infrastructure in a changing climate. In this study, nonstationary generalized extreme value distributions were used to model extreme precipitation over global land for the period 1979-2015, with an ENSO indicator and temperature as covariates. Risk factors were estimated to quantify the contrast between the influence of different ENSO phases and temperature. The results show that extreme precipitation is dominated by ENSO over 22% of global land and by temperature over 26% of global land. With a warming climate, the risk of high-intensity daily extreme precipitation increases at high latitudes but decreases in tropical regions. For ENSO, large parts of North America, southern South America, and southeastern and northeastern China are shown to suffer greater risk in El Niño years, with more than double the chance of intense extreme precipitation in El Niño years compared with La Niña years. Moreover, regions with more intense precipitation are more sensitive to ENSO. Global climate models were used to investigate the changing relationship between extreme precipitation and the covariates. The risk of extreme, high-intensity precipitation increases across high latitudes of the Northern Hemisphere but decreases in middle and lower latitudes under a warming climate scenario, and will likely trigger increases in severe flooding and droughts across the globe. However, there are uncertainties associated with the influence of ENSO on predictions of future extreme precipitation, with the spatial extent and risk varying among the different models.
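
    The statistical core of such a study, a generalized extreme value (GEV) distribution whose location parameter varies with a covariate, can be sketched on synthetic data. This is a minimal illustration of the method's shape, not the authors' code; note that SciPy's shape parameter `c` is the negative of the usual GEV shape ξ:

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import genextreme

    # Synthetic "annual maxima" whose GEV location depends on temperature.
    # All coefficients here are arbitrary illustration values.
    rng = np.random.default_rng(0)
    temp = rng.normal(15.0, 2.0, size=200)                       # covariate
    annual_max = genextreme.rvs(c=-0.1, loc=20.0 + 0.8 * temp,
                                scale=3.0, random_state=rng)

    def nll(params, x, covar):
        """Negative log-likelihood of a GEV with location linear in the covariate."""
        b0, b1, log_scale, shape = params
        loc = b0 + b1 * covar                                    # nonstationary location
        return -np.sum(genextreme.logpdf(x, c=shape, loc=loc,
                                         scale=np.exp(log_scale)))

    fit = minimize(nll, x0=[20.0, 0.0, 1.0, -0.1],
                   args=(annual_max, temp), method="Nelder-Mead")
    b0, b1, log_scale, shape = fit.x        # b1 should recover ~0.8 by construction
    ```

    The fitted `b1` quantifies how the distribution of extremes shifts per unit of the covariate, which is exactly the kind of sensitivity the risk factors above summarize.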

  6. Extreme scale multi-physics simulations of the tsunamigenic 2004 Sumatra megathrust earthquake

    Science.gov (United States)

    Ulrich, T.; Gabriel, A. A.; Madden, E. H.; Wollherr, S.; Uphoff, C.; Rettenberger, S.; Bader, M.

    2017-12-01

    SeisSol (www.seissol.org) is an open-source software package based on an arbitrary high-order derivative Discontinuous Galerkin method (ADER-DG). It solves spontaneous dynamic rupture propagation on pre-existing fault interfaces according to non-linear friction laws, coupled to seismic wave propagation with high-order accuracy in space and time (minimal dispersion errors). SeisSol exploits unstructured meshes to account for complex geometries, e.g. high resolution topography and bathymetry, 3D subsurface structure, and fault networks. We present the largest (1500 km of faults) and longest (500 s) dynamic rupture simulation to date, modeling the 2004 Sumatra-Andaman earthquake. We demonstrate the need for end-to-end optimization and petascale performance of scientific software to realize realistic simulations on the extreme scales of subduction zone earthquakes: Considering the full complexity of subduction zone geometries leads inevitably to huge differences in element sizes. The main code improvements include a cache-aware wave propagation scheme and optimizations of the dynamic rupture kernels using code generation. In addition, a novel clustered local-time-stepping scheme for dynamic rupture has been established. Finally, asynchronous output has been implemented to overlap I/O and compute time. We resolve the frictional sliding process on the curved megathrust and a system of splay faults, as well as the seismic wave field and seafloor displacement with frequency content up to 2.2 Hz. We validate the scenario by geodetic, seismological and tsunami observations. The resulting rupture dynamics shed new light on the activation and importance of splay faults.

  7. The scaling of population persistence with carrying capacity does not asymptote in populations of a fish experiencing extreme climate variability.

    Science.gov (United States)

    White, Richard S A; Wintle, Brendan A; McHugh, Peter A; Booker, Douglas J; McIntosh, Angus R

    2017-06-14

    Despite growing concerns regarding increasing frequency of extreme climate events and declining population sizes, the influence of environmental stochasticity on the relationship between population carrying capacity and time-to-extinction has received little empirical attention. While time-to-extinction increases exponentially with carrying capacity in constant environments, theoretical models suggest increasing environmental stochasticity causes asymptotic scaling, thus making minimum viable carrying capacity vastly uncertain in variable environments. Using empirical estimates of environmental stochasticity in fish metapopulations, we showed that increasing environmental stochasticity resulting from extreme droughts was insufficient to create asymptotic scaling of time-to-extinction with carrying capacity in local populations as predicted by theory. Local time-to-extinction increased with carrying capacity due to declining sensitivity to demographic stochasticity, and the slope of this relationship declined significantly as environmental stochasticity increased. However, recent 1 in 25 yr extreme droughts were insufficient to extirpate populations with large carrying capacity. Consequently, large populations may be more resilient to environmental stochasticity than previously thought. The lack of carrying capacity-related asymptotes in persistence under extreme climate variability reveals how small populations, affected by habitat loss or overharvesting, may be disproportionately threatened by increases in extreme climate events with global warming. © 2017 The Author(s).
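
    The central quantity, time-to-extinction as a function of carrying capacity under environmental stochasticity, can be illustrated with a toy ceiling-model Monte Carlo. This is not the authors' fish metapopulation model; all parameters are arbitrary:

    ```python
    import numpy as np

    def mean_extinction_time(K, env_sd, n_reps=200, max_years=2000, seed=1):
        """Ceiling-model Monte Carlo: the log growth rate is redrawn each year
        (environmental stochasticity); the population is capped at carrying
        capacity K and goes extinct when it drops below one individual."""
        rng = np.random.default_rng(seed)
        times = []
        for _ in range(n_reps):
            n = float(K)
            for year in range(1, max_years + 1):
                r = rng.normal(0.1, env_sd)        # good and bad years
                n = min(n * np.exp(r), K)
                if n < 1.0:
                    break
            times.append(year)
        return float(np.mean(times))

    low_var = [mean_extinction_time(K, env_sd=0.5) for K in (10, 100)]
    high_var = [mean_extinction_time(K, env_sd=1.0) for K in (10, 100)]
    ```

    With these arbitrary parameters, persistence still rises with K at both stochasticity levels, but the gain from larger K is flatter under stronger environmental variability, mirroring the qualitative slope result above.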

  8. ERA-40

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The ERA-40 project was to produce and promote the use of a comprehensive set of global analyses describing the state of the atmosphere and land and ocean-wave conditions...

  9. The New York City Operations Support Tool: Supporting Water Supply Operations for Millions in an Era of Changing Patterns in Hydrological Extreme Events

    Science.gov (United States)

    Matonse, A. H.; Porter, J. H.; Frei, A.

    2015-12-01

    Providing an average 1.1 billion gallons (~4.2 × 10⁶ cubic meters) of drinking water per day to approximately nine million people in New York City (NYC) and four upstate counties, the NYC water supply is among the world's largest unfiltered systems. In addition to providing a reliable water supply in terms of water quantity and quality, the city has to fulfill other flow objectives to serve downstream communities. At times, such as during extreme hydrological events, water quality issues may restrict water usage for parts of the system. To support a risk-based water supply decision making process, NYC has developed the Operations Support Tool (OST). OST combines a water supply systems model with reservoir water quality models, near real time data ingestion, data base management and an ensemble hydrological forecast. A number of reports have addressed the frequency and intensities of extreme hydrological events across the continental US. In the northeastern US, studies have indicated an increase in the frequency of extremely large precipitation and streamflow events during the most recent decades. During this presentation we describe OST and, using case studies, we demonstrate how this tool has been useful to support operational decisions. We also want to motivate a discussion about how ongoing changes in patterns of hydrological extreme events elevate the challenge faced by water supply managers and the role of the scientific community to integrate nonstationarity approaches in hydrologic forecast and modeling.

  10. Extreme-Scale Computing Project Aims to Advance Precision Oncology | FNLCR Staging

    Science.gov (United States)

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict drug response, and improve treatments for patients.

  11. Extreme-Scale Computing Project Aims to Advance Precision Oncology | Poster

    Science.gov (United States)

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict drug response, and improve treatments for patients.

  12. Extreme-Scale Computing Project Aims to Advance Precision Oncology | FNLCR

    Science.gov (United States)

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict drug response, and improve treatments for patients.

  13. A Pervasive Parallel Processing Framework for Data Visualization and Analysis at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Kwan-Liu [Univ. of California, Davis, CA (United States)

    2017-02-01

    This project concludes with a functional prototype containing pervasively parallel algorithms that perform demonstrably well on many-core processors, enabling efficient computation on an exascale computer. These algorithms are fundamental for performing data analysis and visualization at extreme scale.

  14. What is not CSR: extremes of CSR perception in the world of business and a strategic view on it in the era of conscious capitalism

    OpenAIRE

    OKOROCHKOVA ANASTASIA

    2016-01-01

    Extremes of Corporate Social Responsibility (CSR) perception are evident in the business world today. Business leaders and other stakeholders struggle to understand what exactly, how, and for what purpose they should practice CSR, and they often narrow it down to different business activities that have no connection with the sustainable development of business. By opposing it to philanthropy and charity; to the practice of social investments; to marketing activities and PR; to the concept of sha...

  15. Extreme rainfall, vulnerability and risk: a continental-scale assessment for South America

    Science.gov (United States)

    Vorosmarty, Charles J.; de Guenni, Lelys Bravo; Wollheim, Wilfred M.; Pellerin, Brian A.; Bjerklie, David M.; Cardoso, Manoel; D'Almeida, Cassiano; Colon, Lilybeth

    2013-01-01

    Extreme weather continues to preoccupy society as a formidable public safety concern bearing huge economic costs. While attention has focused on global climate change and how it could intensify key elements of the water cycle such as precipitation and river discharge, it is the conjunction of geophysical and socioeconomic forces that shapes human sensitivity and risks to weather extremes. We demonstrate here the use of high-resolution geophysical and population datasets together with documentary reports of rainfall-induced damage across South America over a multi-decadal, retrospective time domain (1960–2000). We define and map extreme precipitation hazard, exposure, affected populations, vulnerability and risk, and use these variables to analyse the impact of floods as a water security issue. Geospatial experiments uncover major sources of risk from natural climate variability and population growth, with change in climate extremes bearing a minor role. While rural populations display greatest relative sensitivity to extreme rainfall, urban settings show the highest rates of increasing risk. In the coming decades, rapid urbanization will make South American cities the focal point of future climate threats but also an opportunity for reducing vulnerability, protecting lives and sustaining economic development through both traditional and ecosystem-based disaster risk management systems.

  16. Extreme rainfall, vulnerability and risk: a continental-scale assessment for South America.

    Science.gov (United States)

    Vörösmarty, Charles J; Bravo de Guenni, Lelys; Wollheim, Wilfred M; Pellerin, Brian; Bjerklie, David; Cardoso, Manoel; D'Almeida, Cassiano; Green, Pamela; Colon, Lilybeth

    2013-11-13

    Extreme weather continues to preoccupy society as a formidable public safety concern bearing huge economic costs. While attention has focused on global climate change and how it could intensify key elements of the water cycle such as precipitation and river discharge, it is the conjunction of geophysical and socioeconomic forces that shapes human sensitivity and risks to weather extremes. We demonstrate here the use of high-resolution geophysical and population datasets together with documentary reports of rainfall-induced damage across South America over a multi-decadal, retrospective time domain (1960-2000). We define and map extreme precipitation hazard, exposure, affected populations, vulnerability and risk, and use these variables to analyse the impact of floods as a water security issue. Geospatial experiments uncover major sources of risk from natural climate variability and population growth, with change in climate extremes bearing a minor role. While rural populations display greatest relative sensitivity to extreme rainfall, urban settings show the highest rates of increasing risk. In the coming decades, rapid urbanization will make South American cities the focal point of future climate threats but also an opportunity for reducing vulnerability, protecting lives and sustaining economic development through both traditional and ecosystem-based disaster risk management systems.

  17. Forcings and feedbacks on convection in the 2010 Pakistan flood: Modeling extreme precipitation with interactive large-scale ascent

    Science.gov (United States)

    Nie, Ji; Shaevitz, Daniel A.; Sobel, Adam H.

    2016-09-01

    Extratropical extreme precipitation events are usually associated with large-scale flow disturbances, strong ascent, and large latent heat release. The causal relationships between these factors are often not obvious, however, and the roles of different physical processes in producing the extreme precipitation event can be difficult to disentangle. Here we examine the large-scale forcings and convective heating feedback in the precipitation events that caused the 2010 Pakistan flood, within the column quasi-geostrophic framework. A cloud-resolving model (CRM) is forced with large-scale forcings (other than large-scale vertical motion) computed from the quasi-geostrophic omega equation using input data from a reanalysis data set, and the large-scale vertical motion is diagnosed interactively with the simulated convection. Numerical results show that the positive feedback of convective heating to large-scale dynamics is essential in amplifying the precipitation intensity to the observed values. Orographic lifting is the most important dynamic forcing in both events, while differential potential vorticity advection also contributes to the triggering of the first event. Horizontal moisture advection modulates the extreme events mainly by setting the environmental humidity, which modulates the amplitude of the convection's response to the dynamic forcings. When the CRM is replaced by either a single-column model (SCM) with parameterized convection or a dry model with a reduced effective static stability, the model results show substantial discrepancies compared with reanalysis data. The reasons for these discrepancies are examined, and the implications for global models and theoretical models are discussed.

  18. Sensitivity of extreme precipitation to temperature: the variability of scaling factors from a regional to local perspective

    Science.gov (United States)

    Schroeer, K.; Kirchengast, G.

    2018-06-01

    Potential increases in extreme rainfall-induced hazards in a warming climate have motivated studies to link precipitation intensities to temperature. Increases exceeding the Clausius-Clapeyron (CC) rate of 6-7% per °C are seen in short-duration, convective, high-percentile rainfall at mid-latitudes, but the rates of change cease or revert at regionally variable threshold temperatures due to moisture limitations. It is unclear, however, what these findings mean in terms of the actual risk of extreme precipitation on a regional to local scale. When conditioning precipitation intensities on local temperatures, key influences on the scaling relationship such as from the annual cycle and regional weather patterns need better understanding. Here we analyze these influences, using sub-hourly to daily precipitation data from a dense network of 189 stations in south-eastern Austria. We find that the temperature sensitivities in the mountainous western region are lower than in the eastern lowlands. This is due to the different weather patterns that cause extreme precipitation in these regions. Sub-hourly and hourly intensities intensify at super-CC and CC rates, respectively, up to temperatures of about 17 °C. However, we also find that, because of the regional and seasonal variability of the precipitation intensities, a smaller scaling factor can imply a larger absolute change in intensity. Our insights underline that temperature-precipitation scaling requires careful interpretation of the intent and setting of the study. When this is considered, conditional scaling factors can help to better understand which influences control the intensification of rainfall with temperature on a regional scale.
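
    A common way to estimate such temperature sensitivities, binning intensities by temperature and fitting an exponential to a high percentile, can be sketched as follows. The data are synthetic with a built-in signal of roughly 6.5% per °C; this illustrates the binning-scaling method in general, not the study's own analysis:

    ```python
    import numpy as np

    def scaling_factor(temp, precip, q=0.95, bin_width=2.0, min_count=50):
        """Bin intensities by temperature, take a high percentile per bin,
        and fit an exponential: returns the % change per degree C."""
        edges = np.arange(temp.min(), temp.max() + bin_width, bin_width)
        centers, quantiles = [], []
        for lo, hi in zip(edges[:-1], edges[1:]):
            sel = (temp >= lo) & (temp < hi)
            if sel.sum() >= min_count:                 # skip sparse bins
                centers.append((lo + hi) / 2.0)
                quantiles.append(np.quantile(precip[sel], q))
        slope, _ = np.polyfit(centers, np.log(quantiles), 1)
        return 100.0 * (np.exp(slope) - 1.0)

    rng = np.random.default_rng(2)
    t = rng.uniform(0.0, 20.0, size=20000)                    # event temperatures
    p = rng.gamma(2.0, 1.0, size=20000) * np.exp(0.065 * t)   # ~6.7 %/°C built in
    rate = scaling_factor(t, p)                               # recovers the signal
    ```

    A recovered rate near the CC value of 6-7% per °C would be read as CC scaling; the abstract's caveat is that the same factor can mean very different absolute intensity changes in different regions and seasons.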

  19. Understanding extreme sea levels for broad-scale coastal impact and adaptation analysis

    NARCIS (Netherlands)

    Wahl, T.; Haigh, I.D.; Nicholls, R.J.; Arns, A.; Dangendorf, S.; Hinkel, J.; Slangen, A.B.A.

    2017-01-01

    One of the main consequences of mean sea level rise (SLR) on human settlements is an increase in flood risk due to an increase in the intensity and frequency of extreme sea levels (ESL). While substantial research efforts are directed towards quantifying projections and uncertainties of future

  20. Scale dependency of regional climate modeling of current and future climate extremes in Germany

    Science.gov (United States)

    Tölle, Merja H.; Schefczyk, Lukas; Gutjahr, Oliver

    2017-11-01

    A warmer climate is projected for mid-Europe, with less precipitation in summer, but with intensified extremes of precipitation and near-surface temperature. However, the extent and magnitude of such changes are associated with considerable uncertainty because of the limitations of model resolution and parameterizations. Here, we present the results of convection-permitting regional climate model simulations for Germany integrated with the COSMO-CLM using a horizontal grid spacing of 1.3 km, and additional 4.5- and 7-km simulations with convection parameterized. Of particular interest is how the temperature and precipitation fields and their extremes depend on the horizontal resolution for current and future climate conditions. The spatial variability of precipitation increases with resolution because of more realistic orography and physical parameterizations, but values are overestimated in summer and over mountain ridges in all simulations compared to observations. The spatial variability of temperature is improved at a resolution of 1.3 km, but the results are cold-biased, especially in summer. The increase in resolution from 7/4.5 km to 1.3 km is accompanied by 1 °C less future warming in summer. Modeled future precipitation extremes will be more severe, and temperature extremes will not exclusively increase with higher resolution. Although the differences between the resolutions considered (7/4.5 km and 1.3 km) are small, we find that the differences in the changes in extremes are large. High-resolution simulations require further studies, with effective parameterizations and tunings for different topographic regions. Impact models and assessment studies may benefit from such high-resolution model results, but should account for the impact of model resolution on model processes and climate change.

  1. Ecological recovery in ERA

    DEFF Research Database (Denmark)

    EFSA Scientific Committee (Scientific Committee); Topping, Christopher John

    2016-01-01

    In this scientific opinion, the Scientific Committee gathered scientific knowledge on the potential for the recovery of non-target organisms for the further development of ERA. Current EFSA guidance documents and opinions were reviewed on how ecological recovery is addressed in ERA schemes. In addition, this scientific opinion is based on expert knowledge and data retrieved from the literature. Finally, the information presented in this opinion was reviewed by experts from the relevant EFSA Panels, European risk assessment bodies and through an open consultation requesting input from stakeholders. A conceptual framework was developed to address... The Scientific Committee recognises the importance of more integrated ERAs considering both the local and landscape scales, as well as the possible co-occurrence of multiple potential stressors that fall under the remit of EFSA, which are important when addressing ecological recovery.

  2. Analysis of clinical characteristics and efficacy of chronic myeloid leukemia onset with extreme thrombocytosis in the era of tyrosine kinase inhibitors

    Directory of Open Access Journals (Sweden)

    Liu Z

    2017-07-01

    Zhihe Liu, Hongqiong Fan, Yuying Li, Chunshui Liu; Department of Hematology, Cancer Center, The First Hospital of Jilin University, Changchun, People’s Republic of China. Abstract: The aim of this study was to investigate the clinical characteristics and efficacy of chronic myeloid leukemia (CML) onset with extreme thrombocytosis. A total of 121 newly diagnosed and untreated CML patients in chronic phase with complete clinical information from the First Hospital of Jilin University, from January 2010 to December 2014, were retrospectively recruited. Based on the platelet (PLT) count, 22 patients were assigned to the CML with extreme thrombocytosis (CML-T) group (PLT >1,000×10⁹/L) and 65 patients were classified into the CML without extreme thrombocytosis (CML-N) group (PLT ≤1,000×10⁹/L). Of the patients in the CML-T group, 54.5% were female, a higher proportion than in the CML-N group (27.7%) (P=0.022). Except for gender, there was no significant difference in the clinical information of patients between the two groups. For the Sokal and Hasford scoring systems, the percentages of patients at high risk in the CML-T group were higher than those in the CML-N group, 95.5% vs 52.3% (P=0.000) and 68.2% vs 41.5% (P=0.031), respectively; however, there was no significant difference for the European Treatment and Outcome Study (EUTOS) scoring system between the two groups (P=0.213). In terms of major molecular response (MMR) rate, the percentage of patients with MMR in the CML-T group was lower than that in the CML-N group at 36 months after tyrosine kinase inhibitor therapy (P=0.037). Up until December 2016, the median event-free survival was 21 months in the CML-T group; however, that was not reached in the CML-N group (P=0.027). The majority of CML patients with extreme thrombocytosis were females, and compared to patients in the CML-N group, the percentage of high risk patients based on the Sokal and Hasford scoring systems was higher in the CML-T group, and the median

  3. Contribution of large-scale circulation anomalies to changes in extreme precipitation frequency in the United States

    International Nuclear Information System (INIS)

    Yu, Lejiang; Zhong, Shiyuan; Pei, Lisi; Bian, Xindi; Heilman, Warren E

    2016-01-01

    The mean global climate has warmed as a result of the increasing emission of greenhouse gases induced by human activities. This warming is considered the main reason for the increasing number of extreme precipitation events in the US. While much attention has been given to extreme precipitation events occurring over several days, which are usually responsible for severe flooding over a large region, little is known about how extreme precipitation events that cause flash flooding and occur at sub-daily time scales have changed over time. Here we use the observed hourly precipitation from the North American Land Data Assimilation System Phase 2 forcing datasets to determine trends in the frequency of extreme precipitation events of short (1 h, 3 h, 6 h, 12 h and 24 h) duration for the period 1979–2013. The results indicate an increasing trend in the central and eastern US. Over most of the western US, especially the Southwest and the Intermountain West, the trends are generally negative. These trends can be largely explained by the interdecadal variability of the Pacific Decadal Oscillation and Atlantic Multidecadal Oscillation (AMO), with the AMO making a greater contribution to the trends in both warm and cold seasons. (letter)
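
    The kind of trend analysis described, counting sub-daily exceedances of a long-term percentile per year and fitting a trend, can be sketched on synthetic data. This is illustrative only; the study uses NLDAS-2 hourly forcing and relates the trends to interdecadal modes rather than a plain linear fit:

    ```python
    import numpy as np

    def exceedance_trend(year_of_obs, hourly_precip, pct=99.0):
        """Count hours per year above the long-term percentile threshold,
        then fit a linear trend to the annual exceedance counts."""
        threshold = np.percentile(hourly_precip, pct)
        years = np.unique(year_of_obs)
        counts = np.array([(hourly_precip[year_of_obs == y] > threshold).sum()
                           for y in years])
        slope, _ = np.polyfit(years, counts, 1)      # events per year, per year
        return slope

    rng = np.random.default_rng(3)
    years = np.repeat(np.arange(1979, 2014), 1000)           # 1000 wet hours/year
    precip = rng.exponential(1.0 + 0.01 * (years - 1979))    # intensities drift up
    trend = exceedance_trend(years, precip)                  # positive by construction
    ```

    A positive slope here corresponds to the increasing extreme-event frequency reported for the central and eastern US; a negative one to the western US pattern.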

  4. Using Discrete Event Simulation for Programming Model Exploration at Extreme-Scale: Macroscale Components for the Structural Simulation Toolkit (SST).

    Energy Technology Data Exchange (ETDEWEB)

    Wilke, Jeremiah J [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Kenny, Joseph P. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States)

    2015-02-01

    Discrete event simulation provides a powerful mechanism for designing and testing new extreme-scale programming models for high-performance computing. Rather than debug, run, and wait for results on an actual system, a design can first be iterated through a simulator. This is particularly useful when test beds cannot be used, i.e., to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the Structural Simulation Toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our particular user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics like call graphs to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.
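
    A discrete event core is simple to sketch: a virtual clock plus a priority queue of (time, action) events. This toy simulator is illustrative of the general technique, not SST's API:

    ```python
    import heapq

    class Simulator:
        """Toy discrete event core: events fire in virtual-time order, and
        the clock jumps between events rather than ticking."""
        def __init__(self):
            self.now = 0.0
            self._queue = []
            self._seq = 0                     # tie-breaker for equal times
        def schedule(self, delay, action):
            heapq.heappush(self._queue, (self.now + delay, self._seq, action))
            self._seq += 1
        def run(self):
            while self._queue:
                self.now, _, action = heapq.heappop(self._queue)
                action()

    # Model a rank receiving a message after 5 us, then finishing compute at 15 us.
    sim = Simulator()
    log = []
    sim.schedule(5e-6, lambda: log.append(("msg_arrived", sim.now)))
    sim.schedule(15e-6, lambda: log.append(("compute_done", sim.now)))
    sim.run()
    ```

    Because virtual time advances by jumping between events, simulating long runs or huge machine scales costs only as much wall-clock time as there are events to process, which is what makes design-space exploration at scales that "do not yet exist" tractable.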

  5. Reliability, validity, and sensitivity to change of the lower extremity functional scale in individuals affected by stroke.

    Science.gov (United States)

    Verheijde, Joseph L; White, Fred; Tompkins, James; Dahl, Peder; Hentz, Joseph G; Lebec, Michael T; Cornwall, Mark

    2013-12-01

    To investigate reliability, validity, and sensitivity to change of the Lower Extremity Functional Scale (LEFS) in individuals affected by stroke. The secondary objective was to test the validity and sensitivity of a single-item linear analog scale (LAS) of function. Prospective cohort reliability and validation study. A single rehabilitation department in an academic medical center. Forty-three individuals receiving neurorehabilitation for lower extremity dysfunction after stroke were studied. Their ages ranged from 32 to 95 years, with a mean of 70 years; 77% were men. Test-retest reliability was assessed by calculating the classical intraclass correlation coefficient, and the Bland-Altman limits of agreement. Validity was assessed by calculating the Pearson correlation coefficient between the instruments. Sensitivity to change was assessed by comparing baseline scores with end of treatment scores. Measurements were taken at baseline, after 1-3 days, and at 4 and 8 weeks. The LEFS, Short-Form-36 Physical Function Scale, Berg Balance Scale, Six-Minute Walk Test, Five-Meter Walk Test, Timed Up-and-Go test, and the LAS of function were used. The test-retest reliability of the LEFS was found to be excellent (ICC = 0.96). When correlated with the six other measures of function studied, the validity of the LEFS was found to be moderate to high (r = 0.40-0.71). Regarding sensitivity to change, mean scores from baseline to study end increased by 1.2 SD for the LEFS and by 1.1 SD for the LAS. The LEFS exhibits good reliability, validity, and sensitivity to change in patients with lower extremity impairments secondary to stroke. Therefore, the LEFS can be a clinically efficient outcome measure in the rehabilitation of patients with subacute stroke. The LAS is shown to be a time-saving and reasonable option to track changes in a patient's functional status. Copyright © 2013 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.
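
    The reliability statistic reported above, an intraclass correlation for test-retest agreement, can be computed directly from the classic ANOVA mean squares. A sketch for ICC(2,1) with made-up scores (the study's own data are not reproduced here):

    ```python
    import numpy as np

    def icc_2_1(ratings):
        """Two-way random effects, absolute agreement, single-measures ICC(2,1),
        from the Shrout-Fleiss ANOVA decomposition. Rows are subjects,
        columns are measurement occasions (or raters)."""
        ratings = np.asarray(ratings, dtype=float)
        n, k = ratings.shape
        grand = ratings.mean()
        row_means = ratings.mean(axis=1)
        col_means = ratings.mean(axis=0)
        ss_rows = k * ((row_means - grand) ** 2).sum()    # between subjects
        ss_cols = n * ((col_means - grand) ** 2).sum()    # between occasions
        ss_err = ((ratings - grand) ** 2).sum() - ss_rows - ss_cols
        ms_rows = ss_rows / (n - 1)
        ms_cols = ss_cols / (k - 1)
        ms_err = ss_err / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (
            ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

    # Two administrations (columns) of a hypothetical scale for five subjects.
    scores = [[60, 61], [35, 37], [70, 69], [50, 52], [42, 41]]
    print(round(icc_2_1(scores), 2))  # → 0.99
    ```

    Values near 1, like the 0.96 reported for the LEFS, indicate that repeated administrations rank and score subjects almost identically.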

  6. Rain Characteristics and Large-Scale Environments of Precipitation Objects with Extreme Rain Volumes from TRMM Observations

    Science.gov (United States)

    Zhou, Yaping; Lau, William K M.; Liu, Chuntao

    2013-01-01

    This study adopts a "precipitation object" approach by using 14 years of Tropical Rainfall Measuring Mission (TRMM) Precipitation Feature (PF) and National Centers for Environmental Prediction (NCEP) reanalysis data to study rainfall structure and environmental factors associated with extreme heavy rain events. Characteristics of instantaneous extreme volumetric PFs are examined and compared to those of intermediate and small systems. It is found that instantaneous PFs exhibit a much wider scale range compared to the daily gridded precipitation accumulation range. The rainiest 1% of PFs contribute over 55% of total rainfall and have rain volumes 2 orders of magnitude greater than those of the median PFs. We find a threshold near the top 10% beyond which the PFs grow exponentially into larger, deeper, and colder rain systems. NCEP reanalyses show that midlevel relative humidity and total precipitable water increase steadily with increasingly large PFs, along with a rapid increase of 500 hPa upward vertical velocity beyond the top 10%. This provides the necessary moisture convergence to amplify and sustain the extreme events. The rapid increase in vertical motion is associated with the release of convective available potential energy (CAPE) in mature systems, as is evident in the increase in CAPE of PFs up to the top 10% and the subsequent dropoff. The study illustrates distinct stages in the development of an extreme rainfall event: (1) a systematic buildup in large-scale temperature and moisture, (2) a rapid change in rain structure, (3) explosive growth of the PF size, and (4) a release of CAPE before the demise of the event.
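    The heavy concentration of rainfall in the largest precipitation objects can be illustrated with a short sketch. The lognormal population below is an assumed stand-in for the observed, strongly skewed PF rain-volume distribution, chosen only to show how the top-percentile contribution and median ratio are computed.

```python
import numpy as np

# Synthetic precipitation-feature (PF) rain volumes: a heavy-tailed
# lognormal stands in for the real, strongly skewed PF population.
rng = np.random.default_rng(42)
volumes = rng.lognormal(mean=0.0, sigma=2.5, size=200_000)

def top_fraction_contribution(vols, top_pct):
    """Fraction of total rain volume contributed by the rainiest top_pct% of PFs."""
    cutoff = np.percentile(vols, 100 - top_pct)
    return vols[vols >= cutoff].sum() / vols.sum()

share_top1 = top_fraction_contribution(volumes, 1.0)
median_ratio = np.percentile(volumes, 99) / np.median(volumes)
print(f"top 1% of PFs contribute {share_top1:.0%} of total volume")
print(f"99th-percentile PF volume is {median_ratio:.0f}x the median PF")
```

    With a sufficiently heavy tail, the rainiest 1% dominate the total, mirroring the qualitative behavior reported above.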

  7. Extreme ultra-violet movie camera for imaging microsecond time scale magnetic reconnection

    International Nuclear Information System (INIS)

    Chai, Kil-Byoung; Bellan, Paul M.

    2013-01-01

    An ultra-fast extreme ultra-violet (EUV) movie camera has been developed for imaging magnetic reconnection in the Caltech spheromak/astrophysical jet experiment. The camera consists of a broadband Mo:Si multilayer mirror, a fast decaying YAG:Ce scintillator, a visible light block, and a high-speed visible light CCD camera. The camera can capture EUV images as fast as 3.3 × 10⁶ frames per second with 0.5 cm spatial resolution. The spectral range is from 20 eV to 60 eV. EUV images reveal strong, transient, highly localized bursts of EUV radiation when magnetic reconnection occurs.

  8. Extreme ultra-violet movie camera for imaging microsecond time scale magnetic reconnection

    Energy Technology Data Exchange (ETDEWEB)

    Chai, Kil-Byoung; Bellan, Paul M. [Applied Physics, Caltech, 1200 E. California Boulevard, Pasadena, California 91125 (United States)

    2013-12-15

    An ultra-fast extreme ultra-violet (EUV) movie camera has been developed for imaging magnetic reconnection in the Caltech spheromak/astrophysical jet experiment. The camera consists of a broadband Mo:Si multilayer mirror, a fast decaying YAG:Ce scintillator, a visible light block, and a high-speed visible light CCD camera. The camera can capture EUV images as fast as 3.3 × 10⁶ frames per second with 0.5 cm spatial resolution. The spectral range is from 20 eV to 60 eV. EUV images reveal strong, transient, highly localized bursts of EUV radiation when magnetic reconnection occurs.

  9. Changes in daily climate extremes in China and their connection to the large scale atmospheric circulation during 1961-2003

    Energy Technology Data Exchange (ETDEWEB)

    You, Qinglong [Institute of Tibetan Plateau Research, Chinese Academy of Sciences (CAS), Laboratory of Tibetan Environment Changes and Land Surface Processes, Beijing (China); Friedrich-Schiller University Jena, Department of Geoinformatics, Jena (Germany); Graduate University of Chinese Academy of Sciences, Beijing (China); Kang, Shichang [Institute of Tibetan Plateau Research, Chinese Academy of Sciences (CAS), Laboratory of Tibetan Environment Changes and Land Surface Processes, Beijing (China); State Key Laboratory of Cryospheric Science, Chinese Academy of Sciences, Lanzhou (China); Aguilar, Enric [Universitat Rovirai Virgili de Tarragona, Climate Change Research Group, Geography Unit, Tarragona (Spain); Pepin, Nick [University of Portsmouth, Department of Geography, Portsmouth (United Kingdom); Fluegel, Wolfgang-Albert [Friedrich-Schiller University Jena, Department of Geoinformatics, Jena (Germany); Yan, Yuping [National Climate Center, Beijing (China); Xu, Yanwei; Huang, Jie [Institute of Tibetan Plateau Research, Chinese Academy of Sciences (CAS), Laboratory of Tibetan Environment Changes and Land Surface Processes, Beijing (China); Graduate University of Chinese Academy of Sciences, Beijing (China); Zhang, Yongjun [Institute of Tibetan Plateau Research, Chinese Academy of Sciences (CAS), Laboratory of Tibetan Environment Changes and Land Surface Processes, Beijing (China)

    2011-06-15

    negative magnitudes. This is inconsistent with changes of water vapor flux calculated from NCEP/NCAR reanalysis. Large scale atmospheric circulation changes derived from NCEP/NCAR reanalysis grids show that a strengthening anticyclonic circulation, increasing geopotential height and rapid warming over the Eurasian continent have contributed to the changes in climate extremes in China. (orig.)

  10. Extreme-Scale Stochastic Particle Tracing for Uncertain Unsteady Flow Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Hanqi; He, Wenbin; Seo, Sangmin; Shen, Han-Wei; Peterka, Tom

    2016-11-13

    We present an efficient and scalable solution to estimate uncertain transport behaviors using stochastic flow maps (SFMs) for visualizing and analyzing uncertain unsteady flows. SFM computation is extremely expensive because it requires many Monte Carlo runs to trace densely seeded particles in the flow. We alleviate the computational cost by decoupling the time dependencies in SFMs so that we can process adjacent time steps independently and then compose them together for longer time periods. Adaptive refinement is also used to reduce the number of runs for each location. We then parallelize over tasks—packets of particles in our design—to achieve high efficiency in MPI/thread hybrid programming. Such a task model also enables CPU/GPU coprocessing. We show the scalability on two supercomputers, Mira (up to 1M Blue Gene/Q cores) and Titan (up to 128K Opteron cores and 8K GPUs), that can trace billions of particles in seconds.
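    The decoupling-and-composition idea can be sketched in one dimension: compute a stochastic flow map for each time interval independently, then compose adjacent maps by nearest-seed lookup to approximate a longer map without re-tracing. The velocity field, noise model, resolution, and lookup scheme below are illustrative assumptions, not the paper's parallel implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
seeds = np.linspace(0.0, 1.0, 51)          # seed locations on a 1-D domain
runs = 64                                  # Monte Carlo runs per seed

def velocity(x, t):
    # Deterministic part of a toy unsteady 1-D flow
    return 0.5 * np.sin(2 * np.pi * (x - 0.2 * t))

def interval_flow_map(t0, t1, dt=0.01, sigma=0.05):
    """Stochastic flow map for one interval: endpoint samples per seed."""
    x = np.repeat(seeds, runs).reshape(len(seeds), runs)
    t = t0
    while t < t1:
        # Euler step with assumed Gaussian velocity noise
        x = x + dt * (velocity(x, t) + sigma * rng.standard_normal(x.shape))
        t += dt
    return x                               # shape (n_seeds, runs)

# Process adjacent intervals independently...
fm1 = interval_flow_map(0.0, 0.5)
fm2 = interval_flow_map(0.5, 1.0)

# ...then compose: endpoints of interval 1 are pushed through interval 2
# by nearest-seed lookup, avoiding any re-tracing over [0.0, 1.0].
idx = np.abs(fm1[..., None] - seeds).argmin(axis=-1)   # nearest seed per endpoint
composed = fm2[idx, rng.integers(0, runs, idx.shape)]  # sample an endpoint

print(composed.shape)  # endpoint samples of the composed map, per original seed
```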

  11. Synchronization and Causality Across Time-scales: Complex Dynamics and Extremes in El Niño/Southern Oscillation

    Science.gov (United States)

    Jajcay, N.; Kravtsov, S.; Tsonis, A.; Palus, M.

    2017-12-01

    A better understanding of dynamics in complex systems, such as the Earth's climate, is one of the key challenges for contemporary science and society. A large amount of experimental data requires new mathematical and computational approaches. Natural complex systems vary on many temporal and spatial scales, often exhibiting recurring patterns and quasi-oscillatory phenomena. The statistical inference of causal interactions and synchronization between dynamical phenomena evolving on different temporal scales is of vital importance for a better understanding of underlying mechanisms and is key to modeling and prediction of such systems. This study introduces and applies information-theoretic diagnostics to phase and amplitude time series of different wavelet components of the observed data that characterize El Niño. A suite of significant interactions between processes operating on different time scales was detected, and intermittent synchronization among different time scales has been associated with the extreme El Niño events. The mechanisms of these nonlinear interactions were further studied in conceptual low-order and state-of-the-art dynamical, as well as statistical, climate models. Observed and simulated interactions exhibit substantial discrepancies, whose understanding may be the key to improved prediction. Moreover, the statistical framework we apply here is suitable for directly inferring cross-scale interactions in nonlinear time series from complex systems such as the terrestrial magnetosphere, solar-terrestrial interactions, seismic activity, or even human brain dynamics.
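    A minimal sketch of this kind of diagnostic: band-pass a series into a slow and a fast component, extract instantaneous phase and amplitude via the Hilbert transform, and estimate mutual information between the slow phase and the fast amplitude with a histogram estimator. The Butterworth bands, the synthetic ENSO-like signal, and the binning are illustrative choices; the study itself uses wavelet components and functionals whose exact form is not reproduced here.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

rng = np.random.default_rng(7)
fs, n = 12.0, 4096                          # "monthly" sampling, ~341 years
t = np.arange(n) / fs
slow = np.sin(2 * np.pi * t / 5.0)          # slow, ENSO-like ~5-year cycle
# Fast annual-band component whose amplitude is modulated by the slow cycle
fast = (1.0 + 0.8 * slow) * np.sin(2 * np.pi * t) + 0.3 * rng.standard_normal(n)
series = slow + fast

def band(x, lo, hi):
    """Zero-phase Butterworth band-pass (frequencies in cycles/year)."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

slow_phase = np.angle(hilbert(band(series, 0.15, 0.3)))   # ~3-7 yr band
fast_amp = np.abs(hilbert(band(series, 0.8, 1.2)))        # annual band

def mutual_info(x, y, bins=16):
    """Histogram estimate of mutual information (nats)."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])).sum())

mi = mutual_info(slow_phase, fast_amp)
mi_shuffled = mutual_info(rng.permutation(slow_phase), fast_amp)
print(f"MI = {mi:.3f} nats (shuffled baseline {mi_shuffled:.3f})")
```

    In practice the shuffled (or phase-randomized) surrogate supplies the significance threshold; here the built-in phase-amplitude coupling makes the true MI clearly exceed the baseline.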

  12. Open Problems in Network-aware Data Management in Exa-scale Computing and Terabit Networking Era

    Energy Technology Data Exchange (ETDEWEB)

    Balman, Mehmet; Byna, Surendra

    2011-12-06

    Accessing and managing large amounts of data is a great challenge in collaborative computing environments where resources and users are geographically distributed. Recent advances in network technology led to next-generation high-performance networks, allowing high-bandwidth connectivity. Efficient use of the network infrastructure is necessary in order to address the increasing data and compute requirements of large-scale applications. We discuss several open problems, evaluate emerging trends, and articulate our perspectives in network-aware data management.

  13. Extreme hydrometeorological events in the Peruvian Central Andes during austral summer and their relationship with the large-scale circulation

    Science.gov (United States)

    Sulca, Juan C.

    In this Master's dissertation, atmospheric circulation patterns associated with extreme hydrometeorological events in the Mantaro Basin (MB), Peruvian Central Andes, and their teleconnections during the austral summer (December-January-February-March) are addressed. Extreme rainfall events in the MB are related to variations of the large-scale circulation, as indicated by the changing strength of the Bolivian High-Nordeste Low (BH-NL) system. Dry (wet) spells are associated with a weakening (strengthening) of the BH-NL system and reduced (enhanced) influx of moist air from the lowlands to the east due to strengthened westerly (easterly) wind anomalies at mid- and upper-tropospheric levels. At the same time, extreme rainfall events of the opposite sign occur over northeastern Brazil (NEB) due to enhanced (inhibited) convective activity in conjunction with a strengthened (weakened) Nordeste Low. Cold episodes in the MB are grouped into three types: weak, strong, and extraordinary cold episodes. Weak and strong cold episodes in the MB are mainly associated with a weakening of the BH-NL system due to tropical-extratropical interactions. Both types of cold episodes are associated with westerly wind anomalies at mid- and upper-tropospheric levels above the Peruvian Central Andes, which inhibit the influx of humid air masses from the lowlands to the east and hence limit the potential for development of convective cloud cover. The resulting clear-sky conditions cause nighttime temperatures to drop, leading to cold extremes below the 10th percentile. Extraordinary cold episodes in the MB are associated with cold and dry polar air advection at all tropospheric levels toward the central Peruvian Andes. Therefore, weak and strong cold episodes in the MB appear to be caused by radiative cooling associated with reduced cloudiness, rather than cold air advection, while the latter plays an important role for extraordinary cold episodes only.

  14. A Large-Scale Multi-Hop Localization Algorithm Based on Regularized Extreme Learning for Wireless Networks.

    Science.gov (United States)

    Zheng, Wei; Yan, Xiaoyong; Zhao, Wei; Qian, Chengshan

    2017-12-20

    A novel large-scale multi-hop localization algorithm based on regularized extreme learning is proposed in this paper. The large-scale multi-hop localization problem is formulated as a learning problem. Unlike other similar localization algorithms, the proposed algorithm overcomes the shortcoming of traditional algorithms, which are applicable only to isotropic networks, and therefore adapts well to complex deployment environments. The proposed algorithm is composed of three stages: data acquisition, modeling, and location estimation. In the data acquisition stage, the training information between nodes of the given network is collected. In the modeling stage, the model relating hop counts to physical distances between nodes is constructed using regularized extreme learning. In the location estimation stage, each node finds its specific location in a distributed manner. Theoretical analysis and several experiments show that the proposed algorithm can adapt to different topological environments at low computational cost. Furthermore, high accuracy can be achieved by this method without setting complex parameters.
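    The modeling stage can be sketched as a regularized extreme learning machine: a random hidden layer applied to hop-count vectors, with output weights fit by ridge regression. The network geometry, hop model, and hyperparameters below are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy network: anchors at known positions; "hop counts" are simulated as
# quantized distances to each anchor (one hop per ~10 m of radio range).
anchors = rng.uniform(0, 100, size=(8, 2))
nodes = rng.uniform(0, 100, size=(300, 2))

def hop_counts(points):
    d = np.linalg.norm(points[:, None, :] - anchors[None, :, :], axis=-1)
    return np.ceil(d / 10.0)

class RegularizedELM:
    """Random hidden layer + ridge-regularized linear output weights."""
    def __init__(self, hidden=64, lam=1e-2):
        self.hidden, self.lam = hidden, lam
    def fit(self, X, Y):
        self.mu, self.sd = X.mean(0), X.std(0) + 1e-9   # standardize inputs
        self.W = rng.standard_normal((X.shape[1], self.hidden))
        self.b = rng.standard_normal(self.hidden)
        H = np.tanh(((X - self.mu) / self.sd) @ self.W + self.b)
        A = H.T @ H + self.lam * np.eye(self.hidden)
        self.beta = np.linalg.solve(A, H.T @ Y)         # closed-form ridge solve
        return self
    def predict(self, X):
        return np.tanh(((X - self.mu) / self.sd) @ self.W + self.b) @ self.beta

# Modeling stage: hop-count vectors -> known coordinates of training nodes
elm = RegularizedELM().fit(hop_counts(nodes[:200]), nodes[:200])
# Location-estimation stage: predict positions of the remaining nodes
pred = elm.predict(hop_counts(nodes[200:]))
err = np.linalg.norm(pred - nodes[200:], axis=1).mean()
print(f"mean localization error: {err:.1f} m")
```

    Because the hidden layer is random and only the linear output weights are solved for, training reduces to one regularized least-squares problem, which is what keeps the computational cost low.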

  15. dV/dt - Accelerating the Rate of Progress towards Extreme Scale Collaborative Science

    Energy Technology Data Exchange (ETDEWEB)

    Livny, Miron [Univ. of Wisconsin, Madison, WI (United States)

    2018-01-22

    This report introduces publications reporting the results of a project that aimed to design a computational framework that enables computational experimentation at scale while supporting the model of “submit locally, compute globally”. The project focused on estimating application resource needs, finding appropriate computing resources, acquiring those resources, deploying the applications and data on the resources, and managing applications and resources during a run.

  16. Final Report Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery

    Energy Technology Data Exchange (ETDEWEB)

    O' Leary, Patrick [Kitware, Inc., Clifton Park, NY (United States)

    2017-09-13

    The primary challenge motivating this project is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who can perform analysis only on a small fraction of the data they calculate, resulting in the substantial likelihood of lost or missed science, when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing on data while it is still resident in memory, which is known as in situ processing. The idea of in situ processing was not new at the time of the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community that aimed to foster production-quality software tools suitable for use by Department of Energy (DOE) science projects. Our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE high-performance computing (HPC) facilities, though we expected to have an impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve this objective, we engaged in software technology research and development (R&D), in close partnerships with DOE science code teams, to produce software technologies that were shown to run efficiently at scale on DOE HPC platforms.

  17. Detecting Silent Data Corruption for Extreme-Scale Applications through Data Mining

    Energy Technology Data Exchange (ETDEWEB)

    Bautista-Gomez, Leonardo [Argonne National Lab. (ANL), Argonne, IL (United States); Cappello, Franck [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-01-16

    Supercomputers allow scientists to study natural phenomena by means of computer simulations. Next-generation machines are expected to have more components and, at the same time, consume several times less energy per operation. These trends are pushing supercomputer construction to the limits of miniaturization and energy-saving strategies. Consequently, the number of soft errors is expected to increase dramatically in the coming years. While mechanisms are in place to correct or at least detect some soft errors, a significant percentage of those errors pass unnoticed by the hardware. Such silent errors are extremely damaging because they can make applications silently produce wrong results. In this work, we propose a technique that leverages certain properties of high-performance computing applications in order to detect silent errors at the application level. Our technique detects corruption solely based on the behavior of the application datasets and is completely application-agnostic. We propose multiple corruption detectors, and we couple them to work together in a fashion transparent to the user. We demonstrate that this strategy can detect the majority of the corruptions, while incurring negligible overhead. We show that with the help of these detectors, applications can have up to 80% coverage against data corruption.
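    The behavior-based idea can be sketched as a pointwise detector: predict each value of a smoothly evolving dataset from the previous time step and flag deviations outside an adaptive, data-derived bound. The last-value predictor and the mean + k·std bound are illustrative assumptions; the paper couples several such detectors and reports coverage empirically.

```python
import numpy as np

rng = np.random.default_rng(5)

# A smoothly evolving 1-D "simulation field" over several time steps
x = np.linspace(0, 2 * np.pi, 1000)
steps = [np.sin(x + 0.05 * k) + 0.001 * rng.standard_normal(x.size)
         for k in range(20)]

def detect_sdc(prev, curr, k=8.0):
    """Flag points whose step-to-step change is far outside the typical range.
    Returns indices of suspected silent data corruptions."""
    delta = np.abs(curr - prev)
    bound = delta.mean() + k * delta.std()   # adaptive, data-derived bound
    return np.flatnonzero(delta > bound)

# Inject a bit-flip-like corruption into the last step
corrupted = steps[-1].copy()
corrupted[123] += 10.0

print(detect_sdc(steps[-2], steps[-1]).size)    # clean step: nothing flagged
print(detect_sdc(steps[-2], corrupted))         # corrupted point flagged
```

    Because the bound is derived from the data's own step-to-step behavior, the detector needs no knowledge of the application, which is the sense in which such techniques are application-agnostic.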

  18. Design considerations of 10 kW-scale, extreme ultraviolet SASE FEL for lithography

    CERN Document Server

    Pagani, C; Schneidmiller, E A; Yurkov, M V

    2001-01-01

    The semiconductor industry growth is driven to a large extent by steady advancements in microlithography. According to the newly updated industry road map, the 70 nm generation is anticipated to be available in the year 2008. However, the path to get there is not clear. The problem of construction of extreme ultraviolet (EUV) quantum lasers for lithography is still unsolved: progress in this field is rather moderate and we cannot expect a significant breakthrough in the near future. Nevertheless, there is a clear path for optical lithography to take us to sub-100 nm dimensions. Theoretical and experimental work in Self-Amplified Spontaneous Emission (SASE) Free Electron Laser (FEL) physics and the physics of superconducting linear accelerators over the last 10 years has pointed to the possibility of the generation of high-power optical beams with laser-like characteristics in the EUV spectral range. Recently, there have been important advances in demonstrating a high-gain SASE FEL at 100 nm wavelength (J. Andr...

  19. Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery

    Energy Technology Data Exchange (ETDEWEB)

    Bethel, Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-07-24

    The primary challenge motivating this team’s work is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who are able to perform analysis only on a small fraction of the data they compute, resulting in the very real likelihood of lost or missed science, when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing on data while it is still resident in memory, an approach that is known as in situ processing. The idea of in situ processing was not new at the time of the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community that aimed to foster production-quality software tools suitable for use by DOE science projects. By and large, our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE HPC facilities, though we expected to have an impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve that objective, we assembled a unique team of researchers consisting of representatives from DOE national laboratories, academia, and industry, and engaged in software technology R&D as well as in close partnerships with DOE science code teams, to produce software technologies that were shown to run effectively at scale on DOE HPC platforms.

  20. Advancing the large-scale CCS database for metabolomics and lipidomics at the machine-learning era.

    Science.gov (United States)

    Zhou, Zhiwei; Tu, Jia; Zhu, Zheng-Jiang

    2018-02-01

    Metabolomics and lipidomics aim to comprehensively measure the dynamic changes of all metabolites and lipids that are present in biological systems. The use of ion mobility-mass spectrometry (IM-MS) for metabolomics and lipidomics has facilitated the separation and the identification of metabolites and lipids in complex biological samples. The collision cross-section (CCS) value derived from IM-MS is a valuable physicochemical property for the unambiguous identification of metabolites and lipids. However, CCS values obtained from experimental measurement and computational modeling are of limited availability, which significantly restricts the application of IM-MS. In this review, we will discuss the recently developed machine-learning-based prediction approach, which can efficiently generate precise CCS databases on a large scale. We will also highlight the applications of CCS databases to support metabolomics and lipidomics. Copyright © 2017 Elsevier Ltd. All rights reserved.
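    The prediction approach can be sketched as supervised regression from molecular descriptors to CCS values. Here a closed-form ridge model on synthetic descriptors (m/z, charge, a crude size proxy) and an assumed power-law ground truth stand in for the published machine-learning pipelines, whose features and models are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic training set: descriptors loosely mimicking metabolite ions
n = 500
mz = rng.uniform(100, 900, n)               # m/z
charge = rng.integers(1, 3, n).astype(float)
size = mz / charge + rng.normal(0, 20, n)   # crude ion-size proxy
X = np.column_stack([mz, charge, size])
# Assumed ground truth: CCS grows sublinearly with ion size (power law)
ccs = 30.0 * size.clip(min=1.0) ** 0.55 + rng.normal(0, 3, n)

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression on standardized features."""
    mu, sd = X.mean(0), X.std(0)
    Xs = np.column_stack([np.ones(len(X)), (X - mu) / sd])
    I = np.eye(Xs.shape[1])
    I[0, 0] = 0.0                            # don't penalize the intercept
    w = np.linalg.solve(Xs.T @ Xs + lam * I, Xs.T @ y)
    return lambda Xq: np.column_stack(
        [np.ones(len(Xq)), (Xq - mu) / sd]) @ w

predict = ridge_fit(X[:400], ccs[:400])      # train on 400 "measured" ions
resid = predict(X[400:]) - ccs[400:]         # evaluate on 100 held-out ions
print(f"held-out RMSE: {np.sqrt((resid ** 2).mean()):.1f} A^2")
```

    In the real pipelines the descriptor set is far richer and the model nonlinear, but the workflow is the same: fit on experimentally measured CCS values, then predict CCS for compounds that have never been measured.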

  1. HPC Colony II: FAST_OS II: Operating Systems and Runtime Systems at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Moreira, Jose [IBM, Armonk, NY (United States)

    2013-11-13

    HPC Colony II has been a 36-month project focused on providing portable performance for leadership class machines—a task made difficult by the emerging variety of more complex computer architectures. The project attempts to move the burden of portable performance to adaptive system software, thereby allowing domain scientists to concentrate on their field rather than the fine details of a new leadership class machine. To accomplish our goals, we focused on adding intelligence into the system software stack. Our revised components include: new techniques to address OS jitter; new techniques to dynamically address load imbalances; new techniques to map resources according to architectural subtleties and application dynamic behavior; new techniques to dramatically improve the performance of checkpoint-restart; and new techniques to address membership service issues at scale.

  2. Research Guidelines in the Era of Large-scale Collaborations: An Analysis of Genome-wide Association Study Consortia

    Science.gov (United States)

    Austin, Melissa A.; Hair, Marilyn S.; Fullerton, Stephanie M.

    2012-01-01

    Scientific research has shifted from studies conducted by single investigators to the creation of large consortia. Genetic epidemiologists, for example, now collaborate extensively for genome-wide association studies (GWAS). The effect has been a stream of confirmed disease-gene associations. However, effects on human subjects oversight, data-sharing, publication and authorship practices, research organization and productivity, and intellectual property remain to be examined. The aim of this analysis was to identify all research consortia that had published the results of a GWAS analysis since 2005, characterize them, determine which have publicly accessible guidelines for research practices, and summarize the policies in these guidelines. A review of the National Human Genome Research Institute’s Catalog of Published Genome-Wide Association Studies identified 55 GWAS consortia as of April 1, 2011. These consortia were composed of individual investigators, research centers, studies, or other consortia and studied 48 different diseases or traits. Only 14 (25%) were found to have publicly accessible research guidelines on consortia websites. The available guidelines provide information on organization, governance, and research protocols; half address institutional review board approval. Details of publication, authorship, data-sharing, and intellectual property vary considerably. Wider access to consortia guidelines is needed to establish appropriate research standards with broad applicability to emerging forms of large-scale collaboration. PMID:22491085

  3. Racism in digital era: Development and initial validation of the Perceived Online Racism Scale (PORS v1.0).

    Science.gov (United States)

    Keum, Brian TaeHyuk; Miller, Matthew J

    2017-04-01

    The purpose of this study was to develop the Perceived Online Racism Scale (PORS) to assess perceived online racist interpersonal interactions and exposure to online racist content among people of color. Items were developed through a multistage process involving a comprehensive literature review, focus-groups, qualitative data collection, and survey of online racism experiences. Based on a sample of 1,023 racial minority participants, exploratory and confirmatory factor analyses provided support for a 30-item bifactor model accounted by the general factor and the following 3 specific factors: (a) personal experience of racial cyber-aggression, (b) vicarious exposure to racial cyber-aggression, and (c) online-mediated exposure to racist reality. The PORS demonstrated measurement invariance across racial/ethnic groups in our sample. Internal reliability estimates for the total and subscale scores of the PORS were above .88 and the 4-week test-retest reliability was adequate. Limitations and future directions for research are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  4. Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.

    2014-12-01

    The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA. In fact, international research projects account for 12% of the INCITE awards in 2014. The INCITE scientific review panel also includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009) and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current generation Petascale capable simulation codes towards the performance levels required for running on future Exascale systems.
One of the techniques pursued by ECMWF is to use Fortran2008 coarrays to overlap computations and communications and

  5. Coupled large-eddy simulation and morphodynamics of a large-scale river under extreme flood conditions

    Science.gov (United States)

    Khosronejad, Ali; Sotiropoulos, Fotis; Stony Brook University Team

    2016-11-01

    We present coupled flow and morphodynamic simulations of extreme flooding in a 3 km long and 300 m wide reach of the Mississippi River in Minnesota, which includes three islands and hydraulic structures. We employ the large-eddy simulation (LES) and bed-morphodynamic modules of the VFS-Geophysics model to investigate the flow and bed evolution of the river during a 500-year flood. The coupling of the two modules is carried out via a fluid-structure interaction approach, with a nested domain to enhance the resolution of bridge scour predictions. The geometrical data of the river, islands and structures are obtained from LiDAR, sub-aqueous sonar and in-situ surveying to construct a digital map of the river bathymetry. Our simulation results for the bed evolution of the river reveal complex sediment dynamics near the hydraulic structures. The numerically captured scour depth near some of the structures reaches a maximum of about 10 m. The data-driven simulation strategy we present in this work exemplifies a practical simulation-based engineering approach to investigate the resilience of infrastructure to extreme flood events in intricate field-scale riverine systems. This work was funded by a grant from the Minnesota Dept. of Transportation.

  6. Final Technical Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    Energy Technology Data Exchange (ETDEWEB)

    Knio, Omar M. [Duke Univ., Durham, NC (United States). Dept. of Mechanical Engineering and Materials Science

    2017-06-06

    QUEST is a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, University of Southern California, Massachusetts Institute of Technology, University of Texas at Austin, and Duke University. The mission of QUEST is to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The Duke effort focused on the development of algorithms and utility software for non-intrusive sparse UQ representations, and on participation in the organization of annual workshops and tutorials to disseminate UQ tools to the community, and to gather input in order to adapt approaches to the needs of SciDAC customers. In particular, fundamental developments were made in (a) multiscale stochastic preconditioners, (b) gradient-based approaches to inverse problems, (c) adaptive pseudo-spectral approximations, (d) stochastic limit cycles, and (e) sensitivity analysis tools for noisy systems. In addition, large-scale demonstrations were performed, namely in the context of ocean general circulation models.

  7. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY15 Q4.

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sewell, Christopher [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Childs, Hank [Univ. of Oregon, Eugene, OR (United States); Ma, Kwan-Liu [Univ. of California, Davis, CA (United States); Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States); Meredith, Jeremy [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-01

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  8. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY17.

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pugmire, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Rogers, David [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Childs, Hank [Univ. of Oregon, Eugene, OR (United States); Ma, Kwan-Liu [Univ. of California, Davis, CA (United States); Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States)

    2017-10-01

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  9. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem. Mid-year report FY16 Q2

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth D.; Sewell, Christopher (LANL); Childs, Hank (U of Oregon); Ma, Kwan-Liu (UC Davis); Geveci, Berk (Kitware); Meredith, Jeremy (ORNL)

    2016-05-01

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  10. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Mid-year report FY17 Q2

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pugmire, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Rogers, David [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Childs, Hank [Univ. of Oregon, Eugene, OR (United States); Ma, Kwan-Liu [Univ. of California, Davis, CA (United States); Geveci, Berk [Kitware Inc., Clifton Park, NY (United States)

    2017-05-01

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  11. Two spatial scales in a bleaching event: Corals from the mildest and the most extreme thermal environments escape mortality

    KAUST Repository

    Pineda, Jesús

    2013-07-28

    In summer 2010, a bleaching event decimated the abundant reef flat coral Stylophora pistillata in some areas of the central Red Sea, where a series of coral reefs 100–300 m wide by several kilometers long extends from the coastline to about 20 km offshore. Mortality of corals along the exposed and protected sides of inner (inshore) and mid and outer (offshore) reefs and in situ and satellite sea surface temperatures (SSTs) revealed that the variability in the mortality event corresponded to two spatial scales of temperature variability: 300 m across the reef flat and 20 km across a series of reefs. However, the relationship between coral mortality and habitat thermal severity was opposite at the two scales. SSTs in summer 2010 were similar or increased modestly (0.5°C) in the outer and mid reefs relative to 2009. In the inner reef, 2010 temperatures were 1.4°C above the 2009 seasonal maximum for several weeks. We detected little or no coral mortality in mid and outer reefs. In the inner reef, mortality depended on exposure. Within the inner reef, mortality was modest on the protected (shoreward) side, the most severe thermal environment, with highest overall mean and maximum temperatures. In contrast, acute mortality was observed in the exposed (seaward) side, where temperature fluctuations and upper water temperature values were relatively less extreme. Refuges to thermally induced coral bleaching may include sites where extreme, high-frequency thermal variability may select for coral holobionts preadapted to, and physiologically condition corals to withstand, regional increases in water temperature.

  12. Extreme Postnatal Scaling in Bat Feeding Performance: A View of Ecomorphology from Ontogenetic and Macroevolutionary Perspectives.

    Science.gov (United States)

    Santana, Sharlene E; Miller, Kimberly E

    2016-09-01

    Ecomorphology studies focus on understanding how anatomical and behavioral diversity result in differences in performance, ecology, and fitness. In mammals, the determinate growth of the skeleton entails that bite performance should change throughout ontogeny until the feeding apparatus attains its adult size and morphology. Then, interspecific differences in adult phenotypes are expected to drive food resource partitioning and patterns of lineage diversification. However, formal tests of these predictions are lacking for the majority of mammal groups, and thus our understanding of mammalian ecomorphology remains incomplete. By focusing on a fundamental measure of feeding performance, bite force, and capitalizing on the extraordinary morphological and dietary diversity of bats, we discuss how the intersection of ontogenetic and macroevolutionary changes in feeding performance may impact ecological diversity in these mammals. We integrate data on cranial morphology and bite force gathered through longitudinal studies of captive animals and comparative studies of free-ranging individuals. We demonstrate that ontogenetic trajectories and evolutionary changes in bite force are highly dependent on changes in body and head size, and that bats exhibit dramatic, allometric increases in bite force during ontogeny. Interspecific variation in bite force is highly dependent on differences in cranial morphology and function, highlighting selection for ecological specialization. While more research is needed to determine how ontogenetic changes in size and bite force specifically impact food resource use and fitness in bats, interspecific diversity in cranial morphology and bite performance seem to closely match functional differences in diet. Altogether, these results suggest direct ecomorphological relationships at ontogenetic and macroevolutionary scales in bats. © The Author 2016. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology.

  13. Conservative treatment of soft tissue sarcomas of the extremities. Functional evaluation with LENT-SOMA scales and the Enneking score

    International Nuclear Information System (INIS)

    Tawfiq, N.; Lagarde, P.; Thomas, L.; Kantor, G.; Stockle, E.; Bui, B.N.

    2000-01-01

    Objective. - The aim of this prospective study was to assess the feasibility of late-effects assessment by LENT-SOMA scales after conservative treatment of soft tissue sarcomas of the extremities, and to compare it with functional evaluation by the Enneking score. Patients and methods. - During systematic follow-up consultations, a series of 32 consecutive patients was evaluated for late effects by LENT-SOMA scales and for functional results by the Enneking score. The median time after treatment was 65 months. The treatment consisted of conservative surgery (all cases) followed by radiation therapy (29 cases), often combined with adjuvant therapy (concomitant radio-chemotherapy in 12 of 14 cases). The assessment of toxicity was retrospective for acute effects and prospective for the following late tissue damage: skin/subcutaneous tissues, muscles/soft tissues and peripheral nerves. Results. - According to the Enneking score, the global score for the overall series was high (24/30) despite four scores of zero for psychological acceptance. According to LENT-SOMA scales, a low rate of severe sequelae (grade 3-4) was observed. The occurrence of high-grade sequelae and their functional consequences were not correlated with quality of exeresis, dose of radiotherapy or use of concomitant chemotherapy. A complementarity was observed between certain factors of the Enneking score and some criteria of the LENT-SOMA scales, especially those for muscles/soft tissues. Conclusion. - The good quality of functional results was confirmed by the two main scoring systems for late normal tissue damage. The routine use of LENT-SOMA seems to be more time consuming than the Enneking score (mean scoring time: 13 versus five minutes). The LENT-SOMA scales are aimed at a detailed description of late toxicity and sequelae while the Enneking score provides a more global evaluation, including the psychological acceptance of treatment. The late effects assessment by the LENT

  14. Toward Improving Predictability of Extreme Hydrometeorological Events: the Use of Multi-scale Climate Modeling in the Northern High Plains

    Science.gov (United States)

    Munoz-Arriola, F.; Torres-Alavez, J.; Mohamad Abadi, A.; Walko, R. L.

    2014-12-01

    Our goal is to investigate possible sources of predictability of hydrometeorological extreme events in the Northern High Plains (NHP). Hydrometeorological extreme events are considered the most costly natural phenomena. Water deficits and surpluses highlight how water-climate interdependence becomes crucial in areas where single activities, such as agriculture in the NHP, drive economies. Although we recognize the water-climate interdependence and the regulatory role that human activities play, we still grapple to identify what sources of predictability could be added to flood and drought forecasts. To identify the benefit of multi-scale climate modeling and the role of initial conditions in flood and drought predictability in the NHP, we use the Ocean Land Atmospheric Model (OLAM). OLAM is characterized by a dynamic core with a global geodesic grid with hexagonal (and variably refined) mesh cells, a finite volume discretization of the full compressible Navier-Stokes equations, and a cut-grid cell method for topography (which reduces errors in gradient computation and anomalous vertical dispersion). Our hypothesis is that wet conditions will drive OLAM's simulations of precipitation toward wetter conditions, affecting both flood and drought forecasts. To test this hypothesis we simulate precipitation during identified historical flood events followed by drought events in the NHP (i.e., the years 2011-2012). We initialized OLAM with CFS data 1-10 days prior to a flooding event (as initial conditions) to explore (1) short-term, high-resolution and (2) long-term, coarse-resolution simulations of flood and drought events, respectively. While floods are assessed during a maximum of 15-day refined-mesh simulations, drought is evaluated during the following 15 months. Simulated precipitation will be compared with the Sub-continental Observation Dataset, a gridded 1/16th degree resolution dataset obtained from climatological stations in Canada, US, and

  15. Extreme Scale Computing Studies

    Science.gov (United States)

    2010-12-01

    …systems that would fall under the Exascale rubric. In this chapter, we first discuss the attributes by which achievement of the label “Exascale” may be…

  16. Scientific Grand Challenges: Challenges in Climate Change Science and the Role of Computing at the Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.; Johnson, Gary M.; Washington, Warren M.

    2009-07-02

    The U.S. Department of Energy (DOE) Office of Biological and Environmental Research (BER) in partnership with the Office of Advanced Scientific Computing Research (ASCR) held a workshop on the challenges in climate change science and the role of computing at the extreme scale, November 6-7, 2008, in Bethesda, Maryland. At the workshop, participants identified the scientific challenges facing the field of climate science and outlined the research directions of highest priority that should be pursued to meet these challenges. Representatives from the national and international climate change research community as well as representatives from the high-performance computing community attended the workshop. This group represented a broad mix of expertise. Of the 99 participants, 6 were from international institutions. Before the workshop, each of the four panels prepared a white paper, which provided the starting place for the workshop discussions. These four panels of workshop attendees devoted their efforts to the following themes: Model Development and Integrated Assessment; Algorithms and Computational Environment; Decadal Predictability and Prediction; Data, Visualization, and Computing Productivity. The recommendations of the panels are summarized in the body of this report.

  17. Dominant Large-Scale Atmospheric Circulation Systems for the Extreme Precipitation over the Western Sichuan Basin in Summer 2013

    Directory of Open Access Journals (Sweden)

    Yamin Hu

    2015-01-01

    The western Sichuan Basin (WSB) is a rainstorm center influenced by complicated factors such as topography and circulation. Based on a multivariable empirical orthogonal function analysis of extreme precipitation processes (EPPs) in the WSB in 2013, this study reveals the dominant circulation patterns. Results indicate that the leading modes are characterized by “Saddle” and “Sandwich” structures, respectively. In one mode, a tropical cyclone (TC) from the South China Sea (SCS) converts into the inverted trough and steers warm moist airflow northward into the WSB. At the same time, the western Pacific subtropical high (WPSH) extends westward over the Yangtze River and conveys a southeasterly warm humid flow. In the other case, the WPSH is pushed westward by a TC in the western Pacific and then merges with an anomalous anticyclone over the SCS. The anomalous anticyclone and the WPSH form a conjunction belt and convey warm moist southwesterly airflow to meet the cold flow over the WSB. The configurations of the WPSH and TCs in the tropics and the blocking and trough in the mid-high latitudes play important roles during the EPPs over the WSB. The persistence of EPPs depends on the long-lived large-scale circulation configuration remaining steady over suitable positions.
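The empirical orthogonal function (EOF) technique underlying this analysis is, computationally, a principal-component decomposition of the space-time anomaly field. A minimal single-variable sketch via SVD, on synthetic data with one planted pattern (the function and variable names are illustrative, not from the paper):

```python
import numpy as np

def eof_analysis(field, n_modes=2):
    """field: (n_times, n_points) data matrix. Returns the leading spatial
    patterns (EOFs), their principal-component time series, and
    explained-variance fractions, via SVD of the anomaly matrix."""
    anom = field - field.mean(axis=0)            # remove the time mean
    u, s, vt = np.linalg.svd(anom, full_matrices=False)
    var_frac = s ** 2 / np.sum(s ** 2)
    return vt[:n_modes], u[:, :n_modes] * s[:n_modes], var_frac[:n_modes]

# Synthetic field dominated by one known pattern plus weak noise
rng = np.random.default_rng(0)
pc_true = rng.normal(size=200)                   # "time series" of the mode
pattern = rng.normal(size=50)                    # its spatial pattern
field = np.outer(pc_true, pattern) + 0.1 * rng.normal(size=(200, 50))
eofs, pcs, var_frac = eof_analysis(field)
# The first mode should recover the planted pattern and explain most variance
```

A multivariable EOF, as used in the study, stacks several standardized fields into one such matrix before the decomposition.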

  18. Measuring activity limitations in walking : Development of a hierarchical scale for patients with lower-extremity disorders who live at home

    NARCIS (Netherlands)

    Roorda, LD; Roebroeck, ME; van Tilburg, T; Molenaar, IW; Lankhorst, GJ; Bouter, LM

    2005-01-01

    Objective: To develop a hierarchical scale that measures activity limitations in walking in patients with lower-extremity disorders who live at home. Design: Cross-sectional study. Setting: Orthopedic workshops and outpatient clinics of secondary and tertiary care centers. Participants: Patients

  19. Recent hydrological variability and extreme precipitation events in Moroccan Middle-Atlas mountains: micro-scale analyses of lacustrine sediments

    Science.gov (United States)

    Jouve, Guillaume; Vidal, Laurence; Adallal, Rachid; Bard, Edouard; Benkaddour, Abdel; Chapron, Emmanuel; Courp, Thierry; Dezileau, Laurent; Hébert, Bertil; Rhoujjati, Ali; Simonneau, Anaelle; Sonzogni, Corinne; Sylvestre, Florence; Tachikawa, Kazuyo; Viry, Elisabeth

    2016-04-01

    Since the 1990s, the Mediterranean basin has undergone an increase in extreme precipitation events and droughts that is likely to intensify in the 21st century, and whose origin is attributable to human activities since 1850 (IPCC, 2013). Regional climate models indicate a strengthening of flood episodes at the end of the 21st century in Morocco (Tramblay et al., 2012). To understand recent hydrological and paleohydrological variability in North Africa, our study focuses on macro- and micro-scale analysis of sedimentary sequences from Lake Azigza (Moroccan Middle Atlas Mountains) covering the last few centuries. This lake is relevant because local site monitoring revealed that lake water table levels were correlated with the precipitation regime (Adallal R., PhD thesis in progress). The aim of our study is to distinguish sedimentary facies characteristic of low and high lake levels, in order to reconstruct past dry and wet periods during the last two hundred years. Here, we present results from sedimentological (lithology, grain size, microstructures under thin sections), geochemical (XRF) and physical (radiography) analyses of short sedimentary cores (64 cm long) taken from the deep basin of Lake Azigza (30 meters water depth). Cores have been dated (210Pb and 137Cs radionuclides, and 14C dating). Two main facies were distinguished: one organic-rich facies composed of wood fragments and several reworked layers and characterized by Mn peaks; and a second facies composed of terrigenous clastic sediments, without wood fragments or reworked layers, and characterized by Fe, Ti, Si and K peaks. The first facies is interpreted as a high lake-level stand. Indeed, the highest paleoshoreline is close to the vegetation, and steeper banks can increase the current velocity, allowing the transport of wood fragments during extreme precipitation events. Mn peaks are interpreted as Mn oxide precipitation under well-oxygenated deep waters after runoff events. The second facies is linked to periods of

  20. Contribution of large-scale circulation anomalies to changes in extreme precipitation frequency in the United States

    Science.gov (United States)

    Lejiang Yu; Shiyuan Zhong; Lisi Pei; Xindi (Randy) Bian; Warren E. Heilman

    2016-01-01

    The mean global climate has warmed as a result of the increasing emission of greenhouse gases induced by human activities. This warming is considered the main reason for the increasing number of extreme precipitation events in the US. While much attention has been given to extreme precipitation events occurring over several days, which are usually responsible for...

  1. Reliability and validity of the Persian lower extremity functional scale (LEFS) in a heterogeneous sample of outpatients with lower limb musculoskeletal disorders.

    Science.gov (United States)

    Negahban, Hossein; Hessam, Masumeh; Tabatabaei, Saeid; Salehi, Reza; Sohani, Soheil Mansour; Mehravar, Mohammad

    2014-01-01

    The aim was to culturally translate and validate the Persian lower extremity functional scale (LEFS) in a heterogeneous sample of outpatients with lower extremity musculoskeletal disorders (n = 304). This is a prospective methodological study. After a standard forward-backward translation, psychometric properties were assessed in terms of test-retest reliability, internal consistency, construct validity, dimensionality, and ceiling or floor effects. Acceptable levels of the intraclass correlation coefficient (>0.70) and Cronbach's alpha coefficient (>0.70) were obtained for the Persian LEFS. Correlations between the Persian LEFS and Short-Form 36 Health Survey (SF-36) subscales of the Physical Health component (rs range = 0.38-0.78) were higher than correlations between the Persian LEFS and SF-36 subscales of the Mental Health component (rs range = 0.15-0.39). A corrected item-total correlation of >0.40 (Spearman's rho) was obtained for all items of the Persian LEFS. Horn's parallel analysis detected a total of two factors. No ceiling or floor effects were detected for the Persian LEFS. The Persian version of the LEFS is a reliable and valid instrument that can be used to measure functional status in Persian-speaking patients with different musculoskeletal disorders of the lower extremity. Implications for Rehabilitation: The Persian lower extremity functional scale (LEFS) is a reliable, internally consistent and valid instrument, with no ceiling or floor effects, to determine the functional status of heterogeneous patients with musculoskeletal disorders of the lower extremity. The Persian version of the LEFS can be used in clinical and research settings to measure function in Iranian patients with different musculoskeletal disorders of the lower extremity.
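The internal-consistency statistics reported here (Cronbach's alpha > 0.70, corrected item-total correlations > 0.40) are standard and straightforward to compute. A minimal sketch on a synthetic item-score matrix, not the LEFS data; note the study used Spearman's rho for item-total correlations, while this sketch uses Pearson for brevity:

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) score matrix.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def corrected_item_total(items):
    """Correlation of each item with the sum of the remaining items
    (the 'corrected' total, which excludes the item itself)."""
    items = np.asarray(items, dtype=float)
    rest_sums = items.sum(axis=1, keepdims=True) - items
    return np.array([np.corrcoef(items[:, j], rest_sums[:, j])[0, 1]
                     for j in range(items.shape[1])])

# Synthetic 5-item scale driven by a single latent trait
rng = np.random.default_rng(42)
latent = rng.normal(size=(500, 1))
scores = latent + rng.normal(size=(500, 5))   # each item = trait + noise
```

For this construction the theoretical alpha is about 0.83 and each item-total correlation about 0.63, both clearing the thresholds cited above.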

  2. Potential changes in the extreme climate conditions at the regional scale: from observed data to modelling approaches and towards probabilistic climate change information

    International Nuclear Information System (INIS)

    Gachon, P.; Radojevic, M.; Harding, A.; Saad, C.; Nguyen, V.T.V.

    2008-01-01

    Changes in the characteristics of extreme climate conditions are among the most critical challenges for ecosystems, human beings and infrastructure in the context of ongoing global climate change. However, the information on extremes needed for impact studies cannot be obtained directly from coarse-scale global climate models (GCMs), mainly because of their difficulty in incorporating the regional-scale feedbacks and processes responsible in part for the occurrence, intensity and duration of extreme events. Downscaling approaches, namely statistical and dynamical downscaling techniques (i.e., SD and RCM), have emerged as useful tools to develop high-resolution climate change information, in particular for extremes, as these are theoretically more capable of taking into account regional/local forcings and their feedbacks from large-scale influences, being driven by GCM synoptic variables. Nevertheless, in spite of the potential added value of downscaling methods (statistical and dynamical), a rigorous assessment of these methods is needed, as inherent difficulties in simulating extremes are still present. In this paper, different series of RCM and SD simulations using three different GCMs are presented and evaluated with respect to observed values over the current period for a river basin in southern Quebec, with future ensemble runs centered on the 2050s (i.e., the 2041-2070 period, using the SRES A2 emission scenario). Results suggest that downscaling performance over the baseline period varies significantly between the two downscaling techniques and over the various seasons, with more regularly reliable simulated values from the SD technique for temperature than from the RCM runs, while both approaches produced quite similar median temperature changes in the future, with more divergence for extremes. For precipitation, less accurate information is obtained compared to observed data, with more differences among models and higher uncertainties in the

  3. Dynamical downscaling of ERA-40 in complex terrain using the WRF regional climate model

    Energy Technology Data Exchange (ETDEWEB)

    Heikkilae, U. [Bjerknes Centre for Climate Research, Uni Bjerknes Centre, Bergen (Norway); Sandvik, A. [Bjerknes Centre for Climate Research, Institute for Marine Research (IMR), Bergen (Norway); Sorteberg, A. [University of Bergen, Geophysical Institute, Bergen (Norway)

    2011-10-15

    Results from a first-time employment of the WRF regional climate model to climatological simulations in Europe are presented. The ERA-40 reanalysis (resolution 1°) has been downscaled to a horizontal resolution of 30 and 10 km for the period of 1961-1990. This model setup includes the whole North Atlantic in the 30 km domain and spectral nudging is used to keep the large scales consistent with the driving ERA-40 reanalysis. The model results are compared against an extensive observational network of surface variables in complex terrain in Norway. The comparison shows that the WRF model is able to add significant detail to the representation of precipitation and 2-m temperature of the ERA-40 reanalysis. Especially the geographical distribution, wet day frequency and extreme values of precipitation are highly improved due to the better representation of the orography. Refining the resolution from 30 to 10 km further increases the skill of the model, especially in case of precipitation. Our results indicate that the use of 10-km resolution is advantageous for producing regional future climate projections. Use of a large domain and spectral nudging seems to be useful in reproducing the extreme precipitation events due to the better resolved synoptic scale features over the North Atlantic, and also helps to reduce the large regional temperature biases over Norway. This study presents a high-resolution, high-quality climatological data set useful for reference climate impact studies. (orig.)

  4. Extreme air pollution events in Hokkaido, Japan, traced back to early snowmelt and large-scale wildfires over East Eurasia: Case studies.

    Science.gov (United States)

    Yasunari, Teppei J; Kim, Kyu-Myong; da Silva, Arlindo M; Hayasaki, Masamitsu; Akiyama, Masayuki; Murao, Naoto

    2018-04-25

    To identify unusual climate conditions and their connections to wildfire-driven air pollution in remote areas, we examine three anomalous large-scale wildfires in May 2003, April 2008, and July 2014 over East Eurasia, as well as how products of those wildfires reached an urban city, Sapporo, in the northern part of Japan (Hokkaido), significantly affecting the air quality. NASA's MERRA-2 (Modern-Era Retrospective analysis for Research and Applications, Version 2) aerosol reanalysis data closely reproduced the PM2.5 variations in Sapporo for the case of smoke arrival in July 2014. Results show that all three cases featured unusually early snowmelt in East Eurasia, accompanied by warmer and drier surface conditions in the months leading up to the fires, inducing long-lasting soil dryness and producing climate and environmental conditions conducive to active wildfires. Due to prevailing anomalous synoptic-scale atmospheric motions, smoke from those fires eventually reached a remote area, Hokkaido, and worsened the air quality in Sapporo. In future studies, continuous monitoring of the timing of Eurasian snowmelt and of air quality from the source regions to remote regions, coupled with the analysis of atmospheric and surface conditions, may be essential for more accurately predicting the effects of wildfires on air quality.

  5. Extreme robustness of scaling in sample space reducing processes explains Zipf’s law in diffusion on directed networks

    International Nuclear Information System (INIS)

    Corominas-Murtra, Bernat; Hanel, Rudolf; Thurner, Stefan

    2016-01-01

    It has been shown recently that a specific class of path-dependent stochastic processes, which reduce their sample space as they unfold, lead to exact scaling laws in frequency and rank distributions. Such sample space reducing processes offer an alternative new mechanism to understand the emergence of scaling in countless processes. The corresponding power law exponents were shown to be related to noise levels in the process. Here we show that the emergence of scaling is not limited to the simplest SSRPs, but holds for a huge domain of stochastic processes that are characterised by non-uniform prior distributions. We demonstrate mathematically that in the absence of noise the scaling exponents converge to −1 (Zipf’s law) for almost all prior distributions. As a consequence it becomes possible to fully understand targeted diffusion on weighted directed networks and its associated scaling laws in node visit distributions. The presence of cycles can be properly interpreted as playing the same role as noise in SSRPs and, accordingly, determine the scaling exponents. The result that Zipf’s law emerges as a generic feature of diffusion on networks, regardless of its details, and that the exponent of visiting times is related to the amount of cycles in a network could be relevant for a series of applications in traffic-, transport- and supply chain management. (paper)
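The sample space reducing process (SSRP) described above is simple to simulate: a trajectory starts at the highest of N states and repeatedly jumps, uniformly at random, to a strictly lower state until it reaches the bottom. A short sketch (parameter values are arbitrary) showing that state-visit frequencies follow Zipf's law, i.e. state i is visited with frequency proportional to 1/i:

```python
import random
from collections import Counter

def ssrp_visits(n_states=1000, n_runs=20000, seed=0):
    """Count state visits over many runs of a noiseless sample space
    reducing process: each jump goes uniformly to a strictly lower state."""
    rng = random.Random(seed)
    visits = Counter()
    for _ in range(n_runs):
        state = n_states
        visits[state] += 1
        while state > 1:
            state = rng.randint(1, state - 1)  # the sample space shrinks
            visits[state] += 1
    return visits

visits = ssrp_visits()
# Zipf scaling: a single run visits state i with probability 1/i, so
# visits[i] / n_runs should be close to 1/i (and exactly 1 for state 1).
```

The noiseless case shown here corresponds to the exponent -1 limit discussed in the abstract; adding a probability of jumping uniformly over the full state space plays the role of noise and flattens the exponent.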

  6. ERA's Ranger uranium mine

    Energy Technology Data Exchange (ETDEWEB)

    Davies, W. [Energy Resources of Australia Ltd., Sydney, NSW (Australia)

    1997-12-31

    Energy Resources of Australia (ERA) is a public company with 68% of its shares owned by the Australian company North Limited. It currently operates one major production centre, Ranger Mine, 260 kilometres east of Darwin in the Northern Territory, extracting uranium and selling it to nuclear electricity utilities in Japan, South Korea, Europe and North America. The first drum of uranium oxide from Ranger was filled in August 1981 and operations have continued since that time. ERA is also working towards obtaining approvals for the development of a second mine, Jabiluka, located 20 kilometres north of Ranger. The leases of Ranger and Jabiluka adjoin. The Minister for the Environment has advised the Minister for Resources and Energy that there does not appear to be any environmental issue which would prevent the preferred Jabiluka proposal from proceeding. Consent for the development of ERA's preferred option for the development of Jabiluka is being sought from the Aboriginal Traditional Owners. Ranger is currently the third largest producing uranium mine in the world, producing 4,237 tonnes of U₃O₈ in the year to June 1997.

  7. Changes in intensity of precipitation extremes in Romania at very high temporal scales and implications for the validity of the Clausius-Clapeyron relation

    Science.gov (United States)

    Busuioc, Aristita; Baciu, Madalina; Breza, Traian; Dumitrescu, Alexandru; Stoica, Cerasela; Baghina, Nina

    2016-04-01

    Many observational, theoretical, and climate-model-based studies have suggested that warmer climates lead to more intense precipitation events, even when total annual precipitation is slightly reduced. It has thus been suggested that extreme precipitation events may increase at the Clausius-Clapeyron (CC) rate under global warming and the constraint of constant relative humidity. However, recent studies show that the relationship between extreme rainfall intensity and atmospheric temperature is much more complex than the CC relationship would suggest, depending mainly on the temporal resolution of the precipitation data, the region, the storm type, and whether the analysis is conducted on storm events rather than fixed intervals. The present study examines the dependence of very-high-temporal-scale extreme rainfall intensity on daily temperature, with respect to verification of the CC relation. To this end, the analysis is conducted on rainfall events rather than fixed intervals, using rainfall data based on graphic records that include intensities (mm/min) calculated over each interval of constant per-minute intensity. The annual interval with such data available (April to October) is considered at five stations over the period 1950-2007. For the Bucuresti-Filaret station the analysis is extended over a longer interval (1898-2007). For each rainfall event, the maximum intensity (mm/min) is retained, and these time series (abbreviated below as IMAX) are used in the further analysis. The IMAX data were divided into 2°C-wide bins based on the daily mean temperature. Bins with fewer than 100 values were excluded. The 90th, 99th and 99.9th percentiles were computed from the binned data using the empirical distribution, and their variability has been compared to the CC scaling (an exponential relation corresponding to a 7% increase per degree of temperature rise). The results show a dependence close to double the CC relation for
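The binning-and-percentile procedure described (2°C temperature bins, bins with fewer than 100 values dropped, high percentiles per bin compared against the ~7%/°C CC rate) can be sketched as follows. The data here are synthetic, constructed to scale at exactly the CC rate; only the procedure mirrors the study:

```python
import numpy as np

def cc_scaling_rate(temps, intensities, bin_width=2.0, q=99, min_count=100):
    """Bin event intensities by temperature, take the q-th percentile per
    bin, and fit an exponential rate (% per deg C) by regressing the log
    percentile on the bin-centre temperature."""
    temps = np.asarray(temps)
    intensities = np.asarray(intensities)
    edges = np.arange(temps.min(), temps.max() + bin_width, bin_width)
    centres, pctls = [], []
    for lo in edges[:-1]:
        mask = (temps >= lo) & (temps < lo + bin_width)
        if mask.sum() >= min_count:        # drop sparsely populated bins
            centres.append(lo + bin_width / 2)
            pctls.append(np.percentile(intensities[mask], q))
    slope, _ = np.polyfit(centres, np.log(pctls), 1)
    return 100.0 * (np.exp(slope) - 1.0)   # percent increase per deg C

# Synthetic check: intensities built to grow ~7% per deg C (ln(1.07) = 0.0677)
rng = np.random.default_rng(1)
T = rng.uniform(5.0, 25.0, 50_000)
I = np.exp(0.0677 * T) * rng.weibull(0.8, 50_000)
# cc_scaling_rate(T, I) should recover roughly the 7%/deg C CC rate
```

The "double CC" behaviour reported in the abstract would show up here as a fitted rate near 14%/°C rather than 7%/°C.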

  8. Scale interactions in economics: application to the evaluation of the economic damages of climatic change and of extreme events

    International Nuclear Information System (INIS)

    Hallegatte, S.

    2005-06-01

    Growth models, which neglect economic disequilibria, considered as temporary, are in general used to evaluate the damaging effects generated by climatic change. This work shows, through a series of modeling experiences, the importance of disequilibria and of endogenous variability of economy in the evaluation of damages due to extreme events and climatic change. It demonstrates the impossibility to separate the evaluation of damages from the representation of growth and of economic dynamics: the comfort losses will depend on both the nature and intensity of impacts and on the dynamics and situation of the economy to which they will apply. Thus, the uncertainties about the damaging effects of future climatic changes come from both scientific uncertainties and from uncertainties about the future organization of our economies. (J.S.)

  9. Extreme value statistics for annual minimum and trough-under-threshold precipitation at different spatio-temporal scales

    NARCIS (Netherlands)

    Booij, Martijn J.; de Wit, Marcel J.M.

    2010-01-01

    The aim of this paper is to quantify meteorological droughts and assign return periods to these droughts. Moreover, the relation between meteorological and hydrological droughts is explored. This has been done for the River Meuse basin in Western Europe at different spatial and temporal scales to

  10. Scientific Grand Challenges: Discovery In Basic Energy Sciences: The Role of Computing at the Extreme Scale - August 13-15, 2009, Washington, D.C.

    Energy Technology Data Exchange (ETDEWEB)

    Galli, Giulia [Univ. of California, Davis, CA (United States). Workshop Chair; Dunning, Thom [Univ. of Illinois, Urbana, IL (United States). Workshop Chair

    2009-08-13

    The U.S. Department of Energy’s (DOE) Office of Basic Energy Sciences (BES) and Office of Advanced Scientific Computing Research (ASCR) workshop in August 2009 on extreme-scale computing provided a forum for more than 130 researchers to explore the needs and opportunities that will arise due to expected dramatic advances in computing power over the next decade. This scientific community firmly believes that the development of advanced theoretical tools within chemistry, physics, and materials science—combined with the development of efficient computational techniques and algorithms—has the potential to revolutionize the discovery process for materials and molecules with desirable properties. Doing so is necessary to meet the energy and environmental challenges of the 21st century as described in various DOE BES Basic Research Needs reports. Furthermore, computational modeling and simulation are a crucial complement to experimental studies, particularly when quantum mechanical processes controlling energy production, transformations, and storage are not directly observable and/or controllable. Many processes related to the Earth’s climate and subsurface need better modeling capabilities at the molecular level, which will be enabled by extreme-scale computing.

  11. Field limit and nano-scale surface topography of superconducting radio-frequency cavity made of extreme type II superconductor

    OpenAIRE

    Kubo, Takayuki

    2014-01-01

    The field limit of a superconducting radio-frequency cavity made of a type II superconductor with a large Ginzburg-Landau parameter is studied, taking the effects of nano-scale surface topography into account. If the surface is ideally flat, the field limit is imposed by the superheating field. On the surface of a cavity, however, nano-defects distribute almost continuously and suppress the superheating field everywhere. The field limit is imposed by an effective superheating field given by the pro...

  12. Removal of volatile organic compounds at extreme shock-loading using a scaled-up pilot rotating drum biofilter.

    Science.gov (United States)

    Sawvel, Russell A; Kim, Byung; Alvarez, Pedro J J

    2008-11-01

    A pilot-scale rotating drum biofilter (RDB), which is a novel biofilter design that offers flexible flow-through configurations, was used to treat complex and variable volatile organic compound (VOC) emissions, including shock loadings, emanating from paint drying operations at an Army ammunition plant. The RDB was seeded with municipal wastewater activated sludge. Removal efficiencies up to 86% and an elimination capacity of 5.3 g chemical oxygen demand (COD) m(-3) hr(-1) were achieved at a filter-medium contact time of 60 sec. Efficiency increased at higher temperatures that promote higher biological activity, and decreased at lower pH, which dropped down to pH 5.5 possibly as a result of carbon dioxide and volatile fatty acid production and ammonia consumption during VOC degradation. In comparison, other studies have shown that a bench-scale RDB could achieve a removal efficiency of 95% and elimination capacity of 331 g COD m(-3) hr(-1). Sustainable performance of the pilot-scale RDB was challenged by the intermittent nature of painting operations, which typically resulted in 3-day long shutdown periods when bacteria were not fed. This challenge was overcome by adding sucrose (2 g/L weekly) as an auxiliary substrate to sustain metabolic activity during shutdown periods.
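The performance measures quoted above follow the standard biofilter definitions: removal efficiency is the fraction of the inlet load removed, and elimination capacity is the mass removed per unit bed volume per unit time. The sketch below states these generic formulas; the function names and the example numbers are illustrative assumptions, not data from the study.

```python
def removal_efficiency(c_in, c_out):
    """Percent of inlet load removed: 100 * (C_in - C_out) / C_in."""
    return 100.0 * (c_in - c_out) / c_in

def elimination_capacity(q, c_in, c_out, volume):
    """Mass removed per unit bed volume per unit time, e.g. in
    g COD m^-3 hr^-1 for flow q in m^3/hr, concentrations in g COD/m^3,
    and filter-bed volume in m^3."""
    return q * (c_in - c_out) / volume

# Illustrative numbers only: an 86% removal efficiency corresponds to an
# outlet concentration of 14 when the inlet concentration is 100.
re = removal_efficiency(100.0, 14.0)                # 86.0 %
ec = elimination_capacity(2.0, 100.0, 14.0, 10.0)   # 17.2 g COD m^-3 hr^-1
```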

  13. Spatial and temporal accuracy of asynchrony-tolerant finite difference schemes for partial differential equations at extreme scales

    Science.gov (United States)

    Kumari, Komal; Donzis, Diego

    2017-11-01

    Highly resolved computational simulations on massively parallel machines are critical to understanding the physics of a vast number of complex phenomena in nature governed by partial differential equations. Simulations at extreme levels of parallelism present many challenges, with communication between processing elements (PEs) being a major bottleneck. In order to fully exploit the computational power of exascale machines, one needs to devise numerical schemes that relax global synchronizations across PEs. These asynchronous computations, however, have a degrading effect on the accuracy of standard numerical schemes. We have developed asynchrony-tolerant (AT) schemes that maintain their order of accuracy despite relaxed communications. We show, analytically and numerically, that these schemes retain their numerical properties with multi-step higher-order temporal Runge-Kutta schemes. We also show that, for a range of optimized parameters, the computation time and error of AT schemes are less than those of their synchronous counterparts. The stability of the AT schemes, which depends on the history and random nature of delays, is also discussed. Support from NSF is gratefully acknowledged.
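A minimal numerical illustration of why relaxed synchronization degrades standard schemes (a sketch of the general issue, not the authors' AT schemes): evaluating a central second difference with one stale neighbour value introduces a delay error of order dt/dx², which swamps the scheme's nominal O(dx²) truncation error.

```python
import numpy as np

# Test function u(x, t) = sin(x - t); its exact second x-derivative is -sin(x - t).
u = lambda x, t: np.sin(x - t)
dx, dt = 1e-2, 1e-4
x, t = 1.0, 0.0
exact = -np.sin(x - t)

# Synchronous central difference: all neighbours at the current time level.
sync = (u(x - dx, t) - 2 * u(x, t) + u(x + dx, t)) / dx**2

# "Asynchronous" evaluation: the right neighbour is one time step stale,
# as would happen if its owning process had not yet communicated.
lagged = (u(x - dx, t) - 2 * u(x, t) + u(x + dx, t - dt)) / dx**2

err_sync = abs(sync - exact)      # ~ dx**2 / 12 * |u''''|, i.e. O(dx^2)
err_lagged = abs(lagged - exact)  # dominated by the O(dt/dx**2) delay term
```

AT schemes, as described in the abstract, add terms (extra grid points or time levels) chosen so that this delay contribution cancels and the design order of accuracy is restored.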

  14. Dakwah di Era Digital

    OpenAIRE

    Budiantoro, Wahyu

    2018-01-01

    These days dakwah is interpreted not only as the transformation of a purely religious value, but also as the transformation of a more relevant value encompassing many aspects of the digital era. The digital era is one in which society is submerged in the flow of information, causing culture shock and difficulty in synthesizing meaning from scattered information. Dakwah in the digital age must accommodate societal needs, which tend to move toward a mass society. This results in a strategy and a more humane and innovative dakwah me...

  15. Impacts of an extreme cyclone event on landscape-scale savanna fire, productivity and greenhouse gas emissions

    International Nuclear Information System (INIS)

    Hutley, L B; Maier, S W; Evans, B J; Beringer, J; Cook, G D; Razon, E

    2013-01-01

    North Australian tropical savanna accounts for 12% of the world's total savanna land cover. Accordingly, understanding the processes that govern carbon, water and energy exchange within this biome is critical to global carbon and water budgeting. Climate and disturbances drive ecosystem carbon dynamics. Savanna ecosystems of coastal and sub-coastal north Australia experience a unique combination of climatic extremes and are in a state of near-constant disturbance from fire events (1 in 3 years), storms resulting in windthrow (1 in 5-10 years) and mega-cyclones (1 in 500-1000 years). Critically, these disturbances occur over large areas, creating a spatial and temporal mosaic of carbon sources and sinks. We quantify the impact on gross primary productivity (GPP) and fire occurrence of a tropical mega-cyclone, tropical Cyclone Monica (TC Monica), which affected 10,400 km² of savanna across north Australia, resulting in mortality and severe structural damage to ∼140 million trees. We estimate a net carbon equivalent emission of 43 Tg of CO₂-e using the moderate resolution imaging spectroradiometer (MODIS) GPP product (MOD17A2) to quantify spatial and temporal patterns pre- and post-TC Monica. GPP was suppressed for four years after the event, equivalent to a loss of GPP of 0.5 Tg C over this period. On-ground fuel loads were estimated to potentially release 51.2 Mt CO₂-e, equivalent to ∼10% of Australia's accountable greenhouse gas emissions. We present a simple carbon balance to examine the relative importance of frequency versus impact for a number of key disturbance processes, such as fire, termite consumption and intense but infrequent mega-cyclones. Our estimates suggest that fire and termite consumption had a larger impact on Net Biome Productivity than infrequent mega-cyclones. We demonstrate the importance of understanding how climate variability and disturbance impact savanna dynamics in the context of the increasing interest in

  16. Full Scale Test SSP 34m blade, edgewise loading LTT. Extreme load and PoC_InvE Data report

    DEFF Research Database (Denmark)

    Nielsen, Magda; Roczek-Sieradzan, Agnieszka; Jensen, Find Mølholt

    This report is the second report covering the research and demonstration project “Eksperimentel vingeforskning: Strukturelle mekanismer i nutidens og fremtidens store vinger under kombineret last”, supported by the EUDP program. A 34m wind turbine blade from SSP-Technology A/S has been tested...... in edgewise direction (LTT). The blade has been submitted to thorough examination by means of strain gauges, displacement transducers and a 3D optical measuring system. This data report presents results obtained during full scale testing of the blade up to 80% Risø load, where 80% Risø load corresponds to 100...... stresses in the adhesive joints. Test results from measurements with the reinforcement have been compared to results without the coupling. The report presents only the relevant results for the 80% Risø load and the results applicable for the investigation of the influence of the invention on the profile...

  17. Large-Scale Skin Resurfacing of the Upper Extremity in Pediatric Patients Using a Pre-Expanded Intercostal Artery Perforator Flap.

    Science.gov (United States)

    Wei, Jiao; Herrler, Tanja; Gu, Bin; Yang, Mei; Li, Qingfeng; Dai, Chuanchang; Xie, Feng

    2018-05-01

    The repair of extensive upper limb skin lesions in pediatric patients is extremely challenging due to substantial limitations of flap size and donor-site morbidity. We aimed to create an oversize preexpanded flap based on intercostal artery perforators for large-scale resurfacing of the upper extremity in children. Between March 2013 and August 2016, 11 patients underwent reconstructive treatment for extensive skin lesions in the upper extremity using a preexpanded intercostal artery perforator flap. Preoperatively, 2 to 4 candidate perforators were selected as potential pedicle vessels based on duplex ultrasound examination. After tissue expander implantation in the thoracodorsal area, regular saline injections were performed until the expanded flap was sufficient in size. Then, a pedicled flap was formed to resurface the skin lesion of the upper limb. The pedicles were transected 3 weeks after flap transfer. Flap survival, complications, and long-term outcome were evaluated. The average time of tissue expansion was 133 days with a mean final volume of 1713 mL. The thoracoabdominal flaps were based on 2 to 6 pedicles and used to resurface a mean skin defect area of 238 cm², ranging from 180 to 357 cm². In all cases, primary donor-site closure was achieved. Marginal necrosis was seen in 5 cases. The reconstructed limbs showed satisfactory outcome in both aesthetic and functional aspects. The preexpanded intercostal artery perforator flap enables 1-block repair of extensive upper limb skin lesions. Due to limited donor-site morbidity and a pedicled technique, this resurfacing approach represents a useful tool especially in pediatric patients.

  18. Broad Scale Monitoring in the US Forest Service: Institutional Challenges and Collaborative Opportunities for Improving Planning and Decision-Making in an Era of Climate Change

    Science.gov (United States)

    Wurtzebach, Z.

    2016-12-01

    In 2012, the United States Forest Service promulgated new rules to guide Forest planning efforts in accordance with the National Forest Management Act (NFMA). One important component of the 2012 rule is a requirement for Regionally coordinated, cross-boundary "broad scale" monitoring strategies that are designed to inform and facilitate Forest-level adaptive management and planning. This presentation will examine institutional challenges and opportunities for developing effective broad scale monitoring strategies, identified in 90 interviews with USFS staff and partner organizations and in collaborative workshops held in Colorado, Wyoming, Arizona, and New Mexico. Internal barriers to development include funding and human resource constraints, organizational culture, problematic incentives and accountability structures, data management issues, and administrative barriers to collaboration. However, we also identify several opportunities for leveraging interagency collaboration, facilitating multi-level coordination, generating efficiencies in data collection and analysis, and improving strategies for reporting and communication to Forest-level decision-makers and relevant stakeholders.

  19. Validación de una escala de afrontamiento frente a riesgos extremos Validation of a scale measuring coping with extreme risks

    Directory of Open Access Journals (Sweden)

    Esperanza López-Vázquez

    2004-06-01

    Full Text Available OBJECTIVE: The objective of this study was to validate, in Mexico, the French coping scale "Échelle Toulousaine de Coping". MATERIAL AND METHODS: In the fall of 2001, the scale questionnaire was applied to 209 subjects living in different areas of Mexico who were exposed to five different types of extreme natural or industrial risks. The discriminatory capacity of the items, as well as the factorial structure and internal consistency of the scale, were analyzed using Mann-Whitney's U test, principal components factorial analysis, and Cronbach's alpha. RESULTS: The final scale was composed of 26 items forming two groups: active coping and passive coping. Internal consistency of the instrument was high, both in the total sample and in the subsamples of natural and industrial risks. CONCLUSIONS: The coping scale is reliable and valid for the Mexican population.

  20. Field limit and nano-scale surface topography of superconducting radio-frequency cavity made of extreme type II superconductor

    Science.gov (United States)

    Kubo, Takayuki

    2015-06-01

    The field limit of a superconducting radio-frequency cavity made of a type II superconductor with a large Ginzburg-Landau parameter is studied, taking the effects of nano-scale surface topography into account. If the surface is ideally flat, the field limit is imposed by the superheating field. On the surface of cavity, however, nano-defects almost continuously distribute and suppress the superheating field everywhere. The field limit is imposed by an effective superheating field given by the product of the superheating field for an ideal flat surface and a suppression factor that contains the effects of nano-defects. A nano-defect is modeled by a triangular groove with a depth smaller than the penetration depth. An analytical formula for the suppression factor of bulk and multilayer superconductors is derived in the framework of the London theory. As an immediate application, the suppression factor of the dirty Nb processed by electropolishing is evaluated by using results of surface topographic study. The estimated field limit is consistent with the present record field of nitrogen-doped Nb cavities. Suppression factors of surfaces of other bulk and multilayer superconductors, and those after various surface processing technologies, can also be evaluated by using the formula.

  1. Global Learning in a New Era

    Science.gov (United States)

    Ramaley, Judith

    2016-01-01

    Our nation's colleges and universities frequently adapt their approach to education in response to the reality of social, economic and environmental challenges. Today the reality is that we are increasingly interconnected on a global scale. This new era of globalization impacts every facet of society, and it offers both an exciting blend of…

  2. Recent Reanalysis Activities at ECMWF: Results from ERA-20C and Plans for ERA5

    Science.gov (United States)

    Dragani, R.; Hersbach, H.; Poli, P.; Pebeuy, C.; Hirahara, S.; Simmons, A.; Dee, D.

    2015-12-01

    This presentation will provide an overview of the most recent reanalysis activities performed at the European Centre for Medium-Range Weather Forecasts (ECMWF). A pilot reanalysis of the 20th-century (ERA-20C) has recently been completed. Funded through the European FP7 collaborative project ERA-CLIM, ERA-20C is part of a suite of experiments that also includes a model-only integration (ERA-20CM) and a land-surface reanalysis (ERA-20CL). Its data assimilation system is constrained by only surface observations obtained from ISPD (3.2.6) and ICOADS (2.5.1). Surface boundary conditions are provided by the Hadley Centre (HadISST2.1.0.0) and radiative forcing follows CMIP5 recommended data sets. First-guess uncertainty estimates are based on a 10-member ensemble of Data Assimilations, ERA-20C ensemble, run prior to ERA-20C using ten SST and sea-ice realizations from the Hadley Centre. In November 2014, the European Commission entrusted ECMWF to run on its behalf the Copernicus Climate Change Service (C3S) aiming at producing quality-assured information about the past, current and future states of the climate at both European and global scales. Reanalysis will be one of the main components of the C3S portfolio and the first one to be produced is a global modern era reanalysis (ERA5) covering the period from 1979 onwards. Based on a recent version of the ECMWF data assimilation system, ERA5 will replace the widely used ERA-Interim dataset. This new production will benefit from a much improved model, and better characterized and exploited observations compared to its predecessor. The first part of the presentation will focus on the ERA-20C production, provide an overview of its main characteristics and discuss some of the key results from its assessment. The second part of the talk will give an overview of ERA5, and briefly discuss some of its challenges.

  3. SWAP OBSERVATIONS OF THE LONG-TERM, LARGE-SCALE EVOLUTION OF THE EXTREME-ULTRAVIOLET SOLAR CORONA

    Energy Technology Data Exchange (ETDEWEB)

    Seaton, Daniel B.; De Groof, Anik; Berghmans, David; Nicula, Bogdan [Royal Observatory of Belgium-SIDC, Avenue Circulaire 3, B-1180 Brussels (Belgium); Shearer, Paul [Department of Mathematics, 2074 East Hall, University of Michigan, 530 Church Street, Ann Arbor, MI 48109-1043 (United States)

    2013-11-01

    The Sun Watcher with Active Pixels and Image Processing (SWAP) EUV solar telescope on board the Project for On-Board Autonomy 2 spacecraft has been regularly observing the solar corona in a bandpass near 17.4 nm since 2010 February. With a field of view of 54 × 54 arcmin, SWAP provides the widest-field images of the EUV corona available from the perspective of the Earth. By carefully processing and combining multiple SWAP images, it is possible to produce low-noise composites that reveal the structure of the EUV corona to relatively large heights. A particularly important step in this processing was to remove instrumental stray light from the images by determining and deconvolving SWAP's point-spread function from the observations. In this paper, we use the resulting images to conduct the first-ever study of the evolution of the large-scale structure of the corona observed in the EUV over a three year period that includes the complete rise phase of solar cycle 24. Of particular note is the persistence over many solar rotations of bright, diffuse features composed of open magnetic fields that overlie polar crown filaments and extend to large heights above the solar surface. These features appear to be related to coronal fans, which have previously been observed in white-light coronagraph images and, at low heights, in the EUV. We also discuss the evolution of the corona at different heights above the solar surface and the evolution of the corona over the course of the solar cycle by hemisphere.

  4. A highly scalable particle tracking algorithm using partitioned global address space (PGAS) programming for extreme-scale turbulence simulations

    Science.gov (United States)

    Buaria, D.; Yeung, P. K.

    2017-12-01

    A new parallel algorithm utilizing a partitioned global address space (PGAS) programming model to achieve high scalability is reported for particle tracking in direct numerical simulations of turbulent fluid flow. The work is motivated by the desire to obtain Lagrangian information necessary for the study of turbulent dispersion at the largest problem sizes feasible on current and next-generation multi-petaflop supercomputers. A large population of fluid particles is distributed among parallel processes dynamically, based on instantaneous particle positions such that all of the interpolation information needed for each particle is available either locally on its host process or neighboring processes holding adjacent sub-domains of the velocity field. With cubic splines as the preferred interpolation method, the new algorithm is designed to minimize the need for communication, by transferring between adjacent processes only those spline coefficients determined to be necessary for specific particles. This transfer is implemented very efficiently as a one-sided communication, using Co-Array Fortran (CAF) features which facilitate small data movements between different local partitions of a large global array. The cost of monitoring transfer of particle properties between adjacent processes for particles migrating across sub-domain boundaries is found to be small. Detailed benchmarks are obtained on the Cray petascale supercomputer Blue Waters at the University of Illinois, Urbana-Champaign. For operations on the particles in a 8192³ simulation (0.55 trillion grid points) on 262,144 Cray XE6 cores, the new algorithm is found to be orders of magnitude faster relative to a prior algorithm in which each particle is tracked by the same parallel process at all times. This large speedup reduces the additional cost of tracking of order 300 million particles to just over 50% of the cost of computing the Eulerian velocity field at this scale. 
Improving support of PGAS models on
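The host-process assignment idea described above can be sketched as a serial analogue of the decomposition logic. This is a hedged illustration only: the one-dimensional slab decomposition, the rank count, and all names are assumptions for demonstration, not the paper's Fortran/CAF implementation.

```python
import numpy as np

NRANKS = 8  # number of parallel processes, each owning one slab of [0, 1)^3

def host_rank(pos, nranks=NRANKS):
    """Rank owning the slab (domain split along x) containing this particle,
    so that the interpolation stencil data is local or on a neighbour."""
    return int((pos[0] % 1.0) * nranks)

rng = np.random.default_rng(1)
particles = rng.random((1000, 3))           # particle positions in [0, 1)^3
owners = np.array([host_rank(p) for p in particles])

# After each advection step, particles whose x-coordinate crosses a slab
# boundary migrate to the neighbouring rank; in the CAF implementation this
# is a one-sided transfer of only the spline coefficients those particles need.
counts = np.bincount(owners, minlength=NRANKS)
```

The design choice the abstract emphasizes is that ownership follows the particle's position, not its identity, so interpolation never requires a global gather.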

  5. Adaptation to a landscape-scale mountain pine beetle epidemic in the era of networked governance: the enduring importance of bureaucratic institutions

    Directory of Open Access Journals (Sweden)

    Jesse B. Abrams

    2017-12-01

    Full Text Available Landscape-scale forest disturbance events have become increasingly common worldwide under the combined influences of climate change and ecosystem modification. The mountain pine beetle (Dendroctonus ponderosae) epidemic that swept through North American forests from the late 1990s through the early 2010s was one of the largest such disturbance events on record and triggered shocks to ecological and economic systems. We analyze the policy and governance responses to this event by focusing on three national forests in the state of Colorado and on the agency responsible for their management, the U.S. Forest Service. We found that the event triggered the formation of new hybrid agency/nonagency organizations that contributed both legitimacy and capacity to address the most immediate threats to human safety and infrastructure. Despite the use of a highly networked governance structure, longstanding U.S. Forest Service institutions continued to heavily influence the scope of the response and the means for implementing management activities. We detected relatively limited institutional response at the level of the agency as a whole, even as regional- and local-scale institutions within Colorado showed greater dynamism. Indeed, the changes to agency institutions that were detected were largely consistent with institutional change trajectories already in place prior to the epidemic. Our study points to the importance of institutional persistence and path dependence in limiting the latitude for adaptation to social and environmental shocks.

  6. Mandelbrot's Extremism

    NARCIS (Netherlands)

    Beirlant, J.; Schoutens, W.; Segers, J.J.J.

    2004-01-01

    In the sixties Mandelbrot already showed that extreme price swings are more likely than some of us think or incorporate in our models.A modern toolbox for analyzing such rare events can be found in the field of extreme value theory.At the core of extreme value theory lies the modelling of maxima

  7. Translation and cross-cultural adaptation of the lower extremity functional scale into a Brazilian Portuguese version and validation on patients with knee injuries.

    Science.gov (United States)

    Metsavaht, Leonardo; Leporace, Gustavo; Riberto, Marcelo; Sposito, Maria Matilde M; Del Castillo, Letícia N C; Oliveira, Liszt P; Batista, Luiz Alberto

    2012-11-01

    Clinical measurement. To translate and culturally adapt the Lower Extremity Functional Scale (LEFS) into a Brazilian Portuguese version, and to test the construct and content validity and reliability of this version in patients with knee injuries. There is no Brazilian Portuguese version of an instrument to assess the function of the lower extremity after orthopaedic injury. The translation of the original English version of the LEFS into a Brazilian Portuguese version was accomplished using standard guidelines and tested in 31 patients with knee injuries. Subsequently, 87 patients with a variety of knee disorders completed the Brazilian Portuguese LEFS, the Medical Outcomes Study 36-Item Short-Form Health Survey, the Western Ontario and McMaster Universities Osteoarthritis Index, and the International Knee Documentation Committee Subjective Knee Evaluation Form and a visual analog scale for pain. All patients were retested within 2 days to determine reliability of these measures. Validation was assessed by determining the level of association between the Brazilian Portuguese LEFS and the other outcome measures. Reliability was documented by calculating internal consistency, test-retest reliability, and standard error of measurement. The Brazilian Portuguese LEFS had a high level of association with the physical component of the Medical Outcomes Study 36-Item Short-Form Health Survey (r = 0.82), the Western Ontario and McMaster Universities Osteoarthritis Index (r = 0.87), the International Knee Documentation Committee Subjective Knee Evaluation Form (r = 0.82), and the pain visual analog scale (r = -0.60) (all, P<.05). Internal consistency and test-retest reliability (intraclass correlation coefficient = 0.957) of the Brazilian Portuguese version of the LEFS were high. The standard error of measurement was low (3.6) and the agreement was considered high, demonstrated by the small differences between test and retest and the narrow limit of agreement, as observed in Bland-Altman and survival-agreement plots. 
The translation of the LEFS into a
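The reliability statistics named above can be computed with generic, textbook formulas, sketched below. The toy data and function names are assumptions for illustration, not the study's scores or code.

```python
import numpy as np

def cronbach_alpha(items):
    """Internal consistency; items is an (n_subjects, n_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var_sum / total_var)

def standard_error_of_measurement(sd, reliability):
    """SEM = SD * sqrt(1 - reliability), e.g. with test-retest ICC as reliability."""
    return sd * np.sqrt(1 - reliability)

# Toy data: four perfectly consistent items give alpha = 1.
scores = np.column_stack([np.arange(1, 6)] * 4)
alpha = cronbach_alpha(scores)
# Hypothetical SD of 10 with the reported ICC of 0.957 gives SEM ≈ 2.07.
sem = standard_error_of_measurement(10.0, 0.957)
```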

  8. Evaluation of seabed mapping methods for fine-scale classification of extremely shallow benthic habitats - Application to the Venice Lagoon, Italy

    Science.gov (United States)

    Montereale Gavazzi, G.; Madricardo, F.; Janowski, L.; Kruss, A.; Blondel, P.; Sigovini, M.; Foglini, F.

    2016-03-01

    Recent technological developments of multibeam echosounder systems (MBES) allow mapping of benthic habitats with unprecedented detail. MBES can now be employed in extremely shallow waters, challenging data acquisition (as these instruments were often designed for deeper waters) and data interpretation (honed on datasets with resolution sometimes orders of magnitude lower). With extremely high-resolution bathymetry and co-located backscatter data, it is now possible to map the spatial distribution of fine-scale benthic habitats, even identifying the acoustic signatures of single sponges. In this context, it is necessary to understand which of the commonly used segmentation methods is best suited to account for this level of detail. At the same time, new sampling protocols for precisely geo-referenced ground-truth data need to be developed to validate the benthic environmental classification. This study focuses on a dataset collected in a shallow (2-10 m deep) tidal channel of the Lagoon of Venice, Italy. Using 0.05-m and 0.2-m raster grids, we compared a range of classifications, both pixel-based and object-based approaches, including manual classification, the Maximum Likelihood Classifier, Jenks optimization clustering, textural analysis and Object-Based Image Analysis. Through a comprehensive and accurately geo-referenced ground-truth dataset, we were able to identify five different classes of substrate composition, including sponges, mixed submerged aquatic vegetation, mixed detritic bottom (fine and coarse) and unconsolidated bare sediment. We computed estimates of accuracy (namely overall, user's and producer's accuracies and the Kappa statistic) by cross-tabulating predicted and reference instances. Overall, pixel-based segmentations produced the highest accuracies, and the accuracy assessment is strongly dependent on the number of classes chosen for the thematic output. Tidal channels in the Venice Lagoon are extremely important in terms of habitats and sediment distribution
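The accuracy measures listed above all derive from the cross-tabulation (confusion matrix) of predicted against reference classes. A generic sketch follows; the two-class example matrix is a made-up illustration, not the study's data.

```python
import numpy as np

def accuracy_metrics(cm):
    """Overall, user's and producer's accuracies and Cohen's kappa from a
    confusion matrix (rows = predicted class, columns = reference class)."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    overall = np.trace(cm) / n
    users = np.diag(cm) / cm.sum(axis=1)       # user's accuracy, per predicted class
    producers = np.diag(cm) / cm.sum(axis=0)   # producer's accuracy, per reference class
    chance = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / n**2
    kappa = (overall - chance) / (1 - chance)  # agreement beyond chance
    return overall, users, producers, kappa

# Hypothetical two-class example: 85 of 100 instances on the diagonal.
overall, users, producers, kappa = accuracy_metrics([[40, 10], [5, 45]])
# overall = 0.85, kappa = 0.70
```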

  9. ERA: Adverse Consequences

    Science.gov (United States)

    Martin, Brian

    2011-01-01

    Excellence in Research for Australia has a number of limitations: inputs are counted as outputs, time is wasted, disciplinary research is favoured and public engagement is discouraged. Most importantly, by focusing on measurement and emphasising competition, ERA may actually undermine the cooperation and intrinsic motivation that underpin research…

  10. Energy: a new era

    International Nuclear Information System (INIS)

    Moore, Curtis.

    1995-01-01

    The world appears on the verge of a new era of advanced technologies and new fuels. Although such a transformation is unlikely to take place overnight, change is clearly coming. The question is how much and how fast: Will energy transition amount to a technological revolution, or merely an evolution? A detailed evaluation of the various aspects is given. (author). 23 refs

  11. Spatial and temporal patterns of bank failure during extreme flood events: Evidence of nonlinearity and self-organised criticality at the basin scale?

    Science.gov (United States)

    Thompson, C. J.; Croke, J. C.; Grove, J. R.

    2012-04-01

    Non-linearity in physical systems provides a conceptual framework to explain complex patterns and forms that derive from complex internal dynamics rather than external forcings, and can be used to inform modeling and improve landscape management. One process that has previously been investigated to explore the existence of self-organised criticality (SOC) in river systems at the basin scale is bank failure. Spatial trends in bank failure have previously been quantified to determine whether the distribution of bank failures at the basin scale exhibits the necessary power-law magnitude/frequency distribution. More commonly, bank failures are investigated at a small scale using several cross-sections, with strong emphasis on local-scale factors such as bank height, cohesion and hydraulic properties. Advancing our understanding of non-linearity in such processes, however, requires many more studies in which both spatial and temporal measurements of the process can be used to investigate the existence or otherwise of non-linearity and self-organised criticality. This study presents measurements of bank failure throughout the Lockyer catchment in southeast Queensland, Australia, which experienced an extreme flood event in January 2011 resulting in the loss of human lives and geomorphic channel change. The most dominant form of fluvial adjustment consisted of changes in channel geometry, notably widespread bank failures, which were readily identifiable as 'scalloped' failure scarps. Their spatial extents were mapped using a high-resolution LiDAR-derived digital elevation model and were verified by field surveys and air photos. Pre-flood LiDAR coverage for the catchment also existed, allowing direct comparison of the magnitude and frequency of bank failures from both pre- and post-flood periods. Data were collected and analysed within a GIS framework and investigated for power-law relationships. Bank failures appeared random and occurred
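    The power-law check described above (a straight-line magnitude/frequency relationship on log-log axes) can be sketched as follows; the synthetic Pareto sample and its exponent are purely illustrative stand-ins for mapped bank-failure magnitudes:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic bank-failure "sizes" (e.g. scarp areas) drawn from a Pareto law
# with exponent alpha = 1.5 via inverse-CDF sampling.
alpha, x_min = 1.5, 1.0
sizes = x_min * (1 - rng.random(5000)) ** (-1 / alpha)

# Empirical complementary CDF: P(X > x) at each observed magnitude.
x = np.sort(sizes)
ccdf = 1.0 - np.arange(1, x.size + 1) / x.size

# A power law appears as a straight line in log-log space; fit its slope,
# dropping the last few points where the empirical CCDF approaches zero.
slope, _ = np.polyfit(np.log(x[:-10]), np.log(ccdf[:-10]), 1)
print(f"fitted log-log slope ~ {slope:.2f} (expect ~ -{alpha})")
```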

  12. ERA-Net Evaluation of technology status for small-scale combustion of pellets from new ash rich biomasses - combustion tests in residential burners

    Energy Technology Data Exchange (ETDEWEB)

    Roennbaeck, Marie; Johansson, Mathias; Claesson, Frida

    2008-07-01

    In this project, pellets with higher ash content than the wood pellets currently used on the Swedish market were tested in three domestic-scale burners. The tests were carried out based on EN 303-5. In the flue gas, combustion parameters such as carbon monoxide, carbon dioxide, oxygen and hydrocarbons were measured, as well as more fuel-specific parameters such as nitrogen oxides, sulphur dioxide, hydrogen chloride, total dust, and particle mass and number concentrations. The dust (fly ash) and bottom ash were characterized chemically. The implications of high ash content for combustion performance are discussed in the report. Altogether, five pellet types of 8 mm diameter were tested: oilseed straw pellet, reed canary grass (RCG) pellet, barley straw pellet, bark pellet and wood pellet. All fuels were dry, with moisture contents ranging from 6.5 to 12%. The ash content varied from 0.3 weight-% dm in wood to 7.9% in RCG. Barley straw has a noticeably low ash melting temperature, < 980 deg C, and could not be combusted in any of the burners. The nitrogen content varied by a factor of nine and the sulphur content by more than a factor of 10. The chlorine content was very low in wood and bark and more than 20 times higher in oilseed and barley. The composition of inorganic species in the fuel ash was dominated by calcium, potassium and silica in the wood, bark and oilseed pellets, while RCG and barley straw were dominated by silica. The three burners used were commercial and known to fulfil high quality requirements. Burner A is a pellet burner in which fuel is supplied on top of the grate, with no mechanical means for moving bottom ash on the grate during combustion. Bottom ash is blown away, and any slag remaining on the grate is removed with a scraper before ignition. Burner B is an upward-burning pellet burner in which fuel and ash are pushed upwards and the glow bed is exposed to the surrounding combustion compartment. Burner C is a forward-burning grain burner that pushes fuel and ash forwards, inside a cylinder. From the

  13. Application of the extreme value theory to beam loss estimates in the SPIRAL2 linac based on large scale Monte Carlo computations

    Directory of Open Access Journals (Sweden)

    R. Duperrier

    2006-04-01

    Full Text Available The influence of random perturbations of high-intensity accelerator elements on beam losses is considered. This paper presents the error sensitivity study performed for the SPIRAL2 linac in order to define the tolerances for its construction. The proposed driver aims to accelerate a 5 mA deuteron beam up to 20 A MeV and a 1 mA ion beam with q/A = 1/3 up to 14.5 A MeV. It is a continuous-wave linac, designed for maximum efficiency in the transmission of intense beams and a tunable energy. It consists of an injector (two ECR sources + LEBTs, with the possibility to inject from several sources, plus a radio frequency quadrupole) followed by a superconducting section based on an array of independently phased cavities, where the transverse focusing is performed with warm quadrupoles. The correction scheme and the expected losses are described. Extreme value theory is used to estimate the expected beam losses. The described method couples large-scale computations with extreme value statistics to obtain probability distribution functions, and the bootstrap technique is used to provide confidence intervals for the beam loss predictions. With such a method, it is possible to estimate the risk of losing a few watts in this high-power linac (up to 200 kW).
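    The loss-estimation approach described above (large-scale Monte Carlo runs plus bootstrap confidence intervals on an extreme quantile) can be sketched as follows; the lognormal "loss" sample is a purely illustrative stand-in for actual SPIRAL2 tracking results:

```python
import numpy as np

rng = np.random.default_rng(42)
# Stand-in for Monte Carlo beam-loss results: one simulated loss figure
# (in watts) per randomly perturbed linac configuration.
losses = rng.lognormal(mean=0.0, sigma=1.0, size=2000)

def q99(sample):
    """Point estimate of the 99th-percentile loss."""
    return np.quantile(sample, 0.99)

# Bootstrap: resample the Monte Carlo runs with replacement and recompute
# the 99th-percentile loss to get a confidence interval on the estimate.
boot = np.array([q99(rng.choice(losses, size=losses.size, replace=True))
                 for _ in range(1000)])
lo, hi = np.quantile(boot, [0.025, 0.975])
print(f"99th-percentile loss: {q99(losses):.2f} W, 95% CI [{lo:.2f}, {hi:.2f}] W")
```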

  14. India Labour and Employment Report 2014: Workers in the era of ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2016-06-07

    Jun 7, 2016 ... India's rapid economic growth has reduced extreme poverty among ... India Labour and Employment Report 2014: Workers in the era of globalization ... The impact of demographic change on economic growth in Kenya and ...

  15. ERA's Ranger uranium mine

    International Nuclear Information System (INIS)

    Davies, W.

    1997-01-01

    Energy Resources of Australia (ERA) is a public company with 68% of its shares owned by the Australian company North Limited. It currently operates one major production centre, Ranger Mine, 260 kilometres east of Darwin in the Northern Territory, extracting uranium and selling it to nuclear electricity utilities in Japan, South Korea, Europe and North America. The first drum of uranium oxide from Ranger was filled in August 1981 and operations have continued since that time. ERA is also working towards obtaining approvals for the development of a second mine, Jabiluka, located 20 kilometres north of Ranger; the leases of Ranger and Jabiluka adjoin. The Minister for the Environment has advised the Minister for Resources and Energy that there does not appear to be any environmental issue which would prevent the preferred Jabiluka proposal from proceeding. Consent for ERA's preferred option for the development of Jabiluka is being sought from the Aboriginal Traditional Owners. Ranger is currently the third largest producing uranium mine in the world, producing 4,237 tonnes of U3O8 in the year to June 1997

  16. Scales

    Science.gov (United States)

    Scales are a visible peeling or flaking of outer skin layers. These layers are called the stratum ... Scales may be caused by dry skin, certain inflammatory skin conditions, or infections. Examples of disorders that ...

  17. Extreme cosmos

    CERN Document Server

    Gaensler, Bryan

    2011-01-01

    The universe is all about extremes. Space has a temperature 270°C below freezing. Stars die in catastrophic supernova explosions a billion times brighter than the Sun. A black hole can generate 10 million trillion volts of electricity. And hypergiants are stars 2 billion kilometres across, larger than the orbit of Jupiter. Extreme Cosmos provides a stunning new view of the way the Universe works, seen through the lens of extremes: the fastest, hottest, heaviest, brightest, oldest, densest and even the loudest. This is an astronomy book that not only offers amazing facts and figures but also re

  18. Ideologies and Discourses: Extreme Narratives in Extreme Metal Music

    Directory of Open Access Journals (Sweden)

    Bojana Radovanović

    2016-10-01

    Full Text Available Historically speaking, metal music has always been about provoking a strong reaction. Depending on the characteristics of different sub-genres, one can focus on the sound, technique, visual appearance, and, furthermore, the ideologies and ideas that are the foundation for each of the sub-genres. Although the majority of the metal community rejects accusations of being racially intolerant, some ideologies of extreme sub-genres (such as black metal are in fact formed around ideas of self-conscious elitism expressed through interest in pagan mythology, racism, Nazism and fascism. There has been much interest in the Nazi era within the extreme metal scene, influencing other sub-genres and artists. The aim of this paper is to examine various appearances of extreme narratives such as Nazism and racism in different sub-genres of metal, bearing in mind variations dependent on geographical, political, and other factors.

  19. Dakwah di Era Digital

    Directory of Open Access Journals (Sweden)

    Wahyu Budiantoro

    2018-04-01

    Full Text Available Today dakwah is interpreted not only as the transmission of pure religious values, but also as the transmission of values relevant to the many aspects of the digital era. The digital era is one in which society is submerged in flows of information, causing culture shock and difficulty in synthesizing meaning from scattered information. Dakwah in the digital age must accommodate the needs of a society that tends to move toward a mass society, which calls for more humane and innovative dakwah strategies and methods. One innovative method is to conduct dakwah activities through digital media, with the consequence that the da'i must develop soft skills and technological capability. A further benefit is that dakwah can become more modern and practical in its methods and material. Citizen journalism, as a product of mass culture and of technological development, also gives the da'i an opportunity to record all activities, including the dynamics of Islamic life. In terms of the learning curriculum, dakwah in digital formats must be included, so that the intellectual and cultural spirit that flourished in the pesantren can be adapted to compete in a global world.

  20. Second Nuclear Era

    International Nuclear Information System (INIS)

    Weinberg, A.M.; Spiewak, I.; Barkenbus, J.N.; Livingston, R.S.; Phung, D.L.

    1984-03-01

    The Institute for Energy Analysis with support from The Andrew W. Mellon Foundation has studied the decline of the present nuclear era in the United States and the characteristics of a Second Nuclear Era which might be instrumental in restoring nuclear power to an appropriate place in the energy options of our country. The study has determined that reactors operating today are much safer than they were at the time of the TMI accident. A number of concepts for a supersafe reactor were reviewed and at least two were found that show considerable promise, the PIUS, a Swedish pressurized water design, and a gas-cooled modular design of German and US origin. Although new, safer, incrementally improved, conventional reactors are under study by the nuclear industry, the complete lack of new orders in the United States will slow their introduction and they are likely to be more expensive than present designs. The study recommends that supersafe reactors be taken seriously and that federal and private funds both be used to design and, if feasible, to build a prototype reactor of substantial size. 146 references, 8 figures, 2 tables

  1. Rework of the ERA software system: ERA-8

    Science.gov (United States)

    Pavlov, D.; Skripnichenko, V.

    2015-08-01

    The software system that has powered many products of the IAA for decades has undergone a major rework. ERA has capabilities for processing tables of observations of different kinds, fitting parameters to observations, and integrating the equations of motion of Solar system bodies. ERA comprises a domain-specific language called SLON, tailored for astronomical tasks. SLON provides a convenient syntax for reductions of observations, choosing which IAU standards to use, and applying rules for filtering observations or selecting parameters for fitting. ERA also includes a table editor and a graph plotter. ERA-8 has a number of improvements over previous versions, such as: integration of the Solar system and TT−TDB with an arbitrary number of asteroids; an option to use different ephemerides (including DE and INPOP); and an integrator with 80-bit floating point. The code of ERA-8 has been completely rewritten from Pascal to C (for numerical computations) and Racket (for running SLON programs and managing data). ERA-8 is portable across major operating systems. The format of tables in ERA-8 is based on SQLite, and the SPICE format has been chosen as the main format for ephemerides.
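    Since the record notes that ERA-8 tables are SQLite-based, a reduction step over such a table can be sketched from Python; the schema and column names below are hypothetical illustrations, not the actual ERA-8 table layout:

```python
import sqlite3

# Illustrative in-memory table of astrometric observations; the real
# ERA-8 schema is not documented here.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE observations (
                   jd REAL,      -- Julian date of the observation
                   body TEXT,    -- observed Solar system body
                   ra REAL,      -- right ascension, radians
                   dec REAL)     -- declination, radians
            """)
con.executemany("INSERT INTO observations VALUES (?, ?, ?, ?)",
                [(2457000.5, "Mars", 1.23, 0.45),
                 (2457001.5, "Mars", 1.24, 0.46)])

# A filtering rule (as SLON might express) selecting one body in a window.
rows = con.execute("SELECT jd, ra, dec FROM observations "
                   "WHERE body = ? AND jd BETWEEN ? AND ? ORDER BY jd",
                   ("Mars", 2457000.0, 2457002.0)).fetchall()
print(rows)
```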

  2. How extreme is extreme hourly precipitation?

    Science.gov (United States)

    Papalexiou, Simon Michael; Dialynas, Yannis G.; Pappas, Christoforos

    2016-04-01

    The importance of accurate representation of precipitation at fine time scales (e.g., hourly), directly associated with flash flood events, is crucial in hydrological design and prediction. The upper part of a probability distribution, known as the distribution tail, determines the behavior of extreme events. In general, and loosely speaking, tails can be categorized in two families: the subexponential and the hyperexponential family, with the first generating more intense and more frequent extremes compared to the latter. In past studies, the focus has been mainly on daily precipitation, with the Gamma distribution being the most popular model. Here, we investigate the behaviour of tails of hourly precipitation by comparing the upper part of empirical distributions of thousands of records with three general types of tails corresponding to the Pareto, Lognormal, and Weibull distributions. Specifically, we use thousands of hourly rainfall records from all over the USA. The analysis indicates that heavier-tailed distributions describe better the observed hourly rainfall extremes in comparison to lighter tails. Traditional representations of the marginal distribution of hourly rainfall may significantly deviate from observed behaviours of extremes, with direct implications on hydroclimatic variables modelling and engineering design.
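    The two tail families discussed above can be contrasted numerically: a subexponential tail (Pareto, or Weibull with shape < 1) assigns far more probability to extremes than an exponential reference tail. A minimal sketch, with illustrative parameter values:

```python
import numpy as np

# Survival functions P(X > x) for the tail types named in the abstract.
def pareto_sf(x, alpha=2.0, xm=1.0):
    return (xm / x) ** alpha            # power-law (subexponential) tail

def weibull_sf(x, c=0.7, scale=1.0):
    return np.exp(-(x / scale) ** c)    # stretched exponential; c < 1 => heavy

def expon_sf(x, rate=1.0):
    return np.exp(-rate * x)            # light (exponential) reference tail

x = 50.0
# Far in the tail, the subexponential families dominate the exponential
# reference by many orders of magnitude.
ratio_pareto = pareto_sf(x) / expon_sf(x)
ratio_weibull = weibull_sf(x) / expon_sf(x)
print(ratio_pareto, ratio_weibull)
```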

  3. Nasionalisme di Era Internet

    Directory of Open Access Journals (Sweden)

    Danu Widhyatmoko

    2015-04-01

    Full Text Available Nationalism and nationality of a country life are moving into the new phase. Internet has become a new medium that opens up so many opportunities to create a sense of nationalism for the country. This paper contains a review of nationalism in the age of the Internet. This paper begins with understanding nationalism, the character of the Internet, social media and nationalism in the era of the Internet. Research method used in this paper is literature study, continued with reflective data analysis. With reflective analysis method, the authors analyzed data from the data collection has been carried out for comparison between the existing literature by circumstances or phenomena that occur, so that the conclusions of rational and scientific data can be obtained. 

  4. Scale interactions in economics: application to the evaluation of the economic damages of climatic change and of extreme events; Interactions d'echelles en economie: application a l'evaluation des dommages economiques du changement climatique et des evenements extremes

    Energy Technology Data Exchange (ETDEWEB)

    Hallegatte, S

    2005-06-15

    Growth models, which neglect economic disequilibria (considered as temporary), are generally used to evaluate the damages generated by climatic change. Through a series of modeling experiments, this work shows the importance of disequilibria and of the endogenous variability of the economy in the evaluation of damages due to extreme events and climatic change. It demonstrates the impossibility of separating the evaluation of damages from the representation of growth and economic dynamics: welfare losses will depend both on the nature and intensity of the impacts and on the dynamics and state of the economy to which they apply. Thus, the uncertainties about the damages of future climatic changes stem both from scientific uncertainties and from uncertainties about the future organization of our economies. (J.S.)

  5. Attitude extremity, consensus and diagnosticity

    NARCIS (Netherlands)

    van der Pligt, J.; Ester, P.; van der Linden, J.

    1983-01-01

    Studied the effects of attitude extremity on perceived consensus and willingness to ascribe trait terms to others with either pro- or antinuclear attitudes. 611 Ss rated their attitudes toward nuclear energy on a 5-point scale. Results show that attitude extremity affected consensus estimates. Trait

  6. Drivers and seasonal predictability of extreme wind speeds in the ECMWF System 4 and a statistical model

    Science.gov (United States)

    Walz, M. A.; Donat, M.; Leckebusch, G. C.

    2017-12-01

    As extreme wind speeds are responsible for large socio-economic losses in Europe, a skillful prediction would be of great benefit for disaster prevention as well as for the actuarial community. Here we evaluate patterns of large-scale atmospheric variability and the seasonal predictability of extreme wind speeds (e.g. >95th percentile) in the European domain in the dynamical seasonal forecast system ECMWF System 4, and compare to the predictability based on a statistical prediction model. The dominant patterns of atmospheric variability show distinct differences between reanalysis and ECMWF System 4, with most patterns in System 4 extended downstream in comparison to ERA-Interim. The dissimilar manifestations of the patterns within the two models lead to substantially different drivers associated with the occurrence of extreme winds in the respective model. While the ECMWF System 4 is shown to provide some predictive power over Scandinavia and the eastern Atlantic, only very few grid cells in the European domain have significant correlations for extreme wind speeds in System 4 compared to ERA-Interim. In contrast, a statistical model predicts extreme wind speeds during boreal winter in better agreement with the observations. Our results suggest that System 4 does not seem to capture the potential predictability of extreme winds that exists in the real world, and therefore fails to provide reliable seasonal predictions for lead months 2-4. This is likely related to the unrealistic representation of large-scale patterns of atmospheric variability. Hence our study points to potential improvements of dynamical prediction skill by improving the simulation of large-scale atmospheric dynamics.
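    The extreme-wind definition used above (exceedance of the 95th percentile) and a simple seasonal-count comparison between two series can be sketched as follows; the synthetic Weibull winds and the ten 90-day "winters" are invented stand-ins, not System 4 or ERA-Interim output:

```python
import numpy as np

rng = np.random.default_rng(1)
# Stand-ins for daily winter wind speeds at one grid cell: an "observed"
# series and a correlated "forecast" series sharing part of its signal.
obs = rng.weibull(2.0, size=900) * 8.0
fct = 0.7 * obs + 0.3 * rng.weibull(2.0, size=900) * 8.0

# Extreme winds defined, as above, by exceedance of the 95th percentile.
thr_obs, thr_fct = np.quantile(obs, 0.95), np.quantile(fct, 0.95)
ext_obs = (obs > thr_obs).astype(float)
ext_fct = (fct > thr_fct).astype(float)

# Seasonal skill proxy: correlation of per-"season" exceedance counts
# (10 blocks of 90 days standing in for 10 winters).
c_obs = ext_obs.reshape(10, 90).sum(axis=1)
c_fct = ext_fct.reshape(10, 90).sum(axis=1)
r = np.corrcoef(c_obs, c_fct)[0, 1]
print(f"exceedance-count correlation r = {r:.2f}")
```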

  7. VALUATION IN THE CONSTITUTIONAL ERA

    African Journals Online (AJOL)

    Brimer

    16 ..... stem from the pre-constitutional era, and the constitutional framework and its legitimate reform efforts. A decision on what is just ...... Carroll L Alice's Adventures in Wonderland (Digital Scanning Scituate MA. 2007). Dagan 1999 Va L Rev.

  8. Three eras of climate change

    Energy Technology Data Exchange (ETDEWEB)

    Huq, Saleemul; Toulmin, Camilla

    2006-10-15

    Climate change as a global challenge has evolved through a series of stages in the last few decades. We are now on the brink of a new era which will see the terms of the debate shift once again. The different eras are characterised by the scientific evidence, public perceptions, responses and engagement of different groups to address the problem. In the first era, from the late 1980s to 2000, climate change was seen as an “environmental” problem to do with prevention of future impacts on the planet's climate systems over the next fifty to hundred years, through reductions in emissions of greenhouse gases, known as “mitigation”. The second era can be said to have started around the turn of the millennium, with the recognition that there will be some unavoidable impacts from climate change in the near term (over the next decade or two). These impacts must be coped with through “adaptation”, as well as mitigation, to prevent much more severe and possibly catastrophic impacts in the longer term. It has become clear that many of the impacts of climate change in the near term are likely to fall on the poorest countries and communities. The third era, which we are just about to enter, will see the issue change from tackling an environmental or development problem to a question of “global justice”. It will engage with a much wider array of citizens from around the world than previous eras.

  9. Investigating NARCCAP Precipitation Extremes via Bivariate Extreme Value Theory (Invited)

    Science.gov (United States)

    Weller, G. B.; Cooley, D. S.; Sain, S. R.; Bukovsky, M. S.; Mearns, L. O.

    2013-12-01

    We introduce methodology from statistical extreme value theory to examine the ability of reanalysis-driven regional climate models to simulate past daily precipitation extremes. Going beyond a comparison of summary statistics such as 20-year return values, we study whether the most extreme precipitation events produced by climate model simulations exhibit correspondence to the most extreme events seen in observational records. The extent of this correspondence is formulated via the statistical concept of tail dependence. We examine several case studies of extreme precipitation events simulated by the six models of the North American Regional Climate Change Assessment Program (NARCCAP) driven by NCEP reanalysis. It is found that the NARCCAP models generally reproduce daily winter precipitation extremes along the Pacific coast quite well; in contrast, simulation of past daily summer precipitation extremes in a central US region is poor. Some differences in the strength of extremal correspondence are seen in the central region between models which employ spectral nudging and those which do not. We demonstrate how these techniques may be used to draw a link between extreme precipitation events and large-scale atmospheric drivers, as well as to downscale extreme precipitation simulated by a future run of a regional climate model. Specifically, we examine potential future changes in the nature of extreme precipitation along the Pacific coast produced by the pineapple express (PE) phenomenon. A link between extreme precipitation events and a "PE Index" derived from North Pacific sea-surface pressure fields is found. This link is used to study PE-influenced extreme precipitation produced by a future-scenario climate model run.
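    The tail-dependence concept used above can be estimated empirically as chi(u) = P(Y exceeds its u-quantile | X exceeds its u-quantile); a sketch on synthetic data with a shared extreme-driving component (all series invented, not NARCCAP output):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 20000
# Synthetic daily precipitation from a "model" and from "observations",
# built with a shared component so that their extremes tend to co-occur.
shared = rng.exponential(size=n)
model = shared + rng.exponential(size=n)
obs = shared + rng.exponential(size=n)

def chi(a, b, u=0.95):
    """Empirical tail-dependence coefficient chi(u):
    P(b exceeds its u-quantile | a exceeds its u-quantile)."""
    ea = a > np.quantile(a, u)
    eb = b > np.quantile(b, u)
    return (ea & eb).sum() / ea.sum()

# Independent series for contrast: chi(u) collapses to about 1 - u (0.05).
indep = rng.exponential(size=n)
print(chi(model, obs), chi(model, indep))
```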

  10. Attractions to radiation-like eras in superstring cosmologies

    CERN Document Server

    Partouche, Herve

    2010-01-01

    We review the cosmology induced by finite temperature and quantum effects on non-supersymmetric string models. We show the evolution is attracted to radiation-like solutions after the Hagedorn era and before the electroweak phase transition. This mechanism generates a hierarchy between the Planck mass and the supersymmetry breaking scale. A dynamical change of space-time dimension can take place.

  11. Ecosystem-scale volatile organic compound fluxes during an extreme drought in a broadleaf temperate forest of the Missouri Ozarks (central USA)

    Energy Technology Data Exchange (ETDEWEB)

    Seco, Roger [Univ. of California, Irvine, CA (United States); Karl, Thomas [Univ. of Innsbruck (Austria); Guenther, Alex B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Washington State Univ., Pullman, WA (United States); Hosman, Kevin P. [Univ. of Missouri, Columbia, MO (United States); Pallardy, Stephen G. [Univ. of Missouri, Columbia, MO (United States); Gu, Lianhong [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Geron, Chris [U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Harley, Peter [National Center for Atmospheric Research, Boulder, CO (United States); Kim, Saewung [Univ. of California, Irvine, CA (United States)

    2015-07-07

    Considerable amounts and varieties of biogenic volatile organic compounds (BVOCs) are exchanged between vegetation and the surrounding air. These BVOCs play key ecological and atmospheric roles that must be adequately represented for accurately modeling the coupled biosphere-atmosphere-climate earth system. One key uncertainty in existing models is the response of BVOC fluxes to an important global change process: drought. We describe the diurnal and seasonal variation in isoprene, monoterpene, and methanol fluxes from a temperate forest ecosystem before, during, and after an extreme 2012 drought event in the Ozark region of the central USA. BVOC fluxes were dominated by isoprene, which attained high emission rates of up to 35.4 mg m-2 h-1 at midday. Methanol fluxes were characterized by net deposition in the morning, changing to a net emission flux through the rest of the daylight hours. Net flux of CO2 reached its seasonal maximum approximately a month earlier than isoprenoid fluxes, which highlights the differential response of photosynthesis and isoprenoid emissions to progressing drought conditions. Nevertheless, both processes were strongly suppressed under extreme drought, although isoprene fluxes remained relatively high compared to reported fluxes from other ecosystems. Methanol exchange was less affected by drought throughout the season, confirming the complex processes driving biogenic methanol fluxes. The fraction of daytime (7-17 h) assimilated carbon released back to the atmosphere combining the three BVOCs measured was 2% of gross primary productivity (GPP) and 4.9% of net ecosystem exchange (NEE) on average for our whole measurement campaign, while exceeding 5% of GPP and 10% of NEE just before the strongest drought phase. The MEGANv2.1 model correctly predicted diurnal variations in fluxes driven mainly by light and temperature, although further research is needed to address model BVOC fluxes

  12. METODE MUHADDITSIN DI ERA MODERN

    Directory of Open Access Journals (Sweden)

    Adriansyah Adriansyah

    2016-05-01

    Full Text Available After the era of tadwin (codification), almost all disciplines of knowledge in the Islamic world, including the study of hadith, were considered to be "running on the spot." Yet attention to, and preservation of, the hadith remained a priority for intellectuals. Similarly, in the modern era the hadith remains an object of criticism, not only by Muslim intellectuals but also by outsiders, such as the West. Western imperialism over the Islamic world in the past marked the beginning of a history in which Muslims were only able to "survive" rather than "attack." Defensive and reactive works, written against Western trends of criticizing and disparaging the hadith, subsequently became the dominant trend and supporting methodology among Muslim scholars of the hadith in today's modern era

  13. Wafer-scaled monolayer WO{sub 3} windows ultra-sensitive, extremely-fast and stable UV-A photodetection

    Energy Technology Data Exchange (ETDEWEB)

    Hai, Zhenyin; Akbari, Mohammad Karbalaei [Ghent University Global Campus, Department of Applied Analytical & Physical Chemistry, Faculty of Bioscience Engineering, 119 Songdomunhwa-ro, Yeonsu-gu, Incheon 21985 (Korea, Republic of); Xue, Chenyang [Key Laboratory of Instrumentation Science and Dynamic Measurement of Ministry of Education, North University of China, Taiyuan, Shanxi 030051 (China); Xu, Hongyan [School of Materials Science and Engineering, North University of China, Taiyuan, Shanxi 030051 (China); Hyde, Lachlan [Melbourne Centre for Nanofabrication, Clayton, Victoria 3168 (Australia); Zhuiykov, Serge, E-mail: serge.zhuiykov@ugent.be [Ghent University Global Campus, Department of Applied Analytical & Physical Chemistry, Faculty of Bioscience Engineering, 119 Songdomunhwa-ro, Yeonsu-gu, Incheon 21985 (Korea, Republic of)

    2017-05-31

    Highlights: • Monolayer WO{sub 3}-based photodetectors were fabricated for the first time. • The device has an ultrafast response time of ∼40 μs and responsivity of ∼0.329 A W{sup −1}. • The response time is a 400-fold improvement over any other WO{sub 3} UV photodetector. • The device has better characteristics than many 2D materials-based photodetectors. • The proposed strategy has great potential for commercialization of photodetectors. - Abstract: The monolayer WO{sub 3}-based UV-A photodetectors, fabricated by the atomic layer deposition (ALD) technique over a large area of a SiO{sub 2}/Si wafer, have demonstrated vastly improved functional capabilities: an extremely fast response time of less than 40 μs and photoresponsivity reaching ∼0.329 A W{sup −1}. Their ultrafast photoresponse time is at least a 400-fold improvement over previous reports for any other WO{sub 3}-based UV photodetector fabricated to date, and significantly faster than most other photodetectors based on two-dimensional (2D) nanomaterials reported to date. Moreover, their measured long-term stability exceeds 200 cycles without any visible degradation. The ALD-deposited WO{sub 3} monolayer also exhibited a wider bandgap of 3.53 eV, and the UV-A photodetector based on it is environmentally friendly and highly reliable, with excellent reproducibility and long-term stability. Thus, the shift to mono-layered semiconductors, which possess completely new quantum-confined effects, has the greatest potential for creating a new class of nano-materials, which in turn opens up new functional opportunities for various opto-electronic instruments built on semiconductor monolayers and, more importantly, can result in a new strategy for fabricating highly flexible, inexpensive and extremely sensitive devices. This strategy also opens up great opportunities for industrialization and commercialization of the photodetectors and other optoelectronic devices based on

  14. Ultrastrong Carbon Thin Films from Diamond to Graphene under Extreme Conditions: Probing Atomic Scale Interfacial Mechanisms to Achieve Ultralow Friction and Wear

    Science.gov (United States)

    2016-12-08

    tribological behavior of hard carbon materials during initial sliding contact, in order to understand what controls and enables the transition from high to...publication. Our goal is to characterize and understand the atomic-scale mechanisms governing the tribological behavior (friction and wear) of hard carbon...affecting the sliding behavior of these materials, including: rehybridization from sp3 to sp2-bonding of the C atoms20, formation of bonds across the

  15. DEEP CHANDRA, HST-COS, AND MEGACAM OBSERVATIONS OF THE PHOENIX CLUSTER: EXTREME STAR FORMATION AND AGN FEEDBACK ON HUNDRED KILOPARSEC SCALES

    International Nuclear Information System (INIS)

    McDonald, Michael; Bautz, Marshall W.; Miller, Eric D.; ZuHone, John A.; McNamara, Brian R.; Weeren, Reinout J. van; Bayliss, Matthew; Jones-Forman, Christine; Applegate, Douglas E.; Benson, Bradford A.; Carlstrom, John E.; Mantz, Adam B.; Bleem, Lindsey E.; Chatzikos, Marios; Edge, Alastair C.; Fabian, Andrew C.; Garmire, Gordon P.; Hlavacek-Larrondo, Julie; Stalder, Brian; Veilleux, Sylvain

    2015-01-01

    We present new ultraviolet, optical, and X-ray data on the Phoenix galaxy cluster (SPT-CLJ2344-4243). Deep optical imaging reveals previously undetected filaments of star formation, extending to radii of ∼50–100 kpc in multiple directions. Combined UV-optical spectroscopy of the central galaxy reveals a massive (2 × 10{sup 9} M{sub ⊙}), young (∼4.5 Myr) population of stars, consistent with a time-averaged star formation rate of 610 ± 50 M{sub ⊙} yr{sup −1}. We report a strong detection of O vi λλ1032,1038, which appears to originate primarily in shock-heated gas, but may contain a substantial contribution (>1000 M{sub ⊙} yr{sup −1}) from the cooling intracluster medium (ICM). We confirm the presence of deep X-ray cavities in the inner ∼10 kpc, which are among the most extreme examples of radio-mode feedback detected to date, implying jet powers of 2–7 × 10{sup 45} erg s{sup −1}. We provide evidence that the active galactic nucleus inflating these cavities may have only recently transitioned from “quasar-mode” to “radio-mode,” and may currently be insufficient to completely offset cooling. A model-subtracted residual X-ray image reveals evidence for prior episodes of strong radio-mode feedback at radii of ∼100 kpc, with extended “ghost” cavities indicating a prior epoch of feedback roughly 100 Myr ago. This residual image also exhibits significant asymmetry in the inner ∼200 kpc (0.15R{sub 500}), reminiscent of infalling cool clouds, either due to minor mergers or fragmentation of the cooling ICM. Taken together, these data reveal a rapidly evolving cool core which is rich with structure (both spatially and in temperature), is subject to a variety of highly energetic processes, and yet is cooling rapidly and forming stars along thin, narrow filaments

  16. Extreme erosion response after wildfire in the Upper Ovens, south-east Australia: Assessment of catchment scale connectivity by an intensive field survey

    Science.gov (United States)

    Box, Walter; Keestra, Saskia; Nyman, Petter; Langhans, Christoph; Sheridan, Gary

    2015-04-01

    South-eastern Australia is generally regarded as one of the world's most fire-prone environments because of its high temperatures, low rainfall and flammable native Eucalyptus forests. Modifications to the landscape by fire can lead to significant changes in erosion rates and hydrological processes. Debris flows in particular have been recognised as a process whose frequency increases as a result of fire. This study used a debris flow event in the east Upper Ovens that occurred on 28 February 2013 as a case study for analysing sediment transport processes and the connectivity of sediment sources and sinks. Source areas were identified using 15 cm resolution aerial imagery, and a logistic regression model based on fire severity, aridity index and slope was built to predict the locations of source areas. Deposits were measured by making cross-sections using a combination of a differential GPS and a total station. In total, 77 cross-sections were made in a 14.1 km2 sub-catchment, distributed based on channel gradient and width; a more detailed estimate was obtained by making more cross-sections where the volume per unit area is higher. Particle size distributions of source and sink areas were obtained by a combination of field assessment, photographic image analysis, and sieve and laser-diffraction analysis. Sediment was locally eroded, transported and deposited depending on factors such as longitudinal gradient, stream power and the composition of bed and bank material. The role of headwaters as sediment sinks changed dramatically as a result of the extreme erosion event in the wildfire-affected areas. Disconnected headwaters became connected to low-order streams due to debris flow processes in the contributing catchment. However, this redistribution of sediment from headwaters to the drainage network was confined to the upper reaches of the Ovens. Below this upper part of the catchment the event resulted in redistribution of sediment already existing in the channel through a

  17. Survival of Er(a+) red cells in a patient with allo-anti-Era

    International Nuclear Information System (INIS)

    Thompson, H.W.; Skradski, K.J.; Thoreson, J.R.; Polesky, H.F.

    1985-01-01

    {sup 51}Cr-labeled Er(a+) red cells survived nearly normally (T{sub 1/2} of 21 days) in a patient with allo-anti-Era. Transfusion of Er(a+) blood was without significant reaction and did not affect the anti-Era titer

  18. Extreme temperature events on Greenland in observations and the MAR regional climate model

    Science.gov (United States)

    Leeson, Amber A.; Eastoe, Emma; Fettweis, Xavier

    2018-03-01

    Meltwater from the Greenland Ice Sheet contributed 1.7-6.12 mm to global sea level between 1993 and 2010 and is expected to contribute 20-110 mm to future sea level rise by 2100. These estimates were produced by regional climate models (RCMs), which are known to be robust at the ice sheet scale but occasionally miss regional- and local-scale climate variability (e.g. Leeson et al., 2017; Medley et al., 2013). To date, the fidelity of these models in the context of short-period variability in time (i.e. intra-seasonal) has not been fully assessed, for example their ability to simulate extreme temperature events. We use an event identification algorithm commonly used in extreme value analysis, together with observations from the Greenland Climate Network (GC-Net), to assess the ability of the MAR (Modèle Atmosphérique Régional) RCM to reproduce observed extreme positive-temperature events at 14 sites around Greenland. We find that MAR is able to accurately simulate the frequency and duration of these events but underestimates their magnitude by more than half a degree Celsius, although this bias is much smaller than that exhibited by coarse-scale ERA-Interim reanalysis data. As a result, melt energy in MAR output is underestimated by between 16 and 41 %, depending on the global forcing applied. Further work is needed to precisely determine the drivers of extreme temperature events, and why the model underperforms in this area, but our findings suggest that biases are passed into MAR from the boundary forcing data. This is important because these forcings are common between RCMs, and hence so is their range of predictions of past and future ice sheet melting. We propose that examining extreme events should become a routine part of global and regional climate model evaluation and that addressing shortcomings in this area should be a priority for model development.
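The event-identification step described in the abstract can be sketched as a threshold-exceedance scheme with simple runs declustering. This is a minimal sketch: the 95 % quantile threshold, the 2-day merging gap, and the synthetic temperature series are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def extreme_events(temps, quantile=0.95, gap=2):
    """Identify extreme positive-temperature events as runs of threshold
    exceedances, merging runs separated by fewer than `gap` cool days
    (a simple runs-declustering scheme)."""
    u = np.quantile(temps, quantile)
    exceed = temps > u
    events, start, cool = [], None, 0
    for i, hot in enumerate(exceed):
        if hot:
            if start is None:
                start = i
            cool = 0
        elif start is not None:
            cool += 1
            if cool >= gap:
                events.append((start, i - cool))  # (first hot day, last hot day)
                start, cool = None, 0
    if start is not None:
        events.append((start, len(temps) - 1 - cool))
    return u, events

# Demo on synthetic daily temperatures (placeholder for GC-Net station data).
rng = np.random.default_rng(0)
temps = rng.normal(-10.0, 8.0, size=3650)
u, events = extreme_events(temps)
durations = [last - first + 1 for first, last in events]
print(f"threshold {u:.1f} °C: {len(events)} events, mean duration {np.mean(durations):.1f} d")
```

From the `(start, end)` pairs, the frequency, duration, and magnitude statistics compared in the paper follow directly.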

  19. Assessment of Observational Uncertainty in Extreme Precipitation Events over the Continental United States

    Science.gov (United States)

    Slinskey, E. A.; Loikith, P. C.; Waliser, D. E.; Goodman, A.

    2017-12-01

    Extreme precipitation events are associated with numerous societal and environmental impacts. Furthermore, anthropogenic climate change is projected to alter precipitation intensity across portions of the Continental United States (CONUS). Therefore, a spatial understanding and intuitive means of monitoring extreme precipitation over time is critical. Towards this end, we apply an event-based indicator, developed as a part of NASA's support of the ongoing efforts of the US National Climate Assessment, which assigns categories to extreme precipitation events based on 3-day storm totals as a basis for dataset intercomparison. To assess observational uncertainty across a wide range of historical precipitation measurement approaches, we intercompare in situ station data from the Global Historical Climatology Network (GHCN), satellite-derived precipitation data from NASA's Tropical Rainfall Measuring Mission (TRMM), gridded in situ station data from the Parameter-elevation Regressions on Independent Slopes Model (PRISM), global reanalysis from NASA's Modern Era Retrospective-Analysis version 2 (MERRA 2), and regional reanalysis with gauge data assimilation from NCEP's North American Regional Reanalysis (NARR). Results suggest considerable variability across the five-dataset suite in the frequency, spatial extent, and magnitude of extreme precipitation events. Consistent with expectations, higher resolution datasets were found to resemble station data best and capture a greater frequency of high-end extreme events relative to lower spatial resolution datasets. The degree of dataset agreement varies regionally, however all datasets successfully capture the seasonal cycle of precipitation extremes across the CONUS. These intercomparison results provide additional insight about observational uncertainty and the ability of a range of precipitation measurement and analysis products to capture extreme precipitation event climatology. While the event category threshold is fixed

  20. Full scale test SSP 34m blade, edgewise loading LTT. Extreme load and PoC{sub I}nvE Data report

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, Magda; Roczek-Sieradzan, A.; Jensen, Find M. (and others)

    2010-09-15

    This report is the second report covering the research and demonstration project 'Experimental blade research: Structural mechanisms in current and future large blades under combined loading', supported by the EUDP program. A 34m wind turbine blade from SSP-Technology A/S has been tested in the edgewise direction (LTT). The blade has been subjected to thorough examination by means of strain gauges, displacement transducers and a 3D optical measuring system. This data report presents results obtained during full-scale testing of the blade up to 80% Risoe load, where 80% Risoe load corresponds to 100% certification load. The pulls at 80% Risoe load were repeated, and the results from these pulls were compared. The blade was reinforced according to a Risoe DTU invention, where the trailing edge panels are coupled. The coupling is implemented to prevent out-of-plane deformations and to reduce peeling stresses in the adhesive joints. Test results from measurements with the reinforcement have been compared to results without the coupling. The report presents only the results relevant for the 80% Risoe load and those applicable to the investigation of the influence of the invention on the profile deformation. (Author)

  1. Evolution caused by extreme events.

    Science.gov (United States)

    Grant, Peter R; Grant, B Rosemary; Huey, Raymond B; Johnson, Marc T J; Knoll, Andrew H; Schmitt, Johanna

    2017-06-19

    Extreme events can be a major driver of evolutionary change over geological and contemporary timescales. Outstanding examples are evolutionary diversification following mass extinctions caused by extreme volcanism or asteroid impact. The evolution of organisms in contemporary time is typically viewed as a gradual and incremental process that results from genetic change, environmental perturbation or both. However, contemporary environments occasionally experience strong perturbations such as heat waves, floods, hurricanes, droughts and pest outbreaks. These extreme events set up strong selection pressures on organisms, and are small-scale analogues of the dramatic changes documented in the fossil record. Because extreme events are rare, almost by definition, they are difficult to study. So far most attention has been given to their ecological rather than to their evolutionary consequences. We review several case studies of contemporary evolution in response to two types of extreme environmental perturbations, episodic (pulse) or prolonged (press). Evolution is most likely to occur when extreme events alter community composition. We encourage investigators to be prepared for evolutionary change in response to rare events during long-term field studies.This article is part of the themed issue 'Behavioural, ecological and evolutionary responses to extreme climatic events'. © 2017 The Author(s).

  2. Statistical Model of Extreme Shear

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Hansen, Kurt Schaldemose

    2004-01-01

    In order to continue cost-optimisation of modern large wind turbines, it is important to continuously increase the knowledge of wind field parameters relevant to design loads. This paper presents a general statistical model that offers site-specific prediction of the probability density function...... by a model that, on a statistically consistent basis, describes the most likely spatial shape of an extreme wind shear event. Predictions from the model have been compared with results from an extreme value data analysis, based on a large number of high-sampled full-scale time series measurements...... are consistent, given the inevitable uncertainties associated with the model as well as with the extreme value data analysis. Keywords: Statistical model, extreme wind conditions, statistical analysis, turbulence, wind loading, wind shear, wind turbines....
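The extreme value data analysis mentioned above can be illustrated with a block-maxima Gumbel fit; this is a minimal sketch on synthetic data, where the block size, distribution parameters and return period are assumptions standing in for the full-scale measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Synthetic 10-min wind-shear exponents, grouped into 150 weekly blocks of
# 1008 samples each (a stand-in for high-sampled full-scale time series).
shear = rng.gumbel(loc=0.2, scale=0.05, size=(150, 1008))

# Extreme value analysis: take block maxima, then fit a Gumbel (type I) law.
block_max = shear.max(axis=1)
loc, scale = stats.gumbel_r.fit(block_max)

# Shear level exceeded on average once per 50 blocks.
rl_50 = stats.gumbel_r.ppf(1 - 1 / 50, loc=loc, scale=scale)
print(f"Gumbel loc = {loc:.3f}, scale = {scale:.3f}, 50-block level = {rl_50:.3f}")
```

The fitted quantiles are what one would compare against the spatial-shape model's predictions of extreme shear events.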

  3. A km-scale "triaxial experiment" reveals the extreme mechanical weakness and anisotropy of mica-schists (Grandes Rousses Massif, France)

    Science.gov (United States)

    Bolognesi, Francesca; Bistacchi, Andrea

    2018-02-01

    The development of Andersonian faults is predicted, according to theory and experiments, for brittle/frictional deformation occurring in a homogeneous medium. In contrast, in an anisotropic medium it is possible to observe fault nucleation and propagation that is non-Andersonian in geometry and kinematics. Here, we consider post-metamorphic brittle/frictional deformation in the mechanically anisotropic mylonitic mica-schists of the Grandes Rousse Massif (France). The role of the mylonitic foliation (and of any other source of mechanical anisotropy) in brittle/frictional deformation is a function of orientation and friction angle. According to the relative orientation of principal stress axes and foliation, a foliation characterized by a certain coefficient of friction will be utilized or not for the nucleation and propagation of brittle/frictional fractures and faults. If the foliation is not utilized, the rock behaves as if it was isotropic, and Andersonian geometry and kinematics can be observed. If the foliation is utilized, the deviatoric stress magnitude is buffered and Andersonian faults/fractures cannot develop. In a narrow transition regime, both Andersonian and non-Andersonian structures can be observed. We apply stress inversion and slip tendency analysis to determine the critical angle for failure of the metamorphic foliation of the Grandes Rousses schists, defined as the limit angle between the foliation and principal stress axes for which the foliation was brittlely reactivated. This approach allows defining the ratio of the coefficient of internal friction for failure along the mylonitic foliation to the isotropic coefficient of friction. Thus, the study area can be seen as a km-scale triaxial experiment that allows measuring the degree of mechanical anisotropy of the mylonitic mica-schists. In this way, we infer a coefficient of friction μ_weak = 0.14 for brittle-frictional failure of the foliation, or 20 % of the isotropic coefficient of internal

  4. Extremely Preterm Birth

    Science.gov (United States)

    ACOG patient FAQ173, June 2016 (PDF format; also available in Spanish): When is a baby considered “preterm” or …

  5. Combined dendro-documentary evidence of Central European hydroclimatic springtime extremes over the last millennium

    Science.gov (United States)

    Büntgen, Ulf; Brázdil, Rudolf; Heussner, Karl-Uwe; Hofmann, Jutta; Kontic, Raymond; Kyncl, Tomáš; Pfister, Christian; Chromá, Kateřina; Tegel, Willy

    2011-12-01

    A predicted rise in anthropogenic greenhouse gas emissions and associated effects on the Earth's climate system likely imply more frequent and severe weather extremes, with alterations in hydroclimatic parameters expected to be most critical for ecosystem functioning, agricultural yield, and human health. Evaluating the return period and amplitude of modern climatic extremes in light of pre-industrial natural changes is, however, limited by generally too short instrumental meteorological observations. Here we introduce and analyze 11,873 annually resolved and absolutely dated ring width measurement series from living and historical fir (Abies alba Mill.) trees sampled across France, Switzerland, Germany, and the Czech Republic, which continuously span the AD 962-2007 period. Even though a dominant climatic driver of European fir growth was not found, ring width extremes were evidently triggered by anomalous variations in Central European April-June precipitation. Wet conditions were associated with dynamic low-pressure cells, whereas continental-scale droughts coincided with persistent high pressure between 35 and 55°N. Documentary evidence independently confirms many of the dendro signals over the past millennium, and further provides insight into the causes and consequences of the ambient weather conditions related to the reconstructed extremes. A fairly uniform distribution of hydroclimatic extremes throughout the Medieval Climate Anomaly, Little Ice Age and Recent Global Warming may question the common belief that the frequency and severity of such events closely relate to climate mean stages. This joint dendro-documentary approach not only allows extreme climate conditions of the industrial era to be placed against the backdrop of natural variations, but also probably helps to constrain climate model simulations over exceptionally long timescales.

  6. Return levels of temperature extremes in southern Pakistan

    Science.gov (United States)

    Zahid, Maida; Blender, Richard; Lucarini, Valerio; Caterina Bramati, Maria

    2017-12-01

    Southern Pakistan (Sindh) is one of the hottest regions in the world and is highly vulnerable to temperature extremes. In order to improve rural and urban planning, it is useful to gather information about the recurrence of temperature extremes. In this work, return levels of the daily maximum temperature Tmax are estimated, as well as the daily maximum wet-bulb temperature TWmax extremes. We adopt the peaks over threshold (POT) method, which has not yet been used for similar studies in this region. Two main datasets are analyzed: temperatures observed at nine meteorological stations in southern Pakistan from 1980 to 2013, and the ERA-Interim (ECMWF reanalysis) data for the nearest corresponding locations. The analysis provides the 2-, 5-, 10-, 25-, 50-, and 100-year return levels (RLs) of temperature extremes. The 90 % quantile is found to be a suitable threshold for all stations. We find that the RLs of the observed Tmax are above 50 °C at the northern stations and above 45 °C at the southern stations. The RLs of the observed TWmax exceed 35 °C in the region, which is considered a limit of survivability. The RLs estimated from the ERA-Interim data are lower by 3 to 5 °C than the RLs assessed for the nine meteorological stations. A simple bias correction applied to the ERA-Interim data improves the RLs remarkably, yet discrepancies are still present. The results have potential implications for the risk assessment of extreme temperatures in Sindh.
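The peaks-over-threshold (POT) procedure named in the abstract can be sketched as follows: exceedances over the 90 % quantile are fitted with a Generalized Pareto distribution and converted to m-year return levels via the standard POT formula. Synthetic data stand in for the station Tmax series; the numbers are placeholders, not the study's results.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic daily maximum temperatures (placeholder for 34 years of station Tmax).
tmax = rng.normal(loc=38.0, scale=5.0, size=34 * 365)

# Peaks-over-threshold: the 90 % quantile as threshold, as in the study.
u = np.quantile(tmax, 0.90)
exceedances = tmax[tmax > u] - u

# Fit a Generalized Pareto distribution to the exceedances (location fixed at 0).
shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)

def return_level(m_years, u, shape, scale, n_per_year, p_exceed):
    """Level exceeded on average once every m_years (standard POT formula)."""
    n = m_years * n_per_year * p_exceed   # expected exceedances in m_years
    if abs(shape) < 1e-9:                 # exponential-tail limit
        return u + scale * np.log(n)
    return u + (scale / shape) * (n ** shape - 1.0)

p_u = exceedances.size / tmax.size        # empirical exceedance probability
rl_50 = return_level(50, u, shape, scale, 365, p_u)
print(f"threshold u = {u:.1f} °C, 50-year return level = {rl_50:.1f} °C")
```

The same machinery applied at each of the 2- to 100-year horizons yields the RL table the paper reports.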

  7. Modelling extreme dry spells in the Mediterranean region in connection with atmospheric circulation

    Science.gov (United States)

    Tramblay, Yves; Hertig, Elke

    2018-04-01

    Long drought periods can affect the Mediterranean region during the winter season, when most of the annual precipitation occurs, and consequently have strong impacts on agriculture, groundwater levels and water resources. The goal of this study is to model annual maximum dry spell lengths (AMDSL) that occur during the extended winter season (October to April). The spatial patterns of extreme dry spells and their relationships with large-scale atmospheric circulation were first investigated. Then, AMDSL were modelled using Generalized Extreme Value (GEV) distributions incorporating climatic covariates, to evaluate the dependence of extreme dry spells on synoptic patterns using an analogue approach. The data from a network of 160 rain gauges with daily precipitation measurements between 1960 and 2009 are considered, together with the ERA-20C reanalysis of the 20th century, which provides the atmospheric variables (geopotential heights, humidity, winds). A regional classification of both the occurrence and the duration of AMDSL helped to distinguish three spatially contiguous regions in which the regional distributions were found to be homogeneous. From composite analysis, significant positive anomalies in geopotential height (Z500) and negative anomalies in zonal wind (U850) and relative and specific humidity (S850, R850) were found to be associated with AMDSL in the three regions and provided the reference to build analogue days. Finally, non-stationary GEV models have been compared, in which the location and scale parameters are related to different atmospheric indices. Results indicate, at the whole Mediterranean scale, that positive anomalies of the North Atlantic Oscillation index and, to a lesser extent, the Mediterranean Oscillation index are linked to longer extreme dry spells at the majority of stations. For the three regions identified, the frequency of U850 negative anomalies over North Africa is significantly associated with the magnitude of AMDSL. AMDSL are also
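A non-stationary GEV with a covariate-dependent location parameter, of the kind used in the study, can be fitted by maximum likelihood. This is a minimal illustration on synthetic data with a hypothetical circulation index; note that scipy's shape parameter c equals −ξ in the usual GEV convention, which the code handles explicitly.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
n = 50
index = rng.normal(size=n)  # hypothetical NAO-like circulation index
# Synthetic annual maximum dry spell lengths whose GEV location shifts
# linearly with the index.
y = stats.genextreme.rvs(c=-0.1, loc=30 + 4 * index, scale=5, random_state=rng)

def neg_log_lik(params):
    mu0, mu1, log_sigma, xi = params
    mu = mu0 + mu1 * index               # location depends on the covariate
    # scipy's shape parameter c = -xi (opposite sign to the usual GEV xi)
    return -stats.genextreme.logpdf(y, c=-xi, loc=mu, scale=np.exp(log_sigma)).sum()

res = optimize.minimize(neg_log_lik, x0=[y.mean(), 0.0, np.log(y.std()), 0.1],
                        method="Nelder-Mead")
mu0, mu1, log_sigma, xi = res.x
print(f"location = {mu0:.1f} + {mu1:.1f} * index, "
      f"scale = {np.exp(log_sigma):.1f}, shape = {xi:.2f}")
```

Competing models (e.g. covariate in the scale parameter as well) would be compared by their maximized likelihoods, as the paper does across atmospheric indices.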

  8. Statistical Model of Extreme Shear

    DEFF Research Database (Denmark)

    Hansen, Kurt Schaldemose; Larsen, Gunner Chr.

    2005-01-01

    In order to continue cost-optimisation of modern large wind turbines, it is important to continuously increase the knowledge of wind field parameters relevant to design loads. This paper presents a general statistical model that offers site-specific prediction of the probability density function...... by a model that, on a statistically consistent basis, describes the most likely spatial shape of an extreme wind shear event. Predictions from the model have been compared with results from an extreme value data analysis, based on a large number of full-scale measurements recorded with a high sampling rate...

  9. Runtime Systems for Extreme Scale Platforms

    Science.gov (United States)

    2013-12-01


  10. Tarbijalepingud rahvusvahelises eraõiguses [Consumer contracts in private international law] / Margus Kingisepp

    Index Scriptorium Estoniae

    Kingisepp, Margus, 1969-

    1997-01-01

    On the regulation of consumer contracts in different countries, the 1955 Hague Convention and the 1980 Rome Convention, international jurisdiction over consumer contracts, and the provisions of private international law in Estonian law

  11. The New Era of Counterforce

    Science.gov (United States)

    Lieber, Keir

    Nuclear deterrence rests on the survivability of nuclear arsenals. For much of the nuclear age, counterforce disarming attacks (those aimed at eliminating nuclear forces) were nearly impossible because of the ability of potential victims to hide and protect their weapons. However, technological developments are eroding this foundation of nuclear deterrence. Advances rooted in the computer revolution have made nuclear forces around the world far more vulnerable than before. Specifically, two key approaches that countries have relied on to ensure arsenal survivability since the dawn of the nuclear age (hardening and concealment) have been undercut by leaps in weapons accuracy and a revolution in remote sensing. Various models, methods, and evidence demonstrate the emergence of new possibilities for counterforce disarming strikes. In short, the task of securing nuclear arsenals against attack is a far greater challenge than it was in the past. The new era of counterforce challenges the basis for confidence in contemporary deterrence stability, raises critical issues for national and international security policy, and sheds light on one of the enduring theoretical puzzles of the nuclear era: why international security competition has endured in the shadow of the nuclear revolution.

  12. Seasonal temperature extremes in Potsdam

    Science.gov (United States)

    Kundzewicz, Zbigniew; Huang, Shaochun

    2010-12-01

    The awareness of global warming is well established and results from observations made at thousands of stations. This paper complements these large-scale results by examining a long time series of high-quality temperature data from the Secular Meteorological Station in Potsdam, where observation records over the last 117 years (i.e., from January 1893) are available. Tendencies of change in seasonal temperature-related climate extremes are demonstrated. "Cold" extremes have become less frequent and less severe than in the past, while "warm" extremes have become more frequent and more severe. Moreover, the interval of the occurrence of frost has been decreasing, while the interval of the occurrence of hot days has been increasing. However, many changes are not statistically significant, since the variability of temperature indices at the Potsdam station has been very strong.

  13. Comparing Evaporative Sources of Terrestrial Precipitation and Their Extremes in MERRA Using Relative Entropy

    Science.gov (United States)

    Dirmeyer, Paul A.; Wei, Jiangfeng; Bosilovich, Michael G.; Mocko, David M.

    2014-01-01

    A quasi-isentropic back-trajectory scheme is applied to output from the Modern Era Retrospective-analysis for Research and Applications and a land-only replay with corrected precipitation to estimate surface evaporative sources of moisture supplying precipitation over every ice-free land location for the period 1979-2005. The evaporative source patterns for any location and time period are effectively two-dimensional probability distributions. As such, the evaporative sources for extreme situations like droughts or wet intervals can be compared to the corresponding climatological distributions using the method of relative entropy. Significant differences are found to be common and widespread for droughts, but not wet periods, when monthly data are examined. At pentad temporal resolution, which is better able to isolate floods and situations of atmospheric rivers, values of relative entropy over North America are typically 50-400 % larger than at monthly time scales. Significant differences suggest that moisture transport may be the key to precipitation extremes. Where evaporative sources do not change significantly, it implies other local causes may underlie the extreme events.
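The relative-entropy comparison amounts to a Kullback-Leibler divergence between two gridded probability distributions. A minimal sketch on synthetic source maps follows; the grid size and data are placeholders, not the MERRA fields.

```python
import numpy as np

def relative_entropy(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p||q) between two gridded source maps,
    each treated as a discrete 2-D probability distribution."""
    p = p / p.sum()
    q = q / q.sum()
    mask = p > 0                         # 0 * log(0) contributes nothing
    return float(np.sum(p[mask] * np.log(p[mask] / (q[mask] + eps))))

rng = np.random.default_rng(1)
clim = rng.random((20, 30))              # climatological evaporative-source map
drought = clim * rng.random((20, 30))    # drought-period map, reweighted

print(f"D(drought || climatology) = {relative_entropy(drought, clim):.3f}")
```

A larger divergence indicates a drought whose moisture sources differ more strongly from the climatology, which is how the paper flags significant shifts in moisture transport.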

  14. Entrepreneurship in the Digital Era

    Directory of Open Access Journals (Sweden)

    Nur Achmad

    2016-12-01

    Full Text Available This research aims to examine entrepreneurship as one of the key issues relevant to many aspects of society, including the economy, business and employment. Research on entrepreneurship provides the references for identifying the factors that support entrepreneurial success. The development of the internet in the digital age has indirectly influenced the entrepreneurial ethos, which is shaped not only by market potential and product innovation, but also by commitments to entrepreneurship education and training models. Therefore, studies on entrepreneurship in industrialized and developing countries are attractive. This is relevant to an internet era that provides opportunities for the development of an entrepreneurial ethos, especially for the younger generation.

  15. TANTANGAN DAKWAH DI ERA POSMODERNISME

    Directory of Open Access Journals (Sweden)

    Elya Munfarida

    2016-11-01

    As a criticism of modernism, postmodernism has created new cultural realities different from the previous ones. Through its political acts and strategies, postmodernism has constructed cultural hyperreality and complexity. This change is both an opportunity and a threat, and we need to be critical in order to anticipate its negative effects. One of its strategies, cultural deconstruction, which denies transcendental signs, meanings, and values, represents one of these negative effects. It will eliminate religious values, which by contrast place transcendental values at their core. For that reason, we need to contextualize Islamic teachings so that postmodernism does not erode Muslims' sense of religiosity. Da'wa, as a means of communication and transformation of Islamic values, plays a significant role in this postmodern era. Consequently, the reconstruction of a contextual da'wa strategy should be undertaken to counterbalance the domination of postmodern cultures.

  16. Extreme environment electronics

    CERN Document Server

    Cressler, John D

    2012-01-01

    Unfriendly to conventional electronic devices, circuits, and systems, extreme environments represent a serious challenge to designers and mission architects. The first truly comprehensive guide to this specialized field, Extreme Environment Electronics explains the essential aspects of designing and using devices, circuits, and electronic systems intended to operate in extreme environments, including across wide temperature ranges and in radiation-intense scenarios such as space. The Definitive Guide to Extreme Environment Electronics Featuring contributions by some of the world's foremost exp

  17. Extreme value distributions

    CERN Document Server

    Ahsanullah, Mohammad

    2016-01-01

    The aim of the book is to give a thorough account of the basic theory of extreme value distributions. The book covers a wide range of materials available to date, and the central ideas and results of extreme value distributions are presented. The book will be useful to applied statisticians as well as statisticians interested in working in the area of extreme value distributions. The monograph gives a self-contained account of the theory and applications of extreme value distributions.

  18. Global Learning in a New Era

    Directory of Open Access Journals (Sweden)

    Judith Ramaley

    2016-06-01

    Our nation’s colleges and universities have frequently adapted their educational approaches and their relationships with society to respond to new social, economic and environmental challenges. The increasingly interconnected patterns that link together our lives on a global scale have created a new reality. Globalization offers an especially exciting and challenging blend of generational change combined with the emergence of a set of complex, multi-faceted problems created by the global context in which we all now live and work. How shall we educate our students for life in this new era? What can we expect of our graduates in a global world? The answer to these questions is straightforward but will require our institutions to make significant changes in their approach to educating their students and in their interactions with the broader communities that they serve. The approach is shaped by a clear sense of what a globally prepared graduate knows and can do, guided by clear learning outcomes exercised along a sequential pathway of experiences extending from the first year of college through to graduation. These experiences are supported by the use of engaged learning practices that draw students into work that is both personally and socially meaningful: cross-disciplinary inquiry that focuses on Big Questions, with the goal of finding ways to address those questions in ethical, responsible and effective ways.

  19. Nutrition security under extreme events

    Science.gov (United States)

    Martinez, A.

    2017-12-01

    Nutrition security under extreme events. With zero hunger being one of the United Nations Sustainable Development Goals, food security has become a trending research topic. However, the impact of extreme events on global food security is not yet fully understood, and a better grasp of the underlying mechanisms of global food trade and nutrition security is needed to improve countries' resilience to extreme events. In a globalized world, food is still a highly regulated commodity and a strategic resource. A drought in a net food-exporter will have little to no effect on its own population, but the repercussions for net food-importers can be extreme. In this project, we propose a methodology to describe and quantify the impact of a local drought on human health at a global scale. For this purpose, nutrition supply and global trade data from FAOSTAT have been used together with domestic food production from national agencies and FAOSTAT, global precipitation from the Climate Research Unit, and health data from the World Health Organization. A modified Herfindahl-Hirschman Index (HHI) has been developed to measure the resilience of one country to a drought occurring in another country. This index describes how dependent a country is on imports and how diversified its imports are. Losses of production and exports due to extreme events have been calculated using yield data and a simple food balance at country scale. Results show that the countries most affected by global droughts are those most dependent on a single exporting country. Changes induced by droughts also disturb their domestic protein, fat and calorie supply, resulting most of the time in a higher intake of calories or fat relative to protein.
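The dependency index described above can be sketched in a few lines. This is an illustrative baseline only: the abstract does not spell out the authors' exact modification of the Herfindahl-Hirschman Index, so a plain sum of squared import shares is assumed, and the numbers are hypothetical.

```python
def import_hhi(import_values):
    """Concentration of an import portfolio: sum of squared shares.

    Ranges from 1/n (n equal suppliers) to 1.0 (a single supplier).
    """
    total = sum(import_values)
    if total == 0:
        return 0.0
    return sum((v / total) ** 2 for v in import_values)

# Hypothetical wheat imports (tonnes) from three partner countries
diversified = import_hhi([100, 100, 100])   # three equal suppliers
concentrated = import_hhi([280, 10, 10])    # dominated by one supplier
print(f"{diversified:.3f} vs {concentrated:.3f}")  # 0.333 vs 0.873
```

A value near 1 flags a country that relies on a single exporting partner and is therefore exposed to a drought occurring in that partner's territory.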

  20. Nuclear energy and the new era

    International Nuclear Information System (INIS)

    Sefidvash, F.

    1992-01-01

    The problem of the utilization of nuclear energy is not only technical but also has important social, economic, political and ethical ramifications. Therefore, to discuss nuclear energy for the future, a vision of the new era needs to be identified. A model for the new era, as a natural consequence of growing interdependence among nations and the process of human evolution is described. The problems of inherent and passive safety, waste disposal, ecology, proliferation, economy and regulatory institutions in the new era are discussed. The particular role of small nuclear power reactors and their potential advantages are described. (author). 12 refs

  1. El lenguaje en la era digital

    Directory of Open Access Journals (Sweden)

    Juan Carlos Vergara Silva

    1998-02-01

    Building on the interrelation between language and thought, this article sets out the fundamental role that language plays in the economic, educational and cultural model generated by the emergence of the digital era, or knowledge era. The article highlights the challenges that an era marked by a digital paradigm poses for the development and use of communicative skills, both in higher-education teaching and in efficient professional practice.

  2. Statistical Downscaling of Gusts During Extreme European Winter Storms Using Radial-Basis-Function Networks

    Science.gov (United States)

    Voigt, M.; Lorenz, P.; Kruschke, T.; Osinski, R.; Ulbrich, U.; Leckebusch, G. C.

    2012-04-01

    Winterstorms and related gusts can cause extensive socio-economic damages. Knowledge about the occurrence and the small scale structure of such events may help to make regional estimations of storm losses. For a high spatial and temporal representation, the use of dynamical downscaling methods (RCM) is a cost-intensive and time-consuming option and therefore only applicable for a limited number of events. The current study explores a methodology to provide a statistical downscaling, which offers small scale structured gust fields from an extended large scale structured eventset. Radial-basis-function (RBF) networks in combination with bidirectional Kohonen (BDK) maps are used to generate the gustfields on a spatial resolution of 7 km from the 6-hourly mean sea level pressure field from ECMWF reanalysis data. BDK maps are a kind of neural network which handles supervised classification problems. In this study they are used to provide prototypes for the RBF network and give a first order approximation for the output data. A further interpolation is done by the RBF network. For the training process the 50 most extreme storm events over the North Atlantic area from 1957 to 2011 are used, which have been selected from ECMWF reanalysis datasets ERA40 and ERA-Interim by an objective wind based tracking algorithm. These events were downscaled dynamically by application of the DWD model chain GME → COSMO-EU. Different model parameters and their influence on the quality of the generated high-resolution gustfields are studied. It is shown that the statistical RBF network approach delivers reasonable results in modeling the regional gust fields for untrained events.
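As a minimal illustration of the RBF interpolation step described above, a Gaussian-kernel RBF network can be fitted by solving the linear system whose matrix holds the basis-function responses between training points. The BDK prototype selection and the actual pressure/gust fields are omitted; toy 1-D data stand in for the meteorological fields.

```python
import numpy as np

def rbf_fit(centers, targets, eps=1.0):
    """Solve for weights so the RBF network interpolates the training data."""
    d = np.abs(centers[:, None] - centers[None, :])
    phi = np.exp(-(eps * d) ** 2)        # Gaussian basis-function matrix
    return np.linalg.solve(phi, targets)

def rbf_predict(x, centers, weights, eps=1.0):
    d = np.abs(np.atleast_1d(x)[:, None] - centers[None, :])
    return np.exp(-(eps * d) ** 2) @ weights

# Toy stand-in for the downscaling task: learn sin(x) from 4 sample points
centers = np.array([0.0, 1.0, 2.0, 3.0])
targets = np.sin(centers)
w = rbf_fit(centers, targets)
print(rbf_predict(1.5, centers, w))      # roughly sin(1.5)
```

The network reproduces the training points exactly and interpolates between them; in the study, the same mechanism maps coarse pressure fields to high-resolution gust fields.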

  3. Periodistas para la era digital

    Directory of Open Access Journals (Sweden)

    Fernando Villalobos G.

    2014-12-01

    Full Text Available The education of 21st-century journalists in the digital era is conceived as a challenge, indeed the greatest one. Digital media demand specialization and mastery of the new technologies. Universities, the main agents of this change in the knowledge society, need to work on reformulating curricula and on three fundamental fronts: the graduate profile, the pedagogical style, and a new way of relating to the outside world. An academic offering in cyberjournalism and cybercommunication is needed. The journalistic profession, and specifically the education of communicators, is undergoing major changes due to new technologies; the classroom is now the globalized world interconnected by the great information highway, and the education of social communicators therefore demands renewed educational practices. Universities must rethink the role they are obliged to play, reformulate academic curricula, and offer cybercommunication and cyberjournalism in order to keep pace with the great transformations.

  4. A THIRD ERA OF MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Liviu NEAMŢU

    2009-12-01

    Full Text Available Management, like any social activity, passes through specific stages of development that follow general trends in society. Following these historical trends, this study summarizes the evolution of management in four stages of a full development cycle, from informal management to a fully formalized one. These stages of development are represented unevenly across the various economic development regions of the world. The increasing grouping of management patterns and their generalization into schools of management thought determine the current global development of management worldwide. The current stage of evolution may be called "the third era of management" or the "imperial period", in which management pressures on individuals, whether employers or subordinates, are enormous. The evolution of companies, markets and national economies, as well as of the global economy, is driven by current trends in management, leading to very strong shifts in the balance of forces. The world economy is engaged in what is called a "war of resources", and the alternative we believe necessary is "humane management", although speculative trends of capital concentration weigh on any state or global regulatory plans for ethical management.

  5. Identification of Tropical-Extratropical Interactions and Extreme Precipitation Events in the Middle East based on Potential Vorticity and Moisture Transport

    KAUST Repository

    de Vries, A. J.

    2017-12-26

    Extreme precipitation events in the otherwise arid Middle East can cause flooding with dramatic socioeconomic impacts. Most of these events are associated with tropical-extratropical interactions, whereby a stratospheric potential vorticity (PV) intrusion reaches deep into the subtropics and forces an incursion of high poleward vertically integrated water vapor transport (IVT) into the Middle East. This study presents an object-based identification method for extreme precipitation events based on the combination of these two larger-scale meteorological features. The general motivation for this approach is that precipitation is often poorly simulated in relatively coarse weather and climate models, whereas the synoptic-scale circulation is much better represented. The algorithm is applied to ERA-Interim reanalysis data (1979-2015) and detects 90% (83%) of the 99th (97.5th) percentile of extreme precipitation days in the region of interest. Our results show that stratospheric PV intrusions and IVT structures are intimately connected to extreme precipitation intensity and seasonality. The farther south a stratospheric PV intrusion reaches, the larger the IVT magnitude, and the longer the duration of their combined occurrence, the more extreme the precipitation. Our algorithm detects a large fraction of the climatological rainfall amounts (40-70%), heavy precipitation days (50-80%), and the top 10 extreme precipitation days (60-90%) at many sites in southern Israel and the northern and western parts of Saudi Arabia. This identification method provides a new tool for future work to disentangle teleconnections, assess medium-range predictability and improve understanding of climatic changes of extreme precipitation in the Middle East and elsewhere.

  6. Identification of Tropical-Extratropical Interactions and Extreme Precipitation Events in the Middle East Based On Potential Vorticity and Moisture Transport

    Science.gov (United States)

    de Vries, A. J.; Ouwersloot, H. G.; Feldstein, S. B.; Riemer, M.; El Kenawy, A. M.; McCabe, M. F.; Lelieveld, J.

    2018-01-01

    Extreme precipitation events in the otherwise arid Middle East can cause flooding with dramatic socioeconomic impacts. Most of these events are associated with tropical-extratropical interactions, whereby a stratospheric potential vorticity (PV) intrusion reaches deep into the subtropics and forces an incursion of high poleward vertically integrated water vapor transport (IVT) into the Middle East. This study presents an object-based identification method for extreme precipitation events based on the combination of these two larger-scale meteorological features. The general motivation for this approach is that precipitation is often poorly simulated in relatively coarse weather and climate models, whereas the synoptic-scale circulation is much better represented. The algorithm is applied to ERA-Interim reanalysis data (1979-2015) and detects 90% (83%) of the 99th (97.5th) percentile of extreme precipitation days in the region of interest. Our results show that stratospheric PV intrusions and IVT structures are intimately connected to extreme precipitation intensity and seasonality. The farther south a stratospheric PV intrusion reaches, the larger the IVT magnitude, and the longer the duration of their combined occurrence, the more extreme the precipitation. Our algorithm detects a large fraction of the climatological rainfall amounts (40-70%), heavy precipitation days (50-80%), and the top 10 extreme precipitation days (60-90%) at many sites in southern Israel and the northern and western parts of Saudi Arabia. This identification method provides a new tool for future work to disentangle teleconnections, assess medium-range predictability, and improve understanding of climatic changes of extreme precipitation in the Middle East and elsewhere.
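The percentile-based definition of extreme precipitation days used to score the detection algorithm can be sketched as follows. The synthetic gamma-distributed daily totals and the 0.1 mm wet-day cutoff are assumptions for illustration, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for ~37 years (1979-2015) of daily precipitation, mm
daily_precip = rng.gamma(shape=0.3, scale=4.0, size=13515)

wet = daily_precip[daily_precip > 0.1]     # discard dry/trace days
p99 = np.percentile(wet, 99)               # 99th-percentile threshold
extreme_days = daily_precip >= p99         # mask of extreme precipitation days

print(f"threshold {p99:.1f} mm, {extreme_days.sum()} extreme days")
```

Detection skill is then the fraction of these flagged days on which the PV/IVT object-identification algorithm also reports an event.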

  7. The Performing Arts in a New Era

    National Research Council Canada - National Science Library

    McCarthy, Kevin

    2001-01-01

    The Pew Charitable Trust commissioned The Performing Arts in a New Era from RAND in 1999 as part of a broad initiative aimed at increasing policy and financial support for nonprofit culture in the United States...

  8. MEMAKNAI SUMPAH PEMUDA DI ERA REFORMASI

    Directory of Open Access Journals (Sweden)

    Sutejo K. Widodo

    2013-03-01

    Full Text Available The moment of Sumpah Pemuda (the Youth Oath) took place 84 years ago, reflecting a spirit of nationalism that is still very important in this Reformation era. This paper endeavors to dig into the deeper meaning of Sumpah Pemuda in the pre-independence era and to apply it to our contemporary situation. The method used here is historical research using literature resources, such as articles, books, and other readings on the internet. It is concluded that the spirit of Sumpah Pemuda should serve as contemplative material and a valuable lesson so that the Reformation era may succeed in achieving the national goals stated in the Constitution: a society that is fair, prosperous, and democratic. Keywords: Sumpah Pemuda, Reformation era, nationalism.

  9. Technology: A New Era in Education

    Science.gov (United States)

    Cunningham, William G.

    1977-01-01

    Teachers and technologists have lived apart, with much doubt on both sides. The author suggests that collaboration, mutual trust, and respect will usher in a new era for effective education. (Editor)

  10. Optimization with Extremal Dynamics

    International Nuclear Information System (INIS)

    Boettcher, Stefan; Percus, Allon G.

    2001-01-01

    We explore a new general-purpose heuristic for finding high-quality solutions to hard discrete optimization problems. The method, called extremal optimization, is inspired by self-organized criticality, a concept introduced to describe emergent complexity in physical systems. Extremal optimization successively updates extremely undesirable variables of a single suboptimal solution, assigning them new, random values. Large fluctuations ensue, efficiently exploring many local optima. We use extremal optimization to elucidate the phase transition in the 3-coloring problem, and we provide independent confirmation of previously reported extrapolations for the ground-state energy of ±J spin glasses in d=3 and 4
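A minimal sketch of the extremal-optimization loop described above, applied to a toy graph-coloring instance (not the authors' implementation, and without the τ-EO refinement): repeatedly find the variable with the worst local fitness and assign it a new random value.

```python
import random

random.seed(1)
# Toy instance: 3-color a small graph (a 6-cycle plus two chords)
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0), (0, 3), (1, 4)]
n, k = 6, 3
colors = [random.randrange(k) for _ in range(n)]

def conflicts(v):
    """Local fitness of one variable: number of same-colored neighbors."""
    return sum(colors[a] == colors[b] for a, b in edges if v in (a, b))

for _ in range(20000):
    worst = max(range(n), key=conflicts)   # the extremely undesirable variable
    if conflicts(worst) == 0:
        break                              # max is conflict-free: proper coloring
    colors[worst] = random.randrange(k)    # assign it a new random value

total = sum(colors[a] == colors[b] for a, b in edges)
print(colors, "conflicts:", total)
```

Unlike gradient-style heuristics, no move is ever rejected: the large fluctuations caused by always resampling the worst variable are what let the method escape local optima.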

  11. Inflation with a smooth constant-roll to constant-roll era transition

    Science.gov (United States)

    Odintsov, S. D.; Oikonomou, V. K.

    2017-07-01

    In this paper, we study canonical scalar field models, with a varying second slow-roll parameter, that allow transitions between constant-roll eras. In the models with two constant-roll eras, it is possible to avoid fine-tunings in the initial conditions of the scalar field. We mainly focus on the stability of the resulting solutions, and we also investigate if these solutions are attractors of the cosmological system. We shall calculate the resulting scalar potential and, by using a numerical approach, we examine the stability and attractor properties of the solutions. As we show, the first constant-roll era is dynamically unstable towards linear perturbations, and the cosmological system is driven by the attractor solution to the final constant-roll era. As we demonstrate, it is possible to have a nearly scale-invariant power spectrum of primordial curvature perturbations in some cases; however, this is strongly model dependent and depends on the rate of the final constant-roll era. Finally, we present, in brief, the essential features of a model that allows oscillations between constant-roll eras.

  12. A new era of competitiveness.

    Science.gov (United States)

    Theadore, Jason C

    2011-01-01

    Many of my family, friends, and colleagues would describe me as competitive and would say that at times I overuse this skill with a win-at-all-costs attitude. I would tend to agree. I love to win. Yet for me winning is not about me; it is, as our coaches suggest, about others. I was recently asked by a new clinical leader if I missed taking care of patients. Without thinking, my response was that I take part in the care (add value) of every patient as a leader. Every decision we make as leaders (coaches) impacts many others regardless of the magnitude of our decision and, at times, our direct involvement. Operational excellence, in any field, is about winning. We all have different definitions of winning, defined by the strategic vision for our organizations. An organization's managers, supervisors, and employees all play an important role in the team to achieve the vision of the organization. As some elite coaches have suggested, winning starts with each of us being our best. Today's environment requires consistent change. Yet many in the radiology field change the wrong things for the wrong reasons. Many organizations and individuals look for instant gratification, specifically in this new era of competitiveness. Many evaluate what their competitors are doing in the market, what cars their neighbors are buying, or become jealous over a friend's success. Focusing on others and not improving yourself takes your focus away from what is important: you and the team you lead. Keeping your focus on your operations and what you can control may very well help you coach a winning team.

  13. EKSISTENSI AGAMA DALAM ERA GLOBALISASI

    Directory of Open Access Journals (Sweden)

    Ahmad Muttaqin

    2016-11-01

    globalization circle, which seems to be contradictory with religion. Globalization is utilitarian by nature, and it results in the erosion of local values and cultures. However, no one can avoid it, not even religious people. Responses to globalization frequently take the form of extreme behavior, since some people think that globalization threatens their existence in this world. Such responses lead these people to be labeled as fundamentalists or terrorists, and many of them have a religious background. Some religious groups that radically reject globalization can be found in states of the former USSR, in Japan, and in Iran. Finally, this paper presents the forms and positions of religion suggested by four figures: Immanuel Wallerstein, John Meyer, Roland Robertson, and Niklas Luhmann. They suggest that religions will maintain their existence if they adopt the values of globalization and make themselves an instrument of communication as well as of the world's political and economic interaction. Religion should evolve from narrow-mindedness to broader, new, and universal values.

  14. Arctic daily temperature and precipitation extremes: Observed and simulated physical behavior

    Science.gov (United States)

    Glisan, Justin Michael

    climatological records, regional weather patterns, and geographical/topographical features. We compared simulated extremes with those occurring at corresponding observing stations in the U.S. National Climate Data Center's (NCDC's) Global Summary of the Day. Our analysis focused on variations in features of the extremes such as magnitudes, spatial scales, and temporal regimes. Using composites of extreme events, we also analyzed the processes producing these extremes, comparing circulation, pressure, temperature and humidity fields from the ERA-Interim reanalysis and the model output. The analysis revealed the importance of atmospheric convection in the Arctic for some extreme precipitation events and the overall importance of topographic precipitation. The analysis established the physical credibility of the simulations for extreme behavior, laying a foundation for examining projected changes in extreme precipitation. It also highlighted the utility of the model for extracting behavior that one cannot discern directly from the observations, such as summer convective precipitation.

  15. Classifying Returns as Extreme

    DEFF Research Database (Denmark)

    Christiansen, Charlotte

    2014-01-01

    I consider extreme returns for the stock and bond markets of 14 EU countries using two classification schemes: One, the univariate classification scheme from the previous literature that classifies extreme returns for each market separately, and two, a novel multivariate classification scheme tha...
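The univariate classification scheme referred to above can be illustrated simply: a return is classified as extreme for a given market when it falls in that market's own lower or upper tail. The 2.5% tail cutoffs and the Student-t toy returns are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
# Heavy-tailed toy daily returns for one market (Student-t, 4 d.o.f.)
returns = rng.standard_t(df=4, size=2500) * 0.01

lo, hi = np.quantile(returns, [0.025, 0.975])   # market-specific tail cutoffs
extreme = (returns <= lo) | (returns >= hi)     # univariate extreme-day flags

print(f"{extreme.mean():.3f} of days classified as extreme")
```

A multivariate scheme would instead classify days jointly across markets, so that a moderately bad day everywhere can count as extreme even if no single market breaches its own cutoff.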

  16. A new Era in Sustainable Development

    Energy Technology Data Exchange (ETDEWEB)

    Bass, Steve

    2007-03-15

    It is 20 years since the World Commission on Environment and Development — the Brundtland Commission — released its influential report on sustainable development. This is now the declared intention of most governments, many international organisations, and an increasing number of businesses and civil society groups. High profile 'intentions' have given rise to a bewildering array of sustainable development plans, tools and business models. But these have not yet triggered the pace, scale, scope and depth of change that is needed to make development sustainable. They leave the underlying causes of unsustainable development largely undisturbed. They include few means for anticipating non-linear changes – from climate change to economic cycles – and for building resilience to them. Consequently, most environmental and welfare measures continue to decline in almost all countries. Much energy has been spent crafting the sustainable development 'toolkit'. But that energy has been channelled largely through a narrow set of international processes and 'elite' national actors. The results are not yet integral to the machinery of government or business, or people's daily lives. This paper calls for energies to be directed in new ways, constructing a truly global endeavour informed by diverse local actors' evidence of 'what works', and focusing more keenly on long-term futures. The key drivers and challenges of a 'new era in sustainable development' are suggested, to elicit ideas and leadership from a richer vein of experience than has been embraced by the formal international endeavours to date. This paper is the first in a series on the sustainable development futures that face key sectors and stakeholder groups.

  17. Flux scaling: Ultimate regime

    Indian Academy of Sciences (India)

    Flux scaling: Ultimate regime. With the Nusselt number and the mixing-length scales, we obtain the Nusselt number and Reynolds number (w'd/ν) scalings; this scaling is expected to occur at extremely high Ra in Rayleigh-Benard convection, the ultimate regime.

  18. Era Transmídia

    Directory of Open Access Journals (Sweden)

    Rodrigo Dias Arnaut

    2011-12-01

    Full Text Available The transmedia approach proves increasingly current and interesting because it integrates all content-production concepts into a single methodology and process of creation and distribution. Today, and increasingly, people, the public at large, are potential producers of content in the new media, whether through a simple camera, a mobile phone, a PC or even the most sophisticated tablet. In this new dynamic, the market (the combination of audience, media, competitors and other agents) presents new communication platforms that, owing to their openness and broad access, bring with them a great loss of control over what is published and over the originally planned context itself, since spontaneous media and the public's own interpretation create new paths for the project's main story. The market's need to build closer relationships with its customers or publics (each market segment is considered a different public) is undergoing a great transformation which, if implemented impulsively and without preparation regarding target-audience analysis, distribution formats and the message sent, can lead to lost opportunities and failed communication proposals. The focus of transmedia projects is to use more complete and comprehensive methodologies and processes, from the standpoint of content creation, technology, marketing and other areas, using the best media platforms for the project's success. In brief, the transmedia study group, #EraTransmídia, will present its concepts with the aim of fostering multi-platform social engagement for positive results.

  19. Understanding Research Strategies to Improve ERA Performance in Australian Universities: Circumventing Secrecy to Achieve Success

    Science.gov (United States)

    Diezmann, Carmel M.

    2018-01-01

    Many Australian universities have prioritised improving discipline performance on the national research assessment--Excellence for Research in Australia. However, a "culture of secrecy" pervades "Excellence in Research for Australia" (ERA). There are no specified criteria for the assignment of ratings on a 5-point scale ranging…

  20. Large-scale regions of antimatter

    International Nuclear Information System (INIS)

    Grobov, A. V.; Rubin, S. G.

    2015-01-01

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era

  1. Large-scale regions of antimatter

    Energy Technology Data Exchange (ETDEWEB)

    Grobov, A. V., E-mail: alexey.grobov@gmail.com; Rubin, S. G., E-mail: sgrubin@mephi.ru [National Research Nuclear University MEPhI (Russian Federation)

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  2. Extremal surface barriers

    International Nuclear Information System (INIS)

    Engelhardt, Netta; Wall, Aron C.

    2014-01-01

    We present a generic condition for Lorentzian manifolds to have a barrier that limits the reach of boundary-anchored extremal surfaces of arbitrary dimension. We show that any surface with nonpositive extrinsic curvature is a barrier, in the sense that extremal surfaces cannot be continuously deformed past it. Furthermore, the outermost barrier surface has nonnegative extrinsic curvature. Under certain conditions, we show that the existence of trapped surfaces implies a barrier, and conversely. In the context of AdS/CFT, these barriers imply that it is impossible to reconstruct the entire bulk using extremal surfaces. We comment on the implications for the firewall controversy

  3. Statistics of Extremes

    KAUST Repository

    Davison, Anthony C.; Huser, Raphaël

    2015-01-01

    Statistics of extremes concerns inference for rare events. Often the events have never yet been observed, and their probabilities must therefore be estimated by extrapolation of tail models fitted to available data. Because data concerning the event
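The tail-extrapolation idea sketched in this abstract, estimating the probability of a level at or beyond the edge of the observed data, can be illustrated with a peaks-over-threshold calculation. For simplicity a shape-zero (exponential) generalized Pareto tail is assumed for the exceedances; a real analysis would also estimate the shape parameter.

```python
import numpy as np

rng = np.random.default_rng(7)
data = rng.exponential(scale=10.0, size=5000)   # synthetic daily losses

u = np.quantile(data, 0.95)                     # high threshold
excess = data[data > u] - u                     # peaks over the threshold
sigma = excess.mean()                           # MLE scale of exponential tail

# P(X > x) = P(X > u) * P(excess > x - u), extrapolated past the sample
x = 120.0                                       # level near/beyond the maximum
p_exceed = (data > u).mean() * np.exp(-(x - u) / sigma)
print(f"estimated P(X > {x:.0f}) = {p_exceed:.1e}")
```

The key point is that the fitted tail model supplies probabilities for levels the data alone cannot: the empirical exceedance frequency at such a level is zero or nearly so.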

  4. Analysis of extreme events

    CSIR Research Space (South Africa)

    Khuluse, S

    2009-04-01

    Full Text Available ) determination of the distribution of the damage and (iii) preparation of products that enable prediction of future risk events. The methodology provided by extreme value theory can also be a powerful tool in risk analysis...

  5. Acute lower extremity ischaemia

    African Journals Online (AJOL)

    Acute lower extremity ischaemia. Acute lower limb ischaemia is a surgical emergency. ... is ~1.5 cases per 10 000 persons per year. Acute ischaemia ... Table 2. Clinical features discriminating embolic from thrombotic ALEXI. Clinical features.

  6. Left-Wing Extremism: The Current Threat

    Energy Technology Data Exchange (ETDEWEB)

    Karl A. Seger

    2001-04-30

    Left-wing extremism is "alive and well" both in the U.S. and internationally. Although the current domestic terrorist threat within the U.S. is focused on right-wing extremists, left-wing extremists are also active and have several objectives. Leftist extremists also pose an espionage threat to U.S. interests. While the threat to the U.S. government from leftist extremists has decreased in the past decade, it has not disappeared. There are individuals and organizations within the U.S. who maintain the same ideology that resulted in the growth of left-wing terrorism in this country in the 1970s and 1980s. Some of the leaders from that era are still communicating from Cuba with their followers in the U.S., and new leaders and groups are emerging.

  7. Regional tendencies of extreme wind characteristics in Hungary

    Science.gov (United States)

    Radics, Dr.; Bartholy, Dr.; Péliné

    2009-09-01

    Human activities have substantial effects on the climate system. It is already accepted that change in the long-term climatic mean state will have significant consequences for the global economy and society, but the most important effects of climate change may come from changes in the intensity and frequency of climatic extremes. It is therefore of great interest to document the extremes of surface wind, which could assist in estimating the regional effects of climate change. The research presented is based on 34-year-long (1975-2008) wind (speed, direction, and wind gust) data sets of 36 Hungarian synoptic meteorological stations. After processing the measured wind data sets (including digitalisation of old instrumental records, quality control and homogenisation of wind time series), time series and complex wind climate analyses were carried out. Spatial and temporal distributions of mean and extreme wind climate characteristics were estimated, and wind extremes and trends were interpolated and mapped over the country. Finally, measured and reanalysed (ERA40) wind data were compared over Hungary, in order to verify not only the validity of the ERA40 reanalysed data sets but also the adaptability of climate simulation results for estimating regional climate change effects.

  8. Communicating mathematics in the digital era

    CERN Document Server

    Borwein, Jonathan; Rodrigues, Jose Francisco

    2008-01-01

    The digital era has dramatically changed the ways that researchers search, produce, publish, and disseminate their scientific work. These processes are still rapidly evolving due to improvements in information science, new achievements in computer science technologies, and initiatives such as DML and open access journals, digitization projects, scientific reference catalogs, and digital repositories. These changes have prompted many mathematicians to play an active part in the developments of the digital era, and have led mathematicians to promote and discuss new ideas with colleagues from other fields, such as technology developers and publishers. This book is a collection of contributions by key leaders in the field, offering the paradigms and mechanisms for producing, searching, and exploiting scientific and technical scholarship in mathematics in the digital era.

  9. Achieving Transformational Materials Performance in a New Era of Science

    International Nuclear Information System (INIS)

    Sarrao, John

    2009-01-01

    The inability of current materials to meet performance requirements is a key stumbling block for addressing grand challenges in energy and national security. Fortunately, materials research is on the brink of a new era - a transition from observation and validation of materials properties to prediction and control of materials performance. In this talk, I describe the nature of the current challenge, the prospects for success, and a specific facility concept, MaRIE, that will provide the needed capabilities to meet these challenges, especially for materials in extreme environments. MaRIE, for Matter-Radiation Interactions in Extremes, is Los Alamos' concept to realize this vision of 21st century materials research. This vision will be realized through enhancements to the current LANSCE accelerator, development of a fourth-generation x-ray light source co-located with the proton accelerator, and a comprehensive synthesis and characterization facility focused on controlling complex materials and the defect/structure link to materials performance.

  10. Extreme Weather and Climate: Workshop Report

    Science.gov (United States)

    Sobel, Adam; Camargo, Suzana; Debucquoy, Wim; Deodatis, George; Gerrard, Michael; Hall, Timothy; Hallman, Robert; Keenan, Jesse; Lall, Upmanu; Levy, Marc; hide

    2016-01-01

Extreme events are the aspects of climate to which human society is most sensitive. Due to both their severity and their rarity, extreme events can challenge the capacity of physical, social, economic and political infrastructures, turning natural events into human disasters. Yet, because they are low frequency events, the science of extreme events is very challenging. Among the challenges is the difficulty of connecting extreme events to longer-term, large-scale variability and trends in the climate system, including anthropogenic climate change. How can we best quantify the risks posed by extreme weather events, both in the current climate and in the warmer and different climates to come? How can we better predict them? What can we do to reduce the harm done by such events? In response to these questions, the Initiative on Extreme Weather and Climate has been created at Columbia University in New York City (extremeweather.columbia.edu). This Initiative is a University-wide activity focused on understanding the risks to human life, property, infrastructure, communities, institutions, ecosystems, and landscapes from extreme weather events, both in the present and future climates, and on developing solutions to mitigate those risks. In May 2015, the Initiative held its first science workshop, entitled Extreme Weather and Climate: Hazards, Impacts, Actions. The purpose of the workshop was to define the scope of the Initiative; the tremendously broad intellectual footprint of the topic is indicated by the titles of the presentations (see Table 1). The intent of the workshop was to stimulate thought across disciplinary lines by juxtaposing talks whose subjects differed dramatically. Each session concluded with question and answer panel sessions. Approximately 150 people were in attendance throughout the day. Below is a brief synopsis of each presentation. The synopses collectively reflect the variety and richness of the emerging extreme event research agenda.

  11. Chinese librarianship in the digital era

    CERN Document Server

    Fang, Conghui

    2013-01-01

The library in China has been transformed by rapid socioeconomic development and the proliferation of the Internet. The issues faced by Chinese libraries and librarians are those faced by library practitioners more globally; however, China also has its own unique set of issues in the digital era, including a developmental imbalance between East and West, urban and rural areas, and the availability of skilled practitioners. Chinese Librarianship in the Digital Era is the first book on Chinese libraries responding to these issues, and more. The first part of the book places discussion in historical con

  12. Extreme Programming: Maestro Style

    Science.gov (United States)

    Norris, Jeffrey; Fox, Jason; Rabe, Kenneth; Shu, I-Hsiang; Powell, Mark

    2009-01-01

    "Extreme Programming: Maestro Style" is the name of a computer programming methodology that has evolved as a custom version of a methodology, called extreme programming that has been practiced in the software industry since the late 1990s. The name of this version reflects its origin in the work of the Maestro team at NASA's Jet Propulsion Laboratory that develops software for Mars exploration missions. Extreme programming is oriented toward agile development of software resting on values of simplicity, communication, testing, and aggressiveness. Extreme programming involves use of methods of rapidly building and disseminating institutional knowledge among members of a computer-programming team to give all the members a shared view that matches the view of the customers for whom the software system is to be developed. Extreme programming includes frequent planning by programmers in collaboration with customers, continually examining and rewriting code in striving for the simplest workable software designs, a system metaphor (basically, an abstraction of the system that provides easy-to-remember software-naming conventions and insight into the architecture of the system), programmers working in pairs, adherence to a set of coding standards, collaboration of customers and programmers, frequent verbal communication, frequent releases of software in small increments of development, repeated testing of the developmental software by both programmers and customers, and continuous interaction between the team and the customers. The environment in which the Maestro team works requires the team to quickly adapt to changing needs of its customers. In addition, the team cannot afford to accept unnecessary development risk. Extreme programming enables the Maestro team to remain agile and provide high-quality software and service to its customers. However, several factors in the Maestro environment have made it necessary to modify some of the conventional extreme

  13. Extreme meteorological conditions

    International Nuclear Information System (INIS)

    Altinger de Schwarzkopf, M.L.

    1983-01-01

Different meteorological variables may reach significant extreme values, such as the wind speed and, in particular, its occurrence through tornadoes and hurricanes, which must necessarily be taken into account when siting nuclear power plants. For this kind of study, it is necessary to determine the basic design phenomenon. Two criteria are applied to define the basic design values for extreme meteorological variables. The first determines the expected extreme value: it is obtained by analyzing the recurrence of the phenomenon over a convened period of time, generally 50 years. The second determines the extreme value of low probability, taking into account the nuclear power plant's operating life, e.g. 25 years, and considering, during said lapse, the occurrence probabilities of extreme meteorological phenomena. The values may be determined either by the deterministic method, which is based on knowledge of the fundamental physical characteristics of the phenomena, or by the probabilistic method, which relies on the analysis of historical statistical data. Brief comments are made on the subject in relation to the Argentine Republic area. (R.J.S.) [es
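The first criterion above, an expected extreme estimated from the recurrence of the phenomenon over a convened period such as 50 years, corresponds to a classical return-level calculation. A minimal sketch, assuming a Gumbel (extreme value type I) model and synthetic annual-maximum wind speeds (all numbers are illustrative, not from the source):

```python
import numpy as np
from scipy.stats import gumbel_r

# Synthetic annual-maximum wind speeds (m/s); illustrative only.
rng = np.random.default_rng(0)
annual_maxima = rng.gumbel(25.0, 4.0, 60)

# Fit a Gumbel distribution to the annual maxima.
loc, scale = gumbel_r.fit(annual_maxima)

# 50-year return level: the value exceeded with probability 1/50 in any year.
T = 50
return_level = gumbel_r.ppf(1 - 1 / T, loc=loc, scale=scale)
print(f"Estimated 50-year wind speed: {return_level:.1f} m/s")
```

The same fit also supports the second criterion: the probability of at least one exceedance of the T-year value during a 25-year operating life is 1 - (1 - 1/T)**25.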

  14. Extreme Networks' 10-Gigabit Ethernet enables

    CERN Multimedia

    2002-01-01

    " Extreme Networks, Inc.'s 10-Gigabit switching platform enabled researchers to transfer one Terabyte of information from Vancouver to Geneva across a single network hop, the world's first large-scale, end-to-end transfer of its kind" (1/2 page).

  15. Family Health in an Era of Stress.

    Science.gov (United States)

    USA Today, 1979

    1979-01-01

    Summarizes major findings of a national survey, "The General Mills American Family Report 1978/79: Family Health in an Era of Stress," conducted by Yankelovich, Skelly and White. Topics covered include attitudes toward medical costs, mental illness, and good health practices, as well as expressed interest in health information. (SJL)

  16. Biotechnology: An Era of Hopes and Fears

    Science.gov (United States)

    2016-01-01

Strategic Studies Quarterly, Fall 2016. LTC Douglas R. Lewis, PhD, US Army. Abstract: Biotechnology ... ignored. The idea of advances in biotechnology increasing the biological weapons threat is not new. In 2003 an analysis of gene sequencing and

  17. Time Management in the Digital Era

    Science.gov (United States)

    Wodarz, Nan

    2013-01-01

    School business officials can strike a balance between setting a long-term strategy and responding to short-term situations by implementing time management strategies. This article presents tips for time management that could help boost productivity and save time in this digital era. Tips include decreasing meeting times via Skype or…

  18. Aplikasi Citizen Journalism di Era Konvergensi Media

    Directory of Open Access Journals (Sweden)

    Rahmat Edi Irawan

    2014-10-01

Full Text Available Citizen journalism has now become one of the most developed television program concepts. While the concept was initially more widely used in radio and online media, with coverage and image-delivery technology now easier and cheaper, it is a concept that gives people a place to become amateur journalists and can also be easily applied in the medium of television. This research raised the issue of how citizen journalism is conceived and implemented on television in the era of media convergence. The purpose of this study is to explain the concepts and demonstrate the implementation of citizen journalism on television in the era of media convergence. The research used a qualitative method in which data were obtained through a literature study. Results of the study showed that the implementation of citizen journalism on television is increasingly facilitated by television's entry into the era of media convergence, in which different media mingle, such as television with print, radio, and Internet media. The era of media convergence allows the concept of citizen journalism to develop further, because the platforms and distribution media available to amateur journalists are also increasingly varied. However, the system equipment that must be provided, the human resources that must be available, and the large capital required mean that only a few television stations open many platforms to provide space for amateur journalists in citizen journalism.

  19. Rethinking Education in an Era of Globalisation

    Science.gov (United States)

    Wrigley, Terry

    2007-01-01

    This article reflects on the historic tensions of education under capitalism, arguing that they have been exacerbated in our era of neo-liberal globalisation. Government drives for greater "accountability" and "effectiveness" are a blinkered response to the threefold global crisis we face: poverty and debt; a collapse of the…

  20. Scandinavian neuroscience during the Nazi era

    DEFF Research Database (Denmark)

    Kondziella, Daniel; Hansen, Klaus; Zeidman, Lawrence A

    2013-01-01

    Although Scandinavian neuroscience has a proud history, its status during the Nazi era has been overlooked. In fact, prominent neuroscientists in German-occupied Denmark and Norway, as well as in neutral Sweden, were directly affected. Mogens Fog, Poul Thygesen (Denmark) and Haakon Sæthre (Norway...

  1. Family Therapy in the Postmodern Era.

    Science.gov (United States)

    Mills, Steven D.; Sprenkle, Douglas H.

    1995-01-01

    Discusses theoretical and clinical developments that have accompanied family therapy's entry into the postmodern era. Clinical trends, including use of reflecting teams, self-of-the-therapist issues, increased therapist self-disclosure, and postmodern supervision are examined. Feminist critiques, health-care reform, and increasing collaboration…

  2. Faculty Recruitment in an Era of Change

    Science.gov (United States)

    Levine, Marilyn; Schimpf, Martin

    2010-01-01

    Faculty recruitment is a challenge for administration and departments, especially in an era of change in the academy. This article builds on information from an interactive conference panel session that focused on faculty recruitment best practices. The article addresses faculty recruitment strategies that focus on the optimization of search…

  3. Multiscale computing in the exascale era

    NARCIS (Netherlands)

    Alowayyed, S.; Groen, D.; Coveney, P.V.; Hoekstra, A.G.

    We expect that multiscale simulations will be one of the main high performance computing workloads in the exascale era. We propose multiscale computing patterns as a generic vehicle to realise load balanced, fault tolerant and energy aware high performance multiscale computing. Multiscale computing

  4. Acclimatization to extreme heat

    Science.gov (United States)

    Warner, M. E.; Ganguly, A. R.; Bhatia, U.

    2017-12-01

Heat extremes throughout the globe, as well as in the United States, are expected to increase. These heat extremes have been shown to impact human health, resulting in some of the highest death tolls of any comparable natural disaster. But in order to inform decision makers and best understand future mortality and morbidity, adaptation and mitigation must be considered. Defined as the ability of individuals or society to change behavior and/or adapt physiologically, acclimatization encompasses the gradual adaptation that occurs over time. Therefore, this research aims to account for acclimatization to extreme heat by using a hybrid methodology that incorporates future air conditioning use and installation patterns with future temperature-related time series data. While previous studies have not accounted for energy usage patterns and market saturation scenarios, we integrate such factors to compare the impact of air conditioning as a tool for acclimatization, with a particular emphasis on mortality within vulnerable communities.

  5. Extremely deformable structures

    CERN Document Server

    2015-01-01

    Recently, a new research stimulus has derived from the observation that soft structures, such as biological systems, but also rubber and gel, may work in a post critical regime, where elastic elements are subject to extreme deformations, though still exhibiting excellent mechanical performances. This is the realm of ‘extreme mechanics’, to which this book is addressed. The possibility of exploiting highly deformable structures opens new and unexpected technological possibilities. In particular, the challenge is the design of deformable and bi-stable mechanisms which can reach superior mechanical performances and can have a strong impact on several high-tech applications, including stretchable electronics, nanotube serpentines, deployable structures for aerospace engineering, cable deployment in the ocean, but also sensors and flexible actuators and vibration absorbers. Readers are introduced to a variety of interrelated topics involving the mechanics of extremely deformable structures, with emphasis on ...

  6. Statistics of Extremes

    KAUST Repository

    Davison, Anthony C.

    2015-04-10

    Statistics of extremes concerns inference for rare events. Often the events have never yet been observed, and their probabilities must therefore be estimated by extrapolation of tail models fitted to available data. Because data concerning the event of interest may be very limited, efficient methods of inference play an important role. This article reviews this domain, emphasizing current research topics. We first sketch the classical theory of extremes for maxima and threshold exceedances of stationary series. We then review multivariate theory, distinguishing asymptotic independence and dependence models, followed by a description of models for spatial and spatiotemporal extreme events. Finally, we discuss inference and describe two applications. Animations illustrate some of the main ideas. © 2015 by Annual Reviews. All rights reserved.
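The threshold-exceedance models sketched in this review are commonly fitted as a generalized Pareto distribution (GPD) to excesses over a high threshold, then extrapolated into the unobserved tail. A minimal sketch on synthetic heavy-tailed data (the threshold choice and all values are illustrative):

```python
import numpy as np
from scipy.stats import genpareto

# Synthetic heavy-tailed observations (illustrative only).
rng = np.random.default_rng(1)
data = rng.pareto(3.0, size=10_000) + 1.0

# Choose a high threshold and collect the excesses over it.
u = np.quantile(data, 0.95)
excesses = data[data > u] - u

# Fit the GPD to the excesses (shape xi, scale sigma; location fixed at 0).
xi, _, sigma = genpareto.fit(excesses, floc=0.0)

# Extrapolate: probability of exceeding a level far beyond most of the data.
p_exceed_u = (data > u).mean()
level = u + 5.0
p = p_exceed_u * genpareto.sf(level - u, xi, loc=0.0, scale=sigma)
print(f"Estimated P(X > {level:.2f}) ~ {p:.2e}")
```

This two-step structure (empirical rate of crossing the threshold, times the fitted GPD tail beyond it) is what allows inference about events rarer than any yet observed.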

  7. Sea surface temperature variability in the North Western Mediterranean Sea (Gulf of Lion) during the Common Era

    Science.gov (United States)

    Sicre, Marie-Alexandrine; Jalali, Bassem; Martrat, Belen; Schmidt, Sabine; Bassetti, Maria-Angela; Kallel, Nejib

    2016-12-01

    This study investigates the multidecadal-scale variability of sea surface temperatures (SSTs) in the convection region of the Gulf of Lion (NW Mediterranean Sea) over the full past 2000 yr (Common Era) using alkenone biomarkers. Our data show colder SSTs by 1.7 °C over most of the first millennium (200-800 AD) and by 1.3 °C during the Little Ice Age (LIA; 1400-1850 AD) than the 20th century mean (17.9 °C). Although on average warmer, those of the Medieval Climate Anomaly (MCA) (1000-1200 AD) were lower by 1 °C. We found a mean SST warming of 2 °C/100 yr over the last century in close agreement with the 0.22 and 0.26 °C/decade values calculated for the western Mediterranean Sea from in situ and satellite data, respectively. Our results also reveal strongly fluctuating SSTs characterized by cold extremes followed by abrupt warming during the LIA. We suggest that the coldest decades of the LIA were likely caused by prevailing negative EA states and associated anticyclone blocking over the North Atlantic resulting in cold continental northeasterly winds to blow over Western Europe and the Mediterranean region.
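The warming rates quoted above (°C per decade, °C per century) are linear trends of the kind routinely estimated by least squares. A minimal sketch on a synthetic temperature series (not the alkenone SST record; the 0.02 °C/yr slope and noise level are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1900, 2001)
# Synthetic SSTs: a 0.02 °C/yr trend plus noise (illustrative only).
sst = 17.0 + 0.02 * (years - years[0]) + rng.normal(0.0, 0.15, years.size)

# Least-squares slope in °C per year, reported per decade and per century.
slope_per_year = np.polyfit(years, sst, 1)[0]
print(f"Trend: {slope_per_year * 10:.2f} °C/decade "
      f"({slope_per_year * 100:.1f} °C/century)")
```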

  8. Adventure and Extreme Sports.

    Science.gov (United States)

    Gomez, Andrew Thomas; Rao, Ashwin

    2016-03-01

    Adventure and extreme sports often involve unpredictable and inhospitable environments, high velocities, and stunts. These activities vary widely and include sports like BASE jumping, snowboarding, kayaking, and surfing. Increasing interest and participation in adventure and extreme sports warrants understanding by clinicians to facilitate prevention, identification, and treatment of injuries unique to each sport. This article covers alpine skiing and snowboarding, skateboarding, surfing, bungee jumping, BASE jumping, and whitewater sports with emphasis on epidemiology, demographics, general injury mechanisms, specific injuries, chronic injuries, fatality data, and prevention. Overall, most injuries are related to overuse, trauma, and environmental or microbial exposure. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Extremal graph theory

    CERN Document Server

    Bollobas, Bela

    2004-01-01

    The ever-expanding field of extremal graph theory encompasses a diverse array of problem-solving methods, including applications to economics, computer science, and optimization theory. This volume, based on a series of lectures delivered to graduate students at the University of Cambridge, presents a concise yet comprehensive treatment of extremal graph theory.Unlike most graph theory treatises, this text features complete proofs for almost all of its results. Further insights into theory are provided by the numerous exercises of varying degrees of difficulty that accompany each chapter. A

  10. Packagings in the silicon era

    International Nuclear Information System (INIS)

    Beone, G.; Mione, A.; Orsini, A.; Forasassi, G.

    1993-01-01

ENEA is studying, with the collaboration of the DCMN of the Pisa University, a new packaging to collect wastes in various facilities while a final disposal site is being identified. Following a survey of the wastes that could be transported in the future, it was agreed to design a packaging able to contain an industrial drum, with a maximum capacity of 220 litres and a total weight of less than 4000 N, previously filled with solid wastes in bulk or in a solid binding material. The packaging, to be approved as a Type B in agreement with the IAEA Regulations, will be useful for transporting not only radioactive wastes but any kind of dangerous goods, and also those not in agreement with the UNO Regulations. The 1/2 scale model of the packaging is formed by two concentric vessels of mild steel obtained by welding commercial shells to cylindrical walls and joined through a flange. The new packaging under development presents features that seem well suited to its envisaged main use of waste collection, such as construction simplicity, relatively low cost, durability in time and use, and low maintenance requirements. The design analysis and testing program ongoing at present allowed a preliminary definition of the packaging geometry and confirmed the need for further investigation in some key areas, such as determining the actual behaviour of the silicon foam, used as an energy-absorbing/thermal-insulating material, under the specific conditions of interest. (J.P.N.)

  11. Emotion, reflexivity and social change in the era of extreme fossil fuels.

    Science.gov (United States)

    Davidson, Debra J

    2018-05-09

    Reflexivity is an important sociological lens through which to examine the means by which people engage in actions that contribute to social reproduction or social elaboration. Reflexivity theorists have largely overlooked the central place of emotions in reflexive processing, however, thus missing opportunities to enhance our understanding of reflexivity by capitalizing on recent scholarship on emotions emanating from other fields of inquiry. This paper explores the role of emotion in reflexivity, with a qualitative analysis of social responses to hydraulic fracturing in Alberta, Canada, utilizing narrative analysis of long-form interviews with rural landowners who have experienced direct impacts from hydraulic fracturing, and have attempted to voice their concerns in the public sphere. Based on interviews with a selection of two interview participants, the paper highlights the means by which emotions shape reflexivity in consequential ways, beginning with personal and highly individualized emotional responses to contingent situations, which then factor into the social interactions engaged in the pursuit of personal projects. The shared emotional context that emerges then plays a substantial role in shaping outcomes and their implications for social stasis or change. This study exemplifies the extent to which reflexive processing in response to breaches in the social order can be emotionally tumultuous affairs, constituting a significant personal toll that many may be unwilling to pay. © London School of Economics and Political Science 2018.

  12. MAD about the Large Magellanic Cloud Preparing for the era of Extremely Large Telescopes

    NARCIS (Netherlands)

    Fiorentino, G.; Tolstoy, E.; Diolaiti, E.; Valenti, E.; Cignoni, M.; Mackey, A. D.

    We present J, H, K-s photometry from the the Multi conjugate Adaptive optics Demonstrator (MAD), a visitor instrument at the VLT, of a resolved stellar population in a small crowded field in the bar of the Large Magellanic Cloud near the globular cluster NGC 1928. In a total exposure time of 6, 36

  13. Stellar extreme ultraviolet astronomy

    International Nuclear Information System (INIS)

    Cash, W.C. Jr.

    1978-01-01

The design, calibration, and launch of a rocket-borne imaging telescope for extreme ultraviolet astronomy are described. The telescope, which employed diamond-turned grazing-incidence optics and a ranicon detector, was launched November 19, 1976, from the White Sands Missile Range. The telescope performed well and returned data on several potential stellar sources of extreme ultraviolet radiation. Upper limits ten to twenty times more sensitive than previously available were obtained for the extreme ultraviolet flux from the white dwarf Sirius B. These limits fall a factor of seven below the flux predicted for the star and demonstrate that the temperature of Sirius B is not 32,000 K as previously measured, but is below 30,000 K. The new upper limits also rule out the photosphere of the white dwarf as the source of the recently reported soft x-rays from Sirius. Two other white dwarf stars, Feige 24 and G191-B2B, were observed. Upper limits on the flux at 300 Å were interpreted as lower limits on the interstellar hydrogen column densities to these stars. The lower limits indicate interstellar hydrogen densities of greater than .02 cm⁻³. Four nearby stars (Sirius, Procyon, Capella, and Mirzam) were observed in a search for intense low-temperature coronae or extended chromospheres. No extreme ultraviolet radiation from these stars was detected, and upper limits to their coronal emission measures are derived

  14. Extremity x-ray

    Science.gov (United States)

MedlinePlus encyclopedia page (//medlineplus.gov/ency/article/003461.htm): Extremity x-ray. Risks: There is low-level radiation exposure. X-rays are monitored and regulated to provide the ...

  15. Extremity perfusion for sarcoma

    NARCIS (Netherlands)

    Hoekstra, Harald Joan

    2008-01-01

    For more than 50 years, the technique of extremity perfusion has been explored in the limb salvage treatment of local, recurrent, and multifocal sarcomas. The "discovery" of tumor necrosis factor-or. in combination with melphalan was a real breakthrough in the treatment of primarily irresectable

  16. Statistics of Local Extremes

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Bierbooms, W.; Hansen, Kurt Schaldemose

    2003-01-01

A theoretical expression for the probability density function associated with local extremes of a stochastic process is presented. The expression is basically based on the lower four statistical moments and a bandwidth parameter. The theoretical expression is subsequently verified by comparison with simulated...

  17. Large-scale simulation of ductile fracture process of microstructured materials

    International Nuclear Information System (INIS)

    Tian Rong; Wang Chaowei

    2011-01-01

The promise of computational science in the extreme-scale computing era is to reduce and decompose macroscopic complexities into microscopic simplicities at the expense of high spatial and temporal computing resolution. In materials science and engineering, the direct combination of 3D microstructure data sets and 3D large-scale simulations provides a unique opportunity for developing a comprehensive understanding of nano/microstructure-property relationships in order to systematically design materials with specific desired properties. In this paper, we present a framework for simulating the ductile fracture process zone in microstructural detail. The experimentally reconstructed microstructural data set is directly embedded into a FE mesh model to improve the simulation fidelity of microstructure effects on fracture toughness. To the best of our knowledge, this is the first time that fracture toughness has been directly linked to multiscale microstructures in a realistic 3D numerical model. (author)

  18. Electra en Piñera

    Directory of Open Access Journals (Sweden)

    Elina Miranda Cancela

    1991-12-01

Full Text Available This article is an analysis of Electra Garrigó by the Cuban playwright Virgilio Piñera and of its connections with Greek tragic theatre, above all with the Electra of Sophocles, together with comparisons to modern authors who have treated the same theme. Despite its Greek inspiration, Piñera remains a typically national playwright, marked by the moments of great social tension of his era (the tragedy in question dates from 1941). The conflict produced by the excessive authority of parents over their children, latent in this myth, interests him for its significance within the Cuban family.

  19. Menertawakan Fobia Komunis di Era Reproduksi Digital

    Directory of Open Access Journals (Sweden)

    Triyono Lukmantoro

    2017-04-01

Full Text Available Abstract. In May-June 2016 the issue of the rise of the Indonesian Communist Party (PKI) and the latent danger of communism appeared again. Excessive fear of the PKI and communism continues to be propagated. That is what is referred to as communist phobia. But the issue is considered so sensitive that it gave birth to criticism. The phenomenon is the appearance of a number of meme comics whose contents laugh at the hammer-and-sickle symbol and three iconic communist figures, namely D.N. Aidit, Tan Malaka, and Mao Zedong. Meme comics containing parody show incongruities that could only happen in the era of digital reproduction. The idea of meme comics can be traced to Walter Benjamin's thought on works of art in the age of mechanical reproduction. In that era, aura was declining. The crisis and disappearance of aura occur all the more in the age of digital reproduction.

  20. Flavour physics in the LHC era

    CERN Document Server

    Gershon, Tim

    2014-01-01

These lectures give a topical review of heavy flavour physics, in particular CP violation and rare decays, from an experimental point of view. They describe the ongoing motivation to study heavy flavour physics in the LHC era, the current status of the field emphasising key results from previous experiments, some selected topics in which new results are expected in the near future, and a brief look at future projects.

  1. The Molecular Era of Surfactant Biology

    OpenAIRE

    Whitsett, Jeffrey A.

    2014-01-01

    Advances in the physiology, biochemistry, molecular and cell biology of the pulmonary surfactant system transformed the clinical care and outcome of preterm infants with respiratory distress syndrome. The molecular era of surfactant biology provided genetic insights into the pathogenesis of pulmonary disorders, previously termed “idiopathic” that affect newborn infants, children and adults. Knowledge related to the structure and function of the surfactant proteins and their roles in alveolar ...

  2. A New Era for Jefferson Lab

    International Nuclear Information System (INIS)

    McKeown, R. D.; Montgomery, H. E.; Pennington, M. R.

    2016-01-01

    On a cool Saturday morning in late April a seemingly endless stream of cars turned off Jefferson Avenue in Newport News, Virginia, bringing 12,000 people ages 1 to 91 to the Open House to learn more about “the new era in science” at the Thomas Jefferson National Accelerator Facility. Here, the visitors were dazzled by the complex equipment, the enthusiastic staff, and the advanced technology at the Laboratory.

  3. Creative clusters: a new era for SMEs?

    OpenAIRE

    Oxborrow, L

    2012-01-01

Objectives: The paper illustrates how the characteristics of industry clusters are revived in a new era for SME networks. It explores how a succession of industry shocks - increased global competition, recession and reduced policy support - has stimulated an innovative response in creative SMEs. The paper goes on to investigate the clustering experience of a small group of creative entrepreneurs in pursuing networked activities, with a view to identifying lessons that can be learnt to suppor...

  4. Extremes in nature

    CERN Document Server

    Salvadori, Gianfausto; Kottegoda, Nathabandu T

    2007-01-01

    This book is about the theoretical and practical aspects of the statistics of Extreme Events in Nature. Most importantly, this is the first text in which Copulas are introduced and used in Geophysics. Several topics are fully original, and show how standard models and calculations can be improved by exploiting the opportunities offered by Copulas. In addition, new quantities useful for design and risk assessment are introduced.

  5. Scale interactions in economics: application to the evaluation of the economic damages of climatic change and of extreme events; Interactions d'echelles en economie: application a l'evaluation des dommages economiques du changement climatique et des evenements extremes

    Energy Technology Data Exchange (ETDEWEB)

    Hallegatte, S

    2005-06-15

Growth models, which neglect economic disequilibria, considered as temporary, are in general used to evaluate the damages generated by climatic change. This work shows, through a series of modeling experiments, the importance of disequilibria and of the endogenous variability of the economy in the evaluation of damages due to extreme events and climatic change. It demonstrates the impossibility of separating the evaluation of damages from the representation of growth and of economic dynamics: the comfort losses will depend both on the nature and intensity of the impacts and on the dynamics and situation of the economy to which they apply. Thus, the uncertainties about the damages of future climatic changes stem both from scientific uncertainties and from uncertainties about the future organization of our economies. (J.S.)

  6. Rhabdomyosarcoma of the extremity

    International Nuclear Information System (INIS)

    Rao, Bhaskar N

    1997-01-01

    Rhabdomyosarcoma is the most common soft tissue sarcoma, accounting for almost 55%. These tumors arise from unsegmented mesoderm or primitive mesenchyme, which has the capacity to differentiate into muscle. Less than 5% occur in the first year of life. Extremity rhabdomyosarcomas are mainly seen in the adolescent years. The most common histologic subtype is the alveolar variant. Other characteristics of extremity rhabdomyosarcoma include a predilection for lymph node metastasis, a high local failure rate, and a relatively low survival rate. They often present as slow-growing painless masses; however, lesions in the hand and foot often present as painful masses, and imaging studies may show invasion of the bone. Initial diagnostic approaches include needle biopsy, or incisional biopsy for larger lesions. Excisional biopsy is preferably indicated for lesions less than 2.5 cm. Following this, in most instances, therapy is initiated with multi-agent chemotherapy; depending upon response, the next modality may be either surgery with intent to cure or radiation therapy. Amputation of an extremity for local control is not considered in most instances. Prognostic factors determined over the years to be significant by multivariate analysis include age, tumor size, invasiveness, presence of either nodal or distant metastasis, and complete excision whenever feasible, with supplemental radiation therapy for local control.

  7. Rater Reliability of the Hardy Classification for Pituitary Adenomas in the Magnetic Resonance Imaging Era.

    Science.gov (United States)

    Mooney, Michael A; Hardesty, Douglas A; Sheehy, John P; Bird, C Roger; Chapple, Kristina; White, William L; Little, Andrew S

    2017-10-01

    Objectives  The Hardy classification is used to classify pituitary tumors for clinical and research purposes. The scale was developed using lateral skull radiographs and encephalograms, and its reliability has not been evaluated in the magnetic resonance imaging (MRI) era. Design  Fifty preoperative MRI scans of biopsy-proven pituitary adenomas using the sellar invasion and suprasellar extension components of the Hardy scale were reviewed. Setting  This study was a cohort study set at a single institution. Participants  There were six independent raters. Main Outcome Measures  The main outcome measures of this study were interrater reliability, intrarater reliability, and percent agreement. Results  Overall interrater reliability of both Hardy subscales on MRI was strong. However, reliability of the intermediate scores was weak, and percent agreement among raters was poor (12-16%) using the full scales. Dichotomizing the scale into clinically useful groups maintained strong interrater reliability for the sellar invasion scale and increased the percent agreement for both scales. Conclusion  This study raises important questions about the reliability of the original Hardy classification. Editing the measure to a clinically relevant dichotomous scale simplifies the rating process and may be useful for preoperative tumor characterization in the MRI era. Future research studies should use the dichotomized Hardy scale (sellar invasion Grades 0-III versus Grade IV, suprasellar extension Types 0-C versus Type D).

  8. An investigation into the nutritional status of patients receiving an Enhanced Recovery After Surgery (ERAS) protocol versus standard care following Oesophagectomy.

    Science.gov (United States)

    Benton, Katie; Thomson, Iain; Isenring, Elisabeth; Mark Smithers, B; Agarwal, Ekta

    2018-06-01

    Enhanced Recovery After Surgery (ERAS) protocols have been effectively extended to various surgical specialities, including oesophagectomy. Despite nutrition being a key component, actual nutrition outcomes and specific guidelines are lacking. This cohort comparison study aims to compare nutritional status and adherence during implementation of a standardised post-operative nutritional support protocol, as part of ERAS, against usual care. Two groups of patients undergoing resection of oesophageal cancer were studied. Group 1 (n = 17) underwent oesophagectomy between Oct 2014 and Nov 2016, during implementation of an ERAS protocol. Patients in group 2 (n = 16) underwent oesophagectomy between Jan 2011 and Dec 2012, prior to the implementation of ERAS. Demographic, nutritional status, dietary intake and adherence data were collected. Ordinal data were analysed using independent t-tests, and categorical data using chi-square tests. There was no significant difference in nutritional status, dietary intake or length of stay following implementation of an ERAS protocol. Malnutrition remained prevalent in both groups at day 42 post surgery (n = 10, 83% usual care; and n = 9, 60% ERAS). A significant difference was demonstrated in adherence, with earlier initiation of oral free fluids (p …). A standardised post-operative nutrition protocol, within an ERAS framework, results in earlier transition to oral intake; however, malnutrition remains prevalent post surgery. Further large-scale studies are warranted to examine individualised decision-making regarding nutrition support within an ERAS protocol.

  9. Overview of ERA Integrated Technology Demonstration (ITD) 51A Ultra-High Bypass (UHB) Integration for Hybrid Wing Body (HWB)

    Science.gov (United States)

    Flamm, Jeffrey D.; James, Kevin D.; Bonet, John T.

    2016-01-01

    The NASA Environmentally Responsible Aviation (ERA) Project was a five-year project broken into two phases. In Phase II, high Technology Readiness Level N+2 demonstrations were grouped into Integrated Technology Demonstrations (ITDs). This paper describes the work done on ITD-51A: the Vehicle Systems Integration, Engine Airframe Integration Demonstration. Refinement of a Hybrid Wing Body (HWB) aircraft from the possible candidates developed in ERA Phase I was continued. Scaled powered and unpowered wind-tunnel testing, with and without acoustics, in the NASA LaRC 14- by 22-Foot Subsonic Tunnel, the NASA ARC Unitary Plan Wind Tunnel, and the 40- by 80-foot test section of the National Full-Scale Aerodynamics Complex (NFAC), in conjunction with closely coupled computational fluid dynamics, was used to demonstrate the fuel-burn and acoustic milestone targets of the ERA Project.

  10. A North American Hydroclimate Synthesis (NAHS) of the Common Era

    Science.gov (United States)

    Rodysill, Jessica R.; Anderson, Lesleigh; Cronin, Thomas M.; Jones, Miriam C.; Thompson, Robert S.; Wahl, David B.; Willard, Debra A.; Addison, Jason A.; Alder, Jay R.; Anderson, Katherine H.; Anderson, Lysanna; Barron, John A.; Bernhardt, Christopher E.; Hostetler, Steven W.; Kehrwald, Natalie M.; Khan, Nicole S.; Richey, Julie N.; Starratt, Scott W.; Strickland, Laura E.; Toomey, Michael R.; Treat, Claire C.; Wingard, G. Lynn

    2018-03-01

    This study presents a synthesis of century-scale hydroclimate variations in North America for the Common Era (last 2000 years) using new age models of previously published multiple proxy-based paleoclimate data. This North American Hydroclimate Synthesis (NAHS) examines regional hydroclimate patterns and related environmental indicators, including vegetation, lake water elevation, stream flow and runoff, cave drip rates, biological productivity, assemblages of living organisms, and salinity. Centennial-scale hydroclimate anomalies are obtained by iteratively sampling the proxy data on each of thousands of age model realizations and determining the fractions of possible time series indicating that the century-smoothed data was anomalously wet or dry relative to the 100 BCE to 1900 CE mean. Results suggest regionally asynchronous wet and dry periods over multidecadal to centennial timescales and frequent periods of extended regional drought. Most sites indicate drying during previously documented multicentennial periods of warmer Northern Hemisphere temperatures, particularly in the western U.S., central U.S., and Canada. Two widespread droughts were documented by the NAHS: from 50 BCE to 450 CE and from 800 to 1100 CE. Major hydroclimate reorganizations occurred out of sync with Northern Hemisphere temperature variations and widespread wet and dry anomalies occurred during both warm and cool periods. We present a broad assessment of paleoclimate relationships that highlights the potential influences of internal variability and external forcing and supports a prominent role for Pacific and Atlantic Ocean dynamics on century-scale continental hydroclimate.

  11. Evaluation of NASA's MERRA Precipitation Product in Reproducing the Observed Trend and Distribution of Extreme Precipitation Events in the United States

    Science.gov (United States)

    Ashouri, Hamed; Sorooshian, Soroosh; Hsu, Kuo-Lin; Bosilovich, Michael G.; Lee, Jaechoul; Wehner, Michael F.; Collow, Allison

    2016-01-01

    This study evaluates the performance of NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) precipitation product in reproducing the trend and distribution of extreme precipitation events. Utilizing extreme value theory, time-invariant and time-variant extreme value distributions are developed to model the trends and changes in the patterns of extreme precipitation events over the contiguous United States during 1979-2010. The Climate Prediction Center (CPC) U.S. Unified gridded observation data are used as the observational dataset. The CPC analysis shows that the eastern and western parts of the United States are experiencing positive and negative trends in annual maxima, respectively. The continental-scale patterns of change found in MERRA seem to reasonably mirror the observed patterns of change found in CPC. This was not necessarily expected, given the difficulty of constraining precipitation in reanalysis products. MERRA tends to overestimate the frequency at which the 99th percentile of precipitation is exceeded, because this threshold tends to be lower in MERRA, making it easier to exceed. This feature is dominant during the summer months. MERRA tends to reproduce the spatial patterns of the scale and location parameters of the generalized extreme value and generalized Pareto distributions. However, MERRA underestimates these parameters, particularly over the Gulf Coast states, leading to lower magnitudes of extreme precipitation events. Two issues in MERRA are identified: 1) MERRA shows a spurious negative trend in Nebraska and Kansas, which is most likely related to changes in the satellite observing system over time that have apparently affected the water cycle in the central United States, and 2) the patterns of positive trend over the Gulf Coast states and along the East Coast seem to be correlated with the tropical cyclones in these regions.
The analysis of the trends in the seasonal precipitation extremes indicates that
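
The 99th-percentile exceedance comparison and generalized Pareto fitting described above can be sketched with a standard peaks-over-threshold analysis. This is a generic reconstruction on synthetic data, not the paper's actual pipeline; the gamma stand-in series and the 20-year return period are assumptions for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for a 32-year daily precipitation series (mm/day).
precip = rng.gamma(shape=0.4, scale=8.0, size=32 * 365)

threshold = np.percentile(precip, 99)       # 99th-percentile threshold
excess = precip[precip > threshold] - threshold

# Exceedance frequency: how often the threshold is crossed (~1% by construction).
freq = excess.size / precip.size

# Fit a generalized Pareto distribution to the excesses (location fixed at 0).
shape_, loc_, scale_ = stats.genpareto.fit(excess, floc=0.0)

# 20-year return level: the level exceeded once per 20 years on average.
events_per_year = excess.size / 32
p = 1.0 / (20 * events_per_year)
rl20 = threshold + stats.genpareto.ppf(1 - p, shape_, loc=0.0, scale=scale_)
```

A lower threshold in a reanalysis product, as the abstract notes for MERRA, directly inflates `freq` relative to observations even when the tail shape is similar.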

  12. Multifractal Conceptualisation of Hydro-Meteorological Extremes

    Science.gov (United States)

    Tchiguirinskaia, I.; Schertzer, D.; Lovejoy, S.

    2009-04-01

    Hydrology, and more generally the sciences involved in water resources management and technological or operational developments, face a fundamental difficulty: the extreme variability of hydro-meteorological fields. It clearly appears today that this variability is a function of the observation scale and yields hydro-meteorological hazards. Throughout the world, the development of multifractal theory offers new techniques for handling such non-classical variability over wide ranges of time and space scales. The resulting stochastic simulations, with a very limited number of parameters, reproduce well the long-range dependencies and the clustering of rainfall extremes, often yielding fat-tailed (i.e., algebraic-type) probability distributions. The goal of this work was to investigate the feasibility of using very short or incomplete data records for reliable statistical predictions of the extremes. In particular, we discuss how to evaluate the uncertainty in the empirical or semi-analytical multifractal outcomes. We consider three main aspects of the evaluation: the scaling adequacy, the multifractal parameter estimation error, and the quantile estimation error. We first use the multiplicative cascade model to generate long series of multifractal data. The simulated samples had to cover the range of universal multifractal parameters widely available in the scientific literature for rainfall and river discharges. Using these long multifractal series and their sub-samples, we defined a metric for the parameter estimation error. Then, using the sets of estimated parameters, we obtained the quantile values for a range of exceedance probabilities from 5% to 0.01%. Plotting the error bars on a quantile plot enables an approximation of confidence intervals that would be particularly important for the prediction of multifractal extremes. We finally illustrate the efficiency of this concept through its application to a large database (more than 16000 selected stations over USA and
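
A minimal sketch of the multiplicative cascade construction mentioned above: a discrete binary cascade with mean-one lognormal weights, a toy version of the generator, not the universal-multifractal parameterization used in the study. The weight variance `sigma` is an illustrative assumption.

```python
import numpy as np

def multiplicative_cascade(levels, sigma=0.4, rng=None):
    """Discrete multiplicative cascade on 2**levels cells.

    Each cell's density is a product of `levels` i.i.d. lognormal weights
    with E[W] = 1, a standard toy construction of a multifractal field.
    """
    rng = np.random.default_rng(rng)
    field = np.ones(1)
    for _ in range(levels):
        # Mean-one lognormal weights: W = exp(N(-sigma^2/2, sigma^2)).
        w = rng.lognormal(mean=-0.5 * sigma**2, sigma=sigma, size=2 * field.size)
        field = np.repeat(field, 2) * w   # split each cell in two, modulate
    return field

field = multiplicative_cascade(levels=12, sigma=0.4, rng=1)
```

Because each level multiplies by mean-one weights, the total mass stays near 1 on average while local intensities become increasingly intermittent, which is the fat-tailed, clustered behaviour the abstract describes.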

  13. Precipitation, temperature and wind in Norway: dynamical downscaling of ERA40

    Energy Technology Data Exchange (ETDEWEB)

    Barstad, I.; Sorteberg, A.; Flatoey, F. [Bjerknes Centre for Climate Research, Bergen (Norway); Deque, M. [Meteo France, EAC/GMGEC/CNRM, Toulouse (France)

    2009-11-15

    A novel downscaling approach to the ERA40 (ECMWF 40-year reanalysis) data set has been taken, and results are shown in comparison with observations in Norway. The method applies a nudging technique in a stretched global model, focused on the Norwegian Sea (67 N, 5 W). The effective resolution is three times that of ERA40, equivalent to about 30 km grid spacing in the area of focus. Longer waves follow the ERA40 solution, and thus the large-scale circulation is similar in the two data sets. The shorter waves are free to evolve, and produce high intensities of winds and precipitation. The comparison with observations incorporates numerous station data points for (1) precipitation (357), (2) temperature (98) and (3) wind (10), and for the period 1961-1990 the downscaled data set shows large improvements over ERA40. The daily precipitation shows a considerable reduction in bias (from 50 to 11%), and a twofold reduction at the 99.9th percentile (from -59 to -29%). The daily temperature shows a bias reduction of about one degree in most areas, and a relatively large RMSE reduction (from 7.5 to 5.0 C, except in winter). The wind comparison shows a slight improvement in bias, and significant improvements in RMSE. (orig.)

  14. Extreme Programming Pocket Guide

    CERN Document Server

    Chromatic

    2003-01-01

    Extreme Programming (XP) is a radical new approach to software development that has been adopted quickly because its core practices--the need for constant testing, programming in pairs, inviting customer input, and the communal ownership of code--resonate with developers everywhere. Although many developers feel that XP is rooted in common sense, its vastly different approach can bring challenges, frustrations, and constant demands on your patience. Unless you've got unlimited time (and who does these days?), you can't always stop to thumb through hundreds of pages to find the piece of info

  15. Upper extremity golf injuries.

    Science.gov (United States)

    Cohn, Michael A; Lee, Steven K; Strauss, Eric J

    2013-01-01

    Golf is a global sport enjoyed by an estimated 60 million people around the world. Despite the common misconception that the risk of injury during the play of golf is minimal, golfers are subject to a myriad of potential pathologies. While the majority of injuries in golf are attributable to overuse, acute traumatic injuries can also occur. As the body's direct link to the golf club, the upper extremities are especially prone to injury. A thorough appreciation of the risk factors and patterns of injury will afford accurate diagnosis, treatment, and prevention of further injury.

  16. Pulsar Timing Array Based Search for Supermassive Black Hole Binaries in the Square Kilometer Array Era.

    Science.gov (United States)

    Wang, Yan; Mohanty, Soumya D

    2017-04-14

    The advent of next generation radio telescope facilities, such as the Square Kilometer Array (SKA), will usher in an era where a pulsar timing array (PTA) based search for gravitational waves (GWs) will be able to use hundreds of well timed millisecond pulsars rather than the few dozens in existing PTAs. A realistic assessment of the performance of such an extremely large PTA must take into account the data analysis challenge posed by an exponential increase in the parameter space volume due to the large number of so-called pulsar phase parameters. We address this problem and present such an assessment for isolated supermassive black hole binary (SMBHB) searches using a SKA era PTA containing 10^{3} pulsars. We find that an all-sky search will be able to confidently detect nonevolving sources with a redshifted chirp mass of 10^{10}  M_{⊙} out to a redshift of about 28 (corresponding to a rest-frame chirp mass of 3.4×10^{8}  M_{⊙}). We discuss the important implications that the large distance reach of a SKA era PTA has on GW observations from optically identified SMBHB candidates. If no SMBHB detections occur, a highly unlikely scenario in the light of our results, the sky-averaged upper limit on strain amplitude will be improved by about 3 orders of magnitude over existing limits.

  17. Seasonal Cycle in German Daily Precipitation Extremes

    Directory of Open Access Journals (Sweden)

    Madlen Fischer

    2018-01-01

    The seasonal cycle of extreme precipitation in Germany is investigated by fitting statistical models to monthly maxima of daily precipitation sums for 2,865 rain gauges. The basis is a non-stationary generalized extreme value (GEV) distribution with harmonic variation of the location and scale parameters. The negative log-likelihood serves as the forecast error in a cross-validation to select adequate orders of the harmonic functions for each station. For nearly all gauges considered, the seasonal model is more appropriate for estimating return levels on a monthly scale than a stationary GEV fitted to individual months. The 100-year return levels show the influence of cyclones in the western, and of convective events in the eastern, part of Germany. In addition to resolving the seasonality, we use a simulation study to show that annual return levels can be estimated more precisely from a monthly resolved seasonal model than from a stationary model based on annual maxima.
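
The kind of seasonal GEV fit described above can be sketched with a first-order harmonic in the location parameter only. This is a simplified stand-in for the authors' model (which also varies the scale parameter and selects harmonic orders by cross-validation), run here on synthetic monthly maxima; all parameter values are illustrative.

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(7)
months = np.tile(np.arange(1, 13), 30)                      # 30 years of monthly maxima
true_loc = 20 + 8 * np.cos(2 * np.pi * (months - 7) / 12)   # wet-summer seasonal cycle
maxima = stats.genextreme.rvs(c=-0.1, loc=true_loc, scale=5.0, random_state=rng)

def nll(params):
    """Negative log-likelihood of a GEV whose location varies harmonically."""
    mu0, mu_c, mu_s, log_scale, shape = params
    loc = (mu0 + mu_c * np.cos(2 * np.pi * months / 12)
               + mu_s * np.sin(2 * np.pi * months / 12))
    return -stats.genextreme.logpdf(maxima, c=shape, loc=loc,
                                    scale=np.exp(log_scale)).sum()

res = optimize.minimize(nll, x0=[maxima.mean(), 1.0, 1.0,
                                 np.log(maxima.std()), -0.1],
                        method="Nelder-Mead")
mu0, mu_c, mu_s, log_scale, shape = res.x
amplitude = np.hypot(mu_c, mu_s)   # seasonal amplitude of the location parameter
```

Because the fit borrows strength across all months, the seasonal amplitude is recovered from far fewer data than twelve separate stationary fits would need, which is the point the abstract makes about precision.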

  18. Sensitivity of Rainfall Extremes Under Warming Climate in Urban India

    Science.gov (United States)

    Ali, H.; Mishra, V.

    2017-12-01

    Extreme rainfall events in urban India have halted transportation, damaged infrastructure, and affected human lives, and rainfall extremes are projected to increase under the future climate. We evaluated the relationship (scaling) between rainfall extremes at different temporal resolutions (daily, 3-hourly, and 30-minute), daily dewpoint temperature (DPT), and daily air temperature at 850 hPa (T850) for 23 urban areas in India. Daily rainfall extremes obtained from the Global Surface Summary of the Day (GSOD) data showed positive regression slopes for most of the cities, with a median of 14%/K for the period 1979-2013 against both DPT and T850, which is higher than the Clausius-Clapeyron (C-C) rate (about 7%/K). Moreover, sub-daily rainfall extremes are more sensitive to both DPT and T850. For instance, 3-hourly rainfall extremes obtained from the Tropical Rainfall Measuring Mission (TRMM 3B42 V7) showed regression slopes of more than 16%/K against DPT and T850 for the period 1998-2015. Half-hourly rainfall extremes from the Integrated Multi-satellitE Retrievals for GPM (IMERG) of the Global Precipitation Measurement (GPM) mission also showed higher sensitivity to changes in DPT and T850. The super-C-C scaling of rainfall extremes against changes in DPT and T850 can be attributed to the convective nature of precipitation in India. Our results show that urban India may witness non-stationary rainfall extremes, which, in turn, will affect stormwater designs and the frequency and magnitude of urban flooding.
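
The scaling-rate (%/K) computation described above can be sketched with a standard temperature-binning regression. The data below are synthetic, constructed to scale at roughly the C-C rate, and the binning choices are generic assumptions rather than the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic daily dewpoint (degC) and rainfall whose extremes grow ~7%/K (C-C rate).
dpt = rng.uniform(5, 25, size=20000)
rain = rng.gamma(0.5, 1.0, size=20000) * np.exp(0.07 * dpt)

# Bin days by temperature, take the 99th percentile of rainfall in each bin.
bins = np.arange(5, 26, 2)
mids, p99 = [], []
for lo, hi in zip(bins[:-1], bins[1:]):
    sel = (dpt >= lo) & (dpt < hi)
    mids.append(0.5 * (lo + hi))
    p99.append(np.percentile(rain[sel], 99))

# Exponential regression: the slope of log(P99) vs. temperature gives the rate.
slope = np.polyfit(mids, np.log(p99), 1)[0]
rate_percent_per_K = (np.exp(slope) - 1) * 100
```

A slope above ~7%/K on real data is the "super-C-C" scaling the abstract attributes to convective precipitation.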

  19. Assessment of the Latest GPM-Era High-Resolution Satellite Precipitation Products by Comparison with Observation Gauge Data over the Chinese Mainland

    Directory of Open Access Journals (Sweden)

    Shaowei Ning

    2016-10-01

    The Global Precipitation Measurement (GPM) Core Observatory, launched on 27 February 2014, ushered in a new era for estimating precipitation from satellites. Based on their high spatio-temporal resolution and near-global coverage, satellite-based precipitation products have been applied in many research fields. The goal of this study was to quantitatively compare two of the latest GPM-era satellite precipitation products (GPM IMERG and GSMaP-Gauge Ver. 6) with a network of 840 precipitation gauges over the Chinese mainland. Direct comparisons of the satellite-based precipitation products with rain gauge observations over a 20-month period from April 2014 to November 2015, at 0.1° and daily/monthly resolutions, showed the following results: Both products were capable of capturing the overall spatial pattern of the 20-month mean daily precipitation, which was characterized by a decreasing trend from the southeast to the northwest. GPM IMERG overestimated precipitation by approximately 0.09 mm/day, while GSMaP-Gauge Ver. 6 underestimated precipitation by -0.04 mm/day. The two satellite-based precipitation products performed better over wet southern regions than over dry northern regions, and better in summer than in winter. In terms of mean error, root mean square error, correlation coefficient, and probability of detection, GSMaP-Gauge was better able to estimate precipitation and had more stable quality than GPM IMERG on both daily and monthly scales. GPM IMERG was more sensitive to conditions of no rain or light rainfall and demonstrated a good capability of capturing the behavior of extreme precipitation events. Overall, the results reveal some limitations of these two latest satellite-based precipitation products when used over the Chinese mainland, helping to characterize some of the error features in these datasets for potential users.
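
The comparison statistics named above (mean error, root mean square error, correlation coefficient, probability of detection) can be sketched as follows. The 1 mm/day rain/no-rain threshold is an assumption for illustration, not necessarily the study's value.

```python
import numpy as np

def verification_stats(sat, gauge, rain_thresh=1.0):
    """Continuous and categorical skill scores for satellite vs. gauge rainfall."""
    sat, gauge = np.asarray(sat, float), np.asarray(gauge, float)
    me = np.mean(sat - gauge)                        # mean error (bias)
    rmse = np.sqrt(np.mean((sat - gauge) ** 2))      # root mean square error
    cc = np.corrcoef(sat, gauge)[0, 1]               # correlation coefficient
    hits = np.sum((sat >= rain_thresh) & (gauge >= rain_thresh))
    misses = np.sum((sat < rain_thresh) & (gauge >= rain_thresh))
    pod = hits / (hits + misses)                     # probability of detection
    return {"ME": me, "RMSE": rmse, "CC": cc, "POD": pod}

# Toy daily series (mm/day) at one grid cell:
scores = verification_stats([0.0, 2.0, 5.0, 1.5], [0.2, 1.8, 6.0, 0.0])
```

ME and RMSE measure amount errors, CC measures co-variation, and POD measures occurrence detection, which is why the abstract reports all four when ranking the two products.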

  20. Biobanks in the Era of Digital Medicine.

    Science.gov (United States)

    Jacobs, Gunnar; Wolf, Andreas; Krawczak, Michael; Lieb, Wolfgang

    2018-05-01

    Digitalization is currently permeating virtually all sectors of modern societies, including biomedical research and medical care. At the same time, biobanks engaged in the long-term storage of biological samples that are fit for purpose have become key drivers in both fields. The present article highlights some of the challenges and opportunities that biobanking is facing in the current proverbial "era of digitalization." © 2017 The Authors Clinical Pharmacology & Therapeutics published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  1. Superbend era begins swiftly at the ALS

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, Art; Tamura, Lori

    2001-11-29

    The successful installation and commissioning of high-field superconducting bend magnets (superbends) in three curved sectors of the ALS storage ring marked the first time the magnet lattice of an operating synchrotron light source had been retrofitted in this fundamental way. As a result, the ALS now offers an expanded spectral range well into the hard x-ray region without compromising either the number of undulators or their high brightness in the soft x-ray region for which the ALS design was originally optimized. In sum, when the superbend-enhanced ALS started up for user operations in October 2001, it marked the beginning of a new era in its history.

  2. Evolusi Saluran Interaksi di Era Internet (The Evolution of Interaction Channels in the Internet Era)

    Directory of Open Access Journals (Sweden)

    Benedictus Arnold Simangunsong

    2011-07-01

    The development and advancement of technology affect how people interact with one another. The revolution of societal interaction proposed by Straubhaar and LaRose, from pre-agricultural society to agricultural society, to industrial society (marked by the industrial revolution), to the information society (marked by the information revolution), shifted people's ways and attitudes in terms of both economy and interaction. In the information society, the revolution also takes place in message delivery: face-to-face delivery was common at first, followed by textual and visual delivery, a change brought about in the internet era.

  3. WRF high resolution dynamical downscaling of ERA-Interim for Portugal

    Energy Technology Data Exchange (ETDEWEB)

    Soares, Pedro M.M. [University of Lisbon, Instituto Dom Luiz, Lisbon (Portugal); Faculdade de Ciencias da Universidade de Lisboa, Lisbon (Portugal); Cardoso, Rita M.; Miranda, Pedro M.A.; Medeiros, Joana de [University of Lisbon, Instituto Dom Luiz, Lisbon (Portugal); Belo-Pereira, Margarida; Espirito-Santo, Fatima [Instituto de Meteorologia, Lisbon (Portugal)

    2012-11-15

    This study proposes a dynamically downscaled climatology of Portugal, produced by a high-resolution (9 km) WRF simulation, forced by 20 years of ERA-Interim reanalysis (1989-2008) and nested in an intermediate domain with 27 km resolution. The Portuguese mainland is characterized by large precipitation gradients, with observed mean annual precipitation ranging from about 400 to over 2,200 mm, a very wet northwest and a rather dry southeast, largely explained by orographic processes. Model results are compared with all available stations with continuous records, comprising daily information from 32 stations for temperature and 308 for precipitation, through the computation of mean climatologies, standard statistical errors on daily to seasonal timescales, and distributions of extreme events. Results show that WRF at 9 km outperforms ERA-Interim in all analyzed variables, with good results in the representation of the annual cycles in each region. The biases of minimum and maximum temperature are reduced, with an improved description of temperature variability at the extreme range of its distribution. The largest gain of the high-resolution simulation is visible in the rainiest regions of Portugal, where orographic enhancement is crucial. These improvements are striking in the high-ranking percentiles in all seasons, describing extreme precipitation events. WRF results at 9 km compare favorably with published results, supporting its use as a high-resolution regional climate model. This higher resolution allows a better representation of extreme events, which is of major importance for the development of mitigation/adaptation strategies by policy makers and downstream users of regional climate models in applications such as flash floods or heat waves. (orig.)

  4. 77 FR 40644 - ERA Systems, LLC, Formerly ERA Systems Corporation, a Subsidiary of Systems Research and...

    Science.gov (United States)

    2012-07-10

    ... DEPARTMENT OF LABOR Employment and Training Administration [TA-W-81,047, TA-W-81,047A] ERA Systems..., 2011, resulted in a negative determination, issued on January 13, 2012. The determination was... partial separation from employment on the date of certification through two years from the date of...

  5. Analysis of ERA40-driven CLM simulations for Europe

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, E.B.; Luethi, D.; Schaer, C.; Seneviratne, S.I. [Inst. for Atmospheric and Climate Science, ETH Zurich (Switzerland); Anders, I.; Rockel, B. [Inst. for Coastal Research, GKSS Research Center, Geesthacht (Germany)

    2008-08-15

    The Climate Local Model (CLM) is a community Regional Climate Model (RCM) based on the COSMO weather forecast model. We present a validation of long-term ERA40-driven CLM simulations performed with different model versions. In particular, we analyse three simulations with differences in boundary nudging and horizontal resolution, performed for the EU project ENSEMBLES with model version 2.4.6, and one with the latest version 4.0. Moreover, we include for comparison a long-term simulation with the RCM CHRM previously used at ETH Zurich. We provide a thorough validation of temperature, precipitation, net radiation, cloud cover, circulation, evaporation and terrestrial water storage for winter and summer. For temperature and precipitation, the interannual variability is additionally assessed. Simulations with CLM version 2.4.6 are generally too warm and dry in summer, but still within the typical error of PRUDENCE simulations; version 4.0 has an anomalously cold and wet bias. This is partly due to a strong underestimation of net radiation associated with an overestimation of cloud cover. Two similar CLM 2.4.6 simulations with different spatial resolutions (0.44° and 0.22°) reveal no clear benefit of the higher resolution for the analysed fields, except for better resolved fine-scale structures. While the large-scale circulation is represented more realistically with spectral nudging, temperature and precipitation are not. Overall, CLM performs comparably to other state-of-the-art RCMs over Europe. (orig.)

  6. Multidecadal oscillations in rainfall and hydrological extremes

    Science.gov (United States)

    Willems, Patrick

    2013-04-01

    Many studies have anticipated a worldwide increase in the frequency and intensity of precipitation extremes and floods since the last decade(s). Natural variability due to climate oscillations partly determines the observed evolution of precipitation extremes. Based on a technique for the identification and analysis of changes in extreme quantiles, it is shown that hydrological extremes exhibit oscillatory behaviour at multidecadal time scales. Results are based on nearly independent extremes extracted from long-term historical time series of precipitation intensities and river flows. Study regions include Belgium - The Netherlands (Meuse basin), Ethiopia (Blue Nile basin) and Ecuador (Paute basin). For Belgium - The Netherlands, the past 100 years showed larger and more frequent hydrological extremes around the 1910s, the 1950s-1960s, and more recently during the 1990s-2000s. Interestingly, the oscillations for southwestern Europe are anti-correlated with those of northwestern Europe, with oscillation highs in the 1930s-1940s and 1970s. The precipitation oscillation peaks are explained by persistence in atmospheric circulation patterns over the North Atlantic during periods of 10 to 15 years. References: Ntegeka V., Willems P. (2008), 'Trends and multidecadal oscillations in rainfall extremes, based on a more than 100 years time series of 10 minutes rainfall intensities at Uccle, Belgium', Water Resources Research, 44, W07402, doi:10.1029/2007WR006471 Mora, D., Willems, P. (2012), 'Decadal oscillations in rainfall and air temperature in the Paute River Basin - Southern Andes of Ecuador', Theoretical and Applied Climatology, 108(1), 267-282, doi:10.1007/s00704-011-0527-4 Taye, M.T., Willems, P. (2011). 'Influence of climate variability on representative QDF predictions of the upper Blue Nile Basin', Journal of Hydrology, 411, 355-365, doi:10.1016/j.jhydrol.2011.10.019 Taye, M.T., Willems, P. (2012). 'Temporal variability of hydro-climatic extremes in the Blue Nile basin', Water

  7. Mempertanyakan Privasi di Era Selebgram: Masih Adakah? (Questioning Privacy in the Selebgram Era: Is There Any Left?)

    Directory of Open Access Journals (Sweden)

    Ester Krisnawati

    2017-01-01

    Abstract: Instagram marks the importance of participatory culture in the era of new media. This paper examines the complex notion of privacy with regard to children who have been made famous (by their parents) through Instagram, in light of the selebgram phenomenon. By examining data gathered using #selebgram and drawing on the self-presentation perspective in the psychology of communication, the results show that parents have their own motives and goals when uploading their children's photos to Instagram. As a consequence, the children lose their privacy in cyberspace and, of course, that information is vulnerable to criminal misuse. Keywords: children's privacy, Instagram, selebgram, self-presentation Abstract (Indonesian): Instagram marks the idea of the importance of participatory culture in the new media era. The aim of this paper is to analyse the concept of Instagram as a forum and means of communication by looking at the selebgram phenomenon. It examines the complex notion of privacy with respect to the privacy of children made famous (by their parents) through Instagram, by examining data collected using #selebgram and underlining the self-presentation perspective in the study of the psychology of communication. The analysis shows that parents have motives and goals in posting their children's photos on their Instagram accounts. As for the impact, the children have no privacy in cyberspace, and that information will of course be vulnerable to misuse for crime. Keywords: Instagram, self-presentation, children's privacy, selebgram

  8. Molecular anthropology in the genomic era.

    Science.gov (United States)

    Destro-Bisol, Giovanni; Jobling, Mark A; Rocha, Jorge; Novembre, John; Richards, Martin B; Mulligan, Connie; Batini, Chiara; Manni, Franz

    2010-01-01

    Molecular Anthropology is a relatively young field of research. In fact, less than 50 years have passed since the symposium "Classification and Human Evolution" (1962, Burg Wartenstein, Austria), where the term was formally introduced by Emil Zuckerkandl. In this time, Molecular Anthropology has developed both methodologically and theoretically and extended its applications, so covering key aspects of human evolution such as the reconstruction of the history of human populations and peopling processes, the characterization of DNA in extinct humans and the role of adaptive processes in shaping the genetic diversity of our species. In the current scientific panorama, molecular anthropologists have to face a double challenge. As members of the anthropological community, we are strongly committed to the integration of biological findings and other lines of evidence (e.g. linguistic and archaeological), while keeping in line with methodological innovations which are moving the approach from the genetic to the genomic level. In this framework, the meeting "DNA Polymorphisms in Human Populations: Molecular Anthropology in the Genomic Era" (Rome, December 3-5, 2009) offered an opportunity for discussion among scholars from different disciplines, while paying attention to the impact of recent methodological innovations. Here we present an overview of the meeting and discuss perspectives and prospects of Molecular Anthropology in the genomic era.

  9. CENTRAL BANKING IN THE NEW ERA

    Directory of Open Access Journals (Sweden)

    Bilal Bagis

    2017-12-01

    Full Text Available This paper analyzes the evolution of central banking, and in particular the American experience of central banking. It provides projections for the future of central banking in the new, post-2008 era. The paper initially demonstrates recent improvements in the financial and banking sectors, regulations and different measures of monetary and financial rules both in the USA and the rest of the advanced economies. It then argues that institutions such as central banks will gain new objectives and greater significance in this new era, and thus will be given new roles over time, along with the improvements and deepening of the financial system. The paper argues that the centuries-long evolution of central banking is not yet complete and that more objectives should be expected to come forward. Along these lines, there is a need for a shift in conventional policy measures. New trends in central banking, such as helicopter money, the popular nominal GDP targeting regime and retro developmental central banking, are all critically analyzed. The paper provides a breakdown of financial development and central banking activities in a historical context and provides a rationale and a new basis for possible future innovations.

  10. The end of a remarkable era

    CERN Multimedia

    2011-01-01

    An important era in particle physics is coming to an end: the US Department of Energy announced on Monday that it will not fund an extension to Tevatron running beyond 2011. It is a poignant moment for particle physics as we prepare to bid farewell to a machine that has changed our view of the Universe, and played a significant role in paving the way for the new era that is opening up with the LHC.   The Tevatron has been at the high-energy frontier of particle physics for over a quarter of a century. That’s a remarkable achievement by any account, and the physics results are there to prove it. As well as bringing us the discovery of the top quark in 1995, the Tevatron’s experiments have provided vitally important precision measurements covering the full spectrum of Standard Model physics, not to mention hints of what may lie beyond. With several months of running still to come, it would be a foolish gambler who bet against further new physics emerging before the Teva...

  11. Causal Analysis of the Unanticipated Extremity Exposure at HFEF

    Energy Technology Data Exchange (ETDEWEB)

    David E. James; Charles R. Posegate; Thomas P. Zahn; Alan G. Wagner

    2011-11-01

    This report covers the unintended extremity exposure of an operator handling a metallurgical mount sample of irradiated fuel after an off-scale high beta radiation reading of the sample. The HPT Supervisor decided to continue working after the meter read off-scale high, which resulted in the exposure of the operator performing the next operation.

  12. Extreme weather: Subtropical floods and tropical cyclones

    Science.gov (United States)

    Shaevitz, Daniel A.

    Extreme weather events have a large effect on society. As such, it is important to understand these events and to project how they may change in a future, warmer climate. The aim of this thesis is to develop a deeper understanding of two types of extreme weather events: subtropical floods and tropical cyclones (TCs). In the subtropics, the latitude is high enough that quasi-geostrophic dynamics are at least qualitatively relevant, while low enough that moisture may be abundant and convection strong. Extratropical extreme precipitation events are usually associated with large-scale flow disturbances, strong ascent, and large latent heat release. In the first part of this thesis, I examine the possible triggering of convection by the large-scale dynamics and investigate the coupling between the two. Specifically two examples of extreme precipitation events in the subtropics are analyzed, the 2010 and 2014 floods of India and Pakistan and the 2015 flood of Texas and Oklahoma. I invert the quasi-geostrophic omega equation to decompose the large-scale vertical motion profile to components due to synoptic forcing and diabatic heating. Additionally, I present model results from within the Column Quasi-Geostrophic framework. A single column model and cloud-revolving model are forced with the large-scale forcings (other than large-scale vertical motion) computed from the quasi-geostrophic omega equation with input data from a reanalysis data set, and the large-scale vertical motion is diagnosed interactively with the simulated convection. It is found that convection was triggered primarily by mechanically forced orographic ascent over the Himalayas during the India/Pakistan flood and by upper-level Potential Vorticity disturbances during the Texas/Oklahoma flood. Furthermore, a climate attribution analysis was conducted for the Texas/Oklahoma flood and it is found that anthropogenic climate change was responsible for a small amount of rainfall during the event but the
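The decomposition described in this abstract relies on the linearity of the quasi-geostrophic omega equation: because the operator is linear, inverting it separately for the synoptic and diabatic forcing terms and summing the results reproduces the solution for the total forcing. A schematic 1-D sketch of that idea, with a finite-difference Poisson solve standing in for the full elliptic QG omega operator (the grid, forcing profiles and boundary conditions are invented for illustration):

```python
def solve_poisson_1d(forcing, dx=1.0):
    """Solve d2w/dx2 = f with w = 0 at both boundaries via the Thomas
    algorithm, a schematic 1-D stand-in for inverting the elliptic QG
    omega operator. `forcing` holds f at the interior grid points."""
    n = len(forcing)
    a = [1.0] * n              # sub-diagonal
    b = [-2.0] * n             # main diagonal
    c = [1.0] * n              # super-diagonal
    d = [dx * dx * f for f in forcing]
    for i in range(1, n):      # forward elimination
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    w = [0.0] * n              # back substitution
    w[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        w[i] = (d[i] - c[i] * w[i + 1]) / b[i]
    return w

# Linearity lets the inversion be done per forcing term and then summed.
f_synoptic = [1.0, 2.0, 3.0, 2.0, 1.0]    # invented forcing profiles
f_diabatic = [0.0, 1.0, 0.0, -1.0, 0.0]
w_syn = solve_poisson_1d(f_synoptic)
w_dia = solve_poisson_1d(f_diabatic)
w_total = solve_poisson_1d([s + h for s, h in zip(f_synoptic, f_diabatic)])
```

The per-term solutions sum to the solution for the combined forcing, which is what allows the vertical motion profile to be attributed to synoptic forcing and diabatic heating separately.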

  13. Are BALQSOs extreme accretors?

    Science.gov (United States)

    Yuan, M. J.; Wills, B. J.

    2002-12-01

    Broad Absorption Line (BAL) QSOs are QSOs with massive absorbing outflows of up to 0.2c. Two hypotheses have been suggested in the past about the nature of BALQSOs: every QSO might have a BAL outflow with some covering factor, and BALQSOs are those which happen to have the outflow along our line of sight; or BALQSOs have intrinsically different physical properties than non-BALQSOs. Based on BALQSOs' optical emission properties and a large set of correlations linking many general QSO emission-line and continuum properties, it has been suggested that BALQSOs might accrete near the Eddington limit with abundant fuel supplies. With new spectroscopic observations of the BALQSO Hβ region conducted at UKIRT and a re-analysis of literature data for low- and high-redshift non-BALQSOs, we confirm that BALQSOs have extreme Fe II and [O III] emission-line properties. Using results derived from the latest QSO Hβ-region reverberation mapping, we calculated Eddington ratios (Ṁ/Ṁ_Edd) for our BAL and non-BALQSOs. The Fe II and [O III] strengths are strongly correlated with Eddington ratios. These correlations link the Eddington ratio to a large set of general QSO properties through the Boroson & Green Eigenvector 1. We find that BALQSOs have Eddington ratios close to 1. However, all high-redshift, high-luminosity QSOs have rather high Eddington ratios. We argue that this is a side effect of selecting the brightest objects. In fact, our high-redshift sample might constitute the BALQSOs' high-Eddington-ratio orientation parent population.

  14. A New Era for Research Education in Australia?

    Science.gov (United States)

    Marsh, Helene; Smith, Bradley; King, Max; Evans, Terry

    2012-01-01

    Use of the Australian research assessment exercise, Excellence in Research for Australia (ERA) to influence the policy and practice of research education in Australia will undoubtedly have many consequences, some of them unintended and potentially deleterious. ERA is a retrospective measure of research quality; research education is prospective.…

  15. Progressive-Era Resources on the World Wide Web.

    Science.gov (United States)

    Howenstein, Amanda

    1999-01-01

    Provides a list of Progressive-era websites with the address and a detailed description of each of the websites. Includes topics such as the women's suffrage movement, the Triangle Shirtwaist Factory fire, Prohibition, labor-management conflicts, Hull House, the Chicago fire, Emma Goldman, Progressive-era entertainment, and the World's Fair.…

  16. Enhanced recovery after surgery (ERAS) in penetrating abdominal ...

    African Journals Online (AJOL)

    Background: Enhanced recovery after surgery (ERAS) programmes employed in elective surgery have provided strong evidence for decreased lengths of hospital stay without increase in postoperative complications. The aim of this study was to explore the role and benefits of ERAS implemented in patients undergoing ...

  17. Implementing Maxwell's Aether Illuminates the Physics of Gravitation:. The Gravity-Electric (G-E) Field, Evident at Every Scale, From the Ionosphere to Spiral Galaxies and a Neutron-Star Extreme

    Science.gov (United States)

    Osmaston, Miles F.

    2013-09-01

    the means for displacing its local density exist; that, we show, is the nature of gravitational action and brings gravitation into the electromagnetic family of forces. Under (B) the particle mass is measured by the aether-sucking capability of its vortex, positive-only gravitation being because the outward-diminishing force developed by each makes mutual convergence at any given point the statistically prevalent expectation. This activity maintains a radial aether (charge) density gradient - the Gravity-Electric (G-E) Field - around and within any gravitationally retained assemblage. So Newton's is an incomplete description of gravitation; the corresponding G-E field is an inseparable facet of the action. The effect on c of that charge density gradient yields gravitational lensing. We find that G-E field action on plasma is astronomically ubiquitous. This strictly radial outward force on ions has the property of increasing the orbital angular momentum of material, by moving it outwards, but at constant tangential velocity. Spiral galaxies no longer require Cold Dark Matter (CDM) to explain this. The force (maybe 30 V.m^-1 at the solar surface) has comprehensive relevance to the high orbital a.m. achieved during solar planet formation, to their prograde spins and to exoplanet observations. The growth of high-mass stars is impossible if radiation pressure rules, whereas G-E field repulsion is low during dust-opaque infall, driving their prodigious mass loss rates when infall ceases and the star establishes an ionized environment. Its biggest force-effect (~10^12 V.m^-1) is developed at neutron stars, where it is likely the force of supernova explosions, and leads to a fertile model for pulsars and the acceleration of 10^19 eV extreme-energy cosmic rays. Our only directly observed measure of the G-E field is recorded at about 1 V.m^-1 in the ionosphere-to-Earth electric potential. And temporary local changes of ionosphere electron density, monitored by radio and satellite, have

  18. A note on extreme sets

    Directory of Open Access Journals (Sweden)

    Radosław Cymer

    2017-10-01

    Full Text Available In decomposition theory, extreme sets have been studied extensively due to their connection to perfect matchings in a graph. In this paper, we first define extreme sets with respect to degree-matchings and then investigate some of their properties. In particular, we prove the generalized Decomposition Theorem and give a characterization of the set of all extreme vertices in a graph.

  19. Pengelolaan Kepegawaian pada Era Otonomi Daerah

    Directory of Open Access Journals (Sweden)

    Erlanda Juliansyah Putra

    2015-04-01

    ABSTRACT: Personnel management in the era of regional autonomy has, in essence, developed rapidly, especially in the preparation of staffing needs and in the recruitment and appointment of civil servants. In determining staffing needs, personnel management bases its requirements on job analysis and on workloads reflecting the priority needs of the region. The recruitment and appointment of civil servants also includes provisions that establish a professionalism-based system grounded in the ability of each candidate civil servant, together with qualification requirements enabling civil servants to compete in driving development in the region, accompanied by the management of salaries and allowances based on the budget needs of each region.

  20. Laplace and the era of differential equations

    Science.gov (United States)

    Weinberger, Peter

    2012-11-01

    Between about 1790 and 1850 French mathematicians dominated not only mathematics, but also all other sciences. The belief that a particular physical phenomenon has to correspond to a single differential equation originates from the enormous influence Laplace and his contemporary compatriots had in all European learned circles. It will be shown that at the beginning of the nineteenth century Newton's "fluxionary calculus" finally gave way to a French-type notation of handling differential equations. A heated dispute in the Philosophical Magazine between Challis, Airy and Stokes, all three of them famous Cambridge professors of mathematics, then serves to illustrate the era of differential equations. A remark about Schrödinger and his equation for the hydrogen atom finally will lead back to present times.

  1. Russian Gas Market: Entering New Era

    International Nuclear Information System (INIS)

    Mitrova, Tatiana; Molnar, Gergely

    2015-04-01

    After a period of extensive growth in the 2000s, the Russian gas industry is now facing numerous challenges. Mounting competition from independent producers and the development of new production by Gazprom, combined with stagnating domestic demand and weakening export markets, have created a situation of overproduction, made worse by western sanctions and low oil and gas prices. Expansion to the East thanks to the recent China deal is not expected to provide much relief before 2024. The coming decade will be critical for the industry, and its outcome will largely depend on the government's pricing and institutional policies, but the role of the state should remain essential. This document presents the key findings of the new CEDIGAZ report 'Russian Gas Market: Entering New Era'. The report analyses the ongoing changes in the Russian industry and the challenges to be met.

  2. Engineering Education for a New Era

    Science.gov (United States)

    Ohgaki, Shinichiro

    Engineering education is composed of five components: the idea of what engineering education ought to be, the knowledge in engineering fields, those who learn engineering, those who teach engineering, and the stakeholders in engineering issues. The characteristics of all five components are changing with the times. When we consider engineering education for the next era, we should analyze the changes in all five components. In particular, the knowledge and tools in engineering fields have been expanding, and advanced science and technology is partly casting a dark shadow on modern convenient life. Moral rules and ethics for developing new products and engineering systems are now regarded as most important in engineering fields. All those who take responsibility for engineering education should understand the changes in all of its components and have a clear grasp of the essence of engineering for a sustainable society.

  3. The Rejuvenation of Cartography in ICT Era

    Directory of Open Access Journals (Sweden)

    GUO Renzhong

    2017-10-01

    Full Text Available With the impetus of ICT, cartography faces unprecedented challenges. The paper discusses the problems and changes cartography faces with digital technology, and analyzes the constraints of traditional cartography, which is mainly delimited by the 2D physical paper map. The diversity of modern cartography is reflected in its various map products, and the paper illustrates the digital freedom of modern cartography in information space from eight aspects, including physical reality vs. virtual reality, paper map vs. digital map, superficial visualization vs. inner visualization, and so on. Modern cartography encounters new development opportunities and fresh demands in the digital era, and it is necessary to extend the framework of cartography and to assimilate newly emerging developments to promote the rejuvenation of cartography.

  4. The forthcoming era of precision medicine.

    Science.gov (United States)

    Gamulin, Stjepan

    2016-11-01

    The aim of this essay is to present the definition and principles of personalized or precision medicine, the perspective and barriers to its development and clinical application. The implementation of precision medicine in health care requires the coordinated efforts of all health care stakeholders (the biomedical community, government, regulatory bodies, patients' groups). Particularly, translational research with the integration of genomic and comprehensive data from all levels of the organism ("big data"), development of bioinformatics platforms enabling network analysis of disease etiopathogenesis, development of a legislative framework for handling personal data, and new paradigms of medical education are necessary for successful application of the concept of precision medicine in health care. In the present and future era of precision medicine, the collaboration of all participants in health care is necessary for its realization, resulting in improvement of diagnosis, prevention and therapy, based on a holistic, individually tailored approach. Copyright © 2016 by Academy of Sciences and Arts of Bosnia and Herzegovina.

  5. Astronomy in the Big Data Era

    Directory of Open Access Journals (Sweden)

    Yanxia Zhang

    2015-05-01

    Full Text Available The fields of Astrostatistics and Astroinformatics are vital for dealing with the big data issues now faced by astronomy. Like other disciplines in the big data era, astronomy exhibits the many "V" characteristics of big data (volume, velocity, variety and so on). In this paper, we list the different data mining algorithms used in astronomy, along with data mining software and tools related to astronomical applications. We present SDSS, a project often referred to by other astronomical projects, as the most successful sky survey in the history of astronomy, and describe the factors influencing its success. We also discuss the success of Astrostatistics and Astroinformatics organizations and the conferences and summer schools on these issues that are held annually. All of the above indicates that astronomers and scientists from other areas are ready to face the challenges and opportunities provided by massive data volumes.

  6. Scandinavian neuroscience during the Nazi era.

    Science.gov (United States)

    Kondziella, Daniel; Hansen, Klaus; Zeidman, Lawrence A

    2013-07-01

    Although Scandinavian neuroscience has a proud history, its status during the Nazi era has been overlooked. In fact, prominent neuroscientists in German-occupied Denmark and Norway, as well as in neutral Sweden, were directly affected. Mogens Fog, Poul Thygesen (Denmark) and Haakon Sæthre (Norway) were resistance fighters, tortured by the Gestapo: Thygesen was imprisoned in concentration camps and Sæthre executed. Jan Jansen (Norway), another neuroscientist resister, escaped to Sweden, returning under disguise to continue fighting. Fritz Buchthal (Denmark) was one of almost 8000 Jews escaping deportation by fleeing from Copenhagen to Sweden. In contrast, Carl Værnet (Denmark) became a collaborator, conducting inhuman experiments in Buchenwald concentration camp, and Herman Lundborg (Sweden) and Thorleif Østrem (Norway) advanced racial hygiene in order to maintain the "superior genetic pool of the Nordic race." Compared to other Nazi-occupied countries, there was a high ratio of resistance fighters to collaborators and victims among the neuroscientists in Scandinavia.

  7. The forthcoming era of precision medicine

    Directory of Open Access Journals (Sweden)

    Stjepan Gamulin

    2016-11-01

    Full Text Available Abstract. The aim of this essay is to present the definition and principles of personalized or precision medicine, the perspective and barriers to its development and clinical application. The implementation of precision medicine in health care requires the coordinated efforts of all health care stakeholders (the biomedical community, government, regulatory bodies, patients’ groups). Particularly, translational research with the integration of genomic and comprehensive data from all levels of the organism (“big data”), development of bioinformatics platforms enabling network analysis of disease etiopathogenesis, development of a legislative framework for handling personal data, and new paradigms of medical education are necessary for successful application of the concept of precision medicine in health care. Conclusion. In the present and future era of precision medicine, the collaboration of all participants in health care is necessary for its realization, resulting in improvement of diagnosis, prevention and therapy, based on a holistic, individually tailored approach.

  8. New Software for the Fast Estimation of Population Recombination Rates (FastEPRR) in the Genomic Era

    Directory of Open Access Journals (Sweden)

    Feng Gao

    2016-06-01

    Full Text Available Genetic recombination is a very important evolutionary mechanism that mixes parental haplotypes and produces new raw material for organismal evolution. As a result, information on recombination rates is critical for biological research. In this paper, we introduce a new, extremely fast open-source software package (FastEPRR) that uses machine learning to estimate the recombination rate ρ (=4Ner) from intraspecific DNA polymorphism data. When ρ>10 and the number of sampled diploid individuals is large enough (≥50), the variance of ρFastEPRR remains slightly smaller than that of ρLDhat. The new estimate ρcomb (calculated by averaging ρFastEPRR and ρLDhat) has the smallest variance of all cases. When estimating ρFastEPRR, the finite-site model was employed to analyze cases with a high rate of recurrent mutations, and an additional method is proposed to account for the effect of variable recombination rates within windows. Simulations encompassing a wide range of parameters demonstrate that different evolutionary factors, such as demography and selection, may not increase the false positive rate of recombination hotspots. Overall, the accuracy of FastEPRR is similar to that of the well-known method LDhat, but it requires far less computation time. Genetic maps for each human population (YRI, CEU and CHB) extracted from the 1000 Genomes OMNI data set were obtained in less than 3 days using just a single CPU core. The pairwise Pearson correlation coefficient between the ρFastEPRR and ρLDhat maps is very high, ranging between 0.929 and 0.987 at a 5-Mb scale. Considering that sample sizes for these kinds of data are increasing dramatically with advances in next-generation sequencing technologies, FastEPRR (freely available at http://www.picb.ac.cn/evolgen/) is expected to become a widely used tool for establishing genetic maps and studying recombination hotspots in the population genomic era.
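The claim that the averaged estimate ρcomb has the smallest variance follows from a general property: the mean of two unbiased, imperfectly correlated estimators has lower variance than either one alone. A small simulation sketch of that property (all parameter values here are invented for illustration, not taken from the paper):

```python
import random
import statistics

def simulate(n_reps=20000, true_rho=25.0, sd1=4.0, sd2=4.0, corr=0.5, seed=1):
    """Simulate two unbiased, partially correlated estimators of the same
    recombination rate and their average (analogous to rho_comb above).
    All parameter values are invented for illustration."""
    rng = random.Random(seed)
    est1, est2, comb = [], [], []
    w_shared = corr ** 0.5          # weight of the shared error component
    w_own = (1.0 - corr) ** 0.5     # weight of each estimator's own noise
    for _ in range(n_reps):
        shared = rng.gauss(0.0, 1.0)
        e1 = sd1 * (w_shared * shared + w_own * rng.gauss(0.0, 1.0))
        e2 = sd2 * (w_shared * shared + w_own * rng.gauss(0.0, 1.0))
        est1.append(true_rho + e1)
        est2.append(true_rho + e2)
        comb.append((est1[-1] + est2[-1]) / 2.0)
    return (statistics.variance(est1), statistics.variance(est2),
            statistics.variance(comb))

v1, v2, vc = simulate()
# theory: Var(average) = (sd1^2 + sd2^2 + 2*corr*sd1*sd2) / 4, here 12 vs 16
```

The variance of the averaged estimator sits below both individual variances whenever the two error terms are not perfectly correlated, which is the regime the abstract describes.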

  9. Likelihood estimators for multivariate extremes

    KAUST Repository

    Huser, Raphaël; Davison, Anthony C.; Genton, Marc G.

    2015-01-01

    The main approach to inference for multivariate extremes consists in approximating the joint upper tail of the observations by a parametric family arising in the limit for extreme events. The latter may be expressed in terms of componentwise maxima, high threshold exceedances or point processes, yielding different but related asymptotic characterizations and estimators. The present paper clarifies the connections between the main likelihood estimators, and assesses their practical performance. We investigate their ability to estimate the extremal dependence structure and to predict future extremes, using exact calculations and simulation, in the case of the logistic model.
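The logistic model mentioned in the abstract has, in its simplest bivariate form with unit Fréchet margins, joint distribution G(z1, z2) = exp{-(z1^(-1/α) + z2^(-1/α))^α} with dependence parameter 0 < α ≤ 1: α = 1 gives independence and α → 0 complete dependence. A minimal sketch of this CDF, evaluated in log space so that small α does not underflow (the numeric arguments are illustrative):

```python
import math

def logistic_cdf(z1, z2, alpha):
    """Joint CDF of the bivariate logistic max-stable model with unit Frechet
    margins: G(z1, z2) = exp(-(z1**(-1/alpha) + z2**(-1/alpha))**alpha),
    for 0 < alpha <= 1 and z1, z2 > 0, computed in log space for stability."""
    # log of t_i = z_i**(-1/alpha)
    l1 = -math.log(z1) / alpha
    l2 = -math.log(z2) / alpha
    m = max(l1, l2)
    # log-sum-exp: log(t1 + t2) = m + log(exp(l1 - m) + exp(l2 - m))
    log_v = alpha * (m + math.log(math.exp(l1 - m) + math.exp(l2 - m)))
    return math.exp(-math.exp(log_v))

# alpha = 1: the exponent measure is additive, so the components are independent
g_indep = logistic_cdf(2.0, 3.0, 1.0)     # equals exp(-1/2) * exp(-1/3)
# alpha -> 0: complete dependence, G -> exp(-max(1/z1, 1/z2))
g_dep = logistic_cdf(2.0, 3.0, 1e-6)      # approaches exp(-1/2)
```

The two limiting cases make the role of α easy to check numerically, which is also why the model is a convenient test bed for comparing likelihood estimators of extremal dependence.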


  11. Chirurgie in die Grieks-Romeinse era

    Directory of Open Access Journals (Sweden)

    François P. Retief

    2006-09-01

    Full Text Available In the Greco-Roman era, medical treatment characteristically consisted of three elements, namely regimen (diet and healthy lifestyle), medicines and surgery, the last applied only when regimen and medicines were unsuccessful. Evidence of primitive surgery dates back to the Bronze Age, and Homer's epics make frequent mention of the surgical management of war wounds, with intervention by the gods. With the advent of empirical medicine in the 5th century BC, surgery figured prominently in the Hippocratic Corpus, with significant contributions especially in the orthopaedic field and in head injuries. The expansion of anatomical and physiological knowledge, based on the dissection of human cadavers in Alexandria from the late 4th century BC, gave surgery a boost. By the Roman era, from the 2nd century BC, surgical techniques (and instruments) had improved considerably, but surgery was still practised predominantly by Greek physicians. Physicians were still expected to master all three of the above therapeutic modalities, but surgery gained greater prestige and there was increasing specialization in subdivisions of surgery such as ophthalmology, gynaecology and obstetrics, bladder ailments, and mouth and throat surgery. Military medicine was an important activity in the Roman Empire and particularly advanced trauma surgery. Entering the abdominal and thoracic cavities was no longer necessarily fatal, and veterinary medicine came into being. The first significant surgical textbook after the Hippocratic Corpus was compiled by Celsus in the 1st century AD. From the 3rd century onwards the surgical profession made little progress; the language of the profession gradually changed from Greek to Latin, and expertise was later transmitted, especially by Islamic physicians, to the Middle Ages and beyond.

  12. Recent and future extreme precipitation over Ukraine

    Science.gov (United States)

    Vyshkvarkova, Olena; Voskresenskaya, Elena

    2014-05-01

    The aim of this study is to analyze the parameters of precipitation extremes and inequality over Ukraine in the recent climate epoch and their possible changes in the future. Observational data from 28 hydrometeorological stations over Ukraine and output of the GFDL-CM3 model (CMIP5) for the XXI century were used in the study. The concentration index method (J. Martin-Vide, 2004) was used to study precipitation inequality, while the extreme precipitation indices recommended by the ETCCDI were used for the frequency of events. Results. Precipitation inequality on the annual and seasonal scales was studied using estimated CI series for 1951-2005. It was found that annual CI values range from 0.58 to 0.64. They increase southward from the north-west (forest zone) and the north-east (forest-steppe zone) of Ukraine. The CI maxima are located in the coastal regions of the Black Sea and the Sea of Azov. The annual CI spatial distribution indicates that the contribution of extreme precipitation to annual totals is most significant in the boundary zone between the steppe and marine regions. At the same time, the precipitation pattern in the foothills of the Carpathian Mountains is more homogeneous. The CI minima (0.54) are typical for the winter season in the foothills of the Ukrainian Carpathians. The CI maxima reach 0.71 in spring in the steppe zone close to the Black Sea coast. It should be noted that the greatest ranges between CI maximum and CI minimum are typical for spring, which is associated with the patterns of cyclone trajectories in that season. Over most of the territory there is a tendency for the contribution of extreme precipitation to the total amount to decrease (CI linear trends are predominantly negative in all seasons). Decadal and interdecadal variability of precipitation inequality associated with global processes in the ocean-atmosphere system is also studied. It was shown that precipitation inequality over Ukraine is 10-15% stronger in the negative phase of the Pacific Decadal Oscillation and in the positive phase
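The concentration index used in this record measures how unevenly the precipitation total is distributed across wet days. The published Martin-Vide CI fits an exponential curve to the cumulative distribution of daily amounts; as a simplified stand-in that captures the same idea, a Gini-style coefficient over wet-day totals can be computed (the data below are invented):

```python
def concentration_index(daily_precip):
    """Gini-style concentration of precipitation across wet days: near 0 when
    rain is spread evenly, near 1 when a few days carry most of the total.
    (The published Martin-Vide CI instead fits an exponential curve to the
    cumulative distribution; this simplified version only illustrates the
    idea.)"""
    wet = sorted(p for p in daily_precip if p > 0)
    n, total = len(wet), sum(wet)
    cum, area = 0.0, 0.0
    for p in wet:                     # trapezoidal area under the Lorenz curve
        prev = cum
        cum += p / total
        area += (prev + cum) / (2.0 * n)
    return 1.0 - 2.0 * area

uniform = concentration_index([10.0] * 100)          # evenly spread wet days
skewed = concentration_index([1.0] * 99 + [500.0])   # one day dominates
```

A value near 0.6, as reported for Ukraine, thus indicates that a relatively small share of wet days contributes a large share of the annual total.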

  13. Long-memory exchange rate dynamics in the euro era

    International Nuclear Information System (INIS)

    Barkoulas, John T.; Barilla, Anthony G.; Wells, William

    2016-01-01

    We investigate the long-run dynamics of a system of eight major exchange rates in the euro era using both integer and fractional cointegration methodologies. Contrary to the fragile evidence in the pre-euro era, robust evidence of linear cointegratedness is obtained in the foreign exchange market during the euro era. Upon closer examination, deviations from the cointegrating relationship exhibit nonstationary, long-memory dynamic behavior (Joseph effect). We find the long-memory evidence to be temporally stable in the most recent era. Finally, the foreign exchange system dynamics appears to be characterized by less persistence (smaller fractional exponent) in the euro era (as compared to pre-euro time periods), potentially indicating increased policy coordination by central banks in the recent period.

  14. Variability of Iberian upwelling implied by ERA-40 and ERA-Interim reanalyses

    Directory of Open Access Journals (Sweden)

    José M. R. Alves

    2013-05-01

    Full Text Available The Regional Ocean Modeling System ocean model is used to simulate the decadal evolution of the regional waters in offshore Iberia in response to atmospheric fields given by ECMWF ERA-40 (1961–2001) and ERA-Interim (1989–2008) reanalyses. The simulated sea surface temperature (SST) fields are verified against satellite AVHRR SST, and they are analysed to characterise the variability and trends of coastal upwelling in the region. Opposing trends in upwelling frequency are found at the northern limit, where upwelling has been decreasing in recent decades, and at its southern edge, where there is some evidence of increased upwelling. These results confirm previous observational studies and, more importantly, indicate that observed SST trends are not due to changes in radiative or atmospheric heat fluxes alone but also to changes in upwelling dynamics, suggesting that such a process may be relevant in climate change scenarios.

  15. Further outlooks: extremely uncomfortable; Die weiteren Aussichten: extrem ungemuetlich

    Energy Technology Data Exchange (ETDEWEB)

    Resenhoeft, T.

    2006-07-01

The climate has been changing extremely in recent decades. Scientists dealing with extreme weather should not only stare at computer simulations; they also have to turn towards the psyche, take personal experiences seriously, know the statistics, put supposedly sensational reports into perspective and, last but not least, collect more data. (GL)

  16. Economic outcome for intensive care of infants of birthweight 500-999 g born in Victoria in the post surfactant era. The Victorian Infant Collaborative Study Group.

    Science.gov (United States)

    1997-06-01

To determine the incremental cost of improving the outcome for extremely low birthweight (ELBW, birthweight 500-999 g) infants born in Victoria after the introduction of exogenous surfactant (the post-surfactant era). This was a geographically determined cohort study of consecutive ELBW livebirths in Victoria, Australia, born in three distinct eras: (i) 1979-80 (n = 351); (ii) 1985-87 (n = 560); and (iii) 1991-92 (n = 429). Exogenous surfactant was first used in Victoria in March 1991. The consumption of nursery resources per livebirth, and the survival and sensorineural disability rates at 2 years of age for each era, were investigated. Utilities were assigned as follows: 0 for dead, 0.4 for severe disability, 0.6 for moderate disability, 0.8 for mild disability, and 1 for no disability; utilities were multiplied for more than one disability. Dollar costs were assumed to be $1470 ($A 1992) per day of assisted ventilation, and one dose of exogenous surfactant was assumed to be equivalent to one third of a day of assisted ventilation. Cost-effectiveness (additional costs per additional survivor or life-year gained) and cost-utility (additional costs per additional quality-adjusted survivor or life-year gained) ratios were calculated for the pre-surfactant era (1985-87 vs 1979-80) and for the post-surfactant era (1991-92 vs 1985-87). Considering only the costs incurred during the primary hospitalization, cost-effectiveness and cost-utility ratios were lower (i.e. economically better) in the post-surfactant era than in the pre-surfactant era (pre-surfactant vs post-surfactant: $7040 vs $4040 per life-year gained; $6700 vs $5360 per quality-adjusted life-year gained). Both ratios fell with increasing birthweight. In contrast with the pre-surfactant era, cost-utility ratios were less favourable than cost-effectiveness ratios in the post-surfactant era.
With costs for long-term care of severely disabled children added, both cost ratios were higher in the post
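The utility arithmetic described in the abstract (utility weights multiplied across disabilities, then incremental cost divided by incremental quality-adjusted life-years) can be sketched in a few lines. The weights and the multiplication rule come from the abstract; the numbers in the example are hypothetical, for illustration only.

```python
# Utility weights per disability level, as stated in the abstract.
UTILITY = {"none": 1.0, "mild": 0.8, "moderate": 0.6, "severe": 0.4}

def child_utility(disabilities):
    """Utility for one survivor: weights are multiplied when a child
    has more than one disability (per the abstract's rule)."""
    u = 1.0
    for d in disabilities:
        u *= UTILITY[d]
    return u

def cost_utility_ratio(extra_cost, extra_qalys):
    """Additional cost per additional quality-adjusted life-year."""
    return extra_cost / extra_qalys

# Hypothetical example: a child with both a moderate and a mild
# disability contributes 0.6 * 0.8 = 0.48 utility per life-year.
u = child_utility(["moderate", "mild"])
```

A cost-utility ratio such as the $5360 per quality-adjusted life-year reported for the post-surfactant era is simply `cost_utility_ratio` evaluated on the era-to-era differences in cost and QALYs.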

  17. Assessing Climate Variability using Extreme Rainfall and ...

    African Journals Online (AJOL)

    extreme frequency); the average intensity of rainfall from extreme events ... frequency and extreme intensity indices, suggesting that extreme events are more frequent and intense during years with high rainfall. The proportion of total rainfall from ...
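Indices like those named above (the frequency of extreme events, their average intensity, and the proportion of total rainfall they contribute) can be computed from a daily series in a few lines. The 95th-percentile wet-day threshold below is an assumption for illustration, not necessarily the paper's exact definition.

```python
def percentile(values, pct):
    """Linear-interpolated percentile of a list (0 <= pct <= 100)."""
    s = sorted(values)
    k = (len(s) - 1) * pct / 100.0
    f = int(k)
    c = min(f + 1, len(s) - 1)
    return s[f] + (s[c] - s[f]) * (k - f)

def extreme_indices(daily_mm, pct=95):
    """Three illustrative indices from a daily rainfall series (mm):
    count of extreme days, their mean intensity, and their share of
    the total rainfall. 'Extreme' = above the pct-th wet-day value."""
    wet = [x for x in daily_mm if x > 0]
    thresh = percentile(wet, pct)
    extreme = [x for x in daily_mm if x > thresh]
    return {
        "extreme_frequency": len(extreme),
        "extreme_intensity": (sum(extreme) / len(extreme)) if extreme else 0.0,
        "fraction_from_extremes": sum(extreme) / sum(daily_mm),
    }
```

Years in which `extreme_frequency` and `extreme_intensity` are jointly high correspond to the abstract's high-rainfall years with more frequent and intense extremes.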

  18. The ERA-EDTA today and tomorrow: a progress document by the ERA-EDTA Council.

    Science.gov (United States)

    Zoccali, Carmine; Arici, Mustafa; Blankestijn, Peter J; Bruchfeld, Annette; Capasso, Giovambattista; Fliser, Danilo; Fouque, Denis; Goumenos, Dimitrios; Ketteler, Markus; Malyszko, Jolanta; Massy, Ziad; Rychlík, Ivan; Spasovski, Goce

    2018-05-23

Scientific societies are increasingly seen as central to the advancement of information sharing and collaboration among scientists and clinical investigators for the progress of medical research and the promotion of education, professional competence, integrity and quality studies. To more effectively serve the practicing nephrologists and investigators dedicated to renal science, the Council of the European Renal Association and European Dialysis and Transplantation Association (ERA-EDTA) reorganized and integrated the various activities of the society into two branches, the Clinical Nephrology Governance branch and the Renal Science branch. New affordable initiatives to promote research, education and professional development and to advocate for the recognition of chronic kidney disease as a major public health issue at the European level will be put in place and/or potentiated in the new organizational frame. Educational initiatives will be coupled with Continuous Professional Development and, starting from 2019, 14 Education & Continuous Professional Development courses will be held covering the full range of knowledge areas of modern nephrology. Consolidation and development is the short- and medium-term mantra of the ERA-EDTA. The society has a rich portfolio of successful activities and brilliant, creative scientists among its members. Integrating the various activities of the ERA-EDTA and treasuring the expertise and wisdom of its most accomplished members will facilitate collaborative research, education and its public impact at large.

  19. Recommended Capacities for Educational Leadership: Pre-Reform Era Scholars versus Reform-Era Scholars versus National Standards

    Science.gov (United States)

    Gordon, Stephen P.; Taylor-Backor, Karen; Croteau, Susan

    2017-01-01

    We reviewed the scholarship on capacities for educational leadership for the past decade of the pre-reform era (1976-1985), as well as a recent decade of the reform era (2005-2015), and compared scholarship from both decades with the current Professional Standards for Educational Leaders. We found that scholars in the past decade of the pre-reform…

  20. Extend Course for Product Designer in Digital Mobile Era

    Science.gov (United States)

    CHAO, Fang-Lin; LIU, Tzu-Heng; HUANG, Xian-Chun

    2017-12-01

Product design refers to a system of processes from confirming a product's specifications to defining its structure. Form, technology, and needs must be considered simultaneously to ensure quality. In recent years, with the advancement of smartphone technology, many products are connected with apps, and designers cannot exclude themselves from this new wave. In this article, the design of a household hydroponic product is used as an example to show the close relationship between digital mobile technology and product design in the contemporary world. Regularly measuring the amount of liquid to be added is difficult for a consumer without professional experience. To facilitate the introduction of small-scale aquaculture systems into the home, we proposed sensor hardware combined with app software that measures EC and pH values and transmits them to the phone. The app can calculate and display the amount to be added and control dosing through a Bluetooth connection. The physical design needs to take into account the connection between the electronic parts and the circuit board, as well as interface operation. Thus, not only the form of the product but also the user interface has to be integrated to fully convey the product's quality. The authors also reflect on the need to adjust interdisciplinary courses for the changing digital mobile era and, within the current curriculum structure, propose a possible teaching approach for extending students' capabilities.

  1. Management of the mangled extremity

    NARCIS (Netherlands)

    Prasarn, Mark L.; Helfet, David L.; Kloen, Peter

    2012-01-01

    The management of a mangled extremity continues to be a matter of debate. With modern advances in trauma resuscitation, microvascular tissue transfer, and fracture fixation, severe traumatic extremity injuries that would historically have been amputated are often salvaged. Even if preserving a

  2. A decade of weather extremes

    NARCIS (Netherlands)

    Coumou, Dim; Rahmstorf, Stefan

    The ostensibly large number of recent extreme weather events has triggered intensive discussions, both in- and outside the scientific community, on whether they are related to global warming. Here, we review the evidence and argue that for some types of extreme - notably heatwaves, but also

  3. The NASA Energy and Water Cycle Extreme (NEWSE) Integration Project

    Science.gov (United States)

    House, P. R.; Lapenta, W.; Schiffer, R.

    2008-01-01

Skillful predictions of water and energy cycle extremes (flood and drought) are elusive. To better understand the mechanisms responsible for water and energy extremes, and to make decisive progress in predicting these extremes, the collaborative NASA Energy and Water cycle Extremes (NEWSE) Integration Project is studying these extremes in the U.S. Southern Great Plains (SGP) during 2006-2007, including their relationships with continental and global scale processes, and assessing their predictability on multiple space and time scales. It is our hypothesis that an integrative analysis of observed extremes, reflecting the current understanding of the role of SST and soil moisture variability influences on atmospheric heating and the forcing of planetary waves, and incorporating recently available global and regional hydro-meteorological datasets (i.e., precipitation, water vapor, clouds, etc.) in conjunction with advances in data assimilation, can lead to new insights into the factors that lead to persistent drought and flooding. We will show initial results of this project, whose goals are: improved definition, attribution and prediction on sub-seasonal to interannual time scales; improved understanding of the mechanisms of decadal drought and its predictability, including the impacts of SST variability and deep soil moisture variability; improved monitoring/attribution, with transition to applications; and a bridging of the gap between hydrological forecasts and stakeholders (utilization of probabilistic forecasts, education, forecast interpretation for different sectors, assessment of uncertainties for different sectors, etc.).

  4. Conservative treatment of soft tissue sarcomas of the extremities. Functional evaluation with LENT-SOMA scales and the Enneking score; Traitement conservateur des sarcomes des tissus mous des membres. Evaluation du resultat fonctionnel selon l'echelle LENT-SOMA et le score de Enneking

    Energy Technology Data Exchange (ETDEWEB)

    Tawfiq, N.; Lagarde, P.; Thomas, L.; Kantor, G. [Institut Bergonie, Centre Regional de Lutte Contre le Cancer, Service de Radiotherapie, 33 - Bordeaux (France); Stockle, E. [Institut Bergonie, Centre Regional de Lutte Contre le Cancer, Service de Chirurgie, 33 - Bordeaux (France); Bui, B.N. [Institut Bergonie, Centre Regional de Lutte Contre le Cancer, Service d' Oncologie Medicale, 33 - Bordeaux (France)

    2000-12-01

Objective. - The aim of this prospective study is to assess the feasibility of late-effects assessment by the LENT-SOMA scales after conservative treatment of soft tissue sarcomas of the extremities, and to compare it with functional evaluation by the Enneking score. Patients and methods. - During systematic follow-up consultations, a series of 32 consecutive patients was evaluated for late effects by the LENT-SOMA scales and for functional results by the Enneking score. The median time after treatment was 65 months. Treatment consisted of conservative surgery (all cases) followed by radiation therapy (29 cases), often combined with adjuvant therapy (12 of 14 cases received concomitant radio-chemotherapy). The assessment of toxicity was retrospective for acute effects and prospective for the following late tissue damage: skin/subcutaneous tissues, muscles/soft tissues and peripheral nerves. Results. - According to the Enneking score, the global score for the overall series was high (24/30) despite four scores of zero for psychological acceptance. According to the LENT-SOMA scales, a low rate of severe sequelae (grade 3-4) was observed. The occurrence of high-grade sequelae and their functional consequences were not correlated with quality of excision, radiotherapy dose or use of concomitant chemotherapy. A complementarity was observed between certain factors of the Enneking score and some criteria of the LENT-SOMA scales, especially those for muscles/soft tissues. Conclusion. - The good quality of the functional results was confirmed by the two main scoring systems for late normal tissue damage. The routine use of LENT-SOMA seems to be more time-consuming than the Enneking score (mean scoring time: 13 versus 5 minutes). The LENT-SOMA scales are aimed at a detailed description of late toxicity and sequelae, while the Enneking score provides a more global evaluation, including the psychological acceptance of treatment. The late effects assessment by the LENT

  5. Cause and Properties of the Extreme Space Weather Event of 2012 July 23

    Science.gov (United States)

    Liu, Y. D.; Luhmann, J. G.; Kajdic, P.; Kilpua, E.; Lugaz, N.; Nitta, N.; Lavraud, B.; Bale, S. D.; Farrugia, C. J.; Galvin, A. B.

    2013-12-01

Extreme space weather refers to extreme conditions in space driven by solar eruptions and subsequent disturbances in interplanetary space, otherwise called solar superstorms. Understanding extreme space weather events is becoming ever more vital, as the vulnerability of our society and its technological infrastructure to space weather has increased dramatically. Instances of extreme space weather, however, are very rare by definition and therefore are difficult to study. Here we report and investigate an extreme event, which occurred on 2012 July 23 with a maximum speed of about 3050 km/s near the Sun. This event, with complete modern remote sensing and in situ observations from multiple vantage points, provides an unprecedented opportunity to study the cause and consequences of extreme space weather. It produced a superfast shock with a peak solar wind speed of 2246 km/s and a superstrong magnetic cloud with a peak magnetic field of 109 nT observed near 1 AU at STEREO A. Had this event hit the Earth, the record solar wind speed and magnetic field would have produced a record geomagnetic storm for the space era, with a minimum Dst of -1200 to -600 nT. We demonstrate how successive coronal mass ejections (CMEs) can be enhanced into a solar superstorm as they interact en route from the Sun to 1 AU. These results not only provide a benchmark for studies of extreme space weather, but also present a new view of how an extreme space weather event can be generated from ordinary solar eruptions.

  6. The climate of the Common Era off the Iberian Peninsula

    Science.gov (United States)

    Abrantes, Fátima; Rodrigues, Teresa; Rufino, Marta; Salgueiro, Emília; Oliveira, Dulce; Gomes, Sandra; Oliveira, Paulo; Costa, Ana; Mil-Homens, Mário; Drago, Teresa; Naughton, Filipa

    2017-12-01

The Mediterranean region is a climate hot spot, sensitive not only to global warming but also to water availability. In this work we document major temperature and precipitation changes in the Iberian Peninsula and margin during the last 2000 years and propose an interplay of North Atlantic internal variability with three atmospheric circulation modes (ACMs): the North Atlantic Oscillation (NAO), the East Atlantic (EA) and the Scandinavia (SCAND) patterns, to explain the detected climate variability. We present reconstructions of sea surface temperature (SST, derived from alkenones) and on-land precipitation (estimated from higher-plant n-alkanes and pollen data) in sedimentary sequences recovered along the Iberian Margin between the south of Portugal (Algarve) and the northwest of Spain (Galiza) (36 to 42° N). A clear long-term cooling trend, from 0 CE to the beginning of the 20th century, emerges in all SST records and is considered to be a reflection of the decrease in Northern Hemisphere summer insolation that began after the Holocene optimum. Multi-decadal/centennial SST variability follows other records from Spain, Europe and the Northern Hemisphere. Warm SSTs throughout the first 1300 years encompass the Roman period (RP), the Dark Ages (DA) and the Medieval Climate Anomaly (MCA). A cooling initiated at 1300 CE leads to 4 centuries of colder SSTs contemporary with the Little Ice Age (LIA), while a climate warming at 1800 CE marks the beginning of the modern/Industrial Era. Novel results include two distinct phases in the MCA: an early period (900-1100) characterized by intense precipitation/flooding and warm winters but a cooler spring-fall season, attributed to the interplay of internal oceanic variability with a positive phase of the three modes of atmospheric circulation (NAO, EA and SCAND). The late MCA is marked by cooler and relatively drier winters and a warmer spring-fall season, consistent with a shift to a negative mode of the SCAND.
The Industrial Era

  7. Implementation of the Spanish ERAS program in bariatric surgery.

    Science.gov (United States)

    Ruiz-Tovar, Jaime; Muñoz, José Luis; Royo, Pablo; Duran, Manuel; Redondo, Elisabeth; Ramirez, Jose Manuel

    2018-03-08

The essence of Enhanced Recovery After Surgery (ERAS) programs is the multimodal approach, and many authors have demonstrated safety and feasibility in fast-track bariatric surgery. Following this concept, a multidisciplinary ERAS program for bariatric surgery has been developed by the Spanish Fast Track Group (ERAS Spain). The aim of this study was to analyze the initial implementation of this Spanish National ERAS protocol in bariatric surgery, comparing it with a historical cohort receiving standard care. A multi-centric prospective study was performed, including 233 consecutive patients undergoing bariatric surgery during 2015 and following the ERAS protocol. It was compared with a historical cohort of 286 patients who underwent bariatric surgery at the same institutions between 2013 and 2014 and received standard care. Compliance with the protocol, morbidity, mortality, hospital stay and readmission were evaluated. The bariatric techniques performed were Roux-en-Y gastric bypass and sleeve gastrectomy. There were no significant differences in complications, mortality and readmission. Postoperative pain and hospital stay were significantly lower in the ERAS group. Total compliance with the protocol was 80%. The Spanish National ERAS protocol is safe, obtaining results similar to standard care in terms of complications, reoperations, mortality and readmissions, and it is associated with less postoperative pain and earlier hospital discharge.

  8. Regional climate change trends and uncertainty analysis using extreme indices: A case study of Hamilton, Canada

    OpenAIRE

    Razavi, Tara; Switzman, Harris; Arain, Altaf; Coulibaly, Paulin

    2016-01-01

    This study aims to provide a deeper understanding of the level of uncertainty associated with the development of extreme weather frequency and intensity indices at the local scale. Several different global climate models, downscaling methods, and emission scenarios were used to develop extreme temperature and precipitation indices at the local scale in the Hamilton region, Ontario, Canada. Uncertainty associated with historical and future trends in extreme indices and future climate projectio...

  9. A Decade-Long European-Scale Convection-Resolving Climate Simulation on GPUs

    Science.gov (United States)

    Leutwyler, D.; Fuhrer, O.; Ban, N.; Lapillonne, X.; Lüthi, D.; Schar, C.

    2016-12-01

Convection-resolving models have proven to be very useful tools in numerical weather prediction and in climate research. However, due to their extremely demanding computational requirements, they have so far been limited to short simulations and/or small computational domains. Innovations in the supercomputing domain have led to new supercomputer designs that combine conventional multi-core CPUs with accelerators such as graphics processing units (GPUs). One of the first atmospheric models fully ported to GPUs is the Consortium for Small-Scale Modeling weather and climate model COSMO. This new version allows us to expand the simulation domain to areas spanning continents and the time period up to one decade. We present results from a decade-long, convection-resolving climate simulation over Europe using the GPU-enabled COSMO version on a computational domain with 1536x1536x60 gridpoints. The simulation is driven by the ERA-Interim reanalysis. The results illustrate how the approach allows for the representation of interactions between synoptic-scale and meso-scale atmospheric circulations at scales ranging from 1000 to 10 km. We discuss some of the advantages and prospects of using GPUs, and focus on the performance of the convection-resolving modeling approach on the European scale. Specifically, we investigate the organization of convective clouds and validate hourly rainfall distributions against various high-resolution data sets.

  10. Australia's Unprecedented Future Temperature Extremes Under Paris Limits to Warming

    Science.gov (United States)

    Lewis, Sophie C.; King, Andrew D.; Mitchell, Daniel M.

    2017-10-01

Record-breaking temperatures can detrimentally impact ecosystems, infrastructure, and human health. Previous studies show that climate change has influenced some observed extremes, which are expected to become more frequent under enhanced future warming. Understanding the magnitude, as well as frequency, of such future extremes is critical for limiting detrimental impacts. We focus on temperature changes in Australian regions, including over a major coral reef-building area, and assess the potential magnitude of future extreme temperatures under Paris Agreement global warming targets (1.5°C and 2°C). Under these limits to global mean warming, we determine a set of projected high-magnitude unprecedented Australian temperature extremes. These include extremes unexpected based on observational temperatures, including current record-breaking events. For example, while the difference in global-average warming between the hottest Australian summer and the 2°C Paris target is 1.1°C, extremes of 2.4°C above the observed summer record are simulated. This represents a more than doubling of the magnitude of extremes compared with the global mean change, and such temperatures are unexpected based on the observed record alone. Projected extremes do not necessarily scale linearly with mean global warming, and this effect demonstrates the significant potential benefit of limiting warming to 1.5°C rather than 2°C or warmer.

  11. Portable upper extremity robotics is as efficacious as upper extremity rehabilitative therapy: a randomized controlled pilot trial.

    Science.gov (United States)

    Page, Stephen J; Hill, Valerie; White, Susan

    2013-06-01

    To compare the efficacy of a repetitive task-specific practice regimen integrating a portable, electromyography-controlled brace called the 'Myomo' versus usual care repetitive task-specific practice in subjects with chronic, moderate upper extremity impairment. Sixteen subjects (7 males; mean age 57.0 ± 11.02 years; mean time post stroke 75.0 ± 87.63 months; 5 left-sided strokes) exhibiting chronic, stable, moderate upper extremity impairment. Subjects were administered repetitive task-specific practice in which they participated in valued, functional tasks using their paretic upper extremities. Both groups were supervised by a therapist and were administered therapy targeting their paretic upper extremities that was 30 minutes in duration, occurring 3 days/week for eight weeks. One group participated in repetitive task-specific practice entirely while wearing the portable robotic, while the other performed the same activity regimen manually. The upper extremity Fugl-Meyer, Canadian Occupational Performance Measure and Stroke Impact Scale were administered on two occasions before intervention and once after intervention. After intervention, groups exhibited nearly identical Fugl-Meyer score increases of ≈2.1 points; the group using robotics exhibited larger score changes on all but one of the Canadian Occupational Performance Measure and Stroke Impact Scale subscales, including a 12.5-point increase on the Stroke Impact Scale recovery subscale. Findings suggest that therapist-supervised repetitive task-specific practice integrating robotics is as efficacious as manual practice in subjects with moderate upper extremity impairment.

  12. Critical exponents of extremal Kerr perturbations

    Science.gov (United States)

    Gralla, Samuel E.; Zimmerman, Peter

    2018-05-01

    We show that scalar, electromagnetic, and gravitational perturbations of extremal Kerr black holes are asymptotically self-similar under the near-horizon, late-time scaling symmetry of the background metric. This accounts for the Aretakis instability (growth of transverse derivatives) as a critical phenomenon associated with the emergent symmetry. We compute the critical exponent of each mode, which is equivalent to its decay rate. It follows from symmetry arguments that, despite the growth of transverse derivatives, all generally covariant scalar quantities decay to zero.

  13. (When and where) Do extreme climate events trigger extreme ecosystem responses? - Development and initial results of a holistic analysis framework

    Science.gov (United States)

    Hauber, Eva K.; Donner, Reik V.

    2015-04-01

    a seasonal cycle for each quantile of the distribution, which can be used for a fully data-adaptive definition of extremes as exceedances above this time-dependent quantile function. (2) Having thus identified the extreme events, their distribution is analyzed in both space and time. Following a procedure recently proposed by Lloyd-Hughes (2012) and further exploited by Zscheischler et al. (2013), extremes observed at neighboring points in space and time are considered to form connected sets. Investigating the size distribution of these sets provides novel insights into the development and dynamical characteristics of spatio-temporally extended climate and ecosystem extremes. (3) Finally, the timing of such spatio-temporally extended extremes in different climatic as well as ecological variables is tested pairwise to rule out that co-occurrences of extremes have emerged solely due to chance. For this purpose, the recently developed framework of coincidence analysis (Donges et al., 2011; Rammig et al. 2014) is applied. The corresponding analysis allows identifying potential causal linkages between climatic extremes and extreme ecosystem responses and, thus, to study their mechanisms and spatial as well as seasonal distribution in great detail. In this work, the described method is exemplified by using different climate data from the ERA-Interim reanalysis as well as remote sensing-based land surface temperature data. References: Donges et al., PNAS, 108, 20422, 2011 Lloyd-Hughes, Int. J. Climatol., 32, 406, 2012 Rammig et al., Biogeosc. Disc., 11, 2537, 2014 Zscheischler et al., Ecol. Inform., 15, 66, 2013
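Step (1) above, defining extremes as exceedances above a time-dependent quantile, can be sketched minimally as follows. A per-calendar-month quantile stands in for the paper's smoothly estimated seasonal cycle, and all names are illustrative.

```python
from collections import defaultdict

def monthly_quantile(series, q=0.95):
    """series: list of (month, value) pairs.
    Returns {month: q-quantile of that month's values} -- a crude,
    unsmoothed stand-in for a seasonal cycle of the quantile."""
    by_month = defaultdict(list)
    for month, value in series:
        by_month[month].append(value)
    out = {}
    for month, vals in by_month.items():
        s = sorted(vals)
        k = (len(s) - 1) * q
        f = int(k)
        c = min(f + 1, len(s) - 1)
        out[month] = s[f] + (s[c] - s[f]) * (k - f)
    return out

def flag_extremes(series, q=0.95):
    """Mark each observation exceeding its own month's q-quantile."""
    thresh = monthly_quantile(series, q)
    return [value > thresh[month] for month, value in series]
```

Because each month carries its own threshold, a mild winter day can count as extreme even if it is cooler than an ordinary summer day, which is the point of the data-adaptive definition.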

  14. Spatial dependence of extreme rainfall

    Science.gov (United States)

    Radi, Noor Fadhilah Ahmad; Zakaria, Roslinazairimah; Satari, Siti Zanariah; Azman, Muhammad Az-zuhri

    2017-05-01

This study aims to model the spatial extreme daily rainfall process using max-stable models, which capture the dependence structure of the spatial properties of extreme rainfall. Three max-stable models are considered, namely the Smith, Schlather and Brown-Resnick models. The methods are applied to 12 selected rainfall stations in Kelantan, Malaysia. Most of the extreme rainfall data occur during the wet season, from October to December of 1971 to 2012; this period is chosen to ensure the available data satisfy the assumption of stationarity. The dependence parameters, including the range and smoothness, are estimated using a composite likelihood approach. Then, a bootstrap approach is applied to generate synthetic extreme rainfall data for all models using the estimated dependence parameters. The goodness of fit between the observed extreme rainfall and the synthetic data is assessed using the composite likelihood information criterion (CLIC). Results show that the Schlather model is the best, followed by the Brown-Resnick and Smith models, based on the smallest CLIC value. Thus, the max-stable model is suitable for modelling extreme rainfall in Kelantan. Studying spatial dependence in extreme rainfall modelling is important to reduce the uncertainties of the point estimates for the tail index; if the spatial dependence is estimated individually, the uncertainties will be large. Furthermore, when the joint return level is of interest, taking the spatial dependence properties into account will improve the estimation process.
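As a hedged illustration of why spatial dependence of extremes matters, the sketch below computes a simple, model-free pairwise measure: the probability that two stations jointly exceed their own high quantiles. This is not the max-stable fitting or CLIC-based model selection used in the study, only a first diagnostic one might run on paired station records.

```python
def quantile(values, q):
    """Linear-interpolated q-quantile (0 <= q <= 1) of a list."""
    s = sorted(values)
    k = (len(s) - 1) * q
    f = int(k)
    c = min(f + 1, len(s) - 1)
    return s[f] + (s[c] - s[f]) * (k - f)

def joint_exceedance(x, y, q=0.95):
    """Fraction of days on which both stations exceed their own
    q-quantile: about (1-q)**2 if extremes are independent,
    approaching (1-q) under full dependence."""
    tx, ty = quantile(x, q), quantile(y, q)
    hits = sum(1 for a, b in zip(x, y) if a > tx and b > ty)
    return hits / len(x)
```

Comparing the observed value against the independence baseline `(1-q)**2` gives a rough sense of how strongly extremes co-occur between a station pair before any max-stable model is fitted.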

  15. Astrometric surveys in the Gaia era

    Science.gov (United States)

    Zacharias, Norbert

    2018-04-01

The Gaia first data release (DR1) already provides an almost error-free optical reference frame at the milli-arcsecond (mas) level, allowing significantly better calibration of ground-based astrometric data than ever before. Gaia DR1 provides positions, proper motions and trigonometric parallaxes for just over 2 million stars in the Tycho-2 catalog. For over 1.1 billion additional stars, DR1 gives positions. Proper motions for these mainly fainter stars (G >= 11.5) are currently provided by several new projects that combine earlier-epoch ground-based observations with Gaia DR1 positions. These data are very helpful in the interim period but will become obsolete with the second Gaia data release (DR2), expected in April 2018. The era of traditional, ground-based, wide-field astrometry with the goal of providing accurate reference stars has come to an end. Future ground-based astrometry will fill in some gaps (very bright stars, observations needed at many or specific epochs) and will mainly go fainter than the Gaia limit, as in the Pan-STARRS and upcoming LSST surveys.

  16. ETIKA DRIYARKARA DAN RELEVANSINYA DI ERA POSTMODERN

    Directory of Open Access Journals (Sweden)

    Banin Diar Sukmono

    2013-04-01

Driyarkara is an Indonesian philosopher with an original way of thinking. This paper aims to explore and analyze his ideas on morality and decency. According to Driyarkara, morality is a human need: without it, humanity would be in chaos. In his view, morality is a consequence of consciousness; therefore, holding a high moral standard is part of human nature. Furthermore, conscience can serve as a moral standard for determining right and wrong as long as the conscience itself has not been "violated". When making a moral judgement, every human has to use his own reason, as reflected in Driyarkara's dialectic of the purity of reason. For him, all moral efforts show that humans always strive for the perfection that is identified with God. Driyarkara's ethics can be classified as deontological, but it is a deontological ethics with theological, humanist-naturalist and axiological dimensions, so that it can be described as teleological as well. In the postmodern era, where morality has become blurred, Driyarkara's ethics, being both teleological and deontological, stands as a solid moral alternative.

  17. CERN moves into the LHC era

    CERN Multimedia

    2001-01-01

Dr Hans Eschelbacher (on the left), President of the CERN Council for the last three years, hands over to his successor Maurice Bourquin. The CERN Council, where the representatives of the 20 Member States of the Organization decide on scientific programmes and financial resources, held its 116th session on 15 December under the chairmanship of Dr. Hans C. Eschelbacher (DE). 'Le Roi est mort. Vive le Roi!' ('The King is dead. Long live the King!') The Large Electron Positron Collider (LEP) era has ended and CERN's future is the Large Hadron Collider (LHC), stated Director General, Prof. Luciano Maiani. He opened his report to Council with a 'homage to LEP', which reached the end of its career during 2000 and is now being dismantled to make way for CERN's next major machine, the LHC collider, in the same 27-kilometre tunnel. The strong indications of a Higgs boson at 115 GeV found during the year were the culmination of LEP's long and distinguished physics career, during which the machine opened up new regimes of precision physics, involvi...

  18. Flavour in the era of the LHC

    CERN Multimedia

    2006-01-01

    The 4th meeting of the 'Flavour in the era of the LHC' workshop will take place at CERN on 9-11 October, 2006. The goal of this workshop is to outline and document a programme for flavour physics for the next decade, addressing in particular the complementarity and synergy between the discoveries we expect to emerge from the LHC and the potential for accurate measurements of future flavour factories. Over 150 physicists will join in the discussions of the three working groups dedicated to 'Flavour physics at high Q', 'B/D/K decays' and 'Flavour in the lepton sector, EDM's, g-2, etc'. The previous meetings took place in November 2005, and in February and May this year. In addition to the working group sessions, a special miniworkshop dedicated to future prospects for electric dipole moment (EDM) searches and g-2 measurements will be held on 9-10 October. Sensitive EDM and g-2 experiments probe physics in an integral way, and in many cases their physics reach is much higher than the spectrometer searches at th...

  20. A Generalized Framework for Non-Stationary Extreme Value Analysis

    Science.gov (United States)

    Ragno, E.; Cheng, L.; Sadegh, M.; AghaKouchak, A.

    2017-12-01

    Empirical trends in climate variables, including precipitation, temperature, and snow-water equivalent at regional to continental scales, are evidence of changes in climate over time. Evolving climate conditions and human activity-related factors such as urbanization and population growth can exert further changes in weather and climate extremes. As a result, the scientific community faces an increasing demand for updated appraisals of time-varying climate extremes. The purpose of this study is to offer a robust and flexible statistical tool for non-stationary extreme value analysis which can better characterize the severity and likelihood of extreme climatic variables. This is critical to ensure a more resilient environment in a changing climate. Following the positive feedback on the first version of the Non-Stationary Extreme Value Analysis (NEVA) toolbox by Cheng et al. (2014), we present an improved version, NEVA2.0. The upgraded version builds upon a newly developed hybrid-evolution Markov Chain Monte Carlo (MCMC) approach for numerical parameter estimation and uncertainty assessment. This addition leads to more robust uncertainty estimates of return levels, return periods, and risks of climatic extremes under both stationary and non-stationary assumptions. Moreover, NEVA2.0 is flexible in incorporating any user-specified covariate other than the default time covariate (e.g., CO2 emissions, large-scale climatic oscillation patterns). This new feature allows users to examine non-stationarity of extremes induced by the physical conditions that underlie the extreme events (e.g., antecedent soil moisture deficit, large-scale climatic teleconnections, urbanization). In addition, the new version offers an option to generate stationary and/or non-stationary rainfall Intensity-Duration-Frequency (IDF) curves, which are widely used for risk assessment and infrastructure design. Finally, a Graphical User Interface (GUI) of the package is provided, making NEVA
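The central idea of the abstract above — that return levels drift once distribution parameters are allowed to vary with a covariate — can be sketched in a few lines. This is not NEVA itself (NEVA fits full GEV models via Bayesian MCMC); it is a simplified closed-form Gumbel illustration with a linear time covariate, and all parameter values below are hypothetical:

```python
import math

# Sketch: 100-year return level under a non-stationary location parameter
# mu(t) = mu0 + mu1 * t. Gumbel used for simplicity instead of full GEV;
# mu0, mu1, sigma are made-up illustrative values, not fitted estimates.
mu0, mu1, sigma = 20.0, 0.05, 5.0

def gumbel_return_level(mu, sigma, T):
    """Level exceeded on average once every T years: mu - sigma*ln(-ln(1 - 1/T))."""
    return mu - sigma * math.log(-math.log(1.0 - 1.0 / T))

# Under non-stationarity the design level drifts with the covariate:
rl_1979 = gumbel_return_level(mu0, sigma, 100)             # start of record
rl_2010 = gumbel_return_level(mu0 + mu1 * 31, sigma, 100)  # 31 years later

print(f"100-year return level, 1979: {rl_1979:.1f}")
print(f"100-year return level, 2010: {rl_2010:.1f}")
```

Because only the location parameter shifts, the two return levels differ by exactly mu1 × 31; a stationary analysis would report a single, time-invariant level in between.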

  1. Storm-Tracks in ERA-40 and ERA-Interim Reanalyses

    Science.gov (United States)

    Liberato, M. L. R.; Trigo, I. F.; Trigo, R. M.

    2009-04-01

    Extratropical cyclones, their dominant paths, frequency and intensity have long been the object of climatological studies. The analysis of cyclone characteristics for the Euro-Atlantic sector (85°W-70°E; 20°N-75°N) presented here is based on the cyclone detecting and tracking algorithm first developed for the Mediterranean region (Trigo et al., 1999, 2002) and recently extended to a larger Euro-Atlantic region (Trigo, 2006). The objective methodology, which identifies and follows individual lows (Trigo et al. 1999), is applied to 6-hourly geopotential data at 1000-hPa from two reanalyses datasets provided by the European Centre for Medium-Range Weather Forecasts (ECMWF): ERA-40 and ERA-Interim reanalyses. Two storm-track databases are built over the Northern Atlantic European area, spanning the common available extended winter seasons from October 1989 to March 2002. Although relatively short, this common period allows a comparison of systems represented in reanalyses datasets with distinct horizontal resolutions (T106 and T255, respectively). This exercise is mostly focused on the key areas of cyclone formation and dissipation and main cyclone characteristics for the Euro-Atlantic sector. Trigo, I. F., T. D. Davies, and G. R. Bigg, 1999: Objective climatology of cyclones in the Mediterranean region. J. Climate, 12, 1685-1696. Trigo I. F., G. R. Bigg and T. D. Davies, 2002: Climatology of Cyclogenesis Mechanisms in the Mediterranean. Mon. Weather Rev. 130, 549-569. Trigo, I. F. 2006: Climatology and Interannual Variability of Storm-Tracks in the Euro-Atlantic sector: a comparison between ERA-40 and NCEP/NCAR Reanalyses. Clim. Dyn. DOI 10.1007/s00382-005-0065-9.
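The identification step of such objective tracking algorithms — locating candidate low centres in a gridded 1000-hPa geopotential field — can be sketched as follows. This is a toy illustration, not the Trigo et al. code; real trackers add depth and gradient thresholds and link the detected centres across 6-hourly time steps:

```python
def local_minima(z):
    """Return (row, col) indices of interior grid points lower than all eight
    neighbours -- candidate cyclone centres in a geopotential field."""
    lows = []
    for j in range(1, len(z) - 1):
        for i in range(1, len(z[0]) - 1):
            neighbours = [z[j + dj][i + di]
                          for dj in (-1, 0, 1) for di in (-1, 0, 1)
                          if (dj, di) != (0, 0)]
            if z[j][i] < min(neighbours):
                lows.append((j, i))
    return lows

# Synthetic 5x5 geopotential field (made-up values) with one depression:
field = [[500, 500, 500, 500, 500],
         [500, 480, 470, 480, 500],
         [500, 470, 440, 470, 500],
         [500, 480, 470, 480, 500],
         [500, 500, 500, 500, 500]]
print(local_minima(field))  # → [(2, 2)]
```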

  2. Consistent scalar and tensor perturbation power spectra in single fluid matter bounce with dark energy era

    Science.gov (United States)

    Bacalhau, Anna Paula; Pinto-Neto, Nelson; Vitenti, Sandro Dias Pinto

    2018-04-01

    We investigate cosmological scenarios containing one canonical scalar field with an exponential potential in the context of bouncing models, in which the bounce happens due to quantum cosmological effects. The only possible bouncing solutions in this scenario (discarding an infinitely fine-tuned exception) must have one and only one dark energy phase, occurring either in the contracting era or in the expanding era. Hence, these bounce solutions are necessarily asymmetric. Naturally, the more convenient solution is the one in which the dark energy phase happens in the expanding era, in order to be a possible explanation for the current accelerated expansion indicated by cosmological observations. In this case, one has the picture of a Universe undergoing a classical dust contraction from very large scales, the initial repeller of the model, moving to a classical stiff-matter contraction near the singularity, which is avoided due to the quantum bounce. The Universe is then launched to a dark energy era, after passing through radiation- and dust-dominated phases, finally returning to the dust expanding phase, the final attractor of the model. We calculate the spectral indices and amplitudes of scalar and tensor perturbations numerically, considering the whole history of the model, including the bounce phase itself, without making any approximation nor using any matching condition on the perturbations. As the background model is necessarily dust dominated in the far past, the usual adiabatic vacuum initial conditions can be easily imposed in this era. Hence, this is a cosmological model in which the presence of dark energy behavior in the Universe does not turn the usual vacuum initial conditions prescription for cosmological perturbation in bouncing models problematic. Scalar and tensor perturbations end up being almost scale invariant, as expected. The background parameters can be adjusted, without fine-tunings, to yield the observed amplitude for scalar

  3. Problematika Pendidikan Islam Sebagai Sub Sistem Pendidikan Nasional di Era Global

    Directory of Open Access Journals (Sweden)

    Moch. Miftachul Choiri

    2011-11-01

    Full Text Available Globalization, which has two sides like one coin, brings both positive and negative impacts. A globalization inspired not only by capitalism but also by pragmatism has visibly affected education in Indonesia: the role of education has become unfamiliar and remote from society's needs, as globalization raises issues such as competence, standardization, and commercialization. Facing this era, what should Islamic education do as a sub-system of national education? The Islamic school (madrasah), as a sub-system of Islamic education in Indonesia, gained extremely strong experience in facing such challenges during the late era of Dutch colonialism. The fact that the madrasah had not only autonomy but also intellectual resources proved that it could fulfill the needs of the Islamic community. These are cultural potentials which should be preserved and not abandoned for the sake of globalization. Globalization, as a process of cultural transformation, affects the whole world, and especially the practice of education in Indonesia. Anyone using science and technology can easily access global culture. This value-free global culture should be met by the transformation of values that Islamic scholars have long carried out in pesantren (Islamic boarding schools) and Islamic schools (madrasah). In other words, both pesantren and madrasah should not be trapped in the ideology of capitalism and should serve all people, because the paradigm of Islamic education differs from that of both capitalism and pragmatism. This article elaborates how Islamic education in Indonesia, especially the madrasah, should be positioned in the global era

  4. Meteorological Drivers of Extreme Air Pollution Events

    Science.gov (United States)

    Horton, D. E.; Schnell, J.; Callahan, C. W.; Suo, Y.

    2017-12-01

    The accumulation of pollutants in the near-surface atmosphere has been shown to have deleterious consequences for public health, agricultural productivity, and economic vitality. Natural and anthropogenic emissions of ozone and particulate matter can accumulate to hazardous concentrations when atmospheric conditions are favorable, and can reach extreme levels when such conditions persist. Favorable atmospheric conditions for pollutant accumulation include optimal temperatures for photochemical reaction rates, circulation patterns conducive to pollutant advection, and a lack of ventilation, dispersion, and scavenging in the local environment. Given our changing climate system and the dual ingredients of poor air quality - pollutants and the atmospheric conditions favorable to their accumulation - it is important to characterize recent changes in favorable meteorological conditions, and quantify their potential contribution to recent extreme air pollution events. To facilitate our characterization, this study employs the recently updated Schnell et al (2015) 1°×1° gridded observed surface ozone and particulate matter datasets for the period of 1998 to 2015, in conjunction with reanalysis and climate model simulation data. We identify extreme air pollution episodes in the observational record and assess the meteorological factors of primary support at local and synoptic scales. We then assess (i) the contribution of observed meteorological trends (if extant) to the magnitude of the event, (ii) the return interval of the meteorological event in the observational record, simulated historical climate, and simulated pre-industrial climate, as well as (iii) the probability of the observed meteorological trend in historical and pre-industrial climates.

  5. Explosion Source Phenomena Using Soviet, Test-Era, Waveform Data

    Energy Technology Data Exchange (ETDEWEB)

    Richards, Paul G.; Rautian, Tatyana G.; Khalturin, Vitaly I.; Phillips, W. Scott

    2006-04-12

    During the nuclear testing era, the former Soviet Union carried out extensive observations of underground nuclear explosions, recording both its own shots and those of foreign nuclear states. Between 1961 and 1989, the Soviet Complex Seismological Expedition deployed seismometers at time-varying subsets of over 150 sites to record explosions at regional distances from the Semipalatinsk and Lop Nor test sites and from the shot points of peaceful nuclear explosions. This data set included recordings from broadband, multi-channel ChISS seismometers that produced a series of narrow-band outputs, which could then be measured to perform spectral studies. [ChISS is the Russian abbreviation for multichannel spectral seismometer. In this instrument the signal from the seismometer is passed through a system of narrow bandpass filters and recorded on photo paper. ChISS instruments have from 8 to 16 channels in the frequency range from 100 sec to 40 Hz. We used data mostly from 7 channels, ranging from 0.08 to 5 Hz.] Quantitative, pre-digital-era investigations of high-frequency source scaling relied on this type of data. To augment data sets of Central Asian explosions, we have measured and compiled 537 ChISS coda envelopes for 124 events recorded at Talgar, Kazakhstan, at a distance of about 750 km from Semipalatinsk. Envelopes and calibration levels were measured manually from photo paper records for seven bands between 0.08 and 5 Hz. We obtained from 2 to 10 coda envelope measurements per event, depending on the event size and instrument magnification. Coda lengths varied from 250 to 1400 s. For small events, only bands between 0.6 and 2.5 Hz could be measured. Envelope levels were interpolated or extrapolated to 500 s, and we obtained the dependence of this quantity on magnitude. Coda Q was estimated and found to increase from 232 at 0.08 Hz to 1270 at 5 Hz. These relationships were used to construct an average scaling law of coda spectra for Semipalatinsk
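The reported frequency dependence of coda Q (232 at 0.08 Hz rising to 1270 at 5 Hz) can be summarized by the power-law form Q(f) = Q0·f^η commonly assumed in coda studies. The power-law form itself is our assumption, not stated in the abstract; a quick sketch recovers the implied exponent from the two endpoints:

```python
import math

# Fit Q(f) = Q0 * f**eta through the two reported endpoints
# (0.08 Hz, Q = 232) and (5 Hz, Q = 1270); power-law form is assumed.
f1, q1 = 0.08, 232.0
f2, q2 = 5.0, 1270.0

eta = math.log(q2 / q1) / math.log(f2 / f1)  # slope on log-log axes
q0 = q1 / f1 ** eta                          # implied Q at 1 Hz

print(f"eta ~ {eta:.2f}, Q(1 Hz) ~ {q0:.0f}")
```

The fitted curve passes through both reported values by construction; intermediate bands (e.g. 0.6–2.5 Hz) would be needed to test whether a single power law actually holds across the whole range.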

  6. Going Extreme For Small Solutions To Big Environmental Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Bagwell, Christopher E.

    2011-03-31

    This chapter is devoted to the scale, scope, and specific issues confronting the cleanup and long-term disposal of the U.S. nuclear legacy generated during WWII and the Cold War Era. The research reported is aimed at complex microbiological interactions with legacy waste materials generated by past nuclear production activities in the United States. The intended purpose of this research is to identify cost effective solutions to the specific problems (stability) and environmental challenges (fate, transport, exposure) in managing and detoxifying persistent contaminant species. Specifically addressed are high level waste microbiology and bacteria inhabiting plutonium laden soils in the unsaturated subsurface.

  7. Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    Energy Technology Data Exchange (ETDEWEB)

    Ghanem, Roger [Univ. of Southern California, Los Angeles, CA (United States)

    2017-04-18

    QUEST was a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, the Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University. The mission of QUEST is to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The USC effort centered on the development of reduced models and efficient algorithms for implementing various components of the UQ pipeline. USC personnel were responsible for the development of adaptive bases, adaptive quadrature, and reduced models to be used in estimation and inference.

  8. Exploring Asynchronous Many-Task Runtime Systems toward Extreme Scales

    Energy Technology Data Exchange (ETDEWEB)

    Knight, Samuel [O8953; Baker, Gavin Matthew; Gamell, Marc [Rutgers U; Hollman, David [08953; Sjaardema, Gregor [SNL; Kolla, Hemanth [SNL; Teranishi, Keita; Wilke, Jeremiah J; Slattengren, Nicole [SNL; Bennett, Janine Camille

    2015-10-01

    Major exascale computing reports indicate a number of software challenges to meet the dramatic change of system architectures in the near future. While a several-orders-of-magnitude increase in parallelism is the most commonly cited of these, hurdles also include performance heterogeneity of compute nodes across the system, increased imbalance between computational capacity and I/O capabilities, frequent system interrupts, and complex hardware architectures. Asynchronous task-parallel programming models show great promise in addressing these issues, but are not yet fully understood nor developed sufficiently for computational science and engineering application codes. We address these knowledge gaps through quantitative and qualitative exploration of leading candidate solutions in the context of engineering applications at Sandia. In this poster, we evaluate the MiniAero code ported to three leading candidate programming models (Charm++, Legion and UINTAH) to examine the feasibility of inserting new programming-model elements into an existing code base.

  9. PISCEES: Predicting Ice Sheet and Climate Evolution at Extreme Scales

    Energy Technology Data Exchange (ETDEWEB)

    Gunzburger, Max [Florida State Univ., Tallahassee, FL (United States); Ju, Lili [Univ. of South Carolina, Columbia, SC (United States)

    2017-12-09

    This report provides a summary of major accomplishments and activities obtained/performed by the Florida State University/University of South Carolina team participating in the PISCEES project. Major accomplishments in the development and application of the parallel 3D finite element Stokes dycore "FELIX-S" of the PISCEES project are discussed in some detail, and some representative test results and findings are also provided.

  10. On a global scale, marine recreational angling is an extremely ...

    African Journals Online (AJOL)

    spamer


  11. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Bremer, Peer-Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohr, Bernd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pascucci, Valerio [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gamblin, Todd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brunst, Holger [Dresden Univ. of Technology (Germany)

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  12. Scalable ParaView for Extreme Scale Visualization, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Petscale computing is leading to significant breakthroughs in a number of fields and is revolutionizing the way science is conducted. Data is not knowledge, however,...

  13. The new survivors and a new era for trauma research.

    Science.gov (United States)

    Brohi, Karim; Schreiber, Martin

    2017-07-01

    Karim Brohi and Martin Schreiber, Guest Editors of the Special Issue on Trauma, describe a new era in exploration of the biology of injury response and translation of new opportunities into clinical practice.

  14. The new survivors and a new era for trauma research

    OpenAIRE

    Brohi, Karim; Schreiber, Martin

    2017-01-01

    Karim Brohi and Martin Schreiber, Guest Editors of the Special Issue on Trauma, describe a new era in exploration of the biology of injury response and translation of new opportunities into clinical practice.

  15. CENET: Cost Efficiency in a New Era with new Technology

    Energy Technology Data Exchange (ETDEWEB)

    Karlsen, Jan E.; Lund, Bjoernar; Bos, Christian F.M.; Stokka, Sigmund

    1997-12-31

    This report relates to the CENET (Cost Efficiency in a New Era with new Technology) project on the oil and gas industry in Europe. Key objectives of the CENET project are: to determine the role of RTD (Research and Technology Development) in the European oil and gas industry towards improved value and cost reduction, with a particular focus on the means of developing offshore European marginal fields commercially; to identify RTD areas with the largest potential for improved value and cost reduction, and technological developments and advances which are likely to increase European competitiveness internationally; and to provide guidance to European governments when deciding RTD priorities. A new era with new technology concerns increased oil and gas potential during the next century, a new era with clean, safe and cost-efficient energy production, a new era with a new business structure, and globalization of the industry. 44 tabs., 5 figs., 23 tabs.

  16. Eraõigusliku juriidilise isiku organi liikmete õigussuhted [Legal relations of members of the organs of a private-law legal person] / Kalev Saare

    Index Scriptorium Estoniae

    Saare, Kalev, 1974-

    2010-01-01

    On the concept of the organ of a private-law legal person, illustrated by the public limited company (aktsiaselts) and the private limited company (osaühing); on the formation of the internal relationship of organ members, and on the principal obligations belonging to the content of that internal relationship as determined by the General Part of the Civil Code Act

  17. Racial Extremism in the Army

    National Research Council Canada - National Science Library

    Hudson, Walter M

    1998-01-01

    ... modern phenomenon of "skinheads." I then discuss the history of white supremacist extremism in the Army, culminating in the December, 1995 murders of two black civilians by soldiers assigned to the 82d Airborne Division at Fort Bragg, North Carolina...

  18. THE SZ EFFECT IN THE PLANCK ERA: ASTROPHYSICAL AND COSMOLOGICAL IMPACT

    Directory of Open Access Journals (Sweden)

    Sergio Colafrancesco

    2013-12-01

    Full Text Available The Sunyaev–Zel’dovich effect (SZE is a relevant probe for cosmology and particle astrophysics. The Planck Era marks a definite step forward in the use of this probe for astrophysics and cosmology. Astrophysical applications to galaxy clusters, galaxies, radiogalaxies and large-scale structures are discussed. Cosmological relevance for the Dark Energy equation of state, modified Gravity scenarios, Dark Matter search, cosmic magnetism and other cosmological applications is also reviewed. Future directions for the study of the SZE and its polarization are finally outlined.

  19. Disaster Risks Reduction for Extreme Natural Hazards

    Science.gov (United States)

    Plag, H.; Jules-Plag, S.

    2013-12-01

    Mega disasters associated with extreme natural hazards have the potential to escalate the global sustainability crisis and put us close to the boundaries of the safe operating space for humanity. Floods and droughts are major threats that potentially could reach planetary extent, particularly through secondary economic and social impacts. Earthquakes and tsunamis frequently cause disasters that eventually could exceed the immediate coping capacity of the global economy, particularly since we have built mega cities in hazardous areas that are now ready to be harvested by natural hazards. Unfortunately, the more we learn to cope with the relatively frequent hazards (50- to 100-year events), the less we are worried about the low-probability, high-impact events (events recurring every few hundred years or more). As a consequence, threats from the 500-year flood, drought, or volcanic eruption are not appropriately accounted for in disaster risk reduction (DRR) discussions. Extreme geohazards have occurred regularly throughout the past, but mostly did not cause major disasters because exposure of human assets to hazards was much lower in the past. The most extreme events that occurred during the last 2,000 years would today cause unparalleled damage on a global scale and could worsen the sustainability crisis. Simulation of these extreme hazards under present conditions can help to assess the disaster risk. Recent extreme earthquakes have illustrated the destruction they can inflict, both directly and indirectly through tsunamis. Large volcanic eruptions have the potential to impact climate, anthropogenic infrastructure and resource supplies on a global scale. During the last 2,000 years several large volcanic eruptions occurred which, under today's conditions, would be associated with extreme disaster risk. The comparison of earthquakes and volcanic eruptions indicates that large volcanic eruptions are the low-probability geohazards with potentially the highest impact on our civilization

  20. Extreme Ionizing-Radiation-Resistant Bacterium

    Science.gov (United States)

    Vaishampayan, Parag A.; Venkateswaran, Kasthuri J.; Schwendner, Petra

    2013-01-01

    There is a growing concern that desiccation- and extreme-radiation-resistant, non-spore-forming microorganisms associated with spacecraft surfaces can withstand space environmental conditions and subsequently proliferate on another solar body. Such forward contamination would jeopardize future life detection or sample return technologies. The prime focus of NASA's planetary protection efforts is the development of strategies for inactivating resistance-bearing microorganisms. Eradication techniques can be designed to target resistance-conferring microbial populations by first identifying and understanding the physiological and biochemical capabilities that confer their elevated tolerance (as is being studied in Deinococcus phoenicis, as a result of this description). Furthermore, hospitals, food producers, and government agencies frequently use biological indicators to ensure the efficacy of a wide range of radiation-based sterilization processes. Due to its resistance to a variety of perturbations, the non-spore-forming D. phoenicis may be a more appropriate biological indicator than those currently in use. The high flux of cosmic rays during space travel and onto the unshielded surface of Mars poses a significant hazard to the survival of microbial life. Thus, radiation-resistant microorganisms that can survive the extreme radiation, desiccation, and low temperatures experienced during space travel are of particular concern. Spore-forming bacteria, common inhabitants of spacecraft assembly facilities, are known to tolerate these extreme conditions. Since the Viking era, spores have been utilized to assess the degree and level of microbiological contamination on spacecraft and in their associated assembly facilities. Members of the non-spore-forming bacterial community such as Deinococcus radiodurans can survive acute exposures to ionizing radiation (5 kGy), ultraviolet light (1 kJ/m2), and desiccation (years). These resistive phenotypes of Deinococcus enhance the

  1. Exploring the extreme gamma-ray sky with HESS

    International Nuclear Information System (INIS)

    Sol, Helene

    2006-01-01

    The international HESS (High Energy Stereoscopic System) experiment, fully operational since January 2004, is opening a new era for extreme gamma-ray astronomy. Located in Namibia, it is now the most sensitive detector for cosmic sources of very high energy (VHE) gamma-rays, in the tera-electron-volt (TeV) range. By July 2005, it had already more than doubled the number of sources detected at such energies, with the discovery of several active galactic nuclei (AGN), supernova remnants and plerions, a binary pulsar system, a microquasar candidate, and a sample of yet unidentified sources. HESS has also provided the first gamma-ray images of extended sources, with the first astrophysical jet resolved in gamma-rays and the first mapping of a shell supernova remnant, which proves the efficiency of in situ acceleration of particles up to 100 TeV and beyond

  2. Evaluation of ERA-Interim precipitation data in complex terrain

    Science.gov (United States)

    Gao, Lu; Bernhardt, Matthias; Schulz, Karsten

    2013-04-01

    Precipitation controls a large variety of environmental processes and is an essential input parameter for land surface models in, e.g., hydrology, ecology and climatology. However, the rain gauge networks that provide the necessary information are commonly sparse in complex terrain, especially in high mountainous regions. Reanalysis products (e.g. ERA-40 and NCEP-NCAR) have therefore been increasingly applied as surrogate data in recent years. Although they continue to improve, previous studies showed that these products should be objectively evaluated because of their various uncertainties. In this study, we evaluated precipitation data from ERA-Interim, the latest reanalysis product developed by ECMWF. ERA-Interim daily total precipitation is compared with the high-resolution gridded observation dataset E-OBS on 0.25°×0.25° grids for the period 1979-2010 over the central Alps (45.5-48°N, 6.25-11.5°E). Wet and dry days are defined using different threshold values (0.5mm, 1mm, 5mm, 10mm and 20mm). The correspondence ratio (CR), the ratio of days on which precipitation occurs in both the ERA-Interim and E-OBS datasets, is applied for the frequency comparison. The results show that ERA-Interim captures precipitation occurrence very well, with CR ranging from 0.80 to 0.97 for the 0.5mm to 20mm thresholds. However, the bias in intensity increases with rising thresholds. The mean absolute error (MAE) varies between 4.5 mm day-1 and 9.5 mm day-1 on wet days for the whole area. In terms of the mean annual cycle, ERA-Interim has almost the same standard deviation of the interannual variability of daily precipitation as E-OBS, 1.0 mm day-1. Significant wet biases occur in ERA-Interim throughout the warm season (May to August) and dry biases in the cold season (November to February). The spatial distribution of mean annual daily precipitation shows that ERA-Interim significantly underestimates precipitation intensity in the high mountains and on the northern flank of the Alpine chain from November to March, while pronounced
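The two comparison statistics used in this kind of evaluation can be sketched as follows. The abstract does not spell out the denominator of the correspondence ratio, so observed wet days are assumed here, and the daily series are made up for illustration:

```python
def correspondence_ratio(model, obs, threshold):
    """Fraction of observed wet days (obs >= threshold) on which the model
    is also wet -- denominator choice is an assumption, not from the paper."""
    obs_wet = [(m, o) for m, o in zip(model, obs) if o >= threshold]
    if not obs_wet:
        return float("nan")
    return sum(1 for m, _ in obs_wet if m >= threshold) / len(obs_wet)

def mae_wet_days(model, obs, threshold):
    """Mean absolute error (mm/day) over days that are wet in the observations."""
    errs = [abs(m - o) for m, o in zip(model, obs) if o >= threshold]
    return sum(errs) / len(errs)

# Made-up daily precipitation series (mm/day) for one grid cell:
era = [0.0, 2.1, 6.5, 0.3, 12.0, 0.0, 4.4, 9.0]
eobs = [0.0, 1.8, 8.0, 0.0, 15.5, 0.6, 3.9, 7.2]

print(correspondence_ratio(era, eobs, 1.0))      # → 1.0
print(round(mae_wet_days(era, eobs, 1.0), 2))
```

Sweeping the threshold from 0.5 mm to 20 mm, as in the study, reveals how occurrence agreement and intensity bias each change with event severity.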

  3. A pentatonic classification of extreme events

    International Nuclear Information System (INIS)

    Eliazar, Iddo; Cohen, Morrel H.

    2015-01-01

    In this paper we present a classification of the extreme events – very small and very large outcomes – of positive-valued random variables. The classification distinguishes five different categories of randomness, ranging from the very ‘mild’ to the very ‘wild’. In analogy with the common five-tone musical scale we term the classification ‘pentatonic’. The classification is based on the analysis of the inherent Gibbsian ‘forces’ and ‘temperatures’ existing on the logarithmic scale of the random variables under consideration, and provides a statistical-physics insight regarding the nature of these random variables. The practical application of the pentatonic classification is remarkably straightforward, it can be performed by non-experts, and it is demonstrated via an array of examples

  4. THE APPEARANCE OF GOVERNMENT BUREAUCRACY IN QUANTUM ERA

    OpenAIRE

    Kadir, Gau

    2015-01-01

    This study answers three main questions: 1) how is Weber's theory reduced in the appearance of bureaucracy?; 2) what is the model of government bureaucracy in the rationalistic and quantum era?; and 3) what is the reality of the government bureaucracy reformation model in the rationalistic and quantum era? The study was conducted empirically using a qualitative method and content analysis techniques, with a focus on bureaucracy reformation as a result of the application of rational thinking in q...

  5. Era of superheavy-particle dominance and big bang nucleosynthesis

    Energy Technology Data Exchange (ETDEWEB)

    Polnarev, A.G.; Khlopov, M.Y.

    1982-01-01

    The observed primordial ⁴He abundance imposes astrophysical constraints on the possible departures from radiation dominance in the big bang universe during the neutron hardening era (at epoch t ≈ 1 sec). Limits are obtained which, along with the data on the spectrum of the cosmic background radiation, practically rule out any stages of superheavy stable-particle dominance in the era 1 ≲ t ≲ 10¹⁰ sec, thereby setting restrictions on current elementary-particle theories.

  6. Legacies from extreme drought increase ecosystem sensitivity to future extremes

    Science.gov (United States)

    Smith, M. D.; Knapp, A.; Hoover, D. L.; Avolio, M. L.; Felton, A. J.; Wilcox, K. R.

    2016-12-01

    Climate extremes, such as drought, are increasing in frequency and intensity, and the ecological consequences of these extreme events can be substantial and widespread. Although there is still much to be learned about how ecosystems will respond to an intensification of drought, even less is known about the factors that determine post-drought recovery of ecosystem function. Such knowledge is particularly important because post-drought recovery periods can be protracted depending on the extent to which key plant populations, community structure and biogeochemical processes are affected. These drought legacies may alter ecosystem function for many years post-drought and may impact future sensitivity to climate extremes. We experimentally imposed two extreme growing season droughts in a central US grassland to assess the impacts of repeated droughts on ecosystem resistance (response) and resilience (recovery). We found that this grassland was not resistant to the first extreme drought due to reduced productivity and differential sensitivity of the co-dominant C4 grass (Andropogon gerardii) and C3 forb (Solidago canadensis) species. This differential sensitivity led to a reordering of species abundances within the plant community. Yet, despite this large shift in plant community composition, which persisted post-drought, the grassland was highly resilient post-drought, due to increased abundance of the dominant C4 grass. Because of this shift to increased C4 grass dominance, we expected that previously droughted grassland would be more resistant to a second extreme drought. However, contrary to these expectations, previously droughted grassland was more sensitive to drought than grassland that had not experienced drought. Thus, our results suggest that legacies of drought (a shift in community composition) may increase ecosystem sensitivity to future extreme events.

  7. Global predictability of temperature extremes

    Science.gov (United States)

    Coughlan de Perez, Erin; van Aalst, Maarten; Bischiniotis, Konstantinos; Mason, Simon; Nissan, Hannah; Pappenberger, Florian; Stephens, Elisabeth; Zsoter, Ervin; van den Hurk, Bart

    2018-05-01

    Extreme temperatures are one of the leading causes of death and disease in both developed and developing countries, and heat extremes are projected to rise in many regions. To reduce risk, heatwave plans and cold weather plans have been effectively implemented around the world. However, much of the world’s population is not yet protected by such systems, including many data-scarce but also highly vulnerable regions. In this study, we assess at a global level where such systems have the potential to be effective at reducing risk from temperature extremes, characterizing (1) long-term average occurrence of heatwaves and coldwaves, (2) seasonality of these extremes, and (3) short-term predictability of these extreme events three to ten days in advance. Using both the NOAA and ECMWF weather forecast models, we develop global maps indicating a first approximation of the locations that are likely to benefit from the development of seasonal preparedness plans and/or short-term early warning systems for extreme temperature. The extratropics generally show both short-term skill as well as strong seasonality; in the tropics, most locations do also demonstrate one or both. In fact, almost 5 billion people live in regions that have seasonality and predictability of heatwaves and/or coldwaves. Climate adaptation investments in these regions can take advantage of seasonality and predictability to reduce risks to vulnerable populations.

  8. A short generic measure of work stress in the era of globalization: effort-reward imbalance.

    Science.gov (United States)

    Siegrist, Johannes; Wege, Natalia; Pühlhofer, Frank; Wahrendorf, Morten

    2009-08-01

    We evaluate psychometric properties of a short version of the original effort-reward imbalance (ERI) questionnaire. This measure is of interest in the context of assessing stressful work conditions in the era of economic globalization. In a representative sample of 10,698 employed men and women participating in the longitudinal Socio-Economic Panel (SOEP) in Germany, a short version of the ERI questionnaire was included in the 2006 panel wave. Structural equation modeling and logistic regression analysis were applied. In addition to satisfactory internal consistency of scales, a model representing the theoretical structure of the scales provided the best data fit in a competitive test (RMSEA = 0.059, CAIC = 4124.19). Scoring high on the ERI scales was associated with elevated risks of poor self-rated health. This short version of the ERI questionnaire reveals satisfactory psychometric properties, and can be recommended for further use in research and practice.
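    The ERI score itself is conventionally summarized as the ratio of the effort scale to the reward scale with a correction for unequal item numbers. A hedged sketch (the item counts used here, 3 effort and 7 reward items for the short version, are an assumption on our part):

```python
def eri_ratio(effort_score, reward_score, n_effort_items=3, n_reward_items=7):
    """Effort-reward imbalance ratio e / (r * c).

    The correction factor c adjusts for the unequal number of effort and
    reward items so that a ratio of 1.0 marks the balance point; values
    above 1.0 indicate that (perceived) effort exceeds reward.
    """
    c = n_effort_items / n_reward_items
    return effort_score / (reward_score * c)

# A respondent scoring 9 on effort and 21 on reward sits at the balance point
print(round(eri_ratio(9, 21), 6))  # → 1.0
```

    In studies such as the one above, respondents with ratios above 1.0 (or in the upper tertile/quartile of the ratio) are typically the group tested for elevated health risks.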

  9. Sino-Pakistan Relations and the Challenges of Post-Cold War Era

    Directory of Open Access Journals (Sweden)

    Mutahir Ahmed

    2015-04-01

    Full Text Available China has emerged as the world’s second largest economy, and the largest exporter of goods with 9.6 per cent of the global share. Moreover, the last two decades have seen China emerging as an international and regional power of the 21st century. Thus, in order to continue with the economic benefits, China wants peace and stability as well as to play an active role on international and regional fronts. On the other hand, Pakistan, the world’s sixth most populous country, is a major power of South Asia. While having a developed infrastructure and vibrant political and security institutions, Pakistan is nevertheless currently facing many challenges on the economic front, including political instability and religious extremism. This paper is an attempt to analyze the challenges faced by both China and Pakistan in the post-Cold War era.

  10. War and peace in the Internet era

    Directory of Open Access Journals (Sweden)

    Josep M. Porta Fabregat

    2004-04-01

    Full Text Available This article looks to find the ideological causes that lead human beings to war or peace nowadays, in the Internet era. This proposal is worthy of study as war is not a need in terms of human nature or history: we are capable of war and peace simultaneously. However, why does war survive if we are able to live in peace? In our opinion, the actual cause of conflict is fanaticism. This phenomenon comes from the perversion of the two bases of our civilisation: liberty and rationality. This twofold perversion leads us to believe that we are the Absolute, or at least its instrument. Since the fall of the Berlin wall, this kind of fanaticism has come from the generalised conviction that we are at the "end of history"; in this light, one can conclude that this irrationality is definitive and, thus, that any efforts to achieve world peace are useless. However, we believe that the formula for peace can only be derived from reflection and the effective extension around the world of a technical medium that makes communication between all men possible. This would be able to resolve all the perversions of liberty and rationality and make people aware of the infinite distance between us and the Absolute. However, this reflection is not enough. For this awareness to triumph, the technical and ideological situation represented by the Internet has to spread over the whole planet: liberty for those taking part, rationality to allow for communication among all those connected and universal access. This is the moral trend for the Internet, which in itself encourages progress towards world peace.

  11. The Politicization of Village Government Bureaucracy in the Reform Era (Politisasi Birokrasi Pemerintahan Desa Pada Era Reformasi)

    Directory of Open Access Journals (Sweden)

    R Widodo Triputro

    2015-12-01

    Full Text Available Public bureaucracy holds a strategic position in the implementation of government, as well as in efforts toward democratization and autonomy at the local and village government levels. A professional bureaucratic apparatus strongly supports improvement in the quality of public services, particularly for social empowerment as the realization of the essence of local and village autonomy. The concept of bureaucratic neutrality needs to be brought into reality in order to promote a bureaucracy that is more oriented to its main function, namely serving as a public service apparatus. The long history of Indonesian bureaucracy reflects recurring politicization of the bureaucracy by governing regimes, with the result that the entire bureaucracy becomes an administrative tool for centralizing authority. As a consequence, service tends to be directed toward the government (the patron), while the public service function is neglected. This includes the village government sphere, in which the bureaucracy becomes a political machine and, at the same time, an effective controlling tool that limits social access to the public arena. The results of a case study conducted in one village of Bantul Regency, with data obtained from government officials and prominent figures in both the regency and village governments, reveal that the politicization of the village government bureaucracy is now much stronger than it was under the New Order era. On the pretext of democratization and social empowerment, the government (i.e., the regent and his political party) carries out a range of bureaucratic politicization in village government. Given limited village resources, political euphoria, and conflict resulting from the election process for village government officials, the government intervenes in village government and its community. A patron-client relation exists between the government and the village government and its community.
It is evident that the politicization of the village government bureaucracy is being carried out again, among other ways through the establishment of the "Paguyuban Pamong" with its

  12. Prolactinoma treatment status in the cabergoline era

    International Nuclear Information System (INIS)

    Watanabe, Shinya; Takano, Shingo; Akutsu, Hiroyoshi; Sato, Hiroshige; Matsumura, Akira

    2011-01-01

    The aim of our study is to report the most adequate therapy for prolactinoma in the cabergoline era. From 2003 to 2009, 27 patients with prolactinoma were treated at our hospital. Patients were categorized into 2 groups. The Cabergoline Group: cabergoline was administered for 5 years and discontinued. Under this protocol, a case with a normal prolactin level and no visible tumor more than 24 months after the discontinuation of cabergoline was judged as cured. The Operation Group: transsphenoidal surgery (TSS) was performed first. In the Cabergoline Group, 12 cases were cured with 5 years of cabergoline treatment (Cure) and 6 cases were not cured (Not cure). We compared the pretreatment prolactin level, the normalization of the serum level of prolactin, the degree of invasiveness on MRI, regression of the tumor during treatment on MRI, the maximum dose of cabergoline, the degree of pituitary hormone replacement, the frequency of pregnancy, and follow-up periods among the Cabergoline-cure group, the Cabergoline-not-cure group, and the Operation group. The normalization rate of the serum prolactin level and the cure rate were 91% and 63% in the Cabergoline Group. The pretreatment prolactin level and the frequency of tumor invasiveness on initial MRI were significantly higher in the Cabergoline-not-cure group compared to the Cabergoline-cure group. All five women who became pregnant after treatment belonged to the Cabergoline-cure group. In the Operation Group, all 4 cases achieved normalization of the serum prolactin level without visible tumor and with normal pituitary function. Cabergoline for prolactinoma is effective, but the cure rate with continuous use of cabergoline for 5 years was 67%. The factors predicting that cabergoline and/or TSS can cure prolactinoma are a non-invasive tumor and a pretreatment prolactin level under 200 ng/mL. (author)

  13. Space Weather Drivers in the ACE Era

    Science.gov (United States)

    Vogt, M.; Puhl-Quinn, P.; Jordanova, V. K.; Smith, C. W.; Cohen, C. M.

    2004-12-01

    The Advanced Composition Explorer (ACE) spacecraft was launched Aug. 25, 1997 [Stone et al., 1998]. Beginning shortly after launch and continuing to the present day ACE has provided real-time data telemetry of solar wind conditions upstream of the Earth. The real-time data includes solar wind speed and density, magnetic field direction and magnitude, and a range of energetic particle intensities [Zwickl et al., 1999]. The real-time data product is provided within 5 minutes of observation and many partners from both industry and science use these data for a variety of purposes. The most common purpose of practical industrial application involves mitigation of lost services arising from magnetospheric storm activity. Many space weather efforts are directed at providing improved predictions of magnetospheric response that can be applied to real-time data in the hope of better predicting the vulnerability and required action of industry to approaching disturbances. It therefore seems prudent that following 6 years of activity including one solar maximum period we should evaluate the nature and strength of the largest disturbances observed with the hope of better assessing the industrial response. Simply put: "Did ACE observe disturbances that were as large as those seen previously during the space age?" If not, it may be the case that industry must evaluate its response to the real-time warnings and not become complacent by the simple act of survival. We compare the most intense space weather events of the ACE era with those recorded on the Omnitape data set spanning 40+ years of spacecraft measurements in the near-Earth environment. We compare both magnetospheric response parameters and solar wind drivers. In addition, we compare the large energetic particle events over the same time frame. Stone, E. C., et al., Space Science Rev., 86(1-4), 357-408, 1998. Zwickl, R. D., et al., Space Science Rev., 86(1-4), 633-648, 1998.

  14. Modulation of extreme temperatures in Europe under extreme values of the North Atlantic Oscillation Index.

    Science.gov (United States)

    Beniston, Martin

    2018-03-10

    This paper reports on the influence that extreme values in the tails of the North Atlantic Oscillation (NAO) Index probability density function (PDF) can exert on temperatures in Europe. When the NAO Index enters its lowest (10% quantile or less) and highest (90% quantile or higher) modes, European temperatures often exhibit large negative or positive departures from their mean values, respectively. Analyses of the joint quantiles of the Index and temperatures (i.e., the simultaneous exceedance of particular quantile thresholds by the two variables) show that temperatures enter the upper or lower tails of their PDF when the NAO Index also enters its extreme tails, more often than would be expected from random statistics. Studies of this nature help further our understanding of the manner by which mechanisms of decadal-scale climate variability can influence extremes of temperature, and thus perhaps improve the forecasting of extreme temperatures in weather and climate models. © 2018 New York Academy of Sciences.
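    The joint-quantile analysis described above amounts to comparing the observed frequency of simultaneous tail exceedances with the frequency expected under independence. A minimal illustration on synthetic data (the series and the quantile level are illustrative, not the study's):

```python
import numpy as np

def joint_exceedance(x, y, q=0.9):
    """Fraction of days on which both series exceed their own q-quantile,
    together with the fraction expected if the two were independent."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    both = (x > np.quantile(x, q)) & (y > np.quantile(y, q))
    return both.mean(), (1.0 - q) ** 2

rng = np.random.default_rng(0)
nao = rng.normal(size=20000)                     # toy "NAO index"
temp = 0.6 * nao + 0.8 * rng.normal(size=20000)  # correlated "temperature"
obs, exp = joint_exceedance(nao, temp, q=0.9)
print(obs > exp)  # positive dependence inflates joint tail frequency
```

    The gap between the observed and independence-expected fractions is one simple way to quantify how strongly the NAO tail state modulates temperature extremes.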

  15. The Study of the Technical Innovation of Agriculture and the Peasant's Activity in Landholder System, in Meiji-Taisyo Era

    OpenAIRE

    勝部, 眞人

    1997-01-01

    In this article, I analyzed the process of Japan's modern technical innovation in agriculture, carried out by many peasants in the Meiji-Taisyo Era, because I think that analysis is the key to linking the study of the Japanese modern landholder system with the historical study of agricultural technique. This study covers Hiroshima Prefecture, which had a large population and therefore the smallest-scale farmers in Japan, and Akita Prefecture, which had a smaller population and therefore the most extensive culti...

  16. Re-Form: FPGA-Powered True Codesign Flow for High-Performance Computing In The Post-Moore Era

    Energy Technology Data Exchange (ETDEWEB)

    Cappello, Franck; Yoshii, Kazutomo; Finkel, Hal; Cong, Jason

    2016-11-14

    Multicore scaling will end soon because of practical power limits. Dark silicon is becoming a major issue even more than the end of Moore’s law. In the post-Moore era, the energy efficiency of computing will be a major concern. FPGAs could be a key to maximizing the energy efficiency. In this paper we address severe challenges in the adoption of FPGA in HPC and describe “Re-form,” an FPGA-powered codesign flow.

  17. Correlation dimension and phase space contraction via extreme value theory

    Science.gov (United States)

    Faranda, Davide; Vaienti, Sandro

    2018-04-01

    We show how to obtain theoretical and numerical estimates of correlation dimension and phase space contraction by using the extreme value theory. The maxima of suitable observables sampled along the trajectory of a chaotic dynamical system converge asymptotically to classical extreme value laws where: (i) the inverse of the scale parameter gives the correlation dimension and (ii) the extremal index is associated with the rate of phase space contraction for backward iteration, which in dimension 1 and 2, is closely related to the positive Lyapunov exponent and in higher dimensions is related to the metric entropy. We call it the Dynamical Extremal Index. Numerical estimates are straightforward to obtain as they imply just a simple fit to a univariate distribution. Numerical tests range from low dimensional maps, to generalized Henon maps and climate data. The estimates of the indicators are particularly robust even with relatively short time series.
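    The procedure really is a simple univariate fit. A sketch on the fully chaotic logistic map (a stand-in for the systems studied in the paper): sample the observable -log|x - ζ| along the trajectory, take block maxima, fit a Gumbel law, and read the local dimension off as the inverse of the scale parameter:

```python
import numpy as np
from scipy.stats import gumbel_r

def dimension_via_evt(traj, ref, block=1000):
    """Estimate the local (correlation) dimension at `ref` via EVT:
    block maxima of -log|x - ref| converge to a Gumbel law whose scale
    parameter is the inverse of the local dimension."""
    g = -np.log(np.abs(traj - ref) + 1e-300)  # guard against log(0)
    n_blocks = len(g) // block
    maxima = g[: n_blocks * block].reshape(n_blocks, block).max(axis=1)
    loc, scale = gumbel_r.fit(maxima)
    return 1.0 / scale

# Fully chaotic logistic map; its invariant measure has dimension 1
x = np.empty(1_000_000)
x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

d_est = dimension_via_evt(x, ref=0.3)
print(d_est)  # should be near 1 for this one-dimensional map
```

    For higher-dimensional systems the same fit applies with the Euclidean distance in place of |x - ref|; the extremal index (not estimated in this sketch) carries the phase-space-contraction information discussed in the abstract.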

  18. Coastal Change Analysis Program (C-CAP) zone 66 1995-era and 2000-era land cover change analysis (NODC Accession 0042136)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains the 1995-era and 2000-era classifications of US Coast zone 66 and can be used to analyze change. This imagery was collected as part of the...

  19. Evaluation of precipitation extremes over the Asian domain: observation and modelling studies

    Science.gov (United States)

    Kim, In-Won; Oh, Jaiho; Woo, Sumin; Kripalani, R. H.

    2018-04-01

    In this study, a comparison of the precipitation extremes exhibited by seven reference datasets is made to ascertain whether the inferences based on these datasets agree or differ. These seven datasets, roughly grouped into three categories, i.e. rain-gauge based (APHRODITE, CPC-UNI), satellite-based (TRMM, GPCP1DD) and reanalysis based (ERA-Interim, MERRA, and JRA55), with a common data period of 1998-2007, are considered. The focus is on examining precipitation extremes in the summer monsoon rainfall over South Asia, East Asia and Southeast Asia. Measures of extreme precipitation include percentile thresholds, the frequency of extreme precipitation events and other quantities. Results reveal that the differences in extremes among the datasets are small over South Asia and East Asia, but large differences are displayed over the Southeast Asian region including the maritime continent. Furthermore, precipitation data appear to be more consistent over East Asia among the seven datasets. Decadal trends in extreme precipitation are consistent with known results over South and East Asia. No trends in extreme precipitation events are exhibited over Southeast Asia. Outputs of the Coupled Model Intercomparison Project Phase 5 (CMIP5) simulation data are categorized as high-, medium- and low-resolution models. The regions displaying the maximum intensity of extreme precipitation appear to depend on model resolution. High-resolution models simulate the maximum intensity of extreme precipitation over the Indian sub-continent, medium-resolution models over northeast India and South China, and the low-resolution models over Bangladesh, Myanmar and Thailand. In summary, there are differences in extreme precipitation statistics among the seven datasets considered here and among the 29 CMIP5 model data outputs.
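    Percentile-threshold measures of extreme precipitation like those compared above can be sketched as follows; computing the percentile over wet days only is a common convention and an assumption on our part:

```python
import numpy as np

def extreme_precip_stats(daily, wet_threshold=1.0, pct=95):
    """Percentile-based extreme precipitation measures for one dataset.

    Returns the pct-th percentile of wet-day amounts (mm/day) and the
    number of days in the record exceeding that threshold.
    """
    daily = np.asarray(daily, float)
    wet = daily[daily >= wet_threshold]
    thresh = np.percentile(wet, pct)
    n_extreme = int((daily > thresh).sum())
    return thresh, n_extreme

rng = np.random.default_rng(1)
# Synthetic 10-year daily series: ~70% dry days, gamma-distributed wet days
series = np.where(rng.random(3650) < 0.3, rng.gamma(2.0, 5.0, 3650), 0.0)
thresh, n_ext = extreme_precip_stats(series)
print(f"95th-percentile wet-day threshold: {thresh:.1f} mm/day, "
      f"extreme days: {n_ext}")
```

    Applying this per dataset and comparing the resulting thresholds and counts grid cell by grid cell is essentially the inter-dataset comparison the abstract describes.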

  20. Modeling annual extreme temperature using generalized extreme value distribution: A case study in Malaysia

    Science.gov (United States)

    Hasan, Husna; Salam, Norfatin; Kassim, Suraiya

    2013-04-01

    Extreme temperatures at several stations in Malaysia are modeled by fitting the annual maxima to the Generalized Extreme Value (GEV) distribution. The Augmented Dickey-Fuller (ADF) and Phillips-Perron (PP) tests are used to detect stochastic trends among the stations. The Mann-Kendall (MK) test suggests a non-stationary model. Three models are considered for stations with a trend, and the Likelihood Ratio test is used to determine the best-fitting model. The results show that the Subang and Bayan Lepas stations favour a model that is linear in the location parameter, while the Kota Kinabalu and Sibu stations suit a model that is linear in the logarithm of the scale parameter. The return level, the level of events (maximum temperature) expected to be exceeded once, on average, in a given number of years, is also obtained.
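    Fitting a GEV to annual maxima and reading off a return level is a short computation; a sketch using scipy (note scipy's shape convention c = -ξ; the synthetic data are illustrative, not the Malaysian station records):

```python
import numpy as np
from scipy.stats import genextreme

def gev_return_level(annual_maxima, years):
    """Fit a GEV to annual maxima and return the T-year return level,
    the value exceeded once on average every `years` years."""
    c, loc, scale = genextreme.fit(annual_maxima)
    return genextreme.ppf(1.0 - 1.0 / years, c, loc=loc, scale=scale)

rng = np.random.default_rng(2)
# Synthetic 50-year record of annual maximum temperature (deg C)
amax = 34.0 + 1.2 * rng.gumbel(size=50)
rl50 = gev_return_level(amax, 50)
print(rl50 > amax.mean())  # the 50-yr level exceeds the typical annual max
```

    The non-stationary models in the study add a linear trend to the location parameter (or to log-scale); in practice that fit is done by maximizing the likelihood with the trend term included, which scipy's stationary `fit` does not cover.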

  1. Flavour Physics in the LHC Era

    International Nuclear Information System (INIS)

    Buras, A.

    2011-01-01

    This decade will allow us to improve the resolution of short distance scales by at least an order of magnitude, extending the picture of fundamental physics down to scales of 5 × 10⁻²⁰ with the help of the LHC. Further resolution down to scales as short as 10⁻²¹ should be possible with the help of high precision experiments in which flavour violating processes will play a prominent role. Will this increase in resolution allow us to see new particles (new animalcula), similarly to what Antonie van Leeuwenhoek saw by discovering bacteria in 1676? The basic question for particle physics is what these new animalcula will look like, which difficulties of the Standard Model (SM) they will help us to solve, and which new puzzles and problems they will bring with them. I will describe what role flavour physics will play in these exciting times, provided this new world is animalculated. (author)

  2. Is Extremely High Life Satisfaction during Adolescence Advantageous?

    Science.gov (United States)

    Suldo, Shannon M.; Huebner, E. Scott

    2006-01-01

    This study examined whether extremely high life satisfaction was associated with adaptive functioning or maladaptive functioning. Six hundred ninety-eight secondary level students completed the Students' Life Satisfaction Scale [Huebner, 1991a, School Psychology International, 12, pp. 231-240], Youth Self-Report of the Child Behavior Checklist…

  3. Extreme Winds from the NCEP/NCAR Reanalysis Data

    DEFF Research Database (Denmark)

    Larsén, Xiaoli Guo; Mann, Jakob

    2009-01-01

    wind. We examined extreme winds in different places where the strongest wind events are weather phenomena of different scales, including the mid-latitude lows in Denmark, channelling winds in the Gulf of Suez, typhoons in the western North Pacific, cyclones in the Caribbean Sea, local strong winds...

  4. Lymphoscintigraphy of the lower extremity

    International Nuclear Information System (INIS)

    Abbasi, N.Z.

    1990-01-01

    Fifty-one lower extremities of 26 normal healthy volunteers and 26 extremities of 13 patients with oedema have been studied. Dynamic quantitative lymphoscintigraphy using 99mTc antimony sulphide colloid during passive exercise, as well as before and after active exercise, was performed. Parameters of lymphatic function, including the percentage of radioactivity cleared from the injection site, the percentage uptake by the inguinal lymph nodes, the time of arrival of activity at the regional lymph nodes and the lymphatic reserve index, have been evaluated. The percentage clearance of activity from the injection site was found technically difficult to standardize and proved to be an unreliable parameter of lymphatic function. However, the quantitation of nodal uptake, the lymphatic transit time and the lymphatic reserve capacity accurately depicted the lymphatic functional status of an individual. The physiologic parameters of lymphatic function of the contralateral lower extremities were compared, and a physiologic difference in the lymphatic capacity of the two limbs was scintigraphically documented. (author)

  5. Forecasting extreme temperature health hazards in Europe

    Science.gov (United States)

    Di Napoli, Claudia; Pappenberger, Florian; Cloke, Hannah L.

    2017-04-01

    Extreme hot temperatures, such as those experienced during a heat wave, represent a dangerous meteorological hazard to human health. Heat disorders such as sunstroke are harmful to people of all ages and responsible for excess mortality in the affected areas. In 2003 more than 50,000 people died in western and southern Europe because of a severe and sustained episode of summer heat [1]. Furthermore, according to the Intergovernmental Panel on Climate Change, heat waves are expected to become more frequent in the future, thus posing an increasing threat to human lives. Developing appropriate tools for extreme hot temperature prediction is therefore mandatory to increase public preparedness and mitigate heat-induced impacts. A recent study has shown that forecasts of the Universal Thermal Climate Index (UTCI) provide a valid overview of extreme temperature health hazards on a global scale [2]. UTCI is a parameter related to the temperature of the human body and its regulatory responses to the surrounding atmospheric environment. UTCI is calculated using an advanced thermo-physiological model that includes the human heat budget, physiology and clothing. To forecast UTCI, the model uses meteorological inputs, such as 2m air temperature, 2m water vapour pressure and wind velocity at body height derived from 10m wind speed, from NWP models. Here we examine the potential of UTCI as an extreme hot temperature prediction tool for the European area. UTCI forecasts calculated using the above-mentioned parameters from ECMWF models are presented. The skill in predicting UTCI at medium lead times is also analysed and discussed with a view to implementation in international health-hazard warning systems. This research is supported by the ANYWHERE project (EnhANcing emergencY management and response to extreme WeatHER and climate Events), which is funded by the European Commission's HORIZON2020 programme. [1] Koppe C. et al., Heat waves: risks and responses. World Health Organization. Health and

  6. Automation Rover for Extreme Environments

    Science.gov (United States)

    Sauder, Jonathan; Hilgemann, Evan; Johnson, Michael; Parness, Aaron; Hall, Jeffrey; Kawata, Jessie; Stack, Kathryn

    2017-01-01

    Almost 2,300 years ago the ancient Greeks built the Antikythera automaton. This purely mechanical computer accurately predicted past and future astronomical events long before electronics existed [1]. Automata have been credibly used for hundreds of years as computers, art pieces, and clocks. However, in the past several decades automata have become less popular as the capabilities of electronics increased, leaving them an unexplored solution for robotic spacecraft. The Automaton Rover for Extreme Environments (AREE) proposes an exciting paradigm shift from electronics to a fully mechanical system, enabling longitudinal exploration of the most extreme environments within the solar system.

  7. Hygienic diagnosis in extreme conditions

    International Nuclear Information System (INIS)

    Sofronov, G.A.

    1997-01-01

    A review of the book Hygienic Diagnosis in Extreme Conditions by M.P. Zakharchenko, S.A. Lopatin, G.N. Novozhilov and V.I. Zakharov is presented, discussing the problem of preserving people's health under extreme conditions. Hygienic diagnosis is considered, illustrated by cases of hostilities (the Afghan War), the earthquake response in Armenia (1988) and the Chernobyl accident response. Attention is paid to the estimation of radiation doses to people and the characteristics of the main types of dosimeters. The high scientific level of the book is noted.

  8. Extreme Conditions Modeling Workshop Report

    Energy Technology Data Exchange (ETDEWEB)

    Coe, R. G.; Neary, V. S.; Lawson, M. J.; Yu, Y.; Weber, J.

    2014-07-01

    Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) hosted the Wave Energy Converter (WEC) Extreme Conditions Modeling (ECM) Workshop in Albuquerque, NM on May 13th-14th, 2014. The objective of the workshop was to review the current state of knowledge on how to model WECs in extreme conditions (e.g. hurricanes and other large storms) and to suggest how U.S. Department of Energy (DOE) and national laboratory resources could be used to improve ECM methods for the benefit of the wave energy industry.

  9. Lunar geodesy and cartography: a new era

    Science.gov (United States)

    Duxbury, Thomas; Smith, David; Robinson, Mark; Zuber, Maria T.; Neumann, Gregory; Danton, Jacob; Oberst, Juergen; Archinal, Brent; Glaeser, Philipp

    The Lunar Reconnaissance Orbiter (LRO) ushers in a new era in precision lunar geodesy and cartography. LRO was launched in June 2009, completed its Commissioning Phase in September 2009 and is now in its Primary Mission Phase, on its way to collecting high precision, global topographic and imaging data. Aboard LRO are the Lunar Orbiter Laser Altimeter (LOLA - Smith, et al., 2009) and the Lunar Reconnaissance Orbiter Camera (LROC - Robinson, et al., ). LOLA is a derivative of the successful MOLA at Mars, which produced the global reference surface being used for all precision cartographic products. LOLA produces 5 altimetry spots having footprints of 5 m at a frequency of 28 Hz, significantly bettering MOLA, which produced 1 spot having a footprint of 150 m at a frequency of 10 Hz. LROC has twin narrow angle cameras (NACs) having pixel resolutions of 0.5 m from a 50 km orbit, and a wide-angle camera (WAC) having a pixel resolution of 75 m in up to 7 color bands. One of the two NACs looks to the right of nadir and the other looks to the left, with a few hundred pixels of overlap in the nadir direction. LOLA is mounted on the LRO spacecraft to look nadir, in the overlap region of the NACs. The LRO spacecraft has the ability to look nadir, building up global coverage, as well as off-nadir to provide stereo coverage and fill in data gaps. The LROC wide-angle camera builds up global stereo coverage naturally from its large field-of-view overlap from orbit to orbit during nadir viewing. To date, the LROC WAC has already produced global stereo coverage of the lunar surface. This report focuses on the registration of LOLA altimetry to the LROC NAC images. LOLA has a dynamic range of tens of km while producing elevation data at sub-meter precision. LOLA also has good return in off-nadir attitudes. Over the LRO mission, multiple LOLA tracks will fall within each NAC image at the lunar equator, and even more tracks in the NAC images nearer the poles. The registration of LOLA

  10. Moving in extreme environments: what's extreme and who decides?

    Science.gov (United States)

    Cotter, James David; Tipton, Michael J

    2014-01-01

    Humans work, rest and play in immensely varied extreme environments. The term 'extreme' typically refers to insufficiency or excess of one or more stressors, such as thermal energy or gravity. Individuals' behavioural and physiological capacity to endure and enjoy such environments varies immensely. Adverse effects of acute exposure to these environments are readily identifiable (e.g. heat stroke or bone fracture), whereas adverse effects of chronic exposure (e.g. stress fractures or osteoporosis) may be as important but much less discernable. Modern societies have increasingly sought to protect people from such stressors and, in that way, minimise their adverse effects. Regulations are thus established, and advice is provided on what is 'acceptable' exposure. Examples include work/rest cycles in the heat, hydration regimes, rates of ascent to and duration of stay at altitude and diving depth. While usually valuable and well intentioned, it is important to realise the breadth and importance of limitations associated with such guidelines. Regulations and advisories leave less room for self-determination, learning and perhaps adaptation. Regulations based on stress (e.g. work/rest cycles relative to WBGT) are more practical but less direct than those based on strain (e.g. core temperature), but even the latter can be substantively limited (e.g. by lack of criterion validation and allowance for behavioural regulation in the research on which they are based). Extreme Physiology & Medicine is publishing a series of reviews aimed at critically examining the issues involved with self- versus regulation-controlled human movement acutely and chronically in extreme environments. These papers, arising from a research symposium in 2013, are about the impact of people engaging in such environments and the effect of rules and guidelines on their safety, enjoyment, autonomy and productivity. 
The reviews will cover occupational heat stress, sporting heat stress, hydration, diving

  11. Bremsstrahlung: an experimentalist's personal perspective on the post-modern era

    International Nuclear Information System (INIS)

    Quarles, C.A.

    2000-01-01

    In this brief review I will discuss the recent experimental work on the doubly differential cross section, i.e. the photon energy and angular distribution, for electron Bremsstrahlung from thin solid film and gas targets. Since the beginning of the modern era in the study of Bremsstrahlung with the publication of the 1971 paper by Tseng and Pratt, Professor Pratt has been the dominant influence in Bremsstrahlung research. Most, if not all, experimental research during the modern era has been motivated by the interest in comparing data with the theory of Pratt and his coworkers. As Bremsstrahlung research has moved into its post-modern era, new experiments with increasing precision are concentrating on determining under what conditions ordinary Bremsstrahlung theory needs to be supplemented by a contribution from polarization Bremsstrahlung. Efforts to improve the comparison of thin-target experiment with theory have also led to new experimental and modeling work on Bremsstrahlung from thick solid targets. Thick-target Bremsstrahlung is interesting in its own right, but we also want to understand it better since it is the ever-present background in the thin-target experiments and the limiting factor in the effort to distinguish the polarization contribution to the total Bremsstrahlung spectrum. Professor Pratt ushered in the modern era in Bremsstrahlung research and has recently guided the transition into the post-modern era. It can be expected that he will continue to have a formative influence on the developments of Bremsstrahlung research into the foreseeable future.

  12. The climate of the Common Era off the Iberian Peninsula

    Directory of Open Access Journals (Sweden)

    F. Abrantes

    2017-12-01

    SCAND. The Industrial Era reveals a clear difference between the NW Iberia and the Algarve records. While off NW Iberia variability is low, the Algarve shows large-amplitude decadal variability with an inverse relationship between SST and river input. Such conditions suggest a shift in the EA mode, from negative between 1900 and 1970 CE to positive after 1970, while NAO and SCAND remain in a positive phase. The particularly noticeable rise in SST at the Algarve site by the mid-20th century (~1970) provides evidence for a regional response to the ongoing climate warming. The reported findings have implications for decadal-scale predictions of future climate change in the Iberian Peninsula.

  13. Angiography of the upper extremity

    International Nuclear Information System (INIS)

    Janevski, B.K.

    1982-01-01

    This thesis provides a description of the technical and medical aspects of arteriography of the upper extremity and an extensive analysis of the angiographic anatomy and pathology of 750 selective studies performed in more than 500 patients. A short historical review is provided of angiography as a whole and of arteriography of the hand in particular. The method of percutaneous transfemoral catheterization of the arteries of the upper extremity and particularly the arteries of the hand is considered, discussing the problems the angiographer encounters frequently, describing the angiographic complications which may occur and emphasizing the measures to keep them to a minimum. The use of vasodilators in hand angiography is discussed. A short description of the embryological patterns persisting in the arteries of the arm is included in order to understand the congenital variations of the arteries of the upper extremity. The angiographic patterns and clinical aspects of the most common pathological processes involving the arteries of the upper extremities are presented. Special attention is paid to the correlation between angiography and pathology. (Auth.)

  14. Extreme conditions (p, T, H)

    Energy Technology Data Exchange (ETDEWEB)

    Mesot, J [Lab. for Neutron Scattering ETH Zurich, Zurich (Switzerland) and Paul Scherrer Institute, Villigen (Switzerland)

    1996-11-01

    The aim of this paper is to summarize the sample environment which will be accessible at the SINQ. In order to illustrate the type of experiments which will be feasible under extreme conditions of temperature, magnetic field and pressure at the SINQ a few selected examples are also given. (author) 7 figs., 14 refs.

  15. Book review: Extreme ocean waves

    Science.gov (United States)

    Geist, Eric L.

    2011-01-01

    ‘‘Extreme Ocean Waves’’ is a collection of ten papers edited by Efim Pelinovsky and Christian Kharif that followed the April 2007 meeting of the General Assembly of the European Geosciences Union. A note on terminology: extreme waves in this volume broadly encompass different types of waves, including deep-water and shallow-water rogue waves (alternatively termed freak waves), storm surges from cyclones, and internal waves. Other types of waves such as tsunamis or rissaga (meteotsunamis) are not discussed in this volume. It is generally implied that ‘‘extreme’’ has a statistical connotation relative to the average or significant wave height specific to each type of wave. Throughout the book, in fact, the reader will find a combination of theoretical and statistical/empirical treatment necessary for the complete examination of this subject. In the introduction, the editors underscore the importance of studying extreme waves, documenting several dramatic instances of damaging extreme waves that occurred in 2007.

  16. Extreme Energy Events Monitoring report

    CERN Document Server

    Baimukhamedova, Nigina

    2015-01-01

    The following paper reflects the progress I made during the Summer Student Programme on the Extreme Energy Events Monitor project. Over the 8-week period I built a simple detector system capable of triggering on events similar to explosions (sudden changes in sound level) and measuring the approximate location of the event. Source code is available upon request, and the settings are described further on.
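The trigger described above (a sudden change in sound level against a quiet background) can be sketched as a rolling-baseline threshold test. Everything here is illustrative: the sample values, the 20-sample window and the 20 dB margin are assumptions, not settings from the student's report.

```python
import random
import statistics

# Hypothetical sound-level samples (dB): quiet background with one injected
# burst, standing in for the "explosion-like" events the detector targets.
random.seed(7)
levels = [random.gauss(50.0, 2.0) for _ in range(200)]
levels[120] = 95.0  # injected burst (assumed value)

# Trigger when a sample exceeds the mean of the preceding window by a margin.
window = 20
margin_db = 20.0
triggers = []
for i in range(window, len(levels)):
    background = statistics.fmean(levels[i - window:i])
    if levels[i] - background > margin_db:
        triggers.append(i)

print("triggered at sample(s):", triggers)
```

Comparing each sample against the *preceding* window keeps the burst itself out of its own baseline, which is what makes a single-sample spike detectable.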

  17. Astrobiology: Life in Extreme Environments

    Science.gov (United States)

    Kaur, Preeti

    2011-01-01

    Astrobiology is the study of the origin, evolution and distribution of life in the universe. It seeks to answer two important scientific questions: how did we get here and are we alone in the universe? Scientists begin by studying life on Earth and its limits. The discovery of extremophiles on Earth capable of surviving extremes encourages the…

  18. EN LA ERA DEL CADUCEO DE MERCURIO

    Directory of Open Access Journals (Sweden)

    Hernán Urbina Joiro

    2010-06-01

    Full Text Available

    At a strange moment, the staff of Asclepius, god of medicine, was exchanged for the caduceus of Mercury, god of commerce and also of thieves.

    Various Greek traditions say that Asclepius used a staff and a serpent to heal the sick, to teach, and to raise the dead1,2, and that when he restored to life Hippolytus, son of Theseus, he had taken so many of the dead away from Hades, king of the underworld, that Hades himself went to complain before Zeus, who, convinced of the threat Asclepius posed to the established order, struck him down with a thunderbolt3,4. The rod of Asclepius (Aesculapius to the Romans), with a coiled serpent, came to symbolize healing mediated by the physician5.

    Of Hermes (Mercury to the Romans), the Greek god of commerce, communications, cunning and thieves6, tradition holds that his caduceus was a winged golden staff with two coiled serpents, given to him by Apollo in exchange for the flute of the god Pan7. A sixteenth-century English practice, begun by Dr. William Butts, physician to King Henry VIII, introduced the caduceus of Mercury, in place of the rod of Asclepius, as a symbol among British physicians8, and from there it passed to doctors of the United States Army medical corps and various other medical communities. Indeed, Butts appears in William Shakespeare's Henry VIII as "Doctor Butts, physician to the King"9.

    According to Michel Foucault, however, it was only in the eighteenth century that the era of "political economy"10 would be fully expressed as the axis of the art of governing, with medicine playing a central role in watching over populations and pursuing threats such as madness, sexual deviance, infections and even delinquency itself11. In The Life of Infamous Men12, Foucault holds that it was in the Germany of the late eighteenth century that there arose

  19. CERN: End of LEP's Z era

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1995-11-15

    Full text: A chapter of history at CERN's LEP electron-positron collider closed in October when the four big experiments, Aleph, Delphi, L3 and Opal, logged their final data at the Z energy, just over six years after LEP's first Z was detected. The LEP Z era has been one of great success, both in terms of physics results and the advances which have been made with the machine itself. LEP now takes a step towards becoming LEP2, when the energy is wound up from around 45 GeV to about 70 GeV per beam (September, page 6). By the end of LEP's 1995 run, each of the four LEP experiments had seen almost five million Zs. Now the spotlight at LEP shifts to producing pairs of W particles, the electrically charged counterparts of the Z. LEP's first Zs were recorded in August 1989, one month after the machine's first circulating beam. The 30,000 Z decays recorded by each experiment in 1989 confirmed that matter comes in just three distinct families of quarks and leptons. The values of the Z mass and width quoted in 1990 were 91.161 ± 0.031 GeV and 2.534 ± 0.027 GeV. By the beginning of 1995, these had been fine-tuned to the extraordinary accuracy of 91.1884 ± 0.0022 GeV and 2.4963 ± 0.0032 GeV, and when data from this year's run is included, will be even better. These results, combined with precision data from neutrino experiments and from Fermilab's Tevatron proton-antiproton collider, have put the Standard Model of quarks and leptons through its most gruelling test yet. Right from the start, collaboration between LEP experiments and the accelerator team has been close, with frequent scheduling meetings determining how the machine is run. For the first few years, LEP ran on a diet of four bunches of electrons and four of positrons, but by the end of 1992, a way had been found to increase the luminosity by squeezing in more bunches. In 1993, the 'pretzel' scheme (October 1992, page 17), so called because of the shape traced out by the circulating beams, was running with eight

  20. Earth's portfolio of extreme sediment transport events

    Science.gov (United States)

    Korup, Oliver

    2012-05-01

    Quantitative estimates of sediment flux and the global cycling of sediments from hillslopes to rivers, estuaries, deltas, continental shelves, and deep-sea basins have a long research tradition. In this context, extremely large and commensurately rare sediment transport events have so far eluded a systematic analysis. To start filling this knowledge gap I review some of the highest reported sediment yields in mountain rivers impacted by volcanic eruptions, earthquake- and storm-triggered landslide episodes, and catastrophic dam breaks. Extreme specific yields, defined here as those exceeding the 95th percentile of compiled data, are ~10^4 t km^-2 yr^-1 if averaged over 1 yr. These extreme yields vary by eight orders of magnitude, but systematically decay with reference intervals from minutes to millennia such that yields vary by three orders of magnitude for a given reference interval. Sediment delivery from natural dam breaks and pyroclastic eruptions dominate these yields for a given reference interval. Even if averaged over 10^2–10^3 yr, the contribution of individual disturbances may remain elevated above corresponding catchment denudation rates. I further estimate rates of sediment (re-)mobilisation by individual giant terrestrial and submarine mass movements. Less than 50 postglacial submarine mass movements have involved an equivalent of ~10% of the contemporary annual global flux of fluvial sediment to Earth's oceans, while mobilisation rates by individual events rival the decadal-scale sediment discharge from tectonically active orogens such as Taiwan or New Zealand. Sediment flushing associated with catastrophic natural dam breaks is non-stationary and shows a distinct kink at the last glacial-interglacial transition, owing to the drainage of very large late Pleistocene ice-marginal lakes. Besides emphasising the contribution of high-magnitude and low-frequency events to the global sediment cascade, these findings stress the importance of sediment storage
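The record's working definition of "extreme" (specific yields exceeding the 95th percentile of the compiled data) amounts to a one-line percentile threshold. A minimal sketch on made-up lognormal yields, not the compiled dataset itself:

```python
import random

# Hypothetical specific sediment yields (t km^-2 yr^-1) for 1000 catchments;
# a lognormal spread stands in for the heavy-tailed compiled data.
random.seed(0)
yields = [random.lognormvariate(6.0, 2.0) for _ in range(1000)]

# "Extreme" per the record: yields exceeding the 95th percentile.
ranked = sorted(yields)
threshold = ranked[int(0.95 * len(ranked)) - 1]  # simple empirical 95th percentile
extreme = [y for y in yields if y > threshold]

print(f"threshold ~ {threshold:,.0f} t km^-2 yr^-1")
print(f"{len(extreme)} of {len(yields)} yields classified as extreme")
```

By construction, about 5% of the compiled values land above the cut, so the classification says nothing about absolute magnitude until a dataset is fixed.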

  1. Particle Physics in the LHC Era

    CERN Document Server

    Bunk, Don

    During the past 100 years experimental particle physicists have collected an impressive amount of data, and theorists have come to understand this data extremely well. In the first half of the 20th century, the efforts of the early pioneers of quantum mechanics laid the groundwork for this understanding: quantum field theory. Through the tireless efforts of researchers during the latter half of the 20th century, many ideas came together to form what we now call the Standard Model (SM) of particle physics. Finally, it was through the ideas of the renormalization group and effective field theory that the understanding of how the SM fits into a larger framework of particle physics was crystallized. In the past four years the Large Hadron Collider (LHC) has made more precise measurements than ever before, and the SM is known to be in excellent agreement with these measurements. As a result of this agreement with data, the SM continues to play such a central role in modern particle p...

  2. System and software safety analysis for the ERA control computer

    International Nuclear Information System (INIS)

    Beerthuizen, P.G.; Kruidhof, W.

    2001-01-01

    The European Robotic Arm (ERA) is a seven-degrees-of-freedom relocatable anthropomorphic robotic manipulator system, to be used in manned space operation on the International Space Station, supporting the assembly and external servicing of the Russian segment. The safety design concept and implementation of the ERA are described, in particular with respect to the central computer's software design. A top-down analysis and specification process is used to flow down the safety aspects of the ERA system to the subsystems, which are produced by a consortium of companies in many countries. The user requirements documents and the critical function list are the key documents in this process. Bottom-up analysis (FMECA) and test, on both subsystem and system level, are the basis for safety verification. A number of examples show the use of the approach and methods used.

  3. Leprosy: International Public Health Policies and Public Health Eras

    Directory of Open Access Journals (Sweden)

    Niyi Awofeso

    2011-09-01

    Full Text Available Public health policies continue to play important roles in national and international health reforms. However, the influence and legacies of the public health eras during which such policies are formulated remain largely underappreciated. The limited appreciation of this relationship may hinder consistent adoption of public health policies by nation-states, and encumber disinvestment from ineffective or anachronistic policies. This article reviews seven public health eras and highlights how each era has influenced international policy formulation for leprosy control—“the fertile soil for policy learning”. The author reiterates the role of health leadership and health activism in facilitating consistency in international health policy formulation and implementation for leprosy control.

  4. Participatory Design in an Era of Participation, Special Issue

    DEFF Research Database (Denmark)

    This special issue on participatory design in an era of participation presents emerging topics and discussions from the thirteenth Participatory Design conference (PDC), held at Aarhus University in August 2016. The PDC 2016 marked the twenty-fifth anniversary of the Participatory Design conference series, which began in 1990 with the first biannual conference in Seattle. Since then, the PDC conferences have continued to bring together a multidisciplinary, international community of researchers and practitioners around issues of cooperative design. The theme for the 2016 PDC conference was 'Participatory Design in an Era of Participation.' Critical and constructive discussions were invited on the values, characteristics, politics and future practices of participatory design in an era in which participation has now become pervasive (Bossen, Smith, Kanstrup, McDonnell, et al. 2016, Bossen, Smith...

  5. Temperature-driven global sea-level variability in the Common Era

    Science.gov (United States)

    Kopp, Robert E.; Kemp, Andrew C.; Bittermann, Klaus; Horton, Benjamin P.; Donnelly, Jeffrey P.; Gehrels, W. Roland; Hay, Carling C.; Mitrovica, Jerry X.; Morrow, Eric D.; Rahmstorf, Stefan

    2016-01-01

    We assess the relationship between temperature and global sea-level (GSL) variability over the Common Era through a statistical meta-analysis of proxy relative sea-level reconstructions and tide-gauge data. GSL rose at 0.1 ± 0.1 mm/y (2σ) over 0–700 CE. A GSL fall of 0.2 ± 0.2 mm/y over 1000–1400 CE is associated with ∼0.2 °C global mean cooling. A significant GSL acceleration began in the 19th century and yielded a 20th century rise that is extremely likely (probability P≥0.95) faster than during any of the previous 27 centuries. A semiempirical model calibrated against the GSL reconstruction indicates that, in the absence of anthropogenic climate change, it is extremely likely (P=0.95) that 20th century GSL would have risen by less than 51% of the observed 13.8±1.5 cm. The new semiempirical model largely reconciles previous differences between semiempirical 21st century GSL projections and the process model-based projections summarized in the Intergovernmental Panel on Climate Change's Fifth Assessment Report. PMID:26903659
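Rates such as the 0.1 ± 0.1 mm/y figure above are linear trends fitted to a reconstruction over an interval. A sketch of that arithmetic on a synthetic series; the trend and noise level below are assumptions for illustration, not the paper's data:

```python
import random

# Synthetic stand-in for a sea-level reconstruction (year CE, level in mm);
# the real reconstruction is a statistical meta-analysis of proxy records.
random.seed(1)
years = list(range(0, 701, 10))
gsl = [0.1 * t + random.gauss(0.0, 5.0) for t in years]  # ~0.1 mm/y trend + noise

# Ordinary least-squares slope = linear rate of rise (mm/y).
n = len(years)
mean_t = sum(years) / n
mean_y = sum(gsl) / n
slope = sum((t - mean_t) * (y - mean_y) for t, y in zip(years, gsl)) \
        / sum((t - mean_t) ** 2 for t in years)

print(f"estimated rate: {slope:.3f} mm/y")
```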

  6. An observational and modeling study of the August 2017 Florida climate extreme event.

    Science.gov (United States)

    Konduru, R.; Singh, V.; Routray, A.

    2017-12-01

    A special report on climate extremes by the Intergovernmental Panel on Climate Change (IPCC) elucidates that the cause of disasters is the exposure and vulnerability of human and natural systems to climate extremes. The cause of such an extreme could be anthropogenic or non-anthropogenic, so it is challenging to discern the critical factor of influence for a particular event; a perceptive study with reasonable confidence on climate extreme events is possible only if past case studies exist. A similarly rare climate extreme problem was encountered in the case of the Houston floods and extreme rainfall over Florida in August 2017, when a succession of hurricanes, Harvey and Irma, targeted the region and caused catastrophe. The rarity of the August 2017 Florida climate extreme event calls for an in-depth study of this case. To understand the multi-faceted nature of the event, a study of the development, progression, and dynamics of Hurricane Harvey is significant, and the current article focuses on an observational and modeling study of the hurricane. A global model named NCUM (the UK Met Office Unified Model (UM) operational at the National Centre for Medium Range Weather Forecasting, India) was utilized to simulate Hurricane Harvey. The simulated rainfall and wind fields were compared with observational datasets such as Tropical Rainfall Measuring Mission rainfall and ERA-Interim wind fields. The National Centers for Environmental Prediction (NCEP) automated tracking system was utilized to track Hurricane Harvey, and the forecast tracks were analyzed statistically against the Joint Typhoon Warning Center track. Further, the current study will be continued to investigate the atmospheric processes involved in the August 2017 Florida climate extreme event.

  7. Exascale Co-design for Modeling Materials in Extreme Environments

    Energy Technology Data Exchange (ETDEWEB)

    Germann, Timothy C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-07-08

    Computational materials science has provided great insight into the response of materials under extreme conditions that are difficult to probe experimentally. For example, shock-induced plasticity and phase transformation processes in single-crystal and nanocrystalline metals have been widely studied via large-scale molecular dynamics simulations, and many of these predictions are beginning to be tested at advanced 4th generation light sources such as the Advanced Photon Source (APS) and Linac Coherent Light Source (LCLS). I will describe our simulation predictions and their recent verification at LCLS, outstanding challenges in modeling the response of materials to extreme mechanical and radiation environments, and our efforts to tackle these as part of the multi-institutional, multi-disciplinary Exascale Co-design Center for Materials in Extreme Environments (ExMatEx). ExMatEx has initiated an early and deep collaboration between domain (computational materials) scientists, applied mathematicians, computer scientists, and hardware architects, in order to establish the relationships between algorithms, software stacks, and architectures needed to enable exascale-ready materials science application codes within the next decade. We anticipate that we will be able to exploit hierarchical, heterogeneous architectures to achieve more realistic large-scale simulations with adaptive physics refinement, and are using tractable application scale-bridging proxy application testbeds to assess new approaches and requirements. Such current scale-bridging strategies accumulate (or recompute) a distributed response database from fine-scale calculations, in a top-down rather than bottom-up multiscale approach.

  8. Probabilistic attribution of individual unprecedented extreme events

    Science.gov (United States)

    Diffenbaugh, N. S.

    2016-12-01

    The last decade has seen a rapid increase in efforts to understand the influence of global warming on individual extreme climate events. Although trends in the distributions of climate observations have been thoroughly analyzed, rigorously quantifying the contribution of global-scale warming to individual events that are unprecedented in the observed record presents a particular challenge. This paper describes a method for leveraging observations and climate model ensembles to quantify the influence of historical global warming on the severity and probability of unprecedented events. This approach uses formal inferential techniques to quantify four metrics: (1) the contribution of the observed trend to the event magnitude, (2) the contribution of the observed trend to the event probability, (3) the probability of the observed trend in the current climate and a climate without human influence, and (4) the probability of the event magnitude in the current climate and a climate without human influence. Illustrative examples are presented, spanning a range of climate variables, timescales, and regions. These examples illustrate that global warming can influence the severity and probability of unprecedented extremes. In some cases - particularly high temperatures - this change is indicated by changes in the mean. However, changes in probability do not always arise from changes in the mean, suggesting that global warming can alter the frequency with which complex physical conditions co-occur. Because our framework is transparent and highly generalized, it can be readily applied to a range of climate events, regions, and levels of climate forcing.
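Metric (4) above, the probability of the event magnitude in the current climate versus a climate without human influence, can be sketched as an exceedance count in two model ensembles. The ensemble distributions and the event threshold below are invented for illustration; the paper's actual method uses formal inferential techniques on observations and climate model ensembles, not this toy calculation:

```python
import random

# Hypothetical "factual" (with historical warming) and "counterfactual"
# (without) ensembles of an annual temperature anomaly (deg C).
random.seed(42)
factual = [random.gauss(1.0, 0.5) for _ in range(100_000)]
counterfactual = [random.gauss(0.0, 0.5) for _ in range(100_000)]

event_magnitude = 1.5  # an "unprecedented" observed anomaly (assumed value)

# Exceedance probability of the event magnitude in each climate.
p_factual = sum(x >= event_magnitude for x in factual) / len(factual)
p_counterfactual = sum(x >= event_magnitude for x in counterfactual) / len(counterfactual)

print(f"P(event | current climate)    = {p_factual:.4f}")
print(f"P(event | no human influence) = {p_counterfactual:.5f}")
print(f"probability ratio             = {p_factual / p_counterfactual:.0f}")
```

The ratio p_factual / p_counterfactual is the familiar probability (risk) ratio used in event attribution; shifting only the ensemble mean, as here, already changes the tail probability by orders of magnitude.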

  9. Extreme Transients in the High Energy Universe

    Science.gov (United States)

    Kouveliotou, Chryssa

    2013-01-01

    The High Energy Universe is rich in diverse populations of objects spanning the entire cosmological (time)scale, from our own present-day Milky Way to the re-ionization epoch. Several of these are associated with extreme conditions irreproducible in laboratories on Earth. Their study thus sheds light on the behavior of matter under extreme conditions, such as super-strong magnetic fields (in excess of 10^14 G), high gravitational potentials (e.g., Super Massive Black Holes), and very energetic collimated explosions resulting in relativistic jet flows (e.g., Gamma Ray Bursts, exceeding 10^53 ergs). In the last thirty years, my work has been mostly focused on two apparently different but potentially linked populations of such transients: magnetars (highly magnetized neutron stars) and Gamma Ray Bursts (strongly beamed emission from relativistic jets), two populations that constitute unique astrophysical laboratories, while also giving us the tools to probe matter conditions in the Universe to redshifts beyond z=10, when the first stars and galaxies were assembled. I did not make this journey alone; I have either led or participated in several international collaborations studying these phenomena in multi-wavelength observations; solitary perfection is not sufficient anymore in the world of High Energy Astrophysics. I will describe this journey, present crucial observational breakthroughs, discuss key results and muse on the future of this field.

  10. Projecting changes in regional temperature and precipitation extremes in the United States

    OpenAIRE

    Justin T. Schoof; Scott M. Robeson

    2016-01-01

    Regional and local climate extremes, and their impacts, result from the multifaceted interplay between large-scale climate forcing, local environmental factors (physiography), and societal vulnerability. In this paper, we review historical and projected changes in temperature and precipitation extremes in the United States, with a focus on strengths and weaknesses of (1) commonly used definitions for extremes such as thresholds and percentiles, (2) statistical approaches to quantifying change...

  11. [Embracing medical innovation in the era of big data].

    Science.gov (United States)

    You, Suning

    2015-01-01

    Along with the worldwide advent of the big data era, the medical field inevitably has to place itself within it. The current article introduces the basic concepts of big data and points out the coexistence of its advantages and disadvantages. Although innovation in the medical field is a struggle, the current medical pattern will be changed fundamentally by big data. The article also describes the rapid change of relevant analyses in the big data era, depicts the promise of digital medicine, and offers some advice to surgeons.

  12. Governments, contractors seen headed for era of cooperation

    International Nuclear Information System (INIS)

    McHaffie, E.R.; Jarvis, M.G.; Barber, S.A.

    1993-01-01

    The oil and gas industry is on the threshold of a new era in international oil and gas investments. It will be an era of increasing flexibility and cooperation among investors, oil companies, and host governments. And it will develop as a necessary response to flat oil prices and the growing mobility of capital. This article will cover three essential elements of this topic: the effect of fiscal terms on the economics of a project investment; how recent changes in global economies have impacted the economics of the projects the authors evaluated at Amoco; some potential trends that fiscal provisions, and the pattern of investment in the upstream petroleum industry, may take

  13. La era de la información

    OpenAIRE

    Florez Calderón, Mauro

    2011-01-01

    The information revolution, based on information, is limited only by the frontiers of knowledge. Information, unlike a material good, is inalienable and cumulative, and does not produce the kinds of environmental deterioration created by industry; for this reason, some specialists call the information age the post-industrial era. If it is foolish to try to understand the processes produced by the industrial revolution with a pastoral mentality, it will be far more foolish to try...

  14. E-Polmas: Paradigma Baru Pemolisian Masyarakat Era Digital

    Directory of Open Access Journals (Sweden)

    Bayu Suseno

    2016-05-01

    Full Text Available This paper offers a new perspective on community policing (polmas) in the digital era, using a case study of cybercrime handled by Polrestabes Semarang. Given the high number of technology-based crimes handled by the police, there is an urgent need to re-examine the community policing approach used so far. The author presents a new view of the concept of community policing for the digital era, or E-Polmas. E-Polmas is a development of the existing Polmas concept, but it focuses on the media used to deliver public order and security (kamtibmas) messages to the community: what was once carried out manually and conventionally is shifted online, making use of existing social media.

  15. Identitas Moral: Rekonstruksi Identitas Keindonesiaan pada Era Globalisasi Budaya

    Directory of Open Access Journals (Sweden)

    Leonardus Pandu Hapsoro

    2016-06-01

    Full Text Available This paper discusses the process by which actors in the Kultura Indonesia Star Society (KISS) community reconstruct their Indonesian identity in an era of cultural globalization. Using qualitative methods and Zygmunt Bauman's framework of liquid modernity and moral agency, it shows how the reconstruction of moral identity begins with actors' unease about the condition of traditional culture under globalization. The author argues that the expression of the actors' identity through this social movement helps create cultural diversity in the era of globalization and liquid modernity, and moves away from an agentless view of globalization by examining the dynamics of agency. Through agents and forms of moral agency, the author contends that amid the processes and impacts of globalization, people are not trapped in the condition of what "is", or in structural pressure; rather, through awareness of identity and morality, there is optimism about a condition that "ought" to be, or could be better. The moral identity constructed in the era of globalization becomes the anchor for agents to act and to preserve traditional culture with motivation, passion, and hope, shaping the agents' social practices in daily life: first, by forming the KISS community; second, as the cornerstone of actors to act

  16. Report from the 4th Workshop on Extremely Large Databases

    Directory of Open Access Journals (Sweden)

    Jacek Becla

    2011-02-01

    Full Text Available Academic and industrial users are increasingly facing the challenge of petabytes of data, but managing and analyzing such large data sets remains a daunting task. The 4th Extremely Large Databases workshop was organized to examine the needs of communities facing these issues that were under-represented at the past workshops. Approaches to big data statistical analytics, as well as opportunities related to emerging hardware technologies, were also debated. Writable extreme scale databases and the science benchmark were discussed. This paper is the final report of the discussions and activities at this workshop.

  17. Predictability and possible earlier awareness of extreme precipitation across Europe

    Science.gov (United States)

    Lavers, David; Pappenberger, Florian; Richardson, David; Zsoter, Ervin

    2017-04-01

    Extreme hydrological events can cause large socioeconomic damages in Europe. In winter, a large proportion of these flood episodes are associated with atmospheric rivers, regions of intense water vapour transport within the warm sector of extratropical cyclones. When preparing for such extreme events, forecasts of precipitation from numerical weather prediction models or river discharge forecasts from hydrological models are generally used. Given the strong link between water vapour transport (integrated vapour transport, IVT) and heavy precipitation, it is possible that IVT could be used to warn of extreme events. Furthermore, as IVT is located in extratropical cyclones, it is hypothesized to be a more predictable variable due to its link with synoptic-scale atmospheric dynamics. In this research, we firstly provide an overview of the predictability of IVT and precipitation forecasts, and secondly introduce and evaluate the ECMWF Extreme Forecast Index (EFI) for IVT. The EFI is a tool that has been developed to evaluate how ensemble forecasts differ from the model climate, thus revealing the extremeness of the forecast. The ability of the IVT EFI to capture extreme precipitation across Europe during winter 2013/14, 2014/15, and 2015/16 is presented. The results show that the IVT EFI is more capable than the precipitation EFI of identifying extreme precipitation in forecast week 2 during forecasts initialized in a positive North Atlantic Oscillation (NAO) phase. However, the precipitation EFI is superior during the negative NAO phase and at shorter lead times. An IVT EFI example is shown for storm Desmond in December 2015, highlighting its potential to identify upcoming hydrometeorological extremes.
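The EFI idea described above can be sketched numerically. The block below is a minimal illustration, not ECMWF's implementation: it integrates the gap between the model-climate quantiles and the fraction of ensemble members falling below them, following the commonly cited EFI formula; the Gaussian toy data, the 51-member ensemble size, and the probability grid are all assumptions.

```python
import numpy as np

def extreme_forecast_index(ensemble, climate):
    """EFI-style score in [-1, 1]: compares the ensemble forecast CDF
    with the model-climate quantiles; values near +1 flag a forecast
    far into the upper tail of the model climate."""
    p = np.linspace(0.01, 0.99, 99)              # climate probability levels
    q = np.quantile(climate, p)                  # model-climate quantiles
    # F_f(p): fraction of ensemble members below each climate quantile
    Ff = np.array([(ensemble < qi).mean() for qi in q])
    integrand = (p - Ff) / np.sqrt(p * (1 - p))
    # trapezoidal integration over the probability levels
    area = np.sum((integrand[1:] + integrand[:-1]) / 2 * np.diff(p))
    return (2 / np.pi) * area

rng = np.random.default_rng(0)
climate = rng.normal(0.0, 1.0, 10000)            # toy model climate
calm = rng.normal(0.0, 1.0, 51)                  # unremarkable 51-member forecast
wild = rng.normal(3.0, 1.0, 51)                  # forecast far above climatology
print(extreme_forecast_index(calm, climate))     # near 0
print(extreme_forecast_index(wild, climate))     # close to 1
```

A forecast drawn from the climate itself scores near zero, while one shifted three climatological standard deviations up scores close to the +1 extreme.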

  18. Exoplanets: A New Era of Comparative Planetology

    Science.gov (United States)

    Meadows, Victoria

    2014-11-01

    We now know of over 1700 planets orbiting other stars, and several thousand additional planetary candidates. These discoveries have the potential to revolutionize our understanding of planet formation and evolution, while providing targets for the search for life beyond the Solar System. Exoplanets display a larger diversity of planetary types than those seen in our Solar System - including low-density, low-mass objects. They are also found in planetary system architectures very different from our own, even for stars similar to our Sun. Over 20 potentially habitable planets are now known, and half of the M dwarf stars in our Galaxy may harbor a habitable planet. M dwarfs are plentiful, and they are therefore the most likely habitable planet hosts, but their planets will have radiative and gravitational interactions with their star and sibling planets that are unlike those in our Solar System. Observations to characterize the atmospheres and surfaces of exoplanets are extremely challenging, and transit transmission spectroscopy has been used to measure atmospheric composition for a handful of candidates. Frustratingly, many of the smaller exoplanets have flat, featureless spectra indicative of planet-wide haze or clouds. The James Webb Space Telescope and future ground-based telescopes will improve transit transmission characterization, and enable the first search for signs of life in terrestrial exoplanet atmospheres. Beyond JWST, planned next-generation space telescopes will directly image terrestrial exoplanets, allowing surface and atmospheric characterization that is more robust to haze. Until these observations become available, there is a lot that we can do as planetary scientists to inform required measurements and future data interpretation. Solar System planets can be used as validation targets for extrasolar planet observations and models. The rich heritage of planetary science models can also be used to explore the potential diversity of exoplanet

  19. Expected impacts of climate change on extreme climate events

    International Nuclear Information System (INIS)

    Planton, S.; Deque, M.; Chauvin, F.; Terray, L.

    2008-01-01

    An overview of the expected change of climate extremes during this century due to greenhouse gases and aerosol anthropogenic emissions is presented. The most commonly used methodologies rely on the dynamical or statistical down-scaling of climate projections, performed with coupled atmosphere-ocean general circulation models. Whether of dynamical or statistical type, down-scaling methods present strengths and weaknesses, but neither their validation on present climate conditions nor their potential ability to project the impact of climate change on extreme event statistics gives a clear advantage to either type. The results synthesized in the last IPCC report and more recent studies underline a convergence for a very likely increase in heat wave episodes over land surfaces, linked to the mean warming and the increase in temperature variability. In addition, the number of days of frost should decrease and the growing season length should increase. The projected increase in heavy precipitation events also appears very likely over most areas and seems linked to a change in the shape of the precipitation intensity distribution. The global trends for drought duration are less consistent between models and down-scaling methodologies, due to their regional variability. The change of wind-related extremes is also regionally dependent, and associated with a poleward displacement of the mid-latitude storm tracks. The specific study of extreme events over France reveals the high sensitivity of some statistics of climate extremes at the decadal time scale as a consequence of regional climate internal variability. (authors)

  20. The extremity function index (EFI), a disability severity measure for neuromuscular diseases : psychometric evaluation

    NARCIS (Netherlands)

    Bos, Isaac; Wynia, Klaske; Drost, Gea; Almansa, Josué; Kuks, Joannes

    2017-01-01

    OBJECTIVE: To adapt and to combine the self-report Upper Extremity Functional Index and Lower Extremity Function Scale, for the assessment of disability severity in patients with a neuromuscular disease and to examine its psychometric properties in order to make it suitable for indicating disease

  1. Clean coal technology: The new coal era

    Energy Technology Data Exchange (ETDEWEB)

    1994-01-01

    The Clean Coal Technology Program is a government and industry cofunded effort to demonstrate a new generation of innovative coal processes in a series of full-scale "showcase" facilities built across the country. Begun in 1986 and expanded in 1987, the program is expected to finance more than $6.8 billion of projects. Nearly two-thirds of the funding will come from the private sector, well above the 50 percent industry co-funding expected when the program began. The original recommendation for a multi-billion dollar clean coal demonstration program came from the US and Canadian Special Envoys on Acid Rain. In January 1986, Special Envoys Lewis and Davis presented their recommendations. Included was the call for a 5-year, $5-billion program in the US to demonstrate, at commercial scale, innovative clean coal technologies that were beginning to emerge from research programs both in the US and elsewhere in the world. As the Envoys said: if the menu of control options was expanded, and if the new options were significantly cheaper, yet highly efficient, it would be easier to formulate an acid rain control plan that would have broader public appeal.

  2. Pediatric lower extremity mower injuries.

    Science.gov (United States)

    Hill, Sean M; Elwood, Eric T

    2011-09-01

    Lawn mower injuries in children represent an unfortunately common problem for the plastic reconstructive surgeon. There are approximately 68,000 reported per year in the United States. Compounding this problem is the fact that a standard treatment algorithm does not exist. This study follows a series of 7 pediatric patients treated for lower extremity mower injuries by a single plastic surgeon. The extent of soft tissue injury varied. All patients were treated with negative pressure wound therapy as a bridge to definitive closure. Of the 7 patients, 4 required skin grafts, 1 required primary closure, 1 underwent a lower extremity amputation secondary to wounds, and 1 was repaired using a cross-leg flap. Functional limitations were minimal for all of our patients after reconstruction. Our basic treatment algorithm is presented: initial debridement followed by the simplest possible method of wound closure, using negative pressure wound therapy if necessary.

  3. Extreme Conditions Modeling Workshop Report

    Energy Technology Data Exchange (ETDEWEB)

    Coe, Ryan Geoffrey [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Neary, Vincent Sinclair [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Lawon, Michael J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Yu, Yi-Hsiang [National Renewable Energy Lab. (NREL), Golden, CO (United States); Weber, Jochem [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2014-07-01

    Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) hosted the Wave Energy Converter (WEC) Extreme Conditions Modeling (ECM) Workshop in Albuquerque, New Mexico on May 13–14, 2014. The objective of the workshop was to review the current state of knowledge on how to numerically and experimentally model WECs in extreme conditions (e.g. large ocean storms) and to suggest how national laboratory resources could be used to improve ECM methods for the benefit of the wave energy industry. More than 30 U.S. and European WEC experts from industry, academia, and national research institutes attended the workshop, which consisted of presentations from WEC developers, invited keynote presentations from subject matter experts, breakout sessions, and a final plenary session.

  4. Extreme project. Progress report 2006

    International Nuclear Information System (INIS)

    Eyrolle, F.; Masson, O.; Charmasson, S.

    2007-01-01

    The E.X.T.R.E.M.E. project, introduced in 2005 to the S.E.S.U.R.E. / L.E.R.C.M., aims to acquire data on the consequences of extreme climatic and meteorological episodes on the distribution of artificial radioisotopes within the various compartments of the geosphere. This report presents a synthesis of the actions developed in 2006: positioning and co-financing of the project by means of regional or national research programs (C.A.R.M.A., E.X.T.R.E.M.A., E.C.C.O.R.E.V.I.), data acquisition, valuation, and scientific collaboration. (N.C.)

  5. On causality of extreme events

    Directory of Open Access Journals (Sweden)

    Massimiliano Zanin

    2016-06-01

    Full Text Available Multiple metrics have been developed to detect causality relations between data describing the elements constituting complex systems, all of them considering their evolution through time. Here we propose a metric able to detect causality within static data sets, by analysing how extreme events in one element correspond to the appearance of extreme events in a second one. The metric is able to detect non-linear causalities; to analyse both cross-sectional and longitudinal data sets; and to discriminate between real causalities and correlations caused by confounding factors. We validate the metric through synthetic data, dynamical and chaotic systems, and data representing the human brain activity in a cognitive task. We further show how the proposed metric is able to outperform classical causality metrics, provided non-linear relationships are present and large enough data sets are available.
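The core idea, checking how extreme events in one element correspond to extreme events in a second one, can be illustrated with a toy score (this is not the paper's metric; the 95th-percentile threshold and the synthetic data are assumptions): measure how much more often extremes of `y` occur when `x` is extreme than they do overall.

```python
import numpy as np

def extreme_cooccurrence(x, y, q=0.95):
    """Toy association score between extremes of x and extremes of y:
    P(y extreme | x extreme) minus the base rate P(y extreme),
    so 0 means no association and positive values suggest a link."""
    x, y = np.asarray(x), np.asarray(y)
    ex = x > np.quantile(x, q)      # extreme events in x
    ey = y > np.quantile(y, q)      # extreme events in y
    base = ey.mean()                # unconditional extreme rate of y
    cond = ey[ex].mean() if ex.any() else base
    return cond - base

rng = np.random.default_rng(1)
driver = rng.normal(size=5000)
coupled = driver + 0.3 * rng.normal(size=5000)    # shares extremes with driver
independent = rng.normal(size=5000)
print(extreme_cooccurrence(driver, coupled))      # clearly positive
print(extreme_cooccurrence(driver, independent))  # near zero
```

Unlike the paper's metric, this sketch does not discriminate causality from confounding; it only makes the "extremes co-occur" intuition concrete.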

  6. Extreme Nonlinear Optics An Introduction

    CERN Document Server

    Wegener, Martin

    2005-01-01

    Following the birth of the laser in 1960, the field of "nonlinear optics" rapidly emerged. Today, laser intensities and pulse durations are readily available, for which the concepts and approximations of traditional nonlinear optics no longer apply. In this regime of "extreme nonlinear optics," a large variety of novel and unusual effects arise, for example frequency doubling in inversion symmetric materials or high-harmonic generation in gases, which can lead to attosecond electromagnetic pulses or pulse trains. Other examples of "extreme nonlinear optics" cover diverse areas such as solid-state physics, atomic physics, relativistic free electrons in a vacuum and even the vacuum itself. This book starts with an introduction to the field based primarily on extensions of two famous textbook examples, namely the Lorentz oscillator model and the Drude model. Here the level of sophistication should be accessible to any undergraduate physics student. Many graphical illustrations and examples are given. The followi...

  7. Modeling The Atmosphere In The Era Of Big Data From Extremely Wide Field-Of-View Telescopes

    Science.gov (United States)

    Gonzalez Quiles, Junellie; Nordin, Jakob

    2018-01-01

    Surveys like the Sloan Digital Sky Survey (SDSS), Pan-STARRS and the Palomar Transient Factory Survey (PTF) receive large amounts of data, which need to be processed and calibrated in order to correct for various factors. One of the limiting factors in obtaining high quality data is the atmosphere, and it is therefore essential to find the appropriate calibration for the atmospheric extinction. It is to be expected that a physical atmospheric model, compared to the photometric calibration currently used by PTF, is more effective in calibrating for the atmospheric extinction due to its ability to account for rapid atmospheric fluctuation and objects of different colors. We focused on creating tools to model the atmospheric extinction for the upcoming Zwicky Transient Facility (ZTF) survey. In order to model the atmosphere, we created a program that combines input data and catalogue values, and efficiently handles them. Then, using PTF data and the SDSS catalogue, we created several models to fit the data, and tested the quality of the fits by chi-square minimization. This will allow us to optimize atmospheric extinction for the upcoming ZTF in the near future.
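The fitting step can be sketched as follows. This is a hypothetical stand-in for the tools described, assuming the simplest extinction model (observed magnitude = zero point + extinction coefficient × airmass); all numbers are invented. For a linear model with Gaussian errors, chi-square minimization reduces to weighted least squares.

```python
import numpy as np

# Simulate catalogue-matched stars: observed magnitude = zero point
# + extinction coefficient k times airmass X, plus photometric noise.
rng = np.random.default_rng(2)
true_zp, true_k = 25.0, 0.15             # mag, mag per airmass (assumed)
airmass = rng.uniform(1.0, 2.5, 200)
sigma = 0.02                             # photometric uncertainty (mag, assumed)
m_inst = true_zp + true_k * airmass + rng.normal(0, sigma, 200)

# Chi-square minimization of the linear model m = zp + k * X:
# divide the design matrix and data by sigma, then solve least squares.
A = np.column_stack([np.ones_like(airmass), airmass]) / sigma
b = m_inst / sigma
(zp_fit, k_fit), *_ = np.linalg.lstsq(A, b, rcond=None)
print(f"zero point = {zp_fit:.3f}, extinction k = {k_fit:.3f}")
```

With 200 stars the recovered zero point and extinction coefficient land close to the injected values, which is the basic sanity check behind testing fit quality by chi-square.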

  8. Promoting Exit from Violent Extremism

    DEFF Research Database (Denmark)

    Dalgaard-Nielsen, Anja

    2013-01-01

    A number of Western countries are currently adding exit programs targeting militant Islamists to their counterterrorism efforts. Drawing on research into voluntary exit from violent extremism, this article identifies themes and issues that seem to cause doubt, leading to exit. It then provides a ... the influence attempt as subtle as possible, use narratives and self-affirmatory strategies to reduce resistance to persuasion, and consider the possibility of promoting attitudinal change via behavioral change as an alternative to seeking to influence beliefs directly.

  9. Racial Extremism in the Army

    Science.gov (United States)

    1998-04-01

    The Separation of Powers Doctrine ... to the military. This deference has a two-fold basis. First, the separation of powers in the U.S. Constitution gives authority to the executive (and ... Why should there be judicial deference to the Army's policy on extremism? There are two principal reasons. First, the Constitution's separation of powers doctrine ...

  10. Large Extremity Peripheral Nerve Repair

    Science.gov (United States)

    2016-12-01

    In current war trauma, 20-30% of all extremity injuries, and >80% of penetrating injuries, are associated with peripheral nerve ... through both axonal advance and in revascularization of the graft following placement. We are confident that this technology may allow us to ...

  11. Technology improves upper extremity rehabilitation.

    Science.gov (United States)

    Kowalczewski, Jan; Prochazka, Arthur

    2011-01-01

    Stroke survivors with hemiparesis and spinal cord injury (SCI) survivors with tetraplegia find it difficult or impossible to perform many activities of daily life. There is growing evidence that intensive exercise therapy, especially when supplemented with functional electrical stimulation (FES), can improve upper extremity function, but delivering the treatment can be costly, particularly after recipients leave rehabilitation facilities. Recently, there has been a growing level of interest among researchers and healthcare policymakers to deliver upper extremity treatments to people in their homes using in-home teletherapy (IHT). The few studies that have been carried out so far have encountered a variety of logistical and technical problems, not least the difficulty of conducting properly controlled and blinded protocols that satisfy the requirements of high-level evidence-based research. In most cases, the equipment and communications technology were not designed for individuals with upper extremity disability. It is clear that exercise therapy combined with interventions such as FES, supervised over the Internet, will soon be adopted worldwide in one form or another. Therefore it is timely that researchers, clinicians, and healthcare planners interested in assessing IHT be aware of the pros and cons of the new technology and the factors involved in designing appropriate studies of it. It is crucial to understand the technical barriers, the role of telesupervisors, the motor improvements that participants can reasonably expect and the process of optimizing IHT-exercise therapy protocols to maximize the benefits of the emerging technology. Copyright © 2011 Elsevier B.V. All rights reserved.

  12. Typologies of extreme longevity myths.

    Science.gov (United States)

    Young, Robert D; Desjardins, Bertrand; McLaughlin, Kirsten; Poulain, Michel; Perls, Thomas T

    2010-01-01

    Purpose. Political, national, religious, and other motivations have led the media and even scientists to errantly accept extreme longevity claims prima facie. We describe various causes of false claims of extraordinary longevity. Design and Methods. American Social Security Death Index files for the period 1980-2009 were queried for individuals with birth and death dates yielding ages 110+ years of age. Frequency was compared to a list of age-validated supercentenarians maintained by the Gerontology Research Group who died during the same time period. Age claims of 110+ years and the age validation experiences of the authors facilitated a list of typologies of false age claims. Results. Invalid age claim rates increase with age from 65% at age 110-111 to 98% by age 115 to 100% for 120+ years. Eleven typologies of false claims were: Religious Authority Myth, Village Elder Myth, Fountain of Youth Myth (substance), Shangri-La Myth (geographic), Nationalist Pride, Spiritual Practice, Familial Longevity, Individual and/or Family Notoriety, Military Service, Administrative Entry Error, and Pension-Social Entitlement Fraud. Conclusions. Understanding various causes of false extreme age claims is important for placing current, past, and future extreme longevity claims in context and for providing a necessary level of skepticism.
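The querying step, computing age at death from birth and death dates and flagging claims of 110+ years for validation rather than accepting them prima facie, can be sketched like this (the records below are invented for illustration, not Death Index data):

```python
from datetime import date

def age_at_death(birth: date, death: date) -> int:
    """Completed years lived: subtract one year if death falls
    before the birthday in the final calendar year."""
    years = death.year - birth.year
    if (death.month, death.day) < (birth.month, birth.day):
        years -= 1
    return years

# Hypothetical records: (identifier, birth date, death date)
records = [
    ("A", date(1871, 3, 2), date(1985, 6, 1)),    # claimed supercentenarian
    ("B", date(1900, 7, 15), date(1988, 7, 14)),  # ordinary lifespan
]

# Flag every claim of 110+ years for age validation
claims = [(name, age_at_death(b, d)) for name, b, d in records
          if age_at_death(b, d) >= 110]
print(claims)   # [('A', 114)]
```

Only record A crosses the 110-year supercentenarian threshold and would be passed on for the validation process the study describes.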

  13. Moderate and extreme maternal obesity.

    LENUS (Irish Health Repository)

    Abdelmaboud, M O

    2012-05-01

    The aim of this study was to investigate the prevalence of moderate and extreme obesity among an Irish obstetric population over a 10-year period, and to evaluate the obstetric features of such pregnancies. Of 31,869 women delivered during the years 2000-2009, there were 306 women in the study group, including 173 in the moderate or Class 2 obese category (BMI 35-39.9) and 133 in the extreme or Class 3 obese category (BMI ≥ 40). The prevalence of obese women with BMI ≥ 35 was 9.6 per 1000 (0.96%), with an upward trend observed from 2.1 per 1000 in the year 2000, to 11.8 per 1000 in the year 2009 (P = 0.001). There was an increase in emergency caesarean section (EMCS) risk for primigravida versus multigravid women, within both obese categories (P < 0.001). However, there was no significant difference in EMCS rates observed between Class 2 and Class 3 obese women, when matched for parity. The prevalence of moderate and extreme obesity reported in this population is high, and appears to be increasing. The increased rates of abdominal delivery, and the levels of associated morbidity observed, have serious implications for such women embarking on pregnancy.
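The BMI categories and the headline rate in this abstract can be checked with a few lines (class thresholds as given in the study; the example BMI values are invented):

```python
def obesity_class(bmi: float) -> str:
    """Class 2 (moderate): BMI 35-39.9; Class 3 (extreme): BMI >= 40."""
    if bmi >= 40:
        return "Class 3 (extreme)"
    if bmi >= 35:
        return "Class 2 (moderate)"
    return "below study threshold"

print(obesity_class(37.2))   # Class 2 (moderate)
print(obesity_class(41.0))   # Class 3 (extreme)

# Prevalence of BMI >= 35: 306 study-group women of 31,869 delivered
prevalence_per_1000 = 306 / 31869 * 1000
print(round(prevalence_per_1000, 1))   # 9.6, matching the reported rate
```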

  14. Attribution of climate extreme events

    Science.gov (United States)

    Trenberth, Kevin E.; Fasullo, John T.; Shepherd, Theodore G.

    2015-08-01

    There is a tremendous desire to attribute causes to weather and climate events that is often challenging from a physical standpoint. Headlines attributing an event solely to either human-induced climate change or natural variability can be misleading when both are invariably in play. The conventional attribution framework struggles with dynamically driven extremes because of the small signal-to-noise ratios and often uncertain nature of the forced changes. Here, we suggest that a different framing is desirable, which asks why such extremes unfold the way they do. Specifically, we suggest that it is more useful to regard the extreme circulation regime or weather event as being largely unaffected by climate change, and question whether known changes in the climate system's thermodynamic state affected the impact of the particular event. Some examples briefly illustrated include 'snowmaggedon' in February 2010, superstorm Sandy in October 2012 and supertyphoon Haiyan in November 2013, and, in more detail, the Boulder floods of September 2013, all of which were influenced by high sea surface temperatures that had a discernible human component.

  15. Typologies of Extreme Longevity Myths

    Directory of Open Access Journals (Sweden)

    Robert D. Young

    2010-01-01

    Full Text Available Purpose. Political, national, religious, and other motivations have led the media and even scientists to errantly accept extreme longevity claims prima facie. We describe various causes of false claims of extraordinary longevity. Design and Methods. American Social Security Death Index files for the period 1980–2009 were queried for individuals with birth and death dates yielding ages 110+ years of age. Frequency was compared to a list of age-validated supercentenarians maintained by the Gerontology Research Group who died during the same time period. Age claims of 110+ years and the age validation experiences of the authors facilitated a list of typologies of false age claims. Results. Invalid age claim rates increase with age from 65% at age 110-111 to 98% by age 115 to 100% for 120+ years. Eleven typologies of false claims were: Religious Authority Myth, Village Elder Myth, Fountain of Youth Myth (substance), Shangri-La Myth (geographic), Nationalist Pride, Spiritual Practice, Familial Longevity, Individual and/or Family Notoriety, Military Service, Administrative Entry Error, and Pension-Social Entitlement Fraud. Conclusions. Understanding various causes of false extreme age claims is important for placing current, past, and future extreme longevity claims in context and for providing a necessary level of skepticism.

  16. Exploring the Links in Monthly to Decadal Variability of the Atmospheric Water Balance Over the Wettest Regions in ERA-20C

    Science.gov (United States)

    Nogueira, M.

    2017-10-01

    Monthly-to-decadal variability of the regional precipitation over the Intertropical Convergence Zone and the north-Atlantic and north-Pacific storm tracks was investigated using the ERA-20C reanalysis. Satellite-based precipitation (P) and evaporation (E) climatological patterns were well reproduced by ERA-20C. Regional P and E monthly time series displayed 20% differences, but these decreased rapidly with time scale (below 10% at yearly time scales). Spectral analysis showed good scale-by-scale statistical agreement between ERA-20C and observations. Using ERA-Interim showed no improvement despite the much wider range of information assimilated (including satellites). Remarkably high Detrended Cross-Correlation Analysis coefficients (ρDCCA > 0.7 and often ρDCCA > 0.9) revealed tight links between the nonperiodic variability of P, moisture divergence (DIV), and pressure velocity (ω) at monthly-to-decadal time scales over all the wet regions. In contrast, ρDCCA was essentially nonsignificant between nonperiodic P and E or sea surface temperature (SST). Thus, the nonperiodic monthly-to-decadal variability of precipitation in these regions is almost fully controlled by dynamics and not by local E or SST (as suggested by the Clausius-Clapeyron relation). Analysis of regional nonperiodic standard deviations and power spectra (and respective spectral exponents, β) provided further robustness to this conclusion. Finally, clear transitions in β for P, DIV, and ω between tropical and storm-track regions were found. The latter are dominated by transient storms, with energy accumulation at synoptic scales, whereas β values of 0.2 to 0.4 were found in the tropics, implying longer-range autocorrelations and slower decreasing variability and information creation with time scale, consistent with the important forcing from internal modes of variability (e.g., El Niño-Southern Oscillation).
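The detrended cross-correlation coefficient ρDCCA used above can be sketched as follows. This is a minimal version of the standard recipe (integrate both series, remove a local linear trend in windows of size n, then take the ratio of detrended covariance to the detrended variances); the toy data and window size are arbitrary choices, not those of the study.

```python
import numpy as np

def rho_dcca(x, y, n=20):
    """Detrended cross-correlation coefficient at window size n:
    rho = F2_xy / sqrt(F2_xx * F2_yy), with per-window linear detrending."""
    X = np.cumsum(x - np.mean(x))    # integrated profiles
    Y = np.cumsum(y - np.mean(y))
    t = np.arange(n)
    f2_xy = f2_xx = f2_yy = 0.0
    for i in range(len(X) // n):
        xs, ys = X[i*n:(i+1)*n], Y[i*n:(i+1)*n]
        # residuals after removing the local linear trend in this window
        xr = xs - np.polyval(np.polyfit(t, xs, 1), t)
        yr = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f2_xy += np.mean(xr * yr)
        f2_xx += np.mean(xr * xr)
        f2_yy += np.mean(yr * yr)
    return f2_xy / np.sqrt(f2_xx * f2_yy)

rng = np.random.default_rng(3)
common = rng.normal(size=4000)                  # shared driving signal
a = common + 0.5 * rng.normal(size=4000)
b = common + 0.5 * rng.normal(size=4000)
unrelated = rng.normal(size=4000)
print(rho_dcca(a, b))            # strongly coupled: well above 0
print(rho_dcca(a, unrelated))    # uncoupled: near 0
```

Series sharing a common driver give ρDCCA well above zero (of the kind reported between P, DIV, and ω), while independent series hover near zero (as for P versus E or SST).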

  17. Formalities in the digital era: an obstacle or opportunity?

    NARCIS (Netherlands)

    van Gompel, S.; Bently, L.; Suthersanen, U.; Torremans, P.

    2010-01-01

    This paper, which was presented at the 2009 ALAI conference in London, examines the possible reintroduction of copyright formalities against the background of the challenges that copyright law faces in the digital era. It does so by contrasting the current calls for reintroducing formalities with

  18. First results from stellar occultations in the "GAIA era"

    Science.gov (United States)

    Benedetti-Rossi, G.; Vieira-Martins, R.; Sicardy, B.

    2017-09-01

    Stellar occultation is a powerful technique for studying distant solar system bodies. It provides high angular resolution of the occulting body from the analysis of a light curve acquired with high temporal resolution, with uncertainties comparable to those of space probes. In the "GAIA era", stellar occultations can now deliver even more impressive results, such as the detection of atmospheres, rings, and topographic features.
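The timing-to-size arithmetic behind that resolution can be sketched with invented numbers (the velocity, timings, and uncertainty below are all assumptions, not values from the work):

```python
import math

# The star disappears while the body's shadow sweeps past the telescope:
# one chord across the body is shadow velocity times occultation duration.
v_shadow = 22.0                      # km/s, shadow velocity (assumed)
ingress, egress = 102.37, 111.81     # s, timings read off the light curve (assumed)
chord = v_shadow * (egress - ingress)

# Timing uncertainty on both contacts propagates into the chord length
timing_err = 0.05                    # s, per-timing uncertainty (assumed)
chord_err = v_shadow * math.sqrt(2) * timing_err
print(f"chord = {chord:.1f} +/- {chord_err:.1f} km")
```

A sub-tenth-of-a-second timing uncertainty translates into a kilometre-scale error on a ~200 km chord, which is why precise star positions (as from GAIA) and fast photometry yield probe-like spatial resolution from the ground.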

  19. NASA EOSDIS Evolution in the BigData Era

    Science.gov (United States)

    Lynnes, Christopher

    2015-01-01

    NASA's EOSDIS system faces several challenges in the Big Data Era. Although volumes are large (but not unmanageably so), the variety of different data collections is daunting. That variety also brings with it a large and diverse user community. One key evolution EOSDIS is working toward is to enable more science analysis to be performed close to the data.

  20. Culturally Responsive: Art Education in a Global Era

    Science.gov (United States)

    Lai, Alice

    2012-01-01

    Facing the era of globalization, culturally responsive art teachers must recognize that students' home culture, including local artistic expression, is inevitably influenced by global forces. They should strive to engage students with the systems and issues of globalization and its impact on their community's culture and art. In this article, the author…

  1. THE UNECIC: INTERNATIONAL TRADE IN THE DIGITAL ERA

    African Journals Online (AJOL)

    Dr Tanya du Plessis

    THE UNECIC: INTERNATIONAL TRADE IN THE DIGITAL ERA. S Eiselen. 1. Introduction. The use of electronic means of communication such as e-mail, SMS and the internet in the last decade has outstripped and replaced other more traditional forms of communication such as post, telex and telegram. The only other ...

  2. Gender prejudice in the Victorian Era: an elucidation of Thomas ...

    African Journals Online (AJOL)

    The more intriguing character of the Victorian society was that women bore the brunt of the society's inequality, injustice and unfairness. This paper examines British history with the intent of exposing the variables that shaped and defined the Victorian era consciousness, especially the collective perspective about gender ...

  3. systemic chemical education reform [scer] in the global era

    African Journals Online (AJOL)

    IICBA01

    growing the systemic way of thinking of our students, which is one of the most important characteristics of the Global Era. Hence the systemic education reform, which means changing our educational system from linear to systemic, in which we design the curriculum and write content systemically, as presented by SATL ...

  4. Professional Boundaries in the Era of the Internet

    Science.gov (United States)

    Gabbard, Glen O.; Kassaw, Kristin A.; Perez-Garcia, Gonzalo

    2011-01-01

    Objective: The era of the Internet presents new dilemmas in educating psychiatrists about professional boundaries. The objective of this overview is to clarify those dilemmas and offer recommendations for dealing with them. Method: The characteristics of social networking sites, blogs, and search engines are reviewed with a specific focus on their…

  5. Not invented here : managing corporate innovation in a new era

    NARCIS (Netherlands)

    Vrande, van de V.J.A.

    2007-01-01

    Not Invented Here: Managing Corporate Innovation in a New Era. External technology sourcing as a means to develop new businesses is taking a more central role in established companies. Acquiring new technologies from outside the firm speeds up the innovation process and complements internal R&D

  6. Customer to Consumer: The New Consumption in the Progressive Era.

    Science.gov (United States)

    Strasser, Susan

    1999-01-01

    Discusses the transformation of the U.S. consumption habits and the creation of the consumer during the Progressive Era. Describes the relationships among the production of goods, advertising, and progress. Focuses on the role advertising played in altering U.S. cultural beliefs and the continued attachment to the past. (CMK)

  7. Health Education Films of the Silent Era: A Historical Analysis

    Science.gov (United States)

    Sofalvi, Alan

    2011-01-01

    Films have been used to present health messages throughout the history of the medium. The purpose of this article is to describe pictures from the silent film era that were designed to educate people about health issues. Films still available in at least one format were reviewed. Published reviews were also used to obtain information about these…

  8. La era de la información

    Directory of Open Access Journals (Sweden)

    Mauro Florez Calderón

    1991-01-01

    The information revolution, grounded in information itself, is limited only by the frontiers of knowledge. Unlike a material good, information is inalienable and cumulative, and it does not produce the kinds of environmental deterioration created by industry; for these reasons, some specialists call the information era the post-industrial era. If it is foolish to try to understand the processes produced by the industrial revolution with a pastoral mentality, it is far more foolish to try to understand the information era with an industrial mindset. The new era implies radically different forms and structures of thought, since the profound social, technical, political, and economic transformations it entails will not by themselves necessarily lead to greater well-being for humanity. In this article I aim to give a general overview of this fascinating and delicate subject.

  9. ERA-pankurid võivad pattu kahetseda / Nils Niitra

    Index Scriptorium Estoniae

    Niitra, Nils, 1975-

    2002-01-01

    At the major trial of nine former executives of the ERA Group, a judge of the Tartu county court gave the public prosecutor and the defendants two weeks to settle the case through simplified proceedings, but to do so the former bankers must plead guilty

  10. Radiation oncology in the era of precision medicine

    DEFF Research Database (Denmark)

    Baumann, Michael; Krause, Mechthild; Overgaard, Jens

    2016-01-01

    with preservation of health-related quality of life can be achieved in many patients. Two major strategies, acting synergistically, will enable further widening of the therapeutic window of radiation oncology in the era of precision medicine: technology-driven improvement of treatment conformity, including advanced...

  11. Op pad na 'n omvattende woordeboekkultuur in die digitale era ...

    African Journals Online (AJOL)

    Many potential dictionary users within the digital era belong to Generation Z. Some features of this generation are briefly discussed. The need is indicated for an adaptation in lexicography that could motivate this generation to use dictionaries. It is argued that dictionary didactics should play an important role in establishing ...

  12. Biodiversity analysis in the digital era

    Science.gov (United States)

    2016-01-01

    This paper explores what the virtual biodiversity e-infrastructure will look like as it takes advantage of advances in ‘Big Data’ biodiversity informatics and e-research infrastructure, which allow integration of various taxon-level data types (genome, morphology, distribution and species interactions) within a phylogenetic and environmental framework. By overcoming the data scaling problem in ecology, this integrative framework will provide richer information and fast learning to enable a deeper understanding of biodiversity evolution and dynamics in a rapidly changing world. The Atlas of Living Australia is used as one example of the advantages of progressing towards this future. Living in this future will require the adoption of new ways of integrating scientific knowledge into societal decision making. This article is part of the themed issue ‘From DNA barcodes to biomes’. PMID:27481789

  13. Retracted: Design Education in the Global Era

    Science.gov (United States)

    de Lobo, Theresa

    The aim of this paper is to show how design disciplines can collaborate to instill a broader sense of design in students through intercultural service-learning projects. While some programs are reinventing their curricula, several still follow the classic structure of a first-year art foundation program, with the final years concentrating on the desired discipline. Interaction at a global scale has heightened the need for graduates to learn to work more effectively with people from different cultures. This approach combines addressing a need for design in a real-world situation with learning how to understand culture, place, and experience through a collaborative project. Referencing a specific international service-learning project, and drawing on literature on the internationalization of education, this paper explores key concepts, learning objectives, methods, and challenges faced in preparing students for practice in an increasingly integrated workplace.

  14. Assessment of the potential forecasting skill of a global hydrological model in reproducing the occurrence of monthly flow extremes

    Directory of Open Access Journals (Sweden)

    N. Candogan Yossef

    2012-11-01

    As an initial step in assessing the prospect of using global hydrological models (GHMs) for hydrological forecasting, this study investigates the skill of the GHM PCR-GLOBWB in reproducing the occurrence of past extremes in monthly discharge on a global scale. Global terrestrial hydrology from 1958 until 2001 is simulated by forcing PCR-GLOBWB with daily meteorological data obtained by downscaling the CRU dataset to daily fields using the ERA-40 reanalysis. Simulated discharge values are compared with observed monthly streamflow records for a selection of 20 large river basins that represent all continents and a wide range of climatic zones.

    We assess model skill in three ways, all of which contribute different information on the potential forecasting skill of a GHM. First, the general skill of the model in reproducing hydrographs is evaluated. Second, model skill in reproducing significantly higher and lower flows than the monthly normals is assessed in terms of skill scores used for forecasts of categorical events. Third, model skill in reproducing flood and drought events is assessed by constructing binary contingency tables for floods and droughts for each basin. The skill is then compared to that of a simple estimation of discharge from the water balance (P-E).

    The results show that the model has skill in all three types of assessments. After bias correction the model skill in simulating hydrographs is improved considerably. For most basins it is higher than that of the climatology. The skill is highest in reproducing monthly anomalies. The model also has skill in reproducing floods and droughts, with a markedly higher skill in floods. The model skill far exceeds that of the water balance estimate. We conclude that the prospect for using PCR-GLOBWB for monthly and seasonal forecasting of the occurrence of hydrological extremes is positive. We argue that this conclusion applies equally to other similar GHMs and
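    The contingency-table assessment described in this record relies on standard categorical skill measures. A minimal sketch of how such a table yields skill scores (the function name and the choice of measures, probability of detection, false alarm ratio, and the Heidke skill score, are illustrative; the paper's exact scoring may differ):

    ```python
    import numpy as np

    def contingency_skill(observed, simulated):
        """Build a 2x2 contingency table for a binary event (e.g. flood in a
        given month: yes/no) and return common categorical skill measures."""
        obs = np.asarray(observed, dtype=bool)
        sim = np.asarray(simulated, dtype=bool)
        hits = int(np.sum(sim & obs))             # event simulated and observed
        false_alarms = int(np.sum(sim & ~obs))    # simulated but not observed
        misses = int(np.sum(~sim & obs))          # observed but not simulated
        correct_neg = int(np.sum(~sim & ~obs))    # neither simulated nor observed
        n = hits + false_alarms + misses + correct_neg
        pod = hits / (hits + misses) if hits + misses else float("nan")
        far = false_alarms / (hits + false_alarms) if hits + false_alarms else 0.0
        # Heidke skill score: correct classifications beyond those expected by chance
        expected = ((hits + misses) * (hits + false_alarms)
                    + (correct_neg + misses) * (correct_neg + false_alarms)) / n
        hss = (hits + correct_neg - expected) / (n - expected) if n != expected else 0.0
        return {"POD": pod, "FAR": far, "HSS": hss}
    ```

    A perfect simulation gives POD = 1, FAR = 0, and HSS = 1; a simulation no better than chance gives HSS near 0.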

  15. Prospect for extreme field science

    Energy Technology Data Exchange (ETDEWEB)

    Tajima, T. [Ludwig Maximilian Univ. and Max Planck Institute for Quantum Optics, Garching (Germany); Japan Atomic Energy Agency, Kyoto and KEK, Tsukuba (Japan)

    2009-11-15

    The kind of laser that the Extreme Light Infrastructure (ELI) will provide will usher in a class of experiments we have only dreamed of for years. The characteristics that ELI brings include the highest intensity ever, large fluence, and a relatively high repetition rate. A personal view of the author on the prospect of harnessing this unprecedented opportunity for advancing the science of extreme fields is presented. The first characteristic of ELI, its intensity, will allow us to access, as many have stressed already, extreme fields that hover around the Schwinger field, or at the very least the neighboring fields in which vacuum begins to behave as a nonlinear medium. In this sense, we are seriously probing the 'material' property of vacuum, and thus the property that the theory of relativity itself describes and entails. We will probe both the special and the general theory of relativity in regimes that have never been tested so far. We may see a glimpse of the reach of relativity, or even its breakdown, in some extreme regimes. We will learn from Einstein, and may even go beyond Einstein, if our journey so leads. Laser-driven acceleration, both by the laser field itself and by the wakefield that it triggers in a plasma, is enormous: the energies we can access, if not the luminosity, may be unprecedented, going far beyond TeV. A further advantage of ELI is its relatively high repetition rate and average fluence compared with other extreme lasers. This high fluence can be a key element leading to applications in high-energy physics, such as a gamma-gamma collider driver experiment, gamma-ray experiments relevant to the frontier of photonuclear physics, and atomic-energy applications. Needless to say, high fluence is one of the most important features that industrial and medical applications may need. If we are lucky, we may see a door open at the frontier of novel physics that may not be available by any other means. (authors)

  16. Satisfaction with Quality of Care Received by Patients without National Health Insurance Attending a Primary Care Clinic in a Resource-Poor Environment of a Tertiary Hospital in Eastern Nigeria in the Era of Scaling up the Nigerian Formal Sector Health Insurance Scheme

    Science.gov (United States)

    Iloh, GUP; Ofoedu, JN; Njoku, PU; Okafor, GOC; Amadi, AN; Godswill-Uko, EU

    2013-01-01

    Background: The increasing importance of patients' satisfaction as a valuable tool for assessing quality of care is a current global healthcare concern with regard to consumer-oriented health services. Aim: This study assessed satisfaction with the quality of care received by patients without national health insurance (NHI) attending a primary care clinic in a resource-poor environment of a tertiary hospital in South-Eastern Nigeria. Subjects and Methods: This was a cross-sectional study carried out on 400 non-NHI patients from April 2011 to October 2011 at the primary care clinic of the Federal Medical Centre, Umuahia, Nigeria. Adult patients seen within the study period were selected by systematic sampling, taking every second non-NHI patient who registered to see the physicians and met the selection criteria. Data were collected using a pretested, structured, interviewer-administered questionnaire designed on five-point Likert-scale items, with 1 and 5 indicating the lowest and highest levels of satisfaction, respectively. Satisfaction was measured in the following domains: patient waiting time, patient–staff communication, patient–staff relationship, cost of care, hospital bureaucracy, and hospital environment. Operationally, patients who scored 3 points or above in an assessed domain were considered satisfied, while those who scored less than 3 points were dissatisfied. Results: The overall satisfaction score of the respondents was 3.1. Specifically, the respondents expressed satisfaction with the patient–staff relationship (3.9), patient–staff communication (3.8), and hospital environment (3.6), and dissatisfaction with patient waiting time (2.4), hospital bureaucracy (2.5), and cost of care (2.6). Conclusion: The overall satisfaction of non-NHI patients with the services provided was good. The hospital should set targets for quality improvement in the current domains of satisfaction, while the cost of care has implications for government
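    As a worked check on the scoring rule in this record (a domain score of 3 or more counts as satisfied), the sketch below applies the cut-off to the reported domain means. It also assumes, plausibly but not stated in the abstract, that the overall score is the unweighted mean of the six domain means, which does reproduce the reported 3.1:

    ```python
    # Mean domain scores on a 1-5 Likert scale, as reported in the abstract;
    # the study's operational cut-off treats a score of 3 or more as "satisfied".
    domain_scores = {
        "patient-staff relationship": 3.9,
        "patient-staff communication": 3.8,
        "hospital environment": 3.6,
        "patient waiting time": 2.4,
        "hospital bureaucracy": 2.5,
        "cost of care": 2.6,
    }

    satisfied = {d for d, s in domain_scores.items() if s >= 3}
    dissatisfied = set(domain_scores) - satisfied

    # Assumption: the overall score is the unweighted mean of the domain means;
    # this reproduces the reported overall satisfaction score of 3.1
    overall = round(sum(domain_scores.values()) / len(domain_scores), 1)
    ```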

  17. Results of ERAS protocol in patients with colorectal cancer

    Directory of Open Access Journals (Sweden)

    A. O. Rasulov

    2016-01-01

    Objective: to explore the use of enhanced recovery after surgery (ERAS) in the treatment of patients with colorectal cancer and to evaluate its efficacy and safety. Materials and methods. A prospective, single-site, randomized study of the implementation of enhanced recovery after surgery in patients with colorectal cancer has been conducted from October 2014 to the present time. All patients after laparoscopic surgeries undergo treatment according to the ERAS protocol; patients after open surgeries are randomized (1:1) into groups of standard treatment or treatment according to the ERAS protocol. The study included patients with localized and locally disseminated colorectal cancer aged from 18 to 75 years, ECOG score ≤ 2. The primary evaluated parameters were the following: the number of postoperative complications (according to the Clavien–Dindo classification), postoperative hospital days, incidence of complications and mortality in the 30-day period, and timing of activation. Results. To date, the study includes 105 patients: laparoscopic group, 51 patients; open-surgery group treated by the ERAS protocol, 27 patients; open-surgery group with standard post-op treatment, 26 patients. Complications requiring emergency surgery for anastomotic leak (p = 0.159) developed in 3.7 % of patients with standard post-op treatment and in 3.9 % of patients after laparoscopic surgery, while 1 patient required repeat hospitalization. The total number of complications was significantly lower in the open-surgery group treated by the ERAS protocol compared with standard post-op treatment (p = 0.021). However, there were no differences between the laparoscopic and open-surgery groups with standard post-op treatment (p = 0.159). Average hospitalization stay in patients with standard post-op treatment was 10 days, compared to 7 days in patients treated by the ERAS protocol (p = 0.067) and 6 days after laparoscopic

  18. Flood protection diversification to reduce probabilities of extreme losses.

    Science.gov (United States)

    Zhou, Qian; Lambert, James H; Karvetski, Christopher W; Keisler, Jeffrey M; Linkov, Igor

    2012-11-01

    Recent catastrophic losses because of floods require developing resilient approaches to flood risk protection. This article assesses how diversification of a system of coastal protections might decrease the probabilities of extreme flood losses. The study compares the performance of portfolios each consisting of four types of flood protection assets in a large region of dike rings. A parametric analysis suggests conditions in which diversifications of the types of included flood protection assets decrease extreme flood losses. Increased return periods of extreme losses are associated with portfolios where the asset types have low correlations of economic risk. The effort highlights the importance of understanding correlations across asset types in planning for large-scale flood protection. It allows explicit integration of climate change scenarios in developing flood mitigation strategy. © 2012 Society for Risk Analysis.
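    The effect this record describes, that low correlation of economic risk across asset types lengthens the return periods of extreme losses, can be illustrated with a small Monte Carlo sketch. The lognormal loss model, the equicorrelation structure, and the threshold are illustrative assumptions, not the article's actual model:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_years = 200_000  # simulated years per correlation scenario

    def annual_losses(correlation):
        """Simulate total annual losses for a portfolio of four flood-protection
        asset types whose (lognormal) economic risks share a common pairwise
        correlation at the level of the underlying Gaussian factors."""
        cov = np.full((4, 4), correlation) + np.diag([1.0 - correlation] * 4)
        z = rng.multivariate_normal(np.zeros(4), cov, size=n_years)
        return np.exp(z).sum(axis=1)  # sum of lognormal losses across asset types

    threshold = 12.0  # an arbitrary "extreme loss" level (mean total is ~6.6)
    p_low = float(np.mean(annual_losses(0.1) >= threshold))   # weakly correlated
    p_high = float(np.mean(annual_losses(0.9) >= threshold))  # strongly correlated
    ```

    Under this toy model, `p_low` comes out well below `p_high`: diversification across weakly correlated asset types makes it less likely that all losses spike in the same year, thinning the tail of the total-loss distribution.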

  19. Brownian gas models for extreme-value laws

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2013-01-01

    In this paper we establish one-dimensional Brownian gas models for the extreme-value laws of Gumbel, Weibull, and Fréchet. A gas model is a countable collection of independent particles governed by common diffusion dynamics. The extreme-value laws are the universal probability distributions governing the affine scaling limits of the maxima and minima of ensembles of independent and identically distributed one-dimensional random variables. Using the recently introduced concept of stationary Poissonian intensities, we construct two gas models whose global statistical structures are stationary, and yield the extreme-value laws: a linear Brownian motion gas model for the Gumbel law, and a geometric Brownian motion gas model for the Weibull and Fréchet laws. The stochastic dynamics of these gas models are studied in detail, and closed-form analytical descriptions of their temporal correlation structures, their topological phase transitions, and their intrinsic first-passage-time fluxes are presented. (paper)
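    The affine scaling limits this record refers to can be demonstrated numerically for the Gumbel case: for iid Exp(1) variables, the maximum shifted by ln(n) converges in distribution to the standard Gumbel law. A self-contained sketch (sample sizes and evaluation points are arbitrary choices, not from the paper):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n, trials = 500, 20_000  # sample size per maximum, number of maxima

    # Affine scaling limit: for iid Exp(1) variables, the rescaled maximum
    # M_n - ln(n) converges in distribution to the standard Gumbel law.
    maxima = rng.exponential(size=(trials, n)).max(axis=1) - np.log(n)

    def gumbel_cdf(x):
        """CDF of the standard Gumbel distribution: exp(-exp(-x))."""
        return np.exp(-np.exp(-x))

    # The empirical CDF of the rescaled maxima should track the Gumbel CDF
    errors = [abs(float(np.mean(maxima <= x)) - gumbel_cdf(x))
              for x in (-1.0, 0.0, 1.0, 2.0)]
    ```

    With these parameters each empirical CDF value agrees with the Gumbel CDF to within about a percent; heavier-tailed or bounded parent distributions would instead converge to the Fréchet or Weibull laws.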

  20. Nonlinear wave-mixing processes in the extreme ultraviolet

    International Nuclear Information System (INIS)

    Misoguti, L.; Christov, I. P.; Backus, S.; Murnane, M. M.; Kapteyn, H. C.

    2005-01-01

    We present data from two-color high-order harmonic generation in a hollow waveguide that suggest the presence of a nonlinear-optical frequency conversion process driven by extreme ultraviolet light. By combining the fundamental and second harmonic of an 800 nm laser in a hollow-core fiber with varying relative polarizations, and by observing the pressure and power scaling of the various harmonic orders, we show that the data are consistent with a picture in which we drive the process of high-harmonic generation, which in turn drives four-wave frequency-mixing processes in the extreme ultraviolet. This work promises a method for extending nonlinear optics into the extreme ultraviolet region of the spectrum using an approach that has not previously been considered, and it has compelling implications for generating tunable light at short wavelengths.