WorldWideScience

Sample records for realistic scenarios approximating

  1. Evaluation of photovoltaic panel temperature in realistic scenarios

    International Nuclear Information System (INIS)

    Du, Yanping; Fell, Christopher J.; Duck, Benjamin; Chen, Dong; Liffman, Kurt; Zhang, Yinan; Gu, Min; Zhu, Yonggang

    2016-01-01

    Highlights: • The developed realistic model captures the thermal response and hysteresis effects more reasonably. • The predicted panel temperature is as high as 60 °C under a solar irradiance of 1000 W/m² in no-wind weather. • In realistic scenarios, the thermal response normally takes 50–250 s. • The actual heating effect may cause a photoelectric efficiency drop of 2.9–9.0%. - Abstract: Photovoltaic (PV) panel temperature was evaluated by developing theoretical models that are feasible for use in realistic scenarios. Effects of solar irradiance, wind speed and ambient temperature on the PV panel temperature were studied. The parametric study shows a significant influence of solar irradiance and wind speed on the PV panel temperature. With an increase of ambient temperature, the temperature rise of solar cells is reduced. The characteristics of panel temperature in realistic scenarios were analyzed. In steady weather conditions, the thermal response time of a solar cell with a Si thickness of 100–500 μm is around 50–250 s. In realistic scenarios, however, the panel temperature variation over a day differs from that in steady weather conditions due to the effect of thermal hysteresis. The heating effect on the photovoltaic efficiency was assessed based on real-time temperature measurement of solar cells in realistic weather conditions. For solar cells with a temperature coefficient in the range of −0.21% to −0.50%, the current field tests indicated an approximate efficiency loss between 2.9% and 9.0%.
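
    If a simple linear temperature-loss model is assumed (a standard hedge; the abstract does not state the model form), the reported loss range can be inverted to estimate the average cell temperature rise implied by the field tests. A minimal Python sketch of that arithmetic:

      # Linear model (assumed): efficiency loss [%] = |temperature coefficient| * dT
      coeff_low, coeff_high = 0.21, 0.50   # |temperature coefficient| in %/degC, from the abstract
      loss_low, loss_high = 2.9, 9.0       # reported efficiency losses in %, from the abstract

      dT_low = loss_low / coeff_low        # implied average cell temperature rise, ~13.8 degC
      dT_high = loss_high / coeff_high     # ~18.0 degC
      print(dT_low, dT_high)

    Under this reading, the measured losses correspond to average cell temperatures roughly 14–18 °C above the reference condition, well below the 60 °C no-wind worst case quoted in the highlights.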

  2. Comparison of Sigma-Point and Extended Kalman Filters on a Realistic Orbit Determination Scenario

    Science.gov (United States)

    Gaebler, John; Hur-Diaz, Sun; Carpenter, Russell

    2010-01-01

    Sigma-point filters have received a lot of attention in recent years as a better alternative to extended Kalman filters for highly nonlinear problems. In this paper, we compare the performance of the additive divided difference sigma-point filter to the extended Kalman filter when applied to orbit determination of a realistic operational scenario based on the Interstellar Boundary Explorer mission. For the scenario studied, both filters provided equivalent results. The performance of each is discussed in detail.
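
    The comparison in the abstract comes down to how each filter propagates a Gaussian state through a nonlinearity. The sketch below is a deliberately tiny illustration of that mechanism, not the paper's setup: it uses the unscented transform (a common sigma-point scheme, rather than the additive divided difference filter) and a scalar quadratic instead of orbit dynamics, so the exact answer is known in closed form.

      import numpy as np

      def unscented_moments(mu, var, f, alpha=1.0, beta=0.0, kappa=2.0):
          """Propagate a 1-D Gaussian N(mu, var) through f with the unscented transform."""
          n = 1
          lam = alpha**2 * (n + kappa) - n
          spread = np.sqrt((n + lam) * var)
          points = np.array([mu, mu + spread, mu - spread])
          wm = np.array([lam / (n + lam), 0.5 / (n + lam), 0.5 / (n + lam)])
          wc = wm.copy()
          wc[0] += 1.0 - alpha**2 + beta
          y = f(points)
          mean = wm @ y
          return mean, wc @ (y - mean) ** 2

      f = lambda x: x**2                                  # toy nonlinearity with known exact moments
      mu, var = 1.0, 0.25

      ekf_mean, ekf_var = f(mu), (2.0 * mu) ** 2 * var    # first-order (EKF-style) linearization
      ut_mean, ut_var = unscented_moments(mu, var, f)
      exact_mean, exact_var = mu**2 + var, 4 * mu**2 * var + 2 * var**2

      print(ekf_mean, ekf_var)      # 1.0, 1.0    -> biased mean, understated variance
      print(ut_mean, ut_var)        # 1.25, 1.125 -> matches the exact moments here
      print(exact_mean, exact_var)  # 1.25, 1.125

    When the dynamics and measurements are only mildly nonlinear the two propagations coincide, which is consistent with the equivalent results reported above.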

  3. The Impact of Emotion on Negotiation Behaviour during a Realistic Training Scenario

    Science.gov (United States)

    2007-11-01

    … (Sorce, Emde, & Svejda; cited in van Kleef et al., 2004a), and induce reciprocal emotions in others (Keltner & Haidt, 1999; cited in van Kleef et al.) … DRDC Toronto CR 2007-166: The Impact of Emotion on Negotiation Behaviour During a Realistic Training Scenario, by Michael H. Thomson … made to dig what look like their own graves. This experiment explored the impact of emotion on military trainees' negotiation behaviour and …

  4. Performance Analysis of Relays in LTE for a Realistic Suburban Deployment Scenario

    DEFF Research Database (Denmark)

    Coletti, Claudio; Mogensen, Preben; Irmer, Ralf

    2011-01-01

    Relays are likely to play an important role in the deployment of Beyond 3G networks, such as LTE-Advanced, thanks to the possibility of effectively extending Macro network coverage and fulfilling the expected high data-rate requirements. Until now, the relay technology potential and its cost-effectiveness have been widely investigated in the literature, considering mainly statistical deployment scenarios, like regular networks with uniform traffic distribution. This paper illustrates the performance of different relay technologies (In-Band/Out-band) in a realistic suburban network scenario with real Macro site positions, user density map and spectrum band availability. Based on a proposed heuristic deployment algorithm, results show that deploying In-band relays can significantly reduce the user outage if high backhaul link quality is ensured, whereas Out-band relaying and the usage...

  5. Towards realistic Holocene land cover scenarios: integration of archaeological, palynological and geomorphological records and comparison to global land cover scenarios.

    Science.gov (United States)

    De Brue, Hanne; Verstraeten, Gert; Broothaerts, Nils; Notebaert, Bastiaan

    2016-04-01

    Accurate and spatially explicit landscape reconstructions for distinct time periods in human history are essential for the quantification of the effect of anthropogenic land cover changes on, e.g., global biogeochemical cycles, ecology, and geomorphic processes, and to improve our understanding of the interaction between humans and the environment in general. A long-term perspective covering Mid and Late Holocene land use changes is recommended in this context, as it provides a baseline to evaluate human impact in more recent periods. Previous efforts to assess the evolution and intensity of agricultural land cover in past centuries or millennia have predominantly focused on palynological records. An increasing number of quantitative techniques has been developed during the last two decades to transfer palynological data into land cover estimates. However, these techniques have to deal with equifinality issues and, furthermore, do not sufficiently allow reconstruction of spatial patterns of past land cover. On the other hand, several continental and global databases of historical anthropogenic land cover changes, based on estimates of global population and the required agricultural land per capita, have been developed in the past decade. However, at such long temporal and spatial scales, reconstruction of past anthropogenic land cover intensities and spatial patterns necessarily involves many uncertainties and assumptions as well. Here, we present a novel approach that combines archaeological, palynological and geomorphological data for the Dijle catchment in the central Belgian Loess Belt in order to arrive at more realistic Holocene land cover histories. Multiple land cover scenarios (>60,000) are constructed using probabilistic rules and used as input into a sediment delivery model (WaTEM/SEDEM). Model outcomes are confronted with a detailed geomorphic dataset on Holocene sediment fluxes and with REVEALS-based estimates of vegetation cover using palynological data from

  6. Heterogeneous Deployment to Meet Traffic Demand in a Realistic LTE Urban Scenario

    DEFF Research Database (Denmark)

    Coletti, Claudio; Hu, Liang; Nguyen, Huan Cong

    2012-01-01

    growth of mobile broadband traffic. Emphasis is put on how to optimally assign the spectrum for the different network layers in an evolved HetNet including outdoor and indoor small cells. The study is conducted for a "Hot-Zone" scenario, i.e. a high-traffic area within a realistic dense urban ... performance with a minimum user data rate of 1 Mbps is achieved when deploying small cells on dedicated channels rather than co-channel deployment. Furthermore, the joint pico and femto deployment turns out to be the right trade-off between increased base station density and enhanced network capacity.

  7. Smart-DS: Synthetic Models for Advanced, Realistic Testing: Distribution Systems and Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Krishnan, Venkat K [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Palmintier, Bryan S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hodge, Brian S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hale, Elaine T [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Elgindy, Tarek [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Bugbee, Bruce [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Rossol, Michael N [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Lopez, Anthony J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krishnamurthy, Dheepak [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Vergara, Claudio [MIT; Domingo, Carlos Mateo [IIT Comillas; Postigo, Fernando [IIT Comillas; de Cuadra, Fernando [IIT Comillas; Gomez, Tomas [IIT Comillas; Duenas, Pablo [MIT; Luke, Max [MIT; Li, Vivian [MIT; Vinoth, Mohan [GE Grid Solutions; Kadankodu, Sree [GE Grid Solutions

    2017-08-09

    The National Renewable Energy Laboratory (NREL) in collaboration with Massachusetts Institute of Technology (MIT), Universidad Pontificia Comillas (Comillas-IIT, Spain) and GE Grid Solutions, is working on an ARPA-E GRID DATA project, titled Smart-DS, to create: 1) High-quality, realistic, synthetic distribution network models, and 2) Advanced tools for automated scenario generation based on high-resolution weather data and generation growth projections. Through these advancements, the Smart-DS project is envisioned to accelerate the development, testing, and adoption of advanced algorithms, approaches, and technologies for sustainable and resilient electric power systems, especially in the realm of U.S. distribution systems. This talk will present the goals and overall approach of the Smart-DS project, including the process of creating the synthetic distribution datasets using reference network model (RNM) and the comprehensive validation process to ensure network realism, feasibility, and applicability to advanced use cases. The talk will provide demonstrations of early versions of synthetic models, along with the lessons learnt from expert engagements to enhance future iterations. Finally, the scenario generation framework, its development plans, and co-ordination with GRID DATA repository teams to house these datasets for public access will also be discussed.

  8. The Formation of Teacher Work Teams under Adverse Conditions: Towards a More Realistic Scenario for Schools in Distress

    Science.gov (United States)

    Mintrop, Rick; Charles, Jessica

    2017-01-01

    Group formation studies are rare in the literature on teacher professional learning communities (PLCs). But they are needed to render realistic scenarios and design interventions for practitioners who work in schools where teachers encounter distress and social adversity. Under these conditions, we may need approaches to PLC development that are…

  9. Scenario approximation in a phenomenological study in Mexico: experience report.

    Science.gov (United States)

    Guerrero-Castañeda, Raúl Fernando; Menezes, Tânia Maria de Oliva; Vargas, Ma Guadalupe Ojeda

    2017-01-01

    To report our experience using scenario approximation in a phenomenological nursing study in Mexico. Experience report on approximation to the study scenario, involving coexistence with elderly people in order to select the participants of a phenomenological study. During a four-month period in 2016, visits were carried out to two groups of elderly individuals, in which several activities took place. Coexistence with the elderly through accompaniment in the groups' activities, together with joint dialogue, allowed selection of those who corresponded to the characteristics of the study objective. Scenario approximation is necessary in phenomenological studies, not only for creating empathy with the participants but also for the researchers to immerse themselves in the phenomenon under study, as shown by the first approaches of the researcher.

  10. Diffusion approximation for modeling of 3-D radiation distributions

    International Nuclear Information System (INIS)

    Zardecki, A.; Gerstl, S.A.W.; De Kinder, R.E. Jr.

    1985-01-01

    A three-dimensional transport code DIF3D, based on the diffusion approximation, is used to model the spatial distribution of radiation energy arising from volumetric isotropic sources. Future work will be concerned with the determination of irradiances and modeling of realistic scenarios, relevant to the battlefield conditions. 8 refs., 4 figs
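
    For reference, the generic steady-state, one-group diffusion equation underlying such codes is (standard textbook notation; the multigroup form used in practice adds scattering and group-coupling terms):

      \[ -\nabla \cdot \big( D(\mathbf{r})\, \nabla \phi(\mathbf{r}) \big) + \Sigma_a(\mathbf{r})\, \phi(\mathbf{r}) = S(\mathbf{r}), \]

    where \(\phi\) is the scalar flux, \(D\) the diffusion coefficient, \(\Sigma_a\) the absorption cross section, and \(S\) the volumetric isotropic source described in the abstract.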

  11. Aquatic risk assessment of a realistic exposure to pesticides used in bulb crops: a microcosm study

    NARCIS (Netherlands)

    Wijngaarden, van R.P.A.; Cuppen, J.G.M.; Arts, G.H.P.; Crum, S.J.H.; Hoorn, van den M.W.; Brink, van den P.J.; Brock, T.C.M.

    2004-01-01

    The fungicide fluazinam, the insecticide lambda-cyhalothrin, and the herbicides asulam and metamitron were applied to indoor freshwater microcosms (water volume approximately 0.6 m³). The treatment regime was based on a realistic application scenario in tulip cultivation. Concentrations of each

  12. Effects of combined dredging-related stressors on sponges: a laboratory approach using realistic scenarios.

    Science.gov (United States)

    Pineda, Mari-Carmen; Strehlow, Brian; Kamp, Jasmine; Duckworth, Alan; Jones, Ross; Webster, Nicole S

    2017-07-12

    Dredging can cause increased suspended sediment concentrations (SSCs), light attenuation and sedimentation in marine communities. In order to determine the combined effects of dredging-related pressures on adult sponges, three species spanning different nutritional modes and morphologies were exposed to 5 treatment levels representing realistic dredging scenarios. Most sponges survived under low to moderate turbidity scenarios (SSCs of ≤33 mg L⁻¹, and a daily light integral of ≥0.5 mol photons m⁻² d⁻¹) for up to 28 d. However, under the highest turbidity scenario (76 mg L⁻¹, 0.1 mol photons m⁻² d⁻¹) there was 20% and 90% mortality of the phototrophic sponges Cliona orientalis and Carteriospongia foliascens respectively, and tissue regression in the heterotrophic Ianthella basta. All three sponge species exhibited mechanisms to effectively tolerate dredging-related pressures in the short term (e.g. oscula closure, mucus production and tissue regression), although reduced lipids and deterioration of sponge health suggest that longer term exposure to similar conditions is likely to result in higher mortality. These results suggest that the combination of high SSCs and low light availability can accelerate mortality, increasing the probability of biological effects, although there is considerable interspecies variability in how adult sponges respond to dredging pressures.

  13. Realistic Visualization of Virtual Views

    DEFF Research Database (Denmark)

    Livatino, Salvatore

    2005-01-01

    that can be impractical and sometimes impossible. In addition, the artificial nature of data often makes visualized virtual scenarios not realistic enough. Not realistic in the sense that a synthetic scene is easy to discriminate visually from a natural scene. A new field of research has consequently developed and received much attention in recent years: Realistic Virtual View Synthesis. The main goal is a high-fidelity representation of virtual scenarios while easing modeling and physical phenomena simulation. In particular, realism is achieved by the transfer to the novel view of all the physical phenomena captured in the reference photographs (i.e. the transfer of photographic realism). An overview of the most prominent approaches in realistic virtual view synthesis will be presented and briefly discussed. Applications of the proposed methods to visual survey, virtual cinematography, as well as mobile...

  14. The Attentional Demand of Automobile Driving Revisited: Occlusion Distance as a Function of Task-Relevant Event Density in Realistic Driving Scenarios.

    Science.gov (United States)

    Kujala, Tuomo; Mäkelä, Jakke; Kotilainen, Ilkka; Tokkonen, Timo

    2016-02-01

    We studied the utility of occlusion distance as a function of task-relevant event density in realistic traffic scenarios with self-controlled speed. The visual occlusion technique is an established method for assessing visual demands of driving. However, occlusion time is not a highly informative measure of environmental task-relevant event density in self-paced driving scenarios because it partials out the effects of changes in driving speed. Self-determined occlusion times and distances of 97 drivers with varying backgrounds were analyzed in driving scenarios simulating real Finnish suburban and highway traffic environments with self-determined vehicle speed. Occlusion distances varied systematically with the expected environmental demands of the manipulated driving scenarios whereas the distributions of occlusion times remained more static across the scenarios. Systematic individual differences in the preferred occlusion distances were observed. More experienced drivers achieved better lane-keeping accuracy than inexperienced drivers with similar occlusion distances; however, driving experience was unexpectedly not a major factor for the preferred occlusion distances. Occlusion distance seems to be an informative measure for assessing task-relevant event density in realistic traffic scenarios with self-controlled speed. Occlusion time measures the visual demand of driving as the task-relevant event rate in time intervals, whereas occlusion distance measures the experienced task-relevant event density in distance intervals. The findings can be utilized in context-aware distraction mitigation systems, human-automated vehicle interaction, road speed prediction and design, as well as in the testing of visual in-vehicle tasks for inappropriate in-vehicle glancing behaviors in any dynamic traffic scenario for which appropriate individual occlusion distances can be defined. © 2015, Human Factors and Ergonomics Society.
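
    The distinction drawn between occlusion time and occlusion distance is simply that the latter scales the self-determined occlusion interval by the momentary vehicle speed. A minimal sketch of that conversion (the speeds and the 2 s occlusion are invented for illustration, not values from the study):

      def occlusion_distance(occlusion_time_s, speed_kmh):
          """Distance travelled while vision is occluded, in metres."""
          return occlusion_time_s * speed_kmh / 3.6

      # The same 2 s occlusion time maps to very different task-relevant event densities:
      print(occlusion_distance(2.0, 40.0))    # ~22 m at suburban speed
      print(occlusion_distance(2.0, 100.0))   # ~56 m at highway speed

    This is why occlusion time alone partials out speed changes, while occlusion distance tracks the density of events encountered per unit of road.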

  15. Predicting perceptual quality of images in realistic scenario using deep filter banks

    Science.gov (United States)

    Zhang, Weixia; Yan, Jia; Hu, Shiyong; Ma, Yang; Deng, Dexiang

    2018-03-01

    Classical image perceptual quality assessment models usually resort to natural scene statistics methods, which are based on an assumption that certain reliable statistical regularities hold on undistorted images and will be corrupted by introduced distortions. However, these models usually fail to accurately predict degradation severity of images in realistic scenarios since complex, multiple, and interactive authentic distortions usually appear on them. We propose a quality prediction model based on convolutional neural network. Quality-aware features extracted from filter banks of multiple convolutional layers are aggregated into the image representation. Furthermore, an easy-to-implement and effective feature selection strategy is used to further refine the image representation and finally a linear support vector regression model is trained to map image representation into images' subjective perceptual quality scores. The experimental results on benchmark databases present the effectiveness and generalizability of the proposed model.
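
    The pipeline described above (aggregate quality-aware responses from several convolutional layers, prune the representation, then regress onto subjective scores with a linear support vector regressor) can be sketched as follows. The feature matrix is random placeholder data standing in for pooled filter-bank responses, and the variance-threshold step is an illustrative stand-in for the paper's feature selection strategy.

      import numpy as np
      from sklearn.feature_selection import VarianceThreshold
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import LinearSVR

      rng = np.random.default_rng(0)
      n_images, n_features = 200, 512           # placeholder: pooled multi-layer CNN responses
      X = rng.normal(size=(n_images, n_features))
      y = rng.uniform(0, 100, size=n_images)    # placeholder subjective quality scores (MOS)

      # Feature selection followed by a linear support vector regressor
      model = make_pipeline(
          VarianceThreshold(threshold=0.9),     # drop near-constant feature channels
          StandardScaler(),
          LinearSVR(C=1.0, max_iter=10000),
      )
      model.fit(X, y)
      print(model.predict(X[:5]))               # predicted perceptual quality scores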

  16. Realistic modelling of external flooding scenarios - A multi-disciplinary approach

    International Nuclear Information System (INIS)

    Brinkman, J.L.

    2014-01-01

    against flooding and timing of the events into account as a basis for the development and screening of flooding scenarios. Realistic modelling of external flooding scenarios in a PSA requires a multi-disciplinary approach. Next to being thoroughly familiar with the design features of the plant against flooding, such as its critical elevations for safety-related equipment and the strength of buildings, additional knowledge is necessary on the design of flood protection measures such as dikes and dunes, and on their failure behaviour and modelling. The approach does not change the basic flooding scenarios - the event tree structure - itself, but impacts the initiating event of the specific flooding scenarios. (authors)

  17. Numerical solution of the ekpyrotic scenario in the moduli space approximation

    International Nuclear Information System (INIS)

    Soerensen, Torquil MacDonald

    2005-01-01

    A numerical solution to the equations of motion for the ekpyrotic bulk brane scenario in the moduli space approximation is presented. The visible universe brane has positive tension, and we use a potential that goes to zero exponentially at large distance, and also goes to zero at small distance. In the case considered, no bulk brane, visible brane collision occurs in the solution. This property and the general behavior of the solution is qualitatively the same when the visible brane tension is negative, and for many different parameter choices

  18. The GLEaMviz computational tool, a publicly available software to explore realistic epidemic spreading scenarios at the global scale

    Directory of Open Access Journals (Sweden)

    Quaggiotto Marco

    2011-02-01

    Background: Computational models play an increasingly important role in the assessment and control of public health crises, as demonstrated during the 2009 H1N1 influenza pandemic. Much research has been done in recent years on the development of sophisticated data-driven models for realistic computer-based simulations of infectious disease spreading. However, only a few computational tools are presently available for assessing scenarios, predicting epidemic evolutions, and managing health emergencies that can benefit a broad audience of users including policy makers and health institutions. Results: We present "GLEaMviz", a publicly available software system that simulates the spread of emerging human-to-human infectious diseases across the world. The GLEaMviz tool comprises three components: the client application, the proxy middleware, and the simulation engine. The latter two components constitute the GLEaMviz server. The simulation engine leverages the Global Epidemic and Mobility (GLEaM) framework, a stochastic computational scheme that integrates worldwide high-resolution demographic and mobility data to simulate disease spread on the global scale. The GLEaMviz design aims at maximizing flexibility in defining the disease compartmental model and configuring the simulation scenario; it allows the user to set a variety of parameters including compartment-specific features, transition values, and environmental effects. The output is a dynamic map and a corresponding set of charts that quantitatively describe the geo-temporal evolution of the disease. The software is designed as a client-server system. The multi-platform client, which can be installed on the user's local machine, is used to set up simulations that will be executed on the server, thus avoiding specific requirements for large computational capabilities on the user side. Conclusions: The user-friendly graphical interface of the GLEaMviz tool, along with its high level

  19. Toward realistic pursuit-evasion using a roadmap-based approach

    KAUST Repository

    Rodriguez, Samuel; Denny, Jory; Burgos, Juan; Mahadevan, Aditya; Manavi, Kasra; Murray, Luke; Kodochygov, Anton; Zourntos, Takis; Amato, Nancy M.

    2011-01-01

    be applied to more realistic scenarios than are typically studied in most previous work, including agents moving in 3D environments such as terrains, multi-story buildings, and dynamic environments. We also support more realistic three-dimensional visibility

  20. SU-F-J-208: Prompt Gamma Imaging-Based Prediction of Bragg Peak Position for Realistic Treatment Error Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Xing, Y; Macq, B; Bondar, L [Universite catholique de Louvain, Louvain-la-Neuve (Belgium); Janssens, G [IBA, Louvain-la-Neuve (Belgium)

    2016-06-15

    Purpose: To quantify the accuracy in predicting the Bragg peak position using simulated in-room measurements of prompt gamma (PG) emissions for realistic treatment error scenarios that combine several sources of errors. Methods: Prompt gamma measurements by a knife-edge slit camera were simulated using an experimentally validated analytical simulation tool. Simulations were performed, for 143 treatment error scenarios, on an anthropomorphic phantom and a pencil beam scanning plan for nasal cavity. Three types of errors were considered: translation along each axis, rotation around each axis, and CT-calibration errors with magnitude ranging respectively, between −3 and 3 mm, −5 and 5 degrees, and between −5 and +5%. We investigated the correlation between the Bragg peak (BP) shift and the horizontal shift of PG profiles. The shifts were calculated between the planned (reference) position and the position by the error scenario. The prediction error for one spot was calculated as the absolute difference between the PG profile shift and the BP shift. Results: The PG shift was significantly and strongly correlated with the BP shift for 92% of the cases (p<0.0001, Pearson correlation coefficient R>0.8). Moderate but significant correlations were obtained for all cases that considered only CT-calibration errors and for 1 case that combined translation and CT-errors (p<0.0001, R ranged between 0.61 and 0.8). The average prediction errors for the simulated scenarios ranged between 0.08±0.07 and 1.67±1.3 mm (grand mean 0.66±0.76 mm). The prediction error was moderately correlated with the value of the BP shift (p=0, R=0.64). For the simulated scenarios the average BP shift ranged between −8±6.5 mm and 3±1.1 mm. Scenarios that considered combinations of the largest treatment errors were associated with large BP shifts. Conclusion: Simulations of in-room measurements demonstrate that prompt gamma profiles provide reliable estimation of the Bragg peak position for
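
    The evaluation reduces to comparing two per-spot quantities: the horizontal shift of the prompt gamma profile and the shift of the Bragg peak, both measured relative to the planned position. A small sketch of that comparison on synthetic numbers (not data from the study):

      import numpy as np
      from scipy.stats import pearsonr

      rng = np.random.default_rng(1)
      bp_shift = rng.uniform(-8.0, 3.0, size=200)            # Bragg peak shifts per spot (mm)
      pg_shift = bp_shift + rng.normal(0.0, 0.7, size=200)   # prompt gamma profile shifts (mm)

      r, p_value = pearsonr(pg_shift, bp_shift)              # strength of the PG/BP relation
      prediction_error = np.abs(pg_shift - bp_shift)         # per-spot prediction error (mm)

      print(f"R = {r:.2f}, p = {p_value:.1e}")
      print(f"mean error = {prediction_error.mean():.2f} +/- {prediction_error.std():.2f} mm")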

  1. On Small Antenna Measurements in a Realistic MIMO Scenario

    DEFF Research Database (Denmark)

    Yanakiev, Boyan; Nielsen, Jesper Ødum; Pedersen, Gert Frølund

    2010-01-01

    This paper deals with the challenges related to evaluating the performance of multiple, small terminal antennas within a natural MIMO environment. The focus is on the antenna measurement accuracy. First, a method is presented for measuring small phone mock-ups with the use of optical fibers. The problem of using coaxial cable is explained and a solution suitable for long-distance channel sounding is presented. A large-scale measurement campaign is then described. Special attention is paid to bringing the measurement setup as close as possible to a realistic LTE network of the future, with attention...

  2. Toward realistic pursuit-evasion using a roadmap-based approach

    KAUST Repository

    Rodriguez, Samuel

    2011-05-01

    In this work, we describe an approach for modeling and simulating group behaviors for pursuit-evasion that uses a graph-based representation of the environment and integrates multi-agent simulation with roadmap-based path planning. Our approach can be applied to more realistic scenarios than are typically studied in most previous work, including agents moving in 3D environments such as terrains, multi-story buildings, and dynamic environments. We also support more realistic three-dimensional visibility computations that allow evading agents to hide in crowds or behind hills. We demonstrate the utility of this approach on mobile robots and in simulation for a variety of scenarios including pursuit-evasion and tag on terrains, in multi-level buildings, and in crowds. © 2011 IEEE.

  3. Modeling and Analysis of Realistic Fire Scenarios in Spacecraft

    Science.gov (United States)

    Brooker, J. E.; Dietrich, D. L.; Gokoglu, S. A.; Urban, D. L.; Ruff, G. A.

    2015-01-01

    An accidental fire inside a spacecraft is an unlikely, but very real emergency situation that can easily have dire consequences. While much has been learned over the past 25+ years of dedicated research on flame behavior in microgravity, a quantitative understanding of the initiation, spread, detection and extinguishment of a realistic fire aboard a spacecraft is lacking. Virtually all combustion experiments in microgravity have been small-scale, by necessity (hardware limitations in ground-based facilities and safety concerns in space-based facilities). Large-scale, realistic fire experiments are unlikely for the foreseeable future (unlike in terrestrial situations). Therefore, NASA will have to rely on scale modeling, extrapolation of small-scale experiments and detailed numerical modeling to provide the data necessary for vehicle and safety system design. This paper presents the results of parallel efforts to better model the initiation, spread, detection and extinguishment of fires aboard spacecraft. The first is a detailed numerical model using the freely available Fire Dynamics Simulator (FDS). FDS is a CFD code that numerically solves a large eddy simulation form of the Navier-Stokes equations. FDS provides a detailed treatment of the smoke and energy transport from a fire. The simulations provide a wealth of information, but are computationally intensive and not suitable for parametric studies where the detailed treatment of the mass and energy transport is unnecessary. The second path extends a model previously documented at ICES meetings that attempted to predict maximum survivable fires aboard spacecraft. This one-dimensional model simplifies the heat and mass transfer as well as toxic species production from a fire. These simplifications result in a code that is faster and more suitable for parametric studies (having already been used to help in the hatch design of the Multi-Purpose Crew Vehicle, MPCV).

  4. Analysis of Heterogeneous Networks with Dual Connectivity in a Realistic Urban Deployment

    DEFF Research Database (Denmark)

    Gerardino, Guillermo Andrés Pocovi; Barcos, Sonia; Wang, Hua

    2015-01-01

    the performance in this realistic layout. Due to the uneven load distribution observed in realistic deployments, DC is able to provide fast load balancing gains also at relatively high load - and not only at low load as typically observed in 3GPP scenarios. For the same reason, the proposed cell selection...

  5. Determination of Realistic Fire Scenarios in Spacecraft

    Science.gov (United States)

    Dietrich, Daniel L.; Ruff, Gary A.; Urban, David

    2013-01-01

    This paper expands on previous work that examined how large a fire a crew member could successfully survive and extinguish in the confines of a spacecraft. The hazards to the crew and equipment during an accidental fire include excessive pressure rise resulting in a catastrophic rupture of the vehicle skin, excessive temperatures that burn or incapacitate the crew (due to hyperthermia), carbon dioxide build-up or accumulation of other combustion products (e.g. carbon monoxide). The previous work introduced a simplified model that treated the fire primarily as a source of heat and combustion products and sink for oxygen prescribed (input to the model) based on terrestrial standards. The model further treated the spacecraft as a closed system with no capability to vent to the vacuum of space. The model in the present work extends this analysis to more realistically treat the pressure relief system(s) of the spacecraft, include more combustion products (e.g. HF) in the analysis and attempt to predict the fire spread and limiting fire size (based on knowledge of terrestrial fires and the known characteristics of microgravity fires) rather than prescribe them in the analysis. Including the characteristics of vehicle pressure relief systems has a dramatic mitigating effect by eliminating vehicle overpressure for all but very large fires and reducing average gas-phase temperatures.

  6. Testing Realistic Disaster Scenarios for Space Weather: The Economic Impacts of Electricity Transmission Infrastructure Failure in the UK

    Science.gov (United States)

    Gibbs, M.; Oughton, E. J.; Hapgood, M. A.

    2017-12-01

    The socio-economic impacts of space weather have been under-researched, despite this threat featuring on the UK's National Risk Register. In this paper, a range of Realistic Disaster Scenarios due to failure in electricity transmission infrastructure are tested. We use regional Geomagnetically Induced Current (GIC) studies to identify areas in the UK high-voltage power system deemed to be high-risk. The potential level of disruption to local economic activity arising from a large geomagnetic disturbance in these 'hot spots' is explored. Electricity is a necessary factor of production without which businesses cannot operate, so even short term power loss can cause significant loss of value. We utilise a spatially disaggregated approach that focuses on quantifying employment disruption by industrial sector, and relating this to direct Gross Value Added loss. We then aggregate this direct loss into a set of shocks to undertake macroeconomic modelling of different scenarios, to obtain the total economic impact which includes both direct and indirect supply chain disruption effects. These results are reported for a range of temporal periods, with the minimum increment being a one-hour blackout. This work furthers our understanding of the economic impacts of space weather, and can inform future reviews of the UK's National Risk Register. The key contribution of the paper is that the results can be used in the future cost-benefit analysis of investment in space weather forecasting.
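
    The direct-loss half of such an assessment is essentially an accounting identity: disrupted employment per sector times that sector's value added per worker times the outage duration. A sketch of the shape of that calculation with invented figures (the study itself uses spatially disaggregated UK data and then feeds the resulting shocks into a macroeconomic model to capture indirect supply-chain losses):

      # Hypothetical sector figures (not from the study), for a 1-hour blackout
      sectors = {
          "manufacturing": {"workers_disrupted": 50_000, "gva_per_worker_per_year": 70_000.0},
          "services":      {"workers_disrupted": 120_000, "gva_per_worker_per_year": 55_000.0},
      }
      WORKING_HOURS_PER_YEAR = 1_700.0   # assumed average
      outage_hours = 1.0                 # minimum increment used in the paper

      direct_loss = sum(
          s["workers_disrupted"] * s["gva_per_worker_per_year"] / WORKING_HOURS_PER_YEAR * outage_hours
          for s in sectors.values()
      )
      print(f"Direct GVA loss: {direct_loss / 1e6:.1f} million (currency units)")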

  7. GADEN: A 3D Gas Dispersion Simulator for Mobile Robot Olfaction in Realistic Environments.

    Science.gov (United States)

    Monroy, Javier; Hernandez-Bennets, Victor; Fan, Han; Lilienthal, Achim; Gonzalez-Jimenez, Javier

    2017-06-23

    This work presents a simulation framework developed under the widely used Robot Operating System (ROS) to enable the validation of robotics systems and gas sensing algorithms under realistic environments. The framework is rooted in the principles of computational fluid dynamics and filament dispersion theory, modeling wind flow and gas dispersion in 3D real-world scenarios (i.e., accounting for walls, furniture, etc.). Moreover, it integrates the simulation of different environmental sensors, such as metal oxide gas sensors, photo ionization detectors, or anemometers. We illustrate the potential and applicability of the proposed tool by presenting a simulation case in a complex and realistic office-like environment where gas leaks of different chemicals occur simultaneously. Furthermore, we accomplish quantitative and qualitative validation by comparing our simulated results against real-world data recorded inside a wind tunnel where methane was released under different wind flow profiles. Based on these results, we conclude that our simulation framework can provide a good approximation to real world measurements when advective airflows are present in the environment.

  8. Interactive Web-based Floodplain Simulation System for Realistic Experiments of Flooding and Flood Damage

    Science.gov (United States)

    Demir, I.

    2013-12-01

    Recent developments in web technologies make it easy to manage and visualize large data sets with general public. Novel visualization techniques and dynamic user interfaces allow users to create realistic environments, and interact with data to gain insight from simulations and environmental observations. The floodplain simulation system is a web-based 3D interactive flood simulation environment to create real world flooding scenarios. The simulation systems provides a visually striking platform with realistic terrain information, and water simulation. Students can create and modify predefined scenarios, control environmental parameters, and evaluate flood mitigation techniques. The web-based simulation system provides an environment to children and adults learn about the flooding, flood damage, and effects of development and human activity in the floodplain. The system provides various scenarios customized to fit the age and education level of the users. This presentation provides an overview of the web-based flood simulation system, and demonstrates the capabilities of the system for various flooding and land use scenarios.

  9. Estimating the water table under the Radioactive Waste Management Site in Area 5 of the Nevada Test Site using the Dupuit-Forcheimer approximation

    International Nuclear Information System (INIS)

    Lindstrom, T.F.; Barker, L.E.; Cawlfield, D.E.; Daffern, D.D.; Dozier, B.L.; Emer, D.F.; Strong, W.R.

    1992-01-01

    A two-dimensional steady-state water-flow equation for estimating the water table elevation under a thick, very dry vadose zone is developed and discussed. The Dupuit assumption is made. A prescribed downward vertical infiltration/evaporation condition is assumed at the atmosphere-soil interface. An approximation to the square of the elevation head, based upon multivariate cubic interpolation methods, is introduced. The approximation is forced to satisfy the governing elliptic (Poisson) partial differential equation over the domain of definition. The remaining coefficients are determined by interpolating the water table at eight "boundary points." Several realistic scenarios approximating the water table under the Radioactive Waste Management Site (RWMS) in Area 5 of the Nevada Test Site (NTS) are discussed
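
    Under the Dupuit assumption, the steady-state problem becomes a Poisson equation in the square of the water table elevation, which is what makes the cubic-interpolation construction described above workable. In a standard textbook form (generic notation, not necessarily the report's):

      \[ \frac{\partial^{2}(h^{2})}{\partial x^{2}} + \frac{\partial^{2}(h^{2})}{\partial y^{2}} = -\frac{2N}{K}, \]

    where \(h(x,y)\) is the water table elevation above the aquifer base, \(N\) the prescribed vertical infiltration rate (negative for net evaporation), and \(K\) the hydraulic conductivity.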

  10. Fully Realistic Multi-Criteria Multi-Modal Routing

    OpenAIRE

    Gündling, Felix; Keyhani, Mohammad Hossein; Schnee, Mathias; Weihe, Karsten

    2014-01-01

    We report on a multi-criteria search system, in which the German long- and short-distance trains, local public transport, walking, private car, private bike, and taxi are incorporated. The system is fully realistic. Three optimization criteria are addressed: travel time, travel cost, and convenience. Our algorithmic approach computes a complete Pareto set of reasonable connections. The computational study demonstrates that, even in such a large-scale, highly complex scenario, approp...

  11. Two-Capacitor Problem: A More Realistic View.

    Science.gov (United States)

    Powell, R. A.

    1979-01-01

    Discusses the two-capacitor problem by considering the self-inductance of the circuit used and by determining how well the usual series RC circuit approximates the two-capacitor problem when realistic values of L, C, and R are chosen. (GA)
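
    For context, the paradox in question is that the energy lost when a charged capacitor C1 is connected to an uncharged C2 is independent of the resistance in the ideal RC treatment; including the loop self-inductance L, as the article does, turns the circuit into a series RLC network. The standard textbook relations (not quoted from the article) are:

      \[ L\,\ddot{q} + R\,\dot{q} + \frac{q}{C_{\mathrm{eq}}} = V_{0}, \qquad C_{\mathrm{eq}} = \frac{C_{1}C_{2}}{C_{1}+C_{2}}, \qquad \Delta E = \tfrac{1}{2}\,C_{\mathrm{eq}}\,V_{0}^{2}, \]

    where \(q\) is the charge transferred to \(C_{2}\), \(V_{0}\) the initial voltage on \(C_{1}\), and \(\Delta E\) the energy ultimately dissipated in \(R\) (or radiated), regardless of how small \(R\) and \(L\) are.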

  12. Classification for Safety-Critical Car-Cyclist Scenarios Using Machine Learning

    NARCIS (Netherlands)

    Cara, I.; Gelder, E.D.

    2015-01-01

    The number of fatal car-cyclist accidents is increasing. Advanced Driver Assistance Systems (ADAS) can improve the safety of cyclists, but they need to be tested with realistic safety-critical car-cyclist scenarios. In order to store only relevant scenarios, an online classification algorithm is

  13. Emergency exercise scenario tools

    International Nuclear Information System (INIS)

    Sjoeblom, K.

    1998-03-01

    Nuclear power plant emergency exercises require a realistically presented accident situation which includes various aspects: plant process, radioactivity, radiation, weather and people. Experiences from nuclear power plant emergency exercises show that preparing accident scenarios even for relatively short exercises is tedious. In the future modern computer technology and past experience could be used for making exercise planning more effective. (au)

  14. Security of quantum cryptography with realistic sources

    International Nuclear Information System (INIS)

    Lutkenhaus, N.

    1999-01-01

    The interest in practical implementations of quantum key distribution is steadily growing. However, there is still a need to give a precise security statement which adapts to realistic implementations. In this paper I give the effective key rate we can obtain in a practical setting within the scenario of security against individual attacks by an eavesdropper. It illustrates previous results that high losses together with detector dark counts can make secure quantum key distribution impossible. (Author)

  15. Security of quantum cryptography with realistic sources

    Energy Technology Data Exchange (ETDEWEB)

    Lutkenhaus, N [Helsinki Institute of Physics, P.O. Box 9, 00014 Helsingin yliopisto (Finland)

    1999-08-01

    The interest in practical implementations of quantum key distribution is steadily growing. However, there is still a need to give a precise security statement which adapts to realistic implementations. In this paper I give the effective key rate we can obtain in a practical setting within the scenario of security against individual attacks by an eavesdropper. It illustrates previous results that high losses together with detector dark counts can make secure quantum key distribution impossible. (Author)

  16. Building Realistic Mobility Models for Mobile Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Adrian Pullin

    2018-04-01

    A mobile ad hoc network (MANET) is a self-configuring wireless network in which each node could act as a router, as well as a data source or sink. Its application areas include battlefields and vehicular and disaster areas. Many techniques applied to infrastructure-based networks are less effective in MANETs, with routing being a particular challenge. This paper presents a rigorous study into simulation techniques for evaluating routing solutions for MANETs with the aim of producing more realistic simulation models and, thereby, more accurate protocol evaluations. MANET simulations require models that reflect the world in which the MANET is to operate. Much of the published research uses movement models, such as the random waypoint (RWP) model, with arbitrary world sizes and node counts. This paper presents a technique for developing more realistic simulation models to test and evaluate MANET protocols. The technique is animation, which is applied to a realistic scenario to produce a model that accurately reflects the size and shape of the world, node count, movement patterns, and time period over which the MANET may operate. The animation technique has been used to develop a battlefield model based on established military tactics. Trace data has been used to build a model of maritime movements in the Irish Sea. Similar world models have been built using the random waypoint movement model for comparison. All models have been built using the ns-2 simulator. These models have been used to compare the performance of three routing protocols: dynamic source routing (DSR), destination-sequenced distance-vector routing (DSDV), and ad hoc on-demand distance vector routing (AODV). The findings reveal that protocol performance is dependent on the model used. In particular, it is shown that RWP models do not reflect the performance of these protocols under realistic circumstances, and protocol selection is subject to the scenario to which it is applied. To
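
    The random waypoint model that the study uses as its baseline is easy to state: each node repeatedly picks a uniformly random destination in the simulated world, moves towards it in a straight line at a uniformly random speed, pauses, and repeats. A compact sketch of one node's trace (world size, speed range and pause time are arbitrary illustration values; the paper's own models were built in ns-2):

      import random

      def random_waypoint_trace(steps, world=(1000.0, 1000.0),
                                speed=(1.0, 20.0), pause=30.0):
          """Generate (time, x, y) waypoints for one node under the RWP model."""
          t, x, y = 0.0, random.uniform(0, world[0]), random.uniform(0, world[1])
          trace = [(t, x, y)]
          for _ in range(steps):
              dest_x = random.uniform(0, world[0])
              dest_y = random.uniform(0, world[1])
              v = random.uniform(*speed)
              travel_time = ((dest_x - x) ** 2 + (dest_y - y) ** 2) ** 0.5 / v
              t += travel_time
              x, y = dest_x, dest_y
              trace.append((t, x, y))      # arrive at the waypoint
              t += pause                   # pause before choosing the next one
              trace.append((t, x, y))
          return trace

      for point in random_waypoint_trace(3):
          print(point)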

  17. Phenomenology of a realistic accelerating universe using tracker fields

    Indian Academy of Sciences (India)

    We present a realistic scenario of tracking of scalar fields with varying equation of state. The astrophysical constraints on the evolution of scalar fields in the physical universe are discussed. The nucleosynthesis and the galaxy formation constraints have been used to put limits on and estimate during cosmic evolution.

  18. FttC-Based Fronthaul for 5G Dense/Ultra-Dense Access Network: Performance and Costs in Realistic Scenarios

    Directory of Open Access Journals (Sweden)

    Franco Mazzenga

    2017-10-01

    One distinctive feature of the next 5G systems is the presence of a dense/ultra-dense wireless access network with a large number of access points (or nodes at short distances from each other. Dense/ultra-dense access networks allow for providing very high transmission capacity to terminals. However, the deployment of dense/ultra-dense networks is slowed down by the cost of the fiber-based infrastructure required to connect radio nodes to the central processing units and then to the core network. In this paper, we investigate the possibility for existing FttC access networks to provide fronthaul capabilities for dense/ultra-dense 5G wireless networks. The analysis is realistic in that it is carried out considering an actual access network scenario, i.e., the Italian FttC deployment. It is assumed that access nodes are connected to the Cabinets and to the corresponding distributors by a number of copper pairs. Different types of cities grouped in terms of population have been considered. Results focus on fronthaul transport capacity provided by the FttC network and have been expressed in terms of the available fronthaul bit rate per node and of the achievable coverage.

  19. SMART-DS: Synthetic Models for Advanced, Realistic Testing: Distribution Systems and Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Palmintier, Bryan: Hodge, Bri-Mathias

    2017-01-26

    This presentation provides a Smart-DS project overview and status update for the ARPA-e GRID DATA program meeting 2017, including distribution systems, models, and scenarios, as well as opportunities for GRID DATA collaborations.

  20. The importance of realistic dispersal models in conservation planning: application of a novel modelling platform to evaluate management scenarios in an Afrotropical biodiversity hotspot.

    Science.gov (United States)

    Aben, Job; Bocedi, Greta; Palmer, Stephen C F; Pellikka, Petri; Strubbe, Diederik; Hallmann, Caspar; Travis, Justin M J; Lens, Luc; Matthysen, Erik

    2016-08-01

    As biodiversity hotspots are often characterized by high human population densities, implementation of conservation management practices that focus only on the protection and enlargement of pristine habitats is potentially unrealistic. An alternative approach to curb species extinction risk involves improving connectivity among existing habitat patches. However, evaluation of spatially explicit management strategies is challenging, as predictive models must account for the process of dispersal, which is difficult in terms of both empirical data collection and modelling. Here, we use a novel, individual-based modelling platform that couples demographic and mechanistic dispersal models to evaluate the effectiveness of realistic management scenarios tailored to conserve forest birds in a highly fragmented biodiversity hotspot. Scenario performance is evaluated based on the spatial population dynamics of a well-studied forest bird species. The largest population increase was predicted to occur under scenarios increasing habitat area. However, the effectiveness was sensitive to spatial planning. Compared to adding one large patch to the habitat network, adding several small patches yielded mixed benefits: although overall population sizes increased, specific newly created patches acted as dispersal sinks, which compromised population persistence in some existing patches. Increasing matrix connectivity by the creation of stepping stones is likely to result in enhanced dispersal success and occupancy of smaller patches. Synthesis and applications. We show that the effectiveness of spatial management is strongly driven by patterns of individual dispersal across landscapes. For species conservation planning, we advocate the use of models that incorporate adequate realism in demography and, particularly, in dispersal behaviours.

  1. The European Union, regionalism, and world order: five scenarios

    Directory of Open Access Journals (Sweden)

    Mario Telò

    2011-06-01

    This paper addresses the changing interplay between regional and global governance, with reference to five possible scenarios in the future shift of the international system. A ‘new multilateralism' represents the only possible global framework consistent with an expanding and pluridimensional regional governance system. Furthermore, the weaknesses and contradictions of the alternative scenarios make a multilayered, more robust and legitimate multilateral governance more realistic.

  2. Separable expansion for realistic multichannel scattering problems

    International Nuclear Information System (INIS)

    Canton, L.; Cattapan, G.; Pisent, G.

    1987-01-01

    A new approach to the multichannel scattering problem with realistic local or nonlocal interactions is developed. By employing the negative-energy solutions of uncoupled Sturmian eigenvalue problems referring to simple auxiliary potentials, the coupling interactions appearing in the original multichannel problem are approximated by finite-rank potentials. By resorting to integral-equation techniques, the coupled-channel equations are then reduced to linear algebraic equations which can be straightforwardly solved. Compact algebraic expressions for the relevant scattering matrix elements are thus obtained. The convergence of the method is tested in the single-channel case with realistic optical potentials. Excellent agreement is obtained with a few terms in the separable expansion for both real and absorptive interactions
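
    The essence of the method is to replace each coupling potential by a finite-rank (separable) operator built from Sturmian form factors, after which the coupled Lippmann-Schwinger equations reduce to a finite linear system. Schematically, in generic notation rather than the paper's:

      \[ V \;\approx\; V_{N} \;=\; \sum_{i,j=1}^{N} |g_{i}\rangle\, \lambda_{ij}\, \langle g_{j}|, \qquad T(E) \;=\; V_{N} + V_{N}\, G_{0}(E)\, T(E), \]

    so that the transition operator \(T(E)\) follows from inverting an \(N \times N\) matrix, with \(|g_{i}\rangle\) the Sturmian functions of the auxiliary potentials and \(G_{0}(E)\) the free resolvent.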

  3. "What Line Will You Take with This Person?": Scenarios for Strategic Interaction.

    Science.gov (United States)

    Docker, Julie

    1990-01-01

    Presents two-role scenarios involving persons with conflicting needs or interests to enhance foreign language students' strategic interaction language skills in realistic situations. The scenarios not only encourage students to think through strategies to resolve a problem or conflict but also use their language skills to communicate effectively…

  4. Realistic diversity loss and variation in soil depth independently affect community-level plant nitrogen use.

    Science.gov (United States)

    Selmants, Paul C; Zavaleta, Erika S; Wolf, Amelia A

    2014-01-01

    Numerous experiments have demonstrated that diverse plant communities use nitrogen (N) more completely and efficiently, with implications for how species conservation efforts might influence N cycling and retention in terrestrial ecosystems. However, most such experiments have randomly manipulated species richness and minimized environmental heterogeneity, two design aspects that may reduce applicability to real ecosystems. Here we present results from an experiment directly comparing how realistic and randomized plant species losses affect plant N use across a gradient of soil depth in a native-dominated serpentine grassland in California. We found that the strength of the species richness effect on plant N use did not increase with soil depth in either the realistic or randomized species loss scenarios, indicating that the increased vertical heterogeneity conferred by deeper soils did not lead to greater complementarity among species in this ecosystem. Realistic species losses significantly reduced plant N uptake and altered N-use efficiency, while randomized species losses had no effect on plant N use. Increasing soil depth positively affected plant N uptake in both loss order scenarios but had a weaker effect on plant N use than did realistic species losses. Our results illustrate that realistic species losses can have functional consequences that differ distinctly from randomized losses, and that species diversity effects can be independent of and outweigh those of environmental heterogeneity on ecosystem functioning. Our findings also support the value of conservation efforts aimed at maintaining biodiversity to help buffer ecosystems against increasing anthropogenic N loading.

  5. Realistic modelling of the seismic input: Site effects and parametric studies

    International Nuclear Information System (INIS)

    Romanelli, F.; Vaccari, F.; Panza, G.F.

    2002-11-01

    We illustrate the work done in the framework of a large international cooperation, showing the very recent numerical experiments carried out within the framework of the EC project 'Advanced methods for assessing the seismic vulnerability of existing motorway bridges' (VAB) to assess the importance of non-synchronous seismic excitation of long structures. The definition of the seismic input at the Warth bridge site, i.e. the determination of the seismic ground motion due to an earthquake with a given magnitude and epicentral distance from the site, has been done following a theoretical approach. In order to perform an accurate and realistic estimate of site effects and of differential motion it is necessary to make a parametric study that takes into account the complex combination of the source and propagation parameters, in realistic geological structures. The computation of a wide set of time histories and spectral information, corresponding to possible seismotectonic scenarios for different sources and structural models, allows us to construct damage scenarios that are out of the reach of stochastic models, at a very low cost/benefit ratio. (author)

  6. Information content of slug tests for estimating hydraulic properties in realistic, high-conductivity aquifer scenarios

    Science.gov (United States)

    Cardiff, Michael; Barrash, Warren; Thoma, Michael; Malama, Bwalya

    2011-06-01

    Summary: A recently developed unified model for partially-penetrating slug tests in unconfined aquifers (Malama et al., in press) provides a semi-analytical solution for aquifer response at the wellbore in the presence of inertial effects and wellbore skin, and is able to model the full range of responses from overdamped/monotonic to underdamped/oscillatory. While the model provides a unifying framework for realistically analyzing slug tests in aquifers (with the ultimate goal of determining aquifer properties such as hydraulic conductivity K and specific storage Ss), it is currently unclear whether parameters of this model can be well-identified without significant prior information and, thus, what degree of information content can be expected from such slug tests. In this paper, we examine the information content of slug tests in realistic field scenarios with respect to estimating aquifer properties, through analysis of both numerical experiments and field datasets. First, through numerical experiments using Markov Chain Monte Carlo methods for gauging parameter uncertainty and identifiability, we find that: (1) as noted by previous researchers, estimation of aquifer storage parameters using slug test data is highly unreliable and subject to significant uncertainty; (2) joint estimation of aquifer and skin parameters contributes to significant uncertainty in both unless prior knowledge is available; and (3) similarly, without prior information joint estimation of both aquifer radial and vertical conductivity may be unreliable. These results have significant implications for the types of information that must be collected prior to slug test analysis in order to obtain reliable aquifer parameter estimates. For example, plausible estimates of aquifer anisotropy ratios and bounds on wellbore skin K should be obtained, if possible, a priori. Secondly, through analysis of field data - consisting of over 2500 records from partially-penetrating slug tests in a

  7. Realistic Planning Scenarios.

    Science.gov (United States)

    1987-07-01

    independent multiracial government, dominated primarily by the Zulu tribe and the local Asian population, had been proclaimed and aspired to control all of the... concentrated most of South Africa's remaining English-speaking population, and by the reigning Chief of the Zulu tribe, speaking for the self-styled... Africa. Facilities in one or more northern African countries--Morocco, Egypt, Sudan, Kenya, Somalia--could be critical to U.S. military actions in the

  8. Any realistic theory must be computationally realistic: a response to N. Gisin's definition of a Realistic Physics Theory

    OpenAIRE

    Bolotin, Arkady

    2014-01-01

    It is argued that the recent definition of a realistic physics theory by N. Gisin cannot be considered comprehensive unless it is supplemented with the requirement that any realistic theory must be computationally realistic as well.

  9. Visualizing Our Options for Coastal Places: Exploring Realistic Immersive Geovisualizations as Tools for Inclusive Approaches to Coastal Planning and Management

    Directory of Open Access Journals (Sweden)

    Robert Newell

    2017-09-01

    Full Text Available Effective coastal planning is inclusive and incorporates the variety of user needs, values and interests associated with coastal environments. Realistic, immersive geographic visualizations, i.e., geovisualizations, can serve as potentially powerful tools for facilitating such planning because they can provide diverse groups with vivid understandings of how they would feel about certain management outcomes or impacts if these transpired in real places. However, the majority of studies in this area have focused on terrestrial environments, and research on applications of such tools in coastal and marine contexts is in its infancy. The current study aims to advance such research by examining the potential of a land-to-sea geovisualization to serve as a tool for inclusive coastal planning efforts. The research uses Sidney Spit Park (BC, Canada) as a study site, and a realistic, dynamic geovisualization of the park was developed (using Unity3D) that allows users to interact with and navigate it from the first-person perspective. Management scenarios were developed based on discussions with Parks Canada, and these scenarios included fencing around vegetation areas, positioning of mooring buoys, and management of dog activity within the park. Scenarios were built into the geovisualization in a manner that allows users to toggle different options. Focus groups were then assembled, involving residents of the Capital Regional District (BC, Canada), and participants explored and provided feedback on the scenarios. Findings from the study demonstrate the geovisualization's usefulness for assessing certain qualities of scenarios, such as aesthetics and functionality of fencing options and potential viewshed impacts associated with different mooring boat locations. In addition, the study found that incorporating navigability into the geovisualization proved to be valuable for understanding scenarios that hold implications for the marine environment due to

  10. Realistic Simulations of Coronagraphic Observations with WFIRST

    Science.gov (United States)

    Rizzo, Maxime; Zimmerman, Neil; Roberge, Aki; Lincowski, Andrew; Arney, Giada; Stark, Chris; Jansen, Tiffany; Turnbull, Margaret; WFIRST Science Investigation Team (Turnbull)

    2018-01-01

    We present a framework to simulate observing scenarios with the WFIRST Coronagraphic Instrument (CGI). The Coronagraph and Rapid Imaging Spectrograph in Python (crispy) is an open-source package that can be used to create CGI data products for analysis and development of post-processing routines. The software convolves time-varying coronagraphic PSFs with realistic astrophysical scenes which contain a planetary architecture, a consistent dust structure, and a background field composed of stars and galaxies. The focal plane can be read out by a WFIRST electron-multiplying CCD model directly, or passed through a WFIRST integral field spectrograph model first. Several elementary post-processing routines are provided as part of the package.
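
    The core operation described here is the convolution of a time-varying coronagraphic PSF with a synthetic astrophysical scene to build up a simulated data cube. The sketch below shows that operation in generic form with numpy/scipy; the Gaussian PSF, scene contents and array shapes are illustrative stand-ins and do not reflect the actual crispy API or WFIRST CGI optics.

```python
# Illustrative sketch of convolving a synthetic astrophysical scene with
# time-varying PSFs; names and shapes are hypothetical, not the crispy API.
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(1)
n = 128                                       # focal-plane size in pixels

def gaussian_psf(n, fwhm):
    """Stand-in for a coronagraphic PSF at one time step."""
    sigma = fwhm / 2.355
    y, x = np.mgrid[:n, :n] - n // 2
    psf = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return psf / psf.sum()

# Toy scene: a faint planet, a smooth dust halo and a few background objects
scene = np.zeros((n, n))
scene[70, 80] = 1e-8                          # planet (contrast units)
yy, xx = np.mgrid[:n, :n] - n // 2
scene += 1e-10 * np.exp(-np.hypot(xx, yy) / 30.0)               # dust halo
scene[rng.integers(0, n, 20), rng.integers(0, n, 20)] += 1e-9   # field objects

# Time-varying PSFs (here: slowly drifting FWHM) convolved frame by frame
frames = []
for fwhm in np.linspace(3.0, 3.5, 10):
    frames.append(fftconvolve(scene, gaussian_psf(n, fwhm), mode="same"))

cube = np.stack(frames)                       # (time, y, x) data cube
print(cube.shape, cube.max())
```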

  11. I-Love relations for incompressible stars and realistic stars

    Science.gov (United States)

    Chan, T. K.; Chan, AtMa P. O.; Leung, P. T.

    2015-02-01

    In spite of the diversity in the equations of state of nuclear matter, the recently discovered I-Love-Q relations [Yagi and Yunes, Science 341, 365 (2013), 10.1126/science.1236462], which relate the moment of inertia, tidal Love number (deformability), and the spin-induced quadrupole moment of compact stars, hold for various kinds of realistic neutron stars and quark stars. While the physical origin of such universality is still a current issue, the observation that the I-Love-Q relations of incompressible stars can well approximate those of realistic compact stars hints at a new direction to approach the problem. In this paper, by establishing recursive post-Minkowskian expansion for the moment of inertia and the tidal deformability of incompressible stars, we analytically derive the I-Love relation for incompressible stars and show that the so-obtained formula can be used to accurately predict the behavior of realistic compact stars from the Newtonian limit to the maximum mass limit.

  12. Scenario Archetypes: Converging Rather than Diverging Themes

    Directory of Open Access Journals (Sweden)

    Jon P. Sadler

    2012-04-01

    Full Text Available Future scenarios provide challenging, plausible and relevant stories about how the future could unfold. Urban Futures (UF) research has identified a substantial set (>450) of seemingly disparate scenarios published over the period 1997–2011, and within this research a sub-set of >160 scenarios has been identified (and categorized) based on their narratives according to the structure first proposed by the Global Scenario Group (GSG) in 1997: three world types (Business as Usual, Barbarization, and Great Transitions) and six scenarios, two for each world type (Policy Reform—PR, Market Forces—MF, Breakdown—B, Fortress World—FW, Eco-Communalism—EC and New Sustainability Paradigm—NSP). It is suggested that four of these scenario archetypes (MF, PR, NSP and FW) are sufficiently distinct to facilitate active stakeholder engagement in futures thinking. Moreover, they are accompanied by a well-established, internally consistent set of narratives that provide a deeper understanding of the key fundamental drivers (e.g., STEEP—Social, Technological, Economic, Environmental and Political) that could bring about realistic world changes through a push or a pull effect. This is a testament to the original concept of the GSG scenarios and their development and refinement over a 16-year period.

  13. Study on Earth Radiation Budget mission scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Dlhopolsky, R; Hollmann, R; Mueller, J; Stuhlmann, R [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Atmosphaerenphysik

    1998-12-31

    The goal of this study is to investigate optimized satellite configurations for observation of the radiation balance of the earth. We present a literature survey of earth radiation budget missions and instruments. We develop a parametric tool to simulate realistic multiple-satellite mission scenarios. This tool is a modular computer program which models satellite orbits and scanning operation. We use Meteosat data sampled at three-hour intervals as a database to simulate atmospheric scenes. Input variables are satellite equatorial crossing time and instrument characteristics. Regional, zonal and global monthly averages of shortwave and longwave fluxes for an ideal observing system and several realistic satellite scenarios are produced. Comparisons show that the three-satellite combinations which have equatorial crossing times at midmorning, noon and midafternoon provide the best shortwave monitoring. Crossing times near sunrise and sunset should be avoided for the shortwave. Longwave diurnal models are necessary over land surfaces and cloudy regions if there are only two measurements made during daylight hours. We have found in the shortwave inversion comparison that at least 15% of the monthly regional errors can be attributed to the shortwave anisotropic models used. (orig.) 68 refs.

  15. The scenario-based generalization of radiation therapy margins

    International Nuclear Information System (INIS)

    Fredriksson, Albin; Bokrantz, Rasmus

    2016-01-01

    We give a scenario-based treatment plan optimization formulation that is equivalent to planning with geometric margins if the scenario doses are calculated using the static dose cloud approximation. If the scenario doses are instead calculated more accurately, then our formulation provides a novel robust planning method that overcomes many of the difficulties associated with previous scenario-based robust planning methods. In particular, our method protects only against uncertainties that can occur in practice, it gives a sharp dose fall-off outside high dose regions, and it avoids underdosage of the target in ‘easy’ scenarios. The method shares the benefits of the previous scenario-based robust planning methods over geometric margins for applications where the static dose cloud approximation is inaccurate, such as irradiation with few fields and irradiation with ion beams. These properties are demonstrated on a suite of phantom cases planned for treatment with scanned proton beams subject to systematic setup uncertainty. (paper)
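
    The key construct is a set of scenario doses (one dose distribution per setup-error scenario) entering a single robust objective, so that the plan is only penalized for errors that can actually occur. The sketch below shows one common way to set this up, a worst-case (minimax) composite of per-scenario penalties over synthetic dose-influence matrices; the penalty form, weights and matrices are illustrative assumptions, not the exact formulation of the paper.

```python
# Minimal sketch of a scenario-based robust objective: penalize target underdose
# and overdose in every setup-error scenario, then optimize the worst case.
# Dose-influence matrices and weights are synthetic placeholders, not the
# formulation of Fredriksson & Bokrantz.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n_vox, n_beamlets, n_scen = 40, 25, 5
target = np.arange(15)                          # voxel indices of the target
prescription = 60.0

# One dose-influence matrix per setup-error scenario (shifted dose cloud)
D = [np.abs(rng.normal(1.0, 0.3, (n_vox, n_beamlets))) for _ in range(n_scen)]

def scenario_penalty(x, Ds):
    d = Ds @ x
    under = np.clip(prescription - d[target], 0.0, None)    # target underdose
    over = np.clip(d - 0.3 * prescription, 0.0, None)       # overdose everywhere
    return np.sum(under**2) + 0.1 * np.sum(over**2)

def worst_case(x):
    return max(scenario_penalty(x, Ds) for Ds in D)          # robust (minimax) objective

x0 = np.full(n_beamlets, 2.0)
res = minimize(worst_case, x0, bounds=[(0.0, None)] * n_beamlets, method="L-BFGS-B")
print("worst-case penalty after optimization:", res.fun)
```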

  16. Remote vital parameter monitoring in neonatology - robust, unobtrusive heart rate detection in a realistic clinical scenario.

    Science.gov (United States)

    Blanik, Nikolai; Heimann, Konrad; Pereira, Carina; Paul, Michael; Blazek, Vladimir; Venema, Boudewijn; Orlikowsky, Thorsten; Leonhardt, Steffen

    2016-12-01

    Vital parameter monitoring of term and preterm infants during incubator care with self-adhesive electrodes or sensors positioned directly on the skin [e.g. photoplethysmography (PPG) for oxygen saturation or electrocardiography (ECG)] is an essential part of daily routine care in neonatal intensive care units. For various reasons, this kind of monitoring causes considerable stress for the infants. Therefore, there is a need to measure vital parameters (for instance respiration, temperature, pulse, oxygen saturation) without mechanical or conductive contact. As a non-contact method of monitoring, we present a version of camera-based photoplethysmography imaging (PPGI) adapted to neonatal requirements. Similar to classic PPG, the PPGI camera detects small temporal changes in the term and preterm infant's skin brightness due to the cardiovascular rhythm of dermal blood perfusion. We involved 10 preterm infants in a feasibility study [five males and five females; mean gestational age: 26 weeks (24-28 weeks); mean biological age: 35 days (8-41 days); mean weight at the time of investigation: 960 g (670-1290 g)]. The PPGI camera was placed directly above the incubator, with the infant inside illuminated by an infrared light-emitting diode (LED) array (850 nm). From each preterm infant, 5-min video sequences were recorded and analyzed post hoc. As the measurement scenario was kept as realistic as possible, the infants were not constrained in their movements in front of the camera. Movement intensities were assigned to five classes (1: no visible motion to 5: heavy struggling). PPGI was found to be significantly sensitive to movement artifacts. However, for movement classes 1-4, changes in blood perfusion according to the heart rate (HR) were recovered successfully (Pearson correlation: r=0.9759; r=0.765 if class 5 is included). The study was approved by the Ethics Committee of the University Hospital of the RWTH Aachen University, Aachen, Germany (EK 254/13).
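
    The essence of the PPGI measurement is that the mean brightness of a skin region fluctuates weakly at the pulse rate. A minimal sketch of that signal path is given below (ROI mean per frame, detrending, band-pass over a plausible neonatal heart-rate range, dominant spectral peak); the input signal is synthesized and the chain is a generic illustration, not the processing used in the study.

```python
# Minimal sketch of PPGI-style heart-rate extraction: mean ROI brightness per
# frame -> detrend -> band-pass -> dominant FFT peak. The video signal is
# synthesized here; this is not the processing chain used in the study.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 30.0                                   # camera frame rate [Hz]
t = np.arange(0, 60, 1 / fs)                # 60 s recording
hr_true = 150 / 60.0                        # neonatal heart rate ~150 bpm [Hz]

# Synthetic ROI-mean brightness: slow drift + tiny pulsatile component + noise
roi_mean = (100 + 0.5 * np.sin(2 * np.pi * 0.05 * t)
            + 0.05 * np.sin(2 * np.pi * hr_true * t)
            + 0.02 * np.random.default_rng(3).normal(size=t.size))

signal = roi_mean - np.polyval(np.polyfit(t, roi_mean, 3), t)   # detrend

# Band-pass 1.5-4 Hz (90-240 bpm), a plausible neonatal range
b, a = butter(3, [1.5 / (fs / 2), 4.0 / (fs / 2)], btype="band")
filtered = filtfilt(b, a, signal)

spectrum = np.abs(np.fft.rfft(filtered))
freqs = np.fft.rfftfreq(filtered.size, 1 / fs)
hr_est = freqs[np.argmax(spectrum)] * 60.0
print(f"estimated heart rate: {hr_est:.1f} bpm")
```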

  17. Does extreme precipitation intensity depend on the emissions scenario?

    Science.gov (United States)

    Pendergrass, Angeline; Lehner, Flavio; Sanderson, Benjamin; Xu, Yangyang

    2016-04-01

    The rate of increase of global-mean precipitation per degree surface temperature increase differs for greenhouse gas and aerosol forcings, and therefore depends on the change in composition of the emissions scenario used to drive climate model simulations for the remainder of the century. We investigate whether or not this is also the case for extreme precipitation simulated by a multi-model ensemble driven by four realistic emissions scenarios. In most models, the rate of increase of maximum annual daily rainfall per degree global warming in the multi-model ensemble is statistically indistinguishable across the four scenarios, whether this extreme precipitation is calculated globally, over all land, or over extra-tropical land. These results indicate that, in most models, extreme precipitation depends on the total amount of warming and does not depend on emissions scenario, in contrast to mean precipitation.

  18. The Reference Scenarios for the Swiss Emergency Planning

    International Nuclear Information System (INIS)

    Hanspeter Isaak; Navert, Stephan B.; Ralph Schulz

    2006-01-01

    For the purpose of emergency planning and preparedness, realistic reference scenarios and corresponding accident source terms have been defined on the basis of common plant features. Three types of representative reference scenarios encompass the accident sequences expected to be the most probable. Accident source terms are assumed to be identical for all Swiss nuclear power plants, although the plants differ in reactor type and power. Plant-specific probabilistic safety analyses were used to justify the reference scenarios and the postulated accident source terms. From the full spectrum of release categories available, those categories were selected which would be covered by the releases and time frames assumed in the reference scenarios. For each nuclear power plant, the cumulative frequency of accident sequences not covered by the reference scenarios was determined. It was found that the cumulative frequency for such accident sequences does not exceed about 1 × 10⁻⁶ per year. The Swiss Federal Nuclear Safety Inspectorate concludes that the postulated accident source terms for the reference scenarios are consistent with the current international approach in emergency planning, where one should concentrate on the most probable accident sequences. (N.C.)

  19. Ozone concentrations and damage for realistic future European climate and air quality scenarios

    Science.gov (United States)

    Hendriks, Carlijn; Forsell, Nicklas; Kiesewetter, Gregor; Schaap, Martijn; Schöpp, Wolfgang

    2016-11-01

    Ground level ozone poses a significant threat to human health from air pollution in the European Union. While anthropogenic emissions of precursor substances (NOx, NMVOC, CH4) are regulated by EU air quality legislation and will decrease further in the future, the emissions of biogenic NMVOC (mainly isoprene) may increase significantly in the coming decades if short-rotation coppice plantations are expanded strongly to meet the increased biofuel demand resulting from the EU decarbonisation targets. This study investigates the competing effects of anticipated trends in land use change, anthropogenic ozone precursor emissions and climate change on European ground level ozone concentrations and related health and environmental impacts until 2050. The work is based on a consistent set of energy consumption scenarios that underlie current EU climate and air quality policy proposals: a current legislation case, and an ambitious decarbonisation case. The Greenhouse Gas-Air Pollution Interactions and Synergies (GAINS) integrated assessment model was used to calculate air pollutant emissions for these scenarios, while land use change because of bioenergy demand was calculated by the Global Biosphere Model (GLOBIOM). These datasets were fed into the chemistry transport model LOTOS-EUROS to calculate the impact on ground level ozone concentrations. Health damage because of high ground level ozone concentrations is projected to decline significantly towards 2030 and 2050 under current climate conditions for both energy scenarios. Damage to plants is also expected to decrease but to a smaller extent. The projected change in anthropogenic ozone precursor emissions is found to have a larger impact on ozone damage than land use change. The increasing effect of a warming climate (+2-5 °C across Europe in summer) on ozone concentrations and associated health damage, however, might be higher than the reduction achieved by cutting back European ozone precursor emissions. Global

  20. The SAFRR Tsunami Scenario

    Science.gov (United States)

    Porter, K.; Jones, Lucile M.; Ross, Stephanie L.; Borrero, J.; Bwarie, J.; Dykstra, D.; Geist, Eric L.; Johnson, L.; Kirby, Stephen H.; Long, K.; Lynett, P.; Miller, K.; Mortensen, Carl E.; Perry, S.; Plumlee, G.; Real, C.; Ritchie, L.; Scawthorn, C.; Thio, H.K.; Wein, Anne; Whitmore, P.; Wilson, R.; Wood, Nathan J.; Ostbo, Bruce I.; Oates, Don

    2013-01-01

    The U.S. Geological Survey and several partners operate a program called Science Application for Risk Reduction (SAFRR) that produces (among other things) emergency planning scenarios for natural disasters. The scenarios show how science can be used to enhance community resiliency. The SAFRR Tsunami Scenario describes potential impacts of a hypothetical, but realistic, tsunami affecting California (as well as the west coast of the United States, Alaska, and Hawaii) for the purpose of informing planning and mitigation decisions by a variety of stakeholders. The scenario begins with an Mw 9.1 earthquake off the Alaska Peninsula. With Pacific basin-wide modeling, we estimate up to 5m waves and 10 m/sec currents would strike California 5 hours later. In marinas and harbors, 13,000 small boats are damaged or sunk (1 in 3) at a cost of $350 million, causing navigation and environmental problems. Damage in the Ports of Los Angeles and Long Beach amount to $110 million, half of it water damage to vehicles and containerized cargo. Flooding of coastal communities affects 1800 city blocks, resulting in $640 million in damage. The tsunami damages 12 bridge abutments and 16 lane-miles of coastal roadway, costing $85 million to repair. Fire and business interruption losses will substantially add to direct losses. Flooding affects 170,000 residents and workers. A wide range of environmental impacts could occur. An extensive public education and outreach program is underway, as well as an evaluation of the overall effort.

  1. Identification of reference accident scenarios in SEVESO establishments

    International Nuclear Information System (INIS)

    Delvosalle, C.; Fievez, C.; Pipart, A.; Fabrega, J. Casal; Planas, E.; Christou, M.; Mushtaq, F.

    2005-01-01

    In the frame of the ESREL special session on the ARAMIS project, this paper presents the work carried out in the first Work Package, devoted to the definition of accident scenarios. This topic is a key point in risk assessment and serves as the basis for the whole risk quantification. A first part of the work aims at building a Methodology for the Identification of Major Accident Hazards (MIMAH), which is carried out through the development of generic fault and event trees based on a typology of equipment and substances. This work is coupled with an historical analysis of accidents. In a second part, the influence of safety devices and policies will be considered in order to build a Methodology for the Identification of Reference Accident Scenarios (MIRAS). This latter methodology will take into account safety systems and lead to more realistic scenarios.

  2. Global sensitivity analysis of the BSM2 dynamic influent disturbance scenario generator

    DEFF Research Database (Denmark)

    Flores-Alsina, Xavier; Gernaey, Krist V.; Jeppsson, Ulf

    2012-01-01

    This paper presents the results of a global sensitivity analysis (GSA) of a phenomenological model that generates dynamic wastewater treatment plant (WWTP) influent disturbance scenarios. This influent model is part of the Benchmark Simulation Model (BSM) family and creates realistic dry/wet weather

  3. Exploring drivers of energy demand in Cyprus – Scenarios and policy options

    International Nuclear Information System (INIS)

    Zachariadis, Theodoros; Taibi, Emanuele

    2015-01-01

    This paper describes a new set of energy demand forecasts for the Republic of Cyprus up to the year 2040, which have been developed in support of the renewable energy roadmap that was prepared for national authorities by the International Renewable Energy Agency. The analysis takes into account national end-use data from the residential and tertiary sector that had not been exploited up to now. Four final energy demand scenarios with diverging assumptions were defined in this study, offering a wide range of possible outcomes up to 2040; in addition, four alternative scenarios were applied for sensitivity analysis. Two of these scenarios can be regarded as those continuing the trends of the recent past in Cyprus (prior to the economic and financial downturn of years 2011–2014). However, a more rigorous implementation of energy efficiency measures in buildings and transport, as defined in the fourth scenario of this study, is also realistic; despite its potential costs, it might allow Cyprus both to decrease its carbon emissions in line with the long-term EU decarbonisation targets, and to reduce its dependence on fossil fuels, thereby promoting energy efficiency as an important climate change adaptation measure. - Highlights: • Energy demand forecasts for the Republic of Cyprus up to the year 2040 are presented. • Study in the frame of renewable energy roadmap for Cyprus supported by IRENA. • Four scenarios considered, some allowing for breaks with past trends of energy use. • Rigorous implementation of energy efficiency measures is realistic. • Strong energy savings required in line with EU decarbonisation targets.

  4. Does the cosmic no-hair conjecture in brane scenarios follow from general relativity?

    CERN Document Server

    Chakraborty, S

    2003-01-01

    In this paper we examine the cosmic no-hair conjecture (CNHC) in braneworld scenarios. For the validity of this conjecture, in addition to the strong- and weak-energy conditions for the matter field, a similar type of assumption is made on the quadratic correction term and there is a restriction on the non-local term. It is shown using examples with realistic fluid models that strong- and weak-energy conditions are sufficient for the CNHC in braneworld scenarios.

  5. Equivalent sphere approximations for skin, eye, and blood-forming organs

    International Nuclear Information System (INIS)

    Maxson, W.L.; Townsend, L.W.; Bier, S.G.

    1996-01-01

    Throughout the manned spaceflight program, protecting astronauts from space radiation has been the subject of intense study. For interplanetary crews, two main sources of radiation hazards are solar particle events (SPEs) and galactic cosmic rays. For nearly three decades, crew doses and related shielding requirements have been assessed using the assumption that body organ exposures are well approximated by exposures at the center of tissue-equivalent spheres. For the skin and for blood-forming organs (BFOs), these spheres have radii of 0 and 5 cm, respectively. Recent studies indicate that significant overestimation of organ doses occurs if these models are used instead of realistic human geometry models. The use of the latter, however, requires much longer computational times. In this work, the authors propose preliminary revisions to these equivalent sphere approximations that yield more realistic dose estimates

  6. An improved saddlepoint approximation.

    Science.gov (United States)

    Gillespie, Colin S; Renshaw, Eric

    2007-08-01

    Given a set of third- or higher-order moments, not only is the saddlepoint approximation the only realistic 'family-free' technique available for constructing an associated probability distribution, but it is 'optimal' in the sense that it is based on the highly efficient numerical method of steepest descents. However, it suffers from the problem of not always yielding full support, and whilst the neat scaling approach of Wang [S. Wang, General saddlepoint approximations in the bootstrap, Prob. Stat. Lett. 27 (1992) 61] provides a solution to this hurdle, it leads to potentially inaccurate and aberrant results. We therefore propose several new ways of surmounting such difficulties, including: extending the inversion of the cumulant generating function to second order; selecting an appropriate probability structure for higher-order cumulants (the standard moment closure procedure takes them to be zero); and making subtle changes to the target cumulants and then optimising via the simplex algorithm.
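
    For readers unfamiliar with the construction being improved here, the sketch below evaluates the standard first-order saddlepoint density f(x) ≈ exp{K(ŝ) − ŝx} / √(2π K''(ŝ)), where ŝ solves K'(ŝ) = x and K is a cumulant generating function truncated after the fourth cumulant. The cumulant values are made up for illustration; truncating K in this way is exactly the kind of step whose support and accuracy problems the paper addresses.

```python
# Sketch of the first-order saddlepoint density built from a truncated
# cumulant generating function K(s); the cumulant values are illustrative.
import numpy as np
from scipy.optimize import brentq

kappa = [1.0, 2.0, 0.8, 0.5]            # kappa_1..kappa_4 (example values)

def K(s):                               # truncated cumulant generating function
    return kappa[0]*s + kappa[1]*s**2/2 + kappa[2]*s**3/6 + kappa[3]*s**4/24

def K1(s):                              # K'(s)
    return kappa[0] + kappa[1]*s + kappa[2]*s**2/2 + kappa[3]*s**3/6

def K2(s):                              # K''(s), positive for these cumulants
    return kappa[1] + kappa[2]*s + kappa[3]*s**2/2

def saddlepoint_density(x):
    # Solve the saddlepoint equation K'(s) = x on a bracketing interval.
    s_hat = brentq(lambda s: K1(s) - x, -5.0, 5.0)
    return np.exp(K(s_hat) - s_hat * x) / np.sqrt(2 * np.pi * K2(s_hat))

for x in np.linspace(-2.0, 6.0, 9):
    print(f"x = {x:5.2f}   f_sp(x) = {saddlepoint_density(x):.4f}")
```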

  7. Improvements in Total Column Ozone in GEOSCCM and Comparisons with a New Ozone-Depleting Substances Scenario

    Science.gov (United States)

    Oman, Luke D.; Douglass, Anne R.

    2014-01-01

    The evolution of ozone is examined in the latest version of the Goddard Earth Observing System Chemistry-Climate Model (GEOSCCM) using old and new ozone-depleting substances (ODS) scenarios. This version of GEOSCCM includes a representation of the quasi-biennial oscillation, a more realistic implementation of ozone chemistry at high solar zenith angles, an improved air/sea roughness parameterization, and an extra 5 parts per trillion of CH3Br to account for brominated very short-lived substances. Together these additions improve the representation of ozone compared to observations. This improved version of GEOSCCM was used to simulate the ozone evolution for the A1 2010 and the new Stratosphere-troposphere Processes and their Role in Climate (SPARC) 2013 ODS scenarios, the latter derived using the SPARC Lifetimes Report 2013. This new ODS scenario results in a maximum Cltot increase of 65 parts per trillion by volume (pptv), decreasing slightly to 60 pptv by 2100. Approximately 72% of the increase is due to the longer lifetime of CFC-11. The quasi-global (60degS-60degN) total column ozone difference is relatively small, less than 1 Dobson unit on average, and consistent with the 3-4% larger 2050-2080 average Cly in the new SPARC 2013 scenario. Over high latitudes, this small change in Cly compared to the relatively large natural variability makes it not possible to discern a significant impact on ozone in the second half of the 21st century in a single set of simulations.

  8. 2016 Standard Scenarios Report: A U.S. Electricity Sector Outlook

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Wesley [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mai, Trieu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Logan, Jeffrey [National Renewable Energy Lab. (NREL), Golden, CO (United States); Steinberg, Daniel [National Renewable Energy Lab. (NREL), Golden, CO (United States); McCall, James [National Renewable Energy Lab. (NREL), Golden, CO (United States); Richards, James [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sigrin, Benjamin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Porro, Gian [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-11-01

    The National Renewable Energy Laboratory is conducting a study sponsored by the Office of Energy Efficiency and Renewable Energy (EERE) that aims to document and implement an annual process designed to identify a realistic and timely set of input assumptions (e.g., technology cost and performance, fuel costs), and a diverse set of potential futures (standard scenarios), initially for electric sector analysis.

  9. Stochastic congestion management in power markets using efficient scenario approaches

    International Nuclear Information System (INIS)

    Esmaili, Masoud; Amjady, Nima; Shayanfar, Heidar Ali

    2010-01-01

    Congestion management in electricity markets is traditionally performed using deterministic values of system parameters assuming a fixed network configuration. In this paper, a stochastic programming framework is proposed for congestion management considering the power system uncertainties comprising outage of generating units and transmission branches. The Forced Outage Rate of equipment is employed in the stochastic programming. Using the Monte Carlo simulation, possible scenarios of power system operating states are generated and a probability is assigned to each scenario. The performance of the ordinary as well as Lattice rank-1 and rank-2 Monte Carlo simulations is evaluated in the proposed congestion management framework. As a tradeoff between computation time and accuracy, scenario reduction based on the standard deviation of accepted scenarios is adopted. The stochastic congestion management solution is obtained by aggregating individual solutions of accepted scenarios. Congestion management using the proposed stochastic framework provides a more realistic solution compared with traditional deterministic solutions. Results of testing the proposed stochastic congestion management on the 24-bus reliability test system indicate the efficiency of the proposed framework.
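
    The scenario machinery described here has two parts: sampling operating states from equipment Forced Outage Rates, and reducing the sampled set before optimization. The sketch below illustrates both with ordinary Monte Carlo draws and a simple keep-the-most-probable-states reduction up to a probability-mass target; the FOR values are invented and the reduction rule is a simplification of the standard-deviation-based criterion used in the paper.

```python
# Sketch of Monte Carlo scenario generation from forced outage rates (FOR) and
# a simple reduction step. The reduction rule used here (keep the most probable
# distinct states up to a probability-mass target) is a simplification of the
# standard-deviation-based criterion described in the paper.
import numpy as np
from collections import Counter

rng = np.random.default_rng(4)
FOR = np.array([0.04, 0.06, 0.08, 0.05, 0.03])    # outage prob. of 5 components

n_draws = 20000
# 1 = in service, 0 = on outage
states = (rng.uniform(size=(n_draws, FOR.size)) > FOR).astype(int)

# Empirical probability of each distinct operating state
counts = Counter(map(tuple, states))
scenarios = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)

# Scenario reduction: keep the most probable states covering 99% of the mass
kept, mass = [], 0.0
for state, c in scenarios:
    kept.append((state, c / n_draws))
    mass += c / n_draws
    if mass >= 0.99:
        break

print(f"distinct states sampled: {len(scenarios)}, kept after reduction: {len(kept)}")
for state, p in kept[:5]:
    print(state, f"p = {p:.4f}")
```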

  10. Realistic modeling of seismic input for megacities and large urban areas

    International Nuclear Information System (INIS)

    Panza, Giuliano F.; Alvarez, Leonardo; Aoudia, Abdelkrim

    2002-06-01

    The project addressed the problem of pre-disaster orientation: hazard prediction, risk assessment, and hazard mapping, in connection with seismic activity and man-induced vibrations. The definition of realistic seismic input has been obtained from the computation of a wide set of time histories and spectral information, corresponding to possible seismotectonic scenarios for different source and structural models. The innovative modeling technique, which constitutes the common tool of the entire project, takes into account source, propagation and local site effects. This is done using first principles of physics about wave generation and propagation in complex media, and does not require resorting to convolutive approaches, which have proven quite unreliable, mainly when dealing with complex geological structures, the most interesting ones from the practical point of view. In fact, several techniques that have been proposed to empirically estimate site effects, using observations convolved with theoretically computed signals corresponding to simplified models, supply reliable information only about the site response to non-interfering seismic phases. They are not adequate in most real cases, when the seismic signal is formed by several interfering waves. The availability of realistic numerical simulations enables us to reliably estimate the amplification effects even in complex geological structures, exploiting the available geotechnical, lithological and geophysical parameters, the topography of the medium, tectonic, historical and palaeoseismological data, and seismotectonic models. The realistic modeling of the ground motion is a very important base of knowledge for the preparation of ground-shaking scenarios, which represent a valid and economical tool for seismic microzonation. This knowledge can be very fruitfully used by civil engineers in the design of new seismo-resistant constructions and in the reinforcement of the existing built environment, and, therefore

  11. ILC Operating Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Barklow, T.; Brau, J.; Fujii, K.; Gao, J.; List, J.; Walker, N.; Yokoya, K.; Collaboration: ILC Parameters Joint Working Group

    2015-06-15

    The ILC Technical Design Report documents the design for the construction of a linear collider which can be operated at energies up to 500 GeV. This report summarizes the outcome of a study of possible running scenarios, including a realistic estimate of the real time accumulation of integrated luminosity based on ramp-up and upgrade processes. The evolution of the physics outcomes is emphasized, including running initially at 500 GeV, then at 350 GeV and 250 GeV. The running scenarios have been chosen to optimize the Higgs precision measurements and top physics while searching for evidence for signals beyond the standard model, including dark matter. In addition to the certain precision physics on the Higgs and top that is the main focus of this study, there are scientific motivations that indicate the possibility for discoveries of new particles in the upcoming operations of the LHC or the early operation of the ILC. Follow-up studies of such discoveries could alter the plan for the centre-of-mass collision energy of the ILC and expand the scientific impact of the ILC physics program. It is envisioned that a decision on a possible energy upgrade would be taken near the end of the twenty year period considered in this report.

  12. Weather scenarios for dose calculations with incomplete meteorological data. V.IV

    International Nuclear Information System (INIS)

    Alp, E.; Lam, L.H.; Moran, M.D.

    1985-09-01

    This report documents a study to substantiate or modify the weather scenarios proposed by the Atomic Energy Control Board Staff Position Paper on meteorological acceptance criteria for estimating the potential radiological consequences of postulated accidents (AECB, 1982) for short-, prolonged-, and long-term releases from ground-level and elevated sources. The study examined available meteorological data in Canada to determine whether the AECB-proposed scenarios are sufficiently general that they are appropriate and conservative for any potential nuclear power plant in Canada, but also realistic, i.e., not so conservative that the results of dose calculations using these scenarios would be wholly unrepresentative, leading to incorrect design decisions. Three different sets of scenarios were derived using three site-specific data sets from weather stations that are representative of existing nuclear power plants in Canada. When compared, the scenarios for the three sites are not significantly different from each other, especially in terms of trends, considering that they have been based on data from widely differing meteorological regions in Canada. Conservative envelopes of the scenarios for the three sites were taken to give the recommended general weather scenario set. The recommended set was then compared with the AECB-proposed scenarios. The recommended scenarios are, in general, conservative

  13. Reliable Freestanding Position-Based Routing in Highway Scenarios

    Science.gov (United States)

    Galaviz-Mosqueda, Gabriel A.; Aquino-Santos, Raúl; Villarreal-Reyes, Salvador; Rivera-Rodríguez, Raúl; Villaseñor-González, Luis; Edwards, Arthur

    2012-01-01

    Vehicular Ad Hoc Networks (VANETs) are considered by car manufacturers and the research community as the enabling technology to radically improve the safety, efficiency and comfort of everyday driving. However, before VANET technology can fulfill all its expected potential, several difficulties must be addressed. One key issue arising when working with VANETs is the complexity of the networking protocols compared to those used by traditional infrastructure networks. Therefore, proper design of the routing strategy becomes a main issue for the effective deployment of VANETs. In this paper, a reliable freestanding position-based routing algorithm (FPBR) for highway scenarios is proposed. For this scenario, several important issues such as the high mobility of vehicles and the propagation conditions may affect the performance of the routing strategy. These constraints have only been partially addressed in previous proposals. In contrast, the design approach used for developing FPBR considered the constraints imposed by a highway scenario and implements mechanisms to overcome them. FPBR performance is compared to that of one of the leading protocols for highway scenarios. Performance metrics show that FPBR yields similar results when considering free-space propagation conditions, and outperforms the leading protocol when considering a realistic highway path loss model. PMID:23202159
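
    Position-based routing in this setting means each vehicle forwards a packet using only the positions of its neighbours and of the destination, rather than maintained end-to-end routes. The sketch below shows the simplest form of this idea, greedy forwarding along a one-dimensional highway; it is a generic illustration of the principle, not the FPBR protocol, and the node density, radio range and positions are arbitrary.

```python
# Minimal sketch of greedy position-based forwarding on a 1-D highway:
# each hop chooses the in-range neighbour closest to the destination.
# This illustrates the basic idea only; it is not the FPBR protocol.
import numpy as np

rng = np.random.default_rng(5)
radio_range = 250.0                                 # metres
positions = np.sort(rng.uniform(0.0, 3000.0, 40))   # vehicles along the highway
src, dst = positions[0], positions[-1]

def greedy_route(positions, src, dst, radio_range):
    route, current = [src], src
    while current != dst:
        neighbours = positions[np.abs(positions - current) <= radio_range]
        # keep only neighbours that make progress towards the destination
        progress = neighbours[np.abs(neighbours - dst) < np.abs(current - dst)]
        if progress.size == 0:
            return route, False                     # local maximum: greedy failure
        current = progress[np.argmin(np.abs(progress - dst))]
        route.append(current)
    return route, True

route, delivered = greedy_route(positions, src, dst, radio_range)
print("delivered:", delivered, "hops:", len(route) - 1)
```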

  14. Validation of Tilt Gain under Realistic Path Loss Model and Network Scenario

    DEFF Research Database (Denmark)

    Nguyen, Huan Cong; Rodriguez, Ignacio; Sørensen, Troels Bundgaard

    2013-01-01

    Despite being a simple and commonly applied radio optimization technique, base station antenna downtilt has an impact on practical network performance that is not well understood. Most published studies based on empirical path loss models report tilt angles and performance gains that are far higher... than practical experience suggests. We argue in this paper, based on a practical LTE scenario, that the discrepancy partly lies in the path loss model, and show that a more detailed semi-deterministic model leads to both lower gains in terms of SINR, outage probability and downlink throughput... settings, including the use of electrical and/or mechanical antenna downtilt, and therefore it is possible to find multiple optimum tilt profiles in a practical case. A broader implication of this study is that care must be taken when using the 3GPP model to evaluate advanced adaptive antenna techniques

  15. Gauge coupling unification in realistic free-fermionic string models

    International Nuclear Information System (INIS)

    Dienes, K.R.; Faraggi, A.E.

    1995-01-01

    We discuss the unification of gauge couplings within the framework of a wide class of realistic free-fermionic string models which have appeared in the literature, including the flipped SU(5), SO(6)xSO(4), and various SU(3)xSU(2)xU(1) models. If the matter spectrum below the string scale is that of the Minimal Supersymmetric Standard Model (MSSM), then string unification is in disagreement with experiment. We therefore examine several effects that may modify the minimal string predictions. First, we develop a systematic procedure for evaluating the one-loop heavy string threshold corrections in free-fermionic string models, and we explicitly evaluate these corrections for each of the realistic models. We find that these string threshold corrections are small, and we provide general arguments explaining why such threshold corrections are suppressed in string theory. Thus heavy thresholds cannot resolve the disagreement with experiment. We also study the effect of non-standard hypercharge normalizations, light SUSY thresholds, and intermediate-scale gauge structure, and similarly conclude that these effects cannot resolve the disagreement with low-energy data. Finally, we examine the effects of additional color triplets and electroweak doublets beyond the MSSM. Although not required in ordinary grand unification scenarios, such states generically appear within the context of certain realistic free-fermionic string models. We show that if these states exist at the appropriate thresholds, then the gauge couplings will indeed unify at the string scale. Thus, within these string models, string unification can be in agreement with low-energy data. (orig.)

  16. Results of extended plant tests using more realistic exposure scenarios for improving environmental risk assessment of veterinary pharmaceuticals.

    Science.gov (United States)

    Richter, Elisabeth; Berkner, Silvia; Ebert, Ina; Förster, Bernhard; Graf, Nadin; Herrchen, Monika; Kühnen, Ute; Römbke, Jörg; Simon, Markus

    2016-01-01

    Residues of veterinary medicinal products (VMPs) enter the environment via application of manure onto agricultural areas, where antibiotics in particular can cause phytotoxicity. Terrestrial plant tests according to OECD guideline 208 are part of the environmental risk assessment of VMPs. However, this standard approach might not be appropriate for VMPs which form non-extractable residues or transformation products in manure and manure-amended soil. Therefore, a new test design with a more realistic exposure scenario via manure application is needed. This paper presents an extended plant test and its experimental verification with the veterinary antibiotics florfenicol and tylosin tartrate. With each substance, plant tests with four different types of application were conducted: standard tests according to OECD 208 and three tests with application of the test substance via spiked manure, either without storage, aerobically incubated, or anaerobically incubated for different time periods. In the standard tests, the lowest NOEC was found for tylosin tartrate. Pre-tests showed that plant growth was not impaired at 22 g fresh manure/kg dry soil, which was therefore used for the final tests. The application of the test substances via freshly spiked as well as via aerobically incubated manure had no significant influence on the test results. Application of florfenicol via anaerobically incubated manure increased the EC10 by a factor of up to 282 and 540 for the half-maximum and maximum incubation periods, respectively. For tylosin tartrate, this factor amounted to 64 at the half-maximum and 61 at the maximum incubation period. The reduction of phytotoxicity was generally stronger when using cattle manure than pig manure, and particularly in tests with cattle manure, phytotoxicity decreased over the incubation period. The verification of the extended plant test showed that seedling emergence and growth are comparable to a standard OECD 208 test and that reliable effect concentrations could be established. As

  17. Multi-compartment linear noise approximation

    International Nuclear Information System (INIS)

    Challenger, Joseph D; McKane, Alan J; Pahle, Jürgen

    2012-01-01

    The ability to quantify the stochastic fluctuations present in biochemical and other systems is becoming increasingly important. Analytical descriptions of these fluctuations are attractive, as stochastic simulations are computationally expensive. Building on previous work, a linear noise approximation is developed for biochemical models with many compartments, for example cells. The procedure is then implemented in the software package COPASI. This technique is illustrated with two simple examples and is then applied to a more realistic biochemical model. Expressions for the noise, given in the form of covariance matrices, are presented. (paper)
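
    For orientation, the single-compartment relations that the multi-compartment formulation generalizes are recalled below in a hedged sketch; the notation (stoichiometric matrix S, propensity vector f, system size Ω) is generic and may differ from the paper's.

```latex
% Standard (single-compartment) linear noise approximation, written generically.
\begin{align}
  \frac{\mathbf{n}}{\Omega} &\simeq \boldsymbol{\phi}(t) + \Omega^{-1/2}\,\boldsymbol{\xi}(t)
    && \text{(van Kampen ansatz)}\\
  \frac{d\boldsymbol{\phi}}{dt} &= \mathbf{S}\,\mathbf{f}(\boldsymbol{\phi})
    && \text{(macroscopic rate equations)}\\
  \frac{d\boldsymbol{\Sigma}}{dt} &= \mathbf{J}\,\boldsymbol{\Sigma}
    + \boldsymbol{\Sigma}\,\mathbf{J}^{\mathsf T} + \mathbf{D}
    && \text{(covariance of the fluctuations)}
\end{align}
```

    Here \(\mathbf{J}\) is the Jacobian of \(\mathbf{S}\,\mathbf{f}(\boldsymbol{\phi})\) and \(\mathbf{D} = \mathbf{S}\,\mathrm{diag}(\mathbf{f}(\boldsymbol{\phi}))\,\mathbf{S}^{\mathsf T}\); the covariance matrices \(\boldsymbol{\Sigma}\) are the objects reported by the paper, extended there to species distributed over several compartments.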

  18. Biochemical transport modeling, estimation, and detection in realistic environments

    Science.gov (United States)

    Ortner, Mathias; Nehorai, Arye

    2006-05-01

    Early detection and estimation of the spread of a biochemical contaminant are major issues for homeland security applications. We present an integrated approach combining the measurements given by an array of biochemical sensors with a physical model of the dispersion and statistical analysis to solve these problems and provide system performance measures. We approximate the dispersion model of the contaminant in a realistic environment through numerical simulations of reflected stochastic diffusions describing the microscopic transport phenomena due to wind and chemical diffusion, using the Feynman-Kac formula. We consider arbitrarily complex geometries and account for wind turbulence. Localizing the dispersive sources is useful for decontamination purposes and for estimation of the cloud evolution. To solve the associated inverse problem, we propose a Bayesian framework based on a random field that is particularly powerful for localizing multiple sources with small amounts of measurements. We also develop a sequential detector using the numerical transport model we propose. Sequential detection allows on-line analysis and detection of whether a change has occurred. We first focus on the formulation of a suitable sequential detector that overcomes the presence of unknown parameters (e.g. release time, intensity and location). We compute a bound on the expected delay before false detection in order to set the threshold of the test. For a fixed false-alarm rate, we obtain the detection probability of a substance release as a function of its location and initial concentration. Numerical examples are presented for two real-world scenarios: an urban area and an indoor ventilation duct.
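
    The inverse problem described here, inferring source location from a handful of sensor readings, can be illustrated with a much simpler stand-in forward model. The sketch below evaluates a posterior over candidate source positions on a grid under a crude distance-decay concentration model; it conveys the Bayesian localization idea only, and is not the reflected-diffusion/Feynman-Kac transport model or the random-field prior of the paper.

```python
# Sketch of grid-based Bayesian source localization from a small sensor array.
# The forward model below is a crude isotropic-decay stand-in, not the
# reflected stochastic diffusion / Feynman-Kac transport model of the paper.
import numpy as np

rng = np.random.default_rng(6)

def predicted_conc(src, sensors, q=1.0):
    """Concentration at each sensor for a source at `src` with strength q."""
    d = np.linalg.norm(sensors - src, axis=1)
    return q / (1.0 + d**2)                       # simple distance-based decay

sensors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
true_src = np.array([6.5, 3.0])
obs = predicted_conc(true_src, sensors) + rng.normal(0.0, 0.005, len(sensors))

# Posterior over a candidate grid of source locations (flat prior)
xs = np.linspace(0.0, 10.0, 101)
ys = np.linspace(0.0, 10.0, 101)
log_post = np.empty((ys.size, xs.size))
for i, y in enumerate(ys):
    for j, x in enumerate(xs):
        resid = obs - predicted_conc(np.array([x, y]), sensors)
        log_post[i, j] = -0.5 * np.sum((resid / 0.005) ** 2)

post = np.exp(log_post - log_post.max())
post /= post.sum()
i, j = np.unravel_index(np.argmax(post), post.shape)
print("MAP source estimate:", xs[j], ys[i], " true:", true_src)
```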

  19. Eu Enlargement and Migration: Scenarios of Croatian Accession

    Directory of Open Access Journals (Sweden)

    Wadim Strielkowski

    2013-09-01

    Full Text Available This paper analyzes possible incidence of pending Croatian EU accession that is to take place on the 1st of July 2013, on the labour migration from Croatia to the European Union. We apply panel data estimators using the data on emigration from 18 EU countries into Germany (which is the EU country with the largest share of ex-Yugoslav and Croatian migrants in order to construct possible scenarios of Croatian migration to the EU.Three scenarios of migration - pessimistic, realistic and optimistic - are drawn and the sensitivity of estimated coefficients on migration from Croatia into Germany during next 25 years is further discussed in detail. We conclude that, similarly to hypothetical Turkish accession, Croatian EU accession is not going to cause massive migration inflows.

  20. OBEST: The Object-Based Event Scenario Tree Methodology

    International Nuclear Information System (INIS)

    WYSS, GREGORY D.; DURAN, FELICIA A.

    2001-01-01

    Event tree analysis and Monte Carlo-based discrete event simulation have been used in risk assessment studies for many years. This report details how features of these two methods can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology with some of the best features of each. The resultant Object-Based Event Scenarios Tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible (especially those that exhibit inconsistent or variable event ordering, which are difficult to represent in an event tree analysis). Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST method uses a recursive algorithm to solve the object model and identify all possible scenarios and their associated probabilities. Since scenario likelihoods are developed directly by the solution algorithm, they need not be computed by statistical inference based on Monte Carlo observations (as required by some discrete event simulation methods). Thus, OBEST is not only much more computationally efficient than these simulation methods, but it also discovers scenarios that have extremely low probabilities as a natural analytical result--scenarios that would likely be missed by a Monte Carlo-based method. This report documents the OBEST methodology, the demonstration software that implements it, and provides example OBEST models for several different application domains, including interactions among failing interdependent infrastructure systems, circuit analysis for fire risk evaluation in nuclear power plants, and aviation safety studies
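
    The heart of the OBEST solution step is a recursive walk over probabilistic branch points that returns every scenario together with its probability, which is why very low-probability scenarios fall out analytically instead of having to be hit by Monte Carlo sampling. The sketch below shows that enumeration pattern on a toy, independent branch structure; an actual OBEST object model is far richer (object states, messages, variable event ordering), so this is an illustration of the recursion only.

```python
# Generic sketch of recursive enumeration of scenario paths and probabilities
# from probabilistic branch points. The branch structure is a toy with
# independent stages, not an OBEST object model.
def enumerate_scenarios(stages, prefix=(), prob=1.0):
    """Yield (scenario, probability) for every path through the branch points."""
    if not stages:
        yield prefix, prob
        return
    name, outcomes = stages[0]
    for outcome, p in outcomes.items():
        yield from enumerate_scenarios(stages[1:], prefix + ((name, outcome),), prob * p)

# Toy branch points: each stage maps outcomes to their probabilities
stages = [
    ("initiating_event", {"fire": 0.01, "none": 0.99}),
    ("detection",        {"detected": 0.9, "missed": 0.1}),
    ("suppression",      {"works": 0.95, "fails": 0.05}),
]

# Low-probability scenarios appear exactly, rather than by chance sampling
for scenario, p in sorted(enumerate_scenarios(stages), key=lambda sp: -sp[1]):
    print(f"p = {p:.6f}  ", dict(scenario))
```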

  1. Long-term scenarios for global energy demand and supply. Four global greenhouse mitigation scenarios. Final report

    International Nuclear Information System (INIS)

    Soerensen, B.; Meibom, P.; Kuemmel, B.

    1999-01-01

    The scenario method is used to investigate energy demand and supply systems for the 21st century. A geographical information system (GIS) is employed to assess the spatial match between supply and demand, and the robustness of the scenario against changes in assumptions is discussed, for scenarios using fossil fuels without carbon dioxide emissions, nuclear fuels with reduced accident and proliferation risks, and renewable energy from local and from more centralised installations. The year-2050 demand scenario is based on a very high goal satisfaction in all regions of the world, for the middle UN population projection. All energy efficiency measures that are technically ready and economic today are assumed to be in effect by year 2050. An increased fraction of total activities is assumed to occur in non-material sectors. Technical, economic and implementation issues are discussed, including the resilience to changes, particularly in demand assumptions, and the type of framework that would allow energy policy to employ any of (or a mix of) the scenario options. Results are presented as average energy flows per unit of land area. This geographically based presentation method gives additional insights, particularly for the dispersed renewable energy systems, but in all cases it allows the need for energy transmission and trade between regions to be identified and displayed in a visually suggestive fashion. The scenarios are examples of greenhouse mitigation scenarios, all characterised by near-zero emissions of greenhouse gases to the atmosphere. All are more expensive than the present system, but only if the cost of the negative impacts of the current system is neglected. As options for global energy policy during the next decades, the clean fossil and the renewable energy options (possibly in combination) are the only realistic ones, because the safe nuclear option requires research and development that most likely will take longer time, if it can at all be carried

  3. BIOMOVS test scenario model comparison using BIOPATH

    International Nuclear Information System (INIS)

    Grogan, H.A.; Van Dorp, F.

    1986-07-01

    This report presents the results of the irrigation test scenario, defined in the BIOMOVS intercomparison study, calculated with the computer code BIOPATH. This scenario defines a constant release of Tc-99 and Np-237 into groundwater that is used for irrigation. The system of compartments used to model the biosphere is based upon an area in northern Switzerland and is essentially the same as that used in Projekt Gewaehr to assess the radiological impact of a high-level waste repository. Two separate irrigation methods are considered, namely ditch and overhead irrigation. Their influence on the resultant activities calculated in the groundwater, soil and different food products, as a function of time, is evaluated. The sensitivity of the model to parameter variations is analysed, which allows a deeper understanding of the model chain. These results are assessed subjectively in a first effort to realistically quantify the uncertainty associated with each calculated activity. (author)
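
    The biosphere is represented here as a chain of compartments exchanging activity through first-order transfer coefficients, which leads to a small system of linear ODEs for the compartment inventories. The sketch below integrates such a system for a constant release into groundwater; the compartments, transfer rates and the Tc-99-only decay term are illustrative placeholders and do not reproduce the BIOPATH / Projekt Gewaehr model structure.

```python
# Sketch of a small linear compartment model (constant release into groundwater,
# first-order transfers to soil and crops, radioactive decay). Compartments and
# transfer rates are illustrative only, not the BIOPATH / Projekt Gewaehr model.
import numpy as np
from scipy.integrate import solve_ivp

release = 1.0e6            # constant source into groundwater [Bq/yr]
k_gw_soil = 0.05           # groundwater -> soil via irrigation [1/yr]
k_soil_crop = 0.01         # soil -> crops [1/yr]
k_loss = 0.02              # leaching / harvest losses [1/yr]
lam = np.log(2) / 2.11e5   # Tc-99 decay constant [1/yr]

def rhs(t, y):
    gw, soil, crop = y
    d_gw = release - (k_gw_soil + lam) * gw
    d_soil = k_gw_soil * gw - (k_soil_crop + k_loss + lam) * soil
    d_crop = k_soil_crop * soil - (k_loss + lam) * crop
    return [d_gw, d_soil, d_crop]

sol = solve_ivp(rhs, (0.0, 500.0), [0.0, 0.0, 0.0], dense_output=True)
for ti in np.linspace(0.0, 500.0, 6):
    gw, soil, crop = sol.sol(ti)
    print(f"t = {ti:5.0f} yr   groundwater {gw:9.2e}   soil {soil:9.2e}   crop {crop:9.2e}")
```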

  4. Identification performance of homeland security pocket and fixed devices for masking scenarios

    International Nuclear Information System (INIS)

    Schulcz, Francis; Gunnink, Ray; Giribaldi, Vincent; Ellenboegen, Michal

    2008-01-01

    Full text: The paper presents the principle and results of recent tests of the spectrometric pocket radiation detector PDS100GN/ID and of the spectrometric portal SPIR-Ident. We first considered the masking scenarios listed in the relevant standards on illicit trafficking. We then considered a larger set of realistic masking scenarios, in particular popular medical isotopes mixed with special nuclear material. We estimated, through direct measurements or by simulation based on recorded or synthesized spectra, the limit of imbalance that can be detected. The consequences for the use and settings of the detection and identification devices are discussed. (author)

  5. Weather scenarios for dose calculations with incomplete meteorological data. V.I.(rev.1)

    International Nuclear Information System (INIS)

    Alp, E.; Lam, L.H.; Moran, M.D.

    1985-09-01

    This report documents a study to substantiate or modify the weather scenarios proposed by the Atomic Energy Control Board Staff Position Paper on meteorological acceptance criteria for estimating the potential radiological consequences of postulated accidents (AECB, 1982) for short-, prolonged-, and long-term releases from ground-level and elevated sources. The study examined available meteorological data in Canada to determine whether the AECB-proposed scenarios are sufficiently general that they are appropriate and conservative for any potential nuclear power plant in Canada, but also realistic, i.e., not so conservative that the results of dose calculations using these scenarios would be wholly unrepresentative, leading to incorrect design decisions. Three different sets of scenarios were derived using three site-specific data sets from weather stations that are representative of existing nuclear power plants in Canada. When compared, the scenarios for the three sites are not significantly different from each other, especially in terms of trends, considering that they have been based on data from widely differing meteorological regions in Canada. Conservative envelopes of the scenarios for the three sites were taken to give the recommended general weather scenario set. The recommended set was then compared with the AECB-proposed scenarios. The recommended scenarios are, in general, conservative

  6. Power handling of a segmented bulk W tile for JET under realistic plasma scenarios

    Science.gov (United States)

    Mertens, Ph.; Coenen, J. W.; Eich, T.; Huber, A.; Jachmich, S.; Nicolai, D.; Riccardo, V.; Senik, K.; Samm, U.; JET-EFDA Contributors

    2011-08-01

    A solid tungsten divertor row has been designed for JET in the frame of the ITER-like Wall project (ILW). The plasma-facing tiles are segmented in four stacks of tungsten lamellae oriented in the toroidal direction. Earlier estimations of the expected tile performance were carried out mostly for engineering purposes, to compare the permissible heat load with the power density of 7 MW/m² originally specified for the ILW as a uniform load for 10 s. The global thermal model developed for the W modules delivers results for more realistic plasma footprints: the poloidal extension of the outer strike point was reduced from the full lamella width of 62 mm to ⩾15 mm. Model validation is given by the experimental exposure of a 1:1 prototype stack in the ion beam facility MARION (incidence ≈6°, load E ⩽ 66 MJ/m² on the wetted surface). Spreading the deposited energy by appropriate sweeping over one or several stacks in the torus is beneficial for the tungsten lamellae and for the support structure.

  7. Effects of realistic topography on the ground motion of the Colombian Andes - A case study at the Aburrá Valley, Antioquia

    Science.gov (United States)

    Restrepo, Doriam; Bielak, Jacobo; Serrano, Ricardo; Gómez, Juan; Jaramillo, Juan

    2016-03-01

    This paper presents a set of deterministic 3-D ground motion simulations for the greater metropolitan area of Medellín in the Aburrá Valley, an earthquake-prone region of the Colombian Andes that exhibits moderate-to-strong topographic irregularities. We created the velocity model of the Aburrá Valley region (version 1) using the geological structures as a basis for determining the shear wave velocity. The irregular surficial topography is considered by means of a fictitious domain strategy. The simulations cover a 50 × 50 × 25 km³ volume, and four Mw = 5 rupture scenarios along a segment of the Romeral fault, a significant source of seismic activity in Colombia. In order to examine the sensitivity of ground motion to the irregular topography and the 3-D effects of the valley, each earthquake scenario was simulated with three different models: (i) realistic 3-D velocity structure plus realistic topography, (ii) realistic 3-D velocity structure without topography, and (iii) homogeneous half-space with realistic topography. Our results show how surface topography affects the ground response. In particular, our findings highlight the importance of the combined interaction between source-effects, source-directivity, focusing, soft-soil conditions, and 3-D topography. We provide quantitative evidence of this interaction and show that topographic amplification factors can be as high as 500 per cent at some locations. In other areas within the valley, the topographic effects result in relative reductions, but these lie in the 0–150 per cent range.

  8. Modeling Rocket Flight in the Low-Friction Approximation

    Directory of Open Access Journals (Sweden)

    Logan White

    2014-09-01

    In a realistic model for rocket dynamics, in the presence of atmospheric drag and altitude-dependent gravity, the exact kinematic equation cannot be integrated in closed form; even when neglecting friction, the exact solution is a combination of elliptic functions of Jacobi type, which are not easy to use in a computational sense. This project provides a precise analysis of the various terms in the full equation (such as gravity, drag, and exhaust momentum) and the numerical ranges for which various approximations are accurate to within 1%. The analysis leads to optimal approximations expressed through elementary functions, which can be implemented for efficient flight prediction on simple computational devices, such as smartphone applications.
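    For orientation only, the full one-dimensional ascent equation discussed in this record can be written in a commonly used textbook form. The notation below is an assumption for illustration, not taken from the article: u is the exhaust speed, C_d the drag coefficient, A the cross-sectional area, rho(y) the air density at altitude y, and R_E the Earth's radius.

```latex
% Hedged sketch of the vertical-ascent equation with thrust, drag and
% altitude-dependent gravity; all symbols are illustrative assumptions.
\[
  m(t)\,\frac{dv}{dt}
  = \underbrace{-\,u\,\frac{dm}{dt}}_{\text{exhaust momentum}}
  \;-\; \underbrace{\tfrac{1}{2}\,\rho(y)\,C_d A\,v\,\lvert v\rvert}_{\text{atmospheric drag}}
  \;-\; \underbrace{\frac{m(t)\,g_0\,R_E^{2}}{(R_E+y)^{2}}}_{\text{altitude-dependent gravity}} .
\]
```

    Neglecting the drag term and linearizing the gravity term recovers the low-friction limit that the article treats analytically.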

  9. Realistic electrostatic potentials in a neutron star crust

    International Nuclear Information System (INIS)

    Ebel, Claudio; Mishustin, Igor; Greiner, Walter

    2015-01-01

    We study the electrostatic properties of inhomogeneous nuclear matter which can be formed in the crusts of neutron stars or in supernova explosions. Such matter is represented by Wigner–Seitz cells of different geometries (spherical, cylindrical, Cartesian), which contain nuclei, free neutrons and electrons under the conditions of electrical neutrality. Using the Thomas–Fermi approximation, we have solved the Poisson equation for the electrostatic potential and calculated the corresponding electron density distributions in individual cells. The calculations are done for different shapes and sizes of the cells and different average baryon densities. The electron-to-baryon fraction was fixed at 0.3. Using realistic electron distributions leads to a significant reduction in electrostatic energy and electron chemical potential. (paper)
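    As a rough illustration of the kind of boundary-value problem being solved here (a sketch under standard textbook assumptions for ultrarelativistic, degenerate electrons, not necessarily the authors' exact formulation), the Thomas–Fermi closure couples the electron density to the electrostatic potential inside each Wigner–Seitz cell:

```latex
% Poisson equation (Gaussian units) with a Thomas-Fermi electron density for
% ultrarelativistic electrons; mu_e is the electron chemical potential and n_p
% the proton density. Hedged sketch, not the paper's exact equations.
\[
  \nabla^{2}\phi(\mathbf r) = 4\pi e\,\bigl[\,n_e(\mathbf r) - n_p(\mathbf r)\,\bigr],
  \qquad
  n_e(\mathbf r) \simeq \frac{\bigl[\mu_e + e\phi(\mathbf r)\bigr]^{3}}{3\pi^{2}(\hbar c)^{3}},
\]
```

    subject to a vanishing electric field at the cell boundary and overall charge neutrality of the cell.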

  10. Effect of geometry on concentration polarization in realistic heterogeneous permselective systems

    Science.gov (United States)

    Green, Yoav; Shloush, Shahar; Yossifon, Gilad

    2014-04-01

    This study extends previous analytical solutions of concentration polarization occurring solely in the depleted region to the more realistic geometry consisting of a three-dimensional (3D) heterogeneous ion-permselective medium connecting two opposite microchambers (i.e., a three-layer system). Under the local electroneutrality approximation, the separation of variables method is used to derive an analytical solution of the electrodiffusive problem for the two opposing asymmetric microchambers. The assumption of an ideal permselective medium allows for the analytic calculation of the 3D concentration and electric potential distributions as well as a current-voltage relation. It is shown that any asymmetry in the microchamber geometries will result in current rectification. Moreover, it is demonstrated that for non-negligible microchamber resistances, the conductance does not exhibit the expected saturation at low concentrations but instead shows a continuous decrease. The results are intended to facilitate a more direct comparison between theory and experiments, as now the voltage drop is across a realistic 3D and three-layer system.

  11. On the convergence of multigroup discrete-ordinates approximations

    International Nuclear Information System (INIS)

    Victory, H.D. Jr.; Allen, E.J.; Ganguly, K.

    1987-01-01

    Our analysis is divided into two distinct parts which we label for convenience as Part A and Part B. In Part A, we demonstrate that the multigroup discrete-ordinates approximations are well-defined and converge to the exact transport solution in any subcritical setting. For the most part, we focus on transport in two-dimensional Cartesian geometry. A Nystroem technique is used to extend the discrete ordinates multigroup approximates to all values of the angular and energy variables. Such an extension enables us to employ collectively compact operator theory to deduce stability and convergence of the approximates. In Part B, we perform a thorough convergence analysis for the multigroup discrete-ordinates method for an anisotropically-scattering subcritical medium in slab geometry. The diamond-difference and step-characteristic spatial approximation methods are each studied. The multigroup neutron fluxes are shown to converge in a Banach space setting under realistic smoothness conditions on the solution. This is the first thorough convergence analysis for the fully-discretized multigroup neutron transport equations

  12. Effects of classical and neo-classical cross-field transport of tungsten impurity in realistic tokamak geometry

    Energy Technology Data Exchange (ETDEWEB)

    Yamoto, S.; Inoue, H.; Sawada, Y.; Hatayama, A. [Faculty of Science and Technology, Keio University, Yokohama (Japan); Homma, Y.; Hoshino, K. [Japan Atomic Energy Agency, Rokkasho, Aomori (Japan); Bonnin, X. [ITER Organization, St. Paul Lez Durance (France); Coster, D. [Max-Planck-Institut fuer Plasmaphysik, Garching (Germany); Schneider, R. [Ernst-Moritz-Arndt University Greifswald (Germany)

    2016-08-15

    An initial simulation study of neoclassical perpendicular self-diffusion transport in the SOL/divertor regions of a realistic tokamak geometry has been performed with the IMPGYRO code. One of the most distinctive features of the IMPGYRO code is that it calculates the exact Larmor orbit of each test particle instead of assuming the guiding-center approximation. Therefore, the effects of the magnetic drifts in realistic tokamaks are naturally taken into account in the IMPGYRO code. This feature makes it possible to calculate neoclassical transport processes, which can possibly become large in the SOL/divertor plasma. Indeed, the neoclassical self-diffusion process, the resultant effect of the combination of magnetic drift and Coulomb collisions with background ions, has already been included in the IMPGYRO model. In the present paper, prior to implementing the detailed model of neoclassical transport processes into IMPGYRO, we have investigated the effect of neoclassical self-diffusion in a realistic tokamak geometry with a lower single-null X-point. We also use a model with the guiding-center approximation in order to compare with the IMPGYRO full-orbit model. The preliminary calculation results of each model show differences in the perpendicular average velocity of impurity ions at the top region of the SOL. The mechanism which leads to this difference is discussed. (copyright 2015 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
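    To make the full-orbit versus guiding-center distinction concrete, the sketch below shows a generic Boris-type particle pusher in Python that resolves the actual Larmor gyration of a charged particle. It is only an illustration of the technique; it is not IMPGYRO's implementation, and all field values and parameters are assumptions chosen for the example.

```python
import numpy as np

def boris_push(x, v, q, m, E, B, dt):
    """One Boris step: advance position x and velocity v of a charged particle
    in electric field E and magnetic field B (all 3-vectors, SI units).
    This resolves the full Larmor gyration, unlike a guiding-centre model."""
    qmdt2 = q * dt / (2.0 * m)
    v_minus = v + qmdt2 * E                  # first half electric kick
    t = qmdt2 * B                            # rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)  # magnetic rotation
    v_new = v_plus + qmdt2 * E               # second half electric kick
    x_new = x + v_new * dt
    return x_new, v_new

# Illustrative example: a proton gyrating in a uniform 2 T field with a weak E field.
q, m = 1.602e-19, 1.673e-27
B = np.array([0.0, 0.0, 2.0])                # tesla
E = np.array([100.0, 0.0, 0.0])              # V/m, gives a slow E x B drift
x = np.zeros(3)
v = np.array([1.0e5, 0.0, 0.0])              # m/s
dt = 1.0e-10
for _ in range(1000):
    x, v = boris_push(x, v, q, m, E, B, dt)
print(x, v)
```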

  13. An approximate fractional Gaussian noise model with O(n) computational cost

    KAUST Repository

    Sørbye, Sigrunn H.

    2017-09-18

    Fractional Gaussian noise (fGn) is a stationary time series model with long memory properties applied in various fields like econometrics, hydrology and climatology. The computational cost in fitting an fGn model of length $n$ using a likelihood-based approach is ${\mathcal O}(n^{2})$, exploiting the Toeplitz structure of the covariance matrix. In most realistic cases, we do not observe the fGn process directly but only through indirect Gaussian observations, so the Toeplitz structure is easily lost and the computational cost increases to ${\mathcal O}(n^{3})$. This paper presents an approximate fGn model of ${\mathcal O}(n)$ computational cost, both with direct or indirect Gaussian observations, with or without conditioning. This is achieved by approximating fGn with a weighted sum of independent first-order autoregressive processes, fitting the parameters of the approximation to match the autocorrelation function of the fGn model. The resulting approximation is stationary despite being Markov and gives a remarkably accurate fit using only four components. The performance of the approximate fGn model is demonstrated in simulations and two real data examples.
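    The core idea of the record, matching the autocorrelation function of fGn with a small weighted sum of AR(1) processes, can be sketched as follows. This is a minimal least-squares illustration in Python; the paper's actual fitting criterion and parameterization may differ, and the function names and settings below are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def fgn_acf(k, H):
    """Autocorrelation of fractional Gaussian noise at lag k for Hurst parameter H."""
    k = np.abs(k).astype(float)
    return 0.5 * ((k + 1) ** (2 * H) - 2 * k ** (2 * H) + np.abs(k - 1) ** (2 * H))

def mixture_acf(lags, weights, phis):
    """Autocorrelation of a weighted sum of independent AR(1) processes."""
    k = np.abs(lags)[:, None]
    return (weights * phis ** k).sum(axis=1) / weights.sum()

def fit_ar1_mixture(H, m=4, max_lag=1000):
    """Fit weights and AR(1) coefficients so the mixture acf matches the fGn acf."""
    lags = np.arange(max_lag + 1)
    target = fgn_acf(lags, H)

    def residuals(theta):
        w = np.exp(theta[:m])                     # positive weights
        phi = 1.0 / (1.0 + np.exp(-theta[m:]))    # AR(1) coefficients in (0, 1)
        return mixture_acf(lags, w, phi) - target

    theta0 = np.concatenate([np.zeros(m), np.linspace(0.5, 5.0, m)])
    sol = least_squares(residuals, theta0)
    w = np.exp(sol.x[:m]); w /= w.sum()
    phi = 1.0 / (1.0 + np.exp(-sol.x[m:]))
    return w, phi

weights, phis = fit_ar1_mixture(H=0.8, m=4)
print(weights, phis)
```

    The fitted pair (weights, phis) then defines a Markov process whose autocorrelation closely tracks the long-memory fGn correlation up to the chosen maximum lag.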

  14. The impact of power negotiations on nursing students learning processes in simulation scenario debriefing

    DEFF Research Database (Denmark)

    Frandsen, Anne; Topperzer, Martha; Sejr Olsen, Pernille

    Clinical simulation in nursing improves students' ability in critical thinking, clinical judgement and clinical decision-making (1). The key elements to success are planning, pre-briefing, engagement, a realistic scenario with interruptions and debriefing (2). Debriefing is highlighted as the key element to obtain critical reflection and learning (3). During debriefing students and facilitators jointly explore what happened in the scenario, and what should be the focus of the prospective learning process (4). The aim of this paper is to explore how the negotiations of power (5) in the simulation scenario debriefing have an impact on the learning process for novice nursing students. References: Zapko et al 2015: Interdisciplinary Disaster Drill Simulation: Laying the Groundwork for further Research. Nurse Education Perspective Nov. Dec. 36(6):379-82. Motola et al 2013: Simulation in healthcare...

  15. Multi-modal Virtual Scenario Enhances Neurofeedback Learning

    Directory of Open Access Journals (Sweden)

    Avihay Cohen

    2016-08-01

    produced a higher effect size. In addition, neurofeedback via the animated scenario showed better sustainability, as indicated by a no-feedback trial conducted in session 2, and better transferability to a new, unfamiliar interface. Lastly, participants reported that the animated scenario was more engaging and more motivating than the thermometer. Together, these results demonstrate the promising potential of integrating realistic virtual environments in neurofeedback to enhance learning and improve the user's experience.

  16. Tsunami simulations of mega-thrust earthquakes in the Nankai–Tonankai Trough (Japan) based on stochastic rupture scenarios

    KAUST Repository

    Goda, Katsuichiro

    2017-02-23

    In this study, earthquake rupture models for future mega-thrust earthquakes in the Nankai–Tonankai subduction zone are developed by incorporating the main characteristics of inverted source models of the 2011 Tohoku earthquake. These scenario ruptures also account for key features of the national tsunami source model for the Nankai–Tonankai earthquake by the Central Disaster Management Council of the Japanese Government. The source models capture a wide range of realistic slip distributions and kinematic rupture processes, reflecting the current best understanding of what may happen due to a future mega-earthquake in the Nankai–Tonankai Trough, and therefore are useful for conducting probabilistic tsunami hazard and risk analysis. A large suite of scenario rupture models is then used to investigate the variability of tsunami effects in coastal areas, such as offshore tsunami wave heights and onshore inundation depths, due to realistic variations in source characteristics. Such investigations are particularly valuable for tsunami hazard mapping and evacuation planning in municipalities along the Nankai–Tonankai coast.
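    As a loose illustration of what a stochastic rupture scenario generator does (a generic spectral-filtering sketch in Python; it is not the study's actual source-generation procedure, and the spectrum shape, grid size and parameter values are assumptions), a random slip field with long-wavelength-dominated heterogeneity can be synthesized like this:

```python
import numpy as np

def random_slip_field(nx=64, ny=32, dx=5.0, corr_len=40.0, mean_slip=5.0, seed=0):
    """Generate a random slip distribution (in metres) on an nx-by-ny fault grid by
    filtering white noise with a smooth wavenumber spectrum, then rescaling to a
    target mean slip. dx and corr_len are in kilometres."""
    rng = np.random.default_rng(seed)
    kx = np.fft.fftfreq(nx, d=dx) * 2.0 * np.pi
    ky = np.fft.fftfreq(ny, d=dx) * 2.0 * np.pi
    KX, KY = np.meshgrid(kx, ky, indexing="ij")
    k2 = KX ** 2 + KY ** 2
    # Amplitude filter ~ (1 + k^2 a^2)^(-1): slip dominated by long wavelengths
    amp = (1.0 + k2 * corr_len ** 2) ** (-1.0)
    noise = rng.standard_normal((nx, ny))
    field = np.real(np.fft.ifft2(np.fft.fft2(noise) * amp))
    field -= field.min()                       # enforce non-negative slip
    field *= mean_slip / field.mean()          # rescale to the target mean slip
    return field

slip = random_slip_field()
print(slip.shape, float(slip.mean()), float(slip.max()))
```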

  17. SITE-94. The central scenario for SITE-94: A climate change scenario

    International Nuclear Information System (INIS)

    King-Clayton, L.M.; Chapman, N.A.; Kautsky, F.; Svensson, N.O.; Ledoux, E.

    1995-12-01

    The central scenario includes the following main components: a deterministic description of the most probable climatic state for Sweden (with special reference to the Aespoe area) for the next c. 120,000 years, a description of the likely nature of the surface and geological environment in the area at each stage of the climate sequence selected, and quantitative information on how these changes might affect the disposal system. The climate models suggest glacial maxima at c. 5, 20, 60 and 100 thousand years from now. The Aespoe region is predicted to be significantly affected by the latter three glacial episodes, with the ice sheet reaching and covering the area during the latter two episodes (by up to c. 2200 m and 1200 m of ice, respectively). Permafrost thicknesses over the next 120,000 years have been calculated. Assumptions, estimates and alternatives to the prescribed climate evolution are discussed. Following definition of a realistic, albeit non-unique, climate sequence, the objective of scenario development is to provide an indicator of the physical, chemical and hydrogeological conditions at the front of and beneath the advancing and retreating ice sheets, with the aim of identifying critical aspects for Performance Assessment modelling. The effects of various factors, such as ice loading, development of permafrost, temperature changes and sea level changes, are considered in terms of their impact on hydrogeology, groundwater chemistry, rock stress and surface environments. 183 refs

  18. SITE-94. The central scenario for SITE-94: A climate change scenario

    Energy Technology Data Exchange (ETDEWEB)

    King-Clayton, L M; Chapman, N A [QuantiSci Ltd, Melton Mowbray (United Kingdom); Kautsky, F [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Svensson, N O [Lund Univ. (Sweden). Dept. of Quaternary Geology; Marsily, G de [Univ. VI Paris (France); Ledoux, E [Ecole Nationale Superieure des Mines, 77 - Fontainebleau (France)

    1995-12-01

    The central scenario includes the following main components: a deterministic description of the most probable climatic state for Sweden (with special reference to the Aespoe area) for the next c. 120,000 years, a description of the likely nature of the surface and geological environment in the area at each stage of the climate sequence selected, and quantitative information on how these changes might affect the disposal system. The climate models suggest glacial maxima at c. 5, 20, 60 and 100 thousand years from now. The Aespoe region is predicted to be significantly affected by the latter three glacial episodes, with the ice sheet reaching and covering the area during the latter two episodes (by up to c. 2200 m and 1200 m of ice, respectively). Permafrost thicknesses over the next 120,000 years have been calculated. Assumptions, estimates and alternatives to the prescribed climate evolution are discussed. Following definition of a realistic, albeit non-unique, climate sequence, the objective of scenario development is to provide an indicator of the physical, chemical and hydrogeological conditions at the front of and beneath the advancing and retreating ice sheets, with the aim of identifying critical aspects for Performance Assessment modelling. The effects of various factors, such as ice loading, development of permafrost, temperature changes and sea level changes, are considered in terms of their impact on hydrogeology, groundwater chemistry, rock stress and surface environments. 183 refs.

  19. Triangulating and guarding realistic polygons

    NARCIS (Netherlands)

    Aloupis, G.; Bose, P.; Dujmovic, V.; Gray, C.M.; Langerman, S.; Speckmann, B.

    2014-01-01

    We propose a new model of realistic input: k-guardable objects. An object is k-guardable if its boundary can be seen by k guards. We show that k-guardable polygons generalize two previously identified classes of realistic input. Following this, we give two simple algorithms for triangulating k-guardable polygons.

  20. [A new age of mass casualty education? : The InSitu project: realistic training in virtual reality environments].

    Science.gov (United States)

    Lorenz, D; Armbruster, W; Vogelgesang, C; Hoffmann, H; Pattar, A; Schmidt, D; Volk, T; Kubulus, D

    2016-09-01

    Chief emergency physicians are regarded as an important element in the care of the injured and sick following mass casualty accidents. Their education is very theoretical; practical content, in contrast, often falls short. The usual limitations are the very high costs of realistic (large-scale) exercises, poor reproducibility of the scenarios, and correspondingly poor results. Given the complexity of mass casualty accidents, substantially improving the educational level requires modified training concepts that teach not only the theoretical but above all the practical skills considerably more intensively than at present. Modern training concepts should make it possible for the learner to realistically simulate decision processes. This article examines how interactive virtual environments are applicable to the education of emergency personnel and how they could be designed. Virtual simulation and training environments offer the possibility of simulating complex situations in an adequately realistic manner. The so-called virtual reality (VR) used in this context is an interface technology that enables free interaction in addition to a stereoscopic, spatial representation of large-scale emergencies in a virtual environment. Variables in the scenarios, such as the weather, the number of wounded, and the availability of resources, can be changed at any time. The trainees are able to practice the procedures in many virtual accident scenes and act them out repeatedly, thereby testing the different variants. With the aid of the "InSitu" project, it is possible to train in a virtual reality with realistically reproduced accident situations. These integrated, interactive training environments can depict very complex situations on a scale of 1:1. Because of the highly developed interactivity, the trainees can feel as if they are a direct part of the accident scene and therefore identify much more with the virtual world than is possible with desktop systems.

  1. HackAttack: Game-Theoretic Analysis of Realistic Cyber Conflicts

    Energy Technology Data Exchange (ETDEWEB)

    Ferragut, Erik M [ORNL; Brady, Andrew C [Jefferson Middle School, Oak Ridge, TN; Brady, Ethan J [Oak Ridge High School, Oak Ridge, TN; Ferragut, Jacob M [Oak Ridge High School, Oak Ridge, TN; Ferragut, Nathan M [Oak Ridge High School, Oak Ridge, TN; Wildgruber, Max C [ORNL

    2016-01-01

    Game theory is appropriate for studying cyber conflict because it allows for an intelligent and goal-driven adversary. Applications of game theory have led to a number of results regarding optimal attack and defense strategies. However, the overwhelming majority of applications explore overly simplistic games, often ones in which each participant's actions are visible to every other participant. These simplifications strip away the fundamental properties of real cyber conflicts: probabilistic alerting, hidden actions, and unknown opponent capabilities. In this paper, we demonstrate that it is possible to analyze a more realistic game, one in which different resources have different weaknesses, players have different exploits, and moves occur in secrecy, but they can be detected. Certainly, more advanced and complex games are possible, but the game presented here is more realistic than any other game we know of in the scientific literature. While optimal strategies can be found for simpler games using calculus, case-by-case analysis, or, for stochastic games, Q-learning, our more complex game is more naturally analyzed using the same methods used to study other complex games, such as checkers and chess. We define a simple evaluation function and employ multi-step searches to create strategies. We show that such scenarios can be analyzed, and find that in cases of extreme uncertainty, it is often better to ignore one's opponent's possible moves. Furthermore, we show that a simple evaluation function in a complex game can lead to interesting and nuanced strategies.
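    The strategy construction described above, a simple evaluation function combined with multi-step search, is essentially the classic depth-limited minimax recipe. The Python sketch below shows that generic recipe over an abstract game interface; the class and method names are assumptions for illustration, and the paper's game additionally involves hidden actions and probabilistic detection, which this plain perfect-information sketch ignores.

```python
from typing import Iterable, Protocol

class GameState(Protocol):
    """Minimal interface a two-player game state would need to expose."""
    def is_terminal(self) -> bool: ...
    def legal_moves(self, player: int) -> Iterable[object]: ...
    def apply(self, player: int, move: object) -> "GameState": ...
    def evaluate(self, player: int) -> float: ...   # simple heuristic score

def minimax(state: GameState, player: int, depth: int, maximizing: bool = True) -> float:
    """Depth-limited minimax using the state's evaluation function at the horizon.
    Players are assumed to be numbered 0 and 1."""
    if depth == 0 or state.is_terminal():
        return state.evaluate(player)
    mover = player if maximizing else 1 - player
    scores = [
        minimax(state.apply(mover, m), player, depth - 1, not maximizing)
        for m in state.legal_moves(mover)
    ]
    if not scores:                                   # no legal moves: fall back to evaluation
        return state.evaluate(player)
    return max(scores) if maximizing else min(scores)

def best_move(state: GameState, player: int, depth: int = 3):
    """Choose the move with the highest minimax value for `player`."""
    return max(
        state.legal_moves(player),
        key=lambda m: minimax(state.apply(player, m), player, depth - 1, maximizing=False),
    )
```

    A concrete game state only needs to implement the four methods of the interface for `best_move` to be usable.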

  2. The ShakeOut Earthquake Scenario - A Story That Southern Californians Are Writing

    Science.gov (United States)

    Perry, Suzanne; Cox, Dale; Jones, Lucile; Bernknopf, Richard; Goltz, James; Hudnut, Kenneth; Mileti, Dennis; Ponti, Daniel; Porter, Keith; Reichle, Michael; Seligson, Hope; Shoaf, Kimberley; Treiman, Jerry; Wein, Anne

    2008-01-01

    The question is not if but when southern California will be hit by a major earthquake - one so damaging that it will permanently change lives and livelihoods in the region. How severe the changes will be depends on the actions that individuals, schools, businesses, organizations, communities, and governments take to get ready. To help prepare for this event, scientists of the U.S. Geological Survey (USGS) have changed the way that earthquake scenarios are done, uniting a multidisciplinary team that spans an unprecedented number of specialties. The team includes the California Geological Survey, Southern California Earthquake Center, and nearly 200 other partners in government, academia, emergency response, and industry, working to understand the long-term impacts of an enormous earthquake on the complicated social and economic interactions that sustain southern California society. This project, the ShakeOut Scenario, has applied the best current scientific understanding to identify what can be done now to avoid an earthquake catastrophe. More information on the science behind this project will be available in The ShakeOut Scenario (USGS Open-File Report 2008-1150; http://pubs.usgs.gov/of/2008/1150/). The 'what if?' earthquake modeled in the ShakeOut Scenario is a magnitude 7.8 on the southern San Andreas Fault. Geologists selected the details of this hypothetical earthquake by considering the amount of stored strain on that part of the fault with the greatest risk of imminent rupture. From this, seismologists and computer scientists modeled the ground shaking that would occur in this earthquake. Engineers and other professionals used the shaking to produce a realistic picture of this earthquake's damage to buildings, roads, pipelines, and other infrastructure. From these damages, social scientists projected casualties, emergency response, and the impact of the scenario earthquake on southern California's economy and society. The earthquake, its damages, and

  3. Future scenario development within life cycle assessment of waste management systems

    DEFF Research Database (Denmark)

    Bisinella, Valentina

    Life Cycle Assessment (LCA) is an acknowledged tool for quantifying the sustainability of waste management solutions. However, the use of LCA for decision-making is hindered by the strong dependency of the LCA results on the assumptions regarding the future conditions in which the waste management...... solutions will operate. Future scenario methods from the management engineering field may provide valid approaches for formulating consistent assumptions on future conditions for the waste management system modelled with LCA. However, the standardized LCA procedure currently does not offer much guidance...... field. The quantitative modelling implications were tested within real-scale LCA models focusing on the management of residual waste in Denmark. In a wide range of scenarios, this thesis addressed the influence on the LCA model results of realistic technology and waste composition uncertainties, as well...

  4. Realistic roofs over a rectilinear polygon

    KAUST Repository

    Ahn, Heekap

    2013-11-01

    Given a simple rectilinear polygon P in the xy-plane, a roof over P is a terrain over P whose faces are supported by planes through edges of P that make a dihedral angle π/4 with the xy-plane. According to this definition, some roofs may have faces isolated from the boundary of P or even local minima, which are undesirable for several practical reasons. In this paper, we introduce realistic roofs by imposing a few additional constraints. We investigate the geometric and combinatorial properties of realistic roofs and show that the straight skeleton induces a realistic roof with maximum height and volume. We also show that the maximum possible number of distinct realistic roofs over P is $\binom{(n-4)/2}{\lfloor (n-4)/4 \rfloor}$ when P has n vertices. We present an algorithm that enumerates a combinatorial representation of each such roof in O(1) time per roof without repetition, after O(n⁴) preprocessing time. We also present an O(n⁵)-time algorithm for computing a realistic roof with minimum height or volume. © 2013 Elsevier B.V.

  5. Designing a Methodology for Future Air Travel Scenarios

    Science.gov (United States)

    Wuebbles, Donald J.; Baughcum, Steven L.; Gerstle, John H.; Edmonds, Jae; Kinnison, Douglas E.; Krull, Nick; Metwally, Munir; Mortlock, Alan; Prather, Michael J.

    1992-01-01

    The growing demand on air travel throughout the world has prompted several proposals for the development of commercial aircraft capable of transporting a large number of passengers at supersonic speeds. Emissions from a projected fleet of such aircraft, referred to as high-speed civil transports (HSCT's), are being studied because of their possible effects on the chemistry and physics of the global atmosphere, in particular, on stratospheric ozone. At the same time, there is growing concern about the effects on ozone from the emissions of current (primarily subsonic) aircraft. Evaluating the potential atmospheric impact of aircraft emissions from HSCT's requires a scientifically sound understanding of where the aircraft fly and under what conditions the aircraft effluents are injected into the atmosphere. A preliminary set of emissions scenarios are presented. These scenarios will be used to understand the sensitivity of environmental effects to a range of fleet operations, flight conditions, and aircraft specifications. The baseline specifications for the scenarios are provided: the criteria to be used for developing the scenarios are defined, the required data base for initiating the development of the scenarios is established, and the state of the art for those scenarios that have already been developed is discussed. An important aspect of the assessment will be the evaluation of realistic projections of emissions as a function of both geographical distribution and altitude from an economically viable commercial HSCT fleet. With an assumed introduction date of around the year 2005, it is anticipated that there will be no HSCT aircraft in the global fleet at that time. However, projections show that, by 2015, the HSCT fleet could reach significant size. We assume these projections of HSCT and subsonic fleets for about 2015 can then be used as input to global atmospheric chemistry models to evaluate the impact of the HSCT fleets, relative to an all

  6. A realistic extension of gauge-mediated SUSY-breaking model with superconformal hidden sector

    International Nuclear Information System (INIS)

    Asano, Masaki; Hisano, Junji; Okada, Takashi; Sugiyama, Shohei

    2009-01-01

    The sequestering of supersymmetry (SUSY) breaking parameters, which is induced by a superconformal hidden sector, is one of the solutions to the μ/Bμ problem in the gauge-mediated SUSY-breaking scenario. However, it is found that the minimal messenger model does not derive the correct electroweak symmetry breaking. In this Letter we present a model which has a coupling of the messengers with the SO(10) GUT-symmetry breaking Higgs fields. The model is one of the realistic extensions of the gauge mediation model with a superconformal hidden sector. It is shown that the extension is applicable for a broad range of conformality breaking scales.

  7. Simulation of regional day-ahead PV power forecast scenarios

    DEFF Research Database (Denmark)

    Nuno, Edgar; Koivisto, Matti Juhani; Cutululis, Nicolaos Antonio

    2017-01-01

    Uncertainty associated with Photovoltaic (PV) generation can have a significant impact on real-time planning and operation of power systems. This obstacle is commonly handled using multiple forecast realizations, obtained using for example forecast ensembles and/or probabilistic forecasts, often at the expense of a high computational burden. Alternatively, some power system applications may require realistic forecasts rather than actual estimates, able to capture the uncertainty of weather-driven generation. To this end, we propose a novel methodology to generate day-ahead forecast scenarios of regional PV production, matching the spatio-temporal characteristics while preserving the statistical properties of actual records.

  8. Air quality impacts of distributed power generation in the South Coast Air Basin of California 1: Scenario development and modeling analysis

    Science.gov (United States)

    Rodriguez, M. A.; Carreras-Sospedra, M.; Medrano, M.; Brouwer, J.; Samuelsen, G. S.; Dabdub, D.

    Distributed generation (DG) is generally defined as the operation of many small stationary power generators throughout an urban air basin. Although DG has the potential to supply a significant portion of the increased power demands in California and the rest of the United States, it may lead to increased levels of in-basin pollutants and adversely impact urban air quality. This study focuses on two main objectives: (1) the systematic characterization of DG installation in urban air basins, and (2) the simulation of potential air quality impacts using a state-of-the-art three-dimensional computational model. A general and systematic approach is devised to construct five realistic and 21 spanning scenarios of DG implementation in the South Coast Air Basin (SoCAB) of California. Realistic scenarios reflect an anticipated level of DG deployment in the SoCAB by the year 2010. Spanning scenarios are developed to determine the potential impacts of unexpected outcomes. Realistic implementations of DG in the SoCAB result in small differences in ozone and particulate matter concentrations in the basin compared to the baseline simulations. The baseline accounts for population increase, but does not consider any future emissions control measures. Model results for spanning implementations with extra high DG market penetration show that domain-wide ozone peak concentrations increase significantly. Also, air quality impacts of spanning implementations when DG operate during a 6-h period are larger than when the same amount of emissions are introduced during a 24-h period.

  9. A systematic study of the octupole correlations in the lanthanides with realistic forces

    International Nuclear Information System (INIS)

    Egido, J.L.; Robledo, L.M.

    1992-01-01

    We have performed a systematic study of the octupole degree of freedom in the nuclei 140Ba, 142-150Ce, 144-152Nd and 146-154Sm. The static properties (ground state deformations, energy gaps, dipole moments, etc.) have been analyzed within the Hartree-Fock plus BCS approximation (HFBCS); for the dynamical ones (energy splittings, transition probabilities, etc.) the adiabatic time-dependent Hartree-Fock plus zero point energy in the cranking approximation (ATDHF+ZPE) has been applied. In both approximations the realistic density-dependent Gogny force has been used. In our parameter-free calculations we are able to describe very well the whole experimental systematics of energy splittings and B(E1), among others. The flatness of the potential energy of some nuclei makes the mean field approach unreliable for such nuclei. (orig.)

  10. Environmental influences on mate preferences as assessed by a scenario manipulation experiment.

    Science.gov (United States)

    Marzoli, Daniele; Moretto, Francesco; Monti, Aura; Tocci, Ornella; Roberts, S Craig; Tommasi, Luca

    2013-01-01

    Many evolutionary psychology studies have addressed the topic of mate preferences, focusing particularly on gender and cultural differences. However, the extent to which situational and environmental variables might affect mate preferences has been comparatively neglected. We tested 288 participants in order to investigate the perceived relative importance of six traits of an ideal partner (wealth, dominance, intelligence, height, kindness, attractiveness) under four different hypothetical scenarios (status quo/nowadays, violence/post-nuclear, poverty/resource exhaustion, prosperity/global well-being). An equal number of participants (36 women, 36 men) was allotted to each scenario; each was asked to allocate 120 points across the six traits according to their perceived value. Overall, intelligence was the trait to which participants assigned most importance, followed by kindness and attractiveness, and then by wealth, dominance and height. Men appraised attractiveness as more valuable than women. Scenario strongly influenced the relative importance attributed to traits, the main finding being that wealth and dominance were more valued in the poverty and post-nuclear scenarios, respectively, compared to the other scenarios. Scenario manipulation generally had similar effects in both sexes, but women appeared particularly prone to trade off other traits for dominance in the violence scenario, and men particularly prone to trade off other traits for wealth in the poverty scenario. Our results are in line with other correlational studies of situational variables and mate preferences, and represent strong evidence of a causal relationship of environmental factors on specific mate preferences, corroborating the notion of an evolved plasticity to current ecological conditions. A control experiment seems to suggest that our scenarios can be considered as realistic descriptions of the intended ecological conditions.

  11. Atmospheric circulation and hydroclimate impacts of alternative warming scenarios for the Eocene

    Science.gov (United States)

    Carlson, Henrik; Caballero, Rodrigo

    2017-08-01

    Recent work in modelling the warm climates of the early Eocene shows that it is possible to obtain a reasonable global match between model surface temperature and proxy reconstructions, but only by using extremely high atmospheric CO2 concentrations or more modest CO2 levels complemented by a reduction in global cloud albedo. Understanding the mix of radiative forcing that gave rise to Eocene warmth has important implications for constraining Earth's climate sensitivity, but progress in this direction is hampered by the lack of direct proxy constraints on cloud properties. Here, we explore the potential for distinguishing among different radiative forcing scenarios via their impact on regional climate changes. We do this by comparing climate model simulations of two end-member scenarios: one in which the climate is warmed entirely by CO2 (which we refer to as the greenhouse gas (GHG) scenario) and another in which it is warmed entirely by reduced cloud albedo (which we refer to as the low CO2-thin clouds or LCTC scenario). The two simulations have an almost identical global-mean surface temperature and equator-to-pole temperature difference, but the LCTC scenario has ~11 % greater global-mean precipitation than the GHG scenario. The LCTC scenario also has cooler midlatitude continents and warmer oceans than the GHG scenario and a tropical climate which is significantly more El Niño-like. Extremely high warm-season temperatures in the subtropics are mitigated in the LCTC scenario, while cool-season temperatures are lower at all latitudes. These changes appear large enough to motivate further, more detailed study using other climate models and a more realistic set of modelling assumptions.

  12. Environmental influences on mate preferences as assessed by a scenario manipulation experiment.

    Directory of Open Access Journals (Sweden)

    Daniele Marzoli

    Many evolutionary psychology studies have addressed the topic of mate preferences, focusing particularly on gender and cultural differences. However, the extent to which situational and environmental variables might affect mate preferences has been comparatively neglected. We tested 288 participants in order to investigate the perceived relative importance of six traits of an ideal partner (wealth, dominance, intelligence, height, kindness, attractiveness) under four different hypothetical scenarios (status quo/nowadays, violence/post-nuclear, poverty/resource exhaustion, prosperity/global well-being). An equal number of participants (36 women, 36 men) was allotted to each scenario; each was asked to allocate 120 points across the six traits according to their perceived value. Overall, intelligence was the trait to which participants assigned most importance, followed by kindness and attractiveness, and then by wealth, dominance and height. Men appraised attractiveness as more valuable than women. Scenario strongly influenced the relative importance attributed to traits, the main finding being that wealth and dominance were more valued in the poverty and post-nuclear scenarios, respectively, compared to the other scenarios. Scenario manipulation generally had similar effects in both sexes, but women appeared particularly prone to trade off other traits for dominance in the violence scenario, and men particularly prone to trade off other traits for wealth in the poverty scenario. Our results are in line with other correlational studies of situational variables and mate preferences, and represent strong evidence of a causal relationship of environmental factors on specific mate preferences, corroborating the notion of an evolved plasticity to current ecological conditions. A control experiment seems to suggest that our scenarios can be considered as realistic descriptions of the intended ecological conditions.

  13. An Estimation of Operator's Diagnostic Time for Feed-And-Bleed Operation under Various Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Bo Gyung; Kang, Hyun Gook [KAIST, Daejeon (Korea, Republic of); Yoon, Ho Joon [Khalifa University of Science, Abu Dhabi (United Arab Emirates)

    2015-10-15

    In order to assess the realistic safety of a plant, the effects of interactions between components, operators, and plant conditions need to be considered in the PSA model. One of the important issues in estimating the CDF is the estimation of the human error probability (HEP). When an accident occurs, operators follow the emergency operating procedure and check various alarms, parameters, and signals. In the conventional Korean PSA model, the Korean standard HRA (K-HRA) method is used. In this method, the HEP is the sum of the diagnosis error probability and the execution error probability. The diagnosis error probability is expressed in terms of the available time for diagnosis and adjusting performance shaping factors, and the execution error probability is a function of task type and stress level. The available time for diagnosis is a very important factor in the HEP: if the available time for diagnosis is short, the HEP becomes high. In order to obtain realistic risk assessment results, we first focus on the estimation of the HEP considering the plant dynamics under various scenarios. The target operation and scenarios are the feed-and-bleed operation (F and B operation) and the total loss of feedwater (TLOFW) accident with/without a loss of coolant accident (LOCA). One of the highest HEPs is that of the F and B operation. In addition, scenarios involving the combination of secondary heat removal failure and primary heat removal failure are the most critical core damage scenarios of the combined accident, except for scenarios related to station blackout (SBO). In these scenarios, the F and B operation is the last resort to prevent core damage. To estimate the available operator diagnosis time, we identify the relationship between accidents, mitigation functions, and plant conditions. The distribution of the available time for diagnosis was estimated using MOSAIQUE. The variables are break size, break timing, trip timing of the RCP, and availability of the high pressure safety injection (HPSI) pump under the TLOFW accident with LOCA. For Type 1 accident

  14. HELIOSEISMOLOGY OF A REALISTIC MAGNETOCONVECTIVE SUNSPOT SIMULATION

    International Nuclear Information System (INIS)

    Braun, D. C.; Birch, A. C.; Rempel, M.; Duvall, T. L. Jr.

    2012-01-01

    We compare helioseismic travel-time shifts measured from a realistic magnetoconvective sunspot simulation using both helioseismic holography and time-distance helioseismology, and measured from real sunspots observed with the Helioseismic and Magnetic Imager instrument on board the Solar Dynamics Observatory and the Michelson Doppler Imager instrument on board the Solar and Heliospheric Observatory. We find remarkable similarities in the travel-time shifts measured between the methodologies applied and between the simulated and real sunspots. Forward modeling of the travel-time shifts using either Born or ray approximation kernels and the sound-speed perturbations present in the simulation indicates major disagreements with the measured travel-time shifts. These findings do not substantially change with the application of a correction for the reduction of wave amplitudes in the simulated and real sunspots. Overall, our findings demonstrate the need for new methods for inferring the subsurface structure of sunspots through helioseismic inversions.
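    For context, in the ray approximation mentioned in this record the forward-modelled travel-time shift produced by a small sound-speed perturbation is commonly written as follows (a standard first-order textbook form, not necessarily the exact kernel formulation used by the authors):

```latex
% First-order ray-approximation travel-time perturbation along the unperturbed
% ray path Gamma; delta c is the sound-speed perturbation. Hedged textbook form.
\[
  \delta\tau \;\simeq\; -\int_{\Gamma} \frac{\delta c(\mathbf r)}{c^{2}(\mathbf r)}\, \mathrm{d}s .
\]
```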

  15. Scenario drafting to anticipate future developments in technology assessment

    Directory of Open Access Journals (Sweden)

    Retèl Valesca P

    2012-08-01

    -uncertainty by means of scenario drafting into a decision model may reveal unanticipated developments and can demonstrate a range of possible cost-effectiveness outcomes. The effect of scenarios gives additional information on the speed with which cost-effectiveness might be reached and thus provides a more realistic picture for policy makers, opinion leaders and manufacturers.

  16. Starobinsky-type Inflation in Dynamical Supergravity Breaking Scenarios

    CERN Document Server

    Alexandre, Jean; Mavromatos, Nick E.

    2014-01-27

    In the context of dynamical breaking of local supersymmetry (supergravity), including the Deser-Zumino super-Higgs effect, for the simple but quite representative cases of N=1, D=4 supergravity, we discuss the emergence of Starobinsky-type inflation, due to quantum corrections in the effective action arising from integrating out gravitino fields in their massive phase. This type of inflation may occur after a first-stage small-field inflation that characterises models near the origin of the one-loop effective potential, and it may occur at the non-trivial minima of the latter. Phenomenologically realistic scenarios, compatible with the Planck data, may be expected for the conformal supergravity variants of the basic model.

  17. Assessment of riverine load of contaminants to European seas under policy implementation scenarios: an example with 3 pilot substances.

    Science.gov (United States)

    Marinov, Dimitar; Pistocchi, Alberto; Trombetti, Marco; Bidoglio, Giovanni

    2014-01-01

    An evaluation of conventional emission scenarios is carried out to assess the possible impact of European Union (EU) policies on riverine loads to the European seas for 3 pilot pollutants: lindane, trifluralin, and perfluorooctane sulfonate (PFOS). The policy scenarios are investigated to the time horizon of year 2020, starting from chemical-specific reference conditions and considering different types of regulatory measures, including business as usual (BAU), current trend (CT), partial implementation (PI), or complete ban (PI ban) of emissions. The scenario analyses show that the model-estimated lindane load of 745 t to European seas in 1995, based on the official emission data, would be reduced by 98.3% to approximately 12.5 t in 2005 (BAU scenario), 10 years after the start of the EU regulation of this chemical. The CT and PI ban scenarios indicate a reduction of sea loads of lindane in 2020 by 74% and 95%, respectively, when compared to the BAU estimate. For trifluralin, an annual load of approximately 61.7 t is estimated for the baseline year 2003 (BAU scenario), albeit under conservative assumptions related to the availability of pesticide use data in Europe. Under the PI (ban) scenario, assuming only small residual emissions of trifluralin, we estimate a sea loading of approximately 0.07 t/y. For PFOS, the total sea load from all European countries is estimated at approximately 5.8 t/y for the reference year 2007 (BAU scenario). Reducing the total load of PFOS below 1 t/y requires emissions to be reduced by 84%. The analysis of conventional scenarios or scenario typologies for emissions of contaminants using simple spatially explicit GIS-based models is suggested as a viable, affordable exercise that may support the assessment of implementation of policies and the identification or negotiation of emission reduction targets. © 2013 SETAC.

  18. The joint SKI/SKB scenario development project

    International Nuclear Information System (INIS)

    Andersson, Johan

    1989-12-01

    The Swedish Nuclear Power Inspectorate (SKI) and the Swedish Nuclear Fuel and Waste Management Co. (SKB) have carried out a joint scenario development exercise for a hypothetical repository for spent fuel and high level waste based on the KBS-3 concept as disposal method. The starting point has been the 'Sandia methodology', but the actual implementation of the steps in this method has required new strategy development. The work started with a relatively large, internationally composed group meeting, which identified an extensive list of features, events and processes (FEPs) that might influence the long term performance of a repository. All these FEPs, as well as their possible causes and consequences, have been entered into a computer database. The next step in the development was to remove from the list approximately 30 FEPs of low probability or negligible consequence. In a following step a large number of the FEPs on the original list were assigned to the 'PROCESS SYSTEM', comprising the complete set of 'deterministic' chemical and physical processes that might influence the release from the repository to the biosphere. A scenario is defined by a set of external conditions which will influence the processes in the PROCESS SYSTEM. Approximately 50 FEPs were left, representing external conditions. The remaining FEPs could all be combined to form scenarios, but it is concluded that it is not meaningful to discuss combinations without first analysing the consequence and probability of the individual conditions. An important aspect of the work is that the developed strategy includes a framework for the documentation of the complete chain of scenario development. Such a transparent documentation makes possible an extensive review and updating of the set of scenarios. A reviewing process, open to very broad groups in the society, is probably the best means of assuring reasonable completeness and of building up a general consensus on what the critical issues are for the safe disposal of radioactive waste.

  19. Development of exposure scenarios for CERCLA risk assessments at the Savannah River Site

    International Nuclear Information System (INIS)

    Nix, D.W.; Immel, J.W.; Phifer, M.A.

    1992-01-01

    A CERCLA Baseline Risk Assessment (BRA) is performed to determine if there are any potential risks to human health and the environment from waste units at SRS. The SRS has numerous waste units to evaluate in the RFMU and CMS/FS programs and, in order to provide a consistent approach, four standard exposure scenarios were developed for exposure assessments to be used in human health risk assessments. The standard exposure scenarios are divided into two temporal categories: (a) Current Land Use in the BRA, and (b) Future Land Use in the RERA. The Current Land Use scenarios consist of the evaluation of human health risk for Industrial Exposure (of a worker not involved in waste unit characterization or remediation), a Trespasser, a hypothetical current On-site Resident, and an Off-site Resident. The Future Land Use scenario considers exposure to an On-site Resident following termination of institutional control in the absence of any remedial action (No Action Alternative), as well as evaluating potential remedial alternatives against the four scenarios from the BRA. A critical facet in the development of a BRA or RERA is the scoping of exposure scenarios that reflect actual conditions at a waste unit, rather than using factors such as the EPA Standard Default Exposure Scenarios (OSWER Directive 9285.6-03) that are based on upper-bound exposures that tend to reflect worst-case conditions. The use of site-specific information for developing risk assessment exposure scenarios will result in a more realistic estimate of the Reasonable Maximum Exposure for SRS waste units.

  20. Development of exposure scenarios for CERCLA risk assessments at the Savannah River Site

    Energy Technology Data Exchange (ETDEWEB)

    Nix, D.W.; Immel, J.W. [Westinghouse Savannah River Co., Aiken, SC (United States); Phifer, M.A. [Tennessee Univ., Knoxville, TN (United States). Dept. of Civil Engineering

    1992-12-31

    A CERCLA Baseline Risk Assessment (BRA) is performed to determine if there are any potential risks to human health and the environment from waste units at SRS. The SRS has numerous waste units to evaluate in the RFMU and CMS/FS programs and, in order to provide a consistent approach, four standard exposure scenarios were developed for exposure assessments to be used in human health risk assessments. The standard exposure scenarios are divided into two temporal categories: (a) Current Land Use in the BRA, and (b) Future Land Use in the RERA. The Current Land Use scenarios consist of the evaluation of human health risk for Industrial Exposure (of a worker not involved in waste unit characterization or remediation), a Trespasser, a hypothetical current On-site Resident, and an Off-site Resident. The Future Land Use scenario considers exposure to an On-site Resident following termination of institutional control in the absence of any remedial action (No Action Alternative), as well as evaluating potential remedial alternatives against the four scenarios from the BRA. A critical facet in the development of a BRA or RERA is the scoping of exposure scenarios that reflect actual conditions at a waste unit, rather than using factors such as the EPA Standard Default Exposure Scenarios (OSWER Directive 9285.6-03) that are based on upper-bound exposures that tend to reflect worst-case conditions. The use of site-specific information for developing risk assessment exposure scenarios will result in a more realistic estimate of the Reasonable Maximum Exposure for SRS waste units.

  1. Modelisation of synchrotron radiation losses in realistic tokamak plasmas

    International Nuclear Information System (INIS)

    Albajar, F.; Johner, J.; Granata, G.

    2000-08-01

    Synchrotron radiation losses become significant in the power balance of high-temperature plasmas envisaged for next step tokamaks. Due to the complexity of the exact calculation, these losses are usually roughly estimated with expressions derived from a plasma description using simplifying assumptions on the geometry, radiation absorption, and density and temperature profiles. In the present article, the complete formulation of the transport of synchrotron radiation is performed for realistic conditions of toroidal plasma geometry with elongated cross-section, using an exact method for the calculation of the absorption coefficient, and for arbitrary shapes of density and temperature profiles. The effects of toroidicity and temperature profile on synchrotron radiation losses are analyzed in detail. In particular, when the electron temperature profile is almost flat in the plasma center, as for example in ITB confinement regimes, synchrotron losses are found to be much stronger than in the case where the profile is represented by its best generalized parabolic approximation, though both cases give approximately the same thermal energy contents. Such an effect is not included in present approximate expressions. Finally, we propose a seven-variable fit for the fast calculation of synchrotron radiation losses. This fit is derived from a large database, which has been generated using a code implementing the complete formulation and optimized for massively parallel computing. (author)

  2. Atmospheric circulation and hydroclimate impacts of alternative warming scenarios for the Eocene

    Directory of Open Access Journals (Sweden)

    H. Carlson

    2017-08-01

    Recent work in modelling the warm climates of the early Eocene shows that it is possible to obtain a reasonable global match between model surface temperature and proxy reconstructions, but only by using extremely high atmospheric CO2 concentrations or more modest CO2 levels complemented by a reduction in global cloud albedo. Understanding the mix of radiative forcing that gave rise to Eocene warmth has important implications for constraining Earth's climate sensitivity, but progress in this direction is hampered by the lack of direct proxy constraints on cloud properties. Here, we explore the potential for distinguishing among different radiative forcing scenarios via their impact on regional climate changes. We do this by comparing climate model simulations of two end-member scenarios: one in which the climate is warmed entirely by CO2 (which we refer to as the greenhouse gas (GHG) scenario) and another in which it is warmed entirely by reduced cloud albedo (which we refer to as the low CO2–thin clouds or LCTC scenario). The two simulations have an almost identical global-mean surface temperature and equator-to-pole temperature difference, but the LCTC scenario has ~11 % greater global-mean precipitation than the GHG scenario. The LCTC scenario also has cooler midlatitude continents and warmer oceans than the GHG scenario and a tropical climate which is significantly more El Niño-like. Extremely high warm-season temperatures in the subtropics are mitigated in the LCTC scenario, while cool-season temperatures are lower at all latitudes. These changes appear large enough to motivate further, more detailed study using other climate models and a more realistic set of modelling assumptions.

  3. Intensity earthquake scenario (scenario event - a damaging earthquake with higher probability of occurrence) for the city of Sofia

    Science.gov (United States)

    Aleksandrova, Irena; Simeonova, Stela; Solakov, Dimcho; Popova, Maria

    2014-05-01

    The usable and realistic ground motion maps for urban areas are generated either from the assumption of a "reference earthquake" or directly, showing values of macroseismic intensity generated by a damaging, real earthquake. In the study, applying a deterministic approach, an earthquake scenario in macroseismic intensity ("model" earthquake scenario) for the city of Sofia is generated. The deterministic "model" intensity scenario based on the assumption of a "reference earthquake" is compared with a scenario based on observed macroseismic effects caused by the damaging 2012 earthquake (MW5.6). The difference between observed (Io) and predicted (Ip) intensity values is analyzed.

  4. Scenario planning for the electricity generation in Indonesia

    International Nuclear Information System (INIS)

    Rachmatullah, C.; Aye, L.; Fuller, R.J.

    2007-01-01

    The long-term planning of a future electricity supply system requires data about future demand. Planners who use the conventional planning method forecast future demand by observing past trends or alternatively by developing scenarios and then selecting the scenarios considered to be the most likely to occur. This method, however, fails to include future uncertainties. To consider such uncertainties, the scenario planning method may be used. This study uses this method to devise a long-term electricity supply plan for the Java-Madura-Bali electricity system. It was found that the scenario planning method could save up to US$3.5 billion over a 15-year period if the method was applied right at the beginning of the period. In the case of the Java-Madura-Bali system, which currently has excess installed capacity, the scenario planning method does not provide such large benefits. It was also found that introducing integrated coal gasification combined cycle and advanced gas combined cycle units would reduce greenhouse gas emissions from the Java-Madura-Bali system by approximately 230 million tonnes or 15% compared to a business-as-usual (BAU) scenario over a 15-year planning timeframe. The abatement cost was found to be US$4 per tonne of CO2. (author)
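
    As a rough consistency check on the figures quoted above (assuming, purely for illustration, that the US$4 per tonne abatement cost applies uniformly to the whole 230 million tonnes abated over the planning period), the implied total abatement cost is:

        \[
          C_{\mathrm{abatement}} \approx 230 \times 10^{6}\,\mathrm{t\,CO_2} \times 4\,\mathrm{US\$/t\,CO_2}
          \approx 0.9 \times 10^{9}\,\mathrm{US\$}
          \quad\text{over the 15-year planning period.}
        \]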

  5. Scenario planning for the electricity generation in Indonesia

    Energy Technology Data Exchange (ETDEWEB)

    Rachmatullah, C.; Aye, L.; Fuller, R.J. [The University of Melbourne, Victoria (Australia). Department of Civil and Environmental Engineering, International Technologies Centre

    2007-04-15

    The long-term planning of a future electricity supply system requires data about future demand. Planners who use the conventional planning method forecast future demand by observing past trends or alternatively by developing scenarios and then selecting the scenarios considered to be the most likely to occur. This method, however, fails to include future uncertainties. To consider such uncertainties, the scenario planning method may be used. This study uses this method to devise a long-term electricity supply plan for the Java-Madura-Bali electricity system. It was found that the scenario planning method could save up to US$3.5 billion over a 15-year period if the method was applied right at the beginning of the period. In the case of the Java-Madura-Bali system, which currently has excess installed capacity, the scenario planning method does not provide such large benefits. It was also found that introducing integrated coal gasification combined cycle and advanced gas combined cycle units would reduce greenhouse gas emissions from the Java-Madura-Bali system by approximately 230 million tonnes or 15% compared to a business-as-usual (BAU) scenario over a 15-year planning timeframe. The abatement cost was found to be US$4 per tonne of CO2. (author)

  6. Scenario based seismic hazard assessment and its application to the seismic verification of relevant buildings

    Science.gov (United States)

    Romanelli, Fabio; Vaccari, Franco; Altin, Giorgio; Panza, Giuliano

    2016-04-01

    The procedure we developed, and applied to a few relevant cases, leads to the seismic verification of a building by: a) use of a scenario-based neodeterministic approach (NDSHA) for the calculation of the seismic input, and b) control of the numerical modeling of an existing building, using free vibration measurements of the real structure. The key point of this approach is the strict collaboration, from the seismic input definition to the monitoring of the response of the building in the calculation phase, of the seismologist and the civil engineer. The vibrometry study allows the engineer to adjust the computational model in the direction suggested by the experimental result of a physical measurement. Once the model has been calibrated by vibrometric analysis, one can select in the design spectrum the proper range of periods of interest for the structure. Then, the realistic values of spectral acceleration, which include the appropriate amplification obtained through the modeling of a "scenario" input to be applied to the final model, can be selected. Generally, but not necessarily, the "scenario" spectra lead to higher accelerations than those deduced by taking the spectra from the national codes (i.e. NTC 2008, for Italy). The task of the verifying engineer is to ensure that the outcome of the verification is both conservative and realistic. We show some examples of the application of the procedure to some relevant (e.g. schools) buildings of the Trieste Province. In most of the cases, the adoption of the scenario input has increased the number of critical elements that have to be taken into account in the design of reinforcements. However, the higher cost associated with the increase of elements to reinforce is reasonable, especially considering the significant reduction of the risk level.

  7. The impact of traffic emissions on air quality in the Berlin-Brandenburg region - a case study on cycling scenarios

    Science.gov (United States)

    Kuik, F.; Lauer, A.; von Schneidemesser, E.; Butler, T. M.

    2016-12-01

    Many European cities continue to struggle with exceedances of NO2 limit values at measurement sites near roads, a large contribution to which is attributed to emissions from traffic. In this study, we explore how urban air quality can be improved with different traffic measures using the example of the Berlin-Brandenburg region. In order to simulate urban background air quality we use the Weather Research and Forecasting model with chemistry (WRF-Chem) at a horizontal resolution of 1 km. We use emission input data at a horizontal resolution of 1 km obtained by downscaling TNO-MACC III emissions based on local proxy data including population and traffic densities. In addition, we use a statistical approach combining the simulated urban background concentrations with information on traffic densities to estimate NO2 at street level. This helps in assessing whether the emission scenarios studied here can lead to significant reductions in NO2 concentrations at street level. The emission scenarios in this study represent a range of scenarios in which car traffic is replaced with bicycle traffic. Part of this study was an initial discussion phase with stakeholders, including policy makers and NGOs. The discussions have shown that the different stakeholders are interested in a scientific assessment of the impact of replacing car traffic with bicycle traffic in the Berlin-Brandenburg urban area. Local policy makers responsible for city planning and implementing traffic measures can make best use of scientific modeling results if input data and scenarios are as realistic as possible. For these reasons, the scenarios cover very idealized optimistic ("all passenger cars are replaced by bicycles") and pessimistic ("all cyclists are replaced by cars") scenarios to explore the sensitivity of simulated urban background air quality to these changes, as well as additional scenarios based on city-specific data to analyze more realistic situations. Of particular interest is how these impact
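
    The proxy-based downscaling step lends itself to a compact sketch. The example below is a hypothetical illustration and not the study's code: the array names, the 50/50 weighting between population and traffic density, and the emission total are all assumptions.

        import numpy as np

        def downscale_emissions(coarse_total, proxy):
            """Distribute a coarse-cell emission total onto fine cells in
            proportion to a proxy field (e.g. population or traffic density)."""
            weights = proxy / proxy.sum()      # normalised proxy weights
            return coarse_total * weights      # fine-grid emission field

        # Hypothetical example: one coarse emission cell split into 7 x 7 fine
        # (1 km) cells, weighted by an assumed 50/50 mix of population and
        # traffic density.
        population = np.random.rand(7, 7)
        traffic = np.random.rand(7, 7)
        proxy = 0.5 * population / population.sum() + 0.5 * traffic / traffic.sum()
        fine_emissions = downscale_emissions(120.0, proxy)   # 120 kg/h NOx (assumed)
        print(fine_emissions.sum())                          # total mass is conserved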

  8. Radiative transfer in disc galaxies - V. The accuracy of the KB approximation

    Science.gov (United States)

    Lee, Dukhang; Baes, Maarten; Seon, Kwang-Il; Camps, Peter; Verstocken, Sam; Han, Wonyong

    2016-12-01

    We investigate the accuracy of an approximate radiative transfer technique that was first proposed by Kylafis & Bahcall (hereafter the KB approximation) and has been popular in modelling dusty late-type galaxies. We compare realistic galaxy models calculated with the KB approximation with those of a three-dimensional Monte Carlo radiative transfer code SKIRT. The SKIRT code fully takes into account the contribution of multiple scattering, whereas the KB approximation calculates only the singly scattered intensity and approximates the multiple-scattering components. We find that the KB approximation gives fairly accurate results if optically thin, face-on galaxies are considered. However, for highly inclined (i ≳ 85°) and/or optically thick (central face-on optical depth ≳1) galaxy models, the approximation can give rise to substantial errors, sometimes up to ≳40 per cent. Moreover, it is found that the KB approximation is not always physical, sometimes producing infinite intensities at lines of sight with high optical depth in edge-on galaxy models. There is no 'simple recipe' to correct the errors of the KB approximation that is universally applicable to all galaxy models. Therefore, it is recommended that the full radiative transfer calculation be used, even though it is slower than the KB approximation.

  9. Online scenario labeling using a hidden Markov model for assessment of nuclear plant state

    International Nuclear Information System (INIS)

    Zamalieva, Daniya; Yilmaz, Alper; Aldemir, Tunc

    2013-01-01

    By taking into account both aleatory and epistemic uncertainties within the same probabilistic framework, dynamic event trees (DETs) provide more comprehensive and systematic coverage of possible scenarios following an initiating event compared to conventional event trees. When DET generation algorithms are applied to complex realistic systems, extremely large amounts of data can be produced due to both the large number of scenarios generated following a single initiating event and the large number of data channels that represent these scenarios. In addition, the computational time required for the simulation of each scenario can be very large (e.g. about 24 h of serial run simulation time for a 4 h station blackout scenario). Since scenarios leading to system failure are of greater interest, a method is proposed for online labeling of scenarios as failure or non-failure. The algorithm first trains a Hidden Markov Model, which represents the behavior of non-failure scenarios, using a training set from previous simulations. Then, the maximum likelihoods of sample failure and non-failure scenarios fitting this model are computed. These values are used to determine the timestamp at which the labeling of a certain scenario should be performed. Finally, during the succeeding timestamps, the likelihood of each scenario fitting the learned model is computed, and a dynamic thresholding based on the previously calculated likelihood values is applied. The scenarios whose likelihood is higher than the threshold are labeled as non-failure. The proposed algorithm can further delay the non-failure scenarios or discontinue them in order to redirect the computational resources toward the failure scenarios, and reduce computational time and complexity. Experiments using a RELAP5/3D model of a fast reactor utilizing a Reactor Vessel Auxiliary Cooling System (RVACS) passive decay heat removal system and dynamic analysis of a station blackout (SBO) event show that the proposed method is
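
    The labeling scheme can be sketched compactly. The snippet below is a hedged illustration rather than the authors' implementation: it assumes the Python package hmmlearn is available, substitutes randomly generated placeholder data for the RELAP5/3D channel outputs, and uses an illustrative threshold rule; all names and numerical settings are assumptions.

        import numpy as np
        from hmmlearn.hmm import GaussianHMM   # assumed available

        # Training: fit an HMM to channel data from known non-failure scenarios.
        # Each run is a (timesteps x channels) array of simulated plant variables.
        nonfailure_runs = [np.random.randn(200, 4) for _ in range(20)]   # placeholder data
        X_train = np.vstack(nonfailure_runs)
        lengths = [len(r) for r in nonfailure_runs]
        model = GaussianHMM(n_components=5, covariance_type="diag", n_iter=50)
        model.fit(X_train, lengths)

        # Online labeling: score a (possibly partial) scenario as it evolves and
        # label it non-failure once its per-step log-likelihood under the
        # non-failure HMM exceeds a threshold derived from the training runs.
        train_scores = [model.score(r) / len(r) for r in nonfailure_runs]
        threshold = np.mean(train_scores) - 2 * np.std(train_scores)

        def label_scenario(partial_run):
            loglik = model.score(partial_run) / len(partial_run)
            return "non-failure" if loglik > threshold else "keep simulating"

        print(label_scenario(np.random.randn(80, 4)))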

  10. A Hierarchy of Transport Approximations for High Energy Heavy (HZE) Ions

    Science.gov (United States)

    Wilson, John W.; Lamkin, Stanley L.; Hamidullah, Farhat; Ganapol, Barry D.; Townsend, Lawrence W.

    1989-01-01

    The transport of high energy heavy (HZE) ions through bulk materials is studied neglecting energy dependence of the nuclear cross sections. A three term perturbation expansion appears to be adequate for most practical applications for which penetration depths are less than 30 g per sq cm of material. The differential energy flux is found for monoenergetic beams and for realistic ion beam spectral distributions. An approximate formalism is given to estimate higher-order terms.

  11. Comparison of approximate gravitational lens equations and a proposal for an improved new one

    International Nuclear Information System (INIS)

    Bozza, V.

    2008-01-01

    Keeping the exact general relativistic treatment of light bending as a reference, we compare the accuracy of commonly used approximate lens equations. We conclude that the best approximate lens equation is the Ohanian lens equation, for which we present a new expression in terms of distances between observer, lens, and source planes. We also examine a realistic gravitational lensing case, showing that the precision of the Ohanian lens equation might be required for a reliable treatment of gravitational lensing and a correct extraction of the full information about gravitational physics.

  12. Tuukka Kaidesoja on Critical Realist Transcendental Realism

    Directory of Open Access Journals (Sweden)

    Groff Ruth

    2015-09-01

    Full Text Available I argue that critical realists think pretty much what Tuukka Kaidesoja says that he himself thinks, but also that Kaidesoja's objections to the views that he attributes to critical realists are not persuasive.

  13. Implementation of the CCGM approximation for surface diffraction using Wigner R-matrix theory

    International Nuclear Information System (INIS)

    Lauderdale, J.G.; McCurdy, C.W.

    1983-01-01

    The CCGM approximation for surface scattering proposed by Cabrera, Celli, Goodman, and Manson [Surf. Sci. 19, 67 (1970)] is implemented for realistic surface interaction potentials using Wigner R-matrix theory. The resulting procedure is highly efficient computationally and is in no way limited to hard wall or purely repulsive potentials. Comparison is made with the results of close-coupling calculations of other workers which include the same diffraction channels, in order to fairly evaluate the CCGM approximation, which is an approximation to the coupled-channels Lippmann-Schwinger equation for the T matrix. The shapes of selective adsorption features, whether maxima or minima, in the scattered intensity are well represented in this approach for cases in which the surface corrugation is not too strong.

  14. Collapse Scenarios of High-Rise Buildings Using Plastic Limit Analysis

    Directory of Open Access Journals (Sweden)

    G. Liu

    2009-01-01

    Full Text Available The Twin Towers of the World Trade Center (WTC) in New York, USA, collapsed on 11 September, 2001. The incident is regarded as the most severe disaster for high-rise buildings in history. Investigations into the collapse scenarios are still being conducted. Possible collapse scenarios assessed by local and international experts were reported. Another possible collapse scenario of the WTC, based on two hypotheses, is proposed in this paper, and the idea of plastic limit analysis is applied to evaluate the approximate limit load. According to the theoretical analysis and numerical calculations, it can be concluded that the large fires caused by the terrorist attack played a significant role in the collapse of the WTC.

  15. Measurement of time delays in gated radiotherapy for realistic respiratory motions

    International Nuclear Information System (INIS)

    Chugh, Brige P.; Quirk, Sarah; Conroy, Leigh; Smith, Wendy L.

    2014-01-01

    Purpose: Gated radiotherapy is used to reduce internal motion margins, escalate target dose, and limit normal tissue dose; however, its temporal accuracy is limited. Beam-on and beam-off time delays can lead to treatment inefficiencies and/or geographic misses; therefore, AAPM Task Group 142 recommends verifying the temporal accuracy of gating systems. Many groups use sinusoidal phantom motion for this, under the tacit assumption that use of sinusoidal motion for determining time delays produces negligible error. The authors test this assumption by measuring gating time delays for several realistic motion shapes with increasing degrees of irregularity. Methods: Time delays were measured on a linear accelerator with a real-time position management system (Varian TrueBeam with RPM system version 1.7.5) for seven motion shapes: regular sinusoidal; regular realistic-shape; large (40%) and small (10%) variations in amplitude; large (40%) variations in period; small (10%) variations in both amplitude and period; and baseline drift (30%). Film streaks of radiation exposure were generated for each motion shape using a programmable motion phantom. Beam-on and beam-off time delays were determined from the difference between the expected and observed streak length. Results: For the system investigated, all sine, regular realistic-shape, and slightly irregular amplitude variation motions had beam-off and beam-on time delays within the AAPM recommended limit of less than 100 ms. In phase-based gating, even small variations in period resulted in some time delays greater than 100 ms. Considerable time delays over 1 s were observed with highly irregular motion. Conclusions: Sinusoidal motion shapes can be considered a reasonable approximation to the more complex and slightly irregular shapes of realistic motion. When using phase-based gating with predictive filters even small variations in period can result in time delays over 100 ms. Clinical use of these systems for patients

  16. Measurement of time delays in gated radiotherapy for realistic respiratory motions

    Energy Technology Data Exchange (ETDEWEB)

    Chugh, Brige P.; Quirk, Sarah; Conroy, Leigh; Smith, Wendy L., E-mail: Wendy.Smith@albertahealthservices.ca [Department of Medical Physics, Tom Baker Cancer Centre, Calgary, Alberta T2N 4N2 (Canada)

    2014-09-15

    Purpose: Gated radiotherapy is used to reduce internal motion margins, escalate target dose, and limit normal tissue dose; however, its temporal accuracy is limited. Beam-on and beam-off time delays can lead to treatment inefficiencies and/or geographic misses; therefore, AAPM Task Group 142 recommends verifying the temporal accuracy of gating systems. Many groups use sinusoidal phantom motion for this, under the tacit assumption that use of sinusoidal motion for determining time delays produces negligible error. The authors test this assumption by measuring gating time delays for several realistic motion shapes with increasing degrees of irregularity. Methods: Time delays were measured on a linear accelerator with a real-time position management system (Varian TrueBeam with RPM system version 1.7.5) for seven motion shapes: regular sinusoidal; regular realistic-shape; large (40%) and small (10%) variations in amplitude; large (40%) variations in period; small (10%) variations in both amplitude and period; and baseline drift (30%). Film streaks of radiation exposure were generated for each motion shape using a programmable motion phantom. Beam-on and beam-off time delays were determined from the difference between the expected and observed streak length. Results: For the system investigated, all sine, regular realistic-shape, and slightly irregular amplitude variation motions had beam-off and beam-on time delays within the AAPM recommended limit of less than 100 ms. In phase-based gating, even small variations in period resulted in some time delays greater than 100 ms. Considerable time delays over 1 s were observed with highly irregular motion. Conclusions: Sinusoidal motion shapes can be considered a reasonable approximation to the more complex and slightly irregular shapes of realistic motion. When using phase-based gating with predictive filters even small variations in period can result in time delays over 100 ms. Clinical use of these systems for patients

  17. TOXICOLOGICAL EVALUATION OF REALISTIC EMISSIONS OF SOURCE AEROSOLS (TERESA): APPLICATION TO POWER PLANT-DERIVED PM2.5

    Energy Technology Data Exchange (ETDEWEB)

    Annette C. Rohr; Petros Koutrakis; John Godleski

    2011-03-31

    Determining the health impacts of different sources and components of fine particulate matter (PM2.5) is an important scientific goal, because PM is a complex mixture of both inorganic and organic constituents that likely differ in their potential to cause adverse health outcomes. The TERESA (Toxicological Evaluation of Realistic Emissions of Source Aerosols) study focused on two PM sources - coal-fired power plants and mobile sources - and sought to investigate the toxicological effects of exposure to realistic emissions from these sources. The DOE-EPRI Cooperative Agreement covered the performance and analysis of field experiments at three power plants. The mobile source component consisted of experiments conducted at a traffic tunnel in Boston; these activities were funded through the Harvard-EPA Particulate Matter Research Center and will be reported separately in the peer-reviewed literature. TERESA attempted to delineate health effects of primary particles, secondary (aged) particles, and mixtures of these with common atmospheric constituents. The study involved withdrawal of emissions directly from power plant stacks, followed by aging and atmospheric transformation of emissions in a mobile laboratory in a manner that simulated downwind power plant plume processing. Secondary organic aerosol (SOA) derived from the biogenic volatile organic compound α-pinene was added in some experiments, and in others ammonia was added to neutralize strong acidity. Specifically, four scenarios were studied at each plant: primary particles (P); secondary (oxidized) particles (PO); oxidized particles + secondary organic aerosol (SOA) (POS); and oxidized and neutralized particles + SOA (PONS). Extensive exposure characterization was carried out, including gas-phase and particulate species. Male Sprague Dawley rats were exposed for 6 hours to filtered air or different atmospheric mixtures. Toxicological endpoints included (1) breathing pattern; (2) bronchoalveolar lavage

  18. Development of a realistic, dynamic digital brain phantom for CT perfusion validation

    Science.gov (United States)

    Divel, Sarah E.; Segars, W. Paul; Christensen, Soren; Wintermark, Max; Lansberg, Maarten G.; Pelc, Norbert J.

    2016-03-01

    Physicians rely on CT Perfusion (CTP) images and quantitative image data, including cerebral blood flow, cerebral blood volume, and bolus arrival delay, to diagnose and treat stroke patients. However, the quantification of these metrics may vary depending on the computational method used. Therefore, we have developed a dynamic and realistic digital brain phantom upon which CTP scans can be simulated based on a set of ground truth scenarios. Building upon the previously developed 4D extended cardiac-torso (XCAT) phantom containing a highly detailed brain model, this work consisted of expanding the intricate vasculature by semi-automatically segmenting existing MRA data and fitting nonuniform rational B-spline surfaces to the new vessels. Using time attenuation curves input by the user as reference, the contrast enhancement in the vessels changes dynamically. At each time point, the iodine concentration in the arteries and veins is calculated from the curves and the material composition of the blood changes to reflect the expected values. CatSim, a CT system simulator, generates simulated data sets of this dynamic digital phantom which can be further analyzed to validate CTP studies and post-processing methods. The development of this dynamic and realistic digital phantom provides a valuable resource with which current uncertainties and controversies surrounding the quantitative computations generated from CTP data can be examined and resolved.
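
    The curve-to-concentration step can be sketched as below. This is an assumption-laden illustration, not the phantom's actual implementation: the conversion factor (roughly 25 HU of enhancement per mg/mL of iodine, indicative of scans around 120 kVp) is energy-dependent and only illustrative, and the sample curve is invented.

        import numpy as np

        def iodine_concentration(tac_hu, baseline_hu, hu_per_mg_ml=25.0):
            """Convert a time-attenuation curve (in HU) to an iodine
            concentration curve, given an assumed enhancement-per-
            concentration factor (energy-dependent, illustrative only)."""
            enhancement = np.asarray(tac_hu, dtype=float) - baseline_hu
            return np.clip(enhancement, 0.0, None) / hu_per_mg_ml   # mg iodine / mL

        # Hypothetical arterial curve sampled once per second
        tac = [45, 45, 60, 120, 210, 180, 130, 90, 70, 60]
        print(iodine_concentration(tac, baseline_hu=45.0))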

  19. Demand scenarios, worldwide

    Energy Technology Data Exchange (ETDEWEB)

    Schaefer, A [Massachusetts Inst. of Technology, Center for Technology, Policy and Industrial Development and the MIT Joint Program on the Science and Policy of Global Change, Cambridge, MA (United States)

    1996-11-01

    Existing methods are inadequate for developing aggregate (regional and global) and long-term (several decades) passenger transport demand scenarios, since they are mainly based on simple extensions of current patterns rather than causal relationships that account for the competition among transport modes (aircraft, automobiles, buses and trains) to provide transport services. The demand scenario presented in this paper is based on two empirically proven invariances of human behavior. First, transport accounts for 10 to 15 percent of household total expenditures for those owning an automobile, and around 5 percent for non-motorized households on average (travel money budget). Second, the mean time spent traveling is approximately one hour per capita per day (travel time budget). These two budget constraints determine the dynamics of the scenario: rising income increases per capita expenditure on travel which, in turn, increases demand for mobility. The limited travel time budget constrains travelers to shift to faster transport systems. The scenario is initiated with the first integrated historical data set on traffic volume in 11 world regions and the globe from 1960 to 1990 for all major modes of motorized transport. World average per capita traffic volume, which was 1,800 kilometers in 1960 and 4,2090 in 1990, is estimated to rise to 7,900 kilometers in 2020 - given a modest average increase in Gross World Product of 1.9% per year. Higher economic growth rates in Asian regions result in an increase in regional per capita traffic volume up to a factor of 5.3 from 1990 levels. Modal splits continue shifting to more flexible and faster modes of transport. At one point, passenger cars can no longer satisfy the increasing demand for speed (i.e. rising mobility within a fixed time budget). In North America it is estimated that the absolute traffic volume of automobiles will gradually decline starting in the 2010s. (author) 13 figs., 6 tabs., 35 refs.

  20. Creating pedestrian crash scenarios in a driving simulator environment.

    Science.gov (United States)

    Chrysler, Susan T; Ahmad, Omar; Schwarz, Chris W

    2015-01-01

    In 2012 in the United States, pedestrian injuries accounted for 3.3% of all traffic injuries but, disproportionately, pedestrian fatalities accounted for roughly 14% of traffic-related deaths (NHTSA 2014 ). In many other countries, pedestrians make up more than 50% of those injured and killed in crashes. This research project examined driver response to crash-imminent situations involving pedestrians in a high-fidelity, full-motion driving simulator. This article presents a scenario development method and discusses experimental design and control issues in conducting pedestrian crash research in a simulation environment. Driving simulators offer a safe environment in which to test driver response and offer the advantage of having virtual pedestrian models that move realistically, unlike test track studies, which by nature must use pedestrian dummies on some moving track. An analysis of pedestrian crash trajectories, speeds, roadside features, and pedestrian behavior was used to create 18 unique crash scenarios representative of the most frequent and most costly crash types. For the study reported here, we only considered scenarios where the car is traveling straight because these represent the majority of fatalities. We manipulated driver expectation of a pedestrian both by presenting intersection and mid-block crossing as well as by using features in the scene to direct the driver's visual attention toward or away from the crossing pedestrian. Three visual environments for the scenarios were used to provide a variety of roadside environments and speed: a 20-30 mph residential area, a 55 mph rural undivided highway, and a 40 mph urban area. Many variables of crash situations were considered in selecting and developing the scenarios, including vehicle and pedestrian movements; roadway and roadside features; environmental conditions; and characteristics of the pedestrian, driver, and vehicle. The driving simulator scenarios were subjected to iterative testing to

  1. World Future Mapping and Scenarios for the 21st Century

    Directory of Open Access Journals (Sweden)

    Vareikis Egidijus

    2015-12-01

    Full Text Available The aim of this text is to describe the methods of future studies, their possibilities and limitations, as well as to make some predictions about the real picture of the development of the 21st century. However, such planning is still not very reliable, and far from a "road map" framework. Thus, future studies are still balancing between science and scientific/artistic fiction. The set of methods of future investigation permits one to compose a few or even up to dozens of medium-term or long-term scenarios of the world's future. There are a few well-proven laws of social and economic development as well as some partially predictable phenomena in the areas of environment, biology, human ethics, etc. No future planning is secure from unpredictable phenomena – "black swans" – and their impact, nor secure from "political decisions" that destroy natural developments in society. So no one scenario can pretend to be absolutely right. The most frequent future scenarios are based on the wish to implement a copy of an existing "happy nation", to fight undesirable trends, and to create some kind of "dream society" while stimulating positive and inhibiting negative trends. The final version of a scenario also depends upon "human factors", e.g. the knowledge, stereotypes of thinking, and wishes of those who are financing the project. Generally they are "happy ending" projects. This makes such scenarios rather useless. Only independent experts who present more realistic and reliable scenarios can help in the planning of medium-term and long-term futures. Currently many scenarios foresee the so-called American or European way of development, which is in fact the continuation of the existing world order. There is a growing number of publications about the emergence of China (and Russia) as a great power, as well as the possibilities of a New Caliphate, a New Messiah or new Orwellian-style regimes.

  2. Generating realistic roofs over a rectilinear polygon

    KAUST Repository

    Ahn, Heekap

    2011-01-01

    Given a simple rectilinear polygon P in the xy-plane, a roof over P is a terrain over P whose faces are supported by planes through edges of P that make a dihedral angle π/4 with the xy-plane. In this paper, we introduce realistic roofs by imposing a few additional constraints. We investigate the geometric and combinatorial properties of realistic roofs, and show a connection with the straight skeleton of P. We show that the maximum possible number of distinct realistic roofs over P is the binomial coefficient ((n-4)/2 choose ⌊(n-4)/4⌋) when P has n vertices. We present an algorithm that enumerates a combinatorial representation of each such roof in O(1) time per roof without repetition, after O(n^4) preprocessing time. We also present an O(n^5)-time algorithm for computing a realistic roof with minimum height or volume. © 2011 Springer-Verlag.

  3. Hydrogen energy network start-up scenario

    International Nuclear Information System (INIS)

    Weingartner, S.; Ellerbrock, H.

    1994-01-01

    Hydrogen is widely discussed as a future fuel and energy storage medium, either to replace conventional fuels for automobiles, aircraft and ships or to avoid the necessity of bulky battery systems for electricity storage, especially in connection with solar power systems. These discussions, however, started more than 25 years ago, and up to now hydrogen has failed to achieve a major break-through towards wider application as an energy storage medium in civil markets. The main reason is that other fuels are cheaper and very well implemented in our daily life. A study has been performed at Deutsche Aerospace in order to evaluate the boundary conditions, either political or economical, which would give hydrogen the necessary push, i.e. an advantage over conventional fuels. The main goal of this study was to identify critical influence factors and specific start-up scenarios which would allow an economical and practically realistic use of hydrogen as fuel and energy medium in certain niche markets outside the space industry. The method and major results of this study are presented in detail in the paper. Certain niche markets could be identified where, with little initial governmental support, either by funding, tax laws or legislation, hydrogen can compete with conventional fuels. This however requires a scenario where a lot of small actions have to be taken by a wide variety of institutions and industries which today are not interconnected with each other, i.e. it requires a new cooperative and proactive network between e.g. energy utilities, car industries, those who have a sound experience with hydrogen (space industry, chemical industry) and, last but certainly not least, the government. Based on the developed scenario, precise recommendations are drawn as conclusions.

  4. Development of a safety decision-making scenario to measure worker safety in agriculture.

    Science.gov (United States)

    Mosher, G A; Keren, N; Freeman, S A; Hurburgh, C R

    2014-04-01

    Human factors play an important role in the management of occupational safety, especially in high-hazard workplaces such as commercial grain-handling facilities. Employee decision-making patterns represent an essential component of the safety system within a work environment. This research describes the process used to create a safety decision-making scenario to measure the process that grain-handling employees used to make choices in a safety-related work task. A sample of 160 employees completed safety decision-making simulations based on a hypothetical but realistic scenario in a grain-handling environment. Their choices and the information they used to make their choices were recorded. Although the employees emphasized safety information in their decision-making process, not all of their choices were safe choices. Factors influencing their choices are discussed, and implications for industry, management, and workers are shared.

  5. Nonlinear oligopolistic game with isoelastic demand function: Rationality and local monopolistic approximation

    International Nuclear Information System (INIS)

    Askar, S.S.; Alnowibet, K.

    2016-01-01

    Isoelastic demand functions have been used in the literature to study the dynamic features of systems constructed on the basis of economic market structure. In this paper, we adopt the so-called Cobb–Douglas production function and study its impact on the steady state of an oligopolistic game that consists of four oligopolistic competitors or firms. Briefly, the paper handles three different scenarios. The first scenario introduces four oligopolistic firms that play rationally against each other in the market. The firms use the myopic (or bounded-rationality) mechanism to update their production in the next time unit. The steady state of the obtained system in this scenario, which is the Nash equilibrium, is unique and its characteristics are investigated. In the second scenario, one competitor adopts a local monopolistic approximation (LMA) strategy and plays against the three rational firms. The last scenario discusses the case when three competitors use the LMA strategy against a rational one. In all scenarios, discrete dynamical systems are used to describe the game. The stability of the Nash equilibrium is investigated analytically and numerical simulations are used to confirm the obtained analytical results.
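
    For concreteness, the sketch below implements one common textbook form of the two update rules mentioned above (gradient-based bounded rationality and the local monopolistic approximation) for an isoelastic inverse demand p = 1/Q with linear costs. It is an illustrative reconstruction under assumed costs and adjustment speed, not the paper's exact map.

        import numpy as np

        c = np.array([0.30, 0.32, 0.35, 0.28])   # marginal costs (assumed)
        alpha = 0.35                              # adjustment speed (assumed)

        def step(q, lma_players=()):
            Q = q.sum()
            p = 1.0 / Q                           # isoelastic inverse demand
            dp = -1.0 / Q**2                      # local slope of inverse demand
            q_new = q.copy()
            for i in range(len(q)):
                if i in lma_players:
                    # Local monopolistic approximation: best response to a linear
                    # demand drawn through the current (Q, p) point.
                    q_new[i] = 0.5 * q[i] + 0.5 * (p - c[i]) / (-dp)
                else:
                    # Bounded rationality: move along the marginal-profit gradient.
                    marginal_profit = (Q - q[i]) / Q**2 - c[i]
                    q_new[i] = q[i] + alpha * q[i] * marginal_profit
            return np.maximum(q_new, 1e-9)        # keep outputs positive

        q = np.full(4, 0.5)
        for _ in range(200):                      # second scenario: firm 0 uses LMA
            q = step(q, lma_players={0})
        print(q, 1.0 / q.sum())                   # approximate steady state and price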

  6. Nucleon-pair approximation to the nuclear shell model

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Y.M., E-mail: ymzhao@sjtu.edu.cn [Department of Physics and Astronomy, Shanghai Jiao Tong University, Shanghai 200240 (China); Arima, A. [Department of Physics and Astronomy, Shanghai Jiao Tong University, Shanghai 200240 (China); Musashi Gakuen, 1-26-1 Toyotamakami Nerima-ku, Tokyo 176-8533 (Japan)

    2014-12-01

    Atomic nuclei are complex systems of nucleons (protons and neutrons). Nucleons interact with each other via an attractive and short-range force. This feature of the interaction leads to a pattern of dominantly monopole and quadrupole correlations between like particles (i.e., proton–proton and neutron–neutron correlations) in low-lying states of atomic nuclei. As a consequence, among dozens or even hundreds of possible types of nucleon pairs, very few nucleon pairs such as proton and neutron pairs with spin zero, two (in some cases spin four), and occasionally isoscalar spin-aligned proton–neutron pairs, play important roles in low-energy nuclear structure. The nucleon-pair approximation therefore provides us with an efficient truncation scheme of the full shell model configurations, which are otherwise too large to handle for medium and heavy nuclei in the foreseeable future. Furthermore, the nucleon-pair approximation leads to simple pictures in physics, as the dimension of the nucleon-pair subspace is always small. The present paper aims at a sound review of its history, formulation, validity, applications, as well as its link to previous approaches, with the focus on the new developments in the last two decades. The applicability of the nucleon-pair approximation and numerical calculations of low-lying states for realistic atomic nuclei are demonstrated with examples. Applications of pair approximations to other problems are also discussed.

  7. Development of a realistic human airway model.

    Science.gov (United States)

    Lizal, Frantisek; Elcner, Jakub; Hopke, Philip K; Jedelsky, Jan; Jicha, Miroslav

    2012-03-01

    Numerous models of human lungs with various levels of idealization have been reported in the literature; consequently, results acquired using these models are difficult to compare to in vivo measurements. We have developed a set of model components based on realistic geometries, which permits the analysis of the effects of subsequent model simplification. A realistic digital upper airway geometry, except for the lack of an oral cavity, has been created which proved suitable both for computational fluid dynamics (CFD) simulations and for the fabrication of physical models. Subsequently, an oral cavity was added to the tracheobronchial geometry. The airway geometry including the oral cavity was adjusted to enable fabrication of a semi-realistic model. Five physical models were created based on these three digital geometries. Two optically transparent models, one with and one without the oral cavity, were constructed for flow velocity measurements, two realistic segmented models, one with and one without the oral cavity, were constructed for particle deposition measurements, and a semi-realistic model with glass cylindrical airways was developed for optical measurements of flow velocity and in situ particle size measurements. One-dimensional phase Doppler anemometry measurements were made and compared to the CFD calculations for this model and good agreement was obtained.

  8. Environmental evaluation of waste management scenarios - significance of the boundaries

    NARCIS (Netherlands)

    Ghinea, C.; Petraru, M.; Bressers, Johannes T.A.; Gavrilescu, M.

    2012-01-01

    The life cycle concept was applied to analyse and assess some municipal solid waste (MSW) management scenarios in terms of environmental impacts, particularised for Iasi city, Romania, where approximately 380 kg/cap/yr of waste are generated. Currently, the management processes include temporary

  9. An approximate dynamic programming approach to resource management in multi-cloud scenarios

    Science.gov (United States)

    Pietrabissa, Antonio; Priscoli, Francesco Delli; Di Giorgio, Alessandro; Giuseppi, Alessandro; Panfili, Martina; Suraci, Vincenzo

    2017-03-01

    The programmability and the virtualisation of network resources are crucial to deploy scalable Information and Communications Technology (ICT) services. The increasing demand of cloud services, mainly devoted to the storage and computing, requires a new functional element, the Cloud Management Broker (CMB), aimed at managing multiple cloud resources to meet the customers' requirements and, simultaneously, to optimise their usage. This paper proposes a multi-cloud resource allocation algorithm that manages the resource requests with the aim of maximising the CMB revenue over time. The algorithm is based on Markov decision process modelling and relies on reinforcement learning techniques to find online an approximate solution.
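
    As a hedged illustration of the reinforcement-learning idea (not the paper's approximate dynamic programming algorithm), the sketch below solves a toy two-cloud admission and placement problem with tabular Q-learning; the capacities, prices, costs and request dynamics are all assumptions.

        import random
        from collections import defaultdict

        CAPACITY = (4, 4)                    # free slots in cloud A and cloud B (assumed)
        PRICE, COST = 1.0, (0.3, 0.5)        # revenue per request, per-cloud cost (assumed)
        ACTIONS = ("A", "B", "reject")

        def step(state, action):
            """Place (or reject) one arriving request and let some requests finish."""
            free_a, free_b = state
            reward = 0.0
            if action == "A" and free_a > 0:
                free_a -= 1; reward = PRICE - COST[0]
            elif action == "B" and free_b > 0:
                free_b -= 1; reward = PRICE - COST[1]
            free_a = min(CAPACITY[0], free_a + (random.random() < 0.3))
            free_b = min(CAPACITY[1], free_b + (random.random() < 0.3))
            return (free_a, free_b), reward

        Q = defaultdict(float)
        alpha, gamma, eps = 0.1, 0.95, 0.1
        state = CAPACITY
        for _ in range(50_000):
            action = (random.choice(ACTIONS) if random.random() < eps
                      else max(ACTIONS, key=lambda a: Q[state, a]))
            nxt, reward = step(state, action)
            best_next = max(Q[nxt, a] for a in ACTIONS)
            Q[state, action] += alpha * (reward + gamma * best_next - Q[state, action])
            state = nxt

        print(max(ACTIONS, key=lambda a: Q[CAPACITY, a]))   # learned choice when both clouds are free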

  10. Simulations of KSTAR high performance steady state operation scenarios

    International Nuclear Information System (INIS)

    Na, Yong-Su; Kessel, C.E.; Park, J.M.; Yi, Sumin; Kim, J.Y.; Becoulet, A.; Sips, A.C.C.

    2009-01-01

    We report the results of predictive modelling of high performance steady state operation scenarios in KSTAR. Firstly, the capabilities of steady state operation are investigated with time-dependent simulations using a free-boundary plasma equilibrium evolution code coupled with transport calculations. Secondly, the reproducibility of high performance steady state operation scenarios developed in the DIII-D tokamak, of similar size to that of KSTAR, is investigated using the experimental data taken from DIII-D. Finally, the capability of ITER-relevant steady state operation is investigated in KSTAR. It is found that KSTAR is able to establish high performance steady state operation scenarios: β_N above 3, H_98(y,2) up to 2.0, f_BS up to 0.76 and f_NI equal to 1.0. In this work, a realistic density profile is newly introduced for predictive simulations by employing the scaling law of a density peaking factor. The influence of the current ramp-up scenario and the transport model is discussed with respect to the fusion performance and non-inductive current drive fraction in the transport simulations. As observed in the experiments, both the heating and the plasma current waveforms in the current ramp-up phase produce a strong effect on the q-profile, the fusion performance and also on the non-inductive current drive fraction in the current flattop phase. A criterion in terms of q_min is found to establish ITER-relevant steady state operation scenarios. This will provide a guideline for designing the current ramp-up phase in KSTAR. It is observed that the transport model also affects the predictive values of fusion performance as well as the non-inductive current drive fraction. The Weiland transport model predicts the highest fusion performance as well as non-inductive current drive fraction in KSTAR. In contrast, the GLF23 model exhibits the lowest ones. ITER-relevant advanced scenarios cannot be obtained with the GLF23 model in the conditions given in this work.

  11. Towards a user's guide to scenarios - a report on scenario types and scenario techniques

    Energy Technology Data Exchange (ETDEWEB)

    Boerjeson, Lena; Hoejer, Mattias; Dreborg, Karl-Henrik; Finnveden, Goeran [Royal Inst. of Technology, Stockholm (Sweden). Environmental Strategies Research - fms; Ekvall, Tomas [Chalmers Univ. of Technology, Goeteborg (Sweden). Dept. of Energy and Environment

    2005-11-01

    Futures studies consist of a vast variation of studies and approaches. The aim of this paper is to contribute to the understanding of the purposes for which scenarios are useful and of the methods and procedures that are useful for furthering these purposes. We present a scenario typology intended to better suit the context in which the scenarios are used. The scenario typology is combined with a new way of looking at scenario techniques, i.e. practical methods and procedures for scenario development. Finally, we look at the usefulness of scenarios in the light of the scenario typology and the scenario techniques. As a start, we distinguish between three main categories of scenario studies. The classification is based on the principal questions we believe a user may want to pose about the future. The resolution is then increased by letting each category contain two different scenario types. These are distinguished by different angles of approach to the questions defining the categories. The first question, What will happen?, is responded to by Predictive scenarios. In fact, the response to a question like this will always be conditional, e.g. on a stable and peaceful world, or on a certain continuous development of some kind. We have utilized this fact when defining the two predictive scenario types, Forecasts and What-if scenarios. The second question, What can happen?, is responded to by Explorative scenarios. The scenarios are thus explorations of what might happen in the future, regardless of beliefs of what is likely to happen or opinions of what is desirable. This category is further divided into external and strategic scenarios. The final question, How can a specific target be reached?, is responded to by Normative scenarios. Such studies are explicitly normative, since they take a target as a starting point. They are often directed towards how the target could be reached. This category is divided into preserving and transforming scenarios. If the user wants to

  12. Scenario-Led Habitat Modelling of Land Use Change Impacts on Key Species.

    Directory of Open Access Journals (Sweden)

    Matthew Geary

    Full Text Available Accurate predictions of the impacts of future land use change on species of conservation concern can help to inform policy-makers and improve conservation measures. If predictions are spatially explicit, predicted consequences of likely land use changes could be accessible to land managers at a scale relevant to their working landscape. We introduce a method, based on open source software, which integrates habitat suitability modelling with scenario-building, and illustrate its use by investigating the effects of alternative land use change scenarios on landscape suitability for black grouse Tetrao tetrix. Expert opinion was used to construct five near-future (twenty-year) scenarios for the 800 km2 study site in upland Scotland. For each scenario, the cover of different land use types was altered by 5-30% from 20 random starting locations and changes in habitat suitability assessed by projecting a MaxEnt suitability model onto each simulated landscape. A scenario converting grazed land to moorland and open forestry was the most beneficial for black grouse, and 'increased grazing' (the opposite conversion) the most detrimental. Positioning of new landscape blocks was shown to be important in some situations. Increasing the area of open-canopy forestry caused a proportional decrease in suitability, but suitability gains for the 'reduced grazing' scenario were nonlinear. 'Scenario-led' landscape simulation models can be applied in assessments of the impacts of land use change both on individual species and also on diversity and community measures, or ecosystem services. A next step would be to include landscape configuration more explicitly in the simulation models, both to make them more realistic, and to examine the effects of habitat placement more thoroughly. In this example, the recommended policy would be incentives on grazing reduction to benefit black grouse.

  13. Kuhn: Realist or Antirealist?

    Directory of Open Access Journals (Sweden)

    Michel Ghins

    1998-06-01

    Full Text Available Although Kuhn is much more an antirealist than a realist, the earlier and later articulations of realist and antirealist ingredients in his views merit close scrutiny. What are the constituents of the real invariant World posited by Kuhn and its relation to the mutable paradigm-related worlds? Various proposed solutions to this problem (dubbed the "new-world problem" by Ian Hacking) are examined and shown to be unsatisfactory. In The Structure of Scientific Revolutions, the stable World can reasonably be taken to be made up of ordinary perceived objects, whereas in Kuhn's later works the transparadigmatic World is identified with something akin to the Kantian world-in-itself. It is argued that both proposals are beset with insuperable difficulties which render Kuhn's earlier and later versions of antirealism implausible.

  14. The effect of a realistic thermal diffusivity on numerical model of a subducting slab

    Science.gov (United States)

    Maierova, P.; Steinle-Neumann, G.; Cadek, O.

    2010-12-01

    A number of numerical studies of subducting slabs assume simplified (constant or only depth-dependent) models of thermal conductivity. The available mineral physics data indicate, however, that thermal diffusivity is strongly temperature- and pressure-dependent and may also vary among different mantle materials. In the present study, we examine the influence of realistic thermal properties of mantle materials on the thermal state of the upper mantle and the dynamics of subducting slabs. On the basis of the data published in the mineral physics literature we compile analytical relationships that approximate the pressure and temperature dependence of thermal diffusivity for major mineral phases of the mantle (olivine, wadsleyite, ringwoodite, garnet, clinopyroxenes, stishovite and perovskite). We propose a simplified composition of mineral assemblages predominating in the subducting slab and the surrounding mantle (pyrolite, mid-ocean ridge basalt, harzburgite) and we estimate their thermal diffusivity using the Hashin-Shtrikman bounds. The resulting complex formula for the diffusivity of each aggregate is then approximated by a simpler analytical relationship that is used in our numerical model as an input parameter. For the numerical modeling we use the Elmer software (open source finite element software for multiphysical problems, see http://www.csc.fi/english/pages/elmer). We set up a 2D Cartesian thermo-mechanical steady-state model of a subducting slab. The model is partly kinematic as the flow is driven by a boundary condition on velocity that is prescribed on the top of the subducting lithospheric plate. The rheology of the material is non-linear and is coupled with the thermal equation. Using the realistic relationship for thermal diffusivity of mantle materials, we compute the thermal and flow fields for different input velocities and ages of the subducting plate and we compare the results against models assuming a constant thermal diffusivity. The importance of the
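
    The aggregate-property step can be illustrated with the classical two-phase Hashin-Shtrikman bounds. The sketch below is a minimal example with assumed, purely illustrative conductivity values and volume fractions; it is not the study's actual mineral data or code.

        def hashin_shtrikman_bounds(k1, f1, k2, f2):
            """Hashin-Shtrikman bounds on the effective conductivity of an
            isotropic two-phase aggregate (requires k1 >= k2, f1 + f2 = 1)."""
            upper = k1 + f2 / (1.0 / (k2 - k1) + f1 / (3.0 * k1))
            lower = k2 + f1 / (1.0 / (k1 - k2) + f2 / (3.0 * k2))
            return lower, upper

        # Hypothetical two-phase aggregate: 60 % of a phase with k ~ 4.5 W/m/K
        # and 40 % of a phase with k ~ 3.0 W/m/K (values illustrative only).
        lo, hi = hashin_shtrikman_bounds(4.5, 0.6, 3.0, 0.4)
        estimate = 0.5 * (lo + hi)   # simple mid-point estimate between the bounds
        print(lo, hi, estimate)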

  15. ARAMIS project: A comprehensive methodology for the identification of reference accident scenarios in process industries

    International Nuclear Information System (INIS)

    Delvosalle, Christian; Fievez, Cecile; Pipart, Aurore; Debray, Bruno

    2006-01-01

    In the frame of the Accidental Risk Assessment Methodology for Industries (ARAMIS) project, this paper aims at presenting the work carried out in the part of the project devoted to the definition of accident scenarios. This topic is a key point in risk assessment and serves as the basis for the whole risk quantification. The first result of the work is the building of a methodology for the identification of major accident hazards (MIMAH), which is carried out with the development of generic fault and event trees based on a typology of equipment and substances. The term 'major accidents' must be understood as the worst accidents likely to occur on the equipment, assuming that no safety systems are installed. A second methodology, called methodology for the identification of reference accident scenarios (MIRAS), takes into account the influence of safety systems on both the frequencies and possible consequences of accidents. This methodology leads to the identification of more realistic accident scenarios. The reference accident scenarios are chosen with the help of a tool called a 'risk matrix', crossing the frequency and the consequences of accidents. This paper presents both methodologies and an application to an ethylene oxide storage

  16. Prototype Development Capabilities of 3D Spatial Interactions and Failures During Scenario Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Steven Prescott; Ramprasad Sampath; Curtis Smith; Tony Koonce

    2014-09-01

    Computers have been used for 3D modeling and simulation, but only recently have computational resources been able to give realistic results in a reasonable time frame for large complex models. This report addresses the methods, techniques, and resources used to develop a prototype that uses a 3D modeling and simulation engine to improve risk analysis and evaluate reactor structures and components for a given scenario. The simulations done for this evaluation were focused on external events, specifically tsunami floods, for a hypothetical nuclear power facility on a coastline.

  17. Based on user interest level of modeling scenarios and browse content

    Science.gov (United States)

    Zhao, Yang

    2017-08-01

    User interest modeling is the core of personalized services; it must take into account the impact of situational (contextual) information on user preferences as well as the user's browsing behavior. This paper proposes a method of user interest modeling based on scenario information. An approximate scenario set for the user's current situation is obtained by calculating situation similarity, and the "user - interest items - scenarios" three-dimensional model is reduced in dimension with a situation pre-filtering method. The topics the user is interested in are identified from the content they browse: the page content is analyzed to extract keywords for each topic of interest, and a vector space model is used to build a hierarchical model of user interest. The experimental results show that the user interest model based on scenario information keeps the error of interest prediction within 9%, which demonstrates its effectiveness.
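
    A minimal sketch of the two steps described above (scenario pre-filtering by situation similarity, followed by a keyword-based vector space interest model) is given below; the situation encoding, similarity threshold and sample pages are assumptions for illustration only.

        import math
        from collections import Counter

        def cosine(u, v):
            dot = sum(u[k] * v.get(k, 0.0) for k in u)
            nu = math.sqrt(sum(x * x for x in u.values()))
            nv = math.sqrt(sum(x * x for x in v.values()))
            return dot / (nu * nv) if nu and nv else 0.0

        # Step 1 (assumed encoding): situations as weighted attributes; keep the
        # historical scenarios most similar to the current one (pre-filtering).
        current = {"time:evening": 1.0, "place:home": 1.0, "device:mobile": 0.5}
        history = {
            "s1": {"time:evening": 1.0, "place:home": 1.0, "device:pc": 0.5},
            "s2": {"time:morning": 1.0, "place:office": 1.0, "device:pc": 0.5},
        }
        similar = [s for s, vec in history.items() if cosine(current, vec) > 0.6]

        # Step 2: keyword-frequency interest vector built from pages browsed in
        # the retained scenarios (toy tokenisation; a real system might use TF-IDF).
        pages_in_similar_scenarios = ["stock market news", "market analysis report"]
        interest = Counter(w for page in pages_in_similar_scenarios for w in page.split())
        print(similar, interest.most_common(3))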

  18. Simulation supported scenario analysis for water resources planning: a case study in northern italy

    Science.gov (United States)

    Facchi, A.; Gandolfi, C.; Ortuani, B.; Maggi, D.

    2003-04-01

    The work presents the results of a comprehensive modelling study of surface and groundwater systems, including the interaction between irrigation and groundwater resources, for the Muzza-Bassa Lodigiana irrigation district, located in the southern part of the densely-settled Lombardia plain (northern Italy). The area, of approximately 700 km2, has been selected because: a) it is representative of agricultural and irrigation practices in a wide portion of the plain of Lombardia; b) it has well defined hydrogeological borders, represented by the Adda, Po, and Lambro rivers (respectively East, South and West) and by the Muzza canal (North). The objective of the study is to assess the impact of land use and irrigation water availability on the distribution of crop water consumption in space and time, as well as on the groundwater resources in this wide portion of the Lombardia plain. To achieve this goal, a number of realistic management scenarios, currently under discussion with the regional water authority, have been taken into account. A standard 'base case' has been defined to allow comparative analysis of the results of different scenarios. To carry out the research, an integrated, distributed, catchment-scale simulation package, already developed and applied to the study area, has been used. The simulation system is based on the integration of two hydrological models - a conceptual vadose zone model and the groundwater model MODFLOW. An interface performs the explicit coupling in space and time between the two models. A GIS manages all the information relevant to the study area, as well as all the input, the spatially distributed parameters and the output of the system. The simulation package has been verified for the years 1999-2000 using land use derived from remote-sensed images, reported water availability for irrigation, observed water stage in rivers as well as groundwater level in the alluvial aquifer system.

  19. Modeling Future Land Use Scenarios in South Korea: Applying the IPCC Special Report on Emissions Scenarios and the SLEUTH Model on a Local Scale

    Science.gov (United States)

    Han, Haejin; Hwang, YunSeop; Ha, Sung Ryong; Kim, Byung Sik

    2015-05-01

    This study developed three scenarios of future land use/land cover on a local level for the Kyung-An River Basin and its vicinity in South Korea at a 30-m resolution based on the two scenario families of the Intergovernmental Panel on Climate Change (IPCC) Special Report Emissions Scenarios (SRES): A2 and B1, as well as a business-as-usual scenario. The IPCC SRES A2 and B1 were used to define future local development patterns and associated land use change. We quantified the population-driven demand for urban land use for each qualitative storyline and allocated the urban demand in geographic space using the SLEUTH model. The model results demonstrate the possible land use/land cover change scenarios for the years from 2000 to 2070 by examining the broad narrative of each SRES within the context of a local setting, such as the Kyoungan River Basin, constructing narratives of local development shifts and modeling a set of `best guess' approximations of the future land use conditions in the study area. This study found substantial differences in demands and patterns of land use changes among the scenarios, indicating compact development patterns under the SRES B1 compared to the rapid and dispersed development under the SRES A2.

  1. A model-based Bayesian solution for characterization of complex damage scenarios in aerospace composite structures.

    Science.gov (United States)

    Reed, H; Leckey, Cara A C; Dick, A; Harvey, G; Dobson, J

    2018-01-01

    Ultrasonic damage detection and characterization is commonly used in nondestructive evaluation (NDE) of aerospace composite components. In recent years there has been an increased development of guided wave based methods. In real materials and structures, these dispersive waves result in complicated behavior in the presence of complex damage scenarios. Model-based characterization methods utilize accurate three dimensional finite element models (FEMs) of guided wave interaction with realistic damage scenarios to aid in defect identification and classification. This work describes an inverse solution for realistic composite damage characterization by comparing the wavenumber-frequency spectra of experimental and simulated ultrasonic inspections. The composite laminate material properties are first verified through a Bayesian solution (Markov chain Monte Carlo), enabling uncertainty quantification surrounding the characterization. A study is undertaken to assess the efficacy of the proposed damage model and comparative metrics between the experimental and simulated output. The FEM is then parameterized with a damage model capable of describing the typical complex damage created by impact events in composites. The damage is characterized through a transdimensional Markov chain Monte Carlo solution, enabling a flexible damage model capable of adapting to the complex damage geometry investigated here. The posterior probability distributions of the individual delamination petals as well as the overall envelope of the damage site are determined. Copyright © 2017 Elsevier B.V. All rights reserved.
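
    The Bayesian material-property verification step can be illustrated with a minimal random-walk Metropolis-Hastings sketch (the one-parameter forward model, noise level, prior bounds, and synthetic data are assumptions standing in for the paper's FEM-based guided-wave simulation):

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_model(stiffness):
    # Hypothetical stand-in for an expensive guided-wave simulation:
    # predicted wavenumber-frequency features as a function of a stiffness parameter.
    return np.sqrt(stiffness) * np.array([1.0, 2.0, 3.0])

observed = forward_model(50.0) + rng.normal(0.0, 0.05, size=3)  # synthetic "measurement"
sigma = 0.05                                                    # assumed measurement noise

def log_posterior(theta):
    if not (10.0 < theta < 200.0):        # flat prior on a plausible range
        return -np.inf
    residual = observed - forward_model(theta)
    return -0.5 * np.sum((residual / sigma) ** 2)

# Random-walk Metropolis-Hastings over the single material parameter.
samples, theta = [], 100.0
logp = log_posterior(theta)
for _ in range(20000):
    proposal = theta + rng.normal(0.0, 2.0)
    logp_new = log_posterior(proposal)
    if np.log(rng.random()) < logp_new - logp:
        theta, logp = proposal, logp_new
    samples.append(theta)

posterior = np.array(samples[5000:])      # discard burn-in
print(posterior.mean(), posterior.std())  # posterior mean and uncertainty
```

    The posterior spread gives the uncertainty quantification that the characterization step then builds on.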

  2. Scalability of regional climate change in Europe for high-end scenarios

    DEFF Research Database (Denmark)

    Christensen, O. B.; Yang, S.; Boberg, F.

    2015-01-01

    With the help of a simulation using the global circulation model (GCM) EC-Earth, downscaled over Europe with the regional model DMI-HIRHAM5 at a 25 km grid point distance, we investigated regional climate change corresponding to 6°C of global warming to investigate whether regional climate change...... are close to the RCP8.5 emission scenario. We investigated the extent to which pattern scaling holds, i.e. the approximation that the amplitude of any climate change will be approximately proportional to the amount of global warming. We address this question through a comparison of climate change results...... from downscaling simulations over the same integration domain, but for different driving and regional models and scenarios, mostly from the EU ENSEMBLES project. For almost all quantities investigated, pattern scaling seemed to apply to the 6° simulation. This indicates that the single 6° simulation...
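
    Pattern scaling itself is a very simple operation, sketched below with made-up numbers (the gridded change pattern, the reference global warming, and the 6 K target are purely illustrative): the regional change field from one simulation is normalised by its global-mean warming and rescaled to another warming level.

```python
import numpy as np

# Hypothetical regional temperature-change pattern (K) from a downscaled simulation
# that corresponds to a global-mean warming of 2.0 K.
pattern_2k = np.array([[1.8, 2.1, 2.4],
                       [2.0, 2.3, 2.7],
                       [2.2, 2.6, 3.1]])
global_warming_ref = 2.0  # K

# Pattern scaling: assume local change is proportional to global-mean warming.
normalized_pattern = pattern_2k / global_warming_ref  # K of local change per K of global change

target_warming = 6.0  # K, e.g. a high-end scenario
scaled_pattern = normalized_pattern * target_warming
print(scaled_pattern)  # approximation to the 6 K regional change field
```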

  3. Developing spatially explicit footprints of plausible land-use scenarios in the Santa Cruz Watershed, Arizona and Sonora

    Science.gov (United States)

    Norman, Laura M.; Feller, Mark; Villarreal, Miguel L.

    2012-01-01

    The SLEUTH urban growth model is applied to a binational dryland watershed to envision and evaluate plausible future scenarios of land use change into the year 2050. Our objective was to create a suite of geospatial footprints portraying potential land use change that can be used to aid binational decision-makers in assessing the impacts relative to sustainability of natural resources and potential socio-ecological consequences of proposed land-use management. Three alternatives are designed to simulate different conditions: (i) a Current Trends Scenario of unmanaged exponential growth, (ii) a Conservation Scenario with managed growth to protect the environment, and (iii) a Megalopolis Scenario in which growth is accentuated around a defined international trade corridor. The model was calibrated with historical data extracted from a time series of satellite images. Model materials, methodology, and results are presented. Our Current Trends Scenario predicts the footprint of urban growth to approximately triple from 2009 to 2050, which is corroborated by local population estimates. The Conservation Scenario results in protecting 46% more of the Evergreen class (more than 150,000 acres) than the Current Trends Scenario and approximately 95,000 acres of Barren Land, Crops, Deciduous Forest (Mesquite Bosque), Grassland/Herbaceous, Urban/Recreational Grasses, and Wetlands classes combined. The Megalopolis Scenario results also depict the preservation of some of these land-use classes compared to the Current Trends Scenario, most notably in the environmentally important headwaters region. Connectivity and areal extent of land cover types that provide wildlife habitat were preserved under the alternative scenarios when compared to Current Trends.

  4. A Local-Realistic Model of Quantum Mechanics Based on a Discrete Spacetime

    Science.gov (United States)

    Sciarretta, Antonio

    2018-01-01

    This paper presents a realistic, stochastic, and local model that reproduces nonrelativistic quantum mechanics (QM) results without using its mathematical formulation. The proposed model only uses integer-valued quantities and operations on probabilities, in particular assuming a discrete spacetime under the form of a Euclidean lattice. Individual (spinless) particle trajectories are described as random walks. Transition probabilities are simple functions of a few quantities that are either randomly associated to the particles during their preparation, or stored in the lattice nodes they visit during the walk. QM predictions are retrieved as probability distributions of similarly-prepared ensembles of particles. The scenarios considered to assess the model comprise of free particle, constant external force, harmonic oscillator, particle in a box, the Delta potential, particle on a ring, particle on a sphere and include quantization of energy levels and angular momentum, as well as momentum entanglement.
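
    The general idea of recovering probability distributions from similarly-prepared ensembles of lattice random walks can be illustrated with a simple unbiased walk (this is not the paper's transition rule, which depends on quantities carried by the particles and stored in lattice nodes):

```python
import numpy as np

rng = np.random.default_rng(1)

n_particles = 100000   # similarly-prepared ensemble
n_steps = 200          # discrete time steps on a 1D integer lattice

# Each particle starts at the origin and takes +1/-1 steps with equal probability.
steps = rng.choice([-1, 1], size=(n_particles, n_steps))
positions = steps.sum(axis=1)

# The ensemble histogram plays the role of the predicted probability distribution.
values, counts = np.unique(positions, return_counts=True)
prob = counts / n_particles
print(values[:5], prob[:5])
```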

  5. Electronic and Optical Properties of CuO Based on DFT+U and GW Approximation

    International Nuclear Information System (INIS)

    Ahmad, F; Agusta, M K; Dipojono, H K

    2016-01-01

    We report ab initio calculations of the electronic structure and optical properties of monoclinic CuO based on DFT+U and the GW approximation. CuO is an antiferromagnetic material with strong electron correlations. Our calculations show that DFT+U and the GW approximation are sufficiently reliable to investigate the material properties of CuO. The band gap calculated with DFT+U for reasonable values of U is slightly underestimated. The use of the GW approximation requires adjustment of the U value to obtain a realistic result. Hybridization of Cu 3dxz and 3dyz with O 2p plays an important role in the formation of the band gap. The optical properties calculated with DFT+U and GW corrections by solving the Bethe-Salpeter equation are in good agreement with the calculated electronic properties and the experimental results. (paper)

  6. Development of vortex model with realistic axial velocity distribution

    International Nuclear Information System (INIS)

    Ito, Kei; Ezure, Toshiki; Ohshima, Hiroyuki

    2014-01-01

    A vortex is considered one of the significant phenomena which may cause gas entrainment (GE) and/or vortex cavitation in sodium-cooled fast reactors. In our past studies, the vortex was assumed to be approximated by the well-known Burgers vortex model. However, the Burgers vortex model makes the simple but unrealistic assumption that the axial velocity component is horizontally constant, while in reality the free-surface vortex has an axial velocity distribution with a large radial gradient near the vortex center. In this study, a new vortex model with a realistic axial velocity distribution is proposed. This model is derived from the steady axisymmetric Navier-Stokes equation, as is the Burgers vortex model, but a realistic radial distribution of the axial velocity is considered, defined to be zero at the vortex center and to approach zero asymptotically at infinity. As verification, the new vortex model is applied to the evaluation of a simple vortex experiment, and shows good agreement with the experimental data in terms of the circumferential velocity distribution and the free surface shape. In addition, it is confirmed that the Burgers vortex model fails to calculate an accurate velocity distribution under the assumption of uniform axial velocity. However, the calculation accuracy of the Burgers vortex model can be brought close to that of the new vortex model by considering an effective axial velocity, calculated as the average value only in the vicinity of the vortex center. (author)
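
    For reference, the circumferential velocity profile of the Burgers vortex used here as the baseline model can be evaluated directly; the sketch below uses arbitrary illustrative values for the circulation, viscosity, and axial strain rate.

```python
import numpy as np

def burgers_v_theta(r, gamma, nu, a):
    """Circumferential velocity of a Burgers vortex.

    gamma : circulation far from the axis
    nu    : kinematic viscosity
    a     : axial strain rate (suction intensity)
    """
    r = np.asarray(r, dtype=float)
    core = 1.0 - np.exp(-a * r**2 / (2.0 * nu))
    with np.errstate(divide="ignore", invalid="ignore"):
        v = gamma / (2.0 * np.pi * r) * core
    return np.where(r == 0.0, 0.0, v)   # regular (zero) velocity on the axis

# Illustrative parameter values only.
r = np.linspace(0.0, 0.1, 101)           # m
v = burgers_v_theta(r, gamma=0.05, nu=1.0e-6, a=0.1)
print(v.max())                            # peak swirl velocity near the core radius
```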

  7. The HayWired Earthquake Scenario

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    Foreword. The 1906 Great San Francisco earthquake (magnitude 7.8) and the 1989 Loma Prieta earthquake (magnitude 6.9) each motivated residents of the San Francisco Bay region to build countermeasures to earthquakes into the fabric of the region. Since Loma Prieta, bay-region communities, governments, and utilities have invested tens of billions of dollars in seismic upgrades and retrofits and replacements of older buildings and infrastructure. Innovation and state-of-the-art engineering, informed by science, including novel seismic-hazard assessments, have been applied to the challenge of increasing seismic resilience throughout the bay region. However, as long as people live and work in seismically vulnerable buildings or rely on seismically vulnerable transportation and utilities, more work remains to be done. With that in mind, the U.S. Geological Survey (USGS) and its partners developed the HayWired scenario as a tool to enable further actions that can change the outcome when the next major earthquake strikes. By illuminating the likely impacts to the present-day built environment, well-constructed scenarios can and have spurred officials and citizens to take steps that change the outcomes the scenario describes, whether used to guide more realistic response and recovery exercises or to launch mitigation measures that will reduce future risk. The HayWired scenario is the latest in a series of like-minded efforts to bring a special focus onto the impacts that could occur when the Hayward Fault again ruptures through the east side of the San Francisco Bay region as it last did in 1868. Cities in the east bay along the Richmond, Oakland, and Fremont corridor would be hit hardest by earthquake ground shaking, surface fault rupture, aftershocks, and fault afterslip, but the impacts would reach throughout the bay region and far beyond. The HayWired scenario name reflects our increased reliance on the Internet and telecommunications and also alludes to the

  8. Adapting realist synthesis methodology: The case of workplace harassment interventions.

    Science.gov (United States)

    Carr, Tracey; Quinlan, Elizabeth; Robertson, Susan; Gerrard, Angie

    2017-12-01

    Realist synthesis techniques can be used to assess complex interventions by extracting and synthesizing configurations of contexts, mechanisms, and outcomes found in the literature. Our novel and multi-pronged approach to the realist synthesis of workplace harassment interventions describes our pursuit of theory to link macro and program level theories. After discovering the limitations of a dogmatic approach to realist synthesis, we adapted our search strategy and focused our analysis on a subset of data. We tailored our realist synthesis to understand how, why, and under what circumstances workplace harassment interventions are effective. The result was a conceptual framework to test our theory-based interventions and provide the basis for subsequent realist evaluation. Our experience documented in this article contributes to an understanding of how, under what circumstances, and with what consequences realist synthesis principles can be customized. Copyright © 2017 John Wiley & Sons, Ltd.

  9. Relic abundance of WIMPs in non-standard cosmological scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Yimingniyazi, W.

    2007-08-06

    In this thesis we study the relic density n_χ of non-relativistic long-lived or stable particles χ in various non-standard cosmological scenarios. First, we discuss the relic density in the non-standard cosmological scenario in which the temperature is too low for the particles χ to achieve full chemical equilibrium. We also investigated the case where χ particles are non-thermally produced from the decay of heavier particles in addition to the usual thermal production. In the low temperature scenario, we calculate the relic abundance starting from arbitrary initial temperatures T_0 of the radiation-dominated epoch and derive approximate solutions for the temperature dependence of the relic density which can accurately reproduce numerical results when full thermal equilibrium is not achieved. If full equilibrium is reached, our ansatz no longer reproduces the correct temperature dependence of the χ number density. However, we can contrive a semi-analytic formula which gives the correct final relic density, to an accuracy of about 3% or better, for all cross sections and initial temperatures. We also derive the lower bound on the initial temperature T_0, assuming that the relic particle accounts for the dark matter energy density in the universe. The observed cold dark matter abundance constrains the initial temperature to T_0 ≥ m_χ/23, where m_χ is the mass of χ. Second, we discuss the χ density in the scenario where the Hubble parameter is modified. Even in this case, an approximate formula similar to the standard one is found to be capable of predicting the final relic abundance correctly. Choosing the χ annihilation cross section such that the observed cold dark matter abundance is reproduced in standard cosmology, we constrain possible modifications of the expansion rate at T ∝ m_χ/20, well before Big Bang Nucleosynthesis. (orig.)

  10. Relic abundance of WIMPs in non-standard cosmological scenarios

    International Nuclear Information System (INIS)

    Yimingniyazi, W.

    2007-01-01

    In this thesis we study the relic density n_χ of non-relativistic long-lived or stable particles χ in various non-standard cosmological scenarios. First, we discuss the relic density in the non-standard cosmological scenario in which the temperature is too low for the particles χ to achieve full chemical equilibrium. We also investigated the case where χ particles are non-thermally produced from the decay of heavier particles in addition to the usual thermal production. In the low temperature scenario, we calculate the relic abundance starting from arbitrary initial temperatures T_0 of the radiation-dominated epoch and derive approximate solutions for the temperature dependence of the relic density which can accurately reproduce numerical results when full thermal equilibrium is not achieved. If full equilibrium is reached, our ansatz no longer reproduces the correct temperature dependence of the χ number density. However, we can contrive a semi-analytic formula which gives the correct final relic density, to an accuracy of about 3% or better, for all cross sections and initial temperatures. We also derive the lower bound on the initial temperature T_0, assuming that the relic particle accounts for the dark matter energy density in the universe. The observed cold dark matter abundance constrains the initial temperature to T_0 ≥ m_χ/23, where m_χ is the mass of χ. Second, we discuss the χ density in the scenario where the Hubble parameter is modified. Even in this case, an approximate formula similar to the standard one is found to be capable of predicting the final relic abundance correctly. Choosing the χ annihilation cross section such that the observed cold dark matter abundance is reproduced in standard cosmology, we constrain possible modifications of the expansion rate at T ∝ m_χ/20, well before Big Bang Nucleosynthesis. (orig.)
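
    The standard freeze-out calculation that such approximate formulas are benchmarked against reduces to a single ordinary differential equation for the comoving abundance Y = n_χ/s as a function of x = m_χ/T. A dimensionless toy version is sketched below (the coupling λ and the simple non-relativistic equilibrium abundance are placeholders, not values from the thesis):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Dimensionless freeze-out toy model: x = m_chi / T, Y = n_chi / s.
lam = 1.0e9                      # placeholder for s(m) * <sigma v> / H(m)

def y_eq(x):
    # Non-relativistic equilibrium abundance (normalization absorbed into lam).
    return 0.145 * x**1.5 * np.exp(-x)

def boltzmann(x, y):
    # dY/dx = -(lam / x^2) (Y^2 - Y_eq^2)
    return [-(lam / x**2) * (y[0]**2 - y_eq(x)**2)]

x_span = (1.0, 1000.0)
sol = solve_ivp(boltzmann, x_span, [y_eq(x_span[0])],
                method="LSODA", rtol=1e-8, atol=1e-30)

print(sol.y[0, -1])   # relic comoving abundance Y at large x
```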

  11. Realistic training scenario simulations and simulation techniques

    Energy Technology Data Exchange (ETDEWEB)

    Dunlop, William H.; Koncher, Tawny R.; Luke, Stanley John; Sweeney, Jerry Joseph; White, Gregory K.

    2017-12-05

    In one embodiment, a system includes a signal generator operatively coupleable to one or more detectors; and a controller, the controller being both operably coupled to the signal generator and configured to cause the signal generator to: generate one or more signals each signal being representative of at least one emergency event; and communicate one or more of the generated signal(s) to a detector to which the signal generator is operably coupled. In another embodiment, a method includes: receiving data corresponding to one or more emergency events; generating at least one signal based on the data; and communicating the generated signal(s) to a detector.

  12. Simulation of the ATLAS SCT barrel module response to LHC beam loss scenarios

    CERN Document Server

    Rose, P; The ATLAS collaboration; Fadeyev, V; Spencer, E; Wilder, M; Domingo, M

    2014-01-01

    In the event of beam loss at the LHC, ATLAS Inner Detector components nearest the beam line may be subjected to unusually large amounts of radiation. Understanding their behavior in such an event is important in determining whether they would still function properly. We built a SPICE model of the silicon strip module electrical system to determine the behavior of its elements during a realistic beam loss scenario. We found that the power supply and bias filter characteristics strongly affect the module response in such scenarios. In particular, the following self-limiting phenomena were observed: there is a finite amount of charge initially available on the bias filter capacitors for collection by the strips; the power supply current limit reduces the rate at which the bias filter capacitors' charge can be replenished; the reduced bias voltage leads to a smaller depletion depth in the sensors which results in less collected charge. These effects provide a larger measure of safety during beam loss events than ...

  13. Simulation of the ATLAS SCT Barrel Module Response to LHC Beam Loss Scenarios

    CERN Document Server

    Rose, P; The ATLAS collaboration; Fadeyev, V; Spencer, E; Wilder, M; Domingo, M

    2013-01-01

    In the event of beam loss at the LHC, ATLAS Inner Detector components nearest the beamline may be subjected to unusually large amounts of radiation. Understanding their behavior in such an event is important in determining whether they would still function properly. We built a SPICE model of the silicon strip module electrical system to determine the behavior of its elements during a realistic beam loss scenario. We found that the power supply and bias filter characteristics strongly affect the module response in such scenarios. In particular, the following self-limiting phenomena were observed: there is a finite amount of charge initially available on the bias filter capacitors for collection by the strips; the power supply current limit reduces the rate at which the bias filter capacitors' charge can be replenished; the reduced bias voltage leads to a smaller depletion depth which results in less collected charge. These effects provide a larger measure of safety during beam loss events than we have previous...
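
    The 'finite available charge' argument can be made concrete with a back-of-the-envelope sketch (the capacitance, bias voltage, current limit, and depletion voltage below are invented illustrative numbers, not ATLAS SCT parameters):

```python
import numpy as np

# Illustrative, not actual ATLAS SCT parameters.
c_filter = 100e-9        # F, bias-filter capacitance
v_bias = 150.0           # V, nominal bias voltage
i_limit = 5e-3           # A, power-supply current limit
v_depletion = 70.0       # V, full-depletion voltage of the sensor

# Charge initially stored on the filter capacitors and available to the strips.
q_available = c_filter * v_bias
print(f"available charge: {q_available * 1e6:.1f} uC")

# Time needed for the current-limited supply to replenish that charge.
t_recharge = q_available / i_limit
print(f"recharge time at current limit: {t_recharge * 1e3:.1f} ms")

# Depletion depth scales roughly with sqrt(V) below full depletion, so a sagging
# bias voltage reduces the collected charge.
def relative_depletion_depth(v):
    return np.sqrt(np.clip(v, 0.0, v_depletion) / v_depletion)

for v in (150.0, 70.0, 30.0, 10.0):
    print(v, relative_depletion_depth(v))
```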

  14. Realist synthesis: illustrating the method for implementation research

    Directory of Open Access Journals (Sweden)

    Rycroft-Malone Jo

    2012-04-01

    Full Text Available Abstract Background Realist synthesis is an increasingly popular approach to the review and synthesis of evidence, which focuses on understanding the mechanisms by which an intervention works (or not). There are few published examples of realist synthesis. This paper therefore fills a gap by describing, in detail, the process used for a realist review and synthesis to answer the question ‘what interventions and strategies are effective in enabling evidence-informed healthcare?’ The strengths and challenges of conducting realist review are also considered. Methods The realist approach involves identifying underlying causal mechanisms and exploring how they work under what conditions. The stages of this review included: defining the scope of the review (concept mining and framework formulation); searching for and scrutinising the evidence; extracting and synthesising the evidence; and developing the narrative, including hypotheses. Results Based on key terms and concepts related to various interventions to promote evidence-informed healthcare, we developed an outcome-focused theoretical framework. Questions were tailored for each of four theory/intervention areas within the theoretical framework and were used to guide development of a review and data extraction process. The search for literature within our first theory area, change agency, was executed and the screening procedure resulted in inclusion of 52 papers. Using the questions relevant to this theory area, data were extracted by one reviewer and validated by a second reviewer. Synthesis involved organisation of extracted data into evidence tables, theming and formulation of chains of inference, linking between the chains of inference, and hypothesis formulation. The narrative was developed around the hypotheses generated within the change agency theory area. Conclusions Realist synthesis lends itself to the review of complex interventions because it accounts for context as well as

  15. Socialist realist Keskküla

    Index Scriptorium Estoniae

    1998-01-01

    The monograph "Socialist Realist Painting" by the English art critic Matthew Cullerne Bown, published in London in 1998, covers the Estonian artists Enn Põldroos, Nikolai Kormashov and Ando Keskküla, and includes reproductions of paintings by Kormashov and Keskküla

  16. Simulations of electromagnetic effects in high-frequency capacitively coupled discharges using the Darwin approximation

    International Nuclear Information System (INIS)

    Eremin, Denis; Hemke, Torben; Brinkmann, Ralf Peter; Mussenbrock, Thomas

    2013-01-01

    The Darwin approximation is investigated for its possible use in simulation of electromagnetic effects in large size, high-frequency capacitively coupled discharges. The approximation is utilized within the framework of two different fluid models which are applied to typical cases showing pronounced standing wave and skin effects. With the first model it is demonstrated that the Darwin approximation is valid for treatment of such effects in the range of parameters under consideration. The second approach, a reduced nonlinear Darwin approximation-based model, shows that the electromagnetic phenomena persist in a more realistic setting. The Darwin approximation offers a simple and efficient way of carrying out electromagnetic simulations as it removes the Courant condition plaguing explicit electromagnetic algorithms and can be implemented as a straightforward modification of electrostatic algorithms. The algorithm described here avoids iterative schemes needed for the divergence cleaning and represents a fast and efficient solver, which can be used in fluid and kinetic models for self-consistent description of technical plasmas exhibiting certain electromagnetic activity. (paper)

  17. Estimating the water table under the Radioactive Waste Management Site in Area 5 of the Nevada Test Site: The Dupuit-Forcheimer approximation

    International Nuclear Information System (INIS)

    Lindstrom, F.T.; Barker, L.E.; Cawlfield, D.E.; Daffern, D.D.; Dozier, B.L.; Emer, D.F.; Strong, W.R.

    1992-01-01

    To adequately manage the low-level nuclear waste (LLW) repository in Area 5 of the Nevada Test Site (NTS), a knowledge of the water table under the site is paramount. The estimated thickness of the arid intermountain basin alluvium is roughly 900 feet. Very little reliable water table data for Area 5 currently exists. The Special Projects Section of the Reynolds Electrical & Engineering Co., Inc. Waste Management Department is currently formulating a long-range drilling and sampling plan in support of a Resource Conservation and Recovery Act (RCRA) Part B permit waiver for groundwater monitoring and liner systems. An estimate of the water table under the LLW repository, called the Radioactive Waste Management Site (RWMS) in Area 5, is needed for the drilling and sampling plan. Very old water table elevation estimates at about a dozen widely scattered test drill holes, as well as water wells, are available from declassified US Geological Survey, Lawrence Livermore National Laboratory, and Los Alamos National Laboratory drilling logs. A three-dimensional steady-state water-flow equation for estimating the water table elevation under a thick, very dry vadose zone is developed using the Dupuit assumption. A prescribed positive vertical downward infiltration/evaporation condition is assumed at the atmosphere/soil interface. An approximation to the square of the elevation head, based upon multivariate cubic interpolation methods, is introduced. The approximation is forced to satisfy the governing elliptic (Poisson) partial differential equation over the domain of definition. The remaining coefficients are determined by interpolating the water table at eight 'boundary points'. Several realistic scenarios approximating the water table under the RWMS in Area 5 of the NTS are discussed
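
    Under the Dupuit assumption, the steady water-table problem reduces to a Poisson equation for the square of the head, ∇²(h²) = −2N/K, which can be solved with a few lines of finite differences; the recharge rate, conductivity, grid, and boundary head below are arbitrary illustrative values, not Area 5 data.

```python
import numpy as np

# Illustrative parameters (not site data).
N = 1.0e-4          # recharge/infiltration rate, m/day
K = 1.0             # hydraulic conductivity, m/day
nx, ny, dx = 50, 50, 100.0   # grid and spacing in metres

# Solve laplacian(u) = -2*N/K for u = h**2 with fixed-head boundaries.
u = np.full((ny, nx), 100.0**2)     # initial guess and boundary value (h = 100 m)
rhs = -2.0 * N / K

for _ in range(5000):               # simple Jacobi iteration
    u_new = u.copy()
    u_new[1:-1, 1:-1] = 0.25 * (u[1:-1, :-2] + u[1:-1, 2:] +
                                u[:-2, 1:-1] + u[2:, 1:-1] - rhs * dx**2)
    u = u_new

h = np.sqrt(u)                      # water-table elevation above the datum
print(h.min(), h.max())             # mounding above the fixed boundary head
```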

  18. Biomass Scenario Model Scenario Library: Definitions, Construction, and Description

    Energy Technology Data Exchange (ETDEWEB)

    Inman, D.; Vimmerstedt, L.; Bush, B.; Peterson, S.

    2014-04-01

    Understanding the development of the biofuels industry in the United States is important to policymakers and industry. The Biomass Scenario Model (BSM) is a system dynamics model of the biomass-to-biofuels system that can be used to explore policy effects on biofuels development. Because of the complexity of the model, as well as the wide range of possible future conditions that affect biofuels industry development, we have not developed a single reference case but instead developed a set of specific scenarios that provide various contexts for our analyses. The purpose of this report is to describe the scenarios that comprise the BSM scenario library. At present, the library contains the following policy-focused scenarios: minimal policies, ethanol-focused policies, equal access to policies, output-focused policies, technological-diversity-focused policies, and point-of-production-focused policies. This report describes each scenario, its policy settings, and general insights gained through use of the scenarios in analytic studies.

  19. Are there realistically interpretable local theories?

    International Nuclear Information System (INIS)

    d'Espagnat, B.

    1989-01-01

    Although it rests on strongly established proofs, the statement that no realistically interpretable local theory is compatible with some experimentally testable predictions of quantum mechanics seems at first sight to be incompatible with a few general ideas and clear-cut statements occurring in recent theoretical work by Griffiths, Omnes, and Ballentine and Jarrett. It is shown here that in fact none of the developments due to these authors can be considered as a realistically interpretable local theory, so that there is no valid reason for suspecting that the existing proofs of the statement in question are all flawed

  20. Realistic Real-Time Outdoor Rendering in Augmented Reality

    Science.gov (United States)

    Kolivand, Hoshang; Sunar, Mohd Shahrizal

    2014-01-01

    Realistic rendering techniques for outdoor Augmented Reality (AR) have been an attractive topic over the last two decades, considering the sizeable amount of publications in computer graphics. Realistic virtual objects in outdoor rendering AR systems require sophisticated effects such as: shadows, daylight and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are related to non real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposes a much newer, unique technique to achieve realistic real-time outdoor rendering, while taking into account the interaction between sky colours and objects in AR systems with respect to shadows in any specific location, date and time. This approach involves three main phases, which cover different outdoor AR rendering requirements. Firstly, sky colour was generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique has significantly improved the realism of real-time outdoor AR rendering, thus solving the problem of realistic AR systems. PMID:25268480

  1. Realistic real-time outdoor rendering in augmented reality.

    Directory of Open Access Journals (Sweden)

    Hoshang Kolivand

    Full Text Available Realistic rendering techniques for outdoor Augmented Reality (AR) have been an attractive topic over the last two decades, considering the sizeable amount of publications in computer graphics. Realistic virtual objects in outdoor rendering AR systems require sophisticated effects such as: shadows, daylight and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are related to non real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposes a much newer, unique technique to achieve realistic real-time outdoor rendering, while taking into account the interaction between sky colours and objects in AR systems with respect to shadows in any specific location, date and time. This approach involves three main phases, which cover different outdoor AR rendering requirements. Firstly, sky colour was generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique has significantly improved the realism of real-time outdoor AR rendering, thus solving the problem of realistic AR systems.

  2. 'Semi-realistic' F-term inflation model building in supergravity

    International Nuclear Information System (INIS)

    Kain, Ben

    2008-01-01

    We describe methods for building 'semi-realistic' models of F-term inflation. By semi-realistic we mean that they are built in, and obey the requirements of, 'semi-realistic' particle physics models. The particle physics models are taken to be effective supergravity theories derived from orbifold compactifications of string theory, and their requirements are taken to be modular invariance, absence of mass terms and stabilization of moduli. We review the particle physics models, their requirements and tools and methods for building inflation models

  3. Generating realistic images using Kray

    Science.gov (United States)

    Tanski, Grzegorz

    2004-07-01

    Kray is an application for creating realistic images. It is written in the C++ programming language, has a text-based interface, and solves the global illumination problem using techniques such as radiosity, path tracing and photon mapping.

  4. Scenario planning.

    Science.gov (United States)

    Enzmann, Dieter R; Beauchamp, Norman J; Norbash, Alexander

    2011-03-01

    In facing future developments in health care, scenario planning offers a complementary approach to traditional strategic planning. Whereas traditional strategic planning typically consists of predicting the future at a single point on a chosen time horizon and mapping the preferred plans to address such a future, scenario planning creates stories about multiple likely potential futures on a given time horizon and maps the preferred plans to address the multiple described potential futures. Each scenario is purposefully different and specifically not a consensus worst-case, average, or best-case forecast; nor is scenario planning a process in probabilistic prediction. Scenario planning focuses on high-impact, uncertain driving forces that in the authors' example affect the field of radiology. Uncertainty is the key concept as these forces are mapped onto axes of uncertainty, the poles of which have opposed effects on radiology. One chosen axis was "market focus," with poles of centralized health care (government control) vs a decentralized private market. Another axis was "radiology's business model," with one pole being a unified, single specialty vs a splintered, disaggregated subspecialty. The third axis was "technology and science," with one pole representing technology enabling to radiology vs technology threatening to radiology. Selected poles of these axes were then combined to create 3 scenarios. One scenario, termed "entrepreneurialism," consisted of a decentralized private market, a disaggregated business model, and threatening technology and science. A second scenario, termed "socialized medicine," had a centralized market focus, a unified specialty business model, and enabling technology and science. A third scenario, termed "freefall," had a centralized market focus, a disaggregated business model, and threatening technology and science. These scenarios provide a range of futures that ultimately allow the identification of defined "signposts" that can

  5. A stepwise regression tree for nonlinear approximation: applications to estimating subpixel land cover

    Science.gov (United States)

    Huang, C.; Townshend, J.R.G.

    2003-01-01

    A stepwise regression tree (SRT) algorithm was developed for approximating complex nonlinear relationships. Based on the regression tree of Breiman et al. (BRT) and a stepwise linear regression (SLR) method, this algorithm represents an improvement over SLR in that it can approximate nonlinear relationships, and over BRT in that it gives more realistic predictions. The applicability of this method to estimating subpixel forest was demonstrated using three test data sets, on all of which it gave more accurate predictions than SLR and BRT. SRT also generated more compact trees and performed better than or at least as well as BRT at all 10 equal forest proportion intervals ranging from 0 to 100%. This method is appealing for estimating subpixel land cover over large areas.
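
    The general idea of a regression tree with linear models fitted in its leaves can be sketched with standard scikit-learn pieces (this illustrative combination is not the authors' SRT algorithm, and the synthetic data are invented):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic nonlinear relationship between spectral bands and forest proportion.
X = rng.uniform(0.0, 1.0, size=(2000, 3))
y = np.clip(np.sin(3 * X[:, 0]) * 0.5 + 0.4 * X[:, 1] + 0.1 * rng.normal(size=2000), 0, 1)

# Step 1: a shallow regression tree partitions the feature space.
tree = DecisionTreeRegressor(max_leaf_nodes=8, min_samples_leaf=50).fit(X, y)
leaf_ids = tree.apply(X)

# Step 2: a separate linear regression is fitted within each leaf,
# giving smoother predictions than the tree's piecewise-constant output.
leaf_models = {leaf: LinearRegression().fit(X[leaf_ids == leaf], y[leaf_ids == leaf])
               for leaf in np.unique(leaf_ids)}

def predict(X_new):
    leaves = tree.apply(X_new)
    out = np.empty(len(X_new))
    for leaf in np.unique(leaves):
        mask = leaves == leaf
        out[mask] = leaf_models[leaf].predict(X_new[mask])
    return np.clip(out, 0.0, 1.0)   # keep proportions in a realistic range

print(predict(X[:5]), y[:5])
```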

  6. Designing virtual audiences for fear of public speaking training - an observation study on realistic nonverbal behavior.

    Science.gov (United States)

    Poeschl, Sandra; Doering, Nicola

    2012-01-01

    Virtual Reality technology offers great possibilities for Cognitive Behavioral Therapy of fear of public speaking: Clients can be exposed to virtual fear-triggering stimuli (exposure) and are able to role-play in virtual environments, training social skills to overcome their fear. Usually, prototypical audience behavior (neutral, social and anti-social) serves as stimulus in virtual training sessions, although there is significant lack of theoretical basis on typical audience behavior. The study presented deals with the design of a realistic virtual presentation scenario. An audience (consisting of n=18 men and women) in an undergraduate seminar was observed during three frontal lecture sessions. Behavior frequency of four nonverbal dimensions (eye contact, facial expression, gesture, and posture) was rated by means of a quantitative content analysis. Results show audience behavior patterns which seem to be typical in frontal lecture contexts, like friendly and neutral face expressions. Additionally, combined and even synchronized behavioral patterns between participants who sit next to each other (like turning to the neighbor and start talking) were registered. The gathered data serve as empirical design basis for a virtual audience to be used in virtual training applications that stimulate the experiences of the participants in a realistic manner, thereby improving the experienced presence in the training application.

  7. Calculations of NTM stabilization in ITER-FEAT by ECCD with realistic antenna geometry

    International Nuclear Information System (INIS)

    Ramponi, G.; Nowak, S.; Lazzaro, E.; Giruzzi, G.; Bosia, G.

    2001-01-01

    Neoclassical Tearing Modes stabilization is one of the main purposes for the implementation of an Electron Cyclotron Current Drive system on ITER-FEAT. Previous estimates have shown that a wave power level of 20-30 MW should be appropriate for a substantial reduction of the (3,2) and/or (2,1) modes. Here detailed calculations are presented combining, for the first time, the following elements: i) realistic antenna geometry resulting from detailed study of the implementation in an ITER upper port; ii) Gaussian beam-tracing calculations; iii) 3D Fokker-Planck calculations of the driven current density profile; iv) island evolution calculation, including island rotation effects. The power level necessary for complete stabilization of NTMs is evaluated for the ITER FEAT reference scenarios and the chosen wave frequency of 170 GHz. Optimization as a function of the injection poloidal and toroidal angles is discussed

  8. A Novel Cooperation-Based Network Coding Scheme for Walking Scenarios in WBANs

    Directory of Open Access Journals (Sweden)

    Hongyun Zhang

    2017-01-01

    Full Text Available In Wireless Body Area Networks (WBANs), the tradeoff between network throughput and energy efficiency remains a key challenge. Most current transmission schemes try to cope with the challenge from the perspective of general Wireless Sensor Networks (WSNs), which may not take the peculiarities of WBAN channels into account. In this paper, we take advantage of the correlation of on-body channels in walking scenarios to achieve a better tradeoff between throughput and energy consumption. We first analyze the characteristics of on-body channels based on realistic channel gain datasets, which are collected by our customized wireless transceivers in walking scenarios. The analytical results confirm the rationale of our newly proposed transmission scheme A3NC, which explores the combination of the aggregative allocation (AA) mechanism in MAC layer and the Analog Network Coding (ANC) technique in PHY layer. Both theoretical analyses and simulation results show that the A3NC scheme achieves significant improvement in upload throughput and energy efficiency, compared to the conventional approaches.

  9. The Development of a Novel Perfused Cadaver Model With Dynamic Vital Sign Regulation and Real-World Scenarios to Teach Surgical Skills and Error Management.

    Science.gov (United States)

    Minneti, Michael; Baker, Craig J; Sullivan, Maura E

    The landscape of graduate medical education has changed dramatically over the past decade and the traditional apprenticeship model has undergone scrutiny and modifications. The mandate of the 80-hour work-week, the introduction of integrated residency programs, increased global awareness about patient safety along with financial constraints have spurred changes in graduate educational practices. In addition, new technologies, more complex procedures, and a host of external constraints have changed where and how we teach technical and procedural skills. Simulation-based training has been embraced by the surgical community and has quickly become an essential component of most residency programs as a method to add efficacy to the traditional learning model. The purpose of this paper is twofold: (1) to describe the development of a perfused cadaver model with dynamic vital sign regulation, and (2) to assess the impact of a curriculum using this model and real world scenarios to teach surgical skills and error management. By providing a realistic training environment our aim is to enhance the acquisition of surgical skills and provide a more thorough assessment of resident performance. Twenty-six learners participated in the scenarios. Qualitative data showed that participants felt that the simulation model was realistic, and that participating in the scenarios helped them gain new knowledge, learn new surgical techniques and increase their confidence performing the skill in a clinical setting. Identifying the importance of both technical and nontechnical skills in surgical education has hastened the need for more realistic simulators and environments in which they are placed. Team members should be able to interact in ways that allow for a global display of their skills thus helping to provide a more comprehensive assessment by faculty and learners. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  10. Pattern recognition techniques and neo-deterministic seismic hazard: Time dependent scenarios for North-Eastern Italy

    International Nuclear Information System (INIS)

    Peresan, A.; Vaccari, F.; Panza, G.F.; Zuccolo, E.; Gorshkov, A.

    2009-05-01

    An integrated neo-deterministic approach to seismic hazard assessment has been developed that combines different pattern recognition techniques, designed for the space-time identification of strong earthquakes, with algorithms for the realistic modeling of seismic ground motion. The integrated approach allows for a time dependent definition of the seismic input, through the routine updating of earthquake predictions. The scenarios of expected ground motion, associated with the alarmed areas, are defined by means of full waveform modeling. A set of neo-deterministic scenarios of ground motion is defined at regional and local scale, thus providing a prioritization tool for timely prevention and mitigation actions. Constraints about the space and time of occurrence of the impending strong earthquakes are provided by three formally defined and globally tested algorithms, which have been developed according to a pattern recognition scheme. Two algorithms, namely CN and M8, are routinely used for intermediate-term middle-range earthquake predictions, while a third algorithm allows for the identification of the areas prone to large events. These independent procedures have been combined to better constrain the alarmed area. The pattern recognition of earthquake-prone areas does not belong to the family of earthquake prediction algorithms since it does not provide any information about the time of occurrence of the expected earthquakes. Nevertheless, it can be considered as the term-less zero-approximation, which restrains the alerted areas (e.g. defined by CN or M8) to the more precise location of large events. Italy is the only region of moderate seismic activity where the two different prediction algorithms CN and M8S (i.e. a spatially stabilized variant of M8) are applied simultaneously and a real-time test of predictions, for earthquakes with magnitude larger than 5.4, is ongoing since 2003. The application of the CN to the Adriatic region (s.l.), which is relevant

  11. Long term contaminant migration and impacts from uranium mill tailings. Comparison of computer models using a realistic dataset

    Energy Technology Data Exchange (ETDEWEB)

    Camus, H. [CEA Centre d'Etudes Nucleaires de Fontenay-aux-Roses, 92 (France). Inst. de Protection et de Surete Nucleaire] [and others]

    1996-08-01

    This is the final report of the Working Group describing: the enhancement of the previously devised V1 scenario to produce a V2 scenario which includes more detailed source term and other site specific data; the application of models in deterministic and probabilistic mode to calculate contaminant concentrations in biosphere media, and related radiation doses, contaminant intakes and health risks, including estimates of uncertainties; the comparison and analysis of the resulting calculations. A series of scenarios was developed based on data provided by Working Group members from a range of actual tailings disposal sites, culminating in the V2.2 and V2.3 scenarios. The V2.2 and V2.3 scenarios are identical in all respects, except that the V2.2 considers radioactive (U-238 chain) contaminants, whilst the V2.3 considers stable elements (As, Ni, Pb). Since the scenarios are based on data obtained from a range of actual sites, they should be considered to be generically realistic rather than representative of a particular single site. In both scenarios, the contaminants of interest are assumed to be released in leachate from a tailings pile into an underlying aquifer. They are transported in groundwater through the aquifer to a well. Water is abstracted from the well and used for: watering beef cattle; human consumption; and irrigating leafy vegetables. The beef and leafy vegetables are consumed by humans living in the area. The same contaminants are also released into the atmosphere due to the wind erosion of the pile and then deposited upon the soil, pasture and leafy vegetables. In addition, for the V2.2 scenario, Rn-222 is assumed to be released to atmosphere from the pile. Unlike the V1 scenario, no consideration is given to surface water exposure pathways. Results show that there is exceedingly good agreement between participants' deterministic and probabilistic estimates of total dose or intake. They agree within a factor of two to three for both scenarios

  12. Safer passenger car front shapes for pedestrians: A computational approach to reduce overall pedestrian injury risk in realistic impact scenarios.

    Science.gov (United States)

    Li, Guibing; Yang, Jikuang; Simms, Ciaran

    2017-03-01

    Vehicle front shape has a significant influence on pedestrian injuries, and the optimal design for overall pedestrian protection remains an elusive goal, especially considering the variability of vehicle-to-pedestrian accident scenarios. This study therefore aims to develop and evaluate an efficient framework for vehicle front shape optimization for pedestrian protection, accounting for the broad range of real-world impact scenarios and their distributions in recent accident data. Firstly, a framework for vehicle front shape optimization for pedestrian protection was developed based on the coupling of multi-body simulations and a genetic algorithm. This framework was then applied to optimizing passenger car front shape for pedestrian protection, and its predictions were evaluated using accident data and kinematic analyses. The results indicate that the optimization shows good convergence, that predictions of the optimization framework are corroborated by the available accident data, and that the framework can distinguish 'good' and 'poor' vehicle front shapes for pedestrian safety. Thus, it is feasible and reliable to use the optimization framework for vehicle front shape optimization for reducing overall pedestrian injury risk. The results also show the importance of considering the broad range of impact scenarios in vehicle front shape optimization. A safe passenger car for overall pedestrian protection should have a wide and flat bumper (covering pedestrians' legs from the lower leg up to the shaft of the upper leg with generally even contacts), a bonnet leading edge height around 750 mm, and a short bonnet with a shallow angle (17° or less). The requirements on car front shape for head and leg protection are generally consistent, but partially conflict with pelvis protection. In particular, both head and leg injury risk increase with increasing bumper lower height and depth, and decrease with increasing bonnet leading edge height, while pelvis injury risk increases with increasing bonnet leading
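
    The coupling of a simulation-based injury metric with a genetic algorithm can be sketched as follows; the design variables, their bounds, and the analytic 'injury risk' surrogate standing in for the multi-body simulation are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Design variables: bonnet leading edge height (mm), bumper depth (mm), bonnet angle (deg).
lower = np.array([600.0, 50.0, 5.0])
upper = np.array([900.0, 200.0, 25.0])

def injury_risk(x):
    # Surrogate for an expensive multi-body pedestrian impact simulation:
    # a smooth function with a minimum inside the bounds (illustrative only).
    target = np.array([750.0, 100.0, 15.0])
    return float(np.sum(((x - target) / (upper - lower)) ** 2))

def genetic_optimize(pop_size=40, generations=60, mutation=0.1):
    pop = rng.uniform(lower, upper, size=(pop_size, len(lower)))
    for _ in range(generations):
        fitness = np.array([injury_risk(ind) for ind in pop])
        order = np.argsort(fitness)
        parents = pop[order[: pop_size // 2]]            # selection: keep the best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            mask = rng.random(len(lower)) < 0.5          # uniform crossover
            child = np.where(mask, a, b)
            child += rng.normal(0.0, mutation, len(lower)) * (upper - lower)  # mutation
            children.append(np.clip(child, lower, upper))
        pop = np.vstack([parents, children])
    best = pop[np.argmin([injury_risk(ind) for ind in pop])]
    return best

print(genetic_optimize())   # approaches the surrogate optimum near [750, 100, 15]
```

    In the real framework each fitness evaluation is a pedestrian impact simulation weighted over the distribution of impact scenarios, which is what makes the efficiency of the optimizer matter.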

  13. On Realistically Attacking Tor with Website Fingerprinting

    Directory of Open Access Journals (Sweden)

    Wang Tao

    2016-10-01

    Full Text Available Website fingerprinting allows a local, passive observer monitoring a web-browsing client’s encrypted channel to determine her web activity. Previous attacks have shown that website fingerprinting could be a threat to anonymity networks such as Tor under laboratory conditions. However, there are significant differences between laboratory conditions and realistic conditions. First, in laboratory tests we collect the training data set together with the testing data set, so the training data set is fresh, but an attacker may not be able to maintain a fresh data set. Second, laboratory packet sequences correspond to a single page each, but for realistic packet sequences the split between pages is not obvious. Third, packet sequences may include background noise from other types of web traffic. These differences adversely affect website fingerprinting under realistic conditions. In this paper, we tackle these three problems to bridge the gap between laboratory and realistic conditions for website fingerprinting. We show that we can maintain a fresh training set with minimal resources. We demonstrate several classification-based techniques that allow us to split full packet sequences effectively into sequences corresponding to a single page each. We describe several new algorithms for tackling background noise. With our techniques, we are able to build the first website fingerprinting system that can operate directly on packet sequences collected in the wild.
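
    The classification flavour of such attacks can be illustrated with a toy closed-world sketch (the per-direction count and volume features, the synthetic traces, and the nearest-neighbour classifier are a generic illustration, not the algorithms developed in the paper):

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def features(packet_sizes):
    """Crude trace features: packet counts and volumes in each direction."""
    sizes = np.asarray(packet_sizes)
    out = sizes[sizes > 0]
    inc = -sizes[sizes < 0]
    return [len(out), len(inc), out.sum(), inc.sum()]

def synthetic_trace(site_id):
    # Each "site" gets a characteristic incoming volume; noise models page variation.
    n = rng.integers(80, 120)
    sizes = rng.choice([500, -1500], size=n, p=[0.3, 0.7]).astype(float)
    sizes[sizes < 0] *= 1.0 + 0.1 * site_id + 0.05 * rng.normal(size=(sizes < 0).sum())
    return sizes

sites = list(range(5))
X_train = np.array([features(synthetic_trace(s)) for s in sites for _ in range(30)])
y_train = np.array([s for s in sites for _ in range(30)])
X_test = np.array([features(synthetic_trace(s)) for s in sites for _ in range(10)])
y_test = np.array([s for s in sites for _ in range(10)])

clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
print("closed-world accuracy:", clf.score(X_test, y_test))
```

    The realistic-conditions problems discussed in the paper (stale training data, splitting continuous traffic into pages, background noise) all act on the inputs to such a classifier rather than on the classifier itself.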

  14. Iterated interactions method. Realistic NN potential

    International Nuclear Information System (INIS)

    Gorbatov, A.M.; Skopich, V.L.; Kolganova, E.A.

    1991-01-01

    The method of iterated potentials is tested in the case of realistic fermionic systems. As a basis for comparison, calculations of the ¹⁶O system (using various versions of realistic NN potentials) by means of the angular potential-function method, as well as operators of pairing correlation, were used. The convergence of the genealogical series is studied for the central Malfliet-Tjon potential. In addition, the mathematical technique of microscopic calculations is improved: new equations for correlators in odd states are suggested, and the technique of leading terms is applied for the first time to calculations of heavy p-shell nuclei in the basis of angular potential functions

  15. Realistic Material Appearance Modelling

    Czech Academy of Sciences Publication Activity Database

    Haindl, Michal; Filip, Jiří; Hatka, Martin

    2010-01-01

    Roč. 2010, č. 81 (2010), s. 13-14 ISSN 0926-4981 R&D Projects: GA ČR GA102/08/0593 Institutional research plan: CEZ:AV0Z10750506 Keywords: bidirectional texture function * texture modelling Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2010/RO/haindl-realistic material appearance modelling.pdf

  16. Worst-case Throughput Analysis for Parametric Rate and Parametric Actor Execution Time Scenario-Aware Dataflow Graphs

    Directory of Open Access Journals (Sweden)

    Mladen Skelin

    2014-03-01

    Full Text Available Scenario-aware dataflow (SADF) is a prominent tool for modeling and analysis of dynamic embedded dataflow applications. In SADF the application is represented as a finite collection of synchronous dataflow (SDF) graphs, each of which represents one possible application behaviour or scenario. A finite state machine (FSM) specifies the possible orders of scenario occurrences. The SADF model renders the tightest possible performance guarantees, but is limited by its finiteness. This means that from a practical point of view, it can only handle dynamic dataflow applications that are characterized by a reasonably sized set of possible behaviours or scenarios. In this paper we remove this limitation for a class of SADF graphs by means of SADF model parametrization in terms of graph port rates and actor execution times. First, we formally define the semantics of the model relevant for throughput analysis based on (max,+) linear system theory and (max,+) automata. Second, by generalizing some of the existing results, we give the algorithms for worst-case throughput analysis of parametric rate and parametric actor execution time acyclic SADF graphs with a fully connected, possibly infinite state transition system. Third, we demonstrate our approach on a few realistic applications from the digital signal processing (DSP) domain mapped onto an embedded multi-processor architecture.
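
    The role of (max,+) algebra in this kind of throughput analysis can be illustrated by estimating the maximum cycle mean of a (max,+) matrix, whose reciprocal bounds the throughput; the 2x2 matrix of actor execution times below is a made-up example, not taken from the paper.

```python
import numpy as np

NEG_INF = -np.inf

def maxplus_matvec(A, x):
    # (A (x) x)_i = max_j (A[i, j] + x[j]) in (max,+) algebra.
    return np.max(A + x[None, :], axis=1)

# Made-up matrix of inter-firing delays (actor execution times) for one scenario.
A = np.array([[2.0, 5.0],
              [NEG_INF, 3.0]])

# Estimate the maximum cycle mean by iterating the (max,+) state equation
# x(k+1) = A (x) x(k); the average growth of x approaches the cycle mean.
x = np.zeros(2)
n_iter = 200
for _ in range(n_iter):
    x = maxplus_matvec(A, x)

cycle_mean = np.max(x) / n_iter       # time per iteration in the slowest cycle
print(cycle_mean, 1.0 / cycle_mean)   # approximate cycle mean and throughput bound
```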

  17. Electrical-Generation Scenarios for China

    Energy Technology Data Exchange (ETDEWEB)

    Kypreos, S.; Krakowski, R.A.

    2002-03-01

    The China Energy Technology Program (CETP) used both optimizing and simulation energy-economic-environmental (E3) models to assess tradeoffs in the electricity-generation sector for a range of fuel, transport, generation, and distribution options. The CETP is composed of a range of technical tasks or activities, including Energy Economics Modeling (EEM, optimizations), Electric Sector Simulation (ESS, simulations), Life Cycle Analyses (LCA, externalization) of energy systems, and Multi-Criteria Decision Analyses (MCDA, integration). The scope of CETP is limited to one province (Shandong), to one economic sector (electricity), and to one energy sector (electricity). This document describes the methods, approaches, limitations, sample results, and future/needed work for the EEM (optimization-based modeling) task that supports the overall goal of CETP. An important tool used by the EEM task is based on a Linear Programming (LP) optimization model that considers 17 electricity-generation technologies utilizing 14 fuel forms (type, composition, source) in a 7-region transportation model of China's electricity demand and supply system over the period 2000-2030; Shandong is one of the seven regions modeled. The China Regional Electricity Trade Model (CRETM) is used to examine a set of energy-environment-economy (E3)-driven scenarios to quantify related policy implications. The development of electricity production mixes that are optimized under realistic E3 constraints is determined through regional demands for electricity that respond to exogenous assumptions on income (GDP) and electricity prices through respective time-dependent elasticities. Constraints are applied to fuel prices, transportation limits, resource availability, introduction (penetration) rates of specific technology, and (where applicable) to local, regional, and countrywide emission rates of CO₂, SO₂ and NOₓ. Importantly, future inter-regional energy flows are optimized with
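
    The kind of constrained generation-mix optimization performed by such LP models can be reproduced in miniature (the technologies, costs, emission factors, capacities, demand, and CO₂ cap below are invented and bear no relation to CRETM data):

```python
import numpy as np
from scipy.optimize import linprog

# Three illustrative technologies: coal, gas, wind.
cost = np.array([30.0, 50.0, 60.0])        # $/MWh generation cost
co2 = np.array([0.9, 0.4, 0.0])            # tCO2/MWh emission factors
capacity = np.array([800.0, 500.0, 300.0]) # MWh available per period

demand = 1000.0      # MWh that must be served
co2_cap = 500.0      # tCO2 allowed in the period

# Minimize total cost subject to: meet demand, respect the CO2 cap and capacities.
res = linprog(
    c=cost,
    A_ub=np.vstack([-np.ones(3), co2]),     # -sum(x) <= -demand ; co2 . x <= cap
    b_ub=np.array([-demand, co2_cap]),
    bounds=list(zip(np.zeros(3), capacity)),
    method="highs",
)
print(res.x, res.fun)   # optimal dispatch per technology and total cost
```

    Multi-region, multi-period models such as CRETM add time steps, transmission variables, and many more constraints, but the underlying structure is the same linear program.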

  18. A Radiosity Approach to Realistic Image Synthesis

    Science.gov (United States)

    1992-12-01

    AFIT/GCE/ENG/92D-09: A Radiosity Approach to Realistic Image Synthesis. Thesis by Richard L. Remington, Captain, USAF. Approved for public release; distribution unlimited. The acknowledgements credit assistance in creating the input geometry file for the AWACS aircraft interior used as a model for the diffuse radiosity implementation.

  19. Do environmental dynamics matter in fate models? Exploring scenario dynamics for a terrestrial and an aquatic system.

    Science.gov (United States)

    Morselli, Melissa; Terzaghi, Elisa; Di Guardo, Antonio

    2018-01-24

    Nowadays, there is growing interest in inserting more ecological realism into risk assessment of chemicals. On the exposure evaluation side, this can be done by studying the complexity of exposure in the ecosystem, niche partitioning, e.g. variation of the exposure scenario. Current regulatory predictive approaches, to ensure simplicity and predictive ability, generally keep the scenario as static as possible. This could lead to under or overprediction of chemical exposure depending on the chemical and scenario simulated. To account for more realistic exposure conditions, varying temporally and spatially, additional scenario complexity should be included in currently used models to improve their predictive ability. This study presents two case studies (a terrestrial and an aquatic one) in which some polychlorinated biphenyls (PCBs) were simulated with the SoilPlusVeg and ChimERA models to show the importance of scenario variation in time (biotic and abiotic compartments). The results outlined the importance of accounting for planetary boundary layer variation and vegetation dynamics to accurately predict air concentration changes and the timing of chemical dispersion from the source in terrestrial systems. For the aquatic exercise, the results indicated the need to account for organic carbon forms (particulate and dissolved organic carbon) and vegetation biomass dynamics. In both cases the range of variation was up to two orders of magnitude depending on the congener and scenario, reinforcing the need for incorporating such knowledge into exposure assessment.

  20. Second-order symmetric eikonal approximation for electron capture at high energies

    Energy Technology Data Exchange (ETDEWEB)

    Deco, G R; Rivarola, R D [Rosario Univ. Nacional (Argentina). Dept. de Fisica

    1985-06-14

    A symmetric eikonal approximation for electron capture in ion-atom collisions at high energies has been developed within the Dodd and Greider (1966, Phys. Rev. 146 675) formalism. Implicit intermediate states are included through the choice of distorted initial and final wavefunctions. Explicit intermediate states are considered by the introduction of a free-particle Green's function G_0^+. The model is applied to the resonant charge exchange in H^+ + H(1s) collisions. Also, the characteristic dip of the continuum distorted-wave model is analysed when higher orders are included at 'realistic' high energies.

  1. Bridging Scales: Developing a Framework to Build a City-Scale Environmental Scenario for Japanese Municipalities

    Science.gov (United States)

    Hashimoto, S.; Fujita, T.; Nakayama, T.; Xu, K.

    2007-12-01

    There is an ongoing project on establishing environmental scenarios in Japan to evaluate middle to long-term environmental policy and technology options toward a low-carbon society. In this project, the time horizon of the scenarios is set for 2050 on the ground that a large part of social infrastructure in Japan is likely to be renovated by that time, and cities are supposed to play important roles in building a low-carbon society in Japan. This belief is held because cities or local governments could implement various policies and programs, such as land use planning and promotion of new technologies with low GHG emissions, which produce an effect in a non-uniform manner, taking local socio-economic conditions into account, while higher governments, either national or prefectural, could impose environmental tax on electricity and gas to alleviate ongoing GHG emissions, which uniformly covers their jurisdictions. In order for local governments to devise and implement concrete administrative actions equipped with rational policies and technologies, referring to the environmental scenarios developed for the entire nation, we need to localize the national scenarios, both in terms of spatial and temporal extent, so that they could better reflect local socio-economic and institutional conditions. In localizing the national scenarios, the participation of stakeholders is significant because they play major roles in shaping future society. Stakeholder participation in the localization process would bring both creative and realistic inputs on how the future unfolds on a city scale. In this research, 1) we reviewed recent efforts on international and domestic scenario development to set a practical time horizon for a city-scale environmental scenario, which would lead to concrete environmental policies and programs, 2) designed a participatory scenario development/localization process, drawing on the framework of the 'Story-and-Simulation' or SAS approach, which Alcamo (2001) proposed

  2. Scenario for a warm, high-CO/sub 2/ world

    Energy Technology Data Exchange (ETDEWEB)

    Wigley, T M.L.; Jones, P D; Kelly, P M

    1980-01-03

    To assess the impact of global changes in temperature, precipitation, and winds that might occur as a result of increased carbon dioxide levels in the atmosphere, a meteorologically and climatologically realistic scenario of global warming was developed. The patterns of climatic changes that could result from a large increase in atmospheric CO2 were determined by comparing the five warmest years from 1925-74 with the five coldest years in the same period. Results indicate that increased atmospheric CO2 will cause temperature increases in most regions of the world, with maximum temperature increases occurring in northern Asia. A few isolated regions, however, will be cooler. Precipitation will increase over India and decrease in regions of the U.S., Europe, and the USSR. The social, political, and economic impacts of these changes are briefly considered. (2 maps, 34 references)

  3. Attributes Of Quality Scenarios/Scenario Sets Used In Software Requirements Elicitation

    National Research Council Canada - National Science Library

    Braun, Kimberly

    1997-01-01

    This thesis examines scenarios used in software requirements elicitation. Many different definitions, formats, and ideas exist on scenarios, but no thorough work has been done on what makes a good, quality scenario and scenario set...

  4. Diophantine approximation and badly approximable sets

    DEFF Research Database (Denmark)

    Kristensen, S.; Thorn, R.; Velani, S.

    2006-01-01

    The classical set Bad of `badly approximable' numbers in the theory of Diophantine approximation falls within our framework as do the sets Bad(i,j) of simultaneously badly approximable numbers. Under various natural conditions we prove that the badly approximable subsets of Omega have full Hausdorff dimension...

  5. Rayleigh scatter in kilovoltage x-ray imaging: is the independent atom approximation good enough?

    OpenAIRE

    Poludniowski, G; Evans, PM; Webb, S

    2009-01-01

    Monte Carlo simulation is the gold standard method for modelling scattering processes in medical x-ray imaging. General-purpose Monte Carlo codes, however, typically use the independent atom approximation (IAA). This is known to be inaccurate for Rayleigh scattering, for many materials, in the forward direction. This work addresses whether the IAA is sufficient for the typical modelling tasks in medical kilovoltage x-ray imaging. As a means of comparison, we incorporate a more realistic 'inte...

  6. Beaconless Georouting Under The Spotlight: Practical Link Models and Application Scenarios

    KAUST Repository

    Bader, Ahmed

    2015-06-18

    Beaconless georouting has emerged as a viable packet-forwarding technique in distributed wireless networks, particularly for applications requiring portability and scalability. In this paper, we focus on fine-tuning and developing the analytical tools associated with the study of beaconless georouting protocols. For instance, they have been traditionally analyzed and simulated from the perspective of a single hop only. However, end-to-end performance analysis is instrumental when considering practical application scenarios. Furthermore, beaconless georouting protocols have been studied in the literature assuming equal communication ranges for the data and control packets. In reality, this is not true since the communication range is actually a function of the packet length (among other factors). Control packets are typically much shorter than data packets. As a consequence, a substantial discrepancy exists in practice between their respective communication ranges, causing many data packet drops. Accordingly, we introduce two simple strategies for bridging the gap between the control and data packet communication ranges. Our primary objective in this paper is to construct a realistic analysis describing the end-to-end performance of beaconless georouting protocols. Two flagship protocols are selected in this paper for further investigation. For a better perspective, the two protocols are actually compared to a hypothetical limit case, one which offers optimal energy and latency performance. Finally, we present four different application scenarios. For each scenario, we highlight the georouting protocol which performs the best and discuss the reasons behind it. © 2007-2012 IEEE.

  7. Long term contaminant migration and impacts from uranium mill tailings. Comparison of computer models using a realistic dataset

    International Nuclear Information System (INIS)

    Camus, H.

    1996-08-01

    This is the final report of the Working Group describing: the enhancement of the previously devised V1 scenario to produce a V2 scenario which includes more detailed source term and other site specific data; the application of models in deterministic and probabilistic mode to calculate contaminant concentrations in biosphere media, and related radiation doses, contaminant intakes and health risks, including estimates of uncertainties; the comparison and analysis of the resulting calculations. A series of scenarios was developed based on data provided by Working Group members from a range of actual tailings disposal sites, culminating in the V2.2 and V2.3 scenarios. The V2.2 and V2.3 scenarios are identical in all respects, except that the V2.2 considers radioactive (U-238 chain) contaminants, whilst the V2.3 considers stable elements (As, Ni, Pb). Since the scenarios are based on data obtained from a range of actual sites, they should be considered to be generically realistic rather than representative of a particular single site. In both scenarios, the contaminants of interest are assumed to be released in leachate from a tailings pile into an underlying aquifer. They are transported in groundwater through the aquifer to a well. Water is abstracted from the well and used for: watering beef cattle; human consumption; and irrigating leafy vegetables. The beef and leafy vegetables are consumed by humans living in the area. The same contaminants are also released into the atmosphere due to the wind erosion of the pile and then deposited upon the soil, pasture and leafy vegetables. In addition, for the V2.2 scenario, Rn-222 is assumed to be released to atmosphere from the pile. Unlike the V1 scenario, no consideration is given to surface water exposure pathways. Results show that there is exceedingly good agreement between participants' deterministic and probabilistic estimates of total dose or intake. They agree within a factor of two to three for both scenarios. Even

  8. Survey of Approaches to Generate Realistic Synthetic Graphs

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Seung-Hwan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lee, Sangkeun [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Powers, Sarah S [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Shankar, Mallikarjun [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Imam, Neena [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-10-01

    A graph is a flexible data structure that can represent relationships between entities. As with other data analysis tasks, the use of realistic graphs is critical to obtaining valid research results. Unfortunately, using the actual ("real-world") graphs for research and new algorithm development is difficult due to the presence of sensitive information in the data or due to the scale of data. This results in practitioners developing algorithms and systems that employ synthetic graphs instead of real-world graphs. Generating realistic synthetic graphs that provide reliable statistical confidence to algorithmic analysis and system evaluation involves addressing technical hurdles in a broad set of areas. This report surveys the state of the art in approaches to generate realistic graphs that are derived from fitted graph models on real-world graphs.
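
    One common fitted-model approach covered by such surveys can be sketched as follows (Python with networkx; the "observed" graph here is itself synthetic, standing in for sensitive real-world data, and the configuration model is just one of many generators such a report discusses):

      import networkx as nx

      # Stand-in for a sensitive real-world graph (here simply a scale-free example).
      observed = nx.barabasi_albert_graph(n=1000, m=3, seed=1)

      # Fit: keep only the degree sequence of the observed graph.
      degree_sequence = [d for _, d in observed.degree()]

      # Generate: a configuration-model graph with the same degree sequence,
      # then simplify to a plain graph (drop parallel edges and self-loops).
      synthetic = nx.configuration_model(degree_sequence, seed=2)
      synthetic = nx.Graph(synthetic)
      synthetic.remove_edges_from(nx.selfloop_edges(synthetic))

      print("observed:  nodes", observed.number_of_nodes(), "edges", observed.number_of_edges())
      print("synthetic: nodes", synthetic.number_of_nodes(), "edges", synthetic.number_of_edges())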

  9. Estimates of future discharges of the river Rhine using two scenario methodologies: direct versus delta approach

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Simulations with a hydrological model for the river Rhine for the present (1960–1989) and a projected future (2070–2099) climate are discussed. The hydrological model (RhineFlow) is driven by meteorological data from a 90-year (ensemble of three 30-year) simulation with the HadRM3H regional climate model for both present-day and future climate (A2 emission scenario). Simulation of present-day discharges is realistic provided that (1) the HadRM3H temperature and precipitation are corrected for biases, and (2) the potential evapotranspiration is derived from temperature only. Different methods are used to simulate discharges for the future climate: one is based on the direct model output of the future climate run (direct approach), while the other is based on perturbation of the present-day HadRM3H time series (delta approach). Both methods predict a similar response in the mean annual discharge, an increase of 30% in winter and a decrease of 40% in summer. However, predictions of extreme flows differ significantly, with increases of 10% in flows with a return period of 100 years in the direct approach and approximately 30% in the delta approach. A bootstrap method is used to estimate the uncertainties related to the sample size (number of years simulated) in predicting changes in extreme flows.
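
    The contrast between the two scenario methodologies can be sketched as follows (Python; the precipitation series and the monthly change factors are synthetic stand-ins, not HadRM3H or RhineFlow data):

      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic stand-ins for observed and climate-model monthly precipitation (mm),
      # 30 years by 12 months each.
      obs_present   = rng.gamma(shape=2.0, scale=40.0, size=(30, 12))
      model_present = rng.gamma(shape=2.0, scale=38.0, size=(30, 12))
      model_future  = rng.gamma(shape=2.0, scale=45.0, size=(30, 12))

      # Delta approach: perturb the observed series by the monthly change factor
      # diagnosed from the climate model (future mean / present mean).
      change_factor = model_future.mean(axis=0) / model_present.mean(axis=0)
      delta_future = obs_present * change_factor        # drives the hydrological model

      # Direct approach: use the (bias-corrected) future model output itself.
      bias = obs_present.mean(axis=0) / model_present.mean(axis=0)
      direct_future = model_future * bias

      print("monthly change factors:", np.round(change_factor, 2))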

  10. Global climate change mitigation scenarios for solid waste management

    Energy Technology Data Exchange (ETDEWEB)

    Monni, S. [Benviroc Ltd, Espoo (Finland); Pipatti, R. [Statistics Finland, Helsinki (Finland); Lehtilae, A.; Savolainen, I.; Syri, S. [VTT Technical Research Centre of Finland, Espoo (Finland)

    2006-07-15

    The waste sector is an important contributor to climate change. CH{sub 4} produced at solid waste disposal sites contributes approximately 3.4 percent to the annual global anthropogenic greenhouse gas emissions. Emissions from solid waste disposal are expected to increase with increasing global population and GDP. On the other hand, many cost-efficient emission reduction options are available. The rate of waste degradation in landfills depends on waste composition, climate and conditions in the landfill. Because the duration of CH{sub 4} generation is several decades, estimation of emissions from landfills requires modelling of waste disposal prior to the year whose emissions are of interest. In this study, country- or region-specific first-order decay (FOD) models based on the 2006 IPCC Guidelines are used to estimate emissions from municipal solid waste disposal in landfills. In addition, IPCC methodology is used to estimate emissions from waste incineration. Five global scenarios are compiled from 1990 to 2050. These scenarios take into account political decision making and changes in the waste management system. In the Baseline scenario, waste generation is assumed to follow past and current trends using population and GDP as drivers. In the other scenarios, effects of increased incineration, increased recycling and increased landfill gas recovery on greenhouse gas (GHG) emissions are assessed. Economic maximum emission reduction potentials for these waste management options are estimated at different marginal cost levels for the year 2030 by using the Global TIMES model. Global emissions from landfills are projected to increase from 340 Tg CO{sub 2} eq in 1990 to 1500 Tg CO{sub 2} eq by 2030 and 2900 Tg CO{sub 2} eq by 2050 in the Baseline scenario. The emission reduction scenarios give emissions reductions from 5% (9%) to 21% (27%) compared to the Baseline in 2030 (2050). As each scenario considered one mitigation option, the results are largely additive, and
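
    The first-order decay (FOD) calculation at the core of such landfill CH4 estimates can be sketched as follows (Python; the rate constant, carbon content, and disposal amounts are placeholders, not the country- or region-specific values used in the study):

      import numpy as np

      # Illustrative parameters (placeholders only).
      k = 0.05          # decay rate constant (1/yr), depends on climate and waste type
      ddoc_per_t = 0.1  # decomposable degradable organic carbon per tonne of waste (t C / t)
      F = 0.5           # fraction of CH4 in generated landfill gas
      years = np.arange(1990, 2051)
      deposited = np.full(len(years), 1.0e6)  # tonnes of waste landfilled each year

      ch4 = np.zeros(len(years))
      for i, t in enumerate(years):
          # CH4 generated during year t from all waste deposited in years x <= t:
          # carbon decomposing between ages (t - x) and (t - x + 1), times F and 16/12.
          age = t - years[: i + 1]
          ddoc = deposited[: i + 1] * ddoc_per_t
          ch4[i] = np.sum(ddoc * F * (16.0 / 12.0)
                          * (np.exp(-k * age) - np.exp(-k * (age + 1))))

      print("CH4 generated in 2030 (t):", round(ch4[years.tolist().index(2030)]))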

  11. Comparative analysis of the effectiveness of three immunization strategies in controlling disease outbreaks in realistic social networks.

    Directory of Open Access Journals (Sweden)

    Zhijing Xu

    Full Text Available The high incidence of emerging infectious diseases has highlighted the importance of effective immunization strategies, especially the stochastic algorithms based on local available network information. Present stochastic strategies are mainly evaluated based on classical network models, such as scale-free networks and small-world networks, and thus are insufficient. Three frequently referred stochastic immunization strategies-acquaintance immunization, community-bridge immunization, and ring vaccination-were analyzed in this work. The optimal immunization ratios for acquaintance immunization and community-bridge immunization strategies were investigated, and the effectiveness of these three strategies in controlling the spreading of epidemics were analyzed based on realistic social contact networks. The results show all the strategies have decreased the coverage of the epidemics compared to baseline scenario (no control measures). However the effectiveness of acquaintance immunization and community-bridge immunization are very limited, with acquaintance immunization slightly outperforming community-bridge immunization. Ring vaccination significantly outperforms acquaintance immunization and community-bridge immunization, and the sensitivity analysis shows it could be applied to controlling the epidemics with a wide infectivity spectrum. The effectiveness of several classical stochastic immunization strategies was evaluated based on realistic contact networks for the first time in this study. These results could have important significance for epidemic control research and practice.

  12. Comparative analysis of the effectiveness of three immunization strategies in controlling disease outbreaks in realistic social networks.

    Science.gov (United States)

    Xu, Zhijing; Zu, Zhenghu; Zheng, Tao; Zhang, Wendou; Xu, Qing; Liu, Jinjie

    2014-01-01

    The high incidence of emerging infectious diseases has highlighted the importance of effective immunization strategies, especially the stochastic algorithms based on local available network information. Present stochastic strategies are mainly evaluated based on classical network models, such as scale-free networks and small-world networks, and thus are insufficient. Three frequently referred stochastic immunization strategies-acquaintance immunization, community-bridge immunization, and ring vaccination-were analyzed in this work. The optimal immunization ratios for acquaintance immunization and community-bridge immunization strategies were investigated, and the effectiveness of these three strategies in controlling the spreading of epidemics were analyzed based on realistic social contact networks. The results show all the strategies have decreased the coverage of the epidemics compared to baseline scenario (no control measures). However the effectiveness of acquaintance immunization and community-bridge immunization are very limited, with acquaintance immunization slightly outperforming community-bridge immunization. Ring vaccination significantly outperforms acquaintance immunization and community-bridge immunization, and the sensitivity analysis shows it could be applied to controlling the epidemics with a wide infectivity spectrum. The effectiveness of several classical stochastic immunization strategies was evaluated based on realistic contact networks for the first time in this study. These results could have important significance for epidemic control research and practice.
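
    The acquaintance immunization strategy evaluated above can be sketched in a few lines (Python with networkx; the Barabási–Albert graph is a synthetic stand-in for the realistic contact networks used in the study, and the 10% coverage is an arbitrary choice):

      import random
      import networkx as nx

      random.seed(42)

      # Synthetic stand-in for a realistic social contact network.
      G = nx.barabasi_albert_graph(n=5000, m=4, seed=42)

      def acquaintance_immunization(G, fraction):
          """Immunize a given fraction of nodes by vaccinating random neighbours
          of randomly sampled nodes (reaches hubs using only local information)."""
          target = int(fraction * G.number_of_nodes())
          immunized = set()
          while len(immunized) < target:
              node = random.choice(list(G.nodes))
              neighbours = list(G.neighbors(node))
              if neighbours:
                  immunized.add(random.choice(neighbours))
          return immunized

      immunized = acquaintance_immunization(G, fraction=0.10)
      residual = G.copy()
      residual.remove_nodes_from(immunized)
      largest = max(nx.connected_components(residual), key=len)
      print("largest susceptible component:", len(largest), "of", G.number_of_nodes())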

  13. Simulating the value of electric-vehicle-grid integration using a behaviourally realistic model

    Science.gov (United States)

    Wolinetz, Michael; Axsen, Jonn; Peters, Jotham; Crawford, Curran

    2018-02-01

    Vehicle-grid integration (VGI) uses the interaction between electric vehicles and the electrical grid to provide benefits that may include reducing the cost of using intermittent renewable electricity or providing a financial incentive for electric vehicle ownership. However, studies that estimate the value of VGI benefits have largely ignored how consumer behaviour will affect the magnitude of the impact. Here, we simulate the long-term impact of VGI using behaviourally realistic and empirically derived models of vehicle adoption and charging combined with an electricity system model. We focus on the case where a central entity manages the charging rate and timing for participating electric vehicles. VGI is found not to increase the adoption of electric vehicles, but does have a small beneficial impact on electricity prices. By 2050, VGI reduces wholesale electricity prices by 0.6-0.7% (0.7 MWh-1, 2010 CAD) relative to an equivalent scenario without VGI. Excluding consumer behaviour from the analysis inflates the value of VGI.

  14. Coherent delocalization: views of entanglement in different scenarios

    International Nuclear Information System (INIS)

    De J León-Montiel, R; Vallés, A; Torres, J P; Moya-Cessa, H M

    2015-01-01

    The concept of entanglement was originally introduced to explain correlations existing between two spatially separated systems, that cannot be described using classical ideas. Interestingly, in recent years, it has been shown that similar correlations can be observed when considering different degrees of freedom of a single system, even a classical one. Surprisingly, it has also been suggested that entanglement might be playing a relevant role in certain biological processes, such as the functioning of pigment-proteins that constitute light-harvesting complexes of photosynthetic bacteria. The aim of this work is to show that the presence of entanglement in all of these different scenarios should not be unexpected, once it is realized that the very same mathematical structure can describe all of them. We show this by considering three different, realistic cases in which the only condition for entanglement to exist is that a single excitation is coherently delocalized between the different subsystems that compose the system of interest. (letter)

  15. A Low-cost System for Generating Near-realistic Virtual Actors

    Science.gov (United States)

    Afifi, Mahmoud; Hussain, Khaled F.; Ibrahim, Hosny M.; Omar, Nagwa M.

    2015-06-01

    Generating virtual actors is one of the most challenging fields in computer graphics. The reconstruction of realistic, human-like virtual actors has received attention from both academic research and the film industry. Many movies have featured human-like virtual actors that the audience cannot distinguish from real ones. The synthesis of realistic virtual actors is a complex process, and the many techniques used to generate them usually require expensive hardware equipment. In this paper, a low-cost system that generates near-realistic virtual actors is presented. The facial features of the real actor are blended with a virtual head that is attached to the actor's body. Compared with other techniques for generating virtual actors, the proposed system is low-cost, requiring only one camera that records the scene and no expensive hardware equipment. The results show that the system generates good near-realistic virtual actors that can be used in many applications.

  16. Problem Posing with Realistic Mathematics Education Approach in Geometry Learning

    Science.gov (United States)

    Mahendra, R.; Slamet, I.; Budiyono

    2017-09-01

    One of the difficulties students face in learning geometry is the topic of planes, which requires them to understand abstract material. The aim of this research is to determine the effect of the Problem Posing learning model with the Realistic Mathematics Education Approach in geometry learning. This quasi-experimental research was conducted in one of the junior high schools in Karanganyar, Indonesia. The sample was taken using a stratified cluster random sampling technique. The results of this research indicate that the Problem Posing learning model with the Realistic Mathematics Education Approach can significantly improve students’ conceptual understanding in geometry learning, especially on plane topics. This is because, under Problem Posing with the Realistic Mathematics Education Approach, students become active in constructing their knowledge, posing problems, and solving problems in realistic contexts, which makes it easier for them to understand concepts and solve problems. Therefore, the Problem Posing learning model with the Realistic Mathematics Education Approach is appropriate for mathematics learning, especially on geometry material. Furthermore, it can improve student achievement.

  17. Mott Transition In Strongly Correlated Materials: Many-Body Methods And Realistic Materials Simulations

    Science.gov (United States)

    Lee, Tsung-Han

    Strongly correlated materials are a class of materials that cannot be properly described by the Density Functional Theory (DFT), which is a single-particle approximation to the original many-body electronic Hamiltonian. These systems contain d or f orbital electrons, i.e., transition metals, actinides, and lanthanides compounds, for which the electron-electron interaction (correlation) effects are too strong to be described by the single-particle approximation of DFT. Therefore, complementary many-body methods have been developed, at the model Hamiltonians level, to describe these strong correlation effects. Dynamical Mean Field Theory (DMFT) and Rotationally Invariant Slave-Boson (RISB) approaches are two successful methods that can capture the correlation effects for a broad interaction strength. However, these many-body methods, as applied to model Hamiltonians, treat the electronic structure of realistic materials in a phenomenological fashion, which only allow to describe their properties qualitatively. Consequently, the combination of DFT and many body methods, e.g., Local Density Approximation augmented by RISB and DMFT (LDA+RISB and LDA+DMFT), have been recently proposed to combine the advantages of both methods into a quantitative tool to analyze strongly correlated systems. In this dissertation, we studied the possible improvements of these approaches, and tested their accuracy on realistic materials. This dissertation is separated into two parts. In the first part, we studied the extension of DMFT and RISB in three directions. First, we extended DMFT framework to investigate the behavior of the domain wall structure in metal-Mott insulator coexistence regime by studying the unstable solution describing the domain wall. We found that this solution, differing qualitatively from both the metallic and the insulating solutions, displays an insulating-like behavior in resistivity while carrying a weak metallic character in its electronic structure. Second, we

  18. Novel high-fidelity realistic explosion damage simulation for urban environments

    Science.gov (United States)

    Liu, Xiaoqing; Yadegar, Jacob; Zhu, Youding; Raju, Chaitanya; Bhagavathula, Jaya

    2010-04-01

    Realistic building damage simulation has a significant impact in modern modeling and simulation systems, especially in a diverse panoply of military and civil applications where these simulation systems are widely used for personnel training, critical mission planning, disaster management, etc. Realistic building damage simulation should incorporate accurate physics-based explosion models, rubble generation, rubble flyout, and interactions between flying rubble and their surrounding entities. However, none of the existing building damage simulation systems faithfully realizes the criteria of realism required for effective military applications. In this paper, we present a novel physics-based high-fidelity and runtime efficient explosion simulation system to realistically simulate destruction to buildings. In the proposed system, a family of novel blast models is applied to accurately and realistically simulate explosions based on static and/or dynamic detonation conditions. The system also takes account of rubble pile formation and applies a generic and scalable multi-component based object representation to describe scene entities, and a highly scalable agent-subsumption architecture and scheduler to schedule clusters of sequential and parallel events. The proposed system utilizes a highly efficient and scalable tetrahedral decomposition approach to realistically simulate rubble formation. Experimental results demonstrate that the proposed system has the capability to realistically simulate rubble generation, rubble flyout and their primary and secondary impacts on surrounding objects including buildings, constructions, vehicles and pedestrians in clusters of sequential and parallel damage events.

  19. Collision kernels in the eikonal approximation for Lennard-Jones interaction potential

    International Nuclear Information System (INIS)

    Zielinska, S.

    1985-03-01

    Velocity-changing collisions are conveniently described by collision kernels. These kernels depend on an interaction potential, and there is a need to evaluate them for realistic interatomic potentials. Using the collision kernels, we are able to investigate the redistribution of atomic populations caused by laser light and velocity-changing collisions. In this paper we present the method of evaluating the collision kernels in the eikonal approximation. We discuss the influence of the potential parameters R_o^(i) and ε_o^(i) on the kernel width for a given atomic state. It turns out that the Lennard-Jones kernel is not as sensitive to changes of R_o^(i) as the collision kernel for the hard-sphere model of scattering. Contrary to the general tendency of approximating collision kernels by a Gaussian curve, kernels for the Lennard-Jones potential do not exhibit such behaviour. (author)

  20. The Scenario Model Intercomparison Project (ScenarioMIP) for CMIP6

    Energy Technology Data Exchange (ETDEWEB)

    O'Neill, Brian C.; Tebaldi, Claudia; van Vuuren, Detlef P.; Eyring, Veronika; Friedlingstein, Pierre; Hurtt, George; Knutti, Reto; Kriegler, Elmar; Lamarque, Jean-Francois; Lowe, Jason; Meehl, Gerald A.; Moss, Richard; Riahi, Keywan; Sanderson, Benjamin M.

    2016-01-01

    Projections of future climate change play a fundamental role in improving understanding of the climate system as well as characterizing societal risks and response options. The Scenario Model Intercomparison Project (ScenarioMIP) is the primary activity within Phase 6 of the Coupled Model Intercomparison Project (CMIP6) that will provide multi-model climate projections based on alternative scenarios of future emissions and land use changes produced with integrated assessment models. In this paper, we describe ScenarioMIP's objectives, experimental design, and its relation to other activities within CMIP6. The ScenarioMIP design is one component of a larger scenario process that aims to facilitate a wide range of integrated studies across the climate science, integrated assessment modeling, and impacts, adaptation, and vulnerability communities, and will form an important part of the evidence base in the forthcoming Intergovernmental Panel on Climate Change (IPCC) assessments. At the same time, it will provide the basis for investigating a number of targeted science and policy questions that are especially relevant to scenario-based analysis, including the role of specific forcings such as land use and aerosols, the effect of a peak and decline in forcing, the consequences of scenarios that limit warming to below 2 °C, the relative contributions to uncertainty from scenarios, climate models, and internal variability, and long-term climate system outcomes beyond the 21st century. To serve this wide range of scientific communities and address these questions, a design has been identified consisting of eight alternative 21st century scenarios plus one large initial condition ensemble and a set of long-term extensions, divided into two tiers defined by relative priority. Some of these scenarios will also provide a basis for variants planned to be run in other CMIP6-Endorsed MIPs to investigate questions related to specific forcings. Harmonized, spatially

  1. Instantons and magnetization tunneling: Beyond the giant-spin approximation

    International Nuclear Information System (INIS)

    Florez, J.M.; Vargas, P.; Nunez, Alvaro S.

    2009-01-01

    In this work we show that commonly neglected fluctuations of the net total spin of a molecular nanomagnet strongly modify its tunneling properties and provide a scenario to explain some discrepancies between theory and experiment. Starting from an effective spin Hamiltonian, we study the quantum tunneling of the magnetization of molecular nanomagnets in the regime where the giant-spin approximation breaks down. This study is done using an instanton description of the tunneling path. The instanton is calculated considering its coupling to quantum fluctuations.

  2. Characteristics of 454 pyrosequencing data--enabling realistic simulation with flowsim.

    Science.gov (United States)

    Balzer, Susanne; Malde, Ketil; Lanzén, Anders; Sharma, Animesh; Jonassen, Inge

    2010-09-15

    The commercial launch of 454 pyrosequencing in 2005 was a milestone in genome sequencing in terms of performance and cost. Throughout the three available releases, average read lengths have increased to approximately 500 base pairs and are thus approaching read lengths obtained from traditional Sanger sequencing. Study design of sequencing projects would benefit from being able to simulate experiments. We explore 454 raw data to investigate its characteristics and derive empirical distributions for the flow values generated by pyrosequencing. Based on our findings, we implement Flowsim, a simulator that generates realistic pyrosequencing data files of arbitrary size from a given set of input DNA sequences. We finally use our simulator to examine the impact of sequence lengths on the results of concrete whole-genome assemblies, and we suggest its use in planning of sequencing projects, benchmarking of assembly methods and other fields. Flowsim is freely available under the General Public License from http://blog.malde.org/index.php/flowsim/.

  3. Progress in realistic LOCA analysis

    Energy Technology Data Exchange (ETDEWEB)

    Young, M Y; Bajorek, S M; Ohkawa, K [Westinghouse Electric Corporation, Pittsburgh, PA (United States)

    1994-12-31

    While LOCA is a complex transient to simulate, the state of the art in thermal hydraulics has advanced sufficiently to allow its realistic prediction and the application of advanced methods to actual reactor design, as demonstrated by the methodology described in this paper. 6 refs, 5 figs, 3 tabs

  4. Should scientific realists be platonists?

    DEFF Research Database (Denmark)

    Busch, Jacob; Morrison, Joe

    2015-01-01

    an appropriate use of the resources of Scientific Realism (in particular, IBE) to achieve platonism? (§2) We argue that just because a variety of different inferential strategies can be employed by Scientific Realists does not mean that ontological conclusions concerning which things we should be Scientific...

  5. Towards a unified European electricity market: The contribution of data-mining to support realistic simulation studies

    DEFF Research Database (Denmark)

    Pinto, Tiago; Santos, Gabriel; Pereira, Ivo F.

    2014-01-01

    Worldwide electricity markets have been evolving into regional and even continental scales. The aim of an efficient use of renewable-based generation in places where it exceeds the local needs is one of the main reasons. A reference case of this evolution is the European Electricity Market, where... countries are connected, and several regional markets were created, each one grouping several countries, and supporting transactions of huge amounts of electrical energy. The continuous transformations electricity markets have been experiencing over the years create the need to use simulation platforms... to support operators, regulators, and involved players for understanding and dealing with this complex environment. This paper focuses on demonstrating the advantage that real electricity market data has for the creation of realistic simulation scenarios, which allow the study of the impacts...

  6. Evolution of spatio-temporal drought characteristics: validation, projections and effect of adaptation scenarios

    Science.gov (United States)

    Vidal, J.-P.; Martin, E.; Kitova, N.; Najac, J.; Soubeyroux, J.-M.

    2012-08-01

    Drought events develop in both space and time and they are therefore best described through summary joint spatio-temporal characteristics, such as mean duration, mean affected area and total magnitude. This paper addresses the issue of future projections of such characteristics of drought events over France through three main research questions: (1) Are downscaled climate projections able to simulate spatio-temporal characteristics of meteorological and agricultural droughts in France over a present-day period? (2) How will such characteristics evolve over the 21st century? (3) How can standardized drought indices be used to represent theoretical adaptation scenarios? These questions are addressed using the Isba land surface model, downscaled climate projections from the ARPEGE General Circulation Model under three emissions scenarios, as well as results from a previously performed 50-yr multilevel and multiscale drought reanalysis over France. Spatio-temporal characteristics of meteorological and agricultural drought events are computed using the Standardized Precipitation Index and the Standardized Soil Wetness Index, respectively, and for time scales of 3 and 12 months. Results first show that the distributions of joint spatio-temporal characteristics of observed events are well simulated by the downscaled hydroclimate projections over a present-day period. All spatio-temporal characteristics of drought events are then found to dramatically increase over the 21st century, with stronger changes for agricultural droughts. Two theoretical adaptation scenarios are eventually built based on hypotheses of adaptation to evolving climate and hydrological normals, either retrospective or prospective. The perceived spatio-temporal characteristics of drought events derived from these theoretical adaptation scenarios show much reduced changes, but they call for more realistic scenarios at both the catchment and national scale in order to accurately assess the combined effect of
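
    How a Standardized Precipitation Index value is obtained can be sketched as follows (Python with scipy; the precipitation series is synthetic and the special handling of zero-precipitation months is omitted):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)

      # Synthetic 3-month accumulated precipitation for one calendar season, 50 years (mm).
      precip_3m = rng.gamma(shape=2.5, scale=60.0, size=50)

      # Fit a gamma distribution to the accumulation series (location fixed at zero).
      shape, loc, scale = stats.gamma.fit(precip_3m, floc=0.0)

      # SPI: map the fitted cumulative probability onto a standard normal quantile.
      cdf = stats.gamma.cdf(precip_3m, shape, loc=loc, scale=scale)
      spi = stats.norm.ppf(cdf)

      print("driest year SPI-3:", round(spi.min(), 2))
      print("years in drought (SPI < -1):", int(np.sum(spi < -1)))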

  7. The carbon budget of Pinus radiata plantations in south-western Australia under 4 climate change scenarios

    International Nuclear Information System (INIS)

    Simioni, G.; Ritson, P.; McGrath, J.; Dumbrell, I.; Copeland, B.

    2009-01-01

    The future stem wood production and net ecosystem production of Pinus radiata plantations in southwestern Australia were estimated in this modelling study, which was conducted in order to determine the potential effects of anticipated severe rainfall reductions in the region. Four climate change and emission scenarios were considered as well as simulations of the present climate. Results of the study showed that stem wood production and NEP were not significantly influenced by moderate changes in temperature. However, stem wood production and NEP decreased significantly under the most pessimistic climate change scenarios. Results of the study suggested a trade-off between the positive effects of rising atmospheric carbon dioxide (CO2) on plant and water use efficiency and the negative impacts of decreased rainfall and increased temperatures. Changes in heterotrophic respiration lagged behind changes in plant growth. It was concluded that realistic predictions of forest production and carbon sequestration potential will require modelling tools capable of characterizing interactions between environmental variables, plant physiology and soil organic matter decomposition, as well as the potential range of climate change scenarios. 53 refs., 4 tabs., 9 figs

  8. Making use of scenarios : supporting scenario use in product design

    NARCIS (Netherlands)

    Anggreeni, Irene

    2010-01-01

    The discipline of Scenario-Based Product Design (SBPD) guides the use of scenarios in a product design process. As concrete narratives, scenarios could facilitate making explicit how users would use the designed product in their activities, allowing usability studies to be an integrated part of the

  9. Universal resources for approximate and stochastic measurement-based quantum computation

    International Nuclear Information System (INIS)

    Mora, Caterina E.; Piani, Marco; Miyake, Akimasa; Van den Nest, Maarten; Duer, Wolfgang; Briegel, Hans J.

    2010-01-01

    We investigate which quantum states can serve as universal resources for approximate and stochastic measurement-based quantum computation in the sense that any quantum state can be generated from a given resource by means of single-qubit (local) operations assisted by classical communication. More precisely, we consider the approximate and stochastic generation of states, resulting, for example, from a restriction to finite measurement settings or from possible imperfections in the resources or local operations. We show that entanglement-based criteria for universality obtained in M. Van den Nest et al. [New J. Phys. 9, 204 (2007)] for the exact, deterministic case can be lifted to the much more general approximate, stochastic case. This allows us to move from the idealized situation (exact, deterministic universality) considered in previous works to the practically relevant context of nonperfect state preparation. We find that any entanglement measure fulfilling some basic requirements needs to reach its maximum value on some element of an approximate, stochastic universal family of resource states, as the resource size grows. This allows us to rule out various families of states as being approximate, stochastic universal. We prove that approximate, stochastic universality is in general a weaker requirement than deterministic, exact universality and provide resources that are efficient approximate universal, but not exact deterministic universal. We also study the robustness of universal resources for measurement-based quantum computation under realistic assumptions about the (imperfect) generation and manipulation of entangled states, giving an explicit expression for the impact that errors made in the preparation of the resource have on the possibility to use it for universal approximate and stochastic state preparation. Finally, we discuss the relation between our entanglement-based criteria and recent results regarding the uselessness of states with a high

  10. The relative greenhouse gas impacts of realistic dietary choices

    International Nuclear Information System (INIS)

    Berners-Lee, M.; Hoolohan, C.; Cammack, H.; Hewitt, C.N.

    2012-01-01

    The greenhouse gas (GHG) emissions embodied in 61 different categories of food are used, with information on the diet of different groups of the population (omnivorous, vegetarian and vegan), to calculate the embodied GHG emissions in different dietary scenarios. We calculate that the embodied GHG content of the current UK food supply is 7.4 kg CO2e person⁻¹ day⁻¹, or 2.7 t CO2e person⁻¹ y⁻¹. This gives total food-related GHG emissions of 167 Mt CO2e (1 Mt = 10⁶ metric tonnes; CO2e being the mass of CO2 that would have the same global warming potential, when measured over 100 years, as a given mixture of greenhouse gases) for the entire UK population in 2009. This is 27% of total direct GHG emissions in the UK, or 19% of total GHG emissions from the UK, including those embodied in goods produced abroad. We calculate that potential GHG savings of 22% and 26% can be made by changing from the current UK-average diet to a vegetarian or vegan diet, respectively. Taking the average GHG saving from six vegetarian or vegan dietary scenarios compared with the current UK-average diet gives a potential national GHG saving of 40 Mt CO2e y⁻¹. This is equivalent to a 50% reduction in current exhaust pipe emissions from the entire UK passenger car fleet. Hence realistic choices about diet can make substantial differences to embodied GHG emissions. - Highlights: ► We calculate the greenhouse gas emissions embodied in different diets. ► The embodied GHG content of the current UK food supply is 7.4 kg CO2e person⁻¹ day⁻¹. ► Changing to a vegetarian or vegan diet reduces GHG emissions by 22–26%. ► Changing to a vegetarian or vegan diet would reduce UK GHG emissions by 40 Mt CO2e y⁻¹.
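
    The headline figures can be related to one another with simple arithmetic (a back-of-envelope check; the UK population value of roughly 62 million is an assumption inferred from the quoted totals, not a number given in the abstract, and the reported 40 Mt saving is an average over six scenarios rather than the single 26% product computed here):

      # Back-of-envelope consistency check of the headline figures.
      per_person_day_kg = 7.4
      per_person_year_t = per_person_day_kg * 365 / 1000     # about 2.7 t CO2e per person per year
      population = 62e6                                       # assumed UK population, 2009
      national_mt = per_person_year_t * population / 1e6      # about 167 Mt CO2e nationally

      vegan_saving_mt = national_mt * 0.26                    # about 43 Mt; the paper's averaged figure is 40
      print(round(per_person_year_t, 2), round(national_mt), round(vegan_saving_mt))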

  11. Impact on short-lived climate forcers (SLCFs) from a realistic land-use change scenario via changes in biogenic emissions.

    Science.gov (United States)

    Scott, C E; Monks, S A; Spracklen, D V; Arnold, S R; Forster, P M; Rap, A; Carslaw, K S; Chipperfield, M P; Reddington, C L S; Wilson, C

    2017-08-24

    More than one quarter of natural forests have been cleared by humans to make way for other land-uses, with changes to forest cover projected to continue. The climate impact of land-use change (LUC) is dependent upon the relative strength of several biogeophysical and biogeochemical effects. In addition to affecting the surface albedo and exchanging carbon dioxide (CO2) and moisture with the atmosphere, vegetation emits biogenic volatile organic compounds (BVOCs), altering the formation of short-lived climate forcers (SLCFs) including aerosol, ozone (O3) and methane (CH4). Once emitted, BVOCs are rapidly oxidised by O3, and the hydroxyl (OH) and nitrate (NO3) radicals. These oxidation reactions yield secondary organic products which are implicated in the formation and growth of aerosol particles and are estimated to have a negative radiative effect on the climate (i.e. a cooling). These reactions also deplete OH, increasing the atmospheric lifetime of CH4, and directly affect concentrations of O3; the latter two being greenhouse gases which impose a positive radiative effect (i.e. a warming) on the climate. Our previous work assessing idealised deforestation scenarios found a positive radiative effect due to changes in SLCFs; however, since the radiative effects associated with changes to SLCFs result from a combination of non-linear processes it may not be appropriate to scale radiative effects from complete deforestation scenarios according to the deforestation extent. Here we combine a land-surface model, a chemical transport model, a global aerosol model, and a radiative transfer model to assess the net radiative effect of changes in SLCFs due to historical LUC between the years 1850 and 2000.

  12. Dying scenarios improve recall as much as survival scenarios.

    Science.gov (United States)

    Burns, Daniel J; Hart, Joshua; Kramer, Melanie E

    2014-01-01

    Merely contemplating one's death improves retention for entirely unrelated material learned subsequently. This "dying to remember" effect seems conceptually related to the survival processing effect, whereby processing items for their relevance to being stranded in the grasslands leads to recall superior to that of other deep processing control conditions. The present experiments directly compared survival processing scenarios with "death processing" scenarios. Results showed that when the survival and dying scenarios are closely matched on key dimensions, and possible congruency effects are controlled, the dying and survival scenarios produced equivalently high recall levels. We conclude that the available evidence (cf. Bell, Roer, & Buchner, 2013; Klein, 2012), while not definitive, is consistent with the possibility of overlapping mechanisms.

  13. Generation of large-scale PV scenarios using aggregated power curves

    DEFF Research Database (Denmark)

    Nuño Martinez, Edgar; Cutululis, Nicolaos Antonio

    2017-01-01

    The contribution of solar photovoltaic (PV) power to generation is becoming more relevant in modern power systems. Therefore, there is a need to model the variability of large-scale PV generation accurately. This paper presents a novel methodology to generate regional PV scenarios based on aggregated power curves rather than traditional physical PV conversion models. Our approach is based on hourly mesoscale reanalysis irradiation data and power measurements and does not require additional variables such as ambient temperature or wind speed. It was used to simulate the PV generation on the German system between 2012 and 2015, showing high levels of correlation with actual measurements (93.02–97.60%) and small deviations from the expected capacity factors (0.02–1.80%). Therefore, we are confident about the ability of the proposed model to accurately generate realistic large-scale PV...
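
    The aggregated-power-curve idea can be sketched as follows (Python; the irradiation series, the shape of the curve, and the installed capacity are invented for illustration, whereas in practice the curve would be fitted to measured regional feed-in data):

      import numpy as np

      rng = np.random.default_rng(7)

      # Hourly irradiation for one region over a week (W/m^2, synthetic clear-sky
      # profile modulated by random cloudiness).
      hours = np.arange(24 * 7)
      clear_sky = np.clip(np.sin((hours % 24 - 6) / 12 * np.pi), 0, None) * 900
      irradiation = clear_sky * rng.uniform(0.4, 1.0, size=hours.size)

      def aggregated_power_curve(g):
          """Illustrative regional curve: normalized PV output versus irradiation,
          saturating near rated output."""
          return np.clip(g / 800.0, 0.0, 1.0) ** 1.1

      capacity_mw = 35000.0                     # assumed installed regional PV capacity
      pv_mw = capacity_mw * aggregated_power_curve(irradiation)
      print("weekly capacity factor: %.1f%%" % (100 * pv_mw.mean() / capacity_mw))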

  14. Time management: a realistic approach.

    Science.gov (United States)

    Jackson, Valerie P

    2009-06-01

    Realistic time management and organization plans can improve productivity and the quality of life. However, these skills can be difficult to develop and maintain. The key elements of time management are goals, organization, delegation, and relaxation. The author addresses each of these components and provides suggestions for successful time management.

  15. Triangulating and guarding realistic polygons

    NARCIS (Netherlands)

    Aloupis, G.; Bose, P.; Dujmovic, V.; Gray, C.M.; Langerman, S.; Speckmann, B.

    2008-01-01

    We propose a new model of realistic input: k-guardable objects. An object is k-guardable if its boundary can be seen by k guards in the interior of the object. In this abstract, we describe a simple algorithm for triangulating k-guardable polygons. Our algorithm, which is easily implementable, takes

  16. Spin dynamics and implications for superconductivity. Some problems with the d-wave scenario

    International Nuclear Information System (INIS)

    Levin, K.; Zha, Y.; Radtke, R.J.; Si, Q.; Norman, M.R.; Schuettler, H.B.

    1994-01-01

    We review the spin dynamics of the normal state of the cuprates with special emphasis on neutron data in both the YBa2Cu3O7-δ and La2-xSrxCuO4 systems. When realistic models of the Fermi surface shapes are incorporated, along with a moderate degree of spin fluctuations, we find good semiquantitative agreement with experiment for both cuprates. Building on the success of this Fermi-liquid-based scheme, we explore the implications for d-wave pairing from a number of vantage points. We conclude that our present experimental and theoretical understanding is inadequate to confirm or refute the d-wave scenario. 26 refs., 6 figs

  17. Dynamic influent pollutant disturbance scenario generation using a phenomenological modelling approach

    DEFF Research Database (Denmark)

    Gernaey, Krist; Flores Alsina, Xavier; Rosen, Christian

    2011-01-01

    Activated Sludge Models are widely used for simulation-based evaluation of wastewater treatment plant (WWTP) performance. However, due to the high workload and cost of a measuring campaign on a full-scale WWTP, many simulation studies suffer from lack of sufficiently long influent flow rate and concentration time series representing realistic wastewater influent dynamics. In this paper, a simple phenomenological modelling approach is proposed as an alternative to generate dynamic influent pollutant disturbance scenarios. The presented set of models is constructed following the principles of parsimony: the larger the simulated sewer network, the smoother the simulated diurnal flow rate and concentration variations. In the discussion, it is pointed out how the proposed phenomenological models can be expanded to other applications, for example to represent heavy metal or organic micro-pollutant loads.
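
    A phenomenological diurnal influent generator in this spirit can be sketched as follows (Python; the harmonic profile, noise level, and sewer smoothing are illustrative choices, not the published model):

      import numpy as np

      rng = np.random.default_rng(11)

      t = np.arange(0, 7 * 24, 0.25)           # one week at 15-minute resolution (hours)

      # Household-driven diurnal pattern: morning and evening peaks via two harmonics.
      base_flow = 20000.0                      # average dry-weather flow (m3/d)
      diurnal = (1.0 + 0.35 * np.sin(2 * np.pi * (t - 10) / 24)
                 + 0.15 * np.sin(4 * np.pi * (t - 2) / 24))
      noise = rng.normal(0.0, 0.03, size=t.size)
      flow = base_flow * (diurnal + noise)

      # Larger sewer networks smooth the signal: emulate this with a moving average
      # whose window grows with an assumed network residence time.
      residence_h = 3.0
      window = max(1, int(residence_h / 0.25))
      flow_smoothed = np.convolve(flow, np.ones(window) / window, mode="same")

      print("raw peak/average: %.2f  smoothed: %.2f"
            % (flow.max() / flow.mean(), flow_smoothed.max() / flow_smoothed.mean()))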

  18. EDITORIAL: Where next with global environmental scenarios? Where next with global environmental scenarios?

    Science.gov (United States)

    O'Neill, Brian; Pulver, Simone; Van Deveer, Stacy; Garb, Yaakov

    2008-12-01

    Scenarios have become a standard tool in the portfolio of techniques that scientists and policy-makers use to envision and plan for the future. Defined as plausible, challenging and relevant stories about how the future might unfold that integrate quantitative models with qualitative assessments of social and political trends, scenarios are a central component in assessment processes for a range of global issues, including climate change, biodiversity, agriculture, and energy. Yet, despite their prevalence, systematic analysis of scenarios is in its beginning stages. Fundamental questions remain about both the epistemology and scientific credibility of scenarios and their roles in policymaking and social change. Answers to these questions have the potential to determine the future of scenario analyses. Is scenario analysis moving in the direction of earth system governance informed by global scenarios generated through increasingly complex and comprehensive models integrating socio-economic and earth systems? Or will global environmental scenario analyses lose favour compared to more focused, policy-driven, regionally specific modelling? These questions come at an important time for the climate change issue, given that the scenario community, catalyzed by the Intergovernmental Panel on Climate Change (IPCC), is currently preparing to embark on a new round of scenario development processes aimed at coordinating research and assessment, and informing policy, over the next five to ten years. These and related questions about where next to go with global environmental scenarios animated a workshop held at Brown University (Note1) that brought together leading practitioners and scholars of global environmental change scenarios from research, policy-making, advocacy, and business settings. The workshop aimed to provide an overview of current practices/best practices in scenario production and scenario use across a range of global environmental change arenas. Participants

  19. The feasibility of sharing simulation-based evaluation scenarios in anesthesiology.

    Science.gov (United States)

    Berkenstadt, Haim; Kantor, Gareth S; Yusim, Yakov; Gafni, Naomi; Perel, Azriel; Ezri, Tiberiu; Ziv, Amitai

    2005-10-01

    We prospectively assessed the feasibility of international sharing of simulation-based evaluation tools despite differences in language, education, and anesthesia practice, in an Israeli study, using validated scenarios from a multi-institutional United States (US) study. Thirty-one Israeli junior anesthesia residents performed four simulation scenarios. Training sessions were videotaped and performance was assessed using two validated scoring systems (Long and Short Forms) by two independent raters. Subjects scored from 37 to 95 (70 +/- 12) of 108 possible points with the "Long Form" and "Short Form" scores ranging from 18 to 35 (28.2 +/- 4.5) of 40 possible points. Scores >70% of the maximal score were achieved by 61% of participants in comparison to only 5% in the original US study. The scenarios were rated as very realistic by 80% of the participants (grade 4 on a 1-4 scale). Reliability of the original assessment tools was demonstrated by internal consistencies of 0.66 for the Long and 0.75 for the Short Form (Cronbach alpha statistic). Values in the original study were 0.72-0.76 for the Long and 0.71-0.75 for the Short Form. The reliability did not change when a revised Israeli version of the scoring was used. Interrater reliability measured by Pearson correlation was 0.91 for the Long and 0.96 for the Short Form (P Israel. The higher scores achieved by Israeli residents may be related to the fact that most Israeli residents are immigrants with previous training in anesthesia. Simulation-based assessment tools developed in a multi-institutional study in the United States can be used in Israel despite the differences in language, education, and medical system.

  20. Towards realistic string vacua from branes at singularities

    Science.gov (United States)

    Conlon, Joseph P.; Maharana, Anshuman; Quevedo, Fernando

    2009-05-01

    We report on progress towards constructing string models incorporating both realistic D-brane matter content and moduli stabilisation with dynamical low-scale supersymmetry breaking. The general framework is that of local D-brane models embedded into the LARGE volume approach to moduli stabilisation. We review quiver theories on del Pezzo n (dPn) singularities including both D3 and D7 branes. We provide supersymmetric examples with three quark/lepton families and the gauge symmetries of the Standard, Left-Right Symmetric, Pati-Salam and Trinification models, without unwanted chiral exotics. We describe how the singularity structure leads to family symmetries governing the Yukawa couplings which may give mass hierarchies among the different generations. We outline how these models can be embedded into compact Calabi-Yau compactifications with LARGE volume moduli stabilisation, and state the minimal conditions for this to be possible. We study the general structure of soft supersymmetry breaking. At the singularity all leading order contributions to the soft terms (both gravity- and anomaly-mediation) vanish. We enumerate subleading contributions and estimate their magnitude. We also describe model-independent physical implications of this scenario. These include the masses of anomalous and non-anomalous U(1)'s and the generic existence of a new hyperweak force under which leptons and/or quarks could be charged. We propose that such a gauge boson could be responsible for the ghost muon anomaly recently found at the Tevatron's CDF detector.

  1. The world in scenarios

    International Nuclear Information System (INIS)

    De Jong, A.; Roodenburg, H.

    1992-01-01

    As an introduction to this special issue 'Worlds of difference: Scenarios for the economy, energy and the environment 1990-2015', an outline is given of the future of the world and the Netherlands, based on four scenarios. These scenarios were published in 'Scanning the future' in May 1992 by the CPB, the Dutch Central Planning Bureau. The Global Shift (GS) scenario is characterized by very dynamic technological development, a free-market perspective, strong economic growth in the Asian economies, and relative economic regression in Western Europe. In the European Renaissance (ER) scenario the technological development is less dynamic and more gradual than in the GS scenario. The Balanced Growth (BG) scenario is dominated by sustainable economic development and strong technological dynamism. The Global Crisis (GC) scenario shows a downward spiral in many areas, with stagnating developments and fragile economies as a result of the trends of the eighties. The first three scenarios are elaborated for the Netherlands. Attention is also paid to the aims and meaning of long-term scenarios. 2 figs., 2 tabs., 3 refs

  2. Development of exposure scenarios for CERCLA risk assessments at the Savannah River Site (U)

    International Nuclear Information System (INIS)

    Nix, D.W.; Immel, J.W.; Phifer, M.A.

    1992-01-01

    factors such as EPA Standard Default Exposure Scenarios (OSWER Directive 9285.6-03) that are based on upper-bound exposures that tend to reflect worst case conditions. The use of site-specific information for developing risk assessment exposure scenarios will result in a realistic estimate of Reasonable Maximum Exposure for SRS waste units. (author)

  3. Approximate truncation robust computed tomography—ATRACT

    International Nuclear Information System (INIS)

    Dennerlein, Frank; Maier, Andreas

    2013-01-01

    We present an approximate truncation robust algorithm to compute tomographic images (ATRACT). This algorithm targets the reconstruction of volumetric images from cone-beam projections in scenarios where these projections are highly truncated in each dimension. It thus facilitates reconstructions of small subvolumes of interest, without involving prior knowledge about the object. Our method is readily applicable to medical C-arm imaging, where it may contribute to new clinical workflows together with a considerable reduction of x-ray dose. We give a detailed derivation of ATRACT that starts from the conventional Feldkamp filtered-backprojection algorithm and that involves, as one component, a novel formula for the inversion of the two-dimensional Radon transform. Discretization and numerical implementation are discussed, and reconstruction results from both simulated projections and first clinical data sets are presented. (paper)
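    ATRACT is derived from the Feldkamp filtered-backprojection (FBP) pipeline. As a hedged, much simpler illustration of the FBP principle it builds on (parallel-beam geometry via scikit-image, not ATRACT's truncation-robust cone-beam variant), a minimal sketch:

```python
# Minimal parallel-beam filtered backprojection with scikit-image.
# This only illustrates the FBP principle that ATRACT extends; it does not
# implement ATRACT's truncation-robust reconstruction itself.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

image = rescale(shepp_logan_phantom(), 0.5)              # ground-truth object
theta = np.linspace(0.0, 180.0, max(image.shape), endpoint=False)
sinogram = radon(image, theta=theta)                      # forward projection
reconstruction = iradon(sinogram, theta=theta)            # ramp-filtered backprojection

print("RMS error:", np.sqrt(np.mean((reconstruction - image) ** 2)))
```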

  4. Generic Simulator Environment for Realistic Simulation - Autonomous Entity Proof and Emotion in Decision Making

    Directory of Open Access Journals (Sweden)

    Mickaël Camus

    2004-10-01

    Full Text Available Simulation is usually used as an evaluation and testing system. Many sectors are concerned, such as the EUROPEAN SPACE AGENCY or EUROPEAN DEFENCE. It is important to make sure that the project is error-free in order to continue it. The difficulty is to develop a realistic environment for the simulation and the execution of a scenario. This paper presents PALOMA, a Generic Simulator Environment. This project is based essentially on Chaos Theory and Complex Systems to create and direct an environment for a simulation. An important point is the generic aspect: PALOMA will be able to create an environment for different sectors (aerospace, biology, mathematics, ...). PALOMA includes six components: the Simulation Engine, the Direction Module, the Environment Generator, the Natural Behavior Restriction, the Communication API and the User API. Three languages are used to develop this simulator: SCHEME for the Direction language, C/C++ for the development of modules, and OZ/MOZART for the heart of PALOMA.

  5. DC electrophoresis and viscosity of realistic salt-free concentrated suspensions: non-equilibrium dissociation-association processes.

    Science.gov (United States)

    Ruiz-Reina, Emilio; Carrique, Félix; Lechuga, Luis

    2014-03-01

    Most of the suspensions usually found in industrial applications are concentrated, aqueous and in contact with the atmospheric CO2. The case of suspensions with a high concentration of added salt is relatively well understood and has been considered in many studies. In this work we are concerned with the case of concentrated suspensions that contain no ions other than: (1) those stemming from the charged colloidal particles (the added counterions, that counterbalance their surface charge); (2) the H(+) and OH(-) ions from water dissociation, and (3) the ions generated by the atmospheric CO2 contamination. We call this kind of system "realistic salt-free suspensions". We show some theoretical results about the electrophoretic mobility of a colloidal particle and the electroviscous effect of realistic salt-free concentrated suspensions. The theoretical framework is based on a cell model that accounts for particle-particle interactions in concentrated suspensions, which has been successfully applied to many different phenomena in concentrated suspensions. On the other hand, the water dissociation and CO2 contamination can be described following two different levels of approximation: (a) by local equilibrium mass-action equations, because it is supposed that the reactions are so fast that chemical equilibrium is attained everywhere in the suspension, or (b) by non-equilibrium dissociation-association kinetic equations, because it is considered that some reactions are not rapid enough to ensure local chemical equilibrium. Both approaches give rise to different results in the range from dilute to semidilute suspensions, causing possible discrepancies when comparing standard theories and experiments concerning transport properties of realistic salt-free suspensions. Copyright © 2013 Elsevier Inc. All rights reserved.
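    For orientation, the local-equilibrium description (approach (a) above) amounts to imposing standard mass-action relations for water dissociation and dissolved CO2. A minimal sketch, using the usual 25 °C equilibrium constants rather than values from the paper:

```latex
% Local chemical equilibrium in a realistic salt-free suspension (25 C constants).
\begin{align}
  \mathrm{H_2O} &\rightleftharpoons \mathrm{H^+} + \mathrm{OH^-}, &
  K_w &= [\mathrm{H^+}][\mathrm{OH^-}] \approx 10^{-14}\ \mathrm{M^2},\\
  \mathrm{CO_2} + \mathrm{H_2O} &\rightleftharpoons \mathrm{H^+} + \mathrm{HCO_3^-}, &
  K_1 &= \frac{[\mathrm{H^+}][\mathrm{HCO_3^-}]}{[\mathrm{CO_2}]} \approx 4.5\times 10^{-7}\ \mathrm{M}.
\end{align}
% The non-equilibrium alternative (approach (b)) replaces these algebraic
% constraints by kinetic source terms, e.g. r = k_f[CO_2] - k_b[H^+][HCO_3^-],
% with K_1 = k_f/k_b recovered only in the fast-reaction limit.
```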

  6. Validity of spherical approximations of initial charge cloud shape in silicon detectors

    International Nuclear Information System (INIS)

    Xu Cheng; Danielsson, Mats; Bornefalk, Hans

    2011-01-01

    Spherical approximation has been used extensively in low-energy X-ray imaging to represent the initial charge cloud produced by photon interactions in silicon detectors, mainly because of its simplicity. However, for high-energy X-rays, where the initial charge distribution is as important as the diffusion process, the spherical approximation will not result in a realistic detector response. In this paper, we present a bubble-line model that simulates the initial charge cloud in silicon detectors for photons in the energy range of medical imaging. An initial charge cloud can be generated by sampling the center of gravity and the track size from statistical distributions derived from Monte Carlo generated tracks and by distributing a certain proportion of photon energy into a bubble (68%) and a line portion uniformly. The simulations of detector response demonstrate that the new model simulates the detector response accurately and corresponds well to Monte Carlo simulation.

  7. Map-Based Channel Model for Urban Macrocell Propagation Scenarios

    Directory of Open Access Journals (Sweden)

    Jose F. Monserrat

    2015-01-01

    Full Text Available The evolution of LTE towards 5G has started and different research projects and institutions are in the process of verifying new technology components through simulations. Coordination between groups is strongly recommended and, in this sense, a common definition of test cases and simulation models is needed. The scope of this paper is to present a realistic channel model for urban macrocell scenarios. This model is map-based and takes into account the layout of buildings situated in the area under study. A detailed description of the model is given together with a comparison with other widely used channel models. The benchmark includes a measurement campaign in which the proposed model is shown to be much closer to the actual behavior of a cellular system. Particular attention is given to the outdoor component of the model, since it is here that the proposed approach differs most from previous models.

  8. High-order above-threshold ionization beyond the electric dipole approximation

    Science.gov (United States)

    Brennecke, Simon; Lein, Manfred

    2018-05-01

    Photoelectron momentum distributions from strong-field ionization are calculated by numerical solution of the one-electron time-dependent Schrödinger equation for a model atom including effects beyond the electric dipole approximation. We focus on the high-energy electrons from rescattering and analyze their momentum component along the field propagation direction. We show that the boundary of the calculated momentum distribution is deformed in accordance with the classical three-step model including the beyond-dipole Lorentz force. In addition, the momentum distribution exhibits an asymmetry in the signal strengths of electrons emitted in the forward/backward directions. Taken together, the two non-dipole effects give rise to a considerable average forward momentum component of the order of 0.1 a.u. for realistic laser parameters.
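    As a rough order-of-magnitude check (a standard radiation-pressure estimate, not a result quoted from the paper), the non-dipole forward shift of a rescattered electron scales with its kinetic energy divided by the speed of light:

```latex
% Order-of-magnitude estimate of the non-dipole forward momentum (atomic units).
% For an electron quivering in a linearly polarized field, the longitudinal drift
% accompanying kinetic energy E_kin is roughly E_kin / c.
\begin{equation}
  U_p = \frac{E_0^2}{4\omega^2}, \qquad
  p_{\parallel} \sim \frac{E_{\mathrm{kin}}}{c} \lesssim \frac{10\,U_p}{c},
\end{equation}
% so rescattered electrons (E_kin up to about 10 U_p) pick up forward momenta of
% order 0.1 a.u. once U_p reaches the atomic-unit scale (long wavelengths and/or
% intensities around 10^14 W/cm^2 and above).
```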

  9. TOXICOLOGICAL EVALUATION OF REALISTIC EMISSIONS OF SOURCE AEROSOLS (TERESA): APPLICATION TO POWER PLANT-DERIVED PM2.5

    Energy Technology Data Exchange (ETDEWEB)

    Annette Rohr

    2004-12-02

    tended to be slightly higher. Exposure concentrations were about 249 µg/m³ PM, of which 87 µg/m³ was sulfate and approximately 110 µg/m³ was secondary organic material (≈44%). Results indicated subtle differences in breathing pattern between exposed and control (sham) animals, but no differences in other endpoints (in vivo chemiluminescence, blood cytology, bronchoalveolar lavage fluid analysis). It was suspected that primary particle losses may have been occurring in the venturi aspirator/orifice sampler; therefore, the stack sampling system was redesigned. The modified system resulted in no substantial increase in particle concentration in the emissions, leading us to conclude that the electrostatic precipitator at the power plant has high efficiency, and that the sampled emissions are representative of those exiting the stack into the atmosphere. This is important, since the objective of the Project is to carry out exposures to realistic coal combustion-derived secondary PM arising from power plants. During the next reporting period, we will document and describe the remainder of the fieldwork at Plant 0, which we expect to be complete by mid-November 2004. This report will include detailed Phase I toxicological findings for all scenarios run, and Phase II toxicological findings for one selected scenario. Depending upon the outcome of the ongoing fieldwork at Plant 0 (i.e. the biological effects observed), not all the proposed scenarios may be evaluated. The next report is also expected to include preliminary field data for Plant 1, located in the Southeast.

  10. Characterizing Volcanic Eruptions on Venus: Some Realistic (?) Scenarios

    Science.gov (United States)

    Stofan, E. R.; Glaze, L. S.; Grinspoon, D. H.

    2011-01-01

    When Pioneer Venus arrived at Venus in 1978, it detected anomalously high concentrations of SO2 at the top of the troposphere, which subsequently declined over the next five years. This decline in SO2 was linked to some sort of dynamic process, possibly a volcanic eruption. Observations of SO2 variability have persisted since Pioneer Venus. More recently, scientists from the Venus Express mission announced that the SPICAV (Spectroscopy for Investigation of Characteristics of the Atmosphere of Venus) instrument had measured varying amounts of SO2 in the upper atmosphere; VIRTIS (Visible and Infrared Thermal Imaging Spectrometer) measured no similar variations in the lower atmosphere (ESA, 4 April, 2008). In addition, Fegley and Prinn stated that venusian volcanoes must replenish SO2 to the atmosphere, or it would react with calcite and disappear within 1.9 my. Fegley and Tremain suggested an eruption rate on the order of approx 1 cubic km/year to maintain atmospheric SO2; Bullock and Grinspoon posit that volcanism must have occurred within the last 20-50 my to maintain the sulfuric acid/water clouds on Venus. The abundance of volcanic deposits on Venus and the likely thermal history of the planet suggest that it is still geologically active, although at rates lower than Earth. Current estimates of resurfacing rates range from approx 0.01 cubic km/yr to approx 2 cubic km/yr. Demonstrating definitively that Venus is still volcanically active, and at what rate, would help to constrain models of evolution of the surface and interior, and help to focus future exploration of Venus.

  11. Essays on variational approximation techniques for stochastic optimization problems

    Science.gov (United States)

    Deride Silva, Julio A.

    This dissertation presents five essays on approximation and modeling techniques, based on variational analysis, applied to stochastic optimization problems. It is divided into two parts, where the first is devoted to equilibrium problems and maxinf optimization, and the second corresponds to two essays in statistics and uncertainty modeling. Stochastic optimization lies at the core of this research as we were interested in relevant equilibrium applications that contain an uncertain component, and the design of a solution strategy. In addition, every stochastic optimization problem relies heavily on the underlying probability distribution that models the uncertainty. We studied these distributions, in particular, their design process and theoretical properties such as their convergence. Finally, the last aspect of stochastic optimization that we covered is the scenario creation problem, in which we described a procedure based on a probabilistic model to create scenarios for the applied problem of power estimation of renewable energies. In the first part, Equilibrium problems and maxinf optimization, we considered three Walrasian equilibrium problems: from economics, we studied a stochastic general equilibrium problem in a pure exchange economy, described in Chapter 3, and a stochastic general equilibrium with financial contracts, in Chapter 4; finally from engineering, we studied an infrastructure planning problem in Chapter 5. We stated these problems as belonging to the maxinf optimization class and, in each instance, we provided an approximation scheme based on the notion of lopsided convergence and non-concave duality. This strategy is the foundation of the augmented Walrasian algorithm, whose convergence is guaranteed by lopsided convergence, that was implemented computationally, obtaining numerical results for relevant examples. The second part, Essays about statistics and uncertainty modeling, contains two essays covering a convergence problem for a sequence

  12. Scenario simulation based assessment of subsurface energy storage

    Science.gov (United States)

    Beyer, C.; Bauer, S.; Dahmke, A.

    2014-12-01

    Energy production from renewable sources such as solar or wind power is characterized by temporally varying power supply. The politically intended transition towards renewable energies in Germany („Energiewende") hence requires the installation of energy storage technologies to compensate for the fluctuating production. In this context, subsurface energy storage represents a viable option due to large potential storage capacities and the wide prevalence of suited geological formations. Technologies for subsurface energy storage comprise cavern or deep porous media storage of synthetic hydrogen or methane from electrolysis and methanization, or compressed air, as well as heat storage in shallow or moderately deep porous formations. Pressure build-up, fluid displacement or temperature changes induced by such operations may affect local and regional groundwater flow, geomechanical behavior, groundwater geochemistry and microbiology. Moreover, subsurface energy storage may interact and possibly be in conflict with other "uses" like drinking water abstraction or ecological goods and functions. An utilization of the subsurface for energy storage therefore requires an adequate system and process understanding for the evaluation and assessment of possible impacts of specific storage operations on other types of subsurface use, the affected environment and protected entities. This contribution presents the framework of the ANGUS+ project, in which tools and methods are developed for these types of assessments. Synthetic but still realistic scenarios of geological energy storage are derived and parameterized for representative North German storage sites by data acquisition and evaluation, and experimental work. Coupled numerical hydraulic, thermal, mechanical and reactive transport (THMC) simulation tools are developed and applied to simulate the energy storage and subsurface usage scenarios, which are analyzed for an assessment and generalization of the imposed THMC

  13. Food scenarios 2025

    DEFF Research Database (Denmark)

    Sundbo, Jon

    2016-01-01

    This article presents the results of a future study of the food sector. Two scenarios have been developed using a combination of: 1) a summary of the relevant scientific knowledge, 2) systematic scenario writing, 3) an expert-based Delphi technique, and 4) an expert seminar assessment. The two...... scenarios present possible futures at global, national (Denmark) and regional (Zealand, Denmark) levels. The main scenario is called ‘Food for ordinary days and celebrations’ (a combination of ‘High-technological food production − The functional society’ and ‘High-gastronomic food − The experience society...

  14. Interpreting energy scenarios

    Science.gov (United States)

    Iyer, Gokul; Edmonds, James

    2018-05-01

    Quantitative scenarios from energy-economic models inform decision-making about uncertain futures. Now, research shows the different ways these scenarios are subsequently used by users not involved in their initial development. In the absence of clear guidance from modellers, users may place too much or too little confidence in scenario assumptions and results.

  15. Realistic nurse-led policy implementation, optimization and evaluation: novel methodological exemplar.

    Science.gov (United States)

    Noyes, Jane; Lewis, Mary; Bennett, Virginia; Widdas, David; Brombley, Karen

    2014-01-01

    To report the first large-scale realistic nurse-led implementation, optimization and evaluation of a complex children's continuing-care policy. Health policies are increasingly complex, involve multiple Government departments and frequently fail to translate into better patient outcomes. Realist methods have not yet been adapted for policy implementation. Research methodology - Evaluation using theory-based realist methods for policy implementation. An expert group developed the policy and supporting tools. Implementation and evaluation design integrated diffusion of innovation theory with multiple case study and adapted realist principles. Practitioners in 12 English sites worked with Consultant Nurse implementers to manipulate the programme theory and logic of new decision-support tools and care pathway to optimize local implementation. Methods included key-stakeholder interviews, developing practical diffusion of innovation processes using key-opinion leaders and active facilitation strategies and a mini-community of practice. New and existing processes and outcomes were compared for 137 children during 2007-2008. Realist principles were successfully adapted to a shorter policy implementation and evaluation time frame. Important new implementation success factors included facilitated implementation that enabled 'real-time' manipulation of programme logic and local context to best-fit evolving theories of what worked; using local experiential opinion to change supporting tools to more realistically align with local context and what worked; and having sufficient existing local infrastructure to support implementation. Ten mechanisms explained implementation success and differences in outcomes between new and existing processes. Realistic policy implementation methods have advantages over top-down approaches, especially where clinical expertise is low and unlikely to diffuse innovations 'naturally' without facilitated implementation and local optimization. © 2013

  16. Immersive simulated reality scenarios for enhancing students' experience of people with learning disabilities across all fields of nurse education.

    Science.gov (United States)

    Saunder, Lorna; Berridge, Emma-Jane

    2015-11-01

    Poor preparation of nurses regarding learning disabilities can have devastating consequences. High-profile reports and the Nursing and Midwifery Council requirements led this University to introduce Shareville into the undergraduate and postgraduate nursing curriculum. Shareville is a virtual environment developed at Birmingham City University, in which student nurses learn from realistic, problem-based scenarios featuring people with learning disabilities. Following the implementation of the resource, an evaluation of both staff and student experience was undertaken. Students reported that the problem-based scenarios were sufficiently real and immersive. Scenarios presented previously unanticipated considerations, offering new insights and giving students the opportunity to practise decision-making in challenging scenarios before encountering them in practice. The interface and the quality of the graphics were criticised, but this did not interfere with learning. Nine lecturers were interviewed; they generally felt positively towards the resource and identified strengths in terms of blended learning and collaborative teaching. The evaluation contributes to understandings of learning via simulated reality, and identifies process issues that will inform the development of further resources and their roll-out locally, and may guide other education providers in developing and implementing resources of this nature. There was significant parity between lecturers' expectations and students' experience of Shareville. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. An approximate Kalman filter for ocean data assimilation: An example with an idealized Gulf Stream model

    Science.gov (United States)

    Fukumori, Ichiro; Malanotte-Rizzoli, Paola

    1995-04-01

    A practical method of data assimilation for use with large, nonlinear, ocean general circulation models is explored. A Kalman filter based on approximations of the state error covariance matrix is presented, employing a reduction of the effective model dimension, the error's asymptotic steady state limit, and a time-invariant linearization of the dynamic model for the error integration. The approximations lead to dramatic computational savings in applying estimation theory to large complex systems. We examine the utility of the approximate filter in assimilating different measurement types using a twin experiment of an idealized Gulf Stream. A nonlinear primitive equation model of an unstable east-west jet is studied with a state dimension exceeding 170,000 elements. Assimilation of various pseudomeasurements is examined, including velocity, density, and volume transport at localized arrays and realistic distributions of satellite altimetry and acoustic tomography observations. Results are compared in terms of their effects on the accuracy of the estimation. The approximate filter is shown to outperform an empirical nudging scheme used in a previous study. The examples demonstrate that useful approximate estimation errors can be computed in a practical manner for general circulation models.
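    The computational savings described above come from working in a reduced state space and freezing the error covariance at its asymptotic limit, so that the gain is computed once and reused. A minimal sketch of such an update step (the projection matrix, dimensions and operators below are illustrative placeholders, not the paper's primitive equation setup):

```python
import numpy as np

# Minimal sketch of an approximate (steady-state, reduced-order) Kalman update.
# B maps the reduced state to the full model state; K_red is a *time-invariant*
# reduced-space gain built from the asymptotic error covariance, so no covariance
# propagation is needed during assimilation. All matrices are illustrative.
rng = np.random.default_rng(0)
n_full, n_red, n_obs = 1000, 20, 50

B = rng.standard_normal((n_full, n_red))          # reduced -> full projection
H = rng.standard_normal((n_obs, n_full))          # observation operator
R = np.eye(n_obs)                                 # observation error covariance
P_red = np.eye(n_red)                             # asymptotic reduced error covariance

H_red = H @ B                                     # observation operator in reduced space
S = H_red @ P_red @ H_red.T + R                   # innovation covariance
K_red = P_red @ H_red.T @ np.linalg.solve(S, np.eye(n_obs))  # steady-state gain

def assimilate(x_full, y_obs):
    """Correct a full-state forecast with the precomputed reduced-space gain."""
    innovation = y_obs - H @ x_full
    return x_full + B @ (K_red @ innovation)

x_forecast = rng.standard_normal(n_full)
y = H @ x_forecast + rng.standard_normal(n_obs)
x_analysis = assimilate(x_forecast, y)
```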

  18. The Two Defaults Scenario for Stressing Credit Portfolio Loss Distributions

    Directory of Open Access Journals (Sweden)

    Dirk Tasche

    2015-12-01

    Full Text Available The impact of a stress scenario of default events on the loss distribution of a credit portfolio can be assessed by determining the loss distribution conditional on these events. While it is conceptually easy to estimate loss distributions conditional on default events by means of Monte Carlo simulation, it becomes impractical for two or more simultaneous defaults, as then the conditioning event is extremely rare. We provide an analytical approach to the calculation of the conditional loss distribution for the CreditRisk+ portfolio model with independent random loss given default distributions. The analytical solution for this case can be used to check the accuracy of an approximation to the conditional loss distribution whereby the unconditional model is run with stressed input probabilities of default (PDs). It turns out that this approximation is unbiased. Numerical examples, however, suggest that the approximation may be seriously inaccurate, but that the inaccuracy leads to overestimation of tail losses and, hence, the approach errs on the conservative side.
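    A toy Monte Carlo sketch of the two constructions being compared: the loss distribution conditional on two named obligors defaulting versus the unconditional model re-run with those PDs stressed to 1. With independent Bernoulli defaults, as below, the two coincide exactly (a useful sanity check); the discrepancies analysed in the paper arise once defaults are correlated through common risk factors, as in CreditRisk+. All numbers are illustrative:

```python
import numpy as np

# Toy comparison: exact conditioning on two defaults vs. the stressed-PD rerun.
# Independent Bernoulli defaults with deterministic losses -- a much simpler
# setting than the CreditRisk+ model treated in the paper.
rng = np.random.default_rng(42)
n_obligors, n_sims = 50, 500_000
pd = rng.uniform(0.01, 0.05, n_obligors)       # default probabilities
ead = rng.uniform(0.5, 5.0, n_obligors)        # loss amounts at default

defaults = rng.random((n_sims, n_obligors)) < pd
losses = defaults.astype(float) @ ead

# Exact conditional distribution: keep only paths where obligors 0 and 1 default.
both = defaults[:, 0] & defaults[:, 1]
losses_conditional = losses[both]              # rare event -> few surviving paths

# Stressed-PD approximation: PDs of obligors 0 and 1 set to 1, model re-run.
pd_stressed = pd.copy()
pd_stressed[:2] = 1.0
defaults_s = rng.random((n_sims, n_obligors)) < pd_stressed
losses_stressed = defaults_s.astype(float) @ ead

for name, x in [("conditional", losses_conditional), ("stressed-PD", losses_stressed)]:
    print(f"{name:12s}  mean={x.mean():6.2f}  99% quantile={np.quantile(x, 0.99):6.2f}")
```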

  19. Hamilton-Jacobi formalism to warm inflationary scenario

    Science.gov (United States)

    Sayar, K.; Mohammadi, A.; Akhtari, L.; Saaidi, Kh.

    2017-01-01

    The Hamilton-Jacobi formalism, as a powerful method, is utilized to reconsider the warm inflationary scenario, in which the scalar field that drives inflation interacts with other fields. Separating the analysis into strong and weak dissipative regimes, the approach is worked out for two popular choices of the dissipation coefficient Γ. Applying the slow-roll approximation, the required perturbation parameters are extracted and, by comparison with the latest Planck data, the free parameters are constrained. The possibility of producing acceptable inflation is studied; the results show that in all cases the model can successfully reproduce the amplitude of the scalar perturbation, the scalar spectral index, its running, and the tensor-to-scalar ratio.
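    For orientation, the standard warm-inflation background equations that such an analysis starts from (written here in reduced Planck units; this is the generic setup rather than the paper's specific parametrizations of Γ):

```latex
% Generic warm-inflation background equations (reduced Planck units, M_p = 1).
% Q = Gamma/(3H) separates the weak (Q << 1) and strong (Q >> 1) dissipative regimes.
\begin{align}
  \ddot\phi + 3H(1+Q)\,\dot\phi + V'(\phi) &= 0, & Q &\equiv \frac{\Gamma}{3H},\\
  \dot\rho_r + 4H\rho_r &= \Gamma\,\dot\phi^2, &
  H^2 &= \tfrac{1}{3}\!\left(\tfrac{1}{2}\dot\phi^2 + V(\phi) + \rho_r\right).
\end{align}
% In the Hamilton-Jacobi approach H is treated as a function of phi, so that the
% slow-roll background satisfies  \dot\phi = -2 H'(\phi)/(1+Q).
```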

  20. RenderGAN: Generating Realistic Labeled Data

    Directory of Open Access Journals (Sweden)

    Leon Sixt

    2018-06-01

    Full Text Available Deep Convolutional Neural Networks (DCNNs) are showing remarkable performance on many computer vision tasks. Due to their large parameter space, they require many labeled samples when trained in a supervised setting. The costs of annotating data manually can render the use of DCNNs infeasible. We present a novel framework called RenderGAN that can generate large amounts of realistic, labeled images by combining a 3D model and the Generative Adversarial Network framework. In our approach, image augmentations (e.g., lighting, background, and detail) are learned from unlabeled data such that the generated images are strikingly realistic while preserving the labels known from the 3D model. We apply the RenderGAN framework to generate images of barcode-like markers that are attached to honeybees. Training a DCNN on data generated by the RenderGAN yields considerably better performance than training it on various baselines.

  1. Maximizing direct current power delivery from bistable vibration energy harvesting beams subjected to realistic base excitations

    Science.gov (United States)

    Dai, Quanqi; Harne, Ryan L.

    2017-04-01

    Effective development of vibration energy harvesters is required to convert ambient kinetic energy into useful electrical energy as power supply for sensors, for example in structural health monitoring applications. Energy harvesting structures exhibiting bistable nonlinearities have previously been shown to generate large alternating current (AC) power when excited so as to undergo snap-through responses between stable equilibria. Yet, most microelectronics in sensors require rectified voltages and hence direct current (DC) power. While researchers have studied DC power generation from bistable energy harvesters subjected to harmonic excitations, there remain important questions as to the promise of such harvester platforms when the excitations are more realistic and include both harmonic and random components. To close this knowledge gap, this research computationally and experimentally studies the DC power delivery from bistable energy harvesters subjected to such realistic excitation combinations as those found in practice. Based on the results, it is found that the ability for bistable energy harvesters to generate peak DC power is significantly reduced by introducing sufficient amount of stochastic excitations into an otherwise harmonic input. On the other hand, the elimination of a low amplitude, coexistent response regime by way of the additive noise promotes power delivery if the device was not originally excited to snap-through. The outcomes of this research indicate the necessity for comprehensive studies about the sensitivities of DC power generation from bistable energy harvester to practical excitation scenarios prior to their optimal deployment in applications.
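    A minimal sketch of the kind of simulation involved: a generic bistable (negative linear plus cubic stiffness) oscillator driven by a harmonic base excitation with an added random component, integrated by Euler-Maruyama, with a crude rectified-power proxy. Parameter values and the power proxy are illustrative and are not taken from the paper's harvester model:

```python
import numpy as np

# Bistable Duffing-type oscillator under combined harmonic + random base excitation.
# The "DC power" proxy is the mean of a squared rectified velocity-proportional
# voltage -- a stand-in for the harvester circuit, chosen for brevity.
rng = np.random.default_rng(1)
dt, T = 1e-3, 200.0
n = int(T / dt)
zeta, k1, k3 = 0.05, -1.0, 1.0          # damping, negative linear, cubic stiffness
F, w, sigma = 0.12, 0.9, 0.05           # harmonic amplitude/frequency, noise intensity
theta = 0.5                              # electromechanical coupling (illustrative)

x, v = 1.0, 0.0                          # start in one potential well
p_acc = 0.0
for i in range(n):
    t = i * dt
    a = -2 * zeta * v - (k1 * x + k3 * x**3) + F * np.cos(w * t)
    v += a * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    x += v * dt
    p_acc += (theta * abs(v)) ** 2 * dt  # rectified-voltage power proxy

print("mean rectified power proxy:", p_acc / T)
```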

  2. Scenario development and evaluation for the NPP Krsko revised decommissioning program

    International Nuclear Information System (INIS)

    Levanat, I.; Lokner, V.; Subasic, D.

    2004-01-01

    In this first revision, several integrated scenarios of the NPP Krsko dismantling and waste management were developed and analyzed in order to estimate the decommissioning program (DP) costs and to propose an appropriate funding plan. Most dismantling technologies and cost estimates were derived from the original decommissioning plan adopted in 1996. The LILW disposal cost estimates, however, rely on the tunnel type facility design which was developed in Slovenia a few years ago, whereas the SF repository design for this DP was adapted from the Swedish deep disposal concept. The starting assumptions for this DP were that the LILW repository would be licensed by 2013, the NPP would be permanently shut down in 2023, and the SF repository would become available in 2030. The boundary conditions also specified that DP should first re-evaluate the SID strategy from the original plan (Strategy Immediate Dismantling with immediate SF disposal, but also with a long period of on-site decay storage for the activated components, so that it actually terminates only after 96 years), and then modify it to achieve truly prompt decommissioning in which all planned activities should be completed within about 15 years after the NPP shut-down. In addition, the option of SF export to a third country should be introduced in all DP scenarios, as a realistic alternative to SF disposal into the local repository (in Slovenia or in Croatia). And finally, dry storage of SF for some 30 years before disposal or export, in an independent installation on unspecified location, should be evaluated within the DP sensitivity analysis. After a thorough analysis of the original SID strategy, it became clear that substantial modifications would be necessary in order to meet the boundary conditions while complying with the specified design and technologies of the assumed LILW and SF disposal facilities. Therefore, a systematic procedure for development and financial evaluation of feasible scenarios was

  3. Pesticide exposure assessment for surface waters in the EU. Part 2: Determination of statistically based run-off and drainage scenarios for Germany.

    Science.gov (United States)

    Bach, Martin; Diesner, Mirjam; Großmann, Dietlinde; Guerniche, Djamal; Hommen, Udo; Klein, Michael; Kubiak, Roland; Müller, Alexandra; Preuss, Thomas G; Priegnitz, Jan; Reichenberger, Stefan; Thomas, Kai; Trapp, Matthias

    2017-05-01

    In order to assess surface water exposure to active substances of plant protection products (PPPs) in the European Union (EU), the FOCUS (FOrum for the Co-ordination of pesticide fate models and their USe) surface water workgroup introduced four run-off and six drainage scenarios for Step 3 of the tiered FOCUSsw approach. These scenarios may not necessarily represent realistic worst-case situations for the different Member States of the EU. Hence, the suitability of the scenarios for risk assessment in the national authorisation procedures is not known. Using Germany as an example, the paper illustrates how national soil-climate scenarios can be developed to model entries of active substances into surface waters from run-off and erosion (using the model PRZM) and from drainage (using the model MACRO). In the authorisation procedure for PPPs on Member State level, such soil-climate scenarios can be used to determine exposure endpoints with a defined overall percentile. The approach allows the development of national specific soil-climate scenarios and to calculate percentile-based exposure endpoints. The scenarios have been integrated into a software tool analogous to FOCUS-SWASH which can be used in the future to assess surface water exposure in authorisation procedures of PPPs in Germany. © 2017 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.

  4. Neutron star models with realistic high-density equations of state

    International Nuclear Information System (INIS)

    Malone, R.C.; Johnson, M.B.; Bethe, H.A.

    1975-01-01

    We calculate neutron star models using four realistic high-density models of the equation of state. We conclude that the maximum mass of a neutron star is unlikely to exceed 2 solar masses. All of the realistic models are consistent with current estimates of the moment of inertia of the Crab pulsar.

  5. Analysis and validation center for ITER RH maintenance scenarios in a virtual environment

    International Nuclear Information System (INIS)

    Elzendoorn, B.S.Q.; Baar, M. de; Hamilton, D.; Heemskerk, C.J.M.; Koning, J.F.; Ronden, D.M.S.

    2012-01-01

    A facility for detailed simulation of maintenance processes in the ITER Hot Cell Facility (HCF) has been taken into operation. The facility mimics the Remote Handling (RH) work-cells as are presently foreseen. Novel virtual reality (VR) technology, extended with a physics engine is used to create a realistic setting in which a team of Remote Handling (RH) operators can interact with a virtual Hot Cell environment. The physics engine is used to emulate the Hot Cell behavior and to provide tactile feed-back of the (virtual) slave. Multi-operator maintenance scenarios can be developed and tested in virtual reality. Complex interactions between the RH operators and the HCF control system software will be tested. Task performance will be quantified and operational resource consumption will be estimated.

  6. Analysis and validation center for ITER RH maintenance scenarios in a virtual environment

    Energy Technology Data Exchange (ETDEWEB)

    Elzendoorn, B.S.Q., E-mail: B.S.Q.Elzendoorn@rijnhuizen.nl [FOM-Institute for Plasma Physics Rijnhuizen, Association EURATOM-FOM, Partner in the Trilateral Euregio Cluster and ITER-NL, PO Box 1207, 3430 BE, Nieuwegein (Netherlands); Baar, M. de [FOM-Institute for Plasma Physics Rijnhuizen, Association EURATOM-FOM, Partner in the Trilateral Euregio Cluster and ITER-NL, PO Box 1207, 3430 BE, Nieuwegein (Netherlands); Hamilton, D. [ITER Organization, Route de Vinon-sur-Verdon, CS 90 046, 13067 St. Paul-lez-Durance Cedex (France); Heemskerk, C.J.M. [Heemskerk Innovative Technology, Sassenheim (Netherlands); Koning, J.F.; Ronden, D.M.S. [FOM-Institute for Plasma Physics Rijnhuizen, Association EURATOM-FOM, Partner in the Trilateral Euregio Cluster and ITER-NL, PO Box 1207, 3430 BE, Nieuwegein (Netherlands)

    2012-03-15

    A facility for detailed simulation of maintenance processes in the ITER Hot Cell Facility (HCF) has been taken into operation. The facility mimics the Remote Handling (RH) work-cells as are presently foreseen. Novel virtual reality (VR) technology, extended with a physics engine is used to create a realistic setting in which a team of Remote Handling (RH) operators can interact with a virtual Hot Cell environment. The physics engine is used to emulate the Hot Cell behavior and to provide tactile feed-back of the (virtual) slave. Multi-operator maintenance scenarios can be developed and tested in virtual reality. Complex interactions between the RH operators and the HCF control system software will be tested. Task performance will be quantified and operational resource consumption will be estimated.

  7. Bell Operator Method to Classify Local Realistic Theories

    International Nuclear Information System (INIS)

    Nagata, Koji

    2010-01-01

    We review the historical fact of multipartite Bell inequalities with an arbitrary number of settings. An explicit local realistic model for the values of a correlation function, given in a two-setting Bell experiment (two-setting model), works only for the specific set of settings in the given experiment, but cannot construct a local realistic model for the values of a correlation function, given in a continuous-infinite settings Bell experiment (infinite-setting model), even though there exist two-setting models for all directions in space. Hence, the two-setting model does not have the property that the infinite-setting model has. Here, we show that an explicit two-setting model cannot construct a local realistic model for the values of a correlation function, given in an M-setting Bell experiment (M-setting model), even though there exist two-setting models for the M measurement directions chosen in the given M-setting experiment. Hence, the two-setting model does not have the property that the M-setting model has. (general)

  8. Cognitive—Motor Interference in an Ecologically Valid Street Crossing Scenario

    Directory of Open Access Journals (Sweden)

    Christin Janouch

    2018-05-01

    Full Text Available Laboratory-based research revealed that gait involves higher cognitive processes, leading to performance impairments when executed with a concurrent loading task. Deficits are especially pronounced in older adults. Theoretical approaches like the multiple resource model highlight the role of task similarity and associated attention distribution problems. It has been shown that in cases where these distribution problems are perceived relevant to participant's risk of falls, older adults prioritize gait and posture over the concurrent loading task. Here we investigate whether findings on task similarity and task prioritization can be transferred to an ecologically valid scenario. Sixty-three younger adults (20–30 years of age and 61 older adults (65–75 years of age participated in a virtual street crossing simulation. The participants' task was to identify suitable gaps that would allow them to cross a simulated two way street safely. Therefore, participants walked on a manual treadmill that transferred their forward motion to forward displacements in a virtual city. The task was presented as a single task (crossing only and as a multitask. In the multitask condition participants were asked, among others, to type in three digit numbers that were presented either visually or auditorily. We found that for both age groups, street crossing as well as typing performance suffered under multitasking conditions. Impairments were especially pronounced for older adults (e.g., longer crossing initiation phase, more missed opportunities. However, younger and older adults did not differ in the speed and success rate of crossing. Further, deficits were stronger in the visual compared to the auditory task modality for most parameters. Our findings conform to earlier studies that found an age-related decline in multitasking performance in less realistic scenarios. However, task similarity effects were inconsistent and question the validity of the multiple

  9. Toward developing more realistic groundwater models using big data

    Science.gov (United States)

    Vahdat Aboueshagh, H.; Tsai, F. T. C.; Bhatta, D.; Paudel, K.

    2017-12-01

    Rich geological data is the backbone of developing realistic groundwater models for groundwater resources management. However, constructing realistic groundwater models can be challenging due to inconsistency between different sources of geological, hydrogeological and geophysical data and the difficulty of processing big data to characterize the subsurface environment. This study develops a framework to utilize a big geological dataset to create a groundwater model for the Chicot Aquifer in southwestern Louisiana, which borders the Gulf of Mexico to the south. The Chicot Aquifer is the principal source of fresh water in southwest Louisiana, underlying an area of about 9,000 square miles. Agriculture is the largest groundwater consumer in this region, and overpumping has caused significant groundwater head decline and saltwater intrusion from the Gulf and deep formations. A hydrostratigraphy model was constructed using around 29,000 electrical logs and drillers' logs as well as screen lengths of pumping wells through a natural neighbor interpolation method. These sources of information have different weights in terms of accuracy and trustworthiness. A data prioritization procedure was developed to filter untrustworthy log information, eliminate redundant data, and establish a consensus among the various lithological information. The constructed hydrostratigraphy model shows 40% sand facies, which is consistent with the well log data. The hydrostratigraphy model confirms outcrop areas of the Chicot Aquifer in the north of the study region. The aquifer sand formation is thinning eastward to merge into the Atchafalaya River alluvial aquifer and coalesces with the underlying Evangeline aquifer. A grid generator was used to convert the hydrostratigraphy model into a MODFLOW grid with 57 layers. A Chicot groundwater model was constructed using the available hydrologic and hydrogeological data for 2004-2015. Pumping rates for irrigation wells were estimated using the crop type and acreage

  10. Culling dogs in scenarios of imperfect control: realistic impact on the prevalence of canine visceral leishmaniasis.

    Directory of Open Access Journals (Sweden)

    Danielle N C C Costa

    Full Text Available BACKGROUND: Visceral leishmaniasis belongs to the list of neglected tropical diseases and is considered a public health problem worldwide. Spatial correlation between the occurrence of the disease in humans and high rates of canine infection suggests that in the presence of the vector, canine visceral leishmaniasis is the key factor for triggering transmission to humans. Despite the control strategies implemented, such as the culling of infected dogs, the incidence of American visceral leishmaniasis remains high in many Latin American countries. METHODOLOGY/PRINCIPAL FINDINGS: Mathematical models were developed to describe the transmission dynamics of canine leishmaniasis and its control by culling. Using these models, imperfect control scenarios were implemented to verify the possible factors which alter the effectiveness of controlling this disease in practice. CONCLUSIONS/SIGNIFICANCE: A long-term continuous program targeting both asymptomatic and symptomatic dogs should be effective in controlling canine leishmaniasis in areas of low to moderate transmission (R0 up to 1.4). However, the indiscriminate sacrifice of asymptomatic dogs with a positive diagnosis may jeopardize the effectiveness of the control program if tests with low specificity are used, increasing the chance of generating outrage in the population and leading to lower adherence to the program. Therefore, culling must be planned accurately and implemented responsibly, never as a mechanical measure applied on a large scale. In areas with higher transmission, culling alone is not an effective control strategy.
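    In its simplest form, the kind of transmission model described here reduces to an SI-type system for the dog population with an extra culling removal rate; a generic sketch (the paper's model also tracks vectors and separate asymptomatic/symptomatic classes):

```latex
% Minimal SI-type sketch of canine transmission with culling at rate c
% (a generic illustration, not the paper's full dog-vector model).
\begin{align}
  \frac{dS}{dt} &= \mu N - \beta \frac{S I}{N} - \mu S, &
  \frac{dI}{dt} &= \beta \frac{S I}{N} - (\mu + c)\, I, &
  R_c &= \frac{\beta}{\mu + c},
\end{align}
% so culling controls transmission when c is large enough that R_c < 1,
% i.e. c > beta - mu.
```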

  11. Blend Shape Interpolation and FACS for Realistic Avatar

    Science.gov (United States)

    Alkawaz, Mohammed Hazim; Mohamad, Dzulkifli; Basori, Ahmad Hoirul; Saba, Tanzila

    2015-03-01

    The quest for realistic facial animation is ever-growing. The emergence of sophisticated algorithms, new graphical user interfaces, laser scans and advanced 3D tools has imparted further impetus towards the rapid advancement of complex virtual human facial models. Face-to-face communication being the most natural way of human interaction, facial animation systems have become more attractive in the information technology era for sundry applications. The production of computer-animated movies using synthetic actors is still a challenging issue. A proposed facial expression carries the signature of happiness, sadness, anger, cheerfulness, etc. The mood of a particular person in the midst of a large group can immediately be identified via very subtle changes in facial expressions. Facial expressions, being very complex as well as an important nonverbal communication channel, are tricky to synthesize realistically using computer graphics. Computer synthesis of practical facial expressions must deal with the geometric representation of the human face and the control of the facial animation. We developed a new approach by integrating blend shape interpolation (BSI) and the facial action coding system (FACS) to create a realistic and expressive computer facial animation design. The BSI is used to generate the natural face while the FACS is employed to reflect the exact facial muscle movements for four basic natural emotional expressions, namely anger, happiness, sadness and fear, with high fidelity. The resulting perception of realistic facial expressions for virtual human emotions, based on facial skin color and texture, may contribute towards the development of virtual reality and game environments in computer-aided graphics animation systems.
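    Blend shape interpolation itself is a linear combination of per-vertex offsets from a neutral mesh, result = neutral + Σ w_i (target_i − neutral). A minimal sketch (vertex arrays, target shapes and the emotion-to-weights table are toy placeholders standing in for the FACS-driven mapping described above):

```python
import numpy as np

# Minimal blend shape interpolation: result = neutral + sum_i w_i * (target_i - neutral).
# Meshes are (n_vertices, 3) arrays; the emotion->weights table is a toy placeholder
# standing in for a FACS-driven mapping.
def blend(neutral, targets, weights):
    result = neutral.copy()
    for name, w in weights.items():
        result += w * (targets[name] - neutral)
    return result

n_vertices = 4
neutral = np.zeros((n_vertices, 3))
targets = {
    "brow_raise": neutral + np.array([0.0, 0.1, 0.0]),   # toy per-vertex offsets
    "smile":      neutral + np.array([0.05, 0.0, 0.02]),
}
happy_weights = {"brow_raise": 0.3, "smile": 0.9}        # e.g. a "happy" expression
face = blend(neutral, targets, happy_weights)
print(face)
```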

  12. Benchmark accident scenarios for nuclear powered warship visits to Australian ports

    International Nuclear Information System (INIS)

    Frikken, A.J.

    1996-01-01

    planning and exercise purposes. The set consists of twenty separate BMA scenarios, covering four orders of magnitude in accident severity. For each BMA scenario, simulated radiological field measurements can be obtained quickly for any location and time during an exercise, allowing a realistic test of the process for dose estimation and decision making on the need for countermeasures. It is envisaged that the BMA scenarios will be used by the States and Territories for reviewing and exercising the emergency arrangements for NPW visits. The NSB has also investigated the use of computer techniques for dose assessment during an NPW emergency. Real time dose assessment computer codes can enhance emergency response by providing increased accuracy, speed and flexibility in assessing the situation, resulting in more appropriate and timely decisions. Two such computer codes are currently available in Australia which may be readily integrated into the emergency response to NPW accidents

  13. Integrative Scenario Development

    Directory of Open Access Journals (Sweden)

    Joerg A. Priess

    2014-03-01

    Full Text Available Scenarios are employed to address a large number of future environmental and socioeconomic challenges. We present a conceptual framework for the development of scenarios to integrate the objectives of different stakeholder groups. Based on the framework, land-use scenarios were developed to provide a common base for further research. At the same time, these scenarios assisted regional stakeholders to bring forward their concerns and arrive at a shared understanding of challenges between scientific and regional stakeholders, which allowed them to eventually support regional decision making. The focus on the integration of views and knowledge domains of different stakeholder groups, such as scientists and practitioners, required rigorous and repeated measures of quality control. The application of the integrative concept provided products for both stakeholder groups, and the process of scenario development facilitated cooperation and learning within both the scientist and practitioner groups as well as between the two groups.

  14. Value Function Approximation or Stopping Time Approximation

    DEFF Research Database (Denmark)

    Stentoft, Lars

    2014-01-01

    In their 2001 paper, Longstaff and Schwartz suggested a method for American option pricing using simulation and regression, and since then this method has rapidly gained importance. However, the idea of using regression and simulation for American option pricing was used at least as early as 1996......, due to this difference, it is possible to provide arguments favoring the method of Longstaff and Schwartz. Finally, we compare the methods in a realistic numerical setting and show that practitioners would do well to choose the method of Longstaff and Schwartz instead of the methods of Carriere...
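    For context, the Longstaff-Schwartz method approximates the optimal stopping rule by regressing simulated continuation values on the current state and comparing them with immediate exercise. A compact sketch for an American put under geometric Brownian motion (all parameters are illustrative):

```python
import numpy as np

# Least-squares Monte Carlo (Longstaff-Schwartz) for an American put on a
# geometric Brownian motion. The continuation value is approximated by a
# quadratic regression on the asset price, using in-the-money paths only.
rng = np.random.default_rng(0)
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
n_steps, n_paths = 50, 100_000
dt = T / n_steps
disc = np.exp(-r * dt)

# Simulate price paths at the exercise dates dt, 2*dt, ..., T.
z = rng.standard_normal((n_paths, n_steps))
S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1))

cashflow = np.maximum(K - S[:, -1], 0.0)           # payoff at maturity
for t in range(n_steps - 2, -1, -1):
    cashflow *= disc                                # value as of exercise date t
    itm = K - S[:, t] > 0.0
    if itm.sum() > 0:
        x = S[itm, t]
        basis = np.column_stack([np.ones_like(x), x, x**2])
        coef, *_ = np.linalg.lstsq(basis, cashflow[itm], rcond=None)
        continuation = basis @ coef
        exercise = K - x
        do_exercise = exercise > continuation
        idx = np.where(itm)[0][do_exercise]
        cashflow[idx] = exercise[do_exercise]        # stop: take the exercise value

price = disc * cashflow.mean()                       # discount from the first date to 0
print("American put price estimate:", price)
```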

  15. The future of scenarios: issues in developing new climate change scenarios

    International Nuclear Information System (INIS)

    Pitcher, Hugh M

    2009-01-01

    In September, 2007, the IPCC convened a workshop to discuss how a new set of scenarios to support climate model runs, mitigation analyses, and impact, adaptation and vulnerability research might be developed. The first phase of the suggested new approach is now approaching completion. This article discusses some of the issues raised by scenario relevant research and analysis since the last set of IPCC scenarios were created (IPCC SRES, 2000) that will need to be addressed as new scenarios are developed by the research community during the second phase. These include (1) providing a logic for how societies manage to transition from historical paths to the various future development paths foreseen in the scenarios, (2) long-term economic growth issues, (3) the appropriate GDP metric to use (purchasing power parity or market exchange rates), (4) ongoing issues with moving from the broad geographic and time scales of the emission scenarios to the finer scales needed for impacts, adaptation and vulnerability analyses and (5) some possible ways to handle the urgent request from the policy community for some guidance on scenario likelihoods. The challenges involved in addressing these issues are manifold; the reward is greater credibility and deeper understanding of an analytic tool that does much to form the context within which many issues in addition to the climate problem will need to be addressed.

  16. The future of scenarios: issues in developing new climate change scenarios

    Science.gov (United States)

    Pitcher, Hugh M.

    2009-04-01

    In September, 2007, the IPCC convened a workshop to discuss how a new set of scenarios to support climate model runs, mitigation analyses, and impact, adaptation and vulnerability research might be developed. The first phase of the suggested new approach is now approaching completion. This article discusses some of the issues raised by scenario relevant research and analysis since the last set of IPCC scenarios were created (IPCC SRES, 2000) that will need to be addressed as new scenarios are developed by the research community during the second phase. These include (1) providing a logic for how societies manage to transition from historical paths to the various future development paths foreseen in the scenarios, (2) long-term economic growth issues, (3) the appropriate GDP metric to use (purchasing power parity or market exchange rates), (4) ongoing issues with moving from the broad geographic and time scales of the emission scenarios to the finer scales needed for impacts, adaptation and vulnerability analyses and (5) some possible ways to handle the urgent request from the policy community for some guidance on scenario likelihoods. The challenges involved in addressing these issues are manifold; the reward is greater credibility and deeper understanding of an analytic tool that does much to form the context within which many issues in addition to the climate problem will need to be addressed.

  17. Fatigue - determination of a more realistic usage factor

    International Nuclear Information System (INIS)

    Lang, H.

    2001-01-01

    The ability to use a suitable counting method for determining the stress range spectrum in elastic and simplified elastic-plastic fatigue analyses is crucial for obtaining a realistic usage factor. Determination of the elastic-plastic strain range using the Ke factor from fictitious, elastically calculated loads is also important where the elastic range is exceeded. This paper therefore examines both points in detail. A fatigue module with additional options, which functions on this basis, is presented. The much more realistic determination of the usage factor presented here offers various economic benefits depending on the application.
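    For reference, the cumulative usage factor at the end of such an analysis is the Miner sum over the counted stress-range spectrum (a generic statement of the rule, not the paper's specific module or Ke procedure):

```latex
% Cumulative usage factor by Miner's rule over the counted stress-range spectrum:
% n_i cycles counted in range bin i, N_i allowable cycles from the design fatigue curve.
\begin{equation}
  U \;=\; \sum_i \frac{n_i}{N_i} \;\le\; 1 .
\end{equation}
% When the elastic range is exceeded, the simplified elastic-plastic procedure
% amplifies the elastically computed alternating stress by a factor K_e before
% entering the fatigue curve, so an overly conservative K_e directly inflates U.
```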

  18. Synchronization scenarios in the Winfree model of coupled oscillators

    Science.gov (United States)

    Gallego, Rafael; Montbrió, Ernest; Pazó, Diego

    2017-10-01

    Fifty years ago Arthur Winfree proposed a deeply influential mean-field model for the collective synchronization of large populations of phase oscillators. Here we provide a detailed analysis of the model for some special, analytically tractable cases. Adopting the thermodynamic limit, we derive an ordinary differential equation that exactly describes the temporal evolution of the macroscopic variables in the Ott-Antonsen invariant manifold. The low-dimensional model is then thoroughly investigated for a variety of pulse types and sinusoidal phase response curves (PRCs). Two structurally different synchronization scenarios are found, which are linked via the mutation of a Bogdanov-Takens point. From our results, we infer a general rule of thumb relating pulse shape and PRC offset with each scenario. Finally, we compare the exact synchronization threshold with the prediction of the averaging approximation given by the Kuramoto-Sakaguchi model. At the leading order, the discrepancy appears to behave as an odd function of the PRC offset.
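    For reference, the Winfree mean-field model referred to here couples each oscillator's phase response curve Q to the population-averaged pulse P; its standard form is shown below (the paper's specific pulse shapes and sinusoidal PRCs are particular choices of P and Q):

```latex
% Winfree model of N globally pulse-coupled phase oscillators:
% P is the pulse emitted by each oscillator, Q the phase response curve (PRC),
% epsilon the coupling strength and omega_i the natural frequencies.
\begin{equation}
  \dot{\theta}_i \;=\; \omega_i \;+\; Q(\theta_i)\,\frac{\varepsilon}{N}\sum_{j=1}^{N} P(\theta_j),
  \qquad i = 1,\dots,N .
\end{equation}
% A common sinusoidal PRC with offset beta is, e.g., Q(theta) = sin(beta) - sin(theta + beta),
% which is the kind of "PRC offset" dependence referred to in the abstract.
```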

  19. Realistic rhetoric and legal decision

    Directory of Open Access Journals (Sweden)

    João Maurício Adeodato

    2017-06-01

    Full Text Available The text aims to lay the foundations of a realistic rhetoric, from the descriptive perspective of how the legal decision actually takes place, without normative considerations. Aristotle's rhetorical idealism and its later prestige reduced rhetoric to the art of persuasion, eliminating important elements of sophistry, especially with regard to legal decision. It concludes with a rhetorical perspective of judicial activism in complex societies.

  20. Realistic molecular model of kerogen's nanostructure.

    Science.gov (United States)

    Bousige, Colin; Ghimbeu, Camélia Matei; Vix-Guterl, Cathie; Pomerantz, Andrew E; Suleimenova, Assiya; Vaughan, Gavin; Garbarino, Gaston; Feygenson, Mikhail; Wildgruber, Christoph; Ulm, Franz-Josef; Pellenq, Roland J-M; Coasne, Benoit

    2016-05-01

    Despite kerogen's importance as the organic backbone for hydrocarbon production from source rocks such as gas shale, the interplay between kerogen's chemistry, morphology and mechanics remains unexplored. As the environmental impact of shale gas rises, identifying functional relations between its geochemical, transport, elastic and fracture properties from realistic molecular models of kerogens becomes all the more important. Here, by using a hybrid experimental-simulation method, we propose a panel of realistic molecular models of mature and immature kerogens that provide a detailed picture of kerogen's nanostructure without considering the presence of clays and other minerals in shales. We probe the models' strengths and limitations, and show that they predict essential features amenable to experimental validation, including pore distribution, vibrational density of states and stiffness. We also show that kerogen's maturation, which manifests itself as an increase in the sp2/sp3 hybridization ratio, entails a crossover from plastic-to-brittle rupture mechanisms.

  1. Working Toward Policy-Relevant Air Quality Emissions Scenarios

    Science.gov (United States)

    Holloway, T.

    2010-12-01

    to meet the increasingly intricate demands of both advanced air quality models and more realistic and relevant policy scenarios.

  2. The Emissions Scenarios Portal: Visualizing Low-Carbon Pathways for the 21st Century

    Science.gov (United States)

    Hennig, R. J.; Friedrich, J.; Ge, M.; Mountford, H.; Fransen, T.; Altamirano, J. C.; Thanawala, Z.; Arcipowska, A.

    2017-12-01

    The Emissions Scenarios Portal (ESP) is a newly developed exploration tool for 21st century low-carbon pathways and for investigating the Nationally Determined Contributions (NDCs) that countries have put forward under the Paris Agreement. It is open to the public and aims to help achieve the goal of limiting global temperature increase to well below 2 degrees Celsius above pre-industrial levels by enhancing access to high-quality, up-to-date scenario information. It can guide users to set ambitious, realistic emission mitigation goals and understand what these goals imply for different sectors of the economy. Data will be integrated from a wide variety of economic and energy-system models, with results from national models as well as globally integrated assessment models (IAMs) and countries' biennial update reports (BURs). This information can support policy and investment decision making that will lead to a low-carbon future. It is designed to help find answers to questions such as "Are the NDCs enough to put the world on a 2DC track?", "What do NDCs imply for different sectors of the economy under different assumptions?" or "What are good ways to increase ambition beyond NDCs?". The portal strives to achieve both inter-comparability across a wide range of different models and nationally reported scenarios, as well as the flexibility to allow modelers to bring out the strengths and purpose of their model on the platform. Furthermore, it aims to enhance standardized and transparent reporting of emissions scenarios and relevant metadata, assumptions and results to improve the understanding, accessibility and impact of the scenarios. On the data side, these competing objectives present interesting challenges for both the collection and the communication of the data, and in this presentation we present some of our ideas for tackling them. This project will be part of Climate Watch, a new data platform developed jointly by the World Resources Institute and the NDC

  3. Particle reduction strategies - PAREST. Gridded European emission data for projection years 2010, 2015 and 2020 based on the IIASA GAINS NEC scenarios. Teilbericht

    Energy Technology Data Exchange (ETDEWEB)

    Gon, Hugo Denier van der; Visschedijk, Antoon; Brugh, Hans van den [TNO Earth, Environment and Life Sciences, Utrecht (Netherlands)

    2013-06-15

    Projected emissions for selected scenarios for the years 2010, 2015 and 2020 were obtained from the GAINS NEC scenario reports and distributed at high resolution over Europe using the TNO gridding tools. These emission maps are available as model input in the PAREST project to model the contribution of Europe to air quality in Germany in 2010, 2015 and 2020 (see note Rainer Stern, May 2009). The scenarios have a significant influence on absolute emission levels for the countries that were covered by IIASA GAINS. This suggests that emissions in countries for which no scenarios were available (Armenia, Azerbaijan, Georgia), or where only a projection-year baseline is available (all non-EU), may be subject to significant changes as well (but these countries are quite far from Germany). For future projects it is recommended to make simple and transparent scenarios for these other countries, as well as for International Shipping. The change in emissions from the base year 2005 to the projection year 2010 needs to be interpreted with care, because some methodological differences exist between the official 2005 emission data used in the PAREST base-year 2005 emission set and the GAINS 2010 data. It is expected that the emission reduction steps towards 2020 are more realistic.

  4. On the Realistic Stochastic Model of GPS Observables: Implementation and Performance

    Science.gov (United States)

    Zangeneh-Nejad, F.; Amiri-Simkooei, A. R.; Sharifi, M. A.; Asgari, J.

    2015-12-01

    High-precision GPS positioning requires a realistic stochastic model of the observables. A realistic GPS stochastic model should take into account different variances for different observation types, correlations among different observables, the satellite-elevation dependence of the observables' precision, and the temporal correlation of the observables. Least-squares variance component estimation (LS-VCE) is applied to GPS observables using the geometry-based observation model (GBOM). To model the satellite-elevation dependence of the observables' precision, an exponential model depending on the elevation angles of the satellites is employed. Temporal correlation of the GPS observables is modelled using a first-order autoregressive noise model. An important step in high-precision GPS positioning is double-difference integer ambiguity resolution (IAR). The fraction or percentage of successes among a number of integer ambiguity fixings is called the success rate. A realistic estimation of the GNSS observables' covariance matrix plays an important role in IAR. We consider the ambiguity resolution success rate for two cases, namely a nominal and a realistic stochastic model of the GPS observables, using two GPS data sets collected by the Trimble R8 receiver. The results confirm that applying a more realistic stochastic model can significantly improve the IAR success rate on individual frequencies, either on L1 or on L2; an improvement of 20% in the empirical success rate was achieved. The results also indicate that introducing the realistic stochastic model leads to a larger standard deviation for the baseline components, by a factor of about 2.6 on the data sets considered.
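
    The two main ingredients of such a stochastic model can be sketched in a few lines of Python: an exponential elevation-dependent standard deviation and a first-order autoregressive (AR(1)) temporal correlation, combined into a single-satellite covariance matrix. The functional forms follow the abstract, but every numerical value below is an assumed placeholder rather than an LS-VCE estimate from the paper.

    import numpy as np

    def elev_std(elev_deg, sigma0=0.003, a=0.005, e0=15.0):
        """sigma(E) = sigma0 + a * exp(-E / e0), elevation E in degrees (metres)."""
        return sigma0 + a * np.exp(-np.asarray(elev_deg) / e0)

    def ar1_correlation(n_epochs, dt=1.0, tau=30.0):
        """AR(1) temporal correlation matrix, rho(k) = exp(-|k| * dt / tau)."""
        lags = np.abs(np.subtract.outer(np.arange(n_epochs), np.arange(n_epochs)))
        return np.exp(-lags * dt / tau)

    # Covariance of one satellite's observable over 5 epochs at 20 deg elevation
    sigma = elev_std(20.0)
    cov = sigma ** 2 * ar1_correlation(5)
    print(np.round(cov * 1e6, 3))   # in mm^2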

  5. Multi-scenario evaluation and specification of electromagnetic loads on ITER vacuum vessel

    International Nuclear Information System (INIS)

    Rozov, Vladimir; Martinez, J.-M.; Portafaix, C.; Sannazzaro, G.

    2014-01-01

    Highlights: • We present the results of a multi-scenario analysis of EM loads on the ITER vacuum vessel (VV). • Differentiation of the models provides an economical way to perform a large number of calculations. • Functional approximation is proposed for the specification of distributed data/FE/numerical results. • Examples of specification of the load profiles by trigonometric polynomials (DHT) are given. • Principles of accounting for toroidal asymmetry in EM interactions in a tokamak are considered. - Abstract: The electro-magnetic (EM) transients cause mechanical forces, which represent one of the most critical loads for the ITER vacuum vessel (VV). The paper is focused on the results of a multi-scenario analysis and systematization of these EM loads, including specifically addressed pressures on the shells and the net vertical force. The proposed mathematical model and computational technology, based on the use of integral parameters and operational analysis methods, enabled qualitative and quantitative analysis of the problem, time-efficient computations and systematic assessment of a large number of scenarios. The obtained estimates, the envelopes found and the peak values exemplify the principal loads on the VV and provide a database to support engineering load specifications. Special attention is given to the challenge of specifying and documenting the results in a form suitable for using the data in engineering applications. The practical aspects of specifying distributed data, such as experimental and finite-element (FE) results, by analytical interpolants are discussed. An example of functional approximation of the load profiles by trigonometric polynomials based on the discrete Hartley transform (DHT) is given.
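
    As a hedged illustration of the functional-approximation step mentioned in the highlights, the Python sketch below fits a synthetic load profile with a trigonometric polynomial via the discrete Hartley transform (DHT), keeping only the dominant coefficients. The profile and the number of retained terms are invented for the example; in the paper the input would be distributed FE results.

    import numpy as np

    def dht(x):
        """DHT: H[k] = sum_n x[n] * cas(2*pi*k*n/N), cas(t) = cos(t) + sin(t)."""
        n = len(x)
        arg = 2.0 * np.pi * np.outer(np.arange(n), np.arange(n)) / n
        return (np.cos(arg) + np.sin(arg)) @ x

    def idht(h):
        """Inverse DHT (the DHT matrix squared equals N times the identity)."""
        return dht(h) / len(h)

    angles = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
    profile = 100.0 + 30.0 * np.cos(angles) + 10.0 * np.sin(3.0 * angles)  # synthetic load

    coeffs = dht(profile)
    keep = np.argsort(np.abs(coeffs))[-7:]        # retain the 7 largest coefficients
    truncated = np.zeros_like(coeffs)
    truncated[keep] = coeffs[keep]
    approx = idht(truncated)

    print(f"max reconstruction error: {np.max(np.abs(approx - profile)):.2e}")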

  6. Multi-scenario evaluation and specification of electromagnetic loads on ITER vacuum vessel

    Energy Technology Data Exchange (ETDEWEB)

    Rozov, Vladimir, E-mail: vladimir.rozov@iter.org; Martinez, J.-M.; Portafaix, C.; Sannazzaro, G.

    2014-10-15

    Highlights: • We present the results of a multi-scenario analysis of EM loads on the ITER vacuum vessel (VV). • Differentiation of the models provides an economical way to perform a large number of calculations. • Functional approximation is proposed for the specification of distributed data/FE/numerical results. • Examples of specification of the load profiles by trigonometric polynomials (DHT) are given. • Principles of accounting for toroidal asymmetry in EM interactions in a tokamak are considered. - Abstract: The electro-magnetic (EM) transients cause mechanical forces, which represent one of the most critical loads for the ITER vacuum vessel (VV). The paper is focused on the results of a multi-scenario analysis and systematization of these EM loads, including specifically addressed pressures on the shells and the net vertical force. The proposed mathematical model and computational technology, based on the use of integral parameters and operational analysis methods, enabled qualitative and quantitative analysis of the problem, time-efficient computations and systematic assessment of a large number of scenarios. The obtained estimates, the envelopes found and the peak values exemplify the principal loads on the VV and provide a database to support engineering load specifications. Special attention is given to the challenge of specifying and documenting the results in a form suitable for using the data in engineering applications. The practical aspects of specifying distributed data, such as experimental and finite-element (FE) results, by analytical interpolants are discussed. An example of functional approximation of the load profiles by trigonometric polynomials based on the discrete Hartley transform (DHT) is given.

  7. Feasibility study for a realistic training dedicated to radiological protection improvement

    International Nuclear Information System (INIS)

    Courageot, E.; Kutschera, R.; Gaillard-Lecanu, E.; Jahan, S.; Riedel, A.; Therache, B.

    2013-01-01

    An evident purpose of radiological protection training is to teach the use of suitable protective equipment and correct behaviour if unexpected working conditions arise. A major difficulty of this training consists in obtaining the most realistic readings from the monitoring devices for a given exposure situation, but without using real radioactive sources. A new approach is being developed at EDF R&D for radiological protection training. This approach combines different technologies, in an environment representative of the workplace but geographically separated from the nuclear power plant: a training area representative of a workplace, a man-machine interface used by the trainer to define the source configuration and the training scenario, a geo-localization system, fictive radiation monitoring devices and a particle transport code able to calculate in real time the dose map due to the virtual sources. In a first approach, our real-time particle transport code, called Moderato, used only an attenuation law along a straight line. To improve the realism further, we would like to switch to a code based on Monte Carlo particle transport, such as Geant4 or MCNPX, instead of Moderato. The aim of our study is the evaluation of such a code in our application, in particular the possibility of keeping a real-time response in our architecture. (authors)
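
    For orientation only, the kind of straight-line attenuation calculation attributed to the first version of Moderato can be sketched in Python as a point source with inverse-square geometry and exponential attenuation. The dose-rate constant and attenuation coefficient below are rough assumed values, not part of the EDF setup.

    import math

    GAMMA = 0.093   # assumed dose-rate constant, mSv*m^2/(GBq*h), roughly Cs-137-like

    def dose_rate(activity_gbq, distance_m, mu_per_m=0.0, thickness_m=0.0):
        """Dose rate (mSv/h) = GAMMA * A / d^2 * exp(-mu * x)."""
        return GAMMA * activity_gbq / distance_m ** 2 * math.exp(-mu_per_m * thickness_m)

    # 50 GBq virtual source seen at 2 m through 10 cm of water-equivalent shielding
    # (attenuation coefficient assumed ~8.6 per metre at ~0.66 MeV)
    print(f"{dose_rate(50.0, 2.0, mu_per_m=8.6, thickness_m=0.10):.2f} mSv/h")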

  8. Optimizing Decision Preparedness by Adapting Scenario Complexity and Automating Scenario Generation

    Science.gov (United States)

    Dunne, Rob; Schatz, Sae; Flore, Stephen M.; Nicholson, Denise

    2011-01-01

    Klein's recognition-primed decision (RPD) framework proposes that experts make decisions by recognizing similarities between current decision situations and previous decision experiences. Unfortunately, military personnel are often presented with situations that they have not experienced before. Scenario-based training (SBT) can help mitigate this gap. However, SBT remains a challenging and inefficient training approach. To address these limitations, the authors present an innovative formulation of scenario complexity that contributes to the larger research goal of developing an automated scenario generation system. This system will enable trainees to effectively advance through a variety of increasingly complex decision situations and experiences. By adapting scenario complexities and automating generation, trainees will be provided with a greater variety of appropriately calibrated training events, thus broadening their repositories of experience. Preliminary results from empirical testing (N=24) of the proof-of-concept formula are presented, and future avenues of scenario complexity research are also discussed.

  9. Photo-Realistic Image Synthesis and Virtual Cinematography

    DEFF Research Database (Denmark)

    Livatino, Salvatore

    2005-01-01

    Realistic Virtual View Synthesis is a new field of research that has received increasing attention in recent years. It is closely related to the growing popularity of virtual reality and the spread of its applications, among them virtual photography and cinematography. The use of computer-generated characters, "virtual actors", in motion picture production increases every day. While the best-known computer graphics techniques have largely been adopted successfully in today's fiction films, it still remains very challenging to implement virtual actors that visually resemble human beings. Interestingly, film directors have been looking at the recent progress achieved by the research community in the field of realistic visualization of virtual views, and they have successfully implemented state-of-the-art research approaches in their productions. An innovative concept is then gaining consensus

  10. Future Scenarios of Land Change Based on Empirical Data and Demographic Trends

    Science.gov (United States)

    Sleeter, Benjamin M.; Wilson, Tamara S.; Sharygin, Ethan; Sherba, Jason T.

    2017-11-01

    Changes in land use and land cover (LULC) have important and fundamental interactions with the global climate system. Top-down global scale projections of land use change have been an important component of climate change research; however, their utility at local to regional scales is often limited. The goal of this study was to develop an approach for projecting changes in LULC based on land use histories and demographic trends. We developed a set of stochastic, empirical-based projections of LULC change for the state of California, for the period 2001-2100. Land use histories and demographic trends were used to project a "business-as-usual" (BAU) scenario and three population growth scenarios. For the BAU scenario, we projected developed lands would more than double by 2100. When combined with cultivated areas, we projected a 28% increase in anthropogenic land use by 2100. As a result, natural lands were projected to decline at a rate of 139 km2 yr-1; grasslands experienced the largest net decline, followed by shrublands and forests. The amount of cultivated land was projected to decline by approximately 10%; however, the relatively modest change masked large shifts between annual and perennial crop types. Under the three population scenarios, developed lands were projected to increase 40-90% by 2100. Our results suggest that when compared to the BAU projection, scenarios based on demographic trends may underestimate future changes in LULC. Furthermore, regardless of scenario, the spatial pattern of LULC change was likely to have the greatest negative impacts on rangeland ecosystems.

  11. Future scenarios of land change based on empirical data and demographic trends

    Science.gov (United States)

    Sleeter, Benjamin M.; Wilson, Tamara; Sharygin, Ethan; Sherba, Jason

    2017-01-01

    Changes in land use and land cover (LULC) have important and fundamental interactions with the global climate system. Top-down global scale projections of land use change have been an important component of climate change research; however, their utility at local to regional scales is often limited. The goal of this study was to develop an approach for projecting changes in LULC based on land use histories and demographic trends. We developed a set of stochastic, empirical-based projections of LULC change for the state of California, for the period 2001–2100. Land use histories and demographic trends were used to project a “business-as-usual” (BAU) scenario and three population growth scenarios. For the BAU scenario, we projected developed lands would more than double by 2100. When combined with cultivated areas, we projected a 28% increase in anthropogenic land use by 2100. As a result, natural lands were projected to decline at a rate of 139 km2 yr−1; grasslands experienced the largest net decline, followed by shrublands and forests. The amount of cultivated land was projected to decline by approximately 10%; however, the relatively modest change masked large shifts between annual and perennial crop types. Under the three population scenarios, developed lands were projected to increase 40–90% by 2100. Our results suggest that when compared to the BAU projection, scenarios based on demographic trends may underestimate future changes in LULC. Furthermore, regardless of scenario, the spatial pattern of LULC change was likely to have the greatest negative impacts on rangeland ecosystems.

  12. A comparison of spent fuel shipping cask response to 10 CFR 71 normal conditions and realistic hot day extremes

    International Nuclear Information System (INIS)

    Manson, S.J.; Gianoulakis, S.E.

    1994-04-01

    An examination of the effect of a realistic (though conservative) hot day environment on the thermal transient behavior of spent fuel shipping casks is made. These results are compared to those that develop under the prescribed normal thermal condition of 10 CFR 71. Of specific concern are the characteristics of propagating thermal waves, which are set up by diurnal variations of temperature and insolation in the outdoor environment. In order to arrive at a realistic approximation of these variations on a conservative hot day, actual temperature and insolation measurements have been obtained from the National Climatic Data Center (NCDC) for representatively hot and high heat flux days. Thus, the use of authentic meteorological data ensures the realistic approach sought. Further supporting the desired realism of the modeling effort is the use of realistic cask configurations in which multiple laminations of structural, shielding, and other materials are expected to attenuate the propagating thermal waves. The completed analysis revealed that the majority of wall temperatures, for a wide variety of spent fuel shipping cask configurations, fall well below those predicted by enforcement of the regulatory environmental conditions of 10 CFR 71. It was found that maximum temperatures at the cask surface occasionally lie above temperatures predicted under the prescribed regulatory conditions. However, the temperature differences are small enough that the normal conservative assumptions that are made in the course of typical cask evaluations should correct for any potential violations. The analysis demonstrates that diurnal temperature variations that penetrate the cask wall all have maxima substantially less than the corresponding regulatory solutions. Therefore it is certain that vital cask components and the spent fuel itself will not exceed the temperatures calculated by use of the conditions of 10 CFR 71

  13. Hierarchical low-rank approximation for high dimensional approximation

    KAUST Repository

    Nouy, Anthony

    2016-01-07

    Tensor methods are among the most prominent tools for the numerical solution of high-dimensional problems where functions of multiple variables have to be approximated. Such high-dimensional approximation problems naturally arise in stochastic analysis and uncertainty quantification. In many practical situations, the approximation of high-dimensional functions is made computationally tractable by using rank-structured approximations. In this talk, we present algorithms for the approximation in hierarchical tensor format using statistical methods. Sparse representations in a given tensor format are obtained with adaptive or convex relaxation methods, with a selection of parameters using crossvalidation methods.
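
    A full hierarchical tensor implementation is beyond the scope of an abstract, but the basic rank-structured idea can be illustrated by its two-dimensional building block: a truncated SVD of a sampled bivariate function, shown in the Python sketch below. Hierarchical tensor formats apply this idea recursively over a dimension tree; the function and ranks here are arbitrary choices for illustration.

    import numpy as np

    x = np.linspace(0.0, 1.0, 200)
    y = np.linspace(0.0, 1.0, 200)
    F = 1.0 / (1.0 + np.add.outer(x, y))      # smooth kernel, numerically low rank

    U, s, Vt = np.linalg.svd(F, full_matrices=False)
    for rank in (1, 2, 4, 8):
        approx = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        rel_err = np.linalg.norm(F - approx) / np.linalg.norm(F)
        print(f"rank {rank:2d}: relative error {rel_err:.2e}")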

  14. Hierarchical low-rank approximation for high dimensional approximation

    KAUST Repository

    Nouy, Anthony

    2016-01-01

    Tensor methods are among the most prominent tools for the numerical solution of high-dimensional problems where functions of multiple variables have to be approximated. Such high-dimensional approximation problems naturally arise in stochastic analysis and uncertainty quantification. In many practical situations, the approximation of high-dimensional functions is made computationally tractable by using rank-structured approximations. In this talk, we present algorithms for the approximation in hierarchical tensor format using statistical methods. Sparse representations in a given tensor format are obtained with adaptive or convex relaxation methods, with a selection of parameters using crossvalidation methods.

  15. Improving UWB-Based Localization in IoT Scenarios with Statistical Models of Distance Error.

    Science.gov (United States)

    Monica, Stefania; Ferrari, Gianluigi

    2018-05-17

    Interest in the Internet of Things (IoT) is rapidly increasing, as the number of connected devices is exponentially growing. One of the application scenarios envisaged for IoT technologies involves indoor localization and context awareness. In this paper, we focus on a localization approach that relies on a particular type of communication technology, namely Ultra Wide Band (UWB). UWB technology is an attractive choice for indoor localization, owing to its high accuracy. Since localization algorithms typically rely on estimated inter-node distances, the goal of this paper is to evaluate the improvement brought by a simple (linear) statistical model of the distance error. On the basis of an extensive experimental measurement campaign, we propose a general analytical framework, based on a Least Square (LS) method, to derive a novel statistical model for the range estimation error between a pair of UWB nodes. The proposed statistical model is then applied to improve the performance of a few illustrative localization algorithms in various realistic scenarios. The obtained experimental results show that the use of the proposed statistical model improves the accuracy of the considered localization algorithms with a reduction of the localization error up to 66%.
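
    A minimal Python sketch of the linear range-error correction described above: fit d_measured ≈ a * d_true + b by least squares on calibration data, then invert the model to correct new UWB range estimates. The calibration numbers are synthetic, not the paper's measurement campaign.

    import numpy as np

    rng = np.random.default_rng(2)
    d_true = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 8.0, 10.0])
    d_meas = 1.03 * d_true + 0.15 + 0.02 * rng.standard_normal(d_true.size)

    A = np.column_stack([d_true, np.ones_like(d_true)])
    (a, b), *_ = np.linalg.lstsq(A, d_meas, rcond=None)

    def correct(d_raw):
        """Invert the fitted linear bias model to estimate the true distance."""
        return (d_raw - b) / a

    print(f"fitted a={a:.3f}, b={b:.3f} m")
    print(f"raw 5.30 m -> corrected {correct(5.30):.2f} m")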

  16. Disposition scenarios and safeguardability of fissile materials under START Treaty

    International Nuclear Information System (INIS)

    Pillay, K.K.S.

    1993-01-01

    Under the Strategic Arms Reduction Treaty (START-I) signed in 1991 and the Lisbon Protocol of 1992, a large inventory of fissile materials will be removed from the weapons fuel cycles of the United States and the Former Soviet Union (FSU). The Lisbon Protocol calls for Ukraine, Kazakstan, and Byelarus to become nonnuclear members of the treaty and for Russia to assume the responsibility of the treaty as a nuclear weapons state. In addition, the START-II Treaty, which was signed in 1993 by the United States and Russia, further reduces deployed nuclear warheads and adds to the inventory of excess special nuclear materials (SNM). Because storage of intact warheads has the potential for a "breakout," it would be desirable to dismantle the warheads and properly dispose of the SNMs under appropriate safeguards to prevent their reentry into the weapons fuel cycle. The SNM recovered from dismantled warheads can be disposed of in several ways, and the final choices may be up to the country having the title to the SNM. Current plans are to store them indefinitely, leaving serious safeguards concerns. Recognizing that the underlying objective of these treaties is to prevent the fissile materials from reentering the weapons fuel cycle, it is necessary to establish a verifiable disposal scheme that includes safeguards requirements. This paper identifies some realistic scenarios for the disposal of SNM from the weapons fuel cycle and examines the safeguardability of those scenarios.

  17. Cluster Risk of Walking Scenarios Based on Macroscopic Flow Model and Crowding Force Analysis

    Directory of Open Access Journals (Sweden)

    Xiaohong Li

    2018-02-01

    Full Text Available In recent years, accidents have frequently happened in confined spaces such as metro stations because of congestion. Various researchers have investigated the patterns of dense crowd behavior in different scenarios via simulations or experiments and proposed methods for avoiding accidents. In this study, a classic continuum macroscopic model was applied to simulate crowded pedestrian flow in typical scenarios such as at bottlenecks or around an obstacle. The Lax–Wendroff finite difference scheme and an artificial viscosity filtering method were used to discretize the model and identify high-density risk areas. Furthermore, we introduced a contact crowding force test of the interactions among pedestrians at bottlenecks. Results revealed that the individual at the corner position bears the maximum pressure in the most dangerous area; in such scenarios this maximum is 90.2 N, and our data indicate an approximately exponential relationship between crowding force and density. The results and findings presented in this paper can facilitate more reasonable and precise simulation models by utilizing crowding force and crowd density, and help ensure the safety of pedestrians in high-density scenarios.
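
    A minimal Python sketch of the numerical ingredients named above follows: an LWR-type continuum model discretised with the two-step Lax-Wendroff scheme plus an artificial-viscosity filter. The speed-density relation and all parameter values are assumptions for illustration, not the paper's calibrated model.

    import numpy as np

    v_f, rho_max = 1.3, 5.0                  # free speed (m/s), jam density (ped/m^2)
    nx, dx, dt, nt = 200, 0.5, 0.1, 400
    eps_visc = 0.02                          # artificial viscosity coefficient

    def flux(rho):
        """Greenshields-type flow: rho * v_f * (1 - rho / rho_max)."""
        return rho * v_f * (1.0 - rho / rho_max)

    x = np.arange(nx) * dx
    rho = np.where(x < 50.0, 3.5, 0.5)       # dense crowd upstream of x = 50 m

    for _ in range(nt):
        f = flux(rho)
        # half-step (Richtmyer) values at cell interfaces
        rho_half = 0.5 * (rho[1:] + rho[:-1]) - 0.5 * dt / dx * (f[1:] - f[:-1])
        f_half = flux(rho_half)
        rho[1:-1] -= dt / dx * (f_half[1:] - f_half[:-1])
        # simple artificial viscosity filter to damp dispersive oscillations
        rho[1:-1] += eps_visc * (rho[2:] - 2.0 * rho[1:-1] + rho[:-2])

    print(f"max density after {nt * dt:.0f} s: {rho.max():.2f} ped/m^2")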

  18. Hyper-realistic face masks: a new challenge in person identification.

    Science.gov (United States)

    Sanders, Jet Gabrielle; Ueda, Yoshiyuki; Minemoto, Kazusa; Noyes, Eilidh; Yoshikawa, Sakiko; Jenkins, Rob

    2017-01-01

    We often identify people using face images. This is true in occupational settings such as passport control as well as in everyday social environments. Mapping between images and identities assumes that facial appearance is stable within certain bounds. For example, a person's apparent age, gender and ethnicity change slowly, if at all. It also assumes that deliberate changes beyond these bounds (i.e., disguises) would be easy to spot. Hyper-realistic face masks overturn these assumptions by allowing the wearer to look like an entirely different person. If unnoticed, these masks break the link between facial appearance and personal identity, with clear implications for applied face recognition. However, to date, no one has assessed the realism of these masks, or specified conditions under which they may be accepted as real faces. Herein, we examined incidental detection of unexpected but attended hyper-realistic masks in both photographic and live presentations. Experiment 1 (UK; n = 60) revealed no evidence for overt detection of hyper-realistic masks among real face photos, and little evidence of covert detection. Experiment 2 (Japan; n = 60) extended these findings to different masks, mask-wearers and participant pools. In Experiment 3 (UK and Japan; n = 407), passers-by failed to notice that a live confederate was wearing a hyper-realistic mask and showed limited evidence of covert detection, even at close viewing distance (5 vs. 20 m). Across all of these studies, viewers accepted hyper-realistic masks as real faces. Specific countermeasures will be required if detection rates are to be improved.

  19. Identifying optimal agricultural countermeasure strategies for a hypothetical contamination scenario using the strategy model

    International Nuclear Information System (INIS)

    Cox, G.; Beresford, N.A.; Alvarez-Farizo, B.; Oughton, D.; Kis, Z.; Eged, K.; Thorring, H.; Hunt, J.; Wright, S.; Barnett, C.L.; Gil, J.M.; Howard, B.J.; Crout, N.M.J.

    2005-01-01

    A spatially implemented model designed to assist the identification of optimal countermeasure strategies for radioactively contaminated regions is described. Collective and individual ingestion doses for people within the affected area are estimated together with collective exported ingestion dose. A range of countermeasures are incorporated within the model, and environmental restrictions have been included as appropriate. The model evaluates the effectiveness of a given combination of countermeasures through a cost function which balances the benefit obtained through the reduction in dose with the cost of implementation. The optimal countermeasure strategy is the combination of individual countermeasures (and when and where they are implemented) which gives the lowest value of the cost function. The model outputs should not be considered as definitive solutions, rather as interactive inputs to the decision making process. As a demonstration the model has been applied to a hypothetical scenario in Cumbria (UK). This scenario considered a published nuclear power plant accident scenario with a total deposition of 1.7 × 10^14, 1.2 × 10^13, 2.8 × 10^10 and 5.3 × 10^9 Bq for Cs-137, Sr-90, Pu-239/240 and Am-241, respectively. The model predicts that if no remediation measures were implemented the resulting collective dose would be approximately 36 000 person-Sv (predominantly from Cs-137) over a 10-year period post-deposition. The optimal countermeasure strategy is predicted to avert approximately 33 000 person-Sv at a cost of approximately £160 million. The optimal strategy comprises a mixture of ploughing, AFCF (ammonium-ferric hexacyano-ferrate) administration, potassium fertiliser application, clean feeding of livestock and food restrictions. The model recommends specific areas within the contaminated area and time periods where these measures should be implemented.
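
    The cost-function idea can be caricatured in a few lines of Python: each countermeasure is assigned an implementation cost and an averted collective dose, and a strategy is scored as implementation cost minus the monetary value of the averted dose. The numbers below are invented so that the totals echo the figures quoted above; the real model is spatial, time-dependent and accounts for interactions between measures, which this additive sketch ignores.

    from itertools import combinations

    ALPHA = 20_000             # assumed monetary value of 1 person-Sv averted (GBP)
    measures = {               # name: (cost in GBP, averted collective dose in person-Sv)
        "ploughing":        (30e6, 9_000),
        "AFCF":             (25e6, 8_000),
        "K fertiliser":     (15e6, 4_000),
        "clean feeding":    (40e6, 7_000),
        "food restriction": (50e6, 5_000),
    }

    def score(strategy):
        """Cost function: implementation cost minus monetised averted dose."""
        cost = sum(measures[m][0] for m in strategy)
        averted = sum(measures[m][1] for m in strategy)
        return cost - ALPHA * averted          # lower is better

    best = min((frozenset(c) for r in range(len(measures) + 1)
                for c in combinations(measures, r)), key=score)
    print("optimal strategy:", sorted(best), "score:", f"{score(best):,.0f}")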

  20. Putting a Realistic Theory of Mind into Agency Theory

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Stea, Diego

    2014-01-01

    Agency theory is one of the most important foundational theories in management research, but it rests on contestable cognitive assumptions. Specifically, the principal is assumed to hold a perfect (correct) theory regarding some of the content of the agent's mind, while he is entirely ignorant concerning other such content. More realistically, individuals have some limited access to the minds of others. We explore the implications for classical agency theory of realistic assumptions regarding the human potential for interpersonal sensemaking. We discuss implications for the design and management...

  1. Realistic searches on stretched exponential networks

    Indian Academy of Sciences (India)

    We consider navigation or search schemes on networks which have a degree distribution of the stretched exponential form P(k) ∝ exp(−k^γ). In addition, the linking probability is taken to be dependent on social distances and is governed by an additional parameter. The searches are realistic in the sense that not all search chains can be completed.

  2. Tsunamigenic scenarios for southern Peru and northern Chile seismic gap: Deterministic and probabilistic hybrid approach for hazard assessment

    Science.gov (United States)

    González-Carrasco, J. F.; Gonzalez, G.; Aránguiz, R.; Yanez, G. A.; Melgar, D.; Salazar, P.; Shrivastava, M. N.; Das, R.; Catalan, P. A.; Cienfuegos, R.

    2017-12-01

    probabilistic kinematic tsunamigenic scenarios give more realistic slip patterns, similar to the maximum slip amounts of major past earthquakes. For all studied sites, the location of peak slip and shelf resonance are first-order controls on the observed coastal inundation depths.

  3. Universality and Realistic Extensions to the Semi-Analytic Simulation Principle in GNSS Signal Processing

    Directory of Open Access Journals (Sweden)

    O. Jakubov

    2012-06-01

    Full Text Available The semi-analytic simulation principle in GNSS signal processing bypasses the bit-true operations at high sampling frequency. Instead, the signals at the output branches of the integrate&dump blocks are modeled directly, thus making extensive Monte Carlo simulations feasible. Methods for simulations of code and carrier tracking loops with BPSK and BOC signals have been introduced in the literature, and Matlab toolboxes have been designed and published. In this paper, we further extend the applicability of the approach. Firstly, we describe any GNSS signal as a special instance of linear multi-dimensional modulation, thereby stating a universal framework for the classification of differently modulated signals. Using this description, we derive the semi-analytic models in general form. Secondly, we extend the model to realistic scenarios including delay in the feedback, slowly fading multipath effects, finite bandwidth, phase noise, and combinations of these. Finally, a discussion of the connection between this semi-analytic model and the position-velocity-time estimator is delivered, as well as a comparison of theoretical and simulated characteristics produced by a prototype simulator developed at CTU in Prague.
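
    The core of the semi-analytic principle, drawing the integrate-and-dump outputs directly from their statistical model instead of simulating high-rate samples, can be sketched in Python for a BPSK-like signal as follows. The signal model (triangular code correlation, sinc frequency roll-off) and all parameter values are textbook assumptions, not the paper's generalised multi-dimensional formulation.

    import numpy as np

    rng = np.random.default_rng(3)
    T = 1e-3                        # coherent integration time (s)
    cn0_dbhz = 40.0                 # assumed carrier-to-noise density ratio
    cn0 = 10.0 ** (cn0_dbhz / 10.0)
    amp = np.sqrt(2.0 * cn0 * T)    # post-correlation amplitude for unit-variance noise

    def R(tau_chips):
        """Triangular autocorrelation of the spreading code (BPSK assumption)."""
        return np.maximum(0.0, 1.0 - np.abs(tau_chips))

    def prompt_IQ(tau_chips, dphi, dfreq, n=1):
        """Draw n prompt correlator outputs for given code, phase, frequency errors."""
        att = R(tau_chips) * np.sinc(dfreq * T)   # np.sinc includes the pi factor
        i = amp * att * np.cos(dphi) + rng.standard_normal(n)
        q = amp * att * np.sin(dphi) + rng.standard_normal(n)
        return i, q

    i, q = prompt_IQ(tau_chips=0.1, dphi=0.2, dfreq=5.0, n=100_000)
    print(f"mean I = {i.mean():.2f}, mean Q = {q.mean():.2f} (noise std = 1)")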

  4. Results of recent calculations using realistic potentials

    International Nuclear Information System (INIS)

    Friar, J.L.

    1987-01-01

    Results of recent calculations for the triton using realistic potentials with strong tensor forces are reviewed, with an emphasis on progress made using the many different calculational schemes. Several test problems are suggested. 49 refs., 5 figs

  5. Future coal production outlooks in the IPCC Emission Scenarios: Are they plausible?

    International Nuclear Information System (INIS)

    Hoeoek, Mikael

    2010-10-01

    regarding future fossil fuel production in SRES was investigated and compared with scientific methodology regarding reasonable future production trajectories. Historical data from the past 20 years was used to test how well the production scenarios agree with actual reality. Some of the scenarios turned out to mismatch with reality, and should be ruled out. Given the importance of coal utilization as a source of anthropogenic GHG emissions it is necessary to use realistic production trajectories that incorporate geological and physical data as well as socioeconomic parameters. SRES is underpinned by a paradigm of perpetual growth and technological optimism as well as old and outdated estimates regarding the availability of fossil energy. This has resulted in overoptimistic production outlooks

  6. Future coal production outlooks in the IPCC Emission Scenarios: Are they plausible?

    Energy Technology Data Exchange (ETDEWEB)

    Hoeoek, Mikael

    2010-10-15

    fundamental assumptions regarding future fossil fuel production in SRES was investigated and compared with scientific methodology regarding reasonable future production trajectories. Historical data from the past 20 years was used to test how well the production scenarios agree with actual reality. Some of the scenarios turned out to mismatch with reality, and should be ruled out. Given the importance of coal utilization as a source of anthropogenic GHG emissions it is necessary to use realistic production trajectories that incorporate geological and physical data as well as socioeconomic parameters. SRES is underpinned by a paradigm of perpetual growth and technological optimism as well as old and outdated estimates regarding the availability of fossil energy. This has resulted in overoptimistic production outlooks

  7. Simple and Realistic Data Generation

    DEFF Research Database (Denmark)

    Pedersen, Kenneth Houkjær; Torp, Kristian; Wind, Rico

    2006-01-01

    This paper presents a generic, DBMS-independent, and highly extensible relational data generation tool. The tool can efficiently generate realistic test data for OLTP, OLAP, and data streaming applications. The tool uses a graph model to direct the data generation. This model makes it very simple to generate data even for large database schemas with complex inter- and intra-table relationships. The model also makes it possible to generate data with very accurate characteristics.

  8. TOXICOLOGICAL EVALUATION OF REALISTIC EMISSIONS OF SOURCE AEROSOLS (TERESA): APPLICATION TO POWER PLANT-DERIVED PM2.5

    Energy Technology Data Exchange (ETDEWEB)

    Annette Rohr

    2006-03-01

    TERESA (Toxicological Evaluation of Realistic Emissions of Source Aerosols) involves exposing laboratory rats to realistic coal-fired power plant and mobile source emissions to help determine the relative toxicity of these PM sources. There are three coal-fired power plants in the TERESA program; this report describes the results of fieldwork conducted at the first plant, located in the Upper Midwest. The project was technically challenging by virtue of its novel design and requirement for the development of new techniques. By examining aged, atmospherically transformed aerosol derived from power plant stack emissions, we were able to evaluate the toxicity of PM derived from coal combustion in a manner that more accurately reflects the exposure of concern than existing methodologies. TERESA also involves assessment of actual plant emissions in a field setting--an important strength since it reduces the question of representativeness of emissions. A sampling system was developed and assembled to draw emissions from the stack; stack sampling conducted according to standard EPA protocol suggested that the sampled emissions are representative of those exiting the stack into the atmosphere. Two mobile laboratories were then outfitted for the study: (1) a chemical laboratory in which the atmospheric aging was conducted and which housed the bulk of the analytical equipment; and (2) a toxicological laboratory, which contained animal caging and the exposure apparatus. Animal exposures were carried out from May-November 2004 to a number of simulated atmospheric scenarios. Toxicological endpoints included (1) pulmonary function and breathing pattern; (2) bronchoalveolar lavage fluid cytological and biochemical analyses; (3) blood cytological analyses; (4) in vivo oxidative stress in heart and lung tissue; and (5) heart and lung histopathology. Results indicated no differences between exposed and control animals in any of the endpoints examined. Exposure concentrations for the

  9. Classifying Scenarios in a Product Design Process: a study to achieve automated scenario generation

    NARCIS (Netherlands)

    Anggreeni, Irene; van der Voort, Mascha C.; van Houten, F.J.A.M.; Miedema, J.; Lutters, D.

    2008-01-01

    This paper explains the possible uses of scenarios in product design. A scenario classification is proposed as a framework to create, use and reuse different types of scenarios in a product design process. Our aims are three-fold: (1) to obtain a better view on the extent to which scenarios can be

  10. Realistic Approach for Phasor Measurement Unit Placement

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Chen, Zhe; Thøgersen, Paul

    2015-01-01

    This paper presents a realistic cost-effective model for optimal placement of phasor measurement units (PMUs) for complete observability of a power system considering practical cost implications. The proposed model considers hidden or otherwise unaccounted practical costs involved in PMU installation. Consideration of these hidden but significant and integral parts of total PMU installation costs was inspired by practical experience on a real-life project. The proposed model focuses on the minimization of total realistic costs instead of the widely used theoretical concept of a minimal number of PMUs. The proposed model has been applied to the IEEE 14-bus, IEEE 24-bus, IEEE 30-bus, and New England 39-bus systems, a large power system of 300 buses, and the real-life Danish grid. A comparison of the presented results with those reported by traditional methods has also been shown to justify the effectiveness
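
    The paper solves a cost-based optimisation for full observability; a much simplified, hedged illustration of the same trade-off is a greedy cover in Python, in which a PMU at a bus observes that bus and its neighbours and candidates are ranked by newly observed buses per unit cost. The 7-bus network and per-bus costs below are invented, and a greedy heuristic generally does not reproduce the exact optimum the paper computes.

    adjacency = {            # toy network: bus -> neighbouring buses
        1: {2, 5}, 2: {1, 3, 5}, 3: {2, 4}, 4: {3, 5, 7},
        5: {1, 2, 4, 6}, 6: {5}, 7: {4},
    }
    cost = {1: 1.0, 2: 1.4, 3: 1.0, 4: 1.2, 5: 2.0, 6: 0.9, 7: 0.9}  # relative costs

    def observed_by(bus):
        """A PMU at 'bus' observes the bus itself and all adjacent buses."""
        return {bus} | adjacency[bus]

    unobserved, placement = set(adjacency), []
    while unobserved:
        bus = max(adjacency, key=lambda b: len(observed_by(b) & unobserved) / cost[b])
        placement.append(bus)
        unobserved -= observed_by(bus)

    print("PMUs at buses:", sorted(placement),
          "total cost:", sum(cost[b] for b in placement))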

  11. Getting realistic; Endstation Demut

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, J.P.

    2004-01-28

    The fuel cell hype of the turn of the millennium has reached its end. The industry is getting realistic. If at all, fuel cell systems for private single-family and multiple dwellings will not be available until the next decade. With a Europe-wide field test, Vaillant intends to advance the PEM technology. [Translated from German] The fuel cell hype of the turn of the millennium has blown over. The industry is practicing modesty. Market readiness of the systems for single- and multi-family houses will - if at all - probably only be reached in the next decade. Vaillant wants to drive forward the development of PEM technology with a Europe-wide field test. (orig.)

  12. Bayesian inference on EMRI signals using low frequency approximations

    International Nuclear Information System (INIS)

    Ali, Asad; Meyer, Renate; Christensen, Nelson; Röver, Christian

    2012-01-01

    Extreme mass ratio inspirals (EMRIs) are thought to be one of the most exciting gravitational wave sources to be detected with LISA. Due to their complicated nature and weak amplitudes the detection and parameter estimation of such sources is a challenging task. In this paper we present a statistical methodology based on Bayesian inference in which the estimation of parameters is carried out by advanced Markov chain Monte Carlo (MCMC) algorithms such as parallel tempering MCMC. We analysed high and medium mass EMRI systems that fall well inside the low frequency range of LISA. In the context of the Mock LISA Data Challenges, our investigation and results are also the first instance in which a fully Markovian algorithm is applied for EMRI searches. Results show that our algorithm worked well in recovering EMRI signals from different (simulated) LISA data sets having single and multiple EMRI sources and holds great promise for posterior computation under more realistic conditions. The search and estimation methods presented in this paper are general in their nature, and can be applied in any other scenario such as AdLIGO, AdVIRGO and Einstein Telescope with their respective response functions. (paper)
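
    The parallel-tempering mechanics referred to above can be shown on a toy multimodal target in Python; the temperature ladder, proposal scale and target below are illustrative assumptions and bear no relation to the LISA/EMRI likelihood.

    import numpy as np

    rng = np.random.default_rng(4)

    def log_target(x):
        """Deliberately bimodal 1-D log-density (two well-separated Gaussians)."""
        return np.logaddexp(-0.5 * (x + 4.0) ** 2, -0.5 * (x - 4.0) ** 2)

    temps = np.array([1.0, 2.0, 4.0, 8.0])          # temperature ladder
    x = rng.standard_normal(temps.size)             # one chain per temperature
    samples = []

    for step in range(20_000):
        # within-chain Metropolis updates on the tempered targets
        prop = x + 0.8 * rng.standard_normal(x.size)
        accept = np.log(rng.random(x.size)) < (log_target(prop) - log_target(x)) / temps
        x = np.where(accept, prop, x)
        # propose a swap between a random adjacent pair of temperatures
        i = rng.integers(temps.size - 1)
        log_alpha = (1.0 / temps[i] - 1.0 / temps[i + 1]) * (log_target(x[i + 1]) - log_target(x[i]))
        if np.log(rng.random()) < log_alpha:
            x[i], x[i + 1] = x[i + 1], x[i]
        samples.append(x[0])                        # keep the T = 1 ("cold") chain

    samples = np.array(samples[5_000:])
    print(f"cold-chain mass in left/right mode: "
          f"{np.mean(samples < 0):.2f} / {np.mean(samples > 0):.2f}")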

  13. Effects of the neonicotinoid pesticide thiamethoxam at field-realistic levels on microcolonies of Bombus terrestris worker bumble bees.

    Science.gov (United States)

    Laycock, Ian; Cotterell, Katie C; O'Shea-Wheller, Thomas A; Cresswell, James E

    2014-02-01

    Neonicotinoid pesticides are currently implicated in the decline of wild bee populations. Bumble bees, Bombus spp., are important wild pollinators that are detrimentally affected by ingestion of neonicotinoid residues. To date, imidacloprid has been the major focus of study into the effects of neonicotinoids on bumble bee health, but wild populations are increasingly exposed to alternative neonicotinoids such as thiamethoxam. To investigate whether environmentally realistic levels of thiamethoxam affect bumble bee performance over a realistic exposure period, we exposed queenless microcolonies of Bombus terrestris L. workers to a wide range of dosages up to 98 μg kg−1 in dietary syrup for 17 days. Results showed that bumble bee workers survived fewer days when presented with syrup dosed at 98 μg thiamethoxam kg−1, while production of brood (eggs and larvae) and consumption of syrup and pollen in microcolonies were significantly reduced by thiamethoxam only at the two highest concentrations (39 and 98 μg kg−1). In contrast, we found no detectable effect of thiamethoxam at levels typically found in the nectars of treated crops (between 1 and 11 μg kg−1). By comparison with published data, we demonstrate that during an exposure to field-realistic concentrations lasting approximately two weeks, brood production in worker bumble bees is more sensitive to imidacloprid than thiamethoxam. We speculate that differential sensitivity arises because imidacloprid produces a stronger repression of feeding in bumble bees than thiamethoxam, which imposes a greater nutrient limitation on production of brood. © 2013 Published by Elsevier Inc.

  14. ADEME energy transition scenarios. Summary including a macro-economic evaluation 2030 2050

    International Nuclear Information System (INIS)

    2014-05-01

    ADEME, the French Environment and Energy Management Agency, is a public agency reporting to the Ministry of Ecology, Sustainable Development and Energy and the Ministry of Higher Education and Research. In 2012 the agency drew up a long-term scenario entitled 'ADEME Energy Transition Scenarios 2030-2050'. This document presents a summary of the report. The full version can be viewed online on the ADEME web site. With this work ADEME offers a proactive energy vision for all stakeholders - experts, the general public, decision-makers, etc. - focusing on two main areas of expertise: managing energy conservation and developing renewable energy production using proven or demonstration-phase technologies. These scenarios identify a possible pathway for the energy transition in France. They are based on two time horizons and two separate methodologies. One projection, applicable from the present day, seeks to maximise potential energy savings and renewable energy production in an ambitious but realistic manner, up to 2030. The second exercise is a normative scenario that targets a fourfold reduction in greenhouse gas emissions generated in France by 2050, compared to 1990 levels. The analysis presented in this document is primarily based on an exploration of different scenarios that allow for the achievement of ambitious energy and environmental targets under technically, economically and socially feasible conditions. This analysis is supplemented by a macro-economic analysis. These projections, particularly for 2030, do not rely on radical changes in lifestyle, lower comfort levels or hypothetical major technological breakthroughs. They show that by using technologies and organisational changes that are currently within our reach, we have the means to achieve these long-term goals. The scenarios are based on assumptions of significant growth, both economic (1.8% per year) and demographic (0.4% a year). The 2050 scenario shows that with sustained growth, a

  15. A Novel Simulator of Nonstationary Random MIMO Channels in Rayleigh Fading Scenarios

    Directory of Open Access Journals (Sweden)

    Qiuming Zhu

    2016-01-01

    Full Text Available For simulations of nonstationary multiple-input multiple-output (MIMO) Rayleigh fading channels in time-variant scattering environments, a novel channel simulator is proposed based on the superposition of chirp signals. The new method has the same advantages of low complexity and implementation simplicity as the sum-of-sinusoids (SOS) method. In order to reproduce realistic time-varying statistics for dynamic channels, an efficient parameter computation method is also proposed for updating the frequency parameters of the employed chirp signals. Simulation results indicate that the proposed simulator is effective in generating nonstationary MIMO channels whose time-variant statistical characteristics closely approximate the expected theoretical counterparts.
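
    For context, the stationary sum-of-sinusoids (SOS) baseline that the chirp-based simulator extends can be sketched in Python as below; the number of sinusoids, maximum Doppler frequency and sampling are arbitrary illustrative choices.

    import numpy as np

    rng = np.random.default_rng(5)
    N_sin = 32                       # number of sinusoids
    f_d = 100.0                      # maximum Doppler frequency (Hz)
    fs = 10_000.0                    # sample rate (Hz)
    t = np.arange(0, 1.0, 1.0 / fs)

    # random angles of arrival -> Doppler shifts f_d*cos(alpha), random phases
    alpha = rng.uniform(0.0, 2.0 * np.pi, N_sin)
    phases = rng.uniform(0.0, 2.0 * np.pi, N_sin)
    doppler = f_d * np.cos(alpha)

    # complex fading gain as a normalised sum of sinusoids (one SISO branch)
    h = np.sum(np.exp(1j * (2.0 * np.pi * np.outer(doppler, t) + phases[:, None])),
               axis=0) / np.sqrt(N_sin)

    envelope = np.abs(h)
    print(f"mean power = {np.mean(envelope ** 2):.3f} (expected ~1.0 for Rayleigh fading)")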

  16. Effects of climate change adaptation scenarios on perceived spatio-temporal characteristics of drought events

    Science.gov (United States)

    Vidal, J.-P.; Martin, E.; Kitova, N.; Najac, J.; Soubeyroux, J.-M.

    2012-04-01

    " adaptation) or over a 30-year period centred around the date considered ("prospective" adaptation). These adaptation scenarios are translated into local-scale transient drought thresholds, as opposed to a non-adaptation scenario where the drought threshold remains constant. The perceived spatio-temporal characteristics derived from the theoretical adaptation scenarios show much reduced changes, but they call for more realistic scenarios at both the catchment and national scale in order to accurately assess the combined effect of local-scale adaptation and global-scale mitigation. This study thus proposes a proof of concept for using standardized drought indices for (1) assessing projections of spatio-temporal drought characteristics and (2) building theoretical adaptation scenarios and associated perceived changes in hydrological impact studies (Vidal et al., submitted). Vidal J.-P., Martin E., Franchistéguy L., Habets F., Soubeyroux J.-M., Blanchard M. & Baillon M. (2010) Multilevel and multiscale drought reanalysis over France with the Safran-Isba-Modcou hydrometeorological suite. Hydrology and Earth System Sciences, 14, 459-478.doi: 10.5194/hess-14-459-2010 Vidal J.-P., Martin E., Kitova N., Najac J. & Soubeyroux, J. M. (submitted) Evolution of spatio-temporal drought characteristics: validation, projections and effect of adaptation scenarios. Submitted to Hydrology and earth System Sciences

  17. Satellite Maps Deliver More Realistic Gaming

    Science.gov (United States)

    2013-01-01

    When Redwood City, California-based Electronic Arts (EA) decided to make SSX, its latest snowboarding video game, it faced challenges in creating realistic-looking mountains. The solution was NASA's ASTER Global Digital Elevation Map, made available by the Jet Propulsion Laboratory, which EA used to create 28 real-life mountains from 9 different ranges for its award-winning game.

  18. Detailed analysis of the bundle damage scenario in the PHEBUS FPT0

    International Nuclear Information System (INIS)

    Park, Rae Joon; Kim, Sang Baik; Kim, Hee Dong; Yoo, Kun Joong

    1998-03-01

    The PHEBUS FP program and the test facility have been investigated, and the late-phase melt progression in the PHEBUS FPT0 test has been analyzed in the present study. The objective of this program is to investigate fission product (FP) release; the program consists of six in-pile tests (FPT0, FPT1, FPT4, FPT2, FPT5, and FPT3) under different thermal-hydraulic and fuel rod environment conditions. The first test, FPT0, was performed in December 1993, and the second test, FPT1, was performed in July 1996. The present study evaluates the late-phase damage scenario of the fuel bundle using the FPT0 test results, which are primarily a non-destructive Post Irradiation Examination (PIE) and a destructive PIE. The fuel bundle degradation scenario is summarized as follows: the fuel rod cladding failed at approximately 7,000 seconds; the control rod materials ruptured at 11,000 seconds; the stainless-steel reaction occurred at approximately 12,100 seconds; the upper fuel bundle materials melted and relocated to the elevation between 35 and 45 cm in the period between 14,750 and 15,200 seconds; the molten pool and the debris formed at the elevation between 26 and 36 cm in the period between 15,200 and 18,100 seconds; the molten pool and the debris dropped to the elevation between 15 and 25 cm from the bfc at approximately 18,100 seconds; and the molten pool was finally quenched by the injected steam. (author). 45 refs., 10 tabs., 73 figs

  19. Scenario planning and nanotechnological futures

    International Nuclear Information System (INIS)

    Farber, Darryl; Lakhtakia, Akhlesh

    2009-01-01

    Scenario planning may assist us in harnessing the benefits of nanotechnology and managing the associated risks for the good of the society. Scenario planning is a way to describe the present state of the world and develop several hypotheses about the future of the world, thereby enabling discussions about how the world ought to be. Scenario planning thus is not only a tool for learning and foresight, but also for leadership. Informed decision making by experts and political leaders becomes possible, while simultaneously allaying the public's perception of the risks of new and emerging technologies such as nanotechnology. Two scenarios of the societal impact of nanotechnology are the mixed-signals scenario and the confluence scenario. Technoscientists have major roles to play in both scenarios.

  20. Making Energy-Water Nexus Scenarios more Fit-for-Purpose through Better Characterization of Extremes

    Science.gov (United States)

    Yetman, G.; Levy, M. A.; Chen, R. S.; Schnarr, E.

    2017-12-01

    Often quantitative scenarios of future trends exhibit less variability than the historic data upon which the models that generate them are based. The problem of dampened variability, which typically also entails dampened extremes, manifests both temporally and spatially. As a result, risk assessments that rely on such scenarios are in danger of producing misleading results. This danger is pronounced in nexus issues, because of the multiple dimensions of change that are relevant. We illustrate the above problem by developing alternative joint distributions of the probability of drought and of human population totals, across U.S. counties over the period 2010-2030. For the dampened-extremes case we use drought frequencies derived from climate models used in the U.S. National Climate Assessment and the Environmental Protection Agency's population and land use projections contained in its Integrated Climate and Land Use Scenarios (ICLUS). For the elevated-extremes case we use an alternative spatial drought frequency estimate based on tree-ring data, covering a 555-year period (Ho et al., 2017); and we introduce greater temporal and spatial extremes in the ICLUS socioeconomic projections so that they conform to observed extremes in the historical U.S. spatial census data 1790-present (National Historical Geographic Information System). We use spatial and temporal coincidence of high population and extreme drought as a proxy for energy-water nexus risk. We compare the representation of risk in the dampened-extreme and elevated-extreme scenario analyses. We identify areas of the country where using more realistic portrayals of extremes makes the biggest difference in estimated risk and suggest implications for future risk assessments. References: Michelle Ho, Upmanu Lall, Xun Sun, Edward R. Cook. 2017. Multiscale temporal variability and regional patterns in 555 years of conterminous U.S. streamflow. Water Resources Research. doi:10.1002/2016WR019632

  1. The SAFRR tsunami scenario-physical damage in California: Chapter E in The SAFRR (Science Application for Risk Reduction) Tsunami Scenario

    Science.gov (United States)

    Porter, Keith; Byers, William; Dykstra, David; Lim, Amy; Lynett, Patrick; Ratliff, Jaime; Scawthorn, Charles; Wein, Anne; Wilson, Rick

    2013-01-01

    This chapter attempts to depict a single realistic outcome of the SAFRR (Science Application for Risk Reduction) tsunami scenario in terms of physical damage to and recovery of various aspects of the built environment in California. As described elsewhere in this report, the tsunami is generated by a hypothetical magnitude 9.1 earthquake seaward of the Alaska Peninsula on the Semidi Sector of the Alaska–Aleutian Subduction Zone, 495 miles southwest of Anchorage, at 11:50 a.m. Pacific Daylight Time (PDT) on Thursday March 27, 2014, and arriving at the California coast between 4:00 and 5:40 p.m. (depending on location) the same day. Although other tsunamis could have locally greater impact, this source represents a substantial threat to the state as a whole. One purpose of this chapter is to help operators and users of coastal assets throughout California to develop emergency plans to respond to a real tsunami. Another is to identify ways that operators or owners of these assets can think through options for reducing damage before a future tsunami. A third is to inform the economic analyses for the SAFRR tsunami scenario. And a fourth is to identify research needs to better understand the possible consequences of a tsunami on these assets. The asset classes considered here include the following: piers, cargo, buildings, and other assets at the Ports of Los Angeles and Long Beach; large vessels in the Ports of Los Angeles and Long Beach; marinas and small craft; coastal buildings; roads and roadway bridges; rail, railway bridges, and rolling stock; agriculture; and fire following tsunami. Each asset class is examined in a subsection of this chapter. In each subsection, we generally attempt to offer a historical review of damage. We characterize and quantify the assets exposed to loss and describe the modes of damage that have been observed in past tsunamis or are otherwise deemed likely to occur in the SAFRR tsunami scenario. Where practical, we offer a mathematical model of the

  2. Land-Use Scenarios: National-Scale Housing-Density Scenarios Consistent with Climate Change Storylines (Final Report)

    Science.gov (United States)

    EPA announced the availability of the final report, Land-Use Scenarios: National-Scale Housing-Density Scenarios Consistent with Climate Change Storylines. This report describes the scenarios and models used to generate national-scale housing density scenarios for the con...

  3. Portfolio optimization for seed selection in diverse weather scenarios.

    Science.gov (United States)

    Marko, Oskar; Brdar, Sanja; Panić, Marko; Šašić, Isidora; Despotović, Danica; Knežević, Milivoje; Crnojević, Vladimir

    2017-01-01

    The aim of this work was to develop a method for selection of optimal soybean varieties for the American Midwest using data analytics. We extracted the knowledge about 174 varieties from the dataset, which contained information about weather, soil, yield and regional statistical parameters. Next, we predicted the yield of each variety in each of 6,490 observed subregions of the Midwest. Furthermore, yield was predicted for all the possible weather scenarios approximated by 15 historical weather instances contained in the dataset. Using predicted yields and covariance between varieties through different weather scenarios, we performed portfolio optimisation. In this way, for each subregion, we obtained a selection of varieties that proved superior to others in terms of the amount and stability of yield. According to the rules of Syngenta Crop Challenge, for which this research was conducted, we aggregated the results across all subregions and selected up to five soybean varieties that should be distributed across the network of seed retailers. The work presented in this paper was the winning solution for Syngenta Crop Challenge 2017.
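
    The covariance-aware selection step can be illustrated with a small mean-variance sketch in Python. The yield matrix, the equal-weighting assumption and the greedy search below are illustrative stand-ins; the original work predicted yields with a learned model and used a proper portfolio optimiser.

        import numpy as np

        rng = np.random.default_rng(1)
        n_varieties, n_scenarios = 174, 15

        # Hypothetical predicted yields (t/ha) per variety under each historical weather scenario.
        yields = rng.normal(loc=3.0, scale=0.4, size=(n_varieties, n_scenarios))

        mean_yield = yields.mean(axis=1)
        cov = np.cov(yields)          # variety-by-variety covariance across weather scenarios

        def portfolio_score(selection, risk_aversion=1.0):
            """Mean yield minus a risk penalty for an equally weighted set of varieties."""
            w = np.zeros(n_varieties)
            w[list(selection)] = 1.0 / len(selection)
            return w @ mean_yield - risk_aversion * (w @ cov @ w)

        # Greedy forward selection of up to five varieties (a simple stand-in for the
        # quadratic-programming optimisation used in the original work).
        selected = []
        for _ in range(5):
            candidates = [i for i in range(n_varieties) if i not in selected]
            best = max(candidates, key=lambda i: portfolio_score(selected + [i]))
            selected.append(best)

        print("selected varieties:", selected, "score:", round(portfolio_score(selected), 3))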

  4. Portfolio optimization for seed selection in diverse weather scenarios.

    Directory of Open Access Journals (Sweden)

    Oskar Marko

    Full Text Available The aim of this work was to develop a method for selection of optimal soybean varieties for the American Midwest using data analytics. We extracted the knowledge about 174 varieties from the dataset, which contained information about weather, soil, yield and regional statistical parameters. Next, we predicted the yield of each variety in each of 6,490 observed subregions of the Midwest. Furthermore, yield was predicted for all the possible weather scenarios approximated by 15 historical weather instances contained in the dataset. Using predicted yields and covariance between varieties through different weather scenarios, we performed portfolio optimisation. In this way, for each subregion, we obtained a selection of varieties that proved superior to others in terms of the amount and stability of yield. According to the rules of Syngenta Crop Challenge, for which this research was conducted, we aggregated the results across all subregions and selected up to five soybean varieties that should be distributed across the network of seed retailers. The work presented in this paper was the winning solution for Syngenta Crop Challenge 2017.

  5. A new scenario framework for Climate Change Research: scenario matrix architecture

    NARCIS (Netherlands)

    van Vuuren, D.P.|info:eu-repo/dai/nl/11522016X; Kriegler, E.; O'Neill, B.C.; Ebi, K.L.; Riahi, K.; Carter, T.R.; Edmonds, J.; Hallegatte, S.; Kram, T.; Mathur, R.; Winkler, H.

    2014-01-01

    This paper describes the scenario matrix architecture that underlies a framework for developing new scenarios for climate change research. The matrix architecture facilitates addressing key questions related to current climate research and policy-making: identifying the effectiveness of different

  6. Multiple methods for multiple futures: Integrating qualitative scenario planning and quantitative simulation modeling for natural resource decision making

    Science.gov (United States)

    Symstad, Amy J.; Fisichelli, Nicholas A.; Miller, Brian W.; Rowland, Erika; Schuurman, Gregor W.

    2017-01-01

    Scenario planning helps managers incorporate climate change into their natural resource decision making through a structured “what-if” process of identifying key uncertainties and potential impacts and responses. Although qualitative scenarios, in which ecosystem responses to climate change are derived via expert opinion, often suffice for managers to begin addressing climate change in their planning, this approach may face limits in resolving the responses of complex systems to altered climate conditions. In addition, this approach may fall short of the scientific credibility managers often require to take actions that differ from current practice. Quantitative simulation modeling of ecosystem response to climate conditions and management actions can provide this credibility, but its utility is limited unless the modeling addresses the most impactful and management-relevant uncertainties and incorporates realistic management actions. We use a case study to compare and contrast management implications derived from qualitative scenario narratives and from scenarios supported by quantitative simulations. We then describe an analytical framework that refines the case study’s integrated approach in order to improve applicability of results to management decisions. The case study illustrates the value of an integrated approach for identifying counterintuitive system dynamics, refining understanding of complex relationships, clarifying the magnitude and timing of changes, identifying and checking the validity of assumptions about resource responses to climate, and refining management directions. Our proposed analytical framework retains qualitative scenario planning as a core element because its participatory approach builds understanding for both managers and scientists, lays the groundwork to focus quantitative simulations on key system dynamics, and clarifies the challenges that subsequent decision making must address.

  7. Strategic Scenario Construction Made Easy

    DEFF Research Database (Denmark)

    Duus, Henrik Johannsen

    2016-01-01

    Scenario planning is a well-known way to develop corporate strategy by creating multiple images of alternative futures. Yet although scenario planning grew from very hands-on strategy development efforts in the military and from operations research dedicated to solving practical problems, the use of scenarios in business has, in many cases, remained a cumbersome affair. Very often a large group of consultants, employees and staff is involved in the development of scenarios and strategies, thus making the whole process expensive in terms of time, money and human resources. In response, this article uses insights from the area of strategic forecasting (of which scenario planning is a proper subset) and experiences gained from a recent course in that area to develop a simpler, more direct, hands-on method for scenario construction and to provide several ideas for scenario construction that can be used...

  8. Scenario? Guilty!

    DEFF Research Database (Denmark)

    Kyng, Morten

    1992-01-01

    Robert Campbell categorizes the word "scenario" as a buzzword, identifies four major uses within HCI and suggests that we adopt new terms differentiating these four uses of the word. My first reaction to reading the article was definitely positive, but rereading it gave me enough second thoughts to warrant a response. I should probably confess that I searched my latest paper for the word "scenario" and found eight occurrences, none of which fell in the categories described by Campbell...

  9. Interferometric data modelling: issues in realistic data generation

    International Nuclear Information System (INIS)

    Mukherjee, Soma

    2004-01-01

    This study describes algorithms developed for modelling interferometric noise in a realistic manner, i.e. incorporating non-stationarity that can be seen in the data from the present generation of interferometers. The noise model is based on individual component models (ICM) with the application of auto regressive moving average (ARMA) models. The data obtained from the model are vindicated by standard statistical tests, e.g. the KS test and Akaike minimum criterion. The results indicate a very good fit. The advantage of using ARMA for ICMs is that the model parameters can be controlled and hence injection and efficiency studies can be conducted in a more controlled environment. This realistic non-stationary noise generator is intended to be integrated within the data monitoring tool framework
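
    The ARMA-based component modelling can be sketched as follows; the recursion is generic ARMA generation in plain numpy, and the orders, coefficients and segment lengths are illustrative rather than the parameters fitted in the study. Non-stationarity is mimicked simply by letting the noise level drift between segments.

        import numpy as np

        def arma_series(n, ar, ma, sigma, rng):
            """Generate n samples of an ARMA(p,q) process driven by white noise."""
            p, q = len(ar), len(ma)
            e = rng.normal(0.0, sigma, n + max(p, q))
            x = np.zeros(n + max(p, q))
            for t in range(max(p, q), len(x)):
                x[t] = (sum(ar[i] * x[t - 1 - i] for i in range(p))
                        + e[t]
                        + sum(ma[j] * e[t - 1 - j] for j in range(q)))
            return x[max(p, q):]

        rng = np.random.default_rng(42)

        # Stitch together segments whose noise level changes over time (illustrative values only).
        segments = [arma_series(2000, ar=[0.6, -0.2], ma=[0.3], sigma=s, rng=rng)
                    for s in (1.0, 1.4, 0.8)]
        noise = np.concatenate(segments)
        print(noise.shape, round(noise.std(), 3))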

  10. Learning from global emissions scenarios

    International Nuclear Information System (INIS)

    O'Neill, Brian C; Nakicenovic, Nebojsa

    2008-01-01

    Scenarios of global greenhouse gas emissions have played a key role in climate change analysis for over twenty years. Currently, several research communities are organizing to undertake a new round of scenario development in the lead-up to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC). To help inform this process, we assess a number of past efforts to develop and learn from sets of global greenhouse gas emissions scenarios. We conclude that while emissions scenario exercises have likely had substantial benefits for participating modeling teams and produced insights from individual models, learning from the exercises taken as a whole has been more limited. Model comparison exercises have typically focused on the production of large numbers of scenarios while investing little in assessing the results or the production process, perhaps on the assumption that later assessment efforts could play this role. However, much of this assessment potential remains untapped. Efforts such as scenario-related chapters of IPCC reports have been most informative when they have gone to extra lengths to carry out more specific comparison exercises, but in general these assessments do not have the remit or resources to carry out the kind of detailed analysis of scenario results necessary for drawing the most useful conclusions. We recommend that scenario comparison exercises build-in time and resources for assessing scenario results in more detail at the time when they are produced, that these exercises focus on more specific questions to improve the prospects for learning, and that additional scenario assessments are carried out separately from production exercises. We also discuss the obstacles to better assessment that might exist, and how they might be overcome. Finally, we recommend that future work include much greater emphasis on understanding how scenarios are actually used, as a guide to improving scenario production.

  11. Realist cinema as world cinema

    OpenAIRE

    Nagib, Lucia

    2017-01-01

    The idea that “realism” is the common denominator across the vast range of productions normally labelled as “world cinema” is widespread and seemly uncontroversial. Leaving aside oppositional binaries that define world cinema as the other of Hollywood or of classical cinema, this chapter will test the realist premise by locating it in the mode of production. It will define this mode as an ethics that engages filmmakers, at cinema’s creative peaks, with the physical and historical environment,...

  12. Floating Offshore Wind in Oregon: Potential for Jobs and Economic Impacts from Two Future Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Tony [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Tegen, Suzanne [National Renewable Energy Lab. (NREL), Golden, CO (United States); Speer, Bethany [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-05-01

    Construction of the first offshore wind power plant in the United States began in 2015, off the coast of Rhode Island, using fixed platform structures that are appropriate for shallow seafloors, like those located off of the East Coast and mid-Atlantic. However, floating platforms, which have yet to be deployed commercially, will likely need to anchor to the deeper seafloor if deployed off of the West Coast. To analyze the employment and economic potential for floating offshore wind along the West Coast, the Bureau of Ocean Energy Management (BOEM) commissioned the National Renewable Energy Laboratory (NREL) to analyze two hypothetical, large-scale deployment scenarios for Oregon: 5,500 megawatts (MW) of offshore wind deployment in Oregon by 2050 (Scenario A), and 2,900 MW of offshore wind by 2050 (Scenario B). These levels of deployment could power approximately 1,600,000 homes (Scenario A) or 870,000 homes (Scenario B). Offshore wind would contribute to economic development in Oregon in the near future, and more substantially in the long term, especially if equipment and labor are sourced from within the state. According to the analysis, over the 2020-2050 period, Oregon floating offshore wind facilities could support 65,000-97,000 job-years and add $6.8 billion-$9.9 billion to the state GDP (Scenario A).
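
    As a rough consistency check of the "homes powered" figures, the arithmetic below assumes a 40% capacity factor and roughly 12 MWh of electricity per household per year; neither assumption is stated in the record, so the results are only indicative.

        # Back-of-envelope check of the "homes powered" figures. The capacity factor and
        # household consumption are assumptions, not NREL inputs.
        HOURS_PER_YEAR = 8760
        CAPACITY_FACTOR = 0.40
        MWH_PER_HOME_YEAR = 12.0

        for label, mw in (("Scenario A", 5500), ("Scenario B", 2900)):
            homes = mw * HOURS_PER_YEAR * CAPACITY_FACTOR / MWH_PER_HOME_YEAR
            print(f"{label}: {mw} MW -> roughly {homes:,.0f} homes")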

  13. Possible scenarios for occurrence of M ~ 7 interplate earthquakes prior to and following the 2011 Tohoku-Oki earthquake based on numerical simulation.

    Science.gov (United States)

    Nakata, Ryoko; Hori, Takane; Hyodo, Mamoru; Ariyoshi, Keisuke

    2016-05-10

    We show possible scenarios for the occurrence of M ~ 7 interplate earthquakes prior to and following the M ~ 9 earthquake along the Japan Trench, such as the 2011 Tohoku-Oki earthquake. One such M ~ 7 earthquake is the so-called Miyagi-ken-Oki earthquake, for which we conducted numerical simulations of earthquake generation cycles by using realistic three-dimensional (3D) geometry of the subducting Pacific Plate. In a number of scenarios, the time interval between the M ~ 9 earthquake and the subsequent Miyagi-ken-Oki earthquake was equal to or shorter than the average recurrence interval during the later stage of the M ~ 9 earthquake cycle. The scenarios successfully reproduced important characteristics such as the recurrence of M ~ 7 earthquakes, coseismic slip distribution, afterslip distribution, the largest foreshock, and the largest aftershock of the 2011 earthquake. Thus, these results suggest that we should prepare for future M ~ 7 earthquakes in the Miyagi-ken-Oki segment even though this segment recently experienced large coseismic slip in 2011.

  14. Possible scenarios for occurrence of M ~ 7 interplate earthquakes prior to and following the 2011 Tohoku-Oki earthquake based on numerical simulation

    Science.gov (United States)

    Nakata, Ryoko; Hori, Takane; Hyodo, Mamoru; Ariyoshi, Keisuke

    2016-01-01

    We show possible scenarios for the occurrence of M ~ 7 interplate earthquakes prior to and following the M ~ 9 earthquake along the Japan Trench, such as the 2011 Tohoku-Oki earthquake. One such M ~ 7 earthquake is the so-called Miyagi-ken-Oki earthquake, for which we conducted numerical simulations of earthquake generation cycles by using realistic three-dimensional (3D) geometry of the subducting Pacific Plate. In a number of scenarios, the time interval between the M ~ 9 earthquake and the subsequent Miyagi-ken-Oki earthquake was equal to or shorter than the average recurrence interval during the later stage of the M ~ 9 earthquake cycle. The scenarios successfully reproduced important characteristics such as the recurrence of M ~ 7 earthquakes, coseismic slip distribution, afterslip distribution, the largest foreshock, and the largest aftershock of the 2011 earthquake. Thus, these results suggest that we should prepare for future M ~ 7 earthquakes in the Miyagi-ken-Oki segment even though this segment recently experienced large coseismic slip in 2011. PMID:27161897

  15. Finding Multiple Lanes in Urban Road Networks with Vision and Lidar

    Science.gov (United States)

    2009-03-24

    drawbacks. First, the cost of updating and maintaining millions of kilometers of roadway is prohibitive. Second, the danger of autonomous vehicles perceiving... autonomous vehicles. We propose that a data infrastructure is useful for topological information and sparse geometry, but reject relying upon it for dense... Throughout the race, approximately 50 human-driven and autonomous vehicles were simultaneously active, thus providing realistic traffic scenarios. Our most

  16. The Effect of Realistic Mathematics Education Approach on Students' Achievement And Attitudes Towards Mathematics

    Directory of Open Access Journals (Sweden)

    Effandi Zakaria

    2017-02-01

    Full Text Available This study was conducted to determine the effect of the Realistic Mathematics Education Approach on mathematics achievement and student attitudes towards mathematics. This study also sought to determine the relationship between student achievement and attitudes towards mathematics. This study used a quasi-experimental design conducted on 61 high school students at SMA Unggul Sigli. Students were divided into two groups: the treatment group (n = 30), namely the Realistic Mathematics Approach group (PMR), and the control group (n = 31), namely the traditional group. This study was conducted for six weeks. The instruments used in this study were the achievement test and the attitudes towards mathematics questionnaires. Data were analyzed using SPSS. To determine the difference in mean achievement and attitudes between the two groups, data were analyzed using a one-way ANOVA test. The results showed significant differences between the Realistic Mathematics Approach and the traditional approach in terms of achievement. The study showed no significant difference between the Realistic Mathematics Approach and the traditional approach in terms of attitudes towards mathematics. It can be concluded that the use of the Realistic Mathematics Education Approach enhanced students' mathematics achievement, but not attitudes towards mathematics. The Realistic Mathematics Education Approach encourages students to participate actively in the teaching and learning of mathematics. Thus, the Realistic Mathematics Education Approach is an appropriate method to improve the quality of the teaching and learning process.

  17. Rigorous approximation of stationary measures and convergence to equilibrium for iterated function systems

    International Nuclear Information System (INIS)

    Galatolo, Stefano; Monge, Maurizio; Nisoli, Isaia

    2016-01-01

    We study the problem of the rigorous computation of the stationary measure and of the rate of convergence to equilibrium of an iterated function system described by a stochastic mixture of two or more dynamical systems that are either all uniformly expanding on the interval or all contracting. In the expanding case, the associated transfer operators satisfy a Lasota–Yorke inequality, and we show how to compute rigorous approximations of the stationary measure in the L1 norm and an estimate for the rate of convergence. The rigorous computation requires a computer-aided proof of the contraction of the transfer operators for the maps, and we show that this property propagates to the transfer operators of the IFS. In the contracting case we perform a rigorous approximation of the stationary measure in the Wasserstein–Kantorovich distance and rate of convergence, using the same functional analytic approach. We show that a finite computation can produce a realistic computation of all contraction rates for the whole parameter space. We conclude with a description of the implementation and numerical experiments. (paper)
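
    A non-rigorous counterpart to the computation described above is a simple Monte Carlo approximation of the stationary measure of a contracting IFS: iterate a randomly chosen map and histogram the orbit. The two affine maps and all parameters below are illustrative and carry none of the certified error bounds that are the point of the paper.

        import numpy as np

        rng = np.random.default_rng(7)

        # A stochastic mixture of two contracting affine maps on [0, 1]
        # (an illustrative IFS, not one from the paper).
        maps  = [lambda x: 0.5 * x, lambda x: 0.5 * x + 0.5]
        probs = [0.5, 0.5]

        # Monte Carlo approximation of the stationary measure: iterate a single
        # orbit and histogram it after a burn-in period.
        n_iter, burn_in = 200_000, 1_000
        x = 0.3
        orbit = np.empty(n_iter)
        for t in range(n_iter):
            k = rng.choice(len(maps), p=probs)
            x = maps[k](x)
            orbit[t] = x

        hist, edges = np.histogram(orbit[burn_in:], bins=64, range=(0, 1), density=True)
        print("approximate density on first few bins:", np.round(hist[:4], 3))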

  18. Invisible Base Electrode Coordinates Approximation for Simultaneous SPECT and EEG Data Visualization

    Science.gov (United States)

    Kowalczyk, L.; Goszczynska, H.; Zalewska, E.; Bajera, A.; Krolicki, L.

    2014-04-01

    This work was performed as part of a larger research concerning the feasibility of improving the localization of epileptic foci, as compared to the standard SPECT examination, by applying the technique of EEG mapping. The presented study extends our previous work on the development of a method for superposition of SPECT images and EEG 3D maps when these two examinations are performed simultaneously. Due to the lack of anatomical data in SPECT images it is a much more difficult task than in the case of MRI/EEG study where electrodes are visible in morphological images. Using the appropriate dose of radioisotope we mark five base electrodes to make them visible in the SPECT image and then approximate the coordinates of the remaining electrodes using properties of the 10-20 electrode placement system and the proposed nine-ellipses model. This allows computing a sequence of 3D EEG maps spanning on all electrodes. It happens, however, that not all five base electrodes can be reliably identified in SPECT data. The aim of the current study was to develop a method for determining the coordinates of base electrode(s) missing in the SPECT image. The algorithm for coordinates approximation has been developed and was tested on data collected for three subjects with all visible electrodes. To increase the accuracy of the approximation we used head surface models. Freely available model from Oostenveld research based on data from SPM package and our own model based on data from our EEG/SPECT studies were used. For data collected in four cases with one electrode not visible we compared the invisible base electrode coordinates approximation for Oostenveld and our models. The results vary depending on the missing electrode placement, but application of the realistic head model significantly increases the accuracy of the approximation.

  19. Invisible Base Electrode Coordinates Approximation for Simultaneous SPECT and EEG Data Visualization

    Directory of Open Access Journals (Sweden)

    Kowalczyk L.

    2014-04-01

    Full Text Available This work was performed as part of a larger research concerning the feasibility of improving the localization of epileptic foci, as compared to the standard SPECT examination, by applying the technique of EEG mapping. The presented study extends our previous work on the development of a method for superposition of SPECT images and EEG 3D maps when these two examinations are performed simultaneously. Due to the lack of anatomical data in SPECT images it is a much more difficult task than in the case of MRI/EEG study where electrodes are visible in morphological images. Using the appropriate dose of radioisotope we mark five base electrodes to make them visible in the SPECT image and then approximate the coordinates of the remaining electrodes using properties of the 10-20 electrode placement system and the proposed nine-ellipses model. This allows computing a sequence of 3D EEG maps spanning on all electrodes. It happens, however, that not all five base electrodes can be reliably identified in SPECT data. The aim of the current study was to develop a method for determining the coordinates of base electrode(s) missing in the SPECT image. The algorithm for coordinates approximation has been developed and was tested on data collected for three subjects with all visible electrodes. To increase the accuracy of the approximation we used head surface models. Freely available model from Oostenveld research based on data from SPM package and our own model based on data from our EEG/SPECT studies were used. For data collected in four cases with one electrode not visible we compared the invisible base electrode coordinates approximation for Oostenveld and our models. The results vary depending on the missing electrode placement, but application of the realistic head model significantly increases the accuracy of the approximation.

  20. Emissions reduction scenarios in the Argentinean Energy Sector

    International Nuclear Information System (INIS)

    Di Sbroiavacca, Nicolás; Nadal, Gustavo; Lallana, Francisco; Falzon, James; Calvin, Katherine

    2016-01-01

    In this paper the LEAP, TIAM-ECN, and GCAM models were applied to evaluate the impact of a variety of climate change control policies (including carbon pricing and emission constraints relative to a base year) on primary energy consumption, final energy consumption, electricity sector development, and CO2 emission savings of the energy sector in Argentina over the 2010–2050 period. The LEAP model results indicate that if Argentina fully implements the most feasible mitigation measures currently under consideration by official bodies and key academic institutions on energy supply and demand, such as the ProBiomass program, a cumulative incremental economic cost of 22.8 billion US$(2005) to 2050 is expected, resulting in a 16% reduction in GHG emissions compared to a business-as-usual scenario. These measures also bring economic co-benefits, such as a reduction of energy imports improving the balance of trade. A Low CO2 price scenario in LEAP results in the replacement of coal by nuclear and wind energy in electricity expansion. A High CO2 price leverages additional investments in hydropower. By way of cross-model comparison with the TIAM-ECN and GCAM global integrated assessment models, significant variation in projected emissions reductions in the carbon price scenarios was observed, which illustrates the inherent uncertainties associated with such long-term projections. These models predict approximately 37% and 94% reductions under the High CO2 price scenario, respectively. By comparison, the LEAP model, using an approach based on the assessment of a limited set of mitigation options, predicts an 11.3% reduction. The main reasons for this difference include varying assumptions about technology cost and availability, CO2 storage capacity, and the ability to import bioenergy. An emission cap scenario (2050 emissions 20% lower than 2010 emissions) is feasible by including such measures as CCS and Bio CCS, but at a significant cost. In terms of technology

  1. Realistic Noise Assessment and Strain Analysis of Iranian Permanent GPS Stations

    Science.gov (United States)

    Razeghi, S. M.; Amiri Simkooei, A. A.; Sharifi, M. A.

    2012-04-01

    To assess noise characteristics of the Iranian Permanent GPS Stations (IPGS), the northwestern part of this network, namely the Azerbaijan Continuous GPS Station (ACGS), was selected. For a realistic noise assessment it is required to model all deterministic signals of the GPS time series by means of least squares harmonic estimation (LS-HE) and derive all periodic behavior of the series. After taking all deterministic signals into account, the least squares variance component estimation (LS-VCE) is used to obtain a realistic noise model (white noise plus flicker noise) of the ACGS. For this purpose, one needs simultaneous GPS time series for which a multivariate noise assessment is applied. Having determined a realistic noise model, a realistic strain analysis of the network is obtained, for which one relies on the finite element method. Finite element is now considered to be a new functional model and the new stochastic model is given based on the multivariate noise assessment using LS-VCE. The deformation rates of the components along with their full covariance matrices are input to the strain analysis. Further, the results are also provided using a pure white noise model. The normalized strains for these two models show that the strain parameters derived from a realistic noise model are less significant than those derived from the white noise model. This could be either due to the short time span of the time series used or due to the intrinsic behavior of the strain parameters in the ACGS. Longer time series are required to further elaborate this issue.
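
    The deterministic part of such an analysis, fitting a trend plus annual and semi-annual harmonics before any noise-model estimation, can be sketched with an ordinary least-squares design matrix. The synthetic daily series below is hypothetical, and the sketch omits the LS-HE search over candidate frequencies and the LS-VCE variance-component step.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical daily "up" component of a GPS station (mm), 8 years long,
        # with a trend, an annual signal and white noise.
        t = np.arange(8 * 365) / 365.25
        series = 2.0 * t + 3.0 * np.sin(2 * np.pi * t + 0.4) + rng.normal(0, 1.5, t.size)

        # Design matrix: offset, trend, annual and semi-annual harmonics
        # (the deterministic model that LS-HE helps to identify).
        A = np.column_stack([
            np.ones_like(t), t,
            np.cos(2 * np.pi * t), np.sin(2 * np.pi * t),
            np.cos(4 * np.pi * t), np.sin(4 * np.pi * t),
        ])
        coef, *_ = np.linalg.lstsq(A, series, rcond=None)
        residuals = series - A @ coef

        print("trend (mm/yr):", round(coef[1], 2))
        print("annual amplitude (mm):", round(np.hypot(coef[2], coef[3]), 2))
        print("residual std (mm):", round(residuals.std(), 2))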

  2. Scenario-based potential effects of carbon trading in China: An integrated approach

    International Nuclear Information System (INIS)

    Zhang, Cheng; Wang, Qunwei; Shi, Dan; Li, Pengfei; Cai, Wanhuan

    2016-01-01

    Highlights: • Carbon dioxide shadow price shows a negative asymmetrical correlation with carbon dioxide emissions in China. • Implementing carbon trading brings a significant Porter Hypothesis effect. • Provincial carbon trading can reduce carbon intensity by 19.79–25.24% in China. - Abstract: Using China’s provincial panel data and national panel data of OECD (Organization for Economic Co-operation and Development) and BRICS (Five major emerging national economies: Brazil, Russia, India, China and South Africa), this paper simulates the scenario-based potential effect of carbon trading in China. Analysis methods included Stochastic Frontier Analysis, Difference-in-differences Model, and Nonlinear Programming Technique. Results indicated that in a theory-based view of carbon trading, the shadow price of carbon dioxide generally rises, with a non-linear negative correlation with carbon dioxide emissions. In different regions, the shadow price of carbon dioxide presents a digressive tendency among eastern, central, and western areas, with divergent gaps between and within areas. When the greatest goal is assumed to reduce national carbon intensity as much as possible at the given national GDP (Gross Domestic Product) (Scenario I), carbon trading has the effect of reducing carbon intensity by 19.79%, with the consideration of the Porter Hypothesis effect. If the rigid constraint of national GDP is relaxed, and the dual constraint of both economic growth and environment protection in each region is introduced (Scenario II), the resulting effect is a reduced carbon intensity of 25.24%. China’s general carbon intensity in 2012 was higher than goals set at the Copenhagen Conference, but lagged behind the goal of the Twelfth Five-Year Plan for National Economy. This study provides realistic and significant technical support for the government to use in designing and deploying a national carbon trading market.

  3. Performance Assessment of a Solar-Assisted Desiccant-Based Air Handling Unit Considering Different Scenarios

    Directory of Open Access Journals (Sweden)

    Giovanni Angrisani

    2016-09-01

    Full Text Available In this paper, three alternative layouts (scenarios) of an innovative solar-assisted hybrid desiccant-based air handling unit (AHU) are investigated through dynamic simulations. Performance is evaluated with respect to a reference system and compared to those of the innovative plant without modifications. For each scenario, different collector types, surfaces and tilt angles are considered. The effect of the solar thermal energy surplus exploitation for other low-temperature uses is also investigated. The first alternative scenario consists of the recovery of the heat rejected by the condenser of the chiller to pre-heat the regeneration air. The second scenario considers the pre-heating of regeneration air with the warmer regeneration air exiting the desiccant wheel (DW). The last scenario provides pre-cooling of the process air before entering the DW. Results reveal that the plants with evacuated solar collectors (SC) can ensure primary energy savings (15%–24%) and avoid equivalent CO2 emissions (14%–22%), about 10 percentage points more than those with flat-plate collectors, when the solar thermal energy is used only for air conditioning and the collectors have the best tilt angle. If all of the solar thermal energy is considered, the best results with evacuated tube collectors are approximately 73% in terms of primary energy saving, 71% in terms of avoided equivalent CO2 emissions and a payback period of six years.

  4. Generating realistic roofs over a rectilinear polygon

    KAUST Repository

    Ahn, Heekap; Bae, Sangwon; Knauer, Christian; Lee, Mira; Shin, Chansu; Vigneron, Antoine E.

    2011-01-01

    Given a simple rectilinear polygon P in the xy-plane, a roof over P is a terrain over P whose faces are supported by planes through edges of P that make a dihedral angle π/4 with the xy-plane. In this paper, we introduce realistic roofs by imposing

  5. Realistic Visualization of Virtual Views and Virtual Cinema

    DEFF Research Database (Denmark)

    Livatino, Salvatore

    2005-01-01

    Realistic Virtual View Visualization is a new field of research which has received increasing attention in recent years. It is strictly related to the increased popularity of virtual reality and the spread of its applications, among which virtual photography and cinematography. The use of computer generated characters, "virtual actors", in motion picture production increases every day. While the best-known computer graphics techniques have largely been adopted successfully in today's fiction films, it still remains very challenging to implement virtual actors which would visually resemble human beings. Interestingly, film directors have been looking at the recent progress achieved by the research community in the field of realistic visualization of virtual views, and they have successfully implemented state of the art research approaches in their productions. An innovative concept...

  6. Scenarios and innovative systems

    International Nuclear Information System (INIS)

    2001-11-01

    The purpose of this workshop is to present to the GEDEON community the scenarios for the deployment of innovative nuclear solutions. Both steady state situations and possible transitions from the present to new reactors and fuel cycles are considered. Innovative systems that satisfy improved natural resource utilization and waste minimization criteria will be described as well as the R and D orientations of various partners. This document brings together the transparencies of 17 communications given at this workshop: general policy for transmutation and partitioning; Amster: a molten salt reactor (MSR) concept; MSR capabilities; potentials and capabilities of accelerator driven systems (ADS); ADS demonstrator interest as an experimental facility; innovative systems: gas coolant technologies; Pu management in EPR; scenarios with thorium fuel; scenarios at the equilibrium state; scenarios for transition; partitioning and specific conditioning; management of separated radio-toxic elements; European programs; DOE/AAA (Advanced Accelerator Applications) program; OECD scenario studies; CEA research programs and orientations; partitioning and transmutation: an industrial point of view. (J.S.)

  7. Scenarios for Gluino Coannihilation

    CERN Document Server

    Ellis, John; Luo, Feng; Olive, Keith A

    2016-01-01

    We study supersymmetric scenarios in which the gluino is the next-to-lightest supersymmetric particle (NLSP), with a mass sufficiently close to that of the lightest supersymmetric particle (LSP) that gluino coannihilation becomes important. One of these scenarios is the MSSM with soft supersymmetry-breaking squark and slepton masses that are universal at an input GUT renormalization scale, but with non-universal gaugino masses. The other scenario is an extension of the MSSM to include vector-like supermultiplets. In both scenarios, we identify the regions of parameter space where gluino coannihilation is important, and discuss their relations to other regions of parameter space where other mechanisms bring the dark matter density into the range allowed by cosmology. In the case of the non-universal MSSM scenario, we find that the allowed range of parameter space is constrained by the requirement of electroweak symmetry breaking, the avoidance of a charged LSP and the measured mass of the Higgs boson, in parti...

  8. A parallel adaptive finite element simplified spherical harmonics approximation solver for frequency domain fluorescence molecular imaging

    International Nuclear Information System (INIS)

    Lu Yujie; Zhu Banghe; Rasmussen, John C; Sevick-Muraca, Eva M; Shen Haiou; Wang Ge

    2010-01-01

    Fluorescence molecular imaging/tomography may play an important future role in preclinical research and clinical diagnostics. Time- and frequency-domain fluorescence imaging can acquire more measurement information than the continuous wave (CW) counterpart, improving the image quality of fluorescence molecular tomography. Although diffusion approximation (DA) theory has been extensively applied in optical molecular imaging, high-order photon migration models need to be further investigated to match quantitation provided by nuclear imaging. In this paper, a frequency-domain parallel adaptive finite element solver is developed with simplified spherical harmonics (SPN) approximations. To fully evaluate the performance of the SPN approximations, a fast time-resolved tetrahedron-based Monte Carlo fluorescence simulator suitable for complex heterogeneous geometries is developed using a convolution strategy to realize the simulation of the fluorescence excitation and emission. The validation results show that high-order SPN can effectively correct the modeling errors of the diffusion equation, especially when the tissues have high absorption characteristics or when high modulation frequency measurements are used. Furthermore, the parallel adaptive mesh evolution strategy improves the modeling precision and the simulation speed significantly on a realistic digital mouse phantom. This solver is a promising platform for fluorescence molecular tomography using high-order approximations to the radiative transfer equation.

  9. Realistic and efficient 2D crack simulation

    Science.gov (United States)

    Yadegar, Jacob; Liu, Xiaoqing; Singh, Abhishek

    2010-04-01

    Although numerical algorithms for 2D crack simulation have been studied in Modeling and Simulation (M&S) and computer graphics for decades, realism and computational efficiency are still major challenges. In this paper, we introduce a high-fidelity, scalable, adaptive and efficient runtime 2D crack/fracture simulation system by applying the mathematically elegant Peano-Cesaro triangular meshing/remeshing technique to model the generation of shards/fragments. The recursive fractal sweep associated with the Peano-Cesaro triangulation provides efficient local multi-resolution refinement to any level-of-detail. The generated binary decomposition tree also provides an efficient neighbor retrieval mechanism used for mesh element splitting and merging with minimal memory requirements essential for realistic 2D fragment formation. Upon load impact/contact/penetration, a number of factors including impact angle, impact energy, and material properties are all taken into account to produce the criteria of crack initialization, propagation, and termination leading to realistic fractal-like rubble/fragment formation. The aforementioned parameters are used as variables of probabilistic models of crack/shard formation, making the proposed solution highly adaptive by allowing machine learning mechanisms to learn the optimal values for the variables/parameters based on prior benchmark data generated by off-line physics-based simulation solutions that produce accurate fractures/shards, though at a highly non-real-time pace. Crack/fracture simulation has been conducted on various load impacts with different initial locations at various impulse scales. The simulation results demonstrate that the proposed system has the capability to realistically and efficiently simulate 2D crack phenomena (such as window shattering and shards generation) with diverse potentials in military and civil M&S applications such as training and mission planning.
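
    The local multi-resolution refinement idea can be illustrated with a simple longest-edge bisection of triangles near an impact point. This is only a generic refinement sketch in the spirit of the description above, not the Peano-Cesaro implementation or the probabilistic crack models used by the authors.

        import math

        def bisect(tri):
            """Split a triangle (a, b, c) across the midpoint of its longest edge."""
            a, b, c = tri
            # order vertices so that edge (a, b) is the longest
            edges = sorted([(a, b, c), (b, c, a), (c, a, b)],
                           key=lambda e: -math.dist(e[0], e[1]))
            a, b, c = edges[0]
            m = ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)
            return [(a, m, c), (m, b, c)]

        def refine_near(tris, point, radius, depth):
            """Recursively bisect triangles whose centroid lies near an impact point."""
            if depth == 0:
                return tris
            out = []
            for tri in tris:
                cx = sum(v[0] for v in tri) / 3.0
                cy = sum(v[1] for v in tri) / 3.0
                if math.dist((cx, cy), point) < radius:
                    out.extend(bisect(tri))
                else:
                    out.append(tri)
            return refine_near(out, point, radius, depth - 1)

        square = [((0, 0), (1, 0), (1, 1)), ((0, 0), (1, 1), (0, 1))]
        mesh = refine_near(square, point=(0.7, 0.3), radius=0.4, depth=6)
        print(len(mesh), "triangles after local refinement")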

  10. Low carbon society scenario 2050 in Thai industrial sector

    International Nuclear Information System (INIS)

    Selvakkumaran, Sujeetha; Limmeechokchai, Bundit; Masui, Toshihiko; Hanaoka, Tatsuya; Matsuoka, Yuzuru

    2014-01-01

    Highlights: • The Thai industrial sector has been modelled using the AIM/Enduse model. • Potential mitigation of CO2 for 2050 is approximately 20% from the Baseline scenario. • Abatement cost curves show that varied counter measures are practical in the industrial sector. • Energy security is enhanced due to CO2 mitigation in the LCS scenario. - Abstract: Energy plays a dominant role in determining the individual competitiveness of a country and this is more relevant to emerging economies. That being said, energy also plays an important and ever expanding role in carbon emissions and sustainability of the country. As a developing country Thailand’s industrial sector is vibrant and robust and consumes the majority of the energy. In addition, it also has the highest CO2 emissions, provided the emissions of power generation are taken into account. Industry also accounts for the highest consumption of electricity in Thailand. The objective of this study is to model the Thai industrial energy sector and estimate the mitigation potential for the timeframe of 2010–2050 using the principles of Low Carbon Society (LCS). In addition, the paper would also evaluate emission tax as a key driver of Greenhouse Gas (GHG) mitigation along with Marginal Abatement Cost (MAC) analysis. Another secondary objective is to analyse the impact of mitigation on energy security of the industrial sector. The Thai industrial sector was modelled using the AIM/Enduse model, which is a recursive dynamic optimisation model belonging to the Asia–Pacific Integrated Model (AIM) family. The Thai industrial sector was divided into nine sub-sectors based on national economic reporting procedures. Results suggest that the mitigation potential in 2050, compared to the Baseline scenario, is around 20% with positive impacts on energy security. The Baseline emission will approximately be 377 Mt-CO2 in the industrial sector. All four indicators of energy security, Primary Energy Intensity, Carbon Intensity, Oil

  11. Does really Born-Oppenheimer approximation break down in charge transfer processes? An exactly solvable model

    International Nuclear Information System (INIS)

    Kuznetsov, Alexander M.; Medvedev, Igor G.

    2006-01-01

    Effects of deviation from the Born-Oppenheimer approximation (BOA) on the non-adiabatic transition probability for the transfer of a quantum particle in condensed media are studied within an exactly solvable model. The particle and the medium are modeled by a set of harmonic oscillators. The dynamic interaction of the particle with a single local mode is treated explicitly without the use of BOA. Two particular situations (symmetric and non-symmetric systems) are considered. It is shown that the difference between the exact solution and the true BOA is negligibly small at realistic parameters of the model. However, the exact results differ considerably from those of the crude Condon approximation (CCA) which is usually considered in the literature as a reference point for BOA (Marcus-Hush-Dogonadze formula). It is shown that the exact rate constant can be smaller (symmetric system) or larger (non-symmetric one) than that obtained in CCA. The non-Condon effects are also studied

  12. Limiting efficiency of generalized realistic c-Si solar cells coupled to ideal up-converters

    Science.gov (United States)

    Johnson, Craig M.; Conibeer, Gavin J.

    2012-11-01

    The detailed balance model of photovoltaic up-conversion is revised for the specific case of a c-Si solar cell under the AM1.5G solar spectrum. The limiting efficiency of an ideal solar cell with a band gap of 1.117 eV may be increased from approximately 33% to 40% with ideal up-conversion. However, real solar cells do not demonstrate the step-function absorption characteristic assumed in the standard detailed balance model. Here, we use tabulated Si refractive index data to develop a generalized model of a realistic conventional c-Si solar cell. The model incorporates optical design and material parameters such as free carrier absorption that have a non-trivial impact on the operation of the up-conversion layer. While these modifications are shown to decrease the absolute limiting efficiency, the benefit of up-conversion is shown to be relatively greater.

  13. A working definition of scenario and a method of scenario construction

    International Nuclear Information System (INIS)

    Barr, G.E.; Dunn, E.

    1992-01-01

    The event-tree method of scenario construction has been chosen for the Yucca Mountain performance assessment. Its applicability and suitability to the problem are discussed and compared with those of the Nuclear Regulatory Commission (NRC) method. The event-tree method is appropriate for an incompletely characterized site, where there must be an evolving understanding, over time, of the processes at work, for a site that may require analysis of details in specific context, and when the scenario functions to guide site characterization. Anticipating the eventual requirement for using the NRC method, we show that the event-tree method can be translated to the NRC format after final scenario screening

  14. ABCtoolbox: a versatile toolkit for approximate Bayesian computations

    Directory of Open Access Journals (Sweden)

    Neuenschwander Samuel

    2010-03-01

    Full Text Available Abstract Background The estimation of demographic parameters from genetic data often requires the computation of likelihoods. However, the likelihood function is computationally intractable for many realistic evolutionary models, and the use of Bayesian inference has therefore been limited to very simple models. The situation changed recently with the advent of Approximate Bayesian Computation (ABC) algorithms allowing one to obtain parameter posterior distributions based on simulations not requiring likelihood computations. Results Here we present ABCtoolbox, a series of open source programs to perform Approximate Bayesian Computations (ABC). It implements various ABC algorithms including rejection sampling, MCMC without likelihood, a Particle-based sampler and ABC-GLM. ABCtoolbox is bundled with, but not limited to, a program that allows parameter inference in a population genetics context and the simultaneous use of different types of markers with different ploidy levels. In addition, ABCtoolbox can also interact with most simulation and summary statistics computation programs. The usability of the ABCtoolbox is demonstrated by inferring the evolutionary history of two evolutionary lineages of Microtus arvalis. Using nuclear microsatellites and mitochondrial sequence data in the same estimation procedure enabled us to infer sex-specific population sizes and migration rates and to find that males show smaller population sizes but much higher levels of migration than females. Conclusion ABCtoolbox allows a user to perform all the necessary steps of a full ABC analysis, from parameter sampling from prior distributions, data simulations, computation of summary statistics, estimation of posterior distributions, model choice, validation of the estimation procedure, and visualization of the results.
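
    The rejection-sampling flavour of ABC that such toolboxes implement can be written in a few lines. The toy model below (estimating the mean of a normal sample from its sample mean) stands in for the likelihood-free evolutionary models the paper targets; the prior, tolerance and summary statistic are all illustrative choices.

        import numpy as np

        rng = np.random.default_rng(11)

        # "Observed" data and its summary statistic (toy example, not population genetics).
        observed = rng.normal(loc=2.0, scale=1.0, size=100)
        s_obs = observed.mean()

        def simulate(theta, n=100):
            """Simulator standing in for a likelihood-free evolutionary model."""
            return rng.normal(loc=theta, scale=1.0, size=n)

        # ABC rejection sampling: draw from the prior, simulate, keep parameters whose
        # simulated summary statistic is close enough to the observed one.
        n_draws, tolerance = 50_000, 0.05
        prior_draws = rng.uniform(-5.0, 5.0, n_draws)
        accepted = [th for th in prior_draws
                    if abs(simulate(th).mean() - s_obs) < tolerance]

        posterior = np.array(accepted)
        print(f"accepted {posterior.size} of {n_draws}; posterior mean {posterior.mean():.2f}")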

  15. A Case Study on Air Combat Decision Using Approximated Dynamic Programming

    Directory of Open Access Journals (Sweden)

    Yaofei Ma

    2014-01-01

    Full Text Available As a continuous state space problem, air combat is difficult to resolve with traditional dynamic programming (DP) over a discretized state space. The approximated dynamic programming (ADP) approach is studied in this paper to build a high performance decision model for air combat in a 1-versus-1 scenario, in which the iterative process for policy improvement is replaced by mass sampling from history trajectories and utility function approximation, eventually leading to high efficiency in policy improvement. A continuous reward function is also constructed to better guide the plane to find its way to the "winner" state from any initial situation. According to our experiments, the plane is more offensive when following the policy derived from the ADP approach rather than the baseline Min-Max policy, in which the "time to win" is reduced greatly but the cumulative probability of being killed by the enemy is higher. The reason is analyzed in this paper.
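
    The core ADP loop, sampling states, backing them up with the current value approximation and refitting the approximator instead of sweeping a discretized grid, can be sketched on a toy one-dimensional pursuit problem. The dynamics, reward "envelope", polynomial features and all constants below are invented for illustration and are unrelated to the air-combat model of the paper.

        import numpy as np

        rng = np.random.default_rng(5)
        GAMMA, ACTIONS = 0.9, (-1.0, 0.0, 1.0)

        def step(d, a):
            """Toy 1-D 'pursuit' dynamics: relative distance after one move plus noise."""
            return float(np.clip(d + a + rng.normal(0, 0.2), 0.0, 10.0))

        def reward(d):
            """Reward 1 inside a notional 'firing envelope', else 0."""
            return 1.0 if 1.0 <= d <= 3.0 else 0.0

        def features(d):
            return np.array([1.0, d, d ** 2, d ** 3])

        def backup(d, w):
            """One-step Bellman backup using the current value approximation."""
            best = -np.inf
            for a in ACTIONS:
                nd = step(d, a)
                best = max(best, reward(nd) + GAMMA * (features(nd) @ w))
            return best

        # Fitted value iteration: sampled states replace exhaustive sweeps over a grid.
        w = np.zeros(4)
        for _ in range(50):
            states = rng.uniform(0.0, 10.0, 500)
            targets = np.array([backup(d, w) for d in states])
            X = np.array([features(d) for d in states])
            w, *_ = np.linalg.lstsq(X, targets, rcond=None)

        print("approximate values at d = 0, 2, 6:",
              [round(float(features(d) @ w), 2) for d in (0.0, 2.0, 6.0)])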

  16. Stochastic-shielding approximation of Markov chains and its application to efficiently simulate random ion-channel gating.

    Science.gov (United States)

    Schmandt, Nicolaus T; Galán, Roberto F

    2012-09-14

    Markov chains provide realistic models of numerous stochastic processes in nature. We demonstrate that in any Markov chain, the change in occupation number in state A is correlated to the change in occupation number in state B if and only if A and B are directly connected. This implies that if we are only interested in state A, fluctuations in B may be replaced with their mean if state B is not directly connected to A, which shortens computing time considerably. We show the accuracy and efficacy of our approximation theoretically and in simulations of stochastic ion-channel gating in neurons.
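
    The statement above suggests a direct numerical check: in a three-state channel scheme C1 <-> C2 <-> O, where only the open state O is of interest and C1 is not directly connected to O, fluctuations on the C1-C2 edge can be replaced by their mean. The rate constants, population size and discrete-time update below are illustrative; the time step is kept small so transition probabilities stay well below one.

        import numpy as np

        rng = np.random.default_rng(8)

        # Three-state channel C1 <-> C2 <-> O simulated for a population of N channels.
        N, steps, dt = 1000, 5000, 0.01
        k12, k21, k23, k32 = 2.0, 1.0, 1.5, 1.0   # illustrative rate constants (1/s)

        def simulate(shield_c1):
            n = np.array([N, 0.0, 0.0])            # channel counts in C1, C2, O
            open_trace = np.empty(steps)
            for t in range(steps):
                if shield_c1:
                    # stochastic shielding: replace fluctuations on the C1<->C2 edge by their mean
                    f12, f21 = n[0] * k12 * dt, n[1] * k21 * dt
                else:
                    f12 = rng.binomial(int(n[0]), min(k12 * dt, 1.0))
                    f21 = rng.binomial(int(n[1]), min(k21 * dt, 1.0))
                # the edge adjacent to the state of interest stays stochastic in both cases
                f23 = rng.binomial(int(n[1]), min(k23 * dt, 1.0))
                f32 = rng.binomial(int(n[2]), min(k32 * dt, 1.0))
                n = n + np.array([f21 - f12, f12 - f21 + f32 - f23, f23 - f32])
                open_trace[t] = n[2]
            return open_trace

        full = simulate(shield_c1=False)
        shielded = simulate(shield_c1=True)
        print("mean open fraction  full/shielded:",
              round(full[1000:].mean() / N, 3), round(shielded[1000:].mean() / N, 3))
        print("open-count variance full/shielded:",
              round(full[1000:].var(), 1), round(shielded[1000:].var(), 1))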

  17. An Overview of Westinghouse Realistic Large Break LOCA Evaluation Model

    Directory of Open Access Journals (Sweden)

    Cesare Frepoli

    2008-01-01

    Full Text Available Since the 1988 amendment of the 10 CFR 50.46 rule, Westinghouse has been developing and applying realistic or best-estimate methods to perform LOCA safety analyses. A realistic analysis requires the execution of various realistic LOCA transient simulations where the effects of both model and input uncertainties are ranged and propagated throughout the transients. The outcome is typically a range of results with associated probabilities. The thermal/hydraulic code is the engine of the methodology but a procedure is developed to assess the code and determine its biases and uncertainties. In addition, inputs to the simulation are also affected by uncertainty and these uncertainties are incorporated into the process. Several approaches have been proposed and applied in the industry in the framework of best-estimate methods. Most of the implementations, including Westinghouse, follow the Code Scaling, Applicability and Uncertainty (CSAU) methodology. Westinghouse methodology is based on the use of the WCOBRA/TRAC thermal-hydraulic code. The paper starts with an overview of the regulations and their interpretation in the context of realistic analysis. The CSAU roadmap is reviewed in the context of its implementation in the Westinghouse evaluation model. An overview of the code (WCOBRA/TRAC) and methodology is provided. Finally, the recent evolution to nonparametric statistics in the current edition of the W methodology is discussed. Sample results of a typical large break LOCA analysis for a PWR are provided.

  18. Learning Through Scenario Planning

    DEFF Research Database (Denmark)

    Balarezo, Jose

    This project investigates the uses and effects of scenario planning in companies operating in highly uncertain and dynamic environments. Whereas previous research on scenario planning has fallen short of providing sufficient evidence of its mechanisms and effects on individual or organizational level variables, this research corrects this void by investigating the dynamics of organizational learning through the lenses of a corporate scenario planning process. This enhances our scientific understanding of the role that scenario planning might play in the context of organizational learning and strategic renewal. Empirical evidence of the various difficulties that learning flows have to overcome as they journey through organizational and hierarchical levels is presented. Despite various cognitive and social psychological barriers identified along the way, the results show the novel...

  19. Simulación clínica de alto realismo: una experiencia en el pregrado / Realistic clinical simulation: an experience with undergraduate medical students

    Directory of Open Access Journals (Sweden)

    Javier Riancho

    2012-06-01

    Full Text Available Introduction. Simulation with high-realism models is often used in the training of health professionals, but experiences at the undergraduate level are scarce. The aim of this study was to assess the feasibility and acceptance of its use with sixth-year undergraduate medical students. Materials and methods. Eight scenarios simulating frequent clinical problems were designed to be run with high-realism manikins. Students were divided into groups of 6-8, each of which handled two cases for 30 minutes, followed by a 25-40 minute debriefing. The activity was repeated in two consecutive years, and student opinions were collected at the end through anonymous surveys. Results. The activity was rated very positively by the students, who considered it "useful" (4.8 and 4.9 points out of 5) and "interesting" (4.9 and 4.9 points). About 3 hours were needed to prepare each scenario, and a full working day of one instructor, one technician and one nurse was required for a group of about 40 students to work through two clinical cases each. Conclusions. This pilot experience suggests that high-realism simulation is feasible at the undergraduate level, requires a reasonable amount of resources and is highly accepted by students. Nevertheless, further studies are needed to confirm the subjective impression that it helps to enhance students' learning and clinical competence.

  20. Trends and reduction scenarios for Rn-222 concentrations in dwellings

    International Nuclear Information System (INIS)

    Blaauboer, R.O.; Heling, R.

    1993-07-01

    In the title study the effects of possible measures on the average radon concentration in Dutch dwellings are evaluated. Attention is paid to the trends in building methods and the use of building materials, and to using these trends as a reference development (scenario 0). A total of seven scenarios has been evaluated. The model that was used was kept rather simple, because most of the parameter values are average values. The measures studied were selected on the basis of cost-effectiveness. All measures are based on reducing the infiltration of radon from the crawl space under the house to the living quarters and reducing the exhalation rates of building materials. The evaluation shows a rather good match with earlier measurements and projections as far as the average radon concentration is concerned. The trend, i.e. the development without taking measures directed at reducing the radon concentration, predicts a slow increase of about 15% until approximately the year 2025. The scenario that is directed at using concrete with low Ra-226 concentrations in new houses projects an end to this trend. Other scenarios reveal that taking measures solely in the existing housing stock would give a substantial decrease in radon concentrations in the near future. The spreadsheet model that was developed to evaluate the consequences of the different scenarios projects a possible reduction of the average radon concentration in dwellings of 25% by the year 2025, compared to 1991, if measures directed at Rn reduction are applied. If in addition to that concrete with low Ra-226 concentrations is used in new buildings, a reduction of the average radon concentration of about 30% is projected. This would result in an average radon concentration in dwellings of about 23 Bq/m3 in the future. These reduction percentages have to be handled with some care, however, because the effect of the uncertainties in several of the parameters used is not yet quite clear. Trends in and the
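
    The scenario arithmetic reported above can be reproduced with a few lines of spreadsheet-style code. The 1991 baseline concentration is back-calculated from the quoted ~23 Bq/m3 end point and is an assumption, not a value taken from the report.

        # Minimal spreadsheet-style scenario projection for the average indoor radon
        # concentration, using the percentages quoted in the record. The 1991 average
        # (~33 Bq/m3) is an assumed input.
        baseline_1991 = 33.0   # Bq/m3, assumed

        scenarios = {
            "trend, no measures (to ~2025)": +0.15,
            "measures in existing and new dwellings": -0.25,
            "measures plus low Ra-226 concrete": -0.30,
        }
        for name, change in scenarios.items():
            print(f"{name:40s} -> {baseline_1991 * (1 + change):5.1f} Bq/m3")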

  1. An interaction scenario of the galaxy pair NGC 3893/96 (KPG 302): A single passage?

    Energy Technology Data Exchange (ETDEWEB)

    Gabbasov, R. F.; Rosado, M. [Instituto de Astronomía, Universidad Nacional Autónoma de Mexico (UNAM), A.P. 70-264,04510 México D.F. (Mexico); Klapp, J., E-mail: ruslan.gabb@gmail.com [Instituto Nacional de Investigaciones Nucleares, Carretera México-Toluca S/N, La Marquesa, Ocoyoacac, 52750 Estado de México (Mexico)

    2014-05-20

    Using the data obtained previously from Fabry-Perot interferometry, we study the orbital characteristics of the interacting pair of galaxies KPG 302 with the aim to estimate a possible interaction history, the conditions necessary for the spiral arm formation, and initial satellite mass. We found by performing N-body/smoothed particle hydrodynamics simulations of the interaction that a single passage can produce a grand design spiral pattern in less than 1 Gyr. Although we reproduce most of the features with the single passage, the required satellite to host mass ratio should be ∼1:5, which is not confirmed by the dynamical mass estimate made from the measured rotation curve. We conclude that a more realistic interaction scenario would require several passages in order to explain the mass ratio discrepancy.

  2. TMS modeling toolbox for realistic simulation.

    Science.gov (United States)

    Cho, Young Sun; Suh, Hyun Sang; Lee, Won Hee; Kim, Tae-Seong

    2010-01-01

    Transcranial magnetic stimulation (TMS) is a technique for brain stimulation using rapidly changing magnetic fields generated by coils. It has been established as an effective stimulation technique to treat patients suffering from damaged brain functions. Although TMS is known to be painless and noninvasive, it can also be harmful to the brain through incorrect focusing and excessive stimulation, which might result in seizures. Therefore there is an ongoing research effort to elucidate and better understand the effect and mechanism of TMS. Lately, the boundary element method (BEM) and the finite element method (FEM) have been used to simulate the electromagnetic phenomena of TMS. However, there is a lack of general tools to generate TMS models due to some difficulties in realistic modeling of the human head and TMS coils. In this study, we have developed a toolbox through which one can generate high-resolution FE TMS models. The toolbox allows creating FE models of the head with isotropic and anisotropic electrical conductivities in five different tissues of the head and the coils in 3D. The generated TMS model is importable to FE software packages such as ANSYS for further and efficient electromagnetic analysis. We present a set of demonstrative results of realistic simulation of TMS with our toolbox.

  3. Investigating the need for complex vs. simple scenarios to improve predictions of aquatic ecosystem exposure with the SoilPlus model

    International Nuclear Information System (INIS)

    Ghirardello, Davide; Morselli, Melissa; Otto, Stefan; Zanin, Giuseppe; Di Guardo, Antonio

    2014-01-01

    A spatially-explicit version of the recent multimedia fate model SoilPlus was developed and applied to predict the runoff of three pesticides in a small agricultural watershed in north-eastern Italy. In order to evaluate model response to increasing spatial resolution, a tiered simulation approach was adopted, also using a dynamic model for surface water (the DynA model), to predict the fate of pesticides in runoff water and sediment, and concentrations in river water. Simulation outputs were compared to water concentrations measured in the basin. Results showed that a high spatial resolution and scenario complexity improved model predictions of metolachlor and terbuthylazine in runoff to an acceptable performance (R² = 0.64–0.70). The importance of a field-based database of properties (i.e. soil texture and organic carbon, rainfall and water flow, pesticide half-life in soil) in reducing the distance between predicted and measured surface water concentrations, and its relevance for risk assessment, was also shown. Highlights: • A GIS-based model was developed to predict pesticide fate in soil and water. • The spatial scenario was obtained at field level for a small agricultural basin. • A tiered strategy was applied to test the performance gain with complexity. • Increased scenario detail as well as the role of surface water are relevant. -- In order to obtain more ecologically realistic predictions of pulse exposure in aquatic ecosystems, detailed information about the scenario is required

  4. Scenario-based strategizing

    DEFF Research Database (Denmark)

    Lehr, Thomas; Lorenz, Ullrich; Willert, Markus

    2017-01-01

    For over 40 years, scenarios have been promoted as a key technique for forming strategies in uncertain en- vironments. However, many challenges remain. In this article, we discuss a novel approach designed to increase the applicability of scenario-based strategizing in top management teams. Drawi...... Ministry) and a firm affected by disruptive change (Bosch, leading global supplier of technology and solutions)....

  5. Risk assessment by convergence methodology in RDD scenarios

    International Nuclear Information System (INIS)

    Araujo, Olga Maria Oliveira de; Andrade, Edson Ramos de; Rebello, Wilson Freitas; Silva, Gabriel Fidalgo Queiroz da

    2015-01-01

    An RDD event involves an explosion and the dispersion of radioactive material, in which particles containing radioactive material can reach great distances from the original point of the explosion, generating a plume of contamination. The use of an RDD is regarded as the most likely radiological terrorism scenario. Accurate information on the population and the estimated dose are essential for analysis during the decision process. This work presents a proposal for a convergence of methodologies, using the computer simulation code Hotspot Health Physics 3.0 and the statistical model of the Radiation Effects Research Foundation (RERF), to calculate the approximate dose as a function of distance from the original point of the explosion of an RDD. From those data, the relative risk of developing tumors is estimated, as well as the probability of causation. At a later stage, the proposed combination of actions is intended to support decision-making and the deployment of response personnel in emergency protection measures, such as sheltering and evacuation, through the RESRAD-RDD software. The convergence of the proposed methodologies can accelerate the process of acquiring information during the first hours of a radiological scenario and support proper management of the medical response and organization of the overall response. (author)

  6. Risk assessment by convergence methodology in RDD scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Olga Maria Oliveira de; Andrade, Edson Ramos de; Rebello, Wilson Freitas; Silva, Gabriel Fidalgo Queiroz da, E-mail: olgafisica2013@hotmail.com, E-mail: fisica.dna@gmail.com, E-mail: rebello@ime.eb.br, E-mail: profgabriel.fisica@gmail.com [Instituto Militar de Engenharia (IME), Rio de Janeiro, RJ (Brazil). Secao de Engenharia Nuclear

    2015-07-01

    An RDD event involves an explosion and the dispersion of radioactive material, in which particles containing radioactive material can reach great distances from the original point of the explosion, generating a plume of contamination. The use of an RDD is regarded as the most likely radiological terrorism scenario. Accurate information on the population and the estimated dose are essential for analysis during the decision process. This work presents a proposal for a convergence of methodologies, using the computer simulation code Hotspot Health Physics 3.0 and the statistical model of the Radiation Effects Research Foundation (RERF), to calculate the approximate dose as a function of distance from the original point of the explosion of an RDD. From those data, the relative risk of developing tumors is estimated, as well as the probability of causation. At a later stage, the proposed combination of actions is intended to support decision-making and the deployment of response personnel in emergency protection measures, such as sheltering and evacuation, through the RESRAD-RDD software. The convergence of the proposed methodologies can accelerate the process of acquiring information during the first hours of a radiological scenario and support proper management of the medical response and organization of the overall response. (author)
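
    The abstract above describes a chain from distance to approximate dose to relative risk and probability of causation. The sketch below only illustrates that chain with a hypothetical dose-versus-distance table and an assumed linear excess-relative-risk coefficient; it does not reproduce HotSpot, RERF or RESRAD-RDD calculations, whose models and parameters are not given here.

```python
# Hypothetical illustration of the dose -> risk chain described above. The dose
# values and the excess-relative-risk coefficient are assumptions for
# demonstration only; they are not HotSpot or RERF outputs.

doses_by_distance = {  # distance from the explosion point (m) -> dose (Sv)
    100: 0.50,
    500: 0.05,
    1000: 0.01,
}

ERR_PER_SV = 0.5  # assumed linear excess relative risk per sievert

for distance_m, dose_sv in doses_by_distance.items():
    err = ERR_PER_SV * dose_sv            # excess relative risk at this dose
    relative_risk = 1.0 + err
    probability_of_causation = err / relative_risk
    print(f"{distance_m:5d} m: dose={dose_sv:.2f} Sv, "
          f"RR={relative_risk:.3f}, PC={probability_of_causation:.1%}")
```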

  7. 40 Years of Shell Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-02-15

    Shell has been using scenario planning for four decades. During that time these scenarios have helped the company and governments across the world to make better strategic choices. Scenarios provide lenses that help see future prospects more clearly, make richer judgments and be more sensitive to uncertainties. Discover how the Shell Scenarios team has helped guide decision makers at major moments in history, and get a peek at the team's future focus, including the intricate relationship between energy, water and food.

  8. Realistic edge field model code REFC for designing and study of isochronous cyclotron

    International Nuclear Information System (INIS)

    Ismail, M.

    1989-01-01

    The focussing properties and the requirements for isochronism of a cyclotron magnet configuration are well known in the hard-edge field model. The fact that they quite often change considerably in a realistic field can be attributed mainly to the influence of the edge field. A solution to this problem requires a field model which allows a simple construction of equilibrium orbits and yields simple formulae. This can be achieved by using a fitted realistic edge field (Hudson et al 1975) in the region of the pole edge, and such a field model is therefore called a realistic edge field model. A code REFC based on the realistic edge field model has been developed to design the cyclotron sectors, and the code FIELDER has been used to study the beam properties. In this report the REFC code is described along with some relevant explanation of the FIELDER code. (author). 11 refs., 6 figs

  9. Realistic terrain visualization based on 3D virtual world technology

    Science.gov (United States)

    Huang, Fengru; Lin, Hui; Chen, Bin; Xiao, Cai

    2010-11-01

    The rapid advances in information technologies, e.g., network, graphics processing, and virtual world, have provided challenges and opportunities for new capabilities in information systems, Internet applications, and virtual geographic environments, especially geographic visualization and collaboration. In order to achieve meaningful geographic capabilities, we need to explore and understand how these technologies can be used to construct virtual geographic environments to help to engage geographic research. The generation of three-dimensional (3D) terrain plays an important part in geographical visualization, computer simulation, and virtual geographic environment applications. The paper introduces concepts and technologies of virtual worlds and virtual geographic environments, explores integration of realistic terrain and other geographic objects and phenomena of natural geographic environment based on SL/OpenSim virtual world technologies. Realistic 3D terrain visualization is a foundation of construction of a mirror world or a sand box model of the earth landscape and geographic environment. The capabilities of interaction and collaboration on geographic information are discussed as well. Further virtual geographic applications can be developed based on the foundation work of realistic terrain visualization in virtual environments.

  10. Scenarios in society, society in scenarios: toward a social scientific analysis of storyline-driven environmental modeling

    International Nuclear Information System (INIS)

    Garb, Yaakov; Pulver, Simone; VanDeveer, Stacy D

    2008-01-01

    Scenario analysis, an approach to thinking about alternative futures based on storyline-driven modeling, has become increasingly common and important in attempts to understand and respond to the impacts of human activities on natural systems at a variety of scales. The construction of scenarios is a fundamentally social activity, yet social scientific perspectives have rarely been brought to bear on it. Indeed, there is a growing imbalance between the increasing technical sophistication of the modeling elements of scenarios and the continued simplicity of our understanding of the social origins, linkages, and implications of the narratives to which they are coupled. Drawing on conceptual and methodological tools from science and technology studies, sociology and political science, we offer an overview of what a social scientific analysis of scenarios might include. In particular, we explore both how scenarios intervene in social microscale and macroscale contexts and how aspects of such contexts are embedded in scenarios, often implicitly. Analyzing the social 'work' of scenarios (i) can enhance the understanding of scenario developers and modeling practitioners of the knowledge production processes in which they participate and (ii) can improve the utility of scenario products as decision-support tools to actual, rather than imagined, decision-makers.

  11. Electromagnetic field effect simulation over a realistic pixeled phantom human's brain

    International Nuclear Information System (INIS)

    Rojas, R.; Calderon, J. A.; Rivera, T.; Azorin, J.

    2012-10-01

    Exposure to different types of electromagnetic radiation can produce damage and injuries to people's tissues. Scientists spend time and resources studying the effects of electromagnetic fields on organs. Particularly in medical areas, specialists in imaging methodologies and radiological treatment are very concerned about not injuring their patients. Determining the matter-radiation interaction, whether experimentally or theoretically, is not an easy task. In the first case, it is not possible to take measurements inside the patient, so the experimental procedure consists of taking measurements in a human dummy; however, it is not possible to see the deformation of the electromagnetic fields due to the presence of the organs. In the second case, it is necessary to solve Maxwell's equations with the electromagnetic field crossing many organs and tissues, each with different electric and magnetic properties. One alternative for a theoretical solution is computational simulation; however, this option requires an enormous amount of memory and long computation times. Hence, most simulations are made in two or three dimensions, using approximate human models built from basic geometrical figures such as spheres, cylinders and ellipsoids. Obviously, such models only yield a coarse solution of the actual situation. In this work, we propose a novel methodology to build a realistic pixeled phantom of human organs and to solve Maxwell's equations over these models; the resulting solutions are evidently closer to the real behaviour. Additionally, these models are optimized when they are discretized and the finite element method is used to calculate the electromagnetic field and the induced currents. (Author)

  12. Realistic Affective Forecasting: The Role of Personality

    Science.gov (United States)

    Hoerger, Michael; Chapman, Ben; Duberstein, Paul

    2016-01-01

    Affective forecasting often drives decision making. Although affective forecasting research has often focused on identifying sources of error at the event level, the present investigation draws upon the ‘realistic paradigm’ in seeking to identify factors that similarly influence predicted and actual emotions, explaining their concordance across individuals. We hypothesized that the personality traits neuroticism and extraversion would account for variation in both predicted and actual emotional reactions to a wide array of stimuli and events (football games, an election, Valentine’s Day, birthdays, happy/sad film clips, and an intrusive interview). As hypothesized, individuals who were more introverted and neurotic anticipated, correctly, that they would experience relatively more unpleasant emotional reactions, and those who were more extraverted and less neurotic anticipated, correctly, that they would experience relatively more pleasant emotional reactions. Personality explained 30% of the concordance between predicted and actual emotional reactions. Findings suggest three purported personality processes implicated in affective forecasting, highlight the importance of individual-differences research in this domain, and call for more research on realistic affective forecasts. PMID:26212463

  13. Scenarios for remote gas production

    International Nuclear Information System (INIS)

    Tangen, Grethe; Molnvik, Mona J.

    2009-01-01

    The amount of natural gas resources accessible via proven production technology and existing infrastructure is declining. Therefore, smaller and less accessible gas fields are considered for commercial exploitation. The research project Enabling production of remote gas builds knowledge and technology aimed at developing competitive remote gas production based on floating LNG and chemical gas conversion. In this project, scenarios are used as a basis for directing research related to topics that affect the overall design and operation of such plants. Selected research areas are safety, environment, power supply, operability and control. The paper summarises the scenario building process as a common effort among research institutes and industry. Further, it documents four scenarios for production of remote gas and outlines how the scenarios are applied to establish research strategies and adequate plans in a multidisciplinary project. To ensure the relevance of the scenarios, it is important to adapt the building process to the current problem, and the scenarios should be developed with extensive participation of key personnel.

  14. The Reality and Future Scenarios of Commercial Building Energy Consumption in China

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Nan; Lin, Jiang

    2007-08-01

    While China's 11th Five Year Plan called for a reduction of energy intensity by 2010, whether and how the energy consumption trend can be changed in a short time has been hotly debated. This research intends to evaluate the impact of a variety of scenarios of GDP growth, energy elasticity and energy efficiency improvement on energy consumption in commercial buildings in China using a detailed China End-use Energy Model. China's official energy statistics have limited information on energy demand by end use. This is a particularly pertinent issue for building energy consumption. The authors have applied reasoned judgments, based on experience of working on Chinese efficiency standards and energy related programs, to present a realistic interpretation of the current energy data. The bottom-up approach allows detailed consideration of end use intensity, equipment efficiency, etc., thus facilitating assessment of potential impacts of specific policy and technology changes on building energy use. The results suggest that: (1) commercial energy consumption in China's current statistics is underestimated by about 44%, and the fuel mix is misleading; (2) energy efficiency improvements will not be sufficient to offset the strong increase in end-use penetration and intensity in commercial buildings; (3) energy intensity (particularly electricity) in commercial buildings will increase; (4) different GDP growth and elasticity scenarios could lead to a wide range of floor area growth trajectories, and therefore, significantly impact energy consumption in commercial buildings.
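
    The bottom-up approach described above reduces, in essence, to floor area multiplied by end-use intensity, with floor area growth tied to GDP growth through an elasticity. A minimal sketch of that accounting is shown below; every number in it (growth rate, elasticity, floor area, intensities) is an illustrative placeholder, not a value from the study or from the China End-use Energy Model.

```python
# Minimal bottom-up sketch: commercial building energy = floor area x end-use
# intensity, with floor area growth tied to GDP growth via an elasticity.
# All numbers are illustrative placeholders, not values from the study.

gdp_growth = 0.07        # assumed annual GDP growth rate
elasticity = 0.6         # assumed floor-area-to-GDP elasticity
floor_area_m2 = 5.0e9    # assumed commercial floor space, m2
intensities_kwh_per_m2 = {  # assumed annual end-use intensities, kWh/m2
    "lighting": 25.0,
    "cooling": 20.0,
    "equipment": 15.0,
}

years = 20
for _ in range(years):
    floor_area_m2 *= 1.0 + elasticity * gdp_growth

total_twh = floor_area_m2 * sum(intensities_kwh_per_m2.values()) / 1e9
print(f"Floor area after {years} years: {floor_area_m2 / 1e9:.1f} billion m2")
print(f"Implied commercial electricity use: {total_twh:.0f} TWh/yr")
```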

  15. The HayWired Scenario - How Can the San Francisco Bay Region Bounce Back Better?

    Science.gov (United States)

    Hudnut, K. W.; Wein, A. M.; Cox, D. A.; Perry, S. C.; Porter, K.; Johnson, L. A.; Strauss, J. A.

    2017-12-01

    The HayWired scenario is a hypothetical yet scientifically realistic and quantitative depiction of a moment magnitude (Mw) 7.0 earthquake occurring on April 18, 2018, at 4:18 p.m. on the Hayward Fault in the east bay part of the San Francisco Bay area, California. The hypothetical earthquake has its epicenter in Oakland, and strong ground shaking from the scenario causes a wide range of severe impacts throughout the greater bay region. In the scenario, the Hayward Fault is ruptured along its length for 83 kilometers (about 52 miles). Building on a decades-long series of efforts to reduce earthquake risk in the SF Bay area, the hypothetical HayWired earthquake is used to examine the well-known earthquake hazard of the Hayward Fault, with a focus on newly emerging vulnerabilities. After a major earthquake disaster, reestablishing water services and food-supply chains are, of course, top priorities. However, problems associated with telecommunication outages or "network congestion" will increase and become more urgent as the bay region deepens its reliance on the "Internet of Things." Communications at all levels are crucial during incident response following an earthquake. Damage to critical facilities (such as power plants) from earthquake shaking and to electrical and telecommunications wires and fiber-optic cables that are severed where they cross a fault rupture can trigger cascading Internet and telecommunications outages, and restoring these services is crucially important for emergency-response coordination. Without good communications, emergency-response efficiency is reduced, and as a result, life-saving response functions can be compromised. For these reasons, the name HayWired was chosen for this scenario to emphasize the need to examine our interconnectedness and reliance on telecommunications and other lifelines (such as water and electricity). Earthquake risk in the SF Bay area has been greatly reduced as a result of previous concerted efforts; for

  16. National greenhouse gas emissions baseline scenarios. Learning from experiences in developing countries

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-04-15

    This report reviews national approaches to preparing baseline scenarios of greenhouse-gas (GHG) emissions. It does so by describing and comparing in non-technical language existing practices and choices made by ten developing countries - Brazil, China, Ethiopia, India, Indonesia, Kenya, Mexico, South Africa, Thailand and Vietnam. The review focuses on a number of key elements, including model choices, transparency considerations, choices about underlying assumptions and challenges associated with data management. The aim is to improve overall understanding of baseline scenarios and facilitate their use for policy-making in developing countries more broadly. The findings are based on the results of a collaborative project involving a number of activities undertaken by the Danish Energy Agency, the Organisation for Economic Co-operation and Development (OECD) and the UNEP Risoe Centre (URC), including a series of workshops on the subject. The ten contributing countries account for approximately 40% of current global GHG emissions - a share that is expected to increase in the future. The breakdown of emissions by sector varies widely among these countries. In some countries, the energy sector is the leading source of emissions; for others, the land-use sector and/or agricultural sector dominate emissions. The report underscores some common technical and financial capacity gaps faced by developing countries when preparing baseline scenarios. It does not endeavour to propose guidelines for preparing baseline scenarios. Rather, it is hoped that the report will inform any future attempts at preparing such kind of guidelines. (Author)

  17. Beyond the realist turn: a socio-material analysis of heart failure self-care.

    Science.gov (United States)

    McDougall, Allan; Kinsella, Elizabeth Anne; Goldszmidt, Mark; Harkness, Karen; Strachan, Patricia; Lingard, Lorelei

    2018-01-01

    For patients living with chronic illnesses, self-care has been linked with positive outcomes such as decreased hospitalisation, longer lifespan, and improved quality of life. However, despite calls for more and better self-care interventions, behaviour change trials have repeatedly fallen short on demonstrating effectiveness. The literature on heart failure (HF) stands as a case in point, and a growing body of HF studies advocate realist approaches to self-care research and policymaking. We label this trend the 'realist turn' in HF self-care. Realist evaluation and realist interventions emphasise that the relationship between self-care interventions and positive health outcomes is not fixed, but contingent on social context. This paper argues socio-materiality offers a productive framework to expand on the idea of social context in realist accounts of HF self-care. This study draws on 10 interviews as well as researcher reflections from a larger study exploring health care teams for patients with advanced HF. Leveraging insights from actor-network theory (ANT), this study provides two rich narratives about the contextual factors that influence HF self-care. These descriptions portray not self-care contexts but self-care assemblages, which we discuss in light of socio-materiality. © 2018 Foundation for the Sociology of Health & Illness.

  18. Orbits in a Stäckel approximation.

    Science.gov (United States)

    de Bruyne, V.; Leeuwin, F.; Dejonghe, H.

    One family of potentials frequently used for dynamical models of galaxies is the Stäckel potentials (e.g. de Zeeuw et al., 1986, MNRAS 221, 1001; Dejonghe et al., 1996, A&A 306, 363), because of their unique analytical tractability. An axisymmetric model is determined by a potential psi(R,z) and a distribution function F(E,Lz,I3), generally depending on 3 integrals of motion. It has indeed been known for a long time that the integration of orbits in many realistic potentials reveals the presence of an effective third integral I3. Unfortunately, no general analytic expression exists for such an integral, and this has put a limitation on designing anisotropic axisymmetric models. However, I3 can be computed analytically if a Stäckel potential is used. This could be used as a local analytic approximation for the third integral in general potentials. One long-standing concern, though, is that Stäckel potentials (SPs) form only a very small subspace in the family of all potentials, and may not capture the essentials of the dynamics for a general potential. One way to address this issue is to compare orbits in both potentials. We therefore fit a Stäckel potential (as in Mathieu et al., A&A 314, 25), for different radial ranges, to a spheroidal (MN-)potential (Miyamoto-Nagai, 1975, PASJ 27, 533), and compare orbits for an appropriate sampling of integral space. As a result, we find that the orbit shapes are very similar (as exhibited on surfaces of section, or by the orbital densities). The Stäckel I3 is found to be a good approximation to a third integral of the MN-potential. The variation of I3 along an orbit in the MN-potential is of the same order as the difference between the two potentials.
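
    For reference, the analytical tractability mentioned above comes from the separable form of an axisymmetric Stäckel potential in prolate spheroidal coordinates (the standard de Zeeuw form, stated here from the general literature rather than from this abstract), which is what makes a third isolating integral available in closed form:

```latex
% Axisymmetric Staeckel potential in prolate spheroidal coordinates (lambda, nu);
% f is an arbitrary smooth function. Standard reference form, not a result of
% the paper summarised above.
\begin{align}
  \psi(\lambda,\nu) &= -\frac{f(\lambda) - f(\nu)}{\lambda - \nu}, \\
  E &= \tfrac{1}{2}v^{2} + \psi, \qquad
  L_{z} = R\, v_{\phi}, \qquad
  I_{3} = I_{3}(\lambda,\nu,\dot{\lambda},\dot{\nu};\,E,L_{z}),
\end{align}
```

    with the third integral I3 expressible algebraically once f is fixed.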

  19. Realistic modeling of seismic input for megacities and large urban areas

    Science.gov (United States)

    Panza, G. F.; Unesco/Iugs/Igcp Project 414 Team

    2003-04-01

    The project addressed the problem of pre-disaster orientation: hazard prediction, risk assessment, and hazard mapping, in connection with seismic activity and man-induced vibrations. The definition of realistic seismic input has been obtained from the computation of a wide set of time histories and spectral information, corresponding to possible seismotectonic scenarios for different source and structural models. The innovative modeling technique, which constitutes the common tool of the entire project, takes into account source, propagation and local site effects. This is done using first principles of physics about wave generation and propagation in complex media, and does not require resorting to convolutive approaches, which have proven to be quite unreliable, mainly when dealing with complex geological structures, the most interesting from the practical point of view. In fact, several techniques that have been proposed to empirically estimate the site effects using observations convolved with theoretically computed signals corresponding to simplified models supply reliable information about the site response to non-interfering seismic phases. They are not adequate in most of the real cases, when the seismic signal is formed by several interfering waves. The availability of realistic numerical simulations enables us to reliably estimate the amplification effects even in complex geological structures, exploiting the available geotechnical, lithological, geophysical parameters, topography of the medium, tectonic, historical, palaeoseismological data, and seismotectonic models. The realistic modeling of the ground motion is a very important base of knowledge for the preparation of ground-shaking scenarios that represent a valid and economic tool for seismic microzonation. This knowledge can be very fruitfully used by civil engineers in the design of new seismo-resistant constructions and in the reinforcement of the existing built environment, and, therefore

  20. The Traffic Adaptive Data Dissemination (TrAD) Protocol for both Urban and Highway Scenarios

    Directory of Open Access Journals (Sweden)

    Bin Tian

    2016-06-01

    Full Text Available The worldwide economic cost of road crashes and injuries is estimated to be US$518 billion per year and the annual congestion cost in France is estimated to be €5.9 billion. Vehicular Ad hoc Networks (VANETs) are one solution to improve transport features such as traffic safety, traffic jam and infotainment on wheels, where a great number of event-driven messages need to be disseminated in a timely way in a region of interest. In comparison with traditional wireless networks, VANETs have to consider the highly dynamic network topology and lossy links due to node mobility. Inter-Vehicle Communication (IVC) protocols are the keystone of VANETs. According to our survey, most of the proposed IVC protocols focus on either highway or urban scenarios, but not on both. Furthermore, too few protocols, considering both scenarios, can achieve high performance. In this paper, an infrastructure-less Traffic Adaptive data Dissemination (TrAD) protocol which takes into account road traffic and network traffic status for both highway and urban scenarios will be presented. TrAD has double broadcast suppression techniques and is designed to adapt efficiently to the irregular road topology. The performance of the TrAD protocol was evaluated quantitatively by means of realistic simulations taking into account different real road maps, traffic routes and vehicular densities. The obtained simulation results show that TrAD is more efficient in terms of packet delivery ratio, number of transmissions and delay in comparison with the performance of three well-known reference protocols. Moreover, TrAD can also tolerate a reasonable degree of GPS drift and still achieve efficient data dissemination.

  1. The Traffic Adaptive Data Dissemination (TrAD) Protocol for both Urban and Highway Scenarios.

    Science.gov (United States)

    Tian, Bin; Hou, Kun Mean; Zhou, Haiying

    2016-06-21

    The worldwide economic cost of road crashes and injuries is estimated to be US$518 billion per year and the annual congestion cost in France is estimated to be €5.9 billion. Vehicular Ad hoc Networks (VANETs) are one solution to improve transport features such as traffic safety, traffic jam and infotainment on wheels, where a great number of event-driven messages need to be disseminated in a timely way in a region of interest. In comparison with traditional wireless networks, VANETs have to consider the highly dynamic network topology and lossy links due to node mobility. Inter-Vehicle Communication (IVC) protocols are the keystone of VANETs. According to our survey, most of the proposed IVC protocols focus on either highway or urban scenarios, but not on both. Furthermore, too few protocols, considering both scenarios, can achieve high performance. In this paper, an infrastructure-less Traffic Adaptive data Dissemination (TrAD) protocol which takes into account road traffic and network traffic status for both highway and urban scenarios will be presented. TrAD has double broadcast suppression techniques and is designed to adapt efficiently to the irregular road topology. The performance of the TrAD protocol was evaluated quantitatively by means of realistic simulations taking into account different real road maps, traffic routes and vehicular densities. The obtained simulation results show that TrAD is more efficient in terms of packet delivery ratio, number of transmissions and delay in comparison with the performance of three well-known reference protocols. Moreover, TrAD can also tolerate a reasonable degree of GPS drift and still achieve efficient data dissemination.

  2. PERFORMANCE COMPARISON OF SCENARIO-GENERATION METHODS APPLIED TO A STOCHASTIC OPTIMIZATION ASSET-LIABILITY MANAGEMENT MODEL

    Directory of Open Access Journals (Sweden)

    Alan Delgado de Oliveira

    Full Text Available ABSTRACT In this paper, we provide an empirical discussion of the differences among some scenario tree-generation approaches for stochastic programming. We consider the classical Monte Carlo sampling and Moment matching methods. Moreover, we test the Resampled average approximation, which is an adaptation of Monte Carlo sampling, and Monte Carlo with a naive allocation strategy as the benchmark. We test the empirical effects of each approach on the stability of the problem objective function and initial portfolio allocation, using a multistage stochastic chance-constrained asset-liability management (ALM) model as the application. The Moment matching and Resampled average approximation are more stable than the other two strategies.
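
    The stability test described above can be illustrated with a toy version of the Monte Carlo idea: draw several independent scenario sets, evaluate the same objective on each, and look at the spread of the resulting values. The sketch below does this for a trivial one-period expected-return objective with a fixed allocation; it is not the paper's multistage chance-constrained ALM model, and all asset statistics are assumed.

```python
import numpy as np

# Toy in-sample stability check: generate several independent Monte Carlo
# scenario sets, evaluate the same objective (expected portfolio return for a
# fixed allocation) on each, and measure the spread. Asset statistics and the
# allocation are assumptions for illustration only.
rng = np.random.default_rng(0)

mu = np.array([0.03, 0.06])            # assumed mean returns (bond, equity)
cov = np.array([[0.02**2, 0.0],
                [0.0, 0.15**2]])       # assumed return covariance
weights = np.array([0.6, 0.4])         # fixed allocation under test

def objective(scenario_returns: np.ndarray) -> float:
    """Scenario-average portfolio return (the quantity whose stability we test)."""
    return float(np.mean(scenario_returns @ weights))

n_sets, n_scenarios = 30, 200
values = [objective(rng.multivariate_normal(mu, cov, size=n_scenarios))
          for _ in range(n_sets)]

print(f"objective mean  : {np.mean(values):.4f}")
print(f"objective spread: {np.std(values):.4f}  (smaller = more stable)")
```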

  3. 2015 Standard Scenarios Annual Report: U.S. Electric Sector Scenario Exploration

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, Patrick [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Cole, Wesley [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Blair, Nate [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Lantz, Eric [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krishnan, Venkat [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mai, Trieu [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mulcahy, David [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Porro, Gian [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2015-07-16

    This report is one of several products resulting from an initial effort to provide a consistent set of technology cost and performance data and to define a conceptual and consistent scenario framework that can be used in the National Renewable Energy Laboratory’s (NREL’s) future analyses. The long-term objective of this effort is to identify a range of possible futures of the U.S. electricity sector in which to consider specific energy system issues through (1) defining a set of prospective scenarios that bound ranges of key technology, market, and policy assumptions and (2) assessing these scenarios in NREL’s market models to understand the range of resulting outcomes, including energy technology deployment and production, energy prices, and carbon dioxide (CO2) emissions.

  4. Confinement in Maxwell-Chern-Simons planar quantum electrodynamics and the 1/N approximation

    International Nuclear Information System (INIS)

    Hofmann, Christoph P.; Raya, Alfredo; Madrigal, Saul Sanchez

    2010-01-01

    We study the analytical structure of the fermion propagator in planar quantum electrodynamics coupled to a Chern-Simons term within a four-component spinor formalism. The dynamical generation of parity-preserving and parity-violating fermion mass terms is considered, through the solution of the corresponding Schwinger-Dyson equation for the fermion propagator at leading order of the 1/N approximation in Landau gauge. The theory undergoes a first-order phase transition toward chiral symmetry restoration when the Chern-Simons coefficient θ reaches a critical value which depends upon the number of fermion families considered. Parity-violating masses, however, are generated for arbitrarily large values of the said coefficient. Regarding the confinement scenario, complete charge screening - characteristic of the 1/N approximation - is observed in the entire (N,θ)-plane through the local and global properties of the vector part of the fermion propagator.

  5. Scenario Development for Water Resources Planning and Management

    Science.gov (United States)

    Stewart, S.; Mahmoud, M.; Liu, Y.; Hartman, H.; Wagener, T.; Gupta, H.

    2006-12-01

    The main objective of scenario development for water resources is to inform policy-makers about the implications of various policies to inform decision-making. Although there have been a number of studies conducted in the relatively-new and recent field of scenario analysis and development, very few of those have been explicitly applied to water resource issues. More evident is the absence of an established formal approach to develop and apply scenarios. Scenario development is a process that evaluates possible future states of the world by examining several feasible scenarios. A scenario is a projection of various physical and socioeconomic conditions that describe change from the current state to a future state. In this paper, a general framework for scenario development with special emphasis on applications to water resources is considered. The process comprises several progressive and reiterative phases: scenario definition, scenario construction, scenario analysis, scenario assessment, and risk management. Several characteristics of scenarios that are important in describing scenarios are also taken into account; these include scenario types, scenario themes, scenario likelihoods and scenario categories. A hindrance to the adoption of a unified framework for scenario development is inconsistency in the terminology used by scenario developers. To address this problem, we propose a consistent terminology of basic and frequent terms. Outreach for this formal approach is partially maintained through an interactive community website that seeks to educate potential scenario developers about the scenario development process, share and exchange information and resources on scenarios to foster a multidisciplinary community of scenario developers, and establish a unified framework for scenario development with regards to terminology and guidelines. The website provides information on scenario development, current scenario-related activities, key water resources scenario

  6. Downscaling climate change scenarios for apple pest and disease modeling in Switzerland

    Science.gov (United States)

    Hirschi, M.; Stoeckli, S.; Dubrovsky, M.; Spirig, C.; Calanca, P.; Rotach, M. W.; Fischer, A. M.; Duffy, B.; Samietz, J.

    2012-02-01

    As a consequence of current and projected climate change in temperate regions of Europe, agricultural pests and diseases are expected to occur more frequently and possibly to extend to previously non-affected regions. Given their economic and ecological relevance, detailed forecasting tools for various pests and diseases have been developed, which model their phenology, depending on actual weather conditions, and suggest management decisions on that basis. Assessing the future risk of pest-related damages requires future weather data at high temporal and spatial resolution. Here, we use a combined stochastic weather generator and re-sampling procedure for producing site-specific hourly weather series representing present and future (1980-2009 and 2045-2074 time periods) climate conditions in Switzerland. The climate change scenarios originate from the ENSEMBLES multi-model projections and provide probabilistic information on future regional changes in temperature and precipitation. Hourly weather series are produced by first generating daily weather data for these climate scenarios and then using a nearest neighbor re-sampling approach for creating realistic diurnal cycles. These hourly weather series are then used for modeling the impact of climate change on important life phases of the codling moth and on the number of predicted infection days of fire blight. Codling moth (Cydia pomonella) and fire blight (Erwinia amylovora) are two major pest and disease threats to apple, one of the most important commercial and rural crops across Europe. Results for the codling moth indicate a shift in the occurrence and duration of life phases relevant for pest control. In southern Switzerland, a 3rd generation per season occurs only very rarely under today's climate conditions but is projected to become normal in the 2045-2074 time period. While the potential risk for a 3rd generation is also significantly increasing in northern Switzerland (for most stations from roughly 1
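
    The two-step downscaling described above - generate daily weather with a stochastic generator, then attach realistic diurnal cycles by nearest-neighbour re-sampling of observed days - can be sketched in a few lines. The synthetic archive and the similarity criterion (closest daily mean) below are stand-ins; the study's actual weather-generator output and matching variables are not reproduced here.

```python
import numpy as np

# Sketch of nearest-neighbour re-sampling: give a generated daily mean
# temperature a realistic diurnal cycle by borrowing the hourly shape of the
# observed day whose daily mean is closest. The "observed" archive is synthetic.
rng = np.random.default_rng(1)

# Synthetic observed archive: 365 days x 24 hourly temperatures (deg C).
diurnal_shape = 5.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, 24))
observed_hourly = 10.0 + 8.0 * rng.standard_normal((365, 1)) + diurnal_shape[None, :]
observed_daily_mean = observed_hourly.mean(axis=1)

def to_hourly(generated_daily_mean: float) -> np.ndarray:
    """Pick the observed day with the closest daily mean and re-centre its cycle."""
    idx = int(np.argmin(np.abs(observed_daily_mean - generated_daily_mean)))
    anomaly = observed_hourly[idx] - observed_daily_mean[idx]
    return generated_daily_mean + anomaly  # preserves the generated daily mean

print(to_hourly(17.5).round(1))  # 24 hourly values for one generated day
```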

  7. Exposure scenarios for workers

    NARCIS (Netherlands)

    Marquart, H.; Northage, C.; Money, C.

    2007-01-01

    The new European chemicals legislation REACH (Registration, Evaluation, Authorisation and restriction of Chemicals) requires the development of Exposure Scenarios describing the conditions and risk management measures needed for the safe use of chemicals. Such Exposure Scenarios should integrate

  8. Regional climate change scenarios

    International Nuclear Information System (INIS)

    Somot, S.

    2005-01-01

    Because studies of the regional impact of climate change need higher spatial resolution than that obtained in standard global climate change scenarios, developing regional scenarios from models is a crucial goal for the climate modelling community. The zoom capacity of ARPEGE-Climat, the Meteo-France climate model, allows the use of scenarios with a horizontal resolution of about 50 km over France and the Mediterranean basin. An IPCC-A2 scenario for the end of the 21st century in France shows higher temperatures in each season and more winter and less summer precipitation than now. Tuning the modelled statistical distributions to observed temperature and precipitation allows us to study changes in the frequency of extreme events between today's climate and that at the end of the century. The frequency of very hot days in summer will increase. In particular, the frequency of days with a maximum temperature above 35 deg C will be multiplied by a factor of 10, on average. In our scenario, the Toulouse area and Provence might see one quarter of their summer days with a maximum temperature above 35 deg C. (author)

  9. Role-playing for more realistic technical skills training.

    Science.gov (United States)

    Nikendei, C; Zeuch, A; Dieckmann, P; Roth, C; Schäfer, S; Völkl, M; Schellberg, D; Herzog, W; Jünger, J

    2005-03-01

    Clinical skills are an important and necessary part of clinical competence. Simulation plays an important role in many fields of medical education. Although role-playing is common in communication training, there are no reports about the use of student role-plays in the training of technical clinical skills. This article describes an educational intervention with analysis of pre- and post-intervention self-selected student survey evaluations. After one term of skills training, a thorough evaluation showed that the skills-lab training did not seem very realistic nor was it very demanding for trainees. To create a more realistic training situation and to enhance students' involvement, case studies and role-plays with defined roles for students (i.e. intern, senior consultant) were introduced into half of the sessions. Results of the evaluation in the second term showed that sessions with role-playing were rated significantly higher than sessions without role-playing.

  10. Nuclear power prospects and potential: scenarios

    International Nuclear Information System (INIS)

    Rogner, Hans-Hogler; McDonald, Alan; )

    2002-01-01

    This paper outlines a range of scenarios describing what the world's energy system might look like in the middle of the century, and what nuclear energy's most profitable role might be. The starting point is the 40 non-greenhouse-gas-mitigation scenarios in the Special Report on Emission Scenarios (SRES) of the Intergovernmental Panel on Climate Change (IPCC). Given their international authorship and comprehensive review by governments and scientific experts, the SRES scenarios are the state of the art in long-term energy scenarios

  11. Battery state-of-charge estimation using approximate least squares

    Science.gov (United States)

    Unterrieder, C.; Zhang, C.; Lunglmayr, M.; Priewasser, R.; Marsili, S.; Huemer, M.

    2015-03-01

    In recent years, much effort has been spent to extend the runtime of battery-powered electronic applications. In order to improve the utilization of the available cell capacity, high precision estimation approaches for battery-specific parameters are needed. In this work, an approximate least squares estimation scheme is proposed for the estimation of the battery state-of-charge (SoC). The SoC is determined based on the prediction of the battery's electromotive force. The proposed approach allows for an improved re-initialization of the Coulomb counting (CC) based SoC estimation method. Experimental results for an implementation of the estimation scheme on a fuel gauge system on chip are illustrated. Implementation details and design guidelines are presented. The performance of the presented concept is evaluated for realistic operating conditions (temperature effects, aging, standby current, etc.). For the considered test case of a GSM/UMTS load current pattern of a mobile phone, the proposed method is able to re-initialize the CC-method with a high accuracy, while state-of-the-art methods fail to perform a re-initialization.
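
    The re-initialisation idea summarised above - run Coulomb counting, and when the cell is relaxed reset the counter from a predicted electromotive force mapped through an EMF-to-SoC curve - can be sketched as follows. The linear EMF curve, the exponential relaxation model and the plain least-squares fit are simplifying assumptions for illustration; they stand in for, and do not reproduce, the paper's approximate least squares estimator.

```python
import numpy as np

# Sketch of Coulomb counting (CC) with EMF-based re-initialisation.
# EMF->SoC map, relaxation model and fit are illustrative assumptions.

CAPACITY_AS = 3.0 * 3600.0  # assumed cell capacity in ampere-seconds (3 Ah)

def coulomb_count(soc: float, current_a: float, dt_s: float) -> float:
    """Standard CC update (discharge current taken as positive)."""
    return soc - current_a * dt_s / CAPACITY_AS

def soc_from_emf(emf_v: float) -> float:
    """Assumed linear EMF->SoC map between 3.0 V (empty) and 4.2 V (full)."""
    return float(np.clip((emf_v - 3.0) / (4.2 - 3.0), 0.0, 1.0))

def reinitialise(t_s: np.ndarray, v_terminal: np.ndarray) -> float:
    """Least-squares extrapolation of the relaxed terminal voltage to the EMF."""
    A = np.column_stack([np.ones_like(t_s), np.exp(-t_s / 30.0)])  # assumed model
    coeff, *_ = np.linalg.lstsq(A, v_terminal, rcond=None)
    return soc_from_emf(coeff[0])  # coeff[0] is the asymptotic (relaxed) voltage

# Usage: CC drifts under load, then a rest period allows re-initialisation.
soc = 0.80
for _ in range(600):                      # 10 min of 1 A discharge, 1 s steps
    soc = coulomb_count(soc, 1.0, 1.0)
t_rest = np.arange(0.0, 120.0, 1.0)
v_rest = 3.9 - 0.05 * np.exp(-t_rest / 30.0)  # synthetic relaxation curve
soc = reinitialise(t_rest, v_rest)
print(f"re-initialised SoC: {soc:.2f}")
```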

  12. Scenario-based roadmapping assessing nuclear technology development paths for future nuclear energy system scenarios

    International Nuclear Information System (INIS)

    Van Den Durpel, Luc; Roelofs, Ferry; Yacout, Abdellatif

    2009-01-01

    Nuclear energy may play a significant role in a future sustainable energy mix. The transition from today's nuclear energy system towards a future, more sustainable nuclear energy system will be dictated by technology availability, energy market competitiveness and the capability to achieve sustainability through the nuclear fuel cycle. Various scenarios have been investigated worldwide, each with a diverse set of assumptions on the timing and characteristics of new nuclear energy systems. Scenario-based roadmapping combines the dynamic scenario analysis of nuclear energy systems' futures with the technology roadmap information published and analysed in various technology assessment reports and integrated within the nuclear technology roadmap Nuclear-Roadmap.net. The advantage of this combination is that it allows mutual improvement of scenario analysis and nuclear technology roadmapping, providing a higher degree of confidence in the assessment of nuclear energy system futures. This paper provides a description of scenario-based roadmapping based on DANESS and Nuclear-Roadmap.net. (author)

  13. Elaborating SRES scenarios for nuclear energy

    International Nuclear Information System (INIS)

    McDonald, Alan; Riahi, Keywan; Rogner, Hans-Holger

    2003-01-01

    The objective of this paper is identifying mid-century economic targets for nuclear energy. The first step is to describe what the mid-century energy market might look like: the major competitors for nuclear energy, what products are in demand, how much of each, where is growth greatest, and so forth. The mechanism for systematically describing the future market is scenario building. The starting point is the scenarios in the Special Report on Emissions Scenarios (SRES) of the Intergovernmental Panel on Climate Change. SRES developed four narrative story lines, each representing a different coherent set of demographic, social, economic, technological, and environmental developments. For each story line several different scenarios were developed by six international modelling teams, resulting in 40 scenarios grouped in the 4 story lines. For three of the story lines this paper uses a single marker scenario representative of central tendencies within the scenario family. For the fourth story line the authors chose the scenario that assumes that advances in non-fossil technologies - renewable, nuclear, and high-efficiency conservation technologies - make them most cost-competitive. (BA)

  14. Energy scenarios for New Zealand

    Energy Technology Data Exchange (ETDEWEB)

    Harris, G. S.; Ellis, M. J.; Scott, G. C.; Wood, J. R.

    1977-10-15

    Three energy scenarios have been formulated for New Zealand. They concentrate on those aspects of society which have a direct bearing on energy, emphasizing three important issues: major shifts in society's values in relation to material wealth, pollution, and resources. The scenarios make assumptions that certain overall social conditions would prevail so that all decisions of government, the private sector, and individuals would be governed by the requirement to conform to the scenario theme in a way not possible under existing social and political conditions. The 3 scenarios are known as Continuation, Low New Zealand Pollution, and Limited Growth.

  15. Entrepreneurial Education: A Realistic Alternative for Women and Minorities.

    Science.gov (United States)

    Steward, James F.; Boyd, Daniel R.

    1989-01-01

    Entrepreneurial education is a valid, realistic occupational training alternative for minorities and women in business. Entrepreneurship requires that one become involved with those educational programs that contribute significantly to one's success. (Author)

  16. Energy scenarios: a prospective outlook

    International Nuclear Information System (INIS)

    Salomon, Thierry; Claustre, Raphael; Charru, Madeleine; Sukov, Stephane; Marignac, Yves; Fink, Meike; Bibas, Ruben; Le Saux, Gildas

    2011-01-01

    A set of articles discusses the use of energy scenarios: how useful they can be to describe a possible future and even to gather the involved actors, how they have been used in France in the past (for planning or prediction purposes, with sometimes some over-assessed or contradictory results, without considering any decline of nuclear energy, or by setting an impossible equation in the case of the Grenelle de l'Environnement), how the scenario framework impacts its content (depending on the approach type: standard, optimization, bottom-up, top-down, or hybrid). It also discusses the issue of choice of hypotheses on growth-based and de-growth-based scenarios, outlines how energy saving is a key for a sustainable evolution. Two German scenarios regarding electricity production (centralisation or decentralisation) and French regional scenarios for Nord-Pas-de-Calais are then briefly discussed

  17. Negatep: A Scenario for Combating Global Warming; Le scenario Negatep. Un scenario de lutte contre le rechauffement climatique

    Energy Technology Data Exchange (ETDEWEB)

    Acket, C.; Bacher, P. [Sauvons Le Climat, 92 - Boulogne Billancourt (France)

    2011-07-15

    There have been an increasing number of foresight exercises in the field of energy and global warming in recent years, as we have seen from the articles devoted to these questions by Futuribles in 2011 (both in this special issue and in the April number). It is certainly the case that the goals for greenhouse-gas emission reduction are rather ambitious, particularly in France, it being the aim of the 2005 French framework law on energy to reduce carbon gas discharges by a factor of four. Among these scenarios, the Negatep scenario developed by Claude Acket and Pierre Bacher from the 'Sauvons le climat' [Let's save the climate] Association proposes to achieve this ('factor 4') goal in France by 2050 by reducing fossil fuel use by 75% and replacing this as quickly as possible with electricity produced from non-carbon-gas-emitting sources - chiefly, nuclear power and renewables. The authors lay out their goals here, backed up by figures, comparing these with the reference scenario. They also show the path that must be followed to arrive at these goals, particularly in the residential and tertiary sectors, and in transport and industry (through control of needs and recourse to alternative energy sources). They close by comparing the Negatep scenario with two other more recent scenarios aimed also at reducing greenhouse gas emissions, on the one hand in Europe, and on the other in Germany. The comparison confirms that they were right to rely on electricity as a substitute for oil, but gives them cause for concern in respect of the consequences (formidable in their view) that the replacement of nuclear power and coal energy by intermittent renewable energies might have in Europe, both with regard to costs and to the effects on the power network. (authors)

  18. Protocol - realist and meta-narrative evidence synthesis: Evolving Standards (RAMESES)

    Directory of Open Access Journals (Sweden)

    Westhorp Gill

    2011-08-01

    Full Text Available Abstract Background There is growing interest in theory-driven, qualitative and mixed-method approaches to systematic review as an alternative to (or to extend and supplement) conventional Cochrane-style reviews. These approaches offer the potential to expand the knowledge base in policy-relevant areas - for example by explaining the success, failure or mixed fortunes of complex interventions. However, the quality of such reviews can be difficult to assess. This study aims to produce methodological guidance, publication standards and training resources for those seeking to use the realist and/or meta-narrative approach to systematic review. Methods/design We will: [a] collate and summarise existing literature on the principles of good practice in realist and meta-narrative systematic review; [b] consider the extent to which these principles have been followed by published and in-progress reviews, thereby identifying how rigour may be lost and how existing methods could be improved; [c] using an online Delphi method with an interdisciplinary panel of experts from academia and policy, produce a draft set of methodological steps and publication standards; [d] produce training materials with learning outcomes linked to these steps; [e] pilot these standards and training materials prospectively on real reviews-in-progress, capturing methodological and other challenges as they arise; [f] synthesise expert input, evidence review and real-time problem analysis into more definitive guidance and standards; [g] disseminate outputs to audiences in academia and policy. The outputs of the study will be threefold: 1. Quality standards and methodological guidance for realist and meta-narrative reviews for use by researchers, research sponsors, students and supervisors 2. A 'RAMESES' (Realist and Meta-review Evidence Synthesis: Evolving Standards) statement (comparable to CONSORT or PRISMA) of publication standards for such reviews, published in an open

  19. Engaging Personas and Narrative Scenarios

    DEFF Research Database (Denmark)

    Nielsen, Lene

    2004-01-01

    design ideas. The concept of engaging personas and narrative scenarios explores personas in the light of what it is to identify with and have empathy with a character. The concept of narrative scenarios views the narrative as an aid for the exploration of design ideas. Both concepts incorporate...... a distinction between creating, writing and reading. Keywords: personas, scenarios, user-centered design, HCI...

  20. Mars base buildup scenarios

    International Nuclear Information System (INIS)

    Blacic, J.D.

    1985-01-01

    Two surface base build-up scenarios are presented in order to help visualize the mission and to serve as a basis for trade studies. In the first scenario, direct manned landings on the Martian surface occur early in the missions and scientific investigation is the main driver and rationale. In the second scenario, early development of an infrastructure to exploit the volatile resources of the Martian moons for economic purposes is emphasized. Scientific exploration of the surface is delayed at first, but once begun develops rapidly, aided by the presence of a permanently manned orbital station

  1. Self-consistent embedded-cluster calculations of the electronic structure of alkaline earth fluorides in the Hartree-Fock-Slater approximation

    International Nuclear Information System (INIS)

    Amaral, N.C.; Maffeo, B.; Guenzburger, D.J.R.

    1982-01-01

    Molecular orbital calculations were performed for clusters representing the CaF2, SrF2 and BaF2 ionic crystals. The discrete variational method was employed, with the Xα approximation for the exchange interaction; a detailed investigation of different models for embedding the clusters in the solids led to a realistic description of the effect of neighbouring ions in the infinite crystal. The results obtained were used to interpret optical and photoelectron data reported in the literature. In the case of CaF2, comparisons were made with existing band structure calculations. (Author) [pt

  2. Evaluation of Gaussian approximations for data assimilation in reservoir models

    KAUST Repository

    Iglesias, Marco A.

    2013-07-14

    The Bayesian framework is the standard approach for data assimilation in reservoir modeling. This framework involves characterizing the posterior distribution of geological parameters in terms of a given prior distribution and data from the reservoir dynamics, together with a forward model connecting the space of geological parameters to the data space. Since the posterior distribution quantifies the uncertainty in the geologic parameters of the reservoir, the characterization of the posterior is fundamental for the optimal management of reservoirs. Unfortunately, due to the large-scale highly nonlinear properties of standard reservoir models, characterizing the posterior is computationally prohibitive. Instead, more affordable ad hoc techniques, based on Gaussian approximations, are often used for characterizing the posterior distribution. Evaluating the performance of those Gaussian approximations is typically conducted by assessing their ability at reproducing the truth within the confidence interval provided by the ad hoc technique under consideration. This has the disadvantage of mixing up the approximation properties of the history matching algorithm employed with the information content of the particular observations used, making it hard to evaluate the effect of the ad hoc approximations alone. In this paper, we avoid this disadvantage by comparing the ad hoc techniques with a fully resolved state-of-the-art probing of the Bayesian posterior distribution. The ad hoc techniques whose performance we assess are based on (1) linearization around the maximum a posteriori estimate, (2) randomized maximum likelihood, and (3) ensemble Kalman filter-type methods. In order to fully resolve the posterior distribution, we implement a state-of-the art Markov chain Monte Carlo (MCMC) method that scales well with respect to the dimension of the parameter space, enabling us to study realistic forward models, in two space dimensions, at a high level of grid refinement. Our

  3. Self-similar factor approximants

    International Nuclear Information System (INIS)

    Gluzman, S.; Yukalov, V.I.; Sornette, D.

    2003-01-01

    The problem of reconstructing functions from their asymptotic expansions in powers of a small variable is addressed by deriving an improved type of approximants. The derivation is based on the self-similar approximation theory, which presents the passage from one approximant to another as the motion realized by a dynamical system with the property of group self-similarity. The derived approximants, because of their form, are called self-similar factor approximants. These complement the obtained earlier self-similar exponential approximants and self-similar root approximants. The specific feature of self-similar factor approximants is that their control functions, providing convergence of the computational algorithm, are completely defined from the accuracy-through-order conditions. These approximants contain the Pade approximants as a particular case, and in some limit they can be reduced to the self-similar exponential approximants previously introduced by two of us. It is proved that the self-similar factor approximants are able to reproduce exactly a wide class of functions, which include a variety of nonalgebraic functions. For other functions, not pertaining to this exactly reproducible class, the factor approximants provide very accurate approximations, whose accuracy surpasses significantly that of the most accurate Pade approximants. This is illustrated by a number of examples showing the generality and accuracy of the factor approximants even when conventional techniques meet serious difficulties

  4. From scenarios to components

    NARCIS (Netherlands)

    Fahland, D.

    2010-01-01

    Scenario-based modeling has evolved as an accepted paradigm for developing complex systems of various kinds. Its main purpose is to ensure that a system provides desired behavior to its users. A scenario is generally understood as a behavioral requirement, denoting a course of actions that shall

  5. Rapidly re-computable EEG (electroencephalography) forward models for realistic head shapes

    International Nuclear Information System (INIS)

    Ermer, J.J.; Mosher, J.C.; Baillet, S.; Leahy, R.M.

    2001-01-01

    Solution of the EEG source localization (inverse) problem utilizing model-based methods typically requires a significant number of forward model evaluations. For subspace based inverse methods like MUSIC (6), the total number of forward model evaluations can often approach an order of 10^3 or 10^4. Techniques based on least-squares minimization may require significantly more evaluations. The observed set of measurements over an M-sensor array is often expressed as a linear forward spatio-temporal model of the form: F = GQ + N (1) where the observed forward field F (M-sensors x N-time samples) can be expressed in terms of the forward model G, a set of dipole moment(s) Q (3xP-dipoles x N-time samples) and additive noise N. Because of their simplicity, ease of computation, and relatively good accuracy, multi-layer spherical models (7) (or fast approximations described in (1), (7)) have traditionally been the 'forward model of choice' for approximating the human head. However, approximation of the human head via a spherical model does have several key drawbacks. By its very shape, the use of a spherical model distorts the true distribution of passive currents in the skull cavity. Spherical models also require that the sensor positions be projected onto the fitted sphere (Fig. 1), resulting in a distortion of the true sensor-dipole spatial geometry (and ultimately the computed surface potential). The use of a single 'best-fitted' sphere has the added drawback of incomplete coverage of the inner skull region, often ignoring areas such as the frontal cortex. In practice, this problem is typically countered by fitting additional sphere(s) to those region(s) not covered by the primary sphere. The use of these additional spheres results in added complication to the forward model. Using high-resolution spatial information obtained via X-ray CT or MR imaging, a realistic head model can be formed by tessellating the head into a set of contiguous regions (typically the scalp
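    The linear spatio-temporal forward model F = GQ + N above can be written out directly; a minimal sketch with arbitrary placeholder dimensions (not taken from the record) is:

```python
import numpy as np

rng = np.random.default_rng(0)
M, P, N = 32, 2, 200                       # sensors, dipoles, time samples (placeholders)
G = rng.standard_normal((M, 3 * P))        # forward model / lead-field matrix
Q = rng.standard_normal((3 * P, N))        # dipole moment time courses
noise = 0.1 * rng.standard_normal((M, N))  # additive noise

F = G @ Q + noise                          # observed spatio-temporal field (M x N)
print(F.shape)                             # (32, 200)
```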

  6. Modulated Pade approximant

    International Nuclear Information System (INIS)

    Ginsburg, C.A.

    1980-01-01

    In many problems, a desired property A of a function f(x) is determined by the behaviour of f(x) ≈ g(x,A) as x→x*. In this letter, a method for resumming the power series in x of f(x) and approximating A (the modulated Pade approximant) is presented. This new approximant is an extension of a resummation method for f(x) in terms of rational functions. (author)

  7. Possible climate change over Eurasia under different emission scenarios

    Science.gov (United States)

    Sokolov, A. P.; Monier, E.; Gao, X.

    2012-12-01

    In an attempt to evaluate possible climate change over Eurasia, we analyze results of six AMIP-type simulations with CAM version 3 (CAM3) at 2x2.5 degree resolution. CAM3 is driven by time series of sea surface temperatures (SSTs) and sea ice obtained by running the MIT IGSM2.3, which consists of a 3D ocean GCM coupled to a zonally-averaged atmospheric climate-chemistry model. In addition to changes in SSTs, CAM3 is forced by changes in greenhouse gases and ozone concentrations, sulfate aerosol forcing and black carbon loading calculated by the IGSM2.3. An essential feature of the IGSM is the possibility to vary its climate sensitivity (using a cloud adjustment technique) and the strength of the aerosol forcing. For consistency, new modules were developed in CAM3 to modify its climate sensitivity and aerosol forcing to match those used in the simulations with the IGSM2.3. The simulations presented in this paper were carried out for two emission scenarios, a "Business as usual" scenario and a 660 ppm CO2-EQ stabilization, which are similar to the RCP8.5 and RCP4.5 scenarios, respectively. The values of climate sensitivity used in the simulations within the IGSM-CAM framework are the median and the bounds of the 90% probability interval of the probability distribution obtained by comparing the 20th century climate simulated by different versions of the IGSM with observations. The associated strength of the aerosol forcing was chosen to ensure a good agreement with the observed climate change over the 20th century. Because the concentration of sulfate aerosol significantly decreases over the 21st century in both emissions scenarios, climate changes obtained in these simulations provide a good approximation for the median, and the 5th and 95th percentiles of the probability distribution of 21st century climate change.

  8. Approximate Dynamic Programming: Combining Regional and Local State Following Approximations.

    Science.gov (United States)

    Deptula, Patryk; Rosenfeld, Joel A; Kamalapurkar, Rushikesh; Dixon, Warren E

    2018-06-01

    An infinite-horizon optimal regulation problem for a control-affine deterministic system is solved online using a local state following (StaF) kernel and a regional model-based reinforcement learning (R-MBRL) method to approximate the value function. Unlike traditional methods such as R-MBRL that aim to approximate the value function over a large compact set, the StaF kernel approach aims to approximate the value function in a local neighborhood of the state that travels within a compact set. In this paper, the value function is approximated using a state-dependent convex combination of the StaF-based and the R-MBRL-based approximations. As the state enters a neighborhood containing the origin, the value function transitions from being approximated by the StaF approach to the R-MBRL approach. Semiglobal uniformly ultimately bounded (SGUUB) convergence of the system states to the origin is established using a Lyapunov-based analysis. Simulation results are provided for two, three, six, and ten-state dynamical systems to demonstrate the scalability and performance of the developed method.
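    As a minimal illustration of the blending idea described above (not the paper's implementation), the value estimate can be written as a state-dependent convex combination of a local (StaF) and a regional (R-MBRL) approximation, with the weight shifting toward the regional approximation near the origin; the weight function and the quadratic stand-ins below are invented:

```python
import numpy as np

def blend_weight(x, r_inner=0.5, r_outer=2.0):
    # Weight on the regional approximation: 0 far from the origin, 1 close to it.
    r = np.linalg.norm(x)
    return float(np.clip((r_outer - r) / (r_outer - r_inner), 0.0, 1.0))

def value_estimate(x, v_staf, v_rmbrl):
    # State-dependent convex combination of the two approximations.
    lam = blend_weight(x)
    return lam * v_rmbrl(x) + (1.0 - lam) * v_staf(x)

# Placeholder approximators standing in for trained StaF / R-MBRL critics
v_staf = lambda x: float(x @ x)
v_rmbrl = lambda x: 1.1 * float(x @ x)
print(value_estimate(np.array([0.3, 0.1]), v_staf, v_rmbrl))
```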

  9. Italian energy scenarios comparative evaluations

    International Nuclear Information System (INIS)

    Contaldi, Mario

    2005-01-01

    This paper reviews some representative scenarios of the evolution of the Italian primary energy consumption, updated recently. After an overview of the main macroeconomic assumptions, the scenario results are cross-checked at the sectoral level, with a brief discussion of the underlying data and energy intensity trends. The emissions of CO2, SO2 and NOx resulting from the considered scenarios are also reported and discussed [it

  10. Principles of maximally classical and maximally realistic quantum ...

    Indian Academy of Sciences (India)

    Principles of maximally classical and maximally realistic quantum mechanics. S M ROY. Tata Institute of Fundamental Research, Homi Bhabha Road, Mumbai 400 005, India. Abstract. Recently Auberson, Mahoux, Roy and Singh have proved a long standing conjecture of Roy and Singh: In 2N-dimensional phase space, ...

  11. Engineering, nutrient removal, and feedstock conversion evaluations of four corn stover harvest scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Hoskinson, Reed L.; Radtke, Corey W. [Idaho National Laboratory, P.O Box 1625, Idaho Falls, ID 83415-2210 (United States); Karlen, Douglas L. [USDA-ARS, National Soil Tilth Laboratory, Ames, IA 50011-3120 (United States); Birrell, Stuart J. [Iowa State University, Agricultural and Biosystems Engineering Department, Ames, IA 50011 (United States); Wilhelm, W.W. [USDA-ARS, Soil and Water Conservation Research Unit, Lincoln, NE 68583-0934 (United States)

    2007-02-15

    Crop residue has been identified as a near-term source of biomass for renewable fuel, heat, power, chemicals and other bio-materials. A prototype one-pass harvest system was used to collect residue samples from a corn (Zea mays L.) field near Ames, IA. Four harvest scenarios (low cut, high-cut top, high-cut bottom, and normal cut) were evaluated and are expressed as collected stover harvest indices (CSHI). High-cut top and high-cut bottom samples were obtained from the same plot in separate operations. Chemical composition, dilute acid pretreatment response, ethanol conversion yield and efficiency, and thermochemical conversion for each scenario were determined. Mean grain yield in this study (10.1 Mg ha^-1 dry weight) was representative of the average yield (10.0 Mg ha^-1) for the area (Story County, IA) and year (2005). The four harvest scenarios removed 6.7, 4.9, 1.7, and 5.1 Mg ha^-1 of dry matter, respectively, or 0.60 for low cut, 0.66 for normal cut, and 0.61 for the total high-cut (top+bottom) scenarios when expressed as CSHI values. The macro-nutrient replacement value for the normal harvest scenario was $57.36 ha^-1 or $11.27 Mg^-1. Harvesting stalk bottoms increased stover water content, risk of combine damage, estimated transportation costs, and left insufficient soil cover, while also producing a problematic feedstock. These preliminary results indicate harvesting stover (including the cobs) at a height of approximately 40 cm would be best for farmers and ethanol producers because of faster harvest speed and higher quality ethanol feedstock. (author)

  12. Scenario-based resilience assessment framework for critical infrastructure systems: Case study for seismic resilience of seaports

    International Nuclear Information System (INIS)

    Shafieezadeh, Abdollah; Ivey Burden, Lindsay

    2014-01-01

    A number of metrics in the past have been proposed and numerically implemented to assess the overall performance of large systems during natural disasters and their recovery in the aftermath of the events. Among such performance measures, resilience is a reliable metric. This paper proposes a probabilistic framework for scenario-based resilience assessment of infrastructure systems. The method accounts for uncertainties in the process including the correlation of the earthquake intensity measures, fragility assessment of structural components, estimation of repair requirements, the repair process, and finally the service demands. The proposed method is applied to a hypothetical seaport terminal and the system level performance of the seaport is assessed using various performance metrics. Results of this analysis have shown that medium to large seismic events may significantly disrupt the operation of seaports right after the event and the recovery process may take months. The proposed framework will enable port stakeholders to systematically assess the most-likely performance of the system during expected future earthquake events. - Highlights: • A scenario-based framework for seismic resilience assessment of systems is presented. • Seismic resilience of a hypothetical seaport with realistic settings is studied. • Berth availability is found to govern seaport functionality following earthquakes

  13. Guidance for a harmonized emission scenario document (ESD) on ballast water discharge

    Energy Technology Data Exchange (ETDEWEB)

    Zipperle, Andreas [BIS - Beratungszentrum fuer integriertes Sedimentmanagement, Hamburg (Germany); Gils, Jos van [DELTARES, Delft (Netherlands); Hattum, Bert van [Amsterdam Univ. (Netherlands). IVM - Institute for Environmental Studies; Heise, Susanne [BIS - Beratungszentrum fuer integriertes Sedimentmanagement, Hamburg (Germany); Hamburg Univ. of Applied Sciences (Germany)

    2011-05-15

    The present report provides guidance for a harmonized Emission Scenario Document (ESD) for the exposure assessment as part of the environmental risk assessment process which applicants seeking approval of a ballast water management system (BWMS) need to perform prior to notification and authorisation procedures. Despite the global variability of the marine environment, ballast water discharges and treatment methods, exposure assessments need to be comparable between different applications. In order to achieve this, this ESD points out the following aspects: - Applicants should use standardized scenarios in order to predict mean exposure. These should reflect generic situations, independent of region or port so that results are widely applicable. In addition to a harbour scenario, a standardized shipping lane scenario should be considered, - During or right after ballast water discharge, high concentrations may persist in a water body for a certain length of time until extensive mixing results in mean concentrations. Not taking exposure to peak concentrations within gradients into account could lead to an underestimation of risk, especially for rapidly degrading substances. Efforts have been made to approximate maximum exposure concentration with simple dilution factors. Their applicability was checked by near-field-evaluations. - Chemical properties determine the environmental fate of substances. If they are ambiguous, selection of a specific set of data strongly influences the result of an exposure assessment. Guidance is given on what to do about lacking data. - In order to harmonize the exposure assessments, reliable chemical model software should be used. A discussion on the requirements of suitable software and an evaluation of MAMPEC is given in this report. (orig.)

  14. Scenarios

    NARCIS (Netherlands)

    Pérez-Soba, Marta; Maas, Rob

    2015-01-01

    We cannot predict the future with certainty, but we know that it is influenced by our current actions, and that these in turn are influenced by our expectations. This is why future scenarios have existed from the dawn of civilization and have been used for developing military, political and economic

  15. Toward the M(F)--Theory Embedding of Realistic Free-Fermion Models

    CERN Document Server

    Berglund, P; Faraggi, A E; Nanopoulos, Dimitri V; Qiu, Z; Berglund, Per; Ellis, John; Faraggi, Alon E.; Qiu, Zongan

    1998-01-01

    We construct a Landau-Ginzburg model with the same data and symmetries as a $Z_2\\times Z_2$ orbifold that corresponds to a class of realistic free-fermion models. Within the class of interest, we show that this orbifolding connects between different $Z_2\\times Z_2$ orbifold models and commutes with the mirror symmetry. Our work suggests that duality symmetries previously discussed in the context of specific $M$ and $F$ theory compactifications may be extended to the special $Z_2\\times Z_2$ orbifold that characterizes realistic free-fermion models.

  16. Ultra-realistic imaging advanced techniques in analogue and digital colour holography

    CERN Document Server

    Bjelkhagen, Hans

    2013-01-01

    Ultra-high resolution holograms are now finding commercial and industrial applications in such areas as holographic maps, 3D medical imaging, and consumer devices. Ultra-Realistic Imaging: Advanced Techniques in Analogue and Digital Colour Holography brings together a comprehensive discussion of key methods that enable holography to be used as a technique of ultra-realistic imaging.After a historical review of progress in holography, the book: Discusses CW recording lasers, pulsed holography lasers, and reviews optical designs for many of the principal laser types with emphasis on attaining th

  17. Modelling Snowmelt Runoff under Climate Change Scenarios in an Ungauged Mountainous Watershed, Northwest China

    Directory of Open Access Journals (Sweden)

    Yonggang Ma

    2013-01-01

    Full Text Available An integrated modeling system has been developed for analyzing the impact of climate change on snowmelt runoff in Kaidu Watershed, Northwest China. The system couples Hadley Centre Coupled Model version 3 (HadCM3) outputs with the Snowmelt Runoff Model (SRM). The SRM was verified against observed discharge for the outlet hydrological station of the watershed during the period from April to September in 2001 and generally performed well for the Nash-Sutcliffe coefficient (EF) and water balance coefficient (RE). The EF is above approximately 0.8, and the water balance error is lower than ±10%, indicating reasonable prediction accuracy. The Statistical Downscaling Model (SDSM) was used to downscale coarse outputs of HadCM3, and then the downscaled future climate data were used as inputs to the SRM. Four scenarios were considered for analyzing the climate change impact on snowmelt flow in the Kaidu Watershed. The results indicated that watershed hydrology would alter under different climate change scenarios. The stream flow in spring is likely to increase with the increased mean temperature; the discharge and peak flow in summer decrease with the decreased precipitation under Scenarios 1 and 2. Moreover, the consideration of the change in cryosphere area would intensify the variability of stream flow under Scenarios 3 and 4. The modeling results provide useful decision support for water resources management.
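    For reference, the two goodness-of-fit measures quoted above can be computed from observed and simulated discharge series as follows (the series below are made up; EF is the Nash-Sutcliffe efficiency and the water balance error is the percent volume difference):

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    # EF = 1 for a perfect fit; EF = 0 means no better than the observed mean.
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def water_balance_error(obs, sim):
    # Percent difference between simulated and observed runoff volumes.
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * (sim.sum() - obs.sum()) / obs.sum()

obs = [12.0, 15.0, 30.0, 45.0, 38.0, 22.0]   # observed discharge (made-up)
sim = [11.0, 16.0, 28.0, 47.0, 36.0, 23.0]   # simulated discharge (made-up)
print(nash_sutcliffe(obs, sim), water_balance_error(obs, sim))
```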

  18. Low carbon society scenario analysis of transport sector of an emerging economy—The AIM/Enduse modelling approach

    International Nuclear Information System (INIS)

    Selvakkumaran, Sujeetha; Limmeechokchai, Bundit

    2015-01-01

    The transport sector of a country is the backbone driving the economy forward. Thailand’s land transport sector is modelled using the AIM/Enduse, which is a recursive dynamic optimization model based on the bottom-up modelling principle. The travel demand is divided into two major categories, which are passenger travel and freight travel. The objective of this paper is to analyse the mitigation possible through low carbon society (LCS) measures and an emission tax (ET). Two scenario clusters are devised along with the BAU case. The LCS scenario cluster has three designed scenarios, which are LCS-L, LCS-M and LCS-H. The emission tax (ET) cluster has four scenarios, where taxes of 50, 100, 200 and 500 USD/t-CO2 are implemented. Along with this, the marginal abatement costs (MAC) of the counter-measures (CMs) and the co-benefits in terms of energy security, productivity and air pollutant mitigation are also assessed. Results show that the LCS scenarios are capable of mitigating up to 1230 Mt-CO2 cumulatively from 2010 to 2050. In terms of MACs, new vehicles play a pivotal role, along with hybrid vehicles. The Average Abatement Cost (AAC) assessment shows that the AAC of the LCS-H scenario is on the order of 100 USD/t-CO2. All the LCS and ET scenarios show an enhancement in energy security and also a threefold increase in productivity. There is distinct mitigation in terms of air pollutants from the transport sector as well. -- Highlights: •Thailand’s transport sector has been modelled using the AIM/Enduse model. •Potential cumulative mitigation of CO2 during 2010–2050 is approximately 30% when compared with the BAU scenario. •Abatement cost curves show that various counter-measures are practical in the transport sector. •Energy security is enhanced due to CO2 mitigation in the LCS scenario

  19. Collective multipole excitations based on correlated realistic nucleon-nucleon interactions

    International Nuclear Information System (INIS)

    Paar, N.; Papakonstantinou, P.; Hergert, H.; Roth, R.

    2006-01-01

    We investigate collective multipole excitations for closed shell nuclei from 16O to 208Pb using correlated realistic nucleon-nucleon interactions in the framework of the random phase approximation (RPA). The dominant short-range central and tensor correlations are treated explicitly within the Unitary Correlation Operator Method (UCOM), which provides a phase-shift equivalent correlated interaction VUCOM adapted to simple uncorrelated Hilbert spaces. The same unitary transformation that defines the correlated interaction is used to derive correlated transition operators. Using VUCOM we solve the Hartree-Fock problem and employ the single-particle states as the starting point for the RPA. By construction, the UCOM-RPA is fully self-consistent, i.e. the same correlated nucleon-nucleon interaction is used in calculations of the HF ground state and in the residual RPA interaction. Consequently, the spurious state associated with the center-of-mass motion is properly removed and the sum rules are exhausted within ±3%. The UCOM-RPA scheme results in a collective character of giant monopole, dipole, and quadrupole resonances in closed-shell nuclei across the nuclear chart. For the isoscalar giant monopole resonance, the resonance energies are in agreement with experiment, hinting at a reasonable compressibility. However, in the 1^- and 2^+ channels the resonance energies are overestimated due to missing long-range correlations and three-body contributions. (orig.)

  20. The Brazilian external individual monitoring scenario

    International Nuclear Information System (INIS)

    Mauricio, Claudia L.P.; Silva, Claudio R. da; Cunha, Paulo G. da

    2015-01-01

    In order to improve radiation protection it is necessary to have knowledge of the occupational radiation dose levels in all radiation facilities. This information comes from individual monitoring services, which are responsible for measuring and providing information about workers' radiation exposure. In 1981, the Comissao Nacional de Energia Nuclear (CNEN) of Brazil started to develop a comprehensive system for the regulation and storage of occupational radiation doses. This paper starts with an overview of the evolution of the Brazilian authorization and data storage system for external individual monitoring, beginning with a rule for the authorization of all Brazilian photon individual monitoring services and the obligation for them to send the measured doses to CNEN. Up to now there is no regulation for neutron individual monitoring. The aim of this paper is to present the current scenario of the Brazilian external monitoring system, reinforcing its importance and remaining problems. The number of monitored workers increases greatly every year and has surpassed 150,000. The stored data show that the mean annual occupational external dose has decreased from 2.4 mSv in 1987 to about 0.6 mSv in 2012, but some unrealistically high doses (higher than 100 mSv) are still measured without investigation. About 80% of the annual dose values are lower than the monthly registration level. As expected, the highest real photon doses are found in Nuclear Medicine, Industrial Radiology and Interventional Radiology. All recorded annual neutron dose values are lower than 20 mSv. (author)

  1. Using a Realist Research Methodology in Policy Analysis

    Science.gov (United States)

    Lourie, Megan; Rata, Elizabeth

    2017-01-01

    The article describes the usefulness of a realist methodology in linking sociological theory to empirically obtained data through the development of a methodological device. Three layers of analysis were integrated: 1. the findings from a case study about Maori language education in New Zealand; 2. the identification and analysis of contradictions…

  2. SCENARIO PLANNING AS LEARNING

    Directory of Open Access Journals (Sweden)

    Antonio Lourenço Junior

    2010-10-01

    Full Text Available Scenario Planning has been increasingly used since its introduction to the decision process as an effective tool to test decisions and improve performance in a dynamic environment (Chermack, 2005). The purpose of this article is to demonstrate the potential of an experimental Scenario Planning Model to mobilize, encourage and add more content to the organization’s decision-making process – mainly with respect to the Strategic Plans of two governmental institutions, a pharmaceutical company and a technology education foundation. This study describes the application stages of a hybrid scenario-planning model – herein referred to as Planning as Learning – via action research, showing the scenarios resulting from the experiment, and describes the main results of an assessment of this practice. To do so, two well-established Scenario Planning models (the Prospective school and Shell’s model) were analyzed. They were used as a reference for the proposition and application of an experimental model in the two organizations studied. A questionnaire was used to assess the technique’s impact, and high levels of reliability were obtained. In-depth interviews were also conducted with the participants. In the end, the results confirmed the model’s efficiency as a basis for decision making in the competitive environment in which the two institutions operate, and its capacity to encourage group learning, as observed throughout the work.

  3. Development of groundwater pesticide exposure modeling scenarios for vulnerable spring and winter wheat-growing areas.

    Science.gov (United States)

    Padilla, Lauren; Winchell, Michael; Peranginangin, Natalia; Grant, Shanique

    2017-11-01

    Wheat crops and the major wheat-growing regions of the United States are not included in the 6 crop- and region-specific scenarios developed by the US Environmental Protection Agency (USEPA) for exposure modeling with the Pesticide Root Zone Model conceptualized for groundwater (PRZM-GW). The present work augments the current scenarios by defining appropriately vulnerable PRZM-GW scenarios for high-producing spring and winter wheat-growing regions that are appropriate for use in refined pesticide exposure assessments. Initial screening-level modeling was conducted for all wheat areas across the conterminous United States as defined by multiple years of the Cropland Data Layer land-use data set. Soil, weather, groundwater temperature, evaporation depth, and crop growth and management practices were characterized for each wheat area from publicly and nationally available data sets and converted to input parameters for PRZM. Approximately 150 000 unique combinations of weather, soil, and input parameters were simulated with PRZM for an herbicide applied for postemergence weed control in wheat. The resulting postbreakthrough average herbicide concentrations in a theoretical shallow aquifer were ranked to identify states with the largest regions of relatively vulnerable wheat areas. For these states, input parameters resulting in near 90th percentile postbreakthrough average concentrations corresponding to significant wheat areas with shallow depth to groundwater formed the basis for 4 new spring wheat scenarios and 4 new winter wheat scenarios to be used in PRZM-GW simulations. Spring wheat scenarios were identified in North Dakota, Montana, Washington, and Texas. Winter wheat scenarios were identified in Oklahoma, Texas, Kansas, and Colorado. Compared to the USEPA's original 6 scenarios, postbreakthrough average herbicide concentrations in the new scenarios were lower than all but the Florida Potato and Georgia Coastal Peanuts of the original scenarios and better

  4. The negaWatt 2011 scenario

    International Nuclear Information System (INIS)

    2016-03-01

    This article presents the approach adopted for the negaWatt scenario and the results obtained. It is based on sobriety (energy savings), on energy efficiency, and on the use of renewable energies. After outlining the different reasons for an energy transition (increasing energy consumption, criticisms and risks related to nuclear energy, and the high potential of renewable energies), the scenario and its main principles are presented. The scenario identifies potential energy savings ranging from one half to two thirds in the different energy-consuming sectors. The building sector is presented as a major issue. Transport is described as a sector to be addressed over the long term. The necessary transformation of the industry sector is highlighted. The agriculture sector is presented as being at the heart of the transition. Energy uses are to become sober, efficient and renewable. The scenario is based on a rapid development of renewable energies, while fossil energies are to become marginal, nuclear is to be progressively and reasonably phased out, and networks are to be made compatible to ensure the scenario's success. Thus, the scenario demonstrates the feasibility of a 100% sustainable primary energy balance and meets the stakes and objectives for 2050. The cost of the energy transition is briefly discussed

  5. Rethinking the role of scenarios: Participatory scripting of low-carbon scenarios for France

    International Nuclear Information System (INIS)

    Mathy, Sandrine; Fink, Meike; Bibas, Ruben

    2015-01-01

    This article considers the usefulness of low-carbon scenarios in public decision-making. They may be useful as a product-oriented trajectory. The scenarios on the agenda of the 2013 Energy Debate in France belong to this category. But a scenario may also be process-oriented, in the sense that its scripting process helps build consensus and a minimum level of agreement. We have scripted scenarios using a codevelopment method, involving about 40 stakeholders from the private and public sectors, and from the state: NGOs, consumer groups, trade unions, banks and local authorities. They selected policies they considered acceptable for achieving 75% greenhouse gas emission reductions by 2050. These policies were then integrated into the Imaclim-R-France technico-economic simulation model, as part of a high or moderate acceptability scenario. In the first case emissions were cut by between 58% and 72% by 2050; in the second case by between 68% and 81%, depending on the energy price assumptions. All these measures benefited jobs and economic growth, swiftly and durably cutting household spending on energy services. This offers a solid basis for gaining acceptability for low carbon trajectories; the process also constitutes a framework for consolidating collective learning centered on the acceptability of climate policies. - Highlights: • The article develops a ‘process-oriented’ low carbon scenario for France. • Stakeholders define a set of sectoral and fiscal ‘acceptable’ climate policies. • These policies are integrated within the technico-economic model Imaclim-R-France. • Economic impacts and CO2 emission reductions are computed. • The co-development methodology favors joint production of solutions and shared vision-building

  6. Carbon-constrained scenarios. Final report

    International Nuclear Information System (INIS)

    2009-05-01

    This report provides the results of the study entitled 'Carbon-Constrained Scenarios' that was funded by FONDDRI from 2004 to 2008. The study was achieved in four steps: (i) Investigating the stakes of a strong carbon constraint for the industries participating in the study, not only looking at the internal decarbonization potential of each industry but also exploring the potential shifts of the demand for industrial products. (ii) Developing a hybrid modelling platform based on a tight dialog between the sectoral energy model POLES and the macro-economic model IMACLIM-R, in order to achieve a consistent assessment of the consequences of an economy-wide carbon constraint on energy-intensive industrial sectors, while taking into account technical constraints, barriers to the deployment of new technologies and general economic equilibrium effects. (iii) Producing several scenarios up to 2050 with different sets of hypotheses concerning the driving factors for emissions - in particular the development styles. (iv) Establishing an iterative dialog between researchers and industry representatives on the results of the scenarios so as to improve them, but also to facilitate the understanding and the appropriate use of these results by the industrial partners. This report provides the results of the different scenarios computed in the course of the project. It is a partial synthesis of the work that has been accomplished and of the numerous exchanges that this study has induced between modellers and stakeholders. The first part was written in April 2007 and describes the first reference scenario and the first mitigation scenario designed to achieve stabilization at 450 ppm CO2 at the end of the 21st century. This scenario has been called 'mimetic' because it has been built on the assumption that the ambitious climate policy would coexist with a progressive convergence of development paths toward the current paradigm of industrialized countries: urban sprawl, general

  7. A multipoint flux approximation of the steady-state heat conduction equation in anisotropic media

    KAUST Repository

    Salama, Amgad; Sun, Shuyu; El-Amin, M. F.

    2013-01-01

    In this work, we introduce multipoint flux (MF) approximation method to the problem of conduction heat transfer in anisotropic media. In such media, the heat flux vector is no longer coincident with the temperature gradient vector. In this case, thermal conductivity is described as a second order tensor that usually requires, at least, six quantities to be fully defined in general three-dimensional problems. The two-point flux finite differences approximation may not handle such anisotropy and essentially more points need to be involved to describe the heat flux vector. In the framework of mixed finite element method (MFE), the MFMFE methods are locally conservative with continuous normal fluxes. We consider the lowest order Brezzi-Douglas-Marini (BDM) mixed finite element method with a special quadrature rule that allows for nodal velocity elimination resulting in a cell-centered system for the temperature. We show comparisons with some analytical solution of the problem of conduction heat transfer in anisotropic long strip. We also consider the problem of heat conduction in a bounded, rectangular domain with different anisotropy scenarios. It is noticed that the temperature field is significantly affected by such anisotropy scenarios. Also, the technique used in this work has shown that it is possible to use the finite difference settings to handle heat transfer in anisotropic media. In this case, heat flux vectors, for the case of rectangular mesh, generally require six points to be described. Copyright © 2013 by ASME.
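    The anisotropy discussed above enters through the tensor form of Fourier's law, q = -K∇T, with a symmetric second-order conductivity tensor (six independent components in 3-D), so the flux is generally not parallel to the temperature gradient. A minimal numerical illustration with invented values:

```python
import numpy as np

# Symmetric thermal conductivity tensor (W/(m*K)); six independent entries (invented)
K = np.array([[2.0, 0.5, 0.1],
              [0.5, 1.5, 0.2],
              [0.1, 0.2, 1.0]])

grad_T = np.array([10.0, -5.0, 2.0])  # temperature gradient (K/m), invented

q = -K @ grad_T                       # heat flux vector (W/m^2)
print(q)                              # generally not aligned with grad_T
```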

  9. Bone marrow equivalent prompt dose from two common fallout scenarios

    International Nuclear Information System (INIS)

    Morris, M.D.; Jones, T.D.; Young, R.W.

    1994-01-01

    A cell-kinetics model for radiation-induced myelopoiesis has been derived for mice, rats, dogs, sheep, swine, and burros. The model was extended to humans after extensive comparisons with molecular and cellular data from biological experiments and an assortment of predictive/validation tests on animal mortality, cell survival, and cellular repopulation following irradiations. One advantage of the model is that any complex pattern of protracted irradiation can be equated to its equivalent prompt dose. Severity of biological response depends upon target-organ dose, dose rate, and dose fractionation. Epidemiological and animal data are best suited for exposures given in brief periods of time. To use those data to assess risk from protracted human exposures, it is obligatory to model molecular repair and compensatory proliferation in terms of prompt dose. Although the model is somewhat complex both mathematically and biologically, this note describes simple numerical approximations for two common exposure scenarios. Both approximations are easily evaluated on a simple pocket calculator by a health physicist or emergency management officer. 12 refs., 5 figs

  10. Realistic electricity market simulator for energy and economic studies

    International Nuclear Information System (INIS)

    Bernal-Agustin, Jose L.; Contreras, Javier; Conejo, Antonio J.; Martin-Flores, Raul

    2007-01-01

    Electricity market simulators have become a useful tool to train engineers in the power industry. With the maturing of electricity markets throughout the world, there is a need for sophisticated software tools that can replicate the actual behavior of power markets. In most of these markets, power producers/consumers submit production/demand bids and the Market Operator clears the market producing a single price per hour. What makes markets different from each other are the bidding rules and the clearing algorithms to balance the market. This paper presents a realistic simulator of the day-ahead electricity market of mainland Spain. All the rules that govern this market are modeled. This simulator can be used either to train employees by power companies or to teach electricity markets courses in universities. To illustrate the tool, several realistic case studies are presented and discussed. (author)
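    As a toy illustration of the single-hour uniform-price clearing described above (not the actual Spanish market rules), supply bids can be stacked in merit order against an aggregate demand, with the marginal accepted bid setting the hourly price; all numbers are invented:

```python
def clear_hour(supply_bids, demand_mwh):
    """supply_bids: list of (price, quantity_mwh) offers.
    Returns (clearing_price, accepted_offers) from a simple merit-order stack."""
    accepted, served, price = [], 0.0, 0.0
    for bid_price, qty in sorted(supply_bids):        # cheapest offers first
        if served >= demand_mwh:
            break
        take = min(qty, demand_mwh - served)
        accepted.append((bid_price, take))
        served += take
        price = bid_price                             # marginal offer sets the price
    return price, accepted

bids = [(20.0, 300), (35.0, 200), (50.0, 150), (70.0, 100)]
print(clear_hour(bids, demand_mwh=550))               # clearing price 50.0
```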

  11. Type IIB orientifolds, D-brane instantons and the large volume scenario

    Energy Technology Data Exchange (ETDEWEB)

    Plauschinn, Erik

    2009-07-28

    This thesis is concerned with a branch of research in String Theory called String Phenomenology which aims for a better understanding of the connection between String Theory and Particle Physics. In particular, in this work we cover three topics which are important in order to establish this connection. The first topic is about String Theory model building in the context of so-called type IIB orientifolds with orientifold three- and seven-planes. After giving a brief overview, we work out in detail an important consistency condition for String Theory constructions, the so-called tadpole cancellation condition, and we verify explicitly that chiral anomalies are cancelled via the generalised Green-Schwarz mechanism. The second topic is concerned with so-called D-brane instantons which are nonperturbative effects in type II String Theory constructions. We recall the instanton calculus for such configurations, we derive the so-called Affleck-Dine-Seiberg superpotential in String Theory and we develop an important constraint, a chiral zero-mode constraint, for instanton contributions in the presence of a realistic Particle Physics sector. The third topic is about moduli stabilisation in type IIB string compactifications. More concretely, we review the so-called KKLT scenario as well as the Large Volume Scenario, and we construct and study a model for the latter scenario where the constraint mentioned above has been taken into account explicitly. Although the three topics studied in this thesis are slightly different in nature, there is nevertheless a complex interplay between them with many interrelations. In order to uncover these connections, a detailed study of each individual subject has been performed which has led to new results such as the chiral zero-mode constraint. (orig.)

  13. Biomass Scenario Model | Energy Analysis | NREL

    Science.gov (United States)

    The Biomass Scenario Model (BSM) models the conversion of a range of lignocellulosic biomass feedstocks into biofuels, including the conversion of plant matter (lignocellulosic biomass) to fermentable sugars for the production of fuel ethanol.

  14. Virtual-source diffusion approximation for enhanced near-field modeling of photon-migration in low-albedo medium.

    Science.gov (United States)

    Jia, Mengyu; Chen, Xueying; Zhao, Huijuan; Cui, Shanshan; Liu, Ming; Liu, Lingling; Gao, Feng

    2015-01-26

    Most analytical methods for describing light propagation in turbid medium exhibit low effectiveness in the near-field of a collimated source. Motivated by the Charge Simulation Method in electromagnetic theory as well as the established discrete source based modeling, we herein report on an improved explicit model for a semi-infinite geometry, referred to as "Virtual Source" (VS) diffuse approximation (DA), to fit for low-albedo medium and short source-detector separation. In this model, the collimated light in the standard DA is analogously approximated as multiple isotropic point sources (VS) distributed along the incident direction. For performance enhancement, a fitting procedure between the calculated and realistic reflectances is adopted in the near-field to optimize the VS parameters (intensities and locations). To be practically applicable, an explicit 2VS-DA model is established based on close-form derivations of the VS parameters for the typical ranges of the optical parameters. This parameterized scheme is proved to inherit the mathematical simplicity of the DA approximation while considerably extending its validity in modeling the near-field photon migration in low-albedo medium. The superiority of the proposed VS-DA method to the established ones is demonstrated in comparison with Monte-Carlo simulations over wide ranges of the source-detector separation and the medium optical properties.
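    A schematic sketch of the virtual-source idea (not the authors' parameterized 2VS-DA model): the reflectance is written as a weighted sum of standard semi-infinite diffusion (dipole/image-source) solutions for isotropic point sources placed along the incident direction; the depths, weights and optical properties below are placeholders that would in practice come from the fitting procedure:

```python
import numpy as np

def dipole_reflectance(rho, z0, mua, musp, A=1.0):
    # Standard dipole (image-source) diffusion solution for one isotropic
    # point source at depth z0 in a semi-infinite medium (assumed form).
    D = 1.0 / (3.0 * (mua + musp))
    mueff = np.sqrt(3.0 * mua * (mua + musp))
    zb = 2.0 * A * D
    r1 = np.sqrt(rho**2 + z0**2)
    r2 = np.sqrt(rho**2 + (z0 + 2.0 * zb)**2)
    return (1.0 / (4.0 * np.pi)) * (
        z0 * (mueff + 1.0 / r1) * np.exp(-mueff * r1) / r1**2
        + (z0 + 2.0 * zb) * (mueff + 1.0 / r2) * np.exp(-mueff * r2) / r2**2)

def vs_reflectance(rho, depths, weights, mua, musp):
    # Weighted sum over virtual isotropic sources along the incident direction.
    return sum(w * dipole_reflectance(rho, z, mua, musp)
               for z, w in zip(depths, weights))

rho = np.linspace(0.05, 1.0, 20)  # source-detector separations (cm), placeholder
print(vs_reflectance(rho, depths=[0.05, 0.15], weights=[0.6, 0.4], mua=0.5, musp=5.0))
```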

  15. Quantifying introgression risk with realistic population genetics

    OpenAIRE

    Ghosh, Atiyo; Meirmans, Patrick G.; Haccou, Patsy

    2012-01-01

    Introgression is the permanent incorporation of genes from the genome of one population into another. This can have severe consequences, such as extinction of endemic species, or the spread of transgenes. Quantification of the risk of introgression is an important component of genetically modified crop regulation. Most theoretical introgression studies aimed at such quantification disregard one or more of the most important factors concerning introgression: realistic genetical mechanisms, rep...

  16. Quantum cryptography: towards realization in realistic conditions

    International Nuclear Information System (INIS)

    Imoto, M.; Koashi, M.; Shimizu, K.; Huttner, B.

    1997-01-01

    Many quantum cryptography schemes have been proposed based on assumptions such as no transmission loss, no measurement error, and an ideal single-photon generator. We have been trying to develop a theory of quantum cryptography under realistic conditions. As such attempts, we propose quantum cryptography with coherent states, quantum cryptography with two-photon interference, and a generalization of two-state cryptography to two-mixed-state cases. (author)

  17. Energy scenarios for Colombia - Environmental Aspects

    International Nuclear Information System (INIS)

    Smith, Ricardo A; Vesga A, Daniel R; Boman, Ulf

    2000-01-01

    The planning unit of the Colombian Ministry of Energy (UPME) has carried out an energy scenario project for Colombia with a 20-year horizon (vision year 2020). In this project the scenario methodology was used in a systemic way, involving a great number of local and international energy experts. As a result, four energy scenarios were designed, and in all of them the possible evolution of all energy sources was analyzed. In this article a description of the methodology used is presented together with the developed scenarios. A discussion of the long-range future environmental considerations in the energy sector, taking into account the developed scenarios, is also presented. Finally, some conclusions and recommendations are presented

  18. Scenario Planning as Organizational Intervention

    DEFF Research Database (Denmark)

    Balarezo, Jose; Nielsen, Bo Bernhard

    2017-01-01

    Purpose: This paper identifies four areas in need of future research to enhance our theoretical understanding of scenario planning, and sets the basis for future empirical examination of its effects on individual and organizational level outcomes. Design/methodology/approach: This paper organizes existing contributions on scenario planning within a new consolidating framework that includes antecedents, processes, and outcomes. The proposed framework allows for integration of the extant literature on scenario planning from a wide variety of fields, including strategic management, finance, human resource management, operations management, and psychology. Findings: This study contributes to research by offering a coherent and consistent framework for understanding scenario planning as a dynamic process. As such, it offers future researchers a systematic way to ascertain where a particular......

  19. About the Need of Combining Power Market and Power Grid Model Results for Future Energy System Scenarios

    Science.gov (United States)

    Mende, Denis; Böttger, Diana; Löwer, Lothar; Becker, Holger; Akbulut, Alev; Stock, Sebastian

    2018-02-01

    The European power grid infrastructure faces various challenges due to the expansion of renewable energy sources (RES). To conduct investigations on interactions between power generation and the power grid, models for the power market as well as for the power grid are necessary. This paper describes the basic functionalities and working principles of both types of models as well as steps to couple power market results and the power grid model. The combination of these models is beneficial in terms of gaining realistic power flow scenarios in the grid model and of being able to pass back results of the power flow and restrictions to the market model. Focus is laid on the power grid model and possible application examples like algorithms in grid analysis, operation and dynamic equipment modelling.
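    One common coupling step (offered here only as an illustration, not as the authors' method) is to map the market model's dispatch to nodal injections and check the resulting flows with a simple DC power flow; the 3-bus network and all numbers below are invented:

```python
import numpy as np

# Toy 3-bus DC power flow; bus 0 is the slack bus.
# Lines: (from_bus, to_bus, susceptance, thermal limit in MW) - all invented
lines = [(0, 1, 10.0, 100.0), (1, 2, 10.0, 100.0), (0, 2, 10.0, 100.0)]
injections = np.array([120.0, -30.0, -90.0])  # MW from the market dispatch (sums to 0)

n = 3
B = np.zeros((n, n))
for i, j, b, _ in lines:
    B[i, i] += b; B[j, j] += b
    B[i, j] -= b; B[j, i] -= b

theta = np.zeros(n)
theta[1:] = np.linalg.solve(B[1:, 1:], injections[1:])  # slack angle fixed to 0

for i, j, b, limit in lines:
    flow = b * (theta[i] - theta[j])
    print(f"line {i}-{j}: {flow:6.1f} MW (limit {limit:.0f} MW)")
```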

  20. Scenario-based fitted Q-iteration for adaptive control of water reservoir systems under uncertainty

    Science.gov (United States)

    Bertoni, Federica; Giuliani, Matteo; Castelletti, Andrea

    2017-04-01

    Over recent years, mathematical models have largely been used to support planning and management of water resources systems. Yet, the increasing uncertainties in their inputs - due to increased variability in the hydrological regimes - are a major challenge to the optimal operations of these systems. Such uncertainty, boosted by projected changing climate, violates the stationarity principle generally used for describing hydro-meteorological processes, which assumes time persisting statistical characteristics of a given variable as inferred by historical data. As this principle is unlikely to be valid in the future, the probability density function used for modeling stochastic disturbances (e.g., inflows) becomes an additional uncertain parameter of the problem, which can be described in a deterministic and set-membership based fashion. This study contributes a novel method for designing optimal, adaptive policies for controlling water reservoir systems under climate-related uncertainty. The proposed method, called scenario-based Fitted Q-Iteration (sFQI), extends the original Fitted Q-Iteration algorithm by enlarging the state space to include the space of the uncertain system's parameters (i.e., the uncertain climate scenarios). As a result, sFQI embeds the set-membership uncertainty of the future inflow scenarios in the action-value function and is able to approximate, with a single learning process, the optimal control policy associated to any scenario included in the uncertainty set. The method is demonstrated on a synthetic water system, consisting of a regulated lake operated for ensuring reliable water supply to downstream users. Numerical results show that the sFQI algorithm successfully identifies adaptive solutions to operate the system under different inflow scenarios, which outperform the control policy designed under historical conditions. Moreover, the sFQI policy generalizes over inflow scenarios not directly experienced during the policy design
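    A schematic sketch of the core idea (the regressor, features and transition format are placeholders, not the paper's setup): ordinary Fitted Q-Iteration is run on tuples whose state is augmented with the scenario parameter, so a single regressor approximates the Q-function for every scenario in the uncertainty set:

```python
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor

def scenario_fqi(transitions, actions, gamma=0.95, n_iter=30):
    """transitions: list of (state, scenario, action, reward, next_state) tuples.
    The regressor input is [state, scenario, action], so one Q-function
    covers all scenarios in the uncertainty set (illustrative sketch only)."""
    X = np.array([np.hstack([s, th, a]) for s, th, a, _, _ in transitions])
    R = np.array([r for _, _, _, r, _ in transitions])
    nxt = [(s2, th) for _, th, _, _, s2 in transitions]
    q = None
    for _ in range(n_iter):
        if q is None:
            y = R                                   # first iteration: immediate reward
        else:
            # Bellman target: r + gamma * max_a' Q([s', scenario], a')
            q_next = np.array([max(q.predict([np.hstack([s2, th, a])])[0]
                                   for a in actions) for s2, th in nxt])
            y = R + gamma * q_next
        q = ExtraTreesRegressor(n_estimators=50, random_state=0).fit(X, y)
    return q
```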

  1. A possible definition of a {\\it Realistic} Physics Theory

    OpenAIRE

    Gisin, Nicolas

    2014-01-01

    A definition of a {\\it Realistic} Physics Theory is proposed based on the idea that, at all time, the set of physical properties possessed (at that time) by a system should unequivocally determine the probabilities of outcomes of all possible measurements.

  2. Numerical computation of aeroacoustic transfer functions for realistic airfoils

    NARCIS (Netherlands)

    De Santana, Leandro Dantas; Miotto, Renato Fuzaro; Wolf, William Roberto

    2017-01-01

    Based on Amiet's theory formalism, we propose a numerical framework to compute the aeroacoustic transfer function of realistic airfoil geometries. The aeroacoustic transfer function relates the amplitude and phase of an incoming periodic gust to the respective unsteady lift response permitting,

  3. Student Work Experience: A Realistic Approach to Merchandising Education.

    Science.gov (United States)

    Horridge, Patricia; And Others

    1980-01-01

    Relevant and realistic experiences are needed to prepare the student for a future career. Addresses the results of a survey of colleges and universities in the United States in regard to their student work experience (SWE) in fashion merchandising. (Author)

  4. A realistic intersecting D6-brane model after the first LHC run

    Science.gov (United States)

    Li, Tianjun; Nanopoulos, D. V.; Raza, Shabbar; Wang, Xiao-Chuan

    2014-08-01

    With the Higgs boson mass around 125 GeV and the LHC supersymmetry search constraints, we revisit a three-family Pati-Salam model from intersecting D6-branes in Type IIA string theory on the T^6/(ℤ2 × ℤ2) orientifold which has a realistic phenomenology. We systematically scan the parameter space for both μ > 0 and μ < 0, and find that the gravitino mass is generically heavier than about 2 TeV for both cases due to the Higgs mass lower bound of 123 GeV. In particular, we identify a region of parameter space with the electroweak fine-tuning as small as Δ_EW ~ 24-32 (3-4%). In the viable parameter space which is consistent with all the current constraints, the mass ranges for the gluino, the first two-generation squarks and sleptons are respectively [3, 18] TeV, [3, 16] TeV, and [2, 7] TeV. For the third-generation sfermions, the light stop satisfying the 5σ WMAP bounds via neutralino-stop coannihilation has mass from 0.5 to 1.2 TeV, and the light stau can be as light as 800 GeV. We also show various coannihilation and resonance scenarios through which the observed dark matter relic density is achieved. Interestingly, certain portions of parameter space have excellent t-b-τ and b-τ Yukawa coupling unification. Three regions of parameter space are highlighted as well where the dominant component of the lightest neutralino is a bino, wino or higgsino. We discuss various scenarios in which such solutions may evade recent astrophysical bounds in case they satisfy or exceed the observed relic density bounds. Prospects of finding a higgsino-like neutralino in direct and indirect searches are also studied. We also display six tables of benchmark points depicting various interesting features of our model. Note that the lightest neutralino can be heavy up to 2.8 TeV, and there exists a natural region of parameter space from the low-energy fine-tuning definition with heavy gluino and first two-generation squarks/sleptons; we point out that the 33 TeV and 100 TeV proton-proton colliders are indeed

  5. Social Foundation of Scenario Planning

    DEFF Research Database (Denmark)

    Rowland, Nicholas James; Spaniol, Matthew Jon

    2017-01-01

    In this article, the authors establish that models of scenario planning typically involve a series of phases, stages, or steps that imply a sequenced (i.e., linear or chronological) process. Recursive models, in contrast, allow phases to repeat, thus incorporating iteration. The authors acknowledge ... from science and technology studies (STS) on knowledge production, the authors explain transition from one phase to the next and iteration between and within phases based on social negotiation. To this end, the authors examine the interplay between the “scenario development” phase and the “scenario use” phase of a planning process with a non-governmental organization in Denmark. The upshot for facilitators is practical insight into how transition between phases and phase iteration in scenario planning can be identified, leveraged, and, thus, managed. The upshot for scholars is a related insight...

  6. IPCC Special report on Emissions Scenarios (SRES)

    International Nuclear Information System (INIS)

    Anon

    2001-01-01

    This special report on emissions scenarios (SRES) is intended to reflect the most recent trends in the driving forces of emissions: population projections, economic development, and structural and technological change. It serves as an update to the IS92 scenarios developed by the IPCC in the early 1990s to illustrate a plausible range of future greenhouse gas emissions. This update is based on a review of the literature and the development of a database of over 400 global and regional scenarios; 190 of these extend from 1900 to 2100 and thus fed into the development of the narrative scenarios and storylines. Based on the literature review, a set of four alternative scenario families, comprising a total of 40 emissions scenarios, has been developed. Each scenario family includes a narrative storyline which describes a demographic, social, economic, technological, environmental and policy future. Characteristic features of each of the four families are summarized and a comparison is made between the IS92 and SRES scenarios. One of the main conclusions of this recent scenario construction effort is the realization that alternative combinations of main scenario driving forces can lead to similar levels of GHG emissions by the end of the 21st century, and that scenarios with different underlying assumptions can result in very similar climate change.

  7. Realistic phantoms to characterize dosimetry in pediatric CT

    Energy Technology Data Exchange (ETDEWEB)

    Carver, Diana E.; Kost, Susan D.; Fraser, Nicholas D.; Pickens, David R.; Price, Ronald R.; Stabin, Michael G. [Vanderbilt University Medical Center, Department of Radiology and Radiological Sciences, Nashville, TN (United States); Segars, W.P. [Duke University, Carl E. Ravin Advanced Imaging Laboratories, Durham, NC (United States)

    2017-05-15

    The estimation of organ doses and effective doses for children receiving CT examinations is of high interest. Newer, more realistic anthropomorphic body models can provide information on individual organ doses and improved estimates of effective dose. Previously developed body models representing 50th-percentile individuals at reference ages (newborn, 1, 5, 10 and 15 years) were modified to represent 10th, 25th, 75th and 90th height percentiles for both genders and an expanded range of ages (3, 8 and 13 years). We calculated doses for 80 pediatric reference phantoms from simulated chest-abdomen-pelvis exams on a model of a Philips Brilliance 64 CT scanner. Individual organ and effective doses were normalized to dose-length product (DLP) and fit as a function of body diameter. We calculated organ and effective doses for 80 reference phantoms and plotted them against body diameter. The data were well fit with an exponential function. We found DLP-normalized organ dose to correlate strongly with body diameter (R{sup 2}>0.95 for most organs). Similarly, we found a very strong correlation with body diameter for DLP-normalized effective dose (R{sup 2}>0.99). Our results were compared to other studies and we found average agreement of approximately 10%. We provide organ and effective doses for a total of 80 reference phantoms representing normal-stature children ranging in age and body size. This information will be valuable in replacing the types of vendor-reported doses available. These data will also permit the recording and tracking of individual patient doses. Moreover, this comprehensive dose database will facilitate patient matching and the ability to predict patient-individualized dose prior to examination. (orig.)
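
    The reported exponential relation between DLP-normalized dose and body diameter can be reproduced in a few lines. The coefficients and sample data below are invented for illustration and are not the study's values.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical (effective diameter [cm], DLP-normalized organ dose [mSv per mGy*cm]) pairs.
diameter = np.array([12, 15, 18, 21, 24, 27, 30], dtype=float)
dose_per_dlp = np.array([0.055, 0.041, 0.030, 0.023, 0.017, 0.013, 0.010])

def model(d, a, b):
    # Exponential fit of the form k(d) = a * exp(-b * d), as described in the abstract.
    return a * np.exp(-b * d)

(a, b), _ = curve_fit(model, diameter, dose_per_dlp, p0=(0.1, 0.05))

# Predicted organ dose for a patient with a 20 cm effective diameter and DLP = 300 mGy*cm.
print(model(20.0, a, b) * 300.0)
```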

  8. Generalized Warburg impedance on realistic self-affine fractals ...

    Indian Academy of Sciences (India)

    Administrator

    Generalized Warburg impedance on realistic self-affine fractals: Comparative study of statistically corrugated and isotropic roughness. RAJESH KUMAR and RAMA KANT. Journal of Chemical Sciences, Vol. 121, No. 5, September 2009, pp. 579–588. 1. The expression R_L^c(ω) on page 582, column 2, para 2, after eq. (8) should read as ...

  9. Integrating experimental and numerical methods for a scenario-based quantitative assessment of subsurface energy storage options

    Science.gov (United States)

    Kabuth, Alina; Dahmke, Andreas; Hagrey, Said Attia al; Berta, Márton; Dörr, Cordula; Koproch, Nicolas; Köber, Ralf; Köhn, Daniel; Nolde, Michael; Tilmann Pfeiffer, Wolf; Popp, Steffi; Schwanebeck, Malte; Bauer, Sebastian

    2016-04-01

    In a second example, the option of seasonal hydrogen storage in a deep saline aquifer is considered. The induced thermal and hydraulic multiphase flow processes were simulated. Also, an integrative approach towards geophysical monitoring of gas presence was evaluated by synthetically applying these monitoring methods to the synthetic, yet realistically defined, numerical storage scenarios. Laboratory experiments provided parameterisations of geochemical effects caused by storage gas leakage into shallow aquifers in cases of sealing failure. Ultimately, the analysis of realistically defined scenarios of subsurface energy storage within the ANGUS+ project allows a quantification of the subsurface space claimed by a storage operation and its induced effects. Acknowledgments: This work is part of the ANGUS+ project (www.angusplus.de) and funded by the German Federal Ministry of Education and Research (BMBF) as part of the energy storage initiative "Energiespeicher".

  10. Scenarios for gluino coannihilation

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, John [Theoretical Particle Physics and Cosmology Group, Department of Physics, King’s College London, London, WC2R 2LS United Kingdom (United Kingdom); Theory Division, CERN,Geneva 23, CH-1211 (Switzerland); Evans, Jason L. [School of Physics and Astronomy, University of Minnesota,Minneapolis, MN, 55455 (United States); William I. Fine Theoretical Physics Institute, School of Physics and Astronomy, University of Minnesota, Minneapolis, MN, 55455 (United States); Luo, Feng [Theory Division, CERN,Geneva 23, CH-1211 (Switzerland); Olive, Keith A. [School of Physics and Astronomy, University of Minnesota,Minneapolis, MN, 55455 (United States); William I. Fine Theoretical Physics Institute, School of Physics and Astronomy, University of Minnesota, Minneapolis, MN, 55455 (United States)

    2016-02-11

    We study supersymmetric scenarios in which the gluino is the next-to-lightest supersymmetric particle (NLSP), with a mass sufficiently close to that of the lightest supersymmetric particle (LSP) that gluino coannihilation becomes important. One of these scenarios is the MSSM with soft supersymmetry-breaking squark and slepton masses that are universal at an input GUT renormalization scale, but with non-universal gaugino masses. The other scenario is an extension of the MSSM to include vector-like supermultiplets. In both scenarios, we identify the regions of parameter space where gluino coannihilation is important, and discuss their relations to other regions of parameter space where other mechanisms bring the dark matter density into the range allowed by cosmology. In the case of the non-universal MSSM scenario, we find that the allowed range of parameter space is constrained by the requirement of electroweak symmetry breaking, the avoidance of a charged LSP and the measured mass of the Higgs boson, in particular, as well as the appearance of other dark matter (co)annihilation processes. Nevertheless, LSP masses m{sub χ}≲8 TeV with the correct dark matter density are quite possible. In the case of pure gravity mediation with additional vector-like supermultiplets, changes to the anomaly-mediated gluino mass and the threshold effects associated with these states can make the gluino almost degenerate with the LSP, and we find a similar upper bound.

  11. Scenario group summary

    International Nuclear Information System (INIS)

    Thorndike, A.

    1976-01-01

    A scenario is given for ISABELLE which provides a plausible sequence of events from FY 1980 to 1990. No doubt reality will be quite different. The scenario is based on the construction schedule of the 1976 proposal. Assembly and testing of the accelerator will occur until the end of FY 1983, and the next six years will provide pp interactions for the initial high energy physics research. By 1990 any temporary conditions associated with start-up of ISABELLE should be a thing of the past and all experimental capabilities fully utilized

  12. Scenario group summary

    International Nuclear Information System (INIS)

    Thorndike, A.

    1976-01-01

    The scenario is given which provides a plausible sequence of events for ISABELLE from FY 1980 to 1990. No doubt reality will be quite different. The scenario is based on the construction schedule of the 1976 proposal. Assembly and testing of the accelerator will occur until the end of FY 1983, and the next six years will provide pp interactions for the initial high energy physics research. By 1990 any temporary conditions associated with start-up of ISABELLE should be a thing of the past and all experimental capabilities fully utilized

  13. Modeling the potential distribution of Bacillus anthracis under multiple climate change scenarios for Kazakhstan.

    Directory of Open Access Journals (Sweden)

    Timothy Andrew Joyner

    Full Text Available Anthrax, caused by the bacterium Bacillus anthracis, is a zoonotic disease that persists throughout much of the world in livestock, wildlife, and secondarily infects humans. This is true across much of Central Asia, and particularly the Steppe region, including Kazakhstan. This study employed the Genetic Algorithm for Rule-set Prediction (GARP) to model the current and future geographic distribution of Bacillus anthracis in Kazakhstan based on the A2 and B2 IPCC SRES climate change scenarios using a 5-variable data set at 55 km(2) and 8 km(2) and a 6-variable BioClim data set at 8 km(2). Future models suggest large areas predicted under current conditions may be reduced by 2050 with the A2 model predicting approximately 14-16% loss across the three spatial resolutions. There was greater variability in the B2 models across scenarios predicting approximately 15% loss at 55 km(2), approximately 34% loss at 8 km(2), and approximately 30% loss with the BioClim variables. Only very small areas of habitat expansion into new areas were predicted by either A2 or B2 in any models. Greater areas of habitat loss are predicted in the southern regions of Kazakhstan by A2 and B2 models, while moderate habitat loss is also predicted in the northern regions by either B2 model at 8 km(2). Anthrax disease control relies mainly on livestock vaccination and proper carcass disposal, both of which require adequate surveillance. In many situations, including that of Kazakhstan, vaccine resources are limited, and understanding the geographic distribution of the organism, in tandem with current data on livestock population dynamics, can aid in properly allocating doses. While speculative, contemplating future changes in livestock distributions and B. anthracis spore-promoting environments can be useful for establishing future surveillance priorities. This study may also have broader applications to global public health surveillance relating to other diseases in addition to B

  14. Chemical aspects of cylinder corrosion and a scenario for hole development

    Energy Technology Data Exchange (ETDEWEB)

    Barber, E.J. [Martin Marietta Energy Systems, Oak Ridge, TN (United States)

    1991-12-31

    In June 1990, two cylinders in the depleted UF{sub 6} cylinder storage yards at Portsmouth were discovered to have holes in their walls at the valve-end stiffening ring at a point below the level of the gas-solid interface of the UF{sub 6}. The cylinder with the larger hole, which extended under the stiffening ring, was stacked in a top row 13 years ago. The cylinder with the smaller hole had been stacked in a bottom row 4 years ago. The lifting lugs of the adjacent cylinders pointed directly at the holes. A Cylinder Investigating Committee was appointed to determine the cause or causes of the holes and to assess the implications of these findings. This report contains a listing of the chemically related facts established by the Investigating Committee with the cooperation of the Operations and Technical Support Divisions at the Portsmouth Gaseous Diffusion Plant, the scenario developed to explain these findings and some implications of this scenario. In summary, the interrelated reactions of water, solid UF{sub 6} and iron presented by R. L. Ritter are used to develop a scenario which explains the observations and deductions made during the investigation. The chemical processes are intimately related to the course of the last three of the four stages of hole development. A simple model is proposed which permits semiquantitative prediction of such information as the HF loss rates as a function of time, the rate of hole enlargement, the time to hydrolyze a cylinder of UF{sub 6} and the approximate size of the hole. The scenario suggests that the environmental consequences associated with a developing hole in a depleted UF{sub 6} cylinder are minimal for the first several years but will become significant if too many years pass before detection. The overall environmental picture is presented in more detail elsewhere.

  15. Hypothetical Case and Scenario Description for International Transportation of Spent Nuclear Fuel.

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Adam David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Osborn, Douglas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jones, Katherine A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kalinina, Elena Arkadievna [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cohn, Brian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Thomas, Maikael A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Parks, Mancel Jordan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Parks, Ethan Rutledge [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mohagheghi, Amir H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-12-01

    To support more rigorous analysis on global security issues at Sandia National Laboratories (SNL), there is a need to develop realistic data sets without using "real" data or identifying "real" vulnerabilities, hazards or geopolitically embarrassing shortcomings. In response, an interdisciplinary team led by subject matter experts in SNL's Center for Global Security and Cooperation (CGSC) developed a hypothetical case description. This hypothetical case description assigns various attributes related to international SNF transportation that are representative, illustrative and indicative of "real" characteristics of "real" countries. There is no intent to identify any particular country and any similarity with specific real-world events is purely coincidental. To support the goal of this report to provide a case description (and set of scenarios of concern) for international SNF transportation inclusive of as much "real-world" complexity as possible -- without crossing over into politically sensitive or classified information -- this SAND report provides a subject matter expert-validated (and detailed) description of both technical and political influences on the international transportation of spent nuclear fuel.

  16. Effective realistic interactions for low momentum Hilbert spaces

    International Nuclear Information System (INIS)

    Weber, Dennis

    2012-01-01

    Realistic nucleon-nucleon potentials are an essential ingredient of modern microscopic many-body calculations. These potentials can be represented in two different ways: operator representation or matrix element representation. In operator representation the potential is represented by a set of quantum mechanical operators while in matrix element representation it is defined by the matrix elements in a given basis. Many modern potentials are constructed directly in matrix element representation. While the matrix element representation can be calculated from the operator representation, the determination of the operator representation from the matrix elements is more difficult. Some methods to solve the nuclear many-body problem, such as Fermionic Molecular Dynamics (FMD) or the Green's Function Monte Carlo (GFMC) method, however require explicitly the operator representation of the potential, as they do not work in a fixed many-body basis. It is therefore desirable to derive an operator representation also for the interactions given by matrix elements. In this work a method is presented which allows the derivation of an approximate operator representation starting from the momentum space partial wave matrix elements of the interaction. For that purpose an ansatz for the operator representation is chosen. The parameters in the ansatz are determined by a fit to the partial wave matrix elements. Since a perfect reproduction of the matrix elements in general cannot be achieved with a finite number of operators and the quality of the results depends on the choice of the ansatz, the obtained operator representation is tested in nuclear many-body calculations and the results are compared with those from the initial interaction matrix elements. For the calculation of the nucleon-nucleon scattering phase shifts and the deuteron properties a computer code written within this work is used. For larger nuclei the No Core Shell Model (NCSM) and FMD are applied. The described
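
    As a rough illustration of determining the parameters of an operator ansatz by a fit, the snippet below adjusts the strengths of a few Gaussian radial terms by linear least squares. The target values and the form of the ansatz are invented for the example and do not reproduce the interaction or the fitting procedure of this work.

```python
import numpy as np

# Hypothetical target: a central S-wave potential V(r) sampled on a radial grid,
# standing in for the partial-wave matrix elements of a realistic interaction.
r = np.linspace(0.1, 4.0, 40)                      # fm
v_target = 200.0 * np.exp(-(r / 0.5) ** 2) - 100.0 * np.exp(-(r / 1.2) ** 2)  # MeV, made up

# Ansatz: V(r) = sum_i c_i * exp(-(r / b_i)^2) with fixed ranges b_i; fit the strengths c_i.
ranges = np.array([0.4, 0.8, 1.6])                 # fm, chosen by hand
basis = np.exp(-(r[:, None] / ranges[None, :]) ** 2)
coeffs, *_ = np.linalg.lstsq(basis, v_target, rcond=None)

v_fit = basis @ coeffs
print("strengths (MeV):", coeffs)
print("max deviation (MeV):", np.max(np.abs(v_fit - v_target)))
```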

  17. Constraining the Deforestation History of Europe: Evaluation of Historical Land Use Scenarios with Pollen-Based Land Cover Reconstructions

    Directory of Open Access Journals (Sweden)

    Jed O. Kaplan

    2017-12-01

    Full Text Available Anthropogenic land cover change (ALCC) is the most important transformation of the Earth system that occurred in the preindustrial Holocene, with implications for carbon, water and sediment cycles, biodiversity and the provision of ecosystem services and regional and global climate. For example, anthropogenic deforestation in preindustrial Eurasia may have led to feedbacks to the climate system: both biogeophysical, regionally amplifying winter cold and summer warm temperatures, and biogeochemical, stabilizing atmospheric CO2 concentrations and thus influencing global climate. Quantification of these effects is difficult, however, because scenarios of anthropogenic land cover change over the Holocene vary widely, with increasing disagreement back in time. Because land cover change had such widespread ramifications for the Earth system, it is essential to assess current ALCC scenarios in light of observations and provide guidance on which models are most realistic. Here, we perform a systematic evaluation of two widely-used ALCC scenarios (KK10 and HYDE3.1) in northern and part of central Europe using an independent, pollen-based reconstruction of Holocene land cover (REVEALS). Considering that ALCC in Europe primarily resulted in deforestation, we compare modeled land use with the cover of non-forest vegetation inferred from the pollen data. Though neither land cover change scenario matches the pollen-based reconstructions precisely, KK10 correlates well with REVEALS at the country scale, while HYDE systematically underestimates land use with increasing magnitude with time in the past. Discrepancies between modeled and reconstructed land use are caused by a number of factors, including assumptions of per-capita land use and socio-cultural factors that cannot be predicted on the basis of the characteristics of the physical environment, including dietary preferences, long-distance trade, the location of urban areas and social organization.
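
    The country-scale comparison between an ALCC scenario and a pollen-based reconstruction reduces to comparing two open-land fractions. The numbers below are placeholders, not KK10, HYDE or REVEALS values.

```python
import numpy as np

# Hypothetical open-land (non-forest) fractions for a set of countries at one time slice.
reveals = np.array([0.35, 0.20, 0.50, 0.15, 0.42])   # pollen-based reconstruction
scenario = np.array([0.30, 0.22, 0.44, 0.10, 0.40])  # modeled land-use scenario

r = np.corrcoef(reveals, scenario)[0, 1]             # agreement in spatial pattern
bias = np.mean(scenario - reveals)                   # systematic over/underestimation
print(f"correlation = {r:.2f}, mean bias = {bias:+.2f}")
```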

  18. Scaling up complex interventions: insights from a realist synthesis.

    Science.gov (United States)

    Willis, Cameron D; Riley, Barbara L; Stockton, Lisa; Abramowicz, Aneta; Zummach, Dana; Wong, Geoff; Robinson, Kerry L; Best, Allan

    2016-12-19

    Preventing chronic diseases, such as cancer, cardiovascular disease and diabetes, requires complex interventions, involving multi-component and multi-level efforts that are tailored to the contexts in which they are delivered. Despite an increasing number of complex interventions in public health, many fail to be 'scaled up'. This study aimed to increase understanding of how and under what conditions complex public health interventions may be scaled up to benefit more people and populations. A realist synthesis was conducted and discussed at an in-person workshop involving practitioners responsible for scaling up activities. Realist approaches view causality through the linkages between changes in contexts (C) that activate mechanisms (M), leading to specific outcomes (O) (CMO configurations). To focus this review, three cases of complex interventions that had been successfully scaled up were included: Vibrant Communities, Youth Build USA and Pathways to Education. A search strategy of published and grey literature related to each case was developed, involving searches of relevant databases and nominations from experts. Data extracted from included documents were classified according to CMO configurations within strategic themes. Findings were compared and contrasted with guidance from diffusion theory, and interpreted with knowledge users to identify practical implications and potential directions for future research. Four core mechanisms were identified, namely awareness, commitment, confidence and trust. These mechanisms were activated within two broad scaling up strategies, those of renewing and regenerating, and documenting success. Within each strategy, specific actions to change contexts included building partnerships, conducting evaluations, engaging political support and adapting funding models. These modified contexts triggered the identified mechanisms, leading to a range of scaling up outcomes, such as commitment of new communities, changes in relevant

  19. Economic assessment of energetic scenarios

    International Nuclear Information System (INIS)

    Grandjean, Alain; Bureau, Dominique; Schubert, Katheline; Henriet, Fanny; Maggiar, Nicolas; Criqui, Patrick; Le Teno, Helene; Baumstark, Luc; Crassous, Renaud; Roques, Fabien

    2013-09-01

    This publication gathers contributions proposed by different members of the Economic Council for a Sustainable Development (CEDD) on the issue of energy transition, and more precisely on scenarios elaborated with respect to energy transition. A first set of contributions addresses models of energy transition (assessment of scenario costs to reach a factor 4; the issue of de-carbonation of energy consumption; study of ELECsim, a tool to highlight costs of scenarios of evolution of the electric power system). The second part addresses arbitrations and choice assessment (the importance of social and economic impacts of scenarios; challenges related to the joint definition of the discount rate and of the evolution of carbon value in time; the issue of assessment of the integration of renewable energies into the power system)

  20. Non realist tendencies in new Turkish cinema

    OpenAIRE

    Can, İclal

    2016-01-01

    http://hdl.handle.net/11693/29111 Thesis (M.S.): Bilkent University, Department of Communication and Design, İhsan Doğramacı Bilkent University, 2016. Includes bibliographical references (leaves 113-123). The realist tendency, which had been dominant in cinema, became more apparent with Italian neorealism, affecting other national cinemas to a large extent. With changing and developing socio-economic and cultural dynamics, realism has gradually stopped being a natural const...

  1. Quantum cryptography: towards realization in realistic conditions

    Energy Technology Data Exchange (ETDEWEB)

    Imoto, M; Koashi, M; Shimizu, K [NTT Basic Research Laboratories, 3-1 Morinosato-Wakamiya, Atsugi-shi, Kanagawa 243-01 (Japan); Huttner, B [Universite de Geneve, GAP-optique, 20, Rue de l'Ecole de Medecine CH1211, Geneve 4 (Switzerland)

    1997-05-11

    Many quantum cryptography schemes have been proposed based on assumptions such as no transmission loss, no measurement error, and an ideal single-photon generator. We have been trying to develop a theory of quantum cryptography that takes realistic conditions into account. Among such attempts, we propose quantum cryptography with coherent states, quantum cryptography with two-photon interference, and a generalization of two-state cryptography to two-mixed-state cases. (author) 15 refs., 1 fig., 1 tab.

  2. The Forecast Scenarios of Development of the National Economy in the Context of the Need to Improve the «Cost of Living»

    Directory of Open Access Journals (Sweden)

    Kulakov Gennady T.

    2017-04-01

    Full Text Available The article is aimed at elaborating and materializing forecast scenarios for the development of the national economy in the context of substantiating the feasibility of improving the «cost of living», understood as the equivalent of the liability of public authorities for the value of human life. The article researches the phenomenon of the «cost of living» in the context of sustainable, innovative development of a socially oriented economy as an axis for developing its forecast scenarios. Focus has been set on the complementarity of the terms «cost of living» and «sustainable development» in the context of satisfying the vital interests of the population of Ukraine. It has been suggested that wages, as an equivalent of the «cost of living», should be accounted not as costs but as value added; however, the growth rate of wages must not outpace the growth rate of labor productivity. For the first time, baseline scenarios of development of the national economy (pessimistic, realistic, and optimistic) have been elaborated on the basis of an interdisciplinary and intersectoral approach, the index method, and an upgraded human development index.

  3. Nuclear Security Futures Scenarios

    International Nuclear Information System (INIS)

    Keller, Elizabeth James Kistin; Warren, Drake Edward; Hayden, Nancy Kay; Passell, Howard D.; Malczynski, Leonard A.; Backus, George A.

    2017-01-01

    This report provides an overview of the scenarios used in strategic futures workshops conducted at Sandia on September 21 and 29, 2016. The workshops, designed and facilitated by analysts in Center 100, used scenarios to enable thought leaders to think collectively about the changing aspects of global nuclear security and the potential implications for the US Government and Sandia National Laboratories.

  4. Nuclear Security Futures Scenarios.

    Energy Technology Data Exchange (ETDEWEB)

    Keller, Elizabeth James Kistin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Warren, Drake Edward [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hayden, Nancy Kay [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Passell, Howard D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Malczynski, Leonard A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Backus, George A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-01-01

    This report provides an overview of the scenarios used in strategic futures workshops conducted at Sandia on September 21 and 29, 2016. The workshops, designed and facilitated by analysts in Center 100, used scenarios to enable thought leaders to think collectively about the changing aspects of global nuclear security and the potential implications for the US Government and Sandia National Laboratories.

  5. Rethinking Mathematics Teaching in Liberia: Realistic Mathematics Education

    Science.gov (United States)

    Stemn, Blidi S.

    2017-01-01

    In some African cultures, the concept of division does not necessarily mean sharing money or an item equally. How an item is shared might depend on the ages of the individuals involved. This article describes the use of the Realistic Mathematics Education (RME) approach to teach division word problems involving money in a 3rd-grade class in…

  6. Realistic full wave modeling of focal plane array pixels.

    Energy Technology Data Exchange (ETDEWEB)

    Campione, Salvatore [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Warne, Larry K. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Jorgenson, Roy E. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Davids, Paul [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Applied Photonic Microsystems Dept.; Peters, David W. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Applied Photonic Microsystems Dept.

    2017-11-01

    Here, we investigate full-wave simulations of realistic implementations of multifunctional nanoantenna enabled detectors (NEDs). We focus on a 2x2 pixelated array structure that supports two wavelengths of operation. We design each resonating structure independently using full-wave simulations with periodic boundary conditions mimicking the whole infinite array. We then construct a supercell made of a 2x2 pixelated array with periodic boundary conditions mimicking the full NED; in this case, however, each pixel comprises 10-20 antennas per side. In this way, the cross-talk between contiguous pixels is accounted for in our simulations. We observe that, even though there are finite extent effects, the pixels work as designed, each responding at the respective wavelength of operation. This allows us to stress that realistic simulations of multifunctional NEDs need to be performed to verify the design functionality by taking into account finite extent and cross-talk effects.

  7. Facilities upgrade for natural forces: traditional vs. realistic approach

    International Nuclear Information System (INIS)

    Terkun, V.

    1985-01-01

    The traditional method utilized for upgrading existing buildings and equipment involves the following steps: perform a structural study using finite element analysis and some in situ testing; compare predicted member forces/stresses to material code allowables; determine strengthening schemes for those structural members judged to be weak; and estimate the cost of the required upgrades. This approach will result in structural modifications that are not only conservative but very expensive as well. The realistic structural evaluation approach uses traditional data to predict structural weaknesses as a final step. Next, using the considerable information now available for buildings and equipment exposed to natural hazards, engineering judgments about the structures being evaluated can be made with a great deal of confidence. This approach does not eliminate conservatism entirely, but it does reduce it to a reasonable and realistic level. As a result, the upgrade cost goes down without compromising the low risk necessary for vital facilities.

  8. Assessment of nuclear power scenarios allowing for matrix behavior in radiological impact modeling of disposal scenarios

    International Nuclear Information System (INIS)

    Tronche, E.; Boussier, H.

    2000-01-01

    Under the provisions of the 1991 French radioactive waste management law, various fuel cycle scenarios will be assessed and compared in terms of feasibility, flexibility, cost, and ultimate waste radio-toxic inventory. The latter criterion may be further broken down into 'potential radio-toxic inventory' (the radio-toxic inventory of all the radionuclides produced) and 'residual radio-toxic inventory' (the radionuclide fraction reaching the biosphere after migration from the repository). The innovative scientific contribution of this study is to consider a third type of radio-toxic inventory: the potential radio-toxic inventory after conditioning, i.e. taking into account the containment capacity of the radionuclide conditioning matrices. The matrix fraction subjected to alteration over time determines the potential for radionuclide release, hence the notion of the potential radio-toxic inventory after conditioning. An initial comparison of possible scenarios is proposed by considering orders of magnitude for the radionuclide containment capacity of the disposal matrices and for their mobilization potential. All the scenarios investigated are normalized to the same annual electric power production so that a legitimate comparison can be established for the ultimate wasteform produced per year of operation. This approach reveals significant differences among the scenarios considered that do not appear when only the raw potential radio-toxic inventory is taken into account. The matrix containment performance has a decisive effect on the final impact of a given scenario or type of scenario. Pu recycling scenarios thus reduce the potential radio-toxicity by roughly a factor of 50 compared with an open cycle; the gain rises to a factor of about 300 for scenarios in which Pu and the minor actinides are recycled. Interestingly, the results obtained by the use of a dedicated containment matrix for the minor actinides in a scenario limited to Pu recycling were comparable to

  9. Ethoprophos fate on soil-water interface and effects on non-target terrestrial and aquatic biota under Mediterranean crop-based scenarios.

    Science.gov (United States)

    Leitão, Sara; Moreira-Santos, Matilde; Van den Brink, Paul J; Ribeiro, Rui; José Cerejeira, M; Sousa, José Paulo

    2014-05-01

    The present study aimed to assess the environmental fate of the insecticide and nematicide ethoprophos in the soil-water interface following the pesticide application in simulated maize and potato crops under Mediterranean agricultural conditions, particularly of irrigation. Focus was given to the soil-water transfer pathways (leaching and runoff), to the pesticide transport in soil between pesticide application (crop row) and non-application areas (between crop rows), as well as to toxic effects of the various matrices on terrestrial and aquatic biota. A semi-field methodology mimicking a "worst-case" ethoprophos application (twice the recommended dosage for maize and potato crops: 100% concentration v/v) in agricultural field situations was used, in order to mimic a possible misuse by the farmer under realistic conditions. A rainfall was simulated under a slope of 20° for both crop-based scenarios. Soil and water samples were collected for the analysis of pesticide residues. Ecotoxicity of soil and aquatic samples was assessed by performing lethal and sublethal bioassays with organisms from different trophic levels: the collembolan Folsomia candida, the earthworm Eisenia andrei and the cladoceran Daphnia magna. Although the majority of ethoprophos sorbed to the soil application area, pesticide concentrations were detected in all water matrices illustrating pesticide transfer pathways of water contamination between environmental compartments. Leaching to groundwater proved to be an important transfer pathway of ethoprophos under both crop-based scenarios, as it resulted in high pesticide concentration in leachates from Maize (130µgL(-1)) and Potato (630µgL(-1)) crop scenarios, respectively. Ethoprophos application at the Potato crop scenario caused more toxic effects on terrestrial and aquatic biota than at the Maize scenario at the recommended dosage and lower concentrations. In both crop-based scenarios, ethoprophos moved with the irrigation water flow to the

  10. Plasmon response in K, Na and Li clusters: systematics using the separable random-phase approximation with pseudo-Hamiltonians

    International Nuclear Information System (INIS)

    Kleinig, W.; Nesterenko, V.O.; Reinhard, P.-G.; Serra, Ll.

    1998-01-01

    The systematics of the plasmon response in spherical K, Na and Li clusters in a wide size region (8≤N≤440) is studied. We have considered two simplifying approximations whose validity has been established previously. First, a separable approach to the random-phase approximation is used. This involves an expansion of the residual interaction into a sum of separable terms until convergence is reached. Second, the electron-ion interaction is modelled using the pseudo-Hamiltonian jellium model (PHJM), which includes nonlocal effects by means of realistic atomic pseudo-Hamiltonians. In cases where nonlocal effects are negligible, the Structure Averaged Jellium Model (SAJM) has been used. Good agreement with available experimental data is achieved for K, Na (using the SAJM) and small Li clusters (invoking the PHJM). The trends for peak position and width are generally well reproduced, even up to details of the Landau fragmentation in several clusters. Less good agreement, however, is found for large Li clusters. This remains an open question.

  11. Autonomic Closure for Turbulent Flows Using Approximate Bayesian Computation

    Science.gov (United States)

    Doronina, Olga; Christopher, Jason; Hamlington, Peter; Dahm, Werner

    2017-11-01

    Autonomic closure is a new technique for achieving fully adaptive and physically accurate closure of coarse-grained turbulent flow governing equations, such as those solved in large eddy simulations (LES). Although autonomic closure has been shown in recent a priori tests to more accurately represent unclosed terms than do dynamic versions of traditional LES models, the computational cost of the approach makes it challenging to implement for simulations of practical turbulent flows at realistically high Reynolds numbers. The optimization step used in the approach introduces large matrices that must be inverted and is highly memory intensive. In order to reduce memory requirements, here we propose to use approximate Bayesian computation (ABC) in place of the optimization step, thereby yielding a computationally-efficient implementation of autonomic closure that trades memory-intensive for processor-intensive computations. The latter challenge can be overcome as co-processors such as general purpose graphical processing units become increasingly available on current generation petascale and exascale supercomputers. In this work, we outline the formulation of ABC-enabled autonomic closure and present initial results demonstrating the accuracy and computational cost of the approach.
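
    A minimal sketch of how rejection-based approximate Bayesian computation can stand in for an optimization step: candidate model coefficients are sampled from a prior, the resulting prediction is compared with reference data through a distance, and only candidates within a tolerance are kept. The toy "unclosed term", prior and tolerance below are assumptions, not the authors' formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy reference data standing in for a resolved-scale estimate of an unclosed term.
x = np.linspace(0.0, 1.0, 64)
true_coeffs = np.array([0.8, -0.3])
reference = (true_coeffs[0] * np.sin(2 * np.pi * x)
             + true_coeffs[1] * x
             + 0.02 * rng.normal(size=x.size))

def model(coeffs):
    # Assumed parametric form of the closure term.
    return coeffs[0] * np.sin(2 * np.pi * x) + coeffs[1] * x

def abc_rejection(n_samples=20000, tol=0.05):
    accepted = []
    for _ in range(n_samples):
        candidate = rng.uniform(-1.0, 1.0, size=2)           # prior over coefficients
        distance = np.sqrt(np.mean((model(candidate) - reference) ** 2))
        if distance < tol:                                    # keep candidates close to the data
            accepted.append(candidate)
    return np.array(accepted)

posterior = abc_rejection()
print("accepted samples:", len(posterior))
print("posterior mean coefficients:", posterior.mean(axis=0))
```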

  12. Climate scenarios for California

    Science.gov (United States)

    Cayan, Daniel R.; Maurer, Ed; Dettinger, Mike; Tyree, Mary; Hayhoe, Katharine; Bonfils, Celine; Duffy, Phil; Santer, Ben

    2006-01-01

    Possible future climate changes in California are investigated from a varied set of climate change model simulations. These simulations, conducted by three state-of-the-art global climate models, provide trajectories from three greenhouse gas (GHG) emission scenarios. These scenarios and the resulting climate simulations are not “predictions,” but rather are a limited sample from among the many plausible pathways that may affect California’s climate. Future GHG concentrations are uncertain because they depend on future social, political, and technological pathways, and thus the IPCC has produced four “families” of emission scenarios. To explore some of these uncertainties, emissions scenarios A2 (medium-high emissions) and B1 (low emissions) were selected from the current IPCC Fourth climate assessment, which provides several recent model simulations driven by A2 and B1 emissions. The global climate model simulations addressed here were from PCM1, the Parallel Climate Model from the National Center for Atmospheric Research (NCAR) and U.S. Department of Energy (DOE) group, and CM2.1 from the National Oceanic and Atmospheric Administration (NOAA) Geophysical Fluid Dynamics Laboratory (GFDL).

  13. The Electrostatic Instability for Realistic Pair Distributions in Blazar/EBL Cascades

    Science.gov (United States)

    Vafin, S.; Rafighi, I.; Pohl, M.; Niemiec, J.

    2018-04-01

    This work revisits the electrostatic instability for blazar-induced pair beams propagating through the intergalactic medium (IGM) using linear analysis and PIC simulations. We study the impact of the realistic distribution function of pairs resulting from the interaction of high-energy gamma-rays with the extragalactic background light. We present analytical and numerical calculations of the linear growth rate of the instability for the arbitrary orientation of wave vectors. Our results explicitly demonstrate that the finite angular spread of the beam dramatically affects the growth rate of the waves, leading to the fastest growth for wave vectors quasi-parallel to the beam direction and a growth rate at oblique directions that is only a factor of 2–4 smaller compared to the maximum. To study the nonlinear beam relaxation, we performed PIC simulations that take into account a realistic wide-energy distribution of beam particles. The parameters of the simulated beam-plasma system provide an adequate physical picture that can be extrapolated to realistic blazar-induced pairs. In our simulations, the beam loses only 1% of its energy, and we analytically estimate that the beam would lose its total energy over about 100 simulation times. An analytical scaling is then used to extrapolate the parameters of realistic blazar-induced pair beams. We find that they can dissipate their energy slightly faster by the electrostatic instability than through inverse-Compton scattering. The uncertainties arising from, e.g., details of the primary gamma-ray spectrum are too large to make firm statements for individual blazars, and an analysis based on their specific properties is required.

  14. Scenario development methods and practice

    International Nuclear Information System (INIS)

    2001-01-01

    The safe management of radioactive waste is an essential aspect of all nuclear power programmes. Although a general consensus has been reached in OECD countries on the use of geological repositories for the disposal of high-level radioactive waste, analysis of the long-term safety of these repositories, using performance assessment and other tools, is required prior to implementation. The initial stage in developing a repository safety assessment is the identification of all factors that may be relevant to the long-term safety of the repository and their combination to form scenarios. This must be done in a systematic and transparent way in order to assure the regulatory authorities that nothing important has been forgotten. Scenario development has become the general term used to describe the collection and organisation of the scientific and technical information necessary to assess the long-term performance or safety of radioactive waste disposal systems. This includes the identification of the relevant features, events and processes (FEPs), the synthesis of broad models of scientific understanding, and the selection of cases to be calculated. Scenario development provides the overall framework in which the cases and their calculated consequences can be discussed, including biases or shortcomings due to omissions or lack of knowledge. The NEA Workshop on Scenario Development was organised in Madrid, in May 1999, with the objective of reviewing developments in scenario methodologies and applications in safety assessments since 1992. The outcome of this workshop is the subject of this book. It is a review of developments in scenario methodologies based on a large body of practical experience in safety assessments. It will be of interest to radioactive waste management experts as well as to other specialists involved in the development of scenario methodologies. (author)

  15. Approximate symmetries of Hamiltonians

    Science.gov (United States)

    Chubb, Christopher T.; Flammia, Steven T.

    2017-08-01

    We explore the relationship between approximate symmetries of a gapped Hamiltonian and the structure of its ground space. We start by considering approximate symmetry operators, defined as unitary operators whose commutators with the Hamiltonian have norms that are sufficiently small. We show that approximate symmetry operators can be restricted to the ground space while approximately preserving certain mutual commutation relations. We generalize the Stone-von Neumann theorem to matrices that approximately satisfy the canonical (Heisenberg-Weyl-type) commutation relations and use this to show that approximate symmetry operators can certify the degeneracy of the ground space even though they only approximately form a group. Importantly, the notions of "approximate" and "small" are all independent of the dimension of the ambient Hilbert space and depend only on the degeneracy in the ground space. Our analysis additionally holds for any gapped band of sufficiently small width in the excited spectrum of the Hamiltonian, and we discuss applications of these ideas to topological quantum phases of matter and topological quantum error correcting codes. Finally, in our analysis, we also provide an exponential improvement upon bounds concerning the existence of shared approximate eigenvectors of approximately commuting operators under an added normality constraint, which may be of independent interest.
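
    In symbols, and paraphrasing the abstract rather than quoting the paper's precise statements, the central definition can be written as follows (the notation is assumed here, not taken from the paper):

```latex
% A unitary U is an approximate symmetry of a gapped Hamiltonian H when its
% commutator with H is small in operator norm,
\[
  \big\| \, [U, H] \, \big\| \;=\; \big\| \, U H - H U \, \big\| \;\le\; \epsilon ,
\]
% with \epsilon small relative to the spectral gap. Pairs of such operators that
% approximately satisfy Heisenberg--Weyl-type relations,
\[
  U_1 U_2 \;\approx\; e^{i\phi}\, U_2 U_1 ,
\]
% can, once restricted to the ground space, certify its degeneracy even though
% they only approximately form a group.
```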

  16. A comparison of approximation techniques for variance-based sensitivity analysis of biochemical reaction systems

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2010-05-01

    Full Text Available Background: Sensitivity analysis is an indispensable tool for the analysis of complex systems. In a recent paper, we have introduced a thermodynamically consistent variance-based sensitivity analysis approach for studying the robustness and fragility properties of biochemical reaction systems under uncertainty in the standard chemical potentials of the activated complexes of the reactions and the standard chemical potentials of the molecular species. In that approach, key sensitivity indices were estimated by Monte Carlo sampling, which is computationally very demanding and impractical for large biochemical reaction systems. Computationally efficient algorithms are needed to make variance-based sensitivity analysis applicable to realistic cellular networks, modeled by biochemical reaction systems that consist of a large number of reactions and molecular species. Results: We present four techniques, derivative approximation (DA), polynomial approximation (PA), Gauss-Hermite integration (GHI), and orthonormal Hermite approximation (OHA), for analytically approximating the variance-based sensitivity indices associated with a biochemical reaction system. By using a well-known model of the mitogen-activated protein kinase signaling cascade as a case study, we numerically compare the approximation quality of these techniques against traditional Monte Carlo sampling. Our results indicate that, although DA is computationally the most attractive technique, special care should be exercised when using it for sensitivity analysis, since it may only be accurate at low levels of uncertainty. On the other hand, PA, GHI, and OHA are computationally more demanding than DA but can work well at high levels of uncertainty. GHI results in a slightly better accuracy than PA, but it is more difficult to implement. OHA produces the most accurate approximation results and can be implemented in a straightforward manner. It turns out that the computational cost of the
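
    The derivative approximation (DA) idea can be conveyed with a short sketch: linearize the model output around the nominal parameter values and apportion the output variance among the parameters. The reaction-rate model below is a toy stand-in, not the MAPK cascade used in the paper.

```python
import numpy as np

# Toy model output: a steady-state-like response as a function of three rate parameters.
def output(k):
    k1, k2, k3 = k
    return (k1 * k3) / (k2 + k3)

k_nominal = np.array([2.0, 1.0, 0.5])
sigma = 0.1 * k_nominal                     # assumed parameter standard deviations

# Central finite differences approximate the partial derivatives at the nominal point.
grad = np.zeros(3)
h = 1e-5
for i in range(3):
    dk = np.zeros(3)
    dk[i] = h
    grad[i] = (output(k_nominal + dk) - output(k_nominal - dk)) / (2 * h)

# First-order (derivative approximation) sensitivity indices:
#   S_i ~ (df/dk_i)^2 sigma_i^2 / sum_j (df/dk_j)^2 sigma_j^2
contrib = (grad * sigma) ** 2
S = contrib / contrib.sum()
print("DA sensitivity indices:", np.round(S, 3))
```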

  17. Carbon tax scenarios and their effects on the Irish energy sector

    International Nuclear Information System (INIS)

    Di Cosmo, Valeria; Hyland, Marie

    2013-01-01

    In this paper we use annual time series data from 1960 to 2008 to estimate the long run price and income elasticities underlying energy demand in Ireland. The Irish economy is divided into five sectors: residential, industrial, commercial, agricultural and transport, and separate energy demand equations are estimated for all sectors. Energy demand is broken down by fuel type, and price and income elasticities are estimated for the primary fuels in the Irish fuel mix. Using the estimated price and income elasticities we forecast Irish sectoral energy demand out to 2025. The share of electricity in the Irish fuel mix is predicted to grow over time, as the share of carbon intensive fuels such as coal, oil and peat, falls. The share of electricity in total energy demand grows most in the industrial and commercial sectors, while oil remains an important fuel in the residential and transport sectors. Having estimated the baseline forecasts, two different carbon tax scenarios are imposed and the impact of these scenarios on energy demand, carbon dioxide emissions, and government revenue is assessed. If it is assumed that the level of the carbon tax will track the futures price of carbon under the EU-ETS, the carbon tax will rise from €21.50 per tonne CO2 in 2012 (the first year forecasted) to €41 in 2025. Results show that under this scenario total emissions would be reduced by approximately 861,000 tonnes of CO2 in 2025 relative to a zero carbon tax scenario, and that such a tax would generate €1.1 billion in revenue in the same year. We also examine a high tax scenario under which emissions reductions and revenue generated will be greater. Finally, in order to assess the macroeconomic effects of a carbon tax, the carbon tax scenarios were run in HERMES, the ESRI's medium-term macroeconomic model. The results from HERMES show that a carbon tax of €41 per tonne CO2 would lead to a 0.21% contraction in GDP, and a 0.08% reduction in employment. A higher carbon
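
    A constant-elasticity demand equation of the kind estimated in such studies can be used to sketch how a carbon tax feeds through to demand and emissions. The elasticities, baseline price, emission factor and income growth below are illustrative numbers only, not the paper's estimates.

```python
# Constant-elasticity demand:  D = D0 * (P / P0)**eps_price * (Y / Y0)**eps_income
D0 = 100.0                          # baseline sectoral energy demand in PJ (made up)
P0 = 12.0                           # baseline fuel price in EUR/GJ (made up)
eps_price, eps_income = -0.4, 0.8   # assumed long-run elasticities
income_growth = 1.25                # assumed cumulative income growth to the horizon year

def project(carbon_tax_eur_per_t):
    co2_per_gj = 0.07               # tCO2 per GJ, an illustrative fossil-fuel average
    price = P0 + carbon_tax_eur_per_t * co2_per_gj      # tax raises the effective price
    demand = D0 * (price / P0) ** eps_price * income_growth ** eps_income
    emissions = demand * 70.0       # ktCO2, using 70 ktCO2 per PJ for the same fuel mix
    return demand, emissions

for tax in (0.0, 21.5, 41.0):       # EUR per tonne CO2, mirroring the scenario levels
    d, e = project(tax)
    print(f"tax {tax:5.1f} EUR/t: demand {d:6.1f} PJ, emissions {e:7.0f} ktCO2")
```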

  18. Climate change scenarios and key climate indices in the Swiss Alpine region

    Science.gov (United States)

    Zubler, Elias; Croci-Maspoli, Mischa; Frei, Christoph; Liniger, Mark; Scherrer, Simon; Appenzeller, Christof

    2013-04-01

    For climate adaptation, and to support climate mitigation policy, it is of utmost importance to demonstrate the consequences of climate change on a local level and in user-oriented quantities. Here, a framework is presented to apply the Swiss national climate change scenarios CH2011 to climate indices with direct relevance to applications, such as tourism, transportation, agriculture and health. This framework provides results at a high spatial and temporal resolution and can also be applied in mountainous regions such as the Alps. Results are shown for some key indices, such as the number of summer days and tropical nights, growing season length, number of frost days, heating and cooling degree days, and the number of days with fresh snow. Particular focus is given to changes in the vertical distribution for the future periods 2020-2049, 2045-2074 and 2070-2099 relative to the reference period 1980-2009 for the A1B, A2 and RCP3PD scenarios. The number of days with fresh snow is approximated using a combination of temperature and precipitation as proxies. Some findings for the latest scenario period are: (1) a doubling of the number of summer days by the end of the century under the business-as-usual scenario A2, (2) tropical nights appear above 1500 m asl, (3) the number of frost days may be reduced by more than 3 months at altitudes higher than 2500 m, (4) an overall reduction of heating degree days of about 30% by the end of the century, but on the other hand an increase in cooling degree days in warm seasons, and (5) the number of days with fresh snow tends to go towards zero at low altitudes. In winter, there is little change in snowfall above 2000 m asl (roughly -3 days) in all scenarios. The largest impact on snowfall is found along the Northern Alpine flank and the Jura (-10 days or roughly -50% in A1B for the winter season). It is also highlighted that the future projections for all indices strongly depend on the chosen scenario and on model uncertainty.
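
    Most of the indices discussed reduce to simple threshold counts or degree-day sums over a daily temperature series. The sketch below uses common threshold definitions, which may differ in detail from those adopted in CH2011, and a synthetic temperature series invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic daily maximum/minimum temperatures for one year (degrees C), for illustration only.
doy = np.arange(365)
tmax = 12 + 12 * np.sin(2 * np.pi * (doy - 100) / 365) + rng.normal(0, 3, 365)
tmin = tmax - 8

summer_days     = int(np.sum(tmax >= 25.0))     # days with Tmax >= 25 C
tropical_nights = int(np.sum(tmin >= 20.0))     # nights with Tmin >= 20 C
frost_days      = int(np.sum(tmin < 0.0))       # days with Tmin < 0 C

tmean = (tmax + tmin) / 2
heating_degree_days = float(np.sum(np.clip(20.0 - tmean, 0.0, None)))  # base 20 C (assumed)
cooling_degree_days = float(np.sum(np.clip(tmean - 18.3, 0.0, None)))  # base 18.3 C (assumed)

print(summer_days, tropical_nights, frost_days,
      round(heating_degree_days), round(cooling_degree_days))
```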

  19. DendroBLAST: approximate phylogenetic trees in the absence of multiple sequence alignments.

    Science.gov (United States)

    Kelly, Steven; Maini, Philip K

    2013-01-01

    The rapidly growing availability of genome information has created considerable demand for both fast and accurate phylogenetic inference algorithms. We present a novel method called DendroBLAST for reconstructing phylogenetic dendrograms/trees from protein sequences using BLAST. This method differs from other methods by incorporating a simple model of sequence evolution to test the effect of introducing sequence changes on the reliability of the bipartitions in the inferred tree. Using realistic simulated sequence data we demonstrate that this method produces phylogenetic trees that are more accurate than other commonly-used distance based methods though not as accurate as maximum likelihood methods from good quality multiple sequence alignments. In addition to tests on simulated data, we use DendroBLAST to generate input trees for a supertree reconstruction of the phylogeny of the Archaea. This independent analysis produces an approximate phylogeny of the Archaea that has both high precision and recall when compared to previously published analysis of the same dataset using conventional methods. Taken together these results demonstrate that approximate phylogenetic trees can be produced in the absence of multiple sequence alignments, and we propose that these trees will provide a platform for improving and informing downstream bioinformatic analysis. A web implementation of the DendroBLAST method is freely available for use at http://www.dendroblast.com/.
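
    The general recipe of turning pairwise similarity scores into an approximate tree, without a multiple sequence alignment, can be sketched as follows. This uses average-linkage clustering from SciPy as a simple stand-in for a distance-based tree construction; it is not the DendroBLAST algorithm itself, and the score matrix is invented.

```python
import numpy as np
from scipy.cluster.hierarchy import average, dendrogram
from scipy.spatial.distance import squareform

taxa = ["A", "B", "C", "D", "E"]

# Hypothetical normalized pairwise alignment scores in [0, 1] (self-score = 1).
scores = np.array([
    [1.00, 0.82, 0.75, 0.40, 0.38],
    [0.82, 1.00, 0.78, 0.42, 0.37],
    [0.75, 0.78, 1.00, 0.45, 0.41],
    [0.40, 0.42, 0.45, 1.00, 0.70],
    [0.38, 0.37, 0.41, 0.70, 1.00],
])

# Convert similarities to distances; -log of the normalized score is one common choice.
dist = -np.log(np.clip(scores, 1e-9, 1.0))
np.fill_diagonal(dist, 0.0)

# Average-linkage (UPGMA-like) clustering on the condensed distance matrix.
tree = average(squareform(dist, checks=False))
print(dendrogram(tree, labels=taxa, no_plot=True)["ivl"])   # leaf order of the resulting tree
```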

  20. DendroBLAST: approximate phylogenetic trees in the absence of multiple sequence alignments.

    Directory of Open Access Journals (Sweden)

    Steven Kelly

    Full Text Available The rapidly growing availability of genome information has created considerable demand for both fast and accurate phylogenetic inference algorithms. We present a novel method called DendroBLAST for reconstructing phylogenetic dendrograms/trees from protein sequences using BLAST. This method differs from other methods by incorporating a simple model of sequence evolution to test the effect of introducing sequence changes on the reliability of the bipartitions in the inferred tree. Using realistic simulated sequence data we demonstrate that this method produces phylogenetic trees that are more accurate than other commonly-used distance based methods though not as accurate as maximum likelihood methods from good quality multiple sequence alignments. In addition to tests on simulated data, we use DendroBLAST to generate input trees for a supertree reconstruction of the phylogeny of the Archaea. This independent analysis produces an approximate phylogeny of the Archaea that has both high precision and recall when compared to previously published analysis of the same dataset using conventional methods. Taken together these results demonstrate that approximate phylogenetic trees can be produced in the absence of multiple sequence alignments, and we propose that these trees will provide a platform for improving and informing downstream bioinformatic analysis. A web implementation of the DendroBLAST method is freely available for use at http://www.dendroblast.com/.

  1. Measurable realistic image-based 3D mapping

    Science.gov (United States)

    Liu, W.; Wang, J.; Wang, J. J.; Ding, W.; Almagbile, A.

    2011-12-01

    Maps with 3D visual models are becoming a remarkable feature of 3D map services. High-resolution image data is obtained for the construction of 3D visualized models. The 3D map not only provides the capabilities of 3D measurement and knowledge mining, but also provides a virtual experience of places of interest, as demonstrated in Google Earth. Applications of 3D maps are expanding into the areas of architecture, property management, and urban environment monitoring. However, the reconstruction of high-quality 3D models is time consuming, and requires robust hardware and powerful software to handle the enormous amount of data. This is especially true for the automatic generation of 3D models and the representation of complicated surfaces, which still need improvements in visualisation techniques. The shortcoming of 3D model-based maps is their limited coverage of detail, since a user can only view and measure objects that are already modelled in the virtual environment. This paper proposes and demonstrates a 3D map concept that is realistic and image-based, and that enables geometric measurements and geo-location services. Additionally, image-based 3D maps provide more detailed information about the real world than 3D model-based maps. The image-based 3D maps use geo-referenced stereo images or panoramic images. The geometric relationships between objects in the images can be resolved from the geometric model of the stereo images. The panoramic function makes 3D maps more interactive with users and also creates an immersive experience. In fact, unmeasurable image-based 3D maps already exist, such as Google Street View, but they only provide virtual experiences in terms of photos; topographic and terrain attributes, such as shapes and heights, are omitted. This paper also discusses the potential for using a low-cost land Mobile Mapping System (MMS) to implement realistic image-based 3D mapping, and evaluates the positioning accuracy that a measureable
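
    The measurability claim rests on standard stereo geometry: with a rectified, geo-referenced stereo pair and known camera parameters, depth follows from disparity as Z = f*B/d. The sketch below applies that textbook pinhole relation with hypothetical camera values; it is not the paper's specific geometric model.

      # Depth from disparity for a rectified pinhole stereo pair. The camera values
      # are hypothetical and the relation is the textbook one, not the paper's model.
      def stereo_point(xl, yl, xr, f=1500.0, B=0.5, cx=960.0, cy=540.0):
          """Left/right pixel coordinates of one point -> (X, Y, Z) in metres.

          f: focal length in pixels, B: stereo baseline in metres,
          (cx, cy): principal point in pixels; images are assumed rectified."""
          d = xl - xr                   # disparity in pixels
          if d <= 0:
              raise ValueError("disparity must be positive for a point in front of the cameras")
          Z = f * B / d                 # depth along the optical axis
          X = (xl - cx) * Z / f         # lateral offset
          Y = (yl - cy) * Z / f         # vertical offset
          return X, Y, Z

      # A point seen at x = 1010 px (left) and x = 995 px (right) lies about 50 m away:
      print(stereo_point(1010.0, 500.0, 995.0))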

  2. Neutron dosemeter responses in workplace fields and the implications of using realistic neutron calibration fields

    International Nuclear Information System (INIS)

    Thomas, D.J.; Horwood, N.; Taylor, G.C.

    1999-01-01

    The use of realistic neutron calibration fields to overcome some of the problems associated with the response functions of presently available dosemeters, both area survey instruments and personal dosemeters, has been investigated. Realistic calibration fields have spectra which, compared to conventional radionuclide source based calibration fields, more closely match those of the workplace fields in which dosemeters are used. Monte Carlo simulations were performed to identify laboratory systems which would produce appropriate workplace-like calibration fields. A detailed analysis was then undertaken of the predicted under- and over-responses of dosemeters in a wide selection of measured workplace field spectra assuming calibration in a selection of calibration fields. These included both conventional radionuclide source calibration fields, and also several proposed realistic calibration fields. The present state of the art for dosemeter performance, and the possibilities of improving accuracy by using realistic calibration fields are both presented. (author)

  3. Quantification of the Potential Gross Economic Impacts of Five Methane Reduction Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Keyser, David [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Warner, Ethan [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Curley, Christina [Colorado State Univ., Fort Collins, CO (United States)

    2015-04-23

    Methane (CH4) is a potent greenhouse gas that is released from the natural gas supply chain into the atmosphere as a result of fugitive emissions and venting. We assess five potential CH4 reduction scenarios from transmission, storage, and distribution (TS&D) using published literature on the costs and the estimated quantity of CH4 reduced. We utilize cost and methane inventory data from ICF (2014) and Warner et al. (forthcoming) as well as data from Barrett and McCulloch (2014) and the American Gas Association (AGA) (2013) to estimate that the implementation of these measures could support approximately 85,000 jobs annually from 2015 to 2019 and reduce CH4 emissions from natural gas TS&D by over 40%. Based on standard input/output analysis methodology, measures are estimated to support over $8 billion in GDP annually over the same time period and allow producers to recover approximately $912 million annually in captured gas.
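
    The jobs and GDP figures quoted above rest on the standard Leontief input/output relation x = (I - A)^-1 * d, in which a change in final demand d is propagated through an inter-industry requirements matrix A and the resulting output is scaled by employment intensities. The sketch below shows only that core calculation; the matrix, demand vector and job intensities are made-up numbers, not the study's data.

      # Core Leontief input/output calculation with made-up numbers; this illustrates
      # the methodology only, not the study's actual multipliers or data.
      import numpy as np

      # Hypothetical technical-coefficient matrix A (inter-industry purchases per
      # dollar of output) for three aggregated sectors.
      A = np.array([[0.10, 0.05, 0.02],
                    [0.20, 0.15, 0.10],
                    [0.05, 0.10, 0.08]])

      # Hypothetical change in final demand (million dollars) from mitigation spending.
      delta_d = np.array([50.0, 120.0, 30.0])

      # Total output required to satisfy the new final demand: x = (I - A)^-1 d
      x = np.linalg.solve(np.eye(3) - A, delta_d)

      # Hypothetical employment intensities (jobs per million dollars of output).
      jobs_per_million = np.array([4.0, 6.5, 3.0])
      print("output by sector (million $):", np.round(x, 1))
      print("jobs supported:", int(round(jobs_per_million @ x)))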

  4. Downscaling climate change scenarios for apple pest and disease modeling in Switzerland

    Directory of Open Access Journals (Sweden)

    M. Hirschi

    2012-02-01

    Full Text Available As a consequence of current and projected climate change in temperate regions of Europe, agricultural pests and diseases are expected to occur more frequently and possibly to extend to previously non-affected regions. Given their economic and ecological relevance, detailed forecasting tools for various pests and diseases have been developed, which model their phenology, depending on actual weather conditions, and suggest management decisions on that basis. Assessing the future risk of pest-related damages requires future weather data at high temporal and spatial resolution. Here, we use a combined stochastic weather generator and re-sampling procedure for producing site-specific hourly weather series representing present (1980–2009) and future (2045–2074) climate conditions in Switzerland. The climate change scenarios originate from the ENSEMBLES multi-model projections and provide probabilistic information on future regional changes in temperature and precipitation. Hourly weather series are produced by first generating daily weather data for these climate scenarios and then using a nearest neighbor re-sampling approach for creating realistic diurnal cycles. These hourly weather series are then used for modeling the impact of climate change on important life phases of the codling moth and on the number of predicted infection days of fire blight. Codling moth (Cydia pomonella) and fire blight (Erwinia amylovora) are two major pest and disease threats to apple, one of the most important commercial and rural crops across Europe. Results for the codling moth indicate a shift in the occurrence and duration of life phases relevant for pest control. In southern Switzerland, a 3rd generation per season occurs only very rarely under today's climate conditions but is projected to become normal in the 2045–2074 time period. While the potential risk for a 3rd generation is also significantly increasing in northern
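
    The disaggregation step described here, attaching realistic diurnal cycles to generated daily values by re-sampling observed days, can be sketched as a simple nearest-neighbor lookup. The toy version below is an illustration under stated assumptions (matching on daily mean/min/max and shifting the matched day's hourly anomalies onto the generated daily mean), not the paper's actual procedure.

      # Toy nearest-neighbor disaggregation of daily temperatures to hourly values.
      # The matching features and the anomaly shift are illustrative assumptions.
      import numpy as np

      def daily_features(hourly):
          """hourly: (n_days, 24) observed temperatures -> (n_days, 3) mean/min/max."""
          return np.stack([hourly.mean(axis=1), hourly.min(axis=1), hourly.max(axis=1)], axis=1)

      def disaggregate(gen_daily, obs_hourly):
          """gen_daily: (m, 3) generated mean/min/max; obs_hourly: (n, 24) observations."""
          obs_feat = daily_features(obs_hourly)
          out = np.empty((len(gen_daily), 24))
          for i, feat in enumerate(gen_daily):
              j = np.argmin(((obs_feat - feat) ** 2).sum(axis=1))  # closest observed day
              anomaly = obs_hourly[j] - obs_hourly[j].mean()       # its diurnal shape
              out[i] = feat[0] + anomaly                           # re-centre on generated mean
          return out

      # Synthetic example: 30 observed days with a sinusoidal diurnal cycle plus noise.
      rng = np.random.default_rng(1)
      hours = np.arange(24)
      obs = (10.0 + rng.normal(0.0, 3.0, (30, 1))
             + 5.0 * np.sin(2.0 * np.pi * (hours - 9) / 24.0)
             + rng.normal(0.0, 0.5, (30, 24)))
      gen = np.array([[12.0, 7.5, 17.0], [18.0, 13.0, 24.0]])  # generated mean/min/max
      print(disaggregate(gen, obs).shape)  # (2, 24)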

  5. Multiscale scenarios for nature futures

    NARCIS (Netherlands)

    Rosa, Isabel M.D.; Pereira, Henrique M.; Ferrier, Simon; Alkemade, Rob; Acosta, Lilibeth A.; Akcakaya, H. Resit; Den Belder, Eefje; Fazel, Asghar M.; Fujimori, Shinichiro; Harfoot, Mike; Harhash, Khaled A.; Harrison, Paula A.; Hauck, Jennifer; Hendriks, Rob J.J.; Hernández, Gladys; Jetz, Walter; Karlsson-Vinkhuyzen, Sylvia I.; Kim, Hyejin; King, Nicholas; Kok, Marcel T.J.; Kolomytsev, Grygoriy O.; Lazarova, Tanya; Leadley, Paul; Lundquist, Carolyn J.; García Márquez, Jaime; Meyer, Carsten; Navarro, Laetitia M.; Nesshöver, Carsten; Ngo, Hien T.; Ninan, Karachepone N.; Palomo, Maria G.; Pereira, Laura M.; Peterson, Garry D.; Pichs, Ramon; Popp, Alexander; Purvis, Andy; Ravera, Federica; Rondinini, Carlo; Sathyapalan, Jyothis; Schipper, Aafke M.; Seppelt, Ralf; Settele, Josef; Sitas, Nadia; Van Vuuren, Detlef

    2017-01-01

    Targets for human development are increasingly connected with targets for nature; however, existing scenarios do not explicitly address this relationship. Here, we outline a strategy to generate scenarios centred on our relationship with nature to inform decision-making at multiple scales.

  6. Trust and the illusive force of scenarios

    DEFF Research Database (Denmark)

    Selin, Cynthia Lea

    2006-01-01

    formulation and decision-making. By definition, scenarios are possible versions of the future so judging and evaluating scenarios is thus not about revealing truthfulness, but rather demonstrating trust, reliability, credibility in the absence of truth and in the face of varied influences and possible...... becomes interesting is how scenarios convey authority and trustworthiness. How is it that scenarios attain and maintain power to compel people to action, change their worldview, or influence the directions of decisions or consensus? This piece examines the process, participation and products of scenario...

  7. The role of fusion power in energy scenarios. Proposed method and review of existing scenarios

    International Nuclear Information System (INIS)

    Lako, P; Ybema, J.R.; Seebregts, A.J.

    1998-04-01

    The European Commission wishes to gain more insight into the potential role of fusion energy in the second half of the 21st century. Therefore, several scenario studies are carried out in the so-called macro-task Long Term Scenarios to investigate the potential role of fusion power in the energy system. The main contribution of ECN to the macro-task is to perform a long term energy scenario study for Western Europe with a special focus on the role of fusion power. This interim report gives some methodological considerations for such an analysis. A discussion is given on the problems related to the long time horizon of the scenario study, such as the forecast of technological innovations, the selection of appropriate discount rates and the links with climate change. Key parameters which are expected to have large effects on the role and cost-effectiveness are discussed in general terms. The key parameters to be varied include level and structure of energy demand, availability and prices of fossil energy, CO2 reduction policy, discount rates, cost and potential of renewable energy sources, availability of fission power and CO2 capture and disposal, and the cost and the maximum rate of market growth of fusion power. The scenario calculations are to be performed later in the project with the help of an existing cost minimisation model of the Western European energy system. This MARKAL model is briefly introduced. The results of the model calculations are expected to make clear under which combinations of scenario parameters fusion power is needed and how large the expected financial benefits will be. The present interim report also gives an evaluation of existing energy scenarios with respect to the role of fusion power. 18 refs

  8. International Management: Creating a More Realistic Global Planning Environment.

    Science.gov (United States)

    Waldron, Darryl G.

    2000-01-01

    Discusses the need for realistic global planning environments in international business education, introducing a strategic planning model that has teams interacting with teams to strategically analyze a selected multinational company. This dynamic process must result in a single integrated written analysis that specifies an optimal strategy for…

  9. Geometric approximation algorithms

    CERN Document Server

    Har-Peled, Sariel

    2011-01-01

    Exact algorithms for dealing with geometric objects are complicated, hard to implement in practice, and slow. Over the last 20 years a theory of geometric approximation algorithms has emerged. These algorithms tend to be simple, fast, and more robust than their exact counterparts. This book is the first to cover geometric approximation algorithms in detail. In addition, more traditional computational geometry techniques that are widely used in developing such algorithms, like sampling, linear programming, etc., are also surveyed. Other topics covered include approximate nearest-neighbor search, shape approximation, coresets, dimension reduction, and embeddings. The topics covered are relatively independent and are supplemented by exercises. Close to 200 color figures are included in the text to illustrate proofs and ideas.

  10. Ultra-realistic 3-D imaging based on colour holography

    International Nuclear Information System (INIS)

    Bjelkhagen, H I

    2013-01-01

    A review of recent progress in colour holography is provided, with new applications. Colour holography recording techniques in silver-halide emulsions are discussed. Both analogue colour holograms, mainly of the Denisyuk type, and digitally-printed colour holograms are described, together with their recent improvements. An alternative to silver-halide materials is the panchromatic photopolymer materials, such as the DuPont and Bayer photopolymers, which are also covered. The light sources used to illuminate the recorded holograms are very important for obtaining ultra-realistic 3-D images. In particular, the new light sources based on RGB LEDs are described. They show improved image quality over today's commonly used halogen lights. Recent work in colour holography by holographers and companies in different countries around the world is included. Recording and displaying ultra-realistic 3-D images with perfect colour rendering depend strongly on the correct recording technique using the optimal recording laser wavelengths, the availability of improved panchromatic recording materials, and new display light sources.

  11. Evidence and future scenarios of a low-carbon energy transition in Central America: a case study in Nicaragua

    Science.gov (United States)

    Barido, Diego Ponce de Leon; Johnston, Josiah; Moncada, Maria V.; Callaway, Duncan; Kammen, Daniel M.

    2015-10-01

    The global carbon emissions budget over the next decades depends critically on the choices made by fast-growing emerging economies. Few studies exist, however, that develop country-specific energy system integration insights that can inform emerging economies in this decision-making process. High spatial- and temporal-resolution power system planning is central to evaluating decarbonization scenarios, but obtaining the required data and models can be cost prohibitive, especially for researchers in low and lower-middle income economies. Here, we use Nicaragua as a case study to highlight the importance of high-resolution open access data and modeling platforms to evaluate fuel-switching strategies and their resulting cost of power under realistic technology, policy, and cost scenarios (2014-2030). Our results suggest that Nicaragua could cost-effectively achieve a low-carbon grid (≥80%, based on non-large hydro renewable energy generation) by 2030 while also pursuing multiple development objectives. Regional cooperation (balancing) enables the highest wind and solar generation (18% and 3% by 2030, respectively), at the least cost (US$127 MWh⁻¹). Potentially risky resources (geothermal and hydropower) raise system costs but do not significantly hinder decarbonization. Oil price sensitivity scenarios suggest renewable energy to be a more cost-effective long-term investment than fuel oil, even under the assumption of prevailing cheap oil prices. Nicaragua’s options illustrate the opportunities and challenges of power system decarbonization for emerging economies, and the key role that open access data and modeling platforms can play in helping develop low-carbon transition pathways.

  12. Multiscale scenarios for nature futures

    NARCIS (Netherlands)

    Rosa, Isabel M.D.; Pereira, Henrique Miguel; Ferrier, Simon; Alkemade, J.R.M.; Acosta, Lilibeth A.; Resit Akcakaya, H.; Belder, den E.; Fazel, Asghar M.; Fujimori, Shinichiro; Harfoot, Mike; Harhash, Khaled A.; Harrison, Paula A.; Hauck, Jennifer; Hendriks, Rob J.J.; Hernández, Gladys; Jetz, Walter; Karlsson-Vinkhuyzen, S.I.S.E.; Kim, Hyejin; King, Nicholas; Kok, Marcel; Kolomytsev, Grygoriy O.; Lazarova, Tanya; Leadley, Paul; Lundquist, Carolyn J.; García Márquez, Jaime; Meyer, Carsten; Navarro, Laetitia M.; Nesshöver, Carsten; Ngo, Hien T.; Ninan, Karachepone N.; Palomo, Maria G.; Pereira, Laura; Peterson, G.D.; Pichs, Ramon; Popp, Alexander; Purvis, Andy; Ravera, Federica; Rondinini, Carlo; Sathyapalan, Jyothis; Schipper, Aafke; Seppelt, Ralf; Settele, Josef; Sitas, Nadia; Vuuren, van D.

    2017-01-01

    Targets for human development are increasingly connected with targets for nature; however, existing scenarios do not explicitly address this relationship. Here, we outline a strategy to generate scenarios centred on our relationship with nature to inform decision-making at multiple scales.

  13. Electron spin polarization in realistic trajectories around the magnetic node of two counter-propagating, circularly polarized, ultra-intense lasers

    Science.gov (United States)

    Del Sorbo, D.; Seipt, D.; Thomas, A. G. R.; Ridgers, C. P.

    2018-06-01

    It has recently been suggested that two counter-propagating, circularly polarized, ultra-intense lasers can induce a strong electron spin polarization at the magnetic node of the electromagnetic field that they set up (Del Sorbo et al 2017 Phys. Rev. A 96 043407). We confirm these results by considering a more sophisticated description that integrates over realistic trajectories. The electron dynamics is weakly affected by the variation of the radiated power due to the spin polarization. The degree of spin polarization differs by approximately 5% depending on whether the electrons are initially at rest or already in a circular orbit. The instability of trajectories at the magnetic node induces a spin precession associated with the electron migration that establishes an upper temporal limit to the polarization of the electron population of about one laser period.

  14. Understanding how appraisal of doctors produces its effects: a realist review protocol.

    Science.gov (United States)

    Brennan, Nicola; Bryce, Marie; Pearson, Mark; Wong, Geoff; Cooper, Chris; Archer, Julian

    2014-06-23

    UK doctors are now required to participate in revalidation to maintain their licence to practise. Appraisal is a fundamental component of revalidation. However, objective evidence of appraisal changing doctors' behaviour and directly resulting in improved patient care is limited. In particular, it is not clear how the process of appraisal is supposed to change doctors' behaviour and improve clinical performance. The aim of this research is to understand how and why appraisal of doctors is supposed to produce its effect. Realist review is a theory-driven interpretive approach to evidence synthesis. It applies a realist logic of inquiry to produce an explanatory analysis of an intervention, that is: what works, for whom, in what circumstances, and in what respects. Using a realist review approach, an initial programme theory of appraisal will be developed by consulting with key stakeholders in doctors' appraisal in expert panels (ethical approval is not required), and by searching the literature to identify relevant existing theories. The search strategy will have a number of phases including a combination of: (1) electronic database searching, for example, EMBASE, MEDLINE, the Cochrane Library, ASSIA, (2) 'cited by' articles search, (3) citation searching, (4) contacting authors and (5) grey literature searching. The search for evidence will be iteratively extended and refocused as the review progresses. Studies will be included based on their ability to provide data that enable testing of the programme theory. Data extraction will be conducted, for example, by note-taking and annotation at different review stages, consistent with the realist approach. The evidence will be synthesised using realist logic to interrogate the final programme theory of the impact of appraisal on doctors' performance. The synthesis results will be written up according to RAMESES guidelines and disseminated through peer-reviewed publication and presentations. The protocol is registered with

  15. TURVA-2012: Formulation of radionuclide release scenarios

    International Nuclear Information System (INIS)

    Marcos, Nuria; Hjerpe, Thomas; Snellman, Margit; Ikonen, Ari; Smith, Paul

    2014-01-01

    TURVA-2012 is Posiva's safety case in support of the Preliminary Safety Analysis Report (PSAR) and application for a construction licence for a repository for disposal of spent nuclear fuel at the Olkiluoto site in south-western Finland. This paper gives a summary of the scenarios and the methodology followed in formulating them as described in TURVA-2012: Formulation of Radionuclide Release Scenarios (Posiva, 2013). The scenarios are further analysed in TURVA-2012: Assessment of Radionuclide Release Scenarios for the Repository System and TURVA-2012: Biosphere Assessment (Posiva, 2012a, 2012b). The formulation of scenarios takes into account the safety functions of the main barriers of the repository system and the uncertainties in the features, events, and processes (FEP) that may affect the entire disposal system (i.e. repository system plus the surface environment) from the emplacement of the first canister until the far future. In the report TURVA-2012: Performance Assessment (2012d), the performance of the engineered and natural barriers has been assessed against the loads expected during the evolution of the repository system and the site. Uncertainties have been identified and these are taken into account in the formulation of radionuclide release scenarios. The uncertainties in the FEP and evolution of the surface environment are taken into account in formulating the surface environment scenarios used ultimately in estimating radiation exposure. Formulating radionuclide release scenarios for the repository system links the reports Performance Assessment and Assessment of Radionuclide Release Scenarios for the Repository System. The formulation of radionuclide release scenarios for the surface environment brings together biosphere description and the surface environment FEP and is the link to the assessment of the surface environment scenarios summarised in TURVA-2012: Biosphere Assessment. (authors)

  16. Parameterized source term in the diffusion approximation for enhanced near-field modeling of collimated light

    Science.gov (United States)

    Jia, Mengyu; Wang, Shuang; Chen, Xueying; Gao, Feng; Zhao, Huijuan

    2016-03-01

    Most analytical methods for describing light propagation in turbid media exhibit low effectiveness in the near field of a collimated source. Motivated by the Charge Simulation Method in electromagnetic theory as well as established discrete-source-based modeling, we have reported on an improved explicit model, referred to as the "Virtual Source" (VS) diffusion approximation (DA), which inherits the mathematical simplicity of the DA while considerably extending its validity for modeling near-field photon migration in low-albedo media. In this model, the collimated light in the standard DA is approximated as multiple isotropic point sources (VS) distributed along the incident direction. For performance enhancement, a fitting procedure between the calculated and realistic reflectances is adopted in the near field to optimize the VS parameters (intensities and locations). To be practically applicable, an explicit 2VS-DA model is established based on closed-form derivations of the VS parameters for the typical ranges of the optical parameters. The proposed VS-DA model is validated by comparison with Monte Carlo simulations, and further introduced in the image reconstruction of the Laminar Optical Tomography system.
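
    The essence of the model is to replace the collimated beam by a few isotropic point sources on the incident axis and to superpose the standard DA point-source solutions. The sketch below shows that superposition for an infinite homogeneous medium; the source depths and weights are arbitrary assumptions here, whereas the published model fits them to reflectance and treats boundary conditions explicitly.

      # Superposition of diffusion-approximation point-source solutions as a stand-in
      # for a collimated beam. Source depths and weights are arbitrary assumptions;
      # the published VS-DA model optimises them and treats boundaries explicitly.
      import numpy as np

      mua, musp = 0.01, 1.0                 # absorption and reduced scattering [1/mm]
      D = 1.0 / (3.0 * (mua + musp))        # diffusion coefficient [mm]
      mu_eff = np.sqrt(mua / D)             # effective attenuation coefficient [1/mm]

      def fluence_point(r, power=1.0):
          """Infinite-medium DA Green's function for an isotropic point source."""
          return power * np.exp(-mu_eff * r) / (4.0 * np.pi * D * r)

      def fluence_virtual_sources(rho, z, depths, weights):
          """Fluence at radial distance rho and depth z from point sources on the z-axis."""
          return sum(fluence_point(np.hypot(rho, z - zk), wk)
                     for zk, wk in zip(depths, weights))

      # Two hypothetical virtual sources along the incident direction.
      depths  = [1.0 / musp, 3.0 / musp]    # depths in mm
      weights = [0.7, 0.3]                  # relative source strengths
      print(fluence_virtual_sources(rho=2.0, z=1.0, depths=depths, weights=weights))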

  17. The Influence of Gaussian Signaling Approximation on Error Performance in Cellular Networks

    KAUST Repository

    Afify, Laila H.; Elsawy, Hesham; Al-Naffouri, Tareq Y.; Alouini, Mohamed-Slim

    2015-01-01

    Stochastic geometry analysis for cellular networks is mostly limited to outage probability and ergodic rate, which abstracts many important wireless communication aspects. Recently, a novel technique based on the Equivalent-in-Distribution (EiD) approach is proposed to extend the analysis to capture these metrics and analyze bit error probability (BEP) and symbol error probability (SEP). However, the EiD approach considerably increases the complexity of the analysis. In this paper, we propose an approximate yet accurate framework, that is also able to capture fine wireless communication details similar to the EiD approach, but with simpler analysis. The proposed methodology is verified against the exact EiD analysis in both downlink and uplink cellular networks scenarios.

  18. The Influence of Gaussian Signaling Approximation on Error Performance in Cellular Networks

    KAUST Repository

    Afify, Laila H.

    2015-08-18

    Stochastic geometry analysis for cellular networks is mostly limited to outage probability and ergodic rate, which abstracts many important wireless communication aspects. Recently, a novel technique based on the Equivalent-in-Distribution (EiD) approach is proposed to extend the analysis to capture these metrics and analyze bit error probability (BEP) and symbol error probability (SEP). However, the EiD approach considerably increases the complexity of the analysis. In this paper, we propose an approximate yet accurate framework, that is also able to capture fine wireless communication details similar to the EiD approach, but with simpler analysis. The proposed methodology is verified against the exact EiD analysis in both downlink and uplink cellular networks scenarios.

  19. Market share scenarios for Gen-III and Gen-IV reactors in Europe

    International Nuclear Information System (INIS)

    Roelofs, F.; Heek, A. V.; Durpel, L. V. D.

    2008-01-01

    Nuclear energy is back on the agenda worldwide in order to meet growing energy demand and especially the growth in electricity demand. Many objectives point towards an increased use of nuclear energy, i.e. minimising energy costs, reducing climate change effects and others. In the light of the potential renewed growth of nuclear energy, the public demands a clear view on what nuclear energy may contribute towards meeting these objectives and especially how nuclear energy may address some socio-political obstacles with respect to economics, radioactive waste, safety and the proliferation of fissile materials. To address these questions, the future nuclear reactor park mix in Europe has been analysed applying an integrated dynamic process modelling technique. Various market share scenarios for nuclear energy are derived, including sub-variants with regard to the intra-nuclear options. In the analyses, it is assumed that different types of new reactors may be built, taking into account the introduction date of the considered Gen-III (i.e. EPR) and Gen-IV (i.e. SCWR, HTR, FR) reactors, and the economic evaluation of the complete fuel cycle. The assessment was undertaken using the DANESS code (Dynamic Analysis of Nuclear Energy System Strategies). The analyses show that, given the considered realistic nuclear energy demand and given a limited number of available Gen-III and Gen-IV reactor types, the future European nuclear park will consist of combinations of Gen-III and Gen-IV reactors. This mix will always consist of a set of reactor types, each having its specific strengths. The analyses also highlight the triggers influencing the choice between different nuclear energy deployment scenarios. (authors)

  20. Impact of a realistic river routing in coupled ocean-atmosphere simulations of the Last Glacial Maximum climate

    Energy Technology Data Exchange (ETDEWEB)

    Alkama, Ramdane [IPSL, Laboratoire des Sciences du Climat et de l'Environnement, Gif-sur-Yvette Cedex (France); Universite Pierre et Marie Curie, Structure et fonctionnement des systemes hydriques continentaux (Sisyphe), Paris (France); Kageyama, M.; Ramstein, G.; Marti, O.; Swingedouw, D. [IPSL, Laboratoire des Sciences du Climat et de l'Environnement, Gif-sur-Yvette Cedex (France); Ribstein, P. [Universite Pierre et Marie Curie, Structure et fonctionnement des systemes hydriques continentaux (Sisyphe), Paris (France)

    2008-06-15

    The presence of large ice sheets over North America and North Europe at the Last Glacial Maximum (LGM) strongly impacted Northern Hemisphere river pathways. Despite the fact that such changes may significantly alter the freshwater input to the ocean, modified surface hydrology has never been accounted for in coupled ocean-atmosphere general circulation model simulations of the LGM climate. To reconstruct the LGM river routing, we use the ICE-5G LGM topography. Because of the uncertainties in the extent of the Fennoscandian ice sheet in the eastern part of the Kara Sea, we consider two more realistic river routing scenarios. The first scenario is characterised by the presence of an ice-dammed lake south of the Fennoscandian ice sheet, and corresponds to the ICE-5G topography. This lake is fed by the Ob and Yenisei rivers. In the second scenario, both these rivers flow directly into the Arctic Ocean, which is more consistent with the latest QUEEN ice sheet margin reconstructions. We study the impact of these changes on the LGM climate as simulated by the IPSL-CM4 model and focus on the overturning thermohaline circulation. A comparison with a classical LGM simulation performed using the same model and modern river basins as designed in the PMIP2 exercise leads to the following conclusions: (1) The discharge into the North Atlantic Ocean is increased by 2,000 m³/s between 38°N and 54°N in both simulations that contain LGM river routing, compared to the classical LGM experiment. (2) The ice-dammed lake is shown to have a weak impact, relative to the classical simulation, both in terms of climate and ocean circulation. (3) In contrast, the North Atlantic deep convection and meridional overturning are weaker than during the classical LGM run if the Ob and Yenisei rivers flow directly into the Arctic Ocean. The total discharge into the Arctic Ocean is increased by 31,000 m³/s, relative to the classical LGM simulation. Consequently, northward ocean heat