WorldWideScience

Sample records for models require future

  1. Bioenergy crop models: Descriptions, data requirements and future challenges

    Energy Technology Data Exchange (ETDEWEB)

    Nair, S. Surendran (University of Tennessee, Knoxville); Kang, Shujiang (ORNL); Zhang, Xuesong (Pacific Northwest National Laboratory); Miguez, Fernando (Iowa State University); Izaurralde, R. Cesar (Pacific Northwest National Laboratory); Post, Wilfred M. (ORNL); Dietze, Michael (University of Illinois, Urbana-Champaign); Lynd, L. (Dartmouth College); Wullschleger, Stan D. (ORNL)

    2012-01-01

    Field studies that address the production of lignocellulosic biomass as a source of renewable energy provide critical data for the development of bioenergy crop models. A literature survey revealed that 14 models have been used to simulate bioenergy crops, including herbaceous, woody, and crassulacean acid metabolism (CAM) crops. These models simulate field-scale production of biomass for switchgrass (ALMANAC, EPIC, and Agro-BGC), miscanthus (MISCANFOR, MISCANMOD, and WIMOVAC), sugarcane (APSIM, AUSCANE, and CANEGRO), and poplar and willow (SECRETS and 3PG). Two models are adaptations of dynamic global vegetation models and simulate biomass yields of miscanthus and sugarcane at regional scales (Agro-IBIS and LPJmL). Although it lacks the complexity of other bioenergy crop models, the environmental productivity index (EPI) is the only model used to estimate biomass production of CAM (Agave and Opuntia) plants. Except for the EPI model, all models include representations of leaf area dynamics, phenology, radiation interception and utilization, biomass production, and partitioning of biomass to roots and shoots. A few models simulate soil water, nutrient, and carbon cycle dynamics, making them especially useful for assessing the environmental consequences (e.g., erosion and nutrient losses) associated with the large-scale deployment of bioenergy crops. The rapid increase in use of models for energy crop simulation is encouraging; however, detailed information on the influence of climate, soils, and crop management practices on biomass production is scarce. Thus, considerable work remains regarding the parameterization and validation of process-based models for bioenergy crops; generation and distribution of high-quality field data for model development and validation; and implementation of an integrated framework for efficient, high-resolution simulations of biomass production for use in planning sustainable bioenergy systems.

  2. Future Home Network Requirements

    DEFF Research Database (Denmark)

    Charbonnier, Benoit; Wessing, Henrik; Lannoo, Bart

    This paper presents the requirements for future Home Area Networks (HAN). Firstly, we discuss the applications and services as well as their requirements. Then, usage scenarios are devised to establish a first specification for the HAN. The main requirements are an increased bandwidth (towards 1...

  4. Bench Test Modeling for Current and Future PCMO Fuel Economy Requirements

    Institute of Scientific and Technical Information of China (English)

    Mark T. Devlin; Cheng C. Kuo; John M. Pietras; Zhang Yun

    2008-01-01

    The automotive and lubricant industries have been placing increased emphasis on the fuel economy benefits of today's modern passenger car motor oils. The current ILSAC GF-4 specification has more stringent requirements than previous ILSAC specifications, and the proposed future ILSAC GF-5 specification is looking at further improvements in the lubricant's fuel economy performance. To address these needs, Afton Chemical has developed correlations between the rheological and frictional properties of oils and fuel economy measured in engine and field tests. In this paper we will present correlations between lubricants' physical properties and fuel economy measured in vehicles and engine tests. These tools have been used to develop current commercial oils which are being used extensively to meet today's OEM needs.

  5. High resolution weather data for urban hydrological modelling and impact assessment, ICT requirements and future challenges

    Science.gov (United States)

    ten Veldhuis, Marie-claire; van Riemsdijk, Birna

    2013-04-01

    Hydrological analysis of urban catchments requires high resolution rainfall and catchment information because of the small size of these catchments, high spatial variability of the urban fabric, fast runoff processes and related short response times. Rainfall information available from traditional radar and rain gauge networks does not meet the relevant scales of urban hydrology. A new type of weather radar, based on X-band frequency and equipped with Doppler and dual polarimetry capabilities, promises to provide more accurate rainfall estimates at the spatial and temporal scales that are required for urban hydrological analysis. Recently, the RAINGAIN project was started to analyse the applicability of this new type of radar in the context of urban hydrological modelling. In this project, meteorologists and hydrologists work closely together in several stages of urban hydrological analysis: from the acquisition procedure of novel and high-end radar products to data acquisition and processing, rainfall data retrieval, hydrological event analysis and forecasting. The project comprises four pilot locations with various characteristics of weather radar equipment, ground stations, urban hydrological systems, modelling approaches and requirements. Access to data processing and modelling software is handled in different ways in the pilots, depending on ownership and user context. Sharing of data and software among pilots and with the outside world is an ongoing topic of discussion. The availability of high resolution weather data augments requirements with respect to the resolution of hydrological models and input data. This has led to the development of fully distributed hydrological models, the implementation of which remains limited by the unavailability of hydrological input data. On the other hand, if models are to be used in flood forecasting, hydrological models need to be computationally efficient to enable fast responses to extreme event conditions. This

  6. ATLAS Future Framework Requirements Group Report

    CERN Document Server

    The ATLAS collaboration

    2016-01-01

    The Future Frameworks Requirements Group was constituted in Summer 2013 to consider and summarise the framework requirements from trigger and offline for configuring, scheduling and monitoring the data processing software needed by the ATLAS experiment. The principal motivation for such a re-examination arises from the current and anticipated evolution of CPUs, where multiple cores, hyper-threading and wide vector registers require a shift to a concurrent programming model. Such a model requires extensive changes in the current Gaudi/Athena frameworks and offers the opportunity to consider how HLT and offline processing can be better accommodated within the ATLAS framework. This note contains the report of the Future Frameworks Requirements Group.

  7. Future requirements for fossil power plants

    Directory of Open Access Journals (Sweden)

    Spliethoff H.

    2013-06-01

    The rapidly increasing installation of technologies to convert renewable energy into power influences the operation of conventional power plants. New requirements on the technology, the operation and the economics have to be considered for currently operating and future power plants. Currently, first experiences with such a production and market situation are available. Technologies to store power and to reduce CO2 emissions are discussed. New compensation models are necessary to enable economic operation of fossil power plants in base load. This article gives a short review of available technologies and future challenges.

  8. Climate Model Simulation of Present and Future Extreme Events in Latin America and the Caribbean: What Spatial Resolution is Required?

    Science.gov (United States)

    Rowe, C. M.; Oglesby, R. J.; Mawalagedara, R.; Mohammad Abadi Kamarei, A.

    2015-12-01

    Latin America and the Caribbean are at risk of extreme climate events, including flooding rains, damaging winds, drought, heat waves, and in high elevation mountainous regions, excessive snowfalls. The causes of these events are numerous - flooding rains and damaging winds are often associated with tropical cyclones, but also can occur, either separately or in tandem, due to smaller, more localized storms. Similarly, heat waves and droughts can be large scale or localized, and frequently occur together (as excessive drying can lead to enhanced heating, while enhanced heating in turn promotes additional drying). Even in the tropics, extreme snow and ice events can have severe consequences due to avalanches, and also impact water resources. Understanding and modeling the climate controls behind these extreme events requires consideration of a range of time and space scales. A common strategy is to use a global climate model (GCM) to simulate the large-scale (~100km) daily atmospheric controls on extreme events. A limited area, high resolution regional climate model (RCM) is then employed to dynamically downscale the results, so as to better incorporate the influence of topography and, secondarily, the nature of the land cover. But what resolution is required to provide the necessary results, i.e., minimize biases due to improper resolution? In conjunction with our partners from participating Latin American and Caribbean nations, we have made an extensive series of simulations, both region-wide and for individual countries, using the WRF regional climate model to downscale output from a variety of GCMs, as well as Reanalyses (as a proxy for observations). The simulations driven by the Reanalyses are used for robust model verification against actual weather station observations. The simulations driven by GCMs are designed to provide projections of future climate, including importantly how the nature and number of extreme events may change through coming decades. Our

  9. Future requirements for advanced materials

    Science.gov (United States)

    Olstad, W. B.

    1980-01-01

    Recent advances and future trends in aerospace materials technology are reviewed with reference to metal alloys, high-temperature composites and adhesives, tungsten fiber-reinforced superalloys, hybrid materials, ceramics, and new ablative materials such as the carbon-carbon composites and silica tiles used in the Shuttle Orbiter. The technologies of powder metallurgy coupled with hot isostatic pressing, near net forging, complex large shape casting, chopped fiber molding, superplastic forming, and computer-aided design and manufacture are emphasized.

  11. Key requirements for future control room functionality

    DEFF Research Database (Denmark)

    Tornelli, Carlo; Zuelli, Roberto; Marinelli, Mattia

    2016-01-01

    This internal report provides the key requirements for the future control centres. R8.1 represents the starting point of WP8 activities and aims to achieve a double objective. On the one hand it collects general requirements on future control centres emerging from the general trends in power...... system operation as well as experiences and results from other European projects. On the other hand, it analyses what requirements for future control rooms arise from the ELECTRA proposed control solutions. Hence, different points of view are taken into account. The ELECTRA Use Cases (UCs...... requirements for the future control centres discussed within this report. The analysis of what happened before the European system disturbance occurred on 4th November 2006 and of the existing trends by vendors helped T8.1 in the definition of the requirements for the future control centres. Volunteer...

  12. Data Services Required for Future Magnetospheric Research

    Science.gov (United States)

    McPherron, R. L.

    2006-05-01

    Magnetospheric research today has gone far beyond the search for new phenomena in the data from a single instrument on a single spacecraft. Typical studies today are either case histories of several events using data from many instruments in many locations or statistical studies of very long records. Most existing data services are not designed to support such studies. Individual and group databases, mission data archives, data centers, and future virtual observatories will all be sources of data for future research. To facilitate this research, these data sources must satisfy a number of requirements. These include: the data must be publicly accessible via the internet; the data must exist in well-organized file systems; the data must have accompanying metadata; the data should exist or be delivered in processed form; the delivered data should be in an easily used format; it should not be necessary to repeatedly fill forms to obtain long data sets; there should be no arbitrary limits on the amount of data provided in a single request. In addition to data there exist certain forms of data processing that are better done in specialized facilities. Some examples of this include: generation of survey plots; coordinate transformation; calculation of spacecraft orbits; propagation of solar wind data; projection of images; evaluation of complex models. In this paper we will discuss the justification for these and other requirements in facilitating research through a distributed Great Observatory.

  13. Management models for the future

    DEFF Research Database (Denmark)

    Eskildsen, Jacob Kjær; van Pijkeren, Michel

    2009-01-01

    outline will be provided of each of the twelve business contributions in this volume. The experiences recorded in the following chapters are wide-ranging. They cover know-how with national quality award models; management models for fair trade, corporate social responsibility, organisational excellence......" theoretical framework that can be used to observe, create and assess a real life organizational 'situation' in order to make desired (future) improvements. We also argue that five common requirements can be used to appraise the applicability of a framework claiming to be a management model. Thereafter a brief...... and various aspects of an organisations' value chain. The volume makes available an intriguing journey into the application of management models in different organizational and environmental contexts - a great learning experience for anyone who undertakes it....

  14. Testing agile requirements models

    Institute of Scientific and Technical Information of China (English)

    BOTASCHANJAN Jewgenij; PISTER Markus; RUMPE Bernhard

    2004-01-01

    This paper discusses a model-based approach to validating software requirements in agile development processes by simulation and, in particular, automated testing. The use of models as a central development artifact needs to be added to the portfolio of software engineering techniques to further increase the efficiency and flexibility of development, beginning early in the requirements definition phase. Testing requirements is one of the most important techniques for giving feedback and increasing the quality of the result. Therefore, testing of artifacts should be introduced as early as possible, even in the requirements definition phase.

  15. Visual soil evaluation - future research requirements

    Science.gov (United States)

    Emmet-Booth, Jeremy; Forristal, Dermot; Fenton, Owen; Ball, Bruce; Holden, Nick

    2017-04-01

    A review of Visual Soil Evaluation (VSE) techniques (Emmet-Booth et al., 2016) highlighted their established utility for soil quality assessment, though some limitations were identified: (1) The examination of aggregate size, visible intra-porosity and shape forms a key assessment criterion in almost all methods, thus limiting evaluation to structural form. The addition of criteria that holistically examine structure may be desirable. For example, structural stability can be indicated using dispersion tests or by examining soil surface crusting, while the assessment of soil colour may indirectly indicate soil organic matter content, a contributor to stability. Organic matter assessment may also indicate structural resilience, along with rooting, earthworm numbers or shrinkage cracking. (2) Soil texture may influence results or impede method deployment. Modification of procedures to account for extreme texture variation is desirable. For example, evidence of compaction in sandy or single-grain soils differs greatly from that in clayey soils. Some procedures incorporate separate classification systems or adjust deployment based on texture. (3) Research into the impacts of soil moisture content on VSE evaluation criteria is required. Criteria such as rupture resistance and shape may be affected by moisture content. It is generally recommended that methods are deployed on moist soils, and quantification of the influence of moisture variation on results is necessary. (4) Robust sampling strategies for method deployment are required. Dealing with spatial variation differs between methods, but where methods can be deployed over large areas, clear instruction on sampling is required. Additionally, as emphasis has been placed on the agricultural production function of soil, the ability of VSE to explore structural quality in terms of carbon storage, water purification and biodiversity support also requires research. References: Emmet-Booth, J.P., Forristal, P.D., Fenton, O., Ball, B

  16. Requirements for future automotive batteries - a snapshot

    Science.gov (United States)

    Karden, Eckhard; Shinn, Paul; Bostock, Paul; Cunningham, James; Schoultz, Evan; Kok, Daniel

    Introduction of new fuel economy, performance, safety, and comfort features in future automobiles will bring up many new, power-hungry electrical systems. As a consequence, demands on automotive batteries will grow substantially, e.g. regarding reliability, energy throughput (shallow-cycle life), charge acceptance, and high-rate partial state-of-charge (HRPSOC) operation. As higher voltage levels are mostly not an economically feasible alternative for the short term, the existing 14 V electrical system will have to fulfil these new demands, utilizing advanced 12 V energy storage devices. The well-established lead-acid battery technology is expected to keep playing a key role in this application. Compared to traditional starting-lighting-ignition (SLI) batteries, significant technological progress has been achieved or can be expected, which improve both performance and service life. System integration of the storage device into the vehicle will become increasingly important. Battery monitoring systems (BMS) are expected to become a commodity, penetrating the automotive volume market from both highly equipped premium cars and dedicated fuel-economy vehicles (e.g. stop/start). Battery monitoring systems will allow for more aggressive battery operating strategies, at the same time improving the reliability of the power supply system. Where a single lead-acid battery cannot fulfil the increasing demands, dual-storage systems may form a cost-efficient extension. They consist either of two lead-acid batteries or of a lead-acid battery plus another storage device.

  17. Strategic forces: Future requirements and options

    Energy Technology Data Exchange (ETDEWEB)

    Speed, R.D.

    1990-11-01

    In the wake of the collapse of the Warsaw Pact and the apparent ending of the Cold War, there have been renewed calls for radical cuts in US strategic forces to levels far below the 10,000 or so warheads allowed each side under the current START proposal. Since it now appears that NATO for the first time will have the capability to defeat a Soviet conventional attack without the necessity of threatening to resort to nuclear weapons, this should pave the way for the rethinking of US strategy and the reduction of US strategic weapons requirements. In this new environment, it seems plausible that, with a modification of the Flexible Response doctrine to forego attempts to disarm the Soviet Union, deterrence could be maintained with 1500 or so survivable strategic weapons. With a new strategy that confined US strategic weapons to the role of deterring the use of nuclear weapons by other countries, a survivable force of about 500 weapons would seem sufficient. With this premise, the implications for the US strategic force structure are examined for two cases: a treaty that allows each side 3000 warheads and one that allows each side 1000 warheads. In Part 1 of this paper, the weapons requirements for deterrence are examined in light of recent changes in the geopolitical environment. In Part 2, it is assumed that the President and Congress have decided that deep cuts in strategic forces are acceptable. 128 refs., 12 figs., 12 tabs. (JF)

  18. Evolution of Requirements and Assumptions for Future Exploration Missions

    Science.gov (United States)

    Anderson, Molly; Sargusingh, Miriam; Perry, Jay

    2017-01-01

    NASA programs are maturing technologies, systems, and architectures to enable future exploration missions. To increase fidelity as technologies mature, developers must make assumptions that represent the requirements of a future program. Multiple efforts have begun to define these requirements, including team internal assumptions, planning system integration for early demonstrations, and discussions between international partners planning future collaborations. For many detailed life support system requirements, existing NASA documents set limits of acceptable values, but a future vehicle may be constrained in other ways and select a limited range of conditions. Other requirements are effectively set by interfaces or operations, and may be different for the same technology depending on whether the hardware is a demonstration system on the International Space Station or a critical component of a future vehicle. This paper highlights key assumptions representing potential life support requirements and explanations of the driving scenarios, constraints, or other issues that drive them.

  19. Management models for the future

    DEFF Research Database (Denmark)

    Jonker, Jan; Eskildsen, Jacob Kjær

    2007-01-01

    In the last decades a growing number of generic management models (e.g. EFQM, INK, ISO 9000:2000) has emerged. All these models are based on the ambition to stipulate the road to conventional and contemporary forms of organizational excellence. Some of the models aim to do so with regard to one...... aspect of the company's operations such as processes; others are based on a holistic view of the organisation. This paper is based on a book project (2006-2007) entitled "Management Models for the Future" (Springer Verlag, Heidelberg - Germany) aiming to harvest twelve new company-based models from...... and inspiring set of models together with an analysis thus showing the building blocks of meaningful and applicable models. Knowledge does not simply lie around waiting to be picked up. It must be concisely carved out of a continuous stream of ongoing events in reality, perceived within a specific frame...

  20. Management models for the future

    DEFF Research Database (Denmark)

    Jonker, Jan; Eskildsen, Jacob Kjær

    2007-01-01

    In the last decades a growing number of generic management models (e.g. EFQM, INK, ISO 9000:2000) has emerged. All these models are based on the ambition to stipulate the road to conventional and contemporary forms of organizational excellence. Some of the models aim to do so with regard to one...... aspect of the company's operations such as processes; others are based on a holistic view of the organisation. This paper is based on a book project (2006-2007) entitled "Management Models for the Future" (Springer Verlag, Heidelberg - Germany) aiming to harvest twelve new company-based models from...... around the globe. Each of these models is described in a structured company-based story thus creating the backbone for the book at hand. The aim is to analyse these different kinds of institutional frameworks of excellence and discuss their nature, content and enactability. The result is a rich...

  1. Management Models for the Future

    DEFF Research Database (Denmark)

    companies are not. New functional requirements often seem to be in conflict, such as transparency, stock market performance, sustainability, innovation, responsibility, time to market, stakeholders, business rationalisation and many others. These requirements force business to revise its management model....... The time is right to demonstrate how the business enterprise can be re-conceptualised, and what the challenges are of fundamental strategic choices in organising a sustainable business proposition. This book presents ten cases of organisations which have developed  a management model that leads...

  2. Renewable Energy Requirements for Future Building Codes: Options for Compliance

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Heather E.; Antonopoulos, Chrissi A.; Solana, Amy E.; Russo, Bryan J.

    2011-09-30

    As the model energy codes are improved to reach efficiency levels 50 percent greater than current codes, use of on-site renewable energy generation is likely to become a code requirement. This requirement will be needed because traditional mechanisms for code improvement, including envelope, mechanical and lighting, have been pressed to the end of reasonable limits. Research has been conducted to determine the mechanism for implementing this requirement (Kaufman 2011). Kaufmann et al. determined that the most appropriate way to structure an on-site renewable requirement for commercial buildings is to define the requirement in terms of an installed power density per unit of roof area. This provides a mechanism that is suitable for the installation of photovoltaic (PV) systems on future buildings to offset electricity and reduce the total building energy load. Kaufmann et al. suggested that an appropriate maximum for the requirement in the commercial sector would be 4 W/ft² of roof area or 0.5 W/ft² of conditioned floor area. As with all code requirements, there must be an alternative compliance path for buildings that may not reasonably meet the renewables requirement. This might include conditions like shading (which makes rooftop PV arrays less effective), unusual architecture, undesirable roof pitch, unsuitable building orientation, or other issues. In the short term, alternative compliance paths including high performance mechanical equipment, dramatic envelope changes, or controls changes may be feasible. These options may be less expensive than many renewable systems, which will require careful balance of energy measures when setting the code requirement levels. As the stringency of the code continues to increase however, efficiency trade-offs will be maximized, requiring alternative compliance options to be focused solely on renewable electricity trade-offs or equivalent programs. One alternate compliance path includes purchase of Renewable Energy
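
    As a back-of-envelope illustration of the two power-density figures quoted above (4 W/ft² of roof area versus 0.5 W/ft² of conditioned floor area), the following minimal Python sketch compares the installed PV capacity each figure implies; the building areas are invented for illustration and are not taken from the report.

        # Illustrative comparison of the two proposed on-site renewable caps:
        # 4 W per ft2 of roof area vs. 0.5 W per ft2 of conditioned floor area.
        # The building areas below are hypothetical.

        ROOF_CAP_W_PER_FT2 = 4.0
        FLOOR_CAP_W_PER_FT2 = 0.5

        def implied_pv_kw(roof_area_ft2, floor_area_ft2):
            """Installed PV power (kW) implied by each cap."""
            by_roof = ROOF_CAP_W_PER_FT2 * roof_area_ft2 / 1000.0
            by_floor = FLOOR_CAP_W_PER_FT2 * floor_area_ft2 / 1000.0
            return by_roof, by_floor

        # Single-storey building: 10,000 ft2 of roof and 10,000 ft2 of floor.
        print(implied_pv_kw(10_000, 10_000))   # (40.0, 5.0) kW
        # Eight storeys on the same footprint: the two figures coincide.
        print(implied_pv_kw(10_000, 80_000))   # (40.0, 40.0) kW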

  3. Deriving future oriented research and competence requirements based on scenarios

    DEFF Research Database (Denmark)

    Sonne, Anne-Mette; Harmsen, Hanne; Jensen, Birger Boutrup

    The key to a company's survival lies in its ability to adapt itself to an ever-changing world. A company's knowledge and competencies must be fitted to the requirements of the environment in which it operates. However, the kinds of competencies that ensure a company's survival are not acquired...... prepare for different possible futures do scenario methods offer real value. A way to try to be better prepared for the future is to deduce competence and research needs given the different possible future developments described in a number of scenarios. Hence, the aim of this paper is to test the use...... of scenarios for this purpose. Most scenario studies report mostly on the scenario construction, whereas we want to focus on the suitability of scenario methods as a means of deducing competence requirements and research needs. Also scenario techniques have mostly been used on either a company level or a macro

  4. Requirements for user interaction support in future CACE environments

    DEFF Research Database (Denmark)

    Ravn, Ole; Szymkat, M.

    1994-01-01

    Based on a review of user interaction modes and the specific needs of the CACE domain, the paper describes requirements for user interaction in future CACE environments. Taking another look at the design process in CACE, key areas in need of more user interaction support are pointed out. Three...

  5. Smart Agri-Food Logistics: Requirements for the Future Internet

    NARCIS (Netherlands)

    Verdouw, C.N.; Sundmaeker, H.; Meyer, F.; Wolfert, J.; Verhoosel, J.

    2013-01-01

    The food and agribusiness is an important sector in European logistics with a share in the EU road transport of about 20 %. One of the main logistic challenges in this sector is to deal with the high dynamics and uncertainty in supply and demand. This paper defines requirements on Future Internet (F

  6. Needs and Requirements for Future Research Reactors (ORNL Perspectives)

    Energy Technology Data Exchange (ETDEWEB)

    Ilas, Germina [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bryan, Chris [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gehin, Jess C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-02-10

    The High Flux Isotope Reactor (HFIR) is a vital national and international resource for neutron science research, production of radioisotopes, and materials irradiation. While HFIR is expected to continue operation for the foreseeable future, interest is growing in understanding the features, needs, and requirements of future research reactors. To clarify, discuss, and compile these needs from the perspective of Oak Ridge National Laboratory (ORNL) research and development (R&D) missions, a workshop, titled “Needs and Requirements for Future Research Reactors”, was held at ORNL on May 12, 2015. The workshop engaged ORNL staff who are directly involved in research using HFIR, to collect valuable input on the reactor’s current and future missions. The workshop provided an interactive forum for a fruitful exchange of opinions, and included a mix of short presentations and open discussions. ORNL staff members made 15 technical presentations based on their experience and areas of expertise, and discussed those capabilities of the HFIR and future research reactors that are essential for their current and future R&D needs. The workshop was attended by approximately 60 participants from three ORNL directorates. The agenda is included in Appendix A. This document summarizes the feedback provided by workshop contributors and participants. It also includes information and insights addressing key points that originated from the dialogue started at the workshop. A general overview is provided on the design features and capabilities of high performance research reactors currently in use or under construction worldwide. Recent and ongoing design efforts in the US and internationally are briefly summarized, followed by conclusions and recommendations.

  7. The future of irrigated agriculture under environmental flow requirements restrictions

    Science.gov (United States)

    Pastor, Amandine; Palazzo, Amanda; Havlik, Petr; Kabat, Pavel; Obersteiner, Michael; Ludwig, Fulco

    2016-04-01

    Water is not an infinite resource and demand from irrigation, household and industry is constantly increasing. This study focused on incorporating global water availability, including environmental flow requirements, together with water withdrawals from irrigation and other sectors, at a monthly time-step in the GLOBIOM model. This model allows re-adjustment of land-use allocation, crop management, consumption and international trade. The GLOBIOM model induces an endogenous change in water price depending on water supply and demand. In this study, the focus was on how the inclusion of water resources affects land-use and, in particular, how global change will influence the repartition of irrigated and rainfed lands at the global scale. We used the climate change scenario with a radiative forcing of 8.5 W/m² (RCP8.5), the socio-economic scenario SSP2 (middle of the road), and the environmental flow method based on monthly flow allocation (the Variable Monthly Flow method) with high and low restrictions. Irrigation withdrawals were adjusted to a monthly time-step to account for biophysical water limitations at finer time resolution. Our results show that irrigated land might decrease by up to 40% on average depending on the choice of EFR restrictions. Several areas were identified as future hot-spots of water stress, such as the Mediterranean and Middle East regions. Other countries, such as the North European countries, were identified to be in a safe position in terms of water stress. Re-allocation of rainfed and irrigated land might be useful information for land-use planners and water managers at an international level to decide on appropriate legislation on climate change mitigation/adaptation when exposure and sensitivity to climate change is high, and/or on adaptation measures to face increasing water demand. For example, some countries are likely to adopt measures to increase their water use efficiencies (irrigation systems, soil and water conservation practices) to face water shortages, while
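
    The Variable Monthly Flow (VMF) method mentioned above reserves a season-dependent fraction of the mean monthly flow for the environment before withdrawals are allowed. The minimal Python sketch below illustrates that logic; the fractions and thresholds (roughly 30-60% of mean monthly flow, depending on whether a month is a high-, intermediate- or low-flow month) are assumed typical values for the VMF approach, not the exact settings used in the GLOBIOM study.

        import numpy as np

        def vmf_efr(monthly_flow):
            """Illustrative Variable Monthly Flow (VMF) environmental flow rule.

            monthly_flow: 12 mean monthly flows (e.g. m3/s).
            Returns the flow reserved for the environment in each month.
            The fractions and thresholds are assumptions for illustration.
            """
            mmf = np.asarray(monthly_flow, dtype=float)
            maf = mmf.mean()  # mean annual flow
            return np.where(mmf <= 0.4 * maf, 0.60 * mmf,      # low-flow months
                   np.where(mmf <= 0.8 * maf, 0.45 * mmf,      # intermediate months
                                              0.30 * mmf))     # high-flow months

        # Water left for irrigation and other sectors is the remainder:
        flows = [20, 18, 25, 40, 80, 120, 150, 130, 90, 50, 30, 22]
        available = np.asarray(flows) - vmf_efr(flows)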

  8. Future CANDU nuclear power plant design requirements document executive summary

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Duk Su; Chang, Woo Hyun; Lee, Nam Young [Korea Atomic Energy Research Institute, Daeduk (Korea, Republic of); Usmani, S.A. [Atomic Energy of Canada Ltd., Toronto (Canada)

    1996-03-01

    The future CANDU Requirements Document (FCRED) describes a clear and complete statement of utility requirements for the next generation of CANDU nuclear power plants, including those in Korea. The requirements are based on proven technology from PHWR experience and are intended to be consistent with those specified in the current international requirements documents. Furthermore, this integrated set of design requirements incorporates utility input to the extent currently available and assures a simple, robust and more forgiving design that enhances performance and safety. The FCRED addresses the entire plant, including the nuclear steam supply system and the balance of the plant, up to the interface with the utility grid at the distribution side of the circuit breakers which connect the switchyard to the transmission lines. Requirements for processing of low level radioactive waste at the plant site and spent fuel storage requirements are included in the FCRED. Off-site waste disposal is beyond the scope of the FCRED. 2 tabs., 1 fig. (Author)

  9. Far Future Colliders and Required R&D Program

    Energy Technology Data Exchange (ETDEWEB)

    Shiltsev, V.; /Fermilab

    2012-06-01

    Particle colliders for high energy physics have been in the forefront of scientific discoveries for more than half a century. The accelerator technology of the collider has progressed immensely, while the beam energy, luminosity, facility size and the cost have grown by several orders of magnitude. The method of colliding beams has not fully exhausted its potential but its pace of progress has greatly slowed down. In this paper we very briefly review the R&D toward near future colliders and make an attempt to look beyond the current horizon and outline the changes in the paradigm required for the next breakthroughs.

  10. Agent Based Multiviews Requirements Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Based on current research on viewpoint-oriented requirements engineering and intelligent agents, we present the concept of the viewpoint agent and its abstract model, based on a meta-language for multi-view requirements engineering. It provides a basis for consistency checking and integration of different viewpoint requirements; at the same time, this checking and integration work can be realized automatically by virtue of the intelligent agents' autonomy, proactiveness and social ability. Finally, we introduce a practical application of the model through a case study of data flow diagrams.

  11. Wireless Technology Use Case Requirement Analysis for Future Space Applications

    Science.gov (United States)

    Abedi, Ali; Wilkerson, DeLisa

    2016-01-01

    This report presents various use case scenarios for wireless technology, including radio frequency (RF), optical, and acoustic, and studies requirements and boundary conditions in each scenario. The results of this study can be used to prioritize technology evaluation and development and, in the long run, help in the development of a roadmap for future use of wireless technology. The presented scenarios cover the following application areas: (i) Space Vehicles (manned/unmanned), (ii) Satellites and Payloads, (iii) Surface Explorations, (iv) Ground Systems, and (v) Habitats. The requirement analysis covers two parallel sets of conditions. The first set includes the environmental conditions such as temperature, radiation, noise/interference, wireless channel characteristics and accessibility. The second set of requirements is dictated by the application and may include parameters such as latency, throughput (effective data rate), error tolerance, and reliability. This report provides a comprehensive overview of all requirements from both perspectives and details their effects on wireless system reliability and network design. Application area examples are based on the 2015 NASA Technology Roadmap, with specific focus on technology areas TA 2.4, 3.3, 5.2, 5.5, 6.4, 7.4, and 10.4, the sections that might benefit from wireless technology.

  12. The future dynamic world model

    Science.gov (United States)

    Karr, Thomas J.

    2014-10-01

    Defense and security forces exploit sensor data by means of a model of the world. They use a world model to geolocate sensor data, fuse it with other data, navigate platforms, recognize features and feature changes, etc. However, their need for situational awareness today exceeds the capabilities of their current world model for defense operations, despite the great advances of sensing technology in recent decades. I review emerging technologies that may enable a great improvement in the spatial and spectral coverage, the timeliness, and the functional insight of their world model.

  13. Detecting Chemical Weapons: Threats, Requirements, Solutions, and Future Challenges

    Science.gov (United States)

    Boso, Brian

    2011-03-01

    mobility spectrometry, and amplifying fluorescence polymers. In the future, the requirements for detection equipment will continue to become even more stringent. The continuing increase in the sheer number of threats that will need to be detected, the development of binary agents requiring that even the precursor chemicals be detected, the development of new types of agents unlike any of the current chemistries, and the expansion of the list of toxic industrial chemicals will require new techniques with higher specificity and more sensitivity.

  14. Performance requirements of automotive batteries for future car electrical systems

    Science.gov (United States)

    Friedrich, R.; Richter, G.

    The further increase in the number of power-consuming functions announced for future vehicle electrical systems, and in particular the effects of new starting systems on battery performance, require further optimization of the lead-acid system coupled with effective energy management and enhanced battery operating conditions. In the face of these increased requirements, there are proven benefits to splitting the functions of a single SLI battery between two separate, special-purpose batteries, each of which is optimized for high power output and for high energy throughput, respectively. This will bring about a marked improvement in weight, reliability, and state of charge (SOC). The development of special design starter and service batteries is almost completed and will lead to new products with a high standard of reliability. The design of the power-optimized lead-acid accumulator is particularly suitable for further development as the battery for a 42/36 V electrical system. This is intended to improve the efficiency of the generator and the various power-consuming functions and to improve start/stop operation, thereby bringing about a marked reduction in the fuel consumption of passenger cars. This improvement can also be assisted by a charge management system used in conjunction with battery status monitoring.

  15. Future of the "China Model"

    Institute of Scientific and Technical Information of China (English)

    Wu Jinglian

    2011-01-01

    The past 30 years have witnessed a remarkable take-off of the Chinese economy. The economic miracle has sparked a worldwide debate about the development model of China. Many economists attribute the success to China's unique economic and political systems - it has a powerful government and a controlled national economy, so it is more able to formulate and implement strategies in the best national interest.

  16. Future mobile internet services : business model scenarios

    OpenAIRE

    2004-01-01

    In this report we explore future business models for mobile internet services. Based on four different scenarios, we sketch out how future conditions in the mobile industry may influence value propositions, value networks, and financial aspects of mobile services. The four scenarios vary along two dimensions - technological development and social identity, and different combinations of these two dimensions provide us with four scenarios where quite different business models can be expected. M...

  17. Modeling and forecasting petroleum futures volatility

    Energy Technology Data Exchange (ETDEWEB)

    Sadorsky, Perry [York Univ., Schulich School of Business, Toronto, ON (Canada)

    2006-07-15

    Forecasts of oil price volatility are important inputs into macroeconometric models, financial market risk assessment calculations like value at risk, and option pricing formulas for futures contracts. This paper uses several different univariate and multivariate statistical models to estimate forecasts of daily volatility in petroleum futures price returns. The out-of-sample forecasts are evaluated using forecast accuracy tests and market timing tests. The TGARCH model fits well for heating oil and natural gas volatility and the GARCH model fits well for crude oil and unleaded gasoline volatility. Simple moving average models seem to fit well in some cases provided the correct order is chosen. Despite the increased complexity, models like state space, vector autoregression and bivariate GARCH do not perform as well as the single-equation GARCH model. Most models outperform a random walk and there is evidence of market timing. Parametric and non-parametric value at risk measures are calculated and compared. Non-parametric models outperform the parametric models in terms of the number of exceedances in backtests. These results are useful for anyone needing forecasts of petroleum futures volatility. (author)
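
    For readers unfamiliar with the GARCH family referred to above, the following minimal Python sketch shows the GARCH(1,1) conditional-variance recursion with arbitrary illustrative parameters; in practice omega, alpha and beta are estimated by maximum likelihood from the futures-return series, as in the models compared in the paper.

        import numpy as np

        def garch11_variance(returns, omega, alpha, beta):
            """Conditional variance of a GARCH(1,1) model:
                sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]
            Parameter values here are illustrative, not estimated."""
            r = np.asarray(returns, dtype=float)
            sigma2 = np.empty_like(r)
            sigma2[0] = r.var()            # initialise at the sample variance
            for t in range(1, len(r)):
                sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
            return sigma2

        # One-step-ahead volatility forecast from the last observation:
        #   sigma2_next = omega + alpha * r[-1]**2 + beta * sigma2[-1]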

  18. The modelling of future energy scenarios for Denmark

    DEFF Research Database (Denmark)

    Kwon, Pil Seok

    2014-01-01

    the overall energy system model for analyzing three subjects which are important but uncertain areas in the future. The first model is a consequential LCA analysis for biomass potential. The second model targets transport demand due to uncertain technology development in the future transport sector. The third...... model addresses grid stability with a high time resolution. As a result of the consequential LCA, the potential of biomass is less than that of IDA2050. The reduced biomass potential in turn requires larger non-biomass RES capacity, which necessitates a larger capacity of flexible means as a chain...... for the important but uncertain areas biomass and flexible demand are performed. Thirdly, modelling-related issues are investigated with a focus on the effect of future forecasting assumption and differences between a predefined priority order and order determined by given efficiencies and constraints...

  19. A View on Future Building System Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wetter, Michael

    2011-04-01

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  20. Global net irrigation water requirements from various water supply sources during past and future periods

    Science.gov (United States)

    Yoshikawa, S.; Cho, J.; Hanasaki, N.; Kanae, S.

    2014-12-01

    Water supply sources for irrigation (e.g. rivers and reservoirs) are critically important for agricultural productivity. The current rapid increase in irrigation water use is considered unsustainable and threatens food production. In this study, we estimated the time-varying dependence of irrigation water requirements from water supply sources, with a particular focus on variations in irrigation area during past (1960-2001) and future (2002-2050) periods using the global water resources model, H08. The H08 model can simulate water requirements on a daily basis at a resolution of 1.0° × 1.0° latitude and longitude. The sources of irrigation water requirements in the past simulations were specified using four categories: rivers (RIV), large reservoirs (LR), medium-size reservoirs (MSR), and non-local non-renewable blue water (NNBW). The simulated results from 1960 to 2001 showed that RIV, MSR and NNBW increased significantly from the 1960s to the early 1990s globally, but LR increased at a relatively low rate. After the early 1990s, the increase in RIV declined as it approached a critical limit, due to the continued expansion of irrigation area. MSR and NNBW increased significantly during the same time period, following the expansion of the irrigation area and the increased storage capacity of the medium-size reservoirs. We also estimated future irrigation water requirements from the above four water supply sources and an additional water supply source (ADD) in three future simulation designs: irrigation area change, climate change, and changes in both irrigation area and climate. ADD was defined as a future increase in NNBW. After the 2020s, MSR was predicted to approach the critical limit, and ADD would account for 11-23% of the total requirements in the 2040s.
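
    The four-way source accounting described above (RIV, LR, MSR and NNBW) can be pictured as a simple priority fill in which renewable sources are drawn down first and any unmet demand is booked as non-renewable abstraction. The Python sketch below is a toy illustration of that bookkeeping only; the priority order and the numbers are assumptions, not the H08 model's actual algorithm.

        def allocate_irrigation_demand(demand, riv, lr, msr):
            """Toy source accounting for one grid cell and one month.

            demand: irrigation water requirement
            riv, lr, msr: water available from rivers, large reservoirs and
                          medium-size reservoirs (same units as demand)
            Unmet demand is booked as non-local non-renewable blue water (NNBW).
            The priority order is an assumption for illustration only.
            """
            supplied, remaining = {}, demand
            for name, available in (("RIV", riv), ("LR", lr), ("MSR", msr)):
                take = min(remaining, available)
                supplied[name] = take
                remaining -= take
            supplied["NNBW"] = remaining
            return supplied

        # 100 units of demand against 60 (rivers), 10 (large reservoirs) and
        # 20 (medium-size reservoirs) leaves 10 units booked as NNBW.
        print(allocate_irrigation_demand(100, 60, 10, 20))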

  1. [Primary care practices in Germany: a model for the future].

    Science.gov (United States)

    Beyer, Martin; Gerlach, Ferdinand M; Erler, Antje

    2011-01-01

    In its 2009 report the Federal Advisory Council on the Assessment of Developments in the Health Care System developed a model of Primary Care Practices for future general practice-based primary care. This article presents the theoretical background of the model. Primary care practices are seen as developed organisations requiring changes at all system levels (interaction, organisation, and health system) to ensure sustainability of primary care functions in the future. Developments of the elements comprising the health care system may be compared to the developments and proposals observed in other countries. In Germany, however, the pace of these developments is relatively slow.

  2. Predicting and Supplying Human Resource Requirements for the Future.

    Science.gov (United States)

    Blake, Larry J.

    After asserting that public institutions should not provide training for nonexistent jobs, this paper reviews problems associated with the accurate prediction of future manpower needs. The paper reviews the processes currently used to project labor force needs and notes the difficulty of accurately forecasting labor market "surprises,"…

  3. Status and future of hydrodynamical model atmospheres

    CERN Document Server

    Ludwig, H G

    2004-01-01

    For about 25 years, work has been dedicated to the development of hydrodynamical model atmospheres for cool stars (of A to T spectral type). Despite their obviously sounder physical foundation in comparison with standard hydrostatic models, their general application has been rather limited. In order to understand why this is, and how to progress, we review the present status of hydrodynamical modelling of cool star atmospheres. The development efforts were and are motivated by the theoretical interest of understanding the dynamical processes operating in stellar atmospheres. To show the observational impact, we discuss examples in the fields of spectroscopy and stellar structure where hydrodynamical modelling provided results on a level qualitatively beyond standard models. We stress present modelling challenges, and highlight presently possible and future observations that would be particularly valuable in the interplay between model validation and interpretation of observables, to eventually widen the ...

  4. Factors Required for Successful Future Research in Decision Making

    Science.gov (United States)

    1999-06-01

    Verne Wheelwright, University of Houston-Clear Lake; AC/UNU Millennium Project, Implementation of Futures Research in Decision Making.

  5. Metrology Requirements of Future X-Ray Telescopes

    Science.gov (United States)

    Gubarev, Mikhail

    2010-01-01

    Fundamental needs for future x-ray telescopes: a) Sharp images => excellent angular resolution. b) High throughput => large aperture areas. Generation-X optics technical challenges: a) High resolution => precision mirrors & alignment. b) Large apertures => lots of lightweight mirrors. Innovation needed for technical readiness: a) 4 top-level error terms contribute to image size. b) There are approaches to controlling those errors. Innovation needed for manufacturing readiness: Programmatic issues are at least as severe

  6. Modeling global and regional energy futures

    Science.gov (United States)

    Rethinaraj, T. S. Gopi

    A rigorous econometric calibration of a model of energy consumption is presented using a comprehensive time series database on energy consumption and other socioeconomic indicators. The future of nuclear power in the evolving distribution of various energy sources is also examined. An important consideration for the long-term future of nuclear power concerns the rate of decline of the fraction of energy that comes from coal, which has historically declined on a global basis about linearly as a function of the cumulative use of coal. The use of fluid fossil fuels is also expected to eventually decline as the more readily extractable deposits are depleted. The investigation here is restricted to examining a comparatively simple model of the dynamics of competition between nuclear and other competing energy sources. Using a defined tropical/temperate disaggregation of the world, region-specific modeling results are presented for population growth, GDP growth, energy use, and carbon use compatible with a gradual transition to energy sustainability. Results for the fractions of energy use from various sources by grouping nine commercial primary energy sources into pairs of competing fuel categories are presented in combination with the idea of experiential learning and resource depletion. Analysis based on this division provides estimates for future evolution of the fractional shares, annual use rates, cumulative use of individual energy sources, and the economic attractiveness of spent nuclear fuel reprocessing. This unified approach helps to conceptualize and understand the dynamics of evolution of importance of various energy resources over time.
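
    The pairwise competition between fuel categories described above is often represented with a logistic (Fisher-Pry-style) substitution of market shares. The minimal Python sketch below shows that functional form with invented parameters; it is an illustration of the general idea, not the calibrated model used in the dissertation.

        import numpy as np

        def logistic_share(years, year_half, rate):
            """Fractional market share of the newer source in a competing pair,
            following a logistic (Fisher-Pry-style) substitution curve.
            year_half: year at which the newer source reaches a 50% share.
            rate: steepness of the transition (1/years).
            Both parameters are illustrative, not calibrated values."""
            years = np.asarray(years, dtype=float)
            return 1.0 / (1.0 + np.exp(-rate * (years - year_half)))

        years = np.arange(2000, 2101, 10)
        share_new = logistic_share(years, year_half=2060, rate=0.08)
        share_old = 1.0 - share_new   # the incumbent source in the pair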

  7. Information Requirements and Consumer Protection in Future M-Commerce

    DEFF Research Database (Denmark)

    Cleff, Evelyne Beatrix; Henschel, Rene Franz

    2006-01-01

    The aim of this article is to discuss information requirements and consumer protection in mobile commerce. On the basis of a brief introduction to the characteristics of mobile commerce and the regulatory framework that governs mobile commerce in the European Union today, the article presents...... and discusses information requirements of particular interest to m-commerce and how the technological development facilitates but also challenges the traditional way in which legal information is given. In order to prevent over-regulation and hindrances to the technological development, it has been suggested...

  8. Information Requirements and Consumer Protection in Future M-Commerce

    DEFF Research Database (Denmark)

    Cleff, Evelyne Beatrix; Henschel, Rene Franz

    2007-01-01

      The aim of this article is to discuss information requirements and consumer protection in mobile commerce. On the basis of a brief introduction to the characteristics of mobile commerce and the regulatory framework that governs mobile commerce in the European Union today, the article presents a...

  9. Mask data volume: historical perspective and future requirements

    Science.gov (United States)

    Spence, Chris; Goad, Scott; Buck, Peter; Gladhill, Richard; Cinque, Russell; Preuninger, Jürgen; Griesinger, Üwe; Blöcker, Martin

    2006-06-01

    Mask data file sizes are increasing as we move from one technology generation to the next. The historical 30% linear shrink every 2-3 years, commonly called Moore's Law, has driven a doubling of the transistor budget and hence feature count. The transition from steppers to step-and-scan tools has increased the area of the mask that needs to be patterned. At the 130nm node and below, Optical Proximity Correction (OPC) has become prevalent, and the edge fragmentation required to implement OPC leads to an increase in the number of polygons required to define the layout. Furthermore, Resolution Enhancement Techniques (RETs) such as Sub-Resolution Assist Features (SRAFs) or tri-tone Phase Shift Masks (PSM) require additional features to be defined on the mask which do not resolve on the wafer, further increasing mask data volumes. In this paper we review historical data on mask file sizes for microprocessor, DRAM and Flash memory designs. We consider the consequences of this increase in file size on Mask Data Prep (MDP) activities, both within the Integrated Device Manufacturer (IDM) and Mask Shop, namely: computer resources, storage and networks (for file transfer). The impact of larger file sizes on mask writing times is also reviewed. Finally we consider, based on the trends that have been observed over the last 5 technology nodes, what will be required to maintain reasonable MDP and mask manufacturing cycle times.

  10. Radiation Belt and Plasma Model Requirements

    Science.gov (United States)

    Barth, Janet L.

    2005-01-01

    Contents include the following: Radiation belt and plasma model environment. Environment hazards for systems and humans. Need for new models. How models are used. Model requirements. How can space weather community help?

  11. [Requirements of a future-oriented social medicine].

    Science.gov (United States)

    Brennecke, R

    2005-02-01

    With the new national licensing regulations for physicians, subsections of social medicine became discrete subjects. The question arises as to which contents social medicine can have in the future, taking important basic conditions into account. These include the progress of medical knowledge, the representation of social medicine at medical faculties, changes in medical care, the transformation of jobs and globalization. In the long term, the effects of demographic development, changes in family structure and the financing of health and illness are also important. Social medicine should promptly make quality-assured contents available, taking the Internet into consideration. Such contents could include comprehensive consultation, the investigation and monitoring of patient careers, and the consultation and investigation of health problems in municipalities and in society. In addition, an inductive and practice-oriented curriculum should be compiled, using the subject catalogue of social medicine as well as a new basic textbook of social medicine.

  12. Human requirements in future air-conditioned environments

    DEFF Research Database (Denmark)

    Fanger, Povl Ole

    2002-01-01

    Air-conditioning of buildings has played a very positive role for economic development in warm climates. Still its image is globally mixed. Field studies demonstrate that there are substantial numbers of dissatisfied people in many buildings, among them those suffering from SBS symptoms, even though existing standards and guidelines are met. A paradigm shift from rather mediocre to excellent indoor environments is foreseen in buildings in the 21st century. Based on existing information and on new research results, five principles are suggested as elements behind a new philosophy of excellence...... to the breathing zone of each individual; individual control of the airflow and/or the thermal environment should be provided. These principles of excellence should be combined with energy efficiency and sustainability of future buildings....

  13. Parallel Computational Fluid Dynamics: Current Status and Future Requirements

    Science.gov (United States)

    Simon, Horst D.; VanDalsem, William R.; Dagum, Leonardo; Kutler, Paul (Technical Monitor)

    1994-01-01

    One of the key objectives of the Applied Research Branch in the Numerical Aerodynamic Simulation (NAS) Systems Division at NASA Ames Research Center is the accelerated introduction of highly parallel machines into a full operational environment. In this report we discuss the performance results obtained from the implementation of some computational fluid dynamics (CFD) applications on the Connection Machine CM-2 and the Intel iPSC/860. We summarize some of the experience gained so far with the parallel testbed machines at the NAS Applied Research Branch. Then we discuss the long term computational requirements for accomplishing some of the grand challenge problems in computational aerosciences. We argue that only massively parallel machines will be able to meet these grand challenge requirements, and we outline the computer science and algorithm research challenges ahead.

  14. Simulator Training Requirements and Effectiveness Study (STRES): Future Research Plans.

    Science.gov (United States)

    1981-01-01

    simulation technology. The AFHRL/OT program, using the ASPT and SAAC devices, is already embarked on an extensive visual technology research effort, one...facilities that would be required to conduct the research described. In some cases, specific research devices are mentioned, such as ASPT , SAAC, and the...configuration of a particular device cannot be foreseen at this point (e.g., the ASPT might have a variety of possible specific cockpit configurations), no

  15. Information Requirements and Consumer Protection in Future M-Commerce

    DEFF Research Database (Denmark)

    Cleff, Evelyne Beatrix; Henschel, Rene Franz

    2006-01-01

    and discusses information requirements of particular interest to m-commerce and how the technological development facilitates but also challenges the traditional way in which legal information is given. In order to prevent over-regulation and hindrances to the technological development, it has been suggested...... that the solution may be relaxed enforcement of the regulatory framework and/or self-regulation, e.g. by codes of conduct. However, the article argues that other possible solutions should be considered, e.g. the use of specific symbols and sounds that - like road traffic rules - could help the consumer to navigate...

  16. Information Requirements and Consumer Protection in Future M-Commerce

    DEFF Research Database (Denmark)

    Cleff, Evelyne Beatrix; Henschel, Rene Franz

    2007-01-01

    and discusses information requirements of particular interest to m-commerce and how the technological development facilitates but also challenges the traditional way in which legal information is given. In order to prevent over-regulation and hindrances to the technological development, it has been suggested...... that the solution may be relaxed enforcement of the regulatory framework and/or self-regulation, e.g. by codes of conduct. However, the article argues that other possible solutions should be considered, e.g. the use of specific symbols and sounds that - like road traffic rules - could help the consumer to navigate...

  17. Solar sorptive cooling. Technologies, user requirements, practical experience, future prospects

    Energy Technology Data Exchange (ETDEWEB)

    Treffinger, P. [DLR Deutsches Zentrum fuer Luft- und Raumfahrt e.V., Hardthausen (Germany); Hertlein, H.P. [eds.] [Forschungsverbund Sonnenenergie, Koeln (Germany)

    1998-09-01

    Sorptive cooling techniques permit the use of low-temperature solar heat, i.e. a renewable energy of low cost and world-wide availability. The Forschungsverbund Sonnenenergie intends to develop solar sorptive cooling technologies to the prototype stage and, in cooperation with the solar industry and its end users, to promote practical application in air conditioning of buildings and cold storage of food. The workshop presents an outline of the state of development of solar sorptive cooling from the view of users and developers. Exemplary solar cooling systems are described, and the potential of open and closed sorptive processes is assessed. Future central activities will be defined in an intensive discussion between planners, producers, users and developers.

  18. Herbicide resistance modelling: past, present and future.

    Science.gov (United States)

    Renton, Michael; Busi, Roberto; Neve, Paul; Thornby, David; Vila-Aiub, Martin

    2014-09-01

    Computer simulation modelling is an essential aid in building an integrated understanding of how different factors interact to affect the evolutionary and population dynamics of herbicide resistance, and thus in helping to predict and manage how agricultural systems will be affected. In this review, we first discuss why computer simulation modelling is such an important tool and framework for dealing with herbicide resistance. We then explain what questions related to herbicide resistance have been addressed to date using simulation modelling, and discuss the modelling approaches that have been used, focusing first on the earlier, more general approaches, and then on some newer, more innovative approaches. We then consider how these approaches could be further developed in the future, by drawing on modelling techniques that are already employed in other areas, such as individual-based and spatially explicit modelling approaches, as well as the possibility of better representing genetics, competition and economics, and finally the questions and issues of importance to herbicide resistance research and management that could be addressed using these new approaches are discussed. We conclude that it is necessary to proceed with caution when increasing the complexity of models by adding new details, but, with appropriate care, more detailed models will make it possible to integrate more current knowledge in order better to understand, predict and ultimately manage the evolution of herbicide resistance.

  19. Future materials requirements for high temperature power engineering components

    Energy Technology Data Exchange (ETDEWEB)

    Marriott, J.B. (Commission of the European Communities, Petten (Netherlands). Joint Nuclear Research Center)

    1989-08-01

    The two dominant technologies in power engineering are steam and gas turbines. These are, however, dependent on a prior stage of combustion and, perhaps, gasification. There is a continuous drive towards higher operating efficiencies and greater reliability of the units. This leads to a need for larger components to operate at higher temperatures and pressures and hence under more arduous conditions of mechanical and corrosive loading for times which may exceed 200,000 h (30 years). Some examples are used to illustrate generic features of the materials problems towards which research and development is aimed. In some components the high temperature time-dependent mechanical properties dominate, a good example being gas turbine blades. Uniformity of the time-dependent mechanical properties plus fracture toughness is difficult to attain in the very large forgings required for steam turbines. Within the heat generation units (boiler tubes, headers, etc.) the mechanical requirements are severe, but would not be critical without the constraints imposed by the need for inexpensive corrosion and erosion resistance. (author).

  20. The Future of Army Item-Level Modeling

    Science.gov (United States)

    1988-02-01

    protection and high mobility, but the reality of all our systems is that they are constrained. And the constraints are many; cost is one, weight is another...Each of these desired future force characteristics is attractive and desirable. As noted earlier, the reality of system design requires that difficult...context of a Unit-Level war game. It seems possible that such a capability can be achieved through augmentation of one of a number of extant models (e.g

  1. Forecast of future aviation fuels: The model

    Science.gov (United States)

    Ayati, M. B.; Liu, C. Y.; English, J. M.

    1981-01-01

    A conceptual model of the commercial air transportation industry is developed that can be used to predict trends in economics, demand, and consumption. The methodology is based on digraph theory, which considers the interaction of variables and propagation of changes. Air transportation economics are treated by examination of major variables, their relationships, historic trends, and calculation of regression coefficients. A description of the modeling technique and a compilation of historic airline industry statistics used to determine interaction coefficients are included. Results of model validations show negligible difference between actual and projected values over the twenty-eight year period of 1959 to 1976. A limited application of the method presents forecasts of air transportation industry demand, growth, revenue, costs, and fuel consumption to 2020 for two scenarios of future economic growth and energy consumption.

  2. Requirements for a Future Astronomical Data Analysis Environment

    Science.gov (United States)

    Grosbl, P.; Banse, K.; Tody, D.; Cotton, W.; Cornwell, T. J.; Ponz, D.; Ignatius, J.; Linde, P.; van der Hulst, T.; Burwitz, V.; Giaretta, D.; Pasian, F.; Garilli, B.; Pence, W.; Shaw, D.

    2005-12-01

    Most of the systems currently used to analyze astronomical data were designed and implemented more than a decade ago. Although they are still very useful for analysis, one would often like a better interface to newer concepts like archives, Virtual Observatories and GRID. Further, incompatibilities between most of the current systems with respect to control language and semantics make it cumbersome to mix applications from different origins. An OPTICON Network, funded by the Sixth Framework Programme of the European Commission, started this year to discuss high-level needs for an astronomical data analysis environment which could provide flexible access to both legacy applications and new astronomical resources. The main objective of the Network is to establish widely accepted requirements and basic design recommendations for such an environment. The hope is that this effort will help other projects that are considering implementing such systems to collaborate and achieve a common environment.

  3. Driver Behavior Modeling: Developments and Future Directions

    Directory of Open Access Journals (Sweden)

    Najah AbuAli

    2016-01-01

    The advances in wireless communication schemes, mobile cloud and fog computing, and context-aware services boost a growing interest in the design, development, and deployment of driver behavior models for emerging applications. Despite the progressive advancements in various aspects of driver behavior modeling (DBM), only limited work can be found that reviews the growing body of literature, and such work typically targets only a subset of DBM processes. Thus a more general review of the diverse aspects of DBM, with an emphasis on the most recent developments, is needed. In this paper, we provide an overview of advances in in-vehicle and smartphone sensing capabilities and communication, review recent applications and services of DBM, and emphasize research challenges and key future directions.

  4. A Framework for Modelling Software Requirements

    Directory of Open Access Journals (Sweden)

    Dhirendra Pandey

    2011-05-01

    Requirements engineering plays an important role in producing quality software products. In recent years, several requirements frameworks have been designed to provide an end-to-end solution for the system development life cycle. Textual requirements specifications are difficult to learn, design, understand, review, and maintain, whereas pictorial modelling is widely recognized as an effective requirements analysis tool. In this paper, we present a requirements modelling framework along with an analysis of modern requirements modelling techniques. We also discuss various domains of requirements engineering with the help of modelling elements such as a semantic map of business concepts, lifecycles of business objects, business processes, business rules, a system context diagram, use cases and their scenarios, constraints, and user interface prototypes. The proposed framework is illustrated with a case study of an inventory management system.

  5. Monthly Water Balance Model Hydrology Futures

    Science.gov (United States)

    Bock, Andy; Hay, Lauren E.; Markstrom, Steven; Atkinson, R. Dwight

    2016-01-01

    A monthly water balance model (MWBM) was driven with precipitation and temperature using a station-based dataset for current conditions (1950 to 2010) and selected statistically-downscaled general circulation models (GCMs) for current and future conditions (1950 to 2099) across the conterminous United States (CONUS) using hydrologic response units from the Geospatial Fabric for National Hydrologic Modeling (http://dx.doi.org/doi:10.5066/F7542KMD). Six MWBM output variables (actual evapotranspiration (AET), potential evapotranspiration (PET), runoff (RO), streamflow (STRM), soil moisture storage (SOIL), and snow water equivalent (SWE)) and the two MWBM input variables (atmospheric temperature (TAVE) and precipitation (PPT)) were summarized for hydrologic response units and aggregated at points of interest on a stream network. Results were then organized into the Monthly Water Balance Hydrology Futures database, an open-access database using netCDF format (http://cida-eros-mows1.er.usgs.gov/thredds/dodsC/nwb_pub/).  Methods used to calibrate and parameterize the MWBM are detailed in the Hydrology and Earth System Sciences (HESS)  paper "Parameter regionalization of a monthly water balance model for the conterminous United States" by Bock and others (2016).  See the discussion paper link in the "Related External Resources" section for access.  Supplemental data files related to the plots and data analysis in Bock and others (2016) can be found in the HESS-2015-325.zip folder in the "Attached Files" section.  Detailed information on the files and data can be found in the ReadMe.txt contained within the zipped folder. Recommended citation of discussion paper:Bock, A.R., Hay, L.E., McCabe, G.J., Markstrom, S.L., and Atkinson, R.D., 2016, Parameter regionalization of a monthly water balance model for the conterminous United States: Hydrology and Earth System Sciences, v. 20, 2861-2876, doi:10.5194/hess-20-2861-2016, 2016
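
    The database is served as netCDF through a THREDDS/OPeNDAP endpoint, so it can in principle be read remotely with standard tools. Below is a minimal sketch using xarray; the dataset path under the THREDDS root, the dimension name hru and the variable name RO are assumptions for illustration and should be checked against the actual catalog.

```python
# Sketch of reading the Monthly Water Balance Hydrology Futures output remotely with
# xarray over OPeNDAP. The dataset path below the THREDDS root, the dimension name
# "hru" and the variable name "RO" (runoff) are assumptions for illustration only;
# consult the THREDDS catalog for the actual names.
import xarray as xr

URL = "http://cida-eros-mows1.er.usgs.gov/thredds/dodsC/nwb_pub/example_dataset"  # hypothetical path

ds = xr.open_dataset(URL)   # lazily opens the remote dataset over OPeNDAP
print(ds)                   # inspect the available variables, dimensions and coordinates

# Example: mean monthly runoff climatology for one hydrologic response unit
ro = ds["RO"]                                   # assumed runoff variable name
monthly_mean = ro.isel(hru=0).groupby("time.month").mean()
print(monthly_mean.values)
```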

  6. A statistically predictive model for future monsoon failure in India

    Science.gov (United States)

    Schewe, Jacob; Levermann, Anders

    2012-12-01

    Indian monsoon rainfall is vital for a large share of the world’s population. Both reliably projecting India’s future precipitation and unraveling abrupt cessations of monsoon rainfall found in paleorecords require improved understanding of its stability properties. While details of monsoon circulations and the associated rainfall are complex, full-season failure is dominated by large-scale positive feedbacks within the region. Here we find that in a comprehensive climate model, monsoon failure is possible but very rare under pre-industrial conditions, while under future warming it becomes much more frequent. We identify the fundamental intraseasonal feedbacks that are responsible for monsoon failure in the climate model, relate these to observational data, and build a statistically predictive model for such failure. This model provides a simple dynamical explanation for future changes in the frequency distribution of seasonal mean all-Indian rainfall. Forced only by global mean temperature and the strength of the Pacific Walker circulation in spring, it reproduces the trend as well as the multidecadal variability in the mean and skewness of the distribution, as found in the climate model. The approach offers an alternative perspective on large-scale monsoon variability as the result of internal instabilities modulated by pre-seasonal ambient climate conditions.
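
    As a rough illustration of what a statistically predictive model of seasonal monsoon failure forced only by global mean temperature and spring Walker circulation strength could look like, the sketch below fits a logistic regression to synthetic data. This is not the authors' formulation; the predictors are simply the two forcings named in the abstract, and all data and coefficients are made up.

```python
# Hedged sketch: one simple way a statistical model of monsoon failure could be
# structured, using logistic regression on the two forcings named in the abstract
# (global mean temperature anomaly and spring Walker circulation strength). This is
# NOT the authors' formulation; all data and coefficients here are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
gmt = rng.normal(0.5, 0.5, n)        # synthetic global mean temperature anomaly (K)
walker = rng.normal(0.0, 1.0, n)     # synthetic spring Walker circulation index
# Synthetic "truth": failure more likely when it is warm and the Walker circulation is weak
p_fail = 1.0 / (1.0 + np.exp(-(-2.0 + 1.5 * gmt - 1.0 * walker)))
failure = rng.random(n) < p_fail

model = LogisticRegression().fit(np.column_stack([gmt, walker]), failure)
print("fitted coefficients:", model.coef_, "intercept:", model.intercept_)
print("P(failure | GMT anomaly = +2 K, Walker index = -1):",
      model.predict_proba([[2.0, -1.0]])[0, 1])
```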

  7. Future Challenges of Modeling THMC Systems

    Science.gov (United States)

    Miller, S. A.; Heinze, T.; Hamidi, S.; Galvan, B.

    2014-12-01

    and future plans, and show that a GPU-based modeling approach is fast, high-resolution, and can reproduce experimental results of fluid injection experiments with numerical resolution at the grain-scale of the experiment.

  8. Integrated Human Futures Modeling in Egypt

    Energy Technology Data Exchange (ETDEWEB)

    Passell, Howard D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Aamir, Munaf Syed [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bernard, Michael Lewis [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Beyeler, Walter E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Fellner, Karen Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hayden, Nancy Kay [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jeffers, Robert Fredric [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Keller, Elizabeth James Kistin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Malczynski, Leonard A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mitchell, Michael David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silver, Emily [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Tidwell, Vincent C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Villa, Daniel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vugrin, Eric D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Engelke, Peter [Atlantic Council, Washington, D.C. (United States); Burrow, Mat [Atlantic Council, Washington, D.C. (United States); Keith, Bruce [United States Military Academy, West Point, NY (United States)

    2016-01-01

    The Integrated Human Futures Project provides a set of analytical and quantitative modeling and simulation tools that help explore the links among human social, economic, and ecological conditions, human resilience, conflict, and peace, and allows users to simulate tradeoffs and consequences associated with different future development and mitigation scenarios. In the current study, we integrate five distinct modeling platforms to simulate the potential risk of social unrest in Egypt resulting from the Grand Ethiopian Renaissance Dam (GERD) on the Blue Nile in Ethiopia. The five platforms simulate hydrology, agriculture, economy, human ecology, and human psychology/behavior, and show how impacts derived from development initiatives in one sector (e.g., hydrology) might ripple through to affect other sectors and how development and security concerns may be triggered across the region. This approach evaluates potential consequences, intended and unintended, associated with strategic policy actions that span the development-security nexus at the national, regional, and international levels. Model results are not intended to provide explicit predictions, but rather to provide system-level insight for policy makers into the dynamics among these interacting sectors, and to demonstrate an approach to evaluating short- and long-term policy trade-offs across different policy domains and stakeholders. The GERD project is critical to government-planned development efforts in Ethiopia but is expected to reduce downstream freshwater availability in the Nile Basin, fueling fears of negative social and economic impacts that could threaten stability and security in Egypt. We tested these hypotheses and came to the following preliminary conclusions. First, the GERD will have an important short-term impact on water availability, food production, and hydropower production in Egypt, depending on the short- term reservoir fill rate. Second, the GERD will have a very small impact on

  9. User requirements and future expectations for geosensor networks – an assessment

    NARCIS (Netherlands)

    Kooistra, L.; Thessler, S.; Bregt, A.K.

    2009-01-01

    Considerable progress has been made on the technical development of sensor networks. However increasing attention is now also required for the broad diversity of end-user requirements for the deployment of sensor networks. An expert survey on the user requirements and future expectations for sensor

  10. Modeling Requirements for Cohort and Register IT.

    Science.gov (United States)

    Stäubert, Sebastian; Weber, Ulrike; Michalik, Claudia; Dress, Jochen; Ngouongo, Sylvie; Stausberg, Jürgen; Winter, Alfred

    2016-01-01

    The project KoRegIT (funded by TMF e.V.) aimed to develop a generic catalog of requirements for research networks such as cohort studies and registers (KoReg). The catalog supports such research networks in building up and managing their organizational and IT infrastructure. The objective is to make transparent the complex relationships between requirements, which are described in use cases from a given text catalog. By analyzing and modeling the requirements, a better understanding and optimization of the catalog are intended. There are two subgoals: a) to investigate one cohort study and two registers and to model the current state of their IT infrastructure; b) to analyze the current-state models and to find simplifications within the generic catalog. Processing the generic catalog was performed by means of text extraction, conceptualization and concept mapping. Methods of enterprise architecture planning (EAP) were then used to model the extracted information. For objective a), questionnaires were developed by utilizing the model. They were used for semi-structured interviews, whose results were evaluated via qualitative content analysis. Afterwards, the current state was modeled. Objective b) was addressed by model analysis. A given generic text catalog of requirements was transferred into a model. As a result of objective a), current-state models of one existing cohort study and two registers were created and analyzed. An optimized model called the KoReg-reference-model is the result of objective b). It is possible to use methods of EAP to model requirements. This enables a better overview of the partly connected requirements by means of visualization. The model-based approach also enables the analysis and comparison of the empirical data from the current-state models. Information managers could reduce the effort of planning the IT infrastructure by utilizing the KoReg-reference-model. Modeling the current state and generating reports from the model, which could be used as

  11. Methodological requirements of the competitive approach in tourist formation of the future teachers

    Directory of Open Access Journals (Sweden)

    Dudorova L.J.

    2014-03-01

    Purpose: to substantiate the methodological requirements of the competitive approach in the tourist preparation of future teachers. Material: the research was based on a study of references and the analysis and synthesis of the collected information, using the method of pedagogical design. Results: the methodological requirements of the competitive approach in the tourist education of future teachers are considered and made concrete. Methods for the objective diagnosis of the tourist preparation of future teachers are described. It is noted that objective diagnosis requires not only subject criteria (as in traditional training) but also systemic, professionally oriented criteria that make it possible to measure the level of formation of the tourist competence of the future teacher. Conclusions: the universal structure of the tourist competence of future teachers consists of the following components: motivational, cognitive, praxeological, individual-psychological, and subjective. The assessment of these components makes it possible to determine the overall level of formation of the tourist competence of the future teacher.

  12. An Extended Analysis of Requirements Traceability Model

    Institute of Scientific and Technical Information of China (English)

    Jiang Dandong(蒋丹东); Zhang Shensheng; Chen Lu

    2004-01-01

    A new extended meta model of traceability is presented. Then, a formalized fine-grained model of traceability is described. Some major issues about this model, including trace units, requirements and relations within the model, are further analyzed. Finally, a case study that comes from a key project of 863 Program is given.

  13. Long-term dynamics simulation: Modeling requirements

    Energy Technology Data Exchange (ETDEWEB)

    Morched, A.S.; Kar, P.K.; Rogers, G.J.; Morison, G.K. (Ontario Hydro, Toronto, ON (Canada))

    1989-12-01

    This report details the required performance and modelling capabilities of a computer program intended for the study of the long term dynamics of power systems. Following a general introduction which outlines the need for long term dynamic studies, the modelling requirements for the conduct of such studies are discussed in detail. Particular emphasis is placed on models for system elements not normally modelled in power system stability programs, which will have a significant impact in the long term time frame of minutes to hours following the initiating disturbance. The report concludes with a discussion of the special computational and programming requirements for a long term stability program. 43 refs., 36 figs.

  14. 17 CFR 1.15 - Risk assessment reporting requirements for futures commission merchants.

    Science.gov (United States)

    2010-04-01

    ... exposure reports filed by such Material Affiliated Person with a foreign futures authority or other foreign... 17 Commodity and Securities Exchanges 1 2010-04-01 2010-04-01 false Risk assessment reporting requirements for futures commission merchants. 1.15 Section 1.15 Commodity and Securities Exchanges COMMODITY...

  15. Modelling of future hydrogeological conditions at SFR

    Energy Technology Data Exchange (ETDEWEB)

    Holmen, L.G.; Stigsson, M. [Golder Associates, Stockholm (Sweden)

    2001-03-01

    The purpose is to estimate the future groundwater movements at the SFR repository and to produce input to the quantitative safety assessment of the SFR. The future flow pattern of the groundwater is of interest, since components of the waste emplaced in a closed and abandoned repository will dissolve in the groundwater and be transported by the groundwater to the ground surface. The study is based on a system analysis approach. Three-dimensional models were devised of the studied domain. The models include the repository tunnels and the surrounding rock mass with fracture zones. The formal models used for simulation of the groundwater flow are three-dimensional mathematical descriptions of the studied hydraulic system. The studied domain is represented on four scales - regional, local, semi local and detailed - forming four models with different resolutions: regional, local, semi local and detailed models. The local and detailed models include a detailed description of the tunnel system at SFR and of surrounding rock mass and fracture zones. In addition, the detailed model includes description of the different structures that take place inside the deposition tunnels. At the area studied, the shoreline will retreat due to the shore level displacement; this process is included in the models. The studied period starts at 2000 AD and continues until a steady state like situation is reached for the surroundings of the SFR at ca 6000 AD. The models predict that as long as the sea covers the ground above the SFR, the regional groundwater flow as well as the flow in the deposition tunnels are small. However, due to the shore level displacement the shoreline (the sea) will retreat. Because of the retreating shoreline, the general direction of the groundwater flow at SFR will change, from vertical upward to a more horizontal flow; the size of the groundwater flow will be increased as well. The present layout of the SFR includes five deposition tunnels: SILO, BMA, BLA, BTF1

  16. The Bioeconomy Model in Future Sustainable Development

    Directory of Open Access Journals (Sweden)

    Ipate Nicolae

    2015-07-01

    The future of sustainable development is the bioeconomy with a "global" solution: both global and local action to develop renewable energy generation. When local solutions are implemented, the basis is laid for global solutions that positively affect the national economy. The bioeconomy strategy is used by society to address urgent problems such as increasing competition for natural resources, climate change and sustainable rural development. The bioeconomy is a new economic and social order and promotes systemic change from using non-renewable resources to renewables. The bioeconomy view of production is that it involves the transformation of a limited stock of matter and energy while obeying the laws that govern entropy in closed systems: entropy, the unavailable forms of matter and energy, tends to increase continuously. Economic growth only increases the apparent output per unit of input, and it draws on the finite stock of matter and energy in the world. The current economy is based on fossil fuels and other material inputs that suffer entropic degradation, both in raw material extraction and in pollution. Even with technical progress, production leads to lower overall yields. The idea of a steady state as the endpoint of economic growth, perpetuated indefinitely as in the pendulum model, is an impossibility

  17. Integrated Human Futures Modeling in Egypt

    Energy Technology Data Exchange (ETDEWEB)

    Passell, Howard D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Aamir, Munaf Syed [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bernard, Michael Lewis [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Beyeler, Walter E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Fellner, Karen Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hayden, Nancy Kay [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jeffers, Robert Fredric [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Keller, Elizabeth James Kistin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Malczynski, Leonard A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mitchell, Michael David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silver, Emily [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Tidwell, Vincent C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Villa, Daniel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vugrin, Eric D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Engelke, Peter [Atlantic Council, Washington, D.C. (United States); Burrow, Mat [Atlantic Council, Washington, D.C. (United States); Keith, Bruce [United States Military Academy, West Point, NY (United States)

    2016-01-01

    The Integrated Human Futures Project provides a set of analytical and quantitative modeling and simulation tools that help explore the links among human social, economic, and ecological conditions, human resilience, conflict, and peace, and allows users to simulate tradeoffs and consequences associated with different future development and mitigation scenarios. In the current study, we integrate five distinct modeling platforms to simulate the potential risk of social unrest in Egypt resulting from the Grand Ethiopian Renaissance Dam (GERD) on the Blue Nile in Ethiopia. The five platforms simulate hydrology, agriculture, economy, human ecology, and human psychology/behavior, and show how impacts derived from development initiatives in one sector (e.g., hydrology) might ripple through to affect other sectors and how development and security concerns may be triggered across the region. This approach evaluates potential consequences, intended and unintended, associated with strategic policy actions that span the development-security nexus at the national, regional, and international levels. Model results are not intended to provide explicit predictions, but rather to provide system-level insight for policy makers into the dynamics among these interacting sectors, and to demonstrate an approach to evaluating short- and long-term policy trade-offs across different policy domains and stakeholders. The GERD project is critical to government-planned development efforts in Ethiopia but is expected to reduce downstream freshwater availability in the Nile Basin, fueling fears of negative social and economic impacts that could threaten stability and security in Egypt. We tested these hypotheses and came to the following preliminary conclusions. First, the GERD will have an important short-term impact on water availability, food production, and hydropower production in Egypt, depending on the short- term reservoir fill rate. Second, the GERD will have a very small impact on

  18. Requirements for clinical information modelling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two-round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling; should a certified list of tools one day exist, any tool that does not meet the essential criteria would be excluded. Recommended requirements are more advanced requirements that may be met by tools offering a superior product, or that are only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, there was a high level of agreement, enabling the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support from end users. This list could also guide regulators in identifying requirements that could be demanded of tools adopted within their institutions.

  19. 17 CFR 1.14 - Risk assessment recordkeeping requirements for futures commission merchants.

    Science.gov (United States)

    2010-04-01

    ... regarding sources of funding, together with a narrative discussion by management of the liquidity of the... 17 Commodity and Securities Exchanges 1 2010-04-01 2010-04-01 false Risk assessment recordkeeping... Related Reporting Requirements § 1.14 Risk assessment recordkeeping requirements for futures...

  20. Climate modelling: IPCC gazes into the future

    Science.gov (United States)

    Raper, Sarah

    2012-04-01

    In 2013, the Intergovernmental Panel on Climate Change will report on the next set of future greenhouse-gas emission scenarios, offering a rational alternative pathway for avoiding dangerous climate change.

  1. A Requirements Analysis Model Based on QFD

    Institute of Scientific and Technical Information of China (English)

    TANG Zhi-wei; Nelson K.H.Tang

    2004-01-01

    The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting this system and regarding it as an important innovation. However, there is already evidence of high failure risks in ERP project implementation; one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model based on quality function deployment (QFD) is presented, which supports requirements analysis for ERP projects.

  2. Modelling UV sky for future UV missions

    Science.gov (United States)

    Sreejith, A. G.; Safanova, M.; Mohan, R.; Murthy, Jayant

    Software simulators are now widely used in all areas of science, especially in application to astronomical missions: from instrument design to mission planning, and to data interpretation. We present a simulator to model the diffuse ultraviolet sky, where the different contributors are separately calculated and added together to produce a sky image of the size specified by the instrument requirements. Each of the contributors to the background (instrumental dark current, airglow, zodiacal light and diffuse galactic light) depends on various factors. Airglow depends on the time of day; zodiacal light on the time of year and on the angle from the Sun and from the ecliptic; and diffuse UV emission depends on the look direction. To provide a full description of any line of sight, we have also added stars. The diffuse UV background light can dominate in many areas of the sky and severely restrict the viewing directions of space telescopes due to excess brightness. The simulator, available as a downloadable package and as a simple web-based tool, can be applied to different missions and instruments. For demonstration, we present examples for two UV missions: the UVIT instrument on the Indian ASTROSAT mission to be launched in the next year and a prospective wide-field mission to search for transients in the UV.
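
    The simulator's structure, as described, is additive: each diffuse-background contributor is computed separately and summed into a sky image. The sketch below mirrors that structure with placeholder component models; the functional forms and numerical levels are illustrative assumptions, not the simulator's actual physics.

```python
# Minimal sketch of the additive structure described in the abstract: each diffuse-UV
# background contributor is computed separately and summed into a sky image. The
# functional forms and numerical levels below are illustrative placeholders, not the
# simulator's actual models.
import numpy as np

def airglow(shape, local_time_hours):
    level = 300.0 if 6 <= local_time_hours <= 18 else 100.0   # assumed day/night levels
    return np.full(shape, level)

def zodiacal_light(shape, sun_angle_deg):
    return np.full(shape, 200.0 + 300.0 * np.cos(np.radians(sun_angle_deg)) ** 2)

def diffuse_galactic_light(shape, galactic_latitude_deg):
    return np.full(shape, 400.0 / (1.0 + abs(galactic_latitude_deg) / 10.0))

def simulate_sky(shape=(256, 256), dark_current=10.0):
    """Sum the background contributors into one sky image (arbitrary photon units)."""
    image = np.zeros(shape)
    image += dark_current                                             # instrumental dark current
    image += airglow(shape, local_time_hours=22)                      # depends on time of day
    image += zodiacal_light(shape, sun_angle_deg=90)                  # depends on solar geometry
    image += diffuse_galactic_light(shape, galactic_latitude_deg=30)  # depends on look direction
    return image

print("mean background level:", simulate_sky().mean())
```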

  3. Mediterranean agriculture: More efficient irrigation needed to compensate increases in future irrigation water requirements

    Science.gov (United States)

    Fader, Marianela; Shi, Sinan; von Bloh, Werner; Bondeau, Alberte; Cramer, Wolfgang

    2016-04-01

    Irrigation in the Mediterranean is of vital importance for food security, employment and economic development. Our research shows that, at present, the Mediterranean region could save 35% of water by implementing more efficient irrigation and conveyance systems. Some countries, such as Syria, Egypt and Turkey, have higher saving potentials than others. Currently some crops, especially sugar cane and agricultural trees, consume on average more irrigation water per hectare than annual crops (1). Also under climate change, more efficient irrigation is of vital importance for counteracting increases in irrigation water requirements. The Mediterranean area as a whole might face an increase in gross irrigation requirements between 4% and 18% from climate change alone by the end of the century if irrigation systems and conveyance are not improved. Population growth increases these numbers to 22% and 74%, respectively, affecting mainly the Southern and Eastern Mediterranean. However, improved irrigation technologies and conveyance systems have large water saving potentials, especially in the Eastern Mediterranean, and may be able to compensate to some degree for the increases due to climate change and population growth. Both subregions would need around 35% more water than today if they could afford some degree of modernization of irrigation and conveyance systems and benefit from the CO2-fertilization effect (1). However, in some scenarios (in this case as combinations of climate change, irrigation technology, influence of population growth and CO2-fertilization effect) water scarcity may constrain the supply of the irrigation water needed in future in Algeria, Libya, Israel, Jordan, Lebanon, Syria, Serbia, Morocco, Tunisia and Spain (1). In this study, vegetation growth, phenology, agricultural production and irrigation water requirements and withdrawal were simulated with the process-based ecohydrological and agro-ecosystem model LPJmL ("Lund-Potsdam-Jena managed Land") after a
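
    The arithmetic behind the compensation argument is that gross irrigation requirement equals the net crop water requirement divided by the product of field and conveyance efficiencies, so efficiency gains can offset a climate-driven rise in net requirements. A worked sketch with illustrative numbers (not the study's results) follows.

```python
# Worked sketch of the arithmetic behind the compensation argument: gross irrigation
# requirement = net crop water requirement / (field efficiency x conveyance efficiency).
# All numbers are illustrative placeholders, not the study's results.

def gross_requirement(net_mm, field_efficiency, conveyance_efficiency):
    return net_mm / (field_efficiency * conveyance_efficiency)

today = gross_requirement(net_mm=600, field_efficiency=0.6, conveyance_efficiency=0.8)

# Assume net requirements rise 18% with climate change, but systems are modernized.
future = gross_requirement(net_mm=600 * 1.18, field_efficiency=0.85, conveyance_efficiency=0.9)

print(f"gross requirement today : {today:6.1f} mm")
print(f"gross requirement future: {future:6.1f} mm ({100 * (future - today) / today:+.1f}% change)")
```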

  4. Modeling and Testing Legacy Data Consistency Requirements

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard

    2003-01-01

    An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data, but based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult....... This paper addresses the need for new techniques that enable the modeling and consistency checking for legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its...... accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers...

  5. The MSFC Solar Activity Future Estimation (MSAFE) Model

    Science.gov (United States)

    Suggs, Ron

    2017-01-01

    The Natural Environments Branch of the Engineering Directorate at Marshall Space Flight Center (MSFC) provides solar cycle forecasts for NASA space flight programs and the aerospace community. These forecasts provide future statistical estimates of sunspot number, solar radio 10.7 cm flux (F10.7), and the geomagnetic planetary index, Ap, for input to various space environment models. For example, many thermosphere density computer models used in spacecraft operations, orbital lifetime analysis, and the planning of future spacecraft missions require the F10.7 and Ap as inputs. The solar forecast is updated each month by executing MSAFE using historical and the latest month's observed solar indices to provide estimates for the balance of the current solar cycle. The forecasted solar indices represent 13-month smoothed values consisting of a best-estimate value stated as the 50th percentile, along with approximate +/- 2 sigma values stated as the 95th and 5th percentiles. This presentation will give an overview of the MSAFE model and the forecast for the current solar cycle.
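
    The 13-month smoothing conventionally applied to solar indices such as sunspot number and F10.7 is a centered 13-point running mean with half weight on the end months. The sketch below implements that conventional filter on synthetic data; it assumes MSAFE's smoothed values follow this standard definition.

```python
# Sketch of the conventional 13-month smoothing applied to solar indices such as the
# sunspot number and F10.7: a centered 13-point running mean with half weight on the
# first and last months. The monthly series below is synthetic and purely illustrative.
import numpy as np

def smooth_13_month(monthly):
    monthly = np.asarray(monthly, dtype=float)
    weights = np.array([0.5] + [1.0] * 11 + [0.5]) / 12.0   # weights sum to 1
    smoothed = np.full(monthly.shape, np.nan)
    for i in range(6, len(monthly) - 6):
        smoothed[i] = np.dot(weights, monthly[i - 6:i + 7])
    return smoothed

months = np.arange(60)
f107 = 70 + 80 * np.sin(np.pi * months / 120) + np.random.default_rng(1).normal(0, 8, 60)
print(np.round(smooth_13_month(f107)[6:12], 1))
```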

  6. Required experimental accuracy to select between supersymmetrical models

    Indian Academy of Sciences (India)

    David Grellscheid

    2004-03-01

    We will present a method to decide a priori whether various supersymmetrical scenarios can be distinguished based on sparticle mass data alone. For each model, a scan over all free SUSY breaking parameters reveals the extent of that model's physically allowed region of sparticle-mass-space. Based on the geometrical configuration of these regions in mass-space, it is possible to obtain an estimate of the required accuracy of future sparticle mass measurements to distinguish between the models. We will illustrate this algorithm with an example. This talk is based on work done in collaboration with B C Allanach (LAPTH, Annecy) and F Quevedo (DAMTP, Cambridge).

  7. A fuzzy model for exploiting customer requirements

    Directory of Open Access Journals (Sweden)

    Zahra Javadirad

    2016-09-01

    Quality function deployment (QFD) is one of the total quality management tools: customers' views and requirements are captured and, using various techniques, translated into improved production requirements and operations. After identifying and analyzing the competitors, the QFD department uses customer feedback to meet customers' demands for the products relative to those competitors. In this study, a comprehensive model for assessing the importance of customer requirements in an organization's products or services is proposed. The proposed study uses linguistic variables, as a more comprehensive approach, to increase the precision of the evaluations. The importance of these requirements indicates the strengths and weaknesses of the organization in meeting the requirements relative to competitors. The results of the experiments show that the proposed method performs better than the other methods.
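
    One common way to implement such a linguistic-variable assessment is to map the terms to triangular fuzzy numbers, average them across experts, and defuzzify to a crisp importance score. The sketch below illustrates that generic approach; the scale, aggregation operator and defuzzification are assumptions, not necessarily those used in this study.

```python
# Hedged sketch of aggregating linguistic importance ratings with triangular fuzzy
# numbers, one common way a fuzzy QFD-style assessment can be implemented. The scale,
# the averaging operator and the centroid defuzzification are generic choices, not
# necessarily those used in the cited study.

# Linguistic terms mapped to triangular fuzzy numbers (low, mode, high)
SCALE = {
    "very low":  (0.00, 0.00, 0.25),
    "low":       (0.00, 0.25, 0.50),
    "medium":    (0.25, 0.50, 0.75),
    "high":      (0.50, 0.75, 1.00),
    "very high": (0.75, 1.00, 1.00),
}

def aggregate(ratings):
    """Average the triangular fuzzy numbers of several experts' ratings."""
    triples = [SCALE[r] for r in ratings]
    return tuple(sum(t[k] for t in triples) / len(triples) for k in range(3))

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number."""
    return sum(tfn) / 3.0

requirements = {            # hypothetical customer requirements and expert ratings
    "easy ordering": ["high", "very high", "medium"],
    "fast delivery": ["very high", "very high", "high"],
    "low price":     ["medium", "high", "medium"],
}

for name, ratings in requirements.items():
    print(f"{name:14s} crisp importance = {defuzzify(aggregate(ratings)):.3f}")
```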

  8. QUANTIFYING AN UNCERTAIN FUTURE: HYDROLOGIC MODEL PERFORMANCE FOR A SERIES OF REALIZED "FUTURE" CONDITIONS

    Science.gov (United States)

    A systematic analysis of model performance during simulations based on observed landcover/use change is used to quantify errors associated with simulations of known "future" conditions. Calibrated and uncalibrated assessments of relative change over different lengths of...

  9. 17 CFR 1.17 - Minimum financial requirements for futures commission merchants and introducing brokers.

    Science.gov (United States)

    2010-04-01

    ... but not limited to customers, general creditors, subordinated lenders, minority shareholders... Commission (17 CFR 240.15c3-1(a)). (ii) Each person registered as a futures commission merchant engaged in... capital required by Rule 15c3-1(a) of the Securities and Exchange Commission (17 CFR 240.15c3-1(a))....

  10. Organizing for the Future Requires the Non-Aristotelian Lens of a Dragonfly.

    Science.gov (United States)

    Collins, Marla Del

    To organize for the future requires non-Aristotelian thinking...a multifaceted wide-angle lens revealing hidden information. A multifaceted lens includes at least three general systems of evaluation, all of which promote complex thinking. The three systems are general semantics, postmodern feminist philosophy, and the unifying principle of…

  11. User Requirements and Domain Model Engineering

    NARCIS (Netherlands)

    Specht, Marcus; Glahn, Christian

    2006-01-01

    Specht, M., & Glahn, C. (2006). User requirements and domain model engineering. Presentation at International Workshop in Learning Networks for Lifelong Competence Development. March, 30-31, 2006. Sofia, Bulgaria: TENCompetence Conference. Retrieved June 30th, 2006, from http://dspace.learningnetwor

  12. User Requirements and Domain Model Engineering

    NARCIS (Netherlands)

    Specht, Marcus; Glahn, Christian

    2006-01-01

    Specht, M., & Glahn, C. (2006). User requirements and domain model engineering. Presentation at International Workshop in Learning Networks for Lifelong Competence Development. March, 30-31, 2006. Sofia, Bulgaria: TENCompetence Conference. Retrieved June 30th, 2006, from http://dspace.learningnetwor

  13. A generic hydroeconomic model to assess future water scarcity

    Science.gov (United States)

    Neverre, Noémie; Dumas, Patrice

    2015-04-01

    We developed a generic hydroeconomic model able to confront future water supply and demand on a large scale, taking into account man-made reservoirs. The assessment is done at the scale of river basins, using only globally available data; the methodology can thus be generalized. On the supply side, we evaluate the impacts of climate change on water resources. The available quantity of water at each site is computed using the following information: runoff is taken from the outputs of the CNRM climate model (Dubois et al., 2010), reservoirs are located using Aquastat, and the sub-basin flow-accumulation area of each reservoir is determined based on a Digital Elevation Model (HYDRO1k). On the demand side, agricultural and domestic demands are projected in terms of both quantity and economic value. For the agricultural sector, globally available data on irrigated areas and crops are combined in order to determine the localization of irrigated crops. Then, crop irrigation requirements are computed for the different stages of the growing season using the Allen (1998) method with Hargreaves potential evapotranspiration. The economic value of irrigation water is based on a yield-comparison approach between rainfed and irrigated crops. Potential irrigated and rainfed yields are taken from LPJmL (Blondeau et al., 2007), or from FAOSTAT by making simple assumptions on yield ratios. For the domestic sector, we project the combined effects of demographic growth, economic development and water cost evolution on future demands. The method consists of building three-block inverse demand functions whose volume limits evolve with the level of GDP per capita. The value of water along the demand curve is determined from price-elasticity, price and demand data from the literature, using the point-expansion method, and from water costs data. Projected demands are then confronted with future water availability. Operating rules of the reservoirs and water allocation between demands are based on
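
    A three-block inverse demand function assigns a high marginal value to a basic-needs block, a lower value to a comfort block and the lowest to a discretionary block, with block limits that grow with GDP per capita. The sketch below evaluates such a stepwise curve; block volumes, prices and the income scaling are placeholders, not the study's calibrated values.

```python
# Minimal sketch of a three-block inverse demand function for domestic water: a high
# marginal value for a basic-needs block, a lower value for a comfort block and the
# lowest for a discretionary block, with block limits growing with GDP per capita.
# Block volumes, prices and the income scaling are placeholders, not calibrated values.

def block_limits(gdp_per_capita):
    """Per-capita block limits in litres/day, assumed to grow with income."""
    scale = 1.0 + 0.2 * (gdp_per_capita / 10_000)
    return 50.0, 50.0 + 80.0 * scale, 50.0 + 200.0 * scale

def marginal_value(volume_lpcd, gdp_per_capita, prices=(3.0, 1.0, 0.3)):
    """Marginal value ($/m3, assumed) of the last litre consumed at a given volume."""
    b1, b2, b3 = block_limits(gdp_per_capita)
    if volume_lpcd <= b1:
        return prices[0]
    if volume_lpcd <= b2:
        return prices[1]
    if volume_lpcd <= b3:
        return prices[2]
    return 0.0   # demand saturated beyond the third block

for v in (30, 100, 250, 400):
    print(f"{v:3d} L/person/day -> marginal value {marginal_value(v, 20_000):.2f} $/m3")
```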

  14. Modeling requirements for in situ vitrification

    Energy Technology Data Exchange (ETDEWEB)

    MacKinnon, R.J.; Mecham, D.C.; Hagrman, D.L.; Johnson, R.W.; Murray, P.E.; Slater, C.E.; Marwil, E.S.; Weaver, R.A.; Argyle, M.D.

    1991-11-01

    This document outlines the requirements for the model being developed at the INEL which will provide analytical support for the ISV technology assessment program. The model includes representations of the electric potential field, thermal transport with melting, gas and particulate release, vapor migration, off-gas combustion and process chemistry. The modeling objectives are to (1) help determine the safety of the process by assessing the air and surrounding soil radionuclide and chemical pollution hazards, the nuclear criticality hazard, and the explosion and fire hazards, (2) help determine the suitability of the ISV process for stabilizing the buried wastes involved, and (3) help design laboratory and field tests and interpret results therefrom.

  15. Future of Plant Functional Types in Terrestrial Biosphere Models

    Science.gov (United States)

    Wullschleger, S. D.; Euskirchen, E. S.; Iversen, C. M.; Rogers, A.; Serbin, S.

    2015-12-01

    Earth system models describe the physical, chemical, and biological processes that govern our global climate. While it is difficult to single out one component as being more important than another in these sophisticated models, terrestrial vegetation is a critical player in the biogeochemical and biophysical dynamics of the Earth system. There is much debate, however, as to how plant diversity and function should be represented in these models. Plant functional types (PFTs) have been adopted by modelers to represent broad groupings of plant species that share similar characteristics (e.g. growth form) and roles (e.g. photosynthetic pathway) in ecosystem function. In this review the PFT concept is traced from its origin in the early 1800s to its current use in regional and global dynamic vegetation models (DVMs). Special attention is given to the representation and parameterization of PFTs and to validation and benchmarking of predicted patterns of vegetation distribution in high-latitude ecosystems. These ecosystems are sensitive to changing climate and thus provide a useful test case for model-based simulations of past, current, and future distribution of vegetation. Models that incorporate the PFT concept predict many of the emerging patterns of vegetation change in tundra and boreal forests, given known processes of tree mortality, treeline migration, and shrub expansion. However, representation of above- and especially belowground traits for specific PFTs continues to be problematic. Potential solutions include developing trait databases and replacing fixed parameters for PFTs with formulations based on trait co-variance and empirical trait-environment relationships. Surprisingly, despite being important to land-atmosphere interactions of carbon, water, and energy, PFTs such as moss and lichen are largely absent from DVMs. Close collaboration among those involved in modelling with the disciplines of taxonomy, biogeography, ecology, and remote sensing will be

  16. Distributed Control and Management of Renewable Electric Energy Resources for Future Grid Requirements

    DEFF Research Database (Denmark)

    Mokhtari, Ghassem; Anvari-Moghaddam, Amjad; Nourbakhsh, Ghavameddin

    2016-01-01

    It is anticipated that both medium- and low-voltage distribution networks will include a high level of distributed renewable energy resources in the future. The high penetration of these resources can inevitably introduce various power quality issues, including overvoltage and overloading...... strategy is a promising approach to manage and utilise the resources in future distribution networks to effectively deal with grid power quality issues and requirements. Jointly, the utility and the customers, as the owners of the resources in the network, are considered as part of a practical coordination strategy...

  17. Model Waveform Accuracy Requirements for the $\\chi^2$ Discriminator

    CERN Document Server

    Lindblom, Lee

    2016-01-01

    This paper derives accuracy standards for model gravitational waveforms required to ensure proper use of the $\\chi^2$ discriminator test in gravitational wave (GW) data analysis. These standards are different from previously established requirements for detection and waveform parameter measurement based on signal-to-noise optimization. We present convenient formulae both for evaluating and interpreting the contribution of model errors to measured $\\chi^2$ values. Motivated by these formulae, we also present an enhanced, complexified variant of the standard $\\chi^2$ statistic used in GW searches. While our results are not directly relevant to current searches (which use the $\\chi^2$ test only to veto signal candidates with extremely high $\\chi^2$ values), they could be useful in future GW searches and as figures of merit for model gravitational waveforms.
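
    The paper's own formulae are not reproduced in this excerpt. For orientation, the conventional time-frequency discriminator it builds on splits the template into p frequency bands of equal expected signal-to-noise ratio and compares the per-band matched-filter outputs z_i with the total z:

      \chi^2 = p \sum_{i=1}^{p} \left| z_i - \frac{z}{p} \right|^2, \qquad z = \sum_{i=1}^{p} z_i .

    For a signal that matches the template exactly, this statistic follows a chi-squared distribution with 2p - 2 degrees of freedom in Gaussian noise; waveform-model errors add a positive bias, which is what accuracy standards of this kind aim to bound.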

  18. The Global Earthquake Model - Past, Present, Future

    Science.gov (United States)

    Smolka, Anselm; Schneider, John; Stein, Ross

    2014-05-01

    The Global Earthquake Model (GEM) is a unique collaborative effort that aims to provide organizations and individuals with tools and resources for transparent assessment of earthquake risk anywhere in the world. By pooling data, knowledge and people, GEM acts as an international forum for collaboration and exchange. Sharing of data and risk information, best practices, and approaches across the globe are key to assessing risk more effectively. Through consortium-driven global projects, open-source IT development and collaborations with more than 10 regions, leading experts are developing unique global datasets, best practice, open tools and models for seismic hazard and risk assessment. The year 2013 has seen the completion of ten global data sets or components addressing various aspects of earthquake hazard and risk, as well as two GEM-related, but independently managed regional projects SHARE and EMME. Notably, the International Seismological Centre (ISC) led the development of a new ISC-GEM global instrumental earthquake catalogue, which was made publicly available in early 2013. It has set a new standard for global earthquake catalogues and has found widespread acceptance and application in the global earthquake community. By the end of 2014, GEM's OpenQuake computational platform will provide the OpenQuake hazard/risk assessment software and integrate all GEM data and information products. The public release of OpenQuake is planned for the end of 2014 and will comprise the following datasets and models: • ISC-GEM Instrumental Earthquake Catalogue (released January 2013) • Global Earthquake History Catalogue [1000-1903] • Global Geodetic Strain Rate Database and Model • Global Active Fault Database • Tectonic Regionalisation Model • Global Exposure Database • Buildings and Population Database • Earthquake Consequences Database • Physical Vulnerabilities Database • Socio-Economic Vulnerability and Resilience Indicators • Seismic

  19. Understanding requirements via natural language information modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sharp, J.K.; Becker, S.D.

    1993-07-01

    Information system requirements that are expressed as simple English sentences provide a clear understanding of what is needed between system specifiers, administrators, users, and developers of information systems. The approach used to develop the requirements is the Natural-language Information Analysis Methodology (NIAM). NIAM allows the processes, events, and business rules to be modeled using natural language. The natural language presentation enables the people who deal with the business issues that are to be supported by the information system to describe exactly the system requirements that designers and developers will implement. Computer prattle is completely eliminated from the requirements discussion. An example is presented that is based upon a section of a DOE Order involving nuclear materials management. Where possible, the section is analyzed to specify the process(es) to be done, the event(s) that start the process, and the business rules that are to be followed during the process. Examples, including constraints, are developed. The presentation steps through the modeling process and shows where the section of the DOE Order needs clarification, extensions or interpretations that could provide a more complete and accurate specification.

  20. Future of human models for crash analysis

    NARCIS (Netherlands)

    Wismans, J.S.H.M.; Happee, R.; Hoof, J.F.A.M. van; Lange, R. de

    2001-01-01

    In the crash safety field mathematical models can be applied in practically all area's of research and development including: reconstruction of actual accidents, design (CAD) of the crash response of vehicles, safety devices and roadside facilities and in support of human impact biomechanical

  1. Modeling and Simulation in Healthcare Future Directions

    Science.gov (United States)

    2010-07-13

    [Fragmentary slide text; recoverable points: quantifying performance (competency-based); simulating before practice using digital libraries; classic education and examination versus a coming "revolution"; indicative costs of roughly $800,000/yr, actor patients $250,000–$400,000/yr, and digital libraries or synthetic tissue models with subscription versus up-front costs.]

  2. Future of human models for crash analysis

    NARCIS (Netherlands)

    Wismans, J.S.H.M.; Happee, R.; Hoof, J.F.A.M. van; Lange, R. de

    2001-01-01

    In the crash safety field mathematical models can be applied in practically all area's of research and development including: reconstruction of actual accidents, design (CAD) of the crash response of vehicles, safety devices and roadside facilities and in support of human impact biomechanical studie

  3. Control Architecture Modeling for Future Power Systems

    DEFF Research Database (Denmark)

    Heussen, Kai

    and operation structures; and finally the application to some concrete study cases, including a present system balancing, and proposed control structures such as Microgrids and Cells. In the second part, the main contributions are the outline of a formation strategy, integrating the design and model...

  4. Consequences for designer and manufacturer of mechanical components due to future requirements in Europe

    Energy Technology Data Exchange (ETDEWEB)

    Hans-Joachim, Frank [Siemens/ SNP-NDM1, Erlangen (Germany)

    2001-07-01

    In the frame of European harmonization, many changes to the requirements for designers and manufacturers of mechanical components have been made. Different organizations are involved in preparing future requirements for nuclear applications. On one side is the French-German cooperation on the development of the EPR. At the origin of this project was the common decision in 1989 of Framatome and Siemens to cooperate through NPI to design the Nuclear Island, which meets the future needs of utilities. EDF and a group of the main German utilities joined this cooperation in 1991 and since then have been fully involved in the progress of the work. In addition, the whole process was backed up to the end by the strong cooperation between the French and German Safety Authorities, which have a long-lasting cooperation to define common requirements to be applied to future Nuclear Power Plants. Furthermore, an organization has been set up to elaborate common codes related to the EPR design, at the level of the French design and construction rules (RCC) or the German KTA safety standards, the so-called EPR technical codes (ETC). On the other side, the European utilities co-operate on a much broader basis for the establishment of European Utilities Requirements (EUR). These requirements are prepared by a group of European utilities that represent the major European electricity generating companies determined to keep the nuclear option open. The technical requirements specified in the EUR document define the boundaries within which future plants need to be designed in order to be acceptable for the needs of the utilities and to fulfill the basic requirements of competitive power generation costs and licensability in all countries represented in the EUR group. All the new requirements have to be applied by designers and manufacturers. Siemens/SNP acts as a designer of various vessels and tanks, heat exchangers and other items of process

  5. Consequences for designer and manufacturer of mechanical components due to future requirements in Europe

    Energy Technology Data Exchange (ETDEWEB)

    Hans-Joachim, Frank [Siemens/ SNP-NDM1, Erlangen (Germany)

    2001-07-01

    In the frame of European harmonization, many changes to the requirements for designers and manufacturers of mechanical components have been made. Different organizations are involved in preparing future requirements for nuclear applications. On one side is the French-German cooperation on the development of the EPR. At the origin of this project was the common decision in 1989 of Framatome and Siemens to cooperate through NPI to design the Nuclear Island, which meets the future needs of utilities. EDF and a group of the main German utilities joined this cooperation in 1991 and since then have been fully involved in the progress of the work. In addition, the whole process was backed up to the end by the strong cooperation between the French and German Safety Authorities, which have a long-lasting cooperation to define common requirements to be applied to future Nuclear Power Plants. Furthermore, an organization has been set up to elaborate common codes related to the EPR design, at the level of the French design and construction rules (RCC) or the German KTA safety standards, the so-called EPR technical codes (ETC). On the other side, the European utilities co-operate on a much broader basis for the establishment of European Utilities Requirements (EUR). These requirements are prepared by a group of European utilities that represent the major European electricity generating companies determined to keep the nuclear option open. The technical requirements specified in the EUR document define the boundaries within which future plants need to be designed in order to be acceptable for the needs of the utilities and to fulfill the basic requirements of competitive power generation costs and licensability in all countries represented in the EUR group. All the new requirements have to be applied by designers and manufacturers. Siemens/SNP acts as a designer of various vessels and tanks, heat exchangers and other items of process

  6. The modelling of future energy scenarios for Denmark

    DEFF Research Database (Denmark)

    Kwon, Pil Seok

    2014-01-01

    for the important but uncertain areas biomass and flexible demand are performed. Thirdly, modelling-related issues are investigated with a focus on the effect of future forecasting assumption and differences between a predefined priority order and order determined by given efficiencies and constraints...... the overall energy system model for analyzing three subjects which are important but uncertain areas in the future. The first model is a consequential LCA analysis for biomass potential. The second model targets transport demand due to uncertain technology development in the future transport sector. The third...... performance, more than a quarter of the classic electricity demand would need to be flexible within a month, which is highly unlikely to happen. For the investigation of the energy system model, EnergyPLAN, which is used for two scenario analyses, two questions are asked; “what is the value of future...

  7. Large-scale computation at PSI scientific achievements and future requirements

    Energy Technology Data Exchange (ETDEWEB)

    Adelmann, A.; Markushin, V

    2008-11-15

    and Networking' (SNSP-HPCN) is discussing this complex. Scientific results which are made possible by PSI's engagement at CSCS (named Horizon) are summarised and PSI's future high-performance computing requirements are evaluated. The data collected shows the current situation and a 5 year extrapolation of the users' needs with respect to HPC resources is made. In consequence this report can serve as a basis for future strategic decisions with respect to a non-existing HPC road-map for PSI. PSI's institutional HPC area started hardware-wise approximately in 1999 with the assembly of a 32-processor LINUX cluster called Merlin. Merlin was upgraded several times, lastly in 2007. The Merlin cluster at PSI is used for small scale parallel jobs, and is the only general purpose computing system at PSI. Several dedicated small scale clusters followed the Merlin scheme. Many of the clusters are used to analyse data from experiments at PSI or CERN, because dedicated clusters are most efficient. The intellectual and financial involvement of the procurement (including a machine update in 2007) results in a PSI share of 25 % of the available computing resources at CSCS. The (over) usage of available computing resources by PSI scientists is demonstrated. We actually get more computing cycles than we have paid for. The reason is the fair share policy that is implemented on the Horizon machine. This policy allows us to get cycles, with a low priority, even when our bi-monthly share is used. Five important observations can be drawn from the analysis of the scientific output and the survey of future requirements of main PSI HPC users: (1) High Performance Computing is a main pillar in many important PSI research areas; (2) there is a lack in the order of 10 times the current computing resources (measured in available core-hours per year); (3) there is a trend to use in the order of 600 processors per average production run; (4) the disk and tape storage growth

  8. Modelling Long Memory Volatility in Agricultural Commodity Futures Returns

    NARCIS (Netherlands)

    C-L. Chang (Chia-Lin); M.J. McAleer (Michael); R. Tansuchat (Roengchai)

    2012-01-01

    textabstractThis paper estimates a long memory volatility model for 16 agricultural commodity futures returns from different futures markets, namely corn, oats, soybeans, soybean meal, soybean oil, wheat, live cattle, cattle feeder, pork, cocoa, coffee, cotton, orange juice, Kansas City wheat, rubbe
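
    The excerpt does not spell out the specification; a standard long-memory volatility model in this literature is the FIGARCH(1, d, 1) of Baillie, Bollerslev and Mikkelsen (1996), sketched here for orientation (the paper's exact model may differ):

      \sigma_t^2 = \omega + \beta\, \sigma_{t-1}^2 + \left[ 1 - \beta L - (1 - \phi L)(1 - L)^d \right] \varepsilon_t^2 , \qquad 0 \le d \le 1,

    where L is the lag operator and the fractional-differencing parameter d governs long memory: d = 0 recovers GARCH(1,1), while d = 1 gives IGARCH-type persistence.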

  9. Modelling Long Memory Volatility in Agricultural Commodity Futures Returns

    NARCIS (Netherlands)

    R. Tansuchat (Roengchai); C-L. Chang (Chia-Lin); M.J. McAleer (Michael)

    2009-01-01

    textabstractThis paper estimates the long memory volatility model for 16 agricultural commodity futures returns from different futures markets, namely corn, oats, soybeans, soybean meal, soybean oil, wheat, live cattle, cattle feeder, pork, cocoa, coffee, cotton, orange juice, Kansas City wheat, rub

  10. Efficiency and economics of large scale hydrogen liquefaction. [for future generation aircraft requirements

    Science.gov (United States)

    Baker, C. R.

    1975-01-01

    Liquid hydrogen is being considered as a substitute for conventional hydrocarbon-based fuels for future generations of commercial jet aircraft. Its acceptance will depend, in part, upon the technology and cost of liquefaction. The process and economic requirements for providing a sufficient quantity of liquid hydrogen to service a major airport are described. The design is supported by thermodynamic studies which determine the effect of process arrangement and operating parameters on the process efficiency and work of liquefaction.
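
    For orientation, the thermodynamic floor on the work of liquefaction that such process studies compare against is the reversible work between ambient-temperature gas and saturated liquid; the figures below are commonly cited orders of magnitude, not values from this paper:

      w_{min} = (h_{liq} - h_{gas}) - T_0\,(s_{liq} - s_{gas}),

    with T_0 the ambient temperature. For hydrogen this works out to roughly 3-4 kWh per kg of H2, whereas practical large-scale liquefiers require several times that, which is why process arrangement and operating parameters weigh so heavily on cost.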

  11. Long-term durum wheat monoculture: modelling and future projection

    OpenAIRE

    Ettore Bernardoni; Marco Acutis; Domenico Ventrella

    2012-01-01

    The potential effects of future climate change on grain production of a winter durum wheat cropping system were investigated. Based on future climate change projections, derived from a statistical downscaling process applied to the HadCM3 general circulation model and referred to two IPCC scenarios (A2 and B1), the response in yield and aboveground biomass (AGB) and the variation in total organic carbon (TOC) were explored. The software used in this work is a hybrid dynamic simulation model ...

  12. Renewable Energy Requirements for Future Building Codes: Energy Generation and Economic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Russo, Bryan J.; Weimar, Mark R.; Dillon, Heather E.

    2011-09-30

    As the model energy codes are improved to reach efficiency levels 50 percent greater than current codes, installation of on-site renewable energy generation is likely to become a code requirement. This requirement will be needed because traditional mechanisms for code improvement, including the building envelope, mechanical systems, and lighting, have been maximized at the most cost-effective limit.

  13. Risk-informed assessment of regulatory and design requirements for future nuclear power plants. Annual report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-08-01

    OAK B188 Risk-informed assessment of regulatory and design requirements for future nuclear power plants. Annual report. The overall goal of this research project is to support innovation in new nuclear power plant designs. This project is examining the implications, for future reactors and future safety regulation, of utilizing a new risk-informed regulatory system as a replacement for the current system. This innovation will be made possible through development of a scientific, highly risk-informed approach for the design and regulation of nuclear power plants. This approach will include the development and/or confirmation of corresponding regulatory requirements and industry standards. The major impediment to long term competitiveness of new nuclear plants in the U.S. is the capital cost component--which may need to be reduced on the order of 35% to 40% for Advanced Light Water Reactors (ALWRs) such as System 80+ and Advanced Boiling Water Reactor (ABWR). The required cost reduction for an ALWR such as AP600 or AP1000 would be expected to be less. Such reductions in capital cost will require a fundamental reevaluation of the industry standards and regulatory bases under which nuclear plants are designed and licensed. Fortunately, there is now an increasing awareness that many of the existing regulatory requirements and industry standards are not significantly contributing to safety and reliability and, therefore, are unnecessarily adding to nuclear plant costs. Not only does this degrade the economic competitiveness of nuclear energy, it results in unnecessary costs to the American electricity consumer. While addressing these concerns, this research project will be coordinated with current efforts of industry and NRC to develop risk-informed, performance-based regulations that affect the operation of the existing nuclear plants; however, this project will go further by focusing on the design of new plants.

  14. Building models for marketing decisions : past, present and future

    NARCIS (Netherlands)

    Leeflang, P.S.H.; Wittink, Dick R.

    2000-01-01

    We review five eras of model building in marketing, with special emphasis on the fourth and the fifth eras, the present and the future. At many firms managers now routinely use model-based results for marketing decisions. Given an increasing number of successful applications, the demand for models t

  15. Building models for marketing decisions : Past, present and future

    NARCIS (Netherlands)

    Leeflang, PSH; Wittink, DR

    2000-01-01

    We review five eras of model building in marketing, with special emphasis on the fourth and the fifth eras, the present and the future. At many firms managers now routinely use model-based results for marketing decisions. Given an increasing number of successful applications, the demand for models t

  16. Analyzing Oil Futures with a Dynamic Nelson-Siegel Model

    DEFF Research Database (Denmark)

    Hansen, Niels Strange; Lunde, Asger

    In this paper we are interested in the term structure of futures contracts on oil. The objective is to specify a relatively parsimonious model which explains data well and performs well in real-time out-of-sample forecasting. The dynamic Nelson-Siegel model is normally used to analyze and forecast interest rates of different maturities. The structure of oil futures resembles the structure of interest rates, and this motivates the use of this model for our purposes. The data set is vast and the dynamic Nelson-Siegel model allows for a significant dimension reduction by introducing three...
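
    A minimal sketch of the dynamic Nelson-Siegel factor loadings and a single-date cross-sectional least-squares fit, assuming the standard level/slope/curvature parameterization; the maturities, the decay parameter lam and the example observations are hypothetical (the paper applies the model to oil futures prices, while a generic yield-like cross-section is used here).

      # Dynamic Nelson-Siegel loadings and a one-date least-squares fit (sketch).
      import numpy as np

      def ns_loadings(tau, lam):
          """Columns: level, slope, curvature loadings for maturities tau (years)."""
          x = lam * tau
          slope = (1 - np.exp(-x)) / x
          curv = slope - np.exp(-x)
          return np.column_stack([np.ones_like(tau), slope, curv])

      def fit_cross_section(tau, y, lam):
          """Least-squares estimate of (level, slope, curvature) for one date."""
          X = ns_loadings(tau, lam)
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          return beta

      tau = np.array([0.25, 0.5, 1.0, 2.0, 3.0, 5.0])        # maturities in years
      y = np.array([0.021, 0.022, 0.024, 0.027, 0.029, 0.031])
      print(fit_cross_section(tau, y, lam=0.6))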

  17. Asian water futures - Multi scenarios, models and criteria assessment -

    Science.gov (United States)

    Satoh, Yusuke; Burek, Peter; Wada, Yoshihide; Flrörke, Martina; Eisner, Stephanie; Hanasaki, Naota; Kahil, Taher; Tramberend, Sylvia; Fischer, Günther; Wiberg, David

    2016-04-01

    A better understanding of the current and future availability of water resources is essential for the implementation of the recently agreed Sustainable Development Goals (SDGs). Long-term, efficient strategies for coping with current and potential future water-related challenges are urgently required. Although Representative Concentration Pathways (RCPs) and Shared Socioeconomic Pathways (SSPs) were developed for the impact assessment of climate change, very few assessments have yet used the SSPs to assess water resources. The IIASA Water Futures and Solutions Initiative (WFaS) therefore developed a set of water use scenarios consistent with the RCPs and SSPs, applying the latest climate change scenarios. Here this study focuses on results for Asian countries for the period 2010-2050. We present three conceivable future pathways of Asian water resources, determined by feasible combinations of two RCPs and three SSPs. Such a scenario approach provides valuable insights for identifying appropriate strategies by highlighting gaps between a "scenario world" and reality. In addition, a multi-criteria analysis is applied for the assessment of future water resources: a classification system for countries and watersheds that consists of two broad dimensions, (i) economic and institutional adaptive capacity and (ii) hydrological complexity. The latter is composed of several sub-indexes, including total renewable water resources per capita, the ratio of water demand to renewable water resources, variability of runoff, and dependency on external resources. Furthermore, this analysis uses a multi-model approach to estimate runoff and discharge using 5 GCMs and 5 global hydrological models (GHMs). Three of these GHMs calculate water use based on a consistent set of scenarios in addition to water availability. As a result, we have projected hot spots of water scarcity in Asia and their spatial and temporal change. For example, in a scenario based on SSP2 and RCP6.0, by 2050, in total 2.1 billion people
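
    A toy illustration of one of the hydrological-complexity sub-indexes mentioned above, the ratio of water demand to renewable water resources, using the commonly cited 20%/40% stress thresholds; the thresholds, function name and numbers are illustrative assumptions, not WFaS values.

      # Toy water-stress indicator: withdrawals over renewable resources (sketch).
      def stress_class(withdrawals_km3, renewable_km3):
          ratio = withdrawals_km3 / renewable_km3
          if ratio < 0.20:
              return ratio, "low stress"
          if ratio < 0.40:
              return ratio, "medium stress"
          return ratio, "high stress"

      print(stress_class(withdrawals_km3=85.0, renewable_km3=210.0))   # hypothetical basin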

  18. Simulating future supply of and requirements for human resources for health in high-income OECD countries.

    Science.gov (United States)

    Tomblin Murphy, Gail; Birch, Stephen; MacKenzie, Adrian; Rigby, Janet

    2016-12-12

    As part of efforts to inform the development of a global human resources for health (HRH) strategy, a comprehensive methodology for estimating HRH supply and requirements was described in a companion paper. The purpose of this paper is to demonstrate the application of that methodology, using data publicly available online, to simulate the supply of and requirements for midwives, nurses, and physicians in the 32 high-income member countries of the Organisation for Economic Co-operation and Development (OECD) up to 2030. A model combining a stock-and-flow approach to simulate the future supply of each profession in each country-adjusted according to levels of HRH participation and activity-and a needs-based approach to simulate future HRH requirements was used. Most of the data to populate the model were obtained from the OECD's online indicator database. Other data were obtained from targeted internet searches and documents gathered as part of the companion paper. Relevant recent measures for each model parameter were found for at least one of the included countries. In total, 35% of the desired current data elements were found; assumed values were used for the other current data elements. Multiple scenarios were used to demonstrate the sensitivity of the simulations to different assumed future values of model parameters. Depending on the assumed future values of each model parameter, the simulated HRH gaps across the included countries could range from shortfalls of 74 000 midwives, 3.2 million nurses, and 1.2 million physicians to surpluses of 67 000 midwives, 2.9 million nurses, and 1.0 million physicians by 2030. Despite important gaps in the data publicly available online and the short time available to implement it, this paper demonstrates the basic feasibility of a more comprehensive, population needs-based approach to estimating HRH supply and requirements than most of those currently being used. HRH planners in individual countries, working with their
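
    A highly simplified sketch of the two halves of such a model: a stock-and-flow projection of supply, adjusted for participation and activity, confronted with a needs-based estimate of requirements. All parameter names and numbers are illustrative assumptions, not values from the paper.

      # Stock-and-flow supply vs. needs-based requirements for one profession (sketch).
      def project_supply(stock, entrants, exit_rate, participation, activity, years):
          """Head-count stock each year, converted to full-time-equivalent supply."""
          fte_supply = []
          for _ in range(years):
              stock = stock + entrants - exit_rate * stock
              fte_supply.append(stock * participation * activity)
          return fte_supply

      def project_requirements(population, growth, needs_per_capita, service_per_need,
                               productivity, years):
          """FTE providers required to meet population health needs each year."""
          req = []
          for _ in range(years):
              population *= (1 + growth)
              services = population * needs_per_capita * service_per_need
              req.append(services / productivity)
          return req

      supply = project_supply(stock=300_000, entrants=12_000, exit_rate=0.03,
                              participation=0.85, activity=0.9, years=15)
      req = project_requirements(population=60e6, growth=0.005, needs_per_capita=4.0,
                                 service_per_need=1.2, productivity=2_000, years=15)
      print(f"End-of-horizon gap (FTE): {supply[-1] - req[-1]:,.0f}")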

  19. The Future of Clinical Pharmacy: Developing a Holistic Model

    Directory of Open Access Journals (Sweden)

    Patricia A. Shane

    2013-11-01

    Full Text Available This concept paper discusses the untapped promise of often overlooked humanistic skills to advance the practice of pharmacy. It highlights the seminal work that is, increasingly, integrated into medical and nursing education. The work of these educators and the growing empirical evidence that validates the importance of humanistic skills is raising questions for the future of pharmacy education and practice. To potentiate humanistic professional competencies, e.g., compassion, empathy, and emotional intelligence, how do we develop a more holistic model that integrates reflective and affective skills? There are many historical and current transitions in the profession and practice of pharmacy. If our education model is refocused with an emphasis on pharmacy’s therapeutic roots, the field has the opportunity to play a vital role in improving health outcomes and patient-centered care. Beyond the metrics of treatment effects, achieving greater patient-centeredness will require transformations that improve care processes and invest in patients’ experiences of the treatment and care they receive. Is layering on additional science sufficient to yield better health outcomes if we neglect the power of empathic interactions in the healing process?

  20. Employers’ Perspectives on Future Roles and Skills Requirements for Australian Health Librarians

    Directory of Open Access Journals (Sweden)

    Cheryl Hamill

    2011-01-01

    Full Text Available Objective – This study, which comprises one stage of a larger project (the ALIA/HLA Workforce and Education Research Project), aimed to discover employers' views on how (or whether) health librarians assist in achieving the mission-critical goals of their organizations; how health librarians contribute to the organization now and into the future; and what the current and future skills requirements of health librarians are. Methods – Each member of the project group approached between one and five individuals known to them to generate a convenience sample of 22 employers of health librarians. Fifteen semi-structured interviews were conducted between October and November 2010 with employers in the hospital, academic, government, private, consumer health and not-for-profit sectors. The interview schedule was sent to each interviewee prior to the interview so that they had time to consider their responses. The researchers wrote up the interview notes using the interview schedule and submitted them to the principal researcher, who combined the data into one document. Content analysis of the data was used to identify major themes. Results – Employers expressed a clear sense of respect for the roles and responsibilities of library staff in their organizations. Areas of practice such as education and training, scientific research and clinical support were highlighted as critical for the future. Current areas of practice such as using technology and systems to manage information, providing information services to meet user needs and management of health information resources in a range of formats were identified as remaining highly relevant for the future. There was potential for health librarians to play a more active and strategic role in their organizations, and to repackage their traditional skill sets for anticipated future roles. Interpersonal skills and the role of health librarians as the interface between clinicians and information technology

  1. Large-scale computation at PSI scientific achievements and future requirements

    Energy Technology Data Exchange (ETDEWEB)

    Adelmann, A.; Markushin, V

    2008-11-15

    and Networking' (SNSP-HPCN) is discussing this complex. Scientific results which are made possible by PSI's engagement at CSCS (named Horizon) are summarised and PSI's future high-performance computing requirements are evaluated. The data collected shows the current situation and a 5 year extrapolation of the users' needs with respect to HPC resources is made. In consequence this report can serve as a basis for future strategic decisions with respect to a non-existing HPC road-map for PSI. PSI's institutional HPC area started hardware-wise approximately in 1999 with the assembly of a 32-processor LINUX cluster called Merlin. Merlin was upgraded several times, lastly in 2007. The Merlin cluster at PSI is used for small scale parallel jobs, and is the only general purpose computing system at PSI. Several dedicated small scale clusters followed the Merlin scheme. Many of the clusters are used to analyse data from experiments at PSI or CERN, because dedicated clusters are most efficient. The intellectual and financial involvement of the procurement (including a machine update in 2007) results in a PSI share of 25 % of the available computing resources at CSCS. The (over) usage of available computing resources by PSI scientists is demonstrated. We actually get more computing cycles than we have paid for. The reason is the fair share policy that is implemented on the Horizon machine. This policy allows us to get cycles, with a low priority, even when our bi-monthly share is used. Five important observations can be drawn from the analysis of the scientific output and the survey of future requirements of main PSI HPC users: (1) High Performance Computing is a main pillar in many important PSI research areas; (2) there is a lack in the order of 10 times the current computing resources (measured in available core-hours per year); (3) there is a trend to use in the order of 600 processors per average production run; (4) the disk and tape storage growth

  2. Masterplanning in the process of adapting existing hospitals to future requirements.

    Science.gov (United States)

    Gatermann, Hans-Evert

    2003-01-01

    In many countries in Europe only a few totally new hospitals will be built, but many existing hospitals need to be adapted to the requirements of the future. For this planning process we need masterplans. However, no exact definition of the term "Masterplan" exists so far, and several committees have tried to rectify this lack. The UIA WP Public Health focused on the exchange of experience in the field of updating and upgrading existing hospitals in its recent International Public Health Seminars. It has shown that the step-by-step process of masterplanning is similar in many countries. UIA WP Public Health now tries to find new ways towards a better understanding: to develop a definition of terms, and to standardize the building structure of hospitals as well as the structure of the masterplanning process and its depiction. We have now received the first interesting results and we are looking with optimism into the future.

  3. Implementing a stochastic model for oil futures prices

    Energy Technology Data Exchange (ETDEWEB)

    Cortazar, Gonzalo [Departamento de Ingenieria Industrial y de Sistemas, Escuela de Ingenieria, Pontificia Universidad Catolica de Chile, Vicuna Mackenna 4860, Santiago (Chile); Schwartz, Eduardo S. [Anderson School at UCLA, 110 Westwood Plaza, Los Angeles, CA 90095-1481 (United States)

    2003-05-01

    This paper develops a parsimonious three-factor model of the term structure of oil futures prices that can be easily estimated from available futures price data. In addition, it proposes a new simple spreadsheet implementation procedure. The procedure is flexible, may be used with market prices of any oil contingent claim with closed form pricing solution, and easily deals with missing data problems. The approach is implemented using daily prices of all futures contracts traded at the New York Mercantile Exchange between 1991 and 2001. In-sample and out-of-sample tests indicate that the model fits the data extremely well. Though the paper concentrates on oil, the approach can be used for any other commodity with well-developed futures markets.
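
    The paper's three-factor model is not reproduced here; as a flavour of the closed-form futures prices such term-structure models deliver, below is the one-factor mean-reverting special case (Schwartz, 1997), with all parameter values hypothetical.

      # One-factor mean-reverting commodity model (Schwartz 1997): futures curve sketch.
      import numpy as np

      def futures_price(spot, T, kappa, alpha_star, sigma):
          """Closed-form futures price when log-spot follows an Ornstein-Uhlenbeck
          process under the risk-neutral measure."""
          decay = np.exp(-kappa * T)
          log_f = (decay * np.log(spot)
                   + (1 - decay) * alpha_star
                   + sigma**2 / (4 * kappa) * (1 - np.exp(-2 * kappa * T)))
          return np.exp(log_f)

      maturities = np.array([0.25, 0.5, 1.0, 2.0, 5.0])      # years
      print(futures_price(spot=30.0, T=maturities, kappa=1.5,
                          alpha_star=np.log(25.0), sigma=0.35))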

  4. The Standard Model from LHC to future colliders.

    Science.gov (United States)

    Forte, S; Nisati, A; Passarino, G; Tenchini, R; Calame, C M Carloni; Chiesa, M; Cobal, M; Corcella, G; Degrassi, G; Ferrera, G; Magnea, L; Maltoni, F; Montagna, G; Nason, P; Nicrosini, O; Oleari, C; Piccinini, F; Riva, F; Vicini, A

    This review summarizes the results of the activities which have taken place in 2014 within the Standard Model Working Group of the "What Next" Workshop organized by INFN, Italy. We present a framework, general questions, and some indications of possible answers on the main issue for Standard Model physics in the LHC era and in view of possible future accelerators.

  5. The Standard Model from LHC to future colliders

    Energy Technology Data Exchange (ETDEWEB)

    Forte, S., E-mail: forte@mi.infn.it [Dipartimento di Fisica, Università di Milano, Via Celoria 16, 20133, Milan (Italy); INFN, Sezione di Milano, Via Celoria 16, 20133, Milan (Italy); Nisati, A. [INFN, Sezione di Roma, Piazzale Aldo Moro 2, 00185, Rome (Italy); Passarino, G. [Dipartimento di Fisica, Università di Torino, Via P. Giuria 1, 10125, Turin (Italy); INFN, Sezione di Torino, Via P. Giuria 1, 10125, Turin (Italy); Tenchini, R. [INFN, Sezione di Pisa, Largo B. Pontecorvo 3, 56127, Pisa (Italy); Calame, C. M. Carloni [Dipartimento di Fisica, Università di Pavia, via Bassi 6, 27100, Pavia (Italy); Chiesa, M. [INFN, Sezione di Pavia, via Bassi 6, 27100, Pavia (Italy); Cobal, M. [Dipartimento di Chimica, Fisica e Ambiente, Università di Udine, Via delle Scienze, 206, 33100, Udine (Italy); INFN, Gruppo Collegato di Udine, Via delle Scienze, 206, 33100, Udine (Italy); Corcella, G. [INFN, Laboratori Nazionali di Frascati, Via E. Fermi 40, 00044, Frascati (Italy); Degrassi, G. [Dipartimento di Matematica e Fisica, Università’ Roma Tre, Via della Vasca Navale 84, 00146, Rome (Italy); INFN, Sezione di Roma Tre, Via della Vasca Navale 84, 00146, Rome (Italy); Ferrera, G. [Dipartimento di Fisica, Università di Milano, Via Celoria 16, 20133, Milan (Italy); INFN, Sezione di Milano, Via Celoria 16, 20133, Milan (Italy); Magnea, L. [Dipartimento di Fisica, Università di Torino, Via P. Giuria 1, 10125, Turin (Italy); INFN, Sezione di Torino, Via P. Giuria 1, 10125, Turin (Italy); Maltoni, F. [Centre for Cosmology, Particle Physics and Phenomenology (CP3), Université Catholique de Louvain, 1348, Louvain-la-Neuve (Belgium); Montagna, G. [Dipartimento di Fisica, Università di Pavia, via Bassi 6, 27100, Pavia (Italy); INFN, Sezione di Pavia, via Bassi 6, 27100, Pavia (Italy); Nason, P. [INFN, Sezione di Milano-Bicocca, Piazza della Scienza 3, 20126, Milan (Italy); Nicrosini, O. [INFN, Sezione di Pavia, via Bassi 6, 27100, Pavia (Italy); Oleari, C. [Dipartimento di Fisica, Università di Milano-Bicocca, Piazza della Scienza 3, 20126, Milan (Italy); INFN, Sezione di Milano-Bicocca, Piazza della Scienza 3, 20126, Milan (Italy); Piccinini, F. [INFN, Sezione di Pavia, via Bassi 6, 27100, Pavia (Italy); Riva, F. [Institut de Théorie des Phénoménes Physiques, École Polytechnique Fédérale de Lausanne, 1015, Lausanne (Switzerland); Vicini, A. [Dipartimento di Fisica, Università di Milano, Via Celoria 16, 20133, Milan (Italy); INFN, Sezione di Milano, Via Celoria 16, 20133, Milan (Italy)

    2015-11-25

    This review summarizes the results of the activities which have taken place in 2014 within the Standard Model Working Group of the “What Next” Workshop organized by INFN, Italy. We present a framework, general questions, and some indications of possible answers on the main issue for Standard Model physics in the LHC era and in view of possible future accelerators.

  6. Depot Maintenance: Improved Strategic Planning Needed to Ensure That Army and Marine Corps Depots Can Meet Future Maintenance Requirements

    Science.gov (United States)

    2009-09-01

    [Fragmentary report documentation text; recoverable information: report title "Depot Maintenance: Improved Strategic Planning Needed to Ensure That Army and Marine Corps Depots Can Meet Future Maintenance Requirements", highlights of GAO-09-865, a report to the House of Representatives.]

  7. An Equilibrium Model of Catastrophe Insurance Futures and Spreads

    OpenAIRE

    Knut Aase

    1999-01-01

    This article presents a valuation model of futures contracts and derivatives on such contracts, when the underlying delivery value is an insurance index, which follows a stochastic process containing jumps of random claim sizes at random time points of accident occurrence. Applications are made on insurance futures and spreads, a relatively new class of instruments for risk management launched by the Chicago Board of Trade in 1993, anticipated to start in Europe and perhaps also in other part...

  8. A Use Case Methodology to Handle Conflicting Controller Requirements for Future Power Systems

    DEFF Research Database (Denmark)

    Heussen, Kai; Uslar, Mathias; Tornelli, Carlo

    2015-01-01

    This paper proposes a standards based requirements elicitation and analysis strategy tailored for smart grid control structure development. Control structures in electric power systems often span across several systems and stakeholders. Requirements elicitation for such control systems therefore...... is to describe a process starting from a tailored IEC 62559 template amended for recording controller conflicts and adapting the underlying use case management repository for collaborative work. Conflict identification is supported by Multilevel Flow Modeling providing abstracted conflict patterns....

  9. Integrated environmental modeling: a vision and roadmap for the future

    Science.gov (United States)

    Laniak, Gerard F.; Olchin, Gabriel; Goodall, Jonathan; Voinov, Alexey; Hill, Mary; Glynn, Pierre; Whelan, Gene; Geller, Gary; Quinn, Nigel; Blind, Michiel; Peckham, Scott; Reaney, Sim; Gaber, Noha; Kennedy, Philip R.; Hughes, Andrew

    2013-01-01

    Integrated environmental modeling (IEM) is inspired by modern environmental problems, decisions, and policies and enabled by transdisciplinary science and computer capabilities that allow the environment to be considered in a holistic way. The problems are characterized by the extent of the environmental system involved, dynamic and interdependent nature of stressors and their impacts, diversity of stakeholders, and integration of social, economic, and environmental considerations. IEM provides a science-based structure to develop and organize relevant knowledge and information and apply it to explain, explore, and predict the behavior of environmental systems in response to human and natural sources of stress. During the past several years a number of workshops were held that brought IEM practitioners together to share experiences and discuss future needs and directions. In this paper we organize and present the results of these discussions. IEM is presented as a landscape containing four interdependent elements: applications, science, technology, and community. The elements are described from the perspective of their role in the landscape, current practices, and challenges that must be addressed. Workshop participants envision a global scale IEM community that leverages modern technologies to streamline the movement of science-based knowledge from its sources in research, through its organization into databases and models, to its integration and application for problem solving purposes. Achieving this vision will require that the global community of IEM stakeholders transcend social and organizational boundaries and pursue greater levels of collaboration. Among the highest priorities for community action are the development of standards for publishing IEM data and models in forms suitable for automated discovery, access, and integration; education of the next generation of environmental stakeholders, with a focus on transdisciplinary research, development, and

  10. Futurism.

    Science.gov (United States)

    Foy, Jane Loring

    The objectives of this research report are to gain insight into the main problems of the future and to ascertain the attitudes that the general population has toward the treatment of these problems. In the first section of this report the future is explored socially, psychologically, and environmentally. The second section describes the techniques…

  11. Holism, entrenchment, and the future of climate model pluralism

    Science.gov (United States)

    Lenhard, Johannes; Winsberg, Eric

    In this paper, we explore the extent to which issues of simulation model validation take on novel characteristics when the models in question become particularly complex. Our central claim is that complex simulation models in general, and global models of climate in particular, face a form of confirmation holism. This holism, moreover, makes analytic understanding of complex models of climate either extremely difficult or even impossible. We argue that this supports a position we call convergence skepticism: the belief that the existence of a plurality of different models making a plurality of different forecasts of future climate is likely to be a persistent feature of global climate science.

  12. Space Weather - Current Capabilities, Future Requirements, and the Path to Improved Forecasting

    Science.gov (United States)

    Mann, Ian

    2016-07-01

    We present an overview of Space Weather activities and future opportunities including assessments of current status and capabilities, knowledge gaps, and future directions in relation to both observations and modeling. The review includes input from the scientific community including from SCOSTEP scientific discipline representatives (SDRs), COSPAR Main Scientific Organizers (MSOs), and SCOSTEP/VarSITI leaders. The presentation also draws on results from the recent activities related to the production of the COSPAR-ILWS Space Weather Roadmap "Understanding Space Weather to Shield Society" [Schrijver et al., Advances in Space Research 55, 2745 (2015) http://dx.doi.org/10.1016/j.asr.2015.03.023], from the activities related to the United Nations (UN) Committee on the Peaceful Uses of Outer Space (COPUOS) actions in relation to the Long-term Sustainability of Outer Space (LTS), and most recently from the newly formed and ongoing efforts of the UN COPUOS Expert Group on Space Weather.

  13. Integrated modelling requires mass collaboration (Invited)

    Science.gov (United States)

    Moore, R. V.

    2009-12-01

    The need for sustainable solutions to the world’s problems is self evident; the challenge is to anticipate where, in the environment, economy or society, the proposed solution will have negative consequences. If we failed to realise that the switch to biofuels would have the seemingly obvious result of reduced food production, how much harder will it be to predict the likely impact of policies whose impacts may be more subtle? It has been clear for a long time that models and data will be important tools for assessing the impact of events and the measures for their mitigation. They are an effective way of encapsulating knowledge of a process and using it for prediction. However, most models represent a single or small group of processes. The sustainability challenges that face us now require not just the prediction of a single process but the prediction of how many interacting processes will respond in given circumstances. These processes will not be confined to a single discipline but will often straddle many. For example, the question, “What will be the impact on river water quality of the medical plans for managing a ‘flu pandemic and could they cause a further health hazard?” spans medical planning, the absorption of drugs by the body, the spread of disease, the hydraulic and chemical processes in sewers and sewage treatment works and river water quality. This question nicely reflects the present state of the art. We have models of the processes and standards, such as the Open Modelling Interface (the OpenMI), allow them to be linked together and to datasets. We can therefore answer the question but with the important proviso that we thought to ask it. The next and greater challenge is to deal with the open question, “What are the implications of the medical plans for managing a ‘flu pandemic?”. This implies a system that can make connections that may well not have occurred to us and then evaluate their probable impact. The final touch will be to

  14. Simulating Future Global Deforestation Using Geographically Explicit Models

    Energy Technology Data Exchange (ETDEWEB)

    Witmer, F. [University of Colorado, Boulder, CO (United States)

    2005-03-15

    What might the spatial distribution of forests look like in 2100? Global deforestation continues to be a significant component of human activity affecting both the terrestrial and atmospheric environments. This work models the relationship between people and forests using two approaches. Initially, a brief global-scale analysis of recent historical trends is conducted. The remainder of the paper then focuses on current population densities as determinants of cumulative historical deforestation. Spatially explicit models are then developed and used to generate two possible scenarios of future deforestation. The results suggest that future deforestation in tropical Africa may be considerably worse than deforestation in the Amazon region.
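
    A hedged sketch of the kind of spatially explicit, population-driven relationship described above: a logistic model mapping log population density to the cumulative deforested fraction of a grid cell, then applied to a projected future population surface. The coefficients and the projection rule are illustrative, not the paper's estimates.

      # Population-density-driven deforestation fraction per grid cell (sketch).
      import numpy as np

      def deforested_fraction(pop_density, b0=-3.0, b1=1.1):
          """Logistic share of a cell's original forest cleared, as a function of
          log10 population density (persons/km^2); coefficients are hypothetical."""
          z = b0 + b1 * np.log10(np.maximum(pop_density, 0.1))
          return 1.0 / (1.0 + np.exp(-z))

      # Apply to a current and a projected 2100 population grid (toy 2x2 rasters).
      pop_2000 = np.array([[2.0, 15.0], [60.0, 300.0]])
      pop_2100 = pop_2000 * 1.8
      remaining_forest_2100 = 1 - deforested_fraction(pop_2100)
      print(remaining_forest_2100.round(2))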

  15. Musculoskeletal modelling in dogs: challenges and future perspectives.

    Science.gov (United States)

    Dries, Billy; Jonkers, Ilse; Dingemanse, Walter; Vanwanseele, Benedicte; Vander Sloten, Jos; van Bree, Henri; Gielen, Ingrid

    2016-05-18

    Musculoskeletal models have proven to be a valuable tool in human orthopaedics research. Recently, veterinary research started taking an interest in the computer modelling approach to understand the forces acting upon the canine musculoskeletal system. While many of the methods employed in human musculoskeletal models can be applied to canine musculoskeletal models, not all techniques are applicable. This review summarizes the important parameters necessary for modelling, as well as the techniques employed in human musculoskeletal models and the limitations in transferring these techniques to canine modelling research. The major challenges in future canine modelling research are likely to centre around devising alternative techniques for obtaining maximal voluntary contractions, as well as finding scaling factors to adapt a generalized canine musculoskeletal model to represent specific breeds and subjects.

  16. Gamified Requirements Engineering: Model and Experimentation

    NARCIS (Netherlands)

    Lombriser, Philipp; Dalpiaz, Fabiano; Lucassen, Garm; Brinkkemper, Sjaak

    2016-01-01

    [Context & Motivation] Engaging stakeholders in requirements engineering (RE) influences the quality of the requirements and ultimately of the system to-be. Unfortunately, stakeholder engagement is often insufficient, leading to too few, low-quality requirements. [Question/problem] We aim to

  17. Gamified Requirements Engineering: Model and Experimentation

    NARCIS (Netherlands)

    Lombriser, Philipp; Dalpiaz, Fabiano; Lucassen, Garm; Brinkkemper, Sjaak

    2016-01-01

    [Context & Motivation] Engaging stakeholders in requirements engineering (RE) influences the quality of the requirements and ultimately of the system to-be. Unfortunately, stakeholder engagement is often insufficient, leading to too few, low-quality requirements. [Question/problem] We aim to evaluat

  18. Gamified Requirements Engineering: Model and Experimentation

    NARCIS (Netherlands)

    Lombriser, Philipp; Dalpiaz, Fabiano; Lucassen, Garm; Brinkkemper, Sjaak

    2016-01-01

    [Context & Motivation] Engaging stakeholders in requirements engineering (RE) influences the quality of the requirements and ultimately of the system to-be. Unfortunately, stakeholder engagement is often insufficient, leading to too few, low-quality requirements. [Question/problem] We aim to evaluat

  19. Predicting future glacial lakes in Austria using different modelling approaches

    Science.gov (United States)

    Otto, Jan-Christoph; Helfricht, Kay; Prasicek, Günther; Buckel, Johannes; Keuschnig, Markus

    2017-04-01

    Glacier retreat is one of the most apparent consequences of temperature rise in the 20th and 21st centuries in the European Alps. In Austria, more than 240 new lakes have formed in glacier forefields since the Little Ice Age. A similar signal is reported from many mountain areas worldwide. Glacial lakes can have important environmental and socio-economic impacts on high mountain systems, including water resource management, sediment delivery, natural hazards, energy production and tourism. Their development significantly modifies the landscape configuration and visual appearance of high mountain areas. Knowledge of the location, number and extent of these future lakes can be used to assess potential impacts on high mountain geo-ecosystems and upland-lowland interactions. Information on new lakes is critical to appraise emerging threats and potentials for society. The recent development of regional ice thickness models and their combination with high resolution glacier surface data allows predicting the topography below current glaciers by subtracting ice thickness from the glacier surface. Analyzing these modelled glacier bed surfaces reveals overdeepenings that represent potential locations for future lakes. In order to predict the location of future glacial lakes below recent glaciers in the Austrian Alps we apply different ice thickness models using high resolution terrain data and glacier outlines. The results are compared and validated with ice thickness data from geophysical surveys. Additionally, we run the models on three different glacier extents provided by the Austrian Glacier Inventories from 1969, 1998 and 2006. Results of this historical glacier extent modelling are compared to existing glacier lakes and discussed with a focus on geomorphological impacts on lake evolution. We discuss model performance and observed differences in the results in order to assess the approach for a realistic prediction of future lake locations. The presentation delivers
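
    A minimal raster sketch of the step described above: subtract modelled ice thickness from the glacier surface DEM and flag closed depressions (overdeepenings) in the resulting bed as potential future lake sites. The depression filling uses morphological reconstruction from scikit-image; the input arrays are toy values, not the study's data.

      # Potential future lake sites = closed depressions in the modelled glacier bed.
      import numpy as np
      from skimage.morphology import reconstruction

      def potential_lakes(surface_dem, ice_thickness):
          bed = surface_dem - ice_thickness
          # Fill depressions: erode a seed that is high everywhere except the border.
          seed = bed.copy()
          seed[1:-1, 1:-1] = bed.max()
          filled = reconstruction(seed, bed, method='erosion')
          depth = filled - bed                      # water depth if depressions filled
          return depth, depth > 0.0                 # depth raster and lake mask

      surface = np.array([[120., 118., 117., 119.],
                          [119., 112., 110., 118.],
                          [118., 111., 109., 117.],
                          [120., 119., 118., 121.]])
      thickness = np.full_like(surface, 15.0)
      depth, mask = potential_lakes(surface, thickness)
      print(mask.astype(int))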

  20. Faculty Unions, Business Models, and the Academy's Future

    Science.gov (United States)

    Rhoades, Gary

    2011-01-01

    In this article, the author addresses questions about the future of faculty unions, business models, and the academy by providing some current and historical context regarding the causes of and conflicts about faculty unions. He also reviews trends in college and university management over the past three decades, using California, Ohio, and…

  1. Dynamic Pathloss Model for Future Mobile Communication Networks

    DEFF Research Database (Denmark)

    Kumar, Ambuj; Mihovska, Albena Dimitrova; Prasad, Ramjee

    2016-01-01

    Future mobile communication networks (MCNs) are expected to be more intelligent and proactive based on new capabilities that increase agility and performance. However, for any successful mobile network service, the dexterity in network deployment is a key factor. The efficiency of the network...... that incorporates the environmental dynamics factor in the propagation model for intelligent and proactively iterative networks...
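
    The excerpt does not give the proposed model's functional form; as a baseline that such a dynamic model would extend, below is the classical log-distance path-loss model with log-normal shadowing, with an illustrative time-varying exponent standing in for the "environmental dynamics factor". All parameter values are hypothetical.

      # Log-distance path loss with shadowing; a time-varying exponent stands in
      # for an environmental-dynamics term (illustrative only).
      import numpy as np

      def path_loss_db(d_m, n, pl0_db=40.0, d0_m=1.0, shadow_sigma_db=6.0, rng=None):
          rng = rng or np.random.default_rng(0)
          shadowing = rng.normal(0.0, shadow_sigma_db, size=np.shape(d_m))
          return pl0_db + 10.0 * n * np.log10(np.asarray(d_m) / d0_m) + shadowing

      distances = np.array([10.0, 50.0, 200.0, 1000.0])       # metres
      for hour, n in [(3, 2.7), (18, 3.4)]:                   # quieter vs. busier environment
          print(hour, path_loss_db(distances, n).round(1))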

  2. Business model innovation: Past research, current debates, and future directions

    DEFF Research Database (Denmark)

    Hossain, Mokter

    2017-01-01

    Purpose – The purpose of this paper is to provide state-of-the-art knowledge about business model innovation (BMI) and suggest avenues for future research. Design/methodology/approach – A systematic literature review approach was adopted with thematic analysis being conducted on 92 articles...

  3. Modelling faba bean production in an uncertain future climate

    NARCIS (Netherlands)

    Crawford, J.W.; Yiqun Gu,; Peiris, D.R.; Grashoff, C.; McNicol, J.W.; Marschall, B.

    1996-01-01

    Future climate change may bring risk or benefit to crop production. In this paper, the possible impact of climate change on faba bean production in Scotland is examined. Instead of conventional simulation modelling techniques, the belief network approach is applied to deal with the uncertain

  4. Organisational Behaviour: Business Models for a Profitable and Sustainable Future

    OpenAIRE

    2014-01-01

    There is a growing trend for companies to integrate sustainable strategies that require a comprehensive reconfiguration of their daily operations. This is referred to as “embedded sustainability”. Whilst also providing significant reductions in environmental impact, these sustainability strategies result in (a) reduced short term operational costs, (b) reduced exposure to future environmental risk and (c) an improved brand image. This is in contrast to the sustainability actions implemented b...

  5. The Standard Model from LHC to future colliders

    Energy Technology Data Exchange (ETDEWEB)

    Forte, S.; Ferrera, G.; Vicini, A. [Universita di Milano, Dipartimento di Fisica, Milan (Italy); INFN, Sezione di Milano, Milan (Italy); Nisati, A. [INFN, Sezione di Roma, Rome (Italy); Passarino, G.; Magnea, L. [Universita di Torino, Dipartimento di Fisica, Turin (Italy); INFN, Sezione di Torino, Turin (Italy); Tenchini, R. [INFN, Sezione di Pisa, Pisa (Italy); Calame, C.M.C. [Universita di Pavia, Dipartimento di Fisica, Pavia (Italy); Chiesa, M.; Nicrosini, O.; Piccinini, F. [INFN, Sezione di Pavia, Pavia (Italy); Cobal, M. [Universita di Udine, Dipartimento di Chimica, Fisica e Ambiente, Udine (Italy); INFN, Gruppo Collegato di Udine, Udine (Italy); Corcella, G. [INFN, Laboratori Nazionali di Frascati, Frascati (Italy); Degrassi, G. [Universita' Roma Tre, Dipartimento di Matematica e Fisica, Rome (Italy); INFN, Sezione di Roma Tre, Rome (Italy); Maltoni, F. [Universite Catholique de Louvain, Centre for Cosmology, Particle Physics and Phenomenology (CP3), Louvain-la-Neuve (Belgium); Montagna, G. [Universita di Pavia, Dipartimento di Fisica, Pavia (Italy); INFN, Sezione di Pavia, Pavia (Italy); Nason, P. [INFN, Sezione di Milano-Bicocca, Milan (Italy); Oleari, C. [Universita di Milano-Bicocca, Dipartimento di Fisica, Milan (Italy); INFN, Sezione di Milano-Bicocca, Milan (Italy); Riva, F. [Ecole Polytechnique Federale de Lausanne, Institut de Theorie des Phenomenes Physiques, Lausanne (Switzerland)

    2015-11-15

    This review summarizes the results of the activities which have taken place in 2014 within the Standard Model Working Group of the ''What Next'' Workshop organized by INFN, Italy. We present a framework, general questions, and some indications of possible answers on the main issue for Standard Model physics in the LHC era and in view of possible future accelerators. (orig.)

  6. The Starburst Model for AGN Past, Present & Future

    CERN Document Server

    Fernandes, R C

    1996-01-01

    It is now eleven years since Terlevich & Melnick first proposed an 'AGN without black holes' model, an idea which has since evolved into what is now called the starburst model for AGN. This model has been the subject of much debate in the last decade, with observational evidence both for and against it further fuelling the controversy. Can we, after all these years, reach a verdict on whether starbursts can power AGN? This contribution tries to answer this question by reviewing the main achievements of the starburst model, its current status and future prospects.

  7. Future aerospace ground test facility requirements for the Arnold Engineering Development Center

    Science.gov (United States)

    Kirchner, Mark E.; Baron, Judson R.; Bogdonoff, Seymour M.; Carter, Donald I.; Couch, Lana M.; Fanning, Arthur E.; Heiser, William H.; Koff, Bernard L.; Melnik, Robert E.; Mercer, Stephen C.

    1992-01-01

    Arnold Engineering Development Center (AEDC) was conceived at the close of World War II, when major new developments in flight technology were presaged by new aerodynamic and propulsion concepts. During the past 40 years, AEDC has played a significant part in the development of many aerospace systems. The original plans were extended through the years by some additional facilities, particularly in the area of propulsion testing. AEDC now has undertaken development of a master plan in an attempt to project requirements and to plan for ground test and computational facilities over the coming 20 to 30 years. This report was prepared in response to an AEDC request that the National Research Council (NRC) assemble a committee to prepare guidance for planning and modernizing AEDC facilities for the development and testing of future classes of aerospace systems as envisaged by the U.S. Air Force.

  9. From requirements to Java in a snap model-driven requirements engineering in practice

    CERN Document Server

    Smialek, Michal

    2015-01-01

    This book provides a coherent methodology for Model-Driven Requirements Engineering which stresses the systematic treatment of requirements within the realm of modelling and model transformations. The underlying basic assumption is that detailed requirements models are used as first-class artefacts playing a direct role in constructing software. To this end, the book presents the Requirements Specification Language (RSL) that allows precision and formality, which eventually permits automation of the process of turning requirements into a working system by applying model transformations and co

  10. International evaluation of current and future requirements for environmental engineering education.

    Science.gov (United States)

    Morgenroth, E; Daigger, G T; Ledin, A; Keller, J

    2004-01-01

    The field of environmental engineering is developing as a result of changing environmental requirements. In response, environmental engineering education (E3) needs to ensure that it provides students with the necessary tools to address these challenges. In this paper, the current status and future development of E3 are evaluated based on a questionnaire sent to universities and potential employers of E3 graduates. With increasing demands on environmental quality, the complexity of environmental engineering problems to be solved can be expected to increase. To find solutions, environmental engineers will need to work in interdisciplinary teams. Based on the questionnaire, there was broad agreement that the best way to prepare students for these future challenges is to provide them with a fundamental education in basic sciences and related engineering fields. Many exciting developments in the environmental engineering profession will be located at the interface between engineering, science, and society. Aspects of all three areas need to be included in E3 and the student needs to be exposed to the tensions associated with linking the three.

  11. Modelling Monsoons: Understanding and Predicting Current and Future Behaviour

    Energy Technology Data Exchange (ETDEWEB)

    Turner, A; Sperber, K R; Slingo, J M; Meehl, G A; Mechoso, C R; Kimoto, M; Giannini, A

    2008-09-16

    including, but not limited to, the Mei-Yu/Baiu sudden onset and withdrawal, low-level jet orientation and variability, and orographic forced rainfall. Under anthropogenic climate change many competing factors complicate making robust projections of monsoon changes. Without aerosol effects, increased land-sea temperature contrast suggests strengthened monsoon circulation due to climate change. However, increased aerosol emissions will reflect more solar radiation back to space, which may temper or even reduce the strength of monsoon circulations compared to the present day. A more comprehensive assessment is needed of the impact of black carbon aerosols, which may modulate that of other anthropogenic greenhouse gases. Precipitation may behave independently from the circulation under warming conditions in which an increased atmospheric moisture loading, based purely on thermodynamic considerations, could result in increased monsoon rainfall under climate change. The challenge to improve model parameterizations and include more complex processes and feedbacks pushes computing resources to their limit, thus requiring continuous upgrades of computational infrastructure to ensure progress in understanding and predicting the current and future behavior of monsoons.

  12. Modeling uncertainty in requirements engineering decision support

    Science.gov (United States)

    Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.

    2005-01-01

    One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this, and some initial work to support elicitation of uncertain requirements, and to deal with the combination of such information from multiple stakeholders.

  14. THE MODEL OF EXTERNSHIP ORGANIZATION FOR FUTURE TEACHERS: QUALIMETRIC APPROACH

    Directory of Open Access Journals (Sweden)

    Taisiya A. Isaeva

    2015-01-01

    Full Text Available The aim of the paper is to present the author's externship organization model for bachelor students – future teachers of vocational training. The model has been worked out from the standpoint of a qualimetric approach and provides pedagogical training. Methods. The work is based on an analysis of the literature on externship organization for students in higher education and applies SWOT-analysis techniques to pedagogical training. Group expert evaluation is used as the main method of pedagogical qualimetry. Structural components of the professional pedagogical competency of students – future teachers – are defined, which makes it possible to determine development levels and assessment criteria for mastering the programme «Vocational training (branch-wise)». Results. The article interprets the concept of «pedagogical training» and states its basic organizational principles during students' practice. Methods of expert group formation are presented: self-assessment and personal data. Scientific novelty. An externship organization model for future teachers is developed, based on pedagogical training, a qualimetric approach and SWOT-analysis techniques. The proposed criterion-assessment procedures make it possible to determine levels of professional and pedagogical competency. Practical significance. The model has been introduced into the pedagogical training within the educational process of Kalashnikov Izhevsk State Technical University and can be used in other similar educational establishments.

  15. A goal-oriented requirements modelling language for enterprise architecture

    NARCIS (Netherlands)

    Quartel, Dick; Engelsman, Wilco; Jonkers, Henk; Sinderen, van Marten

    2009-01-01

    Methods for enterprise architecture, such as TOGAF, acknowledge the importance of requirements engineering in the development of enterprise architectures. Modelling support is needed to specify, document, communicate and reason about goals and requirements. Current modelling techniques for enterpris

  16. A goal-oriented requirements modelling language for enterprise architecture

    NARCIS (Netherlands)

    Quartel, Dick; Engelsman, W.; Jonkers, Henk; van Sinderen, Marten J.

    2009-01-01

    Methods for enterprise architecture, such as TOGAF, acknowledge the importance of requirements engineering in the development of enterprise architectures. Modelling support is needed to specify, document, communicate and reason about goals and requirements. Current modelling techniques for

  17. Future mission opportunities and requirements for advanced space photovoltaic energy conversion technology

    Science.gov (United States)

    Flood, Dennis J.

    1990-01-01

    The variety of potential future missions under consideration by NASA will impose a broad range of requirements on space solar arrays, and mandates the development of new solar cells which can offer a wide range of capabilities to mission planners. Major advances in performance have recently been achieved at several laboratories in a variety of solar cell types. Many of those recent advances are reviewed, the areas where possible improvements are yet to be made are examined, and the requirements that must be met by advanced solar cells if they are to be used in space are discussed. The solar cells of interest include single and multiple junction cells which are fabricated from single crystal, polycrystalline and amorphous materials. Single crystal cells on foreign substrates, thin film single crystal cells on superstrates, and multiple junction cells which are either mechanically stacked, monolithically grown, or hybrid structures incorporating both techniques are discussed. Advanced concentrator array technology for space applications is described, and the status of thin film, flexible solar array blanket technology is reported.

  18. Requirements and solutions for future pellet technology; Krav och loesningar foer framtidens pelletsteknik

    Energy Technology Data Exchange (ETDEWEB)

    Paulrud, Susanne; Roennbaeck, Marie; Ryde, Daniel; Laitila, Thomas

    2010-07-01

    Requirements and solutions for future pellet burning technologies. Since 2006, sales of pellet burning technologies to the Swedish residential market have fallen. The main reasons for this decrease are: many of the economically favorable, easy conversions from oil to pellets have already been made; competition from heat pumps; warm winters; a stable electricity price; and the current structure of heating in residential buildings, where electric heating dominates. To reverse this falling trend, pellets need to become more attractive to consumers. This project aimed to analyze the requirements for the next generation of pellet systems and to develop potential solutions in collaboration with the pellets industry. More specifically, the study examined consumers' attitudes toward heating choices and different heating systems through a survey sent to 2,000 house owners across Sweden. The project also included a market analysis of Swedish and international technologies and examined the conditions for Swedish pellet burning technology in different markets. In addition, new solutions and developments for Swedish pellet burning technology are described.

  19. Key Technologies in the Context of Future Networks: Operational and Management Requirements

    Directory of Open Access Journals (Sweden)

    Lorena Isabel Barona López

    2016-12-01

    Full Text Available The concept of Future Networks is based on the premise that current infrastructures require enhanced control, service customization, self-organization and self-management capabilities to meet the new needs in a connected society, especially of mobile users. In order to provide a high-performance mobile system, three main fields must be improved: radio, network, and operation and management. In particular, operation and management capabilities are intended to enable business agility and operational sustainability, where the addition of new services does not imply an excessive increase in capital or operational expenditures. In this context, a set of key enabling technologies has emerged to aid in this field. Concepts such as Software Defined Networking (SDN), Network Function Virtualization (NFV) and Self-Organized Networks (SON) are pushing traditional systems towards the next 5G network generation. This paper presents an overview of the current status of these promising technologies and of ongoing work to fulfill the operational and management requirements of mobile infrastructures. This work also details the use cases and the challenges, taking into account not only SDN, NFV, cloud computing and SON but also other paradigms.

  20. Forecasting Model of Coal Requirement Quantity Based on Grey System Theory

    Institute of Scientific and Technical Information of China (English)

    孙继湖

    2001-01-01

    The generally used methods of forecasting coal requirement quantity include the analogy method, the extrapolation method and the cause-effect analysis method. However, the precision of forecasting results obtained with these methods is relatively low. This paper uses grey system theory and sets up a grey forecasting model, GM(1,3), for coal requirement quantity. The forecast of Chinese coal requirement quantity coincides with the actual values, which shows that the model is reliable. Finally, the model is used to forecast Chinese coal requirement quantity for the next ten years.
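
    The abstract does not reproduce the model equations. As a hedged illustration of the grey forecasting idea, a minimal sketch of the simpler single-variable GM(1,1) model is given below (the paper itself uses the multivariable GM(1,3); the function name and the sample series are assumptions, not taken from the paper):

        import numpy as np

        def gm11_forecast(x0, horizon):
            """Fit a GM(1,1) grey model to the series x0 and forecast `horizon` further steps."""
            x0 = np.asarray(x0, dtype=float)
            n = len(x0)
            x1 = np.cumsum(x0)                                 # accumulated generating operation (AGO)
            z1 = 0.5 * (x1[1:] + x1[:-1])                      # mean sequence of consecutive AGO values
            B = np.column_stack((-z1, np.ones(n - 1)))
            a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # development coefficient and grey input
            k = np.arange(n + horizon)
            x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # time-response function
            x0_hat = np.diff(x1_hat, prepend=0.0)              # inverse AGO restores the original scale
            return x0_hat[n:]                                  # the `horizon` forecast values

        # e.g. a ten-year projection from five historical annual totals (illustrative numbers):
        # gm11_forecast([9.8, 10.4, 11.1, 11.9, 12.6], horizon=10)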

  1. Elite Sports Training as Model for Future Internet Practices?

    OpenAIRE

    Eriksson, Magnus

    2013-01-01

    This paper reflects on the experience of using ethnographic and experimental research at a high-performance athletic training center as a model for drawing conclusions about the future everyday use of ICT and Internet technologies. The research project has consisted of field studies of training sessions and everyday life at an elite training center where athletes live and train, as well as experimental design processes in which new internet and media technologies have been explored within elite sports...

  2. Modern requirements to professional training of future teacher of physical culture in the conditions of informatization of teaching.

    Directory of Open Access Journals (Sweden)

    Naumenko O.I.

    2012-06-01

    Full Text Available The paper examines modern requirements for the professional training of future physical education teachers under the informatization of teaching. It is shown that the introduction of the newest information technologies into teaching places new demands on the training of future physical education teachers, and the abilities that should characterize the modern physical education teacher are indicated. It is noted that the application of information technologies in the field of physical education optimizes the educational process; however, there is a contradiction between the growing role of these technologies in study and their direct application in the field of knowledge. It is established that a future specialist must meet certain requirements regarding information technologies, and that the implementation of software support belongs to the basic measures for ensuring a high-quality level of preparation of future teachers for professional activity.

  3. Predicting the future completing models of observed complex systems

    CERN Document Server

    Abarbanel, Henry

    2013-01-01

    Predicting the Future: Completing Models of Observed Complex Systems provides a general framework for the discussion of model building and validation across a broad spectrum of disciplines. This is accomplished through the development of an exact path integral for use in transferring information from observations to a model of the observed system. Through many illustrative examples drawn from models in neuroscience, fluid dynamics, geosciences, and nonlinear electrical circuits, the concepts are exemplified in detail. Practical numerical methods for approximate evaluations of the path integral are explored, and their use in designing experiments and determining a model's consistency with observations is investigated. Using highly instructive examples, the problems of data assimilation and the means to treat them are clearly illustrated. This book will be useful for students and practitioners of physics, neuroscience, regulatory networks, meteorology and climate science, network dynamics, fluid dynamics, and o...

  4. 26 CFR 54.4980F-1 - Notice requirements for certain pension plan amendments significantly reducing the rate of future...

    Science.gov (United States)

    2010-04-01

    ... 26 Internal Revenue 17 2010-04-01 2010-04-01 false Notice requirements for certain pension plan... (CONTINUED) PENSION EXCISE TAXES § 54.4980F-1 Notice requirements for certain pension plan amendments... a plan amendment of an applicable pension plan that significantly reduces the rate of future benefit...

  5. 17 CFR 1.48 - Requirements for classification of sales or purchases for future delivery as bona fide hedging of...

    Science.gov (United States)

    2010-04-01

    ... 17 Commodity and Securities Exchanges 1 2010-04-01 2010-04-01 false Requirements for....48 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION GENERAL REGULATIONS UNDER THE COMMODITY EXCHANGE ACT Miscellaneous § 1.48 Requirements for classification of sales or purchases...

  6. 17 CFR 1.47 - Requirements for classification of purchases or sales of contracts for future delivery as bona...

    Science.gov (United States)

    2010-04-01

    ... 17 Commodity and Securities Exchanges 1 2010-04-01 2010-04-01 false Requirements for... the regulations. 1.47 Section 1.47 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION GENERAL REGULATIONS UNDER THE COMMODITY EXCHANGE ACT Miscellaneous § 1.47 Requirements for...

  7. Model of future officers' availability to the management physical training

    Directory of Open Access Journals (Sweden)

    Olkhovy O.M.

    2012-03-01

    Full Text Available The purpose of this work is to create a model of a graduating student's readiness to carry out the official tasks of managing, organizing and conducting physical training in the course of military-professional activity. An analysis of more than 40 sources was conducted, together with a questionnaire survey of 21 experts. For introducing the model into the system of students' physical training, the list of its basic constituents is defined: theoretical-methodical readiness, functional-physical readiness, and organizational-administrative readiness. It is established that the readiness of future officers for military-professional activity presupposes determining the level of development of motor abilities and general physical qualities.

  8. Future needs and requirements for AMS {sup 14}C standards and reference materials

    Energy Technology Data Exchange (ETDEWEB)

    Scott, E. Marian E-mail: marian@stats.gla.ac.uk; Boaretto, Elisabetta; Bryant, Charlotte; Cook, Gordon T.; Gulliksen, Steinar; Harkness, Doug D.; Heinemeier, Jan; McGee, Edward; Naysmith, Philip; Possnert, Goran; Plicht, Hans van der; Strydonck, Mark van

    2004-08-01

    {sup 14}C measurement uses a number of standards and reference materials with different properties. Historically, the absolute calibration of {sup 14}C measurement was tied to 1890 wood, through the 'primary' standard of NBS-OxI (produced by the National Bureau of Standards, now NIST – National Institute of Standards and Technology), subsequently replaced by NBS-OxII. These are both internationally calibrated and certified materials, whose {sup 14}C activities are known absolutely. A second tier of materials, often called secondary standards or reference materials, and including internationally recognised materials such as ANU-sucrose (now also IAEA-C6), Chinese sucrose and the IAEA C1-C6 series, augmented by additional oxalic acid samples, are also used routinely. The activity of these materials has been estimated from large numbers of measurements made by many laboratories. Recently, further natural materials from the Third and Fourth International Radiocarbon Inter-comparisons (TIRI and FIRI) have been added to this list. The activities of these standards and reference materials span both the applied {sup 14}C age range and the chemical composition range of typical samples, but this is not achieved uniformly and there is a continuing need for reference materials for laboratory quality control and measurement-traceability purposes. In this paper, we review the development of {sup 14}C standards and reference materials and consider the future requirements for such materials within the {sup 14}C AMS community.

  9. Space Resource Requirements for Future In-Space Propellant Production Depots

    Science.gov (United States)

    Smitherman, David; Fikes, John; Roy, Stephanie; Henley, Mark W.; Potter, Seth D.; Howell, Joe T. (Technical Monitor)

    2001-01-01

    In 2000 and 2001 studies were conducted at the NASA Marshall Space Flight Center on the technical requirements and commercial potential for propellant production depots in low Earth orbit (LEO) to support future commercial, NASA, and other Agency missions. Results indicate that propellant production depots appear to be technically feasible given continued technology development, and there is a substantial growing market that depots could support. Systems studies showed that the most expensive part of transferring payloads to geosynchronous orbit (GEO) is the fuel. A cryogenic propellant production and storage depot stationed in LEO could lower the cost of missions to GEO and beyond. Propellant production separates water into hydrogen and oxygen through electrolysis. This process utilizes large amounts of power, therefore a depot derived from advanced space solar power technology was defined. Results indicate that in the coming decades there could be a significant demand for water-based propellants from Earth, moon, or asteroid resources if in-space transfer vehicles (upper stages) transitioned to reusable systems using water based propellants. This type of strategic planning move could create a substantial commercial market for space resources development, and ultimately lead toward significant commercial infrastructure development within the Earth-Moon system.

  10. A Review of Engine Seal Performance and Requirements for Current and Future Army Engine Platforms

    Science.gov (United States)

    Delgado, Irebert R.; Proctor, Margaret P.

    2008-01-01

    Sand ingestion continues to impact combat ground and air vehicles in military operations in the Middle East. The T-700 engine used in Apache and Blackhawk helicopters has been subjected to increased overhauls due to sand and dust ingestion during desert operations. Engine component wear includes compressor and turbine blades/vanes, resulting in decreased engine power and efficiency. Engine labyrinth seals have also been subjected to sand and dust erosion, resulting in tooth tip wear, increased clearances, and loss in efficiency. For the current investigation, a brief overview is given of the history of the T-700 engine development with respect to sand and dust ingestion requirements. The operational condition of labyrinth seals taken out of service from four different locations of the T-700 engine during engine overhauls is examined. Collaborative efforts between the Army and NASA to improve turbine engine seal leakage and life capability are currently focused on noncontacting, low leakage, compliant designs. These new concepts should be evaluated for their tolerance to sand-laden air. Future R&D efforts to improve seal erosion resistance and operation in desert environments are recommended.

  11. DECISION MAKING MODELING OF CONCRETE REQUIREMENTS

    Directory of Open Access Journals (Sweden)

    Suhartono Irawan

    2001-01-01

    Full Text Available This paper presents the results of an experimental comparison between predicted and in-practice concrete strength. The scope of the evaluation is the optimisation of the cement content for different concrete grades, achieved by bringing the target mean value of the test cubes closer to the required characteristic strength value through a reduction of the standard deviation. Keywords: concrete mix design, acceptance control, optimisation, cement content.
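
    For context, the acceptance-control relation that usually underlies such an optimisation (an assumption here, since the abstract does not quote the formula) links the target mean strength of the mix to the required characteristic strength through the standard deviation of the cube results:

        f_m = f_{ck} + k\,\sigma , \qquad k \approx 1.64 \ \text{for a 5\% defective rate}

    Reducing the standard deviation \sigma lowers the target mean f_m that the mix must be designed for, which in turn allows the cement content to be reduced for a given grade.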

  12. Specification of advanced safety modeling requirements (Rev. 0).

    Energy Technology Data Exchange (ETDEWEB)

    Fanning, T. H.; Tautges, T. J.

    2008-06-30

    The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond design basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represents an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities. In doing this, the advanced fast reactor safety models

  13. A MODEL FOR ALIGNING SOFTWARE PROJECTS REQUIREMENTS WITH PROJECT TEAM MEMBERS REQUIREMENTS

    Directory of Open Access Journals (Sweden)

    Robert Hans

    2013-02-01

    Full Text Available The fast-paced, dynamic environment within which information and communication technology (ICT) projects are run, as well as ICT professionals' constantly changing requirements, presents a challenge for project managers in terms of aligning projects' requirements with project team members' requirements. This research paper posits that if projects' requirements are properly aligned with team members' requirements, then this will result in a balanced decision approach. Moreover, such an alignment will result in the realization of employees' needs as well as meeting the project's needs. This paper presents a Project's requirements and project Team members' requirements (PrTr) alignment model and argues that a balanced decision which meets both the software project's requirements and the team members' requirements can be achieved through the application of the PrTr alignment model.

  14. Supporting requirements model evolution throughout the system life-cycle

    OpenAIRE

    Ernst, Neil; Mylopoulos, John; Yu, Yijun; Ngyuen, Tien T.

    2008-01-01

    Requirements models are essential not just during system implementation, but also to manage system changes post-implementation. Such models should be supported by a requirements model management framework that allows users to create, manage and evolve models of domains, requirements, code and other design-time artifacts along with traceability links between their elements. We propose a comprehensive framework which delineates the operations and elements necessary, and then describe a tool imp...

  15. The implications of user requirements for the functionality and content of a future EGDI

    Science.gov (United States)

    Pedersen, Mikael; Tulstrup, Jørgen

    2014-05-01

    The FP7 co-funded EGDI-Scope project is conducting analyses that form the basis for the development of an implementation plan for a future European Geological Data Infrastructure (EGDI), the aim of which will be to serve pan-European geological information from the European geological survey organisations. An important aspect of the project has been to consult stakeholders in order to deduce requirements, which is a fundamental prerequisite for making recommendations on the content and technical design of the system. It is indisputable that EGDI will have to build on international standards such as OGC and CGI and take into account legislative requirements from e.g. the INSPIRE directive. This will support the tasks of data providers and facilitate integration with other e-Infrastructures, but will not in itself improve the end-user experience. In order to make the future EGDI a successful online contributor of geological information, EGDI-Scope has therefore been looking very concretely into the needs and expectations of various user groups. Most people have clear expectations anno 2014. They want to be able to search the web for information, and once found, they expect fast-performing, intuitive web applications with buttons to click, maps to navigate and reliable content to fulfil their immediate needs. In order for the EGDI to handle such requirements, a number of use cases for various thematic areas have been assessed. The use cases reveal (for example) that information about the geological composition of the ground is critical for the assessment of things like ecosystems or ground water quality. But where ecosystem assessment relies on the composition of the surface layers, groundwater geochemistry relies on the lithology of subsurface layers. For both scenarios, harmonised, pan-European geological maps are very important, but the harmonisation should not only relate to lithological classes, but also to the depth representation. The use cases also make clear

  16. Population balance models: a useful complementary modelling framework for future WWTP modelling.

    Science.gov (United States)

    Nopens, Ingmar; Torfs, Elena; Ducoste, Joel; Vanrolleghem, Peter A; Gernaey, Krist V

    2015-01-01

    Population balance models (PBMs) represent a powerful modelling framework for the description of the dynamics of properties that are characterised by distributions. This distribution of properties under transient conditions has been demonstrated in many chemical engineering applications. Modelling efforts of several current and future unit processes in wastewater treatment plants could potentially benefit from this framework, especially when distributed dynamics have a significant impact on the overall unit process performance. In these cases, current models that rely on average properties cannot sufficiently capture the true behaviour and even lead to completely wrong conclusions. Examples of distributed properties are bubble size, floc size, crystal size or granule size. In these cases, PBMs can be used to develop new knowledge that can be embedded in our current models to improve their predictive capability. Hence, PBMs should be regarded as a complementary modelling framework to biokinetic models. This paper provides an overview of current applications, future potential and limitations of PBMs in the field of wastewater treatment modelling, thereby looking over the fence to other scientific disciplines.
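
    As an illustration only (the paper is a review and does not single out one equation), the one-dimensional population balance for the number density n(x,t) of particles with property x (e.g. floc, granule or bubble size), growth rate G, birth term B and death term D can be written as

        \frac{\partial n(x,t)}{\partial t} + \frac{\partial}{\partial x}\left[ G(x)\, n(x,t) \right] = B(x,t) - D(x,t)

    Solving such an equation alongside a biokinetic model yields the full distribution of the property rather than a single average value, which is exactly the complementarity the authors argue for.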

  17. The Job Demands–Resources model: Challenges for future research

    Directory of Open Access Journals (Sweden)

    Evangelia Demerouti

    2011-05-01

    Full Text Available Motivation: The motivation of this overview is to present the state of the art of the Job Demands–Resources (JD–R) model whilst integrating the various contributions to the special issue. Research purpose: To provide an overview of the JD–R model, which incorporates many possible working conditions and focuses on both negative and positive indicators of employee well-being. Moreover, the studies of the special issue were introduced. Research design: Qualitative and quantitative studies on the JD–R model were reviewed to shed light on the health and motivational processes suggested by the model. Main findings: Next to the confirmation of the two suggested processes of the JD–R model, the studies of the special issue showed that the model can be used to predict workplace bullying, incidences of upper respiratory tract infection, work-based identity, and early retirement intentions. Moreover, whilst psychological safety climate could be considered as a hypothetical precursor of job demands and resources, compassion satisfaction moderated the health process of the model. Contribution/value-add: The findings of previous studies and the studies of the special issue were integrated in the JD–R model, which can be used to predict well-being and performance at work. New avenues for future research were suggested. Practical/managerial implications: The JD–R model is a framework that organisations can use to improve employee health and motivation, whilst simultaneously improving various organisational outcomes.

  18. Long-term durum wheat monoculture: modelling and future projection

    Directory of Open Access Journals (Sweden)

    Ettore Bernardoni

    2012-03-01

    Full Text Available The potential effects of future climate change on grain production of a winter durum wheat cropping system were investigated. Based on future climate change projections, derived from a statistical downscaling process applied to the HadCM3 general circulation model and referring to two IPCC scenarios (A2 and B1), the response in yield and aboveground biomass (AGB) and the variation in total organic carbon (TOC) were explored. The software used in this work is a hybrid dynamic simulation model able to simulate, under different pedoclimatic conditions, the processes involved in the cropping system, such as crop growth and development and the water and nitrogen balances. It implements different approaches to ensure accurate simulation of the main processes related to the soil-crop-atmosphere continuum. The model was calibrated using soil data, crop yield, AGB and phenology from a long-term experiment located in the Apulia region. The calibration was performed on data collected in the period 1978–1990; validation was carried out on the 1991–2009 data. The phenology simulation was sufficiently accurate, showing some limitation only in predicting physiological maturity. Yields and AGB were predicted with acceptable accuracy during both calibration and validation. The CRM was always close to its optimum value, the EF was positive in every case, and the r2 values were good, although in some cases values lower than 0.6 were obtained. The slope of the linear regression between measured and simulated values was always close to 1, indicating an overall good performance of the model. Both future climate scenarios led to a general increase in yields but a slight decrease in AGB values. The data showed variations in total production and yield among the different periods due to climate variation. The TOC evolution suggests that the combination of temperature and precipitation is the main factor affecting TOC variation under future scenarios.
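
    The agreement statistics cited above (CRM, EF and r2) are standard model-evaluation indices. A minimal sketch of how they are typically computed from paired observed and simulated values is given below (illustrative only, not the authors' code; the function name is an assumption):

        import numpy as np

        def agreement_stats(obs, sim):
            """Return CRM, EF (modelling efficiency) and r2 for paired observed/simulated series."""
            obs, sim = np.asarray(obs, dtype=float), np.asarray(sim, dtype=float)
            crm = (obs.sum() - sim.sum()) / obs.sum()                               # optimum value 0
            ef = 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)   # optimum value 1
            r2 = np.corrcoef(obs, sim)[0, 1] ** 2                                   # squared Pearson correlation
            return crm, ef, r2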

  19. Should we believe model predictions of future climate change? (Invited)

    Science.gov (United States)

    Knutti, R.

    2009-12-01

    As computers get faster and our understanding of the climate system improves, climate models to predict the future are getting more complex by including more and more processes, and they are run at higher and higher resolution to resolve more of the small scale processes. As a result, some of the simulated features and structures, e.g. ocean eddies or tropical cyclones look surprisingly real. But are these deceptive? A pattern can look perfectly real but be in the wrong place. So can the current global models really provide the kind of information on local scales and on the quantities (e.g. extreme events) that the decision maker would need to know to invest for example in adaptation? A closer look indicates that evaluating skill of climate models and quantifying uncertainties in predictions is very difficult. This presentation shows that while models are improving in simulating the climate features we observe (e.g. the present day mean state, or the El Nino Southern Oscillation), the spread from multiple models in predicting future changes is often not decreasing. The main problem is that (unlike with weather forecasts for example) we cannot evaluate the model on a prediction (for example for the year 2100) and we have to use the present, or past changes as metrics of skills. But there are infinite ways of testing a model, and many metrics used to test models do not clearly relate to the prediction. Therefore there is little agreement in the community on metrics to separate ‘good’ and ‘bad’ models, and there is a concern that model development, evaluation and posterior weighting or ranking of models are all using the same datasets. While models are continuously improving in representing what we believe to be the key processes, many models also share ideas, parameterizations or even pieces of model code. The current models can therefore not be considered independent. Robustness of a model simulated result is often interpreted as increasing the confidence

  20. Requirements model for an e-Health awareness portal

    Science.gov (United States)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Nawi, Mohd Nasrun M.

    2016-08-01

    Requirements engineering is at the heart and foundation of the software engineering process. Poor quality requirements inevitably lead to poor quality software solutions, and poor requirements modeling is tantamount to designing a poor quality product. Quality-assured requirements development therefore goes hand in hand with usable products, giving the software the quality it demands. In the light of the foregoing, the requirements for an e-Ebola Awareness Portal were modeled with close attention to these software engineering concerns. The requirements for the e-Health Awareness Portal are modeled as a contribution to the fight against Ebola and help in the fulfillment of the United Nations' Millennium Development Goal No. 6. In this study, requirements were modeled using the UML 2.0 modeling technique.

  1. Extending enterprise architecture modelling with business goals and requirements

    NARCIS (Netherlands)

    Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; Sinderen, van Marten

    2011-01-01

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling te

  2. Future meteorological drought: projections of regional climate models for Europe

    Science.gov (United States)

    Stagge, James; Tallaksen, Lena; Rizzi, Jonathan

    2015-04-01

    In response to the major European drought events of the last decade, projecting future drought frequency and severity in a non-stationary climate is a major concern for Europe. Prior drought studies have identified regional hotspots in the Mediterranean and Eastern European regions, but have otherwise produced conflicting results with regard to future drought severity. Some of this disagreement is likely related to the relatively coarse resolution of Global Climate Models (GCMs) and regional averaging, which tends to smooth extremes. This study makes use of the most current Regional Climate Models (RCMs) forced with CMIP5 climate projections to quantify the projected change in meteorological drought for Europe during the next century at a fine, gridded scale. Meteorological drought is quantified using the Standardized Precipitation Index (SPI) and the Standardized Precipitation-Evapotranspiration Index (SPEI), which normalize accumulated precipitation and climatic water balance anomaly, respectively, for a specific location and time of year. By comparing projections for these two indices, the importance of precipitation deficits can be contrasted with the importance of evapotranspiration increases related to temperature changes. Climate projections are based on output from CORDEX (the Coordinated Regional Climate Downscaling Experiment), which provides high resolution regional downscaled climate scenarios that have been extensively tested for numerous regions around the globe, including Europe. SPI and SPEI are then calculated on a gridded scale at a spatial resolution of either 0.44 degrees (~50 km) or 0.11 degrees (~12.5km) for the three projected emission pathways (rcp26, rcp45, rcp85). Analysis is divided into two major sections: first validating the models with respect to observed historical trends in meteorological drought from 1970-2005 and then comparing drought severity and frequency during three future time periods (2011-2040, 2041-2070, 2071-2100) to the
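
    The SPI referred to above follows a well-documented recipe: accumulate precipitation over a chosen number of months, fit a distribution for each calendar month, then map the cumulative probability onto a standard normal deviate; SPEI applies the same recipe to precipitation minus potential evapotranspiration. The sketch below illustrates that recipe and is not the authors' code; the function name, the gamma distribution choice and the 3-month default scale are assumptions:

        import numpy as np
        from scipy import stats

        def spi(monthly_precip, scale=3):
            """Standardized Precipitation Index for a 1-D array of monthly precipitation totals."""
            p = np.convolve(monthly_precip, np.ones(scale), mode="valid")  # k-month accumulation
            out = np.full(len(p), np.nan)
            for month in range(12):                        # fit each calendar month separately
                idx = np.arange(month, len(p), 12)
                sample = p[idx]
                a, loc, b = stats.gamma.fit(sample[sample > 0], floc=0)    # gamma fit to wet totals
                q = np.mean(sample == 0)                   # probability of a zero accumulation
                cdf = q + (1.0 - q) * stats.gamma.cdf(sample, a, loc=loc, scale=b)
                out[idx] = stats.norm.ppf(cdf)             # equiprobability transform to N(0,1)
            return out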

  3. Rodent models in Down syndrome research: impact and future opportunities.

    Science.gov (United States)

    Herault, Yann; Delabar, Jean M; Fisher, Elizabeth M C; Tybulewicz, Victor L J; Yu, Eugene; Brault, Veronique

    2017-10-01

    Down syndrome is caused by trisomy of chromosome 21. To date, a multiplicity of mouse models with Down-syndrome-related features has been developed to understand this complex human chromosomal disorder. These mouse models have been important for determining genotype-phenotype relationships and identification of dosage-sensitive genes involved in the pathophysiology of the condition, and in exploring the impact of the additional chromosome on the whole genome. Mouse models of Down syndrome have also been used to test therapeutic strategies. Here, we provide an overview of research in the last 15 years dedicated to the development and application of rodent models for Down syndrome. We also speculate on possible and probable future directions of research in this fast-moving field. As our understanding of the syndrome improves and genome engineering technologies evolve, it is necessary to coordinate efforts to make all Down syndrome models available to the community, to test therapeutics in models that replicate the whole trisomy and design new animal models to promote further discovery of potential therapeutic targets. © 2017. Published by The Company of Biologists Ltd.

  4. Biological ensemble modeling to evaluate potential futures of living marine resources

    DEFF Research Database (Denmark)

    Gårdmark, Anna; Lindegren, Martin; Neuenfeldt, Stefan

    2013-01-01

    Natural resource management requires approaches to understand and handle sources of uncertainty in future responses of complex systems to human activities. Here we present one such approach, the "biological ensemble modeling approach," using the Eastern Baltic cod (Gadus morhua callarias ... trajectories carried through to uncertainty of cod responses. Models ignoring the feedback from prey on cod showed large interannual fluctuations in cod dynamics and were more sensitive to the underlying uncertainty of climate forcing than models accounting for such stabilizing predator–prey feedbacks. Yet ...

  5. Business Models for Future Networked 3D Services

    OpenAIRE

    Bøhler, Marianne

    2011-01-01

    3-Dimensional (3D) technology has seen increasingly widespread use over recent years, although the concept of 3D has been around for many years. Large studio movies being released in 3D and the development of 3DTVs and 3D games are the major reasons for its increasing popularity. The purpose of this thesis is to specify future collaboration space services based on the use of autostereoscopic 3D technology and propose possible business models. The collaboration spaces are geographically s...

  6. AN OPTION PRICING MODEL UNDER FUTURE REVENUE UNCERTAINTY

    Institute of Scientific and Technical Information of China (English)

    XueMinggao

    2003-01-01

    The purpose of this paper is to discuss how the value of a high-tech firm can be rationally assessed by taking into account managerial flexibility when its future revenue is uncertain, so that the firm's manager can make rational investment decisions. Using stochastic control theory, the paper shows that the firm's value satisfies a partial differential equation, and analyzes the value of managerial flexibility within the framework of real-option theory. Finally, a comparative static analysis and a simple application of the model are given.
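
    The abstract does not reproduce the equation. As an illustration only, in the standard real-options setting (assumed here) where the firm's revenue R follows a geometric Brownian motion with volatility \sigma, risk-free rate r and payout rate \delta, the firm's value V(R,t) satisfies a Black–Scholes-type partial differential equation:

        \frac{\partial V}{\partial t} + \tfrac{1}{2}\sigma^{2} R^{2}\,\frac{\partial^{2} V}{\partial R^{2}} + (r-\delta)\,R\,\frac{\partial V}{\partial R} - rV = 0

    Managerial flexibility then enters through the boundary conditions, for example the option to expand, defer or abandon the project when revenue crosses a threshold.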

  7. Extending enterprise architecture modelling with business goals and requirements

    Science.gov (United States)

    Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten

    2011-02-01

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques for EA focus on the products, services, processes and applications of an enterprise. In addition, techniques may be provided to describe structured requirements lists and use cases. Little support is available however for modelling the underlying motivation of EAs in terms of stakeholder concerns and the high-level goals that address these concerns. This article describes a language that supports the modelling of this motivation. The definition of the language is based on existing work on high-level goal and requirements modelling and is aligned with an existing standard for enterprise modelling: the ArchiMate language. Furthermore, the article illustrates how EA can benefit from analysis techniques from the requirements engineering domain.

  8. Mixing Formal and Informal Model Elements for Tracing Requirements

    DEFF Research Database (Denmark)

    Jastram, Michael; Hallerstede, Stefan; Ladenberger, Lukas

    2011-01-01

    Tracing between informal requirements and formal models is challenging. A method for such tracing should permit to deal efficiently with changes to both the requirements and the model. A particular challenge is posed by the persisting interplay of formal and informal elements. In this paper, we ... a system for traceability with a state-based formal method that supports refinement. We do not require all specification elements to be modelled formally and support incremental incorporation of new specification elements into the formal model. Refinement is used to deal with larger amounts of requirements ...

  9. Development of technology-neutral safety requirements for the regulation of future nuclear power reactors: Back to basics

    Energy Technology Data Exchange (ETDEWEB)

    Tronea, Madalina, E-mail: madalina.tronea@gmail.co [Faculty of Physics, University of Bucharest (Romania)

    2011-03-15

    This paper explores the current trends as regards the development of technology-neutral safety requirements to be used in the regulation of future nuclear power reactors and the role of the quantitative safety goals in the design of reactor safety systems. The use of the recommendations of the International Commission on Radiological Protection (ICRP) on protection against potential exposure could form the basis of a technology-neutral framework for safety requirements on new reactor designs and could contribute to international harmonisation of nuclear safety assessment practices as part of the licensing processes for future nuclear power plants.

  10. Modeling of Soybean under Present and Future Climates in Mozambique

    Directory of Open Access Journals (Sweden)

    Manuel António Dina Talacuece

    2016-06-01

    Full Text Available This study aims to calibrate and validate the generic crop model (CROPGRO-Soybean) and estimate the soybean yield, considering simulations with different sowing times for the current period (1990–2013) and a future climate scenario (2014–2030). The database used came from observed data, nine climate models of the CORDEX (Coordinated Regional Climate Downscaling Experiment) Africa framework and the MERRA (Modern Era Retrospective-Analysis for Research and Applications) reanalysis. The calibration and validation data for the model were acquired in field experiments carried out in the 2009/2010 and 2010/2011 growing seasons in the experimental area of the International Institute of Tropical Agriculture (IITA) in Angónia, Mozambique. The yield of two soybean cultivars, Tgx 1740-2F and Tgx 1908-8F, was evaluated in the experiments and modeled for two distinct CO2 concentrations. Our model simulation results indicate that the fertilization effect leads to yield gains for both cultivars, ranging from 11.4% (Tgx 1908-8F) to 15% (Tgx 1740-2F) when compared to the performance of those cultivars under the current atmospheric CO2 concentration. Moreover, our results show that MERRA, the RegCM4 (Regional Climatic Model version 4) and the CNRM-CM5 (Centre National de Recherches Météorologiques – Climatic Model version 5) models provided more accurate estimates of yield, while other models underestimated yield compared to observations, a fact that was demonstrated to be related to the models' capability of reproducing precipitation and the amount of surface radiation.

  11. Using cognitive modeling for requirements engineering in anesthesiology

    NARCIS (Netherlands)

    Pott, C; le Feber, J

    2005-01-01

    Cognitive modeling is a complexity reducing method to describe significant cognitive processes under a specified research focus. Here, a cognitive process model for decision making in anesthesiology is presented and applied in requirements engineering. Three decision making situations of

  12. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    Energy Technology Data Exchange (ETDEWEB)

    D. E. Shropshire; W. H. West

    2005-11-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies.

  13. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    with simple unified modelling language (UML) requirements models, it is not easy for the development team to get confidence on the stakeholders' requirements validation. This paper describes an approach, based on the construction of executable interactive prototypes, to support the validation of workflow...

  14. Influence of climate model variability on projected Arctic shipping futures

    Science.gov (United States)

    Stephenson, Scott R.; Smith, Laurence C.

    2015-11-01

    Though climate models exhibit broadly similar agreement on key long-term trends, they have significant temporal and spatial differences due to intermodel variability. Such variability should be considered when using climate models to project the future marine Arctic. Here we present multiple scenarios of 21st-century Arctic marine access as driven by sea ice output from 10 CMIP5 models known to represent well the historical trend and climatology of Arctic sea ice. Optimal vessel transits from North America and Europe to the Bering Strait are estimated for two periods representing early-century (2011-2035) and mid-century (2036-2060) conditions under two forcing scenarios (RCP 4.5/8.5), assuming Polar Class 6 and open-water vessels with medium and no ice-breaking capability, respectively. Results illustrate that projected shipping viability of the Northern Sea Route (NSR) and Northwest Passage (NWP) depends critically on model choice. The eastern Arctic will remain the most reliably accessible marine space for trans-Arctic shipping by mid-century, while outcomes for the NWP are particularly model-dependent. Omitting three models (GFDL-CM3, MIROC-ESM-CHEM, and MPI-ESM-MR), our results would indicate minimal NWP potential even for routes from North America. Furthermore, the relative importance of the NSR will diminish over time as the number of viable central Arctic routes increases gradually toward mid-century. Compared to vessel class, climate forcing plays a minor role. These findings reveal the importance of model choice in devising projections for strategic planning by governments, environmental agencies, and the global maritime industry.

  15. Requirements for future control room and visualization features in the Web-of-Cells framework defined in the ELECTRA project

    DEFF Research Database (Denmark)

    Tornelli, Carlo; Zuelli, Roberto; Marinelli, Mattia

    2017-01-01

    This paper gives an overview of the general requirements for the control rooms of future power systems (2030+). The roles and activities in future control centres will evolve with respect to the switching, dispatching and restoration functions currently active. The control centre ... operators will supervise the power system and intervene - when necessary - thanks to the maturation and wide-scale deployment of flexible controls. For the identification of control room requirements, general trends in power system evolution are considered, mainly the outcomes of the ELECTRA IRP ... project, which proposes a new Web-of-Cells (WoC) power system control architecture. Dedicated visualization features are proposed, aimed at supporting the control room operators' activities in a WoC-oriented approach. Furthermore, the work takes into account the point of view of network operators about future...

  16. Requirements for future gasoline DI systems and respective platform solutions; Anforderungen an zukuenftige Otto DI-Einspritzsysteme und entsprechende Plattformloesungen

    Energy Technology Data Exchange (ETDEWEB)

    Schoeppe, Detlev; Greff, Andreas; Zhang, Hong; Frenzel, Holger; Roesel, Gerd; Achleitner, Erwin; Kapphan, Friedrich [Continental Automotive GmbH, Regensburg (Germany)

    2011-07-01

    The spark-ignition engine is the world's most common type of engine. Its outstanding cost/benefit ratio and high performance mean it will continue to be developed and improved to meet future, more stringent requirements on aspects such as "fun to drive" qualities, low fuel consumption and particle number emission limits. Meeting these future requirements demands measures in the area of the air-fuel system and the ignition system. To meet these objectives, Continental can supply a range of tailored platform solutions for the different markets. A basic enabling technology for reducing fuel consumption is the turbocharged direct-injection gasoline engine, for which Continental has developed injection system technology and an innovative turbocharger which improves the dynamic behavior of the engine. In combination with variable valve train functionality, this technology opens up new scope for reducing fuel consumption by downsizing and downspeeding. A key part in meeting future emission requirements is played by the injection system. In particular, the improvements here are focused on two parameters: accurate fuel metering, particularly for very small injection quantities, and spray preparation. Innovative injection systems incorporating these features are part of the Continental portfolio. A mechatronic approach allows the performance of injection components to be improved in order to meet future requirements on accurate fuel metering and mixture formation at the injection valves. A mechatronic approach can also reduce the noise emissions of the high-pressure pump. Turbocharged direct-injection engines require more ignition energy than naturally aspirated engines. These requirements are met by an innovative ignition system. The open-architecture, scalable EMS 3 engine management platform presented at last year's Vienna Engine Symposium provides a convenient way of integrating the various new functionalities for different markets.

  17. Performance Requirements Modeling andAssessment for Active Power Ancillary Services

    DEFF Research Database (Denmark)

    Bondy, Daniel Esteban Morales; Thavlov, Anders; Tougaard, Janus Bundsgaard Mosbæk

    2017-01-01

    New sources of ancillary services are expected in the power system. For large and conventional generation units the dynamic response is well understood and detailed individual measurement is feasible, which factors into the straightforward performance requirements applied today. For secure power system operation, a reliable service delivery is required, yet it may not be appropriate to apply conventional performance requirements to new technologies and methods. The service performance requirements and assessment methods therefore need to be generalized and standardized in order to include future ancillary service sources. This paper develops a modeling method for ancillary services performance requirements, including performance and verification indices. The use of the modeling method and the indices is exemplified in two case studies.

  18. Modeling daily realized futures volatility with singular spectrum analysis

    Science.gov (United States)

    Thomakos, Dimitrios D.; Wang, Tao; Wille, Luc T.

    2002-09-01

    Using singular spectrum analysis (SSA), we model the realized volatility and logarithmic standard deviations of two important futures return series. The realized volatility and logarithmic standard deviations are constructed following the methodology of Andersen et al. [J. Am. Stat. Ass. 96 (2001) 42-55] using intra-day transaction data. We find that SSA decomposes the volatility series quite well and effectively captures both the market trend (accounting for about 34-38% of the total variance in the series) and, more importantly, a number of underlying market periodicities. Reliable identification of any periodicities is extremely important for options pricing and risk management and we believe that SSA can be a useful addition to the financial practitioners’ toolbox.
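
    The pipeline described above can be sketched in a few lines: realized volatility for a day is built from squared intra-day returns, and a basic SSA pass embeds the series in a trajectory matrix, takes an SVD, and reconstructs the leading components as a rough "trend". The Python snippet below is a minimal illustration under assumed parameters (the synthetic data, window length and number of components are arbitrary choices), not the authors' implementation.

        import numpy as np

        def realized_volatility(intraday_returns):
            """Daily realized volatility: square root of the sum of squared intra-day returns."""
            return np.sqrt(np.sum(np.square(intraday_returns)))

        def ssa_trend(series, window=30, n_components=2):
            """Reconstruct the leading SSA components (a rough 'trend') of a 1-D series."""
            x = np.asarray(series, dtype=float)
            n = len(x)
            k = n - window + 1
            # Trajectory (Hankel) matrix: each column is a lagged window of the series.
            traj = np.column_stack([x[i:i + window] for i in range(k)])
            u, s, vt = np.linalg.svd(traj, full_matrices=False)
            # Keep the leading components and rebuild the trajectory matrix.
            approx = (u[:, :n_components] * s[:n_components]) @ vt[:n_components, :]
            # Diagonal averaging (Hankelization) maps the matrix back to a series.
            recon = np.zeros(n)
            counts = np.zeros(n)
            for col in range(k):
                recon[col:col + window] += approx[:, col]
                counts[col:col + window] += 1
            return recon / counts

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            days, ticks = 500, 288  # e.g. 5-minute returns over a trading day (illustrative)
            # Fake intra-day returns whose dispersion drifts slowly from day to day.
            scale = 0.01 + 0.005 * np.sin(np.arange(days) / 50.0)
            vol = np.array([realized_volatility(rng.normal(0, s / np.sqrt(ticks), ticks)) for s in scale])
            trend = ssa_trend(vol, window=60, n_components=2)
            explained = 1.0 - np.var(vol - trend) / np.var(vol)
            print(f"share of variance captured by the leading SSA components: {explained:.2f}")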

  19. The Adaptation Fund: a model for the future?

    Energy Technology Data Exchange (ETDEWEB)

    Chandani, Achala; Harmeling, Sven; Kaloga, Alpha Oumar

    2009-08-15

    With millions of the poor already facing the impacts of a changing climate, adaptation is a globally urgent – and costly – issue. The Adaptation Fund, created under the Kyoto Protocol, has unique features that could herald a new era of international cooperation on adaptation. Its governance structure, for instance, offers a fresh approach to fund management under the UN climate convention. The Fund's Board has also developed a constructive working atmosphere, and further progress is expected before the 2009 climate summit in Copenhagen. But developing countries' demand for adaptation funding is huge: conservative estimates put it at US$50 billion a year. The Fund's current structure and funding base are clearly only a first step towards filling that gap. And despite its significant progress over the last 18 months, many countries, particularly in the developed world, remain sceptical about this approach. Looking in detail at the Fund's evolution offers insight into its future potential as a model for adaptation finance.

  20. Modeling of Future Initial Teacher of Foreign Language Training, Using Situation Analysis

    Directory of Open Access Journals (Sweden)

    Maryana М. Sidun

    2012-12-01

    Full Text Available The article discloses the content of modeling the initial training of future foreign language teachers using situation analysis, and defines the stages of modeling during the formation of the future teacher's professional competence: preparatory, analytical and executive.

  1. Process Model for Defining Space Sensing and Situational Awareness Requirements

    Science.gov (United States)

    2006-04-01

    A process model for defining systems for space sensing and space situational awareness is presented. The paper concentrates on eight steps for determining the requirements, including decision maker needs, system requirements, exploitation methods and vulnerabilities, critical capabilities, and identification of attack scenarios. Utilization of the USAF anti-tamper (AT) implementation process as a process-model departure point for space sensing and situational awareness (SSSA) is presented.

  2. Modelling volatility recurrence intervals in the Chinese commodity futures market

    Science.gov (United States)

    Zhou, Weijie; Wang, Zhengxin; Guo, Haiming

    2016-09-01

    The occurrence of extreme events attracts much research. We study the volatility recurrence intervals of Chinese commodity futures prices: the results show that the probability distributions of the scaled volatility recurrence intervals collapse onto a uniform scaling curve for different thresholds q, so the probability distribution of extreme events can be deduced from that of normal events. The tail of the scaling curve is well fitted by a Weibull form, which is significance-tested by KS measures. Both short-term and long-term memory are present in the recurrence intervals for different thresholds q, which indicates that the recurrence intervals can be predicted. In addition, similar to volatility itself, the volatility recurrence intervals also exhibit clustering. Through Monte Carlo simulation, we synthesise ARMA and GARCH-class sequences similar to the original data and identify the reason behind the clustering: the larger the parameter d of the FIGARCH model, the stronger the clustering effect. Finally, we use the Fractionally Integrated Autoregressive Conditional Duration (FIACD) model to analyse the recurrence interval characteristics. The results indicate that the FIACD model may provide a method for analysing volatility recurrence intervals.
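
    The interval construction and the Weibull check can be illustrated with a short sketch: collect the times between successive exceedances of a volatility threshold q, rescale by the mean interval, and fit a Weibull distribution. The snippet below uses a synthetic heavy-tailed series and SciPy's weibull_min; the threshold choice and the data are assumptions for illustration, not the paper's dataset or exact procedure.

        import numpy as np
        from scipy import stats

        def recurrence_intervals(volatility, q):
            """Intervals (in samples) between successive exceedances of the threshold q."""
            exceed = np.flatnonzero(volatility > q)
            return np.diff(exceed)

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            # Stand-in volatility proxy: absolute values of a heavy-tailed return series.
            vol = np.abs(stats.t.rvs(df=3, size=20_000, random_state=rng))
            q = np.quantile(vol, 0.95)              # threshold picked as the 95th percentile
            tau = recurrence_intervals(vol, q)
            scaled = tau / tau.mean()               # scaling step behind the "universal" curve
            # Fit a Weibull to the scaled intervals and check the fit with a KS statistic.
            c, loc, scale = stats.weibull_min.fit(scaled, floc=0)
            ks = stats.kstest(scaled, "weibull_min", args=(c, loc, scale))
            print(f"mean interval: {tau.mean():.1f} samples, Weibull shape: {c:.2f}, KS p-value: {ks.pvalue:.3f}")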

  3. Modeling green infrastructure land use changes on future air ...

    Science.gov (United States)

    Green infrastructure can be a cost-effective approach for reducing stormwater runoff and improving water quality, but it could also bring co-benefits for air quality: less impervious surface and more vegetation can decrease the urban heat island effect and also remove more air pollutants via dry deposition on the increased vegetative surfaces. Cooler surface temperatures can also decrease ozone formation through increased NOx titration; however, cooler surface temperatures also lower the height of the boundary layer, resulting in more concentrated pollutants within the same volume of air, especially for primary emitted pollutants (e.g., NOx, CO, primary particulate matter). To better understand how green infrastructure impacts air quality, the interactions between all of these processes must be considered collectively. In this study, we use a comprehensive coupled meteorology-air quality model (WRF-CMAQ) to simulate the influence of planned land use changes that include green infrastructure in Kansas City (KC) on regional meteorology and air quality. Current and future land use data were provided by the Mid-America Regional Council for 2012 and 2040 (projected land use due to population growth, city planning and green infrastructure implementation). These land use datasets were incorporated into the WRF-CMAQ modeling system, allowing it to propagate the changes in vegetation and impervious surface coverage into the simulated meteorology and air quality.

  4. Assessing Calorimeter Requirements for a 100 TeV Future Collider With Reference to New Physics Benchmarks

    CERN Document Server

    Dylewsky, Daniel

    2014-01-01

    Plans for a future 100 TeV circular collider require the design of detection equipment capable of measuring events at such high energy. This study examined the simulated decay of hypothetical 10 TeV excited quarks in 100 TeV pp collisions with regard to the possibility of calorimeter punch-through. Two methods of parameterizing the energy resolution in detector simulations were employed to model the effects of particles escaping the hadronic calorimeter. Varying the constant term of the energy resolution parameterization caused the dijet mass distribution to broaden by up to 58% with respect to the ATLAS default. Using the assumption that the jets' makeup could be approximated by 180 GeV pions, their expected signal degradation in calorimeters of varying depths was compared to the varied constant-term trials. It was found that the broadening associated with a calorimeter of thickness 7 lambda was consistent with that caused by an increase of 1% in the constant term (from the ATLAS default).
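
    The effect of the constant term can be illustrated with a toy smearing exercise. A common hadronic energy-resolution parameterization is sigma/E = a/sqrt(E) (+) c, where a is the stochastic term and c the constant term; at multi-TeV jet energies the constant term dominates. The values below (50% stochastic term, 3% vs. 4% constant term) and the approximation of the dijet mass by the sum of the two smeared jet energies are illustrative assumptions, not the ATLAS defaults or the study's full simulation.

        import numpy as np

        def smear_energy(e_true, stochastic=0.5, constant=0.03, rng=None):
            """Gaussian smearing with sigma/E = stochastic/sqrt(E) (+) constant (E in GeV)."""
            rng = rng or np.random.default_rng()
            rel_sigma = np.hypot(stochastic / np.sqrt(e_true), constant)
            return e_true * (1.0 + rel_sigma * rng.standard_normal(np.shape(e_true)))

        if __name__ == "__main__":
            rng = np.random.default_rng(42)
            e_jet = np.full(100_000, 5_000.0)  # two 5 TeV jets from a 10 TeV resonance (collinear toy picture)
            for c in (0.03, 0.04):             # nominal vs. +1% constant term (values are assumptions)
                m_jj = smear_energy(e_jet, constant=c, rng=rng) + smear_energy(e_jet, constant=c, rng=rng)
                width = np.std(m_jj) / np.mean(m_jj)
                print(f"constant term {c:.0%}: relative dijet-mass width ~ {width:.2%}")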

  5. Availability modeling approach for future circular colliders based on the LHC operation experience

    CERN Document Server

    Niemi, Arto; Gutleber, Johannes; Sollander, Peter; Penttinen, Jussi-Pekka; Virtanen, Seppo Johannes

    2016-01-01

    Reaching the challenging integrated luminosity production goals of a future circular hadron collider (FCC-hh) and high luminosity LHC (HL-LHC) requires a thorough understanding of today's most powerful high energy physics research infrastructure, the LHC accelerator complex at CERN. FCC-hh, a 4 times larger collider ring, aims at delivering 10–20 ab⁻¹ of integrated luminosity at 7 times higher collision energy. Since the identification of the key factors that impact availability and cost is far from obvious, a dedicated activity has been launched in the frame of the future circular collider study to develop models to study possible ways to optimize accelerator availability. This paper introduces the FCC reliability and availability study, which takes a fresh new look at assessing and modeling reliability and availability of particle accelerator infrastructures. The paper presents a probabilistic approach for Monte Carlo simulation of the machine operational cycle, schedule and availability for physics. The approach is based on best-practice, industrially applied reliability analysis methods.

  6. Availability modeling approach for future circular colliders based on the LHC operation experience

    CERN Document Server

    Niemi, Arto; Apollonio, Andrea; Gutleber, Johannes; Sollander, Peter; Penttinen, Jussi-Pekka; Virtanen, Seppo Johannes

    2016-01-01

    Reaching the challenging integrated luminosity production goals of a future circular hadron collider (FCC-hh) and high luminosity LHC (HL-LHC) requires a thorough understanding of today's most powerful high energy physics research infrastructure, the LHC accelerator complex at CERN. FCC-hh, a 4 times larger collider ring, aims at delivering 10–20 ab⁻¹ of integrated luminosity at 7 times higher collision energy. Since the identification of the key factors that impact availability and cost is far from obvious, a dedicated activity has been launched in the frame of the future circular collider study to develop models to study possible ways to optimize accelerator availability. This paper introduces the FCC reliability and availability study, which takes a fresh new look at assessing and modeling reliability and availability of particle accelerator infrastructures. The paper presents a probabilistic approach for Monte Carlo simulation of the machine operational cycle, schedule and availability for physics.

  7. Futures Business Models for an IoT Enabled Healthcare Sector: A Causal Layered Analysis Perspective

    Directory of Open Access Journals (Sweden)

    Julius Francis Gomes

    2016-12-01

    Full Text Available Purpose: To facilitate futures business research by proposing a novel way to combine business models as a conceptual tool with futures research techniques. Design: A futures perspective is adopted to foresight business models of the Internet of Things (IoT) enabled healthcare sector by using business models as a futures business research tool. In doing so, business models are coupled with one of the most prominent foresight methodologies, Causal Layered Analysis (CLA). Qualitative analysis provides deeper understanding of the phenomenon through the layers of CLA: litany, social causes, worldview and myth. Findings: It is difficult to predict the far future for a technology-oriented sector like healthcare. This paper presents three scenarios for the short-, medium- and long-term future. Based on these scenarios we also present a set of business model elements for different future time frames. This paper shows a way to combine business models with CLA, a foresight methodology, in order to apply business models in futures business research. Besides offering early results for futures business research, this study proposes a conceptual space to work with individual business models for managerial stakeholders. Originality/Value: Much research on business models has offered conceptualization of the phenomenon, innovation through business models and transformation of business models. However, the existing literature does not offer much on using the business model as a futures research tool. Enabled by futures thinking, we collected key business model elements and building blocks for the futures market and analyzed them through the CLA framework.

  8. Architecture modeling for interoperability analysis on the future internet

    NARCIS (Netherlands)

    Ullberg, Johan; Lagerström, Robert; Sinderen, van Marten; Johnson, Pontus; Zelm, Martin; Sanchis, Raquel; Poler, Raul; Doumeingts, Guy

    2012-01-01

    One of the key aspects of the Future Internet is the Internet of Services, where companies are envisioned to sell and purchase services online in a dynamic fashion. A typical future scenario would be that companies form so-called ad-hoc business networks on the Future Internet to be able to collaborate.

  9. Architecture modeling for interoperability analysis on the future internet

    NARCIS (Netherlands)

    Ullberg, Johan; Lagerström, Robert; van Sinderen, Marten J.; Johnson, Pontus; Zelm, Martin; Sanchis, Raquel; Poler, Raul; Doumeingts, Guy

    2012-01-01

    One of the key aspects of the Future Internet is the Internet of Services, where companies are envisioned to sell and purchase services online in a dynamic fashion. A typical future scenario would be that companies form so-called ad-hoc business networks on the Future Internet to be able to collaborate.

  10. Requirements for Logical Models for Value-Added Tax Legislation

    DEFF Research Database (Denmark)

    Nielsen, Morten Ib; Simonsen, Jakob Grue; Larsen, Ken Friis

    Currently, these difficulties are handled in most major ERP systems by customising and localising the native code of the ERP systems for each specific country and industry. We propose an alternative that uses logical modeling of VAT legislation. The potential benefit is to eventually transform such a model automatically into programs that essentially will replace customisation and localisation by configuration, by changing parameters in the model. In particular, we: (1) identify a number of requirements for such modeling, including requirements for the underlying logic; (2) model salient parts…

  11. Virtual Modeling for Cities of the Future. State-of-Art

    Science.gov (United States)

    Valencia, J.; Muñoz-Nieto, A.; Rodriguez-Gonzalvez, P.

    2015-02-01

    3D virtual modeling, visualization, dissemination and management of urban areas is one of the most exciting challenges that geomatics must face in the coming years. This paper aims to review, compare and analyze the new technologies, policies and software tools that are in progress for managing urban 3D information. It is assumed that the third dimension increases the quality of the model provided, allowing new approaches to urban planning, conservation and management of architectural and archaeological areas. Although displaying 3D urban environments is an issue that is largely solved today, there are still challenges for geomatics in the coming future. Displaying georeferenced linked information can be considered the first challenge. Another challenge is to improve the technical requirements where this georeferenced information must be shown in real time. Are there software tools ready for this challenge? Are they able to provide the services required in smart cities? Throughout this paper, many practical examples that require 3D georeferenced information and linked data are shown. Computer advances related to 3D spatial databases, and software being developed to turn rendered virtual environments into environments enriched with linked information, are also analyzed. Finally, the different standards that the Open Geospatial Consortium has adopted and developed for three-dimensional geographic information are reviewed, with particular emphasis on KML, LandXML, CityGML and the new IndoorGML.

  12. Digital Avionics Information System (DAIS): Training Requirements Analysis Model (TRAMOD).

    Science.gov (United States)

    Czuchry, Andrew J.; And Others

    The training requirements analysis model (TRAMOD) described in this report represents an important portion of the larger effort called the Digital Avionics Information System (DAIS) Life Cycle Cost (LCC) Study. TRAMOD is the second of three models that comprise an LCC impact modeling system for use in the early stages of system development. As…

  13. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Even with simple unified modelling language (UML) requirements models, it is not easy for the development team to get confidence on the stakeholders' requirements validation. This paper describes an approach, based on the construction of executable interactive prototypes, to support the validation of workflow requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach.

  14. QPS/LHC Activities requiring important Tunnel Work During a future long Shutdown

    CERN Document Server

    Dahlerup-Petersen, K

    2011-01-01

    The MPE/circuit protection section is presently establishing a road map for its future LHC activities. The tasks comprise essential consolidation work, compulsory upgrades and extensions of existing machine facilities. The results of a first round of engineering exertion were presented and evaluated at a MPE activity review in December 2010. The technical and financial aspects of this program will be detailed in the ‘QPS Medium and Long-Term Improvement Plan’, to be published shortly. The QPS activities in the LHC tunnel during a future, long shutdown are closely related to this improvement chart. A project-package based program for the interventions has been established and will be presented in this report, together with estimates for the associated human and financial resources necessary for its implementation.

  15. What are the essential cognitive requirements for prospection (thinking about the future)?

    Directory of Open Access Journals (Sweden)

    Magda eOsman

    2014-06-01

    Full Text Available Placing the future centre stage as a way of understanding cognition is gaining attention in psychology. The general modern label for this is prospection, which refers to the process of representing and thinking about possible future states of the world. Several theorists have claimed that episodic and prospective memory, as well as hypothetical thinking (mental simulation and conditional reasoning), are necessary cognitive faculties that enable prospection. Given the limitations in current empirical efforts connecting these faculties to prospection, the aim of this mini review is to argue that the findings show that they are sufficient, but not necessary, for prospection. As a result, the short concluding section gives an outline of an alternative conceptualisation of prospection. The proposal is that the critical characteristics of prospection are the discovery of, and maintenance of, goals via causal learning.

  16. Straw bale houses in a moderate climate: adaptable to meet future energy performance requirements?

    OpenAIRE

    Verbeeck, Griet; Ponet, Jolien

    2012-01-01

    The energy performance regulations for buildings, introduced in many countries during the last decade, will be tightened in the future, even up to the zero-energy level. Apart from that, ancient building techniques that use renewable materials, such as straw bales, are having a revival, inspired by concerns about the environmental impact of building materials. However, organisations involved in straw bale construction are concerned whether this building technique will survive the upcoming, more severe energy performance requirements.

  17. Views on the future of thermal hydraulic modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ishii, M. [Purdue Univ., West Lafayette, IN (United States)

    1997-07-01

    It is essential for the U.S. NRC to sustain the highest level of thermal-hydraulics and reactor safety research expertise and continuously improve its accident analysis capability. Such expertise should span four different areas which are strongly related to each other. These are: (1) Reactor Safety Code Development, (2) Two-phase Flow Modeling, (3) Instrumentation and Fundamental Experimental Research, and (4) Separate Effect and Integral Tests. The NRC is already considering a new effort in the area of advanced thermal-hydraulics. Its success largely depends on the availability of a significantly improved two-phase flow formulation and constitutive relations supported by detailed experimental data. Therefore, it is recommended that the NRC start significant research efforts in the areas of two-phase flow modeling, instrumentation, and basic and separate effect experiments, which should be pursued systematically and with clearly defined objectives. It is desirable that some international program be developed in this area. This paper concentrates on those items in the thermal-hydraulic area which eventually determine the quality of future accident analysis codes.

  18. Regional Dynamic Simulation Modeling and Analysis of Integrated Energy Futures

    Energy Technology Data Exchange (ETDEWEB)

    MALCZYNSKI, LEONARD A.; BEYELER, WALTER E.; CONRAD, STEPHEN H.; HARRIS, DAVID B; REXROTH, PAUL E.; BAKER, ARNOLD B.

    2002-11-01

    The Global Energy Futures Model (GEFM) is a demand-based, gross domestic product (GDP)-driven, dynamic simulation tool that provides an integrated framework to model key aspects of energy, nuclear-materials storage and disposition, environmental effluents from fossil and non-fossil energy, and global nuclear-materials management. Based entirely on public source data, it links oil, natural gas, coal, nuclear and renewable energy dynamically to greenhouse-gas emissions and 12 other measures of environmental impact. It includes historical data from 1990 to 2000, is benchmarked to the DOE/EIA/IEO 2001 [5] Reference Case for 2000 to 2020, and extrapolates energy demand through the year 2050. The GEFM is globally integrated and breaks out five regions of the world: the United States of America (USA), the People's Republic of China (China), the former Soviet Union (FSU), the Organization for Economic Cooperation and Development (OECD) nations excluding the USA (other industrialized countries), and the rest of the world (ROW) (essentially the developing world). The GEFM allows the user to examine a very wide range of "what if" scenarios through 2050 and to view the potential effects across widely dispersed but interrelated areas. The authors believe that this high-level learning tool will help to stimulate public policy debate on energy, environmental, economic and national security issues.

  19. Health care administration in the year 2000: practitioners' views of future issues and job requirements.

    Science.gov (United States)

    Hudak, R P; Brooke, P P; Finstuen, K; Riley, P

    1993-01-01

    This research identifies the most important domains in health care administration (HCA) from now to the year 2000 and differentiates job skill, knowledge, and ability requirements necessary for successful management. Fellows of the American College of Healthcare Executives from about half of the United States responded to two iterations of a Delphi mail inquiry. Fellows identified 102 issues that were content-analyzed into nine domains by an HCA expert panel. Domains, in order of ranked importance, were cost/finance, leadership, professional staff interactions, health care delivery concepts, accessibility, ethics, quality/risk management, technology, and marketing. In the second Delphi iteration, Fellows reviewed domain results and rated job requirements on required job importance. Results indicated that while a business orientation is needed for organizational survival, an equal emphasis on person-oriented skills, knowledge, and abilities is required.

  20. GENERAL REQUIREMENTS FOR SIMULATION MODELS IN WASTE MANAGEMENT

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Ian; Kossik, Rick; Voss, Charlie

    2003-02-27

    Most waste management activities are decided upon and carried out in a public or semi-public arena, typically involving the waste management organization, one or more regulators, and often other stakeholders and members of the public. In these environments, simulation modeling can be a powerful tool in reaching a consensus on the best path forward, but only if the models that are developed are understood and accepted by all of the parties involved. These requirements for understanding and acceptance of the models constrain the appropriate software and model development procedures that are employed. This paper discusses requirements for both simulation software and for the models that are developed using the software. Requirements for the software include transparency, accessibility, flexibility, extensibility, quality assurance, ability to do discrete and/or continuous simulation, and efficiency. Requirements for the models that are developed include traceability, transparency, credibility/validity, and quality control. The paper discusses these requirements with specific reference to the requirements for performance assessment models that are used for predicting the long-term safety of waste disposal facilities, such as the proposed Yucca Mountain repository.

  1. Requirements engineering for cross-sectional information chain models.

    Science.gov (United States)

    Hübner, U; Cruel, E; Gök, M; Garthaus, M; Zimansky, M; Remmers, H; Rienhoff, O

    2012-01-01

    Despite the wealth of literature on requirements engineering, little is known about engineering very generic, innovative and emerging requirements, such as those for cross-sectional information chains. The IKM health project aims at building information chain reference models for the care of patients with chronic wounds, cancer-related pain and back pain. Our question therefore was how to appropriately capture information and process requirements that are both generally applicable and practically useful. To this end, we started with recommendations from clinical guidelines and put them up for discussion in Delphi surveys and expert interviews. Despite the heterogeneity we encountered in all three methods, it was possible to obtain requirements suitable for building reference models. We evaluated three modelling languages and then chose to write the models in UML (class and activity diagrams). On the basis of the current project results, the pros and cons of our approach are discussed.

  2. Inferring Requirement Goals from Model Implementing in UML

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    UML is used widely in many software development processes. However, it does not make requirement goals explicit. This paper presents a method that aims to establish the semantic relationship between requirement goals and UML models. Before the method is introduced, some relevant concepts are described.

  3. Characterization of Metering, Merging and Spacing Requirements for Future Trajectory-Based Operations

    Science.gov (United States)

    Johnson, Sally

    2017-01-01

    Trajectory-Based Operations (TBO) is one of the essential paradigm shifts in the NextGen transformation of the National Airspace System. Under TBO, aircraft are managed by 4-dimensional trajectories, and airborne and ground-based metering, merging, and spacing operations are key to managing those trajectories. This paper presents the results of a study of potential metering, merging, and spacing operations within a future TBO environment. A number of operational scenarios for tactical and strategic uses of metering, merging, and spacing are described, and interdependencies between concurrent tactical and strategic operations are identified.

  4. AMTD: Update of Engineering Specifications Derived from Science Requirements for Future UVOIR Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip; Postman, Marc; Mosier, Gary; Smith, W. Scott; Blaurock, Carl; Ha, Kong; Stark, Christopher C.

    2014-01-01

    The Advanced Mirror Technology Development (AMTD) project is in Phase 2 of a multiyear effort, initiated in FY12, to mature by at least a half TRL step six critical technologies required to enable 4-meter or larger UVOIR space telescope primary mirror assemblies for both general astrophysics and ultra-high contrast observations of exoplanets. AMTD uses a science-driven systems engineering approach: we mature technologies required to enable the highest priority science AND provide a high-performance, low-cost, low-risk system. To give the science community options, we are pursuing multiple technology paths. A key task is deriving engineering specifications for advanced normal-incidence monolithic and segmented mirror systems needed to enable both general astrophysics and ultra-high contrast exoplanet observation missions, as a function of potential launch vehicles and their mass and volume constraints. A key finding of this effort is that the science requires an 8-meter or larger aperture telescope.

  5. Modeled future peak streamflows in four coastal Maine rivers

    Science.gov (United States)

    Hodgkins, Glenn A.; Dudley, Robert W.

    2013-01-01

    To safely and economically design bridges and culverts, it is necessary to compute the magnitude of peak streamflows that have specified annual exceedance probabilities (AEPs). Annual precipitation and air temperature in the northeastern United States are, in general, projected to increase during the 21st century. It is therefore important for engineers and resource managers to understand how peak flows may change in the future. This report, prepared in cooperation with the Maine Department of Transportation (MaineDOT), presents modeled changes in peak flows at four basins in coastal Maine on the basis of projected changes in air temperature and precipitation. To estimate future peak streamflows at the four basins in this study, historical values for climate (temperature and precipitation) in the basins were adjusted by different amounts and input to a hydrologic model of each study basin. To encompass the projected changes in climate in coastal Maine by the end of the 21st century, air temperatures were adjusted by four different amounts, from -3.6 degrees Fahrenheit (°F) (-2 degrees Celsius (°C)) to +10.8 °F (+6 °C) relative to observed temperatures. Precipitation was adjusted by three different percentage values, from -15 percent to +30 percent of observed precipitation. The resulting 20 combinations of temperature and precipitation changes (including the no-change scenarios) were input to Precipitation-Runoff Modeling System (PRMS) watershed models, and annual daily maximum peak flows were calculated for each combination. Modeled peak flows from the adjusted changes in temperature and precipitation were compared to unadjusted (historical) modeled peak flows. Annual daily maximum peak flows increase or decrease depending on whether temperature or precipitation is adjusted; increases in air temperature (with no change in precipitation) lead to decreases in peak flows, whereas increases in precipitation (with no change in temperature) lead to increases in peak flows.
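
    The delta-change procedure described above can be illustrated with a small scenario grid: each combination of a temperature offset and a precipitation scaling factor is applied to the historical forcing before the watershed model is run. In the sketch below, run_prms is a hypothetical placeholder for the actual PRMS model run, and the intermediate grid values between the quoted endpoints are assumptions for illustration.

        from itertools import product

        # Temperature deltas in deg C and precipitation multipliers (both including
        # the no-change case) -> 5 x 4 = 20 scenarios, as in the study.
        TEMP_DELTAS_C = [0.0, -2.0, 2.0, 4.0, 6.0]      # endpoints from the abstract; middle values assumed
        PRECIP_FACTORS = [1.00, 0.85, 1.15, 1.30]       # endpoints from the abstract; middle value assumed

        def adjust_forcing(temps_c, precips_mm, dt, pf):
            """Apply a uniform temperature offset and precipitation scaling (delta-change method)."""
            return [t + dt for t in temps_c], [p * pf for p in precips_mm]

        def run_prms(temps_c, precips_mm):
            """Hypothetical stand-in for a PRMS watershed-model run returning an annual peak flow."""
            # A real run would drive the calibrated PRMS model; here we return a toy statistic.
            return max(p * max(0.1, 1.0 - 0.02 * t) for t, p in zip(temps_c, precips_mm))

        if __name__ == "__main__":
            hist_t = [5.0, 10.0, 15.0, 8.0]             # toy historical forcing series
            hist_p = [30.0, 45.0, 20.0, 60.0]
            for dt, pf in product(TEMP_DELTAS_C, PRECIP_FACTORS):
                t_adj, p_adj = adjust_forcing(hist_t, hist_p, dt, pf)
                print(f"dT={dt:+.1f}C, P x{pf:.2f} -> modeled peak {run_prms(t_adj, p_adj):.1f}")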

  6. Workshop on Functional Requirements for the Modeling of Fate and Transport of Waterborne CBRN Materials

    Energy Technology Data Exchange (ETDEWEB)

    Giles, GE

    2005-02-03

    The purpose of this Workshop on "Functional Requirements for the Modeling of Fate and Transport of Waterborne CBRN Materials" was to solicit functional requirements for tools that help Incident Managers plan for and deal with the consequences of industrial or terrorist releases of materials into the nation's waterways and public water utilities. Twenty representatives attended and several made presentations. Several hours of discussions elicited a set of requirements. These requirements were summarized in a form for the attendees to vote on their highest priority requirements. These votes were used to determine the prioritized requirements that are reported in this paper and can be used to direct future developments.

  7. Validation of Power Requirement Model for Active Loudspeakers

    DEFF Research Database (Denmark)

    Schneider, Henrik; Madsen, Anders Normann; Bjerregaard, Ruben

    2015-01-01

    The actual power requirement of an active loudspeaker during playback of music has not received much attention in the literature. This is probably because no single and simple solution exists and because complete system knowledge from input voltage to output sound pressure level is required. There are, however, many advantages that could be harvested from such knowledge, like size, cost and efficiency improvements. In this paper a recently proposed power requirement model for active loudspeakers is experimentally validated and the model is expanded to include the closed and vented type enclosures.

  8. Testing evolutionary models of senescence: traditional approaches and future directions.

    Science.gov (United States)

    Robins, Chloe; Conneely, Karen N

    2014-12-01

    From an evolutionary perspective, the existence of senescence is a paradox. Why has senescence not been more effectively selected against given its associated decreases in Darwinian fitness? Why does senescence exist and how has it evolved? Three major theories offer explanations: (1) the theory of mutation accumulation suggested by PB Medawar; (2) the theory of antagonistic pleiotropy suggested by GC Williams; and (3) the disposable soma theory suggested by TBL Kirkwood. These three theories differ in the underlying causes of aging that they propose but are not mutually exclusive. This paper compares the specific biological predictions of each theory and discusses the methods and results of previous empirical tests. Lifespan is found to be the most frequently used estimate of senescence in evolutionary investigations. This measurement acts as a proxy for an individual's rate of senescence, but provides no information on an individual's senescent state or "biological age" throughout life. In the future, use of alternative longitudinal measures of senescence may facilitate investigation of previously neglected aspects of evolutionary models, such as intra- and inter-individual heterogeneity in the process of aging. DNA methylation data are newly proposed to measure biological aging and are suggested to be particularly useful for such investigations.

  9. A METHODOLOGY FOR DETERMINING FUTURE PHYSICAL FACILITIES REQUIREMENTS FOR INSTITUTIONS OF HIGHER EDUCATION.

    Science.gov (United States)

    YURKOVICH, JOHN V.

    A computerized methodology for determining the physical facilities requirements of a large university was developed. The research included the development, implementation, and testing of systems for (1) classifying space, (2) maintaining a perpetual space inventory, (3) conducting room utilization studies, (4) projecting students by a set of…

  10. Towards a Formalized Ontology-Based Requirements Model

    Institute of Scientific and Technical Information of China (English)

    JIANG Dan-dong; ZHANG Shen-sheng; WANG Ying-lin

    2005-01-01

    The goal of this paper is to take a further step towards an ontological approach for representing requirements information. The motivation for ontologies is discussed, and definitions of ontology and requirements ontology are given. The paper then presents a collection of informal terms covering four subject areas and discusses the formalization process of the ontology. The underlying meta-ontology is determined, and the formalized requirements ontology is analyzed. This formal ontology is built to serve as a basis for a requirements model. Finally, the implementation of the software system is described.

  11. A transformation approach for collaboration based requirement models

    CERN Document Server

    Harbouche, Ahmed; Mokhtari, Aicha

    2012-01-01

    Distributed software engineering is widely recognized as a complex task. Among the inherent complexities is the process of obtaining a system design from its global requirement specification. This paper deals with such a transformation process and suggests an approach to derive the behavior of a given system's components, in the form of distributed Finite State Machines, from the global system requirements, in the form of an augmented UML Activity Diagrams notation. The process of the suggested approach is summarized in three steps: the definition of the appropriate source Meta-Model (requirements Meta-Model), the definition of the target Design Meta-Model and the definition of the rules to govern the transformation during the derivation process. The derivation process transforms the global system requirements described as UML activity diagrams (extended with collaborations) to system role behaviors represented as UML finite state machines. The approach is implemented using the Atlas Transformation Language (ATL).

  12. Irrigation Requirement Estimation Using Vegetation Indices and Inverse Biophysical Modeling

    Science.gov (United States)

    Bounoua, Lahouari; Imhoff, Marc L.; Franks, Shannon

    2010-01-01

    We explore an inverse biophysical modeling process forced by satellite and climatological data to quantify irrigation requirements in semi-arid agricultural areas. We constrain the carbon and water cycles modeled under both equilibrium (balance between vegetation and climate) and non-equilibrium (water added through irrigation) conditions. We postulate that the degree to which irrigated dry lands vary from equilibrium climate conditions is related to the amount of irrigation. The amount of water required over and above precipitation is considered as the irrigation requirement. For July, results show that spray irrigation resulted in an additional amount of water of 1.3 mm per occurrence with a frequency of 24.6 hours. In contrast, drip irrigation required only 0.6 mm every 45.6 hours, or 46% of that simulated by the spray irrigation. The modeled estimates account for 87% of the total reported irrigation water use when soil salinity is not important, and 66% in saline lands.
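
    The per-occurrence figures above can be turned into comparable application rates with simple arithmetic. The snippet below reproduces the quoted 46% per-event ratio and also expresses both regimes as an average daily depth; the daily figures are a derived, illustrative comparison, not numbers reported in the study.

        # Per-event depth (mm) and event frequency (hours) reported for July.
        spray_mm, spray_hours = 1.3, 24.6
        drip_mm, drip_hours = 0.6, 45.6

        per_event_ratio = drip_mm / spray_mm           # ~0.46, the 46% quoted above
        spray_daily = spray_mm * 24.0 / spray_hours    # average mm/day, derived for illustration
        drip_daily = drip_mm * 24.0 / drip_hours

        print(f"drip/spray per-event ratio: {per_event_ratio:.0%}")
        print(f"spray ~ {spray_daily:.2f} mm/day, drip ~ {drip_daily:.2f} mm/day")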

  13. Innovative Product Design Based on Customer Requirement Weight Calculation Model

    Institute of Scientific and Technical Information of China (English)

    Chen-Guang Guo; Yong-Xian Liu; Shou-Ming Hou; Wei Wang

    2010-01-01

    In the processes of product innovation and design, it is important for designers to find and capture the customer's focus through customer requirement weight calculation and ranking. Based on fuzzy set theory and Euclidean space distance, this paper puts forward a method for customer requirement weight calculation called the Euclidean space distances weighting ranking method. This method is used in the fuzzy analytic hierarchy process that satisfies the additive consistent fuzzy matrix. A model for the weight calculation steps is constructed; meanwhile, a product innovation design module based on the customer requirement weight calculation model is developed. Finally, combined with the example of titanium sponge production, the customer requirement weight calculation model is validated. Using the innovation design module, the structure of the titanium sponge reactor has been improved and made innovative.
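
    The abstract does not spell out the algorithm, but the general idea of ranking requirement weights by Euclidean distance can be illustrated as follows: score each customer requirement against an ideal point, convert distances to closeness values, and normalize them into weights. This is an illustrative sketch under those assumptions, not the paper's Euclidean space distances weighting ranking method; the requirement names and ratings are hypothetical.

        import numpy as np

        def distance_based_weights(scores, ideal=None):
            """Illustrative weights: requirements closer to the ideal rating vector get larger weights."""
            scores = np.asarray(scores, dtype=float)         # rows: requirements, cols: evaluators
            ideal = np.full(scores.shape[1], scores.max()) if ideal is None else np.asarray(ideal)
            dist = np.linalg.norm(scores - ideal, axis=1)    # Euclidean distance to the ideal point
            closeness = 1.0 / (1.0 + dist)                   # smaller distance -> higher closeness
            return closeness / closeness.sum()               # normalize so the weights sum to 1

        if __name__ == "__main__":
            # Hypothetical 1-9 ratings of four customer requirements by three evaluators.
            ratings = [[9, 8, 9],   # reaction safety
                       [7, 7, 6],   # throughput
                       [5, 4, 6],   # maintenance cost
                       [3, 2, 4]]   # appearance
            for name, w in zip(["safety", "throughput", "maintenance", "appearance"],
                               distance_based_weights(ratings)):
                print(f"{name:<12} weight = {w:.3f}")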

  14. Tissue Engineering of Blood Vessels: Functional Requirements, Progress, and Future Challenges.

    Science.gov (United States)

    Kumar, Vivek A; Brewster, Luke P; Caves, Jeffrey M; Chaikof, Elliot L

    2011-09-01

    Vascular disease results in the decreased utility and decreased availability of autologous vascular tissue for small-diameter vessel replacements. Tissue-engineered replacement vessels represent an ideal solution to this clinical problem. Ongoing progress requires combined approaches from biomaterials science, cell biology, and translational medicine to develop feasible solutions with the requisite mechanical support, a non-fouling surface for blood flow, and tissue regeneration. Over the past two decades interest in blood vessel tissue engineering has soared on a global scale, resulting in the first clinical implants of multiple technologies, steady progress with several other systems, and critical lessons learned. This review will highlight the current inadequacies of autologous and synthetic grafts, the engineering requirements for implantation of tissue-engineered grafts, and the current status of tissue-engineered blood vessel research.

  15. CHOReOS perspective on the Future Internet and initial conceptual model (D1.2)

    OpenAIRE

    Autili, Marco; Di Ruscio, Davide; Salle, Amleto Di; Georgantas, Nikolaos; Hachem, Sara; Issamy, Valérie; Parathyras, Athanasios; Trimintzios, Lefteris; Silingas, Darius; Lockerbie, James; Maiden, Neil; Ben Hamida, Amira; Bertolino, Antonia; De Angelis, Guglielmo; Polini, Andrea

    2011-01-01

    The D1.2 deliverable outlines the CHOReOS perspective on the Future Internet and its conceptualization. In particular, the deliverable focuses on: - Definition of the Future Internet and related Future Internet of Services and (Smart) Things, as considered within CHOReOS, further stressing the many dimensions underpinning the Ultra-Large Scale of the Future Internet; - Definition of the initial conceptual model of the CHOReOS Service-Oriented Architecture (SOA) for the Future Internet, identi...

  16. Evaluation of Foreign Exchange Risk Capital Requirement Models

    Directory of Open Access Journals (Sweden)

    Ricardo S. Maia Clemente

    2005-12-01

    Full Text Available This paper examines capital requirements for financial institutions in order to cover market risk stemming from exposure to foreign currencies. The models examined belong to two groups according to the approach involved: standardized and internal models. In the first group, we study the Basel model and the model adopted by Brazilian legislation. In the second group, we consider models based on the concept of value at risk (VaR): the single- and double-window historical models, the exponential smoothing model (EWMA) and a hybrid approach that combines features of both. The results suggest that the Basel model is inadequate for the Brazilian market, exhibiting a large number of exceptions. The model of the Brazilian legislation has no exceptions, though it generates higher capital requirements than the internal models based on VaR. In general, the VaR-based models perform better and result in less capital allocation than the standardized approach applied in Brazil.
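
    One of the internal models compared above, exponential smoothing (EWMA), can be sketched in a few lines: the variance forecast is an exponentially weighted average of past squared returns, and the VaR is the corresponding quantile of an assumed normal distribution. The parameters below (lambda = 0.94, 99% confidence, a 250-day window) are common RiskMetrics-style defaults assumed for illustration, not the values used in the paper.

        import numpy as np
        from scipy.stats import norm

        def ewma_var(returns, lam=0.94, confidence=0.99):
            """One-day Value at Risk from an EWMA variance forecast (returned as a positive loss)."""
            returns = np.asarray(returns, dtype=float)
            variance = returns[0] ** 2
            for r in returns[1:]:
                variance = lam * variance + (1.0 - lam) * r ** 2   # exponential smoothing of squared returns
            return -norm.ppf(1.0 - confidence) * np.sqrt(variance)

        def count_exceptions(returns, window=250, **kwargs):
            """Backtest: count how often the realized loss exceeds the previous day's VaR."""
            hits = 0
            for t in range(window, len(returns)):
                if -returns[t] > ewma_var(returns[t - window:t], **kwargs):
                    hits += 1
            return hits

        if __name__ == "__main__":
            rng = np.random.default_rng(7)
            fx_returns = 0.01 * rng.standard_normal(1_000)   # toy daily FX return series
            print(f"1-day 99% VaR: {ewma_var(fx_returns):.4f}")
            print(f"exceptions in backtest: {count_exceptions(fx_returns)}")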

  17. The Benefit of Ambiguity in Understanding Goals in Requirements Modelling

    DEFF Research Database (Denmark)

    Paay, Jeni; Pedell, Sonja; Sterling, Leon

    2011-01-01

    This paper examines the benefit of ambiguity in describing goals in requirements modelling for the design of socio-technical systems, using concepts from Agent-Oriented Software Engineering (AOSE) and ethnographic and cultural probe methods from Human Computer Interaction (HCI). The authors' aim is to create technologies that support more flexible and meaningful social interactions, by combining best practice in software engineering with ethnographic techniques to model complex social interactions from socially oriented life for the purposes of building rich socio-technical systems. They take a holistic approach to eliciting, analyzing, and modelling socially-oriented requirements by combining a particular form of ethnographic technique, cultural probes, with Agent-Oriented Software Engineering notations to model these requirements. This paper focuses on examining the value of maintaining ambiguity in such goal models.

  18. Business Process Simulation: Requirements for Business and Resource Models

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2015-07-01

    Full Text Available The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. BPMN is therefore widely used and applied in various areas, one of them being business process simulation. This paper addresses some problems of BPMN-model-based business process simulation and formulates requirements for business process and resource models that enable their use for business process simulation.

  19. Modelling climate change in a Dutch polder system using the FutureViewR modelling suite

    NARCIS (Netherlands)

    Immerzeel, W.W.; Heerwaarden, van C.C.; Droogers, P.

    2009-01-01

    This paper describes the development of a hydrological modelling suite, FutureViewR, which enables spatial quantification of the complex interaction between climate change, land use and soil in the Quarles van Ufford (QvU) polder, situated in and under the influence of the Dutch river delta.

  20. Advancing vector biology research: a community survey for future directions, research applications and infrastructure requirements.

    Science.gov (United States)

    Kohl, Alain; Pondeville, Emilie; Schnettler, Esther; Crisanti, Andrea; Supparo, Clelia; Christophides, George K; Kersey, Paul J; Maslen, Gareth L; Takken, Willem; Koenraadt, Constantianus J M; Oliva, Clelia F; Busquets, Núria; Abad, F Xavier; Failloux, Anna-Bella; Levashina, Elena A; Wilson, Anthony J; Veronesi, Eva; Pichard, Maëlle; Arnaud Marsh, Sarah; Simard, Frédéric; Vernick, Kenneth D

    2016-01-01

    Vector-borne pathogens impact public health, animal production, and animal welfare. Research on arthropod vectors such as mosquitoes, ticks, sandflies, and midges which transmit pathogens to humans and economically important animals is crucial for development of new control measures that target transmission by the vector. While insecticides are an important part of this arsenal, appearance of resistance mechanisms is increasingly common. Novel tools for genetic manipulation of vectors, use of Wolbachia endosymbiotic bacteria, and other biological control mechanisms to prevent pathogen transmission have led to promising new intervention strategies, adding to strong interest in vector biology and genetics as well as vector-pathogen interactions. Vector research is therefore at a crucial juncture, and strategic decisions on future research directions and research infrastructure investment should be informed by the research community. A survey initiated by the European Horizon 2020 INFRAVEC-2 consortium set out to canvass priorities in the vector biology research community and to determine key activities that are needed for researchers to efficiently study vectors, vector-pathogen interactions, as well as access the structures and services that allow such activities to be carried out. We summarize the most important findings of the survey which in particular reflect the priorities of researchers in European countries, and which will be of use to stakeholders that include researchers, government, and research organizations.

  1. Confronting dark energy models mimicking ΛCDM epoch with observational constraints: Future cosmological perturbations decay or future Rip?

    Energy Technology Data Exchange (ETDEWEB)

    Astashenok, Artyom V., E-mail: artyom.art@gmail.com [Baltic Federal University of I. Kant, Department of Theoretical Physics, 236041, 14, Nevsky st., Kaliningrad (Russian Federation); Odintsov, Sergei D. [Institucio Catalana de Recerca i Estudis Avancats (ICREA), Barcelona (Spain); Institut de Ciencies de l' Espai (CSIC-IEEC), Campus UAB, Torre C5-Par-2a pl, E-08193 Bellaterra (Barcelona) (Spain); Eurasian International Center for Theor. Physics, Eurasian National University, Astana 010008 (Kazakhstan); Tomsk State Pedagogical University, Tomsk (Russian Federation)

    2013-01-29

    We confront dark energy models which are currently similar to the ΛCDM theory with observational data which include the SNe data, matter density perturbations and baryon acoustic oscillations data. The DE cosmology under consideration may evolve to a Big Rip, type II or type III future singularity, or to a Little Rip or Pseudo-Rip universe. It is shown that matter perturbations data define more precisely the possible deviation from the ΛCDM model than consideration of SNe data only. The combined data analysis proves that the DE models under consideration are as consistent as the ΛCDM model. We demonstrate that growth of matter density perturbations may occur at sufficiently small background density but still before the possible disintegration of bound objects (like clusters of galaxies, galaxies, etc.) in a Big Rip, type III singularity, Little Rip or Pseudo-Rip universe. This new effect may bring the future universe to a chaotic state well before disintegration or Rip.

  2. The Hydrogen Futures Simulation Model (H2Sim) technical description.

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Scott A.; Kamery, William; Baker, Arnold Barry; Drennen, Thomas E.; Lutz, Andrew E.; Rosthal, Jennifer Elizabeth

    2004-10-01

    Hydrogen has the potential to become an integral part of our energy transportation and heat and power sectors in the coming decades and offers a possible solution to many of the problems associated with a heavy reliance on oil and other fossil fuels. The Hydrogen Futures Simulation Model (H2Sim) was developed to provide a high-level, internally consistent, strategic tool for evaluating the economic and environmental trade-offs of alternative hydrogen production, storage, transport and end use options in the year 2020. Based on the model's default assumptions, estimated hydrogen production costs range from 0.68 $/kg for coal gasification to as high as 5.64 $/kg for centralized electrolysis using solar PV. Coal gasification remains the least-cost option if carbon capture and sequestration costs ($0.16/kg) are added. This result is fairly robust; for example, assumed coal prices would have to more than triple or the assumed capital cost would have to increase by more than 2.5 times for natural gas reformation to become the cheaper option. Alternatively, assumed natural gas prices would have to fall below $2/MBtu to compete with coal gasification. The electrolysis results are highly sensitive to electricity costs, but electrolysis only becomes cost competitive with other options when electricity drops below 1 cent/kWh. Delivered 2020 hydrogen costs are likely to be double the estimated production costs because of the inherent difficulties associated with storing, transporting, and dispensing hydrogen, given its low volumetric density. H2Sim estimates distribution costs ranging from 1.37 $/kg (low distance, low production) to 3.23 $/kg (long distance, high production volumes, carbon sequestration). Distributed hydrogen production options, such as on-site natural gas reformation, would avoid some of these costs. H2Sim compares the expected 2020 per-mile driving costs (fuel, capital, maintenance, license, and registration) of current technology internal combustion engine (ICE) vehicles…
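
    The cost figures quoted above combine with simple arithmetic into rough delivered-cost estimates. The sketch below does this for the coal-gasification pathway; the pairing of production and distribution figures is illustrative (the high-end distribution figure already reflects a sequestration scenario), and all numbers are taken directly from the abstract.

        # $/kg figures quoted above (H2Sim 2020 default estimates).
        production = {"coal gasification": 0.68, "solar PV electrolysis": 5.64}
        ccs_adder = 0.16                      # carbon capture and sequestration, coal pathway
        distribution = (1.37, 3.23)           # low-distance/low-volume to long-distance/high-volume

        plant_gate = production["coal gasification"] + ccs_adder
        print(f"coal gasification + CCS at the plant gate: ${plant_gate:.2f}/kg")
        print(f"illustrative delivered range: ${plant_gate + distribution[0]:.2f}"
              f" to ${plant_gate + distribution[1]:.2f}/kg")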

  3. Beyond scenario planning: projecting the future using models at Wind Cave National Park (USA)

    Science.gov (United States)

    King, D. A.; Bachelet, D. M.; Symstad, A. J.

    2011-12-01

    Scenario planning has been used by the National Park Service as a tool for natural resource management planning in the face of climate change. Sets of plausible but divergent future scenarios are constructed from available information and expert opinion and serve as a starting point for deriving climate-smart management strategies. However, qualitative hypotheses about how systems would react to a particular set of conditions assumed from coarse-scale climate projections may lack the scientific rigor expected from a federal agency. In an effort to better assess the range of likely futures at Wind Cave National Park, a project was conceived to 1) generate high-resolution historic and future climate time series to identify local weather patterns that may or may not persist, 2) simulate the hydrological cycle in this geologically varied landscape and its response to future climate, 3) project vegetation dynamics and ensuing changes in the biogeochemical cycles given grazing and fire disturbances under new climate conditions, and 4) synthesize and compare results with those from the scenario planning exercise. In this framework, we tested a dynamic global vegetation model against local information on vegetation cover, disturbance history and stream flow to better understand the potential resilience of these ecosystems to climate change. We discuss the tradeoffs between a coarse-scale application of the model, showing regional trends with limited ability to project the fine-scale mosaic of vegetation at Wind Cave, and a finer-scale approach that can account for local slope effects on water balance and better assess the vulnerability of landscape facets, but requires more intensive data acquisition. We elaborate on the potential for sharing information between models to mitigate the often-limited treatment of biological feedbacks in the physical representations of soil and atmospheric processes.

  4. Firefighter safety for PV systems: Overview of future requirements and protection systems

    DEFF Research Database (Denmark)

    Spataru, Sergiu; Sera, Dezso; Blaabjerg, Frede;

    2013-01-01

    An important and highly discussed safety issue for photovoltaic systems is that, as long as they are illuminated, a high voltage is present at the PV string terminals and cables between the string and inverters, independent of the state of the inverter's dc disconnection switch, which poses a risk...... shutdown procedures. This paper gives an overview on the most recent fire - and firefighter safety requirements for PV systems, with focus on system and module shutdown systems. Several solutions are presented, analyzed and compared by considering a number of essential characteristics, including...... for operators during maintenance or fire-fighting. One of the solutions is individual module shutdown by short-circuiting or disconnecting each PV module from the PV string. However, currently no standards have been adopted either for implementing or testing these methods, or doing an evaluation of the module...

  5. A TRANSFORMATION APPROACH FOR COLLABORATION BASED REQUIREMENT MODELS

    Directory of Open Access Journals (Sweden)

    Ahmed Harbouche

    2012-02-01

    Full Text Available Distributed software engineering is widely recognized as a complex task. Among the inherent complexities is the process of obtaining a system design from its global requirement specification. This paper deals with such a transformation process and suggests an approach to derive the behavior of a given system's components, in the form of distributed Finite State Machines, from the global system requirements, in the form of an augmented UML Activity Diagrams notation. The process of the suggested approach is summarized in three steps: the definition of the appropriate source Meta-Model (requirements Meta-Model), the definition of the target Design Meta-Model, and the definition of the rules to govern the transformation during the derivation process. The derivation process transforms the global system requirements described as UML activity diagrams (extended with collaborations) to system role behaviors represented as UML finite state machines. The approach is implemented using the Atlas Transformation Language (ATL).

  6. Developing Senior Navy Leaders: Requirements for Flag Officer Expertise Today and in the Future

    Science.gov (United States)

    2008-01-01

    three: the MH-60 Sierra, MH-60 Romeo, and the MH-53E. This reduction offers efficiencies of simplified maintenance, logistics, and training...Carrier Strike Group leadership. Additionally, five expeditionary MH-60 Romeo squadrons and six expeditionary MH-60 Sierra squadrons will meet non...Hesketh, Jerry Kehoe, Kenneth Pearlman, Erich P. Prein, and Juan I. Sanchez, “The Practice of Competency Modeling,” Personnel Psychology, Vol. 53, No

  7. Availability modeling approach for future circular colliders based on the LHC operation experience

    Science.gov (United States)

    Niemi, Arto; Apollonio, Andrea; Gutleber, Johannes; Sollander, Peter; Penttinen, Jussi-Pekka; Virtanen, Seppo

    2016-12-01

    Reaching the challenging integrated luminosity production goals of a future circular hadron collider (FCC-hh) and high luminosity LHC (HL-LHC) requires a thorough understanding of today's most powerful high energy physics research infrastructure, the LHC accelerator complex at CERN. FCC-hh, a collider ring four times larger, aims at delivering 10-20 ab⁻¹ of integrated luminosity at seven times higher collision energy. Since the identification of the key factors that impact availability and cost is far from obvious, a dedicated activity has been launched in the frame of the future circular collider study to develop models to study possible ways to optimize accelerator availability. This paper introduces the FCC reliability and availability study, which takes a fresh look at assessing and modeling reliability and availability of particle accelerator infrastructures. The paper presents a probabilistic approach for Monte Carlo simulation of the machine operational cycle, schedule and availability for physics. The approach is based on best-practice, industrially applied reliability analysis methods. It relies on failure rate and repair time distributions to calculate impacts on availability. The main source of information for the study comes from CERN accelerator operation and maintenance data. Recent improvements in LHC failure tracking help improve the accuracy of modeling LHC performance. The model accuracy and prediction capabilities are discussed by comparing obtained results with past LHC operational data.
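
    The probabilistic approach described here comes down to sampling failure and repair times over an operating period and tallying the fraction of time available for physics. A minimal Monte Carlo sketch follows; the exponential distributions and all parameter values are illustrative assumptions, not CERN operational data.

```python
import random

def simulated_availability(hours=5000.0, mtbf=100.0, mttr=6.0, runs=2000):
    """Monte Carlo estimate of the fraction of an operating period available
    for physics. Faults arrive with exponential inter-arrival times (mean mtbf,
    hours) and each fault costs an exponential repair time (mean mttr, hours).
    Both distributions and all parameter values are illustrative assumptions."""
    fractions = []
    for _ in range(runs):
        t, downtime = 0.0, 0.0
        while True:
            t += random.expovariate(1.0 / mtbf)          # time to next fault
            if t >= hours:
                break
            downtime += random.expovariate(1.0 / mttr)   # repair duration
        fractions.append(1.0 - downtime / hours)
    return sum(fractions) / runs

print(f"estimated availability for physics: {simulated_availability():.3f}")
```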

  8. Balancing functional and nutritional quality of oils and fats: Current requirements and future trends

    Directory of Open Access Journals (Sweden)

    Van den Bremt Karen

    2012-03-01

    Full Text Available Oils and fats play an important role in the structure, aroma and stability of a wide variety of food products, as well as in their nutritional properties. For Puratos, a producer of ingredients for the bakery, patisserie and chocolate sectors, functionality and taste are of utmost importance, but the company also wants to contribute to the balanced diet of consumers. Vegetable oils and fats are used in margarines and releasing agents, vegetable creams, compound chocolate, fillings and emulsifiers. Each application requires an oil or fat with specific physicochemical properties in order to ensure the optimal structure, stability and taste of the end product. Traditionally, (partially) hydrogenated vegetable oils deliver important functional characteristics concerning crystallization behaviour, directly linked with the workability, melting properties, stability and mouth feel of the food product. However, due to negative nutritional implications, trans fats are to be replaced by healthier alternatives, preferably not by saturated fats. Consumers – and in some regions, regulatory authorities – demand trans-free or hydro-free products while not compromising on taste. Alternative fats and oils will be discussed concerning their functional and nutritional properties.

  9. Present and future changes of ice sheets in a coupled ice sheet-climate model

    Science.gov (United States)

    Kapsch, Marie; Ziemen, Florian; Mikolajewicz, Uwe

    2017-04-01

    The future evolution of the ice sheets covering Greenland and Antarctica is of importance, as ice sheets hold more than 99% of the Earth's freshwater. If released into the oceans, this freshwater could significantly impact the global climate, most prominently the oceanic overturning circulation and sea level. To model past and future climate change it is therefore important to integrate ice sheet models (ISMs) into state-of-the-art Earth System Models (ESMs), in order to account for the full range of feedback processes between ice sheets and other climate components. However, the coupling of ISMs into ESMs remains challenging, especially due to the required downscaling of the surface mass balance (SMB) from the low-resolution atmospheric grid of the ESM onto the high-resolution ice sheet topography. Here we present results from model simulations with the Max Planck Institute ESM (MPI-ESM) coupled to the Parallel ISM (PISM; http://www.pism-docs.org). To bridge the gap between the different model resolutions of the atmospheric component of MPI-ESM and PISM, a sophisticated energy balance model (EBM) is used to calculate and downscale the SMB. The modeled SMB for present-day climate conditions shows good agreement with SMB reconstructions from regional climate modeling (e.g. RACMO, MAR). To estimate the effect of different downscaling methods, simulations performed with the EBM are compared to simulations that use a commonly applied positive degree-day approach. These comparisons are shown for simulations with present-day as well as increasing greenhouse gas concentrations.

  10. NVC Based Model for Selecting Effective Requirement Elicitation Technique

    Directory of Open Access Journals (Sweden)

    Md. Rizwan Beg

    2012-10-01

    Full Text Available The Requirement Engineering process starts from the gathering of requirements, i.e., requirements elicitation. Requirements elicitation (RE) is the base building block for a software project and has a very high impact on subsequent design and build phases as well. Failure to accurately capture system requirements is a major factor in the failure of most software projects. Due to the criticality and impact of this phase, it is very important to perform requirements elicitation in no less than a perfect manner. One of the most difficult jobs for the elicitor is to select an appropriate technique for eliciting the requirements. Interviewing and interacting with stakeholders during the elicitation process is a communication-intensive activity involving verbal and non-verbal communication (NVC). The elicitor should give emphasis to non-verbal communication along with verbal communication so that requirements are recorded more efficiently and effectively. In this paper we propose a model in which stakeholders are classified by observing non-verbal communication and this is used as a base for elicitation technique selection. We also propose an efficient plan for requirements elicitation which intends to overcome the constraints faced by the elicitor.

  11. Wastewater treatment models in teaching and training: the mismatch between education and requirements for jobs.

    Science.gov (United States)

    Hug, Thomas; Benedetti, Lorenzo; Hall, Eric R; Johnson, Bruce R; Morgenroth, Eberhard; Nopens, Ingmar; Rieger, Leiv; Shaw, Andrew; Vanrolleghem, Peter A

    2009-01-01

    As mathematical modeling of wastewater treatment plants has become more common in research and consultancy, a mismatch between education and requirements for model-related jobs has developed. There seems to be a shortage of skilled people, both in terms of quantity and in quality. In order to address this problem, this paper provides a framework to outline different types of model-related jobs, assess the required skills for these jobs and characterize different types of education that modelers obtain "in school" as well as "on the job". It is important to consider that education of modelers does not mainly happen in university courses and that the variety of model related jobs goes far beyond use for process design by consulting companies. To resolve the mismatch, the current connection between requirements for different jobs and the various types of education has to be assessed for different geographical regions and professional environments. This allows the evaluation and improvement of important educational paths, considering quality assurance and future developments. Moreover, conclusions from a workshop involving practitioners and academics from North America and Europe are presented. The participants stressed the importance of non-technical skills and recommended strengthening the role of realistic modeling experience in university training. However, this paper suggests that all providers of modeling education and support, not only universities, but also software suppliers, professional associations and companies performing modeling tasks are called to assess and strengthen their role in training and support of professional modelers.

  12. Formal Requirements Modeling for Reactive Systems with Coloured Petri Nets

    DEFF Research Database (Denmark)

    Tjell, Simon

    This dissertation presents the contributions of seven publications all concerned with the application of Coloured Petri Nets (CPN) to requirements modeling for reactive systems. The publications are introduced along with relevant background material and related work, and their contributions...... interface composed of recognizable artifacts and activities. The presentation of the three publications related to Use Cases is followed by a the presentation of a publication formalizing some of the guidelines applied for structuring the CPN requirements models|namely the guidelines that make it possible...... activity. The traces are automatically recorded during execution of the model. The second publication presents a formally specified framework for automating a large part of the tasks related to integrating Problem Frames with CPN. The framework is specified in VDM++, and allows the modeler to automatically...

  13. Fractured rock modeling in the National Waste Terminal Storage Program: a review of requirements and status

    Energy Technology Data Exchange (ETDEWEB)

    St. John, C.; Krug, A.; Key, S.; Monsees, J.

    1983-05-01

    Generalized computer codes capable of forming the basis for numerical models of fractured rock masses are being used within the NWTS program. Little additional development of these codes is considered justifiable, except in the area of representation of discrete fractures. On the other hand, model preparation requires definition of medium-specific constitutive descriptions and site characteristics and is therefore legitimately conducted by each of the media-oriented projects within the National Waste Terminal Storage program. However, it is essential that a uniform approach to the role of numerical modeling be adopted, including agreement upon the contribution of modeling to the design and licensing process and the need for, and means of, model qualification for particular purposes. This report discusses the role of numerical modeling, reviews the capabilities of several computer codes that are being used to support design or performance assessment, and proposes a framework for future numerical modeling activities within the NWTS program.

  14. NASA Standard for Models and Simulations: Philosophy and Requirements Overview

    Science.gov (United States)

    Blattnig, Steve R.; Luckring, James M.; Morrison, Joseph H.; Sylvester, Andre J.; Tripathi, Ram K.; Zang, Thomas A.

    2013-01-01

    Following the Columbia Accident Investigation Board report, the NASA Administrator chartered an executive team (known as the Diaz Team) to identify those CAIB report elements with NASA-wide applicability and to develop corrective measures to address each element. One such measure was the development of a standard for the development, documentation, and operation of models and simulations. This report describes the philosophy and requirements overview of the resulting NASA Standard for Models and Simulations.

  15. Single High Fidelity Geometric Data Sets for LCM - Model Requirements

    Science.gov (United States)

    2006-11-01

    material name (for example, an HY80 steel) plus additional material requirements (heat treatment, etc.). Creation of a more detailed description of the data...structures are steel, aluminum and composites. The structural components that make up a global FEA model drive the fidelity of the model. For example

  16. Building Futurism into the Institution's Strategic Planning and Human Resource Development Model.

    Science.gov (United States)

    Groff, Warren H.

    A process for building futurism into the institution's strategic planning and human resource development model is described. It is an attempt to assist faculty and staff to understand the future and the formulation and revision of professional goals in relation to an image of the future. A conceptual framework about the changing nature of human…

  17. Future water availability in North African dams simulated by high-resolution regional climate models

    Science.gov (United States)

    Tramblay, Yves; Jarlan, Lionel; Hanich, Lahoucine; Somot, Samuel

    2016-04-01

    In North Africa, the countries of Morocco, Algeria and Tunisia are already experiencing water scarcity and a strong interannual variability of precipitation. To better manage their existing water resources, several dams and reservoirs have been built on most large river catchments. The objective of this study is to provide quantitative scenarios of future changes in water availability for the 47 major dam and reservoir catchments located in North Africa. An ensemble of regional climate models (RCMs) with a spatial resolution of 12 km, driven by different general circulation models (GCMs), from the Euro-CORDEX experiment has been considered to analyze the projected changes in temperature, precipitation and potential evapotranspiration (PET) for two scenarios (RCP4.5 and RCP8.5) and two time horizons (2040-2065 and 2065-2090). PET is estimated from RCM outputs either with the FAO Penman-Monteith (PM) equation, requiring air temperature, relative humidity, net radiation and wind, or with the Hargreaves-Samani (HS) equation, requiring only air temperature. The water balance is analyzed by comparing the climatic demand and supply of water, considering that for most of these catchments groundwater storage is negligible over long time periods. Results indicated a future temperature increase for all catchments between +1.8 °C and +4.2 °C, depending on the emission scenario and the time period considered. Precipitation is projected to decrease by -14% to -27%, mainly in winter and spring, with a strong east-to-west gradient. PET computed from the PM or HS formulas provided very similar estimates and projections, ranging from +7% to +18%. Changes in PET are mostly driven by rising temperatures and are greater during the dry summer months than during the wet winter season. Therefore the increase in PET has a lower impact than the decline in precipitation on future water availability, which is expected to decrease by -19% to -33% on average.
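
    For readers unfamiliar with the temperature-only PET option mentioned above, a minimal sketch of a common textbook form of the Hargreaves-Samani equation follows; the coefficients and example inputs are generic assumptions, not values calibrated for the North African catchments studied here.

```python
import math

def pet_hargreaves_samani(t_mean, t_max, t_min, ra):
    """Reference evapotranspiration (mm/day) from a common textbook form of the
    Hargreaves-Samani equation; ra is extraterrestrial radiation expressed as
    mm/day of equivalent evaporation. Coefficients are the usual defaults."""
    return 0.0023 * ra * (t_mean + 17.8) * math.sqrt(max(t_max - t_min, 0.0))

# Illustrative warm-season day in a semi-arid catchment (made-up numbers)
print(pet_hargreaves_samani(t_mean=28.0, t_max=36.0, t_min=20.0, ra=16.5))
```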

  18. Requirements for a next generation global flood inundation models

    Science.gov (United States)

    Bates, P. D.; Neal, J. C.; Smith, A.; Sampson, C. C.

    2016-12-01

    In this paper we review the current status of global hydrodynamic models for flood inundation prediction and highlight recent successes and current limitations. Building on this analysis we then go on to consider what is required to develop the next generation of such schemes and show that to achieve this a number of fundamental science problems will need to be overcome. New data sets and new types of analysis will be required, and we show that these will only partially be met by currently planned satellite missions and data collection initiatives. A particular example is the quality of available global Digital Elevation data. The current best data set for flood modelling, SRTM, is only available at a relatively modest 30m resolution, contains pixel-to-pixel noise of 6m and is corrupted by surface artefacts. Creative processing techniques have sought to address these issues with some success, but fundamentally the quality of the available global terrain data limits flood modelling and needs to be overcome. Similar arguments can be made for many other elements of global hydrodynamic models including their bathymetry data, boundary conditions, flood defence information and model validation data. We therefore systematically review each component of global flood models and document whether planned new technology will solve current limitations and, if not, what exactly will be required to do so.

  19. Development of Rainfall-Discharge Model for Future NPP candidate Site

    Energy Technology Data Exchange (ETDEWEB)

    An, Ji-hong; Yee, Eric [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-10-15

    In this study, the most suitable model for predicting the peak amount of riverine flooding at the future nuclear power plant site in Yeongdeok was developed by examining historical rainfall and discharge data from the nearest gage station, the Jodong water-level gage station in the Taehwa basin. Siting a nuclear power plant (NPP) requires safety analyses that include the effects of extreme events such as flooding or earthquakes. In light of the South Korean government's 15-year power supply plan, which calls for the construction of a new nuclear power station in Yeongdeok, it becomes more important to site the new station in an area safe from flooding. Because flooding and flooding-related accidents mostly result from extremely intense rainfall, it is necessary to establish the relationship between rainfall and runoff by setting up a feasible model to estimate the peak flow of the river around nuclear-related facilities.

  20. Comparative analysis of hourly and dynamic power balancing models for validating future energy scenarios

    DEFF Research Database (Denmark)

    Pillai, Jayakrishnan R.; Heussen, Kai; Østergaard, Poul Alberg

    2011-01-01

    Energy system analyses on the basis of fast and simple tools have proven particularly useful for interdisciplinary planning projects with frequent iterations and re-evaluation of alternative scenarios. As such, the tool “EnergyPLAN” is used for hourly balanced and spatially aggregate annual......, the model is verified on the basis of the existing energy mix on Bornholm as an islanded energy system. Future energy scenarios for the year 2030 are analysed to study a feasible technology mix for a higher share of wind power. Finally, the results of the hourly simulations are compared to dynamic frequency...... simulations incorporating the Vehicle-to-grid technology. The results indicate how the EnergyPLAN model may be improved in terms of intra-hour variability, stability and ancillary services to achieve a better reflection of energy and power capacity requirements....

  1. Modelling future oil production, population and the economy

    Energy Technology Data Exchange (ETDEWEB)

    Laherrere, Jean

    2003-07-01

    pattern, giving one or more new cycles. To model an event made up of several cycles extending into the future calls for an estimate of the ultimate value, which corresponds with the area under the curve up to the end of the event. For oil, the best tool to determine an ultimate value is the creaming curve that plots cumulative discovery versus the cumulative number of new field wildcats, the result being modelled by one or more hyperbolas. Another method is to plot the ratio of annual to cumulative production versus cumulative production, and extrapolate the trend to zero. When the trend is linear, it represents the derivative of the logistic curve. The fractal distribution of sizes (field reserves, incomes, urban agglomerations plotted against decreasing rank) can also be extrapolated to an ultimate value. Population can be well modelled with two cycles, distinguishing countries with high and low fertility rates. Previous UN forecasts were too high for different reasons. Economic parameters, such as unemployment or inflation, can be correlated with oil price after a certain time-shift. Income distribution is well described by a fractal plot of population versus income. The income fractal distribution in France is in fact the same as that in the United States, although the total of the latter is higher because of a larger population. Many graphs are shown for each domain using the same tools. The goal is that the reader may be able to draw his own conclusions, and make his own forecast. Ironically, it appears that the modelling is more reliable than the input data. Accordingly, the main challenge is to secure better data, but that will be achieved only if and when political influences can be removed. A neutral agency is needed, but neither the UN nor national agencies are neutral. It is hard to see how to force the actors to tell the truth, or know who would run and finance such an organisation. A step in the right direction would be to make official organisations liable
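
    The "ratio of annual to cumulative production" extrapolation described above (often called Hubbert linearization) can be illustrated in a few lines: fit a straight line to P/Q versus Q and read the ultimate off the zero crossing. The production series below is synthetic and purely illustrative.

```python
# Synthetic illustration of the extrapolation described above: plot annual over
# cumulative production (P/Q) against cumulative production (Q), fit a line,
# and take the zero crossing as the ultimate. The series below is invented.

def ultimate_from_linearization(annual, cumulative):
    xs = cumulative
    ys = [p / q for p, q in zip(annual, cumulative)]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return -intercept / slope  # Q at which P/Q extrapolates to zero

annual = [1.0, 1.4, 1.8, 2.0, 1.9, 1.7]                         # annual production
cumulative = [sum(annual[:i + 1]) for i in range(len(annual))]  # running total
print(ultimate_from_linearization(annual, cumulative))
```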

  3. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  5. Nuclear Energy Research Initiative. Risk Informed Assessment of Regulatory and Design Requirements for Future Nuclear Power Plants. Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Ritterbusch, S.E.

    2000-08-01

    The overall goal of this research project is to support innovation in new nuclear power plant designs. This project is examining the implications, for future reactors and future safety regulation, of utilizing a new risk-informed regulatory system as a replacement for the current system. This innovation will be made possible through development of a scientific, highly risk-informed approach for the design and regulation of nuclear power plants. This approach will include the development and/or confirmation of corresponding regulatory requirements and industry standards. The major impediment to the long-term competitiveness of new nuclear plants in the U.S. is the capital cost component, which may need to be reduced on the order of 35% to 40% for Advanced Light Water Reactors (ALWRs) such as System 80+ and the Advanced Boiling Water Reactor (ABWR). The required cost reduction for an ALWR such as AP600 or AP1000 would be expected to be less. Such reductions in capital cost will require a fundamental reevaluation of the industry standards and regulatory bases under which nuclear plants are designed and licensed. Fortunately, there is now an increasing awareness that many of the existing regulatory requirements and industry standards are not significantly contributing to safety and reliability and, therefore, are unnecessarily adding to nuclear plant costs. Not only does this degrade the economic competitiveness of nuclear energy, it results in unnecessary costs to the American electricity consumer. While addressing these concerns, this research project will be coordinated with current efforts of industry and NRC to develop risk-informed, performance-based regulations that affect the operation of the existing nuclear plants; however, this project will go farther by focusing on the design of new plants.

  6. Models of protein and amino acid requirements for cattle

    Directory of Open Access Journals (Sweden)

    Luis Orlindo Tedeschi

    2015-03-01

    Full Text Available Protein supply and requirements by ruminants have been studied for more than a century. These studies led to the accumulation of a large body of scientific information about the digestion and metabolism of protein by ruminants, as well as the characterization of dietary protein in order to maximize animal performance. During the 1980s and 1990s, when computers became more accessible and powerful, scientists began to conceptualize and develop mathematical nutrition models, and to program them into computers to assist with ration balancing and formulation for domesticated ruminants, specifically dairy and beef cattle. The most commonly known nutrition models developed during this period were those of the National Research Council (NRC) in the United States, the Agricultural Research Council (ARC) in the United Kingdom, the Institut National de la Recherche Agronomique (INRA) in France, and the Commonwealth Scientific and Industrial Research Organization (CSIRO) in Australia. Others were derivative works from these models with different degrees of modification in the supply or requirement calculations, and in the modeling nature (e.g., static or dynamic, mechanistic or deterministic). Circa the 1990s, most models adopted the metabolizable protein (MP) system over the crude protein (CP) and digestible CP systems to estimate the supply of MP, and the factorial system to calculate the MP required by the animal. The MP system included two portions of protein (i.e., the rumen-undegraded dietary CP, RUP, and the contributions of microbial CP, MCP) as the main sources of MP for the animal. Some models would explicitly account for the impact of dry matter intake (DMI) on the MP required for maintenance (MPm; e.g., the Cornell Net Carbohydrate and Protein System, CNCPS, and the Dutch system, DVE/OEB), while others would simply account for scurf, urinary, metabolic fecal, and endogenous contributions independently of DMI. All models included milk yield and its components in estimating MP required for lactation
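
    The factorial logic mentioned in the record, summing independent maintenance components and a production term, can be sketched as follows; the component values and coefficients are placeholders chosen for illustration, not figures from the NRC, ARC, INRA or CSIRO systems.

```python
# Placeholder illustration of the factorial system: total metabolizable protein
# (MP) requirement is a sum of maintenance components (scurf, urinary, metabolic
# fecal, endogenous) plus a lactation term proportional to milk yield. All
# numbers below are hypothetical, not published coefficients.

def mp_required(maintenance_components_g, milk_yield_kg, mp_per_kg_milk_g):
    """Total MP requirement in g/day."""
    return sum(maintenance_components_g) + milk_yield_kg * mp_per_kg_milk_g

maintenance = [24.0, 110.0, 280.0, 35.0]  # scurf, urinary, metabolic fecal, endogenous (hypothetical)
print(mp_required(maintenance, milk_yield_kg=30.0, mp_per_kg_milk_g=48.0))
```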

  7. Future Orientation, Impulsivity, and Problem Behaviors: A Longitudinal Moderation Model

    Science.gov (United States)

    Chen, Pan; Vazsonyi, Alexander T.

    2011-01-01

    In the current study, based on a sample of 1,873 adolescents between 11.4 and 20.9 years of age from the first 3 waves of the National Longitudinal Study of Adolescent Health, we investigated the longitudinal effects of future orientation on levels of and developmental changes in problem behaviors, while controlling for the effects by impulsivity;…

  8. More than meets the eye: Using cognitive work analysis to identify design requirements for future rail level crossing systems.

    Science.gov (United States)

    Salmon, Paul M; Lenné, Michael G; Read, Gemma J M; Mulvihill, Christine M; Cornelissen, Miranda; Walker, Guy H; Young, Kristie L; Stevens, Nicholas; Stanton, Neville A

    2016-03-01

    An increasing intensity of operations means that the longstanding safety issue of rail level crossings is likely to become worse in the transport systems of the future. It has been suggested that the failure to prevent collisions may be, in part, due to a lack of systems thinking during design, crash analysis, and countermeasure development. This paper presents a systems analysis of current active rail level crossing systems in Victoria, Australia that was undertaken to identify design requirements to improve safety in future rail level crossing environments. Cognitive work analysis was used to analyse rail level crossing systems using data derived from a range of activities. Overall the analysis identified a range of instances where modification or redesign in line with systems thinking could potentially improve behaviour and safety. A notable finding is that there are opportunities for redesign outside of the physical rail level crossing infrastructure, including improved data systems, in-vehicle warnings and modifications to design processes, standards and guidelines. The implications for future rail level crossing systems are discussed.

  9. Modeling requirements for in situ vitrification. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    MacKinnon, R.J.; Mecham, D.C.; Hagrman, D.L.; Johnson, R.W.; Murray, P.E.; Slater, C.E.; Marwil, E.S.; Weaver, R.A.; Argyle, M.D.

    1991-11-01

    This document outlines the requirements for the model being developed at the INEL which will provide analytical support for the ISV technology assessment program. The model includes representations of the electric potential field, thermal transport with melting, gas and particulate release, vapor migration, off-gas combustion and process chemistry. The modeling objectives are to (1) help determine the safety of the process by assessing the air and surrounding soil radionuclide and chemical pollution hazards, the nuclear criticality hazard, and the explosion and fire hazards, (2) help determine the suitability of the ISV process for stabilizing the buried wastes involved, and (3) help design laboratory and field tests and interpret results therefrom.

  10. Thermodynamic models for bounding pressurant mass requirements of cryogenic tanks

    Science.gov (United States)

    Vandresar, Neil T.; Haberbusch, Mark S.

    1994-01-01

    Thermodynamic models have been formulated to predict lower and upper bounds for the mass of pressurant gas required to pressurize a cryogenic tank and then expel liquid from the tank. Limiting conditions are based on either thermal equilibrium or zero energy exchange between the pressurant gas and initial tank contents. The models are independent of gravity level and allow specification of autogenous or non-condensible pressurants. Partial liquid fill levels may be specified for initial and final conditions. Model predictions are shown to successfully bound results from limited normal-gravity tests with condensable and non-condensable pressurant gases. Representative maximum collapse factor maps are presented for liquid hydrogen to show the effects of initial and final fill level on the range of pressurant gas requirements. Maximum collapse factors occur for partial expulsions with large final liquid fill fractions.
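
    A crude ideal-gas bracket conveys the idea of bounding pressurant mass between the no-energy-exchange and thermal-equilibrium limits. The sketch below is only an illustration of that logic under assumed conditions; it ignores condensation and ullage mixing, which the reported models treat more carefully.

```python
# Ideal-gas bracket on pressurant mass for displacing a liquid volume at constant
# pressure: the gas either keeps its inlet temperature (no energy exchange) or
# cools to the tank temperature (thermal equilibrium). All values are illustrative.

R_HYDROGEN = 4124.0  # specific gas constant of gaseous hydrogen, J/(kg*K)

def pressurant_mass(pressure_pa, volume_m3, temperature_k, r_specific=R_HYDROGEN):
    """Ideal-gas mass of pressurant occupying the expelled volume."""
    return pressure_pa * volume_m3 / (r_specific * temperature_k)

lower = pressurant_mass(3.0e5, 10.0, 300.0)  # gas stays at a 300 K inlet temperature
upper = pressurant_mass(3.0e5, 10.0, 21.0)   # gas equilibrates with 21 K liquid hydrogen
print(f"pressurant mass bounds: {lower:.1f} kg to {upper:.1f} kg")
```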

  11. A commuting generation model requiring only aggregated data

    CERN Document Server

    Lenormand, Maxime; Gargiulo, Floriana

    2011-01-01

    We recently proposed, in (Gargiulo et al., 2011), an innovative stochastic model with only one parameter to calibrate. It reproduces the complete network by an iterative process that stochastically chooses, for each commuter living in a municipality of a region, a workplace in the region. The choice is made considering the job offer in each municipality of the region and the distance to all the possible destinations. The model is quite effective if the region is sufficiently autonomous in terms of job offers. However, calibrating the model or verifying this autonomy requires data or expertise which are not necessarily available. Moreover, the region may not be autonomous. In the present work, we overcome these limitations by extending the geographical base of the commuters' job search to outside the region and by changing the form of the deterrence function. We also found a law to calibrate the improved model which does not require data.
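
    The core assignment step described above can be sketched in a few lines: each commuter draws a work municipality with probability proportional to the job offer times a deterrence function of distance. The exponential deterrence form and the beta value below are illustrative stand-ins, not the calibrated law of the paper.

```python
import math
import random

# Each commuter draws a destination with probability proportional to
# job offer x exp(-beta * distance); form and beta are assumptions.

def pick_workplace(jobs, distances_km, beta=0.1):
    weights = [j * math.exp(-beta * d) for j, d in zip(jobs, distances_km)]
    return random.choices(range(len(jobs)), weights=weights, k=1)[0]

jobs = [1200, 300, 4500, 800]          # remaining job offers per municipality
distances_km = [5.0, 12.0, 30.0, 8.0]  # distance from the commuter's home
print(pick_workplace(jobs, distances_km))
```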

  12. Diagnostics and future evolution analysis of the two parametric models

    CERN Document Server

    Yang, Guang; Meng, Xinhe

    2016-01-01

    In this paper, we apply three diagnostics, namely $Om$, the Statefinder hierarchy and the growth rate of perturbations, to discriminate the two parametric models for the effective pressure from the $\Lambda$CDM model. Using the $Om$ diagnostic, we find that model 1 and model 2 can hardly be distinguished from each other, or from the $\Lambda$CDM model, at the 68% confidence level. As a supplement, using the Statefinder hierarchy diagnostics and the growth rate of perturbations, we find not only that our two parametric models can be well distinguished from the $\Lambda$CDM model, but also that, compared with the $Om$ diagnostic, model 1 and model 2 can be better distinguished from each other. In addition, we also explore the fate of the universe in our two models by means of rip analysis.
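
    For reference, the $Om$ diagnostic used here is built from the normalized expansion rate and stays constant for flat $\Lambda$CDM, so any redshift dependence signals a departure from it. A minimal sketch follows; the assumed matter density is purely illustrative.

```python
# Om(z) = (E(z)^2 - 1) / ((1 + z)^3 - 1) with E(z) = H(z)/H0; for flat LambdaCDM
# it equals the matter density parameter at every redshift. Omega_m = 0.3 is an
# assumed value for illustration.

def om_diagnostic(z, e_of_z):
    return (e_of_z ** 2 - 1.0) / ((1.0 + z) ** 3 - 1.0)

def e_lcdm(z, omega_m=0.3):
    return (omega_m * (1.0 + z) ** 3 + 1.0 - omega_m) ** 0.5

for z in (0.5, 1.0, 2.0):
    print(z, om_diagnostic(z, e_lcdm(z)))   # ~0.3 at every redshift
```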

  13. The NRC's SPAR Models: Current Status, Future Development, and Modeling Issues

    Energy Technology Data Exchange (ETDEWEB)

    Robert F. Buell

    2008-09-01

    Probabilistic risk assessments (PRAs) play an increasingly important role in the regulatory framework of the U.S. nuclear power industry. The Nuclear Regulatory Commission (NRC) relies on a set of plant-specific Standardized Plant Analysis Risk (SPAR) models to provide critical risk-based input to the regulatory process. The Significance Determination Process (SDP), Management Directive 8.3 - NRC Incident Investigation Program, Accident Sequence Precursor (ASP) and Mitigating Systems Performance Index (MSPI) programs are among the regulatory initiatives that receive significant input from the SPAR models. Other uses of the SPAR models include: Screening & Resolution of Generic Safety Issues, License Amendment reviews and Notice of Enforcement Discretion (NOEDs). This paper presents the current status of SPAR model development activities, future development objectives, and issues related to the development, verification and maintenance of the SPAR models.

  14. Cognition and procedure representational requirements for predictive human performance models

    Science.gov (United States)

    Corker, K.

    1992-01-01

    Models and modeling environments for human performance are becoming significant contributors to early system design and analysis procedures. Issues of levels of automation, physical environment, informational environment, and manning requirements are being addressed by such man/machine analysis systems. The research reported here investigates the close interaction between models of human cognition and models that describe procedural performance. We describe a methodology for the decomposition of aircrew procedures that supports interaction with models of cognition on the basis of procedures observed; that serves to identify cockpit/avionics information sources and crew information requirements; and that provides the structure to support methods for function allocation among crew and aiding systems. Our approach is to develop an object-oriented, modular, executable software representation of the aircrew, the aircraft, and the procedures necessary to satisfy flight-phase goals. We then encode, in a time-based language, taxonomies of the conceptual, relational, and procedural constraints among the cockpit avionics and control system and the aircrew. We have designed and implemented a goals/procedures hierarchic representation sufficient to describe procedural flow in the cockpit. We then execute the procedural representation in simulation software and calculate the values of the flight instruments, aircraft state variables and crew resources using the constraints available from the relationship taxonomies. The system provides a flexible, extensible, manipulable and executable representation of aircrew and procedures that is generally applicable to crew/procedure task analysis. The representation supports developed methods of intent inference, and is extensible to include issues of information requirements and functional allocation. We are attempting to link the procedural representation to models of cognitive functions to establish several intent inference methods

  15. Mathematical Modeling of Programmatic Requirements for Yaws Eradication

    Science.gov (United States)

    Mitjà, Oriol; Fitzpatrick, Christopher; Asiedu, Kingsley; Solomon, Anthony W.; Mabey, David C.W.; Funk, Sebastian

    2017-01-01

    Yaws is targeted for eradication by 2020. The mainstay of the eradication strategy is mass treatment followed by case finding. Modeling has been used to inform programmatic requirements for other neglected tropical diseases and could provide insights into yaws eradication. We developed a model of yaws transmission varying the coverage and number of rounds of treatment. The estimated number of cases arising from an index case (basic reproduction number [R0]) ranged from 1.08 to 3.32. To have 80% probability of achieving eradication, 8 rounds of treatment with 80% coverage were required at low estimates of R0 (1.45). This requirement increased to 95% at high estimates of R0 (2.47). Extending the treatment interval to 12 months increased requirements at all estimates of R0. At high estimates of R0 with 12 monthly rounds of treatment, no combination of variables achieved eradication. Models should be used to guide the scale-up of yaws eradication. PMID:27983500
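
    The dependence of programmatic requirements on R0 can be conveyed with a deterministic mean-field sketch: each treat-and-transmit cycle scales prevalence by roughly (1 - coverage) x R0, so prevalence falls only when coverage exceeds 1 - 1/R0. This toy bound is looser than the authors' stochastic model, which demands higher coverage, and is shown purely for intuition.

```python
# Mean-field toy: each cycle, mass treatment cures a fraction `coverage` of
# cases and the untreated remainder generate R0 new cases on average, so
# prevalence shrinks only when coverage exceeds 1 - 1/R0. Not the authors' model.

def cases_after_program(r0, coverage, rounds, initial_cases=1000.0):
    cases = initial_cases
    for _ in range(rounds):
        cases *= (1.0 - coverage) * r0  # treatment round followed by transmission
    return cases

for r0 in (1.08, 1.45, 2.47, 3.32):
    threshold = 1.0 - 1.0 / r0
    print(r0, round(threshold, 2), cases_after_program(r0, coverage=0.80, rounds=8))
```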

  16. The Benefit of Ambiguity in Understanding Goals in Requirements Modelling

    DEFF Research Database (Denmark)

    Paay, Jeni; Pedell, Sonja; Sterling, Leon

    2011-01-01

    of their research is to create technologies that support more flexible and meaningful social interactions, by combining best practice in software engineering with ethnographic techniques to model complex social interactions from their socially oriented life for the purposes of building rich socio......This paper examines the benefit of ambiguity in describing goals in requirements modelling for the design of socio-technical systems using concepts from Agent-Oriented Software Engineering (AOSE) and ethnographic and cultural probe methods from Human Computer Interaction (HCI). The authors’ aim...... of abstraction, ambiguous and open for conversations through the modelling process add richness to goal models, and communicate quality attributes of the interaction being modelled to the design phase, where this ambiguity is regarded as a resource for design....

  17. Information Models, Data Requirements, and Agile Data Curation

    Science.gov (United States)

    Hughes, John S.; Crichton, Dan; Ritschel, Bernd; Hardman, Sean; Joyner, Ron

    2015-04-01

    The Planetary Data System's next-generation system, PDS4, is an example of the successful use of an ontology-based Information Model (IM) to drive the development and operations of a data system. In traditional systems engineering, requirements or statements about what is necessary for the system are collected and analyzed for input into the design stage of systems development. With the advent of big data, the requirements associated with data have begun to dominate, and an ontology-based information model can be used to provide a formalized and rigorous set of data requirements. These requirements address not only the usual issues of data quantity, quality, and disposition but also data representation, integrity, provenance, context, and semantics. In addition, the use of these data requirements during systems development has many characteristics of Agile Curation as proposed by Young et al. [Taking Another Look at the Data Management Life Cycle: Deconstruction, Agile, and Community, AGU 2014], namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. For example, customers can be satisfied through early and continuous delivery of system software and services that are configured directly from the information model. This presentation will describe the PDS4 architecture and its three principal parts: the ontology-based Information Model (IM), the federated registries and repositories, and the REST-based service layer for search, retrieval, and distribution. The development of the IM will be highlighted with special emphasis on knowledge acquisition, the impact of the IM on development and operations, and the use of shared ontologies at multiple governance levels to promote system interoperability and data correlation.

  18. Policy Research Using Agent-Based Modeling to Assess Future Impacts of Urban Expansion into Farmlands and Forests

    Directory of Open Access Journals (Sweden)

    Michael R. Guzy

    2008-06-01

    Full Text Available The expansion of urban land uses into farmlands and forests requires an assessment of future ecological impacts. Spatially explicit agent-based models can represent the changes in resilience and ecological services that result from different land-use policies. When modeling complex adaptive systems, both the methods used to interpret results and the standards of rigor used to judge adequacy are complicated and require additional research. Recent studies suggest that it would be appropriate to use these models as an extension of exploratory analysis. This type of analysis generates ensembles of alternate plausible representations of future system conditions. User expertise steers interactive, stepwise system exploration toward inductive reasoning about potential changes to the system. In this study, we develop understanding of the potential alternative futures for a social-ecological system by way of successive simulations that test variations in the types and numbers of policies. The model addresses the agricultural-urban interface and the preservation of ecosystem services. The landscape analyzed is at the junction of the McKenzie and Willamette Rivers adjacent to the cities of Eugene and Springfield in Lane County, Oregon. Our exploration of alternative future scenarios suggests that policies that constrain urban growth and create incentives for farming and forest enterprises to preserve and enhance habitat can protect ecosystem resilience and services.

  19. Future requirements. Clinical investigations

    DEFF Research Database (Denmark)

    Qvist, V.

    2002-01-01

    Biocompatibility, Cariology, Clinical trials, Dental materials, Health services research, Human, Pedodontics...

  20. The Future of Planetary Climate Modeling and Weather Prediction

    Science.gov (United States)

    Del Genio, A. D.; Domagal-Goldman, S. D.; Kiang, N. Y.; Kopparapu, R. K.; Schmidt, G. A.; Sohl, L. E.

    2017-01-01

    Modeling of planetary climate and weather has followed the development of tools for studying Earth, with lags of a few years. Early Earth climate studies were performed with 1-dimensional radiative-convective models, which were soon followed by similar models for the climates of Mars and Venus and eventually by similar models for exoplanets. 3-dimensional general circulation models (GCMs) became common in Earth science soon after and within several years were applied to the meteorology of Mars, but it was several decades before a GCM was used to simulate extrasolar planets. Recent trends in Earth weather and climate modeling serve as a useful guide to how modeling of Solar System and exoplanet weather and climate will evolve in the coming decade.

  1. 3D tumor models: history, advances and future perspectives.

    Science.gov (United States)

    Benien, Parul; Swami, Archana

    2014-05-01

    Evaluation of cancer therapeutics by utilizing 3D tumor models, before clinical studies, could be more advantageous than conventional 2D tumor models (monolayer cultures). The 3D systems mimic the tumor microenvironment more closely than 2D systems. The following review discusses the various 3D tumor models present today with the advantages and limitations of each. 3D tumor models replicate the elements of a tumor microenvironment such as hypoxia, necrosis, angiogenesis and cell adhesion. The review introduces application of techniques such as microfluidics, imaging and tissue engineering to improve the 3D tumor models. Despite their tremendous potential to better screen chemotherapeutics, 3D tumor models still have a long way to go before they are used commonly as in vitro tumor models in pharmaceutical industrial research.

  2. Structured Multi-level Data Fusion and Modelling of Heterogeneous Environmental Data for Future Internet Applications

    Science.gov (United States)

    Sabeur, Zoheir; Chakravarthy, Ajay; Bashevoy, Maxim; Modafferi, Stefano

    2013-04-01

    The rapid increase in environmental observations, which are conducted by Small to Medium Enterprise communities and volunteers using affordable in situ sensors at various scales, in addition to the more established observatories set up by environmental and space agencies using airborne and space-borne sensing technologies, is generating serious amounts of big data at ever-increasing speeds. Furthermore, with the emergence of Future Internet technologies, the urgent requirement to deploy specific enablers for the delivery of processed environmental knowledge with advanced situation awareness to citizens in real time has reached paramount importance. Specifically, it has become highly critical to build and provide services which automate the aggregation of data from various sources, while surmounting the semantic gaps, conflicts and heterogeneity in data sources. The early-stage aggregation of data will enable the pre-processing of data from multiple sources while reconciling the temporal gaps in measurement time series and aligning their respective asynchronicities. This low-level type of data fusion needs to be automated and chained to more advanced levels of data fusion services specialising in observation forecasts at locations where sensing is not deployed, or at time slices where sensing has not yet taken place. As a result, multi-level fusion services are required among the families of specific enablers for monitoring environments and spaces in the Future Internet. These have been initially deployed and piloted in the ongoing ENVIROFI project of the FI-PPP programme [1]. Automated fusion and modelling of in situ and remote sensing data has been set up, and the experimentation successfully conducted, using RBF networks for the spatial fusion of water quality parameter measurements from satellites and stationary buoys in the Irish Sea. The RBF networks method scales for the spatial data fusion of multiple types of observation sources. This
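
    As an illustration of the RBF-based spatial fusion mentioned above, the sketch below fits a Gaussian radial-basis-function interpolant to a handful of point observations and evaluates it at unobserved locations; the coordinates, values and length scale are invented examples, not ENVIROFI data.

```python
import numpy as np

def rbf_fit_predict(obs_xy, obs_val, query_xy, length_scale=10.0):
    """Gaussian RBF interpolation: solve for weights at the observation sites,
    then evaluate the interpolant at the query sites."""
    def kernel(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2 / (2.0 * length_scale ** 2))
    gram = kernel(obs_xy, obs_xy) + 1e-8 * np.eye(len(obs_xy))  # jitter for stability
    weights = np.linalg.solve(gram, obs_val)
    return kernel(query_xy, obs_xy) @ weights

obs_xy = np.array([[0.0, 0.0], [20.0, 5.0], [10.0, 15.0]])  # buoy positions (km)
obs_val = np.array([7.2, 6.8, 7.5])                         # observed parameter values
query_xy = np.array([[5.0, 5.0], [15.0, 10.0]])             # unobserved locations
print(rbf_fit_predict(obs_xy, obs_val, query_xy))
```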

  3. Forecasting Future Salaries in the Czech Republic Using Stochastic Modelling

    OpenAIRE

    Ondřej Šimpach; Jitka Langhamrová

    2013-01-01

    Background: In spite of the economic crisis of 2008, there have not been changes dramatic enough to strongly alter the trend behaviour of the average gross monthly wages and the monthly wage medians in the Czech Republic. In order to support public and monetary planning, reliable forecasts of future salaries are indispensable. Objectives: The aim is to provide an outline of the behaviour of the average gross wages and the gross wage medians of the Czec...

  4. Models for residential- and commercial-sector energy-conservation analysis: applications, limitations, and future potential. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Henry E.; Fullen, Robert E.

    1980-09-01

    This report reviews four of the major models used by the Department of Energy (DOE) for energy conservation analyses in the residential- and commercial-building sectors. The objective is to provide a critical analysis of how these models can serve as tools for DOE and its Conservation Policy Office in evaluating and quantifying their policy and program requirements. For this, the study brings together information on the models' analytical structure and their strengths and limitations in policy applications; these are then employed to assess the most effective role for each model in addressing future issues of buildings energy-conservation policy and analysis. The four models covered are: Oak Ridge Residential Energy Model; Micro Analysis of Transfers to Households/Comprehensive Human Resources Data System (MATH/CHRDS) Model; Oak Ridge Commercial Energy Model; and Brookhaven Buildings Energy Conservation Optimization Model (BECOM).

  5. A Model for Forecasting Enlisted Student IA Billet Requirements

    Science.gov (United States)

    2016-03-01

    were promised and had at least one course failure. Training times: student execution depends on TTT, which includes under-instruction (UI) time and...A Model for Forecasting Enlisted Student IA Billet Requirements, Steven W. Belcher with David L. Reese and Kletus S. Lawler, CNA, March 2016.

  6. Dynamic causal models of neural system dynamics: current state and future extensions

    Indian Academy of Sciences (India)

    Klaas E Stephan; Lee M Harrison; Stefan J Kiebel; Olivier David; Will D Penny; Karl J Friston

    2007-01-01

    Complex processes resulting from interaction of multiple elements can rarely be understood by analytical scientific approaches alone; additional, mathematical models of system dynamics are required. This insight, which disciplines like physics have embraced for a long time already, is gradually gaining importance in the study of cognitive processes by functional neuroimaging. In this field, causal mechanisms in neural systems are described in terms of effective connectivity. Recently, dynamic causal modelling (DCM) was introduced as a generic method to estimate effective connectivity from neuroimaging data in a Bayesian fashion. One of the key advantages of DCM over previous methods is that it distinguishes between neural state equations and modality-specific forward models that translate neural activity into a measured signal. Another strength is its natural relation to Bayesian model selection (BMS) procedures. In this article, we review the conceptual and mathematical basis of DCM and its implementation for functional magnetic resonance imaging data and event-related potentials. After introducing the application of BMS in the context of DCM, we conclude with an outlook to future extensions of DCM. These extensions are guided by the long-term goal of using dynamic system models for pharmacological and clinical applications, particularly with regard to synaptic plasticity.
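
    For orientation, the neural state equation commonly used in DCM for fMRI takes the bilinear form below (a standard textbook statement, not quoted from this record), with A encoding fixed connectivity, B^(j) the modulation of connections by input u_j, and C the direct driving inputs.

```latex
\dot{x} \;=\; \Bigl(A + \sum_{j=1}^{m} u_j\, B^{(j)}\Bigr)\, x \;+\; C\, u
```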

  7. Eysenck Psychobiological Personality Model: a projected into the future history

    OpenAIRE

    Schmidt, Vanina; Firpo, L; Vion, D.; Oliván, M E De Costa; Casella, L.; L Cuenya; G D Blum; Pedrón, V

    2010-01-01

    This article reviews the particular circumstances, the author, and the ideas that influenced the elaboration of one of the most solid personality models that psychology has to date: the Eysenck Personality Model. Its main characteristics are presented, which define it as a dispositional, dimensional, hierarchical and psychobiological model. The intention of improving the description, explanation and measurement of its dimensions led the author to propose changes to his original theory and instrument...

  9. Calculation and visualisation of future glacier extent in the Swiss Alps by means of hypsographic modelling

    Science.gov (United States)

    Paul, F.; Maisch, M.; Rothenbühler, C.; Hoelzle, M.; Haeberli, W.

    2007-02-01

    The observed rapid glacier wastage in the European Alps during the past 20 years already has strong impacts on the natural environment (rock fall, lake formation) as well as on human activities (tourism, hydro-power production, etc.) and poses several new challenges also for glacier monitoring. With a further increase of global mean temperature in the future, it is likely that Alpine glaciers and the high-mountain environment as an entire system will further develop into a state of imbalance. Hence, the assessment of future glacier geometries is a valuable prerequisite for various impact studies. In order to calculate and visualize in a consistent manner future glacier extent for a large number of individual glaciers (> 100) according to a given climate change scenario, we have developed an automated and simple but robust approach that is based on an empirical relationship between glacier size and the steady-state accumulation area ratio (AAR0) in the Alps. The model requires digital glacier outlines and a digital elevation model (DEM) only and calculates new glacier geometries from a given shift of the steady-state equilibrium line altitude (ELA0) by means of hypsographic modelling. We have calculated changes in number, area and volume for 3062 individual glacier units in Switzerland and applied six step changes in ELA0 (from +100 to +600 m) combined with four different values of the AAR0 (0.5, 0.6, 0.67, 0.75). For an AAR0 of 0.6 and an ELA0 rise of 200 m (400 m) we calculate a total area loss of -54% (-80%) and a corresponding volume loss of -50% (-78%) compared to the 1973 glacier extent. In combination with a geocoded satellite image, the future glacier outlines are also used for automated rendering of perspective visualisations. This is a very attractive tool for communicating research results to the general public. Our study is illustrated for a test site in the Upper Engadine (Switzerland), where landscape changes above timberline play an
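
    A minimal sketch of the hypsographic idea described above: raise the ELA by a prescribed amount, treat the glacier area remaining above the new ELA as accumulation area, and use the assumed AAR0 to fix the new total area, which is then trimmed from the lowest elevation bands. The function, variable names and example hypsometry are hypothetical and are not taken from the Swiss inventory used in the study.

```python
import numpy as np

def new_glacier_area(band_elev, band_area, ela0, d_ela, aar0=0.6):
    """Hypsographic estimate of a glacier's area after an ELA rise (illustrative).

    band_elev : elevation of each DEM band (m a.s.l.), ascending order
    band_area : glacier area in each band (km^2)
    ela0      : present steady-state equilibrium line altitude (m)
    d_ela     : prescribed ELA rise (m), e.g. +200
    aar0      : assumed steady-state accumulation area ratio
    """
    band_elev = np.asarray(band_elev, dtype=float)
    band_area = np.asarray(band_area, dtype=float)

    new_ela = ela0 + d_ela
    # Area remaining above the raised ELA becomes the new accumulation area.
    acc_area = band_area[band_elev >= new_ela].sum()
    # The AAR relation fixes the new total area; the glacier cannot grow here.
    total_new = min(acc_area / aar0, band_area.sum())

    # Retain area from the highest bands downwards, i.e. lose the lowest bands.
    keep = np.zeros_like(band_area)
    remaining = total_new
    for i in range(len(band_area) - 1, -1, -1):
        keep[i] = min(band_area[i], remaining)
        remaining -= keep[i]
    return total_new, keep

# Illustrative hypsometry (hypothetical values)
elev = np.arange(2400, 3601, 100)
area = np.array([0.2, 0.5, 0.9, 1.4, 1.8, 1.6, 1.2, 0.8, 0.5, 0.3, 0.2, 0.1, 0.05])
new_total, kept_bands = new_glacier_area(elev, area, ela0=2900, d_ela=200)
print(f"new total area: {new_total:.2f} km2 (of {area.sum():.2f} km2)")
```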

  10. How will Dhaka grow spatially in future?-Modelling its urban growth with a near-future planning scenario perspective

    Directory of Open Access Journals (Sweden)

    Sohel Ahmed

    2015-12-01

    Full Text Available Being the primate city of Bangladesh, higher population growth and inward migration from rural areas are causing Dhaka to experience an unprecedented level of urbanisation. This has two-fold implications: pushing it high up the mega-city size ladder while also confronting planners and city managers with more complex spatial and socio-economic challenges in dealing with the rapidly expanding urban footprint. Updating the knowledge and evidence base of Dhaka’s urban growth dynamics becomes increasingly crucial for the better functioning of its strategic urban planning and management. Therefore, this research seeks to broaden our understanding of the spatial urban growth patterns and processes of Dhaka over the period 1988–2005. A hybrid spatial modelling framework, incorporating statistical models (in the form of a weight-of-evidence approach) along with cellular automata functions, has therefore been used to comprehend the dynamics of rapid urban growth from 1988 to 2005. As expected, the local version of the transition probabilities (where Dhaka was divided into 18 Spatial Planning Zones) produced improved results compared with the global version (i.e. the whole of the Dhaka metropolitan area). The modelling framework has further been tested as a planner’s ‘what-if’ simulation box to generate a near-future scenario using a future policy dataset. It appears to have sufficient experimental potential to support more extensive spatio-temporal land-use modelling even in sparse data environments such as Dhaka.
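
    The following toy sketch illustrates the general idea of coupling weight-of-evidence transition probabilities with a cellular-automaton neighbourhood rule; it is not the actual framework or calibration used for Dhaka, and all layers, weights and thresholds are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def woe_probability(evidence_layers, weights):
    """Combine weight-of-evidence scores into a transition probability map.

    evidence_layers : list of binary 2-D arrays (e.g. near road, near built-up)
    weights         : positive/negative evidence weights for each layer
    """
    logit = np.zeros(evidence_layers[0].shape)
    for layer, w in zip(evidence_layers, weights):
        logit += np.where(layer, w, 0.0)
    return 1.0 / (1.0 + np.exp(-logit))          # logistic link

def ca_step(urban, prob, threshold=0.5):
    """One cellular-automaton step: a non-urban cell urbanises when its
    transition probability, scaled by the share of urban neighbours,
    exceeds a (made-up) threshold."""
    u = urban.astype(int)
    padded = np.pad(u, 1)
    neigh = sum(padded[1 + dy:1 + dy + u.shape[0], 1 + dx:1 + dx + u.shape[1]]
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)) - u
    neigh_share = neigh / 8.0
    grow = (u == 0) & (prob * neigh_share > threshold * 0.5)
    return (u == 1) | grow

# Tiny synthetic example (hypothetical layers, not the Dhaka data)
urban = rng.random((50, 50)) < 0.05
roads = rng.random((50, 50)) < 0.30
prob = woe_probability([roads, urban], weights=[1.2, 0.8])
urban_next = ca_step(urban, prob)
print(urban.sum(), "->", urban_next.sum())
```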

  11. The Future Role of Information Technology in Erosion Modelling

    Science.gov (United States)

    Natural resources management and decision-making is a complex process requiring cooperation and communication among federal, state, and local stakeholders balancing biophysical and socio-economic concerns. Predicting soil erosion is common practice in natural resource management for assessing the e...

  12. An independent verification and validation of the Future Theater Level Model conceptual model

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.

    1994-08-01

    This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses. Included were detailed examination of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not been previously reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will be using the recommendations from this study in determining whether to proceed with development of the model.

  13. Computerized models : tools for assessing the future of complex systems?

    NARCIS (Netherlands)

    Ittersum, van M.K.; Sterk, B.

    2015-01-01

    Models are commonly used to make decisions. At some point all of us will have employed a mental model, that is, a simplification of reality, in an everyday situation. For instance, when we want to make the best decision for the environment and consider whether to buy our vegetables in a large

  14. Eysenck Psychobiological Personality Model: a projected into the future history

    Directory of Open Access Journals (Sweden)

    Vanina Schmidt

    2010-07-01

    Full Text Available This article reviews the particular circumstances, the author, and the ideas that influenced the elaboration of one of the most solid personality models that psychology has to date: the Eysenck Personality Model. Its main characteristics are presented, which define it as a dispositional, dimensional, hierarchical and psychobiological model. The intention of improving the description, explanation and measurement of its dimensions led the author to propose changes to his original theory and instrument. Hence, different periods of this model are analyzed. In spite of the proliferation of personality theories, Eysenck's model has an empirical validity that only a few others have. Thus, we argue that Personality Psychology has available a background which represents the paradigm into which we will probably be moving in the coming years

  15. Pharmacovigilance and Biomedical Informatics: A Model for Future Development.

    Science.gov (United States)

    Beninger, Paul; Ibara, Michael A

    2016-12-01

    The discipline of pharmacovigilance is rooted in the aftermath of the thalidomide tragedy of 1961. It has evolved as a result of collaborative efforts by many individuals and organizations, including physicians, patients, Health Authorities, universities, industry, the World Health Organization, the Council for International Organizations of Medical Sciences, and the International Conference on Harmonisation. Biomedical informatics is rooted in technologically based methodologies and has evolved at the speed of computer technology. The purpose of this review is to bring a novel lens to pharmacovigilance, looking at the evolution and development of the field of pharmacovigilance from the perspective of biomedical informatics, with the explicit goal of providing a foundation for discussion of the future direction of pharmacovigilance as a discipline. For this review, we searched [publication trend for the log10 value of the numbers of publications identified in PubMed] using the key words [informatics (INF), pharmacovigilance (PV), pharmacovigilance + informatics (PV + INF)], for [study types] articles published between [1994-2015]. We manually searched the reference lists of identified articles for additional information. Biomedical informatics has made significant contributions to the infrastructural development of pharmacovigilance. However, there has not otherwise been a systematic assessment of the role of biomedical informatics in enhancing the field of pharmacovigilance, and there has been little cross-discipline scholarship. Rapidly developing innovations in biomedical informatics pose a challenge to pharmacovigilance in finding ways to include new sources of safety information, including social media, massively linked databases, and mobile and wearable wellness applications and sensors. With biomedical informatics as a lens, it is evident that certain aspects of pharmacovigilance are evolving more slowly. However, the high levels of mutual interest in

  16. Modelling Alzheimer’s disease: from past to future

    Directory of Open Access Journals (Sweden)

    Claudia Saraceno

    2013-06-01

    Full Text Available Alzheimer’s disease (AD) is emerging as the most prevalent and socially disruptive illness of aging populations, as more people live long enough to become affected. Although AD is placing a considerable and increasing burden on society, it represents the largest unmet medical need in neurology, because current drugs improve symptoms but do not have profound disease-modifying effects. Although AD pathogenesis is multifaceted and difficult to pinpoint, genetic and cell biological studies led to the amyloid hypothesis, which posits that Aβ plays a pivotal role in AD pathogenesis. Amyloid precursor protein (APP), as well as β- and γ-secretases, are the principal players involved in Aβ production, while α-secretase cleavage of APP prevents Aβ deposition. The association of early-onset familial AD with mutations in APP and the γ-secretase components provided a potential tool for generating animal models of the disease. However, a model that recapitulates all the aspects of AD has not yet been produced. Here, we address the problem of modelling AD pathology by describing several models which have played a major role in defining critical disease-related mechanisms and in exploring novel potential therapeutic approaches. In particular, we provide an extensive overview of the distinct features and pros and cons of different AD models, ranging from invertebrate to rodent models and finally dealing with computational models and induced pluripotent stem cells.

  17. The past and future of modeling forest dynamics: from growth and yield curves to forest landscape models

    Science.gov (United States)

    Stephen R. Shifley; Hong S. He; Heike Lischke; Wen J. Wang; Wenchi Jin; Eric J. Gustafson; Jonathan R. Thompson; Frank R. Thompson; William D. Dijak; Jian Yang

    2017-01-01

    Context. Quantitative models of forest dynamics have followed a progression toward methods with increased detail, complexity, and spatial extent. Objectives. We highlight milestones in the development of forest dynamics models and identify future research and application opportunities. Methods. We reviewed...

  18. Modeling the Current and Future Distribution of Treeline Species in the Nepal Himalaya

    Science.gov (United States)

    Chhetri, P. K.; Cairns, D. M.

    2015-12-01

    Knowledge of the current distribution of treeline species is important for predicting their future distribution in the landscape. Many studies have indicated that treeline will advance with climate change. Treeline advance will result in a loss of alpine biodiversity because the advancing treeline will fragment alpine ecosystems. A species distribution modeling approach using predicted climate data can increase our understanding of how treeline species will expand their range in the future. We used the Maxent model to predict the current and future distributions of three dominant treeline species, Abies spectabilis, Betula utilis, and Pinus wallichiana, of the Nepal Himalaya. The Maxent model predicted that the distribution of treeline species will change significantly under future climate change scenarios. The range of these treeline species will expand northward or upslope in response to future climate change. The model also indicated that temperature-related climatic variables are the most important determinants of the distribution of treeline species.
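
    Once suitability surfaces for current and future climate have been produced by a Maxent-type model, range expansion and contraction can be summarised by thresholding and differencing the two rasters, roughly as sketched below. The rasters and the presence threshold here are synthetic assumptions, not the study's data or its Maxent output.

```python
import numpy as np

# Hypothetical suitability rasters (0-1) from a Maxent-style model: one for
# present climate and one for a future scenario; values are synthetic.
rng = np.random.default_rng(7)
suit_now = rng.random((200, 200))
suit_fut = np.clip(suit_now + 0.1 * rng.normal(size=(200, 200)) + 0.05, 0, 1)

threshold = 0.7                      # assumed presence threshold
now = suit_now >= threshold
fut = suit_fut >= threshold

gained = fut & ~now                  # newly suitable cells (e.g. upslope/northward)
lost = now & ~fut                    # cells lost from the current range
print(f"current cells: {now.sum()}, future: {fut.sum()}, "
      f"gained: {gained.sum()}, lost: {lost.sum()}")
```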

  19. Numerical modelling of present and future hydrology at Laxemar- Simpevarp

    Energy Technology Data Exchange (ETDEWEB)

    Sassner, Mona; Sabel, Ulrika (DHI Sverige AB (Sweden)); Bosson, Emma; Berglund, Sten (Svensk Kaernbraenslehantering AB (Sweden))

    2011-04-15

    The Swedish Nuclear Fuel and Waste Management Company (SKB) has performed site investigations at two potential sites for a final repository for spent nuclear fuel. This report presents results of water flow modelling of the Laxemar area. The modelling reported in this document is focused on the near-surface groundwater, i.e. groundwater in Quaternary deposits and shallow rock, and surface water systems, and was performed using the MIKE SHE tool. The main objective of the modelling was to provide input to the radionuclide transport and dose calculations that were carried out as a part of the comparison between the Laxemar and Forsmark sites

  20. Past and present of analogue modelling, and its future trend

    Science.gov (United States)

    Koyi, Hemin

    2015-04-01

    Since Hall (1815) published his article on modelling, analogue modelling has expanded to simulate both a wider range of tectonic regimes and more challenging set-ups, and has become an integrated part of the fields of tectonics and structural geology. The establishment of new laboratories testifies to the increased attention the technique receives. The ties between modellers and field geoscientists have become stronger, with the focus being on understanding the parameters that govern the evolution of a tectonic regime and the processes that dominate it. Since the first sand castle was built with damp sand on a beach, sand has proven to be an appropriate material analogue. Even though granular materials are the most widely used analogue materials, new materials are also (re)introduced as rock analogues. Emphasis has been on more precise measurements of the mechanical properties of the materials and on minimizing the preparation effects, which have a great impact on scaling, interpretations and benchmarking. The analytical techniques used to quantify model results have also seen a great deal of improvement. In addition to X-ray tomography used to visualise internal structures of models, new techniques (e.g. PIV, high-resolution laser scanning, and interferometry) have enabled monitoring of kinematics with higher precision. Benchmarking exercises have given modelling an additional checking tool by outlining, in addition to the rheology of the modelling materials, the impact of different preparation approaches, the effect of boundary conditions, and the human factor on model results. However, despite the different approaches and deformation rigs, results of models from different tectonic laboratories have shown a great deal of similarity. Even with the introduction of more sophisticated numerical codes and usage of more powerful computers which enable the simulation of more challenging material properties and combinations of those, and 3D model set-up, analogue modelling

  1. Management and Service-aware Networking Architectures (MANA) for Future Internet - Position Paper: System Functions, Capabilities and Requirements

    NARCIS (Netherlands)

    Galis, A.; Abramowicz, H.; Brunner, M.; Raz, D.; Chemouil, P.; Pras, Aiko

    2009-01-01

    Future Internet (FI) research and development threads have recently been gaining momentum all over the world and as such the international race to create a new generation Internet is in full swing: GENI, Asia Future Internet, Future Internet Forum Korea, European Union Future Internet Assembly (FIA)

  3. Mathematical modelling of anaerobic digestion processes: applications and future needs

    DEFF Research Database (Denmark)

    Batstone, Damien J.; Puyol, Daniel; Flores Alsina, Xavier

    2015-01-01

    Anaerobic process modelling is a mature and well-established field, largely guided by a mechanistic model structure that is defined by our understanding of underlying processes. This led to publication of the IWA ADM1, and strong supporting, analytical, and extension research in the 15 years since ... of the role of the central carbon catabolic metabolism in anaerobic digestion, with an increased importance of phosphorous, sulfur, and metals as electron source and sink, and consideration of hydrogen and methane as potential electron sources. The paradigm of anaerobic digestion is challenged by anoxygenic phototrophism, where energy is relatively cheap, but electron transfer is expensive. These new processes are commonly not compatible with the existing structure of anaerobic digestion models. These core issues extend to application of anaerobic digestion in domestic plant-wide modelling, with the need ...

  4. The Future of Food Demand: Understanding Differences in Global Economic Models

    Energy Technology Data Exchange (ETDEWEB)

    Valin, Hugo; Sands, Ronald; van der Mensbrugghe, Dominique; Nelson, Gerald; Ahammad, Helal; Blanc, Elodie; Bodirsky, Benjamin; Fujimori, Shinichiro; Hasegawa, Tomoko; Havlik, Petr; Heyhoe, Edwina; Kyle, G. Page; Mason d' Croz, Daniel; Paltsev, S.; Rolinski, Susanne; Tabeau, Andrzej; van Meijl, Hans; von Lampe, Martin; Willenbockel, Dirk

    2014-01-01

    Understanding the capacity of agricultural systems to feed the world population under climate change requires a good prospective vision of the future development of food demand. This paper reviews modeling approaches from ten global economic models participating in the AgMIP project, in particular the demand functions chosen and the sets of parameters used. We compare food demand projections at the 2050 horizon for various regions and agricultural products under harmonized scenarios. Depending on the model, we find for a business-as-usual scenario (SSP2) an increase in food demand of 59-98% by 2050, slightly higher than the FAO projection (54%). The outlook for animal calories is particularly uncertain, with a range of 61-144%, whereas the FAO anticipates an increase of 76%. The projections prove more sensitive to socio-economic assumptions than to climate change conditions or bioenergy development. When considering a higher-population, lower-economic-growth world (SSP3), consumption per capita drops by 9% for crops and 18% for livestock. The various assumptions on climate change in this exercise do not lead to world calorie losses greater than 6%. Divergences across models are nevertheless notable, due to differences in the demand systems, income elasticity specifications, and responses to price change in the baseline.

  5. Acidification in Three Lake District Tarns: Historical long term trends and modelled future behaviour under changing sulphate and nitrate deposition

    Directory of Open Access Journals (Sweden)

    P. G. Whitehead

    1997-01-01

    Full Text Available Three upland Lake District tarns, Scoat, Greendale and Burnmoor, have been evaluated using MAGIC (Model of Acidification of Groundwater In Catchments) to reconstruct past, present and future chemical behaviour. The modelled historical changes in acidity are compared with palaeoecological estimations of pH to demonstrate model validity. Chemistry is simulated for all anions and cations, and two of the three lakes are shown to have undergone significant acidification. The effects of changing atmospheric pollution levels on lake chemistry are evaluated, and sulphur reductions of 80-90% are shown to be required to achieve zero alkalinity. The impacts of increased nitrogen deposition are assessed and are shown to further delay reversibility.

  6. Assessing "dangerous climate change": required reduction of carbon emissions to protect young people, future generations and nature.

    Directory of Open Access Journals (Sweden)

    James Hansen

    Full Text Available We assess climate impacts of global warming using ongoing observations and paleoclimate data. We use Earth's measured energy imbalance, paleoclimate data, and simple representations of the global carbon cycle and temperature to define emission reductions needed to stabilize climate and avoid potentially disastrous impacts on today's young people, future generations, and nature. A cumulative industrial-era limit of ∼500 GtC fossil fuel emissions and 100 GtC storage in the biosphere and soil would keep climate close to the Holocene range to which humanity and other species are adapted. Cumulative emissions of ∼1000 GtC, sometimes associated with 2°C global warming, would spur "slow" feedbacks and eventual warming of 3-4°C with disastrous consequences. Rapid emissions reduction is required to restore Earth's energy balance and avoid ocean heat uptake that would practically guarantee irreversible effects. Continuation of high fossil fuel emissions, given current knowledge of the consequences, would be an act of extraordinary witting intergenerational injustice. Responsible policymaking requires a rising price on carbon emissions that would preclude emissions from most remaining coal and unconventional fossil fuels and phase down emissions from conventional fossil fuels.

  7. Assessing 'Dangerous Climate Change': Required Reduction of Carbon Emissions to Protect Young People, Future Generations and Nature

    Science.gov (United States)

    Hansen, James; Kharecha, Pushker; Sato, Makiko; Masson-Delmotte, Valerie; Ackerman, Frank; Beerling, David J.; Hearty, Paul J.; Hoegh-Guldberg, Ove; Hsu, Shi-Ling; Parmesan, Camille; et al.

    2013-01-01

    We assess climate impacts of global warming using ongoing observations and paleoclimate data. We use Earth's measured energy imbalance, paleoclimate data, and simple representations of the global carbon cycle and temperature to define emission reductions needed to stabilize climate and avoid potentially disastrous impacts on today's young people, future generations, and nature. A cumulative industrial-era limit of approx. 500 GtC fossil fuel emissions and 100 GtC storage in the biosphere and soil would keep climate close to the Holocene range to which humanity and other species are adapted. Cumulative emissions of approx. 1000 GtC, sometimes associated with 2°C global warming, would spur "slow" feedbacks and eventual warming of 3-4°C with disastrous consequences. Rapid emissions reduction is required to restore Earth's energy balance and avoid ocean heat uptake that would practically guarantee irreversible effects. Continuation of high fossil fuel emissions, given current knowledge of the consequences, would be an act of extraordinary witting intergenerational injustice. Responsible policymaking requires a rising price on carbon emissions that would preclude emissions from most remaining coal and unconventional fossil fuels and phase down emissions from conventional fossil fuels.

  8. Understanding Resilient Urban Futures: A Systemic Modelling Approach

    Directory of Open Access Journals (Sweden)

    Ralph Chapman

    2013-07-01

    Full Text Available The resilience of cities in response to natural disasters and long-term climate change has emerged as a focus of academic and policy attention. In particular, how to understand the interconnectedness of urban and natural systems is a key issue. This paper introduces an urban model that can be used to evaluate city resilience outcomes under different policy scenarios. The model is the Wellington Integrated Land Use-Transport-Environment Model (WILUTE). It considers the city (i.e., Wellington) as a complex system characterized by interactions between a variety of internal urban processes (social, economic and physical) and the natural environment. It is focused on exploring the dynamic relations between human activities (the geographic distribution of housing and employment, infrastructure layout, traffic flows and energy consumption), environmental effects (carbon emissions, influences on local natural and ecological systems) and potential natural disasters (e.g., inundation due to sea level rise and storm events) faced under different policy scenarios. The model gives insights that are potentially useful for policy to enhance the city’s resilience, by modelling outcomes such as the potential for reduction in transportation energy use, and changes in the vulnerability of the city’s housing stock and transport system to sea level rise.

  9. A case study in modeling company policy documents as a source of requirements

    Energy Technology Data Exchange (ETDEWEB)

    Crumpton, Kathleen Marie; Gonzales, Regina M.; Trauth, Sharon L.

    2000-04-11

    This paper describes an approach that was developed to produce structured models that graphically reflect the requirements contained within a text document. The document used in this research is a draft policy document governing business in a research and development environment. In this paper, the authors present a basic understanding of why this approach is needed, the techniques developed, lessons learned during modeling and analysis, and recommendations for future investigation. The modeling method applied on the policy document was developed as an extension to entity relationship (ER) diagrams, which built in some structural information typically associated with object-oriented techniques. This approach afforded some structure as an analysis tool, while remaining flexible enough to be used with the text document. It provided a visual representation that allowed further analysis and layering of the model to be done.

  10. Physically unclonable functions (PUFs) applications, models, and future directions

    CERN Document Server

    Wachsmann, Christian

    2014-01-01

    Today, embedded systems are used in many security-critical applications, from access control, electronic tickets, sensors, and smart devices (e.g., wearables) to automotive applications and critical infrastructures. These systems are increasingly used to produce and process both security-critical and privacy-sensitive data, which bear many security and privacy risks. Establishing trust in the underlying devices and making them resistant to software and hardware attacks is a fundamental requirement in many applications and a challenging, yet unsolved, task. Solutions solely based on software ca

  11. Future aerosol emissions: a multi-model comparison

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Steven J.; Rao, Shilpa; Riahi, Keywan; van Vuuren, Detlef P.; Calvin, Katherine V.; Kyle, Page

    2016-08-02

    This paper compares projections over the 21st century of SO2, BC, and OC emissions from three technologically detailed, long-term integrated assessment models. The character of the projections and the response of emissions due to a comprehensive climate policy are discussed. In a continuation of historical experience, aerosol and precursor emissions are increasingly decoupled from carbon dioxide emissions over the 21st century. Implementation of a comprehensive climate policy further reduces emissions, although there is significant variation in this response by sector and by model. Differences in model responses can be traced to specific characteristics of reference case end-use and supply-side technology deployment and emissions control assumptions, which are detailed by sector.

  12. Value of the distant future: Model-independent results

    Science.gov (United States)

    Katz, Yuri A.

    2017-01-01

    This paper shows that the model-independent account of correlations in an interest rate process or a log-consumption growth process leads to declining long-term tails of discount curves. Under the assumption of an exponentially decaying memory in fluctuations of risk-free real interest rates, I derive the analytical expression for an apt value of the long run discount factor and provide a detailed comparison of the obtained result with the outcome of the benchmark risk-free interest rate models. Utilizing the standard consumption-based model with an isoelastic power utility of the representative economic agent, I derive the non-Markovian generalization of the Ramsey discounting formula. The obtained analytical results, which allow simple calibration, may augment rigorous cost-benefit and regulatory impact analyses of long-term environmental and infrastructure projects.
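
    For orientation, the classical consumption-based discounting rule that such work generalises is the extended Ramsey formula below (standard form with the precautionary term for isoelastic power utility; the paper's non-Markovian extension, in which persistent shocks make the effective rate decline with horizon, is not reproduced here):

```latex
\[
  r \;=\; \delta \;+\; \eta\, g \;-\; \tfrac{1}{2}\,\eta^{2}\sigma^{2}
\]
% \delta    : pure rate of time preference
% \eta      : elasticity of marginal utility (isoelastic power utility)
% g         : expected growth rate of per-capita consumption
% \sigma^2  : variance of consumption growth (precautionary term)
```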

  13. Programs Model the Future of Air Traffic Management

    Science.gov (United States)

    2010-01-01

    Through Small Business Innovation Research (SBIR) contracts with Ames Research Center, Intelligent Automation Inc., based in Rockville, Maryland, advanced specialized software the company had begun developing with U.S. Department of Defense funding. The agent-based infrastructure now allows NASA's Airspace Concept Evaluation System to explore ways of improving the utilization of the National Airspace System (NAS), providing flexible modeling of every part of the NAS down to individual planes, airports, control centers, and even weather. The software has been licensed to a number of aerospace and robotics customers, and has even been used to model the behavior of crowds.

  14. SMV model-based safety analysis of software requirements

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Kwang Yong [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Seong, Poong Hyun [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of)], E-mail: phseong@kaist.ac.kr

    2009-02-15

    Fault tree analysis (FTA) is one of the most frequently applied safety analysis techniques when developing safety-critical industrial systems such as software-based emergency shutdown systems of nuclear power plants and has been used for safety analysis of software requirements in the nuclear industry. However, the conventional method for safety analysis of software requirements has several problems in terms of correctness and efficiency; the fault tree generated from natural language specifications may contain flaws or errors while the manual work of safety verification is very labor-intensive and time-consuming. In this paper, we propose a new approach to resolve problems of the conventional method; we generate a fault tree from a symbolic model verifier (SMV) model, not from natural language specifications, and verify safety properties automatically, not manually, by a model checker SMV. To demonstrate the feasibility of this approach, we applied it to shutdown system 2 (SDS2) of Wolsong nuclear power plant (NPP). In spite of subtle ambiguities present in the approach, the results of this case study demonstrate its overall feasibility and effectiveness.

  15. A Statewide Writing Assessment Model: Student Proficiency and Future Implications

    Science.gov (United States)

    Dappen, Leon; Isernhagen, Jody; Anderson, Sue

    2008-01-01

    This paper is an examination of statewide district writing achievement gain data from the Nebraska Statewide Writing Assessment system and implications for statewide assessment writing models. The writing assessment program is used to gain compliance with the United States No Child Left Behind Law (NCLB), a federal effort to influence school…

  16. Dental Hygiene Curriculum Model for Transition to Future Roles.

    Science.gov (United States)

    Paarmann, Carlene S.; And Others

    1990-01-01

    The establishment of the baccalaureate degree as the minimum entry level for dental hygiene practice centers around three main concerns: changes in health care delivery, awarding of a degree commensurate with students' educational background, and the credibility of dental hygiene as a profession. A curriculum model is discussed. (MLW)

  17. Towards modeling future energy infrastructures - the ELECTRA system engineering approach

    DEFF Research Database (Denmark)

    Uslar, Mathias; Heussen, Kai

    2016-01-01

    Within this contribution, we provide an overview based on previous work conducted in the ELECTRA project to come up with a consistent method for modeling the ELECTRA WoC approach according to the methods established with the M/490 mandate of the European Commission. We will motivate the use of th...

  18. Future aerosol emissions: a multi-model comparison

    NARCIS (Netherlands)

    Smith, Steven J.; Rao, Shilpa; Riahi, Keywan; van Vuuren, Detlef P.; Calvin, Katherine V.; Kyle, Page

    2016-01-01

    This paper compares projections over the twenty-first century of SO2, BC, and OC emissions from three technologically detailed, long-term integrated assessment models. The character of the projections and the response of emissions due to a comprehensive climate policy are discussed focusing on the

  20. Experimental development based on mapping rule between requirements analysis model and web framework specific design model.

    Science.gov (United States)

    Okuda, Hirotaka; Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    Model Driven Development is a promising approach to developing high quality software systems. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface prototype from the UML requirements analysis model, so that we can confirm the validity of input/output data for each page and of page transitions in the system by directly operating the prototype. We propose a mapping rule in which design information independent of each web application framework implementation is defined based on the requirements analysis model, so as to improve traceability from the validated requirements analysis model to the final product. This paper discusses the result of applying our method to the development of a Group Work Support System that is currently running in our department.

  1. Hydroclimatic modelling of local sea level rise and its projection in future

    Science.gov (United States)

    Naren, A.; Maity, Rajib

    2016-09-01

    Studies on sea level rise (SLR) in the context of climate change have gained importance in the recent past. Whereas there is some clear evidence of SLR at the global scale, its trend varies significantly from location to location. The role of different meteorological variables in sea level change (SLC) is explored. We hypothesise that the role of such variables varies from location to location, and that modelling of local SLC requires a proper identification of the specific role of individual factors. After identifying a group of local meteorological variables, Supervised Principal Component Analysis (SPCA) is used to develop a location-specific Combined Index (CI). The SPCA ensures that the developed CI possesses the highest possible association with the historical SLC at that location. Further, using the developed CI, an attempt is made to model the local sea level (LSL) variation in synchrony with the changing climate. The developed approach, termed a hydroclimatic semi-empirical approach, is found to have potential for modelling local SLC at different coastal locations. The validated hydroclimatic approach is used for future projection of SLC at those coastal locations until 2100 for different climate change scenarios, i.e. different Representative Concentration Pathways (RCPs). Future hydrometeorological variables are obtained from Global Climate Models (GCMs) for these scenarios, i.e. RCP2.6, RCP4.5 and RCP8.5. The effect of glacial isostatic readjustment (GIA) is not included in this study. However, if reliable information on GIA is available for a location, it can be arithmetically added to the final outcome of the proposed hydrometeorological approach.
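
    A rough sketch of a supervised-PCA style construction of a combined index in the spirit described above: retain the meteorological variables most strongly associated with observed local sea-level change and take the leading principal component of that subset as the index. The data, variable count and selection rule below are illustrative assumptions, not the study's procedure in detail.

```python
import numpy as np

def combined_index(X, y, n_keep=3):
    """Supervised-PCA style combined index (illustrative sketch).

    X : (n_samples, n_vars) matrix of local meteorological variables
    y : (n_samples,) observed local sea-level change
    n_keep : number of most strongly associated variables retained
    """
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    ys = (y - y.mean()) / y.std()

    # Step 1: rank variables by absolute correlation with the target.
    corr = Xs.T @ ys / len(ys)
    keep = np.argsort(np.abs(corr))[::-1][:n_keep]

    # Step 2: leading principal component of the retained variables.
    cov = np.cov(Xs[:, keep], rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    ci = Xs[:, keep] @ eigvec[:, np.argmax(eigval)]

    # Orient the index so that it correlates positively with the target.
    if np.corrcoef(ci, ys)[0, 1] < 0:
        ci = -ci
    return ci, keep

# Synthetic demonstration (hypothetical data, not the study's observations)
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 6))          # e.g. SST, SLP, wind, precipitation...
y = 0.8 * X[:, 0] - 0.5 * X[:, 2] + rng.normal(scale=0.3, size=120)
ci, kept = combined_index(X, y)
print("retained variable indices:", kept)
```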

  2. User requirements for hydrological models with remote sensing input

    Energy Technology Data Exchange (ETDEWEB)

    Kolberg, Sjur

    1997-10-01

    Monitoring the seasonal snow cover is important for several purposes. This report describes user requirements for hydrological models utilizing remotely sensed snow data. The information is mainly provided by operational users through a questionnaire. The report is primarily intended as a basis for other work packages within the Snow Tools project which aim at developing new remote sensing products for use in hydrological models. The HBV model is the only model mentioned by users in the questionnaire. It is widely used in Northern Scandinavia and Finland, in the fields of hydroelectric power production, flood forecasting and general monitoring of water resources. The current implementation of HBV is not based on remotely sensed data. Even the presently used HBV implementation may benefit from remotely sensed data. However, several improvements can be made to hydrological models to include remotely sensed snow data. Among these the most important are a distributed version, a more physical approach to the snow depletion curve, and a way to combine data from several sources. 1 ref.

  3. Life sciences research in space: The requirement for animal models

    Science.gov (United States)

    Fuller, C. A.; Philips, R. W.; Ballard, R. W.

    1987-01-01

    Use of animals in NASA space programs is reviewed. Animals are needed because life science experimentation frequently requires long-term controlled exposure to environments, statistical validation, invasive instrumentation or biological tissue sampling, tissue destruction, exposure to dangerous or unknown agents, or sacrifice of the subject. The availability and use of human subjects inflight is complicated by the multiple needs and demands upon crew time. Because only living organisms can sense, integrate and respond to the environment around them, the sole use of tissue culture and computer models is insufficient for understanding the influence of the space environment on intact organisms. Equipment for spaceborne experiments with animals is described.

  4. Prediction of lake surface temperature using the air2water model: guidelines, challenges, and future perspectives

    Directory of Open Access Journals (Sweden)

    Sebastiano Piccolroaz

    2016-04-01

    Full Text Available Water temperature plays a primary role in controlling a wide range of physical, geochemical and ecological processes in lakes, with considerable influences on lake water quality and ecosystem functioning. Being able to reliably predict water temperature is therefore a desired goal, which has stimulated the development of models of different types and complexity, ranging from simple regression-based models to more sophisticated process-based numerical models. However, both types of models suffer from some limitations: the former are not able to address some fundamental physical processes such as thermal stratification, while the latter generally require a large amount of input data, which are not always available. In this work, lake surface temperature is simulated by means of air2water, a hybrid physically-based/statistical model, which is able to provide a robust, predictive understanding of LST dynamics knowing air temperature only. This model showed performances comparable with those obtained by using process-based models (a root mean square error on the order of 1°C at daily scale), while retaining the simplicity and parsimony of regression-based models, thus making it a good candidate for long-term applications. The aim of the present work is to provide the reader with useful and practical guidelines for proper use of the air2water model and for critical analysis of results. Two case studies have been selected for the analysis: Lake Superior and Lake Erie. These are clear and emblematic examples of a deep and a shallow temperate lake characterized by markedly different thermal responses to external forcing, and are thus ideal for making the results of the analysis as general and comprehensive as possible. Particular attention is paid to assessing the influence of missing data on model performance, and to evaluating when an observed time series is sufficiently informative for proper model calibration or, conversely, data are too scarce thus
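
    As a purely illustrative stand-in for the hybrid model discussed above, the sketch below integrates a simple lumped ODE that relaxes lake surface temperature towards a value set by air temperature. The real air2water formulation and its calibrated parameters differ and should be taken from the cited work; all parameters and the forcing here are made up.

```python
import numpy as np

def simulate_lst(air_temp, a=(0.05, 0.02, 0.025), t0=4.0, dt=1.0):
    """Toy lumped model of lake surface temperature driven by air temperature.

    dT_w/dt = a1 + a2*T_a - a3*T_w   (a crude stand-in for air2water's
    heat-budget formulation; the parameters a are purely illustrative)
    """
    a1, a2, a3 = a
    tw = np.empty_like(air_temp, dtype=float)
    tw[0] = t0
    for i in range(1, len(air_temp)):
        tw[i] = tw[i - 1] + dt * (a1 + a2 * air_temp[i - 1] - a3 * tw[i - 1])
    return tw

# Synthetic daily air temperature for one year (hypothetical forcing)
days = np.arange(365)
ta = 10.0 + 12.0 * np.sin(2 * np.pi * (days - 110) / 365)
print(simulate_lst(ta)[:5])
```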

  5. How well can we forecast future model error and uncertainty by mining past model performance data

    Science.gov (United States)

    Solomatine, Dimitri

    2016-04-01

    ) method by Koenker and Bassett, in which linear regression is used to build predictive models for distribution quantiles [1]; (b) the UNEEC method [2,3,7], which takes into account the input variables influencing such uncertainty and uses more advanced (non-linear) machine learning methods (e.g. neural networks or the k-NN method); (c) the recent DUBRAE method (Dynamic Uncertainty Model By Regression on Absolute Error), an autoregressive model of model residuals which first corrects the model residual and then employs an autoregressive statistical model for uncertainty prediction [5]. 2. The data uncertainty (parametric and/or input) - in this case we study the propagation of uncertainty (presented typically probabilistically) from parameters or inputs to the model outputs. For real complex non-linear functions (models) implemented in software, various versions of Monte Carlo simulation are used: values of parameters or inputs are sampled from the assumed distributions and the model is run multiple times to generate multiple outputs. The data generated by Monte Carlo analysis can be used to build a machine learning model which will be able to make predictions of model uncertainty for the future. This method is named MLUE (Machine Learning for Uncertainty Estimation) and is covered in [4,6]. 3. Structural uncertainty stemming from inadequate model structure. The paper discusses the possibilities and experiences of building models able to forecast (rather than analyse) residual and parametric uncertainty of hydrological models. References [1] Koenker, R., and G. Bassett (1978). Regression quantiles. Econometrica, 46(1), 33-50, doi:10.2307/1913643. [2] D.L. Shrestha, D.P. Solomatine (2006). Machine learning approaches for estimation of prediction interval for the model output. Neural Networks J., 19(2), 225-235. [3] D.P. Solomatine, D.L. Shrestha (2009). A novel method to estimate model uncertainty using machine learning techniques. Water Resources Res. 45, W00B11. [4] D. L
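
    The quantile-regression idea in point (a) (and, loosely, the UNEEC idea of conditioning uncertainty on the model inputs) can be sketched with off-the-shelf gradient boosting using a quantile loss. The data, predictors and quantiles below are synthetic placeholders, not the methods of the cited papers.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)

# Hypothetical setup: inputs X that influence the hydrological model's error,
# and e, the historical model residuals (observed minus simulated flow).
X = rng.normal(size=(500, 3))
e = 0.5 * X[:, 0] + rng.normal(scale=0.2 + 0.3 * np.abs(X[:, 1]), size=500)

# Fit one model per quantile of the residual distribution, so that the
# predicted uncertainty depends on the input conditions.
bounds = {}
for q in (0.05, 0.95):
    gbr = GradientBoostingRegressor(loss="quantile", alpha=q, n_estimators=200)
    bounds[q] = gbr.fit(X, e)

X_new = rng.normal(size=(5, 3))
lower = bounds[0.05].predict(X_new)
upper = bounds[0.95].predict(X_new)
print(np.column_stack([lower, upper]))   # 90% prediction interval for the error
```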

  6. Research for Future Training Modeling and Simulation Strategies

    Science.gov (United States)

    2011-09-01

    critical new products. Although other portable audio devices existed when Apple introduced the iPod, its design and features were revolutionary and played... the iPod; however, Apple cleverly added another part of its business model following the introduction of the device. Within the Apple “closed system”... adaptive in meeting user needs by product and service transformations. Apple and IBM provide two well-known business cases to examine as

  7. Panel: The Future of Research in Modeling & Simulation

    Science.gov (United States)

    2014-12-01

    necessary to replicate a model (Yilmaz 2013). Scientific workflow systems (Anand et al. 2009; Oinn et al. 2004) and provenance-based tracking of... www.sciencemag.org/content/327/5962/144.1.full. Anand, M., S. Bowers, T. McPhillips, and R. Ludascher. 2009. “Exploring Scientific Workflow Provenance Using Hybrid Queries over Nested Data and Lineage Graphs.” In Proceedings of Scientific and Statistical Database Management, 237–254

  8. In Marriage of Model and Numerics, Glimpses of the Future

    Science.gov (United States)

    Nejadmalayeri, Alireza; Vasilyev, Oleg V.; Vezolainen, Alexei

    2012-11-01

    A newly defined concept of m-refinement (model-refinement), which provides two-way coupling of physical models and numerical methods, is employed to study the Reynolds scaling of SCALES with constant levels of fidelity. Within the context of wavelet-based methods, this new hybrid methodology provides a hierarchical space/time dynamically adaptive automatic smooth transition from resolving the Kolmogorov length-scale (WDNS) to decomposing deterministic-coherent/stochastic-incoherent modes (CVS) to capturing more/less energetic structures (SCALES). This variable fidelity turbulence modeling approach utilizes a unified single solver framework by means of a Lagrangian spatially varying thresholding technique. The fundamental findings of this computational complexity study are summarized as follows: 1) SCALES can achieve the objective of ``controlling the captured flow-physics as desired'' by profoundly small number of spatial modes; 2) Reynolds scaling of constant-dissipation SCALES is the same regardless of fidelity of the simulations; 3) the number of energy containing structures at a fixed level of resolved turbulent kinetic energy scales linearly with Re; and 4) the fractal dimension of coherent energy containing structures is close to unity. This work was supported by NSF under grant No. CBET-0756046.

  9. Modeling of Testability Requirement Based on Generalized Stochastic Petri Nets

    Institute of Scientific and Technical Information of China (English)

    SU Yong-ding; QIU Jing; LIU Guan-jun; QIAN Yan-ling

    2009-01-01

    Testability design is an effective way to realize fault detection and isolation. An important step is to determine testability figures of merit (TFOMs). Firstly, some factors influencing TFOMs are analyzed, such as the processes of system operation, maintenance and support, and fault detection and isolation. Secondly, a testability requirement analysis model is built based on a generalized stochastic Petri net (GSPN). Then, the system's reachable states are analyzed based on the model, a Markov chain isomorphic with the Petri net is constructed, a state transition matrix is created, and the system's steady-state probabilities are obtained. The relationship between the steady-state availability and the testability parameters can thereby be revealed and reasoned about. Finally, an example shows that the proposed method can determine TFOMs, such as the fault detection rate and fault isolation rate, effectively and reasonably.
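
    The final step described above, obtaining steady-state probabilities of the Markov chain derived from the GSPN and reading availability off them, amounts to solving pi Q = 0 with the probabilities summing to one. A minimal sketch follows, with a hypothetical three-state operate/detect/repair chain and made-up rates rather than the paper's model.

```python
import numpy as np

def steady_state(Q):
    """Steady-state distribution of a CTMC with generator matrix Q
    (rows sum to zero): solve pi @ Q = 0 with sum(pi) = 1."""
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])        # append the normalisation constraint
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

# Hypothetical states: 0 = operating, 1 = fault awaiting detection, 2 = under repair
lam, mu_d, mu_r = 0.01, 2.0, 0.5            # failure, detection, repair rates (made up)
Q = np.array([[-lam,   lam,   0.0],
              [ 0.0,  -mu_d,  mu_d],
              [ mu_r,   0.0, -mu_r]])
pi = steady_state(Q)
print("steady-state availability P[operating]:", pi[0])
```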

  10. Model Penentuan Nilai Target Functional Requirement Berbasis Utilitas [A Utility-Based Model for Determining Functional Requirement Target Values]

    Directory of Open Access Journals (Sweden)

    Cucuk Nur Rosyidi

    2012-01-01

    Full Text Available In a product design and development process, a designer faces the problem of deciding functional requirement (FR) target values. That decision is made under risk, since it is taken in the early design phase using incomplete information. A utility function can be used to reflect the decision maker's attitude towards risk when making such a decision. In this research, we develop a utility-based model to determine FR target values using a quadratic utility function and information from Quality Function Deployment (QFD). A pencil design is used as a numerical example, with a quadratic utility function for each FR. The model can be applied to balance customer and designer interests in determining FR target values.
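
    An illustrative form of the optimisation behind such a model: a quadratic single-attribute utility for each functional requirement, weighted by QFD-derived importances and maximised over the candidate target values. This is a generic sketch; the exact utility and weighting scheme of the paper are not reproduced here.

```latex
\[
  u_i(x_i) \;=\; a_i + b_i\,x_i - c_i\,x_i^{2}, \qquad
  \max_{x_1,\dots,x_n} \; U(\mathbf{x}) \;=\; \sum_{i=1}^{n} w_i\, u_i(x_i)
\]
% u_i : quadratic single-attribute utility of functional requirement i
% w_i : importance weight of FR i (e.g. derived from the QFD matrix)
% x_i : candidate target value of FR i
```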

  11. User Requirements from the Climate Modelling Community for Next Generation Global Products from Land Cover CCI Project

    Science.gov (United States)

    Kooistra, Lammert; van Groenestijn, Annemarie; Kalogirou, Vasileios; Arino, Olivier; Herold, Martin

    2011-01-01

    Land Cover has been selected as one of 11 Essential Climate Variables which will be elaborated during the first phase of the ESA Climate Change Initiative (2010- 2013). In the first stage of the Land Cover CCI project, an user requirements analysis has been carried out on the basis of which the detailed specifications of a global land cover product can be defined which match the requirements from the Global Climate Observing System (GCOS) and the climate modelling community. As part of the requirements analysis, an user consultation mechanism was set-up to actively involve different climate modelling groups by setting out surveys to different type of users within the climate modelling community and the broad land cover data user community. The evolution of requirements from current models to future new modelling approaches was specifically taken into account. In addition, requirements from the GCOS Implementation Plan 2004 and 2010 and associated strategic earth observation documents for land cover were assessed and a detailed literature review was carried out. The outcome of the user requirements assessment shows that although the range of requirements coming from the climate modelling community is broad, there is a good match among the requirements coming from different user groups and the broader requirements derived from GCOS, CMUG and other relevant international panels. More specific requirements highlight that future land cover datasets should be both stable and have a dynamic component; deal with the consistency in relationships between land cover classes and land surface parameters; should provide flexibility to serve different scales and purposes; and should provide transparency of product quality. As a next step within the Land Cover CCI project, the outcome of this user requirements analysis will be used as input for the product specification of the next generation Global Land Cover datasets.

  12. A Review of Decision Support Models for Global Distribution Network Design and Future Model development

    DEFF Research Database (Denmark)

    Reich, Juri; Kinra, Aseem; Kotzab, Herbert

    We look at the global distribution network design problem and the requirements to solve it. This problem typically involves conflicting goals and a magnitude of interdependent input factors, described by qualitative and quantitative information. Our literature review shows that current models do not offer a comprehensive method that is able to solve the problem in one single decision-making process considering all relevant goals and factors. Thus, we attempt to create such a model using existing methods as building blocks, namely mixed-integer linear programming and the analytical hierarchy process.

  13. Assessing dengue vaccination impact: Model challenges and future directions.

    Science.gov (United States)

    Recker, Mario; Vannice, Kirsten; Hombach, Joachim; Jit, Mark; Simmons, Cameron P

    2016-08-31

    In response to the sharp rise in the global burden caused by dengue virus (DENV) over the last few decades, the WHO has set out three specific key objectives in its disease control strategy: (i) to estimate the true burden of dengue by 2015; (ii) a reduction in dengue mortality by at least 50% by 2020 (used as a baseline); and (iii) a reduction in dengue morbidity by at least 25% by 2020. Although various elements will all play crucial parts in achieving this goal, from diagnosis and case management to integrated surveillance and outbreak response, sustainable vector control, vaccine implementation and finally operational and implementation research, it seems clear that new tools (e.g. a safe and effective vaccine and/or effective vector control) are key to success. The first dengue vaccine was licensed in December 2015, Dengvaxia® (CYD-TDV) developed by Sanofi Pasteur. The WHO has provided guidance on the use of CYD-TDV in endemic countries, for which there are a variety of considerations beyond the risk-benefit evaluation done by regulatory authorities, including public health impact and cost-effectiveness. Population-level vaccine impact and economic and financial aspects are two issues that can potentially be considered by means of mathematical modelling, especially for new products for which empirical data are still lacking. In December 2014 a meeting was convened by the WHO in order to revisit the current status of dengue transmission models and their utility for public health decision-making. Here, we report on the main points of discussion and the conclusions of this meeting, as well as next steps for maximising the use of mathematical models for vaccine decision-making. Copyright © 2016.

  14. Designing and modelling Havana’s future bus rapid transit

    OpenAIRE

    Warren, James; Ortegon-Sanchez, Adriana

    2015-01-01

    A single bus route in Havana’s bus system is modelled from the current position to a modernised bus rapid transit (BRT). The system is based on an expert-led visioning process and Cuba’s official planning documents, which define the high-level design criteria and their objectives. Building on the experiences of BRT systems that operate in other Latin American cities, a conceptual design for Havana’s BRT system is defined in terms of the key institutional, technical and financial frameworks, a...

  15. Past and ongoing shifts in Joshua tree distribution support future modeled range contraction

    Science.gov (United States)

    Cole, Kenneth L.; Ironside, Kirsten; Eischeid, Jon K.; Garfin, Gregg; Duffy, Phil; Toney, Chris

    2011-01-01

    The future distribution of the Joshua tree (Yucca brevifolia) is projected by combining a geostatistical analysis of 20th-century climates over its current range, future modeled climates, and paleoecological data showing its response to a past similar climate change. As climate rapidly warmed ~11,700 years ago, the range of Joshua tree contracted, leaving only the populations near what had been its northernmost limit. Its ability to spread northward into new suitable habitats after this time may have been inhibited by the somewhat earlier extinction of megafaunal dispersers, especially the Shasta ground sloth. We applied a model of climate suitability for Joshua tree, developed from its 20th-century range and climates, to future climates modeled through a set of six individual general circulation models (GCM) and one suite of 22 models for the late 21st century. All distribution data, observed climate data, and future GCM results were scaled to spatial grids of ~1 km and ~4 km in order to facilitate application within this topographically complex region. All of the models project the future elimination of Joshua tree throughout most of the southern portions of its current range. Although estimates of future monthly precipitation differ between the models, these changes are outweighed by large increases in temperature common to all the models. Only a few populations within the current range are predicted to be sustainable. Several models project significant potential future expansion into new areas beyond the current range, but the species' historical and current rates of dispersal would seem to prevent natural expansion into these new areas. Several areas are predicted to be potential sites for relocation/assisted migration. This project demonstrates how information from paleoecology and modern ecology can be integrated in order to understand ongoing processes and future distributions.

  16. Requirements for high level models supporting design space exploration in model-based systems engineering

    NARCIS (Netherlands)

    Haveman, Steven; Bonnema, Gerrit Maarten

    2013-01-01

    Most formal models are used in detailed design and focus on a single domain. Few approaches exist that can effectively tie these lower-level models to a high-level system model during design space exploration. This complicates the validation of high-level system requirements during

  17. A GENERALIZATION OF TRADITIONAL KANO MODEL FOR CUSTOMER REQUIREMENTS ANALYSIS

    Directory of Open Access Journals (Sweden)

    Renáta Turisová

    2015-07-01

    Full Text Available Purpose: The theory of attractiveness determines the relationship between the technically achieved and the customer-perceived quality of product attributes. The most frequently used approach in the theory of attractiveness is the implementation of Kano's model. There exist many generalizations of that model which take into consideration various aspects and approaches focused on understanding customer preferences and identifying their priorities for a product on sale. The aim of this article is to outline another possible generalization of Kano's model. Methodology/Approach: The traditional Kano model captures the nonlinear relationship between attained quality attributes and customer requirements. The individual quality attributes are divided into three main categories: must-be, one-dimensional and attractive quality, and into two side categories: indifferent and reverse quality. A well-selling product has to contain the must-be attributes. It should contain as many one-dimensional attributes as possible. If there are also supplementary attractive attributes, the attractiveness of the entire product, from the viewpoint of the customer, rises sharply and nonlinearly, which has a direct positive impact on the decision of a potential customer when purchasing the product. In this article, we show that the assignment of individual quality attributes of a product to the mentioned categories depends, among other things, on the life-cycle costs of the product and, respectively, on the price of the product on the market. Findings: In practice, we often encounter the classification of products into different price categories: lower, middle and upper class. For a certain type of product the category is either directly declared by the producer (especially in the automotive industry) or is determined by the customer by means of an assessment of available market prices. To each of those groups of products, different customer expectations can be assigned

  18. Stellar models: firm evidence, open questions and future developments

    CERN Document Server

    Cassisi, Santi

    2009-01-01

    During this last decade our knowledge of the evolutionary properties of stars has significantly improved. This result has been achieved thanks to our improved understanding of the physical behavior of stellar matter in the thermal regimes characteristic of the different stellar mass ranges and/or evolutionary stages. This notwithstanding, the current generation of stellar models is still affected by several, not negligible, uncertainties related to our poor knowledge of some thermodynamical processes and nuclear reaction rates, as well as the efficiency of mixing processes. These drawbacks have to be properly taken into account when comparing theory with observations, to derive evolutionary properties of both resolved and unresolved stellar populations. In this paper we review the major sources of uncertainty along the main evolutionary stages, and emphasize their impact on population synthesis techniques.

  19. A prospective overview of the essential requirements in molecular modeling for nanomedicine design.

    Science.gov (United States)

    Kumar, Pradeep; Khan, Riaz A; Choonara, Yahya E; Pillay, Viness

    2013-05-01

    Nanotechnology has presented many new challenges and opportunities in the area of nanomedicine design. The issues related to nanoconjugation, nanosystem-mediated targeted drug delivery, transitional stability of nanovehicles, the integrity of drug transport, drug-delivery mechanisms and chemical structural design require a pre-estimated and determined course of assumptive actions with property and characteristic estimations for optimal nanomedicine design. Molecular modeling in nanomedicine encompasses these pre-estimations and predictions of pertinent design data via interactive computographic software. Recently, an increasing amount of research has been reported where specialized software is being developed and employed in an attempt to bridge the gap between drug discovery, materials science and biology. This review provides an assimilative and concise incursion into the current and future strategies of molecular-modeling applications in nanomedicine design and aims to describe the utilization of molecular models and theoretical-chemistry computographic techniques for expansive nanomedicine design and development.

  20. On data requirements for calibration of integrated models for urban water systems.

    Science.gov (United States)

    Langeveld, Jeroen; Nopens, Ingmar; Schilperoort, Remy; Benedetti, Lorenzo; de Klein, Jeroen; Amerlinck, Youri; Weijers, Stefan

    2013-01-01

    Modeling of integrated urban water systems (IUWS) has seen rapid development in recent years. Models and software are available that describe the process dynamics in sewers, wastewater treatment plants (WWTPs), receiving water systems as well as at the interfaces between the submodels. Successful applications of integrated modeling are, however, relatively scarce. One of the reasons for this is the lack of high-quality monitoring data with the required spatial and temporal resolution and accuracy to calibrate and validate the integrated models, even though the state of the art of monitoring itself is no longer the limiting factor. This paper discusses the efforts needed to meet the data requirements associated with integrated modeling and describes the methods applied to validate the monitoring data and to use submodels as software sensors to provide the necessary input for other submodels. The main conclusion of the paper is that state-of-the-art monitoring is in principle sufficient to provide the data necessary to calibrate integrated models, but practical limitations resulting in incomplete data-sets hamper widespread application. In order to overcome these difficulties, the redundancy of future monitoring networks should be increased and, at the same time, data handling (including data validation, mining and assimilation) should receive much more attention.

  1. A framework for modeling anthropogenic impacts on waterbird habitats: addressing future uncertainty in conservation planning

    Science.gov (United States)

    Matchett, Elliott L.; Fleskes, Joseph P.; Young, Charles A.; Purkey, David R.

    2015-01-01

    The amount and quality of natural resources available for terrestrial and aquatic wildlife habitats are expected to decrease throughout the world in areas that are intensively managed for urban and agricultural uses. Changes in climate and management of increasingly limited water supplies may further impact water resources essential for sustaining habitats. In this report, we document adapting a Water Evaluation and Planning (WEAP) system model for the Central Valley of California. We demonstrate using this adapted model (WEAP-CVwh) to evaluate impacts produced from plausible future scenarios on agricultural and wetland habitats used by waterbirds and other wildlife. Processed output from WEAP-CVwh indicated varying levels of impact caused by projected climate, urbanization, and water supply management in scenarios used to exemplify this approach. Among scenarios, the NCAR-CCSM3 A2 climate projection had a greater impact than the CNRM-CM3 B1 climate projection, whereas expansive urbanization had a greater impact than strategic urbanization, on annual availability of waterbird habitat. Scenarios including extensive rice-idling or substantial instream flow requirements on important water supply sources produced large impacts on annual availability of waterbird habitat. In the year corresponding with the greatest habitat reduction for each scenario, the scenario including instream flow requirements resulted in the greatest decrease in habitats throughout all months of the wintering period relative to other scenarios. This approach provides a new and useful tool for habitat conservation planning in the Central Valley and a model to guide similar research investigations aiming to inform conservation, management, and restoration of important wildlife habitats.

  2. Assessing uncertainties in global cropland futures using a conditional probabilistic modelling framework

    NARCIS (Netherlands)

    Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D A; Brogaard, Sara; Van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut

    2016-01-01

    We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS (Lund-Potsd

  3. Ecological models for regulatory risk assessments of pesticides: Developing a strategy for the future.

    NARCIS (Netherlands)

    Thorbek, P.; Forbes, V.; Heimbach, F.; Hommen, U.; Thulke, H.H.; Brink, van den P.J.

    2010-01-01

    Ecological Models for Regulatory Risk Assessments of Pesticides: Developing a Strategy for the Future provides a coherent, science-based view on ecological modeling for regulatory risk assessments. It discusses the benefits of modeling in the context of registrations, identifies the obstacles that p

  4. Assessing uncertainties in global cropland futures using a conditional probabilistic modelling framework

    NARCIS (Netherlands)

    Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D A; Brogaard, Sara; Van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut

    2016-01-01

    We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS

  5. Blended learning in anesthesia education: current state and future model.

    Science.gov (United States)

    Kannan, Jaya; Kurup, Viji

    2012-12-01

    Educators in anesthesia residency programs across the country are facing a number of challenges as they attempt to integrate blended learning techniques into their curriculum. Compared with the rest of higher education, which has made advances to varying degrees in the adoption of online learning, anesthesiology education has been sporadic in its active integration of blended learning. The purpose of this review is to discuss the challenges in anesthesiology education and the relevance of the Universal Design for Learning framework in addressing them. There is a wide chasm between student demand for online education and the availability of trained faculty to teach. The design of the learning interface is important and will significantly affect the learning experience for the student. This review examines recent literature pertaining to this field, both in the realm of higher education in general and medical education in particular, and proposes the application of a comprehensive learning model that is new to anesthesiology education and relevant to its goals of promoting self-directed learning.

  6. Generic feature of future crossing of phantom divide in viable $f(R)$ gravity models

    CERN Document Server

    Bamba, Kazuharu; Lee, Chung-Chi

    2010-01-01

    We study the equation of state for dark energy and explicitly demonstrate that the future crossings of the phantom divide line $w_{\mathrm{DE}}=-1$ are the generic feature in the existing viable $f(R)$ gravity models. We also explore the future evolution of the cosmological horizon entropy and illustrate that the cosmological horizon entropy oscillates with time due to the oscillatory behavior of the Hubble parameter. The important cosmological consequence is that in the future, the sign of the time derivative of the Hubble parameter changes from negative to positive in these viable $f(R)$ gravity models.

  7. Generic feature of future crossing of phantom divide in viable f(R) gravity models

    Energy Technology Data Exchange (ETDEWEB)

    Bamba, Kazuharu; Geng, Chao-Qiang; Lee, Chung-Chi, E-mail: bamba@phys.nthu.edu.tw, E-mail: geng@phys.nthu.edu.tw, E-mail: g9522545@oz.nthu.edu.tw [Department of Physics, National Tsing Hua University, No. 101, Section 2, Kuang Fu Road, Hsinchu, Taiwan (China)

    2010-11-01

    We study the equation of state for dark energy and explicitly demonstrate that the future crossings of the phantom divide line w{sub DE} = −1 are the generic feature in the existing viable f(R) gravity models. We also explore the future evolution of the cosmological horizon entropy and illustrate that the cosmological horizon entropy oscillates with time due to the oscillatory behavior of the Hubble parameter. The important cosmological consequence is that in the future, the sign of the time derivative of the Hubble parameter changes from negative to positive in these viable f(R) gravity models.

  8. A Robust Statistical Model to Predict the Future Value of the Milk Production of Dairy Cows Using Herd Recording Data

    Science.gov (United States)

    Græsbøll, Kaare; Kirkeby, Carsten; Nielsen, Søren Saxmose; Halasa, Tariq; Toft, Nils; Christiansen, Lasse Engbo

    2017-01-01

    The future value of an individual dairy cow depends greatly on its projected milk yield. In developed countries with developed dairy industry infrastructures, facilities exist to record individual cow production and reproduction outcomes consistently and accurately. Accurate prediction of the future value of a dairy cow requires further detailed knowledge of the costs associated with feed, management practices, production systems, and disease. Here, we present a method to predict the future value of the milk production of a dairy cow based on herd recording data only. The method consists of several steps to evaluate lifetime milk production and individual cow somatic cell counts and to finally predict the average production for each day that the cow is alive. Herd recording data from 610 Danish Holstein herds were used to train and test a model predicting milk production (including factors associated with milk yield, somatic cell count, and the survival of individual cows). All estimated parameters were either herd- or cow-specific. The model prediction deviated, on average, less than 0.5 kg from the future average milk production of dairy cows in multiple herds after adjusting for the effect of somatic cell count. We conclude that estimates of future average production can be used on a day-to-day basis to rank cows for culling, or can be implemented in simulation models of within-herd disease spread to make operational decisions, such as culling versus treatment. An advantage of the approach presented in this paper is that it requires no specific knowledge of disease status or any other information beyond herd recorded milk yields, somatic cell counts, and reproductive status. PMID:28261585

  9. Evaluation of Stochastic Rainfall Models in Capturing Climate Variability for Future Drought and Flood Risk Assessment

    Science.gov (United States)

    Chowdhury, A. F. M. K.; Lockart, N.; Willgoose, G. R.; Kuczera, G. A.; Kiem, A.; Nadeeka, P. M.

    2016-12-01

    One of the key objectives of stochastic rainfall modelling is to capture the full variability of the climate system for future drought and flood risk assessment. However, it is not clear how well these models can capture future climate variability when they are calibrated to Global/Regional Climate Model (GCM/RCM) data, as these datasets are usually available only for very short future periods (e.g. 20 years). This study has assessed the ability of two stochastic daily rainfall models to capture climate variability by calibrating them to a dynamically downscaled RCM dataset in an east Australian catchment for the 1990-2010, 2020-2040, and 2060-2080 epochs. The two stochastic models are: (1) a hierarchical Markov Chain (MC) model, which we developed in a previous study, and (2) a semi-parametric MC model developed by Mehrotra and Sharma (2007). Our hierarchical model uses stochastic parameters of the MC and a Gamma distribution, while the semi-parametric model uses a modified MC process with memory of past periods and kernel density estimation. This study has generated multiple realizations of rainfall series by using the parameters of each model calibrated to the RCM dataset for each epoch. The generated rainfall series are used to generate synthetic streamflow using the SimHyd hydrology model. Assessing the synthetic rainfall and streamflow series, this study has found that both stochastic models can incorporate a range of variability in rainfall as well as streamflow generation for both current and future periods. However, the hierarchical model tends to overestimate the multiyear variability of wet spell lengths (and is therefore less likely to simulate long periods of drought and flood), while the semi-parametric model tends to overestimate the mean annual rainfall depths and streamflow volumes (hence, simulated droughts are likely to be less severe). The implications of these limitations of both stochastic models for future drought and flood risk assessment will be discussed.
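
    To make the occurrence/amount structure of such daily rainfall generators concrete, the sketch below simulates a first-order two-state Markov chain for wet/dry occurrence with Gamma-distributed wet-day amounts. This is a generic illustration of the model class named in the abstract, not the calibrated hierarchical or semi-parametric models from the study; all parameter values are illustrative placeholders.

```python
import numpy as np

def generate_daily_rainfall(n_days, p_wd=0.30, p_ww=0.65,
                            gamma_shape=0.8, gamma_scale=6.0, seed=0):
    """Occurrence/amount daily rainfall generator.

    Occurrence follows a first-order two-state Markov chain
    (p_wd: P(wet | previous day dry), p_ww: P(wet | previous day wet));
    wet-day amounts are drawn from a Gamma distribution.
    All parameter values here are illustrative, not calibrated.
    """
    rng = np.random.default_rng(seed)
    rain = np.zeros(n_days)
    wet = False
    for t in range(n_days):
        p_wet = p_ww if wet else p_wd
        wet = rng.random() < p_wet
        if wet:
            rain[t] = rng.gamma(gamma_shape, gamma_scale)
    return rain

series = generate_daily_rainfall(365 * 30)
print(f"mean annual rainfall: {series.sum() / 30:.1f} mm")
print(f"wet-day fraction: {(series > 0).mean():.2f}")
```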

  10. A copula-multifractal volatility hedging model for CSI 300 index futures

    Science.gov (United States)

    Wei, Yu; Wang, Yudong; Huang, Dengshi

    2011-11-01

    In this paper, we propose a new hedging model combining the newly introduced multifractal volatility (MFV) model and dynamic copula functions. Using high-frequency intraday quotes of the spot Shanghai Stock Exchange Composite Index (SSEC), the spot China Securities Index 300 (CSI 300), and CSI 300 index futures, we compare the direct and cross hedging effectiveness of the copula-MFV model with several popular copula-GARCH models. The main empirical results show that the proposed copula-MFV model obtains better hedging effectiveness than the copula-GARCH-type models in general. Furthermore, the hedging strategy based on the MFV model involves lower transaction costs than those based on the GARCH-type models. The findings of this paper indicate that multifractal analysis may offer a new way of designing quantitative hedging models using financial futures.
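
    As background for how such comparisons are usually scored, the sketch below computes the standard minimum-variance hedge ratio and the variance-reduction measure of hedging effectiveness. It uses simulated returns rather than CSI 300 data, and a static (unconditional) ratio; dynamic models such as copula-GARCH or the copula-MFV model would replace the covariance and variance with one-step-ahead conditional forecasts, so the ratio changes each period.

```python
import numpy as np

def min_variance_hedge_ratio(spot_returns, futures_returns):
    """Static minimum-variance hedge ratio h* = Cov(s, f) / Var(f)."""
    cov = np.cov(spot_returns, futures_returns)
    return cov[0, 1] / cov[1, 1]

def hedging_effectiveness(spot_returns, futures_returns, h):
    """Variance reduction of the hedged portfolio relative to the unhedged spot."""
    hedged = spot_returns - h * futures_returns
    return 1.0 - np.var(hedged) / np.var(spot_returns)

# toy illustration with simulated returns (not CSI 300 data)
rng = np.random.default_rng(1)
f = rng.normal(0, 0.015, 1000)
s = 0.9 * f + rng.normal(0, 0.005, 1000)
h = min_variance_hedge_ratio(s, f)
print(f"h* = {h:.3f}, hedging effectiveness = {hedging_effectiveness(s, f, h):.3f}")
```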

  11. Existing and Required Modeling Capabilities for Evaluating ATM Systems and Concepts

    Science.gov (United States)

    Odoni, Amedeo R.; Bowman, Jeremy; Delahaye, Daniel; Deyst, John J.; Feron, Eric; Hansman, R. John; Khan, Kashif; Kuchar, James K.; Pujet, Nicolas; Simpson, Robert W.

    1997-01-01

    ATM systems throughout the world are entering a period of major transition and change. The combination of important technological developments and of the globalization of the air transportation industry has necessitated a reexamination of some of the fundamental premises of existing Air Traffic Management (ATM) concepts. New ATM concepts have to be examined, concepts that may place more emphasis on: strategic traffic management; planning and control; partial decentralization of decision-making; and added reliance on the aircraft to carry out strategic ATM plans, with ground controllers confined primarily to a monitoring and supervisory role. 'Free Flight' is a case in point. In order to study, evaluate and validate such new concepts, the ATM community will have to rely heavily on models and computer-based tools/utilities, covering a wide range of issues and metrics related to safety, capacity and efficiency. The state of the art in such modeling support is adequate in some respects, but clearly deficient in others. It is the objective of this study to assist in: (1) assessing the strengths and weaknesses of existing fast-time models and tools for the study of ATM systems and concepts and (2) identifying and prioritizing the requirements for the development of additional modeling capabilities in the near future. A three-stage process has been followed to this purpose: 1. Through the analysis of two case studies involving future ATM system scenarios, as well as through expert assessment, modeling capabilities and supporting tools needed for testing and validating future ATM systems and concepts were identified and described. 2. Existing fast-time ATM models and support tools were reviewed and assessed with regard to the degree to which they offer the capabilities identified under Step 1. 3. The findings of 1 and 2 were combined to draw conclusions about (1) the best capabilities currently existing, (2) the types of concept testing and validation that can be carried

  12. Population balance models: a useful complementary modelling framework for future WWTP modelling

    DEFF Research Database (Denmark)

    Nopens, Ingmar; Torfs, Elena; Ducoste, Joel;

    2015-01-01

    sufficiently capture the true behaviour and even lead to completely wrong conclusions. Examples of distributed properties are bubble size, floc size, crystal size or granule size. In these cases, PBMs can be used to develop new knowledge that can be embedded in our current models to improve their predictive...

  13. Population Balance Models: A useful complementary modelling framework for future WWTP modelling

    DEFF Research Database (Denmark)

    Nopens, Ingmar; Torfs, Elena; Ducoste, Joel

    2014-01-01

    processes in WWTPs could potentially benefit from this framework, especially when distributed dynamics have a significant impact on the overall unit process performance. In these cases, current models that rely on average properties cannot sufficiently capture the true behaviour. Examples are bubble size...

  14. REQUIREMENTS FOR SYSTEMS DEVELOPMENT LIFE CYCLE MODELS FOR LARGE-SCALE DEFENSE SYSTEMS

    Directory of Open Access Journals (Sweden)

    Kadir Alpaslan DEMIR

    2015-10-01

    Full Text Available Large-scale defense system projects are strategic for maintaining and increasing the national defense capability. Therefore, governments spend billions of dollars on the acquisition and development of large-scale defense systems. The scale of defense systems is always increasing and the costs to build them are skyrocketing. Today, defense systems are software intensive and they are either a system of systems or a part of one. Historically, the project performances observed in the development of these systems have been significantly poor when compared to other types of projects. It is obvious that the currently used systems development life cycle models are insufficient to address today's challenges of building these systems. Using a systems development life cycle model that is specifically designed for large-scale defense system developments and is effective in dealing with today's and near-future challenges will help to improve project performances. The first step in the development of a large-scale defense systems development life cycle model is the identification of requirements for such a model. This paper contributes to the body of literature in the field by providing a set of requirements for systems development life cycle models for large-scale defense systems. Furthermore, a research agenda is proposed.

  15. A non-Gaussian Ornstein-Uhlenbeck model for pricing wind power futures

    DEFF Research Database (Denmark)

    Benth, Fred Espen; Pircalabu, Anca

    2017-01-01

    The recent introduction of wind power futures written on the German wind power production index has brought with it new interesting challenges in terms of modeling and pricing. Some particularities of this product are the strong seasonal component embedded in the underlying, the fact that the wind... index. We discuss the properties of the model and estimation of the model parameters. Further, the model allows for an analytical formula for pricing wind power futures. We provide an empirical study, where the model is calibrated to 37 years of German wind power production index that is synthetically generated assuming a recent level of installed capacity. Also, based on one year of observed prices for wind power futures with different delivery periods, we study the market price of risk. Generally, we find a negative risk premium whose magnitude decreases as the length of the delivery period increases.
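
    To illustrate the mean-reverting building block of such models, the sketch below simulates an Ornstein-Uhlenbeck process with a simple Euler scheme and averages it over a hypothetical delivery window. For brevity it uses a Brownian driver and omits the seasonal function, whereas the paper's model uses a non-Gaussian Lévy driver and an analytical pricing formula; all parameter values and the delivery window are illustrative.

```python
import numpy as np

def simulate_ou(n_steps, dt=1/365, theta=40.0, mu=0.0, sigma=1.5, x0=0.0, seed=0):
    """Euler scheme for a mean-reverting process dX = theta*(mu - X) dt + sigma dW.

    The driver here is Brownian motion for simplicity; the model discussed in the
    abstract uses a non-Gaussian driver plus a seasonal component on top of X.
    """
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps)
    x[0] = x0
    for t in range(1, n_steps):
        x[t] = x[t-1] + theta * (mu - x[t-1]) * dt + sigma * np.sqrt(dt) * rng.normal()
    return x

# crude Monte Carlo stand-in for a delivery-period average of the underlying index
paths = np.array([simulate_ou(365, seed=s) for s in range(200)])
delivery = slice(180, 210)   # hypothetical 30-day delivery period
print("average of the simulated index over the delivery window:",
      paths[:, delivery].mean())
```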

  16. LSST camera heat requirements using CFD and thermal seeing modeling

    Science.gov (United States)

    Sebag, Jacques; Vogiatzis, Konstantinos

    2010-07-01

    The LSST camera is located above the LSST primary/tertiary mirror and in front of the secondary mirror in the shadow of its central obscuration. Due to this position within the optical path, heat released from the camera has a potential impact on the seeing degradation that is larger than traditionally estimated for Cassegrain or Nasmyth telescope configurations. This paper presents the results of thermal seeing modeling combined with Computational Fluid Dynamics (CFD) analyses to define the thermal requirements on the LSST camera. Camera power output fluxes are applied to the CFD model as boundary conditions to calculate the steady-state temperature distribution on the camera and the air inside the enclosure. Using a previously presented post-processing analysis to calculate the optical seeing based on the mechanical turbulence and temperature variations along the optical path, the optical performance resulting from the seeing is determined. The CFD simulations are repeated for different wind speeds and orientations to identify the worst case scenario and generate an estimate of seeing contribution as a function of camera-air temperature difference. Finally, after comparing with the corresponding error budget term, a maximum allowable temperature for the camera is selected.

  17. Research on Computer Aided Innovation Model of Weapon Equipment Requirement Demonstration

    Science.gov (United States)

    Li, Yong; Guo, Qisheng; Wang, Rui; Li, Liang

    Firstly, in order to overcome the shortcoming of using AD or TRIZ alone, and to solve the problems that currently exist in weapon equipment requirement demonstration, the paper constructs a method system for weapon equipment requirement demonstration combining QFD, AD, TRIZ and FA. Then, we construct a CAI model framework for weapon equipment requirement demonstration, which includes a requirement decomposition model, a requirement mapping model and a requirement plan optimization model. Finally, we construct the computer-aided innovation model of weapon equipment requirement demonstration and develop CAI software for equipment requirement demonstration.

  18. Porcine models of digestive disease: the future of large animal translational research

    OpenAIRE

    Gonzalez, Liara M.; Moeser, Adam J; Blikslager, Anthony T.

    2015-01-01

    There is increasing interest in non-rodent translational models for the study of human disease. The pig, in particular, serves as a useful animal model for the study of pathophysiological conditions relevant to the human intestine. This review assesses currently used porcine models of gastrointestinal physiology and disease and provides a rationale for the use of these models for future translational studies. The pig has proven its utility for the study of fundamental disease conditions such ...

  19. Requirements-Driven Deployment: Customizing the Requirements Model for the Host Environment

    NARCIS (Netherlands)

    Ali, Raian; Dalpiaz, Fabiano; Giorgini, Paolo

    2014-01-01

    Deployment is a main development phase which configures software to be ready for use in a certain environment. The ultimate goal of deployment is to enable users to achieve their requirements while using the deployed software. However, requirements are not uniform and differ between deployment env

  20. Understanding the relationship between Kano model's customer satisfaction scores and self-stated requirements importance.

    Science.gov (United States)

    Mkpojiogu, Emmanuel O C; Hashim, Nor Laily

    2016-01-01

    Customer satisfaction is the result of product quality and viability. The perceived satisfaction of users/customers with a software product cannot be neglected, especially in today's competitive market environment, as it drives the loyalty of customers and promotes high profitability and return on investment. Therefore, understanding the importance of requirements, as it is associated with the satisfaction of users/customers when their requirements are met, is worth considering. It is necessary to know the relationship between customer satisfaction when requirements are met (or dissatisfaction when requirements are unmet) and the importance of such requirements. Many works have been carried out on customer satisfaction in connection with the importance of requirements, but the relationship between the customer satisfaction scores (coefficients) of the Kano model and users'/customers' self-stated requirements importance has not been sufficiently explored. In this study, an attempt is made to unravel the underlying relationship existing between the Kano model's customer satisfaction indexes and users'/customers' self-reported requirements importance. The results of the study indicate some interesting associations between these considered variables. These bivariate associations reveal that the customer satisfaction index (SI) and the average satisfaction coefficient (ASC), and the customer dissatisfaction index (DI) and the average satisfaction coefficient (ASC), are highly correlated (r = 96%), and thus ASC can be used in place of either SI or DI in representing customer satisfaction scores. Also, these Kano model customer satisfaction variables (SI, DI, and ASC) are each associated with self-stated requirements importance (IMP). Further analysis indicates that the value customers or users place on requirements that are met, or on features that are incorporated into a product, influences the level of satisfaction such customers derive from the product. The
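
    For readers unfamiliar with the indexes named above, the sketch below computes satisfaction and dissatisfaction coefficients in the common Berger et al. style from Kano survey category counts. The ASC definition used here (mean of SI and |DI|) is an assumption for illustration; the paper's exact definition and the survey counts shown are hypothetical.

```python
def kano_coefficients(attractive, one_dimensional, must_be, indifferent):
    """Customer satisfaction/dissatisfaction coefficients (Berger et al. style).

    SI = (A + O) / (A + O + M + I)    # gain in satisfaction if the requirement is met
    DI = -(O + M) / (A + O + M + I)   # dissatisfaction if the requirement is unmet
    ASC is taken here as the mean of SI and |DI|; the paper's definition may differ.
    """
    total = attractive + one_dimensional + must_be + indifferent
    si = (attractive + one_dimensional) / total
    di = -(one_dimensional + must_be) / total
    asc = (si + abs(di)) / 2.0
    return si, di, asc

# hypothetical survey counts for one requirement
si, di, asc = kano_coefficients(attractive=24, one_dimensional=41,
                                must_be=18, indifferent=17)
print(f"SI={si:.2f}  DI={di:.2f}  ASC={asc:.2f}")
```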

  1. Formation of Future Specialists' Communicative Competence in Language Disciplines through Modeling in Game of Professional Situations

    Science.gov (United States)

    Sturikova, Marina V.; Albrekht, Nina V.; Kondyurina, Irina M.; Rozhneva, Svetlana S.; Sankova, Larisa V.; Morozova, Elena S.

    2016-01-01

    The relevance of the research problem is driven by the necessity of forming future specialists' communicative competence as a component of professional competence, with the aim of furthering the professional mobility of graduates. The purpose of the article is to justify the possibility and necessity of the formation of the required competencies in language…

  2. Beauty and the beast: Some perspectives on efficient model analysis, surrogate models, and the future of modeling

    Science.gov (United States)

    Hill, M. C.; Jakeman, J.; Razavi, S.; Tolson, B.

    2015-12-01

    For many environmental systems, model runtimes have remained very long as more capable computers have been used to add more processes and finer time and space discretization. Scientists have also added more parameters and kinds of observations, and many model runs are needed to explore the models. Computational demand equals run time multiplied by the number of model runs divided by parallelization opportunities. Model exploration is conducted using sensitivity analysis, optimization, and uncertainty quantification. Sensitivity analysis is used to reveal the consequences of what may be very complex simulated relations, optimization is used to identify parameter values that fit the data best, or at least better, and uncertainty quantification is used to evaluate the precision of simulated results. The long execution times make such analyses a challenge. Methods for addressing this challenge include computationally frugal analysis of the demanding original model and a number of ingenious surrogate modeling methods. Both commonly use about 50-100 runs of the demanding original model. In this talk we consider the tradeoffs between (1) original model development decisions, (2) computationally frugal analysis of the original model, and (3) using many model runs of the fast surrogate model. Some questions of interest are as follows. If the added processes and discretization invested in (1) are compared with the restrictions and approximations in model analysis produced by long model execution times, is there a net benefit relative to the goals of the model? Are there changes to the numerical methods that could reduce the computational demands while giving up less fidelity than is compromised by using computationally frugal methods or surrogate models for model analysis? Both the computationally frugal methods and surrogate models require that the solution of interest be a smooth function of the parameters of interest. How does the information obtained from the local methods typical
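
    The computational-demand relation stated in the abstract is simple enough to show numerically; the snippet below works through it with hypothetical figures (a 2-hour model, the 50-100 runs mentioned for frugal analysis, 10-way parallelism), which are illustrative only.

```python
def wall_clock_hours(run_time_h, n_runs, parallel_ways):
    """Computational demand as stated in the abstract:
    run time x number of model runs / parallelization opportunities."""
    return run_time_h * n_runs / parallel_ways

# hypothetical numbers: a 2-hour model, 100 runs, 10 runs executing in parallel
print(wall_clock_hours(2.0, 100, 10))   # -> 20.0 hours of wall-clock time
```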

  3. Elusive present: Hidden past and future dependency and why we build models.

    Science.gov (United States)

    Ara, Pooneh M; James, Ryan G; Crutchfield, James P

    2016-02-01

    Modeling a temporal process as if it is Markovian assumes that the present encodes all of a process's history. When this occurs, the present captures all of the dependency between past and future. We recently showed that if one randomly samples in the space of structured processes, this is almost never the case. So, how does the Markov failure come about? That is, how do individual measurements fail to encode the past, and how many are needed to capture dependencies between the past and future? Here, we investigate how much information can be shared between the past and the future but not reflected in the present. We quantify this elusive information, give explicit calculational methods, and outline the consequences, the most important of which is that when the present hides past-future correlation or dependency we must move beyond sequence-based statistics and build state-based models.
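
    One generic way to see past-future dependency that the present fails to carry is to estimate the conditional mutual information I[past; future | present] from a symbol sequence. The plug-in estimator below is an illustration under that assumption, not the paper's own calculational method; the toy sequence is a period-4 pattern whose phase is invisible to any single symbol.

```python
from collections import Counter
from math import log2

def elusive_information_estimate(seq, L=1):
    """Plug-in estimate of I[past ; future | present] using length-L blocks."""
    joint = Counter()
    for i in range(L, len(seq) - L):
        joint[(tuple(seq[i-L:i]), seq[i], tuple(seq[i+1:i+1+L]))] += 1
    n = sum(joint.values())
    p = {k: v / n for k, v in joint.items()}

    def marginal(indices):
        m = Counter()
        for key, prob in p.items():
            m[tuple(key[i] for i in indices)] += prob
        return m

    p_pr, p_pa_pr, p_pr_fu = marginal((1,)), marginal((0, 1)), marginal((1, 2))
    return sum(prob * log2(prob * p_pr[(pr,)] / (p_pa_pr[(pa, pr)] * p_pr_fu[(pr, fu)]))
               for (pa, pr, fu), prob in p.items())

# period-4 pattern: one bit of past-future dependency is hidden from the present
print(elusive_information_estimate([0, 0, 1, 1] * 500, L=1))   # ~1.0 bit
```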

  4. Modelling land-use effects of future urbanization using cellular automata

    DEFF Research Database (Denmark)

    Fuglsang, Morten; Münier, B.; Hansen, H.S.

    2013-01-01

    The modelling of land use change is a way to analyse future scenarios by modelling different pathways. Application of spatial data of different scales coupled with socio-economic data makes it possible to explore and test the understanding of land use change relations. In the EU-FP7 research...

  5. A 2nd generation static model of greenhouse energy requirements (horticern) : a comparison with dynamic models

    CERN Document Server

    Jolliet, O; Munday, G L

    1989-01-01

    Optimisation of a greenhouse and its components requires a suitable model permitting precise determination of its energy requirements. Existing static models are simple but lack precision; dynamic models, though more precise, are unsuitable for use over long periods and difficult to handle in practice. A theoretical study and measurements from the CERN trial greenhouse have allowed the development of a new static model named "HORTICERN", precise and easy to use for predicting energy consumption, which takes into account the effects of solar energy, wind and radiative loss to the sky. This paper compares the HORTICERN model with the dynamic models of Bot, Takakura, Van Bavel and Gembloux, and demonstrates that its precision is comparable; differences are on average less than 5%, and it is independent of the type of greenhouse (e.g. single or double glazing, Hortiplus, etc.) and climate. The HORTICERN method has been developed for PC use and is proving to be a powerful tool for greenhouse optimisation by research work...
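
    As a frame of reference for what a static greenhouse energy model computes, the sketch below implements only the textbook-style conduction-loss-minus-solar-gain balance. It is not the HORTICERN formulation, which additionally accounts for wind-dependent and sky-radiative losses; the U-value, areas and temperatures are hypothetical.

```python
def greenhouse_heating_demand(u_value, cover_area, t_inside, t_outside,
                              solar_gain, transmissivity=0.7):
    """Generic static greenhouse energy balance (W):

        Q_heat = U * A * (T_in - T_out) - tau * Q_solar

    A bare-bones skeleton of a static model; wind-dependent losses and
    radiative loss to the sky, which HORTICERN includes, are omitted here.
    """
    losses = u_value * cover_area * (t_inside - t_outside)
    gains = transmissivity * solar_gain
    return max(losses - gains, 0.0)

# hypothetical single-glazed house: 500 m2 of cover, U ~ 6 W/m2K, mild winter day
print(greenhouse_heating_demand(u_value=6.0, cover_area=500.0,
                                t_inside=18.0, t_outside=2.0,
                                solar_gain=20_000.0))
```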

  6. Impact and cost-effectiveness of current and future tuberculosis diagnostics: the contribution of modelling

    Science.gov (United States)

    Houben, R.; Cohen, T.; Pai, M.; Cobelens, F.; Vassall, A.; Menzies, N. A.; Gomez, G. B.; Langley, I.; Squire, S. B.; White, R.

    2014-01-01

    SUMMARY The landscape of diagnostic testing for tuberculosis (TB) is changing rapidly, and stakeholders need urgent guidance on how to develop, deploy and optimize TB diagnostics in a way that maximizes impact and makes best use of available resources. When decisions must be made with only incomplete or preliminary data available, modelling is a useful tool for providing such guidance. Following a meeting of modelers and other key stakeholders organized by the TB Modelling and Analysis Consortium, we propose a conceptual framework for positioning models of TB diagnostics. We use that framework to describe modelling priorities in four key areas: Xpert® MTB/RIF scale-up, target product profiles for novel assays, drug susceptibility testing to support new drug regimens, and the improvement of future TB diagnostic models. If we are to maximize the impact and cost-effectiveness of TB diagnostics, these modelling priorities should figure prominently as targets for future research. PMID:25189546

  7. Deriving required model structures to predict global wildfire burned area from multiple satellite and climate observations

    Science.gov (United States)

    Forkel, Matthias; Dorigo, Wouter; Lasslop, Gitta; Teubner, Irene; Chuvieco, Emilio; Thonicke, Kirsten

    2017-04-01

    Vegetation fires have important effects on human infrastructures and ecosystems, and affect atmospheric composition and the climate system. Consequently, it is necessary to accurately represent fire dynamics in global vegetation models to realistically represent the role of fires in the Earth system. However, it is unclear which model structures are required in global vegetation/fire models to represent fire activity at regional to global scales. Here we aim to identify required structural components and necessary complexities of global vegetation/fire models to predict spatial-temporal dynamics of burned area. For this purpose, we developed the SOFIA (satellite observations for fire activity) modelling approach to predict burned area from several satellite and climate datasets. A large ensemble of SOFIA models was generated and each model was optimized against observed burned area data. Models that account for a suppression of fire activity at wet conditions result in the highest performances in predicting burned area. Models that include vegetation optical depth data from microwave satellite observations reach higher performances in predicting burned area than models that do not include this dataset. Vegetation optical depth is a proxy for vegetation biomass, density and water content and thus indicates a strong control of vegetation states and dynamics on fire activity. We further compared the best performing SOFIA models with the global process-oriented vegetation/fire model JSBACH-SPITFIRE, and with the GFED and Fire_CCI burned area datasets. SOFIA models outperform JSBACH-SPITFIRE in predicting regional variabilities of burned area. We further applied the best SOFIA model to identify controlling factors for burned area. The results indicate that fire activity is controlled by regionally diverse and complex interactions of human, vegetation and climate factors. Our results demonstrate that the use of multiple observational datasets on climate, hydrological
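
    To make the "suppression at wet conditions" idea concrete, the toy sketch below fits a burned-area model built as a product of limiting factors: a logistic suppression by wetness and a logistic fuel term driven by vegetation optical depth (VOD). The functional forms, predictors and synthetic data are illustrative assumptions; the actual SOFIA ensemble explores many predictor and function combinations against observed burned area.

```python
import numpy as np
from scipy.optimize import curve_fit

def burned_area_model(X, a, k_wet, w0, k_vod, v0):
    """Toy product-of-limiting-factors structure for burned area."""
    wetness, vod = X
    f_wet = 1.0 / (1.0 + np.exp(k_wet * (wetness - w0)))   # wetter -> less fire
    f_fuel = 1.0 / (1.0 + np.exp(-k_vod * (vod - v0)))     # more biomass -> more fuel
    return a * f_wet * f_fuel

# synthetic training data standing in for gridded satellite observations
rng = np.random.default_rng(2)
wet = rng.uniform(0, 1, 500)
vod = rng.uniform(0, 1, 500)
ba_obs = burned_area_model((wet, vod), 0.3, 10, 0.5, 8, 0.4) + rng.normal(0, 0.01, 500)

popt, _ = curve_fit(burned_area_model, (wet, vod), ba_obs,
                    p0=[0.2, 5, 0.5, 5, 0.5], maxfev=10000)
print("fitted parameters:", np.round(popt, 2))
```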

  8. Defining Requirements and Applying Information Modeling for Protecting Enterprise Assets

    Science.gov (United States)

    Fortier, Stephen C.; Volk, Jennifer H.

    The advent of terrorist threats has heightened local, regional, and national governments' interest in emergency response and disaster preparedness. The threat of natural disasters also challenges emergency responders to act swiftly and in a coordinated fashion. When a disaster occurs, an ad hoc coalition of pre-planned groups usually forms to respond to the incident. History has shown that these “system of systems” do not interoperate very well. Communications between fire, police and rescue components either do not work or are inefficient. Government agencies, non-governmental organizations (NGOs), and private industry use a wide array of software platforms for managing data about emergency conditions, resources and response activities. Most of these are stand-alone systems with very limited capability for data sharing with other agencies or other levels of government. Information technology advances have facilitated the movement towards an integrated and coordinated approach to emergency management. Other communication mechanisms, such as video teleconferencing, digital television and radio broadcasting, are being utilized to combat the challenges of emergency information exchange. Recent disasters, such as Hurricane Katrina and the tsunami in Indonesia, have illuminated the weaknesses in emergency response. This paper will discuss the need for defining requirements for components of ad hoc coalitions which are formed to respond to disasters. A goal of our effort was to develop a proof of concept that applying information modeling to the business processes used to protect and mitigate potential loss of an enterprise was feasible. These activities would be modeled both pre- and post-incident.

  9. Modeling the Effects of Future Growing Demand for Charcoal in the Tropics

    Directory of Open Access Journals (Sweden)

    M. J. Santos

    2017-06-01

    Full Text Available Global demand for charcoal is increasing, mainly due to urban populations in developing countries. More than half the global population now lives in cities, and urban dwellers are restricted to charcoal use because of its ease of production, access, transport, and tradition. Increasing demand for charcoal, however, may lead to increasing impacts on forests, food, and water resources, and may even create additional pressures on the climate system. Here we assess how different charcoal scenarios based on the Shared Socio-economic Pathways (SSPs) relate to potential biomass supply. For this, we use the energy model TIMER to project the demand for fuelwood and charcoal under different socio-economic pathways for urban and rural populations, globally and for four tropical regions (Central America, South America, Africa and Indonesia). Second, we assess whether the biomass demands for each scenario can be met with current and projected forest biomass, estimated with remote sensing and modeled Net Primary Productivity (NPP) using a Dynamic Global Vegetation Model (LPJ-GUESS). Currently, one third of residential energy use is based on traditional bioenergy, including charcoal. Globally, biomass needs of urban households by 2100 under the most sustainable scenario, SSP1, are 14.4 million tons of biomass for charcoal plus 17.1 million tons of biomass for fuelwood (31.5 million tons of biomass in total). Under SSP3, the least sustainable scenario, we project a need of 205 million tons of biomass for charcoal plus 243.8 million tons of biomass for fuelwood by 2100 (a total of 450 million tons of biomass). Africa and South America contribute the most to this biomass demand; however, all areas are able to meet the demand. We find that the future of the charcoal sector is not dire. Charcoal represents a small fraction of the energy requirements, but its biomass demands are disproportionate and in some regions require a large fraction of the forest. This could be because of large growing populations moving to urban areas

  10. Modelling the state of the Mediterranean Sea under contemporary and future climate

    Science.gov (United States)

    Solidoro, Cosimo; Lazzari, Paolo; Cossarini, Gianpiero; Melaku Canu, Donata; Lovato, Tomas

    2016-04-01

    A validated 3D coupled transport-biogeochemical model is used to assess the impact of future climatic and management scenarios on the biogeochemical and ecological properties of the Mediterranean Sea. Results are discussed in terms of the temporal and spatial distribution of parameters and indicators related to the carbonate system and the cycles of carbon and inorganic nutrients through dissolved and particulate phases, as simulated by a multi-nutrient, multi-plankton numerical model under current and future conditions. Simulations span the period 1990-2040 and are performed by forcing a three-dimensional off-line coupled eco-hydrodynamical model (BFM and OPA-tracer model) with current fields produced by an ad hoc implementation of the NEMO modelling system and with river inputs of nutrients and freshwater computed in recent European FP7 projects. The model properly describes the available experimental information on contemporary seasonal dynamics and spatial distribution at the basin and sub-basin scale for the major biogeochemical parameters, as well as primary production and carbon fluxes at the air-ocean interface. Model projections suggest that the future Mediterranean Sea will be globally warmer, more productive, and more acidic, but with significant spatial variability. Consequences in terms of ecological and higher-trophic-level organism dynamics are discussed as well, also with reference to the impact on the spatial distribution of sites suitable for aquaculture and the future spatial distribution of suitable habitats for habitat-building organisms (e.g. Posidonia, coralligenous assemblages)

  11. FS-OpenSecurity: A Taxonomic Modeling of Security Threats in SDN for Future Sustainable Computing

    Directory of Open Access Journals (Sweden)

    Yunsick Sung

    2016-09-01

    Full Text Available Software Defined Networking (SDN) has brought many changes in terms of the interaction processes between systems and humans. It has become the key enabler of software defined architecture, which allows enterprises to build a highly agile Information Technology (IT) infrastructure. For Future Sustainable Computing (FSC), SDN needs to deliver on many information technology commitments—more automation, simplified design, increased agility, policy-based management, and network management bound to more liberal IT workflow systems. To address the sustainability problems, SDN needs to provide greater collaboration and tighter integration with networks, servers, and security teams that will have an impact on how enterprises design, plan, deploy and manage networks. In this paper, we propose FS-OpenSecurity, which is a new and pragmatic security architecture model. It consists of two novel methodologies, Software Defined Orchestrator (SDO) and SQUEAK, which offer a robust and secure architecture. The secure architecture is required for protection from diverse threats. Usually, security administrators need to handle each threat individually. However, handling threats automatically by adapting to the threat landscape is a critical demand. Therefore, the architecture must automatically handle defensive processes that draw collaboratively on intelligent external and internal information.

  12. Titan's past and future: 3D modeling of a pure nitrogen atmosphere and geological implications

    CERN Document Server

    Charnay, Benjamin; Tobie, Gabriel; Sotin, Christophe; Wordsworth, Robin

    2014-01-01

    Several clues indicate that Titan's atmosphere has been depleted of methane during some period of its history, possibly as recently as 0.5-1 billion years ago. It could also happen in the future. Under these conditions, the atmosphere becomes composed only of nitrogen, with a range of temperature and pressure allowing liquid or solid nitrogen to condense. Here, we explore these exotic climates throughout Titan's history with a 3D Global Climate Model (GCM) including the nitrogen cycle and the radiative effect of nitrogen clouds. We show that for the last billion years, only small polar nitrogen lakes should have formed. Yet, before 1 Ga, a significant part of the atmosphere could have condensed, forming deep nitrogen polar seas, which could have flowed and flooded the equatorial regions. Alternatively, nitrogen could be frozen on the surface like on Triton, but this would require an initial surface albedo higher than 0.65 at 4 Ga. Such a state could be stable even today if the nitrogen ice albedo is higher than th...

  13. Sorghum production under future climate in the Southwestern USA: model projections of yield, greenhouse gas emissions and soil C fluxes

    Science.gov (United States)

    Duval, B.; Ghimire, R.; Hartman, M. D.; Marsalis, M.

    2016-12-01

    Large tracts of semi-arid land in the Southwestern USA are relatively less important for food production than the US Corn Belt, and represent a promising area for expansion of biofuel/bioproduct crops. However, high temperatures, low available water and high solar radiation in the SW represent a challenge to suitable feedstock development, and future climate change scenarios predict that portions of the SW will experience increased temperature and temporal shifts in precipitation distribution. Sorghum (Sorghum bicolor) is a valuable forage crop with promise as a biofuel feedstock, given its high biomass under semi-arid conditions, relatively lower N fertilizer requirements compared to corn, and salinity tolerance. To evaluate the environmental impact of expanded sorghum cultivation under future climate in the SW USA, we used the DayCent model in concert with a suite of downscaled future weather projections to predict the biogeochemical consequences (greenhouse gas flux and impacts on soil carbon) of sorghum cultivation in New Mexico. The model showed good correspondence with yield data from field trials including both dryland and irrigated sorghum (measured vs. modeled; r2 = 0.75). Simulation experiments tested the effect of dryland production versus irrigation, low N versus high N inputs and delayed fertilizer application. Nitrogen application timing and irrigation impacted yield and N2O emissions less than N rate and climate. Across N and irrigation treatments, future climate simulations resulted in 6% increased yield and 20% lower N2O emissions compared to current climate. Soil C pools declined under future climate. The greatest declines in soil C were from low N input sorghum simulations, regardless of irrigation (>20% declines in SOM in both cases); this requires further evaluation to determine whether the changing future climate is driving these declines, or whether they are a function of prolonged sorghum-fallow rotations in the model. The relatively small gain in yield for

  14. Coopetitive Business Models in Future Mobile Broadband with Licensed Shared Access (LSA

    Directory of Open Access Journals (Sweden)

    P. Ahokangas

    2016-08-01

    Full Text Available Spectrum scarcity forces mobile network operators (MNOs) providing mobile broadband services to develop new business models that address spectrum sharing. It draws MNOs into a coopetitive relationship with incumbents. The Licensed Shared Access (LSA) concept complements traditional licensing and helps MNOs to access new spectrum bands on a shared basis. This paper discusses spectrum sharing with LSA from a business perspective. It describes how coopetition and business models are linked conceptually, and identifies the influence of coopetition on future business models with LSA. We develop business models for dominant and challenger MNOs under traditional licensing and in a future with LSA. The results indicate that the coopetition and business model concepts are linked via value co-creation and value co-capture. LSA offers different business opportunities to dominant and challenger MNOs. Offering, value proposition, customer segments and differentiation in business models become critical in mobile broadband.

  15. A stochastic Forest Fire Model for future land cover scenarios assessment

    Directory of Open Access Journals (Sweden)

    M. D'Andrea

    2010-10-01

    Full Text Available Land cover is affected by many factors, including economic development, climate and natural disturbances such as wildfires. The ability to evaluate how fire regimes may alter future vegetation, and how future vegetation may alter fire regimes, would assist forest managers in planning management actions to be carried out in the face of anticipated socio-economic and climatic change. In this paper, we present a method for calibrating a cellular automata wildfire regime simulation model with actual data on land cover and wildfire size-frequency. The method is based on the observation that many forest fire regimes, in different forest types and regions, exhibit power-law frequency-area distributions. The standard Drossel-Schwabl cellular automata Forest Fire Model (DS-FFM) produces simulations which reproduce this observed pattern. However, the standard model is simplistic in that it considers land cover to be binary – each cell either contains a tree or it is empty – and the model overestimates the frequency of large fires relative to actual landscapes. Our new model, the Modified Forest Fire Model (MFFM), addresses this limitation by incorporating information on actual land use and differentiating among various types of flammable vegetation. The MFFM simulation model was tested on forest types with Mediterranean and sub-tropical fire regimes. The results showed that the MFFM was able to reproduce structural fire regime parameters for these two regions. Further, the model was used to forecast future land cover. Future research will extend this model to refine the forecasts of future land cover and fire regime scenarios under climate, land use and socio-economic change.
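
    For reference, the sketch below implements one update step of the standard Drossel-Schwabl cellular automaton that the MFFM extends: burning cells become empty, trees ignite from a burning 4-neighbour or by lightning with probability f, and empty cells regrow with probability p. Boundary conditions are periodic here and the parameter values are illustrative, not the calibrated values of either model.

```python
import numpy as np

def drossel_schwabl_step(grid, p_grow=0.05, f_lightning=1e-4, rng=None):
    """One update of the standard Drossel-Schwabl forest fire cellular automaton.

    Cell states: 0 = empty, 1 = tree, 2 = burning.
    """
    rng = rng or np.random.default_rng()
    burning = grid == 2
    neighbour_burning = np.zeros_like(burning)
    for shift, axis in ((1, 0), (-1, 0), (1, 1), (-1, 1)):
        neighbour_burning |= np.roll(burning, shift, axis=axis)
    rand = rng.random(grid.shape)
    new = grid.copy()
    new[burning] = 0                                          # burnt cells empty out
    trees = grid == 1
    new[trees & (neighbour_burning | (rand < f_lightning))] = 2   # spread or lightning
    empty = grid == 0
    new[empty & (rand < p_grow)] = 1                          # regrowth
    return new

grid = (np.random.default_rng(0).random((200, 200)) < 0.5).astype(int)
for _ in range(500):
    grid = drossel_schwabl_step(grid)
print("tree fraction after 500 steps:", (grid == 1).mean())
```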

  16. Model-based Integration of Past & Future in TimeTravel

    DEFF Research Database (Denmark)

    Khalefa, Mohamed E.; Fischer, Ulrike; Pedersen, Torben Bach

    2012-01-01

    We demonstrate TimeTravel, an efficient DBMS system for seamless integrated querying of past and (forecasted) future values of time series, allowing the user to view past and future values as one joint time series. This functionality is important for advanced application domains like energy. The main idea is to compactly represent time series as models. By using models, the TimeTravel system answers queries approximately on past and future data with error guarantees (absolute error and confidence) one order of magnitude faster than when accessing the time series directly. In addition... it to answer approximate and exact queries. TimeTravel is implemented in PostgreSQL, thus achieving complete user transparency at the query level. In the demo, we show the easy building of a hierarchical model index for a real-world time series and the effect of varying the error guarantees on the speed up
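
    To illustrate the underlying idea of answering queries from a compact model instead of raw values, the stand-alone sketch below fits a trend-plus-seasonal least-squares model to a synthetic hourly series and evaluates an average over a past or future window directly from the coefficients. This is a minimal stand-in, not TimeTravel's hierarchical model index or its PostgreSQL integration.

```python
import numpy as np

def fit_model(t, y, period=24):
    """Compact model of a time series: linear trend plus one seasonal harmonic."""
    X = np.column_stack([np.ones_like(t), t,
                         np.sin(2*np.pi*t/period), np.cos(2*np.pi*t/period)])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def query_average(coeffs, t_from, t_to, period=24):
    """Approximate AVG(value) over [t_from, t_to) from the model, past or future alike."""
    t = np.arange(t_from, t_to, dtype=float)
    X = np.column_stack([np.ones_like(t), t,
                         np.sin(2*np.pi*t/period), np.cos(2*np.pi*t/period)])
    return float((X @ coeffs).mean())

# synthetic hourly "energy" series: trend + daily cycle + noise
t = np.arange(0, 24 * 30, dtype=float)
y = 100 + 0.05*t + 10*np.sin(2*np.pi*t/24) + np.random.default_rng(3).normal(0, 1, t.size)
c = fit_model(t, y)
print("past window average (approx):", query_average(c, 100, 124))
print("future window average (forecast):", query_average(c, 24*30, 24*31))
```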

  17. Modeling nonstationary extreme wave heights in present and future climate of Greek Seas

    Directory of Open Access Journals (Sweden)

    Panagiota Galiatsatou

    2016-01-01

    Full Text Available In this study the generalized extreme value (GEV) distribution function was used to assess nonstationarity in annual maximum wave heights for selected locations in the Greek Seas, both in the present and future climate. The available significant wave height data were divided into groups corresponding to the present period (1951 to 2000), a first future period (2001 to 2050), and a second future period (2051 to 2100). For each time period, the parameters of the GEV distribution were specified as functions of time-varying covariates and estimated using the conditional density network (CDN). For each location and selected time period, a total number of 29 linear and nonlinear models were fitted to the wave data, for a given combination of covariates. The covariates used in the GEV-CDN models consisted of wind fields resulting from the Regional Climate Model version 3 (RegCM3) developed by the International Center for Theoretical Physics (ICTP) with a spatial resolution of 10 km × 10 km, after being processed using principal component analysis (PCA). The results obtained from the best fitted models in the present and future periods for each location were compared, revealing different patterns of relationships between wind components and extreme wave height quantiles in different parts of the Greek Seas and different periods. The analysis demonstrates an increase of extreme wave heights in the first future period as compared with the present period, causing a significant threat to Greek coastal areas in the North Aegean Sea and the Ionian Sea.
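
    To show what a nonstationary GEV fit with a covariate looks like in its simplest form, the sketch below lets only the location parameter vary linearly with a covariate (standing in for a principal component of the wind fields) and fits it by maximum likelihood. The study instead uses a conditional density network so that the GEV parameters depend nonlinearly on the covariates; the synthetic data and the scipy shape-parameter convention are assumptions of this illustration.

```python
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

def fit_nonstationary_gev(annual_maxima, covariate):
    """Fit a GEV with location varying linearly with a covariate:
        mu(t) = mu0 + mu1 * x(t),  scale and shape held constant.
    """
    def nll(params):
        mu0, mu1, log_scale, shape = params
        loc = mu0 + mu1 * covariate
        return -np.sum(genextreme.logpdf(annual_maxima, shape,
                                         loc=loc, scale=np.exp(log_scale)))
    x0 = [annual_maxima.mean(), 0.0, np.log(annual_maxima.std()), 0.1]
    return minimize(nll, x0, method="Nelder-Mead").x

# synthetic annual maximum wave heights with a covariate-driven trend in location
x = np.linspace(-1, 1, 50)
hs = genextreme.rvs(0.1, loc=4.0 + 0.5 * x, scale=0.5, random_state=4)
print(np.round(fit_nonstationary_gev(hs, x), 3))   # [mu0, mu1, log_scale, shape]
```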

  18. Sensitivity of the Atmospheric Response to Warm Pool El Nino Events to Modeled SSTs and Future Climate Forcings

    Science.gov (United States)

    Hurwitz, Margaret M.; Garfinkel, Chaim I.; Newman, Paul A.; Oman, Luke D.

    2013-01-01

    Warm pool El Nino (WPEN) events are characterized by positive sea surface temperature (SST) anomalies in the central equatorial Pacific. Under present-day climate conditions, WPEN events generate poleward propagating wavetrains and enhance midlatitude planetary wave activity, weakening the stratospheric polar vortices. The late 21st century extratropical atmospheric response to WPEN events is investigated using the Goddard Earth Observing System Chemistry-Climate Model (GEOSCCM), version 2. GEOSCCM simulations are forced by projected late 21st century concentrations of greenhouse gases (GHGs) and ozone-depleting substances (ODSs) and by SSTs and sea ice concentrations from an existing ocean-atmosphere simulation. Despite known ocean-atmosphere model biases, the prescribed SST fields represent a best estimate of the structure of late 21st century WPEN events. The future Arctic vortex response is qualitatively similar to that observed in recent decades but is weaker in late winter. This response reflects the weaker SST forcing in the Nino 3.4 region and subsequently weaker Northern Hemisphere tropospheric teleconnections. The Antarctic stratosphere does not respond to WPEN events in a future climate, reflecting a change in tropospheric teleconnections: the meridional wavetrain weakens while a more zonal wavetrain originates near Australia. Sensitivity simulations show that a strong poleward wavetrain response to WPEN requires a strengthening and southeastward extension of the South Pacific Convergence Zone; this feature is not captured by the late 21st century modeled SSTs. Expected future increases in GHGs and decreases in ODSs do not affect the polar stratospheric responses to WPEN.

  19. Micro-canonical cascade model: Analyzing parameter changes in the future and their influence on disaggregation results

    Science.gov (United States)

    Müller, Hannes; Föt, Annika; Haberlandt, Uwe

    2016-04-01

    Rainfall time series with a high temporal resolution are needed in many hydrological and water resources management fields. Unfortunately, future climate projections are often available only in low temporal resolutions, e.g. daily values. A possible solution is the disaggregation of these time series using information from high-resolution time series of recording stations. Often, the required parameters for the disaggregation process are applied to the future climate without any change, because the change is unknown. For this investigation a multiplicative random cascade model is used. The parameters can be estimated directly from high-resolution time series. Here, time series with hourly resolution generated by the ECHAM5 model and dynamically downscaled with the REMO model (UBA, BfG and ENS realisations) are used for parameter estimation. The parameters are compared between the past (1971-2000), the near-term future (2021-2050) and the long-term future (2071-2100) for temporal resolutions of 1 h and 8 h. Additionally, the parameters of each period are used for the disaggregation of the other two periods. Afterwards the disaggregated time series are analyzed concerning extreme value representation, event-specific characteristics (average wet spell duration and amount) and overall time series characteristics (average intensity and fraction of dry spell events). The aim of the investigation is (a) to detect and quantify parameter changes and (b) to analyze their influence on the disaggregated time series. The investigation area is Lower Saxony, Germany.
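
    A micro-canonical multiplicative random cascade splits each coarse rainfall amount into two sub-intervals with weights that sum to one, so mass is conserved at every branching level. A minimal sketch with illustrative (not study-derived) splitting probabilities:

    ```python
    import numpy as np

    def cascade_disaggregate(values, levels=3, p_left_only=0.2, p_right_only=0.2, rng=None):
        """Micro-canonical cascade: each interval is split in two, mass is conserved.
        With prob p_left_only all rain goes to the left child, with p_right_only all
        to the right, otherwise it is divided by a weight drawn uniformly in (0, 1)."""
        rng = rng or np.random.default_rng()
        series = np.asarray(values, dtype=float)
        for _ in range(levels):               # 3 branchings split each coarse step into 2**3 = 8 parts
            out = np.empty(series.size * 2)
            for i, v in enumerate(series):
                if v == 0.0:
                    w = 0.5
                else:
                    u = rng.random()
                    if u < p_left_only:
                        w = 1.0
                    elif u < p_left_only + p_right_only:
                        w = 0.0
                    else:
                        w = rng.random()
                out[2 * i] = w * v
                out[2 * i + 1] = (1.0 - w) * v        # weights sum to 1: mass conserved
            series = out
        return series

    daily = [12.0, 0.0, 3.5]                          # coarse rainfall totals (mm)
    fine = cascade_disaggregate(daily, levels=3)      # 8 sub-intervals per coarse step
    assert abs(fine.sum() - sum(daily)) < 1e-9        # disaggregation preserves the totals
    ```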

  20. How historic simulation-observation discrepancy affects future warming projections in a very large model ensemble

    Science.gov (United States)

    Goodwin, Philip

    2016-10-01

    Projections of future climate made by model ensembles have credibility because the historic simulations by these models are consistent with, or near-consistent with, historic observations. However, it is not known how small inconsistencies between the ranges of observed and simulated historic climate change affect the future projections made by a model ensemble. Here, the impact of historical simulation-observation inconsistencies on future warming projections is quantified in a 4-million member Monte Carlo ensemble from a new efficient Earth System Model (ESM). Of the 4-million ensemble members, a subset of 182,500 are consistent with historic ranges of warming, heat uptake and carbon uptake simulated by the Climate Model Intercomparison Project 5 (CMIP5) ensemble. This simulation-consistent subset projects similar future warming ranges to the CMIP5 ensemble for all four RCP scenarios, indicating the new ESM represents an efficient tool to explore parameter space for future warming projections based on historic performance. A second subset of 14,500 ensemble members are consistent with historic observations for warming, heat uptake and carbon uptake. This observation-consistent subset projects a narrower range for future warming, with the lower bounds of projected warming still similar to CMIP5, but the upper warming bounds reduced by 20-35%. These findings suggest that part of the upper range of twenty-first century CMIP5 warming projections may reflect historical simulation-observation inconsistencies. However, the agreement of lower bounds for projected warming implies that the likelihood of warming exceeding dangerous levels over the twenty-first century is unaffected by small discrepancies between CMIP5 models and observations.
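
    The central operation is subsetting a very large Monte Carlo ensemble to the members whose historical diagnostics fall inside prescribed ranges, and then reading off the future warming spread of that subset. A schematic sketch with made-up member values and ranges (and a smaller member count than the 4-million used in the study):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 400_000                               # scaled down from the 4-million member ensemble
    # Hypothetical historical diagnostics per member (illustrative, not the ESM's output):
    warming = rng.normal(0.9, 0.4, n)         # K
    heat_uptake = rng.normal(0.6, 0.3, n)     # W m-2
    carbon_uptake = rng.normal(5.0, 2.0, n)   # GtC yr-1
    future_warming = warming * rng.normal(3.0, 0.8, n)   # stand-in 21st-century projection

    # Illustrative observational ranges (not the constraints used in the paper):
    keep = ((warming > 0.6) & (warming < 1.0) &
            (heat_uptake > 0.4) & (heat_uptake < 0.8) &
            (carbon_uptake > 3.0) & (carbon_uptake < 7.0))

    print("observation-consistent members:", int(keep.sum()))
    print("5-95% future warming of the subset:",
          np.percentile(future_warming[keep], [5, 95]).round(2))
    ```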

  1. Model of an aquaponic system for minimised water, energy and nitrogen requirements.

    Science.gov (United States)

    Reyes Lastiri, D; Slinkert, T; Cappon, H J; Baganz, D; Staaks, G; Keesman, K J

    2016-01-01

    Water and nutrient savings can be established by coupling water streams between interacting processes. Wastewater from production processes contains nutrients like nitrogen (N), which can and should be recycled in order to meet future regulatory discharge demands. Optimisation of interacting water systems is a complex task. An effective way of understanding, analysing and optimising such systems is by applying mathematical models. The present modelling work aims at supporting the design of a nearly emission-free aquaculture and hydroponic system (aquaponics), thus contributing to sustainable production and to food security for the 21st century. Based on the model, a system that couples 40 m3 of fish tanks and a hydroponic system of 1,000 m2 can produce 5 tons of tilapia and 75 tons of tomato yearly. The system requires energy to condense and recover evaporated water, and for lighting and heating, adding up to 1.3 GJ/m2 every year. In the suggested configuration, the fish can provide about 26% of the N required in a plant cycle. A coupling strategy that sends water from the fish to the plants in amounts proportional to the fish feed input reduces the standard deviation of the NO3- level in the fish cycle by 35%.
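
    The coupling rule described, water routed from the fish cycle to the plants in proportion to the feed input, can be illustrated with a toy daily nitrogen balance; all rates and coefficients below are assumptions for illustration, not parameters of the cited model:

    ```python
    import statistics

    # Toy daily nitrogen balance for a coupled fish (RAS) and hydroponic loop.
    feed_per_day = 40.0              # kg feed/day to the fish tanks (illustrative)
    n_excreted_per_kg_feed = 0.05    # kg dissolved N added to the water per kg feed
    plant_n_demand = 3.0             # kg N/day taken up by the tomato crop
    water_per_kg_feed = 0.3          # m3 sent to the plants per kg feed (the proportional rule)
    fish_volume = 40.0               # m3 of fish tanks
    fish_no3 = 5.0                   # kg N dissolved in the fish cycle at the start

    no3_history = []
    for day in range(365):
        fish_no3 += feed_per_day * n_excreted_per_kg_feed         # N excreted by the fish
        water_sent = water_per_kg_feed * feed_per_day             # coupling proportional to feed
        n_sent = fish_no3 * min(water_sent / fish_volume, 1.0)    # N carried over with that water
        fish_no3 -= n_sent
        no3_history.append(fish_no3)

    print("share of plant N demand met by fish water:",
          round(min(n_sent, plant_n_demand) / plant_n_demand, 2))
    print("std. dev. of the NO3 level in the fish cycle:",
          round(statistics.pstdev(no3_history), 3))
    ```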

  2. Model-based Integration of Past & Future in TimeTravel

    DEFF Research Database (Denmark)

    Khalefa, Mohamed E.; Fischer, Ulrike; Pedersen, Torben Bach

    2012-01-01

    We demonstrate TimeTravel, an efficient DBMS system for seamless integrated querying of past and (forecasted) future values of time series, allowing the user to view past and future values as one joint time series. This functionality is important for advanced application domains like energy...... usually exhibits seasonal behavior, models in this index incorporate seasonality. To construct a hierarchical model index, the user specifies the seasonality period, error guarantee levels, and a statistical forecast method. As time proceeds, the system incrementally updates the index and utilizes...

  3. Constructing Positive Futures: Modeling the Relationship between Adolescents' Hopeful Future Expectations and Intentional Self Regulation in Predicting Positive Youth Development

    Science.gov (United States)

    Schmid, Kristina L.; Phelps, Erin; Lerner, Richard M.

    2011-01-01

    Intentional self regulation and hopeful expectations for the future are theoretically-related constructs shown to lead to positive youth development (PYD). However, the nature of their relationship over time has not been tested. Therefore, this study explored the associations between hopeful future expectations and intentional self regulation in…

  4. Constructing Positive Futures: Modeling the Relationship between Adolescents' Hopeful Future Expectations and Intentional Self Regulation in Predicting Positive Youth Development

    Science.gov (United States)

    Schmid, Kristina L.; Phelps, Erin; Lerner, Richard M.

    2011-01-01

    Intentional self regulation and hopeful expectations for the future are theoretically-related constructs shown to lead to positive youth development (PYD). However, the nature of their relationship over time has not been tested. Therefore, this study explored the associations between hopeful future expectations and intentional self regulation in…

  5. Passion Trumps Pay: A Study of the Future Skills Requirements of Information Professionals in Galleries, Libraries, Archives and Museums in Australia

    Science.gov (United States)

    Howard, Katherine; Partridge, Helen; Hughes, Hilary; Oliver, Gillian

    2016-01-01

    Introduction: This paper explores the current and future skills and knowledge requirements of contemporary information professionals in a converged gallery, library, archive and museum sector (also referred to as the GLAM sector) in Australia. This research forms part of a larger study that investigated the education needs of information…

  6. Climate projections of future extreme events accounting for modelling uncertainties and historical simulation biases

    Science.gov (United States)

    Brown, Simon J.; Murphy, James M.; Sexton, David M. H.; Harris, Glen R.

    2014-11-01

    A methodology is presented for providing projections of absolute future values of extreme weather events that takes into account key uncertainties in predicting future climate. This is achieved by characterising both observed and modelled extremes with a single form of non-stationary extreme value (EV) distribution that depends on global mean temperature and which includes terms that account for model bias. Such a distribution allows the prediction of future "observed" extremes for any period in the twenty-first century. Uncertainty in modelling future climate, arising from a wide range of atmospheric, oceanic, sulphur cycle and carbon cycle processes, is accounted for by using probabilistic distributions of future global temperature and EV parameters. These distributions are generated by Bayesian sampling of emulators, with samples weighted by their likelihood with respect to a set of observational constraints. The emulators are trained on a large perturbed parameter ensemble of global simulations of the recent past and the equilibrium response to doubled CO2. Emulated global EV parameters are converted to the relevant regional scale through downscaling relationships derived from a smaller perturbed parameter regional climate model ensemble. The simultaneous fitting of the EV model to regional model data and observations allows the characterisation of how observed extremes may change in the future, irrespective of biases that may be present in the regional models' simulation of the recent past climate. The clearest impact of a parameter perturbation in this ensemble was found to be the depth to which plants can access water. Members with shallow soils tend to be biased hot and dry in summer for the observational period. These biases also appear to have an impact on the potential future response for summer temperatures, with some members with shallow soils having increases for extremes that reduce with extreme severity. We apply this methodology for London, using the

  7. A robust optimization model for green regional logistics network design with uncertainty in future logistics demand

    Directory of Open Access Journals (Sweden)

    Dezhi Zhang

    2015-12-01

    This article proposes a new model to address the design problem of a sustainable regional logistics network with uncertainty in future logistics demand. In the proposed model, the future logistics demand is assumed to be a random variable with a given probability distribution. A set of chance constraints with regard to logistics service capacity and environmental impacts is incorporated to consider the sustainability of logistics network design. The proposed model is formulated as a two-stage robust optimization problem. The first-stage problem, before the realization of future logistics demand, aims to minimize a risk-averse objective by determining the optimal location and size of logistics parks with CO2 emission taxes taken into consideration. The second stage, after the uncertain logistics demand has been determined, is a scenario-based stochastic logistics service route choice equilibrium problem. A heuristic solution algorithm, which is a combination of a penalty function method, a genetic algorithm, and a Gauss–Seidel decomposition approach, is developed to solve the proposed model. An illustrative example is given to show the application of the proposed model and solution algorithm. The findings show that the total social welfare of the logistics system depends very much on the level of uncertainty in future logistics demand, the capital budget for logistics parks, and the confidence levels of the chance constraints.

  8. A Review of Water Isotopes in Atmospheric General Circulation Models: Recent Advances and Future Prospects

    Directory of Open Access Journals (Sweden)

    Xi Xi

    2014-01-01

    Stable water isotopologues, mainly 1H2O, 1H2HO (HDO), and 1H218O, are useful tracers for processes in the global hydrological cycle. The incorporation of water isotopes into Atmospheric General Circulation Models (AGCMs) since 1984 has helped scientists gain substantial new insights into our present and past climate. In recent years, there have been several significant advances in water isotope modeling in AGCMs. This paper reviews and synthesizes key advances accomplished in modeling (1) surface evaporation, (2) condensation, (3) supersaturation, (4) postcondensation processes, (5) the vertical distribution of water isotopes, and (6) the spatial δ18O-temperature slope, and in utilizing (1) the spectral nudging technique, (2) higher model resolutions, and (3) coupled atmosphere-ocean models. It also reviews model validation through comparisons of model outputs with ground-based and spaceborne measurements. In the end, it identifies knowledge gaps and discusses future prospects of modeling and model validation.

  9. Closed loop models for analyzing engineering requirements for simulators

    Science.gov (United States)

    Baron, S.; Muralidharan, R.; Kleinman, D.

    1980-01-01

    A closed-loop analytic model incorporating a model of the human pilot (namely, the optimal control model) was developed to allow certain simulation design tradeoffs to be evaluated quantitatively. This model was applied to a realistic flight control problem. The resulting model is used to analyze both overall simulation effects and the effects of individual elements. The results show that, as compared to an ideal continuous simulation, the discrete simulation can result in significant performance and/or workload penalties.

  10. Charging and billing in modern communications networks : A comprehensive survey of the state of the art and future requirements

    NARCIS (Netherlands)

    Kühne, R.; Huitema, G.B.; Carle, G.

    2012-01-01

    In mobile telecommunication networks the trend for an increasing heterogeneity of access networks, the convergence with fixed networks as well as with the Internet are apparent. The resulting future converged network with an expected wide variety of services and a possibly stiff competition between

  11. Charging and billing in modern communications networks : A comprehensive survey of the state of art and future requirements

    NARCIS (Netherlands)

    Kuehne, Ralph; Huitema, George; Carle, George

    2012-01-01

    In mobile telecommunication networks the trend for an increasing heterogeneity of access networks, the convergence with fixed networks as well as with the Internet are apparent. The resulting future converged network with an expected wide variety of services and a possibly stiff competition between

  12. Constraining RS Models by Future Flavor and Collider Measurements: A Snowmass Whitepaper

    Energy Technology Data Exchange (ETDEWEB)

    Agashe, Kaustubh [Maryland U.; Bauer, Martin [Chicago U., EFI; Goertz, Florian [Zurich, ETH; Lee, Seung J. [Korea Inst. Advanced Study, Seoul; Vecchi, Luca [Maryland U.; Wang, Lian-Tao [Chicago U., EFI; Yu, Felix [Fermilab

    2013-10-03

    Randall-Sundrum models are models of quark flavor, because they explain the hierarchies in the quark masses and mixings in terms of order one localization parameters of extra dimensional wavefunctions. The same small numbers which generate the light quark masses suppress contributions to flavor violating tree level amplitudes. In this note we update universal constraints from electroweak precision parameters and demonstrate how future measurements of flavor violation in ultra rare decay channels of Kaons and B mesons will constrain the parameter space of this type of models. We show how collider signatures are correlated with these flavor measurements and compute projected limits for direct searches at the 14 TeV LHC run, a 14 TeV LHC luminosity upgrade, a 33 TeV LHC energy upgrade, and a potential 100 TeV machine. We further discuss the effects of a warped model of leptons in future measurements of lepton flavor violation.

  13. Biological ensemble modeling to evaluate potential futures of living marine resources

    DEFF Research Database (Denmark)

    Gårdmark, Anna; Lindegren, Martin; Neuenfeldt, Stefan

    2013-01-01

    in all models, intense fishing prevented recovery, and climate change further decreased the cod population. Our study demonstrates how the biological ensemble modeling approach makes it possible to evaluate the relative importance of different sources of uncertainty in future species responses, as well......) as an example. The core of the approach is to expose an ensemble of models with different ecological assumptions to climate forcing, using multiple realizations of each climate scenario. We simulated the long-term response of cod to future fishing and climate change in seven ecological models ranging from...... as to seek scientific conclusions and sustainable management solutions robust to uncertainty of food web processes in the face of climate change...

  14. Comparing projections of future changes in runoff from hydrological and biome models in ISI-MIP

    NARCIS (Netherlands)

    Davie, J.C.S.; Falloon, P.D.; Kahana, R.; Dankers, R.; Betts, R.; Portmann, F.T.; Wisser, D.; Clark, D.B.; Ito, A.; Masaki, Y.; Nishina, K.; Fekete, B.; Tessler, Z.; Wada, Y.; Liu, X.; Tang, Q.; Hagemann, S.; Stacke, T.; Pavlick, R.; Schaphoff, S.; Gosling, S.N.; Franssen, W.H.P.; Arnell, N.

    2013-01-01

    Future changes in runoff can have important implications for water resources and flooding. In this study, runoff projections from ISI-MIP (Inter-sectoral Impact Model Inter-comparison Project) simulations forced with HadGEM2-ES bias-corrected climate data under the Representative Concentration Pathw

  15. Comparing projections of future changes in runoff from hydrological and biome models in ISI-MIP

    NARCIS (Netherlands)

    Davie, J. C. S.; Falloon, P. D.; Kahana, R.; Dankers, R.; Betts, R.; Portmann, F. T.; Wisser, D.; Clark, D. B.; Ito, A.; Masaki, Y.; Nishina, K.; Fekete, B.; Tessler, Z.; Wada, Y.; Liu, X.; Tang, Q.; Hagemann, S.; Stacke, T.; Pavlick, R.; Schaphoff, S.; Gosling, S. N.; Franssen, W.; Arnell, N.

    2013-01-01

    Future changes in runoff can have important implications for water resources and flooding. In this study, runoff projections from ISI-MIP (Inter-sectoral Impact Model Intercomparison Project) simulations forced with HadGEM2-ES bias-corrected climate data under the Representative Concentration Pathwa

  16. 49 CFR 526.5 - Earning offsetting monetary credits in future model years.

    Science.gov (United States)

    2010-10-01

    49 CFR 526.5 (revised as of 2010-10-01): Earning offsetting monetary credits in future model years. Section 526.5, Other Regulations Relating to Transportation (Continued), National Highway Traffic Safety Administration, Department of Transportation, Petitions and Plans for Relief under the Automobile Fuel Efficiency...

  17. Harmonization of future needs for dermal exposure assessment and modeling : a workshop report

    NARCIS (Netherlands)

    Marquart, H.; Maidment, S.; Mcclaflin, J.L.; Fehrenbacher, M.C.

    2001-01-01

    Dermal exposure assessment and modeling is still in early phases of development. This article presents the results of a workshop organized to harmonize the future needs in this field. Methods for dermal exposure assessment either assess the mass of contaminant that is transferred to the skin, or the

  18. Scalar dark energy models mimicking ΛCDM with arbitrary future evolution

    Energy Technology Data Exchange (ETDEWEB)

    Astashenok, Artyom V. [Baltic Federal University of I. Kant, Department of Theoretical Physics, 236041, 14, Nevsky st., Kaliningrad (Russian Federation); Nojiri, Shin' ichi, E-mail: nojiri@phys.nagoya-u.ac.jp [Department of Physics, Nagoya University, Nagoya 464-8602 (Japan); Kobayashi-Maskawa Institute for the Origin of Particles and the Universe, Nagoya University, Nagoya 464-8602 (Japan); Odintsov, Sergei D. [Institucio Catalana de Recerca i Estudis Avancats (ICREA), Barcelona (Spain); Institut de Ciencies de l' Espai (CSIC-IEEC), Campus UAB, Facultat de Ciencies, Torre C5-Par-2a pl, E-08193 Bellaterra, Barcelona (Spain); Scherrer, Robert J. [Department of Physics and Astronomy, Vanderbilt University, Nashville TN 37235 (United States)

    2012-07-09

    Dark energy models with various scenarios of evolution are considered from the viewpoint of the formalism for the equation of state. It is shown that these models are compatible with current astronomical data. Some of the models presented here evolve arbitrarily close to ΛCDM up to the present, but diverge in the future into a number of different possible asymptotic states, including asymptotic de Sitter (pseudo-rip) evolution, little rips with disintegration of bound structures, and various forms of finite-time future singularities. Therefore it is impossible from observational data to determine whether the universe will end in a future singularity or not. We demonstrate that the models under consideration are stable for a long period of time (billions of years) before entering a Little Rip/Pseudo-Rip induced dissolution of bound structures or before entering a soft finite-time future singularity. Finally, the physical consequences of Little Rip, Types II, III and Big Crush singularities are briefly compared.

  19. Modelling land Use Change : Improving the prediction of future land use patterns

    NARCIS (Netherlands)

    de Nijs, A.C.M.

    2009-01-01

    Modelling land Use Change: Improving the prediction of future land use patterns. Man has been altering his living environment since prehistoric times and will continue to do so. It is predicted that by 2030 about 90,000 ha will be needed for residential developments in the Netherlands and 55,000 ha

  20. Modelling the future of Boswellia papyrifera population and its frankincense production

    NARCIS (Netherlands)

    Lemenih, M.; Arts, B.J.M.; Wiersum, K.F.; Bongers, F.

    2014-01-01

    Sustainable production of the aromatic forest product frankincense is at stake due to rapid decline in its resource base. This affects livelihoods of thousands of citizens and several global industries. A system dynamic model approach is used to predict the future population of Boswellia papyrifera

  1. Assessing uncertainties in global cropland futures using a conditional probabilistic modelling framework

    Science.gov (United States)

    Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D. A.; Brogaard, Sara; van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut

    2016-11-01

    We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS (Lund-Potsdam-Jena General Ecosystem Simulator), using socio-economic data from the SSPs and climate data from the RCPs (representative concentration pathways). The simulated range of global cropland is 893-2380 Mha in 2100 (± 1 standard deviation), with the main uncertainties arising from differences in the socio-economic conditions prescribed by the SSP scenarios and the assumptions that underpin the translation of qualitative SSP storylines into quantitative model input parameters. Uncertainties in the assumptions for population growth, technological change and cropland degradation were found to be the most important for global cropland, while uncertainty in food consumption had less influence on the results. The uncertainties arising from climate variability and the differences between climate change scenarios do not strongly affect the range of global cropland futures. Some overlap occurred across all of the conditional probabilistic futures, except for those based on SSP3. We conclude that completely different socio-economic and climate change futures, although sharing low to medium population development, can result in very similar cropland areas on the aggregated global scale.

  2. Prediction of Future Observations in Polynomial Growth Curve Models. Part 1.

    Science.gov (United States)

    1983-03-01

    Prediction of Future Observations in Polynomial Growth Curve Models, Part 1. C. Radhakrishna Rao, University of Pittsburgh, Schenley Hall, Pittsburgh, PA 15260.

  3. Charging a renewable future: The impact of electric vehicle charging intelligence on energy storage requirements to meet renewable portfolio standards

    Science.gov (United States)

    Forrest, Kate E.; Tarroja, Brian; Zhang, Li; Shaffer, Brendan; Samuelsen, Scott

    2016-12-01

    Increased usage of renewable energy resources is key for energy system evolution to address environmental concerns. Capturing variable renewable power requires the use of energy storage to shift generation and load demand. The integration of plug-in electric vehicles, however, impacts the load demand profile and therefore the capacity of energy storage required to meet renewable utilization targets. This study examines how the intelligence of plug-in electric vehicle (PEV) integration impacts the required capacity of energy storage systems to meet renewable utilization targets for a large-scale energy system, using California as an example for meeting a 50% and 80% renewable portfolio standard (RPS) in 2030 and 2050. For an 80% RPS in 2050, immediate charging of PEVs requires the installation of an aggregate energy storage system with a power capacity of 60% of the installed renewable capacity and an energy capacity of 2.3% of annual renewable generation. With smart charging of PEVs, the required power capacity drops to 16% and the required energy capacity drops to 0.6%, and with vehicle-to-grid (V2G) charging, non-vehicle energy storage systems are no longer required. Overall, this study highlights the importance of intelligent PEV charging for minimizing the scale of infrastructure required to meet renewable utilization targets.
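
    The reported fractions translate directly into storage capacities once an installed renewable capacity and an annual generation figure are fixed. A small worked example follows; the California-scale capacity and generation numbers are assumptions, only the percentages come from the abstract:

    ```python
    # Hypothetical system for illustration (not the study's numbers):
    installed_renewable_gw = 120.0        # GW of installed renewable capacity
    annual_renewable_twh = 300.0          # TWh of annual renewable generation

    # (power fraction, energy fraction) for the 80% RPS case, as reported in the abstract:
    scenarios = {
        "immediate charging": (0.60, 0.023),
        "smart charging":     (0.16, 0.006),
        "V2G charging":       (0.00, 0.000),
    }
    for name, (p_frac, e_frac) in scenarios.items():
        print(f"{name:>18}: {p_frac * installed_renewable_gw:6.1f} GW power, "
              f"{e_frac * annual_renewable_twh:5.1f} TWh energy storage")
    ```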

  4. Ecological models supporting environmental decision making: a strategy for the future.

    Science.gov (United States)

    Schmolke, Amelie; Thorbek, Pernille; DeAngelis, Donald L; Grimm, Volker

    2010-08-01

    Ecological models are important for environmental decision support because they allow the consequences of alternative policies and management scenarios to be explored. However, current modeling practice is unsatisfactory. A literature review shows that the elements of good modeling practice have long been identified but are widely ignored. The reasons for this might include lack of involvement of decision makers, lack of incentives for modelers to follow good practice, and the use of inconsistent terminologies. As a strategy for the future, we propose a standard format for documenting models and their analyses: transparent and comprehensive ecological modeling (TRACE) documentation. This standard format will disclose all parts of the modeling process to scrutiny and make modeling itself more efficient and coherent. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  5. Ecological models supporting environmental decision making: a strategy for the future

    Science.gov (United States)

    Schmolke, Amelie; Thorbek, Pernille; DeAngelis, Donald L.; Grimm, Volker

    2010-01-01

    Ecological models are important for environmental decision support because they allow the consequences of alternative policies and management scenarios to be explored. However, current modeling practice is unsatisfactory. A literature review shows that the elements of good modeling practice have long been identified but are widely ignored. The reasons for this might include lack of involvement of decision makers, lack of incentives for modelers to follow good practice, and the use of inconsistent terminologies. As a strategy for the future, we propose a standard format for documenting models and their analyses: transparent and comprehensive ecological modeling (TRACE) documentation. This standard format will disclose all parts of the modeling process to scrutiny and make modeling itself more efficient and coherent.

  6. The Future of Eurasian Boreal Forests: Ecological Modeling Projections in the Russian Federation

    Science.gov (United States)

    Lutz, D.; Shugart, H.

    2008-12-01

    Ecological modeling is one of the primary methodologies for making predictions about future changes in forested ecosystems such as those occurring in Northern Eurasia and Siberia. In particular, combining ecological modeling with global circulation model simulation outputs is a method by which scientists can forecast the impact of climate change on biodiversity (Thuiller, 2007) as well as on the forested landscape. Dynamic global vegetation models (DGVMs) have been designed for specifically this purpose; however, these vegetation models run at large spatial scales and as a result make predictions that are highly uncertain (Purves and Pacala, 2008). In previous papers, we discussed the FAREAST forest gap model and its ability to accurately predict boreal forest dynamics at smaller scales and higher resolution than DGVMs. This presentation investigates the use of the FAREAST gap model, modified for spatial expansion to cover the entire country of Russia, to predict future land cover trends under different warming scenarios. The poster provides the initial framework for the project, as well as some initial results. The collection of input variables needed by FAREAST to model the Russian continent will involve collaboration with the Russian Academy of Sciences (CEPF). Together we have developed a framework in which to amalgamate both original (temperature, precipitation, soil values) parameters and new parameters (fire probability, logging probability) into a GIS database that can be integrated with the FAREAST model. This framework will be capable of providing visual and graphical output for interpretation of large model runs. In order to ensure accuracy in FAREAST's ability to simulate the current environment, a run of the model under current-day conditions will be compared to recent remote sensing land cover maps. The GLC2000 land cover classification project (EU JRC) will be the primary validation method, with additional validation through other biophysical

  7. Two decades of numerical modelling to understand long term fluvial archives: Advances and future perspectives

    Science.gov (United States)

    Veldkamp, A.; Baartman, J. E. M.; Coulthard, T. J.; Maddy, D.; Schoorl, J. M.; Storms, J. E. A.; Temme, A. J. A. M.; van Balen, R.; van De Wiel, M. J.; van Gorp, W.; Viveen, W.; Westaway, R.; Whittaker, A. C.

    2017-06-01

    The development and application of numerical models to investigate fluvial sedimentary archives has increased during the last decades resulting in a sustained growth in the number of scientific publications with keywords, 'fluvial models', 'fluvial process models' and 'fluvial numerical models'. In this context we compile and review the current contributions of numerical modelling to the understanding of fluvial archives. In particular, recent advances, current limitations, previous unexpected results and future perspectives are all discussed. Numerical modelling efforts have demonstrated that fluvial systems can display non-linear behaviour with often unexpected dynamics causing significant delay, amplification, attenuation or blurring of externally controlled signals in their simulated record. Numerical simulations have also demonstrated that fluvial records can be generated by intrinsic dynamics without any change in external controls. Many other model applications demonstrate that fluvial archives, specifically of large fluvial systems, can be convincingly simulated as a function of the interplay of (palaeo) landscape properties and extrinsic climate, base level and crustal controls. All discussed models can, after some calibration, produce believable matches with real world systems suggesting that equifinality - where a given end state can be reached through many different pathways starting from different initial conditions and physical assumptions - plays an important role in fluvial records and their modelling. The overall future challenge lies in the development of new methodologies for a more independent validation of system dynamics and research strategies that allow the separation of intrinsic and extrinsic record signals using combined fieldwork and modelling.

  8. Required Collaborative Work in Online Courses: A Predictive Modeling Approach

    Science.gov (United States)

    Smith, Marlene A.; Kellogg, Deborah L.

    2015-01-01

    This article describes a predictive model that assesses whether a student will have greater perceived learning in group assignments or in individual work. The model produces correct classifications 87.5% of the time. The research is notable in that it is the first in the education literature to adopt a predictive modeling methodology using data…

  9. Integrated Modeling of Water Policy Futures in the Imperial-Mexicali Valleys

    Science.gov (United States)

    Kjelland, M. K.; Forster, C. B.; Grant, W. E.; Collins, K.

    2004-12-01

    planners and residents of the IMVs are to make sound socioeconomic and environmental policy decisions. We use a spatially-explicit, stochastic, simulation compartment model (based on difference equations with a daily time step, programmed with STELLA software) to simulate the hydrologic system that underlies our broader modeling of the socioeconomic and environmental future of the IMVs. Alternative future scenarios are defined and used to explore the hydrologic and environmental implications of variations in future Colorado River flows and various water-related policy decisions. The results of a suite of simulations, made assuming that the Sea is not impounded in small sub-Seas, suggest that the salinity of the Salton Sea is most likely to continue increasing. If this is the case, and if restoration of the Salton Sea continues to be a high priority, then more aggressive water conservation methods or a much smaller Salton Sea will be required. More aggressive conservation will lead to greater socioeconomic concerns, while a smaller Sea will increase concerns regarding windblown dust from the exposed lakebed.

  10. Spatial Simulation Modelling of Future Forest Cover Change Scenarios in Luangprabang Province, Lao PDR

    Directory of Open Access Journals (Sweden)

    Khamma Homsysavath

    2011-08-01

    Taking Luangprabang province in the Lao People's Democratic Republic (PDR) as an example, we simulated future forest cover changes under the business-as-usual (BAU), pessimistic and optimistic scenarios based on the Markov-cellular automata (MCA) model. We computed transition probabilities from satellite-derived forest cover maps (1993 and 2000) using Markov chains, while the "weights of evidence" technique was used to generate transition potential maps. The initial forest cover map (1993), the transition potential maps and the 1993–2000 transition probabilities were used to calibrate the model. Forest cover simulations were then performed from 1993 to 2007 at an annual time-step. The simulated forest cover map for 2007 was compared to the observed (actual) forest cover map for 2007 in order to test the accuracy of the model. Following the successful calibration and validation, future forest cover changes were simulated up to 2014 under different scenarios. The MCA simulations under the BAU and pessimistic scenarios projected that current forest areas would decrease, whereas unstocked forest areas would increase in the future. Conversely, the optimistic scenario projected that current forest areas would increase in the future if strict forestry laws enforcing conservation in protected forest areas are implemented. The three simulation scenarios provide a very good case study for simulating future forest cover changes at the subnational level (Luangprabang province). Thus, the simulated future forest cover changes can possibly be used as a guideline to set reference scenarios as well as to undertake REDD/REDD+ preparedness activities within the study area.
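
    The Markov component of such an MCA model is a transition probability matrix obtained by cross-tabulating two classified maps. A minimal sketch on a synthetic pair of rasters (the class codes and the 5% degradation rate are illustrative, not values derived from the study area):

    ```python
    import numpy as np

    def transition_matrix(map_t0, map_t1, n_classes):
        """Row-normalised land-cover transition probabilities between two dates."""
        counts = np.zeros((n_classes, n_classes))
        for a, b in zip(map_t0.ravel(), map_t1.ravel()):
            counts[a, b] += 1
        rows = counts.sum(axis=1, keepdims=True)
        return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

    rng = np.random.default_rng(0)
    lc_1993 = rng.integers(0, 3, size=(200, 200))       # 0 = forest, 1 = unstocked, 2 = other
    lc_2000 = lc_1993.copy()
    degrade = rng.random(lc_1993.shape) < 0.05           # 5% of forest degrades to unstocked
    lc_2000[(lc_1993 == 0) & degrade] = 1

    P = transition_matrix(lc_1993, lc_2000, n_classes=3)
    print(P.round(3))   # P[i, j]: probability that class i in 1993 becomes class j in 2000
    ```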

  11. An epidemic model for the future progression of the current Haiti cholera epidemic

    Science.gov (United States)

    Bertuzzo, E.; Mari, L.; Righetto, L.; Casagrandi, R.; Gatto, M.; Rodriguez-Iturbe, I.; Rinaldo, A.

    2012-04-01

    As a major cholera epidemic progresses in Haiti, and the figures of the infection, up to December 2011, climb to 522,000 cases and 7,000 deaths, the development of general models to track and predict the evolution of the outbreak, so as to guide the allocation of medical supplies and staff, is gaining notable urgency. We propose here a spatially explicit epidemic model that accounts for the dynamics of susceptible and infected individuals as well as the redistribution of Vibrio cholerae, the causative agent of the disease, among different human communities. In particular, we model two spreading pathways: the advection of pathogens through hydrologic connections and the dissemination due to human mobility, described by means of a gravity-like model. To this end the country has been divided into hydrologic units based on drainage directions derived from a digital terrain model. Moreover, the population of each unit has been estimated from census data downscaled to 1 km x 1 km resolution via remotely sensed geomorphological information (LandScan project). The model directly accounts for the role of rainfall patterns in driving the seasonality of cholera outbreaks. The two main outbreaks in fact occurred during the rainy seasons (October and May), when extensive flooding severely worsened sanitation conditions and, in turn, raised the risk of infection. The model's capability to reproduce the spatiotemporal features of the epidemic to date lends robustness to the foreseen future developments. To this end, we generate realistic scenarios of future precipitation in order to forecast possible epidemic paths up to the end of 2013. In this context, the duration of acquired immunity, a hotly debated topic in the scientific community, emerges as a controlling factor for the progression of the epidemic in the near future. The framework presented here can straightforwardly be used to evaluate the effectiveness of alternative intervention strategies like mass vaccinations
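
    Human-mobility-driven redistribution of pathogens is commonly written as a gravity-like model in which the flux between two communities grows with their populations and decays with distance. A schematic sketch; the deterrence length, mobility rate and toy communities are assumptions, not the calibrated values of the Haiti model:

    ```python
    import numpy as np

    def gravity_fluxes(populations, coords_km, deterrence_km=50.0):
        """Relative mobility fluxes Q[i, j] ~ P_i * P_j * exp(-d_ij / D), rows normalised."""
        pop = np.asarray(populations, dtype=float)
        xy = np.asarray(coords_km, dtype=float)
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
        q = np.outer(pop, pop) * np.exp(-d / deterrence_km)
        np.fill_diagonal(q, 0.0)                      # no self-flux
        return q / q.sum(axis=1, keepdims=True)

    populations = [900_000, 250_000, 60_000]          # toy hydrologic-unit populations
    coords_km = [(0, 0), (40, 10), (120, 80)]
    Q = gravity_fluxes(populations, coords_km)

    bacteria = np.array([1.0, 0.0, 0.0])              # pathogen load starts in unit 0
    mobility_rate = 0.1                               # fraction redistributed per day
    for day in range(30):
        bacteria = bacteria + mobility_rate * (Q.T @ bacteria - bacteria)
    print(bacteria.round(3))                          # load spreads toward the other units
    ```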

  12. A dataset of future daily weather data for crop modelling over Europe derived from climate change scenarios

    Science.gov (United States)

    Duveiller, G.; Donatelli, M.; Fumagalli, D.; Zucchini, A.; Nelson, R.; Baruth, B.

    2017-02-01

    Coupled atmosphere-ocean general circulation models (GCMs) simulate different realizations of possible future climates at global scale under contrasting scenarios of land-use and greenhouse gas emissions. Such data require several additional processing steps before they can be used to drive impact models. Spatial downscaling, typically by regional climate models (RCMs), and bias correction are two such steps that have already been addressed for Europe. Yet, the errors in the resulting daily meteorological variables may be too large for specific model applications. Crop simulation models are particularly sensitive to these inconsistencies and thus require further processing of GCM-RCM outputs. Moreover, crop models are often run in a stochastic manner by using various plausible weather time series (often generated using stochastic weather generators) to represent climate time scale for a period of interest (e.g. 2000 ± 15 years), while GCM simulations typically provide a single time series for a given emission scenario. To inform agricultural policy-making, data on near- and medium-term decadal time scales, e.g. 2020 or 2030, are mostly requested. Taking a sample of multiple years from these unique time series to represent time horizons in the near future is particularly problematic because selecting overlapping years may lead to spurious trends, creating artefacts in the results of the impact model simulations. This paper presents a database of consolidated and coherent future daily weather data for Europe that addresses these problems. Input data consist of daily temperature and precipitation from three dynamically downscaled and bias-corrected regional climate simulations of the IPCC A1B emission scenario created within the ENSEMBLES project. Solar radiation is estimated from temperature based on an auto-calibration procedure. Wind speed and relative air humidity are collected from historical series. From these variables, reference evapotranspiration and vapour pressure

  13. Exhaust-gas aftertreatment concepts for meeting future emission requirements; Abgasnachbehandlungs-Konzepte zur Erfuellung zukuenftiger Emissionsrichtlinien

    Energy Technology Data Exchange (ETDEWEB)

    Toepfer, Tobias; Weiskirch, Christian; Behnk, Kai [IAV GmbH, Berlin (Germany). Fachbereich Nutzfahrzeuge; Mueller, Raimund [Emitec Gesellschaft fuer Emissionstechnologie mbH, Lohmar (Germany). Technischer Vertrieb Nutzfahrzeuge und Non-road Anwendungen

    2011-02-15

    Proceeding from a modern two-stage supercharged commercial-vehicle engine employing measures inside the engine to satisfy the NOx limit values for Euro V and EEV, as well as a partial-flow particulate filter system for meeting the particulate limit values prescribed under Euro V and EEV, the IAV investigated potential approaches to complying with future emission standards in the on-road and off-road segments. (orig.)

  14. Region-specific study of the electric utility industry: financial history and future power requirements for the VACAR region

    Energy Technology Data Exchange (ETDEWEB)

    Pochan, M.J.

    1985-07-01

    Financial data for the period 1966 to 1981 are presented for the four investor-owned electric utilities in the VACAR (Virginia-Carolinas) region. This region was selected as representative for the purpose of assessing the availability, reliability, and cost of electric power for the future in the United States. The estimated demand for power and planned additions to generating capacity for the region through the year 2000 are also given.

  15. Macroeconomic factors and oil futures prices. A data-rich model

    Energy Technology Data Exchange (ETDEWEB)

    Zagaglia, Paolo [Modelling Division, Sveriges Riksbank (Sweden)

    2010-03-15

    I study the dynamics of oil futures prices in the NYMEX using a large panel dataset that includes global macroeconomic indicators, financial market indices, quantities and prices of energy products. I extract common factors from the panel data series and estimate a Factor-Augmented Vector Autoregression for the maturity structure of oil futures prices. I find that latent factors generate information that, once combined with that of the yields, improves the forecasting performance for oil prices. Furthermore, I show that a factor correlated to purely financial developments contributes to the model performance, in addition to factors related to energy quantities and prices. (author)
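
    The two steps described, extracting common factors from a large panel and augmenting an autoregression of the futures series with them, can be sketched as follows; the synthetic panel, the number of factors and the lag order are illustrative assumptions, not the paper's dataset or specification:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    T, N = 300, 40
    panel = rng.standard_normal((T, N)).cumsum(axis=0)     # stand-in macro/financial panel
    futures_price = rng.standard_normal(T).cumsum()        # stand-in oil futures series

    # 1) Extract common factors from the standardised panel (principal components).
    z = (panel - panel.mean(axis=0)) / panel.std(axis=0)
    u, s, _ = np.linalg.svd(z, full_matrices=False)
    factors = u[:, :3] * s[:3]                             # first three common factors

    # 2) Factor-augmented VAR(1) on first differences, estimated by least squares.
    y = np.diff(np.column_stack([futures_price, factors]), axis=0)
    Y = y[1:]
    X = np.hstack([np.ones((len(y) - 1, 1)), y[:-1]])
    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)

    # One-step-ahead forecast of the futures-price change:
    x_last = np.hstack([1.0, y[-1]])
    print("predicted next change in the futures series:", (x_last @ coef)[0])
    ```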

  16. Comparison of strategies for model predictive control for home heating in future energy systems

    DEFF Research Database (Denmark)

    Vogler-Finck, Pierre Jacques Camille; Popovski, Petar; Wisniewski, Rafal

    2017-01-01

    Model predictive control is seen as one of the key future enablers for increasing energy efficiency in buildings. This paper presents a comparison of the performance of the control for different formulations of the objective function. This comparison is made in a simulation study on a single building...... using historical weather and power system data from Denmark. Trade-offs between energy consumption, comfort and incurred CO2 emissions depending on the chosen objective function are quantified, highlighting the need to carefully select the strategy used in future design and implementation, rather than...

  17. Future Directions in Medical Physics: Models, Technology, and Translation to Medicine

    Science.gov (United States)

    Siewerdsen, Jeffrey

    The application of physics in medicine has been integral to major advances in diagnostic and therapeutic medicine. Two primary areas represent the mainstay of medical physics research in the last century: in radiation therapy, physicists have propelled advances in conformal radiation treatment and high-precision image guidance; and in diagnostic imaging, physicists have advanced an arsenal of multi-modality imaging that includes CT, MRI, ultrasound, and PET as indispensable tools for noninvasive screening, diagnosis, and assessment of treatment response. In addition to their role in building such technologically rich fields of medicine, physicists have also become integral to daily clinical practice in these areas. The future suggests new opportunities for multi-disciplinary research bridging physics, biology, engineering, and computer science, and collaboration in medical physics carries a strong capacity for identification of significant clinical needs, access to clinical data, and translation of technologies to clinical studies. In radiation therapy, for example, the extraction of knowledge from large datasets on treatment delivery, image-based phenotypes, genomic profile, and treatment outcome will require innovation in computational modeling and connection with medical physics for the curation of large datasets. Similarly in imaging physics, the demand for new imaging technology capable of measuring physical and biological processes over orders of magnitude in scale (from molecules to whole organ systems) and exploiting new contrast mechanisms for greater sensitivity to molecular agents and subtle functional / morphological change will benefit from multi-disciplinary collaboration in physics, biology, and engineering. Also in surgery and interventional radiology, where needs for increased precision and patient safety meet constraints in cost and workflow, development of new technologies for imaging, image registration, and robotic assistance can leverage

  18. Forecasting Model for Crude Oil Price Using Artificial Neural Networks and Commodity Futures Prices

    CERN Document Server

    Kulkarni, Siddhivinayak

    2009-01-01

    This paper presents a model based on a multilayer feedforward neural network to forecast the crude oil spot price direction in the short term, up to three days ahead. A great deal of attention was paid to finding the optimal ANN model structure. In addition, several methods of data pre-processing were tested. Our approach is to create a benchmark based on lagged values of the pre-processed spot price, then add pre-processed futures prices for one, two, three, and four months to maturity, one by one and also altogether. The results on the benchmark suggest that a dynamic model of 13 lags is optimal for forecasting the spot price direction in the short term. Further, the forecast accuracy of the direction of the market was 78%, 66%, and 53% for one, two, and three days into the future, respectively. For all the experiments that include futures data as an input, the results show that in the short term, futures prices do hold new information on the spot price direction. The results obtained will generate comprehensive understanding of the cr...
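
    The benchmark described, lagged values of the pre-processed spot price feeding a feedforward network that predicts next-day direction, can be sketched with scikit-learn; the synthetic price series and network size are illustrative assumptions, not the paper's data or tuned architecture:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    def lagged_matrix(series, n_lags):
        """Rows of [x_{t-1}, ..., x_{t-n_lags}] paired with the direction of the move to x_t."""
        X = np.column_stack([series[n_lags - k - 1:len(series) - k - 1] for k in range(n_lags)])
        y = (np.diff(series)[n_lags - 1:] > 0).astype(int)     # 1 = price went up
        return X, y

    rng = np.random.default_rng(0)
    spot = np.cumsum(rng.normal(0, 1, 1500)) + 60.0            # stand-in crude spot series

    X, y = lagged_matrix(spot, n_lags=13)                      # the 13-lag benchmark
    split = int(0.8 * len(X))
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    clf.fit(X[:split], y[:split])
    print("out-of-sample directional accuracy:", round(clf.score(X[split:], y[split:]), 3))
    ```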

  19. The U.S. Geological Survey Monthly Water Balance Model Futures Portal

    Science.gov (United States)

    Bock, Andrew R.; Hay, Lauren E.; Markstrom, Steven L.; Emmerich, Christopher; Talbert, Marian

    2017-05-03

    The U.S. Geological Survey Monthly Water Balance Model Futures Portal (https://my.usgs.gov/mows/) is a user-friendly interface that summarizes monthly historical and simulated future conditions for seven hydrologic and meteorological variables (actual evapotranspiration, potential evapotranspiration, precipitation, runoff, snow water equivalent, atmospheric temperature, and streamflow) at locations across the conterminous United States (CONUS). The estimates of these hydrologic and meteorological variables were derived using a Monthly Water Balance Model (MWBM), a modular system that simulates monthly estimates of components of the hydrologic cycle using monthly precipitation and atmospheric temperature inputs. Precipitation and atmospheric temperature from 222 climate datasets spanning historical conditions (1952 through 2005) and simulated future conditions (2020 through 2099) were summarized for hydrographic features and used to drive the MWBM for the CONUS. The MWBM input and output variables were organized into an open-access database. An Open Geospatial Consortium, Inc., Web Feature Service allows the querying and identification of hydrographic features across the CONUS. To connect the Web Feature Service to the open-access database, a user interface—the Monthly Water Balance Model Futures Portal—was developed to allow the dynamic generation of summary files and plots based on plot type, geographic location, specific climate datasets, period of record, MWBM variable, and other options. Both the plots and the data files are made available to the user for download.
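
    A monthly water balance model of this type advances a snow store and a soil-moisture store from monthly precipitation and temperature and outputs runoff and actual evapotranspiration. A heavily simplified single-store sketch; the thresholds and coefficients are illustrative, not those of the USGS MWBM:

    ```python
    def monthly_water_balance(precip_mm, temp_c, soil_max=150.0, melt_coeff=4.0, pet_coeff=5.0):
        """Toy monthly water balance: snowpack, a single soil-moisture store, AET and runoff."""
        soil, snow = soil_max / 2.0, 0.0
        results = []
        for p, t in zip(precip_mm, temp_c):
            snowfall = p if t < 0.0 else 0.0          # precipitation falls as snow below 0 C
            snow += snowfall
            melt = min(snow, max(t, 0.0) * melt_coeff)
            snow -= melt
            pet = max(t, 0.0) * pet_coeff             # crude temperature-based PET proxy
            water_in = (p - snowfall) + melt
            aet = min(pet, soil + water_in)           # actual ET limited by available water
            soil = soil + water_in - aet
            runoff = max(soil - soil_max, 0.0)        # surplus above the store becomes runoff
            soil -= runoff
            results.append({"runoff": runoff, "aet": aet, "swe": snow, "soil": soil})
        return results

    precip = [90, 80, 70, 60, 40, 20, 10, 15, 30, 60, 85, 95]   # mm/month, illustrative
    temp = [-5, -3, 2, 8, 14, 19, 22, 21, 16, 9, 3, -2]         # deg C, illustrative
    print(monthly_water_balance(precip, temp)[0])
    ```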

  20. Multiscale Modeling, Simulation and Visualization and Their Potential for Future Aerospace Systems

    Science.gov (United States)

    Noor, Ahmed K. (Compiler)

    2002-01-01

    This document contains the proceedings of the Training Workshop on Multiscale Modeling, Simulation and Visualization and Their Potential for Future Aerospace Systems held at NASA Langley Research Center, Hampton, Virginia, March 5 - 6, 2002. The workshop was jointly sponsored by Old Dominion University's Center for Advanced Engineering Environments and NASA. Workshop attendees were from NASA, other government agencies, industry, and universities. The objectives of the workshop were to give overviews of the diverse activities in hierarchical approach to material modeling from continuum to atomistics; applications of multiscale modeling to advanced and improved material synthesis; defects, dislocations, and material deformation; fracture and friction; thin-film growth; characterization at nano and micro scales; and, verification and validation of numerical simulations, and to identify their potential for future aerospace systems.

  1. The future of the Devon Ice cap: results from climate and ice dynamics modelling

    Science.gov (United States)

    Mottram, Ruth; Rodehacke, Christian; Boberg, Fredrik

    2017-04-01

    The Devon Ice Cap is an example of a relatively well monitored small ice cap in the Canadian Arctic. Close to Greenland, it shows a similar surface mass balance signal to glaciers in western Greenland. Here we use high resolution (5km) simulations from HIRHAM5 to drive the PISM glacier model in order to model the present day and future prospects of this small Arctic ice cap. Observational data from the Devon Ice Cap in Arctic Canada is used to evaluate the surface mass balance (SMB) data output from the HIRHAM5 model for simulations forced with the ERA-Interim climate reanalysis data and the historical emissions scenario run by the EC-Earth global climate model. The RCP8.5 scenario simulated by EC-Earth is also downscaled by HIRHAM5 and this output is used to force the PISM model to simulate the likely future evolution of the Devon Ice Cap under a warming climate. We find that the Devon Ice Cap is likely to continue its present day retreat, though in the future increased precipitation partly offsets the enhanced melt rates caused by climate change.

  2. Future of water resources in the Aral Sea Region, Central Asia - Reality-checked climate model projections

    Science.gov (United States)

    Asokan, Shilpa M.; Destouni, Georgia

    2014-05-01

    The future of water resources in a region invariably depends on its historic as well as present water use management policy. In order to understand past hydro-climatic conditions and changes, one needs to analyze observation data and their implications for climate and hydrology, such as temperature, precipitation, runoff and evapotranspiration in the region. In addition to changes in climate, human redistribution of water through land-use and water-use changes is found to significantly alter the water transfer from land to atmosphere through an increase or decrease in evapotranspiration. The Aral region in Central Asia, comprising the Aral Sea Drainage Basin and the Aral Sea, is an example case where human-induced changes in water use have led to one of the worst environmental disasters of our time, the desiccation of the Aral Sea. Identification of the historical hydro-climatic changes that have happened in this region and their drivers is required before one can project future changes to water and its availability in the landscape. Knowledge of the future of water resources in the Aral region is needed for planning to meet the increasing water and food demands of the growing population in conjunction with ecosystem sustainability. In order to project future scenarios of water on land, the Global Climate Model (GCM) ensemble of the Coupled Model Intercomparison Project, Phase 5 (CMIP5) was analyzed for its performance against hydrologically important, basin-scale observational climate and hydrological datasets. We found that the ensemble mean of 22 GCMs over-estimated the observed temperature by about 1°C for the historic period of 1961-1990. For the future extreme climate scenario RCP8.5, the increase in temperature was projected to be about 5°C by 2070-2099, a projection whose accuracy is questionable given the biases identified when comparing GCM ensemble results with observations for the period 1961-1990. In particular, the water balance components

  3. Divergent projections of future land use in the United States arising from different models and scenarios

    Science.gov (United States)

    Sohl, Terry L.; Wimberly, Michael; Radeloff, Volker C.; Theobald, David M.; Sleeter, Benjamin M.

    2016-01-01

    A variety of land-use and land-cover (LULC) models operating at scales from local to global have been developed in recent years, including a number of models that provide spatially explicit, multi-class LULC projections for the conterminous United States. This diversity of modeling approaches raises the question: how consistent are their projections of future land use? We compared projections from six LULC modeling applications for the United States and assessed quantitative, spatial, and conceptual inconsistencies. Each set of projections provided multiple scenarios covering a period from roughly 2000 to 2050. Given the unique spatial, thematic, and temporal characteristics of each set of projections, individual projections were aggregated to a common set of basic, generalized LULC classes (i.e., cropland, pasture, forest, range, and urban) and summarized at the county level across the conterminous United States. We found very little agreement in projected future LULC trends and patterns among the different models. Variability among scenarios for a given model was generally lower than variability among different models, in terms of both trends in the amounts of basic LULC classes and their projected spatial patterns. Even when different models assessed the same purported scenario, model projections varied substantially. Projections of agricultural trends were often far above the maximum historical amounts, raising concerns about the realism of the projections. Comparisons among models were hindered by major discrepancies in categorical definitions, and suggest a need for standardization of historical LULC data sources. To capture a broader range of uncertainties, ensemble modeling approaches are also recommended. However, the vast inconsistencies among LULC models raise questions about the theoretical and conceptual underpinnings of current modeling approaches. Given the substantial effects that land-use change can have on ecological and societal processes, there

  4. Achieving a System Operational Availability Requirement (ASOAR) Model

    Science.gov (United States)

    1992-07-01

    ASOAR requires only system- and end-item-level input data, not Line Replaceable Unit (LRU) input data. ASOAR usage provides concepts for major logistics...the Corp/Theater ADP Service Center II (CTASC II) to a system operational availability goal. The CTASC II system configuration had many redundant types

  5. The U.S. Geological Survey Monthly Water Balance Model Futures Portal

    Science.gov (United States)

    Bock, Andy

    2017-03-16

    Simulations of future climate suggest profiles of temperature and precipitation may differ significantly from those in the past. These changes in climate will likely lead to changes in the hydrologic cycle. As such, natural resource managers are in need of tools that can provide estimates of key components of the hydrologic cycle, uncertainty associated with the estimates, and limitations associated with the climate forcing data used to estimate these components. To help address this need, the U.S. Geological Survey Monthly Water Balance Model Futures Portal (https://my.usgs.gov/mows/) provides a user friendly interface to deliver hydrologic and meteorological variables for monthly historic and potential future climatic conditions across the continental United States.

  6. Uncertainties in future-proof decision-making: the Dutch Delta Model

    Science.gov (United States)

    IJmker, Janneke; Snippen, Edwin; Ruijgh, Erik

    2013-04-01

    In 1953, a number of European countries experienced flooding after a major storm event coming from the northwest. Over 2100 people died in the resulting floods, 1800 of them Dutch. This gave rise to the development of the so-called Delta Works and Zuiderzee Works that strongly reduced the flood risk in the Netherlands. These measures were a response to a large flooding event. As boundary conditions have changed (increasing population, increasing urban development, etc.), the flood risk should be evaluated continuously, and measures should be taken if necessary. The Delta Programme was designed to prepare for future changes and to limit the flood risk, taking into account economics, nature, landscape, residence and recreation. To support decisions in the Delta Programme, the Delta Model was developed. By using four different input scenarios (extremes in climate and economics) and variations in system setup, the outcomes of the Delta Model represent a range of possible outcomes for the hydrological situation in 2050 and 2100. These results flow into effect models that give insight into the integrated effects on freshwater supply (including navigation, industry and ecology) and flood risk. As the long-term water management policy of the Netherlands for the next decades will be based on these results, they have to be reliable. Therefore, a study was carried out to investigate the impact of uncertainties on the model outcomes. The study focused on "known unknowns": uncertainties in the boundary conditions, in the parameterization and in the model itself. This showed that for different parts of the Netherlands, the total uncertainty is in the order of meters! Nevertheless, (1) the total uncertainty is dominated by uncertainties in boundary conditions. Internal model uncertainties are subordinate to that. Furthermore, (2) the model responses develop in a logical way, such that the exact model outcomes might be uncertain, but the outcomes of different model runs

  7. Regional modeling of large wildfires under current and potential future climates in Colorado and Wyoming, USA

    Science.gov (United States)

    West, Amanda; Kumar, Sunil; Jarnevich, Catherine S.

    2016-01-01

    Regional analysis of large wildfire potential given climate change scenarios is crucial to understanding areas most at risk in the future, yet wildfire models are not often developed and tested at this spatial scale. We fit three historical climate suitability models for large wildfires (i.e. ≥ 400 ha) in Colorado and Wyoming using topography and decadal climate averages corresponding to wildfire occurrence at the same temporal scale. The historical models classified points of known large wildfire occurrence with high accuracies. Using a novel approach in wildfire modeling, we applied the historical models to independent climate and wildfire datasets, and the resulting sensitivities were 0.75, 0.81, and 0.83 for Maxent, Generalized Linear, and Multivariate Adaptive Regression Splines, respectively. We projected the historical models into future climate space using data from 15 global circulation models and two representative concentration pathway scenarios. Maps from these geospatial analyses can be used to evaluate the changing spatial distribution of climate suitability of large wildfires in these states. April relative humidity was the most important covariate in all models, providing insight into the climate space of large wildfires in this region. These methods incorporate monthly and seasonal climate averages at a spatial resolution relevant to land management (i.e. 1 km2) and provide a tool that can be modified for other regions of North America, or adapted for other parts of the world.

  8. Climate change impacts on the future distribution of date palms: a modeling exercise using CLIMEX.

    Directory of Open Access Journals (Sweden)

    Farzin Shabani

    Full Text Available Climate is changing and, as a consequence, some areas that are climatically suitable for date palm (Phoenix dactylifera L.) cultivation at the present time will become unsuitable in the future. In contrast, some areas that are unsuitable under the current climate will become suitable in the future. Consequently, countries that are dependent on date fruit export will experience economic decline, while other countries' economies could improve. Knowledge of the likely potential distribution of this economically important crop under current and future climate scenarios will be useful in planning better strategies to manage such issues. This study used CLIMEX to estimate potential date palm distribution under current and future climate models by using one emission scenario (A2) with two different global climate models (GCMs), CSIRO-Mk3.0 (CS) and MIROC-H (MR). The results indicate that in North Africa, many areas with a suitable climate for this species are projected to become climatically unsuitable by 2100. In North and South America, locations such as south-eastern Bolivia and northern Venezuela will become climatically more suitable. By 2070, Saudi Arabia, Iraq and western Iran are projected to have a reduction in climate suitability. The results indicate that cold and dry stresses will play an important role in date palm distribution in the future. These results can inform strategic planning by government and agricultural organizations by identifying new areas in which to cultivate this economically important crop in the future and those areas that will need greater attention due to becoming marginal regions for continued date palm cultivation.

  9. Climate change impacts on the future distribution of date palms: a modeling exercise using CLIMEX.

    Science.gov (United States)

    Shabani, Farzin; Kumar, Lalit; Taylor, Subhashni

    2012-01-01

    Climate is changing and, as a consequence, some areas that are climatically suitable for date palm (Phoenix dactylifera L.) cultivation at the present time will become unsuitable in the future. In contrast, some areas that are unsuitable under the current climate will become suitable in the future. Consequently, countries that are dependent on date fruit export will experience economic decline, while other countries' economies could improve. Knowledge of the likely potential distribution of this economically important crop under current and future climate scenarios will be useful in planning better strategies to manage such issues. This study used CLIMEX to estimate potential date palm distribution under current and future climate models by using one emission scenario (A2) with two different global climate models (GCMs), CSIRO-Mk3.0 (CS) and MIROC-H (MR). The results indicate that in North Africa, many areas with a suitable climate for this species are projected to become climatically unsuitable by 2100. In North and South America, locations such as south-eastern Bolivia and northern Venezuela will become climatically more suitable. By 2070, Saudi Arabia, Iraq and western Iran are projected to have a reduction in climate suitability. The results indicate that cold and dry stresses will play an important role in date palm distribution in the future. These results can inform strategic planning by government and agricultural organizations by identifying new areas in which to cultivate this economically important crop in the future and those areas that will need greater attention due to becoming marginal regions for continued date palm cultivation.

  10. Tools required for efficient management of municipal utilities in the future heat and power market; Zukunftsfaehiges Management von Stadtwerken braucht Instrumente

    Energy Technology Data Exchange (ETDEWEB)

    Estermann, Andre S. [Strategieprojekt stadtwerke-monitor.de, Berlin (Germany)

    2010-10-15

    The key aspect of organizational management is the definition and implementation of goals for the future. This requires continuous reflection of the organization's position at a given moment and of the further strategy required to achieve the targeted goals. Small and medium-sized municipal utilities as a rule are reluctant to take this strategy and promote innovations. The author presents a new online platform that was developed specifically for municipal utilities, offering them the option to find new ways of business management. In the energy markets of the future, participation of municipal utilities will thus no longer be a matter of 'if' but a matter of 'how'. (orig.)

  11. INFORMATION TECHNOLOGY STRATEGIC ALIGNMENT: ANALYSIS OF ALIGNMENT MODELS AND PROPOSALS FOR FUTURE RESEARCH

    Directory of Open Access Journals (Sweden)

    Fabrício Sobrosa Affeldt

    2009-10-01

    Full Text Available Information Technology (IT) is a resource capable of supporting businesses, providing agile operations, mobility and decision support tools. The link between IT and business strategy has been studied with regard to the model best fitted to improve company performance. This paper analyzes, through bibliographic research, the strategic alignment concept and the evolution of the strategic alignment theoretical models that are considered references in this area. The paper presents a comparison between these referential models and some perspectives for future research related to IT strategic alignment.

  12. Estimates of future warming-induced methane emissions from hydrate offshore west Svalbard for a range of climate models

    Science.gov (United States)

    Marín-Moreno, Héctor; Minshull, Timothy A.; Westbrook, Graham K.; Sinha, Bablu

    2015-05-01

    Methane hydrate close to the hydrate stability limit in seafloor sediment could represent an important source of methane to the oceans and atmosphere as the oceans warm. We investigate the extent to which patterns of past and future ocean-temperature fluctuations influence hydrate stability in a region offshore West Svalbard where active gas venting has been observed. We model the transient behavior of the gas hydrate stability zone at 400-500 m water depth (mwd) in response to past temperature changes inferred from historical measurements and proxy data and we model future changes predicted by seven climate models and two climate-forcing scenarios (Representative Concentration Pathways RCPs 2.6 and 8.5). We show that over the past 2000 years, a combination of annual and decadal temperature fluctuations could have triggered multiple hydrate-sourced methane emissions from seabed shallower than 400 mwd during episodes when the multidecadal average temperature was similar to that over the last century (~2.6°C). These temperature fluctuations can explain current methane emissions at 400 mwd, but decades to centuries of ocean warming are required to generate emissions in water deeper than 420 m. In the venting area, future methane emissions are relatively insensitive to the choice of climate model and RCP scenario until 2050, but are more sensitive to the RCP scenario after 2050. By 2100 CE, we estimate an ocean uptake of 97-1050 TgC from marine Arctic hydrate-sourced methane emissions, which is 0.06-0.67% of the ocean uptake from anthropogenic CO2 emissions for the period 1750-2011.

  13. Combining a Spatial Model and Demand Forecasts to Map Future Surface Coal Mining in Appalachia.

    Directory of Open Access Journals (Sweden)

    Michael P Strager

    Full Text Available Predicting the locations of future surface coal mining in Appalachia is challenging for a number of reasons. Economic and regulatory factors impact the coal mining industry and forecasts of future coal production do not specifically predict changes in location of future coal production. With the potential environmental impacts from surface coal mining, prediction of the location of future activity would be valuable to decision makers. The goal of this study was to provide a method for predicting future surface coal mining extents under changing economic and regulatory forecasts through the year 2035. This was accomplished by integrating a spatial model with production demand forecasts to predict gridded (1 km2 cell size) land cover change. Combining these two inputs was possible with a ratio which linked coal extraction quantities to a unit area extent. The result was a spatial distribution of probabilities allocated over forecasted demand for the Appalachian region including northern, central, southern, and eastern Illinois coal regions. The results can be used to better plan for land use alterations and potential cumulative impacts.

  14. Future extreme events in European climate: An exploration of regional climate model projections

    DEFF Research Database (Denmark)

    Beniston, M.; Stephenson, D.B.; Christensen, O.B.

    2007-01-01

    This study compares current (1961-90) and future (2071-2100) climate on the basis of regional climate model simulations produced by the PRUDENCE project. A summary of the main results follows. Heat waves: regional surface warming causes the frequency, intensity and duration of heat waves to increase over Europe by the end of the twenty-first century, and the detailed patterns of these changes are sensitive to the choice of the driving global model. In the case of precipitation, variation between models can exceed both internal variability and variability between different emissions scenarios, for the regions of Holland, Germany and Denmark in particular. These results are found to depend to different degrees on model formulation: while the responses of heat waves are robust to model formulation, the magnitudes of changes in precipitation and wind speed are sensitive to the choice of regional model.

  15. Current Animal Models of Postoperative Spine Infection and Potential Future Advances.

    Science.gov (United States)

    Stavrakis, A I; Loftin, A H; Lord, E L; Hu, Y; Manegold, J E; Dworsky, E M; Scaduto, A A; Bernthal, N M

    2015-01-01

    Implant related infection following spine surgery is a devastating complication for patients and can potentially lead to significant neurological compromise, disability, morbidity, and even mortality. This paper provides an overview of the existing animal models of postoperative spine infection and highlights the strengths and weaknesses of each model. In addition, there is discussion regarding potential modifications to these animal models to better evaluate preventative and treatment strategies for this challenging complication. Current models are effective in simulating surgical procedures but fail to evaluate infection longitudinally using multiple techniques. Potential future modifications to these models include using advanced imaging technologies to evaluate infection, use of bioluminescent bacterial species, and testing of novel treatment strategies against multiple bacterial strains. There is potential to establish a postoperative spine infection model using smaller animals, such as mice, as these would be a more cost-effective screening tool for potential therapeutic interventions.

  16. Current Animal Models of Postoperative Spine Infection and Potential Future Advances

    Directory of Open Access Journals (Sweden)

    Alexandra eStavrakis

    2015-05-01

    Full Text Available Implant related infection following spine surgery is a devastating complication for patients and can potentially lead to significant neurological compromise, disability, morbidity, and even mortality. This paper provides an overview of the existing animal models of postoperative spine infection and highlights the strengths and weaknesses of each model. In addition there is discussion regarding potential modifications to these animal models to better evaluate preventative and treatment strategies for this challenging complication. Current models are effective in simulating surgical procedures but fail to evaluate infection longitudinally using multiple techniques. Potential future modifications to these models include using advanced imaging technologies to evaluate infection, use of bioluminescent bacterial species, and testing of novel treatment strategies against multiple bacterial strains. There is potential to establish a postoperative spine infection model using smaller animals, such as mice, as these would be a more cost-effective screening tool for potential therapeutic interventions.

  17. Modeling plant species distributions under future climates: how fine scale do climate projections need to be?

    Science.gov (United States)

    Franklin, Janet; Davis, Frank W; Ikegami, Makihiko; Syphard, Alexandra D; Flint, Lorraine E; Flint, Alan L; Hannah, Lee

    2013-02-01

    Recent studies suggest that species distribution models (SDMs) based on fine-scale climate data may provide markedly different estimates of climate-change impacts than coarse-scale models. However, these studies disagree in their conclusions of how scale influences projected species distributions. In rugged terrain, coarse-scale climate grids may not capture topographically controlled climate variation at the scale that constitutes microhabitat or refugia for some species. Although finer scale data are therefore considered to better reflect climatic conditions experienced by species, there have been few formal analyses of how modeled distributions differ with scale. We modeled distributions for 52 plant species endemic to the California Floristic Province of different life forms and range sizes under recent and future climate across a 2000-fold range of spatial scales (0.008-16 km2). We produced unique current and future climate datasets by separately downscaling 4 km climate models to three finer resolutions based on 800, 270, and 90 m digital elevation models and deriving bioclimatic predictors from them. As climate-data resolution became coarser, SDMs predicted larger habitat area with diminishing spatial congruence between fine- and coarse-scale predictions. These trends were most pronounced at the coarsest resolutions and depended on climate scenario and species' range size. On average, SDMs projected onto 4 km climate data predicted 42% more stable habitat (the amount of spatial overlap between predicted current and future climatically suitable habitat) compared with 800 m data. We found only modest agreement between areas predicted to be stable by 90 m models generalized to 4 km grids compared with areas classified as stable based on 4 km models, suggesting that some climate refugia captured at finer scales may be missed using coarser scale data. These differences in projected locations of habitat change may have more serious implications than net

  18. Simulation of extreme rainfall and projection of future changes using the GLIMCLIM model

    Science.gov (United States)

    Rashid, Md. Mamunur; Beecham, Simon; Chowdhury, Rezaul Kabir

    2016-08-01

    In this study, the performance of the Generalized LInear Modelling of daily CLImate sequence (GLIMCLIM) statistical downscaling model was assessed for simulating extreme rainfall indices and annual maximum daily rainfall (AMDR) when downscaling daily rainfall from National Centers for Environmental Prediction (NCEP) reanalysis and Coupled Model Intercomparison Project Phase 5 (CMIP5) general circulation model (GCM) output datasets (four GCMs and two scenarios); their changes were then estimated for the future period 2041-2060. The model was able to reproduce the monthly variations in the extreme rainfall indices reasonably well when forced by the NCEP reanalysis datasets. Frequency Adapted Quantile Mapping (FAQM) was used to remove bias in the simulated daily rainfall when forced by CMIP5 GCMs, which reduced the discrepancy between observed and simulated extreme rainfall indices. Although the observed AMDR were within the 2.5th and 97.5th percentiles of the simulated AMDR, the model consistently under-predicted the inter-annual variability of AMDR. A non-stationary model was developed using the generalized linear model for location, shape and scale to estimate the AMDR with an annual exceedance probability of 0.01. The study shows that in general, AMDR is likely to decrease in the future. The Onkaparinga catchment will also experience drier conditions due to an increase in consecutive dry days coinciding with decreases in heavy (>long-term 90th percentile) rainfall days, empirical 90th quantile of rainfall and maximum 5-day consecutive total rainfall for the future period (2041-2060) compared to the base period (1961-2000).
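    The bias-correction step can be illustrated with a plain empirical quantile-mapping sketch in Python (the Frequency Adapted variant used in the study additionally adjusts the wet-day frequency before mapping, which this simplified version omits); the placeholder gamma-distributed rainfall series are assumptions for illustration.

        import numpy as np

        def quantile_map(model_hist, obs_hist, model_future):
            # Map model values onto the observed distribution via empirical quantiles:
            # find each future value's quantile in the model climatology, then read off
            # the observed value at that same quantile.
            q = np.linspace(0.01, 0.99, 99)
            model_q = np.quantile(model_hist, q)
            obs_q = np.quantile(obs_hist, q)
            return np.interp(model_future, model_q, obs_q)

        rng = np.random.default_rng(0)
        obs = rng.gamma(shape=0.8, scale=6.0, size=5000)          # observed daily rainfall (mm), placeholder
        sim_hist = rng.gamma(shape=0.9, scale=4.0, size=5000)     # GCM-driven historical rainfall, placeholder
        sim_future = rng.gamma(shape=0.9, scale=4.5, size=5000)   # GCM-driven future rainfall, placeholder

        corrected = quantile_map(sim_hist, obs, sim_future)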

  19. Developing scenarios to assess future landslide risks: a model-based approach applied to mountainous regions

    Science.gov (United States)

    Vacquie, Laure; Houet, Thomas

    2016-04-01

    In the last century, European mountain landscapes have experienced significant transformations. Natural and anthropogenic changes, climate changes, touristic and industrial development, socio-economic interactions, and their implications in terms of LUCC (land use and land cover changes) have directly influenced the spatial organization and vulnerability of mountain landscapes. This study is conducted as part of the SAMCO project funded by the French National Science Agency (ANR). It aims at developing a methodological approach, combining various tools, modelling platforms and methods, to identify regions vulnerable to landslide hazards while accounting for future LUCC. It presents an integrated approach combining participative scenarios and LULC change simulation models to assess the combined effects of LUCC and climate change on landslide risks in the Cauterets valley (French Pyrenees Mountains) up to 2100. Through vulnerability and risk mapping, the objective is to gather information to support landscape planning and implement land use strategies with local stakeholders for risk management. Four contrasting scenarios are developed and exhibit contrasting trajectories of socio-economic development. Prospective scenarios are based on national and international socio-economic contexts relying on existing assessment reports. The methodological approach integrates knowledge from local stakeholders to refine each scenario during their construction and to reinforce their plausibility and relevance by accounting for local specificities, e.g. logging and pastoral activities, touristic development, urban planning, etc. A process-based model, the Forecasting Scenarios for Mountains (ForeSceM) model, developed on the Dinamica Ego modelling platform, is used to spatially allocate future LUCC for each prospective scenario. Concurrently, a spatial decision support tool, i.e. the SYLVACCESS model, is used to identify accessible areas for forestry in scenario projecting logging

  20. Future intensification of hydro-meteorological extremes: downscaling using the weather research and forecasting model

    Science.gov (United States)

    El-Samra, R.; Bou-Zeid, E.; Bangalath, H. K.; Stenchikov, G.; El-Fadel, M.

    2017-02-01

    A set of ten downscaling simulations at high spatial resolution (3 km horizontally) were performed using the Weather Research and Forecasting (WRF) model to generate future climate projections of annual and seasonal temperature and precipitation changes over the Eastern Mediterranean (with a focus on Lebanon). The model was driven with the High Resolution Atmospheric Model (HiRAM), running over the whole globe at a resolution of 25 km, under the conditions of two Representative Concentration Pathways (RCP) (4.5 and 8.5). Each downscaling simulation spanned one year. Two past years (2003 and 2008), also forced by HiRAM without data assimilation, were simulated to evaluate the model's ability to capture the cold and wet (2003) and hot and dry (2008) extremes. The downscaled data were in the range of recent observed climatic variability, and therefore corrected for the cold bias of HiRAM. Eight future years were then selected based on an anomaly score that relies on the mean annual temperature and accumulated precipitation to identify the worst year per decade from a water resources perspective. One hot and dry year per decade, from 2011 to 2050, and per scenario was simulated and compared to the historic 2008 reference. The results indicate that hot and dry future extreme years will be exacerbated and the study area might be exposed to a significant decrease in annual precipitation (rain and snow), reaching up to 30% relative to the current extreme conditions.
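    The abstract does not give the exact form of the anomaly score used to pick the worst (hot and dry) year per decade, so the following Python sketch is only a hypothetical stand-in that combines a standardized positive temperature anomaly with a standardized negative precipitation anomaly; the variable names and placeholder data are assumptions for illustration.

        import numpy as np

        def anomaly_score(t_mean, p_total, t_clim, p_clim, t_std, p_std):
            # Hypothetical hot-and-dry score: hotter-than-normal and drier-than-normal
            # years both increase the score.
            return (t_mean - t_clim) / t_std - (p_total - p_clim) / p_std

        # Placeholder decade of annual means from the driving model
        years = np.arange(2011, 2021)
        t = np.array([15.2, 15.8, 15.5, 16.1, 15.9, 16.4, 15.7, 16.0, 16.6, 15.4])      # deg C
        p = np.array([620, 540, 700, 480, 560, 430, 650, 510, 400, 690], dtype=float)   # mm/yr

        scores = anomaly_score(t, p, t.mean(), p.mean(), t.std(), p.std())
        worst_year = years[np.argmax(scores)]   # candidate year to downscale at 3 km
        print(worst_year)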

  1. Drivers' communicative interactions: on-road observations and modelling for integration in future automation systems.

    Science.gov (United States)

    Portouli, Evangelia; Nathanael, Dimitris; Marmaras, Nicolas

    2014-01-01

    Social interactions with other road users are an essential component of the driving activity and may prove critical in view of future automation systems; still up to now they have received only limited attention in the scientific literature. In this paper, it is argued that drivers base their anticipations about the traffic scene to a large extent on observations of social behaviour of other 'animate human-vehicles'. It is further argued that in cases of uncertainty, drivers seek to establish a mutual situational awareness through deliberate communicative interactions. A linguistic model is proposed for modelling these communicative interactions. Empirical evidence from on-road observations and analysis of concurrent running commentary by 25 experienced drivers support the proposed model. It is suggested that the integration of a social interactions layer based on illocutionary acts in future driving support and automation systems will improve their performance towards matching human driver's expectations. Practitioner Summary: Interactions between drivers on the road may play a significant role in traffic coordination. On-road observations and running commentaries are presented as empirical evidence to support a model of such interactions; incorporation of drivers' interactions in future driving support and automation systems may improve their performance towards matching driver's expectations.

  2. Forest fire risk assessment in Sweden using climate model data: bias correction and future changes

    Directory of Open Access Journals (Sweden)

    W. Yang

    2015-01-01

    Full Text Available As the risk for a forest fire is largely influenced by weather, evaluating its tendency under a changing climate becomes important for management and decision making. Currently, biases in climate models make it difficult to realistically estimate the future climate and consequent impact on fire risk. A distribution-based scaling (DBS) approach was developed as a post-processing tool that intends to correct systematic biases in climate modelling outputs. In this study, we used two projections, one driven by historical reanalysis (ERA40) and one from a global climate model (ECHAM5) for future projection, both having been dynamically downscaled by a regional climate model (RCA3). The effects of the post-processing tool on relative humidity and wind speed were studied in addition to the primary variables precipitation and temperature. Finally, the Canadian Fire Weather Index system was used to evaluate the influence of changing meteorological conditions on the moisture content in fuel layers and the fire-spread risk. The forest fire risk results obtained using DBS are shown to reflect the risk derived from observations better than those obtained using raw climate outputs. For future periods, southern Sweden is likely to have a higher fire risk than today, whereas northern Sweden will have a lower risk of forest fire.

  3. Phenomenology of a Higgs triplet model at future $e^{+}e^{-}$ colliders

    CERN Document Server

    Blunier, Sylvain; Díaz, Marco Aurelio; Koch, Benjamin

    2016-01-01

    In this work, we investigate the prospects of future $e^{+}e^{-}$ colliders in testing a Higgs triplet model with a scalar triplet and a scalar singlet under $SU(2)$. The parameters of the model are fixed so that the lightest $CP$-even state corresponds to the Higgs particle observed at the LHC at around $125$ GeV. This study investigates if the second heaviest $CP$-even, the heaviest $CP$-odd and the singly charged states can be observed at existing and future colliders by computing their accessible production and decay channels. In particular, the LHC is not well equipped to produce a Higgs boson which is not mainly doublet-like, so we turn our focus to lepton colliders. We find distinctive features of this model in cases when the second heaviest $CP$-even Higgs is triplet-like, singlet-like or a mixture. These features could distinguish the model from other scenarios at future $e^{+}e^{-}$ colliders.

  4. Modeling future water demand in California from developed and agricultural land uses

    Science.gov (United States)

    Wilson, T. S.; Sleeter, B. M.; Cameron, D. R.

    2015-12-01

    Municipal and urban land-use intensification in coming decades will place increasing pressure on water resources in California. The state is currently experiencing one of the most extreme droughts on record. This, coupled with earlier spring snowmelt and projected future climate warming, will increasingly constrain already limited water supplies. The development of spatially explicit models of future land use driven by empirical, historical land use change data allows exploration of plausible LULC-related water demand futures and potential mitigation strategies. We utilized the Land Use and Carbon Scenario Simulator (LUCAS) state-and-transition simulation model to project spatially explicit (1 km) future developed and agricultural land use from 2012 to 2062 and estimated the associated water use for California's Mediterranean ecoregions. We modeled 100 Monte Carlo simulations to better characterize and project historical land-use change variability. Under current efficiency rates, total water demand was projected to increase 15.1% by 2062, driven primarily by increases in urbanization and shifts to more water-intensive crops. Developed land use was projected to increase by 89.8%-97.2% and result in an average 85.9% increase in municipal water use, while agricultural water use was projected to decline by approximately 3.9%, driven by decreases in row crops and increases in woody cropland. In order for water demand in 2062 to remain at current demand levels, the currently mandated 25% reduction in urban water use must remain in place in conjunction with a nearly 7% reduction in agricultural water use. Scenarios of land-use related water demand are useful for visualizing alternative futures, examining potential management approaches, and enabling better-informed resource management decisions.

  5. Application of Multi-Model CMIP5 Analysis in Future Drought Adaptation Strategies

    Science.gov (United States)

    Casey, M.; Luo, L.; Lang, Y.

    2014-12-01

    Drought influences the efficacy of numerous natural and artificial systems including species diversity, agriculture, and infrastructure. Global climate change raises concerns that extend well beyond atmospheric and hydrological disciplines - as climate changes with time, the need for system adaptation becomes apparent. Drought, as a natural phenomenon, is typically defined relative to the climate in which it occurs. Typically a 30-year reference time frame (RTF) is used to determine the severity of a drought event. This study investigates the projected future droughts over North America with different RTFs. Confidence in future hydroclimate projection is characterized by the agreement of long term (2005-2100) multi-model precipitation (P) and temperature (T) projections within the Coupled Model Intercomparison Project Phase 5 (CMIP5). Drought severity and the propensity of extreme conditions are measured by the multi-scalar, probabilistic, RTF-based Standard Precipitation Index (SPI) and Standard Precipitation Evapotranspiration Index (SPEI). SPI considers only P while SPEI incorporates Evapotranspiration (E) via T; comparing the two reveals the role of temperature change in future hydroclimate change. Future hydroclimate conditions, hydroclimate extremity, and CMIP5 model agreement are assessed for each Representative Concentration Pathway (RCP 2.6, 4.5, 6.0, 8.5) in regions throughout North America for the entire year and for the boreal seasons. In addition, multiple time scales of SPI and SPEI are calculated to characterize drought at time scales ranging from short to long term. The study explores a simple, standardized method for considering adaptation in future drought assessment, which provides a novel perspective to incorporate adaptation with climate change. The result of the analysis is a multi-dimensional, probabilistic summary of the hydrological (P, E) environment a natural or artificial system must adapt to over time. Studies similar to this with
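    As a rough illustration of how an RTF-based index such as the SPI is constructed (a simplified sketch, not the study's implementation), the Python code below accumulates monthly precipitation over a chosen time scale, fits a gamma distribution to the accumulated series, and maps the fitted cumulative probabilities to standard normal deviates; operational SPI additionally handles zero-precipitation months and fits each calendar month separately, and SPEI replaces precipitation with the precipitation-minus-evapotranspiration balance.

        import numpy as np
        from scipy import stats

        def spi(precip, scale=3):
            # Simplified Standardized Precipitation Index at a given accumulation scale (months).
            precip = np.asarray(precip, dtype=float)
            accum = np.convolve(precip, np.ones(scale), mode="valid")   # running sums over `scale` months
            shape, loc, scale_param = stats.gamma.fit(accum, floc=0)    # fit gamma with location fixed at 0
            cdf = stats.gamma.cdf(accum, shape, loc=loc, scale=scale_param)
            return stats.norm.ppf(cdf)   # negative values indicate drier-than-normal conditions

        rng = np.random.default_rng(1)
        monthly_p = rng.gamma(shape=2.0, scale=40.0, size=360)   # 30 years of placeholder monthly precipitation (mm)
        spi3 = spi(monthly_p, scale=3)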

  6. On the importance of paleoclimate modelling for improving predictions of future climate change

    Directory of Open Access Journals (Sweden)

    J. C. Hargreaves

    2009-12-01

    Full Text Available We use an ensemble of runs from the MIROC3.2 AGCM with slab-ocean to explore the extent to which mid-Holocene simulations are relevant to predictions of future climate change. The results are compared with similar analyses for the Last Glacial Maximum (LGM) and the pre-industrial control climate. We suggest that the paleoclimate epochs can provide some independent validation of the models that is also relevant for future predictions. Considering the paleoclimate epochs, we find that the stronger global forcing and hence larger climate change at the LGM makes this likely to be the more powerful one for estimating the large-scale changes that are anticipated due to anthropogenic forcing. The phenomena in the mid-Holocene simulations which are most strongly correlated with future changes (i.e., the mid to high northern latitude land temperature and monsoon precipitation) do, however, coincide with areas where the LGM results are not correlated with future changes, and these are also areas where the paleodata indicate significant climate changes have occurred. Thus, these regions and phenomena for the mid-Holocene may be useful for model improvement and validation.

  7. Three Tier Unified Process Model for Requirement Negotiations and Stakeholder Collaborations

    Science.gov (United States)

    Niazi, Muhammad Ashraf Khan; Abbas, Muhammad; Shahzad, Muhammad

    2012-11-01

    This research paper carries out a pragmatic qualitative analysis of various models and approaches to requirements negotiation (a sub-process of the requirements management plan, which is an output of scope management's collect requirements process) and studies stakeholder collaboration methodologies (i.e., from within the communication management knowledge area). The experiential analysis encompasses two tiers: the first tier applies a weighted scoring model, while the second tier develops SWOT matrices on the basis of the weighted scoring model's findings in order to select an appropriate requirements negotiation model. Finally, the results are presented with the help of statistical pie charts. On the basis of these results for the prevalent models and approaches to negotiation, a unified approach for requirements negotiation and stakeholder collaboration is proposed in which the collaboration methodologies are embedded into the selected requirements negotiation model as internal parameters of the proposed process, alongside some external required parameters such as MBTI and opportunity analysis.
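    A weighted scoring model of the kind referred to above can be sketched in a few lines of Python; the criteria, weights, candidate names and scores below are illustrative placeholders rather than the paper's actual data.

        # Illustrative criteria and weights (weights sum to 1)
        criteria_weights = {"stakeholder_coverage": 0.3, "conflict_resolution": 0.4, "tool_support": 0.3}

        # Hypothetical candidate requirements-negotiation models scored 1-5 per criterion
        candidate_models = {
            "Model A": {"stakeholder_coverage": 4, "conflict_resolution": 5, "tool_support": 3},
            "Model B": {"stakeholder_coverage": 5, "conflict_resolution": 4, "tool_support": 4},
            "Model C": {"stakeholder_coverage": 3, "conflict_resolution": 4, "tool_support": 2},
        }

        def weighted_score(scores, weights):
            # Sum of each criterion score multiplied by its weight
            return sum(scores[c] * w for c, w in weights.items())

        ranking = sorted(candidate_models,
                         key=lambda m: weighted_score(candidate_models[m], criteria_weights),
                         reverse=True)
        print(ranking)   # candidates ordered from highest to lowest weighted score

    The second tier described in the abstract would then build SWOT matrices for the top-ranked candidates, a qualitative step not captured in this sketch.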

  8. An Analysis of the Requirements and Potential Opportunities for the Future Education of Law Enforcement Intelligence Analysts

    Science.gov (United States)

    2008-03-01

    universities that taught intelligence analysis focused on foreign, national or business intelligence. Foreign intelligence in this thesis refers to the...coursework consists of the following classes: Fundamentals of Intelligence Analysis, Business Intelligence, National Intelligence, Criminal Intelligence...certificate programs: competitive (business) intelligence and intelligence analysis. The competitive intelligence analysis program requires completion of the

  9. Requirements and Problems in Parallel Model Development at DWD

    Directory of Open Access Journals (Sweden)

    Ulrich Schättler

    2000-01-01

    Full Text Available Nearly 30 years after introducing the first computer model for weather forecasting, the Deutscher Wetterdienst (DWD) is developing the 4th generation of its numerical weather prediction (NWP) system. It consists of a global grid point model (GME) based on a triangular grid and a non-hydrostatic Lokal Modell (LM). The operational demand for running this new system is immense and can only be met by parallel computers. From the experience gained in developing earlier NWP models, several new problems had to be taken into account during the design phase of the system. Most important were portability (including efficiency of the programs on several computer architectures) and ease of code maintainability. Also, the organization and administration of the work done by developers from different teams and institutions is more complex than it used to be. This paper describes the models and gives some performance results. The modular approach used for the design of the LM is explained and the effects on the development are discussed.

  10. A spatially explicit model for the future progression of the current Haiti cholera epidemic

    Science.gov (United States)

    Bertuzzo, E.; Mari, L.; Righetto, L.; Gatto, M.; Casagrandi, R.; Rodriguez-Iturbe, I.; Rinaldo, A.

    2011-12-01

    As a major cholera epidemic progresses in Haiti, and the figures of the infection, up to July 2011, climb to 385,000 cases and 5,800 deaths, the development of general models to track and predict the evolution of the outbreak, so as to guide the allocation of medical supplies and staff, is gaining notable urgency. We propose here a spatially explicit epidemic model that accounts for the dynamics of susceptible and infected individuals as well as the redistribution of Vibrio cholerae, the causative agent of the disease, among different human communities. In particular, we model two spreading pathways: the advection of pathogens through hydrologic connections and the dissemination due to human mobility described by means of a gravity-like model. To this end the country has been divided into hydrologic units based on drainage directions derived from a digital terrain model. Moreover, the population of each unit has been estimated from census data downscaled to 1 km x 1 km resolution via remotely sensed geomorphological information (LandScan project). The model directly accounts for the role of rainfall patterns in driving the seasonality of cholera outbreaks. The two main outbreaks in fact occurred during the rainy seasons (October and May) when extensive flooding severely worsened the sanitation conditions and, in turn, raised the risk of infection. The model's capability to reproduce the spatiotemporal features of the epidemic to date lends robustness to the foreseen future development. In this context, the duration of acquired immunity, a hotly debated topic in the scientific community, emerges as a controlling factor for the progression of the epidemic in the near future. The framework presented here can straightforwardly be used to evaluate the effectiveness of alternative intervention strategies like mass vaccinations, clean water supply and educational campaigns, thus emerging as an essential component of the control of future cholera
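    The gravity-like mobility component mentioned above can be illustrated with a short Python sketch that builds a coupling matrix between communities from their populations and pairwise distances; the functional form, the distance exponent and the placeholder data are assumptions for illustration, not the calibrated values of the study.

        import numpy as np

        def gravity_mobility(populations, distances, deterrence=2.0):
            # Gravity-like coupling: strength grows with the product of the two
            # populations and decays with distance raised to a deterrence exponent.
            P = np.asarray(populations, dtype=float)
            D = np.asarray(distances, dtype=float)
            W = np.outer(P, P) / np.power(D, deterrence, out=np.ones_like(D), where=D > 0)
            np.fill_diagonal(W, 0.0)                    # no self-mobility
            return W / W.sum(axis=1, keepdims=True)     # row-normalize to connection probabilities

        # Placeholder data for four hydrologic units (populations and distances in km)
        pop = [120_000, 45_000, 300_000, 80_000]
        dist = np.array([[0, 30, 80, 55],
                         [30, 0, 60, 95],
                         [80, 60, 0, 40],
                         [55, 95, 40, 0]], dtype=float)
        mobility = gravity_mobility(pop, dist)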

  11. Modelling the future biogeography of North Atlantic zooplankton communities in response to climate change

    KAUST Repository

    Villarino, E

    2015-07-02

    Advances in habitat and climate modelling allow us to reduce uncertainties of climate change impacts on species distribution. We evaluated the impacts of future climate change on community structure, diversity, distribution and phenology of 14 copepod species in the North Atlantic. We developed and validated habitat models for key zooplankton species using continuous plankton recorder (CPR) survey data collected at mid latitudes of the North Atlantic. Generalized additive models (GAMs) were applied to relate the occurrence of species to environmental variables. Models were projected to future (2080–2099) environmental conditions using coupled hydroclimatic–biogeochemical models under the Intergovernmental Panel on Climate Change (IPCC) A1B climate scenario, and compared to present (2001–2020) conditions. Our projections indicated that the copepod community is expected to respond substantially to climate change: a mean poleward latitudinal shift of 8.7 km per decade for the overall community with an important species range variation (–15 to 18 km per decade); the species seasonal peak is expected to occur 12–13 d earlier for Calanus finmarchicus and C. hyperboreus; and important changes in community structure are also expected (high species turnover of 43–79% south of the Oceanic Polar Front). The impacts of the change expected by the end of the century under IPCC global warming scenarios on copepods highlight poleward shifts, earlier seasonal peak and changes in biodiversity spatial patterns that might lead to alterations of the future North Atlantic pelagic ecosystem. Our model and projections are supported by a temporal validation undertaken using the North Atlantic climate regime shift that occurred in the 1980s: the habitat model built in the cold period (1970–1986) has been validated in the warm period (1987–2004).
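    The habitat models described above were generalized additive models relating species occurrence to environmental variables; the Python sketch below mimics that idea with spline-transformed covariates feeding a logistic regression (a GAM-like stand-in using scikit-learn, not the study's GAM implementation), and the covariates, synthetic occurrence data and parameter choices are assumptions for illustration only.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import SplineTransformer
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(42)
        sst = rng.uniform(2, 20, 1000)       # sea-surface temperature (deg C), placeholder
        chl = rng.uniform(0.1, 3.0, 1000)    # chlorophyll (mg m-3), placeholder
        X = np.column_stack([sst, chl])

        # Synthetic presence/absence: cooler, more productive water made more suitable
        p_true = 1.0 / (1.0 + np.exp(0.4 * (sst - 10.0) - 1.5 * (chl - 1.0)))
        y = rng.binomial(1, p_true)

        model = make_pipeline(SplineTransformer(degree=3, n_knots=6),
                              LogisticRegression(max_iter=1000))
        model.fit(X, y)
        suitability = model.predict_proba(X)[:, 1]   # habitat-suitability scores in [0, 1]

    Projecting such a model to future conditions amounts to evaluating the fitted model on covariate fields taken from the future climate simulations instead of the fitting period.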

  12. Future evolution in a backreaction model and the analogous scalar field cosmology

    CERN Document Server

    Ali, Amna

    2016-01-01

    We investigate the future evolution of the universe using the Buchert framework for averaged backreaction in the context of a two-domain partition of the universe. We show that this approach allows for the possibility of the global acceleration vanishing at a finite future time, provided that none of the subdomains accelerate individually. The model at large scales is analogously described in terms of a homogeneous scalar field emerging with a potential that is fixed and free from phenomenological parametrization. The dynamics of this scalar field is explored in the analogous FLRW cosmology. We use observational data from Type Ia Supernovae, Baryon Acoustic Oscillations, and Cosmic Microwave Background to constrain the parameters of the model for a viable cosmology, providing the corresponding likelihood contours.

  13. Modeling of the Orbital Debris Environment Risks in the Past, Present, and Future

    Science.gov (United States)

    Matney, Mark

    2016-01-01

    Despite the tireless work of space surveillance assets, much of the Earth debris environment is not easily measured or tracked. For every object in an orbit that we can track, there are hundreds of debris fragments too small to be tracked but still large enough to damage spacecraft. In addition, even if we knew today's environment with perfect knowledge, the debris environment is dynamic and would change tomorrow. Therefore, orbital debris scientists rely on numerical modeling to understand the nature of the debris environment and its risk to space operations throughout Earth orbit and into the future. This talk will summarize the ways in which modeling complements measurements to help give us a better picture of what is occurring in Earth orbit, and helps us to better conduct current and future space operations.

  14. Current applications and future directions for the CDISC Operational Data Model standard: A methodological review.

    Science.gov (United States)

    Hume, Sam; Aerts, Jozef; Sarnikar, Surendra; Huser, Vojtech

    2016-04-01

    In order to further advance research and development on the Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model (ODM) standard, the existing research must be well understood. This paper presents a methodological review of the ODM literature. Specifically, it develops a classification schema to categorize the ODM literature according to how the standard has been applied within the clinical research data lifecycle. This paper suggests areas for future research and development that address ODM's limitations and capitalize on its strengths to support new trends in clinical research informatics. A systematic scan of the following databases was performed: (1) ABI/Inform, (2) ACM Digital, (3) AIS eLibrary, (4) Europe Central PubMed, (5) Google Scholar, (6) IEEE Xplore, (7) PubMed, and (8) ScienceDirect. A Web of Science citation analysis was also performed. The search term used on all databases was "CDISC ODM." The two primary inclusion criteria were: (1) the research must examine the use of ODM as an information system solution component, or (2) the research must critically evaluate ODM against a stated solution usage scenario. Out of 2686 articles identified, 266 were included in a title level review, resulting in 183 articles. An abstract review followed, resulting in 121 remaining articles; and after a full text scan 69 articles met the inclusion criteria. As the demand for interoperability has increased, ODM has shown remarkable flexibility and has been extended to cover a broad range of data and metadata requirements that reach well beyond ODM's original use cases. This flexibility has yielded research literature that covers a diverse array of topic areas. A classification schema reflecting the use of ODM within the clinical research data lifecycle was created to provide a categorized and consolidated view of the ODM literature. The elements of the framework include: (1) EDC (Electronic Data Capture) and EHR (Electronic Health Record

  15. Modeling Training of Future Teachers Aimed on Innovation Activities Based on the System of Design Features

    Directory of Open Access Journals (Sweden)

    Yury S. Tyunnikov

    2015-05-01

    Full Text Available Modeling of the training system for future teachers aimed at innovation activities is performed according to a certain project logic and set of procedures, which is possible only through a specific set of design features based on the capabilities and particular properties of the university. The article formulates and solves the problem of identifying these design features, revealing in this set the characteristic properties, organization and functioning of a training system aimed at innovation under the specific conditions of professional education.

  16. Investigating nonlinear speculation in cattle, corn, and hog futures markets using logistic smooth transition regression models

    OpenAIRE

    Röthig, Andreas; Chiarella, Carl

    2006-01-01

    This article explores nonlinearities in the response of speculators' trading activity to price changes in live cattle, corn, and lean hog futures markets. Analyzing weekly data from March 4, 1997 to December 27, 2005, we reject linearity in all of these markets. Using smooth transition regression models, we find a similar structure of nonlinearities with regard to the number of different regimes, the choice of the transition variable, and the value at which the transition occurs.
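    For reference, a generic two-regime logistic smooth transition regression can be written as follows (a textbook formulation, not necessarily the exact specification estimated in the article), where y_t is speculators' trading activity, x_t contains the price-change regressors, s_t is the transition variable, c the location of the transition and γ its smoothness:

        $$ y_t = \phi' x_t + \theta' x_t \, G(s_t; \gamma, c) + \varepsilon_t, \qquad G(s_t; \gamma, c) = \left(1 + \exp\left[-\gamma (s_t - c)\right]\right)^{-1}, \quad \gamma > 0. $$

    As γ grows large, the transition function approaches a step function and the model collapses to a two-regime threshold regression, while small γ gives a smooth, gradual change between regimes.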

  17. Dynamo Models of the Solar Cycle: Current Trends and Future Prospects

    CERN Document Server

    Nandy, Dibyendu

    2011-01-01

    The magnetic cycle of the Sun, as manifested in the cyclic appearance of sunspots, significantly influences our space environment and space-based technologies by generating what is now termed as space weather. Long-term variation in the Sun's magnetic output also influences planetary atmospheres and climate through modulation of solar irradiance. Here, I summarize the current state of understanding of this magnetic cycle, highlighting important observational constraints, detailing the kinematic dynamo modeling approach and commenting on future prospects.

  18. Simulation of Change Trend of Drought in Shaanxi Province in Future Based on PRECIS Model

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    [Objective] The aim was to predict the change trend of drought in Shaanxi Province in the future. [Method] Based on the regional climate model PRECIS from the Hadley Centre of the UK Met Office, and taking precipitation anomaly percentage as the assessment index, the change trend of drought in Shaanxi Province in the reference years (1971-1990) was simulated, and the change trend of drought in Shaanxi Province from 2071 to 2100 was predicted. [Result] The simulated value of drought frequency in reference year...
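    The precipitation anomaly percentage used as the assessment index is conventionally defined as the departure of precipitation from its long-term mean, expressed as a percentage of that mean (standard definition, shown here for orientation):

        $$ P_a = \frac{P - \bar{P}}{\bar{P}} \times 100\%, $$

    where P is the precipitation accumulated over the assessment period and \bar{P} is the corresponding long-term climatological mean; increasingly negative values indicate increasingly severe drought.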

  19. Modeling Future Life-Cycle Greenhouse Gas Emissions and Environmental Impacts of Electricity Supplies in Brazil

    OpenAIRE

    2013-01-01

    Brazil's status as a rapidly developing country is visible in its need for more energy, including electricity. While the current electricity generation mix is primarily hydropower based, high-quality dam sites are diminishing and diversification to other sources is likely. We combined life-cycle data for electricity production with scenarios developed using the IAEA's MESSAGE model to examine environmental impacts of future electricity generation under a baseline case and four side cases, usi...

  20. How Good Can We Get? Using mathematical models to predict the future of athletics

    CERN Document Server

    Mureika, J R

    1998-01-01

    Track and field world records have risen and fallen throughout the history of the sport. A recent rash of record-breaking performances has prompted the question: "How good can we get?". This article offers a review of several attempts to answer this question, based on mathematical modeling of key physiological processes. The predictions are compared with present-day world records, and a discussion of the future of athletics ensues...

  1. Regional Climate Downscaling Of African Climate Using A High-Resolution Global Atmospheric Model: Validation And Future Projection

    Science.gov (United States)

    Raj, J.; Stenchikov, G. L.; Bangalath, H.

    2013-12-01

    Climate change impact assessment and adaptation planning require region-specific information with high spatial resolution, since the climate and weather effects are directly felt at the local scale. While most of the state-of-the-art General Circulation Models lack adequate spatial resolution, regional climate models (RCM) used in a nested domain are generally incapable of incorporating the two-way exchanges between regional and global climate. In this study we use a very high resolution atmospheric general circulation model, HiRAM, developed at NOAA GFDL, to investigate the regional climate changes over the CORDEX African domain. The HiRAM simulations are performed with a horizontal grid spacing of 25 km, which is an ample resolution for regional climate simulation. HiRAM has the advantage of naturally describing the interaction between regional and global climate. Historic (1975-2004) simulations and future (2007-2050) projections, with both RCP 4.5 and RCP 8.5 pathways, are conducted in line with the CORDEX protocol. A coarse resolution sea surface temperature (SST) is prescribed from the GFDL Earth System Model runs of IPCC AR5 as the bottom boundary condition over the ocean. The GFDL Land Surface Model (LM3) is employed to calculate physical processes at the surface and in the soil. The preliminary analysis of the performance of HiRAM, using historic runs, shows that it reproduces the regional climate adequately well in comparison with observations. Significant improvement in the simulation of regional climate is evident in comparison with the coarse resolution driving model. Future projections predict an increase in atmospheric temperature over Africa with stronger warming in the subtropics than in the tropics. A significant strengthening of the West African Monsoon and a southward shift of the summer rainfall maxima over Africa is predicted in both RCP 4.5 and RCP 8.5 scenarios.

  2. Preparing the Dutch delta for future droughts: model based support in the national Delta Programme

    Science.gov (United States)

    ter Maat, Judith; Haasnoot, Marjolijn; van der Vat, Marnix; Hunink, Joachim; Prinsen, Geert; Visser, Martijn

    2014-05-01

    Keywords: uncertainty, policymaking, adaptive policies, fresh water management, droughts, Netherlands, Dutch Deltaprogramme, physically-based complex model, theory-motivated meta-model. To prepare the Dutch Delta for future droughts and water scarcity, a nation-wide 4-year project, called the Delta Programme, has been established to assess impacts of climate scenarios and socio-economic developments and to explore policy options. The results should contribute to a national adaptive plan that is able to adapt to future uncertain conditions, if necessary. For this purpose, we followed a model-based step-wise approach, wherein both physically-based complex models and theory-motivated meta-models were used. The first step (2010-2011) was to make a quantitative problem description. This involved a sensitivity analysis of the water system for drought situations under current and future conditions. The comprehensive Dutch national hydrological instrument was used for this purpose and further developed. Secondly (2011-2012), our main focus was on making an inventory of potential actions together with stakeholders. We assessed efficacy, sell-by date of actions, and reassessed vulnerabilities and opportunities for the future water supply system if actions were (not) taken. A rapid assessment meta-model was made based on the complex model. The effects of all potential measures were included in the tool. Thirdly (2012-2013), with support of the rapid assessment model, we assessed the efficacy of policy actions over time for an ensemble of possible futures including sea level rise and climate and land use change. The last step (2013-2014) involves the selection of preferred actions from a set of promising actions that meet the defined objectives. These actions are all modeled and evaluated using the complex model. The outcome of the process will be an adaptive management plan. The adaptive plan describes a set of preferred policy pathways - sequences of policy actions - to achieve targets under

  3. Numerical Modeling of Climate-Chemistry Connections: Recent Developments and Future Challenges

    Directory of Open Access Journals (Sweden)

    Patrick Jöckel

    2013-05-01

    Full Text Available This paper reviews the current state and development of different numerical model classes that are used to simulate the global atmospheric system, particularly Earth's climate and climate-chemistry connections. The focus is on Chemistry-Climate Models. In general, these serve to examine dynamical and chemical processes in the Earth atmosphere, their feedback, and interaction with climate. Such models have been established as helpful tools in addition to analyses of observational data. Definitions of the global model classes are given and their capabilities as well as weaknesses are discussed. Examples of scientific studies indicate how numerical exercises contribute to an improved understanding of atmospheric behavior. There, the focus is on synergistic investigations combining observations and model results. The possible future developments and challenges are presented, not only from the scientific point of view but also regarding the computer technology and respective consequences for numerical modeling of atmospheric processes. In the future, a stronger cross-linkage of subject-specific scientists is necessary to tackle the looming challenges. It should link the specialist discipline and applied computer science.

  4. Characteristic properties of two different viscous cosmology models for the future universe

    Science.gov (United States)

    Normann, Ben David; Brevik, Iver

    2017-02-01

    We analyze characteristic properties of two different cosmological models: (i) a one-component dark energy model where the bulk viscosity ζ is associated with the fluid as a whole, and (ii) a two-component model where ζ is associated with a dark matter component ρm only, the dark energy component considered inviscid. Shear viscosity is omitted. We assume throughout the simple equation-of-state p = wρ with w a constant. In the one-component model, we consider two possibilities, either to take ζ proportional to the scalar expansion (equivalent to the Hubble parameter), in which case the evolution becomes critically dependent on the value of the small constant α = 1 + w and the magnitude of ζ, or we consider the case ζ = const., where a de Sitter final stage is reached in the future. In the two-component model, we consider only the case where the dark matter viscosity ζm is proportional to the square of ρm, where again a de Sitter form is found in the future. In this latter case, the formalism is supplemented by a phase space analysis. As a general result of our considerations, we suggest that a value ζ0 ˜ 106Pa ṡs for the present viscosity is reasonable, and that the two-component model seems to be favored.
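
    For orientation, the setting shared by both models above can be written in the standard form for a spatially flat FRW universe with a bulk-viscous fluid; the display below is a textbook-level reminder in the abstract's notation (p = wρ, expansion scalar θ = 3H), not an excerpt from the paper:

        \[ H^{2} = \frac{8\pi G}{3}\,\rho , \qquad \dot{\rho} + 3H\,(\rho + p_{\mathrm{eff}}) = 0 , \qquad p_{\mathrm{eff}} = w\rho - 3\zeta H . \]

    With ζ = const. the viscous term eventually dominates and drives the expansion towards the de Sitter stage mentioned above, whereas for ζ ∝ θ the late-time behaviour hinges on the sign and magnitude of α = 1 + w.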

  5. Characterizing the EPODE logic model: unravelling the past and informing the future.

    Science.gov (United States)

    Van Koperen, T M; Jebb, S A; Summerbell, C D; Visscher, T L S; Romon, M; Borys, J M; Seidell, J C

    2013-02-01

    EPODE ('Ensemble Prévenons l'Obésité Des Enfants' or 'Together let's Prevent Childhood Obesity') is a large-scale, centrally coordinated, capacity-building approach for communities to implement effective and sustainable strategies to prevent childhood obesity. Since 2004, EPODE has been implemented in over 500 communities in six countries. Although based on emergent practice and scientific knowledge, EPODE, like many community programs, lacks a logic model depicting key elements of the approach. The objective of this study is to gain insight into the dynamics and key elements of EPODE and to represent these in a schematic logic model. EPODE's process manuals and documents were collected and interviews were held with professionals involved in the planning and delivery of EPODE. Retrieved data were coded, themed and placed in a four-level logic model. With input from international experts, this model was scaled down to a concise logic model covering four critical components: political commitment, public and private partnerships, social marketing and evaluation. The EPODE logic model presented here can be used as a reference for future and follow-up research; to support future implementation of EPODE in communities; as a tool in the engagement of stakeholders; and to guide the construction of a locally tailored evaluation plan.

  6. A Markov switching model of the conditional volatility of crude oil futures prices

    Energy Technology Data Exchange (ETDEWEB)

    Fong, Wai Mun; See, Kim Hock [Department of Finance and Accounting, National University of Singapore, 119260 Kent Ridge Cresent (Singapore)

    2002-01-01

    This paper examines the temporal behaviour of volatility of daily returns on crude oil futures using a generalised regime switching model that allows for abrupt changes in mean and variance, GARCH dynamics, basis-driven time-varying transition probabilities and conditional leptokurtosis. This flexible model enables us to capture many complex features of conditional volatility within a relatively parsimonious set-up. We show that regime shifts are clearly present in the data and dominate GARCH effects. Within the high volatility state, a negative basis is more likely to increase regime persistence than a positive basis, a finding which is consistent with previous empirical research on the theory of storage. The volatility regimes identified by our model correlate well with major events affecting supply and demand for oil. Out-of-sample tests indicate that the regime switching model performs noticeably better than non-switching models regardless of evaluation criteria. We conclude that regime switching models provide a useful framework for the financial historian interested in studying factors behind the evolution of volatility and for oil futures traders interested in short-term volatility forecasts.
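
    The regime-switching idea can be illustrated with a short simulation sketch (this is not the authors' estimation code): a two-state Markov chain selects between a calm and a turbulent GARCH(1,1) variance process. The parameter values, the constant transition probabilities and the Gaussian innovations are simplifying assumptions for illustration only; the paper's model additionally allows basis-driven transition probabilities and fat-tailed errors.

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative two-regime GARCH(1,1) parameters: (omega, alpha, beta)
        params = {0: (0.01, 0.03, 0.90),   # calm regime
                  1: (0.20, 0.10, 0.80)}   # turbulent regime
        P = np.array([[0.98, 0.02],        # regime transition probabilities
                      [0.05, 0.95]])

        T = 1000
        state = 0
        h = 0.5                            # conditional variance
        returns = np.empty(T)
        states = np.empty(T, dtype=int)

        for t in range(T):
            state = rng.choice(2, p=P[state])       # Markov regime switch
            omega, alpha, beta = params[state]
            r = rng.normal(0.0, np.sqrt(h))         # return drawn from current variance
            h = omega + alpha * r**2 + beta * h     # GARCH(1,1) variance update
            returns[t], states[t] = r, state

        print("sample volatility by regime:",
              returns[states == 0].std(), returns[states == 1].std())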

  7. State of the art review and future directions in oil spill modeling.

    Science.gov (United States)

    Spaulding, Malcolm L

    2017-02-15

    A review of the state of the art in oil spill modeling, focused on the period from 2000 to present is provided. The review begins with an overview of the current structure of spill models and some lessons learned from model development and application and then provides guiding principles that govern the development of the current generation of spill models. A review of the basic structure of spill models, and new developments in specific transport and fate processes; including surface and subsurface transport, spreading, evaporation, dissolution, entrainment and oil droplet size distributions, emulsification, degradation, and sediment oil interaction are presented. The paper concludes with thoughts on future directions in the field with a primary focus on advancements in handling interactions between Lagrangian elements.

  8. Modelling and Simulation for Requirements Engineering and Options Analysis

    Science.gov (United States)

    2010-05-01

    DRDC Toronto CR 2010... externalize their mental model of the assumed solution for critique and correction by others, and whether or not this would assist in ensuring that

  9. Predicting Flu Season Requirements: An Undergraduate Modeling Project

    Science.gov (United States)

    Kramlich, Gary R., II; Braunstein Fierson, Janet L.; Wright, J. Adam

    2010-01-01

    This project was designed to be used in a freshman calculus class whose students had already been introduced to logistic functions and basic data modeling techniques. It need not be limited to such an audience, however; it has also been implemented in a topics in mathematics class for college upperclassmen. Originally intended to be presented in…
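
    The kind of logistic data fit such a project asks students to perform can be sketched as follows; the weekly case counts, parameter names and starting values below are made up for illustration and are not taken from the project materials.

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(t, K, r, t0):
            """Cumulative cases: carrying capacity K, growth rate r, midpoint t0."""
            return K / (1.0 + np.exp(-r * (t - t0)))

        # Hypothetical weekly cumulative flu-case counts for one season
        weeks = np.arange(1, 13)
        cases = np.array([12, 30, 70, 160, 340, 600, 880, 1100, 1230, 1290, 1320, 1330])

        (K, r, t0), _ = curve_fit(logistic, weeks, cases, p0=[1400, 0.8, 6])
        print(f"projected season total ~{K:.0f} cases, peak growth around week {t0:.1f}")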

  10. Thermal Modeling and Feedback Requirements for LIFE Neutronic Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, J E

    2009-07-15

    An initial study is performed to determine how temperature considerations affect LIFE neutronic simulations. Among other figures of merit, the isotopic mass accumulation, thermal power, tritium breeding, and criticality are analyzed. Possible fidelities of thermal modeling and degrees of coupling are explored. Lessons learned from switching and modifying nuclear datasets are communicated.

  11. FutureTox II: in vitro data and in silico models for predictive toxicology.

    Science.gov (United States)

    Knudsen, Thomas B; Keller, Douglas A; Sander, Miriam; Carney, Edward W; Doerrer, Nancy G; Eaton, David L; Fitzpatrick, Suzanne Compton; Hastings, Kenneth L; Mendrick, Donna L; Tice, Raymond R; Watkins, Paul B; Whelan, Maurice

    2015-02-01

    FutureTox II, a Society of Toxicology Contemporary Concepts in Toxicology workshop, was held in January, 2014. The meeting goals were to review and discuss the state of the science in toxicology in the context of implementing the NRC 21st century vision of predicting in vivo responses from in vitro and in silico data, and to define the goals for the future. Presentations and discussions were held on priority concerns such as predicting and modeling of metabolism, cell growth and differentiation, effects on sensitive subpopulations, and integrating data into risk assessment. Emerging trends in technologies such as stem cell-derived human cells, 3D organotypic culture models, mathematical modeling of cellular processes and morphogenesis, adverse outcome pathway development, and high-content imaging of in vivo systems were discussed. Although advances in moving towards an in vitro/in silico based risk assessment paradigm were apparent, knowledge gaps in these areas and limitations of technologies were identified. Specific recommendations were made for future directions and research needs in the areas of hepatotoxicity, cancer prediction, developmental toxicity, and regulatory toxicology.

  12. Planners in the Future City: Using City Information Modelling to Support Planners as Market Actors

    Directory of Open Access Journals (Sweden)

    Emine Mine Thompson

    2016-03-01

    Recently, Adams and Tiesdell (2010), Tewdwr-Jones (2012) and Batty (2013) have outlined the importance of information and intelligence in relation to the mediation and management of land, property and urban consumers in the future city. Traditionally, the challenge for urban planners was the generation of meaningful and timely information. Today, the urban planners' challenge is no longer the timely generation of urban data; rather, it is how so much information can be exploited and integrated successfully into contemporary spatial planning and governance. The paper investigates this challenge through a commentary on two City Information Modelling (CIM) case studies at Northumbria University, UK. This commentary is grouped around four key themes: accessibility and availability of data, accuracy and consistency of data, manageability of data, and integration of data. It is also designed to provoke discussion in relation to the exploitation and improvement of data modelling and visualisation in the urban planning discipline and to contribute to the literature in related fields. The paper concludes that the production of information, its use and modelling, can empower urban planners as they mediate and contest state-market relations in the city. However, its use should be circumspect, as data alone does not guarantee delivery of a sustainable urban future; rather, emphasis and future research should be placed upon the interpretation and use of data.

  13. Future intensification of hydro-meteorological extremes: downscaling using the weather research and forecasting model

    KAUST Repository

    El-Samra, R.

    2017-02-15

    A set of ten downscaling simulations at high spatial resolution (3 km horizontally) were performed using the Weather Research and Forecasting (WRF) model to generate future climate projections of annual and seasonal temperature and precipitation changes over the Eastern Mediterranean (with a focus on Lebanon). The model was driven with the High Resolution Atmospheric Model (HiRAM), running over the whole globe at a resolution of 25 km, under the conditions of two Representative Concentration Pathways (RCP) (4.5 and 8.5). Each downscaling simulation spanned one year. Two past years (2003 and 2008), also forced by HiRAM without data assimilation, were simulated to evaluate the model’s ability to capture the cold and wet (2003) and hot and dry (2008) extremes. The downscaled data were in the range of recent observed climatic variability, and therefore corrected for the cold bias of HiRAM. Eight future years were then selected based on an anomaly score that relies on the mean annual temperature and accumulated precipitation to identify the worst year per decade from a water resources perspective. One hot and dry year per decade, from 2011 to 2050, and per scenario was simulated and compared to the historic 2008 reference. The results indicate that hot and dry future extreme years will be exacerbated and the study area might be exposed to a significant decrease in annual precipitation (rain and snow), reaching up to 30% relative to the current extreme conditions.
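
    The abstract does not give the exact form of the anomaly score used to pick the worst year per decade; the following sketch shows one plausible standardized hot-and-dry ranking based on mean annual temperature and accumulated precipitation. All numbers, the function name and the exact weighting are illustrative assumptions, not the study's formulation.

        import numpy as np

        def hot_dry_score(t_mean, p_total):
            """Rank years by a combined hot-and-dry anomaly.

            t_mean, p_total: arrays of mean annual temperature and accumulated
            precipitation per year.  Higher score = hotter and drier than average.
            """
            zt = (t_mean - t_mean.mean()) / t_mean.std()
            zp = (p_total - p_total.mean()) / p_total.std()
            return zt - zp

        years = np.arange(2011, 2021)           # one hypothetical decade
        t_mean = np.array([15.1, 15.4, 14.9, 15.8, 15.2, 16.0, 15.5, 15.9, 15.3, 16.2])
        p_total = np.array([820, 640, 900, 560, 700, 480, 650, 500, 760, 430])

        score = hot_dry_score(t_mean, p_total)
        print("worst (hottest/driest) year of the decade:", years[np.argmax(score)])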

  14. Combat Medical Modernization: Posturing Low Supply And High Demand Assets To Meet Emerging And Future Capability Requirements

    Science.gov (United States)

    2015-07-01

    the Asia-Pacific. This shift could bring new requirements for military projection of power and therefore needed medical capability into an area... Lead Agent Medical Materiel – Pacific (TLAMM-P), as well as leading an MFST (Mobile Field Surgical Team) in the CENTCOM Area of Operation (AOR)... duty as Independent Duty Medical Technician (IDMT), Aeromedical Evacuation Technician (AET), Hyperbaric Medical Technician (HBMT); Allergy and/or

  15. Generation of future high-resolution rainfall time series with a disaggregation model

    Science.gov (United States)

    Müller, Hannes; Haberlandt, Uwe

    2017-04-01

    High-resolution rainfall data are needed in many fields of hydrology and water resources management. For analyses of future rainfall conditions, climate scenarios with hourly rainfall values exist. However, the direct usage of these data is associated with uncertainties, which can be indicated by comparisons of observations and C20 control runs. An alternative is the derivation of changes in rainfall behavior over time from climate simulations. Conclusions about future rainfall conditions can be drawn by adding these changes to observed time series. A multiplicative cascade model is used in this investigation for the disaggregation of daily rainfall amounts to hourly values. Model parameters can be estimated from REMO rainfall time series (UBA, BfG and ENS realizations), based on ECHAM5. Parameter estimation is carried out for the C20 period as well as for the near-term and long-term future (2021-2050 and 2071-2100). Change factors for both future periods are derived by parameter comparisons and added to the parameters estimated from observed time series. This enables the generation of hourly rainfall time series from observed daily values with respect to future changes. The investigation is carried out for rain gauges in Lower Saxony. The generated time series are analyzed regarding statistical characteristics, e.g. extreme values, event-based characteristics (wet spell duration and amounts, dry spell duration, …) and continuum characteristics (average intensity, fraction of dry intervals, …). The generation of the time series is validated by comparing the changes in the statistical characteristics from the REMO data and from the disaggregated data.
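
    A minimal sketch of the multiplicative cascade idea is given below, assuming a micro-canonical cascade with branching 3 x 2 x 2 x 2 (one daily value to 24 hourly values) and Dirichlet-distributed splitting weights; the cited model's branching scheme, position-dependent probabilities and parameter estimation from REMO data are more elaborate, so treat all parameter choices here as assumptions.

        import numpy as np

        rng = np.random.default_rng(42)

        def disaggregate_day(daily_mm, branching=(3, 2, 2, 2), a=0.7):
            """Split one daily rainfall amount into 24 hourly values with a simple
            micro-canonical multiplicative cascade.  At every branching step the
            parent amount is distributed among its children with random weights
            that sum to one (Dirichlet with parameter a)."""
            values = np.array([daily_mm], dtype=float)
            for b in branching:                   # 1 -> 3 -> 6 -> 12 -> 24 intervals
                weights = rng.dirichlet([a] * b, size=values.size)
                values = (values[:, None] * weights).ravel()
            return values                         # 24 hourly amounts

        hourly = disaggregate_day(18.0)
        print(len(hourly), round(hourly.sum(), 3))   # 24, 18.0

    Because every splitting step conserves the parent amount exactly, the daily total is preserved by construction, which is the property that makes such cascades attractive for disaggregation.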

  16. Historical and future fire occurrence (1850 to 2100) simulated in CMIP5 Earth System Models

    Science.gov (United States)

    Kloster, Silvia; Lasslop, Gitta

    2017-03-01

    Earth System Models (ESMs) have recently integrated fire processes into their vegetation model components to account for fire as an important disturbance process for vegetation dynamics and as an agent in the land carbon cycle. The present study analyses the performance of ESMs that participated in the 5th Coupled Model Intercomparison Project (CMIP5) in simulating historical and future fire occurrence. The global present-day (1981 to 2005) burned area simulated in the analysed ESMs ranges between 149 and 208 Mha, which is substantially lower than the most recent observation-based estimate of 399 Mha (GFEDv4s averaged over the time period 1997 to 2015). Simulated global fire carbon emissions, however, are, at 2.0 to 2.7 Pg C/year, on the higher end compared to the GFEDv4s estimate of 2.2 Pg C/year. Regionally, the largest differences are found for Africa. Over the historical period (1850 to 2005) changes in simulated fire carbon emissions range between an increase of +43% and a decrease of -35%. For the future (2005 to 2100) we analysed the CMIP5 simulations following the representative concentration pathways (RCPs) 2.6, 4.5, and 8.5, for which the strongest changes in global fire carbon emissions simulated in the individual ESMs amount to +8%, +52% and +58%, respectively. Overall, however, there is little agreement between the individual ESMs on how fire occurrence changed over the past or will change in the future. Furthermore, contrasting simulated changes in fire carbon emissions with changes in annual mean precipitation shows no emergent pattern among the different analysed ESMs on the regional or global scale. This indicates differences in the individual fire model representations that should be the subject of upcoming fire model intercomparison studies. The increasing information derived from observational datasets (charcoal, ice cores, satellite, inventories) will help to further constrain the trajectories of fire models.

  17. Mathematical Formulation Requirements and Specifications for the Process Models

    Energy Technology Data Exchange (ETDEWEB)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-11-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments

  18. Drivers and uncertainties of future global marine primary production in marine ecosystem models

    Directory of Open Access Journals (Sweden)

    C. Laufkötter

    2015-02-01

    Past model studies have projected a global decrease in marine net primary production (NPP) over the 21st century, but these studies focused on the multi-model mean and mostly ignored the large inter-model differences. Here, we analyze model-simulated changes of NPP for the 21st century under IPCC's high-emission scenario RCP8.5 using a suite of nine coupled carbon–climate Earth System Models with embedded marine ecosystem models, with a focus on the spread between the different models and the underlying reasons. Globally, five out of the nine models show a decrease in NPP over the course of the 21st century, while three show no significant trend and one even simulates an increase. The largest model spread occurs in the low latitudes (between 30° S and 30° N), with individual models simulating relative changes between −25 and +40%. In this region, the inter-quartile range of the differences between the 2012–2031 average and the 2081–2100 average is up to 3 mol C m-2 yr-1. These large differences in future change mirror large differences in present-day NPP. Of the seven models diagnosing a net decrease in NPP in the low latitudes, only three simulate this to be a consequence of the classical interpretation, i.e., a stronger nutrient limitation due to increased stratification and reduced upwelling. In the other four, warming-induced increases in phytoplankton growth outbalance the stronger nutrient limitation. However, temperature-driven increases in grazing and other loss processes cause a net decrease in phytoplankton biomass and reduce NPP despite higher growth rates. One model projects a strong increase in NPP in the low latitudes, caused by an intensification of the microbial loop, while the remaining model simulates changes of less than 0.5%. While there is more consistency in the modeled increase in NPP in the Southern Ocean, the regional inter-model range is also very substantial. In most models, this increase in NPP is driven by

  19. Reference model of future ubiquitous convergent network and context-aware telecommunication service platform

    Institute of Scientific and Technical Information of China (English)

    QIAO Xiu-quan; LI Xiao-feng; LIANG Shou-qing

    2006-01-01

    A reference model for the future ubiquitous convergent network is analyzed. To provide user-centric, intelligent, personalized services, this article presents a context-aware telecommunication service platform (CaTSP) that adapts to dynamically changing context. The article focuses on the design method of the context-aware telecommunication service platform and its architecture. Through the use of model-driven architecture (MDA) and semantic web technologies, CaTSP can enable context reasoning and service personalization adaptation. This article explores a new approach to service intelligence, personalization, and adaptability in the semantic web service computing era.

  20. "Open Access" Requires Clarification: Medical Journal Publication Models Evolve.

    Science.gov (United States)

    Lubowitz, James H; Brand, Jefferson C; Rossi, Michael J; Provencher, Matthew T

    2017-03-01

    While Arthroscopy journal is a traditional subscription model journal, our companion journal Arthroscopy Techniques is "open access." We used to believe open access simply meant online and free of charge. However, while open-access journals are free to readers, in 2017 authors must make a greater sacrifice in the form of an article-processing charge (APC). Again, while this does not apply to Arthroscopy, the APC will apply to Arthroscopy Techniques.

  1. A practitioner's perspective on the uses and future developments for wastewater treatment modelling.

    Science.gov (United States)

    Daigger, G T

    2011-01-01

    The modern age of wastewater treatment modelling began with publication of the International Water Association (IWA) Activated Sludge Model (ASM) No.1 and has advanced significantly since. Models are schematic representations of systems that are useful for analysis to support decision-making. The most appropriate model for a particular application often incorporates only those components essential for the particular analyses to be performed (i.e. the simplest model possible). Characteristics of effective models are presented, along with how wastewater modelling is integrated into the wastewater project life cycle. The desirable characteristics of wastewater treatment modelling platforms are then reviewed. Current developments of note in wastewater treatment modelling practice include estimates of greenhouse gas emissions, incorporating uncertainty into wastewater modelling and design practice, more fundamental modelling of process chemistry, and improved understanding of the degradability of wastewater constituents in different environments. Areas requiring greater emphasis include increased use of metabolic modelling, characterisation of the hydrodynamics of suspended and biofilm biological treatment processes, and the integration of biofilm and suspended growth process modelling. Wastewater treatment models must also interface with water and wastewater management software packages. While wastewater treatment modelling will continue to advance and make important contributions to practice, it must be remembered that these are complex systems which exhibit counter-intuitive behaviour (results differ from initial expectations) and multiple dynamic steady-states which can abruptly transition from one to another.

  2. Modelling Future Coronary Heart Disease Mortality to 2030 in the British Isles.

    Science.gov (United States)

    Hughes, John; Kabir, Zubair; Bennett, Kathleen; Hotchkiss, Joel W; Kee, Frank; Leyland, Alastair H; Davies, Carolyn; Bandosz, Piotr; Guzman-Castillo, Maria; O'Flaherty, Martin; Capewell, Simon; Critchley, Julia

    2015-01-01

    Despite rapid declines over the last two decades, coronary heart disease (CHD) mortality rates in the British Isles are still amongst the highest in Europe. This study uses a modelling approach to compare the potential impact of future risk factor scenarios relating to smoking and physical activity levels, dietary salt and saturated fat intakes on future CHD mortality in three countries: Northern Ireland (NI), Republic of Ireland (RoI) and Scotland. CHD mortality models previously developed and validated in each country were extended to predict potential reductions in CHD mortality from 2010 (baseline year) to 2030. Risk factor trends data from recent surveys at baseline were used to model alternative future risk factor scenarios: Absolute decreases in (i) smoking prevalence and (ii) physical inactivity rates of up to 15% by 2030; relative decreases in (iii) dietary salt intake of up to 30% by 2030 and (iv) dietary saturated fat of up to 6% by 2030. Probabilistic sensitivity analyses were then conducted. Projected populations in 2030 were 1.3, 3.4 and 3.9 million in NI, RoI and Scotland respectively (adults aged 25-84). In 2030, assuming recent declining mortality trends continue: 15% absolute reductions in smoking could decrease CHD deaths by 5.8-7.2%. 15% absolute reductions in physical inactivity levels could decrease CHD deaths by 3.1-3.6%. Relative reductions in salt intake of 30% could decrease CHD deaths by 5.2-5.6% and a 6% reduction in saturated fat intake might decrease CHD deaths by some 7.8-9.0%. These projections remained stable under a wide range of sensitivity analyses. Feasible reductions in four cardiovascular risk factors (already achieved elsewhere) could substantially reduce future coronary deaths. More aggressive policies are therefore needed in the British Isles to control tobacco, promote healthy food and increase physical activity.

  3. Modelling Future Coronary Heart Disease Mortality to 2030 in the British Isles.

    Directory of Open Access Journals (Sweden)

    John Hughes

    Despite rapid declines over the last two decades, coronary heart disease (CHD) mortality rates in the British Isles are still amongst the highest in Europe. This study uses a modelling approach to compare the potential impact of future risk factor scenarios relating to smoking and physical activity levels, dietary salt and saturated fat intakes on future CHD mortality in three countries: Northern Ireland (NI), Republic of Ireland (RoI) and Scotland. CHD mortality models previously developed and validated in each country were extended to predict potential reductions in CHD mortality from 2010 (baseline year) to 2030. Risk factor trends data from recent surveys at baseline were used to model alternative future risk factor scenarios: absolute decreases in (i) smoking prevalence and (ii) physical inactivity rates of up to 15% by 2030; relative decreases in (iii) dietary salt intake of up to 30% by 2030 and (iv) dietary saturated fat of up to 6% by 2030. Probabilistic sensitivity analyses were then conducted. Projected populations in 2030 were 1.3, 3.4 and 3.9 million in NI, RoI and Scotland respectively (adults aged 25-84). In 2030, assuming recent declining mortality trends continue: 15% absolute reductions in smoking could decrease CHD deaths by 5.8-7.2%; 15% absolute reductions in physical inactivity levels could decrease CHD deaths by 3.1-3.6%; relative reductions in salt intake of 30% could decrease CHD deaths by 5.2-5.6%; and a 6% reduction in saturated fat intake might decrease CHD deaths by some 7.8-9.0%. These projections remained stable under a wide range of sensitivity analyses. Feasible reductions in four cardiovascular risk factors (already achieved elsewhere) could substantially reduce future coronary deaths. More aggressive policies are therefore needed in the British Isles to control tobacco, promote healthy food and increase physical activity.

  4. Modelling Bambara Groundnut Yield in Southern Africa: Towards a Climate-Resilient Future

    Science.gov (United States)

    Karunaratne, A. S.; Walker, S.; Ruane, A. C.

    2015-01-01

    Current agriculture depends on a few major species grown as monocultures that are supported by global research underpinning current productivity. However, many hundreds of alternative crops have the potential to meet real-world challenges by sustaining humanity, diversifying agricultural systems for food and nutritional security, and especially responding to climate change through their resilience to certain climate conditions. Bambara groundnut (Vigna subterranea (L.) Verdc.), an underutilised African legume, is an exemplar crop for climate resilience. Predicted yield performances of Bambara groundnut by AquaCrop (a crop-water productivity model) were evaluated for baseline (1980-2009) and mid-century climates (2040-2069) under 20 downscaled Global Climate Models (CMIP5-RCP8.5), as well as for climate sensitivities (AgMIP C3MP), across 3 locations in Southern Africa (Botswana, South Africa, Namibia). Different landraces of Bambara groundnut originating from various semi-arid African locations showed diverse yield performances with diverse sensitivities to climate. S19, originating from hot-dry conditions in Namibia, has greater future yield potential compared to the Swaziland landrace Uniswa Red-UN across study sites. South Africa has the lowest yield under the current climate, indicating positive future yield trends. Namibia reported the highest baseline yield at optimum current temperatures, indicating less yield potential in future climates. Bambara groundnut shows positive yield potential at temperatures of up to 31 °C, with further warming pushing yields down. Thus, many regions in Southern Africa can utilize Bambara groundnut successfully in the coming decades. This modelling exercise supports decisions on genotypic suitability for present and future climates at specific locations.

  6. Projecting future expansion of invasive species: comparing and improving methodologies for species distribution modeling.

    Science.gov (United States)

    Mainali, Kumar P; Warren, Dan L; Dhileepan, Kunjithapatham; McConnachie, Andrew; Strathie, Lorraine; Hassan, Gul; Karki, Debendra; Shrestha, Bharat B; Parmesan, Camille

    2015-12-01

    Modeling the distributions of species, especially of invasive species in non-native ranges, involves multiple challenges. Here, we developed some novel approaches to species distribution modeling aimed at reducing the influences of such challenges and improving the realism of projections. We estimated species-environment relationships for Parthenium hysterophorus L. (Asteraceae) with four modeling methods run with multiple scenarios of (i) sources of occurrences and geographically isolated background ranges for absences, (ii) approaches to drawing background (absence) points, and (iii) alternate sets of predictor variables. We further tested various quantitative metrics of model evaluation against biological insight. Model projections were very sensitive to the choice of training dataset. Model accuracy was much improved using a global dataset for model training, rather than restricting data input to the species' native range. AUC score was a poor metric for model evaluation and, if used alone, was not a useful criterion for assessing model performance. Projections away from the sampled space (i.e., into areas of potential future invasion) were very different depending on the modeling methods used, raising questions about the reliability of ensemble projections. Generalized linear models gave very unrealistic projections far away from the training region. Models that efficiently fit the dominant pattern, but exclude highly local patterns in the dataset and capture interactions as they appear in data (e.g., boosted regression trees), improved generalization of the models. Biological knowledge of the species and its distribution was important in refining choices about the best set of projections. A post hoc test conducted on a new Parthenium dataset from Nepal validated excellent predictive performance of our 'best' model. We showed that vast stretches of currently uninvaded geographic areas on multiple continents harbor highly suitable habitats for parthenium

  7. Future US energy demands based upon traditional consumption patterns lead to requirements which significantly exceed domestic supply

    Science.gov (United States)

    1975-01-01

    Energy consumption in the United States has risen in response to both increasing population and increasing levels of affluence. Depletion of domestic energy reserves requires consumption modulation, production of fossil fuels, more efficient conversion techniques, and large-scale transitions to non-fossil-fuel energy sources. The widening disparity between the wealthy and poor nations of the world contributes to trends that increase the likelihood of group action by the lesser developed countries to achieve political and economic goals. The formation of anticartel cartels is envisioned.

  8. Modeling Global Water Use for the 21st Century: Water Futures and Solutions (WFaS) Initiative and Its Approaches

    Science.gov (United States)

    Wada, Y.; Florke, M.; Hanasaki, N.; Eisner, S.; Fischer, G.; Tramberend, S.; Satoh, Y.; van Vliet, M. T. H.; Yillia, P.; Ringler, C.; Burek, P.; Wiberg, D.

    2016-01-01

    To sustain growing food demand and increasing standard of living, global water use increased by nearly 6 times during the last 100 years, and continues to grow. As water demands get closer and closer to the water availability in many regions, each drop of water becomes increasingly valuable and water must be managed more efficiently and intensively. However, soaring water use worsens water scarcity conditions already prevalent in semi-arid and arid regions, increasing uncertainty for sustainable food production and economic development. Planning for future development and investments requires that we prepare water projections for the future. However, estimations are complicated because the future of the world's waters will be influenced by a combination of environmental, social, economic, and political factors, and there is only limited knowledge and data available about freshwater resources and how they are being used. The Water Futures and Solutions (WFaS) initiative coordinates its work with other ongoing scenario efforts for the sake of establishing a consistent set of new global water scenarios based on the shared socio-economic pathways (SSPs) and the representative concentration pathways (RCPs). The WFaS "fast track" assessment uses three global water models, namely H08, PCR-GLOBWB, and WaterGAP. This study assesses the state of the art for estimating and projecting water use regionally and globally in a consistent manner. It provides an overview of different approaches, the uncertainty, strengths and weaknesses of the various estimation methods, types of management and policy decisions for which the current estimation methods are useful. We also discuss additional information most needed to be able to improve water use estimates and be able to assess a greater range of management options across the water-energy-climate nexus.

  9. Modeling global water use for the 21st century: Water Futures and Solutions (WFaS) initiative and its approaches

    Science.gov (United States)

    Wada, Y.; Flörke, M.; Hanasaki, N.; Eisner, S.; Fischer, G.; Tramberend, S.; Satoh, Y.; van Vliet, M. T. H.; Yillia, P.; Ringler, C.; Wiberg, D.

    2015-08-01

    To sustain growing food demand and increasing standard of living, global water use increased by nearly 6 times during the last 100 years and continues to grow. As water demands get closer and closer to the water availability in many regions, each drop of water becomes increasingly valuable and water must be managed more efficiently and intensively. However, soaring water use worsens water scarcity conditions already prevalent in semi-arid and arid regions, increasing uncertainty for sustainable food production and economic development. Planning for future development and investments requires that we prepare water projections for the future. However, estimations are complicated because the future of the world's waters will be influenced by a combination of environmental, social, economic, and political factors, and there is only limited knowledge and data available about freshwater resources and how they are being used. The Water Futures and Solutions initiative (WFaS) coordinates its work with other on-going scenario efforts for the sake of establishing a consistent set of new global water scenarios based on the Shared Socioeconomic Pathways (SSPs) and the Representative Concentration Pathways (RCPs). The WFaS "fast-track" assessment uses three global water models, namely H08, PCR-GLOBWB, and WaterGAP. This study assesses the state of the art for estimating and projecting water use regionally and globally in a consistent manner. It provides an overview of different approaches, the uncertainty, strengths and weaknesses of the various estimation methods, and the types of management and policy decisions for which the current estimation methods are useful. We also discuss additional information most needed to be able to improve water use estimates and be able to assess a greater range of management options across the water-energy-climate nexus.

  10. Modeling global water use for the 21st century: Water Futures and Solutions (WFaS initiative and its approaches

    Directory of Open Access Journals (Sweden)

    Y. Wada

    2015-08-01

    To sustain growing food demand and increasing standard of living, global water use increased by nearly 6 times during the last 100 years and continues to grow. As water demands get closer and closer to the water availability in many regions, each drop of water becomes increasingly valuable and water must be managed more efficiently and intensively. However, soaring water use worsens water scarcity conditions already prevalent in semi-arid and arid regions, increasing uncertainty for sustainable food production and economic development. Planning for future development and investments requires that we prepare water projections for the future. However, estimations are complicated because the future of the world's waters will be influenced by a combination of environmental, social, economic, and political factors, and there is only limited knowledge and data available about freshwater resources and how they are being used. The Water Futures and Solutions initiative (WFaS) coordinates its work with other on-going scenario efforts for the sake of establishing a consistent set of new global water scenarios based on the Shared Socioeconomic Pathways (SSPs) and the Representative Concentration Pathways (RCPs). The WFaS "fast-track" assessment uses three global water models, namely H08, PCR-GLOBWB, and WaterGAP. This study assesses the state of the art for estimating and projecting water use regionally and globally in a consistent manner. It provides an overview of different approaches, the uncertainty, strengths and weaknesses of the various estimation methods, and the types of management and policy decisions for which the current estimation methods are useful. We also discuss additional information most needed to be able to improve water use estimates and be able to assess a greater range of management options across the water-energy-climate nexus.

  11. Requirements for Logical Models for Value-Added Tax Legislation

    DEFF Research Database (Denmark)

    Nielsen, Morten Ib; Simonsen, Jakob Grue; Larsen, Ken Friis

    Enterprise resource planning (ERP) systems are ubiquitous in commercial enterprises of all sizes and invariably need to account for the notion of value-added tax (VAT). The legal and technical difficulties in handling VAT are exacerbated by spanning a broad and chaotic spectrum of intricate country-specific needs. Currently, these difficulties are handled in most major ERP systems by customising and localising the native code of the ERP systems for each specific country and industry. We propose an alternative that uses logical modeling of VAT legislation. The potential benefit is to eventually transform...
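
    The separation the abstract argues for, that is, country-specific VAT rules expressed as declarative data evaluated by a generic engine instead of customized native ERP code, can be pictured with a deliberately tiny sketch. This is not the authors' logical formalism; the rule table, rates and names below are hypothetical placeholders.

        from dataclasses import dataclass

        @dataclass
        class Transaction:
            country: str
            category: str      # e.g. "standard", "food"
            net_amount: float

        # Country-specific VAT rules kept as data, not as customized ERP code.
        # Rates are illustrative placeholders.
        VAT_RULES = {
            ("DK", "standard"): 0.25,
            ("DE", "standard"): 0.19,
            ("DE", "food"):     0.07,
        }

        def vat_due(tx: Transaction) -> float:
            """Generic engine: look up the applicable rule and apply it."""
            rate = VAT_RULES.get((tx.country, tx.category))
            if rate is None:
                raise ValueError(f"no VAT rule for {tx.country}/{tx.category}")
            return round(tx.net_amount * rate, 2)

        print(vat_due(Transaction("DE", "food", 100.0)))   # 7.0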

  12. USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process. Volume 3. Future to be Asset Sustainment Process Model.

    Science.gov (United States)

    2007-11-02

    Models), contains the To-Be Retail Asset Sustainment Process Model displaying the activities and functions related to the improved processes for receipt...of a logistics process model for a more distant future asset sustainment scenario unconstrained by today’s logistics information systems limitations...It also contains a process model reflecting the Reengineering Team’s vision of the future asset sustainment process.

  13. CONTINUING EDUCATION MODEL OF SCHOOL OF FUTURE (CASE STUDY OF ENGINEERING SCHOOL

    Directory of Open Access Journals (Sweden)

    Olga A. Karlova

    2013-01-01

    The paper presents a concept of continuous engineering education for the educational complex kindergarten-school-high school. Based on a space-time model of memory and thinking, we define the didactic requirements for organizing the educational process so as to ensure the strength and depth of knowledge and the formation of engineering thinking in students. We offer a model of the School of Engineering that implements these requirements through cloud and cluster technologies for multi-age learning and the mega-class. The paper also outlines ways of establishing the Engineering School in Zheleznogorsk.

  14. Predicting future conflict between team-members with parameter-free models of social networks

    Science.gov (United States)

    Rovira-Asenjo, Núria; Gumí, Tània; Sales-Pardo, Marta; Guimerà, Roger

    2013-06-01

    Despite the well-documented benefits of working in teams, teamwork also results in communication, coordination and management costs, and may lead to personal conflict between team members. In a context where teams play an increasingly important role, it is of major importance to understand conflict and to develop diagnostic tools to avert it. Here, we investigate empirically whether it is possible to quantitatively predict future conflict in small teams using parameter-free models of social network structure. We analyze data of conflict appearance and resolution between 86 team members in 16 small teams, all working in a real project for nine consecutive months. We find that group-based models of complex networks successfully anticipate conflict in small teams whereas micro-based models of structural balance, which have been traditionally used to model conflict, do not.

  15. Predicting future conflict between team-members with parameter-free models of social networks

    CERN Document Server

    Rovira-Asenjo, Nuria; Sales-Pardo, Marta; Guimera, Roger

    2014-01-01

    Despite the well-documented benefits of working in teams, teamwork also results in communication, coordination and management costs, and may lead to personal conflict between team members. In a context where teams play an increasingly important role, it is of major importance to understand conflict and to develop diagnostic tools to avert it. Here, we investigate empirically whether it is possible to quantitatively predict future conflict in small teams using parameter-free models of social network structure. We analyze data of conflict appearance and resolution between 86 team members in 16 small teams, all working in a real project for nine consecutive months. We find that group-based models of complex networks successfully anticipate conflict in small teams whereas micro-based models of structural balance, which have been traditionally used to model conflict, do not.

  16. Predicting future conflict between team-members with parameter-free models of social networks.

    Science.gov (United States)

    Rovira-Asenjo, Núria; Gumí, Tània; Sales-Pardo, Marta; Guimerà, Roger

    2013-01-01

    Despite the well-documented benefits of working in teams, teamwork also results in communication, coordination and management costs, and may lead to personal conflict between team members. In a context where teams play an increasingly important role, it is of major importance to understand conflict and to develop diagnostic tools to avert it. Here, we investigate empirically whether it is possible to quantitatively predict future conflict in small teams using parameter-free models of social network structure. We analyze data of conflict appearance and resolution between 86 team members in 16 small teams, all working in a real project for nine consecutive months. We find that group-based models of complex networks successfully anticipate conflict in small teams whereas micro-based models of structural balance, which have been traditionally used to model conflict, do not.

  17. The technology acceptance model: its past and its future in health care.

    Science.gov (United States)

    Holden, Richard J; Karsh, Ben-Tzion

    2010-02-01

    Increasing interest in end users' reactions to health information technology (IT) has elevated the importance of theories that predict and explain health IT acceptance and use. This paper reviews the application of one such theory, the Technology Acceptance Model (TAM), to health care. We reviewed 16 data sets analyzed in over 20 studies of clinicians using health IT for patient care. Studies differed greatly in samples and settings, health ITs studied, research models, relationships tested, and construct operationalization. Certain TAM relationships were consistently found to be significant, whereas others were inconsistent. Several key relationships were infrequently assessed. Findings show that TAM predicts a substantial portion of the use or acceptance of health IT, but that the theory may benefit from several additions and modifications. Aside from improved study quality, standardization, and theoretically motivated additions to the model, an important future direction for TAM is to adapt the model specifically to the health care context, using beliefs elicitation methods.

  18. Polar predictability: exploring the influence of GCM and regional model uncertainty on future ice sheet climates

    Science.gov (United States)

    Reusch, D. B.

    2015-12-01

    Evaluating uncertainty in GCMs and regional-scale forecast models is an essential step in the development of climate change predictions. Polar-region skill is particularly important due to the potential for changes affecting both local (ice sheet) and global (sea level) environments through more frequent/intense surface melting and changes in precipitation type/amount. High-resolution, regional-scale models also use GCMs as a source of boundary/initial conditions in future scenarios, thus inheriting a measure of GCM-derived externally-driven uncertainty. We examine inter- and intramodel uncertainty through statistics from decadal climatologies and analyses of variability based on self-organizing maps (SOMs), a nonlinear data analysis tool. We evaluate a 19-member CMIP5 subset and the 30-member CESM1.0-CAM5-BGC Large Ensemble (CESMLE) during polar melt seasons (boreal/austral summer) for recent (1981-2000) and future (2081-2100, RCP 8.5) decades. Regional-model uncertainty is examined with a subset of these GCMs driving Polar WRF simulations. Decadal climatologies relative to a reference (recent: the ERA-Interim reanalysis; future: a skillful modern GCM) identify model uncertainty in bulk, e.g., BNU-ESM is too warm, CMCC-CM too cold. While quite useful for model screening, diagnostic benefit is often indirect. SOMs extend our diagnostics by providing a concise, objective summary of model variability as a set of generalized patterns. Joint analysis of reference and test models summarizes the variability of multiple realizations of climate (all the models), benchmarks each model versus the reference (frequency analysis helps identify the patterns behind GCM bias), and places each GCM in a common context. Joint SOM analysis of CESMLE members shows how initial conditions contribute to differences in modeled climates, providing useful information about internal variability, such as contributions from each member to overall uncertainty using pattern frequencies. In the
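
    As an illustration of how a SOM condenses model variability into a small set of generalized patterns, here is a minimal self-organizing map in NumPy trained on synthetic stand-ins for daily circulation fields. It is not the analysis pipeline used in the study; grid size, decay schedules and the two artificial "regimes" are assumptions made only so the sketch runs end to end.

        import numpy as np

        rng = np.random.default_rng(1)

        def train_som(data, rows=3, cols=4, iters=2000, lr0=0.5, sigma0=2.0):
            """Minimal self-organizing map: returns a (rows*cols, n_features) codebook."""
            n_nodes, n_feat = rows * cols, data.shape[1]
            grid = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
            weights = rng.normal(size=(n_nodes, n_feat))
            for t in range(iters):
                x = data[rng.integers(len(data))]
                bmu = np.argmin(((weights - x) ** 2).sum(axis=1))   # best-matching unit
                frac = t / iters
                lr = lr0 * (1.0 - frac)                             # decaying learning rate
                sigma = sigma0 * (1.0 - frac) + 0.5                 # shrinking neighbourhood
                d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)          # grid distance to BMU
                h = np.exp(-d2 / (2.0 * sigma ** 2))                # neighbourhood kernel
                weights += lr * h[:, None] * (x - weights)
            return weights

        # Synthetic stand-in for flattened daily fields with two artificial regimes
        regime_a = rng.normal(+1.0, 0.3, size=(300, 20))
        regime_b = rng.normal(-1.0, 0.3, size=(300, 20))
        fields = np.vstack([regime_a, regime_b])

        codebook = train_som(fields)
        bmus = np.argmin(((fields[:, None, :] - codebook[None, :, :]) ** 2).sum(-1), axis=1)
        freq = np.bincount(bmus, minlength=codebook.shape[0]) / len(fields)
        print("pattern occupation frequencies:", np.round(freq, 2))

    Comparing such pattern occupation frequencies between a reference dataset and each test model is the kind of frequency analysis the abstract describes for benchmarking GCMs against a common set of patterns.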

  19. The influence of HBV model calibration on flood predictions for future climate

    Science.gov (United States)

    Osuch, Marzena; Romanowicz, Renata

    2014-05-01

    The temporal variability of HBV rainfall-runoff model parameters was tested to address the influence of climate characteristics on the optimal parameter values. HBV is a conceptual model with a physically-based structure that takes into account soil moisture, snow-melt and dynamic runoff components. The model parameters were optimized by the DEGL method (Differential Evolution with Global and Local neighbours) for a set of catchments located in Poland. The methodology consisted of the calibration and cross-validation of the HBV models on a series of five-year periods within a moving window. The optimal parameter values show large temporal variability and dependence on climatic conditions described by the mean and standard deviation of precipitation, air temperature and PET. The derived regression models between parameters and climatic indices were statistically significant at the 0.05 level. The sets of optimal parameter values were applied to simulate future flows in a changed climate. We used the precipitation and temperature series from 6 RCM/GCM models for 2071-2100 following the A1B climate change scenario. The climatic variables were obtained from the KLIMADA project. The resulting flow series for the future climate scenario were used to derive flow indices, including the flood quantiles. The results indicate a large influence of climatic variability on the flow indices. This work was partly supported by the project "Stochastic flood forecasting system (The River Vistula reach from Zawichost to Warsaw)" carried out by the Institute of Geophysics, Polish Academy of Sciences by order of the National Science Centre (contract No. 2011/01/B/ST10/06866). The rainfall and flow data were provided by the Institute of Meteorology and Water Management (IMGW), Poland.
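
    The window-by-window calibration idea can be sketched with a toy one-parameter runoff model standing in for HBV and scipy's differential_evolution standing in for DEGL. The synthetic forcing, parameter bounds and the use of consecutive (rather than overlapping) five-year blocks are illustrative assumptions; the point is only that re-calibrating per period exposes how the optimal parameter drifts with conditions.

        import numpy as np
        from scipy.optimize import differential_evolution

        rng = np.random.default_rng(3)

        def toy_model(precip, k):
            """Toy stand-in for HBV: a single linear reservoir with recession k."""
            q, s = np.empty_like(precip), 0.0
            for i, p in enumerate(precip):
                s += p
                q[i] = k * s
                s -= q[i]
            return q

        blocks = 4                              # four five-year calibration periods
        days_per_block = 5 * 365
        precip = rng.gamma(0.6, 4.0, size=blocks * days_per_block)
        true_k = np.linspace(0.05, 0.15, blocks)        # "climate-driven" drift of the parameter
        q_obs = np.concatenate([toy_model(precip[b*days_per_block:(b+1)*days_per_block], true_k[b])
                                for b in range(blocks)])

        for b in range(blocks):
            sl = slice(b * days_per_block, (b + 1) * days_per_block)
            sse = lambda k, sl=sl: np.sum((toy_model(precip[sl], k[0]) - q_obs[sl]) ** 2)
            res = differential_evolution(sse, bounds=[(0.01, 0.5)], seed=1)
            print(f"block {b}: true k = {true_k[b]:.3f}, calibrated k = {res.x[0]:.3f}")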

  20. Future planetary X-ray and gamma-ray remote sensing system and in situ requirements for room temperature solid state detectors

    CERN Document Server

    Trombka, J I; Starr, R; Clark, P E; Floyd, S R

    1999-01-01

    X-ray and gamma-ray remote sensing observations find important applications in the study of the development of the planets. Orbital measurements can be carried out on solar-system bodies whose atmospheres and trapped radiation environments do not interfere significantly with the emissions. Elemental compositions can be inferred from observations of these line emissions. Future planetary missions also will involve landing both stationary and roving probes on planetary surfaces. Both X-ray and gamma-ray spectrometers will be used for performing elemental analysis of surface samples. These future planetary missions will impose a number of constraints: the flight instruments must be significantly reduced in weight from those previously flown; for many missions, gravity assist will be required, greatly increasing mission duration, resulting in the passage of several years before the first scientific measurement of a solar system body. The detector systems must operate reliably after years of cosmic-ray irradiation...