WorldWideScience

Sample records for risk model version

  1. The temporal version of the pediatric sepsis biomarker risk model.

    Directory of Open Access Journals (Sweden)

    Hector R Wong

    Full Text Available PERSEVERE is a risk model for estimating mortality probability in pediatric septic shock, using five biomarkers measured within 24 hours of clinical presentation. Here, we derive and test a temporal version of PERSEVERE (tPERSEVERE) that considers biomarker values at the first and third day following presentation to estimate the probability of a "complicated course", defined as persistence of ≥2 organ failures at seven days after meeting criteria for septic shock, or death within 28 days. Biomarkers were measured in the derivation cohort (n = 225) using serum samples obtained during days 1 and 3 of septic shock. Classification and Regression Tree (CART) analysis was used to derive a model to estimate the risk of a complicated course. The derived model was validated in the test cohort (n = 74), and subsequently updated using the combined derivation and test cohorts. A complicated course occurred in 23% of the derivation cohort subjects. The derived model had a sensitivity for a complicated course of 90% (95% CI 78-96), a specificity of 70% (62-77), a positive predictive value of 47% (37-58), and a negative predictive value of 96% (91-99). The area under the receiver operating characteristic curve was 0.85 (0.79-0.90). Similar test characteristics were observed in the test cohort. The updated model had a sensitivity of 91% (81-96), a specificity of 70% (64-76), a positive predictive value of 47% (39-56), and a negative predictive value of 96% (92-99). tPERSEVERE reasonably estimates the probability of a complicated course in children with septic shock. tPERSEVERE could potentially serve as an adjunct to physiological assessments for monitoring how risk for poor outcomes changes during early interventions in pediatric septic shock.
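
    The reported predictive values are consistent with the reported sensitivity, specificity, and 23% event rate; a quick check via Bayes' rule (a verification sketch, not part of the study):

        # Verify PPV/NPV from sensitivity, specificity, and prevalence (Bayes' rule)
        sens, spec, prev = 0.90, 0.70, 0.23

        ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
        npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)

        print(f"PPV = {ppv:.0%}, NPV = {npv:.0%}")  # PPV = 47%, NPV = 96%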

  2. MATILDA Version 2: Rough Earth TIALD Model for Laser Probabilistic Risk Assessment in Hilly Terrain - Part I

    Science.gov (United States)

    2017-03-13

    AFRL-RH-FS-TR-2017-0009. MATILDA Version-2: Rough Earth TIALD Model for Laser Probabilistic Risk Assessment in Hilly Terrain – Part I. Paul K... Distribution A: Approved for public release; distribution unlimited. PA Case No: TSRL-PA-2017-0169.

  3. A Functional Version of the ARCH Model

    CERN Document Server

    Hormann, Siegfried; Reeder, Ron

    2011-01-01

    Improvements in data acquisition and processing techniques have led to an almost continuous flow of information for financial data. High-resolution tick data are available and can be quite conveniently described by a continuous-time process. It is therefore natural to ask for possible extensions of financial time series models to a functional setup. In this paper we propose a functional version of the popular ARCH model. We establish conditions for the existence of a strictly stationary solution, derive weak dependence and moment conditions, show consistency of the estimators, and perform a small empirical study demonstrating how our model matches real data.
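
    For orientation, the scalar ARCH(1) recursion underlying the functional extension can be simulated in a few lines (a sketch: the functional version replaces these scalars with curves, and the parameter values here are invented):

        import random

        def simulate_arch1(n=10_000, omega=0.1, alpha=0.5, rng=random.Random(0)):
            """Scalar ARCH(1): y_t = sigma_t * eps_t with sigma_t^2 = omega + alpha * y_{t-1}^2."""
            ys = [0.0]
            for _ in range(n):
                sigma2 = omega + alpha * ys[-1] ** 2
                ys.append(sigma2 ** 0.5 * rng.gauss(0.0, 1.0))
            return ys[1:]

        series = simulate_arch1()
        print(sum(y * y for y in series) / len(series))  # near omega / (1 - alpha) = 0.2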

  4. A version management model of PDM system and its realization

    Institute of Scientific and Technical Information of China (English)

    ZHONG Shi-sheng; LI Tao

    2008-01-01

    Based on the key function of version management in PDM systems, this paper discusses the function and realization of version management and the transitions of version states with a workflow. A directed acyclic graph is used to describe the version model. Three storage modes of the directed acyclic graph version model (in the database, the bumping block, and the PDM working memory) are presented, and the conversion principle of these three modes is given. The study indicates that building a dynamic product structure configuration model based on versions is the key to resolving the problem. Thus a version model of a single product object is built. Then the version management model in product structure configuration is built, and the application of version management in a PDM system is presented as a case.
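
    A minimal sketch of the directed-acyclic-graph version model described above (data structure only; class and method names are invented for illustration):

        class VersionNode:
            """A node in the version DAG; 'parents' are the versions it derives from."""
            def __init__(self, version_id, state="working", parents=()):
                self.version_id = version_id
                self.state = state          # e.g. working / released / frozen
                self.parents = list(parents)

        class VersionGraph:
            def __init__(self):
                self.nodes = {}

            def add_version(self, version_id, parents=(), state="working"):
                node = VersionNode(version_id, state, (self.nodes[p] for p in parents))
                self.nodes[version_id] = node
                return node

            def ancestors(self, version_id):
                """All versions a given version derives from (DAG traversal)."""
                seen, stack = set(), [self.nodes[version_id]]
                while stack:
                    for p in stack.pop().parents:
                        if p.version_id not in seen:
                            seen.add(p.version_id)
                            stack.append(p)
                return seen

        g = VersionGraph()
        g.add_version("A")
        g.add_version("B", parents=["A"])
        g.add_version("C", parents=["A"])
        g.add_version("D", parents=["B", "C"])   # merge of two branches
        print(g.ancestors("D"))                  # {'A', 'B', 'C'}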

  5. Forsmark - site descriptive model version 0

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-10-01

    During 2002, the Swedish Nuclear Fuel and Waste Management Company (SKB) is starting investigations at two potential sites for a deep repository in the Precambrian basement of the Fennoscandian Shield. The present report concerns one of those sites, Forsmark, which lies in the municipality of Oesthammar, on the east coast of Sweden, about 150 kilometres north of Stockholm. The site description should present all collected data and interpreted parameters of importance for the overall scientific understanding of the site, for the technical design and environmental impact assessment of the deep repository, and for the assessment of long-term safety. The site description will have two main components: a written synthesis of the site, summarising the current state of knowledge, as documented in the databases containing the primary data from the site investigations, and one or several site descriptive models, in which the collected information is interpreted and presented in a form which can be used in numerical models for rock engineering, environmental impact and long-term safety assessments. The site descriptive models are devised and stepwise updated as the site investigations proceed. The point of departure for this process is the regional site descriptive model, version 0, which is the subject of the present report. Version 0 is developed out of the information available at the start of the site investigation. This information, with the exception of data from tunnels and drill holes at the sites of the Forsmark nuclear reactors and the underground low-middle active radioactive waste storage facility, SFR, is mainly 2D in nature (surface data), and is general and regional, rather than site-specific, in content. For this reason, the Forsmark site descriptive model, version 0, as detailed in the present report, has been developed at a regional scale. It covers a rectangular area, 15 km in a southwest-northeast and 11 km in a northwest-southeast direction, around the

  6. Simpevarp - site descriptive model version 0

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-11-01

    During 2002, SKB is starting detailed investigations at two potential sites for a deep repository in the Precambrian rocks of the Fennoscandian Shield. The present report concerns one of those sites, Simpevarp, which lies in the municipality of Oskarshamn, on the southeast coast of Sweden, about 250 kilometres south of Stockholm. The site description will have two main components: a written synthesis of the site, summarising the current state of knowledge, as documented in the databases containing the primary data from the site investigations, and one or several site descriptive models, in which the collected information is interpreted and presented in a form which can be used in numerical models for rock engineering, environmental impact and long-term safety assessments. SKB maintains two main databases at the present time, a site characterisation database called SICADA and a geographic information system called SKB GIS. The site descriptive model will be developed and presented with the aid of the SKB GIS capabilities, and with SKBs Rock Visualisation System (RVS), which is also linked to SICADA. The version 0 model forms an important framework for subsequent model versions, which are developed successively, as new information from the site investigations becomes available. Version 0 is developed out of the information available at the start of the site investigation. In the case of Simpevarp, this is essentially the information which was compiled for the Oskarshamn feasibility study, which led to the choice of that area as a favourable object for further study, together with information collected since its completion. This information, with the exception of the extensive data base from the nearby Aespoe Hard Rock Laboratory, is mainly 2D in nature (surface data), and is general and regional, rather than site-specific, in content. Against this background, the present report consists of the following components: an overview of the present content of the databases

  7. Meson Properties in a renormalizable version of the NJL model

    CERN Document Server

    Mota, Andre L.; Hiller, Brigitte; Walliser, Hans

    1999-01-01

    In the present paper we implement a non-trivial and renormalizable extension of the NJL model. We discuss the advantages and shortcomings of this extended model compared to a usual effective Pauli-Villars regularized version. We show that both versions become equivalent in the case of a large cutoff. Various relevant mesonic observables are calculated and compared.

  8. Land-Use Portfolio Modeler, Version 1.0

    Science.gov (United States)

    Taketa, Richard; Hong, Makiko

    2010-01-01

    …return-on-investment. The portfolio model, now known as the Land-Use Portfolio Model (LUPM), provided the framework for the development of the Land-Use Portfolio Modeler, Version 1.0 software (LUPM v1.0). The software provides a geographic information system (GIS)-based modeling tool for evaluating alternative risk-reduction mitigation strategies for specific natural-hazard events. The modeler uses information about a specific natural-hazard event and the features exposed to that event within the targeted study region to derive a measure of a given mitigation strategy's effectiveness. Harnessing the spatial capabilities of a GIS enables the tool to provide a rich, interactive mapping environment in which users can create, analyze, visualize, and compare different

  9. Industrial Waste Management Evaluation Model Version 3.1

    Science.gov (United States)

    IWEM is a screening-level ground water model designed to simulate contaminant fate and transport. IWEM v3.1 is the latest version of the IWEM software, which includes additional tools to evaluate the beneficial use of industrial materials.

  10. GCFM Users Guide Revision for Model Version 5.0

    Energy Technology Data Exchange (ETDEWEB)

    Keimig, Mark A.; Blake, Coleman

    1981-08-10

    This paper documents alterations made to the MITRE/DOE Geothermal Cash Flow Model (GCFM) in the period of September 1980 through September 1981. Version 4.0 of GCFM was installed on the computer at the DOE San Francisco Operations Office in August 1980. This version has also been distributed to about a dozen geothermal industry firms for examination and potential use. During late 1980 and 1981, a few errors detected in the Version 4.0 code were corrected, resulting in Version 4.1. If you are currently using GCFM Version 4.0, it is suggested that you make the changes to your code that are described in Section 2.0. User's manual changes listed in Section 3.0 and Section 4.0 should then also be made.

  11. Model Adequacy Analysis of Matching Record Versions in NoSQL Databases

    Directory of Open Access Journals (Sweden)

    E. V. Tsviashchenko

    2015-01-01

    Full Text Available The article investigates a model of matching record versions. The goal of this work is to analyse the model's adequacy. The model allows estimating the distribution of a user's processing time for record versions and the distribution of the record version count. The second variant of the model was used, according to which the time for a client to process record versions depends explicitly on the number of updates performed by the other users between the sequential updates performed by the current client. In order to prove the model's adequacy, a real experiment was conducted in a cloud cluster. The cluster contains 10 virtual nodes, provided by the DigitalOcean company. Ubuntu Server 14.04 was used as the operating system (OS). The NoSQL system Riak was chosen for the experiments. Riak versions 2.0 and later provide the "dotted version vectors" (DVV) option, which is an extension of the classic vector clock. Their use guarantees that the number of versions simultaneously stored in the DB will not exceed the number of clients operating in parallel on a record. This is very important while conducting experiments. For developing the application, the Java library provided by Riak was used. The processes run directly on the nodes. In the experiment two records were used: Z, the record whose versions are handled by clients, and RZ, a service record, which contains record update counters. The application algorithm can be briefly described as follows: every client reads the versions of record Z, processes its updates using the RZ record counters, and saves the treated record in the database while old versions are deleted from the DB. Then, the client rereads the RZ record and increments the update counters for the other clients. After that, the client rereads the Z record, saves the necessary statistics, and processes the results. In the case of a conflict emerging because of simultaneous updates of the RZ record, the client obtains all versions of that
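
    The quantity driving the model, the number of updates by other clients between a client's consecutive writes, is easy to simulate; a minimal sketch with invented parameters, independent of Riak:

        import random
        from collections import defaultdict

        def simulate_foreign_updates(n_clients=5, n_rounds=10_000, seed=1):
            """For each write by a client, count how many updates the other
            clients performed since that client's previous write."""
            rng = random.Random(seed)
            last_seen = {c: 0 for c in range(n_clients)}  # global count at each client's last write
            total_updates = 0
            hist = defaultdict(int)                       # histogram of foreign-update counts
            for _ in range(n_rounds):
                c = rng.randrange(n_clients)              # a random client writes next
                hist[total_updates - last_seen[c]] += 1   # updates by others in between
                total_updates += 1
                last_seen[c] = total_updates
            return hist

        hist = simulate_foreign_updates()
        total = sum(hist.values())
        for k in sorted(hist):
            print(k, hist[k] / total)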

  12. The New York PTSD Risk Score for Assessment of Psychological Trauma: Male and Female Versions

    Science.gov (United States)

    Boscarino, Joseph A.; Kirchner, H. Lester; Hoffman, Stuart N.; Sartorius, Jennifer; Adams, Richard E.; Figley, Charles R.

    2012-01-01

    We previously developed a new posttraumatic stress disorder (PTSD) screening instrument – the New York PTSD Risk Score (NYPRS). Since research suggests different PTSD risk factors and outcomes for men and women, in the current study we assessed the suitability of male and female versions of this screening instrument among 3,298 adults exposed to traumatic events. Using diagnostic test methods, including receiver operating characteristic (ROC) curve and bootstrap techniques, we examined different prediction domains, including core PTSD symptoms, trauma exposures, sleep disturbances, depression symptoms, and other measures to assess PTSD prediction models for men and women. While the original NYPRS worked well in predicting PTSD, significant interaction was detected by gender, suggesting that separate models are warranted for men and women. Model comparisons suggested that while the overall results appeared robust, prediction results differed by gender. For example, for women, core PTSD symptoms contributed more to the prediction score than for men. For men, depression symptoms, sleep disturbance, and trauma exposure contributed more to the prediction score. Men also had higher cut-off scores for PTSD compared to women. There were other gender-specific differences as well. The NYPRS is a screener that appears to be effective in predicting PTSD status among at-risk populations. However, consistent with other medical research, this instrument appears to require male and female versions to be the most effective. PMID:22648009

  13. Solar Advisor Model User Guide for Version 2.0

    Energy Technology Data Exchange (ETDEWEB)

    Gilman, P.; Blair, N.; Mehos, M.; Christensen, C.; Janzou, S.; Cameron, C.

    2008-08-01

    The Solar Advisor Model (SAM) provides a consistent framework for analyzing and comparing power system costs and performance across the range of solar technologies and markets, from photovoltaic systems for residential and commercial markets to concentrating solar power and large photovoltaic systems for utility markets. This manual describes Version 2.0 of the software, which can model photovoltaic and concentrating solar power technologies for electric applications for several markets. The current version of the Solar Advisor Model does not model solar heating and lighting technologies.

  14. METAPHOR (version 1): Users guide. [performability modeling]

    Science.gov (United States)

    Furchtgott, D. G.

    1979-01-01

    General information concerning METAPHOR, an interactive software package to facilitate performability modeling and evaluation, is presented. Example systems are studied and their performabilities are calculated. Each available METAPHOR command and array generator is described. Complete METAPHOR sessions are included.

  15. Renormalized versions of the massless Thirring model

    CERN Document Server

    Casana, R

    2003-01-01

    We present a non-perturbative study of the (1+1)-dimensional massless Thirring model using path integral methods. The model is treated in two versions: one possesses a local gauge symmetry that is implemented at the quantum level, and the other lacks this symmetry. We make a detailed analysis of their UV divergence structure, and a non-perturbative regularization and renormalization procedure is proposed.
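
    For reference, the massless Thirring model referred to here is defined by the standard current-current self-interaction Lagrangian (textbook form, not quoted from this record):

        \mathcal{L} = \bar{\psi}\, i\gamma^{\mu}\partial_{\mu}\psi
                      - \frac{g}{2}\left(\bar{\psi}\gamma^{\mu}\psi\right)\left(\bar{\psi}\gamma_{\mu}\psi\right)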

  16. An Open Platform for Processing IFC Model Versions

    Institute of Scientific and Technical Information of China (English)

    Mohamed Nour; Karl Beucke

    2008-01-01

    The IFC initiative from the International Alliance for Interoperability has been developing since the mid-nineties through several versions. This paper addresses the problem of binding the growing number of IFC versions and their EXPRESS definitions to programming environments (Java and .NET). The solution developed in this paper automates the process of generating early-binding classes whenever a new version of the IFC model is released. Furthermore, a runtime instantiation of the generated early-binding classes takes place by importing IFC STEP ISO 10303-P21 models. The user can navigate the IFC STEP model with reference to the defining EXPRESS schema, and can modify, delete, and create new instances. These functionalities are considered to be a basis for any IFC-based implementation. The platform enables researchers to experiment with the IFC model independently of any software application.

  17. Correction, improvement and model verification of CARE 3, version 3

    Science.gov (United States)

    Rose, D. M.; Manke, J. W.; Altschul, R. E.; Nelson, D. L.

    1987-01-01

    An independent verification of the CARE 3 mathematical model and computer code was conducted and reported in NASA Contractor Report 166096, Review and Verification of CARE 3 Mathematical Model and Code: Interim Report. The study uncovered some implementation errors that were corrected and are reported in this document. The corrected CARE 3 program is called version 4. The document Correction, Improvement, and Model Verification of CARE 3, Version 3 was thus written in April 1984. It is being published now, as it has been determined to contain a more accurate representation of CARE 3 than the preceding document of April 1983. This edition supersedes NASA-CR-166122, 'Correction and Improvement of CARE 3, Version 3,' April 1983.

  18. Smart Grid Interoperability Maturity Model Beta Version

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Drummond, R.; Giroti, Tony; Houseman, Doug; Knight, Mark; Levinson, Alex; Longcore, Wayne; Lowe, Randy; Mater, J.; Oliver, Terry V.; Slack, Phil; Tolk, Andreas; Montgomery, Austin

    2011-12-02

    The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.

  1. Towards Validating Risk Indicators Based on Measurement Theory (Extended version)

    NARCIS (Netherlands)

    Morali, Ayse; Wieringa, Roel

    2010-01-01

    Due to the lack of quantitative information and for cost-efficiency, most risk assessment methods use partially ordered values (e.g. high, medium, low) as risk indicators. In practice it is common to validate risk indicators by asking stakeholders whether they make sense. This way of validation is s

  2. AISIM (Automated Interactive Simulation Modeling System) VAX Version Training Manual.

    Science.gov (United States)

    1985-02-01

    AD-A161 436. AISIM (Automated Interactive Simulation Modeling System) VAX Version Training Manual. Hughes Aircraft Co., Fullerton, CA, Ground Systems Group. This document is the training manual for the Automated Interactive Simulation Modeling System (AISIM), VAX version. Appendix B: Simulation Report for Working Example.

  3. IDC Use Case Model Survey Version 1.1.

    Energy Technology Data Exchange (ETDEWEB)

    Harris, James Mark [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Carr, Dorthe B. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    This document contains the brief descriptions for the actors and use cases contained in the IDC Use Case Model.
    Revisions:
    V1.0 (12/2014) - SNL IDC Reengineering Project Team - Initial delivery - Authorized by M. Harris
    V1.1 (2/2015) - SNL IDC Reengineering Project Team - Iteration I2 Review Comments - Authorized by M. Harris

  4. IDC Use Case Model Survey Version 1.0.

    Energy Technology Data Exchange (ETDEWEB)

    Carr, Dorthe B.; Harris, James M.

    2014-12-01

    This document contains the brief descriptions for the actors and use cases contained in the IDC Use Case Model Survey.
    Revisions:
    V1.0 (12/2014) - IDC Reengineering Project Team - Initial delivery - Authorized by M. Harris

  5. Racial Differences in Risk-Taking Propensity on the Youth Version of the Balloon Analogue Risk Task (BART-Y)

    Science.gov (United States)

    Collado, Anahi; Risco, Cristina M.; Banducci, Anne N.; Chen, Kevin W.; MacPherson, Laura; Lejuez, Carl W.

    2017-01-01

    Research indicates that White adolescents tend to engage in greater levels of risk behavior relative to Black adolescents. To better understand these differences, the current study examined real-time changes in risk-taking propensity (RTP). The study utilized the Balloon Analogue Risk Task-Youth Version (BART-Y), a well-validated real-time,…

  6. COPAT - towards a recommended model version of COSMO-CLM

    Science.gov (United States)

    Anders, Ivonne; Brienen, Susanne; Bucchignani, Eduardo; Ferrone, Andrew; Geyer, Beate; Keuler, Klaus; Lüthi, Daniel; Mertens, Mariano; Panitz, Hans-Jürgen; Saeed, Sajjad; Schulz, Jan-Peter; Wouters, Hendrik

    2016-04-01

    The regional climate model COSMO-CLM is a community model (www.clm-community.com). In close collaboration with the COSMO consortium, the model is further developed by the community members for climate applications. One of the tasks of the community is to give a recommendation on the model version and to evaluate the model's performance. COPAT (Coordinated Parameter Testing) is a voluntary community effort that allows different institutions to carry out model simulations systematically in order to test new model options and to find a satisfactory model setup for hydrostatic climate simulations over Europe. We will present the COPAT method used to arrive at the latest recommended model version of COSMO-CLM (COSMO5.0_clm6). The simulations cover the EURO-CORDEX domain at two spatial resolutions, 0.44° and 0.11°, and use ERA-Interim forcing data for the period 1979-2000. Interpolated forcing data were prepared once to ensure that all participating groups used identical forcing. The evaluation of each individual run was performed for the period 1981-2000 using ETOOL and ETOOL-VIS, tools developed within the community to evaluate standard COSMO-CLM output against observations provided by E-OBS and CRU. COPAT was structured in three phases. In Phase 1, all participating institutions performed a reference run on their individual computing platforms and afterwards tested the influence of single model options on the results. Based on the results of Phase 1, the most promising options were used in combination in Phase 2. These first two phases of COPAT comprise more than 100 simulations with a spatial resolution of 0.44°. Based on the best setup identified in Phase 2, a calibration of eight tuning parameters was carried out in Phase 3, following Bellprat et al. (2012). A final simulation with the calibrated parameters has been set up at a higher resolution of 0.11°. The

  7. A Taxonomy of Operational Cyber Security Risks Version 2

    Science.gov (United States)

    2014-05-01

    Figure 1: Relationships Among Assets, Business Processes, and… The taxonomy draws upon the definition of operational risk adopted by the banking sector in the Basel II framework [BIS 2006]. Within the cyber security space

  8. ONKALO rock mechanics model (RMM). Version 2.3

    Energy Technology Data Exchange (ETDEWEB)

    Haekkinen, T.; Merjama, S.; Moenkkoenen, H. [WSP Finland, Helsinki (Finland)

    2014-07-15

    The Rock Mechanics Model of the ONKALO rock volume includes the most important rock mechanics features and parameters at the Olkiluoto site. The main objective of the model is to be a tool to predict rock properties, rock quality and hence provide an estimate for the rock stability of the potential repository at Olkiluoto. The model includes a database of rock mechanics raw data and a block model in which the rock mechanics parameters are estimated through block volumes based on spatial rock mechanics raw data. In this version 2.3, special emphasis was placed on refining the estimation of the block model. The model was divided into rock mechanics domains which were used as constraints during the block model estimation. During the modelling process, a display profile and toolbar were developed for the GEOVIA Surpac software to improve visualisation and access to the rock mechanics data for the Olkiluoto area. (orig.)

  9. Model Versions and Fast Algorithms for Network Epidemiology

    Institute of Scientific and Technical Information of China (English)

    Petter Holme

    2014-01-01

    Network epidemiology has become a core framework for investigating the role of human contact patterns in the spreading of infectious diseases. In network epidemiology, one represents the contact structure as a network of nodes (individuals) connected by links (sometimes as a temporal network where the links are not continuously active) and the disease as a compartmental model (where individuals are assigned states with respect to the disease and follow certain transition rules between the states). In this paper, we discuss fast algorithms for such simulations and also compare two commonly used versions: one where there is a constant recovery rate (the number of individuals that stop being infectious per unit time is proportional to the number of such people); the other where the duration of the disease is constant. The results show that, for most practical purposes, these versions are qualitatively the same.
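
    The two recovery variants compared above differ in only one line of a simulation; a rough discrete-time sketch on a toy network (illustrative parameters and network, not the paper's code):

        import random

        def sir(neighbors, seed_node, beta=0.3, mu=0.2, fixed_duration=None,
                rng=random.Random(0)):
            """Discrete-time SIR on a contact network given as {node: [neighbours]}.
            fixed_duration=None -> recover with probability mu per step (constant rate);
            fixed_duration=d    -> recover exactly d steps after infection."""
            infected = {seed_node: 0}          # node -> time of infection
            recovered = set()
            t = 0
            while infected:
                t += 1
                new = {}
                for node in infected:          # each infective contacts its neighbours
                    for nb in neighbors[node]:
                        if nb not in infected and nb not in recovered and nb not in new:
                            if rng.random() < beta:
                                new[nb] = t
                for node, t_inf in list(infected.items()):
                    done = (t - t_inf >= fixed_duration) if fixed_duration else (rng.random() < mu)
                    if done:
                        del infected[node]
                        recovered.add(node)
                infected.update(new)
            return len(recovered)              # final outbreak size

        n = 100                                # toy ring network with shortcuts
        net = {i: [(i - 1) % n, (i + 1) % n, (i * 7 + 1) % n] for i in range(n)}
        print(sir(net, seed_node=0))                    # constant recovery rate
        print(sir(net, seed_node=0, fixed_duration=5))  # constant disease duration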

  10. Psychometric Properties of the Persian Version of the Youth Risk Behavior Survey Questionnaire

    OpenAIRE

    A. Baheiraei; Hamzehgardeshi, Z; M.R. Mohammadi; Nedjat, S; Mohammadi, E.

    2012-01-01

    Background Adolescents may get involved in high-risk behaviors. Surveys are the primary, and sometimes the sole source of data collection for many high-risk health behaviours. We examined the reliability and validity of the psychometric properties of the self-administered Persian version of the 2009 Youth Risk Behavior Surveillance System (YRBSS) questionnaire. Methods In a methodological study in summer 2010, 100 Iranian adolescents aged 15-18 years were recruited through convenience samplin...

  11. NASA Risk Management Handbook. Version 1.0

    Science.gov (United States)

    Dezfuli, Homayoon; Benjamin, Allan; Everett, Christopher; Maggio, Gaspare; Stamatelatos, Michael; Youngblood, Robert; Guarro, Sergio; Rutledge, Peter; Sherrard, James; Smith, Curtis; Williams, Rodney

    2011-01-01

    The purpose of this handbook is to provide guidance for implementing the Risk Management (RM) requirements of NASA Procedural Requirements (NPR) document NPR 8000.4A, Agency Risk Management Procedural Requirements [1], with a specific focus on programs and projects, and applying to each level of the NASA organizational hierarchy as requirements flow down. This handbook supports RM application within the NASA systems engineering process, and is a complement to the guidance contained in NASA/SP-2007-6105, NASA Systems Engineering Handbook [2]. Specifically, this handbook provides guidance that is applicable to the common technical processes of Technical Risk Management and Decision Analysis established by NPR 7123.1A, NASA Systems Engineering Processes and Requirements [3]. These processes are part of the "Systems Engineering Engine" (Figure 1) that is used to drive the development of the system and associated work products to satisfy stakeholder expectations in all mission execution domains, including safety, technical, cost, and schedule. Like NPR 7123.1A, NPR 8000.4A is a discipline-oriented NPR that intersects with product-oriented NPRs such as NPR 7120.5D, NASA Space Flight Program and Project Management Requirements [4]; NPR 7120.7, NASA Information Technology and Institutional Infrastructure Program and Project Management Requirements [5]; and NPR 7120.8, NASA Research and Technology Program and Project Management Requirements [6]. In much the same way that the NASA Systems Engineering Handbook is intended to provide guidance on the implementation of NPR 7123.1A, this handbook is intended to provide guidance on the implementation of NPR 8000.4A. This handbook provides guidance for conducting RM in the context of NASA program and project life cycles, which produce derived requirements in accordance with existing systems engineering practices that flow down through the NASA organizational hierarchy. The guidance in this handbook is not meant

  12. H2A Production Model, Version 2 User Guide

    Energy Technology Data Exchange (ETDEWEB)

    Steward, D.; Ramsden, T.; Zuboy, J.

    2008-09-01

    The H2A Production Model analyzes the technical and economic aspects of central and forecourt hydrogen production technologies. Using a standard discounted cash flow rate of return methodology, it determines the minimum hydrogen selling price, including a specified after-tax internal rate of return from the production technology. Users have the option of accepting default technology input values--such as capital costs, operating costs, and capacity factor--from established H2A production technology cases or entering custom values. Users can also modify the model's financial inputs. This new version of the H2A Production Model features enhanced usability and functionality. Input fields are consolidated and simplified. New capabilities include performing sensitivity analyses and scaling analyses to various plant sizes. This User Guide helps users already familiar with the basic tenets of H2A hydrogen production cost analysis get started using the new version of the model. It introduces the basic elements of the model then describes the function and use of each of its worksheets.
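
    The core calculation described, searching for the minimum hydrogen selling price that achieves a specified after-tax IRR, can be sketched as follows (all parameter values are invented placeholders, not H2A defaults, and tax detail is omitted):

        def npv(price, rate, capex=1e8, opex=5e6, kg_per_year=1e7, years=20):
            """Discounted cash flow: up-front capital, then revenue minus operating cost."""
            flows = [-capex] + [price * kg_per_year - opex] * years
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

        def min_selling_price(target_irr=0.10, lo=0.0, hi=100.0, tol=1e-6):
            """Bisect for the hydrogen price ($/kg) that makes NPV zero at the target IRR."""
            while hi - lo > tol:
                mid = (lo + hi) / 2
                if npv(mid, target_irr) < 0:
                    lo = mid
                else:
                    hi = mid
            return (lo + hi) / 2

        print(f"minimum selling price: ${min_selling_price():.2f}/kg")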

  13. Reliability and validity of the Persian (Farsi) version of the Risk Perception Survey-Diabetes Mellitus.

    Science.gov (United States)

    Soltanipour, S; Heidarzadeh, A; Jafarinezhad, A

    2014-04-03

    Knowledge of patients' risk perceptions is essential for the management of chronic diseases. This study aimed to assess the reliability and validity of a Persian (Farsi) language translation of the Risk Perception Survey-Diabetes Mellitus (RPS-DM). After forward-backward translation, the RPS-DM was randomly administered to 106 adult patients with diabetes who were enrolled in a teaching referral clinic in the north of the Islamic Republic of Iran (Rasht). Internal consistency and exploratory factor analysis were applied. The minimum value for internal consistency was 0.50 for risk knowledge and the highest value was 0.88 on the optimistic bias subscale. Principal component analysis showed that the items of the composite risk score matched with the same items in the English language version, except for question numbers 16, 24 and 25. The Persian version of the RPS-DM is the first standardized tool for measuring risk perception and knowledge about diabetes complications in the Islamic Republic of Iran.
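
    Internal consistency figures like the 0.50 and 0.88 quoted above are typically Cronbach's alpha; a minimal sketch of that computation (illustrative data, not the study's):

        import numpy as np

        def cronbach_alpha(items):
            """items: 2-D array, rows = respondents, columns = scale items."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]                          # number of items
            item_vars = items.var(axis=0, ddof=1)       # variance of each item
            total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
            return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

        # five respondents answering a three-item subscale (made-up data)
        scores = [[3, 4, 3], [2, 2, 3], [4, 5, 4], [1, 2, 1], [3, 3, 4]]
        print(round(cronbach_alpha(scores), 2))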

  14. Stochastic hyperfine interactions modeling library-Version 2

    Science.gov (United States)

    Zacate, Matthew O.; Evenson, William E.

    2016-02-01

    The stochastic hyperfine interactions modeling library (SHIML) provides a set of routines to assist in the development and application of stochastic models of hyperfine interactions. The library provides routines written in the C programming language that (1) read a text description of a model for fluctuating hyperfine fields, (2) set up the Blume matrix, upon which the evolution operator of the system depends, and (3) find the eigenvalues and eigenvectors of the Blume matrix so that theoretical spectra of experimental techniques that measure hyperfine interactions can be calculated. The optimized vector and matrix operations of the BLAS and LAPACK libraries are utilized. The original version of SHIML constructed and solved Blume matrices for methods that measure hyperfine interactions of nuclear probes in a single spin state. Version 2 provides additional support for methods that measure interactions on two different spin states such as Mössbauer spectroscopy and nuclear resonant scattering of synchrotron radiation. Example codes are provided to illustrate the use of SHIML to (1) generate perturbed angular correlation spectra for the special case of polycrystalline samples when anisotropy terms of higher order than A22 can be neglected and (2) generate Mössbauer spectra for polycrystalline samples for pure dipole or pure quadrupole transitions.
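
    The role of the Blume matrix can be illustrated with a toy two-state fluctuation model (a generic sketch of stochastic line-shape theory, not SHIML's actual C API): a probe precesses at frequency +omega or -omega while its environment hops between the two states at rate w, and the relaxation function follows from the eigenvalues and eigenvectors of one small matrix.

        import numpy as np

        # Toy Blume-type matrix: i * (precession frequency) on the diagonal,
        # plus a rate matrix for hopping between the two environments.
        omega, w = 2 * np.pi * 1.0, 0.5                  # frequency and hop rate (arbitrary units)
        B = np.array([[1j * omega - w, w],
                      [w, -1j * omega - w]])

        evals, evecs = np.linalg.eig(B)

        # Relaxation function G(t) = ones^T exp(B t) p0, p0 = equal initial occupation
        p0 = np.array([0.5, 0.5])
        coeff = (np.ones(2) @ evecs) * np.linalg.solve(evecs, p0)
        for t in (0.0, 0.5, 1.0):
            print(t, (coeff * np.exp(evals * t)).sum().real)   # G(0) = 1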

  15. Risk Modelling and Management: An Overview

    NARCIS (Netherlands)

    C-L. Chang (Chia-Lin); D.E. Allen (David); M.J. McAleer (Michael); T. Pérez-Amaral (Teodosio)

    2013-01-01

    textabstractThe papers in this special issue of Mathematics and Computers in Simulation are substantially revised versions of the papers that were presented at the 2011 Madrid International Conference on “Risk Modelling and Management” (RMM2011). The papers cover the following topics: currency

  16. The Lagrangian particle dispersion model FLEXPART version 10

    Science.gov (United States)

    Pisso, Ignacio; Sollum, Espen; Grythe, Henrik; Kristiansen, Nina; Cassiani, Massimo; Eckhardt, Sabine; Thompson, Rona; Groot Zwaaftink, Christine; Evangeliou, Nikolaos; Hamburger, Thomas; Sodemann, Harald; Haimberger, Leopold; Henne, Stephan; Brunner, Dominik; Burkhart, John; Fouilloux, Anne; Fang, Xuekun; Philipp, Anne; Seibert, Petra; Stohl, Andreas

    2017-04-01

    The Lagrangian particle dispersion model FLEXPART, in its first release in 1998, was designed for calculating the long-range and mesoscale dispersion of air pollutants from point sources, such as after an accident in a nuclear power plant. The model has since evolved into a comprehensive tool for atmospheric transport modelling and analysis. Its application fields have been extended to a range of atmospheric transport processes for both atmospheric gases and aerosols, e.g. greenhouse gases, short-lived climate forcers like black carbon, volcanic ash and gases, as well as studies of the water cycle. We present the newest release, FLEXPART version 10. Since the last publication fully describing FLEXPART (version 6.2), the model code has been parallelised in order to speed up computation. A new, more detailed gravitational settling parametrisation for aerosols was implemented, and the wet deposition scheme for aerosols has been heavily modified and updated to provide a more accurate representation of this physical process. In addition, an optional new turbulence scheme for the convective boundary layer is available that considers the skewness in the vertical velocity distribution. Temporal variation and temperature dependence of the OH reaction are also included. Finally, user input files have been updated to a more convenient and user-friendly namelist format, and the option to produce the output files in netCDF format instead of binary format has been implemented. We present these new developments and show recent model applications. Moreover, we also introduce some tools for the preparation of the meteorological input data, as well as for the processing of FLEXPART output data.

  17. Software Engineering Designs for Super-Modeling Different Versions of CESM Models using DART

    Science.gov (United States)

    Kluzek, Erik; Duane, Gregory; Tribbia, Joe; Vertenstein, Mariana

    2014-05-01

    The super-modeling approach connects different models together at run time in order to provide run time feedbacks between the models and thus synchronize the models. This method reduces model bias further than after-the-fact averaging of model outputs. We explore different designs to connect different configurations and versions of an IPCC class climate model - the Community Earth System Model (CESM). We build on the Data Assimilation Research Test-bed (DART) software to provide data assimilation from truth as well as to provide a software framework to link different model configurations together. We show a system building on DART that uses a Python script to do simple nudging between three versions of the atmosphere model in CESM (the Community Atmosphere Model (CAM) versions three, four and five).
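
    The "simple nudging" between model versions can be sketched in a few lines (invented coefficients and state vectors; not the project's script): each model state is relaxed toward the multi-model consensus every coupling interval.

        import numpy as np

        def nudge(states, strength=0.1):
            """Pull each model's state vector toward the ensemble mean.
            states: (n_models, n_state) array; strength: nudging coefficient."""
            consensus = states.mean(axis=0)
            return states + strength * (consensus - states)

        # three "CAM versions" with slightly different states (toy numbers)
        states = np.array([[1.0, 2.0], [1.2, 1.8], [0.9, 2.3]])
        for step in range(3):
            states = nudge(states)
            print(step, states.round(3))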

  18. 19-vertex version of the fully frustrated XY model

    Science.gov (United States)

    Knops, Yolanda M. M.; Nienhuis, Bernard; Knops, Hubert J. F.; Blöte, Henk W. J.

    1994-07-01

    We investigate a 19-vertex version of the two-dimensional fully frustrated XY (FFXY) model. We construct Yang-Baxter equations for this model and show that there is no solution. Therefore we have chosen a numerical approach based on the transfer matrix. The results show that a coupled XY-Ising model is in the same universality class as the FFXY model. We find that the phase coupling over an Ising wall is irrelevant at criticality. This leads to a correction of earlier determinations of the dimension x_{h,Is} of the Ising disorder operator. We find x_{h,Is} = 0.123(5) and a conformal anomaly c = 1.55(5). These results are consistent with the hypothesis that the FFXY model behaves as a superposition of an Ising model and an XY model. However, the dimensions associated with the energy, x_t = 0.77(3), and with the XY magnetization, x_{h,XY} ≈ 0.17, refute this hypothesis.

  1. Assessing risk propensity in American soldiers: preliminary reliability and validity of the Evaluation of Risks (EVAR) scale--English version.

    Science.gov (United States)

    Killgore, William D S; Vo, Alexander H; Castro, Carl A; Hoge, Charles W

    2006-03-01

    Risk-taking propensity is a critical component of judgment and decision-making in military operations. The Evaluation of Risks scale (EVAR) was recently developed to measure state and trait aspects of risk proneness. The scale, however, was psychometrically normed in French and no data are available for the English translation. We administered the English version of the EVAR to 165 U.S. soldiers to obtain reliability, validity, and normative data for English-speaking respondents. Confirmatory factor analysis suggested that the factor structure of the English EVAR differs from that obtained in the French studies. Instead, a three-factor solution, including recklessness/impulsivity, self-confidence, and need for control, emerged. Internal consistency was comparable to the French version. EVAR scores correlated with age, military rank, and years of service, and discriminated soldiers with histories of high-risk behavior. The data support the reliability and validity of the English version of the EVAR for evaluating risk propensity in U.S. soldiers.

  2. Validating the Hamilton Anatomy of Risk Management-Forensic Version and the Aggressive Incidents Scale.

    Science.gov (United States)

    Cook, Alana N; Moulden, Heather M; Mamak, Mini; Lalani, Shams; Messina, Katrina; Chaimowitz, Gary

    2016-07-14

    The Hamilton Anatomy of Risk Management-Forensic Version (HARM-FV) is a structured professional judgement tool for violence risk developed for use in forensic inpatient psychiatric settings. The HARM-FV is used with the Aggressive Incidents Scale (AIS), which provides a standardized method of recording aggressive incidents. We report the findings of the concurrent validity of the HARM-FV and the AIS with widely used measures of violence risk and aggressive acts, the Historical, Clinical, Risk Management-20, Version 3 (HCR-20(V3)) and a modified version of the Overt Aggression Scale. We also present findings on the predictive validity of the HARM-FV in the short term (1-month follow-up periods) for varying severities of aggressive acts. The results indicated strong support for the concurrent validity of the HARM-FV and AIS and promising support for the predictive accuracy of the tool for inpatient aggression. This article provides support for the continued clinical use of the HARM-FV within an inpatient forensic setting and highlights areas for further research. © The Author(s) 2016.

  3. Risk factors for cesarean section and instrumental vaginal delivery after successful external cephalic version.

    Science.gov (United States)

    de Hundt, Marcella; Vlemmix, Floortje; Bais, Joke M J; de Groot, Christianne J; Mol, Ben Willem; Kok, Marjolein

    2016-01-01

    The aim of this article is to examine whether we could identify factors that predict cesarean section and instrumental vaginal delivery in women who had a successful external cephalic version (ECV). We used data from a previous randomized trial among 25 hospitals and their referring midwife practices in the Netherlands. With the data of this trial, we performed a cohort study among women attempting vaginal delivery after successful ECV. We evaluated whether maternal age, gestational age, parity, time interval between ECV and delivery, birth weight, neonatal gender, and induction of labor were predictive for a vaginal delivery on the one hand, or a cesarean section or instrumental vaginal delivery on the other. Unadjusted and adjusted odds ratios were calculated with univariate and multivariate logistic regression analysis. Among 301 women who attempted vaginal delivery after a successful external cephalic version attempt, the cesarean section rate was 13% and the instrumental vaginal delivery rate 6%, resulting in a combined instrumental delivery rate of 19%. Nulliparity increased the risk of cesarean section (OR 2.7 (95% CI 1.2-6.1)) and instrumental delivery (OR 4.2 (95% CI 2.1-8.6)). Maternal age, gestational age at delivery, time interval between external cephalic version and delivery, birth weight and neonatal gender did not contribute to the prediction of failed spontaneous vaginal delivery. In our cohort of 301 women with a successful external cephalic version, nulliparity was the only one of seven factors that predicted the risk of cesarean section and instrumental vaginal delivery.

  4. Metric properties of the "timed get up and go- modified version" test, in risk assessment of falls in active women.

    Science.gov (United States)

    Alfonso Mora, Margareth Lorena

    2017-03-30

    To analyse the metric properties of the Timed Get Up and Go-Modified Version Test (TGUGM) in fall-risk assessment in a group of physically active women. A sample of 202 women over 55 years of age was assessed in a cross-sectional study. The TGUGM was applied to assess their fall risk. The test was analysed by comparing the qualitative and quantitative information and by factor analysis. A logistic regression model explained the risk of falls according to the test components. The TGUGM was useful for assessing the risk of falls in the studied group. The test revealed two factors: the Get Up and the Gait with dual task. Fewer than twelve points in the evaluation, or run times higher than 35 seconds, were associated with a high risk of falling. More than 35 seconds in the test indicated a fall probability greater than 0.50, and scores below 12 points were associated with a delay of 7 seconds more in the execution of the test (p = 0.0016). Factor analysis of the TGUGM revealed two dimensions that can be independent predictors of the risk of falling: the Get Up, which explains between 64% and 87% of the risk of falling, and the Gait with dual task, which explains between 77% and 95% of the risk of falling.

  5. Melanoma Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing melanoma over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  6. Looking for the dichromatic version of a colour vision model

    Science.gov (United States)

    Capilla, P.; Luque, M. J.; Díez-Ajenjo, M. A.

    2004-09-01

    Different hypotheses on the sensitivity of photoreceptors and post-receptoral mechanisms were introduced in different colour vision models to derive acceptable dichromatic versions. Models with one (Ingling and T'sou, Guth et al, Boynton) and two linear opponent stages (DeValois and DeValois) and with two non-linear opponent stages (ATD95) were used. The L- and M-cone sensitivities of red-green defectives were either set to zero (cone-loss hypothesis) or replaced by that of a different cone-type (cone-replacement hypothesis), whereas for tritanopes the S-cone sensitivity was always assumed to be zero. The opponent mechanisms were either left unchanged or nulled in one or in all the opponent stages. The dichromatic models obtained have been evaluated according to their performance in three tests: computation of the spectral sensitivity of the dichromatic perceptual mechanisms, prediction of the colour loci describing dichromatic appearance and prediction of the gamut of colours that dichromats perceive as normal subjects do.
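
    The cone-loss and cone-replacement hypotheses for red-green dichromats amount to a one-line difference when transforming stimuli to cone space; a rough sketch (the XYZ-to-LMS matrix is one common choice and an assumption here, not taken from the paper):

        import numpy as np

        # Hunt-Pointer-Estevez-style XYZ-to-LMS matrix (one common choice; an assumption)
        XYZ_TO_LMS = np.array([[ 0.38971, 0.68898, -0.07868],
                               [-0.22981, 1.18340,  0.04641],
                               [ 0.00000, 0.00000,  1.00000]])

        def dichromat_lms(xyz, hypothesis="loss", missing="L"):
            """Simulate a protan ('L') or deutan ('M') observer's cone signals.
            'loss': the missing cone class responds with zero sensitivity.
            'replacement': the missing class is replaced by the other cone type."""
            lms = XYZ_TO_LMS @ np.asarray(xyz, dtype=float)
            i, j = (0, 1) if missing == "L" else (1, 0)
            if hypothesis == "loss":
                lms[i] = 0.0
            else:                      # cone replacement
                lms[i] = lms[j]
            return lms

        xyz_white = [0.9505, 1.0000, 1.0891]   # D65 white point
        print(dichromat_lms(xyz_white, "loss", "L"))
        print(dichromat_lms(xyz_white, "replacement", "L"))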

  7. RISK ANALYSIS DEVELOPED MODEL

    Directory of Open Access Journals (Sweden)

    Georgiana Cristina NUKINA

    2012-07-01

    Full Text Available Through the developed risk analysis model, it is decided whether control measures are suitable for implementation. The analysis also determines whether the benefits of a given control option outweigh its implementation cost.

  8. The integrated Earth System Model Version 1: formulation and functionality

    Energy Technology Data Exchange (ETDEWEB)

    Collins, William D.; Craig, Anthony P.; Truesdale, John E.; Di Vittorio, Alan; Jones, Andrew D.; Bond-Lamberty, Benjamin; Calvin, Katherine V.; Edmonds, James A.; Kim, Son H.; Thomson, Allison M.; Patel, Pralit L.; Zhou, Yuyu; Mao, Jiafu; Shi, Xiaoying; Thornton, Peter E.; Chini, Louise M.; Hurtt, George C.

    2015-07-23

    The integrated Earth System Model (iESM) has been developed as a new tool for projecting the joint human/climate system. The iESM is based upon coupling an Integrated Assessment Model (IAM) and an Earth System Model (ESM) into a common modeling infrastructure. IAMs are the primary tool for describing the human-Earth system, including the sources of global greenhouse gases (GHGs) and short-lived species, land use and land cover change, and other resource-related drivers of anthropogenic climate change. ESMs are the primary scientific tools for examining the physical, chemical, and biogeochemical impacts of human-induced changes to the climate system. The iESM project integrates the economic and human dimension modeling of an IAM and a fully coupled ESM within a single simulation system while maintaining the separability of each model if needed. Both IAM and ESM codes are developed and used by large communities and have been extensively applied in recent national and international climate assessments. By introducing heretofore-omitted feedbacks between natural and societal drivers, we can improve scientific understanding of the human-Earth system dynamics. Potential applications include studies of the interactions and feedbacks leading to the timing, scale, and geographic distribution of emissions trajectories and other human influences, corresponding climate effects, and the subsequent impacts of a changing climate on human and natural systems. This paper describes the formulation, requirements, implementation, testing, and resulting functionality of the first version of the iESM released to the global climate community.

  9. System Analysis and Risk Assessment system (SARA) Version 4.0

    Energy Technology Data Exchange (ETDEWEB)

    Sattison, M B; Russell, K D; Skinner, N L [EG and G Idaho, Inc., Idaho Falls, ID (United States)

    1992-01-01

    This NUREG is the tutorial for the System Analysis and Risk Assessment System (SARA) Version 4.0, a microcomputer-based system used to analyze the safety issues of a family (i.e., a power plant, a manufacturing facility, any facility on which a probabilistic risk assessment (PRA) might be performed). A series of lessons are provided that walk the user through some basic steps common to most analyses performed with SARA. The example problems presented in the lessons build on one another, and in combination, lead the user through all aspects of SARA sensitivity analysis.

  10. Preliminary Evaluation of the Community Multiscale Air Quality (CMAQ) Model Version 5.1

    Science.gov (United States)

    The AMAD will perform two annual CMAQ model simulations, one with the current publicly available version of the CMAQ model (v5.0.2) and the other with the beta version of the new model (v5.1). The results of each model simulation will then be compared to observations and the pe

  11. Evaluation of the Community Multiscale Air Quality (CMAQ) Model Version 5.1

    Science.gov (United States)

    The AMAD performed two CMAQ model simulations, one with the current publicly available version of the CMAQ model (v5.0.2) and the other with the new version of the CMAQ model (v5.1). The results of each model simulation are compared to observations and the performance of t...

  12. CREDIT RISK. DETERMINATION MODELS

    Directory of Open Access Journals (Sweden)

    MIHAELA GRUIESCU

    2012-01-01

    Full Text Available The internationalization of financial and banking flows and the rapid development of markets have changed the financial sector, forcing it to respond with force and imagination. Under these conditions, financial and banking institutions and rating agencies are increasingly looking for the best solutions to hedge risks and maximize profits. This paper aims to present a number of advantages, but also the limits, of the Merton model, the first structural model of credit risk. It also discusses extensions of the model, some known empirical research and performance results, and newer approaches such as state-dependent models (SDM), which, together with liquidation process models (LPM), are two recent efforts within the structural family to capture different real-life phenomena.
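
    A minimal sketch of the Merton structural model the abstract discusses may help: equity is priced as a European call on firm assets, and the risk-neutral default probability falls out as N(-d2). All numeric inputs below are illustrative assumptions, not values from the paper.

    ```python
    # Merton (1974) structural credit model: equity as a call option on assets,
    # risk-neutral default probability = N(-d2). Inputs below are illustrative.
    from math import log, sqrt, exp
    from statistics import NormalDist

    def merton_default_probability(V, D, r, sigma, T):
        """Equity value and risk-neutral P(assets V < debt D) at horizon T."""
        d1 = (log(V / D) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
        d2 = d1 - sigma * sqrt(T)
        N = NormalDist().cdf
        equity = V * N(d1) - D * exp(-r * T) * N(d2)  # Black-Scholes call payoff
        return equity, N(-d2)

    equity, pd = merton_default_probability(V=100.0, D=80.0, r=0.03, sigma=0.25, T=1.0)
    print(f"equity value: {equity:.2f}, default probability: {pd:.2%}")
    ```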

  13. Risk Analysis and Decision-Making Software Package (1997 Version) User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Chung, F.T.H.

    1999-02-11

    This manual provides instructions for using the U.S. Department of Energy's (DOE) risk analysis and decision-making software (1997 version) developed at BDM Petroleum Technologies by BDM-Oklahoma, Inc. for DOE, under contract No. DE-AC22-94PC91OO8. This software provides petroleum producers with a simple, handy tool for exploration and production risk analysis and decision-making. It collects useful risk analysis tools in one package so that users do not have to use several programs separately. The software is simple to use, but still provides many functions. The 1997 version of the software package includes the following tools: (1) Investment risk (Gambler's ruin) analysis; (2) Monte Carlo simulation; (3) Best fit for distribution functions; (4) Sample and rank correlation; (5) Enhanced oil recovery method screening; and (6) Artificial neural network. This software package is subject to change. Suggestions and comments from users are welcome and will be considered for future modifications and enhancements of the software. Please check the opening screen of the software for the current contact information. In the future, more tools will be added to this software package. This manual includes instructions on how to use the software but does not attempt to fully explain the theory and algorithms used to create it.
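
    To make the first tool on that list concrete, here is a hedged Monte Carlo sketch of a gambler's-ruin analysis for a drilling program. The parameter names and numbers are illustrative assumptions, not values or code from the manual.

    ```python
    # Illustrative Monte Carlo gambler's-ruin analysis: estimate the chance an
    # exploration budget is exhausted, given per-well cost, success probability,
    # and payoff per discovery. All parameters are made-up examples.
    import random

    random.seed(1)

    def ruin_probability(budget, well_cost, p_success, payoff,
                         max_wells=100, trials=20_000):
        """Fraction of simulated drilling programs that run out of money."""
        ruined = 0
        for _ in range(trials):
            capital = budget
            for _ in range(max_wells):
                if capital < well_cost:          # ruin: cannot fund the next well
                    ruined += 1
                    break
                capital -= well_cost             # drill one well
                if random.random() < p_success:
                    capital += payoff            # discovery pays off
        return ruined / trials

    print(ruin_probability(budget=10e6, well_cost=1e6, p_success=0.2, payoff=6e6))
    ```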

  14. Integrating Cloud Processes in the Community Atmosphere Model, Version 5.

    Energy Technology Data Exchange (ETDEWEB)

    Park, S.; Bretherton, Christopher S.; Rasch, Philip J.

    2014-09-15

    This paper provides a description of the parameterizations of the global cloud system in CAM5. Compared to the previous versions, the CAM5 cloud parameterization has the following unique characteristics: (1) a transparent cloud macrophysical structure that has horizontally non-overlapped deep cumulus, shallow cumulus and stratus in each grid layer, each of which has its own cloud fraction, mass and number concentrations of cloud liquid droplets and ice crystals; (2) stratus-radiation-turbulence interaction that allows CAM5 to simulate marine stratocumulus solely from grid-mean RH without relying on the stability-based empirical empty stratus; (3) prognostic treatment of the number concentrations of stratus liquid droplets and ice crystals, with activated aerosols and detrained in-cumulus condensates as the main sources and evaporation-sedimentation-precipitation of stratus condensate as the main sinks; and (4) radiatively active cumulus. By imposing consistency between diagnosed stratus fraction and prognosed stratus condensate, CAM5 is free from empty or highly dense stratus at the end of stratus macrophysics. CAM5 also prognoses mass and number concentrations of various aerosol species. Thanks to the aerosol activation and the parameterizations of the radiation and stratiform precipitation production as a function of droplet size, CAM5 simulates various aerosol indirect effects associated with stratus as well as direct effects; i.e., aerosol controls both the radiative and hydrological budgets. Detailed analysis of various simulations revealed that CAM5 is much better than CAM3/4 in global performance as well as in its physical formulation. However, several problems were also identified, which can be attributed to inappropriate regional tuning, inconsistency between various physics parameterizations, and incomplete model physics. Continuous efforts are under way to further improve CAM5.

  15. RiskREP: Risk-Based Security Requirements Elicitation and Prioritization (extended version)

    NARCIS (Netherlands)

    Herrmann, Andrea; Morali, A.

    2010-01-01

    Today, companies are required to be in control of the security of their IT assets. This is especially challenging in the presence of limited budgets and conflicting requirements. Here, we present Risk-Based Requirements Elicitation and Prioritization (RiskREP), a method for managing IT security

  16. Models of Credit Risk Measurement

    OpenAIRE

    Hagiu Alina

    2011-01-01

    Credit risk is defined as the risk of financial loss caused by the failure of a counterparty. According to statistics, for financial institutions credit risk is much more important than market risk; reduced diversification of credit risk is the main cause of bank failures. Only recently did the banking industry begin to measure credit risk in the context of a portfolio, along with the development of risk management that started with value-at-risk (VaR) models. Once measured, credit risk can be diversif...

  17. Melanoma risk prediction models

    Directory of Open Access Journals (Sweden)

    Nikolić Jelena

    2014-01-01

    Full Text Available Background/Aim. The lack of effective therapy for advanced stages of melanoma emphasizes the importance of preventive measures and screenings of the population at risk. Identifying individuals at high risk should allow targeted screenings and follow-up involving those who would benefit most. The aim of this study was to identify the most significant factors for melanoma prediction in our population and to create prognostic models for identification and differentiation of individuals at risk. Methods. This case-control study included 697 participants (341 patients and 356 controls) who underwent an extensive interview and skin examination in order to check risk factors for melanoma. Pairwise univariate statistical comparison was used for the coarse selection of the most significant risk factors. These factors were fed into logistic regression (LR) and alternating decision tree (ADT) prognostic models that were assessed for their usefulness in identification of patients at risk of developing melanoma. Validation of the LR model was done by the Hosmer and Lemeshow test, whereas the ADT was validated by 10-fold cross-validation. The achieved sensitivity, specificity, accuracy and AUC for both models were calculated. The melanoma risk score (MRS) based on the outcome of the LR model was presented. Results. The LR model showed that the following risk factors were associated with melanoma: sunbeds (OR = 4.018; 95% CI 1.724-9.366 for those that sometimes used sunbeds), solar damage of the skin (OR = 8.274; 95% CI 2.661-25.730 for those with severe solar damage), hair color (OR = 3.222; 95% CI 1.984-5.231 for light brown/blond hair), the number of common naevi (over 100 naevi had OR = 3.57; 95% CI 1.427-8.931), the number of dysplastic naevi (from 1 to 10 dysplastic naevi OR = 2.672; 95% CI 1.572-4.540; for more than 10 naevi OR = 6.487; 95% CI 1.993-21.119), Fitzpatrick's phototype and the presence of congenital naevi. Red hair, phototype I and large congenital naevi were
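
    As a hedged illustration of how reported odds ratios can be assembled into an additive risk score of the MRS kind, the sketch below sums log-odds-ratio weights for the factors a subject presents. The OR values are those quoted in the abstract; the scoring rule itself (a plain sum of log ORs, with no intercept) is an assumption for illustration, not the paper's published MRS.

    ```python
    # Turning the reported odds ratios into an additive risk score by summing
    # log(OR) weights for the risk factors present. The ORs come from the
    # abstract; the exact MRS formula in the paper is not given there, so this
    # plain log-odds sum is illustrative only.
    import math

    odds_ratios = {
        "sometimes_used_sunbeds": 4.018,
        "severe_solar_damage": 8.274,
        "light_brown_or_blond_hair": 3.222,
        "over_100_common_naevi": 3.570,
        "1_to_10_dysplastic_naevi": 2.672,
        "over_10_dysplastic_naevi": 6.487,
    }

    def risk_score(present_factors):
        """Sum of log(OR) for the factors present (higher = higher risk)."""
        return sum(math.log(odds_ratios[f]) for f in present_factors)

    print(round(risk_score(["severe_solar_damage", "over_100_common_naevi"]), 2))
    ```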

  18. A comparison of modified versions of the Static-99 and the Sex Offender Risk Appraisal Guide.

    Science.gov (United States)

    Nunes, Kevin L; Firestone, Philip; Bradford, John M; Greenberg, David M; Broom, Ian

    2002-07-01

    The predictive validity of 2 risk assessment instruments for sex offenders, modified versions of the Static-99 and the Sex Offender Risk Appraisal Guide, was examined and compared in a sample of 258 adult male sex offenders. In addition, the independent contributions to the prediction of recidivism made by each instrument and by various phallometric indices were explored. Both instruments demonstrated moderate levels of predictive accuracy for sexual and violent (including sexual) recidivism. They were not significantly different in terms of their predictive accuracy for sexual or violent recidivism, nor did they contribute independently to the prediction of sexual or violent recidivism. Of the phallometric indices examined, only the pedophile index added significantly to the prediction of sexual recidivism, but not violent recidivism, above the Static-99 alone.

  19. The NDFF-EcoGRID logical data model, version 3. - Document version 1.1

    NARCIS (Netherlands)

    W. Arp; G. van Reenen; R. van Seeters; M. Tentij; L.E. Veen; D. Zoetebier

    2011-01-01

    The National Authority for Data concerning Nature has been appointed by the Ministry of Agriculture, Nature and Food Quality, and has been assigned the task of making available nature data and of promoting its use. The logical data model described here is intended for everyone in The Netherlands (an

  20. A Constrained and Versioned Data Model for TEAM Data

    Science.gov (United States)

    Andelman, S.; Baru, C.; Chandra, S.; Fegraus, E.; Lin, K.

    2009-04-01

    The objective of the Tropical Ecology Assessment and Monitoring Network (www.teamnetwork.org) is "To generate real time data for monitoring long-term trends in tropical biodiversity through a global network of TEAM sites (i.e. field stations in tropical forests), providing an early warning system on the status of biodiversity to effectively guide conservation action". To achieve this, the TEAM Network operates by collecting data via standardized protocols at TEAM Sites. The standardized TEAM protocols include the Climate, Vegetation and Terrestrial Vertebrate Protocols. Some sites also implement additional protocols. There are currently 7 TEAM Sites with plans to grow the network to 15 by June 30, 2009 and 50 TEAM Sites by the end of 2010. At each TEAM Site, data is gathered as defined by the protocols and according to a predefined sampling schedule. The TEAM data is organized and stored in a database based on the TEAM spatio-temporal data model. This data model is at the core of the TEAM Information System - it consumes and executes spatio-temporal queries, and analytical functions that are performed on TEAM data, and defines the object data types, relationships and operations that maintain database integrity. The TEAM data model contains object types including types for observation objects (e.g. bird, butterfly and trees), sampling unit, person, role, protocol, site and the relationship of these object types. Each observation data record is a set of attribute values of an observation object and is always associated with a sampling unit, an observation timestamp or time interval, a versioned protocol and data collectors. The operations on the TEAM data model can be classified as read operations, insert operations and update operations. Following are some typical operations: The operation get(site, protocol, [sampling unit block, sampling unit,] start time, end time) returns all data records using the specified protocol and collected at the specified site, block
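
    The get operation quoted above maps naturally onto a filter over versioned observation records. The sketch below reproduces only the filtering semantics described in the text; the record layout (dicts with these keys) and the field names are assumptions, not the TEAM system's actual schema.

    ```python
    # Illustrative read operation matching the get(...) signature quoted in the
    # abstract: return records for one site/protocol within a time window,
    # optionally restricted to a sampling unit.
    from datetime import datetime

    def get(records, site, protocol, start_time, end_time, sampling_unit=None):
        """Filter observation records as described for the TEAM data model."""
        return [
            r for r in records
            if r["site"] == site
            and r["protocol"] == protocol
            and (sampling_unit is None or r["sampling_unit"] == sampling_unit)
            and start_time <= r["timestamp"] <= end_time
        ]

    records = [{"site": "VB", "protocol": "Vegetation", "sampling_unit": "plot-1",
                "timestamp": datetime(2009, 3, 1), "value": 42}]
    print(get(records, "VB", "Vegetation",
              datetime(2009, 1, 1), datetime(2009, 12, 31)))
    ```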

  1. Incremental Validity Analyses of the Violence Risk Appraisal Guide and the Psychopathy Checklist: Screening Version in a Civil Psychiatric Sample

    Science.gov (United States)

    Edens, John F.; Skeem, Jennifer L.; Douglas, Kevin S.

    2006-01-01

    This study compares two instruments frequently used to assess risk for violence, the Violence Risk Appraisal Guide (VRAG) and the Psychopathy Checklist: Screening Version (PCL:SV), in a large sample of civil psychiatric patients. Despite a strong bivariate relationship with community violence, the VRAG could not improve on the predictive validity…

  2. Integrated Medical Model (IMM) Optimization Version 4.0 Functional Improvements

    Science.gov (United States)

    Arellano, John; Young, M.; Boley, L.; Garcia, Y.; Saile, L.; Walton, M.; Kerstman, E.; Reyes, D.; Goodenow, D. A.; Myers, J. G.

    2016-01-01

    The IMM's ability to assess mission outcome risk levels relative to available resources provides a unique capability to guide decisions on optimal operational medical kit and vehicle resources. Post-processing optimization allows IMM to optimize essential resources to improve a specific model outcome, such as maximization of the Crew Health Index (CHI) or minimization of the probability of evacuation (EVAC) or of the loss of crew life (LOCL). Mass and/or volume constrain the optimized resource set. The IMM's probabilistic simulation uses input data on one hundred medical conditions to simulate medical events that may occur in spaceflight, the resources required to treat those events, and the resulting impact to the mission based on specific crew and mission characteristics. Because IMM version 4.0 provides for partial treatment of medical events, IMM Optimization 4.0 scores resources at the individual resource unit increment level, as opposed to the full condition-specific treatment set level used in version 3.0. This allows the inclusion of as many resources as possible in the event that an entire set of resources called out for treatment cannot satisfy the constraints. IMM Optimization version 4.0 adds capabilities that increase efficiency by creating multiple resource sets based on differing constraints and priorities (CHI, EVAC, or LOCL). It also provides sets of resources that improve mission-related IMM v4.0 outputs with improved performance compared to the prior optimization. The new optimization represents much improved fidelity that will improve the utility of the IMM 4.0 for decision support.
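
    The constrained resource selection described here is, at its core, a knapsack-style problem. The sketch below shows a greedy benefit-per-mass version under a single mass budget; the item data, benefit scores, and the greedy heuristic itself are illustrative assumptions, since the real IMM optimization scores resource sets against probabilistic simulation outcomes.

    ```python
    # Toy version of the mass-constrained resource optimization: pick resource
    # units to maximize a benefit score (standing in for CHI improvement)
    # under a mass budget, greedily by benefit density. All data are made up.
    def optimize(items, mass_budget):
        """items: (name, mass_kg, benefit). Returns chosen names and used mass."""
        chosen, used = [], 0.0
        for name, mass, benefit in sorted(items, key=lambda i: i[2] / i[1],
                                          reverse=True):
            if used + mass <= mass_budget:
                chosen.append(name)
                used += mass
        return chosen, used

    items = [("iv_kit", 2.0, 8.0), ("splint", 0.8, 3.0), ("meds_a", 0.3, 2.5)]
    print(optimize(items, mass_budget=2.5))
    ```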

  3. Estimating Parameters for the PVsyst Version 6 Photovoltaic Module Performance Model

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    We present an algorithm to determine parameters for the photovoltaic module performance model encoded in the software package PVsyst(TM) version 6. Our method operates on current-voltage (I-V) curves measured over a range of irradiance and temperature conditions. We describe the method and illustrate its steps using data for a 36-cell crystalline silicon module. We qualitatively compare our method with one other technique for estimating parameters for the PVsyst(TM) version 6 model.

  4. psychotools - Infrastructure for Psychometric Modeling: Version 0.1-1

    OpenAIRE

    Zeileis, A.; Strobl, Carolin; Wickelmaier, F

    2011-01-01

    Infrastructure for psychometric modeling such as data classes (e.g., for paired comparisons) and basic model fitting functions (e.g., for Rasch and Bradley-Terry models). Intended especially as a common building block for fitting psychometric mixture models in package ‘‘psychomix’’ and psychometric tree models in package ‘‘psychotree’’. License: GPL-2

  5. Implementing an HL7 version 3 modeling tool from an Ecore model.

    Science.gov (United States)

    Bánfai, Balázs; Ulrich, Brandon; Török, Zsolt; Natarajan, Ravi; Ireland, Tim

    2009-01-01

    One of the main challenges of achieving interoperability using the HL7 V3 healthcare standard is the lack of clear definition and supporting tools for modeling, testing, and conformance checking. Currently, the knowledge defining the modeling is scattered around in MIF schemas, tools and specifications or simply with the domain experts. Modeling core HL7 concepts, constraints, and semantic relationships in Ecore/EMF encapsulates the domain-specific knowledge in a transparent way while unifying Java, XML, and UML in an abstract, high-level representation. Moreover, persisting and versioning the core HL7 concepts as a single Ecore context allows modelers and implementers to create, edit and validate message models against a single modeling context. The solution discussed in this paper is implemented in the new HL7 Static Model Designer as an extensible toolset integrated as a standalone Eclipse RCP application.

  6. Psychometric properties of the persian version of the youth risk behavior survey questionnaire.

    Science.gov (United States)

    Baheiraei, A; Hamzehgardeshi, Z; Mohammadi, M R; Nedjat, S; Mohammadi, E

    2012-06-01

    Adolescents may get involved in high-risk behaviors. Surveys are the primary, and sometimes the sole, source of data collection for many high-risk health behaviors. We examined the reliability and validity of the psychometric properties of the self-administered Persian version of the 2009 Youth Risk Behavior Surveillance System (YRBSS) questionnaire. In a methodological study in summer 2010, 100 Iranian adolescents aged 15-18 years were recruited through convenience sampling. Face and content validity were used to assess the questionnaire's validity. In order to evaluate the questionnaire's reliability, the Intraclass Correlation Coefficient (ICC) and Cronbach's α were calculated for the domains and the 89 items. Among the 89 items, the ICC values were below 0.4 (weak reliability) for 2 items (2.25%), 0.4-0.6 (moderate reliability) for 10 items (11.24%), 0.6-0.8 (good reliability) for 32 items (35.96%) and 0.8-1 (excellent reliability) for 45 items (50.56%). The prevalence of most high-risk behaviors was constant between the first and second surveys. The value of Cronbach's α was 0.73 for intentional and unintentional injuries, 0.77 for tobacco use, 0.86 for alcohol and other drug use, and 0.79 for unsafe sexual behaviors. No domain had a mean ICC below 0.6, and 97.75% of the items had moderate to excellent reliability; thus, the Persian YRBSS questionnaire had acceptable reliability. Over the 2-week period, sexual behaviors were reported with less consistency than other behaviors. In any case, researchers must be aware of the limitations of data collected through this questionnaire, particularly in the domain of sexual behaviors.
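
    The ICC bands the abstract uses are a simple threshold classification; a tiny sketch follows, with the thresholds taken from the abstract and the item names and values made up for illustration.

    ```python
    # The ICC reliability bands used in the abstract, applied per item.
    def icc_band(icc):
        if icc < 0.4:
            return "weak"
        if icc < 0.6:
            return "moderate"
        if icc < 0.8:
            return "good"
        return "excellent"

    item_iccs = {"tobacco_q1": 0.83, "alcohol_q2": 0.55}   # illustrative values
    print({item: icc_band(v) for item, v in item_iccs.items()})
    ```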

  7. Calibrating and Updating the Global Forest Products Model (GFPM version 2014 with BPMPD)

    Science.gov (United States)

    Joseph Buongiorno; Shushuai Zhu

    2014-01-01

    The Global Forest Products Model (GFPM) is an economic model of global production, consumption, and trade of forest products. An earlier version of the model is described in Buongiorno et al. (2003). The GFPM 2014 has data and parameters to simulate changes of the forest sector from 2010 to 2030. Buongiorno and Zhu (2014) describe how to use the model for simulation....

  8. Calibrating and updating the Global Forest Products Model (GFPM version 2016 with BPMPD)

    Science.gov (United States)

    Joseph Buongiorno; Shushuai  Zhu

    2016-01-01

    The Global Forest Products Model (GFPM) is an economic model of global production, consumption, and trade of forest products. An earlier version of the model is described in Buongiorno et al. (2003). The GFPM 2016 has data and parameters to simulate changes of the forest sector from 2013 to 2030. Buongiorno and Zhu (2015) describe how to use the model for...

  9. Aircraft/Air Traffic Management Functional Analysis Model. Version 2.0; User's Guide

    Science.gov (United States)

    Etheridge, Melvin; Plugge, Joana; Retina, Nusrat

    1998-01-01

    The Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 (FAM 2.0), is a discrete event simulation model designed to support analysis of alternative concepts in air traffic management and control. FAM 2.0 was developed by the Logistics Management Institute (LMI) under a National Aeronautics and Space Administration (NASA) contract. This document provides a guide for using the model in analysis. Those interested in making enhancements or modifications to the model should consult the companion document, Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 Technical Description.

  10. Modelling and analysis of Markov reward automata (extended version)

    NARCIS (Netherlands)

    Guck, Dennis; Timmer, Mark; Hatefi, Hassan; Ruijters, Enno; Stoelinga, Mariëlle

    2014-01-01

    Costs and rewards are important ingredients for cyberphysical systems, modelling critical aspects like energy consumption, task completion, repair costs, and memory usage. This paper introduces Markov reward automata, an extension of Markov automata that allows the modelling of systems incorporating

  11. Estimating hybrid choice models with the new version of Biogeme

    OpenAIRE

    Bierlaire, Michel

    2010-01-01

    Hybrid choice models integrate many types of discrete choice modeling methods, including latent classes and latent variables, in order to capture concepts such as perceptions, attitudes, preferences, and motivation (Ben-Akiva et al., 2002). Although they provide an excellent framework to capture complex behavior patterns, their use in applications remains rare in the literature due to the difficulty of estimating the models. In this talk, we provide a short introduction to hybrid choice model...

  12. A hypocentral version of the space-time ETAS model

    Science.gov (United States)

    Guo, Yicun; Zhuang, Jiancang; Zhou, Shiyong

    2015-10-01

    The space-time Epidemic-Type Aftershock Sequence (ETAS) model is extended by incorporating the depth component of earthquake hypocentres. The depths of the direct offspring produced by an earthquake are assumed to be independent of the epicentre locations and to follow a beta distribution, whose shape parameter is determined by the depth of the parent event. This new model is verified by applying it to the Southern California earthquake catalogue. The results show that the new model fits data better than the original epicentre ETAS model and that it provides the potential for modelling and forecasting seismicity with higher resolutions.
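
    A hedged sketch of the depth mechanism described above: offspring depths are drawn from a beta distribution scaled to the seismogenic thickness, with a shape parameter tied to the parent's depth. The particular link function and constants below are illustrative assumptions, not the fitted values from the Southern California catalogue.

    ```python
    # Offspring-depth sampling in the spirit of the hypocentral ETAS extension:
    # depths follow a beta distribution on [0, D_MAX] whose shape parameter
    # depends on the parent event's depth. The alpha(parent_depth) link below
    # is an assumption for illustration.
    import numpy as np

    rng = np.random.default_rng(0)
    D_MAX = 30.0  # assumed maximum seismogenic depth, km

    def sample_offspring_depths(parent_depth_km, n, beta=2.0):
        """Draw n offspring depths; deeper parents skew children deeper."""
        alpha = 1.0 + 4.0 * (parent_depth_km / D_MAX)
        return D_MAX * rng.beta(alpha, beta, size=n)

    print(sample_offspring_depths(parent_depth_km=12.0, n=5))
    ```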

  13. Custom v. Standardized Risk Models

    Directory of Open Access Journals (Sweden)

    Zura Kakushadze

    2015-05-01

    Full Text Available We discuss when and why custom multi-factor risk models are warranted and give source code for computing some risk factors. Pension/mutual funds do not require customization but standardization. However, using standardized risk models in quant trading with much shorter holding horizons is suboptimal: (1) longer horizon risk factors (value, growth, etc.) increase noise trades and trading costs; (2) arbitrary risk factors can neutralize alpha; (3) “standardized” industries are artificial and insufficiently granular; (4) normalization of style risk factors is lost for the trading universe; (5) diversifying risk models lowers P&L correlations, reduces turnover and market impact, and increases capacity. We discuss various aspects of custom risk model building.
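
    In the spirit of the risk-factor source code the abstract mentions, here is a hedged sketch of one common style factor: 12-month momentum, winsorized and z-scored across the trading universe. The window lengths and winsorization bounds are conventional choices assumed for illustration, not the article's own code.

    ```python
    # One style risk factor of the kind such models use: momentum over roughly
    # 12 months (skipping the most recent month), winsorized at the 5th/95th
    # percentiles and z-scored cross-sectionally. Windows/bounds are assumed.
    import numpy as np

    def momentum_factor(prices):
        """prices: (T, N) daily closes; returns one normalized loading per stock."""
        mom = prices[-21] / prices[-252] - 1.0          # skip the latest month
        lo, hi = np.percentile(mom, [5, 95])
        mom = np.clip(mom, lo, hi)                      # winsorize outliers
        return (mom - mom.mean()) / mom.std()           # cross-sectional z-score

    rng = np.random.default_rng(1)
    prices = np.cumprod(1 + 0.01 * rng.standard_normal((260, 4)), axis=0)
    print(momentum_factor(prices))
    ```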

  14. Result Summary for the Area 5 Radioactive Waste Management Site Performance Assessment Model Version 4.113

    Energy Technology Data Exchange (ETDEWEB)

    Shott, G. J.

    2012-04-15

    Preliminary results for Version 4.113 of the Nevada National Security Site Area 5 Radioactive Waste Management Site performance assessment model are summarized. Version 4.113 includes the Fiscal Year 2011 inventory estimate.

  15. SSM - SOLID SURFACE MODELER, VERSION 6.0

    Science.gov (United States)

    Goza, S. P.

    1994-01-01

    The Solid Surface Modeler (SSM) is an interactive graphics software application for solid-shaded and wireframe three- dimensional geometric modeling. It enables the user to construct models of real-world objects as simple as boxes or as complex as Space Station Freedom. The program has a versatile user interface that, in many cases, allows mouse input for intuitive operation or keyboard input when accuracy is critical. SSM can be used as a stand-alone model generation and display program and offers high-fidelity still image rendering. Models created in SSM can also be loaded into other software for animation or engineering simulation. (See the information below for the availability of SSM with the Object Orientation Manipulator program, OOM, a graphics software application for three-dimensional rendering and animation.) Models are constructed within SSM using functions of the Create Menu to create, combine, and manipulate basic geometric building blocks called primitives. Among the simpler primitives are boxes, spheres, ellipsoids, cylinders, and plates; among the more complex primitives are tubes, skinned-surface models and surfaces of revolution. SSM also provides several methods for duplicating models. Constructive Solid Geometry (CSG) is one of the most powerful model manipulation tools provided by SSM. The CSG operations implemented in SSM are union, subtraction and intersection. SSM allows the user to transform primitives with respect to each axis, transform the camera (the user's viewpoint) about its origin, apply texture maps and bump maps to model surfaces, and define color properties; to select and combine surface-fill attributes, including wireframe, constant, and smooth; and to specify models' points of origin (the positions about which they rotate). SSM uses Euler angle transformations for calculating the results of translation and rotation operations. The user has complete control over the modeling environment from within the system. A variety of file

  16. Alternative Factor Models and Heritability of the Short Leyton Obsessional Inventory--Children's Version

    Science.gov (United States)

    Moore, Janette; Smith, Gillian W.; Shevlin, Mark; O'Neill, Francis A.

    2010-01-01

    An alternative models framework was used to test three confirmatory factor analytic models for the Short Leyton Obsessional Inventory-Children's Version (Short LOI-CV) in a general population sample of 517 young adolescent twins (11-16 years). A one-factor model as implicit in current classification systems of Obsessive-Compulsive Disorder (OCD),…

  17. The MiniBIOS model (version 1A4) at the RIVM

    NARCIS (Netherlands)

    Uijt de Haag PAM; Laheij GMH

    1993-01-01

    This report is the user's guide of the MiniBIOS model, version 1A4. The model is operational at the Laboratory of Radiation Research of the RIVM. MiniBIOS is a simulation model for calculating the transport of radionuclides in the biosphere and the consequential radiation dose to humans. The

  19. Simulating historical landscape dynamics using the landscape fire succession model LANDSUM version 4.0

    Science.gov (United States)

    Robert E. Keane; Lisa M. Holsinger; Sarah D. Pratt

    2006-01-01

    The range and variation of historical landscape dynamics could provide a useful reference for designing fuel treatments on today's landscapes. Simulation modeling is a vehicle that can be used to estimate the range of conditions experienced on historical landscapes. A landscape fire succession model called LANDSUMv4 (LANDscape SUccession Model version 4.0) is...

  3. Complexity, accuracy and practical applicability of different biogeochemical model versions

    Science.gov (United States)

    Los, F. J.; Blaas, M.

    2010-04-01

    The construction of validated biogeochemical model applications as prognostic tools for the marine environment involves a large number of choices, particularly with respect to the level of detail of the physical, chemical and biological aspects. Generally speaking, enhanced complexity might enhance veracity, accuracy and credibility. However, very complex models are not necessarily effective or efficient forecast tools. In this paper, models of varying degrees of complexity are evaluated with respect to their forecast skills. In total, 11 biogeochemical model variants have been considered based on four different horizontal grids. The applications vary in spatial resolution, in vertical resolution (2DH versus 3D), in nature of transport, in turbidity and in the number of phytoplankton species. Included models range from 15-year-old applications with relatively simple physics up to present state-of-the-art 3D models. With all applications the same year, 2003, has been simulated. During the model intercomparison it was noticed that the 'OSPAR' Goodness of Fit cost function (Villars and de Vries, 1998) leads to insufficient discrimination of different models. This results in models obtaining similar scores although closer inspection of the results reveals large differences. In this paper, therefore, we have adopted the target diagram by Jolliff et al. (2008), which provides a concise and more contrasting picture of model skill on the entire model domain and for the entire period of the simulations. Correctness in predicting the mean and the variability are separated, which enhances insight into model functioning. Using the target diagrams it is demonstrated that recent models are more consistent and have smaller biases. Graphical inspection of time series confirms this, as the level of variability appears more realistic, also given the multi-annual background statistics of the observations. Nevertheless, whether the improvements are all genuine for the particular

  4. Enterprise Risk Management Models

    CERN Document Server

    Olson, David L

    2010-01-01

    Enterprise risk management has always been important. However, the events of the 21st century have made it even more critical. The top level of business management became suspect after scandals at ENRON, WorldCom, and other business entities. Financially, many firms experienced difficulties from bubbles. The problems of interacting cultures demonstrated risk from terrorism as well, with numerous terrorist attacks, including 9/11 in the U.S. Risks can arise in many facets of business. Businesses in fact exist to cope with risk in their area of specialization. Financial risk management has focu...

  5. Efficient Modelling and Generation of Markov Automata (extended version)

    NARCIS (Netherlands)

    Timmer, Mark; Katoen, Joost-Pieter; Pol, van de Jaco; Stoelinga, Mariëlle

    2012-01-01

    This paper introduces a framework for the efficient modelling and generation of Markov automata. It consists of (1) the data-rich process-algebraic language MAPA, allowing concise modelling of systems with nondeterminism, probability and Markovian timing; (2) a restricted form of the language, the M

  6. Modeling the complete Otto cycle: Preliminary version. [computer programming

    Science.gov (United States)

    Zeleznik, F. J.; Mcbride, B. J.

    1977-01-01

    A description is given of the equations and the computer program being developed to model the complete Otto cycle. The program incorporates such important features as: (1) heat transfer, (2) finite combustion rates, (3) complete chemical kinetics in the burned gas, (4) exhaust gas recirculation, and (5) manifold vacuum or supercharging. Changes in thermodynamic, kinetic and transport data as well as model parameters can be made without reprogramming. Preliminary calculations indicate that: (1) chemistry and heat transfer significantly affect composition and performance, (2) there seems to be a strong interaction among model parameters, and (3) a number of cycles must be calculated in order to obtain steady-state conditions.

  7. Flipped version of the supersymmetric strongly coupled preon model

    Science.gov (United States)

    Fajfer, S.; Mileković, M.; Tadić, D.

    1989-12-01

    In the supersymmetric SU(5) [SUSY SU(5)] composite model (which was described in an earlier paper) the fermion mass terms can be easily constructed. The SUSY SU(5)⊗U(1), i.e., flipped, composite model possesses a completely analogous composite-particle spectrum. However, in that model one cannot construct a renormalizable superpotential which would generate fermion mass terms. This contrasts with the standard noncomposite grand unified theories (GUT's) in which both the Georgi-Glashow electrical charge embedding and its flipped counterpart lead to the renormalizable theories.

  8. [Risk of developmental dysplasia of the hip in patients subjected to the external cephalic version].

    Science.gov (United States)

    Sarmiento Carrera, Nerea; González Colmenero, Eva; Vázquez Castelo, José Luis; Concheiro Guisán, Ana; Couceiro Naveira, Emilio; Fernández Lorenzo, José Ramón

    2017-05-03

    Developmental dysplasia of the hip (DDH) refers to the spectrum of abnormalities in the maturation and development of the hip. Breech presentation is associated with DDH. This risk factor can be modified by external cephalic version (ECV). The aim of this study was to evaluate the incidence of DDH in patients who successfully underwent ECV, as well as to evaluate the need for these children (in breech presentation for a period during gestation) to be included in the DDH screening protocol. A prospective cohort study was conducted in the Hospital Universitario de Vigo from January 1, 2015 to December 31, 2015. It included children born in cephalic presentation after a successful ECV, as well as children born in breech presentation. They were all screened for DDH by ultrasound examination of the hip. Out of a total of 122 newborns included in the study, ECV was attempted in 67 (54.9%), of which 35 (52.2%) were successful. Of the 14 children diagnosed with DDH, 3 had been born in cephalic presentation after a successful ECV and were normal on physical examination. Successful ECV is associated with a lower incidence of DDH than breech presentation. However, these patients should be included in the DDH screening protocol for the early detection of this disorder. Copyright © 2017. Published by Elsevier España, S.L.U.

  9. Obstetric care providers assessing psychosocial risk factors during pregnancy: validation of a short screening tool - the KINDEX Spanish Version.

    Science.gov (United States)

    Spyridou, Andria; Schauer, Maggie; Ruf-Leuschner, Martina

    2014-01-01

    High levels of stress due to diverse psychosocial factors have a direct impact on the mothers' wellbeing during pregnancy and both direct and indirect effects on the fetus. In most cases, psychosocial risk factors present during pregnancy will not disappear after delivery and might influence the parent-child relationship, affecting the healthy development of the offspring in the long term. We introduce a short, innovative prenatal assessment to detect psychosocial risk factors, designed as an easy-to-use instrument for obstetrical medical staff in daily clinical practice: the KINDEX Spanish Version. In the present study, midwives and gynecologists interviewed one hundred nineteen pregnant women in a public health center using the KINDEX Spanish Version. Sixty-seven women were then randomly selected to participate in an extended standardized validation interview conducted by a clinical psychologist using established questionnaires to assess current stress (ESI, PSS-14), symptoms of psychopathology (HSCL-25, PDS) and traumatic experiences (PDS, CFV). Ethical approval was granted and informed consent was required for participation in this study. The KINDEX sum score, as assessed by medical staff, correlated significantly with stress, psychopathology and trauma as measured during the clinical expert interview. The KINDEX shows strong concurrent validity. Its use by medical staff in daily clinical practice is feasible for public health contexts. Certain items in the KINDEX are related to the respective scales assessing the same risks (e.g. PSS-4 as the shorter version of the PSS-14, and items from the ESI) used in the validation interview. The KINDEX Spanish Version is a valid tool in the hands of medical staff to identify women with multiple psychosocial risk factors in public health settings. The KINDEX Spanish Version could serve as a base instrument for the referral of at-risk women to appropriate psychosocial intervention. Such early interventions could prove pivotal

  10. ONKALO rock mechanics model (RMM) - Version 2.0

    Energy Technology Data Exchange (ETDEWEB)

    Moenkkoenen, H. [WSP Finland Oy, Helsinki (Finland); Hakala, M. [KMS Hakala Oy, Nokia (Finland); Paananen, M.; Laine, E. [Geological Survey of Finland, Espoo (Finland)

    2012-02-15

    The Rock Mechanics Model of the ONKALO rock volume is a description of the significant features and parameters related to rock mechanics. The main objective is to develop a tool to predict the rock properties, quality and hence the potential for stress failure which can then be used for continuing design of the ONKALO and the repository. This is the second implementation of the Rock Mechanics Model and it includes sub-models of the intact rock strength, in situ stress, thermal properties, rock mass quality and properties of the brittle deformation zones. Because of the varying quantities of available data for the different parameters, the types of presentations also vary: some data sets can be presented in the style of a 3D block model but, in other cases, a single distribution represents the whole rock volume hosting the ONKALO. (orig.)

  11. U.S. Coastal Relief Model - Southern California Version 2

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — NGDC's U.S. Coastal Relief Model (CRM) provides a comprehensive view of the U.S. coastal zone integrating offshore bathymetry with land topography into a seamless...

  12. The Oak Ridge Competitive Electricity Dispatch (ORCED) Model Version 9

    Energy Technology Data Exchange (ETDEWEB)

    Hadley, Stanton W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Baek, Young Sun [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-11-01

    The Oak Ridge Competitive Electricity Dispatch (ORCED) model dispatches power plants in a region to meet the electricity demands for any single given year up to 2030. It uses publicly available sources of data describing electric power units such as the National Energy Modeling System and hourly demands from utility submittals to the Federal Energy Regulatory Commission that are projected to a future year. The model simulates a single region of the country for a given year, matching generation to demands and predefined net exports from the region, assuming no transmission constraints within the region. ORCED can calculate a number of key financial and operating parameters for generating units and regional market outputs including average and marginal prices, air emissions, and generation adequacy. By running the model with and without changes such as generation plants, fuel prices, emission costs, plug-in hybrid electric vehicles, distributed generation, or demand response, the marginal impact of these changes can be found.
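
    The core dispatch step the abstract describes is a merit-order stack: sort units by marginal cost and load them until the hour's demand is met. The sketch below assumes illustrative unit data and mirrors the abstract's no-transmission-constraint simplification; it is not ORCED's actual code.

    ```python
    # Minimal merit-order dispatch: load units in order of marginal cost until
    # demand is met; the marginal unit sets the price. Unit data are made up.
    def dispatch(units, demand_mw):
        """units: list of (name, capacity_mw, marginal_cost $/MWh)."""
        schedule, remaining, price = {}, demand_mw, 0.0
        for name, cap, cost in sorted(units, key=lambda u: u[2]):
            if remaining <= 0:
                break
            take = min(cap, remaining)
            schedule[name] = take
            remaining -= take
            price = cost                      # marginal unit sets the price
        return schedule, price

    units = [("nuclear", 1000, 10.0), ("coal", 800, 25.0), ("gas_ct", 500, 60.0)]
    print(dispatch(units, demand_mw=1500))
    ```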

  13. Macro System Model (MSM) User Guide, Version 1.3

    Energy Technology Data Exchange (ETDEWEB)

    Ruth, M.; Diakov, V.; Sa, T.; Goldsby, M.

    2011-09-01

    This user guide describes the macro system model (MSM). The MSM has been designed to allow users to analyze the financial, environmental, transitional, geographical, and R&D issues associated with the transition to a hydrogen economy. Basic end users can use the MSM to answer cross-cutting questions that were previously difficult to answer in a consistent and timely manner due to various assumptions and methodologies among different models.

  14. A Systems Engineering Capability Maturity Model, Version 1.1,

    Science.gov (United States)

    1995-11-01

    The Systems Engineering Capability Maturity Model (SE-CMM) was developed as a response to industry requests for assistance in coordinating and publishing a model that would foster improvement

  15. Due Regard Encounter Model Version 1.0

    Science.gov (United States)

    2013-08-19

    Note that no existing model covers encounters between two IFR aircraft in oceanic airspace. The reason for this is that one cannot observe encounters between instrument flight rules (IFR) and non-IFR traffic beyond 12 NM. [Table 1, "Encounter model categories": intruder categories (IFR, VFR, noncooperative conventional, noncooperative unconventional) tabulated against the aircraft of interest's location (CONUS, offshore) and flight rule (IFR, VFR), with each combination marked C, U, or X.]

  16. Using the Global Forest Products Model (GFPM version 2012)

    Science.gov (United States)

    Joseph Buongiorno; Shushuai Zhu

    2012-01-01

    The purpose of this manual is to enable users of the Global Forest Products Model to: • Install and run the GFPM software • Understand the input data • Change the input data to explore different scenarios • Interpret the output The GFPM is an economic model of global production, consumption and trade of forest products (Buongiorno et al. 2003). The GFPM2012 has data...

  17. Institutional Transformation Version 2.5 Modeling and Planning.

    Energy Technology Data Exchange (ETDEWEB)

    Villa, Daniel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mizner, Jack H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Passell, Howard D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gallegos, Gerald R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Peplinski, William John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vetter, Douglas W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Evans, Christopher A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Malczynski, Leonard A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Addison, Marlin [Arizona State Univ., Mesa, AZ (United States); Schaffer, Matthew A. [Bridgers and Paxton Engineering Firm, Albuquerque, NM (United States); Higgins, Matthew W. [Vibrantcy, Albuquerque, NM (United States)

    2017-02-01

    Reducing the resource consumption and emissions of large institutions is an important step toward a sustainable future. Sandia National Laboratories' (SNL) Institutional Transformation (IX) project vision is to provide tools that enable planners to make well-informed decisions concerning sustainability, resource conservation, and emissions reduction across multiple sectors. The building sector has been the primary focus so far because it is the largest consumer of resources for SNL. The IX building module allows users to define the evolution of many buildings over time. The module has been created so that it can be generally applied to any set of DOE-2 (http://doe2.com) building models that have been altered to include parameters and expressions required by energy conservation measures (ECM). Once building models have been appropriately prepared, they are checked into a Microsoft Access(r) database. Each building can be represented by many models. This enables the capability to keep a continuous record of models in the past, which are replaced with different models as changes occur to the building. In addition to this, the building module has the capability to apply climate scenarios through applying different weather files to each simulation year. Once the database has been configured, a user interface in Microsoft Excel(r) is used to create scenarios with one or more ECMs. The capability to include central utility buildings (CUBs) that service more than one building with chilled water has been developed. A utility has been created that joins multiple building models into a single model. After using the utility, several manual steps are required to complete the process. Once this CUB model has been created, the individual contributions of each building are still tracked through meters. Currently, 120 building models from SNL's New Mexico and California campuses have been created. This includes all buildings at SNL greater than 10,000 sq. ft

  18. Prevalence, outcome, and women's experiences of external cephalic version in a low-risk population

    NARCIS (Netherlands)

    Rijnders, M.; Offerhaus, P.; Dommelen, P. van; Wiegers, T.; Buitendijk, S.

    2010-01-01

    Background: Until recently, external cephalic version to prevent breech presentation at birth was not widely accepted. The objective of our study was to assess the prevalence, outcomes, and women's experiences of external cephalic version to improve the implementation of the procedure in the Netherl

  19. Zig-zag version of the Frenkel-Kontorova model

    DEFF Research Database (Denmark)

    Christiansen, Peter Leth; Savin, A.V.; Zolotaryuk, Alexander

    1996-01-01

    We study a generalization of the Frenkel-Kontorova model which describes a zig-zag chain of particles coupled by both the first- and second-neighbor harmonic forces and subjected to a planar substrate with a commensurate potential relief. The particles are supposed to have two degrees of freedom:...

  20. A node-based version of the cellular Potts model.

    Science.gov (United States)

    Scianna, Marco; Preziosi, Luigi

    2016-09-01

    The cellular Potts model (CPM) is a lattice-based Monte Carlo method that uses an energetic formalism to describe the phenomenological mechanisms underlying the biophysical problem of interest. We here propose a CPM-derived framework that relies on a node-based representation of cell-scale elements. This feature has relevant consequences for the overall simulation environment. First, our model can be implemented on any given domain, provided a proper discretization (which can be regular or irregular, fixed or time-evolving). Second, it allows an explicit representation of cell membranes, whose displacements realistically result in cell movement. Finally, our node-based approach can be easily interfaced with continuum mechanics or fluid dynamics models. The proposed computational environment is here applied to some simple biological phenomena, such as cell sorting and chemotactic migration, in part to analyze the performance of the underlying algorithm. The work closes with a critical comparison of the advantages and disadvantages of our model with respect to the traditional CPM and to some similar vertex-based approaches. Copyright © 2016 Elsevier Ltd. All rights reserved.
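
    At the heart of any CPM-type Monte Carlo step, node-based or lattice-based, is a Metropolis acceptance rule on the change in the Hamiltonian. The sketch below isolates just that rule; in a real CPM the energy difference would come from adhesion and volume-constraint terms, which are omitted here as assumptions out of scope.

    ```python
    # Metropolis acceptance rule underlying CPM-style Monte Carlo updates.
    # delta_energy would normally be computed from the CPM Hamiltonian
    # (adhesion + volume constraints); here it is a bare input.
    import math
    import random

    def accept_move(delta_energy, temperature):
        """Accept energy-lowering moves; others with Boltzmann probability."""
        if delta_energy <= 0:
            return True
        return random.random() < math.exp(-delta_energy / temperature)

    accepted = sum(accept_move(1.0, temperature=2.0) for _ in range(10_000))
    print(f"acceptance rate at dE=1, T=2: {accepted / 10_000:.2f}")  # ~exp(-0.5)
    ```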

  1. Red Storm usage model :Version 1.12.

    Energy Technology Data Exchange (ETDEWEB)

    Jefferson, Karen L.; Sturtevant, Judith E.

    2005-12-01

    Red Storm is an Advanced Simulation and Computing (ASC) funded massively parallel supercomputer located at Sandia National Laboratories (SNL). The Red Storm Usage Model (RSUM) documents the capabilities and the environment provided for the FY05 Tri-Lab Level II Limited Availability Red Storm User Environment Milestone and the FY05 SNL Level II Limited Availability Red Storm Platform Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and SNL. Additionally, the Red Storm Usage Model maps the provided capabilities to the Tri-Lab ASC Computing Environment (ACE) requirements. The ACE requirements reflect the high performance computing requirements for the ASC community and have been updated in FY05 to reflect the community's needs. For each section of the RSUM, Appendix I maps the ACE requirements to the Limited Availability User Environment capabilities and includes a description of ACE requirements met and those requirements that are not met in that particular section. The Red Storm Usage Model, along with the ACE mappings, has been issued and vetted throughout the Tri-Lab community.

  2. Parameter Estimation in Rainfall-Runoff Modelling Using Distributed Versions of Particle Swarm Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Michala Jakubcová

    2015-01-01

    Full Text Available The presented paper analyzes selected versions of the particle swarm optimization (PSO) algorithm. The tested versions of PSO were combined with a shuffling mechanism, which splits the model population into complexes and performs distributed PSO optimization. One of them is a newly proposed PSO modification, APartW, which enhances global exploration and local exploitation in the parametric space during the optimization process through a new updating mechanism applied to the PSO inertia weight. The performance of four selected PSO methods was tested on 11 benchmark optimization problems prepared for the special session on single-objective real-parameter optimization at CEC 2005. The results confirm that the tested new APartW PSO variant is comparable with the other existing distributed PSO versions, AdaptW and LinTimeVarW. The distributed PSO versions were developed for solving inverse problems related to the estimation of parameters of the hydrological model Bilan. The results of the case study, conducted on a set of 30 catchments obtained from the MOPEX database, show that the tested distributed PSO versions provide suitable estimates of Bilan model parameters and thus can be used for solving related inverse problems during the calibration process of the studied water balance hydrological model.
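
    For reference, the canonical PSO velocity/position update with a linearly decreasing inertia weight, the very term the APartW variant adapts, looks like the sketch below. The coefficients are common textbook defaults, not the paper's settings.

    ```python
    # Canonical PSO update with a linear time-varying inertia weight w.
    # c1/c2 and the w_max/w_min range are standard defaults for illustration.
    import numpy as np

    rng = np.random.default_rng(42)

    def pso_step(x, v, pbest, gbest, it, max_it,
                 c1=2.0, c2=2.0, w_max=0.9, w_min=0.4):
        """One swarm update; w decreases linearly from w_max to w_min."""
        w = w_max - (w_max - w_min) * it / max_it
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        return x + v, v

    x = rng.uniform(-5, 5, (20, 2))          # 20 particles in a 2-D space
    v = np.zeros_like(x)
    x, v = pso_step(x, v, pbest=x.copy(), gbest=x[0].copy(), it=0, max_it=100)
    print(x[:3])
    ```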

  3. Incremental Testing of the Community Multiscale Air Quality (CMAQ) Modeling System Version 4.7

    Science.gov (United States)

    This paper describes the scientific and structural updates to the latest release of the Community Multiscale Air Quality (CMAQ) modeling system version 4.7 (v4.7) and points the reader to additional resources for further details. The model updates were evaluated relative to obse...

  4. Connected Equipment Maturity Model Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Butzbaugh, Joshua B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mayhorn, Ebony T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sullivan, Greg [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Whalen, Scott A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-05-01

    The Connected Equipment Maturity Model (CEMM) evaluates the high-level functionality and characteristics that enable equipment to provide the four categories of energy-related services through communication with other entities (e.g., equipment, third parties, utilities, and users). The CEMM will help the U.S. Department of Energy, industry, energy efficiency organizations, and research institutions benchmark the current state of connected equipment and identify capabilities that may be attained to reach a more advanced, future state.

  5. Development of polygonal-surface version of ICRP reference phantoms: Lymphatic node modeling

    Energy Technology Data Exchange (ETDEWEB)

    Thang, Ngyen Tat; Yeom, Yeon Soo; Han, Min Cheol; Kim, Chan Hyeong [Hanyang University, Seoul (Korea, Republic of)

    2014-04-15

    Among the radiosensitive organs/tissues considered in ICRP Publication 103, lymphatic nodes are numerous small tissues distributed widely throughout the ICRP reference phantoms. It is difficult to directly convert the lymphatic nodes of the ICRP reference voxel phantoms to polygonal surfaces. Furthermore, in the ICRP reference phantoms lymphatic nodes were manually drawn in only six lymphatic node regions, and the reference number of lymphatic nodes reported in ICRP Publication 89 was not considered. To address the aforementioned limitations, the present study developed a new lymphatic node modeling method for the polygonal-surface version of the ICRP reference phantoms. Using the developed method, lymphatic nodes were modeled in the preliminary version of the ICRP male polygonal-surface phantom. Then, lymphatic node dose values were calculated and compared with those of the ICRP reference male voxel phantom to validate the developed modeling method. The present study developed the new lymphatic node modeling method and successfully modeled lymphatic nodes in the preliminary version of the ICRP male polygonal-surface phantom. The results demonstrated that the developed modeling method can be used to model lymphatic nodes in polygonal-surface versions of the ICRP reference phantoms.

  6. Operational Risk Modeling

    Directory of Open Access Journals (Sweden)

    Gabriela ANGHELACHE

    2011-06-01

    Full Text Available Losses resulting from operational risk events arise from a complex interaction among organizational factors, personnel and market participants that does not fit a simple classification scheme. Taking into account past losses (e.g., Barings, Daiwa, etc.), we can say that operational risk is a major source of financial losses in the banking sector, even though such losses were until recently underestimated and considered generally minor, not threatening the survival of a bank.

  7. Validation of the Korean Version of the Lewy Body Composite Risk Score (K-LBCRS).

    Science.gov (United States)

    Ryu, Hui Jin; Kim, Minyoung; Moon, Yeonsil; Choi, Yeji; Han, Jee-Young; Galvin, James E; Han, Seol-Heui

    2017-01-01

    The Lewy body composite risk score (LBCRS) is a useful clinical screening tool to help determine whether a dementia is related to Lewy body pathology. The purpose of this study is to verify the reliability, validity, and diagnostic usefulness of the Korean version of the LBCRS (K-LBCRS). The CDR sum of boxes, the Mini-Mental State Examination, and standardized scales related to cognition, mood, behavior, and motor function were administered to a total of 107 subjects, including 30 patients with dementia with Lewy bodies (DLB), 54 with Alzheimer's disease (AD), and 23 cognitively normal elderly people, along with their collateral informants. Internal consistency of the K-LBCRS was good, with a Cronbach's alpha of 0.85, and concurrent validity was also satisfactory, with the K-LBCRS correlating highly with the CDR-SB and other scales. The test-retest reliability was very high, with a Pearson correlation coefficient of 0.97. The mean scores of the K-LBCRS differed significantly among the three groups: DLB (6.2±2.4), AD (1.4±1.3), and controls (0.3±0.6). We identified a cut-off score of 3 as best for differentiating between DLB and AD, with an AUC of 0.97 (95% CI 0.94-1.00), sensitivity 97%, specificity 83%, positive predictive value 76%, and negative predictive value 98%, which is the same score suggested in the original study. This study shows the K-LBCRS to be a useful new screening tool for Korean DLB patients in clinical settings.
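
    A hedged sketch of how such a cutoff is applied and evaluated: scores at or above the reported cut-off of 3 are flagged as suggestive of DLB, and sensitivity and specificity follow from the standard confusion-matrix counts. The scores and labels below are fabricated for illustration only.

    ```python
    # Applying the reported cutoff (score >= 3 suggests DLB rather than AD)
    # and computing sensitivity/specificity from labeled data. The example
    # scores/labels are made up; they are not the study's data.
    def evaluate(scores, is_dlb, cutoff=3):
        tp = sum(s >= cutoff and d for s, d in zip(scores, is_dlb))
        fn = sum(s < cutoff and d for s, d in zip(scores, is_dlb))
        tn = sum(s < cutoff and not d for s, d in zip(scores, is_dlb))
        fp = sum(s >= cutoff and not d for s, d in zip(scores, is_dlb))
        return tp / (tp + fn), tn / (tn + fp)   # sensitivity, specificity

    sens, spec = evaluate(scores=[6, 5, 2, 1, 4, 0], is_dlb=[1, 1, 0, 0, 1, 0])
    print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
    ```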

  8. Wildfire Risk Main Model

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The model combines three modeled fire behavior parameters (rate of spread, flame length, crown fire potential) and one modeled ecological health measure (fire regime...

  9. Response Surface Modeling Tool Suite, Version 1.x

    Energy Technology Data Exchange (ETDEWEB)

    2016-07-05

    The Response Surface Modeling (RSM) Tool Suite is a collection of three codes used to generate an empirical interpolation function for a collection of drag coefficient calculations computed with Test Particle Monte Carlo (TPMC) simulations. The first code, "Automated RSM", reduces the generation of a drag coefficient RSM for a particular object to a single command. "Automated RSM" first creates a Latin Hypercube Sample (LHS) of 1,000 ensemble members to explore the global parameter space. For each ensemble member, a TPMC simulation is performed and the object drag coefficient is computed. In the next step of the "Automated RSM" code, a Gaussian process is used to fit the TPMC simulations. In the final step, Markov Chain Monte Carlo (MCMC) is used to evaluate the non-analytic probability distribution function from the Gaussian process. The second code, "RSM Area", creates a look-up table for the projected area of the object, based on input limits on the minimum and maximum allowed pitch and yaw angles and on the pitch and yaw angle intervals. The projected area from the look-up table is used to compute the ballistic coefficient of the object for its pitch and yaw angle; an accurate ballistic coefficient is crucial for accurately computing the drag on the object. The third code, "RSM Cd", uses the RSM generated by the "Automated RSM" code and the projected area look-up table generated by the "RSM Area" code to accurately compute the drag coefficient and ballistic coefficient of the object. The user can modify the object velocity, object surface temperature, the translational temperature of the gas, the species concentrations of the gas, and the pitch and yaw angles of the object. Together, these codes allow for the accurate derivation of an object's drag coefficient and ballistic coefficient under any conditions, with only knowledge of the object's geometry and mass.
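
    The pipeline described above (Latin Hypercube sampling, a Gaussian-process fit to simulated drag coefficients, and interpolation at a new operating point) can be sketched compactly. In the hypothetical example below, run_tpmc is an invented stand-in for a real TPMC simulation, and the parameter ranges are illustrative assumptions:

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def run_tpmc(params):
    # Stand-in for a Test Particle Monte Carlo run; returns a drag coefficient.
    velocity, temperature = params
    return 2.2 + 0.1 * np.sin(velocity / 1000.0) + 0.05 * np.log(temperature / 300.0)

# 1. Latin Hypercube Sample over (velocity [m/s], surface temperature [K]).
sampler = qmc.LatinHypercube(d=2, seed=42)
samples = qmc.scale(sampler.random(n=1000), [7000.0, 200.0], [8000.0, 400.0])

# 2. Run the (toy) simulations and fit a Gaussian process to the results.
cd = np.array([run_tpmc(p) for p in samples])
gp = GaussianProcessRegressor(kernel=RBF(length_scale=[500.0, 100.0]),
                              normalize_y=True)
gp.fit(samples, cd)

# 3. Interpolate the drag coefficient (with uncertainty) at a new point.
cd_pred, cd_std = gp.predict([[7500.0, 300.0]], return_std=True)
print(f"Cd = {cd_pred[0]:.3f} +/- {cd_std[0]:.3f}")
```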

  10. The "Nordic" HBV model. Description and documentation of the model version developed for the project Climate Change and Energy Production

    Energy Technology Data Exchange (ETDEWEB)

    Saelthun, N.R.

    1996-12-31

    The model described in this report is a version of the HBV model developed for the project Climate Change and Energy Production, a Nordic project aimed at evaluating the impacts of climate change on water resources in the Nordic countries, including Greenland, with emphasis on hydropower production. The model incorporates many of the features found in the individual versions of the HBV model in use in the Nordic countries, and some new ones. It has catchment subdivision in altitude intervals, a simple vegetation parametrization including interception, temperature-based evapotranspiration calculation, lake evaporation, lake routing, glacier mass balance simulation, special functions for climate change simulations, etc. The user interface is very basic, and the model is primarily intended for research and educational purposes; commercial versions of the model should be used for operational implementations. 5 refs., 4 figs., 1 tab.
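
    Several of the listed components (snowmelt and evapotranspiration in particular) are driven by simple temperature-index relations in HBV-type models. The sketch below illustrates that general degree-day logic; the parameter values are illustrative assumptions, not those of the "Nordic" HBV version:

```python
def daily_snowmelt(temp_c, degree_day_factor=3.0, threshold_c=0.0):
    """Degree-day snowmelt: melt (mm/day) proportional to degrees above threshold."""
    return max(0.0, degree_day_factor * (temp_c - threshold_c))

def potential_evapotranspiration(temp_c, pet_factor=0.2):
    """A simple temperature-based potential evapotranspiration estimate (mm/day)."""
    return max(0.0, pet_factor * temp_c)

for t in (-5.0, 2.0, 10.0):
    print(f"T={t:+.0f} C  melt={daily_snowmelt(t):.1f} mm/day  "
          f"PET={potential_evapotranspiration(t):.1f} mm/day")
```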

  11. Credit Risk Modeling

    DEFF Research Database (Denmark)

    Lando, David

    Credit risk is today one of the most intensely studied topics in quantitative finance. This book provides an introduction and overview for readers who seek an up-to-date reference to the central problems of the field and to the tools currently used to analyze them. The book is aimed at researchers...

  12. COMODI: An ontology to characterise differences in versions of computational models in biology

    OpenAIRE

    Scharm, Martin; Waltemath, Dagmar; Mendes, Pedro; Wolkenhauer, Olaf

    2016-01-01

    Motivation: Open model repositories provide ready-to-reuse computational models of biological systems. Models within those repositories evolve over time, leading to many alternative and subsequent versions. Taken together, the underlying changes reflect a model’s provenance and thus can give valuable insights into the studied biology. Currently, however, changes cannot be semantically interpreted. To improve this situation, we developed an ontology of terms describing changes in computational...

  13. 78 FR 76791 - Availability of Version 4.0 of the Connect America Fund Phase II Cost Model; Adopting Current...

    Science.gov (United States)

    2013-12-19

    ... provide additional protection from harsh weather. This version modifies the prior methodology used for..., which provides more detail on the current model architecture, processing steps, and data sources...

  14. A new version of code Java for 3D simulation of the CCA model

    Science.gov (United States)

    Zhang, Kebo; Xiong, Hailing; Li, Chao

    2016-07-01

    In this paper we present a new version of the CCA model program. To benefit from the advantages of the latest technologies, we migrated the running environment from JDK 1.6 to JDK 1.7 and restructured the old program into a new framework, improving its extensibility.

  15. All-Ages Lead Model (Aalm) Version 1.05 (External Draft Report)

    Science.gov (United States)

    The All-Ages Lead Model (AALM) Version 1.05 is an external review draft software and guidance manual. EPA released this software and associated documentation for public review and comment beginning September 27, 2005, until October 27, 2005. The public comments will be accepted...

  16. Using the Global Forest Products Model (GFPM version 2016 with BPMPD)

    Science.gov (United States)

    Joseph Buongiorno; Shushuai Zhu

    2016-01-01

     The GFPM is an economic model of global production, consumption and trade of forest products. The original formulation and several applications are described in Buongiorno et al. (2003). However, subsequent versions, including the GFPM 2016 reflect significant changes and extensions. The GFPM 2016 software uses the...

  17. [Psychometric properties of the French version of the Effort-Reward Imbalance model].

    Science.gov (United States)

    Niedhammer, I; Siegrist, J; Landre, M F; Goldberg, M; Leclerc, A

    2000-10-01

    Two main models are currently used to evaluate psychosocial factors at work: the Job Strain model developed by Karasek and the Effort-Reward Imbalance model. A French version of the first model has been validated for the dimensions of psychological demands and decision latitude. As regards the second, which evaluates three dimensions (extrinsic effort, reward, and intrinsic effort), there are several versions in different languages, but until recently there was no validated French version. The objective of this study was to explore the psychometric properties of the French version of the Effort-Reward Imbalance model in terms of internal consistency, factorial validity, and discriminant validity. The present study was based on the GAZEL cohort and included the 10,174 subjects who were working at the French national electric and gas company (EDF-GDF) and answered the questionnaire in 1998. A French version of the Effort-Reward Imbalance questionnaire was included in this questionnaire; this version was obtained by a standard forward/backward translation procedure. Internal consistency was satisfactory for the three scales of extrinsic effort, reward, and intrinsic effort: Cronbach's alpha coefficients higher than 0.7 were observed. A one-factor solution was retained for the factor analysis of the extrinsic effort scale, and a three-factor solution was retained for the factor analysis of reward. The factor analysis of intrinsic effort did not support the expected four-dimension structure. The analysis of discriminant validity displayed significant associations between measures of Effort-Reward Imbalance and the variables of sex, age, education level, and occupational grade. This study is the first to support satisfactory psychometric properties of the French version of the Effort-Reward Imbalance model. However, the factorial validity of intrinsic effort could be questioned. Furthermore, as most previous studies were based on male samples

  18. Causal Models for Risk Management

    Directory of Open Access Journals (Sweden)

    Neysis Hernández Díaz

    2013-12-01

    Full Text Available In this work, a study of the risk management process in the world's major schools of project management is presented. Project management tools worldwide highlight the need to redefine risk management processes. From the information obtained, the use of causal models for risk analysis is proposed: based on information from the project or company, such models state the risks and their influence on costs, human capital, and project requirements, and detect tasks that do not contribute to the development of the project. A study of causal knowledge representation techniques was carried out, among them Fuzzy Cognitive Maps (FCM) and Bayesian networks; FCM was found the most favorable technique to use because it allows modeling the risk information without requiring an itemized knowledge base.
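
    A Fuzzy Cognitive Map of the kind favored above is simply a signed, weighted concept graph iterated to a fixed point. The sketch below is a minimal, hypothetical example; the risk concepts, weight matrix, and sigmoid slope are invented for illustration and are not taken from the article:

```python
import numpy as np

concepts = ["schedule_risk", "cost_overrun", "staff_turnover", "requirement_churn"]

# W[i, j]: assumed causal influence of concept i on concept j, in [-1, 1].
W = np.array([
    [0.0, 0.7, 0.0, 0.0],
    [0.0, 0.0, 0.4, 0.0],
    [0.3, 0.2, 0.0, 0.0],
    [0.6, 0.5, 0.0, 0.0],
])

def sigmoid(x, slope=2.0):
    return 1.0 / (1.0 + np.exp(-slope * x))

# Iterate the standard FCM update A(t+1) = f(A(t) W) until it stabilizes.
state = np.array([0.8, 0.0, 0.2, 0.5])  # initial risk activations
for _ in range(50):
    new_state = sigmoid(state @ W)
    if np.allclose(new_state, state, atol=1e-6):
        break
    state = new_state

for name, value in zip(concepts, state):
    print(f"{name}: {value:.2f}")
```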

  19. SIMULATION OF COLLECTIVE RISK MODEL

    Directory of Open Access Journals (Sweden)

    Viera Pacáková

    2007-12-01

    Full Text Available The article focuses on providing brief theoretical definitions of the basic terms and methods of modelling and simulating insurance risks in non-life insurance by means of mathematical and statistical methods, using statistical software. While the risk assessment of an insurance company in connection with its solvency is a rather complex and comprehensive problem, its solution starts with statistical modelling of the number and amount of individual claims. Successful solution of these fundamental problems enables the solution of crucial problems of insurance such as modelling and simulation of the collective risk, premium and reinsurance premium calculation, estimation of the probability of ruin, etc. The article also presents some essential ideas underlying Monte Carlo methods and their applications to the modelling of insurance risk. The problem to be solved is to find the probability distribution of the collective risk in a non-life insurance portfolio. Simulation of the compound distribution function of the aggregate claim amount can be carried out if the distribution functions of the claim number process and the claim size are assumed given. Monte Carlo simulation is a suitable method to confirm the results of other methods and for the treatment of catastrophic claims, when small collectives are studied. Analysis of insurance risks using risk theory is an important part of the Solvency II project. Risk theory is the analysis of the stochastic features of the non-life insurance process. The field of application of risk theory has grown rapidly, and there is a need to develop the theory into a form suitable for practical purposes and to demonstrate its application. Modern computer simulation techniques open up a wide field of practical applications for risk theory concepts, without requiring restrictive assumptions and sophisticated mathematics. This article presents some comparisons of traditional actuarial methods and simulation methods for the collective risk model.
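
    The compound-distribution simulation described above is straightforward to sketch. In this minimal example the aggregate claim S = X_1 + ... + X_N is simulated with N ~ Poisson and X_i ~ lognormal; all parameter values are illustrative assumptions, not figures from the article:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n_sims = 20_000
lam = 50.0            # expected number of claims per period
mu, sigma = 8.0, 1.2  # lognormal parameters of individual claim size

# Simulate the aggregate claim amount S = X_1 + ... + X_N for each period.
claim_counts = rng.poisson(lam, size=n_sims)
aggregate = np.array([rng.lognormal(mu, sigma, size=n).sum()
                      for n in claim_counts])

# Empirical summaries of the compound (collective risk) distribution.
print(f"mean aggregate claim : {aggregate.mean():,.0f}")
print(f"99.5% quantile       : {np.quantile(aggregate, 0.995):,.0f}")
print(f"P(S > 1.5 * mean)    : {(aggregate > 1.5 * aggregate.mean()).mean():.4f}")
```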

  20. User guide for MODPATH Version 7—A particle-tracking model for MODFLOW

    Science.gov (United States)

    Pollock, David W.

    2016-09-26

    MODPATH is a particle-tracking post-processing program designed to work with MODFLOW, the U.S. Geological Survey (USGS) finite-difference groundwater flow model. MODPATH version 7 is the fourth major release since its original publication. Previous versions were documented in USGS Open-File Reports 89–381 and 94–464 and in USGS Techniques and Methods 6–A41. MODPATH version 7 works with MODFLOW-2005 and MODFLOW–USG. Support for unstructured grids in MODFLOW–USG is limited to smoothed, rectangular-based quadtree and quadpatch grids. A software distribution package containing the computer program and supporting documentation, such as input instructions, output file descriptions, and example problems, is available from the USGS over the Internet (http://water.usgs.gov/ogw/modpath/).

  1. RAMS Model for Terrestrial Pathways Version 3.0 (for microcomputers). Model-Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Niebla, E.

    1989-01-01

    The RAMS Model for Terrestrial Pathways is a computer program for the calculation of numeric criteria for land application and for distribution and marketing of sludges under the sewage-sludge regulations at 40 CFR Part 503. The risk-assessment models covered assume that municipal sludge with specified characteristics is spread across a defined area of ground at a known rate once each year for a given number of years. Risks associated with direct land application and with sludge applied after distribution and marketing are both calculated. The computer program calculates the maximum annual loading of contaminants that can be land-applied and still meet the risk criteria specified as input. Software Description: The program is written in the Turbo/Basic programming language for implementation on IBM PC/AT or compatible machines using the DOS 3.0 or higher operating system. Minimum core storage is 512K.

  2. Liver Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing liver cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  3. Colorectal Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing colorectal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  4. Cervical Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing cervical cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  5. Prostate Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing prostate cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  6. Pancreatic Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing pancreatic cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  8. Bladder Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing bladder cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  9. Esophageal Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing esophageal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  10. Lung Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing lung cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  11. Breast Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing breast cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  12. Ovarian Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing ovarian cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  13. Testicular Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing testicular cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  14. Site investigation SFR. Hydrogeological modelling of SFR. Model version 0.2

    Energy Technology Data Exchange (ETDEWEB)

    Oehman, Johan (Golder Associates AB (Sweden)); Follin, Sven (SF GeoLogic (Sweden))

    2010-01-15

    The Swedish Nuclear Fuel and Waste Management Company (SKB) has conducted site investigations for a planned extension of the existing final repository for short-lived radioactive waste (SFR). A hydrogeological model is developed in three model versions, which will be used for safety assessment and design analyses. This report presents a data analysis of the currently available hydrogeological data from the ongoing Site Investigation SFR (KFR27, KFR101, KFR102A, KFR102B, KFR103, KFR104, and KFR105). The purpose of this work is to develop a preliminary hydrogeological Discrete Fracture Network model (hydro-DFN) parameterisation that can be applied in regional-scale modelling. During this work, the Geologic model had not yet been updated for the new data set; therefore, all analyses were made for the rock mass outside Possible Deformation Zones, according to the Single Hole Interpretation. Owing to this circumstance, it was decided not to perform a complete hydro-DFN calibration at this stage. Instead, focus was redirected to preparatory test cases and conceptual questions, with the aim of providing a sound strategy for developing the hydrogeological model SFR v. 1.0. The presented preliminary hydro-DFN consists of five fracture sets and three depth domains. A statistical/geometrical approach (connectivity analysis /Follin et al. 2005/) was used to estimate the size (i.e. fracture radius) distribution of fractures that are interpreted as Open in the geologic mapping of core data. Transmissivity relations were established based on an assumed correlation between the size and the evaluated specific capacity of geologic features, coupled to inflows measured by the Posiva Flow Log device (PFL-f data). The preliminary hydro-DFN was applied in flow simulations in order to test its performance and to explore the role of PFL-f data. Several insights were gained and a few model technical issues were raised; these are summarised in Table 5-1.

  15. a Version-Similarity Based Trust Degree Computation Model for Crowdsourcing Geographic Data

    Science.gov (United States)

    Zhou, Xiaoguang; Zhao, Yijiang

    2016-06-01

    Quality evaluation and control has become the main concern of VGI (volunteered geographic information). In this paper, trust is used as a proxy of VGI quality, and a version-similarity based trust degree computation model for crowdsourcing geographic data is presented. The model is based on the assumption that the quality of a VGI object is mainly determined by the professional skill and integrity (together called reputation in this paper) of its contributors, and that a contributor's reputation carries over between objects. The contributor's reputation is calculated using the degree of similarity among the multiple versions describing the same entity state. The trust degree of a VGI object is determined by the trust degree of its previous version, the reputation of the last contributor, and the proportion of the object that was modified. To verify the presented model, a prototype system for computing the trust degree of VGI objects was developed in Visual C# 2010. The historical data of Berlin in OpenStreetMap (OSM) are employed for the experiments. The experimental results demonstrate that the quality of crowdsourcing geographic data is highly positively correlated with its trustworthiness. As the evaluation is based on version similarity, not on direct subjective evaluation among users, the result is objective. Furthermore, as the transferability of contributors' reputation is used in the presented method, it achieves a higher assessment coverage than existing methods.
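
    A minimal sketch of this trust bookkeeping is given below, under assumed forms for both the reputation estimate and the trust update; the weighting scheme is an illustrative guess, not the authors' exact formulation:

```python
def contributor_reputation(similarities):
    """Reputation as the mean similarity between a contributor's past edits
    and later accepted versions of the same entities (values in [0, 1])."""
    return sum(similarities) / len(similarities) if similarities else 0.5

def trust_update(prev_trust, reputation, modified_fraction):
    """Trust of a new version: the unchanged part keeps the previous trust,
    while the modified part is vouched for by the editor's reputation."""
    return (1 - modified_fraction) * prev_trust + modified_fraction * reputation

# Example: an object at trust 0.6 is edited (40% modified) by a contributor
# whose past edits closely match the later accepted versions.
rep = contributor_reputation([0.9, 0.85, 0.95])
new_trust = trust_update(prev_trust=0.6, reputation=rep, modified_fraction=0.4)
print(f"contributor reputation: {rep:.2f}, updated trust: {new_trust:.2f}")
```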

  16. Multilevel joint competing risk models

    Science.gov (United States)

    Karunarathna, G. H. S.; Sooriyarachchi, M. R.

    2017-09-01

    Joint modeling approaches are often used for different outcomes, such as competing-risk time-to-event and count outcomes, in biomedical and epidemiological studies in the presence of cluster effects. Hospital length of stay (LOS) is a widely used outcome measure of hospital utilization, as it is a benchmark measurement involving multiple terminations such as discharge, transfer, death, and patients who have not completed the event of interest by the end of follow-up (censored) during hospitalization. Competing risk models provide a method for addressing such multiple destinations, since classical time-to-event models yield biased results when there are multiple event types. In this study, the concept of joint modeling is applied to dengue epidemiology data from Sri Lanka, 2006-2008, to assess the relationship between the different LOS outcomes and the platelet count of dengue patients, with a district-level cluster effect. Two key approaches were used to build the joint scenario. In the first approach, each competing risk is modeled separately using a binary logistic model, treating all other events as censored, under a multilevel discrete time-to-event model, while the platelet counts are assumed to follow a lognormal regression model. The second approach is based on the endogeneity effect in the multilevel competing risks and count model. Model parameters were estimated by maximum likelihood based on the Laplace approximation. Moreover, the study reveals that the joint modeling approach yields more precise results than fitting two separate univariate models, in terms of AIC (Akaike Information Criterion).
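
    The first approach above (one binary logistic model per competing risk, with all other events treated as censored) can be sketched as follows. The covariates, outcome codes, and synthetic data are invented for illustration, and the multilevel (district) random effect is omitted for brevity:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 1000
X = np.column_stack([rng.normal(size=n),           # e.g. log platelet count
                     rng.integers(0, 2, size=n)])  # e.g. sex indicator
# Outcome codes: 0 = censored, 1 = discharged, 2 = transferred, 3 = died.
outcome = rng.choice([0, 1, 2, 3], size=n, p=[0.1, 0.7, 0.1, 0.1])

# Fit one cause-specific model per event, censoring all other events.
models = {}
for event in (1, 2, 3):
    y = (outcome == event).astype(int)
    models[event] = LogisticRegression().fit(X, y)

# Predicted event probabilities for a new patient.
x_new = [[0.5, 1]]
for event, model in models.items():
    print(f"event {event}: P = {model.predict_proba(x_new)[0, 1]:.3f}")
```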

  17. Risk modelling in portfolio optimization

    Science.gov (United States)

    Lam, W. H.; Jaaman, Saiful Hafizah Hj.; Isa, Zaidi

    2013-09-01

    Risk management is very important in portfolio optimization. The mean-variance model has been used in portfolio optimization to minimize investment risk: its objective is to minimize the portfolio risk, measured by the variance, while achieving a target rate of return. The purpose of this study is to compare the composition and performance of the optimal portfolio of the mean-variance model with those of the equally weighted portfolio, in which equal proportions are invested in each asset. The results show that the compositions of the mean-variance optimal portfolio and the equally weighted portfolio are different, and that the mean-variance optimal portfolio performs better because it yields a higher performance ratio than the equally weighted portfolio.
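
    The mean-variance problem described above (minimize portfolio variance subject to a target return and full investment) can be sketched with a generic solver; the expected returns, covariance matrix, and target below are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.08, 0.12, 0.10])            # expected asset returns
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.06]])         # return covariance matrix
target = 0.10

constraints = [
    {"type": "eq", "fun": lambda w: w.sum() - 1.0},    # fully invested
    {"type": "eq", "fun": lambda w: w @ mu - target},  # hit target return
]
bounds = [(0.0, 1.0)] * len(mu)                        # long-only

result = minimize(lambda w: w @ cov @ w, x0=np.ones(3) / 3,
                  bounds=bounds, constraints=constraints)
w_opt, w_eq = result.x, np.ones(3) / 3

for name, w in [("mean-variance", w_opt), ("equal-weight", w_eq)]:
    ret, risk = w @ mu, np.sqrt(w @ cov @ w)
    print(f"{name:13s} weights={np.round(w, 3)} return={ret:.3f} risk={risk:.3f}")
```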

  18. A Fast Version of LASG/IAP Climate System Model and Its 1000-year Control Integration

    Institute of Scientific and Technical Information of China (English)

    ZHOU Tianjun; WU Bo; WEN Xinyu; LI Lijuan; WANG Bin

    2008-01-01

    A fast version of the State Key Laboratory of Numerical Modeling for Atmospheric Sciences and Geophysical Fluid Dynamics (LASG)/Institute of Atmospheric Physics (IAP) climate system model is briefly documented. The fast coupled model employs a low-resolution version of the atmospheric component, the Grid Atmospheric Model of IAP/LASG (GAMIL), with the other parts of the model, namely the oceanic component LASG/IAP Climate Ocean Model (LICOM), the land component Common Land Model (CLM), and the sea ice component from the National Center for Atmospheric Research Community Climate System Model (NCAR CCSM2), the same as in the standard version of the LASG/IAP Flexible Global Ocean Atmosphere Land System model (FGOALS_g). The parameterizations of physical and dynamical processes of the atmospheric component in the fast version are identical to the standard version, although some parameter values are different. However, by virtue of the reduced horizontal resolution and increased time-step of the most time-consuming atmospheric component, it runs faster by a factor of 3 and can serve as a useful tool for long-term and large-ensemble integrations. A 1000-year control simulation of the present-day climate has been completed without flux adjustments. The final 600 years of this simulation have virtually no trends in global mean sea surface temperatures and are recommended for internal variability studies. Several aspects of the control simulation's mean climate and variability are evaluated against observational or reanalysis data, and the strengths and weaknesses of the control simulation are assessed. The mean atmospheric circulation is well simulated, except in high latitudes. The Asian-Australian monsoonal meridional cell shows realistic features; however, an artificial rainfall center located on the eastern periphery of the Tibetan Plateau persists throughout the year. The mean bias of SST resembles that of the standard version, appearing as a "double ITCZ" (Intertropical Convergence Zone)

  19. The Lagrangian particle dispersion model FLEXPART-WRF VERSION 3.1

    Energy Technology Data Exchange (ETDEWEB)

    Brioude, J.; Arnold, D.; Stohl, A.; Cassiani, M.; Morton, Don; Seibert, P.; Angevine, W. M.; Evan, S.; Dingwell, A.; Fast, Jerome D.; Easter, Richard C.; Pisso, I.; Bukhart, J.; Wotawa, G.

    2013-11-01

    The Lagrangian particle dispersion model FLEXPART was originally designed for calculating long-range and mesoscale dispersion of air pollutants from point sources, such as after an accident in a nuclear power plant. In the meantime, FLEXPART has evolved into a comprehensive tool for atmospheric transport modeling and analysis at different scales. This multiscale need from the modeler community has encouraged new developments in FLEXPART. In this document, we present a version that works with the Weather Research and Forecasting (WRF) mesoscale meteorological model. Simple procedures on how to run FLEXPART-WRF are presented, along with special options and features that differ from its predecessor versions. In addition, test case data, the source code, and visualization tools are provided to the reader as supplementary material.

  20. Cancer Genetics Risk Assessment and Counseling (PDQ®)—Health Professional Version

    Science.gov (United States)

    Expert-reviewed information summary in which cancer risk perception, risk communication, and risk counseling are discussed. The summary also contains information about recording and analyzing a family history of cancer and factors to consider when offering genetic testing.

  1. Thermal site descriptive model. A strategy for the model development during site investigations - version 2

    Energy Technology Data Exchange (ETDEWEB)

    Back, Paer-Erik; Sundberg, Jan [Geo Innova AB (Sweden)

    2007-09-15

    This report presents a strategy for describing, predicting and visualising the thermal aspects of the site descriptive model. The strategy is an updated version of an earlier strategy applied in all SDM versions during the initial site investigation phase at the Forsmark and Oskarshamn areas. The previous methodology for thermal modelling did not take the spatial correlation fully into account during simulation, with the result that the variability of thermal conductivity in the rock mass was not sufficiently well described. Experience from earlier thermal SDMs indicated that development of the methodology was required in order to describe the spatial distribution of thermal conductivity in the rock mass in a sufficiently reliable way, taking both the variability within rock types and the variability between rock types into account. A good description of the thermal conductivity distribution is especially important for the lower tail, which matters for the design of a repository because it affects the canister spacing. The presented approach is developed to be used for the final SDM regarding thermal properties, primarily thermal conductivity. The specific objectives of the strategy for thermal stochastic modelling are: description, a statistical description of the thermal conductivity of a rock domain; prediction, the prediction of thermal conductivity in a specific rock volume; and visualisation, the visualisation of the spatial distribution of thermal conductivity. The thermal site descriptive model should include the temperature distribution and the thermal properties of the rock mass. The temperature is the result of the thermal processes in the repository area. Determination of thermal transport properties can be made using different methods, such as laboratory investigations, field measurements, modelling from mineralogical composition and distribution, modelling from density logging, and modelling from temperature logging. The different types of data represent different scales, which has to be

  2. The Hamburg Oceanic Carbon Cycle Circulation Model. Version 1. Version 'HAMOCC2s' for long time integrations

    Energy Technology Data Exchange (ETDEWEB)

    Heinze, C.; Maier-Reimer, E. [Max-Planck-Institut fuer Meteorologie, Hamburg (Germany)

    1999-11-01

    The Hamburg Ocean Carbon Cycle Circulation Model (HAMOCC, configuration HAMOCC2s) predicts the atmospheric carbon dioxide partial pressure (as induced by oceanic processes), production rates of biogenic particulate matter, and geochemical tracer distributions in the water column as well as in the bioturbated sediment. Besides the carbon cycle, this model version also includes the marine silicon cycle (silicic acid in the water column and the sediment pore waters, biological opal production, opal flux through the water column, and opal-sediment pore water interaction). The model is based on the grid and geometry of the LSG ocean general circulation model (see the corresponding manual; LSG = Large Scale Geostrophic) and uses a velocity field provided by the LSG model in a 'frozen' state. In contrast to the earlier version of the model (see Report No. 5), the present version includes a multi-layer sediment model of the bioturbated sediment zone, allowing for variable tracer inventories within the complete model system. (orig.)

  3. A one-dimensional material transfer model for HECTR version 1.5

    Energy Technology Data Exchange (ETDEWEB)

    Geller, A.S.; Wong, C.C.

    1991-08-01

    HECTR (Hydrogen Event Containment Transient Response) is a lumped-parameter computer code developed for calculating the pressure-temperature response to combustion in a nuclear power plant containment building. The code uses a control-volume approach and subscale models to simulate the mass, momentum, and energy transfer occurring in the containment during a loss-of-coolant accident (LOCA). This document describes one-dimensional subscale models for mass and momentum transfer, and the modifications to the code required to implement them. Two problems were analyzed: the first corresponding to a standard problem studied with previous HECTR versions, the second to experiments. The performance of the revised code relative to previous HECTR versions is discussed, as is the ability of the code to model the experiments. 8 refs., 5 figs., 3 tabs.

  4. Description and evaluation of the Model for Ozone and Related chemical Tracers, version 4 (MOZART-4

    Directory of Open Access Journals (Sweden)

    L. K. Emmons

    2010-01-01

    Full Text Available The Model for Ozone and Related chemical Tracers, version 4 (MOZART-4) is an offline global chemical transport model particularly suited for studies of the troposphere. The updates of the model from its previous version, MOZART-2, are described, including an expansion of the chemical mechanism to include more detailed hydrocarbon chemistry and bulk aerosols. Online calculations of a number of processes, such as dry deposition, emissions of isoprene and monoterpenes, and photolysis frequencies, are now included. Results from an eight-year simulation (2000–2007) are presented and evaluated. The MOZART-4 source code and standard input files are available for download from the NCAR Community Data Portal (http://cdp.ucar.edu).

  5. The global chemistry transport model TM5: description and evaluation of the tropospheric chemistry version 3.0

    NARCIS (Netherlands)

    Huijnen, V.; Williams, J.; van Weele, M.; van Noije, T.; Krol, M.; Dentener, F.; Segers, A.; Houweling, S.; Peters, W.; de Laat, J.; Boersma, F.; Bergamaschi, P.; van Velthoven, P.; Le Sager, P.; Eskes, H.; Alkemade, F.; Scheele, R.; Nédélec, P.; Pätz, H.-W.

    2010-01-01

    We present a comprehensive description and benchmark evaluation of the tropospheric chemistry version of the global chemistry transport model TM5 (Tracer Model 5, version TM5-chem-v3.0). A full description is given concerning the photochemical mechanism, the interaction with aerosol, the treatment o

  6. Scope Complexity Options Risks Excursions (SCORE) Version 3.0 Mathematical Description.

    Energy Technology Data Exchange (ETDEWEB)

    Gearhart, Jared Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Samberson, Jonell Nicole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shettigar, Subhasini [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jungels, John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Welch, Kimberly M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jones, Dean A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-03-01

    The purpose of the Scope, Complexity, Options, Risks, Excursions (SCORE) model is to estimate the relative complexity of design variants of future warhead options. The results of this model allow those considering these options to understand the complexity tradeoffs between proposed warhead options. The core idea of SCORE is to divide a warhead option into a well-defined set of scope elements and then estimate the complexity of each scope element against a well-understood reference system. The uncertainty associated with the estimates can also be captured. A weighted summation of the relative complexity of each scope element is used to determine the total complexity of the proposed warhead option or of portions of the warhead option (i.e., a National Work Breakdown Structure code). The SCORE analysis process is a growing multi-organizational Nuclear Security Enterprise (NSE) effort, under the management of the NA-12 led Enterprise Modeling and Analysis Consortium (EMAC), that has provided the data elicitation, integration, and computation needed to support the out-year Life Extension Program (LEP) cost estimates included in the Stockpile Stewardship Management Plan (SSMP).
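
    The weighted roll-up described above reduces to a short calculation. In the hypothetical sketch below, the scope elements, weights, relative complexity scores, and uncertainty treatment are all invented for illustration; uncertainties are combined in quadrature assuming independent estimates:

```python
import numpy as np

# (scope element, weight, mean relative complexity, std of the estimate)
elements = [
    ("element A", 0.5, 1.8, 0.3),
    ("element B", 0.3, 1.2, 0.2),
    ("element C", 0.2, 1.5, 0.4),
]

weights = np.array([e[1] for e in elements])
means = np.array([e[2] for e in elements])
stds = np.array([e[3] for e in elements])

# Weighted summation of relative complexity, with uncertainty propagation.
total_mean = weights @ means
total_std = np.sqrt(np.sum(weights**2 * stds**2))
print(f"total relative complexity: {total_mean:.2f} +/- {total_std:.2f}")
```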

  7. Actuarial Assessment of Sex Offender Recidivism Risk: A Validation of the German version of the Static-99

    Directory of Open Access Journals (Sweden)

    Martin Rettenberger

    2006-12-01

    Full Text Available The Static-99 and the RRASOR are actuarial risk assessment tools for evaluating the risk of sexual and violent recidivism in sexual offenders. The Static-99 was developed in 1999 by Karl R. Hanson (Canada) and David Thornton (Great Britain) and is in the meantime regularly used for risk assessment in North America and some countries in Europe. The RRASOR can be described as a predecessor of the Static-99 and was published by Hanson in 1997. First, we translated the revised version of the Static-99 (Harris, Phenix, Hanson & Thornton, 2003) and adapted the instrument and the manual to the forensic context in Germany and Austria (Rettenberger & Eher, 2006). In this retrospective study, the interrater reliability and concurrent validity of the RRASOR and of the German adaptation of the Static-99 are presented. Furthermore, we evaluated the predictive accuracy of the Static-99 and the RRASOR and compared their results. The instruments were validated on file information of Austrian sexual offenders who were convicted between 1968 and 2002. Both the Static-99 and the RRASOR had good interrater reliability and concurrent validity. The Static-99 showed good predictive validity for general (r = .41, AUC = .74), sexual (r = .35, AUC = .74), and violent (r = .41, AUC = .76) recidivism, whereas the predictive accuracy of the RRASOR was moderate for general (r = .29, AUC = .66), sexual (r = .30, AUC = .68), and violent (r = .28, AUC = .67) recidivism. The Static-99 exhibited a higher accuracy for the prediction of sexual offender recidivism. Although further validation studies on German-speaking populations of sex offenders are necessary, these results support the utility of the German version of the revised Static-99 in improving risk assessment of sexual offenders.

  8. New versions of the BDS/GNSS zenith tropospheric delay model IGGtrop

    Science.gov (United States)

    Li, Wei; Yuan, Yunbin; Ou, Jikun; Chai, Yanju; Li, Zishen; Liou, Yuei-An; Wang, Ningbo

    2015-01-01

    The initial IGGtrop model proposed for the Chinese BDS (BeiDou System) is not very suitable for BDS/GNSS research and application due to its large data volume, although it shows a global mean accuracy of 4 cm. New versions of the global zenith tropospheric delay (ZTD) model IGGtrop are developed through further investigation of the spatial and temporal characteristics of global ZTD. From global GNSS ZTD observations and weather reanalysis data, new ZTD characteristics are found and discussed in this study, including: small and inconsistent seasonal variation in ZTD between and stable seasonal variation outside; weak zonal variation in ZTD at higher latitudes (north of and south of ) and at heights above 6 km, etc. Based on these analyses, new versions of IGGtrop, named , are established by employing corresponding strategies: using a simple algorithm for equatorial ZTD; generating an adaptive spatial grid with lower resolutions in regions where ZTD varies little; and creating a method for optimized storage of model parameters. Thus, the new models require far fewer parameters than the IGGtrop model, only 3.1-21.2% of that for the IGGtrop model. The three new versions are validated against five years of GNSS-derived ZTDs at 125 IGS sites, and the results show that: demonstrates the highest ZTD correction performance, similar to IGGtrop; requires the fewest model parameters; and is moderate in both zenith delay prediction performance and number of model parameters. For the model, the biases at those IGS sites are between and 4.3 cm with a mean value of cm, and the RMS errors are between 2.1 and 8.5 cm with a mean value of 4.0 cm. Different BDS and other GNSS users can choose a suitable model according to their application and research requirements.

  9. Digital elevation models for site investigation programme in Oskarshamn. Site description version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Brydsten, Lars; Stroemgren, Maarten [Umeaa Univ. (Sweden). Dept. of Biology and Environmental Science

    2005-06-01

    In the Oskarshamn area, a digital elevation model has been produced using elevation data from many sources on both land and sea. Many users are interested only in elevation models over land, so the model has been designed in three versions: version 1 describes the land surface, lake water surfaces, and the sea bottom; version 2 describes the land surface, sediment levels at lake bottoms, and the sea bottom; version 3 describes the land surface, sediment levels at lake bottoms, and the sea surface. In cases where the different sources of data were not in point form (such as existing elevation models of land or depth lines from nautical charts), they have been converted to point values using GIS software. Because data from some sources often overlap with data from other sources, several tests were conducted to determine whether both sources of data, or only one source, would be included in the dataset used for the interpolation procedure. The tests resulted in the decision to use only the source judged to be of highest quality for most areas with overlapping data sources. All data were combined into a database of approximately 3.3 million points unevenly spread over an area of about 800 km². The large number of data points made it difficult to construct the model with a single interpolation procedure, so the area was divided into 28 sub-models that were processed one by one and finally merged into one single model. The software ArcGis 8.3 and its extension Geostatistical Analyst were used for the interpolation. The Ordinary Kriging method was used; this method allows both a cross-validation and a validation before the interpolation is conducted. Cross-validation with different kriging parameters was performed, and the model with the most reasonable statistics was chosen. Finally, a validation with the most appropriate kriging parameters was performed in order to verify that the model fits unmeasured localities. Since both the
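
    Ordinary kriging as used above can be sketched compactly. The example below solves the ordinary kriging system for one target point, with a synthetic point set and an assumed exponential variogram (the sill, range, and nugget values are illustrative, not those chosen for the Oskarshamn model):

```python
import numpy as np

rng = np.random.default_rng(11)
pts = rng.uniform(0, 100, size=(30, 2))               # known (x, y) locations
z = 10 + 0.05 * pts[:, 0] + rng.normal(0, 0.5, 30)    # known elevations

def variogram(h, sill=1.0, rng_=40.0, nugget=0.05):
    # Assumed exponential variogram model.
    return nugget + sill * (1 - np.exp(-3 * h / rng_))

def krige(target):
    d = np.linalg.norm(pts - target, axis=1)
    D = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
    n = len(pts)
    # Ordinary kriging system with a Lagrange multiplier for unbiasedness.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(D)
    A[n, n] = 0.0
    b = np.append(variogram(d), 1.0)
    w = np.linalg.solve(A, b)
    return w[:n] @ z

print(f"interpolated elevation at (50, 50): {krige(np.array([50.0, 50.0])):.2f}")
```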

  10. The Terrestrial Investigation Model: A probabilistic risk assessment model for birds exposed to pesticides

    Science.gov (United States)

    One of the major recommendations of the National Academy of Science to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....

  11. A new tool for modeling dune field evolution based on an accessible, GUI version of the Werner dune model

    Science.gov (United States)

    Barchyn, Thomas E.; Hugenholtz, Chris H.

    2012-02-01

    Research into aeolian dune form and dynamics has benefited from simple and abstract cellular automata computer models. Many of these models are based upon a seminal framework proposed by Werner (1995). Unfortunately, most versions of this model are not publicly available or are not provided in a format that promotes widespread use. In our view, this hinders progress in linking model simulations to empirical data (and vice versa). To this end, we introduce an accessible, graphical user interface (GUI) version of the Werner model. The novelty of this contribution is that it provides a simple interface and detailed instructions that encourage widespread use and extension of the Werner dune model for research and training purposes. By lowering barriers for researchers to develop and test hypotheses about aeolian dune and dune field patterns, this release addresses recent calls to improve access to earth surface models.
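
    The Werner framework mentioned above is compact enough to sketch directly. The following is a stripped-down, illustrative version with an assumed hop length, assumed deposition probabilities, and a crude shadow test; avalanching at the angle of repose, part of the full Werner model, is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)
ROWS, COLS, HOP = 50, 50, 5
P_SAND, P_BARE = 0.6, 0.4                   # deposition probabilities (assumed)
H = rng.integers(0, 4, size=(ROWS, COLS))   # sand slab heights

def in_shadow(h, i, j):
    # Crude upwind shadow test: a nearby upwind cell tall enough to
    # shelter cell (i, j) forces deposition there.
    for d in range(1, 6):
        if h[i, (j - d) % COLS] >= h[i, j] + d:
            return True
    return False

for _ in range(50_000):
    i, j = rng.integers(0, ROWS), rng.integers(0, COLS)
    if H[i, j] == 0:
        continue
    H[i, j] -= 1                    # erode one slab
    while True:
        j = (j + HOP) % COLS        # transport downwind (periodic boundary)
        p = 1.0 if in_shadow(H, i, j) else (P_SAND if H[i, j] > 0 else P_BARE)
        if rng.random() < p:
            H[i, j] += 1            # deposit the slab
            break

print("final max slab height:", H.max())
```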

  12. Multifractal Value at Risk model

    Science.gov (United States)

    Lee, Hojin; Song, Jae Wook; Chang, Woojin

    2016-06-01

    In this paper a new Value at Risk (VaR) model is proposed and investigated. We consider the multifractal property of financial time series and develop a multifractal Value at Risk (MFVaR) model. The MFVaR introduced in this paper is analytically tractable and not based on simulation. An empirical study showed that MFVaR provides more stable and accurate forecasting performance in volatile financial markets where large losses can be incurred. This implies that our multifractal VaR works well for the risk measurement of extreme credit events.

  13. Thermal modelling. Preliminary site description. Forsmark area - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Sundberg, Jan; Back, Paer-Erik; Bengtsson, Anna; Laendell, Maerta [Geo Innova AB, Linkoeping (Sweden)

    2005-08-01

    This report presents the thermal site descriptive model for the Forsmark area, version 1.2. The main objective of this report is to present the thermal modelling work, in which data have been identified, quality controlled, evaluated, and summarised in order to make an upscaling to lithological domain level possible. The thermal conductivity at canister scale has been modelled for two different lithological domains (RFM029 and RFM012, both dominated by granite to granodiorite (101057)). A main modelling approach has been used to determine the mean value of the thermal conductivity, and two alternative/complementary approaches have been used to evaluate the spatial variability of the thermal conductivity at domain level. The thermal modelling approaches are based on the lithological model for the Forsmark area, version 1.2, together with rock type models constituted from measured and calculated (from mineral composition) thermal conductivities. Results indicate that the mean thermal conductivity is expected to exhibit only a small variation between the different domains, from 3.46 W/(m·K) for RFM012 to 3.55 W/(m·K) for RFM029. The spatial distribution of the thermal conductivity does not follow a simple model. Lower and upper 95% confidence limits are based on the modelling results, but have been rounded off to only two significant figures; consequently, the lower limit is 2.9 W/(m·K) and the upper is 3.8 W/(m·K), applicable to both of the investigated domains. The temperature dependence is rather small, with a decrease in thermal conductivity of 10.0% per 100 deg C increase in temperature for the dominating rock type. There are a number of important uncertainties associated with these results. One of the uncertainties concerns the representative scale for the canister. Another important uncertainty is the methodological uncertainty associated with the upscaling of thermal conductivity from cm scale to canister scale. In addition, the representativeness of rock samples is

  14. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes, and it is important to understand the relative significance of each of these causes when making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analysis and probabilistic risk assessment models.

  15. Shortened version of the work ability index to identify workers at risk of long-term sickness absence.

    Science.gov (United States)

    Schouten, Lianne S; Bültmann, Ute; Heymans, Martijn W; Joling, Catelijne I; Twisk, Jos W R; Roelen, Corné A M

    2016-04-01

    The Work Ability Index (WAI) identifies non-sicklisted workers at risk of future long-term sickness absence (LTSA). The WAI is a complicated instrument and inconvenient for use in large-scale surveys. We investigated whether shortened versions of the WAI identify non-sicklisted workers at risk of LTSA. Prospective study including two samples of non-sicklisted workers participating in occupational health checks between 2010 and 2012. A heterogeneous development sample (N= 2899) was used to estimate logistic regression coefficients for the complete WAI, a shortened WAI version without the list of diseases, and single-item Work Ability Score (WAS). These three instruments were calibrated for predictions of different (≥2, ≥4 and ≥6 weeks) LTSA durations in a validation sample of non-sicklisted workers (N= 3049) employed at a steel mill, differentiating between manual (N= 1710) and non-manual (N= 1339) workers. The discriminative ability was investigated by receiver operating characteristic analysis. All three instruments under-predicted the LTSA risks in both manual and non-manual workers. The complete WAI discriminated between individuals at high and low risk of LTSA ≥2, ≥4 and ≥6 weeks in manual and non-manual workers. Risk predictions and discrimination by the shortened WAI without the list of diseases were as good as the complete WAI. The WAS showed poorer discrimination in manual and non-manual workers. The WAI without the list of diseases is a good alternative to the complete WAI to identify non-sicklisted workers at risk of future LTSA durations ≥2, ≥4 and ≥6 weeks. © The Author 2015. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.

  16. Information risk and security modeling

    Science.gov (United States)

    Zivic, Predrag

    2005-03-01

    This research paper presentation will feature current frameworks for addressing risk and security modeling and metrics. The paper will analyze technical-level risk and security metrics of Common Criteria/ISO 15408, the Center for Internet Security guidelines, and the NSA configuration guidelines, along with the metrics used at that level. The view of IT operational standards on security metrics, such as GMITS/ISO 13335 and ITIL/ITMS, and of architectural guidelines such as ISO 7498-2, will be explained. Business-process-level standards such as ISO 17799, COSO, and CobiT will be presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO 21827, the NSA INFOSEC Assessment, and CobiT will be explored and reviewed. For each defined level of security metrics, the presentation will explore the appropriate usage of these standards and will discuss standards-based approaches to conducting risk and security metrics. The findings demonstrate the need for a common baseline for both risk and security metrics, and show the relation between an attribute-based common baseline and corporate assets and controls; such an approach spans all of the mentioned standards. The proposed 3D visual presentation of the approach and the development of the Information Security Model will be analyzed and postulated. The presentation will clearly demonstrate the benefits of the proposed attribute-based approach and of a defined risk and security space for modeling and measurement.

  17. Cabin Environment Physics Risk Model

    Science.gov (United States)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

    This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of-failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of crew-critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.

  18. Risk Modelling of Agricultural Products

    Science.gov (United States)

    Nugrahani, E. H.

    2017-03-01

    In the real-world market, agricultural commodities are subject to fluctuating prices. This means that the prices of agricultural products are relatively volatile, making agriculture a rather risky business for farmers. This paper presents some mathematical models of such risks, in the form of price volatility, based on certain assumptions. The proposed models are a time-varying volatility model, together with variants incorporating mean reversion and a seasonal mean equation. Implementation on empirical data shows that agricultural products are indeed risky.
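
    A minimal sketch of the time-varying volatility idea is given below, using a GARCH(1,1)-style recursion of the general kind such models employ; the synthetic returns and the parameter values are assumptions for illustration, not the paper's fitted values:

```python
import numpy as np

rng = np.random.default_rng(3)
returns = rng.normal(0.0, 0.02, size=500)  # stand-in daily log returns

omega, alpha, beta = 1e-6, 0.08, 0.90      # assumed GARCH(1,1) parameters
var = np.empty_like(returns)
var[0] = returns.var()
# Conditional variance recursion: today's variance responds to yesterday's
# squared shock and decays toward a long-run level.
for t in range(1, len(returns)):
    var[t] = omega + alpha * returns[t - 1]**2 + beta * var[t - 1]

annualized_vol = np.sqrt(252 * var[-1])
print(f"latest annualized volatility estimate: {annualized_vol:.1%}")
```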

  19. The Digital Astronaut Project Computational Bone Remodeling Model (Beta Version) Bone Summit Summary Report

    Science.gov (United States)

    Pennline, James; Mulugeta, Lealem

    2013-01-01

    Under the conditions of microgravity, astronauts lose bone mass at a rate of 1% to 2% a month, particularly in the lower extremities such as the proximal femur [1-3]. The most commonly used countermeasure against bone loss in microgravity has been prescribed exercise [4]. However, data have shown that existing exercise countermeasures are not as effective as desired for preventing bone loss in long-duration (4 to 6 month) spaceflight [1,3,5,6]. This spaceflight-related bone loss may cause early onset of osteoporosis and place astronauts at greater risk of fracture later in their lives. Consequently, NASA seeks an improved understanding of the mechanisms of bone demineralization in microgravity in order to appropriately quantify this risk and to establish appropriate countermeasures [7]. In this light, NASA's Digital Astronaut Project (DAP) is working with the NASA Bone Discipline Lead to implement well-validated computational models to help predict and assess bone loss during spaceflight and to enhance exercise countermeasure development. More specifically, computational modeling is proposed as a way to augment bone research and exercise countermeasure development to target the weight-bearing skeletal sites that are most susceptible to bone loss in microgravity, and thus at higher risk for fracture. Given that hip fractures can be debilitating, the initial model development focused on the femoral neck. Future efforts will focus on including other key load-bearing bone sites such as the greater trochanter, lower lumbar spine, proximal femur, and calcaneus. The DAP has currently established an initial model (Beta Version) of bone loss due to skeletal unloading in the femoral neck region. The model calculates changes in the mineralized volume fraction of bone in this segment and relates them to changes in volumetric bone mineral density (vBMD) as measured by Quantitative Computed Tomography (QCT). The model is governed by equations describing changes in bone volume fraction (BVF), and rates of

  20. Manual risk zoning wind turbines. Final version. 3rd updated version May 2013. 3 ed.; Handboek risicozonering windturbines. Eindversie. 3e geactualiseerde versie mei 2013

    Energy Technology Data Exchange (ETDEWEB)

    Faasen, C.J.; Franck, P.A.L.; Taris, A.M.H.W. [DNV KEMA, Arnhem (Netherlands)

    2013-05-15

    The title manual has been drafted to make it possible to quickly inventory the safety risks of wind turbines, which can speed up the licensing procedures for the installation of wind turbines. Attention is paid to the following categories: building areas, roads, waterways, railroads, industrial areas, underground cables and mains, aboveground cables, power transmission lines, dikes and dams, and telecommunication. For each of these categories the risks are calculated. This is an updated version of the manual, which was first published in 2000 and updated in 2005. [Dutch, translated] This is an update of the 2005 Manual. The original Manual was drawn up in 2000. The Manual published in 2000 was compiled by ECN on behalf of Novem (now: Agentschap NL) with the aim of offering a uniform method for carrying out quantitative risk analyses and for testing the results against acceptance criteria. This Manual answered the demand of both project developers and authorities for a generally applicable method to calculate the safety risks of wind turbines for various aspects of their surroundings. In 2005 an update of the Manual was issued by ECN and KEMA on behalf of SenterNovem (now part of Agentschap NL), going into more detail on turbines with larger capacities and adding worked calculation examples. Because of the further development of wind turbine technology, amended legislation, the fact that the calculation models had become outdated, and the fact that various infrastructure is increasingly realised bundled or clustered with wind turbines, it is desirable and necessary to map the potential risks in a consistent and unambiguous way. Agentschap NL therefore commissioned DNV KEMA in 2012 to update the Manual once again. In doing so, the results of the report 'Rekenmethodiek zonering windturbines in relatie tot gas- en elektrische infrastructuur' (2012), which in

  1. Incremental testing of the Community Multiscale Air Quality (CMAQ) modeling system version 4.7

    Directory of Open Access Journals (Sweden)

    K. M. Foley

    2010-03-01

    This paper describes the scientific and structural updates to the latest release of the Community Multiscale Air Quality (CMAQ) modeling system version 4.7 (v4.7) and points the reader to additional resources for further details. The model updates were evaluated relative to observations and results from previous model versions in a series of simulations conducted to incrementally assess the effect of each change. The focus of this paper is on five major scientific upgrades: (a) updates to the heterogeneous N2O5 parameterization, (b) improvement in the treatment of secondary organic aerosol (SOA), (c) inclusion of dynamic mass transfer for coarse-mode aerosol, (d) revisions to the cloud model, and (e) new options for the calculation of photolysis rates. Incremental test simulations over the eastern United States during January and August 2006 are evaluated to assess the model response to each scientific improvement, providing explanations of differences in results between v4.7 and previously released CMAQ model versions. Particulate sulfate predictions are improved across all monitoring networks during both seasons due to cloud module updates. Numerous updates to the SOA module improve the simulation of seasonal variability and decrease the bias in organic carbon predictions at urban sites in the winter. Bias in the total mass of fine particulate matter (PM2.5) is dominated by overpredictions of unspeciated PM2.5 (PMother) in the winter and by underpredictions of carbon in the summer. The CMAQv4.7 model results show slightly worse performance for ozone predictions. However, changes to the meteorological inputs are found to have a much greater impact on ozone predictions compared to changes to the CMAQ modules described here. Model updates had little effect on existing biases in wet deposition predictions.

  3. Thermal site descriptive model. A strategy for the model development during site investigations - version 2

    Energy Technology Data Exchange (ETDEWEB)

    Back, Paer-Erik; Sundberg, Jan [Geo Innova AB (Sweden)

    2007-09-15

    This report presents a strategy for describing, predicting and visualising the thermal aspects of the site descriptive model. The strategy is an updated version of an earlier strategy applied in all SDM versions during the initial site investigation phase at the Forsmark and Oskarshamn areas. The previous methodology for thermal modelling did not take the spatial correlation fully into account during simulation. The result was that the variability of thermal conductivity in the rock mass was not sufficiently well described. Experience from earlier thermal SDMs indicated that development of the methodology was required in order describe the spatial distribution of thermal conductivity in the rock mass in a sufficiently reliable way, taking both variability within rock types and between rock types into account. A good description of the thermal conductivity distribution is especially important for the lower tail. This tail is important for the design of a repository because it affects the canister spacing. The presented approach is developed to be used for final SDM regarding thermal properties, primarily thermal conductivity. Specific objectives for the strategy of thermal stochastic modelling are: Description: statistical description of the thermal conductivity of a rock domain. Prediction: prediction of thermal conductivity in a specific rock volume. Visualisation: visualisation of the spatial distribution of thermal conductivity. The thermal site descriptive model should include the temperature distribution and thermal properties of the rock mass. The temperature is the result of the thermal processes in the repository area. Determination of thermal transport properties can be made using different methods, such as laboratory investigations, field measurements, modelling from mineralogical composition and distribution, modelling from density logging and modelling from temperature logging. The different types of data represent different scales, which has to be

  4. Lunar Landing Operational Risk Model

    Science.gov (United States)

    Mattenberger, Chris; Putney, Blake; Rust, Randy; Derkowski, Brian

    2010-01-01

    Characterizing the risk of spacecraft goes beyond simply modeling equipment reliability. Some portions of the mission require complex interactions between system elements that can lead to failure without an actual hardware fault. Landing risk is currently the least characterized aspect of the Altair lunar lander and appears to result from complex temporal interactions between pilot, sensors, surface characteristics and vehicle capabilities rather than from hardware failures. The Lunar Landing Operational Risk Model (LLORM) seeks to provide rapid and flexible quantitative insight into the risks driving the landing event and to gauge sensitivities of the vehicle to changes in system configuration and mission operations. The LLORM takes a Monte Carlo based approach to estimate the operational risk of the lunar landing event, calculating estimates of the risk of Loss of Mission (LOM; an abort is required and succeeds), Loss of Crew (LOC; the vehicle crashes or cannot reach orbit), and Success. The LLORM is meant to be used during the conceptual design phase to inform decision makers transparently of the reliability impacts of design decisions, to identify areas of the design which may require additional robustness, and to aid in the development and flow-down of requirements.
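    As an illustration of the Monte Carlo outcome-counting approach described (a generic skeleton, not the LLORM itself; every event probability below is invented):

        import random

        # Generic Monte Carlo skeleton (not the actual LLORM): sample a highly
        # simplified landing event and tally Success, LOM (abort required and
        # successful) and LOC (vehicle crashes or cannot reach orbit).
        def one_landing(rng):
            sensors_ok = rng.random() > 0.01   # hypothetical sensor-failure rate
            site_ok = rng.random() > 0.03      # hypothetical bad-surface rate
            if sensors_ok and site_ok:
                return "Success"
            abort_ok = rng.random() > 0.10     # hypothetical abort-failure rate
            return "LOM" if abort_ok else "LOC"

        def run(n=100_000, seed=1):
            rng = random.Random(seed)
            tally = {"Success": 0, "LOM": 0, "LOC": 0}
            for _ in range(n):
                tally[one_landing(rng)] += 1
            return {k: v / n for k, v in tally.items()}

        print(run())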

  5. Fuzzy audit risk modeling algorithm

    Directory of Open Access Journals (Sweden)

    Zohreh Hajihaa

    2011-07-01

    Fuzzy logic has created suitable mathematics for making decisions in uncertain environments, including professional judgments. One such situation is the assessment of auditee risks. In recent years, risk-based audit (RBA) has been regarded as one of the main tools to fight fraud. The main issue in RBA is determining the overall audit risk an auditor accepts, which impacts the efficiency of an audit. The primary objective of this research is to redesign the audit risk model (ARM) proposed by auditing standards. The proposed model uses fuzzy inference systems (FIS) based on the judgments of audit experts. The implementation of the proposed fuzzy technique uses triangular fuzzy numbers to express the inputs, and the Mamdani method along with the center of gravity is incorporated for defuzzification. The proposed model uses three FISs for audit, inherent and control risks, and there are five levels of linguistic variables for the outputs. The FISs include 25, 25 and 81 if-then rules, respectively, all of which were confirmed by Iranian audit experts.
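    A minimal sketch of the Mamdani machinery the abstract describes, with triangular membership functions, min-rule firing, max aggregation and centre-of-gravity defuzzification (the membership parameters and the two example rules are invented; the paper's three FISs use 25, 25 and 81 expert rules):

        import numpy as np

        # Minimal Mamdani-style fuzzy inference sketch; all parameters invented.
        def tri(x, a, b, c):
            """Triangular membership with feet a, c and peak b."""
            eps = 1e-12
            return np.maximum(np.minimum((x - a) / (b - a + eps),
                                         (c - x) / (c - b + eps)), 0.0)

        def audit_risk(inherent, control):
            y = np.linspace(0.0, 1.0, 501)        # output universe: overall risk
            low_out = tri(y, -0.5, 0.0, 0.5)
            high_out = tri(y, 0.5, 1.0, 1.5)
            # Rule 1: IF inherent risk is high AND control risk is high THEN risk is high.
            w1 = min(tri(inherent, 0.5, 1.0, 1.5), tri(control, 0.5, 1.0, 1.5))
            # Rule 2: IF inherent risk is low AND control risk is low THEN risk is low.
            w2 = min(tri(inherent, -0.5, 0.0, 0.5), tri(control, -0.5, 0.0, 0.5))
            agg = np.maximum(np.minimum(w1, high_out), np.minimum(w2, low_out))
            if agg.sum() == 0.0:
                return 0.5                        # no rule fired; neutral fallback
            return float((y * agg).sum() / agg.sum())  # centre of gravity

        print(audit_risk(inherent=0.8, control=0.7))  # roughly 0.8: high risk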

  6. The Lagrangian particle dispersion model FLEXPART-WRF version 3.1

    Science.gov (United States)

    Brioude, J.; Arnold, D.; Stohl, A.; Cassiani, M.; Morton, D.; Seibert, P.; Angevine, W.; Evan, S.; Dingwell, A.; Fast, J. D.; Easter, R. C.; Pisso, I.; Burkhart, J.; Wotawa, G.

    2013-11-01

    The Lagrangian particle dispersion model FLEXPART was originally designed for calculating long-range and mesoscale dispersion of air pollutants from point sources, such as that occurring after an accident in a nuclear power plant. In the meantime, FLEXPART has evolved into a comprehensive tool for atmospheric transport modeling and analysis at different scales. A need for further multiscale modeling and analysis has encouraged new developments in FLEXPART. In this paper, we present a FLEXPART version that works with the Weather Research and Forecasting (WRF) mesoscale meteorological model. We explain how to run this new model and present special options and features that differ from those of the preceding versions. For instance, a novel turbulence scheme for the convective boundary layer has been included that considers both the skewness of turbulence in the vertical velocity as well as the vertical gradient in the air density. To our knowledge, FLEXPART is the first model for which such a scheme has been developed. On a more technical level, FLEXPART-WRF now offers effective parallelization, and details on computational performance are presented here. FLEXPART-WRF output can either be in binary or Network Common Data Form (NetCDF) format, both of which have efficient data compression. In addition, test case data and the source code are provided to the reader as a Supplement. This material and future developments will be accessible at http://www.flexpart.eu.

  7. A computationally efficient description of heterogeneous freezing: A simplified version of the Soccer ball model

    Science.gov (United States)

    Niedermeier, Dennis; Ervens, Barbara; Clauss, Tina; Voigtländer, Jens; Wex, Heike; Hartmann, Susan; Stratmann, Frank

    2014-01-01

    In a recent study, the Soccer ball model (SBM) was introduced for modeling and/or parameterizing heterogeneous ice nucleation processes. The model applies classical nucleation theory. It allows for a consistent description of both apparently singular and stochastic ice nucleation behavior, by distributing contact angles over the nucleation sites of a particle population assuming a Gaussian probability density function. The original SBM utilizes the Monte Carlo technique, which hampers its usage in atmospheric models, as fairly time-consuming calculations must be performed to obtain statistically significant results. Thus, we have developed a simplified and computationally more efficient version of the SBM. We successfully used the new SBM to parameterize experimental nucleation data of, e.g., bacterial ice nucleation. Both SBMs give identical results; however, the new model is computationally less expensive as confirmed by cloud parcel simulations. Therefore, it is a suitable tool for describing heterogeneous ice nucleation processes in atmospheric models.
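    A schematic of the simplification the abstract describes: rather than Monte Carlo sampling contact angles site by site, the per-site unfrozen probability is averaged over the Gaussian contact-angle distribution numerically (the nucleation-rate function j_het and all parameter values below are placeholders, not the classical-nucleation-theory expression used in the SBM):

        import numpy as np

        # Schematic of the simplified SBM idea; j_het is a placeholder rate
        # function, NOT the CNT expression, and all values are invented.
        def j_het(theta):
            """Placeholder nucleation rate, decreasing with contact angle."""
            return 1e8 * np.exp(-8.0 * theta)

        def frozen_fraction(mu, sigma, n_sites, site_area, dt):
            theta = np.linspace(1e-4, np.pi, 2000)     # contact angles (rad)
            dtheta = theta[1] - theta[0]
            p = np.exp(-0.5 * ((theta - mu) / sigma) ** 2)
            p /= p.sum() * dtheta                      # normalised Gaussian pdf
            p_site = (p * np.exp(-j_het(theta) * site_area * dt)).sum() * dtheta
            return 1.0 - p_site ** n_sites             # any site freezing suffices

        print(frozen_fraction(mu=1.0, sigma=0.2, n_sites=50,
                              site_area=1e-7, dt=1.0))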

  8. Community Land Model Version 3.0 (CLM3.0) Developer's Guide

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, FM

    2004-12-21

    This document describes the guidelines adopted for software development of the Community Land Model (CLM) and serves as a reference to the entire code base of the released version of the model. The version of the code described here is Version 3.0 which was released in the summer of 2004. This document, the Community Land Model Version 3.0 (CLM3.0) User's Guide (Vertenstein et al., 2004), the Technical Description of the Community Land Model (CLM) (Oleson et al., 2004), and the Community Land Model's Dynamic Global Vegetation Model (CLM-DGVM): Technical Description and User's Guide (Levis et al., 2004) provide the developer, user, or researcher with details of implementation, instructions for using the model, a scientific description of the model, and a scientific description of the Dynamic Global Vegetation Model integrated with CLM respectively. The CLM is a single column (snow-soil-vegetation) biogeophysical model of the land surface which can be run serially (on a laptop or personal computer) or in parallel (using distributed or shared memory processors or both) on both vector and scalar computer architectures. Written in Fortran 90, CLM can be run offline (i.e., run in isolation using stored atmospheric forcing data), coupled to an atmospheric model (e.g., the Community Atmosphere Model (CAM)), or coupled to a climate system model (e.g., the Community Climate System Model Version 3 (CCSM3)) through a flux coupler (e.g., Coupler 6 (CPL6)). When coupled, CLM exchanges fluxes of energy, water, and momentum with the atmosphere. The horizontal land surface heterogeneity is represented by a nested subgrid hierarchy composed of gridcells, landunits, columns, and plant functional types (PFTs). This hierarchical representation is reflected in the data structures used by the model code. Biophysical processes are simulated for each subgrid unit (landunit, column, and PFT) independently, and prognostic variables are maintained for each subgrid unit

  9. Statistical model of fractures and deformation zones. Preliminary site description, Laxemar subarea, version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Hermanson, Jan; Forssberg, Ola [Golder Associates AB, Stockholm (Sweden); Fox, Aaron; La Pointe, Paul [Golder Associates Inc., Redmond, WA (United States)

    2005-10-15

    The goal of this summary report is to document the data sources, software tools, experimental methods, assumptions, and model parameters in the discrete-fracture network (DFN) model for the local model volume in Laxemar, version 1.2. The model parameters presented herein are intended for use by other project modeling teams. Individual modeling teams may elect to simplify or use only a portion of the DFN model, depending on their needs. This model is not intended to be a flow model or a mechanical model; as such, only the geometrical characterization is presented. The derivations of the hydraulic or mechanical properties of the fractures or their subsurface connectivities are not within the scope of this report. This model represents analyses carried out on particular data sets. If additional data are obtained, or values for existing data are changed or excluded, the conclusions reached in this report, and the parameter values calculated, may change as well. The model volume is divided into two subareas; one located on the Simpevarp peninsula adjacent to the power plant (Simpevarp), and one further to the west (Laxemar). The DFN parameters described in this report were determined by analysis of data collected within the local model volume. As such, the final DFN model is only valid within this local model volume and the modeling subareas (Laxemar and Simpevarp) within.

  10. CRAC: Confidentiality Risk Analysis and IT-Architecture Comparison of Business Networks (extended version)

    NARCIS (Netherlands)

    Morali, A.; Zambon, Emmanuele; Etalle, Sandro; Wieringa, Roelf J.

    2009-01-01

    The leakage of confidential information (e.g., industrial secrets, patient records and user credentials) is one of the risks that have to be accounted for and mitigated by organizations dealing with confidential data. Unfortunately, assessing confidentiality risk is challenging, particularly in the

  11. The Iranian version of the Copenhagen Psychosocial Questionnaire (COPSOQ) for assessment of psychological risk factors at work

    Science.gov (United States)

    Aminian, Mohammad; Dianat, Iman; Miri, Anvar; Asghari-Jafarabadi, Mohammad

    2017-01-01

    Background: The Copenhagen Psychosocial Questionnaire (COPSOQ) is a widely used tool for the evaluation of psychosocial risk factors at work. The aims of this study were to describe the short version of the Farsi COPSOQ and to present its psychometric properties. Methods: A total of 427 administrative health care staff participated in this descriptive methodological study. A forward-backward procedure was adopted to translate the questionnaire from English into Farsi. Content validity was assessed by a panel of 10 experts. Construct validity was evaluated by exploratory and confirmatory factor analyses. The internal consistency and test-retest reliability were assessed using Cronbach's α and the intraclass correlation coefficient (ICC), respectively. Feasibility was assessed using ceiling and floor effects. Results: The short version of the Farsi COPSOQ was configured with 16 dimensions (32 items). Content validity of the questionnaire was established. Factor analysis supported the conceptual multi-dimensionality (four factors) and therefore confirmed the construct validity of the Farsi COPSOQ. The internal consistency (Cronbach's α ranging between 0.75 and 0.89) and test-retest reliability (ICC values ranging from 0.75 to 0.89) were both satisfactory, and the results showed no ceiling or floor effects. Conclusion: The results support the use of the Farsi COPSOQ for the evaluation of psychological risks and for research purposes in the Iranian population. PMID:28058236
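    For readers unfamiliar with the internal-consistency statistic reported, a self-contained sketch of Cronbach's alpha on synthetic item scores (the data and item count are placeholders; this is not the study's code):

        import numpy as np

        # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance).
        def cronbach_alpha(items):
            """items: (n_respondents, k_items) array of scores."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1.0 - item_vars / total_var)

        rng = np.random.default_rng(0)
        latent = rng.normal(size=(427, 1))                      # 427 respondents, as in the study
        scores = latent + rng.normal(scale=0.8, size=(427, 8))  # 8 correlated placeholder items
        print(f"alpha = {cronbach_alpha(scores):.2f}")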

  12. Risk Quantification and Evaluation Modelling

    Directory of Open Access Journals (Sweden)

    Manmohan Singh

    2014-07-01

    In this paper the authors discuss risk quantification methods and the evaluation of risks and decision parameters used to rank critical items for prioritization in condition-monitoring-based risk and reliability centered maintenance (CBRRCM). As time passes, any equipment or product degrades to lower effectiveness and its rate of failure or malfunctioning increases, thereby lowering reliability. Thus, with the passage of time, or over a number of active tests or periods of work, the reliability of the product or system may fall to a low threshold value, below which the reliability should not be allowed to dip. Hence, it is necessary to fix a rational basis for determining the appropriate points in the product life cycle where predictive preventive maintenance may be applied, so that the reliability (the probability of successful functioning) can be enhanced, preferably to its original value, by reducing the failure rate and increasing the mean time between failures. This is very important for defence applications, where reliability is a prime requirement. An attempt is made to develop a mathematical model for assessing risks and ranking them. Based on the likeliness coefficient β1 and risk coefficient β2, ranking of the sub-systems can be modelled and used for CBRRCM. Defence Science Journal, Vol. 64, No. 4, July 2014, pp. 378-384, DOI: http://dx.doi.org/10.14429/dsj.64.6366
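    A worked numerical example of the threshold idea above, under the simplifying assumption (made here, not in the paper) of a constant failure rate, i.e. exponential reliability R(t) = exp(-λt):

        import math

        # Find the latest time to schedule predictive preventive maintenance,
        # i.e. where R(t) = exp(-lambda*t) decays to the chosen threshold.
        lam = 1.0 / 5000.0       # hypothetical failure rate per hour (MTBF = 5000 h)
        r_threshold = 0.90       # reliability must not be allowed to dip below this

        t_maint = -math.log(r_threshold) / lam
        print(f"MTBF = {1 / lam:.0f} h; maintain before t = {t_maint:.0f} h "
              f"(R = {math.exp(-lam * t_maint):.2f} there)")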

  13. Aerosol specification in single-column Community Atmosphere Model version 5

    Science.gov (United States)

    Lebassi-Habtezion, B.; Caldwell, P. M.

    2015-03-01

    Single-column model (SCM) capability is an important tool for general circulation model development. In this study, the SCM mode of version 5 of the Community Atmosphere Model (CAM5) is shown to handle aerosol initialization and advection improperly, resulting in aerosol, cloud-droplet, and ice crystal concentrations which are typically much lower than observed or simulated by CAM5 in global mode. This deficiency has a major impact on stratiform cloud simulations but has little impact on convective case studies because aerosol is currently not used by CAM5 convective schemes and convective cases are typically longer in duration (so initialization is less important). By imposing fixed aerosol or cloud-droplet and crystal number concentrations, the aerosol issues described above can be avoided. Sensitivity studies using these idealizations suggest that the Meyers et al. (1992) ice nucleation scheme prevents mixed-phase cloud from existing by producing too many ice crystals. Microphysics is shown to strongly deplete cloud water in stratiform cases, indicating problems with sequential splitting in CAM5 and the need for careful interpretation of output from sequentially split climate models. Droplet concentration in the general circulation model (GCM) version of CAM5 is also shown to be far too low (~25 cm^-3) at the southern Great Plains (SGP) Atmospheric Radiation Measurement (ARM) site.

  14. MESOI Version 2.0: an interactive mesoscale Lagrangian puff dispersion model with deposition and decay

    Energy Technology Data Exchange (ETDEWEB)

    Ramsdell, J.V.; Athey, G.F.; Glantz, C.S.

    1983-11-01

    MESOI Version 2.0 is an interactive Lagrangian puff model for estimating the transport, diffusion, deposition and decay of effluents released to the atmosphere. The model is capable of treating simultaneous releases from as many as four release points, which may be elevated or at ground-level. The puffs are advected by a horizontal wind field that is defined in three dimensions. The wind field may be adjusted for expected topographic effects. The concentration distribution within the puffs is initially assumed to be Gaussian in the horizontal and vertical. However, the vertical concentration distribution is modified by assuming reflection at the ground and the top of the atmospheric mixing layer. Material is deposited on the surface using a source depletion, dry deposition model and a washout coefficient model. The model also treats the decay of a primary effluent species and the ingrowth and decay of a single daughter species using a first order decay process. This report is divided into two parts. The first part discusses the theoretical and mathematical bases upon which MESOI Version 2.0 is based. The second part contains the MESOI computer code. The programs were written in the ANSI standard FORTRAN 77 and were developed on a VAX 11/780 computer. 43 references, 14 figures, 13 tables.
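    A minimal sketch of a single Gaussian puff with ground reflection via an image source (illustrative only: MESOI additionally reflects at the top of the mixing layer and treats deposition and decay, which are omitted here):

        import math

        # Single Gaussian puff of Q units centred at (xc, yc) with effective
        # release height H; one image source below ground handles reflection.
        def puff_concentration(x, y, z, xc, yc, H, Q, sx, sy, sz):
            norm = Q / ((2.0 * math.pi) ** 1.5 * sx * sy * sz)
            horiz = math.exp(-0.5 * ((x - xc) / sx) ** 2) \
                  * math.exp(-0.5 * ((y - yc) / sy) ** 2)
            vert = math.exp(-0.5 * ((z - H) / sz) ** 2) \
                 + math.exp(-0.5 * ((z + H) / sz) ** 2)   # image below ground
            return norm * horiz * vert

        # Ground-level concentration directly beneath a puff released at 50 m:
        print(puff_concentration(x=500.0, y=0.0, z=0.0, xc=500.0, yc=0.0,
                                 H=50.0, Q=1.0, sx=80.0, sy=80.0, sz=30.0))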

  15. A p-version embedded model for simulation of concrete temperature fields with cooling pipes

    Directory of Open Access Journals (Sweden)

    Sheng Qiang

    2015-07-01

    Pipe cooling is an effective method of mass concrete temperature control, but its accurate and convenient numerical simulation remains a cumbersome problem. An improved embedded model, considering the water temperature variation along the pipe, is proposed for simulating the temperature field of early-age concrete structures containing cooling pipes. The improved model was verified with an engineering example. The p-version self-adaption algorithm for the improved embedded model is then derived, and the initial values and boundary conditions are examined. Comparison of several numerical examples shows that the proposed model provides satisfactory precision and higher efficiency: the analysis efficiency can be doubled at the same precision, even for a large-scale element. The p-version algorithm can fit grids of different sizes for temperature field simulation. The convenience of the proposed algorithm lies in the possibility of locating more pipe segments in one element without requiring as regular an element shape as the explicit model.

  16. DOUBLE-MARKOV RISK MODEL

    Institute of Scientific and Technical Information of China (English)

    Xiaoyun MO; Jieming ZHOU; Hui OU; Xiangqun YANG

    2013-01-01

    Given a new Double-Markov risk model DM = (μ, Q, v, H; Y, Z) and the Double-Markov risk process U = {U(t), t ≥ 0}, the ruin or survival problem is addressed. Equations satisfied by the survival probability and formulas for calculating it are obtained, together with recursion formulas for the survival probability and analytic expressions for the recursion terms. The conclusions are expressed through the Q matrix of one Markov chain and the transition probabilities of another.

  17. The complex model of risk and progression of AMD estimation

    Directory of Open Access Journals (Sweden)

    V. S. Akopyan

    2012-01-01

    Purpose: to develop a method and a statistical model to estimate the individual risk of AMD and the risk of progression to advanced AMD using clinical and genetic risk factors. Methods: A statistical risk assessment model was developed using stepwise binary logistic regression analysis. To estimate population differences in the prevalence of allelic variants of genes, and to adapt the models to the population of the Moscow region, genotyping and assessment of the influence of other risk factors were performed in two groups: patients with different stages of AMD (n = 74) and a control group (n = 116). Genetic risk factors included in the study: polymorphisms in the complement system genes (C3 and CFH), genes at the 10q26 locus (ARMS2 and HTRA1), and a polymorphism in the mitochondrial gene MT-ND2. Clinical risk factors included in the study: age, gender, high body mass index, and smoking history. Results: A comprehensive analysis of genetic and clinical risk factors for AMD in the study group was performed. A statistical model for the assessment of individual AMD risk was compiled, with a sensitivity of 66.7%, specificity of 78.5%, and AUC = 0.76. For risk factors of late AMD, a statistical model describing the probability of late AMD was compiled, with a sensitivity of 66.7%, specificity of 78.3%, and AUC = 0.73. The developed system makes it possible to determine the most likely form of late AMD: dry or wet. Conclusion: the developed test system and the mathematical algorithm for determining the risk of AMD and of progression to advanced AMD have fair diagnostic informativeness and are promising for use in clinical practice.
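    A sketch of the modelling approach described, binary logistic regression on risk covariates with sensitivity, specificity and AUC read off the fitted model (synthetic data; the stepwise variable selection used in the study is not reproduced):

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        # Synthetic stand-ins for genotype and clinical covariates.
        rng = np.random.default_rng(42)
        n = 190                              # ~74 cases + 116 controls in the study
        X = rng.normal(size=(n, 6))
        beta = np.array([1.2, 0.8, 0.0, 0.5, 0.0, -0.4])
        y = (rng.random(n) < 1.0 / (1.0 + np.exp(-(X @ beta - 0.5)))).astype(int)

        model = LogisticRegression(max_iter=1000).fit(X, y)
        p = model.predict_proba(X)[:, 1]
        pred = (p >= 0.5).astype(int)
        sens = ((pred == 1) & (y == 1)).sum() / (y == 1).sum()
        spec = ((pred == 0) & (y == 0)).sum() / (y == 0).sum()
        print(f"sensitivity={sens:.2f} specificity={spec:.2f} "
              f"AUC={roc_auc_score(y, p):.2f}")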

  19. The Lagrangian particle dispersion model FLEXPART-WRF version 3.0

    Directory of Open Access Journals (Sweden)

    J. Brioude

    2013-07-01

    The Lagrangian particle dispersion model FLEXPART was originally designed for calculating long-range and mesoscale dispersion of air pollutants from point sources, such as after an accident in a nuclear power plant. In the meantime FLEXPART has evolved into a comprehensive tool for atmospheric transport modeling and analysis at different scales. This multiscale need has encouraged new developments in FLEXPART. In this document, we present a FLEXPART version that works with the Weather Research and Forecasting (WRF) mesoscale meteorological model. We explain how to run it and present special options and features that differ from its predecessor versions. For instance, a novel turbulence scheme for the convective boundary layer has been included that considers both the skewness of turbulence in the vertical velocity as well as the vertical gradient in the air density. To our knowledge, FLEXPART is the first model for which such a scheme has been developed. On a more technical level, FLEXPART-WRF now offers effective parallelization and details on computational performance are presented here. FLEXPART-WRF output can either be in binary or Network Common Data Form (NetCDF) format with efficient data compression. In addition, test case data and the source code are provided to the reader as Supplement. This material and future developments will be accessible at http://www.flexpart.eu.

  1. Quantile uncertainty and value-at-risk model risk.

    Science.gov (United States)

    Alexander, Carol; Sarabia, José María

    2012-08-01

    This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks.
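    The article's framework adjusts quantile estimates relative to a benchmark model; as a simpler generic illustration of the underlying quantile uncertainty, the following bootstraps an interval around an empirical 99% Value-at-Risk (simulated returns; this is not the authors' methodology):

        import numpy as np

        # Bootstrap interval around an empirical 99% VaR from simulated
        # heavy-tailed daily returns.
        rng = np.random.default_rng(7)
        returns = rng.standard_t(df=4, size=1000) * 0.01

        var_99 = -np.quantile(returns, 0.01)     # loss quantile: 99% VaR
        boot = np.array([
            -np.quantile(rng.choice(returns, size=returns.size, replace=True), 0.01)
            for _ in range(2000)
        ])
        lo, hi = np.quantile(boot, [0.05, 0.95])
        print(f"VaR(99%) = {var_99:.4f}, 90% bootstrap interval = "
              f"[{lo:.4f}, {hi:.4f}]")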

  2. Technical note: The Lagrangian particle dispersion model FLEXPART version 6.2

    Directory of Open Access Journals (Sweden)

    A. Stohl

    2005-01-01

    The Lagrangian particle dispersion model FLEXPART was originally (about 8 years ago) designed for calculating the long-range and mesoscale dispersion of air pollutants from point sources, such as after an accident in a nuclear power plant. In the meantime FLEXPART has evolved into a comprehensive tool for atmospheric transport modeling and analysis. Its application fields have been extended from air pollution studies to other topics where atmospheric transport plays a role (e.g., exchange between the stratosphere and troposphere, or the global water cycle). It has evolved into a true community model that is now being used by at least 25 groups from 14 different countries and is seeing both operational and research applications. A user manual has been kept up to date over the years and distributed via an internet page along with the model's source code. In this note we provide a citeable technical description of FLEXPART's latest version (6.2).

  3. Muninn: A versioning flash key-value store using an object-based storage model

    OpenAIRE

    Kang, Y.; Pitchumani, R; Marlette, T; Miller, El

    2014-01-01

    While non-volatile memory (NVRAM) devices have the potential to alleviate the trade-off between performance, scalability, and energy in storage and memory subsystems, a block interface and storage subsystems designed for slow I/O devices make it difficult to exploit NVRAMs efficiently in a portable and extensible way. We propose an object-based storage model as a way of addressing the shortfalls of the current interfaces. Through the design of Muninn, an object-based versioning key-value store...

  4. QMM – A Quarterly Macroeconomic Model of the Icelandic Economy. Version 2.0

    DEFF Research Database (Denmark)

    Ólafsson, Tjörvi

    This paper documents and describes Version 2.0 of the Quarterly Macroeconomic Model of the Central Bank of Iceland (QMM). QMM and the underlying quarterly database have been under construction since 2001 at the Research and Forecasting Division of the Economics Department at the Bank and was first...... implemented in the forecasting round for the Monetary Bulletin 2006/1 in March 2006. QMM is used by the Bank for forecasting and various policy simulations and therefore plays a key role as an organisational framework for viewing the medium-term future when formulating monetary policy at the Bank. This paper...

  5. User’s Manual for the Navy Coastal Ocean Model (NCOM) Version 4.0

    Science.gov (United States)

    2009-02-06

    User's Manual for the Navy Coastal Ocean Model (NCOM) Version 4.0, by Paul J. Martin, Charlie N. Barron, Lucy F. Smedstad, Timothy J. Campbell, Alan J. Wallcraft, Robert C. Rhodes, Clark Rowley, Tamara L. Townsend, and Suzanne N. Carroll, Naval Research Laboratory.

  8. Model risk analysis for risk management and option pricing

    NARCIS (Netherlands)

    Kerkhof, F.L.J.

    2003-01-01

    Due to the growing complexity of products in financial markets, market participants rely more and more on quantitative models for trading and risk management decisions. This introduces a fairly new type of risk, namely, model risk. In the first part of this thesis we investigate the quantitative inf

  9. Update of the Polar SWIFT model for polar stratospheric ozone loss (Polar SWIFT version 2)

    Science.gov (United States)

    Wohltmann, Ingo; Lehmann, Ralph; Rex, Markus

    2017-07-01

    The Polar SWIFT model is a fast scheme for calculating the chemistry of stratospheric ozone depletion in polar winter. It is intended for use in global climate models (GCMs) and Earth system models (ESMs) to enable the simulation of mutual interactions between the ozone layer and climate. To date, climate models often use prescribed ozone fields, since a full stratospheric chemistry scheme is computationally very expensive. Polar SWIFT is based on a set of coupled differential equations, which simulate the polar vortex-averaged mixing ratios of the key species involved in polar ozone depletion on a given vertical level. These species are O3, chemically active chlorine (ClOx), HCl, ClONO2 and HNO3. The only external input parameters that drive the model are the fraction of the polar vortex in sunlight and the fraction of the polar vortex below the temperatures necessary for the formation of polar stratospheric clouds. Here, we present an update of the Polar SWIFT model introducing several improvements over the original model formulation. In particular, the model is now trained on vortex-averaged reaction rates of the ATLAS Chemistry and Transport Model, which enables a detailed look at individual processes and an independent validation of the different parameterizations contained in the differential equations. The training of the original Polar SWIFT model was based on fitting complete model runs to satellite observations and did not allow for this. A revised formulation of the system of differential equations is developed, which closely fits vortex-averaged reaction rates from ATLAS that represent the main chemical processes influencing ozone. In addition, a parameterization for the HNO3 change by denitrification is included. The rates of change of the concentrations of the chemical species of the Polar SWIFT model are purely chemical rates of change in the new version, whereas in the original Polar SWIFT model, they included a transport effect caused by the

  10. Flood predictions using the parallel version of distributed numerical physical rainfall-runoff model TOPKAPI

    Science.gov (United States)

    Boyko, Oleksiy; Zheleznyak, Mark

    2015-04-01

    The original numerical code TOPKAPI-IMMS of the distributed rainfall-runoff model TOPKAPI (Todini et al., 1996-2014) has been developed and implemented in Ukraine. A parallel version of the code has recently been developed for use on multiprocessor systems: multicore PCs and clusters. The algorithm is based on a binary-tree decomposition of the watershed to balance the amount of computation across all processors/cores. The Message Passing Interface (MPI) protocol is used as the parallel computing framework. The numerical efficiency of the parallelization algorithm is demonstrated in case studies of flood predictions for mountain watersheds of the Ukrainian Carpathian region. The modeling results are compared with predictions based on lumped-parameter models.

  11. The Systems Biology Markup Language (SBML) Level 3 Package: Qualitative Models, Version 1, Release 1.

    Science.gov (United States)

    Chaouiya, Claudine; Keating, Sarah M; Berenguier, Duncan; Naldi, Aurélien; Thieffry, Denis; van Iersel, Martijn P; Le Novère, Nicolas; Helikar, Tomáš

    2015-09-04

    Quantitative methods for modelling biological networks require an in-depth knowledge of the biochemical reactions and their stoichiometric and kinetic parameters. In many practical cases, this knowledge is missing. This has led to the development of several qualitative modelling methods using information such as, for example, gene expression data coming from functional genomic experiments. The SBML Level 3 Version 1 Core specification does not provide a mechanism for explicitly encoding qualitative models, but it does provide a mechanism for SBML packages to extend the Core specification and add additional syntactical constructs. The SBML Qualitative Models package for SBML Level 3 adds features so that qualitative models can be directly and explicitly encoded. The approach taken in this package is essentially based on the definition of regulatory or influence graphs. The SBML Qualitative Models package defines the structure and syntax necessary to describe qualitative models that associate discrete levels of activities with entity pools and the transitions between states that describe the processes involved. This is particularly suited to logical models (Boolean or multi-valued) and some classes of Petri net models can be encoded with the approach.

  12. User guide for MODPATH version 6 - A particle-tracking model for MODFLOW

    Science.gov (United States)

    Pollock, David W.

    2012-01-01

    MODPATH is a particle-tracking post-processing model that computes three-dimensional flow paths using output from groundwater flow simulations based on MODFLOW, the U.S. Geological Survey (USGS) finite-difference groundwater flow model. This report documents MODPATH version 6. Previous versions were documented in USGS Open-File Reports 89-381 and 94-464. The program uses a semianalytical particle-tracking scheme that allows an analytical expression of a particle's flow path to be obtained within each finite-difference grid cell. A particle's path is computed by tracking the particle from one cell to the next until it reaches a boundary, an internal sink/source, or satisfies another termination criterion. Data input to MODPATH consists of a combination of MODFLOW input data files, MODFLOW head and flow output files, and other input files specific to MODPATH. Output from MODPATH consists of several output files, including a number of particle coordinate output files intended to serve as input data for other programs that process, analyze, and display the results in various ways. MODPATH is written in FORTRAN and can be compiled by any FORTRAN compiler that fully supports FORTRAN-2003 or by most commercially available FORTRAN-95 compilers that support the major FORTRAN-2003 language extensions.
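    A sketch of the semianalytical (Pollock-type) tracking step this scheme is built on, reduced to one dimension with positive flow: with velocity varying linearly across a cell, the exit time through the downstream face has a closed form, so no time-stepping is needed (the function name and values are illustrative):

        import math

        # One-dimensional Pollock-type step; requires v1, v2 > 0
        # (flow in the +x direction).
        def exit_right_face(x1, x2, v1, v2, xp):
            A = (v2 - v1) / (x2 - x1)        # linear velocity gradient in the cell
            vp = v1 + A * (xp - x1)          # velocity at the particle position
            if abs(A) < 1e-15:               # uniform velocity: simple kinematics
                return (x2 - xp) / vp
            return math.log(v2 / vp) / A     # closed-form exit time

        # Particle starting mid-cell, velocity accelerating from 1 to 2 m/d:
        print(exit_right_face(x1=0.0, x2=100.0, v1=1.0, v2=2.0, xp=50.0))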

  13. Landfill Gas Energy Cost Model Version 3.0 (LFGcost-Web V3 ...

    Science.gov (United States)

    To help stakeholders estimate the costs of a landfill gas (LFG) energy project, in 2002, LMOP developed a cost tool (LFGcost). Since then, LMOP has routinely updated the tool to reflect changes in the LFG energy industry. Initially the model was designed for EPA to assist landfills in evaluating the economic and financial feasibility of LFG energy project development. In 2014, LMOP developed a public version of the model, LFGcost-Web (Version 3.0), to allow landfill and industry stakeholders to evaluate project feasibility on their own. LFGcost-Web can analyze costs for 12 energy recovery project types. These project costs can be estimated with or without the costs of a gas collection and control system (GCCS). The EPA used select equations from LFGcost-Web to estimate costs of the regulatory options in the 2015 proposed revisions to the MSW Landfills Standards of Performance (also known as New Source Performance Standards) and the Emission Guidelines (hereinafter referred to collectively as the Landfill Rules). More specifically, equations derived from LFGcost-Web were applied to each landfill expected to be impacted by the Landfill Rules to estimate annualized installed capital costs and annual O&M costs of a gas collection and control system. In addition, after applying the LFGcost-Web equations to the list of landfills expected to require a GCCS in year 2025 as a result of the proposed Landfill Rules, the regulatory analysis evaluated whether electr

  14. LERC-SLAM - THE NASA LEWIS RESEARCH CENTER SATELLITE LINK ATTENUATION MODEL PROGRAM (IBM PC VERSION)

    Science.gov (United States)

    Manning, R. M.

    1994-01-01

    The frequency and intensity of rain attenuation affecting the communication between a satellite and an earth terminal is an important consideration in planning satellite links. The NASA Lewis Research Center Satellite Link Attenuation Model Program (LeRC-SLAM) provides a static and dynamic statistical assessment of the impact of rain attenuation on a communications link established between an earth terminal and a geosynchronous satellite. The program is designed for use in the specification, design and assessment of satellite links for any terminal location in the continental United States. The basis for LeRC-SLAM is the ACTS Rain Attenuation Prediction Model, which uses a log-normal cumulative probability distribution to describe the random process of rain attenuation on satellite links. The derivation of the statistics for the rainrate process at the specified terminal location relies on long term rainfall records compiled by the U.S. Weather Service during time periods of up to 55 years in length. The theory of extreme value statistics is also utilized. The user provides 1) the longitudinal position of the satellite in geosynchronous orbit, 2) the geographical position of the earth terminal in terms of latitude and longitude, 3) the height above sea level of the terminal site, 4) the yearly average rainfall at the terminal site, and 5) the operating frequency of the communications link (within 1 to 1000 GHz, inclusive). Based on the yearly average rainfall at the terminal location, LeRC-SLAM calculates the relevant rain statistics for the site using an internal data base. The program then generates rain attenuation data for the satellite link. This data includes a description of the static (i.e., yearly) attenuation process, an evaluation of the cumulative probability distribution for attenuation effects, and an evaluation of the probability of fades below selected fade depths. In addition, LeRC-SLAM calculates the elevation and azimuth angles of the terminal
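    A small illustration of the log-normal assumption described above: the probability that rain attenuation exceeds a given fade depth, for an assumed median attenuation and log-standard-deviation (the parameter values are invented, and this is not the LeRC-SLAM code):

        import math

        # P(A > a) under a log-normal attenuation model:
        # 0.5 * erfc((ln a - ln A_median) / (sigma * sqrt(2))).
        def prob_fade_exceeded(fade_db, median_db, sigma_ln):
            z = (math.log(fade_db) - math.log(median_db)) / (sigma_ln * math.sqrt(2.0))
            return 0.5 * math.erfc(z)

        for fade in (1.0, 3.0, 6.0, 10.0):
            print(f"P(A > {fade:4.1f} dB) = {prob_fade_exceeded(fade, 2.0, 0.8):.4f}")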

  15. LERC-SLAM - THE NASA LEWIS RESEARCH CENTER SATELLITE LINK ATTENUATION MODEL PROGRAM (MACINTOSH VERSION)

    Science.gov (United States)

    Manning, R. M.

    1994-01-01

    The frequency and intensity of rain attenuation affecting the communication between a satellite and an earth terminal is an important consideration in planning satellite links. The NASA Lewis Research Center Satellite Link Attenuation Model Program (LeRC-SLAM) provides a static and dynamic statistical assessment of the impact of rain attenuation on a communications link established between an earth terminal and a geosynchronous satellite. The program is designed for use in the specification, design and assessment of satellite links for any terminal location in the continental United States. The basis for LeRC-SLAM is the ACTS Rain Attenuation Prediction Model, which uses a log-normal cumulative probability distribution to describe the random process of rain attenuation on satellite links. The derivation of the statistics for the rainrate process at the specified terminal location relies on long term rainfall records compiled by the U.S. Weather Service during time periods of up to 55 years in length. The theory of extreme value statistics is also utilized. The user provides 1) the longitudinal position of the satellite in geosynchronous orbit, 2) the geographical position of the earth terminal in terms of latitude and longitude, 3) the height above sea level of the terminal site, 4) the yearly average rainfall at the terminal site, and 5) the operating frequency of the communications link (within 1 to 1000 GHz, inclusive). Based on the yearly average rainfall at the terminal location, LeRC-SLAM calculates the relevant rain statistics for the site using an internal data base. The program then generates rain attenuation data for the satellite link. This data includes a description of the static (i.e., yearly) attenuation process, an evaluation of the cumulative probability distribution for attenuation effects, and an evaluation of the probability of fades below selected fade depths. In addition, LeRC-SLAM calculates the elevation and azimuth angles of the terminal

  16. Obstetric care providers are able to assess psychosocial risks, identify and refer high-risk pregnant women: validation of a short assessment tool - the KINDEX Greek version.

    Science.gov (United States)

    Spyridou, Andria; Schauer, Maggie; Ruf-Leuschner, Martina

    2015-02-21

    Prenatal assessment of psychosocial risk factors, and related prevention and intervention, is scarce and in most cases nonexistent in obstetrical care. In this study we aimed to evaluate whether the KINDEX, a short instrument developed in Germany, is a useful tool in the hands of non-specialist medical staff for identifying and referring women at psychosocial risk to adequate mental health and social services. We also examined the criterion-related concurrent validity of the tool through a validation interview carried out by an expert clinical psychologist. Our final objective was to achieve a cultural adaptation of the KINDEX Greek Version and to offer obstetric care providers a valid tool for psychosocial risk assessment. Two obstetricians and five midwives carried out 93 KINDEX interviews (duration 20 minutes) with pregnant women to assess psychosocial risk factors present during pregnancy. They then referred women identified as having two or more psychosocial risk factors to the mental health attention unit of the hospital. During the validation procedure, an expert clinical psychologist carried out diagnostic interviews with a randomized subsample of 50 pregnant women based on established diagnostic instruments for stress and psychopathology (the PSS-14, ESI, PDS and HSCL-25). Significant correlations between the results obtained through the KINDEX assessment and the risk areas of stress, psychopathology and trauma load assessed in the validation interview demonstrate the criterion-related concurrent validity of the KINDEX. The referral accuracy of the medical staff was confirmed through comparisons between pregnant women who were and were not referred to the mental health attention unit. Prenatal screenings for psychosocial risks such as the KINDEX are feasible in public health settings in Greece. In addition, validity was confirmed by the high correlations between the KINDEX results and the results of the validation interviews. The

  17. Evaluation of the Snow Simulations from the Community Land Model, Version 4 (CLM4)

    Science.gov (United States)

    Toure, Ally M.; Rodell, Matthew; Yang, Zong-Liang; Beaudoing, Hiroko; Kim, Edward; Zhang, Yongfei; Kwon, Yonghwan

    2015-01-01

    This paper evaluates the simulation of snow by the Community Land Model, version 4 (CLM4), the land model component of the Community Earth System Model, version 1.0.4 (CESM1.0.4). CLM4 was run in an offline mode forced with the corrected land-only replay of the Modern-Era Retrospective Analysis for Research and Applications (MERRA-Land) and the output was evaluated for the period from January 2001 to January 2011 over the Northern Hemisphere poleward of 30 deg N. Simulated snow-cover fraction (SCF), snow depth, and snow water equivalent (SWE) were compared against a set of observations including the Moderate Resolution Imaging Spectroradiometer (MODIS) SCF, the Interactive Multisensor Snow and Ice Mapping System (IMS) snow cover, the Canadian Meteorological Centre (CMC) daily snow analysis products, snow depth from the National Weather Service Cooperative Observer (COOP) program, and Snowpack Telemetry (SNOTEL) SWE observations. CLM4 SCF was converted into snow-cover extent (SCE) to compare with MODIS SCE. It showed good agreement, with a correlation coefficient of 0.91 and an average bias of -1.54 x 10^2 km^2. Overall, CLM4 agreed well with IMS snow cover, with the percentage of correctly modeled snow-no snow being 94%. CLM4 snow depth and SWE agreed reasonably well with the CMC product, with the average bias (RMSE) of snow depth and SWE being 0.044 m (0.19 m) and -0.010 m (0.04 m), respectively. CLM4 underestimated SNOTEL SWE and COOP snow depth. This study demonstrates the need to improve the CLM4 snow estimates and constitutes a benchmark against which improvement of the model through data assimilation can be measured.
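    A self-contained sketch of the comparison statistics reported above (bias, RMSE and correlation of modelled against observed values), using random stand-ins for collocated model and observation samples:

        import numpy as np

        # Bias, RMSE and correlation on placeholder collocated samples
        # (e.g. SWE in metres).
        rng = np.random.default_rng(3)
        obs = rng.gamma(shape=2.0, scale=0.05, size=10_000)
        model = obs + rng.normal(loc=-0.01, scale=0.04, size=obs.size)

        bias = np.mean(model - obs)
        rmse = np.sqrt(np.mean((model - obs) ** 2))
        r = np.corrcoef(model, obs)[0, 1]
        print(f"bias = {bias:.3f} m, RMSE = {rmse:.3f} m, r = {r:.2f}")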

  18. Version 3.0 of code Java for 3D simulation of the CCA model

    Science.gov (United States)

    Zhang, Kebo; Zuo, Junsen; Dou, Yifeng; Li, Chao; Xiong, Hailing

    2016-10-01

    In this paper we provide a new version of the program to replace the previous one. The frequency of traversing the clusters list was reduced and several code blocks were optimized; in addition, we added to and revised the source-code comments for some methods and attributes. Comparative experimental results show that the new version has better time efficiency than the previous one.

  19. Igpet software for modeling igneous processes: examples of application using the open educational version

    Science.gov (United States)

    Carr, Michael J.; Gazel, Esteban

    2016-09-01

    We provide here an open version of Igpet software, called t-Igpet to emphasize its application for teaching and research in forward modeling of igneous geochemistry. There are three programs, a norm utility, a petrologic mixing program using least squares and Igpet, a graphics program that includes many forms of numerical modeling. Igpet is a multifaceted tool that provides the following basic capabilities: igneous rock identification using the IUGS (International Union of Geological Sciences) classification and several supplementary diagrams; tectonic discrimination diagrams; pseudo-quaternary projections; least squares fitting of lines, polynomials and hyperbolae; magma mixing using two endmembers, histograms, x-y plots, ternary plots and spider-diagrams. The advanced capabilities of Igpet are multi-element mixing and magma evolution modeling. Mixing models are particularly useful for understanding the isotopic variations in rock suites that evolved by mixing different sources. The important melting models include, batch melting, fractional melting and aggregated fractional melting. Crystallization models include equilibrium and fractional crystallization and AFC (assimilation and fractional crystallization). Theses, reports and proposals concerning igneous petrology are improved by numerical modeling. For reviewed publications some elements of modeling are practically a requirement. Our intention in providing this software is to facilitate improved communication and lower entry barriers to research, especially for students.

  1. Ocean Model, Analysis and Prediction System version 3: operational global ocean forecasting

    Science.gov (United States)

    Brassington, Gary; Sandery, Paul; Sakov, Pavel; Freeman, Justin; Divakaran, Prasanth; Beckett, Duan

    2017-04-01

    The Ocean Model, Analysis and Prediction System version 3 (OceanMAPSv3) is a near-global (75S-75N; no sea ice), uniform horizontal resolution (0.1° x 0.1°), 51 vertical level ocean forecast system producing daily analyses and 7 day forecasts. This system was declared operational at the Bureau of Meteorology in April 2016, subsequently upgraded to include ACCESS-G APS2 in June 2016, and finally ported to the Bureau's new supercomputer in September 2016. This system realises the original vision of the BLUElink projects (2003-2015) to provide global forecasts of ocean geostrophic turbulence (eddies and fronts) in support of naval operations as well as other national services. The analysis system has retained an ensemble-based optimal interpolation method with 144 stationary ensemble members derived from a multi-year hindcast. However, the BODAS code has been upgraded to a new code base, ENKF-C. A new strategy for initialisation has been introduced, leading to greater retention of analysis increments and reduced shock. The analysis cycle has been optimised for a 3-cycle system with 3 day observation windows, retaining an advantage as a multi-cycle, time-lagged ensemble. The sea surface temperature and sea surface height anomaly analysis errors in the Australian region are 0.34 deg C and 6.2 cm respectively, improvements of 10% and 20% over version 2. In addition, the RMSE of the 7 day forecast is lower than that of the 1 day forecast from the previous system (version 2). International intercomparisons have shown that this system is comparable in performance with the two leading systems and is often the leading performer for surface temperature and upper ocean temperature. We present an overview of the system, the data assimilation and initialisation, demonstrate the performance and outline future directions.
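
    The ensemble-based optimal interpolation step described here can be sketched compactly: a static ensemble supplies the background covariance, and the analysis adds a gain-weighted innovation. The Python sketch below shows the generic update (hypothetical variable names; not the OceanMAPS/ENKF-C code):

        import numpy as np

        def enoi_update(xb, A, H, y, R, alpha=1.0):
            """One ensemble optimal interpolation (EnOI) analysis step.

            xb: background state (n,); A: stationary ensemble anomalies (n, m);
            H: observation operator (p, n); y: observations (p,);
            R: observation error covariance (p, p); alpha scales the static
            ensemble covariance.
            """
            m = A.shape[1]
            Pb = alpha * (A @ A.T) / (m - 1)      # static background covariance
            S = H @ Pb @ H.T + R                  # innovation covariance
            K = Pb @ H.T @ np.linalg.inv(S)       # gain
            return xb + K @ (y - H @ xb)          # analysis state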

  2. Development of Arabic version of Berlin questionnaire to identify obstructive sleep apnea at risk patients

    Directory of Open Access Journals (Sweden)

    Abdel Baset M Saleh

    2011-01-01

    Results: The study demonstrated a high degree of internal consistency and stability over time for the developed ABQ. The Cronbach's alpha coefficient for the 10-item tool was 0.92. Validation of the ABQ against an AHI cutoff of >5 revealed a sensitivity of 97%, a specificity of 90%, and positive and negative predictive values of 96% and 93%, respectively. Conclusion: The ABQ is a reliable and valid scale for screening patients at risk of OSA among Arabic-speaking nations, especially in resource-limited settings.
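
    The screening statistics quoted above follow directly from the 2x2 table of questionnaire outcome against the AHI reference standard; a minimal Python sketch with hypothetical counts:

        def screening_stats(tp, fp, fn, tn):
            """Sensitivity, specificity, PPV and NPV from a 2x2 table.

            Counts are questionnaire outcome vs. the AHI > 5 reference
            standard (hypothetical inputs).
            """
            return (tp / (tp + fn),      # sensitivity
                    tn / (tn + fp),      # specificity
                    tp / (tp + fp),      # positive predictive value
                    tn / (tn + fn))      # negative predictive value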

  3. APPLICATION OF TWO VERSIONS OF A RNG BASED k-ε MODEL TO NUMERICAL SIMULATIONS OF TURBULENT IMPINGING JET FLOW

    Institute of Scientific and Technical Information of China (English)

    Chen Qing-guang; Xu Zhong; Zhang Yong-jian

    2003-01-01

    Two independent versions of the RNG based k-ε turbulence model, in conjunction with the law of the wall, have been applied to the numerical simulation of an axisymmetric turbulent impinging jet flow field. The two model predictions are compared with those of the standard k-ε model and with experimental data measured by LDV (Laser Doppler Velocimetry). The results show that the original version of the RNG k-ε model with the choice Cε1 = 1.063 cannot yield good results; the predicted turbulent kinetic energy profiles in the vicinity of the stagnation region are even worse than those predicted by the standard k-ε model. The new version of the RNG k-ε model, however, behaves well. This is mainly due to the corrections to the constants Cε1 and Cε2, along with a modification of the production term to account for non-equilibrium strain rates in the flow.

  4. Description of the Earth system model of intermediate complexity LOVECLIM version 1.2

    Directory of Open Access Journals (Sweden)

    H. Goosse

    2010-11-01

    The main characteristics of the new version 1.2 of the three-dimensional Earth system model of intermediate complexity LOVECLIM are briefly described. LOVECLIM 1.2 includes representations of the atmosphere, the ocean and sea ice, the land surface (including vegetation), the ice sheets, the icebergs and the carbon cycle. The atmospheric component is ECBilt2, a T21, 3-level quasi-geostrophic model. The ocean component is CLIO3, which consists of an ocean general circulation model coupled to a comprehensive thermodynamic-dynamic sea-ice model. Its horizontal resolution is 3° by 3°, and there are 20 levels in the ocean. ECBilt-CLIO is coupled to VECODE, a vegetation model that simulates the dynamics of two main terrestrial plant functional types, trees and grasses, as well as desert. VECODE also simulates the evolution of the carbon cycle over land, while the ocean carbon cycle is represented by LOCH, a comprehensive model that takes into account both the solubility and biological pumps. The ice sheet component AGISM is made up of a three-dimensional thermomechanical model of the ice sheet flow, a visco-elastic bedrock model and a model of the mass balance at the ice-atmosphere and ice-ocean interfaces. For both the Greenland and Antarctic ice sheets, calculations are made on a 10 km by 10 km resolution grid with 31 sigma levels. LOVECLIM 1.2 reproduces well the major characteristics of the observed climate, both for present-day conditions and for key past periods such as the last millennium, the mid-Holocene and the Last Glacial Maximum. However, despite some improvements compared to earlier versions, some biases are still present in the model. The most serious ones are mainly located at low latitudes, with an overestimation of the temperature there, a too symmetric distribution of precipitation between the two hemispheres, and an overestimation of precipitation and vegetation cover in the subtropics. In addition, the atmospheric circulation is

  5. Exact solution for a metapopulation version of Schelling’s model

    Science.gov (United States)

    Durrett, Richard; Zhang, Yuan

    2014-01-01

    In 1971, Schelling introduced a model in which families move if they have too many neighbors of the opposite type. In this paper, we will consider a metapopulation version of the model in which a city is divided into N neighborhoods, each of which has L houses. There are ρNL red families and ρNL blue families for some ρ < 1/2. Families are happy if there are at most ρcL families of the opposite type in their neighborhood, and unhappy otherwise. Our main result is that there are critical values ρb < ρd so that for ρ < ρb the two types are distributed randomly in equilibrium; when ρ > ρb, a new segregated equilibrium appears; for ρb < ρ < ρd, there is bistability, but when ρ increases past ρd the random state is no longer stable. When ρc is small enough, the random state will again be the stationary distribution when ρ is close to 1/2. If so, this is preceded by a region of bistability. PMID:25225367
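
    The mechanism is easy to prototype. The Python sketch below is a discrete-time caricature of the metapopulation dynamics (the paper works in continuous time with move rates and derives exact results); all parameters are illustrative.

        import random

        def schelling_metapop(N=50, L=100, rho=0.4, rho_c=0.3,
                              steps=200_000, seed=1):
            """Discrete-time caricature of the metapopulation Schelling model."""
            rng = random.Random(seed)
            n_fam = int(rho * N * L)
            red, blue = [0] * N, [0] * N
            for counts in (red, blue):            # random placement within capacity
                placed = 0
                while placed < n_fam:
                    i = rng.randrange(N)
                    if red[i] + blue[i] < L:
                        counts[i] += 1
                        placed += 1
            for _ in range(steps):
                i = rng.randrange(N)
                if red[i] + blue[i] == 0:
                    continue
                is_red = rng.random() < red[i] / (red[i] + blue[i])
                opposite = blue[i] if is_red else red[i]
                if opposite > rho_c * L:          # unhappy family tries to move
                    j = rng.randrange(N)
                    if j != i and red[j] + blue[j] < L:
                        if is_red:
                            red[i] -= 1; red[j] += 1
                        else:
                            blue[i] -= 1; blue[j] += 1
            return red, blue                      # per-neighborhood counts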

  6. New Source Term Model for the RESRAD-OFFSITE Code Version 3

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Charley [Argonne National Lab. (ANL), Argonne, IL (United States); Gnanapragasam, Emmanuel [Argonne National Lab. (ANL), Argonne, IL (United States); Cheng, Jing-Jy [Argonne National Lab. (ANL), Argonne, IL (United States); Kamboj, Sunita [Argonne National Lab. (ANL), Argonne, IL (United States); Chen, Shih-Yew [Argonne National Lab. (ANL), Argonne, IL (United States)

    2013-06-01

    This report documents the new source term model developed and implemented in Version 3 of the RESRAD-OFFSITE code. The new source term model includes: (1) a "first order release with transport" option, in which the release of the radionuclide is proportional to the inventory in the primary contamination, with the user-specified leach rate as the proportionality constant; (2) an "equilibrium desorption release" option, in which the user specifies the distribution coefficient that quantifies the partitioning of the radionuclide between the solid and aqueous phases; and (3) a "uniform release" option, in which the radionuclides are released from a constant fraction of the initially contaminated material during each time interval and the user specifies the duration over which the radionuclides are released.
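
    The three release options differ only in how the release is computed from the source inventory. The Python sketch below caricatures each one (hypothetical names; radioactive decay and the transport part of option 1 are deliberately omitted):

        import numpy as np

        def first_order_release(inventory0, leach_rate, years):
            """Option 1: release proportional to remaining inventory.

            The user-specified leach rate is the proportionality constant;
            decay and transport are omitted in this caricature.
            """
            t = np.arange(years + 1)
            inventory = inventory0 * np.exp(-leach_rate * t)
            return -np.diff(inventory)          # activity released each year

        def uniform_release(inventory0, duration, years):
            """Option 3: a constant fraction of the initial inventory per year."""
            release = np.zeros(years)
            release[:duration] = inventory0 / duration
            return release

        def pore_water_concentration(soil_conc, kd):
            """Option 2: equilibrium desorption, Cw = Cs / Kd."""
            return soil_conc / kd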

  7. Validity study of the Beck Anxiety Inventory (Portuguese version) by the Rasch Rating Scale model

    Directory of Open Access Journals (Sweden)

    Sónia Quintão

    2013-01-01

    Our objective was to conduct a validation study of the Portuguese version of the Beck Anxiety Inventory (BAI) by means of the Rasch rating scale model, and then compare it with the anxiety scales most used in Portugal. The sample consisted of 1,160 adults (427 men and 733 women), aged 18-82 years old (M = 33.39; SD = 11.85). The instruments were the Beck Anxiety Inventory, the State-Trait Anxiety Inventory and the Zung Self-Rating Anxiety Scale. It was found that the Beck Anxiety Inventory's four-category rating system, the data-model fit, and the person reliability were adequate. The measure can be considered unidimensional. Gender and age-related differences were not a threat to validity. The BAI correlated significantly with the other anxiety measures. In conclusion, the BAI shows good psychometric quality.
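
    For context, the Rasch rating scale model used here has the standard Andrich form, with person parameter θ_n, item location δ_i, shared category thresholds τ_j, and the empty sum for m = 0 taken as zero:

        \[ P(X_{ni}=x) = \frac{\exp\!\left[x(\theta_n-\delta_i)-\sum_{j=1}^{x}\tau_j\right]}{\sum_{m=0}^{M}\exp\!\left[m(\theta_n-\delta_i)-\sum_{j=1}^{m}\tau_j\right]}, \qquad x = 0,1,\dots,M. \]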

  8. Development of a user-friendly interface version of the Salmonella source-attribution model

    DEFF Research Database (Denmark)

    Hald, Tine; Lund, Jan

    …of questions, where the use of a classical quantitative risk assessment model (i.e. transmission models) would be impaired due to a lack of data and time limitations. As these models require specialist knowledge, it was requested by EFSA to develop a flexible, user-friendly source attribution model for use… with a user manual, which is also part of this report. Users of the interface are recommended to read this report before starting to use the interface, to become familiar with the model principles and the mathematics behind them, which is required in order to interpret the model results and assess the validity…

  9. An improved version of the consequence analysis model for chemical emergencies, ESCAPE

    Science.gov (United States)

    Kukkonen, J.; Nikmo, J.; Riikonen, K.

    2017-02-01

    We present a refined version of a mathematical model called ESCAPE, "Expert System for Consequence Analysis and Preparing for Emergencies". The model has been designed for evaluating releases of toxic and flammable gases into the atmosphere, their atmospheric dispersion, and the effects on humans and the environment. We describe (i) the mathematical treatments of this model, (ii) a verification and evaluation of the model against selected experimental field data, and (iii) a new operational implementation of the model. The new mathematical treatments include state-of-the-art atmospheric vertical profiles and new submodels for dense gas and passive atmospheric dispersion. The model performance was first successfully verified using the data of the Thorney Island campaign, and then evaluated against the Desert Tortoise campaign. For the latter campaign, the geometric mean bias was 1.72 (this corresponds to an underprediction of approximately 70%) and 0.71 (overprediction of approximately 30%) for the concentration and the plume half-width, respectively. The new operational implementation of the model runs on computers, tablets and mobile phones. The predicted results can be post-processed using geographic information systems. The model has already proved to be a useful assessment tool for the needs of emergency response authorities in contingency planning.
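
    The geometric measures quoted above are standard dispersion-model metrics: with paired observed and predicted values, MG = exp(mean ln(Co/Cp)) and VG = exp(mean [ln(Co/Cp)]^2), so MG > 1 indicates underprediction. A minimal Python sketch:

        import numpy as np

        def geometric_stats(observed, predicted):
            """Geometric mean bias MG and geometric variance VG.

            MG = exp(mean ln(Co/Cp)) and VG = exp(mean [ln(Co/Cp)]^2);
            MG > 1 means underprediction, VG >= 1 measures scatter.
            Inputs are paired, positive concentration samples.
            """
            logs = np.log(np.asarray(observed, float) /
                          np.asarray(predicted, float))
            return np.exp(logs.mean()), np.exp((logs ** 2).mean())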

  10. VALIDATION OF THE ASTER GLOBAL DIGITAL ELEVATION MODEL VERSION 3 OVER THE CONTERMINOUS UNITED STATES

    Directory of Open Access Journals (Sweden)

    D. Gesch

    2016-06-01

    The ASTER Global Digital Elevation Model Version 3 (GDEM v3) was evaluated over the conterminous United States in a manner similar to the validation conducted for the original GDEM Version 1 (v1) in 2009 and GDEM Version 2 (v2) in 2011. The absolute vertical accuracy of GDEM v3 was calculated by comparison with more than 23,000 independent reference geodetic ground control points from the U.S. National Geodetic Survey. The root mean square error (RMSE) measured for GDEM v3 is 8.52 meters. This compares with the RMSE of 8.68 meters for GDEM v2. Another important descriptor of vertical accuracy is the mean error, or bias, which indicates if a DEM has an overall vertical offset from true ground level. The GDEM v3 mean error of −1.20 meters reflects an overall negative bias in GDEM v3. The absolute vertical accuracy assessment results, both mean error and RMSE, were segmented by land cover type to provide insight into how GDEM v3 performs in various land surface conditions. While the RMSE varies little across cover types (6.92 to 9.25 meters), the mean error (bias) does appear to be affected by land cover type, ranging from −2.99 to +4.16 meters across 14 land cover classes. These results indicate that in areas where built or natural aboveground features are present, GDEM v3 is measuring elevations above the ground level, a condition noted in assessments of previous GDEM versions (v1 and v2) and an expected condition given the type of stereo-optical image data collected by ASTER. GDEM v3 was also evaluated by differencing with the Shuttle Radar Topography Mission (SRTM) dataset. In many forested areas, GDEM v3 has elevations that are higher in the canopy than SRTM. The overall validation effort also included an evaluation of the GDEM v3 water mask. In general, the number of distinct water polygons in GDEM v3 is much lower than the number in a reference land cover dataset, but the total areas compare much more closely.

  11. Immersion freezing by natural dust based on a soccer ball model with the Community Atmospheric Model version 5: climate effects

    Science.gov (United States)

    Wang, Yong; Liu, Xiaohong

    2014-12-01

    We introduce a simplified version of the soccer ball model (SBM) developed by Niedermeier et al. (2014, Geophys. Res. Lett., 41, 736-741) into the Community Atmospheric Model version 5 (CAM5). This is the first time the SBM has been used in an atmospheric model to parameterize heterogeneous ice nucleation. The SBM, simplified for suitable application in atmospheric models, uses classical nucleation theory to describe immersion/condensation freezing by dust in the mixed-phase cloud regime. Uncertain parameters (mean contact angle, standard deviation of the contact angle probability distribution, and number of surface sites) in the SBM are constrained by fitting them to recent natural (Saharan) dust datasets. With the SBM in CAM5, we investigate the sensitivity of modeled cloud properties to the SBM parameters, and find significant seasonal and regional differences in the sensitivity among the three SBM parameters. Changes in the mean contact angle and the number of surface sites lead to changes in cloud properties in the Arctic in spring, which could be attributed to the transport of dust ice nuclei to this region. In winter, significant changes in cloud properties induced by these two parameters occur mainly in northern hemispheric mid-latitudes (e.g., East Asia). In comparison, no obvious changes in cloud properties caused by changes in the standard deviation can be found in any season. These results are valuable for understanding heterogeneous ice nucleation behavior, and useful for guiding future model development.
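
    The essence of the SBM is that each particle carries a number of surface sites whose contact angles are drawn from a normal distribution, and classical nucleation theory supplies a rate for each angle. The Python sketch below reproduces that averaging structure with a user-supplied toy rate function; it illustrates the idea only and is not the CAM5 parameterization.

        import numpy as np

        def frozen_fraction(j_het, mu_theta, sigma_theta, site_area, time,
                            n_sites, n_quad=201):
            """Ensemble frozen fraction in a soccer-ball-type description.

            Contact angles theta (radians) follow N(mu_theta, sigma_theta);
            a particle freezes if any of its n_sites sites nucleates.
            j_het(theta) is a user-supplied heterogeneous nucleation rate
            (1/m^2/s); all names and values here are illustrative.
            """
            theta = np.linspace(mu_theta - 4 * sigma_theta,
                                mu_theta + 4 * sigma_theta, n_quad)
            p = np.exp(-0.5 * ((theta - mu_theta) / sigma_theta) ** 2)
            p /= np.trapz(p, theta)                 # normalized angle pdf
            # probability that a single site has not nucleated by 'time'
            p_site = np.trapz(p * np.exp(-j_het(theta) * site_area * time), theta)
            return 1.0 - p_site ** n_sites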

  12. Risk matrix model for rotating equipment

    Directory of Open Access Journals (Sweden)

    Wassan Rano Khan

    2014-07-01

    Different industries have different residual risk levels for their rotating equipment; accordingly, the occurrence rates of failures and the associated failure consequence categories differ. A generalized risk matrix model is therefore developed in this study which can fit the various available risk matrix standards. This generalized risk matrix will be helpful for developing new risk matrices that fit the required risk assessment scenario for rotating equipment. A power generation system was taken as a case study. It was observed that eight subsystems were at risk. Only the vibration monitoring system was in the high risk category, while the remaining seven subsystems were in the serious and medium risk categories.
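
    Operationally, a risk matrix is a lookup from likelihood and consequence indices to a category, and a "generalized" matrix amounts to making the score function and cut-offs configurable. A minimal Python sketch (the categories and thresholds below are placeholders, not the paper's calibration):

        def risk_category(likelihood, consequence,
                          thresholds=((15, "high"), (8, "serious"), (3, "medium"))):
            """Map likelihood and consequence indices (1-5 each) to a category.

            score = likelihood * consequence; categories and cut-offs are
            placeholders that would be tuned to the standard being fitted.
            """
            score = likelihood * consequence
            for cutoff, label in thresholds:
                if score >= cutoff:
                    return label
            return "low"

        # usage: risk_category(4, 5) -> "high"; risk_category(1, 2) -> "low"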

  13. NGNP Risk Management Database: A Model for Managing Risk

    Energy Technology Data Exchange (ETDEWEB)

    John Collins; John M. Beck

    2011-11-01

    The Next Generation Nuclear Plant (NGNP) Risk Management System (RMS) is a database used to maintain the project risk register. The RMS also maps risk reduction activities to specific identified risks. Further functionality of the RMS includes mapping reactor suppliers' Design Data Needs (DDNs) to risk reduction tasks and mapping Phenomena Identification and Ranking Tables (PIRTs) to associated risks. This document outlines the basic instructions on how to use the RMS. This document constitutes Revision 1 of the NGNP Risk Management Database: A Model for Managing Risk. It incorporates the latest enhancements to the RMS. The enhancements include six new custom views of risk data: Impact/Consequence, Tasks by Project Phase, Tasks by Status, Tasks by Project Phase/Status, Tasks by Impact/WBS, and Tasks by Phase/Impact/WBS.

  14. UNSAT-H Version 2.0: Unsaturated soil water and heat flow model

    Energy Technology Data Exchange (ETDEWEB)

    Fayer, M.J.; Jones, T.L.

    1990-04-01

    This report documents UNSAT-H Version 2.0, a model for calculating water and heat flow in unsaturated media. The documentation includes the bases for the conceptual model and its numerical implementation, benchmark test cases, example simulations involving layered soils and plant transpiration, and the code listing. Waste management practices at the Hanford Site have included disposal of low-level wastes by near-surface burial. Predicting the future long-term performance of any such burial site in terms of migration of contaminants requires a model capable of simulating water flow in the unsaturated soils above the buried waste. The model currently used to meet this need is UNSAT-H. This model was developed at Pacific Northwest Laboratory to assess water dynamics of near-surface, waste-disposal sites at the Hanford Site. The code is primarily used to predict deep drainage as a function of such environmental conditions as climate, soil type, and vegetation. UNSAT-H is also used to simulate the effects of various practices to enhance isolation of wastes. 66 refs., 29 figs., 7 tabs.
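
    Models of this class solve a form of the Richards equation for unsaturated water flow; for reference, a standard mixed form is shown below (this is a textbook statement, not a transcription of the UNSAT-H equations, which also include coupled heat flow):

        \[ \frac{\partial \theta}{\partial t} = \frac{\partial}{\partial z}\!\left[K(h)\left(\frac{\partial h}{\partial z}+1\right)\right]-S(z,t), \]

    where θ is the volumetric water content, h the matric head, K(h) the unsaturated hydraulic conductivity, z the vertical coordinate (positive upward), and S a sink term such as root water uptake.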

  15. A new version of the NeQuick ionosphere electron density model

    Science.gov (United States)

    Nava, B.; Coïsson, P.; Radicella, S. M.

    2008-12-01

    NeQuick is a three-dimensional, time-dependent ionospheric electron density model developed at the Aeronomy and Radiopropagation Laboratory of the Abdus Salam International Centre for Theoretical Physics (ICTP), Trieste, Italy, and at the Institute for Geophysics, Astrophysics and Meteorology of the University of Graz, Austria. It is a quick-run model particularly tailored for trans-ionospheric applications that allows one to calculate the electron concentration at any given location in the ionosphere and thus the total electron content (TEC) along any ground-to-satellite ray path by means of numerical integration. Taking advantage of the increasing amount of available data, the model formulation is continuously updated to improve NeQuick's capability to represent the ionosphere at global scales. Recently, major changes have been introduced in the model's topside formulation, and important modifications have also been introduced in the bottomside description. In addition, specific revisions have been applied to the computer package associated with NeQuick in order to improve its computational efficiency. It has therefore been considered appropriate to finalize all the model developments in a new version of NeQuick. In the present work the main features of NeQuick 2 are illustrated and some results related to validation tests are reported.
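
    The TEC calculation described here is a numerical integration of electron density along the ray path. The Python sketch below does the vertical-path version with a toy Chapman-layer profile standing in for the NeQuick profile formulation (all values illustrative):

        import numpy as np

        def chapman_density(h_km, nmax=1.0e12, hmax=350.0, scale=60.0):
            """Toy Chapman-layer electron density (el/m^3) vs height in km."""
            z = (h_km - hmax) / scale
            return nmax * np.exp(0.5 * (1.0 - z - np.exp(-z)))

        def vertical_tec(h_km, density_fn=chapman_density):
            """Integrate an electron density profile to TEC units.

            Mirrors the ground-to-satellite integration NeQuick performs,
            but with a stand-in profile; 1 TECU = 1e16 el/m^2.
            """
            ne = density_fn(h_km)
            return np.trapz(ne, h_km * 1.0e3) / 1.0e16

        # usage: vertical_tec(np.linspace(80.0, 2000.0, 4000))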

  16. A description of the FAMOUS (version XDBUA) climate model and control run

    Directory of Open Access Journals (Sweden)

    A. Osprey

    2008-12-01

    FAMOUS is an ocean-atmosphere general circulation model of low resolution, capable of simulating approximately 120 years of model climate per wallclock day using current high performance computing facilities. It uses most of the same code as HadCM3, a widely used climate model of higher resolution and computational cost, and has been tuned to reproduce the same climate reasonably well. FAMOUS is useful for climate simulations where the computational cost makes the application of HadCM3 unfeasible, either because of the length of simulation or the size of the ensemble desired. We document a number of scientific and technical improvements to the original version of FAMOUS. These improvements include changes to the parameterisations of ozone and sea-ice which alleviate a significant cold bias from high northern latitudes and the upper troposphere, and the elimination of volume-averaged drifts in ocean tracers. A simple model of the marine carbon cycle has also been included. A particular goal of FAMOUS is to conduct millennial-scale paleoclimate simulations of Quaternary ice ages; to this end, a number of useful changes to the model infrastructure have been made.

  17. Solid Waste Projection Model: Database User's Guide. Version 1.4

    Energy Technology Data Exchange (ETDEWEB)

    Blackburn, C.L.

    1993-10-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC) specifically to address Hanford solid waste management issues. This document is one of a set of documents supporting the SWPM system and providing instructions in the use and maintenance of SWPM components. This manual contains instructions for using Version 1.4 of the SWPM database: system requirements and preparation, entering and maintaining data, and performing routine database functions. This document supports only those operations which are specific to SWPM database menus and functions and does not provide instruction in the use of Paradox, the database management system in which the SWPM database is established.

  18. Hydrogeochemical evaluation of the Forsmark site, model version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Laaksoharju, Marcus (ed.) [GeoPoint AB, Sollentuna (Sweden); Gimeno, Maria; Auque, Luis; Gomez, Javier [Univ. of Zaragoza (Spain). Dept. of Earth Sciences; Smellie, John [Conterra AB, Uppsala (Sweden); Tullborg, Eva-Lena [Terralogica AB, Graabo (Sweden); Gurban, Ioana [3D-Terra, Montreal (Canada)

    2004-01-01

    Siting studies for SKB's programme of deep geological disposal of nuclear fuel waste currently involve the investigation of two locations, Forsmark and Simpevarp, on the eastern coast of Sweden, to determine their geological, geochemical and hydrogeological characteristics. The work completed to date has resulted in model version 1.1, which represents the first evaluation of the available Forsmark groundwater analytical data collected up to May 1, 2003 (i.e. the first 'data freeze'). The HAG group had access to a total of 456 water samples, collected mostly from the surface and sub-surface environment (e.g. soil pipes in the overburden, streams and lakes); only a few samples were collected from drilled boreholes. The deepest samples reflected depths down to 200 m. Furthermore, most of the waters sampled (74%) lacked crucial analytical information, which restricted the evaluation. Consequently, model version 1.1 focussed on the processes taking place in the uppermost part of the bedrock rather than at repository levels. The complex groundwater evolution and patterns at Forsmark are a result of many factors such as: a) the flat topography and closeness to the Baltic Sea, resulting in relatively small hydrogeological driving forces which can preserve old water types from being flushed out, b) the changes in hydrogeology related to glaciation/deglaciation and land uplift, c) repeated marine/lake water regressions/transgressions, and d) organic or inorganic alteration of the groundwater caused by microbial processes or water/rock interactions. The sampled groundwaters reflect to various degrees modern or ancient water/rock interactions and mixing processes. Based on the general geochemical character and the apparent age, two major water types occur in Forsmark: fresh-meteoric waters with a bicarbonate imprint and low residence times (tritium values above detection limit), and brackish-marine waters with Cl contents up to 6,000 mg/L and longer residence times (tritium

  1. Thermal modelling. Preliminary site description Simpevarp subarea - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Sundberg, Jan; Back, Paer-Erik; Bengtsson, Anna; Laendell, Maerta [Geo Innova AB, Linkoeping (Sweden)

    2005-08-15

    This report presents the thermal site descriptive model for the Simpevarp subarea, version 1.2. The main objective of this report is to present the thermal modelling work, in which data have been identified, quality controlled, evaluated and summarised in order to make upscaling to the lithological domain level possible. The thermal conductivity at possible canister scale has been modelled for four different lithological domains: RSMA01 (Aevroe granite), RSMB01 (fine-grained dioritoid), RSMC01 (a mixture of Aevroe granite and quartz monzodiorite), and RSMD01 (quartz monzodiorite). A main modelling approach has been used to determine the mean value of the thermal conductivity. Three alternative/complementary approaches have been used to evaluate the spatial variability of the thermal conductivity at domain level. The thermal modelling approaches are based on the lithological model for the Simpevarp subarea, version 1.2, together with rock type models constituted from measured and calculated (from mineral composition) thermal conductivities. For one rock type, the Aevroe granite (501044), density loggings within the specific rock type have also been used in the domain modelling in order to consider the spatial variability within the Aevroe granite. This has been possible due to the presented relationship between density and thermal conductivity, valid for the Aevroe granite. Results indicate that the mean of thermal conductivity is expected to exhibit only a small variation between the different domains, from 2.62 W/(m.K) to 2.80 W/(m.K). The standard deviation varies according to the scale considered, and for the canister scale it is expected to range from 0.20 to 0.28 W/(m.K). Consequently, the lower confidence limit (95% confidence) for the canister scale is within the range 2.04-2.35 W/(m.K) for the different domains. The temperature dependence is rather small, with a decrease in thermal conductivity of 1.1-3.4% per 100 deg C increase in temperature for the dominating rock
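
    The reported temperature dependence is small and near-linear, so a first-order correction of a reference conductivity is a reasonable sketch (the linear form and the default coefficient below are our assumptions, bounded by the report's 1.1-3.4% per 100 deg C figures):

        def conductivity_at_temperature(k_ref, t_c, t_ref=20.0, frac_per_100c=0.025):
            """Linearly temperature-adjusted thermal conductivity, W/(m.K).

            frac_per_100c is the fractional decrease per 100 deg C; the report
            quotes 1.1-3.4% for the dominating rock types, so 0.025 is just a
            mid-range placeholder. The linear form is our assumption.
            """
            return k_ref * (1.0 - frac_per_100c * (t_c - t_ref) / 100.0)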

  2. Thermal modelling. Preliminary site description Laxemar subarea - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Sundberg, Jan; Wrafter, John; Back, Paer-Erik; Laendell, Maerta [Geo Innova AB, Linkoeping (Sweden)

    2006-02-15

    This report presents the thermal site descriptive model for the Laxemar subarea, version 1.2. The main objective of this report is to present the thermal modelling work, in which data have been identified, quality controlled, evaluated and summarised in order to make upscaling to the lithological domain level possible. The thermal conductivity at canister scale has been modelled for five different lithological domains: RSMA (Aevroe granite), RSMBA (a mixture of Aevroe granite and fine-grained dioritoid), RSMD (quartz monzodiorite), RSME (diorite/gabbro) and RSMM (a mixed domain with a high frequency of diorite to gabbro). A base modelling approach has been used to determine the mean value of the thermal conductivity. Four alternative/complementary approaches have been used to evaluate the spatial variability of the thermal conductivity at domain level. The thermal modelling approaches are based on the lithological domain model for the Laxemar subarea, version 1.2, together with rock type models based on measured and calculated (from mineral composition) thermal conductivities. For one rock type, Aevroe granite (501044), density loggings have also been used in the domain modelling in order to evaluate the spatial variability within the Aevroe granite. This has been possible due to an established relationship between density and thermal conductivity, valid for the Aevroe granite. Results indicate that the means of thermal conductivity for the various domains are expected to exhibit a variation from 2.45 W/(m.K) to 2.87 W/(m.K). The standard deviation varies according to the scale considered, and for the 0.8 m scale it is expected to range from 0.17 to 0.29 W/(m.K). Estimates of lower tail percentiles for the same scale are presented for all five domains. The temperature dependence is rather small, with a decrease in thermal conductivity of 1.1-5.3% per 100 deg C increase in temperature for the dominant rock types. There are a number of important uncertainties associated with these

  3. Technical Note: Chemistry-climate model SOCOL: version 2.0 with improved transport and chemistry/microphysics schemes

    Directory of Open Access Journals (Sweden)

    M. Schraner

    2008-10-01

    We describe version 2.0 of the chemistry-climate model (CCM) SOCOL. The new version includes fundamental changes to the transport scheme, such as transporting all chemical species of the model individually and applying a family-based mass-conservation correction scheme for species of the nitrogen, chlorine and bromine groups; a revised transport scheme for ozone; more detailed halogen reaction and deposition schemes; and a new cirrus parameterisation in the tropical tropopause region. By means of these changes the model manages to overcome or considerably reduce deficiencies recently identified in SOCOL version 1.1 within the CCM Validation activity of SPARC (CCMVal). In particular, as a consequence of these changes, regional mass loss or accumulation artificially caused by the semi-Lagrangian transport scheme can be significantly reduced, leading to much more realistic distributions of the modelled chemical species, most notably of the halogens and ozone.
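
    The family-based mass-conservation correction can be pictured as a proportional rescaling of the individually transported members so that their sum matches the separately conserved family total. A minimal Python sketch of that idea (the actual SOCOL scheme is more elaborate):

        import numpy as np

        def family_mass_fix(members, family_total):
            """Rescale transported members so the family sum is conserved.

            'members' are mixing ratios of, e.g., the chlorine-group species
            after semi-Lagrangian transport; 'family_total' is the separately
            transported (conserved) family mixing ratio.
            """
            members = np.asarray(members, dtype=float)
            s = members.sum()
            if s <= 0.0:
                return members
            return members * (family_total / s)   # proportional correction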

  4. Credit Risk Modelling and Implementation of Credit Risk Models in China

    OpenAIRE

    Yu, Mengxiao

    2007-01-01

    Credit risk, or the risk of counterparty default, is an important factor in the valuation and risk management of financial assets, and it has become increasingly important to financial institutions. A variety of credit risk models have been developed to measure credit risk. They include J.P. Morgan's CreditMetrics; KMV's PortfolioManager, based on the Merton (1974) option pricing model; the macroeconomic model CreditPortfolioView, developed by McKinsey; and CSFB's CreditRisk+ model, based on actuarial science fra...
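
    Of the models listed, the Merton (1974) approach is the most compact to state: the firm defaults if its asset value ends below the face value of its debt. A minimal Python sketch of the resulting default probability, assuming standard lognormal asset dynamics (illustrative parameter values):

        from math import log, sqrt
        from scipy.stats import norm

        def merton_default_probability(V, D, mu, sigma, T):
            """Merton (1974) structural default probability.

            V: asset value; D: debt face value due at T; mu: asset drift;
            sigma: asset volatility. Default if assets end below D at T,
            assuming geometric Brownian asset dynamics.
            """
            d2 = (log(V / D) + (mu - 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
            return norm.cdf(-d2)

        # usage: merton_default_probability(V=120.0, D=100.0, mu=0.06,
        #                                   sigma=0.25, T=1.0)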

  5. Systems Biology Markup Language (SBML) Level 2 Version 5: Structures and Facilities for Model Definitions.

    Science.gov (United States)

    Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J

    2015-09-04

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org.
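
    To illustrate the declarative XML form the specification prescribes, the sketch below assembles a schematic SBML Level 2 Version 5 skeleton with Python's standard library; the element and attribute choices shown are a minimal subset, and real applications would use dedicated, validating tooling such as libSBML.

        import xml.etree.ElementTree as ET

        # Schematic SBML Level 2 Version 5 skeleton; the namespace follows
        # the published level/version URI pattern.
        SBML_NS = "http://www.sbml.org/sbml/level2/version5"

        sbml = ET.Element("sbml", xmlns=SBML_NS, level="2", version="5")
        model = ET.SubElement(sbml, "model", id="example_model")
        compartments = ET.SubElement(model, "listOfCompartments")
        ET.SubElement(compartments, "compartment", id="cell", size="1.0")
        species = ET.SubElement(model, "listOfSpecies")
        ET.SubElement(species, "species", id="S1", compartment="cell",
                      initialAmount="10.0")

        print(ET.tostring(sbml, encoding="unicode"))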

  6. User manual for GEOCOST: a computer model for geothermal cost analysis. Volume 2. Binary cycle version

    Energy Technology Data Exchange (ETDEWEB)

    Huber, H.D.; Walter, R.A.; Bloomster, C.H.

    1976-03-01

    A computer model called GEOCOST has been developed to simulate the production of electricity from geothermal resources and calculate the potential costs of geothermal power. GEOCOST combines resource characteristics, power recovery technology, tax rates, and financial factors into one systematic model and provides the flexibility to individually or collectively evaluate their impacts on the cost of geothermal power. Both the geothermal reservoir and power plant are simulated to model the complete energy production system. In the version of GEOCOST in this report, geothermal fluid is supplied from wells distributed throughout a hydrothermal reservoir through insulated pipelines to a binary power plant. The power plant is simulated using a binary fluid cycle in which the geothermal fluid is passed through a series of heat exchangers. The thermodynamic state points in basic subcritical and supercritical Rankine cycles are calculated for a variety of working fluids. Working fluids which are now in the model include isobutane, n-butane, R-11, R-12, R-22, R-113, R-114, and ammonia. Thermodynamic properties of the working fluids at the state points are calculated using empirical equations of state. The Starling equation of state is used for hydrocarbons and the Martin-Hou equation of state is used for fluorocarbons and ammonia. Physical properties of working fluids at the state points are calculated.

  7. Modelling waste stabilisation ponds with an extended version of ASM3.

    Science.gov (United States)

    Gehring, T; Silva, J D; Kehl, O; Castilhos, A B; Costa, R H R; Uhlenhut, F; Alex, J; Horn, H; Wichern, M

    2010-01-01

    In this paper an extended version of IWA's Activated Sludge Model No. 3 (ASM3) was developed to simulate processes in waste stabilisation ponds (WSP). The model modifications included the integration of algae biomass, gas transfer processes for oxygen, carbon dioxide and ammonia that depend on wind velocity, and a simple ionic equilibrium. The model was applied to a pilot-scale WSP system operated in the city of Florianópolis (Brazil). The system was used to treat leachate from a municipal waste landfill. Mean influent concentrations to the facultative pond of 1,456 g(COD)/m^3 and 505 g(NH4-N)/m^3 were measured. Experimental results indicated an ammonia nitrogen removal of 89.5% with negligible rates of nitrification but intensive ammonia stripping to the atmosphere. Measured data were used in the simulations to consider the impact of wind velocity on an oxygen input of 11.1 to 14.4 g(O2)/(m^2 d) and of sun radiation on photosynthesis. Good results for pH and ammonia removal were achieved, with mean stripping rates of 18.2 and 4.5 g(N)/(m^2 d) for the facultative and maturation pond, respectively. Based on measured chlorophyll a concentrations, and depending on light intensity and TSS concentration, it was possible to model algae concentrations.
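
    The ammonia stripping pathway hinges on the NH4+/NH3 equilibrium: only the un-ionised fraction, which grows steeply with pH, can be transferred to the atmosphere. A minimal Python sketch of an areal stripping flux under these assumptions (illustrative constants, not the calibrated ASM3 extension):

        def ammonia_stripping_flux(c_tan, ph, k_l, pka=9.25):
            """Areal NH3 stripping flux from total ammonia nitrogen (TAN).

            Only the un-ionised NH3 fraction, set by the NH4+/NH3 acid-base
            equilibrium, is strippable; atmospheric NH3 is taken as zero.
            c_tan in g N/m^3 and k_l in m/d give a flux in g N/(m^2 d).
            pKa is about 9.25 at 25 deg C; constants are illustrative.
            """
            free_fraction = 1.0 / (1.0 + 10.0 ** (pka - ph))
            return k_l * free_fraction * c_tan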

  8. RISK LOAN PORTFOLIO OPTIMIZATION MODEL BASED ON CVAR RISK MEASURE

    Directory of Open Access Journals (Sweden)

    Ming-Chang LEE

    2015-07-01

    In order to meet commercial banks' liquidity, safety and profitability objectives, optimization decisions based on loan portfolio risk analysis support a rational allocation of assets. Risk analysis and asset allocation are key technologies of banking and risk management. The aim of this paper is to build a loan portfolio optimization model based on risk analysis. Constraining the loan portfolio rate of return using Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR) in the optimization decision model reflects the bank's risk tolerance and gives direct control of the bank's potential loss. The paper analyzes a general risk management model applied to portfolio problems with VaR and CVaR risk measures by using the Lagrangian algorithm, and solves this highly difficult problem by a matrix operation method. The resulting efficient frontier of the portfolio problem with VaR and CVaR risk measures is a hyperbola in mean-standard deviation space, and the proposed method is easy to calculate.
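
    CVaR itself is straightforward to estimate from a sample of scenario losses, which is one reason it is attractive as an optimization constraint. A minimal empirical Python sketch (hypothetical inputs, not the paper's Lagrangian solution):

        import numpy as np

        def var_cvar(losses, beta=0.95):
            """Empirical VaR and CVaR of a sample of portfolio losses.

            VaR is the beta-quantile of losses; CVaR is the mean loss in
            the tail at or beyond VaR (hypothetical scenario inputs).
            """
            losses = np.asarray(losses, float)
            var = np.quantile(losses, beta)
            cvar = losses[losses >= var].mean()
            return var, cvar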

  9. An observational study of the success and complications of 2546 external cephalic versions in low-risk pregnant women performed by trained midwives

    NARCIS (Netherlands)

    Beuckens, A.; Rijnders, M.; Verburgt-Doeleman, G.H.M.; Rijninks-van Driel, G.C.; Thorpe, J.; Huttom, E.K.

    2016-01-01

    Objective To evaluate the success of an external cephalic version (ECV) training programme, and to determine the rates of successful ECV, complications, and caesarean birth in a low-risk population. Design Prospective observational study. Setting Primary health care and hospital settings throughout

  10. Low Dose Radiation Cancer Risks: Epidemiological and Toxicological Models

    Energy Technology Data Exchange (ETDEWEB)

    David G. Hoel, PhD

    2012-04-19

    The basic purpose of this one-year research grant was to extend the two-stage clonal expansion (TSCE) model of carcinogenesis to exposures other than the usual single acute exposure. The two-stage clonal expansion model of carcinogenesis incorporates the biological process of carcinogenesis, which involves two mutations and the clonal proliferation of the intermediate cells, in a stochastic, mathematical way. The current TSCE model serves the general purpose of acute exposure models but requires numerical computation of both the survival and hazard functions. The primary objective of this research project was to develop analytical expressions for the survival function and the hazard function of the occurrence of the first cancer cell for acute, continuous and multiple exposure cases within the framework of the piecewise-constant-parameter two-stage clonal expansion model of carcinogenesis. For acute exposure and multiple exposures of acute series, either only the first mutation rate is allowed to vary with the dose, or all the parameters are allowed to be dose dependent; for multiple continuous exposures, all the parameters are allowed to vary with the dose. With these analytical functions, it becomes easy to evaluate the risks of cancer, and one can deal with various exposure patterns in cancer risk assessment. A second objective was to apply the TSCE model with varying continuous exposures to the cancer studies of inhaled plutonium in beagle dogs. Using step functions to estimate the retention functions of the pulmonary exposure to plutonium, the multiple-exposure versions of the TSCE model were to be used to estimate beagle dog lung cancer risks. The mathematical equations of the multiple-exposure versions of the TSCE model were developed. A draft manuscript, which is attached, provides the results of this mathematical work. The application work using the beagle dog data from plutonium exposure has not been completed due to the fact
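
    Where closed forms are unavailable, the TSCE model is easy to simulate directly. The Python sketch below runs a Gillespie-style simulation of the two-stage process with constant parameters and returns the time at which the first malignant cell appears; all parameter values are illustrative.

        import random

        def tsce_first_cancer_time(X=1.0e7, nu=1.0e-7, mu=1.0e-6,
                                   alpha=1.0, beta=0.9, t_max=100.0, seed=None):
            """Gillespie simulation of the two-stage clonal expansion model.

            X normal cells mutate to intermediate cells at rate nu each;
            intermediate cells divide (alpha), die (beta) or mutate to a
            malignant cell (mu). Returns the time of the first malignant
            cell, or None if none arises by t_max. Values are illustrative.
            """
            rng = random.Random(seed)
            t, I = 0.0, 0
            while True:
                r1 = nu * X          # normal -> intermediate (first mutation)
                r2 = alpha * I       # intermediate cell division
                r3 = beta * I        # intermediate cell death
                r4 = mu * I          # intermediate -> malignant (second mutation)
                total = r1 + r2 + r3 + r4
                t += rng.expovariate(total)
                if t >= t_max:
                    return None
                u = rng.random() * total
                if u < r1 + r2:      # either path adds one intermediate cell
                    I += 1
                elif u < r1 + r2 + r3:
                    I -= 1
                else:
                    return t         # first malignant cell appears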

  11. A spatially-dynamic preliminary risk assessment of the American peregrine falcon at the Los Alamos National Laboratory (version 1)

    Energy Technology Data Exchange (ETDEWEB)

    Gallegos, A.F.; Gonzales, G.J.; Bennett, K.D. [and others

    1997-06-01

    The Endangered Species Act and the Record of Decision on the Dual Axis Radiographic Hydrodynamic Test Facility at the Los Alamos National Laboratory require protection of the American peregrine falcon. A preliminary risk assessment of the peregrine was performed using a custom FORTRAN model and a geographical information system. Estimated doses to the falcon were compared against toxicity reference values to generate hazard indices. Hazard index results indicated no unacceptable risk to the falcon from the soil ingestion pathway, including a measure of cumulative effects from multiple contaminants that assumes a linear additive toxicity type. Scaling home ranges on the basis of maximizing falcon height for viewing prey decreased estimated risk by 69% in a canyons-based home range and increased estimated risk by 40% in a river-based home range. Improving model realism by weighting simulated falcon foraging based on distance from potential nest sites decreased risk by 93% in one exposure unit and by 82% in a second exposure unit. It was demonstrated that the choice of toxicity reference values can have a substantial impact on risk estimates. Adding bioaccumulation factors for several organics increased partial hazard quotients by a factor of 110, but increased the mean hazard index by only 0.02 units. Adding a food consumption exposure pathway in the form of biomagnification factors for 15 contaminants of potential ecological concern increased the mean hazard index to 1.16 (± 1.0), which is above the level of acceptability (1.0). Aroclor-1254, dichlorodiphenyltrichloroethane (DDT) and dichlorodiphenyldichloroethylene (DDE) accounted for 81% of the estimated risk from the assessment that includes soil ingestion and food consumption contaminant pathways and a biomagnification component. Information on risk by specific geographical location was generated, which can be used to manage contaminated areas, falcon habitat, facility siting, and/or facility operations. 123 refs., 10 figs., 2 tabs.
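
    The hazard-quotient arithmetic used here is simple: each contaminant's dose is divided by its toxicity reference value, and under the linear additive toxicity assumption the quotients sum to the hazard index. A minimal Python sketch with hypothetical values:

        def hazard_index(doses, trvs):
            """Hazard quotients (dose / toxicity reference value) and their sum.

            Summing quotients assumes linear additive toxicity; an index
            above 1.0 flags potentially unacceptable risk.
            """
            hq = {c: doses[c] / trvs[c] for c in doses}
            return hq, sum(hq.values())

        # usage (hypothetical mg/kg-day values):
        # hq, hi = hazard_index({"DDT": 0.02, "DDE": 0.05},
        #                       {"DDT": 0.08, "DDE": 0.07})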

  12. VALIDATION OF THE ASTER GLOBAL DIGITAL ELEVATION MODEL VERSION 2 OVER THE CONTERMINOUS UNITED STATES

    Directory of Open Access Journals (Sweden)

    D. Gesch

    2012-07-01

    The ASTER Global Digital Elevation Model Version 2 (GDEM v2) was evaluated over the conterminous United States in a manner similar to the validation conducted for the original GDEM Version 1 (v1) in 2009. The absolute vertical accuracy of GDEM v2 was calculated by comparison with more than 18,000 independent reference geodetic ground control points from the National Geodetic Survey. The root mean square error (RMSE) measured for GDEM v2 is 8.68 meters. This compares with the RMSE of 9.34 meters for GDEM v1. Another important descriptor of vertical accuracy is the mean error, or bias, which indicates if a DEM has an overall vertical offset from true ground level. The GDEM v2 mean error of −0.20 meters is a significant improvement over the GDEM v1 mean error of −3.69 meters. The absolute vertical accuracy assessment results, both mean error and RMSE, were segmented by land cover to examine the effects of cover types on measured errors. The GDEM v2 mean errors by land cover class verify that the presence of aboveground features (tree canopies and built structures) causes a positive elevation bias, as would be expected for an imaging system like ASTER. In open ground classes (little or no vegetation with significant aboveground height), GDEM v2 exhibits a negative bias on the order of 1 meter. GDEM v2 was also evaluated by differencing with the Shuttle Radar Topography Mission (SRTM) dataset. In many forested areas, GDEM v2 has elevations that are higher in the canopy than SRTM.

  13. Accelerator System Model (ASM) user manual with physics and engineering model documentation. ASM version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-07-01

    The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the "construction" of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.

  17. A generic method for automatic translation between input models for different versions of simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Serfontein, Dawid E., E-mail: Dawid.Serfontein@nwu.ac.za [School of Mechanical and Nuclear Engineering, North West University (PUK-Campus), PRIVATE BAG X6001 (Internal Post Box 360), Potchefstroom 2520 (South Africa); Mulder, Eben J. [School of Mechanical and Nuclear Engineering, North West University (South Africa); Reitsma, Frederik [Calvera Consultants (South Africa)

    2014-05-01

A computer code was developed for the semi-automatic translation of input models for the VSOP-A diffusion neutronics simulation code to the format of the newer VSOP 99/05 code. In this paper, this algorithm is presented as a generic method for producing codes for the automatic translation of input models from the format of one code version to another, or even to that of a completely different code. Normally, such translations are done manually. However, input model files, such as those for the VSOP codes, are often very large and may consist of many thousands of numeric entries that make no particular sense to the human eye. The task of verifying the accuracy of such translated files, for instance by nuclear regulators, can therefore be very difficult and cumbersome, and translation errors may go undetected, with potentially disastrous consequences later on if a reactor with such a faulty design is built. A generic algorithm for producing automatic translation codes may therefore ease the translation and verification process to a great extent. It also removes human error from the process, which may significantly enhance the accuracy and reliability of the translation. The developed algorithm automatically creates a verification log file which permanently records the name and value of each variable used, as well as the list of meanings of all the possible values. This should greatly facilitate reactor licensing applications.
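A minimal sketch of the translator-with-audit-trail idea described here (all names are hypothetical; the actual VSOP input formats are far larger and positional rather than keyed):

```python
def translate(old_model: dict, mapping: dict, log_path: str) -> dict:
    """Translate an input model from one code version's variable names to
    another's, writing a verification log of every name and value used.

    `mapping` maps each old variable name to a (new_name, meaning) pair.
    """
    new_model = {}
    with open(log_path, "w") as log:
        for old_name, value in old_model.items():
            new_name, meaning = mapping[old_name]  # fail loudly on unknown names
            new_model[new_name] = value
            log.write(f"{old_name} -> {new_name} = {value!r}  # {meaning}\n")
    return new_model

# Hypothetical usage:
# translate({"NREGIONS": 12},
#           {"NREGIONS": ("n_regions", "number of core regions")},
#           "translation.log")
```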

  18. Description and evaluation of the Community Multiscale Air Quality (CMAQ) modeling system version 5.1

    Science.gov (United States)

    Wyat Appel, K.; Napelenok, Sergey L.; Foley, Kristen M.; Pye, Havala O. T.; Hogrefe, Christian; Luecken, Deborah J.; Bash, Jesse O.; Roselle, Shawn J.; Pleim, Jonathan E.; Foroutan, Hosein; Hutzell, William T.; Pouliot, George A.; Sarwar, Golam; Fahey, Kathleen M.; Gantt, Brett; Gilliam, Robert C.; Heath, Nicholas K.; Kang, Daiwen; Mathur, Rohit; Schwede, Donna B.; Spero, Tanya L.; Wong, David C.; Young, Jeffrey O.

    2017-04-01

The Community Multiscale Air Quality (CMAQ) model is a comprehensive multipollutant air quality modeling system developed and maintained by the US Environmental Protection Agency's (EPA) Office of Research and Development (ORD). Recently, version 5.1 of the CMAQ model (v5.1) was released to the public, incorporating a large number of science updates and extended capabilities over the previous release version of the model (v5.0.2). These updates include the following: improvements in the meteorological calculations in both CMAQ and the Weather Research and Forecast (WRF) model used to provide meteorological fields to CMAQ, updates to the gas and aerosol chemistry, revisions to the calculations of clouds and photolysis, and improvements to the dry and wet deposition in the model. Sensitivity simulations isolating several of the major updates to the modeling system show that changes to the meteorological calculations result in enhanced afternoon and early evening mixing in the model, periods when the model historically underestimates mixing. This enhanced mixing results in higher ozone (O3) mixing ratios on average due to reduced NO titration, and lower fine particulate matter (PM2.5) concentrations due to greater dilution of primary pollutants (e.g., elemental and organic carbon). Updates to the clouds and photolysis calculations greatly improve consistency between the WRF and CMAQ models and result in generally higher O3 mixing ratios, primarily due to reduced cloudiness and attenuation of photolysis in the model. Updates to the aerosol chemistry result in higher secondary organic aerosol (SOA) concentrations in the summer, thereby reducing summertime PM2.5 bias (PM2.5 is typically underestimated by CMAQ in the summer), while updates to the gas chemistry result in slightly higher O3 and PM2.5 on average in January and July. Overall, the seasonal variation in simulated PM2.5 generally improves in CMAQv5.1 (when considering all model updates), as simulated PM2.5

  19. Evaluating and improving cloud phase in the Community Atmosphere Model version 5 using spaceborne lidar observations

    Science.gov (United States)

    Kay, Jennifer E.; Bourdages, Line; Miller, Nathaniel B.; Morrison, Ariel; Yettella, Vineel; Chepfer, Helene; Eaton, Brian

    2016-04-01

    Spaceborne lidar observations from the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) satellite are used to evaluate cloud amount and cloud phase in the Community Atmosphere Model version 5 (CAM5), the atmospheric component of a widely used state-of-the-art global coupled climate model (Community Earth System Model). By embedding a lidar simulator within CAM5, the idiosyncrasies of spaceborne lidar cloud detection and phase assignment are replicated. As a result, this study makes scale-aware and definition-aware comparisons between model-simulated and observed cloud amount and cloud phase. In the global mean, CAM5 has insufficient liquid cloud and excessive ice cloud when compared to CALIPSO observations. Over the ice-covered Arctic Ocean, CAM5 has insufficient liquid cloud in all seasons. Having important implications for projections of future sea level rise, a liquid cloud deficit contributes to a cold bias of 2-3°C for summer daily maximum near-surface air temperatures at Summit, Greenland. Over the midlatitude storm tracks, CAM5 has excessive ice cloud and insufficient liquid cloud. Storm track cloud phase biases in CAM5 maximize over the Southern Ocean, which also has larger-than-observed seasonal variations in cloud phase. Physical parameter modifications reduce the Southern Ocean cloud phase and shortwave radiation biases in CAM5 and illustrate the power of the CALIPSO observations as an observational constraint. The results also highlight the importance of using a regime-based, as opposed to a geographic-based, model evaluation approach. More generally, the results demonstrate the importance and value of simulator-enabled comparisons of cloud phase in models used for future climate projection.

  20. GENII Version 2 Users’ Guide

    Energy Technology Data Exchange (ETDEWEB)

    Napier, Bruce A.

    2004-03-08

    The GENII Version 2 computer code was developed for the Environmental Protection Agency (EPA) at Pacific Northwest National Laboratory (PNNL) to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) and the radiological risk estimating procedures of Federal Guidance Report 13 into updated versions of existing environmental pathway analysis models. The resulting environmental dosimetry computer codes are compiled in the GENII Environmental Dosimetry System. The GENII system was developed to provide a state-of-the-art, technically peer-reviewed, documented set of programs for calculating radiation dose and risk from radionuclides released to the environment. The codes were designed with the flexibility to accommodate input parameters for a wide variety of generic sites. Operation of a new version of the codes, GENII Version 2, is described in this report. Two versions of the GENII Version 2 code system are available, a full-featured version and a version specifically designed for demonstrating compliance with the dose limits specified in 40 CFR 61.93(a), the National Emission Standards for Hazardous Air Pollutants (NESHAPS) for radionuclides. The only differences lie in the limitation of the capabilities of the user to change specific parameters in the NESHAPS version. This report describes the data entry, accomplished via interactive, menu-driven user interfaces. Default exposure and consumption parameters are provided for both the average (population) and maximum individual; however, these may be modified by the user. Source term information may be entered as radionuclide release quantities for transport scenarios, or as basic radionuclide concentrations in environmental media (air, water, soil). For input of basic or derived concentrations, decay of parent radionuclides and ingrowth of radioactive decay products prior to the start of the exposure scenario may be considered. A single code run can

  1. MODELING CREDIT RISK THROUGH CREDIT SCORING

    OpenAIRE

    Adrian Cantemir CALIN; Oana Cristina POPOVICI

    2014-01-01

    Credit risk governs all financial transactions and it is defined as the risk of suffering a loss due to certain shifts in the credit quality of a counterpart. Credit risk literature gravitates around two main modeling approaches: the structural approach and the reduced form approach. In addition to these perspectives, credit risk assessment has been conducted through a series of techniques such as credit scoring models, which form the traditional approach. This paper examines the evolution of...

  2. DEVELOPMENT OF A MORE APPLIED VERSION OF COHERENCY CALLED ‘SENSIBLE COHERENCY’ FOR ASSESSMENT OF FINANCIAL RISK MEASURES

    Directory of Open Access Journals (Sweden)

    M. Jasemi

    2012-01-01

    Full Text Available

ENGLISH ABSTRACT: Coherency is becoming a necessary feature for any risk measure, and is now an accepted tool in risk management for assessing risk measures. For example, recent studies have strongly criticised VaR-based models for not providing a coherent risk measure. Given this acceptance, it is important to improve the efficiency of the touchstone used for evaluating risk measures, in order to achieve a fairer assessment. This is the challenge that this paper seeks to address. The goal is achieved on the one hand by simplifying the axioms of coherency without losing their major financial content, and on the other hand by removing the paradox between two of the axioms. The new concept is called 'sensible coherency', and a risk measure that satisfies the four new simplified and corrected axioms is 'sensibly coherent'. Finally, the new axioms are applied to a particular type of lower partial moments as a case study.

AFRIKAANSE OPSOMMING (translated): Coherency is becoming a necessary feature of any risk measure and is now an accepted tool in the assessment of risk measures. The aim of this article is achieved by, on the one hand, simplifying the axioms of coherency and, on the other, removing the paradox between the axioms. The result is called "sensible coherency".
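For context, the four classical coherency axioms (due to Artzner et al.) for a risk measure ρ, which the paper simplifies and corrects, are stated below; the paper's "sensible" variants differ from these:

```latex
\begin{aligned}
&\text{Monotonicity:} && X \le Y \;\Rightarrow\; \rho(X) \ge \rho(Y)\\
&\text{Subadditivity:} && \rho(X+Y) \le \rho(X) + \rho(Y)\\
&\text{Positive homogeneity:} && \rho(\lambda X) = \lambda\,\rho(X), \quad \lambda \ge 0\\
&\text{Translation invariance:} && \rho(X + a) = \rho(X) - a, \quad a \in \mathbb{R}
\end{aligned}
```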

  3. Overview of the Uniform California Earthquake Rupture Forecast Version 3 (UCERF3) Time-Independent Model

    Science.gov (United States)

    Field, E. H.; Arrowsmith, R.; Biasi, G. P.; Bird, P.; Dawson, T. E.; Felzer, K. R.; Jackson, D. D.; Johnson, K. M.; Jordan, T. H.; Madugo, C. M.; Michael, A. J.; Milner, K. R.; Page, M. T.; Parsons, T.; Powers, P.; Shaw, B. E.; Thatcher, W. R.; Weldon, R. J.; Zeng, Y.

    2013-12-01

    We present the time-independent component of the Uniform California Earthquake Rupture Forecast, Version 3 (UCERF3), where the primary achievements have been to relax fault segmentation and include multi-fault ruptures, both limitations of UCERF2. The rates of all earthquakes are solved for simultaneously, and from a broader range of data, using a system-level 'grand inversion' that is both conceptually simple and extensible. The inverse problem is large and underdetermined, so a range of models is sampled using an efficient simulated annealing algorithm. The approach is more derivative than prescriptive (e.g., magnitude-frequency distributions are no longer assumed), so new analysis tools were developed for exploring solutions. Epistemic uncertainties were also accounted for using 1440 alternative logic tree branches, necessitating access to supercomputers. The most influential uncertainties include alternative deformation models (fault slip rates), a new smoothed seismicity algorithm, alternative values for the total rate of M≥5 events, and different scaling relationships, virtually all of which are new. As a notable first, three deformation models are based on kinematically consistent inversions of geodetic and geologic data, also providing slip-rate constraints on faults previously excluded due to lack of geologic data. The grand inversion constitutes a system-level framework for testing hypotheses and balancing the influence of different experts. For example, we demonstrate serious challenges with the Gutenberg-Richter hypothesis for individual faults. UCERF3 is still an approximation of the system, however, and the range of models is limited (e.g., constrained to stay close to UCERF2). Nevertheless, UCERF3 removes the apparent UCERF2 over-prediction of M6.5-7 earthquake rates, and also includes types of multi-fault ruptures seen in nature. While UCERF3 fits the data better than UCERF2 overall, there may be areas that warrant further site

  4. Energy Integration for 2050 - A Strategic Impact Model (2050 SIM), Version 2.0

    Energy Technology Data Exchange (ETDEWEB)

    John Collins

    2011-09-01

    The United States (U.S.) energy infrastructure is among the most reliable, accessible, and economic in the world. On the other hand, it is also excessively reliant on foreign energy sources, experiences high volatility in energy prices, does not always practice good stewardship of finite indigenous energy resources, and emits significant quantities of greenhouse gas. The U.S. Department of Energy is conducting research and development on advanced nuclear reactor concepts and technologies, including High Temperature Gas Reactor (HTGR) technologies, directed at helping the United States meet its current and future energy challenges. This report discusses the Draft Strategic Impact Model (SIM), an initial version of which was created during the later part of FY-2010. SIM was developed to analyze and depict the benefits of various energy sources in meeting the energy demand and to provide an overall system understanding of the tradeoffs between building and using HTGRs versus other existing technologies for providing energy (heat and electricity) to various energy-use sectors in the United States. This report also provides the assumptions used in the model, the rationale for the methodology, and the references for the source documentation and source data used in developing the SIM.

  5. Energy Integration for 2050 - A Strategic Impact Model (2050 SIM), Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    2010-10-01

    The United States (U.S.) energy infrastructure is among the most reliable, accessible, and economic in the world. On the other hand, it is also excessively reliant on foreign energy sources, experiences high volatility in energy prices, does not always practice good stewardship of finite indigenous energy resources, and emits significant quantities of greenhouse gas. The U.S. Department of Energy is conducting research and development on advanced nuclear reactor concepts and technologies, including High Temperature Gas Reactor (HTGR) technologies, directed at helping the United States meet its current and future energy challenges. This report discusses the Draft Strategic Impact Model (SIM), an initial version of which was created during the later part of FY-2010. SIM was developed to analyze and depict the benefits of various energy sources in meeting the energy demand and to provide an overall system understanding of the tradeoffs between building and using HTGRs versus other existing technologies for providing energy (heat and electricity) to various energy-use sectors in the United States. This report also provides the assumptions used in the model, the rationale for the methodology, and the references for the source documentation and source data used in developing the SIM.

  6. A multi-sectoral version of the Post-Keynesian growth model

    Directory of Open Access Journals (Sweden)

    Ricardo Azevedo Araujo

    2015-03-01

Full Text Available With this inquiry, we seek to develop a disaggregated version of the post-Keynesian approach to economic growth, by showing that it can indeed be treated as a particular case of the Pasinettian model of structural change and economic expansion. By relying upon vertical integration it becomes possible to carry out the analysis initiated by Kaldor (1956) and Robinson (1956, 1962), and followed by Dutt (1984), Rowthorn (1982), and later Bhaduri and Marglin (1990), in a multi-sectoral model in which demand and productivity increase at different paces in each sector. By adopting this approach it is possible to show that the structural economic dynamics is conditioned not only by the patterns of evolving demand and the diffusion of technological progress but also by the distributive features of the economy, which can give rise to different regimes of economic growth. Moreover, we find it possible to determine the natural rate of profit that keeps the mark-up rate constant over time.

  7. MATHEMATICAL RISK ANALYSIS: VIA NICHOLAS RISK MODEL AND BAYESIAN ANALYSIS

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-07-01

Full Text Available The objective of this second part of a two-phased study was to explore the predictive power of the quantitative risk analysis (QRA) method and process within a Higher Education Institution (HEI). The method and process investigated used impact analysis via the Nicholas risk model and Bayesian analysis, with a sample of one hundred (100) risk analysts in a historically black South African university in the greater Eastern Cape Province. The first finding supported and confirmed previous literature (King III report, 2009; Nicholas and Steyn, 2008; Stoney, 2007; COSA, 2004) that there is a direct relationship between a risk factor, its likelihood, and its impact, ceteris paribus. The second finding, in relation to controlling either the likelihood or the impact of occurrence of a risk (Nicholas risk model), was that to obtain a better risk reward it is more important to control the likelihood of occurrence of risks than their impact, so as to have a direct effect on the entire university. On the Bayesian analysis, the third finding was that the impact of risk should be predicted along three aspects: the human impact (decisions made), the property impact (student and infrastructure based), and the business impact. Lastly, although business cycles vary considerably depending on the industry and the institution, the study revealed that most impacts in the HEI (university) fell within the period of one academic year. The recommendation was that the application of quantitative risk analysis should be related to the current legislative framework that affects HEIs.

  8. Re-evaluation of Predictive Models in Light of New Data: Sunspot Number Version 2.0

    Science.gov (United States)

    Gkana, A.; Zachilas, L.

    2016-10-01

    The original version of the Zürich sunspot number (Sunspot Number Version 1.0) has been revised by an entirely new series (Sunspot Number Version 2.0). We re-evaluate the performance of our previously proposed models for predicting solar activity in the light of the revised data. We perform new monthly and yearly predictions using the Sunspot Number Version 2.0 as input data and compare them with our original predictions (using the Sunspot Number Version 1.0 series as input data). We show that our previously proposed models are still able to produce quite accurate solar-activity predictions despite the full revision of the Zürich Sunspot Number, indicating that there is no significant degradation in their performance. Extending our new monthly predictions (July 2013 - August 2015) by 50 time-steps (months) ahead in time (from September 2015 to October 2019), we provide evidence that we are heading into a period of dramatically low solar activity. Finally, our new future long-term predictions endorse our previous claim that a prolonged solar activity minimum is expected to occur, lasting up to the year ≈ 2100.

  9. Modeling Research Project Risks with Fuzzy Maps

    Science.gov (United States)

    Bodea, Constanta Nicoleta; Dascalu, Mariana Iuliana

    2009-01-01

The authors propose a risk evaluation model for research projects. The model is based on fuzzy inference. The knowledge base for the fuzzy process is built with a causal and cognitive map of risks. The map was especially developed for research projects, taking into account their typical lifecycle. The model was applied to an e-testing research…

  10. Competing Risks and Multistate Models with R

    CERN Document Server

    Beyersmann, Jan; Schumacher, Martin

    2012-01-01

    This book covers competing risks and multistate models, sometimes summarized as event history analysis. These models generalize the analysis of time to a single event (survival analysis) to analysing the timing of distinct terminal events (competing risks) and possible intermediate events (multistate models). Both R and multistate methods are promoted with a focus on nonparametric methods.

  11. Hydrogeochemical evaluation for Simpevarp model version 1.2. Preliminary site description of the Simpevarp area

    Energy Technology Data Exchange (ETDEWEB)

    Laaksoharju, Marcus (ed.) [Geopoint AB, Stockholm (Sweden)

    2004-12-01

Siting studies for SKB's programme of deep geological disposal of nuclear fuel waste currently involve the investigation of two locations, Simpevarp and Forsmark, to determine their geological, hydrogeochemical and hydrogeological characteristics. Work completed to date has resulted in Model version 1.2, which represents the second evaluation of the available Simpevarp groundwater analytical data collected up to April 2004. The deepest fracture groundwater samples with sufficient analytical data reflect depths down to 1.7 km. Model version 1.2 focuses on geochemical and mixing processes affecting the groundwater composition in the uppermost part of the bedrock, down to repository levels, and eventually extending to 1000 m depth. The groundwater flow regimes at Laxemar/Simpevarp are considered local and extend down to depths of around 600-1000 m depending on local topography. The marked differences in the groundwater flow regimes between Laxemar and Simpevarp are reflected in the groundwater chemistry, where four major hydrochemical groups of groundwaters (types A-D) have been identified. TYPE A: This type comprises dilute groundwaters (< 1000 mg/L Cl; 0.5-2.0 g/L TDS) of Na-HCO₃ type, present at shallow (< 200 m) depths at Simpevarp but at greater depths (0-900 m) at Laxemar. At both localities the groundwaters are marginally oxidising close to the surface, but otherwise reducing. The main reactions involve weathering, ion exchange (Ca, Mg), surface complexation, and dissolution of calcite. Redox reactions include precipitation of Fe-oxyhydroxides and some microbially mediated reactions (SRB). Meteoric recharge water is mainly present at Laxemar, whilst at Simpevarp potential mixing of recharge meteoric water and a modern sea component is observed. Localised mixing of meteoric water with deeper saline groundwaters is indicated at both Laxemar and Simpevarp. TYPE B: This type comprises brackish groundwaters (1000-6000 mg/L Cl; 5-10 g/L TDS) present at

  12. Relative risk regression models with inverse polynomials.

    Science.gov (United States)

    Ning, Yang; Woodward, Mark

    2013-08-30

    The proportional hazards model assumes that the log hazard ratio is a linear function of parameters. In the current paper, we model the log relative risk as an inverse polynomial, which is particularly suitable for modeling bounded and asymmetric functions. The parameters estimated by maximizing the partial likelihood are consistent and asymptotically normal. The advantages of the inverse polynomial model over the ordinary polynomial model and the fractional polynomial model for fitting various asymmetric log relative risk functions are shown by simulation. The utility of the method is further supported by analyzing two real data sets, addressing the specific question of the location of the minimum risk threshold.
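As one low-order illustration of the functional form (consistent with the description above, but not necessarily the exact specification used by the authors), the log relative risk at covariate level x > 0 might be modeled as a first-order inverse polynomial:

```latex
\log RR(x) = \frac{x}{\beta_0 + \beta_1 x}
```

which rises from zero at x = 0 and saturates toward 1/β₁ as x grows, giving the bounded, asymmetric shape that ordinary polynomials cannot represent.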

  13. ASTER Global Digital Elevation Model Version 2 - summary of validation results

    Science.gov (United States)

    Tachikawa, Tetushi; Kaku, Manabu; Iwasaki, Akira; Gesch, Dean B.; Oimoen, Michael J.; Zhang, Z.; Danielson, Jeffrey J.; Krieger, Tabatha; Curtis, Bill; Haase, Jeff; Abrams, Michael; Carabajal, C.; Meyer, Dave

    2011-01-01

On June 29, 2009, NASA and the Ministry of Economy, Trade and Industry (METI) of Japan released a Global Digital Elevation Model (GDEM) to users worldwide at no charge as a contribution to the Global Earth Observing System of Systems (GEOSS). This “version 1” ASTER GDEM (GDEM1) was compiled from over 1.2 million scene-based DEMs covering land surfaces between 83°N and 83°S latitudes. A joint U.S.-Japan validation team assessed the accuracy of the GDEM1, augmented by a team of 20 cooperators. The GDEM1 was found to have an overall accuracy of around 20 meters at the 95% confidence level. The team also noted several artifacts associated with poor stereo coverage at high latitudes, cloud contamination, water masking issues and the stacking process used to produce the GDEM1 from individual scene-based DEMs (ASTER GDEM Validation Team, 2009). Two independent horizontal resolution studies estimated the effective spatial resolution of the GDEM1 to be on the order of 120 meters.

  14. ISM Approach to Model Offshore Outsourcing Risks

    Directory of Open Access Journals (Sweden)

    Sunand Kumar

    2014-07-01

Full Text Available In an effort to achieve a competitive advantage via cost reductions and improved market responsiveness, organizations are increasingly employing offshore outsourcing as a major component of their supply chain strategies. But as is evident from the literature, a number of risks, such as political risk, risk due to cultural differences, compliance and regulatory risk, opportunistic risk, and organizational structural risk, adversely affect the performance of offshore outsourcing in a supply chain network. This also leads to dissatisfaction among different stakeholders. The main objective of this paper is to identify and understand the mutual interactions among the various risks which affect the performance of offshore outsourcing. To this end, the authors identified the various risks through an extensive review of the literature. From this information, an integrated model of the risks affecting offshore outsourcing is developed using interpretive structural modelling (ISM), and the structural relationships between these risks are modeled. Further, MICMAC analysis is performed to analyze the driving power and dependence of the risks, which helps managers identify and classify the important criteria and reveals the direct and indirect effects of each criterion on offshore outsourcing. Results show that political risk and risk due to cultural differences act as strong drivers.
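In MICMAC analysis, a risk's driving power is the row sum of the final (transitively closed) reachability matrix and its dependence is the column sum; a minimal sketch of that step, with an illustrative matrix rather than the paper's data:

```python
import numpy as np

def micmac(reach: np.ndarray, labels):
    """Compute MICMAC driving power (row sums) and dependence (column sums)
    from a 0/1 final reachability matrix."""
    driving = reach.sum(axis=1)
    dependence = reach.sum(axis=0)
    return {lab: {"driving": int(drv), "dependence": int(dep)}
            for lab, drv, dep in zip(labels, driving, dependence)}

# Illustrative 3-risk reachability matrix: political risk reaches both others,
# so it has high driving power and low dependence (a "driver" risk).
reach = np.array([[1, 1, 1],
                  [0, 1, 0],
                  [0, 1, 1]])
print(micmac(reach, ["political", "cultural", "structural"]))
```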

  15. MODELS OF BANKING RISKS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Bente Corneliu Cristian

    2009-05-01

Full Text Available Banking risk management, as a fundamental element of banking management, aims at diminishing as much as possible the negative impact of risk factors, at minimizing losses through expenditure cut-offs, and at maximizing direct and transferred inflows, changing the

  16. Uniform California earthquake rupture forecast, version 3 (UCERF3): the time-independent model

    Science.gov (United States)

    Field, Edward H.; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David D.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin R.; Page, Morgan T.; Parsons, Thomas; Powers, Peter M.; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J.; Zeng, Yuehua; ,

    2013-01-01

    In this report we present the time-independent component of the Uniform California Earthquake Rupture Forecast, Version 3 (UCERF3), which provides authoritative estimates of the magnitude, location, and time-averaged frequency of potentially damaging earthquakes in California. The primary achievements have been to relax fault segmentation assumptions and to include multifault ruptures, both limitations of the previous model (UCERF2). The rates of all earthquakes are solved for simultaneously, and from a broader range of data, using a system-level "grand inversion" that is both conceptually simple and extensible. The inverse problem is large and underdetermined, so a range of models is sampled using an efficient simulated annealing algorithm. The approach is more derivative than prescriptive (for example, magnitude-frequency distributions are no longer assumed), so new analysis tools were developed for exploring solutions. Epistemic uncertainties were also accounted for using 1,440 alternative logic tree branches, necessitating access to supercomputers. The most influential uncertainties include alternative deformation models (fault slip rates), a new smoothed seismicity algorithm, alternative values for the total rate of M≥5 events, and different scaling relationships, virtually all of which are new. As a notable first, three deformation models are based on kinematically consistent inversions of geodetic and geologic data, also providing slip-rate constraints on faults previously excluded because of lack of geologic data. The grand inversion constitutes a system-level framework for testing hypotheses and balancing the influence of different experts. For example, we demonstrate serious challenges with the Gutenberg-Richter hypothesis for individual faults. UCERF3 is still an approximation of the system, however, and the range of models is limited (for example, constrained to stay close to UCERF2). Nevertheless, UCERF3 removes the apparent UCERF2 overprediction of
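The UCERF3 "grand inversion" itself is part of the project's own codebase, but its basic device, simulated annealing applied to an underdetermined non-negative least-squares problem, can be sketched generically; everything below is illustrative and is not the UCERF3 implementation:

```python
import numpy as np

def anneal_rates(A, b, steps=20_000, t0=1.0, seed=0):
    """Sample a non-negative rate vector x that approximately minimizes
    ||Ax - b||^2 by simulated annealing: a toy analogue of solving an
    underdetermined inverse problem for earthquake rupture rates."""
    rng = np.random.default_rng(seed)
    x = np.ones(A.shape[1])
    e = np.sum((A @ x - b) ** 2)
    for k in range(steps):
        temp = t0 * (1 - k / steps) + 1e-9            # linear cooling schedule
        cand = x.copy()
        i = rng.integers(len(x))
        cand[i] = max(0.0, cand[i] + rng.normal(scale=0.1))  # perturb one rate
        e_cand = np.sum((A @ cand - b) ** 2)
        # Accept downhill moves always, uphill moves with Metropolis probability
        if e_cand < e or rng.random() < np.exp(-(e_cand - e) / temp):
            x, e = cand, e_cand
    return x
```

Because the problem is underdetermined, repeated runs with different seeds yield a range of acceptable solutions, which is exactly how the abstract describes sampling "a range of models."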

  17. Atmospheric radionuclide transport model with radon postprocessor and SBG module. Model description version 2.8.0; ARTM. Atmosphaerisches Radionuklid-Transport-Modell mit Radon Postprozessor und SBG-Modul. Modellbeschreibung zu Version 2.8.0

    Energy Technology Data Exchange (ETDEWEB)

    Richter, Cornelia; Sogalla, Martin; Thielen, Harald; Martens, Reinhard

    2015-04-20

    The study on the atmospheric radionuclide transport model with radon postprocessor and SBG module (model description version 2.8.0) covers the following issues: determination of emissions, radioactive decay, atmospheric dispersion calculation for radioactive gases, atmospheric dispersion calculation for radioactive dusts, determination of the gamma cloud radiation (gamma submersion), terrain roughness, effective source height, calculation area and model points, geographic reference systems and coordinate transformations, meteorological data, use of invalid meteorological data sets, consideration of statistical uncertainties, consideration of housings, consideration of bumpiness, consideration of terrain roughness, use of frequency distributions of the hourly dispersion situation, consideration of the vegetation period (summer), the radon post processor radon.exe, the SBG module, modeling of wind fields, shading settings.

  18. Planar version of the CPT-even gauge sector of the standard model extension

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira Junior, Manoel M.; Casana, Rodolfo; Gomes, Adalto Rodrigues; Carvalho, Eduardo S. [Universidade Federal do Maranhao (UFMA), Sao Luis, MA (Brazil). Dept. de Fisica

    2011-07-01

The CPT-even abelian gauge sector of the Standard Model Extension is represented by the Maxwell term supplemented by $(K_F)_{\mu\nu\rho\sigma}F^{\mu\nu}F^{\rho\sigma}$, where the Lorentz-violating background tensor $(K_F)_{\mu\nu\rho\sigma}$ possesses the symmetries of the Riemann tensor and a double null trace, which leaves nineteen independent components. Of these, ten components yield birefringence while nine are nonbirefringent. In the present work, we examine the planar version of this theory, obtained by means of a typical dimensional reduction procedure to (1 + 2) dimensions. We obtain a kind of planar scalar electrodynamics, composed of a gauge sector containing six Lorentz-violating coefficients, a scalar field endowed with a noncanonical kinetic term, and a coupling term that links the scalar and gauge sectors. The dispersion relation is exactly determined, revealing that the six parameters related to the pure electromagnetic sector do not yield birefringence at any order. In this model, birefringence may appear only as a second-order effect associated with the coupling tensor linking the gauge and scalar sectors. The equations of motion are written and solved in the stationary regime. The Lorentz-violating parameters do not alter the asymptotic behavior of the fields but induce an angular dependence not observed in the planar Maxwell theory. The energy-momentum tensor was evaluated as well, revealing that the theory presents energy stability. (author)

  19. Caries risk assessment models in caries prediction

    Directory of Open Access Journals (Sweden)

    Amila Zukanović

    2013-11-01

Full Text Available Objective. The aim of this research was to assess the efficiency of different multifactor models in caries prediction. Material and methods. Data from the questionnaire and objective examination of 109 examinees were entered into the Cariogram, Previser and Caries-Risk Assessment Tool (CAT) multifactor risk assessment models. Caries risk was assessed with all three models for each patient, classifying them as low-, medium- or high-risk patients. The development of new caries lesions over a period of three years [Decayed Missing Filled Tooth (DMFT) increment = difference between the Decayed Missing Filled Tooth Surface (DMFTS) index at baseline and follow-up] allowed examination of the predictive capacity of the different multifactor models. Results. The data gathered showed that the different multifactor risk assessment models give significantly different results (Friedman test: Chi square = 100.073, p = 0.000). The Cariogram was the model that identified the majority of examinees as medium-risk patients (70%). The other two models were more radical in risk assessment, giving more unfavorable risk profiles for the patients. In only 12% of the patients did the three multifactor models assess the risk in the same way. Previser and CAT gave the same results in 63% of cases; the Wilcoxon test showed that there is no statistically significant difference in caries risk assessment between these two models (Z = -1.805, p = 0.071). Conclusions. Evaluation of three different multifactor caries risk assessment models (Cariogram, PreViser and CAT) showed that only the Cariogram can successfully predict new caries development in 12-year-old Bosnian children.

  20. Modeling extreme risks in ecology.

    Science.gov (United States)

    Burgman, Mark; Franklin, James; Hayes, Keith R; Hosack, Geoffrey R; Peters, Gareth W; Sisson, Scott A

    2012-11-01

    Extreme risks in ecology are typified by circumstances in which data are sporadic or unavailable, understanding is poor, and decisions are urgently needed. Expert judgments are pervasive and disagreements among experts are commonplace. We outline approaches to evaluating extreme risks in ecology that rely on stochastic simulation, with a particular focus on methods to evaluate the likelihood of extinction and quasi-extinction of threatened species, and the likelihood of establishment and spread of invasive pests. We evaluate the importance of assumptions in these assessments and the potential of some new approaches to account for these uncertainties, including hierarchical estimation procedures and generalized extreme value distributions. We conclude by examining the treatment of consequences in extreme risk analysis in ecology and how expert judgment may better be harnessed to evaluate extreme risks.
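As one concrete handle on the "generalized extreme value distributions" mentioned above, block maxima can be fit with SciPy; a minimal sketch using simulated data, not the authors' analysis:

```python
import numpy as np
from scipy.stats import genextreme

# Simulated annual maxima of some ecological stressor (illustrative data only)
rng = np.random.default_rng(42)
annual_maxima = rng.gumbel(loc=10.0, scale=2.0, size=50)

# Fit a GEV by maximum likelihood and estimate the 100-year return level,
# i.e., the 0.99 quantile of the distribution of annual maxima.
shape, loc, scale = genextreme.fit(annual_maxima)
return_level_100yr = genextreme.ppf(0.99, shape, loc=loc, scale=scale)
```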

  1. Why operational risk modelling creates inverse incentives

    NARCIS (Netherlands)

    Doff, R.

    2015-01-01

    Operational risk modelling has become commonplace in large international banks and is gaining popularity in the insurance industry as well. This is partly due to financial regulation (Basel II, Solvency II). This article argues that operational risk modelling is fundamentally flawed, despite efforts

  2. Concordance for prognostic models with competing risks

    NARCIS (Netherlands)

    M. Wolbers (Marcel); S. Blanche (Stephane); M. Koller (Michael); J.C.M. Witteman (Jacqueline); T.A. Gerds (Thomas)

    2014-01-01

The concordance probability is a widely used measure to assess discrimination of prognostic models with binary and survival endpoints. We formally define the concordance probability for a prognostic model of the absolute risk of an event of interest in the presence of competing risks and

  3. A Comparison of Different Versions of the Method of Multiple Scales for an Arbitrary Model of Odd Nonlinearities

    OpenAIRE

    Pakdemirli, Mehmet; Boyacı, Hakan

    1999-01-01

    A general model of cubic and fifth order nonlinearities is considered. The linear part as well as the nonlinearities are expressed in terms of arbitrary operators. Two different versions of the method of multiple scales are used in constructing the general transient and steady-state solutions of the model: Modified Rahman-Burton method and the Reconstitution method. It is found that the usual ordering of reconstitution can be used, if at higher orders of approximation, the time scale correspo...

  4. Scaling and long-range dependence in option pricing III: A fractional version of the Merton model with transaction costs

    Science.gov (United States)

    Wang, Xiao-Tian; Yan, Hai-Gang; Tang, Ming-Ming; Zhu, En-Hui

    2010-02-01

A model for option pricing in a fractional version of the Merton model, with 'Hurst exponent' H in [1/2, 1), is established with transaction costs. In particular, for H ∈ (1/2, 1) the minimal price $C_{\min}(t, S_t)$ of an option under transaction costs is obtained, which shows that the timestep δt and the 'Hurst exponent' H play an important role in option pricing with transaction costs.
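The joint role of δt and H can be read off the standard variance scaling of fractional Brownian increments, quoted here for orientation rather than taken from the paper:

```latex
\operatorname{Var}\!\big[B_H(t+\delta t) - B_H(t)\big] = (\delta t)^{2H}
```

For H > 1/2 the per-step variance $(\delta t)^{2H}$ shrinks faster than the Brownian rate $\delta t$ as the rehedging interval decreases, which is one way both quantities enter the minimal option price under transaction costs.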

  5. Energy risk management and value at risk modeling

    Energy Technology Data Exchange (ETDEWEB)

    Mehdi Sadeghi; Saeed Shavvalpour [Imam Sadiq University, Tehran (Iran). Economics Dept.

    2006-12-15

The value of energy trades can change over time with market conditions and underlying price variables. The rise of competition and deregulation in energy markets has led to relatively free energy markets that are characterized by high price shifts. Within oil markets, the volatile oil price environment after the OPEC agreements in the 1970s requires risk quantification, and 'value-at-risk' has become an essential tool for quantifying market risk. There are various methods for calculating value-at-risk. The methods introduced in this paper are historical simulation with ARMA forecasting (HSAF) and the variance-covariance approach based on GARCH modeling. The results show that among the various approaches the HSAF methodology produces more efficient results: at a 99% confidence level, the value-at-risk calculated through the HSAF methodology exceeds actual price changes in almost 97.6 percent of the forecasting period. (author)
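The historical-simulation core of such a calculation is simple; a minimal sketch of a one-period 99% VaR from a return history (the ARMA forecasting layer of HSAF is omitted, and the returns are made-up numbers):

```python
import numpy as np

def historical_var(returns, confidence=0.99):
    """One-period value-at-risk by historical simulation: the loss level
    that returns fell below only (1 - confidence) of the time."""
    return -np.quantile(np.asarray(returns, dtype=float), 1.0 - confidence)

# Illustrative daily oil-price returns
var_99 = historical_var([0.01, -0.03, 0.002, -0.015, 0.007, -0.022])
```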

  6. Energy risk management and value at risk modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sadeghi, Mehdi [Economics department, Imam Sadiq University, P.B. 14655-159, Tehran (Iran, Islamic Republic of)]. E-mail: sadeghi@isu.ac.ir; Shavvalpour, Saeed [Economics department, Imam Sadiq University, P.B. 14655-159, Tehran (Iran, Islamic Republic of)]. E-mail: shavalpoor@isu.ac.ir

    2006-12-15

The value of energy trades can change over time with market conditions and underlying price variables. The rise of competition and deregulation in energy markets has led to relatively free energy markets that are characterized by high price shifts. Within oil markets, the volatile oil price environment after the OPEC agreements in the 1970s requires risk quantification. 'Value-at-risk' has become an essential tool for this end when quantifying market risk. There are various methods for calculating value-at-risk. The methods introduced in this paper are historical simulation with ARMA forecasting (HSAF) and the variance-covariance approach based on GARCH modeling. The results show that among the various approaches the HSAF methodology presents more efficient results, so that at a 99% level of confidence, the value-at-risk calculated through the HSAF methodology is greater than actual price changes in almost 97.6 percent of the forecasting period.

  7. Effects of Lower and Higher Quality Brand Versions on Brand Evaluation: an Opponent-Process Model Plus Differential Brand-Version Weighting

    National Research Council Canada - National Science Library

    Timothy Heath; Devon DelVecchio; Michael McCarthy; Subimal Chatterjee

    2009-01-01

    ...) or lower-quality versions (e.g., Ruby Tuesday's Corner Diner). A brand-quality asymmetry emerges on measures ranging from brand choice to brand attitude to perceptions of brand expertise, innovativeness, and prestige...

  8. Does Diversity Matter In Modeling? Testing A New Version Of The FORMIX3 Growth Model For Madagascar Rainforests

    Science.gov (United States)

    Armstrong, A. H.; Fischer, R.; Shugart, H. H.; Huth, A.

    2012-12-01

Ecological forecasting has become an essential tool used by ecologists to understand the dynamics of growth and disturbance response in threatened ecosystems such as the rainforests of Madagascar. In the species-rich tropics, forest conservation is often eclipsed by anthropogenic factors, resulting in a heightened need for accurate assessment of biomass before these ecosystems disappear. The objective of this study was to test a new Madagascar-rainforest-specific version of the FORMIX3 growth model (Huth and Ditzer, 2000; Huth et al., 1998) to assess how accurately biomass can be simulated in high-biodiversity forests using a method of functional-type aggregation in an individual-based model framework. Rainforest survey data collected over three growing seasons, covering 265 tree species, were aggregated into 12 plant functional types based on size and light requirements. Findings indicated that the forest study site compared best when the simulated forest reached mature successional status. Multi-level comparisons between model simulation data and survey plot data found that though some features, such as the dominance of canopy-emergent species and the relative absence of small woody treelets, are captured by the model, other forest attributes were not well reflected. Overall, the ability to accurately simulate the Madagascar rainforest was slightly diminished by the aggregation of tree species into size and light-requirement functional-type groupings.

  9. Preliminary normative data for the Evaluation of Risks Scale-Bubble Sheet Version (EVAR-B) for large-scale surveys of returning combat veterans.

    Science.gov (United States)

    Killgore, William D S; Castro, Carl A; Hoge, Charles W

    2010-10-01

    The Evaluation of Risks (EVAR) scale has been used to assess risk-taking propensity in military samples. This report provides preliminary reliability, validity, and normative data on a modified version of the instrument designed to facilitate data entry with optical scanners, the Evaluation of Risks-Bubble Sheet version (EVAR-B). 2,015 U.S. Army soldiers completed the EVAR-B and a survey assessing risk-related behaviors 3 months after returning home from combat deployment in Iraq. EVAR-B demonstrated acceptable internal consistency and reliability and correlated significantly with independent measures of self-reported risk-taking behavior, including alcohol use and aggressive behavior, in the weeks preceding the survey. Tentative cut-offs significantly differentiated heavy drinkers, dangerous drivers, and soldiers reporting recent aggressive outbursts. Normative data are provided for comparison with future studies. The EVAR-B is a reliable and valid measure of risk-taking propensity, which provides enhanced flexibility for administration and scoring in large surveys and field environments.

  10. A Network Model of Credit Risk Contagion

    Directory of Open Access Journals (Sweden)

    Ting-Qiang Chen

    2012-01-01

Full Text Available A network model of credit risk contagion is presented, in which the behaviors of credit risk holders and the financial market regulators, as well as the network structure, are considered. By introducing stochastic dominance theory, we discuss the mechanisms by which the degree of individual relationships, individual attitudes toward credit risk contagion, individuals' ability to resist credit risk contagion, the monitoring strength of the financial market regulators, and the network structure each affect credit risk contagion. Several derived and proved propositions are then verified through numerical simulations.

  11. Techniques and Simulation Models in Risk Management

    OpenAIRE

    Mirela GHEORGHE

    2012-01-01

    In the present paper, the scientific approach of the research starts from the theoretical framework of the simulation concept and then continues in the setting of the practical reality, thus providing simulation models for a broad range of inherent risks specific to any organization and simulation of those models, using the informatics instrument @Risk (Palisade). The reason behind this research lies in the need for simulation models that will allow the person in charge with decision taking i...

  12. Stochastic empirical loading and dilution model (SELDM) version 1.0.0

    Science.gov (United States)

    Granato, Gregory E.

    2013-01-01

    The Stochastic Empirical Loading and Dilution Model (SELDM) is designed to transform complex scientific data into meaningful information about the risk of adverse effects of runoff on receiving waters, the potential need for mitigation measures, and the potential effectiveness of such management measures for reducing these risks. The U.S. Geological Survey developed SELDM in cooperation with the Federal Highway Administration to help develop planning-level estimates of event mean concentrations, flows, and loads in stormwater from a site of interest and from an upstream basin. Planning-level estimates are defined as the results of analyses used to evaluate alternative management measures; planning-level estimates are recognized to include substantial uncertainties (commonly orders of magnitude). SELDM uses information about a highway site, the associated receiving-water basin, precipitation events, stormflow, water quality, and the performance of mitigation measures to produce a stochastic population of runoff-quality variables. SELDM provides input statistics for precipitation, prestorm flow, runoff coefficients, and concentrations of selected water-quality constituents from National datasets. Input statistics may be selected on the basis of the latitude, longitude, and physical characteristics of the site of interest and the upstream basin. The user also may derive and input statistics for each variable that are specific to a given site of interest or a given area. SELDM is a stochastic model because it uses Monte Carlo methods to produce the random combinations of input variable values needed to generate the stochastic population of values for each component variable. SELDM calculates the dilution of runoff in the receiving waters and the resulting downstream event mean concentrations and annual average lake concentrations. Results are ranked, and plotting positions are calculated, to indicate the level of risk of adverse effects caused by runoff concentrations
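The dilution step described here is, at heart, a stochastic flow-weighted mass balance; a minimal sketch of that idea follows (the distributions and parameter values are illustrative, not SELDM's input statistics):

```python
import numpy as np

def downstream_emc(c_runoff, q_runoff, c_upstream, q_upstream):
    """Flow-weighted mixing of highway-runoff and upstream event mean
    concentrations to a downstream event mean concentration."""
    return (c_runoff * q_runoff + c_upstream * q_upstream) / (q_runoff + q_upstream)

rng = np.random.default_rng(1)
n = 10_000  # Monte Carlo storm events
cr = rng.lognormal(mean=3.0, sigma=0.8, size=n)   # runoff EMC (e.g., ug/L)
qr = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # runoff flow
cu = rng.lognormal(mean=1.5, sigma=0.6, size=n)   # upstream concentration
qu = rng.lognormal(mean=2.0, sigma=0.7, size=n)   # upstream flow

# Fraction of simulated events exceeding a hypothetical water-quality criterion
exceed_frac = np.mean(downstream_emc(cr, qr, cu, qu) > 30.0)
```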

  13. A methodology for modeling regional terrorism risk.

    Science.gov (United States)

    Chatterjee, Samrat; Abkowitz, Mark D

    2011-07-01

    Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States.
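One common formalization of risk "expressed in expected annual monetary terms" is an expected annual loss; this is offered as orientation, not necessarily the authors' exact specification:

```latex
E[L_r] = \sum_{s} \lambda_{r,s}\, C_{r,s}
```

where $\lambda_{r,s}$ is the annual rate of attack scenario $s$ in region $r$ (driven here by attributes of population concentration and critical infrastructure) and $C_{r,s}$ is its monetary consequence.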

  14. Software Design Description for the Navy Coastal Ocean Model (NCOM) Version 4.0

    Science.gov (United States)

    2008-12-31

Excerpts from the subroutine descriptions: one routine takes arguments (cstr, lenc), with data declarations integer lenc and character cstr; subroutine COAMPS_UVG2UV is also documented; and the parsing routine strpars splits a line into substrings (leading and trailing blanks are removed from the substrings), with calling sequence strpars(cline, cdelim, nstr, cstr, nsto, ierr) and data declarations character cline, cstr, cdelim.

  15. PRISM: a planned risk information seeking model.

    Science.gov (United States)

    Kahlor, LeeAnn

    2010-06-01

    Recent attention on health-related information seeking has focused primarily on information seeking within specific health and health risk contexts. This study attempts to shift some of that focus to individual-level variables that may impact health risk information seeking across contexts. To locate these variables, the researcher posits an integrated model, the Planned Risk Information Seeking Model (PRISM). The model, which treats risk information seeking as a deliberate (planned) behavior, maps variables found in the Theory of Planned Behavior (TPB; Ajzen, 1991) and the Risk Information Seeking and Processing Model (RISP; Griffin, Dunwoody, & Neuwirth, 1999), and posits linkages among those variables. This effort is further informed by Kahlor's (2007) Augmented RISP, the Theory of Motivated Information Management (Afifi & Weiner, 2004), the Comprehensive Model of Information Seeking (Johnson & Meischke, 1993), the Health Information Acquisition Model (Freimuth, Stein, & Kean, 1989), and the Extended Parallel Processing Model (Witte, 1998). The resulting integrated model accounted for 59% of the variance in health risk information-seeking intent and performed better than the TPB or the RISP alone.

  16. Risk-Averse Control of Undiscounted Transient Markov Models

    CERN Document Server

    Cavus, Ozlem

    2012-01-01

    We use Markov risk measures to formulate a risk-averse version of the undiscounted total cost problem for a transient controlled Markov process. We derive risk-averse dynamic programming equations and we show that a randomized policy may be strictly better than deterministic policies, when risk measures are employed. We illustrate the results on an optimal stopping problem and an organ transplant problem.
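Schematically, the risk-averse dynamic programming equations replace the conditional expectation in the classical Bellman equation with a one-step Markov risk measure σ; the form below is a sketch of the construction, not the paper's exact statement:

```latex
v(x) = \min_{u \in U(x)} \Big\{ c(x,u) + \sigma\big(v(X'),\, x, u\big) \Big\}
```

where $X'$ is the next state under control $u$ and $\sigma(\cdot, x, u)$ reduces to the conditional expectation $E[\,\cdot \mid x, u]$ in the risk-neutral case, recovering the ordinary undiscounted total-cost equations.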

  17. Bayesian modelling of geostatistical malaria risk data

    Directory of Open Access Journals (Sweden)

    L. Gosoniu

    2006-11-01

    Full Text Available Bayesian geostatistical models applied to malaria risk data quantify the environment-disease relations, identify significant environmental predictors of malaria transmission and provide model-based predictions of malaria risk together with their precision. These models are often based on the stationarity assumption which implies that spatial correlation is a function of distance between locations and independent of location. We relax this assumption and analyse malaria survey data in Mali using a Bayesian non-stationary model. Model fit and predictions are based on Markov chain Monte Carlo simulation methods. Model validation compares the predictive ability of the non-stationary model with the stationary analogue. Results indicate that the stationarity assumption is important because it influences the significance of environmental factors and the corresponding malaria risk maps.

  18. Bayesian modelling of geostatistical malaria risk data.

    Science.gov (United States)

    Gosoniu, L; Vounatsou, P; Sogoba, N; Smith, T

    2006-11-01

    Bayesian geostatistical models applied to malaria risk data quantify the environment-disease relations, identify significant environmental predictors of malaria transmission and provide model-based predictions of malaria risk together with their precision. These models are often based on the stationarity assumption which implies that spatial correlation is a function of distance between locations and independent of location. We relax this assumption and analyse malaria survey data in Mali using a Bayesian non-stationary model. Model fit and predictions are based on Markov chain Monte Carlo simulation methods. Model validation compares the predictive ability of the non-stationary model with the stationary analogue. Results indicate that the stationarity assumption is important because it influences the significance of environmental factors and the corresponding malaria risk maps.
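A typical Bayesian geostatistical specification of the kind described is sketched below in generic notation; the paper's non-stationary extension relaxes the distance-only covariance form:

```latex
\begin{aligned}
Y_i \mid p_i &\sim \mathrm{Binomial}(n_i,\, p_i),\\
\operatorname{logit}(p_i) &= \mathbf{x}_i^{\top}\boldsymbol\beta + \varphi_i,\\
\boldsymbol\varphi &\sim \mathcal{N}\!\big(\mathbf{0},\, \sigma^2 R(\rho)\big),
\qquad R_{jk} = \exp(-\rho\, d_{jk}),
\end{aligned}
```

where $d_{jk}$ is the distance between survey locations $j$ and $k$; stationarity is the assumption that $R$ depends on distance alone, independent of location.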

  19. Modeling foreign exchange risk premium in Armenia

    NARCIS (Netherlands)

    Poghosyan, Tigran; Kocenda, Evnen; Zemcik, Petr

    2008-01-01

    This paper applies stochastic discount factor methodology to modeling the foreign exchange risk premium in Armenia. We use weekly data on foreign and domestic currency deposits, which coexist in the Armenian banking system. This coexistence implies elimination of the cross-country risks and transact

  20. The Flexible Global Ocean-Atmosphere-Land System Model,Grid-point Version 2:FGOALS-g2

    Institute of Scientific and Technical Information of China (English)

    LI Lijuan; LIN Pengfei; YU Yongqiang; WANG Bin; ZHOU Tianjun; LIU Li; LIU Jiping

    2013-01-01

This study mainly introduces the development of the Flexible Global Ocean-Atmosphere-Land System Model: Grid-point Version 2 (FGOALS-g2) and preliminary evaluations of its performance based on results from the pre-industrial control run and four members of the historical runs according to the fifth phase of the Coupled Model Intercomparison Project (CMIP5) experiment design. The results suggest that many obvious improvements have been achieved by FGOALS-g2 compared with the previous version, FGOALS-g1, including its climatological mean states, climate variability, and 20th century surface temperature evolution. For example, FGOALS-g2 better simulates the frequency of tropical land precipitation, East Asian Monsoon precipitation and its seasonal cycle, the MJO, and ENSO, which are closely related to the updated cumulus parameterization scheme, as well as the alleviation of uncertainties in some key parameters in the shallow and deep convection schemes, cloud fraction, cloud macro/microphysical processes and the boundary layer scheme in its atmospheric model. The annual cycle of sea surface temperature along the equator in the Pacific is significantly improved in the new version. The sea ice salinity simulation is one of the unique characteristics of FGOALS-g2, although it is somewhat inconsistent with empirical observations in the Antarctic.

  1. Programs OPTMAN and SHEMMAN Version 6 (1999) - Coupled-Channels optical model and collective nuclear structure calculation -

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Jong Hwa; Lee, Jeong Yeon; Lee, Young Ouk; Sukhovitski, Efrem Sh. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-01-01

    Programs SHEMMAN and OPTMAN (Version 6) have been developed for the determination of nuclear Hamiltonian parameters and for optical model calculations, respectively. The optical model calculations by OPTMAN with coupling schemes built on wave functions of the non-axial soft-rotator are self-consistent, since the parameters of the nuclear Hamiltonian are determined by adjusting the energies of collective levels to experimental values with SHEMMAN prior to the optical model calculation. The programs have been installed at the Nuclear Data Evaluation Laboratory of KAERI. This report is intended as a brief manual of these codes. 43 refs., 9 figs., 1 tab. (Author)

  2. COMPUTERIZED MODEL OF RISK MANAGEMENT IN BUSINESS

    Directory of Open Access Journals (Sweden)

    Petrişor MANDU

    2011-01-01

    Full Text Available The occurrence of a risk situation, and the manager's awareness of it, poses a serious threat to the organization and its objectives. Consequently, the manager must gather, analyze, select and interpret many pieces of information, under stress, before making a decision to avert a disaster. Under these circumstances, a computerized model of risk management is the most adequate solution for making intervention effective through quicker and more accurate responses. The model offers sufficient confidence and a favorable psychological state for managing risk. In accordance with this model, the risk manager processes information by means of operational (mathematical) methods, which favors reaching optimal solutions in the shortest time, based on anticipations estimated through a rational model.

  3. GrundRisk - Coupling of vertical and horizontal transport models

    DEFF Research Database (Denmark)

    Locatelli, Luca; Rosenberg, Louise; Bjerg, Poul Løgstrup

    This report presents the development of the GrundRisk model for contaminated site risk assessment.

  4. Salutary effects of high-intensity interval training in persons with elevated cardiovascular risk [version 1; referees: 3 approved

    Directory of Open Access Journals (Sweden)

    Jerome L. Fleg

    2016-09-01

    Full Text Available Although moderate-intensity continuous training (MICT) has been the traditional model for aerobic exercise training for over four decades, a growing body of literature has demonstrated equal if not greater improvement in aerobic capacity and similar beneficial effects on body composition, glucose metabolism, blood pressure, and quality of life from high-intensity interval training (HIIT). An advantage of HIIT over MICT is the shorter time required to perform the same amount of energy expenditure. The current brief review summarizes the effects of HIIT on peak aerobic capacity and cardiovascular risk factors in healthy adults and those with various cardiovascular diseases, including coronary artery disease, chronic heart failure, and post heart transplantation.

  5. CLASSICAL RISK MODEL WITH THRESHOLD DIVIDEND STRATEGY

    Institute of Scientific and Technical Information of China (English)

    Zhou Ming; Guo Junyi

    2008-01-01

    In this article, a threshold dividend strategy is used for the classical risk model. Under this dividend strategy, the certain probability of ruin that occurs under a constant barrier strategy is avoided. Using the strong Markov property of the surplus process and the distribution of the deficit in the classical risk model, the survival probability for this model is derived, which is more direct than the derivation in Asmussen (2000, p. 195, Proposition 1.10). The occupation time of non-dividend payment for this model is also discussed by means of the martingale method.
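
    The survival probability in the classical risk model lends itself to a quick numerical illustration. Below is a minimal Monte Carlo sketch, not the article's martingale derivation: it assumes Poisson claim arrivals and exponential claim sizes, no dividend barrier, and estimates the finite-horizon ruin probability (one minus the survival probability); all parameter values are invented.

        import numpy as np

        def ruin_probability(u, c, lam, mean_claim, horizon, n_sim=5_000, seed=0):
            """Monte Carlo finite-horizon ruin probability for the classical risk
            model: surplus U(t) = u + c*t minus aggregate claims on a Poisson stream."""
            rng = np.random.default_rng(seed)
            ruined = 0
            for _ in range(n_sim):
                t, surplus = 0.0, u
                while True:
                    wait = rng.exponential(1.0 / lam)   # time to the next claim
                    t += wait
                    if t > horizon:
                        break                           # survived the horizon
                    surplus += c * wait - rng.exponential(mean_claim)
                    if surplus < 0:
                        ruined += 1
                        break
            return ruined / n_sim

        # premium rate with a 20% loading over the expected claim outflow
        print(ruin_probability(u=10.0, c=1.2, lam=1.0, mean_claim=1.0, horizon=200.0))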

  6. Two criteria for evaluating risk prediction models.

    Science.gov (United States)

    Pfeiffer, R M; Gail, M H

    2011-09-01

    We propose and study two criteria to assess the usefulness of models that predict risk of disease incidence for screening and prevention, or the usefulness of prognostic models for management following disease diagnosis. The first criterion, the proportion of cases followed PCF (q), is the proportion of individuals who will develop disease who are included in the proportion q of individuals in the population at highest risk. The second criterion is the proportion needed to follow-up, PNF (p), namely the proportion of the general population at highest risk that one needs to follow in order that a proportion p of those destined to become cases will be followed. PCF (q) assesses the effectiveness of a program that follows 100q% of the population at highest risk. PNF (p) assesses the feasibility of covering 100p% of cases by indicating how much of the population at highest risk must be followed. We show the relationship of those two criteria to the Lorenz curve and its inverse, and present distribution theory for estimates of PCF and PNF. We develop new methods, based on influence functions, for inference for a single risk model, and also for comparing the PCFs and PNFs of two risk models, both of which were evaluated in the same validation data.
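
    The two criteria have direct empirical analogues via the sorting construction described above. The sketch below is an assumed implementation, not the authors' code: it sorts the population from highest to lowest predicted risk and reads PCF(q) and PNF(p) off the resulting cumulative case counts, which is precisely the Lorenz-curve relationship mentioned in the abstract.

        import numpy as np

        def pcf(risk, case, q):
            """Proportion of cases followed: share of eventual cases captured when
            following the fraction q of the population at highest predicted risk."""
            order = np.argsort(-risk)                   # highest risk first
            n_follow = int(np.ceil(q * len(risk)))
            return case[order][:n_follow].sum() / case.sum()

        def pnf(risk, case, p):
            """Proportion needed to follow: smallest highest-risk fraction of the
            population that captures a proportion p of eventual cases."""
            order = np.argsort(-risk)
            cum_cases = np.cumsum(case[order]) / case.sum()
            idx = np.searchsorted(cum_cases, p)         # first index reaching coverage p
            return (idx + 1) / len(risk)

        rng = np.random.default_rng(0)
        risk = rng.beta(2, 8, size=10_000)              # predicted risks
        case = rng.random(10_000) < risk                # outcomes consistent with the risks
        print(pcf(risk, case, q=0.10), pnf(risk, case, p=0.80))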

  7. CERT Resilience Management Model - Mail-Specific Process Areas: International Mail Transportation (Version 1.0)

    Science.gov (United States)

    2014-08-01

    requirements to the physical IMPC facilities where mail transportation activities are conducted and other physical, environmental, and geographical...controls to support the resilience of mail and mail services during transportation are managed in the CERT-RMM Environmental Control process area. The...RISK:SG3 and RISK:SG4 in the CERT-RMM Risk Management process area. Typical Work Products 1. Postal item risk statements, with impact valuation 2

  8. Breast cancer risk prediction using a clinical risk model and polygenic risk score.

    Science.gov (United States)

    Shieh, Yiwey; Hu, Donglei; Ma, Lin; Huntsman, Scott; Gard, Charlotte C; Leung, Jessica W T; Tice, Jeffrey A; Vachon, Celine M; Cummings, Steven R; Kerlikowske, Karla; Ziv, Elad

    2016-10-01

    Breast cancer risk assessment can inform the use of screening and prevention modalities. We investigated the performance of the Breast Cancer Surveillance Consortium (BCSC) risk model in combination with a polygenic risk score (PRS) comprised of 83 single nucleotide polymorphisms identified from genome-wide association studies. We conducted a nested case-control study of 486 cases and 495 matched controls within a screening cohort. The PRS was calculated using a Bayesian approach. The contributions of the PRS and variables in the BCSC model to breast cancer risk were tested using conditional logistic regression. Discriminatory accuracy of the models was compared using the area under the receiver operating characteristic curve (AUROC). Increasing quartiles of the PRS were positively associated with breast cancer risk, with OR 2.54 (95 % CI 1.69-3.82) for breast cancer in the highest versus lowest quartile. In a multivariable model, the PRS, family history, and breast density remained strong risk factors. The AUROC of the PRS was 0.60 (95 % CI 0.57-0.64), and an Asian-specific PRS had AUROC 0.64 (95 % CI 0.53-0.74). A combined model including the BCSC risk factors and PRS had better discrimination than the BCSC model (AUROC 0.65 versus 0.62, p = 0.01). The BCSC-PRS model classified 18 % of cases as high-risk (5-year risk ≥3 %), compared with 7 % using the BCSC model. The PRS improved discrimination of the BCSC risk model and classified more cases as high-risk. Further consideration of the PRS's role in decision-making around screening and prevention strategies is merited.
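
    As a toy illustration of the paper's central idea, the sketch below fits a clinical-only and a combined clinical-plus-PRS logistic model on synthetic data and compares AUROCs. It uses ordinary rather than conditional logistic regression, and the predictors and coefficients are invented, not the BCSC model or the 83-SNP score.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(1)
        n = 2_000
        clinical = rng.normal(0, 1, n)        # stand-in for a clinical risk score
        prs = rng.normal(0, 1, n)             # standardized polygenic risk score
        logit = -2.0 + 0.5 * clinical + 0.4 * prs
        y = rng.random(n) < 1 / (1 + np.exp(-logit))

        X1 = clinical[:, None]
        X2 = np.column_stack([clinical, prs])
        m1 = LogisticRegression().fit(X1, y)
        m2 = LogisticRegression().fit(X2, y)
        auc1 = roc_auc_score(y, m1.predict_proba(X1)[:, 1])
        auc2 = roc_auc_score(y, m2.predict_proba(X2)[:, 1])
        print(f"AUROC clinical only: {auc1:.3f}, clinical + PRS: {auc2:.3f}")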

  9. Concordance for prognostic models with competing risks

    DEFF Research Database (Denmark)

    Wolbers, Marcel; Blanche, Paul; Koller, Michael T

    2014-01-01

    The concordance probability is a widely used measure to assess discrimination of prognostic models with binary and survival endpoints. We formally define the concordance probability for a prognostic model of the absolute risk of an event of interest in the presence of competing risks and relate it to recently proposed time-dependent area under the receiver operating characteristic curve measures. For right-censored data, we investigate inverse probability of censoring weighted (IPCW) estimates of a truncated concordance index based on a working model for the censoring distribution. We further illustrate the methods by computing the concordance probability for a prognostic model of coronary heart disease (CHD) events in the presence of the competing risk of non-CHD death.

  10. UNSAT-H Version 3.0: Unsaturated Soil Water and Heat Flow Model Theory, User Manual, and Examples

    Energy Technology Data Exchange (ETDEWEB)

    MJ Fayer

    2000-06-12

    The UNSAT-H model was developed at Pacific Northwest National Laboratory (PNNL) to assess the water dynamics of arid sites and, in particular, estimate recharge fluxes for scenarios pertinent to waste disposal facilities. During the last 4 years, the UNSAT-H model received support from the Immobilized Waste Program (IWP) of the Hanford Site's River Protection Project. This program is designing and assessing the performance of on-site disposal facilities to receive radioactive wastes that are currently stored in single- and double-shell tanks at the Hanford Site (LMHC 1999). The IWP is interested in estimates of recharge rates for current conditions and long-term scenarios involving the vadose zone disposal of tank wastes. Simulation modeling with UNSAT-H is one of the methods being used to provide those estimates (e.g., Rockhold et al. 1995; Fayer et al. 1999). To achieve the above goals for assessing water dynamics and estimating recharge rates, the UNSAT-H model addresses soil water infiltration, redistribution, evaporation, plant transpiration, deep drainage, and soil heat flow as one-dimensional processes. The UNSAT-H model simulates liquid water flow using Richards' equation (Richards 1931), water vapor diffusion using Fick's law, and sensible heat flow using the Fourier equation. This report documents UNSAT-H Version 3.0. The report includes the bases for the conceptual model and its numerical implementation, benchmark test cases, example simulations involving layered soils and plants, and the code manual. Version 3.0 is an enhanced-capability update of UNSAT-H Version 2.0 (Fayer and Jones 1990). New features include hysteresis, an iterative solution of head and temperature, an energy balance check, the modified Picard solution technique, additional hydraulic functions, multiple-year simulation capability, and general enhancements.
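
    Of the processes listed, the soil heat-flow component is the easiest to sketch. The fragment below is a toy explicit finite-difference solution of one-dimensional heat conduction (the Fourier equation) with invented soil properties and forcing; it illustrates the kind of calculation involved, not UNSAT-H's actual numerics.

        import numpy as np

        nz, dz, dt = 50, 0.02, 60.0          # 1 m column, 2 cm cells, 60 s steps
        alpha = 5e-7                         # thermal diffusivity (m^2/s), assumed
        T = np.full(nz, 283.0)               # uniform 10 degC initial profile (K)
        for step in range(24 * 60):          # one day of 1-minute steps
            # diurnal surface temperature forcing (assumed amplitude)
            T[0] = 283.0 + 8.0 * np.sin(2 * np.pi * step / (24 * 60))
            # explicit update of interior nodes; alpha*dt/dz^2 = 0.075 < 0.5 (stable)
            T[1:-1] += alpha * dt / dz**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
        print(f"temperature at 0.5 m depth after one day: {T[25]:.2f} K")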

  11. Calibrated predictions for multivariate competing risks models.

    Science.gov (United States)

    Gorfine, Malka; Hsu, Li; Zucker, David M; Parmigiani, Giovanni

    2014-04-01

    Prediction models for time-to-event data play a prominent role in assessing the individual risk of a disease, such as cancer. Accurate disease prediction models provide an efficient tool for identifying individuals at high risk, and provide the groundwork for estimating the population burden and cost of disease and for developing patient care guidelines. We focus on risk prediction of a disease in which family history is an important risk factor that reflects inherited genetic susceptibility, shared environment, and common behavior patterns. In this work family history is accommodated using frailty models, with the main novel feature being allowing for competing risks, such as other diseases or mortality. We show through a simulation study that naively treating competing risks as independent right censoring events results in non-calibrated predictions, with the expected number of events overestimated. Discrimination performance is not affected by ignoring competing risks. Our proposed prediction methodologies correctly account for competing events, are very well calibrated, and easy to implement.
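
    The paper's headline finding, that treating competing risks as independent censoring overestimates the expected number of events, can be reproduced in a few lines. The sketch assumes independent exponential latent times for the event of interest and the competing event, a deliberately simple setup rather than the paper's frailty model.

        import numpy as np

        lam1, lam2, t = 0.02, 0.04, 25.0   # cause-specific hazards and horizon (assumed)

        # true cumulative incidence of cause 1 by time t (accounts for competition)
        cif = lam1 / (lam1 + lam2) * (1 - np.exp(-(lam1 + lam2) * t))
        # "naive" risk when competing events are treated as independent censoring
        naive = 1 - np.exp(-lam1 * t)

        # Monte Carlo check of the true proportion experiencing cause 1 by t
        rng = np.random.default_rng(2)
        t1 = rng.exponential(1 / lam1, 200_000)
        t2 = rng.exponential(1 / lam2, 200_000)
        mc = np.mean((t1 < t2) & (t1 < t))
        print(f"true CIF {cif:.3f}, Monte Carlo {mc:.3f}, naive {naive:.3f} (too high)")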

  12. Risk terrain modeling predicts child maltreatment.

    Science.gov (United States)

    Daley, Dyann; Bachmann, Michael; Bachmann, Brittany A; Pedigo, Christian; Bui, Minh-Thuy; Coffman, Jamye

    2016-12-01

    As indicated by research on the long-term effects of adverse childhood experiences (ACEs), maltreatment has far-reaching consequences for affected children. Effective prevention measures have been elusive, partly due to difficulty in identifying vulnerable children before they are harmed. This study employs Risk Terrain Modeling (RTM), an analysis of the cumulative effect of environmental factors thought to be conducive for child maltreatment, to create a highly accurate prediction model for future substantiated child maltreatment cases in the City of Fort Worth, Texas. The model is superior to commonly used hotspot predictions and more beneficial in aiding prevention efforts in a number of ways: 1) it identifies the highest risk areas for future instances of child maltreatment with improved precision and accuracy; 2) it aids the prioritization of risk-mitigating efforts by informing about the relative importance of the most significant contributing risk factors; 3) since predictions are modeled as a function of easily obtainable data, practitioners do not have to undergo the difficult process of obtaining official child maltreatment data to apply it; 4) the inclusion of a multitude of environmental risk factors creates a more robust model with higher predictive validity; and, 5) the model does not rely on a retrospective examination of past instances of child maltreatment, but adapts predictions to changing environmental conditions. The present study introduces and examines the predictive power of this new tool to aid prevention efforts seeking to improve the safety, health, and wellbeing of vulnerable children.
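
    At its core, RTM overlays rasterized environmental risk layers and sums their weighted contributions into a composite risk surface, whose highest-scoring cells are flagged for prevention effort. A schematic sketch with invented layers and weights, not the Fort Worth model:

        import numpy as np

        rng = np.random.default_rng(3)
        # three hypothetical binary environmental layers on a 100x100 grid
        layer_a = rng.random((100, 100)) < 0.05   # e.g., proximity to risky facilities
        layer_b = rng.random((100, 100)) < 0.10   # e.g., vacant properties
        layer_c = rng.random((100, 100)) < 0.20   # e.g., concentrated poverty

        # RTM-style composite: weighted sum of the risk layers (weights assumed)
        risk_surface = 1.5 * layer_a + 1.0 * layer_b + 2.0 * layer_c

        # flag the top 5% highest-risk cells for prioritized prevention
        threshold = np.quantile(risk_surface, 0.95)
        print("high-risk cells:", int((risk_surface >= threshold).sum()))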

  13. Queues and risk models with simultaneous arrivals

    NARCIS (Netherlands)

    Badila, E.S.; Boxma, O.J.; Resing, J.A.C.; Winands, E.M.M.

    2014-01-01

    We focus on a particular connection between queueing and risk models in a multidimensional setting. We first consider the joint workload process in a queueing model with parallel queues and simultaneous arrivals at the queues. For the case that the service times are ordered (from largest in the firs

  14. A new version of variational integrated technology for environmental modeling with assimilation of available data

    Science.gov (United States)

    Penenko, Vladimir; Tsvetova, Elena; Penenko, Aleksey

    2014-05-01

    the equations of the model of processes as desired deterministic control functions. This method of data assimilation with control functions is implemented by direct algorithms. The modeling technology presented here focuses on various scientific and applied problems of environmental prediction and design, including risk assessment in relation to existing and potential sources of natural and anthropogenic influences. The work is partially supported by the Programs No 4 of Presidium RAS and No 3 of Mathematical Department of RAS; by RFBR projects NN 11-01-00187 and 14-01-31482; by Integrating projects of SD RAS No 8 and 35. Our studies are in the line with the goals of COST Action ES1004. References 1. V. Penenko, A.Baklanov, E. Tsvetova and A. Mahura. Direct and Inverse Problems in a Variational Concept of Environmental Modeling, Pure and Applied Geoph. 2012.V.169:447-465. 2. A.V. Penenko, Discrete-analytic schemes for solving an inverse coefficient heat conduction problem in a layered medium with gradient methods, Numerical Analysis and Applications, 2012. V. 5:326-341. 3. V. Penenko, E. Tsvetova. Variational methods for constructing the monotone approximations for atmospheric chemistry models, Numerical analysis and applications, 2013. V. 6: 210-220.

  15. Technical report series on global modeling and data assimilation. Volume 1: Documentation of the Goddard Earth Observing System (GEOS) General Circulation Model, version 1

    Science.gov (United States)

    Suarez, Max J. (Editor); Takacs, Lawrence L.; Molod, Andrea; Wang, Tina

    1994-01-01

    This technical report documents Version 1 of the Goddard Earth Observing System (GEOS) General Circulation Model (GCM). The GEOS-1 GCM is being used by NASA's Data Assimilation Office (DAO) to produce multiyear data sets for climate research. This report provides a documentation of the model components used in the GEOS-1 GCM, a complete description of model diagnostics available, and a User's Guide to facilitate GEOS-1 GCM experiments.

  16. A Fast and Efficient Version of the TwO-Moment Aerosol Sectional (TOMAS) Global Aerosol Microphysics Model

    Science.gov (United States)

    Lee, Yunha; Adams, P. J.

    2012-01-01

    This study develops more computationally efficient versions of the TwO-Moment Aerosol Sectional (TOMAS) microphysics algorithms, collectively called Fast TOMAS. Several methods for speeding up the algorithm were attempted, but only reducing the number of size sections was adopted. Fast TOMAS models, coupled to the GISS GCM II-prime, require a new coagulation algorithm with less restrictive size resolution assumptions but only minor changes in other processes. Fast TOMAS models have been evaluated in a box model against analytical solutions of coagulation and condensation and in a 3-D model against the original TOMAS (TOMAS-30) model. Condensation and coagulation in the Fast TOMAS models agree well with the analytical solution but show slightly more bias than the TOMAS-30 box model. In the 3-D model, errors resulting from decreased size resolution in each process (i.e., emissions, cloud processing/wet deposition, microphysics) are quantified in a series of model sensitivity simulations. Errors resulting from lower size resolution in condensation and coagulation, defined as the microphysics error, affect number and mass concentrations by only a few percent. The microphysics error in CN70/CN100 (number concentrations of particles larger than 70/100 nm diameter), proxies for cloud condensation nuclei, ranges from -5 to 5% in most regions. The largest errors are associated with decreasing the size resolution in the cloud processing/wet deposition calculations, defined as the cloud-processing error, and range from -20 to 15% in most regions for CN70/CN100 concentrations. Overall, the Fast TOMAS models increase the computational speed by 2 to 3 times with only small numerical errors stemming from condensation and coagulation calculations when compared to TOMAS-30. The faster versions of the TOMAS model allow for the longer, multi-year simulations required to assess aerosol effects on cloud lifetime and precipitation.

  17. Refinement and evaluation of the Massachusetts firm-yield estimator model version 2.0

    Science.gov (United States)

    Levin, Sara B.; Archfield, Stacey A.; Massey, Andrew J.

    2011-01-01

    The firm yield is the maximum average daily withdrawal that can be extracted from a reservoir without risk of failure during an extended drought period. Previously developed procedures for determining the firm yield of a reservoir were refined and applied to 38 reservoir systems in Massachusetts, including 25 single- and multiple-reservoir systems that were examined during previous studies and 13 additional reservoir systems. Changes to the firm-yield model include refinements to the simulation methods and input data, as well as the addition of several scenario-testing capabilities. The simulation procedure was adapted to run at a daily time step over a 44-year simulation period, and daily streamflow and meteorological data were compiled for all the reservoirs for input to the model. Another change to the model-simulation methods is the adjustment of the scaling factor used in estimating groundwater contributions to the reservoir. The scaling factor is used to convert the daily groundwater-flow rate into a volume by multiplying the rate by the length of reservoir shoreline that is hydrologically connected to the aquifer. Previous firm-yield analyses used a constant scaling factor that was estimated from the reservoir surface area at full pool. The use of a constant scaling factor caused groundwater flows during periods when the reservoir stage was very low to be overestimated. The constant groundwater scaling factor used in previous analyses was replaced with a variable scaling factor that is based on daily reservoir stage. This change reduced instability in the groundwater-flow algorithms and produced more realistic groundwater-flow contributions during periods of low storage. Uncertainty in the firm-yield model arises from many sources, including errors in input data. The sensitivity of the model to uncertainty in streamflow input data and uncertainty in the stage-storage relation was examined. A series of Monte Carlo simulations were performed on 22 reservoirs
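
    The firm-yield calculation itself reduces to a search for the largest constant draft that never empties a simulated reservoir. A stripped-down sketch (daily mass balance with synthetic inflows only, omitting the evaporation and groundwater terms discussed above):

        import numpy as np

        def firm_yield(inflow, capacity, tol=0.01):
            """Largest constant daily withdrawal that never empties the reservoir,
            found by bisection over a daily mass-balance simulation."""
            def survives(draft):
                storage = capacity                     # start at full pool
                for q in inflow:
                    storage = min(capacity, storage + q - draft)
                    if storage < 0:
                        return False
                return True
            lo, hi = 0.0, inflow.mean()                # yield cannot exceed mean inflow
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                lo, hi = (mid, hi) if survives(mid) else (lo, mid)
            return lo

        rng = np.random.default_rng(4)
        inflow = rng.gamma(2.0, 5.0, size=44 * 365)    # synthetic 44-year daily record
        print(f"firm yield: {firm_yield(inflow, capacity=5000.0):.2f} per day")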

  18. Simulations of the mid-Pliocene Warm Period using two versions of the NASA/GISS ModelE2-R Coupled Model

    Directory of Open Access Journals (Sweden)

    M. A. Chandler

    2013-04-01

    Full Text Available The mid-Pliocene Warm Period (mPWP) bears many similarities to aspects of future global warming as projected by the Intergovernmental Panel on Climate Change (IPCC, 2007). Both marine and terrestrial data point to high-latitude temperature amplification, including large decreases in sea ice and land ice, as well as expansion of warmer climate biomes into higher latitudes. Here we present our most recent simulations of the mid-Pliocene climate using the CMIP5 version of the NASA/GISS Earth System Model (ModelE2-R). We describe the substantial impact associated with a recent correction made in the implementation of the Gent-McWilliams ocean mixing scheme (GM), which has a large effect on the simulation of ocean surface temperatures, particularly in the North Atlantic Ocean. The effect of this correction on the Pliocene climate results would not have been easily determined from examining its impact on the preindustrial runs alone, a useful demonstration of how the consequences of code improvements as seen in modern climate control runs do not necessarily portend the impacts in extreme climates. Both the GM-corrected and GM-uncorrected simulations were contributed to the Pliocene Model Intercomparison Project (PlioMIP) Experiment 2. Many findings presented here corroborate results from other PlioMIP multi-model ensemble papers, but we also emphasise features in the ModelE2-R simulations that are unlike the ensemble means. The corrected version yields results that more closely resemble the ocean core data as well as the PRISM3D reconstructions of the mid-Pliocene, especially the dramatic warming in the North Atlantic and Greenland-Iceland-Norwegian Sea, which in the new simulation appears to be far more realistic than previously found with older versions of the GISS model. Our belief is that continued development of key physical routines in the atmospheric model, along with higher resolution and recent corrections to mixing parameterisations in the ocean

  19. Land-total and Ocean-total Precipitation and Evaporation from a Community Atmosphere Model version 5 Perturbed Parameter Ensemble

    Energy Technology Data Exchange (ETDEWEB)

    Covey, Curt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lucas, Donald D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Trenberth, Kevin E. [National Center for Atmospheric Research, Boulder, CO (United States)

    2016-03-02

    This document presents the large-scale water budget statistics of a perturbed input-parameter ensemble of atmospheric model runs. The model is Version 5.1.02 of the Community Atmosphere Model (CAM). These runs are the “C-Ensemble” described by Qian et al., “Parametric Sensitivity Analysis of Precipitation at Global and Local Scales in the Community Atmosphere Model CAM5” (Journal of Advances in Modeling the Earth System, 2015). As noted by Qian et al., the simulations are “AMIP type” with temperature and sea ice boundary conditions chosen to match surface observations for the five-year period 2000-2004. There are 1100 ensemble members in addition to one run with default input-parameter values.

  20. Simple geometrical explanation of Gurtin-Murdoch model of surface elasticity with clarification of its related versions

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    It is shown that all equations of the linearized Gurtin-Murdoch model of surface elasticity can be derived, in a straightforward way, from a simple second-order expression for the ratio of deformed surface area to initial surface area. This elementary derivation offers a simple explanation for all unique features of the model and its simplified/modified versions, and helps to clarify some misunderstandings of the model already occurring in the literature. Finally, it is demonstrated that, because the Gurtin-Murdoch model is based on a hybrid formulation combining linearized deformation of bulk material with 2nd-order finite deformation of the surface, caution is needed when the original form of this model is applied to bending deformation of thin-walled elastic structures with surface stress.

  1. Evaluation of the tropospheric aerosol number concentrations simulated by two versions of the global model ECHAM5-HAM

    Science.gov (United States)

    Zhang, K.; Kazil, J.; Feichter, J.

    2009-04-01

    Since its first version developed by Stier et al. (2005), the global aerosol-climate model ECHAM5-HAM has gone through further development and updates. The changes in the model include (1) a new time integration scheme for the condensation of the sulfuric acid gas on existing particles, (2) a new aerosol nucleation scheme that takes into account the charged nucleation caused by cosmic rays, and (3) a parameterization scheme explicitly describing the conversion of aerosol particles to cloud nuclei. In this work, simulations performed with the old and new model versions are evaluated against some measurements reported in recent years. The focus is on the aerosol size distribution in the troposphere. Results show that modifications in the parameterizations have led to significant changes in the simulated aerosol concentrations. Vertical profiles of the total particle number concentration (diameter > 3nm) compiled by Clarke et al. (2002) suggest that, over the Pacific in the upper free troposphere, the tropics are associated with much higher concentrations than the mid-latitude regions. This feature is more reasonably reproduced by the new model version, mainly due to the improved results of the nucleation mode aerosols. In the lower levels (2-5 km above the Earth's surface), the number concentrations of the Aitken mode particles are overestimated compared to both the Pacific data given in Clarke et al. (2002) and the vertical profiles over Europe reported by Petzold et al. (2007). The physical and chemical processes that have led to these changes are identified by sensitivity tests. References: Clarke and Kapustin: A Pacific aerosol survey - part 1: a decade of data on production, transport, evolution and mixing in the troposphere, J. Atmos. Sci., 59, 363-382, 2002. Petzold et al.: Perturbation of the European free troposphere aerosol by North American forest fire plumes during the ICARTT-ITOP experiment in summer 2004, Atmos. Chem. Phys., 7, 5105-5127, 2007

  2. Uncertainty in surface water flood risk modelling

    Science.gov (United States)

    Butler, J. B.; Martin, D. N.; Roberts, E.; Domuah, R.

    2009-04-01

    Two thirds of the flooding that occurred in the UK during summer 2007 was as a result of surface water (otherwise known as 'pluvial') rather than river or coastal flooding. In response, the Environment Agency and Interim Pitt Reviews have highlighted the need for surface water risk mapping and warning tools to identify, and prepare for, flooding induced by heavy rainfall events. This need is compounded by the likely increase in rainfall intensities due to climate change. The Association of British Insurers has called for the Environment Agency to commission nationwide flood risk maps showing the relative risk of flooding from all sources. At the wider European scale, the recently-published EC Directive on the assessment and management of flood risks will require Member States to evaluate, map and model flood risk from a variety of sources. As such, there is now a clear and immediate requirement for the development of techniques for assessing and managing surface water flood risk across large areas. This paper describes an approach for integrating rainfall, drainage network and high-resolution topographic data using Flowroute™, a high-resolution flood mapping and modelling platform, to produce deterministic surface water flood risk maps. Information is provided from UK case studies to enable assessment and validation of modelled results using historical flood information and insurance claims data. Flowroute was co-developed with flood scientists at Cambridge University specifically to simulate river dynamics and floodplain inundation in complex, congested urban areas in a highly computationally efficient manner. It utilises high-resolution topographic information to route flows around individual buildings so as to enable the prediction of flood depths, extents, durations and velocities. As such, the model forms an ideal platform for the development of surface water flood risk modelling and mapping capabilities. The 2-dimensional component of Flowroute employs

  3. Modelling of Systemic Risk of Banking Sector

    Directory of Open Access Journals (Sweden)

    Laura Gudelytė

    2014-03-01

    Full Text Available Purpose – to evaluate the general networking and simulation approaches to modelling systemic risk and financial contagion, and their ability to assess the resilience of the banking sector to external economic shocks and to the collapse of idiosyncratic financial institutions. Design/methodology/approach – a general overview of research papers presenting concepts and methodologies for the assessment of systemic risk in the banking sector. Findings – limitations of the networking approach and possible ways to improve the modelling of systemic risk. The network approach cannot explain the causes of a bank's initial default, and the assumptions made about LGD and interbank exposures are very strong; these features are important limitations of network and simulation approaches. Research limitations/implications – the application of the reviewed methods to the Lithuanian banking sector fails, however, due to the lack of exhaustive data; for the same reason, the methods applied to systemic risk have so far been limited, and it is difficult to build adequate dynamic models of systemic risk. Thus, in assessing the systemic risk of the banking sector, the same problem remains: whether the financial crisis, its spread, its speed and other characteristics can be parameterized with quantitative methods. Knowing the liquidity, credit risk and other standards set in the Basel Accords is also not enough to properly manage the systemic risk of the whole banking sector, because the proper functioning of the banking sector is influenced not only by characteristics related to capital requirements but also by external (mostly macroeconomic and political) factors. Practical implications – an explicit, quantitative determination of the systemic risk of the banking sector would provide an exact and objective assessment, useful not only for the
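
    The network approach discussed above is typically implemented as a default cascade over an interbank exposure matrix. A minimal sketch with a random network, an assumed loss-given-default of 100%, and invented capital levels:

        import numpy as np

        def contagion(exposures, capital, shocked, lgd=1.0):
            """Furfine-style cascade: exposures[i, j] is bank i's claim on bank j.
            Failed banks impose lgd * exposure losses on creditors; any bank whose
            cumulative loss exceeds its capital fails in the next round."""
            failed = np.zeros(len(capital), dtype=bool)
            failed[shocked] = True
            while True:
                losses = lgd * exposures[:, failed].sum(axis=1)
                newly = (losses > capital) & ~failed
                if not newly.any():
                    return failed
                failed |= newly

        rng = np.random.default_rng(5)
        n = 20
        exposures = rng.random((n, n)) * (rng.random((n, n)) < 0.2)  # sparse network
        np.fill_diagonal(exposures, 0.0)
        capital = 0.3 * exposures.sum(axis=1)     # capital as 30% of interbank assets
        print("banks failed after shocking bank 0:", int(contagion(exposures, capital, 0).sum()))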

  4. Reliability of the Violence Risk Scale of Chinese Version

    Institute of Scientific and Technical Information of China (English)

    章雪利; 谌霞灿; 蔡伟雄; 胡峻梅

    2012-01-01

    Objective To introduce and revise the Violence Risk Scale (VRS) for assessing violence risk and changes in risk, and to examine the reliability of the Chinese version of the Violence Risk Scale (VRS-C). Methods The original English version of the VRS was translated into Chinese according to established translation procedures. To examine scorer reliability, 14 cases were assessed independently by 3 assessors. One hundred and twenty-five patients with mental disorders from 3 different institutions in Sichuan province (the Refined Control Ward of Ankang Hospital, the Department of Forensic Psychiatry of the Institute of Forensic Science, and the Mental Health Center of West China Hospital) were recruited to examine the reliability of the VRS-C. Results The results showed moderately good scale reliability of the VRS-C, with an ICC of 0.80 for scorer reliability. All items showed significant consistency, with a Cronbach's alpha coefficient of 0.921, a split-half reliability of 0.906, and item-total correlations of 0.246-0.849. Conclusion The reliability of the preliminarily revised VRS-C is acceptable.

  5. Issues in Value-at-Risk Modeling and Evaluation

    NARCIS (Netherlands)

    J. Danielsson; C.G. de Vries (Casper); B.N. Jorgensen (Bjørn); P.F. Christoffersen (Peter); F.X. Diebold (Francis); T. Schuermann (Til); J.A. Lopez (Jose); B. Hirtle (Beverly)

    1998-01-01

    Discusses the issues in value-at-risk modeling and evaluation: the value of value at risk; horizon problems and extreme events in financial risk management; and methods of evaluating value-at-risk estimates.

  6. Update of the Polar SWIFT model for polar stratospheric ozone loss (Polar SWIFT version 2)

    Directory of Open Access Journals (Sweden)

    I. Wohltmann

    2017-07-01

    Full Text Available The Polar SWIFT model is a fast scheme for calculating the chemistry of stratospheric ozone depletion in polar winter. It is intended for use in global climate models (GCMs) and Earth system models (ESMs) to enable the simulation of mutual interactions between the ozone layer and climate. To date, climate models often use prescribed ozone fields, since a full stratospheric chemistry scheme is computationally very expensive. Polar SWIFT is based on a set of coupled differential equations, which simulate the polar vortex-averaged mixing ratios of the key species involved in polar ozone depletion on a given vertical level. These species are O3, chemically active chlorine (ClOx), HCl, ClONO2 and HNO3. The only external input parameters that drive the model are the fraction of the polar vortex in sunlight and the fraction of the polar vortex below the temperatures necessary for the formation of polar stratospheric clouds. Here, we present an update of the Polar SWIFT model introducing several improvements over the original model formulation. In particular, the model is now trained on vortex-averaged reaction rates of the ATLAS Chemistry and Transport Model, which enables a detailed look at individual processes and an independent validation of the different parameterizations contained in the differential equations. The training of the original Polar SWIFT model was based on fitting complete model runs to satellite observations and did not allow for this. A revised formulation of the system of differential equations is developed, which closely fits vortex-averaged reaction rates from ATLAS that represent the main chemical processes influencing ozone. In addition, a parameterization for the HNO3 change by denitrification is included. The rates of change of the concentrations of the chemical species of the Polar SWIFT model are purely chemical rates of change in the new version, whereas in the original Polar SWIFT model, they included a transport effect

  7. Simulation modeling for microbial risk assessment.

    Science.gov (United States)

    Cassin, M H; Paoli, G M; Lammerding, A M

    1998-11-01

    Quantitative microbial risk assessment implies an estimation of the probability and impact of adverse health outcomes due to microbial hazards. In the case of food safety, the probability of human illness is a complex function of the variability of many parameters that influence the microbial environment, from the production to the consumption of a food. The analytical integration required to estimate the probability of foodborne illness is intractable in all but the simplest of models. Monte Carlo simulation is an alternative to computing analytical solutions. In some cases, a risk assessment may be commissioned to serve a larger purpose than simply the estimation of risk. A Monte Carlo simulation can provide insights into complex processes that are invaluable, and otherwise unavailable, to those charged with the task of risk management. Using examples from a farm-to-fork model of the fate of Escherichia coli O157:H7 in ground beef hamburgers, this paper describes specifically how such goals as research prioritization, risk-based characterization of control points, and risk-based comparison of intervention strategies can be objectively achieved using Monte Carlo simulation.
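
    A quantitative assessment of the kind described propagates variability through the exposure chain and a dose-response model by Monte Carlo. The sketch below uses invented distributions and an exponential dose-response as a schematic of the approach, not the authors' E. coli O157:H7 model:

        import numpy as np

        rng = np.random.default_rng(6)
        n = 100_000                                    # Monte Carlo iterations

        # hypothetical farm-to-fork chain, all in log10 units
        log_conc = rng.normal(-1.0, 1.0, n)            # log10 CFU/g in the raw product
        growth = rng.uniform(0.0, 2.0, n)              # log10 growth during storage
        cook_kill = rng.triangular(3.0, 5.0, 7.0, n)   # log10 reduction from cooking
        serving_g = 50.0

        dose = serving_g * 10 ** (log_conc + growth - cook_kill)   # ingested CFU
        r = 1e-3                                       # exponential dose-response parameter
        p_ill = 1 - np.exp(-r * dose)
        print(f"mean probability of illness per serving: {p_ill.mean():.2e}")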

  8. Managing risks in business model innovation processes

    DEFF Research Database (Denmark)

    Taran, Yariv; Boer, Harry; Lindgren, Peter

    2010-01-01

    Companies today, in some industries more than others, invest more capital and resources just to stay competitive, develop more diverse solutions, and increasingly start thinking more radically when considering their business models. However, despite the understanding that business model (BM) innovation is a risky enterprise, many companies are still choosing not to apply any risk management in the BM innovation process. The objective of this paper is to develop a better understanding of how risks are handled in the practice of BM innovation. An analysis of the BM innovation experiences of two industrial companies shows that both companies are experiencing high levels of uncertainty and complexity during their innovation processes and are, consequently, struggling to find new processes for handling the risks involved. Based on the two companies' experiences, various testable propositions are put forward.

  9. A Probabilistic Asteroid Impact Risk Model

    Science.gov (United States)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2016-01-01

    Asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data has little effect on the metrics of interest.
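
    The Monte Carlo structure of such a model can be sketched compactly: sample uncertain impactor properties, convert them to impact energy, and map energy to an affected population. All distributions and the damage scaling below are illustrative assumptions, not the PAIR inputs:

        import numpy as np

        rng = np.random.default_rng(7)
        n = 100_000

        diameter = 10 ** rng.uniform(1.0, 2.5, n)      # impactor diameter, 10-316 m
        density = rng.normal(2600.0, 300.0, n)         # kg/m^3
        velocity = rng.normal(20e3, 3e3, n)            # entry velocity, m/s

        mass = density * np.pi / 6 * diameter ** 3
        energy_mt = 0.5 * mass * velocity ** 2 / 4.184e15   # impact energy, Mt TNT

        # toy consequence model: cube-root damage-radius scaling, uniform population
        damage_radius_km = 2.0 * energy_mt ** (1 / 3)
        affected = np.pi * damage_radius_km ** 2 * 60.0     # 60 people per km^2 assumed
        print(f"mean affected population per impact: {affected.mean():,.0f}")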

  10. Application of a short-time version of the Equalization-Cancellation model to speech intelligibility experiments with speech maskers.

    Science.gov (United States)

    Wan, Rui; Durlach, Nathaniel I; Colburn, H Steven

    2014-08-01

    A short-time-processing version of the Equalization-Cancellation (EC) model of binaural processing is described and applied to speech intelligibility tasks in the presence of multiple maskers, including multiple speech maskers. This short-time EC model, called the STEC model, extends the model described by Wan et al. [J. Acoust. Soc. Am. 128, 3678-3690 (2010)] to allow the EC model's equalization parameters τ and α to be adjusted as a function of time, resulting in improved masker cancellation when the dominant masker location varies in time. Using the Speech Intelligibility Index, the STEC model is applied to speech intelligibility with maskers that vary in number, type, and spatial arrangements. Most notably, when maskers are located on opposite sides of the target, this STEC model predicts improved thresholds when the maskers are modulated independently with speech-envelope modulators; this includes the most relevant case of independent speech maskers. The STEC model describes the spatial dependence of the speech reception threshold with speech maskers better than the steady-state model. Predictions are also improved for independently speech-modulated noise maskers but are poorer for reversed-speech maskers. In general, short-term processing is useful, but much remains to be done in the complex task of understanding speech in speech maskers.
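
    The underlying EC operation is simple to demonstrate: delay one ear's signal by τ and scale it by α so the masker components at the two ears match, then subtract. A toy sketch with a single fixed τ and α (the STEC model instead re-estimates them over short time frames):

        import numpy as np

        def ec_cancel(left, right, tau_samples, alpha):
            """Equalize the right-ear signal (delay by tau, scale by alpha) so the
            masker matches the left ear, then subtract to cancel it."""
            return left - alpha * np.roll(right, tau_samples)

        rng = np.random.default_rng(8)
        fs = 16_000
        masker = rng.normal(size=fs)                    # lateralized noise masker
        delay, gain = 8, 0.9                            # interaural delay/level (assumed)
        left_m, right_m = masker, gain * np.roll(masker, -delay)
        t = np.arange(fs) / fs
        target = 0.1 * np.sin(2 * np.pi * 500.0 * t)    # diotic target tone

        masker_out = ec_cancel(left_m, right_m, tau_samples=delay, alpha=1 / gain)
        target_out = ec_cancel(target, target, tau_samples=delay, alpha=1 / gain)
        print(np.abs(masker_out).max())                 # ~0: masker cancelled
        print(np.abs(target_out).max())                 # nonzero: target survives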

  11. Process Definition and Process Modeling Methods Version 01.01.00

    Science.gov (United States)

    1991-09-01

    process model. This generic process model is a state machine model. It permits progress in software development to be characterized as transitions...e.g., Entry-Task-Validation-Exit (ETVX) diagram, Petri Net, two-level state machine model, state machine, and Structured Analysis and Design

  12. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcome through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general and the CORAS application of MBRA in particular have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures and also in other application domains such as the nuclear field can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  13. Approaches in highly parameterized inversion—PEST++ Version 3, a Parameter ESTimation and uncertainty analysis software suite optimized for large environmental models

    Science.gov (United States)

    Welter, David E.; White, Jeremy T.; Hunt, Randall J.; Doherty, John E.

    2015-09-18

    The PEST++ Version 1 object-oriented parameter estimation code is here extended to Version 3 to incorporate additional algorithms and tools to further improve support for large and complex environmental modeling problems. PEST++ Version 3 includes the Gauss-Marquardt-Levenberg (GML) algorithm for nonlinear parameter estimation, Tikhonov regularization, integrated linear-based uncertainty quantification, options of integrated TCP/IP based parallel run management or external independent run management by use of a Version 2 update of the GENIE Version 1 software code, and utilities for global sensitivity analyses. The Version 3 code design is consistent with PEST++ Version 1 and continues to be designed to lower the barriers of entry for users as well as developers while providing efficient and optimized algorithms capable of accommodating large, highly parameterized inverse problems. As such, this effort continues the original focus of (1) implementing the most popular and powerful features of the PEST software suite in a fashion that is easy for novice or experienced modelers to use and (2) developing a software framework that is easy to extend.
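
    The GML algorithm at the heart of PEST++ iterates a damped normal-equations solve. A bare-bones sketch on a toy exponential-fitting problem, assuming unit observation weights, a fixed Marquardt lambda, and no Tikhonov regularization:

        import numpy as np

        def gml_step(jac, resid, lam):
            """One Gauss-Marquardt-Levenberg parameter upgrade:
            solve (J^T J + lam * I) dp = J^T r."""
            jtj = jac.T @ jac
            return np.linalg.solve(jtj + lam * np.eye(jtj.shape[0]), jac.T @ resid)

        # toy inverse problem: fit y = a * exp(-b * x) to noisy observations
        rng = np.random.default_rng(9)
        x = np.linspace(0.0, 4.0, 30)
        y = 3.0 * np.exp(-0.7 * x) + rng.normal(0.0, 0.05, x.size)

        p = np.array([2.0, 0.5])                        # initial guess for (a, b)
        for _ in range(100):
            model = p[0] * np.exp(-p[1] * x)
            jac = np.column_stack([np.exp(-p[1] * x), -p[0] * x * np.exp(-p[1] * x)])
            p += gml_step(jac, y - model, lam=1e-2)
        print(p)                                        # approaches (3.0, 0.7)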

  15. Assimilation of MODIS Snow Cover Through the Data Assimilation Research Testbed and the Community Land Model Version 4

    Science.gov (United States)

    Zhang, Yong-Fei; Hoar, Tim J.; Yang, Zong-Liang; Anderson, Jeffrey L.; Toure, Ally M.; Rodell, Matthew

    2014-01-01

    To improve snowpack estimates in the Community Land Model version 4 (CLM4), the Moderate Resolution Imaging Spectroradiometer (MODIS) snow cover fraction (SCF) was assimilated into CLM4 via the Data Assimilation Research Testbed (DART). The interface between CLM4 and DART is a flexible, extensible approach to land surface data assimilation. This data assimilation system has a large ensemble (80-member) atmospheric forcing that facilitates ensemble-based land data assimilation. We use 40 randomly chosen forcing members to drive 40 CLM members as a compromise between computational cost and data assimilation performance. The localization distance, a parameter in DART, was tuned to optimize the data assimilation performance at the global scale. Snow water equivalent (SWE) and snow depth are adjusted via the ensemble adjustment Kalman filter, particularly in regions with large SCF variability. The root-mean-square error of the forecast SCF against MODIS SCF is largely reduced. In DJF (December-January-February), the discrepancy between MODIS and CLM4 is broadly ameliorated in the lower-middle latitudes (23-45°N). Only minimal modifications are made in the higher-middle (45-66°N) and high latitudes, part of which is due to the agreement between model and observation when snow cover is nearly 100%. In some regions it is also revealed that CLM4-modeled snow cover lacks heterogeneous features compared to MODIS. In MAM (March-April-May), adjustments to snow move poleward, mainly due to the northward movement of the snowline (i.e., where the largest SCF uncertainty is and SCF assimilation has the greatest impact). The effectiveness of data assimilation also varies with vegetation types, with mixed performance over forest regions and consistently good performance over grass, which can partly be explained by the linearity of the relationship between SCF and SWE in the model ensembles. The updated snow depth was compared to the Canadian Meteorological
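
    The ensemble adjustment Kalman filter update used here can be written for a single observation in a few lines: shift and shrink the ensemble of predicted observations toward the posterior, then regress the observation-space increments onto each state variable. A sketch with a toy SWE-to-SCF observation operator and invented numbers:

        import numpy as np

        def eakf_update(state_ens, obs_ens, obs, obs_err_var):
            """Ensemble adjustment Kalman filter update for one observation;
            state_ens is the state to adjust (e.g., SWE), obs_ens the
            ensemble of model-predicted observations (e.g., SCF)."""
            prior_mean, prior_var = obs_ens.mean(), obs_ens.var(ddof=1)
            post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_err_var)
            post_mean = post_var * (prior_mean / prior_var + obs / obs_err_var)
            # deterministic shift-and-shrink of the predicted-observation ensemble
            new_obs_ens = post_mean + np.sqrt(post_var / prior_var) * (obs_ens - prior_mean)
            # regress observation increments onto the state variable
            cov = np.cov(state_ens, obs_ens, ddof=1)[0, 1]
            return state_ens + (cov / prior_var) * (new_obs_ens - obs_ens)

        rng = np.random.default_rng(10)
        swe = rng.normal(50.0, 10.0, 40)               # 40-member SWE ensemble (mm)
        scf = np.clip(swe / 80.0, 0.0, 1.0)            # toy SWE -> snow cover operator
        updated = eakf_update(swe, scf, obs=0.9, obs_err_var=0.01)
        print(swe.mean(), updated.mean())              # ensemble mean pulled upward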

  16. Accounting for observation uncertainties in an evaluation metric of low latitude turbulent air-sea fluxes: application to the comparison of a suite of IPSL model versions

    Science.gov (United States)

    Servonnat, Jérôme; Găinuşă-Bogdan, Alina; Braconnot, Pascale

    2017-09-01

    Turbulent momentum and heat (sensible heat and latent heat) fluxes at the air-sea interface are key components of the overall energetics of the Earth's climate. The evaluation of these fluxes in climate models is still difficult because of the large uncertainties associated with the reference products. In this paper we present an objective metric accounting for reference uncertainties to evaluate the annual cycle of the low-latitude turbulent fluxes of a suite of IPSL climate models. This metric consists of a Hotelling T² test between the simulated and observed fields in a reduced space characterized by the dominant modes of variability that are common to both the model and the reference, taking into account the observational uncertainty. The test is thus more severe when uncertainties are small, as is the case for sea surface temperature (SST). The results of the test show that for almost all variables and all model versions the model-reference differences are not zero. It is not possible to distinguish between model versions for sensible heat and meridional wind stress, certainly due to the large observational uncertainties. All model versions share similar biases for the different variables. There is no improvement between the reference versions of the IPSL model used for CMIP3 and CMIP5. The test also reveals that the higher horizontal resolution fails to improve the representation of the turbulent surface fluxes compared to the other versions. The representation of the fluxes is further degraded in a version with improved atmospheric physics, with an amplification of some of the biases in the Indian Ocean and in the intertropical convergence zone. The ranking of the model versions for the turbulent fluxes is not correlated with the ranking found for SST. This highlights that despite the fact that SST gradients are important for the large-scale atmospheric circulation patterns, other factors such as wind speed and air-sea temperature contrast play an
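
    The metric reduces to a Hotelling T² test on the coefficients of the model-minus-reference difference projected onto the leading common modes, with the reference covariance encoding observational uncertainty. A schematic implementation with invented dimensions and values:

        import numpy as np
        from scipy import stats

        def hotelling_t2(diff, cov, n):
            """Hotelling T^2 test of zero mean difference in a k-dimensional
            reduced space; cov is the covariance of the reference coefficients."""
            k = diff.size
            t2 = n * diff @ np.linalg.solve(cov, diff)
            f = t2 * (n - k) / (k * (n - 1))            # F-distributed under H0
            return t2, stats.f.sf(f, k, n - k)          # statistic and p-value

        rng = np.random.default_rng(11)
        n, k = 30, 5                                    # e.g., 30 years, 5 leading modes
        diff = rng.normal(0.3, 0.1, k)                  # projected model-reference difference
        cov = np.diag(rng.uniform(0.5, 1.5, k))         # reference-mode uncertainty
        print(hotelling_t2(diff, cov, n))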

  18. Assessment of two versions of regional climate model in simulating the Indian Summer Monsoon over South Asia CORDEX domain

    Science.gov (United States)

    Pattnayak, K. C.; Panda, S. K.; Saraswat, Vaishali; Dash, S. K.

    2017-07-01

    This study assesses the performance of two versions of the Regional Climate Model (RegCM) in simulating the Indian summer monsoon over South Asia for the period 1998 to 2003, with the aim of conducting future climate change simulations. Two sets of experiments were carried out with two different versions of RegCM (viz. RegCM4.2 and RegCM4.3), with the lateral boundary forcings provided from the European Centre for Medium-Range Weather Forecasts reanalysis (ERA-Interim) at 50 km horizontal resolution. The major updates in RegCM4.3 in comparison to the older version RegCM4.2 are the inclusion of measured solar irradiance in place of a hardcoded solar constant and additional layers in the stratosphere. The analysis shows that the Indian summer monsoon rainfall, moisture flux and surface net downward shortwave flux are better represented in RegCM4.3 than in the RegCM4.2 simulations. Excessive moisture flux in the RegCM4.2 simulation over the northern Arabian Sea and Peninsular India resulted in an overestimation of rainfall over the Western Ghats and the Peninsular region, as a result of which the all-India rainfall was overestimated. RegCM4.3 performed well over India as a whole, as well as over its four homogeneous rainfall zones, in reproducing the mean monsoon rainfall and the inter-annual variation of rainfall. Further, the monsoon onset, the low-level Somali Jet and the upper-level tropical easterly jet are better represented in RegCM4.3 than in RegCM4.2. Thus, RegCM4.3 performs better in simulating the mean summer monsoon circulation over South Asia. Hence, RegCM4.3 may be used to study future climate change over South Asia.

  19. Hypnotic drug risks of mortality, infection, depression, and cancer: but lack of benefit [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Daniel F. Kripke

    2016-05-01

    Full Text Available This is a review of hypnotic drug risks and benefits, reassessing and updating advice presented to the Commissioner of the Food and Drug Administration (United States FDA). Almost every month, new information appears about the risks of hypnotics (sleeping pills). This review includes new information on the growing USA overdose epidemic, eight new epidemiologic studies of hypnotics' mortality not available for previous compilations, and new emphasis on the risks of short-term hypnotic prescription. The most important risks of hypnotics include excess mortality, especially overdose deaths, quiet deaths at night, infections, cancer, depression and suicide, automobile crashes, falls, and other accidents, and hypnotic-withdrawal insomnia. The short-term use of one to two prescriptions is associated with greater risk per dose than long-term use. Hypnotics are usually prescribed without approved indication, most often with specific contraindications, but even when indicated, there is little or no benefit. The recommended doses objectively increase sleep little if at all, daytime performance is often made worse, not better, and the lack of general health benefits is commonly misrepresented in advertising. Treatments such as the cognitive behavioral treatment of insomnia and bright light treatment of circadian rhythm disorders might offer safer and more effective alternative approaches to insomnia.

  20. Landslide risk models for decision making.

    Science.gov (United States)

    Bonachea, Jaime; Remondo, Juan; de Terán, José Ramón Díaz; González-Díez, Alberto; Cendrero, Antonio

    2009-11-01

    This contribution presents a quantitative procedure for landslide risk analysis and zoning that considers hazard, exposure (the value of the elements at risk), and vulnerability. The method provides the means to obtain landslide risk models (expressing expected damage due to landslides on material elements and economic activities in monetary terms, according to different scenarios and periods) that are useful for identifying areas where mitigation efforts will be most cost-effective. It allows the identification of priority areas for implementing actions to reduce vulnerability (elements) or hazard (processes). The procedure can also be used as a preventive tool, through its application to strategic environmental impact analysis (SEIA) of land-use plans. The underlying hypothesis is that reliable predictions about hazard and risk can be made using models based on a detailed analysis of past landslide occurrences in connection with conditioning factors and data on past damage. The results show that the approach proposed and the hypothesis formulated are essentially correct, providing estimates of the order of magnitude of expected losses for a given time period. Uncertainties, strengths, and shortcomings of the procedure and the results obtained are discussed, and potential lines of research to improve the models are indicated. Finally, comments and suggestions are provided to generalize this type of analysis.
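    The record above expresses risk in monetary terms as the product of hazard, exposure, and vulnerability. A minimal sketch of that zoning arithmetic follows, with entirely hypothetical per-cell values; the paper's actual models are calibrated on past landslide occurrences and damage data.

        import numpy as np

        # Expected annual loss per map cell =
        #   hazard (annual probability of a landslide)
        #   x value of elements at risk (monetary exposure)
        #   x vulnerability (expected damage fraction, 0..1).
        hazard = np.array([[0.01, 0.05], [0.002, 0.03]])      # P(event)/year per cell
        value = np.array([[2.0e6, 5.0e5], [1.0e7, 0.0]])      # euros exposed per cell
        vulnerability = np.array([[0.4, 0.8], [0.1, 0.5]])    # damage fraction

        expected_annual_loss = hazard * value * vulnerability  # euros/year per cell
        priority = expected_annual_loss > 1.0e4                # cells worth mitigating first
        print(expected_annual_loss)
        print(priority)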

  1. Risk management model in road transport systems

    Science.gov (United States)

    Sakhapov, R. L.; Nikolaeva, R. V.; Gatiyatullin, M. H.; Makhmutov, M. M.

    2016-08-01

    The article presents the results of a study of road safety indicators that influence the development and operation of the transport system. Road safety is considered as a continuous process of risk management. The authors constructed a model that relates social risk to a major road safety indicator: the level of motorization. The model gives a fairly accurate assessment of the level of social risk for any given level of motorization. The authors also calculated the socio-economic costs of accidents and of the people injured in them. The concept of socio-economic damage is applicable because of the linear relationship between the natural and economic indicators of damage from accidents. The optimization of social risk reduces to finding the extremum of an objective function that characterizes the economic effect of implementing measures to improve safety. The calculations make it possible to maximize the net present value as a function of the costs of improving road safety, taking into account the socio-economic damage caused by accidents. The proposed econometric models make it possible to quantify the efficiency of the transportation system and to simulate changes in road safety indicators.

  2. Mathematical modelling of risk reduction in reinsurance

    Science.gov (United States)

    Balashov, R. B.; Kryanev, A. V.; Sliva, D. E.

    2017-01-01

    The paper presents a mathematical model of efficient portfolio formation in the reinsurance markets. The approach provides the optimal trade-off between the expected return and the risk of yields falling below a given level. The uncertainty in the return values stems from the use of expert evaluations and preliminary calculations, which yield expected return values and the corresponding risk levels. The proposed method allows the implementation of computationally simple schemes and algorithms for numerically calculating the structure of efficient portfolios of reinsurance contracts for a given insurance company.
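    As a rough illustration of the mean-risk trade-off described above (not the authors' algorithm), the sketch below searches portfolio weights that maximize expected return minus a penalty on below-target yields. The scenario generator, penalty weight, and contract parameters are all hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical return scenarios for three reinsurance contracts, e.g.
        # from expert evaluations (rows = scenarios, columns = contracts).
        returns = rng.normal(loc=[0.06, 0.10, 0.04], scale=[0.05, 0.12, 0.02],
                             size=(10_000, 3))

        target = 0.0  # yield level below which outcomes count as "risk"

        def score(w, penalty=5.0):
            """Expected portfolio return minus a penalty on below-target risk."""
            port = returns @ w
            shortfall = np.mean(np.maximum(target - port, 0.0))  # expected shortfall
            return port.mean() - penalty * shortfall

        # Crude search over the simplex of portfolio weights (weights sum to 1).
        best = max(((w1, w2, 1.0 - w1 - w2)
                    for w1 in np.linspace(0, 1, 21)
                    for w2 in np.linspace(0, 1 - w1, 21)),
                   key=lambda w: score(np.array(w)))
        print("weights:", np.round(best, 2))

    A real implementation would replace the grid search with one of the "computationally simple schemes" the paper refers to, but the objective has the same mean-versus-shortfall structure.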

  3. Public sector risk management: a specific model.

    Science.gov (United States)

    Lawlor, Ted

    2002-07-01

    Risk management programs for state mental health authorities are generally limited in scope and reactive in nature. Recent changes in how mental health care is provided render it necessary to redirect the risk management focus from its present institutional basis to a statewide, network-based paradigm that is integrated across public and private inpatient and community programs alike. These changes include treating an increasing number of individuals in less-secure settings and contracting for an increasing number of public mental health services with private providers. The model proposed here is closely linked to the Quality Management Process.

  4. The Marine Virtual Laboratory (version 2.1): enabling efficient ocean model configuration

    Science.gov (United States)

    Oke, Peter R.; Proctor, Roger; Rosebrock, Uwe; Brinkman, Richard; Cahill, Madeleine L.; Coghlan, Ian; Divakaran, Prasanth; Freeman, Justin; Pattiaratchi, Charitha; Roughan, Moninya; Sandery, Paul A.; Schaeffer, Amandine; Wijeratne, Sarath

    2016-09-01

    The technical steps involved in configuring a regional ocean model are analogous for all community models. All require the generation of a model grid, preparation and interpolation of topography, initial conditions, and forcing fields. Each task in configuring a regional ocean model is straightforward - but the process of downloading and reformatting data can be time-consuming. For an experienced modeller, the configuration of a new model domain can take as little as a few hours - but for an inexperienced modeller, it can take much longer. In pursuit of technical efficiency, the Australian ocean modelling community has developed the Web-based MARine Virtual Laboratory (WebMARVL). WebMARVL allows a user to quickly and easily configure an ocean general circulation or wave model through a simple interface, reducing the time to configure a regional model to a few minutes. Through WebMARVL, a user is prompted to define the basic options needed for a model configuration, including the model, run duration, spatial extent, and input data. Once all aspects of the configuration are selected, a series of data extraction, reprocessing, and repackaging services are run, and a "take-away bundle" is prepared for download. Building on the capabilities developed under Australia's Integrated Marine Observing System, WebMARVL also extracts all of the available observations for the chosen time-space domain. The user is able to download the take-away bundle and use it to run the model of his or her choice. Models supported by WebMARVL include three community ocean general circulation models and two community wave models. The model configuration from the take-away bundle is intended to be a starting point for scientific research. The user may subsequently refine the details of the model set-up to improve the model performance for the given application. In this study, WebMARVL is described along with a series of results from test cases comparing WebMARVL-configured models to observations.

  5. Department of Defense Data Model, Version 1, Fy 1998, Volume 8.

    Science.gov (United States)

    2007-11-02

    Appendix A: IDEF1X Modeling Conventions. 1.0 IDEF1X data modeling conventions: Whenever data structures and business rules required to support a functional area need to be specified, it is [...]. An entity must have an attribute or combination of attributes whose values uniquely identify each entity instance.

  6. Evaluation of the Community Multiscale Air Quality model version 5.1

    Science.gov (United States)

    The Community Multiscale Air Quality model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Atmospheric Modeling and Analysis Division (AMAD) of the U.S. Environment...

  7. Technical description of the RIVM/KNMI PUFF dispersion model. Version 4.0

    NARCIS (Netherlands)

    van Pul WAJ

    1992-01-01

    This report provides a technical description of the RIVM/KNMI PUFF model. The model may be used to calculate, given wind and rain field data, the dispersion of components emitted following an accident, emergency or calamity; the model area may be freely chosen to match the area of concern.

  8. Crop insurance: Risks and models of insurance

    Directory of Open Access Journals (Sweden)

    Čolović Vladimir

    2014-01-01

    Full Text Available The issue of crop protection is very important because of the variety of risks that could have serious consequences. One form of risk protection is insurance. The author describes various models of crop insurance in some EU countries and the systems by which states subsidize insurance premiums. The author also gives a picture of crop insurance in the U.S., noting that this country pays great attention to the matter. Crop insurance in Serbia, by contrast, is not at a high level. The main problem with crop insurance is not only the risks themselves but also the way protection is provided through insurance. The basic question that arises, and not only in the EU, is who will insure and protect crops. There are three possibilities: insurance companies under state control, insurance companies that are public-private partnerships, or private insurance companies operating on a purely commercial basis.

  9. Hypnotic drug risks of mortality, infection, depression, and cancer: but lack of benefit [version 2; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Daniel F. Kripke

    2017-03-01

    Full Text Available This is a review of hypnotic drug risks and benefits, reassessing and updating advice presented to the Commissioner of the Food and Drug Administration (United States FDA). Almost every month, new information appears about the risks of hypnotics (sleeping pills). The most important risks of hypnotics include excess mortality, especially overdose deaths, quiet deaths at night, infections, cancer, depression and suicide, automobile crashes, falls, and other accidents, and hypnotic-withdrawal insomnia. Short-term use of one to two prescriptions is associated with greater risk per dose than long-term use. Hypnotics have usually been prescribed without an approved indication, most often with specific contraindications, but even when indicated, there is little or no benefit. The recommended doses objectively increase sleep little if at all, daytime performance is often made worse rather than better, and the lack of general health benefits is commonly misrepresented in advertising. Treatments such as the cognitive behavioral treatment of insomnia and bright light treatment of circadian rhythm disorders offer safer and more effective alternative approaches to insomnia.

  10. Measuring psychological trauma in the workplace: psychometric properties of the Italian version of the psychological injury risk indicator-a cross-sectional study.

    Science.gov (United States)

    Magnavita, Nicola; Garbarino, Sergio; Winwood, Peter C

    2015-01-01

    The aim of this study was to cross-culturally adapt the Psychological Injury Risk Indicator (PIRI) and to validate its psychometric properties. Workers from 24 small companies were invited to self-complete the PIRI before undergoing their routine medical examination at the workplace. All participants (841 out of 845, 99.6%) were also asked to report occupational injuries and episodes of violence that had occurred at the workplace in the previous 12 months and were given the General Health Questionnaire (GHQ12) to complete. Exploratory factor analysis revealed a 4-factor structure, "sleep problems," "recovery failure," "posttraumatic stress symptoms," and "chronic fatigue," which were the same subscales observed in the original version. The internal consistency was excellent (alpha = 0.932). ROC curve analysis revealed that the PIRI was much more efficient than GHQ12 in diagnosing workers who had suffered trauma (workplace violence or injury) in the previous year, as it revealed an area under the curve (AUC) of 0.679 (95% CI: 0.625-0.734) for the PIRI, while for the GHQ12 the AUC was 0.551 (not significant). This study, performed on a large population of workers, provides evidence of the validity of the Italian version of the PIRI.

  11. Measuring Psychological Trauma in the Workplace: Psychometric Properties of the Italian Version of the Psychological Injury Risk Indicator—A Cross-Sectional Study

    Directory of Open Access Journals (Sweden)

    Nicola Magnavita

    2015-01-01

    Full Text Available Background. The aim of this study was to cross-culturally adapt the Psychological Injury Risk Indicator (PIRI) and to validate its psychometric properties. Methods. Workers from 24 small companies were invited to self-complete the PIRI before undergoing their routine medical examination at the workplace. All participants (841 out of 845, 99.6%) were also asked to report occupational injuries and episodes of violence that had occurred at the workplace in the previous 12 months and were given the General Health Questionnaire (GHQ12) to complete. Results. Exploratory factor analysis revealed a 4-factor structure, “sleep problems,” “recovery failure,” “posttraumatic stress symptoms,” and “chronic fatigue,” which were the same subscales observed in the original version. The internal consistency was excellent (alpha = 0.932). ROC curve analysis revealed that the PIRI was much more efficient than the GHQ12 in diagnosing workers who had suffered trauma (workplace violence or injury) in the previous year, as it revealed an area under the curve (AUC) of 0.679 (95% CI: 0.625–0.734) for the PIRI, while for the GHQ12 the AUC was 0.551 (not significant). Conclusions. This study, performed on a large population of workers, provides evidence of the validity of the Italian version of the PIRI.

  12. A "CLIPER" Risk Model for Insured Losses From US Hurricane Landfalls and the Need for an Open-Source Risk Model

    Science.gov (United States)

    Murnane, R. J.

    2003-12-01

    To plan for the consequences of hurricanes, earthquakes, and other natural hazards, the public and private sectors use a variety of risk models. Model output is tailored for specific users and includes a range of parameters, including damage to structures, insured losses, and estimates of shelter requirements for people displaced by the catastrophe. Extensive efforts are made to tune risk models to past events. However, model "forecasts" of losses are rarely verified through comparison with new events. Instead, new events generally are used to further tune a new version of the model. In addition, there has been no public attempt to determine which model has the most predictive skill, in part because there is no agreed-upon reference forecast, and in part because most risk models are proprietary. Here I describe a simple risk model that can be used to provide deterministic and probabilistic exceedance probabilities for insured losses caused by hurricanes striking the US coastline. I propose that loss estimates based on the approach used in this simple model can serve as a reference forecast for assessing the skill of more complex commercial models. I also suggest that an effort be initiated to promote the development of an open-source risk model. The simple risk model combines wind speed exceedance probabilities, estimated using the historical record of maximum sustained winds for hurricanes at landfall, with a set of normalized insured losses produced by landfalling hurricanes. The approach is analogous to weather, or climate, forecasts based on a combination of CLImatology and PERsistence (CLIPER). The climatological component accounts for low-frequency variability in weather due to factors such as seasonality. The analog to climatology in the simple risk model is the historical record of hurricane wind speeds and insured losses. The insured losses have been corrected for the effects of inflation, population increases, wealth, and other factors.
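    A CLIPER-style reference forecast of the kind sketched above reduces to counting historical exceedances. A minimal illustration, with hypothetical normalized losses and record length standing in for the adjusted loss catalogue the abstract describes:

        import numpy as np

        # Hypothetical normalized insured losses (billions USD) for historical
        # landfalls, and the record length in years.
        losses = np.array([0.2, 0.5, 1.1, 2.3, 4.0, 7.5, 12.0, 26.5])
        record_years = 100.0

        def annual_exceedance_prob(threshold):
            """Rate of historical events exceeding a loss threshold, converted
            to an annual exceedance probability (assuming Poisson arrivals)."""
            rate = np.sum(losses >= threshold) / record_years  # events per year
            return 1.0 - np.exp(-rate)

        for thr in (1.0, 5.0, 10.0, 25.0):
            print(f"P(annual loss >= {thr:>4} bn) = {annual_exceedance_prob(thr):.3f}")

    A commercial model would have to beat this climatological baseline to demonstrate forecast skill, in the same way NWP forecasts are scored against CLIPER.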

  13. A cloud feedback emulator (CFE, version 1.0) for an intermediate complexity model

    Science.gov (United States)

    Ullman, David J.; Schmittner, Andreas

    2017-02-01

    The dominant source of inter-model differences among comprehensive global climate models (GCMs) is cloud radiative effects on Earth's energy budget. Intermediate complexity models, while able to run more efficiently, often lack cloud feedbacks. Here, we describe and evaluate a method for applying GCM-derived shortwave and longwave cloud feedbacks from 4 × CO2 and Last Glacial Maximum experiments to the University of Victoria Earth System Climate Model. The method generally captures the spread in top-of-the-atmosphere radiative feedbacks between the original GCMs, which impacts the magnitude and spatial distribution of surface temperature changes and climate sensitivity. These results suggest that the method is suitable for incorporating multi-model cloud feedback uncertainties in ensemble simulations with a single intermediate complexity model.
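    A minimal sketch of the emulator idea, under the assumption that it can be summarized as imposing GCM-derived feedback patterns (in W m-2 K-1) on the host model's simulated warming; the random fields below are placeholders for actual GCM-derived diagnostics, and the grid shape is arbitrary.

        import numpy as np

        rng = np.random.default_rng(2)
        lam_sw = rng.normal(0.3, 0.2, size=(96, 144))   # shortwave cloud feedback pattern
        lam_lw = rng.normal(0.2, 0.1, size=(96, 144))   # longwave cloud feedback pattern
        delta_T = rng.normal(3.0, 0.5, size=(96, 144))  # host-model warming, K

        # Extra cloud radiative perturbation handed back to the host model:
        delta_R_cloud = (lam_sw + lam_lw) * delta_T      # W m-2 per grid cell
        print("global-mean cloud forcing adjustment:", delta_R_cloud.mean())

    Swapping in feedback patterns from different GCMs then propagates the inter-model cloud spread through a single intermediate complexity model.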

  14. Soil Erosion Risk Assessment and Modelling

    Science.gov (United States)

    Fister, Wolfgang; Kuhn, Nikolaus J.; Heckrath, Goswin

    2013-04-01

    Soil erosion is a phenomenon with relevance for many research topics in the geosciences. Consequently, PhD students from many different backgrounds encounter soil-erosion-related questions during their research. These students require a compact but detailed introduction to erosion processes and the risks associated with erosion, as well as tools to assess and study erosion-related questions, ranging from simple risk assessment to the effects of climate change on erosion-related geochemistry at various scales. The PhD course on Soil Erosion Risk Assessment and Modelling, offered by the University of Aarhus and conducted jointly with the University of Basel, is aimed at graduate students with degrees in the geosciences and a PhD research topic linked to soil erosion. The course offers a unique introduction to erosion processes, conventional risk assessment, and field-truthing of results. This is achieved by combining lectures, mapping, erosion experiments, and GIS-based erosion modelling. A particular mark of the course design is the direct link between the results of each part of the course activities. This ensures a holistic understanding of erosion in the environment as a key learning outcome.

  15. Feasibility Risk Assessment of Transport Infrastructure Projects: The CBA-DK Decision Support Model

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Banister, David

    2010-01-01

    This paper presents the final version of the CBA-DK decision support model for the assessment of transport projects. The model makes use of conventional cost-benefit analysis, resulting in aggregated single point estimates, and quantitative risk analysis using Monte Carlo simulation, resulting in interval results, and the determination of suitable probability distributions. Use is made of reference class forecasting information, such as that developed in Optimism Bias, for adjustments to investment decisions that relate to all modes of transport. The CBA-DK decision support model results in more...

  16. Global assessment of Vegetation Index and Phenology Lab (VIP) and Global Inventory Modeling and Mapping Studies (GIMMS) version 3 products

    Directory of Open Access Journals (Sweden)

    M. Marshall

    2015-06-01

    Full Text Available Earth observation based long-term global vegetation index products are used by scientists from a wide range of disciplines concerned with global change. Inter-comparison studies are commonly performed to keep the user community informed on the consistency and accuracy of such records as they evolve. In this study, we compared two new records: (1) Global Inventory Modeling and Mapping Studies (GIMMS) Normalized Difference Vegetation Index Version 3 (NDVI3g) and (2) Vegetation Index and Phenology Lab (VIP) Version 3 NDVI (NDVI3v) and Enhanced Vegetation Index 2 (EVI3v). We evaluated the two records via three experiments that addressed the primary use of such records in global change research: (1) prediction of the Leaf Area Index (LAI) used in light-use efficiency modeling, (2) estimation of vegetation climatology in Soil-Vegetation-Atmosphere Transfer models, and (3) trend analysis of the magnitude and phenology of vegetation productivity. Experiment one, unlike previous inter-comparison studies, was performed with a unique Landsat 30 m spatial resolution and in situ LAI database for major crop types on five continents. Overall, the two records showed a high level of agreement both in direction and magnitude on a monthly basis, though VIP values were higher and more variable and showed lower correlations and higher error with in situ LAI. The records were most consistent at northern latitudes during the primary growing season and at southern latitudes and in the tropics throughout much of the year, while the records were less consistent at northern latitudes during green-up and senescence and in the great deserts of the world throughout much of the year. The two records were also highly consistent in terms of trend direction/magnitude, showing a 30+ year increase (decrease) in NDVI over much of the globe (tropical rainforests). The two records were less consistent in terms of timing due to the poor correlation of the records during the start and end of the growing season.

  17. Hydrogen Macro System Model User Guide, Version 1.2.1

    Energy Technology Data Exchange (ETDEWEB)

    Ruth, M.; Diakov, V.; Sa, T.; Goldsby, M.; Genung, K.; Hoseley, R.; Smith, A.; Yuzugullu, E.

    2009-07-01

    The Hydrogen Macro System Model (MSM) is a simulation tool that links existing and emerging hydrogen-related models to perform rapid, cross-cutting analysis. It allows analysis of the economics, primary energy-source requirements, and emissions of hydrogen production and delivery pathways.

  18. Human Plague Risk: Spatial-Temporal Models

    Science.gov (United States)

    Pinzon, Jorge E.

    2010-01-01

    This chapter reviews the use of spatial-temporal models in identifying potential risks of plague outbreaks in the human population. Using Earth observations from satellite remote sensing, there has been a systematic analysis and mapping of the close coupling between the vectors of the disease and climate variability. The overall result is that the incidence of plague is correlated with the positive phase of the El Niño/Southern Oscillation (ENSO).

  19. Graphical models and Bayesian domains in risk modelling: application in microbiological risk assessment.

    Science.gov (United States)

    Greiner, Matthias; Smid, Joost; Havelaar, Arie H; Müller-Graf, Christine

    2013-05-15

    Quantitative microbiological risk assessment (QMRA) models are used to reflect knowledge about complex real-world scenarios for the propagation of microbiological hazards along the feed and food chain. The aim is to provide insight into interdependencies among model parameters, typically with an interest in characterising the effect of risk mitigation measures. A particular requirement is to achieve clarity about the reliability of conclusions from the model in the presence of uncertainty. To this end, Monte Carlo (MC) simulation modelling has become a standard in so-called probabilistic risk assessment. In this paper, we elaborate on the application of Bayesian computational statistics in the context of QMRA. It is useful to explore the analogy between MC modelling and Bayesian inference (BI). This pertains in particular to the procedures for deriving prior distributions for model parameters. We illustrate using a simple example that the inability to cope with feedback among model parameters is a major limitation of MC modelling. However, BI models can be easily integrated into MC modelling to overcome this limitation. We refer to a BI submodel integrated into an MC model as a "Bayes domain". We also demonstrate that an entire QMRA model can be formulated as a Bayesian graphical model (BGM) and discuss the advantages of this approach. Finally, we show example graphs of MC, BI and BGM models, highlighting the similarities among the three approaches.
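    The "Bayes domain" idea can be illustrated with the simplest possible case: a conjugate Bayesian update of one parameter feeding a Monte Carlo exposure model. Everything below (the survey counts, the dose model, the parameter values) is a hypothetical example, not the paper's worked case.

        import numpy as np

        rng = np.random.default_rng(1)

        # Bayes domain: conjugate beta-binomial update of a contamination
        # prevalence from hypothetical survey data (12 positives in 100 samples).
        alpha0, beta0 = 1.0, 1.0          # uniform prior
        positives, n_samples = 12, 100
        post_alpha = alpha0 + positives
        post_beta = beta0 + n_samples - positives

        # MC domain: propagate posterior prevalence draws through a simple
        # dose-response model for one serving.
        n_mc = 100_000
        prevalence = rng.beta(post_alpha, post_beta, n_mc)    # posterior draws
        dose = rng.lognormal(mean=2.0, sigma=0.8, size=n_mc)  # CFU per serving
        p_ill_given_dose = 1.0 - np.exp(-1e-3 * dose)         # exponential model
        risk_per_serving = prevalence * p_ill_given_dose
        print("mean risk per serving:", risk_per_serving.mean())

    In a full BGM formulation the data would update the parameter and, through feedback, other linked parameters simultaneously, which is exactly what plain forward MC sampling cannot do.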

  20. Predictive validity and reliability of the Turkish version of the risk assessment pressure sore scale in intensive care patients: results of a prospective study.

    Science.gov (United States)

    Günes, Ülkü Yapucu; Efteli, Elçin

    2015-04-01

    Multiple pressure ulcer (PU) risk assessment instruments have been developed and tested, but there is no general consensus on which instrument to use for specific patient populations and care settings. The purpose of this study was to determine the reliability and predictive validity of the Turkish version of the Risk Assessment Pressure Sore (RAPS) instrument, which includes 12 variables (5 from the modified Norton Scale, 3 from the Braden Scale, and 3 from other research results), for use in intensive care unit (ICU) patients. The English version of the RAPS instrument was translated into Turkish and tested for internal consistency and predictive validity (sensitivity, specificity, positive predictive value, and negative predictive value) using a convenience sample of 122 patients consecutively admitted to an ICU in Turkey. The patients were assessed within 24 hours of admission and thereafter once a week until the development of a PU or discharge from the unit. The incidence of PUs in this population was 23%. The majority of ulcers that developed were Stage I. The internal consistency of the RAPS tool was adequate (Cronbach's α = 0.81). The best balance between sensitivity and specificity for ICU patients was reached at a cut-off point of ≤ 27 (i.e., sensitivity = 74.2%, specificity = 31.8%, positive predictive value = 38.7%, and negative predictive value = 91.3%). This is lower than the cut-off point reported in other studies of the RAPS scale. In this population of ICU patients, the RAPS scale was found to have acceptable reliability but poor validity. Additional studies to evaluate the predictive validity and reliability of the RAPS scale in other patient populations and care settings are needed.
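    For reference, all four cut-off statistics quoted above derive from a 2x2 screening table. A generic helper, shown with made-up counts rather than the study's data:

        def screening_metrics(tp, fp, fn, tn):
            """Sensitivity, specificity, PPV and NPV from a 2x2 screening table."""
            return {
                "sensitivity": tp / (tp + fn),   # true positives / all diseased
                "specificity": tn / (tn + fp),   # true negatives / all healthy
                "ppv": tp / (tp + fp),           # P(disease | positive test)
                "npv": tn / (tn + fn),           # P(no disease | negative test)
            }

        # Hypothetical counts for some risk scale at some cut-off:
        print(screening_metrics(tp=40, fp=20, fn=10, tn=30))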

  1. PhytoSFDM version 1.0.0: Phytoplankton Size and Functional Diversity Model

    Science.gov (United States)

    Acevedo-Trejos, Esteban; Brandt, Gunnar; Smith, S. Lan; Merico, Agostino

    2016-11-01

    Biodiversity is one of the key mechanisms that facilitate the adaptive response of planktonic communities to a fluctuating environment. How to allow for such a flexible response in marine ecosystem models is, however, not entirely clear. One particular way is to resolve the natural complexity of phytoplankton communities by explicitly incorporating a large number of species or plankton functional types. Alternatively, models of aggregate community properties focus on macroecological quantities such as total biomass, mean trait, and trait variance (or functional trait diversity), thus reducing the observed natural complexity to a few mathematical expressions. We developed the PhytoSFDM modelling tool, which can resolve species discretely and can capture aggregate community properties. The tool also provides a set of methods for treating diversity under realistic oceanographic settings. This model is coded in Python and is distributed as open-source software. PhytoSFDM is implemented in a zero-dimensional physical scheme and can be applied to any location of the global ocean. We show that aggregate community models reduce computational complexity while preserving relevant macroecological features of phytoplankton communities. Compared to species-explicit models, aggregate models are more manageable in terms of number of equations and have faster computational times. Further developments of this tool should address the caveats associated with the assumptions of aggregate community models and about implementations into spatially resolved physical settings (one-dimensional and three-dimensional). With PhytoSFDM we embrace the idea of promoting open-source software and encourage scientists to build on this modelling tool to further improve our understanding of the role that biodiversity plays in shaping marine ecosystems.

  2. Parameterization Improvements and Functional and Structural Advances in Version 4 of the Community Land Model

    Directory of Open Access Journals (Sweden)

    Andrew G. Slater

    2011-05-01

    Full Text Available The Community Land Model is the land component of the Community Climate System Model. Here, we describe a broad set of model improvements and additions that have been provided through the CLM development community to create CLM4. The model is extended with a carbon-nitrogen (CN) biogeochemical model that is prognostic with respect to vegetation, litter, and soil carbon and nitrogen states and vegetation phenology. An urban canyon model is added, and a transient land cover and land use change (LCLUC) capability, including wood harvest, is introduced, enabling study of the effects of historic and future LCLUC on energy, water, momentum, carbon, and nitrogen fluxes. The hydrology scheme is modified with a revised numerical solution of the Richards equation and a revised ground evaporation parameterization that accounts for litter and within-canopy stability. The new snow model incorporates the SNow and Ice Aerosol Radiation model (SNICAR), which includes aerosol deposition, grain-size-dependent snow aging, and vertically resolved snowpack heating, as well as new snow cover and snow burial fraction parameterizations. The thermal and hydrologic properties of organic soil are accounted for, and the ground column is extended to ~50 m depth. Several other minor modifications to the land surface types dataset, grass and crop optical properties, atmospheric forcing height, roughness length and displacement height, and the disposition of snow-capped runoff are also incorporated. Taken together, these augmentations to CLM result in improved soil moisture dynamics, drier soils, and stronger soil moisture variability. The new model also exhibits higher snow cover, cooler soil temperatures in organic-rich soils, greater global river discharge, and lower albedos over forests and grasslands, all of which are improvements compared to CLM3.5. When CLM4 is run with CN, the mean biogeophysical simulation is slightly degraded because the vegetation structure is prognostic rather than prescribed.

  3. A Scalable Version of the Navy Operational Global Atmospheric Prediction System Spectral Forecast Model

    Directory of Open Access Journals (Sweden)

    Thomas E. Rosmond

    2000-01-01

    Full Text Available The Navy Operational Global Atmospheric Prediction System (NOGAPS) includes a state-of-the-art spectral forecast model similar to models run at several major operational numerical weather prediction (NWP) centers around the world. The model, developed by the Naval Research Laboratory (NRL) in Monterey, California, has run operationally at the Fleet Numerical Meteorological and Oceanographic Center (FNMOC) since 1982, and most recently has been run on a Cray C90 in a multi-tasked configuration. Typically the multi-tasked code runs on 10 to 15 processors with an overall parallel efficiency of about 90%. The current operational resolution is T159L30, but other operational and research applications run at significantly lower resolutions. A scalable NOGAPS forecast model has been developed by NRL in anticipation of a FNMOC C90 replacement in about 2001, as well as for current NOGAPS research requirements to run on DOD High-Performance Computing (HPC) scalable systems. The model is designed to run with message passing (MPI). Model design criteria include bit reproducibility for different processor numbers and reasonably efficient performance on fully shared memory, distributed memory, and distributed shared memory systems for a wide range of model resolutions. Results for a wide range of processor numbers, model resolutions, and different vendor architectures are presented. Single-node performance has been disappointing on RISC-based systems, at least compared to vector processor performance. This is a common complaint, and will require careful re-examination of traditional NWP model software design and data organization to fully exploit future scalable architectures.

  4. Statistical analysis of fracture data, adapted for modelling Discrete Fracture Networks-Version 2

    Energy Technology Data Exchange (ETDEWEB)

    Munier, Raymond

    2004-04-01

    The report describes the parameters that are necessary for DFN modelling, the way in which they can be extracted from the database acquired during site investigations, and their assignment to geometrical objects in the geological model. The purpose here is to present a methodology for use in SKB modelling projects. Though the methodology is deliberately tuned to facilitate subsequent DFN modelling with other tools, some of the recommendations presented here are applicable to other aspects of geo-modelling as well. For instance, we recommend a nomenclature to be used within SKB modelling projects, which are truly multidisciplinary, to ease communication between scientific disciplines and avoid misunderstanding of common concepts. This report originally appeared as an appendix to a strategy report for geological modelling (SKB-R--03-07). Strategy reports were intended to be successively updated to include experience gained during site investigations and site modelling. Rather than updating the entire strategy report, we chose to present the update of the appendix as a stand-alone document. This document thus replaces Appendix A2 in SKB-R--03-07. In short, the update consists of the following: the target audience, and as a consequence the purpose of the document, has been broadened; errors found in various formulae have been corrected; all expressions have been rewritten; more worked examples are included in each section; new sections describe area normalisation, spatial correlation, and anisotropy; and a new chapter describes the expected output from DFN modelling within SKB projects.
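    To make the DFN parameters concrete, the sketch below samples fracture radii from a truncated power law and pole deviations from a Fisher distribution, two distribution families commonly used in stochastic DFN modelling. The parameter values and function names are illustrative assumptions, not values from the report.

        import numpy as np

        rng = np.random.default_rng(3)

        def sample_radii(n, r_min=0.05, r_max=565.0, k_r=2.6):
            """Truncated power-law (Pareto) radii via inverse-transform sampling."""
            u = rng.uniform(size=n)
            a, b = r_min ** (1 - k_r), r_max ** (1 - k_r)
            return (a + u * (b - a)) ** (1.0 / (1 - k_r))

        def sample_fisher_poles(n, kappa=15.0):
            """Deviation angles (radians) of fracture poles from a mean pole,
            drawn from a Fisher distribution with concentration kappa."""
            u = rng.uniform(size=n)
            return np.arccos(1.0 + np.log(u + (1 - u) * np.exp(-2 * kappa)) / kappa)

        print(sample_radii(3))
        print(sample_fisher_poles(3))

    Fracture intensity (e.g. a P32 area-per-volume target) would then control how many such fractures are generated per simulation volume.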

  5. The Model of Emissions of Gases and Aerosols from Nature version 2.1 (MEGAN2.1: an extended and updated framework for modeling biogenic emissions

    Directory of Open Access Journals (Sweden)

    A. B. Guenther

    2012-06-01

    Full Text Available The Model of Emissions of Gases and Aerosols from Nature version 2.1 (MEGAN2.1) is a modeling framework for estimating fluxes of 147 biogenic compounds between terrestrial ecosystems and the atmosphere, using simple mechanistic algorithms to account for the major known processes controlling biogenic emissions. It is available as an offline code and has also been coupled into land surface models and atmospheric chemistry models. MEGAN2.1 is an update of the previous versions, including MEGAN2.0 for isoprene emissions and MEGAN2.04, which estimates emissions of 138 compounds. Isoprene comprises about half of the estimated total global biogenic volatile organic compound (BVOC) emission of 1 Pg (1000 Tg, or 10^15 g). Another 10 compounds, including methanol, ethanol, acetaldehyde, acetone, α-pinene, β-pinene, t-β-ocimene, limonene, ethene, and propene, together contribute another 30% of the estimated emission. An additional 20 compounds (mostly terpenoids) are associated with another 17% of the total emission, with the remaining 3% distributed among 125 compounds. Emissions of 41 monoterpenes and 32 sesquiterpenes together comprise about 15% and 3%, respectively, of the total global BVOC emission. Tropical trees cover about 18% of the global land surface and are estimated to be responsible for 60% of terpenoid emissions and 48% of other VOC emissions. Other trees cover about the same area but are estimated to contribute only about 10% of total emissions. The magnitude of the emissions estimated with MEGAN2.1 is within the range of estimates reported using other approaches, and much of the difference between reported values can be attributed to land cover and meteorological driving variables. The offline version of the MEGAN2.1 source code and driving variables is available from http://acd.ucar.edu/~guenther/MEGAN/MEGAN.htm.

  6. Description of the Mountain Cloud Chemistry Program version of the PLUVIUS MOD 5.0 reactive storm simulation model

    Energy Technology Data Exchange (ETDEWEB)

    Luecken, D.J.; Whiteman, C.D.; Chapman, E.G.; Andrews, G.L.; Bader, D.C.

    1987-07-01

    Damage to forest ecosystems on mountains in the eastern United States has prompted a study conducted for the US Environmental Protection Agency's Mountain Cloud Chemistry Program (MCCP). This study has led to the development of a numerical model called MCCP PLUVIUS, which has been used to investigate the chemical transformations and cloud droplet deposition in shallow, nonprecipitating orographic clouds. The MCCP PLUVIUS model was developed as a specialized version of the existing PLUVIUS MOD 5.0 reactive storm model. It is capable of simulating aerosol scavenging, nonreactive gas scavenging, aqueous-phase SO2 reactions, and cloud water deposition. A description of the new model is provided along with information on model inputs and outputs, as well as suggestions for its further development. The MCCP PLUVIUS incorporates a new method to determine the depth of the layer of air that flows over a mountaintop to produce an orographic cloud event. It provides a new method for calculating hydrogen ion concentrations, and provides updated expressions and values for solubility, dissociation and reaction rate constants.

  7. Groundwater Risk Assessment Model (GRAM): Groundwater Risk Assessment Model for Wellfield Protection

    Directory of Open Access Journals (Sweden)

    Nara Somaratne

    2013-09-01

    Full Text Available A groundwater risk assessment was carried out for 30 potable water supply systems under a framework of protecting drinking water quality across South Australia. A semi-quantitative Groundwater Risk Assessment Model (GRAM) was developed based on a "multi-barrier" approach, using a risk equation combining likelihood of release, contaminant pathway, and consequence. Groundwater vulnerability and well integrity have been incorporated into the pathway component of the risk equation. The land use of the study basins varies from protected water reserves to heavily stocked grazing lands. Based on the risk assessment, 15 systems were considered low risk, four medium risk, and 11 high risk. The GRAM risk levels were comparable with detection of the indicator bacteria (total coliforms). Most high-risk systems were the result of poor well construction and casing corrosion rather than land use. We carried out risk management actions, including changes to well designs and well operational practices, designs to increase residence time, and setting the production zone below identified low-permeability zones to provide additional barriers to contaminants. The highlight of the risk management element is the well integrity testing using downhole geophysical methods and camera views of the casing condition.
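    A minimal sketch of the multi-barrier risk equation as described (likelihood x pathway x consequence, with vulnerability and well integrity entering the pathway term). The scoring ranges and risk bands below are illustrative, not GRAM's calibrated values.

        def gram_style_risk(likelihood, vulnerability, well_integrity, consequence):
            """Combine the three risk-equation terms into a score and a band.

            likelihood, consequence: ordinal scores, 1 (low) to 5 (high);
            vulnerability, well_integrity: 0 (intact barrier) to 1 (failed barrier).
            """
            pathway = vulnerability * well_integrity   # contaminant pathway term
            score = likelihood * pathway * consequence
            if score >= 10.0:
                band = "high"
            elif score >= 4.0:
                band = "medium"
            else:
                band = "low"
            return score, band

        # A corroded, poorly constructed well in a stocked catchment scores high:
        print(gram_style_risk(likelihood=4, vulnerability=0.9,
                              well_integrity=0.8, consequence=5))

    The multiplicative pathway term captures the study's main finding: an intact well casing (well_integrity near 0) suppresses the risk score even where land-use likelihood is high.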

  8. Risk Assessment Model for Mobile Malware

    Directory of Open Access Journals (Sweden)

    George Stanescu

    2015-03-01

    Full Text Available Mobile technology is considered the fastest-developing area of IT security. In the last year alone, security threats around mobile devices have reached new heights in terms of both quality and quantity. The speed of this development has made possible several types of security attacks that, until recently, were only possible on computers. Among the most targeted mobile operating systems, Android continues to be the most vulnerable, although Google has introduced new ways of strengthening its security model. The aim of this article is to provide a model for assessing the risk of mobile infection with malware, starting from a statistical analysis of the permissions required by each application installed on the mobile system. The software implementation of this model uses the Android operating system, so we begin by analyzing its permission-based security architecture. Furthermore, based on statistical data regarding the most dangerous permissions, we build the risk assessment model and, to prove its efficiency, we scan some of the most popular apps and interpret the results. Finally, we offer an overview of the strengths and weaknesses of this permission-based model and state a short conclusion regarding the model's efficiency.
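    One simple way to turn permission statistics into a risk score, in the spirit of the model described above (though not necessarily the authors' exact scoring rule), is to weight each requested permission by how over-represented it is in malware relative to benign apps. The frequencies below are made up for the sketch.

        import math

        # Illustrative permission frequencies in malware vs. benign apps; a real
        # model would estimate these from the statistical analysis described.
        MALWARE_FREQ = {"SEND_SMS": 0.60, "READ_CONTACTS": 0.40,
                        "RECEIVE_BOOT_COMPLETED": 0.50, "INTERNET": 0.95}
        BENIGN_FREQ = {"SEND_SMS": 0.05, "READ_CONTACTS": 0.20,
                       "RECEIVE_BOOT_COMPLETED": 0.15, "INTERNET": 0.90}

        def permission_risk_score(requested):
            """Sum of log-likelihood ratios over an app's requested permissions;
            higher scores mean the permission mix looks more malware-like."""
            return sum(math.log(MALWARE_FREQ[p] / BENIGN_FREQ[p])
                       for p in requested if p in MALWARE_FREQ)

        print(permission_risk_score(["INTERNET", "SEND_SMS", "READ_CONTACTS"]))

    Thresholding the score then classifies an app as low, medium, or high risk before installation.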

  9. FAME: Friendly Applied Modelling Environment. Version 2.2 User Manual

    NARCIS (Netherlands)

    Wortelboer FG; Aldenberg T

    1989-01-01

    FAME (Friendly Applied Modelling Environment) is a general modelling environment developed for the dynamic simulation of water quality models. The models are described as sets of differential equations, using a general notation.

  10. Technical manual for basic version of the Markov chain nest productivity model (MCnest)

    Science.gov (United States)

    The Markov Chain Nest Productivity Model (or MCnest) integrates existing toxicity information from three standardized avian toxicity tests with information on species life history and the timing of pesticide applications relative to the timing of avian breeding seasons to quantit...

  11. User’s manual for basic version of MCnest Markov chain nest productivity model

    Science.gov (United States)

    The Markov Chain Nest Productivity Model (or MCnest) integrates existing toxicity information from three standardized avian toxicity tests with information on species life history and the timing of pesticide applications relative to the timing of avian breeding seasons to quantit...

  12. EQRM: An open-source event-based earthquake risk modeling program

    Science.gov (United States)

    Robinson, D. J.; Dhu, T.; Row, P.

    2007-12-01

    Geoscience Australia's Earthquake Risk Model (EQRM) is an event-based tool for earthquake scenario ground motion and scenario loss modeling, as well as probabilistic seismic hazard (PSHA) and risk (PSRA) modeling. It has been used to conduct PSHA and PSRA for many of Australia's largest cities, and it has become an important tool for the emergency management community, which uses it for scenario response planning. It has the potential to link with earthquake monitoring programs to provide automatic loss estimates from network-recorded events. An open-source alpha-release version of the software is freely available on SourceForge. It can be used for hazard or risk analyses in any region of the world by supplying appropriately formatted input files. Source code is also supplied, so advanced users can modify individual components to suit their needs.

  13. Illustrating and homology modeling the proteins of the Zika virus [version 2; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Sean Ekins

    2016-09-01

    Full Text Available The Zika virus (ZIKV) is a flavivirus of the family Flaviviridae, similar to dengue virus, yellow fever virus, and West Nile virus. Recent outbreaks in South America, Latin America, the Caribbean, and in particular Brazil have led to concern over the spread of the disease and its potential to cause Guillain-Barré syndrome and microcephaly. Although ZIKV has been known of for over 60 years, there is very little knowledge of the virus, with few publications and no crystal structures. No antivirals have been tested against it either in vitro or in vivo. ZIKV therefore epitomizes a neglected disease. Several steps have been proposed that could be taken to initiate ZIKV antiviral drug discovery, using both high-throughput screens and structure-based design based on homology models for the key proteins. We now describe preliminary homology models created for NS5, FtsJ, NS4B, NS4A, HELICc, DEXDc, peptidase S7, NS2B, NS2A, NS1, E stem, glycoprotein M, propeptide, capsid, and glycoprotein E using SWISS-MODEL. Eleven out of 15 models pass our model quality criteria for further use. While a ZIKV glycoprotein E homology model was initially described in the immature conformation as a trimer, we now describe the mature dimer conformer, which allowed the construction of an illustration of the complete virion. By comparing illustrations of ZIKV based on this new homology model and the dengue virus crystal structure, we propose potential differences that could be exploited for antiviral and vaccine design. The prediction of glycosylation sites on this protein may also be useful in this regard. While we await a cryo-EM structure of ZIKV and eventual crystal structures of the individual proteins, these homology models provide the community with a starting point for structure-based design of drugs and vaccines, as well as for computational virtual screening.

  14. Functional Risk Modeling for Lunar Surface Systems

    Science.gov (United States)

    Thomson, Fraser; Mathias, Donovan; Go, Susie; Nejad, Hamed

    2010-01-01

    We introduce an approach to risk modeling that we call functional modeling, which we have developed to estimate the capabilities of a lunar base. The functional model tracks the availability of functions provided by systems, in addition to the operational state of those systems' constituent strings. By tracking functions, we are able to identify cases where identical functions are provided by elements (rovers, habitats, etc.) that are connected together on the lunar surface. We credit functional diversity in those cases, and in doing so compute more realistic estimates of operational mode availabilities. The functional modeling approach yields more realistic estimates of the availability of the various operational modes provided to astronauts by the ensemble of surface elements included in a lunar base architecture. By tracking functional availability, the effect of diverse backup, which often exists when two or more independent elements are connected together, is properly accounted for.
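    The functional-diversity credit has a simple probabilistic core: a function is available if at least one element providing it is up. A minimal sketch, with hypothetical availabilities (the actual model also tracks string states and operational modes):

        def functional_availability(element_availabilities):
            """P(function available), given independent providers of the function."""
            p_all_down = 1.0
            for a in element_availabilities:
                p_all_down *= (1.0 - a)   # every provider must fail simultaneously
            return 1.0 - p_all_down

        # Two connected elements (say, a habitat and a rover) each provide life
        # support with 0.95 availability; crediting diversity gives 0.9975.
        print(functional_availability([0.95, 0.95]))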

  15. Development and validation of THUMS version 5 with 1D muscle models for active and passive automotive safety research.

    Science.gov (United States)

    Kimpara, Hideyuki; Nakahira, Yuko; Iwamoto, Masami

    2016-08-01

    Accurately predicting occupant kinematics is critical to better understanding injury mechanisms during an automotive crash event. The objectives of this study were to develop and validate a finite element (FE) model of the human body integrated with an active muscle model, called the Total HUman Model for Safety (THUMS) version 5, which has the body size of the 50th percentile American adult male (AM50). This model can generate force owing to muscle tone and can predict the occupant response during an automotive crash event. Deformable materials were assigned to all body parts of the THUMS model in order to evaluate injury probabilities. Each muscle was modeled as a Hill-type muscle model, with 800 muscle-tendon compartments of 1D truss and seatbelt elements spanning the joints of the neck, thorax, lumbar region, and upper and lower extremities. THUMS was validated against 36 series of post-mortem human surrogate (PMHS) and volunteer tests on frontal, lateral, and rear impacts. The muscle architectural and kinetic properties for the hip, knee, shoulder, and elbow joints were validated in terms of moment arms and maximum isometric joint torques over a wide range of joint angles. The muscular moment arms and maximum joint torques estimated from the THUMS occupant model with 1D muscles agreed with the experimental data over a wide range of joint angles. Therefore, this model has the potential to predict occupant kinematics and injury outcomes for various human body postures, such as sitting or standing.
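    Each 1D muscle element above is a Hill-type model. A generic sketch of the Hill force law follows, with illustrative (not THUMS-specific) force-length and force-velocity shape functions; the constants are assumptions chosen only to give plausible shapes.

        import math

        def hill_muscle_force(activation, l_norm, v_norm, f_max=1000.0):
            """Hill-type force: F = a * f_l(l) * f_v(v) * F_max + passive term.

            activation: neural activation in [0, 1]
            l_norm: fiber length / optimal fiber length
            v_norm: shortening velocity / max shortening velocity (>0 = shortening)
            """
            # Force-length: Gaussian around the optimal fiber length (l_norm = 1).
            f_l = math.exp(-((l_norm - 1.0) / 0.45) ** 2)
            # Force-velocity: hyperbolic drop when shortening, mild rise when lengthening.
            if v_norm >= 0.0:
                f_v = (1.0 - v_norm) / (1.0 + 4.0 * v_norm) if v_norm < 1.0 else 0.0
            else:
                f_v = 1.0 + 0.5 * min(-v_norm, 1.0)
            # Passive elastic force once the fiber is stretched past optimal length.
            f_passive = 0.05 * (math.exp(5.0 * (l_norm - 1.0)) - 1.0) if l_norm > 1.0 else 0.0
            return (activation * f_l * f_v + f_passive) * f_max

        print(hill_muscle_force(activation=0.8, l_norm=1.05, v_norm=0.1))  # ~520 N

    In a crash solver the activation input is what distinguishes a braced from a relaxed occupant, which is the "active safety" aspect the abstract highlights.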

  16. Geological discrete fracture network model for the Olkiluoto site, Eurajoki, Finland. Version 2.0

    Energy Technology Data Exchange (ETDEWEB)

    Fox, A.; Forchhammer, K.; Pettersson, A. [Golder Associates AB, Stockholm (Sweden); La Pointe, P.; Lim, D-H. [Golder Associates Inc. (Finland)

    2012-06-15

    This report describes the methods, analyses, and conclusions of the modeling team in the production of the 2010 revision to the geological discrete fracture network (DFN) model for the Olkiluoto site in Finland. The geological DFN is a statistical model for stochastically simulating rock fractures and minor faults at scales ranging from approximately 0.05 m to approximately 565 m; deformation zones are expressly excluded from the DFN model. The DFN model is presented as a series of tables summarizing probability distributions for several parameters necessary for fracture modeling: fracture orientation, fracture size, fracture intensity, and associated spatial constraints. The geological DFN is built from data collected during site characterization (SC) activities at Olkiluoto, which has been selected to host a final deep geological repository for spent fuel and nuclear waste from the Finnish nuclear power program. Data used in the DFN analyses include fracture maps from surface outcrops and trenches, geological and structural data from cored drillholes, and fracture information collected during the construction of the main tunnels and shafts at the ONKALO laboratory. Unlike the initial geological DFN, which was focused on the vicinity of the ONKALO tunnel, the 2010 revision presents a model parameterization for the entire island. Fracture domains are based on the tectonic subdivisions at the site (northern, central, and southern tectonic units) presented in the Geological Site Model (GSM), and are further subdivided along the intersections of major brittle-ductile zones. The rock volume at Olkiluoto is dominated by three distinct fracture sets: a subhorizontally-dipping set striking north-northeast and dipping to the east, subparallel to the mean bedrock foliation; a subvertically-dipping set striking roughly north-south; and a subvertically-dipping set striking approximately east-west.

  17. Water, Energy, and Biogeochemical Model (WEBMOD), user’s manual, version 1

    Science.gov (United States)

    Webb, Richard M.T.; Parkhurst, David L.

    2017-02-08

    The Water, Energy, and Biogeochemical Model (WEBMOD) uses the framework of the U.S. Geological Survey (USGS) Modular Modeling System to simulate fluxes of water and solutes through watersheds. WEBMOD divides watersheds into model response units (MRU) where fluxes and reactions are simulated for the following eight hillslope reservoir types: canopy; snowpack; ponding on impervious surfaces; O-horizon; two reservoirs in the unsaturated zone, which represent preferential flow and matrix flow; and two reservoirs in the saturated zone, which also represent preferential flow and matrix flow. The reservoir representing ponding on impervious surfaces, currently not functional (2016), will be implemented once the model is applied to urban areas. MRUs discharge to one or more stream reservoirs that flow to the outlet of the watershed. Hydrologic fluxes in the watershed are simulated by modules derived from the USGS Precipitation Runoff Modeling System; the National Weather Service Hydro-17 snow model; and a topography-driven hydrologic model (TOPMODEL). Modifications to the standard TOPMODEL include the addition of heterogeneous vertical infiltration rates; irrigation; lateral and vertical preferential flows through the unsaturated zone; pipe flow draining the saturated zone; gains and losses to regional aquifer systems; and the option to simulate baseflow discharge by using an exponential, parabolic, or linear decrease in transmissivity. PHREEQC, an aqueous geochemical model, is incorporated to simulate chemical reactions as waters evaporate, mix, and react within the various reservoirs of the model. The reactions that can be specified for a reservoir include equilibrium reactions among water; minerals; surfaces; exchangers; and kinetic reactions such as kinetic mineral dissolution or precipitation, biologically mediated reactions, and radioactive decay. WEBMOD also simulates variations in the concentrations of the stable isotopes deuterium and oxygen-18 as a result of evaporation and mixing.

  18. Representing winter wheat in the Community Land Model (version 4.5)

    Science.gov (United States)

    Lu, Yaqiong; Williams, Ian N.; Bagley, Justin E.; Torn, Margaret S.; Kueppers, Lara M.

    2017-05-01

    Winter wheat is a staple crop for global food security and the dominant vegetation cover for a significant fraction of Earth's croplands. As such, it plays an important role in carbon cycling and land-atmosphere interactions in these key regions. Accurate simulation of winter wheat growth is crucial not only for future yield prediction under a changing climate, but also for accurately predicting the energy and water cycles of winter-wheat-dominated regions. We modified the winter wheat model in the Community Land Model (CLM) to better simulate winter wheat leaf area index, latent heat flux, net ecosystem exchange of CO2, and grain yield. The modifications included schemes to represent vernalization as well as frost tolerance and damage. We calibrated three key parameters (minimum planting temperature, maximum crop growth days, and the initial value of the leaf carbon allocation coefficient) and modified the grain carbon allocation algorithm for simulations at the US Southern Great Plains ARM site (US-ARM), and validated the model performance at eight additional sites across North America. We found that the new winter wheat model improved the prediction of monthly variation in leaf area index and reduced the root-mean-square error (RMSE) of latent heat flux and net ecosystem exchange by 41 and 35 %, respectively, during the spring growing season. The model accurately simulated the interannual variation in yield at the US-ARM site, but underestimated yield by 35 % at sites and in regions (northwestern and southeastern US) with historically greater yields.

  19. Hybrid Model of the Context Dependent Vestibulo-Ocular Reflex: Implications for Vergence-Version Interactions

    Directory of Open Access Journals (Sweden)

    Mina Ranjbaran

    2015-02-01

    Full Text Available The vestibulo-ocular reflex (VOR) is an involuntary eye movement evoked by head movements. It is also influenced by viewing distance. This paper presents a hybrid nonlinear bilateral model for the horizontal angular vestibulo-ocular reflex (AVOR) in the dark. The model is based on known interconnections between saccadic burst circuits in the brainstem and ocular premotor areas in the vestibular nuclei during fast and slow phase intervals of nystagmus. We implemented a viable switching strategy for the timing of nystagmus events to allow emulation of real nystagmus data. The performance of the hybrid model is evaluated with simulations, and results are consistent with experimental observations. The hybrid model replicates realistic AVOR nystagmus patterns during sinusoidal or step head rotations in the dark and during interactions with vergence, e.g., fixation distance. By simply assigning proper nonlinear neural computations at the premotor level, the model replicates all reported experimental observations. This work sheds light on potential underlying neural mechanisms driving the context-dependent AVOR and explains contradictory results in the literature. Moreover, context-dependent behaviors in more complex motor systems could also rely on local nonlinear neural computations.

  20. Long-term Industrial Energy Forecasting (LIEF) model (18-sector version)

    Energy Technology Data Exchange (ETDEWEB)

    Ross, M.H. (Univ. of Michigan, Ann Arbor, MI (United States). Dept. of Physics); Thimmapuram, P.; Fisher, R.E.; Maciorowski, W. (Argonne National Lab., IL (United States))

    1993-05-01

    The new 18-sector Long-term Industrial Energy Forecasting (LIEF) model is designed for convenient study of future industrial energy consumption, taking into account the composition of production, energy prices, and certain kinds of policy initiatives. Electricity and aggregate fossil fuels are modeled. Changes in energy intensity in each sector are driven by autonomous technological improvement (price-independent trend), the opportunity for energy-price-sensitive improvements, energy price expectations, and investment behavior. Although this decision-making framework involves more variables than the simplest econometric models, it enables direct comparison of an econometric approach with conservation supply curves from detailed engineering analysis. It also permits explicit consideration of a variety of policy approaches other than price manipulation. The model is tested in terms of historical data for nine manufacturing sectors, and parameters are determined for forecasting purposes. Relatively uniform and satisfactory parameters are obtained from this analysis. In this report, LIEF is also applied to create base-case and demand-side management scenarios to briefly illustrate modeling procedures and outputs.
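    The intensity dynamics described above (an autonomous, price-independent trend plus price-induced improvements) can be sketched in a few lines. The multiplicative form and all parameter values below are illustrative assumptions, not LIEF's actual equations.

        def project_intensity(e0, years, autonomous=0.01, elasticity=-0.3,
                              price_growth=0.02):
            """Sectoral energy intensity path, indexed to e0 in year 0."""
            path, e = [], e0
            for _ in range(years):
                # autonomous trend plus price-induced improvement each year
                e *= (1.0 - autonomous) * (1.0 + elasticity * price_growth)
                path.append(round(e, 4))
            return path

        print(project_intensity(1.0, years=5))

    Multiplying each sector's intensity path by its projected output and summing over the 18 sectors gives the aggregate consumption forecast.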

  2. Hybrid model of the context dependent vestibulo-ocular reflex: implications for vergence-version interactions.

    Science.gov (United States)

    Ranjbaran, Mina; Galiana, Henrietta L

    2015-01-01

    The vestibulo-ocular reflex (VOR) is an involuntary eye movement evoked by head movements. It is also influenced by viewing distance. This paper presents a hybrid nonlinear bilateral model for the horizontal angular vestibulo-ocular reflex (AVOR) in the dark. The model is based on known interconnections between saccadic burst circuits in the brainstem and ocular premotor areas in the vestibular nuclei during fast and slow phase intervals of nystagmus. We implemented a viable switching strategy for the timing of nystagmus events to allow emulation of real nystagmus data. The performance of the hybrid model is evaluated with simulations, and results are consistent with experimental observations. The hybrid model replicates realistic AVOR nystagmus patterns during sinusoidal or step head rotations in the dark and during interactions with vergence, e.g., fixation distance. By simply assigning proper nonlinear neural computations at the premotor level, the model replicates all reported experimental observations. This work sheds light on potential underlying neural mechanisms driving the context dependent AVOR and explains contradictory results in the literature. Moreover, context-dependent behaviors in more complex motor systems could also rely on local nonlinear neural computations.

  3. [Application of three risk assessment models in occupational health risk assessment of dimethylformamide].

    Science.gov (United States)

    Wu, Z J; Xu, B; Jiang, H; Zheng, M; Zhang, M; Zhao, W J; Cheng, J

    2016-08-20

    Objective: To investigate the application of the United States Environmental Protection Agency (EPA) inhalation risk assessment model, the Singapore semi-quantitative risk assessment model, and the occupational hazards risk assessment index method to occupational health risk in enterprises using dimethylformamide (DMF) in an area of Jiangsu, China, and to put forward related risk control measures. Methods: Industries involving DMF exposure in Jiangsu province were chosen as the evaluation objects in 2013 and three risk assessment models were used in the evaluation. EPA inhalation risk assessment model: HQ = EC/RfC; Singapore semi-quantitative risk assessment model: Risk = (HR × ER)^(1/2); occupational hazards risk assessment index = 2^(health effect level) × 2^(exposure ratio) × operation condition level. Results: The hazard quotients (HQ > 1) from the EPA inhalation risk assessment model suggested that all the workshops (dry method, wet method and printing) and work positions (pasting, burdening, unreeling, rolling, assisting) were high risk. The Singapore semi-quantitative risk assessment model indicated that the workshop risk levels of dry method, wet method and printing were 3.5 (high), 3.5 (high) and 2.8 (general), and the position risk levels of pasting, burdening, unreeling, rolling and assisting were 4 (high), 4 (high), 2.8 (general), 2.8 (general) and 2.8 (general). The occupational hazards risk assessment index method gave position risk indices for pasting, burdening, unreeling, rolling and assisting of 42 (high), 33 (high), 23 (middle), 21 (middle) and 22 (middle). The results of the Singapore semi-quantitative risk assessment model and the occupational hazards risk assessment index method were similar, while the EPA inhalation risk assessment model indicated that all workshops and positions were high risk. Conclusion: The occupational hazards risk assessment index method fully considers health effects, exposure, and operating conditions
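
    The three scoring rules above are simple arithmetic; the sketch below implements them directly, with illustrative inputs (none of the numbers are from the study):

      import math

      def epa_hazard_quotient(ec, rfc):
          """EPA inhalation model: HQ = EC / RfC; HQ > 1 flags high risk."""
          return ec / rfc

      def singapore_risk(hazard_rating, exposure_rating):
          """Semi-quantitative model: Risk = sqrt(HR x ER)."""
          return math.sqrt(hazard_rating * exposure_rating)

      def hazard_risk_index(health_effect, exposure_ratio, operation_condition):
          """Index = 2^(health effect level) x 2^(exposure ratio) x operation condition level."""
          return (2 ** health_effect) * (2 ** exposure_ratio) * operation_condition

      # Hypothetical pasting position: exposure 1.2 mg/m3 against an assumed RfC of 0.03 mg/m3
      print(epa_hazard_quotient(1.2, 0.03))  # 40.0 -> high risk (HQ > 1)
      print(singapore_risk(4, 4))            # 4.0  -> "high" band
      print(hazard_risk_index(3, 2, 1.5))    # 48.0 -> compared against the method's cut-offs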

  4. A Prototypicality Validation of the Comprehensive Assessment of Psychopathic Personality (CAPP) Model Spanish Version.

    Science.gov (United States)

    Flórez, Gerardo; Casas, Alfonso; Kreis, Mette K F; Forti, Leonello; Martínez, Joaquín; Fernández, Juan; Conde, Manuel; Vázquez-Noguerol, Raúl; Blanco, Tania; Hoff, Helge A; Cooke, David J

    2015-10-01

    The Comprehensive Assessment of Psychopathic Personality (CAPP) is a newly developed, lexically based, conceptual model of psychopathy. The content validity of the Spanish language CAPP model was evaluated using prototypicality analysis. Prototypicality ratings were collected from 187 mental health experts and from samples of 143 health professionals and 282 community residents. Across the samples the majority of CAPP items were rated as highly prototypical of psychopathy. The Self, Dominance, and Attachment domains were evaluated as being more prototypical than the Behavioral and Cognitive domains. These findings are consistent with findings from similar studies in other languages and provide further support for the content validation of the CAPP model across languages and the lexical approach.

  5. An integrated assessment modeling framework for uncertainty studies in global and regional climate change: the MIT IGSM-CAM (version 1.0)

    OpenAIRE

    Monier, E.; Scott, J R; A. P. Sokolov; C. E. Forest; C. A. Schlosser

    2013-01-01

    This paper describes a computationally efficient framework for uncertainty studies in global and regional climate change. In this framework, the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an Earth system model of intermediate complexity to a human activity model, is linked to the National Center for Atmospheric Research (NCAR) Community Atmosphere Model (CAM). Since the MIT IGSM-CAM framework (version 1.0) inc...

  6. The Flexible Global Ocean-Atmosphere-Land System Model,Spectral Version 2:FGOALS-s2

    Institute of Scientific and Technical Information of China (English)

    BAO Qing; LIN Pengfei; ZHOU Tianjun; LIU Yimin; YU Yongqiang; WU Guoxiong; HE Bian

    2013-01-01

    The Flexible Global Ocean-Atmosphere-Land System model, Spectral Version 2 (FGOALS-s2), was used to simulate realistic climates and to study anthropogenic influences on climate change. Specifically, FGOALS-s2 was integrated with Coupled Model Intercomparison Project Phase 5 (CMIP5) to conduct coordinated experiments that will provide valuable scientific information to climate research communities. The performance of FGOALS-s2 was assessed in simulating major climate phenomena, and both the strengths and weaknesses of the model were documented. The results indicate that FGOALS-s2 successfully overcomes climate drift, and realistically models global and regional climate characteristics, including SST, precipitation, and atmospheric circulation. In particular, the model accurately captures annual and semi-annual SST cycles in the equatorial Pacific Ocean, and the main characteristic features of the Asian summer monsoon, which include a low-level southwestern jet and five monsoon rainfall centers. The simulated climate variability was further examined in terms of teleconnections, leading modes of global SST (namely, ENSO), Pacific Decadal Oscillations (PDO), and changes in 19th-20th century climate. The analysis demonstrates that FGOALS-s2 realistically simulates extra-tropical teleconnection patterns of large-scale climate, and irregular ENSO periods. The model gives fairly reasonable reconstructions of spatial patterns of PDO and global monsoon changes in the 20th century. However, because the indirect effects of aerosols are not included in the model, the simulated global temperature change during the period 1850-2005 is greater than the observed warming, by 0.6 °C. Some other shortcomings of the model are also noted.

  7. Measuring market liquidity risk - which model works best?

    OpenAIRE

    Ernst, Cornelia; Stange, Sebastian; Kaserer, Christoph

    2009-01-01

    Market liquidity risk, the difficulty or cost of trading assets in crises, has been recognized as an important factor in risk management. The literature has already proposed several models to include liquidity risk in the standard Value-at-Risk framework. While theoretical comparisons between those models have been conducted, their empirical performance has never been benchmarked. This paper performs comparative back-tests of daily risk forecasts for a large selection of traceable liquidity risk ...

  8. A Risk Management Model for Merger and Acquisition

    OpenAIRE

    Chui, B. S.

    2011-01-01

    In this paper, a merger and acquisition risk management model is proposed for considering risk factors in the merger and acquisition activities. The proposed model aims to maximize the probability of success in merger and acquisition activities by managing and reducing the associated risks. The modeling of the proposed merger and acquisition risk management model is described and illustrated in this paper. The illustration result shows that the proposed model can help to screen the best targe...

  9. Unit testing, model validation, and biological simulation [version 1; referees: 2 approved, 1 approved with reservations]

    Directory of Open Access Journals (Sweden)

    Gopal P. Sarma

    2016-08-01

    Full Text Available The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software which expresses scientific models.
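
    As a sketch of the distinction the article draws, the test below validates a simulation output against experimentally observed bounds rather than checking code logic; the model function, stimulus, and bounds are hypothetical stand-ins, not OpenWorm code:

      import unittest

      def simulate_peak_depolarization(stimulus_pa):
          # Stand-in for a neuron model; returns peak membrane potential in mV.
          return -70.0 + 0.9 * stimulus_pa

      class ModelValidationTest(unittest.TestCase):
          """A model validation test: does the simulated quantity fall inside
          the range reported experimentally? (Bounds here are invented.)"""

          def test_peak_depolarization_in_observed_range(self):
              peak_mv = simulate_peak_depolarization(stimulus_pa=100.0)
              self.assertGreaterEqual(peak_mv, -10.0)
              self.assertLessEqual(peak_mv, 40.0)

      if __name__ == "__main__":
          unittest.main()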

  10. Hydrogeological DFN modelling using structural and hydraulic data from KLX04. Preliminary site description Laxemar subarea - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Follin, Sven [SF GeoLogic AB, Taeby (Sweden); Stigsson, Martin [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Svensson, Urban [Computer-aided Fluid Engineering AB, Norrkoeping (Sweden)

    2006-04-15

    SKB is conducting site investigations for a high-level nuclear waste repository in fractured crystalline rocks at two coastal areas in Sweden. The two candidate areas are named Forsmark and Simpevarp. The site characterisation work is divided into two phases, an initial site investigation phase (ISI) and a complete site investigation phase (CSI). The results of the ISI phase are used as a basis for deciding on the subsequent CSI phase. On the basis of the CSI investigations a decision is made as to whether detailed characterisation will be performed (including sinking of a shaft). An integrated component in the site characterisation work is the development of site descriptive models. These comprise basic models in three dimensions with an accompanying text description. Central in the modelling work is the geological model which provides the geometrical context in terms of a model of deformation zones and the less fractured rock mass between the zones. Using the geological and geometrical description models as a basis, descriptive models for other disciplines (surface ecosystems, hydrogeology, hydrogeochemistry, rock mechanics, thermal properties and transport properties) will be developed. Great care is taken to arrive at a general consistency in the description of the various models and assessment of uncertainty and possible needs of alternative models. The main objective of this study is to support the development of a hydrogeological DFN model (Discrete Fracture Network) for the Preliminary Site Description of the Laxemar area on a regional-scale (SDM version L1.2). A more specific objective of this study is to assess the propagation of uncertainties in the geological DFN modelling reported for L1.2 into the groundwater flow modelling. An improved understanding is necessary in order to gain credibility for the Site Description in general and the hydrogeological description in particular. The latter will serve as a basis for describing the present

  11. NGNP Risk Management Database: A Model for Managing Risk

    Energy Technology Data Exchange (ETDEWEB)

    John Collins

    2009-09-01

    To facilitate the implementation of the Risk Management Plan, the Next Generation Nuclear Plant (NGNP) Project has developed and employed an analytical software tool called the NGNP Risk Management System (RMS). A relational database developed in Microsoft® Access, the RMS provides conventional database utility including data maintenance, archiving, configuration control, and query ability. Additionally, the tool’s design provides a number of unique capabilities specifically designed to facilitate the development and execution of activities outlined in the Risk Management Plan. Specifically, the RMS provides the capability to establish the risk baseline, document and analyze the risk reduction plan, track the current risk reduction status, organize risks by reference configuration system, subsystem, and component (SSC) and Area, and increase the level of NGNP decision making.
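
    The RMS itself is a Microsoft Access application; purely to illustrate the relational structure such a tool implies, here is a SQLite sketch (table and column names are assumptions, not the NGNP schema):

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE risk (
          id INTEGER PRIMARY KEY,
          title TEXT NOT NULL,
          ssc TEXT,            -- system, subsystem, and component
          area TEXT,
          baseline_rank TEXT,  -- risk baseline
          current_rank TEXT    -- current risk reduction status
      );
      CREATE TABLE mitigation_action (
          id INTEGER PRIMARY KEY,
          risk_id INTEGER REFERENCES risk(id),
          description TEXT,
          status TEXT
      );
      """)
      conn.execute("INSERT INTO risk (title, ssc, area, baseline_rank, current_rank) "
                   "VALUES ('Example component risk', 'Primary loop', 'Reactor', 'High', 'Medium')")

      # Query risks grouped by SSC, mirroring the organization described above.
      for row in conn.execute("SELECT ssc, COUNT(*) FROM risk GROUP BY ssc"):
          print(row)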

  12. Preliminary site description: Groundwater flow simulations. Simpevarp area (version 1.1) modelled with CONNECTFLOW

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, Lee; Worth, David [Serco Assurance Ltd, Risley (United Kingdom); Gylling, Bjoern; Marsic, Niko [Kemakta Konsult AB, Stockholm (Sweden); Holmen, Johan [Golder Associates, Stockholm (Sweden)

    2004-08-01

    The main objective of this study is to assess the role of known and unknown hydrogeological conditions for the present-day distribution of saline groundwater at the Simpevarp and Laxemar sites. An improved understanding of the paleo-hydrogeology is necessary in order to gain credibility for the Site Descriptive Model in general and the Site Hydrogeological Description in particular. This is to serve as a basis for describing the present hydrogeological conditions as well as predictions of future hydrogeological conditions. This objective implies a testing of: geometrical alternatives in the structural geology and bedrock fracturing, variants in the initial and boundary conditions, and parameter uncertainties (i.e. uncertainties in the hydraulic property assignment). This testing is necessary in order to evaluate the impact on the groundwater flow field of the specified components and to promote proposals of further investigations of the hydrogeological conditions at the site. The general methodology for modelling transient salt transport and groundwater flow using CONNECTFLOW that was developed for Forsmark has been applied successfully also for Simpevarp. Because of time constraints, only a key set of variants was performed, focusing on the influences of DFN model parameters, the kinematic porosity, and the initial condition. Salinity data in deep boreholes available at the time of the project were too limited to allow a good calibration exercise. However, the model predictions are compared with the available data from KLX01 and KLX02 below. Once more salinity data are available it may be possible to draw more definite conclusions based on the differences between variants. At the moment, though, the differences should just be used to understand the sensitivity of the models to various input parameters.

  13. Neck keloids: evaluation of risk factors and recommendation for keloid staging system [version 2; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Michael H. Tirgan

    2016-08-01

    Full Text Available Importance: Health care providers have long struggled with recurrent and hard-to-treat keloids. Advancing our understanding of the natural history and risk factors for development of large, very large and massive neck keloids can lead to improved treatment outcomes. A clinical staging system for the categorization of keloid lesions, as well as grouping of keloid patients according to the extent of skin involvement, is both fundamental for the design and delivery of a proper plan of care and an absolute necessity for methodical trial design and interpretation of the results thereof. Objective: To review the clinical presentation and natural history of neck keloids; to explore risk factors for development of large, very large and massive neck keloids; and to propose a clinical staging system that allows for categorization of keloid lesions by their size and grouping of keloid patients by the extent of their skin involvement. Setting: This is a retrospective analysis of 82 consecutive patients with neck keloids who were seen by the author in his keloid specialty medical practice. Intervention: Non-surgical treatment was offered to all patients. Results: Neck-area keloids were found to have several unique characteristics. All 65 African Americans in this study had keloidal lesions elsewhere on their skin. Very large and massive neck keloids appear to be race-specific and almost exclusively seen among African Americans. Submandibular and submental skin was the most commonly involved area of the neck. Keloid removal surgery was found to be the main risk factor for development of very large and massive neck keloids. Conclusions and relevance: Surgical removal of neck keloids results in wounding of the skin and triggering of a pathological wound-healing response that often leads to formation of a much larger keloid. Given the potential for greater harm from surgery, the author proposes a non-surgical approach for treatment of all primary neck keloids.

  14. Genetic/Familial High-Risk Assessment: Colorectal Version 1.2016, NCCN Clinical Practice Guidelines in Oncology.

    Science.gov (United States)

    Provenzale, Dawn; Gupta, Samir; Ahnen, Dennis J; Bray, Travis; Cannon, Jamie A; Cooper, Gregory; David, Donald S; Early, Dayna S; Erwin, Deborah; Ford, James M; Giardiello, Francis M; Grady, William; Halverson, Amy L; Hamilton, Stanley R; Hampel, Heather; Ismail, Mohammad K; Klapman, Jason B; Larson, David W; Lazenby, Audrey J; Lynch, Patrick M; Mayer, Robert J; Ness, Reid M; Regenbogen, Scott E; Samadder, Niloy Jewel; Shike, Moshe; Steinbach, Gideon; Weinberg, David; Dwyer, Mary; Darlow, Susan

    2016-08-01

    This is a focused update highlighting the most current NCCN Guidelines for diagnosis and management of Lynch syndrome. Lynch syndrome is the most common cause of hereditary colorectal cancer, usually resulting from a germline mutation in 1 of 4 DNA mismatch repair genes (MLH1, MSH2, MSH6, or PMS2), or deletions in the EPCAM promoter. Patients with Lynch syndrome are at an increased lifetime risk, compared with the general population, for colorectal cancer, endometrial cancer, and other cancers, including of the stomach and ovary. As of 2016, the panel recommends screening all patients with colorectal cancer for Lynch syndrome and provides recommendations for surveillance for early detection and prevention of Lynch syndrome-associated cancers.

  15. Flood risk assessment: concepts, modelling, applications

    Directory of Open Access Journals (Sweden)

    G. Tsakiris

    2014-01-01

    Full Text Available Natural hazards have caused severe consequences to the natural, modified and human systems in the past. These consequences seem to increase with time due to both the higher intensity of the natural phenomena and the higher value of elements at risk. Among the water-related hazards, flood hazards have the most destructive impacts. The paper presents a new systemic paradigm for the assessment of flood hazard and flood risk in riverine flood-prone areas. Special emphasis is given to urban areas with mild terrain and complicated topography, in which 2-D fully dynamic flood modelling is proposed. Further, the EU Flood Directive is critically reviewed and examples of its implementation are presented. Some critical points in the flood directive implementation are also highlighted.

  16. The big challenges in modeling human and environmental well-being [version 1; referees: 3 approved]

    Directory of Open Access Journals (Sweden)

    Shripad Tuljapurkar

    2016-04-01

    Full Text Available This article is a selective review of quantitative research, historical and prospective, that is needed to inform sustainable development policy. I start with a simple framework to highlight how demography and productivity shape human well-being. I use that to discuss three sets of issues and corresponding challenges to modeling: first, population prehistory and early human development and their implications for the future; second, the multiple distinct dimensions of human and environmental well-being and the meaning of sustainability; and, third, inequality as a phenomenon triggered by development and models to examine changing inequality and its consequences. I conclude with a few words about other important factors: political, institutional, and cultural.

  17. Business models for renewable energy in the built environment. Updated version

    Energy Technology Data Exchange (ETDEWEB)

    Wuertenberger, L.; Menkveld, M.; Vethman, P.; Van Tilburg, X. [ECN Policy Studies, Amsterdam (Netherlands); Bleyl, J.W. [Energetic Solutions, Graz (Austria)

    2012-04-15

    The project RE-BIZZ aims to provide insight to policy makers and market actors in the way new and innovative business models (and/or policy measures) can stimulate the deployment of renewable energy technologies (RET) and energy efficiency (EE) measures in the built environment. The project is initiated and funded by the IEA Implementing Agreement for Renewable Energy Technology Deployment (IEA-RETD). It analysed ten business models in three categories (amongst others different types of Energy Service Companies (ESCOs), Developing properties certified with a 'green' building label, Building owners profiting from rent increases after EE measures, Property Assessed Clean Energy (PACE) financing, On-bill financing, and Leasing of RET equipment) including their organisational and financial structure, the existing market and policy context, and an analysis of Strengths, Weaknesses, Opportunities and Threats (SWOT). The study concludes with recommendations for policy makers and other market actors.

  18. First implementation of secondary inorganic aerosols in the MOCAGE version R2.15.0 chemistry transport model

    Science.gov (United States)

    Guth, J.; Josse, B.; Marécal, V.; Joly, M.; Hamer, P.

    2016-01-01

    In this study we develop a secondary inorganic aerosol (SIA) module for the MOCAGE chemistry transport model developed at CNRM. The aim is to have a module suitable for running at different model resolutions and for operational applications with reasonable computing times. Based on the ISORROPIA II thermodynamic equilibrium module, the new version of the model is presented and evaluated at both the global and regional scales. The results show high concentrations of secondary inorganic aerosols in the most polluted regions: Europe, Asia and the eastern part of North America. Asia shows higher sulfate concentrations than other regions thanks to emission reductions in Europe and North America. Using two simulations, one with and the other without secondary inorganic aerosol formation, the global model outputs are compared to previous studies, to MODIS AOD retrievals, and also to in situ measurements from the HTAP database. The model shows a better agreement with MODIS AOD retrievals in all geographical regions after introducing the new SIA scheme. It also provides a good statistical agreement with in situ measurements of secondary inorganic aerosol composition: sulfate, nitrate and ammonium. In addition, the simulation with SIA generally gives a better agreement with observations for secondary inorganic aerosol precursors (nitric acid, sulfur dioxide, ammonia), in particular with a reduction of the modified normalized mean bias (MNMB). At the regional scale, over Europe, the model simulation with SIA is compared to the in situ measurements from the EMEP database and shows a good agreement with secondary inorganic aerosol composition. The results at the regional scale are consistent with those obtained from the global simulations. The AIRBASE database was used to compare the model to regulated air quality pollutants: particulate matter, ozone and nitrogen dioxide concentrations. Introduction of the SIA in MOCAGE provides a reduction in the PM2.5 MNMB of 0.44 on a
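
    The MNMB mentioned above is a standard air-quality verification metric, conventionally defined as MNMB = (2/N) Σ (f_i − o_i)/(f_i + o_i) for forecasts f and observations o; a minimal sketch (the numbers are placeholders, not study data):

      import numpy as np

      def mnmb(forecast, observed):
          """Modified normalized mean bias, bounded in [-2, 2]; 0 means no bias."""
          f = np.asarray(forecast, dtype=float)
          o = np.asarray(observed, dtype=float)
          return float(2.0 * np.mean((f - o) / (f + o)))

      print(mnmb([1.2, 0.8, 2.0], [1.0, 1.0, 1.5]))  # small positive bias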

  19. User’s Guide for COMBIMAN Programs (COMputerized BIomechanical MAN-Model) Version 5

    Science.gov (United States)

    1982-04-01

    ...accomplishing this has been to build mock-ups and use an undetermined number of "representative" test pilots to evaluate the work environment and... the "representative" pilots depends on the availability of pilots and the whims of the designers. The COMputerized BIomechanical MAN-model (COMBIMAN... The field defined with the letter S is the field of stereovision, which is the field visible to both eyes simultaneously. The field defined with the letter F

  20. User's guide to the META-Net economic modeling system. Version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Lamont, A.

    1994-11-24

    In a market economy demands for commodities are met through various technologies and resources. Markets select the technologies and resources to meet these demands based on their costs. Over time, the competitiveness of different technologies can change due to the exhaustion of resources they depend on, the introduction of newer, more efficient technologies, or even shifts in user demands. As this happens, the structure of the economy changes. The Market Equilibrium and Technology Assessment Network Modelling System, META·Net, has been developed for building and solving multi-period equilibrium models to analyze the shifts in the energy system that may occur as new technologies are introduced and resources are exhausted. META·Net allows a user to build and solve complex economic models. It models a market economy as a network of nodes representing resources, conversion processes, markets, and end-use demands. Commodities flow through this network from resources, through conversion processes and markets, to the end-users. META·Net then finds the multiperiod equilibrium prices and quantities. The solution includes the prices and quantities demanded for each commodity along with the capacity additions (and retirements) for each conversion process, and the trajectories of resource extraction. Although the changes in the economy are largely driven by consumers' behavior and the costs of technologies and resources, they are also affected by various government policies. These can include constraints on prices and quantities, and various taxes and constraints on environmental emissions. META·Net can incorporate many of these mechanisms and evaluate their potential impact on the development of the economic system.
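
    As a toy illustration of the equilibrium idea (one commodity, hypothetical demand and supply curves; META·Net's actual solve spans a whole network of nodes over multiple periods):

      from scipy.optimize import brentq

      def demand(price):
          # End-use demand falls with price (illustrative constant-elasticity form).
          return 100.0 * price ** -0.5

      def supply(price):
          # Resource/conversion supply rises with price (illustrative form).
          return 20.0 * price ** 0.8

      # Equilibrium: the price at which quantity demanded equals quantity supplied.
      p_star = brentq(lambda p: demand(p) - supply(p), 0.1, 100.0)
      print(p_star, demand(p_star))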

  1. Modelling turbulent vertical mixing sensitivity using a 1-D version of NEMO

    Directory of Open Access Journals (Sweden)

    G. Reffray

    2014-08-01

    Full Text Available Through two numerical experiments, a 1-D vertical model called NEMO1D was used to investigate physical and numerical turbulent-mixing behaviour. The results show that all the turbulent closures tested (k + l from Blanke and Delecluse, 1993, and two-equation models: Generic Length Scale closures from Umlauf and Burchard, 2003) are able to correctly reproduce the classical test of Kato and Phillips (1969) under favourable numerical conditions, while some solutions may diverge depending on the degradation of the spatial and time discretization. The performances of turbulence models were then compared with data measured over a one-year period (mid-2010 to mid-2011) at the PAPA station, located in the North Pacific Ocean. The modelled temperature and salinity were in good agreement with the observations, with a maximum temperature error between −2 and 2 °C during the stratified period (June to October). However, the results also depend on the numerical conditions. The vertical RMSE varied, for different turbulent closures, from 0.1 to 0.3 °C during the stratified period and from 0.03 to 0.15 °C during the homogeneous period. This 1-D configuration at the PAPA station (called PAPA1D) is now available in NEMO as a reference configuration including the input files and atmospheric forcing set described in this paper. Thus, all the results described can be recovered by downloading and launching PAPA1D. The configuration is described on the NEMO site (http://www.nemo-ocean.eu/Using-NEMO/Configurations/C1D_PAPA). This package is a good starting point for further investigation of vertical processes.

  2. The Everglades Depth Estimation Network (EDEN) surface-water model, version 2

    Science.gov (United States)

    Telis, Pamela A.; Xie, Zhixiao; Liu, Zhongwei; Li, Yingru; Conrads, Paul A.

    2015-01-01

    The Everglades Depth Estimation Network (EDEN) is an integrated network of water-level gages, interpolation models that generate daily water-level and water-depth data, and applications that compute derived hydrologic data across the freshwater part of the greater Everglades landscape. The U.S. Geological Survey Greater Everglades Priority Ecosystems Science provides support for EDEN in order for EDEN to provide quality-assured monitoring data for the U.S. Army Corps of Engineers Comprehensive Everglades Restoration Plan.

  3. The Canadian Defence Input-Output Model DIO Version 4.41

    Science.gov (United States)

    2011-09-01

    Output models, for instance to study the regional benefits of different large procurement programmes, the data censorship limitation would... [the remaining snippets reproduce fragments of the model's commodity index, e.g., 0960 Cocoa and chocolate; 5631 Private residential care facilities; 5632 Child care, outside the home; 5633 Other health and social services]
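
    Input-output models of this kind rest on the Leontief relation x = (I − A)⁻¹ d; a generic sketch with a two-sector toy economy (the coefficients and demands are illustrative, not DIO data):

      import numpy as np

      # Technical coefficients A[i, j]: input of commodity i per unit output of sector j.
      A = np.array([[0.10, 0.20],
                    [0.30, 0.05]])
      final_demand = np.array([100.0, 50.0])  # e.g., a procurement programme's demand

      # Total output x satisfies x = A @ x + d, i.e. x = (I - A)^-1 d.
      x = np.linalg.solve(np.eye(2) - A, final_demand)
      print(x)  # gross output by sector required to meet the final demand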

  4. Uncorrelated Encounter Model of the National Airspace System, Version 2.0

    Science.gov (United States)

    2013-08-19

    between two IFR aircraft in oceanic airspace. The reason for this is that one cannot observe encounters of sufficient fidelity in the available data... does not observe a sufficient number of encounters between instrument flight rules (IFR) and non-IFR traffic beyond 12 NM from the shore. [Table 1 of the report, "Encounter model categories", classifies encounters by aircraft of interest, intruder aircraft, location, and flight rule (IFR, VFR, noncooperative conventional).]

  5. Advanced Propagation Model (APM) Version 2.1.04 Computer Software Configuration Item (CSCI) Documents

    Science.gov (United States)

    2007-02-01

    Ray Trace (RAYTRACE) SU: Using standard ray trace techniques, a ray is traced from a starting height and range with a specified starting... [other snippets are table-of-contents entries and references, e.g., NOSC TD 1015, Feb. 1984, and Horst, M.M., Dyer, F.B., Tuley, M.T., "Radar Sea Clutter Model," IEEE International Conference on Antennas and Propagation]

  6. System cost model user's manual, version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Shropshire, D.

    1995-06-01

    The System Cost Model (SCM) was developed by Lockheed Martin Idaho Technologies in Idaho Falls, Idaho and MK-Environmental Services in San Francisco, California to support the Baseline Environmental Management Report sensitivity analysis for the U.S. Department of Energy (DOE). The SCM serves the needs of the entire DOE complex for treatment, storage, and disposal (TSD) of mixed low-level, low-level, and transuranic waste. The model can be used to evaluate total complex costs based on various configuration options or to evaluate site-specific options. The site-specific cost estimates are based on generic assumptions such as waste loads and densities, treatment processing schemes, existing facilities capacities and functions, storage and disposal requirements, schedules, and cost factors. The SCM allows customization of the data for detailed site-specific estimates. There are approximately forty TSD module designs that have been further customized to account for design differences for nonalpha, alpha, remote-handled, and transuranic wastes. The SCM generates cost profiles based on the model default parameters or customized user-defined input and also generates costs for transporting waste from generators to TSD sites.

  7. T2LBM Version 1.0: Landfill bioreactor model for TOUGH2

    Energy Technology Data Exchange (ETDEWEB)

    Oldenburg, Curtis M.

    2001-05-22

    The need to control gas and leachate production and minimize refuse volume in landfills has motivated the development of landfill simulation models that can be used by operators to predict and design optimal treatment processes. T2LBM is a module for the TOUGH2 simulator that implements a Landfill Bioreactor Model to provide simulation capability for the processes of aerobic or anaerobic biodegradation of municipal solid waste and the associated flow and transport of gas and liquid through the refuse mass. T2LBM incorporates a Monod kinetic rate law for the biodegradation of acetic acid in the aqueous phase by either aerobic or anaerobic microbes as controlled by the local oxygen concentration. Acetic acid is considered a proxy for all biodegradable substrates in the refuse. Aerobic and anaerobic microbes are assumed to be immobile and not limited by nutrients in their growth. Methane and carbon dioxide generation due to biodegradation with corresponding thermal effects are modeled. The numerous parameters needed to specify biodegradation are input by the user in the SELEC block of the TOUGH2 input file. Test problems show that good matches to laboratory experiments of biodegradation can be obtained. A landfill test problem demonstrates the capabilities of T2LBM for a hypothetical two-dimensional landfill scenario with permeability heterogeneity and compaction.
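
    As a sketch of the Monod rate law described above, with a dual limitation on substrate (acetic acid) and dissolved oxygen for the aerobic pathway; every parameter value and the stoichiometry below are illustrative assumptions, not T2LBM inputs:

      from scipy.integrate import solve_ivp

      MU_MAX, K_S, K_O, YIELD = 2.0, 0.05, 0.001, 0.5  # assumed values

      def monod_rhs(t, y):
          s, o2, x = y  # substrate, oxygen, biomass concentrations
          rate = MU_MAX * (s / (K_S + s)) * (o2 / (K_O + o2)) * x
          return [-rate, -0.3 * rate, YIELD * rate]  # assumed O2 demand and yield

      sol = solve_ivp(monod_rhs, (0.0, 10.0), y0=[1.0, 0.008, 0.01])
      print(sol.y[:, -1])  # final substrate, oxygen, and biomass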

  8. Regional groundwater flow model for a glaciation scenario. Simpevarp subarea - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Jaquet, O.; Siegel, P. [Colenco Power Engineering Ltd, Baden-Daettwil (Switzerland)

    2006-10-15

    A groundwater flow model (glaciation model) was developed at a regional scale in order to study long term transient effects related to a glaciation scenario likely to occur in response to climatic changes. Conceptually the glaciation model was based on the regional model of Simpevarp and was then extended to a mega-regional scale (of several hundred kilometres) in order to account for the effects of the ice sheet. These effects were modelled using transient boundary conditions provided by a dynamic ice sheet model describing the phases of glacial build-up, glacial completeness and glacial retreat needed for the glaciation scenario. The results demonstrate the strong impact of the ice sheet on the flow field, in particular during the phases of the build-up and the retreat of the ice sheet. These phases last for several thousand years and may cause large amounts of melt water to reach the level of the repository and below. The highest fluxes of melt water are located in the vicinity of the ice margin. As the ice sheet approaches the repository location, the advective effects gain dominance over diffusive effects in the flow field. In particular, up-coning effects are likely to occur at the margin of the ice sheet leading to potential increases in salinity at repository level. For the base case, the entire salinity field of the model is almost completely flushed out at the end of the glaciation period. The flow patterns are strongly governed by the location of the conductive features in the subglacial layer. The influence of these glacial features is essential for the salinity distribution as is their impact on the flow trajectories and, therefore, on the resulting performance measures. Travel times and F-factor were calculated using the method of particle tracking. Glacial effects cause major consequences on the results. In particular, average travel times from the repository to the surface are below 10 a during phases of glacial build-up and retreat. In comparison

  9. Implementing and Evaluating Variable Soil Thickness in the Community Land Model, Version 4.5 (CLM4.5)

    Energy Technology Data Exchange (ETDEWEB)

    Brunke, Michael A.; Broxton, Patrick; Pelletier, Jon; Gochis, David; Hazenberg, Pieter; Lawrence, David M.; Leung, L. Ruby; Niu, Guo-Yue; Troch, Peter A.; Zeng, Xubin

    2016-05-01

    One of the recognized weaknesses of land surface models as used in weather and climate models is the assumption of constant soil thickness due to the lack of global estimates of bedrock depth. Using a 30 arcsecond global dataset for the thickness of relatively porous, unconsolidated sediments over bedrock, spatial variation in soil thickness is included here in version 4.5 of the Community Land Model (CLM4.5). The number of soil layers for each grid cell is determined from the average soil depth for each 0.9° latitude × 1.25° longitude grid cell. Including variable soil thickness affects the simulations most in regions with shallow bedrock, corresponding predominantly to areas of mountainous terrain. The greatest changes are to baseflow, with the annual minimum generally occurring earlier, while smaller changes are seen in surface fluxes like latent heat flux and surface runoff in which only the annual cycle amplitude is increased. These changes are tied to soil moisture changes which are most substantial in locations with shallow bedrock. Total water storage (TWS) anomalies do not change much over most river basins around the globe, since most basins contain mostly deep soils. However, it was found that TWS anomalies substantially differ for a river basin with more mountainous terrain. Additionally, the annual cycle in soil temperature is affected by including realistic soil thicknesses due to changes to heat capacity and thermal conductivity.
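
    A minimal sketch of the grid-cell logic described above, choosing how many layers stay active given the mean soil depth; the interface depths are hypothetical, and CLM4.5's actual exponential layer grid differs:

      import bisect

      # Hypothetical lower-interface depths of successive soil layers (m).
      LAYER_INTERFACES = [0.02, 0.06, 0.12, 0.21, 0.37, 0.62, 1.04, 1.73, 2.86, 4.74]

      def n_soil_layers(mean_soil_depth_m):
          """Count layers whose lower interface lies above the bedrock depth."""
          return max(1, bisect.bisect_right(LAYER_INTERFACES, mean_soil_depth_m))

      print(n_soil_layers(0.5))  # shallow bedrock -> 5 active layers
      print(n_soil_layers(4.0))  # deep soil -> 9 of the 10 layers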

  10. Validation of the French version of the marijuana craving questionnaire (MCQ) generates a two-factor model.

    Science.gov (United States)

    Chauchard, Emeline; Goutaudier, Nelly; Heishman, Stephen J; Gorelick, David A; Chabrol, Henri

    2015-01-01

    Craving is a major issue in drug addiction, and a target for drug treatment. The Marijuana Craving Questionnaire-Short Form (MCQ-SF) is a useful tool for assessing cannabis craving in clinical and research settings. To validate the French version of the MCQ-SF (FMCQ-SF). Young adult cannabis users not seeking treatment (n = 679) completed the FMCQ-SF and questionnaires assessing their frequency of cannabis use and craving, cannabis use disorder criteria, and alcohol use. Confirmatory factor analysis of the four-factor FMCQ-SF model did not fit the data well. Exploratory factor analysis suggested a two-factor solution ("pleasure", characterized by planning and expectation of positive effects, and "release of tension", characterized by relief from anxiety, nervousness, or tension) with good psychometric properties. This two-factor model showed good internal and convergent validity and correlated with cannabis abuse and dependence and with frequency of cannabis use and craving. Validation of the FMCQ-SF generated a two-factor model, different from the four-factor solution generated in English language studies. Considering that craving plays an important role in withdrawal and relapse, this questionnaire should be useful for French-language addiction professionals.
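
    As an illustration of the exploratory step reported above, a generic two-factor analysis with scikit-learn; the data are random placeholders and the study's actual estimator and rotation may differ:

      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(0)
      # Placeholder for 679 respondents answering the 12 MCQ-SF items on a 7-point scale.
      responses = rng.integers(1, 8, size=(679, 12)).astype(float)

      fa = FactorAnalysis(n_components=2, random_state=0)
      fa.fit(responses)
      loadings = fa.components_.T  # item-by-factor loading matrix
      print(loadings.round(2))     # inspect which items load on which factor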

  11. Presentation, calibration and validation of the low-order DCESS Earth System Model (Version 1)

    Directory of Open Access Journals (Sweden)

    J. O. Pepke Pedersen

    2008-11-01

    Full Text Available A new, low-order Earth System Model is described, calibrated and tested against Earth system data. The model features modules for the atmosphere, ocean, ocean sediment, land biosphere and lithosphere and has been designed to simulate global change on time scales of years to millions of years. The atmosphere module considers radiation balance, meridional transport of heat and water vapor between low-mid latitude and high latitude zones, heat and gas exchange with the ocean and sea ice and snow cover. Gases considered are carbon dioxide and methane for all three carbon isotopes, nitrous oxide and oxygen. The ocean module has 100 m vertical resolution, carbonate chemistry and prescribed circulation and mixing. Ocean biogeochemical tracers are phosphate, dissolved oxygen, dissolved inorganic carbon for all three carbon isotopes and alkalinity. Biogenic production of particulate organic matter in the ocean surface layer depends on phosphate availability but with lower efficiency in the high latitude zone, as determined by model fit to ocean data. The calcite to organic carbon rain ratio depends on surface layer temperature. The semi-analytical, ocean sediment module considers calcium carbonate dissolution and oxic and anoxic organic matter remineralisation. The sediment is composed of calcite, non-calcite mineral and reactive organic matter. Sediment porosity profiles are related to sediment composition and a bioturbated layer of 0.1 m thickness is assumed. A sediment segment is ascribed to each ocean layer and segment area stems from observed ocean depth distributions. Sediment burial is calculated from sedimentation velocities at the base of the bioturbated layer. Bioturbation rates and oxic and anoxic remineralisation rates depend on organic carbon rain rates and dissolved oxygen concentrations. The land biosphere module considers leaves, wood, litter and soil. Net primary production depends on atmospheric carbon dioxide concentration and

  12. Regional hydrogeological simulations. Numerical modelling using ConnectFlow. Preliminary site description Simpevarp sub area - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, Lee; Hoch, Andrew; Hunter, Fiona; Jackson, Peter [Serco Assurance, Risley (United Kingdom); Marsic, Niko [Kemakta Konsult, Stockholm (Sweden)

    2005-02-01

    The main objective of this study is to support the development of a preliminary Site Description of the Simpevarp area on a regional-scale based on the available data of August 2004 (Data Freeze S1.2) and the previous Site Description. A more specific objective of this study is to assess the role of known and unknown hydrogeological conditions for the present-day distribution of saline groundwater in the Simpevarp area on a regional-scale. An improved understanding of the paleo-hydrogeology is necessary in order to gain credibility for the Site Description in general and the hydrogeological description in particular. This is to serve as a basis for describing the present hydrogeological conditions on a local-scale as well as predictions of future hydrogeological conditions. Other key objectives were to identify the model domain required to simulate regional flow and solute transport at the Simpevarp area and to incorporate a new geological model of the deformation zones produced for Version S1.2. Another difference with Version S1.1 is the increased effort invested in conditioning the hydrogeological property models to the fracture boremap and hydraulic data. A new methodology was developed for interpreting the discrete fracture network (DFN) by integrating the geological description of the DFN (GeoDFN) with the hydraulic test data from Posiva Flow-Log and Pipe-String System double-packer techniques to produce a conditioned Hydro-DFN model. This was done in a systematic way that addressed uncertainties associated with the assumptions made in interpreting the data, such as the relationship between fracture transmissivity and length. Consistent hydraulic data were only available for three boreholes, and therefore only relatively simplistic models were proposed, as there is not sufficient data to justify extrapolating the DFN away from the boreholes based on rock domain, for example. Significantly, a far greater quantity of hydro-geochemical data was available for calibration in the

  13. Solid Modeling Aerospace Research Tool (SMART) user's guide, version 2.0

    Science.gov (United States)

    Mcmillin, Mark L.; Spangler, Jan L.; Dahmen, Stephen M.; Rehder, John J.

    1993-01-01

    The Solid Modeling Aerospace Research Tool (SMART) software package is used in the conceptual design of aerospace vehicles. It provides a highly interactive and dynamic capability for generating geometries with Bezier cubic patches. Features include automatic generation of commonly used aerospace constructs (e.g., wings and multilobed tanks); cross-section skinning; wireframe and shaded presentation; area, volume, inertia, and center-of-gravity calculations; and interfaces to various aerodynamic and structural analysis programs. A comprehensive description of SMART and how to use it is provided.
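
    Bezier cubic patches are the geometric primitive named above; a generic bicubic-patch evaluator, not SMART's own code:

      import numpy as np

      def bernstein3(t):
          """Cubic Bernstein basis B0..B3 at parameter t in [0, 1]."""
          return np.array([(1 - t) ** 3, 3 * t * (1 - t) ** 2, 3 * t ** 2 * (1 - t), t ** 3])

      def bezier_patch_point(control, u, v):
          """Evaluate a bicubic Bezier patch; control has shape (4, 4, 3)."""
          return np.einsum("i,ijk,j->k", bernstein3(u), control, bernstein3(v))

      # Smoke test: a flat 4x4 control net over the unit square (z = 0).
      ctrl = np.zeros((4, 4, 3))
      ctrl[..., 0], ctrl[..., 1] = np.meshgrid(
          np.linspace(0, 1, 4), np.linspace(0, 1, 4), indexing="ij")
      print(bezier_patch_point(ctrl, 0.5, 0.5))  # -> [0.5, 0.5, 0.0]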

  14. Feynman propagator for the planar version of the CPT-even electrodynamics of Standard Model Extension

    Energy Technology Data Exchange (ETDEWEB)

    Casana, Rodolfo; Ferreira Junior, Manoel M.; Moreira, Roemir P.M. [Universidade Federal do Maranhao (UFMA), MA (Brazil); Gomes, Adalto R. [Instituto Federal de Educacao Ciencia e Tecnologia do Maranhao (IFMA), MA (Brazil)

    2011-07-01

    Full text: In a recent work, we have accomplished the dimensional reduction of the non-birefringent CPT-even gauge sector of the Standard Model Extension. As is well known, the CPT-even gauge sector is composed of nineteen components comprised by the fourth-rank tensor (K_F)_{μνρσ}, of which nine do not yield birefringence. These nine components can be parametrized in terms of the symmetric and traceless tensor k_{μν} = (K_F)^ρ_{μρν}. Starting from this parametrization, and applying the dimensional reduction procedure, we obtain a planar theory corresponding to the non-birefringent sector, composed of a gauge sector and a scalar sector, mutually coupled. These sectors possess six and three independent components, respectively. Some interesting properties of this theory, concerning classical stationary solutions, were examined recently. In the present work, we explicitly evaluate the Feynman propagator for this model, in a tensor closed way, using a set of operators defined in terms of three 3-vectors. We use this propagator to examine the dispersion relations of this theory, and analyze some properties related to its causality, stability, and unitarity. (author)

  15. Representing icebergs in the iLOVECLIM model (version 1.0) – a sensitivity study

    Directory of Open Access Journals (Sweden)

    M. Bügelmayer

    2014-07-01

    Full Text Available Recent modelling studies have indicated that icebergs alter the ocean's state, the thickness of sea ice and the prevailing atmospheric conditions, in short play an active role in the climate system. The icebergs' impact is due to their slowly released melt water which freshens and cools the ocean. The spatial distribution of the icebergs and thus their melt water depends on the forces (atmospheric and oceanic) acting on them as well as on the icebergs' size. The studies conducted so far have in common that the icebergs were moved by reconstructed or modelled forcing fields and that the initial size distribution of the icebergs was prescribed according to present-day observations. To address these shortcomings, we used the climate model iLOVECLIM, which includes actively coupled ice-sheet and iceberg modules, to conduct 15 sensitivity experiments to analyse (1) the impact of the forcing fields (atmospheric vs. oceanic) on the icebergs' distribution and melt flux, and (2) the effect of the used initial iceberg size on the resulting Northern Hemisphere climate and ice sheet under different climate conditions (pre-industrial, strong/weak radiative forcing). Our results show that, under equilibrated pre-industrial conditions, the oceanic currents cause the bergs to stay close to the Greenland and North American coast, whereas the atmospheric forcing quickly distributes them further away from their calving site. These different characteristics strongly affect the lifetime of icebergs, since the wind-driven icebergs melt up to two years faster as they are quickly distributed into the relatively warm North Atlantic waters. Moreover, we find that local variations in the spatial distribution due to different iceberg sizes do not result in different climate states and Greenland ice sheet volume, independent of the prevailing climate conditions (pre-industrial, warming or cooling climate). Therefore, we conclude that local differences in the distribution of their

  16. Unitary version of the single-particle dispersive optical model and single-hole excitations in medium-heavy spherical nuclei

    Science.gov (United States)

    Kolomiytsev, G. V.; Igashov, S. Yu.; Urin, M. H.

    2017-07-01

    A unitary version of the single-particle dispersive optical model was proposed with the aim of applying it to describing high-energy single-hole excitations in medium-heavy mass nuclei. By considering the example of experimentally studied single-hole excitations in the 90Zr and 208Pb parent nuclei, the contribution of the fragmentation effect to the real part of the optical-model potential was estimated quantitatively in the framework of this version. The results obtained in this way were used to predict the properties of such excitations in the 132Sn parent nucleus.

  17. Midlatitude atmospheric responses to Arctic sensible heat flux anomalies in Community Climate Model, Version 4

    Science.gov (United States)

    Mills, Catrin M.; Cassano, John J.; Cassano, Elizabeth N.

    2016-12-01

    Possible linkages between Arctic sea ice loss and midlatitude weather are strongly debated in the literature. We analyze a coupled model simulation to assess the possibility of Arctic ice variability forcing a midlatitude response, ensuring consistency between atmosphere, ocean, and ice components. We work with weekly running mean daily sensible heat fluxes with the self-organizing map technique to identify Arctic sensible heat flux anomaly patterns and the associated atmospheric response, without the need of metrics to define the Arctic forcing or measure the midlatitude response. We find that low-level warm anomalies during autumn can build planetary wave patterns that propagate downstream into the midlatitudes, creating robust surface cold anomalies in the eastern United States.

  18. Programs OPTMAN and SHEMMAN version 5 (1998). Coupled channels optical model and collective nuclear structure calculation

    Energy Technology Data Exchange (ETDEWEB)

    Sukhovitskii, E.Sh.; Porodzinskii, Y.V.; Iwamoto, Osamu; Chiba, Satoshi; Shibata, Keiichi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-05-01

    Program OPTMAN has been developed as a tool for optical model calculations and employed in nuclear data evaluation at the Radiation Physics and Chemistry Problems Institute. The code has been continuously improved to incorporate a number of options for more than twenty years. For the last three years it was successfully applied to the evaluation of minor actinide nuclear data under a contract with the International Science and Technology Center, with Japan as the financing party. This code, together with the program SHEMMAN, which is used for the determination of nuclear Hamiltonian parameters, has now been installed by the authors on PCs and UNIX workstations at the Nuclear Data Center of JAERI. This report is intended as a brief manual of these codes for the users at JAERI. (author)

  19. Offshore Wind Guidance Document: Oceanography and Sediment Stability (Version 1) Development of a Conceptual Site Model.

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, Jesse D.; Jason Magalen; Craig Jones

    2014-06-01

    This guidance document provides the reader with an overview of the key environmental considerations for a typical offshore wind coastal location and the tools to help guide the reader through a thorough planning process. It will enable readers to identify the key coastal processes relevant to their offshore wind site and perform pertinent analysis to guide siting and layout design, with the goal of minimizing costs associated with planning, permitting, and long-term maintenance. The document highlights site characterization and assessment techniques for evaluating spatial patterns of sediment dynamics in the vicinity of a wind farm under typical, extreme, and storm conditions. Finally, the document describes the assimilation of all of this information into the conceptual site model (CSM) to aid the decision-making processes.

  20. Theoretical modelling of epigenetically modified DNA sequences [version 2; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Alexandra Teresa Pires Carvalho

    2015-05-01

    Full Text Available We report herein a set of calculations designed to examine the effects of epigenetic modifications on the structure of DNA. The incorporation of methyl, hydroxymethyl, formyl and carboxy substituents at the 5-position of cytosine is shown to hardly affect the geometry of CG base pairs, but to result in rather larger changes to hydrogen-bond and stacking binding energies, as predicted by dispersion-corrected density functional theory (DFT) methods. The same modifications within double-stranded GCG and ACA trimers exhibit rather larger structural effects, when including the sugar-phosphate backbone as well as sodium counterions and implicit aqueous solvation. In particular, changes are observed in the buckle and propeller angles within base pairs and the slide and roll values of base pair steps, but these leave the overall helical shape of DNA essentially intact. The structures so obtained are useful as a benchmark of faster methods, including molecular mechanics (MM) and hybrid quantum mechanics/molecular mechanics (QM/MM) methods. We show that previously developed MM parameters satisfactorily reproduce the trimer structures, as do QM/MM calculations which treat bases with dispersion-corrected DFT and the sugar-phosphate backbone with AMBER. The latter are improved by inclusion of all six bases in the QM region, since a truncated model including only the central CG base pair in the QM region is considerably further from the DFT structure. This QM/MM method is then applied to a set of double-stranded DNA heptamers derived from a recent X-ray crystallographic study, whose size puts a DFT study beyond our current computational resources. These data show that still larger structural changes are observed than in base pairs or trimers, leading us to conclude that it is important to model epigenetic modifications within realistic molecular contexts.

  1. Incorporating Enterprise Risk Management in the Business Model Innovation Process

    OpenAIRE

    Yariv Taran; Harry Boer; Peter Lindgren

    2013-01-01

    Purpose: Relative to other types of innovation, little is known about business model innovation, let alone the management of the risks involved in that process. Using the emerging (enterprise) risk management literature, an approach is proposed through which risk management can be embedded in the business model innovation process. Design: The integrated business model innovation risk management model developed in this paper has been tested through an action research study in a Dani...

  2. Expanding pedestrian injury risk to the body region level: how to model passive safety systems in pedestrian injury risk functions.

    Science.gov (United States)

    Niebuhr, Tobias; Junge, Mirko; Achmus, Stefanie

    2015-01-01

    Assessment of the effectiveness of advanced driver assistance systems (ADAS) plays a crucial role in accident research. A common way to evaluate the effectiveness of new systems is to determine their potential for injury severity reduction. Because injury risk functions describe the probability of an injury of a given severity conditional on a technical accident severity (closing speed, delta V, barrier equivalent speed, etc.), they are well suited for such evaluations. Recent work presented an approach for modeling the pedestrian injury risk in pedestrian-to-passenger-car accidents as a family of functions, giving explicit and easily interpretable formulae for the injury risk conditional on the closing speed of the car. These results are extended here to injury risk functions for pedestrian body regions. Starting with a double-checked German In-Depth Accident Study (GIDAS) pedestrian-to-car accident data set (N = 444) and a functional-anatomical definition of the body regions, investigations on the influence of specific body regions on the overall injury severity are presented. As the measure of injury severity, the ISSx, a rescaled version of the well-known Injury Severity Score (ISS), was used. Whereas the traditional ISS is computed by summing the squares of the AIS severities of the 3 most severely injured body regions, the ISSx is computed by summing exponentials of the Abbreviated Injury Scale (AIS) severities of the 3 most severely injured body regions, scaled to fit the ISS range of values between 0 and 75. Three body regions (head/face/neck, thorax, hip/legs) clearly dominated abdominal and upper extremity injuries; that is, the latter 2 body regions had no influence at all on the overall injury risk over the range of technical accident severities. Thus, the ISSx is well described by using the injury codes from the same body regions for any pedestrian injury severity. As a mathematical consequence, the ISSx becomes explicitly
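
    A minimal sketch (Python) of the two scores contrasted above; the ISSx rescaling constant is not given in the record, so it is chosen here, as an assumption, so that three AIS-6 injuries map to 75:

        import math

        def iss(region_ais):
            """Classic ISS: sum of squares of the three most severe AIS scores."""
            top3 = sorted(region_ais, reverse=True)[:3]
            return sum(a * a for a in top3)

        def issx(region_ais, scale=75.0 / (3.0 * math.exp(6))):
            """ISSx sketch: sum of exponentials of the three most severe AIS
            scores, rescaled so that three AIS-6 injuries give exactly 75
            (the study's exact scaling may differ)."""
            top3 = sorted(region_ais, reverse=True)[:3]
            return scale * sum(math.exp(a) for a in top3)

        # Example: severe head (AIS 5), moderate thorax (AIS 3), minor leg (AIS 2).
        print(iss([5, 3, 2]), round(issx([5, 3, 2]), 1))   # 38 and ~10.9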

  3. Hybrid2: The hybrid system simulation model, Version 1.0, user manual

    Energy Technology Data Exchange (ETDEWEB)

    Baring-Gould, E.I.

    1996-06-01

    In light of the large-scale demand for energy in remote communities, especially in the developing world, the need for a detailed long-term performance prediction model for hybrid power systems was identified. To meet this need, engineers from the National Renewable Energy Laboratory (NREL) and the University of Massachusetts (UMass) have spent the last three years developing the Hybrid2 software. The Hybrid2 code provides a means to conduct long-term, detailed simulations of the performance of a large array of hybrid power systems. This work serves as an introduction and user's manual for the Hybrid2 software. The manual describes the Hybrid2 code, what is included with the software, and the structure of the code. It also describes some of the major features of Hybrid2 as well as how to create projects and run hybrid system simulations. The Hybrid2 code test program is also discussed. Although every attempt has been made to make the Hybrid2 code easy to understand and use, this manual will allow many organizations to consider the long-term advantages of using hybrid power systems instead of conventional petroleum-based systems for remote power generation.

  4. Sensitivity of precipitation to parameter values in the community atmosphere model version 5

    Energy Technology Data Exchange (ETDEWEB)

    Johannesson, Gardar; Lucas, Donald; Qian, Yun; Swiler, Laura Painton; Wildey, Timothy Michael

    2014-03-01

    One objective of the Climate Science for a Sustainable Energy Future (CSSEF) program is to develop the capability to thoroughly test and understand the uncertainties in the overall climate model and its components as they are being developed. The focus on uncertainties involves sensitivity analysis: the capability to determine which input parameters have a major influence on the output responses of interest. This report presents some initial sensitivity analysis results performed by Lawrence Livermore National Laboratory (LLNL), Sandia National Laboratories (SNL), and Pacific Northwest National Laboratory (PNNL). In the 2011-2012 timeframe, these laboratories worked in collaboration to perform sensitivity analyses of a set of CAM5 2° runs, where the response metrics of interest were precipitation metrics. The three labs performed their sensitivity analysis (SA) studies separately and then compared results. Overall, the results were quite consistent with each other although the methods used were different. This exercise provided a robustness check of the global sensitivity analysis metrics and identified some strongly influential parameters.
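
    As a minimal illustration of a global sensitivity metric of the kind compared here (not the laboratories' actual methods), one can sample the parameter space and rank-correlate each parameter with the output; everything below is synthetic:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)

        # Invented stand-in for a climate model: a precipitation metric as an
        # (unknown to the analyst) function of three input parameters.
        def model(p):
            return 2.0 * p[:, 0] + 0.1 * p[:, 1] ** 2 + rng.normal(0.0, 0.1, p.shape[0])

        params = rng.uniform(0.0, 1.0, size=(500, 3))   # simple Monte Carlo sample
        output = model(params)

        # Spearman rank correlation as a basic global sensitivity measure.
        for j in range(3):
            rho, _ = stats.spearmanr(params[:, j], output)
            print(f"parameter {j}: Spearman rho = {rho:+.2f}")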

  5. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    Full Text Available The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  6. Variational assimilation of land surface temperature within the ORCHIDEE Land Surface Model Version 1.2.6

    Science.gov (United States)

    Benavides Pinjosovsky, Hector Simon; Thiria, Sylvie; Ottlé, Catherine; Brajard, Julien; Badran, Fouad; Maugis, Pascal

    2017-01-01

    The SECHIBA module of the ORCHIDEE land surface model describes the exchanges of water and energy between the surface and the atmosphere. In the present paper, the adjoint semi-generator software called YAO was used as a framework to implement a 4D-VAR assimilation scheme of observations in SECHIBA. The objective was to deliver the adjoint model of SECHIBA (SECHIBA-YAO) obtained with YAO, providing an opportunity for scientists and end users to perform their own assimilation. SECHIBA-YAO allows the control of the 11 internal parameters most influential on the soil water content, by observing the land surface temperature or remote sensing data such as the brightness temperature. The paper presents the fundamental principles of 4D-VAR assimilation, the semi-generator software YAO, and a large number of experiments showing the accuracy of the adjoint code in different conditions (sites, PFTs, seasons). In addition, a distributed version is available for the case in which only the land surface temperature is observed.
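
    A toy illustration of the 4D-VAR idea (not SECHIBA-YAO itself): a control parameter of an invented one-parameter water-balance model is adjusted to minimize the misfit between the model trajectory and observations over a time window; a generic optimizer stands in for the adjoint-based minimization:

        import numpy as np
        from scipy.optimize import minimize

        def run_model(beta, w0, precip):
            """Toy soil-water balance: gain from rain, loss proportional to storage."""
            w, out = w0, []
            for p in precip:
                w = w + p - beta * w
                out.append(w)
            return np.array(out)

        rng = np.random.default_rng(0)
        precip = rng.uniform(0.0, 5.0, 30)
        obs = run_model(0.25, 20.0, precip) + rng.normal(0.0, 0.5, 30)  # synthetic obs

        # 4D-VAR-style cost: squared misfit over the whole assimilation window
        # (the background term is omitted for brevity).
        def cost(x):
            return 0.5 * np.sum((run_model(x[0], 20.0, precip) - obs) ** 2)

        res = minimize(cost, x0=[0.1], bounds=[(0.01, 1.0)])
        print("recovered parameter:", res.x[0])   # close to the true value 0.25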

  7. Fuel Cell Power Model Version 2: Startup Guide, System Designs, and Case Studies. Modeling Electricity, Heat, and Hydrogen Generation from Fuel Cell-Based Distributed Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Steward, D.; Penev, M.; Saur, G.; Becker, W.; Zuboy, J.

    2013-06-01

    This guide helps users get started with the U.S. Department of Energy/National Renewable Energy Laboratory Fuel Cell Power (FCPower) Model Version 2, which is a Microsoft Excel workbook that analyzes the technical and economic aspects of high-temperature fuel cell-based distributed energy systems with the aim of providing consistent, transparent, comparable results. This type of energy system would provide onsite-generated heat and electricity to large end users such as hospitals and office complexes. The hydrogen produced could be used for fueling vehicles or stored for later conversion to electricity.

  8. MIG version 0.0 model interface guidelines: Rules to accelerate installation of numerical models into any compliant parent code

    Energy Technology Data Exchange (ETDEWEB)

    Brannon, R.M.; Wong, M.K.

    1996-08-01

    A set of model interface guidelines, called MIG, is presented as a means by which any compliant numerical material model can be rapidly installed into any parent code without having to modify the model subroutines. Here, "model" usually means a material model such as one that computes stress as a function of strain, though the term may be extended to any numerical operation. "Parent code" means a hydrocode, finite element code, etc. which uses the model and enforces, say, the fundamental laws of motion and thermodynamics. MIG requires the model developer (who creates the model package) to specify model needs in a standardized but flexible way. MIG includes a dictionary of technical terms that allows developers and parent code architects to share a common vocabulary when specifying field variables. For portability, database management is the responsibility of the parent code. Input/output occurs via structured calling arguments. As much model information as possible (such as the lists of required inputs, as well as lists of precharacterized material data and special needs) is supplied by the model developer in an ASCII text file. Every MIG-compliant model also has three required subroutines to check data, to request extra field variables, and to perform model physics. To date, the MIG scheme has proven flexible in beta installations of a simple yield model, plus a more complicated viscodamage yield model, three electromechanical models, and a complicated anisotropic microcrack constitutive model. The MIG yield model has been successfully installed using identical subroutines in three vectorized parent codes and one parallel C++ code, all predicting comparable results. By maintaining one model for many codes, MIG facilitates code-to-code comparisons and reduces duplication of effort, thereby reducing the cost of installing and sharing models in diverse new codes.
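
    A hypothetical rendering of the three-subroutine MIG contract (MIG itself targets Fortran-style material models in hydrocodes; all names below are invented for illustration):

        class MIGCompliantModel:
            """Sketch of the contract: three required entry points, with all
            database management left to the parent code."""

            # In MIG this list would live in the model's ASCII specification file.
            required_inputs = ["youngs_modulus", "yield_stress"]

            def check_data(self, params):
                """Required routine 1: validate user input before the run."""
                for name in self.required_inputs:
                    if params.get(name, -1.0) <= 0.0:
                        raise ValueError(f"{name} must be positive")

            def request_extra_variables(self):
                """Required routine 2: extra field variables the parent must allocate."""
                return ["equivalent_plastic_strain"]

            def update_state(self, strain_increment, state):
                """Required routine 3: the model physics (a trivial elastic update)."""
                state["stress"] = state.get("stress", 0.0) + state["youngs_modulus"] * strain_increment
                return state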

  9. Simulating the 2012 High Plains Drought Using Three Single Column Model Versions of the Community Earth System Model (SCM-CESM)

    Science.gov (United States)

    Medina, I. D.; Denning, S.

    2014-12-01

    The impact of changes in the frequency and severity of drought on fresh water sustainability is a great concern for many regions of the world. One such location is the High Plains, where the local economy is primarily driven by fresh water withdrawals from the Ogallala Aquifer, which accounts for approximately 30% of total irrigation withdrawals from all U.S. aquifers combined. Modeling studies that focus on the feedback mechanisms that control the climate and eco-hydrology during times of drought are limited in the sense that they use conventional General Circulation Models (GCMs) with grid length scales ranging from one hundred to several hundred kilometers. Additionally, these models utilize crude statistical parameterizations of cloud processes for estimating sub-grid fluxes of heat and moisture and have a poor representation of land surface heterogeneity. For this research, we focus on the 2012 High Plains drought and perform numerical simulations using three single column model versions of the Community Earth System Model (SCM-CESM) at multiple sites overlying the Ogallala Aquifer for the 2010-2012 period. In the first version of SCM-CESM, CESM is used in standard mode (Community Atmospheric Model (CAM) coupled to a single instance of the Community Land Model (CLM)); secondly, CESM is used in Super-Parameterized mode (SP-CESM), where a cloud resolving model (CRM, consisting of 32 atmospheric columns) replaces the standard CAM atmospheric parameterization and is coupled to a single instance of CLM; and thirdly, CESM is used in "Multi Instance" SP-CESM mode, where an instance of CLM is coupled to each CRM column of SP-CESM (32 CRM columns coupled to 32 instances of CLM). To assess the physical realism of the land-atmosphere feedbacks simulated at each site by all versions of SCM-CESM, differences in simulated energy and moisture fluxes will be computed between years for the 2010-2012 period, and will be compared to differences calculated using

  10. User Manual for Graphical User Interface Version 2.10 with Fire and Smoke Simulation Model (FSSIM) Version 1.2

    Science.gov (United States)

    2010-05-10

    calculations, while fast, have limitations in applicability and large uncertainties in their results. CFD computations have the potential to be accurate...variables or a CFD model that uses a multitude of variables. A network representation allows for maximum physical extent of a simulation with a minimum...are separated; therefore, the floor of the upper deck and the ceiling of the lower deck are highlighted. A vertical surface would only appear as a

  11. Evaluating litter decomposition in earth system models with long-term litterbag experiments: an example using the Community Land Model version 4 (CLM4).

    Science.gov (United States)

    Bonan, Gordon B; Hartman, Melannie D; Parton, William J; Wieder, William R

    2013-03-01

    Decomposition is a large term in the global carbon budget, but models of the earth system that simulate carbon cycle-climate feedbacks are largely untested with respect to litter decomposition. We tested the litter decomposition parameterization of the community land model version 4 (CLM4), the terrestrial component of the community earth system model, with data from the long-term intersite decomposition experiment team (LIDET). The LIDET dataset is a 10-year study of litter decomposition at multiple sites across North America and Central America. We performed 10-year litter decomposition simulations comparable with LIDET for 9 litter types and 20 sites in tundra, grassland, and boreal, conifer, deciduous, and tropical forest biomes using the LIDET-provided climatic decomposition index to constrain temperature and moisture effects on decomposition. We performed additional simulations with DAYCENT, a version of the CENTURY model, to ask how well an established ecosystem model matches the observations. The results show large discrepancy between the laboratory microcosm studies used to parameterize the CLM4 litter decomposition and the LIDET field study. Simulated carbon loss is more rapid than the observations across all sites, and nitrogen immobilization is biased high. Closer agreement with the observations requires much lower decomposition rates, obtained with the assumption that soil mineral nitrogen severely limits decomposition. DAYCENT better replicates the observations, for both carbon mass remaining and nitrogen, independent of nitrogen limitation. CLM4 has low soil carbon in global earth system simulations. These results suggest that this bias arises, in part, from too rapid litter decomposition. More broadly, the terrestrial biogeochemistry of earth system models must be critically tested with observations, and the consequences of particular model choices must be documented. Long-term litter decomposition experiments such as LIDET provide a real
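
    A common way to summarize litterbag data such as LIDET's, and to compare them with model output, is to fit a single-pool exponential decay and compare the rate constants; a sketch with invented numbers:

        import numpy as np
        from scipy.optimize import curve_fit

        # Fraction of initial litter mass remaining at each harvest (invented data).
        years = np.array([0.0, 1.0, 2.0, 3.0, 5.0, 10.0])
        mass = np.array([1.00, 0.72, 0.55, 0.43, 0.28, 0.10])

        def single_pool(t, k):
            """Single-pool decay model: m(t) = exp(-k t)."""
            return np.exp(-k * t)

        (k,), _ = curve_fit(single_pool, years, mass, p0=[0.3])
        print(f"fitted decay constant k = {k:.2f} per year")
        # A larger k fitted to simulated mass loss than to the observations would
        # quantify the "too rapid litter decomposition" bias discussed above.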

  12. Model-Based Mitigation of Availability Risks

    NARCIS (Netherlands)

    Zambon, Emmanuele; Bolzoni, D.; Etalle, Sandro; Salvato, Marco

    2007-01-01

    The assessment and mitigation of risks related to the availability of the IT infrastructure is becoming increasingly important in modern organizations. Unfortunately, present standards for Risk Assessment and Mitigation show limitations when evaluating and mitigating availability risks. This is due

  13. Urban Drainage Modeling and Flood Risk Management

    Science.gov (United States)

    Schmitt, Theo G.; Thomas, Martin

    The European research project RisUrSim (Σ!2255), in the EUREKA framework, was carried out by a project consortium including industrial mathematics and water engineering research institutes, municipal drainage works, as well as an insurance company. The overall objective was the development of a simulation tool to allow flood risk analysis and cost-effective management for urban drainage systems. In view of the regulatory background of European Standard EN 752, the phenomenon of urban flooding caused by surcharged sewer systems in urban drainage systems is analyzed, leading to the necessity of dual drainage modeling. A detailed dual drainage simulation model is described, based upon hydraulic flow routing procedures for surface flow and pipe flow. Special consideration is given to the interaction between surface and sewer flow in order to most accurately compute water levels above ground as a basis for further assessment of possible damage costs. The model application is presented for a small case study in terms of data needs, model verification, and first simulation results.

  14. The global aerosol-climate model ECHAM-HAM, version 2: sensitivity to improvements in process representations

    Directory of Open Access Journals (Sweden)

    K. Zhang

    2012-10-01

    Full Text Available This paper introduces and evaluates the second version of the global aerosol-climate model ECHAM-HAM. Major changes have been brought into the model, including new parameterizations for aerosol nucleation and water uptake, an explicit treatment of secondary organic aerosols, modified emission calculations for sea salt and mineral dust, the coupling of aerosol microphysics to a two-moment stratiform cloud microphysics scheme, and alternative wet scavenging parameterizations. These revisions extend the model's capability to represent details of the aerosol lifecycle and its interaction with climate. Nudged simulations of the year 2000 are carried out to compare the aerosol properties and global distribution in HAM1 and HAM2, and to evaluate them against various observations. Sensitivity experiments are performed to help identify the impact of each individual update in model formulation.

    Results indicate that from HAM1 to HAM2 there is a marked weakening of aerosol water uptake in the lower troposphere, reducing the total aerosol water burden from 75 Tg to 51 Tg. The main reason is that the newly introduced κ-Köhler-theory-based water uptake scheme uses a lower value for the maximum relative humidity cutoff. Particulate organic matter loading in HAM2 is considerably higher in the upper troposphere, because the explicit treatment of secondary organic aerosols allows highly volatile oxidation products of the precursors to be vertically transported to regions of very low temperature and to form aerosols there. Sulfate, black carbon, particulate organic matter and mineral dust in HAM2 have longer lifetimes than in HAM1 because of weaker in-cloud scavenging, which is in turn related to lower autoconversion efficiency in the newly introduced two-moment cloud microphysics scheme. Modification in the sea salt emission scheme causes a significant increase in the ratio (from 1.6 to 7.7) between accumulation mode and coarse mode emission fluxes of
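
    The κ-Köhler water-uptake relation referred to above can be sketched in its equilibrium form, neglecting the Kelvin (curvature) effect; the cutoff value below is an assumption for illustration, not the HAM2 setting:

        import numpy as np

        def kappa_growth_factor(kappa, rh, rh_max=0.95):
            """Diameter growth factor from kappa-Koehler theory without the
            Kelvin term: gf**3 = 1 + kappa * aw / (1 - aw), with the water
            activity aw capped at a maximum relative humidity."""
            aw = np.minimum(rh, rh_max)
            return (1.0 + kappa * aw / (1.0 - aw)) ** (1.0 / 3.0)

        # Sulfate-like particle (kappa ~ 0.6): lowering the RH cutoff strongly
        # limits growth near saturation, hence a smaller aerosol water burden.
        print(kappa_growth_factor(0.6, 0.90))   # ~1.86
        print(kappa_growth_factor(0.6, 0.99))   # capped at 95% RH -> ~2.31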

  15. ADVANCED UTILITY SIMULATION MODEL, REPORT OF SENSITIVITY TESTING, CALIBRATION, AND MODEL OUTPUT COMPARISONS (VERSION 3.0)

    Science.gov (United States)

    The report gives results of activities relating to the Advanced Utility Simulation Model (AUSM): sensitivity testing, comparison with a mature electric utility model, and calibration to historical emissions. The activities were aimed at demonstrating AUSM's validity over input va...

  16. Personalized Predictive Modeling and Risk Factor Identification using Patient Similarity.

    Science.gov (United States)

    Ng, Kenney; Sun, Jimeng; Hu, Jianying; Wang, Fei

    2015-01-01

    Personalized predictive models are customized for an individual patient and trained using information from similar patients. Compared to global models trained on all patients, they have the potential to produce more accurate risk scores and capture more relevant risk factors for individual patients. This paper presents an approach for building personalized predictive models and generating personalized risk factor profiles. A locally supervised metric learning (LSML) similarity measure is trained for diabetes onset and used to find clinically similar patients. Personalized risk profiles are created by analyzing the parameters of the trained personalized logistic regression models. A 15,000 patient data set, derived from electronic health records, is used to evaluate the approach. The predictive results show that the personalized models can outperform the global model. Cluster analysis of the risk profiles shows groups of patients with similar risk factors, differences in the top risk factors for different groups of patients, and differences between the individual and global risk factors.
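
    A minimal sketch of the personalization idea on synthetic data; plain Euclidean distance stands in for the learned LSML similarity measure used in the study:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        X = rng.normal(size=(2000, 10))                        # synthetic patient features
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=2000) > 1).astype(int)

        def personalized_risk(x_new, X, y, k=200):
            """Fit a logistic model on the k patients most similar to x_new."""
            idx = np.argsort(np.linalg.norm(X - x_new, axis=1))[:k]
            model = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
            # The coefficient vector doubles as a personalized risk-factor profile.
            return model.predict_proba(x_new.reshape(1, -1))[0, 1], model.coef_[0]

        risk, profile = personalized_risk(X[0], X[1:], y[1:])
        print(f"risk = {risk:.2f}; strongest factor index = {np.abs(profile).argmax()}")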

  17. A Risk Management Model for Merger and Acquisition

    Directory of Open Access Journals (Sweden)

    B. S. Chui

    2011-05-01

    Full Text Available In this paper, a merger and acquisition risk management model is proposed for considering risk factors in merger and acquisition activities. The proposed model aims to maximize the probability of success in merger and acquisition activities by managing and reducing the associated risks. The modeling of the proposed merger and acquisition risk management model is described and illustrated in this paper. The illustration shows that the proposed model can help to screen the best target company with minimum associated risks in the merger and acquisition activity.

  18. Updating sea spray aerosol emissions in the Community Multiscale Air Quality (CMAQ) model version 5.0.2

    Directory of Open Access Journals (Sweden)

    B. Gantt

    2015-05-01

    Full Text Available Sea spray aerosols (SSA) impact the particle mass concentration and gas-particle partitioning in coastal environments, with implications for human and ecosystem health. Despite their importance, the emission magnitude of SSA remains highly uncertain, with global estimates varying by nearly two orders of magnitude. In this study, the Community Multiscale Air Quality (CMAQ) model was updated to enhance fine mode SSA emissions, include sea surface temperature (SST) dependency, and reduce coastally-enhanced emissions. Predictions from the updated CMAQ model and those of the previous release version, CMAQv5.0.2, were evaluated using several regional and national observational datasets in the continental US. The updated emissions generally reduced model underestimates of sodium, chloride, and nitrate surface concentrations for an inland site of the Bay Regional Atmospheric Chemistry Experiment (BRACE) near Tampa, Florida. Including SST dependency in the SSA emission parameterization led to increased sodium concentrations in the southeast US and decreased concentrations along parts of the Pacific coast and northeastern US. The influence of sodium on the gas-particle partitioning of nitrate resulted in higher nitrate particle concentrations in many coastal urban areas due to increased condensation of nitric acid in the updated simulations, potentially affecting the predicted nitrogen deposition in sensitive ecosystems. Application of the updated SSA emissions to the California Research at the Nexus of Air Quality and Climate Change (CalNex) study period resulted in modest improvement in the predicted surface concentration of sodium and nitrate at several central and southern California coastal sites. This SSA emission update enabled a more realistic simulation of the atmospheric chemistry in environments where marine air mixes with urban pollution.

  19. Study of the Eco-Economic Indicators by Means of the New Version of the Merge Integrated Model. Part 1

    Directory of Open Access Journals (Sweden)

    Boris Vadimovich Digas

    2015-12-01

    Full Text Available One of the most relevant issues of the day is the problem of forecasting climatic changes and mitigating their consequences. The official point of view, reflected in the Climate Doctrine of the Russian Federation, consists in the recognition of the need to develop a state approach to climatic problems and related issues on the basis of a comprehensive scientific analysis of ecological, economic and social factors. For this purpose, integrated assessment models of an interdisciplinary character are employed. Their functionality is characterized by the possibility of constructing and testing various dynamic scenarios of complex systems. The main purposes of the computing experiments described in the article are a review of the consequences of hypothetical participation of Russia in initiatives for greenhouse gas reduction such as the Kyoto Protocol, and approbation of one of the calculation methods of the green GDP, representing the efficiency of environmental management in the modelling. To implement these goals, the MERGE optimization model is used; its classical version is intended for the quantitative estimation of the results of applying nature protection strategies. The components of the model are the eco-power module, the climatic module and the module of loss estimates. In this work, the main attention is paid to the adaptation of the MERGE model to the current state of the world economy in the conditions of a complicated geopolitical situation, and to the introduction of a new component to the model, implementing a simplified method for calculating the green GDP. The Project of scenario conditions and the key macroeconomic forecast parameters of the socio-economic development of Russia for 2016 and the planning period of 2017−2018, prepared by the Ministry of Economic Development of the Russian Federation, are used as the basic source of input data for the analysis of possible trajectories of the economic development of Russia and the

  20. Study of the Eco-Economic Indicators by Means of the New Version of the Merge Integrated Model. Part 2

    Directory of Open Access Journals (Sweden)

    Boris Vadimovich Digas

    2016-03-01

    Full Text Available One of the most relevant issues of the day is the problem of forecasting climatic changes and mitigating their consequences. The official point of view, reflected in the Climate Doctrine of the Russian Federation, consists in the recognition of the need to develop a state approach to climatic problems and related issues on the basis of a comprehensive scientific analysis of ecological, economic and social factors. For this purpose, integrated assessment models of an interdisciplinary character are employed. Their functionality is characterized by the possibility of constructing and testing various dynamic scenarios of complex systems. The main purposes of the computing experiments described in the article are a review of the consequences of hypothetical participation of Russia in initiatives for greenhouse gas reduction such as the Kyoto Protocol, and approbation of one of the calculation methods of the green gross domestic product, representing the efficiency of environmental management in the modelling. To implement these goals, the MERGE optimization model is used; its classical version is intended for the quantitative estimation of the results of applying nature protection strategies. The components of the model are the eco-power module, the climatic module and the module of loss estimates. In this work, the main attention is paid to the adaptation of the MERGE model to the current state of the world economy in the conditions of a complicated geopolitical situation, and to the introduction of a new component to the model, implementing a simplified method for calculating the green gross domestic product. The Project of scenario conditions and the key macroeconomic forecast parameters of the socio-economic development of Russia for 2016 and the planning period of 2017−2018, prepared by the Ministry of Economic Development of the Russian Federation, are used as the basic source of input data for the analysis of possible trajectories of the

  1. VELMA Ecohydrological Model, Version 2.0 -- Analyzing Green Infrastructure Options for Enhancing Water Quality and Ecosystem Service Co-Benefits

    Science.gov (United States)

    This 2-page factsheet describes an enhanced version (2.0) of the VELMA eco-hydrological model. VELMA – Visualizing Ecosystem Land Management Assessments – has been redesigned to assist communities, land managers, policy makers and other decision makers in evaluating the effecti...

  3. Hierarchical linear modeling of California Verbal Learning Test--Children's Version learning curve characteristics following childhood traumatic head injury.

    Science.gov (United States)

    Warschausky, Seth; Kay, Joshua B; Chi, PaoLin; Donders, Jacobus

    2005-03-01

    California Verbal Learning Test-Children's Version (CVLT-C) indices have been shown to be sensitive to the neurocognitive effects of traumatic brain injury (TBI). The effects of TBI on the learning process were examined with a growth curve analysis of CVLT-C raw scores across the 5 learning trials. The sample with a history of TBI comprised 86 children, ages 6-16 years, at a mean of 10.0 (SD = 19.5) months postinjury; 37.2% had severe injury, 27.9% moderate, and 34.9% mild. The best-fit model for verbal learning was a quadratic function. Greater TBI severity was associated with a lower rate of acquisition and a more gradual deceleration in the rate of acquisition. Intelligence test index scores, previously shown to be sensitive to severity of TBI, were positively correlated with rate of acquisition. Results provide evidence that the CVLT-C learning slope is not a simple linear function and further support for specific effects of TBI on verbal learning.
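
    The quadratic growth curve at the heart of this analysis can be illustrated in a few lines for a single child (invented recall scores); the full analysis fits such coefficients hierarchically across children:

        import numpy as np

        trials = np.arange(1, 6)                   # the five CVLT-C learning trials
        recall = np.array([5, 8, 10, 11, 11])      # words recalled (invented data)

        # Quadratic growth curve: recall = b0 + b1*trial + b2*trial**2.
        b2, b1, b0 = np.polyfit(trials, recall, deg=2)
        print(f"acquisition rate b1 = {b1:.2f}, deceleration b2 = {b2:.2f}")
        # In the study, greater TBI severity corresponded to a smaller b1 (slower
        # acquisition) and a more gradual deceleration in the rate of acquisition.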

  4. An interactive code (NETPATH) for modeling NET geochemical reactions along a flow PATH, version 2.0

    Science.gov (United States)

    Plummer, L. Niel; Prestemon, Eric C.; Parkhurst, David L.

    1994-01-01

    NETPATH is an interactive Fortran 77 computer program used to interpret net geochemical mass-balance reactions between an initial and final water along a hydrologic flow path. Alternatively, NETPATH computes the mixing proportions of two to five initial waters and net geochemical reactions that can account for the observed composition of a final water. The program utilizes previously defined chemical and isotopic data for waters from a hydrochemical system. For a set of mineral and (or) gas phases hypothesized to be the reactive phases in the system, NETPATH calculates the mass transfers in every possible combination of the selected phases that accounts for the observed changes in the selected chemical and (or) isotopic compositions observed along the flow path. The calculations are of use in interpreting geochemical reactions, mixing proportions, evaporation and (or) dilution of waters, and mineral mass transfer in the chemical and isotopic evolution of natural and environmental waters. Rayleigh distillation calculations are applied to each mass-balance model that satisfies the constraints to predict carbon, sulfur, nitrogen, and strontium isotopic compositions at the end point, including radiocarbon dating. DB is an interactive Fortran 77 computer program used to enter analytical data into NETPATH and to calculate the distribution of species in aqueous solution. This report describes the types of problems that can be solved, the methods used to solve problems, and the features available in the program to facilitate these solutions. Examples are presented to demonstrate most of the applications and features of NETPATH. The codes DB and NETPATH can be executed in the UNIX or DOS environment. This report replaces U.S. Geological Survey Water-Resources Investigations Report 91-4078, by Plummer and others, which described the original release of NETPATH, version 1.0 (dated December, 1991), and documents revisions and enhancements that are included in version 2.0.
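
    At the core of each candidate NETPATH model is a linear mass-balance solve; a sketch with an invented three-phase system:

        import numpy as np

        # Columns: calcite CaCO3, dolomite CaMg(CO3)2, CO2 gas (invented phase set).
        # Rows: mass-balance constraints for Ca, Mg and C (mmol per kg of water).
        A = np.array([
            [1.0, 1.0, 0.0],   # Ca contributed by each phase
            [0.0, 1.0, 0.0],   # Mg
            [1.0, 2.0, 1.0],   # C
        ])
        delta = np.array([1.8, 0.6, 3.5])   # observed change, final minus initial water

        transfers = np.linalg.solve(A, delta)
        for phase, mmol in zip(["calcite", "dolomite", "CO2(g)"], transfers):
            print(f"{phase}: {mmol:+.2f} mmol/kg (positive = dissolution/ingassing)")
        # NETPATH enumerates every combination of the selected phases that admits
        # such a solution, yielding the set of feasible reaction models.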

  5. Enigma Version 12

    Science.gov (United States)

    Shores, David; Goza, Sharon P.; McKeegan, Cheyenne; Easley, Rick; Way, Janet; Everett, Shonn; Guerra, Mark; Kraesig, Ray; Leu, William

    2013-01-01

    Enigma Version 12 software combines model building, animation, and engineering visualization into one concise software package. Enigma employs a versatile user interface to allow average users access to even the most complex pieces of the application. Using Enigma eliminates the need to buy and learn several software packages to create an engineering visualization. Models can be created and/or modified within Enigma down to the polygon level. Textures and materials can be applied for additional realism. Within Enigma, these models can be combined to create systems of models that have a hierarchical relationship to one another, such as a robotic arm. These systems can then be animated within the program or controlled by an external application programming interface (API). In addition, Enigma provides the ability to use plug-ins. Plug-ins allow the user to create custom code for a specific application and access the Enigma model and system data, but still use the Enigma drawing functionality. CAD files can be imported into Enigma and combined to create systems of computer graphics models that can be manipulated with constraints. An API is available so that an engineer can write a simulation and drive the computer graphics models with no knowledge of computer graphics. An animation editor allows an engineer to set up sequences of animations generated by simulations or by conceptual trajectories in order to record these to high-quality media for presentation. (Lyndon B. Johnson Space Center, Houston, Texas) Planetary Protection Bioburden Analysis Program (NASA's Jet Propulsion Laboratory, Pasadena, California): This program is a Microsoft Access program that performed statistical analysis of the colony counts from assays performed on the Mars Science Laboratory (MSL) spacecraft to determine the bioburden density, 3-sigma biodensity, and the total bioburdens required for the MSL prelaunch reports. It also contains numerous

  6. Assessment of radionuclide databases in CAP88 mainframe version 1.0 and Windows-based version 3.0.

    Science.gov (United States)

    LaBone, Elizabeth D; Farfán, Eduardo B; Lee, Patricia L; Jannik, G Timothy; Donnelly, Elizabeth H; Foley, Trevor Q

    2009-09-01

    In this study the radionuclide databases for two versions of the Clean Air Act Assessment Package-1988 (CAP88) computer model were assessed in detail. CAP88 estimates radiation dose and the risk of health effects to human populations from radionuclide emissions to air. This program is used by several U.S. Department of Energy (DOE) facilities to comply with National Emission Standards for Hazardous Air Pollutants regulations. CAP88 Mainframe, referred to as version 1.0 on the U.S. Environmental Protection Agency Web site (http://www.epa.gov/radiation/assessment/CAP88/), was the very first CAP88 version, released in 1988. Some DOE facilities, including the Savannah River Site, still employ this version (1.0), while others use the more user-friendly personal computer Windows-based version 3.0 released in December 2007. Version 1.0 uses the program RADRISK, based on International Commission on Radiological Protection Publication 30, as its radionuclide database. Version 3.0 uses half-life, dose, and risk factor values based on Federal Guidance Report 13. Differences in these values could cause different results for the same input exposure data (same scenario), depending on which version of CAP88 is used. Consequently, the differences between the two versions are being assessed in detail at Savannah River National Laboratory. The version 1.0 and 3.0 database files contain 496 and 838 radionuclides, respectively, and though one would expect the newer version to include all 496 radionuclides, 35 radionuclides are listed in version 1.0 that are not included in version 3.0. The majority of these have either extremely short or long half-lives or are no longer in production; however, some of the short-lived radionuclides might produce progeny of great interest at DOE sites. In addition, 122 radionuclides were found to have different half-lives in the two versions, with 21 differing by over 3 percent and 12 by over 10 percent.
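
    The database cross-check described above reduces to set and percentage comparisons; a sketch with placeholder half-lives (not actual CAP88 values):

        # Half-lives in years; all numbers below are placeholders for illustration.
        v1 = {"Cs-137": 30.0, "I-131": 0.0220, "Na-24": 0.0017, "Pu-241": 14.4}
        v3 = {"Cs-137": 30.07, "I-131": 0.0220, "Pu-241": 13.0}

        print("in version 1.0 only:", sorted(set(v1) - set(v3)))

        for nuclide in sorted(set(v1) & set(v3)):
            pct = abs(v1[nuclide] - v3[nuclide]) / v1[nuclide] * 100.0
            if pct > 3.0:
                label = "over 10 percent" if pct > 10.0 else "over 3 percent"
                print(f"{nuclide}: half-lives differ by {pct:.1f}% ({label})")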

  7. ASSESSMENT OF RADIONUCLIDES DATABASES IN CAP88 MAINFRAME VERSION 1.0 AND WINDOWS-BASED VERSION 3.0

    Energy Technology Data Exchange (ETDEWEB)

    Farfan, E.; Lee, P.; Jannik, T.; Donnelly, E.

    2008-09-16

    In this study the radionuclide databases for two versions of the Clean Air Act Assessment Package-1988 (CAP88) computer model were assessed in detail. CAP88 estimates radiation dose and the risk of health effects to human populations from radionuclide emissions to air. This program is used by several Department of Energy (DOE) facilities to comply with National Emission Standards for Hazardous Air Pollutants (NESHAP) regulations. CAP88 Mainframe, referred to as Version 1.0 on the Environmental Protection Agency (EPA) website (http://www.epa.gov/radiation/assessment/CAP88/), was the very first CAP88 version, released in 1988. Some DOE facilities, including the Savannah River Site, still employ this version (1.0), while others use the more user-friendly personal computer Windows-based Version 3.0 released in December 2007. Version 1.0 uses the program RADRISK, based on International Commission on Radiological Protection (ICRP) Publication 30, as its radionuclide database. Version 3.0 uses half-life, dose and risk factor values based on Federal Guidance Report 13. Differences in these values could cause different results for the same input exposure data (same scenario), depending on which version of CAP88 is used. Consequently, the differences between the two versions are being assessed in detail at Savannah River National Laboratory. The version 1.0 and 3.0 database files contain 496 and 838 radionuclides, respectively, and though one would expect the newer version to include all 496 radionuclides, thirty-five radionuclides are listed in version 1.0 that are not included in version 3.0. The majority of these have either extremely short or long half-lives or are no longer in production; however, some of the short-lived radionuclides might produce progeny of great interest at DOE sites. In addition, one hundred and twenty-two radionuclides were found to have different half-lives in the two versions, with 21 differing by over 3 percent and 12 by over 10 percent.

  8. The globalization of risk and risk perception: why we need a new model of risk communication for vaccines.

    Science.gov (United States)

    Larson, Heidi; Brocard Paterson, Pauline; Erondu, Ngozi

    2012-11-01

    Risk communication around vaccines is complex, and the nature of risk perception is changing, with perceptions converging, evolving and having impacts well beyond specific geographic localities and points in time, especially when amplified through the Internet and other modes of global communication. This article examines the globalization of risk perceptions and their impacts, including the example of measles and the globalization of measles, mumps and rubella (MMR) vaccine risk perceptions, and calls for a new, more holistic model of risk assessment, risk communication and risk mitigation, embedded in an ongoing process of risk management for vaccines and immunization programmes. It envisions risk communication as an ongoing process that includes trust-building strategies hand-in-hand with the operational and policy strategies needed to mitigate and manage vaccine-related risks, as well as perceptions of risk.

  9. National Insect and Disease Risk Map (NIDRM)--cutting edge software for rapid insect and disease risk model development

    Science.gov (United States)

    Frank J. Krist

    2010-01-01

    The Forest Health Technology Enterprise Team (FHTET) of the U.S. Forest Service is leading an effort to produce the next version of the National Insect and Disease Risk Map (NIDRM) for targeted release in 2011. The goal of this effort is to update spatial depictions of risk of tree mortality based on: (1) newly derived 240-m geospatial information depicting the...

  10. EIA model documentation: World oil refining logistics demand model, "WORLD", reference manual. Version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    1994-04-11

    This manual is intended primarily for use as a reference by analysts applying the WORLD model to regional studies. It also provides overview information on WORLD features of potential interest to managers and analysts. Broadly, the manual covers WORLD model features in progressively increasing detail. Section 2 provides an overview of the WORLD model, how it has evolved, what its design goals are, what it produces, and where it can be taken with further enhancements. Section 3 reviews model management, covering data sources, managing over-optimization, calibration and seasonality, check-points for case construction, and common errors. Section 4 describes the WORLD system in detail, including: data and program systems in overview; details of mainframe and PC program control and files; model generation, size management, debugging and error analysis; use with different optimizers; and reporting and results analysis. Section 5 provides a detailed description of every WORLD model data table, covering model controls, case and technology data. Section 6 goes into the details of WORLD matrix structure. It provides an overview, describes how regional definitions are controlled, and defines the naming conventions for all model rows, columns, right-hand sides, and bounds. It also includes a discussion of the formulation of product blending and specifications in WORLD. Several appendices supplement the main sections.
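
    WORLD is a large linear program; the flavor of such a model can be conveyed by a toy two-crude, two-product refinery LP (all numbers invented):

        import numpy as np
        from scipy.optimize import linprog

        cost = [60.0, 55.0]                     # $/bbl to buy and run each crude
        yields = np.array([[0.6, 0.4],          # gasoline yield per bbl of crude 0, 1
                           [0.3, 0.5]])         # distillate yield per bbl
        demand = np.array([300.0, 250.0])       # required bbl of gasoline, distillate

        # linprog minimizes cost @ x subject to A_ub @ x <= b_ub; "produce at
        # least the demand" becomes -yields @ x <= -demand.
        res = linprog(cost, A_ub=-yields, b_ub=-demand, bounds=[(0.0, None)] * 2)
        print("crude runs (bbl):", res.x, "total cost ($):", round(res.fun, 1))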

  11. A numerical 4D Collision Risk Model

    Science.gov (United States)

    Schmitt, Pal; Culloch, Ross; Lieber, Lilian; Kregting, Louise

    2017-04-01

    With the growing number of marine renewable energy (MRE) devices being installed across the world, some concern has been raised about the possibility of harming mobile marine fauna by collision. Although physical contact between an MRE device and an organism has not been reported to date, these novel sub-sea structures pose a challenge for accurately estimating collision risks as part of environmental impact assessments. Even if the animal motion is simplified to linear translation, ignoring likely evasive behaviour, the mathematical problem of establishing an impact probability is not trivial. We present a numerical algorithm to obtain such probability distributions using transient, four-dimensional simulations of a novel marine renewable device concept, Deep Green, Minesto's power plant, hereafter referred to as the 'kite', which flies in a figure-of-eight configuration. Simulations were carried out altering several configurations, including kite depth, kite speed and kite trajectory, while keeping the speed of the moving object constant. Since the kite assembly is defined as two parts in the model, a tether (attached to the seabed) and the kite, the collision risk of each part is reported independently. By comparing the number of collisions with the number of collision-free simulations, a probability of impact is estimated for each simulated position in the cross-section of the area. Results suggest that close to the bottom, where the tether amplitude is small, the path is always blocked and the impact probability is 100%, as expected. However, higher up in the water column, the collision probability is twice as high at the midline, where the tether passes twice per period, as at the extremes of its trajectory. The collision probability distribution is much more complex in the upper end of the water column, where the kite and tether can simultaneously collide with the object. Results demonstrate the viability of such models, which can also incorporate empirical
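
    A much-reduced two-dimensional stand-in for the collision-counting procedure described above (trajectory shapes, scales and speeds invented):

        import numpy as np

        rng = np.random.default_rng(42)

        def kite_pos(t):
            """Figure-of-eight trajectory in a vertical (y, z) plane."""
            return 10.0 * np.sin(t), 5.0 * np.sin(2.0 * t) - 15.0

        def collides(z_object, speed=1.0, radius=1.5, phase=0.0):
            """Object crosses the site in a straight horizontal line at depth z
            (linear translation, no evasive behaviour)."""
            t = np.linspace(0.0, 40.0, 4000)
            y_obj = -20.0 + speed * t
            ky, kz = kite_pos(t + phase)
            return np.any(np.hypot(ky - y_obj, kz - z_object) < radius)

        # Impact probability per depth: share of random-phase runs with a hit.
        for z in np.linspace(-25.0, -5.0, 9):
            hits = sum(collides(z, phase=rng.uniform(0.0, 2.0 * np.pi)) for _ in range(200))
            print(f"depth {z:5.1f} m: impact probability ~ {hits / 200:.2f}")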

  12. Assessing patients' risk of febrile neutropenia: is there a correlation between physician-assessed risk and model-predicted risk?

    Science.gov (United States)

    Lyman, Gary H; Dale, David C; Legg, Jason C; Abella, Esteban; Morrow, Phuong Khanh; Whittaker, Sadie; Crawford, Jeffrey

    2015-08-01

    This study evaluated the correlation between the risk of febrile neutropenia (FN) estimated by physicians and the risk of severe neutropenia or FN predicted by a validated multivariate model in patients with nonmyeloid malignancies receiving chemotherapy. Before patient enrollment, physician and site characteristics were recorded, and physicians self-reported the FN risk at which they would typically consider granulocyte colony-stimulating factor (G-CSF) primary prophylaxis (FN risk intervention threshold). For each patient, physicians electronically recorded their estimated FN risk, orders for G-CSF primary prophylaxis (yes/no), and patient characteristics for model predictions. Correlations between physician-assessed FN risk and model-predicted risk (primary endpoints) and between physician-assessed FN risk and G-CSF orders were calculated. Overall, 124 community-based oncologists registered; 944 patients initiating chemotherapy with intermediate FN risk enrolled. Median physician-assessed FN risk over all chemotherapy cycles was 20.0%, and median model-predicted risk was 17.9%; the correlation was 0.249 (95% CI, 0.179-0.316). The correlation between physician-assessed FN risk and subsequent orders for G-CSF primary prophylaxis (n = 634) was 0.313 (95% CI, 0.135-0.472). Among patients with a physician-assessed FN risk ≥ 20%, 14% did not receive G-CSF orders. G-CSF was not ordered for 16% of patients at or above their physician's self-reported FN risk intervention threshold (median, 20.0%) and was ordered for 21% below the threshold. Physician-assessed FN risk and model-predicted risk correlated weakly; however, there was moderate correlation between physician-assessed FN risk and orders for G-CSF primary prophylaxis. Further research and education on FN risk factors and appropriate G-CSF use are needed.
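
    The reported correlations with confidence intervals can be reproduced on synthetic data as follows; the Fisher z-based interval is an assumption about the study's method:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        model_risk = rng.uniform(0.05, 0.40, 300)          # model-predicted FN risk
        physician_risk = np.clip(model_risk + rng.normal(0.0, 0.12, 300), 0.0, 1.0)

        r, p = stats.pearsonr(physician_risk, model_risk)
        z = np.arctanh(r)                                  # Fisher z-transform
        half = 1.96 / np.sqrt(model_risk.size - 3)
        lo, hi = np.tanh(z - half), np.tanh(z + half)
        print(f"r = {r:.3f} (95% CI {lo:.3f}-{hi:.3f}), p = {p:.1e}")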

  13. Discrete-Element bonded particle Sea Ice model DESIgn, version 1.3 – model description and implementation

    Directory of Open Access Journals (Sweden)

    A. Herman

    2015-07-01

    Full Text Available This paper presents theoretical foundations, numerical implementation and examples of application of a two-dimensional Discrete-Element bonded-particle Sea Ice model, DESIgn. In the model, sea ice is represented as an assemblage of objects of two types: disk-shaped "grains" and semi-elastic bonds connecting them. Grains move on the sea surface under the influence of forces from the atmosphere and the ocean, as well as interactions with surrounding grains through direct contact (Hertzian contact mechanics) and/or through bonds. The model has an option of taking into account quasi-three-dimensional effects related to the space- and time-varying curvature of the sea surface, thus enabling simulation of ice breaking due to stresses resulting from bending moments associated with surface waves. Examples of the model's application to simple sea ice deformation and breaking problems are presented, with an analysis of the influence of the basic model parameters ("microscopic" properties of grains and bonds) on the large-scale response of the modeled material. The model is written as a toolbox suitable for usage with the open-source numerical library LIGGGHTS. The code, together with full technical documentation and example input files, is freely available with this paper and on the Internet.
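
    The grain-grain direct contact named above follows Hertzian mechanics; a simplified sketch of the normal contact force (the sphere-contact form is used here; DESIgn's two-dimensional disk formulation may differ):

        import numpy as np

        def hertz_contact_force(r1, r2, e_eff, x1, x2):
            """Normal Hertzian force F = (4/3) * E_eff * sqrt(R_eff) * overlap**1.5,
            zero when the grains do not touch."""
            overlap = (r1 + r2) - np.linalg.norm(x1 - x2)
            if overlap <= 0.0:
                return 0.0
            r_eff = r1 * r2 / (r1 + r2)
            return (4.0 / 3.0) * e_eff * np.sqrt(r_eff) * overlap ** 1.5

        # Two 1 m grains whose centers are 1.9 m apart -> 0.1 m overlap
        # (effective modulus value invented).
        print(hertz_contact_force(1.0, 1.0, 1.0e7,
                                  np.array([0.0, 0.0]), np.array([1.9, 0.0])))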

  14. Discrete-Element bonded-particle Sea Ice model DESIgn, version 1.3a - model description and implementation

    Science.gov (United States)

    Herman, Agnieszka

    2016-04-01

    This paper presents theoretical foundations, numerical implementation and examples of application of the two-dimensional Discrete-Element bonded-particle Sea Ice model - DESIgn. In the model, sea ice is represented as an assemblage of objects of two types: disk-shaped "grains" and semi-elastic bonds connecting them. Grains move on the sea surface under the influence of forces from the atmosphere and the ocean, as well as interactions with surrounding grains through direct contact (Hertzian contact mechanics) and/or through bonds. The model has an experimental option of taking into account quasi-three-dimensional effects related to the space- and time-varying curvature of the sea surface, thus enabling simulation of ice breaking due to stresses resulting from bending moments associated with surface waves. Examples of the model's application to simple sea ice deformation and breaking problems are presented, with an analysis of the influence of the basic model parameters ("microscopic" properties of grains and bonds) on the large-scale response of the modeled material. The model is written as a toolbox suitable for usage with the open-source numerical library LIGGGHTS. The code, together with full technical documentation and example input files, is freely available with this paper and on the Internet.

  15. Models of intestinal infection by Salmonella enterica: introduction of a new neonate mouse model [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Marc Schulte

    2016-06-01

    Full Text Available Salmonella enterica serovar Typhimurium is a foodborne pathogen that causes inflammatory disease in the intestine following diarrhea and is responsible for thousands of deaths worldwide. Many in vitro investigations using cell culture models are available, but these do not represent the real natural environment present in the intestine of infected hosts. Several in vivo animal models have been used to study the host-pathogen interaction and to unravel the immune responses and cellular processes occurring during infection. One animal model for Salmonella-induced intestinal inflammation relies on the pretreatment of mice with streptomycin. This model is of great importance but still shows limitations for investigating the host-pathogen interaction in the small intestine in vivo. Here, we review the use of mouse models for Salmonella infections and focus on a new small animal model using 1-day-old neonate mice. The neonate model enables researchers to observe infection of both the small and large intestine, thereby offering perspectives for new experimental approaches, as well as to analyze the Salmonella-enterocyte interaction in the small intestine in vivo.

  16. Development of an Information Exchange format for the Observations Data Model version 2 using OGC Observations and Measurements

    Science.gov (United States)

    Valentine, D. W., Jr.; Aufdenkampe, A. K.; Horsburgh, J. S.; Hsu, L.; Lehnert, K. A.; Mayorga, E.; Song, L.; Zaslavsky, I.; Whitenack, T.

    2014-12-01

    The Observations Data Model v1 (ODMv1) schema has been utilized as the basis of hydrologic cyberinfrastructures including the CUAHSI HIS. The first version of ODM focused on time series, and ultimately led to the development of OGC "WaterML2 Part 1: Timeseries", which is proposed to be developed into OGC TimeseriesML. Our team has developed an ODMv2 model to address ODMv1 shortcomings and to encompass a wider community of spatially discrete, feature-based earth observations. The development process included collecting requirements from several existing Earth observation data systems: HIS, CZOData, IEDA and the EarthChem system, and IOOS. We developed ODM2 as a set of core entities with additional extension components that can be utilized. These extensions cover shared functionality (e.g. data quality, provenance) as well as specific use cases (e.g. laboratory analysis, equipment). Initially, we closely followed the Observations and Measurements (ISO 19156) conceptual model. After prototyping and reviewing the requirements, we extended the ODMv2 conceptual model to include entities that document ancillary acts that do not always produce a result, differing from O&M, where acts are expected to produce a result. ODMv2 includes the core concept of an "Action", which encapsulates activities that are performed in the process of making an observation but may not produce a result. Actions, such as a sample analysis, that observe a property and produce a result are equivalent to an O&M observation. But in many use cases, actions have no resulting observation. Examples of such actions are a site visit or sample preparation (splitting of a sample). These actions are part of a chain of actions which produce the final observation. Overall, ODMv2 generally follows the O&M conceptual model. The nearly final ODMv2 includes a core and extensions. The core entities include actions, feature actions (observations), datasets (groupings), methods (procedures), sampling
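
    A hypothetical, much-reduced rendering of the Action/Result distinction described above (names invented, not the actual ODM2 schema):

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class Result:
            variable: str
            value: float
            unit: str

        @dataclass
        class Action:
            """An Action may or may not produce Results, unlike an O&M
            observation, which always produces a result."""
            action_type: str                    # e.g. "specimen analysis", "site visit"
            begin_datetime: str
            results: List[Result] = field(default_factory=list)

        visit = Action("site visit", "2014-07-01T09:00")      # no result produced
        analysis = Action("specimen analysis", "2014-07-02T14:00",
                          [Result("nitrate", 0.42, "mg/L")])  # observation-like action
        for act in (visit, analysis):
            print(act.action_type, "->", len(act.results), "result(s)")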

  17. SHEDS-Multimedia Model Version 3 (a) Technical Manual; (b) User Guide; and (c) Executable File to Launch SAS Program and Install Model

    Science.gov (United States)

    Reliable models for assessing human exposures are important for understanding health risks from chemicals. The Stochastic Human Exposure and Dose Simulation model for multimedia, multi-route/pathway chemicals (SHEDS-Multimedia), developed by EPA’s Office of Research and Developm...

  18. Modeling the structure of the attitudes and belief scale 2 using CFA and bifactor approaches: Toward the development of an abbreviated version.

    Science.gov (United States)

    Hyland, Philip; Shevlin, Mark; Adamson, Gary; Boduszek, Daniel

    2014-01-01

    The Attitudes and Belief Scale-2 (ABS-2: DiGiuseppe, Leaf, Exner, & Robin, 1988. The development of a measure of rational/irrational thinking. Paper presented at the World Congress of Behavior Therapy, Edinburgh, Scotland.) is a 72-item self-report measure of evaluative rational and irrational beliefs widely used in Rational Emotive Behavior Therapy research contexts. However, little psychometric evidence exists regarding the measure's underlying factor structure. Furthermore, given the length of the ABS-2, there is a need for an abbreviated version that can be administered when there are time demands on the researcher, such as in clinical settings. This study sought to examine a series of theoretical models hypothesized to represent the latent structure of the ABS-2 within an alternative models framework using traditional confirmatory factor analysis as well as utilizing a bifactor modeling approach. Furthermore, this study also sought to develop a psychometrically sound abbreviated version of the ABS-2. Three hundred and thirteen (N = 313) active emergency service personnel completed the ABS-2. Results indicated that for each model, the application of bifactor modeling procedures improved model fit statistics, and a novel eight-factor intercorrelated solution was identified as the best fitting model of the ABS-2. However, the observed fit indices failed to satisfy commonly accepted standards. A 24-item abbreviated version was thus constructed, and an intercorrelated eight-factor solution yielded satisfactory model fit statistics. Current results support the use of a bifactor modeling approach to determining the factor structure of the ABS-2. Furthermore, results provide empirical support for the psychometric properties of the newly developed abbreviated version.

  19. Lord-Wingersky Algorithm Version 2.0 for Hierarchical Item Factor Models with Applications in Test Scoring, Scale Alignment, and Model Fit Testing.

    Science.gov (United States)

    Cai, Li

    2015-06-01

    Lord and Wingersky's (Appl Psychol Meas 8:453-461, 1984) recursive algorithm for creating summed score based likelihoods and posteriors has a proven track record in unidimensional item response theory (IRT) applications. Extending the recursive algorithm to handle multidimensionality is relatively simple, especially with fixed quadrature because the recursions can be defined on a grid formed by direct products of quadrature points. However, the increase in computational burden remains exponential in the number of dimensions, making the implementation of the recursive algorithm cumbersome for truly high-dimensional models. In this paper, a dimension reduction method that is specific to the Lord-Wingersky recursions is developed. This method can take advantage of the restrictions implied by hierarchical item factor models, e.g., the bifactor model, the testlet model, or the two-tier model, such that a version of the Lord-Wingersky recursive algorithm can operate on a dramatically reduced set of quadrature points. For instance, in a bifactor model, the dimension of integration is always equal to 2, regardless of the number of factors. The new algorithm not only provides an effective mechanism to produce summed score to IRT scaled score translation tables properly adjusted for residual dependence, but leads to new applications in test scoring, linking, and model fit checking as well. Simulated and empirical examples are used to illustrate the new applications.
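
    For readers unfamiliar with the unidimensional recursion being extended, the following is a minimal Python sketch of the classic Lord-Wingersky algorithm for dichotomous items at a single quadrature point; the 2PL item parameters are hypothetical.

      import numpy as np

      def lord_wingersky(p_correct):
          """Summed-score distribution for dichotomous items at one theta.

          p_correct: array of P(item j correct | theta).
          Returns L with L[s] = P(summed score = s | theta).
          """
          L = np.array([1.0])              # score distribution over zero items
          for p in p_correct:
              new = np.zeros(len(L) + 1)
              new[:-1] += L * (1.0 - p)    # item answered incorrectly
              new[1:] += L * p             # item answered correctly
              L = new
          return L

      # Hypothetical 2PL item probabilities at theta = 0
      a = np.array([1.0, 1.5, 0.8])        # discriminations
      b = np.array([-0.5, 0.0, 1.0])       # difficulties
      p = 1.0 / (1.0 + np.exp(-a * (0.0 - b)))
      print(lord_wingersky(p))             # P(score = 0..3 | theta = 0)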

  20. Breast Cancer Risk Assessment SAS Macro (Gail Model)

    Science.gov (United States)

    A SAS macro (commonly referred to as the Gail Model) that projects absolute risk of invasive breast cancer according to NCI’s Breast Cancer Risk Assessment Tool (BCRAT) algorithm for specified race/ethnic groups and age intervals.
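
    The BCRAT algorithm itself is distributed as the SAS macro, but the underlying absolute-risk calculation can be illustrated with a rough, discretized Python sketch that combines a covariate-based relative risk with baseline and competing hazards; all numeric values below are hypothetical.

      import numpy as np

      def absolute_risk(h1, h2, rr):
          """Approximate absolute risk over consecutive one-year age intervals.

          h1: baseline age-specific breast cancer hazards (per year)
          h2: competing (non-breast-cancer) mortality hazards (per year)
          rr: relative risk implied by the woman's risk factors
          """
          h1r = np.asarray(h1) * rr
          total = h1r + np.asarray(h2)
          # probability of remaining event-free to the start of each interval
          surv = np.exp(-np.concatenate(([0.0], np.cumsum(total)[:-1])))
          # sum of P(reach interval) * P(cancer during interval)
          return float(np.sum(surv * h1r))

      # Hypothetical 10-year projection, for illustration only
      print(absolute_risk(h1=[0.002] * 10, h2=[0.010] * 10, rr=1.8))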

  1. Temperature and Humidity Profiles in the TqJoint Data Group of AIRS Version 6 Product for the Climate Model Evaluation

    Science.gov (United States)

    Ding, Feng; Fang, Fan; Hearty, Thomas J.; Theobald, Michael; Vollmer, Bruce; Lynnes, Christopher

    2014-01-01

    The Atmospheric Infrared Sounder (AIRS) mission is entering its 13th year of global observations of the atmospheric state, including temperature and humidity profiles, outgoing long-wave radiation, cloud properties, and trace gases. AIRS data have thus been widely used, among other things, for short-term climate research and as an observational component for model evaluation. One instance is the fifth phase of the Coupled Model Intercomparison Project (CMIP5), which uses AIRS version 5 data in climate model evaluation. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is the home of processing, archiving, and distribution services for data from the AIRS mission. The GES DISC, in collaboration with the AIRS Project, released data from the version 6 algorithm in early 2013. The new algorithm represents a significant improvement over previous versions in terms of greater stability, yield, and quality of products. The ongoing Earth System Grid for next-generation climate model research project, a collaborative effort of GES DISC and NASA JPL, will bring in temperature and humidity profiles from AIRS version 6. The AIRS version 6 product adds a new "TqJoint" data group, which contains data for a common set of observations across water vapor and temperature at all atmospheric levels and is suitable for climate process studies. How different might the monthly temperature and humidity profiles in the "TqJoint" group be from those in the "Standard" group, where temperature and water vapor are not always valid at the same time? This study aims to answer the question by comprehensively comparing the temperature and humidity profiles from the "TqJoint" group and the "Standard" group. The comparison includes mean differences at different levels globally and over land and ocean. We are also working on examining the sampling differences between the "TqJoint" and "Standard" groups using MERRA data.

  2. Testing the prognostic accuracy of the updated pediatric sepsis biomarker risk model.

    Directory of Open Access Journals (Sweden)

    Hector R Wong

    Full Text Available BACKGROUND: We previously derived and validated a risk model to estimate mortality probability in children with septic shock (PERSEVERE; PEdiatRic SEpsis biomarkEr Risk modEl). PERSEVERE uses five biomarkers and age to estimate mortality probability. After the initial derivation and validation of PERSEVERE, we combined the derivation and validation cohorts (n = 355) and updated PERSEVERE. An important step in the development of updated risk models is to test their accuracy using an independent test cohort. OBJECTIVE: To test the prognostic accuracy of the updated version of PERSEVERE in an independent test cohort. METHODS: Study subjects were recruited from multiple pediatric intensive care units in the United States. Biomarkers were measured in 182 pediatric subjects with septic shock using serum samples obtained during the first 24 hours of presentation. The accuracy of the PERSEVERE 28-day mortality risk estimate was tested using diagnostic test statistics, and the net reclassification improvement (NRI) was used to test whether PERSEVERE adds information to a physiology-based scoring system. RESULTS: Mortality in the test cohort was 13.2%. Using a risk cut-off of 2.5%, the sensitivity of PERSEVERE for mortality was 83% (95% CI 62-95), specificity was 75% (68-82), positive predictive value was 34% (22-47), and negative predictive value was 97% (91-99). The area under the receiver operating characteristic curve was 0.81 (0.70-0.92). The false positive subjects had a greater degree of organ failure burden and longer intensive care unit length of stay, compared to the true negative subjects. When adding PERSEVERE to a physiology-based scoring system, the net reclassification improvement was 0.91 (0.47-1.35; p<0.001). CONCLUSIONS: The updated version of PERSEVERE estimates mortality probability reliably in a heterogeneous test cohort of children with septic shock and provides information over and above a physiology-based scoring system.

  3. CAPM and APT like models with risk measures

    OpenAIRE

    Balbás, Alejandro; Balbás, Beatriz; Balbás, Raquel

    2009-01-01

    The paper deals with optimal portfolio choice problems when risk levels are given by coherent risk measures, expectation bounded risk measures or general deviations. Both static and dynamic pricing models may be involved. Unbounded problems are characterized by new notions such as compatibility and strong compatibility between pricing rules and risk measures. Surprisingly, it is pointed out that the lack of bounded optimal risk and/or return levels arises in practice for ver...

  4. Incorporating Enterprise Risk Management in the Business Model Innovation Process

    Directory of Open Access Journals (Sweden)

    Yariv Taran

    2013-12-01

    Full Text Available Purpose: Relative to other types of innovations, little is known about business model innovation, let alone the process of managing the risks involved in that process. Using the emerging (enterprise) risk management literature, an approach is proposed through which risk management can be embedded in the business model innovation process. Design: The integrated business model innovation risk management model developed in this paper has been tested through an action research study in a Danish company. Findings: The study supports our proposition that the implementation of risk management throughout the innovation process reduces the risks related to the uncertainty and complexity of developing and implementing a new business model. Originality: The business model risk management model makes managers much more focused on identifying problematic issues and putting explicit plans and timetables into place for resolving/reducing risks, and assists companies in aligning the risk treatment choices made during the

  5. Flipped versions of the universal 3-3-1 and the left-right symmetric models in [SU(3)]^3: A comprehensive approach

    Science.gov (United States)

    Rodríguez, Oscar; Benavides, Richard H.; Ponce, William A.; Rojas, Eduardo

    2017-01-01

    By considering the 3-3-1 and the left-right symmetric models as low-energy effective theories of the SU(3)_C ⊗ SU(3)_L ⊗ SU(3)_R (for short [SU(3)]^3) gauge group, alternative versions of these models are found. The new neutral gauge bosons of the universal 3-3-1 model and its flipped versions are presented; also, the left-right symmetric model and its flipped variants are studied. Our analysis shows that there are two flipped versions of the universal 3-3-1 model, with the particularity that both of them have the same weak charges. For the left-right symmetric model, we also found two flipped versions; one of them is new in the literature and, unlike those of the 3-3-1, requires a dedicated study of its electroweak properties. For all the models analyzed, the couplings of the Z' bosons to the standard model fermions are reported. The explicit form of the null space of the vector boson mass matrix for an arbitrary Higgs tensor and gauge group is also presented. In the general framework of the [SU(3)]^3 gauge group, and by using the LHC experimental results and EW precision data, limits on the Z' mass and the mixing angle between Z and the new gauge bosons Z' are obtained. The general results call for very small mixing angles, in the range of 10^-3 radians, and M_Z' > 2.5 TeV.

  6. Statistical model of fractures and deformations zones for Forsmark. Preliminary site description Forsmark area - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    La Pointe, Paul R. [Golder Associate Inc., Redmond, WA (United States); Olofsson, Isabelle; Hermanson, Jan [Golder Associates AB, Uppsala (Sweden)

    2005-04-01

    Compared to version 1.1, a much larger amount of data, especially from boreholes, is available. Both one-hole interpretation and Boremap indicate the presence of high and low fracture intensity intervals in the rock mass. The depth and width of these intervals vary from borehole to borehole, but these constant fracture intensity intervals are contiguous and present quite sharp transitions. There is not a consistent pattern of intervals of high fracture intensity at or near to the surface. In many cases, the intervals of highest fracture intensity are considerably below the surface. While some fractures may have occurred or been reactivated in response to surficial stress relief, surficial stress relief does not appear to be a significant explanatory variable for the observed variations in fracture intensity. Data from the high fracture intensity intervals were extracted, and statistical analyses were conducted in order to identify common geological factors. Stereoplots of fracture orientation versus depth for the different fracture intensity intervals were also produced for each borehole. Moreover, percussion borehole data were analysed in order to identify the persistence of these intervals throughout the model volume. The main conclusions of these analyses are the following: The fracture intensity is conditioned by the rock domain, but inside a rock domain intervals of high and low fracture intensity are identified. The intervals of high fracture intensity almost always correspond to intervals with distinct fracture orientations (whether a set, most often the NW sub-vertical set, is highly dominant, or some orientation sets are missing). These high fracture intensity intervals are positively correlated with the presence of first and second generation minerals (epidote, calcite). No clear correlation for these fracture intensity intervals has been identified between boreholes. Based on these results the fracture frequency has been calculated in each rock domain for the

  7. Bankruptcy risk model and empirical tests.

    Science.gov (United States)

    Podobnik, Boris; Horvatic, Davor; Petersen, Alexander M; Urosevic, Branko; Stanley, H Eugene

    2010-10-26

    We analyze the size dependence and temporal stability of firm bankruptcy risk in the US economy by applying Zipf scaling techniques. We focus on a single risk factor--the debt-to-asset ratio R--in order to study the stability of the Zipf distribution of R over time. We find that the Zipf exponent increases during market crashes, implying that firms go bankrupt with larger values of R. Based on the Zipf analysis, we employ Bayes's theorem and relate the conditional probability that a bankrupt firm has a ratio R to the conditional probability of bankruptcy for a firm with a given R value. For 2,737 bankrupt firms, we demonstrate size dependence in asset changes during the bankruptcy proceedings. Prepetition firm assets and petition firm assets follow Zipf distributions but with different exponents, meaning that firms with smaller assets adjust their assets more than firms with larger assets during the bankruptcy process. We compare bankrupt firms with nonbankrupt firms by analyzing the assets and liabilities of two large subsets of the US economy: 2,545 Nasdaq members and 1,680 New York Stock Exchange (NYSE) members. We find that both assets and liabilities follow a Pareto distribution. The finding is not a trivial consequence of the Zipf scaling relationship of firm size quantified by employees--although the market capitalization of Nasdaq stocks follows a Pareto distribution, the same distribution does not describe NYSE stocks. We propose a coupled Simon model that simultaneously evolves both assets and debt with the possibility of bankruptcy, and we also consider the possibility of firm mergers.
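
    As a small illustration of the two ingredients combined above, the Python sketch below estimates a Pareto (Zipf) tail exponent with the Hill maximum-likelihood estimator and then inverts P(R | bankrupt) into P(bankrupt | R) via Bayes' theorem; all data are simulated, not the paper's.

      import numpy as np

      rng = np.random.default_rng(0)

      def hill_exponent(x, xmin):
          """Hill (maximum-likelihood) estimate of the Pareto density exponent."""
          tail = np.asarray(x)[np.asarray(x) >= xmin]
          return 1.0 + len(tail) / np.sum(np.log(tail / xmin))

      # Simulated debt-to-asset ratios; bankrupt firms get the heavier tail
      r_bankrupt = (1 - rng.random(2_000)) ** (-1 / 2.0)
      r_healthy = (1 - rng.random(20_000)) ** (-1 / 3.0)
      print("tail exponent, bankrupt firms:", hill_exponent(r_bankrupt, 1.0))

      # Bayes: P(bankrupt | R > r) = P(R > r | bankrupt) P(bankrupt) / P(R > r)
      prior = len(r_bankrupt) / (len(r_bankrupt) + len(r_healthy))
      r = 3.0
      like = np.mean(r_bankrupt > r)
      evidence = prior * like + (1 - prior) * np.mean(r_healthy > r)
      print("P(bankrupt | R > 3):", prior * like / evidence)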

  8. Rock mechanics modelling of rock mass properties - summary of primary data. Preliminary site description Laxemar subarea - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Lanaro, Flavio [Berg Bygg Konsult AB, Solna (Sweden); Oehman, Johan; Fredriksson, Anders [Golder Associates AB, Uppsala (Sweden)

    2006-05-15

    The results presented in this report are the summary of the primary data for the Laxemar Site Descriptive Modelling version 1.2. At this stage, laboratory tests on intact rock and fracture samples from borehole KSH01A, KSH02A, KAV01 (already considered in Simpevarp SDM version 1.2) and borehole KLX02 and KLX04 were available. Concerning the mechanical properties of the intact rock, the rock type 'granite to quartz monzodiorite' or 'Aevroe granite' (code 501044) was tested for the first time within the frame of the site descriptive modelling. The average uniaxial compressive strength and Young's modulus of the granite to quartz to monzodiorite are 192 MPa and 72 GPa, respectively. The crack initiation stress is observed to be 0.5 times the uniaxial compressive strength for the same rock type. Non negligible differences are observed between the statistics of the mechanical properties of the granite to quartz monzodiorite in borehole KLX02 and KLX04. The available data on rock fractures were analysed to determine the mechanical properties of the different fracture sets at the site (based on tilt test results) and to determine systematic differences between the results obtained with different sample preparation techniques (based on direct shear tests). The tilt tests show that there are not significant differences of the mechanical properties due to the fracture orientation. Thus, all fracture sets seem to have the same strength and deformability. The average peak friction angle for the Coulomb's Criterion of the fracture sets varies between 33.6 deg and 34.1 deg, while the average cohesion ranges between 0.46 and 0.52 MPa, respectively. The average of the Coulomb's residual cohesion and friction angle vary in the ranges 28.0 deg - 29.2 deg and 0.40-0.45 MPa, respectively. The only significant difference could be observed on the average cohesion between fracture set S{sub A} and S{sub d}. The direct shear tests show that the

  9. Proliferation Risk Characterization Model Prototype Model - User and Programmer Guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Dukelow, J.S.; Whitford, D.

    1998-12-01

    A model for the estimation of the risk of diversion of weapons-capable materials was developed. It represents both the threat of diversion and site vulnerability as a product of a small number of variables (two to eight), each of which can take on a small number (two to four) of qualitatively defined (but quantitatively implemented) values. The values of the overall threat and vulnerability variables are then converted to threat and vulnerability categories. The threat and vulnerability categories are used to define the likelihood of diversion, also defined categorically. The evaluator supplies an estimate of the consequences of a diversion, defined categorically, but with the categories based on the IAEA Attractiveness levels. Likelihood and Consequences categories are used to define the Risk, also defined categorically. The threat, vulnerability, and consequences input provided by the evaluator contains a representation of his/her uncertainty in each variable assignment, which is propagated all the way through to the calculation of the Risk categories. [Appendix G available on diskette only.]
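
    The categorical logic described above (threat and vulnerability combine into likelihood; likelihood and consequences combine into risk) can be sketched as simple lookup tables in Python; the category labels and matrix entries are illustrative assumptions, not the model's actual tables.

      # Hypothetical lookup: likelihood category from threat and vulnerability
      LIKELIHOOD = {
          ("low", "low"): "very low",      ("low", "medium"): "low",
          ("low", "high"): "medium",       ("medium", "low"): "low",
          ("medium", "medium"): "medium",  ("medium", "high"): "high",
          ("high", "low"): "medium",       ("high", "medium"): "high",
          ("high", "high"): "very high",
      }
      ORDER = ["very low", "low", "medium", "high", "very high"]

      def risk_category(threat, vulnerability, consequences):
          # A simple symmetric rule: average the ranks of likelihood and
          # consequences, rounding up.
          likelihood = LIKELIHOOD[(threat, vulnerability)]
          rank = (ORDER.index(likelihood) + ORDER.index(consequences) + 1) // 2
          return ORDER[rank]

      print(risk_category("high", "medium", "very high"))   # -> "very high"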

  10. Advances in Disaster Modeling, Simulation and Visualization for Sandstorm Risk Management in North China

    Directory of Open Access Journals (Sweden)

    Hang Lei

    2012-05-01

    Full Text Available Dust storms in North China result in high concentrations of airborne dust particles, which cause detrimental effects on human health as well as social and economic losses and environmental degradation. To investigate the impact of land surface processes on dust storms, we simulate two dust storm events in North China during spring 2002 using two versions of a dust storm prediction system developed by the Institute of Atmospheric Physics (IAP) in Beijing, China. The primary difference between the IAP Sandstorm Prediction System (IAPS 1.0) and the more recent version (IAPS 2.0) is the land surface modeling. IAPS 1.0 is based on the Oregon State University (OSU) land surface model, whereas the latest version of the dust storm prediction system (IAPS 2.0) uses NOAH land surface schemes for land surface modeling within a meteorological model, MM5. This work investigates whether the improved land surface modeling affects the modeling of sandstorms. It is shown that an integrated sandstorm management system can be used to aid the following tasks: ensure sandstorm monitoring and warning; incorporate weather forecasts; ascertain the risk of a sandstorm disaster; integrate multiple technologies (for example, GIS, remote sensing, and information processing technology); track the progress of the storm in real time; exhibit flexibility, accuracy and reliability (by using multiple sources of data, including in-situ meteorological observations); and monitor PM10 and PM2.5 dust concentrations in airborne dustfall. The results indicate that with the new land surface scheme, the simulation of soil moisture is greatly improved, leading to a better estimate of the threshold frictional velocity, a key parameter for estimating surface dust emissions. In this study, we also discuss specific mechanisms by which land surface processes affect dust storm modeling and make recommendations for further improvements to numerical dust storm simulations.

  11. The Survival Probability in Generalized Poisson Risk Model

    Institute of Scientific and Technical Information of China (English)

    GONG Ri-zhao

    2003-01-01

    In this paper we generalize the aggregated premium income process from a constant-rate process to a Poisson process for the classical compound Poisson risk model; then, for both the generalized model and the classical compound Poisson risk model, we obtain the survival probability over a finite time period in the case of exponential claim amounts.

  12. BaP (PAH) air quality modelling exercise over Zaragoza (Spain) using an adapted version of WRF-CMAQ model.

    Science.gov (United States)

    San José, Roberto; Pérez, Juan Luis; Callén, María Soledad; López, José Manuel; Mastral, Ana

    2013-12-01

    Benzo(a)pyrene (BaP) is one of the most dangerous PAHs due to its highly carcinogenic and mutagenic character. For this reason, Directive 2004/107/CE of the European Union establishes a target value of 1 ng/m3 of BaP in the atmosphere. The main aim of this paper is to estimate BaP concentrations in the atmosphere by using a last-generation air quality dispersion model that includes the transport, scavenging and deposition processes for BaP. The degradation of particulate BaP by ozone has been considered. The aerosol-gas partitioning phenomenon in the atmosphere is modelled taking into account the concentrations in the gas and aerosol phases; if the pre-existing organic aerosol concentration is zero, gas/particle equilibrium is established. The model has been validated at the local scale with data from a sampling campaign carried out in the area of Zaragoza (Spain) during 12 weeks.

  13. Conceptual Model of Offshore Wind Environmental Risk Evaluation System

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Richard M.; Copping, Andrea E.; Van Cleve, Frances B.; Unwin, Stephen D.; Hamilton, Erin L.

    2010-06-01

    In this report we describe the development of the Environmental Risk Evaluation System (ERES), a risk-informed analytical process for estimating the environmental risks associated with the construction and operation of offshore wind energy generation projects. The development of ERES for offshore wind is closely allied to a concurrent process undertaken to examine environmental effects of marine and hydrokinetic (MHK) energy generation, although specific risk-relevant attributes will differ between the MHK and offshore wind domains. During FY10, a conceptual design of ERES for offshore wind will be developed. The offshore wind ERES mockup described in this report will provide a preview of the functionality of a fully developed risk evaluation system that will use risk assessment techniques to determine priority stressors on aquatic organisms and environments from specific technology aspects, identify key uncertainties underlying high-risk issues, compile a wide-range of data types in an innovative and flexible data organizing scheme, and inform planning and decision processes with a transparent and technically robust decision-support tool. A fully functional version of ERES for offshore wind will be developed in a subsequent phase of the project.

  14. Gender and the assessment of at-risk drinking: evidence from the GENACIS Canada (2004-2005) telephone survey version of the AUDIT.

    Science.gov (United States)

    Bernards, Sharon; Graham, Kathryn; Demers, Andrée; Kairouz, Sylvia; Wells, Samantha

    2007-05-11

    The alcohol use disorders identification test (AUDIT) is widely used in general population surveys as a method of determining prevalence of hazardous drinking. However, its interpretation has been questioned particularly regarding the unequal contribution of the items to the total score, specifically, that the drinking frequency item contributes disproportionately to the score and may lead to inappropriate identification of some drinkers as hazardous drinkers. To explore these issues further as well as possible gender differences in the applicability of the AUDIT, we conducted analyses using a modified version of the AUDIT (AUDIT(M)) as part of a general population survey that used random digit dialing and computer-assisted telephone interviewing. Item and factor analyses were performed separately for men and women, and the impacts of excluding the frequency of drinking item in the measurement of mean scores, percentages and types of problems for men and women were examined. We found that the AUDIT(M) items loaded onto three distinct dimensions for both men and women: frequency of drinking; usual quantity and frequency of heavy-episodic drinking; problem consequences from drinking. In addition, we found that excluding the frequency question may give a more meaningful estimate of the percent of drinkers actually at risk of experiencing problems from drinking for both men and women. Finally, although our analyses identified only minor gender differences in the structure of the AUDIT and good sensitivity for identifying problem drinkers among both men and women, significant gender differences in the types of problems experienced suggest that use and interpretation of the AUDIT should routinely take gender into consideration.
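
    As a simple illustration of the scoring issue examined above, the Python sketch below computes a standard 10-item AUDIT total (each item scored 0-4) with and without the drinking-frequency item; the response pattern is hypothetical.

      def audit_scores(items):
          """items: 10 integer responses (0-4); item 0 is drinking frequency."""
          total = sum(items)
          return total, total - items[0]    # with and without the frequency item

      # Hypothetical frequent but light drinker: the frequency item alone
      # contributes half of the total score.
      responses = [4, 1, 2, 0, 0, 0, 1, 0, 0, 0]
      print(audit_scores(responses))        # -> (8, 4)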

  15. Coupling of the VAMPER permafrost model within the earth system model iLOVECLIM (version 1.0): description and validation

    Directory of Open Access Journals (Sweden)

    D. Kitover

    2014-11-01

    Full Text Available The VAMPER permafrost model has been enhanced for coupling within the iLOVECLIM earth system model of intermediate complexity by including snow thickness and active layer calculations. In addition, the coupling between iLOVECLIM and the VAMPER model includes two spatially variable maps of geothermal heat flux and generalized lithology. A semi-coupled version is validated using the modern-day extent of permafrost along with observed permafrost thickness and subsurface temperatures at selected borehole sites. The modeling run that does not include the effects of snow cover overestimates the present permafrost extent; however, when the snow component is included, the extent is reduced too much overall. It was found that most of the modeled thickness values and subsurface temperatures fall within a reasonable range of the corresponding observed values. Discrepancies are due to effects not captured by the model, such as topography and organic soil layers. In addition, some discrepancy is also due to disequilibrium with the current climate, meaning that some permafrost is a result of colder states and therefore cannot be reproduced accurately with the iLOVECLIM preindustrial forcings.

  16. Managing risks in business model innovation processes

    DEFF Research Database (Denmark)

    Taran, Yariv; Boer, Harry; Lindgren, Peter

    2010-01-01

    Although business model (BM) innovation is a risky enterprise, many companies are still choosing not to apply any risk management in the BM innovation process. The objective of this paper is to develop a better understanding of how risks are handled in the practice of BM innovation. An analysis of the BM innovation experiences of two industrial companies shows that both companies are experiencing high levels of uncertainty and complexity during their innovation processes and are, consequently, struggling to find new processes for handling the risks involved. Based on the two companies' experiences, various testable propositions are put forward, which link success and failure to the way companies appreciate and handle the risks involved in BM innovation.

  17. Including investment risk in large-scale power market models

    DEFF Research Database (Denmark)

    Lemming, Jørgen Kjærgaard; Meibom, P.

    2003-01-01

    Long-term energy market models can be used to examine investments in production technologies; however, with market liberalisation it is crucial that such models include investment risks and investor behaviour. This paper analyses how the effect of investment risk on production technology selection can be included in large-scale partial equilibrium models of the power market. The analyses are divided into a part about risk measures appropriate for power market investors and a more technical part about the combination of a risk-adjustment model and a partial-equilibrium model. To illustrate the analyses quantitatively, a framework based on an iterative interaction between the equilibrium model and a separate risk-adjustment module was constructed. To illustrate the features of the proposed modelling approach, we examined how uncertainty in demand and variable costs affects the optimal choice

  18. Modeling herring population dynamics: herring catch-at-age model version 2 = Modelisation de la dynamique des populations de hareng : modele des captures a l'age de harengs, Version 2

    National Research Council Canada - National Science Library

    Christensen, L.B; Haist, V; Schweigert, J

    2010-01-01

    The herring catch-at-age model (HCAM) is an age-structured stock assessment model developed specifically for Pacific herring which is assumed to be a multi-stock population that has experienced periods of significant fishery impact...

  19. Stochastic models in risk theory and management accounting

    NARCIS (Netherlands)

    Brekelmans, R.C.M.

    2000-01-01

    This thesis deals with stochastic models in two fields: risk theory and management accounting. Firstly, two extensions of the classical risk process are analyzed. A method is developed that computes bounds of the probability of ruin for the classical risk process extended with a constant interest

  2. Model Averaging Software for Dichotomous Dose Response Risk Estimation

    Directory of Open Access Journals (Sweden)

    Matthew W. Wheeler

    2008-02-01

    Full Text Available Model averaging has been shown to be a useful method for incorporating model uncertainty in quantitative risk estimation. In certain circumstances this technique is computationally complex, requiring sophisticated software to carry out the computation. We introduce software that implements model averaging for risk assessment based upon dichotomous dose-response data. This software, which we call Model Averaging for Dichotomous Response Benchmark Dose (MADr-BMD), fits the quantal response models that are also used in the US Environmental Protection Agency benchmark dose software suite, and combines them into a model-averaged dose-response model from which benchmark dose and benchmark dose lower bound estimates are generated. The software fulfills a need for risk assessors, allowing them to go beyond a single model in their risk assessments based on quantal data by focusing on a set of models that describes the experimental data.
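
    To illustrate the kind of computation such software performs, the Python sketch below combines two fitted quantal dose-response curves with Akaike weights; the fitted parameters and log-likelihoods are hypothetical, and the actual software fits a larger model suite and computes BMD/BMDL estimates from the averaged curve.

      import numpy as np
      from scipy.stats import norm

      def akaike_weights(loglik, n_params):
          aic = -2.0 * np.asarray(loglik) + 2.0 * np.asarray(n_params)
          w = np.exp(-0.5 * (aic - aic.min()))
          return w / w.sum()

      # Hypothetical fits of logistic and probit dose-response models
      w = akaike_weights(loglik=[-52.3, -53.1], n_params=[2, 2])

      dose = np.linspace(0.0, 10.0, 101)
      p_logistic = 1.0 / (1.0 + np.exp(-(-2.0 + 0.4 * dose)))   # hypothetical fit
      p_probit = norm.cdf(-1.2 + 0.25 * dose)                   # hypothetical fit

      # Model-averaged response curve, from which a benchmark dose is read off
      p_avg = w[0] * p_logistic + w[1] * p_probit
      print(w, p_avg[::25])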

  3. Data Scaling for Operational Risk Modelling

    NARCIS (Netherlands)

    H.S. Na; L. Couto Miranda; J.H. van den Berg (Jan); M. Leipoldt

    2006-01-01

    In 2004, the Basel Committee on Banking Supervision defined Operational Risk (OR) as the risk of loss resulting from inadequate or failed internal processes, people and systems or from external events. After publication of the new capital accord containing this definition, statistical

  5. [Measuring psychosocial stress at work in Spanish hospital's personnel. Psychometric properties of the Spanish version of Effort-Reward Imbalance model].

    Science.gov (United States)

    Macías Robles, María Dolores; Fernández-López, Juan Antonio; Hernández-Mejía, Radhamés; Cueto-Espinar, Antonio; Rancaño, Iván; Siegrist, Johannes

    2003-05-10

    Two main models are currently used to evaluate psychosocial factors at work: the Demand-Control (or job strain) model developed by Karasek and the Effort-Reward Imbalance model developed by Siegrist. A Spanish version of the first model has been validated, yet so far no validated Spanish version of the second model is available. The objective of this study was to explore the psychometric properties of the Spanish version of the Effort-Reward Imbalance model in terms of internal consistency, factorial validity, and discriminant validity. A cross-sectional study of a representative sample of 298 workers of the Spanish public hospital San Agustin in Asturias was performed. The Spanish version of the Effort-Reward Imbalance Questionnaire (23 items) was obtained by a standard forward/backward translation procedure, and the information was gathered by self-administered application. Exploratory factor analyses were performed to test the dimensional structure of the theoretical model. Cronbach's alpha coefficient was calculated to estimate internal consistency reliability. Information on discriminant validity is given for sex, age and education; differences were calculated with the t-test for two independent samples or ANOVA, respectively. Internal consistency was satisfactory for two scales (reward and intrinsic effort), with Cronbach's alpha coefficients higher than 0.80. The internal consistency of the scale of extrinsic effort was lower (alpha = 0.63). A three-factor solution was retained for the factor analysis of reward, as expected, and these dimensions were interpreted as (a) esteem, (b) job promotion and salary, and (c) job instability. A one-factor solution was retained for the factor analysis of intrinsic effort. The factor analysis of the scale of extrinsic effort did not support the expected one-dimension structure. The analysis of discriminant validity displayed significant associations between measures of Effort-Reward Imbalance and the

  6. PVWatts Version 5 Manual

    Energy Technology Data Exchange (ETDEWEB)

    Dobos, A. P.

    2014-09-01

    The NREL PVWatts calculator is a web application developed by the National Renewable Energy Laboratory (NREL) that estimates the electricity production of a grid-connected photovoltaic system based on a few simple inputs. PVWatts combines a number of sub-models to predict overall system performance, and includes several built-in parameters that are hidden from the user. This technical reference describes the sub-models, documents assumptions and hidden parameters, and explains the sequence of calculations that yield the final system performance estimate. This reference is applicable to the significantly revised version of PVWatts released by NREL in 2014.
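
    As a rough sketch of the kind of calculation chained inside such a tool (not the documented PVWatts algorithm itself), the following Python fragment estimates AC output from plane-of-array irradiance using a temperature-corrected DC model and a fixed inverter efficiency; all parameter values are illustrative assumptions.

      def ac_power(poa, t_cell, p_dc0, gamma=-0.0047, eta_inv=0.96):
          """Simplified PV output estimate.

          poa:     plane-of-array irradiance (W/m2)
          t_cell:  cell temperature (deg C)
          p_dc0:   DC nameplate rating (W)
          gamma:   power temperature coefficient (1/deg C), assumed value
          eta_inv: nominal inverter efficiency, assumed constant
          """
          p_dc = p_dc0 * (poa / 1000.0) * (1.0 + gamma * (t_cell - 25.0))
          return max(p_dc, 0.0) * eta_inv

      print(ac_power(poa=800.0, t_cell=45.0, p_dc0=4000.0))   # roughly 2.8 kW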

  7. House Price Risk Models for Banking and Insurance Applications

    OpenAIRE

    Katja Hanewald; Michael Sherris

    2011-01-01

    The recent international credit crisis has highlighted the significant exposure that banks and insurers, especially mono-line credit insurers, have to residential house price risk. This paper provides an assessment of risk models for residential property for applications in banking and insurance including pricing, risk management, and portfolio management. Risk factors and heterogeneity of house price returns are assessed at a postcode-level for house prices in the major capital city of Sydne...

  8. Modelling risks for electric power markets.

    OpenAIRE

    Pantoja-Robayo, Javier

    2012-01-01

    This paper presents a study of the Forward Risk Premia (FRP) in the Wholesale Electric Power Market in Colombia (WPMC), showing how the FRP varies throughout the day and how its properties are explained by risk factors. It also shows that expected forward risk premia depend on factors such as variations in expected spot prices, due to the climatic conditions generated by the Oceanic Niño Index (ONI) and its impact on the available quantity of water to generate electric power. This document provid...

  9. Groundwater vulnerability and risk mapping using GIS, modeling and a fuzzy logic tool.

    Science.gov (United States)

    Nobre, R C M; Rotunno Filho, O C; Mansur, W J; Nobre, M M M; Cosenza, C A N

    2007-12-07

    A groundwater vulnerability and risk mapping assessment, based on a source-pathway-receptor approach, is presented for an urban coastal aquifer in northeastern Brazil. A modified version of the DRASTIC methodology was used to map the intrinsic and specific groundwater vulnerability of a 292 km² study area. A fuzzy hierarchy methodology was adopted to evaluate the potential contaminant source index, including diffuse and point sources. Numerical modeling was performed for delineation of well capture zones, using MODFLOW and MODPATH. The integration of these elements provided the mechanism to assess groundwater pollution risks and identify areas that must be prioritized in terms of groundwater monitoring and restriction on use. A groundwater quality index based on nitrate and chloride concentrations was calculated, which had a positive correlation with the specific vulnerability index.
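
    For reference, the intrinsic DRASTIC vulnerability index is a weighted sum of seven rated hydrogeologic factors; the Python sketch below uses the conventional weights with hypothetical ratings for a single map cell (the paper applies a modified version of the method).

      # Conventional DRASTIC weights for the seven factors
      WEIGHTS = {"Depth to water": 5, "net Recharge": 4, "Aquifer media": 3,
                 "Soil media": 2, "Topography": 1, "Impact of vadose zone": 5,
                 "hydraulic Conductivity": 3}

      def drastic_index(ratings):
          """ratings: factor -> rating (1-10) for one map cell."""
          return sum(WEIGHTS[f] * r for f, r in ratings.items())

      # Hypothetical ratings for a single cell; a higher index means higher
      # intrinsic vulnerability.
      cell = {"Depth to water": 9, "net Recharge": 6, "Aquifer media": 8,
              "Soil media": 5, "Topography": 10, "Impact of vadose zone": 8,
              "hydraulic Conductivity": 6}
      print(drastic_index(cell))   # -> 171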

  10. Risk management modeling and its application in maritime safety

    Institute of Scientific and Technical Information of China (English)

    QIN Ting-rong; CHEN Wei-jiong; ZENG Xiang-kun

    2008-01-01

    Quantified risk assessment (QRA) needs a mathematicization of risk theory. However, attention has been paid almost exclusively to applications of assessment methods, which has led to neglect of research into fundamental theories, such as the relationships among risk, safety, danger, and so on. In order to solve this problem, as a first step, fundamental theoretical relationships about risk and risk management were analyzed for this paper in the light of mathematics, and then illustrated with some charts. Second, man-machine-environment-management (MMEM) theory was introduced into risk theory to analyze some properties of risk. On the basis of this, a three-dimensional model of risk management was established that includes a goal dimension, a management dimension, and an operation dimension. This goal-management-operation (GMO) model was explained, and then emphasis was laid on the discussion of the risk flowchart (operation dimension), which lays the groundwork for further study of risk management and qualitative and quantitative assessment. Next, the relationship between Formal Safety Assessment (FSA) and risk management was researched. This revealed that the FSA method, which the International Maritime Organization (IMO) is actively spreading, comes from risk management theory. Finally, conclusions were made about how to apply this risk management method to concrete fields efficiently and conveniently, as well as areas where further research is required.

  11. The Nexus Land-Use model version 1.0, an approach articulating biophysical potentials and economic dynamics to model competition for land-use

    Directory of Open Access Journals (Sweden)

    F. Souty

    2012-02-01

    Full Text Available Interactions between food demand, biomass energy and forest preservation are driving both food prices and land-use changes, regionally and globally. This study presents a new model called Nexus Land-Use version 1.0 which describes these interactions through a generic representation of agricultural intensification mechanisms. The Nexus Land-Use model equations combine biophysics and economics into a single coherent framework to calculate crop yields, food prices, and resulting pasture and cropland areas within 12 regions inter-connected with each other by international trade. The representation of cropland and livestock production systems in each region relies on three components: (i) a biomass production function derived from the crop yield response function to inputs such as industrial fertilisers; (ii) a detailed representation of the livestock production system, subdivided into an intensive and an extensive component; and (iii) a spatially explicit distribution of potential (maximal) crop yields prescribed from the Lund-Potsdam-Jena global vegetation model for managed land (LPJmL). The economic principles governing decisions about land-use and intensification are adapted from the Ricardian rent theory, assuming cost minimisation for farmers. The land-use modelling approach described in this paper entails several advantages. Firstly, it makes it possible to explore interactions among different types of biomass demand for food and animal feed, in a consistent approach, including indirect effects on land-use change resulting from international trade. Secondly, yield variations induced by the possible expansion of croplands on less suitable marginal lands are modelled by using regional land area distributions of potential yields, and a calculated boundary between intensive and extensive production. The model equations and parameter values are first described in detail. Then, idealised scenarios exploring the impact of forest preservation policies or

  12. Model uncertainty and systematic risk in US banking

    NARCIS (Netherlands)

    Baele, L.T.M.; De Bruyckere, Valerie; De Jonghe, O.G.; Vander Vennet, Rudi

    2015-01-01

    This paper uses Bayesian Model Averaging to examine the driving factors of equity returns of US Bank Holding Companies. BMA has an advantage over OLS in that it accounts for the considerable uncertainty about the correct set (model) of bank risk factors. We find that out of a broad set of 12 risk fa

  13. Tests of control in the Audit Risk Model : Effective? Efficient?

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans)

    2004-01-01

    Lately, the Audit Risk Model has been subject to criticism. To gauge its validity, this paper confronts the Audit Risk Model as incorporated in International Standard on Auditing No. 400, with the real life situations faced by auditors in auditing financial statements. This confrontation exposes ser

  16. Radiation risk estimation based on measurement error models

    CERN Document Server

    Masiuk, Sergii; Shklyar, Sergiy; Chepurny, Mykola; Likhtarov, Illya

    2017-01-01

    This monograph discusses statistics and risk estimates applied to radiation damage in the presence of measurement errors. The first part covers nonlinear measurement error models, with a particular emphasis on the efficiency of regression parameter estimators. In the second part, risk estimation in models with measurement errors is considered. The efficiency of the methods presented is verified using data from radio-epidemiological studies.

  17. AERONET Version 3 processing

    Science.gov (United States)

    Holben, B. N.; Slutsker, I.; Giles, D. M.; Eck, T. F.; Smirnov, A.; Sinyuk, A.; Schafer, J.; Rodriguez, J.

    2014-12-01

    The Aerosol Robotic Network (AERONET) database has evolved in measurement accuracy, data quality products, and availability to the scientific community over the course of 21 years with the support of NASA, PHOTONS and all federated partners. This evolution is periodically manifested as a new data version release, by carefully reprocessing the entire database with the most current algorithms that fundamentally change the database and ultimately the data products used by the community. The newest processing, Version 3, will be released in 2015 after the entire database is reprocessed and real-time data processing becomes operational. All V3 algorithms have been developed and individually vetted, and represent four main categories: aerosol optical depth (AOD) processing, inversion processing, database management and new products. The primary trigger for the release of V3 lies with cloud screening of the direct sun observations and computation of AOD, which will fundamentally change all data available for analysis and all subsequent retrieval products. This presentation will illustrate the innovative approach used for cloud screening and assess the elements of V3 AOD relative to the current version. We will also present the advances in inversion product processing, with emphasis on the random and systematic uncertainty estimates. This processing will be applied to the new hybrid measurement scenario intended to provide inversion retrievals for all solar zenith angles. We will introduce automatic quality assurance criteria that will allow near-real-time quality-assured aerosol products necessary for real-time satellite and model validation and assimilation. Lastly, we will introduce the new management structure that will improve access to the database. The current Version 2 will be supported for at least two years after the initial release of V3 to maintain continuity for ongoing investigations.

  18. Development and analysis of some versions of the fractional-order point reactor kinetics model for a nuclear reactor with slab geometry

    Science.gov (United States)

    Vyawahare, Vishwesh A.; Nataraj, P. S. V.

    2013-07-01

    In this paper, we report the development and analysis of some novel versions and approximations of the fractional-order (FO) point reactor kinetics model for a nuclear reactor with slab geometry. A systematic development of the FO Inhour equation, the inverse FO point reactor kinetics model, and FO versions of the constant delayed neutron rate approximation model and the prompt jump approximation model is presented for the first time (for both one delayed group and six delayed groups). These models evolve from the FO point reactor kinetics model, which has been derived from the FO neutron telegraph equation for neutron transport, considering subdiffusive neutron transport. Various observations and analysis results are reported, and the corresponding justifications are addressed using the subdiffusive framework for neutron transport. The FO Inhour equation is found to be a pseudo-polynomial whose degree depends on the order of the fractional derivative in the FO model. The inverse FO point reactor kinetics model is derived and used to find the reactivity variation required to achieve exponential and sinusoidal power variation in the core. The situation of sudden insertion of negative reactivity is analyzed using the FO constant delayed neutron rate approximation. Use of the FO model for representing the prompt jump in reactor power is advocated on the basis of subdiffusion. Comparison with the respective integer-order models is carried out for practical data. Also, it has been shown analytically that integer-order models are a special case of FO models in which the order of the time derivative is one. The development of these FO models plays a crucial role in reactor theory and operation, as it is the first step towards achieving an FO control-oriented model for a nuclear reactor. The results presented here form an important step in the efforts to establish a step-by-step and systematic theory for the FO modeling of a nuclear reactor.
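
    Fractional time derivatives of the kind used in these models are commonly discretized with the Grünwald-Letnikov scheme; the Python sketch below evaluates a GL half-derivative of f(t) = t and checks it against the known closed form 2*sqrt(t/pi). It is a generic numerical illustration, not the paper's own solver.

      import numpy as np

      def gl_derivative(f, alpha, h):
          """Grünwald-Letnikov fractional derivative on a uniform grid.

          f: function samples with spacing h; uses all points to the left.
          """
          n = len(f)
          # GL coefficients w_k = (-1)^k C(alpha, k), via a stable recurrence
          w = np.empty(n)
          w[0] = 1.0
          for k in range(1, n):
              w[k] = w[k - 1] * (k - 1 - alpha) / k
          out = np.array([np.dot(w[: i + 1], f[i::-1]) for i in range(n)])
          return out / h ** alpha

      t = np.linspace(0.0, 1.0, 201)
      d = gl_derivative(t, alpha=0.5, h=t[1] - t[0])
      print(d[-1], 2.0 * np.sqrt(1.0 / np.pi))   # numerical vs exact at t = 1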

  20. The Nexus Land-Use model version 1.0, an approach articulating biophysical potentials and economic dynamics to model competition for land-use

    Science.gov (United States)

    Souty, F.; Brunelle, T.; Dumas, P.; Dorin, B.; Ciais, P.; Crassous, R.; Müller, C.; Bondeau, A.

    2012-10-01

    Interactions between food demand, biomass energy and forest preservation are driving both food prices and land-use changes, regionally and globally. This study presents a new model called Nexus Land-Use version 1.0 which describes these interactions through a generic representation of agricultural intensification mechanisms within agricultural lands. The Nexus Land-Use model equations combine biophysics and economics into a single coherent framework to calculate crop yields, food prices, and resulting pasture and cropland areas within 12 regions inter-connected with each other by international trade. The representation of cropland and livestock production systems in each region relies on three components: (i) a biomass production function derived from the crop yield response function to inputs such as industrial fertilisers; (ii) a detailed representation of the livestock production system subdivided into an intensive and an extensive component, and (iii) a spatially explicit distribution of potential (maximal) crop yields prescribed from the Lund-Potsdam-Jena global vegetation model for managed Land (LPJmL). The economic principles governing decisions about land-use and intensification are adapted from the Ricardian rent theory, assuming cost minimisation for farmers. In contrast to the other land-use models linking economy and biophysics, crops are aggregated as a representative product in calories and intensification for the representative crop is a non-linear function of chemical inputs. The model equations and parameter values are first described in detail. Then, idealised scenarios exploring the impact of forest preservation policies or rising energy price on agricultural intensification are described, and their impacts on pasture and cropland areas are investigated.

  1. Assessment model of dam operation risk based on monitoring data

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Although dams produce remarkable social and economic benefits, the threat posed by unsafe dams to the life and property of people living in downstream areas is not negligible. Based on the monitoring data that reflect the safety condition of dams, the risk degree concept is proposed, and the analysis system and model for evaluating risk degree (rate) are established in this paper by combining reliability theory and field monitoring data. The analysis method for risk degree is presented based on a Bayesian approach. A five-grade system for dam operation risk and the corresponding risk degrees is put forward according to the safety condition of dams. The operation risks of four cascade dams on a river are analyzed with the model and approach presented here, and the result has been adopted by the owner.

  2. Risk Classification Model for Design and Build Projects

    Directory of Open Access Journals (Sweden)

    O. E. Ogunsanmi

    2011-07-01

    Full Text Available The purpose of this paper is to investigate whether the various risk sources in Design and Build projects can be classified into three risk groups of cost, time and quality using the discriminant analysis technique. A literature search was undertaken to review issues of risk sources, classification of the identified risks into a risk structure, management of risks and effects of risks, all on Design and Build projects, as well as concepts of discriminant analysis as a statistical technique. This literature review was undertaken through the use of the internet, published papers, journal articles and other published reports on risks in Design and Build projects. A research questionnaire was further designed to collect research information. This study is survey research that utilized a cross-sectional design to capture the primary data. The data for the survey were collected in Nigeria. In all, 40 questionnaires were sent to various respondents, including Architects, Engineers, Quantity Surveyors and Builders, who had used the Design and Build procurement method for their recently completed projects. Responses from the retrieved questionnaires, which measured the impact of risks on Design and Build, were analyzed using the discriminant analysis technique through the use of the SPSS software package to build two discriminant models for classifying risks into cost, time and quality risk groups. Results of the study indicate that time overrun and poor quality are the two factors that discriminate between cost-, time- and quality-related risk groups. These two discriminant functions explain the variation between the risk groups. All the discriminating variables of cost overrun, time overrun and poor quality demonstrate some relationship with the two discriminant functions. The two discriminant models built can classify risks in Design and Build projects into risk groups of cost, time and quality. These classification models have a 72% success rate of classification
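
    As an illustration of the discriminant-analysis step, the Python sketch below fits a linear discriminant classifier that assigns a project to a cost-, time- or quality-related risk group from overrun and quality measurements; the training data are fabricated for illustration, not the survey data.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      # Fabricated projects: [cost overrun %, time overrun %, defect score]
      X = np.array([[25, 5, 1], [30, 8, 2], [28, 6, 1],
                    [5, 30, 1], [8, 35, 2], [7, 32, 2],
                    [4, 6, 9], [6, 4, 8], [5, 5, 9]])
      y = np.array(["cost", "cost", "cost", "time", "time", "time",
                    "quality", "quality", "quality"])

      lda = LinearDiscriminantAnalysis().fit(X, y)
      print(lda.predict([[20, 10, 2]]))   # dominant risk group of a new project
      print(lda.score(X, y))              # in-sample classification success rate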

  3. A Team Mental Model Perspective of Pre-Quantitative Risk

    Science.gov (United States)

    Cooper, Lynne P.

    2011-01-01

    This study was conducted to better understand how teams conceptualize risk before it can be quantified, and the processes by which a team forms a shared mental model of this pre-quantitative risk. Using an extreme case, this study analyzes seven months of team meeting transcripts, covering the entire lifetime of the team. Through an analysis of team discussions, a rich and varied structural model of risk emerges that goes significantly beyond classical representations of risk as the product of a negative consequence and a probability. In addition to those two fundamental components, the team conceptualization includes the ability to influence outcomes and probabilities, networks of goals, interaction effects, and qualitative judgments about the acceptability of risk, all affected by associated uncertainties. In moving from individual to team mental models, team members employ a number of strategies to gain group recognition of risks and to resolve or accept differences.

  4. The Application of Asymmetric Liquidity Risk Measure in Modelling the Risk of Investment

    Directory of Open Access Journals (Sweden)

    Garsztka Przemysław

    2015-06-01

    Full Text Available The article analyses the relationship between investment risk (as measured by the variance or standard deviation of returns) and liquidity risk. The paper presents a method for calculating a new measure of liquidity risk based on the characteristic line. In addition, the impact of liquidity risk on the volatility of daily returns is examined. Dynamic econometric models were used to describe this relationship. It was found that there is an econometric relationship between the proposed liquidity risk measure and the variance of returns.
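
    The characteristic line itself is the standard OLS regression of an asset's returns on market returns; the liquidity adjustment proposed in the paper is not reproduced here. A minimal sketch of fitting the characteristic line, with simulated rather than real return series:

```python
# Sketch: fit the characteristic line r_asset = alpha + beta * r_market + eps.
# The paper builds its liquidity risk measure on this line; only the standard
# regression step is shown, with simulated (not real) daily returns.
import numpy as np

rng = np.random.default_rng(0)
r_market = rng.normal(0.0003, 0.01, 250)           # one trading year
r_asset = 0.0001 + 1.2 * r_market + rng.normal(0, 0.008, 250)

beta, alpha = np.polyfit(r_market, r_asset, 1)     # slope, intercept
residuals = r_asset - (alpha + beta * r_market)
print(f"alpha={alpha:.5f}, beta={beta:.3f}, "
      f"residual variance={residuals.var():.2e}")
```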

  5. Fuzzy MCDM Model for Risk Factor Selection in Construction Projects

    Directory of Open Access Journals (Sweden)

    Pejman Rezakhani

    2012-11-01

    Full Text Available Risk factor selection is an important step in a successful risk management plan. There are many risk factors in a construction project, and an effective and systematic selection process allows the most critical risks to be singled out for closer attention. In this paper, through a comprehensive literature survey, the most significant risk factors in a construction project are classified in a hierarchical structure. For effective risk factor selection, a modified rational multi-criteria decision making (MCDM) model is developed. This model is a consensus-rule-based model with the optimization property of rational models. By applying fuzzy logic to this model, uncertainty factors in group decision making, such as experts' influence weights and their preferences and judgments for the risk selection criteria, are assessed. An intelligent checking process verifies the logical consistency of experts' preferences during decision making. The solution inferred from this method enjoys the highest degree of acceptance among group members, and the consistency of individual preferences is checked by inference rules. This is an efficient and effective approach to prioritizing and selecting risks based on decisions made by a group of experts in construction projects. The applicability of the presented method is assessed through a case study.
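
    The abstract does not give the aggregation rule; a common fuzzy-MCDM building block is the weighted aggregation of triangular fuzzy ratings followed by centroid defuzzification, sketched below. The ratings, expert weights and scale are illustrative assumptions.

```python
# Hypothetical sketch: aggregate expert ratings of one risk factor, expressed
# as triangular fuzzy numbers (low, mode, high), weighted by assumed expert
# influence weights, then defuzzify by centroid. Values are illustrative.
import numpy as np

ratings = np.array([   # three experts rate one risk factor on a 0-10 scale
    [5.0, 6.0, 8.0],
    [4.0, 5.0, 7.0],
    [6.0, 7.0, 9.0],
])
weights = np.array([0.5, 0.3, 0.2])    # assumed influence weights, sum to 1

fuzzy_agg = weights @ ratings          # weighted triangular fuzzy number
crisp = fuzzy_agg.mean()               # centroid of a triangular number
print("aggregated fuzzy rating:", fuzzy_agg, "-> crisp score:", round(crisp, 2))
```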

  6. Driving Strategic Risk Planning With Predictive Modelling For Managerial Accounting

    DEFF Research Database (Denmark)

    Nielsen, Steen; Pontoppidan, Iens Christian

    Currently, risk management in management/managerial accounting is treated as deterministic. Although it is well known that risk estimates are necessarily uncertain or stochastic, until recently the methodology required to handle stochastic risk-based elements appeared impractical and too mathematical. The ultimate purpose of this paper is to “make the risk concept procedural and analytical” and to argue that accountants should now include stochastic risk management as a standard tool. Drawing on mathematical modelling and statistics, the paper methodically develops a risk analysis approach for managerial accounting and shows how it can be used to determine the impact of different types of risk assessment input parameters on the variability of important outcome measures. The purpose is to: (i) point out the theoretical necessity of a stochastic risk framework; (ii) present a stochastic framework...
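
    The framework is described only abstractly here; the following is a minimal sketch of the general idea of propagating stochastic risk inputs through to the variability of an accounting outcome. All distributions and the profit relation are invented for illustration.

```python
# Illustrative Monte Carlo sketch: treat accounting risk inputs as distributions
# instead of point estimates and observe the induced variability of profit.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
sales_volume = rng.normal(10_000, 1_500, n)        # stochastic demand
unit_margin = rng.triangular(4.0, 5.0, 7.0, n)     # uncertain contribution margin
fixed_costs = rng.normal(30_000, 2_000, n)         # uncertain overhead

profit = sales_volume * unit_margin - fixed_costs
print(f"mean profit: {profit.mean():,.0f}")
print(f"5th-95th percentile: {np.percentile(profit, [5, 95])}")
print(f"P(loss) = {(profit < 0).mean():.4f}")
```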

  7. Users' manual for LEHGC: A Lagrangian-Eulerian Finite-Element Model of Hydrogeochemical Transport Through Saturated-Unsaturated Media. Version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Yeh, Gour-Tsyh [Pennsylvania State Univ., University Park, PA (United States). Dept. of Civil and Environmental Engineering; Carpenter, S.L. [Univ. of New Mexico, Albuquerque, NM (United States). Dept. of Earth and Planetary Sciences; Hopkins, P.L.; Siegel, M.D. [Sandia National Labs., Albuquerque, NM (United States)

    1995-11-01

    The computer program LEHGC is a Hybrid Lagrangian-Eulerian Finite-Element Model of HydroGeo-Chemical (LEHGC) Transport Through Saturated-Unsaturated Media. LEHGC iteratively solves two-dimensional transport and geochemical equilibrium equations and is a descendant of HYDROGEOCHEM, a strictly Eulerian finite-element reactive transport code. The hybrid Lagrangian-Eulerian scheme improves on the Eulerian scheme by allowing larger time steps to be used in advection-dominant transport calculations. This causes less numerical dispersion and alleviates the problem of calculated negative concentrations at sharp concentration fronts. The code is also more computationally efficient than the strictly Eulerian version. LEHGC is designed for generic application to reactive transport problems associated with contaminant transport in subsurface media. Input to the program includes the geometry of the system, the spatial distribution of finite elements and nodes, the properties of the media, the potential chemical reactions, and the initial and boundary conditions. Output includes the spatial distribution of chemical element concentrations as a function of time and space and the chemical speciation at user-specified nodes. LEHGC Version 1.1 is a modification of LEHGC Version 1.0. The modifications include: (1) devising a tracking algorithm with computational effort proportional to N, where N is the number of computational grid nodes, rather than N² as in LEHGC Version 1.0, (2) including multiple adsorbing sites and multiple ion-exchange sites, (3) using four preconditioned conjugate gradient methods for the solution of matrix equations, and (4) providing a model for some features of solute transport by colloids.
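
    The advantage claimed for the hybrid scheme, larger stable time steps in advection-dominated transport, can be illustrated with a toy 1D semi-Lagrangian step; this is a generic illustration of the particle-tracking idea, not code from LEHGC.

```python
# Toy 1D illustration of the Lagrangian (backward-tracking) idea behind LEHGC:
# a semi-Lagrangian advection step remains stable at Courant numbers > 1,
# where an explicit Eulerian upwind step would not. Generic sketch only.
import numpy as np

nx, dx, u = 200, 1.0, 2.5
dt = 1.0                                    # Courant number u*dt/dx = 2.5 (> 1)
x = np.arange(nx) * dx
c = np.exp(-0.5 * ((x - 50) / 5.0) ** 2)    # initial concentration front

def semi_lagrangian_step(c):
    # Track each node back along the flow and interpolate the departure value.
    x_dep = (x - u * dt) % (nx * dx)        # periodic domain for simplicity
    return np.interp(x_dep, x, c, period=nx * dx)

for _ in range(20):
    c = semi_lagrangian_step(c)
print("peak after 20 steps:", c.max())      # close to 1; mild interpolation damping
```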

  8. Enterprise Projects Set Risk Element Transmission Chaotic Genetic Model

    Directory of Open Access Journals (Sweden)

    Cunbin Li

    2012-08-01

    Full Text Available In order to study the risk transfer process in project sets and improve the efficiency of risk management in project management, this paper combines chaos theory with a genetic algorithm and puts forward an enterprise project-set risk element transmission chaotic genetic model. A hybrid chaotic mapping system is constructed by mixing the logistic chaotic map and the Chebyshev chaotic map. The steps of the genetic operation based on the hybrid chaotic mapping include project-set initialization, fitness calculation, selection, crossover and mutation operators, fitness adjustment, and termination checking. The results show that the model simulates the enterprise project-set risk transmission process very well and provides a basis for enterprise managers to make decisions.
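
    The paper's exact mixing rule is not given in the abstract; the sketch below simply alternates the two named maps to generate a chaotic sequence for seeding a GA population. The alternation scheme and parameters are assumptions for illustration.

```python
# Sketch of a hybrid chaotic sequence from the two maps named in the abstract:
# logistic map x -> 4x(1-x) and Chebyshev map x -> cos(k*arccos(x)).
import math

def hybrid_chaotic_sequence(x0=0.37, n=10, k=4):
    x, seq = x0, []
    for i in range(n):
        if i % 2 == 0:
            x = 4.0 * x * (1.0 - x)             # logistic map on [0, 1]
        else:
            y = 2.0 * x - 1.0                   # rescale to [-1, 1]
            y = math.cos(k * math.acos(y))      # Chebyshev map
            x = (y + 1.0) / 2.0                 # back to [0, 1]
        seq.append(x)
    return seq

# Chaotic values can seed a GA population spread over the search space.
lo, hi = 0.0, 100.0
population = [lo + v * (hi - lo) for v in hybrid_chaotic_sequence(n=8)]
print(population)
```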

  9. Recursive inter-generational utility in global climate risk modeling

    Energy Technology Data Exchange (ETDEWEB)

    Minh, Ha-Duong [Centre International de Recherche sur l' Environnement et le Developpement (CIRED-CNRS), 75 - Paris (France); Treich, N. [Institut National de Recherches Agronomiques (INRA-LEERNA), 31 - Toulouse (France)

    2003-07-01

    This paper distinguishes between relative risk aversion and resistance to inter-temporal substitution in climate risk modeling. Stochastic recursive preferences are introduced in a stylized numeric climate-economy model using preliminary IPCC 1998 scenarios. It is shown that higher risk aversion increases the optimal carbon tax. Higher resistance to inter-temporal substitution alone has the same effect as increasing the discount rate, provided that the risk is not too large. We discuss the implications of these findings for the debate on discounting and sustainability under uncertainty. (author)
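
    The abstract does not spell out the recursion; stochastic recursive preferences of this kind are commonly written in Epstein-Zin form, a plausible reading of the setup (the paper's exact specification may differ):

```latex
% Epstein-Zin recursive utility: gamma (relative risk aversion) and
% 1/rho (intertemporal substitution) are separated, unlike in
% time-additive expected utility, where they coincide.
U_t = \left[ (1-\beta)\, c_t^{1-\rho}
      + \beta \left( \mathbb{E}_t\!\left[ U_{t+1}^{1-\gamma} \right]
      \right)^{\frac{1-\rho}{1-\gamma}} \right]^{\frac{1}{1-\rho}}
```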

  10. Technical report series on global modeling and data assimilation. Volume 5: Documentation of the ARIES/GEOS dynamical core, version 2

    Science.gov (United States)

    Suarez, Max J. (Editor); Takacs, Lawrence L.

    1995-01-01

    A detailed description of the numerical formulation of Version 2 of the ARIES/GEOS 'dynamical core' is presented. This code is a nearly 'plug-compatible' dynamics for use in atmospheric general circulation models (GCMs). It is a finite difference model on a staggered latitude-longitude C-grid. It uses second-order differences for all terms except the advection of vorticity by the rotation part of the flow, which is done at fourth-order accuracy. This dynamical core is currently being used in the climate (ARIES) and data assimilation (GEOS) GCMs at Goddard.
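
    As a generic illustration of the accuracy difference the abstract mentions (second-order for most terms, fourth-order for the vorticity advection), here are the standard centered-difference stencils on a 1D grid; this is not code from the dynamical core.

```python
# Generic 1D comparison of second- vs fourth-order centered differences,
# the two accuracies used in the ARIES/GEOS dynamical core. Not model code.
import numpy as np

dx = 0.1
x = np.arange(-0.5, 0.5 + dx, dx)
f = np.sin(x)

i = len(x) // 2   # interior point x = 0, exact derivative cos(0) = 1
d2 = (f[i + 1] - f[i - 1]) / (2 * dx)                                  # O(dx^2)
d4 = (-f[i + 2] + 8 * f[i + 1] - 8 * f[i - 1] + f[i - 2]) / (12 * dx)  # O(dx^4)
print(f"2nd-order error: {abs(d2 - 1):.2e}")   # ~1.7e-3
print(f"4th-order error: {abs(d4 - 1):.2e}")   # ~3.3e-6
```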

  11. Modelling of Radiological Health Risks from Gold Mine Tailings in Wonderfonteinspruit Catchment Area, South Africa

    Directory of Open Access Journals (Sweden)

    Manny Mathuthu

    2016-06-01

    Full Text Available Mining is one of the major causes of elevation of naturally-occurring radionuclide material (NORM) concentrations on the Earth’s surface. The aim of this study was to evaluate the human risk associated with exposure to NORMs in soils from mine tailings around a gold mine. A broad-energy germanium detector was used to measure activity concentrations of these NORMs in 66 soil samples (56 from five mine tailings and 10 from the control area). The RESidual RADioactivity (RESRAD) OFFSITE modeling program (version 3.1) was then used to estimate the radiation doses and the cancer morbidity risk of uranium-238 (²³⁸U), thorium-232 (²³²Th), and potassium-40 (⁴⁰K) for a hypothetical resident scenario. According to the RESRAD prediction, the maximum total effective dose equivalent (TEDE) during 100 years was found to be 0.0315 mSv/year at year 30, while the maximum total excess cancer morbidity risk for all the pathways was 3.04 × 10⁻⁵ at year 15. The US Environmental Protection Agency considers acceptable for regulatory purposes a cancer risk in the range of 10⁻⁶ to 10⁻⁴. Therefore, the results obtained from the RESRAD OFFSITE code have shown that the health risk from gold mine tailings is within acceptable levels according to international standards.

  12. Modelling of Radiological Health Risks from Gold Mine Tailings in Wonderfonteinspruit Catchment Area, South Africa

    Science.gov (United States)

    Mathuthu, Manny; Kamunda, Caspah; Madhuku, Morgan

    2016-01-01

    Mining is one of the major causes of elevation of naturally-occurring radionuclide material (NORM) concentrations on the Earth’s surface. The aim of this study was to evaluate the human risk associated with exposure to NORMs in soils from mine tailings around a gold mine. A broad-energy germanium detector was used to measure activity concentrations of these NORMs in 66 soil samples (56 from five mine tailings and 10 from the control area). The RESidual RADioactivity (RESRAD) OFFSITE modeling program (version 3.1) was then used to estimate the radiation doses and the cancer morbidity risk of uranium-238 (²³⁸U), thorium-232 (²³²Th), and potassium-40 (⁴⁰K) for a hypothetical resident scenario. According to the RESRAD prediction, the maximum total effective dose equivalent (TEDE) during 100 years was found to be 0.0315 mSv/year at year 30, while the maximum total excess cancer morbidity risk for all the pathways was 3.04 × 10⁻⁵ at year 15. The US Environmental Protection Agency considers acceptable for regulatory purposes a cancer risk in the range of 10⁻⁶ to 10⁻⁴. Therefore, the results obtained from the RESRAD OFFSITE code have shown that the health risk from gold mine tailings is within acceptable levels according to international standards. PMID:27338424

  13. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    Science.gov (United States)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2017-06-01

    In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced with a shift in methodological focus from reliability and probabilities (expected values) to reliability, uncertainty and risk. In this paper the authors explain a novel methodology for quantifying risk and ranking critical items to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, besides other factors. The third, the hazardous risk coefficient, covers hazards anticipated to occur in the future; its risk is deduced from criteria of consequences on safety, environment, maintenance and economic risks, with corresponding costs for the consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence 'random number simulation' is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of the critical items are then estimated. The prioritized ranking of critical items using the developed mathematical model for risk assessment should be useful in optimizing financial losses and the timing of maintenance actions.
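
    The 'random number simulation' step is only named in the abstract; below is a minimal sketch of the general idea, drawing each risk coefficient from an assumed range, combining them additively, and ranking items. All ranges, items and the combination rule are invented for illustration.

```python
# Illustrative sketch: draw the three risk coefficients per critical item from
# assumed ranges, reduce each to one distinctive value, combine, and rank.
import numpy as np

rng = np.random.default_rng(7)
items = {  # (likelihood, abstract, hazardous) coefficient ranges per item
    "bearing":  [(0.2, 0.5), (0.1, 0.4), (0.3, 0.6)],
    "gearbox":  [(0.4, 0.7), (0.2, 0.5), (0.1, 0.3)],
    "coupling": [(0.1, 0.3), (0.1, 0.2), (0.1, 0.2)],
}

final = {}
for item, ranges in items.items():
    draws = [rng.uniform(lo, hi, 10_000) for lo, hi in ranges]
    final[item] = sum(d.mean() for d in draws)   # one distinctive value each

for rank, (item, risk) in enumerate(sorted(final.items(),
                                           key=lambda kv: -kv[1]), 1):
    print(rank, item, round(risk, 3))
```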

  14. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    Science.gov (United States)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2016-03-01

    In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced with a shift in methodological focus from reliability and probabilities (expected values) to reliability, uncertainty and risk. In this paper the authors explain a novel methodology for quantifying risk and ranking critical items to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, besides other factors. The third, the hazardous risk coefficient, covers hazards anticipated to occur in the future; its risk is deduced from criteria of consequences on safety, environment, maintenance and economic risks, with corresponding costs for the consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence 'random number simulation' is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of the critical items are then estimated. The prioritized ranking of critical items using the developed mathematical model for risk assessment should be useful in optimizing financial losses and the timing of maintenance actions.

  15. Model of Information Security Risk Assessment based on Improved Wavelet Neural Network

    Directory of Open Access Journals (Sweden)

    Gang Chen

    2013-09-01

    Full Text Available This paper concentrates on an information security risk assessment model utilizing an improved wavelet neural network. The structure of the wavelet neural network is similar to that of a multi-layer neural network: a feed-forward network with one or more inputs. The training process of the wavelet neural network consists of four steps, repeated until the value of the error function satisfies a pre-defined error criterion. In order to enhance the quality of information security risk assessment, we propose a modified version of the wavelet neural network which can effectively combine all influencing factors in assessing information security risk by linearly integrating several weights. The proposed wavelet neural network is trained by the BP algorithm in batch mode, and the weight coefficients of the wavelet are modified with the adopting mode. Finally, a series of experiments is conducted for performance evaluation. The experimental results show that the proposed model can assess information security risk accurately and rapidly.
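
    The abstract does not specify the mother wavelet; a common choice in wavelet neural networks is the Morlet wavelet, used in this minimal forward-pass sketch. The architecture, weights and input factors are illustrative assumptions, not the paper's model.

```python
# Minimal forward pass of a wavelet neural network with Morlet activations.
import numpy as np

def morlet(t):
    return np.cos(1.75 * t) * np.exp(-t**2 / 2.0)

rng = np.random.default_rng(1)
n_in, n_hidden = 4, 6
W = rng.normal(size=(n_hidden, n_in))   # input-to-hidden weights
a = np.ones(n_hidden)                   # wavelet dilation parameters
b = np.zeros(n_hidden)                  # wavelet translation parameters
v = rng.normal(size=n_hidden)           # hidden-to-output weights

def forward(x):
    z = W @ x                           # weighted sum per hidden wavelon
    h = morlet((z - b) / a)             # wavelet activation
    return v @ h                        # linear output: risk score

x = np.array([0.7, 0.3, 0.9, 0.5])      # normalized risk factor inputs
print("assessed risk score:", forward(x))
```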

  16. Injury count model for quantification of risk of occupational injury.

    Science.gov (United States)

    Khanzode, Vivek V; Maiti, J; Ray, P K

    2011-06-01

    Reduction of the risk of occupational injuries is one of the most challenging problems faced by industry. Assessing and comparing the risks involved in different jobs is an important step towards reducing injury risk. In this study, a comprehensive scheme is given for assessing and comparing injury risks through the development of an injury count model, an injury risk model and derived statistics. The hazards present in a work system and the nature of the job carried out by workers are perceived as important drivers of the injury potential of a work system. A loglinear model is used to quantify injury counts, and the event-tree approach with joint, marginal and conditional probabilities is used to quantify injury risk. A case study was carried out in an underground coal mine. Finally, a number of indices are proposed for the case-study mine to capture the risk of injury in different jobs. The findings of this study will help in designing injury intervention strategies for the mine studied. The job-wise risk profiles will be used to prioritise the jobs for redesign. The absolute indices can be applied for benchmarking job-wise risks, and the relative indices can be used for comparing job-wise risks across work systems.
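
    As a generic illustration of the loglinear count modelling named here (not the paper's fitted model), a Poisson regression of injury counts on job and hazard factors, with invented data:

```python
# Generic sketch of a loglinear (Poisson) injury count model of the kind the
# study uses: counts modelled on job type and hazard level. Data are invented.
import numpy as np
import statsmodels.api as sm

# Invented mine data: columns are [intercept, hazard_level, job_drilling, job_haulage]
X = np.array([[1, 1, 1, 0], [1, 2, 1, 0], [1, 3, 1, 0],
              [1, 1, 0, 1], [1, 2, 0, 1], [1, 3, 0, 1]], dtype=float)
counts = np.array([2, 4, 9, 1, 3, 5])   # injuries per job/hazard cell

model = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(model.params)                      # log-rate effects
print("expected counts:", model.predict(X).round(2))
```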

  17. Physics-Based Fragment Acceleration Modeling for Pressurized Tank Burst Risk Assessments

    Science.gov (United States)

    Manning, Ted A.; Lawrence, Scott L.

    2014-01-01

    As part of comprehensive efforts to develop physics-based risk assessment techniques for space systems at NASA, coupled computational fluid and rigid body dynamic simulations were carried out to investigate the flow mechanisms that accelerate tank fragments in bursting pressurized vessels. Simulations of several configurations were compared to analyses based on the industry-standard Baker explosion model, and were used to formulate an improved version of the model. The standard model, which neglects an external fluid, was found to agree best with simulation results only in configurations where the internal-to-external pressure ratio is very high and fragment curvature is small. The improved model introduces terms that accommodate an external fluid and better account for variations based on circumferential fragment count. Physics-based analysis was critical in increasing the model's range of applicability. The improved tank burst model can be used to produce more accurate risk assessments of space vehicle failure modes that involve high-speed debris, such as exploding propellant tanks and bursting rocket engines.

  18. Modeling Multiple Risks: Hidden Domain of Attraction

    CERN Document Server

    Mitra, Abhimanyu

    2011-01-01

    Hidden regular variation is a sub-model of multivariate regular variation and facilitates accurate estimation of joint tail probabilities. We generalize the model of hidden regular variation to what we call hidden domain of attraction. We exhibit examples that illustrate the need for a more general model and discuss detection and estimation techniques.

  19. Risk models to predict hypertension: a systematic review.

    Directory of Open Access Journals (Sweden)

    Justin B Echouffo-Tcheugui

    Full Text Available BACKGROUND: As well as being a risk factor for cardiovascular disease, hypertension is also a health condition in its own right. Risk prediction models may be of value in identifying those individuals at risk of developing hypertension who are likely to benefit most from interventions. METHODS AND FINDINGS: To synthesize existing evidence on the performance of these models, we searched MEDLINE and EMBASE, examined the bibliographies of retrieved articles, contacted experts in the field, and searched our own files. Dual review of identified studies was conducted. Included studies had to report on the development, validation, or impact analysis of a hypertension risk prediction model. For each publication, information was extracted on study design and characteristics, predictors, model discrimination, calibration and reclassification ability, validation and impact analysis. Eleven studies reporting on 15 different hypertension risk prediction models were identified. Age, sex, body mass index, diabetes status, and blood pressure variables were the most common predictor variables included in models. Most risk models had acceptable-to-good discriminatory ability (C-statistic > 0.70) in the derivation sample. Calibration was less commonly assessed, but overall acceptable. Two hypertension risk models, the Framingham and Hopkins, have been externally validated, displaying acceptable-to-good discrimination, with C-statistics ranging from 0.71 to 0.81. Lack of individual-level data precluded analyses of the risk models in subgroups. CONCLUSIONS: The discrimination ability of existing hypertension risk prediction tools is acceptable, but the impact of using these tools on prescriptions and outcomes of hypertension prevention is unclear.
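
    The C-statistic quoted throughout is the area under the ROC curve; a minimal sketch of computing it for a toy hypertension risk model follows. The predictors, coefficients and data are simulated, not from the reviewed studies.

```python
# Sketch: the C-statistic used to judge these models is the ROC AUC of
# predicted risk against observed hypertension onset. Toy data only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 500
age = rng.uniform(30, 70, n)
bmi = rng.normal(27, 4, n)
sbp = rng.normal(125, 12, n)
X = np.column_stack([age, bmi, sbp])

# Simulated onset probability rising with age, BMI and baseline pressure.
logit = -17 + 0.04 * age + 0.10 * bmi + 0.10 * sbp
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

risk = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]
print("C-statistic:", round(roc_auc_score(y, risk), 3))
```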

  20. Extended-range prediction trials using the global cloud/cloud-system resolving model NICAM and its new ocean-coupled version NICOCO

    Science.gov (United States)

    Miyakawa, Tomoki

    2017-04-01

    The global cloud/cloud-system resolving model NICAM and its new fully-coupled version NICOCO are run on one of the world's top-tier supercomputers, the K computer. NICOCO couples the full-3D ocean component COCO of the general circulation model MIROC using the general-purpose coupler Jcup. We carried out multiple MJO simulations using NICAM and the new ocean-coupled version NICOCO to examine their extended-range MJO prediction skills and the impact of ocean coupling. NICAM performs excellently in terms of MJO prediction, maintaining a valid skill up to 27 days after the model is initialized (Miyakawa et al. 2014). As is the case in most global models, ocean coupling frees the model from being anchored by the observed SST and allows the model climate to drift further from reality than the atmospheric version of the model does. It is therefore important to evaluate the model bias, and in an initial value problem such as seasonal extended-range prediction, it is essential to distinguish the actual signal from the early transition of the model from the observed state to its own climatology. Since NICAM is a highly resource-demanding model, evaluation and tuning of the model climatology (order of years) is challenging. Here we focus on the initial 100 days to estimate the early drift of the model, and subsequently evaluate the MJO prediction skills of NICOCO. Results show that in the initial 100 days, NICOCO develops a La Niña-like SST bias relative to observations, with a warmer Maritime Continent warm pool and a cooler equatorial central Pacific. The enhanced convection over the Maritime Continent associated with this bias projects onto the real-time multivariate MJO indices (RMM; Wheeler and Hendon 2004) and contaminates the MJO skill score. However, the bias does not appear to destroy the MJO signal severely. The model maintains a valid MJO prediction skill up to nearly 4 weeks when evaluated after linearly removing the early drift component estimated from
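
    The drift removal is described only as "linearly removing the early drift component"; a generic sketch of estimating and subtracting a linear drift from a forecast series, with invented data:

```python
# Generic sketch of the drift-removal idea: fit a linear trend to the early
# model drift and subtract it before evaluating the MJO signal. Invented data.
import numpy as np

days = np.arange(100)
rng = np.random.default_rng(5)
signal = np.sin(2 * np.pi * days / 45)          # MJO-like ~45-day oscillation
drift = 0.02 * days                             # slow climate drift of the model
series = signal + drift + rng.normal(0, 0.1, 100)

slope, intercept = np.polyfit(days, series, 1)  # linear drift estimate
detrended = series - (intercept + slope * days)
print(f"estimated drift: {slope:.3f}/day; residual std: {detrended.std():.2f}")
```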